Heliyon. 2023 Nov 8;9(11):e22036. doi: 10.1016/j.heliyon.2023.e22036

Comparative analysis of various machine learning algorithms to predict 28-day compressive strength of Self-compacting concrete

Waleed Bin Inqiad a,∗, Muhammad Shahid Siddique a,∗∗, Saad S Alarifi b, Muhammad Jamal Butt c, Taoufik Najeh d, Yaser Gamil e
PMCID: PMC10692774  PMID: 38045144

Abstract

The construction industry is indirectly the largest source of CO2 emissions in the atmosphere, due to the use of cement in concrete. These emissions can be reduced by using industrial waste materials in place of cement. Self-Compacting Concrete (SCC) is a promising material for enhancing the use of industrial wastes in concrete. However, very few methods are available for accurate prediction of its strength; therefore, reliable models for estimating the 28-day Compressive Strength (C–S) of SCC are developed in the current study using three Machine Learning (ML) algorithms: Multi Expression Programming (MEP), Extreme Gradient Boosting (XGB), and Random Forest (RF). The ML models were developed using a dataset of 231 points collected from internationally published literature, considering the seven most influential parameters, namely cement content, quantities of fly ash and silica fume, water content, coarse aggregate, fine aggregate, and superplasticizer dosage, to predict C–S. The developed models were evaluated using different statistical errors including Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and coefficient of determination (R2). The results showed that the XGB model outperformed the MEP and RF models in terms of accuracy, with R2 = 0.998 compared to 0.923 for MEP and 0.986 for RF. A similar trend was observed for the other error metrics. Thus, XGB is the most accurate model for estimating the C–S of SCC; however, it is pertinent to mention that, unlike the MEP model, it does not give its output in the form of an empirical equation. The constructed models will help to efficiently estimate the C–S of SCC for practical purposes.

Keywords: Machine learning (ML), Self-compacting concrete (SCC), 28-Day compressive strength (C–S), Multi expression programming (MEP), Extreme gradient boosting (XGB), Random forest (RF)

1. Introduction

The surge in greenhouse gas emissions such as CO2 poses a significant environmental threat to mankind. Most of these emissions result from human activities. The construction industry is a substantial emitter of CO2, contributing almost 50 % of global emissions [1]. Concrete is the most intensely used building material owing to its adaptability and ease of use, and its annual production exceeds 25 billion tons [2]. Traditional concrete uses Portland cement as the main binding agent, which is the leading contributor to CO2 emissions from concrete [3]. The manufacture of cement accounts for 7 % of global CO2 emissions [4]. The yearly production of cement is about 4000 million tons, and it is expected to increase to 6000 million tons by 2060 [5]. The emission of these harmful gases into the atmosphere drives climate change. Thus, the development and production of green construction materials is desirable to lower annual carbon emissions and to conserve precious natural resources.

The concrete industry has undergone significant advancements in recent years. One notable development is the production of green concrete by incorporating industrial waste materials. The process of cement manufacture consumes a large amount of energy and contributes significantly to greenhouse gas emissions. To reduce the impact of cement production on the environment, industrial wastes such as limestone, fly ash, and silica fume have been used [6]. The use of these industrial wastes as Secondary Cementitious Materials (SCMs) in concrete has decreased cement utilization across the globe and reduced the harmful effects of cement production on the environment [7]. Researchers regard the use of SCMs as an effective way to reduce emissions from the construction industry [8,9]. These waste materials also have other applications; for instance, metakaolin, fly ash, and silica fume are frequently used in soilcrete mixtures to modify their properties [[10], [11], [12]]. The use of SCMs in concrete not only reduces carbon emissions from the construction industry but also relieves the stress on depleting natural resources while meeting the growing demand for concrete in the modern world [13].

The increasing use of SCMs in concrete mix design has resulted in the invention of SCC [14]. It is a concrete that flows and compacts itself under its own weight, without needing external compaction. This property makes SCC a preferable choice for complex forms or confined spaces where traditional methods of vibration are not feasible or desirable, such as in the presence of delicate rebar structures. Additionally, SCC offers several other benefits, including improved surface finish, higher resistance to segregation and bleeding, and increased strength and durability, as shown in Fig. 1. Different studies have explored the use of SCMs in SCC production [[15], [16], [17], [18], [19]]. The use of SCC containing industrial wastes results in cost savings, as it reduces the time required for concrete placement and finishing. Therefore, SCC has the ability to make the concrete industry more sustainable and environmentally friendly and to improve the quality, efficiency, and safety of construction projects [20].

Fig. 1. Benefits of SCC.

The mixture composition of SCC is designed to achieve the desired flowability and self-compacting characteristics. One key feature of SCC is the use of high levels of fines, such as sand and other admixtures like fly ash and silica fume, which help to fill the spaces between the coarse aggregate particles. These fines are typically combined with a superplasticizer, a chemical admixture that increases the flowability of the mixture. In general, SCC has a higher water-to-cement ratio and a lower coarse aggregate content, which helps to achieve the required flowability and self-compacting characteristics [21]. SCC is widely utilized in construction practice, but the literature contains few reliable methods that can predict its compressive strength from its mixture composition, because the C–S of SCC exhibits a non-linear relation with the mixture composition [22]. SCC requires an appropriate mix design process to achieve its desirable properties. According to Boukendakdji [17], changes in cement or mineral admixtures, as well as changes in aggregate type, can result in significant variations in the properties of SCC. Another reason for the lack of reliable prediction methods is limited familiarity and expertise with SCC. Thus, it is very important to have a robust quantitative method that can accurately predict the C–S of SCC. Physical tests are available to determine the C–S of concrete by casting cubic or cylindrical specimens, but these methods are uneconomical, inefficient, and time consuming. They also contribute to construction waste, as the specimens are destroyed and discarded after the test. Thus, a fast, convenient, yet accurate method to estimate the C–S of SCC is essential.

Recently, the estimation of different properties of concrete containing industrial wastes using machine learning (ML) algorithms has captured the attention of researchers as an efficient way to enhance the use of SCMs in concrete as an alternative to cement. ML techniques have the advantage of being capable of capturing the underlying mechanisms even in situations where information about specific parameters is lacking. This advantage stems from the ability of ML algorithms to learn patterns and associations from data, which can reveal hidden relationships between variables [23]. By using this ability, ML techniques can effectively extract meaningful insights and make accurate predictions, even in complex situations. Due to these advantages, ML techniques have found extensive use in civil engineering to predict concrete properties, soil properties, soil classification, slope stability, process control, etc. [[24], [25], [26], [27], [28], [29], [30], [31], [32], [33], [34]]. For instance, Asteris et al. [35] utilized an artificial neural network (ANN) to predict different mechanical properties of sandcrete mixtures. Similarly, Sarir et al. [36] utilized GEP and ANN to predict the bearing capacity of concrete-filled steel tube columns.

Regarding the C–S prediction of SCC, Farooq et al. [37] used GEP alongside ANN and support vector machine (SVM) to predict the C–S of SCC containing fly ash as an admixture. The best performing algorithm was selected on the basis of the least average error value. The results revealed that GEP is the most accurate algorithm, with the least error value of 3.714 MPa compared to 5.428 MPa and 5.023 MPa for ANN and SVM respectively. The study also highlighted an added advantage of the GEP algorithm: unlike ANN and SVM, it gives its output in the form of an empirical equation for C–S prediction. However, ANN gave the higher R value of 0.9588 compared to 0.9344 and 0.935 for SVM and GEP respectively. Similarly, Asteris et al. [38] leveraged ANN to estimate the C–S of SCC containing various admixtures including limestone powder, fly ash, slag, and silica fume. The authors used 169 data points, and the results showed that ANN can be efficiently used to predict SCC strength; the coefficient of correlation for the developed ANN model was 0.9825, showing the excellent predictive capability of neural networks. Similarly, another study [39] utilized three algorithms, decision trees (DT), light gradient boosting machine (LGBM), and XGB, to predict the C–S of fibre-reinforced SCC containing admixtures such as fly ash, nano silica, and marble powder. Fibre-reinforced SCC is a promising substitute for normal concrete because it offers many advantages such as crack resistance, cost savings, and high strength. The authors reported that XGB surpassed all other algorithms and proved to be the most accurate one, with R2 = 0.992 and a mean absolute error of 1.438 MPa. Thus, the developed XGB model can be effectively used to foster the use of fibre-reinforced SCC in the construction industry.

There are several other studies attributed to predicting the strength of SCC containing recycled aggregates using different boosting algorithms. For example, a study [40] utilized random forest (RF), k-nearest neighbour (KNN), XGB, gradient boosting (GB), category boosting (CB), and extremely randomized trees (ERT) to predict the C–S of SCC with recycled aggregates. The study applied the algorithms to 515 data points from the literature and confirmed that the RF technique is the most robust for predicting C–S, with a testing R2 = 0.7128. Furthermore, several studies have undertaken the prediction of other SCC properties such as flexural strength and slump flow. Saha et al. [41] leveraged support vector regression (SVR) to predict different properties of SCC. The study utilized 115 data points having fly ash as a mineral admixture to predict the L-box ratio, slump flow, V-funnel time, and C–S. The coefficient of correlation achieved was 0.965 for slump flow, followed by 0.954 for L-box ratio, 0.979 for V-funnel time, and 0.977 for strength, indicating the robustness of the technique for predicting different properties of SCC. Apart from mineral admixtures, several other industrial wastes such as crumb rubber have been used to impart special properties to SCC. Bušić et al. [42] incorporated silica fume and rubber waste in SCC and used multivariate regression models to predict flexural strength, C–S, and modulus of elasticity. The study used crumb rubber ranging from 0.5 to 3.5 mm in size and replaced sand with crumb rubber at different percentages (5 %, 10 %, 20 % and 30 % of the aggregate volume). The model was created on a set of 63 instances and achieved R2 values of 0.931 for C–S, 0.922 for modulus of elasticity, and 0.937 for flexural strength. A summary of the prediction of different SCC properties using various ML algorithms is shown in Table 1.

Table 1.

Summary of SCC properties prediction by different researchers.

S. No Machine learning method Data set Output Year Admixtures References
1. DT, XGB, LGBM 387 C–S 2023 Fly ash, Limestone, Basalt powder, Marble powder, Nano silica, Metakaolin [39]
2. ANN 111 C–S 2011 Fly ash, Bottom ash [43]
3. Ridge Regression, DT, Linear Regression, Lasso Regression, RF, Multi-layer Perceptron Regression 99 C–S 2022 Silica flour, Polypropylene Fiber [44]
4. CB, KNN, XGB, LGBM, Inverse Gaussian, ERT, Poisson Gaussian, RF, GB 515 C–S 2022 Fly ash [40]
5. M5′ algorithm, Multivariate adaptive regression splines (MARS) 114 C–S, V-funnel time, L-box ratio, Slump flow 2018 Fly ash [45]
6. SVR 327 C–S 2023 Fly ash [46]
7. GEP 26 C–S, Slump flow, J Ring 2009 Fly ash [47]
8. Multivariate Regression (MVR) 63 C–S, Modulus of Elasticity, Flexural Strength 2020 Silica fume with crumb rubber [42]
9. ANN 169 C–S 2016 Limestone powder, Fly ash, Slag, Silica fume, Rice husk ash [38]
10. ANN 114 C–S, Slump flow, V-funnel time, L-box ratio 2016 Fly ash [48]
11. SVR 115 Slump flow, L-box ratio, C–S, V-funnel time 2019 Fly ash [41]
12. Extreme Learning Machine, long short-term memory (LSTM) 48 Slump flow, J Ring 2021 Slag, Fly ash, Limestone powder, Silica fume [49]
13. ANN, SVM 85 C–S 2022 Fly ash, Silica fume [50]
14. RF, Multilayer perceptron network (MLP), KNN 1030 C–S 2022 Fly ash, Slag [51]
15. SVM, ANN, GEP 300 C–S 2021 Fly ash [37]
16. SVR, Deep Learning (DL) 24 C–S, Splitting tensile strength 2021 Fly ash, Silica fume [52]

The significance of this study lies in the comparative prediction of the C–S of SCC incorporating silica fume and fly ash as mineral admixtures using recent machine learning algorithms. For this purpose, three machine learning algorithms, MEP, RF, and XGB, were used. In particular, the novelty of this research is leveraging MEP to predict the strength of SCC. The accuracy of the developed models was compared through statistical evaluation, and the most accurate model was identified.

2. Machine learning models

2.1. Multi Expression Programming

MEP was introduced by Oltean and Dumitrescu [53]. It involves building and evolving several mathematical expressions to find the solution to a problem. MEP encodes solutions using linear chromosomes, with the unique ability to encode multiple solutions within one chromosome. On the basis of the fitness of individuals, the optimal solution is selected as the representation of the chromosome [54]. Compared to other GP techniques that encode only one solution per chromosome, MEP has the advantage that it does not increase the complexity of the resulting equation [55]. The MEP technique starts by generating a population of random individuals. To evolve the optimal solution over successive generations until a termination criterion is reached, MEP employs the following steps [53,55].

  • Two parents are selected using a binary tournament, and then recombined with a specified crossover probability.

  • Two offspring are produced by recombining the two parents.

  • The offspring are mutated, and the worst-performing individual in the population is replaced with the best offspring.

MEP is a versatile optimization technique that offers several advantages over other methods, particularly in its ability to handle complex optimization problems without relying on specific mathematical models or assumptions. Many studies suggest that MEP's iterative process of generating and evaluating mathematical expressions, along with its multi-expression chromosome, makes it a useful tool for solving optimization problems [[56], [57], [58], [59]]. The flow chart of MEP algorithm [60] is shown in Fig. 2.

Fig. 2. Flow chart of MEP algorithm.
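The selection, crossover, mutation, and replacement steps listed above can be illustrated with a minimal, generic evolutionary loop in Python. This is only a sketch on toy real-valued chromosomes, not the MEP instruction encoding used by MEPX; it shows the evolutionary scheme rather than MEP itself, and the toy fitness function is an assumption for illustration.

```python
# Generic evolutionary loop mirroring the MEP-style steps: binary tournament selection,
# crossover, mutation, and replacement of the worst individual by the best offspring.
import random

random.seed(42)

def fitness(ind):
    # Toy objective: squared distance from a fixed target vector (lower is better).
    target = [3.0, -1.0, 2.0]
    return sum((g - t) ** 2 for g, t in zip(ind, target))

def binary_tournament(pop):
    a, b = random.sample(pop, 2)
    return a if fitness(a) < fitness(b) else b

def crossover(p1, p2, p_cross=0.9):
    if random.random() > p_cross:
        return p1[:], p2[:]
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def mutate(ind, p_mut=0.1):
    return [g + random.gauss(0, 0.3) if random.random() < p_mut else g for g in ind]

pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(50)]
for generation in range(200):
    parent1, parent2 = binary_tournament(pop), binary_tournament(pop)
    child1, child2 = crossover(parent1, parent2)
    best_child = min((mutate(child1), mutate(child2)), key=fitness)
    worst_index = max(range(len(pop)), key=lambda i: fitness(pop[i]))
    if fitness(best_child) < fitness(pop[worst_index]):
        pop[worst_index] = best_child  # replace the worst individual

print("Best fitness:", min(fitness(ind) for ind in pop))
```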

2.2. Random forest (RF)

Leo Breiman and Adele Cutler introduced the RF technique [61], which is utilized for both classification and regression problems. The random forest regression (RFR) algorithm has several advantages, including its efficiency in handling large data sets, its robustness against overfitting [62], and its lower complexity compared to other machine learning techniques. RFR is increasingly being employed in geotechnical engineering [63,64], and it is also used in structural engineering to predict the properties of concrete containing various admixtures.

RF is composed of two key components: the Decision Tree (DT) and the bagging technique. During tree building, the feature space is divided into a number of segments; this segmentation is performed iteratively until a stopping threshold is met, resulting in three components: internal nodes, leaf (external) nodes, and branches. The internal nodes are associated with splitting functions that determine which node is visited next. DT nodes that can no longer be split are referred to as output nodes, terminals, or leaf nodes. The RFR algorithm has proven to be more reliable in various tasks than a single decision tree [65]. The RF method is an ensemble learning algorithm that utilizes bagging to predict outputs [66]: it constructs a large number of decision trees on the training sample set and aggregates their results. If the problem being solved is a classification problem, the RF algorithm predicts the class by aggregating the votes of the decision trees; if the problem is a regression one, the algorithm takes the mean of the regression trees to obtain the result. Fig. 3 shows the diagram of the RF technique [67].

Fig. 3. Flow chart of RF algorithm.
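The bagging-and-averaging idea described above can be sketched with plain scikit-learn decision trees. This is a simplified illustration on synthetic data, assuming scikit-learn and NumPy are available; a production model would simply use RandomForestRegressor, as done in Section 4.2.

```python
# Minimal sketch of bagging for regression: many trees grown on bootstrap samples,
# with the ensemble output taken as the mean of the individual tree predictions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 3))                      # toy feature matrix
y = 2 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(0, 0.1, 200)

trees = []
for _ in range(100):                                       # grow 100 trees
    idx = rng.integers(0, len(X), len(X))                  # bootstrap sample (with replacement)
    tree = DecisionTreeRegressor(max_features="sqrt")      # random feature subset per split
    tree.fit(X[idx], y[idx])
    trees.append(tree)

# Regression output = mean of the individual tree predictions
X_new = rng.uniform(0, 10, size=(5, 3))
y_pred = np.mean([t.predict(X_new) for t in trees], axis=0)
print(y_pred)
```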

2.3. Extreme gradient boosting (XGB)

Chen and Guestrin [68] introduced extreme gradient boosting (XGB) in 2016. It is an ensemble ML method based on tree boosting and is useful for both machine learning and data mining applications. The approach controls model complexity, which results in a significant reduction in overfitting and improved performance [69]. By controlling overfitting, XGB can achieve superior results compared to other tree boosting algorithms.

It combines decision trees with gradient boosting [70]. DTs are known for their ability to provide insights into the importance of the inputs in calculating the output, and XGB takes this a step further by providing users with enhanced capabilities to learn such insights [71]. It achieves this by iteratively training decision trees using gradient boosting, a technique that optimizes a loss function by minimizing the errors of the previously trained trees. This iterative process enables the model to learn complex relations between the inputs and the output, resulting in enhanced accuracy and interpretability. Thus, XGB offers an improved decision tree algorithm that not only provides accurate predictions but also offers a deeper understanding of the underlying data. Furthermore, extreme gradient boosting has been observed to outperform gradient boosting (GB) in terms of predictive accuracy, particularly when dealing with large amounts of data, which can be attributed to XGB's use of regularization in the loss function used to assess the model's fitness [72].
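The residual-fitting mechanism behind boosting can be sketched with ordinary scikit-learn trees. This is a simplified illustration of gradient boosting with squared-error loss on synthetic data, not the full XGBoost algorithm, which adds regularization and second-order information on top of this basic scheme.

```python
# Sketch of gradient boosting for regression: each shallow tree is fitted to the
# residuals (negative gradients of the squared-error loss) of the ensemble so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 3))
y = 3 * X[:, 0] + X[:, 1] ** 2 + rng.normal(0, 0.2, 300)

learning_rate = 0.1
prediction = np.full_like(y, y.mean())          # start from the mean response
trees = []
for _ in range(200):
    residuals = y - prediction                  # errors left by the current ensemble
    tree = DecisionTreeRegressor(max_depth=3)   # weak learner
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

def boosted_predict(X_new):
    out = np.full(len(X_new), y.mean())
    for t in trees:
        out += learning_rate * t.predict(X_new)
    return out

print("Training RMSE:", np.sqrt(np.mean((y - boosted_predict(X)) ** 2)))
```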

3. Data acquisition and presentation

The most important step in the creation of an ML model is gathering reliable data. To achieve this goal, a thorough literature review was performed, and a database containing 231 data points for the prediction of C–S was compiled from published research [19,21,[73], [74], [75], [76], [77], [78], [79], [80], [81], [82], [83]]. To select the most influential parameters, a thorough evaluation was carried out, and the effectiveness and performance of numerous early trials were assessed. Thus, to forecast the C–S of SCC, seven input factors were chosen: cement (kg/m3), fly ash (kg/m3), silica fume (kg/m3), water (kg/m3), coarse aggregate (kg/m3), fine aggregate (kg/m3), and superplasticizer (kg/m3). These seven input parameters were used to build the prediction models for C–S (MPa).

The variance and covariance of the relationships between the variables can be better understood by using a descriptive statistical technique known as a scatter matrix. It is frequently used alongside other statistical analysis techniques to visualize the data and gain useful insights about the relationships between the variables used in the model. It provides information on the variation of each variable and the relationships between the inputs and the output. Fig. 4 demonstrates the scatter matrix of the input and output variables. It is also called a pair plot because it shows the pairwise relationships that exist in the dataset. The x-axis and y-axis of the pair plot are associated with the rows and columns respectively; thus, it is a grid of axes that shows the associations between all the variables used in the development of the model [84]. It also shows the relation of C–S with the input variables. The diagonal of the scatter matrix presents a graphical representation of the distribution of the input and output parameters used in the creation of the models. Table 2 provides a description of the data used to create the ML models. To obtain reliable predictions of the C–S of SCC, it is advised to utilize the models within the range of variables presented in Table 2; this contributes to the models' accuracy and overall validity. For the development of robust ML models, the dataset needs to be split into subsets. Thus, the 231 instances were randomly split into two parts, named the training and validation data sets. The training set constitutes 70 % of the data, while the validation set comprises 30 % [85,86]. The training data are used in the creation of the models, while the validation data are used to confirm the generalization capacity of the proposed models [87]. The reason behind this approach is to make sure that the models are not overfitted and can accurately predict outputs for unseen data.

Fig. 4. Scatter matrix of input and output variables.

Table 2.

Statistical description of the data used in model development.

Variable Symbol Average Minimum Maximum Standard Deviation
Cement x0 311.98 135 542.10 102.45
Fly ash x1 145.45 0 390 112.54
Silica Fume x2 17.90 0 67.5 21.90
Coarse Aggregate x3 829.58 565 1091.4 162.27
Fine Aggregate x4 851.36 630 1120 116.80
Water x5 180.79 150 202.10 10.55
Superplasticizer x6 1.33 0 8.70 2.15
Strength fc 45.50 21.54 94.40 17.95
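A minimal sketch of the 70/30 train/validation split described in this section is given below. The file name scc_dataset.csv and the column names are illustrative assumptions, not taken from the paper's supplementary data.

```python
# 70/30 split of the 231 records into training and validation subsets.
import pandas as pd
from sklearn.model_selection import train_test_split

data = pd.read_csv("scc_dataset.csv")            # hypothetical file name
features = ["cement", "fly_ash", "silica_fume", "water",
            "coarse_aggregate", "fine_aggregate", "superplasticizer"]
X = data[features]
y = data["compressive_strength"]                 # 28-day C-S in MPa

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.30, random_state=10)       # 70 % training, 30 % validation
print(X_train.shape, X_val.shape)
```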

4. Model development and performance assessment

4.1. MEP model development

It is pertinent to mention that MEPX 2021.05.18.0, a computer software tool, was used to implement the MEP algorithm. Before modelling, numerous MEP fitting parameters must be established to construct an efficient and generalised model. These parameters were chosen using recommendations from the literature and a trial-and-error method [88]. The optimized set of parameters used to create the model is presented in Table 3. The total number of programs present in the solution is specified by the size of the subpopulation. Increasing the subpopulation size increases the accuracy and the complexity of the resulting equation, but increasing it beyond a limit may cause overfitting.

Table 3.

Hyperparameters of the algorithms used for model development.

Parameters
Settings
MEP Parameters
No. of Subpopulations 200
Size of Subpopulation 1000
Code Length 40
Number of Generations 1500
Runs 20
Functions +, −, ×, ÷, sqrt
RF Parameters
n-estimators 100
Max-depth 40
Random State 10
XGB Parameters
Max-depth 3
n-estimators 100
Initial Learning Rate 0.01

4.2. RF model development

The random forest (RF) model was constructed using the Scikit-Learn module in Python. The algorithm's hyperparameters were carefully chosen through an iterative process aiming to attain the highest accuracy. The number of estimators in the random forest regression (RFR) model, which directly corresponds to the number of decision trees constructed, was fixed before the tree predictions were averaged. Increasing the number of trees enhances the model's effectiveness, but at the expense of increased computational requirements. The hyperparameter max depth signifies the maximum depth of each decision tree in the forest. The chosen hyperparameters are presented in Table 3.
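A minimal Scikit-Learn sketch using the RF hyperparameters reported in Table 3 could look as follows; X_train, y_train, X_val, and y_val are assumed to come from the 70/30 split sketched in Section 3.

```python
# RF model with the Table 3 settings: 100 trees, maximum depth 40, random state 10.
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rf_model = RandomForestRegressor(n_estimators=100, max_depth=40, random_state=10)
rf_model.fit(X_train, y_train)

print("Validation MAE:", mean_absolute_error(y_val, rf_model.predict(X_val)))
```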

4.3. XGBOOST model development

The XGB model was developed using the Python programming language. An initial XGB model was constructed, and its hyperparameters were then tuned to find the combination yielding the lowest RMSE. Eventually, the model with the optimal hyperparameters shown in Table 3 was created.
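One way such an RMSE-driven search could be set up with the xgboost Python package is sketched below. The search grid is illustrative, since the exact grid used is not reported, and X_train and y_train are assumed from the split in Section 3.

```python
# Cross-validated grid search that selects XGB hyperparameters by lowest RMSE.
from xgboost import XGBRegressor
from sklearn.model_selection import GridSearchCV

param_grid = {
    "max_depth": [3, 5, 7],
    "n_estimators": [100, 300],
    "learning_rate": [0.01, 0.05, 0.1],
}
search = GridSearchCV(
    XGBRegressor(objective="reg:squarederror"),
    param_grid,
    scoring="neg_root_mean_squared_error",
    cv=5,
)
search.fit(X_train, y_train)
print("Best hyperparameters:", search.best_params_)   # Table 3 reports max_depth=3,
print("Best CV RMSE:", -search.best_score_)           # n-estimators=100, learning rate=0.01
```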

4.4. Performance assessment criteria of the models

Statistical evaluation was carried out to assess the accuracy of the developed models. The statistical error metrics used to evaluate the models' performance are given below. In these expressions, x and y indicate the experimental and model-predicted values respectively. Moreover, n indicates the total number of samples in a dataset and n20 denotes the number of samples having a ratio of actual to predicted values between 0.80 and 1.20.

Mean Absolute Error: $MAE = \dfrac{\sum |x - y|}{n}$
Root Mean Squared Error: $RMSE = \sqrt{\dfrac{\sum (x - y)^2}{n}}$
Root Square Error: $RSE = \dfrac{\sum (x - y)^2}{\sum (x_{mean} - x)^2}$
Coefficient of Correlation: $R = \dfrac{n\sum xy - (\sum x)(\sum y)}{\sqrt{\left(n\sum x^2 - (\sum x)^2\right)\left(n\sum y^2 - (\sum y)^2\right)}}$
Coefficient of Determination: $R^2 = 1 - \dfrac{\sum (x - y)^2}{\sum (y - y_{mean})^2}$
Performance index: $\rho = \dfrac{RRMSE}{1 + R}$
a20-index: $a20\text{-}index = \dfrac{n_{20}}{n}$

The R2 value is used to check the accuracy of a model. A reliable and accurate model should have a high R2 value and minimum values of the other error metrics. An R2 value greater than 0.8 denotes a strong relation between the input and output variables [30]. However, R2 cannot be used on its own to check a model's accuracy because it is not sensitive to multiplication or division of the output by a constant [89]; therefore, other statistical error metrics are used alongside R2. MAE and RMSE depict the average magnitude of the prediction errors. A large RMSE value shows that the proportion of predictions with large errors is greater than desired and should be reduced. The value of ρ ranges from 0 to infinity, and a good model should have a ρ value less than 0.2 [90]. It is worth mentioning that ρ considers RRMSE and R simultaneously, and a value closer to zero indicates higher model accuracy. The a20-index is also a very useful metric to assess the accuracy of ML models: it measures the proportion of predictions that deviate by no more than ±20 % from the experimental values [91]. It has been recommended in the literature that for a perfect ML model, the a20-index value should be 1 [92].
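A minimal NumPy implementation of these metrics is sketched below (x denotes experimental values, y model predictions). RRMSE is assumed here to be the RMSE normalised by the mean experimental value, an assumption made only for computing the performance index.

```python
# Direct implementation of the error metrics defined above.
import numpy as np

def evaluate(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    mae = np.mean(np.abs(x - y))
    rmse = np.sqrt(np.mean((x - y) ** 2))
    rse = np.sum((x - y) ** 2) / np.sum((x.mean() - x) ** 2)
    r = (n * np.sum(x * y) - x.sum() * y.sum()) / np.sqrt(
        (n * np.sum(x ** 2) - x.sum() ** 2) * (n * np.sum(y ** 2) - y.sum() ** 2))
    r2 = 1 - np.sum((x - y) ** 2) / np.sum((y - y.mean()) ** 2)
    rho = (rmse / x.mean()) / (1 + r)                 # performance index with assumed RRMSE
    a20 = np.mean((x / y >= 0.8) & (x / y <= 1.2))    # share of ratios within 0.80-1.20
    return {"MAE": mae, "RMSE": rmse, "RSE": rse, "R": r,
            "R2": r2, "rho": rho, "a20": a20}

print(evaluate([30.0, 45.0, 60.0], [31.0, 44.0, 58.0]))
```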

The error metrics calculated for the training and validation sets of the three algorithms are shown in Table 4. Notice that the R2 values of all the developed models are greater than 0.8 and surpass 0.9 for both the training and validation datasets. Also, according to the a20-index criterion, the developed models are accurate and robust: the a20-index value is 1 or close to 1 for both datasets of all the models, which indicates that the predicted values are very close to the experimental values and none of the predictions deviates by more than ±20 % from the actual values. It is important to mention that the validation error metrics are less than the training ones, which shows the excellent generalization capabilities of the models and that overfitting has been dealt with effectively. Based on the error metrics, the XGB model outperformed both the MEP and RF models, having the lowest values of RSE, RMSE, and average error and the highest value of R2. Thus, the most accurate model is the XGB model and MEP is the least accurate model; however, it is important to mention that XGB does not generate an empirical equation like the MEP model.

Table 4.

Statistical evaluation of developed models.

MEP RF XGB
Training Validation Training Validation Training Validation
MAE 3.45 3.58 1.45 1.35 0.527 0.45
RMSE 4.84 4.68 2.14 2.02 0.66 0.69
RSE 0.069 0.086 0.0145 0.0124 0.001 0.001
R 0.965 0.962 0.993 0.994 0.998 0.999
R2 0.924 0.923 0.984 0.986 0.997 0.998
ρ 0.04 0.056 0.044 0.0426 0.006 0.007
a20-index 0.975 0.94 1 1 1 1

5. Results

5.1. MEP result

In the development of an MEP model, the selection of the input parameters is a critical step. Thus, numerous trials were performed, and the resulting equation for C–S is a function of the variables given in Equation (1).

$f_c = f(x_0, x_1, x_2, x_5, x_6)$ (1)

The output of the MEP algorithm is decoded to obtain an empirical equation for the C–S. The derived equation is given below:

$f_c = \left(x_6^2\,x_0\,x_1\right)^3 + 2\,(x_1 x_5)^2 + x_2^2 + \big(((x_1 + x_0)\,x_0)\,x_0\big)\,x_5 + x_0 + 34.6105\,x_6\,x_0\,x_1 + 3.31565\,x_2$ (2)

Notice that Equation (2) is formulated using only the basic arithmetic operations and the square root function, which keeps the equation simple and compact. The statistical error metrics calculated for the MEP model are shown in Table 4. Note that the R2 values of the training and validation datasets are 0.924 and 0.923 respectively, which shows the strong relation between the actual and predicted values. This is also evident from the scatter plots of the MEP training and validation data shown in Fig. 5. The performance index values of training and validation are also close to 0, which confirms the accuracy of the algorithm.

Fig. 5. Scatter plots of MEP training and validation data.

5.2. RF result

The results of the RF algorithm give a clear depiction of the relation between the actual and predicted values of SCC compressive strength, as shown in Fig. 6. The RF algorithm is more accurate than the MEP algorithm, with training R2 = 0.984 and validation R2 = 0.986. Also, the other error metrics such as RMSE and average error are lower for the RF algorithm than for MEP, as shown in Table 4. Similarly, the performance index of the RF algorithm is closer to 0. Thus, the overall accuracy of the RF algorithm is significantly greater than that of MEP, but it does not give an empirical equation like MEP.

Fig. 6. Scatter plots of RF training and validation data.

5.3. XGB result

The excellent predictive capability of the XGB algorithm is evident from the scatter plots of actual versus predicted C–S values for both data sets, shown in Fig. 7. The R2 values of training and validation are 0.997 and 0.998 respectively, which shows that the XGB model is extremely accurate and surpasses both of the other algorithms. Moreover, the other error metrics such as MAE, RMSE, RSE and ρ are the lowest for the XGB algorithm.

Fig. 7. Scatter plots for training and validation data sets.

5.4. Statistical comparison of the models

The excellent correlation between the experimental and predicted C–S is evident from the scatter plots drawn for the three developed models. However, as discussed earlier, the accuracy of the algorithms is checked by calculating different statistical error metrics, and the best model is the one having the highest correlation and minimum errors [30]. The comparison of R2, average error, and performance index for the three developed models is shown in Fig. 8. Notice from Fig. 8 (a) that the coefficient of determination R2 is the highest for the XGB model, with both training and validation R2 values approaching 1. Also, the a20-index values are nearly equal to 1 for MEP and exactly equal to 1 for RF and XGB, which shows that the predicted values lie close to the experimental values and do not deviate much from them. Similarly, XGB has the lowest average error of the three developed models, with a training error of 0.527 and a validation error of 0.45. Moreover, the performance index is also the lowest for XGB, as shown in Fig. 8 (c). Also, notice from Fig. 8 (b) that the MEP model has the highest absolute error for both datasets: the MEP validation error is 3.58 compared to 1.35 and 0.45 for RF and XGB respectively. The R2 values are the lowest for the MEP model and the highest for XGB. Thus, it can be concluded that XGB is the most accurate and robust model for predicting the C–S of SCC, having the least average error and the highest correlation, followed by RF and MEP. However, MEP offers the significant advantage of presenting the result in the form of an empirical equation, while RF and XGB do not yield an empirical equation.

Fig. 8. Error comparison of the developed models: (a) R2; (b) Average Error; (c) Performance Index.

Overfitting is a common problem that can occur when using ML algorithms. It means that a model performs well on the training data but ceases to perform well on validation or unseen data [93]. It is of utmost importance to check for and avoid overfitting of ML models. Overfitting can be checked by comparing the training and validation error metrics of a model: if the validation errors are significantly greater than the training errors, the model is overfitted to the training data [94]. Notice from Table 4 that almost all the validation error metrics are close to the training ones and no validation error is significantly greater than the corresponding training error. This can also be verified from the average error and correlation coefficient comparison given in Fig. 8. Thus, the developed models have good generalization capacity, as they performed equally well on the validation data and are not overfitted to the training data.

5.5. Sensitivity analysis (SA)

The XGB model is considered for SA because it performed better than the other two algorithms. The SA is conducted to find the relative contribution of the inputs used in estimating the C–S of SCC, using Equations (3), (4); a computational sketch of this procedure is given after the symbol definitions below. Sensitivity analysis indicates how strongly the output depends on each input variable. The results of the SA are shown in Fig. 9.

$N_i = f_{\max}(x_i) - f_{\min}(x_i)$ (3)
$SA_i = \dfrac{N_i}{\sum_{j=1}^{n} N_j}$ (4)

where:

  • xi = the i-th input variable, varied while the other variables are held constant

  • fmax(xi) = maximum predicted output over the range of xi

  • fmin(xi) = minimum predicted output over the range of xi
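A minimal sketch of this procedure is given below: each input is swept over its observed range while the other inputs are held at their mean values, and the resulting prediction spreads are normalised to percentages following Equations (3), (4). The objects xgb_model (e.g. the tuned model from the sketch in Section 4.3) and X_train are assumptions carried over from the earlier sketches.

```python
# Sensitivity analysis per Eqs. (3)-(4): N_i = f_max(x_i) - f_min(x_i), SA_i = N_i / sum(N_j).
import numpy as np
import pandas as pd

def sensitivity(model, X):
    spreads = []
    for col in X.columns:
        grid = pd.DataFrame([X.mean()] * 50)                 # hold other inputs at their means
        grid[col] = np.linspace(X[col].min(), X[col].max(), 50)
        preds = model.predict(grid)
        spreads.append(preds.max() - preds.min())            # N_i for this input
    spreads = np.array(spreads)
    return 100 * spreads / spreads.sum()                     # SA_i as a percentage

for name, share in zip(X_train.columns, sensitivity(xgb_model, X_train)):
    print(f"{name}: {share:.2f} %")
```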

Fig. 9. Relative contribution of input parameters (%).

It is evident from Fig. 9 that every input plays a significant role in the prediction of the output. However, cement has a pivotal contribution to the prediction of C–S, contributing almost 35 % alone. The remaining parameters, fly ash, silica fume, coarse aggregate, fine aggregate, water, and superplasticizer, contribute 16.05 %, 12.64 %, 12.24 %, 5.22 %, 1.30 %, and 16.75 % respectively. These conclusions are in line with the findings of other studies [95]. A similar study on predicting the C–S of SCC also concluded that cement is the most important factor in predicting the strength, followed by admixtures like silica fume and fly ash [85]. The authors concluded that small changes in the cement or mineral admixture content can have a profound effect on the C–S and other properties of SCC. Similarly, the superplasticizer is also very important in achieving the strength and desired properties of SCC [[96], [97], [98]]. The comparatively smaller role of the fine aggregate and water content in the prediction of C–S has also been explained in several studies [14,21,99].

6. Conclusions

This study employed MEP, RF and XGB to forecast the C–S of SCC using a reliable dataset collected from the literature. The models were evaluated using different statistical error metrics, and the following conclusions are drawn from the developed models.

  • The MEP predicted values showed a strong correlation with the experimental values, with a validation R2 of 0.923. The RF and XGB models showed even higher accuracy, with RF validation R2 equal to 0.986 and XGB validation R2 equal to 0.998. Thus, these algorithms are more accurate than MEP, but it is of paramount importance to mention that, unlike MEP, RF and XGB did not yield an empirical equation to predict strength.

  • XGB had an average error of 0.45 compared to 1.35 and 3.58 for RF and MEP respectively, which shows that XGB is the most accurate of all the algorithms used. The same trend is observed for the other error metrics.

  • The scatter plots drawn between the actual and predicted values show that the models are well fitted on both datasets, as evident from the R2 values for the training and validation data. However, the XGB model has the highest R2 value, so it gives the most accurate predictions.

  • The sensitivity analysis was done to study the influence of the explanatory variables on the C–S, and the results revealed that the relative importance of the explanatory variables used in XGB followed the order: Cement (35.18 %) > Superplasticizer (16.75 %) > Fly ash (16.05 %) > Silica fume (12.64 %) > Coarse aggregate (12.24 %) > Fine aggregate (5.22 %) > Water (1.30 %).

7. Limitations and future work

The aim of this study was to develop accurate prediction models for the C–S of SCC to foster its use in the construction industry. While this study used a dataset of 231 points, it is recommended to use a larger dataset in subsequent research to develop more comprehensive empirical models. The empirical models in the current study used seven explanatory variables, but it is important to consider other variables, such as the age of concrete and different types of chemical and mineral admixtures, in future research. Also, the input variables used in the study had a limited range, as given in Table 2, and it is recommended to develop models with a broader range of input and output variables. Furthermore, predictive models for other mechanical properties of SCC, such as flexural strength, modulus of elasticity, and tensile strength, should be developed to facilitate its use in the industry.

Data availability

Data will be made available on request.

CRediT authorship contribution statement

Waleed Bin Inqiad: Conceptualization, Data curation, Formal analysis, Software, Visualization, Writing - original draft. Muhammad Shahid Siddique: Validation, Supervision, Project administration. Sana Shahab: Project administration, Methodology. Muhammad Jamal Butt: Writing – review & editing. Yaser Gamil: Software, Funding acquisition, Methodology, Resources.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgement

This research was supported by Researchers Supporting Project number (RSP2023R496), King Saud University, Riyadh, Saudi Arabia.

Footnotes

Appendix A

Supplementary data to this article can be found online at https://doi.org/10.1016/j.heliyon.2023.e22036.

Contributor Information

Waleed Bin Inqiad, Email: WaleedBinInqiad@gmail.com.

Muhammad Shahid Siddique, Email: mshahidsiddique@mce.nust.edu.pk.

Saad S. Alarifi, Email: ssalarifi@ksu.edu.sa.

Muhammad Jamal Butt, Email: jamalbutt@cuiatd.edu.pk.

Taoufik Najeh, Email: taoufik.najeh@ltu.se.

Yaser Gamil, Email: yaser.g@monash.edu.

Appendix A. Supplementary data

The following is the Supplementary data to this article.

Multimedia component 1
mmc1.xlsx (38.6KB, xlsx)

References

  • 1.Arrigoni A., et al. Life cycle greenhouse gas emissions of concrete containing supplementary cementitious materials: cut-off vs. substitution. J. Clean. Prod. 2020;263(Aug) doi: 10.1016/j.jclepro.2020.121465. [DOI] [Google Scholar]
  • 2.Onat N.C., Kucukvar M. Carbon footprint of construction industry: a global review and supply chain analysis. Renew. Sustain. Energy Rev. May 2020;124 doi: 10.1016/j.rser.2020.109783. [DOI] [Google Scholar]
  • 3.Rehan R., Nehdi M. Carbon dioxide emissions and climate change: policy implications for the cement industry. Environ. Sci. Pol. 2005;8(2):105–114. doi: 10.1016/j.envsci.2004.12.006. Apr. [DOI] [Google Scholar]
  • 4.Benhelal E., Zahedi G., Shamsaei E., Bahadori A. Global strategies and potentials to curb CO2 emissions in cement industry. J. Clean. Prod. 2013;51:142–161. doi: 10.1016/j.jclepro.2012.10.049. Elsevier Ltd. Jul. 15. [DOI] [Google Scholar]
  • 5.Shahmansouri A.A., Akbarzadeh Bengar H., Ghanbari S. Compressive strength prediction of eco-efficient GGBS-based geopolymer concrete using GEP method. J. Build. Eng. 2020;31(Sep) doi: 10.1016/j.jobe.2020.101326. [DOI] [Google Scholar]
  • 6.Mikulčić H., Klemeš J.J., Vujanović M., Urbaniec K., Duić N. Reducing greenhouse gasses emissions by fostering the deployment of alternative raw materials and energy sources in the cleaner cement manufacturing process. J. Clean. Prod. Nov. 2016;136:119–132. doi: 10.1016/j.jclepro.2016.04.145. [DOI] [Google Scholar]
  • 7.Rahla K.M., Mateus R., Bragança L. Comparative sustainability assessment of binary blended concretes using supplementary cementitious materials (SCMs) and ordinary Portland cement (OPC) J. Clean. Prod. May 2019;220:445–459. doi: 10.1016/j.jclepro.2019.02.010. [DOI] [Google Scholar]
  • 8.Tang P., Chen W., Xuan D., Zuo Y., Poon C.S. Investigation of cementitious properties of different constituents in municipal solid waste incineration bottom ash as supplementary cementitious materials. J. Clean. Prod. 2020;258(Jun) doi: 10.1016/j.jclepro.2020.120675. [DOI] [Google Scholar]
  • 9.Bajpai R., Choudhary K., Srivastava A., Sangwan K.S., Singh M. Environmental impact assessment of fly ash and silica fume based geopolymer concrete. J. Clean. Prod. May 2020;254 doi: 10.1016/j.jclepro.2020.120147. [DOI] [Google Scholar]
  • 10.Kolovos K.G., Asteris P.G., Cotsovos D.M., Badogiannis E., Tsivilis S. Mechanical properties of soilcrete mixtures modified with metakaolin. Construct. Build. Mater. 2013;47:1026–1036. doi: 10.1016/j.conbuildmat.2013.06.008. [DOI] [Google Scholar]
  • 11.Kolovos K.G., Asteris P.G., Tsivilis S. Properties of sandcrete mixtures modified with metakaolin. European Journal of Environmental and Civil Engineering. 2016;20:s18–s37. doi: 10.1080/19648189.2016.1246690. Nov. [DOI] [Google Scholar]
  • 12.Asteris P.G., Kolovos K.G. Data on the physical and mechanical properties of soilcrete materials modified with metakaolin. Data Brief. Aug. 2017;13:487–497. doi: 10.1016/j.dib.2017.06.014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Tam V.W.Y., Tam C.M. A review on the viable technology for construction waste recycling. Resour. Conserv. Recycl. Jun. 2006;47(3):209–221. doi: 10.1016/j.resconrec.2005.12.002. [DOI] [Google Scholar]
  • 14.Brouwers H.J.H., Radix H.J. Self-compacting concrete: theoretical and experimental study. Cem Concr Res. 2005;35(11):2116–2136. doi: 10.1016/j.cemconres.2005.06.002. [DOI] [Google Scholar]
  • 15.De Schutter Geert. Whittles Pub.; 2008. Self-compacting Concrete. [Google Scholar]
  • 16.A. Bradu, N. Cazacu, N. Florea, and P. Mihai, “COMPRESSIVE STRENGTH OF SELF COMPACTING CONCRETE”.
  • 17.Boukendakdji O., Kenai S., Kadri E.H., Rouis F. Effect of slag on the rheology of fresh self-compacted concrete. Construct. Build. Mater. Jul. 2009;23(7):2593–2598. doi: 10.1016/j.conbuildmat.2009.02.029. [DOI] [Google Scholar]
  • 18.Mohamed O. Durability and compressive strength of high cement replacement ratio self-consolidating concrete. Buildings. 2018;8(11) doi: 10.3390/buildings8110153. Nov. [DOI] [Google Scholar]
  • 19.Grdic Z., Despotovic I., Toplicic-Curcic G. Properties of self-compacting concrete with different types of additives. Facta Univ. – Ser. Archit. Civ. Eng. 2008;6(2):173–177. doi: 10.2298/fuace0802173g. [DOI] [Google Scholar]
  • 20.Asteris P.G., Kolovos K.G. Self-compacting concrete strength prediction using surrogate models. Neural Comput. Appl. Jan. 2019;31:409–424. doi: 10.1007/s00521-017-3007-7. [DOI] [Google Scholar]
  • 21.Valcuende M., Marco E., Parra C., Serna P. Influence of limestone filler and viscosity-modifying admixture on the shrinkage of self-compacting concrete. Cem Concr Res. Apr. 2012;42(4):583–592. doi: 10.1016/j.cemconres.2012.01.001. [DOI] [Google Scholar]
  • 22.Asteris P.G., Kolovos K.G., Douvika M.G., Roinos K. Prediction of self-compacting concrete strength using artificial neural networks. European Journal of Environmental and Civil Engineering. 2016;20:s102–s122. doi: 10.1080/19648189.2016.1246693. Nov. [DOI] [Google Scholar]
  • 23.Nunez I., Marani A., Nehdi M.L. Mixture optimization of recycled aggregate concrete using hybrid machine learning model. Materials. 2020;13(19):1–24. doi: 10.3390/ma13194331. Oct. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Zobel C.W., Cook D.F. Evaluation of neural network variable influence measures for process control. Eng. Appl. Artif. Intell. Aug. 2011;24(5):803–812. doi: 10.1016/j.engappai.2011.03.001. [DOI] [Google Scholar]
  • 25.Faraj R.H., Mohammed A.A., Omer K.M. Modeling the compressive strength of eco-friendly self-compacting concrete incorporating ground granulated blast furnace slag using soft computing techniques. Environ. Sci. Pollut. Control Ser. Oct. 2022;29(47) doi: 10.1007/s11356-022-20889-5. –71357. [DOI] [PubMed] [Google Scholar]
  • 26.Tanyildizi H., Çevik A. Modeling mechanical performance of lightweight concrete containing silica fume exposed to high temperature using genetic programming. Construct. Build. Mater. 2010;24(12):2612–2618. doi: 10.1016/j.conbuildmat.2010.05.001. [DOI] [Google Scholar]
  • 27.Wang H.L., Yin Z.Y. High performance prediction of soil compaction parameters using multi expression programming. Eng. Geol. 2020;276(Oct) doi: 10.1016/j.enggeo.2020.105758. [DOI] [Google Scholar]
  • 28.Mohammadzadeh D., Kazemi S.-F., Mosavi A., Nasseralshariati E., Tah J.H.M. Prediction of Compression Index of Fine-Grained Soils Using a Gene Expression Programming Model. Infrastructures. 2019;4(2):26. doi: 10.3390/infrastructures4020026. [DOI] [Google Scholar]
  • 29.A. Demir, “New Computational Models for Better Predictions of the Soil-Compression Index”.
  • 30.Alavi A.H., Gandomi A.H., Sahab M.G., Gandomi M. Multi expression programming: a new approach to formulation of soil classification. Eng. Comput. Apr. 2010;26(2):111–118. doi: 10.1007/s00366-009-0140-7. [DOI] [Google Scholar]
  • 31.Saridemir M. Genetic programming approach for prediction of compressive strength of concretes containing rice husk ash. Construct. Build. Mater. Oct. 2010;24(10):1911–1919. doi: 10.1016/j.conbuildmat.2010.04.011. [DOI] [Google Scholar]
  • 32.Tran V.L., Kim S.E. Efficiency of three advanced data-driven models for predicting axial compression capacity of CFDST columns. Thin-Walled Struct. 2020;152(Jul) doi: 10.1016/j.tws.2020.106744. [DOI] [Google Scholar]
  • 33.Sarveghadi M., Gandomi A.H., Bolandi H., Alavi A.H. Development of prediction models for shear strength of SFRCB using a machine learning approach. Neural Comput. Appl. 2019;31(7):2085–2094. doi: 10.1007/s00521-015-1997-6. Springer London. Jul. 01. [DOI] [Google Scholar]
  • 34.Chu H.H., et al. Sustainable use of fly-ash: use of gene-expression programming (GEP) and multi-expression programming (MEP) for forecasting the compressive strength geopolymer concrete. Ain Shams Eng. J. Dec. 2021;12(4):3603–3617. doi: 10.1016/j.asej.2021.03.018. [DOI] [Google Scholar]
  • 35.Asteris P.G., Roussis P.C., Douvika M.G. Feed-forward neural network prediction of the mechanical properties of sandcrete materials. Sensors. 2017;17(6) doi: 10.3390/s17061344. Jun. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Sarir P., Chen J., Asteris P.G., Armaghani D.J., Tahir M.M. Developing GEP tree-based, neuro-swarm, and whale optimization models for evaluation of bearing capacity of concrete-filled steel tube columns. Eng. Comput. Jan. 2021;37(1):1–19. doi: 10.1007/s00366-019-00808-y. [DOI] [Google Scholar]
  • 37.Farooq F., et al. A comparative study for the prediction of the compressive strength of self-compacting concrete modified with fly ash. Materials. 2021;14(17) doi: 10.3390/ma14174934. Sep. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Asteris P.G., Kolovos K.G., Douvika M.G., Roinos K. Prediction of self-compacting concrete strength using artificial neural networks. European Journal of Environmental and Civil Engineering. 2016;20:s102–s122. doi: 10.1080/19648189.2016.1246693. Nov. [DOI] [Google Scholar]
  • 39.Mai H.V.T., Nguyen M.H., Ly H.B. Development of machine learning methods to predict the compressive strength of fiber-reinforced self-compacting concrete and sensitivity analysis. Construct. Build. Mater. 2023;367 doi: 10.1016/j.conbuildmat.2023.130339. Feb. [DOI] [Google Scholar]
  • 40.de-Prado-Gil J., Palencia C., Silva-Monteiro N., Martínez-García R. To predict the compressive strength of self compacting concrete with recycled aggregates utilizing ensemble machine learning models. Case Stud. Constr. Mater. 2022;16(Jun) doi: 10.1016/j.cscm.2022.e01046. [DOI] [Google Scholar]
  • 41.Saha P., Debnath P., Thomas P. Prediction of fresh and hardened properties of self-compacting concrete using support vector regression approach. Neural Comput. Appl. Jun. 2020;32(12):7995–8010. doi: 10.1007/s00521-019-04267-w. [DOI] [Google Scholar]
  • 42.Bušić R., Benšić M., Miličević I., Strukar K. Prediction models for the mechanical properties of self-compacting concrete with recycled rubber and silica fume. Materials. 2020;13(8) doi: 10.3390/MA13081821. Apr. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43.Siddique R., Aggarwal P., Aggarwal Y. Prediction of compressive strength of self-compacting concrete containing bottom ash using artificial neural networks. Adv. Eng. Software. 2011;42(10):780–786. doi: 10.1016/j.advengsoft.2011.05.016. [DOI] [Google Scholar]
  • 44.Rajakarunakaran S.A., et al. Prediction of strength and analysis in self-compacting concrete using machine learning based regression techniques. Adv. Eng. Software. 2022;173 doi: 10.1016/j.advengsoft.2022.103267. Nov. [DOI] [Google Scholar]
  • 45.Kaveh A., Bakhshpoori T., Hamze-Ziabari S.M. M5’ and mars based prediction models for properties of selfcompacting concrete containing fly ash. Period. Polytech. Civ. Eng. Mar. 2018;62(2):281–294. doi: 10.3311/PPci.10799. [DOI] [Google Scholar]
  • 46.J. Web, L. Chen, and W. Jiang, “Advanced in Engineering and Intelligence Systems Estimation of the Compressive Strength of Self-Compacting Concrete (SCC) by a Machine Learning Technique Coupling with Novel Optimization Algorithms”.
  • 47.Sonebi M., Cevik A. Genetic programming based formulation for fresh and hardened properties of self-compacting concrete containing pulverised fuel ash. Construct. Build. Mater. Jul. 2009;23(7):2614–2622. doi: 10.1016/j.conbuildmat.2009.02.012. [DOI] [Google Scholar]
  • 48.Belalia Douma O., Boukhatem B., Ghrici M., Tagnit-Hamou A. Prediction of properties of self-compacting concrete containing fly ash using artificial neural network. Neural Comput. Appl. Dec. 2017;28:707–718. doi: 10.1007/s00521-016-2368-7. [DOI] [Google Scholar]
  • 49.Kina C., Turk K., Atalay E., Donmez I., Tanyildizi H. Comparison of extreme learning machine and deep learning model in the estimation of the fresh properties of hybrid fiber-reinforced SCC. Neural Comput. Appl. Sep. 2021;33(18):11641–11659. doi: 10.1007/s00521-021-05836-8. [DOI] [Google Scholar]
  • 50.Abunassar N., Alas M., Ali S.I.A. Prediction of compressive strength in self-compacting concrete containing fly ash and silica fume using ANN and SVM. Arabian J. Sci. Eng. Apr. 2023;48(4):5171–5184. doi: 10.1007/s13369-022-07359-3. [DOI] [Google Scholar]
  • 51.Ghunimat D., Alzoubi A.E., Alzboon A., Hanandeh S. Prediction of concrete compressive strength with GGBFS and fly ash using multilayer perceptron algorithm, random forest regression and k-nearest neighbor regression. Asian Journal of Civil Engineering. Jan. 2023;24(1):169–177. doi: 10.1007/s42107-022-00495-z. [DOI] [Google Scholar]
  • 52.Kina C., Turk K., Tanyildizi H. Estimation of strengths of hybrid FR-SCC by using deep-learning and support vector regression models. Struct. Concr. Oct. 2022;23(5):3313–3330. doi: 10.1002/suco.202100622. [DOI] [Google Scholar]
  • 53.Oltean M. Multi Expression Programming for solving classification problems Fruit recognition from images using deep learning View project Optical Computing View project Mihai Oltean Multi Expression Programming for solving classification problems. 2022 doi: 10.21203/rs.3.rs-1458572/v1. [DOI] [Google Scholar]
  • 54.Zhang Q., Meng X., Yang B., Liu W. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) Springer Verlag; 2016. MREP: multi-reference expression programming; pp. 26–38. [DOI] [Google Scholar]
  • 55.Crina M.O., Gros¸an G. 2003. A Comparison of Several Linear GP Techniques A Comparison of Several Linear Genetic Programming Techniques. [Online] [Google Scholar]
  • 56.Deng W., He P., Huang Z. Lecture Notes in Electrical Engineering. 2013. Multi-expression based gene expression programming; pp. 439–448. [DOI] [Google Scholar]
  • 57.Chisari C., Bedon C. Multi-objective optimization of FRP jackets for improving the seismic response of reinforced concrete frames. Am. J. Eng. Appl. Sci. Sep. 2016;9(3):669–679. doi: 10.3844/ajeassp.2016.669.679. [DOI] [Google Scholar]

Associated Data


Supplementary Materials

Multimedia component 1: mmc1.xlsx (38.6 KB)

Data Availability Statement

Data will be made available on request.

