Sensors. 2021 Jul 5;21(13):4606. doi: 10.3390/s21134606

Table A2.

The list of candidate hyperparameters and the tuning results used to find the optimal parameter set for XGBoost.

List of Candidate Hyperparameters

n_estimators:      {500, 1000, 1500, 2000, 2500, 3000}
colsample_bytree:  {0.2, 0.4, 0.6, 0.8, 1.0}
learning_rate:     {0.01, 0.03, 0.05, 0.07, 0.09}
max_depth:         {5, 6, 7, 8, 9}
subsample:         {0.2, 0.4, 0.6, 0.8, 1.0}
Hyperparameter Tuning Results (Top 6)

n_estimators  colsample_bytree  learning_rate  max_depth  subsample  Best Score
3000          1.0               0.03           9          0.4        0.998009
2500          1.0               0.03           9          0.4        0.997989
3000          1.0               0.03           8          0.4        0.997984
2000          1.0               0.03           9          0.4        0.997951
2500          1.0               0.03           8          0.4        0.997945
3000          1.0               0.05           8          0.4        0.997924
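The candidate sets above span 6 × 5 × 5 × 5 × 5 = 3750 parameter combinations. As a minimal sketch (plain Python; assuming an exhaustive grid search was the tuning strategy, which the table alone does not confirm), the search space can be enumerated like this:

```python
from itertools import product

# Candidate hyperparameter values from Table A2 (XGBoost)
param_grid = {
    "n_estimators": [500, 1000, 1500, 2000, 2500, 3000],
    "colsample_bytree": [0.2, 0.4, 0.6, 0.8, 1.0],
    "learning_rate": [0.01, 0.03, 0.05, 0.07, 0.09],
    "max_depth": [5, 6, 7, 8, 9],
    "subsample": [0.2, 0.4, 0.6, 0.8, 1.0],
}

# Enumerate every combination in the grid
keys = list(param_grid)
combos = [dict(zip(keys, vals)) for vals in product(*param_grid.values())]
print(len(combos))  # 3750 candidate parameter sets to evaluate
```

Each combination would then be scored (e.g. by cross-validation), which is what produces the "Best Score" column; the top-ranked set here is n_estimators=3000, colsample_bytree=1.0, learning_rate=0.03, max_depth=9, subsample=0.4.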