J Safety Res. 2021 May 6;78:189–202. doi: 10.1016/j.jsr.2021.04.007

Table 6.

Examined ranges and optimized values of the hyperparameters for the speeding XGBoost model.

| Hyperparameter | Examined range | Optimized value |
| --- | --- | --- |
| Learning rate | 0.000–1.000 | 0.06 |
| Gamma | 0–100 | 0.34 |
| Maximum tree depth | 1–50 | 2 |
| Minimum child weight | 1–10 | 4 |
| Number of rounds | 1–1000 | 250 |
| Mean squared error | as low as possible | 0.177 |
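
The sketch below shows how the optimized values in Table 6 map onto the standard `xgboost` training API. It is not the authors' code: the dataset here is a random placeholder standing in for the speeding data, and the assumption that the model is a squared-error regression follows only from the MSE reported in the table.

```python
# Minimal sketch: training an XGBoost regressor with the Table 6 hyperparameters.
# X_train/y_train/X_test/y_test are hypothetical placeholders, not the study data.
import numpy as np
import xgboost as xgb
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(800, 10)), rng.normal(size=800)
X_test, y_test = rng.normal(size=(200, 10)), rng.normal(size=200)

params = {
    "objective": "reg:squarederror",  # assumed regression objective (MSE reported in Table 6)
    "eta": 0.06,                      # learning rate
    "gamma": 0.34,                    # minimum loss reduction required to make a split
    "max_depth": 2,                   # maximum tree depth
    "min_child_weight": 4,            # minimum child weight
}

dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)

# 250 boosting rounds, per the "Number of rounds" row.
booster = xgb.train(params, dtrain, num_boost_round=250)

preds = booster.predict(dtest)
print("Test MSE:", mean_squared_error(y_test, preds))
```

With the study's actual data, the printed test MSE would correspond to the 0.177 reported in the final row of the table.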