Table 5. XGBoost^a hyperparameter tuning in the internal training and validation cohorts using 5-fold experiments.
Trees | Value, mean (SD)
100 | 0.80 (0.01)
200 | 0.81 (0.01)
1000 | 0.80 (0.01)
^a XGBoost: extreme gradient boosting.
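The tuning procedure summarized in Table 5 can be sketched as below. This is a minimal illustration only: the scoring metric (ROC AUC is assumed here), the cross-validation implementation (scikit-learn), and the placeholder data X and y are assumptions not specified by the table.

```python
# Minimal sketch of 5-fold tuning over the number of trees (Table 5).
# Assumptions: ROC AUC as the reported "Value", scikit-learn CV utilities,
# and synthetic placeholder data standing in for the study cohorts.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Placeholder features/labels; the real cohorts are not available here.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

for n_trees in (100, 200, 1000):
    model = XGBClassifier(n_estimators=n_trees, random_state=0)
    # 5-fold cross-validation; each entry of `scores` is one fold's score.
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    # Report mean (SD) across folds, matching the table's format.
    print(f"{n_trees} trees: {scores.mean():.2f} ({scores.std():.2f})")
```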