Table 3. Results of the model hyperparameter tuning.

| Model | Hyperparameter | Tuned value (lower bound-upper bound) |
| --- | --- | --- |
| Logistic regression | —^a | — |
| Support vector machine | Gamma | 0.1 (0.1-10) |
| | Cost | 7.1 (0.1-100) |
| | Degree | 3 (1-5) |
| | Kernel | Radial basis |
| K-nearest neighbor | K | 1 (1-10) |
| Random forest | Number of trees | 230 (10-300) |
| | Depth | 90 (10-100) |
| | Features | 3 (1-25) |
| Extreme gradient boosting | Number of trees | 30 (10-300) |
| | Depth | 16 (10-100) |
| | Eta | 0.23 (0.01-0.4) |
| | Gamma | 0.19 (0.01-0.2) |
| | Lambda | 1.60 (0.1-2) |
| | Alpha | 0.30 (0.1-2) |
| Bayesian additive regression trees | Number of trees | 30 (10-100) |
| | Depth | 90 (10-100) |
| Artificial neural network | Layers | 2 (1-5) |
| | Neurons | 16.8 (64.2-32.2) |
| | Threshold | 0.001 (0.1-0.001) |

^a Not available.
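As a point of reference only, the sketch below shows one way a bounded search like the one summarized in Table 3 could be specified. The search ranges for the support vector machine and random forest are taken from the table; the choice of scikit-learn's `RandomizedSearchCV`, the scoring setup, and the helper `tune` are illustrative assumptions, not the software actually used in this study.

```python
# Minimal sketch of a bounded hyperparameter search (assumed scikit-learn setup;
# only the search ranges come from Table 3).
from scipy.stats import randint, uniform
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

# Search spaces mirroring the lower/upper bounds reported in Table 3.
search_spaces = {
    "support_vector_machine": (
        SVC(),
        {
            "gamma": uniform(0.1, 9.9),    # gamma, 0.1-10
            "C": uniform(0.1, 99.9),       # cost, 0.1-100
            "degree": randint(1, 6),       # degree, 1-5
            "kernel": ["rbf", "poly"],     # radial basis was selected in Table 3
        },
    ),
    "random_forest": (
        RandomForestClassifier(),
        {
            "n_estimators": randint(10, 301),  # number of trees, 10-300
            "max_depth": randint(10, 101),     # depth, 10-100
            "max_features": randint(1, 26),    # features, 1-25
        },
    ),
}


def tune(estimator, param_distributions, X, y, n_iter=50):
    """Run a randomized search over the given bounds and return the best result."""
    search = RandomizedSearchCV(
        estimator,
        param_distributions,
        n_iter=n_iter,
        cv=5,
        random_state=0,
    )
    search.fit(X, y)
    return search.best_params_, search.best_score_
```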