Table 3. Mean (SD) of the cross-validation results.
| Model | Sensitivity | Specificity | Accuracy | Balanced accuracy | AUROC^a |
| --- | --- | --- | --- | --- | --- |
| XGBoost^b | 0.8662 (0.006) | 0.7655 (0.014) | 0.8498 (0.004) | 0.8309 (0.006) | 0.9061 (0.005) |
| AdaBoost^c | 0.8546 (0.005) | 0.8206 (0.012) | 0.8467 (0.005) | 0.8377 (0.006) | 0.9076 (0.006) |
| GBM^d | 0.9110 (0.005) | 0.7215 (0.001) | 0.8669 (0.002) | 0.8162 (0.004) | 0.9114 (0.002) |
| LGBM^e | 0.8927 (0.007) | 0.7462 (0.014) | 0.8587 (0.002) | 0.8194 (0.003) | 0.9083 (0.004) |
| RF^f | 0.8992 (0.006) | 0.7490 (0.011) | 0.8643 (0.003) | 0.8241 (0.004) | 0.9097 (0.002) |
| LR^g | 0.8551 (0.009) | 0.8171 (0.005) | 0.8259 (0.002) | 0.8361 (0.003) | 0.9089 (0.004) |
| SVM^h | 0.8370 (0.003) | 0.8456 (0.002) | 0.8390 (0.004) | 0.8413 (0.009) | 0.9172 (0.004) |
^a AUROC: area under the receiver operating characteristic curve.
^b XGBoost: extreme gradient boosting.
^c AdaBoost: adaptive boosting.
^d GBM: gradient boosting machine.
^e LGBM: light gradient boosting machine.
^f RF: random forest.
^g LR: logistic regression.
^h SVM: support vector machine.
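
As an illustration of how mean (SD) fold-level metrics of this kind can be computed, the sketch below uses scikit-learn with a stratified k-fold split. It is not the study's actual pipeline: the synthetic dataset, 5-fold split, 0.5 decision threshold, and random forest stand-in classifier are assumptions made purely for demonstration.

```python
# Minimal sketch: mean (SD) cross-validation metrics as reported in Table 3.
# Dataset, fold count, threshold, and model settings are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, balanced_accuracy_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import StratifiedKFold

# Placeholder data; in practice this would be the study's feature matrix and labels.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.3, 0.7], random_state=0)

folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)  # assumed 5 folds
metrics = {"sensitivity": [], "specificity": [], "accuracy": [],
           "balanced accuracy": [], "AUROC": []}

for train_idx, test_idx in folds.split(X, y):
    # Any of the classifiers in Table 3 (XGBoost, AdaBoost, GBM, LGBM, RF, LR, SVM)
    # could stand in here; a random forest is used only as an example.
    model = RandomForestClassifier(random_state=0)
    model.fit(X[train_idx], y[train_idx])
    proba = model.predict_proba(X[test_idx])[:, 1]
    pred = (proba >= 0.5).astype(int)  # assumed 0.5 threshold

    # Sensitivity = recall of the positive class; specificity = recall of the negative class.
    metrics["sensitivity"].append(recall_score(y[test_idx], pred, pos_label=1))
    metrics["specificity"].append(recall_score(y[test_idx], pred, pos_label=0))
    metrics["accuracy"].append(accuracy_score(y[test_idx], pred))
    metrics["balanced accuracy"].append(balanced_accuracy_score(y[test_idx], pred))
    metrics["AUROC"].append(roc_auc_score(y[test_idx], proba))

# Report mean (SD) across folds, mirroring the table's format.
for name, values in metrics.items():
    print(f"{name}: {np.mean(values):.4f} ({np.std(values):.3f})")
```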