Table 2.
Multi-model classification: training set results.
Model | AUC (SD) | Cutoff (SD) | Accuracy (SD) | Sensitivity (SD) | Specificity (SD) | Positive predictive value (SD) | Negative predictive value (SD) | F1 score (SD) | Kappa (SD) |
---|---|---|---|---|---|---|---|---|---|
XGBoost | 1.000 (0.000) | 0.873 (0.012) | 0.996 (0.000) | 1.000 (0.000) | 1.000 (0.000) | 1.000 (0.000) | 0.989 (0.000) | 1.000 (0.000) | 0.991 (0.000) |
Logistic regression | 0.789 (0.016) | 0.553 (0.036) | 0.744 (0.013) | 0.784 (0.038) | 0.694 (0.034) | 0.792 (0.012) | 0.679 (0.031) | 0.787 (0.016) | 0.469 (0.022) |
LightGBM | 1.000 (0.000) | 0.567 (0.024) | 0.994 (0.002) | 0.997 (0.004) | 1.000 (0.000) | 1.000 (0.000) | 0.985 (0.005) | 0.998 (0.002) | 0.987 (0.005) |
Random forest | 1.000 (0.000) | 0.540 (0.037) | 0.987 (0.012) | 0.997 (0.004) | 0.998 (0.004) | 1.000 (0.000) | 0.970 (0.027) | 0.999 (0.002) | 0.974 (0.024) |
GNB | 0.765 (0.017) | 0.728 (0.120) | 0.734 (0.012) | 0.732 (0.027) | 0.748 (0.036) | 0.812 (0.019) | 0.646 (0.018) | 0.769 (0.012) | 0.460 (0.023) |
SVM | 0.787 (0.018) | 0.602 (0.031) | 0.733 (0.015) | 0.759 (0.052) | 0.705 (0.056) | 0.793 (0.021) | 0.659 (0.034) | 0.774 (0.020) | 0.451 (0.027) |
KNN | 0.834 (0.020) | 0.680 (0.098) | 0.640 (0.075) | 0.743 (0.112) | 0.748 (0.127) | 0.916 (0.070) | 0.541 (0.059) | 0.810 (0.046) | 0.336 (0.104) |
CNB | 0.742 (0.022) | 0.195 (0.388) | 0.700 (0.023) | 0.693 (0.062) | 0.721 (0.052) | 0.787 (0.020) | 0.609 (0.029) | 0.735 (0.034) | 0.395 (0.037) |
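The metrics in Table 2 follow standard confusion-matrix definitions, with the reported SDs presumably reflecting variation across cross-validation repeats. The sketch below shows how one model's row could be computed at a given probability cutoff, assuming Python with scikit-learn; the function and variable names (`classification_metrics`, `y_true`, `y_prob`, `cutoff`) are illustrative assumptions, not taken from the study.

```python
# Minimal sketch of the Table 2 metrics for one model; illustrative only.
import numpy as np
from sklearn.metrics import (roc_auc_score, accuracy_score, confusion_matrix,
                             f1_score, cohen_kappa_score)

def classification_metrics(y_true, y_prob, cutoff=0.5):
    """Compute AUC, accuracy, sensitivity, specificity, PPV, NPV, F1, and kappa
    from true labels and predicted probabilities at the chosen cutoff."""
    y_pred = (np.asarray(y_prob) >= cutoff).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "AUC": roc_auc_score(y_true, y_prob),
        "Cutoff": cutoff,
        "Accuracy": accuracy_score(y_true, y_pred),
        "Sensitivity": tp / (tp + fn),   # recall for the positive class
        "Specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),           # positive predictive value (precision)
        "NPV": tn / (tn + fn),           # negative predictive value
        "F1 score": f1_score(y_true, y_pred),
        "Kappa": cohen_kappa_score(y_true, y_pred),
    }
```

Repeating this calculation over cross-validation folds and taking the mean and standard deviation of each metric would yield values in the form reported above.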