Table 3.
Comparison of the performance of the six models on the training set.
| Model | Accuracy | AUC | F1 | Recall | Sensitivity | Specificity |
|---|---|---|---|---|---|---|
| KNN | 0.79 (0.77, 0.81) | 0.84 (0.82, 0.86) | 0.85 (0.84, 0.87) | 0.91 (0.89, 0.93) | 0.91 (0.89, 0.93) | 0.54 (0.50, 0.58) |
| LR | 0.81 (0.79, 0.83) | 0.87 (0.86, 0.89) | 0.87 (0.85, 0.88) | 0.90 (0.88, 0.92) | 0.90 (0.88, 0.92) | 0.63 (0.59, 0.67) |
| NNET | 0.79 (0.77, 0.81) | 0.86 (0.84, 0.88) | 0.85 (0.83, 0.86) | 0.87 (0.85, 0.89) | 0.87 (0.85, 0.89) | 0.63 (0.59, 0.67) |
| RF | 0.83 (0.81, 0.84) | 0.90 (0.88, 0.91) | 0.88 (0.86, 0.89) | 0.94 (0.92, 0.95) | 0.94 (0.92, 0.95) | 0.60 (0.55, 0.64) |
| SVM | 0.79 (0.77, 0.81) | 0.87 (0.85, 0.88) | 0.84 (0.82, 0.85) | 0.80 (0.78, 0.83) | 0.80 (0.78, 0.83) | 0.75 (0.71, 0.79) |
| XGBoost | 0.82 (0.80, 0.84) | 0.90 (0.88, 0.91) | 0.87 (0.85, 0.89) | 0.90 (0.88, 0.91) | 0.90 (0.88, 0.91) | 0.66 (0.62, 0.71) |
LR, Logistic Regression; RF, Random Forest; XGBoost, Extreme Gradient Boosting; SVM, Support Vector Machine; KNN, K-Nearest Neighbor; NNET, Neural Network.
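As an illustrative sketch (not taken from the paper), the metrics reported above and the bracketed intervals can be reproduced from binary labels and class predictions. The toy labels, predictions, and the percentile-bootstrap procedure below are assumptions for demonstration only; the paper does not specify how its intervals were derived.

```python
import random

def confusion(y_true, y_pred):
    # Counts for a binary confusion matrix (positive class = 1).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def sensitivity(y_true, y_pred):
    # Sensitivity = recall = TP / (TP + FN); hence the identical
    # Recall and Sensitivity columns in the table.
    tp, _, _, fn = confusion(y_true, y_pred)
    return tp / (tp + fn)

def specificity(y_true, y_pred):
    # Specificity = TN / (TN + FP), i.e. recall of the negative class.
    _, tn, fp, _ = confusion(y_true, y_pred)
    return tn / (tn + fp)

def bootstrap_ci(metric, y_true, y_pred, n_boot=1000, seed=0):
    # Percentile bootstrap: resample cases with replacement, recompute
    # the metric, and take the 2.5th and 97.5th percentiles.
    rng = random.Random(seed)
    n = len(y_true)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(metric([y_true[i] for i in idx],
                            [y_pred[i] for i in idx]))
    stats.sort()
    point = metric(y_true, y_pred)
    return point, stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]

# Toy data standing in for training-set labels and model predictions.
y_true = [1 if i % 2 == 0 else 0 for i in range(200)]
y_pred = [t if i % 5 else 1 - t for i, t in enumerate(y_true)]

for name, metric in [("Sensitivity", sensitivity),
                     ("Specificity", specificity)]:
    est, lo, hi = bootstrap_ci(metric, y_true, y_pred)
    print(f"{name}: {est:.2f} ({lo:.2f}, {hi:.2f})")
```

AUC would additionally require predicted probabilities rather than hard class labels, which is why it is computed separately in practice (e.g. from each model's score output).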