Table 4. Classification performance (accuracy, precision, sensitivity, specificity, F1-score, and AUC) of the evaluated classifiers on the two studied datasets
| Dataset | Classifier | Accuracy (%) | Precision (%) | Sensitivity (%) | Specificity (%) | F1-score (%) | AUC (%) |
|---|---|---|---|---|---|---|---|
| Dataset 1 | MLP | 79.96 | 82.40 | 79.67 | 80.29 | 81.01 | 79.98 |
| | SVM | 75.72 | 78.70 | 75.10 | 76.44 | 76.86 | 75.77 |
| | LR | 77.51 | 80.70 | 76.35 | 78.85 | 78.46 | 77.60 |
| | Decision tree | 79.69 | 79.38 | **84.65** | 74.52 | 81.93 | 79.58 |
| | Gradient boosting | 80.40 | 81.22 | 82.57 | 77.88 | 81.89 | 80.23 |
| | Random forest | **81.51** | **83.19** | 82.16 | **80.77** | **82.67** | **81.46** |
| | XGBoost | 80.40 | 81.22 | 82.57 | 77.88 | 81.89 | 80.23 |
| | AdaBoost | 79.96 | 81.86 | 80.50 | 79.33 | 81.17 | 79.91 |
| Dataset 2 | MLP | 89.36 | 89.36 | 89.36 | 90.01 | 89.36 | 89.68 |
| | SVM | 92.62 | 92.62 | 92.62 | 93.90 | 92.62 | 93.26 |
| | LR | **92.88** | **92.88** | **92.88** | **94.28** | **92.88** | **93.58** |
| | Decision tree | 85.96 | 85.96 | 85.96 | 86.27 | 85.96 | 86.12 |
| | Gradient boosting | 92.41 | 92.41 | 92.41 | 93.95 | 92.41 | 93.18 |
| | Random forest | 89.36 | 89.36 | 89.36 | 90.01 | 89.36 | 89.68 |
| | XGBoost | 92.36 | 92.36 | 92.36 | 93.94 | 92.36 | 93.15 |
| | AdaBoost | 89.35 | 89.35 | 89.35 | 90.01 | 89.35 | 89.68 |
Bold values highlight the best results for the two studied datasets
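
For context, the sketch below shows how the six metrics in Table 4 could be computed for each classifier. It is a minimal sketch, assuming scikit-learn-style models, binary labels, and a single held-out test split; the synthetic data from `make_classification` and all hyperparameters are placeholders rather than the datasets or settings used in the paper, and XGBoost would be added analogously via the `xgboost` package.

```python
# Minimal sketch: compute accuracy, precision, sensitivity, specificity,
# F1-score, and AUC for a set of classifiers. Data and hyperparameters are
# placeholders (assumptions), not the paper's actual pipeline.
from sklearn.datasets import make_classification  # stand-in for the real datasets
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              AdaBoostClassifier)
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, confusion_matrix)

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

classifiers = {
    "MLP": MLPClassifier(max_iter=500, random_state=0),
    "SVM": SVC(probability=True, random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Gradient boosting": GradientBoostingClassifier(random_state=0),
    "Random forest": RandomForestClassifier(random_state=0),
    "AdaBoost": AdaBoostClassifier(random_state=0),
    # XGBoost would be added here via xgboost.XGBClassifier.
}

for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    y_prob = clf.predict_proba(X_test)[:, 1]
    tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
    print(f"{name}: "
          f"Acc={accuracy_score(y_test, y_pred):.2%}, "
          f"Prec={precision_score(y_test, y_pred):.2%}, "
          f"Sens={recall_score(y_test, y_pred):.2%}, "   # sensitivity = recall
          f"Spec={tn / (tn + fp):.2%}, "                 # specificity from the confusion matrix
          f"F1={f1_score(y_test, y_pred):.2%}, "
          f"AUC={roc_auc_score(y_test, y_prob):.2%}")
```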