Table 3. Performance of ML model I with different classification algorithms for the training and test datasets
5-fold cross-validation performance of classification algorithms on the training dataset (values in %)

| Algorithms | Precision | Recall | F1-score | AUC-ROC | Accuracy |
|---|---|---|---|---|---|
| SVM | 86.92 ± 5.18 | 82.68 ± 3.94 | 84.64 ± 3.58 | 92.53 ± 3.34 | 85.04 ± 3.41 |
| Extra Trees | 85.77 ± 6.05 | 83.58 ± 7.19 | 84.26 ± 3.62 | 92.82 ± 3.64 | 84.52 ± 3.21 |
| Random Forest | 85.14 ± 6.11 | 83.45 ± 7.72 | 83.92 ± 4.47 | 92.08 ± 4.24 | 84.16 ± 4.04 |
| AdaBoost | 88.06 ± 7.36 | 80.52 ± 9.79 | 83.63 ± 6.36 | 93.69 ± 3.61 | 84.47 ± 5.63 |
| XGBoost | 82.99 ± 6.23 | 82.81 ± 7.04 | 83.00 ± 5.33 | 92.39 ± 4.58 | 82.74 ± 5.18 |
| LR | 82.80 ± 5.21 | 79.21 ± 12.17 | 80.65 ± 8.36 | 90.01 ± 4.06 | 81.55 ± 6.93 |
Performance of classification algorithms on the test dataset (values in %)

| Algorithms | Precision | Recall | F1-score | AUC-ROC | Accuracy |
|---|---|---|---|---|---|
| SVM | 93.75 | 85.71 | 89.55 | 93.89 | 90.14 |
| Random Forest | 88.78 | 79.54 | 83.84 | 91.60 | 84.94 |
| Extra Trees | 89.94 | 76.79 | 82.80 | 91.29 | 84.31 |
| AdaBoost | 77.42 | 68.57 | 72.72 | 84.84 | 74.65 |
| XGBoost | 83.87 | 74.28 | 78.78 | 88.17 | 80.28 |
| LR | 87.10 | 77.14 | 81.81 | 88.01 | 83.09 |
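The sketch below illustrates how metrics of this form could be produced: 5-fold cross-validation on the training set (reported as mean ± SD across folds) followed by a single evaluation on the held-out test set. It is a minimal reconstruction, not the study's actual pipeline: the data variables (`X_train`, `y_train`, `X_test`, `y_test`), all hyperparameters, and the random seeds are assumptions.

```python
# Minimal sketch of the Table 3 evaluation protocol with scikit-learn.
# Hyperparameters, seeds, and the data variables are assumptions, not the
# paper's actual configuration.
from sklearn.model_selection import StratifiedKFold, cross_validate
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score)
from sklearn.svm import SVC
from sklearn.ensemble import (ExtraTreesClassifier, RandomForestClassifier,
                              AdaBoostClassifier)
from sklearn.linear_model import LogisticRegression
from xgboost import XGBClassifier  # assumes the xgboost package is installed

models = {
    "SVM": SVC(),  # default kernel/C assumed; the paper's settings may differ
    "Extra Trees": ExtraTreesClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "AdaBoost": AdaBoostClassifier(random_state=0),
    "XGBoost": XGBClassifier(eval_metric="logloss", random_state=0),
    "LR": LogisticRegression(max_iter=1000),
}
scoring = ["precision", "recall", "f1", "roc_auc", "accuracy"]
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

def evaluate(X_train, y_train, X_test, y_test):
    for name, model in models.items():
        # 5-fold CV on the training set: mean ± SD per metric, scaled to %
        scores = cross_validate(model, X_train, y_train, cv=cv, scoring=scoring)
        summary = ", ".join(
            f"{m}: {100 * scores[f'test_{m}'].mean():.2f} "
            f"± {100 * scores[f'test_{m}'].std():.2f}" for m in scoring)
        print(f"[CV]   {name}: {summary}")

        # Refit on the full training set, then score once on the test set
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        # AUC-ROC needs a continuous score; SVC without probability=True
        # exposes decision_function instead of predict_proba
        if hasattr(model, "predict_proba"):
            score = model.predict_proba(X_test)[:, 1]
        else:
            score = model.decision_function(X_test)
        print(f"[Test] {name}: "
              f"precision: {100 * precision_score(y_test, pred):.2f}, "
              f"recall: {100 * recall_score(y_test, pred):.2f}, "
              f"f1: {100 * f1_score(y_test, pred):.2f}, "
              f"roc_auc: {100 * roc_auc_score(y_test, score):.2f}, "
              f"accuracy: {100 * accuracy_score(y_test, pred):.2f}")
```

Note that the test-set figures come from a model refit on the entire training set, so they can legitimately exceed the cross-validation means (as SVM's do here), since each CV fold trains on only 80% of the training data.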