TABLE 2.
The table reports the accuracy, ROC, AUC, and F1 score of the top 10 ML models.
| Sr No. | Model | Accuracy | ROC | AUC | F1 score |
|---|---|---|---|---|---|
| 1 | ExtraTreesClassifier | 0.89 | 0.88 | 0.88 | 0.89 |
| 2 | NuSVC | 0.88 | 0.87 | 0.88 | 0.88 |
| 3 | RandomForestClassifier | 0.88 | 0.87 | 0.88 | 0.88 |
| 4 | SVC | 0.89 | 0.87 | 0.88 | 0.88 |
| 5 | AdaBoostClassifier | 0.87 | 0.87 | 0.87 | 0.87 |
| 6 | BaggingClassifier | 0.87 | 0.87 | 0.87 | 0.87 |
| 7 | LGBMClassifier | 0.87 | 0.87 | 0.87 | 0.87 |
| 8 | DecisionTreeClassifier | 0.87 | 0.87 | 0.87 | 0.87 |
| 9 | ExtraTreeClassifier | 0.86 | 0.86 | 0.86 | 0.86 |
| 10 | KNeighborsClassifier | 0.86 | 0.86 | 0.86 | 0.86 |
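The original pipeline used to produce these scores is not shown here; the following is a minimal sketch, assuming a scikit-learn workflow with a placeholder dataset, train/test split, and default hyperparameters, of how accuracy, ROC AUC, and F1 could be computed for a few of the listed models.

```python
# Hedged sketch: placeholder data and settings, not the study's actual pipeline.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.svm import NuSVC
from sklearn.metrics import accuracy_score, roc_auc_score, f1_score

# Synthetic binary-classification data standing in for the study's dataset (assumption).
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# A subset of the models listed in Table 2, with default settings (assumption).
models = {
    "ExtraTreesClassifier": ExtraTreesClassifier(random_state=42),
    "NuSVC": NuSVC(probability=True, random_state=42),
    "RandomForestClassifier": RandomForestClassifier(random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    y_proba = model.predict_proba(X_test)[:, 1]  # positive-class scores for ROC AUC
    print(
        f"{name}: "
        f"accuracy={accuracy_score(y_test, y_pred):.2f}, "
        f"roc_auc={roc_auc_score(y_test, y_proba):.2f}, "
        f"f1={f1_score(y_test, y_pred):.2f}"
    )
```

The remaining models in the table (AdaBoostClassifier, BaggingClassifier, LGBMClassifier, DecisionTreeClassifier, ExtraTreeClassifier, KNeighborsClassifier) could be evaluated the same way by adding them to the `models` dictionary.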