Table 2.
Performance analysis of test data using traditional machine learning techniques with feature reduction (Technique 2).
ML models | Accu. (Chi-square) | Prec. (Chi-square) | Rec. (Chi-square) | Spec. (Chi-square) | F1-s (Chi-square) | Time (Chi-square) | Accu. (PCA) | Prec. (PCA) | Rec. (PCA) | Spec. (PCA) | F1-s (PCA) | Time (PCA)
---|---|---|---|---|---|---|---|---|---|---|---|---
Logistic Regression | 0.768 | 0.769 | 0.76 | 0.77 | 0.77 | 3.7 | 0.760 | 0.761 | 0.76 | 0.76 | 0.76 | 3.9 |
SVM | 0.823 | 0.820 | 0.82 | 0.82 | 0.82 | 111.9 | 0.776 | 0.770 | 0.77 | 0.77 | 0.77 | 113.9 |
Decision Tree | 0.639 | 0.647 | 0.64 | 0.69 | 0.63 | 14.6 | 0.637 | 0.673 | 0.63 | 0.54 | 0.65 | 10.2 |
KNN | 0.778 | 0.822 | 0.73 | 0.67 | 0.72 | 6.3 | 0.782 | 0.819 | 0.77 | 0.72 | 0.77 | 7.6 |
Naive Bayes | 0.698 | 0.699 | 0.69 | 0.69 | 0.69 | 2.75 | 0.697 | 0.664 | 0.67 | 0.65 | 0.66 | 3.3 |
Random Forest | 0.888 | 0.886 | 0.89 | 0.89 | 0.88 | 4.3 | 0.793 | 0.781 | 0.72 | 0.70 | 0.72 | 4.8 |
Gradient Boosting | 0.853 | 0.859 | 0.86 | 0.86 | 0.85 | 35.2 | 0.805 | 0.809 | 0.81 | 0.81 | 0.80 | 36.8 |
Adaptive Boosting | 0.773 | 0.789 | 0.77 | 0.75 | 0.77 | 177.2 | 0.790 | 0.792 | 0.79 | 0.78 | 0.79 | 104.1 |
XG Boosting | 0.783 | 0.779 | 0.77 | 0.74 | 0.77 | 75.2 | 0.765 | 0.769 | 0.76 | 0.74 | 0.76 | 86.8 |
CAT Boost | 0.811 | 0.813 | 0.81 | 0.80 | 0.81 | 48.3 | 0.799 | 0.795 | 0.79 | 0.79 | 0.80 | 50.5 |
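For reference, the metrics reported in the table are standard functions of the binary confusion-matrix counts. The sketch below (not the authors' code; the counts are hypothetical, chosen only for illustration) shows how accuracy, precision, recall, specificity, and F1-score are typically computed:

```python
# Illustrative sketch only: standard binary-classification metrics
# derived from confusion-matrix counts (tp, fp, tn, fn).
def classification_metrics(tp, fp, tn, fn):
    """Return (accuracy, precision, recall, specificity, f1)."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # also called sensitivity
    specificity = tn / (tn + fp)     # recall of the negative class
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, specificity, f1

# Hypothetical counts, for illustration only (not from the paper).
acc, prec, rec, spec, f1 = classification_metrics(tp=880, fp=110, tn=890, fn=120)
print(f"Accu.={acc:.3f} Prec.={prec:.3f} Rec.={rec:.2f} Spec.={spec:.2f} F1-s={f1:.2f}")
```

Note that specificity is not symmetric with recall: it measures how well the model rejects negatives, which is why the Decision Tree rows in the table can show specificity values that diverge noticeably from recall.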