Ref. | Method | Accuracy
[22] | Fusion model (LR and RF) | 99.83%
[23] | J48 classifier | 99.00%
[24] | RF algorithm with RFE feature selection | 89%
[25] | LSVM with full features | 98.86%
[26] | RF with random forest feature selection | 98.8%
[27] | MLP classifier with genetic search algorithm | 98.1%
[28] | Random subspace method with KNN classifier | 97.2%
[29] | Gradient boosting machines (GBM) | 97.5%
[30] | Deep learning with convolutional neural networks (CNN) | 98.3%
[31] | XGBoost with feature selection | 99.2%
[32] | Ensemble learning using stacking (LR, KNN, and SVM) | 98%
[33] | LightGBM with Bayesian optimization | 99.0%
[34] | CatBoost with feature selection | 98.2%
[35] | Extreme learning machines (ELM) | 97.3%
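As a point of reference for the ensemble entries above, a stacking setup of the kind listed for [32] (LR, KNN, and SVM base learners) can be sketched with scikit-learn's StackingClassifier. The dataset, train/test split, and hyperparameters below are illustrative assumptions, not details taken from the cited study.

```python
# Minimal sketch of a stacking ensemble with LR, KNN, and SVM base learners,
# as listed for [32]; dataset and hyperparameters are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import StackingClassifier

# Placeholder dataset; the cited work used its own data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Base learners, each preceded by feature scaling.
base_learners = [
    ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))),
    ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
]

# A logistic-regression meta-learner combines the base predictions
# produced via 5-fold cross-validation on the training set.
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5)
stack.fit(X_train, y_train)
print(f"Hold-out accuracy: {stack.score(X_test, y_test):.4f}")
```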