Antibiotics. 2024 Oct 14;13(10):971. doi: 10.3390/antibiotics13100971

Table 2.

Accuracy results of various machine learning models across different preprocessing techniques.

Machine Learning Model        Original Data   SMOTE      Feature Selection   PCA
AdaBoost                      0.674596        0.742910   0.677193            0.681545
Bagging                       0.697323        0.818048   0.698200            0.687685
Bernoulli Naive Bayes         0.626361        0.614733   0.658715            0.615117
Decision Tree                 0.666712        0.659669   0.680656            0.67979
Extra Trees                   0.630793        0.828361   0.648348            0.632604
Gradient Boosting             0.685042        0.806262   0.685919            0.681556
K-Nearest Neighbors           0.688540        0.706446   0.688551            0.689417
Linear Discriminant Analysis  0.601003        0.750645   0.640351            0.664912
Logistic Regression           0.648257        0.764273   0.688551            0.662303
Multi-Layer Perceptron        0.694691        0.838200   0.694680            0.690305
Random Forest                 0.687697        0.825046   0.691194            0.688596
Support Vector Machine        0.697323        0.776427   0.698200            0.691171

This table presents the accuracy scores of 12 machine learning algorithms, including AdaBoost, Bagging, Decision Tree, and Support Vector Machine, evaluated on the original data and on three preprocessed versions of the dataset: SMOTE oversampling, feature selection, and PCA. SMOTE generally improves accuracy across most models, most notably for Bagging (0.818048) and the Multi-Layer Perceptron (0.838200), whereas feature selection and PCA yield more mixed results depending on the model.
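The SMOTE preprocessing that drives these gains balances an imbalanced dataset by synthesizing new minority-class samples: each synthetic point is an interpolation between a minority sample and one of its k nearest minority-class neighbors. The table does not specify which implementation the authors used; the snippet below is only a minimal NumPy sketch of that core idea (the function name `smote_oversample` and the toy data are illustrative, not from the paper).

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, seed=0):
    """Generate n_new synthetic minority samples, SMOTE-style.

    X_min : (n, d) array of minority-class feature vectors (illustrative sketch).
    Each synthetic sample interpolates between a random minority point
    and one of its k nearest minority-class neighbors.
    """
    rng = np.random.default_rng(seed)
    n = len(X_min)
    # Pairwise distances within the minority class; exclude self-matches.
    dists = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(dists, np.inf)
    neighbors = np.argsort(dists, axis=1)[:, :k]  # k nearest neighbors per point

    base = rng.integers(0, n, size=n_new)                       # random base points
    nb = neighbors[base, rng.integers(0, k, size=n_new)]        # random neighbor of each
    gap = rng.random((n_new, 1))                                # interpolation fraction
    return X_min[base] + gap * (X_min[nb] - X_min[base])

# Toy minority class: 6 points in 2-D, oversampled to add 10 synthetic points.
X_minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                       [1.0, 1.0], [0.5, 0.5], [0.2, 0.8]])
synthetic = smote_oversample(X_minority, n_new=10)
```

Because every synthetic point lies on a segment between two real minority samples, the oversampled class stays inside the original feature region, which is why SMOTE tends to help the models above more than naive duplication would.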