Table 6. Comparison of different classifiers (test dataset; first layer).
| Classifier | Precision CN (%) | Precision MCI (%) | Precision AD (%) | Recall CN (%) | Recall MCI (%) | Recall AD (%) | MCA (%) | MCF (%) |
|---|---|---|---|---|---|---|---|---|
| SVM | 100.0 | 91.43 | 83.87 | 96.67 | 89.13 | 89.66 | 91.43 | 91.74 |
| KNN | 65.71 | 59.57 | 73.91 | 76.67 | 60.87 | 58.62 | 64.76 | 65.89 |
| DT | 100.0 | 86.54 | 96.30 | 86.67 | 97.83 | 89.66 | 92.38 | 92.81 |
| NB | 90.32 | 92.68 | 84.85 | 93.33 | 82.61 | 96.55 | 89.52 | 90.05 |
| RF (our model)* | 100.0 | 86.79 | 100.0 | 86.67 | 100.0 | 89.66 | 93.33 | 93.82 |
MCA: multiclass classification accuracy; MCF: multiclass F1 score. Asterisk (*): model with the best predictive performance.
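As a rough illustration of how the per-class precision/recall and the aggregate MCA and MCF values in Table 6 could be obtained, the following is a minimal scikit-learn sketch. The label arrays are placeholders, and the averaging scheme used for MCF (weighted here) is an assumption, not confirmed by the source.

```python
# Minimal sketch (not the authors' code) of computing the metrics in Table 6.
# y_true and y_pred are hypothetical placeholder labels over the three classes.
from sklearn.metrics import precision_score, recall_score, accuracy_score, f1_score

labels = ["CN", "MCI", "AD"]
y_true = ["CN", "MCI", "AD", "MCI", "CN", "AD"]   # placeholder ground truth
y_pred = ["CN", "MCI", "AD", "CN", "CN", "AD"]    # placeholder predictions

# Per-class precision and recall, reported in percent
precision = precision_score(y_true, y_pred, labels=labels, average=None) * 100
recall = recall_score(y_true, y_pred, labels=labels, average=None) * 100

# Multiclass classification accuracy (MCA) and multiclass F1 (MCF);
# "weighted" averaging for MCF is an assumption.
mca = accuracy_score(y_true, y_pred) * 100
mcf = f1_score(y_true, y_pred, labels=labels, average="weighted") * 100

for lbl, p, r in zip(labels, precision, recall):
    print(f"{lbl}: precision = {p:.2f}%, recall = {r:.2f}%")
print(f"MCA = {mca:.2f}%, MCF = {mcf:.2f}%")
```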