Table 1. Attribute evaluator + search method combinations and classifiers evaluated.
| Attribute evaluator + search method | Classifier |
|---|---|
| 1. CFS subset evaluator + greedy stepwise | 1. Naive Bayes updateable |
| 2. Information gain attribute evaluator + ranker | 2. OneRules |
| 3. Principal components analysis + ranker* | 3. Multinomial logistic regression using a ridge estimator (ridge parameter 1.0E-8) |
| 4. ReliefF attribute evaluator + ranker (number of nearest neighbors k = 10; equal influence nearest neighbors) | 4. Simple logistic regression |
| 5. OneRules attribute evaluator + ranker | 5. Multilayer perceptron using sigmoid nodes |
| 6. Mann–Whitney U-test with P < 0.05 | 6. Logistic model tree (LM_1: 6/6 (40); number of leaves: 1; size of the tree: 1) |
| 7. No selection | 7. Random forest (bagging with 100 iterations and base learner) |
| | 8. AdaBoostM1 (weight: 0.25; number of performed iterations: 10) |
| | 9. Bagging (bagging with 10 iterations and base learner) |
| | 10. Iterative classifier optimizer |
| | 11. Randomizable filtered classifier (IB1 instance-based classifier using 1 nearest neighbor for classification) |
| | 12. K-nearest neighbors (IB1 instance-based classifier using 1 nearest neighbor for classification) |
| | 13. Support vector machine using sequential minimal optimization |
| | 14. LogitBoost (number of performed iterations: 10) |
*All feature selection and classification methods were 10-fold cross-validated (stratified) with seed value = 1, except for principal components analysis + ranker. Confusion matrices were obtained for each classifier along with performance metrics. CFS: correlation-based feature selection.
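For reference, any evaluator–classifier pairing in Table 1 can be run through WEKA's Java API. The minimal sketch below is illustrative only: it assumes a dataset file named `data.arff` with the class as the last attribute (both placeholders, not from the original study) and pairs the first attribute evaluator (CFS subset evaluator + greedy stepwise) with the first classifier (Naive Bayes updateable) under stratified 10-fold cross-validation with seed 1, printing summary metrics and the confusion matrix as described in the table note.

```java
import java.util.Random;

import weka.attributeSelection.CfsSubsetEval;
import weka.attributeSelection.GreedyStepwise;
import weka.classifiers.Evaluation;
import weka.classifiers.bayes.NaiveBayesUpdateable;
import weka.classifiers.meta.AttributeSelectedClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class Table1ExampleRun {
    public static void main(String[] args) throws Exception {
        // Placeholder dataset path; the class attribute is assumed to be last.
        Instances data = DataSource.read("data.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Wrap feature selection and the classifier together so that
        // attribute selection is repeated inside each training fold.
        AttributeSelectedClassifier asc = new AttributeSelectedClassifier();
        asc.setEvaluator(new CfsSubsetEval());          // CFS subset evaluator (row 1)
        asc.setSearch(new GreedyStepwise());            // + greedy stepwise search
        asc.setClassifier(new NaiveBayesUpdateable());  // classifier 1: Naive Bayes updateable

        // Stratified 10-fold cross-validation with seed = 1, as in the table note.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(asc, data, 10, new Random(1));

        System.out.println(eval.toSummaryString());  // performance metrics
        System.out.println(eval.toMatrixString());   // confusion matrix
    }
}
```

Wrapping the evaluator inside `AttributeSelectedClassifier` keeps feature selection within each training fold, so the cross-validated estimates are not biased by selecting attributes on the full dataset.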