Table 4. Classification performance of the evaluated algorithms.
Classification Algorithms | Percent Correctly Classified | AUC | Precision | Recall | F-Measure |
---|---|---|---|---|---|
AdaBoost | 84.28% | 0.849 | 0.981 | 0.484 | 0.649 |
Bayes Net | 81.55% | 0.889 | 0.674 | 0.744 | 0.707 |
Decision Stump | 84.31% | 0.730 | 1.000 | 0.476 | 0.645 |
Decision Table | 84.29% | 0.865 | 0.923 | 0.519 | 0.665 |
J48 (a) | 84.64% | 0.843 | 0.970 | 0.503 | 0.662 |
JRip | 84.27% | 0.737 | 0.985 | 0.483 | 0.648 |
LMT | 85.12% | 0.901 | 0.867 | 0.594 | 0.705 |
Logistic | 85.06% | 0.899 | 0.835 | 0.626 | 0.715 |
Naive Bayes | 82.63% | 0.862 | 0.768 | 0.603 | 0.675 |
OneR | 84.88% | 0.749 | 0.993 | 0.499 | 0.664 |
PART | 84.58% | 0.869 | 0.973 | 0.500 | 0.660 |
Random Forest | 84.60% | 0.893 | 0.949 | 0.514 | 0.667 |
Random Tree | 76.77% | 0.776 | 0.628 | 0.552 | 0.588 |
REP Tree | 83.61% | 0.845 | 0.884 | 0.522 | 0.656 |
SGD | 84.90% | 0.765 | 0.904 | 0.555 | 0.688 |
Simple Logistic | 85.12% | 0.901 | 0.867 | 0.594 | 0.705 |
SMO (b) | 84.84% | 0.751 | 0.972 | 0.509 | 0.668 |
Voted Perceptron | 70.70% | 0.519 | 0.773 | 0.031 | 0.060 |
Deep FNN | 82.02% | 0.751 | 0.831 | 0.820 | 0.813 |
(a) WEKA's implementation of the C4.5 decision tree.
(b) WEKA's implementation of John Platt's sequential minimal optimization algorithm for training a support vector classifier.
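For readers who wish to compute the same metrics themselves, the sketch below shows one way to evaluate two of the listed classifiers (J48 and SMO, with WEKA defaults) under 10-fold cross-validation using WEKA's Java API. It is a minimal illustration, not the exact setup behind Table 4: the dataset file `dataset.arff`, the random seed, and the index of the positive class are placeholders.

```java
import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ClassifierComparison {
    public static void main(String[] args) throws Exception {
        // Placeholder dataset; the class label is assumed to be the last attribute.
        Instances data = DataSource.read("dataset.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Two of the classifiers listed in Table 4, with WEKA's default parameters.
        Classifier[] classifiers = { new J48(), new SMO() };
        String[] names = { "J48", "SMO" };

        int positiveClass = 1; // placeholder: index of the class treated as "positive"
        for (int i = 0; i < classifiers.length; i++) {
            Evaluation eval = new Evaluation(data);
            // 10-fold cross-validation with a fixed seed for repeatability.
            eval.crossValidateModel(classifiers[i], data, 10, new Random(1));

            System.out.printf("%s: %.2f%% correct, AUC=%.3f, P=%.3f, R=%.3f, F=%.3f%n",
                    names[i],
                    eval.pctCorrect(),
                    eval.areaUnderROC(positiveClass),
                    eval.precision(positiveClass),
                    eval.recall(positiveClass),
                    eval.fMeasure(positiveClass));
        }
    }
}
```

The other algorithms in the table (e.g. Random Forest, Logistic, AdaBoost) can be swapped in the same way by instantiating the corresponding `weka.classifiers` class.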