Table 3. Accuracy rates of machine learning methods in predicting patient survival status, using all features without feature selection.
| Classifier | Accuracy (%) |
| --- | --- |
| Fine tree | 81.0 |
| Medium tree | 81.0 |
| Coarse tree | 81.9 |
| Binary GLM logistic regression | 81.0 |
| Efficient logistic regression | 86.7 |
| Efficient linear SVM | 86.7 |
| Gaussian Naive Bayes | 77.8 |
| Kernel Naive Bayes | 80.3 |
| Linear SVM | 86.3 |
| Quadratic SVM | 85.4 |
| Cubic SVM | 86.7 |
| Fine Gaussian SVM | 86.7 |
| Medium Gaussian SVM | 86.7 |
| Coarse Gaussian SVM | 86.7 |
| Boosted trees | 84.4 |
| Bagged trees | 85.4 |
| RUS boosted trees | 77.5 |
| Narrow neural network | 84.1 |
| Medium neural network | 84.8 |
| Wide neural network | 85.7 |
| Bilayered neural network | 82.2 |
| Trilayered neural network | 82.5 |
| SVM kernel | 87.0 |
| Logistic regression kernel | 86.7 |
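A comparison table of this kind can be reproduced by cross-validating a set of classifiers on the same feature matrix. The sketch below uses scikit-learn; the dataset (a stand-in binary-outcome task), the model subset, and all hyperparameters are illustrative assumptions, not those used in the study.

```python
# Minimal sketch of a multi-classifier accuracy comparison, assuming scikit-learn.
# The breast-cancer dataset is a placeholder binary task, NOT the study's data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # placeholder binary-outcome data

# A few representatives of the model families in Table 3 (settings are illustrative).
models = {
    "Fine tree (deep)": DecisionTreeClassifier(random_state=0),
    "Coarse tree (shallow)": DecisionTreeClassifier(max_depth=2, random_state=0),
    "Logistic regression": make_pipeline(
        StandardScaler(), LogisticRegression(max_iter=1000)
    ),
    "Gaussian Naive Bayes": GaussianNB(),
    "Linear SVM": make_pipeline(StandardScaler(), SVC(kernel="linear")),
    "Cubic SVM": make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3)),
    "Bagged trees": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=50, random_state=0
    ),
}

# 5-fold cross-validated accuracy, printed as a percentage per classifier.
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {100 * acc:.1f}")
```

Each printed row corresponds to one line of the table; all models see identical folds, so the accuracies are directly comparable.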
GLM = generalized linear model, SVM = support vector machine.