Table 2.
Performance results of the classification methods applied to the prediction of attrition in the four follow-ups of the EPICE-PT cohort. Values are reported as mean (SD) for each performance metric.
| Follow-up | Method | Baseline: Sensitivity | Baseline: Accuracy | Baseline: F-measure | Incremental^a: Sensitivity | Incremental^a: Accuracy | Incremental^a: F-measure |
|---|---|---|---|---|---|---|---|
| 1 | AdaBoost | 82.3 (6.0) | 83.2 (5.7) | 83.3 (5.7) | N/A | N/A | N/A |
| | Artificial Neural Networks | 81.4 (3.1) | 81.1 (3.1) | 81.2 (3.1) | N/A | N/A | N/A |
| | Functional Trees | 74.5 (5.2) | 74.7 (1.8) | 74.7 (1.8) | N/A | N/A | N/A |
| | J48 | 76.9 (3.3) | 78.0 (2.9) | 78.0 (2.8) | N/A | N/A | N/A |
| | J48Consolidated | 82.0 (4.2) | 79.3 (2.0) | 79.3 (1.9) | N/A | N/A | N/A |
| | K-Nearest Neighbours | 86.0 (3.9) | 76.5 (2.1) | 76.5 (2.2) | N/A | N/A | N/A |
| | Logistic Regression | 69.7 (5.7) | 73.7 (2.0) | 73.6 (2.1) | N/A | N/A | N/A |
| | Random Forest | 82.3 (6.3) | 88.2 (1.9) | 88.1 (2.0) | N/A | N/A | N/A |
| 2 | AdaBoost | 82.4 (5.8) | 71.6 (7.2) | 70.9 (7.6) | 85.6 (3.6) | 82.3 (3.7) | 82.3 (3.7) |
| | Artificial Neural Networks | 82.6 (6.3) | 75.2 (3.5) | 74.8 (3.5) | 82.2 (1.8) | 79.9 (1.9) | 79.9 (2.0) |
| | Functional Trees | 76.8 (3.8) | 71.4 (2.6) | 71.2 (2.6) | 76.1 (2.8) | 73.1 (3.2) | 73.1 (3.2) |
| | J48 | 77.8 (7.4) | 73.2 (5.3) | 73.1 (5.3) | 79.4 (3.1) | 77.0 (1.8) | 76.9 (1.9) |
| | J48Consolidated | 73.7 (4.1) | 73.6 (4.2) | 73.6 (4.3) | 76.5 (4.1) | 78.1 (1.5) | 78.2 (1.5) |
| | K-Nearest Neighbours | 87.6 (4.5) | 71.7 (3.9) | 70.5 (4.0) | 85.4 (2.7) | 76.7 (1.6) | 76.4 (1.7) |
| | Logistic Regression | 77.2 (2.5) | 67.0 (1.7) | 66.4 (1.8) | 80.2 (4.7) | 74.7 (2.5) | 74.6 (2.4) |
| | Random Forest | 86.8 (2.4) | 82.6 (1.8) | 82.5 (1.8) | 85.0 (3.3) | 84.6 (2.5) | 84.6 (2.5) |
| 3 | AdaBoost | 75.4 (6.2) | 85.0 (3.5) | 84.8 (3.6) | 87.9 (7.3) | 90.3 (1.7) | 90.3 (1.8) |
| | Artificial Neural Networks | 79.0 (7.0) | 81.3 (3.1) | 81.3 (3.2) | 87.2 (5.1) | 89.8 (0.3) | 89.8 (0.3) |
| | Functional Trees | 74.4 (5.7) | 78.2 (3.0) | 78.3 (3.0) | 84.9 (6.0) | 87.5 (2.1) | 87.5 (2.1) |
| | J48 | 70.8 (3.4) | 81.0 (2.2) | 80.8 (2.2) | 84.2 (6.4) | 89.0 (2.7) | 89.0 (2.8) |
| | J48Consolidated | 74.1 (4.6) | 80.5 (2.7) | 80.5 (2.7) | 87.8 (3.0) | 89.6 (1.9) | 89.6 (1.9) |
| | K-Nearest Neighbours | 72.5 (2.6) | 77.7 (2.0) | 77.7 (1.9) | 88.9 (6.6) | 90.1 (1.8) | 90.1 (1.9) |
| | Logistic Regression | 69.5 (5.5) | 77.6 (1.1) | 77.4 (1.2) | 87.9 (6.4) | 88.1 (3.0) | 88.2 (3.1) |
| | Random Forest | 73.4 (3.8) | 86.1 (2.1) | 85.7 (2.2) | 89.8 (4.1) | 92.9 (0.9) | 92.9 (0.9) |
| 4 | AdaBoost | 83.3 (3.1) | 84.2 (1.5) | 84.2 (1.5) | 88.5 (4.5) | 92.1 (2.6) | 92.1 (2.6) |
| | Artificial Neural Networks | 82.3 (4.0) | 78.4 (2.9) | 78.4 (2.9) | 91.0 (1.6) | 92.9 (2.1) | 92.9 (2.1) |
| | Functional Trees | 76.2 (4.1) | 74.3 (1.2) | 74.2 (1.2) | 91.5 (3.7) | 92.2 (3.1) | 92.2 (3.1) |
| | J48 | 74.6 (5.6) | 79.6 (2.5) | 79.5 (2.6) | 88.7 (3.4) | 92.5 (1.7) | 92.4 (1.7) |
| | J48Consolidated | 77.4 (4.3) | 77.0 (5.4) | 77.0 (5.3) | 89.2 (3.3) | 92.7 (1.6) | 92.7 (1.6) |
| | K-Nearest Neighbours | 84.1 (1.0) | 72.6 (2.0) | 72.4 (2.1) | 89.0 (1.5) | 93.3 (1.4) | 93.3 (1.4) |
| | Logistic Regression | 76.1 (3.0) | 73.5 (1.8) | 73.6 (1.9) | 87.7 (4.9) | 89.2 (1.6) | 89.2 (1.6) |
| | Random Forest | 82.6 (3.0) | 85.3 (2.3) | 85.2 (2.3) | 91.0 (2.3) | 94.3 (2.2) | 94.2 (2.2) |
^a At follow-up 1, the baseline and incremental models are equivalent.
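
For readers who want to reproduce this style of reporting, the sketch below shows how mean (SD) sensitivity, accuracy and F-measure can be estimated over repeated cross-validation. It is an illustrative assumption, not the study's original pipeline: the dataset, classifier (Random Forest as one of the methods in the table) and fold settings are placeholders.

```python
# Minimal sketch: mean (SD) of sensitivity, accuracy and F-measure over
# repeated stratified cross-validation, as reported in Table 2.
# Placeholder data and settings; not the authors' original implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score, recall_score
from sklearn.model_selection import RepeatedStratifiedKFold

X, y = make_classification(n_samples=500, n_features=20, random_state=0)  # placeholder data
clf = RandomForestClassifier(random_state=0)

cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=5, random_state=0)
sens, acc, f1 = [], [], []
for train_idx, test_idx in cv.split(X, y):
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    sens.append(recall_score(y[test_idx], pred))  # sensitivity = recall of the positive class
    acc.append(accuracy_score(y[test_idx], pred))
    f1.append(f1_score(y[test_idx], pred))

for name, scores in [("Sensitivity", sens), ("Accuracy", acc), ("F-measure", f1)]:
    print(f"{name}: {100 * np.mean(scores):.1f} ({100 * np.std(scores):.1f})")
```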