Table 2.
Performance summary of machine learning models in the CMP-IV group.
| Model | Accuracy (95% CI) | Precision (95% CI) | Recall (95% CI) | F1 (95% CI) | AUC (95% CI) |
|---|---|---|---|---|---|
| XGBoost | 0.802 (0.760, 0.844) | 0.799 (0.782, 0.816) | 0.633 (0.551, 0.772) | 0.706 (0.664, 0.748) | 0.768 (0.742, 0.786) |
| LGBM | 0.804 (0.746, 0.836) | 0.769 (0.742, 0.796) | 0.582 (0.438, 0.732) | 0.659 (0.575, 0.746) | 0.766 (0.744, 0.792) |
| CatBoost | 0.807 (0.729, 0.861) | 0.764 (0.716, 0.812) | 0.631 (0.532, 0.778) | 0.697 (0.647, 0.747) | 0.762 (0.723, 0.801) |
| RF | 0.788 (0.769, 0.802) | 0.736 (0.708, 0.764) | 0.612 (0.598, 0.621) | 0.671 (0.656, 0.687) | 0.732 (0.701, 0.763) |
| GBDT | 0.793 (0.785, 0.801) | 0.717 (0.632, 0.802) | 0.534 (0.503, 0.579) | 0.611 (0.566, 0.655) | 0.702 (0.665, 0.739) |
| Bagging | 0.796 (0.744, 0.859) | 0.716 (0.688, 0.744) | 0.522 (0.387, 0.591) | 0.615 (0.575, 0.656) | 0.698 (0.652, 0.744) |
| LR | 0.777 (0.732, 0.811) | 0.689 (0.505, 0.819) | 0.494 (0.405, 0.619) | 0.588 (0.450, 0.702) | 0.688 (0.664, 0.712) |
| SVM | 0.795 (0.758, 0.836) | 0.612 (0.568, 0.656) | 0.512 (0.459, 0.565) | 0.561 (0.512, 0.609) | 0.687 (0.646, 0.728) |
| AdaBoost | 0.773 (0.702, 0.812) | 0.698 (0.601, 0.795) | 0.533 (0.501, 0.565) | 0.608 (0.552, 0.665) | 0.668 (0.624, 0.712) |
| MLP | 0.744 (0.688, 0.805) | 0.637 (0.459, 0.781) | 0.515 (0.488, 0.542) | 0.575 (0.488, 0.656) | 0.658 (0.616, 0.691) |
XGBoost, extreme gradient boosting; LGBM, light gradient boosting machine; CatBoost, category boosting; RF, random forest; GBDT, gradient boosting decision tree; Bagging, bootstrap aggregation; LR, logistic regression; SVM, support vector machine; AdaBoost, adaptive boosting; MLP, multi-layer perceptron; AUC, area under the receiver operating characteristic curve; CI, confidence interval.
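For reference, metrics of the form reported in Table 2 can be obtained from held-out predictions. The sketch below is a minimal illustration, assuming scikit-learn and a percentile bootstrap over the test set; the bootstrap count, resampling scheme, and random seed are assumptions for illustration, not the confidence-interval procedure reported here.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, roc_auc_score)

def bootstrap_metrics(y_true, y_pred, y_prob, n_boot=1000, alpha=0.05, seed=0):
    """Point estimates with percentile-bootstrap 95% CIs for the five table metrics.

    y_true : binary ground-truth labels
    y_pred : hard class predictions
    y_prob : predicted probabilities of the positive class (used for AUC)
    """
    rng = np.random.default_rng(seed)
    y_true, y_pred, y_prob = map(np.asarray, (y_true, y_pred, y_prob))
    metrics = {
        "Accuracy":  lambda t, p, s: accuracy_score(t, p),
        "Precision": lambda t, p, s: precision_score(t, p, zero_division=0),
        "Recall":    lambda t, p, s: recall_score(t, p, zero_division=0),
        "F1":        lambda t, p, s: f1_score(t, p, zero_division=0),
        "AUC":       lambda t, p, s: roc_auc_score(t, s),
    }
    results = {}
    for name, fn in metrics.items():
        point = fn(y_true, y_pred, y_prob)
        boots = []
        for _ in range(n_boot):
            # Resample the test set with replacement
            idx = rng.integers(0, len(y_true), len(y_true))
            if len(np.unique(y_true[idx])) < 2:
                continue  # skip resamples missing one class (AUC undefined)
            boots.append(fn(y_true[idx], y_pred[idx], y_prob[idx]))
        lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        results[name] = (point, lo, hi)
    return results
```

For a fitted classifier `model` (a hypothetical name), `bootstrap_metrics(y_test, model.predict(X_test), model.predict_proba(X_test)[:, 1])` returns, for each column of the table, a point estimate with lower and upper CI bounds.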