Table 3. Predictive performance of the five machine learning algorithms in the validation and testing datasets.
| Dataset | Algorithms 1 | AUROC (95% CI) | Sensitivity (95% CI) | Specificity (95% CI) | Brier Score |
|---|---|---|---|---|---|
| Validation dataset | LR | 0.709 (0.679–0.737) | 0.679 (0.624–0.728) | 0.660 (0.625–0.695) | 0.218 |
| | SVM | 0.728 (0.699–0.756) | 0.578 (0.522–0.632) | 0.779 (0.747–0.809) | 0.195 |
| | MLP | 0.735 (0.707–0.761) | 0.494 (0.438–0.549) | 0.832 (0.803–0.858) | 0.231 |
| | XGBoost | 0.825 (0.802–0.849) | 0.724 (0.672–0.771) | 0.777 (0.744–0.806) | 0.165 |
| | RF | 0.855 (0.832–0.877) | 0.565 (0.509–0.619) | 0.927 (0.905–0.944) | 0.139 |
| Testing dataset | LR | 0.685 (0.653–0.715) | 0.615 (0.558–0.670) | 0.644 (0.609–0.679) | 0.223 |
| | SVM | 0.704 (0.673–0.733) | 0.566 (0.508–0.623) | 0.756 (0.723–0.786) | 0.199 |
| | MLP | 0.668 (0.633–0.698) | 0.406 (0.350–0.463) | 0.811 (0.781–0.838) | 0.254 |
| | XGBoost | 0.821 (0.795–0.843) | 0.706 (0.651–0.756) | 0.775 (0.743–0.804) | 0.163 |
| | RF | 0.851 (0.824–0.872) | 0.577 (0.519–0.633) | 0.940 (0.921–0.955) | 0.134 |
1 LR: logistic regression, SVM: support vector machine, MLP: multi-layer perceptron, XGBoost: eXtreme Gradient Boosting, RF: random forest.
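For reference, the metrics reported in Table 3 (AUROC, sensitivity, specificity, and Brier score with 95% CIs) can be reproduced for any of the fitted models along the lines of the sketch below. This is a minimal illustration, not the authors' code: the synthetic labels and probabilities, the 0.5 decision threshold, and the 2000-resample nonparametric bootstrap are assumptions made for the example; only the scikit-learn metric functions are standard.

```python
# Minimal sketch of computing Table 3 metrics for one model on a held-out set.
# Synthetic data and the bootstrap settings are assumptions for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss, confusion_matrix

rng = np.random.default_rng(0)

# Hypothetical held-out labels and predicted probabilities of the positive class.
y_true = rng.integers(0, 2, size=1000)
y_prob = np.clip(y_true * 0.3 + rng.normal(0.4, 0.25, size=1000), 0, 1)
y_pred = (y_prob >= 0.5).astype(int)            # assumed 0.5 decision threshold

def point_estimates(t, p, c):
    """AUROC, sensitivity, specificity, and Brier score for one evaluation set."""
    tn, fp, fn, tp = confusion_matrix(t, c).ravel()
    return {
        "AUROC": roc_auc_score(t, p),
        "Sensitivity": tp / (tp + fn),
        "Specificity": tn / (tn + fp),
        "Brier": brier_score_loss(t, p),
    }

def bootstrap_ci(metric_fn, n_boot=2000, alpha=0.05):
    """95% CI from a nonparametric bootstrap over the evaluation samples."""
    stats = []
    n = len(y_true)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        if len(np.unique(y_true[idx])) < 2:      # skip single-class resamples
            continue
        stats.append(metric_fn(y_true[idx], y_prob[idx], y_pred[idx]))
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

est = point_estimates(y_true, y_prob, y_pred)
auroc_lo, auroc_hi = bootstrap_ci(lambda t, p, c: roc_auc_score(t, p))
print(f"AUROC {est['AUROC']:.3f} ({auroc_lo:.3f}-{auroc_hi:.3f}), "
      f"Brier {est['Brier']:.3f}")
```

The same bootstrap helper can be reused for sensitivity, specificity, and the Brier score by passing a different `metric_fn`; whether the paper derived its CIs this way or analytically is not stated here, so treat the bootstrap as one plausible approach.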