Table 5. Testing data results and comparison with other machine learning algorithms.
| Model | TN | FP | FN | TP | Sen | Spe | Acc | BA | AUC |
|---|---|---|---|---|---|---|---|---|---|
| 5-layer DNN: copying | 967 | 103 | 5 | 46 | 0.9020 | 0.9037 | 0.9037 | 0.9028 | 0.9617 |
| 5-layer DNN: SMOTE [23] | 984 | 86 | 8 | 43 | 0.8431 | 0.9196 | 0.9161 | 0.8814 | 0.9555 |
| 5-layer DNN with PCA (8 features) | 922 | 148 | 5 | 46 | 0.9020 | 0.8617 | 0.8635 | 0.8818 | 0.9549 |
| Linear regression [24] | 983 | 87 | 7 | 44 | 0.8627 | 0.9187 | 0.9161 | 0.8907 | 0.9563 |
| Decision tree [25] | 915 | 155 | 5 | 46 | 0.9020 | 0.8551 | 0.8573 | 0.8786 | 0.9252 |
| Random forest [21] | 955 | 115 | 5 | 46 | 0.9020 | 0.8925 | 0.8930 | 0.8972 | 0.9590 |
| Support vector machine [26] | 955 | 115 | 5 | 46 | 0.9020 | 0.8925 | 0.8930 | 0.8972 | 0.9588 |
| XGBoost [22] | 945 | 125 | 6 | 45 | 0.8824 | 0.8832 | 0.8831 | 0.8828 | 0.9558 |
| AdaBoost [19,20] | 937 | 133 | 5 | 46 | 0.9020 | 0.8757 | 0.8769 | 0.8888 | 0.9586 |
| GradBoost [27] | 936 | 134 | 6 | 45 | 0.8824 | 0.8748 | 0.8751 | 0.8786 | 0.9525 |
| HistBoost [28] | 959 | 111 | 7 | 44 | 0.8627 | 0.8963 | 0.8947 | 0.8795 | 0.9535 |
Abbreviations: TN: true negative; FP: false positive; FN: false negative; TP: true positive; Sen: sensitivity; Spe: specificity; Acc: accuracy; BA: balanced accuracy; AUC: area under the curve; DNN: deep neural network; SMOTE: synthetic minority oversampling technique; PCA: principal component analysis; XGBoost: eXtreme Gradient Boosting.
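The derived metrics in Table 5 follow directly from the confusion-matrix counts: Sen = TP/(TP+FN), Spe = TN/(TN+FP), Acc = (TP+TN)/(TP+TN+FP+FN), and BA = (Sen+Spe)/2; AUC depends on each model's continuous scores and cannot be recovered from the counts alone. The minimal Python sketch below (not code from the study) reproduces the first row of the table as a check.

```python
# Sketch: how the derived metrics in Table 5 follow from the confusion-matrix
# counts. The formulas are the standard definitions; the numbers come from the
# first row of the table (5-layer DNN with copying).

def derived_metrics(tn: int, fp: int, fn: int, tp: int) -> dict:
    """Compute sensitivity, specificity, accuracy, and balanced accuracy."""
    sen = tp / (tp + fn)                   # true positive rate
    spe = tn / (tn + fp)                   # true negative rate
    acc = (tp + tn) / (tp + tn + fp + fn)  # overall accuracy
    ba = (sen + spe) / 2                   # balanced accuracy
    return {"Sen": sen, "Spe": spe, "Acc": acc, "BA": ba}

# First row of Table 5: TN=967, FP=103, FN=5, TP=46
print(derived_metrics(tn=967, fp=103, fn=5, tp=46))
# -> Sen 0.9020, Spe 0.9037, Acc 0.9037, BA 0.9028 (matching the table)
```

Because the testing set is imbalanced (51 positives against 1,070 negatives, from TP+FN and TN+FP), overall accuracy is dominated by the negative class; balanced accuracy and AUC give a fairer side-by-side comparison of the models.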