J Clin Med. 2021 Dec 20;10(24):5982. doi: 10.3390/jcm10245982

Table 5. Performance metrics of the meta-classifier on the test set.

| Symbol | Performance Metric | Defined As | What Does It Measure? | Value |
|---|---|---|---|---|
| CCR | Correctly Classified Instance Rate (Accuracy) | (TP + TN)/(TP + TN + FP + FN) | How well the model predicts both positive and negative cases | 0.9904 |
| TPR | True Positive Rate (Sensitivity, Recall) | TP/(TP + FN) | How well the model predicts positive cases | 0.9908 |
| FPR | False Positive Rate (Fall-out) | FP/(FP + TN) | Proportion of negative cases incorrectly classified as positive | 0.010 |
| PPV | Positive Predictive Value (Precision) | TP/(TP + FP) | Proportion of positive predictions that are correct | 0.9908 |
| AUC | Area under the ROC curve | Area under the plot of TPR against FPR | Overall discrimination between positive and negative cases across thresholds | 0.997 |
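The confusion-matrix formulas in Table 5 can be applied directly to raw counts. The sketch below is a minimal illustration of those formulas, not the authors' code; the counts in the usage example are hypothetical placeholders, not the study's test-set results.

```python
# Minimal sketch: compute the Table 5 metrics (CCR, TPR, FPR, PPV) from
# confusion-matrix counts. AUC is not included because it requires predicted
# scores rather than counts (e.g., sklearn.metrics.roc_auc_score).

def classification_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    """Return accuracy, sensitivity, fall-out, and precision from counts."""
    return {
        "CCR": (tp + tn) / (tp + tn + fp + fn),  # accuracy
        "TPR": tp / (tp + fn),                   # sensitivity / recall
        "FPR": fp / (fp + tn),                   # fall-out
        "PPV": tp / (tp + fp),                   # precision
    }

if __name__ == "__main__":
    # Hypothetical counts chosen only to illustrate the formulas.
    print(classification_metrics(tp=108, tn=100, fp=1, fn=1))
```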