2021 Dec 2;18(1):63. doi: 10.1186/s41239-021-00300-y

Table 5.

Classification performance metrics

| Metric | Explanation | Formula |
| --- | --- | --- |
| Accuracy | How often are the model's predictions correct? | (TP + TN) / (TP + TN + FP + FN) |
| Precision | When the model predicts positive, how often is it correct? | TP / (TP + FP) |
| Sensitivity (Recall) or TPR | When the actual class is positive, how often does the model predict positive? | TP / (TP + FN) |
| Specificity or TNR | When the actual class is negative, how often does the model predict negative? | TN / (TN + FP) |
| Balanced accuracy | The mean of sensitivity and specificity | (TPR + TNR) / 2 |
| F-measure | The harmonic mean of precision and recall | 2 × Precision × Recall / (Precision + Recall) |
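The formulas in Table 5 can be sketched as a small Python function that derives every metric from the four confusion-matrix counts (TP, TN, FP, FN). The function name and example counts below are illustrative, not from the paper:

```python
# Illustrative sketch: compute the Table 5 metrics from confusion-matrix counts.
# Assumes all denominators are non-zero (i.e., both classes are represented).

def classification_metrics(tp: int, tn: int, fp: int, fn: int) -> dict:
    precision = tp / (tp + fp)        # correctness of positive predictions
    recall = tp / (tp + fn)           # sensitivity / true positive rate (TPR)
    specificity = tn / (tn + fp)      # true negative rate (TNR)
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "balanced_accuracy": (recall + specificity) / 2,
        "f_measure": 2 * precision * recall / (precision + recall),
    }

# Hypothetical example: 40 TP, 45 TN, 5 FP, 10 FN
m = classification_metrics(tp=40, tn=45, fp=5, fn=10)
print(m["accuracy"])           # 0.85
print(m["balanced_accuracy"])  # 0.85
```

Note that on an imbalanced class distribution, balanced accuracy and the F-measure diverge from plain accuracy, which is why the table lists all three.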