J Cardiovasc Dev Dis. 2023 May 6;10(5):202. doi: 10.3390/jcdd10050202

Table 1.

Description of performance metrics for classification.

| Performance Metric | Description |
| --- | --- |
| False positive (FP) | Object incorrectly classified as positive |
| False negative (FN) | Object incorrectly classified as negative |
| True positive (TP) | Object correctly classified as positive |
| True negative (TN) | Object correctly classified as negative |
| Precision (PPV) | The fraction of TP among all objects classified as positive |
| Sensitivity (TPR)/recall | The fraction of actual positives that were correctly classified |
| Accuracy | The fraction of all objects (TP and TN) that were correctly classified |
| F1 score | The harmonic mean of precision and recall |
| Specificity (TNR) | The fraction of actual negatives that were correctly classified |
| Receiver operating characteristic (ROC) curve | The curve of recall (Y-axis) against the false positive rate = 1 − specificity (X-axis) |
| Area under the ROC curve (AUC-ROC) | Evaluates the overall quality of the model |
| Precision-recall (PR) curve | The curve of precision (Y-axis) against recall (X-axis) |
| Precision-recall AUC (PR-AUC) | Alternative to AUC-ROC based on the PR curve |
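The scalar metrics in Table 1 can all be derived from the four confusion-matrix counts. The following is a minimal sketch of those formulas; the function name and the example counts are hypothetical, chosen only for illustration.

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the scalar metrics of Table 1 from confusion-matrix counts.

    Assumes every denominator is nonzero (i.e., at least one predicted
    positive, one actual positive, and one actual negative).
    """
    precision = tp / (tp + fp)                  # PPV: TP among predicted positives
    recall = tp / (tp + fn)                     # sensitivity / TPR: TP among actual positives
    specificity = tn / (tn + fp)                # TNR: TN among actual negatives
    accuracy = (tp + tn) / (tp + fp + tn + fn)  # correct predictions over all objects
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall
    return {
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "accuracy": accuracy,
        "f1": f1,
    }

# Hypothetical counts: 80 TP, 20 FP, 90 TN, 10 FN
print(classification_metrics(tp=80, fp=20, tn=90, fn=10))
```

The ROC curve plots (1 − specificity, recall) pairs and the PR curve plots (recall, precision) pairs as the classifier's decision threshold varies; AUC-ROC and PR-AUC integrate those curves into a single number.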