
Table 2. Definition of performance metrics

Performance metrics Definition/explanation
Accuracy (True positive (TP) + True negative (TN))/(TP + TN + False positive (FP) + False negative (FN))
Precision TP/(TP + FP)
Recall (TP rate) TP/(TP + FN)
F1-score 2TP/(2TP + FP + FN)
Support The number of actual occurrences of a class in the provided data set
FP rate FP/(FP + TN)
Area under the curve (AUC) The area under the ROC curve; it measures the ability of a classifier to distinguish between classes. The greater the AUC, the better the model’s performance
ROC A receiver operating characteristic (ROC) curve is a graphical plot of a binary classifier’s true positive rate against its false positive rate at varying decision thresholds
Macro average The unweighted mean of the per-class metrics; all classes contribute equally to the final averaged metric
Weighted avg. The mean of the per-class metrics, with each class weighted by its support (number of true instances)
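
For readers who want to reproduce these quantities, the sketch below shows how the metrics in Table 2 are typically computed for a binary classifier. This is a minimal illustration using scikit-learn, not the authors’ code; the labels, predictions, and scores are hypothetical placeholders.

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix, roc_auc_score

# Toy ground-truth labels, hard predictions, and predicted probabilities
# (placeholder values, not from the study's data set).
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])
y_score = np.array([0.9, 0.2, 0.8, 0.4, 0.1, 0.6, 0.7, 0.3, 0.95, 0.05])

# Confusion-matrix counts used by the formulas in Table 2
# (scikit-learn returns them in the order TN, FP, FN, TP for a binary problem).
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)              # TP rate
fp_rate = fp / (fp + tn)
f1 = 2 * tp / (2 * tp + fp + fn)
print(accuracy, precision, recall, fp_rate, f1)

# classification_report also reports per-class support and the
# macro and weighted averages listed in Table 2.
print(classification_report(y_true, y_pred, digits=3))

# AUC summarizes the ROC curve built from the continuous scores.
print("AUC:", roc_auc_score(y_true, y_score))
```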