
Table 2. Evaluation metrics and their calculations based on the confusion matrix

Precision: $\frac{\text{True Positive}}{\text{True Positive} + \text{False Positive}}$
Recall (sensitivity): $\frac{\text{True Positive}}{\text{True Positive} + \text{False Negative}}$
Accuracy: $\frac{\text{True Positive} + \text{True Negative}}{\text{All predictions}}$
F1-score: $\frac{2\,\text{True Positive}}{2\,\text{True Positive} + \text{False Positive} + \text{False Negative}}$
Average precision: $AP_{\text{threshold}} = \int_{0}^{1} p(x)\,dx$
Mean average precision for $n$ classes: $mAP_{\text{threshold}} = \frac{1}{n}\sum_{i=1}^{n} AP_i$
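As a minimal illustration of how the metrics in Table 2 are computed, the sketch below (not taken from the paper) derives precision, recall, accuracy, and F1-score from hypothetical confusion-matrix counts (tp, fp, tn, fn), approximates AP by integrating a sampled precision–recall curve, and averages per-class APs for mAP. All function and variable names are illustrative assumptions.

```python
# Sketch of the Table 2 metrics from raw confusion-matrix counts.
# Counts and function names here are hypothetical, for illustration only.
from typing import Sequence


def precision(tp: int, fp: int) -> float:
    """Fraction of positive predictions that are correct."""
    return tp / (tp + fp) if (tp + fp) else 0.0


def recall(tp: int, fn: int) -> float:
    """Fraction of actual positives that are detected (sensitivity)."""
    return tp / (tp + fn) if (tp + fn) else 0.0


def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Fraction of all predictions that are correct."""
    total = tp + tn + fp + fn
    return (tp + tn) / total if total else 0.0


def f1_score(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall, written directly in counts."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0


def average_precision(recalls: Sequence[float], precisions: Sequence[float]) -> float:
    """Approximate AP = integral of precision over recall on [0, 1] (trapezoidal rule)."""
    ap = 0.0
    for i in range(1, len(recalls)):
        ap += (recalls[i] - recalls[i - 1]) * (precisions[i] + precisions[i - 1]) / 2
    return ap


def mean_average_precision(per_class_ap: Sequence[float]) -> float:
    """mAP = mean of the per-class AP values."""
    return sum(per_class_ap) / len(per_class_ap) if per_class_ap else 0.0


if __name__ == "__main__":
    # Hypothetical counts for one class at a fixed confidence threshold.
    tp, fp, tn, fn = 80, 10, 95, 15
    print(f"precision = {precision(tp, fp):.3f}")
    print(f"recall    = {recall(tp, fn):.3f}")
    print(f"accuracy  = {accuracy(tp, tn, fp, fn):.3f}")
    print(f"F1-score  = {f1_score(tp, fp, fn):.3f}")
```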