
Table 9. Description of the performance evaluation metrics

| Measure | Formula | Description |
|---|---|---|
| Accuracy | $\frac{TP+TN}{TP+FP+TN+FN}$ | Accuracy is the proportion of correct predictions among all predictions made. |
| Precision | $\frac{TP}{TP+FP}$ | Precision is the proportion of samples predicted as positive that are actually positive. |
| Recall | $\frac{TP}{TP+FN}$ | Recall is the number of correct positive predictions divided by the total number of actual positive samples. |
| F1-score | $\frac{2 \times P \times R}{P+R}$ | The harmonic mean of the precision and recall values. |
| MCC | $\frac{TP \times TN - FP \times FN}{\sqrt{(TP+FP)(TP+FN)(TN+FP)(TN+FN)}}$ | MCC is the correlation coefficient between the actual class values and the predicted class values. |
| Specificity | $\frac{TN}{TN+FP}$ | The fraction of negative samples that are correctly classified. |
| Gmean1 | $\sqrt{Precision \times Recall}$ | Gmean1 is computed as the square root of the product of precision and recall. |
| Gmean2 | $\sqrt{Specificity \times Recall}$ | Gmean2 is computed as the square root of the product of specificity and recall. |
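For readers who wish to reproduce these values, the following is a minimal Python sketch that computes each metric in Table 9 directly from the four confusion-matrix counts (TP, FP, TN, FN). The function name `evaluate` and the example counts are illustrative assumptions, not part of the original study.

```python
import math

def evaluate(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the metrics of Table 9 from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                       # also called sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    )
    gmean1 = math.sqrt(precision * recall)        # geometric mean of precision and recall
    gmean2 = math.sqrt(specificity * recall)      # geometric mean of specificity and recall
    return {
        "accuracy": accuracy,
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "mcc": mcc,
        "specificity": specificity,
        "gmean1": gmean1,
        "gmean2": gmean2,
    }

# Hypothetical confusion-matrix counts, for illustration only.
print(evaluate(tp=80, fp=10, tn=95, fn=15))
```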