iScience. 2023 Jun 8;26(7):107039. doi: 10.1016/j.isci.2023.107039

Table 4. Prediction scores for the most accurate models

| Metric                       | Model 1 | Model 2 | Model 3 | Model 4 | Model 5 |
|------------------------------|---------|---------|---------|---------|---------|
| Accuracy                     | 99.7    | 99.7    | 99.7    | 99.3    | 99.4    |
| AUC value                    | 1.000   | 1.000   | 0.997   | 1.000   | 0.998   |
| Average prediction score     | 98.3    | 99.5    | 98.7    | 98.3    | 98.8    |
| True prediction score        | 98.5    | 99.6    | 98.8    | 98.8    | 99.2    |
| False prediction score       | 21.2    | 51.6    | 37.6    | 24.4    | 43.7    |
| 2nd highest prediction score | 0.52    | 0.16    | 0.37    | 0.43    | 0.61    |
Accuracy (0–100%): Positive-match accuracy obtained by selecting the class with the highest prediction score.
AUC value (0–1): Area under the model's ROC curve. Has a maximum value of 1, and gives an indication of the true positive rate at low false positive rates.
Average prediction score (0–100%): The average prediction score produced by the model when classifying test images (includes both true and false matches).
True prediction score (0–100%): The average prediction score produced when the model correctly classifies a test image.
False prediction score (0–100%): The average prediction score produced when the model incorrectly classifies a test image.
2nd highest prediction score (0–100%): The average prediction score of the second-highest class when the model classifies a test image (includes both true and false matches).
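The definitions above can be made concrete with a minimal sketch of how such summary statistics might be computed from per-image class-probability vectors. This is an illustrative implementation, not the paper's code: the function name `summarize` and the toy inputs are assumptions, and the AUC computation is omitted since it requires the full ROC curve rather than per-image summaries.

```python
def summarize(scores, labels):
    """Summarize per-image prediction scores (hypothetical helper, not from the paper).

    scores: list of per-class probability lists in [0, 1], one per test image.
    labels: list of true class indices, one per test image.
    Returns each metric on the 0-100% scale used in Table 4.
    """
    correct, top, second, true_top, false_top = [], [], [], [], []
    for probs, label in zip(scores, labels):
        # Rank classes by prediction score, highest first.
        ranked = sorted(range(len(probs)), key=lambda c: probs[c], reverse=True)
        pred = ranked[0]
        top.append(probs[pred])            # highest score (true or false match)
        second.append(probs[ranked[1]])    # 2nd highest class's score
        # Split the winning score by whether the top-1 prediction was correct.
        (true_top if pred == label else false_top).append(probs[pred])
        correct.append(pred == label)

    def pct(xs):
        # Mean expressed as a percentage; empty groups yield NaN.
        return 100 * sum(xs) / len(xs) if xs else float("nan")

    return {
        "accuracy": pct(correct),
        "average prediction score": pct(top),
        "true prediction score": pct(true_top),
        "false prediction score": pct(false_top),
        "2nd highest prediction score": pct(second),
    }
```

For example, with three test images scored over two classes, `summarize([[0.9, 0.1], [0.4, 0.6], [0.7, 0.3]], [0, 1, 1])` classifies the third image incorrectly, so its winning score (0.7) contributes to the false prediction score while the other two contribute to the true prediction score.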