Table 2. Evaluation metrics.
Metric | Formula | Description |
---|---|---|
Precision | TP / (TP + FP) | Proportion of positive identifications that were correct |
Recall | TP / (TP + FN) | Proportion of actual positives that were correctly classified |
F1-score | 2 · (Precision · Recall) / (Precision + Recall) | Harmonic mean of precision and recall |
Accuracy | (TP + TN) / (TP + TN + FP + FN) | Proportion of all predictions that were correct |
AUC-ROC | Area under the ROC curve | Summarizes the model's TPR versus its FPR across classification thresholds |
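As a minimal sketch, assuming a binary classification setting where the confusion-matrix counts (TP, FP, TN, FN) are already known, the threshold-based formulas in Table 2 can be computed directly; the function name and example counts below are illustrative, not from the source:

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int):
    """Compute precision, recall, F1-score, and accuracy
    from binary confusion-matrix counts."""
    precision = tp / (tp + fp)          # correct positive identifications
    recall = tp / (tp + fn)             # actual positives correctly classified
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    accuracy = (tp + tn) / (tp + fp + tn + fn)          # all correct predictions
    return precision, recall, f1, accuracy

# Hypothetical example: 8 TP, 2 FP, 85 TN, 5 FN out of 100 samples
p, r, f1, acc = classification_metrics(tp=8, fp=2, tn=85, fn=5)
```

Note that AUC-ROC cannot be computed from a single confusion matrix: it requires the model's predicted scores so that TPR and FPR can be evaluated across many thresholds.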