Table 2.
Definition of performance metrics
Performance metric | Definition/explanation |
---|---|
Accuracy | The proportion of correctly classified instances among all instances: (TP + TN)/(TP + TN + FP + FN) |
Precision | The proportion of predicted positive instances that are actually positive: TP/(TP + FP) |
Recall (TP rate) | The proportion of actual positive instances that are correctly identified: TP/(TP + FN) |
F1-score | The harmonic mean of precision and recall: 2 × (precision × recall)/(precision + recall) |
Support | The number of actual occurrences of a class in the provided data set |
FP rate | The proportion of actual negative instances that are incorrectly classified as positive: FP/(FP + TN) |
Area under the curve (AUC) | AUC is an important feature of the ROC curve that measures the ability of a classifier to distinguish between classes. The greater the AUC, the better the model’s performance |
ROC | The receiver operating characteristic (ROC) curve is a graphical representation of a binary classifier’s performance, plotting the TP rate against the FP rate at varying classification thresholds |
Macro average | All classes equally contribute to the final averaged metric |
Weighted avg. | Each class contributes to the final averaged metric in proportion to its support in the data set |
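For reference, the metrics defined in Table 2 can be computed with scikit-learn. The snippet below is an illustrative sketch only; the labels `y_true` and scores `y_score` are hypothetical placeholders, not the data set used in this study.

```python
# Illustrative sketch (not from the paper): computing the metrics in Table 2
# with scikit-learn for a hypothetical binary classification result.
from sklearn.metrics import (
    accuracy_score,
    classification_report,
    roc_auc_score,
    roc_curve,
)

# Hypothetical ground-truth labels and model outputs (placeholders).
y_true = [0, 0, 1, 1, 1, 0, 1, 0]                    # actual class labels
y_score = [0.1, 0.4, 0.8, 0.7, 0.9, 0.3, 0.6, 0.2]   # predicted probabilities
y_pred = [1 if s >= 0.5 else 0 for s in y_score]     # thresholded predictions

# Accuracy, plus per-class precision, recall (TP rate), F1-score, support,
# and the macro/weighted averages in a single report.
print(accuracy_score(y_true, y_pred))
print(classification_report(y_true, y_pred))

# ROC curve points (FP rate vs. TP rate) and the area under the curve (AUC).
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(roc_auc_score(y_true, y_score))
```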