Table 3.
Performance metrics used.
| No. | Performance Metric | Description |
|---|---|---|
| 1 | Accuracy | The fraction of correctly classified samples: Accuracy = (TP + TN)/(TP + TN + FP + FN). Loss, in contrast, measures how well the model behaves after each training iteration. |
| 2 | Precision | The fraction of true positives (TP) among all positive predictions. Precision = TP/(TP + FP). |
| 3 | Recall (Sensitivity) | The fraction of true positives among all actual positives (TP + FN). Recall = TP/(TP + FN). |
| 4 | F1 Score | The harmonic mean of Precision and Recall: F1 = 2 ∗ (Precision ∗ Recall)/(Precision + Recall) = 2 ∗ TP/(2 ∗ TP + FP + FN). |
| 5 | Specificity | The fraction of true negatives (TN) among all actual negatives. Specificity = TN/(FP + TN). |
| 6 | Negative Predictive Value | The fraction of true negatives among all negative predictions. NPV = TN/(TN + FN). |
| 7 | False Positive Rate | The fraction of actual negatives incorrectly classified as positive. FPR = FP/(FP + TN). |
| 8 | False Discovery Rate | The fraction of positive predictions that are incorrect. FDR = FP/(FP + TP). |
| 9 | False Negative Rate | The fraction of actual positives incorrectly classified as negative. FNR = FN/(FN + TP). |
| 10 | Matthews Correlation Coefficient | A balanced measure using all four confusion-matrix counts. MCC = (TP ∗ TN − FP ∗ FN)/sqrt((TP + FP) ∗ (TP + FN) ∗ (TN + FP) ∗ (TN + FN)). |
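
For reference, the minimal sketch below computes every metric in Table 3 directly from raw confusion-matrix counts. The function name and the example counts are illustrative, not taken from the paper, and all denominators are assumed to be nonzero.

```python
import math


def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the Table 3 metrics from confusion-matrix counts.

    Assumes all denominators are nonzero (i.e., each class and each
    prediction type occurs at least once in the evaluation set).
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # also called sensitivity / true positive rate
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall),
        "specificity": tn / (fp + tn),
        "npv": tn / (tn + fn),  # negative predictive value
        "fpr": fp / (fp + tn),  # false positive rate
        "fdr": fp / (fp + tp),  # false discovery rate
        "fnr": fn / (fn + tp),  # false negative rate
        "mcc": (tp * tn - fp * fn)
        / math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)),
    }


# Illustrative counts: 90 TP, 10 FP, 85 TN, 15 FN.
print(classification_metrics(tp=90, fp=10, tn=85, fn=15))
```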