Table 9.
Description of the performance evaluation metrics
| Measures | Formula | Description |
|---|---|---|
| Accuracy | (TP + TN) / (TP + TN + FP + FN) | The proportion of correct predictions the algorithm makes over all predictions, across every class. |
| Precision | TP / (TP + FP) | The fraction of predicted positive cases that are truly positive. |
| Recall | TP / (TP + FN) | The number of correct positive outcomes divided by the number of all samples that should have been identified as positive; also called sensitivity. |
| F1-score | 2 × (Precision × Recall) / (Precision + Recall) | The harmonic mean of precision and recall. |
| MCC | (TP × TN − FP × FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN)) | The correlation coefficient between the actual class labels and the predicted class labels. |
| Specificity | TN / (TN + FP) | The fraction of actual negative samples that are correctly classified. |
| Gmean1 | √(Precision × Recall) | The square root of the product of precision and recall. |
| Gmean2 | √(Specificity × Recall) | The square root of the product of specificity and recall. |

TP, TN, FP, and FN denote true positives, true negatives, false positives, and false negatives, respectively.
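As an illustrative sketch (not part of the source), the metrics in Table 9 can be computed directly from confusion-matrix counts. The function name `classification_metrics` and the example counts below are assumptions for demonstration only:

```python
import math

def classification_metrics(tp, fp, tn, fn):
    """Compute the Table 9 metrics from confusion-matrix counts
    (hypothetical helper; assumes all denominators are nonzero)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                      # also called sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    gmean1 = math.sqrt(precision * recall)       # sqrt(precision * recall)
    gmean2 = math.sqrt(specificity * recall)     # sqrt(specificity * recall)
    return {
        "accuracy": accuracy, "precision": precision, "recall": recall,
        "f1": f1, "mcc": mcc, "specificity": specificity,
        "gmean1": gmean1, "gmean2": gmean2,
    }

# Example with assumed counts: 50 TP, 10 FP, 30 TN, 10 FN
m = classification_metrics(tp=50, fp=10, tn=30, fn=10)
```

With these example counts, accuracy is (50 + 30) / 100 = 0.8 and specificity is 30 / 40 = 0.75.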