Table 4.
Numerical results for the proposed models using a 30–70% hold-out split over five different seeds
| Metric | Statistic | Model 1 | Model 2 | Model 3 |
|---|---|---|---|---|
| Accuracy | Mean | 0.9261 | 0.9583 | 0.9666 |
| | Std. Dev. | 0.0079 | 0.0076 | 0.0017 |
| | Best | 0.9358 | 0.9608 | 0.9689 |
| | Worst | 0.9143 | 0.9499 | 0.9644 |
| Recall | Mean | 0.9274 | 0.9583 | 0.9665 |
| | Std. Dev. | 0.0098 | 0.0052 | 0.0017 |
| | Best | 0.9400 | 0.9630 | 0.9671 |
| | Worst | 0.9140 | 0.9500 | 0.9643 |
| Precision | Mean | 0.9275 | 0.9582 | 0.9666 |
| | Std. Dev. | 0.0093 | 0.0052 | 0.0017 |
| | Best | 0.9401 | 0.9631 | 0.9672 |
| | Worst | 0.9143 | 0.9496 | 0.9644 |
| F1-Score | Mean | 0.9275 | 0.9582 | 0.9666 |
| | Std. Dev. | 0.0097 | 0.0051 | 0.0017 |
| | Best | 0.9401 | 0.9630 | 0.9690 |
| | Worst | 0.9143 | 0.9506 | 0.9643 |
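Assuming each cell in Table 4 aggregates one test-set score per seed, the four reported statistics (mean, sample standard deviation, best, worst) can be sketched as follows. The score list is illustrative, not the values actually reported above.

```python
import statistics

def summarize(scores):
    """Aggregate per-seed scores into the four statistics reported in Table 4."""
    return {
        "mean": statistics.mean(scores),
        "std": statistics.stdev(scores),  # sample (n-1) standard deviation
        "best": max(scores),
        "worst": min(scores),
    }

# Hypothetical per-seed accuracies for five hold-out runs (illustrative only).
acc_per_seed = [0.9358, 0.9290, 0.9250, 0.9200, 0.9143]
summary = summarize(acc_per_seed)
```

Whether the tables use the sample (n-1) or population (n) standard deviation is not stated; with only five seeds the two differ noticeably, so the convention is worth reporting alongside the table.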