PeerJ Comput. Sci. 2024 Feb 29;10:e1915. doi: 10.7717/peerj-cs.1915

Table 15. Classification results of the MLCEFE classifiers on the test set.

Group  Class      Precision  Recall   F1 score  Support
C1     1          0.9435     0.9793   0.9610      3,085
       2          0.9663     0.9404   0.9532      9,956
       3          0.8948     0.9198   0.9071      9,956
       4          0.9448     0.9436   0.9442      9,956
       5          0.9287     0.9710   0.9494      5,108
       6          0.5670     0.1230   0.2022        447
       Accuracy                       0.9336     38,508
       Macro avg  0.8742     0.8129   0.8195     38,508
C2     1          0.9618     0.9721   0.9670      3,085
       2          0.9666     0.9393   0.9528      9,956
       3          0.8933     0.9197   0.9063      9,956
       4          0.9445     0.9432   0.9438      9,956
       5          0.9193     0.9834   0.9502      5,108
       6          0.4138     0.0537   0.0950        447
       Accuracy                       0.8611     38,508
       Macro avg  0.8499     0.8019   0.8025     38,508
C3     1          0.9545     0.9728   0.9636      3,085
       2          0.9662     0.9406   0.9532      9,956
       3          0.8945     0.9193   0.9068      9,956
       4          0.9447     0.9435   0.9441      9,956
       5          0.9125     0.9818   0.9459      5,108
       6          0.0000     0.0000   0.0000        447
       Accuracy                       0.9330     38,508
       Macro avg  0.7787     0.7930   0.7856     38,508

Note:

Values in bold represent the optimum values for each group.
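For readers reproducing these metrics, the columns of Table 15 follow the standard definitions: each class's F1 score is the harmonic mean of its precision and recall, and the macro average is the unweighted mean across the six classes (so the small class 6, support 447, weighs as much as the large ones). The sketch below, a minimal re-derivation and not the authors' code, recomputes the C1 rows from the precision/recall values in the table:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; defined as 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def macro_avg(values):
    """Unweighted mean across classes: every class counts equally."""
    return sum(values) / len(values)

# (precision, recall) pairs copied from the C1 block of Table 15.
c1 = {
    1: (0.9435, 0.9793),
    2: (0.9663, 0.9404),
    3: (0.8948, 0.9198),
    4: (0.9448, 0.9436),
    5: (0.9287, 0.9710),
    6: (0.5670, 0.1230),
}

for cls, (p, r) in c1.items():
    print(f"class {cls}: F1 = {f1(p, r):.4f}")
print(f"macro-avg precision = {macro_avg([p for p, _ in c1.values()]):.4f}")
```

Recomputing this way reproduces the tabulated values (e.g. class 1 F1 = 0.9610, macro-avg precision = 0.8742) and makes the class-6 behaviour easy to see: its low recall (0.1230 for C1, 0.0537 for C2, 0.0000 for C3) drags the macro averages well below the overall accuracy.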