Table 9. AUC-ROC comparison of the benchmark techniques vs. MLEn.
Methods/Datasets | Automobiles | Birds | Emotions | Hotel | Medical | Movies | News | Proteins |
---|---|---|---|---|---|---|---|---|
DenseNet-EHO | 0.98 | 0.99 | 0.96 | 0.98 | 0.97 | 0.99 | 0.96 | 0.96 |
BERT [52] | 0.90 | 0.90 | 0.87 | 0.91 | 0.89 | 0.91 | 0.86 | 0.88 |
ML-RBF [30] | 0.81 | 0.83 | 0.78 | 0.84 | 0.80 | 0.82 | 0.77 | 0.79 |
RAKEL [9] | 0.84 | 0.86 | 0.81 | 0.87 | 0.83 | 0.85 | 0.80 | 0.82 |
ECC [31] | 0.79 | 0.81 | 0.76 | 0.82 | 0.78 | 0.80 | 0.75 | 0.77 |
CNN [5] | 0.82 | 0.84 | 0.79 | 0.85 | 0.81 | 0.83 | 0.78 | 0.80 |
NB [6] | 0.80 | 0.82 | 0.77 | 0.83 | 0.79 | 0.81 | 0.76 | 0.78 |
LSTM [32] | 0.86 | 0.88 | 0.83 | 0.89 | 0.85 | 0.87 | 0.82 | 0.84 |
ResNet [14] | 0.83 | 0.85 | 0.80 | 0.86 | 0.82 | 0.84 | 0.79 | 0.81 |
CapsNet [33] | 0.85 | 0.87 | 0.82 | 0.88 | 0.84 | 0.86 | 0.81 | 0.83 |
GRU [51] | 0.86 | 0.88 | 0.83 | 0.89 | 0.85 | 0.87 | 0.82 | 0.84 |
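The AUC-ROC values in Table 9 are reported on a 0-1 scale. As a point of reference, the sketch below shows one common way such scores are obtained for multi-label data, using scikit-learn's `roc_auc_score` with macro averaging over labels; the variable names and the toy label/score matrices are illustrative assumptions, not taken from the paper's experiments.

```python
# Minimal sketch: macro-averaged AUC-ROC for a multi-label classifier.
# y_true and y_score are hypothetical examples, not data from the paper.
import numpy as np
from sklearn.metrics import roc_auc_score

# Ground-truth label matrix (n_samples x n_labels) and predicted scores
# from any of the benchmarked models.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0],
                   [0, 0, 1]])
y_score = np.array([[0.92, 0.10, 0.85],
                    [0.15, 0.80, 0.30],
                    [0.88, 0.75, 0.20],
                    [0.05, 0.40, 0.90]])

# Macro averaging computes AUC-ROC per label and then takes the unweighted
# mean, the usual convention for multi-label benchmarks.
auc = roc_auc_score(y_true, y_score, average="macro")
print(f"AUC-ROC: {auc:.2f}")
```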