Table 2: Comparison of models on precision, recall, micro/macro F1, AUC (PR and ROC), and precision/recall at k ∈ {8, 40}.
| Model | Prec. | Recall | F1 (Micro) | F1 (Macro) | AUC-PR (Micro) | AUC-PR (Macro) | AUC-ROC (Micro) | AUC-ROC (Macro) | P@8 | P@40 | R@8 | R@40 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Flat SVM (Perotte et al., 2013) | 0.867 | 0.164 | 0.276 | – | – | – | – | – | – | – | – | – |
| Hier. SVM (Perotte et al., 2013) | 0.577 | 0.301 | 0.395 | – | – | – | – | – | – | – | – | – |
| Logistic (Vani et al., 2017) | 0.774 | 0.395 | 0.523 | 0.042 | 0.541 | 0.125 | 0.919 | 0.704 | 0.913 | 0.572 | 0.169 | 0.528 |
| Attn BoW (Vani et al., 2017) | 0.745 | 0.399 | 0.520 | 0.027 | 0.521 | 0.079 | 0.975 | 0.807 | 0.912 | 0.549 | 0.169 | 0.508 |
| GRU–128 (Vani et al., 2017) | 0.725 | 0.396 | 0.512 | 0.027 | 0.523 | 0.082 | 0.976 | 0.827 | 0.906 | 0.541 | 0.168 | 0.501 |
| BiGRU–64 (Vani et al., 2017) | 0.715 | 0.367 | 0.485 | 0.021 | 0.493 | 0.071 | 0.973 | 0.811 | 0.892 | 0.522 | 0.165 | 0.483 |
| GRNN–128 (Vani et al., 2017) | 0.753 | 0.472 | 0.580 | 0.052 | 0.587 | 0.126 | 0.976 | 0.815 | 0.930 | 0.592 | 0.172 | 0.548 |
| BiGRNN–64 (Vani et al., 2017) | 0.761 | 0.466 | 0.578 | 0.054 | 0.589 | 0.131 | 0.975 | 0.798 | 0.925 | 0.596 | 0.172 | 0.552 |
| CNN (Baumel et al., 2018)* | 0.810 | 0.403 | 0.538 | 0.031 | 0.599 | 0.127 | 0.971 | 0.759 | 0.931 | 0.585 | 0.207 | 0.586 |
| Matching Network* | 0.439 | 0.388 | 0.412 | 0.014 | 0.394 | 0.034 | 0.893 | 0.551 | 0.793 | 0.427 | 0.172 | 0.425 |
| Match–CNN (Ours) | 0.605 | 0.561 | 0.582 | 0.064 | 0.612 | 0.148 | 0.977 | 0.792 | 0.930 | 0.586 | 0.207 | 0.590 |
| Match–CNN Ens. (Ours) | 0.616 | 0.567 | 0.591 | 0.066 | 0.623 | 0.157 | 0.977 | 0.793 | 0.935 | 0.594 | 0.208 | 0.598 |
Models marked with * denote our own implementations.
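As a sanity check on how the columns relate, and assuming the Prec. and Recall columns report micro-averaged values, the F1 (Micro) column is their harmonic mean; for the Flat SVM row:

$$
F_1 = \frac{2 \, P \, R}{P + R} = \frac{2 \times 0.867 \times 0.164}{0.867 + 0.164} \approx 0.276
$$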