Table 9. F-score on the sarcasm detection task. N is the baseline feature set; L and E are our resources; NL, NE, LE, and NLE are their combinations.

| Class | Model | N | L | L | L | L | E | NL | NE | LE | NLE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Non-sarcastic | MC | 88.47 | 88.47 | 88.47 | 88.47 | 88.47 | 88.47 | 88.47 | 88.47 | 88.47 | 88.47 |
| | LR | 92.75 | 88.48 | 88.76 | 91.00 | 91.21 | 90.87 | **92.79** | 91.97 | 91.33 | 91.85 |
| | RF | 92.93 | 88.51 | 88.73 | 90.11 | 90.42 | **93.01** | 91.59 | 92.65 | 92.96 | 92.81 |
| | SVM | 92.34 | 88.49 | 88.59 | 87.20 | 87.22 | 92.64 | 92.30 | **93.46** | 92.28 | 93.40 |
| Sarcastic | MC | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| | LR | 70.94 | 0.77 | 22.43 | 57.70 | 59.05 | 64.52 | **71.37** | 67.93 | 66.21 | 67.92 |
| | RF | **71.61** | 12.11 | 33.43 | 50.72 | 52.10 | 68.50 | 59.72 | 65.53 | 68.84 | 67.04 |
| | SVM | 72.32 | 11.79 | 21.70 | 33.99 | 39.31 | 68.63 | 71.50 | **73.14** | 68.50 | 73.10 |
| Macro-average | MC | 44.23 | 44.23 | 44.23 | 44.23 | 44.23 | 44.23 | 44.23 | 44.23 | 44.23 | 44.23 |
| | LR | 81.85 | 44.63 | 55.59 | 74.35 | 75.13 | 77.69 | **82.08** | 79.95 | 78.77 | 79.88 |
| | RF | **82.27** | 50.31 | 61.08 | 70.41 | 71.26 | 80.76 | 75.65 | 79.09 | 80.90 | 79.93 |
| | SVM | 82.33 | 50.14 | 55.14 | 60.60 | 63.26 | 80.64 | 81.90 | **83.30** | 80.39 | 83.25 |
The best-performing feature set per algorithm is highlighted in bold.
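The macro-average rows are the unweighted mean of the two per-class F-scores, so the minority Sarcastic class counts as much as the majority class (which is why the MC baseline's macro score collapses to 44.23 despite its 88.47 on Non-sarcastic). A minimal sketch of that computation, using the SVM + NE column as an example (the helper name `macro_f` is ours):

```python
def macro_f(per_class_scores):
    """Macro-averaged F-score: the unweighted mean of per-class F-scores,
    weighting each class equally regardless of its frequency."""
    return sum(per_class_scores) / len(per_class_scores)

# Per-class F-scores for SVM with the NE feature set, from Table 9.
non_sarcastic_f = 93.46
sarcastic_f = 73.14
print(round(macro_f([non_sarcastic_f, sarcastic_f]), 2))  # 83.3, matching the table
```

Reporting the macro average alongside per-class scores makes degenerate classifiers like the majority-class baseline easy to spot, since ignoring the Sarcastic class halves the macro score.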