Table 3. Performance metrics of the evaluated models.
| Model | Accuracy (%) | Precision | Recall | F1-score |
|---|---|---|---|---|
| LSTM | 95 | 0.90 | 0.95 | 0.93 |
| BiLSTM | 97 | 0.97 | 0.97 | 0.97 |
| GRU | 95 | 0.91 | 0.95 | 0.93 |
| RNN | 95 | 0.91 | 0.95 | 0.93 |
| Conv1d | 97 | 0.97 | 0.97 | 0.97 |