Table 6.

| Baselines | Conv Layer | LSTM Layer | Attention Layer | Output Layer |
|---|---|---|---|---|
| LSTM | / | | / | |
| ABLSTM | / | | tanh | SoftMax |
| DeepConvLSTM | | | | Sigmoid |
| TPN | Dropout = 0.1, Maxpool(8) | / | / | |
| MSRLSTM | Res, Maxpool(2) | | | SoftMax |
Note: CNN (a),(b) denotes a convolutional layer, where a is the convolutional kernel size and b is the number of kernels; LSTM (c) denotes an LSTM layer, where c is the hidden-layer size; FC (d) denotes a fully connected layer, where d is the size of the FC layer; Maxpool (e) denotes a max-pooling layer, where e is the pooling kernel size; Res denotes a residual connection after the preceding convolutional layer.
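To make the layer notation concrete, the following is a minimal PyTorch-style sketch (not the authors' implementation) of a generic Conv + LSTM baseline assembled from the building blocks named in the note. All concrete sizes in it, including the kernel size 3, 64 kernels, hidden size 128, FC size 100, pooling size 2, the number of input channels, and the number of classes, are illustrative placeholders rather than values taken from Table 6.

```python
# Illustrative sketch only: maps the notation CNN (a),(b), LSTM (c), FC (d),
# Maxpool (e) onto concrete layers. All sizes are hypothetical examples.
import torch
import torch.nn as nn

class ExampleBaseline(nn.Module):
    def __init__(self, in_channels=6, num_classes=8):
        super().__init__()
        # CNN (3),(64): convolutional layer with kernel size a = 3 and b = 64 kernels
        self.conv = nn.Conv1d(in_channels, 64, kernel_size=3, padding=1)
        # Maxpool (2): max-pooling layer with pooling kernel size e = 2
        self.pool = nn.MaxPool1d(kernel_size=2)
        # LSTM (128): LSTM layer with hidden-layer size c = 128
        self.lstm = nn.LSTM(input_size=64, hidden_size=128, batch_first=True)
        # FC (100): fully connected layer of size d = 100
        self.fc = nn.Linear(128, 100)
        # Output layer: class scores followed by SoftMax
        self.out = nn.Linear(100, num_classes)

    def forward(self, x):                 # x: (batch, channels, time)
        h = self.pool(torch.relu(self.conv(x)))
        h = h.transpose(1, 2)             # (batch, time, features) for the LSTM
        h, _ = self.lstm(h)
        h = torch.relu(self.fc(h[:, -1])) # use the last time step's hidden state
        return torch.softmax(self.out(h), dim=-1)
```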