Table 2. Layer-by-layer architecture of the network: a CNN feature extractor followed by a Bi-GRU and an attention layer.
| Network | Layer | Shape | Out | Padding | Stride | Kernel |
|---|---|---|---|---|---|---|
| CNN | Conv | | 64 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Conv | | 64 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Maxpool (size = 3) | | - | Same | 3 | - |
| | Conv | | 128 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Conv | | 128 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Maxpool (size = 3) | | - | Same | 3 | - |
| | Conv | | 256 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Conv | | 256 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Conv | | 256 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Maxpool (size = 3) | | - | Same | 3 | - |
| | Conv | | 512 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Conv | | 512 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Conv | | 512 | Same | 1 | 3 |
| | BN + ReLU | | | | | |
| | Maxpool (size = 3) | | - | Same | 3 | - |
| Bi-GRU | Forward | | 64 | - | | |
| | Backward | | 64 | - | | |
| | Concatenation | | | | | |
| Attention | 1-layer perceptron | | 1 | - | | |
| | Activation (tanh) | | | | | |
| | Softmax | | | | | |
| | Weighted sum | | | | | |
| | 1-layer perceptron | 128 | 2 | - | | |
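As a reading aid, the table could be realized as the following PyTorch sketch. It assumes 1-D convolutions over a single-channel input sequence and two output classes, since Table 2 does not state the input dimensionality or channel count; the class and parameter names (`CNNBiGRUAttention`, `in_ch`) and the input length in the usage lines are illustrative, not from the source.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvBlock(nn.Module):
    """Conv -> BN -> ReLU with 'same' padding (kernel 3, stride 1)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size=3, stride=1, padding=1)
        self.bn = nn.BatchNorm1d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

class CNNBiGRUAttention(nn.Module):
    """Sketch of Table 2: CNN blocks, Bi-GRU, attention, output perceptron.
    in_ch=1 is an assumption; the table does not give the input channels."""
    def __init__(self, in_ch=1, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            ConvBlock(in_ch, 64), ConvBlock(64, 64),
            nn.MaxPool1d(3, stride=3, padding=1),   # 'same'-style pooling
            ConvBlock(64, 128), ConvBlock(128, 128),
            nn.MaxPool1d(3, stride=3, padding=1),
            ConvBlock(128, 256), ConvBlock(256, 256), ConvBlock(256, 256),
            nn.MaxPool1d(3, stride=3, padding=1),
            ConvBlock(256, 512), ConvBlock(512, 512), ConvBlock(512, 512),
            nn.MaxPool1d(3, stride=3, padding=1),
        )
        # Bi-GRU: 64 units per direction; concatenation gives 128 per step
        self.gru = nn.GRU(512, 64, batch_first=True, bidirectional=True)
        # Attention: 1-layer perceptron -> tanh -> softmax -> weighted sum
        self.attn = nn.Linear(128, 1)
        # Output: 1-layer perceptron mapping the 128-dim context to 2 classes
        self.out = nn.Linear(128, num_classes)

    def forward(self, x):                       # x: (batch, in_ch, length)
        h = self.features(x)                    # (batch, 512, length')
        h = h.transpose(1, 2)                   # (batch, length', 512)
        h, _ = self.gru(h)                      # (batch, length', 128)
        scores = torch.tanh(self.attn(h))       # (batch, length', 1)
        weights = F.softmax(scores, dim=1)      # attention over time steps
        context = (weights * h).sum(dim=1)      # (batch, 128) weighted sum
        return self.out(context)                # (batch, 2) logits

model = CNNBiGRUAttention(in_ch=1)
logits = model(torch.randn(8, 1, 243))          # length 243 is illustrative
```

With four stride-3 pooling layers, an input of length L reaches the Bi-GRU with roughly L/81 time steps; the attention then collapses these into a single 128-dimensional context vector before the final 128 → 2 perceptron.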