Table 1. Layer-by-layer architecture of the network. For convolutional and pooling layers, the Size column gives the kernel or pool size; for the fully connected and LSTM layers, it gives the number of units.
| Layer | Name | Channels | Size |
|---|---|---|---|
| 0 | Input | 1 | (163, 148) |
| 1 | Convolution | 16 | (3, 3) |
| | Max-pooling | 16 | (2, 2) |
| | Activation (ReLU) | —— | —— |
| | Batch normalization | —— | —— |
| 2 | Convolution | 32 | (3, 3) |
| | Max-pooling | 32 | (2, 2) |
| | Activation (ReLU) | —— | —— |
| | Batch normalization | —— | —— |
| 3 | Convolution | 64 | (3, 3) |
| | Activation (ReLU) | —— | —— |
| | Batch normalization | —— | —— |
| 4 | Convolution | 64 | (3, 3) |
| | Activation (ReLU) | —— | —— |
| | Batch normalization | —— | —— |
| 5 | Convolution | 128 | (3, 3) |
| | Max-pooling | 128 | (2, 2) |
| | Activation (ReLU) | —— | —— |
| | Batch normalization | —— | —— |
| 6 | Flatten | —— | —— |
| 7 | Fully connected | —— | 278 |
| 8 | LSTM 1 | —— | 800 |
| | Activation (tanh) | —— | —— |
| 9 | LSTM 2 | —— | 800 |
| | Activation (tanh) | —— | —— |
| 10 | Dropout (0.2) | —— | —— |
| 11 | Fully connected | —— | 278 |
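The sketch below is one plausible Keras reading of Table 1. The table does not specify padding, strides, or how the 278-unit fully connected output is turned into a sequence for the LSTM layers, so those details here (valid padding, a hypothetical `timesteps` parameter, and a `RepeatVector` bridge) are assumptions rather than the original implementation; `build_model` is likewise a name introduced for illustration.

```python
# One plausible reconstruction of the Table 1 architecture in Keras.
# Padding, strides, and the RepeatVector bridge between the dense layer
# and the LSTMs are assumptions; the table does not specify them.
from tensorflow.keras import layers, models


def build_model(timesteps=10):  # `timesteps` is a hypothetical parameter
    m = models.Sequential()
    m.add(layers.Input(shape=(163, 148, 1)))       # layer 0: single-channel input
    # Layers 1-2: convolution -> max-pooling -> ReLU -> batch normalization
    for filters in (16, 32):
        m.add(layers.Conv2D(filters, (3, 3)))
        m.add(layers.MaxPooling2D((2, 2)))
        m.add(layers.Activation("relu"))
        m.add(layers.BatchNormalization())
    # Layers 3-4: convolution -> ReLU -> batch normalization (no pooling)
    for filters in (64, 64):
        m.add(layers.Conv2D(filters, (3, 3)))
        m.add(layers.Activation("relu"))
        m.add(layers.BatchNormalization())
    # Layer 5: convolution -> max-pooling -> ReLU -> batch normalization
    m.add(layers.Conv2D(128, (3, 3)))
    m.add(layers.MaxPooling2D((2, 2)))
    m.add(layers.Activation("relu"))
    m.add(layers.BatchNormalization())
    # Layers 6-7: flatten and project to 278 units
    m.add(layers.Flatten())
    m.add(layers.Dense(278))
    # Layers 8-9: two 800-unit LSTMs with tanh activations.
    # RepeatVector is an assumption: the table does not say how the
    # dense output becomes a sequence.
    m.add(layers.RepeatVector(timesteps))
    m.add(layers.LSTM(800, activation="tanh", return_sequences=True))
    m.add(layers.LSTM(800, activation="tanh"))
    # Layers 10-11: dropout and the final 278-unit fully connected layer
    m.add(layers.Dropout(0.2))
    m.add(layers.Dense(278))
    return m


model = build_model()
model.summary()
```

Calling `model.summary()` prints the per-layer output shapes, which is a quick way to check any reconstruction against the channel and size columns of Table 1.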