Table 2. Layer-by-layer network architecture.
Layer | Type | In | Out | Kernel size | Stride | Pad | Normalization | Activation
---|---|---|---|---|---|---|---|---
1 | Conv | 1 | 16 | 3 | 1 | 1 | BN | LReLU |
2 | Conv | 16 | 32 | 3 | 1 | 1 | BN | LReLU |
3 | Conv | 32 | 64 | 3 | 2 | 1 | BN | LReLU |
4 | Res | 64/64 | 64/64 | 3/3 | 1/1 | 1/1 | BN/BN | LReLU/LReLU |
5 | Res | 64/64 | 64/64 | 3/3 | 1/1 | 1/1 | BN/BN | LReLU/LReLU |
6 | Res | 64/64 | 64/64 | 3/3 | 1/1 | 1/1 | BN/BN | LReLU/LReLU |
7 | Res | 64/64 | 64/64 | 3/3 | 1/1 | 1/1 | BN/BN | LReLU/LReLU |
8 | Conv | 64 | 32 | 3 | 1 | 1 | BN | LReLU |
9 | Conv | 32 | 1 | 3 | 1 | 1 | - | Sigmoid |
Conv, convolutional layer; Res, residual block; Pool, max-pooling layer; BN, batch normalization; LReLU, leaky rectified linear unit.
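For concreteness, the following is a minimal PyTorch sketch of the network in Table 2. It is an illustration under stated assumptions, not the authors' implementation: the residual blocks are assumed to be two 3×3 conv-BN-LReLU layers with an identity skip connection, and all module and variable names (`conv_block`, `ResBlock`, `Net`) are placeholders.

```python
# Sketch of the architecture in Table 2 (assumption: residual blocks use an
# identity skip connection around two 3x3 conv-BN-LReLU layers).
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch, stride=1, norm=True, act="lrelu"):
    """3x3 convolution, optional batch normalization, and activation."""
    layers = [nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride,
                        padding=1, bias=not norm)]
    if norm:
        layers.append(nn.BatchNorm2d(out_ch))
    if act == "lrelu":
        layers.append(nn.LeakyReLU(inplace=True))
    elif act == "sigmoid":
        layers.append(nn.Sigmoid())
    return nn.Sequential(*layers)


class ResBlock(nn.Module):
    """Residual block: two 3x3 conv-BN-LReLU layers plus identity skip (assumed form)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(conv_block(ch, ch), conv_block(ch, ch))

    def forward(self, x):
        return x + self.body(x)


class Net(nn.Module):
    """Layers 1-9 of Table 2."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            conv_block(1, 16),              # layer 1
            conv_block(16, 32),             # layer 2
            conv_block(32, 64, stride=2),   # layer 3 (halves spatial size)
            ResBlock(64),                   # layer 4
            ResBlock(64),                   # layer 5
            ResBlock(64),                   # layer 6
            ResBlock(64),                   # layer 7
            conv_block(64, 32),             # layer 8
            conv_block(32, 1, norm=False, act="sigmoid"),  # layer 9
        )

    def forward(self, x):
        return self.features(x)


if __name__ == "__main__":
    # A single-channel 64x64 input yields a single-channel 32x32 output map,
    # since only the stride-2 convolution in layer 3 changes the resolution.
    x = torch.randn(1, 1, 64, 64)
    print(Net()(x).shape)  # torch.Size([1, 1, 32, 32])
```

Note that only layer 3 reduces the spatial resolution (stride 2); all other layers preserve it, and the final sigmoid-activated convolution maps the features back to a single-channel output.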