Table A3. Layer-by-layer specification of the convolutional encoder-decoder network.
Layer | Input | Kernel size | Activation function | Output channels | Stride | Output |
---|---|---|---|---|---|---|
Convolution | lin | 3 | LeakyReLU | 32 | 1 | l0 |
Convolution+MaxPooling | l0 | 3 | LeakyReLU | 32 | 1 | l1 |
Convolution | l1 | 3 | LeakyReLU | 64 | 1 | l2 |
Convolution+MaxPooling | l2 | 3 | LeakyReLU | 64 | 1 | l3 |
Convolution | l3 | 3 | LeakyReLU | 128 | 1 | l4 |
Convolution+MaxPooling | l4 | 3 | LeakyReLU | 128 | 1 | l5 |
Convolution | l5 | 3 | LeakyReLU | 256 | 1 | l6 |
Convolution+BilinearUp2 | l6 | 3 | LeakyReLU | 256 | 1 | l7 |
Convolution | l7 | 3 | LeakyReLU | 128 | 1 | l8 |
Convolution+BilinearUp2 | l8 | 3 | LeakyReLU | 128 | 1 | l9 |
Convolution | l9 | 3 | LeakyReLU | 64 | 1 | l10 |
Convolution+BilinearUp2 | l10 | 3 | LeakyReLU | 64 | 1 | l11 |
Convolution | l11 | 3 | LeakyReLU | 32 | 1 | l12 |
Convolution | l12 | 3 | tanh | 2 | 1 | lout |
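The sketch below is a minimal PyTorch rendering of the table, included only to make the layer sequence concrete. It assumes details the table does not state: 3x3 convolutions with padding 1, max pooling with kernel and stride 2, bilinear upsampling with scale factor 2, the default LeakyReLU negative slope, and an `in_channels` argument because the channel count of l_in is not given.

```python
import torch
import torch.nn as nn


class EncoderDecoder(nn.Module):
    """Sequential encoder-decoder following Table A3 (assumed hyperparameters noted above)."""

    def __init__(self, in_channels: int = 1):
        super().__init__()

        def conv(cin, cout, act=None):
            # 3x3 convolution, stride 1; padding=1 is an assumption to preserve spatial size.
            return nn.Sequential(
                nn.Conv2d(cin, cout, kernel_size=3, stride=1, padding=1),
                act if act is not None else nn.LeakyReLU(),
            )

        up = lambda: nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)

        self.net = nn.Sequential(
            conv(in_channels, 32),                      # l0
            conv(32, 32), nn.MaxPool2d(2),              # l1
            conv(32, 64),                               # l2
            conv(64, 64), nn.MaxPool2d(2),              # l3
            conv(64, 128),                              # l4
            conv(128, 128), nn.MaxPool2d(2),            # l5
            conv(128, 256),                             # l6
            conv(256, 256), up(),                       # l7
            conv(256, 128),                             # l8
            conv(128, 128), up(),                       # l9
            conv(128, 64),                              # l10
            conv(64, 64), up(),                         # l11
            conv(64, 32),                               # l12
            conv(32, 2, act=nn.Tanh()),                 # l_out: 2 channels, tanh activation
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    # Example: spatial size must be divisible by 8 so the three pooling/upsampling stages round-trip.
    model = EncoderDecoder(in_channels=1)
    y = model(torch.randn(1, 1, 64, 64))
    print(y.shape)  # torch.Size([1, 2, 64, 64])
```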