Table 2.
| Layer | Type | Filter Size | Output Dimension | Activation | Note |
|---|---|---|---|---|---|
| 1 | Input | | (100, 1, 1) | ReLU | |
| | Batch norm | | (100, 1, 1) | | Momentum = 0.8 |
| 2 | Deconvolution | 3 × 3 (1024) | (1024, 4, 4) | ReLU | |
| | Batch norm | | (1024, 4, 4) | | |
| 3 | Deconvolution | 3 × 3 (512) | (512, 8, 8) | ReLU | |
| | Batch norm | | (512, 8, 8) | | |
| 4 | Deconvolution | 3 × 3 (256) | (256, 16, 16) | ReLU | |
| | Batch norm | | (256, 16, 16) | | |
| 5 | Deconvolution | 3 × 3 (128) | (128, 32, 32) | ReLU | |
| | Batch norm | | (128, 32, 32) | | |
| 6 | Output | 3 × 3 (3) | (3, 64, 64) | Tanh | |
ReLU: Rectified linear unit.
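The table specifies filter counts and output dimensions but not strides or padding. As a concrete illustration, below is a minimal PyTorch sketch of a deconvolutional network that reproduces the listed output shapes (a DCGAN-style generator). The stride-2/padding/output-padding choices, the `Generator` class name, and the batch-norm momentum conversion (Keras-style momentum 0.8 corresponds roughly to PyTorch momentum 0.2) are assumptions, not details taken from the source.

```python
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Sketch of the Table 2 network; hyperparameters not given in the table are assumed."""

    def __init__(self, latent_dim: int = 100):
        super().__init__()
        self.net = nn.Sequential(
            # Layer 1: latent input (100, 1, 1), batch norm with momentum = 0.8
            # (assumed Keras-style; PyTorch's running-average momentum ~= 0.2).
            nn.BatchNorm2d(latent_dim, momentum=0.2),
            # Layer 2: 3 x 3 deconvolution, 1024 filters -> (1024, 4, 4)
            nn.ConvTranspose2d(latent_dim, 1024, 3, stride=2, padding=0, output_padding=1),
            nn.ReLU(inplace=True),
            nn.BatchNorm2d(1024, momentum=0.2),
            # Layer 3: 3 x 3 deconvolution, 512 filters -> (512, 8, 8)
            nn.ConvTranspose2d(1024, 512, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(inplace=True),
            nn.BatchNorm2d(512, momentum=0.2),
            # Layer 4: 3 x 3 deconvolution, 256 filters -> (256, 16, 16)
            nn.ConvTranspose2d(512, 256, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(inplace=True),
            nn.BatchNorm2d(256, momentum=0.2),
            # Layer 5: 3 x 3 deconvolution, 128 filters -> (128, 32, 32)
            nn.ConvTranspose2d(256, 128, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(inplace=True),
            nn.BatchNorm2d(128, momentum=0.2),
            # Layer 6 (output): 3 x 3 deconvolution, 3 filters -> (3, 64, 64), Tanh
            nn.ConvTranspose2d(128, 3, 3, stride=2, padding=1, output_padding=1),
            nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)


if __name__ == "__main__":
    # Sanity check: latent vectors shaped (N, 100, 1, 1) map to images shaped (N, 3, 64, 64).
    g = Generator()
    print(g(torch.randn(8, 100, 1, 1)).shape)  # torch.Size([8, 3, 64, 64])
```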