Sensors. 2020 Aug 11;20(16):4485. doi: 10.3390/s20164485

Table 2. Detailed architecture for the generator.

| Layer | Type | Filter Size | Output Dimension | Activation | Note |
|---|---|---|---|---|---|
| 1 | Input | – | (100, 1, 1) | ReLU | |
| | Batch norm | – | (100, 1, 1) | – | Momentum = 0.8 |
| 2 | Deconvolution | 3 × 3 (1024) | (1024, 4, 4) | ReLU | |
| | Batch norm | – | (1024, 4, 4) | – | |
| 3 | Deconvolution | 3 × 3 (512) | (512, 8, 8) | ReLU | |
| | Batch norm | – | (512, 8, 8) | – | |
| 4 | Deconvolution | 3 × 3 (256) | (256, 16, 16) | ReLU | |
| | Batch norm | – | (256, 16, 16) | – | |
| 5 | Deconvolution | 3 × 3 (128) | (128, 32, 32) | ReLU | |
| | Batch norm | – | (128, 32, 32) | – | |
| 6 | Output | 3 × 3 (3) | (3, 64, 64) | Tanh | |

ReLU: Rectified linear unit.
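The output dimensions in Table 2 can be checked with a short shape-arithmetic sketch. This is an assumption-laden illustration, not the paper's implementation: it assumes each "deconvolution" block is nearest-neighbour upsampling followed by a same-padded 3 × 3 convolution (one common DCGAN construction, consistent with the 3 × 3 filters listed, which cannot enlarge the feature map on their own), and the helper names `conv_same` and `upsample` are hypothetical.

```python
# Sketch: walk the generator's shapes layer by layer, assuming
# upsample-then-3x3-same-convolution "deconvolution" blocks.
# Shapes are channels-first (C, H, W), matching Table 2.

def conv_same(shape, filters):
    """3x3 convolution with 'same' padding: channels change, spatial size does not."""
    _, h, w = shape
    return (filters, h, w)

def upsample(shape, factor=2):
    """Nearest-neighbour upsampling: spatial size multiplied by factor."""
    c, h, w = shape
    return (c, h * factor, w * factor)

shape = (100, 1, 1)                          # layer 1: latent input
shape = conv_same(upsample(shape, 4), 1024)  # layer 2: (1024, 4, 4)
for filters in (512, 256, 128):              # layers 3-5: 8x8, 16x16, 32x32
    shape = conv_same(upsample(shape), filters)
shape = conv_same(upsample(shape), 3)        # layer 6: output image
print(shape)  # (3, 64, 64)
```

Batch normalization leaves the shape unchanged, so it is omitted from the walk; the Tanh output activation maps the final (3, 64, 64) tensor into [-1, 1], the usual range for GAN-generated images.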