TABLE 3. Layer-by-layer architecture of the network.
| Layer type | Details |
| --- | --- |
| Input | 256 × 256 × 1 |
| Conv2D | 32 filters, 7 × 7 kernel, stride 2 |
| MaxPooling2D | 2 × 2 |
| Conv2D | 64 filters, 5 × 5 kernel, stride 1 |
| MaxPooling2D | 2 × 2 |
| Conv2D | 128 filters, 3 × 3 kernel, stride 1 |
| MaxPooling2D | 2 × 2 |
| Conv2D | 256 filters, 3 × 3 kernel, stride 1 |
| MaxPooling2D | 2 × 2 |
| Conv2D | 512 filters, 3 × 3 kernel, stride 1 |
| Flatten | (none) |
| Dense | 256 units |
| Dropout | rate 0.5 |
| Dense | 4 units |
L2 kernel and activity regularization (weight 1e−06) was applied to each Conv2D layer, with the layers' bias terms disabled, and batch normalization (momentum = 0.01) was inserted between each Conv2D and MaxPooling2D pair. ReLU activation was used for all Conv2D layers and the first Dense layer, and Softmax activation was applied to the output layer.
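As a sanity check on the table, the feature-map sizes through the network can be traced in plain Python. This is a minimal sketch, assuming TensorFlow-style `'same'` padding for the Conv2D layers and non-overlapping 2 × 2 pooling (the table does not state the padding mode, so the exact sizes may differ in the original implementation):

```python
import math

def out_size(n, kernel, stride, padding):
    """Spatial output size of a 2-D conv/pool layer (TensorFlow convention)."""
    if padding == "same":
        return math.ceil(n / stride)
    return (n - kernel) // stride + 1

# (filters, kernel, stride) for each Conv2D row of Table 3; each of the
# first four Conv2D layers is followed by a 2 x 2 MaxPooling2D.
convs = [(32, 7, 2), (64, 5, 1), (128, 3, 1), (256, 3, 1), (512, 3, 1)]

size, channels = 256, 1  # Input: 256 x 256 x 1
for i, (filters, kernel, stride) in enumerate(convs):
    size = out_size(size, kernel, stride, "same")  # assumed 'same' padding
    channels = filters
    if i < 4:                                      # MaxPooling2D, size 2
        size = out_size(size, 2, 2, "valid")
    print(f"after block {i + 1}: {size} x {size} x {channels}")

flat = size * size * channels
print(f"Flatten -> {flat} features -> Dense(256) -> Dropout(0.5) -> Dense(4)")
```

Under these assumptions the spatial resolution falls 256 → 64 → 32 → 16 → 8 across the pooled blocks, so the Flatten layer feeds 8 × 8 × 512 = 32768 features into the 256-unit Dense layer.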