Sensors. 2017 Dec 17;17(12):2933. doi: 10.3390/s17122933

Table 2. The proposed CNN architecture used in our research (ReLU means rectified linear unit).

| Layer Type | Number of Filters | Size of Feature Map (Height × Width × Channel) | Kernel (Filter) Size (Height × Width × Channel) | Stride (Height × Width) | Padding (Height × Width) |
|---|---|---|---|---|---|
| Image input layer | | 8 × 256 × 3 | | | |
| 1st convolutional layer | 64 | 8 × 244 × 64 | 1 × 13 × 3 | 1 × 1 | 0 × 0 |
| Batch normalization | | 8 × 244 × 64 | | | |
| ReLU layer | | 8 × 244 × 64 | | | |
| 2nd convolutional layer | 64 | 8 × 232 × 64 | 1 × 13 × 64 | 1 × 1 | 0 × 0 |
| Batch normalization | | 8 × 232 × 64 | | | |
| ReLU layer | | 8 × 232 × 64 | | | |
| Max pooling layer | 1 | 8 × 116 × 64 | 1 × 2 × 64 | 1 × 2 | 0 × 0 |
| 3rd convolutional layer | 128 | 8 × 104 × 128 | 1 × 13 × 64 | 1 × 1 | 0 × 0 |
| Batch normalization | | 8 × 104 × 128 | | | |
| ReLU layer | | 8 × 104 × 128 | | | |
| 4th convolutional layer | 128 | 8 × 92 × 128 | 1 × 13 × 128 | 1 × 1 | 0 × 0 |
| Batch normalization | | 8 × 92 × 128 | | | |
| ReLU layer | | 8 × 92 × 128 | | | |
| Max pooling layer | 1 | 8 × 46 × 128 | 1 × 2 × 128 | 1 × 2 | 0 × 0 |
| 5th convolutional layer | 256 | 8 × 36 × 256 | 1 × 11 × 128 | 1 × 1 | 0 × 0 |
| Batch normalization | | 8 × 36 × 256 | | | |
| ReLU layer | | 8 × 36 × 256 | | | |
| 6th convolutional layer | 256 | 8 × 26 × 256 | 1 × 11 × 256 | 1 × 1 | 0 × 0 |
| Batch normalization | | 8 × 26 × 256 | | | |
| ReLU layer | | 8 × 26 × 256 | | | |
| Max pooling layer | 1 | 8 × 13 × 256 | 1 × 2 × 256 | 1 × 2 | 0 × 0 |
| 7th convolutional layer | 512 | 6 × 11 × 512 | 3 × 3 × 256 | 1 × 1 | 0 × 0 |
| Batch normalization | | 6 × 11 × 512 | | | |
| ReLU layer | | 6 × 11 × 512 | | | |
| 8th convolutional layer | 512 | 4 × 9 × 512 | 3 × 3 × 512 | 1 × 1 | 0 × 0 |
| Batch normalization | | 4 × 9 × 512 | | | |
| ReLU layer | | 4 × 9 × 512 | | | |
| Max pooling layer | 1 | 4 × 5 × 512 | 1 × 2 × 512 | 1 × 2 | 0 × 1 |
| 1st fully connected layer | | 4096 | | | |
| Batch normalization | | 4096 | | | |
| ReLU layer | | 4096 | | | |
| 2nd fully connected layer | | 4096 | | | |
| Batch normalization | | 4096 | | | |
| ReLU layer | | 4096 | | | |
| 3rd fully connected layer | | # of classes | | | |
| Softmax layer | | # of classes | | | |
| Classification layer (output layer) | | # of classes | | | |
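
For concreteness, the layer stack in Table 2 can be transcribed directly into a model definition. The following is a minimal sketch, assuming PyTorch and NCHW input tensors of shape (batch, 3, 8, 256); the names `Table2CNN` and `conv_block` are ours, the framework choice is an assumption (the table itself is framework-agnostic), and `num_classes` is a placeholder for the dataset's class count. The commented shapes match the "Size of Feature Map" column.

```python
# Sketch of the Table 2 architecture; an assumption-laden transcription,
# not the authors' original implementation.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch, kernel):
    """One convolution -> batch normalization -> ReLU triple from the table."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=kernel, stride=1, padding=0),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class Table2CNN(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Sequential(
            conv_block(3, 64, (1, 13)),     # 1st conv: 8 x 244 x 64
            conv_block(64, 64, (1, 13)),    # 2nd conv: 8 x 232 x 64
            nn.MaxPool2d((1, 2), stride=(1, 2)),           # 8 x 116 x 64
            conv_block(64, 128, (1, 13)),   # 3rd conv: 8 x 104 x 128
            conv_block(128, 128, (1, 13)),  # 4th conv: 8 x 92 x 128
            nn.MaxPool2d((1, 2), stride=(1, 2)),           # 8 x 46 x 128
            conv_block(128, 256, (1, 11)),  # 5th conv: 8 x 36 x 256
            conv_block(256, 256, (1, 11)),  # 6th conv: 8 x 26 x 256
            nn.MaxPool2d((1, 2), stride=(1, 2)),           # 8 x 13 x 256
            conv_block(256, 512, (3, 3)),   # 7th conv: 6 x 11 x 512
            conv_block(512, 512, (3, 3)),   # 8th conv: 4 x 9 x 512
            nn.MaxPool2d((1, 2), stride=(1, 2), padding=(0, 1)),  # 4 x 5 x 512
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),                   # 512 * 4 * 5 = 10240 features
            nn.Linear(512 * 4 * 5, 4096),   # 1st fully connected layer
            nn.BatchNorm1d(4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, 4096),          # 2nd fully connected layer
            nn.BatchNorm1d(4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),   # 3rd fully connected layer
        )

    def forward(self, x):
        # Softmax is omitted here: nn.CrossEntropyLoss applies it internally.
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = Table2CNN(num_classes=10)  # class count is a placeholder
    logits = model(torch.randn(2, 3, 8, 256))
    print(logits.shape)  # torch.Size([2, 10])
```

Note that every spatial size in the table follows from the usual convolution arithmetic, output = (input + 2·padding − kernel) / stride + 1; e.g., the 1st convolution maps width 256 to (256 − 13)/1 + 1 = 244, and the final max pooling, with width padding 1, maps width 9 to (9 + 2 − 2)/2 + 1 = 5.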