Table 1. Layer configuration of the network.
No | Layer | Operation | Type | Kernel size | Number of kernels | Output |
---|---|---|---|---|---|---|
1 | - | - | Input | - | - | 100 × 100 × 3 |
2 | L1 | Convolution | Conv2D | 7 × 7 | 16 | 94 × 94 × 16 |
3 | - | Activation function | ReLU | - | - | - |
4 | L2 | Convolution | Conv2D | 5 × 5 | 32 | 90 × 90 × 32 |
5 | - | Activation function | ReLU | - | - | - |
6 | L3 | Fully connected layer | FC | - | - | 256 |
7 | - | Activation function | ReLU | - | - | - |
8 | - | Dropout layer | Dropout | - | - | - |
9 | L4 | Fully connected layer | FC | - | - | 128 |
10 | - | Activation function | ReLU | - | - | - |
11 | L5 | Fully connected layer | FC | - | - | 6 |
12 | L6 | Activation function | Sigmoid | - | - | 6 |
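The spatial output sizes in Table 1 are consistent with unpadded ("valid"), stride-1 convolutions, where each output dimension is input − kernel + 1. A minimal sketch verifying the table's shapes (the layer dimensions come from the table; the helper function name and its stride/padding defaults are illustrative assumptions, not part of the original description):

```python
def conv2d_output(size: int, kernel: int, stride: int = 1, padding: int = 0) -> int:
    """Spatial output size of a 2-D convolution along one axis."""
    return (size + 2 * padding - kernel) // stride + 1

# L1: 100 x 100 x 3 input, 7 x 7 kernel, 16 filters -> 94 x 94 x 16
s1 = conv2d_output(100, 7)

# L2: 5 x 5 kernel, 32 filters -> 90 x 90 x 32
s2 = conv2d_output(s1, 5)

# Number of features flattened into the first fully connected layer (L3)
flat = s2 * s2 * 32

print(s1, s2, flat)  # 94 90 259200
```

Note that the table lists no pooling layers, so the 90 × 90 × 32 feature map is flattened directly into 259,200 inputs for the 256-unit FC layer, which is where most of the network's parameters reside.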