Table 5.
Main added features in the CNNs.
| Network | What’s Novel? |
|---|---|
| AlexNet | - Applying Rectified Linear Units (ReLU) to add non-linearity |
| VGG-16 | - Deep network, approximately twice as deep as AlexNet |
| GoogLeNet | - Using dense modules as opposed to stacking convolutional layers |
| ResNets | - Using batch normalization and skip connections |
| Inception-ResNet-V2 | - Using residual inception blocks instead of inception modules<br>- Adding the inception module (Inception-A) after the stem module<br>- Using more inception modules |
| DarkNet-19 | - Combining Darknet feature extraction, Network In Network, Inception, and Batch Normalization in a single model |
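Three of the building blocks listed above (ReLU non-linearity, batch normalization, and skip connections) can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification: the linear layer stands in for a convolution, and the `residual_block` function name is hypothetical, not from any of the cited architectures.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit (AlexNet): zeroes out negative
    # activations to introduce non-linearity
    return np.maximum(0.0, x)

def batch_norm(x, eps=1e-5):
    # Batch normalization (ResNets): rescale each feature to
    # zero mean and unit variance across the batch dimension
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def residual_block(x, weight):
    # Skip connection (ResNets): the input x is added back to
    # the transformed output, so the block learns a residual F(x)
    fx = relu(batch_norm(x @ weight))
    return fx + x

# Toy usage: a 2-sample batch with 2 features and an identity weight
x = np.array([[1.0, -2.0], [3.0, 4.0]])
out = residual_block(x, np.eye(2))
```

In a real network the matrix product would be a convolution and the block would carry learned scale/shift parameters, but the additive skip path shown here is the key idea that lets very deep models train.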