. 2020 Jun 5;12119:282–289. doi: 10.1007/978-3-030-51935-3_30

Table 5.

Main added features in the CNNs.

| Network | What's Novel? |
| --- | --- |
| AlexNet | Applies Rectified Linear Units (ReLU) to add non-linearity |
| VGG-16 | Deep network, roughly twice as deep as AlexNet |
| GoogLeNet | Uses dense Inception modules instead of simply stacking convolutional layers |
| ResNets | Uses batch normalization and skip (residual) connections |
| Inception-ResNet-V2 | Uses residual Inception blocks instead of plain Inception modules; adds an Inception module (Inception-A) after the stem module; uses more Inception modules overall |
| DarkNet-19 | Combines the Darknet feature extractor, Network-in-Network, Inception, and batch normalization in a single model |
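Two of the features listed above can be made concrete with a minimal NumPy sketch: the ReLU non-linearity introduced with AlexNet, and the identity skip connection that defines a ResNet block. The toy "layer" (a single linear transform) and the array shapes are illustrative assumptions, not any network's actual implementation.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: max(0, x), the non-linearity popularized by AlexNet
    return np.maximum(0.0, x)

def residual_block(x, weight):
    # Toy "layer": one linear transform followed by ReLU
    out = relu(x @ weight)
    # ResNet-style skip connection: add the input back to the layer output,
    # so the block computes F(x) + x and only needs to learn the residual F(x)
    return out + x

x = np.array([1.0, -2.0, 3.0])
w = np.eye(3)  # identity weights keep this toy example easy to follow
y = residual_block(x, w)  # relu(x) + x = [2.0, -2.0, 6.0]
```

Because the skip path is an identity mapping, gradients can flow around the transformed path, which is what allows ResNets to be trained at depths where plain stacked networks degrade.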