
Table 2. Layers and properties of the ResNet64 architecture.

ResNet models offer advantages for deep networks, including training simplicity, straightforward network expansion, transfer-learning capability, and backward adaptability; a minimal transfer-learning sketch is given after Table 2.

Layer | Output size | Filters | Activation | Params
--- | --- | --- | --- | ---
Conv2d-1 | [batch_size, 64, 112, 112] | 64 | | 9,408
BatchNorm2d-2 | [batch_size, 64, 112, 112] | 64 | | 128
ReLU-3 | [batch_size, 64, 112, 112] | | ReLU (inplace=True) | 0
MaxPool2d-4 | [batch_size, 64, 56, 56] | | | 0
Conv2d-5 | [batch_size, 64, 56, 56] | 64 | | 4,096
BatchNorm2d-6 | [batch_size, 64, 56, 56] | 64 | | 128
ReLU-7 | [batch_size, 64, 56, 56] | | ReLU | 0
Conv2d-8 | [batch_size, 64, 56, 56] | 64 | | 36,864
BatchNorm2d-9 | [batch_size, 64, 56, 56] | 64 | | 128
ReLU-10 | [batch_size, 64, 56, 56] | | ReLU | 0
Conv2d-11 | [batch_size, 256, 56, 56] | 256 | | 16,384
BatchNorm2d-12 | [batch_size, 256, 56, 56] | 256 | | 512
Conv2d-13 | [batch_size, 256, 56, 56] | 256 | | 16,384
BatchNorm2d-14 | [batch_size, 256, 56, 56] | 256 | | 512
ReLU-15 | [batch_size, 256, 56, 56] | | ReLU | 0
Bottleneck-16 | [batch_size, 256, 56, 56] | | | 0
Conv2d-17 | [batch_size, 64, 56, 56] | 64 | | 16,384
BatchNorm2d-18 | [batch_size, 64, 56, 56] | 64 | | 128
ReLU-19 | [batch_size, 64, 56, 56] | | ReLU | 0
… | … | … | … | …
AdaptiveAvgPool2d-173 | [batch_size, 2048, 1, 1] | | | 0
Linear-174 | [batch_size, 4] | 4 | | Varies
ResNet-175 | [batch_size, 4] | | | 0
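The transfer-learning capability noted above can be illustrated with a short sketch. This is a minimal example, assuming torchvision's pretrained ResNet-50 as a stand-in for the bottleneck backbone summarized in Table 2 and a four-class output head (matching the Linear-174 row); it is not the authors' training code.

```python
# Minimal transfer-learning sketch (assumption: torchvision ResNet-50 used as a
# stand-in for the bottleneck backbone in Table 2, with a 4-class head).
import torch
import torch.nn as nn
from torchvision import models

# Load a bottleneck ResNet pretrained on ImageNet.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)

# Freeze the convolutional backbone so only the new head is trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the 1000-class ImageNet head with a 4-class head
# (matches the Linear-174 row: [batch_size, 2048] -> [batch_size, 4]).
model.fc = nn.Linear(model.fc.in_features, 4)

# Sanity check: a 224x224 RGB batch produces a [batch_size, 4] output.
x = torch.randn(2, 3, 224, 224)
print(model(x).shape)  # torch.Size([2, 4])
```

Freezing the backbone and training only the replaced head is one common transfer-learning setup; the backbone can later be unfrozen for fine-tuning at a lower learning rate.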
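Table 2 itself resembles the output of a layer-summary utility: the sequential numbering (Conv2d-1 through ResNet-175) and the per-layer output shapes and parameter counts follow that format. The sketch below, assuming the torchinfo package and a 224×224 RGB input (which yields the [batch_size, 64, 112, 112] shape reported for Conv2d-1 after the stride-2 stem), shows how such a listing can be generated; the exact tool used by the authors is not specified.

```python
# Sketch of generating a layer-by-layer summary in the style of Table 2,
# assuming the torchinfo package and a 224x224 RGB input.
import torch.nn as nn
from torchvision import models
from torchinfo import summary

model = models.resnet50(weights=None)          # bottleneck ResNet backbone
model.fc = nn.Linear(model.fc.in_features, 4)  # 4-class head, as in Linear-174

summary(
    model,
    input_size=(1, 3, 224, 224),               # (batch, channels, height, width)
    col_names=("output_size", "num_params"),   # mirrors the Table 2 columns
)

# Example of how the parameter counts arise: Conv2d-1 stores 7*7*3*64 = 9,408
# weights (7x7 kernels, 3 input channels, 64 filters, bias disabled).
```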