
TABLE 2. Simulation study results

Each entry lists the reference number, the algorithms used, the trained parameters, and remarks on computational complexity.
[39]
Algorithms used: ResNet18, ResNet50, SqueezeNet, DenseNet-121; dataset of 5 k images; transfer learning (training on 2 k images)
Trained parameters: Weights, layers of neurons
Computational complexity: The number of channels in a DenseNet architecture can be reduced because each layer receives the feature maps of all preceding layers, so it has higher computational and memory efficiency

[40]
Algorithms used: CovXNet
Trained parameters: Kernels, dilation rate, weights, fine-tuning layers
Computational complexity: The traditional convolution is divided into depthwise and pointwise convolutions, which greatly reduces the time and computation required for the operation (see the parameter-count sketch after this table)

[41]
Algorithms used: CoroNet
Trained parameters: Fine-tuning layers, batch size, optimizer
Computational complexity: CoroNet showed promising results on the dataset supplied; the performance can be further enhanced when more training data is provided

[35]
Algorithms used: Comparison of three models: Inception V3, Xception, ResNeXt
Trained parameters: Layers, weights, bias
Computational complexity: The Xception network provides the best performance and is best suited for use; the authors were able to classify COVID-19 images correctly, demonstrating the potential of such systems for automating diagnostic activities in the near future

[42]
Algorithms used: COVID-XNet
Trained parameters: Layers, kernel size, optimizer
Computational complexity: The model was selected through an exhaustive grid search over the number of layers and kernel sizes, prioritizing accuracy and computational complexity (see the grid-search sketch after this table)

[23]
Algorithms used: VGG19, MobileNet V2, Inception, Xception, Inception ResNet V2
Trained parameters: Layers, classifier
Computational complexity: Transfer learning is used to avoid the computational cost of training a very deep network from scratch and to keep the important feature extractors learned during the initial step (a minimal fine-tuning sketch follows the table)

[43]
Algorithms used: DarkNet model providing binary classification (COVID vs. no findings) and multiclass classification (COVID vs. no findings vs. pneumonia)
Trained parameters: Layers
Computational complexity: As is typical in CNN structures, a convolution layer captures features from the input with the filters it applies, and a pooling layer reduces the size for computational performance

[34]
Algorithms used: Experimental comparison of VGG19, ResNet-50, and COVID-Net
Trained parameters: Batch size, epochs, learning rate, factor, patience
Computational complexity: The VGG-19 and ResNet-50 architectures were far more sophisticated, whereas COVID-Net was significantly simpler; even so, COVID-Net outperformed VGG-19 and ResNet-50 in terms of test accuracy and COVID-19 sensitivity

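The depthwise/pointwise split noted for CovXNet [40] can be made concrete with a simple parameter count. The following sketch compares a standard convolution with a depthwise-separable one; the channel and kernel sizes are illustrative assumptions, not values taken from the paper.

```python
# Parameter-count comparison: standard convolution vs. depthwise-separable
# convolution (depthwise followed by pointwise), as used in CovXNet [40].
# The channel and kernel sizes below are illustrative assumptions.
c_in, c_out, k = 64, 128, 3            # input channels, output channels, kernel size

standard = c_in * c_out * k * k        # one dense k x k convolution
depthwise = c_in * k * k               # one k x k filter per input channel
pointwise = c_in * c_out               # 1 x 1 convolution mixing channels
separable = depthwise + pointwise

print(f"standard convolution parameters : {standard}")     # 73728
print(f"separable convolution parameters: {separable}")    # 8768
print(f"reduction factor                : {standard / separable:.1f}x")
```

Because the cost of a convolution scales with the number of weights applied at each spatial position, roughly the same factor carries over to the computation time reduction mentioned in the table.
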
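The exhaustive grid search reported for COVID-XNet [42] amounts to evaluating every combination of depth and kernel size and keeping the configuration with the best trade-off between accuracy and complexity. The sketch below shows only the loop structure; `train_and_score` is a hypothetical placeholder standing in for actually training and validating a CNN per configuration.

```python
# Sketch of an exhaustive grid search over network depth and kernel size,
# in the spirit of the selection procedure described for COVID-XNet [42].
from itertools import product

def train_and_score(num_layers: int, kernel_size: int) -> float:
    # Placeholder score so the sketch runs end to end; a real search would
    # return, e.g., validation accuracy penalized by model complexity.
    return -abs(num_layers - 4) - 0.01 * kernel_size

layer_grid = [3, 4, 5, 6]
kernel_grid = [3, 5, 7]

best_config, best_score = None, float("-inf")
for num_layers, kernel_size in product(layer_grid, kernel_grid):
    score = train_and_score(num_layers, kernel_size)
    if score > best_score:
        best_config, best_score = (num_layers, kernel_size), score

print("selected configuration (layers, kernel size):", best_config)
```
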
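Several of the studies above ([39], [23]) rely on transfer learning, i.e., reusing a pretrained feature extractor and training only a new classifier head. Below is a minimal sketch of this idea; the torchvision ResNet-18 backbone and the two-class head are illustrative assumptions, not the exact setups of those papers.

```python
# Minimal transfer-learning sketch: freeze a pretrained backbone and train
# only a new classification head. Backbone choice and class count are
# illustrative assumptions, not the configurations used in [39]/[23].
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so its weights stay fixed.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a trainable two-class head
# (e.g., COVID-19 vs. normal).
model.fc = nn.Linear(model.fc.in_features, 2)

# Only the new head's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```

Fine-tuning variants additionally unfreeze some of the later backbone layers, which is what the "fine-tuning layers" entries in the table refer to.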