Sci Rep. 2024 Oct 1;14:22797. doi: 10.1038/s41598-024-71893-3

Table 4.

Comparison of pre-trained transfer learning models with the proposed ViT-GRU under the AdamW, Adam, and SGD optimizers.

| Model | Optimizer | Precision (%) | Recall (%) | F1-score (%) | Training loss | Validation loss | Training time (s) | Training accuracy (%) | Test accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|
| ResNet18 | AdamW | 91 | 91 | 90 | 0.0113 | 0.1577 | 5.83 | 99.88 | 94.29 |
| ResNet18 | Adam | 94 | 93 | 93 | 0.0124 | 0.1193 | 6.07 | 99.88 | 96.86 |
| ResNet18 | SGD | 65 | 81 | 72 | 1.4124 | 1.4792 | 5.63 | 81.25 | 80.57 |
| ResNet50 | AdamW | 91 | 91 | 91 | 0.0435 | 0.2218 | 12.94 | 98.65 | 91.43 |
| ResNet50 | Adam | 92 | 92 | 92 | 0.0524 | 0.2454 | 13.03 | 98.65 | 92.86 |
| ResNet50 | SGD | 63 | 79 | 70 | 0.5341 | 0.5443 | 12.64 | 81.74 | 79.43 |
| VGG16 | AdamW | 91 | 92 | 91 | 0.0588 | 0.1521 | 18.41 | 97.43 | 93.71 |
| VGG16 | Adam | 93 | 93 | 93 | 0.0547 | 0.1415 | 18.9 | 99.02 | 94 |
| VGG16 | SGD | 72 | 81 | 75 | 0.5593 | 0.5347 | 17.73 | 78.55 | 81.43 |
| VGG19 | AdamW | 92 | 91 | 91 | 0.0540 | 0.1755 | 20.92 | 98.28 | 94.29 |
| VGG19 | Adam | 91 | 91 | 91 | 0.0540 | 0.1755 | 20.92 | 98.9 | 94.29 |
| VGG19 | SGD | 72 | 79 | 73 | 0.5951 | 0.5601 | 20.55 | 78.8 | 78.29 |
| EfficientNet_V2_l | AdamW | 91 | 91 | 91 | 0.0723 | 0.1646 | 17.36 | 96.56 | 92.65 |
| EfficientNet_V2_l | Adam | 89 | 89 | 89 | 0.1131 | 0.2954 | 14.61 | 95.83 | 91.71 |
| EfficientNet_V2_l | SGD | 68 | 76 | 72 | 6.0248 | 6.0263 | 14.57 | 50.86 | 51.14 |
| MobileNet_V2 | AdamW | 85 | 86 | 85 | 0.2885 | 0.3368 | 6.43 | 90.2 | 85.14 |
| MobileNet_V2 | Adam | 80 | 83 | 80 | 0.2819 | 0.3643 | 6.54 | 88.97 | 83.71 |
| MobileNet_V2 | SGD | 74 | 84 | 77 | 3.6244 | 2.5056 | 11.82 | 82.27 | 84.86 |
| DenseNet121 | AdamW | 97 | 97 | 97 | 0.0182 | 0.0892 | 13.87 | 99.88 | 97.71 |
| DenseNet121 | Adam | 93 | 93 | 93 | 0.0311 | 0.1541 | 13.93 | 99.88 | 96.29 |
| DenseNet121 | SGD | 72 | 83 | 73 | 2.7244 | 2.6056 | 13.82 | 80.27 | 82.86 |
| AlexNet | AdamW | 86 | 87 | 86 | 0.1983 | 0.2656 | 4.57 | 92.89 | 89.43 |
| AlexNet | Adam | 89 | 89 | 88 | 0.1451 | 0.2399 | 4.05 | 96.08 | 91.14 |
| AlexNet | SGD | 71 | 84 | 77 | 6.8541 | 6.8497 | 4.29 | 79.78 | 84 |
| ViT-GRU | AdamW | 97 | 97 | 97 | 0.0056 | 0.0374 | 47.93 | 99.93 | 98.97 |
| ViT-GRU | Adam | 96 | 95 | 95 | 0.0089 | 0.1025 | 49.28 | 99.75 | 96.56 |
| ViT-GRU | SGD | 66 | 81 | 73 | 0.4695 | 0.4606 | 48.3 | 81.02 | 81.66 |
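As a minimal sketch of how the Precision, Recall, and F1-score columns above can be derived (not code from the paper, and the table does not state the averaging scheme, so per-class counts are an assumption for illustration):

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Compute precision, recall, and F1 from true-positive,
    false-positive, and false-negative counts for one class."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1 is the harmonic mean of precision and recall
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical counts: 90 true positives, 10 false positives, 10 false negatives
p, r, f = precision_recall_f1(90, 10, 10)
print(round(p, 2), round(r, 2), round(f, 2))  # prints: 0.9 0.9 0.9
```

Reported percentages in the table correspond to these ratios multiplied by 100, presumably averaged over classes.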