Expert Systems with Applications. 2021 Sep 5;186:115805. doi: 10.1016/j.eswa.2021.115805

Table 4.

Top-1 hyperparameter combinations for VGG16 across the 15 optimization iterations.

| # | Optimizer | Batch size | Dropout ratio | TL learn ratio | Loss | Accuracy | F1 | Precision | Recall | AUC | WSM |
|---|-----------|------------|---------------|----------------|------|----------|----|-----------|--------|-----|-----|
| 1 | SGD | 32 | 0.57 | 80% | 0.0716 | 97.21% | 97.21% | 97.21% | 97.21% | 0.9966 | 95.48% |
| 2 | SGD | 32 | 0.56 | 80% | 0.0952 | 96.63% | 96.63% | 96.63% | 96.63% | 0.9946 | 94.74% |
| 3 | AdaGrad | 32 | 0.21 | 30% | 0.2933 | 91.25% | 91.25% | 91.25% | 91.25% | 0.9674 | 89.14% |
| 4 | SGD | 32 | 0.58 | 80% | 0.0584 | 98.08% | 98.08% | 98.08% | 98.08% | 0.9967 | 96.48% |
| 5 | AdaGrad | 32 | 0.19 | 25% | 0.1302 | 95.57% | 95.58% | 95.57% | 95.57% | 0.9897 | 93.57% |
| 6 | SGD | 32 | 0.58 | 80% | 0.0394 | 98.61% | 98.61% | 98.61% | 98.61% | 0.9980 | 97.41% |
| 7 | SGD | 32 | 0.58 | 80% | 0.0263 | 99.13% | 99.13% | 99.13% | 99.13% | 0.9991 | 98.55% |
| 8 | SGD | 32 | 0.58 | 80% | 0.0252 | 99.23% | 99.23% | 99.23% | 99.23% | 0.9988 | 98.73% |
| 9 | SGD | 32 | 0.58 | 80% | 0.0221 | 99.23% | 99.23% | 99.23% | 99.23% | 0.9993 | 99.02% |
| 10 | SGD | 32 | 0.58 | 80% | 0.0314 | 98.99% | 98.99% | 98.99% | 98.99% | 0.9995 | 98.11% |
| 11 | SGD | 32 | 0.58 | 80% | 0.0324 | 98.94% | 98.94% | 98.94% | 98.94% | 0.9986 | 98.01% |
| 12 | SGD | 32 | 0.58 | 80% | 0.0226 | 99.28% | 99.28% | 99.28% | 99.28% | 0.9993 | 99.01% |
| 13 | SGD | 32 | 0.58 | 80% | 0.0247 | 99.09% | 99.09% | 99.09% | 99.09% | 0.9988 | 98.63% |
| 14 | SGD | 32 | 0.58 | 80% | 0.0251 | 99.09% | 99.09% | 99.09% | 99.09% | 0.9988 | 98.60% |
| 15 | SGD | 32 | 0.58 | 80% | 0.0292 | 98.85% | 98.85% | 98.85% | 98.85% | 0.9987 | 98.09% |
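Because the table ranks iterations by a composite weighted-sum metric (WSM) rather than raw accuracy, the best configuration is not the one with the highest accuracy (iteration 12) but the one with the highest WSM (iteration 9). A minimal sketch of this selection step, using only the loss, accuracy, and WSM columns from the table above (the `results` list and `best_by_wsm` helper are illustrative, not code from the paper, and the WSM weighting itself is not reproduced here):

```python
# Per-iteration results from Table 4: (iteration, loss, accuracy %, WSM %).
results = [
    (1, 0.0716, 97.21, 95.48),
    (2, 0.0952, 96.63, 94.74),
    (3, 0.2933, 91.25, 89.14),
    (4, 0.0584, 98.08, 96.48),
    (5, 0.1302, 95.57, 93.57),
    (6, 0.0394, 98.61, 97.41),
    (7, 0.0263, 99.13, 98.55),
    (8, 0.0252, 99.23, 98.73),
    (9, 0.0221, 99.23, 99.02),
    (10, 0.0314, 98.99, 98.11),
    (11, 0.0324, 98.94, 98.01),
    (12, 0.0226, 99.28, 99.01),
    (13, 0.0247, 99.09, 98.63),
    (14, 0.0251, 99.09, 98.60),
    (15, 0.0292, 98.85, 98.09),
]

def best_by_wsm(rows):
    """Return the row with the highest WSM score (last field)."""
    return max(rows, key=lambda r: r[3])

best = best_by_wsm(results)
print(best)  # iteration 9: lowest loss (0.0221) and top WSM (99.02%)
```

Note that iteration 12 edges out iteration 9 on accuracy (99.28% vs. 99.23%) yet ranks second on WSM (99.01% vs. 99.02%), which is why a single composite score is used to pick the winner.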