Expert Systems with Applications. 2021 Sep 5;186:115805. doi: 10.1016/j.eswa.2021.115805

Table 9. Top-1 combinations for DenseNet169 over the 15 optimization iterations.

| # | Optimizer | Batch size | Dropout ratio | TL learn ratio | Loss | Accuracy | F1 | Precision | Recall | AUC | WSM |
|---|-----------|------------|---------------|----------------|--------|---------|--------|-----------|--------|--------|--------|
| 1 | AdaMax | 32 | 0.37 | 50% | 1.1045 | 78.11% | 78.10% | 78.11% | 78.11% | 0.8572 | 76.20% |
| 2 | RMSProp | 64 | 0.5 | 70% | 15.632 | 82.06% | 82.06% | 82.06% | 82.06% | 0.8228 | 80.01% |
| 3 | AdaMax | 32 | 0.36 | 50% | 1.7170 | 85.67% | 85.66% | 85.67% | 85.67% | 0.8845 | 83.55% |
| 4 | AdaMax | 32 | 0.36 | 50% | 1.9082 | 88.02% | 88.02% | 88.02% | 88.02% | 0.9028 | 85.85% |
| 5 | AdaMax | 32 | 0.36 | 50% | 1.6761 | 88.89% | 88.89% | 88.89% | 88.89% | 0.9100 | 86.70% |
| 6 | AdaMax | 32 | 0.36 | 50% | 1.7754 | 88.02% | 88.02% | 88.02% | 88.02% | 0.9028 | 85.85% |
| 7 | AdaMax | 32 | 0.36 | 50% | 1.1883 | 89.71% | 89.70% | 89.71% | 89.71% | 0.9261 | 87.50% |
| 8 | AdaMax | 32 | 0.36 | 50% | 1.2521 | 88.07% | 88.06% | 88.07% | 88.07% | 0.9117 | 85.91% |
| 9 | AdaMax | 32 | 0.36 | 50% | 1.7210 | 88.84% | 88.84% | 88.84% | 88.84% | 0.9103 | 86.65% |
| 10 | AdaMax | 32 | 0.36 | 50% | 1.1751 | 88.12% | 88.11% | 88.12% | 88.12% | 0.9153 | 85.95% |
| 11 | AdaMax | 32 | 0.36 | 50% | 1.0352 | 89.56% | 89.56% | 89.56% | 89.56% | 0.9290 | 87.37% |
| 12 | AdaMax | 32 | 0.36 | 50% | 1.6333 | 87.73% | 87.73% | 87.73% | 87.73% | 0.9005 | 85.57% |
| 13 | AdaMax | 32 | 0.36 | 50% | 1.3847 | 87.01% | 87.01% | 87.01% | 87.01% | 0.9043 | 84.87% |
| 14 | AdaMax | 32 | 0.36 | 50% | 1.1712 | 90.76% | 90.76% | 90.76% | 90.76% | 0.9294 | 88.54% |
| 15 | AdaMax | 32 | 0.36 | 50% | 1.0113 | 89.66% | 89.66% | 89.66% | 89.66% | 0.9244 | 87.47% |
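
To make the parameter columns concrete, the sketch below assembles a DenseNet169 transfer-learning classifier using the configuration that dominates the Top-1 list (AdaMax, batch size 32, dropout 0.36, TL learn ratio 50%). This is a minimal Keras sketch, not the authors' implementation; the number of classes, the input shape, the loss function, and the reading of "TL learn ratio" as the fraction of unfrozen base layers are assumptions.

```python
# Minimal sketch (assumptions noted above), showing how the Table 9
# configuration maps onto a DenseNet169 transfer-learning model in Keras.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4            # placeholder: set to the task's class count
IMG_SHAPE = (224, 224, 3)  # placeholder input size

base = tf.keras.applications.DenseNet169(
    include_top=False, weights="imagenet", input_shape=IMG_SHAPE)

# "TL learn ratio 50%": unfreeze the top half of the base network and
# keep the bottom half frozen (one plausible reading of that column).
cutoff = len(base.layers) // 2
for layer in base.layers[:cutoff]:
    layer.trainable = False
for layer in base.layers[cutoff:]:
    layer.trainable = True

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.36),                       # dropout ratio from Table 9
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adamax(),     # AdaMax, as in Table 9
    loss="categorical_crossentropy",            # assumed multi-class loss
    metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])

# Training would use datasets batched at 32, e.g.:
# model.fit(train_ds, validation_data=val_ds, epochs=...)
```

The ranking column (WSM) aggregates the reported metrics into a single score used to compare combinations; its exact weighting is not given in this excerpt, so it is not reproduced here.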