Diagnostics (Basel). 2022 Jan 6;12(1):135. doi: 10.3390/diagnostics12010135

Table 1.

Averaged performance results over 5-fold cross-validation for the proposed multistage transfer learning method and its comparison against conventional transfer learning. TL: transfer learning; CNN: convolutional neural network; AUC: area under the ROC curve; Avg.: average; CTL: conventional transfer learning; MSTL: multistage transfer learning; SGD: stochastic gradient descent.

| TL Type | CNN | Optimizer | AUC | F1 Measure | Specificity | Sensitivity | Loss | Test Accuracy (%) | Avg. Test Acc. (%) |
|---|---|---|---|---|---|---|---|---|---|
| CTL method | InceptionV3 | SGD | 0.903 | 0.833 | 0.87 | 0.80 | 0.412 | 83.50 ± 5.491 | 83 |
| | | Adam | 0.778 | 0.605 | 0.66 | 0.75 | 9.570 | 70.50 ± 6.085 | |
| | | Adagrad | 0.976 | 0.967 | 1 | 0.93 | 0.195 | 96.49 ± 2.091 | |
| | EfficientNetB2 | SGD | 0.717 | 0.664 | 0.71 | 0.61 | 0.644 | 66.00 ± 1.895 | 83 |
| | | Adam | 0.993 | 0.948 | 0.98 | 0.90 | 0.194 | 93.99 ± 4.726 | |
| | | Adagrad | 0.980 | 0.904 | 0.98 | 0.81 | 0.300 | 89.50 ± 2.709 | |
| | ResNet50 | SGD | 0.960 | 0.902 | 0.90 | 0.91 | 0.296 | 90.50 ± 2.850 | 89 |
| | | Adam | 0.817 | 0.698 | 0.66 | 0.97 | 0.117 | 81.50 ± 10.216 | |
| | | Adagrad | 0.989 | 0.974 | 0.97 | 0.98 | 0.084 | 97.50 ± 2.165 | |
| The proposed MSTL method | InceptionV3 | SGD | 0.935 | 0.873 | 0.83 | 0.94 | 0.458 | 88.50 ± 3.758 | 92 |
| | | Adam | 0.967 | 0.930 | 0.94 | 0.92 | 0.292 | 93.00 ± 2.291 | |
| | | Adagrad | 0.981 | 0.945 | 0.95 | 0.94 | 0.208 | 94.50 ± 0.935 | |
| | EfficientNetB2 | SGD | 0.820 | 0.762 | 0.77 | 0.76 | 0.606 | 76.50 ± 3.409 | 90 |
| | | Adam | 0.998 | 0.980 | 0.98 | 0.98 | 0.067 | 97.99 ± 1.249 | |
| | | Adagrad | 0.992 | 0.965 | 0.97 | 0.96 | 0.207 | 96.50 ± 1.274 | |
| | ResNet50 | SGD | 0.995 | 0.985 | 0.99 | 0.98 | 0.065 | 98.50 ± 1.118 | 98 |
| | | Adam | 0.986 | 0.964 | 0.96 | 0.97 | 0.216 | 96.49 ± 1.000 | |
| | | Adagrad | 0.999 | 0.989 | 0.98 | 1 | 0.030 | 99.00 ± 0.612 | |
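For orientation, the sketch below shows how per-fold metrics of the kind reported in Table 1 (AUC, F1, specificity, sensitivity, test accuracy) can be produced from a frozen ImageNet-pretrained ResNet50 baseline under 5-fold cross-validation, assuming a Keras/TensorFlow and scikit-learn setup. The helper names (`build_ctl_model`, `evaluate_fold`, `run_cv`), the image size, and the training hyperparameters are illustrative assumptions, not the authors' exact CTL or multistage pipeline.

```python
# Minimal sketch: conventional transfer learning (CTL) baseline with 5-fold CV
# and the metrics of Table 1. Data handling and hyperparameters are assumptions.
import numpy as np
import tensorflow as tf
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score, f1_score, confusion_matrix

def build_ctl_model(input_shape=(224, 224, 3)):
    """ImageNet-pretrained ResNet50 backbone with a new binary classifier head."""
    base = tf.keras.applications.ResNet50(
        include_top=False, weights="imagenet",
        input_shape=input_shape, pooling="avg")
    base.trainable = False  # freeze the backbone; train only the new head
    head = tf.keras.layers.Dense(1, activation="sigmoid")(base.output)
    return tf.keras.Model(base.input, head)

def evaluate_fold(model, x_test, y_test):
    """AUC, F1, specificity, sensitivity, and accuracy for one held-out fold."""
    p = model.predict(x_test).ravel()
    y_hat = (p >= 0.5).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_test, y_hat).ravel()
    return {
        "auc": roc_auc_score(y_test, p),
        "f1": f1_score(y_test, y_hat),
        "specificity": tn / (tn + fp),
        "sensitivity": tp / (tp + fn),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

def run_cv(x, y, optimizer_name="adagrad", epochs=20):
    """x: preprocessed images, y: binary labels. Returns metrics averaged over 5 folds."""
    folds = []
    skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for train_idx, test_idx in skf.split(x, y):
        model = build_ctl_model()
        model.compile(optimizer=optimizer_name,  # "sgd", "adam", or "adagrad"
                      loss="binary_crossentropy",
                      metrics=["accuracy"])
        model.fit(x[train_idx], y[train_idx], epochs=epochs, verbose=0)
        folds.append(evaluate_fold(model, x[test_idx], y[test_idx]))
    # Average each metric over the 5 folds, as in the table above.
    return {k: np.mean([f[k] for f in folds]) for k in folds[0]}
```

Swapping the `optimizer_name` string among "sgd", "adam", and "adagrad" reproduces the optimizer dimension of the comparison; a multistage variant would insert one or more intermediate fine-tuning stages before the final target-domain fit, which this sketch omits.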