Table 3.
Metric | VGG16 Tr./Va. | VGG16 Te. | GoogLeNet Tr./Va. | GoogLeNet Te. | Ours Tr./Va. | Ours Te. |
---|---|---|---|---|---|---|
ACC | 0.99/0.98 | 0.92 | 0.99/0.98 | 0.92 | 1.00/0.96 | 0.93 |
SENS | 1.00/0.98 | 0.92 | 0.99/0.99 | 0.92 | 0.99/0.95 | 0.93 |
SPE | 0.99/0.98 | 0.92 | 0.99/0.98 | 0.91 | 0.99/0.95 | 0.92 |
IoU | 0.99/0.97 | 0.85 | 0.98/0.97 | 0.85 | 0.99/0.91 | 0.85 |
F1 | 0.99/0.98 | 0.92 | 0.99/0.98 | 0.92 | 0.99/0.95 | 0.92 |
AUC | 0.99/0.98 | 0.92 | 0.99/0.98 | 0.92 | 1.00/0.99 | 0.93 |
Comparison of VGG16 with GoogLeNet and ResNet50.
*Tr., Va., and Te. denote results on the training, validation, and testing sets, respectively.
*ACC: accuracy; SENS: sensitivity; SPE: specificity; IoU: intersection over union; F1: F1 score; AUC: area under the ROC curve.
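The threshold-based metrics in Table 3 can all be derived from the binary confusion-matrix counts; AUC additionally requires the ranked prediction scores and is omitted here. A minimal sketch (function name and example counts are illustrative, not from the paper):

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute Table 3's threshold metrics from binary confusion-matrix counts."""
    acc = (tp + tn) / (tp + tn + fp + fn)       # ACC: accuracy
    sens = tp / (tp + fn)                       # SENS: sensitivity (recall)
    spe = tn / (tn + fp)                        # SPE: specificity
    iou = tp / (tp + fp + fn)                   # IoU: intersection over union
    f1 = 2 * tp / (2 * tp + fp + fn)            # F1: harmonic mean of precision and recall
    return {"ACC": acc, "SENS": sens, "SPE": spe, "IoU": iou, "F1": f1}

# Hypothetical counts: 90 TP, 8 FP, 92 TN, 10 FN
metrics = classification_metrics(90, 8, 92, 10)
```

The `2*tp / (2*tp + fp + fn)` form of F1 is algebraically identical to the usual precision/recall harmonic mean and avoids a division-by-zero when precision is undefined but `tp > 0`.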