2021 Jun 2;2021:5598001. doi: 10.1155/2021/5598001

Table 4.

Recognition effects of different CNN models on TUSP images.

Model        Metric  TPTI    LPTI    UTPLT   DTPLT   UTPRT   DTPRT   LPLT    LPRT    Macro avg.  Accuracy
ResNet-18    P       0.9815  0.9980  0.9410  0.9328  0.9512  0.9046  0.7852  0.8170  0.9139      0.9107
             R       0.9953  0.9970  0.9399  0.9241  0.9146  0.9408  0.7680  0.8272  0.9134
             F1      0.9883  0.9975  0.9401  0.9277  0.9322  0.9218  0.7753  0.8212  0.9130
ResNet-50    P       0.9845  0.9832  0.9026  0.8956  0.9179  0.8792  0.6884  0.7812  0.8791      0.8744
             R       0.9843  0.9890  0.9004  0.8842  0.8909  0.9100  0.7329  0.7348  0.8783
             F1      0.9843  0.9861  0.9006  0.8890  0.9030  0.8929  0.7078  0.7558  0.8775
ResNet-101   P       0.9922  0.9902  0.9117  0.9281  0.9040  0.8964  0.7492  0.7242  0.8870      0.8795
             R       0.9906  0.9910  0.9296  0.9041  0.9061  0.8876  0.6127  0.8261  0.8810
             F1      0.9913  0.9906  0.9203  0.9156  0.9048  0.8909  0.6710  0.7703  0.8818
VGG16        P       0.9937  0.9851  0.9328  0.8929  0.9039  0.8798  0.7190  0.7461  0.8817      0.8762
             R       0.9874  0.9900  0.8971  0.9221  0.8911  0.8813  0.6768  0.7815  0.8784
             F1      0.9905  0.9875  0.9132  0.9057  0.8956  0.8784  0.6963  0.7626  0.8787
ResNet-152   P       0.9938  0.9813  0.9247  0.9028  0.8826  0.8538  0.7800  0.7024  0.8777      0.8634
             R       0.9858  0.9910  0.9057  0.9258  0.8768  0.8569  0.5289  0.8370  0.8635
             F1      0.9897  0.9861  0.9143  0.9128  0.8773  0.8507  0.6042  0.7556  0.8613
InceptionV3  P       0.9907  0.9911  0.9359  0.9341  0.9220  0.8960  0.7430  0.7926  0.9007      0.8962
             R       0.9858  0.9920  0.9398  0.9313  0.9098  0.9140  0.7407  0.7870  0.9000
             F1      0.9881  0.9915  0.9374  0.9320  0.9150  0.9038  0.7398  0.7882  0.8995
MobileNet    P       0.9892  0.9921  0.9341  0.9294  0.9162  0.9118  0.7490  0.8028  0.9031      0.8986
             R       0.9937  0.9960  0.9347  0.9223  0.9199  0.8936  0.7613  0.7880  0.9012
             F1      0.9914  0.9940  0.9340  0.9254  0.9174  0.9018  0.7528  0.7932  0.9012
Xception     P       0.9844  0.9913  0.9298  0.9605  0.9504  0.9054  0.7634  0.7812  0.9083      0.9013
             R       0.9890  0.9900  0.9639  0.9061  0.9148  0.9406  0.7015  0.8315  0.9047
             F1      0.9867  0.9906  0.9461  0.9319  0.9318  0.9222  0.7258  0.8030  0.9047

The values in the table are averages over five-fold cross-validation.
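The metrics in the table relate in the standard way: F1 is the harmonic mean of precision (P) and recall (R) for each class, and the macro average is the unweighted mean over the eight tongue-shape classes. A minimal sketch of that computation, using the ResNet-18 row as sample input (note that per-class F1 recomputed from the fold-averaged P and R can differ in the last digit from the tabulated F1, which is averaged fold by fold):

```python
# Per-class precision (P) and recall (R) for ResNet-18, taken from the table
# (class order: TPTI, LPTI, UTPLT, DTPLT, UTPRT, DTPRT, LPLT, LPRT).
P = [0.9815, 0.9980, 0.9410, 0.9328, 0.9512, 0.9046, 0.7852, 0.8170]
R = [0.9953, 0.9970, 0.9399, 0.9241, 0.9146, 0.9408, 0.7680, 0.8272]

# F1 is the harmonic mean of precision and recall for each class.
F1 = [2 * p * r / (p + r) for p, r in zip(P, R)]

# Macro averages are unweighted means over the eight classes.
macro_p = sum(P) / len(P)   # 0.9139, matching the table
macro_r = sum(R) / len(R)   # 0.9134, matching the table
```

With scikit-learn these values would come from `precision_recall_fscore_support(y_true, y_pred, average="macro")` on the held-out fold predictions.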