2023 Apr 30;12(9):3234. doi: 10.3390/jcm12093234

Table 1.

Image quality scores for comparison between SSFSE-DL and SSFSE and between SSFSE-DL and FSE.

Rated Feature             Observer  SSFSE-DL             SSFSE                FSE                  SSFSE-DL vs. SSFSE  SSFSE-DL vs. FSE
Blurring artifacts        1         2.600 ± 0.598        2.200 ± 0.834        1.950 ± 0.826        p = 0.0117          p = 0.0015
                          2         2.550 ± 0.510        2.050 ± 0.759        1.900 ± 0.788        p = 0.0051          p = 0.0022
Inter-observer agreement            0.720 (0.463–0.976)  0.591 (0.338–0.843)  0.474 (0.165–0.783)
Subjective noise          1         2.750 ± 0.550        1.500 ± 0.513        2.450 ± 0.605        p = 0.0001          p = 0.0277
                          2         2.650 ± 0.587        1.700 ± 0.571        2.250 ± 0.550        p = 0.0001          p = 0.0051
Inter-observer agreement            0.789 (0.498–1.000)  0.636 (0.352–0.921)  0.664 (0.366–0.961)
Clarity of the follicles  1         4.150 ± 0.745        3.700 ± 0.657        3.150 ± 0.875        p = 0.0164          p = 0.0003
                          2         4.400 ± 0.681        3.800 ± 0.696        3.200 ± 1.005        p = 0.0051          p = 0.0007
Inter-observer agreement            0.671 (0.423–0.919)  0.565 (0.245–0.885)  0.536 (0.260–0.812)

Note. Data are presented as mean ± SD. Subjective scores were compared using the Wilcoxon signed-rank test. Inter-observer agreement is given as Cohen's weighted kappa (95% confidence interval).
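As a rough illustration of the agreement statistic reported above, the sketch below computes Cohen's weighted kappa for two raters scoring the same cases on an ordinal scale. This is not the study's analysis code: the article does not state which weighting scheme was used, so the quadratic weights here are an assumption, and the example scores are invented.

```python
import numpy as np

def weighted_kappa(r1, r2, categories, weights="quadratic"):
    """Cohen's weighted kappa for two raters on a shared ordinal scale.

    Illustrative sketch only; "quadratic" weighting is an assumption,
    not confirmed by the article.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Observed joint distribution of the two raters' scores
    O = np.zeros((k, k))
    for a, b in zip(r1, r2):
        O[idx[a], idx[b]] += 1.0
    O /= O.sum()
    # Expected joint distribution under independence (from the marginals)
    E = np.outer(O.sum(axis=1), O.sum(axis=0))
    # Disagreement weights: 0 on the diagonal, growing with score distance
    i, j = np.indices((k, k))
    if weights == "quadratic":
        W = ((i - j) / (k - 1)) ** 2
    else:  # linear
        W = np.abs(i - j) / (k - 1)
    return 1.0 - (W * O).sum() / (W * E).sum()

# Hypothetical 1-5 scores from two observers (NOT the study's raw data)
obs1 = [3, 2, 3, 2, 1, 3, 2, 2, 3, 1]
obs2 = [3, 2, 2, 2, 1, 3, 3, 2, 3, 1]
kappa = weighted_kappa(obs1, obs2, categories=[1, 2, 3, 4, 5])
```

The paired score comparisons in the table would correspond to a paired nonparametric test such as `scipy.stats.wilcoxon(scores_a, scores_b)` on the two sequences of per-case ratings.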