Front Oncol. 2023 Nov 3;13:1274557. doi: 10.3389/fonc.2023.1274557

Table 4.

Consistency rates between Auto BI-RADS model and two radiologist groups for classification of BI-RADS in test set.

US Characteristic   Rate between Auto BI-RADS and        Rate between Auto BI-RADS and
                    Experienced Radiologists (%)         Junior Radiologists (%)
BI-RADS 2           100 (17/17)                          82 (14/17)
BI-RADS 3           97 (36/37)                           51 (21/37)
BI-RADS 4a          84 (68/81)                           77 (63/81)
BI-RADS 4b          80 (42/52)                           53 (28/52)
BI-RADS 4c          83 (50/60)                           70 (42/60)
BI-RADS 5           88 (40/45)                           75 (34/45)
Kappa Value         0.82                                 0.60

Numbers in parentheses are numbers of lesions (n = 292).

Cohen’s Kappa coefficient was used to assess agreement between two references. A Kappa value of 0.40–0.59 indicated weak agreement, 0.60–0.79 indicated moderate agreement, 0.80–0.90 indicated strong agreement, and values above 0.90 indicated perfect agreement between two references.
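The overall Kappa values in the table can be reproduced from the paired per-lesion BI-RADS assignments of the two references. A minimal pure-Python sketch of Cohen's Kappa (the function name and toy labels below are illustrative, not from the study's code):

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's Kappa: chance-corrected agreement between two raters.

    ratings_a, ratings_b: equal-length sequences of category labels
    (e.g. BI-RADS classes) assigned to the same items by each rater.
    """
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, assuming the raters label independently
    # according to their own marginal category frequencies.
    counts_a = Counter(ratings_a)
    counts_b = Counter(ratings_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Toy example: perfect agreement gives kappa = 1; agreement no better
# than chance gives kappa = 0.
print(cohen_kappa(["4a", "4a", "5", "5"], ["4a", "4a", "5", "5"]))  # 1.0
print(cohen_kappa(["4a", "4a", "5", "5"], ["4a", "5", "4a", "5"]))  # 0.0
```

With the study's 292 lesions, calling this on the Auto BI-RADS labels paired with each radiologist group's labels would yield the single overall Kappa reported per group (0.82 and 0.60), which is why one value spans all BI-RADS rows in the table.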