BMC Med Imaging. 2015 Dec 29;15:61. doi: 10.1186/s12880-015-0103-y

Table 2.

Agreement between radiologists in scoring severe acute respiratory infection CXRs from their reading of the digital CXR images, and agreement in scoring severe acute respiratory infection CXRs: clinicians' reading of CXR reports versus radiologists' reading of CXRs

Health professional                              Weighted Kappa (95 % CI)   Strength of agreement^a

Radiologist agreement
  Pediatric radiologists                         0.83 (0.65 to 1.00)        ‘Very good’
  Adult radiologists                             0.75 (0.57 to 0.93)        ‘Good’

Radiologist-clinician agreement
  Radiologist vs. pediatrician                   0.65 (0.52 to 0.78)        ‘Good’
  Radiologist vs. internal medicine physician    0.68 (0.55 to 0.80)        ‘Good’
  Radiologist vs. internal medicine resident     0.66 (0.53 to 0.78)        ‘Good’
  Radiologist vs. pediatric resident             0.69 (0.56 to 0.82)        ‘Good’
  Radiologist vs. medical student 1              0.56 (0.44 to 0.69)        ‘Moderate’
  Radiologist vs. medical student 2              0.53 (0.40 to 0.66)        ‘Moderate’
  Radiologist vs. research nurse                 0.49 (0.36 to 0.62)        ‘Moderate’

^a Agreement: weighted Kappa <0.2 = ‘poor’, >0.2 to 0.4 = ‘fair’, >0.4 to 0.6 = ‘moderate’, >0.6 to 0.8 = ‘good’, >0.8 to 1.0 = ‘very good’ agreement

CI = confidence interval
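
The table reports pairwise weighted kappa statistics on ordinal severity scores, with the footnote's thresholds used to label the strength of agreement. As a minimal illustrative sketch, not the authors' analysis code, the snippet below computes a linearly weighted kappa with scikit-learn's cohen_kappa_score and applies the footnote's categories; the reader scores, the helper name strength_of_agreement, and the choice of linear weights are assumptions, and the confidence intervals shown in the table are not reproduced here.

```python
from sklearn.metrics import cohen_kappa_score

def strength_of_agreement(kappa: float) -> str:
    """Map a weighted kappa value to the categories in the table footnote
    (boundary values are assigned to the lower category here)."""
    if kappa <= 0.2:
        return "poor"
    elif kappa <= 0.4:
        return "fair"
    elif kappa <= 0.6:
        return "moderate"
    elif kappa <= 0.8:
        return "good"
    return "very good"

# Hypothetical ordinal CXR severity scores from two readers (not study data).
reader_a = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]
reader_b = [0, 1, 2, 2, 0, 1, 1, 1, 0, 2]

# Linearly weighted kappa; the study's exact weighting scheme may differ.
kappa = cohen_kappa_score(reader_a, reader_b, weights="linear")
print(f"weighted kappa = {kappa:.2f} ({strength_of_agreement(kappa)})")
```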