Table 2.
| Health professional | Weighted Kappa (95% CI) | Strength of agreementᵃ |
|---|---|---|
| *Radiologist agreement* | | |
| Pediatric radiologists | 0.83 (0.65 to 1.00) | ‘Very good’ |
| Adult radiologists | 0.75 (0.57 to 0.93) | ‘Good’ |
| *Radiologist-clinician agreement* | | |
| Radiologist vs. pediatrician | 0.65 (0.52 to 0.78) | ‘Good’ |
| Radiologist vs. internal medicine physician | 0.68 (0.55 to 0.80) | ‘Good’ |
| Radiologist vs. internal medicine resident | 0.66 (0.53 to 0.78) | ‘Good’ |
| Radiologist vs. pediatric resident | 0.69 (0.56 to 0.82) | ‘Good’ |
| Radiologist vs. medical student 1 | 0.56 (0.44 to 0.69) | ‘Moderate’ |
| Radiologist vs. medical student 2 | 0.53 (0.40 to 0.66) | ‘Moderate’ |
| Radiologist vs. research nurse | 0.49 (0.36 to 0.62) | ‘Moderate’ |
ᵃ Agreement: weighted Kappa <0.2 = ‘poor’, >0.2 to 0.4 = ‘fair’, >0.4 to 0.6 = ‘moderate’, >0.6 to 0.8 = ‘good’, >0.8 to 1.0 = ‘very good’ agreement
CI = confidence interval
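
For readers who want to apply the footnote's interpretation scale programmatically, the thresholds reduce to a simple lookup. The sketch below is illustrative only (it is not part of the study's analysis code) and uses the cut-points exactly as stated in the footnote:

```python
def agreement_strength(kappa: float) -> str:
    """Map a weighted kappa point estimate to the agreement label
    used in Table 2 (thresholds taken from the table footnote)."""
    if kappa < 0.2:
        return "poor"
    elif kappa <= 0.4:
        return "fair"
    elif kappa <= 0.6:
        return "moderate"
    elif kappa <= 0.8:
        return "good"
    else:
        return "very good"

# Example: the radiologist vs. research nurse estimate of 0.49 is 'moderate'.
print(agreement_strength(0.49))  # moderate
```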