
Table 3. Inter-rater Reliability of Main Outcomes.

All Cases:

| Outcome | Reviewer 2 | Reviewer 3 | Reviewer 4 | Reviewer 5 |
| --- | --- | --- | --- | --- |
| Inter-rater agreement by "diagnosis" for the primary differential diagnosis, % (N=136) | 46.6 | 56.8 | 48.6 | |
| Inter-rater agreement by "diagnostic category" for the primary differential diagnosis, % (N=136) | 28.7 | 43.2 | 50.0 | |
| Inter-rater agreement by "management", % (N=234) | 31.6 | 50.8 | 40.2 | |
| Kappa, "diagnosis" for the primary differential diagnosis (95% CI) | 0.41 (0.31 to 0.52) | 0.51 (0.41 to 0.61) | 0.43 (0.34 to 0.53) | |
| Kappa, "diagnostic category" for the primary differential diagnosis (95% CI) | 0.22 (0.14 to 0.31) | 0.37 (0.27 to 0.46) | 0.43 (0.34 to 0.53) | |
| Kappa, "management" (95% CI) | 0.08 (0.02 to 0.15) | 0.12 (0.01 to 0.23) | 0.04 (-0.06 to 0.15) | |
Oral Cases Only:

| Outcome | Reviewer 2 | Reviewer 3 | Reviewer 4 | Reviewer 5 |
| --- | --- | --- | --- | --- |
| Inter-rater agreement by "diagnosis" for the primary differential diagnosis, % (N=29) | 66.67 | 68.0 | 66.67 | 61.54 |
| Inter-rater agreement by "diagnostic category" for the primary differential diagnosis, % (N=29) | 34.62 | 64.0 | 66.67 | 64.00 |
| Inter-rater agreement by "management", % (N=30) | 32.41 | 44.4 | 46.15 | 34.62 |
| Kappa, "diagnosis" for the primary differential diagnosis (95% CI) | 0.58 (0.39 to 0.79) | 0.58 (0.36 to 0.80) | 0.57 (0.36 to 0.78) | 0.51 (0.33 to 0.73) |
| Kappa, "diagnostic category" for the primary differential diagnosis (95% CI) | 0.17 (0.041 to 0.36) | 0.51 (0.32 to 0.74) | 0.55 (0.37 to 0.76) | 0.52 (0.32 to 0.73) |
| Kappa, "management" (95% CI) | -0.010 (-0.21 to 0.13) | 0.09 (-0.19 to 0.41) | -0.03 (-0.28 to 0.22) | -0.14 (-0.38 to 0.09) |
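The agreement percentages and kappa coefficients above are standard pairwise statistics. As a minimal sketch of how such values could be computed for one reviewer pair (not the authors' actual analysis code), the Python snippet below calculates simple percent agreement and unweighted Cohen's kappa with a percentile-bootstrap 95% CI. The simulated ratings, the choice of unweighted kappa, and the bootstrap CI method are illustrative assumptions, since the table does not state which weighting scheme or CI procedure was used.

```python
# Minimal sketch (not the study's analysis code) of the two statistics reported
# in Table 3 for a single reviewer pair: simple percent agreement and unweighted
# Cohen's kappa with a percentile-bootstrap 95% CI. All data below are simulated
# placeholders, not study data.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def percent_agreement(a, b):
    """Percentage of cases on which the two reviewers chose the same category."""
    a, b = np.asarray(a), np.asarray(b)
    return 100.0 * np.mean(a == b)

def kappa_with_bootstrap_ci(a, b, n_boot=2000, seed=0):
    """Unweighted Cohen's kappa with a percentile bootstrap 95% CI (one common
    choice; the paper does not specify its CI method)."""
    a, b = np.asarray(a), np.asarray(b)
    point = cohen_kappa_score(a, b)
    rng = np.random.default_rng(seed)
    # Resample cases with replacement and recompute kappa for each resample.
    idx = rng.integers(0, len(a), size=(n_boot, len(a)))
    boots = [cohen_kappa_score(a[i], b[i]) for i in idx]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return point, lo, hi

# Hypothetical ratings for 136 cases across 4 diagnostic categories, generated
# so that the two reviewers only partly agree.
rng = np.random.default_rng(42)
reviewer_a = rng.integers(0, 4, size=136)
agree_mask = rng.random(136) < 0.5
reviewer_b = np.where(agree_mask, reviewer_a, rng.integers(0, 4, size=136))

print(f"Agreement: {percent_agreement(reviewer_a, reviewer_b):.1f}%")
k, lo, hi = kappa_with_bootstrap_ci(reviewer_a, reviewer_b)
print(f"Kappa: {k:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Running the sketch prints an agreement percentage and a kappa with its CI in the same format as the cells above; each table cell would correspond to one such pairwise comparison for the relevant outcome and case subset.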