
Table 2.

Inter-observer Agreement

                Grader 1 vs 2    Grader 1 vs 3    Grader 2 vs 3    Pooled Kappa (Graders 1, 2, and 3)

Unweighted (only exact agreement is considered)
  Session 1     0.442            0.348            0.086            0.2896
  Session 2     0.587            0.672            0.576            0.6034

Weighted (agreement within ±1 category is considered)
  Session 1     0.638            0.554            0.342            N.A.
  Session 2     0.726            0.797            0.708            N.A.

Agreement among graders (kappa statistic) for sessions 1 and 2, with categories 0 and 0.5 combined. Note the improvement in weighted kappa between sessions 1 and 2. N.A. = not applicable (a pooled kappa is not applicable for those cells).
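For readers who want to reproduce pairwise agreement statistics of this kind from per-grader category assignments, the sketch below shows how unweighted and weighted Cohen's kappa can be computed with scikit-learn. It is not the authors' analysis code: the grades are illustrative placeholders, and the linear weighting scheme is an assumption standing in for the paper's "agreement within ±1 category" definition.

```python
# Minimal sketch: pairwise inter-observer kappa between two graders.
# Grades are hypothetical ordinal categories, NOT data from the study.
from sklearn.metrics import cohen_kappa_score

grader1 = [0, 0.5, 1, 2, 3, 1, 0, 2]   # hypothetical grades from grader 1
grader2 = [0, 1,   1, 3, 3, 1, 0.5, 2] # hypothetical grades from grader 2

# Unweighted kappa: only exact agreement between graders counts.
unweighted = cohen_kappa_score(grader1, grader2)

# Weighted kappa: near-miss disagreements receive partial credit.
# Linear weights are used here as an approximation of the paper's
# "agreement within +/- 1 category" criterion (assumed, not confirmed).
weighted = cohen_kappa_score(grader1, grader2, weights="linear")

print(f"unweighted kappa = {unweighted:.3f}")
print(f"weighted kappa   = {weighted:.3f}")
```

A pooled kappa across all three graders (as in the last column of Table 2) requires a multi-rater statistic such as Fleiss' kappa rather than repeated pairwise Cohen's kappa; the table reports it only for the unweighted analysis.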