. 2016 Nov 20;2016:1727039. doi: 10.1155/2016/1727039

Table 3.

Gonioscopic intraobserver (intervisit) agreement for angle classification by examiner (kappa [95% confidence interval]).

Examiner   Kappa [95% CI]      N     Agree,           Agree,     Agree,    Disagree
                                     overall n (%)    narrow n   open n    n (%)
V          0.67 [0.10, 1.00]   6     5 (83%)          3          2         1 (17%)
W          0.72 [0.43, 1.00]   27    24 (89%)         6          18        3 (11%)
X          0.86 [0.68, 1.00]   32    30 (94%)         10         20        2 (6%)
Y          0.68 [0.43, 0.94]   36    31 (86%)         9          22        5 (14%)
Z          0.53 [0.07, 1.00]   41    38 (93%)         2          36        3 (7%)
Combined   0.74 [0.62, 0.87]   142   128 (90%)        30         98        14 (10%)

CI: confidence interval.

The kappa interpretation criteria were: ≤0.20, poor; 0.21 to 0.40, fair; 0.41 to 0.60, moderate; 0.61 to 0.80, good; and >0.80, excellent.
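For reference, kappa values like those tabulated above are computed with Cohen's formula, κ = (p_o − p_e)/(1 − p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the marginal frequencies. The sketch below, in plain Python, shows the computation and the interpretation bands from the note above; the per-examiner visit-1 × visit-2 confusion matrices are not reproduced in this table, so the example matrix is hypothetical, not taken from the study data.

```python
# Cohen's kappa for a square (here 2x2: narrow/open) intervisit confusion matrix.
def cohens_kappa(matrix):
    total = sum(sum(row) for row in matrix)
    # Observed agreement: proportion on the diagonal.
    p_o = sum(matrix[i][i] for i in range(len(matrix))) / total
    # Chance agreement: product of row and column marginal proportions, summed.
    p_e = sum(
        (sum(matrix[i]) / total) * (sum(row[i] for row in matrix) / total)
        for i in range(len(matrix))
    )
    return (p_o - p_e) / (1 - p_e)

def interpret(kappa):
    # Bands as stated in the table note:
    # <=0.20 poor; 0.21-0.40 fair; 0.41-0.60 moderate; 0.61-0.80 good; >0.80 excellent.
    if kappa <= 0.20:
        return "poor"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "good"
    return "excellent"

# Hypothetical counts (rows: visit-1 narrow/open; columns: visit-2 narrow/open).
example = [[10, 2],
           [3, 20]]
k = cohens_kappa(example)  # ~0.69, which interpret() classifies as "good"
```

Note that kappa depends on the marginal distribution, not just percent agreement: examiner Z agrees on 93% of eyes yet has the lowest kappa (0.53), because nearly all of Z's classifications fall in one category (36 of 41 open), so chance agreement is high.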