BMC Cancer. 2009 Jan 1;9:1. doi: 10.1186/1471-2407-9-1

Table 2.

Calculation of the interobserver agreement (kappa coefficient)

Rows give Pathologist 1's score; columns give Pathologist 2's score.

| Pathologist 1 | Measure | 1+ | 2+ | 3+ | Total |
|---|---|---|---|---|---|
| 1+ | Count (n) | 7 | 1 | 0 | 8 |
| | % within Pathologist 1 | 87.5 | 12.5 | 0 | 100 |
| | % within Pathologist 2 | 100 | 11.1 | 0 | 44.4 |
| | % of Total | 38.9 | 5.6 | 0 | 44.4 |
| 2+ | Count (n) | 0 | 8 | 0 | 8 |
| | % within Pathologist 1 | 0 | 100 | 0 | 100 |
| | % within Pathologist 2 | 0 | 88.9 | 0 | 44.4 |
| | % of Total | 0 | 44.4 | 0 | 44.4 |
| 3+ | Count (n) | 0 | 0 | 2 | 2 |
| | % within Pathologist 1 | 0 | 0 | 100 | 100 |
| | % within Pathologist 2 | 0 | 0 | 100 | 11.1 |
| | % of Total | 0 | 0 | 11.1 | 11.1 |
| Total | Count (n) | 7 | 9 | 2 | 18 |
| | % within Pathologist 1 | 38.9 | 50 | 11.1 | 100 |
| | % within Pathologist 2 | 100 | 100 | 100 | 100 |
| | % of Total | 38.9 | 50 | 11.1 | 100 |

Symmetric measures

| | Value | Standard error (a) | Approximate T (b) | Approximate significance |
|---|---|---|---|---|
| Kappa coefficient | 0.906 | 0.092 | 4.894 | 0.00 |
| Number of valid cases (n) | 18 | | | |

a Not assuming the null hypothesis.

b Using the asymptotic standard error assuming the null hypothesis.
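As a cross-check, the reported kappa coefficient can be recomputed directly from the 3 × 3 agreement matrix in Table 2 (the original analysis appears to be standard statistical-package output; this is a minimal Python sketch of the same calculation):

```python
# Cohen's kappa from the Table 2 counts.
# Rows: Pathologist 1 scores 1+/2+/3+; columns: Pathologist 2 scores.
matrix = [
    [7, 1, 0],  # Pathologist 1 scored 1+
    [0, 8, 0],  # Pathologist 1 scored 2+
    [0, 0, 2],  # Pathologist 1 scored 3+
]
n = sum(sum(row) for row in matrix)            # 18 valid cases
# Observed agreement: fraction of cases on the diagonal.
p_o = sum(matrix[i][i] for i in range(3)) / n
# Chance agreement: from the row and column marginal totals.
row_totals = [sum(row) for row in matrix]
col_totals = [sum(matrix[i][j] for i in range(3)) for j in range(3)]
p_e = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 3))  # 0.906
```

With 17 of 18 cases on the diagonal (observed agreement 0.944) against a chance agreement of 0.407, this reproduces the tabulated kappa of 0.906.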