PLoS ONE. 2014 May 6;9(5):e96801. doi: 10.1371/journal.pone.0096801

Table 4. Inter-observer variability of scores between the two pathologists' opinions.

                                      Observer 2
Observer 1      High Positive   Positive   Low Positive   Negative   Total
High Positive               0          0              0          0       0
Positive                    0          0             52          0      52
Low Positive                0         48              0        153     201
Negative                    0          0            127          0     127
Total                       0         48            179        153     380

The table summarizes the inter-observer variability between the two pathologists whose opinions were taken into consideration during this study (383 cases, as shown in Table 2). The sample size was rounded off to 380 for the statistical comparison between the two groups. The kappa statistic was computed, and the value of kappa = −0.669 (95% CI: −0.702 to −0.637) indicates that the strength of agreement is worse than what one would expect by chance alone.
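The reported value can be checked directly from the table's counts. The sketch below is a minimal implementation of the unweighted Cohen's kappa formula applied to the Table 4 cross-tabulation; it is an illustrative recomputation, not the authors' original statistical software, and does not reproduce the confidence interval.

```python
import numpy as np

# Table 4 counts. Rows: Observer 1, columns: Observer 2.
# Category order: High Positive, Positive, Low Positive, Negative.
table = np.array([
    [0,  0,   0,   0],
    [0,  0,  52,   0],
    [0, 48,   0, 153],
    [0,  0, 127,   0],
])

n = table.sum()                        # 380 cases
p_observed = np.trace(table) / n       # diagonal = agreements; here 0.0
row = table.sum(axis=1)                # Observer 1 marginals
col = table.sum(axis=0)                # Observer 2 marginals
p_expected = (row * col).sum() / n**2  # agreement expected by chance, ~0.401

kappa = (p_observed - p_expected) / (1 - p_expected)
print(round(kappa, 3))                 # -> -0.669
```

Because the two observers never agreed on a single case (the diagonal is all zeros), the observed agreement is 0, well below the ~40% expected by chance, which is what drives kappa strongly negative.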