Virchows Arch. 2021 Mar 13;479(3):585–595. doi: 10.1007/s00428-021-03075-9

Table 2.

Intraobserver agreement metrics, pathologists’ experience and reported technical/case-related impairments

| | Pathologist 1 | Pathologist 2 | Pathologist 3 | Pathologist 4 |
| --- | --- | --- | --- | --- |
| Pathologist's experience | Lead pathologist | PhD student | PhD student | PhD student |
| Po | 160 of 162 slides (98.76%) | 144 of 162 slides (88.8%) | 142 of 162 slides (87.65%) | 138 of 162 slides (85.1%) |
| Fleiss's kappa index | κ = 0.98 (95% CI: 0.94–1), P-value = 0 | κ = 0.88 (95% CI: 0.84–0.92), P-value = 0 | κ = 0.86 (95% CI: 0.82–0.90), P-value = 0 | κ = 0.85 (95% CI: 0.81–0.89), P-value = 0 |
| Discordances | 2 of 162 slides; SL: none; D: 2 (1.2%); ER: 1.2% | 18 of 162 slides; SL: 5 (27.7%); D: 13 (72.2%); ER: 7.4% (n = 12) | 20 of 162 slides; SL: 8 (40%); D: 12 (60%); ER: 4.9% (n = 8) | 24 of 162 slides; SL: 4 (16.6%); D: 20 (83.3%); ER: 8% (n = 13) |
| Preferred diagnosis | CM: 2; DM: 0 | DM: 8; CM: 10 | DM: 12; CM: 8 | DM: 10; CM: 14 |
| DM reported pitfalls* | Lag screen mirroring (10); lack of details of inflammatory cells (1) | Lag screen mirroring (11) | Lag screen mirroring (7); higher magnification needed** (1) | Lag screen mirroring (2) |

Po, percentage of agreement; D, discordant cases; SL, slightly discordant cases; ER, error rate of the DM (*calculated based on the reference standard diagnosis); DM, digital method; CM, conventional method. **For dysplasia grading.
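
As a rough illustration (not the authors' code), the sketch below shows how the two per-pathologist statistics in the table could be recomputed from paired readings of the same slides: Po is simply the fraction of slides given the same diagnosis by the conventional (CM) and digital (DM) methods, and Fleiss's kappa corrects that raw agreement for chance. The `intraobserver_agreement` helper and the simulated 162-slide dataset are hypothetical; only the 162-slide total and the 2 discordant slides of pathologist 1 come from the table.

```python
# Minimal sketch, assuming per-slide diagnoses are coded as integer categories.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

def intraobserver_agreement(cm_dx, dm_dx):
    """Po and Fleiss's kappa for one pathologist's CM vs. DM readings.

    cm_dx, dm_dx: arrays of per-slide diagnosis codes from the conventional
    and digital readings of the same slides.
    """
    cm_dx, dm_dx = np.asarray(cm_dx), np.asarray(dm_dx)
    po = np.mean(cm_dx == dm_dx)               # raw percentage of agreement
    ratings = np.column_stack([cm_dx, dm_dx])  # shape: (n_slides, 2 readings)
    table, _ = aggregate_raters(ratings)       # slides x categories count table
    kappa = fleiss_kappa(table, method='fleiss')
    return po, kappa

# Hypothetical data: 162 slides, 3 diagnostic categories (0-2),
# with 2 discordant slides as reported for pathologist 1.
rng = np.random.default_rng(0)
cm = rng.integers(0, 3, size=162)
dm = cm.copy()
dm[:2] = (dm[:2] + 1) % 3                      # force 2 discordant readings

po, kappa = intraobserver_agreement(cm, dm)
print(f"Po = {po:.2%}, kappa = {kappa:.2f}")   # Po = 160/162, i.e. about 98.8%
```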