J Med Internet Res. 2003 Jun 13;5(2):e11. doi: 10.2196/jmir.5.2.e11

Table 2.

For each participant: years of experience in pathology practice, diagnostic classification of each of the 10 Virtual Pathology Slides, level of agreement (% concordance with the consensus diagnosis and Cohen's Kappa), and number of fields of view examined

| ID* | EXP† | S6 | S2 | S3 | S4 | S7 | S9 | S10 | S1 | S5 | S8 | Concordance, %‡ | Kappa§ | Fields of View |
|-----|------|----|----|----|----|----|----|-----|----|----|----|-----------------|--------|----------------|
| 5 | 5 | B5 | B5 | B5 | B2 | B2 | B2 | B2 | B4 | B5 | B4 | 90 | 0.97 | 321 |
| 62 | 5 | B5 | B5 | B5 | B2 | B2 | B3 | B2 | B5 | B4 | B4 | 80 | 0.94 | 326 |
| 35 | 5 | B5 | B5 | B5 | B2 | B2 | B1 | B1 | B4 | B5 | B4 | 70 | 0.94 | 122 |
| 10 | 5 | B5 | B5 | B5 | B2 | B3 | B3 | B2 | B5 | B4 | B4 | 70 | 0.91 | 157 |
| 39 | 5 | B5 | B5 | B5 | B1 | B3 | B3 | B2 | B5 | B5 | B5 | 60 | 0.90 | 343 |
| 55 | 5 | B5 | B5 | B5 | B2 | B2 | B2 | B2 | B5 | B3 | B3 | 80 | 0.87 | 289 |
| 87 | 5 | B5 | B5 | B5 | B1 | B2 | B2 | B2 | B3 | B5 | B3 | 70 | 0.86 | 130 |
| 18 | 3 | B5 | B5 | B4 | B1 | B3 | B2 | B1 | B5 | B4 | B3 | 40 | 0.86 | 252 |
| 68 | 5 | B5 | B5 | B5 | B2 | B3 | B2 | B2 | B5 | B4 | B2 | 70 | 0.85 | 228 |
| 65 | 5 | B5 | B5 | B5 | B2 | B2 | B3 | B1 | B3 | B4 | B4 | 60 | 0.80 | 234 |
| 22 | 5 | B5 | B5 | B5 | B2 | B2 | B2 | B5 | B5 | B5 | B5 | 80 | 0.75 | 204 |
| 41 | 5 | B5 | B5 | B5 | B1 | B2 | B2 | B2 | B2 | B4 | B4 | 70 | 0.75 | 216 |
| 7 | 5 | B5 | B5 | B5 | B2 | B2 | B2 | B1 | B4 | B2 | B3 | 60 | 0.73 | 121 |
| 1 | 5 | B5 | B5 | B4 | B2 | B4 | B2 | B2 | B5 | B5 | B1 | 70 | 0.67 | 223 |
| 75 | 5 | B5 | B5 | B5 | B2 | B2 | B2 | B5 | B4 | B5 | B2 | 70 | 0.65 | 120 |
| 36 | 3 | B5 | B5 | B2 | B2 | B3 | B3 | B4 | B5 | B2 | B2 | 40 | 0.26 | 201 |
| 6 | 5 | B5 | B2 | B5 | B2 | B2 | B4 | B5 | B3 | B5 | B2 | 50 | 0.23 | 418 |
| Average | | | | | | | | | | | | 66.5 | 0.76 | 230 |

* ID = identification number of participant.

† EXP = years of experience in pathology practice.

‡ Concordance = number of slides (expressed as a percentage) for which the user's diagnosis is in agreement with the consensus Virtual Pathology Slide diagnosis.

§ Kappa = Cohen's Kappa, a measure of agreement between observers, taking into account agreement that could occur by chance. Kappa greater than 0.7 indicates "substantial agreement."
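
As a minimal sketch of the two agreement measures above, the Python snippet below computes percentage concordance and Cohen's Kappa for one participant's B1-B5 diagnoses against a reference diagnosis list. The participant row is taken from Table 2, but the consensus list is a hypothetical assumption added only to make the example runnable; it is not a reproduction of the study's calculation.

```python
from collections import Counter

def concordance_and_kappa(rater, reference):
    """Return (% concordance, Cohen's Kappa) for two equal-length label sequences."""
    n = len(rater)
    # Observed agreement: fraction of slides with identical diagnoses.
    p_o = sum(r == c for r, c in zip(rater, reference)) / n
    # Chance agreement: sum of products of the two raters' marginal category frequencies.
    rater_counts, ref_counts = Counter(rater), Counter(reference)
    p_e = sum(rater_counts[k] * ref_counts[k] for k in rater_counts) / n ** 2
    kappa = 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)
    return 100 * p_o, kappa

# Participant 5's diagnoses for slides S6, S2, S3, S4, S7, S9, S10, S1, S5, S8 (from Table 2)
participant_5 = ["B5", "B5", "B5", "B2", "B2", "B2", "B2", "B4", "B5", "B4"]
# Hypothetical consensus diagnoses, assumed here for illustration only
consensus = ["B5", "B5", "B5", "B2", "B2", "B2", "B2", "B4", "B4", "B4"]

print(concordance_and_kappa(participant_5, consensus))  # -> (90.0, ~0.85) for this hypothetical pair
```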