
Table 2. Measure of agreement.

| Measure of agreement | Results |
| --- | --- |
| ICC among pathologists, 22C3 | 0.834 (CI 0.758–0.896) |
| ICC among pathologists, SP263 | 0.868 (CI 0.803–0.918) |
| ICC between clones | 0.911 (CI 0.885–0.931) |
| Kappa at CPS ≥ 1 between clones | 0.891 (CI 0.825–0.957) |
| OPA at CPS ≥ 1 between clones | 98% (CI 95–99%) |
| PPA at CPS ≥ 1 between clones | 98% (CI 95–99%) |
| NPA at CPS ≥ 1 between clones | 97% (CI 85–100%) |
| Kappa at CPS ≥ 20 between clones | 0.808 (CI 0.753–0.862) |
| OPA at CPS ≥ 20 between clones | 90% (CI 87–93%) |
| PPA at CPS ≥ 20 between clones | 95% (CI 90–98%) |
| NPA at CPS ≥ 20 between clones | 87% (CI 80–91%) |
| Kappa for three categories (CPS < 1, 1 ≤ CPS < 20, and CPS ≥ 20) | 0.878 (CI 0.813–0.943) |
| OPA for three categories (CPS < 1, 1 ≤ CPS < 20, and CPS ≥ 20) | 88% (CI 84–92%) |

CI, confidence interval; CPS, combined positive score; ICC, intraclass correlation coefficient; NPA, negative percentage agreement; OPA, overall percentage agreement; PPA, positive percentage agreement.
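For readers less familiar with these agreement metrics, the sketch below shows how kappa, OPA, PPA, and NPA are conventionally derived from a 2 × 2 cross-tabulation of two assays dichotomised at a fixed cut-off (e.g. CPS ≥ 1). The counts used here are hypothetical illustration values, not the study data, and the choice of which clone serves as the reference for PPA/NPA is an assumption for the example.

```python
# Minimal sketch: agreement metrics from a 2x2 table of two PD-L1 clones
# at one CPS cut-off. All counts below are HYPOTHETICAL, not study data.

def agreement_stats(a: int, b: int, c: int, d: int):
    """Compute OPA, PPA, NPA, and Cohen's kappa from a 2x2 table.

    a: both clones positive
    b: clone 1 positive, clone 2 negative
    c: clone 1 negative, clone 2 positive
    d: both clones negative
    Clone 2 (columns) is treated as the reference for PPA/NPA.
    """
    n = a + b + c + d
    opa = (a + d) / n        # overall percentage agreement
    ppa = a / (a + c)        # agreement among reference positives
    npa = d / (b + d)        # agreement among reference negatives
    # Cohen's kappa: observed agreement corrected for chance agreement,
    # with expected agreement from the row and column marginals
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    kappa = (opa - p_e) / (1 - p_e)
    return opa, ppa, npa, kappa

# Hypothetical counts for illustration only
opa, ppa, npa, kappa = agreement_stats(a=120, b=2, c=3, d=40)
print(f"OPA={opa:.0%}  PPA={ppa:.0%}  NPA={npa:.0%}  kappa={kappa:.3f}")
```

The ICC values in the table come from a different model (a continuous-score reliability analysis across raters) and are not reproduced by this 2 × 2 calculation; the sketch covers only the dichotomised agreement statistics.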