Diagnostics. 2022 Feb 13;12(2):477. doi: 10.3390/diagnostics12020477

Table 2. Measures of agreement.

Concordance among raters (ICC):
    22C3 Omnis                         0.938 (CI 0.914 to 0.957)
    SP263                              0.930 (CI 0.899 to 0.953)
    22C3 Autostainer                   0.914 (CI 0.881 to 0.941)

Concordance among raters for cut-off categories (Fleiss' kappa):
    22C3 Omnis                         range 0.675–0.848
    SP263                              range 0.720–0.880
    22C3 Autostainer                   range 0.748–0.895

Concordance among assays (ICC):
    22C3 Omnis vs. SP263               range 0.874–0.993
    SP263 vs. 22C3 Autostainer         range 0.532–0.807
    22C3 Autostainer vs. 22C3 Omnis    range 0.686–0.924

Concordance among assays for cut-off categories (Cohen's kappa, single rater):
    22C3 Omnis vs. SP263               range 0.642–0.796
    SP263 vs. 22C3 Autostainer         range 0.522–0.687
    22C3 Autostainer vs. 22C3 Omnis    range 0.681–0.823
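
For context, the sketch below shows how agreement statistics of the kind reported in Table 2 (ICC across raters, Fleiss' kappa on cut-off categories, Cohen's kappa between two assays for a single rater) can be computed with common Python libraries (pingouin, statsmodels, scikit-learn). This is not the study's analysis code: the case data, rater labels, assay arrays, and the <1% / 1-49% / >=50% cut-offs shown are invented for illustration.

```python
# Illustrative sketch only (not the study's analysis code): computing ICC,
# Fleiss' kappa and Cohen's kappa on hypothetical PD-L1 scores.
import numpy as np
import pandas as pd
import pingouin as pg
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa
from sklearn.metrics import cohen_kappa_score

# Hypothetical long-format ratings: one score per case and rater for one assay.
df = pd.DataFrame({
    "case":  np.repeat(np.arange(1, 7), 3),
    "rater": ["R1", "R2", "R3"] * 6,
    "score": [80, 75, 90,   5, 10,  0,  55, 60, 50,
               0,  0,  1,  30, 25, 40,  95, 90, 100],
})

# Concordance among raters: intraclass correlation coefficient.
icc = pg.intraclass_corr(data=df, targets="case", raters="rater", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])

# Concordance among raters on cut-off categories: Fleiss' kappa.
df["cat"] = pd.cut(df["score"], bins=[-1, 0, 49, 100], labels=["<1%", "1-49%", ">=50%"])
wide = df.pivot(index="case", columns="rater", values="cat")
counts, _ = aggregate_raters(wide.to_numpy())
print("Fleiss' kappa:", fleiss_kappa(counts))

# Concordance between two assays for a single rater on cut-off categories: Cohen's kappa.
assay_a = ["<1%", "1-49%", ">=50%", ">=50%", "1-49%", "<1%"]
assay_b = ["<1%", "1-49%", ">=50%", "1-49%", "1-49%", "<1%"]
print("Cohen's kappa:", cohen_kappa_score(assay_a, assay_b))
```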