J Clin Microbiol. 2024 Oct 24;62(11):e00791-24. doi: 10.1128/jcm.00791-24

TABLE 3. Agreement values and kappa coefficient between qPCR and RT-qPCR according to clinical classification and sample type^(a,b)

|                        | All classifications (possible, probable, proven) | Probable             | Probable and proven  |
|------------------------|--------------------------------------------------|----------------------|----------------------|
| All samples            |                                                  |                      |                      |
| Overall agreement %    | 85.5 [79.8–89.8]                                 | 87.5 [81.1–91.9]     | 87.8 [81.5–92.1]     |
| Positive agreement %   | 100.0 [93.8–100.0]                               | 100.0 [92.4–100.0]   | 100.0 [95.4–100.0]   |
| Negative agreement %   | 79.3 [71.7–85.2]                                 | 81.4 [72.6–87.9]     | 73.1 [61.5–82.3]     |
| Kappa coefficient      | 0.69 [0.59–0.79]                                 | 0.75 [0.64–0.85]     | 0.75 [0.64–0.85]     |
| Kappa interpretation   | Substantial agreement                            | Substantial agreement| Substantial agreement|
| Plasma samples         |                                                  |                      |                      |
| Overall agreement %    | 73.2 [60.4–83.0]                                 | 76.5 [60.0–87.6]     | 77.1 [61.0–87.9]     |
| Positive agreement %   | NA                                               | NA                   | NA                   |
| Negative agreement %   | 73.2 [60.4–83.0]                                 | 76.5 [60.0–87.6]     | 77.1 [61.0–87.9]     |
| Kappa coefficient      | NA                                               | NA                   | NA                   |
| Kappa interpretation   | NA                                               | NA                   | NA                   |
| Respiratory samples    |                                                  |                      |                      |
| Overall agreement %    | 92.7 [86.2–96.2]                                 | 94.0 [86.7–97.4]     | 94.0 [86.8–97.4]     |
| Positive agreement %   | 100.0 [93.2–100.0]                               | 100.0 [91.8–100.0]   | 100.0 [92.0–100.0]   |
| Negative agreement %   | 85.7 [74.3–92.6]                                 | 87.5 [73.9–94.5]     | 87.5 [73.9–94.5]     |
| Kappa coefficient      | 0.85 [0.76–0.95]                                 | 0.88 [0.78–0.98]     | 0.88 [0.78–0.98]     |
| Kappa interpretation   | Almost perfect agreement                         | Almost perfect agreement | Almost perfect agreement |
^a For the agreement calculations, the qPCR method was considered the comparative method and the RT-qPCR method the candidate method. Agreements and kappa coefficients are given with their 95% confidence intervals [95% CI].

^b NA, not applicable.
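As a rough guide to how the figures in this table are typically derived, the sketch below computes overall, positive, and negative percent agreement together with Cohen's kappa from a 2x2 contingency table, treating qPCR as the comparative method and RT-qPCR as the candidate method, as described in footnote a. The function name and the example counts are illustrative placeholders, not the study's data or analysis code.

```python
def agreement_stats(a, b, c, d):
    """Agreement statistics for a 2x2 contingency table.

    a = qPCR+ / RT-qPCR+   (both positive)
    b = qPCR+ / RT-qPCR-
    c = qPCR- / RT-qPCR+
    d = qPCR- / RT-qPCR-   (both negative)
    """
    n = a + b + c + d
    overall = (a + d) / n                       # overall percent agreement
    ppa = a / (a + b) if (a + b) else None      # positive agreement vs. comparative (qPCR) positives
    npa = d / (c + d) if (c + d) else None      # negative agreement vs. comparative (qPCR) negatives
    # Cohen's kappa: observed agreement corrected for chance-expected agreement
    p_o = overall
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / (n * n)
    kappa = (p_o - p_e) / (1 - p_e) if p_e != 1 else None
    return overall, ppa, npa, kappa

# Hypothetical counts, for demonstration only
overall, ppa, npa, kappa = agreement_stats(a=60, b=0, c=25, d=115)
print(f"Overall {overall:.1%}, PPA {ppa:.1%}, NPA {npa:.1%}, kappa {kappa:.2f}")
```

The kappa interpretation labels used in the table ("substantial", "almost perfect") correspond to the commonly used Landis and Koch scale; confidence intervals for the agreement percentages and kappa would be obtained separately (e.g., by exact binomial or bootstrap methods), which the sketch above does not attempt.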