Table 1.
| | κ | % of Agreement |
| --- | --- | --- |
| Repeatability | | |
| Intraobserver intramodality | | |
| MRI | 1 | 100 |
| CT | 1 | 100 |
| Reproducibility | | |
| Interobserver intramodality | | |
| MRI | 0.54 | 81 |
| CT | 0.87 | 94 |
| Intraobserver intermodality | | |
| Observer 1 | 0.59 | 81 |
| Observer 2 | 0.57 | 81 |
κ, Kappa value; levels of agreement: almost perfect (0.8 < κ ≤ 1), substantial (0.6 < κ ≤ 0.8), moderate (0.4 < κ ≤ 0.6), fair (0.2 < κ ≤ 0.4), slight (0 ≤ κ ≤ 0.2).
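As an illustration only (the ratings below are hypothetical, not data from this study), the following minimal Python sketch shows how Cohen's kappa and percentage agreement can be computed for two observers and mapped to the agreement levels defined in the footnote above.

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(ratings_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each observer's marginal category frequencies
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(ratings_a) | set(ratings_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

def agreement_level(kappa):
    """Map a kappa value to the levels of agreement listed in the Table 1 footnote."""
    if kappa > 0.8:
        return "almost perfect"
    if kappa > 0.6:
        return "substantial"
    if kappa > 0.4:
        return "moderate"
    if kappa > 0.2:
        return "fair"
    return "slight"

# Hypothetical categorical ratings of the same cases by two observers
observer_1 = [1, 0, 1, 1, 0, 1, 0, 1]
observer_2 = [1, 0, 1, 0, 0, 1, 0, 1]

kappa = cohen_kappa(observer_1, observer_2)
pct = 100 * sum(a == b for a, b in zip(observer_1, observer_2)) / len(observer_1)
print(f"kappa = {kappa:.2f} ({agreement_level(kappa)}), agreement = {pct:.0f}%")
```

Note that percentage agreement alone ignores chance agreement, which is why kappa can be modest (e.g., 0.54) even when raw agreement exceeds 80%, as in the interobserver MRI row of Table 1.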