2019 Jun 26;14:65. doi: 10.1186/s13000-019-0839-8

Table 3.

Intra-observer agreement between each pair of observation methods (kappa coefficient)

| Comparison              | Observer 1 | Observer 2 | Observer 3 | Observer 4 | Observer 5 |
|-------------------------|------------|------------|------------|------------|------------|
| Scanner A vs Microscope | 0.835      | 0.677      | 0.784      | 0.748      | 0.830      |
| Scanner B vs Microscope | 0.775      | 0.729      | 0.861      | 0.763      | 0.797      |
| Scanner C vs Microscope | 0.824      | 0.717      | 0.814      | 0.738      | 0.726      |
| Scanner D vs Microscope | 0.815      | 0.664      | 0.804      | 0.776      | 0.799      |
| Scanner A vs B          | 0.816      | 0.745      | 0.777      | 0.858      | 0.808      |
| Scanner A vs C          | 0.851      | 0.704      | 0.744      | 0.859      | 0.892      |
| Scanner A vs D          | 0.856      | 0.594      | 0.728      | 0.880      | 0.742      |
| Scanner B vs C          | 0.792      | 0.784      | 0.939      | 0.891      | 0.848      |
| Scanner B vs D          | 0.810      | 0.709      | 0.852      | 0.871      | 0.761      |
| Scanner C vs D          | 0.831      | 0.772      | 0.832      | 0.872      | 0.754      |
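For readers unfamiliar with the statistic, the agreement values above can be reproduced from paired ratings with Cohen's kappa. The sketch below is a minimal unweighted implementation with hypothetical rating data; the study may have used a weighted variant, and the example scores are illustrative, not taken from the paper.

```python
from collections import Counter

def cohen_kappa(a, b):
    """Unweighted Cohen's kappa between two rating sequences of equal length.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected by chance from the marginal frequencies.
    """
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    p_e = sum(counts_a[c] * counts_b[c] for c in set(counts_a) | set(counts_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: one observer grades the same eight slides
# via microscope and via a whole-slide scanner (categories 0-3).
microscope = [0, 1, 1, 2, 2, 2, 3, 3]
scanner = [0, 1, 2, 2, 2, 3, 3, 3]

print(f"kappa = {cohen_kappa(microscope, scanner):.3f}")  # → kappa = 0.652
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is the range most cells in the table fall into.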