Front Psychol. 2024 Jul 23;15:1402065. doi: 10.3389/fpsyg.2024.1402065

Table 2. The I-CVI values and modified kappa values.

| Item | Athletes (n = 22): M | I-CVI | pc | k* | Coaches (n = 16): M | I-CVI | pc | k* | Sport Psychologists (n = 7): M | I-CVI | pc | k* |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 4.05 | 0.64 | 0.01 | 0.64 | 4.06 | 0.63 | 0.12 | 0.58 | 4.57 | 1.00 | 0.00 | 1.00 |
| 2 | 4.50 | 1.00 | 0.00 | 1.00 | 4.44 | 0.94 | 0.00 | 0.94 | 4.29 | 0.86 | 0.05 | 0.85 |
| 3 | 3.32 | 0.45 | 0.15 | 0.35 | 3.25 | 0.50 | 0.20 | 0.38 | 4.14 | 0.71 | 0.16 | 0.65 |
| 4 | 4.36 | 0.91 | 0.00 | 0.91 | 4.31 | 0.88 | 0.00 | 0.88 | 3.71 | 0.57 | 0.27 | 0.41 |
| 5 | 4.32 | 0.91 | 0.00 | 0.91 | 4.50 | 1.00 | 0.00 | 1.00 | 4.71 | 1.00 | 0.00 | 1.00 |
| 6 | 4.36 | 0.77 | 0.01 | 0.77 | 4.31 | 0.75 | 0.03 | 0.74 | 4.57 | 1.00 | 0.00 | 1.00 |
| 7 | 4.27 | 0.73 | 0.02 | 0.72 | 4.13 | 0.69 | 0.07 | 0.67 | 4.71 | 0.86 | 0.05 | 0.85 |
| 8 | 3.68 | 0.55 | 0.15 | 0.47 | 4.06 | 0.75 | 0.03 | 0.74 | 4.57 | 1.00 | 0.00 | 1.00 |
| 9 | 4.27 | 0.82 | 0.00 | 0.82 | 4.25 | 0.88 | 0.00 | 0.88 | 3.86 | 0.86 | 0.05 | 0.85 |
| 10 | 4.45 | 0.82 | 0.00 | 0.82 | 4.44 | 0.88 | 0.00 | 0.88 | 4.86 | 1.00 | 0.00 | 1.00 |
| 11 | 4.41 | 0.82 | 0.00 | 0.82 | 4.31 | 0.88 | 0.00 | 0.88 | 4.43 | 0.86 | 0.05 | 0.85 |
| 12 | 4.32 | 0.95 | 0.00 | 0.95 | 4.50 | 1.00 | 0.00 | 1.00 | 4.86 | 1.00 | 0.00 | 1.00 |
| 13 | 4.50 | 1.00 | 0.00 | 1.00 | 4.44 | 0.94 | 0.00 | 0.94 | 4.57 | 1.00 | 0.00 | 1.00 |
| 14 | 4.41 | 0.82 | 0.00 | 0.82 | 4.19 | 0.75 | 0.03 | 0.74 | 3.71 | 0.57 | 0.27 | 0.41 |
| 15 | 4.36 | 0.86 | 0.00 | 0.86 | 4.38 | 0.88 | 0.00 | 0.88 | 4.71 | 1.00 | 0.00 | 1.00 |
| 16 | 3.95 | 0.73 | 0.02 | 0.72 | 4.13 | 0.81 | 0.01 | 0.81 | 4.43 | 1.00 | 0.00 | 1.00 |
| 17 | 4.18 | 0.77 | 0.01 | 0.77 | 4.13 | 0.81 | 0.01 | 0.81 | 4.29 | 0.86 | 0.05 | 0.85 |
| 18 | 4.41 | 0.77 | 0.01 | 0.77 | 4.25 | 0.75 | 0.03 | 0.74 | 4.86 | 1.00 | 0.00 | 1.00 |
| 19 | 4.45 | 0.82 | 0.00 | 0.82 | 4.44 | 0.88 | 0.00 | 0.88 | 4.71 | 1.00 | 0.00 | 1.00 |
| 20 | 4.00 | 0.68 | 0.04 | 0.67 | 4.06 | 0.75 | 0.03 | 0.74 | 4.86 | 1.00 | 0.00 | 1.00 |
| 21 | 4.18 | 0.82 | 0.00 | 0.82 | 4.06 | 0.88 | 0.00 | 0.88 | 4.86 | 1.00 | 0.00 | 1.00 |
| 22 | 4.36 | 0.91 | 0.00 | 0.91 | 4.50 | 1.00 | 0.00 | 1.00 | 4.71 | 1.00 | 0.00 | 1.00 |
| 23 | 4.45 | 0.91 | 0.00 | 0.91 | 4.25 | 0.81 | 0.01 | 0.81 | 4.86 | 1.00 | 0.00 | 1.00 |
| 24 | 4.32 | 0.77 | 0.01 | 0.77 | 4.00 | 0.75 | 0.03 | 0.74 | 4.29 | 0.86 | 0.05 | 0.85 |
| 25 | 4.36 | 0.95 | 0.00 | 0.95 | 4.13 | 0.88 | 0.00 | 0.88 | 4.71 | 1.00 | 0.00 | 1.00 |
| 26 | 4.41 | 0.86 | 0.00 | 0.86 | 4.44 | 0.94 | 0.00 | 0.94 | 5.00 | 1.00 | 0.00 | 1.00 |
| 27 | 4.41 | 0.86 | 0.00 | 0.86 | 4.38 | 0.81 | 0.01 | 0.81 | 4.71 | 0.86 | 0.05 | 0.85 |

M is the mean rating. I-CVI is the item-level content validity index. pc is the probability of chance agreement, computed as pc = 0.5^N × [N!/(A!(N − A)!)], where N = number of judges and A = number of judges agreeing on good relevance. k* is the modified kappa designating agreement on relevance, computed as k* = (I-CVI − pc)/(1 − pc). Evaluation criteria for kappa follow Cicchetti and Sparrow (1981) and Fleiss (1981): fair = k* of 0.40–0.59; good = k* of 0.60–0.74; excellent = k* > 0.74.
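
To make the footnote formulas concrete, here is a minimal Python sketch (not from the paper; the function name is illustrative, and the judge count A in the example is back-calculated from the reported I-CVI) that computes I-CVI, pc, and k* for a single item:

```python
from math import comb

def content_validity(n_judges: int, n_relevant: int) -> tuple[float, float, float]:
    """Return (I-CVI, pc, k*) for one item, following the table footnote formulas."""
    # I-CVI: proportion of judges rating the item as relevant
    i_cvi = n_relevant / n_judges
    # pc: probability of chance agreement, pc = 0.5^N * [N! / (A!(N - A)!)]
    pc = comb(n_judges, n_relevant) * 0.5 ** n_judges
    # k*: kappa adjusted for chance agreement, k* = (I-CVI - pc) / (1 - pc)
    kappa = (i_cvi - pc) / (1 - pc)
    return i_cvi, pc, kappa

# Item 3, athletes: I-CVI = 0.45 with N = 22 implies A = 10 judges agreeing on relevance
print(content_validity(22, 10))  # ≈ (0.455, 0.154, 0.355), reported as 0.45, 0.15, 0.35
```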