Table 5. Agreement on items for usability evaluation involving experts (N=27).
| Planning and reporting procedures for usability evaluation with experts | Items grouped from 1 to 3, n (%) | Items grouped from 4 to 6, n (%) | Items grouped from 7 to 9, n (%) |
| --- | --- | --- | --- |
| **Usability evaluator and the domain evaluator** | | | |
| Determine the number of evaluators involved in the evaluation. | 0 (0) | 10 (37) | 17 (63) |
| Provide the rationale to establish the number of evaluators. | 1 (4) | 14 (52) | 12 (44) |
| Define as an inclusion criterion previous experience in usability inspection evaluation, or consider adequate training and provide details of the training. | 0 (0) | 5 (19) | 22 (82)ᵃ |
| State whether the evaluators are external to the product or service development team. | 1 (4) | 11 (41) | 15 (56) |
| Specify whether a combination of evaluators from different domains was used (eg, for a health-related digital service, use both usability and health domain evaluators). | 0 (0) | 3 (11) | 24 (89) |
| Provide clear inclusion and exclusion criteria.ᵇ | 1 (4) | 1 (4) | 25 (93) |
| **Inspection method** | | | |
| Detail the protocol to conduct the inspection (including the techniques used and how they are implemented).ᶜ | 0 (0) | 1 (4) | 26 (96) |
| State whether a combination of techniques was used (eg, heuristic evaluation and cognitive walkthrough). | 2 (7) | 6 (22) | 19 (70) |
| Provide the rationale for the choice of the techniques. | 1 (4) | 5 (19) | 21 (78) |
| Detail the criteria to prioritize the resolution of problems identified (eg, according to the severity criteria, problems with higher impact on users are solved first).ᵃ | 0 (0) | 4 (15) | 23 (85) |
ᵃValues in italics denote the items that reached consensus on inclusion.

ᵇNew items that emerged from round 1.

ᶜItems that were rephrased.