2019 Jan 9;19:4. doi: 10.1186/s12911-018-0718-3

Table 2.

Form content

 1. Is there some form of feedback for every user interaction? [13, 28]
 2. Is this feedback noticeable and readable?
 3. Is this feedback given within a reasonable amount of time? [29]
 4. Does the tool provide informative progress disclosure when filling in a form, e.g., percentage of completion or estimated time remaining to complete the form? [5, 28]
 5. After users complete a task or group of tasks, does the feedback indicate that they can proceed to the next task? [13]
 6. Are the icons used in the tool concrete and familiar? [13]
 7. If shapes are used as visual cues in the tool, do they match cultural conventions? [13]
 8. Is the language used in the form clear, effective and appropriate for the target users? [14, 28]
 9. Is all the information essential to decision making, and only that information, displayed on the screen? [13, 28]
 10. Is colour coding used for clarity where appropriate? [28]
 11. Do the selected colours used in the form correspond to common expectations about colour codes? [13]
 12. Is the number of colours limited to 3–4? [28]
 13. Are different presentations adopted for each of the headings, subheadings and instructions?
 14. Do the information elements e.g. images and labels stand out from the form background? [5]
 15. Are there visual differences between interaction objects (e.g., buttons) and information objects (e.g., labels, images)? [5]
 16. Can the questionnaire be broken down into sections?
 17. Can each section have a section name with a small introduction?
 18. Are the rows and columns of a table designed to be clear and understandable by the users?
 19. If the form has multipage data entry screens, do all pages have the same title? [13, 28]
 20. Do help instructions appear in a consistent location across all the form screens? [13]
 21. Is there a consistent icon design scheme and stylistic treatment across the form? [5, 13]
 22. Is the menu located consistently across the form? [5]
 23. Is all the information users enter into the data forms validated, and are users informed when it is not in an acceptable format? [28]
 24. Are all abbreviated words of the same length? [13]
 25. Is the format of a data entry value for similar data types consistent from screen to screen of a given form? [13]
 26. Is the design of each input type, e.g., text box or drop-down, consistent across the form? [5]
 27. Do data entry screens and dialog boxes indicate when fields are optional? [13]
 28. Are the mandatory or required data entry fields clearly marked? [5, 28]
 29. Is the length of the page controlled, e.g., by limiting the number of questions per page? [13, 28]
 30. Has the skip logic been automated?
 31. Are the help instructions visually distinct and accessible? [13, 28]
 32. If menu items are ambiguous, does the tool provide additional explanatory information when an item is selected? [13]
 33. Is the help function visible, for example, as a key labelled HELP or a special menu? [13, 14]
 34. Can users easily switch between help and their work? [13, 28]
 35. Can users resume work where they left off after accessing help? [13, 28]
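Items 1, 23, and 28 above (feedback for every interaction, validation of entered data, and clearly marked required fields) can be illustrated with a minimal TypeScript sketch. The rule and function names here (`FieldRule`, `validateScreen`) are illustrative assumptions, not part of the checklist or any particular form library; the point is that every required or mis-formatted field yields a readable message rather than silent failure.

```typescript
// Per-field rules: a required flag (item 28) and an optional
// acceptable-format pattern with a human-readable hint (item 23).
type FieldRule = {
  label: string;
  required: boolean;
  pattern?: RegExp;
  hint?: string;
};

type Feedback = { field: string; message: string };

// Validate one screen of entries. Every problem produces a
// feedback message, so no user interaction goes unanswered (item 1).
function validateScreen(
  rules: Record<string, FieldRule>,
  entries: Record<string, string>
): Feedback[] {
  const feedback: Feedback[] = [];
  for (const name of Object.keys(rules)) {
    const rule = rules[name];
    const value = (entries[name] || "").trim();
    if (rule.required && value === "") {
      feedback.push({ field: name, message: rule.label + " is required." });
    } else if (value !== "" && rule.pattern && !rule.pattern.test(value)) {
      feedback.push({
        field: name,
        message: rule.label + ": " + (rule.hint || "value is not in an acceptable format."),
      });
    }
  }
  return feedback;
}
```

In use, an empty feedback array means the user can proceed (item 5); a non-empty one should be rendered next to the offending fields in a noticeable, readable style (item 2).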
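Items 4 and 30 (progress disclosure and automated skip logic) interact: a completion percentage is only honest if it is computed over the questions the respondent will actually see. The sketch below, with assumed `Question` and `Answers` shapes of my own choosing, encodes each skip rule as a predicate so branching is automated rather than left to "if No, skip to question 7" instructions.

```typescript
type Answers = Record<string, string>;

// A question is visible unless its skip-logic predicate says
// otherwise (item 30: skip logic is automated, not manual).
type Question = {
  id: string;
  showIf?: (answers: Answers) => boolean;
};

function visibleQuestions(questions: Question[], answers: Answers): Question[] {
  return questions.filter((q) => q.showIf === undefined || q.showIf(answers));
}

// Item 4: percentage complete, computed over only the questions
// that remain visible after skip logic is applied.
function percentComplete(questions: Question[], answers: Answers): number {
  const visible = visibleQuestions(questions, answers);
  if (visible.length === 0) return 100;
  const answered = visible.filter((q) => (answers[q.id] || "") !== "").length;
  return Math.round((answered / visible.length) * 100);
}
```

A side effect of this design is that answering a gating question can legitimately jump the progress figure, because skipped follow-ups drop out of the denominator.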