Table 5. Adaptations in REDCap to address heuristic and usability findings.

| Challenges with REDCap^a intervention delivery | Sample adaptation to enhance usability |
| --- | --- |
| **Mismatch between format and the conceptual expectations of a website** | |
| Heuristic analysis recommended better distribution of white space by moving information to the footer, header, or side menus where possible. | Participant resources links and survey page instructions were moved to the *Survey Footer* to separate them from module-related text. |
| Participants recommended more branding visibility, as they appreciated the project’s affiliation with their clinic. | Aesthetics were constrained by the limited places where logos could be added; combined logos were created so that multiple entities could be represented. |
| Participants found page titles confusing and recommended clearer instructions and wording about the intervention and module titles. | Headers and text were revised and simplified to clarify instructions about the study and intervention and to make the intervention consistently identifiable on each page. |
| **Nonintuitive navigation through the program** | |
| There was no independent home page functionality beyond using the *Survey Queue* as a starting point, which participants found unfamiliar and confusing. | A site map was not possible within REDCap; therefore, study status graphics were added to the first and last pages of each module to show the participant’s progression through the intervention. |
| Participants struggled to tell how far along they were in the program because the *Survey Queue* did not show upcoming content when *Automatic Survey Invitations* were used. | Page numbers were added to show progression through each module. |
| Participants found that saving and returning with a randomly generated re-entry code was nonintuitive and easy to miss when leaving a survey, making it difficult to return to the intervention. | The *Survey Login* was enabled so that participants could log into REDCap with their email address instead of a randomly generated code. Navigation instructions were added to the FAQ^b, and the FAQ was linked in the *Automatic Survey Invitation* emails. |
| Participants had difficulty returning to REDCap intervention pages after clicking a hyperlink because there was no way to link back to other instruments from within a survey. | The number of embedded hyperlinks was minimized. Where hyperlinks were unavoidable, instructions were added, eg, how to navigate back to the next part of the intervention from the patient resources webpage. |
| **Confusing site architecture** | |
| REDCap’s participant-facing interface was the survey format, and participants struggled with hardcoded survey labels and buttons such as *Survey Login* or *Close Survey*. | Instructions were revised to say “survey” instead of “assessment” or “questionnaire.” Where survey labels could not be removed, such as in the linked tips and help documentation, descriptive instructions were added, eg, “Click ‘Close Survey’ to close this window. Then go back to the program page.” |
| Using the *Survey Queue* as the home page for the intervention confused some participants because the program was not a survey in the typical sense. | Visible use of the *Survey Queue* was replaced with study status graphics at the beginning and end of each module to limit the number of “survey” titles and buttons. The *Stealth Queue* external plug-in was used to prevent the survey queue from automatically displaying at the end of a survey. |
| To advance, participants needed to click the *Submit* button even when nothing was being submitted, such as after viewing educational material. | The number of instruments per module was reduced to limit the number of *Submit* buttons. |

^a REDCap-specific terms are in italics.

^b FAQ: frequently asked question.