Abstract
Objective
The aim of this study was to design, and assess the formative usability of, a novel patient portal intervention intended to empower patients with diabetes to initiate orders for diabetes-related monitoring and preventive services.
Materials and Methods
We used a user-centered Design Sprint methodology to create our intervention prototype and assess its usability with 3 rounds of iterative testing. Participants (5/round) were presented with the prototype and asked to perform common, standardized tasks using think-aloud procedures. A facilitator rated task performance using a scale: (1) completed with ease, (2) completed with difficulty, and (3) failed. Participants completed the System Usability Scale (SUS), scored from 0 (worst) to 100 (best). All testing occurred remotely via Zoom.
Results
We identified 3 main categories of usability issues: distrust about the automated system, content concerns, and layout difficulties. Changes included improving clarity about the ordering process and simplifying language; however, design constraints inherent to the electronic health record system limited our ability to respond to all usability issues (eg, could not modify fixed elements in layout). The percentage of tasks completed with ease was 67%, 60%, and 80% in rounds 1, 2, and 3, respectively. Average SUS scores were 87, 74, and 93, respectively. Across rounds, participants found the intervention valuable and appreciated the concept of patient-initiated ordering.
Conclusions
Through iterative user-centered design and testing, we improved the usability of the patient portal intervention. A tool that empowers patients to initiate orders for disease-specific services as part of their existing patient portal account has potential to enhance the completion of recommended health services and improve clinical outcomes.
Keywords: patient portal, diabetes mellitus, user-centered design, practice guidelines, usability
INTRODUCTION
Diabetes self-management can effectively control diabetes and reduce rates of complications, but requires that patients regularly engage in diabetes-related monitoring and preventive services (eg, hemoglobin A1c [HbA1c] testing, nephropathy screening).1 However, many barriers prevent patients from completing these evidence-based services, including both patient-level (eg, lack of awareness of recommended screening, limited health literacy) and clinician/system-level factors (eg, limited physician time, limited patient support between visits).2–6 As a result, many patients experience gaps in recommended care.7 As one example, despite guidelines recommending patients with type 2 diabetes receive an annual eye screening, nearly half of adults with diabetes have not had retinopathy screening in the last year.7
Patient portals offer a promising platform for reducing barriers to completing recommended diabetes health services. Most US health care systems offer their patients convenient online portal access to their electronic health record (EHR) data.8 Patients are highly interested in using portals9 and adoption continues to rise.10,11 New advances in patient portal technology have created opportunities to engage patients further in their care by enabling patients to receive health alerts and request or self-schedule health services.12–15 Early results suggest providing this functionality may increase rates of preventive health services and patient activation, and improve clinical efficiency.12–15
Despite growing availability, adoption, and potential to engage and empower patients, challenges to patient portal use remain. For example, patients often struggle with complex user interfaces and difficult-to-understand medical content.9,16,17 A recent scoping review identified systematically measuring the usability of patient portals as a research priority and best practice to achieve higher rates of patient engagement.18 Incorporating formative usability testing into the design process ensures the design meets users’ needs prior to spending time and effort on programming and reduces the likelihood that content and layout issues will interfere with efficacy.19
There are many approaches to design and usability testing; however, many are time and resource intensive or lack the necessary rigor. Our research team and others have previously described the application of Design Sprint methodology, a 5-phase process for rapidly testing and rigorously evaluating ideas,20 to design innovative, patient-facing health apps.21,22 The process allows for early feedback from end-users and implementation planning during intervention development to ensure apps meet users’ expectations and are designed to facilitate effective implementation.22,23
Objective
We sought to: (1) apply a user-centered Design Sprint methodology to design a novel patient portal intervention to make patients aware of and empower them to address diabetes care gaps and (2) assess the usability of the resulting intervention prototype before evaluating the intervention in a trial. Our overall goal was to design an intervention that would reduce the prevalence of diabetes care gaps by notifying patients when they are due for services, promote understanding the importance of these services with content appropriate for varying levels of literacy, and allow patients to initiate orders for relevant care while facilitating care delivery.
METHODS
Setting
This research was conducted at Vanderbilt University Medical Center (VUMC) in Nashville, TN. All clinical data at VUMC are stored in the EHR supplied by Epic Systems Corp. Patients receive access to their clinical data via an integrated patient portal called My Health at Vanderbilt (MHAV),24 which is accessible on desktops and via a native mobile app for iOS and Android operating systems. We planned for our intervention to be embedded in this patient portal.
Design Sprint
We used a 5-phase user-centered Design Sprint methodology developed by Google Ventures (Alphabet Inc., Mountain View, CA) to design our intervention.20 Figure 1 outlines each phase of a Design Sprint. Below we generally describe our approach; more detailed information on each phase is available in other resources.20,21,25
Figure 1.
The 5 phases in a Design Sprint. Adapted from Knapp et al.20
In phase 1, we mapped our challenge to create a patient portal intervention that would empower patients to close diabetes care gaps. In this phase, we sought to understand patients’ needs and challenges with receiving recommended diabetes services. This process was informed by a literature review26–31 and discussions with expert stakeholders (eg, population health, patient experience, quality and safety, diabetes providers) to enhance our understanding of challenges. In phase 2, we reviewed existing patient portal features and functionality available on Epic’s MyChart platform as well as existing ideas and architecture used for alerting patients, displaying guidance, and enabling action. Using what we learned from stakeholders in phase 1 and a shared understanding of Epic’s existing infrastructure, we sketched potential intervention designs. The goal was to broadly ideate and generate many solutions that creatively leveraged Epic’s functionality. This involved having team members sketch out different ways existing functions could be reenvisioned so patients could receive alerts and request services. In phase 3, the study team reviewed proposed designs and, guided by the feasibility and sustainability of the proposed solutions, decided by consensus which design had the greatest potential to successfully meet our challenge in the long term. In this phase, we chose to repurpose Epic’s questionnaire template (originally designed for surveying patients on symptoms and other health information) by adapting this functionality into an alert and action/response tool for patients to initiate orders for diabetes services. In phase 4, we used a prototyping tool (Figma) to develop an initial prototype for formative usability testing in phase 5.
Formative usability testing
Study population
Participants were recruited from VUMC-affiliated adult primary care clinics throughout middle Tennessee. Eligible patients were age 18 or older, diagnosed with type 1 or 2 diabetes mellitus, able to speak and read English, and had an existing MHAV account. We excluded patients living in long-term care facilities, patients with known cognitive deficits or functional impairment preventing the use of a desktop or mobile device, and patients receiving hospice care.
We queried the EHR for eligible patients and then sent them letters describing the study and inviting them to participate. Interested patients completed an online eligibility prescreener and signed an electronic consent form via REDCap. Patients were then contacted by phone to confirm eligibility, review study participation, and answer questions. We recruited our participant pool expecting to conduct 2–5 rounds of formative usability testing, with each round consisting of approximately 5 participants. Based on prior usability research, 85% of usability issues are discovered within the first 5 users, such that there is little additional benefit of testing with more than 5.32,33 By including 5 users in each of our 3 rounds of testing, we could be confident we would identify the majority of usability issues with each new iteration of the intervention.34 Patients in the participant pool completed a short survey of demographics and health literacy.35,36 To ensure participation of groups with distinct usability challenges, purposive sampling was used to include at least 1 person aged 65 years or older and 1 person with limited health literacy in each round of testing.36,37 The Vanderbilt Institutional Review Board approved this research.
Procedure
Each formative usability session included a task-based usability assessment of the patient portal intervention’s prototyped features and functionality. All testing occurred remotely via Zoom; each session took approximately 60 minutes and was audio/video-recorded and transcribed. A Human Factors Researcher (author CR) facilitated each usability session using a semistructured interview guide. First, the patient received a standardized introduction to the task-based usability session. The facilitator instructed patients on “think-aloud” procedures which allow for rich qualitative data collection to better understand and track participants’ thought processes as they attempt to use the prototype.38,39 Next, the facilitator presented participants with the prototype and asked them to perform common, standardized tasks (Table 1). The interview guide elicited participants’ (1) expectations for content and functionality, (2) ability to comprehend the information displayed, and (3) satisfaction with the prototype. The facilitator privately rated the participants’ task performance after they completed each task and later used transcripts to verify the rating. All participants completed the tasks (Table 1) in the same order. The findings from 1 round informed prototype revisions prior to the subsequent round.
Table 1.
Tasks participants were asked to complete as part of the usability assessment of the intervention prototype
| Task | Participant attempts to… |
|---|---|
| 1. Interpret A1c | Interpret the information presented within a To Do list item notifying the patient that they are due for an A1c blood test and can request this care through MHAV. |
| 2. Complete A1c | Complete a To Do list item to request the A1c blood test. |
| 3. A1c confirmation | Review and interpret the confirmation message for the A1c blood test request. |
| 4. Interpret microalbumin | Interpret the information presented within a To Do list item notifying the patient that they are due for a urine protein test (microalbumin) and can request this care through MHAV. |
| 5. Interpret eye exam | Interpret the information presented within a To Do list item notifying the patient that they are due for a diabetes eye exam and can request this care through MHAV. |
| 6. Eye exam done | Complete a To Do list item to indicate the patient received a diabetic eye exam outside of VUMC within the last 12 months. |
| 7. Request eye exam | Complete a To Do list item to request a diabetic eye exam. |
| 8. Eye exam confirmation | Review and interpret the confirmation message for the diabetic eye exam request. |
| 9. Interpret pneumonia vaccine | Interpret the information presented within a To Do List item notifying the patient that they are due for a pneumonia vaccine and can request this care through MHAV. |
| 10. Complete pneumonia vaccine | Complete a To Do list item to request the pneumonia vaccine. |
| 11. Pneumonia vaccine confirmation | Review and interpret the confirmation message for the pneumonia vaccine request. |
Abbreviations: MHAV: My Health at Vanderbilt; VUMC: Vanderbilt University Medical Center.
At the end of each session, participants completed a questionnaire that included the 10-item System Usability Scale (SUS)40,41 and a validated measure of eHealth literacy,42 and that collected demographics, diabetes characteristics (eg, insulin use, diabetes duration), and familiarity with MHAV. All participants who completed a usability session were compensated $40.
Outcome measures
Task performance
Facilitators rated participants’ task performance using a 3-category scale adapted from prior studies: (1) completed with ease (ie, successfully achieved all task goals without assistance or any indication of confusion or uncertainty on how to proceed with the task), (2) completed with difficulty (ie, successfully achieved all task goals without assistance, but demonstrated some confusion or uncertainty on how to proceed with the task), and (3) failed (ie, did not successfully achieve 1 or more task goals, or required assistance from the facilitator to continue with the task).
Usability
Participants completed the SUS using a 5-point Likert scale (1 = “strongly disagree” to 5 = “strongly agree”).40,41 Overall item scores are converted to a 0–100 scale, with 100 indicating the best possible usability. Based on prior research,43 a score above 68 suggests “above average” usability and a score of 85 or above suggests “excellent” usability.
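The SUS conversion described above follows the scale’s standard scoring rules: each odd-numbered (positively worded) item contributes its response minus 1, each even-numbered (negatively worded) item contributes 5 minus its response, and the summed contributions (0–40) are multiplied by 2.5 to yield a 0–100 score. A minimal Python sketch of this calculation (the function name is illustrative, not from the study):

```python
def sus_score(responses):
    """Convert 10 SUS item responses (1-5 Likert) to a 0-100 score.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are multiplied by 2.5.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 responses, each between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i = 0 is item 1 (odd-numbered)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Best possible ratings (agree with positive items, disagree with negative
# items) yield the maximum score of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral response (3) on every item yields the scale midpoint of 50, which is distinct from the empirical “above average” threshold of 68 noted above.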
Analysis
Quantitative
We used descriptive statistics to characterize the study participants, task completion, and usability data. Analyses were conducted using R (http://www.r-project.org; version 4.1.0).
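The study performed these descriptive summaries in R; the Python fragment below, with made-up ratings and scores, sketches how the percent of tasks completed with ease and the mean SUS score for a round could be computed:

```python
from statistics import mean

# Illustrative (not actual) data: one facilitator rating per task attempt,
# coded 1 = completed with ease, 2 = completed with difficulty, 3 = failed.
round_ratings = [1, 1, 2, 1, 3, 1, 1, 2, 1, 3]
# Illustrative SUS scores for the 5 participants in the round.
round_sus = [85.0, 92.5, 70.0, 95.0, 90.0]

def pct_with_ease(ratings):
    """Percent of task attempts rated 'completed with ease' (rating 1)."""
    return 100 * sum(r == 1 for r in ratings) / len(ratings)

print(pct_with_ease(round_ratings))  # 60.0
print(mean(round_sus))               # 86.5
```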
Qualitative
We used a pragmatic approach to quickly analyze testing session transcripts for participant feedback and support changes to the prototype between rounds in a timely fashion.44 Between rounds, a member of the research team (author CR) reviewed interview transcripts and organized participants’ feedback by usability issues. We then identified usability themes and specific user-interface issues to (1) identify areas for improvement, (2) identify areas of satisfaction, and (3) ascertain the overall tone and perception of the users’ feedback about the intervention.
RESULTS
Design Sprint
Figure 2 shows selected patient-facing screenshots of the initial prototype that resulted from phase 4 of the Design Sprint and were subsequently used for round 1 of usability testing. The design includes patient portal notifications when diabetes monitoring and preventive services become due (Figure 2A and B). Notifications direct patients to a To Do list within the portal that presents tasks and explains their importance using language at or below a 6th grade reading level (Figure 2C). The To Do list item contains an “order” button patients can click to initiate an order for the corresponding care (Figure 2D and E). Using Epic’s Reporting Workbench, all patients initiating care via the portal are included in bulk orders placed for the corresponding services. Patients receive a confirmation message when the order is processed, with instructions on how to proceed (eg, go to the lab) (Figure 2F). Results are sent to the patient’s primary care physician via the EHR.
Figure 2.
Selected screenshots of the initial prototype.
Formative usability testing
Participant characteristics
A total of 15 participants completed the usability testing with 5 participants in each of the 3 rounds (Table 2). The average age of the participants was 60.5 (SD: 12.9) years, 60% were non-Hispanic White, 40% had less than a college degree, and 47% were currently employed. Most participants (73%) were prescribed insulin and the average HbA1c level was 7.8% (SD: 1.4%). About half (47%) had limited health literacy and 40% used the patient portal (MHAV) weekly.
Table 2.
Participant characteristics, n (%) or mean ± SD
| Characteristic | Total, N = 15 | Round 1, n = 5 | Round 2, n = 5 | Round 3, n = 5 |
|---|---|---|---|---|
| Gender, female | 8 (53) | 3 (60) | 4 (80) | 1 (20) |
| Age | 60.5 ± 12.9 | 62.1 ± 11.7 | 61.0 ± 17.0 | 58.2 ± 12.2 |
| Race/ethnicity | | | | |
| Non-Hispanic White | 9 (60) | 4 (80) | 2 (40) | 3 (60) |
| Non-Hispanic Black | 5 (33) | 1 (20) | 2 (40) | 2 (40) |
| Non-Hispanic Asian | 1 (7) | 0 (0) | 1 (20) | 0 (0) |
| Hispanic | 0 (0) | 0 (0) | 0 (0) | 0 (0) |
| Education | | | | |
| High-school graduate or GED | 1 (7) | 0 (0) | 0 (0) | 1 (20) |
| Some college or technical school | 5 (33) | 1 (20) | 3 (60) | 1 (20) |
| College degree | 3 (20) | 2 (40) | 0 (0) | 1 (20) |
| Graduate or professional degree | 6 (40) | 2 (40) | 2 (40) | 2 (40) |
| Employment status | | | | |
| Full time | 6 (40) | 2 (40) | 2 (40) | 2 (40) |
| Part time | 1 (7) | 0 (0) | 0 (0) | 1 (20) |
| Retired | 5 (33) | 1 (20) | 3 (60) | 1 (20) |
| Disabled, not able to work | 3 (20) | 2 (40) | 0 (0) | 1 (20) |
| Insurance status | | | | |
| Individual plan | 2 (13) | 0 (0) | 0 (0) | 2 (40) |
| Group plan via employer | 5 (33) | 2 (40) | 2 (40) | 1 (20) |
| Medicare | 8 (53) | 3 (60) | 3 (60) | 2 (40) |
| Years with diabetes | 21 ± 15 | 16 ± 9 | 27 ± 19 | 20 ± 15 |
| Uses insulin | 11 (73) | 5 (100) | 3 (60) | 3 (60) |
| Most recent hemoglobin A1c | 7.8 ± 1.4 | 8.7 ± 1.8 | 7.0 ± 0.4 | 7.7 ± 1.4 |
| Health literacya | | | | |
| Adequate | 8 (53) | 2 (40) | 2 (40) | 4 (80) |
| Limited | 7 (47) | 3 (60) | 3 (60) | 1 (20) |
| eHealth literacyb | 33 ± 4 | 35 ± 5 | 32 ± 2 | 32 ± 5 |
| Years of MHAV use | 8 ± 6 | 9 ± 5 | 10 ± 9 | 5 ± 0 |
| Frequency of MHAV use | | | | |
| Weekly | 6 (40) | 2 (40) | 3 (60) | 1 (20) |
| Monthly | 4 (27) | 3 (60) | 0 (0) | 1 (20) |
| Other | 5 (33) | 0 (0) | 2 (40) | 3 (60) |
Abbreviations: GED: General Education Development; MHAV: My Health at Vanderbilt.
a: Assessed using a validated 1-item scale to identify patients with limited health literacy: “How confident are you filling out medical forms?” Response options included 1 = not at all, 2 = quite a bit, 3 = somewhat, 4 = a little bit, and 5 = extremely. Consistent with prior research, participants reporting any lack of confidence (ie, response options 1–4) were classified as having limited health literacy.
b: Assessed using the eHealth Literacy Scale (eHEALS). Possible score range: 5–40. One participant did not respond to this measure.
Usability concerns and solutions
Table 3 summarizes the major usability issues identified during testing and the resulting changes in the prototype that were evaluated in subsequent testing rounds. Below we describe the issues and relevant solutions in more detail based on the 3 most prevalent categories.
Table 3.
Usability issues and respective solutions
| Issue type and description | Solution |
|---|---|
| Trust/Assurance | |
| • Skepticism about the automated system | Added language associating the initial To Do item and confirmation message with patients’ primary care doctor’s office |
| • Unclear how the system was integrated with the patient’s care team | Added clarity in To Do item content of who on the care team would be informed of test and exam results |
| Content | |
| • Hesitancy about “ordering” care and preference for “requesting” care | Changed wording |
| • Desire for a response option to decline “self-ordering” services | Added response option |
| • To Do list item was difficult to understand and contained too much information | Simplified language and clearly stated reasoning and importance for the care |
| Layout | |
| • Unclear how to advance to next screen from the To Do item | Changed wording |
| • Unclear how to enter dates for when services were completed | Added functionality that allowed patients to type in the dates |
| • Confusion about the order/request care screen and confirmation screen and where to click | Layout was inherent to system; tested alternative design to explore usability/reactions when no constraints |
Trust/understanding of process
Several participants expressed skepticism about the self-ordering process. Namely, patients were uncertain about how and why the alert for due services was generated; since it was not coming directly from their doctor, some were apprehensive about its legitimacy and accuracy. To help alleviate skepticism, we added language associating the initial To Do item with the patients’ doctor’s office (see Figure 3C). There was also confusion about integration of the system with the patients’ care team and who would get notified about the requests for care and care received. We edited the language in the To Do item and confirmation message to indicate that patients’ primary care physician (vs their endocrinologist) would be informed of the request and results (see Figure 3C and F).
Figure 3.
Selected screenshots of the final prototype.
Content/language
Participants pointed out several potential issues with how they were being prompted to order care. First, they were hesitant about “ordering” care, a function traditionally performed by clinicians. Instead, they expressed a preference for “requesting” care to make it easier to understand what they were being asked to do and provide reassurance that their doctor’s office would be involved in the process. Second, they felt it was important that an option was available to decline self-ordering care (eg, if they were not ready to order at that time or had an upcoming appointment where they expected to receive the care). Respective solutions to these concerns included changing the language so patients were prompted to “request” care and adding an option to decline ordering a service (Figure 3D).
A common concern from participants was that the language introducing each test or service was difficult to understand. We responded by consulting VUMC’s Effective Health Communication (EHC) Core and revising the language for several of the To Do list items which centered on simplifying the information and explaining the purpose of and importance of completing a screening or test. For example, instead of using the term “microalbumin” we stated, “It is time for your urine protein test.”
Layout
Some participants identified layout issues involving navigation (unclear how to proceed to the next screen), and others identified issues with the mode of entering data (scrolling through a wheel). When layout issues were resolvable within the constraints of the platform, we made modifications to improve usability. As one example, the screen where users indicate their eye exam was already completed outside VUMC was modified to clearly state the purpose of the task (“When was your diabetes eye exam done?”) and to allow patients to type in the date the exam was completed (Figure 4).
Figure 4.
Prototype screenshots illustrating a layout change made in response to user feedback.
However, some layout issues involved fixed elements in the Epic system, which we did not have the ability to change. Most issues stemmed from our repurposing of the questionnaire template built into Epic (described in Design Sprint) which involved elements not well-suited for this purpose. For example, we created To Do list tasks that amounted to 1-item “questionnaires” (ie, Do you want to order/request this health maintenance item?) (see Figure 2D). Unfortunately, using the template in this novel way meant the single question was immediately followed by a built-in survey response confirmation screen asking users to review their answer and confirm (see Figure 2E). This redundancy caused some participants confusion and frustration about the order/request and confirmation screens (eg, extra clicks, repetitive words). In addition, some participants felt the order screen contained too much information and were confused about where to click to request the service. Although we were able to make small changes to the wording on these screens to help with clarity (Figure 3D and E), fixed elements in the template prevented us from modifying the layout.
Task performance
Supplementary Table S1 includes task performance ratings for each task across all 3 rounds. Task performance generally improved across each round of testing, such that tasks were more likely to be completed with ease and less likely to be completed with difficulty or failed from round 1 to round 3. Three tasks did show a drop in performance from round 1 to round 2 (ie, Interpret A1c, Interpret Pneumonia Vaccine, and Pneumonia Vaccine Confirmation), but after subsequent design modifications, participants’ performance rebounded in round 3 to at or above performance in round 1. Because tasks were presented in the same sequence and many of the tasks were similar across several of the topics (A1c, microalbumin, eye exam, and pneumonia vaccine), task performance did improve within the same round of testing with repeated exposure to the same set of screen templates. Task performance was high across rounds 1 and 2 for Interpret Eye Exam, Request Eye Exam, Eye Exam Confirmation, and Complete Pneumonia Vaccine. Therefore, these tasks were eliminated from round 3. Figure 5 shows changes in task performance from the initial (round 1) to the final prototype (round 3) for the 7 tasks we assessed across all rounds. Complete A1c had lower task performance in all 3 rounds; this was due to the inherent usability challenges of using the questionnaire template for a novel purpose as described above coupled with the fact that this task was the first exposure to the template for all participants.
Figure 5.
Task performance ratings in rounds 1 and 3. Task performance was high across rounds 1 and 2 for Interpret Eye Exam, Request Eye Exam, Eye Exam Confirmation, and Complete Pneumonia Vaccine. Therefore, these tasks were not included in round 3 and are not shown in this figure.
Usability ratings
The average SUS in round 1 was 87 (SD: 18), which corresponds to a percentile ranking of 97%. In round 2, the average score dropped to 74 (SD: 19; percentile ranking: 70%) and then increased to 93 (SD: 7; percentile ranking: 99.9%) in round 3. The decreased SUS scores in round 2 may be attributable to concerns about “self-ordering” versus “requesting” care and trust in the system, which were not expressed by participants in the first round. Those concerns were addressed in the round 3 (final) prototype, which received the highest mean SUS score.
Overall perceptions
Table 4 includes themes and representative quotes summarizing participants’ overall perceptions of the intervention. In general, impressions were very positive. Participants appreciated the reminder when services were due, which they said would make them more likely to initiate the order for care. Most indicated they would use the system in the real world. Participants also emphasized the novelty of the system. Several pointed out how this process was different from how they typically received tests and services. Lastly, there were a few participants who said they might not use the program themselves. These participants mentioned how they regularly attended doctor visits so would just complete the tests there; however, they did still point out the value for others.
Table 4.
Representative quotes from patients as part of usability testing
| General theme | Quotes |
|---|---|
| Helpfulness | “I really enjoyed learning the new feature because [it] was very helpful. I would use it. When they send a notification to me about it, I would definitely go ahead and order my labs or order an exam that I need.” [P31] |
| “I think it would be very helpful. As I say, I would use it a whole lot to keep up with my care. I think you need to. I think the things that you’re talking about…are things that diabetes patients need to be aware of and need to keep up with. So, I’m impressed with it, I think it’s very good.” [P58] | |
| Novelty of process | “I have never ordered a test without being told…’Hey, I'm going to order these tests,’ or, ‘I want you to take this slip of paper down to the lab on the next floor down…and they’re going to take some blood.’ That's generally the way tests are done.” [P10] |
| “What we do now is when I see [my endocrinologist] or [my PCP], they assign us to go to the lab and get our blood work done so this is somewhat different from what the norm is.” [P29] | |
| Already receive care regularly | “When I go to see [my diabetes doctor], they automatically do an A1c so I really don’t need a reminder…My primary doctor asks me that every time when I see him, so it’s kind of redundant.” [P48] |
| “…I go regularly, and so [my doctor] initiates it because I see him every three months. But for the general public, which I know is horrible about going and doing this, I think it's a good concept, a good trigger for them to go get their hemoglobin A1c ordered.” [P45] |
Unconstrained alternative design
As referenced above, there were some layout concerns involving fixed elements of the EHR and our repurposing of Epic’s questionnaire template that we were unable to address in our revisions. Therefore, we chose to explore an unconstrained design with round 3 participants at the end of their session. Although the changes reflected in the unconstrained design were not possible in the current system, our goal was to explore users’ reactions if these constraints were not in place and possibly inform future EHR system upgrades. The main changes in the unconstrained design included linking to additional information (rather than including all information on the order screen) and adding buttons that clearly designated where users should click to request care (see Supplementary Figure S1). In round 3, after data collection for the primary design (reported above) was complete, participants were presented with the unconstrained design and asked for their thoughts and feedback. All 5 participants preferred the alternative design and agreed the information was clearer, more concise, and easier to understand.
DISCUSSION
Principal findings
To improve the rates at which patients receive diabetes monitoring and preventive services, solutions are needed that improve clinical efficiency, are appropriate for patients with varying levels of health literacy, and are highly scalable and sustainable. We designed a novel patient portal intervention to empower patients with diabetes to initiate orders for guideline-based diabetes monitoring and preventive care.45 We used a user-centered Design Sprint methodology to design a prototype of the intervention and then conducted 3 rounds of formative usability testing to iteratively identify and resolve issues. Overall, task performance and usability ratings improved from the initial to the final prototype. In addition, participants generally felt the tool had value and would facilitate receiving diabetes services. Most concerns relevant to trust and content were resolvable; however, some layout issues could not be addressed due to design constraints inherent to the EHR system.
Although still a relatively new concept, patient portal functionality that allows patients to self-schedule appointments and initiate clinical services is starting to emerge. A recent review explored barriers and facilitators of patient self-scheduling in health care; however, none of the studies included in the review mentioned usability testing.46 Considering low technology adoption and digital literacy among certain groups,16,18,47 involving these users early in the design process is imperative. Similar to our study, the review of patient self-scheduling in health care did identify trust in the system as 1 factor that would impact use. Unlike most studies in the review, which focus on self-scheduling appointments for general health services, our intervention focuses on a specific disease context and aims to improve understanding around the importance of services relevant to that disease. Although we designed the system to target diabetes, it can be adapted for preventive care and monitoring of other chronic diseases. This study reflects a first step in a larger plan to eventually evaluate the effects of the intervention on patient outcomes. Starting with formative testing helps ensure the intervention meets users’ needs in advance of the trial and reduces the chance that usability issues will interfere with efficacy.19,23,34
One of the challenges, and advantages, of our intervention was working within constraints inherent to our vendor-supplied EHR system. Namely, participants requested changes to the interface that we were unable to accommodate due to specifications of the built system. Other researchers who embed interventions in patient portals may experience similar difficulties. However, there are several advantages to leveraging existing functionality within an EHR vendor’s system when developing interventions. Compared to external builds (eg, SMART on FHIR apps), which can become out of date or out of sync over time, infrastructure already present within the health care setting can help minimize costs and greatly facilitate sustainability and scalability. Notably, the EHR system influenced both our original prototype and our response to usability issues. By testing our alternate design, with a layout unconstrained by an underlying EHR platform, we explored users’ reactions to a user interface that could inform EHR vendors’ product development. For structural issues we could not change, we used findings to inform training materials and onboarding processes for an upcoming clinical trial by emphasizing points in the process known to create difficulties for patients.
Limitations
Our study has several limitations. First, our study included a small number of participants recruited from a single, urban medical center, which may limit the generalizability of our findings. Despite the smaller sample, our purposive sampling approach allowed representation from older adults and those with limited health literacy, who tend to experience usability challenges. Our study did not include participants who were Hispanic/Latinx, a community disproportionately affected by type 2 diabetes; future work should focus on increasing their representation. Task presentation order was not counterbalanced, which may have affected our results; however, this was necessary because each task built on the previous one. Observing participants while they interacted with the prototype allowed us to uncover usability issues; however, knowing they were being observed may have altered participants’ behavior (ie, Hawthorne effect). In addition, while the session facilitator stressed the importance and value of honest feedback, including criticism, participants’ statements and ratings of the app’s usability may have been affected by a desire to please the research team (ie, social desirability bias). Relatedly, completing the usability sessions remotely and not on an actual phone may have contributed to some of the difficulty participants experienced navigating the prototype. Additional research focused on patients’ real-world use of the app, including user engagement over time, may reveal more about the user experience.
CONCLUSIONS
Through iterative user-centered design and testing, we substantially improved the usability of a patient portal intervention, helping ensure it works as intended so patients can reap its potential benefits. A tool that engages and empowers patients to self-order disease monitoring and preventive services as part of their existing patient portal account has potential to enhance completion of recommended health services and improve clinical outcomes. We hope this innovative approach improves disease self-management support while enhancing access to health services between visits.
Contributor Information
Lyndsay A Nelson, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Carrie Reale, Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Shilo Anders, Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, Tennessee, USA; Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Russ Beebe, Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
S Trent Rosenbloom, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA; Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Amber Hackstadt, Department of Biostatistics, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Kryseana J Harper, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Lindsay S Mayberry, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA; Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Jared G Cobb, HealthIT, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Neeraja Peterson, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Tom Elasy, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
Zhihong Yu, Department of Biostatistics, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
William Martinez, Department of Medicine, Vanderbilt University Medical Center, Nashville, Tennessee, USA.
FUNDING
This research was funded by the National Institutes of Health/National Institute of Diabetes and Digestive and Kidney Diseases (NIH/NIDDK) R18-DK123373. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
AUTHOR CONTRIBUTIONS
CR, SA, RB, STR, LSM, JGC, NP, TE, and WM contributed to the conception or design of the intervention prototype. CR, SA, STR, TE, and WM contributed to study design and methods. CR, KJH, and NP contributed to participant recruitment or acquisition of data. LAN, CR, SA, AH, ZY, and WM contributed to the analysis and interpretation of data. CR, SA, RB, STR, KJH, LSM, JGC, NP, and WM contributed to intervention prototype revisions based on study findings. All authors contributed to drafting the article or revising it critically for important intellectual content and provided final approval of the submitted manuscript. WM oversaw the study and takes full responsibility for the work.
SUPPLEMENTARY MATERIAL
Supplementary material is available at JAMIA Open online.
CONFLICT OF INTEREST STATEMENT
None declared.
DATA AVAILABILITY
The de-identified participant data underlying this article will be shared on reasonable request to the corresponding author.
REFERENCES
- 1. Draznin B, Aroda VR, Bakris G, et al. Comprehensive medical evaluation and assessment of comorbidities: standards of medical care in diabetes. Diabetes Care 2022; 45 (Suppl. 1): S46–S59.
- 2. Nam S, Chesla C, Stotts NA, et al. Barriers to diabetes management: patient and provider factors. Diabetes Res Clin Pract 2011; 93 (1): 1–9.
- 3. Christofides EA, Desai N. Optimal early diagnosis and monitoring of diabetic kidney disease in type 2 diabetes mellitus: addressing the barriers to albuminuria testing. J Prim Care Community Health 2021; 12: 215013272110036.
- 4. Fairless E, Nwanyanwu K. Barriers to and facilitators of diabetic retinopathy screening utilization in a high-risk population. J Racial Ethn Health Disparities 2019; 6 (6): 1244–9.
- 5. Privett N, Guerrier S. Estimation of the time needed to deliver the 2020 USPSTF preventive care recommendations in primary care. Am J Public Health 2021; 111 (1): 145–9.
- 6. Alberti H, Boudriga N, Nabli M. Primary care management of diabetes in a low/middle income country: a multi-method, qualitative study of barriers and facilitators to care. BMC Fam Pract 2007; 8: 63.
- 7. National Committee for Quality Assurance. Comprehensive diabetes care (CDC). https://www.ncqa.org/hedis/measures/comprehensive-diabetes-care/. Accessed June 1, 2022.
- 8. U.S. Government Accountability Office. Health information technology: HHS should assess the effectiveness of its efforts to enhance patient access to and use of electronic health information. https://www.gao.gov/products/gao-17-305. Accessed March 14, 2022.
- 9. Antonio MG, Petrovskaya O, Lau F. The state of evidence in patient portals: umbrella review. J Med Internet Res 2020; 22 (11): e23851.
- 10. Crockett S, Carter-Templeton H. A portal adoption program to enhance patient satisfaction in primary care. Comput Inform Nurs 2020; 38 (8): 402–7.
- 11. Johnson C, Richwine C, Patel V. Individuals’ access and use of patient portals and smartphone health apps, 2020. ONC Data Brief, no. 57. Washington, DC: Office of the National Coordinator for Health Information Technology; 2021.
- 12. North F, Nelson EM, Majerus RJ, et al. Impact of web-based self-scheduling on finalization of well-child appointments in a primary care setting: retrospective comparison study. JMIR Med Inform 2021; 9 (3): e23450.
- 13. Ueberroth BE, Labonte HR, Wallace MR. Impact of patient portal messaging reminders with self-scheduling option on influenza vaccination rates: a prospective, randomized trial. J Gen Intern Med 2022; 37 (6): 1394–9.
- 14. Judson TJ, Odisho AY, Neinstein AB, et al. Rapid design and implementation of an integrated patient self-triage and self-scheduling tool for COVID-19. J Am Med Inform Assoc 2020; 27 (6): 860–6.
- 15. Hahn EE, Baecker A, Shen E, et al. A patient portal-based commitment device to improve adherence with screening for colorectal cancer: a retrospective observational study. J Gen Intern Med 2021; 36 (4): 952–60.
- 16. Coughlin SS, Stewart JL, Young L, et al. Health literacy and patient web portals. Int J Med Inform 2018; 113: 43–8.
- 17. Wildenbos GA, Maasri K, Jaspers M, et al. Older adults using a patient portal: registration and experiences, one year after implementation. Digital Health 2018; 4 (1–9): 205520761879788.
- 18. Lyles CR, Nelson EC, Frampton S, et al. Using electronic health record portals to improve patient engagement: research priorities and best practices. Ann Intern Med 2020; 172 (11 Suppl.): S123–S129.
- 19. Lyles CR, Sarkar U, Osborn CY. Getting a technology-based diabetes intervention ready for prime time: a review of usability testing studies. Curr Diab Rep 2014; 14 (10): 534.
- 20. Knapp J, Zeratsky J, Kowitz B. Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days. New York, NY: Simon and Schuster; 2016.
- 21. Martinez W, Threatt AL, Rosenbloom ST, et al. A patient-facing diabetes dashboard embedded in a patient web portal: Design Sprint and usability testing. JMIR Hum Factors 2018; 5 (3): e26.
- 22. Jake-Schoffman DE, McVay MA. Using the Design Sprint process to enhance and accelerate behavioral medicine progress: a case study and guidance. Transl Behav Med 2021; 11 (5): 1099–106.
- 23. Nelson LA, Threatt AL, Martinez W, et al. Agile science: what and how in digital diabetes research. In: Klonoff DC, Kerr D, Mulvaney SA, eds. Diabetes Digital Health. Cambridge, MA: Elsevier; 2020: 51–63.
- 24. Steitz BD, Wong JIS, Cobb JG, et al. Policies and procedures governing patient portal use at an academic medical center. JAMIA Open 2019; 2 (4): 479–88.
- 25. Banfield R, Lombardo CT, Wax T. Design Sprint: A Practical Guidebook for Building Great Digital Products. Sebastopol, CA: O'Reilly Media, Inc.; 2015.
- 26. Amante DJ, Hogan TP, Pagoto SL, et al. A systematic review of electronic portal usage among patients with diabetes. Diabetes Technol Ther 2014; 16 (11): 784–93.
- 27. Eng DS, Lee JM. The promise and peril of mobile health applications for diabetes and endocrinology. Pediatr Diabetes 2013; 14 (4): 231–8.
- 28. Kuo A, Dang S. Secure messaging in electronic health records and its impact on diabetes clinical outcomes: a systematic review. Telemed J E Health 2016; 22 (9): 769–77.
- 29. Osborn CY, Mayberry LS, Mulvaney SA, et al. Patient web portals to improve diabetes outcomes: a systematic review. Curr Diab Rep 2010; 10 (6): 422–35.
- 30. Payne HE, Lister C, West JH, et al. Behavioral functionality of mobile apps in health interventions: a systematic review of the literature. JMIR Mhealth Uhealth 2015; 3 (1): e20.
- 31. Martínez-Pérez B, de la Torre-Díez I, López-Coronado M. Mobile health applications for the most prevalent conditions by the World Health Organization: review and analysis. J Med Internet Res 2013; 15 (6): e120.
- 32. Nielsen J. Why You Only Need to Test with 5 Users. Nielsen Norman Group; 2022. https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/. Accessed April 20, 2023.
- 33. Virzi RA. Refining the test phase of usability evaluation: how many subjects is enough? Hum Factors 1992; 34 (4): 457–68.
- 34. Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: a scoping review. Int J Med Inform 2019; 126: 95–104.
- 35. Sarkar U, Piette JD, Gonzales R, et al. Preferences for self-management support: findings from a survey of diabetes patients in safety-net health systems. Patient Educ Couns 2008; 70 (1): 102–10.
- 36. Tieu L, Sarkar U, Schillinger D, et al. Barriers and facilitators to online portal use among patients and caregivers in a safety net health care system: a qualitative study. J Med Internet Res 2015; 17 (12): e275.
- 37. Taha J, Sharit J, Czaja SJ. The impact of numeracy ability and technology skills on older adults’ performance of health management tasks using a patient portal. J Appl Gerontol 2014; 33 (4): 416–36.
- 38. Van Someren M, Barnard YF, Sandberg J. The Think Aloud Method: A Practical Approach to Modelling Cognitive. Vol. 11. London: Academic Press; 1994.
- 39. Wood LE. Semi-structured interviewing for user-centered design. Interactions 1997; 4 (2): 48–61.
- 40. Brooke J. SUS: a ‘quick and dirty’ usability. In: Jordan PW, Thomas B, Weerdmeester BA, McClelland IL, eds. Usability Evaluation in Industry. London: Taylor & Francis; 1996: 189–94.
- 41. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int J Hum Comput Interact 2008; 24 (6): 574–94.
- 42. Norman CD, Skinner HA. eHEALS: the eHealth literacy scale. J Med Internet Res 2006; 8 (4): e27.
- 43. Sauro J. A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices. Denver, CO: Measuring Usability LLC; 2011.
- 44. Nelson LA, Mayberry LS, Wallston K, et al. Development and usability of REACH: a tailored theory-based text messaging intervention for disadvantaged adults with type 2 diabetes. JMIR Hum Factors 2016; 3 (2): e23.
- 45. American Diabetes Association. Standards of medical care in diabetes—2022 abridged for primary care providers. Clin Diabetes 2022; 40 (1): 10–38.
- 46. Woodcock EW. Barriers to and facilitators of automated patient self-scheduling for health care organizations: scoping review. J Med Internet Res 2022; 24 (1): e28323.
- 47. Son H, Nahm E-S. Older adults’ experience using patient portals in communities: challenges and opportunities. Comput Inform Nurs 2019; 37 (1): 4–10.