AMIA Annual Symposium Proceedings. 2010 Nov 13;2010:702–706.

A Rapid Usability Evaluation (RUE) Method for Health Information Technology

Alissa L Russ 1–3, Darrell A Baker 4, W Jeffrey Fahner 4, Bryce S Milligan 4, LeeAnn Cox 4, Heather K Hagg 4, Jason J Saleem 1–3,5
PMCID: PMC3041446  PMID: 21347069

Abstract

Usability testing can help generate design ideas to enhance the quality and safety of health information technology. Despite these potential benefits, few healthcare organizations conduct systematic usability testing prior to software implementation. We used a Rapid Usability Evaluation (RUE) method to apply usability testing to software development at a major VA Medical Center. We describe the development of the RUE method, provide two examples of how it was successfully applied, and discuss key insights gained from this work. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE methods are relatively simple, do not require advanced training or usability software, and should be easy to adopt. Other healthcare organizations may be able to implement RUE to improve software effectiveness, efficiency, and safety.

Introduction

Widespread evidence indicates that healthcare providers and patients could benefit from enhanced usability of electronic health records (EHRs).1, 2 Poor usability can lead to a variety of undesirable effects, ranging from end-user dissatisfaction and failed EHR implementations to endangered patient safety.3–5 If health informaticists incorporate usability testing methods into their design work, they may enhance the adoption, end-user satisfaction, and safety of EHRs.

Several researchers have conducted usability studies in health informatics.4, 6 These efforts are important, but they are often designed to generate research-level data, and it is unclear whether such relatively formal methods would or could be readily adopted by the informaticists responsible for the actual design and implementation of EHRs. Informaticists’ work is complex and demanding, and they may have little to no usability training.

Recently, a major Veterans Affairs Medical Center (VAMC) implemented a Rapid Usability Evaluation (RUE) method to aid informaticists’ work and improve software designs. RUE minimizes the time and effort required for assessment, promoting the practical application of usability testing in VA operations. Clinical informaticists conducted RUE assessments and provided feedback on the strengths and weaknesses of its practical application. The methods and insights from this work may aid other healthcare organizations and promote the adoption of usability testing among health informatics designers.

Methods

RUE was developed via collaboration between VA human factors engineering professionals (AR, JS) and clinical informatics leaders in hospital operations and systems redesign (DB, JF). Existing usability testing methods were selected and adapted based on several factors: ease of adoption, capacity to provide rich usability data, and ability to generate rapid findings. The RUE approach is outlined in Table 1. At this VAMC, we had the option of conducting RUE in the natural hospital setting or within a formal usability laboratory. The laboratory has several computer workstations, an experimenter’s station, and Morae® advanced usability testing software. This software allows the experimenter to view the testing process remotely, minimizing bias during the assessment, and provides video capture, immediate playback, and the ability to electronically log usability events in real time.

Table 1.

Detailed overview of the Rapid Usability Evaluation (RUE) method.

Preparation
1. Develop Scenarios: Work with front-line staff/clinical experts to create one or more fictitious scenarios that mimic actual patient care tasks. To minimize time requirements, scenarios should focus on the aspects of the tool(s) that will be tested, but should be sufficiently detailed to provide a logical context for decision-making.
2. Identify Volunteers: Recruit volunteers who represent the target end-user population (e.g., if it is a tool for inpatient nurses, recruit nurse volunteers from inpatient wards).
3. Choose Test Environment: Consider the pros/cons and feasibility of conducting RUE in an isolated laboratory environment vs. a real-world hospital setting.

Conduct RUE
4. Introduce RUE: Explain the purpose of the usability test to the volunteer and gather relevant information, such as professional training, years of computer experience, etc.
5. Explain “Think Aloud”3, 7: Demonstrate the “Think Aloud” technique. Ask the volunteer to verbalize reactions and thought processes while working through the tool.
6. Start Scenarios: Ask the volunteer to complete the scenario(s) on his/her own, using the new tool; s/he should “Think Aloud” during the entire scenario. Avoid guiding the volunteer through the tool or offering any advice on how it should be used. Remind the volunteer to continue verbalizing thought processes.
7. Record Events: Record emergent usability issues. Some examples include: becoming “stuck”; statements of confusion; committing an error (e.g., choosing the “wrong thing”); and misinterpreting something. Note these in real time by recording on paper or capturing with video software, as permitted by the volunteer.
8. Debrief & Invite Feedback: Clarify any ambiguous usability issues and elicit open-ended input. As applicable, review video alongside the volunteer and ask questions to clarify specific usability issues. Invite the volunteer to explain what s/he liked and did not like about the tool and what changes may improve the design.

RUE Analysis
9. Compile & Prioritize Findings: Review data across volunteers. Usability issues can be prioritized as: 1) must be fixed prior to actual use; 2) high impact on usability or patient care/low effort to address; 3) high impact/high effort; and 4) low impact/low effort.
10. Share Summary Report: Create a summary report that includes the purpose of testing, test conditions, prioritized usability findings, topics for end-user education, etc.
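
To make the logging and prioritization steps (Steps 7 and 9) concrete, the minimal Python sketch below shows one way an observer’s notes might be captured and sorted. It is purely illustrative: the names (RueSession, UsabilityIssue, Priority) are hypothetical and not part of the published method, which used Morae® or paper notes.

from dataclasses import dataclass, field
from enum import IntEnum

class Priority(IntEnum):
    # Four-level scheme from Step 9 of Table 1
    MUST_FIX_BEFORE_USE = 1
    HIGH_IMPACT_LOW_EFFORT = 2
    HIGH_IMPACT_HIGH_EFFORT = 3
    LOW_IMPACT_LOW_EFFORT = 4

@dataclass
class UsabilityIssue:
    volunteer_id: str   # anonymized label for the volunteer
    description: str    # e.g., became "stuck", statement of confusion, error
    priority: Priority

@dataclass
class RueSession:
    tool_name: str
    issues: list = field(default_factory=list)

    def log(self, volunteer_id, description, priority):
        # Step 7: record emergent usability issues in real time
        self.issues.append(UsabilityIssue(volunteer_id, description, priority))

    def prioritized(self):
        # Step 9: review and sort findings across volunteers
        return sorted(self.issues, key=lambda issue: issue.priority)

session = RueSession("Anticipated Discharge Note")
session.log("MD-01", "misread a template field label", Priority.MUST_FIX_BEFORE_USE)
session.log("MD-02", "could not find where to enter the discharge date",
            Priority.HIGH_IMPACT_LOW_EFFORT)
for issue in session.prioritized():
    print(f"[{issue.priority.name}] {issue.volunteer_id}: {issue.description}")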

The RUE method includes a “Think Aloud” technique,3, 7 derived from the field of human factors, that provides rich, qualitative data and can be readily adopted by individuals without formal usability training. Think Aloud uncovers cognitive processes, points of confusion, and similar insights. For this technique, representative end-users are asked to verbalize their thought processes while concurrently working through realistic work tasks.3 Usability issues can be recorded in real time by an observer. To prioritize RUE results, the VAMC Chief Health Informatics Officer (DB) created a useful summary report format (Items 9 & 10, Table 1). We describe how VA clinical informaticists applied Rapid Usability Evaluation to two informatics tools prior to implementation. We did not seek Institutional Review Board approval since this work was for quality improvement purposes; in this paper we focus on the RUE methods, their application, and lessons learned.
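
As a companion sketch, the Python fragment below shows how findings pooled across volunteers might be compiled into the kind of summary report Step 10 describes. The report fields follow Table 1; the actual report format created at the VAMC was not published, so the layout here is an assumption.

from collections import defaultdict

# Labels follow the Step 9 prioritization scheme in Table 1
PRIORITY_LABELS = {
    1: "Must be fixed prior to actual use",
    2: "High impact / low effort",
    3: "High impact / high effort",
    4: "Low impact / low effort",
}

def summary_report(purpose, conditions, issues, education_topics):
    # issues: list of (priority, description) tuples pooled across volunteers
    by_priority = defaultdict(list)
    for priority, description in issues:
        by_priority[priority].append(description)
    lines = [f"Purpose of testing: {purpose}",
             f"Test conditions: {conditions}", ""]
    for priority in sorted(by_priority):
        lines.append(PRIORITY_LABELS[priority] + ":")
        lines.extend(f"  - {d}" for d in by_priority[priority])
    lines.append("")
    lines.append("Topics for end-user education:")
    lines.extend(f"  - {t}" for t in education_topics)
    return "\n".join(lines)

print(summary_report(
    purpose="Evaluate the new discharge template before implementation",
    conditions="Usability laboratory; one scenario; Think Aloud protocol",
    issues=[(2, "Discharge date field hard to locate"),
            (1, "Template field label misread")],
    education_topics=["How to mark discharge criteria as complete"],
))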

Application Examples

Example 1:

Inpatient discharge, where hospitalized patients transition to care in their home, a rehabilitation facility, or an extended care facility, is a highly complex process. At this VAMC, the discharge process involves asynchronous activities across nursing, pharmacy, inpatient residents, attending physicians, social workers, and other services. For an improvement project in July 2009, clinical informaticists designed a new EHR template intended to help coordinate discharge planning activities.

RUE was first used to evaluate this template prior to implementation. For usability training, researchers prepared and distributed a one-page summary of proposed RUE methods to informaticists and then held a single, one-hour meeting to answer questions, further refine RUE methods based on informaticists’ input, and demonstrate available usability software. One clinical informaticist (DB) worked closely with physicians to construct a short, realistic test scenario (Figure 1). Volunteers were identified from the intended end-user population (i.e., physicians and residents) and invited to try the new template. RUE was conducted by informaticists (DB, JF) in the VAMC’s usability laboratory. The primary software developer (JF) sat at the observer’s station, away from the volunteer, and recorded usability issues in real time in Morae® while viewing the volunteer’s progress remotely. Human factors engineering professionals (AR, JS) provided assistance as needed.

Figure 1.

Usability scenario developed to apply RUE to a new discharge informatics tool, the Anticipated Discharge Note. Actual patient data were not used in this scenario, but the information was intended to mimic a potential real-life situation. Notes in square brackets [ ] were added for the purposes of this paper.

During RUE, end-users became actively engaged in the software evaluation process and provided valuable, practical feedback. RUE revealed 21 usability issues; at least 18 were addressed via software redesigns prior to implementation. Usability testing uncovered important information about the tool and about end-users’ cognitive assumptions and decision-making processes. Moreover, tests exposed issues associated with other EHR applications that influenced physicians’ interactions with the tool. Each usability session lasted ∼45 min, and the entire process required 8 hrs of work: RUE training (1 hr), scenario development (2 hrs), conducting RUE (3 hrs), and RUE analysis (2 hrs). The final summary report was shared with the discharge project team.

Example 2:

Later, in Jan 2010, a clinical informaticist (JF) elected to use RUE before implementing computerized provider order entry (CPOE) in the Emergency Department (ED). At the time of this evaluation, ED orders were paper-based, and RUE was initiated to assess the usability of electronic orders and ordering menus. In this case, potential volunteers were unwilling or unable to complete RUE in an environment outside the ED; consequently, RUE was conducted in the physicians’ documentation room. The observer (JF) sat next to the volunteer while s/he completed the RUE scenario(s) for ordering. Attempts were made to encourage each volunteer to complete the scenario(s) on his/her own, but volunteers frequently requested assistance from the clinical informaticist while working through the RUE scenario(s).

Usability issues were documented in real time on paper and then typed into a summary report. RUE revealed 15 usability issues; 10 were marked as ‘must do’ prior to implementation, and 7 items were tagged for user education. The total time was 6.5 hrs: general preparation (2 hrs), scenario development (2 hrs), conducting RUE (2 hrs), and RUE analysis (30 min).
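
For readers budgeting their own evaluations, the reported effort breakdowns from both examples can be tallied directly. The figures below come from the text; the dictionary structure is just for illustration.

# Effort figures reported in the text for each example (hours)
example1 = {"RUE training": 1, "scenario development": 2,
            "conducting RUE": 3, "RUE analysis": 2}
example2 = {"general preparation": 2, "scenario development": 2,
            "conducting RUE": 2, "RUE analysis": 0.5}

for name, budget in (("Example 1", example1), ("Example 2", example2)):
    print(f"{name}: {sum(budget.values())} hrs total")  # 8 and 6.5 hrs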

Discussion

Rapid Usability Evaluation offers several advantages compared to other forms of usability testing or strategies to elicit user input. RUE methods are easy to learn, rapid, and low-cost (e.g., data can be recorded on paper). More rigorous usability tests can capture research-level data,3, 6, 8 but may be difficult for designers to apply in practice because of the amount of time and effort required. In healthcare organizations, including the VA, end-users are commonly invited to review software tools and provide verbal feedback to the design team. In these instances, end-users try to envision how they will interact with the system; this subjective feedback often does not reflect actual human-computer interactions. RUE provides a safe environment, and a more systematic approach, for assessing software designs. Informaticists can observe the human-computer interaction directly and quickly assess some of the strengths and weaknesses of software designs. Usability testing reveals how the user actually interacts with the application, which may be considerably different from how developers envisioned the interaction. Through RUE, developers can rapidly evaluate the software’s potential impact on workflow and patient safety and prioritize modifications. Finally, experts seeking to develop best practices and EHR certification requirements may be able to use RUE, with its associated categorization scheme and summary report format (Table 1), to collect generalized information across EHR systems.

Several factors may have contributed to the success of RUE adoption at this VAMC. Informaticists had previous exposure to human factors engineering concepts via a short course in 2009. Moreover, the Chief of Systems Redesign (HH) had experience in VA operations and HSR&D and encouraged interactions between these experts, which likely promoted RUE development and implementation. Lastly, and perhaps most importantly, VAMC clinical informaticists were curious about usability testing methods and wanted to find a way to apply these techniques to their work. They elected to continue to use RUE and are planning training sessions to spread this method across VA facilities; these voluntary efforts indicate that RUE improves the software evaluation process and is valuable for their work.

Key insights were gained through our efforts to incorporate RUE into the software design process. There are several trade-offs to consider when preparing for RUE. Conducting RUE in a controlled laboratory environment can minimize bias by separating the observer from the volunteer, facilitate rapid review of usability issues via video recording, and improve the efficiency of data collection by allowing the observer to record data electronically in real time. However, even when the end-user population and laboratory are located within the same facility, it can still be challenging to find volunteers willing to leave their work area. In Example 1, volunteers were paged when they did not show up at the designated time, and others were no-shows. It may be helpful to schedule volunteers for RUE but page them only when everything is ready, to save both volunteers’ and informaticists’ time.

Alternatively, RUE can be conducted in the volunteer’s normal clinical environment to facilitate recruitment. Data can be recorded in a paper notebook, cheaply and without specialized equipment. However, in Example 2, volunteers seemed to expect assistance while working through the software tool, presumably because informaticists normally offer this type of support as part of their other work obligations. Further, inherent clinical interruptions may make it harder to conduct RUE, but they may also reveal important patient safety issues that could arise upon deployment. As the observer gains more experience, it may become easier to conduct RUE in a clinic setting. If needed, RUE may be performed in an exam room or private office area within the clinic to help mitigate distractions.

In Examples 1 and 2, the primary software developer was present during RUE assessment. Overall, this seemed advantageous. The developer could see usability issues directly and in real-time, eliminating the need for an ‘intermediate messenger’. This may save time and potentially increase the salience and acceptance of usability concerns. In addition, the developer has the greatest understanding of the tool design, how it is intended to function, and how it interacts with other components of the EHR. If a third party conducted RUE, there may be less bias, but this valuable expertise would be lost and some usability issues may not be apparent. In our tests, there were cases where the developer recognized an important usability issue that was not perceived by other team members. If the developer is not involved in the RUE analysis, it may still be possible to capture video data and meet with the developer to illustrate key usability examples or discuss complex issues.

There are also trade-offs associated with the number of scenarios and volunteers. Increasing either variable increases the time and effort required. In Example 1, we decided to minimize the effort for RUE preparation by creating a single scenario. This strategy is advantageous when one patient scenario can be used to evaluate several aspects of the software tool. We also designed the scenario to evaluate only the work processes most relevant to the tool being tested. However, in situations where the robustness of the software tool is critical (e.g., Example 2), it may be more beneficial to test several different scenarios, even if time limitations constrain the evaluation to fewer end-users.
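
As a rough illustration of this trade-off, total effort grows with both the number of volunteers and the number of scenarios, but scenario preparation scales with scenarios alone. The Python sketch below uses the ∼45-minute session length from Example 1; the per-scenario preparation and analysis times are assumptions for illustration, not figures from this project.

def rue_effort_hrs(n_volunteers, n_scenarios, min_per_session_scenario=45,
                   prep_hrs_per_scenario=2.0, analysis_hrs=2.0):
    # Session time grows with volunteers x scenarios; preparation grows
    # with scenarios alone (assumed values, for illustration only)
    session_hrs = n_volunteers * n_scenarios * min_per_session_scenario / 60
    return n_scenarios * prep_hrs_per_scenario + session_hrs + analysis_hrs

# One scenario, four volunteers (roughly the shape of Example 1):
print(f"{rue_effort_hrs(4, 1):.1f} hrs")   # 7.0
# Three scenarios, two volunteers: more preparation, fewer end-users:
print(f"{rue_effort_hrs(2, 3):.1f} hrs")   # 12.5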

Although RUE results were valuable for efforts at this VAMC, there are some limitations to this approach that warrant consideration. First, a 6+ hr time commitment may still be challenging for some quality improvement efforts and software designers. The time needed to create realistic scenarios is one disadvantage of using Think Aloud in the healthcare setting. Time requirements may decrease as individuals gain experience conducting RUE. In some cases, the same scenario may be modified slightly and used to evaluate multiple software tools. It may also be feasible to use shorter patient scenarios for some software applications to reduce time requirements. Time costs for RUE are small compared to the costs of poor usability, which can include provider dissatisfaction, inflated training needs and informatics support costs, failed implementations, and even patient harm.3–5 Second, RUE does not provide quantitative data on efficiency or error rate; the Think Aloud technique can confound these performance measures, although it does elicit rich, qualitative data that reveal the underlying causes of poor usability. It also uncovers more severe, rather than low-priority, usability issues compared to other usability methods.3 In this applied project, we did not evaluate the strengths and weaknesses of RUE compared to other noteworthy approaches, such as Nielsen’s “discount” usability testing9 and the Rapid Iterative Testing and Evaluation (RITE) method10; the healthcare setting is particularly challenging, and systematic research that compares these methods in this environment may be insightful. We did not specifically assess the robustness of RUE, although RUE has been used for several distinct medical applications, including electronic discharge templates, ED orders, transplant summaries, and rapid response documentation, suggesting a wide range of applicability. Finally, RUE revealed important and unanticipated issues related to software design, but pilot testing is still needed to understand the strengths and weaknesses of informatics tools in the context of care. RUE provides valuable information on how an individual interacts with the software interface, but does not provide information on how software designs might affect interactions between healthcare employees.

Conclusions

In summary, this work established a foundation for usability testing at this VAMC. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE may be used by VA and non-VA facilities to aid software development; it should especially be considered when the end-user’s first experience with the tool is critical to successful implementation and for tools that influence patient safety. Results illustrate the value of partnerships between operations and research and demonstrate how human factors tools can be practically applied for quality improvement.

Acknowledgments

This work was supported by the Veterans Health Administration (VHA) Systems Improvement Capability Grant as well as the Roudebush VA HSR&D CIEBP Center grant #HFP 04-148. In Example 1, RUE was conducted in the CIEBP Human-Computer Interaction and Simulation Laboratory. Dr. Russ was funded in part by a VA HSR&D Associated Health Postdoctoral Fellowship. Dr. Saleem was supported by a VA HSR&D Research Career Development Award (CDA 09-024-1). Views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs.

References

1. Boyd AD, Funk EA, Schwartz SM, Kaplan B, Keenan GM. Top EHR challenges in light of the stimulus. Enabling effective interdisciplinary, intradisciplinary and cross-setting communication. J Healthc Inf Manag. 2010 Winter;24(1):18–24.
2. Sittig DF, Wright A, Osheroff JA, et al. Grand challenges in clinical decision support. J Biomed Inform. 2008 Apr;41(2):387–92. doi:10.1016/j.jbi.2007.09.003.
3. Jaspers MW. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence. Int J Med Inform. 2009 May;78(5):340–53. doi:10.1016/j.ijmedinf.2008.10.002.
4. Smelcer JB, Miller-Jacobs H, Kantrovich L. Usability of electronic medical records. Journal of Usability Studies. 2009 Feb;4(2):70–84.
5. Horsky J, Kuperman GJ, Patel VL. Comprehensive analysis of a medication dosing error related to CPOE. J Am Med Inform Assoc. 2005 Jul–Aug;12(4):377–82. doi:10.1197/jamia.M1740.
6. Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc. 2007 Sep–Oct;14(5):632–40. doi:10.1197/jamia.M2163.
7. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform. 2004 Feb;37(1):56–76. doi:10.1016/j.jbi.2004.01.003.
8. Kushniruk AW, Borycki EM. Low-cost rapid usability engineering: designing and customizing usable healthcare information systems. Healthc Q. 2006;9(4):98–100, 102.
9. Nielsen J. Usability engineering at a discount. Proceedings of the Third International Conference on Human-Computer Interaction on Designing and Using Human-Computer Interfaces and Knowledge Based Systems; Boston, MA. 1989.
10. Medlock MC, Wixon D, Terrano M, Romero RL, Fulton B. Using the RITE method to improve products: a definition and a case study. Usability Professionals Association; Orlando, FL: 2002.
