Journal of the American Medical Informatics Association (JAMIA)
2011 Oct 8;18(Suppl 1):i13–i17. doi:10.1136/amiajnl-2010-000082

Lessons learned from usability testing of the VA's personal health record

David A Haggstrom 1,2,3, Jason J Saleem 1,2, Alissa L Russ 1,2, Josette Jones 4, Scott A Russell 1, Neale R Chumbler 1,2,5
PMCID: PMC3241159  PMID: 21984604

Abstract

In order to create user-centered design information to guide the development of personal health records (PHRs), 24 patients participated in usability assessments of the VA's MyHealtheVet program. Observational videos and efficiency measures were collected while users performed four PHR scenarios: registration and log-in, prescription refill, tracking health, and searching for health information. Twenty-five percent of users successfully completed registration. Individuals preferred prescription numbers over names, sometimes due to privacy concerns. Prescription refill was the only scenario in which efficiency was significantly better than target values. Users wanted to print their information to share with their doctors, and questioned the value of MyHealtheVet search functions over existing online health information. In summary: PHR registration must balance simplicity and security; usability tests can guide how PHRs tailor functions to individual preferences; PHRs add value to users' data by making information more accessible and understandable; and healthcare organizations should build trust in PHR health content.

Keywords: Personal health record, medical informatics, veterans health, cancer, primary care, information exchange, human factors, computerized provider order entry, health information technology, medication safety

Introduction

Personal health records (PHRs) have been defined as an internet-based set of tools that allow individuals to manage and control their health information.1 In 2008, PHR usability and adoption was identified as an agenda item for national research.2 Improving the usability of an information technology tool may increase its future adoption,3 4 functionality,5 and effectiveness.6 Usability assessments of electronic health records (EHRs) have provided valuable insights,7 but less is known about the usability of PHRs.

Case description

In 2003, the Department of Veterans Affairs (VA) launched a web-based PHR called MyHealtheVet (MHV). MHV provides patient access to evidence-based information on various health topics. In addition, VA patients can view their medications and request refills online. In fact, prescription refill is the most common goal of MHV site visitors,8 and more than 20 million refills have been requested since 2005. At the time of this study, other EHR data were not available. Instead, users were encouraged to manually self-enter health data and monitor this information using tracking functions. A consumer survey of MHV users identified many positive aspects of the system, but identified problems with search functions and ease of navigation.9 The VA is interested in identifying usability barriers to PHR adoption to ensure that the MHV program is sustainable.10 Thus, we performed a usability assessment of the existing MHV program to inform the future design and implementation of this and other PHRs.

Methods of implementation

Recruitment

We recruited a convenience sample of 24 VA patients who had not previously used MHV. Previous work in the usability field indicates that this number of participants is sufficient to detect the majority of usability problems.11 Patients were recruited from three different primary care clinics at a VA medical center, gave written informed consent, and received a $25 gift card for participation and completion of the 90 min usability test. The study was approved by the IUPUI Institutional Review Board and the Roudebush VA Research and Development Committee.

Usability testing environment

Usability testing was performed in a human–computer interaction laboratory. Participants used a computer workstation to access and use MHV. An experimenter's station, located behind a physical barrier to reduce potential ‘experimenter bias’, allowed a researcher to record observations about the participant's screen use in real-time with Morae software. We recorded audio and two video sources: (1) computer screen interaction and (2) the participant's face.

Usability test procedure

Sociodemographics and self-described computer experience were collected from participants. To reduce potential variation in the testing procedure, each participant was provided with the same scripted verbal introduction to the experiment. Next, participants were given paper instructions for each of the four scenarios (see online supplementary appendix 1). Usability scenarios reflected key existing functions of the MHV program: registration and log-in, prescription refill, track health (eg, physical activity, colonoscopy procedure results, and blood pressure readings), and search for health information (eg, post-traumatic stress disorder and healthy eating).

For the prescription refill scenario, we alternately used two types of MHV accounts to compare refill tasks: (1) authenticated accounts where users could see both prescription names and numbers; and (2) unauthenticated accounts where users could see only prescription numbers. For this scenario, prescription labels were placed on bottles, which were filled with candy to simulate pills. Bottles were alternately labeled with generic prescription names (plus dosage) from the authenticated account, and prescription numbers from the unauthenticated account. The unauthenticated account became unavailable during the latter half of the study due to technical issues in hosting the account in local pharmacy data files, and thus, only the first 10 users were observed using prescription numbers.

Specific time limits were established for each scenario (table 1) based on the scenario's complexity. After the participant completed a scenario, or the designated time limit expired, the participant was given instructions for the next scenario. A debrief interview was performed after each scenario and at the end of the experiment.

Table 1.

Participants' time to complete scenarios (efficiency) compared to targeted performance values

Registration: 24 participants; 6/24 (25%) completed; range 397–600 s; time limit 600 s (10 min); targeted performance 480 s (8 min); mean 573 s (9 min 33 s); median 600 s; p<0.001 (Wilcoxon).

Prescription refill, authenticated account (names): 24 participants; 16/24 (67%) completed; range 64–300 s; time limit 300 s (5 min); targeted performance 240 s (4 min); mean 187 s (3 min 7 s); median 153 s; p=0.011 (Wilcoxon).

Prescription refill, unauthenticated account (numbers): 10 participants; 9/10 (90%) completed; range 49–300 s; time limit 300 s (5 min); targeted performance 240 s (4 min); mean 151 s (2 min 31 s); median 116 s; p=0.047 (Wilcoxon).

Track health: 20 participants; 2/20 (10%) completed; range 1031–1500 s; time limit 1500 s (25 min); targeted performance 900 s (15 min); mean 1458 s (24 min 18 s); median 1500 s; p<0.001 (Wilcoxon).

Search for health information: 21 participants; 6/21 (29%) completed; range 233–600 s; time limit 600 s (10 min); targeted performance 480 s (8 min); mean 560 s (9 min 20 s); median 600 s; p=0.002 (Wilcoxon).

Data collection and analysis

Quantitative efficiency measures captured the time to complete each scenario. We used time-stamped video recordings to measure efficiency for each scenario, as well as each subtask within a scenario. When participants started a task, but did not complete the task, those participants were assigned the ‘maximum’ time value (time limit) as their task completion time. Data from all participants who started a scenario or task were included in the analysis. Wilcoxon signed rank tests (two-tailed) compared participants' time to complete scenarios to target performance values, which were established with the assistance of the VA MHV program.
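The scenario-level comparison can be sketched as follows. This is an illustrative re-implementation of a two-tailed Wilcoxon signed-rank test against a fixed target using the large-sample normal approximation; it is not the authors' analysis code, and the completion times shown are made up.

```python
import math

def wilcoxon_signed_rank(times, target):
    """Two-tailed Wilcoxon signed-rank test of completion times against a
    fixed target value, using the large-sample normal approximation
    (reasonable for n >= ~10; no continuity correction)."""
    diffs = [t - target for t in times if t != target]  # drop zero differences
    n = len(diffs)
    # Rank the absolute differences, averaging ranks within tied groups.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)  # sum of positive ranks
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-tailed
    return z, p

# Illustrative data: 12 completion times (s), all above a 480 s target,
# mirroring the direction of effect seen in the registration scenario.
times = [480 + d for d in (20, 35, 50, 5, 90, 110, 60, 15, 75, 40, 120, 10)]
z, p = wilcoxon_signed_rank(times, 480)  # z > 3, p well below 0.05
```

In practice an exact-distribution implementation (eg, a standard statistics package) would be preferred for small samples; the normal approximation above is only a sketch of the logic.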

Qualitative, direct observational data were collected during the usability test for each scenario and debrief interviews. The interview after each scenario consisted of retrospective video review, by the participant and the researcher, of points of interest identified in real-time during the usability test. Afterward, the study team identified emerging categories of usability issues. Results were maintained in a database including the category (eg, ‘navigation problems’), when it occurred, and a brief description of the specific observation.12

Examples and observations

Study population

The 24 participants in the study had not previously used MHV. First-time MHV users were a reasonable choice because a small proportion of American veterans were using MHV (16%),9 and few US patients were using PHRs (2.7%),13 at the time of the study (2008). The average age of participants was 55 years (range, 33–80 years). Two participants were female. One participant completed the usability test with the assistance of his spouse. Computer experience was distributed as follows: ‘none or low’ (40%), ‘medium’ (30%), and ‘high’ (30%). Six participants had some type of disability, including wheelchair confinement and color blindness.

Efficiency measures

The proportions of users participating in, and successfully completing, each scenario are described in the first two rows of table 1. Three users dropped out after the prescription refill scenario, and one user experienced technical difficulties while completing the track health scenario.

Participants' time to complete scenarios was significantly longer than the target value for three scenarios: registration and log-in, track health, and search for health information. For prescription refills, participants' efficiency was significantly better than target values.

Direct observations

We recorded 1160 unique observations across the four scenarios, including 15 categories of usability issues from video and debrief interviews. The following sections describe usability issues for each scenario.

Registration and log-in

During registration, most (17/24) users created invalid passwords at least once. Participants had to follow password rules requiring 8–12 characters, including special characters. Several (5/24) users expressed confusion about the meaning of ‘special characters.’ In a simple but important case, a color-blind participant could not read the registration error messages, which were displayed in red.
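The rule participants struggled with can be illustrated with a small validator. The 8–12 character length and special-character requirement come from the scenario above; the specific special-character set and the error-message wording are hypothetical, not MHV's actual implementation. Returning errors as plain text, rather than signaling them by color alone, would also have served the color-blind participant.

```python
# Hypothetical validator for the password rule described above: 8-12
# characters including at least one special character. The special-
# character set below is an assumption, not MHV's actual list.
SPECIAL_CHARACTERS = "!@#$%^&*()_-+="

def password_error(password):
    """Return a plain-language error message, or None if the password passes.

    Text feedback (rather than, say, coloring a field red) keeps the
    message accessible to color-blind users.
    """
    if not 8 <= len(password) <= 12:
        return "Password must be 8 to 12 characters long."
    if not any(c in SPECIAL_CHARACTERS for c in password):
        return ("Password must include at least one special character, "
                "for example: " + SPECIAL_CHARACTERS)
    return None
```

Spelling out the allowed characters in the message directly addresses the confusion users reported about what ‘special characters’ means.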

Prescription refill

Of the 15 users who expressed a preference, more users liked seeing prescription numbers (9/15) than seeing prescription names (6/15) (figure 1).

Figure 1. Screenshot from the prescription refill scenario showing medications with drug numbers and names.

Two out of 24 users stated that they did not want to see the prescription names due to privacy concerns when viewing their medication list at home. One user said: “I may be on and get up and move to answer the door, and someone could see my prescriptions that I'm taking.”

Track health

MHV enables users to track their own health with multiple types of self-reported health information, including vital signs, labs, tests, and food and activity journals. Most observations for this scenario were drawn from the first task, wherein users were asked to record their physical activity in an ‘activity journal’ (figure 2).

Figure 2. Screenshot from the track health scenario showing the activity journal.

Almost one-third of users (6/20) expressed confusion or became lost when manually entering information into the activity journal form. User statements included: “This is probably self-explanatory, but I'm not getting it” and “[I] felt like I was in the woods and didn't know which way to go.”

When asked if they would use the activity journal, some users expressed a desire to share this information with their providers. One user responded: “Does it print? If I could print this information out and go ‘here you go, doc’ then, yes, I would use it.” For the final task in this scenario, wherein users self-entered blood pressure values, two users liked the graphing function. However, they thought that the graph should appear without prompting from the user: “To me, that [graphing] should be automatically down there below blood pressure.”

Search for health information

One user noted that there were too many references to the word ‘health’ and stated: “It's a hospital, we know it's all about health” (figure 3).

Figure 3. Screenshot from the search for health information scenario showing the cover tab of the ‘Research Health’ section of the My HealtheVet website.

Users also commented on the parallels between searching for medical information on MHV (about post-traumatic stress disorder and healthy eating) and more general searches. One user drew this parallel: “It's like a web search”, while two others preferred to search in non-VA sources for health information and stated: “I would use other websites” and “[I] would have Googled first.”

Discussion

Early in the development of the MHV program, Veteran feedback indicated that pharmacy refill was the most desired feature. Consequently, MHV's prescription refill function was the first widely deployed function to extract information from the VA's EHR. Thus, it is not surprising that the most commonly used function of MHV (prescription refill) was associated with greater usability (efficiency) given the association between usability and adoption suggested elsewhere.3 4 Yet the causal direction between usability and adoption is uncertain (ie, did more usable (efficient) prescription refills lead to greater use or did anticipated high utilization cause developers to focus upon refill efficiency?). Future research may consider longitudinal, comparative14 study designs to better understand these relationships.

In the next several paragraphs, we summarize our usability findings in the context of the existing literature and provide lessons learned. Potential design solutions to usability issues identified are described in table 2.

Table 2.

Usability issues identified and potential design solutions

Issue: Patients, especially those of advanced age, experience difficulty with registration.
Potential design solutions:
  • Collect data not necessary for security (eg, demographics) after the user has logged into the PHR.
  • Consider allowing longer passwords and dropping the requirement for special characters.

Issue: A few patients have privacy concerns about viewing medical information at home.
Potential design solutions:
  • Provide options for greater on-screen confidentiality via prescription numbers for those who value privacy.
  • Pictures of pills may be a more private and effective way to communicate.15

Issue: Patients want to share information with their healthcare team at the time of the visit.
Potential design solution: Make PHR data easier to print or download so that patients can bring selected information to their healthcare appointments.

Issue: Patients may choose other ways of searching for health information.
Potential design solution: Healthcare organizations should build trust in PHR information by curating high-quality health content and highlighting its advantages over other sources of information.

A substantial number of users experienced problems with the seemingly straightforward registration process. Registration standards in other consumer-facing industries that require a high level of security (eg, banking) are worth consideration. Consumer focus groups suggest that the security of patient data is a primary concern regarding health information systems.16 Yet, as in our study, complexity requirements are often challenging for older adults.17 18 Lesson learned: the development challenge with registration is to make the process both simple and secure.

During the prescription refill scenario, the faster completion times (mean and median) for unauthenticated accounts suggest that prescription names and doses, while providing more information, may not increase user efficiency. A few participants expressed privacy concerns about viewing prescription names in their homes. Such concerns may have been related to the social stigma associated with taking certain medications. These findings suggest that there may be a subset of PHR users who prefer greater on-screen confidentiality, even at home. Unlike our results, online survey comments of MHV users have indicated an overall preference for prescription names9 (and consequently, the requirement for in-person authentication to view medication names was removed from MHV in March 2011). These apparently contradictory findings are a worthwhile topic for future exploration. Lesson learned: by focusing on the individual user experience, usability assessment enables the identification of important subsets of patients. These newly identified subsets may allow PHR functions to be tailored for a tighter fit with individual patient preferences; with respect to privacy, for example, one ‘size’ does not fit all.

Some of our usability findings were present across tasks; for example, navigation labels such as ‘track health’ or ‘search health’ did not always intuitively signify their functions (self-entering health data and searching for health information, respectively). Other PHR usability studies emphasize the importance of minimizing jargon.19 Other findings were also consistent with the informatics literature. Our participants expressed confusion and frustration when manually entering information, and prior reports suggest that manual data entry can be both time consuming and error prone.20 Our participants also expressed an interest in graphing, as well as printing information from MHV to share with their doctor. Other PHR usability tests have demonstrated faster task completion times for visual information.21 Since the time of this study, the VA has implemented the ‘Blue Button’ feature, which allows users to download or print information from MHV to bring to their VA appointments; the feature was referenced in President Obama's 2011 State of the Union address.22 Lesson learned: when users add data to a PHR, add value to their personal data by making it more accessible (printable or downloadable) and more understandable (visualization).

A few participants expressed a preference for other internet sites or Google for health information. In the case of MHV, health information is vetted and approved by a clinical advisory board to ensure that it is evidence based, but this process alone is unlikely to be sufficient for patient buy-in. Consumer trust in online health information is also critical.23 Lesson learned: when adding functionality to a PHR, consider how the function adds value to what is already available to consumers (eg, Google search).

Limitations

The study enrolled a convenience sample. Nonetheless, computer experience was widely distributed among participants, enabling our study to better explore the digital divide.24 While 74% of American adults use the internet,25 40% of our participants reported computer experience as ‘low or none.’ This differential suggests that our participants were at particularly high risk for falling on the wrong side of the digital divide. Thus, the usability issues identified in our study may identify important barriers contributing to the digital divide. For the prescription refill scenario, another limitation was the small number of participants (10) who used unauthenticated accounts (with numbers only). Thus, the comparison with target values, as well as the observed preference for prescription numbers, was more exploratory in this usability study, especially given the preference for prescription names identified in MHV online survey comments.9

Conclusions

Users consistently highlighted potential opportunities for the PHR to ‘add value’ to existing information. Furthermore, the individual user experiences captured by our usability methods highlighted how PHR functions may be more precisely tailored to patient preferences. With these types of insights, healthcare systems may move closer to ‘personalized’ health information technology. This report constitutes the first published usability test of the MHV program—the most widely disseminated, tethered PHR in the USA. Little has been reported in the literature related to the usability of PHRs. For a patient-centered technology, we believe this gap in the medical literature is a large one to fill. These findings will inform both future redesign of the MHV site being undertaken by the VA, as well as future development of PHRs more generally.

Acknowledgments

The authors would like to acknowledge the contributions of Brad Doebbeling, MD, MSc, to initial study conception and design and Kim Nazi, FACHE, for valuable feedback. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States Government.

Footnotes

Funding: The project was supported by VA HSR&D project SHP 08-192 (Chumbler, PI). DAH is the recipient of VA HSR&D Career Development Award CD207016-2 and part of the VA/Robert Wood Johnson Foundation Physician Faculty Scholars Program. JJS is the recipient of VA HSR&D Career Development Award 09-024-1. The study sponsor had no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the paper for publication.

Competing interests: None.

Ethics approval: Ethics approval was provided by the IUPUI Institutional Review Board and the Roudebush VA Research and Development Committee.

Provenance and peer review: Not commissioned; externally peer reviewed.

References

1. Connecting for Health. The Personal Health Working Group: Final Report. New York, NY: Markle Foundation, 2003.
2. Kaelber DC, Jha AK, Johnston D, et al. A research agenda for personal health records (PHRs). J Am Med Inform Assoc 2008;15:729–36.
3. Tang PC, Ash JS, Bates DW, et al. Personal health records: definitions, benefits, and strategies for overcoming barriers to adoption. J Am Med Inform Assoc 2006;13:121–6.
4. Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc 2007;14:632–40.
5. Britto MT, Jimison HB, Munafo JK, et al. Usability testing finds problems for novice users of pediatric portals. J Am Med Inform Assoc 2009;16:660–9.
6. Kushniruk AW, Patel VL, Cimino JJ. Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces. Proc AMIA Annu Fall Symp 1997:218–22.
7. Schumacher RM, Lowry SZ. National Institute of Standards and Technology Guide to the Processes Approach for Improving the Usability of Electronic Health Records. Gaithersburg, MD: US Department of Commerce, 2010.
8. Nazi KM, Woods SS. MyHealtheVet PHR: a description of users and patient portal use. AMIA Annu Symp Proc 2008;6:1182.
9. Nazi KM. Veterans' voices: use of the American Customer Satisfaction Index (ACSI) Survey to identify My HealtheVet personal health record users' characteristics, needs, and preferences. J Am Med Inform Assoc 2010;17:203–11.
10. Nazi KM, Hogan TP, Wagner TH, et al. Embracing a health services research perspective on personal health records: lessons learned from the VA My HealtheVet system. J Gen Intern Med 2007;25(Suppl 1):62–7.
11. Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems. Amsterdam, The Netherlands, 1993.
12. Russell SA, Saleem JJ, Haggstrom DA, et al. A novel tool to track and analyze qualitative usability data: lessons learned from the VA's personal health record. Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting. San Francisco, CA, 2010:1264–8.
13. Connecting for Health. Markle Foundation. http://www.connectingforhealth.org/resources/ResearchBrief-200806.pdf (accessed 31 Dec 2010).
14. Chan J, Shojania KG, Easty AC, et al. Does user-centred design affect the efficiency, usability and safety of CPOE order sets? J Am Med Inform Assoc 2011;18:276–81.
15. Schillinger D, Machtinger EL, Wang F, et al. Language, literacy, and communication regarding medication in an anticoagulation clinic: a comparison of verbal vs. visual assessment. J Health Commun 2006;11:651–64.
16. Schneider SJ, Kerwin J, Robins C, et al. Consumer Engagement in Developing Electronic Health Information Systems: Final Report. Rockville, MD: AHRQ, 2009.
17. Kim EH, Stolyar A, Lober WB, et al. Challenges to using an electronic personal health record by a low-income elderly population. J Med Internet Res 2009;11:e44.
18. Lober WB, Zierler B, Herbaugh A, et al. Barriers to use of a personal health record by an elderly population. AMIA Annu Symp Proc 2006:514–18.
19. Tran DT, Zhang X, Stolyar A, et al. Patient-centered design for a personal health record system. AMIA Annu Symp Proc 2005:1140.
20. Kahn JS, Aulakh V, Bosworth A. What it takes: characteristics of the ideal personal health record. Health Aff (Millwood) 2009;28:369–76.
21. Marchionini G, Rimer BK, Wildemuth B. Evidence Base for Personal Health Record Usability: Final Report to the National Cancer Institute, February 10, 2007. http://www.ils.unc.edu/phr/files/final%20report%20010307.pdf (accessed 30 Sep 2011).
22. Blue Button: Download My Data. http://www.va.gov/bluebutton/ (accessed 31 Mar 2011).
23. Hou J, Shim M. The role of provider-patient communication and trust in online sources in Internet use for health-related activities. J Health Commun 2010;15(Suppl 3):186–99.
24. Chang BL, Bakken S, Brown SS, et al. Bridging the digital divide: reaching vulnerable populations. J Am Med Inform Assoc 2004;11:448–57.
25. Rainie L. Internet, Broadband, and Cellphone Statistics. Pew Internet & American Life Project, January 5, 2010. http://www.pewinternet.org/~/media//Files/Reports/2010/PIP_December09_update.pdf (accessed 30 Sep 2011).
