Summary
Despite its promise, recent literature has revealed possible safety hazards of health information technology (HIT) use. The Office of the National Coordinator for HIT recently sponsored an Institute of Medicine committee to synthesize evidence and experience from the field on how HIT affects patient safety. To lay the groundwork for defining, measuring, and analyzing HIT-related safety hazards, we propose that HIT-related error occurs any time HIT is unavailable for use, malfunctions during use, is used incorrectly, or interacts incorrectly with another system component, resulting in data being lost or incorrectly entered, displayed, or transmitted. These errors, and the decisions based on them, significantly increase the risk of adverse events and patient harm. In this paper, we describe how a socio-technical approach can be used to understand the complex origins of HIT errors, which may have roots in rapidly evolving technological, professional, organizational, and policy initiatives.
Keywords: Electronic Health Records, Health Information Technology, Patient Safety, Errors
Introduction
Two Institute of Medicine (IOM) reports have recommended the use of information technologies to improve patient safety and reduce errors in health care1,2. Broadly speaking, health information technology (HIT) is the overarching term applied to various information and communication technologies used to collect, transmit, display, or store patient data. Despite HIT’s promise in improving safety, recent literature has revealed potential safety hazards associated with its use, often referred to as e-iatrogenesis.3,4 For example, Koppel et al5 describe 22 types of errors facilitated by a commercially available electronic health record (EHR) system’s computerized provider order entry (CPOE) application. In response to similar emerging concerns, the Office of the National Coordinator for HIT recently sponsored an IOM committee to “review the available evidence and the experience from the field” on how HIT use affects patient safety. Given the national impact of HIT, this initiative is a major step forward in ensuring the safety and well-being of our patients. However, the field currently lacks acceptable definitions of HIT-related errors and it is unclear how best to measure or analyze “HIT errors”.
The goal of this manuscript is to advance the understanding of HIT-related errors and explain how adverse events, near misses, and patient harm can result from problems with HIT itself or from interactions between HIT, its users, and the work system. Health information technology errors jeopardize patient outcomes and have high potential for harm6 because they are often latent errors that occur at the “blunt end” of the healthcare system,7 potentially affecting large numbers of patients if not corrected. Furthermore, if important structural or process-related HIT problems are not addressed proactively, the care of millions of patients may be affected owing to the impending widespread adoption and implementation of EHRs8. We thus focus heavily on errors related to the use of EHR systems.
General Criteria for a HIT Error
We define the HIT work system as the combination of the hardware and software required to implement the HIT, as well as the social environment in which it is implemented. We thus propose that HIT errors should be defined from the socio-technical viewpoint of end users (including patients, when applicable) rather than from the purely technical viewpoint of manufacturers, developers, vendors, and personnel responsible for implementation. Health information technology-related error occurs any time the HIT system is unavailable for use, malfunctions during use, is used incorrectly, or interacts incorrectly with another system component, resulting in data being lost or incorrectly entered, displayed, or transmitted.9,10 Errors with HIT may involve failures of either structures or processes and can occur in the design and development, implementation and use, or evaluation and optimization phases of the HIT life cycle.11 This approach is consistent with the currently recommended systems and human factors approaches used to understand and reduce error.1
The HIT system is considered to be unavailable for use if for any reason the user cannot enter, review, transmit, or print data (e.g., a patient’s medication allergies or most recent laboratory test results). Reasons could include unavailable hardware (e.g., a missing keyboard; problems with the computer’s monitor, the network routers that connect the computer to the data servers and printers, or the server where data are stored), unavailable software (e.g., missing components of the operating system that manages the computer’s applications, such as the internet browser and EHR, or a broken interface between the EHR system and the information system of an ancillary service such as radiology or the laboratory), and unavailable power (e.g., a power outage that results in hospital-wide computer failure).4
The HIT system is considered to be malfunctioning (i.e., available, but not working correctly) whenever a user cannot accomplish the desired task despite using the HIT system as designed. In this situation, error results from any hardware or software defect (or bug) that prohibits a user from entering or reviewing data, or any defect that causes the data to be entered, displayed, transmitted, or stored incorrectly. For example, a clinician might enter a patient’s weight in pounds, and the weight-based dosing algorithm might fail to convert it to kilograms before calculating the appropriate dose, resulting in a more than 2-fold overdose (1 kg ≈ 2.2 lb).
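The unit-conversion malfunction described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the function names and the 1 mg/kg dosing rate are our assumptions, not any real system’s logic); the point is that a dosing routine that silently assumes kilograms produces a more than 2-fold overdose when handed pounds, whereas one that requires an explicit unit cannot.

```python
LB_PER_KG = 2.20462  # pounds per kilogram

def dose_mg(weight_kg: float, mg_per_kg: float = 1.0) -> float:
    """Weight-based dose; silently assumes the weight is in kilograms."""
    return weight_kg * mg_per_kg

def dose_mg_safe(weight: float, unit: str, mg_per_kg: float = 1.0) -> float:
    """Weight-based dose that refuses to guess the unit of the weight."""
    if unit == "lb":
        weight = weight / LB_PER_KG  # convert pounds to kilograms first
    elif unit != "kg":
        raise ValueError(f"unknown weight unit: {unit!r}")
    return weight * mg_per_kg

# A 70 kg patient whose weight was entered as 154 lb:
unsafe = dose_mg(154)            # treats pounds as kilograms: 154 mg
safe = dose_mg_safe(154, "lb")   # converts first: ~70 mg
```

Requiring the caller to state the unit turns a latent dosing error into an explicit, testable conversion step.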
Finally, errors can occur even when hardware and software are functioning as designed. For instance, errors may result when users do not use the hardware or software as intended: users might enter free-text comments (e.g., “take 7.5 mg Mon-Fri only”) that contradict information contained in the structured section of the medication order (e.g., “Warfarin tabs 10 mg QD”).12 Errors may also arise when 2 or more parts of the HIT system (e.g., the CPOE application and the pharmacy’s medication dispensing system) interact in an unpredicted manner, resulting in inaccurate, incomplete, or lost data during entry, display, transmission, or storage.13
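A system-side safeguard against the free-text/structured-order contradiction above could take the form of a consistency check run before an order is signed. The following Python sketch is illustrative only (the function name and the simple regular-expression parse are our assumptions; real sig parsing is far harder): it flags any dose mentioned in the free-text comment that differs from the structured dose.

```python
import re

def find_dose_conflict(structured_dose_mg: float, free_text: str):
    """Return the first dose (in mg) mentioned in the free text that
    differs from the structured dose, or None if no conflict is found."""
    for match in re.finditer(r"(\d+(?:\.\d+)?)\s*mg", free_text, re.IGNORECASE):
        mentioned = float(match.group(1))
        if mentioned != structured_dose_mg:
            return mentioned
    return None

# Structured order: warfarin 10 mg daily; free-text comment says 7.5 mg.
conflict = find_dose_conflict(10.0, "take 7.5 mg Mon-Fri only")
# conflict == 7.5, so the prescriber could be warned before signing
```

Even a crude check like this surfaces the contradiction at order entry, rather than leaving it for the pharmacist or nurse to notice downstream.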
Origin-Specific Typology for a HIT Error
Leveson14 proposes that new technologies have fundamentally altered the nature of errors and asserts that these changes necessitate new models and methods for investigating technology-related errors. Thus, technological advances could potentially give rise to increasingly complex and multifaceted errors in healthcare. In view of the resultant expanding and evolving context of safe HIT implementation and use, we illustrate how a recently developed socio-technical model for HIT evaluation and use can provide an origin-specific typology for HIT errors.15 The model’s 8 dimensions (Table 1) comprehensively account for the technology; its users and their respective workflow processes and how these 2 elements interface with the technology; the work system context including organizational and policy factors that affect HIT; and notably, the interactions between all of these factors.16 The Table lists examples of specific EHR-related errors that can occur within each of the 8 dimensions of the socio-technical model, along with examples of potential ways that the likelihood of each error could be reduced. Thus, the model not only illustrates the complex relationships between active and latent errors but also lays a foundation for error analysis.
Table 1.

| Socio-technical model dimension | Examples of types of errors that could occur in each dimension | Examples of potential ways to reduce likelihood of these errors |
|---|---|---|
| Hardware and Software – required to run the healthcare applications | Computer or network is not functioning17 | Provide redundant hardware for all essential patient care activities |
| | Input data truncated (i.e., buffer overflow) – some entered data lost | Warn users when data entered exceeds the amount that can be stored |
| Clinical Content – data, information, and knowledge entered, displayed, or transmitted | Allowable item cannot be ordered (e.g., no “amoxicillin” in the antibiotic pick-list)5 | Conduct extensive pre-release testing on all system-system data and human-computer interfaces to ensure that new features work as planned and that existing features work as before |
| | Incorrect default dose for a given medication5 | |
| Human-Computer Interface – aspects of the system that users can see, touch, or hear | Data entry/review screen does not show the patient name, medical record number, birthdate, etc. | Encourage and provide methods for clinicians to report when patient-specific screens do not contain key patient demographics so that the software can be fixed |
| | Two patients with the same name; data entered on the wrong patient18 | Alert providers to potential duplicate patients and require re-confirmation of patient ID before saving data (e.g., display patient photo before signing) |
| | Two buttons with the same label, but different functionality | Pre-release inspection of all screens for duplicate button names |
| People – the humans involved in the design, development, implementation, and use of HIT | Wrong decision about KCl administration based on poor data presentation on the computer screen19 | Improve data displays and train users to routinely review and cross-validate all data values for appropriateness before making critical decisions |
| | Incorrect merge of two patients’ data20 | Develop tools to compare key demographic data and calculate a probability estimate of similarity |
| | RNs scan a duplicate patient barcode taped to their clipboard rather than the barcode on the patient to save time21 | Improve user training, user interfaces, work processes, and organizational policies to reduce the need for workarounds |
| Workflow and Communication – the steps needed to ensure that each patient receives the care they need at the time they need it | Computer discontinues a medication order without notifying a human | Implement fail-safe communication (e.g., re-send message to another hospital designee if no response from MD or RN) for all computer-generated actions22 |
| | Critical abnormal test result alerts not followed up23 | Implement robust quality assurance systems to monitor critical alert follow-up rates24; use “dual notification” for alerts judiciously25 |
| Organizational Policies and Procedures – internal culture, structures, policies, and procedures that affect all aspects of HIT management and healthcare | Policy contradicts physical reality (e.g., required barcode medication administration readers not available in all patient locations)21 | Conduct pre- and post-implementation inspections and interviews, and monitor feedback from users in all physical locations |
| | Policy contradicts personnel capability (e.g., 1 pharmacist to verify all orders entered via CPOE in a large hospital) | Conduct pre- and post-implementation interviews with all affected users to better gauge workload |
| | Incorrect policy allows “hard-stops” on clinical alerts, causing delays in needed therapy26 | Disallow “hard-stops” on alerts; users should be able to override the computer in all but the most egregious cases (e.g., ordering promethazine as IV push by peripheral vein27) |
| External Rules, Regulations, and Pressures – external forces that facilitate or place constraints on the design, development, implementation, use, and evaluation of HIT in the clinical setting | Billing requirements lead to inaccurate documentation in the EHR (e.g., inappropriate copy and paste) | Highlight all “pasted” material and include a reference to the source of the material |
| | Joint Commission-required medication reconciliation processes28 caused rushed development of new medication reconciliation applications that were difficult to use and caused errors;29 the safety goal was rescinded30 only to be reinstated 7/1/201131 | Carefully consider potential adverse unintended consequences before making new rules or regulations; conduct interviews and observations of users to gauge effects of rules and regulations on patient safety, quality of care, and clinician work-life |
| System Measurement and Monitoring – of system availability, use, effectiveness, and unintended consequences of system use | Incomplete or inappropriate data aggregation (e.g., combining disparate data) leads to erroneous reporting | Increase measurement and monitoring transparency by providing involved stakeholders with access to raw data, analytical methods, and reports |
| | Incorrect interpretation of results | |
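One mitigation listed in Table 1 for incorrect patient merges is a tool that compares key demographic data and calculates a probability estimate of similarity. The following Python sketch illustrates the idea only: the fields, weights, and the 0.8 result are purely illustrative assumptions, not a validated matching algorithm (production systems use formal probabilistic record-linkage models such as the Fellegi-Sunter approach).

```python
def duplicate_probability(a: dict, b: dict) -> float:
    """Crude probability-like score (0 to 1) that two records
    refer to the same patient, from weighted exact field matches."""
    weights = {"last_name": 0.30, "first_name": 0.20,
               "dob": 0.35, "sex": 0.15}  # illustrative weights only
    score = 0.0
    for field, weight in weights.items():
        if a.get(field) and a.get(field) == b.get(field):
            score += weight
    return score

rec1 = {"last_name": "Smith", "first_name": "John",
        "dob": "1960-01-02", "sex": "M"}
rec2 = {"last_name": "Smith", "first_name": "Jon",
        "dob": "1960-01-02", "sex": "M"}

score = duplicate_probability(rec1, rec2)
# score is 0.8 (all fields agree except first name):
# high enough to flag for manual review before any merge
```

Such a score would gate the merge behind human review rather than trigger it automatically, consistent with the table’s emphasis on keeping users able to override the computer.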
Conclusion
In conclusion, rapid advances in HIT development, implementation, and regulation have complicated the landscape of HIT-related safety issues. Erroneous or missing data, and the decisions based on them, increase the risk of adverse events and unnecessary costs. Because these errors can and frequently do occur after implementation, simply increasing oversight of HIT vendors’ development processes will not address all HIT-related errors. Comprehensive efforts to reduce HIT errors must start with clear definitions and an origin-focused understanding of HIT errors that addresses important socio-technical aspects of HIT use and implementation. To this end, we provide herein a much-needed foundation for coordinating the safety initiatives of HIT designers, developers, implementers, users, and policy makers, who must continue to work together to achieve a high-reliability HIT work system for safe patient care.
Acknowledgments
Dr. Sittig is supported in part by a grant from the National Library of Medicine R01-LM006942 and by a SHARP contract from the Office of the National Coordinator for Health Information Technology (ONC #10510592).
Dr. Singh is supported by an NIH K23 career development award (K23CA125585), the VA National Center of Patient Safety, Agency for Health Care Research and Quality, a SHARP contract from the Office of the National Coordinator for Health Information Technology (ONC #10510592), and in part by the Houston VA HSR&D Center of Excellence (HFP90-020).
These sources had no role in the preparation, review, or approval of the manuscript.
We thank Laura A. Petersen, MD, MPH, VAHSR&D Center of Excellence, Michael E. DeBakey Veterans Affairs Medical Center, and Baylor College of Medicine, and Eric J. Thomas, MD, MPH, University of Texas, Houston-Memorial Hermann Center for Healthcare Quality and Safety and Department of Medicine, University of Texas Medical School, Houston, for their guidance in this work and Annie Bradford, PhD, for assistance with medical editing, for which they received no compensation.
Footnotes
Portions of this manuscript were presented to the United States Institute of Medicine Committee on Patient Safety and Health Information Technology on December 14, 2010, in Washington, DC.
The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs or any of the other funding agencies.
References
- 1. Institute of Medicine. To err is human: building a safer health system. Washington, DC: National Academy Press; 1999. Report by the Committee on Quality of Health Care in America.
- 2. Institute of Medicine. Patient safety: achieving a new standard for care. Washington, DC: National Academy Press; 2004. Report by the Committee on Data Standards for Patient Safety.
- 3. Weiner JP, Kfuri T, Chan K, Fowles JB. “e-Iatrogenesis”: the most critical unintended consequence of CPOE and other HIT. J Am Med Inform Assoc. 2007 Feb 28. doi:10.1197/jamia.M2338.
- 4. Myers RB, Jones SL, Sittig DF. Reported clinical information system adverse events in US Food and Drug Administration databases. Appl Clin Inform. 2011;2:63–74. doi:10.4338/ACI-2010-11-RA-0064.
- 5. Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, Strom BL. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005 Mar 9;293(10):1197–203. doi:10.1001/jama.293.10.1197.
- 6. Hofer TP, Kerr EA, Hayward RA. What is an error? Eff Clin Pract. 2000 Nov-Dec;3(6):261–9.
- 7. Reason J. Human error: models and management. BMJ. 2000 Mar 18;320(7237):768–70. doi:10.1136/bmj.320.7237.768.
- 8. Stead W, Lin H, editors. Computational technology for effective health care: immediate steps and strategic directions. Washington, DC: National Academies Press; 2009.
- 9. Mangalmurti SS, Murtagh L, Mello MM. Medical malpractice liability in the age of electronic health records. N Engl J Med. 2010 Nov 18;363(21):2060–7. doi:10.1056/NEJMhle1005210.
- 10. Perrow C. Normal accidents: living with high-risk technologies. Princeton, NJ: Princeton University Press; 1999.
- 11. Walker JM, Carayon P, Leveson N, Paulus RA, Tooker J, Chin H, Bothe A Jr, Stewart WF. EHR safety: the way forward to safe and effective systems. J Am Med Inform Assoc. 2008 May-Jun;15(3):272–7. doi:10.1197/jamia.M2618.
- 12. Singh H, Mani S, Espadas D, Petersen N, Franklin V, Petersen LA. Prescription errors and outcomes related to inconsistent information transmitted through computerized order entry: a prospective study. Arch Intern Med. 2009 May 25;169(10):982–9. doi:10.1001/archinternmed.2009.102.
- 13. Kleiner B. Sociotechnical system design in health care. In: Carayon P, editor. Handbook of human factors and ergonomics in health care and patient safety. Mahwah, NJ: Lawrence Erlbaum; 2007.
- 14. Leveson N. A new accident model for engineering safer systems. Safety Science. 2004 Apr;42(4):237–70.
- 15. Sittig DF, Singh H. Eight rights of safe electronic health record use. JAMA. 2009 Sep 9;302(10):1111–3. doi:10.1001/jama.2009.1311.
- 16. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care. 2010 Oct;19(Suppl 3):i68–74. doi:10.1136/qshc.2010.042085.
- 17. Kilbridge P. Computer crash: lessons from a system failure. N Engl J Med. 2003 Mar 6;348(10):881–2. doi:10.1056/NEJMp030010.
- 18. Shojania KG. Patient mix-up. AHRQ WebM&M [serial online]. 2003 Feb. Available at: http://www.webmm.ahrq.gov/case.aspx?caseID=1.
- 19. Horsky J, Kuperman GJ, Patel VL. Comprehensive analysis of a medication dosing error related to CPOE. J Am Med Inform Assoc. 2005 Jul-Aug;12(4):377–82. doi:10.1197/jamia.M1740.
- 20. AHIMA MPI Task Force. Merging master patient indexes. 1997 Sep. Available at: http://www.cstp.umkc.edu/~leeyu/Mahi/medical-data6.pdf.
- 21. Koppel R, Wetterneck T, Telles JL, Karsh BT. Workarounds to barcode medication administration systems: their occurrences, causes, and threats to patient safety. J Am Med Inform Assoc. 2008 Jul-Aug;15(4):408–23. doi:10.1197/jamia.M2616.
- 22. Kuperman GJ, Teich JM, Tanasijevic MJ, Ma’Luf N, Rittenberg E, Jha A, Fiskio J, Winkelman J, Bates DW. Improving response to critical laboratory results with automation: results of a randomized controlled trial. J Am Med Inform Assoc. 1999 Nov-Dec;6(6):512–22. doi:10.1136/jamia.1999.0060512.
- 23. Singh H, Wilson L, Petersen LA, Sawhney MK, Reis B, Espadas D, Sittig DF. Improving follow-up of abnormal cancer screens using electronic health records: trust but verify test result communication. BMC Med Inform Decis Mak. 2009 Dec 9;9:49. doi:10.1186/1472-6947-9-49.
- 24. Singh H, Thomas EJ, Sittig DF, Wilson L, Espadas D, Khan MM, Petersen LA. Notification of abnormal lab test results in an electronic medical record: do any safety concerns remain? Am J Med. 2010 Mar;123(3):238–44. doi:10.1016/j.amjmed.2009.07.027.
- 25. Singh H, Thomas EJ, Mani S, Sittig D, Arora H, Espadas D, Khan MM, Petersen LA. Timely follow-up of abnormal diagnostic imaging test results in an outpatient setting: are electronic medical records achieving their potential? Arch Intern Med. 2009 Sep 28;169(17):1578–86. doi:10.1001/archinternmed.2009.263.
- 26. Strom BL, Schinnar R, Aberra F, Bilker W, Hennessy S, Leonard CE, Pifer E. Unintended effects of a computerized physician order entry nearly hard-stop alert to prevent a drug interaction: a randomized controlled trial. Arch Intern Med. 2010 Sep 27;170(17):1578–83. doi:10.1001/archinternmed.2010.324.
- 27. Grissinger M. Preventing serious tissue injury with intravenous promethazine (Phenergan). Pharmacy & Therapeutics. 2009 Apr;34(4):175–6.
- 28. Medication reconciliation. 2005 National Patient Safety Goal #8. Joint Commission.
- 29. Poon EG, Blumenfeld B, Hamann C, Turchin A, Graydon-Baker E, McCarthy PC, Poikonen J, Mar P, Schnipper JL, Hallisey RK, Smith S, McCormack C, Paterno M, Coley CM, Karson A, Chueh HC, Van Putten C, Millar SG, Clapp M, Bhan I, Meyer GS, Gandhi TK, Broverman CA. Design and implementation of an application and associated services to support interdisciplinary medication reconciliation efforts at an integrated healthcare delivery network. J Am Med Inform Assoc. 2006 Nov-Dec;13(6):581–92. doi:10.1197/jamia.M2142.
- 30. APPROVED: will not score medication reconciliation in 2009. Joint Commission. Available at: http://www.jcrinc.com/common/PDFs/fpdfs/pubs/pdfs/JCReqs/JCP-03-09-S1.pdf.
- 31. Revised National Patient Safety Goal on medication reconciliation is approved. Joint Commission Online. December 8, 2010.