Applied Clinical Informatics
Editorial
2011 Aug 24;2(3):345–349. doi: 10.4338/ACI-2011-03-IE-0020

HIT or Miss – Studying Failures to Enable Success

J Leviss
PMCID: PMC3631932  PMID: 23616880

Health information technology (HIT) is intended to improve the quality and efficiency of clinical care, but what happens when projects fail? What if an HIT product or project adversely affects patient safety? What can we learn from HIT failures to improve future HIT initiatives and our healthcare delivery system?

A hospitalized patient’s INR (blood coagulation time) becomes dangerously elevated; an investigation finds that the patient received double doses of anticoagulant medication due to an error in the pharmacy information system’s handling of a medication ordered by CPOE (computerized provider order entry).

A physician uses an EHR to order a medication for a patient. Unknowingly, he orders a combination pill that includes a second medication. He is unaware of the error because the ordering field in the EHR is only large enough to display one of the two medication names in the combination pill, making the order appear to be for the single intended medication. The error is not discovered until the patient is evaluated for an abnormal lab result, likely caused by the medication error.

One morning, at a multi-site ambulatory care organization that has been live on an EHR for two years, physicians, nurses, and staff experience EHR freezes and shutdowns that make the system unusable. A decision is made to revert to paper. For the next few hours, physicians and nurses struggle to treat patients with diabetes, heart disease, and mental illness without access to past records, medication lists, test results, or care plans.

A health system implements evidence-based order sets to support quality of care across many clinical departments. There is no systematic approach to review and update the order sets. As recognized guidelines change, some departments recommend revisions to their order sets, but other departments do not. One result is that antibiotic prophylaxis order sets are no longer current. Some physicians are aware that these order sets are out of date, while other physicians are not and continue to use them.

On February 24, 2009, President Barack Obama pledged to the entire US Congress, “Our...plan will invest in electronic health records and new technology that will reduce errors, bring down costs, ensure privacy, and save lives” [1]. The US federal government then embarked on an ambitious multi-billion dollar program under the American Recovery and Reinvestment Act (ARRA) to accelerate the adoption of health information technology across the health care delivery system. Since then, billions of dollars have been invested by hospitals, physician practices, HIT vendors, and consultants to design, develop, and implement HIT, all with the goal of improving the quality and efficiency of health care. Training programs have been created at large universities and community colleges to prepare an appropriately skilled workforce to lead these initiatives and to perform the day-to-day tasks of specific projects. However, little attention has been paid to projects that have stumbled, critical resources that are regularly missing, or errors that are repeated routinely at great expense and with negative consequences for patients.

At the most basic level, if HIT is to help solve the current health care crisis and help improve patient safety, HIT projects must succeed. Yet up to 70% of HIT projects fail, when failure is defined as “an HIT project in which an unintended negative consequence occurred, such as a project delay, a substantial cost overrun, a failure to meet an intended goal, or complete abandonment of the project” [2]. What happens when projects fail, especially if patients are harmed? How do health systems analyze the challenges, costs, and patient safety problems from failed HIT initiatives? Are these problems even identified or discussed? Countless anecdotes highlight how frequently health systems and vendors fail to follow known best practices, whether a novice organization is implementing a proven technology or a leading organization is implementing a new one; yet such stories are typically shared confidentially between colleagues and are never reviewed in any formal or public manner that would let others learn from the mistakes.

One result of the lack of open discussion of HIT failures is that common errors remain common, including technology errors (how HIT is designed and built), organizational errors (how HIT is implemented and managed), and process errors (how HIT is used). Most importantly, the same lessons, or “best practices,” are learned over and over again in large and small health systems alike, through trial and error, without successful dissemination of the knowledge from one organization to another. Each health system incurs its own financial costs and experiences its own patient safety problems to learn the same critical lessons. Professional conferences routinely share experiences from successful HIT initiatives, highlighting “best practices,” but then attendees return home and fail to follow the lessons discussed. As a result, another barrier to effective HIT adoption remains, and HIT-related patient safety problems occur without documentation or understanding of how to prevent them from recurring. HIT failures need to be openly studied and learned from.

One perspective: “...you can really see how a minor mistake here or there, nothing huge, can lead to a situation where someone gets killed” [3]. The statement is not about HIT; it comes from the journal Accidents in North American Mountaineering. The American Alpine Club reports yearly on mountain climbing accidents so that other climbers can learn from the devastating errors of their peers and not repeat them. The journal was created by Jed Williams, a climber and college professor who believed in the power of studying accidents and sharing the information transparently so that climbers could learn from each other’s mistakes.

Another example of critical learning from failures is the Commercial Aviation Safety Team (CAST), a multi-disciplinary organization of government and industry experts that has analyzed 500 accidents and thousands of safety incidents worldwide to reduce the leading causes of commercial aviation accidents in the United States. CAST’s mission statement is: “Enable a continuous improvement framework built on monitoring the effectiveness of implemented actions and modifying actions to achieve the goal” [4]. The airline industry is famous for its vigilance in studying adverse events to understand what went wrong, whether human or technical error, and how to prevent it from recurring. Why have HIT leaders failed to adopt a similar approach?

“I’ve...[messed] up in many, many ways in terms of managing people and product decisions and business, so I feel fairly confident at this point....” [5] Evan Williams, co-founder of Twitter, recognizes the value of lessons learned from failures. The myriad websites and publications that review failed businesses illustrate that entrepreneurs value the lessons from failures and share them openly so that others may succeed.

Clinical medicine, HIT’s parent discipline, values the lessons learned from adverse patient outcomes. Hospitals around the world regularly hold “morbidity and mortality rounds” to review clinical cases involving mistakes and adverse outcomes so that others will learn from them. These conferences involve all participants in patient care, from medical students to senior physicians, and teach both the specific lessons of a case and the value of reviewing failures.

But in HIT, most people remain convinced that we should discuss only successes and that sharing best practices will lead to those practices being followed. Conferences and publications rarely highlight lessons learned from failed initiatives, and a recent review by HIT experts failed to identify a single health system that regularly reviews HIT project failures. When will we recognize the value of examining failed HIT initiatives and, ultimately, learn how to do it right?

Simply put, lessons from successes are not sticky enough. Human tendencies to take shortcuts, combined with resource constraints, are likely contributors to our inability to learn from successes. Consider a typical scenario: a CMIO and CIO attend a presentation on a successful implementation of clinical decision support in an ambulatory care EHR. The presentation outlines the team, project plan, and technology behind the success. Returning to their health system, the CMIO and CIO put together a similar project plan for similar technology but realize they have only three-quarters of the staff the project requires. Thinking that their smaller team is close enough to the presented model, they push ahead with the project, which then fails due to inadequate staffing. If the CMIO and CIO had instead heard a presentation about how reducing the size of a clinical decision support team by 25% was a significant risk for project failure, would they have proceeded with their reduced team? Perhaps they would have found the full resources or changed the project plan or scope in order to succeed. Like the mountain climber who learns that a certain rope technique can lead to climbers falling and injuring themselves, the CMIO and CIO would be far more sensitized to the value of a fully staffed team when that value is illustrated by a failure rather than a success.

Health systems may soon be required to report adverse HIT events, especially those affecting patient safety. In 2009 and 2010, Senator Chuck Grassley sent letters to health systems, HIT vendors, and services firms requesting explanations about how they address “patient care and/or safety problems related to HIT” [6]. Both the US Congress and the Obama Administration could require further disclosure of these processes. In 2010, the American Medical Informatics Association (AMIA) published a position paper on the challenges and legal impediments health systems encounter when trying to openly report adverse events that involve HIT, especially when problems are directly related to technology design or configuration. AMIA made recommendations to address the problems, stating “patient safety should trump all other values” [7].

Should we, as an industry or field of study, wait for outside regulators or professional societies to examine HIT failures within our own organizations? How long will we remain unwilling to follow the “best practice” of learning from problems and adverse outcomes? The complexity and human factors associated with implementing technology have proven to be formidable barriers to the transformation that HIT could bring; the rapid and varied workflows in healthcare pose challenges to technologies that have demonstrated proven value in other industries. Sufficient technical expertise is not always available, or affordable, for HIT projects. Despite these challenges, we can succeed if we begin to recognize our failed efforts, learn from them, and optimize our resources and opportunities. Each and every health system, service firm, and HIT vendor has a responsibility to create a process that transparently identifies, tracks, and evaluates HIT failures and their adverse effects on patients and organizations.

First steps for every provider organization include organizing internal monthly conferences to review projects that failed and to identify critical lessons for both current and future initiatives. Health systems should involve all members of the HIT community in such discussions, including vendors and outside consultants; for some organizations, an easier place to start might be to restrict participation to internal personnel. Additionally, organizations should regularly review active projects for key warning signs of failure, just as a quality officer regularly reviews steps in manufacturing or health care delivery processes; create staff incentives for identifying problems and suggesting solutions before they cause adverse outcomes or complete project failure; and create hotlines for staff to report HIT problems anonymously, as most health systems already have for reporting other quality or safety issues.

Most importantly, federal funds for HIT should include programs that identify and track HIT failures, especially those that affect patient safety. Processes similar to the FDA’s tracking of medication safety issues could be applied, enabling lessons from failures to be shared across the US health system, including vendors and provider organizations. ‘Gag clauses’ should be prohibited in HIT vendor contracts, as Koppel and Kreda argued in a 2009 JAMA article, to enable open sharing of problems related to product design, implementation, and use [8]. Many HIT advocates fear regulation of HIT, but monitoring for failure and safety would provide direct value without requiring additional regulation of HIT’s development, implementation, or use. A first step would be for the ONC or CMS to mandate HIT failure reporting as a requirement for receiving meaningful use funds, and then to establish simple and efficient mechanisms for submitting such reports. HIT cannot succeed if we focus only on success and ignore the lessons of our failures. That would lead to the ultimate failure.

Conflict of Interest Disclosure

The author has no financial conflict of interest to disclose regarding this article.

Human Subjects Protections

No human subjects were involved in the development of this article.

References

