Journal of the American Medical Informatics Association (JAMIA). 2009 May-Jun;16(3):291–299. doi: 10.1197/jamia.M2997

Health IT Success and Failure: Recommendations from Literature and an AMIA Workshop

Bonnie Kaplan a,b,c, Kimberly D Harris-Salamone d
PMCID: PMC2732244  PMID: 19261935

Abstract

With the United States joining other countries in national efforts to reap the many benefits that use of health information technology can bring for health care quality and savings, sobering reports recall the complexity and difficulties of implementing even smaller-scale systems. Despite best practice research that identified success factors for health information technology projects, a majority, in some sense, still fail. Similar problems plague a variety of different kinds of applications, and have done so for many years. Ten AMIA working groups sponsored a workshop at the AMIA Fall 2006 Symposium. It was entitled “Avoiding The F-Word: IT Project Morbidity, Mortality, and Immortality” and focused on this under-addressed problem. Participants discussed communication, workflow, and quality; the complexity of information technology undertakings; the need to integrate all aspects of projects, work environments, and regulatory and policy requirements; and the difficulty of getting all the parts and participants in harmony. While recognizing that there still are technical issues related to functionality and interoperability, discussion affirmed the emerging consensus that problems are due to sociological, cultural, and financial issues, and hence are more managerial than technical. Participants drew on lessons from experience and research in identifying important issues, action items, and recommendations to address the following: what “success” and “failure” mean, what contributes to making successful or unsuccessful systems, how to use failure as an enhanced learning opportunity for continued improvement, how system successes or failures should be studied, and what AMIA should do to enhance opportunities for successes. The workshop laid out a research agenda and recommended action items, reflecting the conviction that AMIA members and AMIA as an organization can take a leadership role to make projects more practical and likely to succeed in health care settings.

Introduction

With the United States Congress appropriating more than U.S. $20 billion for health information technology (IT) as part of the Feb 2009 economic stimulus package, the United States joined other countries in national efforts to reap benefits that such technology can bring to health care quality and savings. Moreover, Medicare and private and commercial health plans are implementing a new paradigm for paying for health care services in the United States, known as Value-Based Purchasing (VBP), or pay for performance initiatives (P4P). Those initiatives rely heavily on the use of electronic health records to document the value of clinical services delivered. Tempering the fervor, though, are sobering reports that raise concerns about how the technology is designed and deployed. In Jan 2009, the United States National Research Council advised that nationwide deployment of health information technology would not achieve its goals unless it provided health care workers and patients with support for decision-making and problem-solving, thereby making health IT adoption all the more complex and daunting. 1 In the few weeks before passage of the United States stimulus package, troubles with Britain's National Health Service's move towards a nationwide electronic health records system were investigated by Parliamentary inquiries, 2 the Dutch minister for health announced that the national electronic health record deployment would be postponed despite having announced the roll-out almost three months earlier, and smart card introduction in Germany was seriously delayed. 3 In addition, the United States Joint Commission on Accreditation of Healthcare Organizations issued a Sentinel Alert in Dec 2008, which warned of technology-related adverse events. 4 These are reminders of the complexity and difficulties of implementing even smaller-scale health IT systems.

Despite an accumulation of best-practices research identifying success factors, IT implementation projects are often not successful. Across industry sectors, at least 40% of generic IT projects either are abandoned or fail to meet business requirements, while fewer than 40% of large systems purchased from vendors meet their goals. 5,6 Some sources report 70% failure rates. 7 Other studies show that as few as one in eight information technology projects is considered truly successful, with more than half overshooting budgets and timetables and still not delivering what was promised. 8 According to the 2006 CHAOS Report by The Standish Group, only 35% of IT projects were completed on time and on budget and met user requirements. Although that is more than double the 16.2% reported in the 1994 CHAOS Report, it still leaves about two-thirds of projects with significant problems, including 19% that “failed outright” (down from 31.1% in 1994). 9

The range of systems involved and variations in outcomes raise questions on how to define project “failure.” 10,11 A common definition in health care is that “[s]ignificant budget and timeline overruns, underdelivery of value, and the outright termination of a project before completion are all forms of failure.” 12 Regardless of definition and other methodological differences, the studies share a common finding: over half of IT projects do not deliver as they should, are over budget, or are late. 13 Since the 1990s, organizations such as The Standish Group International Inc (Boston); KPMG (Toronto); Gartner, Inc (Stamford, CT); and the Aberdeen Group (Boston) all repeatedly have pronounced IT project failure a serious problem. 13

Similar failure rates have been reported specifically for health IT. 14,15 Hospitals are among those organizations where delays and cancellations of software projects are endemic. 16 For years, problems have plagued the implementation of health IT applications, whether for ancillary services, for whole institutions, for regional or national systems, or for consumers. Today's problems are reminiscent of those analyzed since at least the 1970s in classic studies of hospital information and patient record systems. 17–19 In 1980, Dowling estimated that staff interfere with or sabotage “nearly half” of projects, 20 while Heeks noted in 2006 that it is his “best estimate that most HIS [health information systems] fail in some way.” 15

Recent studies and newspaper accounts cite difficulties in a variety of health information technology applications. Over the years, in many countries, patterns of severe problems repeatedly have beset a variety of efforts: hospital information systems and electronic records; 21–26 ambulance services; 27,28 community, regional, and National Health Information networks; 28–33 public health systems; 34,35 patient education; 36 and physician order entry. 18,19,37–41 The situation is even more disturbing when high-profile failures, partial successes, and unsustainable IT undertakings are coupled with accumulating evidence of negative unintended consequences, increased error rates accompanying IT use, and the need for workarounds. 42–49

Much is known about ways to reduce these difficulties, as evidenced by literature on project and change management, success factors, and ways to identify and address problematic issues in IT implementation in health care. As in other application areas in different sectors, problems have been longstanding, with researchers and practitioners addressing issues of project success since there were projects. 16,50–52 In health care, lessons learned and prescriptions for success have been available at least since the 1970s. 53 More recent papers include compilations of evaluation research findings, implementation and project management advice, and system success and failure stories in health care. 15,53–61

Management wisdom also has been encapsulated in writings by well-known health care IT executives and government bodies, 12,62–68 and the advice offered is much like that in other sectors. A 2007 study of 214 projects in a variety of sectors, including 18 health care projects, identified inadequate management practices as accounting for 65% of the factors associated with project failure. The remaining 35% were classified by the authors as due to technical factors, including poor or inappropriate requirements, design, development tools, user documentation, test planning, and technical support, 8 all arguably management issues as well. According to the IT executive managers surveyed for the 1994 CHAOS Report, the three major reasons for project success are user involvement, executive management support, and a clear requirements statement, while lack of these constituted the main reasons for project challenges, impairments, and cancellations. 69 Their recipe for project success remained much the same in 2001: executive support, user involvement, an experienced project manager, clear business objectives, and minimized scope. 70

However, despite important similarities, health care differs in significant ways from other sectors. In healthcare IT implementation, systems need well-defined standards for interoperability and terminologies, and they must comply with legal requirements. Health IT systems must generate quality reports for a variety of different health plans. In addition, such systems must be flexible enough to support organizations ranging from solo practitioner offices to national integrated delivery networks. Ideally these systems also improve workflow, reduce cost, and improve quality of care, all the while maintaining long-standing beneficial patterns of communication, collaboration, and healthcare delivery. 71 While recognizing that there still are technical issues related to functionality and interoperability, a consensus is emerging that problems with health care IT projects, as in other sectors, 13,16 are due to sociological, cultural, and financial issues, and hence are more managerial in nature than technical. For some years, it has been recognized that system success requires a mix of organizational, behavioral, cognitive, and social factors; well-developed methods for design and dissemination; and early determination of who defines “success” and when that determination is made. 53

There have been some published research reports of healthcare IT failures, including a few systematic and thoughtful publications describing lessons learned from IT interventions that had null, negative, or disappointing outcomes. 27,53 Despite calls for increased research, there are still too few published research reports of health care IT failures, removals, sabotage of systems, or how failures became successes or were otherwise redefined. As in other sectors, 69 IT-related failures in health care often are covered up, ignored, or rationalized, so mistakes are repeated. The same barriers and problems in health IT have been identified over the years. 72 They parallel those in other sectors in attributing problems to actors and circumstances outside of management's or informaticians' control. 21,73 One result is alarming headlines when high-profile health IT failures adversely affect patient care or when well-known institutions suspend their systems or halt their development due to physician protest, extreme overspending, errors, and delays. 26,32,37,41,74,75 Less sensational, but certainly serious, are studies of health care computer applications that cause errors through poor design and management. 43,76,77 Significantly, the United States Joint Commission on Accreditation of Healthcare Organizations recognized this problem and issued a Sentinel Alert recommending good management practices to help prevent patient harm through technology-related adverse events. 4

Sensational headlines and studies of systems causing errors have both surprised and dismayed the medical informatics community. The many success stories over the years make sometimes less-than-informed mass media reporting of project failures all the more disappointing and problematic. Such reports produce reactions that are costly, both financially and in terms of the benefits that information technology could bring for improving health care. Health care informatics projects are extremely complex, yet their benefits are manifold if the risks of failure are minimized. Multiple stakeholders share an interest in supporting the implementation of health information technology. The United States Congress has passed incentive packages, the Centers for Medicare and Medicaid Services (CMS) have put considerable effort into Pay for Performance initiatives, and electronic health record vendors, health care payers, and providers all are interested parties. With the Obama administration's emphasis on rapid implementation of health IT, issues of failure are all the more acute.

Workshop Development

With years of practical experience and research, and with increasing national and international pressure for health IT, the continued prevalence of project failure leads to questions of how to increase the success rate of IT systems implementations. The topic inspired a lively listserv discussion among members of AMIA working groups. The first author, who at the time chaired the IMIA Working Group (WG) on Organizational and Social Issues, realized that with widespread interest in the topic and considerable experience and wisdom in the AMIA membership, a meeting could continue the discussion and enable participants to learn from each other. As a result, ten working groups cosponsored a workshop at the AMIA Fall 2006 Symposium to examine why health IT implementations and applications do not meet the expectations held for them and what might be done to improve the situation. Entitled “Avoiding The F-Word: IT Project Morbidity, Mortality, and Immortality”, the session was devoted to better defining and characterizing reasons for “success” and “failure.”

Presenters representing the sponsoring WGs are listed in Table 1. In addition, J. Michael Fitzmaurice of the Agency for Healthcare Research and Quality (AHRQ) also spoke at the workshop. Table 2 lists the issues framing their comments. After their remarks, over fifty participants broke into smaller groups to continue the discussion and to develop sets of important issues, action items, and recommendations.

Table 1. Workshop Presenters (affiliations at time of workshop)

Moderator
Bonnie Kaplan, PhD, Yale Center for Medical Informatics, Yale University, New Haven, CT
Department of Biomedical and Health Information Sciences, University of Illinois—Chicago, Chicago, IL, and Kaplan Associates, Hamden, CT—Chair, International Medical Informatics Association Working Group on Organizational and Social Issues
Recorder
Kimberly D. Harris-Salamone, PhD, Director, Physician Office Quality, Health Services Advisory Group, Phoenix, AZ—Chair, AMIA People and Organizational Issues Working Group
Clinical Information Systems Working Group
Scot M. Silverstein, MD, Director, Institute for Healthcare Informatics, College of Information Science and Technology, Drexel University, Philadelphia, PA
Consumer Health Informatics Working Group
Rita Kukafka, DrPH, MA, Track Director, Public Health Informatics Specialization, Departments of Biomedical Informatics and Sociomedical Sciences, Mailman School of Public Health, Columbia University, New York, NY
Ethical, Legal and Social Issues Working Group
Robert Hsiung, MD, Department of Psychiatry, University of Chicago, Chicago, IL
Evaluation Working Group
Nicolette de Keizer, PhD, Department of Medical Informatics, Academic Medical Center, University of Amsterdam, Amsterdam, The Netherlands
Medical Imaging Systems Working Group
Melvyn Greberman, MD, MS, MPH, FACPM, President, Public Health Resources, LLC, helped in organizing the workshop and reviewing the draft paper. Unfortunately, the representative from this working group was unable to attend
Nursing Informatics Working Group
Linda Dietrich, MSN, RN, Dearborn Advisors, LLC, Chicago, IL
Pharmacoinformatics Working Group
Sandi Mitchell, RPh, MSIS, Program Director, Pharmacy Informatics Residency, Medication Use Team, The Johns Hopkins Hospital, Baltimore, MD
People and Organizational Issues Working Group
Jos Aarts, PhD, Institute of Health Policy and Management, Erasmus University Medical Center, Rotterdam, The Netherlands
Prevention and Public Health Working Group
Paul Fu, Jr, MD, MPH, Department of Pediatrics and Health Services, David Geffen School of Medicine at UCLA and UCLA School of Public Health; and CIO/CMIO, Los Angeles County Department of Health Services, Los Angeles, CA
Agency for Healthcare Research and Quality
J. Michael Fitzmaurice, PhD, FACMI, Senior Science Advisor for Information Technology, Office of the Director, AHRQ, Rockville, MD

Table 2. Workshop Questions and Research Agenda

  • 1 What does “success” or “failure” mean?
    • 1.1 Is there such a thing as “failure”, or, for that matter, “success”? Is “failure” an inevitable part of achieving success?
    • 1.2 Is failure simply not meeting specifications or goals?
    • 1.3 Is it possible to develop objective measures or criteria for success or failure?
    • 1.4 Can a taxonomy of success factors be developed?
    • 1.5 Is a system “successful” if it supports a “dysfunctional” environment? Can IT succeed in the current health care environment?
    • 1.6 Is one's success another's failure? How can these differences be reconciled?
    • 1.7 Is the success/failure dichotomy the most helpful way to think about systems?
  • 2 Are drug models applicable to information systems studies?
    • 2.1 Can clinical information systems be “scientific based entities” which, like medicines, can be shown to be effective and to have well-defined and manageable side-effects?
    • 2.2 Are there fail-safe approaches to improving outcomes?
  • 3 What contributes to the making of successful or unsuccessful systems? Is it possible to identify objective factors that determine a project's survival? What makes a system healthy and sustainable?

  • 4 How can we make failure an enhanced learning opportunity for continued improvement?
    • 4.1 What have we learned from our 50 years of experience in medical informatics?
    • 4.2 What should we have learned but do not seem to have learned? And why have we not learned it?
      • 4.2.1 Why do the same issues keep arising?
      • 4.2.2 How can we address the combination of publication bias, avoiding embarrassment, project promotion, and ideology that contribute to uneasiness in reporting or hearing studies that challenge the prevailing ethos?
  • 5 What can we learn from experiences outside health care?

  • 6 How should systems successes or failures be studied?
    • 6.1 Would a repository of cases (systems morgue) be helpful? How should the cases be collected and analyzed? What would make the cases most useful?
  • 7 What ethical, legal, social, regulatory, and policy issues need to be considered in relation to “failure”?

  • 8 What should AMIA do to enhance opportunities for successes?

The following workshop report is based on notes the second author, then chair of the AMIA People and Organizational Issues WG, kept during the entire workshop and displayed via a projector for all participants to see in real time. The first author analyzed the second author's notes together with additional notes taken by one of the attendees, reviewed the literature, identified themes, and produced a draft of what was to become this paper. This draft was sent to all presenters, and their comments were incorporated into this paper. What follows reflects what was said at the workshop.

Workshop Themes

Three themes characterized the workshop discussion, as summarized in Table 3:

  • • what “success” is

  • • what makes it so hard—communication, workflow, and quality

  • • what we know—lessons from experience

Table 3. Workshop Themes

  • • what “success” is: there are different ideas and definitions of success. We need more understanding of different stakeholder views and more longitudinal and qualitative studies of failure.

  • • what makes it so hard—communication, workflow, and quality: difficulties in communicating across different groups make it harder to identify requirements and understand workflow.

  • • what we know—lessons from experience: provide incentives, remove disincentives; identify and mitigate risks; allow resources and time for training, exposure, and learning to input data; learn from the past and from each other.

Common threads cross-cutting these three themes were:

  • • the complexity of IT undertakings,

  • • the need to integrate all aspects of projects, work environments, and regulatory and policy requirements; and

  • • the difficulty of getting all the parts and participants in harmony.

What “Success” Is

Many comments concerned the complexity of both large-scale projects and the clinical environment. Participants indicated that this makes implementation very difficult because it is not only a technical process, but also a social one replete with interprofessional collaboration, the need for top management understanding, and professional and terminological differences. Success may be defined as simply getting the application or system turned on, getting people to use it, and getting at least grudging acceptance, with the caveat that grudging acceptance can turn to non-acceptance. It might entail no more than offering users “small successes.” Problems are compounded in that what works for one group, such as pharmacists, may not work for another group, such as nurses, and those who gain may not be those who actually do the work. For these reasons, there is little agreement about what “success” or “failure” is. As an audience member put it, “failure is in the eye of the stakeholder.”

Participants spoke about the need for more teaching and research—especially longitudinal and qualitative studies—addressing what failed, why it failed, how an institution tries to turn itself around after failure, and what to do differently the next time.

What Makes It So Hard—Communication, Workflow, and Quality

Participants emphasized that communication and workflow issues add to project complexity. Health care requires collaboration, as does system implementation, yet there is difficulty in translating among specialties, stakeholders, clinicians, and implementers, sometimes to the point of a seeming “culture clash.” Related to these communication challenges is the difficulty of identifying requirements for the various groups involved. Those gathering requirements may not involve all the necessary people within an organization, or those involved may not know how to communicate their requirements effectively. Some projects are undertaken for reasons other than a need for the project: because requirements come down from the top, or because the project was simple to do, or because developers like the people who want the project. Participants described the difficulty in fully understanding workflow, as evidenced by workflow changes resulting in endless workarounds. Sometimes this was due to the inability of those doing the work to articulate what they do or need; sometimes to senior management or IT not understanding the clinical environment or workflow, or not agreeing on what needs to be done; and sometimes to not providing sufficient or meaningful incentives to change. By contrast, participants also described projects that went well because they made the workflow easier. Others emphasized that quality issues also need to be considered, especially in light of the importance of administrative and clinical data reporting for Pay for Performance initiatives. Administrative and quality indicators related to workflow, therefore, also need to be incorporated into policies and procedures, thereby further adding to project complexity.

What We Know—Lessons from Experience

Participants drew lessons from their research and experiences on how management might improve project success. These included:

  • • provide incentives, remove disincentives

Users may perceive that they have no time, or that what they are being asked to do moves work to them and away from others. Physicians, for example, would be more engaged if they experienced applications that helped them directly rather than providing disincentives to adopt the system. As an incentive, for example, physicians could get rounds done more easily if patient lists were ready when shifts begin.

  • • identify and mitigate risks

Determine the social risks, the IT risks, the leadership risks, the user risks, etc, and consider them early and often during the project. These risks and possible ways to mitigate them should become part of new or existing policies and procedures pertaining to the new system and incorporated into training.

  • • allow resources and time for training, exposure, and learning to input data

Participants described systems where clinicians had never used a keyboard or had exposure to computers, yet training was very limited. Sufficient training and time to learn need to be part of the implementation, and need to be ongoing afterward.

  • • learn from the past and from others

Participants spoke of the need for studies of successes, failures, and how failing situations were turned around. They suggested longitudinal studies, qualitative studies, more focus on health care teams as a whole, and incorporating insights from change management, diffusion of innovation and technology, social science and sociotechnical theory, and multilevel frameworks. Although participants suggested drawing on existing theories and knowledge and also incorporating project management and methodology issues, they advised caution when doing so because of differences between health care and the business settings where models were developed. There also were calls for measurable evidence, including evidence of publication bias concerning project failure, and for various databases to be created (see below).

AMIA Action Recommendations

The workshop concluded with reports from break-out groups charged with discussing ideas for how AMIA could address health informatics failure. Break-out groups made suggestions concerning:

  • • research and publication

  • • best practices

  • • advocacy

  • • education

  • • certification

  • • databases and knowledge integration

These are summarized in Table 4 and described more fully here.

Table 4. AMIA Action Recommendations

  • • Research and Publication

    Support and publish qualitative and longitudinal studies of all project phases in addition to outcomes for a variety of applications, including for failed projects

  • • Best Practices

    Create databases and an AMIA White Paper translating general principles into practice

  • • Advocacy

    Advocate for regulatory changes to facilitate using best practices of health IT

  • • Education

    Develop curriculum on project management and organizational issues to maximize success

  • • Certification

    Partner with certifying bodies to include guidelines for better health IT development and use

  • • Databases and Knowledge Integration

    Develop repositories (database, blog) for project histories and outcomes and for best practices

Research and Publication

Participants recognized that the questions framing the workshop, listed in Table 2, constitute a research agenda. As indicated above, they recommended addressing these issues through more qualitative and longitudinal evaluations, including examining teams and also different views of “success.” In addition, groups called for further studying underlying processes throughout the life cycle, interface and workflow issues, and how organizations turned around after “failure.” Participants recognized the importance of these issues for large systems and organizations as well as for office practices. They also called for a JAMIA feature or an AMIA blog, with ideas for how to integrate this with existing databases (see below).

Best Practices

Participants recommended identifying best practices and suggested that AMIA produce a White Paper on best practices for health information technology projects. The scope should cover system design, development (including development models and iterative practices), implementation, change management, intuitive interfaces appropriate for clinical settings, help systems, how to identify all stakeholders and ensure a common vision among them, workflow and process redesign, and providing benefits (or, as one break-out group put it, getting “the most bang for the buck” and addressing “the pain points”).

However, it also was noted that while AMIA could make recommendations, much already is known about these areas from health informatics research, as well as from research in other domains. Nevertheless, participants noted that it can be “hard to translate general principles into practice in actual organizational settings … [because] the context can be very different across organizations.” Therefore, we need more translational research studies that explicitly explore the effects of context on implementation of IT innovations. Further, studies, databases, and examples are important not only for identifying general principles, but also for showing how they work in practice in particular settings. Such information would help people gain familiarity with how to pull together regulations, workflow, policies, and IT practices in comprehensive ways that make them easier to apply in particular health care settings.

Advocacy

Participants suggested that AMIA not only advocate for best practices, but also participate in the regulatory process. For example, AMIA could point out difficulties related to privacy issues concerning access to on-line patient information (such as getting lists of patients' room locations) and ways in which the HIPAA privacy legislation impedes workflow. AMIA also could work with such agencies as the Office of the National Coordinator for Health Information Technology (ONCHIT) or the Food and Drug Administration in institutionalizing best practices. Standards for alerting and for interoperability also are possible areas for advocacy.

Education

Participants called for developing informatics curricula for both students and professionals. They pointed to the need for core curriculum in medical informatics that would include project management, implementation, and other topics addressed by the workshop. Another idea was to design curricula around actual projects. Participants suggested more training in executive leadership. They further suggested that AMIA partner with professional organizations, the Centers for Medicare and Medicaid Services (CMS), and other entities pushing for Pay for Performance, to develop and promote curricula on best practices and lessons learned. In addition, AMIA working groups might work with the Education Working Group to help develop curricula especially relevant to implementation issues.

Certification

AMIA could work with various certification agencies, such as the Certification Commission for Health Information Technology (CCHIT), to develop guidelines that promote better development, implementation, and use. This might be aided by a new AMIA working group on software development and certification processes.

Databases and Knowledge Integration

Underlying these ideas was the belief that existing knowledge and experience should be collected, integrated, and available for analysis. Participants called for databases of best practices, vendor implementation services, and a case study repository. This repository would be similar to the Healthcare Information Management Systems Society (HIMSS) database of United States hospitals and the systems implemented, but would also include what workflow adjustments were needed or how problems and potential difficulties were addressed. Participants thought that even less formally structured repositories, such as an AMIA blog, would be useful, for sharing both “success” and “failure” examples.

Conclusions

Much has been learned about success and failure in IT implementation, but we need to understand more. There are legal issues when a system “fails”, including just what constitutes “failure.” There are social issues, ranging from how such failures affect various groups and health informatics as a whole (including possible policy and regulatory reactions), to the social aspects of what makes for a “successful” implementation. Finally, there are ethical issues involved in evaluating system “success” or in not sufficiently attending to previously identified success factors and best practices. 24 Most “failures” are failures to properly apply managerial wisdom that has been substantiated by research and experience. Perhaps the worst aspect of failure is failure to learn from past experiences, so the same issues and problems are perpetuated.

Ten AMIA working groups (Table 1), together with the IMIA WG on Organizational and Social Issues, and over fifty other individuals contributed to the workshop. Participants discussed communication, workflow, and quality; the complexity of IT undertakings; the need to integrate all aspects of projects, work environments, and regulatory and policy requirements; and the difficulty of getting all the parts and participants in harmony (Table 3). They addressed what “success” and “failure” mean, what contributes to making successful or unsuccessful systems, how to use failure as an enhanced learning opportunity for continued improvement, how system successes or failures should be studied, and what AMIA should do to enhance opportunities for successes.

The proposed research agenda (Table 2) and recommended action items (Table 4) reflect the conviction that AMIA members and AMIA as an organization can take a leadership role to make projects more likely to succeed in health care settings. Action items address curriculum development, advocacy in the regulatory process, and documenting and disseminating best practices based on both research and learning from experience.

AMIA has been active in some of these areas, but more could be done. Though the workshop affirmed accumulated wisdom concerning best practices, its call for more research and repositories of lessons learned recognizes that tools and prescriptions for success need empiric validation and that failures need to be studied to appropriately change practice. 27 Participants joined with others in challenging dominant approaches to project management and evaluation. The alternative approach favors more nuanced and broader views of project leadership that include complex intertwined relationships, multi-faceted analyses, political and stakeholder issues, institutional and cultural realities, sensitivity to who benefits and who does not, and different views of what constitutes “success.” 8,10,11,15,27,35,53,68,78 In a time of increasing Pay for Performance, pressure for electronic health records, integration across systems, massive expenditures for national health IT programs, and flux in the health care system, workshop participants urged that AMIA and its members take a proactive and applied perspective that addresses the complexities of health informatics projects to realize the potentials of informatics for improving health.

Acknowledgments

The authors are grateful for the helpful comments by Jos Aarts, Judith Effken, Paul Fu, Melvyn Greberman, and Scot Silverstein, and for the additional session notes provided by Yunan Chen, Drexel University. Our thanks, too, to H. Dominic Covvey, who gave the title to the workshop, and to the many listserv participants and workshop attendees who contributed to the discussion.

References

  • 1. National Research Council. Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions. Washington, DC: National Academies; 2009.
  • 2. House of Commons Public Accounts Committee. The National Programme for IT in the NHS: Progress Since 2006. London: The Stationery Office Ltd; 2009 Jan 27.
  • 3. International Council on Medical and Care Compunetics (ICMCC). Dutch nationwide EHR postponed. Are they in good company? http://articles.icmcc.org/2009/01/23/dutch-ehr-postponed-are-they-in-good-company/. Accessed: Jan 28, 2009.
  • 4. Joint Commission on Accreditation of Healthcare Organizations (JCAHO). Safely implementing health information and converging technologies. Sentinel Event Alert, December 11, 2008. http://www.jointcommission.org/SentinelEvents/SentinelEventAlert/sea42.htm. Accessed: Dec 23, 2008.
  • 5. Booth R. Project failures costly, TechRepublic/Gartner study finds. http://articles.techrepublic.com.com/5100-10878_11-1062043.html. Accessed: Dec 29, 2008.
  • 6. ITCortex. Failure rate: Statistics over IT projects failure rate. http://www.it-cortex.com/Stat-Failure_Rate.htm. Accessed: Aug 4, 2008.
  • 7. Lewis B. The 70-percent failure. InfoWorld. http://infoworld.com/articles/op/xml/01/10/29/011029/opsurvival.xml. Accessed: Dec 30, 2008.
  • 8. McManus J, Wood-Harper T. Understanding the sources of information systems project failure. Manag Serv 2007;Autumn:38-43.
  • 9. Rubinstein D. Standish Group Report: There's Less Development CHAOS Today. SDTimes, vol 1, March 2007. http://www.sdtimes.com/content/article.aspx?ArticleID-30247. Accessed: Aug 4, 2008.
  • 10. Berg M. Implementing information systems in health care organizations: Myths and challenges. Int J Med Inf 2001;64(2–3):143-156.
  • 11. Fitzgerald G, Russo NL. The turnaround of the London ambulance service computer-aided despatch system. Eur J Inf Syst 2005;14:244-257.
  • 12. Glaser J. More on management's role in IT project failure. Healthc Financ Manage 2005;59:82-89. http://findarticles.com/p/articles/mi_m3257/is_1_59/ai_n8706921/pg_1?tag=artBody. Accessed: Jan 1, 2009.
  • 13. Tichy L, Bascom T. The business end of IT project failure. Mortgage Banking. http://findarticles.com/p/articles/mi_hb5246/is_6_68/ai_n29421069?tag=content;col1. Accessed: Dec 29, 2008.
  • 14. Wears RL, Berg M. Computer technology and clinical work: Still Waiting for Godot. J Am Med Assoc 2005;293(10):1261-1263.
  • 15. Heeks R. Health information systems: Failure, success and improvisation. Int J Med Inf 2006;75:125-137.
  • 16. Jones C. Patterns of Software Systems Failure and Success. London: International Thompson Computer Press; 1996.
  • 17. Lundsgaarde HP, Fischer PJ, Steele DJ. Human Problems in Computerized Medicine. Lawrence, KS: University of Kansas; 1981.
  • 18. Massaro TA. Introducing physician order entry at a major academic medical center. 1: Impact on organizational culture and behavior. Acad Med 1993;68(1):20-25.
  • 19. Massaro TA. Introducing physician order entry at a major academic medical center. 2: Impact on medical education. Acad Med 1993;68(1):25-30.
  • 20. Dowling AF. Do hospital staff interfere with computer system implementation? Health Care Manage Rev 1980;5:23-32.
  • 21. Brown AD, Jones MR. Doomed to failure: Narratives of inevitability and conspiracy in a failed IS project. Organ Stud 1998;19(1):73-88.
  • 22. Sicotte C, Denis JL, Lehoux P. The computer based patient record: A strategic issue in process innovation. J Med Syst 1998;22(6):431-443.
  • 23. Sicotte C, Denis JL, Lehoux P, Champagne F. The computer-based patient record: Challenges towards timeless and spaceless medical practice. J Med Syst 1998;22(4):237-256.
  • 24. Timpka T. Professional ethics for system developers in health care. Methods Inf Med 1999;38(2):144-147.
  • 25. Scott JT, Rundall TG, Vogt TM, Hsu J. Kaiser Permanente's experience of implementing an electronic medical record: A qualitative study. BMJ 2005;331:1313-1316.
  • 26. Costello D. Head of Kaiser's digital project quits. Los Angeles Times, November 8, 2006. http://doctorandpatient.blogspot.com/2006/11/head-of-kaiser-digital-project-quits.html. Accessed: Dec 30, 2008.
  • 27. Beynon-Davies P. Information systems “failure”: The case of the London ambulance service's computer aided despatch project. Eur J Inf Syst 1995;4:171-184.
  • 28. Beynon-Davies P, Lloyd-Williams M. When health information systems fail. Top Health Inf Manage 1999;19(4):66-79.
  • 29. Hagland M. From struggles to success. Part technology, part cooperation and part good old fashioned trial and error are what it takes to build—or break—a RHIO. Healthc Inform 2007;34:36-37.
  • 30. DerGurahian J. Patient Safety Institute folds due to lack of funding. Mod Healthc 2007, October 29. Available at: http://www.modernhealthcare.com/apps/pbcs.dll/article?AID-/20071029/FREE/31029002//0/mostreadmonth (Accessed: Jan 1, 2009) and http://www.modernhealthcare.com/apps/pbcs.dll/article?AID-/20071029/FREE/31029002/1029/FREE (Accessed: Aug 30, 2008).
  • 31. Adler-Milstein J, McAfee AP, Bates DW, Jha AK. The state of regional health information organizations: Current activities and financing. Health Aff (Millwood) 2008;27(1):w60-w69.
  • 32. Bloxham A. £13 billion NHS computer system failures affecting patient care. Telegraph, August 10, 2008. http://www.telegraph.co.uk/news/uknews/2535099/13-bn-NHS-computer-system-failures-affecting-patient-care.html. Accessed: Dec 29, 2008.
  • 33. Another government IT scheme in trouble. Telegraph, December 20, 2008. http://www.telegraph.co.uk/comment/telegraph-view/3563090/Another-Government-IT-scheme-in-trouble.html. Accessed: Dec 20, 2008.
  • 34. Southon F, Sauer C, Dampney C. Information technology in complex health services: Organizational impediments to successful technology transfer and diffusion. J Am Med Inform Assoc 1997;4:112-124.
  • 35. Wells S, Bullen C. A near miss: The importance of context in a public health informatics project in a New Zealand case study. J Am Med Inform Assoc 2008;15(5):701-704.
  • 36. van't Riet A, Berg M, Hiddema F, Kees S. Meeting patients' needs with patient information systems: Potential benefits from qualitative research methods. Int J Med Inf 2001;64:1-14.
  • 37. Williams LS. Microchips versus stethoscopes: Calgary Hospital, MDs face off over controversial computer system. Can Med Assoc J 1992;147(10):1534-1547.
  • 38. Chin T. Doctors pull plug on paperless system. amednews.com, Feb 13, 2003. Available at: http://www.ama-assn.org/amednews/2003/02/17/vil20217.htm. Accessed: Dec 29, 2008.
  • 39. Aarts J, Doorewaard H, Berg M. Understanding implementation: The case of a computerized physician order entry system in a large Dutch university medical center. J Am Med Inform Assoc 2004;11(3):207-216.
  • 40. Aarts J, Berg M. Same system, different outcomes. Methods Inf Med 2006;45:53-61.
  • 41. Ornstein C. Hospital heeds doctors, suspends use of software. Los Angeles Times, January 22, 2003, Sect. B-1. http://articles.latimes.com/2003/jan/22/local/me-cedars22. Accessed: Dec 30, 2008.
  • 42. Ash J, Berg M, Coiera EW. Some unintended consequences of information technology in health care: The nature of patient care information system-related errors. J Am Med Inform Assoc 2004;11(2):104-112.
  • 43. Koppel R, Metlay JP, Cohen A, et al. Role of computerized physician order entry systems in facilitating medication errors. J Am Med Assoc 2005;293(10):1197-1203.
  • 44. Campbell E, Sittig D, Ash J, Guappone K, Dykstra R. Types of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc 2006;13(5):547-556.
  • 45. Wetter T. To decay is system: The challenges of keeping a health information system alive. Int J Med Inf 2007;76S:S252-S260.
  • 46. Harrison MI, Koppel R, Bar-Lev S. Unintended consequences of information technologies in health care—An interactive sociotechnical analysis. J Am Med Inform Assoc 2007;14(5):542-549.
  • 47. Silverstein S. Sociotechnologic issues in clinical computing: Common examples of healthcare IT difficulties. http://www.ischool.drexel.edu/faculty/ssilverstein/failurecases/?loc=home. Accessed: Aug 4, 2008.
  • 48. Vogelsmeier AA, Halbersleben JRB, Scott-Cawiezell JR. Technology implementation and workarounds in the nursing home. J Am Med Inform Assoc 2008;15(1):114-119.
  • 49. Koppel R, Wetterneck T, Telles JL, Karsh B-T. Workarounds to barcode medication administration systems: Their occurrences, causes, and threats to patient safety. J Am Med Inform Assoc 2008;15(4):408-423.
  • 50. Brooks FP. The Mythical Man-Month: Essays on Software Engineering. Reading, MA: Addison-Wesley; 1975.
  • 51. Glass R. The Universal Elixir, and Other Computing Projects Which Failed. Seattle: Computing Trends; 1977.
  • 52. Sauer C. Why Information Systems Fail: A Case Study Approach. Oxfordshire, UK: Alfred Waller; 1993.
  • 53. Kaplan B, Shaw N. Future directions in evaluation research: People, organizational, and social issues. Methods Inf Med 2004;43(3–4):215-231.
  • 54. Lorenzi N, Riley RT. Managing change: An overview. J Am Med Inform Assoc 2000;7(2):116-124.
  • 55. van der Meijden MJ, Tange HH, Troost J, Hasman A. Determinants of success of inpatient clinical information systems: A literature review. J Am Med Inform Assoc 2003;10(3):235-243.
  • 56. Brender J, Ammenwerth E, Nykänen P, Talmon J. Factors influencing success and failure of health informatics systems—A pilot Delphi study. Methods Inf Med 2006;45(1):125-136.
  • 57. Goldfinch S. Pessimism, computer failure, and information systems development in the public sector. Pub Adm Rev 2007;67(Sept/Oct):917-929.
  • 58. Ammenwerth E, de Keizer N. A web-based inventory of evaluation studies in medical informatics 1982–2002. http://evaldb.umit.at. Accessed: Aug 4, 2008.
  • 59. European Federation of Medical Informatics. Bad health informatics can kill. http://iig.umit.at/efmi/badinformatics.htm. Accessed: Feb 8, 2009.
  • 60. Paré G, Sicotte C, Jaana M, Girouard D. Prioritizing the risk factors influencing the success of clinical information systems. Methods Inf Med 2008;47(3):251-259.
  • 61. Ash JS, Anderson NR, Tarczy-Hornoch P. People and organizational issues in research systems implementation. J Am Med Inform Assoc 2008;15(3):283-289.
  • 62. Ash J, Stavri PZ, Kuperman GJ. A consensus statement on considerations for a successful CPOE implementation. J Am Med Inform Assoc 2003;10:229-234.
  • 63. Kitch P, Yasnoff WA. Managing IT personnel and projects. In: O'Carrol PW, Yasnoff WA, Ripp LH, Martin EL, editors. Public Health Informatics and Information Systems. New York: Springer; 2003. pp. 159-178. http://books.google.com/books?id=ap6uCR0Ybo4C&pg=PA66&lpg=PA66&dq=yasnoff+health+IT+success&source=bl&ots=B5PzENd9QC&sig=3mSlWnrSFdrsYY2ZnNCrozgTIRI&hl=en&sa=X&oi=book_result&resnum=1&ct=result#PPA59,M1. Accessed: Dec 29, 2008.
  • 64. Glaser J. Management's role in IT project failure. Healthc Financ Manage 2004, October. http://findarticles.com/p/articles/mi_m3257/is_10_58/ai_n6274067/pg_1?tag=artBody. Accessed: Jan 1, 2009.
  • 65. Common Causes of Project Failure. London: Office of Government Commerce; 2005. http://www.ogc.gsi.gov.uk/documents/cp0015.pdf. Accessed: Dec 29, 2009.
  • 66. AHRQ. Health IT adoption toolbox. http://healthit.ahrq.gov/portal/server.pt?open=512&objID-1077&cached=true&mode=28userID=7330. Accessed: Jan 1, 2009.
  • 67. Lorenzi NM, Novak LL, Weiss JB, Gadd CS, Unertl KM. Crossing the implementation chasm: A proposal for bold action. J Am Med Inform Assoc 2008;15(3):290-296.
  • 68. Leviss J, Keenan G, Ozeran L, et al., editors. Implementation Lessons Learned. Chicago: American Health Information Management Association; 2009.
  • 69. The Standish Group. CHAOS report, 1995. http://net.educause.edu/if/library/pdf/NCP08083B.pdf. Accessed: Dec 29, 2008.
  • 70. The Standish Group. Extreme chaos, 2001. http://www.quarrygroup.com/wp-content/uploads/art-standishgrou-CHAOS0report.pdf. Accessed: Dec 29, 2008.
  • 71. Avison D, Young T. Time to rethink health care and ICT? Commun ACM 2007;50(6):69-74.
  • 72. Kaplan B. The medical computing “lag”: Perceptions of barriers to the application of computers to medicine. Int J Technol Assess Health Care 1987;3(1):123-136.
  • 73. Kaplan B. The Computer Prescription: Medical Computing, Public Policy and Views of History. Sci Tech Human Values 1995;20(1):5-38.
  • 74. Rosencrance L. Kaiser CIO Cliff Dodd resigns. Computerworld 2006, November 9. http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleID=9004940. Accessed: Dec 30, 2008.
  • 75. Rosencrance L. Problems abound for Kaiser e-health records management system. Computerworld 2006, November 13. http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleID=9005004. Accessed: Dec 30, 2008.
  • 76. Han Y, Carcillo J, Venkataraman S, et al. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics 2005;116(6):1506-1512.
  • 77. Del Beccaro MA, Jeffries HE, Eisenberg MA, Harry ED. Computerized provider order entry implementation: No association with increased mortality rates in an intensive care unit. Pediatrics 2006. doi: 10.1542/peds.2006-0367. Accessed: Jul 3, 2006.
  • 78. Wetter T. Lessons learned from bringing knowledge-based decision support into routine use. Artif Intell Med 2002;24(3):195-203.
