In the eight months since we put out the call for papers for this special issue of the BMJ devoted to medical errors, the landscape has changed considerably. In Britain the Bristol Inquiry has continued to focus professional and public attention on patient safety in a manner unprecedented both for its depth and for the extent of professional involvement.1 In the United States the recent publication of the report To Err Is Human by the Institute of Medicine of the National Academy of Sciences2 received extraordinary media coverage as well as prompt responses to its recommendations from the President and Congress.3
The error prevention “movement” has clearly accelerated. As the papers in this issue bear witness, major changes are occurring in the way we think about and carry out our daily work. For practising physicians, some of the ideas and practices described here may be mind bending, or at least mind stretching. But most of the insights and solutions will, we think, have resonance for all those who strive to provide safe care for patients. All physicians, after all, have had the unwelcome experience of becoming what Wu calls “the second victim,” being involved in an error or patient injury and feeling the attendant sense of guilt or remorse as responsible professionals.4 Familiar, too, are Helmreich's findings that doctors, like pilots, tend to overestimate their ability to function flawlessly under adverse conditions, such as under the pressures of time, fatigue, or high anxiety.5
Some of the solutions reported here are as simple as teaching emergency room doctors to read x ray films6; others require substantial capital investment.7 The new world of automation described by Bates and by Gaba seems ever closer,8,9 and, although every new technology will inevitably introduce new forms of error, it is high time for medicine to enter the computer age. We should now hope that the death knell has at last been sounded for the handwritten paper prescription; and the paper medical record, a dinosaur long overdue for extinction, may at last be en route to replacement by far more useful and reliable automated systems.
But, several of these authors warn us, making the more fundamental and lasting changes that will have a major impact on patient safety is much more difficult than simply installing new technologies. There are no “quick fixes.” We must re-examine all that we do and redesign our many and complex systems to make them less vulnerable to human error.10,11 The necessary changes are as much cultural as technical. Creating a culture of safety requires attention not only to the design of our tasks and processes, but to the conditions under which we work—hours, schedules and workloads; how we interact with one another; and, perhaps most importantly, how we train every member of the healthcare team to participate in the quest for safer patient care.
We have already learnt a great deal from the early experiences of error reduction in healthcare organisations. Firstly, we have discovered an immense reservoir of creativity and motivation among healthcare workers of all kinds. When given the opportunity to help, when the barriers of shame and punishment are removed, doctors, nurses, pharmacists, and others eagerly work to improve safety, implementing best practices or developing new ones.
Secondly, we have learnt again that leadership is an essential ingredient of success in the search for safety, as it is throughout the enterprise of quality improvement. In the absence of commitment from professional and organisational leaders, efforts will be fragmentary and uncoordinated and will have only minor effects. We need leadership at all levels. While local “champions”—individual doctors, pharmacists, or nurses—can, by their enthusiasm, motivate others to make improvements, major systems changes require direction and support from the top—leaders who communicate their own commitment by insisting on safety as an explicit organisational goal backed by adequate resources. The test, as Reinertsen tells us, is that senior managers feel personally responsible for each error.12
Thirdly, we have learnt that the problem of medical error is not fundamentally due to lack of knowledge. Though clearly we have much more to learn about how to make our systems safe, we already know far more than we put into practice. Simple measures of known effectiveness, such as unit dosing, marking the correct side before surgery on paired organs, and 24 hour availability of pharmacists and emergency physicians, are often ignored. Health care alone refuses to accept what other hazardous industries recognised long ago: safe performance cannot be expected from workers who are sleep deprived, who work double or triple shifts, or whose job designs involve multiple competing urgent priorities. Based on currently available knowledge, constructive, effective changes to improve patient safety can begin at once.
If we can mobilise our resources and make safety our priority, health care can make tremendous strides in the next few years. But today's culture of blame and guilt too often shackles us. Achieving the culture we need—one of learning, trust, curiosity, systems thinking, and executive responsibility—will be immensely difficult. Harder still, we must now accomplish this cultural change under the spotlight of a newly aroused public that, given our track record, is understandably doubtful that health care can, on its own, do what needs to be done. Indeed, the public's doubt about our commitment may be all too well founded. In truth, no other hazardous industry has achieved safety without substantial external pressure. Safe industries are, by and large, highly regulated. Health care's track record of failure to act on over three decades of accumulating evidence of medical errors offers plenty of ammunition to those who claim that we may need to be forced to do what is, at bottom, right.
The need is obvious, and the mandate is clear. Will we respond adequately and fast enough? Will hospitals and healthcare organisations get serious enough, soon enough, about patient safety? Will they make the changes that are needed, and will they be willing to hold themselves accountable for achieving improvements? Can we accept the legitimacy of the public's right to know when serious accidents occur, and can we honour the public's legitimate expectation that we will admit our mistakes, investigate them, and make the changes necessary to prevent them in the future? As we enter the new century, a key lesson from the old is that everyone benefits from transparency. Both the safety of our patients and the satisfaction of our workers require an open and non-punitive environment where information is freely shared and responsibility broadly accepted.
Are we ready to change? Or will we procrastinate and dissemble—to lament later when the inevitable regulatory backlash occurs? It may seem to some that the race for patient safety has just begun, but the patience of the public we serve is already wearing thin. They are asking us to promise something reasonable, but more than we have ever promised before: that they will not be harmed by the care that is supposed to help them. We owe them nothing less, and that debt is now due.
Acknowledgments
Lucian Leape and Donald Berwick are the guest editors of this theme issue.
References
1. www.bristol-inquiry.org.uk/brisphase2.htm (accessed 6 March 2000).
2. Kohn LT, Corrigan JM, Donaldson MS, eds. To err is human: building a safer health system. Washington, DC: National Academy Press, 1999.
3. Charatan F. Clinton acts to reduce medical mistakes. BMJ 2000;320:597.
4. Wu A. Medical error: the second victim. BMJ 2000;320:726–727.
5. Helmreich RL. On error management: lessons from aviation. BMJ 2000;320:781–785.
6. Espinosa JA, Nolan TW. Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study. BMJ 2000;320:737–740.
7. Nightingale PG, Adu D, Richards NT, Peters M. Implementation of rules based computerised bedside prescribing and administration: intervention study. BMJ 2000;320:750–753.
8. Bates DW. Using information technology to reduce rates of medication errors in hospitals. BMJ 2000;320:788–791.
9. Gaba DM. Anaesthesiology as a model for patient safety in health care. BMJ 2000;320:785–788.
10. Reason J. Human error: models and management. BMJ 2000;320:768–770.
11. Nolan TW. System changes to improve patient safety. BMJ 2000;320:771–773.
12. Reinertsen JL. Let's talk about error. BMJ 2000;320:730.