Editorial
BMJ. 2000 Jun 24;320(7251):1683–1684. doi: 10.1136/bmj.320.7251.1683

How the NHS can improve safety and learning

By learning free lessons from near misses

Paul Barach 1,2, Stephen D Small 1,2
PMCID: PMC1127461  PMID: 10864524

An Organisation with a Memory, the newly released report from England's chief medical officer on learning from adverse events in the NHS,1 joins recent high profile policy statements from the United States2,3 and Australia4 that acknowledge an epidemic of underreported preventable injuries due to medical management. This report should not, however, be seen as a one-off response to preventable patient catastrophes—although recent celebrated cases in Britain, such as the Bristol paediatric cardiac surgery story, have provided a driving force for change. Instead, An Organisation with a Memory should be understood in the larger context of a 10 year modernisation strategy to continuously and measurably improve the quality of health care. If it is understood in this way, and the government is willing to invest in the necessary systems, training, and research, then it will prove a real force for change.

Most care in the NHS is of high clinical standard, but the chief medical officer's report suggests that as many as 850 000 serious adverse health care events might occur in the NHS hospital sector each year at a cost of over £2bn. Half of these events are thought to be preventable. Though the report points out that little comprehensive research on adverse events in health care has been carried out in the United Kingdom, the international community has long been in the debt of British safety theorists and applications experts.5-7

The report focuses on the theme of proactively learning from experience. It does a remarkable job of addressing this thread, from near misses to adverse events, from individuals to organisations, from aviation to health care. Adverse events and failures of service delivery rightly provide the key focus, as opposed to “errors.” A discussion of safety culture, safety information systems, and the challenges of safely operating complex systems8 sets the stage for an assessment of how the NHS learns from adverse events and near misses and monitors its own vital signs.

One of the problems that clinical professionals face in understanding the importance of systems in safety is reconciling a systems approach with individual accountability. The report emphasises that “in the great majority of cases, the causes of serious failures stretch far beyond the actions of the individuals immediately involved.” Yet the cases it uses to explain its concepts are also effective in demonstrating a systems approach that neglects neither individual accountability nor the importance of disclosure to patients. This is critical to maintaining the public's trust and meeting its need for transparency.

Just praise is given to historically leading UK efforts to collect and analyse incident data, such as the confidential inquiries and the Department of Health's guidance on incident reporting, which dates back to 1955.6 Ultimately, however, the NHS is described as a “passive” as opposed to an “active” learning organisation.1 The report offers ample evidence of a large, distributed health care system that usually provides excellent care but lacks the structure, processes, and mindset to advance to the next level of truly high collective performance. Patterns of severe injuries play out repeatedly, as in the case of deaths caused by wrongly administered spinal injections. Administrative barriers stand in the way of implementing known solutions. Beliefs, attitudes, and values nourish a culture of blame and superficial analysis and thereby perpetuate this cycle. Free lessons from near misses, and more expensive ones from litigation, are largely lost. Positive change and improvement do exist, but the effort is slow, disorganised, and costly.

Curiously, An Organisation with a Memory takes an ambivalent approach to team research and training. Table 5.1 of the report (see box on bmj.com) lists the need for a teamwork focus, but the report is silent on the contribution of the social sciences to safety, in favour of the individual-system dynamic.1 In a rare reference, teams are mentioned pejoratively—for their potential contribution to a culture of silence and resistance through norms of loyalty. This is puzzling, since the report makes a well articulated call for establishing a culture of self-reflection and appraisal.9 Moreover, aviation safety practices are discussed at length, and team training has long been an important part of aviation safety measures. Similarly, simulation is widely used in aviation training, yet the report does not mention the important experience with simulation tools supported by video feedback in many other industries as well as in health care.10 The report could have underscored more clearly the need for a compulsory safety curriculum, which would include team training, human factors engineering, risk management, and simulation.

The report offers many recommendations for changing the culture, which the health secretary has promised to take seriously, including opening up the neglected area of safety in primary care. Its major recommendation is a national mandatory reporting scheme for adverse health events, and for specified near misses, based on standardised local reporting systems. There are, however, many barriers to implementing this vision.11 These include questions of confidentiality and accountability, protection for those who report, and feedback to them. Sustained effectiveness of reporting systems requires that all stakeholders be at the table.12 The evolution of sophisticated safety reporting systems in non-medical domains occurred over several decades, and translating these lessons to health care will prove challenging.13,14

It is time for a fundamental rethink of the way that the NHS approaches learning from experience. Physicians have avoided the tough questions of how safety is to become more central to their thinking and behaviour. We must splice safety culture and learning into the organisational genome of the NHS. Safety is a suitable vehicle to carry a renewed focus on quality improvement. We endorse the overarching theme of the report to make “safety and quality everybody's business.” This excellent and readable report should be taken up by all involved in health care delivery in the United Kingdom, as well as by the international community. The government must not ignore it.

Supplementary Material

[extra: Table]

News p 1689, Reviews p 1738

Footnotes

A table from the report appears on the BMJ's website

References

1. Expert Group on Learning from Adverse Events in the NHS. An organisation with a memory. London: Stationery Office; 2000. www.doh.gov.uk/orgmemreport/index.htm
2. Kohn LT, Corrigan JM, Donaldson MS, editors. To err is human: building a safer health system. Washington, DC: National Academy Press; 1999.
3. Quality Interagency Coordination Task Force. Doing what counts for patient safety: federal actions to reduce medical errors and their impact. Washington, DC: Agency for Healthcare Research and Quality; 2000.
4. Runciman W. Iatrogenic injury in Australia. Canberra: Australian Patient Safety Foundation; 2000.
5. Turner BA, Pidgeon NF. Man-made disasters. London: Butterworth-Heinemann; 1997.
6. Vincent C, Ennis M, Audley RJ. Medical accidents. Oxford: Oxford University Press; 1993.
7. Reason J. Managing the risks of organizational accidents. Aldershot: Ashgate; 1997.
8. Millenson M. Demanding medical excellence: doctors and accountability in the information age. Chicago: University of Chicago Press; 1997.
9. Sagan SD. The limits of safety: organizations, accidents, and nuclear weapons. Princeton, NJ: Princeton University Press; 1994.
10. Helmreich RL, Schaefer HG. Team performance in the operating room. In: Bogner MS, editor. Human error in medicine. Hillsdale, NJ: Lawrence Erlbaum; 1994.
11. Barach P, Small SD. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ 2000;320:759-763. doi: 10.1136/bmj.320.7237.759
12. Reynard WD, Billings CE, Cheney ES, Hardy R. The development of the NASA Aviation Safety Reporting System. Washington, DC: National Aeronautics and Space Administration; 1986.
13. Rees JV. Hostages to each other: the transformation of nuclear safety since Three Mile Island. Chicago: University of Chicago Press; 1994.
14. Carroll J. Incident reviews in high-hazard industries: sense making and learning under ambiguity and accountability. Industrial and Environmental Crisis Quarterly 1995;9:175-197.
