Abstract
In response to a weight of evidence that patients are frequently harmed as a result of their care, there have been concerted efforts to make healthcare safer, with health systems across the globe investing significant resources in policies and programmes designed to reduce adverse events. Yet, despite extensive efforts, improvements in safety have proved difficult to sustain and spread, with studies confirming there has been no measurable, systems-level improvement in the overall rates of preventable harm. Here, we highlight the limitations of the thinking which underpins current efforts to make healthcare systems safer and point to new and emerging approaches to understanding and addressing patient safety in complex, dynamic health systems.
Keywords: Health System, Patient Safety, Adverse Events, Preventable Harm, Systems Thinking
Try Again. Fail Again. Fail Better [1]
It has been recognised since antiquity that medical practice brings both the possibility of patient benefit and the potential for adverse clinical outcomes.1 This tension is reflected in the guiding principle primum non nocere (first, do no harm), which remains a touchstone of contemporary codes of professional ethics. Nevertheless, history is replete with examples of misguided but well-intentioned medical interventions that have generated more harm than good [2]. The true scale of patient harm in health systems remained latent and under-recognised, even by the medical, nursing and allied professions, until comparatively recently.
Since the early 1990s, successive international research studies assessing the extent of adverse events have accumulated evidence suggesting that around one in 10 patients experience some type of health-related harm while receiving inpatient care.2 Similar rates appear to hold in other health settings.3 The burgeoning field of patient safety research has highlighted failures both of commission (eg, medication errors) and of omission (eg, the failure to provide recommended care).4-7 In response to this mounting weight of evidence, there have been concerted efforts to make healthcare safer, with health systems across the globe investing significant resources in a bewildering array of policies and research programmes designed to reduce the burden of patient harm.8,9 As with many other aspects of health system reform, the medical profession has been a key influence in shaping research and policy in relation to patient harm, first by rejecting, then by gradually accepting, and finally by embracing the patient safety agenda.10
There have undoubtedly been gains in making care safer, for example, through decreased catheter-related bloodstream infections,11 lower mortality and morbidity attributed to the use of checklists in operating theatres,12 and the widespread adoption of Medical Emergency Teams and Rapid Response Systems to deal with deteriorating patients.13 Yet despite extensive efforts by many committed and well-intentioned policy-makers, managers, clinicians, researchers and patient groups, it is disconcerting that improvements in safety have been confined to a few celebrated examples or niche areas such as these. Where solutions have been advanced, they have proved difficult to sustain and spread, with recent studies confirming there has been little or no measurable improvement in the overall rates of preventable harm at the systems level.2 Indeed, amid growing recognition that most patient safety problems are not amenable to simple solutions, it is apparent that the original optimism of the patient safety movement has given way to hard-bitten realism.2,14
System Heal Thyself
Perhaps we have not been sufficiently sophisticated in specifying solutions to the problem of ameliorating harm. The system is complex and the problem is a wicked one.15 Linear, uni-dimensional solutions applied to care settings (run a hand hygiene campaign, issue more policy, regulate more extensively, mandate root cause analysis in all cases of severe harm) have fallen short of expectations.16 That being the case, the question becomes: what models, studies or concepts lay better foundations for improvements in patient safety?
Current approaches have deep historical and institutional antecedents that have been affirmed over decades, and are woven into the mindsets of those involved in health services research, healthcare policy and professional practice. Traditionally, the dominant mode of thinking about patient safety was predicated on the assumption of individual culpability, with errors and adverse events mainly attributed to incompetence, negligence and individual personality deficits such as carelessness, forgetfulness or recklessness. Based on this logic, individual clinicians, so-called 'bad apples,' were blamed and held personally accountable for any errors made. Even so, in practice the medical profession historically did little to address wrongdoing and failure, despite professional regulatory procedures focused on identifying and rooting out 'bad apple' miscreants. Infamous cases, such as that of the GP Dr Harold Shipman, who systematically murdered his patients while working in the English NHS, and large-scale inquiries into extensive harm,17,18 highlight this failure.
Since around the turn of the millennium, stimulated among other factors by cases such as these, patient safety has increasingly been viewed as a systems issue, whereby errors and adverse events are thought to arise mainly from dysfunctions in the wider systems of care rather than as the consequences of personal fallibility.19 A new "safe systems orthodoxy" gradually emerged, in which errors and adverse events were attributed to aberrations in the "system." In this framing, clinicians are seen not so much as culpable or reckless, but as interacting agents in problematic systems or cultures. Interventions were oriented towards designing working environments that anticipated and minimised the impact of human error, with a focus both on reducing 'active failures,' ie, mistakes and unsafe acts made by health professionals working at the sharp end of care delivery, and on addressing more upstream 'latent failures' located in the wider environment (eg, problematic organisational cultures, poor team dynamics, time pressures or deficits in workload scheduling). This kind of early systems thinking was underpinned by a conviction that reorganising formal structures, streamlining systems of care, or purposively managing organisational cultures would deliver the desired improvements in patient safety.20 This systems approach spawned many interventions built on a multiplicity of conceptualisations of the problem, including root cause analysis, the Swiss cheese model, lean techniques, checklists, and the like.
Fault Lines in Systems Thinking
But this systems approach, while an advance on earlier linear attempts to improve things, has not taken us sufficiently far. It has relied on what we think of as simplistic systems thinking: the version of systems theory that underpins many current efforts to make care safer, but which has a number of serious limitations that may, in part, explain why progress has been so painfully slow. First, this kind of systems thinking, which underpins the approach we now know as Safety-I, still assumes at its heart that adverse outcomes can be explained by linear cause-effect chains, as originally proposed by the Domino metaphor21 and embodied in the widespread use of patient safety frameworks and tools such as incident reporting and root cause analysis.19,22 Although it brings a systems science perspective into the safety arena, this framing nevertheless assumes that errors and adverse events have a definable cause which can be 'found and fixed.' It remains focused on locating the causes of failure, rooting out aberrations, and introducing interventions in an attempt to eliminate (or at least attenuate) their distal causes.
However, it is clear that the logic inherent in this approach no longer corresponds to the reality of care settings, if ever it did. The lack of success in improving patient safety over the last two decades may be due to the concepts and methods underpinning systems-oriented approaches failing to match the profound complexity of modern healthcare settings. Care is delivered in intricate, fragmented, sometimes chaotic settings, in complex political and socio-cultural environments with a virtually infinite range of moving parts and interconnections. Healthcare is characterised by informalities, work-arounds, feedback loops, emergent behaviours, politics, nested networks, fractal properties, system dynamics, and bottom-up adaptiveness.23 These kinds of settings stubbornly resist the introduction of top-down, standardised policy, regulations or linear-style interventions.24,25 Such interventions may even induce serious deleterious consequences if there are substantial mismatches between the putative solutions and their intended targets.26
Another, related limitation of simplistic systems thinking is that the 'system' to be fixed is primarily conceived as the local 'micro-system,' comprising local clinical and team behaviours responsive to upper-echelon prescriptions, rather than more broad-based factors such as profession-wide social structures and cultural norms.27,28 Indeed, a fatal flaw in the Safety-I approach is that it is heavily infused with thinking which has been described as work-as-imagined.4 Work-as-imagined is what those working at the blunt end of the health system (eg, policy-makers, regulators, planners, directors and researchers) believe should happen at the sharp end of care delivery when they design improvement strategies or attempt to influence or nudge the behaviours of those working there. Much of this is predicated on encouraging and mandating adherence to a multiplicity of rules, regulations and external performance metrics and standards. Work-as-done, on the other hand, is what people do on the front lines of care to get the job accomplished, in complex settings which are invariably very different from the way those remote from the front lines imagine them to be. Sometimes clinicians do provide care that corresponds to the way those who envision and prescribe it at the blunt end imagine it, but this is rare. This is because people working at the sharp end do so in resource-constrained, challenging and culturally unique circumstances, and for the most part they accomplish their tasks by employing informalities, localised patchwork-quilt solutions, and work-arounds.29,30 In short, clinicians on the front lines of care flex and adjust their practices to fit with, and map to, local contexts, demands, contingencies and cultural characteristics, rather than responding to top-down regulation, policies and standard operating procedures.
Despite the on-the-ground variation, complexities, and local adjustments in clinical care, things go right far more often than they go wrong. This is a key insight of Safety-II advocates,4,14,28,31,32 who argue that a feature of clinical work that has received too little attention is precisely that it succeeds far more often than it fails. It follows that, in such complex adaptive systems, we need to better understand how work is actually accomplished across different healthcare settings and thereby become more accomplished at matching work-as-imagined and work-as-done. This would mean getting better at designing work-as-imagined policies, regulations and standards so that they align much more closely with, and reflect an enhanced understanding of, how work is actually done.33 It would also mean creating a system in which those at the sharp end of care delivery have a better appreciation of what is being sought by those doing work-as-imagined, and an enhanced appreciation of their own organisational cultures and systems.20 Yet in health systems research, culture and culture change are often poorly theorised and based on the overly simplistic assumption that cultures are malleable and can be readily manipulated and managed to beneficial effect.34,35 We should be cautious, too, in drawing parallels between healthcare and other industries, such as aviation, as it is problematic to import ideas wholesale from very different cultural contexts and expect them to be taken up unproblematically.36
In short, most systems improvement strategies to date have been based on how work is imagined rather than on how it is done in practice.4 That is to say, those at the apex of the hierarchy have determined, funded or prescribed policies and initiatives to be embraced by those on the front lines of clinical care without adequate understanding of the complex intricacies and nuances inherent in the delivery of modern healthcare.
New Approaches and Future Directions for Research
As we have seen, progress in making care safer for patients has been so slow that the momentum is seen to have stalled; the rates of harm appear to have flatlined at around 10%. This necessarily calls into question the dominant ways in which patient safety has been framed and addressed. Albert Einstein is credited with the familiar aphorism, "Insanity is doing the same thing over and over again and expecting different results." If we want to achieve different results, then we need to be less reverential towards the orthodox paradigm, get beyond simplistic systems thinking, expand our research horizons, and advance new and better ways of understanding and intervening in patient safety. One approach is to take the thinking originating from resilience engineering (the Safety-II approaches identified above) and begin to focus more clearly on how things typically go right. This is a radically wide-ranging view of the system: it argues that the same behaviours that produce good care also produce poor care, and it seeks to understand the complex health system in more comprehensive ways. This extended view of safety argues that attention must be placed on the conditions under which people succeed, rather than looking myopically at why things go wrong.
The Safety-II perspective recognises that both failures and successes have their origin in performance variability (at both the individual and systemic levels), and that it is as mistaken to attribute successes to careful planning and diligence as it is to attribute failures to individual incompetence or error. An important practical implication flows from this argument: we cannot appreciate a system and its performance by looking solely or predominantly at harm. We must direct attention to the whole system, and the conditions under which it operates, if we are to make gains in understanding. We will not apprehend the system in all its complexity by looking only at the abnormalities that arise when healthcare delivery fails.
Complementing this approach, we point to other recent work drawing on complexity science applied to healthcare.32,37-41 This brings into view the dynamic nature of the system, the relationships that deliver care, and its interactional characteristics. We also point to advances and evolving research from across the social sciences, such as recent studies on the sociology of patient safety,10 including new dramaturgical perspectives on the organisational governance of patient safety,42 increasingly persuasive research on the potential for patients and relatives to contribute towards safer care,43 and the use of (behavioural) economic levers and incentives to motivate organisations and individuals to provide safer care for patients.8
Taken together, these new horizons work with the complex dynamics of healthcare instead of assuming that things will change by instrumental means, that is, by operationalising top-down, linear solutions. They also show, much more profoundly than previously, that superficial conceptualisations of systems will not do. We need to form deep knowledge of the nature of healthcare as a complex adaptive system, with its resilient expressions and capacities, and mobilise a wider range of stakeholders, including patients, in the patient safety enterprise.
That is the new horizon, and we see glimpses of evidence, reported here, that we are shifting toward it. There are grounds to believe, at least for the optimists among us, that this will not be yet another false dawn. But only time will tell whether, and to what extent, we can make tangible progress on the journey toward it.
Ethical issues
Not applicable.
Competing interests
Authors declare that they have no competing interests.
Authors’ contributions
Both authors contributed equally to the writing of this article.
Authors’ affiliations
1 Health Services Management Centre, University of Birmingham, Birmingham, UK. 2 Institute of Health Innovation, Macquarie University, Sydney, NSW, Australia.
Endnotes
[1] Beckett S. Worstward Ho. New York, NY: Grove Press Inc; 1983.
[2] Classic examples include the use of mercury and arsenic as medicines in the 18th century and thalidomide in the 1960s.
Citation: Mannion R, Braithwaite J. False dawns and new horizons in patient safety research and practice. Int J Health Policy Manag. 2017;6(12):685–689. doi:10.15171/ijhpm.2017.115
References
- 1. Porter R. The Greatest Benefit to Mankind: A Medical History of Humanity From Antiquity to the Present. London, UK: Harper Collins; 1999.
- 2. Braithwaite J, Donaldson L. Patient safety and quality. In: Ferlie E, Montgomery K, Pederson AR, eds. The Oxford Handbook of Health Care Management. Oxford, UK: Oxford University Press; 2016:325-351.
- 3. World Health Organization. Patient safety: making health care safer. Geneva, Switzerland: World Health Organization; 2017.
- 4. Braithwaite J, Wears RL, Hollnagel E, eds. Resilient Health Care Volume 3: Reconciling Work-as-Imagined and Work-as-Done. Abingdon, UK: Taylor & Francis; 2016.
- 5. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–2645. doi:10.1056/NEJMsa022615
- 6. Runciman WB, Hunt TD, Hannaford NA, et al. CareTrack: assessing the appropriateness of health care delivery in Australia. Med J Aust. 2012;197(2):100–105. doi:10.5694/mja12.10510
- 7. Hooper TD, Hibbert PD, Mealing N, et al. CareTrack Kids—part 2. Assessing the appropriateness of the healthcare delivered to Australian children: study protocol for a retrospective medical record review. BMJ Open. 2015;5(4):e007749. doi:10.1136/bmjopen-2015-007749
- 8. Slawomirski L, Auraaen A, Klazinga N. The Economics of Patient Safety: Strengthening a Value-Based Approach to Reducing Patient Harm at National Level. https://www.oecd.org/els/health-systems/The-economics-of-patient-safety-March-2017.pdf. Published 2017.
- 9. DeVita MA, Hillman K, Bellomo R, et al, eds. Textbook of Rapid Response Systems: Concept and Implementation. New York, NY: Springer; 2017.
- 10. Allen D, Braithwaite J, Sandall J, Waring J, eds. The Sociology of Healthcare Safety and Quality. West Sussex, UK: Wiley-Blackwell; 2016.
- 11. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355(26):2725–2732. doi:10.1056/NEJMoa061115
- 12. Haynes AB, Weiser TG, Berry WR, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360(5):491–499. doi:10.1056/NEJMsa0810119
- 13. Hillman K, Nosrati H, Braithwaite J. RRS and the culture of safety. In: DeVita MA, Hillman K, Bellomo R, eds. Textbook of Rapid Response Systems. Cham, Switzerland: Springer International Publishing; 2017:53-57.
- 14. Braithwaite J, Wears RL, Hollnagel E. Resilient health care: turning patient safety on its head. Int J Qual Health Care. 2015;27(5):418–420. doi:10.1093/intqhc/mzv063
- 15. Rittel HWJ, Webber MM. Dilemmas in a general theory of planning. Policy Sci. 1973;4(2):155–169. doi:10.1007/bf01405730
- 16. Braithwaite J, Runciman WB, Merry AF. Towards safer, better healthcare: harnessing the natural properties of complex sociotechnical systems. Qual Saf Health Care. 2009;18(1):37. doi:10.1136/qshc.2007.023317
- 17. Baker R, Hurwitz B. Intentionally harmful violations and patient safety: the example of Harold Shipman. J R Soc Med. 2009;102(6):223–227. doi:10.1258/jrsm.2009.09k028
- 18. Hindle D, Braithwaite J, Iedema R. Patient Safety Research: A Review of the Technical Literature. Sydney, Australia: University of New South Wales; 2006.
- 19. Reason J. Human Error. Cambridge, UK: Cambridge University Press; 1990.
- 20. Mannion R, Davies H, Marshall MN. Cultures for Performance in Health Care. Milton Keynes, UK: Open University Press; 2005.
- 21. Heinrich HW. Industrial Accident Prevention: A Scientific Approach. New York, NY: McGraw-Hill; 1931.
- 22. Parker D, Wensing M, Esmail A, Valderas JM. Measurement tools and process indicators of patient safety culture in primary care. A mixed methods study by the LINNEAUS collaboration on patient safety in primary care. Eur J Gen Pract. 2015;21(suppl 1):26–30. doi:10.3109/13814788.2015.1043732
- 23. Braithwaite J, Churruca K, Ellis LA. Can we fix the uber-complexities of healthcare? J R Soc Med. 2017; forthcoming.
- 24. Sweeney KG, Mannion R. Complexity and clinical governance: using the insights to develop the strategy. Br J Gen Pract. 2002;52(suppl):S4–S9.
- 25. Mannion R, Exworthy M. (Re)Making the procrustean bed? Standardization and customization as competing logics in healthcare. Int J Health Policy Manag. 2017;6(6):301–304. doi:10.15171/ijhpm.2017.35
- 26. Mannion R, Braithwaite J. Unintended consequences of performance measurement in healthcare: 20 salutary lessons from the English National Health Service. Intern Med J. 2012;42(5):569–574. doi:10.1111/j.1445-5994.2012.02766.x
- 27. Braithwaite J, Hyde P, Pope C, eds. Culture and Climate in Health Care Organizations. New York, NY: Palgrave Macmillan; 2010.
- 28. Hollnagel E, Braithwaite J, Wears RL, eds. Resilient Health Care. Farnham, Surrey: Ashgate Publishing Ltd; 2013.
- 29. Debono D, Clay-Williams R, Taylor N, Greenfield D, Black D, Braithwaite J. Using workarounds to examine characteristics of resilience in action. In: Hollnagel E, Braithwaite J, Wears RL, eds. The Field Guide to Resilient Health Care. Farnham, Surrey: Ashgate Publishing Ltd; in press.
- 30. Debono DS, Greenfield D, Travaglia JF, et al. Nurses' workarounds in acute healthcare settings: a scoping review. BMC Health Serv Res. 2013;13:175. doi:10.1186/1472-6963-13-175
- 31. Wears RL, Hollnagel E, Braithwaite J, eds. Resilient Health Care Volume 2: The Resilience of Everyday Clinical Work. Farnham, Surrey: Ashgate Publishing Ltd; 2015.
- 32. Braithwaite J, Churruca K, Ellis LA, et al. Complexity Science in Healthcare – Aspirations, Approaches, Applications and Accomplishments: A White Paper. 2017.
- 33. Wears RL, Hunte G. Resilient procedures: oxymoron or innovation? In: Braithwaite J, Wears RL, Hollnagel E, eds. Resilient Health Care Volume 3: Reconciling Work-as-Imagined and Work-as-Done. Abingdon, UK: Taylor & Francis; 2016:163-170.
- 34. Davies HTO, Mannion R. Will prescriptions for cultural change improve the NHS? BMJ. 2013;346:f1305. doi:10.1136/bmj.f1305
- 35. Mannion R, Davies H. Cultures in healthcare. In: Ferlie E, Montgomery K, Pedersen AR, eds. The Oxford Handbook of Health Care Management. Oxford, UK: Oxford University Press; 2016.
- 36. Bosk CL, Dixon-Woods M, Goeschel CA, Pronovost PJ. Reality check for checklists. Lancet. 2009;374(9688):444–445. doi:10.1016/S0140-6736(09)61440-9
- 37. Greenhalgh T, Plsek P, Wilson T, Fraser S, Holt T. Response to 'The appropriation of complexity theory in health care'. J Health Serv Res Policy. 2010;15(2):115–117. doi:10.1258/jhsrp.2010.009158
- 38. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health. 2015;36:307–323. doi:10.1146/annurev-publhealth-031912-114421
- 39. Hawe P, Bond L, Butler H. Knowledge theories can inform evaluation practice: what can a complexity lens add? New Dir Eval. 2009;2009(124):89–100. doi:10.1002/ev.316
- 40. Plsek PE, Greenhalgh T. The challenge of complexity in health care. BMJ. 2001;323(7313):625. doi:10.1136/bmj.323.7313.625
- 41. Leykum LK, Lanham HJ, Pugh JA, et al. Manifestations and implications of uncertainty for improving healthcare systems: an analysis of observational and interventional studies grounded in complexity science. Implement Sci. 2014;9:165. doi:10.1186/s13012-014-0165-1
- 42. Freeman T, Millar R, Mannion R, Davies H. Enacting corporate governance of healthcare safety and quality: a dramaturgy of hospital boards in England. Sociol Health Illn. 2016;38(2):233–251. doi:10.1111/1467-9566.12309
- 43. Rhodes P, Campbell S, Sanders C. Trust, temporality and systems: how do patients understand patient safety in primary care? A qualitative study. Health Expect. 2016;19(2):253–263. doi:10.1111/hex.12342