Abstract
The tragic case of Mayra Cabrera, who died as a result of wrong-route drug administration, is notable as the first in which a verdict of unlawful killing was recorded against an NHS Trust. Error within medicine is a significant cause of patient morbidity and mortality. We explore the costs of error, the dynamics of error causation, the roles of both the individual and the institution in accountability for error, and transferable lessons from other industries for reducing error.
Keywords: Accountability, error, human factors
Ah ne'er so dire a Thirst of Glory boast,
Nor in the Critick let the Man be lost!
Good-Nature and Good-Sense must ever join;
To err is Human; to Forgive, Divine.
‘An Essay on Criticism’, Alexander Pope.1
While ‘Zero Harm’ is a bold and worthy aspiration, the scientifically correct goal is ‘continual reduction’. All in the NHS should understand that safety is a continually emerging property, and that the battle for safety is never ‘won’; rather, it is always in progress.
A promise to learn – a commitment to act, Department of Health.2
The case
On Tuesday 5 February 2008, BBC News reported on the death of Mayra Cabrera, a 30-year-old theatre nurse, who died in 2004 within 2 h of giving birth to her son Zac by forceps delivery.3 Intravenous fluids and Syntocinon were commenced post-partum. Shortly afterwards, she suffered a grand mal seizure and then a cardiac arrest due to ventricular fibrillation, from which she could not be resuscitated. Investigations into the incident revealed that 150 ml of a 500 ml bag of 0.1% bupivacaine in saline had been administered intravenously in error.4
Introduction
Tragically, this was not the first death due to wrong-route drug administration. Two other patients had died between 2000 and 2004 after epidural bupivacaine infusions were inadvertently administered intravenously, prompting the National Patient Safety Agency to issue new national guidance.5 What makes this particular case unique, and why it was considered a legal landmark, is that it produced the first verdict of ‘unlawful killing’ recorded against an NHS Trust – ‘gross negligence manslaughter re the storage and administration of an epidural drug’.6 The lack of a proper system for storing and handling drugs, specifically bupivacaine, was highlighted:7 there had been two previous ‘near misses’ at the same hospital trust, in 1994 and 2001, resulting in a change in standard operating procedure (SOP), which unfortunately had been discarded when the hospital moved to new premises in 2002.7 The individual midwife, who ‘cannot have read the label carefully or possibly at all’,7 was also at fault. Mr Justice Clarke stated that Mayra Cabrera’s death was attributable to ‘systemic deficiency’ as well as to ‘individual fault’.7
‘Health care professionals are accountable professionally, civilly and criminally for their own acts and omissions. They are, however, individuals who must operate within, and are constrained by, organisational structures that can themselves be deficient’.7 Despite the clear warnings, there was insufficient individual and institutional learning to prevent the error that resulted in the tragic death of this young mother. The Trust involved acknowledged that this event should never have happened, and its spokesman stated: ‘I hope other hospitals will be able to learn from the bitter lessons we have learnt’.3 Thirteen years have elapsed since these tragic events, and the question that comes to mind is: have we indeed learnt? This article aims to explore the problem of error within health care and ways of mitigating it, and to examine the current state of accountability for error. We feel the subject is most pertinent to those working in highly complex, high-stakes environments such as intensive care units (ICUs), operating theatres and emergency departments, as these environments are recognised as having higher error rates with serious consequences.8
Epidemiology of error
In the case of Mayra Cabrera, death was secondary to bupivacaine toxicity, the drug having been administered intravenously in error. Error is highly prevalent in health care systems across the world and is estimated to be the third leading cause of death in the United States of America.9 There are many reasons why the incidence of error in health care is difficult to measure. These include the differing definitions of error in use; under-reporting, due both to the cultural difficulty of dealing with human error and to fear of blame; and an outcome-based perception of errors (mainly errors that result in harm are reported, and even these are under-reported).10,11 Studies have found that reporting systems register only around 7–15% of the adverse incidents that are subsequently identified through more intensive retrospective review processes.11 As early as 1991, the Harvard Medical Practice Study found that 3.7% of inpatients suffered adverse events, with 13.6% of those events contributing to death, and highlighted error as a significant cause of overall mortality.12 Subsequently, UK data reviewing 1000 patient records from two acute hospital Trusts revealed that 10% of all patients admitted to hospital suffered an adverse event, of which half were considered preventable.13
ICUs are identified, unsurprisingly, as departments where errors have more serious consequences.8 One study of errors in ICU found an average of 1.7 errors per patient per day, of which 29% had the potential for serious or fatal injury. Each patient received on average 178 activities per day, indicating that the hospital personnel were functioning at a proficiency level of 99%.10 Medication errors, including those in anaesthesia and ICU, are among the top 10 causes of overall mortality worldwide.14 The estimated rate of drug errors in anaesthesia is 1 in 133 anaesthetics, and in intensive care 130 errors per 1000 patient days.14 Interestingly, human factors and organisational inadequacies have been implicated in up to 87% of medication errors.14
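The quoted proficiency figure follows directly from those two averages:

\[
\text{error rate} \approx \frac{1.7\ \text{errors per day}}{178\ \text{activities per day}} \approx 0.96\%, \qquad \text{proficiency} \approx 100\% - 0.96\% \approx 99\%.
\]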
The anatomy of error
There are many definitions of error. James Reason, originator of the ‘Swiss cheese model’, defines error as ‘all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency’.15 Intention is important, as error does not concern wilful acts of misconduct. Further, this definition is incomplete insofar as errors can be made even when the intended overall goal is still successfully achieved (albeit with the error reducing the chance of a successful outcome, or the margin of safety with which it is achieved).16 Hence, outcome per se should not be the sole criterion for defining error. Merry and McCall Smith suggest error is ‘an unintentional failure in the formulation of a plan by which it is intended to achieve a goal, or an unintentional departure of a sequence of mental or physical activities from the sequence planned, except when such a departure is due to a chance intervention’.16 This definition explicitly shifts the focus from the outcome of an act to a failure in the act or the plan itself. It also highlights the importance of focusing on actions and thought processes, rather than on results, when evaluating events and accountability. Interestingly, negligent acts that have non-fatal consequences are currently not considered crimes under UK law.17
Defining, classifying and analysing error is key to understanding the dynamics of error causation. The framework used has a significant impact on how error is managed, on how the balance between individual and organisational failures is viewed, and hence on how accountability for error is attributed.
The traditional person-based approach views error as the result of aberrant mental processes and human actions.18 Therefore, resultant error management focuses on disciplinary measures, blame, remediation and retraining – methods that are increasingly acknowledged to be ineffective in reducing error and harm.18,19 As Reason states, ‘error is not the monopoly of an unfortunate few’ and is not necessarily associated with incompetence.18
Applying cognitive theory to human performance identifies three levels of performance: skill-based (thoughts and actions governed by stored patterns); rule-based (solutions governed by stored rules); and knowledge-based (conscious analysis and stored knowledge used to solve novel problems).10 Both novices and experts make errors. Skill-based errors, termed ‘slips’ (if an unintended action is performed) or ‘lapses’ (if an intended action is omitted), are unconscious errors in automatic activity, typically arising from distraction or inattention. Rule-based and knowledge-based errors, termed ‘mistakes’, result from errors of conscious thought – due either to lack of knowledge or to misinterpretation of the problem.16
In contrast to the person-based approach, the increasingly advocated systems approach acknowledges the innate nature of human error – ‘humans are fallible and errors are to be expected’.18 In this model, errors are seen as consequences of upstream systemic factors rather than as causes in themselves.18 Rather than trying to build a system that relies on error-free performance (an impossibility, given human fallibility), the focus shifts to introducing system defences, identifying latent failures within the system, avoiding active failures, and mitigating errors before they cause harm. Interestingly, closer analysis of the cognitive processes underlying error shows that events which appear contextually identical are often quite different error types, while many seemingly diverse events share similar, stereotyped latent errors. Because errors are often repetitive, stereotyped and predictable, barriers and defences can be implemented to mitigate them and reduce the probability of future errors.
The cost of error and the ‘second victim’
One approach to appreciating the burden of medical error is to examine its costs, with the caveat that, as we cannot reliably measure the incidence of error, calculating its cost accurately is impossible. Estimates can perhaps be surmised from negligence claims, although not all negligence cases end up as claims, and negligence constitutes only a small part of total error, since not all error results in adverse outcomes. The Department of Health commissioned report, ‘Exploring the costs of unsafe care in the NHS’, identified costs of unsafe care of more than £1 billion per year, and potentially up to £2.5 billion,20 with the proviso that not all unsafe care can be classed as error.
From a human perspective, there is the financial and emotional cost to the individual, their family and dependants: loss of life, the additional care required in the event of disability, and lost income. The cost to society is diminished trust in the health care system, lost worker productivity and reduced population health. To health care professionals, the cost is one of reduced morale, loss of job satisfaction and the ‘second victim’ syndrome.21
Whilst the cost of error in terms of patient suffering, morbidity and mortality is widely acknowledged, there are also impacts on health care professionals. Although many errors have roots in systems failure, it is usually the individual health care professional who is found guilty of the error, even when a ‘Swiss cheese’ effect has led to its occurrence. At times there is a hefty price to pay, as demonstrated by the sad case of Wayne Jowett, who lost his life to a medical error. Even though the subsequent report concluded that ‘the adverse incident that led to Mr Jowett’s death was not caused by one or even several human errors but by a far more complex amalgam of human, organisational, technical and social interactions’,22 the doctor who administered the wrong injection received a custodial sentence for manslaughter on grounds of gross negligence. Commenting on this incident for the BMJ, Mr Holbrook, a barrister, noted that ‘even the most diligent, conscientious, and competent practitioner will make mistakes…(the Dr’s) mistake was the 23rd incident reported worldwide (and the 14th in 15 years in the United Kingdom) in which this drug had been fatally and mistakenly injected into the spine’.23 Donaldson notes that ‘the law’s interventions in the complex and subtle territory of avoidable harm in healthcare are too often haphazard and inconsistent’,24 and perhaps this is to some extent evidenced by the cases of Cabrera and Jowett.
When doctors commit errors with serious consequences, they face multiple jeopardy: internal hospital investigations, GMC investigations and Coroners’ inquests, all the way through to criminal negligence investigations. Reckless behaviour is culpable, but should error arising from poorly managed complex systems be culpable in the same way? Whilst the hospital admitted liability for the error in Mr Jowett’s case, it was the doctor who received a custodial sentence, with the sentencing judge stating, ‘No sentence I impose can possibly compensate Wayne’s family for their loss’.25
We, as clinicians, work in partnerships, and so arguably there is a need for joint accountability for errors. Yet the Corporate Manslaughter and Corporate Homicide Act 2007 has rarely been invoked to examine the role of institutions in medical error cases resulting in loss of life, and institutions are often able to settle legal disputes about an error financially. Meanwhile, clinicians – described as ‘second victims’ by Edrees and Federico – ‘may feel guilt, fear, anxiety, or anger and experience social withdrawal, disturbing and troubling memories, depression, and insomnia. They tend to doubt their clinical skills, feel as though they have failed the patient, and worry about what their colleagues think’.21 Many suffer financially, having been suspended from work, or psychologically, with depression or post-traumatic stress disorder (PTSD); for some, the pressure is such that it leads to suicide, as recently highlighted by 28 such cases amongst doctors facing fitness to practise procedures.26
Learning from errors
Niels Bohr defined an expert as ‘a person who has found out by his own painful experience all the mistakes that one can make in a very narrow field’.27 The comment, while clearly intended as humorous, provides an insight into the price paid for experience. It highlights the gap between theoretical knowledge and its practical application, a gap traversed by experience and practice. In health care we do not practise in isolation, and therefore we should not have to commit ‘every possible error’ ourselves to accrue the experience. A blame-free culture that enables errors to be identified and catalogued, and the learning from them to be disseminated appropriately, is one solution to avoiding repetition of the same error by different individuals. It also helps us to identify the weaknesses of the system that generate the potential for error. This model of practice has been fully embraced by the aviation industry, and we are beginning to see a similar cultural change taking place in medicine.
A real threat to such practice is the litigation and criminalisation of error. In the wake of the findings of the Francis Inquiry, a report by Don Berwick, president emeritus of the US Institute for Healthcare Improvement, suggested that serious errors and neglect should be criminalised.28 There is already legislation in place, including the Health and Safety at Work Act 1974 and the Corporate Manslaughter and Corporate Homicide Act 2007, that may address the consequences of serious errors. A number of high-profile cases, notably that of David Sellu, have demonstrated the vulnerability of the individual health care practitioner to systemic failures.24,29
Judging from the aviation industry’s experience, a culture of openness, candour and forgiveness (in the absence of frank recklessness) may be the better approach in the long run. Regulation 4 of the Civil Aviation (Investigation of Air Accidents and Incidents) Regulations 1996 reads: ‘The sole objective of the investigation of an accident or incident under these Regulations shall be the prevention of accidents and incidents. It shall not be the purpose of such an investigation to apportion blame or liability’.30 This approach is echoed by the aviation authorities concerned with accident investigation across the world.
In medicine, the shift towards a culture of safety has been slow and challenging, as eloquently described by Dr Atul Gawande in his book The Checklist Manifesto, concerning the introduction of the World Health Organization Surgical Safety Checklist (modelled to an extent on aviation checklists).31 Tools other than checklists, such as simulation and structured error reporting, have also taken root. Indeed, in situ simulation is already being used successfully to identify latent system threats on paediatric intensive care units.32 A recent systematic review found in situ simulation to be a useful patient safety tool for reducing morbidity and mortality.33 The National Reporting and Learning System forms a central database of patient safety incident reports nationwide. Human factors training has been incorporated into the curricula for emergency medicine and intensive care, as well as being included in resuscitation manuals.34
Checklists, SOPs and protocols are all standardised safety tools. Whilst the implementation of SOPs remains vulnerable to operator error, developing equipment that prevents, or at least significantly reduces, error could be a more robust approach. Directly pertinent to the case of Mayra Cabrera is the introduction of non-Luer neuraxial connectors which, although they will not prevent all types of error (there remains the potential to draw up the wrong drug into a syringe intended for another route), would ‘fill one large hole in the cheese’, as Kinsella notes.35 Improving safety is a global responsibility – for individuals, local institutions, national and international bodies, and the drug and equipment supply chains.
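A minimal, purely illustrative sketch of the ‘forcing function’ principle behind such connectors – the wrong connection is rejected at the interface itself, rather than depending on the user reading a label – is given below; all class and function names are hypothetical and stand in for physical connector geometry:

```python
# Hypothetical illustration of a 'forcing function': incompatible interfaces
# block a wrong-route connection before any drug reaches the patient.

class LuerSyringe:
    """Syringe with a standard Luer fitting - compatible with intravenous lines."""
    def __init__(self, drug: str):
        self.drug = drug

class NonLuerSyringe:
    """Syringe with a non-Luer fitting - compatible only with neuraxial devices."""
    def __init__(self, drug: str):
        self.drug = drug

def infuse_intravenously(syringe: LuerSyringe) -> None:
    # The 'connector check': a non-Luer syringe simply does not fit the IV line.
    if not isinstance(syringe, LuerSyringe):
        raise TypeError("Connector mismatch: this syringe does not fit an IV line")
    print(f"Infusing {syringe.drug} intravenously")

infuse_intravenously(LuerSyringe("oxytocin"))  # fits: intended intravenous drug

try:
    infuse_intravenously(NonLuerSyringe("bupivacaine"))  # wrong route
except TypeError as exc:
    print(f"Blocked at the interface: {exc}")

# Note: as the text observes, this defence cannot stop the wrong drug being
# drawn up into the 'right' syringe in the first place.
```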
The future may also see health care systems adopt a modified version of the Crew Resource Management (CRM) training model from the aviation industry, promoting the optimal use of available resources. CRM employs a conceptual model of three lines of defence – ‘avoid, trap, mitigate’ – as countermeasures to error.36 Avoiding errors requires a high level of situational awareness. Errors that do occur should be immediately recognised, understood and trapped. Finally, if errors cannot be trapped, their consequences should be mitigated.37 In aviation, the value of CRM training is now unquestioned.
Another model that can be borrowed from aviation is LOSA, the ‘line operations safety audit’, whereby non-judgemental expert observers collect data about crew behaviour and situational factors during a normal flight, with a focus on threat and error management. One could imagine this played out on the ICU, highlighting potential latent threats and the team behaviours that either promote or threaten safety. Furthermore, the Healthcare Safety Investigation Branch (HSIB), closely modelled on the Air Accidents Investigation Branch (AAIB), was established in April 2017 and is led by Keith Conradi, the former chief investigator of the AAIB. Its mission statement is ‘to improve patient safety through effective and independent investigations that do not apportion blame or liability’ and to make ‘meaningful safety recommendations’ from which all can benefit.38
The error factory
The frequency, consequences and psychology of errors may fill us all with unease, and yet most doctors continue to come to work – to the factory of errors we have described. They come because of their dedication to the treatment and care of their patients, working in partnership with other professionals within the health care system. The practice of medicine concerns individuals, uncertainty and risk, whereas health care systems, seen as an industry, focus on populations, certainty and profit. The Six Sigma approach (Motorola/GE) encountered in industry offers a disciplined, data-driven method for eliminating defects and increasing productivity. It relies on decision making, based on information gathered about a process, to engineer out errors and reduce waste. With this model in mind, one could argue that the health care system should be engineered in such a way that it becomes easier for the medical practitioner working within it to do the correct thing, harder to make an error, and, if an error is made, easier to mitigate its effects through early recognition. Simple solutions often work best. Ensuring the use of SOPs, protocols and checklists in the working environment is one such example. This would also include the standardisation of equipment, drug labelling, storage and pharmacy policies, etc., across clinical areas and, ideally, throughout institutions.
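As a rough, hypothetical illustration of the Six Sigma vocabulary (not drawn from the source article), the ICU figures quoted earlier can be expressed as defects per million opportunities (DPMO) and a sigma level, assuming the 1.5-sigma shift conventionally used in Six Sigma reporting:

```python
# Express the ICU averages (1.7 errors across ~178 activities per patient per
# day) in Six Sigma terms: DPMO and an approximate sigma level.
from scipy.stats import norm

errors_per_day = 1.7       # average errors per patient per day
activities_per_day = 178   # average care activities per patient per day

# Defects per million opportunities, the standard Six Sigma metric
dpmo = errors_per_day / activities_per_day * 1_000_000

# Convert DPMO to a sigma level using the conventional 1.5-sigma shift
sigma_level = norm.ppf(1 - dpmo / 1_000_000) + 1.5

print(f"DPMO: {dpmo:.0f}")                # ~9551
print(f"Sigma level: {sigma_level:.2f}")  # ~3.85, far short of the 3.4 DPMO that defines 'six sigma'
```

On this reading, a 99% proficiency level, impressive as it sounds, sits several sigma below the defect rates that safety-critical industries set as their target.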
A simplistic solution would be to remove human inputs from the system altogether; indeed, the Health Secretary has suggested the use of computers instead of doctors in patient management.39 However, high reliability organisations have recognised that human responsiveness to changing events is one of the system’s most important safeguards.18 Machines are much better than humans in the vigilance mode, monitoring relatively stable situations in anticipation of a rare crisis, but humans, unlike machines, have the ability to identify, analyse and manage the unexpected crisis. Humans are fallible, but they are also an irreplaceable element of any present-day health care system. Importantly, it takes a human being to understand another’s suffering without reducing him or her to a mere ‘condition’, and to empathise and help in bearing the burden of illness. For this reason alone, it is important to understand and support health care professionals in the context of the job they are trying to do. Errors are not necessarily due to incompetence, and therefore blaming and punishing individuals may do nothing to improve the organisational system. Other high-risk industries – aviation and the nuclear industry, to name two – are characterised by a strong organisational commitment to safety, including high levels of redundancy built into the system in terms of both personnel and safety measures. This is further strengthened by a strong organisational culture of continuous learning and a willingness to change.8 Can we say the same about the current culture of the NHS?
Conclusion
The total elimination of errors would be a utopian dream even in the most highly optimised system; ‘never events’ will never disappear.40 Human beings are the perpetrators of errors and, at the same time, the most essential elements of a health care delivery system. Accountability for error often seems to focus on the role of the individual health care professional. The case of Mayra Cabrera is notable because the organisation was also held to account for its role in the complex causal network that contributed to the midwife’s error. Recognising this, and holding institutions to account for failing to provide appropriate systems, processes, training and equipment, could encourage all those working in the health care system – clinical, managerial and administrative staff – to take joint responsibility and play their part in identifying, trapping and preventing errors.
Most health care professionals are already embracing the evolving culture of safety. The system can be further improved, in the interests of all concerned, including patients and carers, if it promotes a culture of enhancing individual performance through continuous education and personal development, as well as engineering solutions that reduce the likelihood of error at both local and national levels. Above all, this has to be underpinned by working in partnership and by a no-blame culture that is non-judgemental and supportive, and that promotes reporting and shared learning from error. This would enable us truly to strive to fulfil the central tenet of medical practice: ‘primum non nocere’.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
References
- 1. Pope A. An essay on criticism [Internet]. Read Books Ltd, http://www.myilibrary.com?id=852685 (2014, accessed 6 August 2017).
- 2. Berwick D (ed). A promise to learn – a commitment to act [Internet]. Department of Health, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/226703/Berwick_Report.pdf (2013, accessed 5 May 2017).
- 3. Mother’s epidural death unlawful. BBC News [Internet], http://news.bbc.co.uk/1/hi/england/wiltshire/7226836.stm (2008, accessed 3 June 2017).
- 4. Cooper GM and McClure JH. Anaesthesia chapter from Saving Mothers’ Lives; reviewing maternal deaths to make pregnancy safer. Br J Anaesth 2008; 100: 17–22.
- 5. Safer practice with epidural injections and infusions [Internet]. National Patient Safety Agency, http://www.nrls.npsa.nhs.uk/EasySiteWeb/getresource.axd?AssetID=60063& (2007, accessed 18 June 2017).
- 6. Dyer C. Trust faces prosecution over death of patient who was given the wrong drug after birth of son. BMJ 2009; 339: b4954.
- 7. Brearey-Horne PJ. The Corporate Manslaughter and Corporate Homicide Act 2007 and maternal death: an opportunity to address systemic deficiencies in maternity services? In: Griffiths D and Sanders A (eds). Bioethics, medicine and the criminal law [Internet]. Cambridge: Cambridge University Press, 2013, pp. 210–226. http://ebooks.cambridge.org/ref/id/CBO9781139109376A023 (2013, accessed 2 July 2017).
- 8. Kohn LT, Corrigan J and Donaldson MS (eds). To err is human: building a safer health system. Washington, DC: National Academy Press, 2000.
- 9. Makary MA and Daniel M. Medical error – the third leading cause of death in the US. BMJ 2016; 353: i2139.
- 10. Leape LL. Error in medicine. J Am Med Assoc 1994; 272: 1851.
- 11. Dalton D and Williams N. Building a culture of candour: a review of the threshold for the duty of candour and of the incentives for care organisations to be candid [Internet], https://www.rcseng.ac.uk/policy/documents/CandourreviewFinal.pdf (2014, accessed 18 June 2017).
- 12. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med 1991; 324: 370–376.
- 13. Vincent C, Neale G and Woloshynowych M. Adverse events in British hospitals: preliminary retrospective record review. BMJ 2001; 322: 517–519.
- 14. Mahajan RP. Medication errors: can we prevent them? Br J Anaesth 2011; 107: 3–5.
- 15. Reason JT. Human error. Cambridge: Cambridge University Press, 1990.
- 16. Merry A and McCall Smith RA. Errors, medicine and the law. Cambridge: Cambridge University Press, 2001.
- 17. McDowell SE and Ferner RE. Medical manslaughter. BMJ 2013; 347: f5609.
- 18. Reason J. Human error: models and management. BMJ 2000; 320: 768–770.
- 19. First, do no harm [Internet]. General Medical Council, http://www.gmc-uk.org/First_do_no_harm_patient_safety_in_undergrad_education_FINAL.pdf_62483215.pdf (2015, accessed 14 January 2017).
- 20. Exploring the costs of unsafe care in the NHS: a report prepared for the Department of Health [Internet]. London: Frontier Economics, http://www.frontier-economics.com/documents/2014/10/exploring-the-costs-of-unsafe-care-in-the-nhs-frontier-report-2-2-2-2.pdf (2014, accessed 18 June 2017).
- 21. Edrees H and Federico F. Supporting clinicians after medical error. BMJ 2015; 350: h1982.
- 22. Toft B. External inquiry into the adverse incident that occurred at Queen’s Medical Centre, Nottingham [Internet]. Department of Health, http://www.who.int/patientsafety/news/Queens%20Medical%20Centre%20report%20(Toft).pdf (2001).
- 23. Holbrook J. The criminalisation of fatal medical mistakes. BMJ 2003; 327: 1118–1119.
- 24. Donaldson LJ. Shadow of the law in cases of avoidable harm. BMJ 2016; 355: i6268.
- 25. Anger as fatal jab doctor freed [Internet]. BBC News, http://news.bbc.co.uk/1/hi/health/3133076.stm (2003, accessed 9 March 2017).
- 26. Horsfall S. Doctors who commit suicide while under GMC fitness to practise investigation [Internet]. General Medical Council, http://www.gmc-uk.org/Internal_review_into_suicide_in_FTP_processes.pdf_59088696.pdf (2014, accessed 2 May 2017).
- 27. Coughlan R. Dr. Edward Teller’s magnificent obsession. Life 1954; 37: 62.
- 28. Hawkes N. Serious errors and neglect in the NHS should be a criminal offence, says safety expert. BMJ 2013; 347: f4973.
- 29. Dyer C. Where should the buck stop? Doctors taking the blame for system failure. BMJ 2016; 355: i6274.
- 30. The Civil Aviation (Investigation of Air Accidents and Incidents) Regulations 1996, Regulation 4 [Internet], http://www.legislation.gov.uk/uksi/1996/2798/regulation/4/made (1996, accessed 9 March 2017).
- 31. Gawande A. The checklist manifesto: how to get things right. 1st ed. New York: Metropolitan Books, 2010.
- 32. Auerbach M, Kessler DO and Patterson M. The use of in situ simulation to detect latent safety threats in paediatrics: a cross-sectional survey. BMJ Simul Technol Enhanc Learn 2015; 1: 77–82.
- 33. Fent G, Blythe J, Farooq O, et al. In situ simulation as a tool for patient safety: a systematic review identifying how it is used and its effectiveness. BMJ Simul Technol Enhanc Learn 2015; 1: 103–110.
- 34. Samuels M and Wieteska S (eds). Advanced paediatric life support: a practical approach. 6th ed. Chichester: John Wiley, 2016.
- 35. Kinsella M and Sharpe P. Obstetric Anaesthetists’ Association Survey #168: non-Luer equipment for specialised procedures – epidural blood patch and spinal catheters [Internet], http://www.oaa-anaes.ac.uk/assets/_managed/cms/files/Surveys/OAA%20Survey%20_168_nonLuer%20equipment%20specialised.pdf (2016, accessed 11 July 2017).
- 36. CAP 737: flightcrew human factors handbook [Internet]. Civil Aviation Authority, http://publicapps.caa.co.uk/docs/33/CAP%20737%20DEC16.pdf (2016, accessed 7 February 2017).
- 37. Helmreich RL, Merritt AC and Wilhelm JA. The evolution of crew resource management training in commercial aviation. Int J Aviat Psychol 1999; 9: 19–32.
- 38. Healthcare Safety Investigation Branch [Internet], https://www.hsib.org.uk (2017, accessed 10 May 2017).
- 39. Watt H. Jeremy Hunt: people will be diagnosed at home by a computer [Internet]. The Telegraph, http://www.telegraph.co.uk/news/nhs/11391132/Jeremy-Hunt-People-will-be-diagnosed-at-home-by-a-computer.html (2015, accessed 9 May 2017).
- 40. Pandit JJ. Deaths by horsekick in the Prussian army – and other “Never Events” in large organisations. Anaesthesia 2016; 71: 7–11.