BJA Education. 2021 Sep 1;21(11):420–425. doi: 10.1016/j.bjae.2021.07.004

Cognitive biases in diagnosis and decision making during anaesthesia and intensive care

CS Webster,1 S Taylor,2 JM Weller1,2

Learning objectives.

By reading this article, you should be able to:

  • Discuss the widely accepted dual process theory of cognition and its relevance to bias and diagnosis.

  • Illustrate the evidence base for strategies to reduce bias during diagnosis.

  • Apply evidence to the development of more effective bias reduction strategies.

Diagnosing the patient's condition is perhaps the single most important task performed by clinicians, as an incorrect diagnosis may lead to an incorrect management plan. In high-intensity domains such as anaesthesia and intensive care, diagnosis is often performed under time pressure and in rapidly evolving and uncertain situations, putting clinicians at particular risk of error.

In this context, cognitive bias is typically defined as a flaw or distortion in judgment and decision making that can lead to poor outcomes. More than 100 different identifiable biases have been reported in healthcare.1 The overall rate of incorrect diagnosis in healthcare has been estimated to be between 10% and 15%, with autopsy studies suggesting higher rates.1,2 Human error is known to be a major contributor to preventable harm to patients, and is associated with substantial injury, death and large financial costs.3 Reducing or eliminating cognitive biases would therefore potentially reduce harm. However, reducing cognitive bias is easier to propose than to achieve.

In a forthcoming paper we will consider the effects of interpersonal bias on the interactions between members of clinical teams, between clinicians and their patients, and the consequences for healthcare outcomes. In this article, we consider the underlying causes of cognitive biases, why they are so difficult to eliminate, several common manifestations of bias, and the evidence for the effectiveness of strategies to reduce cognitive biases and their consequences.

Psychological mechanisms

Although it is common to conceptualise human bias in purely negative terms, any consideration of the underlying causes of bias quickly identifies the fact that the same psychological mechanisms are involved in both biased and non-biased thought and action. These underlying mechanisms are those that allow us to identify categories of objects in the world and to discern the similarities and differences between them – a general-purpose cognitive ability that begins to form in children from only a few years old.4 Such cognitive abilities have obvious survival advantages; for example in finding nutritious food, avoiding dangers and discerning ‘us’ from ‘them’ within family and tribal groups.5 It is also advantageous for many of these discriminations to be carried out very quickly. Consequently, our brains have evolved the ability to recognise common patterns using rapid unconscious processing, thus leaving more effortful conscious processing for dealing with novel experiences or problems.

Dual process theory or system-1 and system-2 processing

The division of human cognitive abilities into conscious and unconscious processes has been apparent to philosophers and psychologists for centuries, but recently this division has been popularised as thinking ‘fast and slow’ or dual process theory.6 In the dual process theory, unconscious processing is called system-1, and conscious processing is called system-2, with both systems operating in parallel and simultaneously. System-1 is fast, automatic, intuitive and largely relies on pattern recognition. System-1 is also responsible for instant emotive reactions such as fear or anger in response to perceived dangers or threats. In contrast, system-2 is slow, effortful, deliberative and associated with conscious reasoning. It is important to note that almost all cognitive tasks use some mixture of system-1 and system-2 processing. However, we are often unaware of this distinction because system-1 operates automatically or unconsciously. For example inducing anaesthesia involves consciously deciding which drugs to use (system-2 processing), but also engages a series of highly practised motor skills involved in drawing and giving drugs in a routine sequence, many elements of which are handled unconsciously by system-1.

The advantages of system-1 are that it is fast, effortless and very often accurate. The disadvantages of system-1 include the fact that because it operates unconsciously, the conscious mind does not have access to its underlying mechanisms, often meaning that decisions and choices made by system-1 cannot be explained other than by intuition. The advantages of system-2 are that it can handle complex, novel problems and is able to offer rational explanations for decisions and choices. However, system-2 thinking is slow and effortful, making it unsuitable for many time-critical tasks.

Our ability to introspect and explain the actions of system-2, but not of system-1, leads to the common misunderstanding that system-2 is less error prone than system-1, therefore making system-2 preferable for high-stakes decisions. In fact, both systems perform very well most of the time, meaning that the great majority of tasks involving predominantly system-1 or system-2 processing result in good outcomes. However, both systems are also capable of being affected by bias, potentially leading to incorrect decisions or actions. Cognitive biases common during diagnosis and medical decision making have recently been documented in two evidence-based reviews, which identified 14 and 19 biases, respectively.7,8 Table 1 shows a summary of the nine biases common to both reviews, with definitions of each. Biases in system-1, affecting unconscious, emotive or automatic responses, are collectively termed implicit bias. Biases in system-2, affecting conscious attitudes, beliefs and knowledge, are collectively termed explicit bias.

Table 1.

Common cognitive biases affecting technical tasks in anaesthesia such as diagnosis and management

Anchoring: Being excessively influenced by one element of the presented or prior information, which subsequently biases the interpretation of later information.
Availability bias: Choosing a particular interpretation or diagnosis because it is at the front of mind (including frequency and recency bias).
Premature closure: Arriving at a conclusion or diagnosis too early, without considering all possibilities.
Confirmation bias: Seeking or prioritising information that confirms current or desired thinking, rather than considering all information.
Framing effect: The order or way in which initial information is presented ‘frames’ or biases the way subsequent information is interpreted.
Commission bias: The tendency to act rather than not to act, hence motivating unneeded treatments or actions.
Overconfidence bias: The common tendency to believe we know more than we do, or that we are better-than-average practitioners. May lead to action based on incomplete information or hunches, rather than carefully gathered evidence.
Omission bias: The tendency not to act when intervention is indicated, for example a hesitancy to initiate emergency measures because of worries about being wrong or harming the patient.
Sunk costs: Unwillingness to give up on a poor conclusion or diagnosis because much time or resource has been invested in developing it.

Examples of how such common biases can operate at an implicit or explicit level are illustrated in Clinical scenarios 1 and 2. In Clinical scenario 1, initial examination of the patient did not identify any evidence of immediately life-threatening injury. However, later when the patient became increasingly anxious, he offered the explanation that he was prone to anxiety attacks. This admission appears to have biased subsequent interpretation of the patient's deteriorating condition by anchoring the diagnosis on an anxiety attack rather than prompting further investigation of more serious possibilities, eventually resulting in the tragic death of the patient. The lack of rational consideration of other possibilities when the patient's condition deteriorated (a system-2 activity) suggests that anchoring bias was operating implicitly in this case.

Clinical scenario 1.

Mr G, a 28-year-old man, was brought into the Emergency Department with a leg fracture, following a car crash where the airbag had deployed. He was reviewed by Dr C who noted he had a Glasgow Coma Scale of 15 with a non-tender abdomen and a clear chest, but would require surgery on his fractured leg. A junior anaesthetist then performed a preoperative assessment of Mr G. However, within the next 60 minutes Mr G became increasingly anxious. He explained that he had a history of anxiety attacks for which he had sought counselling. He was reassured by nursing staff and given paracetamol 1g. He became more anxious when talking about how he almost went through the windscreen. He was breathing rapidly and complained his fingers were tingling. Dr C believed he was having a panic attack and attempted to calm him down. Mr G complained of feeling claustrophobic, his pulse became 122 beats min−1, blood pressure 102/58 mmHg, and his oxygen saturation 91%. As staff continued their attempts to calm him his breathing slowed, he became limp and lost consciousness. ECG showed pulseless electrical activity and despite resuscitation attempts, Mr G died. Dr C had wrongly attributed Mr G's symptoms to dehydration and anxiety. The coroner recorded a verdict of accidental death, concluding that the death was due to the injuries sustained from the car crash.

Source: Adapted from McKeague, G. Casebook 26, November 2018, page 6. Available from: https://www.medicalprotection.org/newzealand/casebook.

Clinical scenario 2.

Mr K was a 35-year-old builder who weighed 110 kg and who presented for acute plating of his ankle fracture. He was a current smoker with no other medical history of note. His admission blood pressure was 160/95 mmHg and Mr K mentioned that he had ‘white coat syndrome’.

Anaesthesia was induced with fentanyl and propofol. After 3 attempts at laryngeal mask placement the anaesthetist, Dr T, noticed the blood pressure had not been measured since the beginning of the case (over 5 minutes prior).

The first blood pressure cuff inflated and deflated twice but was unable to take a reading. Dr T assumed this was due to the surgeon moving the patient. Dr T cycled the blood pressure unsuccessfully for a second time. Dr T thought this was because the cuff was too small. A larger cuff was attached, and the blood pressure cycled again. The reading was 68/50 mmHg. Boluses of metaraminol and i.v. fluids were given immediately with the desired effect. An infusion of metaraminol was commenced and continued throughout the procedure. Mr K made an uneventful recovery.

Source: Case description heard by one of the authors.

In Clinical scenario 2, we see the operation of both implicit and explicit bias in cognitive processes. The anaesthetist notices that the patient's blood pressure has not been taken for some time. She assumes that, in an otherwise healthy patient, a normal blood pressure would be expected in this situation (implicit bias). Hence, when the anaesthetist is unable to obtain a blood pressure reading, she concludes that this is caused by some technical problem. Confirmation bias is the tendency to prioritise information that confirms existing thinking, rather than considering other possibilities. Hence, the anaesthetist consciously seeks information to support her working diagnosis: that the blood pressure is normal, and that the inability to obtain a reading has resulted from the surgeon moving the patient or from the use of the wrong size of blood pressure cuff, rather than considering the possibility that the patient's blood pressure is in fact too low to detect. Thus, the interpretation of environmental factors as the reason for the lack of a blood pressure reading is an explicit bias.

Strategies to reduce cognitive bias and diagnostic error

Many strategies have been proposed to reduce cognitive bias and diagnostic error. Although these approaches are described using a variety of overlapping terminologies, we provide a representative overview in Table 2.7–16 Most of these approaches are consistent with, or make mention of, the dual process theory of cognition, including one prominent approach called cognitive debiasing proposed by Croskerry.2 Cognitive debiasing comprises a final debiasing step before making a diagnosis, in which the clinician makes a deliberate effort to decouple from the intuitive responses of system-1, remains aware of common biases (such as those in Table 1), and uses system-2 to appraise their current reasoning.

Table 2.

Commonly reported strategies proposed to reduce cognitive bias and diagnostic error

Training on cognitive bias and human error: Specific training or education covering the theory of cognitive decision making and its application to diagnosis. Evidence: training increased knowledge, but there was no evidence that this translated into fewer diagnostic errors by individuals.9

Cognitive debiasing: Performing a calibration step before the determination of a diagnosis, involving decoupling from intuitive system-1 processing. Evidence: in a systematic review of 68 relevant papers, 42.5% found support for their debiasing hypotheses.10

Slowing down strategies: Making a conscious effort to slow down to avoid premature closure on a diagnosis. Evidence: diagnosis took significantly longer in the ‘slow’ condition, but accuracy was not improved.11

Consider the opposite, or other alternatives: A deliberate step to consider the opposite conclusion, or other alternatives, in order to validate the final diagnosis. Evidence: more justification for diagnoses was presented, but there was no evidence that this translated into fewer diagnostic errors.12

Mindfulness techniques: Training in mindful or reflective practice intended to focus attention and reduce diagnostic error. Evidence: in a systematic review of 33 relevant papers, 79% contained only opinion-based justifications; seven (21%) non-randomised studies suggested some benefit in processes or outcomes related to increased diagnostic accuracy.13

Second opinion or group decision making: Seeking a second opinion, or more than one, in complex cases. Evidence: modest reduction in diagnostic errors in some domains.16

Stopping rules and standing rules: Rules designed to determine, respectively, when information gathering can stop and whether must-not-miss alternatives have been considered before the final diagnosis. Evidence: none published.

Checklists: A formal set of checks customised to the work domain or task to ensure critical steps are not missed. Evidence: effective in a number of healthcare domains, including the surgical operating room (a 36% reduction in surgical complications), but specific evidence in diagnosis is lacking.14

Clinical decision support systems: Computerised systems capable of analysing patient data and making recommendations or alerts during decision making. Evidence: reduced medication errors and improved adherence to best practice, but evidence in diagnosis is lacking.15

Many of the other strategies in Table 2 comprise general purpose approaches intended to increase awareness of cognitive biases and their potential effects on clinical reasoning. The most general approach is training on the nature of biases, human error and decision making. Others include getting a second opinion and metacognitive strategies, such as mindfulness and slowing-down strategies, intended to focus the clinician's attention during the diagnostic process and reduce reliance on system-1 processing.

The remainder of the strategies in Table 2 involve structured approaches to avoiding bias, including memorised rule systems such as the use of stopping rules, intended to indicate when sufficient information has been gathered to allow an unbiased diagnosis to be made. In addition, the strategy to ‘consider the opposite’ or alternatives is intended to avoid confirmation bias and validate the final diagnosis. Finally, approaches such as diagnostic checklists and computerised clinical decision support systems represent technological approaches intended to support and facilitate the cognitive abilities of clinicians.
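
To make the logic of stopping and standing rules concrete, the following short Python sketch expresses them as simple functions. It is purely illustrative: the rule contents, threshold and diagnosis names are assumptions invented for this example, not validated clinical criteria or any published tool.

```python
# Illustrative sketch only: hypothetical stopping and standing rules for a
# diagnostic work-up. The rule contents and threshold below are invented
# assumptions for illustration, not validated clinical criteria.

MUST_NOT_MISS = {"tension pneumothorax", "haemorrhagic shock", "anaphylaxis"}


def stopping_rule(information_items, minimum_items=5):
    """Stopping rule: has enough independent information been gathered?"""
    return len(information_items) >= minimum_items


def standing_rule(considered_diagnoses):
    """Standing rule: which must-not-miss diagnoses have not yet been considered?"""
    return MUST_NOT_MISS - considered_diagnoses


def ready_to_commit(information_items, considered_diagnoses):
    """Commit to a working diagnosis only when both rules are satisfied."""
    outstanding = standing_rule(considered_diagnoses)
    if outstanding:
        print("Consider before closing:", ", ".join(sorted(outstanding)))
        return False
    return stopping_rule(information_items)


# Example: anxiety has been considered, but the must-not-miss causes have not.
print(ready_to_commit(
    information_items={"history", "examination", "SpO2", "heart rate"},
    considered_diagnoses={"panic attack"},
))
```

In practice such rules would more plausibly be embedded in a checklist or decision support tool than memorised, consistent with the evidence favouring externally supported strategies discussed below.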

Evidence for the efficacy of strategies to avoid bias

Strategies to avoid bias in clinical diagnosis and decision making are often promoted on the basis of opinion or theory-based arguments without any empirical evidence.13 The evidence that does exist on the efficacy of such strategies is mixed at best (Table 2). Systematic reviews on debiasing and mindfulness techniques suggest that only a minority of included studies show quantitative evidence of efficacy.10,13 In addition, a profusion of differing terminologies across individual studies makes it difficult to compare and summarise findings. For example one systematic review found 71 distinct terms used in describing mindful practice alone.13 Furthermore, the strategies listed in Table 2 may be used in isolation or in combination, again making it difficult to gauge the effect of any single strategy. Debiasing, for example, may be conducted in a number of ways, including by incorporating many of the individual strategies as part of the final self-checking step of this process. However, one systematic review suggests that debiasing approaches that include elements of technological support, such as cognitive aids, are more effective than those that rely only on the cognitive effort or memory of the clinician.10 This finding is consistent with the larger literature in healthcare on error prevention, where general-purpose directives to ‘be more careful’ and ‘avoid error’ are typically ineffective, whereas checklists and clinical decision support systems have shown strong evidence of the ability to improve healthcare practice and patients' outcomes (Table 2).17,18

Blind spots and limitations

Many of the proposed strategies for reducing cognitive bias in healthcare may yield relatively small gains (Table 2), yet environmental factors, such as workload and fatigue, are likely to play a greater role in predisposing to biases and errors.19,20 For example in a large-scale observational study in the USA, hospital interns working traditional schedules that included multiple work periods longer than 24 h each month had a 20.8% greater chance of making serious medication errors, and were 5.6 times more likely to make serious diagnostic errors, than when working without extended-duration shifts.20 Fatigue levels such as these have been equated with blood-alcohol concentrations in terms of their detrimental effects on performance, suggesting that working a shift of 17 h or more impairs performance to a degree equivalent to being over the legal blood-alcohol limit for driving a car.19

Many of the studies evaluating strategies to reduce bias make use of medical students, often in laboratory conditions.10,13 This is despite the fact that experts are known to process information differently to non-experts or novices.21 In particular, because many aspects of experts' work have become highly practised, experts are typically able to carry out more tasks with system-1 compared with novices, and so have greater cognitive resource available in system-2 in order to perform bias reduction strategies. For example one study found that resident clinicians who used reflective practice made more correct diagnoses than those who did not (52% vs 43%, P=0.03); however, this was not the case for student participants in the study.22 Given that the cognitive resources of system-2 are finite, adding additional cognitive load by asking clinicians to conduct mindful practice, debiasing or other self-checking strategies may be unsustainable, and this may be particularly the case for students or junior doctors.

The error proneness of system-2 is also largely ignored in many reports in healthcare, particularly when it is proposed that clinicians use system-2 thinking to perform the final check before reaching a diagnosis. Given the fact that both system-1 and system-2 can be biased, the primary limitation of all self-checking bias-reduction strategies is that these rely on the ability of an imperfect cognitive system to check itself. Such a limitation obviously does not apply to strategies to reduce bias that do not rely on self-checking. For example, a knowledge of cognitive biases may allow team members to better detect biases in others and to correct for them during team-based decision making. Another strategy to reduce cognitive bias that does not rely on self-checking is the use of external cognitive aids or clinical decision support systems during decision making. Such externally situated checking strategies appear to offer the most sustainable and effective solution to bias reduction and improved diagnostic accuracy.
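
As a concrete illustration of such an externally situated check, the short Python sketch below implements a simple machine-side rule that flags a prolonged gap in valid blood pressure readings, of the kind that might have prompted earlier review in Clinical scenario 2. The 5 min threshold, data format and wording of the alert are assumptions for illustration only, not a real monitor interface or a validated alarm.

```python
# Illustrative sketch only: an external (machine-side) check that does not rely
# on the clinician's self-checking. The threshold and data format are assumptions.
from datetime import datetime, timedelta

MAX_GAP = timedelta(minutes=5)  # assumed acceptable interval between valid readings


def bp_gap_alert(valid_reading_times, now):
    """Return an alert string if no valid NIBP reading has been obtained
    within MAX_GAP of 'now'; otherwise return None."""
    if not valid_reading_times or now - max(valid_reading_times) > MAX_GAP:
        return ("No valid blood pressure reading in the last 5 min: "
                "consider true hypotension, not only measurement failure.")
    return None


# Example: the last successful reading was 6 min ago, so an alert is returned.
last_readings = [datetime(2021, 9, 1, 10, 0)]
print(bp_gap_alert(last_readings, now=datetime(2021, 9, 1, 10, 6)))
```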

Future directions

The creation of computer systems capable of storing large numbers of rules and analysing large volumes of patient data has been proposed since the 1970s.23 However, it is only recently that this technology has matured enough to begin to be practically applied in the clinical workplace.21,24 When clinicians work closely with such decision support systems, the systems are known as augmented cognition (AC) systems, as – at least in principle – they can seamlessly expand the rule set and information processing capabilities of the clinician's system-2.25 Development of AC systems must take into account a more detailed understanding of human psychology than has previously been used in the design of medical devices, in order to support and facilitate rather than hinder or annoy the clinician. A further consideration for future research could be the role of inclusive leadership and team-based decision making in recognising and protecting against diagnostic errors made by the leader.16 A parallel technological development, high-fidelity healthcare simulation, provides an ideal test bed for fine tuning and determining the safety of team-based and decision support systems, including AC, without exposing patients to potential risks.
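
As a minimal sketch of the kind of prompt an AC or decision support system might provide, the Python fragment below maps an observed clinical situation to a stored list of alternative explanations for the clinician to consider. The rule base is a toy example invented for illustration; a real system would draw on far richer patient data and validated knowledge.

```python
# Illustrative sketch only: a toy rule base that surfaces alternative explanations
# for an observation, aiming to expand the clinician's system-2 reasoning rather
# than replace it. The entries are invented for illustration, not clinical guidance.

ALTERNATIVE_PROMPTS = {
    "no_nibp_reading_after_induction": [
        "true hypotension (e.g. relative drug overdose, hypovolaemia, anaphylaxis)",
        "measurement problem (cuff size, patient movement, tubing kink)",
    ],
    "acute_anxiety_after_major_trauma": [
        "hypoxia or occult haemorrhage",
        "panic attack",
    ],
}


def suggest_alternatives(observation):
    """Return stored alternative explanations for an observation, if any."""
    return ALTERNATIVE_PROMPTS.get(observation, ["no stored prompts for this observation"])


for suggestion in suggest_alternatives("acute_anxiety_after_major_trauma"):
    print("Consider:", suggestion)
```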

Conclusions

Cognitive biases commonly contribute to mistakes made during diagnosis and decision making in healthcare. Human cognitive processing comprises two systems: a fast system-1, largely based on pattern matching; and a slow system-2, associated with more rational reasoning. A common misconception is that system-2 is less prone to error than system-1. In fact, both systems work very well most of the time, but both may be affected by bias that leads to errors. Most proposed strategies for reducing bias involve some type of self-checking using system-2, but the primary limitation of such approaches is that they rely on the ability of an imperfect cognitive system to check itself. The great majority of proposed strategies to reduce cognitive bias are promoted on the basis of opinion rather than evidence. The most promising strategies for sustainable bias reduction are those that do not rely on self-checking, such as team-based decision making, in which clinicians learn about cognitive bias in order to detect and correct it in each other, the use of appropriate cognitive aids, and the use of clinical decision support systems. Awareness and active management of fatigue is also important, as fatigue is known to substantially increase the risk of cognitive bias. Healthcare simulation also offers a valuable approach for testing future strategies for reducing cognitive bias.

MCQs

The associated MCQs (to support CME/CPD activity) will be accessible at www.bjaed.org/cme/home by subscribers to BJA Education.

Declaration of interests

CSW is a minor shareholder in SaferSleep LLC, a company that manufactures an anaesthesia record system. JMW is a member of the editorial board of the British Journal of Anaesthesia. ST has no potential conflicts to declare.

Biographies

Craig Webster MSc PhD is an associate professor at the Department of Anaesthesiology and Centre for Medical and Health Sciences Education at the University of Auckland, New Zealand. He is a psychologist with more than 20 yr experience in human factors research and redesign projects.

Saana Taylor MBChB is a senior registrar in anaesthesia at Auckland City Hospital, New Zealand.

Jennifer Weller MD MClinEd FANZCA is head of the Centre for Medical and Health Sciences Education at the University of Auckland, New Zealand. Professor Weller is a specialist anaesthetist at Auckland City Hospital, and is on the editorial board of the British Journal of Anaesthesia. Her research encompasses medical education, patient safety and simulation-based team training.

Matrix codes: 1I03, 2C03, 3J02

References

  • 1. Croskerry P. From mindless to mindful practice — cognitive bias and clinical decision making. N Engl J Med. 2013;368:2445–2448. doi: 10.1056/NEJMp1303712
  • 2. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84:1022–1028. doi: 10.1097/ACM.0b013e3181ace703
  • 3. Institute of Medicine. To err is human — building a safer health system. Washington DC: National Academy Press; 2000.
  • 4. Deng W., Sloutsky V.M. Selective attention, diffused attention, and the development of categorization. Cogn Psychol. 2016;91:24–62. doi: 10.1016/j.cogpsych.2016.09.002
  • 5. Amodio D.M. The neuroscience of prejudice and stereotyping. Nat Rev Neurosci. 2014;15:670–682. doi: 10.1038/nrn3800
  • 6. Kahneman D. Thinking, fast and slow. London: Penguin; 2012.
  • 7. Stiegler M.P., Neelankavil J.P., Canales C. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Br J Anaesth. 2012;108:229–235. doi: 10.1093/bja/aer387
  • 8. Saposnik G., Redelmeier D., Ruff C.C. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016;16:138. doi: 10.1186/s12911-016-0377-1
  • 9. Reilly J.B., Ogdie A.R., Von Feldt J.M. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf. 2013;22:1044–1050. doi: 10.1136/bmjqs-2013-001987
  • 10. Ludolph R., Schulz P.J. Debiasing health-related judgements and decision making: a systematic review. Med Decis Making. 2017;38:3–13. doi: 10.1177/0272989X17716672
  • 11. Norman G.R., Monteiro S.D., Sherbino J. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med. 2017;92:23–30. doi: 10.1097/ACM.0000000000001421
  • 12. Hirt E.R., Markman K.D. Multiple explanation: a consider-an-alternative strategy for debiasing judgments. J Pers Soc Psychol. 1995;69:1069–1086.
  • 13. Pinnock R., Ritchie D., Gallagher S. The efficacy of mindful practice in improving diagnosis in healthcare: a systematic review and evidence synthesis. Adv Health Sci Educ. 2021. doi: 10.1007/s10459-020-10022-x
  • 14. Haynes A.B., Weiser T.G., Berry W.R. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360:491–499. doi: 10.1056/NEJMsa0810119
  • 15. Kawamoto K., Houlihan C.A., Balas E.A. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330:765. doi: 10.1136/bmj.38398.500764.8F
  • 16. Minehart R.D., Foldy E.G., Long J.A. Challenging gender stereotypes and advancing inclusive leadership in the operating theatre. Br J Anaesth. 2020;124:e148–e154. doi: 10.1016/j.bja.2019.12.015
  • 17. Berwick D.M. Not again! Preventing errors lies in redesign — not exhortation. BMJ. 2001;322:247–248.
  • 18. De Bie A.J.R., Nan S., Vermeulen L.R.E. Intelligent dynamic clinical checklists improved checklist compliance in the intensive care unit. Br J Anaesth. 2017;119:231–238. doi: 10.1093/bja/aex129
  • 19. Dawson D., Reid K. Fatigue, alcohol and performance impairment. Nature. 1997;388:235. doi: 10.1038/40775
  • 20. Landrigan C.P., Rothschild J.M., Cronin J.W. Effect of reducing interns' work hours on serious medical errors in intensive care units. N Engl J Med. 2004;351:1838–1848. doi: 10.1056/NEJMoa041406
  • 21. Webster C.S., Weller J.M. Data visualisation and cognitive ergonomics in anaesthesia and healthcare. Br J Anaesth. 2021;126:913–915. doi: 10.1016/j.bja.2021.01.009
  • 22. Mamede S., Splinter T.A.W., Van Gog T. Exploring the role of salient distracting clinical features in the emergence of diagnostic errors and the mechanisms through which reflection counteracts mistakes. BMJ Qual Saf. 2012;21:295–300. doi: 10.1136/bmjqs-2011-000518
  • 23. Schwartz W.B. Medicine and the computer. The promise and problems of change. N Engl J Med. 1970;283:1257–1264. doi: 10.1056/NEJM197012032832305
  • 24. Wachter R. The digital doctor — hope, hype, and harm at the dawn of medicine's computer age. New York: McGraw-Hill; 2015.
  • 25. Glenn L.M., Boyce J.A.S. At the nexus: augmented cognition, health care, and the law. J Cogn Eng Decis Mak. 2007;1:363–373.
