Abstract
Aim:
This is Article 2 of a three-part series on clinical reasoning that encourages practitioners to explore and understand how they think and make case-based decisions. It is hoped that, in the process, they will learn to trust their intuition but, at the same time, put in place safeguards to diminish the impact of bias and misguided logic on their diagnostic decision-making.
Series outline:
Article 1, published in the January 2016 issue of JFMS, discussed the relative merits and shortcomings of System 1 thinking (immediate and unconscious) and System 2 thinking (effortful and analytical). This second article examines ways of managing cognitive error, particularly the negative impact of bias, when making a diagnosis. Article 3, to appear in the May 2016 issue, explores the use of heuristics (mental short cuts) and illness scripts in diagnostic reasoning.
Introduction
Making errors in veterinary practice is a constant source of irritation and disappointment. When cognitive errors happen, we feel we have let ourselves down. We also feel we have let down the patient and its owner. This is especially pertinent when it is not clear why a misdiagnosis was made, because we will be concerned that we may make the same mistake in the future. While this article cannot provide a single strategy for removing cognitive error from the diagnostic process, it can hopefully contribute by reducing the impact of such errors on clinical decision-making.
We will attempt to do this by asking you, the reader, to engage in metacognition (that is, to reflect on how you think about the diagnostic process) as we work through some case scenarios. These have been selected to emphasise cognitive error, especially that created through the use of bias. Many of the case scenarios are from our own personal experience, so we feel very qualified and not at all embarrassed (well, maybe a little!) in discussing what can go wrong when making a diagnosis. Before working through the case studies, we need to lay the groundwork for understanding the range of cognitive errors that might potentially occur in practice and to discuss those factors that predispose clinicians to such errors.
In Article 1 of this series, we introduced the relative merits and disadvantages of intuitive System 1 thinking (most commonly using pattern recognition) and System 2 thinking (a trained form of analytical, problem-oriented, data- or hypothesis-driven thinking) in diagnostic reasoning.1 We also introduced the concept of error when using one or both types of thinking to reach a diagnosis. Diagnostic error has been divided into (i) ‘no-fault’ error, due to an unusual presentation of the disease or deliberate or unknowing deception by the owner; (ii) system-related error, due to organisational or technical failures leading to lack of results, delayed results, or false or misleading results; and (iii) cognitive error (flawed clinical reasoning), due to faulty knowledge, data gathering or synthesis.2
In human medicine, cognitive error is generally recognised as the most common factor in getting the diagnosis wrong. It is likely that this is also the case in veterinary medicine (however tempting it might be to blame the laboratory for misleading you!). In Article 1 we suggested that cognitive error is more likely to occur when engaging in System 1 thinking, rather than System 2 thinking, because of greater use of bias, the influence of affective states, and a lack of conscious safeguards against error. We still hold to that view, but we should qualify it by stating that System 2 thinking has its own set of biases to contend with and manage. Moreover, when using either system of thinking there is the potential for poorly understood, inbuilt (‘hardware’ issue) cognitive error unrelated to tiredness, stress or emotion. If one accepts these generalisations, then the obvious question is: ‘what can be done to diminish cognitive error in diagnostic reasoning, irrespective of the system of thinking employed?’

The first step towards minimising cognitive errors is to try to understand why they occur. The second step is to develop strategies to reduce the likelihood of them occurring. Of course, our current knowledge of why some cognitive errors occur is incomplete, so the general strategies presented here are no panacea; but they should, if used appropriately, help diminish cognitive error in clinical reasoning. This, in a nutshell, is the basis for this article: to help you better recognise when you are being led astray, work out how to get back on the right track, and avoid repeating the mistake in the future.

What is cognitive error and why might it occur?
That clinical reasoning can err is well accepted by the medical fraternity and has led to analysis of why errors occur.3 While not all the factors contributing to cognitive error in medical clinical diagnosis have been delineated, it is highly likely that many apply equally to cognitive error in veterinary diagnosis. Flawed or misleading thinking strikes us all, especially towards the end of a long day filled with difficult conversations and many disappointments. Sometimes it strikes when we are feeling a little smug after a brilliant diagnosis. More commonly, it happens when we are feeling unsure because of a personal crisis (maybe a difficult surgery looming, a procedure that didn’t go well, or a letter of complaint from a client). What is probably harder to accept and manage is when the diagnosis goes wrong while we are feeling relaxed and apparently in good form. This last situation, which has happened to all the authors, highlights our incomplete understanding of the factors causing errors in thinking. This review acknowledges that ignorance and mindfully focuses on those factors known to contribute to cognitive error. For example, since personality, past experiences and present mood can influence the system of thinking utilised to reach a diagnosis, it follows that they may contribute to some errors in the reasoning process. It has been suggested that affective states, such as mood and sentiments, can influence and sometimes flaw both detection (sensory and other forms of perception) and, possibly more so, abductive and deductive reasoning during decision-making in any arena.4
Cognitive error, as an inherent weakness in the scientific process, was perhaps best expressed by Sir Francis Bacon, a disgraced, corrupt politician who suddenly found that he had plenty of time on his hands to pursue his interest in experimental science. His treatise, which he called Novum Organum (Part 1 – Science or ‘True Suggestions for the Interpretation of Nature’),5 established the basis for the fundamental approach to modern inductive scientific investigation. In it, Bacon outlined four ‘idols of the mind’ capable of generating false notions in scientific investigation. ‘Idols of the Tribe’ relate to deceptive reasoning common to the human race (a ‘hardware’ issue, as mentioned earlier, which can affect both System 1 and System 2 thinking). ‘Idols of the Cave’ are aspects of deceptive reasoning specific to individuals (a ‘software’ issue related to the influence of one’s past experiences, personality and present mood). ‘Idols of the Marketplace’ relate to reasoning errors based on the written or spoken word, which may deliberately or inadvertently mislead the reader or listener. Finally, ‘Idols of the Theatre’ refer to the perpetuation of misleading ideas by learned groups (dogma perpetuated by experts of the time).
All these ‘idols’ are as relevant today as they were 400 years ago. This suggests that, while there has been dramatic technological advancement in society, little has changed about the way we think when making decisions. Moreover, because the idols relate to reasons for cognitive error, they apply to faulty clinical reasoning irrespective of whether it primarily employs intuitive pattern recognition or a more effortful analytical, data-driven approach. For example, how many of you were given an idea or approach to diagnosis for a particular disease at university, or at a conference by a highly respected colleague, and utilised it in intuitive System 1 or analytical System 2 thinking before realising, months or perhaps even years later, that it had misled you? One of us (PJC) has had that experience and had to review a number of similar cases over a 5-year period as a consequence. This related to being misguided on some key histopathological features used to distinguish between severe inflammatory bowel disease and diffuse small cell malignant lymphoma. Thankfully, while the diagnosis had to be adjusted for some cases, treatment and outcome were not greatly affected in any of them. It seems it doesn’t matter how well trained we are; we probably all harbour ‘urban myths’ about disease that have been implanted by articles or presentations.
Article 1, on types of clinical reasoning, discussed emotion as one factor that may lead to cognitive error because of its attachment to some long-term memories.6 How does this relate to the situation in veterinary practice? Some types of cognitive error are a price we pay for experiencing rapid emotional responses to sensory information related to a case. Much of the time an emotional response in decision-making can be very useful (eg, do I need to calm this anxious client?). But it also has the potential to mislead when making complex decisions about disease causation. This is because emotion can not only influence memory usage but also directly affect clinical reasoning in both systems of thinking. That said, emotion does not seem to play a role in many forms of cognitive error. Rather, it is factual information related to perception, as opposed to emotion, that often provides powerful triggers for the retrieval of useful memories related to past cases, either as instance or generic illness scripts.
Bias – friend or foe?
There are many publications on cognitive error in clinical diagnosis in medicine, most focusing on when bias misleads.1,7,8 Bias is defined as possessing a predisposition or prejudice. It has been suggested that forms of cognitive bias exist in both System 1 and System 2 thinking for veterinary diagnosis, but that bias leading to cognitive error has a greater impact on System 1.9 Numerous biases affect the process of diagnosis. Explanations of the more common ones, and examples of when they may come into play and mislead in veterinary practice, are given in Table 1. The information provided in Table 1 is the crux of this article. We urge you to work your way carefully through the table, and we hope you will find it rewarding.
Table 1.
Common biases utilised in clinical reasoning and examples of how they sometimes cause cognitive error
| Bias | Definition | Example |
|---|---|---|
| Confirmation bias | The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions about a case – This is related to the so-called premature closure bias, where other possibilities are ignored or discounted too soon – It is also related to the diagnostic strategy of going for the obvious (Sutton’s Law) and ignoring other possibilities (Sutton’s Slip) – Confirmation bias is common in new graduates because of their lack of domain knowledge and consequent willingness to latch onto a diagnosis they know about | A 10-year-old cat is presented to you (newly graduated) with polydipsia, polyuria, weight loss and a small ventral cervical nodule. Laboratory testing demonstrates mild azotaemia, reduced urine concentrating ability, elevated total serum calcium concentration, normoglycaemia and a normal serum thyroxine concentration. A plausible explanation for the accumulated data, and one that is well known to you, is chronic renal insufficiency. Consequently, you recommend renal prescription diets and phosphate binders to the owner. However, the cat continues to show the initial presenting complaints and starts to vomit. You continue to attribute this to chronic kidney disease and undertake further renal function tests to support that diagnosis, while not considering other possibilities. (Confirmation and premature closure biases kick in, possibly because of ignorance of other conditions and possibly because you do not want to appear wrong.) The cat might alternatively have primary hyperparathyroidism, so determining the ionised calcium concentration and parathyroid hormone concentration (and possibly even the renin concentration) is sensible. Perhaps the cervical thyroid mass should be subjected to ultrasonographic examination and needle aspiration or surgical excision |
| Anchoring bias | The tendency to rely too heavily, or ‘anchor’, on one trait or piece of information when making decisions. Often this anchor is inflicted on you by others – This might be viewed as focusing on one piece of the puzzle at the expense of understanding the ‘big picture’ – Related to the anchoring and adjustment heuristic (a reasoning strategy used in System 2 thinking – discussed in Article 3) | You focus on treatment of idiopathic cystitis in a female cat with stranguria and haematuria because signs apparently developed after a new kitten was introduced into the household (the anchor). However, the problem continues unabated despite your recommended stress reduction programme (behavioural modification and use of feline facial pheromone diffusers). A colleague later radiographs the cat and detects a radiopaque urolith in the bladder |
| Gambler’s fallacy | The tendency to think that the probability of a cat having a particular diagnosis or prognosis is influenced by preceding but independent cases. In other words, the mistaken belief that if something happens more frequently than normal during some period, then it will happen less frequently in the future, or vice versa – Remember, each case is usually an independent event, except for contagious and environmental diseases where case clusters are common (eg, viral upper respiratory disease) | It has been an extraordinary month and you have diagnosed malignant lymphoma in four cats. A young Siamese cat is then presented with clinical signs of thoracic disease. Surely this can’t be another case of lymphoma? You treat the cat for likely infectious disease, especially since it has a mildly elevated temperature. There is no improvement over the next few days, so the animal is radiographed. A mediastinal mass is detected, which proves to be malignant lymphoma on cytological examination |
| Availability bias | The tendency to overestimate the likelihood of events that have greater ‘availability’ in memory. Availability can be influenced by how recent the memories are, or how unusual or emotionally charged they may be. In other words, choosing a diagnosis because it is at the forefront of your mind, especially if the last case left a strong impression – The availability bias is referred to as the availability heuristic when it gives you the right answer (discussed in Article 3) | You presumptively diagnose a dyspnoeic cat with purulent pleurisy (pyothorax) because you saw two similar cats earlier in the year with dyspnoea and halitosis that turned out to have pyothorax (the halitosis was a particular trigger for recall of those cases). You may be right (availability heuristic), but have you considered that this case might actually have chylothorax secondary to primary myocardial disease, feline infectious peritonitis, mediastinal lymphoma or a diaphragmatic rupture? |
| Feedback bias | The tendency to interpret no feedback on a case as positive feedback – This bias influences diagnosis (as well as treatment) | You diagnose your third case of exocrine pancreatic insufficiency in a cat with weight loss, a good appetite and poorly formed stools, and treat as per the first two cases with enzyme replacement therapy. The cat returns in 2 weeks with no improvement and further investigations reveal diffuse small cell lymphoma of the gut. You later find out from your colleague down the road (where the first two cases were taken for a second opinion) that one cat had diabetes mellitus, while the other had hyperthyroidism |
| Overconfidence bias | Boldness in diagnosis based on a belief of infallibility. Often this leads to not asking for advice, when that advice may have helped forge the path to the right diagnosis – This bias, in combination with commission bias (the tendency towards action, rather than inaction, in making a diagnosis), can lead to a quick but erroneous diagnosis based on inadequate or incorrect evidence | You are presented with a coughing cat. Having recently arrived from an area where lungworm is a common cause of airway disease, and wanting to impress your new employer, you make a presumptive diagnosis of aelurostrongylosis and treat with milbemycin. The cat gets worse. Your new employer then informs you that lungworm is rare in the area and asks whether you considered the alternative diagnosis of feline asthma. Bronchoalveolar lavage demonstrates eosinophilic inflammation, excess mucus and no infectious agents, and the cat improves markedly with the instigation of inhaled bronchodilator and corticosteroid therapy |
| Omission bias | The tendency towards inaction in making a diagnosis because of lack of confidence or fear of the consequences for the owner of diagnosing a serious or potentially fatal illness in their cat – This may also be related to the affect bias, which is the tendency to focus on diagnoses that lead to good rather than bad outcomes for the cat and owner | You are presented with an aged cat with swelling and ulceration of two adjacent toes. The owner is an amputee (due to cancer) and is very attached to his cat. You suggest an infectious aetiology as your presumptive diagnosis (a better outcome than cancer) and treat accordingly. The cat gets worse and you castigate yourself for not investigating the toe lesions via cytology, culture or biopsy, and for not ruling ‘lung–digit syndrome’ in or out by taking chest radiographs |
| Hindsight bias | The tendency to view an outcome, once known, as having been predictable all along. Otherwise known as the ‘knew-it-all-along effect’ or ‘creeping determinism’ – This is related to the outcome bias, which is the tendency to judge the quality of a diagnostic decision on the basis of the outcome of the decision, rather than on the basis of what was known at the time the decision was made | Your employer points out a case of jaundice in a cat referable to hepatic lipidosis and focuses on a few key points (‘illness script’) that led him to that diagnosis (without telling you all the difficulties incurred in reaching that diagnosis and the complex comorbidities; after all, he is engaging in hindsight bias!). As a consequence of this bias, you now believe that it will be really easy for you to diagnose the next hepatic lipidosis case that comes through the door. The next jaundiced cat presents and you feel great when you are correct – but the subsequent two have neutrophilic cholangitis and feline infectious anaemia, respectively |
| Visceral bias | The tendency to harbour negative (or positive) feelings towards the owner of a cat (or even a breed of cat!), which may result in diagnoses being missed or ignored – This is related to affect bias, which is the tendency to focus on diagnoses that lead to good rather than bad outcomes for the cat and owner | You are confronted by a bullying and opinionated owner who tells you that he thinks his cat has been in a fight because of a noisy altercation during the night. He wants you to give the cat some antibiotics so it will not go on to develop an abscess, as happened previously (and which cost him a lot of money to have lanced). He adds that your boss has always been happy to do this in the past. You cannot find a wound (or even tooth marks) and tell the owner (with some satisfaction) that you don’t think the cat has been in a fight and to just observe it. Three days later the owner returns with the cat because it has a swelling that has to be lanced. Oh dear, why did you let the owner get under your skin! |
| Shared information bias | The tendency for group members to spend more time and energy discussing information that all members are already familiar with (ie, shared information), and less time and energy discussing information that only some members are aware of (ie, unshared information) – The impact of this bias depends on your experience and personality, and the personalities of your colleagues | You think you might have a case of myasthenia gravis and discuss it with your two colleagues. Neither of them has diagnosed this condition before and they spend more time with you discussing other possibilities, such as tick paralysis, snake bite, spider bite and chronic organophosphorus intoxication. You explore those differential diagnoses over the next few days, to no avail. The owner loses patience and takes the cat to another practice, which, using an edrophonium-response test, diagnoses the condition you had considered first |
Note that biases rarely operate in isolation; a variety of cognitive errors may confound any one case because multiple biases are involved
While it is certainly true that cognitive bias can mislead, more often it is actually useful in the diagnostic process. Cognitive bias can save a busy practitioner a significant amount of time because it can focus the mind on a possible diagnosis (‘the nose for the story’).10 In other words, biases can provide short cuts for thinking about a diagnosis (Article 3 will explore this further). It is accepted that some cognitive biases can facilitate fast and correct diagnoses in veterinary practice.11
Of course, it is the potential for biases inherent in System 1 thinking to mislead that is our focus here. Since these biases act at the level of the subconscious, we are not aware of their action while we are carrying out diagnostic reasoning. However, knowing that they exist and create potential for error allows us to use trained System 2 thinking to apply some general measures to reduce their misleading effects – which is where this article is now heading. We should probably first remind you that System 2 thinking has its own set of biases that need to be managed while you are managing the System 1 biases that have led to cognitive error!
Clearly life wasn’t meant to be easy, but we hope that it can be made a little more straightforward with the following discussion of general strategies for managing bias in clinical reasoning.
Strategies to manage bias and harness its power in clinical diagnosis
Sherlock Holmes had a knack for expressing his understanding of how his mind worked when solving a problem (see box). And even the great man himself understood that thinking can go awry! Moreover, he advocated the first rule of managing cognitive bias when it misleads: recognition.
Having catalogued some of the more common cognitive biases and suggested some mechanisms by which they may give rise to error when thinking about a clinical problem (see Table 1), we can now focus on strategies to manage some of the more common cognitive errors in clinical reasoning (see Table 2). While bias cannot, and probably should not, be eliminated from thinking, its capacity to mislead can be diminished in intensity and frequency, and its effects sometimes repaired. Additionally, safeguards can be put in place to ensure that cognitive error through the use of bias has less impact the next time it strikes.
Table 2.
Suggested metacognitive strategies for managing cognitive error in clinical reasoning
| Strategy | Explanation |
|---|---|
| Knowledge of cognitive error | An understanding of how cognitive errors operate in clinical reasoning |
| Reflection (‘know thyself’) | Accept that you are fallible like every other human being (‘Idols of the Tribe’ are shared by all of us). By doing so you will be more willing to accept mistakes and get the most out of reviewing cases that have gone wrong (better decisions come from reflecting on poor decisions). Also reflect on why some biases seemed to work in your favour (ie, why they gave you the ‘nose for the story’) |
| ‘Getting on the balcony’ (or ‘forest gazing rather than tree spotting’) | View the diagnostic process holistically and realistically. Understand that there will always be some uncertainty about the diagnosis (hence the reason for probability algorithms for particular clinical presentations). Accepting uncertainty can build confidence in managing difficult cases and avoid error through the use of bias |
| Rationalising when using System 1 thinking (eg, pattern recognition) and checking on how you are using System 2 thinking | Search objectively for test data to support your pattern, and also for data that will exclude your diagnosis. Consider appropriate evidence-based clinical guidelines while conducting further testing. Encourage yourself to consider alternative diagnoses |
| Practising using trained System 2 thinking to support System 1 thinking | We learn ‘to do by doing’; so the more you use both systems of thinking, the easier it becomes to know when trained System 2 thinking is needed to support System 1 thinking |
| Acknowledging the impact of your mood and personality | Some biases have an affective (emotional) component. Accept that your present mood may influence the way you make decisions about a diagnosis. Your personality and past experiences will also influence your decision-making, mainly by impacting on your preference for thinking about cases |
| Slowing down (create review time) | This is often very difficult to do in a busy veterinary practice, but it may be possible to plan a review during the day of difficult diagnoses (ie, when alarm bells have gone off for various reasons) or perhaps of more complex cases. This is particularly valuable when you are having an ‘off day’ because of tiredness, stress or distraction. Slowing down is really about giving you time to reflect on select cases and ensuring that you and your colleagues have avoided error through the use of cognitive bias and become aware of the influence of other potential diagnostic errors |
| Accountability* (‘checks and balances’) | Bias leading to misdiagnoses may not be detected unless there are mechanisms to ensure feedback.† This may be through personal accountability – by setting aside time to research difficult or confusing cases and ensuring that client feedback is sought for most cases that you found difficult to diagnose, as well as for random ‘easy’ cases. You may be able to get supportive colleagues to scrutinise and review some of your case records and provide feedback on your thinking. Any errors in thinking will soon become evident. This last check highlights the importance of having a supportive work environment to ensure that you continue to learn in an enjoyable fashion |
| ‘No (wo)man is an island’ | Advice on difficult cases should always be sought and welcomed. However, it is important to be critical, even sceptical, of group wisdom as it may lead to error through ‘shared information bias’ and possibly ‘hindsight bias’. Nevertheless, the old adage ‘a problem shared is a problem solved’ applies strongly to clinical decision-making, since everyone learns through listening (not talking!) |
| Checklist reminders | Developing checklists for cases you know you have had difficulty with in the past may help reduce error and also provide you with a way forward when you are feeling concerned, anxious or tired (‘security blanket effect’). This may diminish the negative impact of certain biases. Some people combine checklists with ‘rule-ins and rule-outs’ thinking |
*Individual countries may also impose formal accountability schemes, such as clinical audits, to ensure standards are maintained. In the UK, for example, the Royal College of Veterinary Surgeons (RCVS) sets standards through a Guide to Professional Conduct. The RCVS also runs a voluntary Practice Standards Scheme, which sets a higher bar for veterinary practices.
†Post-mortem examination represents the ultimate form of feedback, and should be encouraged where there is doubt about the correct diagnosis in difficult cases that end badly.
Since the 1970s, psychological research has focused on cognitive fallibility, with work on biases and heuristics leading to a shared Nobel Prize in 2002 for Daniel Kahneman.12 Research on managing error through bias is less common, so it is much easier for us to summarise that information for you. In 1990, Gideon Keren coined the phrase ‘cognitive pills for cognitive ills’.13 He was writing about whether ‘debiasing’ methods and aids are effective in reducing error (hence the full question: ‘can cognitive pills cure cognitive ills?’). Since that time, metacognitive ‘debiasing’ strategies have become important for the medical fraternity, where misdiagnoses due to the use of cognitive bias have led to angst for physicians, novices and experts alike.7,8,14–18
Strategies to manage error caused by the use of bias are scant in the veterinary literature, but in a 2013 paper for veterinary educators, Professor Stephen May, of the Royal Veterinary College in the UK, suggested that these strategies need to be developed and introduced as early as possible for veterinary undergraduates.9
What can the veterinary practitioner do to combat error related to bias?
The first strategy is to be aware of the existence of bias and to accept that it may affect your diagnostic reasoning. This may seem obvious, but it requires effort because it is trained System 2 thinking. Of course, some of us might not perceive bias as relevant to the way we make decisions.19 Other impediments to accepting the need for general debiasing strategies include strong emotional (affective) objections, and personality and cultural influences.19 For example, many Euro-American cultures are likely to engage in confirmation bias (Figure 1) and, therefore, to resist confirmation debiasing; many Asian cultures, because of their capacity to hold seemingly contradictory views simultaneously, are less likely to engage in confirmation bias and more accepting of the need to combat its potentially harmful effects in decision-making.20
Figure 1.
When confirmation bias in diagnosis (or should that be overconfidence bias?) moves to a higher level!
Let’s assume that after reading this article you understand bias and how it may give rise to cognitive error in your diagnostic reasoning. What is the next step to manage its impact? Well, since bias can negatively affect System 1 thinking or untrained System 2 thinking (confirmation bias occurs largely in untrained System 2 thinking), it seems logical to engage in trained System 2 thinking to manage any misleading influence of bias. So, if you become aware that you have come to a presumptive diagnosis by intuition, or have even utilised some untrained System 2 thinking to follow up a presumptive diagnosis, force yourself to consider two groups of questions: first, what objective evidence supports your presumptive diagnosis, and what data would exclude it; and, secondly, what alternative diagnoses are possible?
Asking yourself the second question (what alternative diagnoses are possible?) can make you more forceful in combating error through the use of biases. This is supported by research published in the 1980s and 1990s, when terms such as ‘consider-the-opposite’ and ‘consider-an-alternative’ were used to describe this ‘debiasing’ strategy.21,22 However, close to a century beforehand, Sherlock Holmes knew the power of looking for alternatives (see box below).
You will notice that these questions also embody the four cornerstones of ‘good science’; namely, objectivity, accuracy, open-mindedness and scepticism. The last two, in particular, will stand you in good stead in accepting that there is always some uncertainty in formulating a diagnosis, and in deciding upon a level of diagnostic probability sufficient to feel comfortable in proceeding with treatment. Remember, a long-term favourable response to therapy often provides further evidence that a presumptive diagnosis is correct. In essence, by accepting uncertainty about the diagnostic process (no test will give you 100% certainty about a diagnosis – not even necropsy!) you can combat error related to the use of bias, particularly confirmation or overconfidence bias. Such an approach has currency in other disciplines; in meteorology, for example, it is now common to forecast a certain rainfall together with the probability of that rainfall.
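To make the point about uncertainty concrete, the interplay between pre-test probability (how common the disease is in your caseload) and test accuracy can be worked through with Bayes’ theorem. The sketch below is a minimal illustration only; the prevalence, sensitivity and specificity figures are hypothetical, chosen purely for demonstration.

```python
# Minimal sketch of post-test probability via Bayes' theorem.
# All figures are hypothetical, chosen purely for illustration.

def post_test_probability(prevalence: float, sensitivity: float,
                          specificity: float) -> float:
    """Probability of disease given a positive test result."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A seemingly strong test (90% sensitive, 95% specific) applied to an
# uncommon condition (pre-test probability of 5%) still leaves the
# diagnosis closer to a coin toss than to certainty:
p = post_test_probability(prevalence=0.05, sensitivity=0.90, specificity=0.95)
print(f"Post-test probability given a positive result: {p:.0%}")  # ~49%
```

Viewed this way, a favourable long-term response to therapy acts like a second ‘test’ that raises the probability of the presumptive diagnosis still further, which is exactly the point made above.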
A willingness to engage in trained System 2 thinking to support System 1 thinking will help diminish any negative impact of bias, thus leading to a higher probability of diagnostic accuracy.23 The old adage ‘practice makes perfect’ can perhaps be adapted to ‘practice at using trained System 2 thinking to support and manage System 1 thinking leads to better understanding of the power of both’. At first, deliberately checking your intuitive thought processes with System 2 thinking will slow down the diagnostic process, but the time delays diminish with practice and experience. Not that slowing down clinical reasoning is always a bad thing! It may seem detrimental to the ‘cash register’, and unwelcome when you want to get home for dinner but have a waiting room full of clients. But slowing down the decision-making process is an accepted strategy for diminishing any negative impact of bias (Table 2).18,19
Of course, slowing down as a strategy to combat cognitive error must be utilised appropriately and requires the practitioner to recognise and select the difficult cases where this approach is justified. Creating review time might involve setting aside a period of reflection on problem cases or perhaps, if working in a group practice, a session of discussion with trusted and supportive colleagues about challenging or enigmatic cases. Some practices will use a visiting internal medicine or imaging specialist as a sounding board for such case discussions. Indeed, this is why teaching hospitals have grand rounds. However, don’t expect this to work for every case, as even collective decision-making is prone to bias (as discussed, it may lead to error through the shared information bias and possibly hindsight bias).
Finally, strategies to combat cognitive error in clinical reasoning should always include accountability (checks and balances). This may be through personal accountability, by setting aside time to research difficult or confusing cases (using textbooks and medical databases) and by ensuring that client feedback occurs for random ‘easy’ cases and for most cases that you found difficult to diagnose. Additionally – and perhaps a little more daunting – you may be able to persuade supportive colleagues to scrutinise and review some of your cases and provide feedback on your reasoning. This is unlikely to identify all cognitive errors, but some that you have not thought of may become evident and a plan can be designed to try to avoid that type of thinking in future cases.
Accountability and reflection can generate valuable checklists for difficult cases or for times when you are just having an ‘off day’.24
Final comments
The general strategies outlined in this article will not always prevent cognitive errors in diagnosis, but we hope that through increased awareness you will diminish your error rate. Of course, it is up to you how you implement these strategies. Perhaps you have additional strategies in mind? We would love to hear what they are! We do encourage you to be brave, however, and at least try one or two strategies to combat the ‘dark side’ of bias.

We leave the last word to Sherlock Holmes: ‘There is nothing more stimulating than a case where everything goes against you.’ – Sir Arthur Conan Doyle (The Hound of the Baskervilles).
Supplemental Material
Richard Malik interviews Paul Canfield, lead author of the JFMS clinical reasoning series. To access this video, please visit: https://player.vimeo.com/video/150981038
Acknowledgments
Richard Malik is supported by the Valentine Charlton Bequest administered by the Centre for Veterinary Education of the University of Sydney.
Footnotes
Funding: The authors received no financial support for the research, authorship and/or publication of this article.
The authors declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Contributor Information
Paul J Canfield, Faculty of Veterinary Science, B14, University of Sydney, NSW 2006, Australia.
Martin L Whitehead, Chipping Norton Veterinary Hospital, Banbury Road, Chipping Norton, Oxon, OX7 5SY, UK.
Robert Johnson, South Penrith Veterinary Clinic, 126 Stafford Street, Penrith, NSW 2750, Australia.
Carolyn R O’Brien, Faculty of Veterinary Science, The University of Melbourne, Parkville, VIC 3152, Australia.
Richard Malik, Centre for Veterinary Education, B22, University of Sydney, NSW 2006, Australia.
References
1. Canfield PJ, Whitehead ML, Johnson R, et al. Case-based clinical reasoning in feline medicine. 1: Intuitive and analytical systems. J Feline Med Surg 2016; 18: 35–45.
2. Graber ML. Diagnostic error in internal medicine. Arch Int Med 2005; 165: 1493–1499.
3. Kohn LT, Corrigan JM, Donaldson MS. To err is human: building a safer health system. Report by the Committee on Quality of Health Care in America, Institute of Medicine. Washington: National Academies Press, 2000, pp 1–287.
4. Eysenck MW, Keane MT. Cognition and emotion. In: Cognitive psychology. 6th ed. Hove and New York: Psychology Press, 2010, pp 571–605.
5. Bacon F. Novum organum (part 1 – science or ‘true suggestions for the interpretation of nature’). Lexington: Forgotten Books, 2010, pp 13–24.
6. Ashwell KWS. The brain book – development, function, disorder, health. New York: Firefly Books (US), 2012, pp 1–36.
7. Singh H, Petersen LA, Thomas EJ. Understanding diagnostic errors in medicine: a lesson from aviation. Qual Saf Health Care (Brit Med J) 2007; 15: 159–164.
8. Stiegler MP, Neelankavil JP, Canales C, et al. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Brit J Anaesth 2012; 108: 229–235.
9. May SA. Clinical reasoning and case-based decision making: the fundamental challenge to veterinary educators. J Vet Med Educ 2013; 40: 200–209.
10. Smallberg G. Bias is the nose for the story. In: Brockman J (ed). This will make you smarter – new concepts to improve your thinking. London: Transworld Publishers, 2012, pp 43–45.
11. McKenzie BA. Commentary – veterinary clinical decision-making: cognitive biases, external constraints, and strategies for improvement. J Am Vet Med Assoc 2014; 244: 271–276.
12. Tversky A, Kahneman D. Judgement under uncertainty: heuristics and biases. Science 1974; 185: 1124–1131.
13. Keren G. Cognitive aids and debiasing methods: can cognitive pills cure cognitive ills? Adv Psychol 1990; 68: 523–552.
14. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimise them. Acad Med 2003; 78: 775–780.
15. Croskerry P, Norman G. Overconfidence in clinical decision making. Am J Med 2008; 121: S24–S29.
16. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. Qual Saf Health Care (Brit Med J) 2013; 22: 58–64.
17. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. Qual Saf Health Care (Brit Med J) 2013; 22: 65–72.
18. Eva KW, Norman GR. Heuristics and biases – a biased perspective on clinical reasoning. Med Educ 2005; 39: 870–872.
19. Lilienfeld SO, Ammirati R, Landfield K. Giving debiasing away. Can psychological research on correcting cognitive errors promote human welfare? Perspect Psychol Sci 2009; 4: 390–398.
20. Peng K, Nisbett R. Culture, dialectics and reasoning about contradiction. Am Psychol 1999; 54: 741–754.
21. Lord C, Lepper M, Preston E. Considering the opposite: a corrective strategy for social judgment. J Pers Soc Psychol 1984; 47: 1231–1243.
22. Hirt E, Markman K. Multiple explanation: a consider-an-alternative strategy for debiasing judgments. J Pers Soc Psychol 1995; 69: 1069–1086.
23. Greenhalgh T. Intuition and evidence – uneasy bedfellows. Brit J Gen Pract 2002; 52: 395–400.
24. Gawande A. The checklist manifesto: how to get things right. New York: St Martin’s Press, 2011, pp 215–216.