Insights into Imaging. 2010 Oct 24;2(1):47–55. doi: 10.1007/s13244-010-0048-1

Image perception and interpretation of abnormalities; can we believe our eyes? Can we do something about it?

Durr-e-Sabih 1, Ayan Sabih 2, Quratulain Sabih 3, Ali N Khan 4
PMCID: PMC3259345  PMID: 22347933

Abstract

The radiologist’s visual impression of images is transmitted, via non-visual means (the report), to the clinician. There are several complex steps from the perception of the images by the radiologist to the understanding of the impression by the clinician. With a process as complex as this, it is no wonder that errors in perception, cognition, interpretation, transmission and understanding are very common. This paper reviews the processes of perception and error generation and possible strategies for minimising them.

Keywords: Image perception, Interpretation, Errors

Introduction

Humans rely upon their eyes, more than any other sense, to assess the world around them. We see effortlessly, and we see every waking moment of our lives; even when we sleep we “see” dreams. With such familiarity comes a sense of dependability and for most of us only “seeing is believing”; thus, the surprise and fascination when we look at optical illusions.

As medical imaging physicians, “seeing correctly” is our business. What we see and report has a tremendous impact upon the well-being of our patients. When we are wrong the impact extends to include potential patient harm, loss of personal self-esteem, risk to livelihood and even liberty.

Errors in medical imaging have been noticed since the early days of radiology, first reported by Garland [1] in 1959. The “surprising” degree of inaccuracy first reported over 50 years ago has persisted and seems to have remained unchanged. Some techniques are particularly prone to errors; these include chest X-rays, with a “miss rate” of 20–50% [2], and mammography, with a “miss rate” of up to 75% [3]. Most workers agree that if a radiologist is given only “positive” images to comment upon, an error rate of 30% occurs, but with a mix of normal and abnormal cases, representing usual clinical practice, the rate declines to about 4% [4].

The process of “seeing” is complex and the chain has anatomical, physiological, neuropsychological and psycho-emotional components. With a process as complex as “seeing” it should come as no surprise that there are so many opportunities for making mistakes.

Let us start by briefly reviewing the processes involved.

The input, anatomy and physiology that make seeing difficult

The process of “seeing” starts with the eyes. Helmholtz, the inventor of the ophthalmoscope, concluded that eyes have rather poor optics [5]. Eyes are elegantly designed for daytime hunting, for rapidly obtaining information about large objects, but not for detailed analysis. The retinal surfaces, where the whole process starts, are curved, the images are projected upside down, and the images are flat (depth perception is derived through post-processing, largely from the difference between the images from the two eyes, stereopsis, and from secondary cues such as the apparent difference in size of objects at different distances). Even more importantly, of the total retinal surface (25 cm²), only the 1.5-mm fovea has the right type of receptors (cones); within this, the 0.3-mm foveola is capillary-free and rod-free to allow detailed, colour vision [6]. The eye therefore executes rapid jerky movements, called saccades, that scan the scene and bring different areas of it onto the fovea. These movements, which can occur up to four times a second, are rapid and can reach speeds of up to 400 degrees per second [7]. The eye is essentially blind during saccadic movements. While foveal vision, which depends on cones, can process three to four high-quality colour images per second, peripheral vision, which uses rods, is less accurate and insensitive to colour but can process image information at up to 90 images per second.

Therefore, the input is essentially discontinuous and jerky, and there is a lot of noise: blurred images are projected during movements, alternating briefly with static, high-resolution images when the eyes are at rest. The stream of data has varying frame rates as well as varying resolution. It is from this input that we see the world in all of its colour, depth and movement.

The processing; the neurology that makes seeing difficult

There is a lot of information, arriving in pieces, that needs to be integrated to form images, and all of this must be done in real time, because the world around us is in motion and we need to respond rapidly. The brain gets around this by taking shortcuts, the most important of which is a process similar to Fourier analysis of images [8]. Fourier analysis splits information, in this case visual information, into frequency components, with higher frequencies carrying progressively finer detail. The brain analyses the lowest-frequency, least detailed component first; then, if time permits and interest dictates, higher-frequency components are analysed. Small amounts of information need less time, and this trick allows us to assess the visual situation rapidly and respond appropriately. It is this ability that allows us to recognise caricatures that are made up of only a few lines and contain no detail, and to recognise shadows and people we know in low-light conditions that hide detail (Fig. 1). While this is useful for most purposes and allows us to respond quickly and appropriately, it also makes the brain jump to conclusions that might be erroneous (Fig. 2). This sequential processing of visual information has been accepted as the most useful model of how we progressively appreciate visual detail [9].
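The idea can be sketched in a few lines of code. The following is a minimal illustration of coarse-to-fine spatial-frequency analysis, not a model of the visual system: keeping only the lowest spatial frequencies of an image discards fine detail while preserving coarse shape, which is why a silhouette remains recognisable. The function name and the synthetic "scene" are illustrative.

```python
# Low-pass filtering in the Fourier domain: keep only coarse structure.
import numpy as np

def low_pass(image: np.ndarray, keep_fraction: float = 0.1) -> np.ndarray:
    """Zero out all but the lowest spatial frequencies of a 2-D image."""
    f = np.fft.fftshift(np.fft.fft2(image))        # spectrum, DC term centred
    rows, cols = image.shape
    mask = np.zeros_like(f, dtype=bool)
    r, c = int(rows * keep_fraction / 2), int(cols * keep_fraction / 2)
    mask[rows // 2 - r: rows // 2 + r, cols // 2 - c: cols // 2 + c] = True
    f[~mask] = 0                                   # discard the fine detail
    return np.fft.ifft2(np.fft.ifftshift(f)).real  # coarse version of the scene

# A synthetic "scene": the bright block stays recognisable from its
# low frequencies alone, even though all fine detail is gone.
scene = np.zeros((128, 128))
scene[40:90, 50:80] = 1.0
coarse = low_pass(scene, keep_fraction=0.1)
```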

Fig. 1 Fourier analysis allows us to recognise silhouettes that contain very little detail

Fig. 2 The image is really made up of three Pac-Man figures and three triangles. The inverted light triangle is a visual illusion created by the way the brain’s Fourier-like analysis completes the image; concentrating on the figure makes this apparent

It does not stop here. An interesting analysis [10] calculates that 10 billion bits of information per second arrive at the retina and 6 million bits enter the optic nerves, but only 100 bits per second constitute conscious perception; there are also the limits of working memory [11–13] and the need for attention [14] before something in the field of vision is really appreciated.

The scene

Not every scene is equally easy to interpret; the content and the physical attributes of the scene are important, and a number of physical parameters affect interpretation, the most important attributes being contrast, signal-to-noise ratio (SNR), grey scale and colour content.

Contrast

The level of contrast needed for target identification varies from 0.5% to almost 100%. Low-contrast targets are difficult to appreciate and are the bane of radiologists, e.g. the isoechoic lesion on ultrasound and the isodense lesion on computed tomography (CT) that can be recognised only indirectly, through contour irregularities or displacement of identifiable adjacent structures.
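For readers who want the numbers behind a statement such as “0.5% contrast”, the standard Weber and Michelson definitions (not spelled out in the text) can be computed directly; the luminance values below are illustrative.

```python
# Weber contrast: a target against a uniform background.
def weber_contrast(target_luminance: float, background_luminance: float) -> float:
    return (target_luminance - background_luminance) / background_luminance

# Michelson contrast: periodic or two-level patterns.
def michelson_contrast(l_max: float, l_min: float) -> float:
    return (l_max - l_min) / (l_max + l_min)

# A nearly "isodense" lesion: 100.5 vs 100.0 luminance units gives
# 0.5% Weber contrast, near the detection limit mentioned in the text.
print(weber_contrast(100.5, 100.0))   # 0.005
```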

SNR

A related parameter is SNR. It refers to the photons that have been collected on the detector itself, and this quantum of information cannot be increased; even the most sophisticated post-processing will not create valid information that was not there in the first place. SNR has interesting implications in the study of perception, one of which is the relation of target size to detectability. It is intuitive to assume that larger targets, and getting closer to images, will make detection easier under low-contrast conditions; surprisingly, the opposite holds. The optimal viewing distance varies with target size, and it has been experimentally determined that for many lesions (e.g. 1-cm nodules on chest X-rays) increasing the viewing distance from 45 to 91 cm improves perception [15]. This obviously has practical implications.
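The quantum limit referred to here follows from Poisson counting statistics, a standard model not derived in the article: if a detector pixel collects N photons on average, the noise is √N, so the SNR is N/√N = √N, and quadrupling the dose only doubles the SNR. A minimal sketch:

```python
import numpy as np

def photon_snr(mean_photons: float) -> float:
    """SNR of a Poisson-limited detector pixel: N / sqrt(N) = sqrt(N)."""
    return np.sqrt(mean_photons)

print(photon_snr(100))   # 10.0
print(photon_snr(400))   # 20.0 -- four times the photons, only twice the SNR
```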

Grey scale

High-resolution images, like those generated by ultrasound, CT and magnetic resonance (MR) imaging, need a large number of grey shades to ensure smooth transitions; a small number of grey shades in the image leads to the compression of adjacent shades into one. This is called contouring and gives a “posterised” look to the image.
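Contouring is easy to reproduce: requantising a smooth grey ramp to a handful of shades collapses adjacent values into visible bands. A minimal sketch, with an illustrative 256-pixel ramp:

```python
import numpy as np

def quantize(image: np.ndarray, levels: int) -> np.ndarray:
    """Map a [0, 1] image onto `levels` discrete grey shades."""
    return np.round(image * (levels - 1)) / (levels - 1)

ramp = np.linspace(0.0, 1.0, 256).reshape(1, -1)  # smooth grey ramp
posterised = quantize(ramp, levels=8)             # visible banding ("contouring")
smooth = quantize(ramp, levels=256)               # transitions appear continuous
```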

Colour

The human eye is more sensitive to colour than to grey scale, so it should make sense to colourise medical images; in real life, however, colourisation creates noise within images, adding to the visual clutter, and colours do not seem to be useful for displaying normal and abnormal anatomy. There is also the problem of how we perceive different colours; nuclear medicine physicians are well aware of this and need to switch between different colour schemes and grey scale to be sure that what is seen in colour is really there. The same image displayed in different colour schemes can have different levels of feature conspicuity.
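The point about colour schemes can be demonstrated by rendering identical pixel values through different lookup tables; the synthetic data and the choice of matplotlib colormaps below are purely illustrative.

```python
# Same data, three renderings: the conspicuity of the subtle "hot"
# region differs under each lookup table.
import numpy as np
import matplotlib.cm as cm

counts = np.random.default_rng(0).poisson(50, size=(64, 64)).astype(float)
counts[30:34, 30:34] += 15                  # a subtle focal uptake
norm = (counts - counts.min()) / (counts.max() - counts.min())

as_grey = cm.gray(norm)                     # grey-scale rendering (RGBA)
as_hot = cm.hot(norm)                       # "hot iron" rendering
as_jet = cm.jet(norm)                       # rainbow rendering
```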

Functional imaging, however, lends itself easily to coloured imaging, especially when a single cyclical or uni- or bi-directional function is being evaluated. Cardiac contraction is an example where nuclear medicine imaging can be used for phase and amplitude analysis, and in Doppler, blood flow away from the probe and flow towards the probe can be visually separated by colour and analysed [16].

The neuropsychological processes involved in lesion recognition

There appear to be several processes that go on in the brain leading to lesion recognition; these include detection, localisation and identification. The processes and the accuracy with which patterns are analysed and interpreted depend, to an extent, upon the physical parameters of the image. However, physical attributes of an image are not the only determinants of the way we see things; there are also neurophysiological, psychological and psycho-emotional factors that influence the way we see medical images. These processes take the form of pattern recognition, spatial localisation of the area of interest and the comparison of the patterns with known patterns (Gestalt processing). A simplistic description of Gestalt theory is that perception cannot be reduced to its individual sensory components or the physical attributes of an image. It is a global phenomenon that transcends physics and physiology. The Gestalt principles describe how we group objects visually and recognise patterns; some of the Gestalt phenomena of relevance to medical imaging are proximity, similarity, closure, symmetry, common fate and continuity [17] (Table 1).

Table 1.

Gestalt principles

Proximity Objects that lie close together are perceived as a group
Similarity Objects that share visual characteristics such as shape, size or colour are grouped together
Closure The mind tends to complete incomplete figures, perceiving a whole object from partial contours
Symmetry Symmetrical elements tend to be perceived as belonging together
Common fate Elements that move together are perceived as a group
Continuity The eye tends to follow smooth, continuous lines and contours rather than abrupt changes

Much of this mental processing is automatic, effortless, rapid and carried out on an unconscious plane, like driving to work every morning, which takes very little conscious input. This thinking uses a schematic control mode: we carry in our heads a large number of schemata, or repetitive tasks, which we can call upon to process information rapidly without really thinking about what we are doing. Another cognitive process is the attentional control mode, which is used for problem solving as well as for monitoring the unconscious activities (e.g. noticing when we take a wrong turn on a well-travelled path); it is slow, sequential, effortful and difficult to sustain [18].

Human performance can be classified into three levels [19]: skill-based, which consists of patterns of thought and actions that are governed by pre-programmed instructions (schemata); rule-based, where solutions of familiar problems are governed by stored rules (if “x” is the problem, then “y” is the solution); or knowledge-based, which requires conscious analytic processing to synthesise known patterns to derive inference about new findings.

Humans prefer pattern recognition to calculation [20] and skill-based or rule-based thinking to knowledge-based thinking.

Causes of errors

One way of grouping errors in medicine is to classify them into no-fault errors, system errors and cognitive errors [21].

  • No-fault errors can be situations where the illness is silent, or is so atypical or rare as to escape consideration; these are errors that anyone in the situation might make.

  • System errors occur when there are faults in the healthcare system and include poor policy, inadequate training or supervision, defective communication and suboptimal working conditions like stress, fatigue and frustrations in the workplace.

  • Cognitive errors (Table 2) are the true “human errors” that cannot be blamed on the system, disease or patient himself; these relate to inadequate knowledge, faulty data-gathering, inaccurate clinical reasoning or faulty verification [22, 23].

Table 2.

Types of cognitive errors. Adapted from Croskerry [23] unless cited otherwise

Satisfaction of search Perhaps the most significant cause of diagnostic error: once a diagnostic finding is met with, the search stops, with the potential of missing a second finding that might be even more significant than the first.
Availability bias Recent experience modifies the threshold of diagnosis for a condition: if a certain condition has been seen recently, the tendency is to think of it in a new patient. Even more importantly, if a condition has been missed and brought to the notice of the physician, the next few patients will certainly be assessed along those lines. Similarly, if a finding has not been encountered for a long time it might not be considered as readily.
Capture A more frequently used schema captures or takes over from a similar but less familiar one [19]; for example, if an imaging routine involves looking at the left flank after the right flank, and the patient points to a mass in the mid abdomen that is seen after looking at the right flank, the left flank can be missed. These are also called post-completion errors and are most frequent when the interruption occurs just before the step that needed to be completed [45].
Gambler’s fallacy Thinking that if a series of patients of the same kind has been seen sequentially, the chance of the next patient having the same condition is diminished; something like imagining that if a coin flip gives ten heads in a row, the chance of an 11th head is reduced.
Aggregate bias Thinking that an individual physician’s patients are somehow unique and do not display the common features of a particular process; this can lead to false diagnoses and unnecessary procedures. Neglecting the base rate or local prevalence of a condition distorts Bayesian reasoning, although sometimes raising the prior probability of a condition can enable the diagnosis of a rare or rarely encountered condition.
Ascertainment bias The patient has non-medical attributes that touch upon the physician’s own prejudices, biasing him in a certain direction; overweight people [60], women [61], minorities [62] may all pay the price of a visceral bias held by the physician.
Anchoring Making an impression very early in the diagnostic process and then refusing to change it as new evidence becomes available. This leads to confirmation: evidence that supports the initial opinion is acknowledged while evidence to the contrary is ignored. The diagnosis gains its own momentum and it becomes more and more difficult to think of alternative possibilities. Finally, closure takes place and further thinking about a possible alternative diagnosis stops. One reason might be the mental investment already made in the diagnosis and the reluctance to see the work go to waste.
Alliterative errors A previous report by another radiologist, or even by the same reader, will influence the current reading. If a lesion has previously been called benign it will be judged similarly, and if significance has been assigned it will likewise be assigned on subsequent readings [63].
Overconfidence Tendency to believe that one knows more than one really does, prompting action on incomplete information, intuition or hunches.
Framing bias The patient’s diagnostic possibilities are restricted by the referral situation, or the question that is asked. For example, a referral from a gastroenterology service might cause a focus on the liver and gut but ignore the other viscera; or a differential diagnosis of a finding might be limited to only or mostly gastroenterology.
Pressure to report There is a “need” to find something wrong with the patient, so findings, often insignificant or even “invisible”, are reported in language that is ambiguous but might be misinterpreted as something significant (local data).
Misdirection Similar to the framing bias; commonest where the patient interacts with the imaging physician and points to the wrong site or emphasises a minor symptom, leading to a less detailed evaluation of the region where the significant pathological condition might actually lie. An example is a woman who will not be forthcoming about a gynaecological symptom during an ultrasound examination and insists that her presenting complaint is elsewhere. Walk-in patients and those without a proper referral requisition are most prone to creating this bias (local data).

Time available

It is intuitive that hurrying a reading session will increase the number of false negatives, but interestingly, spending too much time poring over an image tends to increase false positives as well as false negatives; an optimal time must therefore be allowed for each film report, depending on the technique, the complexity of the findings and familiarity with that particular finding [24, 25].

Expectation of abnormality

The expectation of abnormality often determines whether a finding is missed. Everyone has heard sad stories of lesions missed just because they were not expected; “the eye does not see what the mind does not know (or expect)” is dramatically demonstrated in these situations. Conversely, the mind might create a finding where none exists, based on expecting it. A whole gamut of errors arising from expectation has been identified; these have variously been called “cognitive errors”, “cognitive dispositions to respond (CDRs)” [23] or “biases” [26], and some are shown in Table 2.

Communicative errors

The report forms the link between the imaging physician’s perceptions and the clinician’s understanding of this perception. The importance of communicating the findings, the confidence (or lack of), the nuances of perception and language use, all make report writing critical. Those who have to report in a language not their own need to be even more careful because of the potential for misunderstanding or misinterpreting the findings.

A particularly important type of communicative error is the laterality error [27]. These fall into two broad categories: one in which the side of the lesion is correctly identified in the body of the report but switched in the impression/opinion section, and the other, more serious, error of reporting the lesion on the wrong side of the body in both the report and the opinion. Only a second look at the images and the report, a subsequent examination or, with potentially disastrous results, a therapeutic intervention will pick this up.
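Because the first variant leaves a trace in the report text itself, it lends itself to a simple automated cross-check. The sketch below is a hypothetical illustration, not a validated tool; a real system would need far more robust section parsing and negation handling.

```python
# Flag reports whose impression names a side the findings never mention.
import re

def sides_mentioned(text: str) -> set:
    return set(re.findall(r"\b(left|right)\b", text.lower()))

def laterality_mismatch(findings: str, impression: str) -> bool:
    f, i = sides_mentioned(findings), sides_mentioned(impression)
    return bool(i - f)   # impression introduces a side absent from findings

report_findings = "There is a 3 cm cyst in the left kidney."
report_impression = "Right renal cyst."   # side switched in the opinion
print(laterality_mismatch(report_findings, report_impression))  # True
```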

Wrong patient

When a procedure is carried out on the wrong patient [28, 29], the potential consequences are of truly epic proportions; it still happens and is probably more common than one would wish. In imaging, it is only too easy to assume that the images belong to the person claimed, because the opportunities for matching the images with the actual patients are limited.
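One defensive habit is to verify at least two independent identifiers between the images and the requisition before reporting, in the spirit of the Joint Commission’s “two-identifier” rule [28]. A minimal sketch with illustrative field names (not any real PACS or DICOM API):

```python
# Insist that both identifiers match before the study is reported.
def identifiers_match(image_meta: dict, requisition: dict) -> bool:
    checks = [
        image_meta.get("patient_name") == requisition.get("patient_name"),
        image_meta.get("date_of_birth") == requisition.get("date_of_birth"),
    ]
    return all(checks)

study = {"patient_name": "DOE^JANE", "date_of_birth": "1970-01-01"}
request = {"patient_name": "DOE^JANE", "date_of_birth": "1970-02-01"}
print(identifiers_match(study, request))   # False -- stop and verify
```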

Reducing errors

The aim in any healthcare activity should be to eliminate error completely; the only acceptable error rate should be zero [21]. However, given the complexity of the process and our rather tenuous understanding of the processes influencing perception, there is a feeling that “the search for zero error is doomed from the start” [30]. The desire to do something about error reduction in medicine can be gauged by the thousands of articles published on the topic. Physicians have looked to other professions, including the aviation industry, to learn about their methods of error management [31].

It is not possible to discuss all the possible strategies suggested for error prevention or reduction but a few practical methods can be mentioned.

Education and training

Continuing medical education, training and re-training help to improve management and diagnostic decisions. Initiating CME activities is usually a management responsibility, but individuals must actively think about their own weaknesses and seek education and training in their areas of need.

The models of these activities are already universally available in the form of educational meetings, workshop seminars, group discussions, simulation laboratories, certification programmes, etc.

Some of us are “experts”, right more times than others. The attributes of an expert in medical imaging include a disciplined strategy towards visual search; a wide knowledge base; the ability to use this knowledge to analyse the current situation and find recognisable patterns; continuous upgrading of the knowledge base so that more and more cues are available for recall as new diagnostic challenges are met; and finally an understanding of the context of the diagnostic examination, knowing what to look for and why, while keeping an open mind to unexpected or new findings [32].

The threshold for diagnosing abnormality, described by the receiver-operating characteristic (ROC) curve [33], should be continuously re-evaluated. With the understanding that a trade-off exists between high sensitivity and high specificity, a high-sensitivity or a high-specificity threshold should be chosen depending upon the practice requirements [21].
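The trade-off can be made concrete by sweeping a decision threshold over reader confidence scores and watching sensitivity and specificity move in opposite directions. The scores and labels below are synthetic, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
labels = np.r_[np.ones(50), np.zeros(50)]                    # 1 = abnormal case
scores = np.r_[rng.normal(2, 1, 50), rng.normal(0, 1, 50)]   # reader confidence

for threshold in (0.5, 1.0, 1.5):
    called_positive = scores >= threshold
    sensitivity = called_positive[labels == 1].mean()
    specificity = (~called_positive)[labels == 0].mean()
    print(f"threshold {threshold}: sens {sensitivity:.2f}, spec {specificity:.2f}")

# Lower thresholds catch more lesions (higher sensitivity) at the cost of
# more false positives (lower specificity); the right operating point
# depends on the practice setting.
```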

Clinical history and previous films

A clinical history accompanying a radiology requisition might [34] or might not [35] lead to more relevant reports, but some points in the history are extremely relevant to the imaging study; for example, sickle cell anaemia can be missed if the patient’s ethnicity is not known. Having the previous films, not only the report, to hand while conducting a new examination is always useful. Previous images can also lead to picking up additional findings in as many as 20% of cases [36, 37].

Improving perception

The complexity and sheer volume of visual information that a radiologist needs to deal with on a daily basis mean that he can ill afford suboptimal viewing conditions, which can mask or, even more significantly, create significant findings. Viewing conditions can be optimised in physical terms [38] of luminosity, monitor resolution, etc. Further assistance can be provided by computer aids such as image enhancement, computer-aided diagnosis (CAD) [39, 40], temporal subtraction [41] and artificial intelligence [42]. The computer might never replace the human expert, but digital processing is superior to film reading when looking for subtle diagnostic evidence.
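Of these aids, temporal subtraction is the simplest to sketch: once the prior study has been registered to the current one, subtracting the two cancels stable anatomy and leaves interval change. The example below assumes the images are already aligned, which in practice is the hard part and is handled by the registration step of a real system.

```python
import numpy as np

def temporal_subtraction(current: np.ndarray, registered_prior: np.ndarray) -> np.ndarray:
    """Difference image: stable anatomy cancels, interval change remains."""
    return current.astype(float) - registered_prior.astype(float)

prior = np.zeros((64, 64))
current = prior.copy()
current[20:25, 20:25] = 0.8          # a new opacity since the prior study
difference = temporal_subtraction(current, prior)
# Everything stable is ~0 in `difference`; the new finding stands out.
```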

Preserving open mindedness

There should be a conscious effort to construct a comprehensive differential diagnosis for any given situation; there might be a tendency to favour the common, or the uncommon, but an unbiased search for alternatives gives the best results [21].

Second opinion

Adding a second opinion reduces errors and picks up missed findings [43, 44]. A full-time second reader might not be possible everywhere, given the resource constraints in most departments, but tele-consultation, part-time double-reading meetings or review of the images using internet technology might yield the desired benefits without straining resources too much.

Checklists, routines, drills, standards and guidelines

This can be as simple as a form that enumerates the points one wishes to note on a particular examination [45], ensuring that significant findings are not missed and that interruptions and distractions do not interfere with the completion of tasks [46]; a minimal sketch of such a checklist follows below. Standard operating procedures and guidelines are more sophisticated checklists that can contribute very significantly to reducing errors. The mere presence of guidelines does not mean that they will be implemented, because there are several barriers to following guidelines [47]; these biases and barriers need to be recognised and addressed.
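In code terms a checklist is just a list of items that must all be explicitly marked complete before sign-off, which is exactly what protects against the post-completion errors described in Table 2. The item list below is illustrative, not a recommended protocol.

```python
# A reporting checklist: the study is not reportable until every item is done.
ABDOMINAL_US_CHECKLIST = [
    "liver", "gallbladder", "pancreas", "spleen",
    "right kidney", "left kidney", "aorta", "bladder",
]

def outstanding_items(completed: set) -> list:
    """Items still unexamined; an interruption cannot silently skip them."""
    return [item for item in ABDOMINAL_US_CHECKLIST if item not in completed]

done = {"liver", "gallbladder", "pancreas", "spleen", "right kidney"}
print(outstanding_items(done))   # ['left kidney', 'aorta', 'bladder']
```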

The presence of reference material in the reading room helps to reduce reliance on the vagaries of human memory and overcome memory lapses and memory biases that we are all so prone to.

A formal procedure for recording, acknowledging and trying to reduce errors

This is a systems approach to reducing errors and implies management sensitivity and desire to reduce errors [48], and involves the maintenance of personal error logs, departmental error meetings and a continual improvement in the error prevention strategies. Of course these procedures and practices need to be implemented in a no-blame environment to encourage participation.

Improving and strengthening communication

The report and the communication of the report form the weakest link in any imaging service; one study of lawsuits against radiologists over a 15-year period [49] calculated that 80% of these related to some communication lapse between the radiologist and the end-user, either the treating physician or the patient. Reporting unambiguously, stating the confidence level and proof-reading the text for accuracy before delivery are important. The recent voice recognition tools come with their own vulnerabilities [50], but newer information technology tools can be used to ensure a timely delivery, emphasising important information in the report with follow-ups [51] and confirming acknowledgement of receipt. The American College of Radiology has extensive guidelines on the content of the radiology report [52].

Patient identifiers, site and side of disease

Laterality errors [27] and wrong-patient errors [29, 53] are so significant and pervasive in all medical fields that the Joint Commission created and approved the Universal Protocol for Preventing Wrong Site, Wrong Procedure and Wrong Person Surgery in July 2003. This can be applied to imaging [54] and should go a long way towards avoiding these errors. Developing a personal strategy of double-checking the patient identifiers and the site and side of disease in the report, and ensuring that these correspond to the images, should also help.

Errors as opportunities

Errors seem to be unavoidable, but we all hate to be wrong and there is a special sense of shame that accompanies the discovery of our errors by others; we fear loss of prestige, that our patients might be referred elsewhere and that we may be passed over for promotions if we are seen to be error-prone [55]. This leads us to hide our errors from others and we might become so error-averse that we inadvertently hide errors from ourselves [56]. In doing so we deprive ourselves of one of the most valuable resources in the quest for improvement. We must inculcate in ourselves the commitment to enhancing our knowledge and improving our quality of work; if we regard the status quo as sufficient, we are unlikely to achieve anything better [57]. We must learn to live with errors, and learn to learn from errors. Errors should not be suppressed, but used to improve systems and individual performance [58]; the process should be non-blaming and without fear of reprisals. Leaders have an especially important role in building a learning culture by being open and honest about their own errors [56].

Dealing with errors in the real world

While disclosures, sharing and even highlighting errors have educational and quality-improvement advantages, sharing the details in the wrong way with the wrong person can have undesirable consequences [59, 60]. While the truth must always be disclosed, it should be done judiciously and in a manner that avoids self-incrimination. A finding missed on a colleague’s previous report but present on the film should be described in generic and non-judgemental terms. It is a great disservice to the profession, and also to oneself, to use words like “missed”, “mistake” or “should have been picked up or diagnosed”.

Conclusion

Given the complexity involved in image perception, interpretation, transmission and comprehension of reports, zero errors in medical imaging might be desirable but difficult or even impossible to achieve. However, by adopting the strategies and methods given here, the reader should be able to understand the general processes that lead to errors, and might even be able to identify some of his own particular weaknesses and, hopefully, reduce his error rate in the practice of imaging medicine.

When imaging, we should believe our eyes, but keep our minds open and constantly question what we are seeing with regard to artefacts, pitfalls and errors.

References

  • 1.Garland LH. Studies on the accuracy of diagnostic procedures. Am J Roentgenol Radium Ther Nucl Med. 1959;82(1):25–38. [PubMed] [Google Scholar]
  • 2.Forrest JV, Friedman PJ. Radiologic errors in patients with lung cancer. West J Med. 1981;134(6):485–490. [PMC free article] [PubMed] [Google Scholar]
  • 3.Harvey JA, Fajardo LL, Innis CA. Previous mammograms in patients with impalpable breast carcinoma: retrospective vs blinded interpretation. 1993 ARRS President’s award. AJR Am J Roentgenol. 1993;161(6):1167–1172. doi: 10.2214/ajr.161.6.8249720. [DOI] [PubMed] [Google Scholar]
  • 4.Berlin L. Accuracy of diagnostic procedures: has it improved over the past five decades? AJR Am J Roentgenol. 2007;188(5):1173–1178. doi: 10.2214/AJR.06.1270. [DOI] [PubMed] [Google Scholar]
  • 5.Hunziker HW (2006) Im Auge des Lesers: foveale und periphere Wahrnehmung—vom Buchstabieren zur Lesefreude [In the eye of the reader: foveal and peripheral perception—from letter recognition to the joy of reading]. Transmedia Staubli, Zurich
  • 6.Straatsma BR, et al. Topography of the adult human retina. UCLA Forum Med Sci. 1969;8:379–410. [PubMed] [Google Scholar]
  • 7.Boghen D, et al. Velocity characteristics of normal human saccades. Invest Ophthalmol. 1974;13(8):619–623. [PubMed] [Google Scholar]
  • 8.Ginsburg A. Visual information processing based on spatial filters constrained by biological data. Aerospace Medical Research Laboratory TR-78-129, vols 1-2. Springfield: National Technical Information Service; 1978. [Google Scholar]
  • 9.Marr D. Vision: a computational investigation into the human representation and processing of visual information. San Francisco: W.H. Freeman; 1982. [Google Scholar]
  • 10.Raichle ME. The brain’s dark energy. Sci Am. 2010;302(3):44–49. doi: 10.1038/scientificamerican0310-44. [DOI] [PubMed] [Google Scholar]
  • 11.Alvarez GA, Cavanagh P. The capacity of visual short-term memory is set both by visual information load and by number of objects. Psychol Sci. 2004;15(2):106–111. doi: 10.1111/j.0963-7214.2004.01502006.x. [DOI] [PubMed] [Google Scholar]
  • 12.Davis G, Holmes A. The capacity of visual short-term memory is not a fixed number of objects. Mem Cogn. 2005;33(2):185–195. doi: 10.3758/BF03195307. [DOI] [PubMed] [Google Scholar]
  • 13.Simon HA. Invariants of human behavior. Annu Rev Psychol. 1990;41:1–19. doi: 10.1146/annurev.ps.41.020190.000245. [DOI] [PubMed] [Google Scholar]
  • 14.Carrasco M, McElree B. Covert attention accelerates the rate of visual information processing. Proc Natl Acad Sci USA. 2001;98(9):5363–5367. doi: 10.1073/pnas.081074098. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Kelsey CA, et al. Observer performance as a function of viewing distance. Invest Radiol. 1981;16(5):435–437. doi: 10.1097/00004424-198109000-00149. [DOI] [PubMed] [Google Scholar]
  • 16.Wagner HN. Color: contributions or camouflage. New York: HP; 1975. [Google Scholar]
  • 17.Koontz NA, Gunderman RB. Gestalt theory: implications for radiology education. AJR Am J Roentgenol. 2008;190(5):1156–1160. doi: 10.2214/AJR.07.3268. [DOI] [PubMed] [Google Scholar]
  • 18.Reason J. Human error. Cambridge: Cambridge University Press; 1992. [Google Scholar]
  • 19.Rasmussen J, Jensen A. Mental procedures in real-life tasks: a case study of electronic trouble shooting. Ergonomics. 1974;17(3):293–307. doi: 10.1080/00140137408931355. [DOI] [PubMed] [Google Scholar]
  • 20.Leape LL. Error in medicine. JAMA. 1994;272(23):1851–1857. doi: 10.1001/jama.272.23.1851. [DOI] [PubMed] [Google Scholar]
  • 21.Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77(10):981–992. doi: 10.1097/00001888-200210000-00009. [DOI] [PubMed] [Google Scholar]
  • 22.Kassirer JP, Kopelman RI. Cognitive errors in diagnosis: instantiation, classification, and consequences. Am J Med. 1989;86(4):433–441. doi: 10.1016/0002-9343(89)90342-2. [DOI] [PubMed] [Google Scholar]
  • 23.Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775–780. doi: 10.1097/00001888-200308000-00003. [DOI] [PubMed] [Google Scholar]
  • 24.Kundel HL, Nodine CF, Krupinski EA. Searching for lung nodules. Visual dwell indicates locations of false-positive and false-negative decisions. Invest Radiol. 1989;24(6):472–478. doi: 10.1097/00004424-198906000-00012. [DOI] [PubMed] [Google Scholar]
  • 25.Nodine CF, et al. Time course of perception and decision making during mammographic interpretation. AJR Am J Roentgenol. 2002;179(4):917–923. doi: 10.2214/ajr.179.4.1790917. [DOI] [PubMed] [Google Scholar]
  • 26.Gunderman RB. Biases in radiologic reasoning. AJR Am J Roentgenol. 2009;192(3):561–564. doi: 10.2214/AJR.08.1220. [DOI] [PubMed] [Google Scholar]
  • 27.Sangwaiya MJ, et al. Errare humanum est: frequency of laterality errors in radiology reports. AJR Am J Roentgenol. 2009;192(5):W239–W244. doi: 10.2214/AJR.08.1778. [DOI] [PubMed] [Google Scholar]
  • 28.Grissinger M. Oops, sorry, wrong patient!: applying the Joint Commission’s “two-identifier” rule goes beyond the patient’s room. P T. 2008;33(11):625–651. [PMC free article] [PubMed] [Google Scholar]
  • 29.Chassin MR, Becher EC. The wrong patient. Ann Intern Med. 2002;136(11):826–833. doi: 10.7326/0003-4819-136-11-200206040-00012. [DOI] [PubMed] [Google Scholar]
  • 30.Berwick DM (1999) Taking action to improve safety: how to increase the odds of success. In: Enhancing patient safety and reducing errors in health care. National Patient Safety Foundation, Chicago, pp 1-11
  • 31.Helmreich RL. On error management: lessons from aviation. BMJ. 2000;320(7237):781–785. doi: 10.1136/bmj.320.7237.781. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Robinson PJ. Radiology’s Achilles’ heel: error and variation in the interpretation of the Röntgen image. Br J Radiol. 1997;70(839):1085–1098. doi: 10.1259/bjr.70.839.9536897. [DOI] [PubMed] [Google Scholar]
  • 33.Zweig MH, Campbell G. Receiver-operating characteristic (ROC) plots: a fundamental evaluation tool in clinical medicine. Clin Chem. 1993;39(4):561–577. [PubMed] [Google Scholar]
  • 34.Doubilet P, Herman PG. Interpretation of radiographs: effect of clinical history. AJR Am J Roentgenol. 1981;137(5):1055–1058. doi: 10.2214/ajr.137.5.1055. [DOI] [PubMed] [Google Scholar]
  • 35.Good BC, et al. Does knowledge of the clinical history affect the accuracy of chest radiograph interpretation? AJR Am J Roentgenol. 1990;154(4):709–712. doi: 10.2214/ajr.154.4.2107662. [DOI] [PubMed] [Google Scholar]
  • 36.Aideyan UO, Berbaum K, Smith WL. Influence of prior radiologic information on the interpretation of radiographic examinations. Acad Radiol. 1995;2(3):205–208. doi: 10.1016/S1076-6332(05)80165-5. [DOI] [PubMed] [Google Scholar]
  • 37.White K, Berbaum K, Smith WL. The role of previous radiographs and reports in the interpretation of current radiographs. Invest Radiol. 1994;29(3):263–265. doi: 10.1097/00004424-199403000-00002. [DOI] [PubMed] [Google Scholar]
  • 38.Abdullah BJ, Ng KH. In the eyes of the beholder: what we see is not what we get. Br J Radiol. 2001;74(884):675–676. doi: 10.1259/bjr.74.884.740675. [DOI] [PubMed] [Google Scholar]
  • 39.Destounis S, et al. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography. Int J Comput Assist Radiol Surg. 2009;4(4):331–336. doi: 10.1007/s11548-009-0300-7. [DOI] [PubMed] [Google Scholar]
  • 40.Jiang Y et al (2000) Relative gains in diagnostic accuracy between computer-aided diagnosis and independent double reading. In: Krupinski E (ed) Proceedings of SPIE: medical imaging 2000, vol 3981. International Society for Optical Engineering, Bellingham, pp 10–15
  • 41.MacMahon H, Armato SG., 3rd Temporal subtraction chest radiography. Eur J Radiol. 2009;72(2):238–243. doi: 10.1016/j.ejrad.2009.05.059. [DOI] [PubMed] [Google Scholar]
  • 42.Kahn CE., Jr Artificial intelligence in radiology: decision support systems. Radiographics. 1994;14(4):849–861. doi: 10.1148/radiographics.14.4.7938772. [DOI] [PubMed] [Google Scholar]
  • 43.Espinosa JA, Nolan TW. Reducing errors made by emergency physicians in interpreting radiographs: longitudinal study. BMJ. 2000;320(7237):737–740. doi: 10.1136/bmj.320.7237.737. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Kripalini S, Williams MV, Rask K, et al. Reducing errors in the interpretation of plain radiographs and computed-tomography scans. In: Shojania K, et al., editors. Making health care safer. A critical analysis of patient safety practices. Rockville: Agency for Healthcare Research and Quality; 2001. [Google Scholar]
  • 45.Hales BM, Pronovost PJ. The checklist—a tool for error management and performance improvement. J Crit Care. 2006;21(3):231–235. doi: 10.1016/j.jcrc.2006.06.002. [DOI] [PubMed] [Google Scholar]
  • 46.Li SY, et al. The effect of interruptions on postcompletion and other procedural errors: an account based on the activation-based goal memory model. J Exp Psychol Appl. 2008;14(4):314–328. doi: 10.1037/a0014397. [DOI] [PubMed] [Google Scholar]
  • 47.Cabana MD, et al. Why don’t physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282(15):1458–1465. doi: 10.1001/jama.282.15.1458. [DOI] [PubMed] [Google Scholar]
  • 48.Mankad K, et al. Radiology errors: are we learning from our mistakes? Clin Radiol. 2009;64(10):988–993. doi: 10.1016/j.crad.2009.06.002. [DOI] [PubMed] [Google Scholar]
  • 49.Raskin MM. Survival strategies for radiology: some practical tips on how to reduce the risk of being sued and losing. J Am Coll Radiol. 2006;3(9):689–693. doi: 10.1016/j.jacr.2006.03.018. [DOI] [PubMed] [Google Scholar]
  • 50.Janower ML. Re: “frequency and spectrum of errors in final radiology reports generated with automatic speech recognition technology”. J Am Coll Radiol. 2009;6(7):536. doi: 10.1016/j.jacr.2009.05.004. [DOI] [PubMed] [Google Scholar]
  • 51.Singh H, et al. Reducing diagnostic errors through effective communication: harnessing the power of information technology. J Gen Intern Med. 2008;23(4):489–494. doi: 10.1007/s11606-007-0393-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Cascade PN, Berlin L. Malpractice issues in radiology. American College of Radiology standard for communication. AJR Am J Roentgenol. 1999;173(6):1439–1442. doi: 10.2214/ajr.173.6.10584778. [DOI] [PubMed] [Google Scholar]
  • 53.Seiden SC, Barach P. Wrong-side/wrong-site, wrong-procedure, and wrong-patient adverse events: are they preventable? Arch Surg. 2006;141(9):931–939. doi: 10.1001/archsurg.141.9.931. [DOI] [PubMed] [Google Scholar]
  • 54.Angle JF, et al. Quality improvement guidelines for preventing wrong site, wrong procedure, and wrong person errors: application of the joint commission “Universal Protocol for Preventing Wrong Site, Wrong Procedure, Wrong Person Surgery” to the practice of interventional radiology. J Vasc Interv Radiol. 2008;19(8):1145–1151. doi: 10.1016/j.jvir.2008.03.027. [DOI] [PubMed] [Google Scholar]
  • 55.Gunderman RB, Burdick EJ. Error and opportunity. AJR Am J Roentgenol. 2007;188(4):901–903. doi: 10.2214/AJR.06.1032. [DOI] [PubMed] [Google Scholar]
  • 56.Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ. 2000;320(7237):745–749. doi: 10.1136/bmj.320.7237.745. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Milstein A, Adler NE. Out of sight, out of mind: why doesn’t widespread clinical quality failure command our attention? Health Aff (Millwood) 2003;22(2):119–127. doi: 10.1377/hlthaff.22.2.119. [DOI] [PubMed] [Google Scholar]
  • 58.Gunderman RB, Nyce JM. The tyranny of accuracy in radiologic education. Radiology. 2002;222(2):297–300. doi: 10.1148/radiol.2222010586. [DOI] [PubMed] [Google Scholar]
  • 59.Berlin L. Reporting the “missed” radiologic diagnosis: medicolegal and ethical considerations. Radiology. 1994;192(1):183–187. doi: 10.1148/radiology.192.1.8208934. [DOI] [PubMed] [Google Scholar]
  • 60.Berlin L. Malpractice issues in radiology. Admitting mistakes. AJR Am J Roentgenol. 1999;172(4):879–884. doi: 10.2214/ajr.172.4.10587115. [DOI] [PubMed] [Google Scholar]
  • 61.Abuful A, Gidron Y, Henkin Y. Physicians’ attitudes toward preventive therapy for coronary artery disease: is there a gender bias? Clin Cardiol. 2005;28(8):389–393. doi: 10.1002/clc.4960280809. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Beagan BL, Kumas-Tan Z. Approaches to diversity in family medicine: “I have always tried to be colour blind”. Can Fam Physician. 2009;55(8):e21–e28. [PMC free article] [PubMed] [Google Scholar]
  • 63.Berlin L. Malpractice issues in radiology. Alliterative errors. AJR Am J Roentgenol. 2000;174(4):925–931. doi: 10.2214/ajr.174.4.1740925. [DOI] [PubMed] [Google Scholar]
