Journal of Digital Imaging. 2014 Oct 2;28(2):240–246. doi: 10.1007/s10278-014-9738-4

Preventing Errors in Laterality

Elliot Landau 1, David Hirschorn 1, Iakovos Koutras 1, Alexander Malek 2, Seleshie Demissie 1
PMCID: PMC4359204  PMID: 25273506

Abstract

An error in laterality is the reporting of a finding that is present on the right side as on the left, or vice versa. While various medical and surgical specialties have implemented protocols to help prevent such errors, very few published studies describe these errors in radiology reports and ways to prevent them. We devised a system that allows the radiologist to view the report in a separate window, displayed in a simple font and with all terms of laterality highlighted in distinct colors. This allows the radiologist to correlate every detected laterality term in the report with the images open in PACS and to correct errors before the report is finalized. The system logs every instance in which an error in laterality is detected. It detected 32 errors in laterality over a 7-month period (rate of 0.0007 %), with CT showing the highest error detection rate of all modalities. Significantly more errors were detected in male patients than in female patients. In conclusion, our study demonstrates that with our system, laterality errors can be detected and corrected before reports are finalized.

Keywords: Quality control, Radiology workflow, Structured reporting, Radiology reporting, Software design, Laterality, Patient safety

Introduction

In 1998, the Institute of Medicine published the first report of its kind, titled “To Err Is Human,” which detailed the rate and cost of medical errors while emphasizing that most errors go unreported [1]. Since that time, many guidelines have been put in place to help reduce these errors and to allow easier, more reliable reporting of mistakes. Despite many promising ideas, an article published in September 2013 estimated that the number of patients affected by preventable medical errors may be between 210,000 and 400,000 per year [2].

Starting in 1998, the Joint Commission (JCAHO) began requiring the reporting of one specific type of medical error or serious adverse event termed a “sentinel event” defined as “an unexpected occurrence involving death or serious physiological or psychological injury, or the risk thereof.” Such events are called “sentinel” because they signal the need for immediate investigation and response. The term “Never Event” was first introduced in 2001 by Ken Kizer, MD, former CEO of the National Quality Forum, in reference to particularly shocking medical errors (such as wrong-site surgery) that should never occur. The list has been revised several times, most recently in 2011, and now consists of 29 events grouped into 7 categories: surgical, product or device, patient protection, care management, environmental, radiologic, and criminal.

Since the program’s initiation, the data collected demonstrated an alarming rate of wrong-site surgery sentinel events comprising 13.5 % of cases as of September 2009. Most recently, wrong-patient, wrong-site, or wrong-procedure sentinel events made up 14 % of reported cases in 2012 [3]. These errors are especially devastating to the patient and doctor. In 2004, the Joint Commission introduced the Universal Protocol requiring a pre-procedural verification, surgical site marking, and a “time-out” to be performed prior to each procedure.

Very few studies have been published that deal with errors in radiology reports. In 2009, Sangwaiya et al. were the first to assess the rate of laterality errors [4]. An error of laterality is the reporting of a finding that is present on the right side as on the left, or vice versa. The authors reviewed all studies over the course of 1 year that contained an addendum correcting a laterality discrepancy within the body and impression sections or between the images and the report. They found 88 reports meeting these criteria, for a rate of 0.00008 %. The authors also searched all signed reports over the course of 1 month to determine the rate of unrecognized errors of laterality (those without an addendum), ultimately finding 36 reports. In 2013, Luetmer et al. performed a retrospective search of all studies at a single institution over several years that contained corrected errors of laterality [5]. They also found a very low error rate (0.055 %) and no difference between reports generated with and without speech recognition software.

As of today, several programs have been created that survey radiology reports and notify the radiologist if laterality term discordance is present. Some of these algorithms, however, operate only after the report has been finalized, allowing the radiologist to create an addendum to the finalized report. While this is important for patient safety, the original report has already been accessible to the referring physician and possibly the patient, leaving a window of opportunity for medical error.

A major barrier to computerized laterality error detection, however, is the frequency of laterality term usage in radiology reports. It is common for one report to contain findings in several bilateral structures, limiting the utility of simple computerized algorithms that detect only laterality discordance. Although such a system would detect simple errors (i.e., when a term of laterality occurs only once in the report), when several bilateral findings are reported, a computer algorithm cannot reliably distinguish between true and false discordance.
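As a concrete illustration, a minimal sketch (in Python, with hypothetical section names; not any particular commercial product) of such a simple discordance check might compare the laterality terms in the findings section against those in the impression:

```python
import re

TERM = re.compile(r"\b(right|left)\b", re.IGNORECASE)

def laterality_terms(text: str) -> set:
    """Collect the distinct laterality terms mentioned in a passage."""
    return {m.group(1).lower() for m in TERM.finditer(text)}

def discordant(findings: str, impression: str) -> bool:
    """Flag a report when the impression mentions a side that the
    findings never do, or vice versa -- the 'simple' case described
    above, where a term of laterality effectively occurs once."""
    return laterality_terms(findings) != laterality_terms(impression)
```

The limitation is immediately visible: a report that legitimately describes bilateral findings mentions both sides in both sections, so even if the sides of two individual findings were swapped, the term sets remain identical and the check passes.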

In addition, these computer-based laterality detection programs may miss real errors when the laterality terms are concordant throughout the report, i.e., when the same error is propagated repeatedly. A computer cannot determine whether a laterality term corresponds with the appropriate imaging findings unless software such as computer-aided detection (CAD) were implemented that could reliably survey the images and assess for abnormalities. To date, such CAD software is not available.

In our institution, we devised a system in which the radiologist can view the report in a separate window, displayed in a simple font with all terms of laterality highlighted in distinct colors. This allows detection of all laterality terms while the study is still open in PACS and the dictation system, so that the radiologist can correlate the reported findings with the images. We hypothesized that such a system would decrease laterality errors before reports are signed off.

Materials and Methods

Several years ago, a button called “Check PACS and sign” was added to our institution’s dictation system (Nuance RadWhere) using Quick Macros desktop automation software (Fig. 1). When clicked, the system first checks the currently open report against the examination open in PACS and warns if they do not match. All radiologists were instructed to use this feature before finalizing reports. The project was considered a quality improvement project to detect and prevent errors at our institution; therefore, IRB approval was not required. The button was subsequently renamed “Safe Sign” in August 2014.

Fig. 1.

Fig. 1

A button called “Check PACS and sign” was added to the toolbar; when clicked, it displays the report in a new window in a simple font with laterality terms highlighted in color. The button was renamed “Safe Sign” in August 2014

Starting June 1, 2013, we added a second function to this button: a check of the report for any term of laterality. If none is found, the system proceeds to sign the report as before. If a laterality term is found, a new window opens containing a copy of the report in a simple font, with the word “right” highlighted in blue and the word “left” highlighted in red (Fig. 2). This simplified view of the report makes it easy to scan the laterality terms and look back and forth between each highlighted term and the finding it corresponds to in the images. This second view of the report is optimized to enhance detection of laterality discrepancies. If no errors are present, the radiologist can click a button on the right side of the title bar labeled “Close and sign report” to close the second window and sign the report (Fig. 3). This is important because it minimizes extra clicks and streamlines the reporting process, rather than requiring the user to close the window in one action and then sign the report in a separate action. Were radiologists required to sign the report using the regular “Sign” button of the dictation system, they might be more likely to forget to use the error-checking system altogether.
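The highlighting step itself is straightforward. Our system was built with Quick Macros rather than Python, so the following is an illustrative sketch only, rendering the report as HTML with “right” in blue and “left” in red:

```python
import re

def highlight_laterality(report: str) -> str:
    """Render the report in a plain font with every occurrence of
    'right' colored blue and 'left' colored red, preserving case.
    Illustrative sketch; the production system used Quick Macros."""
    html = re.sub(r"\b(right)\b",
                  r'<span style="color:blue">\1</span>',
                  report, flags=re.IGNORECASE)
    html = re.sub(r"\b(left)\b",
                  r'<span style="color:red">\1</span>',
                  html, flags=re.IGNORECASE)
    return '<pre style="font-family:sans-serif">%s</pre>' % html
```

The word-boundary matches keep terms such as “righteous” or “cleft” from being highlighted, while the backreference preserves the original capitalization of each matched term.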

Fig. 2.

Fig. 2

New window opens displaying report in simple font with “right” highlighted in blue and “left” highlighted in red

Fig. 3.

Fig. 3

A button called “Close and sign report” was added to the new window; when clicked, it signs the report open in the dictation system if no errors are present

If an error of laterality is detected, the radiologist is encouraged to click a second button at the top left of the window, labeled “Click here if you caught a laterality error,” before fixing the error and signing the report (Fig. 4). One of the main advantages of this system is that errors are caught and corrected so early in the process that, without such an indicator, there would be no evidence a mistake was ever made.

Fig. 4.

Fig. 4

A button called “Click here if you caught a laterality error” was added to the new window, allowing the radiologist to alert the system that an error was detected

Every time the “Check PACS and sign”/“Safe Sign” and “Click here if you caught a laterality error” buttons are clicked, the events are logged to a database for analysis. The data collected include the user identity, the date and time, and the radiological examination accession number. It is important to point out that screening mammogram examinations were excluded from our results because, in our institution, the main body of the mammogram report is not dictated using our speech dictation software. Vascular studies (including venous and arterial duplex) were also excluded, as these studies are dictated by the vascular surgery department in our institution.
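The logging step can be sketched as follows. The actual database schema is not described in the text, so the table and column names here are hypothetical; an extra column records which button was clicked, since both events are logged:

```python
import sqlite3
from datetime import datetime, timezone

def open_log(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the click-log database (hypothetical schema)."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS click_log (
                      user_id    TEXT NOT NULL,  -- radiologist identity
                      clicked_at TEXT NOT NULL,  -- ISO 8601 timestamp
                      accession  TEXT NOT NULL,  -- examination accession number
                      button     TEXT NOT NULL   -- which button was clicked
                  )""")
    return db

def log_click(db: sqlite3.Connection, user_id: str,
              accession: str, button: str) -> None:
    """Record one button click with the current UTC time."""
    db.execute("INSERT INTO click_log VALUES (?, ?, ?, ?)",
               (user_id, datetime.now(timezone.utc).isoformat(),
                accession, button))
    db.commit()
```

Logging both button types against the same accession numbers is what makes the downstream analysis possible: compliance is the count of sign-off clicks over eligible studies, and the error count is the number of error-button clicks.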

The data were categorized by modality and type of examination and tabulated in Microsoft Excel for analysis. The modalities were CT, MRI, radiography, ultrasound, nuclear medicine (including PET-CT), and interventional procedures (fluoroscopy examinations were included in the interventional category). Statistical analysis was performed with SAS 9.3 statistical software.

Results

From June 1, 2013 until January 1, 2014, a total of 141,426 studies meeting our inclusion criteria were signed off, and the “Check PACS and sign” button was clicked 44,992 times, a compliance rate of 31.8 %. The “Click here if you caught a laterality error” button was clicked a total of 32 times, yielding a rate of 0.0007 %. This rate is higher than that previously published by Sangwaiya [4] and lower than that described by Luetmer [5].

The modality with the highest rate of error detection was CT, followed by nuclear medicine, MRI, radiography, and ultrasound (Table 1). An overall ANOVA test demonstrated a significant difference between modalities (p = 0.001). After correcting for multiplicity with the Bonferroni adjustment procedure, pairwise comparisons showed that CT had an almost 4 times higher rate of error occurrence (z = 3.89) than radiography (p = 0.0015). CT also had an almost three times higher rate of error occurrence than ultrasound (z = 2.73), with a p value approaching significance (p = 0.094).
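The text does not state which pairwise statistic was used, but under the assumption of a log incidence-rate-ratio z test, the Table 1 counts reproduce the reported z values; a sketch:

```python
import math

def rate_ratio_z(errors_a: int, n_a: int, errors_b: int, n_b: int) -> float:
    """z statistic for the log incidence-rate ratio between two
    modalities: log(RR) divided by its approximate standard error
    sqrt(1/errors_a + 1/errors_b)."""
    rr = (errors_a / n_a) / (errors_b / n_b)
    return math.log(rr) / math.sqrt(1 / errors_a + 1 / errors_b)

# CT (16 errors / 24,646 exams) vs radiography (10 / 74,002): z near 3.89
# CT vs ultrasound (1 / 25,782): z near 2.73
```

This is a plausible reconstruction rather than the authors' documented method; it is offered only to make the pairwise comparisons reproducible from the published counts.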

Table 1.

Incidence and rate of laterality errors detected broken down by modality

Modality            Number of examinations   Incidence of error   Rate of error
CT                  24,646                   16                   0.065 %
MRI                 8,544                    3                    0.035 %
Radiograph          74,002                   10                   0.014 %
Ultrasound          25,782                   1                    0.004 %
Nuclear medicine    4,385                    2                    0.046 %
Interventional      4,067                    0                    0.000 %
Total               141,426                  32

With respect to type of examination, the highest error rate was in MRI of the lower extremity, followed by the nuclear medicine renal flow study and MRI of the pelvis (Table 2). When comparing examination error rates, we found a significant difference between CT of the chest and chest radiography (p = 0.0025), while there was no significant difference between CT of the abdomen and pelvis and abdominal radiography (p = 0.361). There was also no significant difference between types of MRI examinations (pelvis vs. lower extremity) or between types of CT examinations (abdomen and pelvis vs. chest vs. head).

Table 2.

Incidence and rate of laterality errors detected broken down by examination

Modality           Location          Number of examinations   Incidence of error   Rate of error
CT                 Abdomen/pelvis    8,739                    8                    0.092 %
                   Head              8,424                    5                    0.059 %
                   Chest             3,590                    3                    0.084 %
MRI                Pelvis            170                      2                    1.176 %
                   Lower extremity   64                       1                    1.563 %
Radiograph         Ankle             2,160                    2                    0.093 %
                   Knee              2,843                    2                    0.070 %
                   Chest             44,552                   2                    0.004 %
                   Shoulder          1,650                    1                    0.061 %
                   Cervical spine    915                      1                    0.109 %
                   Wrist             1,438                    1                    0.070 %
                   Abdomen           2,601                    1                    0.038 %
Nuclear medicine   Persantine        1,877                    1                    0.053 %
                   Renal flow        73                       1                    1.370 %
Ultrasound         Retroperitoneal   3,128                    1                    0.032 %

With respect to gender, we found 18 errors in male patients and 14 errors in female patients (Table 3). Using Fisher’s exact test, this difference was found to be significant (one-sided p value = 0.0153). The mean age of patients for whom an error was reported was 57.7 years, with a standard deviation of 18.2. The number of errors increased with age, with the highest numbers seen between 50 and 80 years of age (Table 4).

Table 3.

Incidence of laterality errors detected based on patient gender

Modality Incidence of error Male Female
CT 16 9 7
MRI 3 2 1
Radiograph 10 4 6
Nuclear medicine 2 2 0
Ultrasound 1 1 0
Total 32 18 14

Table 4.

Incidence of laterality errors detected by patient age

Modality           <30   30–39   40–49   50–59   60–69   70–79   >79
CT 0 1 2 5 3 3 2
MRI 0 1 0 1 0 1 0
Radiograph 2 2 1 2 2 0 1
Nuclear medicine 0 0 0 0 1 1 0
Ultrasound 0 0 0 0 0 1 0
Total 2 4 3 8 6 6 3

Conclusion

Although the rate of laterality errors was low, our system prevented 32 errors over the course of 7 months. The system does require the user to click at least one extra button to sign off a report, which possibly contributed to the low compliance rate (31.8 %). However, in a survey, the radiologists who had been encouraged to use the system when it was first introduced mostly attributed their noncompliance to forgetting to activate the system rather than to the extra clicks required. Most users found the system very helpful and did not feel it caused a significant workflow deceleration.

Our results are similar to those reported by Sangwaiya [4], with the highest detection rate found in CT. Luetmer [5] also found that CT contained the most laterality errors when mammography was excluded, as in our study. A possible explanation for this difference is the increased number of bilateral findings usually described in these reports. Another possible contributing factor, however, is that the users with the highest system compliance rates may predominantly read these examinations, leading to an increased probability of error detection.

With regard to gender differences, our results differ from those previously published by Sangwaiya [4], who found almost double the number of errors in female patients compared with males. One major reason for this discrepancy is that they included mammography in their analysis, which increased the number of errors in females. Our results, though, strictly measure the total number of errors, not the difference in error rate between genders.

Finally, reported errors were highest among patients between 50 and 80 years of age. The reason for this difference is likely multifactorial, including the higher number of examinations performed in these age groups as well as the increased number of bilateral findings associated with increasing age.

Discussion

As tolerance for medical errors decreases, radiologists must do everything possible to ensure that their reports are accurate. With preventive measures for other subspecialties already set forth by leading authorities, guidelines for radiology reports may soon be in place to prevent certain types of errors deemed avoidable. To date, the current mandate is based on the ACR practice guideline, which states:

“The final report is the definitive documentation of the results of an imaging examination. The final report should be proofread to minimize typographical errors, accidentally deleted words, and confusing or conflicting statements [6].”

Besides reflecting poorly on the radiologist, errors in laterality have the potential to cause significant patient morbidity. For example, an editorial published in AJR in 2001 described a scenario in which a radiologist inadvertently reversed right and left when reporting findings on a knee MRI [7]. The orthopedic surgeon subsequently took the patient to surgery without cross-checking the images, and the case ultimately led to a malpractice settlement. In the 2009 study by Sangwaiya [4], although the rates of laterality errors were low, the authors urged radiologists to double-check for errors in laterality and referring physicians to correlate the clinical complaints with the radiologic images.

Most radiologists understand the need to proofread their reports; however, a thorough review of every report can cause fatigue and decrease productivity without necessarily improving accuracy. Reviewing a report in the original form used during dictation can still miss some errors: if the radiologist missed an error once, they may very well miss it again. Our system allows the radiologist to review the report in a simplified form, containing visual aids, optimized to identify laterality terms and confirm their accuracy.

There are many strengths to our system. First, all laterality terms throughout the report can be detected so that the radiologist can correlate the findings with the displayed images. This can confirm the accuracy of each occurrence, even if multiple findings throughout the report reference a term of laterality. Also, our system gives the radiologist the opportunity to correct a detected error before the report is finalized; as mentioned earlier, this feature is helpful because only the radiologist will know an error was ever made. In addition, our system ensures that the report being finalized corresponds to the study currently displayed in PACS, preventing “wrong study-wrong report” errors that could pose a patient hazard. Finally, a secondary benefit of the simple-font display is the enhanced ability to detect other errors in the report, including spelling and grammatical errors.

Our study does have several limitations. Our system is not error detection software; rather, it helps increase the conspicuity of laterality errors. Although this may be helpful, if the radiologist does not ultimately detect the error, it will not be corrected. Our system also requires compliance from the radiologists. Even though such software benefits all users, radiologists who have signed off thousands of studies in the past using a certain method may ultimately return to their familiar patterns and forget to click the “Check PACS and sign”/“Safe Sign” button. Furthermore, as the system requires a few extra clicks to operate, some radiologists may prefer to forgo it and try to detect errors on their own. As mentioned earlier, though, the system records all users who click the “Check PACS and sign”/“Safe Sign” button; therefore, individual compliance can be monitored to improve compliance rates.

Finally, it is likely that a learning effect occurred over the study period. After several activations of the system and seeing laterality terms highlighted in their reports, users may have started to focus on every instance a laterality term was mentioned during the initial dictation; this would especially have occurred if the system had previously detected a laterality error for that user. Once users focus on these terms by activating the system, errors in laterality may subsequently decrease as radiologists become more aware of the potential problems.

With other specialties taking strides to prevent medical errors, it is incumbent on radiologists to continue to develop guidelines and standards that ensure our work is also safe from “Never Events.” Continued research will be necessary to confirm the rates of errors in radiological reports and to develop measures that aid in their prevention. Our study may be one more step in this process.

Contributor Information

Elliot Landau, Phone: (718) 551-1970, Email: elliotlandau2@gmail.com.

David Hirschorn, Email: Hirschorn.david@mgh.harvard.edu.

Iakovos Koutras, Email: ikoutras@yahoo.com.

Alexander Malek, Email: Malek.alexander.m@gmail.com.

Seleshie Demissie, Email: Sdemissie@nshs.edu.

References

  1. Kohn LT, Corrigan JM, Donaldson MS (Institute of Medicine). To err is human: building a safer health system. Washington, DC: National Academy Press, 2000
  2. James JT. A new, evidence-based estimate of patient harms associated with hospital care. J Patient Saf. 2013;9:122–128. doi: 10.1097/PTS.0b013e3182948a69
  3. Sentinel Event Statistics for 2012. The Joint Commission Perspectives 2013;33:3. Retrieved from https://www.jointcommissionconnect.org/NR/rdonlyres/591BB908-2D44-4523-A504-8FD508F25534/0/JCP032013.pdf
  4. Sangwaiya MJ, Saini S, Blake MA, Dreyer KJ, Kalra MK. Errare humanum est: frequency of laterality errors in radiology reports. AJR. 2009;192:239–244. doi: 10.2214/AJR.08.1778
  5. Luetmer MT, Hunt CH, McDonald RJ, Bartholmai BJ, Kallmes DF. Laterality errors in radiology reports generated with and without voice recognition software: frequency and clinical significance. JACR. 2013;10:538–543. doi: 10.1016/j.jacr.2013.02.017
  6. ACR practice guideline for communication of diagnostic imaging findings. ACR: http://www.acr.org/∼/media/C5D1443C9EA4424AA12477D1AD1D927D.pdf. Accessed 4/24/14
  7. Smith JJ, Berlin L. Signing a colleague’s radiology report. AJR. 2001;176:27–30. doi: 10.2214/ajr.176.1.1760027
