Journal of Digital Imaging. 2010 Aug 27;24(2):300–307. doi: 10.1007/s10278-010-9330-5

Tracking Delays in Report Availability Caused by Incorrect Exam Status with Web-Based Issue Tracking: A Quality Initiative

Omer Abdulrehman Awan, Frans van Wagenberg, Mark Daly, Nabile Safdar, Paul Nagy
PMCID: PMC3056982; PMID: 20798973

Abstract

Many radiology information systems (RIS) cannot accept a final report from a dictation reporting system before the exam has been completed in the RIS by a technologist. A radiologist can still render a report in a reporting system once images are available, but the RIS and ancillary systems may not get the results because of the study’s uncompleted status. This delay in completing the study caused an alarming number of delayed reports and was undetected by conventional RIS reporting techniques. We developed a Web-based reporting tool to monitor uncompleted exams and automatically page section supervisors when a report was being delayed by its incomplete status in the RIS. Institutional Review Board exemption was obtained. At four imaging centers, a Python script was developed to poll the dictation system every 10 min for exams in five different modalities that were signed by the radiologist but could not be sent to the RIS. This script logged the exams into an existing Web-based tracking tool using PHP and a MySQL database. The script also text-paged the modality supervisor. The script logged the time at which the report was finally sent, and statistics were aggregated onto a separate Web-based reporting tool. Over a 1-year period, the average number of uncompleted exams per month and time to problem resolution decreased at every imaging center and in almost every imaging modality. Automated feedback provides a vital link in improving technologist performance and patient care without assigning a human resource to manage report queues.

Key words: Quality control, quality assurance, turnaround time, human error, communication

Introduction

When a radiologist finalizes a report in a speech recognition system, he or she expects the report to be immediately available to referring physicians. Many radiology information systems (RIS), however, cannot accept a preliminary or final report from a dictation system before the exam has been completed by a technologist. This is an artifact of picture archiving and communications system (PACS)-driven radiologist workflow without tight integration with the RIS. The advent of speech recognition and increasing concerns about quality and responsiveness have accelerated radiologist reporting to near “real-time” rates. As a result, technologist delays in completing studies that would once have been “invisible” because they occurred during radiologist reporting time can now be significant factors in reporting delays.1 Although a radiologist can still render a report in a dictation system once images are available, the RIS and ancillary systems may not receive the results of uncompleted studies, delaying patient care and causing friction with clinicians. Report delays can lengthen the time to an accurate diagnosis and thereby delay appropriate management, with the potential to negatively affect the overall quality of care.

The typical sequence of events in delayed reporting in diagnostic imaging begins when a clinician orders a study (Fig. 1). The technologist performs the study at the scheduled time but does not complete it by signing off on it in the RIS. The images are nevertheless already available in the PACS to be reviewed and dictated by the radiologist. If the radiologist uses a PACS-based worklist instead of an RIS-based one, he or she will be unaware that the study was not logged as completed. Because the technologist has not completed or signed off on the exam in the RIS, the result is a report that, although dictated, cannot be accessed by clinicians. With an RIS-based worklist, the radiologist will not even see the study as available to be read until the technologist sign-off is complete. Thus, this problem arises in daily workflow wherever there is a lack of integration among the PACS, the RIS, and the speech dictation system.

Fig 1. Schematic diagram illustrating the typical sequence of events for a study to be performed (PACS picture archiving and communications system, RIS radiology information system).

One specific cause of this problem is a gap in communication between the radiologist and technologist. Radiologists often dictate reports without being aware that the exams have not been completed in the RIS by technologists. Technologists, in turn, may be unaware that reports have been dictated and, under increasing workflow pressure during a full shift, may postpone the completion of an exam. This problem is exacerbated when radiologists work remotely or in different institutional areas from technologists, making communication difficult.2 Even when the two are located near one another, the time needed for the radiologist to make a phone call or step out of the reading room to address the issue can disrupt workflow for both parties.

The full scope of the problem may not be recognized by either radiologists or technologists. In some cases, days can pass before an uncompleted study is brought to the attention of either group. Many institutions with large study volumes have no mechanisms for routinely assessing this problem or notifying radiologists and technologists about uncompleted exams. This can be a daunting problem, especially if critical clinical findings, such as an intra-abdominal abscess on a computed tomography scan, are not appropriately reported to clinicians and patients in a timely fashion.

Quality control (QC) issues have recently become the focus of considerable concern and analysis, and among the key issues in radiology are the need to report studies in a timely fashion and to identify methods for monitoring technologists’ performance.3,4 In addition, increasing medicolegal emphasis has been placed on accurately and quickly reporting critical findings and documenting that communication.5 One technical solution proposed by Digital Imaging and Communications in Medicine (DICOM) is the modality performed procedure step (MPPS) object, which can automatically complete a study in the RIS. Unfortunately, few RIS are able to complete a study upon the receipt of an MPPS message. A second challenge is that additional information frequently needs to be recorded at the time of completion for documentation purposes. The inclusion of any manual step in study completion invariably results in some small percentage of human error. Our goal in the work reported here was to build a prompt feedback mechanism to minimize delays and identify opportunities to recover quickly from mistakes made in the process of study completion.
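To make the MPPS mechanism concrete, the sketch below shows what a hypothetical “completed” payload could look like using pydicom; the attribute subset is illustrative only and is not the authors’ code or a full MPPS implementation.

```python
# Illustrative MPPS "completed" update (a sketch, not the authors' code).
# In a full implementation this dataset would be sent as a DICOM N-SET
# against the MPPS SOP instance created when the exam began; an RIS that
# honors MPPS could then mark the study complete automatically.
from pydicom.dataset import Dataset

mpps_update = Dataset()
mpps_update.PerformedProcedureStepStatus = "COMPLETED"   # (0040,0252)
mpps_update.PerformedProcedureStepEndDate = "20090105"   # (0040,0250), DICOM DA format
mpps_update.PerformedProcedureStepEndTime = "143000"     # (0040,0251), DICOM TM format
```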

To address this problem, we developed a Web-based reporting tool to track uncompleted exams and automatically page the appropriate section supervisor when a report was delayed because of failure to complete the examination by a technologist. By creating a Web-based tool, we believed that the scope of the problem would become more apparent to our radiology department, and we could address ways to intervene and mitigate the problem. This process, we hypothesized, would decrease the number of uncompleted exams in the RIS.

Materials and Methods

Institutional Review Board Status

Institutional Review Board exemption was obtained for this study.

Design of Web-Based Reporting Tool

A Python script polled the speech recognition dictation system (RadWhere, Nuance Healthcare Solutions; Burlington, MA) every 10 min for exams that had been signed by the radiologist but could not be sent to the RIS. The script logged these exams into an existing Web-based QC issue-tracking tool built with PHP (Hypertext Preprocessor) and a MySQL database. In addition, the program automatically e-mailed and text-paged the modality supervisor to notify the appropriate personnel that an exam was incomplete and required immediate attention. Although the script polled every 10 min, pages were not sent until an exam had been uncompleted for 60 min. This delay accommodated bedside studies: technologists deployed a wireless computed radiography device on their rounds to transmit images remotely and could not sign off in the RIS until they returned to the department.
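The sketch below illustrates this polling-and-paging logic. It is not the production script: the dictation-system query (fetch_stuck_exams), the pager gateway address, and the SQLite issue log are stand-ins for the vendor API, the institutional paging system, and the PHP/MySQL tracker, respectively.

```python
"""Minimal sketch of the polling/paging loop described above (assumptions
labeled in comments; names and addresses are hypothetical)."""
import sqlite3
import smtplib
import time
from datetime import datetime, timedelta
from email.message import EmailMessage

POLL_INTERVAL_S = 600                 # poll the dictation system every 10 min
PAGE_AFTER = timedelta(minutes=60)    # grace period for bedside studies

def fetch_stuck_exams():
    """Hypothetical stand-in for the vendor API: returns tuples of
    (accession, modality, signed_at) for reports signed by a radiologist
    but rejected by the RIS because the exam is not completed."""
    return []  # replace with the dictation-system query

def page_supervisor(accession, modality):
    """Send a text page via an assumed email-to-pager gateway."""
    msg = EmailMessage()
    msg["To"] = f"{modality}-supervisor@pager.example.org"  # hypothetical address
    msg["From"] = "radtracker@example.org"
    msg["Subject"] = f"Uncompleted exam {accession}"
    msg.set_content(f"Report for {accession} is signed but blocked in the RIS.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

def poll_once(db):
    now = datetime.now()
    for accession, modality, signed_at in fetch_stuck_exams():
        # Log the issue; re-polling the same exam is a no-op.
        db.execute(
            "INSERT OR IGNORE INTO issues (accession, modality, signed_at) "
            "VALUES (?, ?, ?)", (accession, modality, signed_at.isoformat()))
        # Page only after the 60-min grace period, so portable bedside
        # studies are not paged before the technologist returns.
        if now - signed_at >= PAGE_AFTER:
            paged = db.execute("SELECT paged FROM issues WHERE accession = ?",
                               (accession,)).fetchone()[0]
            if not paged:
                page_supervisor(accession, modality)
                db.execute("UPDATE issues SET paged = 1 WHERE accession = ?",
                           (accession,))
    db.commit()

if __name__ == "__main__":
    db = sqlite3.connect("radtracker.db")
    db.execute("CREATE TABLE IF NOT EXISTS issues ("
               "accession TEXT PRIMARY KEY, modality TEXT, "
               "signed_at TEXT, paged INTEGER DEFAULT 0, resolved_at TEXT)")
    while True:
        poll_once(db)
        time.sleep(POLL_INTERVAL_S)
```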

The script also logged the time at which each report was finally sent, and statistics were aggregated in a separate Web-based reporting tool created within our department, called Radtracker. Radtracker allowed specific data on uncompleted exams at the different imaging centers within our institution to be monitored and analyzed.
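As an example of the kind of roll-up a Radtracker-style reporting page could run against such an issue log, the query below, written against the illustrative SQLite schema from the previous sketch (table and column names are assumptions, not the production schema), computes the two statistics reported in this paper: monthly error counts and average minutes to resolution.

```python
# Monthly summary over the illustrative "issues" log: errors per month
# and average minutes from radiologist sign-off to resolution.
MONTHLY_SUMMARY_SQL = """
SELECT strftime('%Y-%m', signed_at) AS month,
       COUNT(*) AS errors,
       AVG((julianday(resolved_at) - julianday(signed_at)) * 24 * 60)
           AS avg_minutes_to_resolution
FROM issues
WHERE resolved_at IS NOT NULL
GROUP BY month
ORDER BY month;
"""

# Usage against the sketch's database:
#   for month, errors, avg_min in db.execute(MONTHLY_SUMMARY_SQL):
#       print(month, errors, avg_min)
```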

Data Acquisition and Analysis

Before the system went live on January 5, 2009, 4 weeks of preliminary data (December 8, 2008, through January 4, 2009) were collected with the Web-based reporting tool but without the paging system. Data were collected at four different imaging centers served by our department (a university hospital, a community-based hospital, and outpatient centers A and B) on the total number of studies performed each month. We also calculated the total number of errors per month, with an error defined as a study that was not completed by the technologist within 1 h of its scheduled time, as well as the average time to resolution of the problem in minutes. We also calculated the rate of error as a function of time at the four imaging centers, defined as the total errors per month divided by the total number of studies. At our university hospital, we further stratified these data by modality: plain radiography, computed tomography (CT), magnetic resonance (MR), nuclear medicine (NM), and ultrasound (US). Stratification by modality was possible only at the university hospital, the only imaging center with information technology (IT) personnel readily able to compile these data.
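As a worked example of these definitions using figures reported below: the university hospital averaged 26,969 studies per month (Table 1), so its baseline rate of error of 3.0% corresponds to roughly 800 uncompleted exams in a month.

```python
# Worked example of the rate-of-error definition using the paper's figures.
studies_per_month = 26_969   # university hospital average, Table 1
rate_of_error = 0.030        # baseline rate (December 2008)
errors = rate_of_error * studies_per_month
print(round(errors))         # ~809 uncompleted exams in that month
```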

We compiled the same data points after implementing the paging system, which made the problem known to modality supervisors. These data included the total number of errors per month, the average time to resolution in minutes, and the rate of error as a function of time, with the period of analysis being January 5–November 30, 2009. We formulated these data to compare results on a monthly basis. As in the pre-implementation analysis, we also conducted a subgroup analysis by stratifying the university hospital data by plain radiography, CT, MR, NM, and US.

Actions/Interventions

A number of interventions were implemented to determine whether they would decrease the frequency and duration of errors. The first was built into the Web-based reporting tool itself: paging the appropriate modality supervisor when an exam remained uncompleted for more than 1 h after finalization of the dictation by a radiologist. This paging system made the five modality supervisors aware of the problem and gave them the ability to resolve the delay by logging into the RIS and completing the study or by communicating the problem to the technologist. At the same time as the page, an e-mail describing the problem was automatically sent to the appropriate supervisor.

Data collected using this Web-based reporting tool also became part of each technologist’s report card, which had a section for the number of uncompleted exams logged by that technologist.2 This intervention was used to make each technologist aware of his or her number of uncompleted exams and to improve overall performance over time. The report card was reviewed at semi-annual meetings between the technologist and the modality supervisor and was available as a potential appraisal metric for bonuses if the technologist showed improvement over time.

Finally, the data compiled by the Web-based reporting tool were discussed at monthly operations quality meetings among the chair of the radiology department, modality supervisors, modality section chiefs, and radiology information technology personnel. These meetings encouraged quality improvement discussions and suggestions for further improvement when the numbers of uncompleted exams were not decreasing sufficiently.

Results

The average number of studies performed during the overall study period at each of our four imaging centers as well as the average percentage of studies performed per modality at our university hospital is shown in Table 1. Table 2 (the numerator value in the fraction) and Table 3 detail the numbers of studies per month identified as uncompleted and average time to resolution, respectively. As these data were gathered, they were reviewed with section heads, several of whom were unaware of challenges in completing studies. These data were discussed at monthly image quality meetings.

Table 1.

Average number of studies performed per month between December 2008 and November 2009 in each setting

Setting Number of studies
University hospital 26,969
University plain films 12,865
University CT 6,562
University MR 1,267
University NM 1,121
University US 1,198
Community hospital 3,877
Outpatient 1 484
Outpatient 2 1,978

CT computed tomography, MR magnetic resonance, NM nuclear medicine, US ultrasound

Table 2.

Total number of errors and rate of error by month at four imaging centers

Month Overall rate of error (%)
December 4.0
January 3.1
February 2.5
March 1.6
April 2.1
May 1.6
June 1.7
July 1.7
August 1.4
September 1.4
October 1.3
November 1.2

[Per-center entries, given as fractions (number of errors/total number of studies), were rendered as images in the source and could not be reproduced here.]

Rate of error = number of errors/total number of studies

Table 3.

Average time in minutes to resolution of errors by month at four imaging centers

Month University hospital Community hospital Outpatient A Outpatient B
December 4,206 980 3,880 1,380
January 1,098 706 1,189 378
February 924 380 787 548
March 329 674 172 331
April 209 656 404 382
May 378 156 336 128
June 262 390 185 247
July 310 265 512 228
August 232 332 267 169
September 298 239 376 398
October 312 198 239 256
November 259 337 494 321

Table 4 (the numerator value in the fraction) and Table 5 identify the total monthly numbers of errors per modality and time to resolution, respectively, at our university hospital over the study period. The rates of errors for each month at our four imaging centers and for the individual imaging modalities at our university hospital are shown in Table 2/Figure 2 and Table 4/Figure 3, respectively.

Table 4.

Total number of errors and rate of error per month for different modalities at university hospital

[All entries, given as fractions (number of errors/total number of studies) for each modality, were rendered as images in the source and could not be reproduced here.]

Rate of error = number of errors/total number of studies

CT computed tomography, MR magnetic resonance, NM nuclear medicine, US ultrasound

Table 5.

Average time in minutes to resolution of errors per month for different modalities at university hospital

Month Plain films CT MR NM US
December 442 1,575 6,674 10,020 2,340
January 93 523 454 550 3,870
February 59 605 1,353 150 2,454
March 33 113 205 338 944
April 50 425 388 0 181
May 65 223 178 1,028 398
June 39 173 301 417 353
July 61 389 599 791 578
August 45 578 196 1,112 267
September 52 189 717 642 293
October 71 419 461 0 470
November 51 295 332 256 725

CT computed tomography, MR magnetic resonance, NM nuclear medicine, US ultrasound

Fig 2. Line graph demonstrating the rate of error per month at the four imaging centers.

Fig 3. Line graph demonstrating the rate of error per month for different modalities at the university hospital (CT computed tomography, MR magnetic resonance, NM nuclear medicine, US ultrasound).

Of note, at our university hospital, the rate of error decreased nearly fourfold over this interval, from 3.0% in December 2008 to 0.8% in November 2009. The average rate of error across all four imaging centers decreased more than threefold over the study period, from 4.0% in December 2008 to 1.2% in November 2009.

Discussion

These data demonstrate a decrease in the number of uncompleted exams and the average time to resolution of the delay when using a Web-based tracking tool. All categories that we measured decreased at all four of our imaging centers from December 2008 to November 2009.

Although the number of uncompleted exams decreased at all four sites, it is noteworthy that the baseline rate of error varied from 3.0% at the university hospital to 13.4% at one outpatient facility. Various factors may contribute to this difference. The patient population at a university hospital includes many critically ill inpatients, while an outpatient facility typically serves less acute patients. A university-based practice may also have significant staff besides technologists and radiologists, such as information technology personnel and an onsite physicist, who share the task of monitoring processes that can delay reporting. Furthermore, the referring physicians at an inpatient facility may be more likely to raise concerns over reporting delays for their critically ill patients than, for example, an outpatient practitioner who may not see the patient until the next appointment, days or weeks away.

The uncompleted error rate decreased more dramatically at the university hospital, from 3.0% to 0.8%, than at the community hospital, from 7.5% to 3.6%. This may be because the monthly operations quality meetings were held at the university hospital, and it was more difficult for modality supervisors and technologists from the community hospital to fully engage in this process. This suggests that a key part of the quality improvement process in this case was likely the human interaction facilitated by discussion of the data at technologist performance reviews and monthly meetings. Making data about this type of error available without engaging relevant parties in a discussion may not in itself be an effective method of improving underlying processes involving human workflows.

The scope of the problem at the beginning of this initiative was underestimated: the number of errors at baseline, in December and January, was far higher than anticipated. This Web-based tool allowed us to recognize a potentially serious problem that could compromise patient care and gave us a feasible means to rectify it. Indeed, other institutions may be just as unaware of this type of problem in their radiology departments as we were.

Quality initiatives in general, and turnaround time for uncompleted exams specifically, have been topics in the recent literature.1,6–15 Strife et al.7 emphasized the importance of quality care and the need to reduce medical errors in the health care system, elaborating on ways in which quality and medical error reduction can be improved and citing the utility of practice quality improvement projects that focus on decreasing turnaround time (the time between completion of the exam and availability of the final report to the referring physician). Our quality initiative addressed the same topic: we reduced turnaround time, or time to resolution of the error in our study, through an automatic Web-based tracking tool that allowed us to monitor uncompleted exams, made us aware of the scope of the challenge, and provided a means to rectify the problem.

Other authors have discussed approaches to improved turnaround times for medical reports. Branstetter1 described ways in which the advent of speech recognition in dictation systems significantly reduced turnaround time for medical reports. As our tracking tool demonstrated, however, other problems remain a challenge to timely delivery of results, including reports not completed by technologists.

Our Web-based reporting tool has a number of potentially beneficial applications for patient care and quality improvement in radiology. In addition to offering a solution that decreases report turnaround time related to the technologist’s failure to complete reports on time, the tool facilitates more open and evidence-based departmental communication about efficiency, turnaround, and the importance of timely delivery of results. By offering the ability to page supervisors as well as technologists, the tool also opened a backup mechanism that ensured that each study would be completed either by the supervisor or the technologist. The end result, as we originally hypothesized, was the ability to notify clinicians sooner about examination results, a process that can have the downstream effects of improving relationships with providers and enhancing patient outcomes.

Our study has limitations. The first is that this failure mode may depend on the workflow of a given practice. In an RIS-driven workflow, the radiologist will not even see the case on the worklist, because the study remains in scheduled status even though the images are available in the PACS. While that workflow avoids the dictation and reporting problem, it creates an even more dangerous opportunity for delay in patient care, since the study sits unread. The authors are unaware of any reporting system that accepts final reports and allows the completion of a procedure to be bypassed; this may be because a completion notification is required for technical billing to occur.

Another limitation was the relatively short, 4-week period of baseline data gathering before implementation of the Web-based paging system; after 4 weeks of monitoring the problem, we determined that we had a responsibility to intervene. This 4-week interval also included the Christmas holiday, when staffing at hospitals and outpatient centers is limited, and thus may not reflect routine, non-holiday workflow. Although a longer baseline might have better illuminated the scope of the problem, we are confident that the downward trend in the number of errors and the time to resolution over the ensuing year presents a valid picture of the utility of the tool. A further limitation was that data from January to November 2009 could not be time-matched to the same months in 2008, a process that might have accounted for monthly periodicity in errors; however, our 2009 data suggested no such periodicity.

Another limitation was our tracking tool’s inability to break out the data by imaging modality at our community hospital and outpatient centers A and B; it could do so only for the university hospital, the only site at which the necessary information technology personnel were available. This points to a final potential limitation: building these types of quality monitoring tools requires significant information technology expertise, and not all radiology departments have access to such resources. Our rationale was to drive quality tracking to the supervisor level; at smaller sites, there is only a site supervisor rather than a supervisor for each modality.

Each stage within radiology, from acquisition to reporting, should be monitored to identify where cases can become “lost in the cracks.” Special attention should be paid to stages that span different information systems. A manual daily QA checklist can be appropriate for some stages; we found that, for examinations left uncompleted in the RIS, the potential delay in patient care required a more direct approach.

Conclusion

Automatically tracking a previously anecdotal problem related to examinations not completed in the RIS decreased turnaround time at our institutions. The number of uncompleted exams per month decreased at every imaging center and in every imaging modality over the 1-year study period, as did the average time to resolution of problems. With further study and improvement, we hope that automated feedback will continue to provide a vital link in improving technologist performance and patient care without assigning a human resource to manage report queues. Radiology is a labor-intensive field, and modern IT systems need to provide fundamental mechanisms that allow users to become aware of mistakes and correct them quickly, preventing larger, potentially dangerous delays in patient care. IT cannot automate many of the steps in radiology, but it has an imperative to enable transparency and quality mechanisms.

Acknowledgment

We would like to thank Nancy Knight, Ph.D., for her assistance in preparing this manuscript.

Contributor Information

Omer Abdulrehman Awan, Phone: +1-410-9137787, Email: omer.awan786@gmail.com.

Frans van Wagenberg, Phone: +1-410-3283477, Email: Fvanw001@umaryland.edu.

Mark Daly, Phone: +1-410-3283477, Email: mdaly@umm.edu.

Nabile Safdar, Phone: +1-410-3283477, Email: nsafdar@umm.edu.

Paul Nagy, Phone: +1-410-3283477, Email: pnagy@umm.edu.

References

1. Branstetter BF 4th. Basics of imaging informatics. Part 1. Radiology. 2007;243:656–667. doi:10.1148/radiol.2433060243.
2. Nagy PG, Pierce B, Otto M, Safdar NM. Quality control management and communication between radiologists and technologists. J Am Coll Radiol. 2008;5:759–765. doi:10.1016/j.jacr.2008.01.013.
3. Papp J. Quality Management in Imaging Sciences. 2nd ed. St. Louis: Mosby; 2004.
4. Diagnostic X-Ray Imaging Committee. Report No. 74. Madison: Medical Physics; 2002.
5. Altman D, Gunderman R. Outsourcing: a primer for radiologists. J Am Coll Radiol. 2008;5:893–899. doi:10.1016/j.jacr.2008.03.005.
6. Kang J, Kim M, Hong S, Jung J, Song M. The application of the Six Sigma program for the quality management of PACS. AJR Am J Roentgenol. 2005;185:1361–1365. doi:10.2214/AJR.04.0716.
7. Strife J, Kun L, Becker G, Dunnick N, Bosma J, Hattery R. The American Board of Radiology perspective on maintenance of certification: part IV—practice quality improvement for diagnostic radiology. Radiology. 2007;243:309–313. doi:10.1148/radiol.2432061954.
8. Kruskal JB. Editorial: quality initiatives in radiology: historical perspectives for an emerging field. RadioGraphics. 2008;28:3–5. doi:10.1148/rg.281075199.
9. Kruskal JB, Yam CS, Sosna J, Hallett DT, Milliman YJ, Kressel HY. Implementation of online radiology quality assurance reporting system for performance improvement: initial evaluation. Radiology. 2006;241:518–527. doi:10.1148/radiol.2412051400.
10. Johnson CD, Swensen SJ, Applegate KE, Blackmore CC, Borgstede JP, Cardella JF, Dilling JA, Dunnick NR, Glenn LW, Hillman BJ, Lau LS, Lexa FJ, Weinreb JC, Wilcox P. Quality improvement in radiology: white paper report on the Sun Valley Group meeting. J Am Coll Radiol. 2006;3:544–549. doi:10.1016/j.jacr.2006.01.009.
11. Blackmore CC. Defining quality in radiology. J Am Coll Radiol. 2007;4:217–223. doi:10.1016/j.jacr.2006.11.014.
12. Adams HG, Arora S. Total Quality in Radiology: A Guide to Implementation. Boca Raton: St. Lucie; 1994.
13. Crabbe JP, Frank CL, Nye WW. Improving report turnaround time: an integrated method using data from a radiology information system. AJR Am J Roentgenol. 1994;163:1503–1507. doi:10.2214/ajr.163.6.7992756.
14. Seltzer SE, Kelly P, Adams DF, Chiango BF, Viera MA, Fener E, Rondeau R, Kazanjian N, Laffel G, Shaffer K. Expediting the turnaround of radiology reports: use of total quality management to facilitate radiologists’ report signing. AJR Am J Roentgenol. 1994;162:775–781. doi:10.2214/ajr.162.4.8140990.
15. Seltzer SE, Kelly P, Adams DF, Chiango BF, Viera MA, Fener E, Hooton S, Bannon-Rohrbach S, Healy CD, Doubilet PM, Holman BL. Expediting the turnaround of radiology reports in a teaching hospital setting. AJR Am J Roentgenol. 1997;168:889–893. doi:10.2214/ajr.168.4.9124134.
