Journal of Digital Imaging. 2007 Mar 3;21(1):50–58. doi: 10.1007/s10278-007-9008-9

The Radiology Digital Dashboard: Effects on Report Turnaround Time

Matthew B Morgan 1, Barton F Branstetter IV 1, David M Lionetti 1, Jeremy S Richardson 1, Paul J Chang 2
PMCID: PMC3043829  PMID: 17334871

Abstract

As radiology departments transition to near-complete digital information management, work flows and their supporting informatics infrastructure are becoming increasingly complex. Digital dashboards can integrate separate computerized information systems and summarize key work flow metrics in real time to facilitate informed decision making. A PACS-integrated digital dashboard function designed to alert radiologists to their unsigned report queue status, coupled with an actionable link to the report signing application, resulted in a 24% reduction in the time between transcription and report finalization. The dashboard was well received by radiologists who reported high usage for signing reports. Further research is needed to identify and evaluate other potentially useful work flow metrics for inclusion in a radiology clinical dashboard.

Key words: Workflow, user interface, radiology management, efficiency, software design, decision support

INTRODUCTION

Digital work flows are inevitable. Within the last decade, radiology departments have moved from “early adopter” to “early majority” in the implementation of electronic information systems. Picture Archiving and Communication Systems (PACS), Radiology Information Systems (RIS), speech recognition, and electronic health records (EHR) are some of the many digital systems in use in a modern radiology department. At the intersection of these separate electronic systems is an increasingly complex radiology work flow and supporting informatics infrastructure. Within this complexity, it can be difficult to evaluate the overall state of radiology departmental work flow. Yet, understanding the state of the work flow is crucial if radiologists are to preserve added value and ensure the quality of patient care.

A digital dashboard is a computerized user interface designed to summarize key system metrics for at-a-glance evaluation of the state of multiple systems. Convenient access to a real-time summary of important system metrics empowers knowledge workers to make more informed decisions. Although dashboards are commonplace in administrative settings, they have been slow to make headway into the clinical arena. The potential advantages of creating a PACS-integrated radiology clinical dashboard to monitor important work flow metrics in real time have been reported previously.1 Depending on the type of PACS deployment and the configuration of support software (such as speech recognition), the useful metrics in a radiology dashboard may vary among different radiology departments. We developed and deployed a PACS-integrated, clinical radiology dashboard that provides the radiologist with real-time, at-a-glance status of select work flow metrics. In this article, we focus particularly on the effect of a digital dashboard on report turnaround time by decreasing the time between radiologist dictation and signing of reports.

Timely report finalization is an important aspect of quality in diagnostic radiology. If the radiology department utilizes manual transcription services, radiologists must regularly review and sign (finalize) dictated reports. Reducing overall report turnaround time, and in particular the time to sign reports, has been an ongoing challenge.2–5 One reason for this is that in many RIS/PACS configurations, report signing is an out-of-band task, ie, not an integral part of the radiologist’s normal work flow. Because the PACS and RIS are often completely separate applications, the radiologist must stop interpreting images in the PACS environment and launch a separate RIS application. Because the RIS typically requires a separate sign-on and times out after a few minutes of inactivity, there is a time penalty for checking for unsigned reports prematurely. Furthermore, there is no signal to alert the radiologist to the status of his or her report queue. As a result, radiologists tend to sign their reports in large batches, sometimes waiting until the end of the day (or worse, the beginning of the next day), thereby adding unnecessary delay to report finalization. We hypothesized that a real-time, PACS-integrated dashboard function that alerts the radiologist to the number of unsigned reports and provides ready access to the report signing application would decrease the time interval from transcription to finalization and thus reduce overall report turnaround time.

The purposes of this paper are to describe the development of the unsigned report monitor in our radiology clinical dashboard, evaluate its effect on report turnaround time, and report users’ impressions of the effect of the unsigned report counter on their work flow.

MATERIALS AND METHODS

The Front-end: The Dashboard User Interface

The dashboard user interface was integrated directly into our PACS display using an embedded ActiveX control and client-side Dynamic Hypertext Markup Language (DHTML) with Asynchronous JavaScript and XML (AJAX) scripting (Fig. 1). AJAX refers to a technique for creating web applications in which small amounts of data are exchanged with the server “behind the scenes” so that the entire web page does not have to be reloaded. In the dashboard application, the ActiveX control queries the dashboard server every 60 s for database updates on the status of the three categories of alerts (User, Division, and System) and then changes the color of the dashboard “lights” accordingly without refreshing the entire PACS interface, thereby making the dashboard user interface more interactive and responsive.
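To make the polling behavior concrete, the snippet below is a minimal TypeScript sketch of the client-side loop described above. The /dashboard/status endpoint, the response fields, and the element ids are illustrative assumptions, not the production ActiveX/DHTML code.

```typescript
// Minimal sketch of the client-side polling loop (assumed endpoint and fields).
type AlertCategory = "User" | "Division" | "System";
type LightColor = "green" | "yellow" | "red";

interface DashboardStatus {
  category: AlertCategory;
  color: LightColor;
  flashing: boolean;
}

// Poll the dashboard server every 60 seconds and recolor the lights
// without reloading the rest of the PACS page (the AJAX pattern).
async function pollDashboard(): Promise<void> {
  try {
    const response = await fetch("/dashboard/status"); // assumed endpoint
    const statuses: DashboardStatus[] = await response.json();
    for (const status of statuses) {
      setLightColor(status.category, status.color, status.flashing);
    }
  } catch {
    // On error, leave the lights unchanged and retry on the next cycle.
  }
}

function setLightColor(category: AlertCategory, color: LightColor, flashing: boolean): void {
  const light = document.getElementById(`light-${category}`); // assumed element ids
  if (!light) return;
  light.style.backgroundColor = color;
  light.classList.toggle("flashing", flashing);
}

setInterval(pollDashboard, 60_000); // 60-second refresh interval, as in the deployed dashboard
```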

Fig 1. Placement of the dashboard. The dashboard is located prominently at the top of the screen in both the worklist and image viewing modes of the PACS.

To maximize its efficacy, the dashboard was placed in a prominent location in the PACS user interface at the top of the viewing area in both “image viewing” and “worklist” modes. Three categories of radiology alerts were defined for the dashboard: User, Division, and System. Each category is represented by a separate “light” on the dashboard (Fig. 1). Alerts in the User category include those issues pertinent to only a single radiologist, such as unsigned reports, or the appearance of additional images after interpretation. Alerts in the Division category include those pertinent to a group of radiologists, such as delinquent undictated cases that need to be interpreted. The System category is reserved for alerts that affect the entire department or hospital system, such as unplanned downtime or disaster alerts.

For simplicity and user familiarity, a traffic light metaphor (a red, yellow, or green circle) is used to represent the state of the system. The dashboard “lights” change color according to the priority of the alert. For example, in the case of the unsigned report monitor, when the radiologist’s queue is clear and there are no unsigned reports waiting, the “User” dashboard light is green. When there are between one and 20 unsigned reports, the light turns yellow, and when there are over 20 unsigned reports, the light turns red. The red light begins to flash if there are more than 30 unsigned reports. When the radiologist clicks on the alert light, a display appears with the alert information (Fig. 2). Of note, each item on the list of alerts is actionable, that is, clicking the item will launch the appropriate software to address the alert.
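As a concrete illustration, the color rules for the unsigned report monitor reduce to a simple threshold function. The sketch below restates the thresholds given above; it is not the deployed implementation.

```typescript
// Map an unsigned report count to a dashboard light state,
// following the thresholds described above.
interface LightState {
  color: "green" | "yellow" | "red";
  flashing: boolean;
}

function unsignedReportLight(unsignedCount: number): LightState {
  if (unsignedCount === 0) return { color: "green", flashing: false };
  if (unsignedCount <= 20) return { color: "yellow", flashing: false };
  // More than 20 unsigned reports turns the light red; more than 30 also makes it flash.
  return { color: "red", flashing: unsignedCount > 30 };
}
```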

Fig 2. User interface. The ‘User’ light displays the state of the radiologist’s unsigned reports. The color of the display follows a traffic light metaphor: no unsigned reports = green; one to 20 = yellow; over 20 = red. The radiologist can click on the dashboard light to see the details of the alert and can click the link to launch the appropriate report editing/signing application directly.

The Back-end: The Dashboard Software and Hardware Architecture

The dashboard system is implemented as a relational database with a web service interface for modifying and querying state. The data model consists of “agents” that register and manage dashboard alerts and “owners” that receive those alerts. Alerts may be assigned either to individuals or to groups. These entities are defined as “Owner Types” (specifically, User, Division, and System) and correspond to the dashboard categories of the same names. For example, the unsigned report monitor is an agent implemented as a Windows service that monitors the state of multiple RISs and registers the state of each radiologist’s report counter in the dashboard database. Alerts may also be manually assigned to users by administrators using a separate web interface. Standard edge protocols (ie, Health Level 7 (HL7) messaging, database queries, etc.) are normalized to a middle layer, enabling a service-oriented architecture (SOA) (Fig. 3). Table 1 lists the server configuration used to develop and implement the radiology dashboard.
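The data model can be sketched in a few lines. The type names and the registerAlert call below are illustrative assumptions; the production system stores this information in a relational database behind a web service, as described above.

```typescript
// Illustrative sketch of the dashboard data model: agents register alerts,
// and owners (a User, Division, or System) receive them.
type OwnerType = "User" | "Division" | "System";

interface Owner {
  ownerType: OwnerType;
  ownerId: string; // e.g., a radiologist user name or a division name
}

interface Alert {
  agent: string;      // e.g., "UnsignedReportMonitor"
  owner: Owner;
  priority: "green" | "yellow" | "red";
  message: string;    // e.g., "12 unsigned reports"
  actionUrl?: string; // actionable link that launches the relevant application
}

// An agent (such as the unsigned report monitor running as a Windows service)
// would register or update its alerts through the dashboard web service.
async function registerAlert(alert: Alert): Promise<void> {
  await fetch("/dashboard/alerts", { // assumed web-service endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(alert),
  });
}
```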

Fig 3. Service-oriented architecture. Using a “middleware” layer (service-oriented architecture), the data from multiple systems can be normalized and encapsulated in a centralized location for use by other systems. This enables integrated, real-time, interactive reporting, alerts, and reminders for applications such as the radiology dashboard.

Table 1.

Dashboard Server Hardware and Software Configuration

Central processing unit (CPU): Dual Intel 864 MHz processors
Memory: 1,024 MB
Hard drive: 70 GB
Operating system: Microsoft Windows Server 2003
Web server: Microsoft Internet Information Services (IIS)
Database management system (DBMS): Microsoft SQL Server 2000 Standard

Interfaces and Integration

Actionable integration is achieved through a variety of methods. Integration of the dashboard with the PACS (Philips/Stentor, Brisbane, CA, USA) was achieved using the vendor’s application program interface (API), which is made available to institutions wishing to customize the user interface. An API is an optional interface that a vendor may provide to allow exchange of data between its application and other computer programs. Using the API, the dashboard application can directly launch studies by passing unique study identification parameters to the PACS application. The result is that a dashboard alert can contain a hyperlink that will immediately load a study in the PACS when clicked by the user.
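For example, a dashboard alert item can be rendered as a link that passes the study’s identifier to the PACS launch interface. The URL scheme and parameter name below are hypothetical placeholders; the actual integration used the vendor’s API.

```typescript
// Build an actionable dashboard link that opens a specific study in the PACS.
// The "pacs://open-study" scheme and the "accession" parameter are illustrative only.
function buildStudyLink(accessionNumber: string, description: string): HTMLAnchorElement {
  const link = document.createElement("a");
  link.href = `pacs://open-study?accession=${encodeURIComponent(accessionNumber)}`;
  link.textContent = description; // e.g., a short study description shown in the alert list
  return link;
}
```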

Actionable integration with the RIS was less direct. Within the various sites of the radiology department of the University of Pittsburgh Medical Center (UPMC), multiple RISs are utilized, and radiologists accumulate reports on different systems as they move from site to site. Furthermore, UPMC is currently transitioning to a voice recognition system (M*Modal, Pittsburgh, PA, USA) in which reports are dictated, processed remotely, edited by transcription, and then made ready for signature on a separate report editing system. The result of this reporting complexity is that the dashboard must track the number of unsigned reports in multiple systems. The RIS applications do not currently provide query access to their systems; however, the count of unsigned reports can be tracked indirectly through HL7 report status messages. Unsigned report data are reported under the User category in the dashboard. When the radiologist clicks the User “light” in the dashboard, each reporting system is listed as a hyperlink with its respective unsigned report count (Fig. 2). Clicking a hyperlink on the dashboard launches the appropriate report editing application. Because we do not currently use a single sign-on technology, a separate sign-on is then required from the user to access each application.
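A simplified sketch of this indirect counting approach follows. It assumes that each RIS emits HL7 messages whose OBR-25 result status field distinguishes transcribed-but-unsigned reports (“P”, preliminary) from finalized reports (“F”), and that the dictating radiologist can be read from OBR-32; the field positions, status codes, and deliberately naive parsing are assumptions for illustration, not the production monitor.

```typescript
// Naive sketch: maintain per-radiologist, per-system unsigned report counts
// from HL7 report status messages (assumed fields and status codes).
const unsignedCounts = new Map<string, number>(); // key: "radiologistId|systemId"

function handleReportStatusMessage(rawHl7: string, systemId: string): void {
  const segments = rawHl7.split("\r");
  const obr = segments.find((segment) => segment.startsWith("OBR"));
  if (!obr) return;

  const fields = obr.split("|");
  const resultStatus = fields[25];        // OBR-25 Result Status (e.g., "P" or "F")
  const radiologistId = fields[32] ?? ""; // assumed: OBR-32 Principal Result Interpreter

  const key = `${radiologistId}|${systemId}`;
  const current = unsignedCounts.get(key) ?? 0;

  if (resultStatus === "P") {
    unsignedCounts.set(key, current + 1);              // transcribed, awaiting signature
  } else if (resultStatus === "F") {
    unsignedCounts.set(key, Math.max(0, current - 1)); // signed/finalized
  }
}
```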

Data Acquisition

Data were acquired during three different time periods, each lasting 10 months. The first study period was before the introduction of the digital dashboard (“no dashboard”, April 2003–January 2004). The second study period was after the introduction of the digital dashboard, but before the dashboard became integrated with the RIS signing tools (“unintegrated dashboard”, March 2004–December 2004). During this second study period, radiologists were continually notified of how many unsigned reports were pending, but they could not launch the report signing software from within the PACS. The third study period was after the dashboard was integrated with the signing tools (“integrated dashboard”, March 2005–December 2005), at which point radiologists could launch the signing tools from within the dashboard (Fig. 2). A gap of 1–2 months was left between study periods for software deployment and user training.

Across the three study periods, RIS timestamps were collected on 1,741,551 distinct examinations. If multiple billing codes were included in a single dictation, this was considered a single event.

The number of cases signed by each attending radiologist in the department was tabulated for each calendar month. For each radiologist in each month, the average time between the completion of transcription and report signing (“time-to-sign”) was calculated.

Inclusion Criteria

To be included in the study, a radiologist must have dictated 10 or more cases in at least 8 of the 10 months in each of the three study periods. Forty-seven radiologists met these criteria: 11 women and 36 men, representing all subspecialty radiology divisions.

Statistical Analysis

Because we expected large variation between individual radiologists’ turnaround times, we normalized the monthly turnaround times. For each radiologist, the average time to sign across the entire study period was used as a normalization factor. These normalized values were used for all further calculations.
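In symbols, if $t_{i,m}$ denotes radiologist $i$'s average time-to-sign in month $m$, the normalized value used in the analysis is the monthly value divided by that radiologist's overall average (the notation below is ours, introduced for clarity):

\[
\tilde{t}_{i,m} = \frac{t_{i,m}}{\bar{t}_i},
\qquad
\bar{t}_i = \frac{1}{M}\sum_{m=1}^{M} t_{i,m},
\]

where $M$ is the number of months that radiologist contributed to the analysis.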

To compare the three study periods for significant differences in time to sign, we applied Fisher’s Least Significant Difference test. Thus, an initial analysis of variance (ANOVA) was used to reject the null hypothesis that all three study periods were equivalent, followed by pairwise comparisons of the study periods.
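For reference, Fisher's LSD declares a pair of period means significantly different when the absolute difference of the means exceeds the least significant difference computed from the ANOVA error term (standard notation, not taken from the paper):

\[
\mathrm{LSD} = t_{\alpha/2,\,df_E}\,
\sqrt{\mathit{MSE}\left(\frac{1}{n_i}+\frac{1}{n_j}\right)},
\qquad
\text{significant if } \left|\bar{x}_i-\bar{x}_j\right| > \mathrm{LSD},
\]

where $\mathit{MSE}$ and $df_E$ are the error mean square and degrees of freedom from the ANOVA, and $n_i$, $n_j$ are the numbers of observations in the two study periods being compared.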

User Survey

A user satisfaction survey was used to assess the users’ experience with the digital dashboard. The survey contained 18 items using a five-point Likert-type scale and two free-text response items (Appendix). A web-based survey tool was used to collect the responses. A link to the survey was sent by e-mail to all clinical faculty, fellows, and residents who had used the clinical dashboard, and a reminder e-mail was sent 1 week later. The survey was closed 1 week after the reminder e-mail and descriptive statistics were tabulated.

RESULTS

The average time to sign for each month is graphed in Figure 4. Before the introduction of a digital dashboard, the average turnaround time for signing reports was 22.5 h. After the introduction of the unintegrated dashboard, the average time was 24.3 h. When the dashboard was integrated with the signing tools, the average time decreased to 17.7 h (a decrease of 24% from the combined average of the other two study periods). The initial ANOVA was significant (F-observed = 20.2; F-critical = 3.1), so pairwise comparisons were undertaken. There was no significant difference between “no dashboard” and “unintegrated dashboard” (p = 0.12). There were significant differences between “no dashboard” and “integrated dashboard” (p = 0.001) and between “unintegrated dashboard” and “integrated dashboard” (p < 0.001).

Fig 4. Average turnaround time for radiologists signing reports, before and after the introduction of a digital dashboard. The line represents a running average of four data points. The grey bars represent transition periods that were excluded from the statistical analysis. The average time to sign in each of the three study periods is listed. * = statistically significant difference from other study periods (p = 0.001).

Response rates to the survey were 80% (40/50) for trainees and 62% (32/52) for faculty radiologists. Selected survey results pertaining to the unsigned report monitor are summarized in Table 2. No substantial differences in attitudes were noted between trainees and faculty.

Table 2.

Selected User Survey Results Pertaining to the Unsigned Report Counter

Survey Item | Percent Agree or Strongly Agree
“The red-yellow-green stoplight analogy is a good way to summarize the status of the system.” | 85
“I use the links in the dashboard (eg, to sign reports in IDAS or IDXRad).” | 92
“The dashboard helps to guide what task I do next.” | 42
“I sign reports more often with the dashboard in place.” | 76
“The dashboard is frequently inaccurate.” | 44

DISCUSSION

A clinical radiology dashboard represents a novel way to deliver important work flow information to the radiologist where he or she can quickly see it and act on it. In particular, an unsigned report monitor in a PACS-integrated radiology dashboard, when coupled with an actionable link to report signing software, resulted in an overall 24% decrease in the time from report transcription to finalization. It should be noted that dictated reports are available to the clinician in preliminary status immediately upon transcription or completion of speech recognition (we employ both types of systems), which is usually within a few hours of dictation. Because the 24% decrease represents an average decrease across all radiologists, it is both statistically and clinically significant. Reports finalized consistently in under 24 hours help both preserve added value and promote quality patient care.

It is important to note that the “unintegrated dashboard”, in which there was no actionable link but only a counter of unsigned reports, had no significant impact on report signing behavior. This is consistent with the results of other studies showing clinicians’ reluctance to add even a few keystrokes or clicks to their work flow.6 With the “unintegrated dashboard”, radiologists were presented with a means of monitoring the status of their unsigned reports, but were required to take multiple additional steps to act on the information (ie, navigate to the RIS application and log in). This configuration was associated with a statistically nonsignificant increase in time-to-sign, most easily explained as statistical fluctuation. In other words, presenting information to the user without an efficient means to act on it had no significant effect. It follows that to be widely accepted by practicing clinicians, computerized decision support systems (such as a clinical dashboard) must be “actionable” and integrated into the clinical work flow. They must present the right information, in the right format, at the right time, without requiring special effort or reducing clinical productivity. In other words, decision support applications should “make it easy to do it right.”7

Ideally, the user experience should be as fluid as possible across applications (including applications from separate vendors). Achieving a “seamless” user experience will become easier as vendors embrace modern open architectures and normalize their data for consumption by other applications (service-oriented architectures/web services). The Integrating the Healthcare Enterprise (IHE) effort to standardize the interaction among vendors’ applications should help facilitate this end. In addition, single sign-on technologies, which allow the user to authenticate once and then have access to multiple applications, will become more standardized and effective.

Dashboard accuracy was an important issue. Forty-four percent of users agreed that the counter was frequently inaccurate. The difficulty in keeping an accurate count of unsigned reports arose from the idiosyncratic behavior of the two different versions of the RIS (IDX V9 and V10) in use at different sites within the enterprise. When a radiologist had reports in both systems, duplicate HL7 messages about the finalized status could be generated, depending on the order in which the reports were signed. This, in turn, would cause errors in the count of unsigned reports in the dashboard. This type of error was minimized by training the radiologists to sign reports in a defined order; however, this training-dependent solution was not always successful and led to user dissatisfaction. For example, in the free response section, one radiologist indicated that “an inaccurate dashboard is worse than no dashboard.” This sentiment highlights the importance of maintaining the integrity of the dashboard information. A clinical decision support system may be ignored if users discover that they cannot trust the information. Moreover, if errors are encountered early in software deployment, it may be difficult to overcome these first impressions even after the accuracy of the system improves.

The unsigned report monitor was well received by the majority of users, who reported that they used it regularly (Table 2). A large majority of users (85%) agreed that the traffic light metaphor is a good way to summarize the status of the system. Although only 42% agreed that “the dashboard helps guide what task I do next”, 76% agreed that they “sign reports more often with the dashboard in place.” This interesting contradiction suggests that radiologists (like most people) may be reluctant to concede that a computer support system is influencing their decision making.

CONCLUSION

With the increasing complexity of digital information management, radiology departments must focus not only on being more efficient, but also on being more effective. Work flows are optimized when radiologists have real-time information to make informed decisions, and the capacity to efficiently act on that information. A PACS-integrated digital dashboard designed to alert radiologists to their unsigned report queue status, coupled with an actionable link to the report signing application, is an effective method of decreasing report turnaround times. Further research is needed to identify and evaluate other potentially useful work flow metrics for inclusion in a radiology clinical dashboard.

Acknowledgments

This work was funded in part by a 2005 research grant from the Society of Imaging Informatics in Medicine (SIIM).

Appendix

The following is a subset of the user survey that pertains to the unsigned report monitor. The survey was placed online and a link to it was sent by e-mail to all trainees and faculty who used the dashboard.

  1. I am a ....
    • First-year resident
    • Second-year resident
    • Third-year resident
    • Fourth-year resident
    • Fellow
    • Attending
  2. Please indicate whether you agree or disagree with the following statements:

    Disagree strongly | Disagree | Neutral | Agree | Agree strongly
    • I use the links in the dashboard (eg, to sign reports in IDAS or IDXRad).
    • The red-yellow-green stoplight analogy is a good way to summarize the status of the system.
    • The dashboard is frequently inaccurate.
    • The dashboard helps to guide what task I do next.
    • I sign reports more often with the dashboard in place.
  3. What other information would you like to see in the dashboard?

  4. Do you have any other comments about the dashboard?

References

  1. Morgan MB, Branstetter BF IV, Mates J, Chang PJ. Flying blind: using a digital dashboard to navigate a complex PACS environment. J Digit Imaging. 2006;19:69–75. doi: 10.1007/s10278-005-8732-2.
  2. Langer SG. Impact of tightly coupled PACS/speech recognition on report turnaround time in the radiology department. J Digit Imaging. 2002;15(Suppl 1):234–236. doi: 10.1007/s10278-002-5011-3.
  3. Oguz KK, Yousem DM, Deluca T, Herskovits EH, Beauchamp NJ Jr. Impact of pager notification on report verification times. Acad Radiol. 2002;9:954–959. doi: 10.1016/S1076-6332(03)80466-X.
  4. Cavagna E, Berletti R, Schiavon F, Scarsi B, Barbato G. Optimized delivery of radiological reports: applying Six Sigma methodology to a radiology department. Radiol Med (Torino). 2003;105:205–214.
  5. Mehta A, Dreyer K, Boland G, Frank M. Do picture archiving and communication systems improve report turnaround times? J Digit Imaging. 2000;13:105–107. doi: 10.1007/BF03167637.
  6. Overhage JM, Tierney WM, McDonald CJ. Computer reminders to implement preventive care guidelines for hospitalized patients. Arch Intern Med. 1996;156:1551–1556. doi: 10.1001/archinte.156.14.1551.
  7. James BC. Making it easy to do it right. N Engl J Med. 2001;345:991–993. doi: 10.1056/NEJM200109273451311.
