Author manuscript; available in PMC: 2026 Apr 2.
Published in final edited form as: AJR Am J Roentgenol. 2015 Jul 23;205(5):936–940. doi: 10.2214/AJR.15.14677

Effects of Performance Feedback Reports on Adherence to Evidence-Based Guidelines in Use of CT for Evaluation of Pulmonary Embolism in the Emergency Department: A Randomized Trial

Ali S Raja 1,2,3,4, Ivan K Ip 1,2,4,5, Ruth M Dunne 1,2,4, Jeremiah D Schuur 1,3,4, Angela M Mills 6, Ramin Khorasani 1,2,4
PMCID: PMC13040545  NIHMSID: NIHMS2133519  PMID: 26204114

Abstract

PURPOSE:

To assess whether implementing emergency department physician performance feedback reports improves adherence to evidence-based guidelines for use of CT for evaluation of pulmonary embolism (CTPE) beyond that achieved with clinical decision support alone.

MATERIALS AND METHODS:

This institutional review board-approved randomized controlled trial was conducted from January 1, 2012, to December 31, 2013, in an urban Level-I adult trauma center emergency department. Attending physicians were stratified into quartiles by 2012 CTPE use and randomized to receive quarterly feedback reporting or not, beginning January 2013. Reports consisted of individual and anonymized group data on guideline adherence (using the Wells criteria), CTPE use (CTPEs per 1,000 patients), and yield (percentage of CTPEs positive for pulmonary embolism). We compared guideline adherence (primary outcome) and use and yield (secondary outcomes) of CTPE between the control and intervention groups in 2013 and against historical imaging from 2012.

RESULTS:

Of 109,793 emergency department patients evaluated during the control and intervention periods, 2,167 (2.0%) underwent CTPE. In the control group, guideline adherence was unchanged between 2012 (78.8%, 476/604) and 2013 (77.2%, 421/545; p=0.5); in the intervention group, guideline adherence increased by a relative 8.8% after feedback report implementation, from 78.3% (426/544) to 85.2% (404/474; p<0.05). Use and yield were unchanged in both groups.

CONCLUSION:

Implementing quarterly feedback reporting resulted in a modest but significant increase in adherence to evidence-based guidelines for use of CTPE for emergency department patients, enhancing the impact of clinical decision support alone. These results suggest potentially synergistic effects of traditional performance improvement tools with clinical decision support to improve guideline adherence.

Introduction

With rising healthcare costs, a number of recent initiatives have focused on increasing the appropriateness of tests ordered by physicians in order to reduce waste and improve quality of care. In parallel, national campaigns such as Choosing Wisely, as well as more local efforts, have focused on improving utilization of high-cost imaging.1,2 Approximately 5% of U.S. healthcare costs are incurred in emergency departments (EDs),3 so many of these efforts have been led by emergency medicine physicians and radiologists and have focused on patients cared for in the ED.4 Two federal regulations promote use of health information technology, in the form of clinical decision support (CDS), to improve appropriate use of imaging. CDS, including imaging CDS, is a fundamental component of Stage 2 of the meaningful use regulations,5,6 which provide relatively small incentives for providers to adopt it. The more recent Protecting Access to Medicare Act of 2014,7 however, requires use of imaging CDS for targeted high-cost ambulatory and ED imaging procedures, beginning in January 2017, in order to receive payment for imaging services. Nevertheless, best practices for the design, implementation, and even the content of the evidence presented in imaging CDS remain debated.8,9

One such group of patients, those presenting with symptoms suggestive of acute pulmonary embolism (PE), has been the focus of significant attention. Use of CT for evaluation of PE (CTPE) and the number of PEs diagnosed have both increased significantly without a corresponding change in mortality, suggesting overdiagnosis.10 The diagnosis of PE requires risk stratification and, potentially, CT imaging, which carries risks of both radiation exposure and renal injury from administration of intravenous contrast material.11 Although CDS has been shown to improve adherence to evidence-based guidelines (EBG) for suspected PE,12 there remains opportunity for additional improvement, as a significant proportion of CTPEs remain non-adherent to EBG even after CDS implementation.13

Physician-specific performance feedback reports are a well-established tool for quality improvement initiatives.14,15 Our objective was to assess the effect of performance feedback reports on adherence to evidence-based guidelines for use of CTPE for emergency department patients. Our secondary objectives were to determine their combined impact on use and yield of CTPE. We hypothesized that adding performance feedback reports to an existing CDS implementation would improve adherence to EBG for use of CTPE beyond that achieved with CDS alone, and would decrease use and increase yield of CTPEs.

Methods

Setting and Subjects

This IRB-approved and HIPAA-compliant prospective randomized controlled trial was conducted from January 1, 2012 to December 31, 2013 in the ED of an urban Level-I adult trauma center. Imaging CDS was deployed for all CTPE requests throughout the study period as previously reported.12,13 Ordering providers could ignore the evidence presented in CDS and proceed with CTPE requests deviating from EBG without interference. We included all attending emergency physicians and, prior to randomization, stratified them into quartiles by 2012 CTPE use. Attending physician assignment in the ED is random, with all attendings equally likely to work in all areas of the ED. Attendings were then randomized by quartile into two groups using a random number generator: the intervention group received individualized feedback reports on CTPE adherence to EBG, use (defined as number of CTPEs per 1,000 patients), and yield (percentage of CTPEs positive for PE), and the control group did not.
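The stratify-then-randomize step described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual code: the function name, the physician identifiers, and the `use_per_1000` values are invented for the example, and any random number generator would serve equally well.

```python
import random

def randomize_by_quartile(physicians, use_per_1000, seed=0):
    """Sort physicians by prior-year CTPE use, split into quartiles,
    and randomly divide each quartile between the two study arms."""
    rng = random.Random(seed)
    ranked = sorted(physicians, key=lambda p: use_per_1000[p])
    q = max(1, len(ranked) // 4)  # approximate quartile size
    intervention, control = [], []
    for i in range(0, len(ranked), q):
        quartile = ranked[i:i + q]
        rng.shuffle(quartile)              # random assignment within stratum
        half = len(quartile) // 2
        intervention += quartile[:half]
        control += quartile[half:]
    return intervention, control
```

Stratifying before randomizing ensures that high and low utilizers are balanced across arms, which is why the 2012 baseline characteristics of the two groups were similar.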

Intervention

The intervention consisted of quarterly performance feedback reports sent via e-mail that displayed both individual physicians’ statistics as well as their performance compared to anonymized results for the entire group of emergency physicians (Figure 1). The frequency of feedback report distribution was selected to mirror other physician performance feedback reporting initiatives at our institution. In addition, physicians were given the medical record numbers of any patients for whom the CTPEs ordered were deemed non-adherent to EBG by CDS to facilitate individual chart review. These quarterly reports began in January 2013; both new physicians who joined the group after this date and physicians who left prior to study completion were excluded from the analysis.

Figure 1:

Sample Feedback Report (Anonymized Physician Label in X-axis, Individual Data Highlighted as Red Column)

1A) Adherence to Evidence-Based Guidelines, with Mean of All Attending Physicians Noted

1B) Use of CTPE (Number per 1,000 Patients seen), with Mean of All Attending Physicians Noted

1C) Yield of CTPE (Percentage Positive for Acute Pulmonary Embolism), with Mean of All Attending Physicians Noted

Data Collection

Use of CTPEs for each physician was calculated using the number of completed CTPE examinations and the total number of patients seen during the quarter. Yield of CTPEs for acute PE was determined using a previously validated natural language processing tool and reported as a percentage of total CTPEs completed.12 Adherence to EBG was determined by applying the Wells criteria and reviewing serum D-dimer levels (if obtained).16 The discrete criteria that make up the Wells score were prospectively documented in our computerized physician order entry (CPOE) system at the time of order entry, as previously reported.12
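The adherence determination described above can be sketched in a few lines. The point values below are the standard Wells criteria; the dichotomization at a score of 4 and the D-dimer rule (CTPE indicated when PE is "likely," or when PE is "unlikely" but the D-dimer is positive) are one common operationalization and may not match the study's exact CDS logic, which is not spelled out here.

```python
# Standard Wells criteria point values for suspected pulmonary embolism.
WELLS_POINTS = {
    "signs_of_dvt": 3.0,
    "pe_most_likely_diagnosis": 3.0,
    "heart_rate_over_100": 1.5,
    "immobilization_or_recent_surgery": 1.5,
    "prior_dvt_or_pe": 1.5,
    "hemoptysis": 1.0,
    "active_malignancy": 1.0,
}

def wells_score(findings):
    """Sum the points for each positive criterion (findings: set of keys)."""
    return sum(WELLS_POINTS[f] for f in findings)

def ctpe_adherent(findings, d_dimer_positive=None):
    """One common adherence rule: CTPE is guideline-adherent if PE is
    'likely' (Wells > 4), or 'unlikely' (Wells <= 4) with a positive D-dimer."""
    if wells_score(findings) > 4:
        return True
    return bool(d_dimer_positive)
```

Because the CPOE system captured each criterion as a discrete field at order entry, adherence could be computed deterministically for every request rather than abstracted from free text.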

To determine whether any observed differences in guideline adherence were the result of "gaming" (erroneous data entry intended either to avoid potentially onerous CDS interactions or to enhance a physician's apparent performance on feedback reports), we performed manual reviews of 100 randomly chosen charts from each of the two groups. These chart reviews were performed by an attending physician to assess concordance between adherence to EBG calculated from the CDS data and adherence to EBG calculated from data documented in the ED visit clinical note. The sample size was based on a previously reported baseline concordance of 90% and was chosen to detect a 15% difference in concordance between groups with 80% power and a 5% alpha error rate.17 Demographic data (including gender, as well as age and years since residency training, both measured at the beginning of the study period) were also captured for all attending physicians in the study.

Outcome Measures and Statistical Analyses

We compared adherence to EBG (the primary outcome measure) and use and yield (secondary outcome measures) of CTPEs between the control and intervention groups in 2012 to establish the historical baseline characteristics of the groups. We then implemented the quarterly feedback reports and compared outcomes, both between the two groups and between each group and its historical control. An a priori sample size calculation for the primary outcome measure indicated that we would need 335 CTPEs in each group to detect a 10% increase from a baseline adherence of 75%, as previously reported,13 with 90% power and a 5% alpha error rate; we calculated that this could be achieved within 12 months. All statistical analyses were conducted using JMP 11.0.
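The a priori sample-size figure can be approximately reproduced with the standard normal-approximation formula for comparing two proportions (75% vs. 85% adherence, two-sided alpha of 0.05, 90% power). This sketch assumes that formula; it yields 331 per group, and the published 335 is consistent with it allowing for rounding or a slightly different variance expression.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.90):
    """Approximate per-group n for a two-sided two-proportion comparison."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value, alpha/2
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    var = p1 * (1 - p1) + p2 * (1 - p2)         # sum of binomial variances
    return ceil((z_a + z_b) ** 2 * var / (p2 - p1) ** 2)
```

With roughly 20 CTPEs per 1,000 visits and more than 50,000 annual ED visits, accruing 335 CTPEs per arm within 12 months was readily achievable, as the authors note.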

Results

A total of 43 attending physicians were randomized; their average age was 40.3 years (standard deviation [SD], 8.1 years), 13 (30%) were women, and they had an average of 8.8 years (SD, 8.4 years) of experience after residency training (Table 1). During the study period, a total of 109,793 patients were evaluated in the ED, of whom 2,167 (2.0%) underwent CTPE: 1,148 (2.0%) of 56,526 patients in 2012 and 1,019 (1.9%) of 53,267 patients in 2013. The baseline characteristics of the control and intervention groups were similar in 2012; there were no differences in EBG adherence (78.8% vs. 78.3%, p=0.837), use (20.4 vs. 20.2 CTPEs/1,000 patients, p=0.9052), or yield (11.6% vs. 11.2%, p=0.8413) between the groups.

Table 1:

Attending Physician Characteristics

Control Intervention Total
Number 21 22 43
Mean Age (SD) 41.2 (8.6) 39.4 (7.6) 40.3 (8.1)
Female Gender (%) 6 (29) 7 (32) 13 (30)
Mean Years of Experience (SD) 9.2 (9.0) 8.3 (7.9) 8.8 (8.4)

SD – Standard Deviation

After the intervention, adherence to EBG, use, and yield all remained unchanged in the control group (Table 2). The intervention group, however, demonstrated an improvement in adherence to EBG, from 78.3% in 2012 to 85.2% in 2013 (p=0.0043): an absolute increase of 6.9% and a relative increase of 8.8%. Viewed another way, these data represent a 31.8% reduction in deviation from EBG ([(100% - 78.3%) - (100% - 85.2%)]/(100% - 78.3%) = 31.8%; p=0.0043). Although we observed trends toward decreased use (20.2 vs. 18.1 CTPEs per 1,000 ED visits; relative reduction of [20.2 - 18.1]/20.2 = 10.4%; p=0.08) and increased yield (11.2% vs. 13.1%; [13.1 - 11.2]/11.2 = 17%; p=0.36), neither use nor yield changed significantly.
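The effect sizes quoted above follow directly from the reported percentages; a quick check of the arithmetic:

```python
pre, post = 78.3, 85.2  # intervention-group adherence, 2012 vs. 2013 (%)

absolute = post - pre                               # 6.9 percentage points
relative = 100 * (post - pre) / pre                 # relative increase (%)
deviation_cut = 100 * ((100 - pre) - (100 - post)) / (100 - pre)
                                                    # reduction in deviation (%)
use_drop = 100 * (20.2 - 18.1) / 20.2               # relative reduction in use (%)
```

These reproduce the 8.8% relative increase in adherence, the 31.8% reduction in deviation from EBG, and the 10.4% relative reduction in use.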

Table 2:

Outcomes in Control and Intervention Groups

Group Period Value p-value
Adherence to Evidence Based Guidelines
(% Adherent)
Control Pre 78.8 (476/604) 0.5235
Post 77.2 (421/545)
Intervention Pre 78.3 (426/544) 0.0043
Post 85.2 (404/474)
Use
(# CTPEs/1,000 ED patients)
Control Pre 20.4 (604/29,642) 0.8033
Post 20.1 (545/27,139)
Intervention Pre 20.2 (544/26,884) 0.0789
Post 18.1 (474/26,128)
Yield
(% CTPEs positive for PE)
Control Pre 11.6 (70/604) 0.8326
Post 11.2 (61/545)
Intervention Pre 11.2 (61/544) 0.3625
Post 13.1 (62/474)

CTPE=CT for evaluation of pulmonary embolism
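The paper does not state which significance test produced the p-values in Table 2, but a Pearson chi-square test on each 2x2 table (a standard choice for comparing two proportions) reproduces them closely. A minimal stdlib-only sketch, using the identity that the chi-square survival function with 1 df equals erfc(sqrt(x/2)):

```python
from math import erfc, sqrt

def chi2_p(a, b, c, d):
    """P-value of the Pearson chi-square test (1 df, no continuity
    correction) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))
    return erfc(sqrt(stat / 2))  # chi-square(1 df) survival function
```

For the control-group adherence rows (476 adherent of 604 pre vs. 421 of 545 post) this gives p of about 0.52, and for the intervention rows (426 of 544 vs. 404 of 474) about 0.005, close to the tabulated 0.5235 and 0.0043; small differences would be expected if the authors used a continuity correction or an exact test.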

Concordance between adherence to EBG as calculated from data captured in the CDS system and as calculated from the ED visit clinical note did not differ significantly between the control (87% concordant) and intervention (92% concordant) groups (p=0.3565, Table 3).

Table 3:

Concordance of Adherence to Evidence-Based Guidelines in Clinical Decision Support vs. Clinical Notes

Group Result Number p-value
Control Concordant 87 0.3565
Discordant 13
Intervention Concordant 92
Discordant 8

Discussion

We found that implementing quarterly physician-specific performance feedback reports modestly but significantly (an 8.8% relative increase) improved adherence to evidence-based guidelines for CTPE beyond the improvements gained through use of CDS alone. The use of imaging CDS is mandated both by the relatively modest Meaningful Use Stage 2 requirements and by the much broader forthcoming requirements of the Protecting Access to Medicare Act of 2014.7,18 Prior studies have shown that implementation of imaging CDS alone is unlikely to optimize adherence to EBG.12,13 Our findings suggest that traditional performance improvement methods and strategies (such as physician-specific performance feedback reports), used in conjunction with CDS, may enhance the return on the substantial health IT investment being made in the U.S. to help transform the healthcare system.

The observed improved adherence to EBG for CTPE in the ED is similar to care improvements achieved in other specialties with the use of feedback reports. Prior research has demonstrated that feedback reports can result in proportional improvements in colorectal cancer screening in primary care14 and administration of prophylactic antibiotics by anesthesiologists19 similar to that seen in our intervention group. Our control group, on the other hand, maintained adherence similar to that previously documented for adherence to EBG in CTs for PE after the use of CDS.13

Given the improvement in adherence to EBG observed in the intervention group, it might be expected that our results would also have demonstrated a concurrent decrease in use, and increase in yield, of CTPEs. Although we observed a 10.4% relative reduction in use of CTPE per 1,000 ED visits and a 17% increase in yield in the intervention group, neither result was statistically significant, likely because our study was underpowered to detect changes in these secondary outcomes.

One potential explanation for the increase in adherence to EBG as documented in the CPOE system might be erroneous data entry by providers who knew that they were being measured, in effect “gaming the system”. However, comparison of the adherence based on CPOE data to the adherence based on data documented in ED visit clinical notes demonstrated a trend towards increased concordance in the intervention group, rather than the decreased concordance expected if providers were entering erroneous data.

Notably, both our CDS and our feedback reports target and monitor adherence to EBG based on validated high quality evidence, a concept that has fundamental face validity to providers. Such evidence, defined by disease-specific guidelines, can be unambiguously represented in CDS. This focus on promoting and monitoring adherence to validated high quality evidence may have improved provider buy-in of our intervention.

A number of limitations of this randomized trial bear noting. First, because it was performed at a large academic medical center with a well-integrated CDS system, the results may not be generalizable to other sites, particularly those currently without a CDS system on which to build such performance feedback reports. However, given the increasing adoption of CPOE systems,20 we believe that integrated CDS will become increasingly common in EDs and that adding this traditional performance improvement methodology to CDS-derived reports will soon be practical for many sites. Second, most of our orders for CTPEs are placed by resident physicians or physician assistants, whereas the feedback reports were given to attending physicians. However, our ED culture is one of attending physician involvement, and imaging decisions are typically made at the level of the attending rather than solely by a resident or physician assistant. This, along with the short rotation length of some off-service residents (often only 2-4 weeks), made randomization at the level of attending physicians a necessary, and appropriate, approach at our site. Other sites with less attending-level decision making may find that other provider groups require feedback reports as well. Third, our study was powered only for EBG adherence and not for use or yield of CTPE; a larger sample might have captured significant changes in these two outcomes. Fourth, our reports were distributed via e-mail, allowing for the possibility that some providers simply ignored them; however, the effect observed in the intervention group implies that at least some proportion of that group reviewed the reports. It is also possible that our results underestimate the impact of feedback reports, as attending physicians in the control group may have become aware of the distribution of feedback reports and modified their ordering behavior.
Finally, we were unable to measure the impact of physician performance feedback reporting alone, because all CTPE orders were exposed to CDS throughout the study period. However, our CDS implementation enabled prospective documentation of the discrete data needed to unambiguously measure physician adherence to EBG for every CTPE request, which is likely the most clinically relevant component of the feedback report. Without CDS, because provider documentation of the necessary discrete data is not easily enforced in the free-text format of existing electronic health records, even extensive manual chart review would have been unlikely to yield the detailed EBG adherence measure used in our feedback reports.

In conclusion, earlier studies using imaging CDS have demonstrated improved use of CT for the evaluation of ED patients suspected of having pulmonary embolism; however, nearly one in four CTPEs still deviates from evidence-based guidelines after CDS implementation. Our findings demonstrate that quarterly physician-specific performance feedback reports, used in conjunction with CDS, augment the gains in guideline adherence achieved through CDS alone. These data suggest that supplementing CDS with traditional quality improvement strategies and tools, such as individualized performance feedback reports, may improve the return on the substantial national health IT investment in the U.S. to help transform healthcare, improve quality of care, and reduce waste.

References

  • 1. Choosing Wisely - An Initiative of the ABIM Foundation. Available at: http://choosingwisely.org/. Accessed April 25, 2012.
  • 2. Schuur JD, Carney DP, Lyn ET, et al. A Top-Five List for Emergency Medicine: A Pilot Project to Improve the Value of Emergency Care. JAMA Intern Med 2014. doi: 10.1001/jamainternmed.2013.12688.
  • 3. Lee MH, Schuur JD, Zink BJ. Owning the cost of emergency medicine: beyond 2%. Ann Emerg Med 2013;62(5):498–505.e3. doi: 10.1016/j.annemergmed.2013.03.029.
  • 4. Raja AS, Walls RM, Schuur JD. Decreasing use of high-cost imaging: the danger of utilization-based performance measures. Ann Emerg Med 2010;56(6):597–599. doi: 10.1016/j.annemergmed.2010.09.013.
  • 5. Federal Register. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; 2012;77:54163–292.
  • 6. Blumenthal D. Launching HITECH. N Engl J Med 2010;362(5):382–385. doi: 10.1056/NEJMp0912825.
  • 7. Protecting Access to Medicare Act of 2014 (H.R. 4302). Available at: https://www.govtrack.us/congress/bills/113/hr4302. Accessed May 29, 2014.
  • 8. Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc 2003;10(6):523–530. doi: 10.1197/jamia.M1370.
  • 9. Khorasani R, Hentel K, Afilalo J, et al. Ten Commandments of Effective Decision Support for Imaging. AJR Am J Roentgenol 2014.
  • 10. Mitchell AM, Jones AE, Tumlin JA, Kline JA. Prospective Study of the Incidence of Contrast-induced Nephropathy Among Patients Evaluated for Pulmonary Embolism by Contrast-enhanced Computed Tomography. Acad Emerg Med 2012;19(6):618–625. doi: 10.1111/j.1553-2712.2012.01374.x.
  • 11. Raja AS, Ip IK, Prevedello LM, et al. Effect of computerized clinical decision support on the use and yield of CT pulmonary angiography in the emergency department. Radiology 2012;262(2):468–474. doi: 10.1148/radiol.11110951.
  • 12. Raja AS, Gupta A, Ip IK, Mills AM, Khorasani R. The use of decision support to measure documented adherence to a national imaging quality measure. Acad Radiol 2014;21(3):378–383. doi: 10.1016/j.acra.2013.10.017.
  • 13. Winickoff RN, Coltin KL, Morgan MM, Buxbaum RC, Barnett GO. Improving physician performance through peer comparison feedback. Med Care 1984;22(6):527–534.
  • 14. Kiefe CI, Allison JJ, Williams O, Person SD, Weaver MT, Weissman NW. Improving quality improvement using achievable benchmarks for physician feedback: A randomized controlled trial. JAMA 2001;285(22):2871–2879. doi: 10.1001/jama.285.22.2871.
  • 15. Wells PS, Anderson DR, Rodger M, et al. Excluding pulmonary embolism at the bedside without diagnostic imaging: management of patients with suspected pulmonary embolism presenting to the emergency department by using a simple clinical model and d-dimer. Ann Intern Med 2001;135(2):98–107.
  • 16. Gupta A, Raja AS, Khorasani R. Examining clinical decision support integrity: is clinician self-reported data entry accurate? J Am Med Inform Assoc 2013.
  • 17. Blumenthal D, Tavenner M. The "meaningful use" regulation for electronic health records. N Engl J Med 2010;363(6):501–504. doi: 10.1056/NEJMp1006114.
  • 18. O'Reilly M, Talsma A, VanRiper S, Kheterpal S, Burney R. An anesthesia information system designed to provide physician-specific feedback improves timely administration of prophylactic antibiotics. Anesth Analg 2006;103(4):908–912. doi: 10.1213/01.ane.0000237272.77090.a2.
  • 19. Pallin DJ, Sullivan AF, Espinola JA, Landman AB, Camargo CA. Increasing Adoption of Computerized Provider Order Entry, and Persistent Regional Disparities, in US Emergency Departments. Ann Emerg Med 2011;58(6):543–550.e3. doi: 10.1016/j.annemergmed.2011.05.015.
