Author manuscript; available in PMC: 2018 Sep 1.
Published in final edited form as: J Hosp Med. 2017 Sep;12(9):743–746. doi: 10.12788/jhm.2811

Internal Medicine Resident Engagement with a Laboratory Utilization Dashboard: Mixed Methods Study

Gregory Kurtzman 1,3,4, Jessica Dine 2, Andrew Epstein 1,3,5, Yevgeniy Gitelman 4,5, Damien Leri, MS Ed 4, Mitesh S Patel 1,3,4,5, Kira Ryskina 1,3
PMCID: PMC5803096  NIHMSID: NIHMS938985  PMID: 28914280

Abstract

Our objective was to measure internal medicine resident engagement with an EMR-based dashboard providing continuous feedback on their use of routine laboratory tests relative to service averages. From January to June 2016, residents were emailed a snapshot of their personalized dashboard, a link to the online dashboard, and text summarizing the resident’s and the service’s utilization averages. We measured resident engagement using email read-receipts and web-based tracking. We also conducted three one-hour focus groups with residents. Using a grounded theory approach, we analyzed the transcripts for common themes, focusing on barriers to and facilitators of dashboard use. Among 80 residents, 74% opened the email containing a link to the dashboard and 21% accessed the dashboard itself. Residents who deviated further from service averages had significantly higher odds of accessing the dashboard. Focus group participants described key barriers and concerns and made suggestions for future iterations of this scalable intervention for practice-based learning.

Keywords: cost-effective care, practice-based teaching, laboratory test, utilization, internship and residency

Introduction

Recent efforts to reduce waste and overuse in healthcare include reforms such as merit-based physician reimbursement for efficient resource use1 and the inclusion of cost-effective care as a competency for physician trainees.2 Focusing on resource use in physician training and reimbursement presumes that teaching and feedback about utilization can alter physician behavior. Early studies of social comparison feedback observed considerable variation in effectiveness depending on the behavior targeted and how feedback was provided to physicians.3–5 The widespread adoption of electronic medical record (EMR) software enables the design of scalable interventions that provide continuous feedback in real time via EMR-based practice dashboards. Currently, little is known about physician engagement with practice dashboards and, in particular, about trainee engagement with dashboards aimed at improving cost-effective care.

To inform future efforts at using social comparison feedback to teach cost-effective care in residency, we measured internal medicine resident engagement with an EMR-based utilization dashboard providing feedback on their use of routine laboratory tests on an inpatient medicine service. Routine labs are often overused in the inpatient setting; one study reported that 68% of laboratory tests ordered in an academic hospital did not contribute to improving patient outcomes.6 To understand resident perceptions of the dashboard and identify barriers to its use, we conducted a mixed methods study that tracked resident utilization of the dashboard over time and collected qualitative data from three focus groups about resident attitudes toward the dashboard.

Methods

From January to June 2016, resident-specific rates of routine lab orders (e.g., complete blood count, basic metabolic panel, complete metabolic panel, liver function panel, and common coagulation tests) were synthesized continuously in a web-based dashboard. The dashboard was linked to the health system EMR and allowed the user to look up each patient’s medical record for more detailed information.

A total of 198 resident-blocks on six general medicine services at the Hospital of the University of Pennsylvania were cluster-randomized with equal probability to one of two arms: (1) residents emailed a snapshot of the personalized dashboard, a link to the online dashboard, and text summarizing resident and service utilization averages, and (2) residents who did not receive the feedback intervention. PGY1 residents were attributed only the orders they placed themselves; PGY2 and PGY3 residents were attributed orders for all patients assigned to their team.
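To make the attribution rule concrete, the logic can be sketched in a few lines of Python with pandas. This is only an illustration under assumed data structures; the column names (pgy, resident_id, ordering_resident_id, patient_team_id) and helper functions are hypothetical, and the paper does not describe the dashboard’s actual implementation.

```python
import pandas as pd

# Sketch of the order-attribution rule described above. All column names
# are hypothetical; the study's actual data schema is not reported.

def attribute_orders(orders: pd.DataFrame, resident: dict) -> pd.DataFrame:
    """Return the routine lab orders attributed to one resident."""
    if resident["pgy"] == 1:
        # PGY1 residents are credited only with orders they placed themselves.
        return orders[orders["ordering_resident_id"] == resident["resident_id"]]
    # PGY2/PGY3 residents are credited with orders for all patients on their team.
    return orders[orders["patient_team_id"] == resident["team_id"]]

def labs_per_patient_day(attributed_orders: pd.DataFrame, patient_days: float) -> float:
    """Utilization rate shown on the dashboard: routine lab orders per patient-day."""
    return len(attributed_orders) / patient_days
```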

The initial emails were timed to arrive in the middle of each resident’s two-week service to allow for a baseline and follow-up period. They were followed by a reminder email 24 hours later containing only the link to the report card. We measured resident engagement with the utilization dashboard using email read-receipts and a web-based tracking platform that recorded when the dashboard was opened and who logged on.

Following completion of the intervention, three one-hour focus groups were conducted with residents. The focus groups were guided by scripted questions prompting discussion of the advantages and drawbacks of the laboratory ordering dashboard and of dashboards in general. The sessions were digitally recorded and transcribed. Two authors (KR and GK) analyzed the transcripts using a grounded theory approach to identify common themes.7 First, each author independently reviewed the transcripts and generated a broad list of themes across three domains: dashboard usability, barriers to use, and suggestions for the future. Next, the codebook was refined through an iterative series of discussions and transcript reviews until a unified codebook was reached by consensus. Last, all transcripts were reviewed using the final codebook definitions, yielding a list of exemplary quotes and suggestions.

The study was approved by the University of Pennsylvania IRB and registered on ClinicalTrials.gov (NCT02330289).

Results

Eighty unique residents participated in the intervention, including 51 PGY1 residents (64%) and 29 PGY2 or PGY3 residents (36%). Of these, 17 (21%) participated in more than one resident-block. Participants opened the email in 74% of the resident-blocks during which they were exposed to the intervention and opened the link to the dashboard in 21% of resident-blocks. The average elapsed time from receiving the initial email to logging into the dashboard was 28.5 hours (SD = 25.7; median = 25.5; interquartile range [IQR] = 40.5). On average, residents deviated from the service mean by 0.54 laboratory test orders per patient-day (SD = 0.49; median = 0.40; IQR = 0.60).

Table 1 shows the associations between dashboard use and participant characteristics. Participants whose summary email showed a larger deviation from the service average had higher odds of opening the link to the dashboard: each 1-standard-deviation absolute difference in labs per patient-day was associated with an odds ratio (OR) of 1.48 (95% CI, 1.01–2.17; P = 0.047). Associations with other characteristics (direction of deviation from the mean, PGY level, first occurrence of intervention, weeks since start of intervention, and whether other team members opened the link) were not significant.

Table 1.

Odds of Accessing the Online Dashboard by Resident Characteristics

Resident Characteristic | Odds Ratio | 95% Confidence Interval | P-value
Absolute difference of 1 standard deviation of labs per patient-day from service average | 1.48 | 1.01–2.17 | 0.047
Ordering more tests than service average | 1.11 | 0.45–2.73 | 0.83
PGY2 or PGY3 (reference = PGY1) | 1.06 | 0.37–3.03 | 0.91
First occurrence of intervention | 1.38 | 0.31–6.10 | 0.68
Weeks since start of intervention | 0.92 | 0.80–1.06 | 0.23
Whether other members of team opened the link | 1.10 | 0.32–3.72 | 0.88
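The paper does not report the model specification behind Table 1. A plausible reconstruction, offered only as a sketch, is a logistic regression at the resident-block level with cluster-robust standard errors to account for residents who participated in more than one block; the input file and all variable names below are hypothetical assumptions, not the study’s actual code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per resident-block: whether the dashboard link was opened, plus
# the Table 1 covariates. The file and variable names are hypothetical.
df = pd.read_csv("resident_blocks.csv")

# Cluster on resident because 21% of residents contributed multiple blocks
# (an assumption; the paper does not report how repeat exposure was handled).
model = smf.logit(
    "opened_dashboard ~ abs_deviation_sd + ordered_above_average"
    " + pgy2_or_3 + first_occurrence + weeks_since_start + teammate_opened",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["resident_id"]})

# Exponentiating the coefficients yields odds ratios comparable to Table 1,
# e.g., the OR per 1-SD absolute deviation from the service average.
summary = pd.concat(
    [np.exp(model.params), np.exp(model.conf_int()), model.pvalues], axis=1
)
summary.columns = ["odds_ratio", "ci_lower", "ci_upper", "p_value"]
print(summary)
```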

Table 2 displays the main themes generated from the resident focus groups and provides representative quotes. First, residents commented on the advantages of the dashboard intervention: they felt it raised awareness about overuse, appreciated receiving individualized feedback about their own practice, and liked that the data could be reviewed quickly. However, residents also expressed concerns about the design and implementation of the dashboard, including the lack of adjustment for patient complexity, the small sample size, and time constraints limiting detailed exploration of the dashboard. Second, participants questioned the practicality of using such data-driven individualized feedback for training purposes in general, given trainees’ low patient volume and a sense that such feedback is too simplistic. For example, one participant commented, “…It really takes all of the thinking out of it and just is glossing over the numbers, which I think could be a little bit frustrating.”

Table 2.

Main Themes from Resident Focus Groups

Domain | Theme | Representative quote
Usefulness of the laboratory dashboard | Raises awareness about laboratory ordering rates | “I don’t think there’s any question that it’s interesting. It’s super interesting to look at it and decide if you can derive meaningful information from it. I think it’s definitely interesting, and I’d love to see it out of curiosity.”
 | Provides individual feedback | “It is nice to get data tailored to us.”
 | Can be reviewed quickly | “It really only takes a couple of minutes to go through.”
Barriers to using this dashboard intervention | Does not account for patient complexity | “I would be a little bit annoyed actually, because it takes into account none of the complexity of your patients.”
 | Sample size/duration on service | “There are almost no circumstances where I would care about a one-week interval or a two-week interval. It would be essentially impossible for me to be convinced that it was statistically relevant and that somehow you could account for all of the variants. But I would be interested in a longer time interval.”
 | Too simplistic | “It really takes all of the thinking out of it and just is glossing over the numbers, which I think could be a little bit frustrating.”
 | Reliability of data | “You have to order troponin three times because they’re like oh, I didn’t order that one. But if you put it in again, and you get three troponin orders, and the last one is the one that gets drawn.”
 | Delivery of intervention | “I currently have 5900 unread messages in my email box, so I’m going to say no.”
 | Superfluous to traditional training | “We’ve gone through several years of medical training and hopefully are able to sort of triage which labs you think are necessary for our patients or not.”
 | Laboratory orders are not under resident control | “I feel like there’s just too many variables, and on top of that, what can you really control in it that’s going to give you these metrics that either give you a false sense of doing well or make you feel like you’re doing poorly when in reality it probably has very little to do with you. I’d rather get feedback from nursing staff and from phlebotomy and from the social workers that leave direct comments about how you interact with them in your performance as colleagues and your social skills and communication skills.”
Barriers to using dashboards in general | Lack of time in resident schedule | “It’s honestly just a time issue. If you can see everything in one screen, that’s a lot easier than while you’re on your phone trying to log into a different screen or remembering a password.”
 | Insufficient patient volume | “My preceptor is always like don’t pay attention to this because you don’t see enough patients for it to be useful at this point.”
Suggestions to facilitate use of dashboards | Alleviate concerns about punitive consequences | “As long as people have the reassurance that this is really for your own benefit and to help you guide your practices and get some feedback, and not to punish you if you’re low or even reward you if you’re on the high end, but really just to kind of help you, I think people would maybe be more onboard.”
 | Additional guidance for how to use the data | “I don’t know how it would change my practices… unless there was some guidance for us to do that.”
 | Ease of access | “I can read all this in an email - I’m much more likely to do that than log into a computer.”
 | Drop outlier patients or apply risk-adjustment | “I don’t know if for each patient you could kind of score them based on how many active medical problems they have or the intensity of their problems and then kind of have a scale where you can compare patients between different – in a more objective fashion.”
 | Include other team members | “I think the difference is a lot in what the expectation is of the attending. So I think that maybe the target should be for this to go to the attendings on this teams, because they’re the ones who are going to make that immediate culture change.”

Third, participants identified barriers to using dashboards during training, including time constraints, insufficient patient volume, possible unanticipated consequences, and concerns regarding punitive action by the hospital administration or teaching supervisors. Suggestions to improve the uptake of practice feedback via dashboards included additional guidance for interpreting the data, exclusion of outlier cases or risk-adjustment, and ensuring ease of access to the data.

Last, participants expressed enthusiasm for receiving other types of individualized feedback, including data on patient satisfaction, timing of discharges, readmission rates, utilization of consulting services, length of stay, antibiotic stewardship practices, costs and utilization, and mortality and intensive care unit transfer rates.

Conclusion/Discussion

Overall, engagement of internal medicine trainees with the online dashboard was low. Most residents opened the emails containing the link and basic information about their utilization rates, but fewer than a quarter accessed the dashboard containing real-time data, and on average they took more than a day to do so. However, there is some indication that residents who deviated further from the mean in either direction were more motivated to access the dashboard to investigate their performance further. This suggests that providing practice feedback in this manner may be effective for the subset of residents who deviate from “typical practice”; as such, dashboards may represent a scalable educational tool that could be aligned with practice-based learning competencies.

The focus groups provided important context about residents’ attitudes toward EMR-based dashboards. Overall, residents were enthusiastic about receiving information regarding their personal laboratory ordering, both in terms of preventing iatrogenic harm and avoiding waste of resources. This supports previous research finding that both medical students and residents overwhelmingly believe overuse of laboratory tests to be a problem and that there may be insufficient focus on cost-conscious care during training.8,9 However, residents raised questions regarding several aspects of dashboard use in practice-based learning and suggested specific improvements to increase the utility of future dashboards.

To our knowledge, this is the first attempt to evaluate resident engagement with, and attitudes toward, practice-based feedback delivered through an EMR-based online dashboard. Previous efforts to influence resident laboratory ordering behavior have primarily relied on didactic sessions, financial incentives, and repeated email messages containing summary statistics about ordering practices and peer comparisons.10–12 While some prior studies succeeded in decreasing unnecessary use of laboratory tests, such efforts are challenging to implement routinely on a teaching service with multiple rotating providers and may be difficult to scale and replicate. Nevertheless, dashboards that incorporate focused curriculum design and the active participation of teaching attendings may be more effective in engaging residents.

This study has limitations. Most importantly, the sample of physicians is relatively small and consists of residents at a single institution. Additionally, the dashboard captured laboratory ordering rates during a two-week block on an inpatient medicine service and was not adjusted for factors such as patient case mix, although rates were adjusted for patient volume.

Since residents are expected to be responsive to feedback, their use of the dashboards may represent an upper bound on physician responsiveness to social comparison feedback about utilization. However, dashboards alone may not be an effective way to provide feedback in areas that require additional engagement by the learner. Future efforts to improve care efficiency may try to better capture baseline ordering rates, encourage attendings to review utilization information with trainees, and provide clear recommendations for how this information can be used to adjust behavior.

Footnotes

The authors report no external funding source for this study. The authors declare no conflict of interest.

References

1. Clough JD, McClellan M. Implementing MACRA: implications for physicians and for physician leadership. JAMA. 2016. doi:10.1001/jama.2016.7041 (published online).
2. The Internal Medicine Subspecialty Milestones Project: A Joint Initiative of the Accreditation Council for Graduate Medical Education and The American Board of Internal Medicine. Accessed July 6, 2016. http://www.acgme.org/portals/0/pdfs/milestones/internalmedicinesubspecialtymilestones.pdf
3. Meeker D, Linder JA, Fox CR, et al. Effect of behavioral interventions on inappropriate antibiotic prescribing among primary care practices: a randomized clinical trial. JAMA. 2016;315(6):562–570. doi:10.1001/jama.2016.0275
4. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006;2(2):CD000259. doi:10.1002/14651858.CD000259.pub2
5. Navathe AS, Emanuel EJ. Physician peer comparisons as a nonfinancial strategy to improve the value of care. JAMA. 2016;316(17):1759–1760. doi:10.1001/jama.2016.13739
6. Miyakis S, Karamanof G, Liontos M, Mountokalakis TD. Factors contributing to inappropriate ordering of tests in an academic medical department and the effect of an educational feedback strategy. Postgrad Med J. 2006;82(974):823–829. doi:10.1136/pgmj.2006.049551
7. Glaser B, Strauss A. The Discovery of Grounded Theory. London: Weidenfeld and Nicolson; 1967.
8. Sedrak MS, Patel MS, Ziemba JB, et al. Residents’ self-report on why they order perceived unnecessary inpatient laboratory tests. J Hosp Med. 2016. doi:10.1002/jhm.2645 (published online).
9. Tartaglia KM, Kman N, Ledford C. Medical student perceptions of cost-conscious care in an internal medicine clerkship: a thematic analysis. J Gen Intern Med. 2015;30(10):1491–1496. doi:10.1007/s11606-015-3324-4
10. Iams W, Heck J, Kapp M, et al. A multidisciplinary housestaff-led initiative to safely reduce daily laboratory testing. Acad Med. 2016;91(6):813–820. doi:10.1097/ACM.0000000000001149
11. Corson AH, Fan VS, White T, et al. A multifaceted hospitalist quality improvement intervention: decreased frequency of common labs. J Hosp Med. 2015;10:390–395. doi:10.1002/jhm.2354
12. Yarbrough P, Kukhareva P, Horton D, Edholm K, Kawamoto K. Multifaceted intervention including education, rounding checklist implementation, cost feedback, and financial incentives reduces inpatient laboratory costs. J Hosp Med. 2016;11(5):348–354. doi:10.1002/jhm.2552
