Abstract
Background
Quality measures help identify gaps and disparities in care delivery and prioritize opportunities to improve health. Calls to enhance Systems-Based Practice and Practice-Based Learning and Improvement competencies for residency training cite the need for quality measures for trainees as central to this effort. The authors sought to demonstrate the feasibility of creating a residency program data visualization dashboard to examine individual and program quality measures for an internal medicine residency program within Kaiser Permanente Northern California.
Methods
An interactive display was developed through an iterative design process to allow easy visualization of quality and operational measures. The dashboard displays data for individual residents, residency classes (PGY1-3), and the entire program, including quality measures, systems measures, and patient diagnoses. Iteration continues to improve the functionality and usefulness of the dashboard.
Results
It is feasible to create a dashboard to visualize individual and program quality measures and health equity measures for a residency program using a learner-centered approach and alignment with institutional goals through collaboration between education and operational teams. Future studies will examine the audit and feedback process, resident perceptions, and changes to patient outcomes.
Conclusion
Use of dashboards in graduate medical education is feasible and can be used to help residents and residency programs identify gaps in quality of care.
Keywords: Quality measures, graduate medical education, competency assessment, health information technology, health equity
Introduction
Quality measures, which quantify “health care processes, outcomes, patient perceptions, and organizational structure,” 1 are used to help identify gaps and disparities in care delivery, prioritize opportunities to improve health care services, and drive changes in the health care system. The proliferation of quality measures has a complex impact on physicians and health care systems, and many physician societies have advocated defining the measures that have the highest value to patients and physicians. 2
As quality measures become increasingly embedded in physician practice, the medical education community has accepted the charge to train physicians to interpret and address quality measures. Calls to equip trainees with skills in Systems-Based Practice (SBP) and Practice-Based Learning and Improvement (PBLI) specifically cite the availability of quality measures to trainees as central to this effort. 3–5 The Accreditation Council for Graduate Medical Education (ACGME) outlines a vision of a health system in which trainees are “immersed in evidence-based, data-driven clinical learning and care environments.” 5 Concordantly, the ACGME Common Program Requirements require sponsoring institutions to provide residents and fellows with quality data; 6 the ACGME Clinical Learning Environment Review (CLER) pathway in health care quality provides a road map for sponsoring institutions to do so. This includes granting trainees and training programs access to performance data that is “specific to the patients for whom they provide direct patient care” and ensuring that trainees can interpret and act on this data. 7
The need for residency program access to quality measures is particularly pressing for internal medicine, 8 as many quality measures pertain to general medicine in inpatient and outpatient settings. The ACGME November 2020 Internal Medicine Milestone Revisions, 9 developed with the support of physician organizations including the American Board of Internal Medicine and Society of General Internal Medicine, have added emphasis and clarity to the SBP and PBLI competencies, including the use of performance data and metrics in quality and safety initiatives. Although such data are increasingly available to health care systems, opportunities for trainees to examine quality measures are often limited by institutional policies and processes. Graduate medical education programs may be siloed from medical center quality offices, leaving trainees without access to patient-level data. Finally, even if access is granted, many quality reports can appear complex, confusing, and unactionable at the point of care.
Quality dashboards are designed to synthesize and concisely visualize complex data, 10 and to present data in visually appealing, actionable ways for clinical applications. Epstein et al 11 have proposed design principles for quality dashboards for residency programs, yet limited literature exists regarding the feasibility of implementing them. The theoretical framework of audit and feedback (A&F), defined as the measurement of an individual clinician’s professional practice combined with comparison to professional standards or targets, 12 provides a pedagogical model for implementing the use of dashboards within residency training programs, allowing data to be contextualized and improvement and growth to be emphasized. Social comparison theory may help explain why A&F may psychologically or sociologically compel individuals to strive to reach specific goals. 13,14
In this innovation, the authors aimed to determine the feasibility of creating an interactive, comprehensive dashboard of individual resident and program-level performance on quality measures, including health equity-associated data.
Methods
Kaiser Permanente is an integrated health care system serving 12.5 million members in nine regions across the United States. It has a workforce of 220,000 Kaiser Foundation Health Plan and Hospital employees and 23,000 Permanente physicians. Kaiser Permanente’s commitment to preventive, population-based care in its “Total Health” and “Community Health” imperatives has contributed to its high Medicare Star Quality Ratings. Kaiser Permanente’s population health management tools and longstanding electronic medical records enable it to retrieve highly reliable data and analyze quality measures.
Kaiser Permanente has had graduate and undergraduate medical education training programs since its inception in 1945, now with 67 residency and fellowship programs nationwide in various specialties. The oldest of these programs, the Kaiser Permanente Internal Medicine Residency in Oakland, California, has used population health management tools in training its 39 categorical residents for over two decades. Specifically, Healthcare Effectiveness Data and Information Set (HEDIS) data have been available for resident patient panels in an Excel™ spreadsheet format since 2015. However, these data often lagged by several months and were difficult to interpret, and thus were rarely used by trainees or faculty to understand resident patient panels or drive quality improvement efforts.
As Kaiser Permanente began using Tableau™ software to create dashboards to visually represent quality data, and ACGME CLER visits enabled ongoing collaboration between medical education and quality departments, the decision was made to build a prototype dashboard for internal medicine residents at Kaiser Permanente Oakland in 2019.
This process, outlined in Figure 1, involved collaboration among stakeholders, including learners and educational and operational leaders. Physician executives, medical group administrators, hospital administrators, regional quality leads, medical center data analysts, residency program directors, assistant program directors, and chief residents were consulted throughout the process. Stakeholders specifically provided input on which quality measures were of the highest priority for patients, the institution, and residents. Residents selected measures for their patient panels that they would most likely be able to impact at the point of care, as well as data relevant to learning and feedback in SBP and PBLI. A data visualization dashboard was created using information from the Epic™ electronic health record (EHR) and institutional quality reports, and then programmed into Tableau™ software. The pilot period used data extracted for 36 internal medicine residents from January to August of 2021.
Figure 1:
Iterative process of dashboard creation. HEDIS = Healthcare Effectiveness Data and Information Set
Initial dashboard iterations focused on outpatient quality outcomes, including 12 HEDIS measures, with subsets, including measures by race to examine health equity. Outcomes for the attending physicians at that site serve as a reference. All information was deidentified by name, and individual residents were assigned a unique profile number that allowed them to compare their performance with the group average as well as with peers. An iterative process continued, with user testing by residents, program directors, and operational leaders on measures and user-friendliness of the display and interactive components. Additional measures were added to the dashboard, including outpatient panel demographics and visit diagnoses, the proportion of visits by empaneled patients versus urgent care visits, handling time of online patient messages, and inpatient note types, volumes, and diagnoses. At each stage, feedback from at least three individuals was used to enhance functionality and appearance, determine additional data to include, and refine analytic models. The dashboard is refreshed monthly to ensure ongoing functionality of programming and will be available to all resident physicians.
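The deidentification and peer-comparison logic described above can be sketched in a few lines of code. This is a minimal illustration, not the program's actual implementation: the profile-numbering scheme, measure rates, and resident names below are all assumptions for demonstration.

```python
import random
import statistics

def assign_profile_ids(resident_names, seed=2021):
    """Map each resident name to a unique, deidentified profile number.
    Shuffled sequential numbering with a fixed seed is an illustrative
    assumption; any stable, non-identifying scheme would serve."""
    rng = random.Random(seed)
    numbers = rng.sample(range(1000, 9999), len(resident_names))
    return dict(zip(resident_names, numbers))

def measure_summary(panel_rates):
    """Given {profile_id: rate} for one HEDIS-style measure, return each
    resident's rate alongside the group average for peer comparison."""
    group_mean = statistics.mean(panel_rates.values())
    return {
        pid: {"rate": rate,
              "group_mean": round(group_mean, 3),
              "vs_peers": round(rate - group_mean, 3)}
        for pid, rate in panel_rates.items()
    }

# Hypothetical example: screening rates for three deidentified residents
ids = assign_profile_ids(["Resident A", "Resident B", "Resident C"])
rates = {ids["Resident A"]: 0.72, ids["Resident B"]: 0.80, ids["Resident C"]: 0.64}
summary = measure_summary(rates)
```

Each resident sees only their own profile number, yet the `vs_peers` gap lets them situate their performance against the group average, as described above.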
Resident physicians and program leadership attended several focus groups between January and July of 2021 to examine the dashboard data and functionality. Suggested changes to content and functionality were incorporated into the ongoing quality improvement process for the dashboard.
Results
HEDIS measures can be visualized for individual residents, residency classes (PGY1-3), and the entire program. A sample of the quality dashboard demonstrating individual resident and class-level performance on HEDIS measures can be seen in Figure 2. Attendings outperform residents in every HEDIS measure; most measures improve with increasing PGY year. The total of all resident panel patients is 7850; 4767 (61%) are women, 1469 (19%) are younger than 65 years, 1605 (24%) are Black, and 859 (13%) are Latino. Outpatient visit data, seen in Figure 3, include patient electronic messages responded to within 2 days, with a 90% target; only 17 residents meet this target. Top outpatient diagnoses, for an individual resident or for the program, include routine health care checks, hypertension, and type 2 diabetes mellitus. The percentage of appointments taken by empaneled patients, for individual residents as well as the overall residency practice, is also reported. Inpatient data include the total number of inpatient notes written during residency; the average PGY3 writes 286 histories and physicals (range 224–337), 705 progress notes (range 628–837), and 141 discharge summaries (range 119–168) in 26 months of residency. Top inpatient diagnoses include congestive heart failure, hypertension, and pneumonia.
Figure 2:
Sample section of quality dashboard demonstrating individual resident and class level performance on Healthcare Effectiveness Data and Information Set measures. © 2022 Tableau Software, LLC and its licensors. All rights reserved.
Figure 3:
Sample section of quality dashboard demonstrating outpatient panel management data. © 2022 Tableau Software, LLC and its licensors. All rights reserved.
Initial reception by residents during focus groups was positive. Residents noted user-friendliness, visual appeal, and ease of use as critical features. They indicated potential uses would include population health management, identifying care delivery gaps, capturing data for quality improvement interventions, personal assessment of the breadth of exposure to various inpatient and outpatient diagnoses to target further study and board preparation, and personal improvement of systems measures. In similar focus groups, education leaders noted the ability to compare individual residents over time, and the program as a whole to attending practice, as critical features. They indicated potential uses would include determining competency in SBP and PBLI for clinical competency committee evaluations and identifying residents whose performance on a given outcome was an outlier compared to peers.
Discussion
The authors demonstrated that it is feasible to create a dashboard to visualize individual and program quality and health equity measures for a residency program using a learner-centered approach aligned with institutional goals. Creating the dashboard relied upon collaboration and co-creation with leaders, residents, and a data analyst, as well as data accessibility from an integrated delivery system and EHR. The speed and breadth of dashboard development were limited by the complexity of programming and data sources and required an investment of analysis time for updates. The inability to attribute patient outcomes to individual physicians also presents a challenge, as individual patient outcomes are the end product of many variables, including physician care. Indeed, the literature suggests that GME programs may reliably evaluate HEDIS performance pooled at the program level, but less so at the resident level. 14
The next steps include continuous revision of the dashboard, including the incorporation of data assessing interpersonal and communication skills (including patient satisfaction measures). Dashboards are under development for additional residency programs within our institution, allowing for insights on program equivalency. An A&F process for residents when viewing this data to promote competency in SBP and PBLI is being developed. Finally, the authors aim to study changes in patient outcomes over time, resident perception of clinical relevance and utility, and use by programs in clinical competence committees, ACGME self-studies, and CLER pathway achievement.
Electronic dashboards show promise for improving the clinical care of patients, as well as medical training. This innovation shows that it is possible to coproduce a quality dashboard that can be designed to be accessible for trainees and educational leaders. As residency programs often rely on subjective measures by preceptors to assess SBP and PBLI, this dashboard offers the promise of objective data when those assessments are made by clinical competency committees. Furthermore, the A&F construct allows for better dashboard design. For example, benchmarks embedded into dashboard visual displays may reduce complexity and help physicians better understand their performance and identify areas for improvement; trends help physicians interpret when clinical performance requires action. 15
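As a concrete illustration of embedding benchmarks in feedback, the gap between a resident's rate and a target can be computed and sorted so that the largest gaps surface first. The measure names and the benchmark values below are hypothetical, chosen only to demonstrate the idea.

```python
def flag_improvement_areas(resident_rates, benchmarks):
    """Return measures where the resident's rate falls below the benchmark,
    sorted by largest gap first, so feedback highlights priorities."""
    gaps = {
        measure: round(benchmarks[measure] - rate, 3)
        for measure, rate in resident_rates.items()
        if rate < benchmarks[measure]
    }
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical example: one resident's rates vs. program benchmarks
resident_rates = {"message_response_2d": 0.82, "bp_control": 0.68, "crc_screening": 0.75}
benchmarks = {"message_response_2d": 0.90, "bp_control": 0.75, "crc_screening": 0.70}
areas = flag_improvement_areas(resident_rates, benchmarks)
```

Here only the measures below target are flagged, which mirrors the A&F principle that embedded benchmarks help physicians identify areas for improvement rather than sift through raw numbers.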
Conclusion
As health care systems shift toward patient and population outcomes to understand and drive care, dashboard development has the potential to increase the value and effectiveness of this data and to prepare trainees in internal medicine for the realities of future practice.
Acknowledgments
The authors wish to thank Connie Li, MPH; Hernan Oscco; Thomas Baudendistel, MD; and Adam Luxenberg, MD, for their contributions to this project. The authors would like to honor patient contributions to this work, including the use of data from the electronic health record.
Footnotes
Funding: None declared
Conflicts of Interest: None declared
Author Contributions: Nardine Saad Riegels, MD, assisted in the study design, data collection, data analysis, and manuscript preparation. Lindsay A Mazotti, MD, assisted in the study design, data collection, data analysis, and manuscript preparation. Both authors have given their final approval to the manuscript.
References
- 1. Centers for Medicare and Medicaid Services. Quality measures. Accessed 8 September 2021. https://www.cms.gov/medicare/quality-initiatives-patient-assessment-instruments/qualitymeasures
- 2. National Quality Forum. Core Quality Measures Collaborative. Accessed 8 September 2021. http://www.qualityforum.org/cqmc/
- 3. Holmboe ES, Batalden P. Achieving the desired transformation: Thoughts on next steps for outcomes-based medical education. Acad Med. 2015;90(9):1215–1223. 10.1097/ACM.0000000000000779
- 4. Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: Transforming education to strengthen health systems in an interdependent world. Lancet. 2010;376(9756):1923–1958. 10.1016/S0140-6736(10)61854-5
- 5. Accreditation Council for Graduate Medical Education. 2020 Strategic Plan Summary. 2020. Accessed 6 September 2021. https://www.acgme.org/Portals/0/PFAssets/PublicationsPapers/Strategic%20Plan%20Summary.pdf?ver=2020-10-22-114251-953
- 6. Accreditation Council for Graduate Medical Education. Common Program Requirements. 2021. Accessed 6 September 2021. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRResidency2021.pdf
- 7. Accreditation Council for Graduate Medical Education. CLER Pathways to Excellence: Expectations for an optimal clinical learning environment to achieve safe and high-quality patient care. Version 2.0, 2019. Accessed 8 September 2021. https://www.acgme.org/Portals/0/PDFs/CLER/1079ACGME-CLER2019PTE-BrochDigital.pdf
- 8. Fazio SB, Steinmann AF. A new era for residency training in internal medicine. JAMA Intern Med. 2016;176(2):161–162. 10.1001/jamainternmed.2015.6952
- 9. Accreditation Council for Graduate Medical Education. Internal medicine milestones second revision. 2020. Accessed 8 September 2021. InternalMedicineMilestones2.0.pdf (acgme.org)
- 10. Dowding D, Randell R, Gardner P, et al. Dashboards for improving patient care: Review of the literature. Int J Med Inform. 2015;84(2):87–100. 10.1016/j.ijmedinf.2014.10.001
- 11. Epstein JA, Noronha C, Berkenblit G. Smarter screen time: Integrating clinical dashboards into graduate medical education. J Grad Med Educ. 2020;12(1):19–24. 10.4300/JGME-D-19-00584.1
- 12. Gude WT, Brown B, van der Veer SN, et al. Clinical performance comparators in audit and feedback: A review of theory and evidence. Implement Sci. 2019;14(1):39. 10.1186/s13012-019-0887-1
- 13. Festinger L. A theory of social comparison processes. Hum Relat. 1954;7(2):117–140.
- 14. Kim JG, Rodriguez HP, Holmboe ES, et al. The reliability of graduate medical education quality of care clinical performance measures. J Grad Med Educ. 2022;14(3):281–288. 10.4300/JGME-D-21-00706.1
- 15. Jamtvedt G, Young JM, Kristoffersen DT, Thomson O’Brien MA, Oxman AD. Audit and feedback: Effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2003;(3). 10.1002/14651858.CD000259