Author manuscript; available in PMC: 2019 Aug 17.
Published in final edited form as: Am J Med Qual. 2018 Apr 11;33(6):604–613. doi: 10.1177/1062860618767312

Measuring Value in Internal Medicine Residency Training Hospitals Using Publicly Reported Measures

Adam Schickedanz 1,ѱ, Reshma Gupta 2,ѱ, Vineet M Arora 3, Clarence H Braddock III 4
PMCID: PMC6697657  NIHMSID: NIHMS1045702  PMID: 29637791

Abstract

Aims:

Graduate medical education (GME) lacks measures of resident preparation for high-quality, cost-conscious practice. We used publicly reported teaching hospital value measures to compare internal medicine residency programs on high-value care training and to validate these measures against program director value perceptions.

Methods:

We constructed program-level value training scores using Centers for Medicare and Medicaid Services Value-Based Purchasing (VBP) Program hospital quality and cost-efficiency data. Correlations with Association of Program Directors in Internal Medicine Annual Survey high-value care training measures were examined using logistic regression.

Results:

For every point increase in program-level VBP score, residency directors were more likely to agree that GME programs have a responsibility to contain health care costs (adjusted odds ratio [aOR] 1.18, p=0.04) and that their faculty model high value care (aOR 1.07, p=0.03), with a nonsignificant trend toward agreement that residents are prepared to make high value medical decisions (aOR 1.07, p=0.09).

Conclusions:

Publicly reported clinical data offer valid measures of GME value training.

INTRODUCTION:

Graduate medical education (GME) programs in the United States face increasing pressure to train physicians who deliver high-value clinical care, defined by high quality and cost-efficiency. If GME does not achieve this goal, the Institute of Medicine has cautioned, its funding and public trust could erode.1 Moreover, the Medicare Payment Advisory Commission and others have recommended not only increasing GME training in high-quality, cost-efficient care but also reallocating GME funding to incentivize programs to prepare residents who practice high-value care.1-6

These reforms will be challenging unless GME programs have established approaches to measure value training, defined by how programs perform in preparing residents to practice high-value care. Such measurement could help GME programs establish value training benchmarks, identify deficits, and improve through increased resident exposure to high-value clinical settings or targeted value curricula.7-8 Others outside GME could also benefit from publicly-available value training measures. Medical students applying for residency, for instance, could discern which programs immerse them in high-value care settings to prepare them for modern practice, and employers hiring graduating residents would understand the readiness of prospective employees for high-value care delivery.

Essential components of any residency training include clinical experiences and exposures, program director and faculty models, and formal curricula, and each of these components can be leveraged to prepare trainees for high-value practice. We present our conceptual model for how these factors contribute to value training in Figure 1.

Figure 1. Conceptual Model of Value Training at the Residency Program Level.


The model depicts the relationships between clinical care value exposures and experiences, faculty models, and curricula in the context of institutional value culture that produce resident value training and high value practice.

While didactic curricula on value are increasingly common,9 they represent only a small fraction of resident training compared to time spent learning through clinical experiences. Measures of GME value training that gauge clinical exposures and experiences could therefore reflect value training with greater fidelity than measures of formal didactics. Such clinical value measures would reflect not only the learning environment but also upstream institutional value-based care culture10 while predicting the downstream effect of value training: high value care delivery. Clinical quality and cost data measure the training settings that residents are immersed in, contribute to, and learn from. Therefore, they may be useful measures of value training consistent with calls for reform of GME aligned with standards of a high-performance health care system.1,2,3

Two existing measures of value training using clinical data focus on a few care quality and utilization metrics at the end of life at university-based hospitals.11-12 However, neither has been used to assess GME programs beyond a single university hospital, and neither directly incorporates cost metrics or a comprehensive set of quality indicators. Publicly reported teaching hospital quality and cost measures could fill this gap, especially in internal medicine (IM), where residents spend the majority of their training in hospitals developing knowledge, skills, attitudes, and practices that have been shown to influence patient care quality, costs, and overall value outcomes far into their post-graduate careers.13-18 The Centers for Medicare and Medicaid Services (CMS) Hospital Value-Based Purchasing (VBP) program total performance score, a publicly reported aggregate measure incorporating component domains of hospital process of care, patient experience, clinical outcomes, and cost-efficiency, is a well-established measure that could be used to compare clinical value training across GME programs. This VBP score has been used widely to adjust reimbursement for roughly three thousand hospitals in CMS's VBP program,19 including academic teaching hospitals nationwide. While VBP scores do not measure IM resident care alone, VBP component measures predict IM inpatient care quality outcomes,20 and overall VBP scores correlate with IM physicians' assessments of value-based care at their institutions.10

In this study, we developed internal medicine GME programs’ composite value training scores based on CMS VBP measures of their affiliated teaching hospitals to compare differences in composite and component VBP measures, examined hospital and program level characteristics associated with higher scores, and cross-validated the VBP measures against program director assessments of value training.

METHODS:

Data Sources

We used hospital data from the Centers for Medicare and Medicaid Services (CMS) Hospital Value-based Purchasing (VBP) Program released in fiscal year 2015 (2011–2014 claims). The VBP data include quality measures in three domains (process of care, patient satisfaction, and outcomes) and one cost-efficiency domain measuring Medicare spending per beneficiary (MSPB) from which CMS constructs a composite value measure as an adjustment factor for hospital reimbursement.21 We linked these data to the Medicare Impact Files for fiscal year 2014, which also contained hospital characteristics including bed number, nurse-to-bed ratio, region, ownership, urbanicity, case mix, and Disproportionate Share Hospital (DSH) Index.

Data from the 2012 Association of Program Directors in Internal Medicine (APDIM) annual survey22 were collected to cross-validate the composite program-level VBP scores against concurrent program-level assessments of value training and to understand the extent to which these clinical and educational measures align. The 2012 APDIM survey contacted 96% of all ACGME-accredited internal medicine residency programs and collected responses via email and hyperlink from 295 programs, for an overall response rate of 76% (77% in our sample).22

Teaching Hospital Sample

Our study sample included 262 teaching hospitals from the top 100 internal medicine residency programs ranked by US News and World Report in 2014.23 We chose this approach to ensure that the sample focused on the best programs based on academic standards and board exam pass rates (which correlate with VBP score), so that program-level differences in clinical quality and cost-effectiveness were less likely to be attributable to the adequacy of the teaching or the aptitude of the learners. This also allowed for comparisons of VBP scores against conventional program rankings. Teaching hospitals were included in the analyses if they met the criteria for the hospital VBP program: they were acute care, general medical or surgical hospitals paid under the Inpatient Prospective Payment System (IPPS), reported data from at least 100 Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) surveys, and reported data for at least 4 of the 12 clinical process measures with at least 10 eligible cases.21 This removed from the sample five Maryland hospitals not paid under the IPPS,20 sixty-eight Veterans Affairs hospitals that are not Medicare hospitals and do not report publicly through the VBP program, and one hospital excluded from the fiscal year 2015 VBP program while on probation by CMS. Thirty teaching hospitals were excluded because residents trained there for fewer than two months during their three years of residency, leaving a final sample of 158 hospitals.

Value Training Measures

We employed fiscal year 2015 CMS VBP total performance scores (TPS) of teaching hospitals as our primary measure of value training because they reflect clinical exposures and experiences in the inpatient setting, where internal medicine residents spend most of their training time and adopt lasting practice patterns.13,17 CMS calculates the VBP TPS as a composite of four domains (Table 1): clinical processes of care (12 component metrics, 20% of the TPS), patient satisfaction (8 components, 30% of TPS), patient outcomes including mortality and complications (5 components, 30% of TPS), and cost-efficiency defined by MSPB (20% of TPS).21 The MSPB cost-efficiency measure assesses Medicare Part A and Part B payments for episodes of care, which are price-standardized and risk adjusted. Quality domain component measures at the patient level that roll up into the hospital-level VBP TPS are adjusted by CMS for age, sex, and severity of illness, but not for race/ethnicity or socioeconomic status, owing to methodological challenges and the intent to hold all hospitals to the same standard, despite calls for such social risk adjustment.24
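The domain weighting described above can be sketched as a simple weighted sum. This is only an illustration of the stated 20/30/30/20 weights, not CMS's full scoring algorithm (which also applies achievement and improvement points against benchmarks); the function name and inputs are hypothetical.

```python
# Illustrative sketch of the fiscal year 2015 VBP domain weighting described
# above. CMS's actual scoring also incorporates achievement/improvement points
# and benchmarks, which are omitted here; all names and values are hypothetical.
def total_performance_score(process, satisfaction, outcomes, cost_efficiency):
    """Weighted composite of four domain scores, each on a common 0-100 scale."""
    return (0.20 * process           # clinical processes of care
            + 0.30 * satisfaction    # patient satisfaction (HCAHPS)
            + 0.30 * outcomes        # patient outcomes
            + 0.20 * cost_efficiency)  # Medicare spending per beneficiary

# A hypothetical hospital scoring 50 in every domain gets a composite of 50.
tps = total_performance_score(50, 50, 50, 50)
```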

Table 1:

Centers for Medicare and Medicaid Services Value-based Purchasing Total Performance Score Component Measures of Quality and Cost-Efficiency

Clinical Process of Care (Quality) Patient Satisfaction (Quality) Patient Outcomes (Quality) Cost-Efficiency
Fibrinolytic therapy received within 30 minutes of hospital arrival* Nurse communication Acute myocardial infarction 30-day mortality rate Medicare payment per beneficiary
Primary PCI received within 90 minutes of hospital arrival Doctor communication Heart failure 30-day mortality rate
Discharge instructions Hospital staff responsiveness Pneumonia 30-day mortality rate
Blood cultures performed in the ED prior to initial antibiotic received in hospital Pain management Patient safety for selected indicators (composite of pressure ulcer rate, iatrogenic pneumothorax rate, central venous catheter-related bloodstream infection rate, postoperative hip fracture rate, postoperative pulmonary embolism or deep vein thrombosis rate, postoperative sepsis rate, postoperative wound dehiscence rate, accidental puncture or laceration rate)
Initial antibiotic selection for CAP in immunocompetent patient* Communication about medicines Central line-associated bloodstream infection
Prophylactic antibiotic received within one hour prior to surgical incision Hospital cleanliness and quietness
Prophylactic antibiotic selection for surgical patient Discharge information
Prophylactic antibiotics discontinued within 24 hours of surgery Overall hospital rating
Cardiac surgery patients with controlled 6am postoperative serum glucose
Postoperative urinary catheter removal on postoperative day 1 or 2
Surgery patients on a beta-blocker prior to arrival that received a beta blocker during the perioperative period
Surgery patients who received appropriate venous thromboembolism prophylaxis within 24 hours
* CMS did not have data from providers on these measures to include in 2015 scoring.15

For each IM residency program, we created a program-level VBP score as the weighted average of its affiliated teaching hospitals' VBP TPS values, weighted by the proportion of all training months residents spend at each hospital, as reported by program directors and publicly available through the Doximity Residency Navigator database (Doximity Residency Navigator, San Francisco, CA, 2016; https://residency.doximity.com). Similar program-level scores were constructed for each domain of the VBP TPS, along with a composite program-level quality score calculated as the overall VBP TPS minus the cost-efficiency domain.
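The program-level aggregation just described amounts to a training-time-weighted average of hospital scores. A minimal sketch follows; the hospital scores and time shares are hypothetical, not data from the study:

```python
# Sketch of the paper's program-level score: a weighted average of affiliated
# teaching hospitals' VBP total performance scores, weighted by the share of
# training months residents spend at each hospital. All values are hypothetical.
def program_vbp_score(hospital_scores, training_shares):
    """Weighted average of hospital scores; training shares must sum to 1."""
    if abs(sum(training_shares) - 1.0) > 1e-9:
        raise ValueError("training shares must sum to 1")
    return sum(score * share
               for score, share in zip(hospital_scores, training_shares))

# Example: a program splitting training time 60/30/10 across three hospitals.
score = program_vbp_score([45.0, 35.0, 30.0], [0.6, 0.3, 0.1])  # ~40.5
```

The same function applies unchanged to each VBP domain score, which is how the domain-level program scores described above would be built.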

Through the 2012 APDIM survey, program directors responded to the following prompts on a dichotomized 5-point Likert scale ("strongly disagree" to "strongly agree"): 1) "GME has a responsibility to curtail the rising cost of health care"; 2) "The majority of faculty who work with residents in our program consistently model cost-conscious care"; and 3) "Residents in our program are prepared to incorporate the value and costs of care into consideration when making medical decisions." These program director responses constitute the value training measures we used to cross-validate the program-level VBP measure. The APDIM survey also collected program characteristics, including number of residency positions filled, census region, faculty number and proportion volunteer, and number and type of teaching sites (i.e., safety net, university, Veterans Administration/government-affiliated).

Analyses

We compared teaching hospital-level and program-level quality, cost-efficiency, and overall composite VBP scores across our entire sample, in rank order and by hospital or program characteristics, using descriptive analyses of means or proportions, Student's t tests, and χ2 tests. Because safety-net hospitals have been shown to perform more poorly on patient experience scores,25 we compared VBP scores for programs with and without safety net hospitals (defined by a DSH index above 0.5).

We used logistic regression to estimate odds ratios of program director agreement with the APDIM survey prompts as a function of program-level VBP scores, adjusting for program characteristics of size, region, faculty makeup, and teaching site count. In sensitivity analyses, additional variables added individually to the logistic models probed whether factors related to known limitations of the VBP score would affect its association with the APDIM survey measures of value training. These additional variables included the program-level weighted-average teaching hospital DSH index, to adjust for the social risk of the patient population served,26 and the hospital-census-to-program-size ratio, to account for the proportion of discharges not cared for by residents, as well as the proportions of the residency class who subspecialize or pass their board certification exam. We calculated bivariate Spearman and linear correlation coefficients between program rank based on VBP scores and US News rankings and board certification rates. Analyses were performed in Stata 14.1 (StataCorp LP, College Station, TX).
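For readers less familiar with the regression output reported in the Results, the adjusted odds ratios are exponentiated logit coefficients. The sketch below shows only that mapping; the coefficient value is illustrative (chosen to land near the reported aOR of 1.18), not a parameter from the study's fitted models.

```python
import math

# A logistic regression coefficient b on program-level VBP score corresponds
# to an odds ratio of exp(b * delta) for a delta-unit increase in the score.
# The coefficient below is hypothetical, picked to illustrate an aOR near 1.18.
def odds_ratio(beta, delta=1.0):
    """Odds ratio implied by logit coefficient `beta` for a `delta`-unit change."""
    return math.exp(beta * delta)

beta_vbp = 0.1655  # hypothetical coefficient on VBP total performance score
aor_one_point = odds_ratio(beta_vbp)       # odds ratio per 1-point increase
aor_ten_points = odds_ratio(beta_vbp, 10)  # compounding over a 10-point increase
```

One design note: because the odds ratio compounds multiplicatively, a per-point aOR of 1.18 implies a much larger difference in odds across the observed range of program-level VBP scores.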

RESULTS:

Value Training Measure - Hospital and Program-Level Value Based Purchasing Program Scores

The 100 internal medicine GME programs included in our sample were affiliated with 158 teaching hospitals and had an average of 103 residents each. Compared with national averages, teaching hospitals affiliated with programs in our sample had worse VBP total performance scores (training hospitals' average score 38.9 [SD 10.3] versus national average 41.7 [SD 12.6], p=0.002; all other teaching hospitals' average 37.9 [SD 11.2], p=0.3; all non-teaching hospitals' average 42.6 [SD 12.7], p=0.0004), cost-efficiency scores (2.9 [SD 4.3] versus national average 4.6 [SD 6.2], p<0.001), and patient satisfaction scores (10.7 [SD 5.2] versus national average 13.7 [SD 8.6], p<0.001). The teaching hospitals' scores did not differ from national average process of care and patient outcome scores.27

Program-level VBP total performance scores, calculated as the weighted average of their affiliated teaching hospital scores (Table 2), showed considerable variation in quality and cost-efficiency scores across programs. Figure 2 displays GME program VBP cost-efficiency and composite quality (composite patient satisfaction, process, and outcome) domain scores. Within-program cost-efficiency and composite quality scores were not correlated.

Table 2.

Comparisons of Centers for Medicare and Medicaid Services Value Based Purchasing (VBP, Fiscal Year 2015) Program Scores and Characteristics of the US News and World Report Top 100 Internal Medicine Residency Training Programs and Their Affiliated Teaching Hospitals

Characteristics | Teaching Hospital Overall Sample, No. (%) or Mean (SD) | Hospitals of Programs with Top 20 VBP Scores, No. (%) or Mean (SD) | Hospitals of Programs with Bottom 20 VBP Scores, No. (%) or Mean (SD) | p-value, Hospitals from Top 20 Versus Bottom 20 VBP Programs | Program Overall Sample, Weighted Mean (SD) | Top 20 VBP Score Programs, Weighted Mean (SD) | Bottom 20 VBP Score Programs, Weighted Mean (SD) | p-value, Top 20 Versus Bottom 20 VBP Programs
Hospital Value Based Purchasing (VBP) Total Performance Score (sum all domains) 38.9 (10.3) 48.2 (6.3) 35.5 (8.4) <0.001 37.8 (9.1) 50.5 (5.0) 26.3 (3.0) <0.001
Patient Satisfaction Score 10.7 (5.2) 15.9 (4.9) 9.1 (4.8) <0.001 10.2 (5.0) 15.7 (5.6) 6.3 (2.2) <0.001
Process of Care Score 11.1 (3.3) 12.9 (2.8) 9.8 (3.1) <0.001 11.0 (3.1) 12.6 (2.7) 8.9 (2.5) 0.001
Patient Outcome Score 14.3 (6.2) 15.9 (5.8) 11.0 (6.3) 0.045 14.2 (5.4) 18.1 (4.7) 9.1 (3.6) <0.001
Medicare spending per beneficiary Score 2.9 (4.3) 3.4 (4.5) 2.1 (4.1) 0.23 2.5 (3.2) 4.1 (4.3) 1.9 (2.1) 0.043
Bed Size 522 (308) 564 (412) 485 (269) 0.25 -- -- -- --
Census to Program Size Ratio -- -- -- -- 4.7 (3.1) 6.2 (4.2) 4.8 (4.0) 0.31
Ownership
 For Profit 10 (6.3) 2 (6.9) 2 (6.25) 0.92 -- -- -- --
 Non-profit – Private 74 (46.8) 15 (51.7) 12 (37.5) 0.26 -- -- -- --
 Non-profit – Other 32 (20.3) 7 (24.1) 9 (28.1) 0.72 -- -- -- --
 Public/Municipal 42 (26.6) 5 (17.3) 9 (28.1) 0.31 -- -- -- --
Region
 Northeast 43 (27.2) 8 (27.6) 10 (31.3) 0.75 -- -- -- --
 Southeast 32 (20.3) 10 (34.5) 5 (15.6) 0.09 -- -- -- --
 Central 36 (22.8) 6 (20.7) 7 (21.9) 0.91 -- -- -- --
 South 10 (6.3) 0 (0.0) 4 (12.5) 0.05 -- -- -- --
 West 37 (23.4) 5 (17.2) 6 (18.8) 0.88 -- -- -- --
Urban rural status
 Large urban 130 (82.3) 24 (82.8) 27 (84.4) 0.87 -- -- -- --
 Other urban 25 (15.8) 5 (17.2) 5 (15.6) 0.87 -- -- -- --
 Rural 3 (1.9) 0 (0.0) 0 (0.0) -- -- -- -- --
DSH index 37.6 (19.5) 28.3 (15.4) 39.2 (19.2) 0.020 -- -- -- --
Case mix index 1.8(0.3) 1.9 (0.4) 1.8 (0.3) 0.042 -- -- -- --
Nurse bed ratio 1.8 (0.8) 1.9 (0.8) 1.8 (0.7) 0.91 -- -- -- --

Figure 2. Internal Medicine Residency Program Teaching Hospital Quality and Cost-Efficiency.


Program Quality and Cost-Efficiency Scores measured using Centers for Medicare and Medicaid Services (CMS) inpatient Value-Based Purchasing (VBP) quality and cost component measures. The Program Cost-Efficiency Score is the CMS-reported Medicare Part A and Part B hospital payments for average price-standardized and clinically risk-adjusted spending per beneficiary divided by expected hospital payments. The Program Quality Score combines each hospital's clinical processes of care (20% of the VBP total performance score), patient satisfaction (30%), and patient outcomes (30%) domain scores, as reported by CMS. Both scores are aggregated as weighted averages across each residency program's teaching hospitals according to the proportion of time the typical resident spends at each hospital over three years of residency training.

Differences between Programs by Value Based Purchasing Program Scores

Program-level VBP TPS, patient satisfaction, process of care, patient outcome, and cost-efficiency domain scores were higher among the top 20 than the bottom 20 ranked programs (Tables 2 and 3). The 20 top-performing programs on VBP TPS were affiliated with hospitals that had a higher average case mix index (p=0.04), were less often located in the South (p=0.05), and were less likely to be safety net hospitals based on average DSH Index (p=0.02), compared with hospitals affiliated with the programs with the lowest 20 VBP TPS values. We found a linear correlation between program VBP score and board exam pass rates (linear regression coefficient 0.0014 [95% CI 0.0004–0.0024], p=0.006) and only a weakly positive correlation with US News and World Report rankings (Spearman correlation coefficient 0.27, p=0.008).

Table 3.

Quality and Cost-Efficiency Assessment of Residency Programs with the Top 20 Weighted Centers for Medicare and Medicaid Services Hospital Value-Based Purchasing (VBP) Program Overall Scores, Listed in Alphabetical Order

Residency Program Name | Patient Satisfaction Ranking | Clinical Process of Care Ranking | Patient Outcomes Ranking | Medicare Payment per Beneficiary Ranking* | Affiliated Training Hospital Name | Individual Hospital VBP Ranking
University of Alabama Medical Center 4 16 34 18 UNIVERSITY OF ALABAMA HOSPITAL 12
Beth Israel Deaconess Medical Center 40 8 15 37 BETH ISRAEL DEACONESS MEDICAL CENTER 39
Cedars-Sinai Medical Center 32 42 5 37 CEDARS-SINAI MEDICAL CENTER 30
Cleveland Clinic Florida 13 57 2 37 CLEVELAND CLINIC HOSPITAL 15
Cleveland Clinic Foundation in OH 7 92 19 29 CLEVELAND CLINIC 33
Duke University 20 5 14 29 DUKE UNIVERSITY HOSPITAL 9
DUKE REGIONAL HOSPITAL 74
Eastern Virginia Medical School 5 11 56 31 SENTARA NORFOLK GENERAL HOSPITAL 23
SENTARA LEIGH HOSPITAL 47
Emory University 12 52 39 12 GRADY MEMORIAL HOSPITAL 11
EMORY UNIVERSITY HOSPITAL 76
EMORY UNIVERSITY HOSPITAL MIDTOWN 41
Massachusetts General Hospital 11 47 26 37 MASSACHUSETTS GENERAL HOSPITAL 36
NEWTON-WELLESLEY HOSPITAL 24
Mayo Clinical College of Medicine Arizona 1 19 13 6 MAYO CLINIC HOSPITAL 4
Mayo Clinic College of Medicine Jacksonville 2 58 29 29 MAYO CLINIC 16
Mayo Clinic College of Medicine Rochester 3 35 40 11 MAYO CLINIC HOSPITAL ROCHESTER 10
New York Presbyterian Hospital Columbia Campus and New York Presbyterian Hospital Cornell Campus 71 23 3 18 NEW YORK-PRESBYTERIAN HOSPITAL 22
Olive View/UCLA Medical Center 70 76 48 1 LAC/OLIVE VIEW-UCLA MEDICAL CENTER 26
RONALD REAGAN UCLA MEDICAL CENTER 82
Oregon Health and Science University 24 34 24 14 OHSU HOSPITAL AND CLINICS 56
Rush University Medical Center 16 15 16 27 RUSH UNIVERSITY MEDICAL CENTER 13
University of Kansas School of Medicine 9 26 59 29 UNIVERSITY OF KANSAS HOSPITAL 34
University of North Carolina Hospitals 6 4 49 19 UNIVERSITY OF NORTH CAROLINA HOSPITAL 14
WAKEMED, RALEIGH CAMPUS 84
University of Rochester 42 2 77 4 STRONG MEMORIAL HOSPITAL 28
HIGHLAND HOSPITAL 20
UPMC Medical Education 46 3 12 37 UPMC PRESBYTERIAN SHADYSIDE 32

Correlations of Value Based Purchasing Scores with Program Director Perceptions of Value Training

Eighty-eight percent of program directors agreed or strongly agreed that GME had a responsibility to help contain health care costs, while only 36% (n=27) agreed or strongly agreed that faculty in their program modeled high-value care. Seventy-four percent of programs had high-value care curricula in place (20%, n=16) or in development (53%, n=41). These factors correlated with program director agreement (61%, n=46) that their residents were at least somewhat prepared to incorporate value and cost into their medical decisions.

A one-point increase in overall program-level VBP TPS was associated with a 1.18-fold increase in the odds of program director agreement that GME programs had a responsibility to help contain costs (VBP TPS aOR 1.18 [95% CI 1.002–1.43], p=0.04) and a 1.07-fold increase in the odds of agreement that faculty model high-value care (aOR 1.07 [95% CI 1.006–1.14], p=0.03), controlling for program size, region, and faculty and teaching site characteristics. Though it did not reach statistical significance, every point increase in program-level VBP TPS was associated with a 1.07-fold increase in the odds of agreement that residents were prepared to incorporate value in their medical decisions (aOR 1.07 [95% CI 0.99–1.15], p=0.09). These effect sizes remained similar after including the program-level weighted-average disproportionate share hospital index, teaching-hospital-census-to-residency-size ratio, subspecialization rate, or board certification rate in sensitivity analyses. Programs with the top 20 VBP TPS values were more likely to have adopted high-value care curricula than the bottom 20 programs (88% versus 65%; n=15 of 17 versus n=13 of 20), but VBP TPS did not significantly correlate with cost-conscious care curriculum adoption in adjusted regression analyses.

Differences by Safety-Net Hospital and Veterans Administration Teaching Hospital Affiliation

The thirty-four programs affiliated with safety-net hospital teaching sites had a lower average VBP TPS (33.9 [SD 8.0] versus 40.0 [SD 9.1], p=0.002), driven by lower average patient experience and patient outcome domain scores despite superior cost-efficiency. A majority (58.3%) of programs had Veterans Administration (VA) hospital training sites, but residents in the overall sample spent only 13.5% of their training time at VA sites, on average. Programs with and without VA sites did not differ significantly in their VBP scores or APDIM survey value training measures.

DISCUSSION:

We used publicly available CMS measures of teaching hospital quality and cost-efficiency to measure value training and to compare clinical experiences and exposures to high-value care across internal medicine residency programs. Overall, teaching hospitals affiliated with the most respected residency programs performed below national averages on composite clinical value scores, though there was wide variation in both cost-efficiency and quality. While many GME programs excelled in specific areas of high-value care, very few were top performers across the board in process, satisfaction, outcomes, and cost-efficiency. In our sample of GME programs, higher likelihood of program director agreement that cost-consciousness was a responsibility of residency training, that faculty modeled high-value care, and that residents were prepared to incorporate value into medical decisions all showed relationships with VBP TPS measures. Affiliated safety net teaching hospitals predicted poorer program rankings in our analyses, as did region and case mix index, consistent with prior literature.25,28-29 Our results suggest that internal medicine residents' exposure to high-value care in teaching hospitals may be quantified using publicly reported quality and cost-efficiency data.

Understanding the relative value training strengths of GME programs provides transparency and could facilitate improvements in training at a time when reducing healthcare costs and improving quality are a national priority. This information can help individual GME programs assess their need to improve value-based training and better prepare physicians for practice in the increasingly value-driven health care system. Programs that identify their teaching hospitals as poor performers in cost-efficiency, for example, could supplement their curricula with a specific focus on health care cost-awareness or create new rotations at hospitals with high VBP cost-efficiency scores. Best practices could be identified from top performing programs and be applied to lower performers to advance value-based education more quickly and uniformly.

Limitations

Though measures of clinical cost and quality in the VBP program have the advantages of being well established and familiar because of their use for reimbursement, their use has important shortcomings that we attempted to address in this study. CMS's risk adjustment precludes adjustment for race, ethnicity, or socioeconomic status of either patients or hospital service areas, and some component measures in the VBP score calculation are affected by care from clinical teams in which internal medicine residents have limited participation. Either of these factors could introduce bias or imprecision when using the VBP TPS as a measure of value training. However, we found that the size of the associations between CMS VBP TPS and program director perceptions of value training persisted even after adjusting for hospital- and program-level proxies for patient population social risk and for the proportion of hospital discharges unlikely to be cared for by resident inpatient teams. Though program directors' perceptions of value training may be imprecise and imperfect owing to the biases inherent in survey methods, no other value training measures are available at the residency level. VBP scoring incorporates benchmark thresholds, year-to-year improvement, and fixed component weighting to allocate points toward the final score, which can complicate interpretation of VBP scores when comparing hospitals head-to-head. While the VBP data we used are derived only from Medicare patients, quality and relative costs among Medicare patients are routinely used to infer patterns of care for patients covered by other payers.30 Despite these caveats, the VBP scores are the most comprehensive publicly reported value measures available.

Conclusion

Our findings demonstrate the utility of publicly-available teaching hospital clinical quality and cost-efficiency data to measure the extent to which trainees in IM residency programs are exposed to high-value care settings that prepare them for high value practice. This information can help educators understand the inpatient environments in which residents are immersed, identify sites that best deliver value training, supplement learning at sites that lag in value-based care, and accelerate diffusion of best practices in value training within and across programs.

Acknowledgements:

The authors wish to acknowledge Drs. Kavita Patel and Bob Brook for their input regarding the study design.

Source/Funding: Dr. Schickedanz's and Dr. Gupta's time for this project was funded by the Robert Wood Johnson Clinical Scholars Program and the UCLA National Research Service Award Primary Care and Health Services Research Fellowship.

Footnotes

Other Disclosures: Dr. Gupta is the Director of Outreach and Evaluations and the Director of the Teaching Value in Healthcare Learning Network at Costs of Care. Dr. Schickedanz is co-chair of the Healthcare Value Special Interest Group of the Academic Pediatric Association. Dr. Arora is the Director of Educational Initiatives at Costs of Care, receives royalties from McGraw Hill, and is a member of the Board of Directors for the American Board of Internal Medicine. She is also the former chair of the Association of Program Directors of Internal Medicine Survey Committee and has served as a rankings panelist for the US News and World Report Hospital Rankings for Common Core Conditions. Dr. Braddock is Vice Dean for Education at the David Geffen School of Medicine at UCLA and past chair of the American Board of Internal Medicine.

Declaration of Conflicts of Interest: The authors declare there is no conflict of interest.

REFERENCES:

1. Eden J, Berwick D, Wilensky G. Graduate Medical Education That Meets the Nation’s Health Needs. Washington, DC: Institute of Medicine of the National Academies; 2014.
2. Medicare Payment Advisory Commission. Aligning Incentives in Medicare. June 2010. http://www.medpac.gov/documents/Jun10_EntireReport.pdf.
3. Hackbarth G, Boccuti C. Transforming graduate medical education to improve health care value. N Engl J Med. 2011;364(8):693–695.
4. Sklar DP. Graduate medical education and the Institute of Medicine report. Acad Med. 2014;89(12):1575–1577.
5. Iglehart JK. Institute of Medicine report on GME — a call for reform. N Engl J Med. 2015;372(4):376–381.
6. Andolsek KM. Chasing perfection and catching excellence in graduate medical education. Acad Med. 2015;90(9):1–5.
7. Gupta R, Arora VM. Merging the health system and education silos to better educate future physicians. JAMA. 2015;314(22):2349–2350.
8. Korenstein D. Charting the route to high-value care: the role of medical education. JAMA. 2015;314(22):2359–2361.
9. Chaudhry SI, Lien C, Ehrlich J, Lane, et al. Curricular content of internal medicine residency programs: a nationwide report. Am J Med. 2014;127(12):1247–1254.
10. Gupta R, Moriates C, Harrison JD, et al. Development of a high value care culture survey: a modified Delphi process and psychometric evaluation. BMJ Qual Saf. 2016; in press.
11. Arora A, True A. What Kind of Physician Will You Be? Variation in Health Care and Its Importance for Residency Training. Dartmouth Atlas Project. Dartmouth Institute for Health Policy and Clinical Practice; 2012.
12. Flanigan S, Morse R. Methodology: 2016 Best Medical Schools Rankings. US News and World Report. 2015.
13. Ryskina KL, Halpern SD, Minyanou NS, Goold SD, Tilburt JC. The role of training environment care intensity in US physician cost consciousness. Mayo Clin Proc. 2015;90(3):313–320.
14. Bowen J, Salerno SM, Chamberlain JK, Eckstrom E, Chen HL, Brandenburg S. Changing habits of practice: transforming internal medicine residency education in ambulatory settings. J Gen Intern Med. 2005;20(12):1181–1187.
15. Stammen LA, Stalmeijer RE, Paternotte E, Pool AO, Driessen EW, Scheele F, et al. Training physicians to provide high-value, cost-conscious care: a systematic review. JAMA. 2015;314(22):2384–2400.
16. Leep Hunderfund AN, Dyrbye LN, Starr SR, Mandrekar J, Naessens JM, Tilburt JC, et al. Role modeling and regional health care intensity: US medical student attitudes toward and experiences with cost-conscious care. Acad Med. 2016:1–9.
17. Chen C, Petterson S, Phillips R, Bazemore A, Mullan F. Spending patterns in region of residency training and subsequent expenditures for care provided by practicing physicians for Medicare beneficiaries. JAMA. 2014;312(22):2385–2393.
18. Mehrotra A, Reid RO, Adams JL, Friedberg MW, McGlynn EA, Hussey PS. Physicians with the least experience have higher cost profiles than do physicians with the most experience. Health Aff (Millwood). 2012;31(11):2453–2463.
19. Centers for Medicare and Medicaid Services. Medicare Program. https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html?redirect=/Hospital-Value-Based-Purchasing/. Accessed August 27, 2015.
20. Gray DM, et al. The link between clinically validated patient safety indicators and clinical outcomes. Am J Med Qual. 2016.
21. Centers for Medicare and Medicaid Services. Medicare Program; Hospital Inpatient Value-Based Purchasing Program. Fed Regist. 2011;76(88):26496. http://www.gpo.gov/fdsys/pkg/FR-2011-05-06/pdf/2011-10568.pdf. Accessed August 27, 2015.
22. Patel MS, Reed DA, Loertscher L, McDonald FS, Arora VM. Teaching residents to provide cost-conscious care: a national survey of residency program directors. JAMA Intern Med. 2014;174(3):470–472.
23. Olmsted MG, Geisen E, Murphy J, et al. Methodology: US News and World Report Best Hospitals 2014–15. US News and World Report. Research Triangle Institute International; 2014.
24. Joynt KE, et al. Should Medicare value-based purchasing take social risk into account? N Engl J Med. 2016.
25. Chatterjee P, Joynt KE, Orav J, Jha A. Patient experience in safety-net hospitals. Arch Intern Med. 2012;172(16):1204–1210.
26. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. Report to Congress: Social Risk Factors and Performance Under Medicare’s Value-Based Payment Programs. 2016.
27. Medicare Quality Improvement Organizations contracted with Centers for Medicare and Medicaid Services and US Department of Health and Human Services. Achievement Thresholds and Benchmarks for FY 2014 and 2015 Hospital VBP Program. West Virginia; 2014. Available from: http://www.qipa.org/getattachment/2295cac6-0014-4db3-9a0e-934817a3b9aa/VBP-Threshold-and-Benchmark-Scores-FY-2014-and-201.aspx. [No longer available.] Accessed August 27, 2015.
28. Jha AK, Orav EJ, Epstein AM. Low-quality, high-cost hospitals, mainly in South, care for sharply higher shares of elderly black, Hispanic, and Medicaid patients. Health Aff (Millwood). 2011;30(10):1904–1911.
29. Borah BJ, Rock MG, Wood DL, Roellinger DL, Johnson MG, Naessens JM. Association between value-based purchasing score and hospital characteristics. BMC Health Serv Res. 2012;12:464.
30. Cooper Z, et al. The Price Ain’t Right? Hospital Prices and Health Spending on the Privately Insured. Working Paper No. 21815. National Bureau of Economic Research; 2015.