Abstract
Importance
A variety of state Medicaid reforms are underway, but the relative performance of different approaches is unclear.
Objective
To compare performance in Oregon’s and Colorado’s Medicaid Accountable Care Organization (ACO) models.
Design, Setting, and Participants
Oregon initiated its Medicaid transformation in 2012, supported by a $1.9 billion federal investment, moving the majority of Medicaid enrollees into sixteen Coordinated Care Organizations (CCOs), which managed care within a global budget. Colorado initiated its Medicaid Accountable Care Collaborative (ACC) in 2011, creating seven Regional Care Collaborative Organizations that received funding to coordinate care with providers and connect Medicaid enrollees with community services. We analyzed data spanning July 1, 2010 through December 31, 2014 (18 months of pre-intervention data and 24 months of post-intervention data, treating 2012 as a transition year) for 452,371 Oregon and 330,511 Colorado Medicaid enrollees, assessing changes in outcomes using difference-in-differences analyses.
Exposures
Both states emphasized a regional focus, primary care homes, and care coordination. Oregon’s CCO model was more comprehensive in its reform goals and in the imposition of downside financial risk.
Main Outcomes and Measures
Performance on claims-based measures of standardized expenditures and utilization for selected services, access, preventable hospitalizations, and appropriateness of care.
Results
Standardized expenditures for selected services declined in both states over the 2010–2014 period, but these decreases were not significantly different between the two states. Oregon's model was associated with reductions in emergency department visits (−6.28 per 1000 beneficiary months; 95% CI, −10.51 to −2.05) and primary care visits (−15.09 visits per 1000 beneficiary months; 95% CI, −26.57 to −3.61), and with improvements in acute preventable hospital admissions, three of four measures of access, and one of four measures of appropriateness of care.
Conclusions and Relevance
Two years into implementation, Oregon's and Colorado's Medicaid ACO models exhibited similar performance on standardized expenditures for selected services. Oregon's model, marked by a large federal investment and movement to global budgets, was associated with improvements in some measures of utilization, access, and quality, but Colorado's model paralleled Oregon's on a number of other metrics.
Introduction
Medicaid, the federal-state health insurance program for low-income individuals, has grown to cover more than 20% of the population nationally and accounts for a significant and growing portion of state budgets.1 This growth poses a substantial budgetary challenge, even for states choosing not to expand coverage through the Affordable Care Act (ACA). States are experimenting with a wide range of policies designed to control spending, including payment reforms that mirror aspects of Accountable Care Organizations (ACOs) in the Medicare and commercial markets.2 As of 2016, nine states had launched Medicaid ACOs, with eight more actively pursuing this model.3
In this paper, we compare the performance of two early adopters of the Medicaid ACO model, Oregon and Colorado. Oregon’s Medicaid transformation occurred in 2012. Supported in part by a $1.9 billion investment from the federal government, the state moved the majority (90%) of its Medicaid beneficiaries into sixteen Coordinated Care Organizations, or CCOs.4–9 CCOs are community-based organizations with governing boards that include representatives of the health care delivery system and consumers who reflect the community’s needs. Unlike most ACO models, CCOs accept full financial risk for their patient population and must manage all care (including mental health, addiction, and dental services) within a global budget. Oregon’s ambitious model has led some to refer to CCOs as “ACOs on steroids.”10
Colorado’s Medicaid Accountable Care Collaborative (ACC) reform was initiated in 2011, with the state creating seven Regional Care Collaborative Organizations (RCCOs). RCCOs receive per member per month funding to provide administrative support to improve connections between Medicaid enrollees, providers, and community services. Approximately 70% of Colorado Medicaid beneficiaries were enrolled in the ACC program by 2014. Unlike the Oregon CCO model, the ACC model maintained fee-for-service payments and did not impose downside financial risk on providers or RCCOs. Furthermore, Colorado did not receive federal investments on the scale of those provided to Oregon.
The objective of this study was to compare performance in Oregon’s CCO model to performance in the Colorado ACC model, using claims-based measures of expenditures, utilization, access, quality, and appropriateness of care. The Oregon and Colorado Medicaid agencies have described positive outcomes associated with their reforms, with both states reporting lower expenditures, reductions in utilization, and improvements in quality.9,11–13 However, a formal comparison allows for an assessment of the relative performance of a Medicaid ACO model focused on enhanced payment for care coordination and case management (Colorado) vs. a more comprehensive Medicaid ACO model predicated on a global budget and downside financial risk (Oregon). Assessing these impacts is particularly salient when viewed in the context of a nationwide trend toward ACO models and the need for evidence on their ability to slow utilization and improve access, quality, and outcomes.
Methods
We used a difference-in-differences approach, with the more intensive Oregon CCO intervention serving as the treatment group and the Colorado Medicaid program serving as the comparison group, for two years after the Medicaid reforms were implemented. This study was approved by the Institutional Review Board at Oregon Health & Science University.
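In generic notation (ours, not the authors'), this design corresponds to the standard two-group, two-period difference-in-differences regression:

$$
Y_{it} = \beta_0 + \beta_1\,\mathrm{OR}_i + \beta_2\,\mathrm{Post}_t + \beta_3\,(\mathrm{OR}_i \times \mathrm{Post}_t) + X_{it}'\gamma + \varepsilon_{it}
$$

where $\mathrm{OR}_i$ indicates enrollment in Oregon Medicaid, $\mathrm{Post}_t$ indicates the 2013–2014 post-intervention period, $X_{it}$ collects individual covariates, and $\beta_3$ is the difference-in-differences estimate of the CCO model's effect relative to Colorado.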
Study Populations
We obtained data from each state’s Medicaid agency and analyzed claims for 18 months of pre-intervention data (July 1, 2010 through December 31, 2011) and 24 months of post-intervention data (2013–2014), treating 2012 as a transition year. Our primary analyses focused on the population of individuals who were enrolled in both the pre-intervention and post-intervention periods, and enrolled for at least three months within a twelve-month window. We excluded individuals who were dually eligible for both Medicare and Medicaid. In Oregon, we excluded Medicaid enrollees who were not enrolled in CCOs because of special health needs,4 and Medicaid enrollees from one CCO (Cascade Health Alliance), which did not launch until August 2013.
The Colorado comparison group was restricted to Medicaid beneficiaries who were in the standard (non-ACC) Medicaid program in the 2010–2011 time period but covered by the ACC program for the 2013–2014 time period. Children who were eligible for Medicaid through the State Children’s Health Insurance Program (SCHIP) were excluded because they were not eligible for the ACC. We excluded individuals enrolled in managed care, since they were required to “opt out” of managed care into the ACC. Managed care penetration was low (<2%), with the exception of Denver and Mesa counties, which had substantially higher managed care penetration rates. To avoid potential selection bias, our analyses excluded residents of these two counties.
Propensity Score Weighting
We used propensity score weighting as a first step in adjusting for observable differences between the Oregon and Colorado groups. The propensity score variables included age, gender, rural residence, and Chronic Illness and Disability Payment System (CDPS) risk indicators.14 Propensity weights were applied across the Oregon and Colorado populations for all study periods, with each individual in each time period given a weight proportional to the probability of being in the Oregon Medicaid program in the fourth quarter of 2011, prior to the CCO intervention. This weighting approach adjusted for observable differences between the Oregon and Colorado populations as well as changes in the composition of each population over time.15 Additional details are provided in the Supplement.
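As an illustration only, the weighting step could be implemented as follows. The column names and the literal use of the predicted probability as the weight are our assumptions based on the description above, not the authors' code:

```python
# Minimal sketch of the propensity-weighting step (illustrative assumptions,
# not the authors' implementation).
import pandas as pd
from sklearn.linear_model import LogisticRegression

def propensity_weights(person_periods: pd.DataFrame,
                       baseline: pd.DataFrame,
                       covariates: list) -> pd.Series:
    """Fit Pr(Oregon | X) on the Q4-2011 cross-section, then weight every
    person-period observation in proportion to that predicted probability."""
    model = LogisticRegression(max_iter=1000)
    model.fit(baseline[covariates], baseline["oregon"])  # oregon: 1 = OR, 0 = CO
    p = model.predict_proba(person_periods[covariates])[:, 1]
    return pd.Series(p, index=person_periods.index, name="weight")

# Usage (placeholder column names mirroring the covariates named in the text):
# w = propensity_weights(panel, panel[panel["quarter"] == "2011Q4"],
#                        ["age", "female", "rural"] + cdps_indicator_cols)
```

Because the score is re-applied to each time period, the weights adjust both for baseline differences between states and for compositional drift within each state over time.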
Outcome Variables
In Oregon’s managed care and CCO environment, capitation and other alternative payment mechanisms result in “encounter” claims which include information on diagnosis and procedure but record paid amounts as zero. To create a composite measure that could be compared across states, we created a measure of standardized expenditures using the following steps. First, we identified the set of procedure codes and services that were common across both states and included as one of four categories of service in the Berenson-Eggers Type of Service (BETOS) classification. These services included evaluation & management, imaging, tests, and procedures. Next, we repriced these claims with “standardized prices,” using the Oregon 2014 Medicaid fee schedule to attach standardized prices to claims in both states according to procedure and site of service codes. We repriced inpatient facility services on a per diem basis. Additional details are provided in the Supplement. This approach creates a measure of “standardized expenditures,” representing typical Medicaid expenditures for selected services across both states.
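A simplified sketch of the repricing step under these assumptions follows; the table and column names ("hcpcs", "site_of_service", "std_price") are hypothetical stand-ins for the Oregon 2014 Medicaid fee schedule crosswalk described above and in the Supplement:

```python
# Illustrative repricing sketch; not the authors' code.
import pandas as pd

def standardize_expenditures(claims: pd.DataFrame,
                             fee_schedule: pd.DataFrame) -> pd.DataFrame:
    """Attach standardized prices to claims by procedure and site-of-service
    codes, so that zero-paid encounter claims are priced identically to
    fee-for-service claims in both states."""
    repriced = claims.merge(fee_schedule, on=["hcpcs", "site_of_service"],
                            how="inner")  # keep only codes common to both states
    repriced["std_expenditure"] = repriced["units"] * repriced["std_price"]
    return repriced

# Inpatient facility claims are repriced separately on a per diem basis, e.g.:
# ip["std_expenditure"] = ip["covered_days"] * per_diem_rate
```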
Utilization measures included emergency department (ED) visits, primary care visits, and acute inpatient days. To further investigate changes in access, we constructed measures from the Healthcare Effectiveness Data and Information Set (HEDIS):16 well-child visits in the third, fourth, fifth and sixth years of life; children and adolescents’ access to preventive/ambulatory health services (members 1–6 years who had an ambulatory or preventive care visit during the year, or 7–19 years who had an ambulatory or preventive care visit during the last two years); adolescent well care visits (members 12–21 years who had at least one comprehensive well-care visit during the year); and the percentage of adults 20–44 who had an ambulatory or preventive care visit during the year. We also analyzed performance on four measures of appropriateness or “low-value” care (appropriate medications for individuals with asthma; testing for children with pharyngitis; imaging studies for low back pain; and imaging for uncomplicated headache), hypothesizing that these services might be areas of focus for organizations seeking to reduce spending and improve quality.17 Finally, we assessed changes in quality by estimating changes in “potentially avoidable” ED visits18 and preventable hospitalizations as defined by the Agency for Healthcare Research and Quality Prevention Quality Indicators (PQI).19 Following Joynt and colleagues,20 we did not include admission source as a variable in our PQI algorithm.
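To make the measure construction concrete, here is a sketch of one HEDIS-style rate (adolescent well-care visits) computed from member and visit tables. The "is_well_care" flag is a hypothetical stand-in for the official HEDIS value sets, which we do not reproduce here:

```python
# Illustrative HEDIS-style rate computation based on the definition in the
# text (members 12-21 with at least one comprehensive well-care visit).
import pandas as pd

def adolescent_well_care_rate(members: pd.DataFrame,
                              visits: pd.DataFrame,
                              year: int) -> float:
    """Share of members aged 12-21 with at least one comprehensive
    well-care visit during the measurement year."""
    eligible = members[(members["age"] >= 12) & (members["age"] <= 21)]
    wc_ids = visits.loc[(visits["year"] == year) & visits["is_well_care"],
                        "member_id"].unique()
    return eligible["member_id"].isin(wc_ids).mean()
```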
Statistical Analyses
We used standardized differences to assess the comparability of the Medicaid population and the propensity weighted comparison group.21,22 We used propensity score weighted linear models to assess changes in expenditures, utilization, access measures, preventable ED visits and hospitalizations, and the provision of low-value care.
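For reference, the standardized difference for a continuous covariate compares group means in pooled standard deviation units (the form given by Austin21,22), conventionally multiplied by 100:

$$
d = \frac{100\,(\bar{x}_{\text{treatment}} - \bar{x}_{\text{comparison}})}{\sqrt{\left(s^2_{\text{treatment}} + s^2_{\text{comparison}}\right)/2}}
$$

with an analogous proportion-based formula for binary covariates; absolute values below 10 are commonly treated as negligible imbalance (see Table 2).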
Our covariates included age, gender, CDPS risk indicators, rural residence indicators, an indicator for individuals in Oregon, indicator variables for the second, third, and fourth quarters of the year (to control for seasonality), an indicator for the post-intervention period (2013–2014), and the interaction between the Oregon population and post-intervention indicators, which produced our estimates of the policy effects. We tested the assumption of parallel trends in the treatment and comparison groups for utilization measures in the pre-period.23,24 Measures of access, low-value care, and PQI required one-year lookbacks and were restricted to continuously enrolled individuals with annual assessments in 2011, 2013, and 2014. Standard errors were adjusted for clustering at the primary care service area level.25,26
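A minimal sketch of how such a propensity-weighted, cluster-robust model could be estimated, assuming hypothetical column names (outcome, oregon, post, pcsa, weight) rather than the authors' actual variables:

```python
# Sketch of the estimating regression: a propensity-weighted linear model
# with standard errors clustered by primary care service area (PCSA).
import statsmodels.formula.api as smf

def did_effect(df):
    """Return the DiD point estimate and 95% CI for the oregon x post term."""
    fit = smf.wls(
        "outcome ~ oregon * post + age + female + rural + cdps_risk + C(quarter)",
        data=df,
        weights=df["weight"],  # propensity weights from the prior step
    ).fit(cov_type="cluster", cov_kwds={"groups": df["pcsa"]})
    return fit.params["oregon:post"], fit.conf_int().loc["oregon:post"]
```

The coefficient on the oregon-by-post interaction corresponds to the policy effects reported in Tables 3 through 5.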
Sensitivity Analyses
We conducted multiple analyses to assess the sensitivity of our results to propensity score specifications, the definition of the transition period, study population definitions, and other assumptions, as described in the Supplement.
Results
Table 1 compares delivery system and reform components of the Oregon and Colorado programs. The reforms were similar in their regional focus and emphasis on primary care, but the Oregon program was more comprehensive in its scope of covered benefits and in its use of global budgets and downside financial risk as mechanisms for cost control. The substantial CMS investment in Oregon provided funding for administrative staff, data infrastructure, implementation resources, training, and related services, and ensured that the transformation efforts would not be hampered by reductions in reimbursement rates.
Table 1.
Coverage Model | CCO (Oregon) | ACC/RCCO (Colorado) |
---|---|---|
Regional Focus | Regional and community resources support the coordination of care across programs (16 CCOs covering entire state) | Same as Oregon (7 RCCOs cover entire state) |
Enrollment of beneficiaries | Automatic enrollment with exceptions primarily for individuals with special health needs. Approximately 90% of Medicaid population enrolled in CCOs in June 2013. | Automatic enrollment based on primary care provider attribution†; primary care providers could opt in to ACC program. Approximately 47% of Medicaid population enrolled in RCCOs in June 2013; 58% by June 2014. |
Primary Care Medical Homes | Beneficiaries are assigned a primary care medical home, which serves as the primary agency for coordinating care | Same as Oregon |
Incentive Measures | 17 “Incentive Measures”; CCOs eligible for bonus payments of up to 3% of global budget in 2014. | 3–4 core measures, updated annually; RCCOs eligible for bonus payments of up to $1.00 per member per month |
Cost Savings | Requirement to slow spending growth rate to 4.4% in 2014 and 3.4% in 2015 | Incremental reductions anticipated but not required |
Financing | Global budget: risk adjusted, per capita, with CCOs at financial risk | FFS, with RCCOs and primary care providers receiving per-member payment to support care coordination |
Examples of delivery system initiatives | High utilizer programs; programs to reduce ED utilization; hospital-to-home transition programs; support for social services; alternative payment models designed to move away from fee-for-service and decrease incentives for high-volume, high-intensity care; integration of oral and mental health | High utilizer programs; programs to reduce ED utilization; support for social services; centralized data repository to track and report clinic performance |
Investments | CMS provided approximately $1.92 billion over 5 years (2012–2017) | State-based investment of approximately $155 million between 2011 and 2014 |
† Most Medicaid enrollees in Colorado were covered by the traditional fee-for-service Medicaid program; individuals in Colorado Medicaid managed care organizations were generally required to opt out of managed care in order to be enrolled in the ACC program.
There were 452,371 Oregon Medicaid enrollees and 330,511 Colorado Medicaid enrollees included in analyses. After propensity score weighting, differences in enrollee clinical and demographic characteristics were small, although the propensity-weighted Colorado group was, on average, slightly younger than the Oregon group (Table 2). Investigation of the pre-intervention parallel trends assumption indicated no significant difference in quarterly trends across most expenditure and utilization measures,23,27 with exceptions for standardized expenditures for procedures and the primary care visit utilization measure (eTable 1 in the Supplement).
Table 2.
Characteristic | Oregon Medicaid | Colorado Medicaid | Standardized Difference† |
---|---|---|---|
Enrollees (N) | 452,371 | 330,511 | |
Age (yr) | | | −13.0 |
Mean | 17.4 | 16.0 | |
Median | 12 | 12 | |
Interquartile range | 6–24 | 6–25 | |
Female sex | 54.5% | 55.4% | 3.1 |
CDPS risk score | | | −4.2 |
Mean | 1.0 | 1.0 | |
Median | 0.6 | 0.6 | |
Interquartile range | 0.5–1.0 | 0.5–1.0 | |
Rural residence | 27.8% | 27.1% | −1.2 |
† A standardized difference of less than 10 may be regarded as indicative of a negligible difference in the prevalence of a covariate between the treatment and control groups.
Unadjusted standardized expenditures decreased in Oregon relative to Colorado, but after adjusting for demographics and health risk, there was no significant difference in per member per month standardized expenditures for Oregon’s Medicaid enrollees ($2.00; 95% CI, −$0.79 to $4.78); positive values indicate higher growth in standardized expenditures in Oregon relative to Colorado (Table 3). In general, performance on standardized expenditures for most measures was similar in the first and second years, with some exceptions. For example, relative to Colorado, standardized expenditures for inpatient services were significantly higher for Oregon in the second year of implementation ($4.37; 95% CI, $0.01 to $8.73).
Table 3.
Oregon Medicaid beneficiaries, N = 452,371; Colorado Medicaid beneficiaries, N = 330,511. Effect columns show changes in Oregon Medicaid compared with changes in Colorado Medicaid† (95% CI).
Utilization and Expenditure Measures | Oregon: 7/2010–12/2011 | Oregon: 2013–2014 | Oregon: Change | Colorado: 7/2010–12/2011 | Colorado: 2013–2014 | Colorado: Change | Average 2-year effect | Year 1 (2013) effect | Year 2 (2014) effect |
---|---|---|---|---|---|---|---|---|---|
Standardized Expenditures (total, US $) | 103 | 97 | −6 | 88 | 87 | −1 | 2.00 (−0.79, 4.78) | 0.52 (−2.47, 3.51) | 3.45 (−0.82, 7.72) |
Spending by BETOS code (US $) | |||||||||
Evaluation & Management | 30 | 28 | −2 | 25 | 25 | 0 | −0.70 (−5.99, −1.54) | 0.06 (−0.74, 0.85) | −1.47* (−2.40, −0.54) |
Imaging | 7 | 7 | 0 | 7 | 7 | 0 | 0.32* (0.14, 0.50) | 0.16 (−0.02, 0.34) | 0.48* (0.25, 0.71) |
Procedures | 15 | 14 | −1 | 14 | 13 | −1 | 0.36 (−0.06, 0.78) | 0.41* (0.01, 0.80) | 0.32 (−0.27, 0.91) |
Tests | 7 | 7 | 0 | 6 | 7 | 1 | −0.17 (−0.39, 0.05) | −0.30* (−0.50, −0.09) | −0.04 (−0.32, 0.25) |
Inpatient Facility | 44 | 41 | −3 | 37 | 35 | −2 | 2.18 (−0.46, 4.82) | 0.16 (−2.38, 2.69) | 4.19* (0.01, 8.37) |
Spending by site (US $) | |||||||||
Outpatient services | 50 | 49 | −1 | 43 | 44 | 1 | −0.26 (−1.37, 0.85) | 0.37 (−0.72, 1.46) | −0.89 (−2.18, 0.40) |
Inpatient services | 52 | 48 | −4 | 45 | 43 | −2 | 2.25 (−0.54, 5.04) | 0.11 (−2.60, 2.82) | 4.37* (0.01, 8.73) |
Utilization (Per 1000 beneficiary months) | |||||||||
Emergency department (ED) visits | 57.9 | 53.2 | −4.7 | 67.4 | 70.5 | 3.1 | −6.28* (−10.51, −2.05) | −4.58* (−8.72, −0.42) | −8.00* (−12.38, −3.61) |
Primary care visits | 313.7 | 277.3 | −36.4 | 270.1 | 257.5 | −12.6 | −15.09* (−26.57, −3.61) | −9.43* (−21.83, 2.98) | −20.70* (−31.75, −9.65) |
Inpatient days | 28.1 | 26.0 | −2.1 | 23.2 | 22.1 | −1.1 | 1.39 (−0.29, 3.06) | 0.10 (−1.51, 1.71) | 2.66* (0.01, 5.32) |
† Adjusted difference based on propensity score weighted difference-in-differences regression model.
* Statistically significant at P < 0.05.
Table 4 displays differences in standardized expenditure and utilization measures, stratified by adults and children. Relative to Colorado, Oregon’s growth in overall standardized expenditures was lower for adults than for children, but neither group showed statistically significant differences between the states. (Point estimates of the pooled analyses in Table 3 do not necessarily reflect weighted averages of Table 4’s stratified analyses, in part because Table 4 excludes individuals aging into adulthood over the study period, and in part because different propensity score weights were used for each analysis.) Patterns were generally similar across metrics for both children and adults, with some exceptions. For example, decreases in emergency department visits for Oregon relative to Colorado were statistically significant for adults but not children.
Table 4.
Values are changes in Oregon Medicaid compared with changes in Colorado Medicaid† (95% CI). Children: N = 283,816 (Oregon), N = 247,215 (Colorado); adults: N = 155,507 (Oregon), N = 77,720 (Colorado).
Expenditure and Utilization Measures | Children: Average 2-year effect | Children: Year 1 (2013) effect | Children: Year 2 (2014) effect | Adults: Average 2-year effect | Adults: Year 1 (2013) effect | Adults: Year 2 (2014) effect |
---|---|---|---|---|---|---|
Standardized Expenditures (total, US $) | −0.22 (−2.25, 1.82) | 0.22 (−1.88, 2.31) | −0.65 (−3.17, 1.87) | −4.27 (−12.65, 4.11) | −11.18* (−20.90, −1.46) | 2.57 (−9.70, 14.84) |
Spending by BETOS code (US $) | ||||||
Evaluation & Management | −0.90* (−1.57, −0.24) | −0.39 (−1.02, 0.24) | −1.42* (−2.18, −0.66) | −0.53 (−2.04, 0.99) | 0.61 (−0.98, 2.19) | −1.67 (−3.42, 0.09) |
Imaging | 0.12* (0.05, 0.19) | 0.07 (−0.01, 0.15) | 0.17* (0.08, 0.26) | 1.19* (0.58, 1.80) | 0.68* (0.09, 1.26) | 1.70* (0.95, 2.46) |
Procedures | 0.30* (0.04, 0.56) | 0.11 (−0.16, 0.39) | 0.48* (0.18, 0.79) | −0.20 (−1.33, 0.93) | 0.26 (−0.78, 1.31) | −0.66 (−2.30, 0.98) |
Tests | −0.04 (−0.12, 0.05) | −0.09* (−0.18, −0.00) | 0.01 (−0.09, 0.12) | −0.43 (−1.19, 0.32) | −0.65 (−1.37, 0.08) | −0.22 (−1.12, 0.68) |
Inpatient Facility | 0.31 (−1.39, 2.00) | 0.49 (−1.21, 2.20) | 0.12 (−2.07, 2.32) | −4.28 (−12.13, 3.58) | −12.10* (−20.53, −3.58) | 3.51 (−8.28, 15.29) |
Spending by site (US $) | ||||||
Outpatient services | −0.56 (−1.29, 0.18) | −0.27 (−1.03, 0.49) | −0.84 (−1.70, 0.01) | 0.78 (−1.61, 3.18) | 1.97 (−0.32, 4.26) | −0.40 (−3.29, 2.48) |
Inpatient services | 0.34 (−1.48, 2.16) | 0.47 (−1.37, 2.31) | 0.21 (−2.16, 2.58) | −5.05 (−13.49, 3.39) | −13.20* (−22.48, −3.92) | 3.05 (−9.38, 15.48) |
Utilization (Per 1000 beneficiary months) | ||||||
ED visits | −1.94 (−5.48, 1.61) | −0.88 (−4.18, 2.41) | −2.99 (−6.88, 0.90) | −11.63* (−17.85, −5.42) | −8.86* (−15.70, −2.02) | −14.45* (−20.60, −8.30) |
Primary care visits | −11.54* (−21.48, −1.60) | −10.80 (−21.80, 0.20) | −12.29* (−21.86, −2.72) | −16.79* (−33.57, 0.00) | −1.13 (−18.88, 16.63) | −32.37* (−49.16, −15.38) |
Inpatient days | 0.20 (−0.88, 1.27) | 0.31 (−0.77, 1.40) | 0.08 (−1.32, 1.47) | −2.72 (−7.71, 2.28) | −7.69* (−13.11, −2.27) | 2.23 (−5.26, 9.72) |
† Adjusted difference based on propensity score weighted difference-in-differences regression model.
* Statistically significant at P < 0.05.
Table 5 displays measures of access, avoidable ED visits, preventable hospitalizations (PQIs), and measures of low-value care. Of note, although primary care utilization decreased in both states, Oregon maintained or improved performance relative to Colorado on three of four measures of access (well-child visits for children ages 3–6: +2.7%, 95% CI 1.2% to 4.2%; adolescent well-care visits: +6.8%, 95% CI 5.2% to 8.3%; adult access to preventive ambulatory care: +1.3%, 95% CI 0.3% to 2.2%). Oregon also improved on avoidable ED visits, which decreased by 1.8 per 1000 member months (95% CI −3.1 to −0.4), and on acute PQI preventable hospitalizations (−1.0 per 1000 member months, 95% CI −1.6 to −0.4). Relative to Colorado, Oregon’s CCO transformation was not associated with statistically significant improvements in three of four measures of low-value care. However, avoidance of imaging for uncomplicated headache improved by 2.6% relative to Colorado (95% CI 1.4% to 3.8%).
Table 5.
Trend columns show observed performance; effect columns show changes in Oregon Medicaid compared with changes in Colorado Medicaid† (95% CI).
Metric | Oregon: 2011 | Oregon: 2013–2014 | Oregon: Change | Colorado: 2011 | Colorado: 2013–2014 | Colorado: Change | Average 2-year effect | Year 1 (2013) effect | Year 2 (2014) effect |
---|---|---|---|---|---|---|---|---|---|
Access Metrics | |||||||||
Child Access to PCP 1–6 years | 90.2 | 84.0 | −6.2 | 91.2 | 84.2 | −7.0 | 0.4 (−0.5, 1.3) | 0.1 (−0.9, 1.1) | 0.7 (−0.4, 1.8) |
Well Child Visits 3–6 | 60.4 | 57.5 | −2.9 | 65.7 | 59.2 | −6.5 | 2.7* (1.2, 4.2) | 2.5* (1.0, 4.0) | 2.8* (1.1, 4.6) |
Adolescent Well Care Visits | 27.1 | 27.2 | 0.1 | 39.8 | 33.8 | −6.0 | 6.8* (5.2, 8.3) | 5.2* (3.6, 6.8) | 8.4* (6.6, 10.1) |
Adult Access to Preventive Ambulatory Care, 20–44 | 83.1 | 81.0 | −2.1 | 86.5 | 81.1 | −5.4 | 1.3* (0.3, 2.2) | 1.9* (0.9, 2.8) | 0.7 (0.0, 1.8) |
Outcomes | |||||||||
Avoidable ED visits | 12.3 | 10.2 | −2.1 | 14.2 | 15.3 | 1.1 | −1.8* (−3.1, −0.4) | −1.4* (−2.7, −0.1) | −2.2* (−3.6, −0.8) |
PQI Overall (Rates per 1000) | 7.2 | 7.6 | 0.4 | 5.0 | 8.1 | 3.1 | −1.4 (−3.0, 0.2) | −2.6* (−4.8, −0.4) | −0.2 (−1.9, 1.4) |
PQI Acute | 2.8 | 2.6 | −0.2 | 2.5 | 3.5 | 1.0 | −1.0* (−1.6, −0.4) | −1.8* (−2.8, −0.8) | −0.2 (−1.0, 0.6) |
PQI Chronic | 4.4 | 5.0 | 0.6 | 2.5 | 4.6 | 2.1 | −0.4 (−1.7, 0.9) | −0.8 (−2.6, 1.0) | 0.0 (−1.4, 1.3) |
Low-Value/Appropriateness of Care | |||||||||
Appropriate Medications for Individuals with Asthma | 74.0 | 69.9 | −4.1 | 73.4 | 72.1 | −1.3 | −2.7 (−5.6, 0.2) | −3.1 (−6.9, 0.1) | −2.3 (−5.0, 0.0) |
Appropriate Testing for Children with Pharyngitis | 61.9 | 63.2 | 1.3 | 39.0 | 37.9 | −1.1 | 4.0 (−0.7, 8.7) | 2.8 (−2.7, 8.2) | 5.3* (0.1, 10.5) |
Appropriate Use of Imaging Studies for Low Back Pain | 78.9 | 77.9 | −1.0 | 73.7 | 76.2 | 2.5 | −2.9 (−6.1, 0.4) | −2.2 (−5.7, 1.3) | −3.6* (−7.1, −0.01) |
Avoidance of Head Imaging for Uncomplicated Headache | 15.9 | 16.2 | 0.3 | 18.5 | 17.0 | −1.5 | 2.6* (1.4, 3.8) | 1.3* (0.1, 2.6) | 3.8* (1.9, 5.7) |
† Adjusted difference based on propensity score weighted difference-in-differences regression model.
* Statistically significant at P < 0.05.
Our results were generally robust to sensitivity analyses, with some exceptions. For example, Oregon exhibited statistically significantly higher standardized expenditures and inpatient utilization than Colorado in some specifications, and the reduction in primary care visits observed in Oregon was not statistically significant in other models (eTable 1 in the Supplement).
Discussion
Oregon’s and Colorado’s reforms represent two early efforts to implement Medicaid ACOs. Relative to Colorado, Oregon’s CCO model was not associated with reductions in standardized expenditures for selected services in the first two years after implementation, although emergency department and primary care visit rates were significantly lower. Trends were similar among adults and children. Our results were generally consistent across a series of sensitivity analyses (eTable 1 in the Supplement).
Relative to Colorado, Oregon’s CCO transformation was associated with improvements on three of four HEDIS access measures and with reductions in avoidable emergency department visits and preventable acute hospital admissions. However, in other areas, Colorado performed as well as or better than Oregon. Inpatient care days, a potentially expensive service area, declined in both states, and in some specifications, reductions in Colorado were statistically greater than those in Oregon (eTable 1 in the Supplement).
Although Oregon’s and Colorado’s Medicaid ACO programs emphasized primary care homes, primary care visits decreased in the study populations of both states and were significantly lower in Oregon than in Colorado in 2014. These observed decreases may reflect a lack of primary care capacity attributable to the 2014 ACA Medicaid expansion, under which both states increased their Medicaid enrollment substantially: Colorado increased its Medicaid enrollment by 41% by July 2014, whereas Oregon increased its enrollment by 59%, the second largest increase in the country.28 Reductions in primary care visits should be monitored closely and may be a cause for concern if they reflect restricted access. Alternatively, a reduction in primary care visits could represent substitution toward case management. Oregon’s reduction in primary care visits was accompanied by relative or absolute improvements in most HEDIS access measures, suggesting a more efficient reconfiguration of primary care resulting in fewer visits while maintaining access.29
More than two years into their programs, both states can point to successes. In the 2011–2014 timespan, both states demonstrated reductions in measures of standardized expenditures and utilization. Relative to Colorado, Oregon experienced improvements in some access and quality measures, but did not generate the savings that might be anticipated given its ambitious reform model and the $1.9 billion federal investment supporting the CCO transformation.4,9 There are a few possible reasons why greater savings were not achieved. First, CCOs may need more time to fully implement changes that translate to greater savings. Second, spending not only slowed but was reduced in both states over the study period, and there may be limits to the extent to which relative savings can be achieved in a period of shrinking (as opposed to growing) health care spending. Furthermore, although Colorado did not have the benefit of a similar investment from the federal government, its ACC model has had apparent success: its focus on manageable, incremental steps has been followed by growth in enrollment, reductions in utilization, and improvement in some key performance indicators.12 From this vantage point, Colorado’s approach may represent a promising delivery system reform that is more feasible for other states to adopt than the larger-scope model pursued by Oregon.
Our study has important limitations. Our main outcome, standardized expenditures, focused on a narrow set of services with common codes across states. Estimates of total per capita Medicaid spending published by the Oregon Health Authority11 suggest that our measure of standardized expenditures accounts for approximately 42% of total spending on medical services. Our analysis did not include expenditures on prescription drugs, a growing portion of Medicaid spending. Thus, we do not test for differences in overall expenditures. It is possible, for example, that estimated savings for Oregon could be reversed if Oregon’s expenditures on other services grew at a faster rate than Colorado’s. Furthermore, while our use of standardized expenditures allowed us to attach prices to managed care-type encounter claims and to ensure consistency across states, it may have obscured reductions in spending that could have arisen through changes in overall reimbursement rates or in the intensity of inpatient services.
Our findings should also be interpreted in light of other large changes occurring in both states; neither represents a “business as usual” counterfactual. Nonetheless, our results are still useful in guiding expectations about Medicaid reforms. Furthermore, the lack of differences in pre-intervention trends between the states for most measures, coupled with a large number of sensitivity analyses, strengthens the reliability of our findings.
This evaluation should also be viewed in terms of broader trends in healthcare utilization. Our findings of slowed or reduced utilization in the 2010–2014 time period may not be entirely attributable to the Medicaid reforms in Colorado and Oregon, and may instead correspond with a period of historically low national health spending growth. National per capita Medicaid acute care spending increased at an average of 3.7% in the 2008–2011 period but decreased by 1.7% in the 2011–2012 period.30 Nonetheless, recent evidence suggests a resurgence in overall healthcare spending growth, which rose from 2.9% in 2013 to 5.3% in 2014.31 Given these trends, restraining utilization in future years may require additional effort from the Oregon and Colorado Medicaid ACO models.
Conclusions
A wide variety of Medicaid reforms are underway in the United States. Some states have emphasized a greater role for patient responsibility through the imposition of co-payments or health savings accounts, while others have emphasized delivery system reform as a path toward a higher quality and financially sustainable public insurance program. Our study of two years of post-intervention data from Medicaid ACO reforms found relative performance improvements in several aspects of care in Oregon relative to Colorado, but no significant differences in standardized expenditures for selected services. These results should be considered in the context of overall promising trends in both states. Continued evaluation of Medicaid reforms and payment models can inform the most effective approaches to improving and sustaining the value of this growing public program.
Supplementary Material
Acknowledgments
This research was funded by a grant from the NIH Common Fund Health Economics Program (1R01MH1000001) and the Silver Family Foundation, and by NIH grant R33 DA035640. Aside from financial support, the funding agencies had no role in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication. KJM and RCL were responsible for obtaining funding. KJM, SR, THM, and BKSC were responsible for data management and statistical analyses. KJM, DC, AM, DM, NW, JW, and RCL were responsible for qualitative data acquisition and management. KJM was responsible for the initial drafting of the manuscript, and all authors were involved in revising and editing. KJM had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. All authors agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. There are no conflicts of interest.
Contributor Information
K. John McConnell, Center for Health Systems Effectiveness, Oregon Health & Science University
Stephanie Renfro, Center for Health Systems Effectiveness, Oregon Health & Science University
Benjamin K.S. Chan, Center for Health Systems Effectiveness, Oregon Health & Science University
Thomas H.A. Meath, Center for Health Systems Effectiveness, Oregon Health & Science University.
Aaron Mendelson, Center for Health Systems Effectiveness, Oregon Health & Science University
Deborah Cohen, Department of Family Medicine, Oregon Health & Science University
Jeanette Waxmonsky, Jefferson Center for Mental Health, Office of Healthcare Transformation, Wheat Ridge, Colorado
Dennis McCarty, OHSU – PSU School of Public Health, Oregon Health & Science University
Neal Wallace, Hatfield School of Government, Portland State University
Richard C. Lindrooth, Department of Health Systems, Management and Policy, School of Public Health, University of Colorado Denver
References
- 1. National Association of State Budget Officers. State Expenditure Report: Examining Fiscal 2012–2014 State Spending. Washington, DC: National Association of State Budget Officers; 2014. Available from: https://www.nasbo.org/sites/default/files/State%20Expenditure%20Report%20(Fiscal%202012-2014)S.pdf
- 2. Kocot SL, Dang-Vu C, White R, McClellan M. Early experiences with accountable care in Medicaid: special challenges, big opportunities. Popul Health Manag. 2013;16(Suppl 1):S4–11. doi:10.1089/pop.2013.0058
- 3. Center for Health Care Strategies. Medicaid Accountable Care Organizations: State Update. Available from: http://www.chcs.org/resource/medicaid-accountable-care-organizations-state-update/
- 4. McConnell KJ, Chang AM, Cohen D, et al. Oregon’s Medicaid transformation: an innovative approach to holding a health system accountable for spending growth. Healthc (Amst). 2014;2(3):163–7. doi:10.1016/j.hjdsi.2013.11.002
- 5. Stecker EC. The Oregon ACO experiment — bold design, challenging execution. N Engl J Med. 2013;368(11):982–5. doi:10.1056/NEJMp1214141
- 6. Howard SW, Bernell SL, Yoon J, Luck J. Oregon’s Coordinated Care Organizations: a promising and practical reform model. J Health Polit Policy Law. 2014;39(4):933–40. doi:10.1215/03616878-2744450
- 7. Pollack HA. Oregon’s Coordinated Care Organizations. J Health Polit Policy Law. 2014;39(4):929–31. doi:10.1215/03616878-2744438
- 8. Chang AM, Cohen DJ, McCarty D, Rieckmann T, McConnell KJ. Oregon’s Medicaid transformation — observations on organizational structure and strategy. J Health Polit Policy Law. 2014;40(1):257–64. doi:10.1215/03616878-2854959
- 9. McConnell KJ. Oregon’s Medicaid Coordinated Care Organizations. JAMA. 2016;315(9):869–70. doi:10.1001/jama.2016.0206
- 10. Coughlin TA, Corlette S. Oregon: Site Visit Report. ACA Implementation — Monitoring and Tracking. Princeton, NJ: Robert Wood Johnson Foundation; 2012. Available from: http://www.urban.org/UploadedPDF/412498-ACA-Implementation-Monitoring-and-Tracking-Oregon-Site-Visit-Report.pdf
- 11. Oregon Health Authority. Oregon’s Health System Transformation: 2014 Final Report. 2015. Available from: http://www.oregon.gov/oha/Metrics/Documents/2014%20Final%20Report%20-%20June%202015.pdf
- 12. Colorado Department of Health Care Policy and Financing. Creating a Culture of Change: Accountable Care Collaborative 2014 Annual Report. Available from: https://www.colorado.gov/pacific/sites/default/files/Accountable%20Care%20Collaborative%202014%20Annual%20Report.pdf
- 13. Lindrooth RC, Tung G, Santos T, O’Leary S. Evaluation of the Accountable Care Collaborative: Year 1 Report. Available from: https://www.colorado.gov/pacific/sites/default/files/ACC%20Evaluation_Year%201%20Final%20Report_Final%2012%207%2015%20(1).pdf
- 14. Chronic Illness and Disability Payment System, version 5.3. San Diego, CA: University of California, San Diego; 2011.
- 15. Stuart EA, Huskamp HA, Duckworth K, et al. Using propensity scores in difference-in-differences models to estimate the effects of a policy change. Health Serv Outcomes Res Methodol. 2014:1–17. doi:10.1007/s10742-014-0123-z
- 16. National Committee for Quality Assurance. HEDIS 2014: Healthcare Effectiveness Data and Information Set. Vol 2: Technical Specifications for Health Plans. Washington, DC: National Committee for Quality Assurance; 2013.
- 17. Schwartz AL, Chernew ME, Landon BE, McWilliams JM. Changes in low-value services in year 1 of the Medicare Pioneer Accountable Care Organization program. JAMA Intern Med. 2015:1–11. doi:10.1001/jamainternmed.2015.4525
- 18. Medi-Cal Managed Care Division, California Department of Health Care Services. Statewide Collaborative Quality Improvement Project: Reducing Avoidable Emergency Room Visits. 2012. Available from: http://www.dhcs.ca.gov/dataandstats/reports/Documents/MMCD_Qual_Rpts/EQRO_QIPs/CA2011-12_QIP_Coll_ER_Remeasure_Report.pdf
- 19. Agency for Healthcare Research and Quality. Prevention Quality Indicators overview. Available from: http://www.qualityindicators.ahrq.gov/modules/pqi_overview.aspx
- 20. Joynt KE, Gawande AA, Orav EJ, Jha AK. Contribution of preventable acute care spending to total spending for high-cost Medicare patients. JAMA. 2013;309(24):2572–8. doi:10.1001/jama.2013.7103
- 21. Austin PC. A critical appraisal of propensity-score matching in the medical literature between 1996 and 2003. Stat Med. 2008;27:2037–49. doi:10.1002/sim.3150
- 22. Austin PC. Balance diagnostics for comparing the distribution of baseline covariates between treatment groups in propensity-score matched samples. Stat Med. 2009;28(25):3083–107. doi:10.1002/sim.3697
- 23. Ryan AM, Burgess JF, Dimick JB. Why we should not be indifferent to specification choices for difference-in-differences. Health Serv Res. 2015;50(4):1211–35. doi:10.1111/1475-6773.12270
- 24. Angrist JD, Pischke J-S. Mostly Harmless Econometrics: An Empiricist’s Companion. Princeton, NJ: Princeton University Press; 2008.
- 25. Dartmouth Atlas of Health Care. Primary Care Service Area data. Available from: http://www.dartmouthatlas.org/tools/downloads.aspx?tab=42
- 26. Bertrand M, Duflo E, Mullainathan S. How much should we trust differences-in-differences estimates? Q J Econ. 2004;119(1):249–75.
- 27. Osborne NH, Nicholas LH, Ryan AM, Thumma JR, Dimick JB. Association of hospital participation in a quality reporting program with surgical outcomes and expenditures for Medicare beneficiaries. JAMA. 2015;313(5):496–504. doi:10.1001/jama.2015.25
- 28. Kaiser Family Foundation. Total monthly Medicaid and CHIP enrollment. Available from: http://kff.org/health-reform/state-indicator/total-monthly-medicaid-and-chip-enrollment/
- 29. Dale SB, Ghosh A, Peikes DN, et al. Two-year costs and quality in the Comprehensive Primary Care Initiative. N Engl J Med. 2016. doi:10.1056/NEJMsa1414953
- 30. Young K, Clemans-Cope L, Lawton E, Holahan J. Medicaid Spending Growth in the Great Recession and Its Aftermath, FY 2007–2012. 2014. Available from: http://kff.org/medicaid/issue-brief/medicaid-spending-growth-in-the-great-recession-and-its-aftermath-fy-2007-2012/
- 31. Martin AB, Hartman M, Benson J, Catlin A; National Health Expenditure Accounts Team. National health spending in 2014: faster growth driven by coverage expansion and prescription drug spending. Health Aff (Millwood). 2016;35(1):150–60. doi:10.1377/hlthaff.2015.1194