Abstract
Policy Points.
Maine, Massachusetts, Minnesota, and Vermont leveraged State Innovation Model awards to implement Medicaid accountable care organizations (ACOs). Flexibility in model design, ability to build on existing reforms, provision of technical assistance to providers, and access to feedback data all facilitated ACO development. Challenges included sustainability of transformation efforts and the integration of health care and social service providers.
Early estimates showed promising improvements in hospital‐related utilization, and Vermont was able to reduce or slow the growth of Medicaid costs.
These states are sustaining Medicaid ACOs owing in part to provider support and early successes in generating shared savings. The states are modifying their ACOs to include greater accountability and financial risk.
Context
As state Medicaid programs consider alternative payment models (APMs), many are choosing accountable care organizations (ACOs) as a way to improve health outcomes, coordinate care, and reduce expenditures. Four states (Maine, Massachusetts, Minnesota, and Vermont) leveraged State Innovation Model awards to create or expand Medicaid ACOs.
Methods
We used a mixed‐methods design to assess achievements and challenges with ACO implementation and the impact of Medicaid ACOs on health care utilization, quality, and expenditures in three states. We integrated findings from key informant interviews, focus groups, document review, and difference‐in‐difference analyses using data from Medicaid claims and an all‐payer claims database.
Findings
States built their Medicaid ACOs on existing health care reforms and infrastructure. Facilitators of implementation included allowing flexibility in design and implementation, targeting technical assistance, and making clinical, cost, and use data readily available to providers. Barriers included provider concerns about their ability to influence patient behavior, sustainability of provider practice transformation efforts when shared savings are reinvested into the health system and not shared with participating clinicians, and limited integration between health care and social service providers. Medicaid ACOs were associated with some improvements in use, quality, and expenditures, including statistically significant reductions in emergency department visits. Only Vermont's ACO demonstrated slower growth in total Medicaid expenditures.
Conclusions
Four states demonstrated that adoption of ACOs for Medicaid beneficiaries was both possible and, for three states, associated with some improvements in care. States revised these models over time to address stakeholder concerns, increase provider participation, and enable some providers to accept financial risk for Medicaid patients. Lessons learned from these early efforts can inform the design and implementation of APMs in other Medicaid programs.
Keywords: Medicaid, alternative payment models, state innovation models, accountable care organizations
As state and federal Medicaid spending continues to rise,1 strategies for reducing or controlling Medicaid spending by states include limiting enrollment, eligibility, and benefits; capping prices paid for Medicaid benefits (eg, through capitated payment or global budget arrangements with managed care or provider organizations); and adopting payment models other than fee‐for‐service (FFS). These alternative payment models (APMs) task providers with managing the cost and quality of a specific population in exchange for potential payments separate from those paid for services rendered. Some Medicaid programs are experimenting with APMs like accountable care organizations (ACOs), episode‐based payments, and global budgeting.2
Spurred by modest cost reductions and quality improvements observed among Medicare and commercial payer ACOs,3, 4, 5, 6, 7 12 state Medicaid programs have adopted some form of a Medicaid ACO model to change patterns of care to reduce costs.8 ACOs are groups of physicians, hospitals, and other health care providers, such as those providing services for mental and behavioral health, home health, and long‐term services and supports, who voluntarily enter into contracts in which they are held accountable for the quality and total cost for a specific population.9 ACOs are expected to work together across service sectors (eg, inpatient and outpatient, primary and specialty care, physical and behavioral health) to manage and coordinate patient care and provide high‐quality care, with the expectation that these activities will reduce use of high‐cost services such as inpatient admissions, readmissions, and emergency department (ED) visits, thereby reducing Medicaid cost growth or lowering costs outright. In contrast to global budgeting models or capitated managed care/ACO hybrids, a key characteristic of the ACO model is the ability for providers to share in any financial savings or losses accrued by the sponsoring payer. In one‐sided risk models, the ACO shares with Medicaid any savings generated if it meets certain cost and quality targets but is not held accountable for any losses. In two‐sided risk models, the ACO may receive a greater portion of any savings, but it also must pay Medicaid if the total costs of care for its attributed patients exceed specific targets. Cost targets are usually predicted spending based on historical trends and population characteristics.10, 11, 12
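The one‐ and two‐sided settlement logic described above can be sketched in a few lines. This is an illustrative simplification: the flat sharing rate and the single pass/fail quality gate are assumptions for exposition, not any state's actual contract terms (real contracts add minimum savings rates, risk corridors, and quality‐scaled sharing).

```python
def shared_savings(target: float, actual: float, sharing_rate: float = 0.5,
                   quality_met: bool = True, two_sided: bool = False) -> float:
    """Stylized shared-savings settlement from the ACO's perspective.

    Positive return = payment to the ACO; negative = amount the ACO owes.
    sharing_rate and the quality gate are illustrative assumptions.
    """
    savings = target - actual          # spending below the cost target counts as savings
    if savings >= 0:
        # ACOs share in savings only if quality targets are met
        return savings * sharing_rate if quality_met else 0.0
    # Losses: only two-sided ACOs repay a share; one-sided ACOs owe nothing
    return savings * sharing_rate if two_sided else 0.0
```

For example, with a cost target of 100 and actual spending of 110, a one‐sided ACO owes nothing, while a two‐sided ACO at a 50% sharing rate would repay 5.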
Developing ACO structures and supporting providers in their practice transformation efforts to meet ACO goals is a time‐ and resource‐intensive process for providers and states. Several states have been able to leverage federal initiatives to support their efforts. In 2013, the Center for Medicare and Medicaid Innovation (the Innovation Center) at the Centers for Medicare and Medicaid Services (CMS) awarded State Innovation Models (SIM) Initiative Round 1 funds to six states to test ways to accelerate statewide health care transformation, including through a greater shift toward APMs aligned across multiple payers. Four of the six states—Maine, Massachusetts, Minnesota, and Vermont—used a portion of their total SIM awards (ranging from $33 to $45 million over four or more years) to develop, broaden, and support their Medicaid ACOs, known, respectively, in each state as Accountable Communities, Accountable Care Organizations (three types), Integrated Health Partnerships, and the Medicaid Shared Savings Program. SIM Initiative funds could be used for program design, stakeholder convening, health information technology, data analytics, workforce development, technical assistance to providers, and other infrastructure to support ACO implementation. SIM funds were in place by the time ACO implementation began in all states except Minnesota (in which the first year of Medicaid ACO implementation was 2013): Vermont began in January 2014, Maine in August 2014, and Massachusetts in December 2016 (pilot) followed by full implementation in 2018.
The purpose of this study was to identify achievements and challenges with ACO implementation and to assess the impact of Medicaid ACOs on health care utilization, quality, and expenditures in three states. This study is among the first to show the impact of the ACO model on the Medicaid population using robust evaluation designs that employ comparison groups. Also, it is the first to examine states’ implementation of Medicaid ACOs over time, which provides important context for interpreting impact results. This study expands the evidence on Medicaid ACO performance beyond previous work by McConnell and colleagues examining Medicaid coordinated care organizations in Oregon, a global budgeting model, that found reductions in expenditures relative to a neighboring state13 but no reductions in expenditures compared with a state that implemented a regionally based accountable care model,14 and no evidence on quality of life impacts following implementation of a capitated managed care/ACO hybrid.15
Methods
Study Design
To assess ACO implementation progress within the context of the SIM Initiative and the impact of ACO enrollment on use, quality, and expenditures, we used a mixed‐methods design. We integrated qualitative results from interviews, focus groups, and key documents, and quantitative results from difference‐in‐difference (DID) analyses of Medicaid and all‐payer claims data. RTI International's Institutional Review Board (IRB) determined that this study did not require IRB approval, as it was an evaluation approved by CMS designed to examine changes in quality of care and spending.
Qualitative Data Sources, Analysis, and Outcomes
The study team was divided into state‐specific groups, each of which conducted more than 60 interviews, in person and by phone, in its assigned state between 2014 and 2018, to understand Medicaid ACO implementation in the context of that state's participation in the SIM Initiative. Interviewees included state officials, payers and purchasers, health system and provider organizations, provider associations, advocacy groups, consumers, and consumer advocates. The structured interview protocols focused on state health policy developments occurring during the SIM Initiative, stakeholder participation, health care delivery transformation, payment system reform, quality measurement and reporting, population health improvement strategies, and infrastructure investments such as health information technology, data analytics, workforce development, and technical assistance to providers. During the same time period, we held focus groups across the four states with six to nine primary care providers each (including nurses and nurse practitioners in addition to physicians) who were exposed to each state ACO model in the years that the model was active (Maine: six groups in 2014‐2017; Massachusetts: four groups in 2018 only; Minnesota: twelve groups in 2014‐2017; Vermont: ten groups in 2014‐2017). We also had monthly discussions with state officials and routinely reviewed state documents to better understand each state's ACO model and how it was implemented. Each state‐specific research group reviewed notes and transcripts for each interview and focus group multiple times to identify important content and generate themes.
This thematic analysis was used to identify features of each state's ACO, compare perspectives of different stakeholder types, summarize lessons learned from implementing and supporting the model, provide additional context to explain our quantitative outcomes, and highlight which features of the ACO models may be associated with certain outcomes.
Quantitative Data Sources, Study Outcomes, and Statistical Analyses
Data Sources
For the three states with sufficient implementation experience, we used Medicaid claims and enrollment data provided by each state's respective Medicaid agency to evaluate impacts on outcomes. Maine, Minnesota, and Vermont provided these Medicaid data for the three years prior to ACO implementation and the two years (Maine), three years (Vermont/Minnesota, expenditures), or four years (Minnesota, utilization and quality outcomes) after ACO implementation. In Minnesota, most Medicaid beneficiaries were enrolled in Medicaid managed care organizations (MCOs), which introduced a unique data challenge: Minnesota's Medicaid agency was unable to release detailed expenditures paid by MCOs to providers. Instead, we used expenditure data from the Department of Health's Minnesota All‐Payer Claims Database (MN APCD). Neither Maine nor Vermont has managed care within its Medicaid program, so the claims were sufficient to estimate expenditures.
Study Outcomes
We modeled the impact of each state's ACO model on utilization, quality, and expenditures of ACO‐attributed Medicaid beneficiaries. We examined utilization in terms of visits to a primary care or specialist provider, all‐cause acute inpatient admissions, ED visits that did not lead to a hospitalization, and 30‐day readmissions following a hospitalization. Although there were slight variations owing to state‐specific specialty codes included within claims, primary care was most often defined as care delivered by practitioners in internal medicine, family medicine, pediatrics, geriatrics, and obstetrics/gynecology. Measures were calculated as a probability of use (ie, did the event ever happen or not in the year). These common utilization measures were selected as they were central to quantifying the effects of the ACO models in all states and could be operationalized comparably across states and data sources. We chose to model the probability of any use rather than a count of utilization encounters because few beneficiaries had more than one or two utilization encounters within any given year. We multiplied the marginal effect from the logistic regression models by 1,000, which provides a reasonable approximation of the impact in terms of a change in the rate of utilization per 1,000 beneficiaries.
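As a sketch of that rescaling (with made‐up coefficients and simulated covariates, not the fitted models from this study), the average marginal effect of a binary ACO indicator in a logistic model can be converted to a per‐1,000‐beneficiary rate:

```python
import numpy as np

# Hypothetical fitted logistic model: logit(P(any ED visit)) = b0 + b1*ACO + b2*age_c
# Coefficients are illustrative assumptions, not estimates from this study.
b0, b1, b2 = -1.0, -0.08, 0.01

rng = np.random.default_rng(42)
age_c = rng.normal(0, 12, 10_000)     # age, centered at the sample mean
aco = rng.integers(0, 2, 10_000)      # ACO attribution indicator

p = 1 / (1 + np.exp(-(b0 + b1 * aco + b2 * age_c)))
# Average marginal effect of ACO attribution: mean over the sample of
# dP/d(ACO) = p * (1 - p) * b1
ame = np.mean(p * (1 - p) * b1)
rate_per_1000 = ame * 1000            # change in events per 1,000 beneficiaries
```

Because the logistic marginal effect is a change in probability per beneficiary, multiplying by 1,000 simply restates it on the per‐1,000‐beneficiary scale used in the results.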
Although utilization measures were common to all states, the evaluation team a priori chose quality measures based upon the key priorities of the states’ SIM Initiative, which led to few common quality measures across states. The quality measure outcomes included in this article highlight comparisons between two binary quality measures available for Maine and Minnesota: 1) receipt of hemoglobin A1c (HbA1c) testing among Medicaid beneficiaries with diabetes, and 2) remaining on antidepressant medication for at least 180 days among Medicaid beneficiaries newly diagnosed with depression. Neither measure was calculated for Vermont, so instead, we included the percentage of beneficiaries aged one to three years who had a developmental screening, which is of interest because it was the only quality measure to which ACOs in Vermont were held accountable in their Medicaid program but not their commercial program. The remaining quality measures evaluated for each of these states—one in Maine, two in Minnesota, and three in Vermont—are reported in the SIM Initiative Round 1 Final Report.16
Expenditure measures included total per‐beneficiary‐per‐month (PBPM) expenditures, inpatient PBPM expenditures, professional PBPM expenditures, and pharmaceutical PBPM expenditures. Our analysis did not include capitated payments due to a lack of data availability. We did not examine prescription drug expenditures in Minnesota due to constraints preventing us from being able to integrate the medical and pharmacy data.
Statistical Analyses
Analyses compare pre‐ and post‐periods for intervention and comparison groups using a longitudinal design with an unbalanced panel. This rolling entry design allows beneficiaries without continuous Medicaid eligibility to be included in the study sample and contribute data. Allowing rolling entry into the sample was important to ensure that the results are generalizable to the full Medicaid population, given the significant churn of Medicaid beneficiaries into and out of the program over time. Although some beneficiaries may not have been observed in each study year due to changes in their Medicaid eligibility, the use of propensity score weighting ensured that beneficiary characteristics, on average, did not differ substantially between years. However, the majority of beneficiaries observed in the pre‐period were also observed in the post‐period.
Attribution to ACOs and Identification of Comparison Groups
For the Medicaid claims analyses, each state provided us with the list of ACO‐attributed enrollees (hereafter “ACO enrollees”), which comprised the intervention group for each set of state analyses. Each of the three states attributed Medicaid beneficiaries to an ACO, generally based on a Medicaid beneficiary's relationship with the ACO provider within a given year: a beneficiary could be aligned to an ACO‐participating primary care provider, or could have received a majority of their primary care services from an ACO‐participating provider. Online Appendix Table 1 provides more detail on each state's ACO attribution process. For Minnesota's expenditure analyses using the MN APCD data, we were unable to use the state's attribution list owing to differences in beneficiary identifiers across data sets. Instead, we replicated Minnesota's attribution methodology in the MN APCD data to construct the intervention group. As a result, the sample of ACO‐attributed beneficiaries used in the expenditure analyses differed somewhat from the ACO‐attributed beneficiary sample used for Minnesota's utilization and quality of care analyses (n = 239,245 and 294,923, respectively).
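A simplified reading of the majority‐of‐services attribution rule might look like the following; the identifiers are hypothetical, and actual state algorithms add attribution hierarchies, eligibility checks, and tie‐breakers not shown here:

```python
from collections import Counter

def attribute_to_aco(pc_claims, aco_providers):
    """Attribute beneficiaries who received a strict majority of their
    primary care visits in the year from ACO-participating providers."""
    total, in_aco = Counter(), Counter()
    for bene_id, provider_id in pc_claims:
        total[bene_id] += 1
        if provider_id in aco_providers:
            in_aco[bene_id] += 1
    # Strict majority: more than half of the year's primary care visits
    return {b for b in total if 2 * in_aco[b] > total[b]}

# Hypothetical primary care claims: (beneficiary, rendering provider)
claims = [("bene1", "prov_a"), ("bene1", "prov_a"), ("bene1", "prov_b"),
          ("bene2", "prov_b"), ("bene2", "prov_c")]
attributed = attribute_to_aco(claims, aco_providers={"prov_a"})
```

Here bene1 (two of three visits with an ACO‐participating provider) would be attributed, while bene2 would fall into the comparison pool.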
To identify Medicaid beneficiaries affiliated with non‐ACO providers for the in‐state comparison groups in Maine and Minnesota, we replicated each state's attribution method annually, to the extent possible. In contrast, Vermont provided us with a list of Medicaid beneficiaries affiliated with non‐ACO providers practicing in the state and Medicaid enrollees affiliated with providers participating in its commercial ACO.
The Minnesota and Vermont ACO models exclude dual‐eligible Medicare‐Medicaid beneficiaries, so we likewise excluded these beneficiaries from the analytic sample. In contrast, some dual‐eligible beneficiaries were enrolled in Maine's ACO, so we allowed these beneficiaries to be included in the Maine analytic sample.
Because this is not a randomized study, and comparison group members did not necessarily resemble ACO enrollees on key characteristics, we used propensity score weighting to ensure that the comparison group closely resembled ACO enrollees on all observable baseline characteristics. A propensity score weight was assigned to each comparison group beneficiary reflecting how closely he or she resembled ACO enrollees, based on select observed sociodemographic characteristics of the beneficiary and characteristics of his or her geographic area of residence. The comparison group beneficiaries that most resembled ACO enrollees were assigned larger weights, and therefore their data contributed more to the difference‐in‐differences estimates than did data from beneficiaries with smaller weights. After applying inverse probability of treatment weights,17 each state's ACO and comparison groups more closely resembled each other. Additional details about the propensity scores can be found in the Online Methods Appendix.
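The weighting scheme described here behaves roughly as follows. This sketch makes two simplifying assumptions: it uses a known (simulated) propensity score rather than one estimated by logistic regression on observed characteristics, and it uses ATT‐style odds weights as one common implementation of inverse probability of treatment weighting:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
age = rng.normal(35, 10, n)
# Assumed data-generating process: ACO attribution more likely for younger beneficiaries
true_ps = 1 / (1 + np.exp(-(-0.5 - 0.05 * (age - 35))))
treated = rng.binomial(1, true_ps)

# In practice the propensity score is estimated; here we use the known score.
ps = true_ps
# ATT-style weights: ACO enrollees keep weight 1; comparison beneficiaries are
# weighted by the odds of treatment, so those most resembling enrollees count more.
w = np.where(treated == 1, 1.0, ps / (1 - ps))

# Balance check: the weighted comparison mean should move toward the treated mean
mean_t = age[treated == 1].mean()
mean_c_raw = age[treated == 0].mean()
mean_c_wtd = np.average(age[treated == 0], weights=w[treated == 0])
```

After weighting, the comparison group's age distribution closely matches that of the ACO enrollees, which is the balance property the evaluation relied on.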
Regression Modeling
We assessed baseline trends of our key outcomes (total expenditures, ED visits, and inpatient admissions) in each state sample by testing for parallel trends between the ACO and comparison groups. We modeled all outcomes using a difference‐in‐differences specification, which compares study outcomes of the ACO group to the comparison group before and after ACO implementation. We used weighted logistic regression models for the binary utilization and quality outcomes using a non‐linear difference‐in‐difference methodology,18 and we used weighted ordinary least squares models for expenditure outcomes. Models controlled for person‐level variables (eg, gender, age, disability, time in Medicaid, and comorbidity) and county‐level variables (eg, urban/rural residence, percentage of population living in poverty, and supply of hospital beds). Each state regression model included additional covariates relevant for their state's ACO model (see Online Appendix Table 6 for additional details).
All models use cluster‐robust standard errors to account for the correlation of multiple observations (claims) in a single measurement year. In Maine and Vermont, we clustered standard errors at the practice level and provider level, respectively, to account for similar care delivery practice patterns for beneficiaries seeing the same providers. In Minnesota, identifying the participating provider or practice to which each beneficiary was attributed was not possible. Therefore, claims were clustered at the beneficiary level. All regression models were weighted by the product of the propensity score and the fraction of the year the beneficiary was enrolled in Medicaid. This process down‐weights the influence of beneficiaries with less than an entire year of Medicaid coverage.
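Putting the design together, a stylized weighted difference‐in‐differences estimate for a PBPM expenditure outcome can be computed from the four weighted cell means. The data, weights, and the assumed −$25 effect below are simulated for illustration; the actual analyses used regression models with covariates and cluster‐robust standard errors rather than raw cell means:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8000
aco = rng.integers(0, 2, n)                # ACO-attributed vs comparison
post = rng.integers(0, 2, n)               # pre vs post implementation year
frac_enrolled = rng.uniform(0.25, 1.0, n)  # fraction of year enrolled in Medicaid
ps_weight = np.where(aco == 1, 1.0, rng.uniform(0.5, 1.5, n))

# Simulated PBPM expenditures: common $40 post-period trend, assumed -$25 ACO effect
y = 500 + 30 * aco + 40 * post - 25 * aco * post + rng.normal(0, 60, n)

# Analytic weight = propensity weight x enrollment fraction, which down-weights
# beneficiaries with partial-year Medicaid coverage
w = ps_weight * frac_enrolled

def wmean(mask):
    return np.average(y[mask], weights=w[mask])

did = (wmean((aco == 1) & (post == 1)) - wmean((aco == 1) & (post == 0))) \
    - (wmean((aco == 0) & (post == 1)) - wmean((aco == 0) & (post == 0)))
```

The estimate recovers the assumed ACO effect because the common post‐period trend cancels out of the double difference.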
Estimates of the impact of ACO attribution on study outcomes were calculated for each year of post‐implementation data and for the overall post‐implementation study period. We present only the overall estimates in this paper. Single‐year estimates and estimates for select subpopulations are available in the SIM Initiative Round 1 Final Report.16 Statistical significance is assessed at p < 0.10 (90% confidence level), in accordance with the SIM Round 1 evaluation design.
Results
Implementation Findings
Maine, Massachusetts, Minnesota, and Vermont used SIM funding to support ACO adoption as a primary means of furthering statewide goals to spread APMs and encourage provider reimbursement based on value rather than volume. By the end of 2017, almost 740,000 Medicaid beneficiaries were enrolled in a Medicaid ACO across these four states, representing approximately 20% of the 3.9 million Medicaid beneficiaries enrolled in these states. Table 1 provides a summary of Medicaid ACO models implemented in the four SIM Round 1 states, including contextual factors, model and payment characteristics, each state's use of SIM funding to support ACO development, and challenges reported by Medicaid ACOs, all of which are relevant in interpreting ACO‐related outcomes described in the next section. We expand here on several key components of model design and highlight lessons learned from implementation of these model features in Table 2.
Table 1. Medicaid ACO Models in the Four SIM Round 1 States
| MA | ME | MN | VT |
---|---|---|---|---|
Number of Medicaid ACOs | 17 | 4 | 21a | 2 |
Implementation began | 2016, 2018b | August 2014 | January 2014 | January 2014 |
Implementation period included in the claims‐based analysis (baseline period) | | August 2014‐March 2016; (August 2011‐July 2014) | January 2014‐December 2016; (January 2011‐December 2013) | January 2014‐December 2016; (January 2011‐December 2013) |
Number of Medicaid enrollees at end of the SIM Initiative | 160,000c | 55,314 | 455,888 | 67,515 |
Contextual Factors | ||||
Overlap with a Medicaid primary care home model | | ✓ | ✓ | ✓ |
Coordinated with Medicare ACO models | | | | ✓ |
Coordinated with commercial ACO models | | | | ✓ |
Commercial and/or Medicare ACOs in the state | ✓ | ✓ | ✓ | ✓ |
Coordinated with Medicaid managed cared | ✓ | | ✓ | |
Model and Payment Characteristics | ||||
Providers leading the ACO: | ||||
Health system | ✓ | ✓ | ✓ | |
Providers (eg, primary care) | ✓ | ✓ | ✓ | |
Hospital | ✓ | ✓ | ✓ | |
Federally qualified health center | ✓ | ✓ | ✓ | |
Calculate shared savings and total costs of care based on retrospective use, cost, and quality | ✓ | ✓ | ✓ | ✓ |
Behavioral health services included in shared savings calculationse | ✓ | ✓ | ✓ | |
LTSS included in shared savings calculationsf | ✓ | ✓ | ||
Engaged in one‐sided risk | ✓ | ✓ | ✓ | |
Engaged in two‐sided riskg | ✓ | ✓ | ||
Includes prospective payments to ACO providersh | ✓ | | | ✓ |
SIM‐Funded Support | ||||
Grants to providers for infrastructure development | ✓ | ✓ | ||
Learning collaboratives: | ||||
For ACO administrators | ✓ | |||
For providers participating in ACOsi | ✓ | ✓ | ||
Technical assistance: | ||||
For ACO administrators | ✓ | ✓ | ✓ | |
For providers participating in ACOsj | ✓ | ✓ | ||
HIE and/or event notification access for providers, which could include ACO‐participating providers | ✓ | ✓ | ✓ | ✓ |
Feedback reports on quality, use, and cost: | ||||
For ACO administrators | ✓ | ✓ | ||
For providers participating in ACOsk | ✓ | ✓ | ||
State provided Medicaid data to support data analytics | ✓ | ✓ |
Abbreviations: ACO, accountable care organization; LTSS, long‐term services and supports; HIE, health information exchange.
a In Minnesota, 8 of the 21 ACOs are virtual. These virtual ACOs began in 2016.
b Massachusetts initially launched a pilot ACO program in 2016, followed by the full‐scale ACO program in 2018.
c The number of Medicaid enrollees at the end of the SIM Initiative in Massachusetts reflects enrollees in the pilot program. The number of Medicaid enrollees in the full‐scale ACO program was not available at the time of this report but is projected to reach 850,000.
d Maine and Vermont do not have a Medicaid managed‐care program.
e In Minnesota, intensive and residential mental health and chemical dependency services are excluded. In Massachusetts, Behavioral Health Community Partners will serve a population with high behavioral health needs, including individuals with severe mental illness or substance use disorders.
f In Maine, LTSS is optional. Even if ACOs elect to include LTSS, some home‐ and community‐based services and targeted case management services are still excluded. In Massachusetts, LTSS will be phased into the total cost of care benchmark after the second year of the program.
g In Vermont, a down‐side risk sharing pilot began in 2017.
h In Massachusetts, one type of ACO in the 2018 full‐scale program included prospective payments; in Vermont, prospective payment began in 2017.
i In Maine, Minnesota, and Vermont, learning collaborative opportunities were available to practices, some of which were ACO‐affiliated practices.
j In Maine, technical assistance was limited to ACO‐participating providers who were Medicaid health homes or behavioral health homes. In Minnesota, technical assistance specifically targeted to Integrated Health Partnerships was focused on data analytics. In Massachusetts, SIM funds allowed technical assistance to be provided for state officials when designing and piloting the ACO. Technical assistance to ACOs and community partners will be provided from a non‐SIM funding stream during the full implementation of the ACO program.
k In Maine, feedback reports were limited to ACO‐participating providers who were Medicaid health homes or behavioral health homes.
Table 2.
Facilitators | Barriers |
---|---|
Context for Medicaid ACO Implementation
Prior provider experience with Medicare and/or commercial ACOs was a factor in states’ decisions to pursue an ACO model as the preferred APM for Medicaid. Also, since the premise of the SIM Initiative was that states could lead, and should pursue, alignment of payment models across payers, most states took other payer initiatives into account when developing their Medicaid ACO models. In practice, Vermont was the only state to achieve significant coordination across payers: Vermont made significant efforts to align operational aspects (eg, covered services, attribution, care management requirements, provider payment incentives) of its Medicaid and commercial ACO shared savings programs with the Medicare Shared Savings Program (MSSP). Minnesota aligned its attribution methodology for Medicaid beneficiaries with that of the MSSP. In Massachusetts, where the state ultimately designed three types of ACOs with varying levels of financial risk, one MCO in the Accountable Care Partnership Plan type (an integrated MCO/ACO taking full insurance risk) chose to partner with a provider network based on their decades‐long relationship that included shared risk arrangements in the commercial and Medicare markets.
Additionally, all four states had previous or concurrent experience with promoting the patient‐centered medical home (PCMH) model using financial incentives from one or more payers in the state. Coordination with these PCMH models was a consideration in designing features of the Medicaid ACO for all states except Massachusetts. Maine and Minnesota used a Medicaid beneficiary's receipt of primary care from a health home or health care home (respectively) as part of their Medicaid ACO attribution algorithm where applicable. Vermont, the state with the longest‐standing PCMH model, used quality measures from its PCMH pay‐for‐performance model for its Medicaid and commercial shared savings programs—an alignment effort praised by providers. As one ACO representative noted, “We have seen movement on quality measures that I can only attribute to ACO work. I don't think it would have been on their radar [otherwise].”
Finally, Medicaid programs in Massachusetts and Minnesota accounted for the presence of longstanding Medicaid managed care when designing their ACO models (Maine and Vermont did not because Maine does not have Medicaid MCOs and Vermont's Medicaid program operates under a Medicaid Section 1115 waiver as a single MCO). In both cases, the states dictated how MCOs would participate in the ACO model during their MCO contract procurement process. Among Massachusetts’ three types of ACO models, one type is an integrated MCO/ACO model (noted earlier) that receives prospective capitated payments for an aligned population; another type is a provider‐led ACO that may contract with multiple MCOs and receives retrospective shared savings or losses calculated by each MCO; and a third type is a provider‐led ACO that contracts directly with the state for its aligned population and shares risk. In contrast to Massachusetts, in which beneficiaries’ MCO enrollment and ACO alignment are largely dictated by which primary care provider they see, Minnesota allows beneficiaries to choose which MCO they enroll in. Also, Minnesota requires each Medicaid MCO to pay shared savings to, or recoup losses from, an ACO in proportion to the MCO's share of Medicaid enrollees attributed to that ACO, based on the state's own calculations, whereas Massachusetts involves the MCOs in calculating and distributing payments to their affiliated providers.
ACO Characteristics and Requirements
To meet the unique needs of the Medicaid population and increase provider participation, states allowed significant flexibility on several dimensions. First, most states gave ACOs latitude when deciding the composition of providers eligible to become ACOs, consistent with findings of a prior study.19 Second, recognizing that ACOs include different networks of providers and that one ACO's attributed population will not look like another, states offered ACOs flexibility in operational requirements. For example, Minnesota allowed an ACO with a large pediatric population to replace adult‐focused quality measures with child‐focused measures and allowed other accommodations to an ACO with a large population with mental illness.20 Third, some states (Maine and Vermont) let ACOs choose between one‐ or two‐sided financial risk. All ACOs in these states opted for one‐sided risk, citing a desire to build experience learning to manage quality and costs for attributed Medicaid beneficiaries before taking on more risk. In Minnesota, ACOs were subject to different risk terms depending on whether their provider networks were “integrated” or “virtual” (integrated ACOs began with one‐sided risk but moved into two‐sided risk by the end of the contract term; “virtual” ACOs were allowed to retain one‐sided risk). Fourth, ACOs in all four states were given flexibility in whether to invest in care management, care coordination, or quality improvement activities.
The states varied in the types of formal arrangements they required to meet Medicaid beneficiaries’ complex clinical and social needs. For example, Maine's Medicaid ACOs were required to include behavioral health providers (although ACOs in Maine, Massachusetts, and Minnesota also included behavioral health providers, since behavioral health services were included in total costs of care calculations). Recognizing the critical role that nontraditional providers play in patient care, Massachusetts went a step further and required Medicaid ACOs to contract with providers of long‐term services and supports and of behavioral health care (“Community Partners”) in order to enable beneficiaries’ access to these local social service organizations. Long‐term services and supports costs were included in the total cost of care calculation in Massachusetts and at the ACO's discretion in Maine. Although not implemented during the study period, Vermont plans to include behavioral health and home‐ and community‐based services in its total cost of care calculations by 2020.
Regardless of state requirements, some providers saw the financial incentives inherent in the ACO risk‐sharing model as sufficient motivation to change. As one provider in Massachusetts said, the model “shook the trees” and got ACOs to allocate resources for care coordinators and managers who made direct contact with patients about issues such as overdue health screenings and high utilization of EDs for nonemergent conditions.
States also used SIM Initiative funds to promote coordination across providers. For example, Minnesota provided grants to ACOs and similar entities to develop partnerships with social services, local public health, long‐term services and supports, and behavioral health providers and to target specific population health needs. Maine used its SIM Initiative funds to build connections between ACO‐participating primary care medical homes and behavioral health organizations and connect them to the electronic clinical and utilization data housed in Maine's state health information exchange (HIE).
Model requirements and payment terms evolved over time. Minnesota adjusted its attribution methodology in response to provider concerns about churning of aligned ACO enrollees in and out of Medicaid, and Massachusetts made changes to some financial components of its model (rate‐setting for capitated payments). While piloting its ACO model, Massachusetts uncovered implementation challenges in distinguishing participating providers from the practices in which they worked. States also changed the degree of risk ACOs assumed over time. For example, Vermont allowed ACOs to choose between one‐ and two‐sided risk at the start of the Medicaid shared savings program but required them to enter a two‐sided risk arrangement, along with prospective, capitated payments, in 2017. Massachusetts piloted an ACO program with two‐sided risk in 2017 and incorporated prospective capitation for ACOs that integrated with managed care organizations when the ACO program was fully implemented in 2018.
Use of SIM Funding to Support ACO Development
External funding is a common catalyst and facilitator for delivery system and payment model reform. Participation in the SIM Initiative provided the four states with external funding to support key aspects of their Medicaid ACO implementation. Strategies the states used to support providers in ACO implementation included: (1) direct grants to providers, (2) learning collaboratives and technical assistance, (3) enhanced systems to enable electronic exchange of clinical information, and (4) data analytics to inform providers on performance on key metrics related to their attributed populations.
First, as described earlier, Minnesota offered grants to providers targeted at changing care delivery and connecting them with nonmedical providers. One Minnesota stakeholder observed that its grant allowed the organization “to establish relationships with community partners, or individuals, that [it] didn't have before, and they were able to start to understand each other in different ways.” Vermont's grants funded specific infrastructure development and quality improvement projects.
Second, all four states provided some level of technical assistance to clinical providers participating in the ACO, as well as to ACO administrators, to help them meet performance expectations. One‐on‐one technical assistance and learning collaboratives that offered peer‐to‐peer learning opportunities on practice transformation and use of data to better manage patient care were the most valued forms of technical assistance. For example, Minnesota provided peer learning sessions on how to implement the Plan‐Do‐Study‐Act rapid cycle improvement process, and Vermont implemented the Integrated Communities Care Management Learning Collaborative to teach providers how to identify high‐risk patients and fill gaps in existing services. In Massachusetts, funding was used to support infrastructure development, hire analytic staff to develop dashboards for identifying frequent health care utilizers, create a standing internal workgroup, and prepare internal and external data reports on ACO provider affiliations. Maine conducted regular check‐ins with its ACO administrators to discuss program operations.
Third, states invested significant SIM resources in existing health information exchanges (Maine and Vermont) and inpatient and ED event notification systems (Massachusetts and Vermont), in order to broaden providers’ access to real‐time clinical data. Consumers reported that their providers knew that they had been to the hospital, and providers reported a greater ability to follow up with patients after discharge as a result of these notifications. Providers in Maine uniformly lauded access to the real‐time clinical data available in the state's health information exchange.
Fourth, all states provided or facilitated access to raw Medicaid claims data, or aggregated Medicaid claims data in the form of feedback reports. For example, Maine provided summary feedback reports to ACO administrators on ACO enrollees’ health care quality, use, and cost so administrators could track progress in meeting total cost of care goals, and the state offered consults to its ACOs on how to review and act on data feedback reports. To facilitate care management, Minnesota started providing Medicaid encounter data to Medicaid ACO administrators for all their attributed Medicaid beneficiaries, regardless of managed care or fee‐for‐service enrollment. Minnesota's Medicaid ACOs reported that the data gave them a comprehensive picture of their Medicaid beneficiaries’ health care quality and cost to help them improve care delivery practices. The state also provided ACOs with technical assistance for data analytics and gave grants to Medicaid ACOs for development and implementation of data analytic capabilities. With full Medicaid ACO implementation, Massachusetts began providing its ACOs quarterly utilization and cost reports to help identify high utilizers, as well as raw Medicaid claims data that ACOs can combine with their clinical data to identify areas of clinical need among ACO‐attributed beneficiaries. Vermont used SIM funding to augment its preexisting health information exchange so that participating Medicaid ACOs could examine their progress in meeting cost and quality targets for ACO participation.
Challenges in ACO Implementation
In all states, ACO providers often expressed concern that they were held accountable for patients’ inefficient use of health care services (eg, nonemergent use of the ED, resistance to receiving an evidence‐based test) despite providers’ efforts to reach out and educate patients on appropriate care. In addition, individual clinicians perceived that any shared savings earned by the ACO were kept by the parent organizations, which may have reduced incentives for individual clinicians to align practice patterns with ACO goals. On the other hand, ACO administrators in Vermont noted the need to make up‐front investments prior to earning shared savings. In Minnesota Medicaid's ACO “2.0,” launched in 2018, the revised terms of the payment model addressed this issue by introducing a population‐based payment paid retrospectively every quarter.
Utilization, Quality, and Expenditure Outcomes
Table 3 summarizes the sociodemographic and health care utilization characteristics of ACO enrollees in Vermont, Minnesota, and Maine one year prior to ACO implementation. These data show the similarities and differences in ACO enrollees’ characteristics across the three states. Minnesota had the largest study sample (more than 230,000), while Maine had the smallest (almost 44,000). With few exceptions, sociodemographic characteristics were similar across states. Maine had a higher rate of disabled Medicaid beneficiaries, due in part to Maine's inclusion of Medicare‐Medicaid enrolled beneficiaries in the ACOs, and Minnesota had more ACO enrollees in urban areas, relative to Maine and Vermont. Minnesota also had more inpatient utilization, while Vermont had greater ED use.
Table 3. Characteristics of ACO Enrollees in the Year Prior to ACO Implementation, by State

| Characteristic | Maine | Minnesota Medicaid | Minnesota APCD | Vermont |
|---|---|---|---|---|
| Number of unique beneficiaries | 43,994 | 294,923 | 239,245 | 64,643 |
| **Beneficiary‐level sociodemographic characteristics** | | | | |
| Female, % | 57.29 | 56.19 | 56.28 | 53.13 |
| Age < 1 year, % | 2.72 | 4.17 | 3.95 | 0.75 |
| Age 1‐18 years, % | 43.13 | 51.24 | 50.60 | 53.16 |
| Age 19‐64 years, % | 47.68 | 44.59 | 45.45 | 46.08 |
| Age ≥ 65 years, % | 6.48 | 0.01 | 0.0 | – |
| Enrolled in Medicaid because of disability, % | 23.23 | 8.28 | – | 14.26 |
| Medicare‐Medicaid beneficiaries, % | 18.71 | – | – | – |
| CDPS risk score in the prior year, mean | 1.30 | 1.30 | – | 1.30 |
| Hierarchical Condition Categories score^b | – | – | 2.13 | – |
| **Characteristics of beneficiary county of residence** | | | | |
| Metropolitan status, % | 50.63 | 77.41 | 77.14 | 22.74 |
| Uninsured rate, % | 14.0 | 9.7 | 9.7 | 8.9 |
| Median age | 42.2 | 37.4 | 37.5 | 42.2 |
| Poverty rate, % | 15.4 | 12.4 | 12.7 | 12.6 |
| Hospital beds per 1,000 population | 3.7 | 3.5 | 3.5 | 2.4 |
| **Health care utilization/expenditures for beneficiaries** | | | | |
| Total annual Medicaid expenditures, prior year, $ | 5,975.00 | – | 4,943.88 | 4,960.70 |
| Any inpatient admission in the prior year, % | 10.44 | 10.28 | 8.80 | 4.53 |
| Any ED visit in the prior year, % | 39.76 | 31.76 | 29.53 | 27.29 |

Abbreviations: ACO, accountable care organization; APCD, all payer claims database; CDPS, Chronic Illness and Disability Payment System (larger CDPS scores correspond with a larger number of comorbidities or a more severe set of comorbidities); ED, emergency department.
In Maine, the year prior to implementation was August 2013‐July 2014; in Minnesota, January 2012‐December 2012; in Vermont, January 2013‐December 2013. ^b Larger Hierarchical Condition Categories scores correspond with a larger number of comorbidities or a more severe set of comorbidities.
Utilization
As ACOs increased their emphasis on care coordination and care management, we expected to see more primary care use and fewer inpatient admissions, 30‐day readmissions, and ED visits. The expected impact on specialty care is more ambiguous. Use of specialty care could increase if care management activities are able to connect patients with needed care or if specialists are able to manage conditions in lower cost settings. Conversely, the use of specialty care could decline if more care is being delivered and managed in the primary care setting.
As shown in Table 4, the percentage of ACO enrollees with a primary care visit remained unchanged in Vermont and decreased in Maine and Minnesota; relative to the comparison group, the likelihood of having a primary care visit declined significantly in Maine and Minnesota but not in Vermont (Maine DID estimate: −5.8, p = 0.004; Minnesota DID estimate: −7.8, p < 0.001; Vermont DID estimate: −0.5, p = 0.21). Similar patterns were observed for receipt of specialty care. Maine and Vermont had relatively little change among ACO enrollees, while Minnesota had a large drop in specialty care use. In all three states, difference‐in‐differences estimates indicated a lower likelihood of receiving specialty care relative to the comparison group (Maine DID estimate: −1.1, p < 0.001; Minnesota DID estimate: −9.4, p < 0.001; Vermont DID estimate: −1.8, p < 0.001).
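The DID estimates reported here come from regression models. As a sketch only (not the evaluation's exact specification, which also includes propensity score weighting and multiple post‐period years), a generic two‐group, two‐period difference‐in‐differences regression takes the form:

```latex
% Generic difference-in-differences regression (illustrative sketch only;
% the evaluation's actual models include additional adjustments)
\begin{equation*}
Y_{it} = \beta_0 + \beta_1\,\mathrm{ACO}_i + \beta_2\,\mathrm{Post}_t
       + \beta_3\left(\mathrm{ACO}_i \times \mathrm{Post}_t\right)
       + X_{it}\gamma + \varepsilon_{it}
\end{equation*}
```

Here the coefficient on the interaction term, β3, is the DID estimate: the change in the outcome among ACO enrollees net of the change in the comparison group, adjusted for covariates X.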
Table 4. Difference‐in‐Differences Estimates of Changes in Health Care Utilization, ACO Enrollees Relative to the Comparison Group

| State | Pre‐Period Adjusted Mean, ACO | Pre‐Period Adjusted Mean, CG | Post‐Period Adjusted Mean, ACO | Post‐Period Adjusted Mean, CG | Regression‐Adjusted Difference‐in‐Differences (90% Confidence Interval) | P‐Value | Total Weighted Number of Period‐Years in Regression Model |
|---|---|---|---|---|---|---|---|
| **Percentage of beneficiaries with a visit to a primary care provider (%)** | | | | | | | |
| Maine | 57.4 | 63.6 | 56.5 | 66.8 | −5.8 (−9.1, −2.5) | 0.004 | 394,589 |
| Minnesota | 92.5 | 87.2 | 67.2 | 71.1 | −7.8 (−8.0, −7.7) | < 0.001 | 3,985,920 |
| Vermont | 81.1 | 82.3 | 76.2 | 78.1 | −0.5 (−1.2, 0.2) | 0.21 | 767,146 |
| **Percentage of beneficiaries with any visit to a specialist (%)** | | | | | | | |
| Maine | 30.2 | 29.6 | 30.9 | 31.0 | −1.1 (−1.7, −0.5) | < 0.001 | 394,589 |
| Minnesota | 38.1 | 33.7 | 25.0 | 29.9 | −9.4 (−9.5, −9.3) | < 0.001 | 3,985,920 |
| Vermont | 28.7 | 28.6 | 28.4 | 30.0 | −1.8 (−2.4, −1.2) | < 0.001 | 767,146 |
| **All‐cause acute inpatient hospitalizations per 1,000 covered persons** | | | | | | | |
| Maine | 99.9 | 97.5 | 91.9 | 95.8 | −6.8 (−10.8, −2.7) | 0.006 | 394,589 |
| Minnesota | 100.1 | 99.2 | 113.1 | 104.3 | 7.4 (6.6, 8.1) | < 0.001 | 3,985,920 |
| Vermont | 54.1 | 50.1 | 60.7 | 61.9 | −5.8 (−7.8, −3.9) | < 0.001 | 767,146 |
| **ED visits not leading to hospitalization per 1,000 covered persons** | | | | | | | |
| Maine | 420.0 | 405.3 | 406.4 | 401.7 | −12.4 (−18.6, −6.2) | 0.001 | 394,589 |
| Minnesota | 425.8 | 372.4 | 320.9 | 305.0 | −29.7 (−30.9, −28.5) | < 0.001 | 369,362 |
| Vermont | 348.9 | 320.9 | 312.3 | 301.3 | −15.8 (−19.7, −11.8) | < 0.001 | 767,146 |
| **All‐cause 30‐day readmissions per 1,000 discharges** | | | | | | | |
| Maine | 134.3 | 128.5 | 160.1 | 150.7 | 2.0 (−13.2, 17.2) | 0.827 | 43,840 |
| Minnesota | 123.5 | 124.3 | 126.1 | 128.7 | −5.1 (−9.4, −0.8) | 0.05 | 369,362 |
| Vermont | 92.2 | 98.3 | 103.1 | 107.9 | 0.0 (−13.7, 13.7) | 0.1 | 37,547 |

Abbreviations: ACO, accountable care organization; CG, comparison group; ED, emergency department.
ACO enrollees in Maine experienced 6.8 fewer inpatient admissions per 1,000 covered persons relative to the comparison group (p = 0.006). In contrast, Minnesota and Vermont ACO enrollees experienced an increase in inpatient admissions. In Minnesota, the increase resulted in 7.4 more inpatient admissions relative to the comparison group (p < 0.001), whereas in Vermont the increase was smaller than the increase in the comparison group (5.8 fewer admissions for ACO enrollees relative to the comparison group, p < 0.001). All states experienced an increase in 30‐day readmissions; however, in Minnesota the ACO group experienced a smaller increase relative to the comparison group (DID estimate: −5.1, p = 0.05). ACO enrollees in all three states experienced a decline in the rate of ED visits that was larger than the decline experienced by the comparison group (Maine DID estimate: −12.4, p = 0.001; Minnesota DID estimate: −29.7, p < 0.001; Vermont DID estimate: −15.8, p < 0.001).
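To illustrate how a difference‐in‐differences estimate nets out background trends, the following sketch computes the raw four‐cell DID from the adjusted means in Table 4 for Maine's inpatient hospitalizations. The function name is ours, for illustration only; the published estimates are regression‐adjusted, so this raw calculation (−6.3) will not exactly match the reported −6.8.

```python
# Illustrative 2x2 difference-in-differences calculation using the adjusted
# means reported for Maine's all-cause inpatient hospitalizations (Table 4).
# The published estimate (-6.8) is regression-adjusted, so this raw
# four-cell calculation does not match it exactly.

def did_estimate(pre_treat, pre_control, post_treat, post_control):
    """Change in the treated group minus the change in the comparison group."""
    return (post_treat - pre_treat) - (post_control - pre_control)

# Adjusted means per 1,000 covered persons, from Table 4 (Maine rows).
maine_did = did_estimate(pre_treat=99.9, pre_control=97.5,
                         post_treat=91.9, post_control=95.8)
print(round(maine_did, 1))  # -6.3
```

The comparison group's change (−1.7 admissions) is subtracted from the ACO group's change (−8.0 admissions), so secular trends affecting both groups cancel out.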
Quality
As care coordination and incentives to meet quality targets were put into place, we expected more beneficiaries to receive recommended screenings and tests to manage chronic diseases. In Maine and Minnesota, we expected that more beneficiaries with diabetes would receive the recommended HbA1c tests. As shown in Table 5, HbA1c testing rates declined for both the ACO and comparison groups in Maine, and there were no statistically significant differences between groups. In contrast, in Minnesota, the likelihood of HbA1c testing increased by 3 percentage points for ACO enrollees relative to the comparison group (p < 0.001).
Table 5. Difference‐in‐Differences Estimates of Changes in Quality of Care, ACO Enrollees Relative to the Comparison Group

| State | Pre‐Period Adjusted Mean, ACO | Pre‐Period Adjusted Mean, CG | Post‐Period Adjusted Mean, ACO | Post‐Period Adjusted Mean, CG | Regression‐Adjusted Difference‐in‐Differences (90% Confidence Interval) | P‐Value | Total Weighted Number of Period‐Years in Regression Model |
|---|---|---|---|---|---|---|---|
| **Percentage of Medicaid beneficiaries age 18‐75 years with diabetes who had hemoglobin A1c (HbA1c) testing** | | | | | | | |
| Maine | 85.3 | 83.3 | 79.1 | 76.6 | −0.2 (−3.5, 3.1) | 0.93 | 12,141 |
| Minnesota | 93.1 | 92.8 | 93.8 | 90.7 | 3.0 (2.5, 3.4) | < 0.001 | 113,674 |
| **Percentage of Medicaid beneficiaries age 18 years and older diagnosed with a new episode of major depression and treated with antidepressant medication, who remained on medication treatment at least 180 days** | | | | | | | |
| Maine | 41.1 | 39.3 | 40.1 | 38.6 | 0.3 (−2.2, 2.7) | 0.859 | 10,677 |
| Minnesota | 34.1 | 35.1 | 36.0 | 38.3 | −1.4 (−2.2, −0.7) | 0.002 | 113,674 |
| **Percentage of beneficiaries age one to three years who had a developmental screening** | | | | | | | |
| Vermont | 36.1 | 49.9 | 46.3 | 47.1 | 12.9 (9.2, 16.7) | < 0.001 | 56,812 |

Abbreviations: ACO, accountable care organization; CG, comparison group.
All three states used three years of baseline data prior to the start of their post‐period. Maine's post‐period was August 2014 through July 2016. Minnesota's post‐period was January 2013 through December 2016 for utilization and quality measures. Minnesota's post‐period was January 2013 through December 2015 for expenditure measures. Vermont's post‐period was January 2014 through December 2016.
Maine and Minnesota identified behavioral health integration as integral to improving outcomes for Medicaid beneficiaries. We expected beneficiaries with major depressive disorder to receive more coordinated care through their ACOs and subsequently experience better medication adherence. In Maine, among Medicaid beneficiaries aged 18 years or older with depression, there was little change in the percentage of beneficiaries on antidepressants for 180 days among ACO enrollees and the comparison group, and there were no statistically significant differences between the two groups (see Table 5). In contrast, in Minnesota the percentage of ACO enrollees who remained on medication treatment at least 180 days increased by 1.4 percentage points less than in the comparison group (p = 0.002).
Vermont also included early childhood developmental screening as a Medicaid ACO priority area. The percentage of beneficiaries aged one to three years who had a developmental screening increased for ACO‐attributed beneficiaries but declined for the comparison group, resulting in a 12.9 percentage point relative increase in developmental screenings for ACO‐attributed beneficiaries during the three years of the Medicaid shared savings program (p < 0.001) (see Table 5).
Other quality measures included in the final evaluation16 showed nonsignificant differences between the ACO and comparison groups.
Expenditures
A central goal for all of the Medicaid ACO models is to either lower costs or slow cost growth by preventing avoidable use of high‐cost services. As care coordination and quality of care increased, we expected cost growth to slow as a result of decreased use of high‐cost services, such as care in EDs and inpatient hospitalizations.
Maine and Vermont ACO enrollees experienced an increase in total Medicaid expenditures after ACO implementation, while Minnesota enrollees experienced a decrease. However, changes in total Medicaid expenditures were not statistically different from the comparison group in Maine and Minnesota. In Vermont, total Medicaid expenditure growth was $39.92 PBPM less than the growth in expenditures for the comparison group (p < 0.001). Inpatient expenditures increased for ACO enrollees in all states, and in Vermont the increase was $12.29 PBPM less than the growth in inpatient expenditures for the comparison group (p < 0.001). Professional expenditures decreased by $6.92 (p = 0.056) and $21.34 (p < 0.001) PBPM in Minnesota and Vermont, respectively, relative to the comparison group, and in Maine there was little change relative to the comparison group. While pharmaceutical expenditures increased in Maine and Vermont, the increase in Maine was not statistically different from that in the comparison group, and the increase in Vermont was $4.59 PBPM less than the increase in the comparison group (p = 0.04) (Table 6).
Table 6. Difference‐in‐Differences Estimates of Changes in Medicaid Expenditures, ACO Enrollees Relative to the Comparison Group

| State | Pre‐Period Adjusted Mean, ACO | Pre‐Period Adjusted Mean, CG | Post‐Period Adjusted Mean, ACO | Post‐Period Adjusted Mean, CG | Regression‐Adjusted Difference‐in‐Differences (90% Confidence Interval) | P‐Value | Total Weighted Number of Period‐Years in Regression Model |
|---|---|---|---|---|---|---|---|
| **Total expenditures (PBPM)^b** | | | | | | | |
| Maine | 679.48 | 682.30 | 729.92 | 716.13 | 8.94 (−22.75, 40.63) | 0.642 | 394,589 |
| Minnesota | 500.96 | 470.99 | 474.41 | 453.85 | −9.40 (−20.75, 1.95) | 0.173 | 2,076,353 |
| Vermont | 477.99 | 469.41 | 493.48 | 519.99 | −39.92 (−50.21, −29.63) | < 0.001 | 767,146 |
| **Inpatient expenditures (PBPM)** | | | | | | | |
| Maine | 59.58 | 57.54 | 62.40 | 61.76 | −2.35 (−8.85, 4.15) | 0.552 | 394,589 |
| Minnesota | 158.07 | 153.86 | 181.79 | 180.08 | −2.49 (−10.95, 5.97) | 0.629 | 2,076,353 |
| Vermont | 58.79 | 49.32 | 67.56 | 68.95 | −12.29 (−16.52, −8.06) | < 0.001 | 767,146 |
| **Professional expenditures (PBPM)** | | | | | | | |
| Maine | 242.03 | 261.75 | 241.54 | 259.21 | 0.18 (−13.96, 14.31) | 0.984 | 394,589 |
| Minnesota | 342.89 | 317.13 | 292.62 | 273.76 | −6.92 (−12.86, −0.97) | 0.056 | 2,076,353 |
| Vermont | 258.53 | 252.87 | 251.50 | 265.77 | −21.34 (−27.74, −14.93) | < 0.001 | 767,146 |
| **Pharmaceutical expenditures (PBPM)** | | | | | | | |
| Maine | 79.81 | 73.18 | 102.14 | 95.07 | −1.21 (−5.84, 3.43) | 0.669 | 394,589 |
| Vermont | 90.30 | 96.37 | 96.07 | 105.13 | −4.59 (−8.26, −0.93) | 0.04 | 767,146 |

Abbreviations: ACO, accountable care organization; CG, comparison group; PBPM, per beneficiary per month.
All three states used three years of baseline data prior to the start of their post‐period. Maine's post‐period was August 2014 through July 2016. Minnesota's post‐period was January 2013 through December 2016 for utilization and quality measures. Minnesota's post‐period was January 2013 through December 2015 for expenditure measures. Vermont's post‐period was January 2014 through December 2016.
Due to data constraints, Minnesota's total spending is inclusive only of medical spending. Maine and Vermont's total spending includes both medical and pharmaceutical spending. Estimates do not include shared savings payments.
Discussion
Many states are using ACOs as a way to improve health outcomes, coordinate care, and reduce expenditures for their Medicaid beneficiaries. This study, which used an in‐state comparison group and controlled for population characteristics, shows promising impacts of ACO enrollment on patient‐level outcomes in the three states where data were available. Maine and Vermont had slower growth in inpatient admissions, and all three states experienced a decrease in ED visits, relative to the in‐state comparison group. Changes in quality of care were minimal relative to the comparison group, though we note that the measures presented here do not necessarily represent the measures used for assessing ACOs’ performance for purposes of shared savings, and therefore may not be reflective of their overall quality performance. Nonetheless, improvements in some measures in Minnesota and Vermont suggest that ACOs can improve providers’ ability to reach out and engage patients around evidence‐based care. While Vermont experienced the greatest number of statistically significant reductions or slower growth across expenditure categories, Minnesota's estimates also often trended toward lower costs or slower cost growth.
The potential financial benefit for both states and providers is significant. As calculated by the Medicaid programs themselves, the total amount of savings generated by the Medicaid ACO programs in their early years of operations—and distributed to each participating ACO—varied, from $5.4 million in Maine (sharing $856,675 across four ACOs in its initial year, August 2014‐July 2015),21 to $14.6 million in Vermont (sharing $6.6 million across its two ACOs in its first year, 2014),22 to more than $65 million in Minnesota (sharing $23 million across nine providers in its second year, 2014).12 Still, early results may not predict future trends: in Vermont, the amount of savings decreased over time, with only one ACO generating shared savings, at a significantly lower rate in 2015 than in 2014.23 Additionally, implementation costs are significant, and none of them are factored into the savings calculations.
Certain Medicaid ACO design features may account for these positive findings. First, all three states built their ACO models upon existing primary care health reforms. As a result, ACO participation may have been seen as the natural “next step” for primary care providers. Additionally, these primary care health reforms gave participating clinicians experience in setting practice transformation goals to reduce costs and improve quality, which may have facilitated their participation in an ACO.
Second, ACOs in each state included providers from multiple service sectors to ensure that the behavioral health, long‐term care, and social service needs of medically and socially complex Medicaid beneficiaries could be met under an ACO arrangement. Developing partnerships among clinical and nonclinical providers was often cited by ACO‐participating providers as a welcome but challenging opportunity to work across service sectors to improve patient outcomes. Third, Maine, Minnesota, and Vermont each used their SIM funds to invest heavily in health IT and data analytics to help ACOs and their providers access and use clinical and claims data to manage high utilizers and identify patients with gaps in evidence‐based care. Providers in all three states reported that inpatient and ED event notification systems, in particular, helped them identify which patients to follow up with about appropriate use of the hospital and ED and, consequently, to reduce ED and inpatient admissions rates and meet ACO quality and cost targets for shared savings.
Fourth, each program provided practice transformation assistance to ACO‐participating practices through one‐on‐one assistance or learning collaboratives. Providers used these supports to effect change in care delivery to meet ACO performance expectations. While each state tailored the design to meet its specific needs, they all shared the overarching goals of redesigning workflows, improving quality, and coordinating patient care.
Notably, one key Medicaid ACO design feature—the use of two‐sided risk—likely had minimal influence on study outcomes. During the study period, only some of Minnesota's ACOs were operating under a two‐sided risk arrangement, suggesting that ACOs did not need to be at more financial risk in order to improve health care outcomes. It will be important to continue to study these Medicaid beneficiary outcomes in these Medicaid ACO models as Vermont, Massachusetts, and Minnesota shift providers into greater risk arrangements, and as ACOs in Maine opt for two‐sided risk.
Among the three states, Vermont had the most promising results in reducing high‐cost service use (inpatient admissions and ED visits) and Medicaid cost growth, followed by Minnesota and Maine. While our study was not designed to elicit explanations for the relative performance of one state compared to another, we can posit a few potential explanations for differences in results. First, Vermont's and Minnesota's efforts to support ACOs were focused and intensive, while Maine's efforts were less so. For example, in contrast to Maine and Minnesota, Vermont aligned its Medicaid ACO design with Medicare and commercial ACOs in the state. As a result, ACO‐participating providers were subject to similar program features across all the payers with which they entered into an ACO arrangement. This cross‐payer alignment may have made it easier for providers to meet program expectations. Second, Vermont prioritized connecting the Medicaid ACOs to its health information exchange so that the ACOs could leverage clinical data for care management, and Minnesota gave providers, some of which were ACOs, grants to improve and build data analytics capacity. Both Vermont and Minnesota also gave grants to providers, some of which were ACOs, to build any infrastructure needed to operate effectively within an APM. Maine, on the other hand, took a more passive approach to its ACOs, though the state did use SIM Initiative funding to provide extensive practice transformation and data support to its primary care medical homes and behavioral health providers, some of whom participated in the ACOs. Third, both Vermont and Minnesota had more years of post‐ACO implementation data available, which could have increased the likelihood of observing positive results over more mature implementation years.
Although there are no published comparison‐group studies of Medicaid ACO programs against which to benchmark our results, several Medicaid programs have reported reductions in inpatient and ED visits within ACOs, consistent with our findings.24 Others have reported slower total Medicaid cost growth compared to historical expenditure growth, resulting in shared savings.24
Provider support and acceptance of the Medicaid ACO model, early results generating shared savings, and the utilization of Medicaid demonstration waivers have facilitated model sustainability beyond the SIM Initiative. Vermont's Medicaid shared savings program, which was one‐sided risk only, has now been replaced by an ACO model developed with a Medicaid demonstration waiver with two‐sided risk. Maine will continue its Medicaid ACOs without any significant changes, while Minnesota has significantly redesigned its model to facilitate better integration of nontraditional providers of care, improve population health, and move more providers into two‐sided risk. Building on the experience of its ACO pilot, Massachusetts launched an ACO program that will reach the majority of beneficiaries in the state via three types of ACO contracts with two‐sided risk.
Limitations of Quantitative Analysis
Through the SIM Initiative, Maine, Minnesota, and Vermont introduced or broadened Medicaid ACO programs while also providing varying degrees of technical assistance or statewide support for practice transformation, population identification and management, and quality improvement efforts for non‐ACO related initiatives, like patient‐centered medical homes. These complementary state‐based reforms could have influenced participating ACO providers, and thereby may provide an alternative explanation for some of the more favorable findings. We could not identify, and therefore could not disentangle, the synergistic effects of all ongoing initiatives from ACO effects. In this context, these three states and their generally positive findings may be considered a “best‐case scenario” that depends on a strong foundation of primary care transformation. Because we could not identify and control for all ongoing initiatives, the comparison group practices also may have been engaged in transformation activities similar to those of ACO providers. This contamination could have attenuated our findings. Study results in Minnesota and Vermont also may have been impacted by the expansion in Medicaid eligibility that went into effect at the same time as ACO implementation. While we did not explicitly test for the effect of this policy change, analyses attempted to balance the distribution of these newly eligible individuals between the intervention and comparison groups.
In Maine, we included beneficiaries who were dually eligible for both Medicaid and Medicare to reflect the program design that emphasized participation of persons with disabilities. However, we restricted our analysis to those services covered by Medicaid and therefore do not capture services paid for by Medicare.
One additional potential criticism of the presented analyses is that we included baseline outcome measures in our propensity score specification, which has been shown to be associated with regression‐to‐the‐mean biases in difference‐in‐differences modeling in some situations (eg, Daw and Hatfield 201825). We made the analytic decision to include these measures in the propensity score specification because we wanted to better account for unobserved baseline health risks that, if left unaddressed, could have led to omitted variable bias. Additionally, the choice to include these measures agrees with the decision rules for including or not including pretreatment outcome measures provided by Daw and Hatfield,25 which suggests that regression‐to‐the‐mean biases are less likely in cases where pretreatment outcome differences are low. This was the case in our empirical context and for the pretreatment outcome measures we included (see Online Appendix Tables 3‐5).
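As a sketch of the weighting idea behind such propensity score analyses (not the evaluation's actual specification; the helper name is ours, for illustration), one common approach gives treated units a weight of 1 and weights each comparison unit by p/(1 − p), where p is the estimated probability of treatment, so that the weighted comparison group resembles the treated population:

```python
# Illustrative inverse-probability weights for a treated-vs-comparison
# analysis, assuming a propensity score p has already been estimated for
# each person (hypothetical values below). Sketch only, not the
# evaluation's actual method.

def comparison_weight(is_treated: bool, p: float) -> float:
    """Treated units get weight 1; comparison units get p/(1-p), which
    up-weights comparison members who resemble the treated population."""
    return 1.0 if is_treated else p / (1.0 - p)

# (treated?, estimated propensity score) for three hypothetical people
sample = [(True, 0.8), (False, 0.8), (False, 0.2)]
weights = [round(comparison_weight(t, p), 6) for t, p in sample]
print(weights)  # [1.0, 4.0, 0.25]
```

A comparison member with a high propensity score (0.8) looks like the treated group and is weighted up, while one with a low score (0.2) is weighted down; including baseline outcomes in the score, as the authors did, makes p reflect baseline health risk as well.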
Finally, this analysis reflects two to four years of post‐ACO implementation experience. The first few years of implementation often are focused on the foundational work of learning how to transform care. Over time, providers and health systems gain the knowledge and skill to begin to change patterns of care and realize lower cost growth or even net savings, a sentiment echoed by key stakeholders in these states. Because our analysis period is relatively short, more time is likely needed to see the full effects of ACO activities. The generally positive trends in results during this limited exposure time suggest that this model may be a promising alternative to the traditional volume‐driven system of reimbursement.
Conclusion
Over the course of the SIM Initiative, four states implemented Medicaid ACOs, and three states exhibited promising associations between ACO enrollment and patient outcomes. These SIM states’ experiences illustrate that the transition to APMs and value‐based payment is not only feasible but can achieve positive results, even when implemented in a diverse, high‐cost, high‐need Medicaid population. As more Medicaid ACOs emerge, states can look to these findings and the implementation experience of these four states for a better understanding of what can be achieved when payers align and a range of providers are enabled to deliver a more coordinated care experience for Medicaid beneficiaries.
Funding/Support
The data collection and analysis on which this article is based were funded by the Centers for Medicare and Medicaid Services under the State Innovation Models Evaluation, contract no. HHSM‐500‐2010‐00021i. The findings and conclusions contained in this article are those of the authors and do not necessarily reflect the official position of the Centers for Medicare and Medicaid Services.
Conflict of Interest Disclosures: All authors have completed the ICMJE Form for Disclosure of Potential Conflicts of Interest. No disclosures were reported.
Acknowledgements: The authors would like to thank Vincent Keyes and Marisa Morrison for data analysis and programming assistance; Sara Freeman, Noelle Siegfried, and Leslie Greenwald for input on state‐specific content; and Suzanne Wensky at the Centers for Medicare and Medicaid Services for helpful comments on earlier versions.
References
- 1. Rudowitz R, Valentine A. Medicaid enrollment & spending growth: FY 2017 & 2018. The Henry J. Kaiser Family Foundation; October 19, 2017. https://www.kff.org/medicaid/issue-brief/medicaid-enrollment-spending-growth-fy-2017-2018/. Accessed October 12, 2018.
- 2. Wiener JM, Romaire M, Thach N, et al. Strategies to reduce Medicaid spending: findings from a literature review. The Henry J. Kaiser Family Foundation; June 21, 2017. https://www.kff.org/medicaid/issue-brief/strategies-to-reduce-medicaid-spending-findings-from-a-literature-review/. Accessed October 12, 2018.
- 3. Muhlestein D. Growth and dispersion of accountable care organizations in 2015. Health Affairs blog. https://www.healthaffairs.org/do/10.1377/hblog20150331.045829/full/. Published March 31, 2015. Accessed October 12, 2018.
- 4. McWilliams JM, Hatfield LA, Chernew ME, Landon BE, Schwartz AL. Early performance of accountable care organizations in Medicare. N Engl J Med. 2016;374(24):2357‐2366. 10.1056/NEJMsa1600142
- 5. McWilliams JM, Landon BE, Chernew ME. Performance in year 1 of Pioneer Accountable Care Organizations. N Engl J Med. 2015;373(8):777. 10.1056/NEJMc1507320
- 6. Nyweide DJ, Lee W, Cuerdon TT, et al. Association of Pioneer Accountable Care Organizations vs traditional Medicare fee for service with spending, utilization, and patient experience. JAMA. 2015;313(21):2152‐2161. 10.1001/jama.2015.4930
- 7. McWilliams JM, Hatfield LA, Landon BE, Hamed P, Chernew ME. Medicare spending after 3 years of the Medicare Shared Savings Program. N Engl J Med. 2018;379:1139‐1149. 10.1056/NEJMsa1803388
- 8. Matulis R, Lloyd J. The history, evolution, and future of Medicaid accountable care organizations. Center for Health Care Strategies website. https://www.chcs.org/resource/history-evolution-future-medicaid-accountable-care-organizations/. Published February 2018. Accessed October 12, 2018.
- 9. Baseman S, Boccuti C, Moon M, Griffin S, Dutta T. Payment and delivery system reform in Medicare: a primer on medical homes, accountable care organizations, and bundled payments. The Henry J. Kaiser Family Foundation; 2016. http://files.kff.org/attachment/Report-Payment-and-Delivery-System-Reform-in-Medicare.pdf. Accessed October 12, 2018.
- 10. Truchil A, Dravid N, Singer S, et al. Lessons from the Camden Coalition of Healthcare Providers’ first Medicaid shared savings performance evaluation. Popul Health Manag. 2018;21(4):278‐284. 10.1089/pop.2017.0164
- 11. Lloyd J. Shared savings for Medicaid accountable care organizations: design considerations. Center for Health Care Strategies, Inc; July 2017. https://www.chcs.org/media/Medicaid-ACO-Shared-Savings-TA-Brief_072817.pdf. Accessed March 11, 2019.
- 12. Blewett LA, Spencer D, Huckfeldt P. Minnesota integrated health partnership demonstration: implementation of a Medicaid ACO model. J Health Polit Policy Law. 2017;42(6):1127‐1142. 10.1215/03616878-4193666
- 13. McConnell JK, Renfro S, Lindrooth RC, et al. Oregon's Medicaid reform and transition to global budgets were associated with reductions in expenditures. Health Aff (Millwood). 2017;36(3):451‐459. 10.1377/hlthaff.2016.1298
- 14. McConnell JK, Renfro S, Chan BKS, et al. Early performance in Medicaid accountable care organizations: a comparison of Oregon and Colorado. JAMA Intern Med. 2017;177(4):538‐545. 10.1001/jamainternmed.2016.9098
- 15. Vickery KD, Shippee ND, Guzman‐Corrales LM, et al. Changes in quality of life among enrollees in Hennepin Health: a Medicaid expansion ACO [epub ahead of print]. Med Care Res Rev. 2018;1‐25. 10.1177/1077558718769457
- 16. RTI International. State Innovation Models (SIM) Initiative Evaluation: Model Test Year Five Annual Report. Baltimore, MD: Centers for Medicare and Medicaid Services; 2019. https://downloads.cms.gov/files/cmmi/sim-rd1-mt-fifthannrpt.pdf. Accessed May 24, 2019.
- 17. Austin PC. An introduction to propensity score methods for reducing the effects of confounding in observational studies. Multivariate Behav Res. 2011;46(3):399‐424.
- 18. Puhani P. The treatment effect, the cross difference, and the interaction term in nonlinear “difference‐in‐differences” models. Econ Lett. 2012;115:85‐87.
- 19. Thompson FJ, Cantor JC, Houston R. Control versus administrative discretion in negotiating voluntary P4P networks: the case of Medicaid accountable care organizations [epub ahead of print]. Adm Soc. 2018;1‐30. 10.1177/0095399718775320
- 20. RTI International. State Innovation Models (SIM) Initiative Evaluation: Model Test Year Four Annual Report. Baltimore, MD: Centers for Medicare and Medicaid Services; 2018. https://downloads.cms.gov/files/cmmi/sim-rd1-mt-fourthannrpt.pdf. Accessed March 11, 2019.
- 21. MaineCare. Accountable Communities Initiative. Maine.gov website. https://www.maine.gov/dhhs/oms/vbp/accountable.html. Published May 2018. Accessed October 12, 2018.
- 22. Slusky R, Jones P, Cooper A. Year 1 (2014) results for Vermont's commercial and Medicaid ACO shared savings programs. Presented to VHCIP Steering Committee, October 28, 2015. https://healthcareinnovation.vermont.gov/sites/vhcip/files/documents/October%20-%202015%20Year%201%20%282014%29%20Results%20for%20Vermont%E2%80%99s%20Commercial%20and%20Medicaid%20ACO%20Shared%20Savings%20Programs.pdf. Accessed March 11, 2019.
- 23. Green Mountain Care Board. Vermont's ACO Shared Savings Programs: results and lessons learned 2014–2016. http://gmcboard.vermont.gov/sites/gmcb/files/FINAL%20Year%203%20Shared%20Savings%20Program%20Results%2012%2019%202017%20to%20GMCB%20FINAL_DVHA%20update.pdf. Published December 2017.
- 24. Center for Health Care Strategies, Inc. Medicaid ACO programs: promising results from leading‐edge states. Center for Health Care Strategies website. http://www.chcs.org/media/MedicaidACOProgramsWebinar_01.17.17.pdf. Published January 17, 2017. Accessed October 12, 2018.
- 25. Daw JR, Hatfield LA. Matching and regression to the mean in difference‐in‐differences analysis. Health Serv Res. 2018;53:4138‐4156. 10.1111/1475-6773.12993