Abstract
Objective. To investigate how integration between Medicare Advantage plans and health care providers is related to plan premiums and quality ratings.
Data Source. We used public data from the Centers for Medicare and Medicaid Services (CMS) and the Area Resource File and private data from one large insurer. Premiums and quality ratings are from 2009 CMS administrative files and some control variables are historical.
Study Design. We estimated ordinary least-squares models for premiums and plan quality ratings, with state fixed effects and firm random effects. The key independent variable was an indicator of plan–provider integration.
Data Collection. With the exception of Medigap premium data, all data were publicly available. We ascertained plan–provider integration through examination of plans’ websites and governance documents.
Principal Findings. We found that integrated plan–providers charge higher premiums, controlling for quality. Such plans also have higher quality ratings. We found no evidence that integration is associated with more generous benefits.
Conclusions. Current policy encourages plan–provider integration, although potential effects on health insurance products and markets are uncertain. Policy makers and regulators may want to closely monitor changes in premiums and quality after integration and consider whether quality improvement (if any) justifies premium increases (if they occur).
Keywords: Medicare, health economics, industrial organization, health care
Increasingly, health care providers are seeking advantages through integration (mergers). Although market power resulting from “horizontal” mergers between hospitals or insurance companies has received considerable attention from the media (Kowalczyk and Weisman 2012; Pearlstein 2012), health economists (Capps et al. 2002), and antitrust regulators at the federal (Federal Trade Commission and Department of Justice 1996) and state (Coakley 2010) levels, mergers between insurance companies and hospitals have received less scrutiny. However, such plan–provider or “vertical” integration is justified by organizations on the same cost efficiency and quality improvement grounds as horizontal integration of hospitals or insurers.1 Both types of integration raise concerns about excessive market power and consumer welfare.
In this study, we focus on plan–provider integration in the Medicare Advantage (MA) market. This is both convenient and relevant to policy. The data necessary for a study of this kind are publicly available. The same cannot be said of the commercial health insurance market. In addition, integration in the MA market is incentivized by provisions of the 2010 Patient Protection and Affordable Care Act (Public Law 111–148) (hereafter, ACA) and subsequent regulation. Through bonus payments for quality improvement and cost reduction, the ACA encourages the formation of accountable care organizations (ACOs), networks of providers responsible for the care of a defined group of Medicare patients (Frakt and Mayes 2012). ACOs give providers an incentive to consolidate the spectrum of care under one management because bonus payments will be tied to performance on quality measures and a spending target based on the difference between a benchmark and all Medicare spending attributed to beneficiaries associated with the ACO—even when incurred for services provided outside the ACO. In addition, some ACO contracts put providers at financial risk if their costs are above a benchmark. Consequently, ACOs with risk management capabilities will be better positioned to succeed (Fuchs and Schaeffer 2012). Providers can develop these capabilities internally or acquire them by merging with an insurer. Finally, the ACA offers quality bonus payments to MA plans (Jacobson et al. 2011). To the extent that higher quality can be achieved through plan–provider integration, this is another incentive to integrate.
Although our focus is on MA, our work is relevant to the market for commercial insurance plans and providers, where plan–provider integration may be just as common, if not more so. According to our analysis, about 17 percent of MA plans are integrated. Rabin (2012) reports on an industry survey that found that 20 percent of hospital networks offer an insurance product and an equal proportion are considering doing so. Although we make no claims about the generality of our findings beyond Medicare, this suggests that our study is in the context of an integration trend that is considerably broader than the market we examine.
Although plan–provider integration is occurring and encouraged by policy, to our knowledge there have been no studies of its relation to quality and premiums. Our study begins to fill this void. Using data before ACA passage, we find that integrated plan–providers charge higher premiums, controlling for quality. Such plans also have higher quality ratings. We also find no evidence that integrated plans offer more generous benefits. In the concluding discussion, we speculate on what these results might mean for consumers and policy makers.
Background
The Medicare Advantage Market
We focus on the MA market in 2009. MA plans are private health plans that bundle the benefits of Medicare Part A (inpatient hospital insurance), Part B (insurance for physician and outpatient services), and, optionally, Part D (drug insurance), as well as other services outside the standard Medicare benefit. A plan's cost for these services is covered by beneficiary premiums and government payments, the latter capped by an administratively established benchmark. Plans bidding below the benchmark receive a rebate equal to a fraction of the benchmark–bid difference, which they must use to finance enhanced benefits. In 2009 this fraction was 75 percent, although beginning in 2012 the fraction is tied to the plan's quality rating, with lower quality plans receiving smaller rebates. Plans bidding above the benchmark must charge a premium to make up the difference (Medicare Payment Advisory Commission [MedPAC] 2012a).
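To make the payment arithmetic concrete, the sketch below (in Python) applies the 2009 rules described above to hypothetical bid and benchmark values; the function name and the figures are illustrative, not drawn from our data.

```python
def ma_plan_financing(bid, benchmark, rebate_share=0.75):
    """Illustrative 2009 MA payment arithmetic (hypothetical values).

    Plans bidding below the benchmark receive a rebate equal to a share of
    the benchmark-bid difference (75 percent in 2009), which must fund
    enhanced benefits; plans bidding above the benchmark must charge the
    difference as a beneficiary premium.
    """
    if bid <= benchmark:
        rebate = rebate_share * (benchmark - bid)   # finances extra benefits
        premium_surcharge = 0.0
    else:
        rebate = 0.0
        premium_surcharge = bid - benchmark         # passed on to enrollees
    return rebate, premium_surcharge

# Hypothetical monthly figures.
print(ma_plan_financing(bid=800.0, benchmark=864.0))   # (48.0, 0.0)
print(ma_plan_financing(bid=900.0, benchmark=864.0))   # (0.0, 36.0)
```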
There are a variety of MA plan types. Following Song, Landrum, and Chernew (2012), we focus on the types with the longest history of Medicare participation, local coordinated care plans (CCPs, mostly HMOs and PPOs), excluding private fee for service (PFFS) plans, as well as employer-sponsored plans. Other MA plan types have very small enrollment: regional PPOs (3 percent of MA enrollment); medical savings accounts (1,866 enrollees); and other plan types and demonstrations (collectively accounting for 3 percent of MA enrollment) (Frakt, Pizer, and Feldman 2012). CCPs operate in multicounty service areas, and their bids are compared with the enrollment-weighted average benchmarks in those areas. Consequently, their bids reflect cost and demand factors in proportion to their popularity in the counties in which they operate (Song, Landrum, and Chernew 2012). This fact informs our handling of the data, described below.
Integration, Competition, and Prices
Plan–provider integration, our focus, is an extreme form of the more general economic concept of vertical restraints, which are constraints on competition due to agreements between firms at different stages of a production and distribution process. In health care, vertical restraints can take many forms, including exclusive dealing for specific services (like radiology or pathology) between physicians and hospitals, and most favored nations agreements requiring the provider to give the insurer a rate as low as it gives to any other (Gaynor and Town 2012). Plan–provider integration and exclusive dealing that are broader than specific services have not been very common, which may explain why they have not been extensively studied.
Why do plans and providers integrate? When the insurer and provider are both monopolists, each adds a monopoly mark-up to its per-unit costs. Consumers facing the double mark-up buy less output than when they face a single, integrated monopolist. The firms’ collective profits also are lower. Hence, both consumers and firms can be advantaged by integration (Carlton and Perloff 1990, pp. 526–527).
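The double mark-up logic can be illustrated with a small numeric example under textbook assumptions (linear demand and constant marginal cost); the parameter values below are hypothetical and chosen only to show the direction of the effects.

```python
# Successive monopolists vs. an integrated monopolist, with linear demand
# P = a - b*Q and upstream marginal cost c (all values hypothetical).
a, b, c = 100.0, 1.0, 20.0

# Two separate monopolists: the upstream firm sets a wholesale price w,
# and the downstream firm adds its own mark-up on top of w.
w = (a + c) / 2                       # upstream profit-maximizing wholesale price
q_sep = (a - w) / (2 * b)             # downstream output given marginal cost w
p_sep = a - b * q_sep                 # retail price with two mark-ups
profit_sep = (w - c) * q_sep + (p_sep - w) * q_sep

# Single integrated monopolist: one mark-up over cost c.
q_int = (a - c) / (2 * b)
p_int = a - b * q_int
profit_int = (p_int - c) * q_int

print(p_sep, q_sep, profit_sep)       # 80.0 20.0 1200.0
print(p_int, q_int, profit_int)       # 60.0 40.0 1600.0
# Integration lowers the price, raises output, and raises joint profit.
```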
When the hospital and insurer are not both monopolists, economic models can generate ambiguous predictions about the welfare effects of vertical integration. Carlton and Perloff (1990) showed that a monopoly hospital may integrate into a competitive insurance market to solve a problem that occurs in the production of insurance: the nonintegrated insurers tend to substitute away from hospital services and use more inputs that can be purchased on more competitive markets. By integrating, the hospital–insurance firm uses proportions of hospital services and other inputs in ways that simultaneously increase profits and may increase consumer welfare, if the gain in production efficiency is greater than the loss due to increased monopoly power.
Vertical restraints can improve welfare if they eliminate inefficient substitution or reduce prices (Baranes and Bardey 2004; Gaynor and Town 2012). However, there is much work on the anticompetitive effects of vertical restraints (Ma 1997; De Fontenay and Gans 2007; Bijlsma, Boone, and Zwart 2009; Douven et al. 2011; Halbersma and Katona 2011). When there are barriers to entry for providers, an exclusive contract between the insurer and provider may be anticompetitive if the provider has a large market share, offers the highest quality, or is the most cost effective in the market (Haas-Wilson 2003). In such a situation, actual and potential competitors are disadvantaged due to restricted access to one of the most favorable providers. In short, rivals’ costs are higher for equivalent quality and consumer desirability.
Although vertical restraints have not been examined in the MA market, MA plan competition has been found to reduce premiums and raise the generosity of benefits (Pizer and Frakt 2002). In addition, more competition has been shown to be associated with lower copayments for generic and brand name drugs and lower drug caps (Pizer, Frakt, and Feldman 2003). Our study relates to this prior literature in the sense that integration is a constraint on competition and, therefore, might be expected to have similar effects on consumer prices as found in previous work on MA competition.
Quality and Competition
There is some empirical evidence that consumers prefer higher quality plans, although the effects of quality seem to be limited to satisfaction-based quality measures, and not measures based on clinical processes. Scanlon et al. (2006, 2008) found that HMO competition (as measured by the Herfindahl–Hirschman Index [HHI]) from 1998 to 2002 was not significantly associated with Health Plan Employer Data and Information Set (HEDIS) measures of process quality, although Scanlon et al. (2006) found that competition was related to Consumer Assessment of Health Plans Survey (CAHPS) measures of satisfaction. Based on analysis of Harvard University's introduction of satisfaction and quality of care ratings in 1996, Beaulieu (2002) found that provision of plan quality information to eligible workers had a small but significant effect on plan choice. Employees were more likely to switch from plans with low quality ratings. In a laboratory setting, Spranca et al. (2000) found that subjects preferred hypothetical plans with favorable CAHPS ratings even when those plans covered fewer services. Farley et al. (2002) studied managed care plan choice among New Jersey Medicaid enrollees. Those reporting receipt of CAHPS-based ratings preferred more favorably rated plans.
Maeng et al. (2010) found lower levels and variance of quality ratings associated with greater degree of overlap of plans’ provider networks. Echoing Schoenbaum and Coltin (1998), they hypothesized that it is more difficult for plans with greater network overlap to reap the rewards of investing in quality initiatives for two reasons: (1) the investment would have to be larger to overcome competing incentives from other plans and (2) the benefits would not accrue exclusively to the investor. Consequently, according to these hypotheses, it is likely that plans with (closer to) exclusive networks would be more likely to invest in and benefit from quality initiatives. This suggests that plan–provider integration would be associated with higher quality. Our study tests this hypothesis.
Data and Methods
We constructed an analytic file for 2009 from publicly available plan- and county-level data with one exception noted below. We selected this year because it is prior to any possible anticipation by plans and providers of the changes made to MA plan payments and the health care landscape by the ACA, which was passed in 2010. In particular, the ACA encourages the formation of ACOs, froze plan payments in 2011, adjusts the payment formula in subsequent years to lower payments and to reduce the geographic variation in the difference between plan payments and Medicare fee for service (FFS) cost, and changes the plan payment formula to incorporate plan quality (aka, “stars”). Some of these changes may have been anticipated by plans as early as 2010. Integrated firms might have responded to quality bonuses more effectively than nonintegrated ones. Our aim was to assess integration without the complication of the bonus program.
To create the analytic file, we began with all CCPs in U.S. states and the District of Columbia. We then excluded nondrug MA plans because they are qualitatively different from plans that offer drug benefits (hereafter referred to as MA-PD plans) and because premium data do not allow us to credibly separate premiums into drug and nondrug components. Although CMS provides both drug and nondrug premiums for MA-PD plans, plans likely subsidize nondrug benefits with drug premium revenue. As such, neither the reported drug premium nor the nondrug premium is an unambiguous reflection of the true premium. We also excluded special needs plans because they serve different populations than other MA plans (typically, Medicaid–Medicare dual eligibles and institutionalized beneficiaries) and are, effectively, in a different market. The final analytic file contains drug-offering CCPs that are not special needs plans and that have complete data for all variables described below.
We control for market factors related to level of government payment to plans, costs, and demand in 2009. All such data originate from CMS's administrative files at the plan–county level unless otherwise indicated. From CMS's Medicare Advantage Rates & Statistics webpage2 we obtained the county-level benchmark payment rate; FFS cost data; and 2006 diagnosis-based risk scores. We aggregated 2007 MA enrollment data to the firm level to compute the historical HHI in each county.3 From a large insurer, we obtained county-level Medigap premiums for 2005 (our only source of nonpublic data).4 To all these data, we merged other county-level, historical cost and demand correlates from the Area Resource File (ARF).5 Plan service areas and MA product premiums are from the Drug and Health Plan Data and Plans Information by County. Contract-level star quality ratings are from the Plan Ratings Data.6
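As one concrete illustration of the market-structure measure, the following sketch computes a county-level, firm-based HHI from enrollment data; the county codes, firm labels, and enrollment figures are hypothetical.

```python
import pandas as pd

# Firm-level HHI by county: each firm's share of MA enrollment in the county,
# squared and summed. All identifiers and counts below are hypothetical.
enroll = pd.DataFrame({
    "county":     ["001", "001", "001", "002", "002"],
    "firm":       ["X", "Y", "Z", "X", "Y"],
    "enrollment": [4000, 3000, 3000, 6000, 4000],
})

firm_county = enroll.groupby(["county", "firm"])["enrollment"].sum()
shares = firm_county / firm_county.groupby(level="county").transform("sum")
hhi = (shares ** 2).groupby(level="county").sum()
print(hhi)   # county 001: 0.34; county 002: 0.52
```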
As Song, Landrum, and Chernew (2012) articulated, plans are offered across multicounty service areas. Therefore, it is appropriate to aggregate market factors across the counties in which plans operate. We aggregated all county-level market variables into plan-level variables through enrollment weighting. That is, the value of a market variable assigned to a plan is the enrollment-weighted value across all the counties in which that plan operates. Plan characteristics (e.g., premium, benefits, and star quality ratings) are constant across counties in the service area, so weighting was not required for these variables. Our final unit of analysis is the plan.
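The enrollment-weighted aggregation can be sketched as follows; the plan identifiers, county codes, and market-variable values are hypothetical.

```python
import pandas as pd

# Aggregate county-level market variables to the plan level by weighting each
# county's value by the plan's enrollment there (hypothetical data).
plan_county = pd.DataFrame({
    "plan_id":    ["A", "A", "B"],
    "county":     ["001", "002", "002"],
    "enrollment": [1000, 3000, 500],
    "benchmark":  [850.0, 900.0, 900.0],
    "ffs_cost":   [700.0, 760.0, 760.0],
})

market_vars = ["benchmark", "ffs_cost"]

def enrollment_weighted(group: pd.DataFrame) -> pd.Series:
    # Weight each county's value by the plan's enrollment in that county.
    w = group["enrollment"]
    return group[market_vars].multiply(w, axis=0).sum() / w.sum()

plan_level = plan_county.groupby("plan_id").apply(enrollment_weighted)
print(plan_level)
# Plan A's benchmark, for example, is (850*1000 + 900*3000) / 4000 = 887.5.
```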
The plan “star” quality data consist of quality ratings at various levels of aggregation, derived by CMS from four sources:
HEDIS, developed and maintained by the National Committee for Quality Assurance, measures health care process and intermediate outcome quality;
CAHPS, an initiative of the Agency for Healthcare Research and Quality, measures patients’ experiences or consumer satisfaction with their health plans (e.g., customer service and getting needed care quickly);
The Health Outcomes Survey, a CMS survey of self-reported outcomes;
Other CMS administrative sources (Jacobson et al. 2011; MedPAC 2012b).
To obtain the broadest possible measure of quality, we used the two CMS-provided summary scores reflecting prescription drug plan and health plan quality. These are on a five-point rating scale with 1 star the lowest and 5 stars the highest. We added these two measures together to construct a single, 2–10 stars summary quality score. We imputed missing summary quality scores from a model based on finer level star ratings provided in the Plan Ratings Data.
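A minimal sketch of the summary-score construction follows. The star values and component ratings are illustrative, and because the imputation is described above only as a model based on finer-level star ratings, the simple least-squares fit shown here is an assumption rather than the exact procedure.

```python
import numpy as np
import pandas as pd

# Construct the 2-10 star summary score and impute missing values from
# finer-level component ratings (illustrative data; the imputation model
# shown is an assumption).
plans = pd.DataFrame({
    "health_stars": [3.5, 4.0, np.nan, 3.0, 4.5],
    "drug_stars":   [3.0, 4.5, np.nan, 2.5, 4.0],
    "component_1":  [3.2, 4.1, 3.8, 2.9, 4.3],   # finer-level ratings
    "component_2":  [3.4, 4.4, 3.6, 2.7, 4.2],
})

plans["summary"] = plans["health_stars"] + plans["drug_stars"]   # 2-10 scale

obs = plans["summary"].notna()
X_obs = np.column_stack([np.ones(obs.sum()),
                         plans.loc[obs, ["component_1", "component_2"]]])
coefs, *_ = np.linalg.lstsq(X_obs, plans.loc[obs, "summary"], rcond=None)

X_mis = np.column_stack([np.ones((~obs).sum()),
                         plans.loc[~obs, ["component_1", "component_2"]]])
plans.loc[~obs, "summary"] = X_mis @ coefs
print(plans["summary"])
```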
By reviewing plans’ websites and governing documents, we determined which plan-offering firms had vertically integrated with a hospital or provider group. “Integration” means that the provider and plan are owned by the same firm. We coded plans associated with such firms as “integrated.” All other plans are “not integrated.” We randomly sampled about 30 percent of plans coded as integrated and verified from online sources that they were integrated prior to our study year. Table 1 provides summary statistics for our variables. The appendix provides additional detail on how we ascertained integration status and a table of means by integration status.
Table 1. Descriptive Statistics for Model Variables
Variable | Description | Mean (SD) | [Min, Max] | Source |
---|---|---|---|---|
Dependent variables | ||||
Premium | Monthly premium, 2009 (dollars) | 50.57 (67.14) | [0, 511] | CMS |
Quality | Star rating, reported in 2009 based on data from prior years | 6.39 (0.94) | [4.30, 8.99] | CMS |
Key independent variable | ||||
Integrated firm | Indicator of vertical integration, 2009 | 0.17 (0.37) | [0, 1] | Web |
Market structure | ||||
Prop. integrated | Proportion of other firms integrated, 2009 | 0.037 (0.054) | [0, 0.21] | Web |
HHI | MA firm Herfindahl–Hirschman index, 2007 | 0.33 (0.12) | [0.11, 0.79] | CMS |
MA enrollment | Thousands of enrollees in MA plans, 2009 | 24.34 (30.89) | [0.17, 126.13] | CMS |
Payment | ||||
Benchmark | Benchmark payment rate, 2009 (dollars) | 864.20 (99.28) | [740.82, 1,237.61] | CMS |
Cost | ||||
FFS cost | Monthly average FFS cost, 2009 (dollars) | 743.18 (120.56) | [540.74, 1,213.25] | CMS |
Prop. elderly 75+ | Proportion of elderly 75+ years old, 2000 | 0.47 (0.029) | [0.33, 0.55] | ARF |
Docs. per capita | General practitioners per capita, 2006 | 0.00028 (0.00010) | [0.000035, 0.00066] | ARF |
Beds per capita | Hospital beds per capita, 2005 | 0.0033 (0.0013) | [0.00076, 0.011] | ARF |
Rural county | Rural county, 2003 | 0.014 (0.060) | [0, 1] | ARF |
Urban county | Urban county, 2003 | 0.88 (0.23) | [0, 1] | ARF |
Rx Medigap prem. | Monthly drug Medigap premium, 2005 (dollars) | 250.14 (43.03) | [187.66, 465.50] | –* |
Non-Rx Medigap prem. | Monthly nondrug Medigap premium, 2005 (dollars) | 148.70 (23.47) | [102.95, 263.00] | –* |
Risk score | Aged/disabled risk score, 2006 | 1.02 (0.091) | [0.84, 1.34] | CMS |
Demand | ||||
Prop. elderly in poverty | Proportion elderly in poverty, 1999 | 0.094 (0.036) | [0.036, 0.24] | ARF |
Per capita income | Per capita income in thousands, 2005 | 33.93 (7.38) | [16.91, 93.37] | ARF |
Prop. HS diploma | Proportion of population age 25+ with a high school diploma, 2000 | 0.79 (0.056) | [0.52, 0.91] | ARF |
Prop. 4+ years col. | Proportion of population age 25+ with 4+ years college, 2000 | 0.23 (0.065) | [0.089, 0.49] | ARF |
Prop. Manufacturing | Proportion of workers in manufacturing, 2000 | 0.12 (0.053) | [0.015, 0.28] | ARF |
Prop. white collar | Proportion of white-collar workers, 2000 | 0.59 (0.058) | [0.43, 0.79] | ARF |
Prop. const. | Proportion of workers in construction, 2000 | 0.070 (0.015) | [0.017, 0.12] | ARF |
Note. Table based on study data. N = 910 plans. Some variables are aggregated by enrollment weighting across plan market areas, as described in the text.
*Provided by a large insurer.
ARF, Area Resource File; CMS, Centers for Medicare and Medicaid Services; Web, based on examination of plan websites.
Based on these data sources, in 2009 there were 1,047 nonspecial need, non-PFFS, drug-offering CCPs in the U.S. states and the District of Columbia. Of these, 910 had sufficient star quality data to be included in our models. The appendix includes a table that compares means for observations with and without star quality data. Although the means are statistically significantly different for many variables, this is not unexpected. There is a several-year lag between data collection and star rating reporting, and contracts that are too new can have missing star data. Also, contracts for plans with enrollment below 1,000 are not required to submit HEDIS data to CMS. Consequently, our results do not necessarily apply to the newest or smallest plans.
We estimated two ordinary least-squares models with firm random effects (because a given firm may offer multiple plans) and state fixed effects:
(1) Premium_i = α_1 + β_1 Quality_i + β_2 Integrated firm_i + β_3 Market structure_i + β_4 Benchmark_i + β_5 Cost_i + β_6 Demand_i + Σ_s γ_{1s} State_{is} + ε_{1i}

(2) Quality_i = α_2 + β_7 Integrated firm_i + β_8 Market structure_i + β_9 Benchmark_i + β_{10} Cost_i + β_{11} Demand_i + Σ_s γ_{2s} State_{is} + ε_{2i}
α, β, and γ are coefficients to be estimated, where γ is the coefficient on the state fixed effect (indexed by state s), and the error terms ε are assumed to be uncorrelated with each other and the independent variables. The value of a state fixed effect for a given plan is one if the plan offers services anywhere in that state and zero otherwise; therefore, multiple state fixed effects can be associated with a single plan. Market structure, cost, and demand are vectors; the independent variables that comprise them, as well as all others, are listed in Table 1. Premium is the beneficiary monthly out-of-pocket premium; quality is the 10-point star rating described above; integrated firm is an indicator that the firm with which the plan is associated is integrated with a provider; market structure includes proportion integrated (the unweighted proportion of MA plan-offering firms in the county, not including the plan in question, that are integrated with a provider), lagged HHI, and MA enrollment in all plans in the market; benchmark is the MA benchmark payment rate; cost includes variables that are correlated with MA plan cost (Frakt, Pizer, and Feldman 2012): average Medicare FFS cost, the proportion of elderly age 75 years or older, doctors and hospital beds per capita, urban/rural indicators, Medigap premiums, and a diagnosis-based risk score, which measures the average health status of the Medicare population in the market (Pope et al. 2004); demand includes per capita income, the proportion of the elderly in poverty, the proportions of the population with a high school diploma and with four or more years of college, and the proportions of workers in manufacturing, construction, or white-collar jobs. These labor force variables are significant predictors of Medicare plan entry (Cawley, Chernew, and McLaughlin 2005; Pizer, Feldman, and Frakt 2005).
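A sketch of how the premium model (Equation (1)) might be estimated appears below. The specification described above is ordinary least squares with firm random effects and state fixed effects; a linear mixed (random-intercept) model is one common way to implement firm random effects, and the simulated data, variable names, and estimator shown here are illustrative assumptions rather than the exact code used.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the analytic file; all variable names, values, and
# the firm identifier are hypothetical. State fixed effects are 0/1 indicator
# columns (several can equal 1 for a single multistate plan) and enter like
# any other regressor; firm random intercepts enter through `groups`.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "premium": rng.normal(50, 20, n),
    "quality": rng.normal(6.4, 0.9, n),
    "integrated_firm": rng.integers(0, 2, n),
    "hhi": rng.uniform(0.1, 0.8, n),
    "benchmark": rng.normal(864, 100, n),
    "state_ma": rng.integers(0, 2, n),     # example state indicator
    "firm_id": rng.integers(0, 25, n),     # firm offering the plan
})

premium_model = smf.mixedlm(
    "premium ~ quality + integrated_firm + hhi + benchmark + state_ma",
    data=df,
    groups=df["firm_id"],                  # firm random intercept
)
print(premium_model.fit().summary())

# The quality model (Equation (2)) uses the same right-hand side without
# `quality`, with the star rating as the dependent variable instead.
```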
Results
Descriptive Statistics
Table 1 reports descriptive statistics for variables in Equations (1) and (2), the premium and quality models, respectively. Some aspects of key variables of interest are worth highlighting. On average, monthly premiums were about $50; although not shown in the table, about half of plans were offered at zero premium. Some plans reported extremely high monthly premiums, with the maximum reported value at $511. We repeated the analysis described below after excluding plans with unusually high premiums and unusually low enrollment, and obtained similar results. Although our quality rating variable has a 10-point scale, the data only range from 4.3 to about 9, with an average of 6.4 stars.
On average, only 3.8 percent of other firms in a market were integrated. Because the proportion of other firms integrated is an enrollment-weighted average across counties—as are most of our variables, as described above—it understates the degree to which plans are integrated. If plans have disproportionately higher enrollment in counties with relatively fewer other plans integrated, it pulls the weighted average down. In fact, about 17 percent of plans were associated with an integrated firm. Integrated plans are found in 23 states; the remaining states and the District of Columbia have no integrated plans. Table 2 lists the states, organized by Census division, with any integrated plans, and reports the percent of plans operating in them that are integrated.
Table 2. Percentage of Plans Integrated, by State and Census Division
State | % Plans Integrated | State | % Plans Integrated |
---|---|---|---|
Northeast: New England | | Northeast: Middle Atlantic | |
Massachusetts | 57.1 | New Jersey | 15 |
 | | New York | 6.9 |
 | | Pennsylvania | 56.2 |
Midwest: East North Central | | Midwest: West North Central | |
Illinois | 6.3 | Minnesota | 57.1 |
Indiana | 35.7 | | |
Michigan | 9.1 | | |
Ohio | 30.3 | | |
Wisconsin | 53.8 | | |
South: South Atlantic | | South: East and West South Central | |
Georgia | 12.5 | Alabama | 14.3 |
 | | Louisiana | 10.5 |
 | | Oklahoma | 14.3 |
 | | Texas | 13.3 |
West: Mountain | | West: Pacific | |
Arizona | 10 | California | 20.2 |
Colorado | 22.7 | Hawaii | 50 |
New Mexico | 44.4 | Oregon | 22.9 |
Nevada | 21.4 | Washington | 37.5 |
Note. Based on study data. N = 1,093 plan–state pairs.
Multivariate Models
Table 3 reports the estimated coefficients from Equations (1) and (2). Considering the premium model first, we find that higher quality is associated with a higher premium: $9 per month for each additional quality star. Second, we find that plans offered by integrated firms charge $28 more per month, controlling for star quality rating.
Table 3. Estimated Coefficients from the Premium and Quality Models
Variable | Premium Model: Coefficient (SE) | Quality Model: Coefficient (SE) |
---|---|---|
Quality | 9.26 (2.90)*** | – |
Key independent variable | ||
Integrated firm | 28.48 (7.73)*** | 0.85 (0.15)*** |
Market structure | ||
Proportion integrated | −67.16 (67.06) | 0.042 (0.056) |
HHI | 44.25 (21.28)* | 0.0085 (0.016) |
MA enrollment | −0.11 (0.12) | 0.000088 (0.000088) |
Payment | ||
Benchmark | −0.044 (0.064) | −0.000040 (0.000051) |
Cost | ||
FFS cost | −0.00080 (0.061) | 0.000027 (0.000048) |
Prop. elderly 75+ | −30.11 (109.38) | 0.17 (0.079)* |
Docs. per capita | 77,617.1 (28,662.86)** | 34.24 (21.55) |
Hosp. beds per capita | −1,574.83 (2,132.77) | −0.48 (1.54) |
Rural county | 10.14 (42.52) | 0.084 (0.032)** |
Urban county | −7.84 (13.12) | 0.0092 (0.0092) |
Rx Medigap prem. | 0.92 (0.95) | 0.0025 (0.00086)** |
Non-Rx Medigap prem. | −1.89 (1.50) | −0.00396 (0.0013)** |
Risk score | 134.90 (76.25)* | −0.11 (0.056)* |
Demand | ||
Prop. eld. in poverty | 50.42 (163.24) | −0.083 (0.12) |
Per capita income | 0.24 (0.63) | −0.00023 (0.00046) |
Prop. HS diploma | 187.47 (111.05)* | −0.13 (0.082) |
Prop. 4+ years. col. | 164.17 (129.76) | −0.024 (0.095) |
Prop. manufacturing | −65.97 (73.64) | 0.025 (0.057) |
Prop. white collar | −221.17 (152.94) | −0.010 (0.11) |
Prop. construction | −198.58 (292.85) | −0.035 (0.22) |
Note. State fixed effects omitted. N = 910 plans.
Significant at the *.1 level, **.01 level, and ***.001 level.
Plans operating in markets that had been more concentrated (higher lagged HHI) charge higher premiums. The coefficient on the benchmark payment rate is not statistically significant, suggesting that plan premiums are not responsive to payment rates. This could occur in two ways. First, plans could adjust their bids in response to benchmarks, as Song, Landrum, and Chernew (2012) found they do, and this is why they inferred that the MA market was not competitive. Second, plans in areas with higher benchmarks could be offering enhanced benefits and not charging lower premiums. However, in a limited analysis of benefits (described below), we did not find a consistent, statistically significant, relationship between higher benchmarks and more generous benefits. (Results from our analysis of benefits are included in the appendix). Although our benefits analysis is not comprehensive, the results are consistent with the idea that the MA market was not competitive and that plans retain extra payments as profit. Plans in markets with higher lagged risk scores have higher premiums. This could reflect inadequacy of Medicare's risk adjustment mechanism. Alternatively, providers in markets with higher risk scores could have different practice styles—they could treat all patients more intensively, leading to higher costs for a given risk score.
Table 3 also reports coefficients for the quality model. Plans offered by integrated firms have 0.85 of a star of additional quality. No other coefficients on policy-relevant variables are statistically significant.
The analysis above investigates the relationships between integration, premiums, and quality, but integration could also be associated with an enhancement in benefits. To test this, we estimated models identical to Equation (1), but with various measures of benefit generosity as dependent variables: prescription drug deductible; average preferred brand name drug copayment; physician visit deductible; and physician visit copayment. In no case was integration associated with a statistically significant increase in benefit generosity (coefficient estimates in the appendix). We cannot be certain that integration is not associated with enhancement of other benefits that we did not investigate. A thorough examination of the relationship between integration and benefits was beyond the scope of our study. All we can conclude is that we did not observe any statistically significant and positive relationship between integration and benefit generosity.
Discussion
We examined the relations between plan–provider integration, premiums, and quality in the MA market. We found that integration is associated with an increase of $28 per month in premiums and a monetized increase of just under $8 per month in quality.7 Consequently, about 70 percent of the total premium difference between integrated and nonintegrated plans is not attributable to quality. Some or all of this premium increase could be associated with benefit enhancements by plans, but we did not observe a statistically significant increase in benefit generosity with integration among the subset of benefit variables we examined. An alternative possibility is that the higher premiums of integrated plans reflect greater market power commanded by those plans (whether arising from integration itself or from factors that also drive integration). Because we did not examine all possible benefits, we cannot completely distinguish between these two possibilities, and that is a natural focus for future investigation.
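The decomposition behind the "about 70 percent" figure can be reproduced directly from the coefficients reported in Table 3:

```python
# Share of the integration premium difference not attributable to quality,
# using the estimates reported in Table 3.
premium_diff = 28.48        # $/month, integrated-firm coefficient (premium model)
quality_diff = 0.85         # stars, integrated-firm coefficient (quality model)
premium_per_star = 9.26     # $/month per additional star (premium model)

monetized_quality = quality_diff * premium_per_star              # about $7.87/month
unexplained_share = (premium_diff - monetized_quality) / premium_diff

print(round(monetized_quality, 2), round(unexplained_share, 2))  # 7.87 0.72
```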
Our study has several limitations. First, and most important, we did not estimate causal relationships between integration and premiums and quality (or benefits). It is possible that integration causes increased premiums and quality, but it is also possible that causality runs the other way. Consider, for example, if nonintegrated insurers underprovide access to higher quality hospital services, perhaps because consumers do not value them or because of insurer monopoly pricing. In this case, a higher quality, monopoly hospital has an incentive to integrate to address underappreciation of quality and underutilization caused by monopoly pricing of hospital services. For these reasons, plan–provider integration may be more common among higher quality providers than lower quality ones. The only study we could find addressing the causal effect of quality on integration (Fernández-Olmos, Rosell-Martínez, and Espitia-Escuer 2009) found that wineries producing higher quality wines were more likely to vertically integrate grape growing with wine production than those producing lower quality wines. Consequently, our coefficient estimate of integration in the quality Equation (2) may be biased upward under a different causal interpretation. If so, integration would induce a smaller quality increase than suggested by our estimate. If one also interprets Equation (1) causally, the implication is that more than 70 percent of the premium increase caused by integration is not warranted by the quality increase alone. However, we stress that this interpretation requires assumptions we are not articulating or defending here.
A second limitation is that we examined only the Medicare market. Although integration is occurring outside that market, and perhaps to a greater extent, we urge caution in generalizing the findings. Third, our results do not reflect plans too new or under contracts too small (in number of enrollees) to have star quality ratings. No star rating data were available for such plans, so they were excluded from our sample. Fourth, to establish a baseline, we deliberately studied a period before the changes to the MA program and the health care landscape brought about by the ACA. Future work should examine the effects of the ACA on integration and hence on premiums and quality, contrasting them with our findings. Fifth, as shown in Table 2, many markets have no integrated plans. It is possible that such markets and the plans therein differ systematically from those with integrated plans. Our analysis assumes that our controls, which include state fixed effects, address all sources of that difference. Finally, as mentioned, a more thorough investigation of the relation between integration and benefits is warranted, but it is beyond the scope of this study.
Despite these limitations, our findings have some implications for policy makers. They demonstrate that plan–provider integration in the MA market is associated with substantially higher premiums. Policy makers considering promoting integration as a means to increase quality and reduce cost should be aware that recent experience does not support an expectation of lower cost. Experience under new initiatives, like accountable care organizations, may be different, especially if the new organizations are created to take advantage of shared savings opportunities. Nevertheless, our research suggests the potential for anticompetitive effects that may be challenging to manage through regulation.
Acknowledgments
Joint Acknowledgement/Disclosure Statement: This research was supported by a contract with the Attorney General of the Commonwealth of Pennsylvania. The views expressed in this article are those of the authors and do not necessarily reflect the positions of the Department of Veterans Affairs, Boston University, the University of Minnesota, the University of Pennsylvania, or the Commonwealth of Pennsylvania.
Disclosure: None.
Disclaimers: None.
Notes
Although vertical integration can occur through mergers between existing insurers and providers, it also can occur when a provider offers a new insurance product of its own.
Enrollment data are available at http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/MCRAdvPartDEnrolData/index.html.
Medigap premiums potentially reflect variation in plan cost not captured by other control variables.
The ARF is available from the U.S. Department of Health and Human Services: http://www.arf.hrsa.gov.
The Drug and Health Plan Data, Plans Information by County data, and Plans Rating Data are available at http://www.medicare.gov/download/downloaddb.asp.
Monetized quality associated with integration is computed as follows. The coefficient on integration in our quality equation is 0.85. Plans with one star's worth of additional quality charge a $9.26 higher premium, according to our estimated premium equation. The product, 0.85 × $9.26 = $7.87, is the monetized value of quality associated with integration.
Supporting Information
Additional supporting information may be found in the online version of this article:
Appendix SA1: Appendix to “Plan-Provider Integration, Premiums, and Quality in the Medicare Advantage Market,” by Austin B. Frakt, Steven D. Pizer, and Roger Feldman.
Appendix SA2: Author Matrix.
References
- Baranes E, Bardey D. Competition in Health Care Markets and Vertical Restraints. Cahiers de Recherche du LASER 013-03-04. Montpellier, France: Laboratoire de Sciences Économiques de Richter (LASER), Université de Montpellier; 2004.
- Beaulieu N. “Quality Information and Consumer Health Plan Choices”. Journal of Health Economics. 2002;21:43–63. doi: 10.1016/s0167-6296(01)00126-6.
- Bijlsma M, Boone J, Zwart G. Selective Contracting and Foreclosure in Health Care Markets. Tilburg, The Netherlands: Tilburg University; 2009.
- Capps C, Dranove D, Greenstein S, Satterthwaite M. Antitrust Policy and Hospital Mergers: Recommendations for a New Approach. 2002. [accessed June 6, 2013]. Available at http://heinonline.org/HOL/LandingPage?collection=journals&handle=hein.journals/antibull47&div=34&id=&page=
- Carlton DW, Perloff JM. Modern Industrial Organization. New York: HarperCollins Publishers; 1990.
- Cawley J, Chernew M, McLaughlin C. “HMO Participation in Medicare+Choice”. Journal of Economics and Management Strategy. 2005;14(3):543–74.
- Coakley M. Investigation of Health Care Cost Trends and Cost Drivers. Preliminary Report of the Massachusetts Office of Attorney General Pursuant to G.L. c. 118G, § 6½(b). January 29, 2010.
- De Fontenay C, Gans J. Bilateral Bargaining with Externalities. 2007. [accessed June 6, 2013]. Available at http://works.bepress.com/joshuagans/14.
- Douven R, Halbersma R, Katona K, Shestalova V. Vertical Integration and Exclusive Vertical Restraints between Insurers and Hospitals. 2011. [accessed June 6, 2013]. Available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1781685.
- Farley DO, Short PF, Elliot MN, Kanouse DE, Brown JA, Hays RD. “Effects of CAHPS® Health Plan Performance Information on Plan Choices by New Jersey Medicaid Beneficiaries”. Health Services Research. 2002;37(4):985–1007. doi: 10.1034/j.1600-0560.2002.62.x.
- Federal Trade Commission and Department of Justice. Statements of Antitrust Enforcement Policy in Health Care. 1996. [accessed June 3, 2013]. Available at http://www.ftc.gov/bc/healthcare/industryguide/policy/hlth3s.pdf.
- Fernández-Olmos M, Rosell-Martínez J, Espitia-Escuer M. “Vertical Integration in the Wine Industry: A Transaction Costs Analysis on the Rioja DOCa”. Agribusiness. 2009;25(2):231–50.
- Frakt AB, Mayes R. “Beyond Capitation: How New Payment Experiments Seek to Find the ‘Sweet Spot’ in Amount of Risk Providers and Payers Bear”. Health Affairs. 2012;31(9):1951–8. doi: 10.1377/hlthaff.2012.0344.
- Frakt AB, Pizer SD, Feldman R. “The Effects of Market Structure and Payment Rates on Private Medicare Health Plan Entry”. Inquiry. 2012;49(1):15–36. doi: 10.5034/inquiryjrnl_49.01.03.
- Fuchs V, Schaeffer L. “If Accountable Care Organizations Are the Answer, Who Should Create Them?”. Journal of the American Medical Association. 2012;307(21):2261–2. doi: 10.1001/jama.2012.5564.
- Gaynor M, Town R. “Competition in Health Care Markets”. In: Pauly M, McGuire T, Barros P, editors. Handbook of Health Economics, Vol. 2, Chapter 9. Oxford, UK: Elsevier; 2012. pp. 499–637.
- Haas-Wilson D. Managed Care and Monopoly Power: The Antitrust Challenge, Chapter 7. Cambridge, MA: Harvard University Press; 2003.
- Halbersma R, Katona K. Vertical Restraints in Health Care Markets. 2011. [accessed June 6, 2013]. Available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1748519.
- Jacobson G, Neuman T, Damico A, Huang J. Medicare Advantage Plan Star Ratings and Bonus Payments in 2012. Data Brief. Kaiser Family Foundation; 2011. [accessed June 3, 2013]. Available at http://kaiserfamilyfoundation.files.wordpress.com/2013/01/8257.pdf.
- Kowalczyk L, Weisman R. “Partners’ South Shore Hospital Bid Draws Scrutiny”. The Boston Globe. 2012. [accessed June 3, 2013]. Available at http://www.bostonglobe.com/lifestyle/health-wellness/2012/09/23/government-regulators-scrutinize-partners-bid-buy-south-shore-hospital/HLu6Mn8WP9oGOCkKsG8DxI/story.html.
- Ma C-tA. “Option Contracts and Vertical Foreclosure”. Journal of Economics and Management Strategy. 1997;6(4):725–53.
- Maeng D, Scanlon D, Chernew M, Gronniger T, Wodchis W, McLaughlin C. “The Relationship between Health Plan Performance Measures and Physician Network Overlap: Implications for Measuring Plan Quality”. Health Services Research. 2010;45(4):1005–23. doi: 10.1111/j.1475-6773.2010.01111.x.
- Medicare Payment Advisory Commission (MedPAC). Medicare Advantage Program Payment System. 2012a. [accessed March 4, 2012]. Available at http://www.medpac.gov/documents/MedPAC_Payment_Basics_12_MA.pdf.
- Medicare Payment Advisory Commission (MedPAC). Report to the Congress: Medicare Payment Policy, Chapter 12. Washington, DC: MedPAC; 2012b.
- Pearlstein S. “Aetna, Coventry and the Arms Race in Health Care”. The Washington Post. 2012. [accessed June 3, 2013]. Available at http://articles.washingtonpost.com/2012-09-01/business/35497264_1_coventry-health-care-small-group-market-health-insurance.
- Pizer SD, Feldman R, Frakt AB. “Defective Design: Regional Competition in Medicare”. Health Affairs. 2005. Web Exclusive w5-399-411, 23 August. doi: 10.1377/hlthaff.w5.399.
- Pizer SD, Frakt AB. “Payment Policy and Competition in the Medicare+Choice Program”. Health Care Financing Review. 2002;24(1):83–94.
- Pizer SD, Frakt AB, Feldman R. “Payment Policy and Inefficient Benefits in the Medicare+Choice Program”. International Journal of Health Care Finance and Economics. 2003;3(2):79–93. doi: 10.1023/a:1023373630383.
- Pope G, Kautter J, Ellis R, Ash A, Ayanian J, Iezzoni L, Ingber M, Levy J, Robst J. “Risk Adjustment of Medicare Capitation Payments Using the CMS-HCC Model”. Health Care Financing Review. 2004;25(4):119–41.
- Rabin C. “Some Hospital Networks Also Become Insurers”. The Washington Post. 2012. [accessed June 3, 2013]. Available at http://articles.washingtonpost.com/2012-08-25/business/35491800_1_private-insurers-insurance-product-hospital-systems.
- Scanlon D, Swaminathan S, Chernew M, Lee W. “Market and Plan Characteristics Related to HMO Quality and Improvement”. Medical Care Research and Review. 2006;63:56S–89S. doi: 10.1177/1077558706293835.
- Scanlon D, Swaminathan S, Lee W, Chernew M. “Does Competition Improve Health Care Quality?”. Health Services Research. 2008;43(6):1931–51. doi: 10.1111/j.1475-6773.2008.00899.x.
- Schoenbaum SC, Coltin KL. “Competition on Quality in Managed Care”. International Journal for Quality in Health Care. 1998;10(5):421–6. doi: 10.1093/intqhc/10.5.421.
- Song Z, Landrum MB, Chernew ME. “Competitive Bidding in Medicare: Who Benefits from Competition?”. American Journal of Managed Care. 2012;18(9):546–52.
- Spranca M, Kanouse DE, Elliot M, Short PF, Farley DO, Hays RD. “Do Consumer Reports of Health Plan Quality Affect Health Plan Selection?”. Health Services Research. 2000;35:933–47.