Health Care Financing Review. 2005 Spring;26(3):105–123.

M+C Plan County Exit Decisions 1999-2001: Implications for Payment Policy

Rachel Halpern
PMCID: PMC4194933  PMID: 17290631

Abstract

The primary legislative response to diminishing private plan participation in the Medicare+Choice (M+C) program since 1999 has been substantial payment increases. Analysis of M+C plans' decisions to continue serving or to drop counties for 1999-2000 and 2000-2001 reveals that payment amounts, although important, did not have a consistent impact on these decisions. Plan decisions varied depending on the year and on whether the plan intended to continue participating in M+C at all. Simulations show that M+C plans were better off, on average, under the payment methodology imposed by the Balanced Budget Act (BBA) of 1997 than under the previous payment system and that large payment increases would increase plan retention.

Introduction

Private managed health care plans have participated in Medicare since 1982, first in the Medicare risk program established by the Tax Equity and Fiscal Responsibility Act of 1982 and subsequently in M+C, established by the 1997 BBA. M+C was renamed Medicare Advantage in the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA). Congress consistently expresses two primary objectives for private plan participation: to promote competition for Medicare enrollees among health plans, resulting in lower expenditures,1 and to provide beneficiaries with a choice of health care delivery options and benefit packages.

The legislative response to several years of M+C plan exodus has been substantial payment increases and regulatory compromises contained in the Balanced Budget Refinement Act of 1999 (BBRA), the Benefits Improvement and Protection Act of 2000 (BIPA), and the MMA. A more thorough understanding of M+C plan decisions, particularly those made before payments increased dramatically in 2001, becomes increasingly important as calls to modify the MMA arise.

Background

Until 1998, CMS paid risk plans 95 percent of the adjusted average per capita cost (AAPCC), a county-level estimate of expenditures CMS would have incurred under fee-for-service (FFS) reimbursement. The 1997 BBA modified the payment mechanism for M+C plans in an attempt to promote payment equity among U.S. counties while containing the growth of health care costs. M+C plans were paid the higher of two rates in 1998 and 1999: a minimum (floor) amount or a minimum 2-percent increase over the previous year's payment. In 2000, CMS added a third option, a blended rate of local and national average FFS spending (U.S. General Accounting Office, 1999).

The number of plans grew from 32 in 1985 to a peak of 346 in 1998 (U.S. General Accounting Office, 1999) before taking a precipitous turn. Forty-five M+C plans terminated their contracts and 54 plans reduced their service areas for the 1999 contract year. Forty-one plans terminated contracts and another 58 reduced service areas for 2000. Sixty-five plans terminated contracts for 2001 while another 53 reduced service areas. By 2002, 148 M+C plans remained (Medicare Payment Advisory Commission, 2002).2

M+C plan contract terminations and service area reductions between 1999 and 2001 received considerable attention in the media and from Congress. Much of the debate over the withdrawals is summarized in two basic arguments. Health plans argued that payment rates and increases under the 1997 BBA were too low, creating payment inequity, or a “fairness gap,” between M+C plans and FFS Medicare (Ignagni, 1999). The Federal Government, represented primarily by CMS, the General Accounting Office, and the Department of Health and Human Services, argued that payments to plans were adequate; the spate of exits was attributable more to competition than to smaller payment hikes.

Factors Affecting M+C Plan Decisions

The M+C literature has focused on a consistent set of factors that appear to guide plan decisions: government payment, costs, enrollment, competition, and organizational characteristics.

Descriptive and multivariate analyses show that payment levels and increases are negatively associated with leaving a county (Lake and Brown, 2002; Glavin et al., 2002/2003; Merrill, 2001; U.S. General Accounting Office, 1999; Kornfield and Gold, 1999). Analysis has established a consistent positive association between costs and county exit (Lake and Brown, 2002; Glavin et al., 2002/2003; Stuber et al., 2002; Stuber, Dallek, and Biles, 2001). Costs are affected most by utilization (Call et al., 1999; 2001; Brown et al., 1993) and plans' bargaining power with providers (Grossman, Strunk, and Hurley, 2002; Stuber et al., 2002; Benko, 2000; U.S. General Accounting Office, 1999).

A negative relationship between enrollment and county exit is well documented (Lake and Brown, 2002; Glavin et al., 2002/2003; U.S. General Accounting Office, 1999), as is a positive relationship between the number of competing M+C plans in a county and the likelihood of leaving it (Lake and Brown, 2002; Kornfield and Gold, 1999). Research also uncovered a competitive relationship between M+C plans and supplemental Medicare insurance, or Medigap, plans (McLaughlin, Chernew, and Taylor, 2002; Abraham et al., 2000).

Several health plan characteristics are associated with strategic decisions. For-profit health plans tend to be more responsive to market and payment changes than are non-profit plans (Srinivasan, Levitt, and Lundy, 1998; Feldman, Wholey, and Christianson 1996). Relatively young health plans are more likely to exit their markets than are older plans (Glavin et al., 2002/2003; U.S. General Accounting Office, 1999; Feldman, Wholey, and Christianson, 1996), although this trend has weakened, probably due to previous withdrawals by younger M+C plans. The impact of affiliation with national health plans is ambiguous. National affiliation can indicate access to capital (Feldman, Wholey, and Christianson, 1996; Wholey, Christianson, and Sanchez, 1992) and a lower probability of dropping counties, although recent analyses have found nationally-affiliated M+C plans more likely to leave counties (Stuber et al., 2002; Merrill, 2001; Stuber, Dallek, and Biles, 2001; Scully and van der Walde, 2001).

Theory

Economic theory of profit maximization was used to develop hypotheses about M+C plans' decisions from 1999 to 2001.3 The following were derived from this theoretical foundation:

  • Hypothesis 1—An M+C plan will leave a county if average costs exceed average revenues at all levels of enrollment.

  • Hypothesis 2—An M+C plan will drop a county if enrollment in that county is insufficient to achieve economies of scale.

  • Hypothesis 3—An M+C plan will drop an unprofitable county within a multiple-county service area if the loss in the unprofitable county is greater than the loss in economies of scope across counties.4

Data

Data were observed and analyzed at the M+C plan/county level. The distribution of M+C plan/county observations is shown in Table 1. Analysis was limited to decisions to keep or drop counties. Observations were limited to plans classified as M+C HMO, with or without a point-of-service option, which are the subject of most research and policy debate. The initial data set comprised 2,287 observations for 1999-2000 decisions and 1,957 observations for 2000-2001 decisions. The data include 275 plans for 1999, 238 plans for 2000, and 175 plans for 2001. The number of M+C plans in these data differs from other reported figures for two reasons. First, criteria to identify M+C plans likely vary, and plans are not always categorized consistently among the various CMS reports. Second, between 1999 and 2001, M+C plans consolidated multiple contracts within a State; covered counties were rolled into the remaining contract. Contract consolidations were incorporated to reflect 2001 contracts. Failure to do so would have resulted in reporting more contract terminations and service area reductions than actually occurred. Table 2 displays variables and data sources.

Table 1. Medicare+Choice (M+C) Service Area Decisions: 1999-2000 and 2000-2001.

M+C Plan/County Observations             1999-2000 (N=2,287)           2000-2001 (N=1,957)
                                         Number   Percent of Total     Number   Percent of Total
County Stay/Drop Decisions
Retain County                            1,876    82.0                 1,214    62.0
Drop County; Continue M+C Contract         189     8.3                   248    12.7
Drop County; Terminate M+C Contract        222     9.7                   495    25.3

SOURCE: Halpern, R., University of Minnesota, 2005.

Table 2. Variables Used in County Stay/Drop Specifications: 1999-2000 and 2000-2001.

Variable Explanation Expected Sign of Coefficient Data Source
Dependent
Drop County Binary, discrete variable with value = 0 if M+C plan retains county in service area the following year; = 1 if M+C plan drops county CMS: Geographic Service Area Report, MedicareCompare Database
Explanatory
Revenue
Payment County-level government payment amount for next contract year
(e.g., 2000 payment for 1999-2000 decision)
Negative CMS: Market Penetration Quarterly State/County/Plan Report
2000 Floor County Indicator variable for floor-payment county in 2000
(1 = floor county; 0 else. For 1999-2000 decisions)
Unclear CMS: Market Penetration Quarterly State/County/Plan Report
2001 $525 Floor County Indicator variable for floor-payment county in MSA with population > 250,000 in 2001
(1 = floor county; 0 else. For 2000-2001 decisions)
Unclear CMS: Market Penetration Quarterly State/County/Plan Report
Cost
Specialist Density County-level number of office-based specialty physicians per 10,000 population Negative Health Resources and Services Administration: Area Resource File
Registered Nurse Wage MSA- or State-level registered nurse wage Positive Bureau of Labor Statistics: Occupational Employment Statistics Survey
Economies of Scale
M+C County Enrollment County-level M+C plan enrollment Negative CMS: Market Penetration Quarterly State/County/Plan Report
Economies of Scope
Commercial Enrollment Commercial enrollment of the health maintenance organization (HMO) that offers the M+C plan Negative InterStudy: HMO Directory
Medicaid Enrollment Medicaid enrollment of the HMO that offers the M+C plan Positive InterStudy: HMO Directory
Service Area Size Number of counties in M+C plan's service area Positive CMS: Geographic Service Area Report, MedicareCompare database
Demand
Elderly Population Number of persons age 65 or over in county Negative HRSA: Area Resource File
Per Capita Income County-level per capita income Negative HRSA: Area Resource File
Competition/Market Structure
Variation in Plan Size Variation in size (enrollment) of M+C plans in county Positive CMS: Market Penetration Quarterly State/County/Plan Report
Monopoly County Indicator variable for county served by one M+C plan
(1 = M+C plan is monopoly; 0 = else)
Negative CMS: Geographic Service Area Report, MedicareCompare Database
Duopoly County Indicator variable for county served by two M+C plans
(1=2 plans in county; 0 = else)
Negative CMS: Geographic Service Area Report, MedicareCompare Database
Medigap Premium County- or State-level price for Medigap Plan C premium Negative Anonymous Insurer
Plan Characteristics
M+C Plan Age Number of years plan has operated Negative CMS: Medicare Managed Care Monthly Report
Aetna Indicator variable (= 1; 0 else) Unclear InterStudy: HMO Directory
Cigna Indicator variable (= 1; 0 else) Unclear InterStudy: HMO Directory
Kaiser Indicator variable (= 1; 0 else) Unclear InterStudy: HMO Directory
PacifiCare Indicator variable (=1; 0 else) Unclear InterStudy: HMO Directory
UnitedHealthcare Indicator variable (=1; 0 else) Unclear InterStudy: HMO Directory
BlueCross®BlueShield® Indicator variable (=1; 0 else) Unclear InterStudy: HMO Directory
Other National Firm Indicator variable for affiliation with other national firms with too few observations to be included individually (=1; 0 else) Unclear InterStudy: HMO Directory
Firm Change Indicator variable for change in ownership of firm between 1999 and 2000 (=1;0 else) Unclear InterStudy: HMO Directory

NOTES: M+C is Medicare+Choice program. CMS is Centers for Medicare & Medicaid Services. MSA is metropolitan statistical area. HRSA is Health Resources and Services Administration.

SOURCE: Halpern, R., University of Minnesota, 2005.

Payment is the exogenous monthly payment to M+C plans for a standard enrollee for the following contract year. Payment amounts for the next year are published several months before plans submit their contracts to CMS. A negative payment coefficient was expected, indicating that plans were likely to drop counties when payment was low. Floor counties represent not only low payment, but may also represent relatively large payment increases; the payment increase to qualify a plan for a floor-payment amount was the largest of the payment increase options.5 The expected sign of this coefficient, therefore, was unclear.

Registered nurse (RN) wages measured input costs. A positive coefficient for RN wages was expected because plans would be likely to drop counties with high costs. Specialist density measured M+C plans' bargaining power with physicians. Specialist physician density was used to reflect the increasing need for specialty care among the elderly population. A negative coefficient was expected for this variable: high density implied increased competition for inclusion in a plan's network, providing plans leverage in obtaining price concessions and lower costs.

M+C county enrollment measured economies of scale. A negative coefficient was expected. Plans were likely to drop counties where their M+C enrollment was low and did not result in sufficiently decreasing average costs per additional enrollee.

Service area size measured economies of scope. The expected sign was positive, signifying that losses in economies of scope from leaving a county would be smaller in large service areas than in small ones. Commercial products have been shown to share economies of scope with Medicare products, whereas Medicaid products have displayed diseconomies of scope (Wholey, Feldman, and Christianson, 1996; Given, 1996). Therefore, plans were expected to drop counties if commercial enrollment was small (negative coefficient) or if Medicaid enrollment was large (positive coefficient).

Elderly population is a measure of demand (or market size) in M+C research,6 as are income levels, such as per capita income (Pizer and Frakt, 2002; Cawley, Chernew, and McLaughlin, 2000). In accordance with the literature, plans were expected to drop counties with small elderly populations and low income levels, resulting in negative coefficients for both variables.

Variation in M+C plan size is a measure of market structure. Plans in counties with high variation in plan enrollments experience low price elasticity of demand compared with those in counties with low variation.7 Hence, it is comparatively difficult for plans in high-variation counties to attract enrollees away from competing firms. Therefore, plans were expected to drop counties with high variation in plan size, signified with a positive coefficient. Monopoly and duopoly counties8 (excluded category: counties with three or more M+C plans) measured the number of competing plans. Negative coefficients for both variables were expected: plans were less likely to drop counties where there were fewer competitors. Medigap (Plan C) premium9 was a measure of competition with an expected negative coefficient suggesting that plans would drop counties where Medigap prices were low and attractive to beneficiaries.

M+C plan age measured experience with Medicare managed care products. Older plans were expected to be less likely to leave counties than were younger plans based on experience and name recognition in the community. The individual national firm indicators (e.g., Aetna), as well as firm change, measured the impact of unobserved, firm-specific strategies on decisions. There were no predetermined expectations regarding the signs of these coefficients, except that for-profit firms (Aetna, Cigna, PacifiCare, UnitedHealthcare, most BlueCross®/BlueShield® plans, and most other nationally affiliated plans) would be more likely to drop counties, characterized by positive coefficients, than the non-profit plan (Kaiser). Firm indicators were created for firms that represented at least 5 percent of the observations; non-affiliated (local) plans was the excluded category.

Data Concerns

One-year lags were used for all enrollment measures and for variation in plan size to address a potential source of endogeneity and resulting simultaneity bias; lagged enrollments are exogenous. Service area size raised concerns about potential endogeneity and biased coefficients attributable to Granger causality, which arises when lagged values of a dependent variable affect the coefficient of a contemporary explanatory variable. Granger causality could be a threat if service area decisions of competing plans in a county in the previous year affected the M+C market environment and, consequently, a plan's decisions about its service area. The correlation coefficient for 1998 and 1999 service area sizes was 0.97; the 1999-2000 correlation was 0.95. It is, therefore, unlikely that concurrent service area size posed a threat of Granger causality. Lagged service area size was tested and it yielded the same results as concurrent service area size, which was the measure used.

Finally, condition numbers and variance inflation factors (VIFs) were used to diagnose multicollinearity, with inconsistent results. In general, condition numbers greater than 30 and VIFs greater than 10 indicate potentially harmful multicollinearity. Condition numbers were quite high, in the thousands. In contrast, most VIFs were below 2 and all VIFs were well below 3. Certainly, many explanatory variables were collinear, particularly with payment. Nevertheless, the models estimated were stable enough to allay concerns about inaccurate estimates.10 Explanatory variables were scaled to similar magnitudes to minimize multicollinearity.
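
As a concrete illustration of these diagnostics, the sketch below computes variance inflation factors and a condition number in Python with statsmodels; the DataFrame and its values are hypothetical stand-ins for the scaled explanatory variables, not the study data.

```python
# Minimal sketch of the multicollinearity diagnostics discussed above.
# The variable names and values are hypothetical, not the study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = pd.DataFrame({
    "payment":       [4.80, 5.25, 6.10, 4.95, 5.60, 7.00],   # scaled to $100s
    "rn_wage":       [21.5, 22.0, 24.1, 21.8, 23.0, 25.2],
    "mc_enrollment": [0.65, 2.40, 8.80, 0.30, 1.20, 15.0],   # scaled to 1,000s
})
Xc = sm.add_constant(X)

# Variance inflation factors (rule of thumb: VIF > 10 is potentially harmful).
vifs = {col: variance_inflation_factor(Xc.values, i)
        for i, col in enumerate(Xc.columns) if col != "const"}

# Condition number of the design matrix (rule of thumb: > 30 is potentially harmful).
condition_number = np.linalg.cond(Xc.values)

print(vifs)
print(condition_number)
```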

Observations with missing M+C county enrollments were excluded. The excluded observations had missing enrollments in the first, second, and third quarters of the CMS Market Penetration Quarterly State/County/Plan Report. Consequently, observations with M+C plan age = 1 were excluded; they constituted 81 percent of excluded 1999 observations and 36 percent of excluded 2000 observations. Among excluded observations with M+C plan age greater than 1, there were no patterns in missing enrollment by plan, geography, or national affiliation. Excluded observations accounted for 182 (8 percent) of 1999-2000 observations and 147 (7.5 percent) of 2000-2001 observations. In addition, one county reported in the 1999 Geographic Service Area Report was not in the Area Resource File (ARF), leading to one more discarded observation. The final data set included 2,104 observations for 1999-2000 and 1,810 observations for 2000-2001.

Methods

Logit was used to model decisions to drop counties.11 Robust standard errors were estimated using Intercooled Stata® 7.0 robust and cluster commands. The robust command calculates a Huber-White estimate of the variance, which relaxes the assumption of identically distributed error terms. The cluster option relaxes the assumption of independence by allowing the user to identify groups of observations within which observations are expected to have correlated errors (StataCorp®, 2001). Observations were clustered by M+C plan identifier because decisions about counties within a plan's service area are expected not to be independent.
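
The sketch below illustrates this estimation strategy in Python with statsmodels rather than Stata. The data are simulated and the variable names (drop_county, payment, mc_enrollment, plan_id) are hypothetical, but the clustering on the plan identifier mirrors the approach described above.

```python
# Sketch of the estimation approach described above: a logit for the decision
# to drop a county, with Huber-White standard errors clustered on the M+C
# plan identifier.  Data are simulated; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "plan_id":       rng.integers(0, 60, n),        # M+C plan (cluster) identifier
    "payment":       rng.normal(5.2, 0.8, n),       # payment, scaled to $100s
    "mc_enrollment": rng.normal(2.5, 1.5, n),       # county enrollment, 1,000s
})
xb = 1.5 - 0.35 * df["payment"] - 0.25 * df["mc_enrollment"]
df["drop_county"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-xb)))

X = sm.add_constant(df[["payment", "mc_enrollment"]])
# cov_type="cluster" relaxes the independence assumption within plans,
# analogous to the Stata robust/cluster options used in the article.
result = sm.Logit(df["drop_county"], X).fit(
    cov_type="cluster", cov_kwds={"groups": df["plan_id"]}
)
print(result.summary())
```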

County stay/drop decisions were modeled for four groups: (1) all plans in M+C in 1999 (full sample); (2) only 1999 plans that continued their contracts in 2000 (contract-continuation sample, N = 1,910); (3) 2000-2001 full sample; and (4) 2000-2001 contract-continuation sample (N = 1,341). The contract-continuation samples were used in addition to the full samples to explore possible differences in determinants of service area reduction decisions compared with those for contract termination decisions.

The Wald test was used for joint significance among the categories of variables listed in Table 2. (Stata® recommends Wald tests rather than likelihood ratio tests when the cluster command is invoked.) The Hosmer-Lemeshow statistic was used to assess model fit. A second goodness-of-fit statistic equal to the sum of the proportions of correctly predicted stay and drop decisions was also used. A sum less than 1 indicates poor model fit (Kennedy, 1998).
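
To make the second goodness-of-fit statistic concrete, the sketch below computes it for the hypothetical logit fitted in the previous sketch; the 0.5 classification cutoff is an assumption, since the article does not state the cutoff used.

```python
# Sum of the proportions of correctly predicted stay and drop decisions,
# computed for the hypothetical fit above (`result`, `X`, `df`).
# The 0.5 cutoff for classifying a predicted drop is an assumption.
import numpy as np

p_hat = result.predict(X)                         # predicted Pr[drop county]
predicted_drop = (p_hat >= 0.5).to_numpy()
actual_drop = df["drop_county"].to_numpy().astype(bool)

prop_drop_correct = np.mean(predicted_drop[actual_drop])     # share of drops predicted
prop_stay_correct = np.mean(~predicted_drop[~actual_drop])   # share of stays predicted
goodness_of_fit = prop_drop_correct + prop_stay_correct      # < 1 indicates poor fit
print(goodness_of_fit)
```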

Payment policy simulations were conducted using logit coefficients (shown in Tables 4 and 6). Each explanatory variable's coefficient, except for payment, was multiplied by the value of that variable for each observation. The coefficient for payment was multiplied by a counterfactual payment amount. Finally, the equation for estimating probabilities from a logistic distribution, exp(xβ)/(1 + exp(xβ)), was used to calculate how the probability of dropping a county would change given the counterfactual payment.
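
The sketch below illustrates the simulation mechanics for one counterfactual (payments $100 higher, as in Simulation 2), again using the hypothetical fit from the Methods sketch; with payment scaled in $100s, the counterfactual simply adds 1 to the payment column.

```python
# Counterfactual payment simulation, following the procedure described above:
# coefficients are applied to observed values except payment, which takes a
# counterfactual amount; probabilities come from exp(xb)/(1 + exp(xb)).
# Uses the hypothetical fit from the Methods sketch (`result`, `X`).
import numpy as np

X_cf = X.copy()
X_cf["payment"] = X_cf["payment"] + 1.0            # counterfactual: payment + $100

xb_cf = X_cf.to_numpy() @ result.params.to_numpy()
p_cf = np.exp(xb_cf) / (1.0 + np.exp(xb_cf))       # counterfactual Pr[drop county]

print(result.predict(X).mean())                    # baseline average probability
print(p_cf.mean())                                 # simulated average probability
```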

Table 4. Results of County Stay/Drop Decision Models: 1999-2000.

Variable                         Full Sample                          Contract-Continuation Sample
                                 Prob[Drop County] = 0.17             Prob[Drop County] = 0.09
                                 Coefficient    Marginal Effect       Coefficient    Marginal Effect
Revenue
2000 Payment -0.352 * -0.030 ($100) -0.641 *** -0.022 ($100)
2000 Floor -0.108 ** -0.063 -0.144** -0.028
Cost
Specialist Density -0.002 -0.001 (10) 0.002 0.001 (10)
Registered Nurse Wage 0.042 0.004 0.111 0.004
Economies of Scale
M+C County Enrollment -0.259 *** -0.022 (1,000) -0.308*** -0.010 (1,000)
Economies of Scope
Commercial Enrollment -0.052 -0.004 (100,000) -0.085 -0.003 (100,000)
Medicaid Enrollment -0.003 -0.000 (10,000) 0.025 0.001 (10,000)
Service Area Size 0.009 0.001 0.059** 0.002
Demand
Elderly Population 0.011 0.001 (10,000) 0.006 0.000 (10,000)
Per Capita Income -0.062 -0.005 ($10,000) -0.092 -0.003 ($10,000)
Competition/Market Structure
Variation in Plan Size 0.129 ** 0.110 0.197 *** 0.066
Monopoly County 0.088 ** 0.096 0.099 ** 0.046
Duopoly County -0.000 -0.000 0.032 0.012
Medigap Premium 0.064 0.006 ($10) 0.187 0.006 ($10)
Plan Characteristics
M+C Plan Age 0.017 0.001 0.122 ** 0.004
Aetna -0.170 ** -0.090 -0.179 -0.036
Cigna 0.118 ** 0.151 0.137 * 0.084
Kaiser 0.016 0.015 -0.038 -0.011
PacifiCare -0.075 -0.049 -0.077 -0.019
UnitedHealthcare 0.154 ** 0.222 NA NA
BlueCross®BlueShield® 0.020 0.018 0.051 0.020
Other National Affiliation 0.014 0.012 0.033 0.012
Firm Change -0.286 * -0.106 -0.269 -0.039
Constant -1.581 NA -5.401 NA
Pseudo R2 0.154 NA 0.202 NA
*** p < 0.01 (highly significant).
** 0.01 < p < 0.05 (significant).
* 0.05 < p < 0.10 (weakly significant).

NOTES: All standard error terms are robust standard errors. UnitedHealthcare was not an explanatory variable in the contract-continuation sample because fewer than 5 percent of the observations were affiliated with it. NA is not applicable.

SOURCE: Data came from several reports on the CMS Web site; the Area Resource File (ARF); the Occupational Employment Statistics Survey; InterStudy HMO Directory; and a large Medigap insurer that provided premium data on condition of anonymity.

Table 6. Results of County Stay/Drop Decision Models: 2000-2001.

Variable                         Full Sample                                       Contract-Continuation Sample
                                 Prob[Drop County] = 0.38                          Prob[Drop County] = 0.17
                                 Coefficient (Standard Error)   Marginal Effect    Coefficient (Standard Error)   Marginal Effect
Revenue
2001 Payment -0.237 -0.048 ($100) -0.755 ** -0.059 ($100)
2001 $525 Floor County -0.007 -0.014 -0.056 ** -0.040
Cost
Specialist Density 0.004 0.009 (10) 0.024 0.002
Registered Nurse Wage 0.092 * 0.019 0.149 ** 0.012
Economies of Scale
M+C County Enrollment -0.113 *** -0.023 (1,000) -0.194 *** -0.015 (1,000)
Economies of Scope
Commercial Enrollment -0.368 *** -0.074 (100,000) -0.100 -0.008 (100,000)
Medicaid Enrollment 0.040 0.008 (10,000) -0.011 -0.001 (10,000)
Service Area Size 0.033 * 0.007 0.054 *** 0.004
Demand
Elderly Population -0.011 -0.002 (10,000) -0.011 -0.001 (10,000)
Per Capita Income 0.077 0.015 ($10,000) -0.006 -0.000 ($10,000)
Competition/Market Structure
Variation in Plan Size 0.057 0.115 0.073 0.057
Monopoly County 0.002 0.003 0.026 0.022
Duopoly County -0.047 ** -0.087 -0.013 -0.009
Medigap Premium 0.087 0.017 ($10) 0.175 ** 0.014 ($10)
Plan Characteristics
M+C Plan Age 0.030 0.006 0.112 *** 0.009
Aetna 0.260 *** 0.571 0.041 0.037
Cigna 0.367 *** 0.685 NA NA
Kaiser -0.314 *** -0.297 -0.307 *** -0.100
PacifiCare -0.007 -0.015 -0.128 ** -0.066
BlueCross®BlueShield® 0.076 0.168 -0.079 -0.049
Other National Affiliation 0.012 0.025 0.044 0.037
Constant -2.6 NA -3.875 NA
Pseudo R2 0.248 NA 0.207 NA
*** p < 0.01 (highly significant).
** 0.01 < p < 0.05 (significant).
* 0.05 < p < 0.10 (weakly significant).

NOTES: All standard error terms are robust standard errors. Cigna was not an explanatory variable in the contract-continuation sample because fewer than 5 percent of the observations were affiliated with it. NA is not applicable.

SOURCE: Halpern, R., University of Minnesota, 2005.

Three simulations were conducted. Simulation 1 provides probability estimates of dropping a county if M+C payments from 1997 (pre-BBA) had changed at the same rate as national FFS expenditures. Simulation 2 provides probability estimates if monthly, per capita M+C payments had been $100 higher. While Simulations 1 and 2 proceed from payment to county decisions, Simulation 3 proceeds from decisions to payment. The third simulation provides estimates of payments that would have reduced the probability of dropping a county to 10 percent for the full 1999-2000 sample and the 2000-2001 contract-continuation sample, and to 5 percent for the 1999-2000 contract-continuation sample (for which the estimated probability of dropping a county was 0.09). For Simulation 3, the average simulated payment across all observations was reported as well as payments only for observations with predicted probabilities greater than 0.17 for the full 1999-2000 and 2000-2001 contract-continuation samples, and greater than 0.09 for the 1999-2000 contract-continuation sample. One would expect that payment was adequate if a plan continued to serve a county. The latter subgroups of observations are of particular policy interest because their individual probabilities of dropping a county were greater than the average probability across all observations; they were the ones that caused loss of access to M+C plans for many beneficiaries and concerns about program stability.
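
Simulation 3 inverts the fitted model: holding everything except payment at its observed value, one can solve the logit index for the payment that yields a target probability. The sketch below shows one way to implement that inversion under the same hypothetical fit used earlier (with a 0.10 target); it is not necessarily the author's exact procedure.

```python
# Invert the logit to find the payment that would bring each observation's
# predicted probability of dropping a county down to a target (here 0.10),
# as in Simulation 3.  Hypothetical fit (`result`, `X`); payment in $100s.
import numpy as np

target = 0.10
logit_target = np.log(target / (1.0 - target))     # log-odds of the target

b = result.params
# Linear index contribution of everything except payment.
xb_other = X.drop(columns="payment").to_numpy() @ b.drop("payment").to_numpy()

payment_needed = (logit_target - xb_other) / b["payment"]   # in $100s
print(100 * payment_needed.mean())                          # average simulated payment, $
```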

Results

1999-2000 Decisions

Descriptive statistics (unscaled) for the full 1999-2000 sample are shown in Table 3. There are significant differences between the averages of many explanatory variables, classified by county stay and drop decisions, although not all of these differences resulted in significant logit coefficients. The results of the 1999-2000 county decision models are shown in Table 4. The average probability of dropping a county was 0.17 for the full sample and 0.09 for the contract-continuation sample.

Table 3. Descriptive Statistics for All M+C Plan/County Observations, by County Stay/Drop Decision: 1999-2000.

Variable                               Stay in County (N = 1,738)          Drop County (N = 366)
                                       Mean (Standard Deviation)           Mean (Standard Deviation)
Revenue
2000 Payment*** 521.3 (84.8) 502.6 (64.9)
2000 Floor County 0.045 (0.21) 0.027 (0.16)
Cost
Registered Nurse Wage** 22.1 (2.5) 21.7 (2.1)
Specialist Density** 5.3 (5.6) 4.6 (4.0)
Economies of Scale
M+C County Enrollment*** 2,861.5 (7,579.2) 690.5 (1,309.1)
Economies of Scope
Commercial Enrollment*** 304,552.5 (446,315.4) 179,430.7 (241,103.2)
Medicaid Enrollment 23,618.7 (65,406.5) 18,100.5 (56,832.4)
Service Area Size** 16.4 (12.6) 14.7 (11.7)
Demand
Elderly Population*** 59,035.9 (101,928.7) 40,445.6 (74,868.3)
Per Capita Income*** 27,564.6 (8,396.4) 26,226.2 (7,948.3)
Competition/Market Structure
Variation in Plan Size 0.174 (0.14) 0.176 (0.15)
Monopoly County*** 0.150 (0.36) 0.276 (0.45)
Duopoly County 0.163 (0.37) 0.167 (0.37)
Medigap Premium*** 112.0 (19.2) 108.8 (20.6)
Plan Characteristics
M+C Plan Age*** 6.8 (4.4) 5.7 (3.9)
Aetna*** 0.133 (0.34) 0.014 (0.12)
Cigna*** 0.047 (0.21) 0.137 (0.34)
Kaiser 0.059 (0.24) 0.041 (0.20)
PacifiCare*** 0.067 (0.25) 0.027 (0.16)
UnitedHealthcare*** 0.043 (0.20) 0.112 (0.32)
BlueCross®BlueShield® 0.134 (0.34) 0.150 (0.36)
Other National Affiliation 0.209 (0.41) 0.183 (0.39)
Firm Change*** 0.083 (0.28) 0.005 (0.07)
*** p < 0.01 in two-sample t-test or in Pearson chi-square (χ2).
** 0.01 < p < 0.05 in two-sample t-test or in Pearson χ2.
* 0.05 < p < 0.10 in two-sample t-test or in Pearson χ2.

NOTE: M+C is Medicare+Choice.

SOURCE: Halpern, R., University of Minnesota, 2005.

Plans represented in both samples were likely to drop counties when 2000 payment was low, although this effect was much stronger for plans that continued their contracts. Plans in both samples also were more likely to drop non-floor-payment counties (supporting the interpretation that floor-payment counties signify large payment increases), counties with small M+C enrollments and high variation in plan size, and monopoly counties where they had no competitors. Plans affiliated with Cigna were more likely to drop counties compared with local plans; this effect was stronger for all plans than for continuing plans only.

The result that plans were likely to drop monopoly counties was initially surprising, particularly because they also were likely to drop counties where it was difficult to attract enrollees away from competing plans (where variation in plan size was large). An alternative interpretation is that the absence of competing plans indicated something undesirable about the county that already deterred other plans from serving it. For example, the mean 2000 payment was $462 in monopoly counties, compared with $473 in duopoly counties and $544 in counties with two or more competing M+C plans. Elderly populations were larger in duopoly counties (average 146,071) and counties with more competition (average 780,847) than in monopoly counties (average 89,314). Monopoly counties were more likely than other counties to be rural or adjacent to urban areas. These examples are not exhaustive, but they support this interpretation.

There were differences in additional decision determinants between all plans and plans that continued into 2000. M+C plans in the full sample were likely to drop counties if they were affiliated with UnitedHealthcare and were less likely to drop counties if they were affiliated with Aetna or if their ownership changed between 1999 and 2000. In contrast, plans that continued into 2000 were more likely to drop counties if their service areas were large and if they were older. The latter result supports prior observations of a weakening trend in younger plan exits.

The marginal effects show how the probability of dropping a county changed with a change in the explanatory variable. A unit change should be assumed unless a change of a different magnitude is reported. For instance, the addition of one county to a service area increased the probability of dropping a county by 0.001 among all observations and by 0.002 within the contract-continuation sample. A $100 increase in payment reduced that probability by 0.03 in the full sample and by 0.022 in the contract-continuation sample. Affiliation with UnitedHealthcare increased the probability of dropping a county by 0.22, a large marginal effect. The marginal effects of variation in plan size, presence in a monopoly county, affiliation with Aetna or Cigna, and change in ownership also were large for the observations in the full sample.
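
For readers who want to reproduce this kind of summary, a fitted logit's coefficients can be converted to average marginal effects. The sketch below shows one standard approach (statsmodels' get_margeff and the equivalent hand calculation, the coefficient times p(1-p) averaged over observations); this is not necessarily the exact procedure behind Tables 4 and 6 and uses the hypothetical fit from the Methods sketch.

```python
# One common way to obtain marginal effects from a fitted logit (average
# marginal effects); not necessarily the exact procedure behind Tables 4 and 6.
# Uses the hypothetical fit from the Methods sketch (`result`, `X`).
margeff = result.get_margeff(at="overall", method="dydx")
print(margeff.summary())

# Equivalent hand calculation for one coefficient: beta * p * (1 - p),
# averaged over observations.
p = result.predict(X)
print((result.params["payment"] * p * (1.0 - p)).mean())
```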

Joint significance tests of revenue, cost, economies of scope, competition and market structure, and plan characteristics were consistent with the impacts of the individual variables that these theoretical constructs comprise. The Hosmer-Lemeshow test showed good model fit for the full sample and poor fit for the contract-continuation sample model. Nevertheless, the alternative goodness-of-fit statistic, the sum of the correctly predicted proportions of stay and drop observations, was greater than 1 for both models, showing good fit, and was higher for the contract-continuation estimation. Both models correctly predicted at least 98 percent of the stay observations. The full sample model predicted about 5 percent of drop observations correctly, whereas the contract-continuation sample model accurately predicted about 16 percent of drop decisions.

Several specifications were estimated for each sample to test for sensitivity of results. In some specifications, payment was included without the floor county indicator. In others, the floor indicator was included, as shown in Table 4, or used as the only revenue variable. Payment amount typically was not significant, even at p < 0.10, among the full sample specifications. Payment in contract-continuation specifications, however, was significant whether it was the sole revenue variable or paired with floor county. Floor county by itself was typically weakly significant for the full sample specifications and weakly significant or not significant for the contract-continuation specifications; it was always negative and significant when paired with payment in both samples. (These results should be interpreted cautiously. Although the correlation between payment and floor indicator was low in both years (about -0.30), there is clearly a relationship between these variables that could cause sensitivity to model specification. The overall stability of coefficients and standard errors across specifications is encouraging, however, and suggests this relationship is not necessarily, or primarily, a function of collinearity.)

These results imply different concerns or priorities between terminating and continuing plans. Payment increases, and perhaps expectations of continued large payment increases, were important to all plans, but payment amount and economies of scope played a stronger role in continuing plans' decisions. The larger marginal effects for variation in plan size and presence in a monopoly county suggest that terminating plans were more sensitive to difficult market conditions than were continuing plans. (The average variation in plan size and presence in a monopoly county were not significantly different between terminating and continuing plans [data not shown].) Firm indicators seemed to be determinants of contract decisions rather than of individual county decisions. The county-level decisions Cigna and UnitedHealthcare made in 1999 were principally part of larger decisions to terminate M+C contracts rather than to undertake service area reductions. The coefficients for Aetna were negative in both sets of models, but only significant in the full sample, suggesting that plans associated with Aetna (none of which terminated their 2000 contracts) were less likely to drop counties compared with all local plans, but were not significantly less likely to do so compared with continuing local plans.

2000-2001 Decisions

Descriptive statistics for the full 2000-2001 sample are in Table 5. There were not enough observations affiliated with UnitedHealthcare and with change in firm ownership between 2000 and 2001 to include them. The results of the 2000-2001 county stay/drop decision models are in Table 6. The probability of dropping a county was 0.38 among all plans, and 0.17 among continuing plans.

Table 5. Descriptive Statistics for All M+C Plan/County Observations, by County Stay/Drop Decision: 2000-2001.

Variable                               Stay in County (N = 1,111)          Drop County (N = 699)
                                       Mean (Standard Deviation)           Mean (Standard Deviation)
Revenue
2001 Payment 559.2 (72.8) 554.6 (68.6)
2001 $525 Floor County** 0.329 (0.47) 0.280 (0.45)
Cost
Specialist Density* 5.5 (4.5) 5.1 (4.4)
Registered Nurse Wage*** 23.0 (2.6) 22.6 (2.3)
Economies of Scale
M+C County Enrollment*** 4,337.7 (9,521.8) 1,518.5 (2,963.9)
Economies of Scope
Commercial Enrollment*** 388,427.8 (542,247.8) 188,788.1 (218,942.1)
Medicaid Enrollment*** 32,696.2 (85,754.4) 18,777.7 (70,347.2)
Service Area Size*** 13.7 (8.0) 16.5 (15.3)
Demand
Elderly Population*** 66,966.5 (113,121.7) 46,076.4 (73,555.9)
Per Capita Income 27,780.2 (8,403.6) 27,217.8 (8,073.3)
Competition/Market Structure
Variation in Plan Size 0.161 (0.13) 0.157 (0.13)
Monopoly County*** 0.157 (0.36) 0.216 (0.41)
Duopoly County*** 0.204 (0.40) 0.153 (0.36)
Medigap Premium 114.9 (24.1) 115.7 (15.4)
Plan Characteristics
M+C Plan Age*** 8.0 (4.6) 6.9 (4.0)
Aetna*** 0.044 (0.21) 0.177 (0.38)
Cigna*** 0.006 (0.08) 0.133 (0.34)
Kaiser*** 0.091 (0.29) 0.001 (0.04)
PacifiCare*** 0.086 (0.28) 0.050 (0.22)
BlueCross®BlueShield®*** 0.154 (0.36) 0.203 (0.40)
Other National Affiliation*** 0.277 (0.45) 0.182 (0.39)
*** p < 0.01 in two-sample t-test or in Pearson chi-square (χ2).
** 0.01 < p < 0.05 in two-sample t-test or in Pearson χ2.
* 0.05 < p < 0.10 in two-sample t-test or in Pearson χ2.

SOURCE: Halpern, R., University of Minnesota, 2005.

M+C plans represented in both samples were likely to drop counties with low M+C enrollment. They were likely to drop counties with high RN wages and counties within large service areas, though these effects were weak for the full sample. Kaiser-affiliated plans were less likely to drop counties than were local plans.

There were more differences in decision determinants between terminating and continuing plans in 2000-2001 than in the previous decision period. Plans in the full sample also were more likely to drop counties if commercial enrollment was low and if they were affiliated with Aetna (the opposite effect from the 1999-2000 decision period) or Cigna. They were less likely to drop duopoly counties (one competing M+C plan) compared with counties where they faced two or more competitors.

Plans in the contract-continuation sample were likely to drop counties where payment was low and Medigap premiums were high. They were less likely to drop counties with the $525 floor payment and if they were affiliated with PacifiCare. Once again, older plans were more likely than younger plans to drop counties. The sign for the Medigap premium was contrary to expectations. The premiums for this Medigap plan, however, were based on community rating. It is possible that instead of measuring competition from an alternative form of coverage, as intended, this Medigap premium reflects the general level of health care costs for the eligible population in a beneficiary's geographic area. If so, a positive coefficient, indicating that plans left counties where Medicare beneficiaries' health care costs were high, is plausible.

Joint significance test results were reasonable in light of the impact of the individual variables. The Hosmer-Lemeshow statistic showed poor fit for the full sample model and good model fit for the contract-continuation sample. Again, the second goodness-of-fit statistic showed good fit for both models, especially for the full sample with a value of 1.4. The full sample model accurately predicted 87 percent of the stay observations and 55 percent of the drop observations. The contract-continuation model correctly predicted 97 and 22 percent of the stay and drop observations, respectively.

Several specifications were estimated to test the sensitivity of the models to different payment variables. An indicator variable for $475 floor-payment counties was used instead of the $525 floor indicator12 in some of the specifications. Neither payment nor either floor county indicator was ever significant for the full sample. In contrast, payment was significant only when the $525 floor was also included for the contract-continuation sample; otherwise, payment was sporadically weakly significant or not significant. The $525 floor county indicator was always significant and negative (with or without payment), whereas the $475 floor county indicator was always significant and positive, indicating that plans were likely to drop these counties.

Like the 1999-2000 models, the 2000-2001 results suggest that terminating plans had different criteria for dropping counties than did continuing plans. Neither payment nor payment changes appeared to shape terminating plans' decisions, while continuing plans were influenced by payment increase or a combination of payment level and increase. Decisions were affected by different economies of scope: terminating plans reacted to low enrollment in a different product line (commercial), while continuing plans seemed to react to M+C product-related economies of scope (service area size). Terminating plans appeared, again, to be more sensitive to market conditions given the results for duopoly counties, as previously described, while continuing plans were affected more strongly by costs (RN wage and, perhaps, Medigap premium). Finally, national firm affiliation was a determinant both of decisions to drop individual counties and of contract decisions. Most of the Aetna-affiliated plans (15 of 20) and Cigna-affiliated plans (12 of 14) terminated their M+C contracts in 2000. In contrast, only 3 of 11 plans associated with PacifiCare, and none of the Kaiser plans, left the program.

Payment Simulation Results

Simulations were performed on two specifications for each sample: one that included the coefficient and value for the floor county indicator and one that excluded them. The results from the logit estimations indicate that floor-payment counties were important; including and excluding the floor county effect in the simulations continues this exploration. Simulations were not performed for the 2000-2001 full sample because the 2001 payment coefficients were not significant.

The credibility of the simulations depends on the credibility of the models; the simulations rely on the models to represent M+C plans' behavior accurately. Models are, by definition, simplifications of reality, and one can reasonably argue that M+C plan decisions were made within a complex environment of market forces, regulatory change, firm-specific organizational dynamics, and health care cost escalation that models are unlikely to capture fully. Nevertheless, although a model can never fully explain a phenomenon of interest, the pseudo-R2 statistics associated with these models (0.15-0.25) are encouraging. Similarly, these simulations provide important information about the general magnitude of payment change that would have an impact on plans' decisions, even though one cannot assume they perfectly predict what would have happened if payment had been different. The results of the payment simulations are shown in Table 7.

Table 7. Results of Payment Policy Simulations: 1999-2000 and 2000-2001.

Simulation Performed            Floor County    1999-2000 Full Sample                          1999-2000 Contract-Continuation Sample         2000-2001 Contract-Continuation Sample
Estimated Prob[Drop County]                     0.17                                           0.09                                           0.17
Simulation 1:                   Including       0.19                                           0.113                                          0.202
FFS Expenditure Change          Excluding       0.197                                          0.119                                          0.225
Simulation 2:                   Including       0.133                                          0.055                                          0.099
M+C Payments + $100             Excluding       0.138                                          0.058                                          0.109
Simulation 3:                   Including       $498, All Observations; $863, Pr[Drop]>0.17    $459, All Observations; $715, Pr[Drop]>0.09    $531, All Observations; $733, Pr[Drop]>0.17
Reduce Pr[Drop]                 Excluding       $511, All Observations; $868, Pr[Drop]>0.17    $469, All Observations; $719, Pr[Drop]>0.09    $554, All Observations; $746, Pr[Drop]>0.17

NOTES: In Simulation 3, payments were simulated to reduce the predicted probability of dropping a county to 0.10 for the 1999-2000 full sample and the 2000-2001 contract-continuation sample; payments for the 1999-2000 contract-continuation sample reflect the payment needed to reduce the predicted probability to 0.05. FFS is fee for service.

SOURCE: 2003 Annual Report of the Boards of Trustees of the Federal Hospital Insurance and Federal Supplementary Medical Insurance Trust Funds (for FFS expenditure changes).

The first simulation shows that the probability of leaving a county could have been higher had M+C payment remained tied to the AAPCC. Contrary to allegations about a FFS/M+C “fairness gap,” M+C plans, on average, received higher payments and larger payment increases under the BBA. This result must be interpreted cautiously. National changes in FFS expenditures were used; FFS expenditures vary dramatically across counties, however. County-level FFS expenditure changes would have been preferable, but the 1997 county FFS expenditures needed to compute the 1997-1998 change applied to 1997 M+C payments were not available on the CMS Web site. The changes in national FFS expenditures were: -1 percent, 1997-1998; -0.9 percent, 1998-1999; and 2.7 percent, 1999-2000. The corresponding M+C payment increases were: 2.6 percent, 1997-1998; 2.1 percent, 1998-1999; and 4.9 percent, 1999-2000. The national FFS expenditure change from 2000 to 2001 (9.9 percent) surpassed the mean M+C payment increase (7.5 percent). County-level FFS expenditures were also assessed relative to M+C payments during the study period by taking the difference between M+C payments and Part A and B FFS expenditures (adjusted for county demographic risk). These differences were generally positive during the study period; the 1999 and 2000 mean (standard deviation) differences were both $46 ($44). Thus, while national changes in FFS expenditures mask important variation, the M+C payment/FFS expenditure differences support the assertion that M+C plans, on average, fared better under the BBA payment scheme.

A vital qualification for the first simulation is that the BBA also resulted in extraordinarily small growth in FFS expenditures during the study period. Hence, the point is not that payments were high enough to sustain participation in the M+C program or to cope with rapidly rising health care costs in the private sector, but rather that plans' concerns about payment inequality relative to the FFS sector, at least during the study period, were unsubstantiated.

The second and third simulations show, not surprisingly, that higher payments would have reduced the probability of dropping counties. The second simulation shows that probabilities would have been 0.03 to 0.07 lower (e.g., 0.13-0.14 compared with 0.17 for the full 1999-2000 sample) for all samples given an approximate 20-percent ($100) payment increase. Simulation 3 shows an important distinction in the average payment that would have reduced the probability of dropping a county. The first payment amount, which applies to the entire sample, is relatively low because the majority of observations reflect stay decisions (83 percent for the full 1999-2000 and 2000-2001 contract-continuation samples, 91 percent for the 1999-2000 contract-continuation sample) and stay decisions, in turn, imply adequate payment. It was the observations with higher estimated probabilities of dropping a county whose decisions might have changed given higher payments; payment would have had to increase by 30 to 65 percent for those observations.

All simulations show that floor payment appeared to have some kind of beneficial consequence for plan behavior. The simulated probabilities of dropping a county, as well as the simulated payment associated with lower probabilities, are lower when the effect of presence in a floor county is included in the simulation. This supports the idea that plans in floor counties might have expected continued favorable payment increases given the BBA objective of greater geographic payment equity.

Discussion

The results of the county decision models provide partial support for the hypotheses underlying this analysis. Payment and the floor indicator support the revenue portion of Hypothesis 1, that plans will drop counties where average costs exceed average revenues, at least for continuing plans. The cost portion of this hypothesis is only weakly supported, by RN wage in the 2000-2001 decisions. This lack of support is likely a function of weak measures (albeit supported by the literature) rather than unreasonable theory. The second hypothesis is that plans will drop counties with inadequate economies of scale. Hypothesis 2 is strongly supported by M+C county enrollment: plans were likely to drop counties where enrollment was low. Hypothesis 3 stated that plans would drop counties from multiple-county service areas if economies of scope were inadequate; it is partly supported by the likelihood of dropping counties within large service areas among continuing plans, as well as by the negative, significant coefficient for commercial enrollment in the 2000-2001 full sample.

Competition, market structure, and plan characteristics were not addressed directly by the hypotheses, but provide a fuller explanation of plans' decisions and are important controls. Plans were likely to drop counties where enrollee recruitment was difficult in 1999, and where they faced multiple competitors in 2000. Firm indicators captured information about unobserved organizational considerations. As speculated, for-profit plans were generally more likely to drop counties than were non-profit plans.

M+C plans' decisions about the counties they serve have important implications for competition and beneficiary choice. This analysis shows that government payment certainly deserves its place in the Medicare Advantage policy debate, but that it may not have the same impact on decisions from plan to plan, or from year to year. It has cast doubt on health plans' assertion that the BBA was deleterious to M+C plans, at least compared with AAPCC-based payment, although it supports their argument that payment and payment increase were key factors in their decisions. This study also casts doubt on the government's position that payment was not a significant consideration, but it bolsters the contention that plans reacted to competition, as previously discussed.

The differences in county decision determinants between terminating and continuing plans and between years should be considered carefully as further Medicare Advantage payment incentives are explored. Payment levels and increases, costs, competition, and other product lines did not inform all plan decisions uniformly. This study suggests that some plans continued their contracts by adjusting their M+C products: staying in counties where favorable payments and payment increases were expected, dropping counties with high input costs, and addressing diseconomies of scale and scope. It is difficult to know whether terminating plans first considered the viability of their M+C products county by county and then determined that none of the counties they served would be profitable enough to justify continued participation, or instead reacted to broader, firm-wide concerns, such as investor pressure and the consequences for other products (e.g., commercial), by dropping Medicare managed care as a line of business without county-by-county analysis. The lack of significance of any revenue variables in the full 2000-2001 sample suggests the latter scenario applied to at least some plans.

Moreover, policymakers need to consider the improvement in program access they achieve with increased payment. The simulations and marginal effects show that payment hikes will keep more plans in counties. Is the degree of expected additional retention of plans commensurate with the expenditures required to attain it?

The role of private health care organizations in Medicare is a contentious issue. It has been the catalyst for a heated debate in Congress, with conservatives promoting increased use of private organizations to increase competition and liberals maintaining that beneficiaries are best served by the traditional Medicare Program. The ability to better understand how M+C plans structure their service areas in response to payment and market environment should provide valuable information to inform CMS as it attempts to continue and strengthen Medicare Advantage.

Acknowledgments

The author would like to thank Bryan Dowd, Douglas Wholey, and Roger Feldman for their steadfast and patient support, input, and mentorship and three anonymous reviewers for their suggestions.

Footnotes

The author is with the University of Minnesota. The research in this article was supported by the Centers for Medicare & Medicaid Services (CMS) under Grant Number 30-P-91703/5-01. The statements expressed in this article are those of the author and do not necessarily reflect the views or policies of the University of Minnesota or CMS.

1

The addition of outpatient prescription drugs as a covered benefit under MMA will increase Medicare costs, although some legislators continue to assert that private health plan participation will yield increased efficiency and competition, ultimately resulting in lower expenditures (Pear, 2004).

2

Some of the reduction in the number of M+C plan contracts between 1999 and 2002 is attributable to contract consolidation.

3

Additional information on theoretical development is available on request from the author.

4

Counties served by M+C plans can be considered separate products, with administrative costs distributed across them, because payment varies between counties and plans can vary premiums and benefits between counties.

5

Furthermore, payment increases were varied and often large. They ranged from 2.0 to 13.6 percent between 1999 and 2000 and from 3.0 to 30.7 percent between 2000 and 2001.

6

The vast majority of M+C plan enrollees are eligible for Medicare based on age; in 2002, 93.3 percent of M+C plan enrollees were aged beneficiaries, compared with 6.7 percent disabled enrollees (Centers for Medicare & Medicaid Services, 2003).

7

Derivation of this conclusion is based on a logistic cumulative distribution, and is available on request from the author.

8

More than 40 percent of counties with any M+C plan in 1999 and 2000 had only one M+C plan. Another 21 to 22 percent of counties had two M+C plans.

9

This was the most popular Medigap plan purchased from this vendor by beneficiaries upon M+C disenrollment.

10

Several specifications of each model were estimated to test sensitivity and robustness. There were no signs of damaging multicollinearity, such as unstable parameter estimates or unreasonable coefficient magnitudes (Greene, 2000).

11

Probit is another estimation technique for a binary dependent variable. There is no theoretical reason to select one of these estimators, logit or probit, over the other (Greene, 2000).

12

BIPA instituted two floor-payment amounts: $525 per enrollee, described in Table 2, and $475 for all other floor-payment counties.

Reprint Requests: Rachel Halpern, M.P.H., University of Minnesota, MMC 729, 420 Delaware Street, SE, Minneapolis, MN 55455. E-mail: rachel.halpern@i3magnifi.com

References

  1. Abraham J, Arora A, Gaynor M, et al. Enter At Your Own Risk: HMO Participation and Enrollment in the Medicare Risk Program. Economic Inquiry. 2000 Jul;38(3):385–401.
  2. Benko L. Less Is Not More. Modern Healthcare. 2000 Sep 11;30(38):41–48.
  3. Brown R, Clement DG, Hill JW, et al. Do Health Maintenance Organizations Work For Medicare? Health Care Financing Review. 1993 Fall;15(1):7–23.
  4. Call KT, Dowd BE, Feldman R, et al. Disenrollment from Medicare HMOs. American Journal of Managed Care. 2001 Jan;7(1):37–51.
  5. Call KT, Dowd BE, Feldman R, et al. Selection Experiences in Medicare HMOs: Pre-Enrollment Expenditures. Health Care Financing Review. 1999 Summer;20(4):197–209.
  6. Cawley J, Chernew M, McLaughlin C. HMO Participation in Medicare Managed Care. Presented at the University of Chicago Health Economics/CHAS Workshop; Chicago, IL: October 26, 2000.
  7. Centers for Medicare & Medicaid Services. Health Care Financing, Statistical Supplement, 2001. U.S. Government Printing Office; Washington, DC: April 2003.
  8. Feldman R, Wholey D, Christianson J. Economic and Organizational Determinants of HMO Mergers and Failure. Inquiry. 1996 Summer;33(2):118–132.
  9. Given RS. Economies of Scale and Scope as an Explanation of Merger and Output Diversification Activities in the Health Maintenance Organization Industry. Journal of Health Economics. 1996 Dec;15(6):685–713. doi: 10.1016/s0167-6296(96)00500-0.
  10. Glavin MPV, Tompkins CP, Wallack SS, et al. An Examination of Factors in the Withdrawal of Managed Care Plans from the Medicare+Choice Program. Inquiry. 2002/2003 Winter;39(4):341–354. doi: 10.5034/inquiryjrnl_39.4.341.
  11. Greene W. Econometric Analysis, Fourth Edition. Prentice-Hall, Inc.; Upper Saddle River, NJ: 2000.
  12. Grossman JM, Strunk BC, Hurley RE. Reversal of Fortune: Medicare+Choice Collides with Market Forces. Center for Studying Health System Change; Washington, DC: 2002.
  13. Ignagni K. Medicare+Choice: An Evaluation of the Program. Testimony before the House Subcommittee on Health and Environment; 1999.
  14. Kennedy P. A Guide to Econometrics, Fourth Edition. The MIT Press; Cambridge, MA: 1998.
  15. Kornfield T, Gold M. Monitoring Medicare+Choice Fast Facts: Is There More or Less Choice? Mathematica Policy Research, Inc.; Washington, DC: 1999.
  16. Lake T, Brown R. Medicare+Choice Withdrawals: Understanding Key Factors. The Henry J. Kaiser Family Foundation; Washington, DC: June 2002.
  17. McLaughlin CG, Chernew M, Taylor EF, et al. Medigap Premiums and Medicare HMO Enrollment. Health Services Research. 2002 Dec;37(6):1445–1468. doi: 10.1111/1475-6773.10832.
  18. Medicare Payment Advisory Commission. Report to the Congress: Medicare Payment Policy. Washington, DC: March 2002.
  19. Merrill K. Medicare+Choice: Payment and Service Areas. Final Report to the Assistant Secretary for Program Evaluation. U.S. Department of Health and Human Services; 2001.
  20. Pear R. Bush's Aides Put Higher Price Tag on Medicare Law. The New York Times. 2004 Jan 30:A1.
  21. Pizer SD, Frakt AB. Payment Policy and Competition in the Medicare+Choice Program. Health Care Financing Review. 2002 Fall;24(1):83–94.
  22. Srinivasan S, Levitt L, Lundy J. Wall Street's Love Affair with Health Care. Health Affairs. 1998 Jul-Aug;17(4):126–131. doi: 10.1377/hlthaff.17.4.126.
  23. StataCorp®. Stata Statistical Software: Release 7.0. StataCorp LP; College Station, TX: 2001.
  24. Stuber J, Dallek G, Edwards C, et al. Instability and Inequity in Medicare+Choice: The Impact on Medicare Beneficiaries: Findings from Seven Case Studies. Executive Summary. Publication Number 496. The Commonwealth Fund; New York, NY: 2002.
  25. Stuber J, Dallek G, Biles B. National and Local Factors Driving Health Plan Withdrawals from Medicare+Choice: Analyses of Seven Medicare+Choice Markets. Field Report. Publication Number 491. The Commonwealth Fund; New York, NY: 2001.
  26. U.S. General Accounting Office. Medicare Managed Care Plans: Many Factors Contribute to Recent Withdrawals; Plan Interest Continues. GAO/HEHS-99-91. U.S. Government Printing Office; Washington, DC: April 1999.
  27. Wholey D, Feldman R, Christianson J, et al. Scale and Scope Economies Among Health Maintenance Organizations. Journal of Health Economics. 1996 Dec;15(6):657–684. doi: 10.1016/s0167-6296(96)00499-7.
  28. Wholey D, Christianson J, Sanchez S. Organizational Size and Failure Among Health Maintenance Organizations. American Sociological Review. 1992 Dec;57:829–842.
