Health Care Financing Review. 2002 Fall;24(1):117–132.

What Does Voluntary Disenrollment from Medicare+Choice Plans Mean to Beneficiaries?

Lauren D Harris-Kojetin, Elizabeth MF Jaël, Fiona Smith, Beth Kosiak, Julie Brown
PMCID: PMC4194787  PMID: 12553297

Abstract

The Balanced Budget Act (BBA) of 1997 required CMS to report publicly Medicare managed care (MMC) plan voluntary disenrollment rates. To ensure disenrollment rates would be meaningful to beneficiaries in health plan choice, CMS funded the development of surveys and reporting formats to identify and present the reasons that beneficiaries voluntarily leave plans. Public reporting of reasons on the Medicare Web site began in 2002. We discuss results from extensive audience testing of disenrollment rates and reasons materials. Medicare beneficiaries do not easily understand disenrollment. We also discuss challenges in presenting useful disenrollment information and policy implications for public reporting.

Introduction

The 1997 BBA required that CMS report 2 years of disenrollment rates on all MMC plans, also known as Medicare+Choice (M+C) organizations. This mandate provides a unique opportunity to examine consumer understanding, interest, and use of disenrollment rates. The 2000 national voluntary disenrollment rate from M+C plans was 11 percent (with plans ranging from 0 to 51 percent) (Harris-Kojetin et al., 2002). To ensure that such rates would be meaningful to consumers and to increase the likelihood of their use for health plan choice, CMS funded a project to create survey instruments and reporting formats that would identify and present the reasons that beneficiaries disenrolled from M+C plans. National public reporting of M+C disenrollment rates began in 2000 and reporting of reasons for disenrollment began in 2002.

This article discusses the results of testing information intended for Medicare beneficiaries about voluntary disenrollment from M+C plans. We describe the results of five rounds of materials development and testing consisting of one round of six exploratory focus groups followed by four rounds of intensive one-on-one interviews for a total of 89 interviews. In all, our testing included 137 Medicare beneficiaries in six metropolitan areas (Baltimore, Boston, Houston, Los Angeles, Phoenix, and Tampa) between December 1998 and November 2000 (Jaël et al., 2000; Hargraves, Smith, and Stern, 1999; Harris-Kojetin et al., 1999a, b; Harris-Kojetin, Jaël, and Hampton, 1999). We describe Medicare beneficiaries' reactions to printed materials about M+C plan disenrollment rates, reasons, and background information, focusing on beneficiaries' level of trust, comprehension, and likelihood of using such information to compare plans. We discuss challenges in presenting useful disenrollment information and policy implications for public reporting of comparative plan performance information to support informed consumer choice.

Background

In recent years, a substantial effort has been made to provide comparative quality information on health care plans to the public so that they can make more informed choices, and thereby actively contribute to a more market-driven system (Ernst & Young, 1998; Varner and Christy, 1986). To this end, many public and private purchasers such as Medicare, State Medicaid agencies, and General Motors have published quality information which includes consumer ratings of plan performance (e.g., Consumer Assessment of Health Plans Study® [CAHPS®]) and plan-reported clinical measures (e.g., Health Plan Employer Data and Information Set® [HEDIS®]). These efforts have tried to highlight the relevance and importance of these performance measures for consumers' health plan choices, but have met with mixed results.

For example, one recent study found that more than one-half of Medicare beneficiaries had difficulty understanding and using comparative health plan information (Hibbard et al., 2001). Barriers to effective use of such performance reports include consumers' limited understanding of their health plan choices (Hibbard et al., 1998) and difficulties in disseminating such information at the time it is most likely to be used by consumers (Harris-Kojetin et al., 2001a).

Debate exists both over the relative role that market factors and member dissatisfaction play in explaining voluntary disenrollment rates (Riley, Ingber, and Tudor, 1997; Schlesinger, Druss, and Thomas, 1999) and the suitability of disenrollment rates as a valid indicator of plan quality (Dallek and Swirsky, 1997; Newhouse, 2000; Rector, 2000; Schlesinger, Druss, and Thomas, 1999; U.S. General Accounting Office, 1998). The U.S. General Accounting Office (GAO) issued a report in October 1996 urging public disclosure of disenrollment rates to help Medicare beneficiaries to choose among competing plans (U.S. General Accounting Office, 1996). In later testimony to the U.S. Senate, the GAO reiterated the value of disenrollment information as an indicator of health plan quality (U.S. General Accounting Office, 1997).

Consumer advocacy groups such as Families USA Foundation have also argued that disenrollment rates are a critical component of information that Medicare beneficiaries need to make informed choices about managed care plans (Dallek and Swirsky, 1997). Some consumer guides (e.g., www.clevelandclinic.org/quality/08-24 and www.nysenior.org/Issues/issues-frame.htm) recommend that patients look at a plan's disenrollment rate when making a health plan choice. However, there has been little published research on how consumers understand and use disenrollment information.

Methods

Overview

The goal was to develop materials that explain both voluntary disenrollment rates from M+C plans and the reasons for leaving plans to Medicare beneficiaries in a useful and understandable way. To meet this goal, the research design employed one round of six exploratory focus groups followed by four rounds of interviews. The process included background research followed by iterative rounds of materials development, revision, and testing with three groups of Medicare beneficiaries—those who recently voluntarily left an M+C plan and those continuously enrolled in fee-for-service (FFS) or in M+C plans. Three research organizations—RTI International, RAND, and The Picker Institute, through a subcontract to Harvard Medical School—conducted the audience testing between December 1998 and November 2000. Through our testing, we examined these four key research questions:

  • What are Medicare beneficiaries' understanding, perceptions, and interpretations of voluntary plan disenrollment?

  • What are beneficiaries' levels of trust in, interest in, and likelihood of using voluntary plan disenrollment rates and reasons?

  • What are effective ways to explain and display voluntary plan disenrollment information to beneficiaries?

  • What is beneficiaries' level of understanding of statistical concepts, such as average and percentage?

First, we conducted six exploratory focus groups to: (1) elicit knowledge and attitudes regarding voluntary disenrollment from M+C plans, (2) assess attitudes and trust toward Medicare as a provider of information to beneficiaries, and (3) identify preferred data displays.

Based on these exploratory focus group findings and lessons drawn from reporting CAHPS® assessment and ratings data to the public (Harris-Kojetin et al., 2001a, b; McGee et al., 1999), we drafted report templates (Figure 1). We then conducted the first round of 36 indepth interviews to test comprehension and interpretation of these draft templates and to identify ways to make the materials easier to understand and use. After this first round, we revised the report templates and conducted three more rounds of interviews (ranging from 12 to 24 interviews per round), revising the materials after each. In each round we sought to further refine the materials to enhance user understanding, usability, trust, and interest in using the materials in plan choice, as well as to respond to CMS's evolving reporting strategy.

Figure 1. Early Template of Voluntary Disenrollment Rates, Showing Percentage Leaving and Percentage Staying.

Materials Tested

The terms “disenrollment materials” and “disenrollment information” refer to the templates that were developed and tested to display two types of information: (1) rates of M+C plan voluntary disenrollment and (2) the survey-based self-reported reasons disenrollees gave for leaving M+C plans. Reasons were grouped into a smaller number of categories for reporting purposes. These reasons groupings were developed based on input from the project team; later psychometric analysis of the survey field test data generally supported them (Harris-Kojetin et al., 2002).

The reasons groupings were tested at two levels of aggregation. The less aggregated display contained six categories of problems or reasons for leaving: (1) plan information, (2) getting care, (3) seeing the doctor you wanted, (4) communicating with doctors, (5) costs, and (6) getting or paying for prescription medicines. The more aggregated display contained only two categories—problems with care or service and concerns about costs (Figure 2). The problems with care or service category contained the first four reasons groupings listed above. The concerns about costs category contained the last two reasons groupings listed above. The disenrollment rates and reasons were presented as percentages in the templates (Figures 1 and 2). In this article, we focus on the more aggregated two-category display of reasons groupings shown in Figure 2 because that is the approach that users are more likely to see when visiting the Medicare Health Plan Compare Web site.
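As an illustration of the two-level aggregation just described, a minimal sketch in Python follows. The six detailed categories and two summary categories come from the article; the mapping structure, function name, and the assumption that detailed percentages add cleanly into the summary categories are our own illustration, not CMS's actual reporting logic:

```python
# Illustrative rollup of the six detailed reasons groupings into the
# two summary categories used in the more aggregated display (Figure 2).
DETAILED_TO_SUMMARY = {
    "Plan information": "Problems with care or service",
    "Getting care": "Problems with care or service",
    "Seeing the doctor you wanted": "Problems with care or service",
    "Communicating with doctors": "Problems with care or service",
    "Costs": "Concerns about costs",
    "Getting or paying for prescription medicines": "Concerns about costs",
}

def summarize(detailed_rates):
    """Aggregate detailed reasons percentages into the two summary
    categories. Assumes the detailed percentages are mutually exclusive
    so that they add; in the actual survey, disenrollees could cite
    more than one reason, so CMS's computation may differ."""
    summary = {}
    for reason, rate in detailed_rates.items():
        category = DETAILED_TO_SUMMARY[reason]
        summary[category] = summary.get(category, 0.0) + rate
    return summary

# Hypothetical example: a plan whose 11-percent voluntary disenrollment
# rate breaks down across the six detailed reasons groupings.
print(summarize({
    "Plan information": 1.0,
    "Getting care": 3.0,
    "Seeing the doctor you wanted": 2.0,
    "Communicating with doctors": 2.0,
    "Costs": 2.0,
    "Getting or paying for prescription medicines": 1.0,
}))
# -> {'Problems with care or service': 8.0, 'Concerns about costs': 3.0}
```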

Figure 2. Final Template: Voluntary Disenrollment Rates, by Two-Category Breakdown of Reasons.

During the 2 years we designed and tested the disenrollment materials, CMS was also determining how the disenrollment information would be disseminated (e.g., in the Medicare & You handbook, on the Medicare Web site [www.Medicare.gov], etc.). When we first started, we used the working assumption that the disenrollment materials would be displayed in print using several pages. As a result, in our first three testing rounds, we examined different ways to present disenrollment rates using bar graphs (Figure 1) and to present reasons using table displays and explanatory text.

We then learned that the disenrollment information would be included as one of three main areas in which to compare plans on CMS's Medicare Health Plan Compare Web site. (The other two areas are costs and benefits and quality measures.) We needed to provide the main disenrollment rates and reasons information on one screen. Therefore, in the last two rounds of testing, we examined optimal ways to present the disenrollment rates and reasons information on one page.

We started with a tabular display because previous testing showed it worked well with our target audience (Harris-Kojetin, Miller, and Nemo, 2000). Based on the previous two testing rounds, the tabular display was refined to present disenrollment rates aggregated into two main categories of reasons (Figure 2). On the Medicare Health Plan Compare Web site, users may link to the full description of the less aggregated (more detailed) reasons groupings. The environment in which this applied research was conducted required that we change the disenrollment materials rather significantly after we began our first testing round. However, we audience tested these changes and the final template on which the actual display is based (Figure 2) reflects the lessons learned from multiple testing rounds conducted in diverse geographic locations by multiple research organizations.

Study Sites

The exploratory focus groups and the indepth interviews were conducted in six locations: Phoenix, Arizona; Los Angeles, California; Tampa, Florida; Baltimore, Maryland; Boston, Massachusetts; and Houston, Texas. We selected these study sites to represent the diversity of the health plan market with respect to history of managed care, Medicare voluntary disenrollment rates, and local culture. All sites, however, needed to have relatively high MMC penetration rates at the time of testing, to facilitate the recruitment of voluntary disenrollees as testing participants.

Recruitment

For all five testing rounds, potential participants were randomly selected from a list of several hundred names provided by CMS. We then sent CMS-approved advance letters to sampled beneficiaries under the agency's letterhead and signature. We screened beneficiaries who proactively called the researchers in response to the advance letter, as well as those for whom we could obtain a working telephone number and whom we contacted by telephone.

To be eligible to participate, recruitees needed to be between the ages of 65 and 85 and able to read and speak English. Within each of the testing rounds, we had to reflect three geographic locations and the perspectives of enrollees, disenrollees, and FFS participants. Because we had to reflect this geographic and enrollment status diversity among the aged, we were concerned that we would be unable to address adequately the same characteristics among the non-elderly disabled and oldest-old beneficiaries, given the number of completions per round. The non-elderly disabled and oldest-old were excluded for logistical reasons as well, for example, a lack of sign-language resources and wheelchair accessibility at the testing sites.

Overall, across all five testing rounds, 57 participants were continuously enrolled in an M+C plan, 45 had recently disenrolled from an M+C plan, and 35 were continuously enrolled in Medicare FFS for at least 12 months at the time of testing. Of these three strata, we found it most challenging to recruit FFS participants and least challenging to recruit voluntary disenrollees. Advance letters were sent to more FFS and managed care enrollees than to voluntary disenrollees in order to get the needed number of participants. Medicare FFS participants had limited knowledge about health maintenance organizations (HMOs) or managed care, most had never been enrolled in managed care, and several communicated in the groups and interviews that they had no desire to make a change from FFS. We included only M+C enrollees and voluntary disenrollees in the fourth and fifth testing rounds, because we were focusing largely on usability of disenrollment information in plan choice. Previous testing rounds showed that FFS participants were not interested in using disenrollment data.

Across the five testing rounds, participants were about evenly distributed by education. Forty-five participants (33 percent) had at least a college education, 43 (32 percent) had some college or technical school training, and 47 (35 percent) had a high school education or less.

Research Protocol

After obtaining consent at the beginning of each respective group and interview, we conducted the exploratory focus groups and the interviews using a structured discussion guide. The goals of the exploratory focus groups were to measure comprehension and perceptions of voluntary disenrollment, assess levels of trust and interest in the materials about disenrollment, and to identify effective presentation formats for interview testing. The focus group moderator guide was designed to address the following research questions:

  • What are the participants' current sources of Medicare information?

  • What do participants know about disenrollment, both the term and the concept?

  • In what ways do participants prefer to see disenrollment rates and reasons presented?

  • Would participants understand the disenrollment information if they received it?

  • Would participants trust the disenrollment information if they received it?

  • How would participants use the information? Who do participants think would use the information? How? When?

After the first round of exploratory focus groups, we used interviews in the remaining four testing rounds. Interviews are an effective method for investigating the thought processes used by persons as they gather information, explore their options, and make decisions. In this project, we used the interviews to address these research questions about the disenrollment materials:

  • Content—Does the material contain information relevant to Medicare health care consumers? Is the information sufficient? Is the information complete? Does the material contain information that seems unnecessary to Medicare health care consumers? Did Medicare beneficiaries perceive the information as trustworthy?

  • Comprehension—Do beneficiaries understand the information provided? Do they understand the information as intended?

  • Navigation—How easily do beneficiaries work their way through the sections of the materials? Do the sections fit together in ways that make sense to beneficiaries? Are beneficiaries able to find the information they want? Do the sections help beneficiaries interpret the information they find?

  • Decision Processes—Do beneficiaries understand the health plan choice task that faces them? Do they recognize the tradeoffs they probably have to make? Does the material help them understand the choice task?

For the first two interview rounds, participants were asked to read a page or section at a time and then to answer prescripted, structured closed- and open-ended questions about that page or section. For the last two interview rounds, participants were asked to read the entire set of materials, after which the interviewer went back and asked prescripted questions about each section. Throughout the rounds, we encouraged participants to voice questions that came to mind while reading a section; the interviewer wrote down those comments and questions on the protocol during the interview. In the last interview round, participants were asked to read the materials with the primary task of using them to choose one of the simulated M+C plans listed in the materials.

After each interview was completed, the interviewer rated the interest and comprehension of the participant. The interviewer then reviewed the completed protocol to ensure its accuracy and completeness. Each interviewer developed a short list of main findings and recommended edits to the materials based on her/his own set of interviews. These lists were shared with the analyst(s) within each respective organization. Once all interviews were completed, the interviewers within each organization met as a group to debrief on main findings. Each organization's analyst wrote up the testing findings and corresponding recommendations for revisions to the materials based on the completed protocols, interviewer lists, debriefing, and tallies of responses to the close-ended protocol questions.

Strengths and Limitations of Methods

Focus groups are an effective way to elicit opinions and ideas and to assess prevailing attitudes. Indepth interviews help researchers to assess comprehension and interpretation, memory recall, comparison, evaluation, and selection or choice. Both methods can provide a rich source of data to help explain why people respond to materials the way they do.

However, both of these methods also have limitations which affect the extent to which the results can be generalized beyond the participants tested. First, to be eligible, participants needed to be able to speak and read English and be between the ages of 65 and 85. People with severe vision, hearing, or comprehension problems were also removed from the list of eligible participants.

Second, results may be affected by self-selection bias. Participants willing and able to take part in focus groups and interviews likely overrepresent beneficiaries with higher levels of physical and cognitive functioning. Individuals who feel strongly about the focus group or interview topic as presented in the advance letter and screening materials are also more likely to volunteer to participate.

Third, participants were paid to participate ($50 for this study) and were expected to read materials provided to them that they might not find truly relevant. For these reasons, the results may not be representative of the Medicare population at large and do not necessarily reflect how beneficiaries would use or interpret the materials in real life.

Another potential limitation of focus groups and indepth interviews applies to all data collection and analysis endeavors—interpretation of the findings. Like all researchers, we may have preconceived ideas that affect how we interpret and report findings (and what findings we see). This work was conducted by teams of researchers at three different institutions. After each testing round, we wrote up our respective findings, shared them, and discussed differences where they occurred. We believe that this collaborative, inter-subjective process built in a level of reliability that helps minimize possible bias in interpretation.

Findings

Comprehension of Disenrollment

A considerable amount of our testing focused on examining beneficiaries' comprehension of disenrollment information, the logic being that if Medicare beneficiaries did not understand key aspects of disenrollment, they would not use the information. Many participants had not heard the term disenrollment before in the context of Medicare, but were willing to try to define it. When given the term, participants offered three interpretations: (1) a plan drops its service or moves out of the market area, (2) a plan drops or cancels particular members without their permission, and (3) dissatisfied people choose to leave the plan. The following quotes illustrate these three interpretations, only the last of which was the meaning the researchers intended.

I signed up and three months later they cancelled [their service].

The elderly people [who] require a lot of service get thrown off the HMO.

Someone is dissatisfied generally if they disenroll due to a bad experience.

The continuing and increasing M+C plan withdrawals and service area reductions across the country contribute to the first interpretation of disenrollment. Participants' interpretations of the term disenrollment tended to vary by their current enrollment status. Participants who had recently voluntarily disenrolled from an M+C plan generally understood the term as intended. Those who were enrolled in either an M+C plan or Medicare FFS at the time of testing more often understood the term in the involuntary sense. As a result of testing, we added the phrase “people who choose to leave their managed care plan” to the materials, which increased comprehension of voluntary disenrollment as intended.

While some participants did not understand the concept of disenrollment completely, they did understand that it meant leaving. Leaving or canceling a plan were terms most often suggested by participants to describe disenrollment. Participants expressed a strong interest in knowing the reasons why Medicare beneficiaries leave their Medicare health plans, as evidenced by the following quotes:

If my plan were falling behind, I would want to know why.

Having leavers and stayers doesn't say anything unless you also know the reasons for leaving.

The six main groupings of reasons described earlier were tested in multiple rounds. Almost all participants understood these reasons for leaving and agreed that the descriptions for why beneficiaries disenrolled reflected the main reasons they cared about, as illustrated by the following quotes:

I think it hit all the questions that are important. It's what people always worry about is the care, services, and cost.

[It is] the basic reasons people leave their health plan. It is exactly what is happening out there—[it's] on target…it covers all the reasons.

[These are] reasons I have run into myself—why I left.

Further, participants stated that information on the reasons for disenrollment made the disenrollment rates information more comprehensive and meaningful.

Comprehension of Numerical Information

If beneficiaries do not understand the data being reported on disenrollment rates and reasons, they will be unlikely to use it to compare health plans. Our research team believed that a fundamental aspect of this understanding is comprehension of numerical information. As a result, participants' understanding of percentage and State average was tested. We also examined participants' preferences for different ways of presenting disenrollment rates and reasons data. Figure 1 presents bar graphs in the earlier form tested and Figure 2 presents tabular results in the final form developed.

Most participants understood the concept of percentage. This comprehension is illustrated by the following quotes referring to the first plan in Figure 1, which shows that 21 percent of plan members chose to leave the plan.

Out of every 100 members 21 leave and 79 stay.

21 of every 100 leaves. That's a poor percentage. I wouldn't be happy going into a plan with that many people leaving.

For those participants who did not understand percentage before seeing the material, various definitions we devised and tested did not help them to understand the concept. In the initial design of the report, a definition of percentage was included beside the percentages we presented. For example, we tested this statement: “In each bar, the number in the dark blue section shows how many Medicare health plan members out of every 100 left their plan. Each number is a percentage.”

Most participants described this definition of percentage as useful. However, having the definition did not help the large minority who could not define percentage mathematically. Yet even without the definition, participants who could not explain mathematically what a percentage is still understood that it represented the number of beneficiaries who disenrolled from or remained with the managed care plan. Thus, although participants may not have been able to explain the concept (with or without a definition provided), they understood how to use the information as intended, as illustrated by the following quotes:

If I were looking for a new plan I'd go for the one with the highest number of stayers.

The bigger the percentage, the more problems in the plan.
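To make the arithmetic behind these displays explicit, the percentage in each bar can be written as a worked equation. The 21-percent rate is the one shown for the first plan in Figure 1; the plan size used here is hypothetical, chosen only to illustrate the computation:

\[
\text{voluntary disenrollment rate} = \frac{\text{members who chose to leave}}{\text{total plan members}} \times 100 = \frac{4{,}200}{20{,}000} \times 100 = 21\ \text{percent},
\]

so 21 of every 100 members left and the remaining 79 stayed, which is exactly how participants read the bar.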

The concept of average was much more difficult for participants to comprehend and use. We tested different ways to define the concept and help readers of the materials understand how to use the average as intended. For the bar graph display of plan disenrollment rates, originally we had a horizontal bar at the top of the page representing the State average followed by the plan-specific bars showing the percentage of beneficiaries who had voluntarily disenrolled from their plan (Figure 1). We visually set the State average bar graph apart from the plan bar graphs in the display and included a definition beside the State average.

When we asked participants what they thought a State average meant, slightly less than one-half were able to explain the meaning or its relationship with the rest of the bars. For example, a few of the participants thought the State average bar simply represented a general example of a bar graph. As a result, we tested participants' comprehension of a mathematical definition of average placed beside the State average bar graph: “This bar shows the average for the [enter number] people enrolled in all Medicare health plans in [enter State name] for the last 12 months. In [enter State name], [enter number] out of 100 people in a Medicare health plan left their plan.”

Participants who did not already understand State average generally found this definition confusing. In addition, even those participants who could not define an average nonetheless intuitively knew how to use the State average bar graph correctly as a reference (without the detailed definition beside it). As a result of these findings, we simplified the accompanying text for the State average to read: “Average for all Medicare managed care plans in [enter State or region]” (Figure 2). When asked, “What is the top row for?,” most participants were able to describe how it is used, as exemplified by the following quote:

If 11 percent is average, then when you see 18 percent leaving [for a plan] you know something is wrong. If only 2 percent are leaving you know they [the plan] are doing something right.

Approximately two-thirds of the participants found the State average bar graph in Figure 1 useful as a reference to compare the health plans.
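To make the reference row concrete, here is a worked version of the computation behind it. The article does not spell out how the State average was calculated; this sketch assumes it was taken over all enrollees pooled across plans (equivalent to an enrollment-weighted average of plan rates), and the plan sizes are hypothetical:

\[
\text{State average} = \frac{\sum_i d_i}{\sum_i n_i} \times 100,
\]

where \(d_i\) is the number of voluntary disenrollees and \(n_i\) the number of enrollees in plan \(i\). For example, three plans with rates of 18, 2, and 6 percent would give

\[
\frac{1{,}800 + 100 + 300}{10{,}000 + 5{,}000 + 5{,}000} \times 100 = \frac{2{,}200}{20{,}000} \times 100 = 11\ \text{percent},
\]

matching the participant's reading quoted above: against an 11-percent average, an 18-percent plan looks troubled and a 2-percent plan looks strong.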

To help participants compare the health plan bar graphs with the State average bar graph we tested a vertical line that ran from the State average through each of the health plan bars (Figure 1). Our intent was to create a visual cue that would assist participants to understand which health plans had a disenrollment rate above or below the State average. However, most participants did not understand that the vertical line represented the State average. For example, several participants asked, “Why is this line here?” The line was removed when testing showed that it did not aid participants in using the data and, worse, added confusion.

In sum, less than one-half of the participants came to the disenrollment materials with an understanding of the concept of a State average. The majority, who could not describe the mathematical concept, nonetheless understood how they were supposed to use the State average, without the need for a conceptual definition or a visual aid like the vertical line.

Contrary to what was initially assumed, we found that people can often understand how to use a numeric concept (e.g., percentage, average) even when they are unable to talk about it in a mathematical way. That is, beneficiaries do not have to be able to describe what a percentage or a State average is in the way that we as researchers assumed they should, in order to know how to use them appropriately when comparing health plans. In such cases, giving even a basic mathematical definition can be unproductive if beneficiaries already have a grasp of the concept but cannot define it using mathematical terms.

Reactions to Disenrollment

Trust

We examined trust in each round of testing, since beneficiaries are more apt to use information they consider credible and trustworthy. Participants' trust in the disenrollment materials increased in each round. Less than one-half of the participants trusted the information a lot in the first round of interviews, while two-thirds stated they trusted the information a lot by the final round, as illustrated by the following quotes:

Medicare collected it so it set me at ease. And that it is researched by an outside company—that puts me at ease as well.

They're not selling you anything. It's unbiased. They've done research and they've outlined it. They leave it to your own discretion.

Two themes emerged when participants were asked how much they trusted the materials—(1) they wanted to know the sponsor of the materials and (2) they wanted to know the source of the disenrollment rates and reasons information. In the earlier versions of the materials tested, we explained that the information came from Medicare's own records and an independent survey. We also included the statement “Source: U.S. Health Care Financing Administration” (i.e., HCFA, the name of CMS at the time of testing) underneath the bar graphs in Figure 1, to indicate that HCFA was the source of the data. Participants' trust in the materials increased once they understood that HCFA administered the Medicare Program. For the small proportion of participants who did not fully trust the materials, one reason that they gave was their perception that the data or the materials came from a plan, as characterized by the following quote:

If it comes from an insurance company, they are catering to your business. They give fake rates to pull you in and then change the rates to the correct ones.

Participants suggested using the term the Medicare Program rather than Health Care Financing Administration, since it is more readily identifiable to Medicare beneficiaries. As a result, we changed the source statement at the bottom of the data display in Figure 2 to refer to Medicare rather than to the Health Care Financing Administration.

Participants also wanted to know more about the survey used to collect the reasons information. For example, a minority of participants wanted to see the sample sizes and the actual number of people in the M+C plans. They also wanted to know who or what sponsored the survey, as exemplified by the following quote:

[I] want more information on who did the survey. Was it done by health plans?

The initial description of the survey used in the materials stated that Medicare oversaw the survey, described what the survey was about, noted that an outside research company performed the survey, and listed the sample exclusion criteria. However, this description of the survey was revised based on test results. The end result is written in a very conversational style:

Please keep in mind that this chart shows the percentage of members who chose to leave. This does not include members who moved, died, or who were no longer eligible for Medicare managed care. It also does not show members who had to leave because their plans stopped serving Medicare members in their area…Medicare collected this information from Medicare managed care plans and from Medicare members who chose to leave their Medicare managed care plan. Medicare uses an independent research company to collect this information and report it back to you.

We opted not to include sample sizes or other detailed, technical information because that information can be misinterpreted if someone does not understand survey sampling and statistics.

Participants' views of the disenrollment materials were influenced by the proportion, shown in the data displays (Figures 1 and 2), of members who leave a plan relative to those who continue with it. The following quotes reflect a wide variety of participants' views on the information, from feeling comforted by the high percentage of people who continue with their plan to being skeptical about the veracity of the percentages.

Overall message is that only 17 percent leave and 83 percent stay—so most people stay. That is pretty good.

This tells me people tend to stick with what they have.

Does 3 percent [under Concerns about Costs] mean 3 out of 100 have a specific problem? It's not impressive. If numbers were bigger it would be more realistic.

The disenrollment rates we used in the Figure 2 test materials reflected actual average disenrollment rates among M+C organizations at the time of testing.

Finally, some participants revealed ambivalent feelings about using disenrollment rates as an indicator of the quality of health plans, as reflected in the following quote:

Stayers are people who are satisfied or who don't have enough energy to change, too old to change.

Recent findings on the relationship among health status, dissatisfaction, and voluntary disenrollment (Schlesinger, Druss, and Thomas, 1999) corroborate this participant's concerns.

Use and Acceptance

Almost all participants stated that the disenrollment data were very useful. Participants who were current M+C plan enrollees said that they would use the data if they had to find a new plan. Other participants said they would refer to the information at some later time. Both sentiments are reflected in the following quotes:

If I were looking for a new plan I'd go for the one with the highest number of stayers.

I'd read them and file them.

However, participants generally said that they would not use the data now because they were happy with what they had. Many said that the information would really be most suitable for those new to Medicare and interested in considering an M+C plan. Not surprisingly, participants who were enrolled in an FFS plan had the least interest in the information while those enrolled in or recently disenrolled from an M+C plan exhibited the most interest. Those enrolled in FFS said they were not interested in the disenrollment information because they had no interest in enrolling in an M+C plan.

During the last round of testing, participants' interest in and use of disenrollment data, a HEDIS® measure (mammography rate), and a CAHPS® rating (0-10 plan rating) were examined. Participants responded that all three types of plan performance information were either somewhat or very useful in helping them choose a hypothetical M+C plan.

Summary

Most participants were not familiar with the term disenrollment in the context of health plans. Many of them did not intuitively understand the concept of disenrollment as voluntary on the part of the beneficiary. Once participants were introduced to the intended meaning, they were generally interested in and understood the reasons why beneficiaries would choose to leave an M+C plan. They understood that plans with the fewest voluntary disenrollees were performing the best.

Participants were also able to use the State average to help them compare the plan-level data, even though many were unable to define the meaning of average. Once participants understood that Medicare was the source of the disenrollment information, they trusted the information. Finally, participants generally agreed that disenrollment information is most useful for those about to select a Medicare health plan, i.e., new beneficiaries and beneficiaries interested in selecting a different M+C plan.

Discussion and Implications

These results provide support for the long-range strategy and general direction CMS has taken with respect to beneficiary education and use of comparative information. First, they highlight the need for effective use of a variety of information intermediaries to provide guidance and assistance to beneficiaries in using complex quality performance information. CMS-sponsored research shows that many beneficiaries need help with this information (e.g., McCormack et al., 2001) and often turn to trusted sources to provide that support. These trusted sources include friends and family members, counselors at the State Health Insurance Assistance Programs (SHIPs), and partners such as AARP, industry, professional societies, and labor unions, among others.

CMS has actively engaged such partners—from the SHIPs active in all States and territories and the REACH Program in the regional offices, to its Partnership and Promotion Group. With respect to the partners, for example, CMS holds regular monthly meetings to determine ways in which they can best disseminate Medicare information and provide active support to beneficiaries in their understanding and use of that information. To further the dissemination of quality information, CMS, along with other government agencies, has sponsored several conferences on the reporting of quality information to the public, as well as a Web site (www.talkingquality.gov) that provides guidance about how to create user-friendly reports.

Second, the use of comparative quality information in plan decisionmaking is still a relatively novel experience for most consumers. Many consumers (particularly current elderly beneficiaries) are not accustomed to receiving or using such information. CMS and other purchasers need to continue to pursue strategic promotional initiatives to educate the American public about the importance and use of quality information in plan choice. If such initiatives are successful, adults may age into Medicare knowing more about, and being more interested in using, plan quality performance information than new beneficiaries are now. Younger friends and family members (children, grandchildren) of Medicare beneficiaries are currently more likely to use Web-based resources than older beneficiaries. If CMS includes these informal intermediaries as target audience members for a campaign to increase awareness of Medicare Health Plan Compare, they may be more likely to use the Web site and performance information when helping their Medicare beneficiary family member or friend in choosing a health plan.

One particularly promising way to educate the public is to identify and effectively respond to openings or times, places, and situations when the audience will be most attentive to, and able to act on, the message (Siegel and Doner, 1998). This approach suggests that for certain kinds of information, such as comparative plan quality data, more targeted dissemination strategies may be more effective than broad-based dissemination strategies currently mandated by Congress.

For example, our research showed that the disenrollment materials are perceived by beneficiaries as being most relevant and likely to be used by beneficiaries already intending to enroll in an M+C plan. This suggests that the disenrollment information should be targeted to new beneficiaries and beneficiaries who request information because they are considering changing plans (e.g., because their plan left their area). The Medicare toll-free service is a primary way in which this information could be targeted to beneficiaries who need information to help make a plan change; other, local venues (such as through the SHIPs and State-level AARP offices) should also be used to implement a more targeted dissemination strategy.

Third, the results provide strong support for the approach that CMS has taken with respect to public reporting of disenrollment information. While many policy analysts have presumed that disenrollment rates, on their own, were both an accurate proxy for health plan quality and easy for consumers to use, CMS lacked the empirical evidence to support these presumptions. Our research suggests that the picture is more complex. It appears that for beneficiaries disenrollment rates are a necessary, but not sufficient basis on which to evaluate health plans.

Indeed, CMS designed and implemented the Medicare CAHPS® disenrollment reasons survey specifically to ascertain the reasons beneficiaries disenroll from their plans. One year's worth of data is now available from that survey and is being reported publicly in concert with the rates to help consumers better understand why Medicare beneficiaries leave their plans. More research is needed to examine whether other populations not included in our research, such as non-elderly disabled beneficiaries and family members of beneficiaries who help them make health care decisions, find the disenrollment information understandable and useful.

Acknowledgments

The authors would like to thank Christina Smith Ritter, Elizabeth Goldstein, Christine Crofton, and Charles Darby for their ongoing support and assistance with this study. We gratefully acknowledge the efforts of Nancy Hampton, Julie Irish, Rosa Elena Garcia, Benjamin Nemo, Laura Miller, Amy Stern, Claudia Squire, and Abby Kessler for their research assistance. And, we appreciate the helpful comments of three anonymous reviewers on an earlier draft of this article.

Footnotes

Lauren D. Harris-Kojetin and Elizabeth M.F. Jaël are with RTI International. Fiona Smith is with the Barents Group of KPMG Consulting. Beth Kosiak is with the Agency for Healthcare Research and Quality (AHRQ). Julie Brown is with the RAND Corporation. The research for this article was supported through Intra-Agency Agreement Number IA98-48 between the Centers for Medicare & Medicaid Services (CMS) and AHRQ as part of Cooperative Agreement Number U18HS09218. The views expressed in this article are those of the authors and do not necessarily reflect the views of RTI International, Barents Group of KPMG Consulting, AHRQ, RAND, or CMS.

Reprint Requests: Lauren D. Harris-Kojetin, Ph.D., RTI International, 1615 M. St., NW., Suite 740, Washington, DC 20036. E-mail: lauren@RTI.org

References

  1. Dallek G, Swirsky L. Comparing Medicare HMOs: Do They Keep Their Members? Families USA Foundation; Washington, DC: 1997.
  2. Ernst & Young. Built to Last Means Built to Change: Medicare+Choice and the New Health Care Consumerism. Ernst & Young Publications for the Health Care Industry; Cleveland, OH: 1998. Score No. O-00200.
  3. Hargraves JL, Smith F, Stern A. Medicare Disenrollee Report Template Project: Cognitive Testing Findings Among Medicare Beneficiaries Enrolled in Managed Care Health Plans in Massachusetts and Texas. The Picker Institute; Boston, MA: 1999. Report to the Agency for Healthcare Research and Quality and the Health Care Financing Administration.
  4. Harris-Kojetin L, Bender R, Booske B, et al. Medicare CAHPS® 2000 Voluntary Disenrollment Reasons Survey: Findings from an Analysis of Key Beneficiary Subgroups, Draft Report. RTI International; Research Triangle Park, NC: Aug. 2002. Report to the Centers for Medicare & Medicaid Services.
  5. Harris-Kojetin LD, McCormack LA, Jaël EMF, Lissy KS. Beneficiaries' Perceptions of New Medicare Health Plan Choice Materials. Health Care Financing Review. 2001a Fall;23(1):21–35.
  6. Harris-Kojetin LD, McCormack LA, Jaël EMF, Lissy KS. Creating More Effective Health Plan Quality Reports for Consumers: Lessons from a Synthesis of Qualitative Testing. Health Services Research. 2001b Jul;36(3):447–476.
  7. Harris-Kojetin L, Miller L, Nemo B. Implementation of CAHPS® Medicare Disenrollment Survey Project: Medicare Managed Care Plan Voluntary Disenrollment Rates Testing. RTI International; Research Triangle Park, NC: 2000. Report to the Health Care Financing Administration.
  8. Harris-Kojetin LD, Jaël EMF, Hampton N. CAHPS® Medicare Disenrollee Report Template Development and Testing Project: First Round Focus Group Report. RTI International; Research Triangle Park, NC: 1999. Report to the Agency for Healthcare Research and Quality and the Health Care Financing Administration.
  9. Harris-Kojetin LD, Jaël EMF, Hampton N, Nemo B. Medicare Disenrollee Report Template Project Round 2 Testing: Findings and Recommendations from Cognitive Interviews with Medicare Beneficiaries. RTI International; Research Triangle Park, NC: 1999a. Report to the Agency for Healthcare Research and Quality and the Health Care Financing Administration.
  10. Harris-Kojetin LD, Jaël EMF, Hampton N, Miller L. Medicare Disenrollee Report Template Project Round 3 Testing: Findings and Recommendations from Cognitive Interviews with Medicare Beneficiaries. RTI International; Research Triangle Park, NC: 1999b. Report to the Agency for Healthcare Research and Quality and the Health Care Financing Administration.
  11. Hibbard JH, Jewett JJ, Engelmann S, Tusler M. Can Medicare Beneficiaries Make Informed Choices? Health Affairs. 1998 Nov-Dec;17(6):181–193. doi: 10.1377/hlthaff.17.6.181.
  12. Hibbard JH, Slovic P, Peters E, et al. Is the Informed-Choice Policy Approach Appropriate for Medicare Beneficiaries? Health Affairs. 2001 May-Jun;20(3):199–203. doi: 10.1377/hlthaff.20.3.199.
  13. Jaël EF, Squire S, Harris-Kojetin LD, et al. Medicare Disenrollee Report Template Project: Testing Rounds 4 and 5: August and November 2000 Cognitive Testing Findings. RTI International; Research Triangle Park, NC: 2000. Report to the Agency for Healthcare Research and Quality and the Health Care Financing Administration.
  14. McCormack LA, Harris-Kojetin LD, Burrus BB, et al. Providing Information to Help Medicare Beneficiaries Choose a Health Plan. Journal of Aging and Social Policy. 2001;12(2):49–72. doi: 10.1300/J031v12n02_04.
  15. McGee J, Kanouse DE, Sofaer S, et al. Making Survey Results Easy to Report to Consumers: How Reporting Needs Guided Survey Design in CAHPS®. Medical Care. 1999;37(3) Supplement:32–40. doi: 10.1097/00005650-199903001-00004.
  16. Newhouse JP. Switching Health Plans to Obtain Drug Coverage. Journal of the American Medical Association. 2000 Apr;283(16):2161–2162. doi: 10.1001/jama.283.16.2161.
  17. Rector TS. Exhaustion of Drug Benefits and Disenrollment of Medicare Beneficiaries From Managed Care Organizations. Journal of the American Medical Association. 2000 Apr;283(16):2163–2167. doi: 10.1001/jama.283.16.2163.
  18. Riley G, Ingber M, Tudor C. Disenrollment of Medicare Beneficiaries from HMOs. Health Affairs. 1997 Sep-Oct;16(5):117–124. doi: 10.1377/hlthaff.16.5.117.
  19. Schlesinger M, Druss B, Thomas T. No Exit? The Effect of Health Status on Dissatisfaction and Disenrollment from Health Plans. Health Services Research. 1999 Jun;34(2):547–576.
  20. Siegel M, Doner L. Marketing Public Health: Strategies to Promote Social Change. Aspen Publishers, Inc.; Gaithersburg, MD: 1998.
  21. U.S. General Accounting Office. Medicare: HCFA Should Release Data to Aid Consumers, Prompt Better HMO Performance. U.S. General Accounting Office; Washington, DC: Oct. 1996. GAO/HEHS-97-23.
  22. U.S. General Accounting Office. Medicare Managed Care: HCFA Missing Opportunities to Provide Consumer Information. U.S. General Accounting Office; Washington, DC: Apr. 1997. GAO/T-HEHS-97-109.
  23. U.S. General Accounting Office. Medicare: Many HMOs Experience High Rates of Beneficiary Disenrollment. U.S. General Accounting Office; Washington, DC: Apr. 1998. GAO/HEHS-98-142.
  24. Varner T, Christy J. Consumer Information Needs in a Competitive Health Care Environment. Health Care Financing Review. 1986 (Annual Supplement):99–104.
