Abstract
This article examines the reporting of Consumer Assessment of Healthcare Providers and Systems (CAHPS®) consumer experience data by sponsors, the organizations that fund data collection and decide how the information is summarized and disseminated. We found that sponsors typically reported comparative data publicly to consumers, employers, and/or purchasers. They presented health plan-level data in print and online at least annually, usually in combination with non-CAHPS® information. Many provided trend data, comparisons to individual plans, and summary scores. Most shared information consistent with known successful reporting practices. Areas meriting attention include: tailoring reports to specific audiences, assessing literacy, planning dissemination, educating vendors, and evaluating products and programs.
Introduction
The CAHPS® project was designed by AHRQ to develop and test surveys that elicit consumers' perceptions about their health care as well as reports that present this information to consumers, providers, and other audiences (Agency for Healthcare Research and Quality, 2006a). The reporting of consumer experience data is part of a national movement toward transparency of health care quality information in both the private and public sectors (McIntyre, Rogers, and Heier, 2001; Zema and Rogers, 2001; PricewaterhouseCoopers Health Research Institute, 2006; Davis and Haran, 2007; Bush, 2006; Leavitt, 2006). This movement has spurred reporting on quality (as measured by patient experience, clinical process, and outcome measures) and cost/efficiency of health plans, hospitals, medical groups, individual physicians, and other health care entities either publicly or to a limited audience of consumers or other stakeholders. The hope is that consumers will make better choices if they are provided with relevant and understandable data. In theory, these improved choices, combined with the competition that the disclosure of performance data engenders among providers, will lead to improved care and more efficient use of health system resources. Emerging evidence suggests that such disclosure can improve quality (Barr et al., 2006; Hibbard, Stockard, and Tusler, 2003, 2005; Thompson et al., 2003; Lindenauer, 2007).
CAHPS® surveys have been developed for ambulatory care (health plans, group practices, individual physicians), hospitals, and nursing homes. They are now the most thoroughly researched and widely used measures of consumer health care experience in the U.S. (Agency for Healthcare Research and Quality, 2006b). The health plan measures, the focus of this article, are currently used to assess care provided by health plans covering more than 130 million Americans, and are key measures of patient-centered care in use by CMS, the Federal Employees Health Benefit Program, the Department of Defense, and AHRQ (Agency for Healthcare Research and Quality, 2004; Goldstein et al., 2005).
Through CAHPS®, AHRQ has funded formative research on collecting and reporting consumer experience data to consumers and other audiences, and has identified key steps associated with the successful implementation of a reporting program (Spranca et al., 2000; Harris-Kojetin et al., 2001; McCormack et al., 2001; Goldstein and Fyock, 2001; Farley et al., 2002a,b; Short et al., 2002; Kanouse, Spranca, and Vaiana, 2004).
CAHPS® sponsors typically fund CAHPS® data collection efforts and decide how the survey information will be collected, summarized, and disseminated to consumers and other audiences. Sponsors include public and private sector employers, business coalitions, health plans, State and Federal agencies (e.g., Medicare and Medicaid), quality improvement (QI) organizations, labor unions, and nonprofit organizations. By implementing reporting in a way that is consistent with the goals of the CAHPS® program, sponsors act as purveyors, and potentially as change agents (Fixsen et al., 2005; Rogers, 2003).
Being an effective purveyor requires an organized and persistent approach to implementation so that barriers to success can be identified and overcome (Fixsen et al., 2005). It also requires tailoring an innovative program to particular audiences in a way that retains fidelity to programmatic goals (Rogers, 2003). Despite the important role sponsors play in summarizing and disseminating CAHPS® data to consumers and other audiences, little is known about how they do this, or what factors motivate and constrain their actions. Shortly before the CAHPS® program was launched, McCormack et al. (1996) conducted case studies of 24 organizations that developed and disseminated informational materials on health plans. More recently, limited information on individual sponsor's reporting activities has been reported in CAHPS® demonstration studies (Guadagnoli et al., 2000; Farley et al., 2002a,b). However, to our knowledge no previous studies have examined how a broad set of sponsors report CAHPS® data. Therefore, to better understand the motivations and incentives behind sponsors' collection and use of consumer experience data, and to inform future efforts to collect and report such data, we conducted interviews with CAHPS® sponsors.
Our three primary research questions were: (1) What CAHPS® consumer experience data do sponsors report?, (2) How do sponsors report this information?, and (3) What are sponsors' goals in reporting data? We also sought to identify reporting and dissemination practices that might be deficient, especially where AHRQ might be able to assist. Our research focused on the reporting of data from the CAHPS® Health Plan Survey, the most widely used survey at the time of our study. Given the CAHPS® research that has gone into establishing best reporting practices, we were interested in the extent to which these practices have been effectively implemented by sponsors.
Methods
Sampling and Recruitment
Sponsors were selected using a purposive sampling strategy that sought diversity with respect to: organization type (State Medicaid agencies, non-Medicaid State agencies, State-level business groups on health or employer coalitions, national-level organizations1, and Fortune 500 companies); and geographic region (West, Northeast, Midwest, and South). We did not include health plans, because they do not typically collect CAHPS® data to inform consumers, but instead to meet credentialing requirements and/or for QI or marketing purposes. During our recruitment process, we inquired as to the type(s) of report(s) produced by each organization, and assigned each organization to one of the following categories:
Public Reporter—An entity producing reports that are available to consumers or anyone else outside the organization (e.g., available on a publicly accessible Internet site).
Limited Reporter—An entity producing reports that are available within the sponsoring organization (e.g., to employees during open enrollment), or to an outside organization with which a CAHPS® sponsor does business (e.g., providers, plans, and regulators), but not to the broader public. Limited reporting could take many forms, such as the release of raw CAHPS® data, a written report, an oral presentation, or a Web posting on an internal intranet.
Dual Reporter—An entity producing both public and limited reports.
Our sample frame focused on CAHPS® sponsors that were separate, non-collaborating entities and that had collected data to inform health care choices either for internal or external audiences and reported it within the past 2 years; it consisted of 86 organizations. We identified these organizations through an extensive review of those that were listed as CAHPS® sponsors on the CAHPS® Users Network Web site, had submitted CAHPS® data to the National CAHPS® Benchmarking Database (NCBD)2 (Agency for Healthcare Research and Quality, 2006a,c), and/or were identified as sponsors by members of the CAHPS® consortium3. Although the identification process was not exhaustive, we found significant overlap in the sponsors mentioned by different sources, indicating substantial coverage of our target population.
Because of their importance to CAHPS® reporting efforts, we included with certainty six specific organizations: one business group, one non-Medicaid State agency, and four national level organizations. For the remainder of the sample, we randomly sampled 43 of the remaining 80 sponsors within each of the 20 combinations of 5 organizational categories and 4 geographic regions.
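The two-stage design described above (certainty inclusions plus random draws within organization-type by region strata) can be sketched as follows. This is an illustrative reimplementation, not the study's actual procedure; the frame, identifiers, and per-stratum counts below are hypothetical.

```python
# Illustrative sketch of a two-stage sample design: certainty units are
# included outright, and the remaining frame is randomly sampled within
# each organization-type x region stratum. The frame and identifiers
# below are hypothetical, not the study's actual sponsor list.
import random
from collections import defaultdict

def draw_sample(frame, certainty_ids, n_per_stratum, seed=0):
    """frame: list of dicts with keys 'id', 'org_type', 'region'."""
    rng = random.Random(seed)
    # Stage 1: include all certainty units.
    sample = [s for s in frame if s["id"] in certainty_ids]
    # Group the non-certainty units by stratum (org_type x region).
    strata = defaultdict(list)
    for s in frame:
        if s["id"] not in certainty_ids:
            strata[(s["org_type"], s["region"])].append(s)
    # Stage 2: draw up to n_per_stratum units at random from each stratum.
    for members in strata.values():
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

frame = [
    {"id": 1, "org_type": "Medicaid", "region": "West"},
    {"id": 2, "org_type": "Medicaid", "region": "West"},
    {"id": 3, "org_type": "Medicaid", "region": "South"},
    {"id": 4, "org_type": "Coalition", "region": "West"},
]
picked = draw_sample(frame, certainty_ids={4}, n_per_stratum=1)
print(sorted(s["id"] for s in picked))
```

The certainty unit (id 4) always appears in the sample, plus one random draw from each of the two remaining strata.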
Of the 49 organizations that we attempted to contact, 11 were eliminated as ineligible: 3 were duplicate listings, 6 did not collect CAHPS® data, and 2 collected but did not report CAHPS® data. Five of the remaining 38 organizations had non-working numbers or never returned telephone calls to establish eligibility. Of the 33 organizations identified as eligible, 25 agreed to be interviewed, yielding a 76-percent participation rate (25/33) and a 66-percent response rate (25/38). Two Medicaid agencies, three non-Medicaid State organizations, two business coalitions, one Fortune 500 company, and no national-level organizations refused. The Fortune 500 company that declined to be interviewed was the only eligible Fortune 500 company we approached that was not part of a business coalition. Participants were not paid to be interviewed, and those declining most commonly cited lack of time as the reason.
From June 2004 through February 2005, two RAND researchers conducted one-hour, semistructured telephone interviews with the 25 participating CAHPS® sponsors. Table 1 presents the final sample by geographic region and type(s) of report.
Table 1. Summary of Final Sample of CAHPS® Reporters Interviewed, by Organization: June 2004-February 2005.
Type of Organization | Geographic Region¹ | Type of Reporter²
---|---|---
State Medicaid Agencies (n = 8) | West | Dual
 | West | Dual
 | Northeast | Dual
 | Northeast | Dual
 | Midwest | Dual
 | Midwest | Dual
 | South | Public
 | South | Public
Non-Medicaid State Agencies (n = 9) | West | Dual
 | West | Public
 | West | Limited
 | Northeast | Public
 | Northeast | Public
 | Midwest | Public
 | South | Public
 | South | Public
 | South | Public
Business Coalitions (n = 4) | West | Dual
 | West | Dual
 | Midwest | Public
 | South | Limited
National-Level Organizations (n = 4) | NA | Public
 | NA | Dual
 | NA | Dual
 | NA | Dual

¹ Geographic regions were based on U.S. Census Bureau classifications.

² Categories of reporting: dual (an entity producing both public and limited reports); public (an entity producing reports available to consumers); and limited (an entity producing reports available only within the sponsoring organization).

NOTES: CAHPS® is Consumer Assessment of Healthcare Providers and Systems. NA is not applicable (national-level organizations were not assigned a geographic region).
SOURCE: Teleki, S., Kanouse, D.E., Elliott, M.N., Hiatt, L., de Vries, H., and Quigley, D.D., RAND Health and Pardee Rand Graduate School.
Data Collection, Management, and Analysis
Interviews were conducted using a semistructured guide consisting of 30 core questions. Topic areas addressed included reasons for reporting, audience determination, report content, report format, dissemination, reporting experience, and evaluation. Most core questions were followed by probes. Respondents were asked to share their thoughts and experiences in an unstructured manner.
If a sponsor created more than one type of CAHPS® report, separate answers were recorded for each. All interviews were audiotaped with permission, and interviewers reviewed the audiotapes to prepare notes. Each interviewer reviewed the notes for the interviews he or she conducted and independently created codes; the two interviewers then conferred to produce a standardized set of codes and used it to code the telephone interviews.
To assess intercoder agreement, we randomly selected six interview summaries for double-coding and examined agreement in classification of responses. For example, for the item “What did you hope to accomplish by reporting CAHPS® data?,” for which more than one reply was permitted, we assessed intercoder agreement in assigning each of four standardized response codes to categorize a given answer: (1) help consumers make informed choices/distinguish among plans, (2) external assessment for accountability/policymaking, (3) quality assessment and improvement, and (4) contracting.
Because only six interviews were double-coded, we calculated pooled Cohen's (1960) kappas across each of 12 topic areas. An average of 181 items per topic area for each of 6 double-coded interviews yielded an average of 1,086 observations for each reported kappa statistic, providing sufficient precision to assess interrater agreement. Pooled kappas were calculated by separately averaging the observed and expected agreement proportions across items and then computing a single kappa from these pooled values. The median pooled kappa across the 12 sections was 0.78, ranging from 0.63 for the section regarding dissemination to 0.94 for the section of followup questions regarding limited-audience reporting, suggesting substantial agreement for all topic areas (Landis and Koch, 1977; Schouten, 1993). Descriptive statistics were calculated to summarize results. All study methods and materials were approved by RAND's Human Subjects Protection Committee.
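The pooling just described can be sketched as follows. This is an illustrative reimplementation, not the study's analysis code, and the coder assignments below are hypothetical.

```python
# Illustrative sketch: pooled Cohen's kappa for one topic area. Following
# the approach described above, observed and expected agreement proportions
# are averaged across items first, and a single kappa is computed from the
# pooled values (Schouten-style pooling). Data here are hypothetical.
from collections import Counter

def kappa_components(ratings_a, ratings_b):
    """Return (observed, expected) agreement for one item coded by two
    raters across several interviews."""
    n = len(ratings_a)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: sum over codes of the product of each rater's
    # marginal proportion for that code.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    codes = set(freq_a) | set(freq_b)
    p_exp = sum((freq_a[c] / n) * (freq_b[c] / n) for c in codes)
    return p_obs, p_exp

def pooled_kappa(items):
    """items: list of (ratings_a, ratings_b) pairs, one pair per item."""
    components = [kappa_components(a, b) for a, b in items]
    mean_obs = sum(o for o, _ in components) / len(components)
    mean_exp = sum(e for _, e in components) / len(components)
    return (mean_obs - mean_exp) / (1 - mean_exp)

# Hypothetical data: two coders' yes/no (1/0) assignments for three items
# across six double-coded interviews.
items = [
    ([1, 1, 0, 0, 1, 0], [1, 1, 0, 1, 1, 0]),
    ([0, 0, 0, 1, 1, 1], [0, 0, 0, 1, 1, 1]),
    ([1, 0, 1, 0, 0, 0], [1, 0, 1, 0, 1, 0]),
]
print(round(pooled_kappa(items), 3))  # 0.778
```

Pooling stabilizes the estimate when, as here, each individual item contributes only a handful of double-coded observations.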
Results
Content Reported by Sponsors
Table 2 summarizes findings regarding the content of reports. In general, sponsors indicated that educating consumers about the importance of health care quality and improving quality of care were their primary goals in reporting CAHPS® information. These goals are consistent with those of AHRQ and the CAHPS® consortium. These sponsors used CAHPS® data for internal purposes to monitor and improve the quality of health care that their organizations (or those they contract with) provide, and to meet contract or waiver requirements.
Table 2. Content of Reports Produced, by CAHPS® Sponsors: June 2004-February 2005.
Content | Percent | Proportion
---|---|---
Type of Data | |
Both CAHPS® and Non-CAHPS® Data | 84 | (21/25)
CAHPS® Data Exclusively | 16 | (4/25)
Health Plan-Level | 92 | (23/25)
Trend Data | 48 | (12/25)
Comparison Groups | 91 | (22/24)¹
Composite Measures | 70 | (17/24)¹
Did Not Report CAHPS® Supplemental Items | 68 | (17/25)

¹ This denominator is 24 (instead of 25) because one sponsor interviewed did not produce a written report, so questions about written (i.e., hard copy) report design did not apply.
NOTE: CAHPS® is Consumer Assessment of Healthcare Providers and Systems.
SOURCE: Teleki, S., Kanouse, D.E., Elliott, M.N., Hiatt, L., de Vries, H., and Quigley, D.D., RAND Health and Pardee Rand Graduate School.
Type of Data
Most interviewed sponsors (84 percent or 21/25) produced at least one report containing both CAHPS® and non-CAHPS® information (e.g., Health Plan Employer Data and Information Set® measures, general enrollment, and plan information such as participating providers and clinic locations). They usually intended these more comprehensive reports for a public, consumer audience to use in selecting a health plan. The small number of sponsors that reported CAHPS® data exclusively (16 percent or 4/25) were either Medicaid or non-Medicaid State organizations, typically responding to a mandate. A challenge noted by sponsors reporting both CAHPS® and other types of data in a single report was making the presentation of consumer experience data congruent with clinical performance data. The burden of survey data collection for the sponsor was also mentioned.
Health Plan Level
Nearly all sponsors (92 percent or 23/25) reported CAHPS® data at the health plan level in at least one of the reports they produced. This decision was driven primarily by data availability: the CAHPS® instrument most commonly used at the time of our interviews focused on health plans. Sponsors said that while health plan-level data provided a good starting point, interest had shifted increasingly to other levels of the health system, especially to individual physicians.
Trend Information
Almost one-half (48 percent or 12/25) of the sponsors that reported CAHPS® data produced at least one report with trend information (typically 2 or 3 years of data), with the goal of improving quality by allowing audiences to track changes in performance over time. Those that included trend data in only some of their reports did so to keep it simple for a specific audience that might be overwhelmed by large amounts of data (e.g., elderly consumers or those with lower levels of education). Sponsors that never displayed trend data gave the following reasons: a desire to keep all reports for all audiences simple, a belief that comparisons over time were not fair and/or useful, and a lack of comparable data across years.
Comparison Groups
The majority of CAHPS® sponsors (91 percent or 22/24)4 used comparison groups when reporting data. The most common comparison group tended to be other, individual health plans, followed by national benchmarks. These sponsors indicated they used comparisons to help consumers make choices and for QI; they also said that regional data were more useful than local or national data for these purposes. Local data were viewed as being too narrow, and national data as too broad. For the most part, those using comparison groups said the challenges they faced were due to inadequate data (i.e., not enough to allow for comparisons and establishing benchmarks); some also noted resistance among those being measured to comparisons with others.
Composite Measures
Most sponsors (70 percent or 17/24) said that they produced at least one comprehensive report containing both CAHPS® composite measures and individual survey items to meet the needs of their most detail-oriented audience. Composites are summary measures; for example, the composite “getting needed care” summarizes two survey questions about how easy it was for patients to get: (1) appointments with specialists and (2) the care, tests, or treatments they needed through their health plan. Whether presented together in one report or not, composites were typically used to simplify the report message and to conserve space, whereas individual items were used to make the report both complete and flexible in meeting the varied needs of end-users. Sponsors noted tradeoffs: on the one hand, it may be challenging to explain how composites are constructed and/or why scores on individual items may conflict with composite scores; on the other hand, pages of individual items may overwhelm some audiences.
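As a minimal sketch of composite construction (using an assumed 0-100 scoring scale and simple averaging, not the official CAHPS® scoring rules), a composite can be computed as the mean of its member items' scores:

```python
# Minimal sketch of a composite measure: the composite is the simple
# average of its member items' per-plan mean scores. The 0-100 scale and
# item names are assumptions for illustration, not official CAHPS scoring.

def composite_score(item_scores):
    """item_scores: per-item mean scores (0-100) for one health plan;
    missing items (None) are skipped."""
    valid = [s for s in item_scores if s is not None]
    return sum(valid) / len(valid)

# Hypothetical "getting needed care" composite built from its two items:
# ease of getting specialist appointments, and ease of getting the care,
# tests, or treatments needed through the plan.
plan_items = {"specialist_appointments": 82.0, "needed_care_tests": 78.0}
print(composite_score(plan_items.values()))  # 80.0
```

The tradeoff sponsors described is visible even in this toy example: the single composite value (80.0) is easier to display, but it hides the gap between the two underlying items.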
Supplemental Items
The majority of sponsors (68 percent or 17/25) did not report CAHPS® supplemental items.5 The main reasons given for not doing so were a desire for a shorter report, the fact that they already had a history of using and reporting non-CAHPS® supplemental items (making it unappealing to change measure sets), and/or a lack of demand from constituents for the CAHPS® supplemental items. Sponsors reporting CAHPS® supplemental items said they did so to meet the needs of their target audiences interested in children and/or Medicaid beneficiaries. One obstacle to reporting supplemental items was that not all entities collected the same ones, making it impossible to report comparable measures across organizations.
Report Format and Distribution Considerations of Sponsors
Table 3 summarizes findings regarding report characteristics and dissemination considerations. The sponsors we interviewed noted the following tendencies concerning the main areas we asked about.
Table 3. Report Characteristics and Distribution Considerations of CAHPS® Sponsors: June 2004-February 2005.
Ways Data Were Reported | Percent¹ | Proportion
---|---|---
Intended Audience | |
Public Only | 44 | (11/25)
Limited Audience Only | 8 | (2/25)
Both Public and Limited Audiences | 48 | (12/25)
Media | |
Web-Based | 100 | (25/25)
Written | 96 | (24/25)
Data Files | 40 | (10/25)
Frequency of Reporting | |
At Least One Report within Past 2 Years | 88 | (22/25)
At Least One Report Annually | 80 | (20/25)
Timing of Report Release | |
Fall | 52 | (13/25)
No Specific/Consistent Month | 28 | (7/25)
Literacy | |
Assessed Literacy of at Least One Report | 54 | (13/24)²
Among Those Assessing Literacy | |
With Literacy Software Program | 46 | (6/13)
By Internal Staff | 38 | (5/13)
With Some Other Method (e.g., Focus Group) | 23 | (3/13)
Translation | |
Translation of at Least One Report into a Foreign Language | 33 | (8/24)²
Hired Vendor to do Translation(s) | 100 | (8/8)
Dissemination of Report | |
Notified Audience about at Least One Report | 76 | (19/25)
Distributed Report by Regular Mail | 68 | (17/25)
Distributed Report on Web Site | 60 | (15/25)
Distributed Report by E-mail | 28 | (7/25)
Evaluation of Reporting Process | |
Conducted Any Type of Evaluation | 56 | (14/25)
Hired Vendor to Assist with Evaluation | 71 | (10/14)

¹ Categories may not sum to 100 percent because of one or more of the following reasons: rounding error, the response categories were not mutually exclusive, several distinct questions are being reported, and/or only the most common responses are reported.

² Denominator is 24 (instead of 25) because one sponsor we interviewed did not produce a written report, so questions about written (i.e., hard copy) report design did not apply.
NOTE: CAHPS® is Consumer Assessment of Healthcare Providers and Systems.
SOURCE: Teleki, S., Kanouse, D.E., Elliott, M.N., Hiatt, L., de Vries, H., and Quigley, D.D., RAND Health and Pardee Rand Graduate School.
Audience
Almost one-half of the sponsors (48 percent or 12/25) produced at least one report for the public and one for a limited audience (dual reporters). Eleven (44 percent) only produced reports intended for the public, and the rest (8 percent or 2/25) restricted their reports to a limited audience. As expected given their public charge, the overwhelming majority of Medicaid and non-Medicaid State agencies produced at least one public report; most business coalitions and national-level entities produced both types of reports.
Regardless of whether sponsors released their reports publicly or only to a limited group, most (and especially the public reporters) said they produced at least one report for an entity responsible for making a purchasing decision (e.g., consumer, employer, or purchaser); to a lesser extent, sponsors directed reports to health care providers (e.g., health plans, medical groups), policymakers and/or regulators. However, some sponsors had several reports for which the audience was everyone (i.e., generic reports not geared to the needs of any specific audience).
Sponsors did not report significant challenges in determining who their audience was, but approximately one-quarter changed their target audience over time, almost always expanding it.
Number of Reports
The 25 reporters of CAHPS® data we interviewed produced one to five reports per year, with Medicaid, non-Medicaid State agencies, and business coalitions averaging two, and national-level organizations averaging four.
Media
All 25 of the reporting sponsors interviewed reported CAHPS® information via Web-based reports, and all but 1 (96 percent) produced written reports. Forty percent of these sponsors (10/25) also produced data files for submission to the NCBD.
Frequency
Sponsors reporting CAHPS® data had done so fairly recently and with some regularity. Most (88 percent or 22/25) had reported CAHPS® data to a limited audience, the public, or both within the previous 2 years, and most (80 percent or 20/25) had produced at least one report annually. Common drivers of reporting frequency were the availability of data, stipulations of a mandate/waiver, and budget limitations.
Timing
About one-half of the organizations interviewed (52 percent or 13/25) reported in the fall, and more than one-quarter (28 percent or 7/25) could not pinpoint a specific, consistent month. Most released their reports to coincide with open enrollment; some said they reported CAHPS® data as soon as they became available.
Literacy
Of the sponsors producing written reports, about one-half (54 percent or 13/24) accounted for the literacy level of their audience(s). They did so primarily by using computer software to estimate reading level (46 percent or 6/13), and/or submitting the report for review by internal staff (38 percent or 5/13). Some (23 percent or 3/13) used additional methods, such as focus groups, to assess the reading level of their report(s). The main reasons cited for not accounting for literacy levels were the sponsor's belief that the report was already geared to the right reading level and/or that the intended audience was highly educated.
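Reading-level software of the kind mentioned above typically implements a standard readability formula. A minimal sketch using the Flesch-Kincaid grade-level formula might look like the following; the syllable counter is a crude vowel-group heuristic, so results should be treated as rough estimates rather than as what any particular commercial product computes.

```python
# Illustrative sketch: estimating the reading grade level of report text
# with the Flesch-Kincaid formula. The syllable counter is a naive
# vowel-group heuristic; commercial literacy software may differ.
import re

def count_syllables(word):
    """Approximate syllables as runs of vowels (minimum one per word)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # Sentences are approximated by runs of terminal punctuation.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words)) - 15.59)

sample = "We compared health plans. Members rated their care."
print(round(flesch_kincaid_grade(sample), 1))
```

Short sentences and short words drive the estimated grade level down, which is the property that makes such scores a quick (if imperfect) check on whether a report is accessible to lower-literacy readers.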
Foreign Language
Of those producing written reports, one-third (8/24) translated at least one report from English into another language. Spanish was by far the most common; other languages in decreasing order of frequency included: Chinese, Vietnamese, Russian, and Korean. All sponsors engaged in translation used outside vendors; some also relied on review by internal staff. All of the reports translated were public reports. A frequently highlighted translation challenge was accurately conveying quality-of-care concepts in other languages. Those not engaged in translation cited lack of demand or lack of funds as reasons for not doing so.
Dissemination
Approximately three-quarters (76 percent or 19/25) of sponsors produced at least one report for which they engaged in promotional activities to increase awareness among their intended audience prior to the report's release. Those not doing so said they did not believe such advertising was helpful. Those that did promote the upcoming release of their report(s) typically used the media (i.e., television, radio, press conferences and/or releases), mailed announcements, Web notices, e-mail alerts, and/or special events (e.g., open enrollment fairs). These methods were viewed as being efficient, affordable, and/or having a wide reach.
To distribute a report, the sponsors we interviewed most often used regular mail (68 percent or 17/25) and their Web sites (60 percent or 15/25), followed by e-mail (28 percent or 7/25). To a lesser extent, they made reports available on request or at meetings/seminars. More than one-third (36 percent or 9/25) used both regular mail and the Internet for distribution. The main factors influencing the choice of distribution method were ease for the sponsor, cost, and broad access for the end-user.
Many sponsors noted that they did not usually develop a comprehensive dissemination plan as part of their reporting process. Rather, their focus was on report production, with dissemination viewed as a fairly straightforward task that did not require significant preparation. The dissemination challenges mentioned most often were cost and making the information exciting or newsworthy for their audiences, especially since report findings usually did not change substantially year-to-year.
Evaluation
More than one-half (56 percent or 14/25) of the sponsors we interviewed conducted some type of evaluation after the release of their CAHPS® report(s). Those that did were usually Medicaid and national-level organizations wanting to learn whether the information was useful to consumers (e.g., whether consumers understood the data as presented, whether the data were helpful to them in making a health care decision). Most of these entities (71 percent or 10/14) hired an outside vendor for assistance in conducting focus groups and interviews. Evaluations were most often paid for through departmental funds (79 percent or 11/14); some were supported by external grants (29 percent or 4/14).
All sponsors that were far enough along in their evaluation process to comment on results found the information gathered to be useful. Nearly three-quarters (72 percent or 8/11) of those who had received feedback said that they made changes in order to create a more user-friendly report. For example, some redesigned data displays to make them more understandable; others added text to explain results and how to use data in decisionmaking.
Those that did not conduct an evaluation cited lack of audience interest (i.e., no one to measure) (45 percent or 5/11), their belief that an evaluation was unnecessary (45 percent or 5/11), and/or lack of funds (27 percent or 3/11). Some specifically requested help from AHRQ in finding simple, cost-effective ways to assess consumer reporting needs.
Study Limitations
This study has several limitations. First, generalizability is limited by purposive rather than probability sampling. The relatively small sample size provides a general picture of sponsor reporting activities, but does not permit precise estimation of the frequency of specific behaviors. Nonetheless, our sample includes many large and important sponsors that collectively provide reports to millions of consumers and represents random sampling from an incomplete but far-reaching frame. Second, our study relies on informant self-reports; answers may be incomplete or inaccurate due to lack of knowledge and/or recall bias. To diminish this concern, we made significant efforts to identify and interview the person(s) in each organization most knowledgeable regarding CAHPS® reporting. Third, because some sponsors may not have been identified in our canvassing and others declined to participate, non-response bias is possible; non-participating organizations may differ systematically in their practices from those that agreed to participate. Concerns regarding the lack of Fortune 500 companies in our sample are mitigated by the fact that such companies are often part of business coalitions, which our sample does include. Fourth, our study focuses entirely on the reporting of CAHPS® data on health plans. Reporting approaches and experiences related to other entities, such as hospitals or individual clinicians, may differ. Lastly, our sample comprises mostly sponsors at later stages of adoption and use of CAHPS® data for reporting purposes. The approaches taken and results obtained by recent adopters, whose numbers are not known, could differ in important ways. Overall, these biases may be such that our sample includes those sponsors who are most experienced at reporting. If so, adoption of best practices in reporting may be less widespread than is estimated here.
Discussion
The majority of CAHPS® sponsors we interviewed engaged in generally sound reporting practices. Nonetheless, there were areas where education in best practices, assistance in improving the production and dissemination of CAHPS® reports, or augmented resources could help sponsors become more successful change agents. Making such improvements is particularly important today, given market movement toward consumer-directed health plans, product tiering, and pay-for-performance—all of which tend to rely on patient experience measures, usually CAHPS®. These new models demand that consumers make their own choices and manage their own health care dollars, and require purchasers to measure and compare the performance of health care providers. Sponsors play a critical role in ensuring that the reports they produce are effective for consumer decisionmaking and provider QI.
We found that many sponsors now routinely report CAHPS® data. On average, sponsors produce two CAHPS® reports each year, including at least one released publicly. Thus, patient experience data—often in conjunction with other quality of care information—are being shared in a timely manner with consumers and other decisionmakers to inform their health care choices. Often these reports are also used in internal, sponsor-supported QI efforts. The following are areas that could help sponsors become more effective change agents.
Report Tailoring
Although sponsors are producing more than one report per year, many said that their reports were intended for everyone rather than tailored to the needs of specific audiences. Appropriate tailoring to an audience through intelligent segmentation is a key feature of successful public health communication (Slater, 1995; Kanouse, Spranca, and Vaiana, 2004). The failure to tailor reports to well-defined audiences could result in failure to communicate information clearly or even to engage the audience's interest. Future research should explore how sponsors can improve tailoring without assuming untenable burdens. Tailoring may be especially important, and especially challenging, for reports that integrate CAHPS® data with other decision-relevant information.
Literacy Assessments
Almost one-half of the sponsors interviewed did not take the literacy of their audience into account or assess the reading level of their reports. As this is critical for ensuring that a report will effectively communicate to its audience, the CAHPS® consortium may want to consider additional education/outreach. Emphasis should be placed on the importance of assessing the reading level of reporting materials and the benefits of achieving as low a reading level as possible, so that the information is accessible to consumers with lower levels of literacy, and is readily processed without unnecessary effort by consumers with higher literacy levels.
Dissemination
Although many sponsors shared their reports with others, most did not actively plan the dissemination. Given that developing a dissemination plan early on—including a strategy for the notification and promotion of the reports as well as the actual distribution of them—is likely to be a critical part of a successful reporting effort (National Cancer Institute, 2002; Kanouse, Spranca, and Vaiana, 2004), the omission of this key step by many sponsors is of concern and merits further attention.
Evaluation
Conducting evaluations remains a low priority for many sponsors, despite the fact that examining the impact of a report and ensuring that it meets the needs of the intended audience(s) is vital to ensuring a successful reporting effort (Atkin and Freimuth, 1989; Kanouse, Spranca, and Vaiana, 2004). Sponsors were hindered not only by lack of funds, but also by lack of knowledge of how to conduct a meaningful evaluation with limited resources. This is an area where researchers and those in the field could work together to identify and/or develop practical tools. However, even if such tools are developed, sponsors will need to be educated about the critical role of evaluation in improving the reporting process and creating reports that work well for their intended audiences, as many sponsors we interviewed did not view evaluation as useful.
Supplemental Items
The majority of sponsors interviewed did not report CAHPS® supplemental items. Often, such items were not available for many of the organizations addressed in the sponsor's report. Supplemental items were also considered more relevant for internal QI purposes than for external reporting. Many sponsors want drill-down measures to use for QI purposes, as well as CAHPS® surveys that move beyond the health plan to examine patient experience with other levels of the health care system. AHRQ has taken some steps to address these needs by developing new tools (Quigley et al., 2006; Agency for Healthcare Research and Quality, 2006b). Nevertheless, further exploration is warranted to determine whether there are areas where sponsors might be especially motivated to collect and report such items.
Vendor Engagement
Our results suggest that vendors are important, often-overlooked stakeholders to consider for targeted educational outreach about best reporting practices. Many sponsors hired vendors assuming they were knowledgeable about report card design and dissemination. However, it is not clear whether these vendors were familiar with AHRQ's recommended practices for reporting CAHPS® data. It may be useful to provide educational outreach geared specifically to vendors and encourage sponsors to discuss these issues with their vendors.
It is encouraging that most sponsors, both large and small, have arrived at ways of reporting CAHPS® data that are remarkably similar; this suggests that CAHPS® is achieving a level of measurement and presentation uniformity that meets the needs of a variety of sponsors. However, the constraints that sponsors face often mean they are just going through the motions of getting reports out, as opposed to engaging in activities that may lead to breakthroughs in effective reporting strategies. There is a danger that in becoming more routinized in their reporting practices, sponsors will overlook opportunities to learn and improve. Future research should assess whether the lack of substantial changes in reports from year to year is appropriate or represents missed opportunities. For example, does the lack of substantial differences in CAHPS® data from year to year make the reporting of annual CAHPS® rates unhelpful? Would collecting and reporting trend data be more compelling?
Although budget and time constraints are always issues for busy organizations with many responsibilities, CAHPS® sponsors may benefit from targeted efforts by AHRQ, the CAHPS® consortium, and others to help them focus on shortcomings in their reporting activities. Attention to areas where improvement seems possible has the potential to strengthen the content, design, and reach of these reports—which in turn may better inform decisionmakers at all levels and improve quality of care, the ultimate goals of the reporting process.
Acknowledgments
The authors gratefully acknowledge James Garulski for recruiting; Mark Spranca for help with study design; and Ron D. Hays for review of the article.
Footnotes
The authors are with RAND Health and Pardee RAND Graduate School. The research in this article was supported by the Agency for Healthcare Research and Quality (AHRQ) under Cooperative Agreement 5 U18 HS09204. Marc Elliott is supported in part by the Centers for Disease Control and Prevention (CDC) under Cooperative Agreement U48/DP000056. The statements expressed in this article are those of the authors and do not necessarily reflect the views or policies of RAND Health, Pardee RAND Graduate School, AHRQ, CDC, or the Centers for Medicare & Medicaid Services (CMS).
National-level organizations include the Federal Government, private sector, and regulatory entities operating nationwide that are responsible for the development, evaluation, production, and/or implementation of health care quality reports.
A national repository for data from the CAHPS® family of surveys. Its primary purpose is to facilitate comparisons of CAHPS® survey results by and among survey sponsors. Data are submitted voluntarily by sponsors, and the current database contains 8 years of data from the CAHPS® Health Plan Survey.
A team of researchers that has worked with AHRQ over the past 10 years to develop the CAHPS® surveys and reporting tools.
The denominator is 24 (instead of 25) because one sponsor we interviewed did not produce a printed report, and some questions were only applicable to sponsors producing such reports.
CAHPS® supplemental items are optional items that add to the core survey content (e.g., provider communication), address topics not captured by the core survey (e.g., prescription medicines), or address care for subpopulations (e.g., children).
For small sponsors, constraints were typically related to resources, such as funding and staffing. For large sponsors, the challenges were often political.
Reprint Requests: Stephanie Teleki, Ph.D., RAND Health, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138. E-mail: teleki@rand.org
References
- Agency for Healthcare Research and Quality. Market Research for Ambulatory CAHPS®. Rockville, MD: 2004.
- Agency for Healthcare Research and Quality. CAHPS® Homepage. Internet address: http://www.cahps.ahrq.gov/ (Accessed 2006a.)
- Agency for Healthcare Research and Quality. CAHPS® Survey Products Web Site. Internet address: www.cahps.ahrq.gov (Accessed 2006b.)
- Agency for Healthcare Research and Quality. National CAHPS® Benchmarking Database (NCBD). Internet address: http://www.cahps.ahrq.gov/content/ncbd/ncbd_Intro.asp?p=105&s=5 (Accessed 2006c.)
- Atkin CK, Freimuth V. Formative Evaluation Research in Campaign Design. In: Rice RE, Atkin CK, editors. Public Communication Campaigns. Second Edition. Sage Publications; Thousand Oaks, CA: 1989.
- Barr JK, Giannotti TE, Sofaer S, et al. Using Public Reports of Patient Satisfaction for Hospital Quality Improvement. Health Services Research. 2006 Jun;41(3 Pt. 1):663–682. doi: 10.1111/j.1475-6773.2006.00508.x.
- Bush GW. Executive Order: Promoting Quality and Efficient Health Care in Federal Government Administered or Sponsored Health Care Programs. The White House; Aug 22, 2006. Internet address: http://www.whitehouse.gov/news/releases/2006/08/20060822-2.html (Accessed 2007.)
- Cohen J. A Coefficient of Agreement for Nominal Scales. Educational and Psychological Measurement. 1960;20:37–46.
- Davis K, Haran C. The Commonwealth Fund's Top 10 Health Policy Stories of 2006. The Commonwealth Fund; Jan 2007.
- Farley DO, Elliott MN, Short PF, et al. Effect of CAHPS® Performance Information on Health Plan Choices by Iowa Medicaid Beneficiaries. Medical Care Research and Review. 2002a Sep;59(3):319–336. doi: 10.1177/107755870205900305.
- Farley DO, Short PF, Elliott MN, et al. Effects of CAHPS® Health Plan Performance on Plan Choices by New Jersey Medicaid Beneficiaries. Health Services Research. 2002b Aug;37(4):985–1007. doi: 10.1034/j.1600-0560.2002.62.x.
- Fixsen DL, Naoom SF, Blase KA, et al. Implementation Research: A Synthesis of the Literature. University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; Tampa, FL: 2005. FMHI Publication Number 231.
- Goldstein E, Farquhar M, Crofton C, et al. Measuring Hospital Care from the Patients' Perspective: An Overview of the CAHPS® Hospital Survey Development Process. Health Services Research. 2005 Dec;40(6):1977–1995. doi: 10.1111/j.1475-6773.2005.00477.x.
- Goldstein E, Fyock J. Reporting of CAHPS® Quality Information to Medicare Beneficiaries. Health Services Research. 2001 Jul;36(3):477–488.
- Guadagnoli E, Epstein AM, Zaslavsky A, et al. Providing Consumers with Information About the Quality of Health Plans: The Consumer Assessment of Health Plans Demonstration in Washington State. Joint Commission Journal on Quality Improvement. 2000 Jul;26(7):410–420. doi: 10.1016/s1070-3241(00)26034-3.
- Harris-Kojetin LD, McCormack LA, Jael EM, et al. Beneficiaries' Perceptions of New Medicare Health Plan Choice Print Materials. Health Care Financing Review. 2001 Fall;23(1):21–35.
- Hibbard JH, Stockard J, Tusler M. Does Publicizing Hospital Performance Stimulate Quality Improvement Efforts? Health Affairs. 2003 Mar–Apr;22(2):84–94. doi: 10.1377/hlthaff.22.2.84.
- Hibbard JH, Stockard J, Tusler M. Hospital Performance Reports: Impact on Quality, Market Share, and Reputation. Health Affairs. 2005 Jul–Aug;24(4):1150–1160. doi: 10.1377/hlthaff.24.4.1150.
- Kanouse DE, Spranca M, Vaiana M. Reporting About Health Care Quality: A Guide to the Galaxy. Health Promotion Practice. 2004 Jul;5(3):222–231. doi: 10.1177/1524839904264511.
- Landis JR, Koch GG. The Measurement of Observer Agreement for Categorical Data. Biometrics. 1977 Mar;33(1):159–174.
- Leavitt MO. Better Care, Lower Cost: Prescription for a Value-Driven Health System. Department of Health and Human Services, Office of the Secretary; Washington, DC: 2006.
- Lindenauer PK, Remus D, Roman S, et al. Public Reporting and Pay for Performance in Hospital Quality Improvement. New England Journal of Medicine. 2007 Feb;356(5):486–496. doi: 10.1056/NEJMsa064964.
- McCormack LA, Garfinkel SA, Hibbard JH, et al. Health Plan Decision Making With New Medicare Information Materials. Health Services Research. 2001 Jul;36(3):531–554.
- McCormack LA, Garfinkel SA, Schnaier JA, et al. Consumer Information Development and Use—Consumer Information in a Changing Health Care System. Health Care Financing Review. 1996 Fall;18(1):15–30.
- McIntyre D, Rogers L, Heier EJ. Overview, History, and Objectives of Performance Measurement. Health Care Financing Review. 2001 Spring;22(3):7–21.
- National Cancer Institute. Making Health Communication Programs Work. U.S. Department of Health and Human Services; 2002. NIH Publication Number 02-5145.
- PricewaterhouseCoopers Health Research Institute. Top Seven Health Industry Trends in '07. PricewaterhouseCoopers LLP; 2006. Internet address: http://pwchealth.com/cgi-local/hcregister.cgi?link=reg/topseven.pdf (Accessed 2007.)
- Quigley DD, Farley DO, Brown JA, et al. Development of Supplemental Quality Improvement Items for the Consumer Assessment of Healthcare Providers and Systems (CAHPS®). RAND Corporation; Santa Monica, CA: 2006.
- Rogers EM. Diffusion of Innovations. Fifth Edition. Free Press; New York, NY: 2003.
- Schouten HJ. Estimating Kappa from Binocular Data and Comparing Marginal Probabilities. Statistics in Medicine. 1993 Dec;12(23):2207–2217. doi: 10.1002/sim.4780122306.
- Short PF, McCormack L, Hibbard J, et al. Similarities and Differences in Choosing Health Plans. Medical Care. 2002 Apr;40(4):289–302. doi: 10.1097/00005650-200204000-00005.
- Slater MD. Choosing Audience Segmentation Strategies and Methods for Health Communication. In: Maibach E, Parrott RL, editors. Designing Health Messages: Approaches from Communication Theory and Public Health Practice. Sage Publications; Thousand Oaks, CA: 1995. pp. 186–198.
- Spranca M, Kanouse DE, Elliott M, et al. Do Consumer Reports of Health Plan Quality Affect Health Plan Selection? Health Services Research. 2000 Dec;35(5 Pt 1):933–947.
- Thompson JW, Pinidiya SD, Ryan KW, et al. Health Plan Quality Data: The Importance of Public Reporting. American Journal of Preventive Medicine. 2003 Jan;24(1):62–70. doi: 10.1016/s0749-3797(02)00569-x.
- Zema CL, Rogers L. Evidence of Innovative Uses of Performance Measures Among Purchasers. Health Care Financing Review. 2001 Spring;22(3):35–47.