Abstract
Background
The complexity of health information frequently exceeds patients’ ability to understand and use it. Improving hospital communication has the potential to improve the quality of care.
Objective
To develop a set of items to supplement the CAHPS® Hospital Survey (HCAHPS) to assess how well hospitals communicate health information to inpatients.
Methods
We conducted an environmental scan and obtained input from stakeholders to identify domains and survey items, and cognitively tested the item set in English and Spanish. We administered the items to a random sample of adult hospital patients using mail and telephone data collection. We estimated item-scale correlations for hypothesized multi-item composites, internal consistency reliability for the composites, and correlations among the composites. We also regressed the global rating of the hospital and the “would you recommend the hospital” item on the existing HCAHPS core composites and the new composites to evaluate the unique contribution of each to these “bottom-line” measures.
Results
A total of 1,013 surveys were obtained (55% response rate). With some exceptions, correlations between items and scales were consistent with the hypothesized item clusters. Three composites were identified: 1) communication about tests; 2) communication about how to care for self and medicines; and 3) communication about forms.
Conclusions
This study provides support for the measurement properties of the HCAHPS Item Set for Addressing Health Literacy. It can serve as both a measure of whether health care providers in a hospital setting have communicated effectively with their patients and as a tool for quality improvement.
Keywords: Health literacy, HCAHPS®, patient survey, hospital survey, patient-provider communication
INTRODUCTION
The Consumer Assessment of Healthcare Providers and Systems (CAHPS®) program, funded by the Agency for Healthcare Research and Quality (AHRQ), aims to develop standardized surveys to assess patient experiences with their health care providers.1 Over the years, the CAHPS program has produced surveys designed to collect information across a range of health care settings, including both ambulatory and facility-based settings.
The CAHPS Hospital Survey (HCAHPS), developed in partnership with the Centers for Medicare and Medicaid Services (CMS), is a standardized survey of the experiences of adult patients with hospital care and services. HCAHPS was created using the standard CAHPS survey development process, including a public call for measures, an extensive environmental scan, cognitive testing in English and Spanish, patient focus groups, input from stakeholders, and multiple field tests in both English and Spanish. CMS provided opportunities for the public to comment on HCAHPS.2 The National Quality Forum has endorsed the survey, and the federal Office of Management and Budget approved national implementation for public reporting purposes. HCAHPS was implemented nationally in the U.S. by CMS on a voluntary basis, starting in October 2006. CMS includes this survey in its Hospital Quality Reporting program requirements and has been reporting the survey results on a public Web site at www.hospitalcompare.hhs.gov.3
HCAHPS includes items that capture communication with nurses and doctors and communication about medicines and discharge information. The HCAHPS Item Set for Addressing Health Literacy was developed as a supplement to the core HCAHPS survey to provide hospitals with actionable information they can use to “drill down” to identify areas for quality improvement. The development of this item set builds on earlier CAHPS work to develop an item set to address health literacy issues in an ambulatory care setting.4 Stakeholders who participated in these earlier development efforts identified the need for a CAHPS health literacy item set specifically for the HCAHPS survey. At the same time, CMS reported that stakeholders were requesting more detailed questions about discharge coordination that they could use for quality improvement efforts.
This new Health Literacy item set reflects both the ascendancy of health literacy as a public health concern and the recognition that addressing health literacy issues has the potential to mitigate the negative sequelae of limited health literacy.5 A recent systematic review of the literature found that poor health literacy among patients is associated with more hospitalizations, greater use of emergency care, lower receipt of preventive care, poorer ability to demonstrate taking medications appropriately, poorer ability to interpret labels and health messages, and, among elderly persons, worse overall health status and higher mortality rates.6 National studies document that the complexity of health information frequently exceeds Americans’ health literacy skills.7-10
In 2007, the Joint Commission called for health care organizations to make effective communication a priority across the care continuum.11 Transitions from hospital to home represent a particularly vulnerable time. Many patients leave the hospital without understanding their diagnosis or how to take their medicines.12 Surgical patients often do not remember postoperative instructions.13 Patients with low literacy have reported hospital staff becoming frustrated or angry when someone could not complete a form or read instructions, and have recounted serious medication errors resulting from their inability to read labels.14 Re-engineering the discharge process so that patient education occurs throughout the hospital stay and health literacy techniques are used has been shown to reduce re-hospitalizations by 30 percent.15 With the recent reduction in reimbursement to hospitals with excessive readmission rates, hospitals are eager to improve how they communicate with patients about tests, self-care, and medicines.16 The new Health Literacy item set can help them assess their efforts.
METHODS
Domain and Item Development
The HCAHPS Item Set for Addressing Health Literacy was developed using the survey development approach used to develop other CAHPS surveys.2 First, we identified the domains for the item set through an extensive environmental scan and via interviews with content experts. We held meetings with stakeholders to obtain input into the domains identified by the environmental scan, prioritize candidate domains, review draft survey items, and obtain input on how best to disseminate the new health literacy item set and promote its use. Two stakeholder meetings were held in March 2009 that included representatives from various government agencies (including CMS), hospital representatives, clinicians and other health providers, health literacy experts and advocates, and consumers.
Seven health literacy domains were identified through this process: 1) communication with nurses; 2) communication with doctors; 3) communication about tests; 4) communication about caring for yourself at home; 5) communication about medicines; 6) interpreter services; and 7) communication about forms. While some of these domains were completely new (e.g., interpreter services, communication about forms), others represented an expansion of domains already included in the core HCAHPS survey but, in stakeholders’ opinions, not sufficiently addressed.
We reviewed the survey items that had been collected through the environmental scan and mapped the survey items to the domains. AHRQ issued a call for measures through the Federal Register, but very few responses were submitted. In developing the draft item set, we modified or adapted items in the public domain to conform to the HCAHPS survey structure and format. In addition, new survey items were drafted for domains for which the team was unable to identify existing items.
Translation
The core HCAHPS Survey was developed and evaluated concurrently in both English and Spanish. Findings from the original HCAHPS pilot tests demonstrate that responses to both language versions have similar patterns with respect to item-scale correlations, factor structure, content validity, and associations between the reporting measures and the overall rating of the hospital. Overall, the survey items were generally equivalent across language versions.17
As with the core HCAHPS Survey, the supplemental Health Literacy item set was developed and tested concurrently in both English and Spanish. The supplemental items were translated into Spanish using the CAHPS guidelines for the selection of translators and reviewers and for translation.18-19 The translation team aimed to produce a Spanish version that was conceptually equivalent to the English version, was at an appropriate reading level for the target population, and could be understood by Spanish speakers throughout the continental United States. Using a translation approach that involves multiple translators and bilingual reviewers from different Spanish-speaking countries (often referred to as the “translation by committee” approach) ensures that the translation is understood by a wide range of Spanish speakers and has been shown to produce more culturally appropriate translations.20-22
Cognitive Interviews
Cognitive interviewing23-24 is a key step in the CAHPS survey development process and has traditionally been used to evaluate CAHPS survey items prior to field-testing.25-27 We conducted three rounds of cognitive interviews in the spring and summer of 2009. In total, 36 interviews were conducted, half in English and half in Spanish. All interviews were conducted with adults who had a recent overnight hospital stay. Respondents included men and women of varying ages, races and ethnicities, and types of insurance coverage, and included patients who had been admitted to the hospital through the emergency room as well as those who had scheduled surgeries.
In order to test the health literacy item set with respondents with limited health literacy skills, we recruited subjects with less than a high school education. The 2003 National Assessment of Adult Literacy (NAAL) found that adults who had not gone beyond a high school education had lower average health literacy than adults with higher levels of education.28 Although we did not assess respondents’ literacy or health literacy skills, two thirds of the cognitive interviews were conducted with respondents who had no more than a high school education.
We used the cognitive interviews to revise and refine the item set iteratively and to “weed out” survey items that were problematic, did not capture meaningful information, or focused on issues that did not resonate strongly with respondents. Of the 84 items that we initially tested, 22 items were dropped based on findings from the cognitive interviews.
Composite Development
A composite is composed of two or more survey items that are closely related conceptually and statistically. Composites summarize a large amount of survey data, making the results easier to understand.29 For each of the seven health literacy domains in the item set, we identified the subset of items that could be used to create a composite measure for that domain (excluding screener questions that are used to skip respondents out of questions that don’t apply to them). Table 1 lists the hypothesized composites and the subset of items that comprise them.
Table 1.
Hypothesized Composites Prior to the Field Test
How Well Nurses Communicate

| Item | Question | Response Options |
|---|---|---|
| Q23 | During this hospital stay, how often did nurses answer all your questions to your satisfaction? | Never / Sometimes / Usually / Always |
| Q24 | During this hospital stay, how often did nurses use pictures, drawings, or models to explain things to you? | Never / Sometimes / Usually / Always |
| Q25 | During this hospital stay, how often did you feel nurses really cared about you as a person? | Never / Sometimes / Usually / Always |
| Q26 | During this hospital stay, how often did nurses use medical words you did not understand? | Never / Sometimes / Usually / Always |
| Q27 | During this hospital stay, how often did nurses interrupt you when you were talking? | Never / Sometimes / Usually / Always |
| Q28 | During this hospital stay, how often did nurses talk too fast when talking with you? | Never / Sometimes / Usually / Always |
| Q29 | During this hospital stay, how often were nurses hard to understand because of an accent or the way they spoke to you? | Never / Sometimes / Usually / Always |
| Q30 | During this hospital stay, how often did nurses use a condescending, sarcastic or rude tone or manner with you? | Never / Sometimes / Usually / Always |

How Well Doctors Communicate

| Item | Question | Response Options |
|---|---|---|
| Q31 | During this hospital stay, how often did doctors answer all your questions to your satisfaction? | Never / Sometimes / Usually / Always |
| Q32 | During this hospital stay, how often did doctors use pictures, drawings, or models to explain things to you? | Never / Sometimes / Usually / Always |
| Q33 | During this hospital stay, how often did doctors make sure you understood all the information you were given? | Never / Sometimes / Usually / Always |
| Q34 | During this hospital stay, how often did you feel doctors really cared about you as a person? | Never / Sometimes / Usually / Always |
| Q35 | During this hospital stay, how often did doctors use medical words you did not understand? | Never / Sometimes / Usually / Always |
| Q36 | During this hospital stay, how often did doctors interrupt you when you were talking? | Never / Sometimes / Usually / Always |
| Q37 | During this hospital stay, how often did doctors talk too fast when talking with you? | Never / Sometimes / Usually / Always |
| Q38 | During this hospital stay, how often were doctors hard to understand because of an accent or the way they spoke to you? | Never / Sometimes / Usually / Always |
| Q39 | During this hospital stay, how often did doctors use a condescending, sarcastic or rude tone or manner with you? | Never / Sometimes / Usually / Always |

Communication About Tests

| Item | Question | Response Options |
|---|---|---|
| Q41 | During this hospital stay, before you had a blood test, x-ray, or other test, how often did hospital staff explain what it was for? | Never / Sometimes / Usually / Always |
| Q42 | How often was the explanation easy to understand? | Never / Sometimes / Usually / Always |
| Q43 | During this hospital stay, when you had a blood test, x-ray, or other test, how often did hospital staff explain the results to you? | Never / Sometimes / Usually / Always |
| Q44 | How often was the explanation easy to understand? | Never / Sometimes / Usually / Always |

Information About How to Care for Yourself At Home

| Item | Question | Response Options |
|---|---|---|
| Q46 | During this hospital stay, did hospital staff give you a telephone number to call if you had problems after you left the hospital? | Yes / No |
| Q47 | During this hospital stay, did hospital staff tell you how to take care of yourself at home? | Yes / No |
| Q48 | Was the information easy to understand? | Yes / No |
| Q49 | During this hospital stay, did you get instructions in writing about how to take care of yourself at home? | Yes / No |
| Q50 | Were the written instructions easy to understand? | Yes / No |
| Q52 | Were the instructions available in your preferred language? | Yes / No |

Information About Medicines

| Item | Question | Response Options |
|---|---|---|
| Q54 | During this hospital stay, did hospital staff explain the purpose of each of the medicines you were to take at home? | Yes / No |
| Q55 | Was the explanation easy to understand? | Yes / No |
| Q56 | During this hospital stay, did hospital staff explain how much to take of each medicine and when to take it when you were at home? | Yes / No |
| Q57 | How often was the explanation easy to understand? | Never / Sometimes / Usually / Always |
| Q58 | During this hospital stay, did hospital staff ask you to describe how much you would take of each medicine and when you would take it when you were at home? | Yes / No |
| Q59 | During this hospital stay, did hospital staff tell you whom to call if you had questions about your medicines? | Yes / No |

Interpreter Services

| Item | Question | Response Options |
|---|---|---|
| Q62 | During this hospital stay, how often were you treated unfairly because you do not speak English very well? | Never / Sometimes / Usually / Always |
| Q65 | During this hospital stay, was there any time when you needed an interpreter and did not get one at the hospital? | Yes / No |
| Q66 | During this hospital stay, did hospital staff tell you that you had a right to interpreter services free of charge? | Yes / No |
| Q70 | During this hospital stay, how often did this interpreter provided by the hospital treat you with courtesy and respect? | Never / Sometimes / Usually / Always |
| Q72 | During this hospital stay, how often did you have to wait 15 minutes or longer for the interpreter provided by the hospital to come help you? | Never / Sometimes / Usually / Always |

Communication About Forms

| Item | Question | Response Options |
|---|---|---|
| Q78 | During this hospital stay, how often did hospital staff explain the purpose of each form before you filled it out or signed it? | Never / Sometimes / Usually / Always |
| Q79 | During this hospital stay, how often were the forms that you got easy to fill out? | Never / Sometimes / Usually / Always |
| Q80 | During this hospital stay, how often did hospital staff offer you help in filling out a form? | Never / Sometimes / Usually / Always |
| Q82 | During this hospital stay, how often were you given enough time to fill out or sign forms? | Never / Sometimes / Usually / Always |
| Q84 | During this hospital stay, how often were the forms that you had to fill out or sign available in your preferred language? | Never / Sometimes / Usually / Always |
Note: Item numbers refer to the numbering used in the field test version of the survey.
We evaluated the extent to which items correlated with their hypothesized multi-item composites and calculated Cronbach’s alpha to estimate the internal consistency reliability of the composites. We deemed acceptable those items that correlated 0.30 or higher with their hypothesized composite (correcting for item overlap with the total score). We also examined the correlations of each item with the composites it was not hypothesized to represent, to assess whether the composites capture unique aspects of communication to improve health literacy. Through an iterative process, we revised the placement of items into composites, taking into account correlations between the items as well as item content, and reran the correlation analysis to assess the fit of each item within its assigned composite. We also conducted categorical confirmatory factor analysis in Mplus30 to assess the fit of the final composite structure.
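For readers who want to reproduce this step, the sketch below shows one way to compute corrected item-total (item-scale) correlations and Cronbach’s alpha in Python. It is a minimal illustration, not the study’s analysis code; the pandas DataFrame and the column names (q41–q44) are assumed for the example.

```python
# Sketch of the composite-evaluation step, assuming responses are in a pandas
# DataFrame with one column per survey item (column names are illustrative).
import pandas as pd


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency reliability of a set of items."""
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the sum of the remaining items in its
    hypothesized composite (i.e., correcting for item overlap with the total)."""
    items = items.dropna()
    out = {}
    for col in items.columns:
        rest = items.drop(columns=col).sum(axis=1)
        out[col] = items[col].corr(rest)
    return pd.Series(out)


# Example usage with a hypothetical "communication about tests" composite:
# tests = responses[["q41", "q42", "q43", "q44"]]
# print(corrected_item_total(tests))   # flag items below the 0.30 threshold
# print(cronbach_alpha(tests))         # composite reliability
```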
Field Test
The purpose of the field test was to assess the reliability and validity of the supplemental Health Literacy item set and to identify items that could be combined into composite measures usable both for internal quality improvement and for public reporting. The field test was conducted between December 2010 and February 2011. The 62 health literacy items were appended to the 27-item core HCAHPS survey, in accordance with the technical specifications for implementing the HCAHPS survey provided by CMS,31 creating an 89-item field test survey.
Recruiting field test partners proved challenging. While several hospitals expressed interest in participating in the field test, in the end only one hospital was able to secure all the necessary permissions in time to take part in the field test. This hospital serves a largely suburban population located in a Midwestern metropolitan area with a population of approximately 145,000 persons. The majority of patients served by the hospital are non-Hispanic White patients who are English-speaking. In 2009, among in-patient discharges, 35% were covered by Medicare, 36% were commercially insured, 23% were covered by Medicaid, 5% were uninsured, and 1% was covered by other types of health insurance.32
The sample frame for the field test included 3,772 adult patients drawn by the participating hospital using sample selection specifications adapted from Version 4.0 of the HCAHPS Quality Assurance Guidelines (published in February 2009). The participating hospital was instructed to create a sampling frame of adult patients (age 18 or older) with at least one overnight hospital stay in the previous six weeks. Excluded from the sample frame were patients who were less than 18 years old at the time of admission, had a psychiatric diagnosis, were discharged to a hospice facility, or died during the hospitalization, as well as patients who requested not to be contacted, court/law enforcement (i.e., prisoner) patients, patients with a foreign home address, and patients excluded under state regulations. From the sample frame provided by the hospital, RAND then randomly selected 2,000 patients to participate in the field test. Although we wanted to field test the health literacy item set in both English and Spanish, the sample provided by our field test partner did not include sufficient Spanish speakers.
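Purely as an illustration, the sketch below shows how exclusion rules of this kind and the random draw of 2,000 patients might be applied to a discharge file. All column names are hypothetical; the actual sample selection followed the HCAHPS Quality Assurance Guidelines rather than this code.

```python
# Illustrative sample-selection sketch; assumes the hospital's discharge file is
# a pandas DataFrame with boolean flag columns (names below are hypothetical).
import pandas as pd


def build_sample(discharges: pd.DataFrame, n: int = 2000, seed: int = 0) -> pd.DataFrame:
    eligible = discharges[
        (discharges["age_at_admission"] >= 18)
        & ~discharges["psych_diagnosis"]
        & ~discharges["discharged_to_hospice"]
        & ~discharges["died_in_hospital"]
        & ~discharges["no_publicity_request"]
        & ~discharges["prisoner"]
        & ~discharges["foreign_address"]
    ]
    # Simple random sample of patients from the eligible frame.
    return eligible.sample(n=min(n, len(eligible)), random_state=seed)
```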
The survey was field tested using a combination of mail and phone survey administration, which has been shown to produce higher response rates than mail surveys alone and is less expensive than conducting the survey either in person or entirely by telephone.33-34 Conducting phone follow-up with sampled individuals who did not respond to the mailed survey also offers a means of reaching individuals whose low literacy presents a barrier to self-administering a survey. The data collection protocol included four mailings before phone follow-up: 1) an advance notification letter, 2) a first survey mailing, 3) a reminder letter, and 4) a second survey mailing. Up to 10 attempts to complete the survey by phone were made over a four-week period.
To maximize response rates among those with limited literacy skills, the survey and all survey materials were written in simple, plain language. As described above, the survey was assessed for comprehension through extensive cognitive testing. The letters used as part of the survey were evaluated using the Flesch-Kincaid readability tool available in Microsoft Office Word. The Flesch Reading Ease score was 71.0 (indicating that, in theory, the letters could be understood by an average 13- to 15-year-old) and the Flesch-Kincaid grade level score was 6.9 (indicating that, in theory, the letters could be understood by an average 6th or 7th grader).35
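The published Flesch formulas behind these two scores are straightforward to reproduce. The sketch below implements them directly, assuming that word, sentence, and syllable counts are supplied by an external tool (the study used the tool built into Microsoft Office Word); it is shown only to make the scores interpretable, not as the study’s procedure.

```python
# Standard Flesch Reading Ease and Flesch-Kincaid grade level formulas.
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)


def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59


# Hypothetical counts for a survey cover letter:
# print(flesch_reading_ease(250, 20, 330))   # higher = easier to read
# print(flesch_kincaid_grade(250, 20, 330))  # approximate U.S. grade level
```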
Data Analysis
We examined item distributions (ceiling and floor effects), item missing data, internal consistency reliability of the multi-item scales or composites, correlations of the new health literacy composites with the core HCAHPS composites, and correlations of the health literacy composites with a global rating of the hospital on a scale of 0 to 10 (where 0 = worst possible hospital and 10 = best possible hospital).36-38
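A minimal sketch of the item-distribution checks is shown below. It assumes numerically coded responses in a pandas DataFrame, with negatively worded items already reverse-coded, and is not the study’s actual analysis code.

```python
# Share of respondents at the most positive ("ceiling") and least positive
# ("floor") observed response for each item, plus percent missing.
import pandas as pd


def ceiling_floor(responses: pd.DataFrame) -> pd.DataFrame:
    rows = {}
    for col in responses.columns:
        valid = responses[col].dropna()
        rows[col] = {
            "ceiling_pct": 100 * (valid == valid.max()).mean(),
            "floor_pct": 100 * (valid == valid.min()).mean(),
            "missing_pct": 100 * responses[col].isna().mean(),
        }
    return pd.DataFrame(rows).T


# Example: ceiling_floor(responses[["q32", "q36", "q48", "q66"]])
```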
RESULTS
Field Test Response
A total of 1,013 surveys were completed, for a response rate of 55%. More surveys were completed by mail (790, or 78%) than by phone (223, or 22%), with most of the surveys (62%) coming in after the first mailing. All but one of the surveys were completed in English. Table 2 shows the demographic characteristics of survey respondents. Twenty-two percent reported that their health was fair or poor, 30% reported it was good, and 43% reported it was very good or excellent. Ten percent of the sample had less than a high school education, while 21% had graduated from college or had more than a college degree. Whites were by far the largest racial group, and Hispanics constituted only 2% of respondents. Almost all respondents reported that English is the main language they speak at home, and the majority were female. Over half the sample was 65 years old or older. Demographic characteristics were not available for those who did not respond to the survey; therefore, non-response bias could not be estimated.
Table 2.
Demographic Characteristics of the HCAHPS Health Literacy Sample
| Question | Response | % |
|---|---|---|
| Self-Rated Overall Health | Excellent | 13.9 |
| | Very Good | 29.1 |
| | Good | 30.0 |
| | Fair | 18.2 |
| | Poor | 4.6 |
| | Missing | 4.1 |
| Highest Level of Completed Education | 8th grade or less | 3.7 |
| | Some high school | 6.4 |
| | High school graduate or GED | 37.9 |
| | Some college or 2-year degree | 27.4 |
| | 4-year college graduate | 11.5 |
| | More than 4-year college degree | 9.5 |
| | Missing | 3.6 |
| Spanish/Hispanic/Latino Origin or Descent | Not Spanish/Hispanic/Latino | 86.6 |
| | Puerto Rican | 1.2 |
| | Mexican/Mexican-American/Chicano | 0.6 |
| | Other Spanish/Hispanic/Latino | 0.4 |
| | Missing | 11.3 |
| Race | American Indian or Alaska Native | 0.2 |
| | Asian | 0.2 |
| | Black | 1.4 |
| | White | 93.7 |
| | 2 or more races | 1.0 |
| | Missing | 3.6 |
| Main Language Spoken at Home | English | 96.0 |
| | Spanish | 0.3 |
| | Some other language | 0.6 |
| | Missing | 3.2 |
| Age Category | 18 to 24 | 3.8 |
| | 25 to 34 | 12.5 |
| | 35 to 44 | 4.5 |
| | 45 to 54 | 9.0 |
| | 55 to 64 | 14.1 |
| | 65 to 74 | 23.0 |
| | 75 or older | 33.0 |
| Gender | Male | 38.9 |
| | Female | 61.1 |
Item Distributions
We conducted analyses to evaluate the response distributions of the survey items. The percentage of responses at the ceiling ranged from 11% (q32) to 99.6% (q48), with a median of 73%. Question 48 appears to be an outlier in terms of response distribution; however, we opted to retain it pending further evaluation of the item set. The percentage of responses at the floor ranged from 0.1% (q36) to 67% (q66), with a median of 2%.
Evaluation of hypothesized multi-item composites
With a few exceptions, correlations between items and scales showed that the data were consistent with the hypothesized multi-item composites. A few items correlated just as highly (or in some cases more highly) with composites they were not hypothesized to belong to. Based on item content, however, we opted to keep them in the composite that was conceptually the best fit. For example, item 23 correlated more highly with the communication with doctors composite than with the hypothesized communication with nurses composite. However, this item measures experiences with nurses and therefore, in terms of content, does not fit into the communication with doctors composite. In addition, items 54, 56, 58, and 59 in the communication about medicines composite correlated more highly with the original composite on communication about how to care for yourself at home. Given that these items refer to information about medicines that the patient would take at home upon discharge, we combined the two composites into one larger composite that includes items 46, 47, 49, 54, 56, 58, and 59.
Of the original seven domains, two of the hypothesized composites (communication with doctors and communication with nurses) were similar to existing HCAHPS composites, and a decision was made not to publish these until composite labels could be tested to ensure consumers would not be confused. A third composite (interpreter services) was not published because there were too few responses to these items to conduct psychometric testing. As noted above, two of the hypothesized composites (communication about medicines and communication about how to care for yourself at home) were combined, resulting in a total of three final composites. The item-scale correlation matrix for the three final composites is presented in Table 3: 1) communication about tests (4 items with item-scale correlations ranging from 0.64 to 0.77); 2) communication about how to care for yourself and medicines (7 items with item-scale correlations ranging from 0.33 to 0.54); and 3) communication about forms (4 items with item-scale correlations ranging from 0.34 to 0.57).
Table 3.
Item-Scale Correlation Matrix For The Final Composite Measures (N = 1,013)
| Item | Item Description | Communication About Tests | Information About Caring for Self at Home and Medicines | Communication About Forms |
|---|---|---|---|---|
| 41 | before you had a blood test, x-ray, or other test, how often did hospital staff explain what it was for? | 0.67* | 0.34 | 0.53 |
| 42 | how often was the explanation easy to understand? | 0.65* | 0.24 | 0.53 |
| 43 | when you had a blood test, x-ray, or other test, how often did hospital staff explain the results to you? | 0.64* | 0.29 | 0.40 |
| 44 | how often were the results of your blood test, x-ray, or other test easy to understand? | 0.77* | 0.26 | 0.49 |
| 46 | did hospital staff give you a telephone number to call if you had problems after you left the hospital? | 0.14 | 0.46* | 0.17 |
| 47 | did hospital staff tell you how to take care of yourself at home? | 0.16 | 0.49* | 0.15 |
| 49 | did you get instructions in writing about how to take care of yourself at home? | 0.12 | 0.45* | 0.13 |
| 54 | did hospital staff explain the purpose of each of the medicines you were to take at home? | 0.29 | 0.41* | 0.23 |
| 56 | did hospital staff give you instructions about how to take your medicines when you were at home? | 0.21 | 0.43* | 0.13 |
| 58 | did hospital staff ask you to describe how you were going to take your medicines when you were at home? | 0.26 | 0.33* | 0.19 |
| 59 | did hospital staff tell you whom to call if you had questions about your medicines? | 0.26 | 0.54* | 0.28 |
| 78 | how often did hospital staff explain the purpose of each form before you signed it? | 0.49 | 0.23 | 0.57* |
| 79 | how often were the forms that you got easy to fill out? | 0.52 | 0.18 | 0.52* |
| 80 | how often did hospital staff offer you help in filling out a form? | 0.32 | 0.27 | 0.34* |
| 82 | how often were you given enough time to fill out forms? | 0.46 | 0.16 | 0.51* |
Note 1: Item numbers refer to the numbering used in the field test version of the survey.
Note 2: Starred items indicate items that belong to the composite under a given column.
A three-factor categorical confirmatory factor analysis model representing the final item configuration fit the data well (Comparative Fit Index = 0.957; Tucker-Lewis Index = 0.951; Root Mean Square Error of Approximation = 0.057). Factor loadings were all statistically significant and ranged from 0.776 to 0.898 for communication about tests, 0.612 to 0.893 for communication about how to care for yourself and medicines, and 0.514 to 0.835 for communication about forms. The estimated correlations between factors ranged from 0.392 (communication about how to care for yourself and medicines) to 0.780 (communication about forms and communication about tests). Thus, the confirmatory factor analysis provided support for the three new CAHPS composites. Table 4 presents scale means, standard deviations, and internal consistency reliability estimates for the final composites.
Table 4.
Composite means, standard deviations and internal consistency reliability estimates
| Composite | Mean | SD | Coefficient Alpha |
|---|---|---|---|
| Communication about tests | 75.24 | 16.46 | 0.83 |
| Information about how to care for yourself at home and medicines | 86.84 | 19.18 | 0.71 |
| Communication about forms | 78.53 | 19.50 | 0.65 |
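The paper does not spell out the scoring rule behind these composite means. One common CAHPS-style approach, shown below purely as an assumption, is to rescale each item linearly to a 0-100 possible range and average the available items for each respondent before averaging across respondents.

```python
# Hypothetical 0-100 composite scoring sketch; not necessarily the rule used
# for Table 4. Assumes items coded 1-4 (Never...Always); yes/no items would
# use scale_min=0, scale_max=1 so that "yes" maps to 100.
import pandas as pd


def score_composite(items: pd.DataFrame, scale_min: int = 1, scale_max: int = 4) -> pd.Series:
    """Linearly rescale each item to 0-100, then average available items per respondent."""
    rescaled = (items - scale_min) / (scale_max - scale_min) * 100
    return rescaled.mean(axis=1, skipna=True)


# Example: tests_score = score_composite(responses[["q41", "q42", "q43", "q44"]])
# print(tests_score.mean(), tests_score.std())   # compare against Table 4
```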
Eleven of the 62 health literacy items that were field-tested (items 24, 26, 32, 35, 36, 37, 38, 39, 48, 50, and 55) did not fit into their hypothesized composites. Although these items had low item-scale correlations with their hypothesized composites, they still capture useful content and can provide information that survey users can use for quality improvement purposes. For this reason, we opted to keep them in the final version of the health literacy item set.
The health literacy item set that was field-tested included 17 items that collect information on interpreter services. However, because almost all of the respondents who participated in the field test were English-speaking, the vast majority screened out of these questions, and we therefore do not have sufficient data to evaluate these items. Given the importance of this domain in a hospital setting, however, we have opted to keep some of these items in the final version of the Health Literacy item set. The items were modified based on similar interpreter services items that were developed and tested as part of other CAHPS efforts and are included in Table 5. The items are available for public use, as users may find them useful for quality improvement purposes. However, further research is needed to fully test and evaluate these items.
Table 5.
Final Version of the HCAHPS Item Set for Addressing Health Literacy in Hospital Care
Topic 1: Communication with nurses to improve health literacy

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL1. | how often were nurses hard to understand because of an accent or the way they spoke to you? | Never / Sometimes / Usually / Always |
| HL2. | how often did nurses use medical words you did not understand? | Never / Sometimes / Usually / Always |
| HL3. | how often did nurses talk too fast when talking with you? | Never / Sometimes / Usually / Always |
| HL4. | how often did nurses use pictures, drawings, models, or videos to explain things to you? | Never / Sometimes / Usually / Always |
| HL5. | how often did nurses interrupt you when you were talking? | Never / Sometimes / Usually / Always |
| HL6. | how often did nurses answer all your questions to your satisfaction? | Never / Sometimes / Usually / Always |
| HL7. | how often did nurses use a condescending, sarcastic or rude tone or manner with you? | Never / Sometimes / Usually / Always |
| HL8. | how often did you feel nurses really cared about you as a person? | Never / Sometimes / Usually / Always |

Topic 2: Communication with doctors to improve health literacy

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL9. | how often were doctors hard to understand because of an accent or the way they spoke to you? | Never / Sometimes / Usually / Always |
| HL10. | how often did doctors use medical words you did not understand? | Never / Sometimes / Usually / Always |
| HL11. | how often did doctors talk too fast when talking with you? | Never / Sometimes / Usually / Always |
| HL12. | how often did doctors use pictures, drawings, models, or videos to explain things to you? | Never / Sometimes / Usually / Always |
| HL13. | how often did doctors interrupt you when you were talking? | Never / Sometimes / Usually / Always |
| HL14. | how often did doctors answer all your questions to your satisfaction? | Never / Sometimes / Usually / Always |
| HL15. | how often did doctors make sure you understood all the information you were given? | Never / Sometimes / Usually / Always |
| HL16. | how often did doctors use a condescending, sarcastic or rude tone or manner with you? | Never / Sometimes / Usually / Always |
| HL17. | how often did you feel doctors really cared about you as a person? | Never / Sometimes / Usually / Always |

Topic 3: Communication About Tests

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL19. (q41)* | before you had a blood test, x-ray, or other test, how often did hospital staff explain what it was for? | Never / Sometimes / Usually / Always |
| HL20. (q42)* | how often was the explanation easy to understand? | Never / Sometimes / Usually / Always |
| HL21. (q43)* | when you had a blood test, x-ray, or other test, how often did hospital staff explain the results to you? | Never / Sometimes / Usually / Always |
| HL22. (q44)* | how often were the results of your blood test, x-ray, or other test easy to understand? | Never / Sometimes / Usually / Always |

Topic 4: Information About How to Care for Yourself At Home

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL23. | After you left the hospital, did you go directly to your own home, to someone else’s home, or to another health facility? | Own home / Someone else’s home / Another health facility |
| HL24. (q46)* | did hospital staff give you a telephone number to call if you had problems after you left the hospital? | Yes / No |
| HL25. (q47)* | did hospital staff tell you how to take care of yourself at home? | Yes / No |
| HL26. | Was the information easy to understand? | Yes / No |
| HL27. (q49)* | did you get instructions in writing about how to take care of yourself at home? | Yes / No |
| HL28. | Were the written instructions easy to understand? | Yes / No |
| HL29. | Did you need instructions in a language other than English? | Yes / No |
| HL30. | Were the instructions available in your preferred language? | Yes / No |

Topic 5: Information About Medicines

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL32. (q54)* | did hospital staff explain the purpose of each of the medicines you were to take at home? | Yes / No |
| HL33. | Was the explanation easy to understand? | Yes / No |
| HL34. (q56)* | did hospital staff give you instructions about how to take your medicines when you were at home? | Yes / No |
| HL35. | Were these instructions easy to understand? | Yes / No |
| HL36. (q58)* | did hospital staff ask you to describe how you were going to take your medicines when you were at home? | Yes / No |
| HL37. (q59)* | did hospital staff tell you whom to call if you had questions about your medicines? | Yes / No |

Topic 6: Interpreter Services

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL38. | What is your preferred language? | English / Insert Language 2 / Insert Language 3 / Insert Language 4 / Other |
| HL39. | How well do you speak English? | Very well / Well / Not well / Not at all |
| HL40. | During this hospital stay, how often were you treated unfairly because you do not speak English very well? | Never / Sometimes / Usually / Always |
| HL41. | An interpreter is someone who helps you talk with others who do not speak your language. Interpreters can include hospital staff or telephone interpreters. During this hospital stay, was there any time when you needed an interpreter? | Yes / No |
| HL42. | did hospital staff let you know that an interpreter was available free of charge? | Yes / No |
| HL43. | how often did you use an interpreter provided by the hospital to help you talk with hospital staff? | Never / Sometimes / Usually / Always |
| HL44. | when you used an interpreter provided by the hospital, who was the interpreter you used most often? | A nurse, nurse’s aide, or social worker / A professional interpreter hired by the hospital / A telephone interpreter / Someone else → Who? |
| HL45. | how often did this interpreter treat you with courtesy and respect? | Never / Sometimes / Usually / Always |
| HL46. | Using any number from 0 to 10, where 0 is the worst interpreter possible and 10 is the best interpreter possible, what number would you use to rate this interpreter? | 0-10 |
| HL47. | how often did you have to wait 15 minutes or longer for the interpreter provided by the hospital to come help you? | Never / Sometimes / Usually / Always |
| HL48. | how often did you use a friend or family member as an interpreter when you talked with hospital staff? | Never / Sometimes / Usually / Always |
| HL49. | did you use a child younger than 18 to help you talk with hospital staff? | Yes / No |
| HL50. | did you use friends or family members as interpreters because that was what you preferred? | Yes / No |

Topic 7: Communication About Forms

During this hospital stay…

| Item | Question | Response Options |
|---|---|---|
| HL52. (q78)* | how often did hospital staff explain the purpose of each form before you signed it? | Never / Sometimes / Usually / Always |
| HL54. (q80)* | how often did hospital staff offer you help in filling out a form? | Never / Sometimes / Usually / Always |
| HL55. (q79)* | how often were the forms that you got easy to fill out? | Never / Sometimes / Usually / Always |
| HL56. (q82)* | how often were you given enough time to fill out forms? | Never / Sometimes / Usually / Always |
| HL57. | did you ever need forms in a language other than English? | Yes / No |
| HL58. | how often were the forms that you had to fill out or sign available in your preferred language? | Never / Sometimes / Usually / Always |
Note 1: Items marked with an asterisk form part of the composite measure for each topic. Item numbers in parentheses correspond to the numbering used in the field test version of the survey.

Note 2: Items marked with an asterisk under Topics 4 and 5 form part of a combined composite measure on Information About How to Care for Yourself at Home and Medicines.
Associations of New Composites with Global Ratings of the Hospital and the Core HCAHPS Composites
The largest correlations of the new composites with the global rating of the hospital and with whether the respondent would recommend the hospital to family or friends were observed for the items on communication with nurses (0.50 and 0.43, respectively). This is consistent with results for the HCAHPS core composites.1 Not surprisingly, the new health literacy items on communication with nurses correlated most highly with the core communication with nurses composite (r = 0.62). Similarly, the largest correlation of the new health literacy items on communication with doctors was with the core communication with doctors composite (r = 0.69), and the largest correlation of the new communication about how to care for yourself and medicines composite was with the core discharge composite (r = 0.47).
In contrast, the largest correlation of the new communication about tests composite was with the core communication about medicines composite (r = 0.49). The new communication about forms composite, which has no core counterpart, also correlated most highly with the core communication about medicines composite (r = 0.42). Although sizable, these correlations are low enough to suggest that the new composites provide information beyond what the core composites alone would provide. A multiple regression of the recommend-to-family-and-friends item on the HCAHPS core and new composites had an adjusted R-square of 52%, with significant unique associations for the core nursing composite (beta = 0.01, t = 5.14, p < .0001), the core pain management composite (beta = 0.01, t = 5.68, p < .0001), and the new items on communication with nurses (beta = 0.01, t = 2.02, p < .05). None of the new composites was uniquely associated with the global rating of the hospital item.
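The regression reported above could be reproduced along the following lines. This sketch assumes the statsmodels package and a respondent-level analysis file; the file name and every column name are illustrative, not the study’s actual variables.

```python
# Illustrative multiple regression of the "recommend the hospital" item on the
# core HCAHPS composites and the new health literacy composites.
import pandas as pd
import statsmodels.api as sm

# Hypothetical analysis file: one row per respondent, composite scores on 0-100.
df = pd.read_csv("hcahps_field_test_scores.csv")

predictors = [
    "core_nurse_comm", "core_doctor_comm", "core_pain_mgmt",
    "core_medicines", "core_discharge",
    "hl_nurse_comm", "hl_doctor_comm", "hl_tests",
    "hl_selfcare_medicines", "hl_forms",
]
X = sm.add_constant(df[predictors])
fit = sm.OLS(df["recommend_hospital"], X, missing="drop").fit()

print(round(fit.rsquared_adj, 2))  # adjusted R-square (reported above as 0.52)
print(fit.summary())               # per-composite betas, t statistics, p-values
```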
After the field test was completed, we made further revisions to the item set based on the results of additional testing conducted as part of other CAHPS initiatives. In addition, CAHPS has recently undertaken an effort to harmonize various CAHPS supplemental item sets, which resulted in further revisions to the item set. The final version of the item set includes 58 items and can be found in Table 5.
DISCUSSION
Hospitals are increasingly being called on to do a better job of communicating with patients, as exemplified by the Joint Commission’s new patient-centered communication standard, which requires hospitals to communicate effectively with patients when providing care, treatment, and services.39 Hospitals need to be able to monitor how well they are meeting these expectations. For example, the American Hospital Association’s Improving Communication with Patients and Families: A Blueprint for Action40 includes communicating with patients and families at all stages of their hospital experience on its checklist of leadership strategies for enhancing communication. The HCAHPS Item Set for Addressing Health Literacy is one tool hospitals can use to make sure they are taking advantage of the patient’s time in the hospital to provide clear, understandable information. This comprehensive item set covers topics of interest to a wide range of stakeholders and includes three multi-item composites that can be reported. Although the item set includes more survey items than any one hospital is likely to use, users have the flexibility of picking and choosing individual survey items to “drill down” for detailed, actionable information, or they can use one of the three composites to report a composite score on a particular topic. Additional information on how CAHPS composite measures are calculated can be found at www.cahps.ahrq.gov/About-CAHPS/FAQs. While there are no plans for CMS to publicly report the survey results from these items, the new Health Literacy item set is designed to provide information that hospitals can use for quality improvement and that consumers can use to assess key aspects of hospital quality of care.
The study has several limitations. First, the health literacy item set was field tested with a predominantly white, adult patient sample from one hospital in the Midwest. Further research is needed to test the item set with a more racially and ethnically diverse mix of patients and in a broader range of hospital settings and geographic locations. Second, the study was limited primarily to an English-speaking population. Although the health literacy item set is available in Spanish (and was extensively cognitively tested in Spanish), the field test sample did not include sufficient numbers of Spanish speakers to adequately assess the Spanish version and, in particular, the interpreter services items. Further research is needed to fully test the health literacy item set with Spanish speakers as well as other non-English-speaking populations. Despite these limitations, the HCAHPS Item Set for Addressing Health Literacy can serve as a tool to measure, from the patient’s perspective, how well hospital providers are meeting their patients’ health literacy needs, and the resulting information can be used for quality improvement. Additional information on quality improvement strategies using CAHPS data can be found at https://www.cahps.ahrq.gov/Quality-Improvement/Improvement-Guide.aspx.
Acknowledgements
This work was supported by a contract from the Agency for Healthcare Research and Quality (HHSP233200600332P). Ron Hays was also supported in part by grants from AHRQ (U18 HS016980), NIA (P30AG021684) and the NIMHD (2P20MD000182).
Contributor Information
Beverly Weidmer, RAND Corporation, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138. Tel: (310) 393-0411, ext. 6788. Fax: (310) 451-6921. Beverly_Weidmer@rand.org.
Cindy Brach, Center for Delivery, Organization, and Markets Agency for Healthcare Research and Quality 540 Gaither Road Rockville, MD 20850 Tel: (301) 427-1444 Cindy.Brach@ahrq.hhs.gov.
Mary Ellen Slaughter, RAND Corporation 4570 Fifth Avenue, Suite 600 Pittsburgh, Pennsylvania 15213 Tel: (310) 393-0411, ext. 4603 mslaughter@rand.org.
Ron D. Hays, UCLA Department of Medicine, Division of General Internal Medicine & Health Services Research, 911 Broxton Avenue, Los Angeles, CA 90095-1736. Tel: (310) 794-2294. drhays@ucla.edu.
References
- 1. Consumer Assessment of Health Plans (CAHPS) fact sheet. Agency for Healthcare Research and Quality. AHRQ Publication No. 00-PO47. Available at: https://www.cahps.ahrq.gov/About-CAHPS/CAHPS-Program.aspx. Accessed May 15, 2012.
- 2. Goldstein E, Farquhar M, Crofton C, et al. Measuring hospital care from the patients’ perspective: an overview of the CAHPS Hospital Survey development process. Health Serv Res. 2005 Dec;40(6 Pt 2):1977-95. doi: 10.1111/j.1475-6773.2005.00477.x.
- 3. Giordano LA, Elliott MN, Goldstein E, et al. Development, implementation, and public reporting of the HCAHPS survey. Med Care Res Rev. 2010 Feb;67(1):27-37. doi: 10.1177/1077558709341065.
- 4. Weidmer B, Brach C, Hays RD. Development and evaluation of CAHPS® survey items assessing how well health care providers address health literacy. Med Care. 2012. doi: 10.1097/MLR.0b013e3182652482. In press.
- 5. Koh H, Berwick D, Clancy C, et al. New federal policy initiatives to boost health literacy can help the nation move beyond the cycle of costly ‘crisis care’. Health Affairs. Forthcoming. doi: 10.1377/hlthaff.2011.1169.
- 6. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011 Jul 19;155(2):97-107. doi: 10.7326/0003-4819-155-2-201107190-00005.
- 7. Helitzer D, Hollis C, Cotner J, et al. Health literacy demands of written health information materials: an assessment of cervical cancer prevention materials. Cancer Control. 2009 Jan;16(1):70-8. doi: 10.1177/107327480901600111.
- 8. Goodfellow GW, Trachimowicz R, Steele G. Patient literacy levels within an inner-city optometry clinic. Optometry. 2008 Feb;79(2):98-103. doi: 10.1016/j.optm.2007.03.015.
- 9. Badarudeen S, Sabharwal S. Readability of patient education materials from the American Academy of Orthopaedic Surgeons and Pediatric Orthopaedic Society of North America web sites. J Bone Joint Surg Am. 2008 Jan;90(1):199-204. doi: 10.2106/JBJS.G.00347.
- 10. Hill-Briggs F, Smith AS. Evaluation of diabetes and cardiovascular disease print patient education materials for use with low-health literate populations. Diabetes Care. 2008 Apr;31(4):667-71. doi: 10.2337/dc07-1365.
- 11. The Joint Commission. “What did the doctor say?”: Improving health literacy to protect patient safety. Oakbrook Terrace, IL: The Joint Commission; 2007.
- 12. Makaryus AN, Friedman EA. Patients’ understanding of their treatment plans and diagnosis at discharge. Mayo Clin Proc. 2005 Aug;80(8):991-4. doi: 10.4065/80.8.991.
- 13. Atchison KA, Black EE, Leathers R, et al. A qualitative report of patient problems and postoperative instructions. J Oral Maxillofac Surg. 2005 Apr;63(4):449-56. doi: 10.1016/j.joms.2004.07.019.
- 14. Baker DW, Parker RM, Williams MV, et al. The health care experience of patients with low literacy. Arch Fam Med. 1996 Jun;5(6):329-34. doi: 10.1001/archfami.5.6.329.
- 15. Jack BW, Chetty VK, Anthony D, et al. A reengineered hospital discharge program to decrease rehospitalization: a randomized trial. Ann Intern Med. 2009 Feb 3;150(3):178-87. doi: 10.7326/0003-4819-150-3-200902030-00007.
- 16. The Joint Commission. Advancing Effective Communication, Cultural Competence, and Patient- and Family-Centered Care: A Roadmap for Hospitals. Oakbrook Terrace, IL: The Joint Commission; 2010.
- 17. Hurtado MP, Angeles J, Blahut SA, Hays RD. Assessment of the equivalence of the Spanish and English versions of the CAHPS® Hospital Survey. Health Serv Res. 2005. doi: 10.1111/j.1475-6773.2005.00469.x.
- 18. Weidmer, et al. Guidelines for Translating CAHPS Surveys. 2006. Available at: www.cahps.ahrq.gov/Surveys-Guidance/Helpful-Resources/Translating-Surveys.aspx. Accessed January 6, 2012.
- 19. Solano-Flores, et al. The Assessment and Selection of Translators and Translation Reviewers. 2006. Available at: www.cahps.ahrq.gov/Surveys-Guidance/Helpful-Resources/Translating-Surveys.aspx. Accessed January 6, 2012.
- 20. Harkness JA, Van de Vijver FJR, et al. Cross-Cultural Survey Methods. Hoboken, NJ: John Wiley & Sons; 2003.
- 21. Behling and Law. Translating Questionnaires and Other Research Instruments: Problems and Solutions. Thousand Oaks, CA: Sage; 2000. Sage University Papers Series on Quantitative Applications in the Social Sciences, 07-131.
- 22. U.S. Census Bureau Methodology and Standards Council. Census Bureau Guideline: Language Translation of Data Collection Instruments and Supporting Materials. The Translation of Surveys: An Overview of Methods and Practices and the Current State of Knowledge (working paper). Washington, DC: U.S. Census Bureau; May 4, 2000.
- 23. Hughes KA. Comparing pretesting methods: cognitive interviews, respondent debriefing, and behavior coding. Annual Meeting of the Federal Committee on Statistical Methodology; Arlington, VA. U.S. Census Bureau; 2003.
- 24. Willis G. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications; 2004.
- 25. Harris-Kojetin LD, Fowler FJ, Brown JA, et al. The use of cognitive testing to develop and evaluate CAHPS 1.0 core survey items. Med Care. 1999;37:MS10-21. doi: 10.1097/00005650-199903001-00002.
- 26. Levine RE, Fowler FJ Jr, Brown JA. Role of cognitive testing in the development of the CAHPS® Hospital Survey. Health Serv Res. 2005 Dec;40(6 Pt 2):2037-56. doi: 10.1111/j.1475-6773.2005.00472.x.
- 27. Weech-Maldonado R, Weidmer B, Morales L, et al. Cross-cultural adaptation of survey instruments: the CAHPS experience. In: Cynamon ML, Kulka RA, editors. Seventh Conference on Health Survey Research Methods. Hyattsville, MD; 2001. DHHS Publication No. (PHS) 01-1013.
- 28. Kutner M, et al. The Health Literacy of America’s Adults: Results from the 2003 National Assessment of Adult Literacy. Available at: www.nces.ed.gov/pubs2006/2006483_1.pdf. Accessed April 2012.
- 29. McGee J, Kanouse DE, Sofaer S, Hargraves JL, Hoy E, Kleimann S. Making survey results easy to report to consumers: how reporting needs guided survey design in CAHPS. Med Care. 1999 Mar;37(3 Suppl):MS32-40. doi: 10.1097/00005650-199903001-00004.
- 30. Muthén LK, Muthén BO. Mplus User’s Guide. Los Angeles, CA: Muthén & Muthén; 2009.
- 31. Centers for Medicare & Medicaid Services. Baltimore, MD. Available at: http://www.hcahpsonline.org. Accessed April 30, 2012.
- 32. St. Luke’s Regional Medical Center. Annual Report. 2009. Available at: www.stlukes.org/Documents/AboutUs/annual-report-0810.pdf. Accessed April 30, 2012.
- 33. Dillman DA, Smyth JD, Christian LM. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Hoboken, NJ: John Wiley & Sons; 2009.
- 34. Fredrisckshon DD, Jones TL, Molgaard CA, et al. Optimal design features for surveying low income populations. J Health Care Poor Underserved. 2005;16:677-90. doi: 10.1353/hpu.2005.0096.
- 35. Kincaid JP, Fishburne RP, Rogers RL, Chissom BS. Derivation of New Readability Formulas (Automated Readability Index, Fog Count, and Flesch Reading Ease Formula) for Navy Enlisted Personnel. Research Branch Report 8-75. Naval Air Station Memphis: Chief of Naval Technical Training; 1975.
- 36. Hays RD, Shaul JA, Williams VS, et al. Psychometric properties of the CAHPS 1.0 survey measures. Consumer Assessment of Health Plans Study. Med Care. 1999;37(3 Suppl):MS22-31. doi: 10.1097/00005650-199903001-00003.
- 37. Hargraves JL, Hays RD, Cleary PD. Psychometric properties of the Consumer Assessment of Health Plans Study (CAHPS) 2.0 adult core survey. Health Serv Res. 2003 Dec;38:1509-27. doi: 10.1111/j.1475-6773.2003.00190.x.
- 38. Weech-Maldonado R, Weidmer B, Morales L, et al. Cross-cultural adaptation of survey instruments: the CAHPS experience. In: Cynamon ML, Kulka RA, editors. Seventh Conference on Health Survey Research Methods; 2001:75-82.
- 39. New & Revised Standards & EPs for Patient-Centered Communication. Accreditation Program: Hospital. Pre-publication version. The Joint Commission; 2010. Available at: http://www.imiaweb.org/uploads/pages/275.pdf. Accessed May 15, 2012.
- 40. Strategies for Leadership: Improving Communication with Patients and Families: A Blueprint for Action. American Hospital Association. Available at: http://www.aha.org/content/00-10/blueprint_for_action.pdf. Accessed May 15, 2012.