TABLE 1.
Reference | Aims | Design and sample characteristics | Main findings | Methodological strengths or shortcomings |
---|---|---|---|---|
Castle, N. 2005 | To examine administrators' opinions of the Nursing Home Compare (NHC) website initiative and its influence on quality improvement | Design: Cross-sectional survey; Sample size: n=324; Subjects: Nursing home administrators; Setting: Four states; Country: USA; Response rate: 68% | 90% had viewed the NHC website. 51% said they would use the information for quality improvement purposes in the future. 33% said they were currently using the information for quality improvement purposes. | Potential for response bias. Restricted to four states of the USA. Administrators' opinions were used as a proxy for those of consumers. |
Castle, N. and T. Lowe 2005 | To identify which states produce nursing home report cards; to compare information contained in the report cards; to identify sources of information used in the report cards; to examine factors associated with the usefulness of the report cards | Design: Exploratory descriptive study; Sample size: n=19 states; Setting: Nursing home; Country: USA | 19 states were identified as having nursing home report cards. Although the data sources did not vary considerably, the information included in the report cards varied significantly, and there was substantial variation across states in how the information was presented. Sources of information used in the report cards included annual licensure and recertification inspection reports, MDS data and primary data such as satisfaction survey data. Factors identified as being associated with the utility of report cards included a user-friendly structure, explanatory information and navigation aids, layering of information for a diverse audience, a stepwise approach to minimize complexity in decision-making, explanation of how and why to use quality information in decision-making, large font size and ample white space. | Evaluations of the utility of the report cards were undertaken by the researchers themselves; the opinions of consumers were not sought in this study. |
Castle, N., J. Engberg and D. Liu 2007 | To examine changes in quality measure scores over one year; to assess whether competition and/or demand influenced changes in the scores | Design: Cross-sectional data collected at two time points, a year apart; Data sources: The NHC website and the Online Survey Certification and Reporting (OSCAR) system; Setting: Nursing home; Country: USA | Scores decreased on average for eight quality measures and increased on average for six quality measures. The average change in the quality measures was less than 1%. An association was found between (a) competition and improved quality measure scores and (b) lower occupancy and improved quality measure scores. | Changes in quality observed are not necessarily the result of report card availability. RAI-MDS reporting by facilities may have changed during the year. |
Grando, T., M. Rantz and M. Maas 2007 | To elicit the opinions of nursing home staff on a quality performance feedback quality improvement intervention | Design: Qualitative exploratory descriptive study; Sample size: n=9 nursing homes (six of which had received the intervention); Subjects: Facility staff directly involved in a prior QI Feedback Intervention trial; Setting: Nursing homes in one state; Country: USA | All six nursing homes that received the feedback intervention found the QI Feedback reports useful. The reports helped identify potential quality problems and enabled tracking of the potential problems over time. Accuracy of the QI reports was questioned; this prompted critique of the RAI-MDS assessments undertaken by staff. Willingness of administrators to change practice based on the feedback reports varied. | The study was conducted in a small number of facilities in one state in the USA; generalizing the findings beyond this setting is therefore difficult. |
Mukamel, D., W. Spector, J. Zinn, L. Huang, D. Weimer and A. Dozier 2007 | To examine nursing home administrators' responses to public reporting through Nursing Home Compare | Design: Cross-sectional survey; Sample size: n=724; Subjects: Chief administrators; Setting: Nursing homes nationally; Country: USA; Response rate: 48% | 82% of administrators had viewed their scores on at least one occasion. 69% of respondents reported having viewed their scores for the first and subsequent publications. 60% of respondents believed that quality of care (among other factors) influenced the quality measures. Less than 1% of respondents believed the report card data (quality measures and deficiency citations) were the most important factor in consumer decision-making. In response to publication of quality measures, 63% of respondents reported having investigated their scores, 42% reported having re-prioritized their quality improvement program, and 20% reported having initiated a new quality improvement program and sought assistance from their Quality Improvement Organization (contracted by the Centers for Medicare and Medicaid Services). | National sample; potential for self-report bias. |
Stevenson, D. 2006 | To determine whether findings from public reporting in the acute care setting can provide insights for public reporting in the nursing home sector; to evaluate the effects to date of public reporting of nursing home data | Design: Longitudinal observational study, including OSCAR data from before and after the release of Nursing Home Compare; Data sources: The NHC website and the Online Survey Certification and Reporting (OSCAR) system; Setting: Nursing home; Country: USA | Reports of quality data appear to have a very small influence on nursing home occupancy rates. | Absence of a control group. Occupancy rate, as a dependent variable, is limited by the capacity of occupancy to change in response to quality. |