Abstract
The Agency for Healthcare Research and Quality (AHRQ) Health Care Innovations Exchange (IE) was developed to collect and report on innovative approaches to improving healthcare. We reviewed 348 IE innovations that included patient-reported satisfaction or experience measures. Innovations most often measured an overall rating of care (61% of innovations), followed by access (52%) and provider-patient communication (12%). More than half used patient satisfaction surveys (n=187) rather than patient experience surveys (n=64). Innovations using patient experience surveys more often measured specific aspects of patient care, such as access, rather than a general overall rating of care. Most innovations using patient experience surveys administered non-validated, homegrown surveys, with few using the AHRQ-endorsed, psychometrically tested CAHPS survey. The most common study design was post-implementation only (65%), indicating that the methodological rigor used to assess patient-centeredness in the IE is low. Use of patient experience surveys and of more rigorous evaluation designs has increased somewhat over time but remains limited.
Keywords: patient experience, patient satisfaction, innovations, improvement
Introduction
Policymakers, researchers, clinicians, and health care leaders leverage data in quality improvement efforts and interventions to support and improve patient-centered care.1 Patient-centered care is care in which patients are “listened to, informed, respected, and involved in their care—and their wishes are honored”.2 Patient-centeredness improves patient satisfaction and patient experience, as well as clinician and non-clinician morale and productivity;3 patient-centeredness is a key component of quality.
One approach to assessing patient-centered care is to measure patient experience of care by administering patient satisfaction or patient experience surveys.4, 5 The family of Consumer Assessment of Healthcare Providers and Systems (CAHPS®) surveys is rigorously developed, incorporating stakeholder input, psychometric analysis, and cognitive and field testing, and serves as the national standard for collecting, tracking, and benchmarking patient care experiences across settings, such as the Hospital CAHPS (HCAHPS) survey for inpatient care and the CAHPS Clinician and Group (CG-CAHPS) survey for ambulatory care.6-8 Endorsed by the Agency for Healthcare Research and Quality (AHRQ),9 CAHPS surveys were developed to inform public reporting, pay-for-performance initiatives, interventions, patient choice of physicians/practices, and quality improvement.10-25 When administered over time, CAHPS surveys also help determine whether care is becoming more patient-centered. While patient experience surveys have been widely used to assess changes related to specific interventions26-33 or trends,34 relatively few studies35 have examined how patient experience surveys have been used to improve patient-centeredness outside of evaluations of interventions.
The AHRQ Health Care Innovations Exchange (IE) was developed to collect and report on innovative approaches to improving care and describes “the implementation of new or altered products, services, processes, systems, policies, organizational structures, or business models that aim to improve one or more domains of health care quality or reduce health care disparities.”36 This repository was developed and maintained by AHRQ from June 2008 to September 2016 and includes over 900 innovations.37 Innovations range in scope and are categorized by type: Service Delivery Innovation, Innovation Attempt, and Policy Innovation. One previous study35 systematically examined the submitted innovations (through the end of January 2013) that used patient satisfaction or patient experience as a key outcome, highlighting how it was measured and how innovations attempted to improve care using these metrics. Overall, that study found that few innovations measured patient experience, with most opting instead to measure patient satisfaction; very few innovations used a CAHPS survey or a survey of similar rigor to assess their efforts. However, since then and through August 2016, 244 additional innovations have been added to the database. No study has reviewed the innovations published since January 2013.
The goal of this study is to extend the analysis by Weinick et al. (2014)35 and examine all of the IE innovations, describing the types of innovations (focus, activities) and their impact.
Methods
Search of Innovations
We searched the 933 innovations in the AHRQ IE database,36 from the first innovation (June 2008) through the final update (August 2016). Following the search process and coding approach of a previous study of the IE,35 we reviewed innovations using terms related to patient experience or satisfaction generally and to specific domains covered by CAHPS. We used the following search terms: “satisfaction,” “patient experience,” “consumer assessment,” “provider communication,” “office staff,” “access,” “coordination of care,” “provider-patient communication,” “improving access to care,” “laboratory tests,” “cultural competence,” “pain medication,” “pain management,” “follow-up care,” “discharge planning,” and “shared decision making.” Keyword screening excluded 452 innovations, yielding 481 innovations for further review.
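As an illustration of this screening step, the sketch below flags profiles whose free text contains any of the search terms. It is a minimal example only: the file name, column names, and spreadsheet layout are hypothetical, not the actual IE export.

```python
import pandas as pd

# Hypothetical file and column names; the actual IE spreadsheet layout may differ.
SEARCH_TERMS = [
    "satisfaction", "patient experience", "consumer assessment",
    "provider communication", "office staff", "access",
    "coordination of care", "provider-patient communication",
    "improving access to care", "laboratory tests", "cultural competence",
    "pain medication", "pain management", "follow-up care",
    "discharge planning", "shared decision making",
]

profiles = pd.read_excel("innovation_profiles.xlsx")  # one row per innovation profile

# Combine the free-text fields and flag profiles containing any search term.
text = profiles[["description", "activities", "impact"]].fillna("").agg(" ".join, axis=1)
pattern = "|".join(SEARCH_TERMS)
candidates = profiles[text.str.contains(pattern, case=False, regex=True)]

print(f"{len(candidates)} of {len(profiles)} profiles matched at least one search term")
```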
Analysis of Innovation Profiles
We analyzed the content of the innovation profiles submitted by implementers to the AHRQ IE website. Since the previous study was conducted in 2013, the submitted PDF files have been consolidated into a single spreadsheet that includes all historical innovations from June 2008 through August 2016. Each profile includes a description of the innovation, the activities that were implemented, the impact of the intervention, and an assessment of the strength of the evidence that the intervention improved outcomes.
We used a structural coding approach38 based on the set of codes developed for the previous study of the IE.35 The codes aimed to identify relevant innovations related to patient experience or patient satisfaction. The codes comprised six types of patient experience outcomes (patient experience or patient satisfaction, each assessed via an anecdotal, qualitative, or quantitative approach); six categories defining how improvements were evaluated and reported (anecdotal patient reports, comparison over time between two or more groups, comparison over time for a single group, comparison between groups, post-implementation only, and series/trend data); and 12 categories describing the organizational processes targeted for improvement by the innovation. In Table 1, we define the codes and provide an illustrative example of each from the innovation profiles. In coding the innovations identified by keyword, some were dropped because they did not include patient satisfaction or experience, mostly because they included only provider satisfaction or experience.
Table 1:
Description and Illustrative Examples of Codes
| Codes | Description | Example from Innovation Profile |
|---|---|---|
| Type of patient experience or satisfaction outcomes | ||
| Patient Satisfaction quantitative data | Assessment of how happy/satisfied patients are with their medical care, collected via quantitative measures (i.e., surveys or other numerical estimates) | Enrollee satisfaction surveys reveal that the percentage of enrollees who are satisfied with the ACO is 87 percent. |
| Patient Satisfaction qualitative data | Assessment of how happy/satisfied patients are with their medical care, collected via qualitative approaches (i.e., interviews, focus groups) | An ongoing and comprehensive analysis of the program, based on exit interviews since July 2012, found very high levels of satisfaction, with 71 percent of the 172 clients interviewed indicating they were either completely or mostly satisfied with the program. Another 21 percent indicated they were satisfied. |
| Patient Satisfaction anecdotal data | Assessment of how happy/satisfied patients are with their medical care, collected via anecdotal data (i.e., patient testimonials, informal feedback) | Participants consistently provide positive feedback on program services, both on satisfaction surveys and in personal comments to members of the care team. Many patients believe the program has had a major impact on their ability to understand and manage their health and has made them more aware of resources available to them. |
| Patient Experience quantitative data | Assessment of how well received various interactions were between the patient and the health system, collected via quantitative measures (i.e., surveys or other numerical estimates) | Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey scores for education are at 89 percent, higher than the national average of 84 percent (updated April 2014). |
| Patient Experience qualitative data | Assessment of how well received various interactions were between the patient and the health system, collected via qualitative approaches (i.e., interviews, focus groups) | Focus groups and feedback from patients indicate high levels of satisfaction with the services offered and the providers. Many homeless individuals who were not previously able to access medical and psychosocial services have expressed their gratitude and relief that they are now able to have immediate access to a comprehensive set of services offered in one place. |
| Patient Experience anecdotal data | Assessment of how well received various interactions were between the patient and the health system, collected via anecdotal data (i.e., patient testimonials, informal feedback) | Anecdotal reports suggest that patients now find it easier to access care for their resistant hypertension, encountering shorter wait and travel times for appointments (versus appointments in far-away specialty centers) |
| Methods for assessing improvement in patient experience or satisfaction measures | ||
| Comparison over time between two or more groups | Comparison, generally pre- and post-innovation implementation, among at least one site that received the innovation and one that did not receive the innovation | The percentage of participants receiving case management and/or substance abuse services increased during the 2-year trial, while use rates for these services remained flat in the control group. For example, the percentage of participants with access to disease management services rose from just over 10 percent at baseline to between 30 and 40 percent during the 24-month period after implementation; by contrast, access to these services remained below 10 percent over this time period for those in the control group. |
| Comparison over time for a single group | Comparison among a single group that received the innovation pre- and post-innovation | Patient satisfaction at [health system] increased between fiscal year 2008 (when the program was implemented) and fiscal year 2009, with increases in the following: staff response to patient pain (from 84.9 to 86.7 percent), staff teamwork in the provision of care (from 87.3 to 89.0 percent), and likelihood of recommending the hospital (from 66.9 to 70.0 percent). |
| Comparisons between groups | Comparison between two groups, one that received the innovation and one that did not after the implementation of the innovation. | Pilot clinic patients were 9.8 times more likely to have had an e-mail exchange with their provider within 14 days of a well-care visit than were control clinic patients, and 1.89 times more likely to get a telephone call or have an e-mail exchange with providers within 3 days of an ED visit. |
| Series/Trends in outcome over time | Use of time series or trend analysis from either before the implementation of the innovation to after the innovation over multiple time points, or after the implementation of the innovation over time, to assess continued impact of the innovation. | [Hospital’s] inpatient admissions increased by approximately 28 percent between 1997 and 2009 (from 5,866 to 7,533), well above the 10-percent increase experienced by the typical U.S. hospital. Over the same time period, admissions generally remained flat across [state] hospitals, with some experiencing declines. Between 1998 and 2012, annual outpatient visits grew from 93,347 to 196,386, a jump of 110 percent. ED visits reached an all-time high in fiscal 2012. |
| Post implementation analysis only | Assessment of innovation outcomes after the implementation of the innovation among the group that received the innovation only | More than 60 percent of survey respondents are drawn to the clinic's convenient location and the ability to minimize time away from work. In the absence of the clinic, 64 percent would have gone to their primary care provider, 14 percent to an urgent care facility, and 9 percent would have foregone care. |
| Anecdotal patient reports | Patient reported anecdotes of changes to outcomes from the innovation | Anecdotal feedback from patients and physicians suggests that the program has improved access and the quality of care and services within these practices, including the ability to coordinate care. |
| Organizational processes targeted for improvement by innovation | ||
| Patient health education | Education provided by a clinician or staff member to influence patient attitudes, knowledge, and beliefs of health behaviors | The advocate assists staff in delivering treatment and medication instructions by helping to create and communicate individualized treatment plans. For example, the advocate provides a culturally and socioeconomically relevant context for care instructions. In addition, the advocate uses a color-coded medication regimen calendar to overcome language/literacy barriers and promote adherence. |
| Chronic care management | Targeted management of patients with a single or multiple chronic diseases | Two nurses provide intensive short-term disease management for patients with uncontrolled chronic illnesses and transitional care to these patients after a hospitalization. The nurses also use motivational interviewing techniques to engage patients in self-managing their condition. |
| Follow-up care, referrals, and discharge processes | Updating the process for coordination follow-up care, referrals, and discharge to support patient care and ensure care continuity | At the end of the initial meeting, the pharmacist or pharmacy resident establishes a timetable for followup visits, which tend to be fairly frequent at first (every week or two) and less so over time. Each followup visit follows the same pattern, with a focus on monitoring adherence to, and the impact of, action steps discussed at the previous visit and addressing unmet needs. Most followup visits occur via video link with the patient in the clinic, but some occur by phone with the patient at home. |
| Intake/admissions | Updating the intake or admission process to support patient care | Peer specialists (i.e., people on staff who are in recovery from a major mental disorder) welcome clients to the center, offer them a healthy snack and beverage, listen to their needs, provide emotional support if necessary, and help them access and use the decision support tool. |
| Organizational changes | Changes to how the organization operates, including changing care teams, changing workflows, updating guidelines, and adding care capabilities | Information provided in January 2012 indicates that pharmacists now work as part of care teams that include a physician, nurse case manager, and social worker. Each care team is assigned to three to six primary care practices to offer more intensive care management for high-risk patients, defined as those that are at high risk of hospital readmission due to the presence of multiple comorbidities (e.g., diabetes, cardiovascular conditions, coronary obstructive pulmonary disease, etc.). As part of the care team, each pharmacist is responsible for medication management of approximately 200 to 300 high-risk patients who are covered under a capitated reimbursement system. |
| Access to care | Improving access to care through various means, including opening clinics, changing hours, allowing for more channels to connect to the care team (i.e., through telehealth or electronic communication) | Patients can get same-day, face-to-face appointments during regular work hours. To provide this service, prescheduled appointments are generally scheduled during morning hours, which leaves about two-thirds of her day open for same-day visits. The doctor is also willing to stay late if patients need to see her that day. |
| Patient safety | Updating processes to promote patient safety, including physical and health safety | Hand hygiene, chlorhexidine gluconate bath cloths, infection prevention intravenous cap use, and other measures help prevent infections. Any identified infection triggers a drilldown to pinpoint where the prevention process broke down; patients with diarrhea are routinely tested for Clostridium difficile |
| Physical environment modification | Changes to the physical environment of the clinic or hospital to support patient care | The critical care unit is designed in a unique horseshoe shape, with private rooms forming the horseshoe, an outer ring that serves as the visitor corridor, and a center area for professionals. Patient rooms have doors that access both the visitor and professional areas. Each room also has a private, handicapped-accessible bathroom, which improves infection control. Visitors can stay around the clock; multiple visitor lounges are located around the outside ring, some with sleeping accommodations and shower facilities. |
| Physician and non-physician training | Training provided to physicians and non-physician staff to support patient care | The mental health team offers a full curriculum of training to the centers' professionals related to the identification and management of mental and behavioral health problems in seniors, including depression, anxiety, personality disorders, mood and psychotic disorders, difficult behaviors, and dementia. The curriculum is customized to specific disciplines (e.g., geriatric home care aides, social workers, transportation staff, rehabilitation staff) and to the cultural/linguistic needs of center staff. Training sessions are offered once every few months to each discipline within each center. In 2009, the mental health team conducted more than 30 training sessions at the eight sites, each involving approximately 20 participants. |
| Quality measurement and pay-for-performance | Measuring quality of care and reviewing those measures, or tying quality performance to payment | [Hospital] monitors performance through a monthly report card that tracks progress towards the goals within each pillar, quarterly patient satisfaction surveys, and annual physician and employee satisfaction surveys. In addition, Sharp administers the Agency for Healthcare Research and Quality Hospital Survey of Patient Safety Culture annually. All data are shared throughout the organization via quality councils, intranet display, clinical and administrative practice councils, staff meetings, communication boards, and e-mail. |
| Technology | Implementation or use of technology, including hardware and software, to support patient care. | Primary care clinicians have access to the Web-based software iHealth that supports longitudinal case management of patients according to established protocols. |
Two coders (Author#1, Author#2) coded the full set of innovations. The senior principal investigator (Author#2) audited a subset of the other coder's coded innovations to ensure consistency between coders. We pilot tested the codebook and coding process on a subset of innovation profiles to assess inter-rater reliability, calculating a kappa value of 0.87, indicating very good consistency.39 Coders met periodically to review outstanding questions and to clarify the coding. Of the 481 innovations identified by keyword, 348 that included a patient satisfaction or patient experience measure were included in our final analytic sample.
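For transparency about the agreement statistic, the snippet below shows one way to compute Cohen's kappa for two coders; the example judgments are hypothetical and merely illustrate the calculation, not our actual coding data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical pilot coding: 1 = code applied, 0 = code not applied,
# one entry per profile-code judgment for each coder.
coder1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder2 = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]

kappa = cohen_kappa_score(coder1, coder2)
print(f"Cohen's kappa: {kappa:.2f}")  # values above roughly 0.80 indicate very good agreement
```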
We calculated counts and percentages of innovations and innovation characteristics to present overall frequencies and to compare innovations by characteristic. Data were stratified by type of patient satisfaction/experience survey, patient experience domain, organizational process targeted to effect change, and the main goal of the innovation. Additionally, we show trends over time in how innovations were studied.
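The stratified counts and percentages reported in Tables 2-4 can be produced with simple cross-tabulations; the sketch below assumes a coded dataset with one row per innovation and hypothetical column names.

```python
import pandas as pd

# Hypothetical coded dataset: one row per innovation with its coded attributes.
coded = pd.DataFrame({
    "survey_type": ["satisfaction", "experience", "non-survey", "satisfaction"],
    "evaluation_design": ["post_only", "over_time", "post_only", "anecdotal"],
})

# Counts of evaluation design by survey type, plus within-column percentages.
counts = pd.crosstab(coded["evaluation_design"], coded["survey_type"])
percents = pd.crosstab(coded["evaluation_design"], coded["survey_type"],
                       normalize="columns") * 100
print(counts)
print(percents.round(1))
```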
Study protocols were approved by our Human Subjects Protection Committee (IRB_Assurance_No: FWA00003425; IRB Number: IRB00000051).
Results
Overview of Included Innovations
Of the 933 innovations included in the AHRQ IE, 348 included at least one patient satisfaction or patient experience measure. On average, each included innovation had 2.4 measures of patient satisfaction or experience. One hundred eighty-seven innovations included a patient satisfaction measure (54%), 64 included a patient experience measure (18%), and 133 did not include a survey measure but assessed a patient experience concept using a non-survey mode (38%); these categories are not mutually exclusive. Among innovations with patient satisfaction measures, the majority measured outcomes quantitatively (n=135, 72% of patient satisfaction innovations), followed by anecdotally (57, 30%) and qualitatively (6, 3%). Among innovations with patient experience measures, the majority measured outcomes quantitatively (n=54, 86% of patient experience innovations), followed by qualitatively (8, 13%) and anecdotally (4, 6%).
Measuring Outcomes
The most common measures used in the innovations were overall ratings of patient satisfaction or experience of care (n=211, 61% of all innovations), followed by measures of access to care (182, 52%) and provider-patient communication (44, 12%). Table 2 shows the frequency of patient care domains by type of patient data collection (i.e., patient satisfaction survey, patient experience survey, or non-survey mode). Among patient surveys, the most common patient care domains were overall satisfaction or experience, followed by access to care; communication with providers; and care coordination, follow-up, and discharge. Among innovations that used a patient satisfaction survey, overall satisfaction measures were included in 90%, followed by access to care (27%) and communication with providers (11%). Similarly, among innovations that used a patient experience survey, the most common domain was the overall rating of patient experience (59%). However, innovations using a patient experience survey more often measured specific aspects of patient care than those using a patient satisfaction survey. For example, they more often measured access to care (39% using a patient experience survey vs. 27% using a patient satisfaction survey); communication with providers (25% vs. 11%); and coordination of care, follow-up, and discharge planning (27% vs. 5%). The remaining domains each appeared in less than 10% of innovations.
Table 2:
Domain of Measures Included in Innovation Exchange Profiles Overall and By Type of Patient Data Collection
Cells show the percentage (number) of innovations.

| Domain of Measure | Patient Satisfaction Survey (N=187) | Patient Experience Survey (N=64) | Non-Survey Data (N=133) | Total Innovations (N=348) |
|---|---|---|---|---|
| Overall rating of patient satisfaction/ experience | 89.8% (168) | 59.4% (38) | 3.8% (5) | 60.6% (211) |
| Access to care | 26.7% (50) | 39.1% (25) | 80.5% (107) | 52.3% (182) |
| Communication with providers | 11.2% (21) | 25% (16) | 5.3% (7) | 12.6% (44) |
| Care coordination, follow-up, and discharge information | 4.8% (9) | 26.6% (17) | 5.3% (7) | 9.5% (33) |
| Customer service | 3.2% (6) | 7.8% (5) | 1.5% (2) | 3.7% (13) |
| Cultural competence | 2.1% (4) | 6.3% (4) | 2.3% (3) | 3.2% (11) |
| Shared decision making | 2.1% (4) | 4.7% (3) | 0.8% (1) | 2.3% (8) |
| Health promotion and education | 0.5% (1) | 6.3% (4) | 2.3% (3) | 2.3% (8) |
| Medication and pain management | 2.7% (5) | 3.1% (2) | 0% (0) | 2% (7) |
| Environment | 0% (0) | 1.6% (1) | 0% (0) | 0.3% (1) |
In contrast, the vast majority of measures collected in a non-survey mode were measures of access to care (81% of non-survey measures), related primarily to the number of patients seen after an innovation was implemented or to the ability to access care (e.g., opening a new clinic).
Data Collection and Assessment
The most common study design for examining impact on patient satisfaction or patient experience was post-implementation only (65%). No other assessment strategy was used more than 20% of the time. The most rigorous approach (comparison over time for two groups) was used least often (3% of innovations), whereas the least rigorous approaches (post-implementation only and anecdotal patient reports) were used in over 75% of innovations.
Table 3 describes methods of reporting improvements in patient data overall and by type of patient data collection mode. Methods of assessing improvement varied by collection mode. Post-implementation only assessment was the most common method overall, but it was used more frequently by innovations that administered a patient satisfaction survey or collected non-survey data than by those that administered a patient experience survey (63% and 58% vs. 48%). Innovations that used patient experience surveys employed more rigorous methods of assessing impact: they more often used comparisons over time than innovations using patient satisfaction surveys or non-survey data (30% vs. 14% and 12%, respectively) and less often relied on anecdotal patient reports than those using patient satisfaction surveys (9% vs. 25%).
Table 3:
Method of Reporting Improvement in Patient Data Overall and By Type of Patient Data Collection
Cells show the percentage (number) of innovations.

| Method of Assessing Improvement | Patient Satisfaction Survey (N=187) | Patient Experience Survey (N=64) | Non-Survey Data (N=133) | Total Innovations (N=348) |
|---|---|---|---|---|
| Comparison over time for two groups | 2.1% (4) | 3.1% (2) | 2.3% (3) | 2.6% (9) |
| Comparison over time only | 13.9% (26) | 29.7% (19) | 12% (16) | 17.5% (61) |
| Comparison between groups only | 10.2% (19) | 14.1% (9) | 4.5% (6) | 9.8% (34) |
| Series data/trends | 3.2% (6) | 7.8% (5) | 3% (4) | 4.3% (15) |
| Post implementation only | 62.6% (117) | 48.4% (31) | 57.9% (77) | 64.7% (225) |
| Anecdotal patient reports | 25.1% (47) | 9.4% (6) | 0% (0) | 15.2% (53) |
Organizational Process Targeted to Improve Patient Satisfaction and Experience
Innovations targeted and changed a variety of organizational processes to improve patient satisfaction or patient experience. The majority of innovations implemented organizational changes (n=269, 77% of all innovations), including changes to staffing, culture, and workflows; followed by changes to access to care (185, 53%) and to follow-up care, referrals, and discharge processes (180, 52%). Less than 10% of innovations made physical environment changes (31, 9%) or patient safety changes (13, 4%). On average, innovations targeting patient satisfaction or patient experience made 4.4 organizational process changes, those targeting clinical outcomes made 5.1, and those targeting a mix of clinical and patient experience goals made 4.5.
Table 4 shows the organizational process changes made as part of innovations according to the main stated goal of the innovation (i.e., improvement in a clinical outcome, patient satisfaction/experience, or a mix of the two). Overall, organizational changes and access to care changes were the most common regardless of the main goal of the innovation. Organizational changes were made more often when the innovation aimed to improve clinical outcomes than when it aimed to improve patient satisfaction and experience (89% vs. 72%). In contrast, access to care changes were made more often in innovations aimed at improving patient satisfaction and experience than in those aimed at improving clinical outcomes (56% vs. 47%).
Table 4:
Organizational Processes Targeted for Improvement By Main Goal of Innovation
Cells show the percentage (number) of innovations with each main goal.

| Organizational Processes Targeted for Improvement | Clinical (N=75) | Mixed (N=165) | Patient Satisfaction/Experience (N=108) | Total (N=348) |
|---|---|---|---|---|
| Access to care | 46.7% (35) | 54.5% (90) | 55.6% (60) | 53.2% (185) |
| Follow-up care, referrals, and discharge process | 64% (48) | 54.5% (90) | 38.9% (42) | 51.7% (180) |
| Technology | 36% (27) | 40.6% (67) | 33.3% (36) | 37.4% (130) |
| Chronic care management | 21.3% (16) | 29.1% (48) | 20.4% (22) | 24.7% (86) |
| Patient health education | 45.3% (34) | 36.4% (60) | 28.7% (31) | 35.9% (125) |
| Organizational changes (staffing, culture, etc.) | 89.3% (67) | 75.2% (124) | 72.2% (78) | 77.3% (269) |
| Physician and non-physician training | 22.7% (17) | 26.7% (44) | 31.5% (34) | 27.3% (95) |
| Intake/admission | 50.7% (38) | 38.8% (64) | 38% (41) | 41.1% (143) |
| Physical environment modifications | 8% (6) | 9.7% (16) | 8.3% (9) | 8.9% (31) |
| Quality measurement and pay-for-performance | 18.7% (14) | 24.8% (41) | 18.5% (20) | 21.6% (75) |
| Patient safety | 5.3% (4) | 4.8% (8) | 0.9% (1) | 3.7% (13) |
| Other process | 1.3% (1) | 1.2% (2) | 0.9% (1) | 1.1% (4) |
Most organizational process changes were more common in innovations aimed at changing clinical outcomes than in those aimed at changing patient satisfaction and experience, including changes to follow-up care, referrals, and discharge processes (64% vs. 39%); patient health education (45% vs. 29%); and intake/admissions (51% vs. 38%). The main exception was physician and non-physician training, which was more common in innovations aimed at changing patient experience and satisfaction than in those aimed at clinical outcomes (32% vs. 23%).
Changes in Innovation Reporting Over Time
The largest share of innovations (24%) was reported to AHRQ in 2008 or earlier, with fairly consistent reporting of innovations from 2009 through 2014. As the IE lost funding in 2015 and 2016, however, the number of reported innovations dropped sharply.
Figure 1 depicts the number of innovations published per year, the share of innovations that aimed to improve patient experience or satisfaction as the main goal, and the share that used patient experience measures to assess the main goal of the innovation. From 2008 to 2014, there was a slight increase in innovations focused on patient experience or satisfaction as the main goal and in innovations that used patient experience measures to assess their goals; we excluded 2015 and 2016 because of low overall counts.
Figure 1:
Trend in use of Patient Experience over time
Use of Validated Patient Experience Surveys
Overall, very few innovations (7%; n=26) used a psychometrically tested and validated patient experience survey (such as a CAHPS survey or a proprietary Press Ganey survey). Only nine innovations (3% overall) used the CAHPS survey (either CG-CAHPS or HCAHPS) to assess improvements in patient experience.
Discussion
Our extended review of the AHRQ IE database shows that although there is substantial attention to patient-centeredness, as evidenced by the widespread use of patient satisfaction and patient experience measures in health system innovations, most measures used were patient satisfaction measures, often used to assess changes in access to care. These innovations also tended to use weak study designs, evaluating change with post-implementation only assessments or anecdotal reports. In contrast, innovations that measured changes in patient experience tended to use more rigorous approaches, such as comparisons over time. Overall, the methodological rigor used to assess patient-centeredness in innovations is low, as evidenced by the small share of innovations using a validated patient survey (7%), of which only a small number used the CAHPS survey (3%).
Similar to Weinick et al. (2014),35 we found that more than half of innovations used patient satisfaction surveys rather than patient experience surveys, and that the majority of innovations using patient experience surveys relied on non-validated, homegrown surveys. Our findings also confirmed that patient experience surveys were associated with more specific aspects of patient experience as targeted topics of improvement, with patient experience innovations most often focused on access to care; communication with providers; and care coordination, follow-up, and discharge planning. Even with additional years of IE innovation data available, the use of patient experience surveys to assess patient experience outcomes remains limited.
When looking at innovations over time, we found that innovations aiming to improve patient experience increased and that patient experience measures were used more frequently to assess the intended goals of innovations. These findings point to a greater emphasis on patient-centeredness from 2008 to 2016 and to the continued importance of patient experience and patient satisfaction measures in understanding and assessing patient-centeredness.
This study has several limitations. First, our study sample ends in 2016 because the AHRQ Innovations Exchange was discontinued, although the review does include all innovations ever published in the AHRQ IE. It is likely that health systems have continued to improve the methodological rigor with which they assess patient satisfaction and patient experience since 2016. We also did not see appreciable differences over time in the innovations reported to the IE database. Furthermore, the AHRQ IE was compiled from user submissions and AHRQ review of grants and may therefore reflect reporting bias in the types of innovations included. This bias is partially mitigated by AHRQ having invited participants to submit innovations identified through a review of their grants and projects. Nevertheless, generalizing from the innovations in the IE database should be done cautiously.
Conclusion
While patient satisfaction is measured routinely in innovations, satisfaction measures typically lack the actionability and specificity of patient experience measures. Satisfaction is shaped by the preferences of the patient or respondent, which makes it difficult to assess patient-centeredness. In contrast, patient experience measures capture how often specific actions and behaviors occur when a patient receives care, allowing levels of those behaviors to be compared over time or across units (such as health care organizations or providers), areas for improvement to be identified, and ongoing improvement to be targeted more precisely. Health systems should continue to invest in patient-centeredness and in using patient experience measures to monitor change and evaluate innovations, particularly CAHPS surveys, which are designed for tracking and benchmarking aspects of patient care experience for which patients are the best source of information. CAHPS surveys are the standard for collecting information about patient experience of care in the United States. Continued review of quality improvement efforts using patient experience measures is needed to foster an atmosphere of continued measurement and improvement of patient-centeredness.
Acknowledgements:
We acknowledge Lynn Polite for administrative support.
Funding disclosures:
All phases of this study were supported by a cooperative agreement from the Agency for Healthcare Research and Quality (AHRQ; U18HS025920 and U18 HS029321). The funder/sponsor did not participate in the work.
Footnotes
Conflict of interest statement: The authors declare that there is no conflict of interest.
References
- 1. Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. National Academy Press; 2001.
- 2. Epstein RM, Street RL Jr. The values and value of patient-centered care. Ann Fam Med. Mar-Apr 2011;9(2):100–3. doi:10.1370/afm.1239
- 3. NEJM Catalyst. What is patient-centered care? NEJM Catalyst. 2017.
- 4. Davies E, Cleary PD. Hearing the patient's voice? Factors affecting the use of patient survey data in quality improvement. Qual Saf Health Care. Dec 2005;14(6):428–32. doi:10.1136/qshc.2004.012955
- 5. Reeves R, Seccombe I. Do patient surveys work? The influence of a national survey programme on local quality-improvement initiatives. Qual Saf Health Care. Dec 2008;17(6):437–41. doi:10.1136/qshc.2007.022749
- 6. Darby C, Hays RD, Kletke P. Development and evaluation of the CAHPS hospital survey. Health Serv Res. Dec 2005;40(6 Pt 2):1973–6. doi:10.1111/j.1475-6773.2005.00490.x
- 7. Goldstein E, Farquhar M, Crofton C, Darby C, Garfinkel S. Measuring hospital care from the patients' perspective: an overview of the CAHPS Hospital Survey development process. Health Serv Res. Dec 2005;40(6 Pt 2):1977–95. doi:10.1111/j.1475-6773.2005.00477.x
- 8. Hargraves JL, Hays RD, Cleary PD. Psychometric properties of the Consumer Assessment of Health Plans Study (CAHPS) 2.0 adult core survey. Health Serv Res. Dec 2003;38(6 Pt 1):1509–27.
- 9. Agency for Healthcare Research and Quality. About CAHPS. Accessed December 8, 2023. https://www.ahrq.gov/cahps/about-cahps/index.html
- 10. Dyer N, Sorra JS, Smith SA, Cleary PD, Hays RD. Psychometric properties of the Consumer Assessment of Healthcare Providers and Systems (CAHPS®) Clinician and Group Adult Visit Survey. Med Care. Nov 2012;50 Suppl:S28–34. doi:10.1097/MLR.0b013e31826cbc0d
- 11. Elliott MN. Components of care vary in importance for overall patient-reported experience by type of hospitalization. Med Care. 2009:842–849.
- 12. Evensen CT, Yost KJ, Keller S, et al. Development and testing of the CAHPS Cancer Care Survey. J Oncol Pract. Nov 2019;15(11):e969–e978. doi:10.1200/JOP.19.00039
- 13. Giordano LA, Elliott MN, Goldstein E, Lehrman WG, Spencer PA. Development, implementation, and public reporting of the HCAHPS survey. Med Care Res Rev. Feb 2010;67(1):27–37. doi:10.1177/1077558709341065
- 14. Hays RD, Berman LJ, Kanter MH, et al. Evaluating the psychometric properties of the CAHPS Patient-centered Medical Home survey. Clin Ther. May 2014;36(5):689–696.e1. doi:10.1016/j.clinthera.2014.04.004
- 15. Hays RD, Chong K, Brown J, Spritzer KL, Horne K. Patient reports and ratings of individual physicians: An evaluation of the DoctorGuide and Consumer Assessment of Health Plans Study provider-level surveys. Am J Med Qual. Sep-Oct 2003;18(5):190–6.
- 16. Hays RD, Mallett JS, Gaillot S, Elliott MN. Performance of the Medicare Consumer Assessment of Health Care Providers and Systems (CAHPS) physical functioning items. Med Care. Feb 2016;54(2):205–9. doi:10.1097/MLR.0000000000000475
- 17. Hays RD, Martino S, Brown JA, et al. Evaluation of a care coordination measure for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) Medicare survey. Med Care Res Rev. Apr 2014;71(2):192–202. doi:10.1177/1077558713508205
- 18. Hays RD, Shaul JA, Williams VS, et al. Psychometric properties of the CAHPS 1.0 survey measures. Consumer Assessment of Health Plans Study. Med Care. Mar 1999;37(3 Suppl):MS22–31. doi:10.1097/00005650-199903001-00003
- 19. Morales LS, Weech-Maldonado R, Elliott MN, Weidmer B, Hays RD. Psychometric properties of the Spanish Consumer Assessment of Health Plans Survey (CAHPS). Hisp J Behav Sci. 2003;25(3):386–409.
- 20. Rothman AA, Park H, Hays RD, Edwards C, Dudley RA. Can additional patient experience items improve the reliability of and add new domains to the CAHPS hospital survey? Health Serv Res. Dec 2008;43(6):2201–22. doi:10.1111/j.1475-6773.2008.00867.x
- 21. Schmocker RK, Cherney Stafford LM, Siy AB, Leverson GE, Winslow ER. Understanding the determinants of patient satisfaction with surgical care using the Consumer Assessment of Healthcare Providers and Systems surgical care survey (S-CAHPS). Surgery. Dec 2015;158(6):1724–33. doi:10.1016/j.surg.2015.06.018
- 22. Solomon LS, Hays RD, Zaslavsky AM, Ding L, Cleary PD. Psychometric properties of a group-level Consumer Assessment of Health Plans Study (CAHPS) instrument. Med Care. Jan 2005;43(1):53–60.
- 23. Toomey SL, Elliott MN, Zaslavsky AM, et al. Variation in family experience of pediatric inpatient care as measured by Child HCAHPS. Pediatrics. Apr 2017;139(4). doi:10.1542/peds.2016-3372
- 24. Weech-Maldonado R, Morales LS, Spritzer K, Elliott M, Hays RD. Racial and ethnic differences in parents' assessments of pediatric care in Medicaid managed care. Health Serv Res. Jul 2001;36(3):575–94.
- 25. Weidmer BA, Cleary PD, Keller S, et al. Development and evaluation of the CAHPS (Consumer Assessment of Healthcare Providers and Systems) survey for in-center hemodialysis patients. Am J Kidney Dis. Nov 2014;64(5):753–60. doi:10.1053/j.ajkd.2014.04.021
- 26. Beckett MK, Quigley DD, Lehrman WG, et al. Interventions and hospital characteristics associated with patient experience: An update of the evidence. Med Care Res Rev. In press.
- 27. Peikes D, Dale S, Ghosh A, et al. The Comprehensive Primary Care Initiative: Effects on spending, quality, patients, and physicians. Health Aff (Millwood). Jun 2018;37(6):890–899. doi:10.1377/hlthaff.2017.1678
- 28. Quigley D, Qureshi N, Rybowski L, et al. Summary of the 2020 AHRQ research meeting on 'advancing methods of implementing and evaluating patient experience improvement using consumer assessment of healthcare providers and systems (CAHPS®) surveys'. Expert Rev Pharmacoecon Outcomes Res. Sep 2022;22(6):883–890. doi:10.1080/14737167.2022.2064848
- 29. Quigley D, Wiseman S, Farley D. Improving performance for health plan customer service: A case study of a successful CAHPS quality improvement intervention. 2007.
- 30. Quigley D, Wiseman S, Farley D. Improving hospital inpatient nursing care: A case study of one hospital's intervention to improve the patient's care experience. 2010.
- 31. Quigley DD, Elliott MN, Qureshi N, Predmore Z, Hays RD. How has research used CAHPS Clinician and Group Patient Experience survey data? A systematic review. JPCRR. In press.
- 32. Quigley DD, Elliott MN, Slaughter ME, et al. Shadow coaching improves patient experience with care, but gains erode later. Med Care. Nov 2021;59(11):950–960. doi:10.1097/mlr.0000000000001629
- 33. Quigley DD, Elliott MN, Slaughter ME, Talamantes E, Hays RD. Shadow coaching improves patient experience for English-preferring patients but not for Spanish-preferring patients. J Gen Intern Med. 2023 (in press).
- 34. Beckett MK, Quigley DD, Cohea CW, et al. Trends in HCAHPS survey scores, 2008-2019: A quality improvement perspective. Under review.
- 35. Weinick RM, Quigley DD, Mayer LA, Sellers CD. Use of CAHPS patient experience surveys to assess the impact of health care innovations. Jt Comm J Qual Patient Saf. Sep 2014;40(9):418–27. doi:10.1016/s1553-7250(14)40054-0
- 36. Agency for Healthcare Research and Quality. Health Care Innovations Exchange. Accessed December 8, 2023. https://www.ahrq.gov/innovations/index.html
- 37. Mitka M. New innovation web site helps spread ideas through the health care community. JAMA. May 28 2008;299(20):2377. doi:10.1001/jama.299.20.2377
- 38. Saldaña J. The coding manual for qualitative researchers. Sage Publications Ltd.; 2009.
- 39. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. Mar 1977;33(1):159–74.

