Abstract
Background and Objectives
Delivering care as a patient-centered medical home (PCMH) is being widely adopted by primary care practices across the United States to better meet patient needs. A key PCMH element is measuring patient experience for practice improvement. The National Committee for Quality Assurance (NCQA) PCMH recognition program requires practices to measure patient experience and to engage in continuous practice/quality improvement, both to attain PCMH recognition and throughout full PCMH transformation. NCQA recommends but does not require that practices administer the Consumer Assessment of Healthcare Providers and Systems (CAHPS®) clinician and group patient experience survey (CG-CAHPS) plus 14 PCMH CAHPS items, known as the CAHPS PCMH survey. We examine the aspects of patient experience measured by practices with varying numbers of years on their journey of PCMH transformation.
Methods
We randomly selected practices from the 2008–2017 NCQA directory of practices that had applied for PCMH recognition, stratified by region, physician count, number of years and level of PCMH recognition, and use of the CAHPS PCMH survey. We collected characteristics of the practices from practice leader(s) knowledgeable about the practice’s PCMH history and patient experience data. We confirmed the patient experience surveys used during their PCMH history and requested copies of their non-CAHPS survey(s). For practices not administering the recommended CG-CAHPS survey (53/105 practices), we obtained and coded the content of their non-CAHPS surveys (68%; 36/53). We mapped the patient experience domains and specific measures to the CG-CAHPS survey (versions 2.0 and 3.0), the CAHPS PCMH item set (versions 2.0 and 3.0), and the available CG-CAHPS supplemental items.
Results
Whether or not practices administered the CG-CAHPS survey, most addressed topics contained in it, such as Access to care, Provider communication, Office staff helpfulness/courteousness, Care coordination, and Shared decision making. The most common CAHPS measures included were Office staff helpfulness/courteousness and Provider communication. Common non-CAHPS measures included Ease of scheduling, Being informed about delays, and Provider helpfulness/courteousness.
Conclusion
NCQA PCMH practices included CAHPS items on their patient experience surveys even if they did not administer the full CG-CAHPS survey or the recommended CAHPS PCMH survey. To enhance the usefulness of patient experience surveys for practices undergoing PCMH changes, additional CAHPS measures could be developed related to key areas of PCMH change, including expanded access to care (e.g., after-hours and weekend visits, ease of scheduling, being informed about delays), use of shared decision making, and improvements in provider communication (e.g., whether the provider is courteous and communication between other clinical staff and the patient). These additional measures would assist practice leaders in capturing the breadth and depth of their PCMH transformation and its influence on providing more patient-centered care. Developing such items would help standardize the measurement of changes related to patient experience during PCMH transformation. Research is needed to determine whether a CAHPS survey is the best source of this information.
Keywords: patient experience, patient-centered medical home, survey measures, CAHPS®
Introduction
The patient-centered medical home (PCMH) model of primary care delivery has a rich history beginning in the 1960s.1 The model was originally developed to provide comprehensive and continuous care to children,1 but was adapted to help redesign primary care, which had become fragmented due to the increased use of specialty care, the growth of specialized and technologically sophisticated care,2–5 and the decreasing number of residents choosing primary care.6 Practices can obtain certification as a PCMH from several organizations, such as the National Committee for Quality Assurance (NCQA). The NCQA PCMH Recognition program is the most widely adopted program in the country, with an estimated 13,000 practices obtaining NCQA PCMH recognition in the United States from 2008 to 2017.7
NCQA provides guidance, resources, and oversight for obtaining recognition as a PCMH. Practices applied to NCQA for PCMH recognition and received a Level 1, Level 2, or Level 3 designation reflecting how well they met the PCMH standards (Level 1: 35–59 points; Level 2: 60–84 points; Level 3: 85–100 points; all three levels required 6 of 6 must-pass elements). Under NCQA, Level 3 signifies that the practice has become an advanced primary care practice: it has undergone a transformation in quality and safety enhancement and has established a foundation for coordination across the continuum of care. For example, a Level 3 practice offers extended hours and care management for patients with chronic illnesses, which are core PCMH must-pass elements, and some practices also meet optional criteria such as behavioral health integration, referrals to specialists, or other advanced services. Level 3 thus indicated that the practice was an advanced primary care practice with full NCQA PCMH recognition.
The six PCMH standards, each with multiple elements, evolved roughly every three years (i.e., 2008, 2011, 2014, 2017). To maintain their recognition and meet the evolving standards, Level 3 PCMH practices typically continued to pursue changes related to additional or new aspects of the standards (either core or optional) and re-applied to maintain their recognition as an advanced primary care practice. Initial recognition as a Level 3 PCMH therefore only starts a practice’s journey of PCMH transformation.
The overarching goal of PCMH transformation is inherently focused on improving care provided to patients and includes a requirement that practices measure patient experience and use these data to implement quality improvement (QI) to improve patient experience. The collection of patient experience survey data is a vital part of performance measurement and the associated QI activities that are at the center of transforming into a PCMH and maintaining PCMH recognition.8 Practices, however, are not required by NCQA to collect patient experience data using a specific survey. Rather, NCQA recommends, but does not require, that practices administer the Consumer Assessment of Healthcare Providers and Systems (CAHPS®) clinician and group survey (CG-CAHPS). CG-CAHPS was designed to inform practices about patient experiences of ambulatory care in areas for which the patient is the best source of information and to be used for QI to make care experiences higher quality and more patient-focused.9–11 CAHPS surveys, developed in 1995, are now the US standard for collecting information about patient experiences of care.12–14 In 2009, the CAHPS PCMH item set was developed in collaboration with NCQA to reflect the NCQA PCMH standards. A survey that combines the CAHPS PCMH item set and the CG-CAHPS core items is known as the CAHPS PCMH survey.15 As with the CG-CAHPS survey, NCQA recommends that practices use the CAHPS PCMH survey to guide their transformation process, but it is optional.
Despite the large number of practices recognized by NCQA as a PCMH, no research to date has summarized the content of the patient experience surveys administered by practices as they make changes to attain initial recognition as a PCMH and then over the course of their PCMH transformation. We collected characteristics of practices from practice leaders knowledgeable about the practice’s PCMH history and patient experience data. We asked about the history of the patient experience surveys used for their PCMH application and PCMH transformation and examined the content of the surveys administered during PCMH transformation. The goal was to examine and describe the patient experience content areas that practice leaders included in their patient experience surveys during PCMH transformation. We reviewed and coded the content of the survey questions and identified whether the content aligned with the NCQA-recommended CG-CAHPS 2.0 or 3.0 domains, the recommended CAHPS PCMH 2.0 or 3.0 items, or the CG-CAHPS supplemental items.
Methods
Design.
We randomly selected practices from across the US from the census of ~15,000 practices that had applied for NCQA PCMH recognition from 2008, when the NCQA recognition program began, to 2017,16 stratified by region, physician count, PCMH history, and use of the NCQA-recommended CAHPS PCMH survey (never, previously, or currently administered). Practices were sampled across these key dimensions to obtain a broad range of practices across the US pursuing PCMH recognition. Our analyses first examined results by whether the practice used a CAHPS survey or a non-CAHPS survey and then also by type of CAHPS survey used (i.e., CG-CAHPS survey, CAHPS PCMH survey). All practices had applied for NCQA PCMH recognition and received a Level 1, Level 2, or Level 3 designation of how well they met the NCQA PCMH standards: Level 1 or Level 2 indicates an aspiring PCMH practice, while Level 3 designates the attainment of NCQA’s PCMH Recognition. We measured PCMH history by both the level of NCQA PCMH recognition and the length of time a practice had maintained Level 3 PCMH recognition status: less than three years, three to five years, or five years or more. We included adult practices and excluded pediatric practices.
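To make the stratified selection concrete, the sketch below groups a practice roster into strata defined by region, practice size, PCMH history, and CAHPS use, and then draws a fixed number of practices from each stratum. This is an illustration only, not the study’s sampling code; the field names, category labels, and per-stratum quota are hypothetical.

```python
# Illustrative sketch (not the study's sampling code): stratified random selection
# of practices by region, physician count, PCMH history, and CAHPS PCMH survey use.
import random
from collections import defaultdict

def stratified_sample(practices, per_stratum=1, seed=2017):
    """Group practice records into strata and randomly draw up to `per_stratum` from each."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for p in practices:
        key = (p["region"], p["size"], p["pcmh_history"], p["cahps_use"])
        strata[key].append(p)
    sample = []
    for members in strata.values():
        rng.shuffle(members)
        sample.extend(members[:per_stratum])
    return sample

# Toy roster standing in for the ~15,000-practice NCQA directory.
rng = random.Random(0)
practices = [
    {"id": i,
     "region": rng.choice(["Northeast", "South", "Midwest", "West"]),
     "size": rng.choice(["1-2", "3-5", "6-9", "10-24", "25+"]),
     "pcmh_history": rng.choice(["Level 1-2", "Level 3 <3y", "Level 3 3-5y", "Level 3 5y+"]),
     "cahps_use": rng.choice(["never", "previously", "currently"])}
    for i in range(15000)
]
print(len(stratified_sample(practices)), "practices selected, one per stratum")
```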
We collected practice characteristics and information about PCMH changes and patient experience data from practice leaders knowledgeable about the practice’s PCMH history and patient experience data. Practice leaders were identified based on their knowledge of the practice’s PCMH application process, changes, and use of data to meet PCMH goals. We confirmed the patient experience surveys used during their PCMH history and requested copies of their non-CAHPS patient experience survey(s). All study protocols were approved by RAND’s Human Subjects Protection Committee (HSPC) (FWA00003425), the IRB of record, and by the Office of Management and Budget (OMB).
Data.
We collected information on 105 practices. We confirmed the patient experience surveys used during their PCMH history and requested copies of non-CAHPS patient experience surveys. To obtain a large stratified sample of practices, data were collected from June 2017 to June 2018.
Analytic Approach.
We uploaded the non-CAHPS surveys into Dedoose,17 a web application for managing, analyzing, and presenting qualitative and mixed-method research data. We developed an a priori codebook and identified categories for the codes.18 Three coders (LX, CP, NQ) coded the surveys independently, noting all patient experience topics. During team meetings, the coding team explored the data to reach consensus on codes, identify discrepancies, refine concepts, and define codes for analysis.19 We conducted interrater reliability exercises with the senior subject matter expert (DQ) and the three coders (LX, NQ, CP) to refine codes and code descriptions. After code refinement, we estimated a pooled kappa of 0.94, indicating very good agreement.20,21 We conducted a content analysis to identify the domains and measures on the surveys and then mapped the coded list of patient experience domains and specific measures to the content of the CG-CAHPS 2.0 and 3.0 surveys, the CAHPS PCMH item set, and the CG-CAHPS supplemental items. All results are reported as frequencies and proportions of surveys with questions mapped to each patient experience domain. The CG-CAHPS 2.0 and 3.0 surveys contain a 0–10 overall rating and four composite measures: Access to care, Provider communication, Care coordination, and Helpfulness/courtesy of office staff. The CAHPS PCMH item set consists of 14 items related to the PCMH standards, covering Access to care, Provider communication, Care coordination, Shared decision making, and Self-management support.
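A pooled kappa averages observed and chance agreement across codes before computing kappa. The sketch below is a minimal illustration of that calculation for two coders applying binary codes to the same segments; it is not the study’s analysis code, and the code names and ratings are hypothetical toy data.

```python
# Illustrative sketch of a pooled Cohen's kappa across multiple codes:
# observed and chance agreement are averaged over codes before computing kappa.
from typing import Dict, List

def pooled_kappa(coder_a: Dict[str, List[int]], coder_b: Dict[str, List[int]]) -> float:
    """coder_a/coder_b map each code name to 0/1 applications over the same segments."""
    po_list, pe_list = [], []
    for code in coder_a:
        a_vals, b_vals = coder_a[code], coder_b[code]
        n = len(a_vals)
        # Observed agreement: proportion of segments where both coders agree.
        po = sum(1 for x, y in zip(a_vals, b_vals) if x == y) / n
        # Chance agreement from each coder's marginal rate of applying the code.
        pa, pb = sum(a_vals) / n, sum(b_vals) / n
        pe = pa * pb + (1 - pa) * (1 - pb)
        po_list.append(po)
        pe_list.append(pe)
    mean_po = sum(po_list) / len(po_list)
    mean_pe = sum(pe_list) / len(pe_list)
    return (mean_po - mean_pe) / (1 - mean_pe)

# Toy example: two coders applying three codes to ten survey items.
coder_a = {"access": [1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
           "communication": [1, 0, 1, 1, 1, 0, 1, 1, 0, 1],
           "coordination": [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]}
coder_b = {"access": [1, 1, 0, 1, 0, 1, 1, 0, 1, 0],
           "communication": [1, 0, 1, 1, 1, 0, 1, 1, 0, 1],
           "coordination": [0, 0, 1, 0, 1, 1, 0, 1, 1, 0]}
print(round(pooled_kappa(coder_a, coder_b), 2))
```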
Results
Practice characteristics.
Practices that fielded a CAHPS survey (i.e., either the CG-CAHPS survey or the CAHPS PCMH survey) and those that did not were similar on the stratification criteria. Overall, one-third of practices (31%) had 1–2 physicians on staff, and another third (28%) had 3–5 physicians on staff. Half (50%) were affiliated with a hospital, and most (82%) were part of a medical group or network. A third (30%) were physician-owned, another third (34%) were federally qualified health centers (FQHCs), and 20% were hospital-owned. Most practices (79%) staffed only primary care physicians, and most (79%) treated both adults and children (Table 1). Practices that used a CAHPS survey were slightly more likely to be affiliated with or owned by a hospital, which might be explained by hospitals’ use of Press Ganey as their vendor for the HCAHPS survey.
Table 1.
Practice Characteristics
Characteristic | Used CAHPS survey (N=52), N (%) | Used non-CAHPS survey (N=53), N (%) | Overall (N=105), N (%) |
---|---|---|---|
Patient-Centered Medical Home History | |||
Level 1 or 2 (i.e., Not PCMH recognized) | 13 (25%) | 15 (28%) | 28 (27%) |
Level 3, PCMH Recognized: < 3 years | 16 (31%) | 11 (21%) | 27 (26%) |
Level 3, PCMH Recognized: 3–5 years | 10 (19%) | 11 (21%) | 21 (20%) |
Level 3, PCMH Recognized: 5+ years | 13 (25%) | 16 (30%) | 29 (28%) |
Providers at Primary Care Practice | |||
Primary care only | 43 (83%) | 40 (75%) | 83 (79%) |
Primary care and specialists | 9 (17%) | 13 (25%) | 22 (21%) |
Practice Size (i.e., Number of Physicians) | |||
Very Small/ 1–2 physicians | 17 (33%) | 16 (30%) | 33 (31%) |
Small/ 3–5 physicians | 15 (29%) | 14 (26%) | 29 (28%) |
Medium Small/ 6–9 physicians | 10 (19%) | 12 (23%) | 22 (21%) |
Medium/ 10–24 physicians | 10 (19%) | 8 (15%) | 18 (17%) |
Large/ More than 24 physicians | 0 (0%) | 3 (6%) | 3 (3%) |
Patient Population | |||
Adult only | 9 (17%) | 13 (25%) | 22 (21%) |
Adult and children | 43 (83%) | 40 (75%) | 83 (79%) |
Hospital Affiliation | |||
Hospital affiliated | 30 (58%) | 22 (42%) | 52 (50%) |
Not hospital affiliated | 22 (42%) | 31 (58%) | 53 (50%) |
Group or Network Status | |||
Part of group or network | 44 (85%) | 42 (79%) | 86 (82%) |
Not part of group or network | 8 (15%) | 11 (21%) | 19 (18%) |
Ownership | |||
Privately-owned | 15 (29%) | 16 (30%) | 31 (30%) |
Hospital-owned | 14 (27%) | 7 (13%) | 21 (20%) |
Federally Qualified Health Center | 17 (33%) | 19 (36%) | 36 (34%) |
Other ownership structure (incl. health system-affiliated, medical/academic health center, or HMO) | 6 (12%) | 11 (21%) | 17 (16%) |
Patient Experience Survey at time of interview* | |||
CAHPS | 52 (100%) | 0 (0%) | 52 (50%) |
CG-CAHPS Only | 18 (35%) | 0 (0%) | 18 (17%) |
CAHPS + PCMH Items | 33 (63%) | 0 (0%) | 33 (31%) |
No Survey (previously CAHPS + PCMH Items) | 1 (2%) | 0 (0%) | 1 (1%) |
Non-CAHPS | 0 (0%) | 53 (100%) | 53 (50%) |
Homegrown | 0 (0%) | 30 (57%) | 30 (29%) |
Other Survey | 0 (0%) | 23 (43%) | 23 (22%) |
*Practices were interviewed between 2017 and 2018.
Types of surveys.
We examined the type of patient experience surveys administered across the 105 practices pursuing PCMH (Table 1). Thirty-three were currently administering the recommended CAHPS PCMH survey and an additional 18 were currently administering the CG-CAHPS Survey (without the CAHPS PCMH items). Ten of these 18 had never used the CAHPS PCMH survey and 8 had previously used it. Half (N=53/105) were administering a patient experience survey that was neither CG-CAHPS nor the CAHPS PCMH survey. Of these 53 practices, 30 administered a “homegrown” survey that practices themselves constructed and was only used at their health system or site. The remaining 23 practices used another non-CAHPS standardized survey (e.g., Crossroads, Medstatic, Press Ganey proprietary survey). One practice was not administering a patient experience survey, as they were finalizing a contract to use the CAHPS PCMH survey for their next survey period. For the 53 practices using a non-CAHPS survey, we obtained 36 non-CAHPS surveys (68%) to examine.
Practice leader characteristics.
Three-fourths of the practice leaders in the 53 practices administering a non-CAHPS survey (74%), and in the 36 practices whose non-CAHPS surveys we examined (78%), had job functions other than PCMH implementation. Over half had worked at the practice throughout the entire PCMH implementation period (51% of all 53 non-CAHPS survey practices and 58% of the practices whose surveys we examined). More than half were involved with the first PCMH application (51% and 56%, respectively), and about half were involved in subsequent applications (52% and 45%, respectively). Practice leaders were heavily involved in PCMH implementation as part of the team of providers and staff who decided on and implemented the changes needed to satisfy PCMH standards (64% and 69%, respectively). Half were also coordinators for PCMH efforts at their practice (49% and 50%, respectively), and most reported being part of the team that reviewed data (including patient experience survey data) to determine PCMH changes (64% and 69%, respectively).
Content of non-CAHPS Surveys.
We analyzed the content of 36 non-CAHPS surveys and identified 12 main patient experience domains: Access to care (94%), Provider communication (94%), Office staff helpfulness/courteousness (92%), Care coordination (78%), Office environment and hours (61%), Shared decision making (56%), Continuity of care (44%), Care and treatment (44%), Privacy and confidentiality (39%), Billing (33%), Managing health (31%), and Mental and Emotional health (11%). We mapped the domains and specific measures to the CG-CAHPS 2.0 and 3.0 surveys, CAHPS PCMH item set and CAHPS Supplemental Items (Table 2). Almost all non-CAHPS surveys also included Global measures (89%).
Table 2.
Patient Experience Survey Domains Administered During PCMH Transformation, Mapped to CAHPS Survey or Item Set
Domain (% of surveys), N=36 | Measure | Non-CAHPS surveys: N | Non-CAHPS surveys: % | In CAHPS survey |
---|---|---|---|---|
| ||||
Access to care (94%) |
Wait time at the clinic | 29 | 81 | |
Medical questions answered after hours | 25 | 69 | X | |
Ease of scheduling appointments | 12 | 33 | ||
Patients being informed about delays | 12 | 33 | X | |
Patients getting information they needed after clinical hours | 12 | 33 | X | |
Satisfaction with wait time | 8 | 22 | ||
Appt/wait time -non-urgent care | 7 | 19 | X | |
Getting same day appts | 4 | 11 | ||
Mode of scheduling | 4 | 11 | ||
Wait time to appointment | 4 | 11 | ||
Reminders-appointments/tests | 2 | 6 | X | |
Appt/wait time -urgent care | - | - | X | |
Seen within 15 mins | - | - | X | |
Informed about length of wait | - | - | X | |
After hours care provided | - | - | X | |
| ||||
Provider communication (94%) | Provider listens | 30 | 83 | X |
Provider was easy to understand | 29 | 81 | X | |
Provider spends enough time | 27 | 75 | X | |
Provider was helpful and courteous | 25 | 69 | ||
Provider respectful | 12 | 33 | X | |
Provider/nurses sensitive to patient needs | 9 | 25 | ||
Nurse communication | 8 | 22 | ||
Provider gave easy to understand information | 4 | 11 | X | |
Provider caring /inspire trust | - | - | X | |
Patient encouraged to ask questions | - | - | X | |
| ||||
Office staff helpfulness/ courteousness (92%) |
Office staff are courteous, respectful and friendly | 31 | 86 | X |
Office staff are helpful | 25 | 69 | ||
| ||||
Care coordination (78%) |
Someone at the clinic followed up with the patient about test results | 16 | 44 | X |
Provider and staff worked together to provide care | 13 | 36 | ||
Discussing medications | 13 | 36 | X | |
Provider knows about patient care from specialist | 13 | 36 | X | |
Knowledge of patient’s medical history | 12 | 33 | X | |
Knowledge of care from other providers | 11 | 31 | X | |
Provider referrals to specialist | 8 | 22 | ||
Give info about medications | 7 | 19 | ||
Ease of obtaining medication | 4 | 11 | ||
Explained/ordered blood test, x-ray, other test | 3 | 8 | X | |
Medication history | 2 | 6 | X | |
Patient care from specialist | - | - | X | |
See acupuncturist or herbalist | - | - | X | |
| ||||
Office environment and hours (61%) |
Office was neat and clean | 14 | 39 | |
Comfortable environment | 9 | 25 | ||
Convenient office hours | 8 | 22 | ||
Convenient location | 4 | 11 | ||
Safe environment | 3 | 8 | ||
Services for mobility impaired | 2 | 6 | X | |
Office interpreter services | 1 | 3 | X | |
Discrimination | - | - | X | |
| ||||
Shared decision making (56%) |
Discussing health goals and making decisions with your provider about your care | 18 | 50 | X |
Challenges with health goals | 4 | 11 | X | |
Discuss surgery or procedures | - | - | X | |
| ||||
Continuity of care (44%) |
Main reason for visit | 14 | 39 | X |
Length of time with provider | 12 | 33 | X | |
Identifying provider | 9 | 25 | X | |
Medical home | 2 | 6 | ||
| ||||
Care and treatment (44%) |
Gives good advice/treatment | 8 | 22 | |
Information on follow up care | 5 | 14 | ||
Confidence in care provider | 1 | 3 | ||
Experience/comfort with exam | - | - | X | |
| ||||
Privacy and confidentiality (39%) |
Clinic maintained patient privacy and confidentiality | 12 | 33 | |
| ||||
Billing (33%) |
Explanation of bill | 10 | 28 | |
| ||||
Managing health (31%) |
Health promotion | 9 | 25 | X |
Care and prescriptions for chronic conditions | - | - | X | |
| ||||
Mental and Emotional health (11%) |
Asked patient about mental and/or emotional health and stress | 6 | 17 | X |
| ||||
Technology (0%) |
Patient use of technology | - | - | X |
Provider use of technology | - | - | X | |
Information via website & easy to understand | - | - | X | |
| ||||
Global measures (89%) |
Likelihood of recommending the clinic | 20 | 56 | |
Likelihood of recommending the provider | 16 | 44 | X | |
Overall rating of provider | 15 | 42 | X | |
Overall assessment of clinic | 13 | 36 | ||
Return to clinic | 5 | 14 |
NOTE: A dash (–) indicates that none of the surveys we reviewed included the item(s). An X denotes that the item(s) are present in a CAHPS survey.
The two most common measures within the Access to care domain were “Wait time at the clinic” (81% of all non-CAHPS surveys) and “Medical questions answered after hours” (69%). Neither of these topics is in the CG-CAHPS 3.0 survey; however, “Patients getting the information they needed after clinic hours” (33%) is a CG-CAHPS 2.0 item and a PCMH 3.0 item. The CG-CAHPS 2.0 survey also included an item about “Being seen within 15 minutes” (which we categorized under “Wait time at the clinic”). The non-CAHPS surveys also addressed two measures not on CAHPS surveys: “Ease of scheduling appointments” (33%) and “Patients being informed about delays” (33%). The non-CAHPS items about “Ease of scheduling appointments” and “Satisfaction with wait time” asked whether scheduling a primary care appointment was easy; these items used 5-option response scales of either “Excellent-to-Poor” or “Very good-to-Very poor.”
Provider communication was included on 94% of the non-CAHPS surveys. The most common items within Provider communication were “Provider listens” (83%), “Provider was easy to understand” (81%), and “Provider spends enough time” (75%). Both “Provider spends enough time” (75%) and “Provider was respectful” (33%) are measures on the CG-CAHPS 2.0 and 3.0 surveys. In addition, 11% of the surveys used the CG-CAHPS 2.0 item asking if the “Provider gave easy to understand information”; some surveys shortened this item by removing the time reference or asked whether the provider used “words” that the patient could understand. Another 11% used an item with no stem, such as “Provider listened to patient,” or asked specifically about the experience during the visit. Lastly, the non-CAHPS surveys often included a measure on whether the “Provider was helpful and courteous” (69%), which is not a CAHPS item.
The Office staff helpfulness/courteousness domain included CG-CAHPS 2.0 measures and appeared on 92% of the non-CAHPS surveys. The domain included two measures: “Office staff are courteous, respectful and friendly” (86%) and “Office staff are helpful” (69%). The CG-CAHPS 2.0 measure (“Office staff are helpful”) assesses frequency (i.e., how often) using a Never/Sometimes/Usually/Always response scale, while the non-CAHPS items asked the patient to rate how courteous or respectful staff were using a 5-point Poor-to-Excellent response scale.
Care coordination was measured using items about the provider coordinating the patient’s care, medication topics, and referrals, and appeared on 78% of the non-CAHPS surveys. A common non-CAHPS measure asked whether the “Provider and staff worked together to provide care” (36%). Measures similar to CAHPS items asked whether “Someone at the clinic followed up with the patient about test results” (44%) and about the provider’s “Knowledge of patient’s medical history” (33%); the wording of these two items was identical to the CG-CAHPS items, and they used the CAHPS response options of Yes, definitely/Yes, somewhat/No. On the majority of the surveys, the question about someone at the clinic following up on test results was the same as the CAHPS item, except in a handful of cases where the question was shortened to “Practice informs of blood tests or x-rays” or “Notifications of results.” Items about “Discussing medications” were included in 36% of the surveys, as were items about whether the “Provider knows about care from the specialist.”
Measures of Office environment and hours were included on 61% of the non-CAHPS surveys, with the most common measure asking patients whether the “Office was clean and neat” (39%). Items related to Shared decision making were included in just over half of the surveys (56%). The most common item was the CG-CAHPS 2.0 item “Discussing health goals and making decisions with your provider about your care,” which was included in 50% of surveys. Measures about the “Main reason for the visit,” under the domain of Continuity of care, were commonly included (39%). Privacy and confidentiality items were included about a third of the time (33%), asking patients about the “Clinic maintaining patient privacy and confidentiality”; none of these were CAHPS measures.
Several patient experience domains were included in fewer than a third of the non-CAHPS surveys: Managing health, Mental and emotional health, and Billing. Technology items [including health information technology (HIT)] were not included in the non-CAHPS surveys, although a CAHPS HIT supplemental item set exists (available at https://www.ahrq.gov/cahps/surveys-guidance/item-sets/HIT/index.html).22
Finally, Global measures were common on the non-CAHPS surveys. The “Overall rating of the provider” (42%) is a CG-CAHPS 2.0 measure, and “Likelihood of recommending the provider” (44%) corresponds to the CAHPS recommendation item: both the CG-CAHPS 2.0 visit survey and the CG-CAHPS 3.0 survey include an item asking, “Would you recommend this provider’s office to your family and friends?” The common Global measures not on CAHPS surveys were “Overall assessment of the clinic” (36%) and “Likelihood of recommending the clinic” (56%).
Conclusions
From a stratified random sample of practices, all pursuing PCMH recognition and transformation, half administered a CAHPS survey (i.e., either the CG-CAHPS survey or the CAHPS PCMH survey), while the other half did not. To understand the content of the non-CAHPS patient experience surveys and which aspects of patient experience practices were interested in collecting, trending, and measuring during their PCMH transformation, we examined the specific domains and measures in these surveys. We obtained 68% (36/53) of the non-CAHPS surveys from practices pursuing PCMH.
The most common patient experience domains on the non-CAHPS surveys administered by practices undergoing PCMH transformation were: Access to care (94%), Provider communication (94%), Office staff helpfulness/courteousness (92%), and Care coordination (78%). We also found that many practices administering non-CAHPS surveys were still including specific CG-CAHPS items. Most commonly, these practices included exact CAHPS measures on Provider communication (i.e., provider spends enough time, provider listens, amount of time with the provider), medical questions answered after hours, office staff courteousness, follow-up on test results, and whether the provider knows the patient’s medical history. This is not surprising, given that CAHPS measures are rigorously developed,23 are commonly used as benchmarks of care quality, and that the 14-item CAHPS PCMH item set was specifically developed to reflect the standards for achieving a quality PCMH practice. While many practices did not use the recommended core CG-CAHPS survey or include the CAHPS PCMH items, they chose to include a few CG-CAHPS measures, or adapted CG-CAHPS measures, to capture specific items of interest and importance for their clinic needs or PCMH goals.
Practice leaders have an intimate knowledge of their practice operations. They recognize the connection between understanding what patients need and experience and providing quality care.24,25 Practices also take diverse and varied approaches to pursuing initial PCMH recognition and to the myriad choices along the path of PCMH transformation; although there are PCMH standards and guidelines for providing PCMH quality of care, the day-to-day changes needed to achieve PCMH recognition are appropriately tailored to site-specific needs and goals. As a result, practices pursuing PCMH transformation need to consider what information from their patient experience survey can help them meet their PCMH transformation goals. More and more practices are considering how to use patient experience data to gauge their progress and identify gaps or areas for improvement,10,26–29 a need reflected in the NCQA PCMH standards’ requirement for continuous QI.
Based on the non-CAHPS items included by these PCMH-transforming practices, new CG-CAHPS items could be developed so that practices could collect standardized measures of the effects of PCMH transformation on patient experience related to appointment reminders, wait time, cleanliness of the office, and the main reason for the visit. Practices also wanted additional information on Provider communication, such as whether providers are helpful and courteous or whether providers and nurses are sensitive to patient needs. These newly developed measures could be supplemental items or be included in the PCMH item set.
In sum, additional CG-CAHPS items in key PCMH topic areas, such as Access to care (e.g., after-hours and weekend visits, ease of scheduling, being informed about delays), Shared decision making, and Provider communication (e.g., whether the provider is courteous and communication between other clinical staff and the patient), could assist practice leaders in their efforts to capture the breadth and depth of PCMH transformation and its influence on providing more patient-centered care. Research is needed to determine whether the CAHPS survey is the best source of this information.
Funding:
Preparation of this manuscript was supported through cooperative agreements from the Agency for Healthcare Research and Quality (contract numbers U18 HS016980 and U18 HS025920).
Footnotes
Disclosure: There are no conflicts of interest for any of the authors.
Contributor Information
Lea Xenakis, Behavioral and Policy Sciences, RAND Corporation, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138.
Denise D. Quigley, Behavioral and Policy Sciences, RAND Corporation, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138.
Nabeel Qureshi, Pardee RAND Graduate School, RAND Corporation, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138.
Luma AlMasarweh, Case Western Reserve University, 226 Mather Memorial Building, 10900 Euclid Avenue, Cleveland, Ohio, 44106.
Chau Pham, RAND Corporation, 1776 Main Street, PO Box 2138, Santa Monica, CA 90407-2138.
Ron D. Hays, UCLA, Division of General Internal Medicine & Health Services Research, 1100 Glendon Avenue, Los Angeles, CA 90024.
References
- [1]. Robert Graham Center. The Patient Centered Medical Home: History, Seven Core Features, Evidence and Transformational Change. 2007. https://www.aafp.org/dam/AAFP/documents/about_us/initiatives/PCMH.pdf. Accessed June 27, 2019.
- [2]. Dentzer S. Reinventing primary care: a task that is far ‘too important to fail’. Health Aff (Millwood). 2010;29(5):757.
- [3]. Ewing M. The patient-centered medical home solution to the cost-quality conundrum. J Healthc Manag. 2013;58(4):258–266.
- [4]. Howell JD. Reflections on the past and future of primary care. Health Aff (Millwood). 2010;29(5):760–765.
- [5]. Rittenhouse DR, Shortell SM, Fisher ES. Primary care and accountable care--two essential elements of delivery-system reform. N Engl J Med. 2009;361(24):2301–2303.
- [6]. Bodenheimer T. Primary care--will it survive? N Engl J Med. 2006;355(9):861–864.
- [7]. National Committee for Quality Assurance. Report Cards: Practices. Washington, DC: National Committee for Quality Assurance; 2017.
- [8]. Geonnotti KT, Peikes D, Schottenfeld L, Burak H, McNellis R, Genevro J. Engaging Primary Care Practices in Quality Improvement: Strategies for Practice Facilitators. AHRQ Publication No. 15-0015-EF. Rockville, MD: Agency for Healthcare Research and Quality; 2015.
- [9]. Davies E, Shaller D, Edgman-Levitan S, Safran DG, Oftedahl G, Sakowski J, et al. Evaluating the use of a modified CAHPS survey to support improvements in patient-centred care: lessons from a quality improvement collaborative. Health Expect. 2008;11(2):160–176.
- [10]. Friedberg MW, SteelFisher GK, Karp M, Schneider EC. Physician groups’ use of data from patient experience surveys. J Gen Intern Med. 2011;26(5):498–504.
- [11]. Quigley DD, Mendel PJ, Predmore ZS, Chen AY, Hays RD. Use of CAHPS® patient experience survey data as part of a patient-centered medical home quality improvement initiative. J Healthc Leadersh. 2015;7:41–54.
- [12]. Crofton C, Lubalin JS, Darby C. Consumer Assessment of Health Plans Study (CAHPS). Foreword. Med Care. 1999;37(3 Suppl):MS1–9.
- [13]. Darby C, Hays RD, Kletke P. Development and evaluation of the CAHPS hospital survey. Health Serv Res. 2005;40(6 Pt 2):1973–1976.
- [14]. Hays RD, Martino S, Brown JA, Cui M, Cleary P, Gaillot S, et al. Evaluation of a care coordination measure for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) Medicare survey. Med Care Res Rev. 2014;71(2):192–202.
- [15]. Agency for Healthcare Research and Quality. About the CAHPS® Patient-Centered Medical Home (PCMH) Item Set 3.0. 2015. https://www.ahrq.gov/sites/default/files/wysiwyg/cahps/surveys-guidance/item-sets/PCMH/about_pcmh-item-set-cg30-2314.pdf. Accessed April 27, 2020.
- [16]. Quigley DD, Qureshi N, Hays RD. Use of CAHPS® patient experience data as part of patient-centered medical home transformation. Paper presented at: AcademyHealth Annual Research Meeting (ARM); 2019; Washington, DC.
- [17]. Dedoose: web application for managing, analyzing, and presenting qualitative and mixed method research data [computer program]. Los Angeles, CA: SocioCultural Research Consultants, LLC; 2018.
- [18]. Bernard HR, Ryan GW. Chapter 4, Codebooks and coding. In: Analyzing Qualitative Data: Systematic Approaches. Thousand Oaks, CA: Sage Publications; 2010:75–105.
- [19]. Miller W, Crabtree B. The dance of interpretation. In: Miller W, Crabtree B, eds. Doing Qualitative Research in Primary Care: Multiple Strategies. 2nd ed. Newbury Park, CA: Sage Publications; 1999:127–143.
- [20]. Cohen J. A coefficient of agreement for nominal scales. Educ Psychol Meas. 1960;20(1):37–46.
- [21]. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159–174.
- [22]. McInnes DK, Brown JA, Hays RD, Gallagher P, Ralston JD, Hugh M, et al. Development and assessment of CAHPS questions to assess the impact of health information technology on patient experiences with care. Med Care. 2012;50:S11–19.
- [23]. Agency for Healthcare Research and Quality. Development of the CAHPS Clinician & Group Survey. 2018. https://www.ahrq.gov/cahps/surveys-guidance/cg/about/Develop-CG-Surveys.html. Accessed June 27, 2019.
- [24]. Akinci F, Patel PM. Quality improvement in healthcare delivery utilizing the patient-centered medical home model. Hosp Top. 2014;92(4):96–104.
- [25]. Lebrun-Harris LA, Shi L, Zhu J, Burke MT, Sripipatana A, Ngo-Metzger Q. Effects of patient-centered medical home attributes on patients’ perceptions of quality in federally supported health centers. Ann Fam Med. 2013;11(6):508–516.
- [26]. Geissler KH, Friedberg MW, SteelFisher GK, Schneider EC. Motivators and barriers to using patient experience reports for performance improvement. Med Care Res Rev. 2013;70(6):621–635.
- [27]. Quigley DD, Palimaru AI, Chen AY, Hays RD. Implementation of practice transformation: patient experience according to practice leaders. Qual Manag Health Care. 2017;26(3):140–151.
- [28]. Quigley DD, Predmore ZS, Chen A, Hays RH. Implementation and sequencing of practice transformation in urban practices with underserved patients. Qual Manag Health Care. 2017;26(1):7–14.
- [29]. Stout S, Weeg S. The practice perspective on transformation: experience and learning from the frontlines. Med Care. 2014;52(11 Suppl 4):S23–25.