Author manuscript; available in PMC: 2021 Sep 1.
Published in final edited form as: Postgrad Med. 2020 Jun 7;132(7):636–642. doi: 10.1080/00325481.2020.1773685

Redesigning Primary Care in an Academic Medical Center: Lessons, Challenges, and Opportunities

Leonard E Egede 1,2, Rebekah J Walker 1,2, Sneha Nagavally 2, Madhuli Thakkar 2, Monica O’Sullivan 1, Wendy Stulac Motzel 3
PMCID: PMC7609476  NIHMSID: NIHMS1599738  PMID: 32441180

Abstract

Purpose:

To evaluate patient access, provider productivity, and patient satisfaction during a 24-month redesign of primary care at an academic medical center, a setting that requires balancing clinical and educational missions.

Methods:

A series of activities was conducted to optimize primary care across 17 attending physicians, 6 Advanced Practice Providers (APPs), and 39 residents. Patient access was defined as the next available appointment for either existing/established patients or new patients. Productivity was measured using panel sizes for each provider. Patient satisfaction was based on the Clinician and Group Consumer Assessment of Healthcare Providers and Systems (CGCAHPS).

Results:

Despite decreasing clinical effort to allow faculty and APPs to participate in education and research, there was an overall increase in access for both new and established patients, and an increase in the percent of each provider's panel that was full, from 78.89% in 2017 to 115.29% in 2019. When comparing panel sizes for the 11 faculty present before and after strategic changes, we found a significant increase in both overall panel size and actual-to-expected ratios between 2017 and 2019. In addition, patient satisfaction remained high throughout the time period, with no significant changes.

Conclusions:

While this project was limited to one site, the inclusion of a set of well-planned metrics and the tracking of processes over time can provide insight for ongoing primary care redesign efforts at similar sites seeking to balance the academic mission with clinical productivity and high patient satisfaction.

Keywords: primary care redesign, access to care, provider productivity, patient satisfaction, academic medical center, quality improvement

Introduction

Healthcare systems in the United States face ongoing pressure to accomplish what is now referred to as the Quadruple Aim – enhance patient experience, improve population health, reduce costs, and decrease burnout of providers. (1,2) Unfortunately, these aims often come with conflicting metrics and require a system-wide effort to improve performance despite the increasing complexity of care. (1) In addition, improving population health requires consideration of factors generally viewed as external to the healthcare system, necessitating systems that integrate services and coordinate across organizations. (3)

Primary care has responded to this need to transform systems of care through practice redesign in the form of practice-based quality improvement, patient-centered medical homes, and integrated care models. (3–7) Though many lessons have been learned through the process of implementing change in the primary care setting, no single best intervention has been identified. (8,9) Rather, the contextual conditions within a primary care system have been identified as providing the necessary support to allow effective redesigns and improved care. (8–10) In addition, transformation requires ongoing efforts to create meaningful change and balance a variety of metrics. (10) Factors such as increased training and focus on quality improvement, leadership support, and staff engagement have been found in clinics that experienced large gains in efficiency, physician productivity, or patient satisfaction. (8) One study of changes made as part of a patient-centered medical home quality improvement initiative found that constant review of patient-experience data alongside other metrics of productivity and quality helped practice leaders understand what was affecting patient experience and how they could improve. (11) Another study found that accommodating a team-based care model designed to allow growth required tracking a number of key metrics related to the Quadruple Aim. (12)

Many primary care redesign processes are initiated with few evaluations in place to capture changes in important metrics, such as patient access, provider productivity, or patient satisfaction. Using patient-experience surveys for quality improvement is common practice in the health care industry because it is essential for achieving patient-centered care. (13–16) Because of their reliability, content, and validity, the Consumer Assessment of Healthcare Providers and Systems (CAHPS) surveys are now the US standard for information about patient experience of care. (17–19) Research has demonstrated that patient experience consistently correlates with better health outcomes and clinical processes of care for prevention and disease management. (20–24) Furthermore, patient experience is not at odds with cost control, and may be positively correlated with key financial indicators. (12,25) For example, one study found that nearly 30% of the variation in hospital financial performance is explained by patients' perceptions of quality. (25)

Since one of the most consistent messages from evaluations is the need to develop robust support for quality improvement (4,5,8,10,12), the Division of General Internal Medicine (GIM) at an academic medical center in the Midwest sought to engage primary care faculty and staff in a 24-month redesign process. Given that academic medical centers must balance clinical productivity with the educational mission, a series of evaluations was included in the process to track and understand the influence of the redesign on providers and residents. While academic medical centers have provided informative evaluations of transforming either primary care or residency programs (26–28), the goal of this process was to improve efficiency and productivity within the clinic while maintaining high patient satisfaction and decreasing provider burden. The primary measures used to assess success were patient access, provider productivity (panel sizes), and patient satisfaction.

Methods

Background Information on Academic Medical Center

This project was conducted within the primary care clinic of an academic medical center in the Midwest with 17 attending physicians, 6 Advanced Practice Providers (APPs), and 39 residents. The clinic work week for a full-time (1.0 FTE) attending physician was eight clinic half days, with two clinic half days protected for education and research. Private clinics consisted of attending physicians seeing patients without residents. Clinic half days during which an attending physician oversaw residents were not counted toward the attending's expected productivity time; instead, this time was incorporated into resident productivity. Residents did not change supervisory physicians, though new residents entered during the study time frame and graduating residents exited.

This study captured the initiation of a primary care redesign process. Patient Centered Medical Home (PCMH) designation was awarded 12 months after the start of this process. The primary care clinic has roughly 11,000 visits in a given year; the mean patient age is 55.8 years, 63.9% of patients are female, and 39.5% are minorities. Approximately half of the population is covered by managed care, 15.7% by Medicaid, and 22.7% by Medicare.

Interventions Initiated During Primary Care Redesign

Over the course of 24 months, a series of activities was conducted to improve clinical access. Human subjects approval to conduct the work was provided through our institution's quality improvement approval process. First, the clinical work week for all providers was tightened and uniform visit lengths of 30 and 45 minutes were created. Second, early morning (7–8am) and evening (5–7pm) hours were added for all providers with two or more clinics per work week. Third, completion of pain management agreements was moved from the Advanced Practice Providers (APPs) to registered nurses to open additional clinic time for APPs. Fourth, a team-based care model was created that included two APPs per team to maximize access and increase patient volume. Fifth, visit lengths were standardized based on the expected time to complete necessary activities and on a time study of time spent on specific types of visits. For example, the visit length for resident transfer patients was decreased from 60 to 30 minutes, and the length for established patients needing discharge follow-up was decreased from 60 to 30 minutes. Finally, bi-weekly meetings were begun with administrative, clinic, and pre-arrival scheduling staff to improve scheduling. At the same time, panel size benchmarks and an algorithm for regular assessment of panel sizes were created and then used to standardize measures of productivity across the Division.

Patient Access

Data Source and Study Population

To measure patient access, data were extracted from Cadence, the scheduling application within the Epic Health Systems medical record. Scheduled visits for GIM providers were pulled from the medical record each month to provide information on appointments by length of appointment. An access report was then created to allow tracking of access by leadership and providers.

Patient Access Definitions

Patient access was defined as the duration between the day the data were pulled and the next available appointment for either existing/established patients or new patients. 3rd Next Long and 3rd Next Short represented access for existing/established patients. In prior fiscal years, access for appointments longer than 30 minutes was calculated separately from appointments of 30 minutes or less; beginning in FY2019, however, these two access measures were combined into one category for existing/established patients. 3rd Next New represented access for new patients who had not previously seen a GIM provider.
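The measures are labeled "3rd Next", in line with the widely used third-next-available-appointment access metric. As a rough illustration only, the sketch below assumes that interpretation; the function name and the slot list are hypothetical and are not drawn from the Cadence report.

```python
from datetime import date

def third_next_available_days(open_slots, pull_date):
    """Days from the data-pull date to the third-next open appointment slot.

    `open_slots` is a hypothetical list of datetime.date objects for open
    appointments of one type (e.g., established-patient or new-patient visits).
    """
    upcoming = sorted(d for d in open_slots if d >= pull_date)
    if len(upcoming) < 3:
        return None  # not enough open slots to compute the measure
    return (upcoming[2] - pull_date).days

# Example: open slots 2, 4, and 9 days after the pull date -> access of 9 days
slots = [date(2019, 7, 3), date(2019, 7, 5), date(2019, 7, 10)]
print(third_next_available_days(slots, date(2019, 7, 1)))  # 9
```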

Productivity - Panel Size

Data Source and Study Population

To measure productivity, panel size was calculated for each provider. Panel sizes were summarized at the provider level, with APPs and residents grouped under their supervising physician, and at the team level, accounting for all groups. This format allowed panel expectations to account for the interaction between supervising physicians and residents and for the competing demands and contexts of care experienced by each group. Data are extracted quarterly from the medical record for the prior 18 months for all General Internal Medicine (GIM) outpatient visits. Approval to access the medical record was provided to support clinical operations for GIM. All adults with at least one outpatient visit to a GIM primary care provider in the prior 18 months were included in the sample used to determine panel size.

Actual Panel Size Calculation

The panel size for each individual provider (physician/resident/APP) within the primary care section was calculated by assigning each encounter a primary care provider using Murray's cut. (29) According to Murray's cut, the following order is applied when attributing a provider to an outpatient encounter (a code sketch of this ordering follows the list):

  1. An outpatient who has seen only one provider for all visits is assigned to that visit provider.

  2. A patient who has seen more than one provider is assigned to the visit provider he/she has seen most often. For instance, if in the past 18 months a patient had 10 total visits and saw a specific physician 9 times and an APP once, the patient is assigned to the physician. This rule applies to all provider types, including residents and APPs.

  3. If a patient saw multiple providers the same number of times, the patient is assigned to the provider who performed his/her most recent physical exam.

  4. If a patient saw multiple providers the same number of times but had no health/physical exam done, the patient is assigned to the provider he/she saw last.
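The following is a minimal sketch of this attribution order, assuming each patient's prior-18-month encounters are available as simple records; the function and field names are hypothetical rather than taken from the medical record extract.

```python
from collections import Counter

def assign_panel_provider(encounters):
    """Assign one patient to a panel provider following the ordering above.

    `encounters` is a hypothetical list of dicts for a single patient, each
    with keys 'provider', 'date' (datetime.date), and 'physical_exam' (bool).
    """
    counts = Counter(e["provider"] for e in encounters)
    top = max(counts.values())
    tied = {p for p, c in counts.items() if c == top}

    # Rules 1-2: only one provider seen, or a single provider seen most often
    if len(tied) == 1:
        return tied.pop()

    # Rule 3: among tied providers, the one who did the most recent physical exam
    exams = [e for e in encounters if e["provider"] in tied and e["physical_exam"]]
    if exams:
        return max(exams, key=lambda e: e["date"])["provider"]

    # Rule 4: no physical exam on record, so use the provider seen most recently
    last = max((e for e in encounters if e["provider"] in tied), key=lambda e: e["date"])
    return last["provider"]
```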

Expected Panel Size Calculation

Each provider was assigned an expected panel size based on clinical effort, type of provider, and previously published estimates of the time it takes primary care providers to deliver care under varying degrees of task delegation (30). Physician providers were assigned an expected panel based on 1,000 individuals per full FTE. APP providers were assigned an expected panel based on 800 individuals per full FTE. Year 1 residents were assigned an expected panel of 50, and year 2 or 3 residents were assigned an expected panel of 100. Physician and APP provider expected panels were then adjusted for clinical effort based on the number of individual clinics and resident clinics weekly.
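As an illustration, these benchmarks can be expressed as a simple lookup. This is a sketch only: the function name, provider-type labels, and the assumption that clinical effort is expressed as a fraction of a full clinical FTE are ours, not the Division's algorithm.

```python
def expected_panel_size(provider_type, clinical_effort=1.0, resident_year=None):
    """Expected panel using the benchmarks in the text: 1,000 per full-FTE
    physician, 800 per full-FTE APP, 50 for year-1 residents, 100 for years 2-3.
    `clinical_effort` is assumed to be the fraction of a full clinical FTE."""
    if provider_type == "physician":
        return 1000 * clinical_effort
    if provider_type == "app":
        return 800 * clinical_effort
    if provider_type == "resident":
        return 50 if resident_year == 1 else 100
    raise ValueError(f"unknown provider type: {provider_type}")

# Example: a physician with 0.6 clinical FTE has an expected panel of 600
print(expected_panel_size("physician", clinical_effort=0.6))  # 600.0
```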

Demographic and Clinical Variables

Patient encounter information was pulled for any patient seen over the past 18 months. The data extracted included age, gender, race, and Epic risk score. Epic risk score is a general health risk score developed by the Epic medical record creators to predict whether patients will have an adverse health event and is used as a measure of case mix load.

Statistical Analysis for Panel Size

We investigated changes in panel size over the 12 months following implementation of this metric in the primary care clinic. First, all encounters in the past 18 months were assigned to a primary care provider using the Murray's cut methodology described above. The number of individuals assigned to each provider was considered their actual panel, which was then reported alongside expected panel size. Each provider received a summary of their actual panel size, expected panel size, and demographic and clinical information summarizing the individuals in their panel. In addition, team summaries were created that provided the total actual panel and total expected panel for each of the three primary care teams in the General Internal Medicine clinic. Second, an investigation of change in panel size over time was conducted for faculty providers employed over the time period of January 2014 through January 2018, allowing five 18-month panel size comparisons. The mean and median for each of the 5 panel sizes were calculated with 95% confidence intervals to allow comparison. Finally, a comparison was completed in February 2019 investigating the overall change in actual, expected, and actual-to-expected ratio for panel sizes across the Division. The mean panel size or ratio for January through December 2016 was compared to the mean panel size or ratio for January through December 2018 using paired t-tests. All p-values were 2-sided and p<0.05 was considered statistically significant. Statistical analysis was performed with Stata v 14 and R Studio.
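For reference, a paired comparison of this kind can be run as in the sketch below. The study's analysis was done in Stata and R, so this Python/SciPy version is illustrative only, and the numbers are placeholders rather than study data.

```python
from scipy import stats

# Placeholder panel sizes for the same providers in the two comparison periods
# (Jan-Dec 2016 vs Jan-Dec 2018); these are not the study's actual values.
panel_2016 = [520, 310, 480, 260, 400, 350]
panel_2018 = [640, 390, 600, 310, 520, 430]

t_stat, p_value = stats.ttest_rel(panel_2016, panel_2018)
print(f"paired t = {t_stat:.2f}, two-sided p = {p_value:.4f}")
```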

Patient Satisfaction

Data Source and Study Population

Patient satisfaction was measured using the Clinician and Group Consumer Assessment of Healthcare Providers and Systems (CGCAHPS). CGCAHPS is a survey system that captures the quality of care provided by physicians to a patient during a clinic visit. These surveys help researchers and healthcare centers understand the outpatient experience, recognize the strengths and weaknesses of care providers, and identify quality improvement measures across healthcare organizations. For the purposes of this analysis, we obtained data from the CGCAHPS data repository of Froedtert and the Medical College of Wisconsin for the Division of General Internal Medicine from July 2017 to May 2019.

CGCAHPS Score Calculation

The dataset includes information on a series of survey items in addition to an overall score given by the patient to the provider. Questions about the provider included their ability to:

  1. Explain things in a way that was easy for the patient to understand;

  2. Listen carefully to the patient;

  3. Provide easy-to-understand instructions about taking care of the patient's health problems or concerns;

  4. Know the important information about the patient's medical history; and

  5. Spend enough time with the patient.

The above survey items have the response options 'Yes, definitely', 'Yes, somewhat', and 'No'. The overall rating attributed to the provider by the patient ranges from 0 (worst possible provider) to 10 (best possible provider). We treated the overall scale as both a continuous and a categorical variable. To categorize the overall rating scale, we used the standard definition of satisfaction used by Press Ganey and our institution and grouped scores of 0–8 as 'Not Satisfied' and 9–10 as 'Satisfied'.
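A minimal sketch of this dichotomization (the function name is an assumption):

```python
def satisfaction_category(overall_rating):
    """Collapse the 0-10 overall provider rating into the two groups used
    above: 0-8 'Not Satisfied', 9-10 'Satisfied'."""
    return "Satisfied" if overall_rating >= 9 else "Not Satisfied"

print(satisfaction_category(10), satisfaction_category(7))  # Satisfied Not Satisfied
```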

Statistical Analysis for Patient Satisfaction

We were able to retrieve a total of 3,419 surveys, of which 1,503 encounters were from FY 2018 (July 2017 to June 2018) and 1,916 encounters were from FY 2019 (July 2018 to May 2019). We investigated whether there were changes in patient satisfaction between FY 2018 and FY 2019. The frequencies for each of the variables that allowed categorical answer options were calculated and compared with ANOVA tests. The mean and standard deviation for the overall score, and the frequencies for the satisfied and not satisfied groups, were also calculated. The continuous overall score was compared using a t-test and the categorical overall score was compared using a chi-square test. All p-values were 2-sided and p<0.05 was considered statistically significant. Statistical analysis was performed with Stata v 14 and R Studio.
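As an illustration of the categorical comparison, a chi-square test on the dichotomized overall score could be run as below. The counts are approximations back-calculated from the percentages in Table 2, and the study itself used Stata and R rather than Python.

```python
from scipy.stats import chi2_contingency

# Approximate counts derived from Table 2: about 89.0% of 1,503 FY18 surveys
# and 88.6% of 1,916 FY19 surveys rated the provider 9-10.
#            Satisfied  Not satisfied
table = [[1338, 165],   # FY 2018
         [1698, 218]]   # FY 2019

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```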

Results

Access

Overall, the clinic work week decreased from 8.55 half days in FY17 to 8.27 half days in FY19, while the total number of clinic half days increased from 2,940 in FY17 to 3,082 in FY19. In addition, despite decreasing FTEs to allow faculty and APPs to participate in education and research, there was an overall increase in the number of private clinics, from 58 in 2017 to 63.5 in 2019, and an increase in the percent of each provider's panel that is full, from 78.89% in 2017 to 115.29% in 2019.

Figure 1 provides a visual representation of patient access measures from October 2016 through July 2019, noting the number of days until the next existing patient and new patient visit. The improvement in access for new patients is primarily driven by physician access, while the improvement in access for existing patients is primarily driven by resident and APP access.

Figure 1: Access for new and existing patients over time

Productivity

Table 1 presents the changes in faculty panel size before and after strategic changes. When comparing panel sizes for the 11 faculty present before and after strategic changes, we found a significant increase in both overall panel size and actual-to-expected ratios between 2017 and 2019.

Table 1:

Changes in faculty panel size before and after strategic changes

                Panel Size             Ratio (actual:expected)
                2017        2019       2017        2019
Provider 1      540         764        0.72        0.87
Provider 2       74         293        0.30        1.17
Provider 3      555         641        0.89        1.28
Provider 4      319         403        0.85        1.07
Provider 5      410         519        0.73        0.92
Provider 6      232         372        0.53        1.49
Provider 7      518         637        0.83        1.02
Provider 8      258         217        1.03        1.74
Provider 9      153         139        1.22        1.11
Provider 10     580         783        0.93        1.25
Provider 11     489         656        0.65        0.75
Mean            375         493        0.79        1.15
Significance    p = 0.001              p = 0.005

Satisfaction

Figure 2 provides the overall rating for GIM providers during FY 2018 and FY 2019. Table 2 presents changes in patient satisfaction between FY 2018 and FY 2019. There were no statistically significant changes in patient satisfaction during the time of implementation.

Figure 2: Overall Patient Satisfaction rating for FY 2018 and FY 2019 for Primary Care Providers

Table 2:

Patient satisfaction score comparisons between FY 2018 and FY 2019

                          Total (n = 3,419)   FY18 (n = 1,503)   FY19 (n = 1,916)   P-value
Overall Satisfaction Score
  Satisfied (9–10)        88.8%               89.0%              88.6%              0.75
  Not Satisfied (0–8)     11.2%               11.0%              11.4%
During your most recent visit, did this provider explain things in a way that was easy to understand?
  Yes, definitely         95.3%               94.6%              95.9%              0.18
  Yes, somewhat           3.9%                4.6%               3.4%
  No                      0.7%                0.8%               0.7%
During your most recent visit, did this provider listen carefully to you?
  Yes, definitely         95.0%               94.6%              95.2%              0.35
  Yes, somewhat           4.1%                4.2%               4.0%
  No                      0.9%                0.8%               0.7%
During your most recent visit, did this provider give you easy to understand instructions about taking care of these health problems or concerns?
  Yes, definitely         93.9%               94.2%              93.7%              0.78
  Yes, somewhat           5.3%                5.0%               5.6%
  No                      0.7%                0.8%               0.7%
During your most recent visit, did this provider seem to know the important information about your medical history?
  Yes, definitely         89.6%               88.5%              90.4%              0.19
  Yes, somewhat           8.2%                9.1%               7.6%
  No                      2.2%                2.5%               2.0%
During your most recent visit, did this provider show respect for what you had to say?
  Yes, definitely         96.0%               95.8%              96.2%              0.67
  Yes, somewhat           3.1%                3.2%               3.1%
  No                      0.9%                1.0%               0.7%
During your most recent visit, did this provider spend enough time with you?
  Yes, definitely         95.6%               95.6%              95.8%              0.96
  Yes, somewhat           3.2%                3.2%               3.2%
  No                      1.1%                1.2%               1.1%

Discussion

The impact of the strategic changes made to our primary care section continues to be felt; some of the most striking results are a) an increase in new patients of 22%, the highest in our Department; b) optimization of faculty panel size to increase access for new patients; c) enabling our APPs to function at the top of their license by focusing their work on continuity of care and shifting tasks from physicians to APPs where appropriate; and d) improved volume and continuity in our resident clinics through standardized resident panel sizes based on year of residency and teams of residents and faculty that support continuity for existing patients. During the time of these changes, we improved access for both our existing/established patients and new patients, increased the productivity of our providers as measured by panel size, and maintained high patient satisfaction as reported by CGCAHPS scores and individual questions.

From the standpoint of our clinical partner, monitoring the impact of our primary care redesign was necessary to make modifications that ensured the flow of the clinic. Staffing models were aligned with benchmarks derived from clinical full-time equivalents (cFTE) for registered nurses, medical assistants, and non-clinical support roles. Staffing ratios were also reviewed on a frequent basis to ensure alignment with provider changes, panel changes, and access demands as volume growth in the clinic continued. Based on the success of the process, assessments continue to ensure that all primary care team members can function at the top of their license. In addition, the clinic continues processes initiated during the redesign, including telephone triage, refill management, and pre-visit planning. A leadership team including the medical director, lead APP, clinic manager, and department administrator also reviews access and Patient Centered Medical Home (PCMH) measures weekly or monthly and makes recommendations for improvement opportunities.

This process was integral to strategic planning for the clinic and to the integration of our clinical and academic medical center goals. Specifically, accurate determination of panel size and patient access allowed for an active panel management process. Strategic decisions are then made based on these numbers, including how to balance demand, access, and capacity for patients. For example, when a faculty physician has achieved ideal panel size and access is poor, options to ensure continuity of care within the GIM clinic are implemented. These may include temporarily closing a faculty physician's panel so established patients can maintain an ongoing relationship with their primary care physician. Alternatively, the faculty physician's panel may remain open to new patients through the expanded use of team-based APPs who provide access for established patients. Another important result in relation to our academic role is increased resident clinic productivity resulting from inclusion of the resident continuity clinic in the empanelment process. This has allowed the residents to participate in active care coordination, population health, quality improvement, and team-based care, especially with hospitalized and high-risk patients. Lastly, a series of team-based processes was initiated, including a 1:1 physician or APP to medical assistant (MA) staffing ratio, co-location of individual physicians and APPs with MAs, and creation of clinic workflows and protocols. These include how to inform patients of diagnostic test results, how to facilitate paperwork management and flow within the clinic, how to coordinate new patient visits to ensure timely availability and review of patient records, and RN protocols for managing uncomplicated urinary tract infections.

Though this project provides insight into the process of optimizing primary care in an academic medical center where clinical care and the educational mission must be balanced, there are limitations that are important to note. First, this was a single site and therefore is limited in its ability to generalize to other sites without addressing contextual differences. Second, the sample size is small as this was the initial step of the redesign process, and therefore results should be considered preliminary with ongoing work continuing.

In conclusion, the inclusion of a set of well-planned metrics and the tracking of processes over time can provide insight for ongoing primary care redesign efforts at sites seeking to balance the academic mission with clinical productivity and high patient satisfaction. Improvements in faculty physician and resident productivity required the development and implementation of team-based clinic processes in order to maintain high patient satisfaction and decrease provider burden.

Acknowledgements

Effort for this study was partially supported by the National Institute of Diabetes and Digestive and Kidney Diseases (K24DK093699, R01DK118038, R01DK120861, PI: Egede), the National Institute on Minority Health and Health Disparities (R01MD013826, PI: Egede/Walker), and the American Diabetes Association (1-19-JDF-075, PI: Walker).

Footnotes

Financial Disclosures

No financial disclosures are reported by the authors of this paper.

Peer reviewers on this manuscript have no relevant financial or other relationships to disclose.

Conflict of Interest

The authors report no potential, perceived, or real conflicts of interest of a personal, professional, or financial relationship. All authors have approved the final manuscript and no sponsor agency was involved in study design, data collection, analysis/interpretation, or writing of the manuscript.

The contents of the paper and the opinions expressed within are those of the authors, and it was the decision of the authors to submit the manuscript for publication.

Coauthor: Madhuli Thakkar

Disclosure: The authors report no potential, perceived, or real conflicts of interest of a personal, professional, or financial relationship.

Coauthor: Monica O’Sullivan, MD

Disclosure: The authors report no potential, perceived, or real conflicts of interest of a personal, professional, or financial relationship.

Coauthor: Wendy Stulac Motzel

Disclosure: The authors report no potential, perceived, or real conflicts of interest of a personal, professional, or financial relationship.

References

  1. Brown-Johnson CG, Chan GK, Winget M, Shaw JG, Panton K, Hussain R, Olayiwola JN, Chang S, Mahoney M. Primary Care 2.0: design of a transformational team-based practice model to meet the quadruple aim. Am J Med Qual. 2018;1–9. doi: 10.1177/1062860618802365.
  2. Bodenheimer T, Sinsky C. From triple to quadruple aim: care of the patient requires care of the provider. Ann Fam Med. 2014;12:573–576.
  3. Farmanova E, Baker GR, Cohen D. Combining integration of care and a population health approach: a scoping review of redesign strategies and interventions, and their impact. Int J Integr Care. 2019;19(2):5, 1–25.
  4. Kiran T, Ramji N, Derocher MB, Girdhari R, Davie S, Lam-Antoniades M. Ten tips for advancing a culture of improvement in primary care. BMJ Qual Saf. 2019;28:582–587.
  5. Crabtree BF, Nutting PA, Miller WL, McDaniel RR, Stange KC, Jaen CR, Stewart E. Primary care practice transformation is hard work: insights from a 15-year developmental program of research. Med Care. 2011 Dec;49(Suppl):S28–S35.
  6. Day J, Scammon DL, Kim J, Sheets-Mervis A, Day R, Tomoaia-Cotisel A, Waitzman NJ, Magill MK. Quality, satisfaction, and financial efficiency associated with elements of primary care practice transformation: preliminary findings. Ann Fam Med. 2013;11:S50–S59.
  7. Bryce C, Fleming J, Reeve J. Implementing change in primary care practice: lessons from a mixed-methods evaluation of a frailty initiative. BJGP Open. 2018. doi: 10.3399/bjgpopen18X101421.
  8. Hung DY, Harrison MI, Liang S, Truong QA. Contextual conditions and performance improvement in primary care. Qual Manag Health Care. 2019;28(2):70–77.
  9. Harrison MI, Grantham S. Learning from implementation setbacks: identifying and responding to contextual challenges. Learn Health Syst. 2018;2:e10068.
  10. Grumbach K, Knox M, Huang B, Hammer H, Kivlahan C, Willard-Grace R. A longitudinal study of trends in burnout during primary care transformation. Ann Fam Med. 2019;17(Suppl 1):S9–S16.
  11. Quigley DD, Mendel PJ, Predmore ZS, Chen AY, Hays RD. Use of CAHPS® patient experience survey data as part of a patient-centered medical home quality improvement initiative. J Healthc Leadersh. 2015;7:41–54. doi: 10.2147/JHL.S69963.
  12. Mitchell JD, Haag JD, Klavetter E, Beldo R, Shah ND, Baumbach LJ, Sobolik GJ, Rutten LJ, Stroebel RJ. Development and implementation of a team-based, primary care delivery model: challenges and opportunities. Mayo Clin Proc. 2019 Jul;94(7):1298–1303.
  13. Davies E, Shaller D, Edgman-Levitan S, et al. Evaluating the use of a modified CAHPS survey to support improvements in patient-centered care: lessons from a quality improvement collaborative. Health Expect. 2008;11(2):160–176.
  14. Friedberg MW, SteelFisher GK, Karp M, Schneider EC. Physician groups' use of data from patient experience surveys. J Gen Intern Med. 2011;26(5):498–504.
  15. Goldstein E, Cleary PD, Langwell KM, Zaslavsky AM, Heller A. Medicare managed care CAHPS: a tool for performance improvement. Health Care Financ Rev. 2001;22(3):101–107.
  16. Patwardhan A, Spencer CH. Are patient surveys valuable as a service improvement tool in health services? An overview. J Healthc Leadersh. 2012;4:33–46.
  17. Crofton C, Lubalin JS, Darby C. Consumer Assessment of Health Plans Study (CAHPS). Foreword. Med Care. 1999;37(3 Suppl):MS1–MS9.
  18. Darby C, Hays RD, Kletke P. Development and evaluation of the CAHPS hospital survey. Health Serv Res. 2005;40(6 pt 2):1973–1976.
  19. Hays RD, Martino S, Brown JA, et al. Evaluation of a care coordination measure for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) Medicare survey. Med Care Res Rev. 2014;71(2):192–202.
  20. Sequist TD, Schneider EC, Anastario M, Odigie EG, Marshall R, Rogers WH, et al. Quality monitoring of physicians: linking patients' experiences of care to clinical quality and outcomes. J Gen Intern Med. 2008;23(11):1784–1790.
  21. Greenfield S, Kaplan S, Ware JE Jr. Expanding patient involvement in care: effects on patient outcomes. Ann Intern Med. 1985;102:520–528.
  22. Greenfield S, Kaplan SH, Ware JE Jr, Yano EM, Frank HJ. Patients' participation in medical care: effects on blood sugar control and quality of life in diabetes. J Gen Intern Med. 1988;3:448–457.
  23. Stewart MA. Effective physician-patient communication and health outcomes: a review. CMAJ. 1995;152:1423–1433.
  24. Roter DL, Hall JA, Kern DE, Barker LR, Cole KA, Roca RP. Improving physicians' interviewing skills and reducing patients' emotional distress: a randomized clinical trial. Arch Intern Med. 1995;155:1877–1884.
  25. Nelson EC, Rust RT, Zahorik A, Rose RL, Batalden P, Siemanski BA. Do patient perceptions of quality relate to hospital financial performance? J Health Care Mark. 1992;12(4):6–13.
  26. Lochner J, Lankton R, Rindfleisch K, Arndt B, Edgoose J. Transforming a family medicine residency into a community-oriented learning environment. Fam Med. 2018 Jul;50(7):518–525.
  27. Sokal-Gutierrez K, Ivey SL, Garcia RM, Azzam A. Evaluation of the Program in Medical Education for the Urban Underserved (PRIME-US) at the UC Berkeley–UCSF Joint Medical Program (JMP): the first 4 years. Teach Learn Med. 2015;27(2):189–196.
  28. Kennedy DM, Anastos CT, Genau MC. Improving healthcare service quality through performance management. Leadersh Health Serv (Bradf Engl). 2019;32(3):477–492.
  29. Murray M, Davies M, Boushon B. Panel size: how many patients can one doctor manage? Fam Pract Manag. 2007;14(4):44–51.
  30. Altschuler J, Margolius D, Bodenheimer T, Grumbach K. Estimating a reasonable patient panel size for primary care physicians with team-based task delegation. Ann Fam Med. 2012;10(5):396–400.
