Journal of Graduate Medical Education. 2013 Dec;5(4):587–593. doi: 10.4300/JGME-D-12-00233.1

The Learners' Perceptions Survey—Primary Care: Assessing Resident Perceptions of Internal Medicine Continuity Clinics and Patient-Centered Care

John M Byrne, Barbara K Chang, Stuart C Gilman, Sheri A Keitz, Catherine P Kaminetzky, David C Aron, Sam Baz, Grant W Cannon, Robert A Zeiss, Gloria J Holland, T Michael Kashner
PMCID: PMC3886456  PMID: 24455006

Abstract

Background

In 2010, the Department of Veterans Affairs (VA) implemented a national patient-centered care initiative that organized primary care into interdisciplinary teams of health care professionals to provide patient-centered, continuous, and coordinated care.

Objective

We assessed the discriminant validity of the Learners' Perceptions Survey—Primary Care (LPS-PC), a tool designed to measure residents' perceptions about their primary and patient-centered care experiences.

Methods

Between October 2010 and June 2011, the LPS-PC was administered to Loma Linda University Medical Center internal medicine residents assigned to continuity clinics at the VA Loma Linda Healthcare System (VALLHCS), a university setting, or the county hospital. Adjusted differences in satisfaction ratings across settings and over domains (patient- and family-centered care, faculty and preceptors, learning, clinical, work and physical environments, and personal experience) were computed using a generalized linear model.

Results

Our response rate was 86% (77 of 90). Residents were more satisfied with patient- and family-centered care at the VALLHCS than at either the university or county (P < .001). However, faculty and preceptors (odds ratio [OR]  =  1.53), physical (OR  =  1.29), and learning (OR  =  1.28) environments had more impact on overall resident satisfaction than patient- and family-centered care (OR  =  1.08).

Conclusions

The LPS-PC demonstrated discriminant validity for assessing residents' perceptions of their patient-centered clinical training experience across outpatient primary care settings in an internal medicine residency program. The largest difference in scores was in the patient- and family-centered care domain, in which residents rated the VALLHCS much higher than the university or county sites.


What was known

Patient-centered care approaches such as team-based care and chronic disease management have been recommended to improve patient care.

What is new

A survey by the Department of Veterans Affairs (VA) assessed residents' experiences in primary care settings, focusing on faculty and preceptors, learning, clinical work, the physical environment, and personal experiences.

Limitations

The single-specialty, single-institution design and small sample size limit generalizability.

Bottom line

The survey effectively discriminated across the 3 teaching sites, suggesting that VA sites may provide better patient-centered care experiences than university-based or county hospital–based sites.

Editor's Note: The online version of this article contains the Learners' Perceptions Survey—Primary Care survey scores by domains and elements comparing Veterans Affairs, university, and county Loma Linda University Medical Center internal medicine resident continuity clinics.

Introduction

The relative disinterest of residents in a primary care (PC) career likely results from challenges that include inequitable remuneration, overworked physicians, burdensome administrative tasks,1–3 and patients' complex chronic diseases that require comprehensive, coordinated services and team care.4,5 Patient-centered care principles such as team-based care, chronic disease management, and enhanced communication have been recommended by national organizations as part of internal medicine (IM) residency reform and redesign.6,7 However, evaluating these reforms is limited by a lack of instruments designed to measure the effect of patient-centered care principles on residents' perceptions of PC training8 and to compare IM continuity clinics that differ in terms of available resources and education.9,10

The Department of Veterans Affairs (VA) Learners' Perceptions Survey—Primary Care (LPS-PC) is a survey designed to assess PC education and trainee perceptions of patient-centered care. The LPS-PC is modified from the original VA Learners' Perceptions Survey (LPS).11 Since 2001, the VA's Office of Academic Affiliations (OAA) has administered the LPS to assess health profession trainees' satisfaction with their VA clinical training experiences.12 The LPS has validity evidence across professional disciplines and clinical settings.13–16 We report on a study to determine whether the LPS-PC can discriminate resident perceptions of patient-centered care across different continuity clinics in an IM residency program.

Methods

Survey Development

The LPS-PC was derived from the LPS under the OAA's National Evaluation Workgroup, which consisted of OAA leadership, clinician educators, and health services and education researchers. To develop a PC version of the LPS, the work group made 3 changes to the survey. It added a facility-level training experience domain that asked respondents to rate the value of their PC clinical experience (poor, fair, adequate, very good, excellent); included an 18-element patient-centered care domain; and changed several questions to focus on attributes of the PC setting, including faculty and preceptors, the learning, clinical work, and physical environment, and learners' personal experiences domains. Survey content was reviewed by a focus group of PC physicians, associated health and nursing educators, and VA leadership. Changes were made based on their expert consensus recommendations. The LPS-PC was pilot tested with 12 Loma Linda University Medical Center (LLUMC) IM residents at the VA Loma Linda Healthcare System (VALLHCS). Residents provided verbal and written feedback, and modifications were made based on their input.

Setting and Participants

The LLUMC IM residency program consists of the preliminary, categorical, and PC tracks and a medicine-pediatrics residency. Categorical and PC track residents are arbitrarily assigned to continuity clinics at 1 of 3 sites: VALLHCS, university, and county hospital clinics with approximately 35, 24, and 22 clinics, respectively, at each site. The 9 PC track residents have 1 or 2 continuity clinics per week with 3 to 7 clinics at their primary site at the VALLHCS, and 1 or 2 clinics at a secondary site (the university or county) per month. Categorical residents are assigned 1 half-day of continuity care clinics per week and rotate through an additional 6 months of 1-month ambulatory block rotations that consist of PC and subspecialty experiences. Residents have an additional 3 months of electives and 4 months of “selectives,” in which they choose subspecialty rotations that may be in part or wholly ambulatory. Overall, PC track residents have 6 more months of ambulatory care training in a variety of specialty clinics to enhance their PC skills.

Beginning in 2010, the VA implemented a patient-centered care model known as the Patient Aligned Care Team (PACT),17 which aims to provide patients with high-quality, outcomes-based, patient-centered, continuous, and coordinated care through an ongoing relationship with a team of health practitioners, caregivers, and administrative support staff.18 PACT was implemented in all PC clinics, including resident and faculty clinics. Before PACT, the VA developed many processes that support patient-centered principles,19 including an advanced clinical information system,20–22 performance metrics and quality improvement processes,23 high-quality chronic disease and preventive care,24–26 and timely access to care.27

In contrast, residents at the university and county continuity clinics were not formally organized into PC teams with nursing and other staff. In addition, at the time of the study, neither the county nor the university sites had a fully functional electronic health record.

Survey Administration

The survey was administered voluntarily, electronically, and anonymously to all LLUMC categorical and PC track IM residents between October 2010 and June 2011. Demographic data included sex, graduation year, and continuity clinic site assignment. To maintain respondent anonymity, residents did not identify their residency track due to the small numbers of PC-track residents. Residents were asked to identify their primary continuity clinic assignment and answer the survey about that clinic. Due to the anonymity of the survey, it is not known which clinic the PC track residents evaluated. Reminders to complete the survey were sent through e-mail, texts, and flyers.

The study was approved by the VALLHCS Institutional Review Board.

Analysis

Domain scores were computed for each respondent as ordinal scales or as standardized means, obtained by dividing the mean of each domain's element responses by the standard deviation across all respondents. Element responses were scored by assigning integer values to the 5 response categories (1  =  very dissatisfied, 2  =  dissatisfied, 3  =  neither, 4  =  satisfied, 5  =  very satisfied). Because all but 1 factor per domain satisfied the "eigenvalue less than 1" rule and the ordered response categories were coded consecutively, standardized mean scores represent a sufficient statistic to measure domain satisfaction as a latent respondent characteristic.28
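The standardized-mean scoring described above can be sketched as follows. The data are invented, and the computation reflects one plausible reading of the description (each respondent's domain mean divided by the standard deviation of those means across respondents); it is an illustration, not the authors' analysis code.

```python
# Minimal sketch of standardized mean domain scoring (invented data).
# Element responses are coded 1 = very dissatisfied ... 5 = very satisfied.
import statistics

# Each inner list: one respondent's responses to a domain's elements.
responses = [
    [4, 5, 4],  # respondent A
    [3, 3, 2],  # respondent B
    [5, 4, 4],  # respondent C
]

# Mean of the domain's element responses, per respondent.
domain_means = [statistics.mean(r) for r in responses]

# Standard deviation of those means across all respondents.
sd = statistics.stdev(domain_means)

# Standardized mean domain score for each respondent.
standardized = [m / sd for m in domain_means]
```

With these invented responses, respondents A and C receive identical standardized scores, since their domain means are equal; only the relative spread across respondents determines the scale.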

Adjusted differences across facilities were computed using generalized linear models with a normal distribution and identity linking function for standardized mean scores as well as with a multinomial distribution and a cumulative linking function for ordinal domain scores. Facility differences are reported as differences in t-distributed standardized means and as Wald χ2 distributed odds ratios. The latter reflects the change in the likelihood that an average responder would report a higher level of satisfaction had their training experience been at the university instead of VALLHCS, or at the county instead of the university. Differences were adjusted to reflect the mix of patients that the respondent reported seeing in the facility using an instrumental variable computed as the difference in respondent and the respondent's facility mean for the 3 facility-invariant elements (facility parking, facility location convenience, and patient record systems). Finally, respondents' value of each domain was computed as the independent association of each domain-level satisfaction on overall facility-level satisfaction.

Results

Of the 90 categorical and PC track residents, 77 (86%) responded, with 31 (40%), 26 (34%), and 20 (26%) from VALLHCS, university, and county, respectively. Among the respondents, 41 (53%) were men. In addition, 30 (39%), 26 (34%), 17 (22%), and 2 (3%) were postgraduate year (PGY)-1, PGY-2, PGY-3, and PGY-4, respectively.

Adjusted differences between facilities (VALLHCS, university, and county) are reported in terms of standardized means (Table 1) and ordinal scales (Table 2). The VALLHCS scores were significantly higher than the university and county scores for all domains except the physical environment (P < .05; Table 1). The university domain scores were not statistically different from those of the county, with the exception of physical and work environment, which was higher at the university.

TABLE 1.

Primary Care Learners' Perceptions Survey Scores by Domains Comparing Veterans Affairs (VA), University, and County Loma Linda Internal Medicine Resident Continuity Clinics


TABLE 2.

Primary Care Learners' Perceptions Survey Scores by Domains Comparing Veterans Affairs (VA), University, and County Loma Linda University Medical Center Internal Medicine Resident Continuity Clinics


Resident satisfaction with the survey domains is reported in Table 2, and satisfaction with the survey elements is reported in the Appendix (provided as online supplemental material), ranked in order of the lowest odds ratio comparing the VALLHCS and the university; a lower odds ratio favors the VALLHCS. For example, in the clinical environment domain, the odds ratio of 0.03 means that a resident assigned to the university clinic has a 3% chance of reporting a higher level of satisfaction than a resident assigned to the VALLHCS clinic. Stated another way, a resident assigned to the VALLHCS clinic is 1/0.03, or about 33.3, times more likely to report a higher level of satisfaction than a resident assigned to the university. Notably, residents at the VALLHCS perceived more support from nursing staff between visits, better management of telephone calls, and better time availability of appointments. However, no differences were seen between the VALLHCS and university in terms of how well physicians worked with ancillary staff and nurses.
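The odds-ratio arithmetic in that example is simply an inversion; this tiny illustration uses the 0.03 value quoted above:

```python
# Inverting an odds ratio flips the direction of the comparison:
# OR = 0.03 (university vs VALLHCS) becomes 1/0.03 (VALLHCS vs university).
odds_ratio = 0.03
inverted = 1 / odds_ratio
print(round(inverted, 2))  # 33.33
```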

In the working environment domain, the patient record system was rated higher in both the VA and university clinics than in the county clinic, the only site whose electronic health record was limited to a single element (laboratory values).

A number of patient- and family-centered care domain elements were rated higher in the VALLHCS compared with the university, including use of technology to communicate with patients. Residents at the VALLHCS were also more satisfied with the ability to identify chronic disease patient cohorts needing additional interventions and to follow a patient panel longitudinally. However, no statistical difference was found between the VALLHCS and the university or between the university and the county in interprofessional, collaborative team care.

Residents valued VALLHCS PC training more highly than university training, but no differences were noted between the university and the county (Table 2). All domains contributed to residents' rating of the value of their PC experience, but the faculty and preceptors domain had the greatest effect (Table 3).

TABLE 3.

Primary Care Learners' Perceptions Survey Domain Scores Contribution to Residents' Valuation of Their Loma Linda Internal Medicine Resident Continuity Clinics


Discussion

We found that the LPS-PC discriminated characteristics of PC education between sites in a residency program. Consistent with our expectations, the largest difference in domain scores was the patient- and family-centered care domain, in which the VALLHCS was rated much higher. These differences are consistent with the anecdotal observations of this program's residents regarding the VALLHCS as well as that of residents at other VA locations.29 This finding is also corroborated by a national study of IM continuity care clinics, which demonstrated that VA clinics are more prepared to provide patient-centered care.30 Therefore, the LPS-PC may be valuable in assessing and comparing patient-centered care in IM resident continuity clinics as well as in other PC training sites.

The LPS-PC may also have a role in evaluating residents' valuation of their patient-centered care experience. The data in this study are consistent with previous work that showed that valuation of continuity clinics may be associated with preceptor characteristics and operational issues, such as nursing support and medical record systems,8 and that teaching quality and the learning environment contribute to satisfaction with ambulatory clinics.31,32 Some data indicate that teaching and implementation of a team model and patient-centered medical home in PC increases residents' satisfaction with their clinic experiences33 and their satisfaction with caring for patients with chronic pain.34 The lower valuation of patient-centered care in this study may perhaps be the result of not clearly identifying practice components as part of a patient-centered initiative. For example, although residents in this program rated VA technology much higher, collaborative interprofessional team care was not rated more highly despite the existence of PACT teams. Therefore, the survey may be useful in identifying these deficiencies in patient-centered care implementation and education. Further studies are needed to determine residents' perceptions of the contribution of patient-centered care concepts in their valuation of and satisfaction with PC training.

In contrast to these data, previous work comparing resident satisfaction with the affiliated hospitals in this study's local graduate medical education system generally showed higher satisfaction with the university.35 Unlike the present study, faculty were rated similarly across affiliated hospitals. In addition, clinical services such as nursing, availability and timeliness of laboratory and imaging results, and social work and case management were rated lowest in the VALLHCS, although it did show improvements over time. These differences in institutional resources and priorities underscore the value of benchmarking training sites to identify relative areas of strength and weakness that can potentially inform curricular development and identify areas for improvement.35

Our study has limitations. First, the study was limited to 1 institution and 1 academic year, and the sample size was small, thus the findings may not be generalizable. Second, the LPS-PC data were not compared to other tools used to assess patient-centered care implementation.

Conclusion

The LPS-PC is a tool that can be used to assess residents' perceptions of their PC education and the patient-centered care environment. The largest difference in domain scores was in the patient- and family-centered care domain, in which the VALLHCS was rated much higher than the university or county sites. As patient-centered concepts and their measurement continue to evolve,36 future studies are needed to evaluate trainees' perceptions over time, compare the LPS-PC to other tools used to assess patient-centered care implementation, and assess residents' valuation of patient-centered care in PC education.

Footnotes

John M. Byrne, DO, is Associate Chief of Staff for Education, VA Loma Linda Healthcare System, and Associate Professor of Medicine, Loma Linda University School of Medicine; Barbara K. Chang, MD, MA, is Director of Medical and Dental Education, Office of Academic Affiliations, Department of Veterans Affairs, and Professor Emeritus, University of New Mexico School of Medicine; Stuart C. Gilman, MD, MPH, is Director, Advanced Fellowships and Professional Development, Office of Academic Affiliations, Department of Veterans Affairs, and Professor of Clinical Health Sciences, University of California Irvine School of Medicine; Sheri A. Keitz, MD, PhD, is Senior Associate Dean for Faculty and Professor of Medicine, University of Miami Miller School of Medicine; Catherine P. Kaminetzky, MD, MPH, is Associate Chief of Staff for Education and Director, Center for Education and Development, VA Puget Sound Health Care System, and Assistant Professor of Medicine, University of Washington School of Medicine; David C. Aron, MD, MS, is Associate Chief of Staff for Education, VA Senior Scholar, Louis Stokes Cleveland DVA Medical Center, Professor of Medicine and Epidemiology and Biostatistics, School of Medicine, and Professor of Organizational Behavior, Weatherhead School of Management, Case Western Reserve University; Sam Baz, MD, is Program Director, Loma Linda University Medical Center Internal Medicine Residency, and Assistant Professor of Medicine, Loma Linda University School of Medicine; Grant W. Cannon, MD, is Associate Chief of Staff for Academic Affiliations, George E. Wahlen VA Medical Center, and Professor and Thomas E. and Rebecca D. Jeremy Presidential and Endowed Chair for Arthritis Research, School of Medicine, University of Utah; Robert A. Zeiss, PhD, is Director of Associated Health Education, Office of Academic Affiliations, Department of Veterans Affairs; Gloria J.
Holland, PhD, is Special Assistant for Policy and Planning, Office of Academic Affiliations, Department of Veterans Affairs; and T. Michael Kashner, PhD, JD, is Health Science Specialist, Office of Academic Affiliations, Department of Veterans Affairs, Research Professor of Medicine, Loma Linda University Medical School, and Research Professor of Psychiatry, University of Texas Southwestern Medical Center.

Funding: This material is the result of work supported with resources and the use of the facilities at the VA Loma Linda Healthcare System.

References

  1. Keirns CC, Bosk CL. Perspective: the unintended consequences of training residents in dysfunctional outpatient settings. Acad Med. 2008;83(5):498–502. doi: 10.1097/ACM.0b013e31816be3ab.
  2. Bodenheimer T, Grumbach K, Berenson RA. A lifeline for primary care. N Engl J Med. 2009;360(26):2693–2696. doi: 10.1056/NEJMp0902909.
  3. Grumbach K, Bodenheimer T. A primary care home for Americans: putting the house in order. JAMA. 2002;288(7):889–893. doi: 10.1001/jama.288.7.889.
  4. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness. JAMA. 2002;288(14):1775–1779. doi: 10.1001/jama.288.14.1775.
  5. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–2645. doi: 10.1056/NEJMsa022615.
  6. Meyers FJ, Weinberger SE, Fitzgibbons JP, Glassroth J, Duffy D, Clayton CP, et al. Redesigning residency training in internal medicine: the consensus report of the Alliance for Academic Internal Medicine Education Redesign Task Force. Acad Med. 2007;82(12):1211–1219. doi: 10.1097/ACM.0b013e318159d010.
  7. Weinberger SE, Smith LG, Collier VU; Education Committee of the American College of Physicians. Redesigning training for internal medicine. Ann Intern Med. 2006;144(12):927–932. doi: 10.7326/0003-4819-144-12-200606200-00124.
  8. Sisson SD, Boonyasai R, Baker-Genaw K, Silverstein J. Continuity clinic satisfaction and valuation in residency training. J Gen Intern Med. 2007;22(12):1704–1710. doi: 10.1007/s11606-007-0412-0.
  9. Nadkarni M, Reddy S, Bates CK, Fosburgh B, Babbott S, Holmboe E. Ambulatory-based education in internal medicine: current organization and implications for transformation. Results of a national survey of resident continuity clinic directors. J Gen Intern Med. 2010;26(1):16–20. doi: 10.1007/s11606-010-1437-3.
  10. Mladenovic J, Shea JA, Duffy FD, Lynn LA, Holmboe ES, Lipner RS. Variation in internal medicine residency clinic practices: assessing practice environments and quality of care. J Gen Intern Med. 2008;23(7):914–920. doi: 10.1007/s11606-008-0511-6.
  11. Kashner TM, Bernett DS, Wicker AB; VA Office of Academic Affiliations National Evaluation Workgroup. Learners' Perceptions Survey (LPS2012): Instructions Manual for Data Users. Washington, DC: Office of Academic Affiliations, Veterans Health Administration, Department of Veterans Affairs; 2012.
  12. Keitz SA, Holland GJ, Melander EH, Bosworth HB, Pincus SH; VA Learners' Perceptions Working Group. The Veterans Affairs Learners' Perceptions Survey: the foundation for education quality improvement. Acad Med. 2003;78(9):910–917. doi: 10.1097/00001888-200309000-00016.
  13. Cannon GW, Keitz SA, Holland GJ, Chang BK, Byrne JM, Tomolo A, et al. Factors determining medical students' and residents' satisfaction during VA-based training: findings from the VA Learners' Perceptions Survey. Acad Med. 2008;83(6):611–620. doi: 10.1097/ACM.0b013e3181722e97.
  14. Kaminetzky CP, Keitz SA, Kashner TM, Aron DC, Byrne JM, Chang BK, et al. Training satisfaction for subspecialty fellows in internal medicine: findings from the Veterans Affairs (VA) Learners' Perceptions Survey. BMC Med Educ. 2011;11:21. doi: 10.1186/1472-6920-11-21.
  15. Kashner TM, Henley SS, Golden RM, Byrne JM, Keitz SA, Cannon GW, et al. Studying the effects of ACGME duty hours limits on resident satisfaction: results from VA Learners' Perceptions Survey. Acad Med. 2010;85(7):1130–1139. doi: 10.1097/ACM.0b013e3181e1d7e3.
  16. Lam HT, O'Toole TG, Arola PE, Kashner TM, Chang BK. Factors associated with the satisfaction of millennial generation dental residents. J Dent Educ. 2012;76(11):1416–1426.
  17. Patient Centered Primary Care Implementation Work Group, US Department of Veterans Affairs. Patient-Centered Medical Home Concept Paper. http://www.va.gov/PrimaryCare/docs/pcmh_ConceptPaper.doc. Accessed June 1, 2012.
  18. Klein S. The Veterans Health Administration: implementing patient-centered medical homes in the nation's largest integrated delivery system. Case Study: High-Performing Health Care Organization. 2011;1537(16):1–24. The Commonwealth Fund. http://www.commonwealthfund.org/~/media/Files/Publications/Case%20Study/2011/Sep/1537_Klein_veterans_hlt_admin_case%20study.pdf. Accessed May 21, 2012.
  19. Yano EM, Simon BF, Lanto AB, Rubenstein LV. The evolution of changes in primary care delivery underlying the Veterans Health Administration's quality transformation. Am J Public Health. 2007;97(12):2151–2159. doi: 10.2105/AJPH.2007.115709.
  20. Evans DC, Nichol WP, Perlin JB. Effect of the implementation of an enterprise-wide Electronic Health Record on productivity in the Veterans Health Administration. Health Econ Pol Law. 2006;1(pt 2):163–169. doi: 10.1017/S1744133105001210.
  21. Byrne JM, Elliott S, Firek A. Initial experience with patient-clinician secure messaging at a VA medical center. J Am Med Inform Assoc. 2009;16(2):267–270. doi: 10.1197/jamia.M2835.
  22. Doebbeling BN, Vaughn TE, McCoy KD, Glassman P. Informatics implementation in the Veterans Health Administration (VHA) healthcare system to improve quality of care. AMIA Annu Symp Proc. 2006:204–208.
  23. Perlin JB, Kolodner RM, Roswell RH. The Veterans Health Administration: quality, value, accountability, and information as transforming strategies for patient-centered care. Am J Manag Care. 2004;10(11):828–836.
  24. Trivedi AN, Matula S, Miake-Lye I, Glassman PA, Shekelle P, Asch S. Systematic review: comparison of the quality of medical care in Veterans Affairs and non-Veterans Affairs settings. Med Care. 2011;49(1):76–88. doi: 10.1097/MLR.0b013e3181f53575.
  25. Asch SM, McGlynn EA, Hogan MM, Hayward RA, Shekelle P, Rubenstein L, et al. Comparison of quality of care for patients in the Veterans Health Administration and patients in a national sample. Ann Intern Med. 2004;141(12):938–945. doi: 10.7326/0003-4819-141-12-200412210-00010.
  26. Jha AK, Perlin JB, Kizer KW, Dudley RA. Effect of the transformation of the Veterans Affairs Health Care System on the quality of care. N Engl J Med. 2003;348(22):2218–2227. doi: 10.1056/NEJMsa021899.
  27. Armstrong B, Levesque O, Perlin JB, Rick C, Schectman G. Reinventing Veterans Health Administration: focus on primary care. J Healthc Manag. 2005;50(6):399–408.
  28. Hardouin JR, Mesbah M. Clustering binary variables in subscales using an extended Rasch model and Akaike information criterion. Commun Stat Theory Methods. 2004;33(6):1277–1294.
  29. Galen BT. In support of residency training at an academic veterans hospital. Acad Med. 2012;87(8):993–994. doi: 10.1097/ACM.0b013e31825ccb4f.
  30. Babbott SF, Beasley BW, Reddy S, Duffy FD, Nadkarni M, Holmboe ES. Ambulatory office organization for internal medicine resident medical education. Acad Med. 2010;85(12):1880–1887. doi: 10.1097/ACM.0b013e3181fa46db.
  31. Probst JC, Baxley EG, Schell BJ, Cleghorn GD, Bogdewic SP. Organizational environment and perceptions of teaching quality in seven South Carolina family medicine residency programs. Acad Med. 1998;73(8):887–893. doi: 10.1097/00001888-199808000-00014.
  32. Roth LM, Severson RK, Probst JC, Monsur JC, Markova T, Kushner SA, et al. Exploring physician and staff perceptions of the learning environment in ambulatory residency clinics. Fam Med. 2006;38(3):177–184.
  33. Roth LM, Markova T, Monsur JC, Severson RK. Effects of implementation of a team model on physician and staff perceptions of a clinic's organizational and learning environments. Fam Med. 2009;41(6):434–439.
  34. Evans L, Whitham JA, Trotter DR, Fitz KR. An evaluation of family medicine residents' attitudes before and after a PCMH innovation for patients with chronic pain. Fam Med. 2011;43(10):702–711.
  35. Byrne JM, Loo LK, Giang D. Monitoring and improving resident work environment across affiliated hospitals: a call for a national resident survey. Acad Med. 2009;84(2):199–205. doi: 10.1097/ACM.0b013e318193833b.
  36. Stange KC, Nutting PA, Miller WL, Jaén CR, Crabtree BF, Flocke SA, et al. Defining and measuring the patient-centered medical home. J Gen Intern Med. 2010;25(6):601–612. doi: 10.1007/s11606-010-1291-3.
