July 1, 2003 marked a watershed moment in graduate medical education in this country. New interns entered residency programs transformed by two sweeping reforms from the Accreditation Council for Graduate Medical Education (ACGME), which acted to raise our profession's accountability for training new physicians.
First, the ACGME approved duty hours standards for all accredited programs.1 This regulation reflects growing concerns, dating back to the Libby Zion case in 1984, about the effects of fatigue on resident well-being and patient safety. The ACGME first responded in the 1990s with loosely enforced work hours limitations that varied widely by specialty. As many programs failed to comply (30% of internal medicine programs reviewed in 1999 were cited2), the public outcry grew louder and was heard at the Occupational Safety and Health Administration and in the halls of Congress.3 By this time, stakeholders could cite a growing body of scientific evidence linking long work hours to poorer performance and increased burnout among residents.4 Against this backdrop, the ACGME imposed (and is beginning to enforce) an 80-hour weekly limit, 1 day off in 7, and relief after a 24-hour shift plus 6 hours for transition.
Second, the ACGME “outcomes project” changed the accreditation currency from process and structure to outcome.5 Program directors must now provide more than a schedule of rotations, a written curriculum, and agreements with clinical training venues. They must objectively document that their residents achieve a level of competence in six general dimensions of practice. Two of these, practice-based learning and improvement and systems-based practice, reflect a recent emphasis on evidence-based reflective practice and newer models of team care and disease management.
As directors dramatically restructure their programs to comply with these two formidable unfunded mandates, they struggle with many uncertainties.6,7 They can take some lessons from New York State, which has legislated work hours restrictions since 1989.8 And the ACGME “toolbox” provides some guidance in evaluating the six competencies. Nonetheless, many questions remain unanswered. In this issue, educators respond to these challenges with reports of creative studies, innovative curricula, and robust evaluation strategies.
Hoellein et al.'s study adds to the evidence underpinning the work hours restrictions.9 In this observational study of 646 clinic encounters in a single program, patients were less satisfied with visits to postcall residents, even after adjusting for important confounders. Of note, resident satisfaction did not vary with call status.
Wong et al. developed an innovative day-float rotation that responds to both ACGME mandates.10 A senior medical resident joins the postcall team for morning rounds, assists with daily tasks, and pursues the team's emerging clinical questions. When the day-float rotation was in operation, ward residents worked from 67 to 81 hours per week, compared to 79 to 90 hours prior to implementation. In addition, by keeping a learning portfolio, the day-float resident documents her practice-based learning competency. Day-float residents collected an average of 6.6 questions and answered 6.1 per rotation. We suspect a very low recording rate, however, given studies showing that residents encounter clinical questions at a much higher rate.11
Pinsky and Fryer-Edwards describe their experience with a more ambitious, program-wide portfolio system, designed to evaluate and promote reflection around all six ACGME competencies.12 They identified five elements that may promote successful implementation: separate working and performance functions, a supportive climate, skill development in faculty and residents, observation of progress over time, and fostering of mentoring opportunities. Residents in this program apparently offered little resistance to maintaining portfolios. However, skeptical program directors may require more quantitative feasibility data before asking their busy residents to embrace this extra task.
Residents completed quality improvement projects in Ogrinc et al.'s innovative practice-based learning and improvement curriculum.13 In a pre-post controlled trial, exposed residents improved their scores on a validated quality improvement knowledge instrument. Furthermore, their project sponsors appreciated the meaningful contributions to care in their clinical venues.
Two groups of investigators raised the bar for procedure training, which falls within the “patient care” competency. Smith et al. developed an innovative medical procedure service, which included a web-based component to teach procedural knowledge and 24-hour availability of qualified faculty (hospitalists and intensivists) to teach, supervise, evaluate, and track procedural experience.14 The additional billing revenues partially offset the costs of increased faculty coverage. In a 12-month pilot program, the complication rate was 3.7% over 246 procedures. The pneumothorax rate for thoracenteses was only 3.5%, which compares favorably with a 10.6% rate in a pooled analysis of the literature. Watkins and Moran developed a targeted intervention to train residents in Pap smear sampling, which included a skills workshop with a manikin and peer comparison feedback of adequacy data.15 In a randomized controlled trial, exposed residents were twice as likely to obtain sufficient endocervical cells, adjusting for important confounding variables.
On a practical level, competency-based evaluations add a substantial administrative burden to already stressed program leadership and staff. Triola et al. developed a web-based modular evaluation system that generates competency-based, venue-specific, and training level-specific questions for evaluators and customized evaluation reports for residents, faculty, and program directors.16 In their program, evaluation compliance increased from 35% in a paper system to 85% at 10 months after transition to this electronic system.
Finally, while investigators often report easily measured endpoints like trainee satisfaction, knowledge, and skills, we are charged to determine “how the design and conduct of medical education programs affect the clinical outcomes produced by doctors.”17 Educators, in this issue, evaluated the impact of educational interventions on trainee performance, such as procedure complications, Pap smear adequacy, and patient satisfaction. Furthermore, the educational activities themselves may add value to clinical care.18
Of course, “this is not the end,” as Churchill said. “It is not even the beginning of the end. But it is, perhaps, the end of the beginning.” The impact of these reforms will play out over many years. Much more research is needed to address lingering questions. For instance, many educators worry about the repercussions of disrupting continuity of care to meet work hours standards. What, then, will be the net effect on quality of care, patient satisfaction, resident education, and resident well-being? What is the ideal resident workweek? And important economic questions should rekindle the debate about GME funding. Will hospitals be able to absorb the “replacement costs” of resident hours and maintain their support of residency programs? Will program faculty be able to fully implement competency-based training, while they receive a shrinking allocation of GME financial support and insufficient academic “credit” for this work? Will researchers be able to advance the lagging educational science, despite extremely limited funding opportunities? We do, indeed, live in interesting times.
REFERENCES
- 1. Accreditation Council for Graduate Medical Education. Duty hours language. 2004. Available at: http://www.acgme.org/DutyHours/dutyHoursLang703.pdf. Accessed February 2004.
- 2. Accreditation Council for Graduate Medical Education. Percent of programs and institutions reviewed in 1999 and 2000 that were cited for work hours and related requirements. 2001. Available at: http://www.acgme.org/new/dutyhrscompare.pdf. Accessed February 2004.
- 3. The Patient and Physician Safety Act of 2001. Available at: http://www.house.gov/conyers/news_patientsafetyprtectionact.htm. Accessed January 2004.
- 4. Accreditation Council for Graduate Medical Education. Annotated bibliography on resident and physician work hours. 2003. Available at: http://www.acgme.org/DutyHours/dutyhoursarticles.pdf. Accessed February 2004.
- 5. Accreditation Council for Graduate Medical Education. Outcomes project: general competencies. 2004. Available at: http://www.acgme.org/outcome/comp/compFull.asp. Accessed January 2004.
- 6. Wilson MC. In pursuit of optimal duty hours and resident experiences. J Gen Intern Med. 2004;19:97–8. doi: 10.1111/j.1525-1497.2004.31102.x.
- 7. Heard JK, Allen RM, Clardy J. Assessing the needs of residency program directors to meet the ACGME general competencies. Acad Med. 2002;77:750. doi: 10.1097/00001888-200207000-00040.
- 8. Whang EE, Mello MM, Ashley SW, Zinner MJ. Implementing resident work hour limitations: lessons from the New York State experience. Ann Surg. 2003;237:449–55. doi: 10.1097/01.SLA.0000059966.07463.19.
- 9. Hoellein AR, Feddock CA, Griffith CH, et al. Are continuity clinic patients less satisfied when the resident is postcall? J Gen Intern Med. 2004;19:563–6. doi: 10.1111/j.1525-1497.2004.30165.x.
- 10. Wong JG, Holmboe ES, Huot SJ. Teaching and learning in an 80-hour workweek: a novel day-float rotation for medical residents. J Gen Intern Med. 2004;19:519–23. doi: 10.1111/j.1525-1497.2004.30153.x.
- 11. Osheroff JA, Forsythe DE, Buchanan BG, Bankowitz RA, Blumenfeld BH, Miller RA. Physicians' information needs: analysis of questions posed during clinical teaching. Ann Intern Med. 1991;114:576–81. doi: 10.7326/0003-4819-114-7-576.
- 12. Pinsky LE, Fryer-Edwards K. Diving for PERLS: working and performance portfolios for evaluation and reflection on learning. J Gen Intern Med. 2004;19:583–8. doi: 10.1111/j.1525-1497.2004.30224.x.
- 13. Ogrinc GO, Headrick LA, Morrison LJ, Foster T. Teaching and assessing resident competence in practice-based learning and improvement. J Gen Intern Med. 2004;19:496–500. doi: 10.1111/j.1525-1497.2004.30102.x.
- 14. Smith CS, Gordon CE, Feller-Kopman D, et al. Creation of an innovative inpatient medical procedure service and a method to evaluate house staff competency. J Gen Intern Med. 2004;19:510–3. doi: 10.1111/j.1525-1497.2004.30161.x.
- 15. Watkins RS, Moran WP. Competency-based learning: the impact of targeted resident education and feedback on Pap smear adequacy rates. J Gen Intern Med. 2004;19:546–9. doi: 10.1111/j.1525-1497.2004.30150.x.
- 16. Triola MM, Feldman HJ, Pearlman EB, Kalet AL. Meeting requirements and changing culture: the development of a web-based clinical skills evaluation system. J Gen Intern Med. 2004;19:492–5. doi: 10.1111/j.1525-1497.2004.30065.x.
- 17. Whitcomb ME. Research in medical education: what do we know about the link between what doctors are taught and what they do? Acad Med. 2002;77:1067–8. doi: 10.1097/00001888-200211000-00001.
- 18. Ogrinc G, Headrick L, Boex J. Understanding the value added to clinical care by educational activities. Value of Education Research Group. Acad Med. 1999;74:1080–6. doi: 10.1097/00001888-199910000-00009.
