Journal of Graduate Medical Education. 2011 Sep;3(3):435–437. doi: 10.4300/JGME-D-11-00150.1

The Year is Over, Now What? The Annual Program Evaluation

Deborah Simpson, Monica Lypson
PMCID: PMC3179219  PMID: 22942982

The Challenge

Continuous improvement of graduate medical education programs is the objective of the Common Program Requirement1 for an annual program evaluation. Although the Common Program Requirements include guidelines outlining the who, what, and how of the evaluation, expectations for a thorough evaluation appear to be unclear: “Evaluation of Program” is one of the most common citations issued by Residency Review Committees (RRCs).

What Is Known

Process Evaluation and Strategies

In contrast to researchers, whose goal is to generate new, generalizable knowledge, evaluators focus on determining the value or effectiveness of a specific program, event, or activity. The annual program evaluation is a form of “process” evaluation, designed to support ongoing improvement. Process evaluation focuses on the degree to which program activities, materials, and procedures support achievement of program goals, including resident or fellow performance.2 The evaluator (program director, designated institutional officer [DIO], educator, or faculty member) is responsible for leading a systematic, comprehensive, objective, and fair evaluation process3 that provides all stakeholders with accurate and useful information for continuous improvement.4

Rip Out Action Items

The Plan-Do-Study-Act Program Evaluation Cycle:

  1. Plan:

    1. Identify problem areas noted by external sources, such as the Institutional Review Committee and the RRC.

    2. Identify existing evaluation information by Common Program Requirement and present it in a blueprint.

    3. Identify information gaps per blueprint and gather data.

  2. Do:

    1. Collate and present data annually to all faculty and resident representatives.

    2. Identify and act on 2 to 3 target deficiencies; approve and document the action plan.

  3. Study:

    1. Report findings throughout the year and at annual faculty and resident meetings.

    2. Continue to collect findings and monitor progress.

  4. Act:

    1. Actively engage the faculty and residents in the improvement of the 2 to 3 action items.

    2. Repeat process.

Common Program Requirements:

The annual review is essentially a program-level Practice-Based Learning and Improvement process through which programs actively engage in continuous improvement, based on constant self-evaluation.1 The principles of evaluation and improvement science (Plan-Do-Study-Act) can be useful when examining one's education efforts for residents and fellows.4 One way to conduct the annual review is to reframe the Common and specialty-specific Program Requirements using the who, what, and how paradigm; this deconstructs the evaluation process and supports a stepwise approach to process evaluation.

  • Who: Residents and faculty confidentially evaluate the program. Other stakeholders (eg, patients, staff) may evaluate the program from their respective vantage points. The program director has primary responsibility for the overall evaluation process. The findings are reviewed by the institution's Internal Review Committee, Graduate Medical Education Committee, and DIO.5

  • What: Prior RRC citations, Accreditation Council for Graduate Medical Education (ACGME) and institution survey results, the curriculum, resident performance, faculty development efforts, alumni performance, and program quality are the focal evaluation areas.

  • How: A systematic, written evaluation designed to identify deficiencies and produce an approved action or improvement plan, with ongoing follow-up by the program director or education committee and written documentation at each step.

How Can You Start TODAY

  1. Identify the information you already have about your program specific to the Common Program Requirements.

  2. List the information in an evaluation blueprint organized by requirement, noting who provides the information, how many times per year it is provided, and whether written documentation is available (table).

  3. Identify unneeded redundancies and information gaps by Common Program Requirement topic and source. For example, your annual faculty retreat includes a review of the overall quality of the program; however, no residents attend the retreat, nor do the minutes document the program's strengths or deficiencies with associated action plans.

  4. Prioritize the actions needed to rectify issues identified in the ACGME resident and program faculty surveys, prior internal review recommendations, and RRC citations.

  5. Present preliminary findings to the department chair and/or leadership; discuss findings and infrastructure needed to create and sustain a robust evaluation process.

TABLE. Program Evaluation Blueprint

What Can You Do LONG TERM

  1. Form a new Education Review Committee, or use an existing one, explicitly charged with program evaluation.

  2. Present the evaluation blueprint and associated data to the Education Review Committee.

    1. Identify gaps or deficiencies from the blueprint.

    2. Develop a targeted action plan that addresses 2 to 3 deficiencies as the primary focus for the year, with metrics for monitoring progress.

  3. Seek key leadership support and resources; document faculty input and approval for the action plan, especially for areas noted by the internal review committee and the ACGME.

  4. Continue evaluation data collection; implement new data collection tools and procedures as needed. Evaluation methods or tools that are overly time-consuming or otherwise not feasible are rarely sustainable.

    1. Before collecting any information, consider what difference that information will make in the evaluation of your program: what will you do if the data are positive, mixed, or negative?

    2. If the results won't change what you are doing, then perhaps you should ask a different question and/or gather the data in a different way.

    3. Seek to use existing venues/data collection strategies to address gap areas. Elaborate data collection processes are rarely sustainable.

    4. Consider bringing in outside help if needed for assistance, consultation, or outside perspectives (eg, another program director, the DIO, an education specialist, and/or someone who specializes in program evaluation).

  5. Provide plan updates and data on action-plan progress to the faculty and residents throughout the year. Modify the plan as needed.

  6. Present full data set and findings annually to residents, faculty, and other stakeholders. Demonstrate that target deficiencies have been addressed or are improving. Maintain permanent documentation of the findings.

  7. Repeat the process.

Resources for Further Information

Footnotes

Deborah Simpson, PhD, is Associate Dean and Professor, Medical College of Wisconsin. Monica Lypson, MD, MHPE, is Assistant Dean of Graduate Medical Education and Associate Professor of Internal Medicine at the University of Michigan. Both authors are associate editors for the Journal of Graduate Medical Education.

References


