Abstract
Background
The importance and benefits of direct observation in residency training have been underscored by a number of studies. Yet, implementing direct observation in an effective and sustainable way is hampered by demands on physicians' time and shrinking resources for educational innovation.
Objective
To describe the development and pilot implementation of a direct observation tool to assess the history and physical examination skills of interns in a pediatric emergency department rotation.
Methods
A task force developed specific history and physical examination checklists for a range of common conditions. For the pilot implementation, 10 pediatric emergency medicine faculty attendings conducted the initial observations of 34 interns during the course of 1 academic year. At the conclusion of the pilot, the faculty observers and interns were interviewed to assess the feasibility and benefits of the process.
Results
A total of 33 of the 34 interns were observed during their rotation; 26 of the observations were conducted while the faculty observer was off shift, and each observation took approximately 20 minutes to complete. In terms of learning benefits, interns and faculty observers reported that the process facilitated clear and useful feedback and revealed gaps that would not otherwise have been identified. Faculty observers also mentioned that it helped them focus their teaching efforts, built empathy with learners, and gave them a way to demonstrate genuine concern for interns' learning.
Conclusion
Our results offer evidence for the feasibility and benefits of the direct observation checklists. The description of the implementation, challenges, and response to those challenges may help others avoid some of the common problems faced when implementing direct observation methods.
Editor's Note: The online version of this article contains the full condition-specific checklists (1.8 MB, PDF), the postimplementation survey questions used in this study (20.2 KB, DOCX), and an appendix of intern learning objectives (98.5 KB, DOC).
Introduction
The Accreditation Council for Graduate Medical Education requirement that residency programs incorporate direct observation methods into their overall assessment strategy1 is supported by empirical evidence. Several studies have linked direct observation to a range of positive outcomes, including improved learning,2 higher quality of care,3 and improved patient safety.4 Nevertheless, implementing direct observation methods in an effective and sustainable way remains difficult as demands on physicians' time continue to grow and resources for educational innovation continue to shrink.5,6
We describe the development and pilot implementation of a direct observation tool to assess the history (Hx) and physical examination (PE) skills of interns in a pediatric emergency department rotation. The aim is to provide other educators with practical and useful information for designing and implementing effective observation methods in the face of current challenges.
Methods
Setting and Participants
Faculty attending physicians in pediatric emergency medicine (PEM) at Cincinnati Children's Hospital Medical Center (CCHMC) observed interns conducting Hx-PEs with real patients. The CCHMC emergency department is a high-volume (95 000 visits per year), urban, level I trauma center. There are currently 43 faculty attending physicians in PEM, and each member of the intern class of 35 to 45 must complete a 1-month rotation in emergency medicine (EM).
A typical day shift includes 2 to 3 faculty attending physicians, 2 clinical physicians, 1 fellow, 3 to 5 upper-level residents, and 1 to 2 interns. There is no dedicated shift during which faculty attending physicians can focus on teaching. For the pilot year, 10 PEM faculty attending physicians volunteered to observe the 34 interns rotating in EM. The study was declared exempt by the CCHMC Institutional Review Board.
Intervention
Checklist Design
The development of the Hx-PE checklists was part of a larger effort by an education task force of several EM faculty members to improve the EM residency curriculum at CCHMC. The task force began by identifying condition-specific areas, including asthma, fever (0–2 months), fever (2 months to 3 years), and gastroenteritis/dehydration, that would serve as focal targets for the curriculum improvements. The overall strategy, in line with recent calls by a number of medical educators,7,8 was to develop a range of accessible learning opportunities and useful assessment tools that linked directly to a set of specific learning objectives.
Each task force member was appointed to focus on one of the targeted conditions. That member worked with an evaluation specialist to first identify the specific knowledge and skill expectations that applied in the context of that condition. After the condition-specific learning objectives were finalized by the larger task force, each task force member then worked with the evaluation specialist to develop a condition-specific Hx-PE checklist based on those objectives. All 4 condition-specific checklists are provided as online supplemental material.
The use of targeted, condition-specific Hx-PE checklists has advantages over a single general Hx-PE checklist (eg, the mini–clinical evaluation exercise). One advantage is that the specific items leave less room for individual interpretation, which in turn reduces the resources needed to train observers and increases the accuracy and specificity of the information.9 To minimize the cognitive burden on observers, each checklist was divided into 3 sections, and 2 open-ended prompts were included to capture other specific feedback.
Observation Frequency and Timing
The purpose of the initial year-long pilot was to assess and improve the Hx-PE process. Consequently, each intern was limited to a single observation during the last 2 weeks of his or her 4-week rotation. The last 2 weeks were chosen to give interns a chance to develop their skills and comfort level before being observed.
Initially, interns were responsible for identifying and requesting an opportunity to be observed. However, most interns were reluctant to take the initiative, citing a “fear of disrupting flow” or of “provoking a negative reaction” from some of the faculty observers. In response, the responsibility for initiation was shifted to the faculty observers, who had a better view of the overall situation and could determine the best opportunity for observation. Each faculty observer was responsible for observing 3 interns during the course of the year and had the option of conducting the observations while on shift or coming into the emergency department to conduct them while off shift.
Orientation to the Checklist
Although the simplicity and clarity of the Hx-PE checklists eliminated the need for resource-intensive training, the purpose of the pilot, role expectations, and various logistical matters still had to be communicated. Consequently, brief presentation and discussion sessions were held for interns as part of their initial orientation to the CCHMC residency program and for faculty during one of the monthly faculty meetings. These sessions clarified the process and expectations, and they helped generate support by giving stakeholders an opportunity to raise initial concerns and make suggestions.
Formative Versus Summative Purpose
The purpose of the Hx-PE checklists was to provide timely and specific feedback for developmental (ie, formative) purposes. The information was not to be used as a basis for making high-stakes (ie, summative) decisions. The focus on the formative purpose was driven primarily by feasibility concerns, given the rigorous testing required to establish sufficient evidence for validity and the number of observations needed to get an accurate estimate of an intern's true ability (the mini–clinical evaluation exercise, a similar observation tool, requires 12–14 observations to achieve a reliability coefficient of 0.8).10
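To illustrate why so many observations are needed, the Spearman-Brown prophecy formula offers a rough sketch (the 12–14 figure above comes from the psychometric analyses cited in reference 10; the single-observation reliability used below is an assumed value for illustration only). If one observation has reliability $r_1$, the reliability of the average of $n$ observations is

$$ r_n = \frac{n\,r_1}{1 + (n - 1)\,r_1}. $$

Setting $r_n = 0.8$ and solving gives $n = 4(1 - r_1)/r_1$; an assumed single-observation reliability between roughly 0.22 and 0.25 would thus require the 12 to 14 observations cited above.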
Form Administration
A simple paper-based process was used to administer the forms: blank forms were kept in binders in the pediatric emergency department, and the faculty observer deposited each completed form in a conveniently placed collection box. An administrative assistant collected the forms, sent one copy to the intern, and sent one copy to the associate director of the residency program.
Data Collection and Analysis
For each observation, faculty observers were asked to indicate how much time it took to conduct the observation and provide the feedback to the learner. At the end of the pilot year, faculty observers and interns were interviewed by the primary and secondary investigators to assess the overall feasibility and benefits of the process (the questions are provided as online supplemental material). Key points were captured during each interview, and the primary and secondary investigators worked together to identify common themes and assign codes to each key point.
Results
Feasibility
The orientation for faculty took approximately 45 minutes, and the orientation for interns was conducted in a 30-minute session. Overall, 33 of the 34 interns were observed during their rotation, with 26 of those observations being conducted when the faculty observer was off shift. Based on an average of the estimated times reported by faculty observers, it took approximately 20 minutes to complete each observation and provide the feedback.
Perceived Benefits
Both interns and faculty observers liked that the process was used for formative rather than summative purposes. Interns mentioned that this reduced their anxiety about being observed, whereas faculty mentioned that it reduced their anxiety about being candid. A summary of the perceived learning and teaching benefits of the Hx-PE checklists is included in the table. Both faculty observers and interns cited “facilitated clear feedback” as the biggest benefit, and many of the interns who mentioned it added that the level of specificity was particularly effective because it helped them understand precisely what they needed to do differently.
TABLE. Perceived Learning and Teaching Benefits of the Hx-PE Checklists
In addition, 14 of the 32 interns indicated that the process would be more beneficial if observations were done more frequently, and 8 of the 32 specifically mentioned that an initial observation should occur early in the first week of the rotation to provide more timely information about any improvement needs.
Discussion
This pilot was the first step toward full implementation of the Hx-PE checklists. Input from interns made it clear that more observations during the 4-week rotation would make the tool more useful. In response, our plan is to ensure that each intern is observed at least 3 times during the 4-week rotation: once during the first week, once during the second or third week, and once during the final week. To accommodate the need for more observations, the responsibility for observing will be expanded from the initial pilot group of 10 to include all 42 PEM faculty attending physicians at CCHMC. With an intern class of 35 to 45 and 3 observations per intern, this works out to roughly one intern per observer, so most faculty attending physicians will be responsible for conducting 3 observations on a single intern during the course of a year.
Communicating the results of this pilot in terms of feasibility and benefits should help gain the commitment of the larger group of PEM faculty physicians. However, our faculty attending physicians are asked to do a number of activities to support the educational, clinical, and research missions of the institution. The cumulative effect of these activities has been an increase in resistance to additional requests. This is particularly true for educational interventions because faculty attendings are neither significantly rewarded nor penalized for their teaching efforts. Consequently, in addition to developing a direct observation process that is feasible and beneficial, we will also need to identify and address other motivational factors that could undermine its sustainability. Unfortunately, no electronic administration method was available for the pilot; electronic administration of the evaluations might reduce the burden of evaluation and enhance faculty acceptance.
Conclusion
Medical educators responsible for implementing direct observation methods can benefit from our approach, the challenges we encountered, and the lessons we learned, which may help them avoid some of the common problems that arise when implementing direct observation.
Footnotes
All authors are at Cincinnati Children's Hospital Medical Center. Michael FitzGerald, PhD, is Field Service Assistant Professor of Pediatrics; at the time of writing, Mia Mallory, MD, was Assistant Professor of Pediatrics; she is now Attending Physician, Pediatric Emergency Medicine Associates, Children's Healthcare of Atlanta at Scottish Rite; Matthew Mittiga, MD, is Assistant Professor of Pediatrics; Charles Schubert, MD, is Professor of Pediatrics; Hamilton Schwartz, MD, is Assistant Professor of Pediatrics; Javier Gonzalez, MD, MEd, is Professor of Pediatrics; Elena Duma, MD, is Assistant Professor of Pediatrics; and Constance McAneney, MD, is Professor of Pediatrics.
Funding: The authors report no external funding source for this study.
References
1. Accreditation Council for Graduate Medical Education. ACGME Outcome Project: Common Program Requirements. http://www.acgme.org/acgmeweb/Portals/0/dh_dutyhoursCommonPR07012007.pdf. Accessed September 25, 2012.
2. Howley LD, Wilson WG. Direct observation of students during clerkship rotations: a multiyear descriptive study. Acad Med. 2004;79(3):276–280. doi:10.1097/00001888-200403000-00017.
3. Kilminster S, Jolly B, van der Vleuten CPM. A framework for effective training for supervisors. Med Teach. 2002;24(4):385–389. doi:10.1080/0142159021000000834.
4. Graber ML. Taking steps towards a safer future: measures to promote timely and accurate medical diagnosis. Am J Med. 2008;121(5 suppl):S43–S46. doi:10.1016/j.amjmed.2008.02.006.
5. DaRosa DA, Skeff K, Friedland JA, Coburn M, Cox S, Pollart S, et al. Barriers to effective teaching. Acad Med. 2011;86(4):453–459. doi:10.1097/ACM.0b013e31820defbe.
6. Holmboe ES. Faculty and the observation of trainees' clinical skills: problems and opportunities. Acad Med. 2004;79(1):16–22. doi:10.1097/00001888-200401000-00006.
7. Wilkinson TJ. Assessment of clinical performance: gathering evidence. Intern Med J. 2007;37(9):631–636. doi:10.1111/j.1445-5994.2007.01483.x.
8. van der Vleuten CPM, Schuwirth LWT. Assessing professional competence: from methods to programmes. Med Educ. 2005;39(3):309–317. doi:10.1111/j.1365-2929.2005.02094.x.
9. Regehr G, MacRae H, Reznick RK, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med. 1998;73(9):993–997. doi:10.1097/00001888-199809000-00020.
10. Holmboe ES, Hawkins RE. Methods for evaluating the clinical competence of residents in internal medicine: a review. Ann Intern Med. 1998;129(1):42–48. doi:10.7326/0003-4819-129-1-199807010-00011.