Author manuscript; available in PMC 2015 Jun 15.
Published in final edited form as: J Surg Res. 2012 Aug 3;178(2):618–622. doi: 10.1016/j.jss.2012.07.034

The reflective statement: A new tool to assess resident learning

Sean F Monaghan 1, Andrew M Blakely 1, Pamela J Richardson 1, Thomas J Miner 1, William G Cioffi 1, David T Harrington 1,*
PMCID: PMC4467549  NIHMSID: NIHMS698457  PMID: 22883435

Abstract

Purpose

Continued assessment and redesign of the curriculum are essential for optimal surgical education. For the last 3 years, we have asked residents to reflect on the previous week and describe "the best thing" they learned. We hypothesized that this statement could be used to assess the strengths and weaknesses of our curriculum.

Methods

Starting in 2007, residents filled out surveys approximately four times per year at the start of a mandatory conference. They were asked to describe the "best thing" they learned that week, where it was learned, and who taught it. Residents were not asked to classify the item learned by core competency (communication, knowledge, patient care, practice-based learning, professionalism, and systems-based practice); this categorization was done as part of our study design. Who taught each item was grouped as attending, fellow, resident, or other. Where the item was learned was categorized as clinic, conference, operating room (OR), wards, or self. The impact of postgraduate year (PGY) level on learning was also assessed. χ2 analysis was used to compare groups.

Results

During the study period, 304 surveys were completed and returned by 65 residents. The majority of responses came from PGY 1 residents (134, 43%). Patient care and knowledge were the most commonly cited core competencies. Learning of professionalism increased with PGY level (P = 0.035). A majority of learning was experiential (wards and OR, P < 0.0125), whereas self-learning and learning in clinic were minor components (P < 0.0125). Learning on the wards decreased as residents progressed (P < 0.001), whereas learning in the OR showed the opposite trend (P = 0.002).

Conclusions

Patient care and knowledge are the most frequently cited competencies learned by the residents. Self-learning is not a significant source of learning; the majority of learning is experiential. It is not known whether this reflects a lack of self-directed learning or indicates that self-directed learning is not an efficient method of learning. In addition, each PGY level learns differently (in teacher and location of learning), perhaps reflecting the different needs and/or structure of each PGY. We believe the reflective statement has been and will continue to be a useful tool to assess our curriculum.

Keywords: Reflection, Portfolios, ACGME core competencies, Active learning

1. Introduction

The use of portfolios, which include reflections on what has been learned, has been advocated to enhance medical education by making the student an active learner [1]. Active learning requires an assessment of knowledge deficits, appreciation of an individual optimal method of learning, and creation of an individual plan for remediation. Learners also benefit from a well-designed and adaptable program. Many residency programs are improving their curriculum by enhancing didactic sessions such as morbidity and mortality conferences [2] and adding simulation experiences. Portfolios have been assessed and found to be effective for both medical student [3] and resident development [4]. As initially implemented at this program, the reflective statement was designed to have the residents review the last week and reinforce what they learned. Additionally, the portfolio and its reflective statement can also be used to determine how residents are learning and where the learning is occurring. This process could be an important tool to evaluate the difficult task of incorporating the ACGME (Accreditation Council for Graduate Medical Education) core competencies into the surgical curriculum [5]. Analyzing data from portfolios and their reflective statements may allow for opportunities to teach residents the core competencies and might verify the unstated hierarchy among the core competencies (knowledge and patient care being deemed most important and the rest of the competencies being “soft”) [6,7].

Since 2003, surgical faculty have been faced with training future surgeons within an educational budget. This budget is not a monetary constraint but a budget of time. Given the current 80-hour budget, every hour must be evaluated for its educational value. To accommodate competing demands on residents' time, such as the electronic health record, duty-hour restrictions, and patient safety initiatives, the education of residents must be more efficient than it was in the past. Program directors need tools to assess their programs. This project used a reflective statement gathered from the residents to assess learning across a tertiary medical facility: what was learned, who taught it, and where it was learned.

2. Methods

During the 2007–2008 and 2008–2009 academic years, residents were asked to fill out a “reflective statement” prior to mandatory academic conferences (Fig. 1). This was done approximately four times each academic year. These responses were collected and filed with the program coordinator. The responses were then stripped of identifying information and given a code to designate the resident and postgraduate year (PGY) of training.

Fig. 1. Form filled out by the surgical residents.

The "best thing learned" was then categorized into one of the six ACGME core competencies (communication, knowledge, patient care, practice-based learning, professionalism, and systems-based practice) by the lead author, using an algorithm that included subgroup placement (Fig. 2). "Who taught it" was grouped as faculty, resident, or other. "Other" included anyone who was not a resident or faculty member, such as a physician assistant, nutritionist, intensive care unit nurse, or the resident himself or herself. Where the item was learned was categorized as clinic, conference, operating room (OR), self (off-duty learning), or wards. Resident clinic experience varies by rotation, but residents have a clinic experience in every PGY year. All residents are required to attend conferences that occur in the morning before the start of scheduled operative cases. A resident's exposure to the OR depends on PGY and rotation, but all residents have a robust operative experience in the program.

Fig. 2. Scheme used to categorize each response.

As part of the study, residents did not review their reflective statements after submitting them to the program director. After all responses for the study period were collected, categorized, and analyzed, the reflective statements were added to the resident file.

χ2 analysis, with statistical significance set at P < 0.05, was performed for each planned comparison. When the same data were used for multiple comparisons, a Bonferroni correction was applied. The Institutional Review Board of Rhode Island Hospital approved this project.
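The analysis described above can be sketched as follows. This is a minimal illustration, not the authors' actual code: the counts are hypothetical, and scipy's `chi2_contingency` is one common implementation of the χ2 test of independence. The Bonferroni step simply divides the 0.05 threshold by the number of planned comparisons sharing the data (four comparisons yields the 0.0125 threshold cited in the figures).

```python
# Sketch of a chi-square test on a contingency table of response counts,
# with a Bonferroni-corrected significance threshold.
# The counts are illustrative only, not the study's data.
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = PGY levels, columns = learning locations
observed = [[61, 29],
            [15, 20]]

chi2, p, dof, expected = chi2_contingency(observed)

# When four planned comparisons draw on the same data, the per-test
# threshold shrinks from 0.05 to 0.05/4 = 0.0125
n_comparisons = 4
alpha = 0.05 / n_comparisons

print(f"chi2 = {chi2:.3f}, p = {p:.4f}, dof = {dof}")
print("significant at corrected threshold" if p < alpha else "not significant")
```

For a 2x2 table, `chi2_contingency` applies Yates' continuity correction by default; larger tables (e.g., five PGY levels by five locations, as in Table 3) are passed in the same way.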

3. Results

Over the course of the study period, 304 surveys were completed. Only five surveys were left blank, for a 98.4% completion rate. Interns completed 44% (134) of the surveys, reflecting their proportion in the program.

Residents’ learning across the core competencies was asymmetric (P = 0.03). Patient care (58%) was the most-cited category for the “best thing” that was learned, with knowledge being the second most common (31%) (Fig. 3). What was learned varied by PGY (Table 1). The frequency of learning professionalism (P = 0.035) increased as PGY increased. Learning systems-based practice appears isolated to the first 3 years, communication was learned in the first year and then in the senior years (years 4 and 5), and practice-based learning increased as postgraduate level increased, though none of these trends was statistically significant.

Fig. 3. Percentage of responses categorized into each core competency. χ2 analysis with a Bonferroni correction. *P < 0.00625 compared with knowledge; #P < 0.00625 compared with patient care.

Table 1.

Core competency learned, by postgraduate year.

PGY Communication Knowledge Patient care Practice-based learning Professionalism Systems-based practice
1 3.7% 37.3% 54.5% 0.75% 0% 3.7%
2 0% 20% 70% 6% 0% 4%
3 0% 32.7% 55.1% 0% 6.1% 6.1%
4 9.5% 35.7% 50% 2.4% 2.4% 0%
5 3.5% 20.7% 75.9% 0% 0% 0%
P value 0.114 0.110 0.082 0.156 0.035 0.577

Faculty (62%) were most likely to have taught the "best thing" learned (Fig. 4, P < 0.025). As training level increased, more was learned from faculty (Table 2, P = 0.02). Learning from other residents had a "U"-shaped distribution: interns and chief residents learned a large portion from other residents (P < 0.001). Throughout the residency, the amount of learning from other individuals remained constant (P = 0.066).

Fig. 4. Percentage of responses taught by each group. χ2 analysis with a Bonferroni correction. *P < 0.025 compared with faculty.

Table 2.

Who taught item, by postgraduate year.

PGY Faculty Resident Other
1 53.7% 33.6% 12.7%
2 62% 20% 18%
3 76.6% 8.2% 12.3%
4 61.9% 7.1% 31%
5 72.4% 13.8% 13.8%
P value 0.020 <0.001 0.066

Learning occurred predominantly on the wards, in the OR, and in conference; there was little self-learning (Fig. 5, P < 0.0125). In the early years, a majority of learning occurred on the wards, and this percentage decreased as residents progressed (Table 3, P < 0.001). The opposite trend, with more learning in the later postgraduate years, was seen in the OR (P = 0.002) and in conferences (P = 0.074). There was little variability across resident years in learning done in the clinic, and self-learning was minimal in all postgraduate years. The majority of learning was experiential (Fig. 5), occurring when the resident was in the hospital interacting with a patient, either on the wards (31%) or in the OR (30%).

Fig. 5. Percentage of responses learned in each location. χ2 analysis with a Bonferroni correction. *P < 0.0125 compared with self-learning.

Table 3.

Where item was learned, by PGY.

PGY Clinic Conference OR Self Wards
1 6% 22.4% 21.6% 4.5% 45.5%
2 2% 18% 40% 10% 30%
3 8.2% 30.6% 36.7% 2% 22.5%
4 7.1% 35.7% 23.8% 11.9% 21.4%
5 0% 41.4% 55.2% 3.5% 0%
P value 0.415 0.074 0.002 0.175 <0.001

4. Discussion

Dr. Halsted developed the first formal surgical residency at Johns Hopkins. He designed the residency to be patient-centric and made the residents an integral part of daily care of the patient [8]. Much has changed in 100 years. Patient safety initiatives, duty-hour regulations, and increasing simulation have lessened the central role that residents played in the care of patients. And though these many changes have brought necessary and important reforms to surgical training, how can program directors assess the effects of these past events and future reforms on the surgical curriculum? In order to create well-trained surgeons, program directors need to stimulate their residents to be active participants in their training, and they need to create a curriculum that exposes them to all critical knowledge and clinical experiences and challenges them personally and professionally.

This study found patient care and knowledge to be the most common competencies learned. This finding is in keeping with Chandawarkar’s “inside-out” approach to understanding the core competencies [9]. In this approach, these two competencies must be mastered before other competencies can be learned. Other reports have validated that these two domains, patient care and knowledge, are the two most important core competencies [6]. The increase in the learning of systems-based practice and practice-based learning in PGY 2 and PGY 3 seen in our data supports this learning paradigm.

The lack of learning of systems-based practice and practice-based learning in PGY 4 and PGY 5 deserves explanation. At this point in their residency, with a solid grounding in patient care and knowledge, residents should be very amenable to understanding and learning these competencies. It is possible that these senior residents do not think these domains of knowledge are important, that we do a poor job in teaching these competencies, or that residents do not realize they are learning them [5,6]. An analysis of the “best thing” learned will allow us to explore which of these potential explanations is true in our program and to work on ways to correct this finding.

The result that most of the teaching was done by faculty, particularly as PGY increased, was predictable (Table 2). The importance of faculty development, as it relates to their teaching role, is highlighted by this finding. Resident-to-resident education also occurred. Junior residents and chief residents were more likely to learn from other residents. This finding may reflect the fact that PGY 5 and PGY 1 are often together on inpatient services, where much learning occurs. It also reflects the propensity for chief residents to share knowledge and clinical experiences with their fellow chiefs. This analysis cannot reliably assess if a peer, junior, or senior taught the residents their “best thing,” and future work should address this finding.

The majority of items were learned in the operating room or on the wards (Table 3). Learning in these two places can be described as experiential and patient-centered. As time in the hospital has been reduced, reports vary on whether resident case volumes have decreased [10,11]. Much research has examined the importance of replacing this perceived loss of operative cases with simulation [12]. However, although our residency program has a skills laboratory curriculum for PGY 1 through PGY 3 residents, learning a skill through simulation was never mentioned as the "best item" learned. Based on these data, program directors should be cautious about replacing patient care with simulation.

Residents must learn on their own to assimilate all the information needed for their training and to develop skills for continuing proficiency in an expanding body of medical knowledge [13]. However, self-learning was a minor component of where items were learned. It is not known whether this is due to a lack of effort by residents outside the hospital. To address this question, future work will add a survey question asking for the number of hours spent on self-learning in the previous week. It is also possible that self-learning is simply not effective, in which case the curriculum should be changed to include techniques to improve these skills.

Limitations of this study include that it is a baseline assessment of the program with no comparison group. Future studies will use these data as a foundation against which to compare curricular changes. Because the survey was initially intended as a personal reflection, there may be some bias in how the question was interpreted; and although these data reflect the "best thing" learned, the survey does not assess all things learned.

5. Conclusion

Using a reflective statement produced by residents provides insight into the core competencies learned by residents, who teaches them, and where they are learned. As residency programs are constrained by a time budget, this analysis allows for an evaluation of the program to assess its strengths and weaknesses. From this assessment of a surgical curriculum, the core competencies of professionalism, communication, systems-based practice, and practice-based learning need to be carefully reassessed and augmented. Much of the learning by residents is experiential and appears to be vital to surgical training.

References

1. Challis M. Portfolios and assessment: meeting the challenge. Med Teach. 2001;23:437. doi:10.1080/01421590120075643.
2. Kim MJ, Fleming FJ, Peters JH, Salloum RM, Monson JR, Eghbali ME. Improvement in educational effectiveness of morbidity and mortality conferences with structured presentation and analysis of complications. J Surg Educ. 2010;67:400. doi:10.1016/j.jsurg.2010.04.005.
3. Davis MH, Ponnamperuma GG, Ker JS. Student perceptions of a portfolio assessment process. Med Educ. 2009;43:89. doi:10.1111/j.1365-2923.2008.03250.x.
4. O'Sullivan PS, Reckase MD, McClain T, Savidge MA, Clardy JA. Demonstration of portfolios to assess competency of residents. Adv Health Sci Educ Theory Pract. 2004;9:309. doi:10.1007/s10459-004-0885-0.
5. Wasnick JD, Chang L, Russell C, Gadsden J. Do residency applicants know what the ACGME core competencies are? One program's experience. Acad Med. 2010;85:791. doi:10.1097/ACM.0b013e3181d71d7b.
6. Yaszay B, Kubiak E, Agel J, Hanel DP. ACGME core competencies: where are we? Orthopedics. 2009;32:171.
7. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med. 2009;84:301. doi:10.1097/ACM.0b013e3181971f08.
8. Cameron JL. William Stewart Halsted. Our surgical heritage. Ann Surg. 1997;225:445. doi:10.1097/00000658-199705000-00002.
9. Chandawarkar R. A simple primer for understanding core competencies. J Surg Educ. 2011;68:99. doi:10.1016/j.jsurg.2010.11.002.
10. Bruce PJ, Helmer SD, Osland JS, Ammar AD. Operative volume in the new era: a comparison of resident operative volume before and after implementation of 80-hour work week restrictions. J Surg Educ. 2010;67:412. doi:10.1016/j.jsurg.2010.05.007.
11. Picarella EA, Simmons JD, Borman KR, Replogle WH, Mitchell ME. "Do one, teach one" the new paradigm in general surgery residency training. J Surg Educ. 2011;68:126. doi:10.1016/j.jsurg.2010.09.012.
12. Meier AH. Running a surgical education center: from small to large. Surg Clin North Am. 2010;90:491. doi:10.1016/j.suc.2010.02.003.
13. Nothnagle M, Goldman R, Quirk M, Reis S. Promoting self-directed learning skills in residency: a case study in program development. Acad Med. 2010;85:1874. doi:10.1097/ACM.0b013e3181fa02a4.
