JAMA Surg. 2020 May 13;155(7):665–667. doi: 10.1001/jamasurg.2020.0845

Feasibility of Integration of Resident Surgical Evaluations Into the Electronic Medical Record

Elaine Stickrath 1,2, Karilynn Rockhill 3, Heather Zuhn 4, Meredith J Alston 1,2
PMCID: PMC7221859  PMID: 32401288

Abstract

This cohort study of resident physicians and faculty members at a single academic center examines the usability and acceptability of a resident evaluation tool integrated into an electronic medical records system.


Evaluation of a resident’s surgical performance is key to the developing surgeon’s education; however, collecting surgical feedback can be problematic for residents and faculty members alike. This study describes how a surgical evaluation tool can be delivered to surgeons in a new way: by integrating it into the electronic medical record.

Methods

This descriptive study was conducted in an academic safety-net hospital from February 2019 through June 2019. A novel tool was developed within the Epic electronic medical record (EMR) system that generated an in-basket message (Figure 1) to the faculty surgeon of record on case completion. The process was created by a physician builder in the EMR with the assistance of an EMR analyst; building and testing the tool required approximately 15 hours of personnel time. The message contained a link to complete a surgical evaluation via Qualtrics (SAP), an outside survey platform. The evaluation tool consisted of the previously validated Zwisch1 scale, with 2 additional questions allowing free-text feedback on resident performance. When the evaluation was completed, an email was instantly generated to the operating resident, providing nearly real-time feedback. At the conclusion of the study, the proportion of completed evaluations to the total number generated was calculated. After the study was completed, an anonymous survey was sent to the faculty surgeons and residents to assess the acceptability of the tool and its effect on the amount of feedback given and received. This project was reviewed by the Colorado Multiple Institutional Review Board and determined to be exempt. Each participant provided informed consent with the completion of the survey. Data analysis was completed with SAS version 9.4 (SAS Institute).
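The feedback loop described above can be sketched in outline. This is a purely illustrative sketch: the function names, message formats, and survey URL below are hypothetical stand-ins, not actual Epic or Qualtrics interfaces.

```python
# Illustrative sketch of the feedback loop described in Methods.
# All names and formats here are hypothetical stand-ins; Epic and
# Qualtrics each expose their own integration mechanisms.

def build_evaluation_message(surgeon: str, case_id: str, survey_url: str) -> str:
    """Compose the in-basket message sent to the faculty surgeon of record."""
    return (
        f"To: {surgeon}\n"
        f"A surgical evaluation is requested for case {case_id}.\n"
        f"Complete the evaluation here: {survey_url}"
    )

def on_case_completion(surgeon: str, case_id: str) -> str:
    # Step 1: closing the case in the EMR generates an in-basket
    # message containing a link to the external survey.
    survey_url = f"https://survey.example.org/eval?case={case_id}"  # hypothetical URL
    return build_evaluation_message(surgeon, case_id, survey_url)

def on_survey_completion(resident: str, feedback: str) -> str:
    # Step 2: when the surgeon submits the survey, an email carrying
    # the feedback is generated to the operating resident immediately.
    return f"To: {resident}\nFeedback from your attending: {feedback}"
```

The key design point the study reports is that both steps sit inside existing clinical workflows: the surgeon is prompted where they already read messages, and the resident receives feedback without any extra system to check.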

Figure 1. Surgical Evaluation Tool In-Basket Message.


Results

A total of 17 operating surgeons and 37 residents participated in this study. During the study period, 724 operations were performed in the Department of Obstetrics and Gynecology at Denver Health Medical Center. Of the evaluation requests generated via the in-basket, 552 (76.2%) were completed. When completion was analyzed by clinician, 14 of 19 faculty members (74%) completed at least 80% of the evaluations they received; the completion rate was at or near 0 for 3 faculty members.
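The reported proportions can be reproduced from the counts above. The figures are taken directly from the study; the computation below is illustrative arithmetic only, assuming each of the 724 operations generated one evaluation request (which matches the reported 76.2%).

```python
# Completion rate: 552 evaluations completed out of the requests
# generated during the study period (assumed one per operation).
completed = 552
requested = 724

completion_rate = round(completed / requested * 100, 1)
print(completion_rate)  # 76.2

# Proportion of faculty completing at least 80% of their evaluations.
print(round(14 / 19 * 100))  # 74
```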

Poststudy surveys were completed by 26 of 27 residents (96%) and 14 of 17 faculty members (82%). Nearly all residents (26 [96%]) reported that they received more feedback (based on responses of “much more” or “somewhat more”) because of this tool (Figure 2). Residents also felt favorably about receiving feedback through this mechanism (23 [85%], combining the responses “like a great deal” and “like somewhat”). Among faculty, 13 of 14 (93%) found the tool acceptable to use, but only 3 (21%) felt that they gave more feedback as a result of this evaluation tool.

Figure 2. Electronic Medical Record Tool Review and Evaluation by Group.


Stratified by group type (resident vs faculty surgeon), survey responses to assess acceptability and use of the electronic medical record tool are displayed for all quantitative questions (A and C, residents; B and D, faculty members), as well as direct quotes from open-ended text responses that demonstrate main themes (E, residents; F, faculty members). OR indicates operating room.

Discussion

To our knowledge, this is the only study to date describing the integration of an evaluation tool into the EMR. Our study demonstrated not only that it is possible to build an evaluation tool into the EMR, but also that faculty members were likely to complete the evaluations (76.2% completed) and to find the tool acceptable to use (93% of respondents). Unlike other previously developed tools,2 this evaluation tool falls within the existing workflows of both surgeons and residents and does not incur additional costs to training programs.

The strengths of this study include its novelty, as well as the relatively high volume of cases and the number of evaluations generated. Additionally, all poststudy survey responses from residents were kept confidential to limit any social desirability bias that could arise from the student-teacher dynamic. This study was conducted as a feasibility study; the reliability and validity of assessing resident performance through this tool were not evaluated.

Based on the high response rate seen in this study, there appears to be an advantage to presenting evaluation requests within the EMR. This method of collecting feedback for resident learners could be applied broadly, because it was found to be both feasible to implement and acceptable to users.

References

1. George BC, Teitelbaum EN, Meyerson SL, et al. Reliability, validity, and feasibility of the Zwisch scale for the assessment of intraoperative performance. J Surg Educ. 2014;71(6):e90-e96. doi:10.1016/j.jsurg.2014.06.018

2. Bohnen JD, George BC, Williams RG, et al.; Procedural Learning and Safety Collaborative (PLSC). The feasibility of real-time intraoperative performance assessment with SIMPL (System for Improving and Measuring Procedural Learning): early experience from a multi-institutional trial. J Surg Educ. 2016;73(6):e118-e130. doi:10.1016/j.jsurg.2016.08.010

Articles from JAMA Surgery are provided here courtesy of American Medical Association
