ABSTRACT
Objectives
The COVID‐19 pandemic has necessitated the widespread adoption of video‐based interviewing for residency applications. Video interviews have previously been used in the residency application process through the Association of American Medical Colleges' pilot program, the standardized video interview (SVI). We conducted an SVI preparation program with our students over 3 years, consisting of an instructional lecture, deliberate practice in video interviewing, and targeted feedback from emergency medicine faculty. The aim of this investigation was to summarize the feedback students received on their practice SVIs to provide the guidance students need to prepare for the video interviews that will replace in‐person interviews with residency programs.
Methods
A retrospective thematic analysis was conducted on faculty feedback provided to students who had completed SVI practice videos in preparation for their application to an emergency medicine (EM) residency between June 2017 and July 2019. Categorized comments were also sorted by type of faculty feedback: positive reinforcement, constructive criticism, or both.
Results
Forty‐six medical students received 334 feedback elements from three faculty. Feedback was balanced between positive reinforcement statements and constructive criticism. Students performed well on appearance and attire, creating a proper recording environment, and response content. They needed the most guidance with the delivery of content and the technical quality of the video.
Conclusions
Our results demonstrate a need for formal instruction in how to communicate effectively through the video medium. Medical educators will need to formally prepare students for tele‐interviews with residency programs, with an emphasis on communication skills and techniques for improving the quality of their video presentation, including lighting and camera placement.
Keywords: education, emergency medicine, internship and residency, interviews as topic, undergraduate medical
INTRODUCTION
The novel coronavirus (COVID‐19) pandemic has dramatically affected all aspects of medical education and medical care. In the wake of ongoing health and safety concerns related to the pandemic, the Coalition for Physician Accountability created a work group of major stakeholders to devise a plan for helping medical students safely navigate the transition to residency. 1 The work group made two significant recommendations aimed at curtailing residency recruitment activities that might contribute to spreading the virus. First, it recommended the elimination of visiting rotations, traditionally used by medical students to audition for, and become familiar with, residency programs of interest. Second, it recommended the suspension of in‐person interviews between student candidates and residency programs. In the absence of away rotations and in‐person interviews, the work group proposed the use of "virtual" video‐based interviews. The lack of experience with virtual interviews has stirred considerable anxiety among medical student applicants, who are acutely aware of the importance of interviews in the residency selection process. 2 , 3
Although video interviews for employment selection are common in business, their use for selection in medical education has been limited. 4 , 5 , 6 , 7 , 8 , 9 To date, the only large‐scale use of video interviews for residency application has been through the Association of American Medical Colleges (AAMC) standardized video interview (SVI), a structured video interview program developed to assess applicants' interpersonal and communication skills and knowledge of professional behaviors. 10 The introduction of the SVI as a requirement for students applying to emergency medicine residency programs during the 2017 application cycle provoked anxiety among medical students similar to the anxiety that students are experiencing now. In response, our institution developed an SVI preparation program for fourth‐year medical students entering the match in emergency medicine.
The program consisted of an informational lecture, deliberate practice through the creation of sample videos, and targeted feedback from medical education faculty. Because the AAMC initially provided little guidance about how the SVI would be scored, we incorporated more general instruction about standard interviewing techniques and advice specific to video interviewing technology. We also reviewed concepts related to communications and professionalism. In retrospect, the advice offered was remarkably similar to that currently being offered by faculty advisors and medical student forums as well as in the AAMC’s preparation guide for applicants participating in virtual interviews. 8 , 11 , 12 , 13 , 14 As such, analysis of the feedback provided to our students on their practice SVIs was thought to have relevance to the current interview environment, in which medical students are required to present themselves to residency programs through a video‐based platform. We think that the common problems we identified will provide faculty advisors with the resources needed to guide their students through the virtual interview process.
METHODS
Population and instructional methods
Three cohorts of applicants to emergency medicine residencies participated in our SVI preparation program between 2017 and 2019. The program was designed to address student concerns about the implementation of the SVI. The introductory didactic session was held during the month of June and served as the kickoff to the medical students' fourth year and their residency application process. The content of the didactic session was derived from a variety of established sources, including guidance on communication skills for in‐person interviews 13 , 14 , 15 , 16 , 17 and technical guidance on how to prepare the setting for the video interview and (for the SVI) the recording. 15 , 17 The didactic curriculum also included a review of the professionalism topics that had been presented throughout our preclinical curriculum.
After participating in the didactic session, students received instructions on how to complete the practice SVIs through our curriculum management system. We used sample interview questions provided by the AAMC through the AAMC‐SVI website. 10 Students were encouraged to simulate, as closely as possible, the actual conditions under which they would eventually take the SVI (including appearance, dress, background, pacing, and technical specifications for recording). All practice videos were then uploaded to a secure electronic curriculum management system, where they were reviewed by at least one of the three emergency medicine education faculty involved with the program. During the first year, students were required to submit six practice videos, one for each sample question. For years 2 and 3, students were required to submit at least two videos but were permitted to submit up to six as needed. Upon receiving their feedback, students were encouraged to follow up with these advisory faculty regarding questions or concerns about their video interview performance.
Practice standardized video assessment
The three emergency medicine faculty who developed the SVI preparation curriculum also reviewed the students' practice videos (six required videos during the first year, and up to six in years 2 and 3) and provided feedback in narrative format for each individual video through the electronic curriculum management system. Because the three faculty reviewers developed and taught the SVI preparation curriculum, they were cognizant of the program's learning objectives and consistent with one another in their assessment of the students' practice videos. All three had taught at the preclinical and clinical undergraduate medical school level, served as primary advisors to fourth‐year medical students, served as either director or associate director of the emergency medicine clerkship, and had performed residency interviews.
Data coding and analysis
The coding frame for this project consisted of all narrative feedback provided to the medical students by faculty on their practice SVIs. These feedback narratives were downloaded from the curriculum management system and then tagged with a numeric code before being deidentified of both student and faculty names. The numerically coded narratives were then organized into a database for analysis.
Owing to the formatting of the data exported from the curriculum management system, there was no one‐to‐one correspondence between a database record and a single practice SVI. If a faculty member reviewed a practice video over more than one session, the feedback appeared in more than one record (row) of the database; conversely, a single record might cover more than one practice SVI (video interview session). Consequently, we were unable to determine how many practice videos each student submitted. We were, however, able to identify which records belonged to which students and which faculty provided the feedback; each record (row) therefore represented a single feedback narrative provided to one student by one faculty reviewer.
Problems and errors related to student performance on video interviews were anticipated from the curriculum we provided; consequently, we used a thematic coding approach to data analysis. 18 Problems and errors were defined as behaviors exhibited by the students during their interviews that were inconsistent with the learning objectives taught during the preparation program. The feedback narratives consisted of comments related to errors as well as to good performance. Prior to analysis, the coding rubric of categories and themes was created by the program leaders (see Table 1, column 1).
Table 1.
Number and percentage of feedback elements classified as positive reinforcement, negative constructive criticism, or both, listed by theme and subtheme
Topic | Positive | Negative | Both | Total comments |
---|---|---|---|---|
Appearance and attire | 37 (92.5) | 2 (5.0) | 1 (2.5) | 40 (12.0) |
Delivery of content | 54 (34.6) | 97 (62.2) | 5 (3.2) | 156 (46.7) |
Body language | 1 (100) | 0 (0) | 0 (0) | 1 (0.3) |
Concise/succinct | 8 (25.0) | 23 (71.9) | 1 (3.1) | 32 (9.6) |
Distracting behaviors | 0 (0) | 3 (100) | 0 (0) | 3 (0.9) |
Eye contact | 11 (37.9) | 15 (51.7) | 3 (10.3) | 29 (8.7) |
Empathy/facial expression | 7 (53.8) | 6 (46.2) | 0 (0) | 13 (3.9) |
Filler words/signal‐to‐noise ratio | 3 (30.0) | 7 (70) | 0 (0) | 10 (3.0) |
Pace | 6 (66.7) | 2 (22.2) | 1 (11.1) | 9 (2.7) |
Repetitive words or phrases | 0 (0) | 2 (100) | 0 (0) | 2 (0.6) |
Summary of answer | 8 (36.4) | 14 (63.6) | 0 (0) | 22 (6.6) |
Use of medical jargon | 10 (28.6) | 25 (71.4) | 0 (0) | 35 (10.5) |
Recording environment | 24 (49.0) | 16 (32.6) | 9 (18.4) | 49 (14.7) |
Ambient noise | 1 (14.3) | 6 (85.7) | 0 (0) | 7 (2.1) |
Backdrop | 23 (54.8) | 10 (23.8) | 9 (21.4) | 42 (12.6) |
Response content | 20 (52.6) | 8 (21.1) | 10 (26.3) | 38 (11.4) |
Technical quality of video | 19 (37.3) | 29 (56.9) | 3 (5.9) | 51 (15.3) |
Audio quality | 0 (0) | 1 (100) | 0 (0) | 1 (0.3) |
Laptop positioning | 2 (14.3) | 11 (78.6) | 1 (7.1) | 14 (4.2) |
Lighting quality | 15 (55.6) | 10 (37.0) | 2 (7.4) | 27 (8.1) |
Video quality | 2 (22.2) | 7 (77.8) | 0 (0) | 9 (2.7) |
Total | 154 (46.1) | 152 (45.5) | 28 (8.4) | 334 (100) |
Percentages of positive, negative, and both categories are based on row totals, while row total percentages are based on total number of comments (N = 334).
Two independent emergency medicine faculty (not involved with the SVI preparation program) served as data coders. During the first round of analysis, both coders analyzed and coded the first half of the feedback narratives. They isolated discrete feedback elements (feedback related to one specific behavior) and classified those elements into themes; each record of data contained multiple feedback elements. The coders then met to evaluate their use of the coding rubric and their coding consistency. They then independently completed coding the entire database of feedback narratives. At the conclusion of coding, they met to compare their results, with a third coder (a program instructor) available to assign the theme in the event of a discrepancy.
Feedback content set aside earlier as not clearly fitting the rubric was reviewed by all coders together and either assigned a theme from the rubric or assigned a new theme. Once all codes were categorized into the rubric themes, the original two coders returned to the feedback narratives and classified each comment by type: positive reinforcement, constructive criticism, or equal elements of both. A third coder was available to resolve discrepancies but was not needed. The total number of thematic comments from the feedback narratives and the percentages of each type of feedback were calculated. This study was determined to be exempt from committee review by The Ohio State University Institutional Review Board.
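The tabulation that results from this coding scheme is straightforward: each discrete feedback element carries a theme and a feedback type, within-theme percentages are computed against each theme's total, and theme totals are expressed as a share of all elements. As a minimal sketch of that arithmetic (the element data below are invented for illustration, not drawn from the study):

```python
from collections import Counter

# Hypothetical coded feedback elements as (theme, feedback_type) pairs;
# theme names follow the rubric described above, but the data are made up.
elements = [
    ("Appearance and attire", "positive"),
    ("Delivery of content", "positive"),
    ("Delivery of content", "negative"),
    ("Delivery of content", "negative"),
    ("Response content", "both"),
]

# Count each (theme, type) pair and each theme's total.
counts = Counter(elements)
theme_totals = Counter(theme for theme, _ in elements)
grand_total = len(elements)

# Within-theme percentages (as in the Positive/Negative/Both columns of Table 1).
for (theme, ftype), n in sorted(counts.items()):
    print(f"{theme} / {ftype}: {n} ({100 * n / theme_totals[theme]:.1f}%)")

# Theme totals as a share of all elements (as in the final column of Table 1).
for theme, n in sorted(theme_totals.items()):
    print(f"{theme}: {n} ({100 * n / grand_total:.1f}% of all elements)")
```

The same two-level percentage convention (row percentages within a theme, plus each theme's share of the overall total) is what the footnotes to the tables describe.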
RESULTS
Forty‐six of 51 (90.2%) medical students submitted practice SVIs for review and feedback over 3 years: graduates of 2017, 19 of 20 (95%); graduates of 2018, 16 of 16 (100%); and graduates of 2019, 11 of 15 (73%). At least one of three faculty members reviewed each practice SVI and generated feedback narratives. Fourteen of 46 students (30%) received a feedback narrative from more than one reviewer, while 32 of 46 (70%) received a feedback narrative from a single reviewer. One faculty member reviewed videos from 35 students, another from 29 students, and the third from five students.
The curriculum management system produced a database of 71 records. Across these 71 records, coders identified 334 discrete feedback elements and linked all 334 feedback elements to five major themes and 16 subthemes. The mean (±SD) number of feedback elements identified in each feedback narrative (feedback record) was 7.3 (±3.0). Table 1 shows the total number of feedback elements (n = 334) listed by themes/subthemes and type of feedback received. Percentages in columns 2 through 4 of Table 1 refer to the ratio of feedback type by total received for each theme (or subtheme). Percentages in column 5 are based on the total number of feedback elements (N = 334).
Overall, the feedback narratives contained 154 (46.1% of 334) elements of positive reinforcement, 152 (45.5%) elements of constructive criticism, and 28 (8.4%) elements containing both kinds of feedback. The most frequent reinforcement comments (positive feedback) were made regarding appearance and attire (92.5%, 37 of 40), response content (52.6%, 20 of 38), and recording environment (49.0%, 24 of 49). The most frequent constructive comments (negative feedback) involved the delivery of content (62.2%, 97 of 156), with the subthemes of use of medical jargon (71.4%, 25 of 35) and concise/succinct delivery (71.9%, 23 of 32) being the most prevalent. Interestingly, the faculty members provided the least amount of feedback, positive, negative, or otherwise, on the students' response content (11.4%, 38 of 334).
Figure 1 shows the relative percentages of positive reinforcement, constructive criticism, and both types of feedback within each of the major feedback themes. Students received mostly positive reinforcement (dark blue) feedback on appearance and attire, while the two domains that received the most constructive criticism (orange) were the delivery of content and the technical quality of the video. The final two domains, recording environment and response content, received roughly equal amounts of positive reinforcement (dark blue and gray) and constructive criticism (orange and gray).
FIGURE 1.
Percentages of the nature (type) of feedback message (positive reinforcement, negative constructive criticism, both) within each feedback theme.
The faculty who reviewed practice interview videos varied in their content and type of feedback. Overall, reviewer 1 provided more positive reinforcement feedback, while reviewers 2 and 3 provided more constructive criticism (see Table 2). Reviewer 1 focused more on appearance and attire and technical quality of video, while reviewer 2 focused more on response content. All three reviewers provided abundant feedback on the delivery of content.
Table 2.
Type of feedback message (positive reinforcement, negative constructive criticism, both) within each feedback theme (and subtheme) by faculty reviewer.
| Topic | Reviewer | Positive | Negative | Both | Total comments |
|---|---|---|---|---|---|
| Appearance and attire | 1 | 26 (96.3) | 1 (3.7) | 0 | 27 (14.9) |
| | 2 | 7 (77.8) | 1 (11.1) | 1 (11.1) | 9 (7.8) |
| | 3 | 4 (100) | 0 | 0 | 4 (14.3) |
| Delivery of content | 1 | 42 (47.7) | 41 (46.6) | 5 (5.7) | 88 (48.6) |
| | 2 | 7 (13.2) | 46 (86.8) | 0 | 53 (46.1) |
| | 3 | 5 (35.7) | 9 (64.3) | 0 | 14 (50.0) |
| Body language | 1 | 0 | 0 | 0 | 0 |
| | 2 | 0 | 0 | 0 | 0 |
| | 3 | 1 (100) | 0 | 0 | 1 (3.6) |
| Concise/succinct | 1 | 8 (42.1) | 10 (52.6) | 1 (5.3) | 19 (10.5) |
| | 2 | 0 | 8 (100) | 0 | 8 (7.0) |
| | 3 | 0 | 5 (100) | 0 | 5 (17.9) |
| Distracting behaviors | 1 | 0 | 0 | 0 | 0 |
| | 2 | 0 | 3 (100) | 0 | 3 (2.6) |
| | 3 | 0 | 0 | 0 | 0 |
| Eye contact | 1 | 7 (43.8) | 6 (37.5) | 3 (18.8) | 16 (8.8) |
| | 2 | 2 (20.0) | 8 (80.0) | 0 | 10 (8.7) |
| | 3 | 2 (66.7) | 1 (33.3) | 0 | 3 (10.7) |
| Empathy/facial expression | 1 | 6 (60.0) | 4 (40.0) | 0 | 10 (5.5) |
| | 2 | 2 (66.7) | 1 (33.3) | 0 | 3 (2.6) |
| | 3 | 0 | 0 | 0 | 0 |
| Filler words/signal‐to‐noise ratio | 1 | 3 (37.5) | 5 (62.5) | 0 | 8 (4.4) |
| | 2 | 0 | 1 (100) | 0 | 1 (0.9) |
| | 3 | 0 | 1 (100) | 0 | 1 (3.6) |
| Pace | 1 | 6 (75.0) | 1 (12.5) | 1 (12.5) | 8 (4.4) |
| | 2 | 0 | 1 (100) | 0 | 1 (0.9) |
| | 3 | 0 | 0 | 0 | 0 |
| Repetitive words or phrases | 1 | 0 | 1 (100) | 0 | 1 (0.6) |
| | 2 | 0 | 1 (100) | 0 | 1 (0.9) |
| | 3 | 0 | 0 | 0 | 0 |
| Summary of answer | 1 | 4 (50.0) | 4 (50.0) | 0 | 8 (4.4) |
| | 2 | 2 (16.7) | 10 (83.3) | 0 | 12 (10.4) |
| | 3 | 2 (100) | 0 | 0 | 2 (7.1) |
| Use of medical jargon | 1 | 8 (44.4) | 10 (55.6) | 0 | 18 (9.9) |
| | 2 | 1 (7.1) | 13 (92.9) | 0 | 14 (12.2) |
| | 3 | 1 (33.3) | 2 (66.7) | 0 | 3 (10.7) |
| Recording environment | 1 | 17 (73.9) | 6 (26.1) | 0 | 23 (12.7) |
| | 2 | 7 (46.7) | 8 (53.3) | 0 | 15 (13.0) |
| | 3 | 0 | 2 (100) | 0 | 2 (7.1) |
| Ambient noise | 1 | 1 (25.0) | 3 (75.0) | 0 | 4 (2.2) |
| | 2 | 0 | 3 (100) | 0 | 3 (2.6) |
| | 3 | 0 | 0 | 0 | 0 |
| Backdrop | 1 | 16 (57.1) | 3 (10.7) | 9 (32.1) | 28 (15.5) |
| | 2 | 7 (58.3) | 5 (41.7) | 0 | 12 (10.4) |
| | 3 | 0 | 2 (100) | 0 | 2 (7.1) |
| Response content | 1 | 7 (58.3) | 4 (33.3) | 1 (8.3) | 12 (6.6) |
| | 2 | 11 (47.8) | 4 (17.4) | 8 (34.8) | 23 (20.0) |
| | 3 | 2 (66.7) | 0 | 1 (33.3) | 3 (10.7) |
| Technical quality of video | 1 | 11 (35.5) | 18 (58.1) | 2 (6.5) | 31 (17.1) |
| | 2 | 7 (46.7) | 7 (46.7) | 1 (6.7) | 15 (13.0) |
| | 3 | 1 (20.0) | 4 (80.0) | 0 | 5 (17.9) |
| Audio quality | 1 | 0 | 0 | 0 | 0 |
| | 2 | 0 | 0 | 0 | 0 |
| | 3 | 0 | 1 (100) | 0 | 1 (3.6) |
| Laptop positioning | 1 | 2 (25.0) | 6 (75.0) | 0 | 8 (4.4) |
| | 2 | 0 | 3 (75.0) | 1 (25.0) | 4 (3.5) |
| | 3 | 0 | 2 (100) | 0 | 2 (7.1) |
| Lighting quality | 1 | 9 (50.0) | 7 (38.9) | 2 (11.1) | 18 (9.9) |
| | 2 | 6 (75.0) | 2 (25.0) | 0 | 8 (7.0) |
| | 3 | 0 | 1 (100) | 0 | 1 (3.6) |
| Video quality | 1 | 0 | 5 (100) | 0 | 5 (2.8) |
| | 2 | 1 (33.3) | 2 (66.7) | 0 | 3 (2.6) |
| | 3 | 1 (100) | 0 | 0 | 1 (3.6) |
| Total | 1 | 103 (56.9) | 70 (38.7) | 8 (4.4) | 181 (100) |
| | 2 | 39 (33.9) | 66 (57.4) | 10 (8.7) | 115 (100) |
| | 3 | 12 (42.9) | 15 (53.6) | 1 (3.6) | 28 (100) |
Percentages of positive, negative, and both categories are based on row totals, while row total percentages are based on the total number of comments by reviewer (N1 = 181, N2 = 115, N3 = 28).
DISCUSSION
Over the past decade, there has been increasing interest in replacing in‐person interviews with some form of video‐based interviewing, primarily to ameliorate the financial and time burdens associated with in‐person interviews for residency recruitment. 19 , 20 , 21 Until now, however, adoption of video‐based interviewing has been slow. Although most medical students and their advisors have little to no experience with video interviewing, the social distancing requirements imposed by COVID‐19 have necessitated its widespread adoption for obtaining residency positions. Through examination of medical students' previous performance on video interviews, we have identified specific suggestions for targeted interventions to help candidates improve their video interview skills for the current and future residency application cycles.
Our analysis of feedback provided to medical students during SVI preparation yielded several common concerns with student practice video interviews. Problems related to delivery of content were the most frequently identified issue for our medical students. Our findings align with early research in the psychology of job interviews that support the concept: “It's not what you say, but how you say it.” These studies showed that fluency of speech and nonverbal communication skills including body posture and eye contact are important contributors to successfully navigating the employment interview. 22 , 23
The most common error our students made during their interviews was an inability to deliver concise answers. Some students gave rambling responses or had a propensity to provide so many details that the key point of their response became obscured. These details often came in the form of specific test results, diagnoses, and the like, on which students have been conditioned to elaborate when presenting patients in the clinical setting. In the setting of the residency interview, however, such elaboration may appear to the interviewer as an inability to "get to the point." These errors can negatively affect interviewers' opinions of applicants, as previous studies have shown that fluency of speech is correlated with perceptions of intelligence. 23 , 24 , 25 , 26 Furthermore, earlier studies of job applicants' responses during an interview found that the ability of the candidate to respond concisely, answer questions fully, … and keep to the subject matter at hand appears to be crucial in obtaining a favorable employment decision. 25 , 26 Accordingly, we recommend incorporating formal training in articulating succinct and targeted responses when helping medical students prepare for video interviews for the residency application process.
Another common problem we identified was related to ineffective eye contact. This finding aligns with previous studies that showed that ratings of eye contact during video interviews are perceived to be lower than ratings of eye contact during in‐person interviews. 27 , 28 Some of the technological constraints that may limit eye contact in video‐mediated interviews have been identified. The placement of the video camera relative to the screen may result in the perception of gaze aversion, 29 , 30 while frame‐in‐frame video may serve as a distraction to the applicant during the interview. 31 The importance of making appropriate eye contact during interviews cannot be overemphasized as eye contact is a well‐established determinant of social judgment. People who avoid eye contact are perceived to be less sincere, more pessimistic, more deceptive, and less intelligent. 32 , 33 , 34 , 35 In contrast, applicants who maintain mutual gaze during job interviews are rated as more credible, more attractive, and more likely to be hired. 36 Although video interviews are commonplace in the business world, the residency interview will likely be a medical student's first experience with video interviewing for summative evaluation. Therefore, formal instruction on methods for maintaining mutual gaze in a virtual setting should be offered to medical students to prepare them for video interviews.
In aggregate, student practice videos were also found to have issues with the technical quality of the video, most often related to poor or inappropriate lighting or to problems with camera positioning that ultimately affected the quality of the video. Our findings were consistent with another recent study on the effects of video‐interview technology on communication during recruitment interviews, which also found that technical issues were present in a majority of interviews, including improper camera angles (86%) and poor lighting (60%). 28 Technical issues are especially concerning because they contribute to a lack of effective eye contact, a common problem addressed earlier. In particular, a camera placed too low makes the applicant appear to be looking down on the interviewers, giving the impression that the "candidate had an air of superiority and arrogance." We recommend that applicants be given more explicit instructions on how to optimize their videoconferencing setup to ensure proper eye contact with the video interviewer or camera. Along with adequate lighting, the camera should be raised so that it is positioned slightly above eye level, and the image of the interviewer should be positioned immediately below the camera. These simple manipulations can greatly affect the perception of mutual gaze and the interviewer's opinion of the applicant.
Surprisingly, the content of responses yielded a relatively small number of critical comments, with the majority of comments providing positive reinforcement and an intermediate number containing a mix of positive and critical elements. Critical comments typically revolved around fully developing stated principles and illustrating key concepts with appropriate examples. This may indicate that most of our students have adequate knowledge of professional behaviors or that teaching professionalism in the preclinical curriculum has had a lasting positive effect. Alternatively, it may reflect a lack of sensitivity of the SVI sample test questions for detecting deficiencies in professional behaviors, or discomfort on the part of the faculty raters in providing negative feedback on the content of responses.
LIMITATIONS
When interpreting our results, one must consider a few limitations of our study. One limitation is that it is unclear whether errors made by medical students when recording their practice video interviews for the SVI will be predictive of mistakes they may make when interviewing, in real time, across a video‐based Internet platform. Although the AAMC has recommended as "best practices" the use of behavioral and situational questions in a structured interview format for conducting video‐based residency interviews, 36 the content of residency interviews will surely differ, because programs care more about an applicant's "fit" than about the constructs tested by the SVI. However, the feedback given to our medical students was largely about how they presented themselves on camera. Furthermore, the recording equipment and video platforms that our students used for their practice SVIs are likely the same ones they would use for their residency interviews, so it is reasonable to assume that technical errors will be similar for both formats. Accordingly, the feedback given to our students on technical errors may be even more relevant for residency interviews than it was for the SVI.
Another limitation of this study was that the faculty members provided far more comments on the delivery of content and technical issues than on the content of responses. While this may reflect appropriate preparation on the part of the students, it may also reflect a social desirability bias on the part of the faculty raters. All three faculty reviewers have extensive experience teaching and providing feedback on professionalism and communication skills in both the preclinical and the clinical portions of our curriculum and therefore should be comfortable providing feedback that is useful for improving student performance. However, additional faculty development on providing feedback specifically related to the content of responses might have yielded different results.
CONCLUSIONS
Thematic analysis of feedback given on practice standardized video interviews yielded results that may help other programs anticipate the needs of medical students as they prepare for video‐based residency interviews. Medical students need guidance on how to use appropriate examples to fully develop the content of their responses. Students also need coaching and practice in delivering clear and concise responses to interview questions, free of overly detailed information. Applicants should be provided with more detailed instruction and feedback regarding the technical aspects of video interviewing, including lighting, camera positioning, how to maximize both video and audio quality, and setting an appropriate background environment. Camera positioning and lighting strategies should be employed to achieve appropriate eye contact with the interviewer through the camera lens. These simple interventions have the potential to positively affect students' presentation of themselves to residency programs through the video medium.
CONFLICT OF INTEREST
The authors have no potential conflicts to disclose.
Leung CG, Malone M, Way DP, Barrie MG, Kman NE, San Miguel C. Preparing students for residency interviews in the age of COVID: Lessons learned from a standardized video interview preparation program. AEM Education and Training. 2021;5:e10583. 10.1002/aet2.10583
REFERENCES
- 1. Final Report and Recommendations for Medical Education Institutions of LCME‐Accredited U.S. Osteopathic, and Non‐US Medical School Applicants. Washington, DC: Association of American Medical Colleges; 2020. Accessed September 24, 2020. https://www.aamc.org/system/files/2020‐05/covid19_Final_Recommendations_05112020.pdf/. [Google Scholar]
- 2. Crane JT, Ferraro CM. Selection criteria for emergency medicine residency applicants. Acad Emerg Med. 2000;7(1):54‐60. [DOI] [PubMed] [Google Scholar]
- 3. Negaard M, Assimacopoulos E, Harland K, Van Heukelom J. Emergency medicine residency selection criteria: an update and comparison. AEM Educ Train. 2018;2(1):146‐153. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4. Shah SK, Arora S, Skipper B, Kalishman S, Timm TC, Smith AY. Randomized evaluation of a web‐based interview process for urology resident selection. J Urol. 2012;187(4):1380‐1384. [DOI] [PubMed] [Google Scholar]
- 5. Vadi MG, Malkin MR, Lenar J, Stier GR, Gatling JW, Applegate RL. Comparison of web‐based and face‐to‐face interviews for application to an anesthesiology training program: a pilot study. Int J Med Educ. 2016;7:102‐108. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6. Daram SR, Wu R, Tang SJ. Interview from anywhere: feasibility and utility of web‐based videoconference interviews in the gastroenterology fellowship selection process. Am J Gastroenterol. 2014;109(2):155‐159. [DOI] [PubMed] [Google Scholar]
- 7. Edje L, Miller C, Kiefer J, Oram D. Using Skype as an alternative for residency selection interviews. J Grad Med Educ. 2013;5(3):503‐505. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8. Williams K, Kling JM, Labonte HR, Blair JE. Videoconference interviewing: tips for success. J Grad Med Educ. 2015;7:331–333. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9. Pasadhika S, Altenbernd T, Ober RR, Harvey EM, Miller JM. Residency interview video conferencing. Ophthalmology. 2012;119(2):426. [DOI] [PubMed] [Google Scholar]
- 10. About the SVI. Washington, DC: Association of American Medical Colleges; 2017. Accessed July 8, 2019. https://students‐residents.aamc.org/applying‐residency/article/about‐svi/ (no longer available). [Google Scholar]
- 11. AAMC Standardized Video Interview: Essentials for the ERAS® 2019 Season. Washington, DC: Association of American Medical Colleges; Updated February 7, 2018. Accessed July 8, 2019. https://aamc‐orange.global.ssl.fastly.net/production/media/filer_public/41/37/41376739‐d8a1‐4e38‐85f5‐479d99caa1a3/aamc_standardized_video_interview_applicant_prep_guide_22019.pdf.
- 12. Virtual Interview Tips for Medical School Applicants. Washington, DC: Association of American Medical Colleges; 2020. Accessed September 25, 2020. https://www.aamc.org/system/files/2020‐05/Virtual_Interview_Tips_for_Medical_School_Applicants_05142020.pdf.
- 13. Joshi A, Bloom DA, Spencer A, Gaetke‐Udager K, Cohan RH. Video interviewing: a review and recommendations for implementation in the era of COVID‐19 and beyond. Acad Radiol. 2020;27(9):1316‐1322.
- 14. Top 5 Video Interviewing Tips for Residency and Fellowship Programs. Santa Clara, CA: Thalamus, Connecting the Docs; 2020. Accessed September 25, 2020. https://thalamusgme.com/top‐5‐video‐interviewing‐tips‐for‐residency‐and‐fellowship‐programs/.
- 15. Candidate Resources and Candidate Help Center. South Jordan, UT: HireVue; 2020. Accessed September 28, 2020. https://www.hirevue.com/candidates.
- 16. Skillings P. The Ultimate Guide to Acing Your Next Video Interview. New York, NY: Skillful Communications; c2021. Accessed June 12, 2018. https://biginterview.com/video‐interview/.
- 17. HireVue Team. Video Interview Preparation: 8 Tips for Successful Video Interviewing. HireVue Blog. South Jordan, UT: HireVue; 2018. Accessed June 12, 2018. https://www.hirevue.com/blog/candidates/how‐to‐prepare‐for‐your‐hirevue‐digital‐interview.
- 18. Ayres L. Thematic coding and analysis. In: Given LM, ed. The Sage Encyclopedia of Qualitative Research Methods. Thousand Oaks, CA: Sage Publications Inc.; 2008:867‐868.
- 19. Hariton E, Bortoletto P, Ayogu N. Residency interviews in the 21st century. J Grad Med Educ. 2016;8(3):322‐324.
- 20. Hariton E, Bortoletto P, Ayogu N. Using video‐conference interviews in the residency application process. Acad Med. 2017;92(6):728‐729.
- 21. Pourmand A, Lee H, Fair M, Maloney K, Caggiula A. Feasibility and usability of tele‐interview for medical residency interview. West J Emerg Med. 2018;19(1):80‐86.
- 22. Imada AS, Hakel MD. Influence of nonverbal communication and rater proximity on impressions and decisions in simulated employment interviews. J Applied Psychol. 1977;62(3):295.
- 23. Hollandsworth JG, Kazelskis R, Stevens J, Dressel ME. Relative contributions of verbal, articulative, and nonverbal communication to employment decisions in the job interview setting. Personnel Psychol. 1979;32:359‐367.
- 24. Borkenau P, Liebler A. Observable attributes as manifestations and cues of personality and intelligence. J Personality. 1995;63:1‐25.
- 25. Ferran‐Urdaneta C, Storck J. Truth or deception: the impact of video conferencing on job interviews. ICIS 1997 Proceedings. 1997;12:183‐196.
- 26. Burns RB. The Secrets of Finding and Keeping a Job. 2nd ed. Warriewood, New South Wales, Australia: Business & Professional Publishing, PTY LTD; 1999.
- 27. McColl R, Michelotti M. Sorry, could you repeat the question? Exploring video‐interview recruitment practice in HRM. Hum Resour Manag J. 2019;29:637‐656.
- 28. Bohannon LS, Herbert AM, Pelz JB, Rantanen EM. Eye contact and video‐mediated communication: a review. Displays. 2013;34(2):177‐185.
- 29. Grayson DM, Monk AF. Are you looking at me? Eye contact and desktop video conferencing. ACM Transact Comput Human Interact. 2003;10(3):221‐243.
- 30. Horn RG, Behrend TS. Video killed the interview star: does picture‐in‐picture affect interview performance? Personnel Assess Decis. 2017;3(1):51‐59.
- 31. Bond CF, Omar A, Mahmoud A, Bonser RN. Lie detection across cultures. J Nonverbal Behav. 1990;14(3):189‐204.
- 32. Larsen RJ, Shackelford TK. Gaze avoidance: personality and social judgements of people who avoid face‐to‐face contact. Person Individ Diff. 1996;21(6):907‐917.
- 33. Kleck RE, Nuessle W. Congruence between the indicative and communicative functions of eye contact in interpersonal relations. Br J Soc Clin Psychol. 1968;7:241‐246.
- 34. Hemsley GD, Doob AN. The effect of looking behavior on perceptions of a communicator's credibility. J Applied Soc Psychol. 1978;8(2):136‐140.
- 35. Burgoon JK, Manusov V, Mineo P, Hale JL. Effects of gaze on hiring, credibility, attraction, and relational message interpretation. J Nonverbal Behav. 1985;9(3):133‐146.
- 36. Best Practices for Conducting Residency Program Interviews. Washington, DC: Association of American Medical Colleges; 2020. Accessed September 28, 2020. https://www.aamc.org/system/files/2020‐05/best%20practices%20for%20conducting%20residency%20program%20interviews.pdf.