Abstract
Transitioning medical students are anxious about their readiness-for-internship, as are the residency program directors and teaching hospital leadership responsible for care quality and patient safety. A readiness-for-internship assessment program could contribute to ensuring optimal quality and safety and be a key element in implementing competency-based, time-variable medical education. In this paper, we describe the development of the Night-onCall program (NOC), a 4-h, mixed-modality, readiness-for-internship simulation event. NOC was designed and implemented over the course of 3 years to provide an authentic "night on call" experience for near-graduating students and to build measures of students' readiness for this transition, framed by the Association of American Medical Colleges' Core Entrustable Professional Activities for Entering Residency. NOC is a product of a program of research focused on questions related to enabling individualized pathways through medical training. The lessons learned and modifications made to create a feasible, acceptable, flexible, and educationally rich NOC are shared to inform the discussion about transition-to-residency curricula and best practices regarding educational handoffs from undergraduate to graduate education.
Keywords: Transitions to residency, Immersive simulation, Mixed modality experiences, Educational experience, Teamwork, Basic clinical skills, Communication between team members, Handoffs, Oral presentations, Readiness-for-internship assessments, Competency-based medical education, Entrustable Professional Activities
Introduction
“It still doesn’t quite feel like I am able to jump in and start on July 1…the nurses expect you to be the doctor, the patients expect you to be the doctor, your colleagues expect you to be the doctor”.
~ 4th-year medical student, 2 weeks before graduation, expressing anxiety about transitioning to residency.
“We get to see July 1st as medical students and get to see how a lot of Interns really struggle with some basic skills”.
~ 3rd-year medical student, a year before graduation, voicing concern about transitioning to residency.
Medical students transitioning from undergraduate medical education (UME) to graduate medical education (GME, also referred to as "residency" or "internship") experience uncertainty and distress about their readiness-for-internship [1–3]. This lack of readiness may be partially responsible for the "July effect", a reported 10% increase in fatal medical errors in North American teaching hospitals when new graduates enter the workforce each July [4]. Residency program directors are just as anxious about integrating incoming medical school graduates into a fast-paced and complex health care system, because they know that clinical experience and competence during the senior year of medical school are variable, both within a single school and across institutions [5–7], and that a new resident class is typically made up of graduates of many medical schools. This heterogeneity in readiness has led residency programs and hospital leadership to implement orientation programs and increase supervision to ensure patient care quality and safety as new trainees learn to function effectively in their new roles [8, 9]. Some medical schools have also implemented transition courses; however, these are generally organized by clinical discipline [10]. A discipline-agnostic readiness-for-internship program, administered just prior to medical school graduation, would serve many important purposes, including (1) preparing near-graduate medical students for a smooth and safe transition to residency, (2) building an assessment program with the intention of ultimately benchmarking and reporting readiness-for-internship metrics regardless of clinical discipline, and (3) providing a meaningful educational handoff between UME and GME in the USA and beyond.
A competency-based readiness-for-internship assessment program is both timely and critical to the UME-GME continuum [10]. In recent years, hospital patient safety and quality assurance committees and residency program directors have been called upon by accrediting agencies, malpractice insurance companies, and the general public to demonstrate that the residents they train are capable of providing the care they are assigned to deliver. Residency Review Committees, the clinical-discipline-specific accreditation bodies of the US Accreditation Council for Graduate Medical Education (ACGME), have provided guidelines outlining what a first-year resident can and cannot do without direct supervision until competency has been documented [11]. In 2014, the Association of American Medical Colleges (AAMC), which co-sponsors the accreditation of US medical schools, released a set of 13 core Entrustable Professional Activities (EPAs) for entering residency (Core EPAs) (see Fig. 1). EPAs are units of professional practice a trainee can be trusted to accomplish unsupervised once he or she has demonstrated sufficient and specific competence. The authors of the Core EPAs provided detailed guidance meant to drive the community toward refining, measuring, and benchmarking the minimal level of competence expected of a medical school graduate [12]. As yet, there is little consensus on how to assess the Core EPAs of new residents or what type of transition documentation (or "handoff") to residency programs would be meaningful [13, 14].
Although ensuring readiness-for-internship is challenging, the negative consequences for patients, institutions, programs, and individual professionals when "onboarding" is not done effectively are unacceptable. Simulation has a critical role to play both in reducing the risk of iatrogenic harm to patients [15, 16] and in assessing the fundamental clinical competence critical to creating an institutional culture of safety [17–20]. Ideally, with the implementation of a meaningful simulation-based assessment program just prior to medical school graduation, actionable formative feedback could be provided to both the learner and GME program directors to achieve these goals.
In this paper, we describe in detail the development of a complex, immersive simulation, the Night-onCall (NOC). We believe NOC is an innovative program for a number of reasons: (1) it was designed iteratively, in response to specific local needs and evolving research questions; (2) it can be reproduced at most medical schools without sophisticated simulation facilities; and (3) it both provides an authentic educational experience for transitioning medical students and is likely to enable high-value Core EPA assessment.
Developing NOC
Conceptual framework underlying NOC
NOC is a multi-station experience in an Objective Structured Clinical Examination (OSCE) format [21–23]. Since the 1960s, OSCEs utilizing standardized (a.k.a. "programmed" or "simulated") patients to assess core clinical skills have become a ubiquitous part of medical education assessment programs, used worldwide in a vast array of formats and for a variety of purposes, including physicians' licensing examinations. NOC aligns with literature supporting the utility of a well-designed OSCE as an assessment of clinical competence, provided careful attention is paid to "contextual fidelity," which includes the interprofessional nature of most medical work, and to accurate "professional role reproduction" [24]. NOC is the current focus of a research program in which we explore the measurement of clinical competence for the purpose of supporting increasingly individualized pathways through medical training [25].
The team
NOC was developed by a multidisciplinary and interprofessional team consisting of physician, nurse, medical librarian, and PhD-prepared educators from Emergency Medicine, Internal Medicine, Surgery, and Obstetrics and Gynecology. Our team, as a whole, has extensive expertise in using simulation in undergraduate and graduate medical and nursing education.
Table 1 details how, over a 3-year period, we incrementally developed NOC into a complex, multimodal, immersive simulation, and summarizes our experience. The individual components of the 2016 NOC experience were designed and refined to address and assess each of the 13 Core EPAs. Fig. 2 illustrates what a medical student would experience in the 2016 iteration of NOC.
Table 1. Development of NOC year over year: 2014, 2015, and 2016 (Night-onCall)

Clinical cases/mixed modality

2014:
• Case 1: Oliguria. "I am calling about Mr. Jackson, 64-year-old man S/P elective endovascular repair of AAA, post-operative day 3. His urine output has dropped and he has mild abdominal pain." (Has urinary retention, BPH.)
• WISE-onCall module with three practice cases.
• Case 2: Oliguria. "I am calling about Mr. Taylor, 57-year-old man here for observation to rule out acute cardiac ischemia and pulmonary embolism. His urine output has dropped and he remains without chest pain." (Received contrast for a CT scan.)

2015:
• Case 1: Oliguria (same).
• WISE-onCall module with three practice cases.
• Case 2: Oliguria (same).
• Case 3: Headache. "I'm calling about Mr. Johnson, 64-year-old man S/P AAA repair, day 3, who is complaining of a severe headache." (Has a blood pressure of 195/99 and a history of HTN.)
• Case 4: Headache. "Hi. Are you covering for Mr. Kolinsky, 62-year-old man S/P internal fixation of an ankle fracture…I wanted to let you know that he is having a severe headache." (History of migraines, on propranolol for prevention.)
• Form a clinical question and retrieve evidence to advance clinical care.
• Culture-of-safety analysis of a paper case: vignette describing a pre-entrustable peer on the internal medicine clerkship; structured response identifying evidence of behaviors and assessment of entrustment.
• Handoff of all cases to a fellow intern (standardized), prioritized by urgency; assessment of entrustment.

2016 (Night-onCall):
• Case 1: Oliguria (same), plus oral presentation to attending.
• WISE-onCall module with three practice cases.
• Case 2: Oliguria (same).
• Case 3r: Headache (revised). "Hi. Are you covering for Mr. Brooks, 60-year-old man being treated for diverticulosis…I wanted to let you know his blood pressure is really high." (195/99 currently, non-focal neuro exam, history of migraine headaches, on propranolol for prevention.)
• Form a clinical question and retrieve evidence to advance clinical care (same).
• Case 4r: Go "get" consent (revised). "Hi. This is Randy, your second-year resident. You are covering Mr. Smith, a 40 y/o with a cough, fever, and pleural effusion. You need to go consent him for a thoracentesis. I will meet you at the bedside in 1 h." (The resident will explain the procedure if asked; the patient's husband is in the room.)
• Culture-of-safety analysis of a paper case (same).
• Handoff of all four cases to a fellow intern (standardized), prioritized by urgency; assessment of entrustment.

Number and types of participants
• 2014: 52 4th-year graduating medical students.
• 2015: 66 4th-year graduating students and 42 3rd-year students (rising seniors).
• 2016: 89 students: 35 4th-year, 12 3rd-year accelerated, 36 3rd-year, and 6 five-year pathway.

Event length
• 2014: 3 h/student, over 3 full days in the simulation center.
• 2015: 3 h/student, over 9 full days in the simulation center.
• 2016: 4 h/student, over 16 half days in the simulation center.

Incentive
• 2014–2016: $100/student, IRB-approved protocol.

EPAs addressed and assessed
• 2014: EPAs 1–5, 9, and 12.
• 2015: EPAs 1–10 and 12–13; piloted oral presentation, handoff, evidence-based medicine, and culture of safety.
• 2016: EPAs 1–13.

Study questions
• 2014: In what ways are our near graduates ready for internship? Does WISE-onCall "just in time" improve core clinical skills required for common clinical coverage issues? Do different forms of feedback (short-form vs. whole-form checklist) provided during the practice cases affect learning outcomes?
• 2015: Does simulated clinical exposure before WISE-onCall enhance learning from it? Does WISE-onCall improve clinical performance in content-discordant cases? Exploratory: 3rd-year vs. 4th-year students? Which Core EPAs for entering residency can we reliably assess in an integrated, authentic simulated experience?
• 2016: Is it feasible to assess all Core EPAs for entering residency in an integrated, authentic simulated experience? What are the differences in readiness for residency among clinically experienced students in different curricular pathways?

Measurements (assessor: assessed domains)
• 2014: SP: communication skills (data gathering, rapport building, patient education and counseling), history gathered, physical exam, professionalism, recommendations (entrustment equivalent). SN: collaboration, interprofessional communication, rapport building, professionalism (entrustment). Patient note: reporter, interpreter, manager, clinical reasoning. Faculty: clinical reasoning, entrustment. Structured domain-specific medical knowledge (clinical schema).
• 2015: All of the above, plus. Paper case (culture of safety): entrustment of peers. Medical librarian: ability to formulate answerable clinical questions and identify a literature-based answer. Peer: handoff quality and entrustment.
• 2016: All of the above, plus. Faculty: oral presentation skills and entrustment. Case no. 3: SP/SN: recognizing a patient requiring urgent or emergent care and initiating evaluation and management. Case no. 4: SP/standardized resident/spouse: ability to conduct an ethical and legal informed consent discussion and effectively include family members.

Feedback, findings, and remaining questions

2014:
• Students appreciate the opportunity to practice and learn before July 1.
• The WISE-onCall module is useful "just in time."
• Students question the authenticity of working with a nurse in the patient's room.
• Extreme variability in measured "readiness" and in the sophistication of clinical schema.
• The majority of students improved significantly after WISE-onCall (some did not). Does this reflect readiness to learn from clinical cases?
• Simpler forms of feedback within WISE-onCall are as effective as more complex ones (RCT).
• Although they need to be refined, our assessments were reasonably reliable, authentic, and synthetic.
• Many of the common topics required for the transition from UME to GME can be assessed and addressed using this style of blended assessment/learning experience.

2015:
• All clinical students (3rd- and 4th-year) appreciate the practice and authenticity.
• Educational utility is high.
• Students demonstrate the best clinical skills and clinical reasoning after they complete an SP/SN case on the same topic before a WISE-onCall module (neither alone is enough).
• MS3s have more comprehensive basic clinical skills than MS4s.
• Both MS3s and MS4s get a significant boost in content-specific structured knowledge from the blended WISE-onCall and simulation experience.
• 4th-year students gained more in the domains of clinical management and overall clinical reasoning than 3rd-year students; this may be secondary to a boosting effect of the experience on knowledge and skills they had obtained but forgotten.
• Almost all students recognize pre-entrustable "culture of safety" behaviors in a peer and can recommend strategies to address them.
• The ability to formulate answerable clinical questions and identify a literature-based answer is highly variable in quality.

2016 (Night-onCall):
• Continued enthusiasm for the high educational yield of the event.
• Feasibility of assessing all 13 Core EPAs confirmed.
• No significant differences between students in the accelerated MD program and the traditional 4th-year program (small sample).
• Attendings were impressed by the variability in intern readiness evident in the oral presentations.
• Both competency measures and entrustment measures can be made.
• What should we do with students who perform poorly on NOC?
• What would be the most useful design for educational handoffs from UME to GME?
• Can we establish predictive models and cutoffs for the data produced in NOC?
Types of NOC assessments
Web-based multimedia module
In response to the increasing focus on medical students' readiness for residency, and based on 10 years of experience building and studying WISE-MD, a web-based core surgery clerkship curriculum [26], our team created WISE-onCall, a set of web-based multimedia modules aimed at enhancing the ability of novices to address common clinical coverage issues. The modules are designed within a cognitive apprenticeship framework [27]: they begin with two "partially worked" case examples, including video demonstrations of interprofessional interaction that employ the instructional strategies of modeling, coaching, scaffolding, and fading of instructional guidance, and conclude with three text-based practice cases in which the learner applies diagnostic skills and obtains feedback. To date, eight WISE-onCall modules have been completed, with plans to build at least five more in the next 2 years [28]. For NOC, we selected the Oliguria (low urine output) WISE-onCall module because it is a topic all students are likely to know at a basic level by the end of medical school and a condition interns in all clinical disciplines can expect to encounter during a typical night on inpatient call. (Addresses EPAs 1–4, 9, 10, and 12.)
Performance-based assessment (PBA)
Initially, in 2014, we designed two standardized patient (SP) and standardized nurse (SN) cases of relatively equal difficulty (case no. 1 and case no. 2) to administer before and after the WISE-onCall module. In 2015, we developed two additional SP/SN cases (case no. 3 and case no. 4) in order to explore how clinical case content concordance and the sequencing of PBAs and WISE-onCall impacted performance across the simulation activities [31]. In 2016, we revised cases 3 and 4 (into cases 3r and 4r) to enable us to address, assess, and align PBAs with the Core EPAs (see Table 1 for details).
Learners' clinical skills, including interprofessional teamwork, were assessed using SP/SN-completed checklists developed on the basis of extensive prior research [22]. Clinical reasoning was assessed from student-completed patient coverage notes, scored by a clinician using a rubric [29, 30]. Rigorous methods were employed to develop the SP/SN roles and checklists, and to recruit, train, and calibrate actors for both case portrayal (3 h) and rating reliability (3 h) [23]. (Addresses EPAs 1–5, 9, 10, and 12.)
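To make the rater-calibration step concrete, the sketch below shows one conventional way to quantify agreement between two trained raters on binary checklist items. It is a minimal illustration under our own assumptions, not the study's actual procedure: the items, the ratings, and the choice of Cohen's kappa as the agreement statistic are all ours.

```python
# Illustrative sketch only: checking calibration of two standardized-patient
# raters who scored the same recorded encounter on binary checklist items
# (1 = behavior done, 0 = not done). The data and statistic are assumptions,
# not the instrument or analysis reported in the paper.

def cohens_kappa(rater_a: list[int], rater_b: list[int]) -> float:
    """Cohen's kappa: chance-corrected agreement between two raters.

    Assumes at least one disagreement is possible (p_expected < 1).
    """
    n = len(rater_a)
    # Observed proportion of items on which the raters agree.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal "yes" rate.
    p_yes_a, p_yes_b = sum(rater_a) / n, sum(rater_b) / n
    p_expected = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)
    return (p_observed - p_expected) / (1 - p_expected)

# Two trained SPs rating the same encounter on six hypothetical items:
sp_one = [1, 1, 0, 1, 1, 0]
sp_two = [1, 1, 0, 1, 0, 0]
print(f"kappa = {cohens_kappa(sp_one, sp_two):.2f}")  # kappa = 0.67
```

In a calibration workflow like the one described above, a training team might re-train or replace raters whose agreement with a reference rating falls below a preset threshold.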
Oral presentation
Experienced physicians from the study team played the role of a standardized attending (SA) for case no. 1. The SA received a phone call from the study participant following the case no. 1 clinical encounter. A detailed guide to the case and the task was provided; it included the clinical details of the case and a set of standardized prompts to encourage the learner to share their clinical reasoning and establish a management plan. The SA was also responsible for assessing the quality of the oral presentation, using a checklist designed from the detailed description of Core EPA no. 6 [31], and for making an entrustment judgment.
Evidence-based medicine activity
Following case no. 3, learners, seated at a computer with Internet access, were given 10 min to formulate a clinical question based on the case and were instructed to use Web-based resources to find the best answer to a clinical question provided to them (e.g., "What is the best initial management for urgent hypertension?"). A computer program allowed a medical librarian to observe the learner's progress through the activity remotely, both in real time and from a recording. Using this approach, the medical librarian was also able to assess the learner's ability to formulate a clinical question and use digital resources to identify high-quality evidence to guide the patient's care, as described by EPA no. 7.
Patient handoff
We recruited senior medical students to play the role of the standardized intern (SI) taking over the clinical service. Each SI was trained to use a structured evaluation instrument, modified from a published instrument, to assess the quality of the handoff [32] and to provide an entrustment judgment (EPA no. 8).
Culture of safety exercise
Participants were first given time to read a detailed vignette describing a pre-entrustable intern's approach to a series of common quality and safety challenges on an inpatient ward [31]. Then, in written responses to open-ended prompts, participants listed the intern's behaviors and attitudes that interfered with a culture of safety and suggested actions needed for systems improvement. A faculty member (GN) assessed students' written responses using a rubric based on the description of the AAMC's EPA no. 13 [33].
Recruitment of students
For all phases of NOC, we recruited near-graduate medical students by email. A student who agreed to participate could sign up for a scheduled slot in the simulation center by clicking on a Uniform Resource Locator (URL) embedded in the recruitment email. Study staff then confirmed the date with the participant and provided background information regarding the study via email. Participation was entirely voluntary, written informed consent was obtained, and a financial incentive was provided.
Resources needed to implement NOC
The NOC experience was hosted by the New York Simulation Center [34]. We estimate our cost per student for NOC to be around $500 (US). This includes SP/SN salaries and staff time for planning and running the event, including SP training, student recruitment, and scheduling. The estimate does not include a facility fee, the study incentive, case development time, patient note scoring, data entry and management, or physician preceptor time.
Lessons learned
In building this experience, we have learned many lessons that may be of interest to others seeking to build similar assessment events. While we do not currently share any assessment data with students, we ultimately seek to use the competency assessments and entrustment judgments both as feedback to students on their readiness and as a handoff to residency training program directors. The following summarizes what we have learned so far.
NOC is feasible
As we have demonstrated, it is feasible to host a NOC for a large number of students. However, this can only be done with championing from leadership, adequate funding, and committed professional and administrative personnel. The team met weekly for the 3 years it took to develop materials, pilot, and refine the program. Staging the full NOC event required several months of planning, which included scheduling space, recruiting and training the actors, faculty, and students playing roles, and recruiting and scheduling participants. Data entry, cleaning, analysis, and interpretation also required adequate resources. Advanced simulation facilities or equipment were not required to host the NOC.
NOC is acceptable
NOC is an immersive, complex, mixed-modality simulation experience aimed at creating an authentic opportunity to rehearse being an intern "on call." Although more work will be required to establish the program as an effective means of measuring students' readiness-for-internship for high-stakes purposes, participants routinely expressed, during the debriefing portion of the NOC experience, that it helped them better understand their readiness and identify knowledge and skill gaps prior to their transition.
NOC is a flexible structure
Depending on local needs and resources, there are ways to modify the program to reduce cost and shorten the time needed while still achieving the same objectives. Based on our experience, we believe the EPA framework allows for a great deal of creativity and innovation. For instance, we chose to assess EPA no. 13 using a written assessment of a paper case rather than the complex simulation others have used [35]. Other schools prepare students for a night on call by integrating assessments into their required advanced clerkships [36]. In the future, we plan to conduct head-to-head comparisons of these strategies to better understand their relative educational and assessment value and costs.
NOC will likely produce valuable information
One goal of the analysis of our experience and data is to understand the educational value of the components of NOC. From the point of view of the students who volunteered to participate, this low-stakes experience was almost uniformly seen as time well spent, educational, and anxiety-reducing. This may change as we refine the competency measures and entrustment judgments and start providing detailed feedback. Twelve years ago, we established a Comprehensive Clinical Skills Exam (CCSE) at our school, an 8-station Objective Structured Clinical Examination serving as a final performance exam for the core clinical clerkships. Like NOC, the CCSE was developed as an assessment for learning, a formative experience, and in that form it was very popular with and much appreciated by students. Once we transitioned the CCSE to an assessment of learning (as defined by van der Vleuten et al. [37]), a summative, high-stakes experience that students were required to pass, its popularity and the enthusiasm among students decreased. We suspect this may be an inevitable trade-off for some students, but we hope to engage students in embracing the value of the data produced by NOC.
Next steps for NOC
We are currently experimenting with effective ways to visualize the NOC data and report it to students to guide their preparation for internship. Despite desiring educational handoff information on their incoming interns, residency program directors are suspicious of assessments done in the undergraduate setting and do not yet "trust" evidence of readiness [1]. With this in mind, we are exploring how, if at all, residency program directors would find this type of performance data useful for planning supervision during the transition months, given that in the USA they are contractually committed to training incoming residents at the time of medical school graduation.
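As one purely hypothetical illustration of the kind of student-facing report we are experimenting with, the sketch below renders per-EPA scores alongside cohort means as plain text. The EPA labels, the 0–100% scale, and all numbers are invented for the example; NOC's actual reporting format, as noted above, is still under development.

```python
# Illustrative sketch only: a minimal plain-text per-EPA feedback report that
# shows each student score next to the cohort mean for context. Labels, the
# percent scale, and values are assumptions, not NOC's actual report design.

def epa_report(scores: dict[str, float], cohort_means: dict[str, float]) -> str:
    """Render one line per EPA: score, a coarse bar, and the cohort mean."""
    lines = ["NOC readiness report", "-" * 52]
    for epa in sorted(scores):
        bar = "#" * round(scores[epa] / 10)  # one '#' per 10 percentage points
        lines.append(
            f"{epa:<30}{scores[epa]:3.0f}%  {bar:<10} (cohort {cohort_means[epa]:.0f}%)"
        )
    return "\n".join(lines)

print(epa_report(
    {"EPA 6: oral presentation": 72, "EPA 8: patient handoff": 85},
    {"EPA 6: oral presentation": 78, "EPA 8: patient handoff": 81},
))
```

Pairing each score with a cohort reference point is one design choice among many; a report intended for program directors might instead emphasize supervision-relevant entrustment levels.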
We are also exploring how best to understand the entrustment judgments generated in NOC [38, 39] and how adding self-assessment measures (e.g., context-specific measures of self-efficacy, affect, and cognitive load) might clarify the value of experiences like NOC for understanding a student's metacognitive capabilities [40], which are thought to be crucial to the lifelong learning required by a career in medicine in the twenty-first century.
Is NOC a valid approach to enhance readiness for internship?
We embraced the complexity and context-based nature of competence in building NOC. As a consequence, it will require a great deal of work to establish validity and set standards for the NOC outcome data for the purpose of high-stakes promotion decisions. Our team is currently working toward this goal. NOC's design is grounded in both a conceptual framework (situated, mixed-modality clinical experiences in an immersive simulation) and a content framework (the Core EPAs), the latter created through national consensus and endorsed by the AAMC. When available, we based our assessment instruments on tools with previously reported validity evidence, and we are working to ensure that all our assessments have acceptable reliability. NOC balances the difficulty of achieving highly reliable individual measures against the fact that we generate a large number of assessments of each student from a variety of perspectives (patient, nurse, expert, peer): a simulated 360° workplace assessment. We plan to follow some of our subjects into the first year of residency and beyond to see whether strengths and weaknesses identified during the NOC experience are associated with adjustment to internship and demonstrated skills and, in the longer run, to study whether NOC predicts success in residency training and beyond.
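One classical way to see why aggregating many individually modest measures can still yield a dependable composite, offered here purely as an illustration (the paper does not specify its psychometric model), is the Spearman-Brown prophecy formula:

$$\rho_n = \frac{n\,\rho_1}{1 + (n-1)\,\rho_1}$$

where $\rho_1$ is the reliability of a single assessment and $\rho_n$ the predicted reliability of a composite of $n$ comparable assessments. For example, ten assessments each with reliability 0.30 would yield a composite reliability of about $10 \times 0.30 / (1 + 9 \times 0.30) \approx 0.81$, which is one reason a multi-perspective battery like NOC can tolerate noisy individual checklists.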
The NOC program has already resulted in curriculum changes. For example, our clerkships and sub-internships have incorporated WISE-onCall modules and related exercises that address clinical reasoning and provide examples of professional behavior, teamwork, and communication.
Conclusion
If the AAMC EPAs are to become the standard by which we assess transitioning students' preparedness for residency, we will need to ensure that all of our students reach those standards and can still perform at that level at the time they transition to graduate medical education. Building programs like NOC will also enable medical schools to move toward the competency-based, time-variable curriculum that many now believe is the best way forward [41–43]. We have described a program for achieving these goals that is feasible, acceptable, flexible, and likely to produce valuable information for learners, educational leaders, and policy makers.
Acknowledgements
We would like to thank Heather Dumorne, Nadiya Pavlishyn, Gizely Andrade, and Natasha Orzeck-Byrnes for providing support in the running of NOC as well as the many actors who participated as standardized patients and nurses to make this event a success. We also thank Deans of Medical Education, Victoria Harnik and Melvin Rosenfeld, for their support and guidance.
Funding
This project was supported with a grant from the James and Frances Berger Family Foundation.
Availability of data and materials
All materials needed to replicate the NOC Project, including standardized patient case materials and assessment instruments, are available from the corresponding author on reasonable request.
Authors’ contributions
All individuals listed below have substantially contributed to the conception or design of the work and have participated in drafting the manuscript and/or revising critically for important intellectual content. In addition, each has given final approval of this version and has agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. Specifically, AK is responsible for all aspects of this project. She has led the conception and design of the project, drafted and finalized the manuscript, designed and refined aspects of the project, and championed and supervised its implementation. SZ participated in overall conception of the project, designed SP/SN cases, recruited and trained actors, and participated in conceptualizing, editing, and finalizing the manuscript. DS participated in overall conception of the project, designed SP/SN cases, and participated in manuscript editing and finalizing. SDY participated in design and implementation of NOC, data collection and management, and manuscript editing. HS participated in conception and design, sought and received IRB approval, and participated in manuscript editing. MN has participated in overall conception of the project, implemented the WISE-onCall module, designed data management and reporting components of the project, and participated in manuscript editing. GN has participated in overall conception of the project, in particular, the culture of safety activity, and participated in manuscript editing. MVP participated in design and conception of the oral presentation component and manuscript conceptualization and editing. CD participated in design and conceptualization of the implementation of NOC. CB participated in design and conceptualization of the handoff and in manuscript editing. KLE contributed important intellectual content on educational theory and participated in manuscript editing. JN participated in conceptualization and design of evidence-based medicine component of the NOC and participated in manuscript editing. TSR participated in funding and championing the project and in conceptualizing and editing the manuscript. All authors read and approved the final manuscript.
Ethics approval and consent to participate
While no data is shared in this manuscript, the NOC Project has been reviewed and approved by the NYU School of Medicine Institutional Review Board.
Consent for publication
Not applicable.
Competing interests
WISE-onCall was developed at NYU School of Medicine. Dr. Thomas Riles is the Executive Director of both WISE-MD, which produces and distributes WISE-onCall, and the New York Simulation Center; both are not-for-profit entities. Mr. Nick is a member of the Program for Medical Education and Technology at the NYU School of Medicine and the Technical Director for WISE-MD and WISE-onCall. The other authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Footnotes
Presented at NEGEA 2016, Providence, RI, and at the annual national AAMC meeting, Baltimore, Maryland, 2015.
Contributor Information
Adina Kalet, Phone: 212 263 1137, Email: Adina.Kalet@nyumc.org.
Sondra Zabar, Email: Sondra.Zabar@nyumc.org.
Demian Szyld, Email: dszyld@gmail.com.
Steven D Yavner, Email: syavner1@gmail.com.
Hyuksoon Song, Email: hyuksong@gmail.com.
Michael W Nick, Email: mwn209@nyu.edu.
Grace Ng, Email: Grace.Ng@nyumc.org.
Martin V Pusic, Email: Martin.Pusic@nyumc.org.
Christine Denicola, Email: Christine.Denicola@nyumc.org.
Cary Blum, Email: Cary.Blum@nyumc.org.
Kinga L Eliasz, Email: Kinga.Eliasz@nyumc.org.
Joey Nicholson, Email: Joseph.Nicholson@nyumc.org.
Thomas S Riles, Email: Thomas.Riles@nyumc.org.
References
1. Sozener CB, Lypson ML, House JB, Hopson LR, Dooley-Hash SL, Hauff S, et al. Reporting achievement of medical student milestones to residency program directors: an educational handover. Acad Med. 2016;91:676–684. doi: 10.1097/ACM.0000000000000953.
2. Minter RM, Amos KD, Bentz ML, Blair PG, Brandt C, D'Cunha J, et al. Transition to surgical residency: a multi-institutional study of perceived intern preparedness and the effect of a formal residency preparatory course in the fourth year of medical school. Acad Med. 2015;90:1116–1124. doi: 10.1097/ACM.0000000000000680.
3. Teunissen PW, Westerman M. Opportunity or threat: the ambiguity of the consequences of transitions in medical education. Med Educ. 2011;45:51–59. doi: 10.1111/j.1365-2923.2010.03755.x.
4. Petrilli CM, Del Valle J, Chopra V. Why July matters. Acad Med. 2016;91:910–912. doi: 10.1097/ACM.0000000000001196.
5. Goren EN, Leizman DS, La Rochelle J, Kogan JR. Overnight hospital experiences for medical students: results of the 2014 clerkship directors in internal medicine national survey. J Gen Intern Med. 2015;30:1245–1250. doi: 10.1007/s11606-015-3405-4.
6. Frischknecht AC, Boehler ML, Schwind CJ, Brunsvold ME, Gruppen LD, Brenner MJ, et al. How prepared are your interns to take calls? Results of a multi-institutional study of simulated pages to prepare medical students for surgery internship. Am J Surg. 2014;208:307–315. doi: 10.1016/j.amjsurg.2014.01.014.
7. Schwind CJ, Boehler ML, Markwell SJ, Williams RG, Brenner MJ. Use of simulated pages to prepare medical students for internship and improve patient safety. Acad Med. 2011;86:77–84. doi: 10.1097/ACM.0b013e3181ff9893.
8. Sachdeva AK, Loiacono LA, Amiel GE, Blair PG, Friedman M, Roslyn JJ. Variability in the clinical skills of residents entering training programs in surgery. Surgery. 1995;118:300–308. doi: 10.1016/S0039-6060(05)80338-1.
9. Lyss-Lerman P, Teherani A, Aagaard E, Loeser H, Cooke M, Harper GM. What training is needed in the fourth year of medical school? Views of residency program directors. Acad Med. 2009;84:823–829. doi: 10.1097/ACM.0b013e3181a82426.
10. Antonoff MB, Swanson JA, Green CA, Mann BD, Maddaus MA, D'Cunha J. The significant impact of a competency-based preparatory course for senior medical students entering surgical residency. Acad Med. 2012;87:308–319. doi: 10.1097/ACM.0b013e318244bc71.
11. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system: rationale and benefits. N Engl J Med. 2012;366:1051–1056. doi: 10.1056/NEJMsr1200117.
12. Englander R, Flynn T, Call S, Carraccio C, Cleary L, Fulton TB, et al. Toward defining the foundation of the MD degree: core entrustable professional activities for entering residency. Acad Med. 2016;91:1352–1358. doi: 10.1097/ACM.0000000000001204.
13. Santen SA, Rademacher N, Heron SL, Khandelwal S, Hauff S, Hopson L. How competent are emergency medicine interns for level 1 milestones: who is responsible? Acad Emerg Med. 2013;20:736–739. doi: 10.1111/acem.12162.
14. Warm EJ, Englander R, Pereira A, Barach P. Improving learner handovers in medical education. Acad Med. 2017;92(7):927–931. doi: 10.1097/ACM.0000000000001457.
15. Barsuk JH, Cohen ER, Wayne DB, Siddall VJ, McGaghie WC. Developing a simulation-based mastery learning curriculum: lessons from 11 years of advanced cardiac life support. Simul Healthc. 2016;11:52–59. doi: 10.1097/SIH.0000000000000120.
16. Cohen ER, Feinglass J, Barsuk JH, Barnard C, O'Donnell A, McGaghie WC, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102. doi: 10.1097/SIH.0b013e3181bc8304.
17. Cleland J, Patey R, Thomas I, Walker K, O'Connor P, Russ S. Supporting transitions in medical career pathways: the role of simulation-based education. Adv Simul. 2016;1:14. doi: 10.1186/s41077-016-0015-0.
18. Hallas D, Biesecker B, Brennan M, Newland JA, Haber J. Evaluation of the clinical hour requirement and attainment of core clinical competencies by nurse practitioner students. J Am Acad Nurse Pract. 2012;24:544–553. doi: 10.1111/j.1745-7599.2012.00730.x.
19. Ker J, Mole L, Bradley P. Early introduction to interprofessional learning: a simulated ward environment. Med Educ. 2003;37:248–255. doi: 10.1046/j.1365-2923.2003.01439.x.
20. Thomas I, Nicol L, Regan L, Cleland J, Maliepaard D, Clark L, et al. Driven to distraction: a prospective controlled study of a simulated ward round experience to improve patient safety teaching for medical students. BMJ Qual Saf. 2015;24:154–161. doi: 10.1136/bmjqs-2014-003272.
21. Zabar S, Hanley K, Altshuler L, Wallach A, Porter B, Fox J, et al. Do clinical skills assessed in OSCEs transfer to the real world of clinical practice? Using unannounced standardized patient visits to assess transfer. Acad Med. In press.
22. Zabar S, Adams J, Kurland S, Shaker-Brown A, Porter B, Horlick M, et al. Charting a key competency domain: understanding resident physician interprofessional collaboration (IPC) skills. J Gen Intern Med. 2016;31:846–853. doi: 10.1007/s11606-016-3690-6.
23. Zabar S, Kachur E, Kalet A, Hanley K. Objective structured clinical examinations: 10 steps to planning and implementing OSCEs and other standardized patient exercises. Springer Science & Business Media; 2012.
24. Hodges B. Validity and the OSCE. Med Teach. 2003;25:250–254. doi: 10.1080/01421590310001002836.
25. Szyld D, Unquillas K, Green B, Yavner S, Song HS, Nick M, et al. Improving the clinical skills of near-graduating medical students using a blended computer and simulation based approach. Simul Healthc. In press.
26. Yavner SD, Pusic MV, Kalet AL, Song HS, Hopkins MA, Nick MW, et al. Twelve tips for improving the effectiveness of web-based multimedia instruction for clinical learners. Med Teach. 2015;37:239–244. doi: 10.3109/0142159X.2014.933202.
27. Stalmeijer RE. When I say… cognitive apprenticeship. Med Educ. 2015;49:355–356. doi: 10.1111/medu.12630.
28. The Web Initiative for Surgical Education (WISE). WISE-OnCall web-based e-learning modules. Available at: http://www.wisemed.org/wise-oncall-e-learning-page. Accessed 30 June 2017.
29. Berger AJ, Gillespie CC, Tewksbury LR, Overstreet IM, Tsai MC, Kalet AL, et al. Assessment of medical student clinical reasoning by "lay" vs physician raters: inter-rater reliability using a scoring guide in a multidisciplinary objective structured clinical examination. Am J Surg. 2012;203:81–86. doi: 10.1016/j.amjsurg.2011.08.003.
30. Stevens DL, King D, Laponis R, Hanley K, Zabar S, Kalet AL, et al. Medical students retain pain assessment and management skills long after an experiential curriculum: a controlled study. Pain. 2009;145:319–324. doi: 10.1016/j.pain.2009.06.030.
31. Association of American Medical Colleges (AAMC). Core entrustable professional activities for entering residency: curriculum developers' guide. Available at: https://members.aamc.org/eweb/upload/core%20EPA%20Curriculum%20Dev%20Guide.pdf. Accessed 30 June 2017.
32. Farnan JM, Paro JA, Rodriguez RM, Reddy ST, Horwitz LI, Johnson JK, et al. Hand-off education and evaluation: piloting the observed simulated hand-off experience (OSHE). J Gen Intern Med. 2010;25:129–134. doi: 10.1007/s11606-009-1170-y.
33. Ng G, Pimentel S, Szyld D, Kalet A. Towards entrusting medical students: recognising safety behaviours. Med Educ. 2016;50:569–570. doi: 10.1111/medu.13028.
34. The New York Simulation Center (NYSIM). Available at: http://www.nysimcenter.org/. Accessed 29 June 2017.
35. Farnan JM, Gaffney S, Poston JT, Slawinski K, Cappaert M, Kamin B, et al. Patient safety room of horrors: a novel method to assess medical students and entering residents' ability to identify hazards of hospitalisation. BMJ Qual Saf. 2016;25:153–158. doi: 10.1136/bmjqs-2015-004621.
36. Wald D, Peet A, Cripe J, Kinloch M. A simulated night on call experience for graduating medical students. MedEdPORTAL. Available at: https://www.mededportal.org/publication/10483. Accessed 27 July 2017.
37. van der Vleuten C, Sluijsmans D, Joosten-ten Brinke D. Competence assessment as learner support in education. In: Mulder M, editor. Competence-based vocational and professional education: bridging the worlds of work and education. Cham: Springer International Publishing; 2017. pp. 607–630.
38. Kalet A, Ark T, Eliasz KL, Nick M, Ng G, Szyld D, et al. A simulated night on call (NOC): assessing the entrustment of near graduating medical students from multiple perspectives. J Gen Intern Med. 2017;32:102–103.
39. ten Cate O. Entrustment decisions: bringing the patient into the assessment equation. Acad Med. 2017;92(6):736–738. doi: 10.1097/ACM.0000000000001623.
40. Cutrer WB, Miller B, Pusic MV, Mejicano G, Mangrulkar RS, Gruppen LD, et al. Fostering the development of master adaptive learners: a conceptual model to guide skill acquisition in medical education. Acad Med. 2017;92:70–75. doi: 10.1097/ACM.0000000000001323.
41. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T, et al. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32(8):631–637. doi: 10.3109/0142159X.2010.500898.
42. Emanuel EJ, Fuchs VR. Shortening medical training by 30%. JAMA. 2012;307(11):1143–1144.
43. Cangiarella J, Gillespie C, Shea JA, Morrison G, Abramson SB. Accelerating medical education: a survey of deans and program directors. Med Educ Online. 2016;21:31794. doi: 10.3402/meo.v21.31794.