ABSTRACT
Context
Bedside clinical teaching is the backbone of clerkship education. Data-driven methods for supplementing bedside encounters with standardized content from vetted resources are needed.
Objective
To compare flipped-classroom versus interactive online-only instruction for improving knowledge, skills, self-directed learning (SDL) behaviors, and satisfaction in a medical school clerkship.
Methods
An IRB-approved prospective study employing a peer-reviewed clinical reasoning curriculum in neurology was conducted; 2nd-4th year medical students rotating through a required clerkship were enrolled. Students were randomized to flipped-classroom (i.e., flipped) or interactive asynchronous online instruction (i.e., online-only), which supplemented existing bedside teaching. Baseline and end-of-course knowledge, skill development, SDL behaviors, satisfaction, and long-term retention were assessed by peer-reviewed clinical reasoning exam, NBME scores, faculty/resident clinical evaluations, non-compulsory assignment completion, end-of-clerkship surveys, and objective structured clinical exam (OSCE).
Results
104 students (49 flipped, 55 online-only) were enrolled. Age, gender, and training level did not differ by group (all p > 0.43); baseline knowledge was higher in the flipped group (p = 0.002). Knowledge-based exam scores did not differ by group even after adjusting for differences in baseline knowledge (2.3 points higher in the online-only group, 95% CI −0.4 to 4.8, p = 0.07). Clinical skills ratings were significantly higher in the flipped group, including examination skills (4.2 ± 0.5 vs. 3.9 ± 0.7, p = 0.03) and future housestaff potential (4.8 ± 0.3 vs. 4.5 ± 0.5, p = 0.03). Students in the online-only group were more likely to engage in SDL (44% vs. 12%, p < 0.001) and reported more hours studying (6.1 vs. 3.8 hours, p = 0.03). Satisfaction (p = 0.51) and OSCE scores (p = 0.28) did not differ by group.
Conclusions
In this comparative study of two evidence-based curricular delivery approaches, we observed no difference in knowledge acquired. Greater clinical skills were observed with flipped instruction, while more SDL was observed with online-only instruction. Supplementing bedside teaching with blended instruction that balances live skill development with vetted online resources is optimal for clerkship education.
KEYWORDS: flipped classroom, neurology clerkship, clinical reasoning, online education, curriculum
Introduction
Today’s learners have been described as possessing an inherent interest in, understanding of, and appreciation for technology and social connection. They have witnessed an explosion in the use of e-technology and mobile devices in daily life. In medical education, a similar exponential increase has occurred in web-based technologies, virtual learning platforms, and social media [1,2]. When given the option, fewer than one-third of medical students attend class in person [3]. This trend has been accelerated by the emergence of the COVID-19 pandemic, which imposed the need for distance learning in accordance with social distancing regulations [4]. As medical educators were forced to embrace virtual learning environments, many found new expertise and confidence in online education [5]. Despite these trends, a paucity of outcomes-based research exists on how to incorporate these technologies into optimal clinical teaching approaches [6].
Clinical instruction at the bedside has been the bedrock of medical school clerkship education for decades [7–10]. Observed bedside encounters allow students to hone history and examination skills, apply knowledge, and learn from faculty mentors. Prior studies have identified challenges to bedside instruction, including increasing demands on faculty time, high patient turnover, concerns over violation of patient privacy, inability to standardize exposure in clinical settings, and a need for vetting online resources [7,11]. How to optimally address these challenges, supplement existing bedside teaching, and incorporate new technologies continues to be actively investigated [12].
In this study, we aimed to assess the comparative effectiveness of two evidence-based methods for instruction on clinical reasoning: an interactive online-only approach or a live hands-on flipped classroom method [13,14]. Clinical reasoning is fundamental to medical practice and represents a core skill for all healthcare providers [15,16]. It is also an essential component of at least five of the recently published 13 core medical student entrustable professional activities (EPAs) [17]. Clinical reasoning requires not only patient exposure but also the synthesis of clinical data following an encounter, development of a differential diagnosis and management plan, and reassessment as new information is acquired after an encounter. This topic is ideal for evaluating how to use technology to supplement bedside encounters.
Materials and methods
Setting
A prospective randomized comparative effectiveness study was designed (Supplemental Figure 1). All 2nd–4th-year medical students rotating through a 4-week clerkship at a single academic institution were enrolled during the 2014–2015 academic year. The neurology core clerkship was selected because it is required and has a pre-existing peer-reviewed curriculum for teaching and assessing clinical reasoning [13,14].
Participants
All rotating students (i.e., five blocks of students) were enrolled. At our institution, each block is 8 weeks long, during which medical students spend four weeks in the neurology clerkship and four weeks in the psychiatry clerkship. For each block, twenty students are randomly assigned by the institutional registrar to either the psychiatry or the neurology clerkship for the first month, with equal numbers of students in each clerkship; the groups then alternate to complete the other clerkship in the second month. Thus, each block consists of two groups of students: group 1 rotates through neurology in the first four weeks of the block, while group 2 rotates through neurology in the second four weeks. Each group is then randomized to one of two methods of instruction: interactive online-only learning (i.e., online-only) or a live, hands-on, flipped classroom method with synchronous in-person learning and asynchronous online components (i.e., flipped; detailed description below). For example, the group 1 cohort may be randomly assigned to online-only instruction in the first month of the block while the group 2 cohort receives flipped instruction in the second month. This two-layer randomization process was repeated with each subsequent block of students.
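The two-layer assignment described above can be sketched in a few lines of code. This is an illustrative simulation only, not the registrar's actual procedure; the function name and student labels are invented for the example.

```python
import random

def randomize_block(students, rng):
    """Sketch of the two-layer randomization described above.

    Layer 1: a 20-student block is split evenly into group 1
    (neurology first) and group 2 (neurology second).
    Layer 2: the two instruction methods are allocated at random
    between the two groups.
    """
    shuffled = rng.sample(students, len(students))  # random even split
    half = len(students) // 2
    group1, group2 = shuffled[:half], shuffled[half:]
    methods = ["flipped", "online-only"]
    rng.shuffle(methods)  # which group gets which method is random
    return {methods[0]: group1, methods[1]: group2}

rng = random.Random(0)  # seeded for reproducibility of the example
block = randomize_block([f"student{i:02d}" for i in range(20)], rng)
```

Repeating `randomize_block` for each of the five blocks reproduces the balanced group sizes reported in Table 1.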
Teaching Methods
Both curricula incorporated equivalent instruction time, compulsory assignments, and duration of required study. Flipped instruction occurred at discrete pre-specified times while online-only instruction was distributed across the clerkship – a critical planned difference between the instructional strategies. All students received in-person training in the neurologic examination and bedside teaching on the wards and in the clinic. The primary outcomes were student knowledge as assessed by the National Board of Medical Examiners (NBME) exam, clinical reasoning as assessed by a study-specific exam, clinical skills as assessed by faculty/resident clinical evaluations, self-directed learning as tracked by independent assignment completion, and student satisfaction. The study was reviewed and approved by the local institutional review board.
A previously published, peer-reviewed, case-based curriculum for teaching clinical reasoning in neurology was employed in both methods of instruction [13,14]. The flipped model incorporated an evidence-based, flipped-classroom approach consisting of two pre-classroom videos (length: 60 minutes total), which were viewed asynchronously, followed by two 60-minute synchronous small-group discussions (total required student time: 3 hours; total in-class faculty time: 2 hours) [14]. During the first discussion (i.e., Activity 1), the course instructor (R.Strowd) reviewed 1-of-6 standardized patient vignettes using a think-aloud approach to clinical reasoning instruction [16,18]. Students were recommended to complete the remaining five vignettes on their own (i.e., self-directed study). During the second discussion (i.e., Activity 2), self-directed completion of these patient vignettes was tracked, and the course instructor reviewed one of the remaining five vignettes using the same method.
The online-only model was delivered via a best-practice platform developed by the Johns Hopkins School of Education Center for Technology in Education, which has been used to provide distance learning in elementary to post-graduate classrooms nationwide [19]. The same curriculum was employed via this web-based platform to ensure standardization of content, number of patient vignettes completed, the opportunity for self-directed case completion, and total required curricular time so that differences in achievement would not be related to differences in required time or amount of content reviewed. The curriculum included four activities; two initial pre-course videos were delivered asynchronously followed by two online learning activities with each assignment released and completed each week of the 4-week clerkship rotation (total required student time: 3 hours). Activity 1 was completed in week one, during which students independently reviewed 1-of-the-6 patient vignettes (see above) and posted their write up to a web-based discussion board using a study-specific template. Activity 2 was completed in week two during which students peer reviewed a fellow student’s write-up and generated group discussion about the clinical reasoning process. Students were encouraged to complete any remaining patient vignettes independently which was tracked at clerkship completion. The instructor facilitated learning by monitoring online course participation and completion. The instructor also provided asynchronous but timely feedback and online discussion with students.
Measurement Instruments
Student knowledge was assessed by the NBME exam, an accepted objective measure, completed at clerkship end. Baseline clinical reasoning was assessed by a study-specific multiple-choice examination consisting of five questions, each with four answer choices and a single best answer. Final end-of-study clinical reasoning performance was assessed by a study-specific multiple-choice examination consisting of thirty questions, each with four answer choices and a single best answer. Both pre- and post-examinations were developed using standardized questions obtained with permission from the peer-reviewed American Academy of Neurology Continuum® publication. Questions were screened by the study PI (R.Strowd) and selected by the Neurology Clerkship Course Director Team (C.G. & R.Salas) based on (1) relevance to clinical reasoning and (2) appropriateness for the training level of a senior medical student. For both the pre-test and post-test, multiple-choice questions were adapted from a five-answer-choice to a four-answer-choice format to better align with the level of a senior medical student. We piloted the exams with two rotation groups prior to the academic year in which the study was implemented to ensure clarity of wording and an appropriate level of difficulty. Clinical skills were assessed by directly observed faculty/resident clinical evaluations, completed in real time throughout the clerkship using a 5-point Likert scale (1 = strongly disagree to 5 = strongly agree). Four clinical skill domains pertaining to clinical reasoning were assessed (data gathering, neurologic exam, problem solving, and patient presentation), along with a previously validated measure of overall potential as future housestaff [20]. To avoid biased evaluations, none of the course teachers served as evaluators of student clinical skills. Self-directed learning was tracked as above and assessed as the number of non-compulsory patient vignettes completed.
Student satisfaction was assessed on an end-of-course evaluation (created by R.Strowd, C.G., and R.Salas) using a 5-point Likert scale (1 = strongly dissatisfied, 5 = strongly satisfied; Supplemental Table 2). Long-term retention was assessed by an objective structured clinical exam (OSCE) performed approximately one year after clerkship completion. The in-person OSCE was designed to mirror the format of the Step 2 Clinical Skills Exam, in which students complete standardized patient encounters at multiple stations; one station is focused on neurology, and the grade is based on students’ written responses to post-station questions. Student learning preference was assessed by two non-validated questions evaluating preference for studying (i.e., alone vs. group) and for learning environment (i.e., in class vs. out of class; Supplemental Table 1).
Recruitment and Data Collection
All 2nd-4th year medical students during their four-week Neurology Core Clerkship rotation were recruited for this study. Demographic (e.g., age, gender, medical school year) and academic data (NBME scores, baseline and final clinical reasoning exam scores, clinical evaluation ratings, student satisfaction ratings, and OSCE scores) were collected from the Neurology Core Clerkship Office during one academic year (2014–15).
Data Management and Analysis
Statistical analysis was performed using Stata/IC v13.1 (StataCorp, College Station, TX, © 2014). Descriptive statistics were computed for the entire population and by intervention group (i.e., flipped vs. online-only). Unpaired Student’s t-tests were used to compare continuous variables; chi-square and Fisher’s exact tests were used for categorical variables. Baseline and follow-up clinical reasoning exam, NBME exam, and satisfaction scores were compared between groups by unpaired Student’s t-test. Univariable linear regression was used to identify variables significantly associated with differences in NBME shelf exam and clinical reasoning examination performance by intervention group. Multivariable linear regression was performed to account for potential differences in baseline knowledge by adjusting for baseline exam performance. Assumptions of linear regression (linearity, homoscedasticity, independence, and normality) were checked. Effect sizes were calculated using Cohen’s d for comparisons of means by intervention group and eta-squared (η2) for estimating variance explained when controlling for baseline exam performance. Statistical significance was pre-defined as p < 0.05.
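As a concrete illustration of the effect-size measure named above, a minimal stdlib-only sketch of Cohen's d with the pooled standard deviation follows. The data shown are illustrative, not the study's raw scores, and `cohens_d` is a name introduced for the example.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: standardized mean difference using the pooled
    (sample-size-weighted) standard deviation of the two groups."""
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)  # sample SDs (n - 1 denominator)
    pooled_sd = (((na - 1) * sa ** 2 + (nb - 1) * sb ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Illustrative data only: two small groups whose means differ by 1 unit
d = cohens_d([1, 2, 3, 4], [2, 3, 4, 5])
```

By convention, |d| near 0.2 is a small effect, 0.5 medium, and 0.8 large, which frames the d values of 0.18 to 0.73 reported in the Results.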
Results
Over one year, 104 students were enrolled in the course; 49 were randomized to flipped instruction and 55 to online-only instruction. Gender, medical school year, and student studying preference did not differ by intervention group (Table 1). The number of students randomly assigned to each group did not differ by intervention group. Baseline clinical reasoning exam performance was higher in the flipped group (mean score 28.2 ± 11) than in the online-only group (21.5 ± 10, p = 0.002).
Table 1.
| | All (n = 104) | Flipped (n = 49) | Online-Only (n = 55) | P-Value |
|---|---|---|---|---|
| Age (years, mean, sd)* | 26.6 (2.8) | 26.5 (3.2) | 26.7 (2.5) | 0.79 |
| Gender (n, % male) | 49 (47%) | 25 (51%) | 24 (44%) | 0.56 |
| Med School Year (n, %) | | | | 0.73 |
| ● MS2 | 20 (19%) | 9 (18) | 11 (20) | |
| ● MS3 | 64 (62%) | 32 (66) | 32 (58) | |
| ● MS4 | 20 (19%) | 8 (16) | 12 (22) | |
| Rotation Block (n, %) | | | | 0.89 |
| ● Rotation Block 1 | 26 (25%) | 13 (27) | 13 (24) | |
| ● Rotation Block 2 | 15 (14%) | 7 (14) | 8 (15) | |
| ● Rotation Block 3 | 20 (19%) | 10 (20) | 10 (17) | |
| ● Rotation Block 4 | 26 (25%) | 13 (27) | 13 (24) | |
| ● Rotation Block 5 | 17 (17%) | 6 (12) | 11 (20) | |
| Studying Preference (n, %) | | | | 0.51 |
| ● Prefer Alone | 76 (73%) | 34 (69) | 42 (76) | |
| ● Prefer Group | 28 (27%) | 15 (31) | 13 (24) | |
| Classroom Preference (n, %) | | | | 0.43 |
| ● Prefer In Class | 48 (46%) | 25 (51) | 23 (42) | |
| ● Prefer Out of Class | 56 (54%) | 24 (49) | 32 (58) | |
| Baseline Clinical Reasoning Exam Score (mean, sd) | 24.6 (11) | 28.2 (11) | 21.5 (10) | 0.002 |

Note: *Age data available for 102 students (49 flipped, 53 online-only). Two-sample t-test used for age and baseline clinical reasoning exam score. Fisher’s exact test used for all other variables. Statistical significance defined as p < 0.05.
Overall, mean NBME exam performance was 78.7 ± 6.5 (Table 2); 79.2 ± 6.8 for the online-only group and 78.1 ± 6.3 for the flipped group (p = 0.37; effect size, d = 0.18, 95% CI: −0.21, 0.56). Given the differences in baseline clinical reasoning scores by intervention group, a multivariable model adjusting for baseline clinical reasoning exam score was performed. In this adjusted model, mean NBME exam scores were 2.3 points higher (95% CI −0.4, 4.8) in the online-only group, though this difference was not statistically significant (p = 0.07; η2 = 0.09, 95% CI 0.007, 0.199; Table 3). The mean post-course clinical reasoning exam score was 208 ± 27 (69.4% correct) and did not differ by intervention group (p = 0.32). After adjusting for baseline score, mean clinical reasoning performance was 7.9 points higher in the online-only group (95% CI −3.1, 19; one question worth 10 points), though this difference was also not statistically significant (p = 0.16). Long-term retention of clinical reasoning skills as assessed by OSCE scores did not differ by group (7.8 flipped vs. 8.1 online-only, p = 0.28).
Table 2.
| | All (n = 104) | Flipped (n = 49) | Online-Only (n = 55) | P-Value |
|---|---|---|---|---|
| STUDENT PERFORMANCE | | | | |
| ● Final Exam (mean, sd) | 208 (27) | 206 (27) | 211 (27) | 0.32 |
| ● Shelf (mean, sd) | 78.7 (6.5) | 78.1 (6.3) | 79.2 (6.8) | 0.37 |
| CLINICAL SKILLS | | | | |
| ● Data Gathering (mean, sd) | 4.4 (0.6) | 4.4 (0.7) | 4.4 (0.6) | 0.57 |
| ● Neurologic Exam (mean, sd) | 4.1 (0.7) | 4.2 (0.5) | 3.9 (0.7) | 0.03 |
| ● Problem Solving (mean, sd) | 4.3 (0.5) | 4.3 (0.5) | 4.3 (0.5) | 0.37 |
| ● Patient Presentation (mean, sd) | 4.3 (0.8) | 4.3 (0.8) | 4.2 (0.7) | 0.23 |
| ● Housestaff Potential (mean, sd) | 4.7 (0.5) | 4.8 (0.3) | 4.5 (0.5) | 0.03 |
| SELF-DIRECTED LEARNING | | | | |
| ● Non-compulsory Assignments (mean, sd) | 0.9 (1.8) | 0.4 (1.3) | 1.3 (2.1) | 0.01 |
| ● Percent Completing at least 1 Non-compulsory Assignment (n, %) | 30 (29%) | 6 (12%) | 24 (44%) | <0.001 |
| ● Time Committed to Course (mean hrs, sd)a | 5.1 (4.5) | 3.8 (5.1) | 6.1 (3.8) | 0.03 |
| SATISFACTION | | | | |
| ● Student Satisfaction (mean, sd) | 3.9 (0.7) | 3.9 (0.8) | 3.8 (0.7) | 0.51 |

Notes: aTime committed to course based on data for n = 77 (35 flipped, 42 online-only) due to electronic error in survey distribution to the first group of student rotators (equal randomization). Fisher’s exact test used for percent completing non-compulsory assignments. Two-sample t-test used for remaining variables. Statistical significance defined as p < 0.05.
Table 3.
| | Unadjusted Coefficient (95% CI) | Unadjusted p-value | Adjusted Coefficient (95% CI) | Adjusted p-value |
|---|---|---|---|---|
| Randomization (Online-Only vs. Flipped; n = 104) | 1.2 (−1.4, 3.7) | 0.37 | 2.3 (−0.4, 4.8) | 0.07 |
| Baseline Examination Score (n = 104) | 0.16 (0.04, 0.27) | <0.01 | 0.19 (0.07, 0.31) | <0.01 |
Caption: Results of univariable and multivariable linear regression on NBME shelf examination scores. Compared to the flipped group, the online-only group scored 1.2 points higher on the NBME shelf exam (p = 0.37); when adjusting for baseline clinical reasoning examination scores, the online-only group scored 2.3 points higher on the NBME shelf exam (p = 0.07, η2 = 0.09, 95% CI 0.007, 0.199).
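The baseline-adjusted comparison above is an ordinary least-squares fit with an intercept, a group indicator, and the baseline score as predictors. A stdlib-only sketch of that kind of fit follows; the data are synthetic and `ols_fit` is a name introduced for illustration, not the study's Stata code.

```python
def ols_fit(X, y):
    """Ordinary least squares: solve the normal equations (X'X) b = X'y
    by Gaussian elimination with partial pivoting."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         for r in range(k)]                      # X'X
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]  # X'y
    for col in range(k):                          # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k                              # back substitution
    for r in range(k - 1, -1, -1):
        s = sum(A[r][c] * coef[c] for c in range(r + 1, k))
        coef[r] = (b[r] - s) / A[r][r]
    return coef

# Synthetic rows of (group_indicator, baseline_score); outcome is an
# exact linear function, so the fit recovers the true coefficients.
data = [(0, 10), (0, 20), (0, 30), (1, 12), (1, 22), (1, 35)]
X = [[1.0, g, x] for g, x in data]               # intercept, group, baseline
y = [2 + 3 * g + 0.5 * x for g, x in data]
coef = ols_fit(X, y)
```

Here `coef[1]` plays the role of the adjusted group coefficient in Table 3: the mean outcome difference between groups at a fixed baseline score.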
Faculty/resident rating of student clinical skills for all participants was generally high, including for data gathering (4.4 ± 0.6), neurologic exam (4.1 ± 0.7), problem solving (4.3 ± 0.5), patient presentation (4.3 ± 0.8), and future housestaff potential (4.7 ± 0.5, all out of 5). There were no differences in clinical skills ratings by intervention group for data gathering (p = 0.57), problem solving (p = 0.37), or patient presentation (p = 0.23). Mean rating was significantly lower in the online-only group for neurologic examination skills (3.9 ± 0.7 online-only vs. 4.2 ± 0.5 flipped, p = 0.03, d = 0.45) and housestaff potential (4.5 ± 0.5 vs. 4.8 ± 0.3, p = 0.03, d = 0.42).
SDL was uncommon overall, with 30 students (29%) completing at least one non-compulsory assignment. Students in the online-only group were significantly more likely to have completed at least one non-compulsory assignment (44% vs. 12%, p < 0.001, d = 0.73). On average, students in the online-only group completed 1.3 ± 2.1 non-compulsory assignments compared to 0.4 ± 1.3 in the flipped group (p = 0.01, d = 0.50). Despite a 3-hour time requirement that was provided and protected during the clerkship for course activities, the flipped group reported spending an average of 3.8 ± 5.1 hours on course-related activities (including non-compulsory and other activity time), while students in the online-only group reported significantly more time (6.1 ± 3.8 hours, p = 0.03, d = 0.51).
Overall, satisfaction with the course was high, with an average rating of 3.9 ± 0.7 (out of 5) that did not differ by intervention group (p = 0.51, Table 2). For the flipped group, mean satisfaction did not differ between students who self-identified as in-class learners (mean 3.9 ± 0.7) and outside-class learners (3.9 ± 0.8, p = 0.86, Supplemental Table 3). However, for the online-only group, mean satisfaction was significantly lower for in-class compared to outside-class learners (mean 3.6 ± 0.7 vs. 4.0 ± 0.6, p = 0.04, d = 0.60).
Student satisfaction with the setup of the web-based platform was high (Supplemental Table 4). The majority of students agreed or strongly agreed that the platform itself supported their learning (n = 38, 73%), was easy to navigate (n = 35, 67%), incorporated helpful content (n = 33, 63%), that peer feedback enhanced learning (n = 34, 65%), and instructor feedback enhanced learning (n = 45, 87%). Students indicated that the platform provided an opportunity for peer learning (n = 35, 67%), to reflect on content (n = 36, 69%), and to process new ideas (n = 32, 62%) more than to collaborate (n = 22, 42%), apply concepts to practice (n = 19, 37%), or create community (n = 9, 17%).
Discussion
In this comparative effectiveness study of two evidence-based approaches to curricular delivery in the medical school clerkship, we dissect the impact of these best-practice strategies for teaching today’s modern learners. No differences in performance on a peer-reviewed multiple-choice clinical reasoning knowledge examination or NBME scores were observed. Differential effects on clinical skills (favoring flipped instruction) and self-directed learning (favoring online-only instruction) were demonstrated. Student satisfaction was high with both methods of instruction. These data provide outcome-derived support for supplementing bedside teaching with blended methods of curricular delivery and suggest that combining these approaches may optimally target clinical skill development as well as the promotion of lifelong learning. Blended learning methods may be superior to traditional didactics: the addition of online learning technology to live instruction can lead to knowledge gains by giving students repeated exposure to content over multiple occurrences, as opposed to a single didactic, and by increasing learners’ engagement [21,22].
Many studies have evaluated the impact of a heterogeneous approach to online instruction in medical education [23–28]. In a 2014 meta-analysis of 59 studies of online learning, six randomized studies were identified in medical student education, all of which showed positive effects on knowledge, with one showing improved skills and student confidence [23–25,27,28]. Though described as online learning, the curricula used in all of these studies added assigned online learning modules to standard live curricula and compared them with traditional didactic instruction. Such designs can introduce systematic differences in the time spent on content, which may account for the aforementioned results. The flipped instruction in the current study is a blended learning model integrating technology and face-to-face instruction, similar to the aforementioned studies, but with standardization of time spent on required material. Furthermore, it represents a true flipped classroom model with advantages over traditional didactics alone, including increased opportunities for collaborative learning and competency-based education practices [29]. Studies show high student satisfaction with flipped classrooms [30], and one head-to-head meta-analysis of the flipped classroom versus traditional didactics showed a larger effect size favoring the flipped classroom [31]. As such, the current study adds to the literature on optimal instructional strategies for teaching undergraduate medical students in general and provides a critical window into the use of these strategies in neurology and for the teaching of clinical reasoning and localization.
Each generation of learners presents different characteristics, traits, preferences, and challenges for educators. Much has been written of today’s modern learners as digital natives who report high rates of satisfaction with online learning [3–5,21,22]. One recent study of ninety fourth-year medical students reported that the identified pros of online learning outweighed the cons by 3:1, but the authors go on to acknowledge that online learning cannot replace the psychomotor skills training and mentoring that occur in person [32]. Similarly, in our current study, scores on the neurologic exam and ratings of housestaff potential were lower in the online-only group, supporting the notion that the neurologic examination must be physically practiced and that the relationships developed during in-person practice help inform assessment of potential as future housestaff. A 2019 meta-analysis comparing online versus in-person learning demonstrated that online learning is equal or superior to in-person instruction [33]. The authors postulate that noninteractive online learning is no better than traditional didactic teaching but can promote increased self-directed learning. That analysis also suggests that online learning may only be effective for simple low-order learning objectives, but it did not differentiate between knowledge outcomes and skills outcomes. Our study demonstrates the effective use of flipped classroom instruction to target the higher-order thinking skills required for clinical reasoning [34]. Effective online instruction (synchronous and asynchronous) also requires teaching skills which may differ from those used during in-person instruction [35]. In the time of COVID-19, educators have now developed guides to facilitate faculty development and ‘good online teaching practices’ that include recommendations on ensuring active engagement, promoting self-directed learning, and providing timely feedback [36].
Our present study adds to the literature by implementing an online curriculum that is consistent with these postulated competencies for instructors [36]. Effective and validated online curricula can be challenging and time-consuming to create for medical educators who may already have many clinical, research, and educational demands on their time. Yet that time burden need only be felt once. Our experience with the online-only curriculum shows that, once implemented, faculty facilitation of an online course is less arduous than repeatedly conducting traditional in-person didactics. This allows for the sustainability of such a curriculum; indeed, the clinical reasoning in neurology course has remained a well-received core instructional component of the Neurology Core Clerkship at Johns Hopkins since its inception.
In the current study in neurology, knowledge acquisition did not differ by method of curricular delivery. Of note, baseline knowledge was significantly higher in the flipped group, and there was a trend toward greater standardized examination scores in the online-only group at study end. These findings suggest that there may have been greater absolute knowledge gained by students who received online-only instruction. However, given the high-stakes nature of standardized testing in the clinical clerkships, the lack of a significant difference in scores at study end also suggests that motivation to study and perform on these important tests is high and may be a more important driver of performance than the method of curricular delivery [37].
Interestingly, clinical skill development and SDL tendencies were differentially impacted by the type of instruction. While online-only learning was strongly associated with high rates of SDL, flipped instruction was more strongly associated with greater faculty-rated clinical skills. The benefit of in-person instruction on clinical skill development is not surprising. Importantly, all students in the current study received in-person training in the techniques of the neurologic examination. Instruction in how to apply these exam findings to the patient differed by the method of delivery. These data suggest that the opportunity to engage interactively with a faculty educator and discuss applying physical exam findings to patient vignettes may influence clinical skills more than web-based methods.
SDL is a critical component of modern medical education and is important in promoting the development of lifelong learning skills [3,38,39]. Motivation and satisfaction with online education depend on building the student’s sense of community, peer support, time management skills, and ability to communicate frequently with the instructor, and these factors should be considered when creating any type of online curriculum [38]. The online learning platform utilized in this study was easy to use and accessible, and it allowed activities and learning to be distributed over the course of the entire clerkship. In contrast to the flipped classroom live sessions, which were compacted into single 60-minute ‘chunks’ of instructional time, online-only activities were distributed over the 4-week clerkship, allowing students greater opportunity and time for completing assignments, engaging with material, and generating motivation. The platform was easily accessed through web and mobile devices. Such features may have led to the association between the online method of delivery and higher SDL and greater reported time on material. Students rotating through the neurology clerkship continuously engage with a wide number of resources for studying and learning. Thus, while the absolute time spent in this course differed, these data do not confirm that time spent studying neurology was different. Importantly, this platform allowed for standardization of the content provided to students and a vetting process for the resources from which students studied during their clerkship.
This study was performed within a single clerkship at a single academic institution, which limits the generalizability of these data to other educational settings and environments. Despite efforts to achieve adequate randomization, important differences were observed in baseline clinical reasoning exam scores. Whether this reflects a true difference in baseline knowledge or a difference in testing scores is not clear, given that the 5-question baseline exam was not previously validated. It is also worth noting that the high baseline performance in the flipped group may have created a ceiling effect that confounds this study’s ability to show improvement after the curricular intervention. It is possible that the online-only group, with its lower baseline score, improved to match the flipped group, resulting in similar post-exam scores. Adjusting for the differences in baseline clinical reasoning exam scores did show that the online-only group attained higher absolute scores on both NBME and clinical reasoning exams, but not high enough to reach the threshold of significance. Future studies will incorporate validated baseline clinical reasoning knowledge examinations. Randomization is difficult with medical education interventions, though alternative strategies for randomizing students could be considered in the future. Clinical skills were assessed by resident and faculty evaluators who were not informed of the instructional method of each group of students and were not involved in the curricular delivery; however, complete blinding to randomization cannot be ensured. Future studies should be powered either as a larger, multi-institution study of these curricular interventions in neurology or as a larger, single-institution study that spans multiple specialties.
Conclusion
Educators have suggested that today’s learners are inherently different from those of prior generations and require unique instructional strategies that account for their deep appreciation of technology. In this study, we did not find systematic differences in performance between the online-only and flipped-classroom models of instruction. These contemporary, evidence-based methods of teaching in the clerkship had a similar impact on knowledge, clinical reasoning, and satisfaction. Better clinical skills were associated with hands-on instruction from clinical faculty who demonstrated examination techniques and findings. Greater self-directed learning behaviors were observed with online-only instruction, which distributed learning across the 4-week clerkship and provided standardized, vetted educational resources to students at any location and time. Students indicated value in both approaches. These data are important for medical educators who are challenged to supplement bedside encounters despite limited resources, time constraints, and difficulty selecting optimal methods of delivering content to their learners.
Acknowledgments
The authors would like to thank Neurology Continuum® for providing the peer-reviewed multiple-choice examination questions. The authors would also like to thank the Johns Hopkins School of Education Center for Technology in Education for supporting the development, maintenance, and hosting of the web-based platform (the Electronic Learning Community, JHU Office of Technology Transfer, Reference number Nunn1605, February 2002) utilized in this study.
Funding Statement
This study was supported by funding from the American Academy of Neurology Institute.
Disclosure Statement
Dr. Paul has no conflicts to disclose. Dr. Leung has no conflicts to disclose. Dr. Salas has entered into an agreement with UpToDate, Inc. and has been paid royalties. She has also received royalties from sales of the MySleep101 iPad app; she has received less than $75. Ms. Krum has no conflicts to disclose. Dr. Saylor has no conflicts to disclose. Dr. Abras has no conflicts to disclose. Ms. Gugliucciello has no conflicts to disclose. Dr. Nunn has no conflicts to disclose. Dr. Gamaldo has entered into an agreement with UpToDate, Inc. and has been paid royalties for her contribution of medical articles to this publication; she has received less than $400. She has also received royalties from sales of the MySleep101 iPad app; she has received less than $75. Dr. Strowd serves as a consultant for Monteris Medical. He receives an editorial stipend from the American Academy of Neurology. He has received research/grant support from the American Academy of Neurology, American Society for Clinical Oncology, Southeastern Brain Tumor Foundation, Jazz Pharmaceuticals, and the International Association for Medical Science Educators.
Author Contributions
Dr. Paul contributed to analysis and drafting the manuscript and has approved this final version.
Dr. Leung contributed to analysis and drafting the manuscript and has approved this final version.
Dr. Salas contributed to the study design, analysis, drafting, and supervision of the manuscript and has approved this final version.
Ms. Cruz contributed to the study design, data collection/curation, project administration, and drafting the manuscript and has approved this final version.
Dr. Abras contributed to the study design, analysis, and drafting of the manuscript and has approved this final version.
Dr. Saylor contributed to the study design, analysis, and drafting of the manuscript and has approved this final version.
Ms. Gugliucciello contributed to the online instructional design and development of the web-based platform and has approved this final version.
Dr. Nunn contributed to the web-based platform’s online instructional design and development and has approved this final version.
Dr. Gamaldo contributed to the study design, analysis, drafting and supervision of the manuscript and has approved this final version.
Dr. Strowd contributed to the study design, methodology, conceptualization, analysis, and drafting the manuscript and has approved this final version.
Data Availability Statement
Due to the nature of this research, participants of this study did not agree for their data to be shared publicly, so supporting data are not available.
Supplemental data
Supplemental data for this article can be accessed online at https://doi.org/10.1080/10872981.2022.2142358.
References
- [1]. Johnson TR. Virtual patient simulations and optimal social learning context: a replication of an aptitude–treatment interaction effect. Med Teach. 2014;36:486–489.
- [2]. Cheston CC, Flickinger TE, Chisolm MS. Social media use in medical education: a systematic review. Acad Med. 2013;88:893–901.
- [3]. Emanuel EJ. The inevitable reimagining of medical education. JAMA. 2020;323:1127.
- [4]. Rose S. Medical student education in the time of COVID-19. JAMA. 2020;323:2131.
- [5]. Rajab MH, Gazal AM, Alkattan K. Challenges to online medical education during the COVID-19 pandemic. Cureus. 2020. DOI: 10.7759/cureus.8966
- [6]. Sutherland S, Jalali A. Social media as an open-learning resource in medical education: current perspectives. Adv Med Educ Pract. 2017;8:369–375.
- [7]. Peters M, Ten Cate O. Bedside teaching in medical education: a literature review. Perspect Med Educ. 2014;3:76–88.
- [8]. Cooper D, Beswick W, Whelan G. Intensive bedside teaching of physical examination to medical undergraduates: evaluation including the effect of group size. Med Educ. 1983;17:311–315.
- [9]. Verghese A, Brady E, Kapur CC, et al. The bedside evaluation: ritual and reason. Ann Intern Med. 2011;155:550.
- [10]. Ramani S. Twelve tips to improve bedside teaching. Med Teach. 2003;25:112–115.
- [11]. Quail M, Brundage SB, Spitalnick J, et al. Student self-reported communication skills, knowledge and confidence across standardised patient, virtual and traditional clinical learning environments. BMC Med Educ. 2016;16:73.
- [12]. Yavner SD, Pusic MV, Kalet AL, et al. Twelve tips for improving the effectiveness of web-based multimedia instruction for clinical learners. Med Teach. 2015;37:239–244.
- [13]. Strowd R, Kwan A, Cruz T, et al. A guide to developing clinical reasoning skills in neurology: a focus on medical students. MedEdPORTAL. 2015. DOI: 10.15766/mep_2374-8265.10163
- [14]. Strowd R, Gamaldo C, Kwan A, et al. Flipping the switch: the feasibility of a “think aloud” flipped-classroom approach to clinical reasoning instruction. J Contemp Med Educ. 2016;4:40.
- [15]. Arocha JF, Wang D, Patel VL. Identifying reasoning strategies in medical decision making: a methodological guide. J Biomed Inform. 2005;38:154–171.
- [16]. Patel VL, Groen GJ. Knowledge based solution strategies in medical reasoning. Cogn Sci. 1986;10:91–116.
- [17]. Flynn T. Core Entrustable Professional Activities for Entering Residency: curriculum developers’ guide. Assoc Am Med Coll. 2014:1–114.
- [18]. Simmons B, Lanuza D, Fonteyn M, et al. Clinical reasoning in experienced nurses. West J Nurs Res. 2003;25:701–719.
- [19]. Abras CN. Information technology in the medical school curriculum. Md Med Soc. 2012;13:11–12.
- [20]. Gamaldo CE, Gamaldo AA, Strowd RE, et al. Applying lean thinking: the assessment of professional conduct of medical students. Creat Educ. 2016;7:861–869.
- [21]. Khalil MK, Abdel Meguid EM, Elkhider IA. Teaching of anatomical sciences: a blended learning approach. Clin Anat. 2018;31:323–329.
- [22]. Wilson AB, Brown KM, Misch J, et al. Breaking with tradition: a scoping meta-analysis analyzing the effects of student-centered learning and computer-aided instruction on student performance in anatomy. Anat Sci Educ. 2019;12:61–73.
- [23]. Ochoa JG, Wludyka P. Randomized comparison between traditional and traditional plus interactive web-based methods for teaching seizure disorders. Teach Learn Med. 2008;20:114–117.
- [24]. Lipman AJ, Sade RM, Glotzbach AL, et al. The incremental value of internet-based instruction as an adjunct to classroom instruction: a prospective randomized study. Acad Med. 2001;76:1060–1064.
- [25]. Raupach T, Münscher C, Pukrop T, et al. Significant increase in factual knowledge with web-assisted problem-based learning as part of an undergraduate cardio-respiratory curriculum. Adv Health Sci Educ. 2010;15:349–356.
- [26]. Subramanian A, Timberlake M, Mittakanti H, et al. Novel educational approach for medical students: improved retention rates using interactive medical software compared with traditional lecture-based format. J Surg Educ. 2012;69:253–256.
- [27]. Truncali A, Lee JD, Ark TK, et al. Teaching physicians to address unhealthy alcohol use: a randomized controlled trial assessing the effect of a web-based module on medical student performance. J Subst Abuse Treat. 2011;40:203–213.
- [28]. Ricks C, Ratnapalan S, Jain S, et al. Evaluating computer-assisted learning for common pediatric emergency procedures. Pediatr Emerg Care. 2008;24:284–286.
- [29]. Hurtubise L, Hall E, Sheridan L, et al. The flipped classroom in medical education: engaging students to build competency. J Med Educ Curric Dev. 2015;2:JMECD.S23895.
- [30]. Ramnanan C, Pound L. Advances in medical education and practice: student perceptions of the flipped classroom. Adv Med Educ Pract. 2017;8:63–73.
- [31]. Hew KF, Lo CK. Flipped classroom improves student learning in health professions education: a meta-analysis. BMC Med Educ. 2018;18:38.
- [32]. Torda A. How COVID-19 has pushed us into a medical education revolution. Intern Med J. 2020;50:1150–1153.
- [33]. Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019;24:1666538.
- [34]. Krathwohl DR. A revision of Bloom’s taxonomy: an overview. Theory Pract. 2002;41:212–218.
- [35]. Chickering AW, Ehrmann SC. Implementing the seven principles: technology as lever. Am Acad High Educ Bull. 1996;49:3–7.
- [36]. Saiyad S, Virk A, Mahajan R, et al. Online teaching in medical training: establishing good online teaching practices from cumulative experience. Int J Appl Basic Med Res. 2020;10:149.
- [37]. Worm BS, Buch SV. Does competition work as a motivating factor in e-learning? A randomized controlled trial. PLoS ONE. 2014;9:e85434.
- [38]. Hart C. Factors associated with student persistence in an online program of study: a review of the literature. J Interact Online Learn. 2012;11:19–42.
- [39]. Li S-TT, Tancredi DJ, Co JPT, et al. Factors associated with successful self-directed learning using individualized learning plans during pediatric residency. Acad Pediatr. 2010;10:124–130.