Medical Science Educator. 2023 Jun 23;33(4):913–924. doi: 10.1007/s40670-023-01816-w

Use of Active Learning During Emergency Remote Teaching in COVID-19 Pandemic

Giovanna Maria Gimenez Testa 1, Mariana Bueno de Oliveira Souza 2, Ângela Tavares Paes 2, Juliana Magdalon 2
PMCID: PMC10403487  PMID: 37546198

Abstract

The mandatory isolation caused by COVID-19 required the adoption of emergency remote teaching, which created difficulties for instructors, especially those who use active learning methods that depend on student participation in class. This study aimed to investigate the ability of instructors to apply active learning effectively in the pandemic context. It was a cross-sectional observational study carried out in an undergraduate medical school. The sample comprised one to three classes from each of 28 instructors, observed synchronously. Each class was analyzed using a form adapted from the PORTAAL tool, designed to quantitatively evaluate elements essential to active learning. The mean times devoted to activities and to active student participation were 54.8% and 33.1% of total class time, respectively. Among the time spent on student interactions, intra-group interactions demanded the highest percentage of class time. Additionally, 22.0% of the activities were at a high level of Bloom’s taxonomy, and there was a positive correlation between the percentage of activities at higher Bloom levels and the percentage of class time with student participation, intra-group interactions, or between-group interactions, supporting the use of higher-order cognitive skills in a collaborative and student-centered context. In conclusion, our findings indicate that some instructors were able to apply elements essential to active and collaborative learning even during emergency remote teaching.

Keywords: Active learning, Flipped classroom, Emergency remote teaching, Medical education, Small-group discussion

Introduction

Active learning methods are a set of classroom strategies that seek to center learning on the student [1]. These methods are more efficient for learning, as demonstrated in a meta-analysis that included higher education courses across science, technology, engineering, and mathematics (STEM) [2]. They have also been shown to be effective in the training of health professionals, given that learning is no longer a memorization process but a critical-reflective process that depends on students’ participation [3]. Several courses have adopted methods such as the flipped classroom, in which students study individually before class and apply the newly acquired knowledge in class using higher-order thinking through active and collaborative activities [4]. A meta-analysis demonstrated that the flipped classroom increases learning among healthcare students compared with traditional classes [5]. For the flipped classroom to work effectively, instructors must properly choose both the pre-class study material and in-class activities that encourage student participation and target high levels of Bloom’s taxonomy, such as application and analysis of data [6, 7]. Although instructors recognize the academic value of the flipped classroom, its use requires the development of new skills, and instructors commonly have difficulty understanding, in practice, how to use active learning methods effectively [8]. This difficulty is reflected, for example, in the predominance of lecture-based classes in most STEM courses in the USA [9].

The COVID-19 pandemic created the need for emergency remote teaching. The adaptations made in this context differ from what was done online before the pandemic and, especially, from the face-to-face model [10], which ended up accelerating the use of technology and the implementation of new methodologies. During social distancing, synchronous and asynchronous resources were used in a complementary way in an attempt to overcome barriers to socialization [11]. In this context, the flipped classroom became one of the methods that provided a more complete learning process, promoting effective use of technology as well as developing critical thinking, communication, and collaborative skills [12, 13].

It is still controversial whether emergency remote teaching provided an appropriate environment and useful methodology for learning, especially in higher education. Two meta-analyses comparing online with face-to-face teaching in healthcare courses before the COVID-19 pandemic showed that online learning was as efficient as the face-to-face process [14, 15]. However, it is important to note that, in these studies, the methodological resources of the courses were designed to be online, which changes the structure and perhaps the efficiency of remote learning. On the other hand, courses that had to move online on an emergency basis, i.e., that were not initially designed to be online, showed worse student performance due to several factors, such as problems with internet access and electronic devices, lack of faculty knowledge about digital resources and strategies, and little interaction between students and faculty [16–20]. These studies were based on student questionnaires, however, which may limit conclusions about their effect on learning. Analyses of students’ test scores before and during the COVID-19 pandemic showed that students performed better or equally well during the emergency remote period [21–23]. These analyses raise the question of whether the higher scores reflected effective learning during remote teaching or other factors, such as a lack of academic integrity while taking online tests.

It is uncertain whether the performance of instructors and students during remote classes could remain at least as effective for learning as before the pandemic, particularly in courses that use active learning methods, which often rely on constant engagement with students. Therefore, this study aimed to investigate the ability of instructors to apply active learning methods in remote classes during the COVID-19 pandemic, and to describe quantitatively the strategies used by them.

Materials and Methods

Definition of Participants and General Study Planning

This study was performed at the medical school of Faculdade Israelita de Ciências da Saúde Albert Einstein, Brazil, and was approved by the Institutional Research Ethics Committee (CAAE: 27353219.5.0000.0071). The research included undergraduate medical students and instructors from courses prior to the internship, spanning the first to the seventh semester. Each of these courses uses flipped classroom and active learning methods. Classes include 50 to 60 students, randomly divided into 8 or 9 groups that are maintained throughout an entire semester. All instructors were trained in active learning methods when hired by the institution. Other activities regarding these methodologies have been offered by the faculty development program over time: training programs, lectures, and feedback after class observation or student evaluation, which happens twice a semester.

Four instructors from each semester before the internship were selected randomly, totaling 28 instructors. This represented 47% of the total number of instructors during the study period. The drawing process was carried out on a digital platform from a list of all the instructors, separating them by semester and giving priority to the main instructors of the courses.

The selected instructors were then sent e-mail messages containing a brief explanation of the project and asking whether they were interested in participating in the research. If they were, a link to the informed consent form was sent. In cases of refusal, or if an instructor could not continue in the research, new draws were made until the pre-established minimum of 4 instructors per semester was reached.

The study design included the observation and analysis of one to three 2-h synchronous online classes, via the Zoom videoconferencing application, of each of the instructors during two consecutive semesters of classes (between September 2020 and June 2021).

Observation of Classes

The process of organizing the observations began by acquiring the teaching plans of each subject to organize the schedule. Classes were selected that did not use the team-based learning (TBL) method and that coincided with the observers’ available time slots, prioritizing classes taught exclusively by the participating instructor. Classes using TBL were excluded from the analysis because this method has a pre-determined structure and could not be compared fairly with classes in which instructors are free to plan the activities. If there was no instructor exclusivity in a given subject, the classes with the least participation of guest instructors were chosen. The instructors were not informed in advance which classes would be observed, to avoid possible bias in their teaching practice.

Each class was recorded by one of two observers using a form adapted from the “Practical Observation Rubric To Assess Active Learning” (PORTAAL) tool [24]. The purpose of this tool is to quantitatively evaluate elements essential to the proper conduct of classes with active learning. Adaptations to the original PORTAAL tool were necessary because it was developed to evaluate video-recorded classes; however, video-recording classes was not permitted at the institution of the present study, and it was not feasible to register all the tool’s variables in synchronous live classes. Therefore, the research group discussed which variables could be registered, and some adaptations of the tool were made.

Before data collection, both observers completed the entire PORTAAL training manual, including the exercises to categorize questions into Bloom’s taxonomy levels, which was previously shown to promote a high degree of agreement between different observers and, therefore, to support reliable use of the PORTAAL tool [24]. Subsequently, both observers applied the adapted PORTAAL tool to a single class as a pilot study. The analyses were compared, and differences between observers were noticed in the total number of activities and in the start times of some debriefs within an activity. The divergent criteria were therefore discussed and corrected to avoid discrepancies between observers during the study. Other variables, such as Bloom’s taxonomy level, type and duration of interactions, number of activities with at least one question directed towards specific groups or students, and number of activities with explicit positive feedback by the instructor, did not differ between observers.

From the variables in the PORTAAL tool, a list of variables to be observed in this study was created (Table 1). Variables regarding class, activity, and interaction time are not part of any dimension or element of the original PORTAAL tool; however, they were used to calculate other variables. Lecturing time, open interaction time, number of questions from students, different resources used during class (such as oral questions, questions projected on a slide with or without an associated case, applications or websites for polls, and Zoom breakout rooms), and guest instructor participation were not present in the original PORTAAL tool, but they were considered important by the research group and easy to register, so they were included.

Table 1.

Observed variables during the classes and their respective definitions. Variables that are present in the original PORTAAL tool are defined with respect to their dimension and element

Variables Definition PORTAAL dimension and element
Total class time (minutes) Difference between the end time and the starting time of the class -
Total time in lecturing (minutes) Total time the instructor speaks only, i.e., without interaction with students, which could be out of an activity or during the introduction or debrief of an activity -
Percentage of class time in lecturing Total time in lecturing divided by the total class time multiplied by 100 -
Total number of activities in class Total number of activities counted during the class, always started by the instructor asking a question -
Total time in activity (minutes) Sum of the time spent in all the activities carried out during the class, calculated by the difference between the end time and the start time of each activity -
Percentage of class time in activities Total time in activities divided by the total class time multiplied by 100 -
Total number of interactions in class Total number of interactions counted during the class, which could be open, individual, intra-group, or between groups -
Total time in interactions (minutes) Sum of the time spent in all the interactions during the proposed activities, calculated by the difference between the end time and the start time of each interaction -
Percentage of class time in interactions Total time of interactions divided by the total class time multiplied by 100 -
Total time with student participation (minutes) Sum of the total time in interactions with the total time spent in activity debrief when included student participation Dimension 1—P1
Percentage of class time with student participation Total time with student participation divided by the total class time multiplied by 100 Dimension 1—P1
Number of activities with at least one open interaction Total number of activities with the presence of at least one open interaction (question addressed to the whole class to be answered individually and orally) -
Open interaction time (minutes) Sum of the time spent in all the open interactions during the proposed activities, calculated by the difference between the end time and the start time of each interaction classified as open -
Percentage of class time with open interactions Open interaction time divided by the total class time multiplied by 100 -
Number of activities with at least one individual interaction Total number of activities with the presence of at least one individual interaction (question(s) addressed to the whole class to be answered individually by a site or application) Dimension 2—L3
Individual interaction time (minutes) Sum of the time spent in all the individual interactions during the proposed activities, calculated by the difference between the end time and the start time of each interaction classified as individual Dimension 2—L3
Percentage of class time with individual interactions Individual interaction time divided by the total class time multiplied by 100 Dimension 2—L3
Number of activities with at least one intra-group interaction Total number of activities with the presence of at least one intra-group interaction (question(s) to be discussed in small groups) Dimension 2—L4; Dimension 3—A2
Intra-group interaction time (minutes) Sum of the time spent in all the intra-group interactions during the proposed activities, calculated by the difference between the end time and the start time of each interaction classified as intra-group Dimension 2—L4; Dimension 3—A2
Percentage of class time with intra-group interactions Intra-group interaction time divided by the total class time multiplied by 100 Dimension 2—L4; Dimension 3—A2

Number of activities with at least one between-group interaction Total number of activities with the presence of at least one between-group interaction (discussion between different groups) Dimension 2—L6
Between-group interaction time (minutes) Sum of the time spent in all the between-group interactions during the proposed activities, calculated by the difference between the end time and the start time of each interaction classified as between groups Dimension 2—L6
Percentage of class time with between-group interactions Between-group interaction time divided by the total class time multiplied by 100 Dimension 2—L6
Number of activities with a high level in Bloom's taxonomy Total number of activities counted during the class classified as high level in Bloom’s taxonomy Dimension 2—L1
Percentage of activities with high level in Bloom's taxonomy Total number of high-level activities in Bloom’s taxonomy divided by the total number of activities multiplied by 100 Dimension 2—L1
Number of activities with explicit positive feedback by the instructor Total number of activities counted during the class where the instructor stimulated students to participate, such as through praise or encouragement Dimension 4—R2, R3
Percentage of activities with the explicit positive feedback by the instructor Total number of activities with the presence of explicit positive feedback by the instructor divided by the total number of activities multiplied by 100 Dimension 4—R2, R3
Number of activities with explanation of correct answer Total number of activities counted during the class in which there was an explanation of the correct answer to the question by the instructor during the debrief Dimension 2—L7
Percentage of activities with explanation of correct answer Total number of activities with explanation of the correct answer divided by the total number of activities multiplied by 100 Dimension 2—L7
Number of activities with at least one question directed towards specific groups or students Total number of activities counted during the class where the instructor asked for the answer from a specific group or student, regardless of the stage of the activity (interactions or debrief) Dimension 3—A3; Dimension 4—R1
Percentage of activities with at least one question directed towards specific groups or students Total number of activities with at least one question addressed to specific groups or students divided by the total number of activities multiplied by 100 Dimension 3—A3; Dimension 4—R1

Number of questions students asked during the class Total number of questions students asked during the class -
Number of different resources used in class Total number of different resources that were used during the classes (e.g., oral question, question projected on slide associated or not to a case, different applications or websites for polls, Zoom breakout rooms, etc.) -
Guest instructor participation time (minutes) Sum of the time spent in all the guest instructor’s participation moments, calculated by the difference between the end time and the start time of each participation moment -
Percentage of class time with participation of guest instructors Guest instructor participation time divided by the total class time multiplied by 100 -
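Every derived "percentage" variable in Table 1 follows the same arithmetic: a measured time (or count) divided by a class-level total, multiplied by 100. A minimal bookkeeping sketch, using hypothetical field names rather than the authors' actual data schema:

```python
# Hypothetical record of one observed class; the field names are
# illustrative, not the authors' data schema. The minute values echo
# the mean values later reported in Table 2.
class_record = {
    "total_class_min": 113.0,
    "lecturing_min": 74.5,
    "interaction_min": {"open": 1.9, "individual": 8.7,
                        "intra_group": 15.4, "between_group": 4.2},
}

def pct_of_class(minutes: float, total: float) -> float:
    """The pattern behind every 'percentage of class time' row: time / total * 100."""
    return minutes / total * 100

total = class_record["total_class_min"]
pct_lecturing = pct_of_class(class_record["lecturing_min"], total)
pct_by_type = {kind: pct_of_class(t, total)
               for kind, t in class_record["interaction_min"].items()}
```

With these example values, lecturing accounts for roughly 65.9% of class time and intra-group interactions for roughly 13.6%, on the order of the means reported in the Results.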

As defined by the PORTAAL tool, the introduction of each activity started when the instructor asked a question. The question could be posed through an online application, a knowledge-application case projected on a slide, a question projected on a slide without an associated case, or an oral question that was not written anywhere. During the activity, the option “open interaction” was marked when the instructor asked a question to the whole class and waited for someone to answer; “individual interaction” when the instructor proposed an activity that everyone should answer individually (such as quizzes in applications); “intra-group interaction” when the instructor proposed an activity whose answer should be discussed in small groups; and “between-group interaction” when the instructor moderated a discussion between different groups. Importantly, activities were classified by Bloom’s taxonomy level as low when the question(s) required knowledge or comprehension and as high when the question(s) required application, analysis, synthesis, or evaluation, as thoroughly explained and practiced in the PORTAAL training manual [24]. Student responses were classified as “voluntary” when the instructor asked a question to the whole class and a student voluntarily answered; “specific group” when the instructor called on a group to answer; and “specific student” when the instructor called on a single student to answer (cold call). The activity ended when a conclusive answer was given or when a new question was asked, thus starting another activity. The activity debrief could include the instructor explaining the answer alone or together with the students, which was also registered.
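The coding rules above can be summarized in a small sketch; the enumeration and helper function are illustrative constructs of ours, not part of the PORTAAL tool itself:

```python
from enum import Enum

class Interaction(Enum):
    """The four interaction types marked by the observers."""
    OPEN = "question to the whole class, answered orally by a volunteer"
    INDIVIDUAL = "answered individually via a site or application"
    INTRA_GROUP = "answer discussed within a small group"
    BETWEEN_GROUP = "instructor-moderated discussion between groups"

# Bloom classification rule as described in the text: knowledge and
# comprehension count as low level; application, analysis, synthesis,
# and evaluation count as high level.
LOW_BLOOM = {"knowledge", "comprehension"}
HIGH_BLOOM = {"application", "analysis", "synthesis", "evaluation"}

def bloom_level(cognitive_process: str) -> str:
    if cognitive_process in HIGH_BLOOM:
        return "high"
    if cognitive_process in LOW_BLOOM:
        return "low"
    raise ValueError(f"not a Bloom category: {cognitive_process!r}")
```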

Once the form had been established, observation of each instructor’s qualifying classes began. At the beginning of each class, the observer announced her presence to students and instructors via Zoom chat, with pre-established messages designed not to influence any individual’s participation.

Data Analysis

Data were summarized as means, medians, standard deviations, 25th and 75th percentiles, and minimum and maximum values. The distribution of class time by modality was described with a pie chart. Box plots were used to show the distributions of observed classes with regard to the percentage of class time spent on activities, interactions, student participation, and lecturing. Scatterplots and Spearman’s rho correlation coefficients were computed to analyze relationships between pairs of observed variables. The R version 4.0.5 and JAMOVI version 2.3.18 statistical packages were used for data analysis.
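The analysis itself was run in R and JAMOVI; purely to illustrate the statistic, Spearman's rho is the Pearson correlation computed on rank-transformed data, which can be written out in a few lines (the sample values below are invented, not the study's data):

```python
from statistics import mean

def ranks(xs):
    """Average 1-based ranks, assigning tied values their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank-transformed data."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Invented per-class percentages for illustration only
participation = [0.3, 17.5, 34.8, 48.0, 71.9]  # % class time, student participation
intra_group = [0.0, 5.0, 12.0, 22.7, 34.7]     # % class time, intra-group interactions
rho = spearman_rho(participation, intra_group)
```

Because rho depends only on ranks, it captures any monotonic association, which suits percentage variables with skewed distributions like those in Tables 2 and 3.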

Results

Description of Classes

A total of 58 classes were observed, i.e., one to three synchronous remote classes from each of 28 instructors, who taught, on average, 2.47 different courses before the internship. Most of the courses were in the field of biological sciences (basic and clinical), but there were also courses in leadership and the humanities. Although the expected duration of each class was 120 min, the classes were, on average, 113 min long. The shortest class lasted 49.8 min and the longest 168 min (Table 2). The observed variables are also presented as values relative to total class time (Table 2).

Table 2.

Descriptive statistics for the total number, and total and relative time of activities and interactions during the classes (N = number of classes)

Variables N Mean Median Standard deviation 25th percentile 75th percentile Minimum Maximum
Total class time (minutes) 58 113.0 115.0 18.0 107.0 123.0 49.8 168.0
Total time in lecturing (minutes) 58 74.5 75.5 21.7 56.8 89.3 34.7 115.0
Percentage of class time in lecturing 58 66.9 65.2 19.8 52.0 82.5 28.1 99.7
Total number of activities in class 58 12.1 11.0 6.5 7.0 17.0 1.0 29.0
Total time in activity (minutes) 58 62.9 68.2 31.4 35.3 83.1 0.4 124.0
Percentage of class time in activities 58 54.0 60.0 24.8 34.4 68.6 0.7 98.2
Total number of interactions in class 58 20.1 18.0 11.1 12.0 26.0 1.0 53.0
Total time in interactions (minutes) 58 30.2 30.7 19.0 14.5 43.9 0.2 73.8
Percentage of class time in interactions 58 25.8 28.3 15.6 12.6 38.0 0.3 58.3
Total time with student participation (minutes) 58 38.9 40.0 25.4 18.5 55.6 0.2 117.0
Percentage of class time with student participation 58 33.1 34.8 19.8 17.5 48.0 0.3 71.9
Number of activities with at least one open interaction 58 2.3 1.0 4.3 0.0 2.0 0.0 22.0
Open interaction time (minutes) 58 1.9 0.04 5.4 0.0 0.5 0.0 33.3
Percentage of class time with open interactions 58 1.6 4.1 4.6 0.0 0.4 0.0 27.1
Number of activities with at least one individual interaction 58 7.8 7.0 6.2 3.0 11.8 0.0 24.0
Individual interaction time (minutes) 58 8.7 4.3 10.6 1.1 12.1 0.0 45.0
Percentage of class time with individual interaction 58 7.4 3.6 8.8 1.15 9.7 0.0 35.0
Number of activities with at least one intra-group interaction 58 1.7 2.0 1.5 0.0 3.0 0.0 5.0
Intra-group interaction time (minutes) 58 15.4 13.3 14.2 0.0 24.3 0.0 53.6
Percentage of class time with intra-group interaction 58 13.1 12.0 11.3 0.0 22.7 0.0 34.7
Number of activities with at least one between-group interaction 58 2.5 2.0 2.5 0.0 4.0 0.0 10.0
Between-group interaction time (minutes) 58 4.2 2.7 4.9 0.0 6.15 0.0 19.0
Percentage of class time with between-group interactions 58 3.7 2.3 4.5 0.0 5.15 0.0 17.0

The classes had an average of 12.1 instructor-posed questions, each of which started an activity. However, there was great variation among classes, from one with only one activity to another with 29 (Table 2). The mean time spent on activities was 54.8% of total class time, while the mean time spent on active student participation was 33.1% (Table 2, Figs. 1 and 2). In addition, the time spent on lecturing, which included the instructor’s speaking time both within and outside the proposed class activities, ranged from 28.1 to 99.7% of total class time, with an average of 66.9% (Table 2, Figs. 1 and 2). The mean number of interactions throughout a class was 20.1, making up 25.8% of total class time. Among interaction types, intra-group interactions demanded the highest percentage of class time (13.1%), followed by individual (7.44%), between-group (3.68%), and open interactions (1.63%) (Table 2). It is also notable that, on average, 29.9% of the activities presented at least one question directed towards a specific group or student, that is, an answer not spontaneously initiated by the students (Table 3).

Fig. 1.


Mean distribution of the time dedicated to each of the modalities for a total of 58 classes

Fig. 2.


Percentage of class time spent on activities, interactions, student participation, and lecturing for a total of 58 classes

Table 3.

Characteristics of the activities performed during the classes (N = number of classes)

Variables N Mean Median Standard deviation 25th percentile 75th percentile Minimum Maximum
Number of activities with high level in Bloom’s taxonomy 58 2.4 2.0 3.1 0.0 3.0 0.0 17.0
Percentage of activities with high level in Bloom’s taxonomy 58 22.0 16.2 22.6 0.0 36.5 0.0 80.0
Number of activities with explicit positive feedback by the instructor 58 8.6 7.0 5.1 5.0 12.0 0.0 22.0
Percentage of activities with explicit positive feedback by the instructor 58 71.8 74.5 22.2 60.0 87.3 0.0 100.0
Number of activities with explanation of correct answer 58 9.2 8.0 5.3 5.0 12.8 1.0 22.0
Percentage of activities with explanation of correct answer 58 80.2 80.5 18.8 70.6 1.0 28.6 100.0
Number of activities with at least one question directed towards specific groups or students 58 3.5 3.0 3.7 1.0 5.0 0.0 17.0
Percentage of activities with at least one question directed towards specific groups or students 58 29.9 25.0 27.0 43.6 45.2 0.0 94.4
Number of questions students asked during the class 58 14.1 12.0 10.9 7.0 18.0 1.0 52.0
Number of different resources used in class 58 4.9 5.0 1.6 4.0 6.0 2.0 8.0
Guest instructor participation time (minutes) 9 11.7 9.9 7.5 5.9 18.5 1.7 23.6
Percentage of class time with participation of guest instructors 9 10.1 8.6 6.5 5.6 13.2 1.4 19.9

We observed that 71.8% of the activities included explicit positive feedback from the instructor to the students (Table 3), either encouragement to participate or praise after an answer was given. Additionally, 80.2% of the activities had an explanation of the correct answer during the debrief (Table 3). Only 22.0% of the activities were at a high level of Bloom’s taxonomy (Table 3), and these were usually present during clinical case discussions. Still, the instructors used a diversity of resources, with an average of 4.95 different resources per class (Table 3). It is also worth mentioning that, of the 58 classes, 9 had guest instructors, who had an average speaking time of 11.7 min, corresponding to 10.1% of total class time (Table 3).

Associations Between Class Data

As expected, the percentage of class time with student participation correlated positively with the percentages of class time with activities and with interactions. Accordingly, the percentage of class time with student participation also correlated positively with the percentages of class time with intra-group and between-group interactions, given that these were the moments when students participated most actively. Regardless of these correlations, student participation occurred even in classes without intra-group and between-group interactions (Fig. 3).

Fig. 3.


Correlation analysis between percentage of class with student participation and percentage of class with activities (A), percentage of class with interactions (B), percentage of class with intra-group interactions (C), or percentage of class with between-group interactions (D)

Weaker correlations were found involving the percentage of activities at a high level of Bloom’s taxonomy. It correlated positively with the percentage of class time with activities, with student participation, and with intra-group or between-group interactions, whereas it correlated negatively with the percentage of class time with open interactions and did not correlate with the percentage with individual interactions (Fig. 4). Despite low correlation coefficients, these data suggest that higher Bloom levels are related to the formulation of complex reasoning with greater collaboration among students, which occurred mostly during clinical case discussions.

Fig. 4.


Correlation analysis between percentage of activities with higher Bloom levels and percentage of class with activities (A), percentage of class with student participation (B), percentage of class with open interactions (C), percentage of class with individual interactions (D), percentage of class with intra-group interactions (E), or percentage of class with between-group interactions (F)

The percentage of class time with intra-group interactions showed a positive correlation with the percentage of class time with between-group interactions (r = 0.603; p < 0.001). This was possibly explained by the fact that the former usually preceded the latter as part of the same activity aimed at a higher-order thinking skill. The percentage of activities with explicit positive feedback from the instructor showed a positive correlation with the percentage of class time with student participation (r = 0.326; p = 0.013), suggesting that instructor encouragement tended to increase students’ in-class participation.

Discussion

The sudden changes produced by the pandemic required a challenging adaptation to remote learning. The adjustment of faculty and students to such immediate changes made the online learning process even more complex. The present study addressed, by applying an adaptation of the PORTAAL tool [24], variables related to the application of active learning methods during remote classes. The goal of this research was to verify whether these methods could be used efficiently in classes abruptly adapted to mandatory isolation.

The PORTAAL tool is the most detailed tool regarding student participation and class dynamics, although it is also the most difficult to apply [25]. It was developed to evaluate recorded classes and was thus even more complex to use in live classes, as in our study; some modifications were therefore necessary so that it could be applied here. To the best of our knowledge, this is the first study to use this tool, although modified, to characterize the use of active learning across different classes. To assess whether the remote classes were similar to what has been described for face-to-face classes, our results were compared with the original article [24] as well as with studies using another tool that assesses active learning, the “Classroom Observation Protocol for Undergraduate STEM” (COPUS) [26], which records the frequency of various student and instructor behaviors every 2 min throughout an entire class. The original COPUS tool was subsequently collapsed into 4 student behaviors and 4 instructor behaviors, one of which is guiding [27], comparable to the time spent on activities in the PORTAAL tool [24], as it includes time spent asking questions, administering quizzes, moderating discussions, and explaining answers.

The variation observed in how active learning was applied across classes was not unique to this study nor to the emergency remote teaching period [24, 28, 29]. On average, active learning occupied 54% of class time, based on the presentation of 12 questions, on average, in each class. The average time devoted to guiding by instructors trained in active learning, assessed using the COPUS tool, was 58.4% [30], comparable to that found in the present study. In addition, we identified that 29.9% of the activities featured at least one question directed to a single student or group, similar to previous reports [24] and associated with better performance on tests [31]. Around 80% of the activities included an explanation of the correct answer in the debrief, also equivalent to the original study [24]; such explanations have been shown to be important for students' application of reasoning [32].

The online environment in remote classes can present several challenges, such as encouraging students to participate and contribute to the proposed class discussions [33]. In our study, however, the observed classes used an average of 33.1% (median 34.8%) of their time for active student participation, whereas the work in which the PORTAAL tool was developed reported a median participation of less than 6 min per 50 min of class, corresponding to active student participation during 12% of class time [24]. The median value found in our study was equivalent to that of the instructor indicated as a reference for active learning application in that study [24]. This may indicate that even remote classes can promote active student participation when effective strategies are used. This is quite relevant, since many students across different courses and countries reported a lack of motivation, engagement, and interest during emergency remote teaching [16, 17, 19, 20, 34–36].

In the present study, we identified that the lecture portion of the class, i.e., the instructors' exclusive speaking time, ranged from 28.1 to 99.74% of the class time, with an average of 66.9%. This finding is consistent with the range reported in previous studies [26, 28, 37]. There is no established definition of how much lecture would be expected in an active class, but it is possible to state that some classes in the present study used very little active learning while others used this methodology extensively. A study that used the COPUS tool to classify classes into different profiles found that the most student-centered classes, i.e., those making extensive use of group discussion and collaborative methods, present from 26 to 50% lecture, while those with a considerable amount of active learning activities present from 55 to 76% lecture [29].
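Under the thresholds cited from Lund et al. [29], classifying a class by its lecture fraction reduces to a simple binning rule. The function below is an illustrative sketch using those published ranges; the label names and the handling of values outside the ranges are our own assumptions:

```python
def lecture_profile(lecture_pct):
    """Bin a class by its lecture percentage, using the ranges cited in
    the text from Lund et al. [29]. Labels and the 'other' fallback for
    values outside those ranges are illustrative assumptions."""
    if 26 <= lecture_pct <= 50:
        return "student-centered"
    if 55 <= lecture_pct <= 76:
        return "considerable active learning"
    return "other"

# The study's mean lecture time (66.9%) would fall in the second bin:
profile_of_mean = lecture_profile(66.9)
```

By this rough rule, the classes observed here would span all three bins, matching the wide range (28.1 to 99.74%) reported above.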

Case-based learning is a methodology that integrates theoretical and practical knowledge through authentic cases with questions to be answered by students. The cases are presented to small groups of students, who discuss their solution using complex reasoning, thus promoting deeper knowledge construction through teamwork [38]. This methodology involves activities at a high level of Bloom's taxonomy, which in the present study represented an average of 22% of all activities, a value higher than in the original study [24] but still reflecting the difficulty of using such activities frequently. The divergence between the two studies may be related to class duration (2 h in our study vs 50 min in the other), since these activities demand more complex reasoning and consequently more time to execute. Shorter classes may therefore have a lower percentage of these activities because there is not enough time to perform them repeatedly.

It is also interesting to note that intra-group interactions, usually conducted in Zoom breakout rooms, were the type of interaction that demanded the most class time and, together with between-group interactions, occupied on average 17% of the total class time. A positive correlation was seen between activities at high Bloom's taxonomy levels and intra- and between-group interactions, supporting the idea that activities requiring more complex reasoning, such as the discussion of clinical cases, were first debated in small groups and later among groups. The positive correlations between the percentage of student participation during class and intra-group or between-group interactions reinforce the importance of these modalities in making the environment more active, particularly considering that these are the moments when students devote the most time to knowledge construction, which is ultimately reflected in better exam performance [37].

The success in active student participation during remote teaching at our institution can be attributed to several factors, among them the commitment of students and instructors to active learning methods even before the COVID-19 pandemic. Both groups were thus already familiar with the method and with some technological tools to apply it, as reflected in the wide variety of resources used in the classes. Indeed, active learning classes have been shown to feature a greater diversity of activities than lectures, such as questions, clickers, and group discussion rather than students merely listening to the instructor [26]. The use of multimodal activities and varied teaching strategies in class is a stimulating factor for learners [11], and students generally feel that remote teaching should be more interactive [35]. Consequently, the use of these tools in association with active learning may increase student engagement and performance during emergency remote teaching [39–41], as found in our study. Another factor that may explain student participation was the presence of explicit positive instructor feedback, which occurred in 71.8% of the activities performed and was positively correlated with student participation. Such encouragement can increase participation by making students feel valued and important while reducing their fear of making mistakes [42, 43]. Moreover, remote classes can facilitate the participation of shy students, who can use the chat or Zoom breakout rooms to engage in discussions, making the online environment less prone to exposure and intimidation [33, 35].

Our study has limitations, some of which may be addressed in future investigations. First, we were not able to provide specific examples of activities because the class sessions were not recorded; consequently, a more detailed description of the activities and a deeper understanding of their Bloom levels and relationship to interaction type may be limited. Future research could also collect qualitative data, such as student perceptions and experiences, to complement our quantitative findings and provide a more comprehensive understanding of active learning in emergency remote teaching. This could shed light on the effectiveness of different active learning strategies and help inform pedagogical practice in online teaching environments.

In conclusion, this study showed that faculty can effectively apply active learning methods even in challenging situations such as the emergency remote teaching imposed by the COVID-19 pandemic. Students actively participated during 33.1% of the total class time, and activities made up 54.8% of it. We also identified 22.0% of the activities as high level in Bloom's taxonomy, and there was a positive correlation between the percentage of high-level activities and the percentage of class time with student participation or with intra-group or between-group interactions. All these data are consistent with student-centered learning and complex reasoning, essential elements of active and collaborative learning.

Acknowledgements

We thank all the students and instructors who agreed to participate in this study as volunteers. We have not received any financial support.

Data Availability

The datasets generated and analyzed during the current study are not publicly available because they form part of research in progress, but are available from the corresponding author on reasonable request.

Declarations

Ethics Approval

The study was approved by the Research Ethics Committee of the Hospital Israelita Albert Einstein (CAAE: 27353219.5.0000.0071).

Competing Interests

The authors declare no competing interests.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Giovanna Maria Gimenez Testa, Email: gihtesta04@gmail.com.

Mariana Bueno de Oliveira Souza, Email: marianabuoliza@gmail.com.

Ângela Tavares Paes, Email: angela.tpaes@einstein.br.

Juliana Magdalon, Email: juliana.magdalon@einstein.br.

References

  • 1.Prince M. Does active learning work? A review of the research. J Eng Educ. 2004;93:223–231. doi: 10.1002/j.2168-9830.2004.tb00809.x. [DOI] [Google Scholar]
  • 2.Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, et al. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci U S A. 2014;111:8410–8415. doi: 10.1073/pnas.1319030111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Schwartzstein RM, Roberts DH. Saying goodbye to lectures in medical school — paradigm shift or passing fad? N Engl J Med. 2017;377:605–607. doi: 10.1056/NEJMp1706474. [DOI] [PubMed] [Google Scholar]
  • 4.Persky AM, McLaughlin JE. The flipped classroom – from theory to practice in health professional education. Am J Pharm Educ. 2017;81:118. doi: 10.5688/ajpe816118. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Hew KF, Lo CK. Flipped classroom improves student learning in health professions education: a meta-analysis. BMC Med Educ. 2018;18:1–12. doi: 10.1186/s12909-018-1144-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Kraut AS, Omron R, Caretta-Weyer H, Jordan J, Manthey D, Wolf SJ, et al. The flipped classroom: a critical appraisal. West J Emerg Med. 2019;20:527–536. doi: 10.5811/westjem.2019.2.40979. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Gilboy MB, Heinerichs S, Pazzaglia G. Enhancing student engagement using the flipped classroom. J Nutr Educ Behav. 2015;47:109–114. doi: 10.1016/j.jneb.2014.08.008. [DOI] [PubMed] [Google Scholar]
  • 8.O’Flaherty J, Phillips C. The use of flipped classrooms in higher education: a scoping review. Internet High Educ. 2015;25:85–95. doi: 10.1016/j.iheduc.2015.02.002. [DOI] [Google Scholar]
  • 9.Stains M, Harshman J, Barker MK, Chasteen SV, Cole R, DeChenne-Peters SE, et al. Anatomy of STEM teaching in North American universities. Science. 2018;359:1468–1470. doi: 10.1126/science.aap8892. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning [Internet]. 2020. Available from: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning. Accessed 26 Mar 2022.
  • 11.Lima KR, das Neves B-HS, Ramires CC, dos Santos Soares M, Martini VA, Lopes LF, et al. Student assessment of online tools to foster engagement during the COVID-19 quarantine. Adv Physiol Educ. 2020;44:679–683. doi: 10.1152/advan.00131.2020. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Latorre-Cosculluela C, Suárez C, Quiroga S, Sobradiel-Sierra N, Lozano-Blasco R, Rodríguez-Martínez A. Flipped classroom model before and during COVID-19: using technology to develop 21st century skills. Interact Technol Smart Educ. 2021;18:189–204. doi: 10.1108/ITSE-08-2020-0137. [DOI] [Google Scholar]
  • 13.Rhim HC, Han H. Teaching online: foundational concepts of online learning and practical guidelines. Korean J Med Educ. 2020;32:175–183. doi: 10.3946/kjme.2020.171. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.He L, Yang N, Xu L, Ping F, Li W, Sun Q, et al. Synchronous distance education vs traditional education for health science students: a systematic review and meta-analysis. Med Educ. 2021;55:293–308. doi: 10.1111/medu.14364. [DOI] [PubMed] [Google Scholar]
  • 15.Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019;24:1–13. doi: 10.1080/10872981.2019.1666538. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Onyema EM, Eucheria NC, Obafemi FA, Sen S, Atonye FG, Sharma A, et al. Impact of coronavirus pandemic on education. J Educ Pract. 2020;11:108–121. [Google Scholar]
  • 17.Petillion RJ, McNeil WS. Student experiences of emergency remote teaching: Impacts of instructor practice on student learning, engagement, and well-being. J Chem Educ. 2020;97:2486–2493. doi: 10.1021/acs.jchemed.0c00733. [DOI] [Google Scholar]
  • 18.Shim TE, Lee SY. College students’ experience of emergency remote teaching due to COVID-19. Child Youth Serv Rev. 2020;119:1–7. doi: 10.1016/j.childyouth.2020.105578. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Shin M, Hickey K. Needs a little TLC: examining college students’ emergency remote teaching and learning experiences during COVID-19. J Furth High Educ. 2021;45:973–986. doi: 10.1080/0309877X.2020.1847261. [DOI] [Google Scholar]
  • 20.Tan C. The impact of COVID-19 on student motivation, community of inquiry and learning performance. Asian Educ Dev Stud. 2021;10:308–321. doi: 10.1108/AEDS-05-2020-0084. [DOI] [Google Scholar]
  • 21.Gonzalez T, De la Rubia MA, Hincz KP, Comas-Lopez M, Subirats L, Fort S, et al. Influence of COVID-19 confinement on students’ performance in higher education. PLoS ONE. 2020;15. [DOI] [PMC free article] [PubMed]
  • 22.Iglesias-Pradas S, Hernández-García Á, Chaparro-Peláez J, Prieto JL. Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: a case study. Comput Human Behav. 2021;119:1–18. doi: 10.1016/j.chb.2021.106713. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Yakar L. The effect of emergency remote teaching on the university students’ scores and students and instructors perceptions on scores. Journal of Educational Technology and Online Learning. 2021;4:373–390. doi: 10.31681/jetol.957433. [DOI] [Google Scholar]
  • 24.Eddy SL, Converse M, Wenderoth MP. PORTAAL: a classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE Life Sci Educ. 2015;14:1–16. doi: 10.1187/cbe.14-06-0095. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Asgari M, Miles AM, Lisboa MS, Sarvary MA. COPUS, PORTAAL, or DART? Classroom observation tool comparison from the instructor user’s perspective. Front Educ (Lausanne) 2021;6:1–14. [Google Scholar]
  • 26.Smith MK, Jones FHM, Gilbert SL, Wieman CE. The classroom observation protocol for undergraduate stem (COPUS): a new instrument to characterize university STEM classroom practices. CBE Life Sci Educ. 2013;12:618–627. doi: 10.1187/cbe.13-08-0154. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Smith MK, Vinson EL, Smith JA, Lewin JD, Stetzer MR. A campus-wide study of STEM courses: new perspectives on teaching practices and perceptions. CBE Life Sci Educ. 2014;13:624–635. doi: 10.1187/cbe.14-06-0108. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Kinnear G, Smith S, Anderson R, Gant T, MacKay JRD, Docherty P, et al. Developing the FILL+ tool to reliably classify classroom practices using lecture recordings. J STEM Educ Res. 2021;4:194–216. doi: 10.1007/s41979-020-00047-7. [DOI] [Google Scholar]
  • 29.Lund TJ, Pilarz M, Velasco JB, Chakraverty D, Rosploch K, Undersander M, et al. The best of both worlds: building on the COPUS and RTOP observation protocols to easily and reliably measure various levels of reformed instructional practice. CBE Life Sci Educ. 2015;14:1–12. doi: 10.1187/cbe.14-10-0168. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Kranzfelder P, Lo AT, Melloy MP, Walker LE, Warfa ARM. Instructional practices in reformed undergraduate STEM learning environments: a study of instructor and student behaviors in biology courses. Int J Sci Educ. 2019;41:1944–1961. doi: 10.1080/09500693.2019.1649503. [DOI] [Google Scholar]
  • 31.Moon S, Jackson MA, Doherty JH, Wenderoth MP. Evidence-based teaching practices correlate with increased exam performance in biology. PLoS ONE. 2021;16:e0260789. doi: 10.1371/journal.pone.0260789. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Butler AC, Godbole N, Marsh EJ. Explanation feedback is better than correct answer feedback for promoting transfer of learning. J Educ Psychol. 2013;105:290–298. doi: 10.1037/a0031026. [DOI] [Google Scholar]
  • 33.Launer J. Teaching and facilitating groups online: adapting to the COVID-19 pandemic. Postgrad Med J. 2021;97:543–544. doi: 10.1136/postgradmedj-2021-140619. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Daniels LM, Goegan LD, Parker PC. The impact of COVID-19 triggered changes to instruction and assessment on university students’ self-reported motivation, engagement and perceptions. Soc Psychol Educ. 2021;24:299–318. doi: 10.1007/s11218-021-09612-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020;10:e042378. doi: 10.1136/bmjopen-2020-042378. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Mahdy MAA. The impact of COVID-19 pandemic on the academic performance of veterinary medical students. Front Vet Sci. 2020;7:1–8. doi: 10.3389/fvets.2020.594261. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Weir LK, Barker MK, McDonnell LM, Schimpf NG, Rodela TM, Schulte PM. Small changes, big gains: a curriculum-wide study of teaching practices and student learning in undergraduate biology. PLoS ONE. 2019;14:e0220900. doi: 10.1371/journal.pone.0220900. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Thistlethwaite JE, Davies D, Ekeocha S, Kidd JM, MacDougall C, Matthews P, et al. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Med Teach. 2012;34:e421–e444. doi: 10.3109/0142159X.2012.680939. [DOI] [PubMed] [Google Scholar]
  • 39.McBrien JL, Jones P, Cheng R. Virtual spaces: employing a synchronous online classroom to facilitate student engagement in online learning. Int Rev Res Open Dist Learn. 2009;10:1–17. [Google Scholar]
  • 40.Morawo A, Sun C, Lowden M. Enhancing engagement during live virtual learning using interactive quizzes. Med Educ. 2020;54:1188. doi: 10.1111/medu.14253. [DOI] [PubMed] [Google Scholar]
  • 41.Rossi IV, de Lima JD, Sabatke B, Nunes MAF, Ramirez GE, Ramirez MI. Active learning tools improve the learning outcomes, scientific attitude, and critical thinking in higher education: experiences in an online course during the COVID-19 pandemic. Biochem Mol Biol Educ. 2021;49:888–903. doi: 10.1002/bmb.21574. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Bell BS, Kozlowski SWJ. Active learning: effects of core training design elements on self-regulatory processes, learning, and adaptability. J Appl Psychol. 2008;93:296–316. doi: 10.1037/0021-9010.93.2.296. [DOI] [PubMed] [Google Scholar]
  • 43.Ellis K. Perceived teacher confirmation the development and validation of an instrument and two studies of the relationship to cognitive and affective learning. Hum Commun Res. 2000;26:264–291. [Google Scholar]



Articles from Medical Science Educator are provided here courtesy of Springer
