Abstract
Student engagement is important for classroom management but can be challenging to monitor, especially in large virtual classes. After lessons were moved online due to COVID-19 measures, instructors were unable to directly observe student behaviours, impacting their ability to gauge engagement levels and adjust the pace of delivery for optimal learning outcomes. A widget called MOSH, for Move On/Stay Here, was developed for students to indicate in real time whether they wished to “move on” from or “stay here” on a point of discussion. By increasing acknowledgment of and response to student feedback, we aimed to enhance the student-instructor feedback loop.
Keywords: Online learning, Student engagement, MOSH
Background
The move to online learning in health professions education (HPE) has gained momentum with the increasing prevalence of blended learning formats and COVID-19 pandemic–related restrictions. While student engagement (defined as participation in educationally effective practices [1]) during online classes is critical for student learning [2], maintaining it is a major challenge for educators, due in part to difficulties in assessing student disengagement, frustration, and disinterest in virtual classrooms (reviewed by Wilcha [3]). Therefore, there is a pressing need to develop systems for monitoring student engagement during virtual classes to maintain the effectiveness of online medical education.
The Lee Kong Chian School of Medicine employs team-based learning (TBL) as the main pedagogical method in the first two years of the pre-clinical curriculum [4]. Classes are co-taught by facilitators, who are TBL pedagogy experts, and content experts, who are subject matter experts. During class, students answer questions utilising a learning activity management system (LAMS), after which there are class-wide discussions for clarification of gaps in understanding and further application of concepts learnt. Facilitators manage these discussions by encouraging student participation and guiding content experts to provide explanations at the appropriate level for the students and at the optimal timepoint during the discussion. This requires facilitators to constantly monitor classroom dynamics and identify when students are struggling with the difficulty of the content or disengaged from the discussion.
From March 2020 to May 2022, all TBL sessions were conducted via Zoom due to COVID-19 social distancing measures. This rendered facilitators unable to gauge levels of interest and attention by directly observing student behaviours. Instead, feedback about student engagement and understanding was restricted to observing facial expressions in thumbnail-sized video displays and to input from the handful of students who volunteered to speak up during class.
In response, we developed a novel widget on our LAMS platform through which students could give immediate feedback during class, providing facilitators with a real-time snapshot of student understanding of the current discussion and their engagement with the lesson. Student engagement can be positive (characterised by interest and enthusiasm), negative (disruption or boycotting), or non-existent (apathy) [5]. We intended the widget to allow students to indicate their understanding at each point of discussion, thereby eliciting positive engagement, and to give students a route away from non-existent or negative engagement. While this initiative was triggered by the transition to online classes during the pandemic, it was also relevant for live classes, since it could provide facilitators with more quantitative and explicit feedback than mere observation of classroom dynamics.
Activity
We designed the widget (known as MOSH, for Move On/Stay Here) in collaboration with our LAMS developer. Students accessed MOSH on LAMS via a small pop-up screen, which they clicked to indicate their preference for moving on to the next point (“Move On”) or staying on a point of discussion (“Stay Here”), as well as their reason for choosing either option, selected from a pull-down list (Fig. 1).
Fig. 1.
Student view of MOSH interface. Screenshots of a sample question with a minimised MOSH icon (A) and an expanded drop-down list (B). Options on the list are highlighted after students click on them
The responses were collated and presented to facilitators in real time on their LAMS dashboard as an easy-to-understand graphical representation of the votes, with a breakdown of the reason behind each vote. In addition, the dashboard included a “Start discussion” button so that facilitators could control the activation of MOSH for each question (Fig. 2).
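The collation logic described above can be sketched as follows. This is a minimal illustration only, not the actual LAMS implementation: the class name, option strings, and reason strings are our own, and the real widget runs server-side within LAMS.

```python
from collections import Counter

# Options and reasons are illustrative; the real pull-down list was
# derived from student interviews and the pre-implementation survey.
VALID_OPTIONS = {"move_on", "stay_here"}

class MoshTally:
    """Collates MOSH votes for one discussion question."""

    def __init__(self):
        self.active = False  # gated by the facilitator's "Start discussion" button
        self.votes = {}      # student_id -> (option, reason); re-voting overwrites

    def start_discussion(self):
        """Facilitator activates MOSH for the current question."""
        self.active = True

    def vote(self, student_id, option, reason):
        """Record (or update) one student's vote and reason."""
        if not self.active:
            raise RuntimeError("MOSH has not been activated for this question")
        if option not in VALID_OPTIONS:
            raise ValueError(f"unknown option: {option}")
        self.votes[student_id] = (option, reason)

    def summary(self):
        """Dashboard view: counts per option and per selected reason."""
        options = Counter(opt for opt, _ in self.votes.values())
        reasons = Counter(reason for _, reason in self.votes.values())
        return {"options": dict(options), "reasons": dict(reasons)}
```

For example, the simulated dashboard in Fig. 2 (five voters: one wanting clarification, one asking for the key point, three ready to move on) corresponds to a summary of two “Stay Here” and three “Move On” votes with the matching reason counts.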
Fig. 2.
Simulated example of facilitator view of MOSH interface. A The facilitator clicks on the “Start discussion sentiment” button (i) to bring up the MOSH dashboard. B The MOSH dashboard includes a graphical representation (ii) of the proportion of “Stay here” to “Move on” votes in yellow and blue, respectively, as well as a breakdown of the specific reasons (iii) selected by students. In this example, of the five students who voted, one wanted more clarification, one wanted to know what the key point was, and three indicated that they had already understood and were ready to move on
From group interviews with year 2 medical students, we gathered initial feedback to refine the widget design and develop a candidate list of reasons why students might want to “Move On” or “Stay Here”, as these options can help to inform the facilitator on how best to proceed with the lesson. Following this, we conducted a pre-implementation survey of year 1 and 2 students to select options from the candidate list for inclusion in the pull-down list, and to also gauge buy-in for the initiative. We then conducted a pilot run of the widget with a group of year 2 students, the results of which have been previously reported [6].
Having received generally positive feedback from the pre-implementation student survey and pilot run, we proceeded to implement MOSH for 27 year 1 and 37 year 2 online TBL sessions that ran between 14 Jan and 20 May 2022. MOSH continues to be in use even after our school returned to face-to-face TBL classes in June 2022 upon the relaxation of COVID-19 social distancing restrictions.
In March–April 2022, we conducted a post-implementation survey of year 1 (n = 163) and year 2 (n = 147) students via Google Forms to collect student feedback about their experience of MOSH during online TBLs. The survey included closed-ended multiple-choice questions and optional follow-up open-ended questions requesting further elaboration.
Results and Discussion
Ninety-five out of 310 (31%) year 1 and year 2 students completed the survey (Fig. 3). Of these, 62 (65%) respondents reported having used MOSH during class. Of the remaining 33 respondents who reported never activating MOSH, the majority either were focussing on the lesson (11/33, 33%) or felt that the pace of discussion was suitable (8/33, 24%). Thus, a possible reason for never using MOSH was that these students did not feel a strong need to alter the pace of the lesson, which was supported by the responses to the open-ended survey questions (Table 1).
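As a quick arithmetic check, the proportions reported above are mutually consistent (variable names are ours; all counts are taken from the text):

```python
# Sanity check of the reported survey proportions.
total_students = 310
respondents = 95
used_mosh = 62
never_used = respondents - used_mosh  # 33 students who never activated MOSH

def pct(num, den):
    """Percentage rounded to the nearest whole number."""
    return round(100 * num / den)

response_rate = pct(respondents, total_students)  # 31% response rate
usage_rate = pct(used_mosh, respondents)          # 65% used MOSH
focusing_rate = pct(11, never_used)               # 33% were focussing on the lesson
pace_rate = pct(8, never_used)                    # 24% felt the pace was suitable
```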
Fig. 3.
Participants’ responses to closed-ended survey questions on MOSH experience. A, B Values on the x-axis represent absolute counts; C values on the x-axis represent percentages, and negative statements are preceded by an asterisk
Table 1.
Representative participant open-ended responses regarding the statements “It is useful in regulating the pace of discussion” and “It has improved my learning experience during TBLs”
| Theme | Representative quotes |
|---|---|
| No need/opportunity to provide feedback | “I think that the pace of TBL hasn't been changed much by the system because most of the questions that are asked are usually relevant and students have a good sense of what questions should be asked during the TBL sessions and what should be asked during [optional clarifications by content experts after the class has ended] by now. Explanation by the [content experts] are also usually of appropriate length” · “Only students who are not listening would have time to click on that bcos most the session is happening on zoom.” |
| Instructor does not act on/acknowledge feedback | “It is useful only if the facilitator pays attention/acts on the feedback” · “It is ultimately up to the facilitator and the [content expert]. Many a times, the MOSH feedback is disregarded.” · “Too few students bother to voice their opinions on MOSH, and those who do voice their opinions just get ignored by the [content expert] anyway as there are too little responses, so it is just a feedback loop” |
Feedback on the experience of using MOSH was generally positive (Fig. 3C). Almost all respondents strongly agreed or agreed that it was easy to understand. In addition, the majority of respondents thought that it was useful for Zoom TBLs and would be useful for live TBLs (which had not yet resumed at the time of the survey). They also felt that it encouraged student feedback and allowed students to directly benefit from and receive acknowledgement of their feedback.
However, respondents were ambivalent about whether it was useful in regulating the pace of discussion or improved their learning experience during TBLs. Analysis of participants’ open-ended follow-up responses to these two closed-ended survey questions (Table 1) revealed that some students did not engage more with MOSH due to the perceived lack of acknowledgment of feedback by instructors.
We had been concerned that MOSH might be abused by disruptive students to speed up the pace of the class inappropriately, and that students might be influenced by their peers when using MOSH. This appeared not to be the case, since the majority of respondents strongly disagreed with, disagreed with, or were neutral towards the statements corresponding to these issues (Fig. 3C). However, a substantial proportion of respondents (44/95, 46%) felt that it was distracting, although we note that this is still lower than the proportion of respondents who felt that it was useful for TBLs. This suggests that MOSH would be most useful for classes in which the benefits of calibrating the pace of discussion outweigh the unintended negative consequence of causing distraction.
The utility of an immediate feedback tool from students to instructors may be more apparent in classes that are poorly paced and in which students are disengaged. The main reason our students gave for not using MOSH was that they were focussed on the lesson and therefore already engaged. Lack of feedback may thus indicate that student engagement levels are sufficiently high, which is itself useful information for managing class discussions. However, our results also revealed a subpopulation of respondents who felt that the MOSH feedback was disregarded. Instructors should therefore explicitly acknowledge and act upon feedback when appropriate, since studies have shown that students see more value in providing feedback when they themselves receive immediate benefit [7].
The trend towards online HPE is unlikely to reverse as virtual teaching gains acceptance and in the face of potential future pandemics [8]. We have demonstrated an immediate feedback tool integrated into our existing digital platform that provides a simple, scalable, and effective way to monitor student engagement in both virtual and live classrooms. This approach allows instructors to calibrate content delivery to the needs of a diverse student population to optimise learning outcomes and may be particularly useful for online classes with large class sizes.
Funding
This project is supported by the EdeX Grant from the Teaching, Learning and Pedagogy Division, Nanyang Technological University.
Data Availability
The datasets are available from the corresponding author on reasonable request.
Declarations
Ethics Approval
Ethics approval was obtained from our institution (IRB-2020-07-018).
Conflict of Interest
The authors declare that there is no conflict of interest.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Kuh G, Kinzie J, Buckley J. Piecing together the student success puzzle: research, propositions, and recommendations. ASHE High Educ Rep. 2007;32(5):1–182.
- 2. Martin F, Bolliger DU. Engagement matters: student perceptions on the importance of engagement strategies in the online learning environment. Online Learn. 2018;22(1):205–22. doi: 10.24059/olj.v22i1.1092.
- 3. Wilcha RJ. Effectiveness of virtual medical teaching during the COVID-19 crisis: systematic review. JMIR Med Educ. 2020;6(2):e20963. doi: 10.2196/20963.
- 4. Rajalingam P, Rotgans JI, Zary N, Ferenczi MA, Gagnon P, Low-Beer N. Implementation of team-based learning on a large scale: three factors to keep in mind. Med Teach. 2018;40(6):582–588. doi: 10.1080/0142159X.2018.1451630.
- 5. Trowler V. Student engagement literature review. York: The Higher Education Academy; 2010.
- 6. Han SP, Cleland JA, Yang L. Feedback in online classes: keeping it real-time. Med Educ. 2022;56(5):577–578. doi: 10.1111/medu.14773.
- 7. Lake W, Boyd W, Boyd W, Hellmundt S. Just another student survey? - Point-of-contact survey feedback enhances the student experience and lets researchers gather data. Aust J Adult Learn. 2017;57(1):82–104.
- 8. Jeffries PR, Bushardt RL, DuBose-Morris R, Hood C, Kardong-Edgren S, Pintz C, et al. The role of technology in health professions education during the COVID-19 pandemic. Acad Med. 2022;97(3S):S104–S109. doi: 10.1097/ACM.0000000000004523.