2023 Apr 24:1–11. Online ahead of print. doi: 10.1007/s11528-023-00850-0

Exploring an Innovative Approach to Enhance Discussion Board Engagement

Hanadi Hamadi 1, Aurora Tafili 2, Frederick R Kates 3, Samantha A Larson 3, Carlyn Ellison 4, Jihee Song 5
PMCID: PMC10124679  PMID: 37362586

Abstract

Online discussion boards are a standard learning management system (LMS) instructional tool used in the emerging online learning pedagogy. This pilot study examined an innovative approach that differs from how discussion boards have been commonly used. Using a retrospective, cross-sectional design, we evaluated the effect of shifting from traditional teacher and student-generated prompts to using student-generated videos with higher-order discussion questions to gauge student perceptions of peer feedback and engagement. Participants were graduate students in a health care administration course at a large university. Overall students’ perceptions of creating and responding to student-generated prompts were positive. Students responded that they were more engaged and thought more critically about the content with this shift from the traditional way of using discussion boards. As digital technology reshapes higher education, it is essential to reflect and evaluate the effectiveness of current LMS applications and standard procedures to improve educational delivery.

Keywords: Discussion boards, Feedback, Instructional design, Online learning, Student engagement, Student-generated video

Introduction

Discussion boards play an essential role in emerging online learning pedagogy. Online discussion boards are a standard instructional tool within learning management systems (LMS) such as Canvas, Moodle, and Blackboard. In the United States, approximately 99% of colleges and universities use an LMS (Dahlstrom et al., 2014). During the recent pandemic, discussion boards played a critical role in promoting teacher-student interaction and self-directed learning (Adinda & Mohib, 2020; Geng et al., 2019; Kimberling & Akwafuo, 2023). LMS discussion boards allow instructors and students to post questions and responses, usually in an asynchronous manner. Allowing students more time to read, reflect, and critically engage with the content before posting to the discussion can enhance the learning experience (Clouse & Evans, 2003; Han & Ellis, 2019; Krentler & Willis-Flurry, 2005) – arguably one of the most significant advantages of online discussion boards compared to time-constrained, in-person synchronous discussions (Kamin et al., 2001; Testa & Egan, 2016).

While discussion board utilization offers a variety of advantages, some educators have expressed concern regarding the quality of student engagement which has been shown to be a strong predictor of learning outcomes (Kates et al., 2015; Soffer & Cohen, 2019). For example, some students simply paraphrase the well-articulated responses of their peers rather than generate original remarks of their own (Gao et al., 2020; Wijekumar & Spielvogel, 2006). This is chiefly due to the current technical structure of LMS environments, which is tailored to the frequency rather than the quality of discussion posts (Gunnlaugson, 2006; Suler, 2004). In addition to educator considerations, students have expressed concerns over lack of clarity in discussion prompts, misunderstanding professor instruction, and engaging only with their peers (Ochoa et al., 2012; Thomas, 2002).

For instructors, evaluating discussion board responses is particularly problematic. A straightforward way to measure student performance is by the number of postings, with the most common methods being either to calculate the mean number of postings per forum or to divide the number of student responses by the number of students in the class (Kay, 2006; Mazzolini & Maddison, 2003; Nisbet, 2004; D. Wu & Hiltz, 2004; Yunusa & Umar, 2021). However, measuring the quality of student discussion board responses is complicated. While there are several methods for content analysis that may apply to discussion boards, the analytical process necessitates a great deal of time for a thorough review of a large number of responses (Bliss & Lawrence, 2009). Artificial intelligence (AI) is gaining traction and will change the LMS experience in the coming years. AI might reduce some of the educational burden on instructors by monitoring students’ input and assisting with feedback (Loeckx, 2016; Zhai et al., 2021). For example, one pilot study attempted to use AI to grade discussion boards with limited success, but it is a starting point (Rutner & Scott, 2022). With the growth of online instruction, instructors should continue to strive to improve student engagement.
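The count-based metrics described above can be sketched in a few lines. This is an illustrative sketch, not the study’s actual tooling; the function names and data layout are assumptions:

```python
from collections import Counter

def mean_posts_per_forum(posts):
    """Mean number of postings per forum.
    `posts` is a list of (forum_id, student_id) tuples, one per posting."""
    per_forum = Counter(forum for forum, _ in posts)
    return sum(per_forum.values()) / len(per_forum) if per_forum else 0.0

def responses_per_student(posts, class_size):
    """Total student responses divided by the number of students in the class."""
    return len(posts) / class_size if class_size else 0.0

# Hypothetical example: 5 posts across 2 forums in a class of 4 students
posts = [("f1", "s1"), ("f1", "s2"), ("f2", "s1"), ("f2", "s3"), ("f2", "s4")]
print(mean_posts_per_forum(posts))      # 2.5
print(responses_per_student(posts, 4))  # 1.25
```

As the text notes, such counts say nothing about post quality, which is why content analysis remains time-consuming.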

Online education enrollment continues to increase, particularly since the coronavirus pandemic began to disrupt education in 2020, when roughly 75% (11.8 million) of all undergraduate students were enrolled in at least one distance education course (National Center for Education Statistics, 2021). As early as 2013, 25% of all college students were enrolled in at least one online or blended course, whereby approximately 30–80% use discussion boards to deliver content; additional research and instructor reflection are required to improve this online learning pedagogy (Allen & Seaman, 2010; Russo-Gleicher, 2013). As referenced, researchers have identified both positive (Bailey et al., 2020) and negative aspects associated with discussion boards (Han & Ellis, 2019). This pilot study examines the influence of transitioning from a traditional discussion board format to an innovative alternative that is guided by teacher- and student-generated prompts and incorporates student-generated videos with higher-order discussion questions. The primary aim is to gauge how this new format impacts students’ perceptions of their critical thinking skills, ability to provide and receive feedback, and engagement with the discussion post. This study addresses the following questions: (1) Does changing the way traditional discussion boards are used, from teacher-generated text-based prompts to student-generated videos with discussion prompts, increase students’ perception of self-engagement? (2) Does the task of creating higher-order discussion questions impact students’ perceptions of their critical thinking and knowledge of the topic? (3) Finally, what are students’ perceptions of providing and receiving peer-review feedback?

Pedagogical Framework

Using the lens of constructivist learning theory, we examined how student-generated videos impact perceived self-engagement, critical thinking skills, and the provision of feedback. Constructivist learning theory suggests that learners construct knowledge from their own experiences; we used it to explain students’ active construction of knowledge through the collaborative process of creating videos with associated discussion questions. Students who actively seek to expand their knowledge are engaged and motivated to learn more effectively (Zhang et al., 2006). The current literature identifies that a major educational challenge, especially in distance learning settings, is keeping learners motivated and engaged in the long-term learning process (Lespiau & Tricot, 2019).

A method of increasing student engagement and content retention within discussion boards involves incorporating student-generated videos in which students can more efficiently share their knowledge with classmates. By integrating student-generated videos into discussion boards, professors can utilize the zone of proximal development to facilitate students’ abilities to produce these videos. Vygotsky’s social learning theory (Vygotsky, 1978), which highlights social interaction as a significant factor in learning, defines the “zone of proximal development” as the gap between what an individual can do unaided and what they can achieve when aided by a more knowledgeable individual. For example, group or team members who possess advanced skill sets can assist those with novice skill sets within their zone of proximal development (Einarsson & Hertzum, 2020; McLeod, 2012). Therefore, we hypothesize that student-generated videos with discussion prompts will increase students’ perception of self-engagement.

As it relates to discussion boards, professors can administer an initial assessment to gauge students’ knowledge of technology and video editing skills. This borrows from the concept of the flipped classroom, and the literature supports the effectiveness of such an approach (Lapitan et al., 2021). Students who exhibit advanced knowledge and skill can then be deployed to assist other students in learning the technological components required to create and edit student-generated discussion videos—thereby also promoting increased social interaction amongst students (e.g., see Fig. 1). Within a discussion board, student interaction and exchange of information, as well as the collaborative effort put forth to create a video, supports the development of a group that holds specific learning goals (Zhu, 2006). Creating and posting videos on a discussion board is a student’s first exposure to social interaction and learning within the course.

Fig. 1.

Fig. 1

Adapted and modified from Vygotsky’s Zone of Proximal Development (Vygotsky, 1978)

Following this initial touchpoint, student engagement and feedback assist in internalizing the material and advancing knowledge as they critically read and thoughtfully respond to their peers. The key constructs of this study’s measures included student engagement with discussions, the quality of the discussion posts, and peer interaction. For example, student engagement was defined as student motivation to create high-quality videos for discussion board topics and their level of participation. The quality of posts was measured by prior knowledge of higher-order thinking skills and the process of creating discussion questions. Finally, peer interaction survey questions were used as a proxy to measure the appropriateness and expectation of student feedback.

Benefits of Video Creation

Videos are widely used in academic settings and are a powerful medium for online courses. The current generation of college students can easily access media outlets and are intimately familiar with various media products. Breaking away from traditional, teacher-generated LMS discussion boards to a method that employs student-generated videos and discussion questions could improve students’ learning experiences. For example, video development offers students a unique opportunity to enhance their understanding of course content and increase their collaborative skill set through peer engagement. Research shows that students who use interactive videos exhibit higher learning performance and satisfaction when compared to their peers (Zhang et al., 2006). Moreover, creating educational videos is correlated with improved student creativity, communication, and analytical thinking (Armstrong et al., 2009).

Omar et al. (2013) examined the impact of student-generated videos related to clinical scenarios and found that the advanced preparation necessary to create the video content led to an enhanced understanding of their health profession. In addition, other studies have shown that creating videos is positively associated with self-motivation and retention of learning materials (Greene & Crespi, 2012; Jowsey et al., 2020; Lim et al., 2009). Utilization of a student-created video is a reciprocal style of teaching that prompts students to become a facilitator of knowledge among their peers, which ultimately aids in engaging students in the course learning process (Kates et al., 2015). Once these videos are posted to a discussion board, content reflection occurs most notably when students watch and respond to associated discussion questions posed by their peers.

Benefits and Drawbacks of Peer Reviews

As a supplement to the online discussion board, students can utilize the peer review process. Peer review comprises not only evaluation and feedback but also the development of individual knowledge (Awada & Diab, 2021). Education studies have shown that the peer review process can improve motivation, increase ownership, encourage self-learning and autonomy, enhance evaluation and constructive feedback skills, heighten emphasis on deep learning and metacognitive strategy use, and improve the integration of assessments in the learning process (Barker & Bennett, 2011; Cuddy et al., 2001; Lin et al., 2021; Min, 2005). Overall, the benefits of peer review extend beyond increased student motivation and ownership to a deeper understanding of the content, improved critical thinking, and advanced reflection skills (Velez et al., 2011; Y. Wu & Schunn, 2021; Yang & Wu, 2012).

Methods

Setting and Participants

This study used a retrospective, cross-sectional design and was conducted in a graduate health care administration course at a large university. A cross-sectional design is one in which exposure and outcome are measured in participants at the same point in time (Setia, 2016). This design was selected because it allowed the researchers to observe and then assess their students’ experiences with discussion posts throughout the semester, and it is a design that is inexpensive and quicker to conduct (Setia, 2016).

Procedure

Vygotsky’s (1978) “zone of proximal development” guided our methods. First, as a formative assessment using a self-assessment technique, we asked students to answer a series of questions to identify who possessed prior knowledge of video editing, the processes for large file transfers and format conversions, and YouTube channel navigation. Students were also asked to rate their confidence level in using video editing software. Based on the results of this assessment, students were divided into collaborative teams of three or four to produce teams with mixed experience levels. The social interaction of learning and teaching new technical skills within the “zone” of the collaborative group can be a significant factor in learning. The assessment also helped to identify students with limited or minimal technology skills who, without the benefit of the collaboration, might be “out of reach” or become frustrated and struggle with the video editing software.

The purpose of this four-step process is to prepare students to create, edit, and critique discussion post videos. Instructor-led preparation included introducing students to the process of creating and editing videos, as well as setting assignment expectations with detailed rubrics. In the first step, the instructor introduced video editing software using Canvas’s discussion board feature. The class was divided into small groups of 2 to 3 students per group. Each group was asked to research a free video editing software and share their findings in a written post to the class using the discussion board feature. They were asked to post which video editing software they liked, the features that stood out, and any helpful tutorials for learning how to use it. Each student in the class was required to write a reply to at least 2 peer posts sharing their feedback about usability, tutorials, and other technology recommendations. The goal was to find some commonality and arrive at recommendations for video editing software that were student-driven, not instructor-driven, so the students take ownership in learning how to use the software.

Initial video exposure began with a classic team-building icebreaker assignment in the LMS called “Two Truths and a Lie.” Step two of the assignment asked students to create a video stating two truths and a lie about themselves to share something unique or exciting that peers would not expect. Students were permitted to use any editing software to complete their video (e.g., Final Cut Pro, Movie Maker, iMovie, or even their phone). An example video, filmed very simply with a cellphone, was provided by the instructor within the assignment details to help set expectations for the use of technology throughout the course. This video, “Two Truths and a Lie,” can be accessed at https://youtu.be/yWe6CmkCP5Q. Step three of the assignment required students to upload their video to an LMS discussion board to share with peers in their collaborative group. The step three assignment description also included links detailing how to participate in a discussion board and how to upload a video in the LMS. The literature supports the need for facilitator modeling to develop a safe learning environment that increases student involvement (Gonzalez et al., 2021; Tichon & Seat, 2004).

The second video assignment required of the participating sample was exploratory in nature, allowing students to familiarize themselves with video editing resources (e.g., iMovie, Adobe Spark, VirtualDub, Wax, OpenShot, VideoPad, VSDC Video Editor, and Movie Maker) on their own, then share their insights, recommendations and understanding with classmates in a discussion board. Many of these sites are “freemiums,” which allow users to access basic features at no cost and premium features for a subscription fee (Kumar, 2014). Therefore, before the exploratory lesson was available in the LMS, all students were made aware of the concept of a freemium and reminded that they were not required to, nor expected to, purchase premium features. The emphasis was on beta testing the free video editing software and providing feedback on the usability and features of the software. File size, conversion formats and the general upload process for videos are issues students typically struggle with as they create and share video content. Online tutorials on all aspects of the creation and publication process were created and uploaded in the LMS in order to mitigate these common barriers to success. Further, instructors capitalized on many students’ pre-existing knowledge and use of YouTube by allowing the submission of a YouTube video link, rather than a video file, for their assignments.

The fourth step was to introduce Bloom’s Taxonomy, a hierarchical framework to classify learning objectives and cognitive levels (Bloom et al., 1956). Bloom’s Taxonomy includes six levels of learning categories that progressively advance as skills are obtained in the lower levels of learning. The instructor shared university resources that focus on creating higher-order questions at the top of Bloom’s Taxonomy (i.e., analyze, evaluate, and create) and modeled the process. At the conclusion of each video assignment, students were required to post higher-order discussion questions to their team. The purpose of this was to extend viewers’ thinking beyond the video content in order to deepen understanding and motivate further exploration. In addition, the questions were intended to build on previously shared content, requiring their peers to infer relationships in order to generate new meaning and an understanding of the topic at hand. Students worked collaboratively to create course-specific videos and discussion questions in teams of three to four, depending on class size and instructor discretion. Teams were then grouped and placed within larger discussion groups.

For example, a class of 24 students had eight teams of three students, which were divided into two discussion groups of 12 students each. Canvas, a prominent LMS, was utilized to assign students to teams and discussion groups. For each assignment, one team member was required to submit the completed video alongside supporting discussion questions to their assigned group’s discussion board. Once all the teams had uploaded their videos, students were required to work individually to watch all team videos, provide feedback on the videos, and answer the accompanying discussion questions. The required feedback section was termed a “sandwich critique” in the discussion board lesson provided to all students (Dohrenwend, 2002; Kates et al., 2018). First, the critique required students to view at least two team videos; second, it required the student to provide a “sandwich critique” for each video they watched (see Fig. 2); finally, it required that one discussion question from each video be answered, for a total of two question responses. The education survey and informed consent for the identified course in the pilot study were approved under IRB-02 (IRB201800997).
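The nested grouping described above (teams of three, collected into larger discussion groups) amounts to a simple two-level partition of the roster. The sketch below is illustrative only; the study used Canvas for this, and the shuffling step is an assumption meant to mirror the mixed-skill-level intent of the formative assessment:

```python
import random

def assign_groups(students, team_size=3, teams_per_discussion_group=4):
    """Partition a roster into teams, then nest teams into discussion groups."""
    roster = list(students)
    random.shuffle(roster)  # mix experience levels across teams
    teams = [roster[i:i + team_size]
             for i in range(0, len(roster), team_size)]
    groups = [teams[i:i + teams_per_discussion_group]
              for i in range(0, len(teams), teams_per_discussion_group)]
    return teams, groups

# Example from the text: 24 students -> 8 teams of 3 -> 2 groups of 12
students = [f"student_{n}" for n in range(24)]
teams, groups = assign_groups(students)
print(len(teams), len(groups))  # 8 2
```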

Fig. 2.

Fig. 2

The sandwich critique modified from Dohrenwend (2002)

Survey Design

The researchers reviewed questions from past research studies and worked collaboratively to create survey questions (See Table 1) for this cross-sectional pilot study. A five-point Likert scale ranging from one, denoting that the respondent “strongly disagrees”, to five, indicating that the respondent “strongly agrees”, was adopted from previous studies influenced by the Community of Inquiry Framework addressing knowledge construction in collaborative online environments (Fiock, 2020; Pool et al., 2017).

Table 1.

Survey Questions with Related Study Aims

Related Study Aim Questions
Student Engagement 1) Did knowing that your peers would view your video motivate you to produce a higher quality video than if only the instructor would see your video?
2) I found it more effective to explain my content in a video vs. written format

3) The learning curve of creating videos was more than I expected, but after going through the process, I can see the benefits and will use this skill set again

4) I am more engaged with discussion boards that used student-generated videos with discussion questions than discussion boards that use teacher-generated prompts

Higher-order questions 1) Before the course, I had prior knowledge of Bloom’s taxonomy and higher-order thinking skills
2) The process of creating higher-order discussion questions to assess my peers’ analysis and synthesis of the material presented in the video made me think more critically about the content
Peer-review 1) The peer responses to my discussion question provided adequate feedback on whether the intended message in the video was conveyed
2) Providing meaningful peer review is a reasonable expectation for college students
3) The peer review process motivated me to provide constructive feedback that would be helpful to the video creator
4) Peer reviewing student videos has helped me to think more critically about content delivery
5) I received useful peer review comments about my video

Data Collection

Twenty-four graduate health care administration students enrolled in a course at a large university were invited to participate in the investigation. The course was selected based on the students' elevated use of the discussion board technique during the semester. Thus, with the increased usage of discussion boards, the authors found it appropriate to examine the approach’s feasibility to share with other instructors. Therefore, the authors decided to run a pilot study to set the groundwork for a large-scale study. Student participation in the pilot study was voluntary, with no penalties for choosing not to participate, skipping questions, or withdrawing. The survey was created and delivered via Qualtrics, which is equipped with SAS 70 certification and data encryption (Qualtrics, 2018), ensuring that only study investigators had access to survey data.

Data Analysis

Survey results were analyzed using descriptive statistics in the form of frequencies and percentages. All analyses were conducted in Stata 14SE. Results were graphed using Microsoft Excel.
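The authors computed these descriptive statistics in Stata; the same frequency-and-percentage tabulation can be sketched as follows. The response data below is hypothetical and used only to show the calculation:

```python
from collections import Counter

LIKERT = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

def tabulate(responses):
    """Return {category: (frequency, percent)} for each Likert level."""
    counts = Counter(responses)
    n = len(responses)
    return {level: (counts[level], 100 * counts[level] / n) for level in LIKERT}

# Hypothetical responses to one survey item (n = 18, matching the study size)
responses = (["Strongly agree"] * 6 + ["Agree"] * 6 + ["Neutral"] * 3 +
             ["Disagree"] * 2 + ["Strongly disagree"] * 1)
for level, (freq, pct) in tabulate(responses).items():
    print(f"{level:18s} {freq:2d}  {pct:5.1f}%")
```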

Results

The survey response rate was 75%, with 18 of the 24 total graduate students in the health care administration course voluntarily agreeing to take the survey. Among the participants, 66.7% were male and 33.3% female. Most participants were between 22 and 24 years of age and non-Hispanic White. Participant characteristics are categorized in Table 2.

Table 2.

Characteristics of Graduate Student Survey Respondents (N = 18)

Characteristics Frequency Percent
Gender Female 6 33.3%
Male 12 66.7%
Age 22–24 17 94.4%
25–26 1 5.6%
Race/Ethnicity African American/Non-Hispanic Black 3 16.7%
Asian/Pacific Islander 1 5.5%
Hispanic/Latino 3 16.7%
Non-Hispanic White 10 55.6%
Prefer not to respond 1 5.5%

When examining student engagement-related responses (see Fig. 3), we found that 67% of respondents agreed or strongly agreed that knowing a peer would be viewing their video motivated them to produce higher-quality work, and the same proportion agreed or strongly agreed that the learning curve of creating videos was greater than they expected but that they see the benefit and will use the skill again. Of participating students, 44% agreed or strongly agreed that they found it more effective to explain their content in a video format than in a written format, and 50% felt more engaged in a discussion board with teacher-generated prompts rather than student-generated prompts. When examining higher-order related questions, we found that more than half of the students did not have previous knowledge of Bloom’s taxonomy or higher-order thinking skills, and 78% agreed or strongly agreed that the process of developing higher-order discussion questions prompted them to think more critically about the course content. When examining peer review related questions, we found that the majority (60% or higher) of students agreed or strongly agreed with the benefits and usefulness of the peer review feedback process.

Fig. 3.

Fig. 3

Student survey responses on a 5-point Likert scale

Discussion

This pilot investigation revealed that students had positive perceptions regarding the implementation of an innovative, alternative approach to online discussion board engagement. Most notably, participating students acknowledged that they were more engaged and thought more critically about course content as a result of the new approach, a stark shift from the traditional implementation of the discussion board platform. Our findings regarding a positive outlook on student engagement align with prior research indicating that video-based discussion board assignments increased social and teaching presence compared to traditional, text-based approaches and were more helpful in achieving learning objectives (Caskurlu et al., 2021; Milovic & Dingus, 2021). Furthermore, we found that 67% of respondents agreed or strongly agreed that knowing a peer would be viewing their video motivated them to produce higher-quality work. Our study shows that incorporating peer review can enhance the benefit of curricula delivery, and participating students overall indicated the benefits of using peer review. Studies have highlighted the need for both peer review and high-quality peer review (Gaynor, 2020). Therefore, novel to our research, and the most prominent addition to the literature, is the incorporation of peer review as a means to facilitate increased student engagement and contribute to higher-order thinking. We also found that 67% of respondents agreed or strongly agreed that the learning curve of creating videos was greater than they expected, but that they see the benefit and will use the skill again.
Part of the difficulty students experienced involved adhering to instructions to supply videos in MP4 format, which some video software did not allow. To address this issue, instructors should provide options for video format submissions or instruct students not to use software that cannot produce MP4 files. This requirement could be integrated into step one, in which students identify editing software that can create MP4 files. Students explained that one reason for the large learning curve was their desire to produce high-quality content, knowing that their peers, along with their instructor, would be watching. As a result, they spent more time than expected on their video content and editing. This is similar to the Hawthorne effect, in which individuals modify an aspect of their behavior in response to their awareness of being observed (Lam et al., 2022).

We found that 44% of participants agreed or strongly agreed that they found it more effective to explain their content in a video format than in a written format. This is consistent with a recent study showing that video-based discussion boards promote a comfortable online student environment and social interaction that allows students to get to know one another better by putting names to faces (Milovic & Dingus, 2021). Further, the same study showed that students devote more time to their video discussion posts in comparison to traditional individual written discussion posts (Milovic & Dingus, 2021).

In our study, participants were asked whether they were more engaged with discussion boards that used student-generated videos with discussion questions than with discussion boards that used teacher-generated prompts. We found that 50% of participants felt more engaged in a discussion board with teacher-generated prompts rather than student-generated prompts. This is not consistent with a recent study showing that students had significantly better attitudes toward student-generated responses than instructor-generated responses (Yu & Chen, 2021). However, this may be due to students not having been exposed to developing their own questions in past courses, despite efforts to use Bloom’s taxonomy to guide them through the process of self-generated questions.

Our study highlights the need to better inform and educate students about higher-order thinking and to incorporate the use of Bloom’s taxonomy in content development. A prior study identified this need as a major limitation of current learning pedagogy (Delaney et al., 2019). We acknowledge that this innovative methodology of using LMS discussion boards requires instructors to undertake significant upfront preparation; however, considering that nearly six million students take at least one online course and another three million take all their classes online (Bai, 2018), incorporating a fresh, engaging approach to learning has the potential to realize gains in online student engagement and improve the overall learning environment.

Recommendations

The results of this pilot study have generated recommendations for further research and action pertaining to question development, grading, instructor participation, and student engagement. This investigation unveiled that a greater emphasis should be placed on educating students on how to write effective discussion questions within the framework of Bloom’s Taxonomy. We did not anticipate that over 50% of the participating students would have little or no prior knowledge of Bloom’s Taxonomy and higher-order thinking skills. This was particularly unanticipated given that all students were enrolled in a graduate-level course at a large university, which suggests that further training is needed to guide the development of effective discussion questions. Given this, our first recommendation is to begin instruction with an LMS formative assessment of students’ prior knowledge of Bloom’s Taxonomy to establish a baseline. Then, based on the assessment results, instructors should develop appropriate lessons and tutorials. Further, it is recommended that a detailed grading rubric be developed for all video and discussion board assignments, as we found that doing so established expectations and provided an appropriate framework for peer evaluators when writing the sandwich critique. This recommendation is supported by previous investigations which have identified numerous benefits including increased student participation, satisfaction, perceived interaction with peers, and perceived learning (Bailey et al., 2020; Yunusa & Umar, 2021).

Additionally, we recommend that instructors dedicate substantial time to commenting on and providing feedback about student-created videos and discussion questions. This “presence” demonstrates interest in student comprehension and interpretation and informs the instructor of the overall level of engagement with the materials. Instructor presence has also been shown to establish clear discussion board expectations (Mandernach et al., 2006) and to motivate thoughtful student responses to peer comments (Dennen, 2005). Allowing an appropriate amount of time for students to reflect on the posted question may be particularly helpful for those whose primary language is not English; non-native English-speaking students can use independent time to compose a critical answer that might have been difficult to craft in a rapid, synchronous environment (Yu, 2018).

Finally, to promote and encourage high-quality written and video submissions, we recommend that instructors create a team “Best Video” award to foster pride in one’s work. An instructor could establish this by creating an assignment in which each team nominates another team for “Best Video,” writing a few sentences about why they believe that video stands out above the rest.

Limitations

This pilot investigation had several limitations: (1) by design, the sample was small (18 students), and all participants were enrolled in the same course; (2) the sample included minimal racial and ethnic diversity, with over 50% of students self-identifying as non-Hispanic white; (3) 66.7% of respondents were male, indicating an uneven gender distribution; and (4) all results were self-reported by graduate-level students, making it difficult to generalize the findings to other academic settings. As an initial pilot study, the primary aim was to develop a preliminary view of students’ perceptions of engagement, higher-order questions, and peer review when using an innovative approach to online discussion forums. We found that student-generated video prompts in a discussion board may be an effective way to increase online course engagement, but this could owe more to the novelty of the innovation than to an improvement in educational delivery. Researchers should therefore view these pilot study limitations as a guide for improving future research.

Future Research

Future research should include a broader study across several campuses with more diverse demographics to assess the impact of active video production on learning at both the undergraduate and graduate levels. Tactics for enhancing and modifying how educational tools are used in learning management systems should be routinely reviewed and analyzed. Discussion boards will play a prominent role in the emerging online learning environment as an effective method for replacing face-to-face instruction. Moreover, the COVID-19 pandemic has prompted educators to reevaluate and plan according to restrictions imposed for the safety of the community, students, and faculty. Researchers and educators therefore need a more in-depth inquiry to understand the impacts COVID-19 may have had on this innovative strategy for online learning (Christopher et al., 2020). As instructional technology increases and evolves with developments in software, hardware, and pedagogical approaches, it is vital to reflect on and evaluate the delivery of student instruction. As demonstrated in this study, future researchers should critically examine current applications and standard procedures to identify changes in instruction delivery that enhance online learning. The challenges instructors may face when evaluating video content also warrant further study.

Appendix

Author Biographies

Hanadi Hamadi, Ph.D., MHA is an associate professor in the department of health administration at the University of North Florida. She earned her Ph.D. from the University of South Carolina. Her research agenda focuses on the evaluation of health outcome initiatives (HOIs), with an emphasis on the cost effectiveness and policy impact of social-determinants-focused health outcomes; the relationship between HOIs and population health; and the evaluation of state-by-state Medicaid-related policies and their impact on reimbursement, physician behavior, and cost-containment efforts.

Aurora Tafili, MHA, MBA is a Ph.D. student at the University of Alabama at Birmingham. Her research interests include health outcome initiatives and value-based healthcare delivery, as well as the application of mixed methods approaches toward health outcome improvement interventions.

Frederick R. Kates III, Ph.D., MBA serves as a clinical assistant professor in the department of health services research, management and policy in the College of Public Health at the University of Florida. Dr. Kates received his doctoral degree in health services policy and management from the University of South Carolina.

Samantha Larson, MPH is a Ph.D. student in the department of health services research, management, and policy at the University of Florida (UF). Prior to joining UF, Samantha earned her Bachelor of Science in nutrition science from the University of Minnesota and her Master of Public Health, with a concentration in health policy, from Creighton University. Ms. Larson spent several years working in government affairs and strategic planning for BlueCross BlueShield of Vermont.

Carlyn Ellison, MPH, CPH is a Ph.D. student in the department of occupational therapy at the University of Florida. She earned her Bachelor of Science in health sciences and Master of Public Health from the University of Florida.

Jihee Song, Ph.D., MPH is a research associate in the department of family, youth and community sciences in the Institute of Food and Agricultural Sciences at the University of Florida.

Data Availability

Data available on request from the authors.

Declarations

Ethics Approval

This study was performed in line with the principles of the Declaration of Helsinki. The questionnaire and methodology for this study was approved by the Institutional Review Board at The University of Florida (IRB-20–1,800,997).

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Conflict of Interest

The authors declare that they have no conflict of interest.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Hanadi Hamadi, Email: h.hamadi@unf.edu.

Aurora Tafili, Email: atafili@uab.edu.

Frederick R. Kates, Email: kates.rick@phhp.ufl.edu

Samantha A. Larson, Email: samantha.larson@ufl.edu

Carlyn Ellison, Email: carlynellison@phhp.ufl.edu.

Jihee Song, Email: ssong@ufl.edu.

References

  1. Adinda D, Mohib N. Teaching and instructional design approaches to enhance students’ self-directed learning in blended learning environments. Electronic Journal of eLearning. 2020;18(2):162–174. [Google Scholar]
  2. Allen, I. E., & Seaman, J. (2010). Learning on Demand: Online Education in the United States, 2009.
  3. Armstrong GR, Tucker JM, Massad VJ. Achieving learning goals with student-created podcasts. Decision Sciences Journal of Innovative Education. 2009;7(1):149–154. doi: 10.1111/j.1540-4609.2008.00209.x. [DOI] [Google Scholar]
  4. Awada, G. M., & Diab, N. M. (2021). Effect of online peer review versus face-to-Face peer review on argumentative writing achievement of EFL learners. Computer Assisted Language Learning, 1–19.
  5. Bai, H. (2018). Preparing Teacher Education Students to Use Instructional Technology in an Asynchronous Blended Course. Innovative Practices in Teacher Preparation and Graduate-Level Teacher Education Programs, 603–619. 10.4018/978-1-5225-3068-8.ch031
  6. Bailey, D., Almusharraf, N., & Hatcher, R. (2020). Finding satisfaction: intrinsic motivation for synchronous and asynchronous communication in the online language learning context. Educ Inf Technol (Dordr), 1–21. 10.1007/s10639-020-10369-z [DOI] [PMC free article] [PubMed]
  7. Barker, T., & Bennett, S. (2011). Marking complex assignments using peer assessment with an electronic voting system and an automated feedback tool. International Journal of e-Assessment.
  8. Bliss CA, Lawrence B. From posts to patterns: A metric to characterize discussion board activity in online courses. Journal of Asynchronous Learning Networks. 2009;13(2):15–32. [Google Scholar]
  9. Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: Handbook I, cognitive domain. New York: David McKay.
  10. Caskurlu S, Richardson JC, Maeda Y, Kozan K. The qualitative evidence behind the factors impacting online learning experiences as informed by the community of inquiry framework: A thematic synthesis. Computers & Education. 2021;165:104111. doi: 10.1016/j.compedu.2020.104111. [DOI] [Google Scholar]
  11. Christopher R, de Tantillo L, Watson J. Academic caring pedagogy, presence, and Communitas in nursing education during the COVID-19 pandemic. Nursing Outlook. 2020;68(6):822–829. doi: 10.1016/j.outlook.2020.08.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Clouse SF, Evans GE. Graduate Business Students Performance with Synchronous and Asynchronous Interaction e-Learning Methods. Decision Sciences Journal of Innovative Education. 2003;1(2):181–202. doi: 10.1111/j.1540-4609.2003.00017.x. [DOI] [Google Scholar]
  13. Cuddy P, Oki J, Wooten J. Online peer evaluation in basic pharmacology. Academic Medicine. 2001;76(5):532–533. doi: 10.1097/00001888-200105000-00070. [DOI] [PubMed] [Google Scholar]
  14. Dahlstrom, E., Brooks, D. C., & Bichsel, J. (2014). The current ecosystem of learning management systems in higher education: Student, faculty, and IT perspectives.
  15. Delaney D, Kummer TF, Singh K. Evaluating the impact of online discussion boards on student engagement with group work. British Journal of Educational Technology. 2019;50(2):902–920. doi: 10.1111/bjet.12614. [DOI] [Google Scholar]
  16. Dennen VP. From message posting to learning dialogues: Factors affecting learner participation in asynchronous discussion. Distance Education. 2005;26(1):127–148. doi: 10.1080/01587910500081376. [DOI] [Google Scholar]
  17. Dohrenwend A. Serving up the feedback sandwich. Family Practice Management. 2002;9(10):43. [PubMed] [Google Scholar]
  18. Einarsson ÁM, Hertzum M. How is learning scaffolded in library makerspaces? International Journal of Child-Computer Interaction. 2020;26:100199. doi: 10.1016/j.ijcci.2020.100199. [DOI] [Google Scholar]
  19. Fiock H. Designing a community of inquiry in online courses. The International Review of Research in Open and Distributed Learning. 2020;21(1):135–153. doi: 10.19173/irrodl.v20i5.3985. [DOI] [Google Scholar]
  20. Gao X, Li P, Shen J, Sun H. Reviewing assessment of student learning in interdisciplinary STEM education. International Journal of STEM Education. 2020;7(1):1–14. doi: 10.1186/s40594-020-00225-4. [DOI] [Google Scholar]
  21. Gaynor JW. Peer review in the classroom: Student perceptions, peer feedback quality and the role of assessment. Assessment & Evaluation in Higher Education. 2020;45(5):758–775. doi: 10.1080/02602938.2019.1697424. [DOI] [Google Scholar]
  22. Geng S, Law KM, Niu B. Investigating self-directed learning and technology readiness in blending learning environment. International Journal of Educational Technology in Higher Education. 2019;16(1):1–22. doi: 10.1186/s41239-019-0147-0. [DOI] [Google Scholar]
  23. Gonzalez CM, Walker SA, Rodriguez N, Noah YS, Marantz PR. Implicit Bias Recognition and Management in Interpersonal Encounters and the Learning Environment: A Skills-Based Curriculum for Medical Students. MedEdPORTAL. 2021;17:11168. doi: 10.15766/mep_2374-8265.11168. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Greene, H., & Crespi, C. (2012). The value of student created videos in the college classroom- an exploratory study in marketing and accounting. International Journal of Arts & Sciences, 5(1).
  25. Gunnlaugson O. Generative Dialogue as a Transformative Learning Practice in Adult and Higher Education Settings. Journal of Adult and Continuing Education. 2006;12(1):2–19. doi: 10.7227/JACE.12.1.2. [DOI] [Google Scholar]
  26. Han F, Ellis RA. Identifying consistent patterns of quality learning discussions in blended learning. The Internet and Higher Education. 2019;40:12–19. doi: 10.1016/j.iheduc.2018.09.002. [DOI] [Google Scholar]
  27. Jowsey T, Foster G, Cooper-Ioelu P, Jacobs S. Blended learning via distance in pre-registration nursing education: A scoping review. Nurse Education in Practice. 2020;44:102775. doi: 10.1016/j.nepr.2020.102775. [DOI] [PMC free article] [PubMed] [Google Scholar]
  28. Kamin C, Glicken A, Hall M, Quarantillo B, Merenstein G. Evaluation of Electronic Discussion Groups as a Teaching/Learning Strategy in an Evidence-based Medicine Course: A Pilot Study. Education for Health: Change in Learning & Practice. 2001;14(1):21–32. doi: 10.1080/13576280010015380. [DOI] [PubMed] [Google Scholar]
  29. Kates FR, Byrd MD, Haider MR. Every Picture Tells a Story: The Power of 3 Teaching Method. Journal of Educators Online. 2015;12(1):189–211. doi: 10.9743/JEO.2015.1.1. [DOI] [Google Scholar]
  30. Kates, F. R., Hamadi, H., Kates, M. M., Larson, S. A., & Audi, G. R. (2018). Integrating Technology: An innovative approach to improving online discussion boards. eLearn, 2018(12).
  31. Kay, R. (2006). Using asynchronous online discussion to learn introductory programming: An exploratory analysis. Canadian Journal of Learning and Technology/La revue canadienne de l’apprentissage et de la technologie, 32(1).
  32. Kimberling, A., & Akwafuo, S. (2023). A Comprehensive Virtual Classroom Dashboard. Paper presented at the Proceedings of Seventh International Congress on Information and Communication Technology.
  33. Krentler KA, Willis-Flurry LA. Does technology enhance actual student learning? The case of online discussion boards. Journal of Education for Business. 2005;80(6):316–321. doi: 10.3200/JOEB.80.6.316-321. [DOI] [Google Scholar]
  34. Kumar V. Making "freemium" work. Harvard Business Review. 2014;92(5):27–29. [Google Scholar]
  35. Lam CNC, Habil H, Sahari NB. Exploring the use of video-annotated peer feedback in oral presentation lessons. International Journal of Innovation and Learning. 2022;32(4):474–497. doi: 10.1504/IJIL.2022.126639. [DOI] [Google Scholar]
  36. Lapitan LD, Jr, Tiangco CE, Sumalinog DAG, Sabarillo NS, Diaz JM. An effective blended online teaching and learning strategy during the COVID-19 pandemic. Education for Chemical Engineers. 2021;35:116–131. doi: 10.1016/j.ece.2021.01.012. [DOI] [Google Scholar]
  37. Lespiau F, Tricot A. Using primary knowledge: An efficient way to motivate students and promote the learning of formal reasoning. Educational Psychology Review. 2019;31(4):915–938. doi: 10.1007/s10648-019-09482-4. [DOI] [Google Scholar]
  38. Lim J, Pellett HH, Pellett T. Integrating Digital Video Technology in the Classroom. Journal of Physical Education, Recreation & Dance. 2009;80(6):40–55. doi: 10.1080/07303084.2009.10598339. [DOI] [Google Scholar]
  39. Lin, X., Sun, Q., & Zhang, X. (2021). Using learners’ self-generated quizzes in online courses. Distance Education, 1–19.
  40. Loeckx J. Blurring boundaries in education: Context and impact of MOOCs. International Review of Research in Open and Distributed Learning. 2016;17(3):92–121. doi: 10.19173/irrodl.v17i3.2395. [DOI] [Google Scholar]
  41. Mandernach BJ, Gonzales RM, Garrett AL. An examination of online instructor presence via threaded discussion participation. Journal of Online Learning and Teaching. 2006;2(4):248–260. [Google Scholar]
  42. Mazzolini M, Maddison S. Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education. 2003;40(3):237–253. doi: 10.1016/S0360-1315(02)00129-X. [DOI] [Google Scholar]
  43. McLeod, S. A. (2012). Zone of proximal development. Simply psychology.
  44. Milovic, A., & Dingus, R. (2021). How to not disappear completely: using video-based discussions to enhance student social presence in an online course. Marketing Education Review, 1–11.
  45. Min H-T. Training students to become successful peer reviewers. System. 2005;33(2):293–308. doi: 10.1016/j.system.2004.11.003. [DOI] [Google Scholar]
  46. National Center for Education Statistics. (2021). The NCES fast facts.
  47. Nisbet D. Measuring the quantity and quality of online discussion group interaction. Journal of eLiteracy. 2004;1(2):122–139. [Google Scholar]
  48. Ochoa, S. F., Pino, J. A., Baloian, N., Antunes, P., & Herskovic, V. (2012). Some observations from the analysis of an online discussion board. Paper presented at the 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC).
  49. Omar H, Khan SA, Toh CG. Structured student-generated videos for first-year students at a dental school in Malaysia. Journal of Dental Education. 2013;77(5):640–647. doi: 10.1002/j.0022-0337.2013.77.5.tb05514.x. [DOI] [PubMed] [Google Scholar]
  50. Pool, J., Reitsma, G. M., & Van den Berg, D. N. (2017). Revised community of inquiry framework: Examining learning presence in a blended mode of delivery.
  51. Qualtrics. (2018). Qualtrics. In Provo, UT, USA.
  52. Russo-Gleicher R. Qualitative insights into faculty use of student support services with online students at risk: Implications for student retention. Journal of Educators Online. 2013;10(1):1–32. doi: 10.9743/JEO.2013.1.4. [DOI] [Google Scholar]
  53. Rutner S, Scott R. Use of Artificial Intelligence to Grade Student Discussion Boards: An Exploratory Study. Information Systems Education Journal. 2022;20(4):4. [Google Scholar]
  54. Setia MS. Methodology Series Module 3: Cross-sectional Studies. Indian Journal of Dermatology. 2016;61(3):261–264. doi: 10.4103/0019-5154.182410. [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Soffer T, Cohen A. Students' engagement characteristics predict success and completion of online courses. Journal of Computer Assisted Learning. 2019;35(3):378–389. doi: 10.1111/jcal.12340. [DOI] [Google Scholar]
  56. Suler J. In class and online: Using discussion boards in teaching. CyberPsychology & Behavior. 2004;7(4):395–401. doi: 10.1089/cpb.2004.7.395. [DOI] [PubMed] [Google Scholar]
  57. Testa D, Egan R. How useful are discussion boards and written critical reflections in helping social work students critically reflect on their field education placements? Qualitative Social Work. 2016;15(2):263–280. doi: 10.1177/1473325014565146. [DOI] [Google Scholar]
  58. Thomas MJ. Learning within incoherent structures: The space of online discussion forums. Journal of Computer Assisted Learning. 2002;18(3):351–366. doi: 10.1046/j.0266-4909.2002.03800.x. [DOI] [Google Scholar]
  59. Tichon, M., & Seat, E. (2004). Team toolbox: activities & suggestions for facilitating project teams. Paper presented at the 34th Annual Frontiers in Education Conference (FIE 2004).
  60. Velez JJ, Cano J, Whittington MS, Wolf KJ. Cultivating Change through Peer Teaching. Journal of Agricultural Education. 2011;52(1):40–49. doi: 10.5032/jae.2011.01040. [DOI] [Google Scholar]
  61. Vygotsky, L. S. (1978). Mind in society (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.).
  62. Wijekumar, K. K., & Spielvogel, J. (2006). Intelligent discussion boards©: Promoting deep conversations in asynchronous discussion boards through synchronous support. Campus-Wide Information Systems.
  63. Wu D, Hiltz SR. Predicting learning from asynchronous online discussions. Journal of Asynchronous Learning Networks. 2004;8(2):139–152. [Google Scholar]
  64. Wu Y, Schunn CD. The Effects of Providing and Receiving Peer Feedback on Writing Performance and Learning of Secondary School Students. American Educational Research Journal. 2021;58(3):492–526. doi: 10.3102/0002831220945266. [DOI] [Google Scholar]
  65. Yang Y-TC, Wu W-CI. Digital storytelling for enhancing student academic achievement, critical thinking, and learning motivation: A year-long experimental study. Computers & Education. 2012;59(2):339–352. doi: 10.1016/j.compedu.2011.12.012. [DOI] [Google Scholar]
  66. Yu L-T. Native English-Speaking Teachers' Perspectives on Using Videoconferencing in Learning English by Taiwanese Elementary-School Students. JALT CALL Journal. 2018;14(1):61–76. doi: 10.29140/jaltcall.v14n1.224. [DOI] [Google Scholar]
  67. Yu F-Y, Chen C-Y. Student-versus teacher-generated explanations for answers to online multiple-choice questions: What are the differences? Computers & Education. 2021;173:104273. doi: 10.1016/j.compedu.2021.104273. [DOI] [Google Scholar]
  68. Yunusa AA, Umar IN. A scoping review of critical predictive factors (CPFs) of satisfaction and perceived learning outcomes in E-learning environments. Education and Information Technologies. 2021;26(1):1223–1270. doi: 10.1007/s10639-020-10286-1. [DOI] [Google Scholar]
  69. Zhai, X., Chu, X., Chai, C. S., Jong, M. S. Y., Istenic, A., Spector, M., . . . Li, Y. (2021). A Review of Artificial Intelligence (AI) in Education from 2010 to 2020. Complexity, 2021.
  70. Zhang D, Zhou L, Briggs RO, Nunamaker JF. Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management. 2006;43(1):15–27. doi: 10.1016/j.im.2005.01.004. [DOI] [Google Scholar]
  71. Zhu E. Interaction and cognitive engagement: An analysis of four asynchronous online discussions. Instructional Science. 2006;34(6):451–480. doi: 10.1007/s11251-006-0004-0. [DOI] [Google Scholar]



Articles from Techtrends are provided here courtesy of Nature Publishing Group
