J Microbiol Biol Educ. 2022 Oct 13;23(3):e00143-22. doi: 10.1128/jmbe.00143-22

A Lesson from the Pandemic: Utilizing Digital Tools To Support Student Engagement during Instructional Assistant-Led Sessions

Christina N. Morra, Robert Fultz, Samiksha A. Raut
Editor: Pamela Ann Marshall
PMCID: PMC9753610  PMID: 36532207

ABSTRACT

Student instructional assistants (IAs) are an integral part of most students’ college experience in higher education. When properly trained, IAs can improve students’ grades, engagement with course content, persistence, and retention. Recently, the COVID-19 pandemic forced the transition of nearly all instructional practices online. At the University of Alabama at Birmingham, IAs, including Biology Learning Assistants (BLAs), began hosting their instructional sessions virtually, outside of class time. The goals of these sessions were to reinforce fundamental concepts using active learning strategies and to address student questions by building a supportive learning community. In this article, we summarize the training and guidance we provided to the BLAs regarding how best to adapt digital educational tools to engage students during their virtual sessions. We recommend that institutions of higher education recognize the expansion of digital educational tools as an opportunity to increase the technological literacy and competence of their IAs to best serve their student body in this increasingly digital age of education.

KEYWORDS: teaching assistants, digital tools, active learning, learning assistants, training

PERSPECTIVE

Student instructional assistants (IAs) are widely used for both primary and supplemental instruction in higher education. While the research is still sparse, broadly speaking, IAs have a positive impact on students (1–4). IAs’ status as peers or near-peers allows them to create supportive learning environments (5–9) while facilitating higher mean course grades and higher retention and persistence rates (10, 11). The term IA can refer collectively to any or all of the following: teaching assistants (TAs), supplemental instruction leaders (SIs), or learning assistants (LAs). While different institutions use these terms differently, here the terms are defined by the following roles: (i) TAs and SIs host instructional sessions outside of class; (ii) TAs have a role in grading, while SIs do not; and (iii) LAs are in-class peer assistants with no role in grading.

There are no available data on the proportion of IA sessions offered online before the coronavirus disease 2019 (COVID-19) pandemic, though the percentage of undergraduate students in the United States enrolled in distance education courses increased from 15.6% to 43.1% between 2004 and 2016 (12). The pandemic further increased the proportion of courses offered via remote instruction (13). Because the efficacy of remote instruction depends on student engagement (14–16) and a student’s experience is substantially shaped by their IAs (1–11), institutions need to train their IAs in digital engagement tools to support this increased digital learning. Student engagement is itself essential to maintaining student interest in learning (17), satisfaction (18, 19), persistence in college (20–23), and scholastic achievement (20, 24), as well as critical thinking, problem solving, inclination to inquire, and intercultural effectiveness (25). Finally, student engagement has a reciprocally positive impact on instructor motivation (26). Cumulatively, the efficacy of, and performance in, digital instruction relies on the quality of interactions, particularly the frequency and effectiveness of each interaction (27). During the recent COVID-19-induced increase in online learning, a marked decrease in student engagement was observed (28).

Given the importance of student engagement and the unique ability of IAs to function as peer or near-peer mentors, IAs are ideally placed to rectify the reduced student engagement in online instruction. The University of Alabama at Birmingham (UAB) uses a unique near-peer instructional model, Biology Learning Assistants (BLAs), adapted from the University of Colorado at Boulder’s Learning Assistant Alliance (2, 29). At UAB, BLAs receive (i) instruction in evidence-based educational pedagogy, (ii) practical training in student engagement techniques, and (iii) training in diversity, equity, and inclusion strategies, all while hosting student review sessions for introductory-level courses. When the BLA initiative was founded at UAB in 2017, BLAs not only assisted within the courses but also hosted in-person sessions outside of class times. However, since the onset of the pandemic and the shelter-in-place ordinance in Birmingham, AL, in March 2020 (30), the BLA sessions have taken place outside of class time, synchronously, via the online platform Zoom. After reviewing informal feedback from both the students and the BLAs at the end of the Spring 2020 semester, we concluded that the BLAs needed specific training in digital engagement tools to better prepare and support students during online sessions. We spent the summer of 2020 reviewing the available digital tools and the relevant research on their efficacy. We used this information to develop active learning-based training, during which the BLAs were introduced to, brainstormed uses for, and practiced using these tools in mock BLA sessions. With that in mind, our goal here is to share the resources we gathered for institutions that are looking to develop digital tools training for their own IAs (Table 1). While no list can be completely exhaustive, these are, in our experience, the current tools that best balance ease of use with the value they add to instruction. Broadly, we have categorized the tools we are highlighting into four types: audience response systems (ARSs), free-form collaborative platforms, structured collaborative tools, and study tools, though some tools can belong to more than one category.

TABLE 1.

Summary of reviewed digital tools, including pros, cons, and relevant references

Category | Tools | Description | Pros | Cons | References
Audience response systems | iClicker, Poll Everywhere, Kahoot!, Quizizz, Baamboozle, Pear Deck, Wordwall | Prepared quizzes that can be tracked or answered anonymously | Formative assessments provide real-time feedback; easy to set up and use; uses minimal class time; less stressful than other active learning methods | Feedback is limited by the multiple-choice format; clicker remotes are not free; certain features in Poll Everywhere and Kahoot! require paid plans | 34, 41, 42, 51–61
Free-form collaborative platforms | Jamboard, MURAL, Google Slides, Autodesk Sketchbook, Adobe Fresco, Paper, Animation Desk, Tayasui Sketches | Digital “whiteboards” with diverse ways to add material to those spaces | Multiple participants can participate simultaneously; open format allows for creativity | Blank spaces may require more guidance to facilitate productive engagement; MURAL requires a paid subscription | 62–71
Structured collaborative spaces | Padlet, Google Sheets, Google Docs, video conferencing platforms, ELNs, ELM systems(a) | Web-based systems with formatting that guides usage | Multiple participants can participate simultaneously; interface guides usage; structure reduces need for training | Structure limits potential uses; certain features in Padlet require a paid plan; ELNs and ELMs often require paid subscriptions | 72–75
Study tools | AnkiDecks, Piazza, Beyond ELMs, GroupMe, Slack, Flipgrid, Padlet, MURAL | Digital platforms for content review or communication with instructional staff | Customizable; some available as phone apps; encourages students to communicate with instructional staff and to review course material regularly | Students need to be encouraged to use the platform; some pre-prepared decks in AnkiDecks require payment | 76–86
(a) ELN, electronic lab notebook; ELM, electronic learning management system.

DIGITAL TOOLS TO INCREASE ENGAGEMENT IN IA SESSIONS

Audience response systems

One of the most extensively used and studied types of digital education tools is the audience response system (ARS), also referred to as a student response system. With these systems, multiple-choice quizzes or polls are prepared ahead of time; students then reply using hand-held devices (phones, tablets, or system-specific clickers). These tools can be set up to capture anonymous or identifiable responses.
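To make those mechanics concrete, below is a minimal sketch, in Python, of what an ARS does behind the scenes: a prepared multiple-choice question that tallies live responses and can run in anonymous or identifiable mode. This is an illustrative toy model of our own, not any vendor’s actual API; the names Poll, respond, and results are hypothetical.

```python
# Toy model of an audience response poll: a prepared multiple-choice
# question with a live tally and optional anonymity. Illustrative only.
from collections import Counter
from typing import Optional

class Poll:
    def __init__(self, question: str, choices: list[str], anonymous: bool = True):
        self.question = question
        self.choices = choices
        self.anonymous = anonymous
        self.tally: Counter[str] = Counter()   # live per-choice counts
        self.respondents: dict[str, str] = {}  # used only in identifiable mode

    def respond(self, choice: str, student_id: Optional[str] = None) -> None:
        """Record one student's answer from a phone, tablet, or clicker."""
        if choice not in self.choices:
            raise ValueError(f"{choice!r} is not one of the offered choices")
        self.tally[choice] += 1
        if not self.anonymous and student_id is not None:
            self.respondents[student_id] = choice

    def results(self) -> dict[str, float]:
        """Live percentages for the instructor's formative-assessment view."""
        total = sum(self.tally.values()) or 1  # avoid division by zero
        return {c: 100 * self.tally[c] / total for c in self.choices}

# Example: an anonymous poll during a review session.
poll = Poll("Which macromolecule stores genetic information?",
            ["Protein", "DNA", "Lipid"])
poll.respond("DNA")
poll.respond("Protein")
print(poll.results())  # {'Protein': 50.0, 'DNA': 50.0, 'Lipid': 0.0}
```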

ARSs have been found to improve classroom dynamics, engagement, student motivation, and learning experiences (31–33). These tools are also less anxiety-inducing than other active learning activities (31), and they enhance student attention by generating healthy competition among students via gamification (34–38). Their ease of use and live feedback make ARSs valuable for formative assessment in either in-person or online instruction (39–41).

As with all digital tools, an unstable internet connection is a hindrance to successfully accessing ARSs (42–44). Digital access is influenced by an array of structural inequalities, such as location, household income, and digital literacy (45–47). While institutions are addressing digital access issues in myriad ways, including offering students hardware, software, or hot spot internet access (48) and intentionally training students in digital literacy (49), inequity remains a pervasive problem.

Furthermore, the multiple-choice format of ARSs can lead to disengaged guessing; it is therefore recommended that student responses be followed by a guided discussion of how to identify the correct answer (50). Thus, these tools are best suited to gauging collective understanding rather than gaining nuanced insight into each individual’s comprehension.

The original ARSs were radiofrequency-based systems, such as iClicker (founded in 2000), that required students to purchase ARS-specific hardware to respond (41, 51, 52). The next generation of ARSs, such as Poll Everywhere and Wordwall, are web based; because they can be accessed via computer, tablet, or phone, they are less likely to add a financial burden for students (53–55). In addition to supporting personalized quizzes, newer web-based ARS platforms like Kahoot! and Quizizz include prepared quizzes (42, 56). Pear Deck is an ARS add-on to Google Slides that allows an instructor to host quizzes and interactive slides while saving student responses as a record of the formative assessment (57–61). Whereas the preceding ARSs require respondents to reply individually, Baamboozle functions more like gamified notecards, with only the instructor responding on behalf of student teams (34).

Free-form collaborative platforms

Free-form collaborative platforms provide a “blank slate” with several ways to engage with, and within, the space, allowing multiple students to view and edit it simultaneously. These platforms are accessed via smartphone, tablet, or computer and open onto a blank space into which images, text, shapes, tables, and graphs can be added and arranged collaboratively.

Efficacy research on these newer digital tools is limited; however, early studies have found success in using free-form collaborative platforms to engage students (62–64). Previous work suggests that the visual nature of these platforms contributes substantially to improved student engagement and learning (65, 66). Furthermore, the collaborative potential of these tools suggests that students gain insights from peer input (67).

Common free-form collaborative platforms are Jamboard and MURAL. While Google Slides has elements of structured platforms, the BLAs primarily used it as a free-form platform. In addition to these common platforms, artistic applications can also serve as free-form collaborative platforms. Providing students with experience working in these artistic spaces may improve engagement among artistically inclined students and/or provide an opportunity to gain familiarity with applications valued in the workforce (68–71). These applications include Autodesk’s Sketchbook, Adobe’s Fresco, Paper, Animation Desk, and Tayasui Sketches. All of these platforms are either free or offer a free version sufficient for students reviewing course material.

Structured collaborative tools

In contrast to the free-form tools, the constraints of structured collaborative tools provide guidance on how these programs are best used. These platforms provide a workspace that can be accessed and utilized by multiple students synchronously or asynchronously.

Recently, there has been growing interest in understanding the impact of these tools during remote instruction. One study that evaluated the use of Google Docs as a discussion tool in a remote, upper-level chemistry course found that the platform did not increase higher-level problem-solving skills, but it did support robust small group discussions and successfully provided iterative formative feedback to the student users (72). An earlier study demonstrated that even when classes were in person, Google Docs could be used effectively to facilitate out-of-class, collaborative assignments (73). That study highlighted the need to review key functions and value-added uses of these tools (73). In one creative study, Padlet was used to host a synchronous, virtual class debate between two teams of students (74). Interestingly, while the analysis of this assignment indicated that the students internalized new knowledge, the students themselves perceived minimal learning during the activity (74).

Common structured collaborative digital platforms include Google Sheets and Google Docs, which provide spreadsheet and word processing workspaces, respectively. Video conferencing platforms also fall into this category. While Zoom became nearly ubiquitous during the COVID-19 pandemic (75), other platforms, including Microsoft Teams, Google Meet, and Cisco Webex, were also common. Each platform offers unique features; in general, however, these tools use the webcams and microphones embedded in computers and smartphones to support synchronous video, audio, chat, survey polls, and document exchange. Padlet, another structured collaborative tool, provides a platform where multiple students can contribute “note cards” to its bulletin board-like interface. Various electronic lab notebook (ELN) forums and electronic learning management (ELM) systems also fall under the category of structured collaborative tools.

Study tools

In addition to reviewing information, IAs are often tasked with providing study skill resources to students (76). As such, at UAB the BLA training included information on digital study tools. Study strategies have evolved alongside instruction’s substantial transition from analog to digital. Though there may always be some students who rewrite notes, make flashcards, or create study guides with pen and paper, students are increasingly using digital tools to study.

AnkiDecks is a smartphone application built around flashcards: students can study premade, open-source card “decks” on common topics, such as introductory course material and professional school entrance exams, or decks they make or personalize themselves. Piazza is a learning management system formatted like a social media site; it is designed to encourage students to ask questions, which can be moderated by instructors or IAs. Many of the audience response tools discussed above can also be considered study tools when used to review materials. Finally, tools that facilitate communication between students and IAs allow students to clarify the confusion that frequently arises as they study (77). This confusion, while easy to recognize in students during in-person instruction (78), is more difficult to identify and address during online instruction (79). Beyond email and learning management systems, common communication tools include GroupMe, Slack, and Flipgrid. Padlet and MURAL, described above, can also be used to communicate with instructional staff anonymously.

Broadly, Piazza is an easy-to-use tool (80) that increases student engagement and grades by reinforcing key course concepts (81), particularly for novice learners (82, 83). The algorithm of AnkiDecks builds in interleaving (studying multiple topics together, as opposed to studying one thing and then the next) and spaced repetition. These features have been identified as particularly effective study features (84–86).
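To make those two scheduling ideas concrete, the following is a minimal sketch, in Python, of a spaced-repetition scheduler with an interleaved daily review queue. It is our own drastic simplification, not Anki’s actual algorithm (Anki builds on the SM-2 family of schedulers), and the names Card, review, and todays_queue are hypothetical.

```python
# Illustrative sketch (not Anki's actual code) of a simplified
# spaced-repetition scheduler with an interleaved daily review queue.
from dataclasses import dataclass, field
from datetime import date, timedelta
import random

@dataclass
class Card:
    front: str
    back: str
    interval_days: int = 1                        # days until the card reappears
    ease: float = 2.5                             # growth factor for the interval
    due: date = field(default_factory=date.today)

def review(card: Card, remembered: bool) -> None:
    """Reschedule a card after a review (simplified two-button model)."""
    if remembered:
        # Spaced repetition: each success stretches the interval.
        card.interval_days = max(1, round(card.interval_days * card.ease))
        card.ease = min(3.0, card.ease + 0.05)
    else:
        # A lapse restarts the ladder and makes the card "harder".
        card.interval_days = 1
        card.ease = max(1.3, card.ease - 0.2)
    card.due = date.today() + timedelta(days=card.interval_days)

def todays_queue(decks: dict[str, list[Card]]) -> list[Card]:
    """Interleaving: mix due cards from every deck instead of blocking by topic."""
    due = [card for deck in decks.values()
           for card in deck if card.due <= date.today()]
    random.shuffle(due)
    return due
```

The pedagogical point of the sketch is that forgotten cards return quickly while well-remembered cards recede to longer intervals (spaced repetition), and the daily queue mixes cards from every deck rather than finishing one topic before starting the next (interleaving).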

DIGITAL TOOL TRAINING SESSIONS

Since we introduced digital tools training for the BLAs in the Fall of 2020, the training has evolved each semester based on student feedback and the available literature. Due to COVID-19-dictated safety policy, the BLAs were hosting their instructional sessions exclusively remotely, as group meetings over Zoom; therefore, the trainings were formatted the same way. Over the first 2 weeks of the semester, four 1-hour training sessions were dedicated to digital tools. All BLAs were expected to attend the sessions live, though the sessions were recorded. The recordings were posted to the course Canvas page, both to provide a reference tool for the BLAs to revisit throughout the semester and to allow absent students to complete the training at a later time.

The training sessions used Google Slides as a foundation to provide information on how to access each tool and its specific pros and cons. The bulk of the training time, however, was spent using the tools to host discussions and give the BLAs practical experience with them. Links to the completed training activities remained active and available to the BLAs all semester.

Tools utilized by our BLAs included the following:

  1. Padlet was used as a platform for the BLAs to introduce themselves by making individual “business cards.”

  2. BLAs worked together on a three-way Venn diagram in MURAL to discuss the distinctions between teaching assistants, supplemental instruction leaders, and biology learning assistants, all of which are instructional assistants available to UAB students.

  3. In Jamboard, BLAs communally brainstormed mechanisms to use digital tools to improve student engagement.

  4. BLAs practiced using Google Forms to ask questions ahead of the training sessions.

  5. Kahoot! was used for formative assessment of BLAs’ knowledge of the ethics in mentorship and to spark conversations around that topic.

There were three additional key components to the trainings: (i) BLAs were repeatedly reminded never to pay their own money to access any tool; (ii) BLAs were reminded that their role required them to be accessible to students two hours per week and that they should establish clear boundaries with their students about how, when, and what students could expect from them as their BLA; and (iii) all BLAs were matched with experienced BLA mentors to whom they could bring concerns or questions throughout the semester.

RECOMMENDATIONS

While many courses have transitioned back to in-person learning, the goal should not be to return to the pre-2020 “normal”; rather, we should strive to evaluate the efficacy of the changes that were made and retain the data-driven improvements. We propose that one of these improvements has been increased technological competence with regard to digital educational tools.

Like many institutions of higher education, the University of Alabama at Birmingham transitioned its normal, in-person Spring 2020 semester to online learning mid-semester. All IA sessions were promptly transitioned so that students in our large-enrollment introductory biology courses could continue receiving peer support. As part of this transition, we diversified the BLA training to include digital educational tools, highlighting the evidence that these tools improve student engagement. In Table 1, we have summarized information on many digital educational tools.

Moving forward, we recommend that IAs hosting online student help sessions receive evidence-based training and practice in using these digital tools. For example, for effective use of audience response tools, it is critical that IAs receive training on how to design clear and appropriate questions (42). Furthermore, as the training of IAs expands and IAs begin using these tools, we propose that institutions rigorously evaluate the efficacy of these tools and the associated IA training in the context of their own institution.

In addition to the benefits these tools bring to IA sessions, as the number of professionals working remotely increases, it is critical that students leave higher education with the ability to collaborate digitally (87). Given the ubiquity of web-based systems in the workplace, we propose that students view exposure to these tools as part of their career training. By integrating these tools into IA sessions, both the IAs and the students get valuable career training built into their higher education curriculum.

As institutions look forward, it is vital that instructors, student instructional assistants, and administrators work together to update established expectations and evidence-based strategies for effective and engaging IA sessions. We suggest that, as these conversations occur, the digital tools we have discussed here be considered. While the user-friendly nature of these tools could make training seem unnecessary, in our experience IAs need explicit training in how to effectively and equitably employ these tools to increase student engagement.

By their nature, digital tools have certain technological requirements, including hardware, software, and sufficient internet infrastructure. In recommending the expanded use of these tools, we also highlight the need to expand the equitable availability of these technologies. The academic community now has an opportunity to consider how underserved populations, both in the United States and abroad, can benefit from the current expansion of instructional technologies and how these may be shared with those who have previously lacked access (88).

CONCLUSIONS

The high cost of the COVID-19 pandemic has required us to reflect thoughtfully on what we can learn from the changes that the pandemic forced upon the higher education system. During this time, we developed digital tool training to support the Biology Learning Assistants as they hosted virtual IA sessions at the University of Alabama at Birmingham. We propose that evidence-based digital tool training for IAs become standard at institutions offering online courses. Higher education would be remiss if it failed to retain the valuable innovations developed during the pandemic.

ACKNOWLEDGMENTS

We kindly acknowledge all Biology Learning Assistants who participated in this novel initiative. We appreciate the technical assistance offered by our undergraduate students Diana Bucio, Dasha Cherepovitsky, Cinnamin Cross, Derek Dang, Ryleigh Fleming, Karishma Parbhoo, Thomas Shevlin, and Jordan Wright with our BLA program at UAB.

C.N.M. is supported by the National Institute of General Medical Sciences training grant K12GM088010. This work was supported by National Science Foundation Research Coordination Networks in Undergraduate Biology Education grant 1826988 and a Building a Multicultural Curriculum grant (2020–2021) from the College of Arts and Sciences at the University of Alabama at Birmingham, both awarded to S.A.R.

We declare no conflicts of interest.

Contributor Information

Samiksha A. Raut, Email: sraut@uab.edu.

Pamela Ann Marshall, Arizona State University.

REFERENCES

1. Alzen JL, Langdon LS, Otero VK. 2018. A logistic regression investigation of the relationship between the Learning Assistant model and failure rates in introductory STEM courses. Int J STEM Educ 5:56. doi: 10.1186/s40594-018-0152-1.
2. Barrasso AP, Spilios KE. 2021. A scoping review of literature assessing the impact of the learning assistant model. Int J STEM Educ 8:12. doi: 10.1186/s40594-020-00267-8.
3. Wan T, Geraets AA, Doty CM, Saitta EKH, Chini JJ. 2020. Characterizing science graduate teaching assistants’ instructional practices in reformed laboratories and tutorials. Int J STEM Educ 7:30. doi: 10.1186/s40594-020-00229-0.
4. Kendall KD, Schussler EE. 2013. Evolving impressions: undergraduate perceptions of graduate teaching assistants and faculty members over a semester. CBE Life Sci Educ 12:92–105. doi: 10.1187/cbe.12-07-0110.
5. Whitman NA, Fife JD. 1988. Peer teaching: to teach is to learn twice. ERIC Institute of Education Sciences, U.S. Department of Education, Washington, DC.
6. Evans DJ, Cuffe T. 2009. Near-peer teaching in anatomy: an approach for deeper learning. Anat Sci Educ 2:227–233. doi: 10.1002/ase.110.
7. Irvine S, Williams B, McKenna L. 2018. Near-peer teaching in undergraduate nurse education: an integrative review. Nurse Educ Today 70:60–68. doi: 10.1016/j.nedt.2018.08.009.
8. ten Cate O, van de Vorst I, van den Broek S. 2012. Academic achievement of students tutored by near-peers. Int J Med Educ 3:6–13. doi: 10.5116/ijme.4f0c.9ed2.
9. Williams B, Fowler J. 2014. Can near-peer teaching improve academic performance? Int J Higher Educ 3:142–149. doi: 10.5430/ijhe.v3n4p142.
10. Dawson P, van der Meer J, Skalicky J, Cowley K. 2014. On the effectiveness of supplemental instruction: a systematic review of supplemental instruction and peer-assisted study sessions literature between 2001 and 2010. Rev Educ Res 84:609–639. doi: 10.3102/0034654314540007.
11. Wilson SB, Varma-Nelson P. 2016. Small groups, significant impact: a review of peer-led team learning research with implications for STEM education researchers and faculty. J Chem Educ 93:1686–1702. doi: 10.1021/acs.jchemed.5b00862.
12. Snyder TD, de Brey C, Dillow SA. 2019. Digest of education statistics 2018. NCES report 2020-009. National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, DC.
13. Purcell WM, Lumbreras J. 2021. Higher education and the COVID-19 pandemic: navigating disruption using the sustainable development goals. Discov Sustain 2:6. doi: 10.1007/s43621-021-00013-2.
14. Robinson CC, Hullinger H. 2008. New benchmarks in higher education: student engagement in online learning. J Educ Bus 84:101–109. doi: 10.3200/JOEB.84.2.101-109.
15. Chickering AW, Ehrmann SC. 1996. Implementing the seven principles: technology as lever. Am Assoc Higher Educ Bull 49:3–6.
16. Abou-Khalil V, Helou S, Khalifé E, Chen MA, Majumdar R, Ogata H. 2021. Emergency online learning in low-resource settings: effective student engagement strategies. Educ Sci 11:24. doi: 10.3390/educsci11010024.
17. Berger JB, Milem JF. 1999. The role of student involvement and perceptions of integration in a causal model of student persistence. Res Higher Educ 40:641–664. doi: 10.1023/A:1018708813711.
18. Zimmerman BJ, Kitsantas A. 1997. Developmental phases in self-regulation: shifting from process goals to outcome goals. J Educ Psychol 89:29–36. doi: 10.1037/0022-0663.89.1.29.
19. Filak VF, Sheldon KM. 2008. Teacher support, student motivation, student need satisfaction, and college teacher course evaluations: testing a sequential path model. Educ Psychol 28:711–724. doi: 10.1080/01443410802337794.
20. Fredricks JA, Blumenfeld PC, Paris AH. 2004. School engagement: potential of the concept, state of the evidence. Rev Educ Res 74:59–109. doi: 10.3102/00346543074001059.
21. Hu S, McCormick AC, Gonyea RM. 2012. Examining the relationship between student learning and persistence. Innov Higher Educ 37:387–395. doi: 10.1007/s10755-011-9209-5.
22. Kuh GD. 2003. What we’re learning about student engagement from NSSE: benchmarks for effective educational practices. Change: The Magazine of Higher Learning 35:24–32. doi: 10.1080/00091380309604090.
23. Kuh GD, Cruce TM, Shoup R, Kinzie J, Gonyea RM. 2008. Unmasking the effects of student engagement on first-year college grades and persistence. J Higher Educ 79:540–563. doi: 10.1080/00221546.2008.11772116.
24. Pascarella ET, Terenzini PT. 2005. How college affects students: a third decade of research, vol 2. ERIC Institute of Education Sciences, Washington, DC.
25. Pascarella ET, Seifert TA, Blaich C. 2010. How effective are the NSSE benchmarks in predicting important educational outcomes? Change: The Magazine of Higher Learning 42:16–22. doi: 10.1080/00091380903449060.
26. Neves de Jesus S, Lens W. 2005. An integrated model for the study of teacher motivation. Appl Psychol 54:119–134. doi: 10.1111/j.1464-0597.2005.00199.x.
27. Jaggars SS, Xu D. 2016. How do online course design features influence student performance? Comput Educ 95:270–284. doi: 10.1016/j.compedu.2016.01.014.
28. Chen E, Kaczmarek K, Ohyama H. 2021. Student perceptions of distance learning strategies during COVID-19. J Dent Educ 85:1190–1191. doi: 10.1002/jdd.12339.
29. Otero V, Pollock S, Finkelstein N. 2010. A physics department’s role in preparing physics teachers: the Colorado learning assistant model. Am J Phys 78:1218–1224. doi: 10.1119/1.3471291.
30. Birmingham City Council. 2020. Ordinance no. 20-48, an ordinance to establish a “shelter in place order” for the City of Birmingham during the COVID-19 public health emergency. https://www.birminghamal.gov/wp-content/uploads/2020/03/2020.3.24.City-of-Birmingham.Shelter-In-Place-Ordinance.pdf.
31. Adkins-Jablonsky SJ, Shaffer JF, Morris JJ, England B, Raut S. 2021. A tale of two institutions: analyzing the impact of gamified student response systems on student anxiety in two different introductory biology courses. CBE Life Sci Educ 20:ar19. doi: 10.1187/cbe.20-08-0187.
32. Roediger HL, III, Karpicke JD. 2006. The power of testing memory: basic research and implications for educational practice. Perspect Psychol Sci 1:181–210. doi: 10.1111/j.1745-6916.2006.00012.x.
33. Crossgrove K, Curran KL. 2008. Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material. CBE Life Sci Educ 7:146–154. doi: 10.1187/cbe.07-08-0060.
34. Dorrigiv M. 2021. Incorporation of serious games into higher education: a survey. International Serious Games Symposium (ISGS), p 88–90. doi: 10.1109/ISGS54702.2021.9684766.
35. Wang AI. 2015. The wear out effect of a game-based student response system. Comput Educ 82:217–227. doi: 10.1016/j.compedu.2014.11.004.
36. Hanson TL, Drumheller K, Mallard J, McKee C, Schlegel P. 2010. Cell phones, text messaging, and Facebook: competing time demands of today’s college students. College Teach 59:23–30. doi: 10.1080/87567555.2010.489078.
37. Cheong C, Filippou J, Cheong F. 2014. Towards the gamification of learning: investigating student perceptions of game elements. J Info Syst Educ 25:233.
38. Shawn MC, Madeline MC, Sullivan S. 2014. A kaleidoscope career perspective on faculty sabbaticals. Career Dev Int 19:295–313. doi: 10.1108/CDI-04-2013-0051.
39. Ismail MA-A, Mohammad J. 2017. Kahoot: a promising tool for formative assessment in medical education. Educ Med J 9:19–26. doi: 10.21315/eimj2017.9.2.2.
40. Kay RH, LeSage A. 2009. Examining the benefits and challenges of using audience response systems: a review of the literature. Comput Educ 53:819–827. doi: 10.1016/j.compedu.2009.05.001.
41. Sun JC-Y. 2014. Influence of polling technologies on student engagement: an analysis of student motivation, academic performance, and brainwave data. Comput Educ 72:80–89. doi: 10.1016/j.compedu.2013.10.010.
42. Wang AI, Tahir R. 2020. The effect of using Kahoot! for learning: a literature review. Comput Educ 149:103818. doi: 10.1016/j.compedu.2020.103818.
43. Zhu E. 2007. Teaching with clickers. CRLT Occasional Papers, vol 22. Center for Research on Learning and Teaching, University of Michigan, Ann Arbor, MI.
44. Willems J, Farley H, Campbell C. 2019. The increasing significance of digital equity in higher education: an introduction to the digital equity special issue. Australas J Educ Technol 35:1–8. doi: 10.14742/ajet.5996.
45. Warschauer M. 2004. Technology and social inclusion: rethinking the digital divide. MIT Press, Cambridge, MA.
46. Vogels EA. 2021. Digital divide persists even as Americans with lower incomes make gains in tech adoption. Pew Research Organization, Philadelphia, PA. https://pewrsr.ch/2TRM7cP.
47. Lissitsa S, Madar G. 2018. Do disabilities impede the use of information and communication technologies? Findings of a repeated cross-sectional study, 2003–2015. Isr J Health Policy Res 7:66. doi: 10.1186/s13584-018-0260-x.
48. Wood S. 2021. How colleges are bridging the digital divide. U.S. News and World Report. https://www.usnews.com/education/best-colleges/articles/how-colleges-are-bridging-the-digital-divide.
49. Kajeet. 2021. Tackling the digital divide in higher education: hurdles and opportunities. https://f.hubspotusercontent10.net/hubfs/367813/Digital%20Divide%20in%20Higher%20Education%20Whitepaper.pdf.
50. Licorish SA, Owen HE, Daniel B, George JL. 2018. Students’ perception of Kahoot!’s influence on teaching and learning. Res Pract Technol Enhanced Learn 13:9. doi: 10.1186/s41039-018-0078-8.
51. Sun JC-Y, Martinez B, Seli H. 2014. Just-in-time or plenty-of-time teaching? Different electronic feedback devices and their effect on student engagement. J Educ Technol Soc 17:234–244.
52. Martyn M. 2007. Clickers in the classroom: an active learning approach. Educause Q 30:71.
53. Shon H, Smith L. 2011. A review of Poll Everywhere audience response system. J Technol Hum Serv 29:236–245. doi: 10.1080/15228835.2011.616475.
54. Kappers WM, Cutler SL. 2015. Poll Everywhere! Even in the classroom: an investigation into the impact of using PollEverwhere in a large-lecture classroom. Comput Educ J 6:21.
55. Bueno M, Perez F, Valerio R, Areola EMQ. 2022. A usability study on Google site and wordwall.net: online instructional tools for learning basic integration amid pandemic. J Glob Bus Soc Entrepreneurship 7:61–71.
56. Göksün DO, Gürsoy G. 2019. Comparing success and engagement in gamified learning experiences via Kahoot and Quizizz. Comput Educ 135:15–29. doi: 10.1016/j.compedu.2019.02.015.
57. Javed Y, Odhabi H. 2018. Active learning in classrooms using online tools: evaluating Pear-Deck for students’ engagement. 2018 Fifth HCT Information Technology Trends, p 126–131. doi: 10.1109/CTIT.2018.8649515.
58. Haryani F, Ayuningtyas N. 2021. The impact of interactive online learning by Pear Deck during COVID-19 pandemic era. J Phys Conf Ser 1957:012006. doi: 10.1088/1742-6596/1957/1/012006.
59. Tanner KD. 2013. Structure matters: twenty-one teaching strategies to promote student engagement and cultivate classroom equity. CBE Life Sci Educ 12:322–331. doi: 10.1187/cbe.13-06-0115.
60. Mazur E, Hilborn RC. 1997. Peer instruction: a user’s manual. Phys Today 50:68–69. doi: 10.1063/1.881735.
61. Crouch CH, Mazur E. 2001. Peer instruction: ten years of experience and results. Am J Phys 69:970–977. doi: 10.1119/1.1374249.
62. Colwill RM. 2021. A blended CURE: how the pandemic helped me combine the best of remote and in-person teaching. J Res Practice Coll Teach, vol 6. University of Cincinnati, Cincinnati, OH.
63. Sweeney E, Beger A, Reid L. 2021. Google Jamboard for virtual anatomy education. Clin Teach 18:341–347. doi: 10.1111/tct.13389.
64. Roush C, Burmeister AR. 2020. COVID-19 and the central dogma: an activity to improve student learning and engagement. J Microbiol Biol Educ 21:50. doi: 10.1128/jmbe.v21i3.2145.
65. Vazquez JJ, Chiang EP. 2014. A picture is worth a thousand words (at least): the effective use of visuals in the economics classroom. Int Rev Econ Educ 17:109–119. doi: 10.1016/j.iree.2014.08.006.
66. Cook M. 2011. Teachers’ use of visual representations in the science classroom. Sci Educ Int 22:175–184.
67. Novak JD. 2003. The promise of new ideas and new technology for improving teaching and learning. Cell Biol Educ 2:122–132. doi: 10.1187/cbe.02-11-0059.
68. Bloom D. 13 November 2019. Want a better resume or job? Adobe says you should focus on these five things. Forbes, Jersey City, NJ. https://www.forbes.com/sites/dbloom/2019/11/13/want-a-better-resum-or-job-adobe-says-you-should-focus-on-these-five-things/?sh=532b89a036a3.
69. Geller P, Stein J, Du D, Webb JR, Lieberman Z, Shreiber D, Parekkadan B. 2021. Impact of mixed reality presentation on STEM engagement and comprehension: a pilot study on adult scientists. Biomed Eng Educ 1:277–290. doi: 10.1007/s43683-021-00049-w.
70. Fagan S. 2022. Adobe explores in-demand skills. Reworked, San Francisco, CA. https://www.reworked.co/digital-workplace/adobe-explores-in-demand-skills-cloud-services-drive-microsoft-growth-more-news/.
71. Doyle A. 2020. Important computer skills for workplace success. The Balance, New York, NY. https://www.thebalancecareers.com/computer-skills-list-2063738#toc-graphic-design.
72. Tran C, Lamar M. 2020. Fostering small group discussion in an online instrumental analysis course using Google Docs. J Forensic Sci Educ 2. https://jfse-ojs-tamu.tdl.org/jfse/index.php/jfse/article/view/34.
73. Zhou W, Simpson E, Domizi DP. 2012. Google Docs in an out-of-class collaborative writing activity. Int J Teach Learn Higher Educ 24:359–375.
74. Dewitt D, Alias N, Siraj S. 2015. Collaborative learning: interactive debates using Padlet in a higher education institution. International Educational Technology Conference (IETC 2015), 27–29 May 2015, Istanbul, Turkey.
75. Dooley R. 30 September 2020. How Zoom conquered video conferencing. Forbes, Jersey City, NJ. https://www.forbes.com/sites/rogerdooley/2020/09/30/how-zoom-conquered-video-conferencing/?sh=77d397505a97.
76. Fingerson L, Culley AB. 2001. Collaborators in teaching and learning: undergraduate teaching assistants in the classroom. Teach Sociol 29:299–315. doi: 10.2307/1319189.
77. Lodge JM, Kennedy G, Lockyer L, Arguel A, Pachman M. 2018. Understanding difficulties and resulting confusion in learning: an integrative review. Front Educ 3:49. doi: 10.3389/feduc.2018.00049.
78. Hill CFC, Gouvea JS, Hammer D. 2018. Teaching assistant attention and responsiveness to student reasoning in written work. CBE Life Sci Educ 17:ar25. doi: 10.1187/cbe.17-04-0070.
79. Wosnitza M, Volet S. 2005. Origin, direction and impact of emotions in social online learning. Learn Instruct 15:449–464. doi: 10.1016/j.learninstruc.2005.07.009.
80. Renuka T, Chitra C, Pranesha T. 2016. Evolving philosophy of OBE in engineering physics course: a case study. J Eng Educ Transform 29:14–19. doi: 10.16920/jeet/2016/v29i3/85192.
81. Kolluru S, Varughese JT. 2017. Structured academic discussions through an online education-specific platform to improve Pharm.D. students learning outcomes. Curr Pharm Teach Learn 9:230–236. doi: 10.1016/j.cptl.2016.11.022.
82. Smith DH, IV, Hao Q, Dennen V, Tsikerdekis M, Barnes B, Martin L, Tresham N. 2020. Towards understanding online question & answer interactions and their effects on student performance in large-scale STEM classes. Int J Educ Technol High Educ 17:20. doi: 10.1186/s41239-020-00200-7.
83. Sankar P, Gilmartin J, Sobel M. 2015. An examination of belongingness and confidence among female computer science students. SIGCAS Comput Soc 45:7–10. doi: 10.1145/2809957.2809960.
84. Weinstein Y, Madan CR, Sumeracki MA. 2018. Teaching the science of learning. Cogn Res Princ Implic 3:2. doi: 10.1186/s41235-017-0087-y.
85. Nguyen B-PT. 2021. Mobile-assisted vocabulary learning: a review of Anki. J Educ Technol 18:16–21.
86. Seibert Hanson AE, Brown CM. 2020. Enhancing L2 learning through a mobile assisted spaced-repetition tool: an effective but bitter pill? Comput Assist Lang Learn 33:133–155. doi: 10.1080/09588221.2018.1552975.
87. Rainie L, Anderson J. 2017. The future of jobs and jobs training. Pew Research Center, Philadelphia, PA. https://www.pewresearch.org/internet/2017/05/03/the-future-of-jobs-and-jobs-training/.
88. Morra CN, Adkins-Jablonsky SJ, Raut SA. 2021. Leveraging virtual experiences for international professional development opportunities during the pandemic and beyond. J Microbiol Biol Educ 22:ev22i1. doi: 10.1128/jmbe.v22i1.2511.
