Abstract
Flipped instruction using online enrichment is a popular way to enhance active learning in the laboratory setting. Graduate student teaching assistants at the University of California, Irvine flipped an upper-division undergraduate neurobiology and behavior lab using the new online software platform “Rocketmix.” This study compares the impact of pre-lab online instruction (front flipping) and post-lab online instruction (back flipping) on student exam performance. We describe a novel method for unbiased categorization of exam questions by degree of difficulty. Eighteen identically worded questions were evenly distributed across exam versions with either multiple choice instructions (select a single answer) or more challenging multi-choice instructions (select one or more answers). Multi-choice instructions encourage students to consider all distractors and discourage reliance on verbal cues and process-of-elimination techniques. Student performance on multiple choice questions was used to categorize the degree of difficulty of the questions presented in multi-choice format. Our findings reveal that pre-lab instruction resulted in better student performance than post-lab instruction on questions of moderate difficulty. This effect was significant for both male and female students. Student survey data on the flipped lab format are provided, indicating that students appreciated the online instructional modules, finding them both informative and useful during lab exercises and exams.
Keywords: neurobiology, active learning, flipped course, blended laboratory course, online lecture, Rocketmix, pedagogy, neuroscience education
Educators have pursued a variety of course formats that increase active learning while deemphasizing the passive learning that often occurs during traditional lectures. One such course design, the flipped format, describes a learning environment in which students review traditional lecture materials outside the classroom but actively engage with the material while inside the classroom (Sierra, 2010; Bergmann and Sams, 2012).
In flipped courses, instructors commonly use recording software to make narrated videos with embedded interactive components that students can access remotely. This allows instructors to devote classroom sessions to student-centered activities such as peer learning, case study analysis and problem solving (Strayer, 2012; Tucker, 2012; Gajjar, 2013; Sarawagi, 2013; Bergmann and Sams, 2012).
Research into the efficacy of the flipped course design is ongoing. Commonly cited advantages of a flipped classroom are the increased time devoted to peer learning; increased opportunities to assess student learning styles; and the flexibility to adjust the course to students’ learning pace (Fulton, 2012; Herreid and Schiller, 2013). However, the hallmark of the flipped course is the enhanced student engagement it provides through increased active learning.
Active learning requires students to demonstrate a higher-order understanding of course work while giving educators opportunities to provide feedback to students (McCollough and Gremler, 1999). When incorporated effectively into a course, these strategies have been reported to improve the performance of students in the science, technology, engineering, and mathematics (STEM) fields (Freeman et al., 2014; Hake, 1998; Yadav et al., 2007). In biology, these strategies have the added benefit of increasing exam performance (Freeman et al., 2011) while reducing the achievement gap between advantaged and disadvantaged students (Haak et al., 2011). At least one study suggests that active learning improves performance regardless of whether the classroom is flipped or traditionally constructed (Jensen et al., 2015).
Even though active learning appears to be integral to the improvements in student performance seen in flipped courses, it is unclear how additional factors affect student learning. For example, does the type of online presentation affect student performance? Gopal et al. (2010) found that interactive web-based instruction enhanced student performance. However, students report a preference for online instruction that includes both interactive components and passive videos (Cuthrell and Lyon, 2007). Research also suggests that online pre-laboratory modules increase student performance and preparedness (Peteroy-Kelly, 2010). However, it remains an open question whether pre-laboratory and post-laboratory modules are equally efficacious. Moreover, gender differences have been observed in learning style preferences on the VARK (visual, aural, read/write, kinesthetic) questionnaire. These findings suggest that males prefer the multimodal instruction styles typically incorporated into online modules, while females show a preference for single-mode kinesthetic instruction (Wehrwein et al., 2007). Therefore, what considerations should be made regarding gender preferences in flipped course design?
To address these questions, we flipped our Neurobiology (N113L) Lab course with the intent to study optimal delivery of online teaching content. This neurobiology wet lab previously incorporated a traditional 45-minute introductory lecture delivered by a teaching assistant instructor. Graduate students at the University of California, Irvine developed a series of online modules that incorporated both passive videos and interactive questions to replace the traditional live lab lecture.
We provided students access to the instruction modules for three days before or three days after each lab to compare front and back flipping. Historically, student performance in the lab course was assessed with a common multiple choice exam across all lab sections. Because multiple choice testing has been shown to reward partial knowledge and guessing strategies (Burton, 2001), we hypothesized that potential differences between pre- and post-lab online instruction might be masked. Thus, we employed a novel method to increase the accuracy of our assessment. A series of identically worded questions was presented across exam versions as either multiple choice (one answer) or multi-choice (one or more answers). Multi-choice instruction was used to introduce student uncertainty and reduce the use of guessing strategies (Lesage et al., 2013). We also investigated the influence of gender on the effect of pre- vs. post-lab online multimodal instruction.
MATERIALS AND METHODS
Previously, during traditional laboratory meetings, instructors gave a preparatory lecture at the beginning of every 3-hour session. Following the preparatory lecture, students conducted neurobiology laboratory exercises, using detailed instructions provided in a printed manual. During the sessions, students were required to work in groups and complete a set of short response questions provided in their lab manual.
The flipped format was designed to increase the amount of class time for hands-on lab exercises by providing the preparatory lecture material online. Online modules were used to present videos of the traditional preparatory lectures. These videos were chunked into short easy-to-watch segments with embedded multiple choice and multi-choice study questions between segments. Students received one interactive preparatory online module about the laboratory protocol (Lab), and a second module on neurobiology content relevant to the experimental exercises (Lecture).
Online Teaching Modules
The following online modules were designed by graduate student teaching assistants (TAs) from the Department of Neurobiology and Behavior at the University of California, Irvine. They contained lectures, demonstrations, and test questions meant to augment students’ understanding of the experimental procedures and the instructional content related to each neurobiology lab topic.
The course modules below were built using a new online software platform called Rocketmix (http://www.rocketmix.com/) that enables instructors to easily author and customize online teaching interventions for their students. The software allowed the teaching assistants to upload recorded videos containing course content. These videos were segmented into 3–5 minute clips, separated by embedded assessments in the form of multiple choice, multi-choice, and free response questions.
- Nerve Conductance A: Electrophysiology Lecture A; Cockroach Electrophysiology Lab A
- Nerve Conductance B: Electrophysiology Lecture B; Cockroach Electrophysiology Lab B
- Scientific Communication: Scientific Communication Lecture C; Scientific Communication Lab C
- Pharmacology: Ileum Lecture D; Ileum Pharmacology Lab D
- Habituation and Memory: C. elegans Behavior Lab E; C. elegans Behavior Lecture E
- Relationship between Brain and Behavior: Online Rat Behavior Lecture F; Online Rat Behavior Lab F
- Neuroimaging: Neuroimaging and EEG Lecture G; EEG Sleep Lab G
The Rocketmix platform provides detailed information on individual student performance for each module. Instructors were given access to information that included the amount of time students spent completing the module, the date of completion, how well students scored on questions, the number of attempts students made to answer each question, quantitative student ratings of module effectiveness and student comments.
Experiment 1 - Design
Upper-division biology majors who chose to enroll in the neurobiology and behavior course were randomly assigned to five lab sections of 20 students each. Students worked together in pairs or groups of four, depending on the laboratory experiment. Each laboratory section was monitored by a different graduate student TA. TAs were knowledgeable about the laboratory procedures, related content, and the experimental design.
All students received a manual that provided written procedures and content information along with questions that students were required to answer during the laboratory session. In addition, all students were given access to the Rocketmix online teaching modules described above. Students were required to complete the relevant online tutorials before coming to the lab meetings.
Three of the five sections were required to complete the online modules and also sit through a traditional in-class preparatory PowerPoint lecture before conducting the laboratory exercise (Modules + Slide Lectures), while the two remaining sections only received the online modules (Modules alone).
Experiment 1 - Assessment
All sections attended a common final exam. The two examination versions, each consisting of 50 multiple choice questions, differed only in question order and in the order of possible answers within each question. Examination versions were distributed randomly across students from all laboratory sections. Examination performance for students who received Modules + Slide Lectures was compared with that of students who received Modules alone.
Experiment 2 - Design
A group of upper-division biology majors (different from those who participated in Experiment 1) were enrolled in the neurobiology and behavior laboratory course and randomly assigned to four sections of 22 students each. Each section was overseen by a knowledgeable graduate student TA. All students were provided a written manual as in Experiment 1 (described above). No traditional PowerPoint lectures were provided.
Pre-Laboratory and Post-Laboratory Module Delivery
In this experiment, students were only allowed access to the online modules for three days before (Pre-Lab) or three days after (Post-Lab) the laboratory exercise. The modules presented in this experiment were identical to those used in Experiment 1. Online module access was controlled such that each student received the full set of online modules, but half as Pre-Lab tutorials and half as Post-Lab tutorials (Fig. 1). The online modules served to either prepare students to better understand and conduct the lab procedure (Pre-Lab) or reinforce the lab content and experience after it was conducted (Post-Lab). Students were required to complete the online modules in order to receive full credit for the laboratory exercise.
Figure 1.
The above figure illustrates the distribution of pre- vs. post-lab completion of online module tutorials across four sections of 22 students each (A–D). A different lab was taught each week for six weeks.
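For concreteness, this counterbalancing can be expressed programmatically. The following Python sketch is illustrative only: it assumes a simple alternating scheme in which half of the sections receive a given week’s module before lab and half after, whereas the study’s actual assignment is the one shown in Fig. 1.

```python
# Sketch of a counterbalanced pre-/post-lab module schedule (assumed
# alternating scheme, for illustration; see Fig. 1 for the actual one).
SECTIONS = ["A", "B", "C", "D"]
N_WEEKS = 6

schedule = {}
for week in range(N_WEEKS):
    for i, section in enumerate(SECTIONS):
        # Alternate which pair of sections receives Pre-Lab access, so
        # each section completes three modules pre-lab and three post-lab.
        pre = (i % 2) == (week % 2)
        schedule[(section, week + 1)] = "Pre-Lab" if pre else "Post-Lab"

for section in SECTIONS:
    print(section, [schedule[(section, w)] for w in range(1, N_WEEKS + 1)])
```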
Experiment 2 - Assessment
Students were assessed on final examination performance on 18 target examination questions covering each of the six lab topics. Each of the 18 questions had one correct answer and four incorrect distractor options. These target questions were assigned randomly across four exam versions so that some students received them as multiple choice questions and some students received them as multi-choice questions. The questions were distributed so that each student received the same number of multiple choice and multi-choice assessment questions, and all 18 questions were presented to each student in one format or the other. One possible construction of such balanced exam versions is sketched below.
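The sketch below is a hypothetical reconstruction, not the authors’ code. It pairs exam versions so that question formats are complementary within each pair, guaranteeing that every question appears in both formats across the version set and that every student answers nine questions in each format.

```python
import random

# Hypothetical reconstruction: build four exam versions so that each
# target question appears in exactly one format per version and every
# student answers 9 multiple choice (MC) and 9 multi-choice (MCh)
# target questions.
questions = list(range(1, 19))  # the 18 target question IDs

def make_version_pair(rng):
    """Return two complementary format maps covering all 18 questions."""
    shuffled = questions[:]
    rng.shuffle(shuffled)
    mc_half = set(shuffled[:9])
    v1 = {q: ("MC" if q in mc_half else "MCh") for q in questions}
    v2 = {q: ("MCh" if q in mc_half else "MC") for q in questions}
    return v1, v2

rng = random.Random(42)  # fixed seed so the example is reproducible
versions = [*make_version_pair(rng), *make_version_pair(rng)]
for i, version in enumerate(versions, 1):
    mc = sorted(q for q, fmt in version.items() if fmt == "MC")
    print(f"Version {i}: MC questions {mc}")
```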
Multiple choice instructions asked students to “Choose the best answer,” whereas multi-choice instructions directed students to “Choose the best answer or answers.” Regardless of instruction, there was only one correct answer. Multi-choice instruction was used to increase uncertainty about the correct answer by implying that one or more distractors might also be correct.
Multiple Choice questions were scored 1 if the answer was correct or 0 if the answer was incorrect. Multi-choice questions were scored 1 if only the correct answer was chosen, 0.75 if one incorrect distractor was chosen along with the correct answer, 0.50 if two incorrect distractors were chosen along with the correct answer, 0.25 if three incorrect distractors were chosen along with the correct answer and 0 if only incorrect answers were chosen.
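This partial-credit rubric translates directly into a scoring function. Below is a minimal sketch; the function names are ours, not from the study.

```python
def score_multi_choice(selected, correct):
    """Score one multi-choice item under the partial-credit rubric.

    selected -- set of options the student marked
    correct  -- the single correct option
    """
    if correct not in selected:
        return 0.0  # only incorrect answers were chosen
    n_distractors = len(selected) - 1  # incorrect options chosen alongside
    return max(0.0, 1.0 - 0.25 * n_distractors)

def score_multiple_choice(selected, correct):
    """Score one multiple choice item: all-or-nothing."""
    return 1.0 if selected == correct else 0.0

# Example: correct answer "B" chosen with two distractors scores 0.50.
assert score_multi_choice({"A", "B", "C"}, "B") == 0.5
assert score_multi_choice({"B"}, "B") == 1.0
```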
Multiple choice and multi-choice scores were analyzed for similarity in performance trend. Multiple choice questions were then separated into easy, moderate, and difficult categories based on student score (Easy: top 1/3; Moderate: middle 1/3; Difficult: bottom 1/3). Multiple choice performance was therefore used solely as a predictor to sort multi-choice questions by difficulty. This comparison of question formats was done to demonstrate that the added uncertainty of the multi-choice format did not disrupt the pattern of student scoring on individual questions.
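A short sketch of this tertile sort follows, using placeholder per-question multiple choice means; the numbers are invented for illustration, and only the method reflects the study.

```python
# Sort the 18 target questions into difficulty tertiles using mean
# multiple choice (MC) scores as the predictor. The means below are
# placeholder values for illustration only.
mc_means = {f"Q{i}": m for i, m in enumerate(
    [0.91, 0.88, 0.85, 0.84, 0.80, 0.79,
     0.74, 0.71, 0.69, 0.66, 0.62, 0.60,
     0.55, 0.51, 0.47, 0.41, 0.38, 0.30], start=1)}

ranked = sorted(mc_means, key=mc_means.get, reverse=True)
tertile = len(ranked) // 3  # 6 questions per category
difficulty = {}
for rank, q in enumerate(ranked):
    if rank < tertile:
        difficulty[q] = "Easy"
    elif rank < 2 * tertile:
        difficulty[q] = "Moderate"
    else:
        difficulty[q] = "Difficult"

# These labels are then applied to the multi-choice versions of the
# same questions for the pre- vs. post-lab analysis.
print(difficulty)
```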
Multi-choice questions are considered a more sensitive and reliable measure of student knowledge. Multi-choice scores were therefore used to assess the impact of Pre-Lab and Post-Lab module use on questions of varying difficulty. Because not all students received the same multi-choice questions, or the same pre- vs. post-lab treatment for each question topic, we used a mixed-effects linear regression with a random intercept for student identity, which both controls for individual subject differences and tests for differences in performance between pre- and post-lab instruction.
We next split the above analysis in order to investigate whether the observed effect was influenced by gender.
Statistical Analysis
All regression analyses were performed using Stata software (StataCorp LLC). Additional t-test comparisons are provided on graphs showing means and standard deviations.
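The study’s models were fit in Stata; for readers working in Python, an equivalent random-intercept model can be sketched with statsmodels. The data file and column names below are assumptions for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed long-format data (hypothetical file and column names): one row
# per student-by-question observation with the partial-credit score, the
# delivery condition, and the MC-derived difficulty tertile.
df = pd.read_csv("multichoice_scores.csv")
# columns: student_id, score, delivery ("pre" / "post"),
#          difficulty ("Easy" / "Moderate" / "Difficult")

moderate = df[df["difficulty"] == "Moderate"]

# A random intercept per student controls for individual differences,
# since students did not all see the same items in each condition.
model = smf.mixedlm("score ~ delivery", data=moderate,
                    groups=moderate["student_id"])
result = model.fit()
print(result.summary())
```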
RESULTS
Experiment 1 - Analysis 1: Comparing mean final examination scores for students receiving modules alone compared with modules plus slide lectures
In Experiment 1, we found that online modules alone were as effective as online modules combined with traditional slide lectures. Final examination scores for students who completed the online teaching modules were compared with those of students who received the online teaching modules in addition to a traditional in-class lecture (Fig. 2). A regression analysis revealed that traditional PowerPoint lectures did not provide a significant additional benefit over online modules alone, demonstrating that online delivery, or “flipping,” of lab content using online teaching modules provides sufficient laboratory instruction (p = 0.57; modules alone n=40, modules + slide lectures n=60).
Figure 2.
The above box plot shows the distribution of final examination scores for students who received online teaching modules alone (Modules Alone) compared with students who received online teaching modules plus in-class traditional slide lectures (Modules + Slide Lectures). Boxes represent the middle quartiles; whiskers represent the bottom and top quartiles; filled circles mark outliers beyond the upper and lower bounds.
Experiment 2 - Analysis 1: Comparing Multiple Choice to Multi-Choice Performance on the Final Exam
Student performance on the multiple choice questions was used as an internal measure to categorize multi-choice questions as Easy, Moderate, or Difficult. A regression analysis with a random intercept for student identity showed that students performed significantly better on questions delivered in multiple choice format compared with multi-choice format (Coef −0.091, St Err 0.01, p < 0.0001). Multi-choice instructions thus provide a significantly greater challenge, deterring students from using simple process of elimination techniques to locate the best answer.
Multiple choice score was a strong predictor of multi-choice score, suggesting that even with added student uncertainty, the relationship of perceived difficulty for individual questions relative to each other was maintained (Fig. 3). Thus, multi-choice format provided students an additional degree of challenge, discouraging simple process of elimination techniques without disrupting the pattern of student performance on target questions (Fig. 3).
Figure 3.
The above bar diagram shows a comparison of the trend in mean scores + SE for the 18 target questions presented in Multiple Choice or Multi-Choice format, split by level of difficulty (student n=88). The inset shows that overall scores were higher when questions were delivered in Multiple Choice format (p < 0.0001).
Experiment 2 - Analysis 2: Comparing pre- and post-laboratory instruction on multi-choice performance on the final examination
Following categorization of multi-choice questions according to difficulty, we compared the impact of pre-laboratory or post-laboratory module completion on mean score for examination questions (Fig. 4). A mixed effects model with an added random intercept for student identity revealed a relationship between pre-laboratory instruction and increased student performance on multi-choice questions of moderate difficulty (coef. −1.57, St Err 0.043, p < 0.001***).
Figure 4.
The above bar diagram shows mean + SE for Multi-Choice target questions, split by level of difficulty and pre- or post-module instruction. (t-test comparison of means p < 0.01**, student n=88)
Experiment 2 - Analysis 3: Comparing the impact of pre- and post-laboratory instruction on moderate multi-choice scores in relation to gender
We compared pre- and post-laboratory module completion on mean scores for moderate difficulty Multi-Choice target questions separated by gender (Fig. 5). A mixed effects model with a random intercept for student identity revealed that students exposed to Pre-Lab modules outperformed students who received Post-Lab modules independent of gender (female: coef. −0.56, St Err 0.055, p < 0.05; male: coef. −0.206, St Err 0.064, p < 0.001). Therefore, pre-laboratory modules enhanced performance on questions of moderate difficulty regardless of gender. No influence of gender was observed on easy or difficult questions. Further, no significant differences in score were observed between males and females within the pre- and post-teaching categories at any level of difficulty.
Figure 5.
The above bar diagram shows t-test comparisons of the impact of Pre- and Post-Lab module instruction on student performance (mean score) for Multi-Choice target questions, split by level of difficulty. *p < 0.05, **p < 0.01
Student opinions on teaching modules
Student survey responses (Table 1) revealed that the majority of students (63%) felt that the online teaching modules helped them perform better on the exam, while about half (55%) felt the modules helped them perform better during the lab. About half of the students (49%) reported thinking about content from the online modules during the exam. The majority of students found the modules informative (85%), while about half found them interesting (55%).
Table 1.
The above table presents student responses to survey questions about online tutorials.
Questions | No Opinion (%) | Strongly Disagree (%) | Disagree (%) | Agree (%) | Strongly Agree (%)
---|---|---|---|---|---
1. Completion of the online teaching modules helped me perform better on the final exam. | 12 | 7 | 18 | 52 | 11
2. During the final exam I found myself thinking about information I learned in the online teaching modules. | 14 | 12 | 25 | 45 | 4
3. The online teaching modules were interesting. | 16 | 11 | 18 | 53 | 2
4. The online teaching modules were informative. | 7 | 5 | 3 | 70 | 15
5. Completion of online modules helped me to perform better during the lab. | 9 | 9 | 27 | 41 | 14
DISCUSSION
Articles extolling the flipped method of instruction present it as a convenient and effective way to leverage emerging technology in education (Overmyer, 2012). However, a common complaint about flipping a course is the time instructors spend finding, making, and tailoring online content. There is also the added obstacle of students’ resistance to engaging with online lectures prior to class meetings (Herreid and Schiller, 2013). Rocketmix allowed us to address many of the typical concerns associated with a flipped course by providing a convenient and intuitive platform for developing new modules that students can access from a variety of devices (smartphones, computers, and iPads).
Flipped lecture courses can increase student performance compared with traditional courses (Haak et al., 2011; Freeman et al., 2011). Reducing the time routinely devoted to in-class lectures leaves more time for active learning activities. While neurobiology labs are active by default, using 45 minutes of a 3-hour lab period to deliver a traditional PowerPoint lecture puts students under pressure to simply make it through our more time-consuming labs without leaving time to process the deeper content associated with the procedures. As our student populations continue to grow, online teaching provides a way to free up instructor time. The idea of a flipped lab is not new (Raman, 2015; Beckman, 2014). Our findings support the idea that increasing active lab time is possible and that online teaching is a sufficient way to deliver content.
To enhance the online experience for students, we “chunked” the content so that the videos would hold students’ attention and allow note taking (Sarawagi, 2013; Storer, 2016). Online testing and quizzing have previously been correlated with improved knowledge and student performance (Thompson et al., 2010; Johnson and Kiviniemi, 2009; Dobson, 2008; Orr and Foster, 2013). Therefore, we embedded both multiple choice and multi-choice questions into the teaching modules that mirrored the format of the questions on the final examination.
We introduced a new method to categorize questions by level of difficulty based on internal measures of student performance. To do this, we used student scores on multiple choice questions to predict performance on the same questions in multi-choice format. Our findings show that students do significantly better when a question is presented in multiple choice format as opposed to multi-choice format, supporting the idea that multi-choice questions introduce a level of uncertainty that deters guessing and its associated benefits. The reliability of assessments has previously been shown to improve when uncertainty and the degree of student risk aversion are increased (Lesage et al., 2013). This is potentially true when there is a wide variance in difficulty. We demonstrated that scores on multiple choice and multi-choice questions follow the same pattern when multiple choice scores are used to categorize difficulty level. Our method may be a more accurate predictor of question difficulty than instructor-based categorization.
The analysis of student scores on multi-choice questions suggests that it is best to present modules prior to laboratory sessions, as pre-laboratory modules are more impactful than modules presented after laboratory sessions. This result is in accordance with other studies of biology courses (Moravec et al., 2010; Orr and Foster, 2013). It supports the view that optimal online instruction should focus on priming laboratory learning rather than reinforcing course content after the laboratory. Pre-laboratory modules raised student performance on questions of moderate difficulty to the level of easy questions. That we only observed gains on moderate difficulty questions was not surprising, as difficult and easy questions can be susceptible to floor and ceiling effects.
Given that our online modules were multimodal (videos, illustrations, reading, and embedded questions) but contained no kinesthetic component, we predicted that we might see reduced scores in female students compared with males. Researchers have previously reported that female students prefer kinesthetic learning styles while male students prefer multimodal instruction (Wehrwein et al., 2007). However, we observed no difference in learning between male and female students; both did better on moderate questions when content was delivered before the lab.
Student survey responses were overall positive, suggesting that students found the online Rocketmix modules to be informative and helpful. Most students enrolled in the study felt that their scores on the final exam improved as a result of using the online teaching modules. While we did not directly measure performance data to gauge whether the pre-lab tutorials actually helped students successfully complete lab procedures, over half of the students felt that the modules improved their performance during the lab. Instructors and graduate student teaching assistants found Rocketmix to be an effective and user-friendly platform.
Below is a link to an example of our online instruction modules:
Enrollment Link: https://my.rocketmix.com/enrollcourse.aspx?courseid=2901
Footnotes
This work was supported by the Department of Neurobiology & Behavior, Ayala School of Biological Sciences, University of California, Irvine. Human subjects research was approved by the university’s institutional review board (IRB HS# 2013-9542).
REFERENCES
- Beckman M. The “flipped” undergraduate research lab: teaching core molecular biology techniques with descriptive protocols and videos. FASEB J. 2014;28:618.3.
- Bergmann J, Sams A. Flip your classroom: reach every student in every class every day. Eugene, OR: ISTE; 2012.
- Burton RF. Quantifying the effects of chance in multiple choice and true/false tests: question selection and guessing of answers. Assess Eval High Educ. 2001;26:41–50. doi: 10.1080/02602930020022273.
- Cuthrell K, Lyon A. Instructional strategies: what do online students prefer? J Online Learn Teach. 2007;3(4):357–362.
- Dobson JL. The use of formative online quizzes to enhance class preparation and scores on summative exams. Adv Physiol Educ. 2008;32:297–302. doi: 10.1152/advan.90162.2008.
- Freeman S, Haak D, Wenderoth MP. Increased course structure improves performance in introductory biology. CBE Life Sci Educ. 2011;10:175–186. doi: 10.1187/cbe.10-08-0105.
- Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, Wenderoth MP. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci U S A. 2014;111:8410–8415. doi: 10.1073/pnas.1319030111.
- Fulton K. Upside down and inside out: flip your classroom to improve student learning. Learning & Leading with Technology. 2012;39(8):12–17.
- Gajjar NB. The role of technology in 21st century education. Int J Res Educ. 2013;2:23–25.
- Gopal T, Herron SS, Mohn RS, Hartsell T, Jawor JM, Blickenstaff J. Effect of an interactive web-based instruction in the performance of undergraduate anatomy and physiology lab students. Comput Educ. 2010;55(2):500–512.
- Haak DC, HilleRisLambers J, Pitre E, Freeman S. Increased structure and active learning reduce the achievement gap in introductory biology. Science. 2011;332:1213–1216. doi: 10.1126/science.1204820.
- Hake R. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys. 1998;66:64–74.
- Herreid CF, Schiller NA. Case studies and the flipped classroom. J Coll Sci Teach. 2013;42:62–66.
- Jensen JL, Kummer TA, Godoy PD. Improvements from a flipped classroom may be the fruits of active learning. CBE Life Sci Educ. 2015;14:1–12. doi: 10.1187/cbe.14-08-0129.
- Johnson BC, Kiviniemi MT. The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teach Psychol. 2009;36:33–37. doi: 10.1080/00986280802528972.
- Lesage E, Valcke M, Sabbe E. Scoring methods for multiple choice assessment in higher education: is it still a matter of number right scoring or negative marking? Stud Educ Eval. 2013;39:188–193.
- McCollough MA, Gremler DD. Guaranteeing student satisfaction: an exercise in treating students as customers. J Mark Educ. 1999;21:118–130.
- Moravec M, Williams A, Aguilar-Roca N, O’Dowd DK. Learn before lecture: a strategy that improves learning outcomes in a large introductory biology class. CBE Life Sci Educ. 2010;9:473–481. doi: 10.1187/cbe.10-04-0063.
- Orr R, Foster S. Increasing student success using online quizzing in introductory (majors) biology. CBE Life Sci Educ. 2013;12(3):509–514. doi: 10.1187/cbe.12-10-0183.
- Overmyer J. Flipped classrooms 101. Principal. 2012;(September/October):46–47.
- Peteroy-Kelly M. Online pre-laboratory modules enhance introductory biology students’ preparedness and performance in the laboratory. J Microbiol Biol Educ. 2010;11(1):5–13. doi: 10.1128/jmbe.v11.i1.130.
- Raman R. Flipped labs as a smart ICT innovation: modeling its diffusion among interinfluencing potential adopters. In: El-Alfy ES, Thampi S, Takagi H, Piramuthu S, Hanne T, editors. Advances in intelligent informatics. Advances in Intelligent Systems and Computing, vol 320. Cham: Springer; 2015.
- Sarawagi N. Flipping an introductory programming course: yes you can! J Comput Sci Coll. 2013;28(6):186–188.
- Sierra J. Shared responsibility and student learning: ensuring a favorable educational experience. J Mark Educ. 2010;32(1):104–111.
- Storer DA. The flipped classroom with limited internet access. In: The flipped classroom volume 1: background and challenges. ACS Symposium Series. 2016;1223(3):17–27.
- Strayer JF. How learning in an inverted classroom influences cooperation, innovation and task orientation. Learn Environ Res. 2012;15:171–193.
- Tucker B. The flipped classroom. Educ Next. 2012;12. http://educationnext.org/the-flipped-classroom.
- Wehrwein EA, Lujan HL, DiCarlo SE. Gender differences in learning style preferences among undergraduate physiology students. Adv Physiol Educ. 2007;31(2):153–157. doi: 10.1152/advan.00060.2006.
- Yadav A, Lundeberg M, DeSchryver M, Dirkin K, Schiller NA, Maier K, Herreid CF. Teaching science with case studies: a national survey of faculty perceptions of the benefits and challenges of using cases. J Coll Sci Teach. 2007;37:34–38.