ABSTRACT
Quantitative reasoning is one of the core competencies identified as a priority for transforming the undergraduate biology curriculum. However, first-year biology majors often lack confidence in their quantitative skills. We revised an introductory biology lab to emphasize the teaching of basic laboratory calculations, utilizing multiple teaching tools, including online prelab quizzes, minilab lectures, calculation worksheets, and online video tutorials. In addition, we implemented a repetitive assessment approach whereby three types of basic calculations—unit conversions, calculating molar concentrations, and calculating dilutions—were assessed on all quizzes and exams throughout the semester. The results showed that learning improved for each of the three quantitative problem types assessed and that these learning gains were statistically significant, both from first assessment to midterm and, notably, from midterm to final. Additionally, the most challenging problem type for students, calculating molar concentrations, showed the greatest normalized learning gains in the second half of the semester. The latter result suggests that persistent assessment resulted in continued learning even after formal, in-class teaching of these approaches had ended. This approach can easily be applied to other courses in the curriculum and, given the learning gains achieved, could provide a powerful means to target other quantitative skills.
KEYWORDS: laboratory course, undergraduate, calculations, quantitative, introductory biology
INTRODUCTION
There is a growing consensus that future biologists need to be competent in quantitative approaches to understanding biological concepts and investigating biological questions (1–4). Quantitative reasoning is one of the core competencies identified by Vision and Change in Undergraduate Biology Education as key to the transformation of undergraduate education in biology (5). Given this, it is important to develop courses that teach quantitative skills and, critically, assess student learning for these quantitative skills (6).
There has been a push to develop curricula that integrate math and biology (7, 8), utilize mathematical approaches to address biological research questions (9), and develop tools for students to use mathematical modeling for diverse fields of study (10). Some major curricular development programs have made efforts to assess the efficacy of their program and have demonstrated significant learning gains (11, 12). Other programs have demonstrated improved student performance following interventions targeting specific skill sets, such as data analysis skills (13), bioinformatic approaches (14), or statistical analysis (15). Cross-disciplinary integration of math and biology has also been carried out at the level of a single course. Examples include the integration of biology and math (16, 17), biology, math, and physics (18), and biology, math, and computer science (19).
It is clear that in order to fundamentally improve the quantitative competency of biology majors, it is important to include quantitative approaches across the curriculum and to assess student learning for these skills. Importantly, this should be initiated in the very first year of study. Introducing these skills early in the curriculum can help to counter student misconceptions that biology is not a quantitative science (20). Several different approaches to teaching quantitative skills in introductory biology courses have been shown to be effective. For example, courses using the scientific method as an organizing structure to teach quantitative reasoning have shown gains in students’ ability to employ these approaches (21, 22). Taking this a step further, some courses have used the course-based undergraduate research experience (CURE) as a means to teach quantitative approaches, data analysis, and statistical analysis (23–26). Other studies have shown that using multiple, targeted teaching modules, both online and in-person, can improve student performance on quantitative skills assessments (27, 28). In a particularly well-assessed study of a revised introductory molecular and cell biology course, integrating math skills acquired previously in prerequisite math courses with biological problems during in-class activities, quizzes, and exams led to greater gains on both integrated math and biology questions and biology-only questions (29).
Here, we describe revisions to an introductory biology laboratory course aimed at ensuring student learning of basic laboratory calculations. Laboratory sections of introductory biology courses would seem to be an ideal format for teaching quantitative skills. Lab sections have smaller class sizes and typically allow more direct interaction with the instructor, which could be particularly important for teaching topics that students find challenging. Also, the focus of laboratory exercises is often the collection and quantitative analysis of data, and experimental protocols often require calculations and the solving of simple equations. Despite this, there are relatively few published reports that describe course interventions for introductory biology labs that measure learning gains specifically for quantitative skills (as noted in references 6 and 30). One exemplary study developed an introductory biology lab course that had at its core a CURE based on point mutations for the p53 tumor suppressor (30). The research project was used as a means to teach students how to analyze data and draw conclusions from experiments, and the study assessed learning gains for these specific skills through performance on low-stakes exams. Another well-assessed study made relatively small changes to an introductory biology lab that required students to carry out data and statistical analysis for several different experimental exercises (15). The study used a pretest, posttest method to assess performance on skill-based questions for data analysis. They found particularly large improvements in student performance for interpreting scatterplots and regressions, interpreting t test results, and interpreting bar plot results. Thus, relatively modest changes to course structure can lead to dramatic learning gains.
The core of the teaching approach we describe here is to persistently and repeatedly test students’ ability to carry out basic calculations (unit conversions, molar concentrations, and dilutions) throughout the semester. Our hypothesis was that repetitive assessment of basic calculation questions would improve student learning gains for these quantitative approaches. We chose this collection of calculation questions because they are fundamental skills for molecular and cell biology research, which is the focus of this laboratory course. Indeed, these skills are required for many of the experiments carried out in this laboratory course. This approach was also prompted by our observation of poor student performance on basic calculation questions in previous semesters (as has been seen previously; e.g., see references 8 and 21), despite a focused effort to teach these skills in the first two laboratory periods. It was further inspired by anecdotal observations of a lack of student persistence in their efforts to learn these calculations. Through discussions with instructors in the course, and their impressions from interacting with students, we hypothesized that there was a significant population of students who simply did not attempt to master these skills, essentially sacrificing the relatively small percentage of points that these questions represented. The idea that lack of persistence is a barrier to learning quantitative approaches specifically has been postulated previously (29). This may be related to math anxiety among biology majors, which has been well documented (31–34). Student anxiety and reluctance to engage with quantitative problems may be particularly acute for first-year students.
Another rationale for the persistent and repeated assessment approach that we employ here is based on the “testing effect,” for which there is a large body of evidence supporting the effect of repetitive assessment on learning gains (35–37). This approach of repetitive assessment has been shown to be an effective mechanism to improve learning, specifically in introductory biology courses (38, 39). Another area of significant research has shown that the “spacing effect,” whereby the repetition of the material to be learned is spaced, leads to increased learning and memory for a large variety of content areas (40, 41). The intervals shown to be effective for spaced learning, including for life science curricula (42), are on the order of the spacing between repetitive assessments of calculations employed here.
Here, we test our hypothesis that using repeated assessments to teach basic calculations in multiple sections of an introductory biology lab course for majors will increase student learning gains. Our teaching approach includes online prelab quizzes, in-lab lectures, worksheets, and online video tutorials. Given the rationale highlighted above, we revised this course to include basic calculation questions on every quiz and exam for the entire semester. We postulated that having these questions on all assessments would motivate students to persevere in learning these approaches. Furthermore, as the intervention is itself assessment, it also provides a mechanism to measure student progress. We describe below the details of this revision and the results of the assessment of student learning, specifically for these basic calculation questions.
METHODS
Course background
Course revisions and analysis were implemented in the Fall Semester of 2017 in all nine sections of the Biology I Laboratory (BIOS-1081), a semester-long, introductory level biology laboratory class that is intended for science majors at the University of New Orleans (UNO). It is typically the first science course for incoming majors, most of whom take their first chemistry course after taking this biology lab, although some may be taking Chemistry I concurrently. This course is offered yearly in the fall, spring, and summer. The prerequisite for this course is eligibility to enroll in precalculus algebra (ACT math score of ≥22). The laboratory meets once weekly for two hours and forty-five minutes for the entire semester and is worth one credit hour. Graded assessments in this course consist of online prequizzes, in-class weekly quizzes, worksheets, a midterm exam, and a final exam. The in-class quizzes cover the material from the previous week’s experiment. The midterm and final exams cover approximately half of the laboratory exercises each.
The nine lab sections included in this study were taught by five instructors (2 sections each for 4 of the 5 instructors). Four of the five instructors were graduate student teaching assistants, and one was a member of the university faculty. While instructors may personalize their teaching methods to some degree, the general approach is the same across all sections. As preparation for each lab, students are required to submit an online prequiz that introduces information they will use in the lab that week. The prequiz covers material from an assigned reading and the handout for that laboratory exercise. With a few exceptions, such as the first lab of the semester, students are given a written quiz in lab covering the material from the week prior. Instructors begin each lab period with a brief lecture to teach important methods and concepts related to the experiment. Students then work in groups of 3 to 5 to follow the laboratory protocol laid out in the handout. They then work together to answer questions related to the experiment. The lab handouts are not collected, but students are held responsible for that material via the in-lab quiz the following week. This basic framework is followed each week and is sometimes supplemented by take-home assignments designed to create deeper understanding of important concepts.
The main learning objective for this course is for students to understand how biologists answer questions with experiments. This includes building basic skills needed in a laboratory setting, including pipetting, measurements, graphing, linear regressions, and standard curves, as well as building an understanding of scientific reasoning, experimental design, and interpretation of results. The first 2 weeks of the lab (Biology Bootcamp I and II) introduce some of the basic laboratory skills—pipetting and graphing in week one, and spectroscopy, dilutions, and standard curves in week 2. Four of the remaining six laboratory exercises are experiments or assays that were taken directly from faculty research programs. These include assaying lactate dehydrogenase enzyme activity in different muscle tissues, bacterial transformation and green fluorescent protein expression, PCR analysis of Anolis lizard pathogens, and genetic complementation in yeast (Saccharomyces cerevisiae). All of these labs are 2 weeks in length. The remaining laboratory exercises include an inquiry-based lab on maize genetics and a light microscope-based assay of endocytosis and exocytosis in Tetrahymena.
Student demographics
The median number of students enrolled per section was 28, with a minimum of 23 and a maximum of 30, for a total of 248 students enrolled in the fall 2017 semester. Combined enrollment for all 9 sections consisted of 67% female, 34% underrepresented minority students (21% Black/African American and 13% Hispanic/Latino), and 67% first-year students (53% first-semester students).
The University of New Orleans institutional review board (IRB) reviewed the research and procedures for this study and determined that the protocols were compliant with the University of New Orleans and federal guidelines and met the standard for being exempt (IRB number 07Aug14).
Teaching and assessing basic calculation skills
A major student learning outcome for this course, and the focus of the course revisions we describe here, is for students to be able to carry out basic laboratory calculations. Formal teaching of these calculations occurs in the first 2 weeks of the lab. Activities aimed at teaching unit conversions accompany the first laboratory exercise, Biology Bootcamp I. Calculating molar concentrations and calculating concentrations from dilutions are addressed in the second week, Biology Bootcamp II. Several different activities are used to teach these basic calculations, as follows.
1. Online prelab quizzes. Calculation problems are first introduced to students in the online prelab quizzes. Conversion questions are included on the prelab quiz for Biology Bootcamp I, and molar concentration calculations are included on the prelab quiz for Biology Bootcamp II.
2. Lab lecture. Each laboratory period begins with a brief, 15- to 20-minute minilecture. Approaches to basic laboratory calculations are first taught during these lectures.
3. Problems and calculations on lab exercise handouts. The first two labs (Biology Bootcamp I and II) focus on measurements and graphing, and as a part of these exercises, students are asked to carry out these basic calculations. Biology Bootcamp I asks students to carry out unit conversions, and the lab exercise handout ends with a series of conversion problems. For Biology Bootcamp II, students are asked to calculate concentrations from dilutions, and the handout ends with problems of calculating molar concentrations and calculating from dilutions.
4. Online video tutorials. Another means to teach students how to do these calculations is recorded video tutorials that students can access through the Moodle site for their lab section.
5. Extra credit worksheets or quizzes. Often on the first quiz, the students struggle with calculation questions. As a follow-up, instructors will go over the questions from the quiz in lab. In addition, poor performance may be followed up with an extra credit quiz or worksheet opportunity.
6. Summative assessment. Three types of basic calculation questions were assessed on a weekly basis: (1) unit conversions, (2) calculating molar concentrations, and (3) calculating dilutions. Examples of each type of question are shown in Table 1. These questions were included on all quizzes and exams beginning with quiz 1 (conversions) or quiz 2 (molar concentrations or dilutions).
TABLE 1.
Examples of the three types of calculation problems used in the study
**Unit conversion questions**

Make the following conversions for units of volume:

(1) 22 liters = _________ milliliters

(2) 12 milliliters = _________ microliters

**Calculating molar concentrations**

(1) How many milligrams (mg) of ATP do you need to add to 75 milliliters (mL) of water in order to make an 80 millimolar (mM) ATP solution? (MW of ATP = 507 daltons)

**Calculating concentrations from dilutions**

Use this equation for the following questions: C1V1 = C2V2

(1) Suppose you have a stock solution with a solute concentration of 300 mM. If you wanted to dilute this solution to make a total of 25 mL of a 50 mM solution, what volume of stock _______ and what volume of water ________ would you use? Give your answer in mL, and show your calculations.

(2) Suppose that you are doing serial dilutions. You start with a stock solution with a solute concentration of 100 mM. You then add 1 mL of this stock solution to 2 mL of water to give you concentration 1. You then take 1 mL of concentration 1 that you just made and add it to 2 mL of water to make concentration 2. You repeat these serial dilution steps to make concentrations 3 and 4. Fill in the final concentration of each of these dilutions in the space provided below. Show your calculations.

Concentration 1 ______ mM    Concentration 3 ______ mM
Concentration 2 ______ mM    Concentration 4 ______ mM
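The arithmetic behind the Table 1 examples can be checked with a few lines of Python. This is our own illustrative working (the variable names and rounding are ours, not part of the course materials or answer key):

```python
# Unit conversions: 1 liter = 1,000 milliliters; 1 milliliter = 1,000 microliters
ml_from_l = 22 * 1_000                 # 22 L  -> 22,000 mL
ul_from_ml = 12 * 1_000                # 12 mL -> 12,000 uL

# Molar concentration: mass (mg) = molarity (mol/L) x volume (L) x MW (g/mol) x 1,000 mg/g
mg_atp = 0.080 * 0.075 * 507 * 1_000   # 80 mM ATP in 75 mL -> 3,042 mg

# Dilution via C1*V1 = C2*V2: volume of 300 mM stock needed for 25 mL of 50 mM
v_stock = 50 * 25 / 300                # ~4.17 mL of stock
v_water = 25 - v_stock                 # ~20.83 mL of water

# Serial dilutions: 1 mL of solution into 2 mL of water is a 1-in-3 dilution at
# each step, since C2 = C1 * V1 / (V1 + V2)
concs = []
c = 100.0                              # 100 mM starting stock
for _ in range(4):
    c = c * 1 / (1 + 2)
    concs.append(round(c, 2))          # concentrations 1 to 4, in mM
```

Running this gives 33.33, 11.11, 3.7, and 1.23 mM for the four serial dilutions.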
Statistical analysis
Because we lacked either pretest data or control data from laboratory sections that did not employ the repeated assessment intervention, the best means to analyze the statistical significance of improved student performance on conversion and calculation questions was a repeated measures analysis of variance. We performed this analysis on the changes in student scores across two time intervals: from the first quiz assessing each problem type to the midterm, and from the midterm to the final exam. The first assessment of unit conversion questions was quiz 1 for most students (n = 138) but was not until quiz 2 for a subset of students (n = 42). The first assessment for calculating molar concentrations and concentrations from dilutions was quiz 2 for most students (n = 159) but was not until quiz 3 for a subset of students (n = 16). We included all students for whom scores were available for both the first assessment and the midterm (n = 180 for unit conversions; n = 175 for molar concentrations and dilutions) and for both the midterm and the final (n = 215 for unit conversions and molar concentrations; n = 214 for dilutions).
It should be noted that for two lab sections, we did not have data for quizzes before the midterm, so these students could not be included in the statistical analysis of the period from first assessment to midterm. The missing data points were due to mistakes in data collection in which the quizzes were not scored for each type of question before being returned to students. Thus, the students did take the quiz, but we do not have performance data broken down by question type. As a result, the value of n for the first-quiz-to-midterm analysis (n = 180 or 175) differs somewhat from that for the midterm-to-final analysis (n = 215 or 214) (Table 2).
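With only two paired time points (e.g., midterm and final), the repeated measures F statistic is simply the square of the paired t statistic, so this style of analysis can be reproduced without a statistics package. The sketch below uses hypothetical student scores, not the study data:

```python
from statistics import mean, variance

def repeated_measures_f(pre, post):
    """Within-subject F statistic (df = 1, n - 1) for the change from pre to
    post, with scores paired by student; equals the squared paired t statistic."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    sem_sq = variance(diffs) / n   # squared standard error of the mean difference
    return mean(diffs) ** 2 / sem_sq, (1, n - 1)

# Hypothetical midterm and final scores for five students
f_stat, df = repeated_measures_f([60, 70, 55, 80, 65], [75, 80, 70, 85, 70])
```

Here `f_stat` is 20.0 with df = (1, 4); the corresponding P value would then be read from the F distribution.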
TABLE 2.
Statistical significance of learning gains
| Type of question | Mean % correct, 1st quiz | Mean % correct, midterm | F statistic | P value | Mean % correct, midterm | Mean % correct, final | F statistic | P value |
|---|---|---|---|---|---|---|---|---|
| Unit conversions | 79.2 | 94.4 | F1,179 = 57.7 | <0.001 | 93.8 | 96.4 | F1,214 = 7.3 | 0.007 |
| Molar concentration calculations | 52.2 | 61.0 | F1,174 = 6.6 | 0.011 | 64.3 | 75.7 | F1,214 = 23.4 | <0.001 |
| Dilution calculations | 55.1 | 77.0 | F1,174 = 37.0 | <0.001 | 78.4 | 84.1 | F1,213 = 6.3 | 0.013 |
We also asked whether the section instructor had a significant effect on either the mean student score (between-subject effect) or the change in mean scores from the midterm to the final exam (within-subject effect, score × instructor interaction). We expected that the scores of students in all sections would improve between the midterm and the final if continued testing had a beneficial effect on student comprehension, regardless of any differences in mean scores among sections or instructors. We included all students for whom midterm and final exam scores were available (n = 215 for unit conversions and molar concentrations; n = 214 for dilutions).
Normalized learning gains
The average normalized learning gains were calculated according to the method described by Hake (43) using the equation <g> = (mean last assessment − mean first assessment)/(100 − mean first assessment), where g is normalized learning gain. These were calculated for three time intervals: (i) the entire course, from first quiz to final exam; (ii) the first half of the semester, from first quiz to midterm exam; and (iii) the second half of the semester, from midterm to final.
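As a concrete illustration of the Hake formula, here it is as a small Python helper with illustrative inputs (not values taken from our data tables):

```python
def normalized_gain(first_pct, last_pct):
    """Hake's average normalized learning gain <g> for mean scores on a 0-100
    scale: the fraction of the possible improvement that was actually achieved."""
    return (last_pct - first_pct) / (100.0 - first_pct)

# A class moving from a mean of 50% correct to 75% correct closes half of the
# remaining gap, so <g> = 0.5
g_example = normalized_gain(50.0, 75.0)
```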
RESULTS
Based on our rationale that persistent and repeated assessment should improve learning and sustain student motivation, we implemented a strategy of repeated weekly testing of three types of basic calculation questions: (i) unit conversions, (ii) calculating molar concentrations, and (iii) calculating dilutions. Examples of each question type are shown in Table 1. Unit conversion questions were included beginning with quiz 1, and molar concentration and dilution questions beginning with quiz 2; thereafter, calculation questions were included on all quizzes and exams for the rest of the semester (6 quizzes total, the midterm, and the final). As a policy, the calculation questions in total accounted for approximately 20 to 30% of the points on each quiz or exam. We introduced these problems and taught methods of solving them in the first 2 weeks of lab and informed students that they would be held responsible for these questions on all assessments through the lab final. We made the rationale explicit to students: learning these calculations would benefit them on all quizzes and exams, and failing to master them would continue to adversely affect their grade throughout the semester. Student learning of each type of problem was assessed by determining the mean percent correct per section for each quiz or exam.
The average student performance (for all 9 sections) improved across the semester for all three question types (Fig. 1). The number of questions for each type varied somewhat for different quizzes/exams (5 to 10 for conversions, 2 to 6 for molar concentration, and 4 to 8 for dilutions). To determine the percentage correct, all questions were objectively scored as right or wrong (no partial credit). Students were most proficient at the unit conversion questions (Fig. 1a), with the mean performance on the first quiz being 79.2% correct (standard error of the mean [SEM], ±2.08%). Even given this initial proficiency, student performance on conversions continued to improve across the semester, showing significant mastery for these questions by the final exam (96.4% correct; SEM, ±0.6%). More challenging for students were the problems calculating concentrations from dilutions (Fig. 1c). The mean on the first quiz was 55.1% (SEM, ±3.3%). However, performance improved to 84.1% correct (SEM, ±2.1%) by the final exam. Students had the most difficulty with the questions on calculating molar concentrations. The mean performance on the first quiz was only 52.2% correct (SEM, ±3.3%), but performance improved to 75.7% (SEM, ±2.2%) by the final exam. These results demonstrate that our approach led to substantial learning gains for all three types of basic calculations.
FIG 1.

Improvements in student performance for three types of basic calculation problems. Average levels of performance on quizzes and exams for all 9 sections of the laboratory combined are plotted for each of the three types of calculation problems (see Table 1). The value of n for each data point varied for quizzes and exams, ranging from 77 to 240 (minimum = 77 for quiz 3 [Q3]; maximum = 240 for the midterm exam [MT]). Error bars = SEM.
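The per-assessment summaries plotted in Fig. 1 (mean percent correct and its SEM) follow directly from the right/wrong scoring described above. A minimal sketch of the computation, using made-up scores rather than the study data:

```python
from statistics import mean, stdev

def pct_correct(answers):
    """Percent correct for one student, each question scored 1 (right) or 0 (wrong)."""
    return 100.0 * sum(answers) / len(answers)

def mean_and_sem(pcts):
    """Mean percent correct across students and the standard error of that mean."""
    n = len(pcts)
    return mean(pcts), stdev(pcts) / n ** 0.5

# Hypothetical scoring of five conversion questions for four students
students = [[1, 1, 1, 0, 1], [1, 0, 1, 1, 1], [1, 1, 1, 1, 1], [0, 1, 1, 1, 0]]
m, sem = mean_and_sem([pct_correct(s) for s in students])   # 80.0, ~8.16
```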
As an interesting aside, performance appeared to improve on both the midterm and the final compared to the quizzes. This was especially apparent for the molar concentration and dilution calculation questions, to the extent that performance appeared to decrease on the quizzes after the midterm before increasing again on the final. At this point, we are unsure of the reason for these patterns, but they do appear to be real effects and were observed for all three types of questions. One possibility is that students had more time for these questions on the midterm and final: weekly quizzes were usually limited to about 15 to 20 min, whereas students had the entire lab period for the midterm and final. Another possibility is that the stakes were higher for the exams and students came better prepared for them. However, neither possibility readily explains why performance decreased after the midterm.
To determine the statistical significance of these learning gains, repeated measures analysis of variance was used to analyze student gains for each of the three question types. The significance of the improvement in performance was measured for two intervals: from the first assessment to the midterm, and from the midterm to the final. By the repeated measures test, the improvement in student performance was statistically significant for all three types of questions and for both intervals (Table 2, F statistics and P values). Thus, students improved from the first assessment to the midterm and continued to improve from the midterm to the final. The latter finding is of particular importance because it means that there was a statistically significant improvement in student learning during a period in which there was no further formal, in-class instruction. Thus, students were improving their skills on these basic calculations through their own work or through instructor office hours, tutoring, or online video tutorials. Furthermore, this learning led to substantial mastery of these quantitative approaches (the percentages correct on the final were 96.4% for unit conversions, 75.7% for molar concentrations, and 84.1% for dilutions), and it did so in a course in which a majority of the students were first-year students (67% first year; 53% first semester).
To more accurately compare the learning gains between the different types of calculation questions, we determined the normalized learning gains (g) for each question type (according to Hake [43]). Normalized gains were determined for the entire course (from first quiz to final) and for two intervals, from first quiz to midterm and from midterm to final (Table 3). The largest gains were seen for unit conversion questions (g = 0.838), followed by dilution calculations (g = 0.626) and then molar concentration calculations (g = 0.495). Again, these data show there were substantial learning gains for all three types of questions. However, it appears that the learning gains were the largest for question types that showed the highest initial student performance. Thus, as might be predicted, students were best able to close the gap in performance with the least challenging type of question. This was also borne out when comparing the percentages of normalized learning gains between the two intervals (first quiz to midterm and midterm to final). For unit conversions and dilution calculations, the first interval accounted for the majority of learning gains (84% and 75%, respectively). However, for the most challenging type of calculation, calculating molar concentrations, a larger percentage of the normalized learning gains occurred in the second interval, from midterm to final (55%, compared to 45% in the first interval). These data suggest that persistence is particularly important for the most challenging problems. We emphasize that during the second interval, there was no longer any formalized teaching of the calculations, only the persistent assessment of the calculation questions (and, of course, available resources like video tutorials, office hours, and tutoring).
TABLE 3.
Normalized learning gains
| Type of question | Total, 1st quiz to final | 1st quiz to midterm (% of total) | Midterm to final (% of total) |
|---|---|---|---|
| Unit conversions | 0.838 | 0.703 (84) | 0.135 (16) |
| Dilution calculations | 0.626 | 0.467 (75) | 0.159 (25) |
| Molar concentration calculations | 0.495 | 0.223 (45) | 0.272 (55) |
As a measure of the robustness of this approach, we examined whether the learning gains observed were dependent upon the instructor. There were differences in the experience levels of the five instructors who taught the nine sections analyzed. Two instructors (who taught 2 sections each) were first-semester biology graduate students (M.S.) who had not previously taught independently. Another two instructors (2 sections each) were second-year graduate students (1 M.S. and 1 Ph.D.) who had taught this lab course previously (for two semesters and for one semester, respectively). The fifth instructor was a faculty member with 11 years of teaching experience. To determine if the learning gains were section specific, the mean scores for each of the three question types for all quizzes and exams were plotted for each of the nine sections (Fig. 2, color coded by section). Although there were differences in performance between sections, as would be expected, the trend of improvement throughout the semester, including between midterm and final, was clearly visible for all sections. These results suggest that persistent assessment led to learning gains independent of the instructor.
FIG 2.
Improvements in student performance on calculation problems by instructor. The mean score for each assessment (quiz, midterm, or final) is plotted for each of the three types of calculation problems. Graphs are organized by instructor (row A is instructor 1, row B instructor 2, etc.). Different sections are differentiated by color. Lines represent the linear regressions for the sections. Missing data points from different sections were in most cases due to the data not being recorded—students did take the quiz, but performance was not scored before quizzes were returned to students. However, quiz 3 was not administered in 6 of 9 sections due to class cancellations. Also, quiz 5 was administered as a take-home, so the data were not included (panel E). Students did take the quiz for all other missing data points.
Repeated measures analysis of variance revealed no statistically significant differences among instructors in learning gains on the three types of questions from midterm to final (Table 4). The between-subjects analysis showed that there were statistically significant differences in mean score between sections for the molar concentration questions and the dilution questions (though not for the unit conversion questions). This is not surprising, as differences in average performance between sections would be expected. However, there were no statistically significant differences between sections in the degree to which students improved from midterm to final, as shown by the within-subject, by-instructor analysis (P values of 0.369, 0.192, and 0.494 for unit conversions, molar concentrations, and dilutions, respectively). These results demonstrate that the learning gains seen for all three types of quantitative questions were independent of instructor.
TABLE 4.
Analysis of variance by instructor
| Type of question, variable | Sum of squares | Test statistic | P value |
|---|---|---|---|
| Unit conversions | | | |
| Effect—between subjects: variation in mean score among instructors | 333.527 | F4,210 = 0.55 | 0.699 |
| Effect—within subjects: analysis of improvement from midterm to final | 759.558 | F1,210 = 7.309 | 0.007 |
| Effect—within subjects, by instructor: analysis of improvement by instructor | 447.858 | F4,210 = 1.077 | 0.369 |
| Calculating molar concentrations | | | |
| Effect—between subjects: variation in mean score among instructors | 19,927.715 | F4,210 = 3.052 | 0.018 |
| Effect—within subjects: analysis of improvement from midterm to final | 14,498.823 | F1,210 = 23.43 | <0.001 |
| Effect—within subjects, by instructor: analysis of improvement by instructor | 3,812.851 | F4,210 = 1.540 | 0.192 |
| Dilution calculations | | | |
| Effect—between subjects: variation in mean score among instructors | 27,207.417 | F4,209 = 4.804 | 0.001 |
| Effect—within subjects: analysis of improvement from midterm to final | 4,106.536 | F1,209 = 6.328 | 0.013 |
| Effect—within subjects, by instructor: analysis of improvement by instructor | 2,209.004 | F4,209 = 0.851 | 0.494 |
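The by-instructor question in Table 4 reduces to asking whether midterm-to-final improvement differs across instructor groups. The following is an illustrative sketch only, using hypothetical per-student improvement scores (not the study's data, which were analyzed with a full repeated-measures design); it shows how a one-way F statistic for such a comparison is assembled from between-group and within-group sums of squares:

```python
# Illustrative sketch (hypothetical data, not the authors' analysis):
# one-way ANOVA on midterm-to-final improvement, grouped by instructor.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of score lists."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of scores around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n_total - k
    f_stat = (ss_between / df_b) / (ss_within / df_w)
    return f_stat, df_b, df_w

# Hypothetical improvements (final minus midterm, in %) for three instructors
improvements = [
    [12, 8, 15, 10],   # instructor 1
    [9, 14, 11, 13],   # instructor 2
    [10, 12, 7, 16],   # instructor 3
]
f_stat, df_b, df_w = one_way_anova(improvements)
print(f"F({df_b},{df_w}) = {f_stat:.3f}")
```

A small F (large P value), as in the within-subjects, by-instructor rows of Table 4, indicates that the improvement did not differ detectably across instructors.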
DISCUSSION
The goal of this study was to develop an approach that increases learning of basic calculation skills in a first-year introductory biology course. In general, the methods we used to teach these calculations are typical of a laboratory course and include online prelab quizzes, in-lab lectures, lab exercise worksheets, online video tutorials, and extra-credit quizzes and worksheets. The major change in our approach was the inclusion of questions testing the students' ability to perform these basic calculations on every quiz and exam for the entire semester. We made this approach explicit to students, highlighting the importance of learning these basic calculations for their success on these assessments. As we show here, the approach was successful. Students achieved statistically significant learning gains both from the first assessment to the midterm and from the midterm to the final, with substantial normalized learning gains (g) for all three types of questions. Although we are not aware of other studies that directly measured learning gains for these particular types of calculation questions, the magnitudes of the gains shown here are on the order of those reported for other interventions intended to improve quantitative skills. For example, two studies that used web-based modules in introductory biology courses to improve students' quantitative skills reported performance increases of approximately 30 to 33% from pretest to posttest on a quantitative assessment (12, 27). Here, we show increases in performance from first assessment to final exam of 17.8%, 31%, and 34.5% (for unit conversions, molar concentration calculations, and dilutions, respectively). Because these increases were not based on pretest scores, they likely underestimate the learning gains.
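To make the three assessed skills concrete, the following sketch shows one representative instance of each; the specific values (a 0.1 M NaCl target, a 1 M stock) are hypothetical illustrations and are not drawn from the course materials:

```python
# Illustrative examples (not the course's actual problems) of the three
# assessed calculation types: unit conversions, molar concentrations,
# and dilutions.

def mg_to_g(mg: float) -> float:
    """Unit conversion: milligrams to grams (1 g = 1000 mg)."""
    return mg / 1000.0

def molarity(mass_g: float, mw_g_per_mol: float, volume_l: float) -> float:
    """Molar concentration (mol/L): moles of solute divided by liters of solution."""
    return (mass_g / mw_g_per_mol) / volume_l

def dilution_stock_volume(c_stock: float, c_final: float, v_final: float) -> float:
    """Stock volume needed, rearranged from C1*V1 = C2*V2 (same units throughout)."""
    return c_final * v_final / c_stock

print(f"{mg_to_g(250):.2f} g")                            # 250 mg -> 0.25 g
print(f"{molarity(5.844, 58.44, 1.0):.3f} M")             # 5.844 g NaCl (MW 58.44) in 1 L -> 0.100 M
print(f"{dilution_stock_volume(1.0, 0.1, 100.0):.1f} mL") # 100 mL of 0.1 M from a 1 M stock -> 10.0 mL
```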
Another study, which integrated mathematics skills with biological problems throughout the course, found that the intervention led to normalized learning gains of about 36% on a pretest/posttest assessment of applying math skills to biology problems (29). Although that study was a more comprehensive intervention that assessed performance on a wider variety of math skills, the learning gains we show here (49% for molar concentration calculations, 63% for dilution calculations, and 84% for unit conversion questions) are similar to or greater than those in the studies described above.
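For readers unfamiliar with the metric, normalized learning gain (g) in Hake's formulation (43) expresses the improvement achieved as a fraction of the improvement that was possible given the starting score. A minimal sketch, assuming scores are percentages:

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized learning gain: improvement achieved (post - pre)
    as a fraction of the improvement possible (100 - pre)."""
    if pre >= 100:
        raise ValueError("pre-score must be below the maximum of 100")
    return (post - pre) / (100 - pre)

# E.g., a class averaging 50% on the first assessment and 80% on the final
# captured 60% of the possible improvement.
print(normalized_gain(50, 80))  # 0.6
```

This is why a class starting near the ceiling can show a large g from a modest raw increase, while a low-scoring class needs a large raw increase to achieve the same g.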
The approach we have developed achieves significant learning gains with relatively simple revisions. In addition, the persistent assessment of these calculations provides a convenient means of measuring student learning: it requires no instruments beyond the quizzes and exams already in place, so expanding it to other courses should require relatively little investment. Given the learning gains achieved, we believe this is a promising approach for targeting other quantitative methods in subsequent courses. However, it will be important to incorporate and assess these calculation questions in subsequent courses to ensure that the gains are maintained. If this approach were included across the curriculum, with skills taught and reassessed at each level, students could graduate equipped with a significant collection of quantitative tools.
Although our approach clearly led to significant learning for a large cohort of students (9 sections of ∼25 students each), the question remains whether these gains were specifically due to the inclusion of calculation questions on all quizzes and exams. Unfortunately, we lack data on student performance on the calculation questions from previous semesters: these questions appeared only on the first two quizzes and the midterm, and even for those assessments, the calculation questions were not graded separately. Without such data for a cohort that did not receive the repetitive assessment intervention, we cannot experimentally conclude that the learning gains described in this study are specifically due to the repetitive assessment approach. However, we have clearly shown that with this approach, students achieve substantial learning gains for all three types of calculation questions, and we would argue that the most likely explanation for this success is the repetitive assessment itself. First, students showed statistically significant learning gains during the second half of the course (Table 2), an interval in which there was no further formal teaching of the approaches but during which the repetitive assessments continued. Furthermore, for the most difficult calculation questions, the molar concentration calculations, the majority (55%) of the normalized learning gains were achieved during this second half of the course (Table 3), and normalized learning gains in the second half were seen for all three types of questions. It seems unlikely that this would have occurred if the students were no longer being assessed on these calculations. Thus, the persistent assessment of these calculation questions appears to be the most likely explanation for the continued learning gains.
A substantial literature also supports the effectiveness of repetitive assessment for promoting learning gains. The "testing effect" is a well-described mechanism by which repeated assessment itself leads to substantial learning gains (35–37), and it has been shown to be effective in introductory biology courses (38, 39). The intervals between assessments used in this study are also consistent with mechanisms of the "spacing effect" (40–42). For example, a study that assessed the spacing effect for natural science material in a lecture context showed that review spaced 8 days after the presentation of material led to greater learning gains than review 1 day after (42). Our weekly assessment intervals are on the order of this spaced interval. Thus, there is an underlying theoretical rationale supporting our approach.
Another motivation for implementing the repetitive assessment approach was our observation that students often lack persistence in learning quantitative skills, frequently due to math anxiety among biology majors. We therefore hypothesized that including calculation questions on all assessments, and making this explicitly known to students, might increase student motivation to master these basic skills. Our data showing continued learning gains in the second half of the semester, after formal teaching had ended, are consistent with such a motivational effect. With the data we have, we cannot determine whether the learning gains were due to a motivational effect, a testing effect, or spaced learning; it seems likely that all three were involved. Future studies, including qualitative assessment of students' motivations, will be required to determine the relative importance of these mechanisms in the learning gains.
For the study described here, we chose relatively simple calculations, both because we had observed that students struggled with these basic skills and because we were confident our students could master them. The rationale for our intervention was not just to ensure that students learn these skills but also to help these first-year students gain confidence in their ability to master quantitative approaches. Future work should examine the effectiveness of this approach for learning more complex quantitative methods in upper-level course work and should address motivation, confidence, and the transfer of these skills to subsequent coursework.
ACKNOWLEDGMENTS
This work and J.H.H. were supported by a Howard Hughes Medical Institute grant (grant number 52008121) to the University of New Orleans.
None of the authors have any conflict of interest to report.
REFERENCES
- 1. National Research Council. 2003. BIO2010: transforming undergraduate education for future research biologists. National Academies Press, Washington, DC.
- 2. National Research Council. 2009. A new biology for the 21st century: ensuring the United States leads the coming biology revolution. National Academies Press, Washington, DC.
- 3. Bialek W, Botstein D. 2004. Introductory science and mathematics education for 21st-century biologists. Science 303:788–790. doi: 10.1126/science.1095480.
- 4. Association of American Medical Colleges (AAMC), Howard Hughes Medical Institute (HHMI). 2009. Scientific foundations for future physicians: report of the AAMC-HHMI Committee. AAMC, Washington, DC.
- 5. American Association for the Advancement of Science. 2011. Vision and change in undergraduate biology education: a call to action. American Association for the Advancement of Science, Washington, DC.
- 6. Aikens ML, Dolan EL. 2014. Teaching quantitative biology: goals, assessments, and resources. Mol Biol Cell 25:3478–3481. doi: 10.1091/mbc.E14-06-1045.
- 7. Usher DC, Driscoll TA, Dhurjati P, Pelesko JA, Rossi LF, Schleiniger G, Pusecker K, White HB. 2010. A transformative model for undergraduate quantitative biology education. CBE Life Sci Educ 9:181–188. doi: 10.1187/cbe.10-03-0029.
- 8. Feser J, Vasaly H, Herrera J. 2013. On the edge of mathematics and biology integration: improving quantitative skills in undergraduate biology education. CBE Life Sci Educ 12:124–128. doi: 10.1187/cbe.13-03-0057.
- 9. Miller JE, Walston T. 2010. Interdisciplinary training in mathematical biology through team-based undergraduate research and courses. CBE Life Sci Educ 9:284–289. doi: 10.1187/cbe.10-03-0046.
- 10. Jungck JR, Gaff H, Weisstein AE. 2010. Mathematical manipulative models: in defense of "beanbag biology." CBE Life Sci Educ 9:201–211. doi: 10.1187/cbe.10-03-0040.
- 11. Depelteau AM, Joplin KH, Govett A, Miller HA, Seier E. 2010. SYMBIOSIS: development, implementation, and assessment of a model curriculum across biology and mathematics at the introductory level. CBE Life Sci Educ 9:342–347. doi: 10.1187/cbe.10-05-0071.
- 12. Thompson KV, Cooke TJ, Fagan WF, Gulick D, Levy D, Nelson KC, Redish EF, Smith RF, Presson J. 2013. Infusing quantitative approaches throughout the biological sciences curriculum. Int J Math Educ Sci Technol 44:817–833. doi: 10.1080/0020739X.2013.812754.
- 13. Bravo A, Porzecanski A, Sterling E, Bynum N, Cawthorn M, Fernandez DS, Freeman L, Ketcham S, Leslie T, Mull J, Vogler D. 2016. Teaching for higher levels of thinking: developing quantitative and analytical skills in environmental science courses. Ecosphere 7:1–20. doi: 10.1002/ecs2.1290.
- 14. Wightman B, Hark AT. 2012. Integration of bioinformatics into an undergraduate biology curriculum and the impact on development of mathematical skills. Biochem Mol Biol Educ 40:310–319. doi: 10.1002/bmb.20637.
- 15. Goldstein J, Flynn DFB. 2011. Integrating active learning & quantitative skills into undergraduate introductory biology curricula. Am Biol Teach 73:454–461. doi: 10.1525/abt.2011.73.8.6.
- 16. Duffus D, Olifer A. 2010. Introductory life science mathematics and quantitative neuroscience courses. CBE Life Sci Educ 9:370–377. doi: 10.1187/cbe.10-03-0026.
- 17. Rheinlander K, Wallace D. 2011. Calculus, biology and medicine: a case study in quantitative literacy for science students. Numeracy 4:article 3. doi: 10.5038/1936-4660.4.1.3.
- 18. Chiel HJ, McManus JM, Shaw KM. 2010. From biology to mathematical models and back: teaching modeling to biology students, and biology to math and engineering students. CBE Life Sci Educ 9:248–265. doi: 10.1187/cbe.10-03-0022.
- 19. Matthews KE, Adams P, Goos M. 2010. Using the principles of BIO2010 to develop an introductory, interdisciplinary course for biology students. CBE Life Sci Educ 9:290–297. doi: 10.1187/cbe.10-03-0034.
- 20. Zan R, Brown L, Evans J, Hannula MS. 2006. Affect in mathematics education: an introduction. Educ Stud Math 63:113–121. doi: 10.1007/s10649-006-9028-2.
- 21. Speth EB, Momsen JL, Moyerbrailean GA, Ebert-May D, Long TM, Wyse S, Linton D. 2010. 1, 2, 3, 4: infusing quantitative literacy into introductory biology. CBE Life Sci Educ 9:323–332. doi: 10.1187/cbe.10-03-0033.
- 22. Goldey ES, Abercrombie CL, Ivy TM, Kusher DI, Moeller JF, Rayner DA, Smith CF, Spivey NW. 2012. Biological inquiry: a new course and assessment plan in response to the call to transform undergraduate biology. CBE Life Sci Educ 11:353–363. doi: 10.1187/cbe.11-02-0017.
- 23. Bell E. 2011. Using research to teach an "introduction to biological thinking." Biochem Mol Biol Educ 39:10–16. doi: 10.1002/bmb.20441.
- 24. Small CJ, Newtoff KN. 2013. Integrating quantitative skills in introductory ecology: investigations of wild bird feeding. Am Biol Teach 75:269–273. doi: 10.1525/abt.2013.75.4.8.
- 25. Brownell SE, Hekmat-Scafe DS, Singla V, Seawell PC, Conklin Imam JF, Eddy SL, Stearns T, Cyert MS. 2015. A high-enrollment course-based undergraduate research experience improves student conceptions of scientific thinking and ability to interpret data. CBE Life Sci Educ 14:ar21. doi: 10.1187/cbe.14-05-0092.
- 26. Olimpo JT, Pevey RS, McCabe TM. 2018. Incorporating an interactive statistics workshop into an introductory biology course-based undergraduate research experience (CURE) enhances students' statistical reasoning and quantitative literacy skills. J Microbiol Biol Educ 19:19.1.49. doi: 10.1128/jmbe.v19i1.1450.
- 27. Thompson KV, Nelson KC, Marbach-Ad G, Keller M, Fagan WF. 2010. Online interactive teaching modules enhance quantitative proficiency of introductory biology students. CBE Life Sci Educ 9:277–283. doi: 10.1187/cbe.10-03-0028.
- 28. Hoffman K, Leupen S, Dowell K, Kephart K, Leips J. 2016. Development and assessment of modules to integrate quantitative skills in introductory biology courses. CBE Life Sci Educ 15:ar14. doi: 10.1187/cbe.15-09-0186.
- 29. Hester S, Buxner S, Elfring L, Nagy L. 2014. Integrating quantitative thinking into an introductory biology course improves students' mathematical reasoning in biological contexts. CBE Life Sci Educ 13:54–64. doi: 10.1187/cbe.13-07-0129.
- 30. Brownell SE, Kloser MJ. 2015. Toward a conceptual framework for measuring the effectiveness of course-based undergraduate research experiences in undergraduate biology. Stud High Educ 40:525–544. doi: 10.1080/03075079.2015.1004234.
- 31. Ashcraft MH. 2002. Math anxiety: personal, educational, and cognitive consequences. Curr Dir Psychol Sci 11:181–185. doi: 10.1111/1467-8721.00196.
- 32. Ashcraft MH, Krause JA. 2007. Working memory, math performance, and math anxiety. Psychon Bull Rev 14:243–248. doi: 10.3758/bf03194059.
- 33. Madlung A, Bremer M, Himelblau E, Tullis A. 2011. A study assessing the potential of negative effects in interdisciplinary math–biology instruction. CBE Life Sci Educ 10:43–54. doi: 10.1187/cbe.10-08-0102.
- 34. Flanagan KM, Einarson J. 2017. Gender, math confidence, and grit: relationships with quantitative skills and performance in an undergraduate biology course. CBE Life Sci Educ 16:ar47. doi: 10.1187/cbe.16-08-0253.
- 35. Roediger HL III, Karpicke JD. 2006. The power of testing memory: basic research and implications for educational practice. Perspect Psychol Sci 1:181–210. doi: 10.1111/j.1745-6916.2006.00012.x.
- 36. Arnold KM, McDermott KB. 2013. Test-potentiated learning: distinguishing between direct and indirect effects of tests. J Exp Psychol Learn Mem Cogn 39:940–945. doi: 10.1037/a0029199.
- 37. Bjork EL, Little JL, Storm BC. 2014. Multiple-choice testing as a desirable difficulty in the classroom. J Appl Res Mem Cogn 3:165–170. doi: 10.1016/j.jarmac.2014.03.002.
- 38. Orr R, Foster S. 2013. Increasing student success using online quizzing in introductory (majors) biology. CBE Life Sci Educ 12:509–514. doi: 10.1187/cbe.12-10-0183.
- 39. Walck-Shannon EM, Cahill MJ, McDaniel MA, Frey RF. 2019. Participation in voluntary re-quizzing is predictive of increased performance on cumulative assessments in introductory biology. CBE Life Sci Educ 18:ar15. doi: 10.1187/cbe.18-08-0163.
- 40. Balota DA, Duchek JM, Logan JM. 2007. Is expanded retrieval practice a superior form of spaced retrieval? A critical review of the extant literature, p 83–105. In Nairne JS (ed), The foundations of remembering: essays in honor of Henry L. Roediger, III. Psychology Press, New York, NY.
- 41. Carpenter SK, Cepeda NJ, Rohrer D, Kang SH, Pashler H. 2012. Using spacing to enhance diverse forms of learning: review of recent research and implications for instruction. Educ Psychol Rev 24:369–378. doi: 10.1007/s10648-012-9205-z.
- 42. Kapler IV, Weston T, Wiseheart M. 2015. Spacing in a simulated undergraduate classroom: long-term benefits for factual and higher-level learning. Learn Instr 36:38–45. doi: 10.1016/j.learninstruc.2014.11.001.
- 43. Hake RR. 1998. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66:64–74. doi: 10.1119/1.18809.

