Journal of Microbiology & Biology Education. 2009 Dec 17;10(1):43–50. doi: 10.1128/jmbe.v10.98

Assessing Student Understanding of Host Pathogen Interactions Using a Concept Inventory

Gili Marbach-Ad 1,*, Volker Briken 1, Najib M El-Sayed 1, Kenneth Frauwirth 1, Brenda Fredericksen 1, Steven Hutcheson 1, Lian-Yong Gao 1, Sam Joseph 1, Vincent T Lee 1, Kevin S McIver 1, David Mosser 1, B Booth Quimby 1, Patricia Shields 1, Wenxia Song 1, Daniel C Stein 1, Robert T Yuan 1, Ann C Smith 1,*
PMCID: PMC3577151  PMID: 23653689

Abstract

As a group of faculty with expertise and research programs in the area of host-pathogen interactions (HPI), we are concentrating on students’ learning of HPI concepts. To that end, we developed a concept inventory to measure students’ level of understanding of HPI after the completion of a set of microbiology courses (presently eight courses). Concept inventories have been useful tools for assessing student learning, and our interest was in developing such a tool to measure student learning progression across our microbiology courses. Our teaching goal was to create bridges between our courses that would eliminate excessive overlap in our offerings and support a model in which concepts and ideas introduced in one course become the foundation for concept development in successive courses. We developed our HPI concept inventory in several phases. The final product was an 18-question, multiple-choice concept inventory. In fall 2006 and spring 2007 we administered the 18-question concept inventory in six of our courses, collecting pre- and postcourse surveys from 477 students. We found that students taking pretests in the advanced courses retained the level of understanding gained in the general microbiology prerequisite course. In addition, in two of our advanced courses there was significant improvement in scores from pretest to posttest. As we move forward, we will concentrate on exploring the range of HPI concepts addressed in each course and on determining and/or creating effective methods for meaningful student learning of the HPI aspects of microbiology.


This study involved the development of a diagnostic assessment tool, or concept inventory, to measure students’ level of understanding of host-pathogen interactions (HPI) after completing a set of microbiology courses. As a group of faculty at a research university with expertise and research programs in the area of HPI, we are responsible for teaching the undergraduate courses with HPI content (presently eight courses). In fall 2004, we formed a teaching group to bridge learning between our courses. Our group includes faculty of all ranks (full professors, associate professors, assistant professors, and instructors), along with an assistant professor from the College of Education with expertise in science education and several graduate students with a strong interest in teaching who have joined us for various projects (http://www.life.umd.edu/hpi/).

Our goal was to create bridges that would eliminate excessive overlap in our offerings and support a model in which concepts and ideas introduced in one course become the foundation for concept development in successive courses. Our first task was to develop a list of 13 HPI concepts fundamental to an understanding of HPI. We used these concepts to guide the learning progression in our sequence of courses, so that students moving from the prerequisite course to more advanced courses would develop a deeper understanding of HPI (5, 18, 19, 24). We chose two “anchor” organisms to be used as exemplars of fundamental HPI concepts in all of our courses. In addition, we worked together to incorporate well-documented teaching strategies into the classroom and designed active-learning activities that address our concepts (15).

To assess how well our courses support the understanding of fundamental principles as defined by our 13 HPI concepts, we developed an HPI concept inventory. This paper describes the multistep collaborative process of building the inventory, assessing its use, and evaluating student performance through administration of the inventory.

The goal of a concept inventory, or conceptual diagnostic test, is to assess student understanding of basic concepts in a discipline (1, 7, 27). Concept inventories target discipline-specific knowledge and are designed in a multiple-choice format (1, 7, 8, 11, 20). The selection of correct responses to a multiple-choice question reveals a student’s understanding of a basic concept, whereas the selection of incorrect responses, or distractors, suggests that the student holds commonly held alternate conceptions (27).

Physicist educators have considerably altered the way physics is taught in response to student performance on the Force Concept Inventory (8, 10, 11). Their demonstration of the value of investigating what students know about fundamental concepts has encouraged several groups of biologists and chemists to develop concept inventories (1, 7, 13, 17, 20). Many concept inventories have been placed online (Field-Tested Learning Assessment Guide, http://www.flaguide.org; Bioliteracy Project, http://bioliteracy.net/).

Concept inventories are built with a multiple-choice format to allow administration to large numbers of students as well as efficient and objective scoring (27). In concept inventories, the wording of the question and response choices is based on extensive research that includes a thorough review of questions and answers to capture all possible interpretations of wording and all possible correct and incorrect answers. In its final form, each question in the concept inventory consists of one correct answer and multiple distractors. The distractors are incorrect answers based on commonly held alternate conceptions.

“Alternate conceptions” as described by Fisher (6) are ideas that differ from corresponding scientific explanations. As defined, alternate conceptions are usually held by a significant proportion of students and are highly resistant to instruction (6, 17). Alternate conceptions have previously been referred to as “misconceptions”; however, researchers find that the more positive term “alternate conception” recognizes that these conceptions can be used as anchors (4, 21) from which to move to a scientific conception when targeted instructional strategies are developed. The most important role for concept inventories is to provide instructors with an understanding of student alternate conceptions and ideas that may be actively interfering with learning.

The science education literature offers a large body of research that describes students’ alternate conceptions in different scientific topics at different age levels (17). In some cases, the concept inventory could be built based on an existing database for alternate conceptions; however, in other cases, such as in the HPI area, there are very few references in the current literature. Therefore, in order to build an HPI concept inventory, we began with the detailed process of identifying the alternate conceptions that students hold.

Our approach was similar but not identical to the two-tier method advocated by Treagust (26), Anderson et al. (1), Khodor et al. (13), and Odom and Barrow (20). The two-tier survey consisted of multiple-choice questions, each followed by a free response prompt. The two-tier method is attractive because it separates factual knowledge (tier 1, facts) from reasons for choosing a particular fact (tier 2, mechanisms and beliefs). The final product was an 18-question multiple-choice concept inventory.

MATERIALS AND METHODS

The model system for learning and courses involved. Our initiative involves eight HPI undergraduate courses. General Microbiology (600 students/year) serves as a prerequisite for the seven upper-level courses: Pathogenic Microbiology (120 students/year), Microbial Pathogenesis (25 students/year), Bacterial Genetics (80 students/year), Immunology (100 students/year), Immunology Lab (80 students/year), Epidemiology (100 students/year), and Bioinformatics (30 students/year). Our teaching group met monthly, with an average attendance of 13 members. To help students build bridges between content presented in the various courses, we decided to link discussion of host-pathogen interactions in all courses to two organisms, Escherichia coli and Streptococcus sp. Further, we decided that each course should include methods that expose students to, and engage them in, the scientific research process (15). In parallel with these goals, we developed the HPI concept inventory to evaluate our progress.

Constructing the concept inventory. Mixed methods of qualitative and quantitative approaches were used in developing the HPI concept inventory. The following steps were followed in designing, developing, implementing, and evaluating the HPI concept inventory.

(i) Developing a first version of the survey. To build questions that could be used to evaluate students’ level of thinking and understanding in each course, we considered the work of Bloom (2) and Mayer (16). We discussed the characteristics of questions that reflected rote learning as opposed to meaningful learning, and we learned how to write questions that could reliably assess a deeper level of understanding. Each faculty member submitted two questions that he or she thought a student should be able to answer at the completion of the course. We rated these according to cognitive level (2, 16), and we devised a tool that targeted the HPI concepts. We piloted the tool in three courses. After analysis of the results, we felt that we had learned quite a bit, but our tool was not yet meeting our needs. Our concerns were the following:

  1. The approach to the development of the tool was too individualized. The questions were written by distinct faculty members and merged.

  2. There seemed to be large gaps in the content assessed.

  3. We did not know how this tool could be used to monitor students’ development in meaningful understanding of HPI concepts.

(ii) Defining the content boundaries of the survey. We considered as a group this question: “What do we want our students to truly understand and remember 5 years after they have completed the set of our courses?” Accordingly, we developed a list of 13 HPI concepts (Table 1). We aimed at concepts that we believe are required for understanding HPI at a level of sophistication appropriate for microbiology majors. Content validity of the concepts was established by our complete HPI group.

TABLE 1.

The 13 HPI concepts—the big ideas for our project

Concept number Concept
1 The structural characteristics of a microbe are important in the pathogenicity of that microbe.
2 Diverse microbes use common themes to interact with the environment (host).
3 Microbial evolution is subject to forces of natural selection. Important consequences include changes in virulence and antibiotic resistance.
4 Microbes adapt and respond to the environment by altering gene expression.
5 Microbes have various strategies to cause disease.
6 Pathogens and hosts have evolved in a mutual fashion.
7 The cell wall and the cell membrane affect the bacterial response to the environment.
8 There is a distinction between a pathogen and a nonpathogen.
9 The environment will affect the phenotype (pathogenicity) of a bacterium.
10 Microbes adapt and respond to the environment by altering their metabolism.
11 Immune response has evolved to distinguish between self and nonself.
12 Immune response recognizes general properties (common themes versus specific attributes, innate versus adaptive).
13 Immune response memory is specific.

(iii) Developing a two-tier survey. Based on the HPI concept list, a 23-item multiple-choice survey with free response answers was developed. With the free open-ended response, we aimed to assess students’ alternate conceptions, which later would be used as distractors in the final multiple-choice survey. Therefore, each question had two tiers. The first tier consisted of questions with two to five choices; there could be more than one correct answer. The second tier consisted of requests for explanation (explain your answer or defend your response). Each question covered one or more concepts from the HPI concept list (Table 2). The 23 questions were piloted with a small focus group of two graduate students and two undergraduate students. Results from this focus group were analyzed by the HPI teaching team. Our 23 questions were amended to 18 two-tier questions. To establish content validity (25), we provided the draft instrument for inspection to our science content experts and a science pedagogy expert.

TABLE 2.

HPI concepts addressed in two-tier survey

Question number Question HPI concept addressed
1A Selection of an antibiotic resistant organism is based upon a change in the (a) phenotype (b) genotype (c) both (d) neither (e) either 3, 4, 10
1B Defend your response.
2A What determines a Gram stain reaction? (1) Distinction relating to bacterial structure (2) Distinction relating to bacterial function (3) Both 1
2B Defend your response.

(iv) Obtaining information about students’ alternate conceptions. In the spring of 2006, the 18-question assessment was distributed via our course management system to 200 students in General Microbiology (the introductory course) and 60 students in Bacterial Genetics (one of our advanced HPI courses). To limit the time requirement for students in this pilot, only five questions were given to each student. For each question, we received about 60 responses from the General Microbiology course and 20 responses from the advanced course. The student responses were collated and reviewed by our HPI faculty as a group. We met to score student responses for alternate conceptions and then to develop multiple-choice questions that use commonly held alternate conceptions as distractors. Tables 3A and 3B show an example of the analysis of a two-tier question. For each question, we first counted (quantitative analysis) the number of students selecting each choice in the multiple-choice part of the question (first tier). Note that, depending upon how students chose to defend their response, there could be more than one correct option among the multiple choices. Sometimes a student answered the first-tier question correctly but produced an incorrect explanation, and vice versa. This survey was not used to determine student course grades, but students who participated were awarded extra credit points. We were interested in finding out what alternate conceptions students held.

TABLE 3A.

Results from analysis of answers to a first-tier multiple-choice question

Question and answers No. of students choosing each answer
General Microbiology (n = 68) Bacterial Genetics (n = 25)
1. Selection of an antibiotic resistant organism is based upon a change in the
  a. Phenotype 1 7
  b. Genotype 38 4
  c. Both 25 13
  d. Neither 0 1
  e. Either 4 0

TABLE 3B.

Results from analysis of student open-ended responses to the second-tier prompt: “Defend your response”

Major categories of students’ responsesa No. of students grouped under each category
General Microbiology Bacterial Genetics
Excellent response 21 16
Basic response, more required to indicate higher understanding 9 0
Students didn’t understand that selection is based on phenotypes 28 1
Student responses indicated that they did not understand that a change in phenotype is due to a change in genotype 3 6
Alternate conception was with the understanding of the differences between genotype and phenotype 9 1
Either student did not answer question or student response was completely off the mark 3 0
a Students’ open-ended responses were grouped into major categories.

To define categories (qualitative analysis) for the second-tier responses (“defend your response”), we used the technique of Hodder, Ebert-May, and Batzli (12). We formed three small groups of three instructors each. Each group received five or six questions to analyze. For each question, the group read all of the answers and established categories (level of correctness and alternate conceptions). Each member then went through each response and categorized it. Finally, the three members of the group compared their ratings and discussed responses to reach a consensus for each student response. Below are two examples of common alternate conceptions for question 1.

Question 1. Selection of an antibiotic resistant organism is based upon a change in the (a) phenotype (b) genotype (c) both (d) neither (e) either.

a. Students didn’t understand that selection is based on phenotypes. One student who selected “(b) genotype” wrote: “When an organism becomes resistant to antibiotics (when it acquires an antibiotic-resistant gene that has been inserted as a marker), the organism’s genotype has been changed.”

b. Alternate conception was with understanding of the differences between genotype and phenotype. The student wrote “This must be a change in the genotype because having antibiotic resistance will not necessarily change the look of an organism (phenotype). It will merely allow it to survive in situations where the antibiotic is present.”

(v) Developing a multiple-choice concept inventory. Following the analysis of all questions, each group built two multiple-choice questions for the final assessment tool, the HPI concept inventory. Each question usually consisted of the opening sentence or sentences of the earlier two-tier question plus four or five response choices: one correct answer and three or four distractors reflecting the students’ alternate conceptions revealed in the analysis. For example, one question developed from the information presented in Table 3 was the following:

The selection of antibiotic-resistant, transformed bacteria is based upon a change in the:

  1. phenotype of the bacteria.

  2. genotype of the bacteria.

  3. phenotype and genotype of the bacteria.

  4. genotype and physiology of the bacteria.

  5. genotype and morphology of the bacteria.

RESULTS

In fall 2006 and spring 2007, we administered the 18-question concept inventory in six of our courses. Participation in the survey was voluntary for the students. Students who participated were provided extra credit points. We requested permission from the students to use their responses for our research. Only data from students who gave permission were analyzed in the study. We collected pre- and postcourse surveys from 477 students (gender: 69% females, 31% males; ethnicity: 46% white, 26% Asian, 11% African American, 7% Hispanic, 10% other) with the following course distribution:

  • General Microbiology (BSCI 223), fall 2006, 127 students

  • Pathogenic Microbiology (BSCI 424), fall 2006, 96 students

  • General Microbiology (BSCI 223), spring 2007, 109 students

  • Bacterial Genetics (BSCI 412), spring 2007, 45 students

  • Immunology (BSCI 422), spring 2007, 48 students

  • Epidemiology (BSCI 425), spring 2007, 52 students

Student performance on the concept inventory. Table 4 shows average scores for the pre- and postcourse concept inventories. Each correct response was weighted 1 point; because we removed two questions from the analysis (see explanation below), the maximum number of points possible on the concept inventory was 16. An inspection of these data shows that in both semesters of General Microbiology, the prerequisite course, the pre- and postcourse scores are similar. This is an important finding, because different instructors taught the course in the fall and spring semesters; for future analysis, we can treat these offerings as comparable. Encouragingly, using t test analysis, we found that in four of our courses (both BSCI 223 courses, BSCI 424, and BSCI 422) there was significant improvement in concept inventory scores from presurvey to postsurvey. Moreover, students taking the presurvey in the advanced courses retained the level of understanding gained in the prerequisite course (scores on the BSCI 223 postsurvey are around 7.0, and scores on all presurveys in the advanced courses are around 7 or greater).

TABLE 4.

Average scores on the pre- and postcourse concept inventorya

Pre or post General Microbiology, fall 2006 (n = 127) Pathogenic Microbiology, fall 2006 (n = 96) General Microbiology, spring 2007 (n = 109) Bacterial Genetics, spring 2007 (n = 45) Immunology, spring 2007 (n = 48) Epidemiology, spring 2007 (n = 52)
Pre 4.9 7.3 4.7 7.8 9.2 6.6
Post 7.0b 8.7b 7.3b 7.6 9.9c 6.6
a Each correct response was weighted 1 point. The maximum number of points was 16. Values were calculated without data from questions 8 and 13.
b P < 0.001.
c P < 0.05.
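The t test comparisons of pre- and postsurvey scores can be illustrated with a short sketch. The paper does not state whether paired or independent-samples t tests were used, so the paired version below, the function name, and the data are our own illustrative assumptions:

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic: mean pre-to-post difference divided by
    the standard error of the differences. (Illustrative sketch; the paper
    does not specify which t test variant was used.)"""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical pre/post inventory scores for four students.
t = paired_t([4, 5, 6, 5], [7, 6, 8, 7])
```

With real class data, the resulting statistic would be compared against the t distribution with n − 1 degrees of freedom to obtain P values such as those reported in Table 4.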

Table 5 shows the percentage of correct answers for each question on the pre- and postcourse surveys. The table contains a large amount of data. Because our project involves eight different courses, our first question was how students in the different courses would respond to each question. We looked at the percentage of correct answers for each question in each course and examined the relationship between students’ overall scores and their ability to choose the correct response to a specific question (discrimination factor).

TABLE 5.

Percentages of correct answers on pre- and postcourse concept inventory

Question Pre or post General Microbiology, fall 2006 (n = 127) Pathogenic Microbiology, fall 2006 (n = 96) General Microbiology, spring 2007 (n = 109) Bacterial Genetics, spring 2007 (n = 45) Immunology, spring 2007 (n = 48) Epidemiology, spring 2007 (n = 52)
1 Pre 25 32 12 35 56 36
Post 30 24 26 44 67 42
2 Pre 9 25 3 20 37 15
Post 24 46 21 29 35 17
3 Pre 18 33 21 49 48 31
Post 27 40 33 60 50 38
4 Pre 87 87 77 87 92 85
Post 88 88 87 84 87 90
5 Pre 7 24 10 31 50 23
Post 29 26 28 33 65 31
6 Pre 60 82 65 80 79 65
Post 78 85 83 71 77 69
7 Pre 28 42 13 38 56 40
Post 42 57 29 33 52 31
8 Pre 17 23 16 13 12 23
Post 17 25 32 16 12 6
9 Pre 23 59 25 47 50 48
Post 46 63 56 49 56 36
10 Pre 41 71 40 73 75 58
Post 61 78 78 73 69 58
11 Pre 18 45 23 56 48 42
Post 39 59 38 53 60 50
12 Pre 62 45 64 62 52 42
Post 49 65 72 51 50 35
13 Pre 11 17 9 27 23 21
Post 12 26 11 9 21 15
14 Pre 28 41 37 47 48 33
Post 35 52 43 38 52 33
15 Pre 10 30 12 29 69 31
Post 40 38 20 27 75 36
16 Pre 19 36 17 33 48 23
Post 34 48 32 24 60 21
17 Pre 43 57 38 60 79 60
Post 65 60 57 58 81 50
18 Pre 13 24 17 36 46 29
Post 16 40 20 31 58 25

We grouped students’ scores on the inventory into categories: scores in the top 25% were placed in the “high” category, scores in the middle 50% in the “medium” category, and scores in the bottom 25% in the “low” category. Discrimination values (range, 0–1) were then calculated for each question. A value below 0.30 for a question meant that even students who did well on the concept inventory overall (the high-performance group) performed poorly on that question. We reviewed all questions administered in each class and found that in every class, two questions (8 and 13) provided poor discrimination; no one did well on them (Table 5).
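A standard way to obtain such a discrimination value is the upper-minus-lower index: the proportion of the high group answering an item correctly minus the proportion of the low group. The paper does not give its exact formula, so the sketch below, including the function name and the hypothetical class data, is an assumption:

```python
def discrimination_index(total_scores, item_correct):
    """Upper-minus-lower discrimination index for one item (assumed formula).

    total_scores: each student's overall inventory score.
    item_correct: 1 if that student answered this item correctly, else 0.
    Returns the proportion correct among the top 25% of scorers minus the
    proportion correct among the bottom 25%.
    """
    n = len(total_scores)
    order = sorted(range(n), key=lambda i: total_scores[i])
    k = max(1, n // 4)                     # size of each 25% group
    low, high = order[:k], order[-k:]
    p_high = sum(item_correct[i] for i in high) / k
    p_low = sum(item_correct[i] for i in low) / k
    return p_high - p_low

# Hypothetical class of eight students: a well-discriminating item is
# answered correctly mainly by high scorers; a poor item is missed by
# nearly everyone, so the index stays near zero.
scores = [3, 5, 6, 8, 10, 12, 14, 15]
good_item = [0, 0, 0, 1, 0, 1, 1, 1]
poor_item = [0, 0, 0, 1, 0, 0, 0, 0]
```

An item falling below the group’s 0.30 threshold, as questions 8 and 13 did, would then be flagged for review.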

Questions 8 and 13 were designed to address issues regarding bacterial metabolism (Table 1, concept 10). The group reviewed and discussed the importance of concept 10 and analyzed the clarity of, and the specific student alternate understandings addressed in, both questions. As a result of this discussion, question 8 was reworded. Question 13 was left as is, because the group decided that its wording was clear; we believe that even our best students do not understand this concept. In that sense it was an excellent question, as it revealed a gap in our curriculum.

DISCUSSION

The idea of a concept inventory began with the Force Concept Inventory, which was developed to measure students’ conceptual understanding of motion and force (11). Similar multiple-choice concept inventories have more recently been developed for the assessment of student learning in other areas, including chemistry, biology, astronomy, statistics, engineering, and geosciences (1, 13, 17, 20, 22). As a group of instructors who care about their teaching and have taken on the challenge of creating a cohesive set of courses that result in meaningful learning of HPI concepts, we sought to assess the effectiveness of our teaching efforts. We believe that the process of constructing a reliable concept inventory as a group has had great value, not only in the product itself but also in the conversations about teaching and learning within our group. Together we worked to articulate the most important concepts for undergraduates to grasp in order to develop meaningful learning of HPI (the “big ideas”). Then, through collaborative effort, we built the HPI concept inventory to assess student progress in our courses. Through the review of student answers and comments on the HPI concept inventory, we have developed a deeper understanding of how students perceive HPI concepts. As Hestenes and Halloun (9) observed in interpreting results from the Force Concept Inventory, student responses to our concept inventory surprised each of us (i.e., “how could my students miss that?”). Reading and discussing students’ choices of answers and their accompanying comments gave us very specific information that we plan to use for course development.

Through the implementation of the concept inventory in six different courses, we generated a substantial data set. Our goal with these data was to evaluate our questions as appropriate indicators of student understanding of HPI concepts. This paper reports our review of the data, targeted toward understanding how students in six microbiology courses respond to the 18 concept inventory questions. Following the procedure for generating a valid and reliable set of concept inventory questions (14), we reviewed the questions to see how students in each of our courses responded. We found two questions with a very poor discrimination factor: one was determined to be worded poorly; the other indicated a gap in our curriculum.

We did not expect that our students would have perfect scores on the concept inventory. The Immunology lecture course is generally the last course taken by students, in the spring of their senior year; the average concept inventory score following that course is 9.9 of a possible 16. As we continue to work on the development of the concept inventory and on our curriculum initiatives, we will monitor student scores as indicators of our course development progress.

Through the first distribution of the concept inventory, we obtained significant findings. For the General Microbiology course, we found a significant increase in student learning as measured by the HPI concept inventory. This increase was consistent across two semesters in which the course was taught by different instructors. General Microbiology is taught in an active-learning course format (23), so this result suggests that the learning gains associated with the format are instructor independent.

Further we found that the presurvey scores for advanced courses were not significantly different from postsurvey scores for students completing the prerequisite General Microbiology course. We attribute this retention of learning to the active-learning strategies that have been in place in General Microbiology for several years (23). Research widely supports the claim that students learn best when they actively participate and are engaged in their learning. When students learn actively, they retain more content for a longer time and are able to apply that material in a broader range of contexts (3).

Finally, in Pathogenic Microbiology and Immunology, two advanced courses, we observed significant improvement in postsurvey scores relative to presurvey scores. As we move forward, we will concentrate on exploring the range of HPI concepts addressed in each course. The detailed analysis of student performance on each question in each course will help us determine and/or create effective methods for meaningful student learning in each.

We believe that the concept inventory will serve as a useful tool to monitor our course development initiatives. Curriculum initiatives under development include adding and adapting published active-learning activities to our courses such as clicker questions, case studies, and team projects, and developing teaching tools that bring our research interests into the classroom in authentic ways. Building the concept inventory required the collaborative efforts of research and teaching faculty. We believe that this team approach could be a model for others working on curriculum and assessment projects.

Acknowledgments

This research was supported in part by a grant to the University of Maryland from the Howard Hughes Medical Institute through the Undergraduate Science Education Program. This work has been approved by the Institutional Review Board as IRB 060140. Special thanks to Katerina V. Thompson for editorial comments and long-standing support of the HPI teaching group. We also thank Laura Cathcart, science education graduate student, for feedback from the student point of view and Katherine C. McAdams for help with statistics.

REFERENCES

  • 1. Anderson DL, Fisher KM, Norman GJ. Development and evaluation of the conceptual inventory of natural selection. J Res Sci Teach. 2002;39:952–978. doi: 10.1002/tea.10053.
  • 2. Bloom BS, editor. Taxonomy of educational objectives. Handbook 1: cognitive domain. Longman; New York, NY: 1984.
  • 3. Bransford JD, Brown AL, Cocking RR, editors. How people learn: brain, mind, experience, and school. National Academy Press; Washington, DC: 2000.
  • 4. Clement J, Brown DE, Zietman A. Not all preconceptions are misconceptions: finding “anchoring conceptions” for grounding instruction on students’ intuitions. Int J Sci Educ. 1989;11:554–565. doi: 10.1080/0950069890110507.
  • 5. Duschl RA, Schweingruber HA, Shouse AW, editors. Taking science to school: learning and teaching science in grades K-8. The National Academies Press; Washington, DC: 2007.
  • 6. Fisher KM. Amino acids and translation: a misconception in biology. In: Helm H, Novak JD, editors. Proceedings of the international seminar misconceptions in science and mathematics. Cornell University; Ithaca, NY: 1983. pp. 407–419.
  • 7. Garvin-Doxas K, Klymkowsky MW. Understanding randomness and its impact on student learning: lessons from the biology concept inventory (BCI). CBE Life Sci Educ. 2008;7:227–233. doi: 10.1187/cbe.07-08-0063.
  • 8. Hake RR. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys. 1998;66:64–74. doi: 10.1119/1.18809.
  • 9. Hestenes D, Halloun I. Interpreting the force concept inventory: a response to Huffman and Heller. Phys Teach. 1995;33:502–506. doi: 10.1119/1.2344278.
  • 10. Hestenes D, Wells M. A mechanics baseline test. Phys Teach. 1992;30(3):159. doi: 10.1119/1.2343498.
  • 11. Hestenes D, Wells M, Swackhamer G. Force concept inventory. Phys Teach. 1992;30(3):141–158. doi: 10.1119/1.2343497.
  • 12. Hodder J, Ebert-May D, Batzli J. Coding to analyze students’ critical thinking. Front Ecol Environ. 2006;4:162–163. doi: 10.1890/1540-9295(2006)004[0162:CTASCT]2.0.CO;2.
  • 13. Khodor J, Gould Halme D, Walker GC. A hierarchical biology concept framework: a tool for course design. Cell Biol Educ. 2004;3:111–121. doi: 10.1187/cbe.03-10-0014.
  • 14. Klymkowsky MW, Garvin-Doxas K. Recognizing student misconceptions through Ed’s Tools and the biology concept inventory. PLoS Biol. 2008;6:e3. doi: 10.1371/journal.pbio.0060003.
  • 15. Marbach-Ad G, Briken V, Frauwirth K, Gao L, Hutcheson S, Joseph S, Mosser D, Parent B, Shields P, Song W, Stein D, Swanson K, Thompson K, Yuan R, Smith AC. A faculty team works to create content linkages among various courses to increase meaningful learning of targeted concepts of microbiology. Cell Biol Educ. 2007;6:155–162. doi: 10.1187/cbe.06-12-0212.
  • 16. Mayer RE. Rote versus meaningful learning. Theory Pract. 2002;41:226–232. doi: 10.1207/s15430421tip4104_4.
  • 17. Mulford DR, Robinson WR. An inventory for alternate conceptions among first-semester general chemistry students. J Chem Educ. 2002;79(6):739–744. doi: 10.1021/ed079p739.
  • 18. National Assessment Governing Board. Science assessment and item specifications for the 2009 National Assessment of Educational Progress. National Assessment Governing Board; Washington, DC: 2006.
  • 19. National Research Council. Taking science to school: learning and teaching science in grades K-8. National Academy Press; Washington, DC: 2006.
  • 20. Odom AL, Barrow LH. Development and application of a two-tier diagnostic test measuring college biology students’ understanding of diffusion and osmosis after a course of instruction. J Res Sci Teach. 1995;32:45–61. doi: 10.1002/tea.3660320106.
  • 21. Redish EF. Teaching physics with the physics suite. John Wiley & Sons, Inc.; Hoboken, NJ: 2003.
  • 22. Rhoads TR, Roedel RJ. The wave concept inventory—a cognitive instrument based on Bloom’s taxonomy. Proceedings of the 1999 Frontiers in Education Conference; San Juan, Puerto Rico. Champaign, IL: Stipes Publishing LLC; 1999. http://www.fie-conference.org/fie99/.
  • 23. Smith AC, Stewart R, Shields P, Hayes-Klosteridis J, Robinson P, Yuan R. Introductory biology courses: a framework to support active learning in large enrollment introductory science courses. Cell Biol Educ. 2005;4:143–156. doi: 10.1187/cbe.04-08-0048.
  • 24. Smith C, Wiser M, Anderson CW, Krajcik J. Implications of research on children’s learning for assessment: a proposed learning progression for matter and the atomic molecular theory. Meas Interdiscip Res Perspect. 2006;4:1–98. doi: 10.1080/15366367.2006.9678570.
  • 25. Smith ML, Glass GV. Research and evaluation in education and the social sciences. Prentice Hall; Englewood Cliffs, NJ: 1987.
  • 26. Treagust DF. Development and use of diagnostic tests to evaluate students’ misconceptions in science. Int J Sci Educ. 1988;10:159–169. doi: 10.1080/0950069880100204.
  • 27. Zeilik M. Classroom assessment techniques: conceptual diagnostic tests. http://www.flaguide.org/cat/diagnostic/diagnostic5.php.
