Journal of Microbiology & Biology Education. 2017 Apr 27;19(1):19.1.52. doi: 10.1128/jmbe.v19i1.1368

A Call for Programmatic Assessment of Undergraduate Students’ Conceptual Understanding and Higher-Order Cognitive Skills

Lacy M Cleveland 1,2,*, Thomas M McCabe 3, Jeffrey T Olimpo 4
PMCID: PMC5969447  PMID: 29904561

Abstract

In response to empirical evidence and calls for change, individual undergraduate biology instructors are reforming their pedagogical practices. To assess the effectiveness of these reforms, many instructors use course-specific or skill-specific assessments (e.g., concept inventories). We commend our colleagues’ noble efforts, yet we contend that this is only a starting point. In this Perspectives article, we argue that departments need to engage in reform and programmatic assessment to produce graduates who have both subject-matter knowledge and higher-order cognitive skills. We encourage biology education researchers to work collaboratively with content specialists to develop program-level assessments aimed at measuring students’ conceptual understanding and higher-order cognitive skills, and we encourage departments to develop longitudinal plans for monitoring their students’ development of these skills.

INTRODUCTION

In response to overwhelming empirical evidence supporting the use of active learning, collegiate biology instructors across the United States are revising their instructional practices. Wishing to quantify their efforts, instructors often use course-specific assessments to measure learning gains or compare the effectiveness of various instructional methods (e.g., 1). As the number of discipline-based education researchers has grown, the types and scales of research questions have increased, including asking how specific pedagogical methods influence students’ understanding of specific concepts (e.g., cellular division) or development of certain skills (e.g., critical thinking). Answering such questions depends on the availability of valid and reliable assessment tools.

In the last decade, discipline-based education researchers have developed numerous assessments designed to measure students’ conceptual understanding (for an overview, see 2) and skills (e.g., 3). The majority of assessments are focused on conceptual understanding and are course-specific (e.g., Enzyme-Substrate Concept Inventory (4); Genetics Concept Assessment (5); and the Introductory Molecular and Cell Assessment (6)). Development of such assessments has been invaluable to enhancing the community’s understanding of which instructional practices promote learning. However, we believe it is time to begin developing instruments to assess, at the programmatic level, foundational conceptual understanding and higher-order cognitive skills (HOCS), or critical thinking skills, which we define as the ability to apply knowledge to a real-world problem, evaluate information, and create solutions.

Our appeal arose from data collected at a university in the western United States. These data indicated that common misconception types (e.g., teleological, anthropocentric, and essentialist, defined in Table 1) persisted across all academic levels (Figs. 1–4) and that most assessments primarily focused on students’ use of lower-order cognitive skills throughout their program of study (Table 2). In this Perspectives article, we argue for providing students ongoing and purposeful opportunities to develop higher-order cognitive skills, which we assert provide a means of challenging misconceptions. We acknowledge the difficulties in developing a disposition toward critical thinking (7) and the fact that students possess robust misconceptions (e.g., 8, 9). Therefore, we are also calling for programmatic efforts and monitoring of both students’ HOCS and conceptual understanding (i.e., ongoing formative assessment), aided by the development of one or several programmatic-level concepts and skills assessment tools.

TABLE 1.

Types and examples of cognitive construals.(a)

Cognitive Construal | Definition | Examples(b)
Anthropocentric | Assigning human characteristics to non-human objects or phenomena. | The main force that drove their evolution was the need to increase in size to become less susceptible to land-based predators. (PubA:Gr11:231)(c)
Essentialist | Assuming similar classes of items function in the same way in similar situations. | Changing a single gene in an organism results in a new kind of organism.
Teleological | Systems or individuals work in a specific manner to achieve an end goal. | An organism may become adapted to a particular environment through its interactions with it. (PubA:Gr12:277) Organisms develop features that help them survive. (PubA:Gr12:285)
(a) Information presented in this table is based on Coley and Tanner’s 2012 (15) and 2015 (16) work.

(b) Examples are taken from the textbooks analyzed in Tshuma and Sanders’ (17) paper, indicating that teleological and anthropocentric examples are often found in textbooks. PubA refers to the textbook that was analyzed; Gr refers to the grade level of the text; the final number refers to the page on which the statement was written.

(c) This statement has both anthropocentric and teleological errors.

FIGURE 1.


Students were asked to indicate their agreement (0 = strongly disagree, 5 = strongly agree) with anthropocentric, teleological, and essentialist misconceptions. No significant interaction was found between students’ academic level (freshman, sophomore, and upperclass [junior and senior] courses) and misconception type (F(4, 352) = 0.961, p = 0.429). Participants were given a list of six misconceptions (two for each type of cognitive construal), asked to indicate how strongly they agreed or disagreed with each statement, and then asked to provide a brief description of their understanding. Some students did not provide responses for all six statements. The number of statements answered per academic level is as follows: freshman (n = 354), sophomore (n = 492), junior (n = 476), and senior (n = 258).
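The caption above does not specify the exact model behind the reported F test. Purely as an illustration, the Python sketch below shows one way a level-by-construal interaction on Likert agreement ratings could be tested with a two-way ANOVA in statsmodels; every value in the data frame is an invented placeholder, not the study’s data, and the sketch ignores any repeated-measures structure in the actual survey.

```python
# Illustrative sketch only: a two-way ANOVA testing an academic level x
# construal type interaction on 0-5 Likert agreement ratings.
# All data below are invented placeholders, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
levels = ["freshman", "sophomore", "upperclass"]
construals = ["anthropocentric", "teleological", "essentialist"]

# Twenty hypothetical ratings per level x construal cell.
ratings = pd.DataFrame([
    {"level": lv, "construal": ct, "agreement": int(rng.integers(0, 6))}
    for lv in levels for ct in construals for _ in range(20)
])

# The C(level):C(construal) row of the ANOVA table is the interaction test
# analogous to the F statistic reported in the figure caption.
model = ols("agreement ~ C(level) * C(construal)", data=ratings).fit()
print(sm.stats.anova_lm(model, typ=2))
```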

FIGURE 2.


Students were randomly given two of the four anthropocentric statements, which concerned cell death, sexual reproduction, the relative size of males and females, or how plants acquire food. A chi-squared analysis was performed, and no relationship was found between students’ academic level (freshman, sophomore, and upperclass) and their level of agreement (agree, neutral, or disagree) with the various anthropocentric statements: cell death, χ2(4, N = 91) = 1.094, p = 0.895; sexual reproduction, χ2(4, N = 92) = 1.429, p = 0.839; males are bigger, χ2(4, N = 100) = 4.304, p = 0.366; plants get food from soil, χ2(4, N = 193) = 4.023, p = 0.403.

FIGURE 3.


Students were randomly given two of the four teleological statements related to evolution and cellular respiration. A chi-squared analysis was performed, and no relationship was found between the students’ academic level (freshman, sophomore, and upperclass) and the students’ level of agreement (agree, neutral, or disagree) with the various teleological statements: birds have wings, χ2(4, N = 90) = 2.726, p = 0.605; species adapt, χ2(4, N = 90) = 1.212, p = 0.876; evolution, χ2(4, N = 99) = 4.372, p = 0.358; plants produce O2, χ2(4, N = 102) = 1.624, p = 0.804.

FIGURE 4.


Students were randomly given two of the four essentialist statements related to evolution and cellular respiration. A chi-squared analysis was performed, and no relationship was found between students’ academic level (freshman, sophomore, and upperclass) and level of agreement (agree, neutral, or disagree) with three of the essentialist statements: species, χ2(4, N = 90) = 0.586, p = 0.965; homeostasis, χ2(4, N = 97) = 7.731, p = 0.102; wetlands, χ2(4, N = 99) = 1.791, p = 0.774. Freshmen displayed a significantly higher level of misconception for the genetic changes statement, χ2(4, N = 90) = 15.802, p = 0.003.
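The analyses reported in Figures 2 through 4 are chi-squared tests of independence between academic level and agreement category. The following sketch, using scipy and an invented 3 × 3 table of counts (not the study’s data), illustrates the form of such a test.

```python
# Minimal sketch: chi-squared test of independence between academic level
# (rows) and agreement with one misconception statement (columns).
# Counts are invented placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

#                  agree  neutral  disagree
observed = np.array([
    [12,  8, 10],   # freshman
    [11,  9, 12],   # sophomore
    [10,  7, 14],   # upperclass (junior and senior)
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}, N = {observed.sum()}) = {chi2:.3f}, p = {p:.3f}")
```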

TABLE 2.

Cognitive level of test questions per course level.

Course Level | LOCS (# of questions) | Apply (# of questions) | HOCS (# of questions) | Total | Percent Ratio (LOCS:App:HOCS)
100 | 389 | 23 | 26 | 438 | 89:5:6(a)
200 | 184 | 98 | 77 | 359 | 51:27:21(a)
300 | 164 | 31 | 52 | 247 | 66:13:21(a)
400 | 47 | 16 | 53 | 116 | 41:14:46(a)
Total | 784 | 168 | 208 | 1,160 | 66:14:18(a)

Five faculty members submitted tests: 8 from the 100 level, 18 from the 200 level, 7 from the 300 level, and 3 from the 400 level. Test questions were defined as individual prompts associated with separate points; for example, some questions included multiple parts, each with an individual point value, and each part was considered a separate question in our analysis. Percent ratios were determined by calculating the percentage of LOCS, application-level, and HOCS questions per course level according to the category descriptions given in the Blooming Biology Tool (19). Two authors of this paper (LMC and TMM) worked simultaneously and collaboratively to rate all questions; because questions were scored together, no inter-rater reliability value was calculated. To determine whether a difference existed between the ratios of lower-order, application-level, and higher-order test questions across course levels, we ran a chi-squared test of independence.

(a) The results indicated a significant difference (χ2[6, N = 1,160] = 211.968, p < 0.001, two-tailed) between the percentages of lower-order, application-level, and higher-order test questions across the various course levels. Additionally, Cramér’s V, a measure of association between two variables (here, course level and cognitive level of test question) used to indicate effect size, revealed a small effect size (ϕc = 0.302, p < 0.001).
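Because Table 2 reports the full set of question counts, the chi-squared statistic and Cramér’s V above can be reproduced directly from the table. The sketch below is an illustrative reproduction, not the authors’ original analysis script; it runs scipy’s chi-squared test of independence on the published counts and computes Cramér’s V as sqrt(χ² / (N(k − 1))), where k is the smaller of the number of rows and columns, recovering χ² ≈ 211.97 and V ≈ 0.302.

```python
# Reproducing the Table 2 analysis from the published counts:
# rows = course level (100-400), columns = LOCS, Application, HOCS questions.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [389, 23, 26],   # 100 level
    [184, 98, 77],   # 200 level
    [164, 31, 52],   # 300 level
    [ 47, 16, 53],   # 400 level
])

chi2, p, dof, _ = chi2_contingency(observed)
n = observed.sum()
k = min(observed.shape)                    # smaller of row/column counts
cramers_v = np.sqrt(chi2 / (n * (k - 1)))  # Cramér's V effect size

print(f"chi2({dof}, N = {n}) = {chi2:.3f}, p = {p:.3g}")  # ~211.97, p << 0.001
print(f"Cramér's V = {cramers_v:.3f}")                    # ~0.302
```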

The goal of this article is threefold. First, we call for biology education researchers to develop programmatic-level assessment tools that departments can use to monitor and measure changes in students’ conceptual understanding and HOCS throughout their degree programs. Second, we provide insight into the misconceptions and HOCS literature and demonstrate that the use of evidence-based teaching practices can assist students in developing their conceptual understanding and HOCS. Finally, we call for departments to develop a plan for promoting and assessing conceptual understanding and HOCS.

DISCUSSION

Difficulties in dismantling misconceptions and developing higher-order cognitive skills

Dismantling misconceptions

The literature is rich with empirical evidence indicating how difficult it is to confront and dismantle students’ misconceptions in biology (e.g., 8, 9) and is equally rich with evidence emphasizing the importance and difficulty of developing students’ HOCS (e.g., 10). In agreement with prior research, we contend that receiving a bachelor’s degree in a field of knowledge should be an indication that an individual has developed conceptual understanding and the ability to think critically within that area of expertise (e.g., 11). However, our data indicated that students near graduation may not display sufficient conceptual understanding or HOCS. Specifically, our data indicated that students nearing the completion of their bachelor’s degree retained biology-specific misconceptions (Figs. 1–4). Furthermore, others have shown that over the course of their undergraduate degree, college students show very limited improvement in their critical thinking skills (12).

Unfortunately, a vast quantity of data suggests that such issues begin early in a student’s career, as students’ prior knowledge may include misconceptions. Accordingly, students are liable to associate new concepts with misconceptions if instructors do not address such misconceptions (9). Instructors assist in facilitating learners’ refinement of their knowledge (for essays with varying viewpoints see 13, 14); thus, large learning gains are related to an instructor’s ability to dismantle misconceptions.

In theory, the number and variety of misconceptions students could hold are infinite. Fortunately, Coley and Tanner (15, 16) suggested that students’ misconceptions can often be classified into predictable categories. They coined the term “cognitive construals” to describe seemingly unrelated biological misconceptions that are linked to intuitive ways of thinking, such as teleological, essentialist, and anthropocentric arguments (see our Table 1). These three categories of misconceptions provide instructors and researchers a foundation for predicting, exploring, and challenging students’ biological cognitive construals. Pedagogical and assessment practices promoting conceptual understanding (as opposed to memorization), developing HOCS, and building students’ metacognitive abilities are associated with restructuring students’ misconceptions to accurately reflect biological knowledge (18). In particular, we argue that the same pedagogical strategies shown to be effective in promoting HOCS are also key practices associated with dismantling misconceptions.

Developing HOCS

Higher-order cognitive skills represent a skill set associated with students being able to apply information to real life, use information to solve problems and create solutions, and use their knowledge to evaluate information (19). Undergraduate students struggle with HOCS (11, 20); for example, students may not demonstrate a commitment to rational inquiry, a necessary component of critical thinking. As undergraduate students progress from their freshman to final year of college, they do not demonstrate large, if any, gains in HOCS (12, 21). Research indicates that critical thinking (which includes using HOCS) can be improved over the course of one semester (22); however, creating a critical-thinking disposition requires years of reinforcement (7). To date, effective course design has meant aligning curricular activities and assessment procedures with learning objectives (23–27); however, these alignments may not necessarily include a focus on students’ cognitive abilities. In addition to providing students with low-stakes practice, assessment techniques influence students’ approaches to thinking (i.e., the use of primarily lower-order or higher-order cognitive skills when studying) (5, 28). When instructors employ assessments requiring HOCS, students are motivated to participate in deeper-learning strategies—approaches that prompt them to move beyond factual recall and instead focus on developing their conceptual understanding and ability to apply the information to an authentic circumstance (i.e., HOCS) (5, 28). Collegiate instructors need to provide students the opportunity to develop their HOCS, skills we assert help to develop students’ conceptual understanding.

Instructional practices promoting conceptual understanding and HOCS

Under the assumption that an individual obtaining a bachelor’s degree in a particular field of study should have both conceptual understanding and domain-specific higher-order cognitive skills (11), it is appropriate for an undergraduate degree program to focus on both of these elements. Luckily, a great deal of overlap exists between the instructional approaches associated with improving conceptual understanding and those associated with developing HOCS. Active-learning strategies including, but not limited to, discussions, inquiry, data analysis, and concept maps have been associated with improved conceptual understanding (29–31). For example, the use of discussion has been shown to be an effective method for allowing students to identify their own misconceptions (31). To address students’ misconception that enzymes only work inside of an extremely narrow temperature range, instructors are encouraged to ask their students to discuss how enzymes work when an individual has a fever. Following discussion, students develop an experiment to test their hypotheses (31). Of course, providing students with inquiry opportunities related to each of their misconceptions can be costly in terms of materials and time. Alternatively, by presenting students with data, instructors can also create cognitive dissonance (31). Moreover, instructors can encourage students to evaluate whether the data concur with their thinking, providing an excellent opportunity to promote deep learning as well as conceptual understanding. The literature provides an ample variety of active-learning strategies, and we encourage instructors to employ strategies supported by empirical evidence that build conceptual understanding and promote HOCS (for some immediate resources, see, e.g., 29, 31, 32).

Importantly, development of expertise, conceptual understanding, and domain-specific HOCS in a scientific field requires thousands of hours (33). Given the difficulty of correcting misconceptions and the length of time it takes individuals to develop a disposition toward HOCS, we argue that departments, as well as individual instructors, need to evaluate their current practices and develop a program-level plan to monitor students’ conceptual understanding and domain-specific critical thinking from the beginning to the end of their degree programs. By providing programmatic scaffolding and consistent (and frequent) opportunities to practice and assess students’ HOCS, departments will also promote the development of students’ conceptual understanding. To come full circle, we do not take a deficit view of the wealth of work already completed in developing individual courses and shorter-term interventions; rather, we encourage the use of these curricular design strategies to reshape areas of need identified in the auditing process. Simultaneously developing students’ conceptual understanding and HOCS serves not only to promote the goals of higher education (i.e., developing subject-matter expertise and HOCS) but also to provide a public benefit, that is, training STEM workers who are knowledgeable and able to apply their knowledge to solve problems and to create solutions.

Action steps and call for research

Educational theory demonstrates that scaffolding provides a platform for building students’ knowledge and skills (34). Based on the information we have presented, we call for: 1) biology education researchers to create a program-level concept and skills inventory; and 2) departmental chairs to assist in developing a plan for faculty to promote and assess conceptual understanding and HOCS within their departments. In terms of the former, we call for biology education researchers and content experts to collaborate to develop programmatic-level concepts and skills assessments. We believe that this type of assessment would provide a standardized method for evaluating biology programs nationwide, much like the American Chemical Society exams do for chemistry. If developed, this assessment would also provide an avenue for conducting large-scale studies examining outcomes across multiple institutions.

For chairs and faculty members, in order for such change to occur, we recommend that departments undertake the following:

  1. Develop program objectives that include conceptual understanding and HOCS. Once program-level objectives have been developed, a curriculum map, outlining when objectives will be taught, assessed, and reinforced throughout the program, is essential. Vision and Change (35) and Scientific Teaching (10) provide insight into possible program-level objectives and objective development, respectively.

  2. Determine a method for evaluating students’ progression in their conceptual knowledge and domain-specific HOCS. We encourage departments to measure students’ knowledge prior to beginning their program and then at yearly intervals or after the completion of each core course. Here, we also appeal to biology education researchers to initiate development of an assessment tool that departments can use to measure students’ conceptual understanding and HOCS; we believe this assessment could be built based upon the key concepts presented in Vision and Change (35).

  3. Evaluate their current course assessment mechanisms (see Appendix 1 for a guide). The goal of this evaluation is to determine whether core courses continually and progressively promote students’ HOCS development. Recognizing the difficulty of creating HOCS assessments, we recommend that cohorts of faculty who teach similar courses work together to create a test bank based on programmatic objectives. By working in teams, faculty may also create a departmental test bank of HOCS questions to improve the continuity of program reform and provide future faculty members with tools to align assessments with program objectives. When writing HOCS test questions, we recommend first creating a scenario with real or hypothetical data. Using the hypothetical data set, questions can then be written to ask students whether the experimental design was appropriate (evaluation); what conclusions can be drawn from the data (application and synthesis); what hypotheses were actually being tested based on the experimental design (evaluation); and what future questions emerge from the data (creation). This model promotes faculty conversation and allows for collaborative efforts to reduce the time required by individual faculty to write higher-order questions.

In summary, undergraduate biology faculty across the nation are spending a great deal of effort and time to reform their classrooms and use evidence-based practices, and we commend them. Unfortunately, the data indicate that dismantling misconceptions and developing HOCS deserve more attention and inclusion in these efforts. We believe it is now time for departmental reform and ask that biology education researchers and content specialists come together to develop program-level assessments aimed at measuring students’ conceptual understanding and HOCS to aid in this endeavor. Individual departments may already be engaged in our second recommendation and using their own evaluation mechanisms. However, the creation of a standardized, valid, and reliable programmatic concept inventory would provide the opportunity for 1) large-scale multi-institutional studies and 2) evaluation of biology programs nationwide.

SUPPLEMENTAL MATERIALS

Appendix 1: Model for evaluating the department’s promotion of higher-order cognitive skills

ACKNOWLEDGMENTS

All data collected for and presented in this article were approved by the University of Northern Colorado’s Institutional Review Board (UNC IRB 662380-2 Exempt; UNC IRB 643327-2 Exempt). We extend our gratitude to the multitude of instructors who allowed us to collect data in their classrooms and provided us with copies of their syllabi and tests, as well as to the students who participated in data collection. Furthermore, we would like to thank Dr. Susan Hutchinson, who provided feedback on the statistical analysis related to Table 2. The authors declare that there are no conflicts of interest.

Footnotes

Supplemental materials available at http://asmscience.org/jmbe

REFERENCES

  • 1.Cleveland LM, Olimpo JT, DeChenne-Peters SE. Investigating the relationship between instructors’ use of active-learning strategies and students’ conceptual understanding and affective changes in introductory biology: a comparison of two active-learning environments. CBE Life Sci Educ. 2017;16(2):ar19. doi: 10.1187/cbe.16-06-0181. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.D’Avanzo C. Biology concept inventories: overview, status, and next steps. BioScience. 2008;58(11):1079–1085. doi: 10.1641/B581111. [DOI] [Google Scholar]
  • 3.Schraw G, Dennison RS. Assessing metacognitive awareness. Contemp Educ Psychol. 1994;19(4):460–475. doi: 10.1006/ceps.1994.1033. [DOI] [Google Scholar]
  • 4.Bretz SL, Linenberger KJ. Development of the enzyme–substrate interactions concept inventory. Biochem Mol Biol Educ. 2012;40(4):229–233. doi: 10.1002/bmb.20622. [DOI] [PubMed] [Google Scholar]
  • 5.Smith MK, Wood WB, Knight JK. The genetics concept assessment: a new concept inventory for gauging student understanding of genetics. CBE Life Sci Educ. 2008;7(4):422–430. doi: 10.1187/cbe.08-08-0045. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Shi J, Wood WB, Martin JM, Guild NA, Vicens Q, Knight JK. A diagnostic assessment for introductory molecular and cell biology. CBE Life Sci Educ. 2010;9:453–461. doi: 10.1187/cbe.10-04-0055. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Giancarlo CA, Facione PA. A look across four years at the disposition toward critical thinking among undergraduate students. J Gen Educ. 2001;50:29–55. doi: 10.1353/jge.2001.0004. [DOI] [Google Scholar]
  • 8.Chi MT. Commonsense conceptions of emergent processes: why some misconceptions are robust. J Learn Sci. 2005;14(2):161–199. doi: 10.1207/s15327809jls1402_1. [DOI] [Google Scholar]
  • 9.Tanner K, Allen D. Approaches to biology teaching and learning: understanding the wrong answers—teaching toward conceptual change. Cell Biol Educ. 2005;4(2):112–117. doi: 10.1187/cbe.05-02-0068. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, Gentile J, Lauffer S, Stewart J, Tilghman SM, Wood WB. Scientific teaching. Science. 2004;304:521–522. doi: 10.1126/science.1096022. [DOI] [PubMed] [Google Scholar]
  • 11.Zoller U. Are lecture and learning compatible? Maybe for LOCS: unlikely for HOCS. J Chem Educ. 1993;70:195. doi: 10.1021/ed070p195. [DOI] [Google Scholar]
  • 12.Association of American Colleges and Universities. Liberal education outcomes: a preliminary report on student achievement in college. Association of American Colleges and Universities; Washington, DC: 2005. [Google Scholar]
  • 13.Leonard MJ, Kalinowski ST, Andrews TC. Misconceptions yesterday, today, and tomorrow. CBE Life Sci Educ. 2014;13(2):179–186. doi: 10.1187/cbe.13-12-0244. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Maskiewicz AC, Lineback JE. Misconceptions are “so yesterday!”. CBE Life Sci Educ. 2013;12(3):352–356. doi: 10.1187/cbe.13-01-0014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Coley JD, Tanner KD. Common origins of diverse misconceptions: cognitive principles and the development of biology thinking. CBE Life Sci Educ. 2012;11(3):209–215. doi: 10.1187/cbe.12-06-0074. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Coley JD, Tanner K. Relations between intuitive biological thinking and biological misconceptions in biology majors and nonmajors. CBE Life Sci Educ. 2015;14(1):1–9. doi: 10.1187/cbe.14-06-0094. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Tshuma T, Sanders M. Textbooks as a possible influence on unscientific ideas about evolution. J Bio Educ. 2015;49(4):354–369. doi: 10.1080/00219266.2014.967274. [DOI] [Google Scholar]
  • 18.Stanger-Hall KF. Multiple-choice exams: an obstacle for higher-level thinking in introductory science classes. CBE Life Sci Educ. 2012;11:294–306. doi: 10.1187/cbe.11-11-0100. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Crowe A, Dirks C, Wenderoth MP. Biology in bloom: implementing Bloom’s taxonomy to enhance student learning in biology. CBE Life Sci Educ. 2008;7:368–381. doi: 10.1187/cbe.08-05-0024. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Bailin S. Critical thinking in science education. Sci Educ. 2002;11:361–375. doi: 10.1023/A:1016042608621. [DOI] [Google Scholar]
  • 21.Arum R, Roksa J. Academically adrift: limited learning on college campuses. University of Chicago Press; Chicago, IL: 2011. [Google Scholar]
  • 22.Quitadamo IJ, Brahler CJ, Crouch GJ. Peer-led team learning: a prospective method for increasing critical thinking in undergraduate science courses. Sci Educ. 2009;18:29–39. [Google Scholar]
  • 23.Bissell A, Lemons P. A method for assessing critical thinking in the classroom. BioScience. 2006;56:66–72. doi: 10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2. [DOI] [Google Scholar]
  • 24.Ebert-May D, Batzli J, Lim H. Disciplinary research strategies for assessment of learning. BioScience. 2003;53:1221–1228. doi: 10.1641/0006-3568(2003)053[1221:DRSFAO]2.0.CO;2. [DOI] [Google Scholar]
  • 25.Fink LD. Creating significant learning experiences: an integrated approach to designing college courses. John Wiley & Sons; San Francisco, CA: 2013. [Google Scholar]
  • 26.Sundberg MD. Assessing student learning. Cell Biol Educ. 2002;1:11–15. doi: 10.1187/cbe.02-03-0007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Wiggins GP, McTighe J. Understanding by design. Association for Supervision and Curriculum Development; Alexandria, VA: 1998. [Google Scholar]
  • 28.Scouller K. The influence of assessment method on students’ learning approaches: multiple choice question examination versus assignment essay. Higher Educ. 1998;35:453–472. doi: 10.1023/A:1003196224280. [DOI] [Google Scholar]
  • 29.Allen D, Tanner K. Infusing active learning into the large-enrollment biology class: seven strategies, from the simple to complex. Cell Biol Educ. 2005;4(4):262–268. doi: 10.1187/cbe.05-08-0113. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Wallace JD, Mintzes JJ. The concept map as a research tool: exploring conceptual change in biology. J Res Sci Teach. 1990;27(10):1033–1052. doi: 10.1002/tea.3660271010. [DOI] [Google Scholar]
  • 31.Yip DY. Identification of misconceptions in novice biology teachers and remedial strategies for improving biology learning. Int J Sci Educ. 1998;20(4):461–477. doi: 10.1080/0950069980200406. [DOI] [Google Scholar]
  • 32.Dolan EL, Collins JP. We must teach more effectively: here are four ways to get started. Mol Biol Cell. 2015;26(12):2151–2155. doi: 10.1091/mbc.E13-11-0675. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Litzinger TA, Lattuca LR, Hadgraft RG, Newstetter WC. Engineering education and the development of expertise. J Eng Educ. 2011;100:123–150. doi: 10.1002/j.2168-9830.2011.tb00006.x. [DOI] [Google Scholar]
  • 34.Vygotsky LS. Mind in society: the development of higher mental processes. Harvard University Press; Cambridge, MA: 1978. [Google Scholar]
  • 35.American Association for the Advancement of Science. Vision and change in undergraduate biology education: a call to action: a summary of recommendations made at a national conference organized by the American Association for the Advancement of Science; July 15–17, 2009; Washington, DC. 2011. [Google Scholar]
