J Microbiol Biol Educ. 2021 Mar 15;22(1):22.1.14. doi: 10.1128/jmbe.v22i1.2205

Build-Your-Own Exam: Involving Undergraduate Students in Assessment Design and Evaluation to Enhance Self-Regulated Learning

Lisa M. D’Ambrosio
PMCID: PMC7976776  PMID: 33884045

INTRODUCTION

Self-regulation of learning refers to the metacognitive process of assessing one’s competencies relative to an academic goal and developing corresponding strategies to improve intellectual growth (1). An emerging method to promote self-regulation of learning in the undergraduate classroom is to directly involve learners in the design and evaluation of course assessments (2–4). When students serve as active decision-makers in the assessment process, they are better able to understand the criteria by which their work is evaluated, to enhance their cognitive reasoning, and, ultimately, to direct their own learning (5, 6).

Here, I describe an activity that engages undergraduate science students in the creation and evaluation of written exams. The aim of this activity is to progressively develop students’ cognitive thinking and ability to critically evaluate their own learning through the use of faculty instruction, peer feedback, and self-assessment. While this activity is designed for a 12-week, mid- to upper-level undergraduate course of 50 students or fewer, adaptable instructional resources, marking schemes, and timelines are provided to facilitate easy and effective implementation in a variety of classroom contexts.

PROCEDURE

General structure and timeline

This 12-week activity consists of three modules (training, design, and evaluation). The training module occurs once during the first 2 weeks of the activity, whereas the design and evaluation modules occur twice from weeks 3 to 12 (Fig. 1). The design and evaluation modules occur twice to provide students with multiple opportunities to apply the skills acquired during the training module to the specific content they are learning throughout the course. For alternative activity timelines to accommodate different course structures, see Appendix 1. Learning objectives and a suggested evaluation plan for this activity are outlined in Fig. 2 and 3, respectively.
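For instructors who want to lay out the timeline programmatically (for example, when adapting it to a different term structure as in Appendix 1), the minimal Python sketch below encodes one possible week-by-week breakdown. Only the overall structure (training in weeks 1 and 2, two design/evaluation cycles across weeks 3 to 12) comes from Fig. 1; the assignment of specific weeks within each cycle is an illustrative assumption, not part of the published schedule.

```python
# A minimal sketch of the 12-week timeline in Fig. 1. The split of weeks
# 3-12 between the two design/evaluation cycles is assumed for illustration.
SCHEDULE = {
    1: "Training (session 1: question design)",
    2: "Training (session 2: answer keys and feedback)",
    3: "Design, cycle 1 (initial draft)",
    4: "Design, cycle 1 (revision + new questions)",
    5: "Evaluation, cycle 1 (peer exam and feedback)",
    6: "Evaluation, cycle 1 (self-assessment)",
    7: "Design, cycle 2 (initial draft)",
    8: "Design, cycle 2 (revision + new questions)",
    9: "Evaluation, cycle 2 (peer exam and feedback)",
    10: "Evaluation, cycle 2 (self-assessment)",
    11: "Optional instructor-designed summative assessment",
    12: "Optional instructor-designed summative assessment",
}

def print_schedule(schedule):
    """Print a week-by-week view of the activity modules."""
    for week in sorted(schedule):
        print(f"Week {week:2d}: {schedule[week]}")

print_schedule(SCHEDULE)
```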

FIGURE 1. Activity modules and schedule. This activity may be administered throughout a 12-week undergraduate science course. Suggestions for alternative timelines and structures are provided in Appendix 1.

FIGURE 2. Activity learning objectives. The module that facilitates each learning objective is indicated. Gray circle, training module; blue circle, design module; yellow circle, evaluation module.

FIGURE 3. Suggested grading scheme for evaluating the activity. This evaluation plan is based on the 12-week activity timeline shown in Fig. 1.

Training module

The training module consists of two 90-minute sessions. The first session teaches students how to design constructed-response exam questions via a short (20-minute) instructor-led lecture (Appendix 2). Constructed-response questions are open-ended, noncued questions that require the test taker to formulate a written answer (7). The rationale for using this question format is that such questions give learners deeper insight into their cognitive thinking skills and learning progression (8). The lecture also introduces Bloom’s taxonomy as a pedagogical framework for exam design. Bloom’s taxonomy comprises six hierarchical levels, where levels one and six represent the lowest and highest levels of cognitive function, respectively (8–10). The lecture describes how to apply the higher cognitive levels of Bloom’s taxonomy when designing exams to ensure that student-developed questions assess critical reasoning and comprehension rather than basic recall of information.
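For instructors who want to tag or sort draft questions by cognitive level, the short Python sketch below encodes the six levels of the revised taxonomy (10) and flags whether a question targets higher-order cognition. The Question structure, the example questions, and the cutoff at level 4 are illustrative assumptions rather than part of the published activity.

```python
from dataclasses import dataclass

# Levels of the revised Bloom's taxonomy (Krathwohl, 2002).
BLOOM_LEVELS = {
    1: "Remember",
    2: "Understand",
    3: "Apply",
    4: "Analyze",
    5: "Evaluate",
    6: "Create",
}

@dataclass
class Question:
    """A constructed-response exam question tagged with a Bloom level."""
    text: str
    bloom_level: int  # 1 (lowest) to 6 (highest)

def targets_higher_order_thinking(q, threshold=4):
    """Return True if the question sits at or above an assumed higher-order cutoff."""
    if q.bloom_level not in BLOOM_LEVELS:
        raise ValueError(f"Unknown Bloom level: {q.bloom_level}")
    return q.bloom_level >= threshold

# Example: a recall question versus an analysis question.
recall = Question("Define horizontal gene transfer.", bloom_level=1)
analysis = Question(
    "Compare how conjugation and transduction spread antibiotic resistance.",
    bloom_level=4,
)
print(targets_higher_order_thinking(recall))    # False
print(targets_higher_order_thinking(analysis))  # True
```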

The lecture is then followed by a student-directed activity that gives learners an opportunity to practice designing and identifying effective constructed-response questions based on course content covered to date (Appendix 3). This student-led activity may be conducted in small groups of 5 to 10 students while the instructor circulates through the classroom to facilitate discussion. The activity concludes with a class discussion in which the instructor invites a representative from each group to present their group’s findings. Each student is required to submit a copy of the completed activity to the instructor for evaluation before the end of the session. The second session follows the same format as the first but instead provides students with guidelines on how to design answer keys for assessing responses to exam questions. It also introduces learners to strategies for providing effective feedback on exam performance. For adaptable outlines of the training module minilectures, see Appendix 2. For training module activities and corresponding marking schemes, see Appendix 3.

Design module

Each student is then asked to use the information learned in the training module to develop five constructed-response questions and a corresponding answer key for each question. The questions and answer keys should be based on the course content covered in class to date and should evaluate a variety of cognitive levels as per Bloom’s taxonomy. Students are given 1 week to complete this task, at which point they are required to submit their work to the instructor for evaluation and feedback. Students are then given 1 week to revise their work based on instructor feedback. This final draft should also include an additional five constructed-response exam questions, with a corresponding answer key, based on course content covered since the initial draft was submitted. This allows students to apply the feedback from their initial draft while developing new course-based questions. The final exam draft should therefore contain 10 questions that cover approximately half of the semester’s curriculum. The design module may be repeated later in the course to give students another opportunity to apply their exam-building skills and critical thinking to content covered in the latter portion of the course. For instructional guidelines and a suggested marking rubric for the design module, see Appendix 4.

Evaluation module

A 30-minute, in-class session is reserved for administration of the student-designed exams. During this session, students are randomly assigned a partner by the instructor. Each student constructs responses to five of their peer’s exam questions that have been preselected by the instructor. The session concludes with each student submitting their completed responses to the exam author. Outside of class time, the exam author is given 1 week to use their self-designed answer key to provide their peer with constructive feedback on the exam responses. Each student is also required to submit their final exam draft, answer key, and the feedback they provided their peer to the instructor for evaluation (Appendix 5). It is important to emphasize that students do not receive a grade from their peer on the accuracy of their exam responses. Rather, the instructor evaluates each student on the completion of their partner’s exam questions and on the quality of their peer feedback (Appendix 5). Alternatively, faculty may simply grant a completion mark to students who provide peer feedback. After reflecting on the feedback received from the instructor and peer, each student completes a self-assessment that requires them to critically evaluate their cognitive thinking and to suggest strategies for closing knowledge gaps (Appendix 6). This module may be repeated later in the course to give students an additional opportunity to improve their ability to accurately self-assess and direct their learning progress. While not directly part of this activity, faculty are welcome to follow up this exercise with a summative assessment consisting of instructor-designed constructed-response questions to provide a comprehensive summary of learning.
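The random partner assignment can be done by hand, but a small script is convenient for larger sections. The Python sketch below is a minimal, hypothetical implementation (the function name and odd-roster handling are assumptions, not part of the published activity); it shuffles the roster and returns directed (author, respondent) assignments so that each student answers exactly one peer’s exam.

```python
import random

def assign_exam_partners(roster, seed=None):
    """Return directed (author, respondent) assignments for the peer-exam session.

    Students are shuffled and paired so that partners answer each other's exam
    questions. If the roster has an odd number of students, the last three form
    a rotating triad so each student still answers exactly one peer's exam and
    has their own exam answered once (an assumed workaround; the activity does
    not specify how to handle an odd class size).
    """
    rng = random.Random(seed)
    shuffled = list(roster)
    rng.shuffle(shuffled)
    assignments = []
    if len(shuffled) % 2 == 1 and len(shuffled) >= 3:
        a, b, c = shuffled[-3:]
        assignments += [(a, b), (b, c), (c, a)]  # (author, respondent)
        shuffled = shuffled[:-3]
    for author, respondent in zip(shuffled[::2], shuffled[1::2]):
        assignments += [(author, respondent), (respondent, author)]
    return assignments

# Example with a hypothetical five-student roster.
for author, respondent in assign_exam_partners(
    ["Avery", "Blake", "Chen", "Dana", "Elif"], seed=1
):
    print(f"{respondent} answers the exam written by {author}")
```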

CONCLUSION

By providing learners with the opportunity to construct and evaluate exams, this activity aims to improve students’ cognitive thinking and ability to accurately assess their own competencies to ultimately direct future learning.

SUPPLEMENTAL MATERIALS

Appendix 1. Alternative activity timelines
Appendix 2. Training module minilecture outlines
Appendix 3. Training module activities and marking scheme
Appendix 4. Design module: instructional guidelines and marking scheme
Appendix 5. Evaluation module: instructional guidelines and marking scheme
Appendix 6. Self-assessment

ACKNOWLEDGMENTS

The author declares no conflicts of interest.

Footnotes

Supplemental materials available at http://asmscience.org/jmbe

REFERENCES

1. Zimmerman BJ. 2001. Theories of self-regulated learning and academic achievement: an overview and analysis, p 1–37. In Zimmerman BJ, Schunk DH (ed), Self-regulated learning and academic achievement: theoretical perspectives, 2nd ed. Taylor & Francis Group, New York, NY.
2. Rhind SM, Pettigrew GW. 2012. Peer generation of multiple-choice questions: student engagement and experiences. J Vet Med Educ 39(4):375–379. doi: 10.3138/jvme.0512-043R.
3. Hardy J, Bates SP, Casey MM, Galloway KW, Galloway RK, Kay AE, Kirsop P, McQueen HA. 2014. Student-generated content: enhancing learning through sharing multiple-choice questions. Int J Sci Educ 36(13):2180–2194. doi: 10.1080/09500693.2014.916831.
4. Teplitski M, Irani T, Krediet CJ, Di Cesare M, Marvasi M. 2018. Student-generated pre-exam questions is an effective tool for participatory learning: a case study from ecology of waterborne pathogens course. J Food Sci Educ 17(2):76–84. doi: 10.1111/1541-4329.12129.
5. Brown GTL. 2017. Involving students in assessment, p 57–72. In Brown GTL (ed), Assessment of student achievement. Routledge, Taylor & Francis Group, New York, NY.
6. Dinsmore DL, Wilson HE. 2016. Student participation in assessment: does it influence self-regulation?, p 145–168. In Brown GTL, Harris LRG (ed), Handbook of human and social factors in assessment. Routledge, Taylor & Francis Group, New York, NY.
7. Livingston S. 2009. Constructed-response test questions: why we use them; how we score them. Educational Testing Service R&D Connections 11:1–8.
8. Paniagua M, Swygert KA, Downing SM. 2020. Written tests: writing high-quality constructed-response and selected-response items, p 109–126. In Yudkowsky R, Park YS, Downing SM (ed), Assessment in health professions education, 2nd ed. Routledge, Taylor & Francis Group, New York, NY.
9. Bloom BS, Engelhart MD, Furst EJ, Hill WH, Krathwohl DR. 1956. Taxonomy of educational objectives: the classification of educational goals. Handbook I: Cognitive domain. Longmans, Green & Co, New York, NY.
10. Krathwohl DR. 2002. A revision of Bloom’s taxonomy: an overview. Theory Pract 41(4):212–218. doi: 10.1207/s15430421tip4104_2.
