Journal of Education and Health Promotion. 2020 Jun 30;9:136. doi: 10.4103/jehp.jehp_753_19

A new method of “student-centered formative assessment” and improving students' performance: An effort in community health promotion

Fateme Shahedi 1,2, Javad Ahmadi 2,3, Tahereh Sharifi 4,5, Seyedeh Nahid Seyedhasani 2,6, Mahbubeh Abdollahi 2,5, Negar Shaabani 7, Mohammad Sarmadi 2,8,
PMCID: PMC7377129  PMID: 32766321

Abstract

BACKGROUND:

Improving the learning process in education empowers medical students, and formative assessment supports the teaching–learning process by providing ongoing, reflective information about learning gaps.

OBJECTIVE:

The aim of this study was to explore the effect of student-centered formative assessment, delivered through weekly reflective self-correction quizzes, on medical laboratory students' performance in the final examination of a hematology course in 2018.

MATERIALS AND METHODS:

A semi-experimental study was conducted in 2018 on fifty students of Torbat Heydariyeh University of Medical Sciences, Iran, selected by convenience sampling and divided randomly into intervention (n = 25) and control (n = 25) groups. Data were analyzed in SPSS software version 16 using the two-sample t-test, the Chi-square test, and analysis of covariance.

RESULTS:

The intervention had a positive effect on students' mean test scores in Hematology II: the intervention and control groups obtained mean scores of 18.45 ± 1.46 and 14.57 ± 2.64, respectively (P < 0.01).

CONCLUSIONS:

The results suggest that weekly formative assessments, combined with reflective self-correction activity and the active participation of students in the learning process through question design, can improve student learning.

Keywords: Assessment, education, health promotion, student, test

Introduction

A high-quality educational system is a prerequisite for enhancing individual capabilities and ensuring sustainable societal development and, consequently, community health promotion.[1,2] Paying particular attention to the quality and quantity of medical education, as a part of the higher education system that deals directly with human life, leads to improvements in the quality of health-care services and in community health promotion.[3,4] Educational evaluation provides an apt opportunity to review and measure educational system performance, which can greatly affect teaching–learning activity.[5] One way to determine the extent to which learners have achieved educational objectives is formative assessment, conducted by teachers during the learning process. It provides ongoing feedback about learning gaps and identifies student strengths and weaknesses so that necessary reforms and adjustments can be made to teaching and learning strategies.[6,7]

Student question generation is an effective approach that deeply engages students in the learning process. It is defined as the process by which students design questions about important material learned in the course content.[8] The act of formulating questions has been linked to self-directed learning and enhanced conceptual understanding of important subjects.[4] To exploit all the potential benefits of this method, it has to be appropriately contextualized.[8] Bloom's taxonomy is the most widely recognized tool for designing examination questions at various cognitive levels,[9] and its hierarchical models of learning domains are broadly used to design questions and evaluate student comprehension.

Several empirical studies have found a positive association between formative assessment and student achievement in core academic subjects, but findings on the effects of student question generation on students' learning have been inconsistent.[6,10,11,12,13] Moreover, it has been shown that the effects of formative assessment must be evaluated within a specific educational curriculum, as the results depend on the subject matter.[14,15]

Hematology is an important and basic course for undergraduate students of medical laboratory science. It is offered in two consecutive semesters as Hematology I and Hematology II. Many students have trouble understanding and remembering the concepts and principles discussed in this course, and some fail to achieve its predefined objectives or do not study the subject material during the semester.[16,17] Therefore, it is of utmost importance to find practical ways to improve assessment strategies so that they properly meet student requirements and enhance lifelong learning. This study set out to explore the effect of weekly reflective quiz self-corrections on the performance of medical laboratory students in the final examination of the hematology course.

Materials and Methods

A semi-experimental study was conducted in 2018 among bachelor students of medical laboratory science, after approval by the Research Ethics Committee of Torbat Heydariyeh University of Medical Sciences (IR.MUMS.REC.1396.67). Of all students enrolled in the Hematology II course, fifty (control group = 25, intervention group = 25) were chosen using a random sampling method [Figure 1]. The sample size was based on the minimum recommended for experimental studies.[18] Inclusion criteria were having passed the Hematology I course and having the same teacher.

Figure 1. The flow diagram of the study

Students' demographic information (age, sex, mother's and father's occupations, income level, and native/nonnative status) was collected. The Hematology I score was used as the pretest.

Both groups received regular lectures covering the basic concepts; in addition, formative assessment through weekly quizzes was conducted in the intervention group. To apply a student-centered approach to the formative assessment and improve student engagement, students were asked to complete the following three steps for each class session: (a) a studying step, reading the textbook chapters corresponding to the content presented at the session; (b) a diagnosis step, identifying the important content and principles; and (c) a production step, generating ten multiple-choice items based on the concepts presented at the session. Bloom's taxonomy was introduced in class, and students received specific instructions on the types of items that would be credited. The study procedure across the 16 sessions is summarized in Figure 2.

Figure 2. The process of conducting formative assessment with student-generated questions in the intervention group
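As a purely illustrative sketch (not part of the study materials), the weekly production step described above could be organized as a small pool of Bloom-tagged multiple-choice items. Every class, field, and function name below is a hypothetical example introduced for this sketch, not something reported by the authors.

    # Illustrative sketch only: a hypothetical way to organize the weekly pool of
    # student-generated multiple-choice items; none of these names come from the study.
    from dataclasses import dataclass, field
    from typing import List

    BLOOM_LEVELS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

    @dataclass
    class MCQItem:
        stem: str            # the question text
        options: List[str]   # answer choices
        correct_index: int   # position of the correct choice in `options`
        bloom_level: str     # targeted cognitive level (one of BLOOM_LEVELS)
        author: str          # "student" or "instructor"

    @dataclass
    class SessionPool:
        session: int                                    # class session number (1-16)
        items: List[MCQItem] = field(default_factory=list)

        def add(self, item: MCQItem) -> None:
            # Reject items whose cognitive level is not a recognized Bloom category
            if item.bloom_level not in BLOOM_LEVELS:
                raise ValueError(f"Unknown Bloom level: {item.bloom_level}")
            self.items.append(item)

        def higher_order_share(self) -> float:
            """Fraction of items targeting 'apply' or above, a possible crediting criterion."""
            if not self.items:
                return 0.0
            higher = {lvl for lvl in BLOOM_LEVELS if BLOOM_LEVELS.index(lvl) >= 2}
            return sum(i.bloom_level in higher for i in self.items) / len(self.items)

Such a structure would, for example, allow an instructor to track how many of each week's ten student-generated items target higher cognitive levels when deciding which items to credit.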

Nearly 70% of the items in each quiz were generated by students, and the remaining 30% were formulated by the instructor. Finally, the mean final-exam scores of the intervention and control groups were compared and analyzed. Data analysis was performed using SPSS 16 software (IBM Company, Chicago, IL, USA). The two-sample t-test and the Chi-square test were used to compare demographic variables between the control and intervention groups, and analysis of covariance was used to investigate the effect of the educational intervention. P < 0.05 was considered statistically significant.
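To make the analysis concrete, the following is a minimal sketch of how the reported tests could be reproduced in Python rather than SPSS. The file name and column names (group, sex, hema1 for the Hematology I pretest, hema2 for the Hematology II final exam) are assumptions for illustration, not the authors' actual data or code.

    # Minimal sketch of the analysis pipeline, assuming a hypothetical CSV with
    # one row per student and columns: group, sex, hema1, hema2.
    import pandas as pd
    from scipy import stats
    import statsmodels.formula.api as smf

    df = pd.read_csv("hematology_scores.csv")  # hypothetical file name

    ctrl = df[df["group"] == "control"]
    interv = df[df["group"] == "intervention"]

    # Two-sample t-test on a continuous baseline variable (e.g., the Hematology I pretest)
    t_stat, t_p = stats.ttest_ind(interv["hema1"], ctrl["hema1"])
    print(f"t-test (Hematology I): t = {t_stat:.2f}, P = {t_p:.3f}")

    # Chi-square test on a categorical demographic variable (e.g., sex)
    contingency = pd.crosstab(df["group"], df["sex"])
    chi2, chi_p, dof, _ = stats.chi2_contingency(contingency)
    print(f"Chi-square (sex): chi2 = {chi2:.2f}, P = {chi_p:.3f}")

    # Analysis of covariance: Hematology II score by group, adjusting for the pretest
    ancova = smf.ols("hema2 ~ C(group) + hema1", data=df).fit()
    print(ancova.summary())

Modeling the final-exam score on group membership while including the pretest as a covariate mirrors the role of the analysis of covariance described above: it estimates the intervention effect after adjusting for baseline differences in Hematology I performance.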

Results

Participants (n = 50) were randomly assigned to the intervention group (n = 25, 50%) and the control group (n = 25, 50%). The mean age of the students was 22.30 ± 0.95 and 22 ± 0.94 years in the control and intervention groups, respectively. Participants' demographic information is listed in Table 1.

Table 1.

Demographic characteristics of participants in intervention and control groups

Variables Intervention group Control group P
Age (year) 22.30±0.95 22±0.94 0.48a
Number of children 3.44±1.33 3.44±1.67 0.82a
Sex (%)
 Male 17 (68) 14 (56) 0.65b
 Female 8 (32) 11 (44)
Father's occupation (%)
 Self-employed 13 (52) 8 (32) 0.07b
 Employed 12 (48) 17 (68)
Mother's occupation (%)
 Homemaker 19 (76) 14 (56) 0.31b
 Other jobs 6 (24) 11 (44)
Income level (%)
 Moderate 13 (52) 14 (56) 0.68b
 High 12 (48) 11 (44)
Native/nonnative (%)
 Native 5 (20) 4 (16) 1b
 Nonnative 20 (80) 21 (84)

a: t-test; b: Chi-square test

The results of the educational intervention are reported in Table 2. Covariance analysis suggested that the intervention had a positive effect on students' test scores in Hematology II: the mean Hematology II score in the intervention group was approximately 4 points higher than that of the control group, a statistically significant difference (P = 0.007). The Hematology I (pretest) scores did not differ significantly between the two groups (P = 0.09).

Table 2.

Comparison of means and standard deviations of scores in the intervention and control groups

Variable Sample size (per group) Control group (mean±SD) Intervention group (mean±SD) P
Mean score of Hematology II 25 14.57±2.64 18.45±1.46 <0.01
Mean score of Hematology I 25 15.74±3.06 17.66±1.22 0.09

SD=Standard deviation

Discussion

Assessment is one of the principles of teaching in the educational process. In this context, ongoing evaluation of student learning through formative assessment constitutes an integral part of effective teaching. This study investigated the impact of student-centered formative assessment (SCFA) using weekly reflective quizzes on the performance of medical laboratory sciences students on the final exam of the hematology course. The SCFA not only contributed to learning success, but also significantly increased students' average test scores. Our findings are in good agreement with those of studies in which the formative assessment method was adopted to improve the teaching–learning process.[6,19] The main reason behind this improvement is that formative assessment helps students become acquainted with the required levels of learning, raises their awareness of learning gaps, and provides effective feedback to guide teachers and students in the appropriate direction of learning. Moreover, given that examinations and quizzes are stress-inducing activities, taking frequent quizzes serves as an effective way to reduce exam-taking anxiety.[20]

According to previous studies, repeated assessment and review of the educational content across different sessions improves long-term retention and enhances student performance on the final examination.[21] This study also showed that using student-generated questions for formative assessment was associated with a noticeable improvement in students' learning gains. The participation of students in designing questions engages them more deeply in the learning process and enhances their mastery of the course materials.[8,22] However, a number of studies have reported that the poor quality of student-generated questions lowers the level of learning.[8,22] Therefore, we provided specific instructions on how to design questions targeting higher levels of critical thinking. Because generating complex multiple-choice items represents a higher-order learning activity and requires significant mental effort, students were asked to formulate multiple-choice items in this study.[23] Requiring students to formulate questions covering all topics presented in each class session helps ensure that noticeable mental effort is invested in exam preparation. In addition, the reflective self-correction tasks improved students' conceptual understanding of the hematology course and engaged them in a self-assessment process, which, in turn, reinforced self-regulated learning. This process enhances students' problem-solving skills and turns them into active learners who assume responsibility for improving their own performance.

The limitations of this study were its relatively small sample size and its restriction to a single university and a single course; therefore, its findings might not be generalizable to students at other universities. On the other hand, using the same teacher, investigating a basic course such as hematology, and applying a new method of formative assessment (SCFA) were among the strengths of this study. It is suggested that future studies investigate this approach in other courses and at different universities and examine its association with academic achievement.

Conclusions

The use of formative assessment through weekly reflective self-correction quizzes appears to offer a valuable learning tool that helps instructors identify learning gaps, encourages greater student engagement in the classroom, and improves learning. Furthermore, the formulation of multiple-choice items by students requires not only recall of prior knowledge, but also a comprehensive understanding of the course materials and the application of critical thinking skills. Finally, allowing medical sciences students to be involved in and take control of their learning process helps them become transformative thinkers and productive citizens, which, in turn, plays a vital role in promoting the health of society.

Financial support and sponsorship

This study was financially supported by Student Research Committee, Torbat Heydariyeh University of Medical Sciences, Torbat Heydariyeh, Iran.

Conflicts of interest

There are no conflicts of interest.

Acknowledgment

The authors would like to thank the Student Research Committee of Torbat Heydariyeh University of Medical Sciences for their financial support for performing this research. Ethics code (IR.THUMS.REC.1396.67) was also obtained from the Ethics Committee of Torbat Heydariyeh University of Medical Sciences.

References

1. Laurie R, Nonoyama-Tarumi Y, Mckeown R, Hopkins C. Contributions of education for sustainable development (ESD) to quality education: A synthesis of research. J Educ Sustain Dev. 2016;10:226–42.
2. Hahn RA, Truman BI. Education improves public health and promotes health equity. Int J Health Serv. 2015;45:657–78. doi: 10.1177/0020731415585986.
3. Angadi NB, Kavi A, Shetty K, Hashilkar NK. Effectiveness of flipped classroom as a teaching-learning method among undergraduate medical students – An interventional study. J Educ Health Promot. 2019;8:211. doi: 10.4103/jehp.jehp_163_19.
4. Song D. Student-generated questioning and quality questions: A literature review. Res J Educ Stud Rev. 2016;2:58–70.
5. Scheerens J, Ehren M, Sleegers P, de Leeuw R. OECD Review on Evaluation and Assessment Frameworks for Improving School Outcomes: Country Background Report for the Netherlands. 2012.
6. Evans DJ, Zeun P, Stanier RA. Motivating student learning using a formative assessment journey. J Anat. 2014;224:296–303. doi: 10.1111/joa.12117.
7. Connell GL, Donovan DA, Chambers TG. Increasing the use of student-centered pedagogies from moderate to high improves student learning and attitudes about biology. CBE Life Sci Educ. 2016;15:ar3. doi: 10.1187/cbe.15-03-0062.
8. Rhind SM, Pettigrew GW. Peer generation of multiple-choice questions: Student engagement and experiences. J Vet Med Educ. 2012;39:375–9. doi: 10.3138/jvme.0512-043R.
9. Kim MK, Patel RA, Uchizono JA, Beck L. Incorporation of Bloom's taxonomy into multiple-choice examination questions for a pharmacotherapeutics course. Am J Pharm Educ. 2012;76:114. doi: 10.5688/ajpe766114.
10. Zimmerman BJ, Kitsantas A. The hidden dimension of personal competence: Self-regulated learning and practice. In: Elliot AJ, Dweck CS, editors. Handbook of Competence and Motivation. Guilford Publications; pp. 509–26.
11. Cavilla D. The effects of student reflection on academic performance and motivation. SAGE Open. 2017;7:1–3.
12. Larsen DP, Butler AC, Roediger HL 3rd. Comparative effects of test-enhanced learning and self-explanation on long-term retention. Med Educ. 2013;47:674–82. doi: 10.1111/medu.12141.
13. Olde Bekkink M, Donders AR, Kooloos JG, de Waal RM, Ruiter DJ. Challenging students to formulate written questions: A randomized controlled trial to assess learning effects. BMC Med Educ. 2015;15:56. doi: 10.1186/s12909-015-0336-z.
14. Neal K, Nash B. Formative assessment: A meta-analysis and a call for research. Educ Meas Issues Pract. 2011;30:28–37.
15. Bennett RE. Formative assessment: A critical review. Assess Educ Princ Policy Pract. 2011;18:5–25.
16. Nusche R. Student assessment: Putting the learner at the centre. In: Synergies for Better Learning: An International Perspective on Evaluation and Assessment. Reviews of Evaluation and Assessment in Education. Paris: OECD Publishing; 2013. Ch. 4, p. 133.
17. Fry H, Ketteridge S, Marshall S. A Handbook for Teaching and Learning in Higher Education: Enhancing Academic Practice. London, United Kingdom: Routledge; 2008. p. 452.
18. Quinn GP, Keough MJ. Experimental Design and Data Analysis for Biologists. New York: Cambridge University Press; 2002. p. 557.
19. Dudek CM, Reddy LA, Lekwa A, Hua AN, Fabiano GA. Improving universal classroom practices through teacher formative assessment and coaching. Assess Eff Interv. 2019;44:81–94.
20. Khanna MM. Ungraded pop quizzes: Test-enhanced learning without all the anxiety. Teach Psychol. 2015;42:174–8.
21. Chang EK, Wimmers PF. Effect of repeated/spaced formative assessments on medical school final exam performance. Health Prof Educ. 2017;3:32–7.
22. Hardy J, Bates SP, Casey M, Galloway KW, Galloway R, Kay AE, et al. Student-generated content: Enhancing learning through sharing multiple-choice questions. Int J Sci Educ. 2014;36:2180–94.
23. Bjork EL, Soderstrom NC, Little JL. Can multiple-choice testing induce desirable difficulties? Evidence from the laboratory and the classroom. Am J Psychol. 2015;128:229–39. doi: 10.5406/amerjpsyc.128.2.0229.
