Heliyon. 2024 Jun 6;10(12):e32629. doi: 10.1016/j.heliyon.2024.e32629

The effect of classroom-based assessment (CBA) on Chinese EFL learners’ speaking performance, engagement, and willingness to communicate: Undergraduate students' perceptions

Yawen Liu 1
PMCID: PMC11226770  PMID: 38975164

Abstract

Classroom-based assessment (CBA) has emerged as a prominent pedagogical approach that embeds assessment in the instructional process to foster student learning. In English as a Foreign Language (EFL) education, CBA has garnered considerable attention for its potential to enhance language learners' engagement, speaking proficiency, and willingness to communicate. This mixed-methods study examines the impact of CBA on the speaking performance, engagement, and willingness to communicate of undergraduate Chinese EFL learners. The study involved 90 Chinese EFL learners in the quantitative phase and 25 in the qualitative phase. Data were collected with three questionnaires and an interview checklist, then analyzed with inferential statistics (analyses of variance) for the quantitative data and thematic analysis for the qualitative insights. The findings reveal that CBA exerts a significant influence on language learners' engagement, speaking abilities, and willingness to communicate: the quantitative results indicate noteworthy improvements in these key aspects when CBA is integrated into the classroom environment. Moreover, the thematic analysis in the qualitative phase uncovered five distinct ways through which CBA affects the main variables under investigation. This research thus elucidates the multifaceted influence of CBA on undergraduate Chinese EFL learners: it demonstrates the quantitative impact on engagement, speaking skills, and willingness to communicate, and offers qualitative insights into the diverse ways through which CBA enhances the language learning experience. These findings underscore the value of integrating CBA into EFL classrooms as a potent tool for optimizing learner outcomes.

Keywords: Classroom-based assessment, Speaking skill, Learners' engagement, Willingness to communicate, Learners' perceptions

1. Introduction

With the ultimate goal of enhancing student learning, classroom-based assessment (CBA) is a pedagogical approach that emphasizes the smooth integration of assessment into the learning process [[1], [2], [3], [4]]. Scholars have paid increasing attention to CBA since the seminal study by Black and Wiliam [2], which highlighted the critical role that teachers' classroom assessments play in maximizing students' learning outcomes. For language learners in particular, such assessment is essential [5]. Despite the growing interest in evaluating language learners, there is little empirical evidence on how language educators apply CBA in undergraduate settings. According to a thorough review by Nikolov and Timpe-Laughlin [6], CBA in the context of language learners has not been thoroughly investigated and needs more research. Past studies in this area have primarily examined the efficacy of CBA in enhancing language acquisition, the accuracy of CBA as a tool for evaluating language learners' skills, and the methods educators employ to put CBA into practice [[7], [8], [9], [10], [11]]. In examining the accuracy of CBA and its effects on language acquisition performance and attitudes, these studies have yielded varied results and provided insightful information about the current state of CBA implementation [[12], [13], [14]]. Teachers' attitudes toward CBA largely determine how well it is integrated into language classrooms.

The ideas of CBA and assessment for learning have become widely accepted within education systems around the world. This trend is especially noticeable in the Asia-Pacific region, where interest in the positive effects of these assessments on students is growing [15]. New Zealand, for example, places a strong emphasis on developing students' assessment skills and highlights the value of assessment for learning in its education system. According to the "Directions for Assessment in New Zealand" document, all young people should receive an education that fosters their ability to evaluate their own learning. These competencies enable learners to access, interpret, and use information from high-quality assessments to advance their own learning [16]. A similar shift is under way in China, where the government has called for significant reforms to assessment methods in order to move from rote memorization-based traditional evaluation systems to more formative, authentic, and humanistic approaches [[15], [16], [17]]. While China continues to use high-stakes public exams for student selection at various educational levels, it is also actively pursuing a school-based assessment system that integrates teachers' judgments into the public examination process. By promoting the use of assessment as a tool for learning, these curriculum policy changes aim to lessen the dominance of exams [[18], [19], [20]].

2. Review of literature

2.1. Studies on CBA

CBA encompasses any assessment activities conducted within the classroom setting, whether they are explicitly designed assessments or more implicit evaluative processes, as opposed to traditional large-scale external tests administered outside of the classroom [[21], [22], [23]]. Technically, it signifies a complex process involving both educators and learners in the gathering, assessment, and application of evidence related to student learning [21]. Within this framework, CBA is more accurately described as a process-oriented method rather than a mere evaluative instrument. CBA fulfills two primary functions: summative assessment (SA) and formative assessment (FA), commonly known as assessment of learning and assessment for learning, respectively [[24], [25], [26], [27]].

Summative assessment (SA) is primarily structured to accumulate evidence regarding the extent of a student's knowledge, proficiency, or skill level for administrative or reporting objectives, with a central focus on generating numerical scores or grades [28]. In contrast, formative assessment (FA) centers on leveraging assessment as a means to enhance student learning [29,30]. It places significant emphasis on providing descriptive feedback rather than merely assigning scores, aiming to offer tangible evidence of student learning. Through this constructive feedback, students gain valuable insights into their areas of strength and areas needing improvement, while also receiving guidance on bridging the gap between their current performance and their desired outcomes [31]. Importantly, students play an active role in the assessment process by being cognizant of learning objectives and assessment criteria, as well as engaging in self-assessment and peer assessment practices [32]. Such dynamic assessment approaches hold substantial promise for nurturing students' capacity for self-regulated learning [33].

Although summative assessment (SA) and formative assessment (FA) serve distinct purposes, the demarcation between them is not always well-defined, as the same assessment data can be utilized for different objectives at various times [34]. The primary difference lies in the application of the information. In some instances, SA can be used in a formative manner, and FA can fulfill summative functions [35,36]. Summative tests administered at the end of a unit or term to document learning outcomes can also be used to adjust teaching strategies and improve future student learning. Formative assessment (FA) practices, such as self- and peer assessments, generate extensive data on students' progress, which can be recorded and used for summative reporting. Recent scholarship emphasizes that in an assessment culture focused on learning, all assessments, including those for administrative and reporting purposes, should primarily aim to promote learning [4,36,37].

Researchers have proposed multiple frameworks to guide the practical implementation of Classroom-Based Assessment (CBA) principles. These frameworks include Black and Wiliam's five strategies of formative assessment [5], Hill and McNamara's framework on CBA processes [38], Davison and Leung's teacher-based assessment framework [39], and various contemporary frameworks related to learning-oriented assessment [19,[25], [26], [27], [28], [29], [30]]. Despite their differences, these frameworks emphasize three crucial instructional processes for enhancing student learning: understanding learning objectives, assessing current student progress, and identifying strategies to help students achieve their learning goals [40].

Davison and Leung's framework [41] operationalizes CBA into a cycle of four distinct steps, underscoring its process-oriented nature. This framework offers a practical approach for examining teachers' assessment practices within the classroom, making it the chosen analytical framework for this study. The first step in Davison and Leung's framework, termed Planning Assessment (PA), focuses on clarifying learning objectives and assessment criteria, ensuring that students clearly understand their educational targets.

Extensive research has identified gaps between teachers' beliefs about CBA and their actual practices [[42], [43], [44]]. For instance, Chen et al. [43] studied CBA implementation among two university English as a Foreign Language (EFL) instructors and found that although these educators were positively inclined toward involving students in the assessment process, they rarely engaged students in self- and peer-assessment. More recently, Vattøy [45] conducted interviews with ten secondary EFL teachers in Norway, revealing a discrepancy between teachers' beliefs and practices concerning formative teacher feedback.

Moreover, these studies have shown that various factors influence the CBA beliefs and practices of second language (L2) teachers. These factors include individual elements such as students' needs, teachers' core beliefs, and their teaching experience, as well as sociocultural factors like policy support, class size, time constraints, prescribed curriculum, and the prevailing assessment culture in their educational environment. In summary, research in L2 teaching and assessment highlights the complex interplay between teachers' beliefs and their practical actions related to CBA. While some educators align their beliefs with their practices, others exhibit discrepancies influenced by a range of individual and sociocultural factors. Understanding these dynamics is crucial for improving the effective implementation of CBA in language education [[40], [41], [42], [43], [44], [45]].

2.2. Studies on learners’ engagement in classroom

Emotional engagement encompasses a wide range of favorable and unfavorable student reactions toward peers, teachers, educational institutions, and learning objectives. It reflects the emotional responses and attachments students form in the classroom and strongly shapes how they learn. Cognitive engagement, by contrast, refers to students' intellectual involvement with and comprehension of the subject matter. It entails thinking carefully about difficult ideas, reflecting deeply, and being prepared to invest significant effort in understanding challenging concepts and acquiring complex skills [46]. Cognitive engagement is typified by mental operations that support learning, such as applying knowledge in different contexts, solving problems, and critically analyzing information. Academic engagement has far-reaching and long-lasting effects on many aspects of a student's life. Engaged learners are more likely to pursue postsecondary education, maintain regular study habits, improve their professional opportunities, develop positive self-perceptions and overall well-being, and experience fewer depressive symptoms [47]. Involvement in academic activities beyond the classroom therefore benefits personal and professional development. Moreover, academic engagement is closely connected to academic motivation and performance, and pupils who participate in extracurricular activities tend to value their studies [[48], [49]].

Engaged students are more likely to invest effort in their academic work, which helps them complete assignments successfully and perform better in class [55]. In professional settings, engagement refers to a mental state marked by heightened energy, unwavering dedication, and total absorption [[50], [51], [52], [53], [54], [55]]. Absorption is the state of total immersion and satisfaction in one's activities, which makes time seem to pass quickly; vigor is a state of heightened cognitive energy during work; and dedication is associated with a sense of self-worth, enthusiasm, inspiration, pride, and challenge. This conceptual framework has been adapted to the academic setting by focusing on students' academic assignments and activities [56]. Engaged students experience increased vitality, a strong commitment to their academic pursuits, and active integration into their scholarly journey [57]. Empirical evidence shows that engaged university students perform better academically [55], and positive relationships between engagement and academic success are consistently found across research designs [55]. Engagement is linked to academic achievement, self-reported learning gains, and improved grades [[55], [56], [57]]. Based on these findings, promoting engagement in educational settings can substantially enhance student outcomes.

2.3. Studies on willingness to communicate

Willingness to communicate (WTC) describes a person's readiness to converse verbally in a second language (L2) [41]. It is also seen as a consistent inclination to speak when offered the option [60]. According to Kurk [60], WTC reflects a learner's cognitive decision to use the target language for communication. In this sense, WTC, according to MacIntyre and Vincze [61], is the ultimate aim of language learning, since it fosters authentic communicative behavior and improves L2 proficiency. Öz and colleagues define WTC as a comprehensive construct consisting of communicative, linguistic, affective, and socio-psychological components [62]; this construct can be used to explain and predict language learners' propensity to communicate in the L2. MacIntosh et al. distinguish three perspectives that make up the theoretical framework: trait-oriented, dynamic, and contextual [59,63]. The psychological component of WTC is strongly correlated with foreign language learning anxiety, motivation, and self-assurance [64]. The dynamic and contextual aspects, on the other hand, comprise socio-environmental and situational elements, including teachers [67], peers [68], conversational partners [65], and discourse topics [66]. Recent research highlights WTC as a dual-faceted construct that combines stable learner characteristics with situational dispositions [69]. This perspective emphasizes that, in addition to being situationally sensitive, WTC is shaped by stable characteristics such as age, gender, and personality [[70], [71], [72]].

Within the L2 domain, WTC is regarded as a crucial factor that influences communicative behavior and enhances L2 proficiency [73]. Research has indicated that there is a positive correlation between elevated WTC and elevated L2 engagement [[68], [69], [70]]. Additionally, studies show a favorable correlation between WTC and L2 proficiency [74]. According to recent research, learners' WTC influences their L2 performance, underscoring its importance beyond simple communicative actions. In summary, the complex construct of Willingness to Communicate (WTC) has important implications for language acquisition because it encompasses both situational dispositions and enduring traits. The significance of WTC in influencing learners' communicative behaviors, L2 engagement, and, eventually, their proficiency in the target language is highlighted by this multifaceted approach that incorporates psychological, dynamic, and contextual dimensions. Gaining an understanding of and encouraging WTC can greatly improve language learning outcomes because learners who exhibit higher WTC are more likely to participate in meaningful interactions that promote language practice and development.

2.4. Rationale of the study

Despite the robust theoretical foundation of CBA, its successful application within the Chinese EFL classroom can be influenced by various contextual factors. Context-related variables, such as institutional demands, classroom dynamics, cultural factors, and teacher training, play pivotal roles in shaping teachers' assessment literacy and their actual assessment practices. The impetus for this research comes from the realization that good language evaluation and instructional techniques are vital to improving undergraduate Chinese students' experiences learning English as a foreign language (EFL). Understanding that language acquisition is a complex process with many facets, this study aims to explore the possible impacts of classroom-based assessment (CBA) on speaking performance, engagement, and willingness to communicate. The main aim of this study is to ascertain whether the application of CBA enhances EFL learners' speaking abilities. The study also intends to investigate how CBA affects students' participation in EFL classes. The study also looks into the relationship between CBA and EFL learners' propensity to converse in the target language. Finally, in order to gain important insights into the individualized experiences and viewpoints of the learners themselves, the study aims to explore how intermediate EFL learners perceive the role of CBA in enhancing their willingness to communicate and engagement in the EFL classroom. By achieving these goals, the study hopes to further the pedagogical conversation about language assessment and instruction while providing useful insights for language teachers dealing with Chinese EFL undergraduate students.

2.5. Research questions

In line with the above-mentioned gap, this study aims to answer the following research questions.

1. Does classroom-based assessment improve EFL learners' speaking performance?

2. Does classroom-based assessment improve EFL learners' engagement in EFL classrooms?

3. Does classroom-based assessment significantly improve EFL learners' willingness to speak in EFL classrooms?

4. How do intermediate EFL learners perceive the role of classroom-based assessment in improving their willingness to communicate and engagement in the EFL classroom?

2.6. Research method

This study used a mixed-methods approach combining quantitative and qualitative research designs. To evaluate the effects of classroom-based assessment (CBA) on Chinese learners' speaking performance, engagement, and willingness to communicate, the quantitative component used a pretest/posttest control/experimental group design. Pretests and posttests were administered to both the experimental and control groups to allow comparison of results, and statistical analyses were carried out to interpret the quantitative data and ascertain the intervention's effects. In addition to the quantitative strand, the study employed a qualitative case study design to investigate the individual experiences and perspectives of the experimental group members concerning the intervention. This qualitative case study aimed to understand the significance and core of the lived experiences connected to the intervention. Thematic analysis of the qualitative data allowed recurring themes and trends in participants' responses to be identified. Qualitative methodologies such as the case study approach are well-suited to examining subjective experiences and perceptions because they allow flexible, open-ended data collection and analysis, and they can clarify the phenomenon under study by letting participants express their experiences in their own words.

2.7. Participants

The participants were 90 first-year language learners (four intact classes) from Quzhou University, China, where the author played a dual role as teacher and researcher. Twenty-five of these learners, drawn from the experimental group, also took part in the qualitative phase of the study. All participants were native Chinese speakers who were learning English as a foreign language. The selection criteria were enrollment as a first-year student at Quzhou University and Chinese as a native language. All intact classes were taught by the author. Two intact classes (45 language learners) were assigned to the control group, and the other two intact classes (45 language learners) were assigned to the experimental group.

2.8. Instruments

Four data collection tools were used in this study: the Willingness to Communicate (WTC) scale, the Learner Engagement Scale, a speaking test, and an interview checklist. The speaking test was the first tool used. Its purpose was to evaluate participants' spoken English by simulating conversations from everyday life. It comprised four tasks, each presenting the students with a different challenge. Students' performance on the speaking test was graded against several criteria. For pronunciation, assessors evaluated the accent, clarity, and accuracy of the English sounds produced. Grammar and vocabulary were assessed to determine the learner's level of language competence. Coherence and fluency were also significant factors, since they indicated how logically students could organize their ideas and how smoothly they could speak. In addition, raters evaluated the content of the responses and their relevance to the given prompts, ensuring that students could communicate meaningful information as well as speak clearly. Finally, the way students interacted with one another during role-plays or group discussions was assessed, with a focus on how well they could listen and respond to others.

The speaking test was scored by trained raters using holistic assessment, which awarded an overall score for the learner's performance ranging from 1 to 30. Several measures were taken to ensure the reliability of the speaking test. The evaluators were trained to understand the evaluation criteria thoroughly so that their judgments would be consistent, and clear rubrics and scoring guidelines were provided to assist them. The order of prompts or sections was randomized to minimize bias, and multiple raters scored each test independently so that inter-rater reliability could be assessed. Regular reviews and revisions of test content and evaluation criteria ensured that test quality was maintained over time.
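To illustrate the kind of inter-rater check described above, the sketch below computes a simple agreement statistic between two raters' holistic scores. It is a minimal, hypothetical example: the rater scores, variable names, and the choice of a Pearson correlation are assumptions for illustration, since the study does not specify which agreement statistic underlies the reported coefficient.

```python
import numpy as np

# Hypothetical holistic scores (1-30) awarded by two independent raters
# to the same ten learners; the values are invented for illustration only.
rater_a = np.array([22, 18, 25, 15, 20, 27, 16, 23, 19, 24])
rater_b = np.array([21, 17, 26, 16, 19, 28, 15, 22, 20, 25])

# Pearson correlation between the two sets of scores, one common way
# to summarize inter-rater consistency for continuous ratings.
r = np.corrcoef(rater_a, rater_b)[0, 1]
print(f"inter-rater correlation: {r:.2f}")
```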

Before the test was administered, pilot testing was carried out to identify and correct any problems. The inter-rater reliability of the test was 0.86. The second instrument was the Student Engagement Scale, a self-report measure designed to evaluate the degree of student involvement in classroom activities. This scale covered three key dimensions of engagement: affective enjoyment, cognitive engagement, and behavioral engagement. Scores ranged from 12 to 60, with higher scores indicating higher levels of engagement [58]. To assess willingness to communicate (WTC) among English language learners, an instrument based on the framework established by MacIntyre et al. (2001) was employed. This tool consisted of 27 items, each rated on a five-point scale ranging from "Almost never willing" to "Almost always willing." To check the reliability of the assessment items, Cronbach's alpha was calculated; the results indicated a high level of internal consistency for both the overall scale and its individual dimensions, with coefficients exceeding 0.81. Additionally, an interview checklist served as a guide for the semi-structured interviews conducted with participants from the experimental group. It comprised open-ended questions and prompts tailored by the researcher to the study's objectives and validated by experts in the field. Prior to implementation, three colleagues with expertise in qualitative research methods confirmed the checklist's relevance to the qualitative phase of the study, ensuring its alignment with the study's goals and objectives.
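As an illustration of the internal-consistency check reported above, the sketch below shows one way Cronbach's alpha could be computed for a Likert-type scale such as the WTC or engagement instrument. It is a minimal, hypothetical example: the item responses, array layout, and function name are assumptions for illustration, not the study's actual data or analysis script.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical five-point Likert responses: four learners x four items.
responses = np.array([
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 3],
])
print(round(cronbach_alpha(responses), 2))  # internal consistency of the illustrative data
```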

2.9. Procedure

After careful consideration, four speaking classes at Quzhou University were chosen, and each class was then placed in either the control or the experimental group. Before any treatment began, a thorough evaluation of both groups' speaking abilities, WTC, and levels of engagement was conducted. The researcher-constructed speaking test and the adapted scales described above were used in this evaluation. The experimental group then received an innovative classroom-based assessment (CBA) intervention that lasted for a total of 14 sessions and consisted of the teaching and learning activities presented in Table 1.

Table 1.

CBA instructional content and activities.

Session | Objectives | Activities
1 | Introduction to CBA and its importance | Overview of CBA and its importance; Discussion on intervention objectives; Explanation of assessment criteria
2 | Understanding Speaking Performance | Define speaking performance; Analyze speaking tasks; Practice activities; Peer feedback and reflection
3 | Enhancing Speaking Skills | Improve fluency, accuracy, coherence; Role-play activities; Pronunciation practice; Group discussions
4 | Increasing Engagement in Speaking Activities | Importance of engagement; Interactive tasks and games; Technology integration; Group discussions
5 | Building Confidence in Speaking | Overcoming anxiety; Guided practice; Positive reinforcement; Individual goal-setting
6 | Assessing Speaking Performance | Introduction to assessment methods; Discussion on evaluation criteria; Practice in assessment
7 | Implementing CBA in Speaking Assessment | Integration of CBA into assessment; Designing tasks; Administering assessments; Benefits discussion
8 | Analyzing Speaking Performance Data | Data collection and analysis; Identifying patterns; Instructional decision-making; Group interpretation
9 | Providing Effective Feedback | Delivering constructive feedback; Practice sessions; Peer feedback; Role-playing scenarios
10 | Reflecting on Speaking Progress | Individual reflection; Goal review; Discussion on reflection; Peer sharing
11 | Applying Speaking Skills in Real-Life Contexts | Role-play simulations; Contextual adaptation; Reflection on transferability; Brainstorming activities
12 | Exploring Cultural Aspects of Communication | Introduction to cultural factors; Discussion on differences; Role-play activities; Reflection
13 | Reviewing and Consolidating Speaking Skills | Review of key strategies; Guided practice; Peer collaboration; Individual reflection
14 | Final Assessment and Reflection | Completion of final assessment; Individual reflection; Group discussion; Goal-setting

The experimental group participated in a range of CBA activities intended to evaluate their speaking abilities thoroughly as part of a dynamic and interactive learning process, as presented in Table 1. These activities, which included peer evaluations, group discussions, and oral presentations, were closely linked to the course goals and offered ongoing insights into the way in which students' speaking skills were developing. In order to create an atmosphere that was favorable to active participation, the researcher actively promoted a growth mindset, self-reflection, and teamwork among the experimental group throughout the course of the treatment.

The control group, by contrast, was assessed with conventional methods, which were restricted to midterm and final exams and lacked the integrative and dynamic qualities of CBA. Students in the control group were assessed twice: in the middle of the semester and at the end of the semester. The teacher did not provide the control group with any feedback about the accuracy and fluency of their speaking performance. Both groups completed a final evaluation at the end of the 14 sessions, and the researcher used statistical techniques to compare the results of the experimental and control groups. Both groups were reassessed with the same scales and tests following the completion of the 14-session CBA intervention. The purpose of this post-treatment evaluation was to determine whether the intervention had any appreciable effects on speaking ability, willingness to communicate, or engagement levels. A subset of 25 language learners from the experimental group was then interviewed one-on-one by the researcher to obtain a more thorough understanding of the effects of the CBA intervention. These interviews were intended to elicit and record the participants' thoughts, feelings, and experiences regarding their involvement in the CBA sessions and how that involvement affected their ability to speak and their desire to communicate.

2.10. Data analysis

The quantitative data were analyzed with a comprehensive statistical approach to extract meaningful insights from participants' responses. The dataset, comprising pre- and post-test results for both the control and experimental groups, underwent several statistical assessments. Descriptive statistics, including means, standard deviations, and frequency distributions, provided an overview of initial willingness to communicate and engagement levels, as well as any changes following the classroom-based assessment (CBA) intervention. ANOVA tests were then employed to ascertain whether the changes observed in the experimental group differed significantly from those in the control group, with all test assumptions checked and met.
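As an illustration of the kind of analysis described here, the sketch below runs a 2 (group) x 2 (time) factorial ANOVA and derives partial eta squared from the resulting sums of squares. It is a hedged, minimal example: the data frame, column names, and scores are hypothetical placeholders, not the study's dataset or analysis code.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format data: one row per learner per test occasion.
df = pd.DataFrame({
    "score": [15, 16, 20, 25, 14, 17, 21, 26],      # illustrative speaking scores
    "group": ["control", "experimental"] * 4,        # control vs. experimental group
    "time":  ["pre", "pre", "post", "post"] * 2,     # pretest vs. posttest
})

# Factorial model with the group-by-time interaction.
model = ols("score ~ C(group) * C(time)", data=df).fit()
anova = sm.stats.anova_lm(model, typ=2)

# Partial eta squared for each effect: SS_effect / (SS_effect + SS_error).
ss_error = anova.loc["Residual", "sum_sq"]
anova["partial_eta_sq"] = anova["sum_sq"] / (anova["sum_sq"] + ss_error)
print(anova)
```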

For qualitative data analysis, a systematic approach utilizing thematic analysis was undertaken to unveil meaningful patterns and themes within transcribed interview data. Initially, interview transcripts underwent repeated reading to grasp participants' narratives comprehensively. Open coding generated initial codes capturing key concepts, subsequently organized into thematic categories to identify overarching patterns. Themes were refined iteratively through team discussion to ensure precision and validity. Connections between themes were explored to construct a coherent narrative summarizing participants' perceptions of the CBA intervention's impact on willingness to communicate and engagement. Relevant excerpts from interview transcripts supported these themes, enhancing credibility. Ultimately, qualitative analysis was integrated with quantitative findings to offer a comprehensive understanding of the research phenomenon, enriching the study's breadth and depth.

3. Ethical approval and consent to participate

The study involved undergraduate students who volunteered to participate and signed informed consent forms, which were integrated into the scale utilized for data collection. Participants were fully informed about the study's purpose and were not required to provide their names to maintain anonymity. Due to the lack of any potential positive or negative impacts on their educational attainment or status as students, ethical approval by the Institutional Review Board (IRB) or university ethical committee was deemed unnecessary. Furthermore, as the study did not pose any risks to human beings, animals, or the environment, it fell outside the scope of IRB review, which typically focuses on studies with potential adverse effects.

4. Quantitative findings

The quantitative findings include descriptive and inferential statistics for the groups’ scores on the main variables of the study before and after the treatment.

4.1. Research question 1

To compare the speaking performance of the control and experimental groups before and after the treatment, descriptive statistics and an ANOVA were computed. The descriptive statistics are presented in Table 2.

Table 2.

Mean and SD (Standard Deviation) of the groups’ scores on speaking test.

Time | Test | Control Group (CG) Mean | CG SD | Experimental Group (EG) Mean | EG SD
Pretest | Speaking | 15.5 | 2.25 | 15.62 | 2.98
Posttest | Speaking | 20.23 | 3.36 | 25.26 | 3.29

Table 2 displays the pretest mean scores for both the control and experimental groups on speaking performance. Before the treatment, the control group had a mean score of 15.5 (SD = 2.25), while the experimental group had a slightly higher mean score of 15.62 (SD = 2.98). Following the treatment, there was a noticeable increase in speaking performance for both groups. However, the experimental group exhibited a more substantial increase compared to the control group. Specifically, the control group had a mean score of 20.23 (SD = 3.36), whereas the experimental group had a higher mean score of 25.26 (SD = 3.29). This descriptive analysis suggests a potentially positive impact of the treatment on speaking performance, particularly for the experimental group. However, to confirm whether these observed differences are statistically significant, further statistical analysis, such as ANOVA, is necessary. Table 3 presents the results of the ANOVA test.

Table 3.

ANOVA test for comparing the groups' speaking before and after the treatment.

Source | SS | df | MS | F | Sig. | PES
Corrected Model | 2192.311a | 3 | 730.770 | 154.273 | 0.000 | 0.724
Intercept | 56180.000 | 1 | 56180.000 | 11860.1 | 0.000 | 0.985
Groups | 1973.422 | 1 | 1973.422 | 416.609 | 0.000 | 0.703
Time | 125.000 | 1 | 125.000 | 26.389 | 0.000 | 0.130
Groups * Time | 93.889 | 1 | 93.889 | 19.821 | 0.000 | 0.101
Error | 833.689 | 176 | 4.737 | | |
Total | 59206.000 | 180 | | | |
Corrected Total | 3026.000 | 179 | | | |

Note: SS = Sum of squares, MS = Mean square, PES = Partial Eta Squared, df = degrees of freedom.

a. R Squared = .724 (Adjusted R Squared = .720).

A repeated measures ANOVA was conducted to evaluate the effects of the treatment on speaking performance, comparing pre- and post-intervention results in the control and experimental groups. The analysis revealed a significant main effect of time [F(1, 176) = 26.389, p < 0.001, ηp2 = 0.130], indicating notable differences in speaking performance between the pretest and posttest phases. A significant main effect of group was also observed [F(1, 176) = 416.609, p < 0.001, ηp2 = 0.703], indicating significant differences in overall speaking performance between the control and experimental groups. Furthermore, a significant interaction effect between group and time was identified [F(1, 176) = 19.821, p < 0.001, ηp2 = 0.101], showing that the change in speaking performance over time differed significantly between the control and experimental groups and pointing to a differential impact of the treatment on their speaking abilities. Overall, the model was highly significant [F(3, 176) = 154.273, p < 0.001, ηp2 = 0.724], indicating that the combination of time, group, and their interaction explained significant variance in speaking performance. These results highlight the significant impact of the treatment on improving speaking skills, with differential effects observed between the control and experimental groups.
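As a transparency check on the effect sizes reported above, the partial eta squared values can be recovered from the sums of squares in Table 3 with the standard formula (values rounded to three decimals):

$$
\eta_p^2 = \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}},\qquad
\eta_p^2(\text{Groups}) = \frac{1973.422}{1973.422 + 833.689} \approx 0.703,\qquad
\eta_p^2(\text{Groups}\times\text{Time}) = \frac{93.889}{93.889 + 833.689} \approx 0.101.
$$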

4.2. Research question 2

To compare the control and experimental groups' scores on engagement before and after the treatment, an ANOVA was used. Results are presented in Table 4, Table 5.

Table 4.

Mean and SD of control and experimental groups’ scores on engagement.

Time | Measure | Control Group Mean | Control Group SD | Experimental Group Mean | Experimental Group SD
Pretest | Engagement | 3.2 | 1.1 | 3.3 | 1.05
Posttest | Engagement | 3.3 | 1.00 | 4.1 | 0.56

Table 5.

ANOVA test for the groups’ scores on engagement.

Source | SS | df | MS | F | Sig. | PES
Corrected Model | 81.083a | 3 | 27.028 | 39.759 | 0.000 | 0.404
Intercept | 1662.272 | 1 | 1662.272 | 2445.244 | 0.000 | 0.933
Groups | 29.606 | 1 | 29.606 | 43.551 | 0.000 | 0.198
Time | 23.472 | 1 | 23.472 | 34.528 | 0.000 | 0.164
Groups * Time | 28.006 | 1 | 28.006 | 41.197 | 0.000 | 0.190
Error | 119.644 | 176 | 0.680 | | |
Total | 1863.000 | 180 | | | |
Corrected Total | 200.728 | 179 | | | |

Note: SS = Sum of squares, MS = Mean square, PES = Partial Eta Squared, df = degrees of freedom.

a. R Squared = .404 (Adjusted R Squared = .394).

Before the treatment (pretest), the control group had a mean engagement score of 3.2 (SD = 1.1), while the experimental group's mean engagement score was slightly higher at 3.3 (SD = 1.05). This suggests that both groups had similar levels of engagement prior to the treatment. Following the treatment (posttest), the mean engagement score for the control group increased slightly to 3.3 (SD = 1.00). In contrast, the mean engagement score for the experimental group showed a more substantial increase to 4.1 (SD = 0.56). Table 5 presents the results of the ANOVA test.

A repeated measures ANOVA was conducted to assess the influence of groups (Control G and Experimental G) and time (pretest and posttest) on engagement scores. The analysis revealed a significant main effect of groups, [F(1, 176) = 43.551, p < 0.001, ηp2 = 0.198], indicating noteworthy differences in engagement scores between the control group and the experimental group overall. Additionally, a significant main effect of time was observed, [F(1, 176) = 34.528, p < 0.001, ηp2 = 0.164], suggesting significant changes in engagement scores from pretest to posttest across both groups. Furthermore, the interaction effect between groups and time was significant, [F(1, 176) = 41.197, p < 0.001, ηp2 = 0.190]. This indicates that the change in engagement scores over time varied significantly between the control group and the experimental group, implying that the treatment had a differential impact on the engagement levels of the two groups.

4.3. Research question 3

The third research question investigated the effect of CBA on EFL learners' WTC. Results are presented in Table 6, Table 7.

Table 6.

Mean and SD (Standard Deviation) of the groups' scores on WTC.

Time | Control Group Mean | Control Group SD | Experimental Group Mean | Experimental Group SD
Pretest | 77.9 | 13.5 | 77 | 13.2
Posttest | 81.2 | 14 | 110.8 | 12.5

Table 7.

ANOVA test for the groups' scores on WTC.

Source | SS | df | MS | F | Sig. | PES
Corrected Model | 13716.150a | 3 | 4572.050 | 45.443 | 0.001 | 0.436
Intercept | 1046531.25 | 1 | 1046531.25 | 10401.72 | 0.001 | 0.983
Groups | 4470.050 | 1 | 4470.050 | 44.429 | 0.001 | 0.202
Time | 7644.050 | 1 | 7644.050 | 75.976 | 0.001 | 0.302
Groups * Time | 1602.050 | 1 | 1602.050 | 15.923 | 0.001 | 0.083
Error | 17707.600 | 176 | 100.611 | | |
Total | 1077955.00 | 180 | | | |
Corrected Total | 31423.750 | 179 | | | |

Note: SS = Sum of squares, MS = Mean square, PES = Partial Eta Squared, df = degrees of freedom.

a. R Squared = .436 (Adjusted R Squared = .427).

Before the intervention (pretest), both the control group and the experimental group demonstrated similar mean scores on Willingness to Communicate (WTC). The control group had a mean WTC score of 77.9 (SD = 13.5), while the experimental group had a slightly lower mean WTC score of 77 (SD = 13.2). This suggests no significant differences in WTC between the two groups prior to the intervention. Following the intervention (posttest), there was a notable increase in the mean WTC score for both groups. The control group exhibited a modest increase to a mean WTC score of 81.2 (SD = 14), whereas the experimental group showed a more substantial increase to a mean WTC score of 110.8 (SD = 12.5). These scores underwent an ANOVA test to determine whether the observed differences were statistically significant or not. The results are presented in Table 7.

The corrected model, which includes the main effects of Groups and Time as well as their interaction, was found to be highly significant, F(3, 176) = 45.443, p < 0.001, with a partial eta squared value of 0.436. This indicates that the combination of Groups, Time, and their interaction accounts for a significant amount of variance in the outcome variable. The main effect of Groups was significant, [F(1, 176) = 44.429, p < 0.001, partial eta squared = 0.202], suggesting that there were significant differences in the outcome variable between the groups. Similarly, the main effect of Time was significant, [F(1, 176) = 75.976, p < 0.001, partial eta squared = 0.302], indicating that there were significant changes in the outcome variable over time. Furthermore, the interaction effect between Groups and Time was significant, [F(1, 176) = 15.923, p < 0.001, partial eta squared = 0.083]. This indicates that the change in the outcome variable over time differed significantly between the groups, suggesting that the treatment had a differential impact on the two groups.

4.4. Research question 4

In this thematic analysis of intermediate EFL learners' perceptions, several themes emerged regarding the role of classroom-based assessment (CBA) in improving their willingness to communicate (WTC) and engagement in the EFL classroom.

4.5. Empowerment and self-direction

Many participants expressed that CBA empowered them to take more initiative in their language learning. They described how having control over their learning process increased their confidence in using English, leading to greater WTC and participation in classroom activities. The following quotations exemplify the theme:

"Before, I used to wait for the teacher to tell me what to do and how to learn English. But now, with learner autonomy, I feel like I'm in control. I choose the topics I'm interested in, and it's like I'm driving my own language learning journey. This sense of control has boosted my confidence, and I find myself speaking up in class more often." (Participant 7)

"I used to be so nervous about speaking English, especially in front of my classmates. But when I started taking charge of my learning, it was like a switch flipped. I realized that I have the power to decide what and how I learn. That shift in mindset made me feel more capable, and now I actively participate in discussions and activities. Learner autonomy has been a game-changer for me." (Participant 9)

4.6. Personalized learning

Some learners highlighted the importance of personalized learning pathways enabled by CBA. They mentioned that being able to choose materials and activities that aligned with their interests and goals made the learning experience more engaging, consequently enhancing their classroom engagement. To exemplify this theme, participant 7 stated, "With learner autonomy, I get to tailor my English learning to what truly matters to me. I can pick topics, materials, and activities that align with my interests and goals. It's not a one-size-fits-all approach anymore. This personalization has made learning English more enjoyable, and I'm more engaged because I'm learning what I care about." Similarly, participant 10 stated, “Before, English classes felt generic, and I struggled to stay engaged. However, with learner autonomy, I can choose content that resonates with me. It's like discovering a new level of enthusiasm for the language. I eagerly dive into lessons because they're relevant to my interests, and it has completely transformed my classroom engagement."

4.7. Motivation and ownership

Participants commonly mentioned that CBA heightened their motivation to learn English. They perceived a sense of ownership over their progress, leading to a greater willingness to communicate with their peers and instructors. They felt more invested in the learning process. For instance, participant 8 stated, "CBA breathed new life into my motivation for learning English. When I started making decisions about what and how to learn, it felt like I had a personal stake in my progress. I was no longer just a passive learner, but someone driving their own learning journey. That motivation translated into a newfound willingness to communicate with my classmates and teachers. It's like I found my voice in the language." Similarly, participant 9 stated, “Taking ownership of my English learning through CBA was a turning point for me. It's not just about following a curriculum; it's about owning my path and progress. This sense of ownership has made me more committed to the learning process. I'm more eager to participate, share my ideas, and connect with my peers. I'm not just learning English; I'm owning it."

4.8. Reduced anxiety

Several learners indicated that CBA allowed them to pace their learning according to their comfort level, reducing anxiety associated with language use. As their confidence grew, they became more active participants in classroom discussions and activities. For instance, participant 12 stated, "Before CBA, I always felt this overwhelming anxiety when speaking English in class. But now, I can learn at my own pace, gradually building my confidence. It's like having a safety net. As my anxiety reduced, I started to actively engage in discussions and activities without that constant fear of making mistakes. It's been liberating." Participant 15 also stated, "CBA has been a lifesaver for me. I used to dread English classes because of the pressure to keep up with everyone else. But now, I can take my time, focus on what I need, and build my language skills at a comfortable pace. It's remarkable how my anxiety has faded away. I'm now a much more relaxed and active participant in class discussions and activities, thanks to CBA."

4.9. Teacher facilitation

Some participants emphasized the role of teachers in fostering CBA. They appreciated instructors who encouraged autonomy and provided guidance when needed, as it positively influenced their WTC and engagement. The following quotations exemplify the theme:

"Having a teacher who supports learner autonomy is a game-changer. They encourage us to take charge of our learning while offering guidance when we need it. It's like having a mentor who empowers us. This approach has significantly boosted my willingness to communicate. I'm more confident in my abilities, knowing they're there to help if I stumble." (Participant 17)

"I've been lucky to have instructors who understand the value of CBA. They don't just dictate what we should learn but allow us to explore our interests. When I know my teacher supports my autonomy, it motivates me to engage more actively. I feel like I'm part of a collaborative learning journey, and that's made a world of difference in my classroom participation."(participant 19)

Overall, the thematic analysis revealed that intermediate EFL learners perceive CBA as a catalyst for improved WTC and classroom engagement. It empowers learners, personalizes their learning experience, enhances motivation, reduces anxiety, and can be facilitated by supportive instructors.

5. Discussion

The first main finding of this study, a significant difference in speaking performance between the control and experimental groups after the 14-session classroom-based assessment (CBA) intervention, is consistent with existing research in this area. The initial similarity in speaking performance between the two groups reflects the concept of baseline equivalence in assessment studies, as discussed by Hill and McNamara [38], indicating that both groups had comparable performance levels before the intervention. The significant improvement in speaking performance observed in the experimental group after the CBA intervention is consistent with the principles of formative assessment and its positive impact on learning outcomes, as emphasized by Black and Wiliam [[2], [3], [4]]. The CBA approach, characterized by continuous practice, timely feedback, and learner engagement, reflects the principles of formative assessment and leads to improved speaking skills [5]. The significant difference in post-intervention speaking scores between the two groups, supported by a small p-value and a large effect size, reinforces the notion that well-designed classroom-based assessment interventions can lead to substantial improvements in student performance, as discussed by Pryor and Crossouard [32]. In summary, the results are consistent with the existing literature on formative assessment, classroom-based assessment, and their capabilities.

The results also showed how the engagement levels of the control and experimental groups compared before and after the classroom-based assessment (CBA) intervention, and they are consistent with established research on the influence of formative assessment and learner engagement in educational contexts. Initially, the two groups showed similar levels of engagement, as indicated by p-values above .05, suggesting that the groups were homogeneous in terms of engagement at the start of the study, in keeping with the principle of group equivalence in experimental design [36,41]. After the CBA intervention, the experimental group showed notable improvements in several dimensions of engagement compared to the control group. These improvements are consistent with existing research on the positive impact of effective instructional assessment on student engagement [23,32]. Affective engagement increased significantly in the experimental group, reflecting the creation of a more positive affective learning environment. This change, with a moderate effect size of 0.68, highlights the progress in affective engagement achieved through the learner-centered CBA approach [33,40].

Cognitive engagement showed a significant increase in the experimental group, indicating deeper cognitive engagement promoted by the CBA intervention. This is consistent with the principles discussed by Hill and McNamara [38], which highlight the importance of classroom-based assessment in improving students' cognitive engagement. Behavioral engagement also improved significantly in the experimental group, reflecting the effectiveness of continuous practice and learner engagement in line with the principles of formative assessment [5,15]. The significant effect size underlines the influence of the CBA intervention on shaping student behavior in the learning process. Overall, the results indicate a comprehensive improvement in engagement levels, as evidenced by the significantly higher overall engagement score in the experimental group following the CBA intervention. The large effect size underscores the magnitude of this change and the comprehensive improvement in learner engagement [33,34].

Results also demonstrated the impact of the treatment on students' WTC. Initially, there was no statistically significant difference between the two groups' mean WTC scores, indicating homogeneity before the treatment. However, after the intervention, a significant difference emerged, with the experimental group showing substantially higher WTC, evidenced by a large effect size. This increase underscores the intervention's positive impact on students' oral communication confidence, consistent with literature on the effectiveness of such interventions [23,32].

The qualitative findings highlighted five key themes: empowerment and self-direction, personalized learning, motivation and ownership, reduced anxiety, and teacher facilitation. One prominent theme was personalized learning pathways, whereby participants valued choosing materials and activities that aligned with their interests and goals. This personalization enhanced their learning experience and sense of control over their educational journey, aligning with contemporary educational theories [11,12].

Participants noted that learner autonomy increased their confidence in using English, leading to greater willingness to communicate and participate in class. This finding supports self-determination and motivation theories, which suggest that autonomy fosters intrinsic motivation [7,8]. Intrinsic motivation is linked to higher engagement and better learning outcomes [6,9]. The qualitative findings underscore the importance of fostering learner autonomy to enhance motivation and ownership in language learning.

Another significant theme that emerged from the data was the role of autonomy in reducing learner anxiety. Several participants indicated that having control over their learning pace and content allowed them to manage their anxiety more effectively. This finding is consistent with research on anxiety and language learning, which suggests that learner autonomy can be a valuable coping strategy for reducing anxiety [16,17]. When learners feel more in control of their learning, they may experience less performance anxiety, which can positively impact their willingness to communicate and engage in classroom activities [20]. This suggests that empowering learners through self-directed learning can contribute to a more positive and less anxiety-inducing learning environment. Participants also emphasized the role of teachers in fostering learner autonomy and empowerment. They appreciated instructors who encouraged autonomy and provided guidance when needed. This finding aligns with the concept of a "scaffolded" approach to autonomy, where teachers support learners in gradually taking more control over their learning [9,18]. Effective teacher facilitation is crucial in helping learners navigate the complexities of self-directed learning while ensuring that they stay on track and achieve their learning goals. This suggests that teachers play a pivotal role in creating an environment where empowerment and self-direction can thrive.

6. Conclusions and implications

This study has several notable strengths. The research takes a comprehensive approach by examining multiple dimensions of language learning, including speaking performance, engagement, and willingness to communicate. The 14-session longitudinal design with pre- and post-test assessments allows for a thorough examination of the sustained effects of CBA over time. Varied assessment activities, such as oral presentations, group discussions, and peer evaluations, contribute to the authenticity and relevance of the findings. Reporting objective measures, including the statistical analyses with their accompanying p-values and effect sizes, increases the accuracy and transparency of the results. Furthermore, the qualitative component examining learners' perceptions provides valuable insights into their subjective experiences and enriches the understanding of the potential benefits of CBA. The study's focus on Chinese EFL learners adds relevance to the broader language teaching literature and offers practical implications for educators seeking to improve their language teaching practices. Overall, the study represents a comprehensive examination of the impact of CBA on language learning outcomes, combining quantitative and qualitative approaches to provide nuanced insights and actionable recommendations for educators.

In conclusion, the quantitative findings robustly support the effectiveness of the Classroom-Based Assessment (CBA) intervention in enhancing speaking performance and learner engagement in the experimental group compared to the control group. These results align with established principles of formative assessment and its positive impact on learning outcomes. The substantial improvement in speaking performance and increased engagement across affective, cognitive, and behavioral dimensions highlight the potential of learner-centered assessment approaches to foster significant positive changes in language learning. This underscores the importance of integrating formative assessment strategies into language education to promote both academic achievement and learner engagement.

Moreover, the study's findings on willingness to communicate (WTC) dimensions reveal a comprehensive positive impact of the intervention on students' confidence and engagement in speaking, reading, writing, and comprehension. These results are consistent with prior research on the influence of educational interventions on WTC and its sub-scales. The substantial differences in post-intervention scores between the control and experimental groups emphasize the effectiveness of the CBA approach in enhancing students' communicative engagement. This highlights the importance of considering multiple dimensions of WTC when evaluating the impact of language interventions and demonstrates the potential for fostering well-rounded communication skills through learner-centered assessment practices.

In terms of implications, these findings have significant relevance for language educators and curriculum designers. The study highlights the positive outcomes associated with learner-centered assessment strategies, suggesting that educators should consider incorporating formative assessment practices into their teaching approaches. Moreover, the findings underscore the importance of personalized learning experiences, learner autonomy, and teacher facilitation in promoting engagement and reducing anxiety. Educators can benefit from adopting strategies that give students more control over their learning while providing guidance and support when needed. Overall, the results provide valuable insights into the potential of learner-centered assessment and empowerment-focused language education to enhance both language learning outcomes and the overall educational experience.

Data availability

The data are available on request from the corresponding author.

Ethical approval and consent to participate

The study involved undergraduate students who volunteered to participate and signed informed consent forms, which were integrated into the scale used for data collection. Participants were fully informed about the study's purpose and were not required to provide their names, in order to maintain anonymity. Because participation had no potential positive or negative impact on their educational attainment or status as students, ethical approval by the Institutional Review Board (IRB) or university ethics committee was deemed unnecessary. Furthermore, as the study did not pose any risks to human beings, animals, or the environment, it fell outside the scope of IRB review, which typically focuses on studies with potential adverse effects.

Funding

This work was sponsored in part by the "Offline, Online+Offline and Social Practice" Provincial First-Class Courses for Undergraduate Universities 2019: English Curriculum and Teaching Pedagogy.

CRediT authorship contribution statement

Yawen Liu: Writing – review & editing, Writing – original draft, Visualization, Project administration, Funding acquisition, Formal analysis, Data curation, Conceptualization.

Declaration of competing interest

The author declares no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgment

The author would like to thank all participants of the study.

References

  • 1.Cizek G.J. In: The Handbook of Formative Assessment. Andrade H.L., Cizek G.J., editors. Routledge; New York, NY: 2010. An introduction to formative assessment: history, characteristics, and challenges; pp. 3–17. [Google Scholar]
  • 2.Black P., Wiliam D. Assessment and classroom learning. Assess Educ. Princ. Pol. Pract. 1998;5:7–74. doi: 10.1080/0969595980050102. [DOI] [Google Scholar]
  • 3.Black P., Wiliam D. Developing the theory of formative assessment. Educ. Assess. Eval. Account. 2009;21:5–31. doi: 10.1007/s11092-008-9068-5. [DOI] [Google Scholar]
  • 4.Black P., Wiliam D. Classroom assessment and pedagogy. Assess Educ. Princ. Pol. Pract. 2018;25:551–575. doi: 10.1080/0969594X.2018.1441807. [DOI] [Google Scholar]
  • 5.Britton M. Multilingual Matters; Bristol, UK: 2021. Assessment for Learning in Primary Language Learning and Teaching. [Google Scholar]
  • 6.Nikolov M., Timpe-Laughlin V. Assessing young learners' foreign language abilities. Lang. Teach. 2021;54:1–37. doi: 10.1017/S0261444820000294. [DOI] [Google Scholar]
  • 7.Butler Y.G., Lee J. The effects of self-assessment among young learners of English. Lang. Test. 2010;27:5–31. doi: 10.1177/0265532209346370. [DOI] [Google Scholar]
  • 8.Butler Y.G. In: The Handbook of Second Language Assessment. Tsagari D., Banerjee J., editors. De Gruyter Mouton; Boston, MA: 2016. Assessing young learners; pp. 359–375. [Google Scholar]
  • 9.Butler Y.G. In: Second Handbook of English Language Teaching. Gao X., editor. Springer; Cham, Switzerland: 2019. Assessment of young English learners in instructional settings; pp. 477–496. [Google Scholar]
  • 10.Butler Y.G. In: The Routledge Handbook of Language Testing. second ed. Fulcher G., Harding L., editors. Routledge; New York: 2021. Assessing young learners; pp. 153–170. [Google Scholar]
  • 11.Kaur K. Formative assessment in English language teaching: exploring the enactment practices of teachers within three primary schools in Singapore. Asia Pac. J. Educ. 2021;41:695–710. doi: 10.1080/02188791.2021.1997707. [DOI] [Google Scholar]
  • 12.Borg S. The impact of in-service teacher education on language teachers' beliefs. System. 2011;39:370–380. doi: 10.1016/j.system.2011.07.009. [DOI] [Google Scholar]
  • 13.Borg S. In: Second Handbook of English Language Teaching. Gao X., editor. Springer; Cham, Switzerland: 2019. Language teacher cognition: perspectives and debates; pp. 1149–1169. [Google Scholar]
  • 14.Sun Q., Zhang L.J. A sociocultural perspective on English-as-a-foreign-language (EFL) teachers' cognitions about form-focused instruction. Front. Psychol. 2021;12 doi: 10.3389/fpsyg.2021.593172. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Yan Q., Zhang L.J., Dixon H.R. Exploring classroom-based assessment for young EFL learners in the Chinese context: teachers' beliefs and practices. Front. Psychol. 2022;13 doi: 10.3389/fpsyg.2022.1051728. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Absolum M., Flockton L., Hattie J., Hipkins R., Reid I. 2009. Directions for Assessment in New Zealand: Developing Students' Assessment Capabilities. Unpublished paper prepared for the Ministry of Education. [Google Scholar]
  • 17.Chen J., Brown G.T. High-stakes examination preparation that controls teaching: Chinese prospective teachers' conceptions of excellent teaching and assessment. J. Educ. Teach. 2013;39(5):541–556. [Google Scholar]
  • 18.Ngok K. Chinese education policy in the context of decentralization and marketization: evolution and implications. Asia Pac. Educ. Rev. 2007;8:142–157. [Google Scholar]
  • 19.Carless D. Routledge; London, UK: 2011. From Testing to Productive Student Learning: Implementing Formative Assessment in Confucian-Heritage Settings. [Google Scholar]
  • 20.Kunnan A.J., Jang E.E. The Handbook of Language Teaching. 2009. Diagnostic feedback in language assessment; pp. 610–627. [Google Scholar]
  • 21.Tsagari D. Assessment orientations of state primary EFL teachers in two Mediterranean countries. Center Educ. Policy Stud. J. 2016;6:9–30. doi: 10.26529/cepsj.102. [DOI] [Google Scholar]
  • 22.Turner C.E. In: The Routledge Handbook of Language Testing. Fulcher G., Davidson F., editors. Routledge; London, England: 2012. Classroom assessment; pp. 65–79. [Google Scholar]
  • 23.Turner C.E., Purpura J.E. In: The Handbook of Second Language Assessment. Tsagari D., Banerjee J., editors. De Gruyter Mouton; Berlin, Germany: 2016. Learning-oriented assessment in second and foreign language classrooms; pp. 1–19. [Google Scholar]
  • 24.McMillan J.H. In: The Sage Handbook of Research on Classroom Assessment. McMillan J.H., editor. Sage; Thousand Oaks, CA: 2013. Why we need research on classroom assessment; pp. 2–17. [Google Scholar]
  • 25.Wu X.M., Dixon H.R., Zhang L.J. Sustainable development of students' learning capabilities: the case of university students' attitudes towards teachers, peers, and themselves as oral feedback sources in learning English. Sustainability. 2021;13:5211. doi: 10.3390/su13095211. [DOI] [Google Scholar]
  • 26.Wu X.M., Zhang L.J., Dixon H.R. Implementing assessment for learning (AfL) in Chinese university EFL classes: teachers' values and practices. System. 2021;101 doi: 10.1016/j.system.2021.102589. [DOI] [Google Scholar]
  • 27.Wu X.M., Zhang L.J., Liu Q. Using assessment for learning (AfL): multi-case studies of three Chinese university English-as-a-foreign-language (EFL) teachers engaging students in learning and assessment. Front. Psychol. 2021;12:1–16. doi: 10.3389/fpsyg.2021.725132. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Wiliam D. In: Curriculum and Assessment. Scott D., editor. Ablex Publishing; Westport, Conn: 2001. An overview of the relationship between assessment and the curriculum; pp. 165–181. [Google Scholar]
  • 29.Wiliam D. In: Handbook of Formative Assessment. Andrade H.L., Cizek G.J., editors. Routledge; New York, NY: 2010. An integrative summary of the research literature and implications for a new theory of formative assessment; pp. 18–40. [Google Scholar]
  • 30.Wiliam D., Thompson M. In: The Future of Assessment: Shaping Teaching and Learning. Dwyer C.A., editor. Lawrence Erlbaum; Mahwah, NJ: 2008. Integrating assessment with instruction: what will it take to make it work? pp. 53–82. [Google Scholar]
  • 31.Sadler R.D. Formative assessment and the design of instructional systems. Instr. Sci. 1989;18:119–144. doi: 10.1007/BF00117714. [DOI] [Google Scholar]
  • 32.Pryor J., Crossouard B. A socio-cultural theorisation of formative assessment. Oxf. Rev. Educ. 2008;34:1–20. doi: 10.1080/03054980701476386. [DOI] [Google Scholar]
  • 33.Panadero E., Andrade H., Brookhart S.M. Fusing self-regulated learning and formative assessment: a roadmap of where we are, how we got here, and where we are going. Aust. Educ. Res. 2018;45:13–31. doi: 10.1007/s13384-018-0258-y. [DOI] [Google Scholar]
  • 34.Rea-Dickins P. In: International Handbook of English Language Teaching. Cummins J., Davison C., editors. Springer; New York, NY: 2007. Classroom-based assessment: possibilities and pitfalls; pp. 505–520. [Google Scholar]
  • 35.Dixson D.D., Worrell F.C. Formative and summative assessment in the classroom. Theory Into Pract. 2016;55:153–159. doi: 10.1080/00405841.2016.1148989. [DOI] [Google Scholar]
  • 36.Dixon H., Hill M., Hawe E. Noticing and recognising AfL practice: challenges and their solution. Assess. Matters. 2020;14:42–62. doi: 10.18296/am.0044. [DOI] [Google Scholar]
  • 37.Dolin J., Black P., Harlen W., Tiberghien A. In: Transforming Assessment. Dolin J., Evans R., editors. Springer; Cham, Switzerland: 2018. Exploring relations between formative and summative assessment; pp. 53–80. [Google Scholar]
  • 38.Hill K., McNamara T. Developing a comprehensive, empirically based research framework for classroom-based assessment. Lang. Test. 2011;29:395–420. doi: 10.1177/0265532211428317. [DOI] [Google Scholar]
  • 39.Davison C. In: Second Handbook of English Language Teaching. Gao X., editor. Springer International; Cham, Switzerland: 2019. Using assessment to enhance learning in English language education; pp. 433–454. [Google Scholar]
  • 40.Wiliam D., Thompson M. In: The Future of Assessment: Shaping Teaching and Learning. Dwyer C.A., editor. Lawrence Erlbaum; Mahwah, NJ: 2008. Integrating assessment with instruction: what will it take to make it work? pp. 53–82. [Google Scholar]
  • 41.Davison C., Leung C. Current issues in English language teacher-based assessment. Tesol Q. 2009;43:393–415. doi: 10.1002/j.1545-7249.2009.tb00242.x. [DOI] [Google Scholar]
  • 42.Xu Y., Liu Y. Teacher assessment knowledge and practice: a narrative inquiry of a Chinese college EFL teacher's experience. Tesol Q. 2009;43:492–513. doi: 10.1002/j.1545-7249.2009.tb00246.x. [DOI] [Google Scholar]
  • 43.Chen Q., May L., Klenowski V., Kettle M. The enactment of formative assessment in English language classrooms in two Chinese universities: teacher and student responses. Assess Educ. Princ. Pol. Pract. 2014;21:271–285. doi: 10.1080/0969594X.2013.790308. [DOI] [Google Scholar]
  • 44.Nasr M., Bagheri M.S., Sadighi F., Rassaei E. Iranian EFL teachers' perceptions of assessment for learning regarding monitoring and scaffolding practices as a function of their demographics. Cogent Educ. 2018;5 doi: 10.1080/2331186X.2018.1558916. [DOI] [Google Scholar]
  • 45.Vattøy K.D. Teachers' beliefs about feedback practice as related to student self-regulation, self-efficacy, and language skills in teaching English as a foreign language. Stud. Educ. Eval. 2020;64 doi: 10.1016/j.stueduc.2019.100828. [DOI] [Google Scholar]
  • 46.Fredricks J.A., Mccolskey W. In: Handbook of Research on Student Engagement. Christenson S., Reschy A.L., Wylie C., editors. Springer; New York, NY, USA: 2012. The measurement of student engagement: a comparative analysis of various methods and student self-report instruments; pp. 763–782. [Google Scholar]
  • 47.Eccles J., Wang M.-T. In: Handbook of Research on Student Engagement. Christenson S.L., Reschly A.L., Wylie C., editors. Springer; Boston, MA, USA: 2012. Part I commentary: so what is student engagement anyway? pp. 133–145. [Google Scholar]
  • 48.Li Y., Lerner R.M. Trajectories of school engagement during adolescence: implications for grades, depression, delinquency, and substance use. Dev. Psychol. 2011;47:233–247. doi: 10.1037/a0021307. [DOI] [PubMed] [Google Scholar]
  • 49.King R.B. Sense of relatedness boosts engagement, achievement, and well-being: a latent growth model study. Contemp. Educ. Psychol. 2015;42:26–38. [Google Scholar]
  • 50.Gobert J.D., Baker R.S., Wixon M.B. Operationalizing and detecting disengagement within online science microworlds. Educ. Psychol. 2015;50:43–57. [Google Scholar]
  • 51.Ketonen E.E., Malmberg L.-E., Salmela-Aro K., Muukkonen H., Tuominen H., Lonka K. The role of study engagement in university students' daily experiences: a multilevel test of moderation. Learn. Individ. Differ. 2019;69:196–205. [Google Scholar]
  • 52.Appleton J.J., Christenson S.L., Kim D., Reschly A.L. Measuring cognitive and psychological engagement: validation of the student engagement instrument. J. Sch. Psychol. 2006;44:427–445. [Google Scholar]
  • 53.Zhou Q., Main A., Wang Y. The relations of temperamental effortful control and anger/frustration to Chinese children's academic achievement and social adjustment: a longitudinal study. J. Educ. Psychol. 2010;102:180. [Google Scholar]
  • 54.Schaufeli W.B., Salanova M., González-Romá V., Bakker A.B. The measurement of burnout and engagement: a confirmatory factor analytic approach. J. Happiness Stud. 2002;3:71–92. [Google Scholar]
  • 55.Avcı Ü., Ergün E. Online students' LMS activities and their effect on engagement, information literacy and academic performance. Interact. Learn. Environ. 2022;30:71–84. [Google Scholar]
  • 56.Wang C., Kim D.-H., Bai R., Hu J. Psychometric properties of a self-efficacy scale for English language learners in China. System. 2014;44:24–33. doi: 10.1016/j.system.2014.01.015. [DOI] [Google Scholar]
  • 57.Kagan K. Engaging Teacher Candidates and Language Learners with Authentic Practice. IGI Global; 2019. Boosting engagement and intercultural competence through technology; pp. 55–74. [Google Scholar]
  • 58.MacIntyre P., Dörnyei Z., Clément R., Noels K. Conceptualizing willingness to communicate in a L2: a situational model of L2 confidence and affiliation. Mod. Lang. J. 1998;82:545–562. doi: 10.1111/j.1540-4781.1998.tb05543.x. [DOI] [Google Scholar]
  • 59.MacIntyre P., Charos C. Personality, attitudes, and affect as predictors of second language communication. J. Lang. Soc. Psychol. 1996;15:3–26. [Google Scholar]
  • 60.Kruk M. Dynamicity of perceived willingness to communicate, motivation, boredom and anxiety in second life: the case of two advanced learners of English. Comput. Assist. Lang. Learn. 2019;2:1–27. [Google Scholar]
  • 61.MacIntyre P.D., Vincze L. Positive and negative emotions underlie motivation for L2 learning. Stud. Sec. Lang. Learn. Teach. 2017;7:61–88. doi: 10.14746/ssllt.2017.7.1.4. [Google Scholar]
  • 62.Öz H., Demirezen M., Pourfeiz J. Willingness to communicate of EFL learners in Turkish context. Learn. Individ. Differ. 2015;37:269–275. doi: 10.1016/j.lindif.2014.12.009. [DOI] [Google Scholar]
  • 63.Dewaele J.M., Dewaele L. Learner-internal and learner-external predictors of willingness to communicate in the FL classroom. J. Eur. Second Lang. Assoc. 2018;2:24–37. doi: 10.22599/jesla.37. [DOI] [Google Scholar]
  • 64.Lee J.S., Hsieh J.C. Affective variables and willingness to communicate of EFL learners in in-class, out-of-class, and digital contexts. System. 2019;82:63–73. doi: 10.1016/j.system.2019.03.002. [DOI] [Google Scholar]
  • 65.Fadilah E. Willingness to communicate from Indonesian learners' perspective. J. ELT Res. 2018;3:168–185. doi: 10.22236/JER_Vol3Issue2pp168-185. [DOI] [Google Scholar]
  • 66.Mystkowska-Wiertelak A., Pawlak M. In: Classroom-oriented Research. Pawlak M., editor. Springer; Cham: 2016. Designing a tool for measuring the interrelationships between L2 WTC, confidence, beliefs, motivation, and context; pp. 19–37. [Google Scholar]
  • 67.Zarei N., Saeidi M., Ahangari S. Exploring EFL teachers' socioaffective and pedagogic strategies and students' willingness to communicate with a focus on Iranian culture. Educ. Res. Int. 2019;2:1–12. doi: 10.1155/2019/3464163. [DOI] [Google Scholar]
  • 68.Khajavy G.H., Makiabadi H., Navokhi S.A. The role of psychological capital in language learners' willingness to communicate, motivation, and achievement. Eurasian J. Appl. Linguist. 2019;5:495–513. doi: 10.32601/ejal.651346. [DOI] [Google Scholar]
  • 69.Zhang J., Beckmann N., Beckmann J.F. To talk or not to talk: a review of situational antecedents of willingness to communicate in the second language classroom. System. 2018;72:226–239. [Google Scholar]
  • 70.Barabadi H., Mojab F., Vahidi H., Marashi B., Talank N., Hosseini O., Saravanan M. Green synthesis, characterization, antibacterial and biofilm inhibitory activity of silver nanoparticles compared to commercial silver nanoparticles. Inorg. Chem. Commun. 2021;129 [Google Scholar]
  • 71.MacIntyre P.D., Wang L. Willingness to communicate in the L2 about meaningful photos: application of the pyramid model of WTC. Lang. Teach. Res. 2021;25(6):878–898. [Google Scholar]
  • 72.Cao Y., Philp J. Interactional context and willingness to communicate: a comparison of behavior in whole class, group and dyadic interaction. System. 2006;34(4):480–493. [Google Scholar]
  • 73.Mystkowska-Wiertelak A., Pawlak M. Willingness to communicate in instructed second language acquisition: combining a macro- and micro-perspective. Multilingual Matters. 2017;110. [Google Scholar]
  • 74.Mahmoodi M.H., Moazam I. Willingness to communicate (WTC) and L2 achievement: the case of Arabic language learners. Procedia-Social and Behavioral Sciences. 2014;98:1069–1107. [Google Scholar]


