Abstract
Alternative assessment aims to increase the practicality and authenticity of assessment in university education and has been increasingly used during the pandemic. Its implementation can be ineffective without considering students' needs and concerns in adapting to new assessment practices. This mixed-methods study applied an attitude–behaviour–context model to examine students' perceptions of the implementation of alternative assessment. One hundred and thirteen questionnaires were collected from students who had experienced alternative assessment before the survey, and six students were interviewed about their learning experience. The quantitative results revealed that the students' perceived context of alternative assessment directly influenced their learning behaviour, and that their attitudes towards alternative assessment partially mediated the relationship between perceived context and learning behaviour. The qualitative data were analysed using a deductive thematic approach, providing an in-depth interpretation of students' understanding and awareness of alternative assessment, perceived teacher support and expectations about alternative assessment at the university. The semi-structured interviews found that although students viewed the alternative assessment positively as an authentic task that helped develop their higher-level thinking skills, its effectiveness was weakened by insufficient support and the monotony of the assessment types. This study's findings provide practical suggestions for teachers and universities to improve alternative assessment.
Keywords: Authenticity in assessment, Formative assessment, Higher education, Mixed-methods, Teacher support
1. Introduction
Views of assessment have changed over time in line with the desire to provide students with a better learning experience (Berry, 2008). The suspension of face-to-face learning because of the coronavirus pandemic has motivated universities to rethink their assessment approach with the aim of designing creative, innovative and practical assessments for university students (Kong et al., 2021). To cope with the unexpected challenges posed by the pandemic, universities worldwide have shifted away from traditional test-based assessment to alternative assessment, also known as authentic assessment (Brown and Sambell, 2020). For instance, some chemistry teachers at a university in Singapore designed concept map tasks to replace examinations and tests (Lau et al., 2020). Moreover, Gachon International Language Centre redesigned its assessment to adapt to the sudden transition to online learning by asking students to give video presentations and conduct self and peer assessment (Chung and Choi, 2021). Teachers at a Hong Kong university also used different technology tools to facilitate the adoption of alternative assessment, such as designing posters and creating e-portfolios (Kong et al., 2021). These examples show that universities have been working hard to maintain the quality of assessment in the online learning environment.
Meanwhile, students' perceptions of alternative assessment are important because the effectiveness of the assessment is determined by their learning experience and outcomes (Ajjawi et al., 2020). Siow (2015) showed that students' positive attitudes towards self and peer assessment stem from the notable learning outcomes they experience, such as improved critical thinking and organisational skills. Students with a favourable attitude towards alternative assessment would welcome such assessment in the future, helping universities achieve the goal of assessment for learning. However, Atifnigar et al. (2020) reported that students developed negative attitudes towards portfolio assessment because of the extra workload; they also failed to see the relevance of the assessment to their future careers. In view of this, the university reduced the content requirements of the assessment and redesigned it to be more relevant to the students' future career development, after which the students developed a more positive attitude towards the alternative assessment. Therefore, students' attitudes towards and perceived context of alternative assessment can affect their engagement in the assessment. While previous studies have mainly investigated students' perceptions of alternative assessment (Atifnigar et al., 2020; Siow, 2015; Cornelius and Kinghorn, 2014), little is known about how students' learning behaviours in alternative assessment are influenced by their attitudes and perceived context.
It is important to learn from students' perspectives how alternative assessment is implemented and whether they feel comfortable with such a novel approach. Universities and academic staff can then adjust the alternative assessment according to students' needs and expectations and thus help ensure its effectiveness. The purpose of this study was to explore the perceptions and learning behaviours of students who experienced alternative assessment at a university. The attitude–behaviour–context (ABC) model was adopted to explore students' attitudes, perceived context and learning behaviours in relation to alternative assessment. A mixed-methods approach was adopted, with a quantitative study examining the statistical relationships between students' attitudes, perceived context and learning behaviours and a qualitative study offering an in-depth picture of students' perceptions of their experiences with alternative assessment. This study aimed to answer the following research questions.
Research Question 1
(RQ 1): What are the relationships between students' attitudes towards, perceived context of and learning behaviours in alternative assessment? (Quantitative study)
Research Question 2
(RQ 2): How do students perceive the situation of alternative assessment regarding their understanding and awareness of alternative assessment and perceived teacher support? (Qualitative study)
Research Question 3
(RQ 3): What do students expect from the university to improve their learning experience in alternative assessment? (Qualitative study)
This study's findings and implications provide suggestions and insights for universities to improve their alternative assessment approaches.
2. Background
2.1. Alternative assessment in university education
Educators in university settings view assessment not only as the measurement of students' learning outcomes but also as a way of engaging students in the learning process (Brown, 2015). Alternative assessment emphasises practicality and authenticity. It enables students to apply their knowledge and skills in real-life scenarios (Berry, 2008). According to Wiggins (1998), alternative assessment consists of the following six elements: (1) enabling students to reflect on the knowledge and skills that are essential in a real-world setting; (2) encouraging students to think critically and innovatively to solve unstructured problems; (3) asking students to complete a task using the skills and procedures typically used in their academic discipline; (4) allowing students to apply what they have learned in a realistic context; (5) assessing students’ performance in using a wide range of higher-order competencies in complex tasks and (6) allowing students to exchange feedback with their peers and to practise so they can improve their work. Alternative assessment can be implemented in various ways, such as creating portfolios and reflective journals, designing posters, doing self and peer assessment, making video or oral role-play presentation and completing projects (Berry, 2008; Cornelius and Kinghorn, 2014; Craft and Ainscough, 2015).
Alternative assessment has been found to promote university students' learning outcomes (Sokhanvar et al., 2021). Applying acquired knowledge and skills in realistic circumstances promotes students' learning determination and satisfaction, and students thus participate more actively in their learning (Ashford-Rowe et al., 2014; Svinicki, 2004). For instance, James and Casidy (2018) found that undergraduates had favourable attitudes towards alternative assessment, which was found to drive their learning satisfaction and intention to pursue business studies. Nikolova and Andersen (2017) concluded that business students' engagement in service-learning assessment was promoted when they faced real clients with needs. Alternative assessment can also enhance students' cognitive skills such as self-reflection, creativity and problem-solving (Darling-Hammond and Snyder, 2000; Palmer, 2004). Traditional assessment tends to test memory and lower-level thinking skills, whereas alternative assessment is considered a better tool for comprehensively assessing students' performance, including their higher-level thinking skills (Ahmad et al., 2020). According to Berry (2008), students are required to monitor and reflect on their own learning progress, which is a fundamental feedback practice featured in alternative assessment. The self-reflection and feedback process allows students to examine their true selves and cultivate their creativity and problem-solving skills (Palmer, 2004; Wiewiora and Kowalkiewicz, 2018). For example, Palmer (2004) found that traditional assessments, such as examinations, tend to engender only surface learning in engineering students. Comparatively, an engineering assignment designed as an alternative assessment was more effective in developing students' deep learning and creativity (Palmer, 2004). However, because it is time- and resource-intensive to prepare and adopt, alternative assessment can be a challenging assessment approach (Sokhanvar et al., 2021).
For example, in Muthohharoh et al. (2020), an English teacher needed to give more attention to students' progress during class because he lacked the time to implement self-assessment formally. Putri et al. (2019) asserted that teachers do not have enough time to measure all aspects of students' oral language skills via alternative assessment, thus discouraging students' participation in speaking classes. In addition, teachers often lack an awareness of alternative assessment or techniques for adopting it (Ojung'a & Allida, 2017). Ojung'a and Allida (2017) observed that teachers in Kenya tended to use traditional assessment tools and seldom designed or used alternative assessment tools; as a result, students' engagement in alternative assessment was low. Meanwhile, students may feel insecure about new types of assessment because they are unfamiliar with the format and measurement standards (Fox et al., 2017). In Lau et al. (2020), some chemistry students felt uneasy about replacing traditional written assignments with a concept map task because they doubted that the alternative approach would reflect their abilities. To explore the challenges related to alternative assessment, this study investigated the implementation of alternative assessment in a university. We believe that the findings will give university educators new ideas for improving their alternative assessment approaches.
2.2. Theoretical framework—attitude–behaviour–context theory
Attitude–behaviour–context (ABC) theory (Guagnano et al., 1995) has been frequently used in studies of environmental education and pro-environmental behaviour, specifically to explain how pro-environmental activities are promoted through the constructs of attitude and context. In ABC theory, ‘attitude’ refers to a person's beliefs, values and general assumptions about behaving in specific ways (Stern, 2000), and a positive attitude implies acceptance and ongoing adoption of the promoted item by users (Reddy et al., 2020). ‘Context’, in contrast, refers to external legal, physical, social or financial parameters (Stern, 2000). As universities continue to improve their use of assessment, the higher education assessment context is ever-changing, shifting from summative towards formative approaches (Pereira et al., 2016). Researchers argue that attitude alone is insufficient to motivate behaviour change and that contextual factors can motivate or discourage actions (Guagnano et al., 1995; Zepeda and Deal, 2009). In addition, studies have found that attitude plays a mediating role in the interaction between contextual factors and behaviour. For instance, Guagnano et al. (1995) found that Americans' recycling behaviour was significantly affected by both their ideas of responsibility (i.e., attitude) and their possession of a recycling bin (a contextual factor); importantly, the possession of a bin indirectly influenced their recycling behaviour via their ideas about responsibility. Zepeda and Deal (2009) found that organic-food shoppers perceived that a few large chain stores controlled the availability of organic food; motivated by this negative perception of control, they preferred to shop at smaller, local markets.
ABC theory has been increasingly used to study the relationships between context, attitudes and behaviours. However, most of these studies have been conducted in the environmental protection field (e.g., Xu et al., 2017; Zepeda and Deal, 2009). Specifically, researchers have mainly applied ABC theory to understand the connection between the context of an environmental policy and the public's environmental behaviour. Similarly, we argue that ABC theory is applicable to the education field because it can help educators to understand the context of an education policy and students' learning behaviour. However, there is limited evidence of ABC theory being adopted in educational research. This study adapted ABC theory to better understand the implementation of alternative assessment. We explored how students' attitudes towards and perceived context of alternative assessment at university affected their behaviour in relation to alternative assessment. The proposed path model is shown in Figure 1.
Figure 1.
The proposed attitude–behaviour–context (ABC) theory model for describing university education alternative assessment.
2.3. Hypotheses
The qualitative findings of Adie et al. (2010) suggested that university students have a fixed mindset in which assessment must have a research study format. However, after building a more supportive learning environment by providing sufficient guidelines and information about the alternative assessment, the students were willing to engage in the alternative assessment approach, which took the form of a community project. Fook and Sidhu (2010) found that if students perceived that the context of alternative assessment would allow them to acquire practical real-world skills, they engaged more actively in the task and did not need close supervision from the academic/teaching staff. Thus, we expect that a positive perceived context of alternative assessment can positively influence students’ learning behaviour.
Hypothesis 1
(H1): Enhancing students' perceived context of alternative assessment directly promotes their learning behaviour.
According to Gulikers (2006), students' perceptions of alternative assessment are a function of their experiences with it and its applicability to their future profession. Students' engagement in the assessment is determined by the motivational value of the task and its manageability (Lizzio and Wilson, 2013). Students' positive impressions of alternative assessment can be reinforced when it allows them to apply their knowledge in a real-world scenario (Craft and Ainscough, 2015). However, as authenticity is subjective, what an educator considers authentic might not match what students consider authentic (Ajjawi et al., 2020). Ojung'a and Allida (2017) found that students had a limited understanding of alternative assessment because their perceived learning environment was not informative, and such uncertainty can negatively affect students' engagement (Lau et al., 2020). We expect that a positive attitude towards alternative assessment positively mediates the relationship between the students' perceived context of alternative assessment and their learning behaviour.
Hypothesis 2
(H2): Enhancing students' perceived context of alternative assessment indirectly promotes their learning behaviour through the mediating role of attitude.
3. Methodology
This study used a convergent mixed-methods design (Cohen et al., 2011) to answer the research questions by collecting and analysing the quantitative (RQ1) and qualitative (RQ2 and RQ3) data concurrently. Data were collected by distributing online questionnaires to and interviewing students at a local university. The students' quantitative ratings were analysed with SPSS and AMOS to examine the relationships between their attitudes towards, perceived context of and learning behaviours in alternative assessment. The qualitative data were used to understand the students' perceptions of alternative assessment, specifically their understanding and awareness of it and their perceived teacher support, as well as their expectations for its implementation. The mixed-methods design offered an in-depth understanding of students' perspectives on assessment practices.
3.1. Instrument
3.1.1. Quantitative study
We designed a survey based on the three fundamental constructs of the ABC model, namely, attitude, perceived context and behaviour. Attitude consisted of eight items reflecting the major characteristics of alternative assessment: practicality, versatility, flexibility, learning motivation and metacognitive engagement (Berry, 2008; Wiggins, 1998). Higher marks represented students' greater acknowledgement of the effectiveness of alternative assessment. Perceived context consisted of six items that measured students' perceived implementation of alternative assessment in terms of the variety and content of the assessments, the use of assessment tools, and support from the academic/teaching staff. Higher marks indicated that students perceived a greater intensity of implementation. Learning behaviour consisted of seven items that measured students' behaviour in terms of their activeness, contributions, learning process and outcomes. Higher marks indicated greater engagement. The complete survey is shown in Table 1 below. All of the constructs were measured using a five-point Likert scale ranging from 1 (‘strongly disagree’) to 5 (‘strongly agree’). George and Mallery (2003) specified that an alpha of .90 or above indicates excellent internal consistency; Cronbach's alpha of the survey was .91, suggesting excellent reliability.
Table 1.
The survey's confirmatory factor analysis, Cronbach's alpha and mean.
| Item | Loading | Cronbach's alpha | M |
|---|---|---|---|
| Attitude | | 0.87 | 3.91 |
| I think that alternative/authentic assessment uses multiple methods to achieve an integrative assessment of my actual abilities and learning process. | 0.68 | | 3.85 |
| I think that alternative/authentic assessment provides me with greater flexibility and autonomy in undertaking assignments (e.g., in the form, source materials used, content and presentation of the assignment). | 0.61 | | 4.04 |
| I think that alternative/authentic assessment better engages and motivates me to learn than traditional assessment. | 0.75 | | 3.92 |
| I think that alternative/authentic assessment encourages my active participation in the assessment process, such as through self-assessment and peer assessment. | 0.62 | | 3.84 |
| I think that alternative/authentic assessment enables me to develop self-regulated learning (e.g., by recognising and monitoring my learning progress). | 0.71 | | 3.87 |
| I think that alternative/authentic assessment develops my higher-order competencies (e.g., analysis, synthesis, evaluation, creative and critical thinking, problem-solving, elaborative communication, collaboration and reflection). | 0.73 | | 4.02 |
| I think that alternative/authentic assessment enhances my self-confidence in handling real-life challenges. | 0.70 | | 3.80 |
| Context | | 0.87 | |
| A variety of alternative/authentic assessment methods have been adopted in my courses. | 0.72 | | 3.42 |
| Alternative/authentic assessments in my courses have enabled me to utilise what I have learned (e.g., knowledge, skills and attitudes) to undertake real-life and professional tasks. | 0.87 | | 3.69 |
| I think that the situations (e.g., cases, scenarios, issues and problems) in the alternative/authentic assessment methods in my courses have been similar to real-life and professional environments. | 0.79 | | 3.55 |
| The alternative/authentic assessments in my courses used various electronic tools or online platforms to help us complete the assignments. | 0.78 | | 3.65 |
| My course teachers gave me timely feedback after I completed the alternative/authentic assessments. | 0.66 | | 3.51 |
| Behaviour | | 0.85 | |
| My participation in alternative/authentic assessment has helped me to be active, constructive, interactive and reflective in the learning process. | 0.74 | | 3.78 |
| I am making good use of alternative/authentic assessment in my learning. | 0.79 | | 3.83 |
| I received a briefing about the alternative/authentic assessment, which helped me to achieve my intended purpose. | 0.67 | | 3.72 |
| I am making good use of the alternative/authentic assessment feedback for my learning. | 0.75 | | 3.83 |
| Alternative/authentic assessment motivates me to self-regulate my learning. | 0.65 | | 3.80 |
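The Cronbach's alpha values reported above can be reproduced from raw item responses. The following is a minimal sketch; the `ratings` matrix is hypothetical illustration data, not the study's responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (rows = students, columns = items)
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(ratings), 2))  # → 0.92
```

As a rule of thumb from George and Mallery (2003), values of .90 or above indicate excellent internal consistency and .80 or above good consistency.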
3.1.2. Qualitative study
Nine interview questions (Table 2) based on our research questions were developed to address the following issues: (a) students' understanding and awareness of alternative assessment; (b) students' perceived teacher support in relation to alternative assessment and (c) students’ expectations about the future implementation of alternative assessment.
Table 2.
Interview questions for student interviewees.
| Students' understanding and awareness of alternative assessment |
|---|
| 1) What kinds of alternative assessments did you experience in the 2020 to 2021 academic year? |
| 2) Compared with traditional assessment, how do you like alternative assessment? |
| 3) Please describe and comment on your learning experience in alternative assessment in terms of variety, application of knowledge, real-life scenarios, the teaching/electronic tools and feedback from teachers. |
| 4) After experiencing alternative assessment, which elements of alternative assessment do you think were important in facilitating your learning? How and why? |
| 5) How did the alternative assessment you experienced affect your participation and engagement in your studies? |
| Students' perceived teacher support in alternative assessment |
| 1) What kinds of support did the university and academic/teaching staff provide to facilitate your learning with alternative assessment? |
| 2) Do you think that the support provided by the university and academic/teaching staff was helpful, valid and sufficient? If yes, please explain. If not, please explain and provide suggestions for improvement. |
| Students' expectations about the future implementation of alternative assessment |
| 1) How do you think the university and academic/teaching staff could have improved your learning experience with alternative assessment? |
| 2) Did your learning experience with alternative assessment meet your expectations about alternative assessment? |
3.2. Procedure and participants
3.2.1. Quantitative study
Online questionnaires were distributed to all students in a university via Qualtrics. This study focused on the implementation of alternative assessments in courses undertaken in 2020 and 2021 while face-to-face classes were suspended because of the coronavirus pandemic. The university's Human Research Ethics Committee approved the study, and informed consent was obtained online from those who agreed to participate in this research. Online surveys were sent to 6,050 students, and 113 responses (2%) were obtained. The low response rate was due to the short implementation period of alternative assessment, which only began in 2020. Moreover, alternative assessment was not adopted throughout the university; thus, only a small group of students had any understanding or experience of it. Among the respondents, 26 were men and 87 were women. The demographic characteristics of the respondents are presented in Table 3.
Table 3.
Respondents’ demographic characteristics.
| Characteristic | Frequency (N = 113) | Percentage (%) |
|---|---|---|
| Gender | ||
| Women | 87 | 77.4 |
| Men | 26 | 22.6 |
| Year of Study | ||
| Year 1 | 11 | 9.7 |
| Year 2 | 51 | 45.1 |
| Year 3 | 27 | 23.9 |
| Year 4 | 15 | 13.3 |
| Year 5 or above | 9 | 8.0 |
| Study Mode | ||
| Full-time | 96 | 85.0 |
| Part-time | 17 | 15.0 |
| Programme Type | ||
| Higher Diploma | 4 | 3.5 |
| Postgraduate Diploma in Education Programme | 9 | 8.0 |
| Undergraduate Programme | 92 | 81.4 |
| Postgraduate Programme | 8 | 7.1 |
3.2.2. Qualitative study
Individual semi-structured interviews were conducted to collect qualitative data. Four undergraduate and two postgraduate students (two men and four women) were interviewed. The six students were anonymised with letters (students A to F). Each interview was conducted through Zoom, a videoconferencing platform that can securely record and store recordings. Each interview lasted approximately 30 min and was conducted in Cantonese and transcribed into English.
4. Results
4.1. Quantitative results
4.1.1. Descriptive statistics
Table 4 shows the means, standard deviations (SD) and correlations between the ABC model constructs. Each item was scored from 1 (strongly disagree) to 5 (strongly agree). The mean values of the three constructs fell between 3.5 and 4.0, implying that the students generally held a moderately positive attitude towards alternative assessment and that they experienced and engaged in alternative assessment in their courses at a moderately high level. The SD values for students' attitudes towards and learning behaviour in alternative assessment were below 0.6, indicating that responses in these areas varied little around the mean. However, the SD value for students' perceived context of alternative assessment was about 0.7, implying relatively high variation across participants and suggesting that the students' learning environments with respect to alternative assessment differed considerably. Moderate, significant positive intercorrelations were observed between the constructs. The reliability coefficients of each construct are shown in brackets. George and Mallery (2003) specified that an alpha of .80 or above indicates good internal consistency; Cronbach's alpha for the three constructs ranged from .85 to .87, indicating good reliability.
Table 4.
Descriptive statistics, correlations and reliabilities (in brackets) of the ABC model constructs
| Mean | SD | 1 | 2 | 3 | |
|---|---|---|---|---|---|
| 1. Attitude | 3.93 | 0.51 | (0.87) | ||
| 2. Perceived context | 3.61 | 0.72 | 0.33∗∗ | (0.87) | |
| 3. Learning behaviour | 3.81 | 0.52 | 0.66∗∗ | 0.61∗∗ | (0.85) |
Note: N = 113; ∗p < .05; ∗∗p < .01.
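The descriptive statistics in Table 4 are straightforward to reproduce from per-student construct scores (the mean of each construct's items). The sketch below uses simulated scores; the generating parameters and resulting values are illustrative only, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-student construct scores on a 1-5 Likert scale (n = 113)
attitude = np.clip(rng.normal(3.9, 0.5, 113), 1, 5)
context = np.clip(rng.normal(3.6, 0.7, 113), 1, 5)
behaviour = np.clip(0.4 * attitude + 0.3 * context + rng.normal(1.5, 0.3, 113), 1, 5)

# Means and sample standard deviations, as reported in Table 4
for name, x in [("attitude", attitude), ("context", context), ("behaviour", behaviour)]:
    print(f"{name}: M = {x.mean():.2f}, SD = {x.std(ddof=1):.2f}")

# Pearson intercorrelations between constructs
r_ab = np.corrcoef(attitude, behaviour)[0, 1]
r_cb = np.corrcoef(context, behaviour)[0, 1]
print(f"attitude-behaviour r = {r_ab:.2f}, context-behaviour r = {r_cb:.2f}")
```

In the study itself these quantities were computed in SPSS; the sketch only shows how the reported statistics relate to the underlying construct scores.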
4.1.2. Path analysis
AMOS version 27 was used to test the hypotheses regarding the relationships between students' attitudes towards, perceived context of and learning behaviours in alternative assessment. According to Cohen et al. (2002), the root mean square error of approximation (RMSEA) should be below .08. Comparative fit index (CFI) and Tucker–Lewis index (TLI) values greater than .95 indicate an excellent fit (Hu and Bentler, 1999). The model showed an excellent fit to the data (χ2 = 143.34, df = 114, p < .05, χ2/df = 1.26, TLI = .96, CFI = .97; RMSEA = .05). The results revealed the following significant paths: (a) from perceived context to attitude (β = .40, p < .001), (b) from attitude to learning behaviour (β = .59, p < .001) and (c) from perceived context to learning behaviour (β = .48, p < .001). In addition, 5,000 bootstrap resamples were used to investigate the mediation effects of the model (MacKinnon et al., 2004). Significant direct and indirect effects were found (Table 5), supporting both Hypotheses 1 and 2: enhancing students' perceived context of alternative assessment directly promotes their learning behaviour in alternative assessment, and it also indirectly promotes their learning behaviour through the mediating role of attitude. Figure 2 shows the final model developed in this study and its significant paths.
Table 5.
Summary of direct, indirect and total effects among the three model constructs.
| Path | Effect |
|---|---|
| 1. Direct effect of Perceived context on Attitude | .40∗∗ |
| 2. Direct effect of Attitude on Learning behaviour | .59∗∗ |
| 3. Direct effect of Perceived context on Learning behaviour | .48∗∗ |
| 4. Indirect effect of Perceived context on Learning behaviour | .24∗∗ |
| 5. Total effect of Perceived context on Learning behaviour | .72∗∗ |
Note: N = 113; ∗p < .05; ∗∗p < .01.
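The percentile-bootstrap mediation test used above (5,000 resamples; MacKinnon et al., 2004) can be sketched in Python. The data below are simulated for illustration, and simple least-squares regressions stand in for the AMOS path model, so the coefficients will not match the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated standardised scores: context -> attitude -> behaviour,
# plus a direct context -> behaviour path (illustration only)
n = 113
context = rng.normal(size=n)
attitude = 0.4 * context + rng.normal(scale=0.9, size=n)
behaviour = 0.5 * context + 0.6 * attitude + rng.normal(scale=0.6, size=n)

def indirect_effect(c, a, b):
    """Indirect effect a*b: a = slope of attitude on context,
    b = slope of behaviour on attitude controlling for context."""
    a_path = np.polyfit(c, a, 1)[0]               # context -> attitude
    X = np.column_stack([np.ones_like(c), c, a])  # intercept, context, attitude
    coefs, *_ = np.linalg.lstsq(X, b, rcond=None)
    b_path = coefs[2]                             # attitude -> behaviour
    return a_path * b_path

# Percentile bootstrap CI for the indirect effect (5,000 resamples)
boot = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, n, size=n)
    boot[i] = indirect_effect(context[idx], attitude[idx], behaviour[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect CI: [{lo:.2f}, {hi:.2f}]")  # mediation supported if CI excludes 0
```

A 95% confidence interval that excludes zero indicates a significant indirect (mediated) effect, which is the criterion behind the Table 5 results.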
Figure 2.
The attitude–behaviour–context alternative assessment model.
4.2. Qualitative results
We applied our proposed ABC alternative assessment model to the qualitative data to further investigate our research questions. A deductive and theory-driven thematic analysis was conducted. This approach allowed us to align the data to the significant paths revealed by the ABC model (Braun and Clarke, 2006). Moreover, it helped us pinpoint answers to our research questions. We developed the following themes: (a) the direct and indirect paths of the ABC alternative assessment model, (b) students' understanding and awareness of alternative assessment, (c) perceived teacher support and (d) expectations of future alternative assessment implementation. After thoroughly scrutinising the transcripts, the authors completed and refined the initial coding. The coding results were discussed and reviewed by the authors until a consensus was reached for all of the codes. The findings are described in the next section according to theme. Table 6 presents the demographics of the interviewees with details of their gender, age, year of study, the courses in which they experienced an alternative assessment and the assessment methods. Table 7 shows the themes, sub-themes and supporting comments of the qualitative analysis.
Table 6.
Demographics of the interviewees.
| Student | Gender | Age | Year of study | Course titles | Assessment methods |
|---|---|---|---|---|---|
| A | Woman | 19 | Undergraduate Year 1 | Creative teaching | Paper art1, reflective journal |
| | | | | Chinese language | Peer review |
| B | Woman | 20 | Undergraduate Year 1 | Positive education | Reflective journal |
| | | | | Field experience foundation course | Microteaching2 |
| C | Man | 24 | Undergraduate Year 4 | Consolidating undergraduate learning through university e-portfolio | E-Portfolio, reflective journal, presentation3 |
| D | Woman | 25 | Undergraduate Year 5 | Blog practise | Teaching practicum4, reflective journal, e-portfolio |
| E | Woman | 34 | Postgraduate Year 1 | Positive psychology | Presentation |
| | | | | Positive education | Case study essay5 |
| F | Man | 43 | Postgraduate Year 2 | The researcher–practitioner in professional and vocational education | Presentation |
| | | | | Blog practise | Teaching practicum |
Note: N = 6.
1. Paper art: Students designed art pieces with paper using the creativity theories they learned in class.
2. Microteaching: Students filmed themselves teaching a topic.
3. Presentation: Case studies or portfolio presentations were made in various forms, including individual, group and video formats.
4. Teaching practicum: Students were sent to local schools to teach in the classroom.
5. Case study essay: Students were asked to analyse and give suggestions about the implementation of a positive psychology intervention at a local school.
Table 7.
Themes, sub-themes and supporting comments from students.
| Themes | Sub-themes | Supporting comments |
|---|---|---|
| The direct and indirect paths of the ABC model | Direct effect of Perceived context on Attitude | ‘Although peer assessment was useful, I started losing interest in it because it was part of almost every course.’ Student A |
| | Direct effect of Attitude on Learning behaviour | ‘In the creativity course … was fun and interesting … I was always looking forward to that class.’ Student A |
| | Direct effect of Perceived context on Learning behaviour | ‘Teachers would regularly track our feedback personally or by electronic tools. I appreciated the teachers’ effort and was motivated to do more in the assessment.’ Student E |
| | Indirect effect of Perceived context on Learning behaviour | ‘The teaching practicum provided me with a chance to apply what I learned, and I gained real-life experience that traditional assessment cannot offer. Since there are no standard answers, I spent a lot of time preparing and improving my performance according to my teacher's opinions.’ Student D |
| | | ‘Writing a reflective journal about my teaching styles and strengths was quite useless for me as a freshman. The knowledge I acquired in the reflection would ultimately disappear since I'm not applying it in life any time soon. I wrote the journal just like writing an academic essay.’ Student B |
| Students' understanding and awareness of alternative assessment | Relevance to the real world | ‘As I work at a school, I could use the techniques I learned in class, like time management, content delivery and teaching plan design.’ Student F |
| | Developing higher-level skills | ‘Writing reflective journals has taught me how to discover and appreciate the creativity hidden in our daily lives.’ Student A |
| | The versatility of assessment | ‘I experienced several types of alternative assessment, including video presentation, practicum and e-portfolio.’ Student F |
| | Obtaining first-hand experience | ‘I learn the most when I have the chance to gain hands-on experience. When I worked on the paper art assignments, I could apply the creativity theories that I learned in class.’ Student A |
| | | ‘I wish we could visit the local schools more. Since there are no real students in microteaching, it is less useful. Why don't I just go teach at the tutoring centre where there are real students.’ Student B |
| Perceived teacher support of alternative assessment | Receiving sufficient support | ‘The coursework offered me sufficient knowledge to reflect on my area of study and thus I knew how to do the e-portfolio.’ Student F |
| | Receiving insufficient support | ‘The practicum guidelines concerning the format and hours were unclear. The department staff failed to answer our questions.’ Student D |
| Expectations of future alternative assessment implementation | Offering more coursework samples | ‘Providing more samples would give me a clearer idea of the teacher's expectations.’ Student E |
| | Including more informal activities before official evaluation | ‘I wish my teacher would monitor my teaching without grading my performance in the beginning stage of practice so I can have a chance to improve my teaching before the formal evaluation.’ Student D |
| | Introducing more varieties of alternative assessment | ‘I would like to see more varieties other than writing, like drawing a mind map or making videos or recordings ... Some students are not good at writing. Including other varieties would give us an opportunity to use our strength.’ Student C |
4.2.1. Views on direct and indirect paths of the ABC alternative assessment model
The interviews provided in-depth information about the significant paths identified in the ABC alternative assessment model. For example, the qualitative analysis showed that some students perceived the content of the implemented assessment to be meaningful and useful in real life, resulting in their having a positive attitude towards the assessment and increasing their active engagement. As student D stated, ‘The teaching practicum provided me with a chance to apply what I learned, and I gained real-life experience that traditional assessment cannot offer. Since there are no standard answers, I spent a lot of time preparing and improving my performance according to my teacher's opinions.’ If students cannot relate what they learn to their life, they might perceive the context of the assessment to be meaningless, which could further weaken their motivation and involvement. As student B stated, ‘Writing a reflective journal about my teaching styles and strengths was quite useless for me as a freshman. The knowledge I acquired in the reflection would ultimately disappear since I'm not applying it in life any time soon. I wrote the journal just like writing an academic essay.’ Meanwhile, some students perceived a rich variety of alternative assessments, which aroused their learning interest and hence promoted their learning behaviour. For example, student A mentioned, ‘In the creativity course, there were many different types of alternative assessment activities, like filling in surveys and making paper art. I have never had lessons like that. It was fun and interesting … I was always looking forward to that class.’ However, some students voiced that their perceptions of the assessment became negative when the assessment was monotonous. As student A mentioned, ‘Although peer assessment was useful, I started losing interest in it because it was part of almost every course. I hope that there will be other ways to evaluate my performance.’ In addition, some students perceived the alternative assessment implementation to be instructive and supportive, which further increased their efforts to learn. As student E stated, ‘Teachers would regularly track our feedback personally or by electronic tools. I appreciated the teachers’ effort and was motivated to do more in the assessment.’
4.2.2. Students’ understanding and awareness of alternative assessment
Relevance to the real world substantially contributed to the students' understanding of the alternative assessment. Four students thought that the alternative assessment provided them with an opportunity to experience their future profession. For example, student D stated, ‘The teaching practicum allowed me to examine different teaching techniques I learned from class … I could also try my own methods to see whether the students liked them or not.’ Student F similarly stated, ‘As I work at a school, I could use the techniques I learned in class, like time management, content delivery and teaching plan design.’ In addition, the students believed that alternative assessment could help them develop higher-level skills such as problem-solving and communication. These skills may help them become life-long learners and might exert a positive, long-lasting impact on their lives. For instance, student A mentioned, ‘Writing reflective journals has taught me how to discover and appreciate the creativity hidden in our daily lives. I will maintain this habit even after finishing the course.’
In terms of the implementation, every student was aware that they had experienced at least two types of alternative assessment. Most of the alternative assessments involved various realistic scenarios, such as writing or making video presentations. Some assessments were peer assessments, portfolios, hands-on activities or professional practicums. Overall, the students preferred other types of assessments over writing assessments. They believed that the experience gained through completing an alternative assessment should build upon exposure to real-life situations. Therefore, they considered obtaining first-hand experience a vivid and effective way to learn. As student A mentioned, ‘I learn the most when I have the chance to gain hands-on experience. When I worked on paper art assignments, I could apply the creativity theories that I learned in class.’ Student B also stated, ‘I wish we could visit the local schools more. Since there are no real students in microteaching, it is less useful. Why don't I just go teach at the tutoring centre where there are real students.’
4.2.3. Perceived teacher support of alternative assessment
All of the students valued the support they received from the university and academic/teaching staff in relation to alternative assessments. This support took various forms, such as providing instructions and coursework to equip the students with the knowledge and ability to complete the assessment, offering samples as a reference and giving feedback. For instance, student F mentioned, ‘The coursework offered me sufficient knowledge to reflect on my area of study and thus I knew how to do the e-portfolio.’ However, three students reported that they received insufficient support. For instance, student D mentioned, ‘The practicum guidelines concerning the format and hours were unclear. The department staff failed to answer our questions. This increased my workload unnecessarily and caused me to teach with uncertainty.’ Such frustration could negatively influence students' learning experience.
4.2.4. Expectations about future implementation of alternative assessment
As alternative assessment was new to the students, two students wanted to see more samples to gain insight into how others performed. Student E said, ‘Providing more samples would give me a clearer idea of the teacher's expectations.’ The students would also like to see more informal activities so they can gain more experience without worrying about receiving a low grade. Student D stated, ‘I wish my teacher would monitor my teaching without grading my performance in the beginning stage of practice so I can have a chance to improve my teaching before the formal evaluation.’ Moreover, three students expressed their desire to experience more varieties of alternative assessment. As student C asserted, ‘I would like to see more varieties other than writing, like drawing a mind map or making videos or recordings ... Then I would be more motivated to do the assignment. Some students are not good at writing. Including other varieties would give us an opportunity to use our strength.’
5. Discussion and practical implications
This study's findings show that students' perceived context of alternative assessment significantly influences their learning behaviour. In addition, attitude partially mediates the relationship between perceived context and learning behaviour. The mixed-methods study design allowed the qualitative investigation to enrich the quantitative findings. For example, the qualitative findings showed that some students perceived the context of alternative assessment as supportive, which further enhanced their positive attitude towards the assessment and increased their engagement in learning. These findings answered RQ1 concerning the relationships between students' attitudes towards, perceived context of and learning behaviours in alternative assessment. They echoed those of other studies, such as Lizzio et al. (2002) and Kurtz et al. (2019), which found that if students perceive an assessment to be appropriate and teaching to be empathic, motivating, understandable and helpful, they are more likely to adopt a deep approach to learning. A deep approach to learning is described as seeking complex understanding of and meaning in the course content; conversely, a surface approach—led by a negative perception of the assessment and teaching—merely involves reproducing knowledge without making an effort to integrate the information. Importantly, a deep approach contributes to students' qualitative learning outcomes such as general academic and workplace skills. To improve students' perceived context of alternative assessment, we believe that clear instructions and feedback are critical to forming a supportive assessment context (e.g., Fox et al., 2017). For example, clarifying the grading system might mitigate students' uncertainty and anxiety and thus help them to perform better in the assessment (Litchfield and Dempsey, 2015).
Additionally, Van Wyk (2017) found that by offering constructive feedback, teachers supported education students in their alternative assessment, which required the students to engage in practice teaching and write a reflective journal. In this way, the education students could self-monitor their teaching performance and successfully develop their own teaching philosophy. Furthermore, the students' higher-order thinking skills, such as critical thinking, were developed through the process of articulating and rethinking their practice in conjunction with the teachers' feedback (Van Wyk, 2017).
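As a purely illustrative aside, the partial-mediation pattern discussed above (perceived context → attitude → learning behaviour) can be sketched as a bootstrapped indirect-effect test in the spirit of MacKinnon et al. (2004). The data, variable names and effect sizes below are synthetic assumptions for demonstration only; they are not the study's data or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data mimicking the mediation structure:
# perceived context (C) -> attitude (A) -> learning behaviour (B),
# plus a direct path C -> B (partial mediation).
n = 113  # same order of magnitude as the study's questionnaire count
C = rng.normal(size=n)
A = 0.5 * C + rng.normal(scale=0.8, size=n)          # assumed a-path
B = 0.3 * C + 0.4 * A + rng.normal(scale=0.8, size=n)  # assumed b- and c'-paths

def ols_slopes(y, X):
    """Return OLS slope coefficients for y ~ X (intercept included, then dropped)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta[1:]

def indirect_effect(C, A, B):
    a = ols_slopes(A, C.reshape(-1, 1))[0]                 # C -> A
    b = ols_slopes(B, np.column_stack([A, C]))[0]          # A -> B, controlling for C
    return a * b

# Percentile bootstrap CI for the indirect effect a*b.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(C[idx], A[idx], B[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(C, A, B):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

The indirect effect is the product of the context→attitude slope and the attitude→behaviour slope (controlling for context); a bootstrap confidence interval that excludes zero would support mediation, while a remaining significant direct path indicates the mediation is partial.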
This study's qualitative findings contradicted those of other studies in terms of students' understanding and awareness of alternative assessment. Some studies have suggested that university students have a fixed mindset about assessment approaches and that they prefer traditional assessments such as paper-based exams and essays because they are more familiar with these methods (e.g., Adie et al., 2010; Lau et al., 2020). However, in this study, the students could identify the advantages of the alternative assessment based on their learning experience, such as its practicality and real-world relevance. Unlike traditional assessment, alternative assessment highlighting real-world learning allows students to develop practical hands-on skills (Sheila et al., 2021). Although our quantitative study found that the students generally perceived teacher support to be somewhat low, the qualitative study provided in-depth information about the various levels of teacher support perceived by the students in relation to the assessment. Specifically, whereas some students appreciated the careful guidance they received from their teachers, others felt frustrated because of the unclear instructions and lack of feedback. These findings answered RQ2 regarding students' understanding and awareness of alternative assessment and perceived teacher support in their learning environment. Students' perceptions of alternative assessment are critical to how they decide to engage with it. For example, Lau et al. (2020) found that students who had a negative attitude towards making a concept map preferred traditional assessments; their interest and engagement in the alternative assessment were thus weakened. Khoury (2022) observed that translation students found peer assessment difficult because they were not familiar with the interactive computer-assisted translation tools and lacked the confidence and skills to conduct the task.
Cornelius and Kinghorn (2014) also found that first-year students were generally unfamiliar with self- and peer-assessment. To raise students' acceptance of and participation in the assessment, teachers designed materials and in-class activities that allowed students to become familiar with the assessment gradually. Teachers not only provided check boxes to guide the students during the assessment process but also arranged pair activities in which students commented on each other's work in a comparatively stress-free environment. In the end, students generally recognised peer assessment as a valuable language-learning tool and expressed their willingness to participate in these forms of assessment. Therefore, to improve student perceptions of alternative assessment and ultimately promote student engagement, universities should pinpoint and address students' concerns by increasing their understanding of alternative assessment and offering follow-up support.
Finally, RQ3, concerning students' expectations for improving alternative assessment, was addressed through the student interviews: most of the students expected to receive more information from their teachers and a greater variety of alternative assessment methods. As alternative assessment was newly introduced to the university, it is reasonable that the students' expectations were not completely met. A gap between student expectations and teachers' implementation has often been observed (e.g., Henderson et al., 2019). For example, Gulikers (2006) found that students highly value the feedback process because it offers them information to improve. However, teachers' busy schedules often prevent them from giving students thorough feedback, which can lead to student dissatisfaction with learning because students cannot use the feedback to improve their performance (Henderson et al., 2019). In addition, Kong et al. (2021) reported that teachers used many online technology tools to facilitate the implementation of alternative assessment in universities, with written assessments taking various forms, such as posters, e-portfolios and peer reviews. However, students in the study of Ajjawi et al. (2020) were not convinced that writing-based alternative assessments could fully reflect their developed skills and abilities; hence, they were frustrated by the lack of recognition for their attainment of general skills. To narrow the gap between student expectations and university implementation, universities should promote and require the adoption of alternative assessment by teachers. As teachers understand changes in educational practice differently, university administrators could do more to promote the philosophy and goals behind alternative assessment methods (Brezicha et al., 2015).
In addition, universities can provide a platform and network for teachers to discuss their alternative assessment approaches, exchange ideas and receive support from colleagues (Brezicha et al., 2015).
6. Conclusion and limitations
In this mixed-methods study, we used an ABC model to examine university students' perceptions of alternative assessment. The quantitative and qualitative studies both showed that the students' perceived context influenced their learning behaviour directly and indirectly through the mediating role of attitude. Although the students believed that alternative assessment was a practical and rewarding learning tool, they also encountered difficulties with it, such as a lack of teacher support. The students' perceived context of alternative assessment had room for improvement; consequently, their attitudes towards and engagement in alternative assessment were also not high. The practical implication of this study is that universities need to invest more effort in shifting from traditional test-based assessment to alternative assessment to provide a better learning experience for students. Universities should investigate students' worries about the assessment and address their concerns accordingly. In our case, the insufficient teacher support and monotonous assessment types reported by students prevented the university from implementing alternative assessment effectively. Overcoming these barriers requires effort from both teachers and universities. Teachers should provide detailed instructions, such as clarifying the grading system, and offer constructive feedback to alleviate students' uncertainty and anxiety. Universities should offer professional development to increase teachers' knowledge of how to give constructive feedback and deliver the assessment in richer forms. Universities can also build a communication platform for teachers so that implementation experiences can be shared and learned from.
It is hoped that increasing students' exposure to, and teachers' knowledge of, alternative assessment will improve students' awareness of and attitudes towards it, thereby maximising its benefits for students' learning.
This study has three limitations that could be addressed in future work. First, as alternative assessment had only recently been implemented among a small population at the university, the sample size was relatively small and the response rate was low. Therefore, the findings are generalisable only to a certain extent within this university. Second, the application of the ABC model is also limited by the small sample size. The original ABC model predicts that the attitude–behaviour link is strongest when contextual factors are at a moderate level; however, we did not have enough data to test this prediction in our research context. The third limitation concerns the design of the survey. As the questionnaire consisted of only positively worded items, the absence of negatively worded items may have introduced response bias. In addition, the third item in the behaviour section was double-barrelled, asking two questions at once, which may have forced students to answer inaccurately and thus lowered the validity of the questionnaire. Future studies should recruit more university students to better clarify the relationships between students' attitudes, behaviours and perceived contextual factors after a university implements alternative assessment. Future surveys should also include a mixture of positively and negatively worded items to reduce response bias, with each item asking only one question.
Declarations
Author contribution statement
Siu-Cheung KONG: Conceived and designed the experiments; and contributed reagents, materials, analysis tools and data.
Cheuk-Nam Yuen: Performed the experiments; analyzed and interpreted the data; and wrote the paper.
Funding statement
This work was supported by the Teaching Development Grant (TDG), provided by The Education University of Hong Kong.
Data availability statement
Data will be made available on request.
Declaration of interests statement
The authors declare no conflict of interest.
Additional information
No additional information is available for this paper.
References
- Adie L.E., Hee L., Wharton L. In: Emerging Trends in Higher Education Learning and Teaching: Proceedings of the TARC International Conference on Learning and Teaching. Chin S.N., editor. Tunku Abdul Rahman College; Malaysia: 2010. Incorporating authentic assessment into different university learning scenarios; pp. 82–90.
- Ahmad S., Sultana N., Jamil S. Behaviourism vs constructivism: a paradigm shift from traditional to alternative assessment techniques. J. Appl. Ling. Lang. Res. 2020;7:19–33.
- Ajjawi R., Tai J., Huu Nghia T.L., Boud D., Johnson L., Patrick C.J. Aligning assessment with the needs of work-integrated learning: the challenges of authentic assessment in a complex context. Assess Eval. High Educ. 2020;45(2):304–316.
- Ashford-Rowe K., Herrington J., Brown C. Establishing the critical elements that determine authentic assessment. Assess Eval. High Educ. 2014;39(2):205–222.
- Atifnigar H., Alokozay W., Takal G.M. Students’ perception of alternative assessment: a systematic literature review. Int. J. Ling. Liter. Trans. 2020;3(4):228–240.
- Berry R. Hong Kong University Press; 2008. Assessment for Learning.
- Braun V., Clarke V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006;3(2):77–101.
- Brezicha K., Bergmark U., Mitra D.L. One size does not fit all: differentiating leadership to support teachers in school reform. Educ. Adm. Q. 2015;51(1):96–132.
- Brown S. A review of contemporary trends in higher education assessment. @tic. Rev. Innovació Educ. 2015;14:43–49.
- Brown S., Sambell K. 2020. Contingency planning: exploring rapid alternatives to face-to-face assessment. Sally-brown.net. https://sally-brown.net/download/3122/
- Chung S.J., Choi L.J. The development of sustainable assessment during the Covid-19 pandemic: the case of the English language program in South Korea. Sustainability. 2021;13(8):4499.
- Cohen J., Cohen P., West S.G., Aiken L.S. third ed. Routledge; 2002. Applied Multiple Regression/Correlation Analysis for the Behavioural Sciences.
- Cohen L., Manion L., Morrison K. sixth ed. Routledge; London: 2012. Research Methods in Education.
- Cornelius S., Kinghorn O. Student attitudes towards self and peer assessment in Japanese university first year EFL classes. Foreign Lang. Educ. Forum. 2014;13:1–10.
- Craft J., Ainscough L. Development of an electronic role-play assessment initiative in bioscience for nursing students. Innovat. Educ. Teach. Int. 2015;52(2):172–184.
- Darling-Hammond L., Snyder J. Authentic assessment of teaching in context. Teach. Teach. Educ. 2000;16:523–545.
- Fook C.Y., Sidhu G.K. Authentic assessment and pedagogical strategies in higher education. J. Soc. Sci. 2010;6(2):153–161.
- Fox J., Murphy V., Freeman S., Hughes N. ‘Keeping it real’: a review of the benefits, challenges and steps towards implementing authentic assessment. All Ireland J. Higher Educ. 2017;9(3):3232–3239.
- George D., Mallery P. fourth ed. Allyn & Bacon; 2003. SPSS for Windows Step by Step: A Simple Guide and Reference, 11.0 Update.
- Guagnano G.A., Stern P.C., Dietz T. Influences on attitude-behaviour relationships: a natural experiment with curbside recycling. Environ. Behav. 1995;27(5):699–718.
- Gulikers J. Datawyse/Universitaire Pers Maastricht; 2006. Authenticity Is in the Eye of the Beholder. Beliefs and Perceptions of Authentic Assessment and the Influence on Student Learning.
- Henderson M., Ryan T., Phillips M. The challenges of feedback in higher education. Assess Eval. High Educ. 2019;44(8):1237–1252.
- Hu L., Bentler P.M. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct. Equ. Model. 1999;6(1):1–55.
- James L.T., Casidy R. Authentic assessment in business education: its effects on student satisfaction and promoting behaviour. Stud. High Educ. 2018;43(3):401–415.
- Khoury O. Perceptions of student-centered learning in online translator training: findings from Jordan. Heliyon. 2022;8(6) doi: 10.1016/j.heliyon.2022.e09644.
- Kong S.C., Lam S.S.M., Lam W.W.M., Lau K.K.M., Yee L.T.S., Yung S.N. Four case studies on applying alternative assessments in higher education with technology tools to facilitate teaching and learning. ICERI2021 Proc. 2021:8843–8850.
- Kurtz J.B., Lourie M.A., Holman E.E., Grob K.L., Monrad S.U. Creating assessments as an active learning strategy: what are students' perceptions? A mixed methods study. Med. Educ. Online. 2019;24(1):1630239. doi: 10.1080/10872981.2019.1630239.
- Lau P.N., Chua Y.T., Teow Y., Xue X. Implementing alternative assessment strategies in chemistry amidst COVID-19: tensions and reflections. Educ. Sci. 2020;10(11):1–15.
- Litchfield B.C., Dempsey J.V. Authentic assessment of knowledge, skills, and attitudes. N. Dir. Teach. Learn. 2015;2015(142):65–80.
- Lizzio A., Wilson K., Simons R. University students' perceptions of the learning environment and academic outcomes: implications for theory and practice. Stud. High Educ. 2002;27(1):27–52.
- Lizzio A., Wilson K. First-year students' appraisal of assessment tasks: implications for efficacy, engagement and performance. Assess Eval. High Educ. 2013;38(4):389–406.
- MacKinnon D.P., Lockwood C.M., Williams J. Confidence limits for the indirect effect: distribution of the product and resampling methods. Multivariate Behav. Res. 2004;39(1):99–128. doi: 10.1207/s15327906mbr3901_4.
- Muthohharoh S., Linggar-Bharati D.A., Rozi F. The implementation of authentic assessment to assess students’ higher order thinking skills in writing at MAN 2 Tulungagung. Engl. Educ. J. 2020;10(3):374–386.
- Nikolova N., Andersen L. Creating shared value through service-learning in management education. J. Manag. Educ. 2017;41(5):750–780.
- Ojung’a J., Allida D. A survey of authentic assessments used to evaluate English language learning in Nandi central sub-county secondary schools, Kenya. Baraton Interdiscipl. Res. J. 2017;7:1–11.
- Palmer S. Authenticity in assessment: reflecting undergraduate study and professional practice. Eur. J. Eng. Educ. 2004;29(2):193–202.
- Putri N.S.E., Pratolo B.W., Setiani F. The alternative assessment of EFL students’ oral competence: practices and constraints. Ethical Lingua: J. Lang. Teach. Learn. 2019;6(2):72–85.
- Reddy P., Chaudhary K., Sharma B., Chand R. The two perfect scorers for technology acceptance. Educ. Inf. Technol. 2020;26(2):1505–1526.
- Sheila N.A., Zhu C., Kintu M.J., Kataike J. Assessing higher education institutional stakeholders' perceptions and needs for community engagement: an empirical evidence from Uganda. Heliyon. 2021;7(4) doi: 10.1016/j.heliyon.2021.e06612.
- Siow L.F. Students' perceptions on self- and peer-assessment in enhancing learning experience. Malays. Online J. Educ. Sci. 2015;3(2):21–35.
- Sokhanvar Z., Salehi K., Sokhanvar F. Advantages of authentic assessment for improving the learning experience and employability skills of higher education students: a systematic literature review. Stud. Educ. Eval. 2021;70.
- Stern P. Toward a coherent theory of environmentally significant behaviour. J. Soc. Issues. 2000;56(3):407–424.
- Svinicki M.D. Authentic assessment: testing in reality. N. Dir. Teach. Learn. 2004;2004(100):23–29.
- Van Wyk M.M. Student teachers’ views regarding the usefulness of reflective journal writing as an eportfolio alternative strategy: an interpretive phenomenological analysis. Gender Behav. 2017;15(4):10208–10219.
- Wiewiora A., Kowalkiewicz A. The role of authentic assessment in developing authentic leadership identity and competencies. Assess Eval. High Educ. 2018;44(3):415–430.
- Wiggins G. first ed. Jossey-Bass; 1998. Educative Assessment: Designing Assessments to Inform and Improve Student Performance.
- Xu X., Maki A., Chen C.F., Dong B., Day J.K. Investigating willingness to save energy and communication about energy use in the American workplace with the attitude-behaviour-context model. Energy Res. Social Sci. 2017;32:13–22.
- Zepeda L., Deal D. Organic and local food consumer behaviour: alphabet theory. Int. J. Consum. Stud. 2009;33(5):697–705.