Advances in Health Sciences Education. 2011 May 5;16(4):529–545. doi: 10.1007/s10459-011-9298-z

Self-reflection and academic performance: is there a relationship?

Magdeleine D N Lew 1, Henk G Schmidt 2
PMCID: PMC3167369  PMID: 21544551

Abstract

The purposes of the present study were two-fold: first, to evaluate whether reflection journal writing was effective in promoting self-reflection and learning, and whether students become better at self-reflection if they engage continuously in reflection journal writing; and second, to investigate which type of reflection (self-reflection on how learning took place and/or on what was learned) was more effective in promoting learning and academic achievement. To that end, the reflection journals of 690 first-year applied science students at a local polytechnic were studied by means of an automated coding procedure using software. Data were collected twice, once at the beginning and again towards the end of an academic year. Outcomes of the textual content analyses revealed that students reflected on both the process and contents of their learning: critical review of past learning experiences, learning strategies, and summaries of what was learned. Correlational analyses showed weak to moderate inter-relationships between the textual categories and students' classroom and knowledge acquisition test grades. Taken together, the findings suggest that self-reflection on both how and what students have learned does lead to improvements in academic performance, although to a limited extent.

Keywords: Self-reflection, Reflection journals, Classroom performance grades, Academic performance

Introduction

The role of reflection in education has attracted an upsurge of interest amongst educators and researchers since Dewey’s (1991) ground-breaking work, which emphasized the positive roles that reflection might play in fostering students’ self-reflection, critical thinking, and the demonstrable development of professional values or skills. Self-reflection (or simply, reflection) has received numerous definitions from different sources in the literature. Dewey defined reflection as “active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusion to which it tends” (p. 9). Mann et al. (2009) suggest that Dewey’s definition of reflection shares similarities with our understanding of critical thinking. Boud et al. (1985) aptly define reflection in the context of learning and focus more on one’s personal experience as the object of reflection, referring to “those intellectual and affective activities that individuals engage in to explore their experience, which leads to new understanding and appreciations” (p. 19). The definition of reflection by Moon (1999), on the other hand, focuses more on the role of reflection in learning, and embeds reflection into the learning process. She describes reflection as “a form of mental processing with a purpose and/or anticipated outcome that is applied to relatively complex or unstructured ideas for which there is not an obvious solution” (p. 23). All three definitions, though focused on different contexts, share similarities in that they emphasize purposeful critical analysis of knowledge and experience so as to achieve deeper meaning and understanding.

The definitions of self-reflection, though heterogeneous, are united in their advocacy of improving student learning. In the present study, self-reflection is informed by these interpretations. It refers to the processes that a learner undergoes to look back on his past learning experiences and on what he did to enable learning to occur (i.e. self-reflection on how learning took place), and to explore connections between the knowledge that was taught and the learner’s own ideas about it (i.e. self-reflection on what was learned). Since processes such as these can lead to informed and thoughtful deliberation on one’s behaviours and actions, they are believed to assist learners to become better at self-reflection, which subsequently leads to better academic achievement.

Reflection and problem-based learning

Problem-based learning (PBL) tends to be characterized by students working collaboratively in small groups, with learning centred on problems relevant to the students’ domain of study and much time spent on self-directed learning. In PBL, students learn by solving problems and reflecting on their experiences (Hmelo-Silver 2004). Reflecting on the relationship between problem solving and learning is a critical component of PBL and is needed to support the construction of extensive and flexible knowledge (Salomon and Perkins 1989). According to Salomon and Perkins, self-reflection helps students to (a) review the group process and their own personal functioning in the group, (b) understand how their learning and problem-solving strategies might be reapplied, and (c) relate new knowledge to prior understanding (i.e. contents that were discussed and taught). PBL incorporates reflection several times throughout the learning process, particularly when completing a problem. At the completion of a problem, students reflect on what they have learned, how well they collaborated with the group, and how effectively they directed their learning. As such, students learn self-reflection as they become proficient in assessing their own progression in learning.

In her work, Hmelo-Silver (2004) highlighted that while a tutor can support self-reflection in PBL, other techniques may also be helpful. One approach to improving self-reflection is through the use of reflection journals.

Reflection journals, self-reflection and academic achievement

Self-reflection’s currency as a topic of educational importance has resulted in the incorporation of reflection journals, as learning tools that promote reflection, into many curricula, including PBL (Mann et al. 2009). Reflection journal writing is believed to enable students to critically review the processes of their own learning and behaviours, and to understand their ability to transform their own learning strategies (Gleaves et al. 2008). Reflection journals are variously referred to as “reflective journals” (e.g. Chirema 2007), “reflective learning journals” (e.g. Thorpe 2004) or “learning journals” (e.g. Moon 1999). Although used in a variety of courses, reflection journals are essentially written records that students create over a period of time, as they think about various concepts learned, about critical incidents involving their learning, or about interactions between students and teachers, for the purpose of gaining insights into their own learning (Thorpe 2004). The purposes of reflection journal writing include critically reviewing one’s own and others’ learning behaviours (e.g. strengths and weaknesses; learning styles and strategies) (Weinstein and Mayer 1986), setting or tracking learning goals (i.e. reflecting on how learning took place) (Lew and Schmidt 2011), and exploring connections between the knowledge that was learned and students’ own ideas about it (Moon 1999). It is hoped that through reflecting and writing about new information or ideas, learners can better understand and remember them. In addition, articulating connections between new information, ideas, and prior or existing knowledge also deepens learning (O’Rourke 1998).

The literature reports a positive association between journal keeping and learners’ cognitive skills. McCrindle and Christensen (1995) explored the impact of reflection journal writing on the cognitive processes and academic performance of forty undergraduates in a first-year biology course. Students were randomly assigned to a learning journal (experimental) group or a scientific report (control) group. Their findings demonstrate that students in the experimental group used more cognitive strategies during a learning task than those in the control group. Students who kept learning journals also showed more sophisticated conceptions of learning, greater awareness of cognitive strategies, and constructed more complex and related knowledge structures when learning from text. They also performed significantly better on the final examination for the course. While the data from this study are suggestive, the precise nature of the relationships between students’ conceptions of learning and their cognitive processes remains unclear, and more research is required to explicate these links.

The literature also offers evidence that students, regardless of their domain of study, become better at self-reflection through journal keeping, although they do not necessarily earn higher test grades. For instance, Selfe et al. (1986) investigated the use of reflection journals in a college-level mathematics course. Their findings suggest that while reflection journals did not necessarily assist students in earning high grades on achievement tests, they did assist students in developing abstract thinking, thereby enabling them to better conceptualize the meaning of technical definitions. Students appeared to develop better problem-solving strategies through writing, as compared to merely memorizing calculations. In addition, students also showed improvements in their reflective writing skills; for instance, they were able to develop personal conceptual definitions that were more understandable than the technical definitions in the texts. The findings by Selfe and colleagues were mirrored in the work of Moon (1999), who summarized a number of studies that examined the effects of reflection journal writing on student academic achievement across a variety of disciplines. She reported that some studies showed effects, whilst others did not. Like Selfe and colleagues, Moon concluded that the influence of journal keeping on student academic performance was subtle and did not seem to assist students in achieving better achievement test grades. However, this conclusion could be due to small sample sizes and poor measurement of the content of students’ journal responses in the studies reported.

The evidence to support and inform the curricular intervention of reflection journal writing as a means to improve students’ self-reflection, and thus academic achievement, remains largely theoretical. In addition, most existing studies involved only a limited number of participants, and students’ reflection journals were usually rated by teachers, so any conclusions derived may be overly subjective. To maximize the validity of our findings, we did not rely on the reflection journals of a small, selected group of students. Instead, we collected data from 690 first-year students of a polytechnic, and analyzed the data objectively by subjecting students’ journal responses to an automated coding procedure using software (Lew and Schmidt 2011).

Aims of the study

The students in our study repeatedly had to reflect on how and what they had learned as the semester unfolded, and received continuous feedback from their teachers on their performance. The purposes of the present study were two-fold. First, we evaluated whether reflection journal writing was effective in promoting self-reflection and learning, and whether students become better at self-reflection if they engage continuously in reflection journal writing. It was hypothesized that self-reflection and academic achievement influence each other interactively: by looking back on how and what they have learned, students develop better self-reflection skills, which subsequently lead them to perform better in the classroom or on knowledge acquisition tests. Second, we investigated which type of reflection (i.e. self-reflection on how learning took place and/or on what was learned) was more effective in promoting learning and thus academic achievement. To that end, students’ reflection journals were compared with their classroom performance and academic test grades over an academic year.

Method

Subjects

Participants included 690 applied science students in their first year of studies at a polytechnic in Singapore in the academic year 2007–2008. They were enrolled in three-year science diploma courses such as Biomedical Sciences, Pharmaceutical Sciences and Biotechnology. Of these students, 426 (62%) were females and 264 (38%) were males, and their mean age was 17.21 years (SD = 1.28).

Educational context

Problem-based learning

The polytechnic at which the research was carried out organizes its curriculum according to the principles of problem-based learning (Schmidt and Moust 2000). Students work collaboratively in teams of four to five, with learning centred on problems relevant to their domain of study. They work on one problem each day. The problem is initially discussed in the morning, followed by individual study. At the end of the day, the information gathered is shared and elaborated upon. No didactic teaching takes place, nor is there any form of direct instruction. One tutor supervises the student teams in a larger classroom; his or her role is to facilitate student learning (Alwis 2007). There are two semesters in an academic year, each lasting 16 weeks. All the courses offered are part of a three-year curriculum.

Data collection versus assessment in the curriculum

The daily assessment approach consists of four elements: (1) a classroom performance grade awarded by the tutor based on how well a student has performed during the day, (2) an activity in which a student assesses his or her own performance for the day, (3) an activity in which a student assesses his or her team mates’ performances for the day, and (4) a reflection journal to be written by each student. The classroom performance grade is based on tutors’ observations of students’ daily learning processes. These observations include students’ self-directedness, level of participation (including teamwork), their ability to reason, justify and defend opinions and ideas formulated in response to problems, as well as their problem-solving skills. Tutors then award grades ranging from “A” to “F”, derived from what they observe and the impression they form of each student during the time spent with him or her. Tutors also take into consideration students’ individual reflection journals (short essays which document students’ reflections on daily learning) and their self- and peer-assessments when awarding grades. Furthermore, tutors provide feedback to students on their learning outcomes and daily learning processes.

The reflection journal records a student’s reflections of daily learning in response to a reflection journal question provided by the tutor. Each student is required to respond to one journal question per day. The student submits his/her reflection journal electronically by means of an online platform by the end of the day. Tutor-asked journal questions required students to be reflective about their learning and development. Some examples of reflection journal questions include “Discuss your effectiveness as a team player/leader in solving the problem today.”, “What insights did I gain today?”, “How can you apply some of the skills and knowledge that you have learned?”, “What strategies have I used to help me in my learning?” and so on. Students respond to a different reflection journal question each day during a 5-day workweek. The purpose of writing the reflection journal is to encourage and record self-reflection about how learning took place and what was learned. Some examples of students’ journal responses are contained in the appendix section.

Students also need to take four knowledge acquisition tests per module, which are taken at different points (i.e. after every 3–4 weeks) during the semester. The tests are conducted in a supervised environment, similar to an end-of-course examination and require students to answer at least three open-structured questions. Students are tested on their ability to understand and apply what they have learned. The knowledge acquisition test grades range from “A” to “F”.

Procedure

The classroom performance and knowledge acquisition grades were first converted to scaled numerical values on a five-point scale. The averages of the knowledge acquisition grades for semesters 1 and 2 were computed and used for the analyses.
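As a concrete illustration, a minimal sketch of this conversion step is shown below. The exact letter-to-number mapping is not reported in the paper; the mapping A = 5 through F = 1 and the helper names are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' actual procedure): convert letter grades to a
# five-point numerical scale and average the per-semester knowledge test grades.
# The letter-to-number mapping below is an assumption for illustration only.
GRADE_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "F": 1}

def to_points(letter: str) -> int:
    """Map a letter grade to its assumed numerical value."""
    return GRADE_POINTS[letter.strip().upper()]

def mean_test_grade(letter_grades: list[str]) -> float:
    """Average the converted knowledge acquisition test grades for a semester."""
    points = [to_points(g) for g in letter_grades]
    return sum(points) / len(points)

# Example: the four knowledge acquisition tests of one module in one semester.
print(mean_test_grade(["A", "B", "B", "C"]))  # -> 4.0
```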

Analysis

The tutor grades were first converted to scaled numerical values on a five-point scale. In seeking evidence of reflective activities through reflection journal writing, student journals were analyzed using the SPSS Text Analysis for Surveys™ software (SPSS 2006). The software uses advanced linguistic technologies to extract and classify key concepts from student journal responses. These technologies analyze content as a set of phrases and sentences whose grammatical structure provides a context for the meaning of a response. The software enables the coding and categorization of journal responses in a fraction of the time required to do the job manually. Another benefit is that the categorization of responses is done consistently and reliably: the responses are analyzed in an iterative manner and, unlike human coders, the software classifies the same response in the same categories every time.

The first step in content analysis is to extract key terms and ideas from the journal responses. The engine uses linguistic algorithms and resources to identify relevant concepts. This means that extraction does not treat a response as a set of unrelated words, but it identifies key words, compound words, and patterns in the text. Pre-coded definitions were the linguistic resources used to extract terms from the journal responses.
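The extraction engine itself is proprietary; the toy sketch below merely conveys the general idea of pulling candidate key words and two-word compounds from a journal response. The stop-word list and frequency heuristic are illustrative assumptions, not the software's actual linguistic resources.

```python
# Toy illustration of term extraction: frequent single words plus adjacent-word
# compounds from one journal response. Not the SPSS engine's actual algorithm.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "in", "to", "i", "my", "we", "is", "it"}

def extract_terms(response: str, top_n: int = 10) -> list[tuple[str, int]]:
    words = [w for w in re.findall(r"[a-z']+", response.lower()) if w not in STOPWORDS]
    # Two-word compounds as a crude stand-in for compound-word detection.
    bigrams = [" ".join(pair) for pair in zip(words, words[1:])]
    return Counter(words + bigrams).most_common(top_n)

sample = "I used mind maps and graphs to organise the information I learned today."
print(extract_terms(sample))
```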

The extracted terms were grouped into categories by the software. As used in content analysis, a category refers to a group of closely related concepts, opinions or attitudes. The software relies upon three linguistic-based techniques that take into account the root meanings of the extracted terms and the relationships between sets of similar objects or opinions: term derivation, term inclusion and semantic networks (SPSS 2006, p. 101). Because these techniques are complementary to one another, all of them are used for categorizing the extracted terms.

The term derivation technique creates categories by taking a term and finding other terms that are related to it, by analyzing whether any of the terms’ components are morphologically related. For instance, the term “opportunities for self-reflection” would be grouped with the term “self-reflection opportunities”. The term inclusion technique uses algorithms to create categories by taking a term and finding other terms that include it. When determining inclusion, word order and the presence of words such as “in” or “of” are ignored. As an illustration, given the term “skill”, term inclusion will group terms such as “programming skills” and “a set of skills” into a skill category. The root term used to create the category (“skill”) can have words before it, after it, or both before and after (e.g. “programming skill set”).
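A rough sketch of how term inclusion might be approximated is given below; the ignored-word list, crude plural stripping and function names are assumptions for illustration and do not reproduce the SPSS algorithms.

```python
# Sketch of the "term inclusion" idea (assumed implementation, not the SPSS one):
# a term joins a root term's category if, after dropping function words, it
# contains all of the root term's content words, regardless of word order.
IGNORED = {"a", "of", "in", "for"}

def stem(word: str) -> str:
    # Crude plural stripping, enough for this illustration.
    return word[:-1] if word.endswith("s") else word

def content_words(term: str) -> set[str]:
    return {stem(w) for w in term.lower().split() if w not in IGNORED}

def includes(root: str, candidate: str) -> bool:
    """Term inclusion: candidate contains every content word of the root term."""
    return content_words(root) <= content_words(candidate)

terms = ["programming skills", "a set of skills", "mind maps", "programming skill set"]
print([t for t in terms if includes("skill", t)])
# -> ['programming skills', 'a set of skills', 'programming skill set']
```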

The semantic networks technique creates categories using a semantic/lexical network based on WordNet®, a linguistic project based at Princeton University (Miller 2006). WordNet® is a reference system of “nouns, verbs, adjectives and adverbs grouped into sets of cognitive synonyms, each representing one underlying lexical concept.” This method begins by identifying extracted terms that are known synonyms and hyponyms (i.e. a word that is more specific than the category represented by a term; e.g. student, tutor and peer are hyponyms of the term “person”).
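The sketch below illustrates the same idea with the freely available WordNet interface in NLTK; it is an analogue of, not the actual resource bundled with, the SPSS software.

```python
# Illustrative analogue of the semantic-network technique using NLTK's WordNet.
# Requires: pip install nltk, then nltk.download("wordnet").
from nltk.corpus import wordnet as wn

def hyponym_lemmas(word: str) -> set[str]:
    """All lemma names below the first noun sense of `word` in the hyponym tree."""
    root = wn.synsets(word, pos=wn.NOUN)[0]
    lemmas = set()
    for synset in root.closure(lambda s: s.hyponyms()):
        lemmas.update(l.name().replace("_", " ").lower() for l in synset.lemmas())
    return lemmas

people = hyponym_lemmas("person")
# With standard WordNet data, "student" and "tutor" are hyponyms of "person",
# so they would be grouped under a "person" category.
print("student" in people, "tutor" in people)
```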

In order to analyze the journal responses in a more meaningful fashion, a custom library was created. This library contained domain-specific words and terms (with synonyms) that arose from the modules taken by all first-year students. In this particular institution, all students were required to take two mathematics and computer applications modules in their first year of studies. These modules consisted of several tasks which asked students to create spreadsheets and basic computer programs to perform simple numerical functions. Using these modules as an example, domain-specific words would include “Visual Basic programming”, “Microsoft Excel graphs”, “spreadsheets”, etc.

The categories that were automatically generated were also renamed to capture their essential meanings. The descriptions of the categories obtained are contained in Table 1 (see also Lew and Schmidt 2011).

Table 1.

Description of categories generated by means of text analysis software

Category: Critical review of past learning experiences (reference studies: Lew and Schmidt 2006, 2007, 2011; Moon 1999)

 Self: To look over or examine one's own performance. This includes learning strengths and weaknesses; setting or tracking learning goals; and learning styles such as visual, auditory, and tactile.

 Peers: To look over or examine peers' performance. This includes team work and team dynamics (i.e. cooperativeness and level of contributions), and helping peers with their learning or seeking help from peers.

 Products: To look over or study the products of learning, which emerged as a result of relating knowledge structures from text. This includes domain-specific skills (e.g. graph-plotting using Microsoft Excel, Visual Basic programming, Microsoft PowerPoint), presentation slides, self-created computer programs, self-created Excel accounting spreadsheets, classroom performance grades, etc.

Category: Learning strategies (reference studies: McCombs and Whistler 1989; Weinstein and Mayer 1986)

 Rehearsal: Oral repetition, copying, making selective verbatim responses and underlining the important parts of the material.

 Organization: Categorizing information, creating knowledge networks and hierarchies (e.g. mind maps).

 Elaboration: Creating analogies or mental images, generative note taking and self-questioning.

Category: Summaries of what was learned (reference studies: Moon 1999; O'Rourke 1998; Selfe et al. 1986)

 To relate new information to prior or existing knowledge; applicability of knowledge gained to other situations.

The outcomes of the text analyses suggest that students appeared to reflect on three general categories related to their learning in their reflection journals: critical review, learning strategies, and summaries of what was learned (see Table 1 for description).

Data used for the analyses were student reflection journals for an entire week during Week 3 of the first semester and again during Week 14 of the second semester of the academic year 2007–2008. Data from Week 1 were not considered because, it being the start of a new academic year, student enrolment had yet to reach a steady state: students were still appealing to enter or change polytechnics. Enrolment figures reached a steady state by the second week. Data from Weeks 15 and 16 were not considered because class attendance was poor in the last 2 weeks of the academic year; the number of reflection journals submitted in the last 2 weeks was significantly lower than that for Week 14.

Identical categories were generated for both sets of data. The number of instances in which each category appeared in each student’s journal response was recorded and used for comparison against students’ performance in class (i.e. classroom performance grades) and on knowledge acquisition tests.
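In essence, each analysis that follows correlates per-student category frequency counts with numerically converted grades. The minimal sketch below (with invented data) shows this step using a standard Pearson correlation.

```python
# Minimal sketch of the correlational step: per-student frequency counts for one
# coding category are correlated with the numerically converted grades.
# The values below are invented for illustration only.
from scipy.stats import pearsonr

# e.g. number of "learning strategies: rehearsal" instances per student in Week 3
rehearsal_counts = [0, 2, 1, 3, 0, 1, 2, 4]
# corresponding classroom performance grades on the five-point scale
classroom_grades = [3, 4, 3, 5, 2, 3, 4, 5]

r, p = pearsonr(rehearsal_counts, classroom_grades)
print(f"r = {r:.2f}, p = {p:.3f}")
```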

Results

Table 2 contains the results of the correlational analyses between the frequency counts for coding categories present in students’ journal responses and their classroom performance grades for Week 3 of the first semester and Week 14 of the second semester in the academic year. Weak correlations were obtained (r ranging from .02 to .27).

Table 2.

Correlations between frequency counts for coding categories present in student journal responses and students’ classroom performance grades

Classroom performance grades
Week 3 Week 14
Critical review
 Self
  Week 3 .03 .02
  Week 14 .16* .25*
 Peers
  Week 3 .04 .04
  Week 14 .13* .14*
 Products
  Week 3 .03 .03
  Week 14 .12* .13*
Learning strategies
 Rehearsal
  Week 3 .13* .09*
  Week 14 .25* .27*
 Organization
  Week 3 .10* .02
  Week 14 .12* .14*
 Elaboration
  Week 3 .08* .13*
  Week 14 .11* .15*
Summaries of what was learned
  Week 3 .16* .12*
  Week 14 .12* .14*

Week 3 was in semester 1, at the beginning of the academic year, whereas Week 14 was in semester 2, towards the end of the academic year

Degrees of freedom = 689

* p < .01, 2-tailed

The results of the correlational analyses between the frequency counts for coding categories present in student journal responses and students’ knowledge test grades are contained in Table 3. Weak to moderately strong correlations were obtained (r ranging from .02 to .34).

Table 3.

Correlations between frequency counts for coding categories present in student journal responses and students’ knowledge acquisition grades

Knowledge acquisition test grades
Semester 1 Semester 2
Critical review
 Self
  Week 3 .02 .04
  Week 14 .34* .33*
 Peers
  Week 3 .09 .11*
  Week 14 .25* .28*
 Products
  Week 3 .19* .19*
  Week 14 .11* .12*
Learning strategies
 Rehearsal
  Week 3 .12* .13*
  Week 14 .29* .29*
 Organization
  Week 3 .07* .08*
  Week 14 .14* .15*
 Elaboration
  Week 3 .11* .13*
  Week 14 .16* .15*
Summaries of what was learned
  Week 3 .16* .17*
  Week 14 .21* .11*

Mean values of students’ knowledge acquisition test grades for semesters 1 and 2 were used in the correlational analyses

Degrees of freedom = 689

* p < .01, 2-tailed

Higher correlations were observed for Week 14 than for Week 3. A method for comparing correlations drawn from the same sample, as described by Cohen and Cohen (1983, p. 57), was used to test for significant differences between them. Results of the analysis reveal that the differences in the correlations between the coding categories and classroom performance grades were not statistically significant. Similar findings were obtained for the correlations between the coding categories and knowledge acquisition test grades. The findings suggest that the type of reflection (i.e. self-reflection on how learning took place and/or on what was learned) did not matter when it comes to promoting learning and hence academic achievement in students.
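For readers who wish to reproduce this step, the sketch below implements one commonly used test for comparing two dependent correlations that share a common variable (the Williams/Hotelling form). Cohen and Cohen (1983) describe such a procedure, although their exact formula may differ, and the week-3/week-14 count correlation used in the example is an invented value, since it is not reported here.

```python
# Sketch of a test for comparing two dependent correlations sharing a common
# variable: corr(grade, week-3 counts) vs. corr(grade, week-14 counts), both
# computed on the same students. Illustrative only; r_kh below is assumed.
import math
from scipy.stats import t as t_dist

def compare_dependent_correlations(r_jk: float, r_jh: float, r_kh: float, n: int):
    """r_jk: corr(grade, week-3 counts); r_jh: corr(grade, week-14 counts);
    r_kh: corr(week-3 counts, week-14 counts); n: sample size."""
    det = 1 - r_jk**2 - r_jh**2 - r_kh**2 + 2 * r_jk * r_jh * r_kh
    r_bar = (r_jk + r_jh) / 2
    t_stat = (r_jk - r_jh) * math.sqrt(
        ((n - 1) * (1 + r_kh))
        / (2 * ((n - 1) / (n - 3)) * det + r_bar**2 * (1 - r_kh) ** 3)
    )
    p = 2 * t_dist.sf(abs(t_stat), df=n - 3)
    return t_stat, p

# Hypothetical values: r_kh = 0.30 is assumed purely to make the example run.
print(compare_dependent_correlations(0.03, 0.25, 0.30, 690))
```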

Discussion

The present study was conducted to examine whether a relationship exists between students’ ability to self-reflect and their academic achievement, and whether their awareness of how and what they have learned improves as they progress through the course while engaging in continuous journal keeping. To that end, students’ reflection journals, which focus on self-reflection on the processes of learning and on the knowledge taught, were coded in an objective fashion by means of an automated content analysis approach using software, and textual categories were generated. Correlational analyses were performed on the textual categories and students’ classroom performance and knowledge test grades. Data used in the analyses were collected once at the beginning of the academic year, and again at the end of the academic year.

Weak correlations were found between the learning categories generated from students’ journal responses and their classroom and knowledge acquisition test grades. The findings also indicate that the type of reflection, i.e. self-reflection on how learning took place and/or on what was learned, made no difference in terms of helping students become more effective at learning or academic achievement. Although the differences in the correlations between Week 3 and Week 14 were not statistically significant, one cannot conclude that no relationship exists between students’ ability to self-reflect and their performance in the classroom and on knowledge acquisition tests. Increasing trends in the correlations were observed in Tables 2 and 3, suggesting that self-reflection was effective to a small extent in improving student learning, and that students do demonstrate some growth in self-reflection (as indicated by the higher correlations between coding categories and academic grades): their ability to self-reflect on how and what they have learned did improve through engaging continuously in reflection journal writing, although this influence was not manifested in a measurable improvement in academic performance.

What are some plausible explanations for these findings? First, there is the possibility that the weak relationship between self-reflection and academic performance arises because students are generally poor at self-reflection. They simply are not able to reflect effectively on their own performance and the subject matter taught, for instance because they have insufficient access to their own learning process. However, McCrindle and Christensen (1995) reported that undergraduates in a first-year biology course who kept reflection journals showed more sophisticated conceptions of learning, greater awareness of cognitive strategies, and constructed more complex and related knowledge structures when learning from text, as compared to those who did not engage in journal keeping. Furthermore, they also performed significantly better on the final examination for the course. Hence, a general dismissal of the idea that students can be competent self-reflectors may be premature.

A second possibility is that the weak inter-relationship between self-reflection and academic performance is attributable to the fact that the students in this particular study were somewhat lacking in experience of self-reflecting on how and what they have learned. Students who took part in the current study could be described as “inexperienced” to some extent, because they were first-year students in higher education, although they already had more than 10 years of education behind them. Although some authors (e.g. Mann et al. 2009; Moon 1999) have suggested that experienced students, i.e. those in their later years of study, are better at self-reflection than students in introductory programs, McCrindle and Christensen (1995) did demonstrate that first-year students in higher education already have the capacity for self-reflection. Nevertheless, one cannot exclude the possibility that the beginning of a new study is not the best moment to investigate the relationship between self-reflection and academic performance and that the findings are time-dependent, that is, the results obtained would have been different if students’ journal responses and grades from other weeks of the academic year had been used in the correlational analyses. Another possibility is that differences between responses in Weeks 3 and 14 were caused by differences in the tutor-asked journal questions.

To test whether the findings reported are time-dependent, we examined post hoc students’ journal responses written in two other weeks, i.e. Week 4 of the first semester and Week 15 of the second semester, together with their classroom performance and knowledge acquisition test grades for the second semester. Again, textual categories identical to those contained in Table 1 were generated. Similar to the results obtained from the data sets from Weeks 3 and 14, tests of differences between the mean categorical frequency counts by means of paired-samples t tests revealed no significant differences (for example, Critical review (self): t(689) = 1.54, p < .01; Learning strategies (organization): t(689) = −2.75, p < .01; Summaries of what was learned: t(689) = 1.87, p < .01, with degrees of freedom in parentheses). Furthermore, tests of differences in the correlations between the learning categories and classroom performance grades (Week 4: r ranging from .03 to .22; Week 15: r ranging from .04 to .24) and knowledge test grades (Week 4: r ranging from .03 to .30; Week 15: r ranging from .05 to .29) revealed no significant differences when compared with the data sets from Weeks 3 and 14. This suggests the measurement stability of our findings, since the results from content analyses using data from other weeks of the academic year, and the correlations between textual categories and academic grades, were similar to those obtained from the data sets from Weeks 3 and 14.
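The paired-samples t test referred to above can be sketched as follows; the per-student counts in the example are invented for illustration.

```python
# Sketch of the paired-samples t test used to compare mean categorical frequency
# counts between two weeks for the same students. Counts below are invented.
from scipy.stats import ttest_rel

week4_counts = [1, 0, 2, 1, 3, 0, 2, 1]   # e.g. "critical review (self)" per student
week15_counts = [1, 1, 2, 0, 2, 1, 2, 2]  # same students, same category, other week

t_stat, p_value = ttest_rel(week4_counts, week15_counts)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```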

The reader may remember that students write reflection journals in response to a question posed by their tutor. These questions differ from day to day and also between tutors. To test whether the difference in self-reflection as a function of time was influenced by the specific tutor-asked questions, we subjected all the questions asked in both Weeks 3 and 14 to text analyses using the same content analysis approach applied to the student journal responses. In total, more than 400 journal questions were asked by the approximately 200 tutors involved in teaching the first-year applied science students. Identical categories (e.g. learning strengths and weaknesses, skills, subject matter, etc.) were generated for both data sets. Comparisons between the mean frequency counts for the categories by means of paired-samples t tests revealed that none of the differences were statistically significant. Therefore, the differences in the journal responses in Weeks 3 and 14 were not due to differences in the tutor-asked journal questions.

A third possible explanation for our findings is that, although a relationship exists between self-reflection and academic performance, it is not reflected in an improvement in students’ classroom performance and knowledge test grades. Moon (1999) and Selfe et al. (1986) contended that the influence of reflection journal keeping on student academic performance is subtle and does not seem to assist students in attaining better academic achievement. Instead, journal keeping seems to facilitate student learning in a number of other ways, among them synthesizing new knowledge about a subject domain with prior knowledge and learning, recording useful strategies for solving problems, and enhancing students’ awareness of their cognitive processes and their control of these processes.

A final possible explanation for the fairly poor inter-relationship between self-reflection and academic performance, not yet discussed here, is that some students simply do not take the task of journal writing seriously while others do, leading to weak correlations between the coding categories and students’ grades. In an earlier study, Lew and Schmidt (2007) reported that polytechnic students, when presented with the task of journal writing, became “strategic” in their approach to completing the task. Some students reported that they wrote their reflection journals in a bid to impress their tutors, while others were sceptical about the need to reflect on how and what they have learned, describing reflection journals as “mechanical and meaningless” and of no benefit to their learning.

Conclusion

These deliberations lead us to the conclusion that, generally, students’ ability to self-reflect on how and what they have learned did improve through engaging continuously in reflection journal writing, although this influence was not manifested in a measurable improvement in academic performance. Our study also suggests that self-reflection skill cannot easily be learned through extended experience and the provision of continuous feedback from tutors alone. There is an underlying assumption in the literature that students who are better at self-reflection perform better academically; to date, there is no finding to refute or support this assumption. Taken on their own, our findings might suggest that curricular interventions to teach self-reflection are futile and should be abandoned. However, the literature reveals that self-reflection does improve learning in other ways (see Mann et al. 2009; Moon 1999), even if these improvements cannot be measured using academic achievement. The findings from the present study are, to a large extent, in agreement with Moon (1999) and Selfe et al. (1986), who argue that the positive effect of self-reflection is not necessarily measured by achievement test grades. However, the results from existing studies were more subjective, since they involved manual coding of student journal responses. Further, existing studies did not include comparisons of findings over time, casting some doubt on the reliability and validity of their results.

The present study has sought ways of arriving at more reliable and valid measurements. We did not rely on single reflection journals of students, and we adopted an automated coding procedure to analyse the responses; as such, the problem of inter-coder reliability was absent. In contrast to most studies of journal writing, which suffer limitations such as small sample sizes, non-continuous engagement in the task of writing journals, or infrequent feedback from teachers, we collected data from over 600 first-year applied science students. Furthermore, in this context, students engaged continuously in the task of journal writing and received timely and regular feedback on their learning from their tutors. Though the provision of such continuous feedback may have created optimal conditions for enhancing students’ awareness of how and what they have learned, this was not translated into better classroom performance and knowledge test grades.

Limitations

Some limitations should, however, be noted. A shortcoming of the present study is the partial overlap of the instruments used (reflection journal, tutor judgment and knowledge acquisition test), which may have produced, in part, the weak to moderate correlations between the coding categories and academic grades. A study employing identical instruments should certainly be conducted to verify our findings.

The text analysis software is not a panacea, and although using software to perform content analysis removes inter-coder reliability as a concern, it is not without its shortcomings. In human coding, the coders read the responses and can capture all the nuances of a statement even if they face difficulties applying the coding categories. The software can apply the coding categories, but they need to be defined so that the nuances are captured. An implication arising from this is that the editing done by the researchers of the synonyms and excluded words in the various libraries must accurately capture the ideas of the respondents in the text. Another limitation of the software is that it will not capture all the information in the journal responses, although categories can be created easily without any intervention on the part of the researchers.

Further research

Based on the findings, two studies are suggested for future research. First, given the range of students’ aptitude and ability to cope with, and respond to, the task of reflecting on their own learning and performance, a focus on individual students and their strengths and weaknesses should constitute the next stage of research towards better understanding the nature and operation of self-reflection on academic performance in higher education. The gathering of detailed empirical evidence that may cast light on the characteristics and factors accounting for individual differences in students’ self-reflection skill is one key area for further research.

Second, further research should investigate whether students’ self-reflection skills can be improved through formal training. Extended experience alone, as our study has demonstrated, is clearly not enough to effect change. Mann et al. (2009) recommend that, as with other skills, learners may need a structure to guide the complex process of self-reflection on the content and process of their learning. They contend that guidance and supervision are vital to helping students become better at self-reflection. Through a more structured and closely guided process, students may become better aware of, and come to value, their existing capability for self-reflection and its potential for development and application.

Acknowledgments

The authors are grateful to Republic Polytechnic, Singapore, which made the data collection and management possible.

Open Access

This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Appendix

See Table 4.

Table 4.

Examples of students’ reflection journals to illustrate the different textual categories generated by means of software

Student Reflection journal question Reflection journal response Textual categories
A Discuss your effectiveness as a team player/leader in solving the problem today

I think I have completed my tasks as a team member and did well in my team today. Besides me, my team members have also worked well on the problem today. We co-operated and accomplished our distributed tasks. Besides, we understood the problem together and explained to each other when one of us met some problems. In the beginning, I did not know what actually we want to find out after read the problem statement. After discuss with my team members and the facilitator, I understood and found that I have learned before in my previous school which is the topic of isomerism. Then, we discussed together and tried to solve out the problem. Firstly, we distributed the works to each other and did the research. I completed my reaching and finished my slides. After that, I explained to my team members what I have done. At the same time, they also let me see their researching and explained to me when I did not understand. This led us to understand the information what we have found out of today’s problem. Therefore, I felt that we have tried our best to do well in our presentation with the sufficient information

In conclusion, I think the team work is the most important to make the team solve out the problem effectively. Sharing ideas is the process for us to solve the problems. “No man is an island”. I think this saying goes is very correct

Critical review of past learning experiences
B How can you apply some of the skills and knowledge that you have learned?

Today, I have learned about the different types of isomers and according to our research, isomer is categorised into structural and stereo. From structural it is than further broken down into skeletal, positional and functional and stereo is broken down into geometrical and optical. Also, about what chiral centre is. It means that 1 carbon atom attached to 4 different atoms. So in this case, to me, a chiral centre is like we working together in a group whereby there is 1 person who is the leader to lead the team and the rest of the 4 members who follows what the leader instruct to do. Thus, in order to work well in the team there must be a responsible leader and 4 other members who are willing to follows what the leader says and this is same as saying that there is no chiral centre if either 1 is not present. All these above are knowledge which I have never learned before and a new challenge to me. I have learned to find these resources in a more effective manner whereby typing the exact keyword so that the points which I found would not be out of point. Furthermore, being team with friends or people whom I am more comfortable working with, I am more confident speaking what I wanted to say clearly, ask questions which warrant further investigation. At the same time it builds up my confident level and also giving more suggestions and ideas to the team so as to improve my contribution in the team. Also, I am more interested in what I am doing and will be able to contribute more and effectively

In conclusion, I realized that working in a team we need to build good relationship with people so as to accomplish our work more efficiently, we will have an enjoyable working environment and also able to produce a more productivity work. So, I will apply what I have realized in future module in speaking up when I am in doubt, listening and valuing what others have to say and also suggesting a hypothesis or a possible solution built on the ideas of the group so as to improve the group as a whole to become a better one

Critical review of past learning experiences
C “The development of science and technology is the panacea to all our environmental problems.” What is your view of this statement? I strongly disagree that the development of science and technology is the panacea to all our environmental problems. This is because the development of Science and Technology bring a lot of disadvantages to our environment and it causes more problems to our environment. The development of Science involved using a lot of chemical product. Indirectly, the development of science and technology causes acid rain. Acid rain occurs when these gases react in the atmosphere with water, oxygen, and other chemicals to form various acidic compounds. Sunlight increases the rate of most of these reactions. The result is a mild solution of sulphuric acid and nitric acid. Acid rain accelerates the decay of building materials and paints, including irreplaceable buildings, statues, and sculptures that are part of our nation’s cultural heritage. Besides, It benefits people in some way, but alters our environment in some other way (like affecting other species, degrading some physical property of our environment). Nuclear energy and environmental pollutants most certainly fall into this topic Summaries of what was learned
D What insights did you gain today? In chemistry, there are many complicating terms for us to study and thus i find it necessary to memorize such terms as in science, alot of properties and reactions and terms have many links between one another and therefore there is a need to memorize all these terms starting from the simplest basic formula and terms to the most complicating one as there is a need to apply them when solving the most difficult question. You will need to work out the simplest method and terms slowly then u can get the end formula and answer to it. Therefore, I think it is necessary to memorize such terms as it is very useful in the need to apply it when doing the questions especially the difficult ones. However, I think that understanding the whole term is absolutely better than memorizing as if you understand the term rather than memorizing it only, you would tend to make lesser mistakes as you already have the knowledge of it. Like for example, in science, one formula may just differ from one another by a digit or an alphabet which could then leads to a totally different term together. Thus, apart from memorizing, understanding terminologies is also much needed Summaries of what was learned
E What strategies have I used to help me in my learning? I have used a number of strategies to help me in my learning. The first strategy is repetition, in which I will I will re-read the content to clarify an area of confusion. I believe that when I revise the notes a few times, I will be able to pick out information that I had missed or failed to understand in class. This way I can assess my own understanding level. There is also another strategy in which I firm a term or acronym that’s familiar and helps me put the content into a realistic setting. This has also helped me to improve my grades for my understanding tests. Another strategy I used visuals and pictures to help me learn the contents better. I use mind maps and graphs to understand the big picture, compare use of information and recall a section in the course Learning strategies
F What learning strategies did you use to help you answer the problem today? In order to help me answer the problem today, I keep a notebook where I copy down any important information that was being taught by the tutor. I prefer to write my own notes so that it helps me to memorize the information. I would try to link whatever I do not know to what I know. I try to learn how were the problem given or the information given was related to what I have learnt in secondary school. I would use thing I have found on the internet like visuals, graphs, images and information to what I was taught or given and through this, I would be able to learn. Lastly, I would try to create analogies to help me remember what was taught. I am weak in my Biology, so I will create my own analogies. For example, a gene is like an instruction manual that teaches someone how to build something, and the genotype is the text inside the manual Learning strategies

References

  1. Alwis, W. A. M. (2007). Pedagogical beliefs and institutional practices at Republic Polytechnic. Keynote presentation at the 2nd international symposium on PBL: Reinventing PBL, Singapore.
  2. Boud, D., Keogh, R., & Walker, D. (1985). Reflection: Turning experience into learning. London: Kogan Page.
  3. Chirema, K. D. (2007). The use of reflective journals in the promotion of reflection and learning in post-registration nursing students. Nurse Education Today, 27(3), 192–202. doi: 10.1016/j.nedt.2006.04.007.
  4. Cohen, J., & Cohen, P. (1983). Applied multiple regression/correlation analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
  5. Dewey, J. (1991). How we think. Buffalo, NY: Prometheus Books. (Originally published: Lexington, MA: D.C. Heath, 1910.)
  6. Gleaves, A., Walker, C., & Grey, J. (2008). Using digital and paper diaries for assessment and learning purposes in higher education: A case of critical reflection or constrained compliance? Assessment and Evaluation in Higher Education, 33(3), 219–231. doi: 10.1080/02602930701292761.
  7. Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology Review, 16(3), 235–266. doi: 10.1023/B:EDPR.0000034022.16470.f3.
  8. Lew, M. D. N., & Schmidt, H. G. (2006). Reflection upon learning between theory and practice: A focus-group study of tutors’ and students’ perceptions. Rotterdam, The Netherlands: Erasmus University Rotterdam.
  9. Lew, M. D. N., & Schmidt, H. G. (2007). Reflecting on practice: The use of journals at a problem-based learning school in Singapore. Poster presented at the 2nd international symposium on PBL: Reinventing PBL, Singapore.
  10. Lew, M. D. N., & Schmidt, H. G. (2011). Writing to learn: Can reflection journals be used to promote self-reflection and learning? Higher Education Research and Development (in press).
  11. Mann, K., Gordon, J., et al. (2009). Reflection and reflective practice in health professions education: A systematic review. Advances in Health Sciences Education, 14(4), 595–621. doi: 10.1007/s10459-007-9090-2.
  12. McCombs, B. L., & Whistler, J. S. (1989). The role of affective variables in autonomous learning. Educational Psychologist, 24(3), 277–306. doi: 10.1207/s15326985ep2403_4.
  13. McCrindle, A. R., & Christensen, C. A. (1995). The impact of learning journals on metacognitive and cognitive processes and learning performance. Learning and Instruction, 5, 167–185. doi: 10.1016/0959-4752(95)00010-Z.
  14. Miller, G. A. (2006). WordNet: A lexical database for the English language (Version 3.0). Princeton, NJ: Princeton University.
  15. Moon, J. A. (1999). A handbook of reflective and experiential learning. London: Routledge.
  16. O’Rourke, R. (1998). The learning journal: From chaos to coherence. Assessment and Evaluation in Higher Education, 23(4), 403–413. doi: 10.1080/0260293980230407.
  17. Salomon, G., & Perkins, D. N. (1989). Rocky roads to transfer: Rethinking mechanism of a neglected phenomenon. Educational Psychologist, 24, 113–142. doi: 10.1207/s15326985ep2402_1.
  18. Schmidt, H. G., & Moust, J. H. C. (2000). Factors affecting small-group tutorial learning: A review of the literature. In D. H. Evensen & C. E. Hmelo (Eds.), Problem-based learning: A research perspective on learning interactions (pp. 19–52). Mahwah, NJ: Lawrence Erlbaum.
  19. Selfe, C. L., Petersen, B. T., & Nahrgang, C. L. (1986). Journal writing in mathematics. In A. Young & T. Fulwiler (Eds.), Writing across the disciplines (pp. 192–207). Upper Montclair, NJ: Boynton/Cook.
  20. SPSS Inc. (2006). SPSS Text Analysis for Surveys™ 2.0 user’s guide. Chicago, IL: SPSS Inc.
  21. Thorpe, K. (2004). Reflective learning journals: From concept to practice. Reflective Practice, 5(3), 327–343. doi: 10.1080/1462394042000270655.
  22. Weinstein, C. E., & Mayer, R. E. (1986). The teaching of learning strategies. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 315–327). New York: Macmillan.
