Abstract
Researchers have long been interested in understanding how different learning approaches impact learning outcomes. Learning approaches are often conceptualized as a dichotomy of superficial and deep, and learning outcomes are typically viewed on a cognitive scale that ranges from lower- to higher-order. While there appears to be an inherent relationship between learning approach and outcomes where superficial approaches lead to lower-order learning and deep approaches result in higher-order learning, this concept is not well documented. The purpose of this study is to better understand this relationship by evaluating whether student performance on higher- and lower-order examination questions is influenced by the approach a student takes when studying. To investigate this, survey and examination data were collected from an upper-level undergraduate Human Anatomy course at the University of Cincinnati. Results indicate that, on average, students in the course favored a deep approach to learning. The impact that learning approach had on examination performance was investigated using a series of analytical approaches, which revealed that students who took a deep approach to learning performed marginally better on both higher- and lower-order examination questions in lecture and practical examination settings. These results are contextualized within the literature, which highlights the need for more research surrounding the interrelatedness and dependency of categories within both learning approaches and cognitive levels.
Keywords: Assessment, Higher-order learning, Deep approach, Superficial approach, Cognitive level
Introduction
Understanding the relationship between student learning and academic achievement has important implications at multiple levels of education. Within this area of research, there are two topics that have been studied extensively: student approaches to learning and cognitive levels associated with learning.
Student approaches to learning
Researchers have been interested in categorizing student approaches to learning since the 1970s (e.g., Marton & Saljo, 1976; Entwistle et al., 1979; Biggs, 1978). One of the main findings to come out of this early work was the identification of two approaches to learning: the superficial approach (SA) and the deep approach (DA). The DA is characterized by thinking critically about the material, having intrinsic motivation to learn, and seeking out the interrelations of information. In contrast, students who utilize a SA focus on memorization without contextualizing the information and are highly motivated by external factors, such as what material will be tested on an examination (Beattie et al., 1997). It has been argued that encouraging students to take a DA to learning is important, since previous studies have shown it to be positively correlated with academic outcomes (Cassidy, 2012; Rajaratnam et al., 2013; Newton & Martin, 2013) and those who take a DA are more likely to be life-long learners (Newble et al., 1990).
Cognitive levels of learning
In the 1950s, a series of meetings held by psychologists aimed to develop a classification scheme that captured behaviors representing the cognitive processes of a learner. This work culminated in the development of Bloom’s taxonomy, which was originally created to standardize the way educational objectives were written so they could be more easily shared between institutions (Bloom, 1956). Subsequently, Bloom’s taxonomy has undergone revisions (Krathwohl, 2002; Anderson et al., 2001), but its core outline has largely remained the same. At its foundation, Bloom’s taxonomy is a cumulative hierarchy where cognitive complexity increases as one progresses from lower to higher levels. Lower levels focus on memorization and comprehension, whereas higher levels incorporate aspects of cognition that include applying, analyzing, evaluating, and creating (Anderson et al., 2001). While Bloom’s taxonomy can be utilized in a variety of ways, there have been considerable research efforts focused on classifying examination questions based on cognitive levels and using this information to better understand student examination performance (Thompson et al., 2013, 2016; Verenna et al., 2018; Morton & Colbert-Getz, 2017; Crowe et al., 2008; Thompson & Giffin, 2021; Thompson & O’Loughlin, 2015; Zheng et al., 2008).
Intersection between learning approach and cognitive level
While research in the areas of student approaches to learning and cognitive levels of learning originated independently, there is considerable overlap in the concepts they describe. Characteristics associated with a DA to learning share many commonalities with the traits associated with higher cognitive levels whereas lower levels on Bloom’s taxonomy are more consistent with descriptions of the SA. While some research indicates that promoting higher-order thinking leads to better learning outcomes (Jensen et al., 2014), it is important to determine whether a DA to learning translates into a greater understanding of concepts that require higher-order cognition.
While several studies have attempted to better understand this relationship (Pandey & Zimitat, 2007; Rajaratnam et al., 2013; Hobbins et al., 2020; Newton & Martin, 2013; Bansal et al., 2021), many fail to account for confounding factors, such as student aptitude. Additionally, most studies have not employed a study design that directly assesses whether students who take a DA to learning outperform those who take a SA on higher-order examination questions, particularly when they are delivered in multiple-choice format or are administered as part of a practical examination. Therefore, the goal of this study is to evaluate whether a student’s approach to learning is correlated with performance on higher- or lower-order examination questions in an upper-level undergraduate (baccalaureate) Human Anatomy course where both lecture-based multiple choice and practical examinations were administered.
Materials and methods
This study involved students enrolled in an upper-level undergraduate Human Anatomy course offered through the Medical Sciences Baccalaureate Program at the University of Cincinnati College of Medicine. The Human Anatomy course is a 4-credit-hour, 15-week (one-semester) course offered to seniors only and focuses on clinically relevant anatomy. The regions covered in the course (from start to finish) include: back, thorax, abdomen, lower limb, and upper limb. The head/neck and pelvis/perineum regions, while highly clinically relevant, are not covered due to time constraints. The learning modalities consist of lecture, team-based learning sessions, and human dissection laboratory. Lecture and team-based learning sessions are scheduled for one hour, whereas laboratory sessions last approximately two hours. In the dissection laboratory, students are divided into teams of four and each team is assigned their own donor. All team members complete the dissections as a group.
Students are evaluated at three points during the course using computer-based, multiple-choice lecture examinations and practical examinations that incorporate both open-ended and multiple-choice questions utilizing pinned structures on donors. The first set of lecture and practical examinations is shorter as it includes only content related to the back. This is deliberate as it provides students with a chance to evaluate the effectiveness of their study methods in a low-stakes setting early in the course. The second set of examinations covers the thorax and abdomen, and the final set includes material related to the upper and lower limbs. Assessments include a mix of higher- and lower-order questions (Thompson & Giffin, 2021). Example lecture and practical examination questions are provided in Table 1.
Table 1.
Examples of lower- and higher-order anatomy questions
| Cognitive level | Lecture question example | Practical question example |
|---|---|---|
| Lower-order | A 20-year-old male presents with a superficial laceration on the posterior aspect of his lateral malleolus. Which of the following structures is most vulnerable to being damaged? A. Small saphenous vein B. Tendon of tibialis posterior C. Great saphenous vein D. Posterior tibial vein E. Tibial nerve | Tag the small saphenous vein near the lateral malleolus on a donor. Question prompt: Identify the tagged structure. |
| Higher-order | A. Weakness in wrist extension B. Weakness in elbow flexion C. Numbness on the medial palm D. Weakness in pronation E. Fingertip numbness on digits 4 and 5 | Tag the radial nerve in the triceps hiatus on a donor. Question prompt: A 35-year-old female presents to the ER with arm pain after a car accident. If the tagged structure was damaged during the accident, what physical exam findings are most likely? A. Weakness in wrist extension B. Weakness in elbow flexion C. Numbness on the medial palm D. Weakness in pronation E. Fingertip numbness on digits 4 and 5 |
Study design
The current study included students enrolled in the Human Anatomy course in the Fall 2021 semester. Forty students were enrolled in the course. All students were seniors in the Medical Sciences Baccalaureate program with a mean age of 21 years. There were 10 students who reported their gender as male and 30 who reported as female. Gender data were collected from college applications, where students were asked to identify their gender from the following options: female, male, nonbinary, or “add another gender.” Students who select “add another gender” are able to self-identify their gender identity.
The research project was introduced to students at the end of the first class meeting. During this time students were provided a brief explanation of the study, including what would be expected from them if they chose to participate. In short, participation involved taking several surveys and permitting the authors to use de-identified course grades and measures of prior academic performance (college grade point average (GPA)). At the end of the presentation, the authors left the room and an individual not associated with the study distributed and collected consent forms. The forms were sealed until final grades were calculated. Students who participated in all parts of the study received extra credit in the course. This protocol was approved by the University of Cincinnati College of Medicine Institutional Review Board (IRB # 2021-0414).
Student approaches to learning were evaluated using the revised two-factor Study Process Questionnaire (R-SPQ-2 F) (Biggs et al., 2001). This instrument was chosen because it is relatively short (20 questions), it has been validated in multiple contexts (Vaughan, 2016; Mogre & Amalba, 2014; Malik et al., 2019), and it is used widely in educational research. Students were asked to take the survey at two separate points: once at the beginning of the course, and again at the end of the course. Participants were not told what the instrument was measuring, and it was emphasized that there is no “right” answer to any of the questions.
The R-SPQ-2 F uses a series of statements that each participant rates using a five-point scale. Learning approach is then assessed on two scales: superficial and deep. Subscales related to motive and strategy are also a component of the instrument, although these were not considered in the present study. Once all participants completed the survey, their responses were tallied according to the guidelines in Biggs et al. (2001). This calculation then provides a score for each scale (superficial or deep) that ranges from 0 to 40. The higher the number on a given scale, the more likely an individual is to use that learning approach.
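The tallying procedure can be sketched as follows. This is a minimal illustration only: it assumes each of the 20 items is coded 0–4 so that each 10-item scale sums to the 0–40 range described above, and the item-to-scale assignments shown are placeholders that should be verified against the published scoring key in Biggs et al. (2001) before any real use.

```python
# Hypothetical item-to-scale assignments (verify against Biggs et al., 2001).
DEEP_ITEMS = [1, 2, 5, 6, 9, 10, 13, 14, 17, 18]
SURFACE_ITEMS = [3, 4, 7, 8, 11, 12, 15, 16, 19, 20]

def score_rspq(responses):
    """Tally deep and superficial scale scores.

    responses: dict mapping item number (1-20) to a rating coded 0-4,
    so each 10-item scale sums to a value between 0 and 40.
    """
    deep = sum(responses[i] for i in DEEP_ITEMS)
    surface = sum(responses[i] for i in SURFACE_ITEMS)
    return deep, surface

# A respondent who rates every item 2 scores 20 on each scale.
example = {i: 2 for i in range(1, 21)}
print(score_rspq(example))
```

A respondent's higher scale score then indicates their favored approach.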
To determine whether learning approach influenced academic performance, student grades in the Human Anatomy course were considered. All questions on the lecture and practical examinations were assigned a cognitive level using the Blooming Anatomy Tool (Thompson & O’Loughlin, 2015). These assignments were made by a single individual (ART), who has previously shown a high degree of intra-rater reliability when classifying examination questions (Thompson & Giffin, 2021). Questions were then separated into higher-order (Bloom’s apply and analyze) and lower-order (Bloom’s knowledge and comprehension) to permit more straightforward comparisons and reduce observer error (Thompson & O’Loughlin, 2015; Anderson et al., 2001). Classification of practical examination questions into lower- and higher-order was done following the criteria outlined in Thompson et al. (2016) and Thompson and Giffin (2021). The resulting distribution of sample sizes for questions according to assessment type and cognitive level are as follows: lecture examinations: higher-order (n = 42), lower-order (n = 77); practical examinations: higher-order (n = 21), lower-order (n = 64).
Statistical methods
Results of the two different administration points of the R-SPQ-2 F were evaluated using paired t-tests with effect size reported as Cohen’s d and interpreted as 0.2 = small, 0.5 = medium, and 0.8 = large (Cohen, 1988). Examination data were considered as a whole and separated into higher- and lower-order categories. When comparing examination performance between groups, it is critical to account for the fact that student aptitude can confound the results. As such, the analyses employed in this study used overall college GPA as a covariate. While undergraduate GPA may not be the best metric in studies involving participants from multiple institutions, it is appropriate here since all study participants were from the same institution and enrolled in the same degree program.
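The paired comparison can be sketched in a few lines. The scores below are fabricated for illustration (the study itself used SPSS); the sketch shows the paired t-test alongside Cohen's d for paired samples, computed as the mean of the pairwise differences divided by their standard deviation.

```python
import numpy as np
from scipy import stats

# Fabricated start- and end-of-course scale scores for eight respondents.
start = np.array([33, 30, 35, 28, 31, 34, 29, 32], dtype=float)
end = np.array([31, 29, 34, 27, 30, 33, 28, 30], dtype=float)

# Paired t-test on the two administration points.
t, p = stats.ttest_rel(start, end)

# Cohen's d for paired data: mean difference / SD of differences.
diff = start - end
d = diff.mean() / diff.std(ddof=1)

print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```

With real data, d would then be read against the 0.2/0.5/0.8 benchmarks noted above.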
The data were analyzed using two approaches. The first approach used a partial correlation to determine whether an individual’s score on the DA and SA scales was associated with performance in the Human Anatomy course. In the second approach, two-step cluster analysis was used to group individuals based on their SA and DA learning scores. This cluster method was chosen since it can accommodate scale data and automatically determines the optimal number of clusters. Group membership determined from this analysis was then used as an independent variable in an analysis of covariance (ANCOVA) to compare examination performance. Effect size for the ANCOVA results was determined using eta squared (η2), which was calculated by dividing the sum of squares between groups by the total sum of squares (Levine & Hullett, 2002). Cohen (1988) suggests that values should be interpreted as: 0.01 = small, 0.06 = medium, and 0.14 = large. All data were analyzed using SPSS version 27 (IBM Corp., Armonk, NY).
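Two pieces of the analysis above lend themselves to a brief sketch: a partial correlation controlling for GPA (computed here via the residual method, i.e., correlating each variable's residuals after regressing out GPA), and eta squared computed from between-group and total sums of squares. All data below are fabricated; this is not the study's SPSS procedure, only an illustration of the underlying calculations.

```python
import numpy as np

rng = np.random.default_rng(0)
gpa = rng.normal(3.4, 0.3, 40)                 # toy covariate
deep = 20 + 3 * gpa + rng.normal(0, 2, 40)     # toy DA scores
exam = 60 + 8 * gpa + rng.normal(0, 3, 40)     # toy exam percentages

def residuals(y, x):
    """Residuals of y after a simple linear regression on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partial correlation of DA score and exam performance, controlling for GPA.
partial_r = np.corrcoef(residuals(deep, gpa), residuals(exam, gpa))[0, 1]

def eta_squared(scores, groups):
    """Eta squared = SS_between / SS_total."""
    grand = scores.mean()
    ss_total = ((scores - grand) ** 2).sum()
    ss_between = sum(
        (groups == g).sum() * (scores[groups == g].mean() - grand) ** 2
        for g in np.unique(groups)
    )
    return ss_between / ss_total

groups = (deep > np.median(deep)).astype(int)  # toy two-group split
e2 = eta_squared(exam, groups)
print(partial_r, e2)
```

Note that η2 from a one-way ANCOVA in SPSS additionally partials out the covariate; the ratio above illustrates the SS_between/SS_total definition cited from Levine & Hullett (2002).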
Results
Of the 40 students enrolled in the course, 100% agreed to participate in the study and completed both administrations of the R-SPQ-2 F survey. Overall, students in the course favored the DA at both the start and end of the course. While deep scores dropped slightly and superficial scores increased slightly at the end of the course, the differences were not statistically significant (Table 2). The end of course R-SPQ-2 F results were used in the remainder of the analyses since it is likely they better represent the approaches students took during the Human Anatomy course.
Table 2.
Comparison of learning approaches at the start and end of course
| Learning approach | Start of class mean (SD) | End of class mean (SD) | p-value | Effect size (D) |
|---|---|---|---|---|
| Deep score | 32.6 (5.6) | 31.4 (5.1) | 0.078 | 0.290 |
| Superficial score | 23.3 (5.1) | 24.3 (5.1) | 0.202 | 0.208 |
SD, Standard deviation
Study approach by gender
Since previous studies have shown differences in learning approach by gender (Rajaratnam et al., 2013; Mirghani et al., 2014; Bansal et al., 2021), this relationship was investigated in the current study. The average DA score for students reporting as male was 31.7 ± 3.1 compared to an average score of 31.4 ± 5.5 for those reporting as female. This difference was not statistically significant (p = 0.858). Males had a slightly higher average SA score (25.7 ± 5.9) compared to females (24.0 ± 4.9), but this difference was not statistically significant (p = 0.371).
Correlation between learning approach and student performance
Partial correlation controlling for aptitude using GPA revealed no significant correlation between learning approach and student performance in the Human Anatomy course (Table 3). While not statistically significant, SA scores were consistently negatively correlated with all measures of student performance, whereas DA scores were consistently positively correlated with student performance.
Table 3.
Partial correlation of study approach and student performance
| Scale | Practical LO | Practical HO | Lecture LO | Lecture HO | Final grade |
|---|---|---|---|---|---|
| DA score: correlation | 0.111 | 0.191 | 0.198 | 0.199 | 0.166 |
| DA score: p-value | 0.499 | 0.245 | 0.227 | 0.224 | 0.313 |
| SA score: correlation | −0.116 | −0.282 | −0.208 | −0.111 | −0.223 |
| SA score: p-value | 0.314 | 0.082 | 0.205 | 0.501 | 0.153 |
DA, Deep approach; SA, Superficial approach; LO, Lower order questions; HO, Higher order questions
Grouping based on learning approach
Two-step cluster analysis was used to determine whether study participants could be grouped based on a combination of their scores on the SA and DA scales. This analysis identified two distinct clusters (Fig. 1). The average silhouette coefficient, a metric of cluster quality (Rousseeuw, 1987), was 0.6, which is considered a good fit. The first group (Group 1, n = 29) included students with fairly similar mean DA and SA scores (29.0 ± 3.4 and 26.0 ± 4.8, respectively). The second group (Group 2, n = 11) comprised students with DA scores (mean = 37.8 ± 1.8) that were much higher than their SA scores (mean = 20.1 ± 3.3). While individuals in the first group still favored the deep approach, the difference in their scores was much smaller (3.0 points) than the difference in the second group (17.7 points).
Fig. 1.

Plot of cluster analysis group membership with 95 percent confidence ellipses
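The clustering step can be approximated outside SPSS. The sketch below is a rough stand-in only: SPSS's two-step procedure is not available in Python, so k-means on synthetic (DA, SA) score pairs is used instead, with the average silhouette coefficient computed as the quality check described above. The group parameters echo the means and SDs reported, but the points themselves are simulated.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)

# Simulated (DA, SA) pairs echoing the two reported groups:
# Group 1: similar DA/SA means; Group 2: DA much higher than SA.
group1 = np.column_stack([rng.normal(29.0, 3.4, 29), rng.normal(26.0, 4.8, 29)])
group2 = np.column_stack([rng.normal(37.8, 1.8, 11), rng.normal(20.1, 3.3, 11)])
scores = np.vstack([group1, group2])

# K-means as a stand-in for SPSS two-step clustering.
labels = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(scores)

# Average silhouette coefficient: values near 0.5-0.7 indicate a good fit.
sil = silhouette_score(scores, labels)
print(f"average silhouette = {sil:.2f}")
```

Unlike two-step clustering, k-means requires the number of clusters up front; here k = 2 mirrors the solution the two-step procedure selected automatically.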
Group membership described above was used as an independent variable in the comparison of student performance using ANCOVA (Table 4). Overall, there were no statistically significant differences in performance between individuals in Group 1 and those in Group 2. However, it is worth noting that individuals in Group 2 consistently outperformed those in Group 1. The most apparent difference was on higher-order practical examination questions, where individuals in Group 2 scored nearly 5% higher than those in Group 1. Although the difference was not statistically significant, the magnitude of the difference was medium (η2 = 0.074).
Table 4.
Comparison of student performance (percent correct) by cluster group membership
| Group 1 | Group 2 | p-value | Effect size (η2) | |
|---|---|---|---|---|
| Practical LO | 83.9 | 85.7 | 0.433 | 0.017 |
| Practical HO | 73.2 | 78.0 | 0.094 | 0.074 |
| Lecture LO | 86.0 | 88.6 | 0.203 | 0.043 |
| Lecture HO | 81.4 | 84.6 | 0.205 | 0.043 |
| Final grade | 85.9 | 88.0 | 0.209 | 0.042 |
LO, Lower order questions; HO, Higher order questions
Discussion
The purpose of this study was to explore the relationship between students’ learning approach and examination performance in an undergraduate clinically oriented Human Anatomy course. Based on previous research, it was suspected that academic success would be positively correlated with DA scores and negatively correlated with SA scores (Choo, 2006; Reid et al., 2007; Pandey & Zimitat, 2007; Subasinghe & Wanniachchi, 2012; Bliuc et al., 2011; Rajaratnam et al., 2013; Shaik et al., 2017; Bansal et al., 2021; Kamath et al., 2018). In addition, it was predicted that favoring a DA would result in better performance on questions that assessed higher-order learning.
Two methods were used to evaluate these relationships. The first was to perform a correlation analysis of the R-SPQ-2 F results and student performance. While this analysis failed to reveal any statistically significant relationships between learning approach and student performance, DA scores were consistently positively correlated with examination scores, whereas SA scores were consistently negatively correlated with performance. The second method employed in this study involved using cluster analysis to determine if the study participants could be grouped according to a combination of their SA and DA scores. This analysis resulted in the identification of two distinct groups. Individuals in Group 1 had average deep and superficial scores that were similar, whereas Group 2 included individuals who, on average, had a DA score that was considerably higher than their SA score. Comparison of examination performance between these two groups revealed no statistically significant differences. However, there was a consistent trend of higher average performance among those individuals who strongly favored a DA to learning (Group 2), regardless of the cognitive level being assessed.
Study strategy score comparison
General R-SPQ-2 F results from this study indicate that the majority of students had higher DA scores than SA scores. This aligns with the findings of the vast majority of studies that have investigated learning approaches of students enrolled in undergraduate STEM or graduate medical programs (Mattick et al., 2004; Newton & Martin, 2013; Rajaratnam et al., 2013; Mirghani et al., 2014; Mogre & Amalba, 2014; Shaik et al., 2017; Hobbins et al., 2020; Johnson et al., 2021). It differs from the findings of both Newble & Gordon (1985) and Martenson (1986), who, using the Lancaster Approaches to Learning Inventory, found that first-year medical students tended to have higher SA than DA scores. Perhaps this shift away from a predominantly superficial learning approach can be explained by the innovative curricular reforms implemented over the last half century (Putnam, 2006), as well as by an increased perception of peer support, a characteristic of collaborative and interactive learning environments that has been positively correlated with favoring a DA to learning (Sivan et al., 2000; Struyven et al., 2006; Coertjens et al., 2016).
Study strategy and cognitive level
While numerous studies have looked at the relationship between overall student performance and study strategy, few have investigated whether students who take a DA to learning demonstrate better understanding of higher-order concepts. Hobbins et al. (2020) intentionally structured a two-semester sequence of undergraduate Human Physiology courses to be taught and assessed in ways they hoped would encourage higher-order thinking and promote a DA to learning. Although students consistently favored a DA to learning across both semesters, an increase in higher-order and a decrease in lower-order performance was reported by the end of the second semester. In contrast to the present study, Hobbins et al. (2020) focused on outcomes at the aggregate level, comparing class-wide student performance and average learning approach across multiple semesters. The authors did not report on whether a student’s specific learning approach predicted how they performed on higher- versus lower-order questions.
Newton and Martin (2013) took a more targeted approach to whether learning approach was a factor in student performance at various cognitive levels. The authors found significant positive correlations between a DA to learning and accuracy of responses to Bloom’s level 3 (application) questions. A similar positive correlation was observed for level 2 questions (comprehension). No relationship between a surface approach and taxonomic level was discovered. While Newton and Martin (2013) classified their assessment questions on a slightly different scale, their findings are similar to what was reported here with regard to showing that DA scores were positively correlated with performance on questions assessing both higher- and lower-order concepts.
In both studies described above, short answer questions were utilized on examinations. This differs from the present study, in which multiple choice questions (MCQs) were used for lecture examinations. While the practical examinations included both MCQ and open-ended questions, answers to the latter were generally a single term (i.e., they were not short answer). Some have argued that MCQs only test a student’s ability to engage in factual recall and, as such, cannot be used to assess higher-order cognition (Wood, 2003; Hobbins et al., 2020). Others contend that properly constructed MCQs can be used to assess higher-order cognition (Clifton & Schriner, 2010), although typically only Bloom’s levels 1–4 can be reliably assessed using MCQs (Crowe et al., 2008; Thompson & O’Loughlin, 2015). An area of opportunity for future research could be a study design that includes both MCQ and open-ended questions on examinations. This would allow a wider range of Bloom’s levels to be included while retaining some grading efficiency.
Learning approach by gender
Although some studies have shown differences in preferred learning approach based upon participants’ reported gender, this study found no significant gender-based difference in either DA or SA scores. These findings are consistent with Wilson et al. (2006) and Shah et al. (2016), who also found no differences in the learning approaches of male and female students using the R-SPQ-2 F questionnaire. Where gender-based differences have been reported in the literature, the source of variation tends to be inconsistent. For example, Mirghani et al. (2014) found that, among preclinical and clinical medical students, males had significantly higher SA scores than females. In contrast, Bansal et al. (2021) described female students as more strongly favoring characteristics of SA learning when compared to their male counterparts. Similarly, Rajaratnam et al. (2013) found that a significantly larger portion of female students preferred a DA rather than a SA, although there was no significant difference in the average DA scores between male and female students. Overall, the findings presented here add to the growing body of literature indicating that gender is not a consistent predictor of the way students approach learning.
Learning approach as a dynamic process
Many studies use the R-SPQ-2 F to categorize students as either deep or surface learners. This dichotomization is arguably oversimplistic and fails to consider the dynamic nature of learning. Learning approaches are not innate to individual learners but are instead determined by the relationship between the learner and learning environment (Beattie et al., 1997). The contextual features that influence learning approach include assessment type and expectations (Entwistle & Entwistle, 1991), nature of the questions being asked (Marton et al., 1997), and quality of the learning environment (Entwistle & Entwistle, 1991; Coertjens et al., 2016). Specific environmental factors include attitudes and enthusiasm of the lecturer, peer support, and format of assessments. Evidence of adaptability in learning approach has been demonstrated by studies reporting significant alterations in approach occurring when the subject being studied is changed (Duff & McKinstry, 2007; Coertjens et al., 2016). Students’ relative interest and perceived relevance of the material being learned also impacts learning approach (Struyven et al., 2006; Coertjens et al., 2016).
The idea that students’ perceptions of how they will be assessed in turn drives their approach to learning has been suggested by a number of studies (Entwistle & Entwistle, 1991; Marton et al., 1997; Struyven et al., 2006; Gijbels et al., 2008; Prat-Sala & Redford, 2010; Thiede et al., 2011). Duff & McKinstry (2007) propose that if teaching and assessments only measure lower-order skills, learning approach may not predict academic performance. Similarly, a curriculum built on traditional didactic teaching style and assessments that measure rote memorization skill may not elucidate differences in learning approaches among students (Chonkar et al., 2018). Jensen et al. (2014) found that exposure to higher-order questions throughout a semester resulted in better performance on questions that assessed both higher- and lower-order concepts. Pandey and Zimitat (2007) investigated the relationship between learning strategy and academic success among first year medical students studying anatomy. Interestingly, the authors report average DA and SA scores as 31 and 30, respectively. The authors conclude that both memorization and understanding are key learning strategies for success. While the anatomy course investigated in the present study utilized various instructional modalities and examinations that assessed both higher- and lower-order concepts (Thompson & Giffin, 2021), at its core, anatomy requires a great deal of memorization. Perhaps it is the lower-order concepts that are reinforced by the SA that provide the scaffolding for higher-order learning.
The hierarchical nature of Bloom’s taxonomy
Bloom’s taxonomy was originally conceptualized as a cumulative hierarchy where lower-order concepts are built upon to reach higher-order processing (Bloom, 1956; Anderson et al., 2001). However, this concept is not universally accepted. Through a series of controlled studies, Agarwal (2019) showed that students who were exposed to higher-order practice quizzes did better on higher-order test questions compared to students who took lower-order quizzes. This evidence is used to suggest that, in a strict sense, higher-order skills are not inherently built upon lower-order knowledge. However, the author also discovered that students who received practice quizzes incorporating a mix of higher- and lower-order questions performed better on both higher- and lower-order test items. Thus, encouraging a learning approach where both superficial/lower-order and deep/higher-order learning methods are utilized could maximize the acquisition of information at multiple cognitive levels.
Validity concerns with the R-SPQ-2 F
While numerous studies have generally confirmed the validity and reliability of the R-SPQ-2 F (Biggs et al., 2001; Vaughan, 2016), this is not without exception. One concern that has arisen is whether the instrument is valid when administered to non-native English speakers (Choo, 2006; Shahrazad et al., 2013; Martinelli & Raykov, 2017; Malik et al., 2019). While the results of some studies differ, the majority conclude that a two-factor structure is preferable over the four-factor model proposed by Biggs et al. (2001). Some studies have found that removing one or more items from the questionnaire provides an enhanced fit for their respective samples (Immekus & Imbrie, 2010; Socha & Sigler, 2014; Zakariya et al., 2020). Martinelli & Raykov (2017) suggested that adding face-to-face interviews with students to protocols involving culturally diverse samples could strengthen validity. While issues related to the language of the R-SPQ-2 F are not a major concern in the present study, the racial and ethnic background of study participants was diverse and thus could have influenced the validity of the results.
Johnson et al. (2021) recently questioned the validity of the R-SPQ-2 F based on their sample of undergraduate students studying anatomy and physiology. The researchers administered the questionnaire to 231 students and, from these respondents, chose a small group of students (n = 11) to participate in individual interviews aimed at elucidating their thoughts on the learning process and approach. Comparing quantitative results from the R-SPQ-2 F to the qualitative results of the interview questions revealed considerable misalignment between the two sources of data. Word interpretation was identified as an area of concern for eight of the twenty survey items. It is possible that dated phrasing of the original study process questionnaire creates an opportunity for confusion and subjective interpretation. Additionally, Johnson and colleagues determined that contextual factors such as course expectations or course content may lead to varying interpretations of at least four more survey items. Reconciliation of the questionnaire and interview results led the authors to propose an approach that involves a “surface leading to deep” process. This parallels the idea presented here regarding the importance of using lower-order concepts as scaffolding to achieve higher-order learning. In this case, Johnson and colleagues are making the same argument, just through the lens of student learning approach as opposed to cognitive levels.
Limitations
There are several aspects of this study that limit its interpretation. While students generally enter the Human Anatomy course with only a basic understanding of anatomy, prior experience was not accounted for in the present study design. Additionally, individuals’ previous exposure to higher-order assessments, and how that might influence study approach, could not be evaluated. Although learning approach is dynamic, prior experience with higher-order questions influences subsequent performance on questions of similar cognitive complexity, independent of favored learning approach (Jensen et al., 2014; Agarwal, 2019). Although administration of the R-SPQ-2F at the beginning and end of the semester showed no significant change in learning approach scores over that period, the possibility remains that individual approaches to learning could have fluctuated throughout the duration of the course.
Another limitation of this study is that it did not consider how external factors may influence study approaches and academic performance. Beattie et al. (1997) point out that students who take a DA to learning tend to spend more time studying. Students concurrently enrolled in other challenging courses, or those who have extenuating circumstances in their personal lives, might opt to take an SA to learning, which is argued to take less time (Mirghani et al., 2014). Given the timeframe of the study and the impact of the COVID-19 pandemic, this is certainly a possibility.
One point to consider regarding the investigation of gender-related differences in learning approach is that, historically, gender and sex were often used as synonyms in education research (Glasser & Smith, 2008). Details on how gender-related data are collected are not commonly included in manuscripts. As a result, gender comparisons made between this study and others may be flawed, since it is not always clear whether researchers are referring to the modern-day understanding of gender or using it as a synonym for biological sex. Moving forward, it will be important that researchers are clear about the survey instrument used to collect gender-related data so future comparisons can be made accurately.
As noted previously, there are mixed findings as to the validity of the R-SPQ-2F. Since this study was not designed in a manner that permitted a study-specific evaluation of validity, it is difficult to say with certainty whether the R-SPQ-2F results accurately represent a student’s learning approach. Lastly, this study focused on a single cohort with a relatively small sample size. Due to the nature of the program and level of the course, the study participants were more homogeneous compared to studies that have utilized larger, introductory-level classes. This may have impacted the ability to discern patterns in the data.
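When item-level responses are available, a study-specific internal-consistency check is straightforward even if a full validity evaluation is not. The sketch below computes Cronbach’s alpha for one subscale from a respondents-by-items matrix; the sample data are hypothetical and serve only to illustrate the calculation.

```python
# Sketch: Cronbach's alpha for an R-SPQ-2F subscale.
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(totals)),
# where k is the number of items. Data below are hypothetical.

def cronbach_alpha(rows):
    """rows: list of respondent response lists (one rating per item)."""
    k = len(rows[0])                      # number of items
    def var(xs):                          # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in rows]) for j in range(k)]
    total_var = var([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four hypothetical respondents answering a three-item subscale:
data = [[4, 4, 5], [3, 3, 3], [5, 4, 4], [2, 2, 3]]
print(round(cronbach_alpha(data), 3))  # 0.917: high internal consistency
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the threshold most of the R-SPQ-2F validation studies cited above apply.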
Conclusion
Understanding the relationship between learning approach and performance on questions assessing higher- versus lower-order cognitive levels is critical in designing teaching and assessment strategies that optimize student learning. It is generally accepted that students who take a DA to learning perform better academically and, as such, educators should utilize teaching and assessment methods that encourage students to adopt a DA (Biggs et al., 2001). This study investigated the relationship between a student’s learning approach and performance on higher- versus lower-order examination questions in an undergraduate Human Anatomy course. Through a number of different analytical approaches, this study revealed a consistent, but not statistically significant, positive relationship between taking a DA to learning and performance on both higher- and lower-order examination questions. While the relatively small sample size may have impacted these results, review of the literature also highlights several potential issues related to the conceptual framework of how the R-SPQ-2F measures and interprets learning approach. When viewed broadly, these findings highlight the need for additional research in this area. In particular, more information is needed to clarify how the ability to adapt a given learning approach impacts academic success and the importance that lower-order cognitive skills have on achieving higher-order learning.
Author contribution
ART and LPOL wrote the manuscript text; ART analyzed the data and prepared the tables and figures.
Declarations
Conflict of interest
All authors declare that they have no conflict of interest.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- Agarwal PK. Retrieval practice & Bloom’s taxonomy: Do students need fact knowledge before higher order learning? Journal of Educational Psychology. 2019;111(2):189–209. doi: 10.1037/edu0000282.
- Anderson L, Krathwohl D, Airasian P, Cruikshank K, Mayer R, Pintrich P, Raths J, Wittrock M. A taxonomy for learning, teaching and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman; 2001.
- Bansal S, Bansal M, White S. Association between learning approaches and medical student academic progression during preclinical training. Advances in Medical Education and Practice. 2021;12:1343–1351. doi: 10.2147/AMEP.S329204.
- Beattie V, Collins B, McInnes B. Deep and surface learning: a simple or simplistic dichotomy? Accounting Education. 1997;6(1):1–12. doi: 10.1080/096392897331587.
- Biggs J. Individual and group differences in study processes. British Journal of Educational Psychology. 1978;48(3):266–279. doi: 10.1111/j.2044-8279.1978.tb03013.x.
- Biggs J, Kember D, Leung DY. The revised two-factor study process questionnaire: R-SPQ-2F. British Journal of Educational Psychology. 2001;71:133–149. doi: 10.1348/000709901158433.
- Bliuc AM, Ellis RA, Goodyear P, Hendres DM. Understanding student learning in context: relationships between university students’ social identity, approaches to learning, and academic performance. European Journal of Psychology of Education. 2011;26(3):417–433. doi: 10.1007/s10212-011-0065-6.
- Bloom B. Taxonomy of educational objectives: cognitive domain. New York: McKay; 1956.
- Cassidy S. Exploring individual differences as determining factors in student academic achievement in higher education. Studies in Higher Education. 2012;37:793–810. doi: 10.1080/03075079.2010.545948.
- Chonkar SP, Ha TC, Chu SSH, Ng AX, Lim M, Ee T, Ng M, Tan K. The predominant learning approaches of medical students. BMC Medical Education. 2018;18(1):17. doi: 10.1186/s12909-018-1122-5.
- Choo PGS. Assessing the approaches to learning of twinning programme students in Malaysia. Malaysian Journal of Learning and Instruction. 2006;3:93–116.
- Clifton SL, Schriner CL. Assessing the quality of multiple-choice test items. Nurse Educator. 2010;35(1):12–16. doi: 10.1097/NNE.0b013e3181c41fa3.
- Coertjens L, Vanthournout G, Lindblom-Ylänne S, Postareff L. Understanding individual differences in approaches to learning across courses: a mixed method approach. Learning and Individual Differences. 2016;51:69–80. doi: 10.1016/j.lindif.2016.07.003.
- Cohen J. Statistical power analysis for the behavioral sciences. New Jersey: Lawrence Erlbaum Associates Inc; 1988.
- Crowe A, Dirks C, Wenderoth MP. Biology in Bloom: implementing Bloom’s taxonomy to enhance student learning in biology. CBE-Life Sciences Education. 2008;7(4):368–381. doi: 10.1187/cbe.08-05-0024.
- Duff A, McKinstry S. Students’ approaches to learning. Issues in Accounting Education. 2007;22(2):183–214. doi: 10.2308/iace.2007.22.2.183.
- Entwistle N, Hanley M, Hounsell D. Identifying distinctive approaches to studying. Higher Education. 1979;8(4):365–380. doi: 10.1007/BF01680525.
- Entwistle NJ, Entwistle A. Contrasting forms of understanding for degree examinations: the student experience and its implications. Higher Education. 1991;22(3):205–227. doi: 10.1007/bf00132288.
- Gijbels D, Segers M, Struyf E. Constructivist learning environments and the (im)possibility to change students’ perceptions of assessment demands and approaches to learning. Instructional Science. 2008;36(5):431. doi: 10.1007/s11251-008-9064-7.
- Glasser HM, Smith JP. On the vague meaning of “gender” in education research: The problem, its sources, and recommendations for practice. Educational Researcher. 2008;37(6):343–350. doi: 10.3102/0013189X08323718.
- Hobbins JO, Murrant CL, Snook LA, Tishinsky JM, Ritchie KL. Incorporating higher order thinking and deep learning in a large, lecture-based human physiology course: can we do it? Advances in Physiology Education. 2020;44(4):670–678. doi: 10.1152/advan.00126.2019.
- Immekus J, Imbrie PK. A test and cross-validation of the revised two-factor study process questionnaire factor structure among Western University students. Educational and Psychological Measurement. 2010;70:495–510. doi: 10.1177/0013164409355685.
- Jensen JL, McDaniel MA, Woodard SM, Kummer TA. Teaching to the test or testing to teach: exams requiring higher order thinking skills encourage greater conceptual understanding. Educational Psychology Review. 2014;26(2):307–329. doi: 10.1007/s10648-013-9248-9.
- Johnson SN, Gallagher ED, Vagnozzi AM. Validity concerns with the revised study process questionnaire (R-SPQ-2F) in undergraduate anatomy & physiology students. PLoS ONE. 2021;16(4):e0250600.
- Kamath A, Rao R, Shenoy P, Ullal S. Approaches to learning and academic performance in pharmacology among second-year undergraduate medical students. Scientia Medica. 2018;28:32395. doi: 10.15448/1980-6108.2018.4.32395.
- Krathwohl DR. A revision of Bloom’s taxonomy: An overview. Theory Into Practice. 2002;41(4):212–218. doi: 10.1207/s15430421tip4104_2.
- Levine TR, Hullett CR. Eta squared, partial eta squared, and misreporting of effect size in communication research. Human Communication Research. 2002;28(4):612–625. doi: 10.1111/j.1468-2958.2002.tb00828.x.
- Malik AA, Khan RA, Malik HN, Humayun A, Butt NS, Baig M. Assessing reliability and validity of revised Biggs two-factor study process questionnaire to measure learning approaches among undergraduate medical students in Lahore, Pakistan. Journal of Pakistan Medical Association. 2019;69(3):337–342.
- Martenson DF. Students’ approaches to studying in four medical schools. Medical Education. 1986;20(6):532–534. doi: 10.1111/j.1365-2923.1986.tb01395.x.
- Martinelli V, Raykov M. Evaluation of the revised two-factor study process questionnaire (R-SPQ-2F) for student teacher approaches to learning. Journal of Educational and Social Research. 2017;7(2):9–13. doi: 10.5901/jesr.2017.v7n2p9.
- Marton F, Saljo R. On qualitative differences in learning: I. Outcome and process. British Journal of Educational Psychology. 1976;46(1):4–11. doi: 10.1111/j.2044-8279.1976.tb02980.x.
- Marton F, Watkins D, Tang C. Discontinuities and continuities in the experience of learning: An interview study of high-school students in Hong Kong. Learning and Instruction. 1997;7(1):21–48. doi: 10.1016/S0959-4752(96)00009-6.
- Mattick K, Dennis I, Bligh J. Approaches to learning and studying in medical students: Validation of a revised inventory and its relation to student characteristics and performance. Medical Education. 2004;38(5):535–543. doi: 10.1111/j.1365-2929.2004.01836.x.
- Mirghani H, Ezimokhai M, Shaban S, Van Berkel H. Superficial and deep learning approaches among medical students in an interdisciplinary integrated curriculum. Education for Health. 2014;27:10–14. doi: 10.4103/1357-6283.134293.
- Mogre V, Amalba A. Assessing the reliability and validity of the revised two factor study process questionnaire (R-SPQ-2F) in Ghanaian medical students. Journal of Educational Evaluation for Health Professions. 2014;11:19. doi: 10.3352/jeehp.2014.11.19.
- Morton DA, Colbert-Getz JM. Measuring the impact of the flipped anatomy classroom: The importance of categorizing an assessment by Bloom’s taxonomy. Anatomical Sciences Education. 2017;10(2):170–175. doi: 10.1002/ase.1635.
- Newble DI, Gordon MI. The learning style of medical students. Medical Education. 1985;19(1):3–8. doi: 10.1111/j.1365-2923.1985.tb01132.x.
- Newble DI, Hejka EJ, Whelan G. The approaches to learning of specialist physicians. Medical Education. 1990;24(2):101–109. doi: 10.1111/j.1365-2923.1990.tb02507.x.
- Newton G, Martin EC. Blooming, SOLO taxonomy, and phenomenography as assessment strategies in undergraduate science education. Journal of College Science Teaching. 2013;43(2):78–90. doi: 10.2505/4/jcst13_043_02_78.
- Pandey P, Zimitat C. Medical students’ learning of anatomy: memorisation, understanding and visualisation. Medical Education. 2007;41(1):7–14. doi: 10.1111/j.1365-2929.2006.02643.x.
- Prat-Sala M, Redford P. The interplay between motivation, self-efficacy, and approaches to studying. British Journal of Educational Psychology. 2010;80(Pt 2):283–305. doi: 10.1348/000709909x480563.
- Putnam CE. Reform and innovation: a repeating pattern during a half century of medical education in the USA. Medical Education. 2006;40(3):227–234. doi: 10.1111/j.1365-2929.2006.02402.x.
- Rajaratnam N, D’Cruz SM, M C. Correlation between the learning approaches of first year medical students and their performance in multiple choice questions in physiology. National Journal of Integrated Research in Medicine. 2013;4(5):42–47.
- Reid WA, Duvall E, Evans P. Relationship between assessment results and approaches to learning and studying in Year two medical students. Medical Education. 2007;41(8):754–762. doi: 10.1111/j.1365-2923.2007.02801.x.
- Rousseeuw PJ. Silhouettes: a graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics. 1987;20:53–65. doi: 10.1016/0377-0427(87)90125-7.
- Shah DK, Yadav RL, Sharma D, Yadav PK, Sapkota NK, Jha R, Islam MN. Learning approach among health sciences students in a medical college in Nepal: A cross-sectional study. Advances in Medical Education and Practice. 2016;7:137–143. doi: 10.2147/amep.s100968.
- Shahrazad WSW, Sulaiman WSW, Dzulkifli MA. Reliability of second-order factors of a revised two-factor study process questionnaire (R-SPQ-2F) among university students in Malaysia. Journal of Educational Evaluation for Health Professions. 2013;19:11.
- Shaik SA, Almarzuqi A, Almogheer R, Alharbi O, Jalal A, et al. Assessing Saudi medical students’ learning approach using the revised two-factor study process questionnaire. International Journal of Medical Education. 2017;8:292–296. doi: 10.5116/ijme.5974.7a06.
- Sivan A, Leung RW, Woon C, Kember D. An implementation of active learning and its effect on the quality of student learning. Innovations in Education and Training International. 2000;37(4):381–389. doi: 10.1080/135580000750052991.
- Socha A, Sigler EA. Exploring and “reconciling” the factor structure for the revised two-factor study process questionnaire. Learning and Individual Differences. 2014;31:43–50. doi: 10.1016/j.lindif.2013.12.010.
- Struyven K, Dochy F, Janssens S, Gielen S. On the dynamics of students’ approaches to learning: The effects of the teaching/learning environment. Learning and Instruction. 2006;16(4):279–294. doi: 10.1016/j.learninstruc.2006.07.001.
- Subasinghe S, Wanniachchi DN. Approach to learning and the academic performance of a group of medical students—any correlation? Student Medical Journal. 2012;3:5–10.
- Thiede KW, Wiley J, Griffin TD. Test expectancy affects metacomprehension accuracy. British Journal of Educational Psychology. 2011;81:264–273. doi: 10.1348/135910710x510494.
- Thompson AR, Braun MW, O’Loughlin VD. A comparison of student performance on discipline-specific versus integrated exams in a medical school course. Advances in Physiology Education. 2013;37:370–376. doi: 10.1152/advan.00015.2013.
- Thompson AR, Giffin BF. Higher-order assessment in gross anatomy: a comparison of performance on higher- versus lower-order anatomy questions between undergraduate and first-year medical students. Anatomical Sciences Education. 2021;14(3):306–316. doi: 10.1002/ase.2028.
- Thompson AR, Kelso RS, Ward PJ, Wines K, Hanna JB. Assessment driven learning: the use of higher-order and discipline-integrated questions on gross anatomy practical examinations. Medical Science Educator. 2016;26(4):587–596. doi: 10.1007/s40670-016-0306-z.
- Thompson AR, O’Loughlin VD. The Blooming anatomy tool (BAT): A discipline-specific rubric for utilizing Bloom’s taxonomy in the design and evaluation of assessments in the anatomical sciences. Anatomical Sciences Education. 2015;8:493–501. doi: 10.1002/ase.1507.
- Vaughan B. Confirmatory factor analysis of the study process questionnaire in an Australian osteopathy student population. International Journal of Osteopathic Medicine. 2016;20:62–67. doi: 10.1016/j.ijosm.2016.03.001.
- Verenna AMA, Noble KA, Pearson HE, Miller SM. Role of comprehension on performance at higher levels of Bloom’s taxonomy: Findings from assessments of healthcare professional students. Anatomical Sciences Education. 2018;11:433–444. doi: 10.1002/ase.1768.
- Wilson K, Smart R, Watson R. Gender differences in approaches to learning in first year psychology students. British Journal of Educational Psychology. 1996;66:59–71. doi: 10.1111/j.2044-8279.1996.tb01176.x.
- Wood EJ. What are extended matching sets questions? Bioscience Education. 2003. doi: 10.3108/beej.2003.01010002.
- Zakariya YF, Bjørkestøl K, Nilsen HK, Goodchild S, Lorås M. University students’ learning approaches: An adaptation of the revised two-factor study process questionnaire to Norwegian. Studies in Educational Evaluation. 2020;64:100816. doi: 10.1016/j.stueduc.2019.100816.
- Zheng AY, Lawhorn JK, Lumley T, Freeman S. Application of Bloom’s taxonomy debunks the “MCAT myth”. Science. 2008;319:414–415. doi: 10.1126/science.1147852.

Example higher-order examination question
A 35-year-old female presents to the ER after a car accident. Imaging studies are provided. If the nerve traveling in the region of the injury is damaged, what physical exam findings are most likely?