Clin Anat. 2018 Nov 26;32(1):156–163. doi: 10.1002/ca.23298

An evidence‐based approach to learning clinical anatomy: A guide for medical students, educators, and administrators

Anthony V D'Antoni 1, Estomih P Mtui 1, Marios Loukas 2, R Shane Tubbs 3, Genevieve Pinto Zipp 4, John Dunlosky 5
PMCID: PMC7379743  PMID: 30307063

Abstract

The amount of information that medical students learn is voluminous and those who do not use evidence‐based learning strategies may struggle. Research from cognitive and educational psychology provides a blueprint on how best to learn science subjects, including clinical anatomy. Students should aim for high‐cognitive learning levels as defined in the SOLO taxonomy. Using a real‐world example from a modern clinical anatomy textbook, we describe how to learn information using strategies that have been experimentally validated as effective. Students should avoid highlighting and rereading text because they do not result in robust learning as defined in the SOLO taxonomy. We recommend that students use (1) practice testing, (2) distributed practice, and (3) successive relearning. Practice testing refers to nonsummative assessments that contain questions used to facilitate retrieval (e.g., flashcards and practice questions). Practice questions can be fill‐in, short‐answer, and multiple‐choice types, and students should receive explanatory feedback. Distributed practice, the technique of distributing learning of the same content within a single study session or across sessions, has been found to facilitate long‐term retention. Finally, successive relearning combines both practice testing and distributed practice. For this strategy, students use practice questions to continue learning until they can answer all of the practice questions correctly. Students who continuously use practice testing, distributed practice, and successive relearning will become more efficient and effective learners. Our hope is that the real‐world clinical anatomy example presented in this article makes it easier for students to implement these evidence‐based strategies and ultimately improve their learning. Clin. Anat., 2018. © 2018 The Authors. Clinical Anatomy published by Wiley Periodicals, Inc. on behalf of American Association of Clinical Anatomists.

Keywords: learning strategies, cognitive psychology, educational psychology, highlighting, rereading, practice testing, distributed practice, successive relearning, clinical anatomy, SOLO taxonomy


The active recall of a fact from within is, as a rule, better than its impression from without.

—Thorndike (1906).

INTRODUCTION

Since the time of Sir William Osler, there have been several opinions on how medical students learn best and how their learning environments can be improved (Armstrong et al., 2004; Becker, 2014). Although a few papers have been published on undergraduate college student learning of anatomy and physiology (Dobson and Linderholm, 2015), none, to our knowledge, have specifically focused on how to use evidence‐based strategies to learn clinical anatomy in medical school. This void could be due to the misperception that medical students already use effective strategies because they are high‐performing students. Although students matriculating at U.S. medical schools have high grade point averages and Medical College Admission Test (MCAT) scores (Mitchell, 1990), some struggle in the medical school curriculum. Another obstacle could be a lack of awareness among medical students, educators, and administrators about research from the field of cognitive psychology (Ruiter et al., 2012), which has systematically explored many techniques that promise to improve student achievement.

The content presented in U.S. medical schools has increased exponentially during the last 50 years, even though the duration of the 4‐year undergraduate medical curriculum has remained unchanged (Anderson and Graham, 1980; D'Antoni et al., 2010). Therefore, there has been increased pressure on medical students to absorb vast amounts of information in limited time. This could be one of the main factors influencing their academic success in medical school coursework. We have observed that the learning strategies used by some medical students, especially during the first year, are not robust enough to meet the challenges associated with deep learning but rather support surface knowledge acquisition (D'Antoni et al., 2009). As described in the structure of observed learning outcomes (SOLO) taxonomy, deep learning is achieved when the learner moves from the unistructural and multistructural levels (both associated with a surface approach to learning) to the relational and extended abstract levels (Hattie and Brown, 2004). The learning strategies used by the learner, in this case the medical student, will influence the level and depth of learning. Depending on the learner's goals (retention of critical facts or understanding of core concepts), different techniques are likely to be more effective in achieving them.

On the basis of what is currently known regarding medical school curricula, medical students' characteristics, learning theory, learning strategies, and the adult learner, how can we as educators use learning strategies that will move medical students from surface learning experiences toward deeper learning experiences? We argue in this paper that educators can do so by modeling learning strategies that promote the relational and extended abstract levels of the SOLO taxonomy, the levels associated with deep learning.

The SOLO taxonomy of learning proposed by Biggs and Collis (1982) is a mechanism to motivate students' development intrinsically and extrinsically, to encourage reflective thinking, and to drive their self‐determination to learn. The SOLO taxonomy describes increasing levels of complexity in a learner's understanding of a concept (Biggs and Collis, 1982). It is a five‐level framework that classifies the observed learning outcome as prestructural, unistructural, multistructural, relational, or extended abstract (Pinto Zipp et al., 2016). As a learner moves along these levels, their cognitive abilities transition from recalling bits of information to evaluating and synthesizing information, ultimately supporting the transfer of acquired knowledge to new situations. This process of transforming knowledge demonstrates a deeper level of learning (Biggs and Collis, 1982). Figure 1 shows the different levels of the SOLO taxonomy (Pinto Zipp et al., 2016). In level one, prestructural, the learner acquires unconnected pieces of information and is unsure about the subject matter in general. In level two, unistructural, the learner possesses an idea about the information and begins to make simple connections between information and ideas, but no significant associations are formed. In level three, multistructural, the learner begins to make several connections among individual ideas and pieces of information, but a meta‐connection among all of the information is lacking. In level four, relational, the learner begins to see how the individual parts of the information acquired fit into a whole understanding of the concept. Finally, in level five, extended abstract, the learner is able to transfer and generalize information and ideas from one context to another (Biggs and Collis, 1982).

Figure 1.

Different levels of the SOLO taxonomy and characteristics associated with the learner's cognitive abilities at each level. [Color figure can be viewed at wileyonlinelibrary.com]

Researchers suggest that by using the SOLO taxonomy, curricula can be aligned with assessments to verify program outcomes (Biggs and Collis, 1982). The alignment and validation of a program's curricular map using the SOLO taxonomy has been termed “constructive alignment” (Biggs and Collis, 1982). Constructive alignment is an example of outcomes‐based education. In the constructive alignment approach, students are required to demonstrate an action such as “apply” or “perform” in the outcome statement. Learning is then assessed on the basis of the action taken by the student to reach the stated outcome. Thus, in the constructive alignment approach, it is the action taken by the student, not their ability to restate information, that implies learning (Newton and Martin, 2013). The traditional learning activity of dissecting a human body, still used in many medical schools today (McBride and Drake, 2018), is an example of a constructive alignment approach because in the anatomy laboratory, students dissect structures and then demonstrate them to faculty and their classmates.

A simple definition of a learning strategy is a technique that a student uses to learn information. Emphasizing the learner's activity during the process, Mayer (1996) defined a learning strategy as a behavior or thought engaged in by a learner that is intended to influence his/her encoding process. As shown in Figure 2, encoding is the process of transferring information into long‐term memory, whereas retrieval is the process of accessing it from long‐term memory when needed to answer a question or solve a problem (Atkinson and Shiffrin, 1968). In this paper, we review evidence‐based learning strategies that have been published in the cognitive psychology and medical education literature (Augustin, 2014; Dunlosky et al., 2013). Specifically, we use the recent monograph by Dunlosky et al. (2013) as a framework to extract the most effective learning strategies and discuss how they can be adapted to learning a section from the abdomen chapter of a popular U.S. medical school clinical anatomy textbook by Moore et al. (2015). We describe how medical students can apply these learning strategies to facilitate long‐term retention and deep learning, and we interpose U.S. Medical Licensing Examination (USMLE)‐style items so that students can assess the effectiveness of a given learning strategy. The learning techniques reviewed in this article include those to avoid (i.e., highlighting and rereading) and those to embrace (i.e., practice testing, distributed practice, and successive relearning). Note that the techniques outlined by Dunlosky et al. (2013) can be used to help students learn any scientific or nonscientific information taught in medical and health‐professional schools. Consequently, our approach in this paper can be used by adult learners in different educational settings, not only in the field of clinical anatomy.

Figure 2.

Updated modal model of learning by Atkinson and Shiffrin (1968). Information is read or heard and then encoded in the brain and stored in long‐term memory. When needed, the information is retrieved and used to solve problems.

BLUEPRINT TO LEARNING

Contextual Information

A medical student first needs to get a sense of the text‐based material that needs to be learned. For this article, we are using information presented on pages 113 to 119 in the abdomen chapter (chapter 2) of a popular clinical anatomy textbook (Moore et al., 2015). As shown in Table 1, these seven pages contain information about the anterolateral abdominal wall organized into three domains: basic anatomy (text on white background), clinical anatomy (text on blue background), and surface anatomy (text on yellow background). The basic anatomy domain is divided into the following sections: (1) fascia of the anterolateral abdominal wall, (2) muscles of the anterolateral abdominal wall, and (3) internal surface of the anterolateral abdominal wall. The clinical anatomy domain is divided into: (1) clinical significance of fascia and fascial spaces of the abdominal wall, (2) abdominal surgical incisions, (3) endoscopic surgery, (4) incisional hernia, (5) protuberance of the abdomen, and (6) palpation of the anterolateral abdominal wall. The surface anatomy domain comprises a single page with no sections. Once the student has a sense of the topic, she/he should begin to use evidence‐based study strategies. Before these are described in the context of the anatomical information to be learned, we want to point out what students should avoid.

Table 1.

Organization of information in chapter 2 (abdomen) based on domains^a

Section characteristics
Name | Background color | Number of figures | Number of tables

Basic Anatomy of Abdomen
Fascia of anterolateral abdominal wall | White | 1 | 0
Muscles of anterolateral abdominal wall | White | 2 | 1
Internal surface of anterolateral abdominal wall | White | 1 | 0

Clinical Anatomy of Abdomen
Clinical significance of fascia and fascial spaces of the abdominal wall | Blue | 1 | 0
Abdominal surgical incisions | Blue | 1 | 0
Endoscopic surgery | Blue | 0 | 0
Incisional hernia | Blue | 0 | 0
Protuberance of abdomen | Blue | 0 | 0
Palpation of anterolateral abdominal wall | Blue | 0 | 0

Surface Anatomy of Abdomen
Not applicable (no sections) | Yellow | 1 | 0

^a Information based on pages 113 to 119 in the textbook by Moore et al. (2015).

Learning Strategies to Avoid

Highlighting

Many students instinctively use highlighting when reading textbooks, and multiple studies have shown this strategy to be relatively ineffective (Dunlosky et al., 2013); even if one does learn a bit more while highlighting (as compared to merely reading), such learning will likely fall well short of real‐world learning objectives. Highlighting is a passive learning strategy because students are more focused on highlighting than on thinking about the information. Additionally, highlighting a single word or statement is unistructural or multistructural in nature, keeping the learning experience at the surface levels of the SOLO taxonomy rather than the deeper levels (see Fig. 1). Often students are unable to decipher key points and determine their relational relevance in the text passage, and this causes “over‐highlighting,” an activity that results in entire pages being highlighted, usually in a fluorescent color. Such indiscriminate highlighting defeats the purpose of the activity, which should be to identify the more important, testable topics (e.g., clinical correlates) for further study.

In our experience, students who highlight their textbooks rarely revisit the highlighted text. Moreover, modern textbooks often have keywords in bold or italics already, which renders highlighting superfluous. For example, in the “Fascia of the Anterolateral Abdominal Wall” section, the following eight key terms are in bold: subcutaneous tissue (superficial fascia), superficial fatty layer (Camper fascia), deep membranous layer (Scarpa fascia), investing fascia, endo‐abdominal fascia, transversalis fascia, parietal peritoneum, and extraperitoneal fat. A quick glance at the associated figure reveals the positions of these structures relative to each other and to the three muscles of the anterolateral abdominal wall (external oblique, internal oblique, and transversus abdominis), even though the muscles are specifically discussed in the next section of the chapter. An efficient medical student will recognize that the order of these structures is important and testable. A plausible clinical question (item) based on this information that could be asked on a formative or summative assessment is found in Figure 3. The item in Figure 3 could be slightly modified to include the three anterolateral abdominal wall muscles so that there would be a total of seven structures; this reasonable variation would integrate two sections of the chapter and make the item more challenging to answer. The subcutaneous tissue (superficial fascia) has a superficial fatty layer (Camper fascia) and a deep membranous layer (Scarpa fascia), and these are well defined below the level of the umbilicus but not above it. Such insightful thinking guides an efficient medical student's learning approach. However, if the student focuses on highlighting the text indiscriminately, she/he could completely miss the clinical relevance of the information. As discussed by Dunlosky et al. (2013), researchers have reported cases in which highlighting actually hindered the ability of students to make inferences at a later time. Clearly, just highlighting text should be avoided.

Figure 3.

Example of an item based on information in the “Fascia of the Anterolateral Abdominal Wall” section of the textbook by Moore et al. (2015).

Rereading

Rereading is a very common strategy used by undergraduate students (Carrier, 2003; Karpicke et al., 2009). While rereading can become an active learning strategy if readers are seeking new insights each time they read a passage, students often do not seek insights from their reading. Instead they erroneously perceive that via repetition, rereading will result in learning. This repetition of reading without the active engagement of the student seeking insights turns the activity into a passive learning strategy, which should be avoided in medical school because it does not foster deep learning. The relatively lackluster effects of rereading just for the sake of reading also arise when students rewatch video‐recorded lectures. Liles et al. (2018) recently reported that many medical students who received “C” grades reported rewatching online lectures as a learning strategy, whereas those who received “A” grades almost never reported rewatching lectures. These data corroborate the observations that rereading and rewatching lectures are ineffective learning strategies.

Learning Strategies to Embrace

Practice testing

Practice testing refers to low‐stakes, nonsummative assessments that contain items that can be used to facilitate retrieval; these can include flashcards and any available practice items (Dunlosky et al., 2013). For the “Fascia of the Anterolateral Abdominal Wall,” a student can simply cover the name of each fascia and attempt to recall the correct names until they can recall them correctly; later, they could attempt to sketch the fasciae from memory to try to capture all the key details of the abdominal wall, making sure to check what they had recalled correctly versus what they need to restudy. Practice testing has been investigated intensely and found to be a robust learning strategy (Karpicke et al., 2009; Karpicke and Roediger, 2008, 2010). Unfortunately, most college students report that they prefer rereading to practice testing (Karpicke et al., 2009). Karpicke et al. (2009) suggest that such students experience illusions of competence when rereading. For the abdomen chapter, medical students can use any of the commercially available anatomy flashcards to assess themselves and learn the material. Such a strategy is especially useful for laboratory practical examinations. The use of USMLE‐style items that focus on the clinical anatomy of the abdominal wall (Loukas et al., 2016) would also be very helpful for facilitating retrieval. These items should include lengthy explanations about the best answer choice and distractors (D'Antoni et al., 2012; D'Antoni and Mtui, 2018). As shown in Figure 4, thoughtful explanations of answer choices allow students to receive immediate feedback, and if they require more information they can refer to the appropriate pages in their textbooks (Loukas et al., 2016). The type of feedback shown in Figure 4 is explanatory feedback, which has been found to facilitate better student learning (both recall and inferential learning) than just correct answer feedback (Butler et al., 2013).

Figure 4.

Explanation of answer choices in the item shown in Figure 3.
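
To make the idea of explanatory feedback concrete, the following minimal sketch (in Python) shows one way a student or educator might structure a self‐administered practice item so that every choice, not just the best answer, carries an explanation. The item stem, choices, and explanations are illustrative placeholders written for this sketch; they are not the item in Figure 3 nor taken from any published question bank.

```python
# Minimal sketch of a practice item with explanatory feedback.
# The stem, choices, and explanations are hypothetical placeholders,
# not items from Figure 3 or any published question bank.

practice_item = {
    "stem": "Which layer lies immediately deep to the superficial fatty layer (Camper fascia)?",
    "choices": {
        "A": "Deep membranous layer (Scarpa fascia)",
        "B": "Transversalis fascia",
        "C": "Extraperitoneal fat",
        "D": "Parietal peritoneum",
    },
    "answer": "A",
    "explanations": {
        "A": "Correct: the membranous Scarpa fascia lies deep to the fatty Camper fascia.",
        "B": "Transversalis fascia lies much deeper, internal to the abdominal wall muscles.",
        "C": "Extraperitoneal fat lies between the transversalis fascia and the parietal peritoneum.",
        "D": "The parietal peritoneum is the deepest layer of the anterolateral abdominal wall.",
    },
}

def answer_item(item, response):
    """Score a response and return explanatory feedback for the chosen option."""
    correct = response == item["answer"]
    feedback = item["explanations"][response]
    if not correct:
        # Explanatory feedback covers the chosen distractor and the best answer.
        feedback += " Best answer {}: {}".format(item["answer"], item["explanations"][item["answer"]])
    return correct, feedback

is_correct, feedback = answer_item(practice_item, "B")
print(is_correct, feedback)
```

Whether on paper, on flashcards, or in software, the design choice that matters is that the feedback explains the distractors as well as the best answer, consistent with the findings of Butler et al. (2013).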

The components of successful practice testing include practice‐test format and dosage. Historically, practice testing using free recall or short‐answer items was found to result in better retrieval than multiple‐choice items (Dunlosky et al., 2013), but some studies did not reveal robust differences (Smith and Karpicke, 2014). More recently, multiple‐choice items were found to be just as effective in the classroom for practice testing because they allow students to recognize what they do not know, empowering them to go back and restudy for the formative and/or summative assessment (McDaniel et al., 2012; McDermott et al., 2014). Nevertheless, students can easily use free recall or short‐answer items in combination with multiple‐choice items for a combined effect. In fact, short‐answer items are still commonly used in anatomy practical examinations, although a recent trend has been to use multiple‐choice items for this purpose (Shaibah and van der Vleuten, 2013). The reason we recommend high‐quality, multiple‐choice items for practice testing in medical school is that such items are found on the USMLEs (Paniagua and Swygert, 2016). In our experience, the more items a student practices, the better she/he will perform in the class and on subsequent licensing board examinations (D'Antoni et al., 2012). Our experience parallels the literature because it has been found that practice testing also enhances performance in summative assessments (Daniel and Broida, 2004; McDaniel et al., 2011). However, answering practice items haphazardly might not lead to success. We recommend that students answer practice items in domain blocks that are related to newly learned material and that the items span the spectrum of difficulty. After answering an item, students should carefully read the accompanying explanations to ensure that they know the distractors as well as the best answer. If further knowledge is needed, students should work backwards by reading the associated page(s) in their textbook. As they answer practice items and read explanations, students should keep a running list of “muddy points,” which are self‐identified concepts in which they are weak that require further study. Students should revisit their “muddy points” list at the end of the week to learn the concepts in more depth. Such a simple method allows students to recognize what they do not know using practice items, and then resolve those knowledge deficits at a later time.
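
As a simple illustration of the “muddy points” habit described above, the sketch below keeps a running list of weak concepts while a student works through a block of practice items. The concept names and the manual flagging are illustrative assumptions for this sketch, not a prescribed tool.

```python
# Minimal sketch: keep a running list of "muddy points" (self-identified weak
# concepts) while answering a block of practice items. The concept names below
# are hypothetical placeholders.

muddy_points = []

def log_item(concept, answered_correctly, explanation_clear):
    """After answering an item and reading its explanation, flag weak concepts."""
    if not (answered_correctly and explanation_clear):
        muddy_points.append(concept)

# During a study block:
log_item("order of the anterolateral abdominal wall fasciae", answered_correctly=False, explanation_clear=True)
log_item("attachments of the external oblique", answered_correctly=True, explanation_clear=False)

# At the end of the week, revisit each muddy point in more depth.
for concept in muddy_points:
    print("Restudy:", concept)
```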

This brings us to the issue of dosage. How much practice testing should a medical student undertake? The answer is clear: the more the better (Rawson and Dunlosky, 2011). However, practice testing must be spaced to achieve maximum efficiency, and longer intervals between practice sessions generally produce longer‐lasting retention (Pashler et al., 2003). Because the medical school curriculum is densely packed, students need to plan out their practice sessions carefully so that they have a chance to practice answering test items about each topic across multiple sessions.

Distributed practice

“Pulling an all‐nighter” or cramming is a time‐honored tradition for many students who carry it over from college to medical school. In a sample of 376 undergraduate medical students, Bickerdike et al. (2016) found that those who crammed had poorer academic performance. Clearly, such a strategy does not result in long‐term retention and should be avoided. A more efficacious strategy is one where the information is learned over time, and this is called distributed practice. Distributed practice refers to the technique of distributing learning of the same content within a single study session or across sessions to facilitate long‐term retention (Dunlosky et al., 2013). Distributed practice emphasizes the schedule of learning episodes and not the learning technique used, so ideally, students will use the most effective strategies (e.g., practice retrieval instead of passive rereading) and distribute their practice of important concepts across study sessions.
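
As a rough illustration of what a distributed schedule might look like, the sketch below spreads retrieval practice of a single topic across several sessions. The specific intervals (1, 3, 7, and 14 days) and the start date are arbitrary assumptions for illustration, not intervals recommended by the cited studies.

```python
from datetime import date, timedelta

# Minimal sketch of distributed practice: schedule retrieval practice of the
# same topic across several sessions rather than massing it into one sitting.
# The intervals and dates below are illustrative assumptions only.

def spaced_sessions(first_exposure, intervals_in_days):
    """Return the dates on which a topic should be practice-tested again."""
    return [first_exposure + timedelta(days=d) for d in intervals_in_days]

first_exposure = date(2018, 9, 3)  # hypothetical day the topic is first taught
for session in spaced_sessions(first_exposure, intervals_in_days=[1, 3, 7, 14]):
    print("Practice-test the anterolateral abdominal wall on", session)
```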

Successive relearning: Combining the best strategies

As noted above, practice testing is a strategy a student can apply when attempting to learn all kinds of material—the specific details, the interconnections among them, and higher‐order relationships. Also as noted above, distributed practice is spacing one's practice of the same content across multiple sessions; combining practice testing with distributed practice is the foundation of successive relearning. During a study session, students would use practice tests until they can answer each item correctly and (if relevant) correctly explain why their answer is correct (and the alternatives are not). If students miss a question during practice, they would review the correct answer and test themselves on that concept again later in the session. That is, the students would continue until they have answered each practice question, or recalled each piece of target material, correctly one time. This strategy is well known to students who use flashcards to learn foreign language vocabulary: if the student cannot remember a particular association, that flashcard goes to the bottom of the stack, and practice continues until all of the vocabulary has been correctly recalled once.
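
A minimal sketch of this within‐session flashcard procedure follows; the two cards and the use of typed input to stand in for the student's self‐assessment are illustrative assumptions.

```python
from collections import deque

# Minimal sketch of the within-session procedure described above: a missed
# flashcard goes to the bottom of the stack, and practice continues until
# every card has been recalled correctly once in this session. The cards and
# the typed-input recall check are illustrative placeholders.

def practice_session(cards):
    """Cycle through (prompt, target) flashcards until each is recalled correctly once."""
    stack = deque(cards)
    while stack:
        prompt, target = stack.popleft()
        attempt = input(prompt + "? ").strip().lower()
        if attempt == target.lower():
            continue                    # recalled correctly; retire the card for this session
        print("Review:", prompt, "->", target)
        stack.append((prompt, target))  # test this concept again later in the session

practice_session([
    ("Superficial fatty layer of subcutaneous tissue", "Camper fascia"),
    ("Deep membranous layer of subcutaneous tissue", "Scarpa fascia"),
])
```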

This comprises a single session of practice testing. The difficulty in stopping here is that even after mastering concepts in a single session, students still forget much of what they learn (Bahrick, 1979). Thus, they need to return to the same material at a later time and use practice testing again until they can answer the questions correctly. During the first session, students may struggle, but in each succeeding session relearning will proceed much faster and the content will be retained even longer. Of course, successive relearning will take some time to use, but for critical content that students must retain and understand this strategy is essential. In fact, almost anything people do well has been learned using successive relearning; most people just do not realize they are using this strategy. The key is that the students need to understand their learning objectives so they can practice correctly during each session. If deep learning is needed, then their practice should focus on answering test items that require deep learning, et cetera. Fortunately, most U.S. medical school curricula have clear objectives that are available to guide students. Table 2 contains objectives correlated to the information in the anterolateral abdominal wall section (see Table 1). Objectives 1 and 2 are straightforward, whereas objectives 3 and 4 require deeper learning.

Table 2.

Objectives that correlate with the information from the anterolateral abdominal wall section taken from Moore et al. (2015)

Objectives:

1. Describe the structure and function of the anterolateral abdominal wall.

2. Demonstrate the significant landmarks visible or palpable on the abdominal examination.

3. Describe the fascial and muscular layers of the anterolateral abdominal wall from superficial to deep.

4. Describe common surgical incisions of the anterolateral abdominal wall and their relationships to underlying anatomical structures.

However, in contrast to other strategies reviewed here, very little evidence is available about the effectiveness of successive relearning (partly because evaluating its efficacy experimentally requires a great deal of time and effort). Nevertheless, the available evidence is promising (Rawson et al., 2013), and given the wealth of evidence that its two main components (practice testing and distributed practice) are effective, it seems reasonable that successive relearning would have a meaningful impact. For instance, students in an introductory psychology course used successive relearning to master key term definitions (core concepts that are foundational to that domain). During a single session, they learned eight concepts by trying to recall the meaning of each from memory (e.g., what is the meaning of “availability heuristic”?) and then receiving feedback about their performance. They continued until they could recall each concept correctly, and they then returned to relearn those concepts (i.e., recalling each one until they recalled it correctly) during several other sessions. Compared to concepts that they studied on their own, performance on a high‐stakes exam was about a letter grade and a half higher (Rawson et al., 2013).

For the item in Figure 3, a student could initially miss the correct choice because she/he did not consider the clinical relevance of knowing the fascial layers of the anterolateral abdominal wall from anterior to posterior. Later, when answering another similar item, she/he might answer correctly and provide an appropriate explanation. Nevertheless, the student could forget the correct order of fascial layers and again miss this item on a subsequent study session. She/He would again need to correct her/his error and try another item later in that session. After several successes across sessions, however, the clinical relevance of knowing the correct order of the fascial layers (from anterior to posterior and vice versa) would become ingrained in her/his understanding of the anterolateral abdominal wall, which should be long‐lasting.

CONCLUSIONS

Medical students should use practice testing, distributed practice, and successive relearning as strategies for learning at higher levels of SOLO taxonomy. These strategies can be introduced very early in the undergraduate medical curriculum so that students can begin to implement and refine them as they transition from student to physician. There should be learning experts available in medical schools to support students as they implement these learning strategies. Fortunately, the learning strategies discussed in this article are beginning to receive attention in medical specialty journals (Weidman and Baker, 2015). Our hope is that the real‐world clinical anatomy example presented in this paper will make it easier for students to implement these evidence‐based strategies and improve their learning.

ACKNOWLEDGMENTS

The authors thank Dr. Ritwik Baidya, Dr. Santosh K. Sangari, and Dr. Sushil Kumar for reviewing the article and providing thoughtful comments.

REFERENCES

  1. Anderson J, Graham A. 1980. A problem in medical education: is there an information overload. Med Educ 14:4–7. [DOI] [PubMed] [Google Scholar]
  2. Armstrong EG, Mackey M, Spear SJ. 2004. Medical education as a process management problem. Acad Med 79:721–728. [DOI] [PubMed] [Google Scholar]
  3. Atkinson RC, Shiffrin RM. 1968. Human Memory: A Proposed System and its Control Processes. New York: Academic Press. [Google Scholar]
  4. Augustin M. 2014. How to learn effectively in medical school: test yourself, learn actively, and repeat in intervals. Yale J Biol Med 87:207–212. [PMC free article] [PubMed] [Google Scholar]
  5. Bahrick HP. 1979. Maintenance of knowledge: questions about memory we forgot to ask. J Exp Psychol Gen 108:296–308. [Google Scholar]
  6. Becker RE. 2014. Remembering sir William Osler 100 years after his death: what can we learn from his legacy. Lancet 384:2260–2263. [DOI] [PubMed] [Google Scholar]
  7. Bickerdike A, O'Deasmhunaigh C, O'Flynn S, O'Tuathaigh C. 2016. Learning strategies, study habits and social networking activity of undergraduate medical students. Int J Med Educ 7:230–236. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Biggs J, Collis KF. 1982. Evaluating the Quality of Learning: The SOLO Taxonomy (Structure of the Observed Learning Outcome). New York: Academic Press. [Google Scholar]
  9. Butler AC, Godbole N, Marsh EJ. 2013. Explanation feedback is better than correct answer feedback for promoting transfer of learning. J Educ Psychol 105:290–298. [Google Scholar]
  10. Carrier LM. 2003. College students' choices of study strategies. Percept Mot Skills 96:54–56. [DOI] [PubMed] [Google Scholar]
  11. D'Antoni AV, DiLandro AC, Chusid ED, Trepal MJ. 2012. Psychometric properties and podiatric medical student perceptions of USMLE‐style items in a general anatomy course. J Am Podiatr Med Assoc 102:517–528. [DOI] [PubMed] [Google Scholar]
  12. D'Antoni AV, Mtui EP. 2018. Assessment of anatomical knowledge: approaches taken by higher education institutions by Bipasha Choudhury and Anthony Freemont. Clin Anat 00:1–2. [DOI] [PubMed] [Google Scholar]
  13. D'Antoni AV, Zipp GP, Olson VG. 2009. Interrater reliability of the mind map assessment rubric in a cohort of medical students. BMC Med Educ 9:19. [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. D'Antoni AV, Zipp GP, Olson VG, Cahill TF. 2010. Does the mind map learning strategy facilitate information retrieval and critical thinking in medical students. BMC Med Educ 10:61. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Daniel DB, Broida J. 2004. Using web‐based quizzing to improve exam performance: lessons learned. Teach Psychol 31:207–208. [Google Scholar]
  16. Dobson JL, Linderholm T. 2015. Self‐testing promotes superior retention of anatomy and physiology information. Adv Health Sci Educ Theory Pract 20:149–161. [DOI] [PubMed] [Google Scholar]
  17. Dunlosky J, Rawson KA, Marsh EJ, Nathan MJ, Willingham DT. 2013. Improving students' learning with effective learning techniques: promising directions from cognitive and educational psychology. Psychol Sci Public Interest 14:4–58. [DOI] [PubMed] [Google Scholar]
  18. Hattie JAC, Brown GTL. 2004. Cognitive Processes in asTTle: The SOLO Taxonomy (asTTle Technical Report #43). Auckland, New Zealand: University of Auckland/Ministry of Education. [Google Scholar]
  19. Karpicke JD, Butler AC, Roediger HL 3rd. 2009. Metacognitive strategies in student learning: do students practise retrieval when they study on their own. Memory 17:471–479. [DOI] [PubMed] [Google Scholar]
  20. Karpicke JD, Roediger HL 3rd. 2008. The critical importance of retrieval for learning. Science 319:966–968. [DOI] [PubMed] [Google Scholar]
  21. Karpicke JD, Roediger HL 3rd. 2010. Is expanding retrieval a superior method for learning text materials. Mem Cognit 38:116–124. [DOI] [PubMed] [Google Scholar]
  22. Liles J, Vuk J, Tariq S. 2018. Study habits of medical students: an analysis of which study habits most contribute to success in the preclinical years. MedEdPublish 7:1–16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Loukas M, Tubbs RS, Abrahams P, Carmichael S. 2016. Gray's Anatomy Review. 2nd Ed. Philadelphia, PA: Elsevier. [Google Scholar]
  24. Mayer RE. 1996. Learning strategies for making sense out of expository text: the SOI model for guiding three cognitive processes in knowledge construction. Educ Psychol Rev 8:357–371. [Google Scholar]
  25. McBride JM, Drake RL. 2018. National survey on anatomical sciences in medical education. Anat Sci Educ 11:7–14. [DOI] [PubMed] [Google Scholar]
  26. McDaniel MA, Agarwal PK, Huelser BJ, McDermott KB, Roediger HL 3rd. 2011. Test‐enhanced learning in a middle school science classroom: the effects of quiz frequency and placement. J Educ Psychol 103:399–414. [Google Scholar]
  27. Mitchell KJ. 1990. Traditional predictors of performance in medical school. Acad Med 65:149–158. [DOI] [PubMed] [Google Scholar]
  28. Moore KL, Agur AMR, Dalley AF. 2015. Essential Clinical Anatomy. 5th Ed. Baltimore, MD: Wolters Kluwer Health. [Google Scholar]
  29. Newton G, Martin E. 2013. Blooming, SOLO taxonomy, and phenomenography as assessment strategies in undergraduate science. J Coll Sci Teach 43:78–90. [Google Scholar]
  30. Paniagua MA, Swygert KA. 2016. Constructing Written Test Questions for the Basic and Clinical Sciences. 4th Ed. Philadelphia, PA: NBME. [Google Scholar]
  31. Pashler H, Zarow G, Triplett B. 2003. Is temporal spacing of tests helpful even when it inflates error rates. J Exp Psychol Learn Mem Cogn 29:1051–1057. [DOI] [PubMed] [Google Scholar]
  32. Pinto Zipp G, Maher C, Donnelly E, Fritz B, Snowdon L. 2016. Academicians and neurologic physical therapy residents partner to expand clinical reflection using the SOLO taxonomy: a novel approach. J Allied Health 45:e15–e20. [PubMed] [Google Scholar]
  33. Rawson KA, Dunlosky J. 2011. Optimizing schedules of retrieval practice for durable and efficient learning: how much is enough. J Exp Psychol Gen 140:283–302. [DOI] [PubMed] [Google Scholar]
  34. Rawson KA, Dunlosky J, Sciartelli SM. 2013. The power of successive relearning: improving performance on course exams and long‐term retention. Educ Psychol Rev 25:523–548. [Google Scholar]
  35. Ruiter DJ, van Kesteren MT, Fernandez G. 2012. How to achieve synergy between medical education and cognitive neuroscience? An exercise on prior knowledge in understanding. Adv Health Sci Educ Theory Pract 17:225–240. [DOI] [PMC free article] [PubMed] [Google Scholar]
  36. Shaibah HS, van der Vleuten CP. 2013. The validity of multiple choice practical examinations as an alternative to traditional free response examination formats in gross anatomy. Anat Sci Educ 6:149–156. [DOI] [PubMed] [Google Scholar]
  37. Smith MA, Karpicke JD. 2014. Retrieval practice with short‐answer, multiple‐choice, and hybrid tests. Memory 22:784–802. [DOI] [PubMed] [Google Scholar]
  38. Thorndike EL. 1906. The Principles of Teaching Based on Psychology. New York, NY: A. G. Seiler. [Google Scholar]
  39. Weidman J, Baker K. 2015. The cognitive science of learning: concepts and strategies for the educator and learner. Anesth Analg 121:1586–1599. [DOI] [PubMed] [Google Scholar]
