Journal of Medical Education and Curricular Development. 2020 Jul 29;7:2382120520941822. doi: 10.1177/2382120520941822

Table Quizzes as an Assessment Tool in the Gross Anatomy Laboratory

Natascha Heise 1, Carolyn A Meyer 1, Brendan A Garbe 2, Heather A Hall 1, Tod R Clapp 1
PMCID: PMC7391427  PMID: 32775692

Abstract

Using cadaveric instruction in a graduate-level anatomy course is an expensive and time-consuming undertaking. While this is a worthwhile endeavor, most first-year medical students and students in the health fields struggle with the independent, self-directed learning approach in the cadaveric laboratory and with moving beyond rote memorization of the material. As such, effective assessment tools that maximize student learning in the cadaveric laboratory are critical, especially if no lecture component is present. Dissection quality often reflects student attention to detail and therefore may be tied to overall performance in the course. The aim of this study was to investigate the relationship between weekly table quizzes and overall student outcomes in a graduate biomedical human dissection class, as well as to examine the benefits and implications of this approach. In this course, a uniquely structured weekly quiz assessed dissection quality and probed student understanding of human anatomy. Student data compiled from 5 years of dissection courses were analyzed to evaluate the relationship between performance on the weekly assessment and on the unit examinations. The results showed a statistically significant relationship between the weekly quizzes and the student examinations at the end of each dissection block in 2013, 2015, 2016, and 2017. The data suggest a potential correlation between performance on weekly quizzes and on unit examinations. The unique nature of the table quizzes provides the students with the opportunity to practice the retrieval of their knowledge, feel more guided throughout their dissection, and receive immediate feedback on their performance. This assessment tool also provides a way to predict student outcomes and an opportunity for early intervention to help at-risk students. The analysis in this research study addresses the need for more data on the use of assessment tools in a graduate human dissection class.

Keywords: dissection, prosection, flipped classroom

Introduction

Every year, the number of medical students within the United States gradually increases,1 and first-year students often find gross anatomy to be particularly challenging. Research on educational techniques, such as repeated and early intervention techniques that promote learning and retention, is well documented in other disciplines, but medical programs and other institutions that offer gross anatomy laboratory classes face many unique challenges. In addition, anatomical education has changed over the past decades. Since the 1990s, medical schools in the United States have revised their curricula toward more student-centered instruction while reducing the time devoted to the anatomical sciences.2-6 Many graduate programs offering cadaveric classrooms have followed this student-centered approach when teaching human anatomy. This puts additional pressure on programs to identify methodologies that increase learning efficiency. In a 2014 survey of 55 graduate and medical programs within the United States, all programs indicated that they use cadavers as a primary instruction method in their gross anatomy class.5 In addition, all programs had changed to either a partially or fully integrated curriculum connecting subjects such as neuroanatomy, gross anatomy, microscopy, and embryology.5 Only 31 programs (56%) in the United States assess other competencies such as professionalism, communication skills, and teamwork building.5 In addition, many schools have moved away from lecture-based methods, and some have even completely eliminated traditional anatomy lectures.7-10 These changes have created challenges for anatomists to develop new strategies to teach human anatomy.5

Within the medical field, educators have worked on implementing various assessment tools not only to improve student performance in anatomical courses but also to deepen their understanding and increase long-term retention of the material.11 Here, we describe the implementation of weekly table quizzes as an assessment tool. These oral quizzes between a facilitator and a group of 4 dissectors assess dissection quality while giving students the opportunity to work with contextual information.

Retrieval practice

Scientific and educational literature agrees that frequent, interactive quizzing is related to improved student outcomes.12,13 Regular assessments as an integral part of different learning environments engage students, enhance learning,14 encourage them to stay motivated and review more often, enable student-teacher interactions,11,15 and increase retrieval of knowledge. Karpicke16 proposed that it is impossible to directly assess the contents of stored knowledge; one can only examine students’ reconstructed knowledge.17-19 In Karpicke’s study, 3 groups of students were given a scientific text and then exposed to different levels of rereading or retrieving information. Results indicated that the group with the most active retrieval method produced the best long-term retention.16 However, most students are unaware of using retrieval-based learning to enhance their learning.20 Several recent studies have shown that low-stakes quizzes interspersed in a course can serve as an effective tool for promoting retrieval practice.21-23 Another study indicated that repeated recall in the classroom enhanced retention by more than 100% relative to material that was not frequently recalled.24 These and other studies show that it is critical to practice retrieval for learning in the classroom.25 Practicing the retrieval of knowledge through various exercises is a valuable skill for all students but does require time. However, with the decreased time devoted to the anatomical sciences, it remains questionable how feasible the implementation of this approach might be in a busy cadaveric classroom.

Feedback

Regular, effective evaluation in the classroom is also critical in promoting student learning. When comparing multiple-choice and short-answer formats, studies indicate that regular short-answer quizzes paired with targeted instructor comments, often defined as instructor feedback, enhance student learning more effectively than a multiple-choice format.26,27 Most cadaver-based anatomy classes rely on multiple active learning strategies, implementing problem-based learning, life models, radiological images, and laparoscopic views of the living body. Armbruster and colleagues28 implemented active learning and student-centered pedagogy in an introductory biology class with the aim of improving student attitudes and performance. Results indicated that students performed better on final examinations when material was taught in an interactive form. Interactivity included emphasizing learning goals during lectures with clicker questions, weekly quizzes, group work, recitation, and outside-class study groups. The weekly quizzes were considered a helpful lecture element and ranked third highest in helpfulness. These weekly quizzes were implemented to encourage students to keep up with the material and to provide feedback.28 In addition, other literature has shown that weekly quizzes can help students form links and relationships between clinical facts and other concepts.29-33 However, with less time devoted to the anatomical sciences and fewer faculty and staff present in the laboratory, anatomy facilitators have faced problems incorporating this approach to provide frequent feedback.

Assessment-driven learning

Most cadaveric classrooms have shifted to an independent, self-directed learning environment in which students are required to link concepts on their own. This is particularly challenging if no lecture component is present. Weekly oral quizzes, as well as checklists, may promote discussion and assessment-driven learning in the cadaveric classroom. Checklists have been used to evaluate learning in a gross anatomy laboratory to improve dissection quality and to assist students in improving learning outcomes and maintaining focus.34 Halliday et al35 implemented regular assessments into their medical school anatomy curricula to incentivize students to keep up with the material and to provide students with regular checkpoints to assess their progress. In another study, students reported that they saw weekly assessment of their dissection as a “valuable and rewarding part of their anatomy course,” and 64% of students agreed that the evaluation of their dissection in the form of weekly quizzes helped them use their laboratory time efficiently.36 Nevertheless, more data are needed on how this approach can effectively guide students in a more condensed cadaveric classroom environment.

Targeted intervention

Predicting student outcomes and ultimately identifying at-risk students has gained great interest as a way to provide sufficient help for students in need and ultimately increase student outcomes. The literature has examined ways to identify students who are likely to do poorly in a course early enough for remedial action to take place.37 One such example is reported by Meier and colleagues,37 who created an algorithm that focuses on students’ past performance in a course and proposed that early in-class assessments such as quizzes would enable timely interventions by the instructor. Others have used personalized multiregression and matrix factorization approaches38 or the analytics of digital textbook usage39 to forecast student outcomes. The literature calls for more research on how to effectively intervene and help students in need in a timely manner.

Summary

Although many studies have demonstrated the success of various supplemental materials in the cadaveric laboratory, it is important to continue to evaluate pedagogical methods and determine their benefits and implications. Student tendencies to focus on rote knowledge and memorizing the material should be supplemented with active learning strategies that let students practice the retrieval of their knowledge and receive immediate feedback on their progress. Students enrolled in a dissection course without a lecture component, as described in this study, may be in particular need of active learning to guide and encourage their independent study. The aim of this study was to investigate the relationship between uniquely structured weekly table quizzes and overall student outcomes in a graduate biomedical human dissection class, as well as to examine the benefits of this approach for faculty and staff in predicting student outcomes. The data from the weekly table quizzes described in this study may further support the literature in examining the advantages of frequent assessment, feedback, and guidance in a cadaveric laboratory.

Materials and Methods

Ethical approval

This study was reviewed by the Institutional Review Board at Colorado State University and did not require approval because it did not involve intervention or interaction with the individual or identifiable private information (45 CFR 46.102(f)).

Student cohort

For this study, the sample was composed of students who were enrolled in the graduate human gross anatomy dissection class at Colorado State University (Table 1). Thus, all registered students were part of this study and formed an opportunity sample. In fall 2013, 46 students were enrolled in the class. In fall 2012, 2015, and 2016, 48 students were enrolled each year, and in 2017, 55 students. The age range of the participants was 20 to 35 years. Undergraduate students with a major in biomedical sciences, biology, or health and exercise sciences made up one third of the class. The other two thirds of the class were graduate students enrolled in the 1-year master’s program in biomedical sciences. Each semester, approximately 1 to 2 graduate students from a toxicology master’s program were enrolled as well. Every semester, the class was composed of approximately half female and half male students. Transcripts reviewed before enrollment showed that students began the class with diverse backgrounds in human anatomy. Approximately half of the students had previously been enrolled in an undergraduate prosection gross anatomy course. The remainder of the cohort did not have sufficient anatomy knowledge and were recommended to take an undergraduate prosection gross anatomy class prior to enrollment or concurrently. The cadaveric prosection course provided a lecture component that was beneficial to students without an anatomy background.

Table 1.

Population characteristics.

Years           2012    2013    2015    2016    2017
# of students   48      46      48      48      55
Age range       20-35 years (all years)
Enrollment      1/3 undergraduate and 2/3 graduate students (all years)

Population characteristics from the years of 2012, 2013, 2015, 2016, and 2017 with number (#) of students, age range, and enrollment.

Course structure and grading

Students enrolled in graduate human gross anatomy dissection worked in groups of 4 to dissect a human cadaver over the course of a 16-week semester. The individual groups were formed based on students’ self-selection. Each group was assigned to a specific cadaver on the first day of classes and was required to follow regional dissection blocks to finish the dissection by the end of the semester. Each dissection block was 4 weeks long and incorporated 3 weekly quizzes, termed table quizzes, and 1 laboratory examination that tested student knowledge. However, the first dissection block was composed of only 2 table quizzes due to the timing between the start of the semester and the first examination. The 4 examinations comprised 80% of the students’ final grade, and the table quizzes comprised the remaining 20%. At the end of the semester, students were allowed to drop their lowest weekly table quiz grade. The course comprised solely laboratory instruction time, with no lecture component. The course was scheduled 3 times a week for 3 hours in the afternoon. During this time, students dissected in the presence of 2 professors, 1 instructor, and up to 5 teaching assistants. In addition, the students were required to work on their cadavers outside of class to finish the dissection required for each week. Because there was no lecture component for this course, the students were required to work with Grant’s Dissector40 and the Atlas of Human Anatomy.41 Weekly dissection guides of 2 to 3 pages supplemented the available resources, outlining the highlights of that week’s dissection. Similar details on the course design and dissection have been described by Nwachukwu and colleagues.36 As stated, students who did not have sufficient anatomy knowledge were also required to enroll in the undergraduate prosection class, which was composed of 3 1-hour lectures on Mondays, Wednesdays, and Fridays, as well as a prosection-based laboratory component once a week for 3 hours.
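To make the weighting concrete, the short sketch below illustrates how a final percentage could be computed under this scheme (a minimal illustration only; the function name and example inputs are hypothetical and are not part of the course materials):

    # Hypothetical illustration of the grading scheme described above (not the
    # course's actual grading code): 4 examinations contribute 80% of the final
    # grade and the weekly table quizzes 20%, with the lowest table quiz dropped.
    def final_grade(exam_pcts, quiz_pcts):
        exam_avg = sum(exam_pcts) / len(exam_pcts)    # mean examination percentage
        kept = sorted(quiz_pcts)[1:]                  # drop the single lowest quiz
        quiz_avg = sum(kept) / len(kept)              # mean of the remaining quizzes
        return 0.80 * exam_avg + 0.20 * quiz_avg

    # Example: 4 examination percentages and 11 weekly table quiz percentages
    # (2 quizzes in the first block, 3 in each of the remaining 3 blocks).
    print(final_grade([88, 92, 85, 90], [100, 90, 80, 95, 100, 85, 90, 100, 95, 70, 100]))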

Cadaveric dissection

Embalmed cadavers were received from the State Anatomical Board, an agency based out of the University of Colorado at Denver and the Health Sciences Center. Each year, 12 to 14 cadavers were used for the dissection course. Around 10 dissected cadavers from the previous year were used for the prosection gross anatomy course on campus and also served as an important tool to guide the students in their dissection. The dissection class was organized into 4 blocks with a focus on the lower limb, thorax/abdomen/pelvis, head/neck, and upper limb. Because the anatomical areas varied in size, difficulty, and detail, students were required to contribute equally to the dissections completed outside of class within their group.

Laboratory examinations and weekly table quizzes

To assess student knowledge in human anatomy, faculty, staff, and graduate teaching assistants tagged anatomical structures for the laboratory examinations on the cadavers the students were working on. A total of 80% to 90% of the tagged structures required identification, whereas 10% to 20% focused on application of knowledge. Preference in choosing examination questions was given to structures from Grant’s Dissector40 or the Atlas of Human Anatomy41 and those that have clinical relevance. These examinations were administered individually in a written format. In addition to the laboratory examinations, table quizzes were implemented primarily to assess dissection quality. The goal of these weekly table quizzes was to monitor dissection quality and to ensure that students were able to identify anatomical structures as well as add contextual information (eg, relevant innervation and blood supply). While this exercise enhanced communication within the small group of dissectors, it also provided the instructors with the opportunity to informally assess individual and group understanding. These table quizzes were composed of 10 structures relevant to the current unit and took place on Monday of each week, testing the material from the previous week. Each question was worth 10% of the quiz’s maximum of 5 points (0.5 points each). On alternating weeks, the table quizzes included an extra credit question worth 0.5 points. The example table quiz in Table 2 illustrates a typical set of structures within the thigh and gluteal region of the lower limb dissection unit. Every week, in preparation for the table quiz, faculty and staff would meet and discuss relevant structures to determine the components of each table quiz. The instructors chose a combination of muscles, arteries, nerves, ligaments, or other important structures in that region that were also described in Grant’s Dissector40 or the Atlas of Human Anatomy.41 Additional questions as well as requirements for receiving credit for each structure were discussed and noted during those meetings. The notes taken during those meetings formed a basis for conversations between faculty and students during the weekly table quizzes and outlined the information in each rubric (example illustrated in Table 3). These meetings were crucial to ensure objectivity of the assessments. Frequently, graduate teaching assistants paired with faculty and staff to assist with this assessment. This assessment tool required the students to clearly identify anatomical structures, trace them, and know important characteristics about the structures, such as the origin and insertion of a muscle or the terminal branches of a vessel or nerve. However, during early implementation in 2012, the table quizzes focused solely on identification and dissection quality of the observed structures. In an effort to constantly improve pedagogical technique, second- and third-order questions were added in 2013 to increase the depth of knowledge tested with this assessment tool.

Table 2.

Example table quiz within lower limb dissection block.

Structures 1-5:
1. Transverse branch of lateral circumflex femoral artery (0.5 points)
2. Superior lateral geniculate artery (0.5 points)
3. Long head of biceps femoris muscle (0.5 points)
4. Obturator nerve (0.5 points)
5. Lateral femoral cutaneous nerve (0.5 points)

Structures 6-10:
6. Sciatic nerve (0.5 points)
7. Vastus medialis muscle (0.5 points)
8. Gluteus medius muscle (0.5 points)
9. Superior gluteal nerve (0.5 points)
10. Obturator internus muscle (0.5 points)
Extra credit: pes anserine (0.5 points)

List of example anatomical structures chosen for a table quiz within the lower limb dissection block. Each of the 10 questions was worth 0.5 points, for a total of 5 points. On alternating weeks, an extra credit structure was given, which increased the total to 5.5 points.

Table 3.

Table quiz rubric.

Muscle requirements:
- Intact throughout region
- Free from fascia/surrounding tissues
- Origin and insertion visible
- Muscle striations and borders visible
- If applicable: complete reflection according to weekly dissection guide

Artery/vein requirements:
- Intact throughout region
- Free from fascia/surrounding tissues
- Traceable through entire region
- Muscular branches visible
- If a branch: root and other branches visible

Nerve requirements:
- Intact throughout region
- Free from fascia/surrounding tissues
- Traceable through entire region
- If applicable: muscular branches visible

Ligament requirements:
- Intact throughout region
- Free from fascia/surrounding tissues
- Attachment points visible

The rubric was used during the table quizzes for grading purposes. Muscles, arteries and veins, nerves, and ligaments each had requirements that needed to be checked during the table quizzes. Full points were given when all requirements were completed. No partial credit was given.
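As a minimal sketch of this all-or-nothing scoring (the function and inputs below are hypothetical illustrations, not the course’s grading code), each structure earns its 0.5 points only when every rubric requirement is met, and an extra credit structure on alternating weeks adds 0.5 points:

    # Hypothetical sketch: each of the 10 structures is worth 0.5 points, awarded
    # only if every rubric requirement for that structure is satisfied (no partial
    # credit); an extra credit structure on alternating weeks adds 0.5 points.
    def table_quiz_score(requirement_checks, extra_credit_met=False):
        # requirement_checks: one list of booleans per structure (True = requirement met)
        score = sum(0.5 for checks in requirement_checks if all(checks))
        if extra_credit_met:
            score += 0.5
        return score  # maximum 5.0, or 5.5 on extra credit weeks

    # Example: 10 structures, 2 of which miss a rubric requirement, plus extra credit
    checks = [[True, True, True]] * 8 + [[True, False, True]] * 2
    print(table_quiz_score(checks, extra_credit_met=True))  # 4.5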

The process of the weekly oral table quiz began with the evaluator naming a structure for the students to identify and giving them 1 minute of small-group discussion. No learning cues were given during this time. The students used that time to share their knowledge with each other and find the anatomical structure on their cadaver. Providing the group time to work collaboratively was important to give each person an equal opportunity to contribute detailed knowledge. Following this discussion time, the evaluator checked the identified structure. In addition, the group had to answer verbal questions from the evaluator, which were not a structured part of this assessment. The grading criteria focused on an assessment of the quality of the dissection. If the entirety of the structure was clearly distinguishable from surrounding structures and the fascia was appropriately removed, the group received the points (Table 3). Even though the quality of the dissection determined the grade received for the table quiz, the evaluator also posed critical thinking questions to interact with the students and create a learning environment. For example, the following questions would be asked for the first structure (transverse branch of the lateral circumflex femoral artery) listed in Table 2:

What area does this artery supply blood to? Where does this artery branch off from? What are other named branches of the same artery? Are there any existing anastomoses?

This verbal interaction was implemented to test student knowledge beyond dissection quality, as no lecture component provided additional learning opportunities and students were required to draw connections between structures independently. Because students’ knowledge varied throughout the cohort, the additional questions asked differed between dissection groups. The discussion of structural relationships and oral questions resulted in approximately 20 to 25 minutes of student-teacher interaction and feedback during each table quiz. Feedback was provided immediately after each structure, ensuring immediate understanding of the material. In addition, each individual student was given the opportunity to communicate with the instructor and work through follow-up questions. Instructors ensured that everyone within the small group participated and contributed to the group grade. This approach allowed the assessment of collaborative as well as individual effort. To make this process more efficient, faculty and staff alternated between groups, interacting with 1 group while another group worked independently to discuss the anatomical structure. This allowed each group time to work independently and to work with the instructor on critical thinking questions, making the process more feasible in a busy cadaver laboratory. Group performance was recorded out of a total of 5 points as grades in an online learning management system (Canvas).

Data analysis

After the grades were collected through the online learning management system (Canvas), the table quiz grades were compared with the laboratory examination scores. For each student, the table quiz grades within 1 dissection block were averaged. The averaged table quiz grades were correlated with the relevant laboratory examination score for each individual student. Linear regression was performed in Microsoft Excel (Microsoft, Redmond, WA). In this study, it was used to make predictions and examine the relationship between table quizzes and examinations. Linear regression was performed in real time after each examination. This analysis allowed the facilitators to predict student outcomes and intervene in their progress to increase student success in the class. All grades were considered in this study, and statistical significance was set at the .01 level. Because the residuals were approximately normally distributed, parametric Pearson correlation was used to analyze the data. The coefficients of determination (R2) and P values for the years 2012, 2013, 2015, 2016, and 2017 were recorded. The data from fall 2014 were corrupted and thus were not part of this study. The regression analyses allowed for longitudinal comparison of the effectiveness of table quizzes at improving student learning and dissection quality across multiple years.
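The authors performed this analysis in Excel; for illustration only, an equivalent analysis could be scripted as in the sketch below. The file name and column names are hypothetical assumptions about how a grade export might be organized, not the study’s actual data layout.

    # Hypothetical recreation of the per-year analysis; the CSV layout and column
    # names are assumed, not taken from the study's actual grade export.
    import pandas as pd
    from scipy import stats

    # Expected columns: student_id, block (1-4), quiz_pct, exam_pct
    grades = pd.read_csv("grades_2016.csv")

    # Average the table quiz percentages within each dissection block for each
    # student, and pair each average with that block's examination percentage.
    paired = (grades
              .groupby(["student_id", "block"])
              .agg(quiz_avg=("quiz_pct", "mean"), exam=("exam_pct", "first"))
              .reset_index())

    # Linear regression of examination score on averaged table quiz score,
    # reporting R^2 and the P value as in Table 4.
    result = stats.linregress(paired["quiz_avg"], paired["exam"])
    print(f"R^2 = {result.rvalue**2:.4f}, P = {result.pvalue:.4f}, n = {len(paired)}")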

Results

Quantitative analysis of student grades revealed a statistically significant positive relationship between the averaged table quiz grades and the student examination scores in 2013, 2015, 2016, and 2017 (Table 4). The number of data points (observations) varied in each year due to changes in enrollment. The linear regression plots provide a visual appreciation of the association between higher table quiz grades and increased examination scores (Figures 1-5).

Table 4.

Results from regression analysis from 2012, 2013, 2015, 2016, and 2017.

Years          2012      2013*     2015*     2016*     2017*
P values       .8774     <.001     .0057     <.0001    <.0001
R2             0.0008    0.1191    0.0395    0.2137    0.2130
Observations   192       184       192       192       220
Range TQ (%)   45-103.4  45-103.4  70-108    40-108    40-110
Range E (%)    34-100    44-102    56-102    66-102    54-102

Summary of the regression analyses shown in Figures 1 to 5. P values determined the significance of the comparison of averaged weekly table quizzes and examination scores in 2012, 2013, 2015, 2016, and 2017. Statistically significant years with P < .01 are indicated with *. R2 values indicate how close the data were to the fitted regression line. The observations include 4 averaged table quiz scores per student, each compared with the respective unit examination score for that student in that year. In 2012, 2015, and 2016, 48 students were enrolled; in 2013, 46 students; and in 2017, 55 students. The range of the averaged table quiz scores (TQ) and examination scores (E) varied throughout the years and is indicated in percent (%) from lowest to highest.

Figure 1.

Linear regression analysis, 2012. Regression analysis comparing averaged table quiz scores and examination scores in each dissection unit in 2012. The maximum score (100%) of each table quiz was 5 points, or 5.5 points with the extra credit question. The maximum score (100%) of each examination was 50 points, or 51 points including the extra credit question. Each student is represented 4 times in this figure due to the 4 dissection blocks. In this year, 48 students were enrolled in the class, resulting in 192 grades.

Figure 2.

Linear regression analysis, 2013. Regression analysis comparing averaged table quiz scores and examination scores in each dissection unit in 2013. The maximum score (100%) of each table quiz was 5 points, or 5.5 points with the extra credit question. The maximum score (100%) of each examination was 50 points, or 51 points including the extra credit question. Each student is represented 4 times in this figure due to the 4 dissection blocks. In this year, 46 students were enrolled in the class, resulting in 184 grades.

Figure 3.

Linear regression analysis, 2015. Regression analysis comparing averaged table quiz scores and examination scores in each dissection unit in 2015. The maximum score (100%) of each table quiz was 5 points, or 5.5 points with the extra credit question. The maximum score (100%) of each examination was 50 points, or 51 points including the extra credit question. Each student is represented 4 times in this figure due to the 4 dissection blocks. In this year, 48 students were enrolled in the class, resulting in 192 grades.

Figure 4.

Linear regression analysis, 2016. Regression analysis comparing averaged table quiz scores and examination scores in each dissection unit in 2016. The maximum score (100%) of each table quiz was 5 points, or 5.5 points with the extra credit question. The maximum score (100%) of each examination was 50 points, or 51 points including the extra credit question. Each student is represented 4 times in this figure due to the 4 dissection blocks. In this year, 48 students were enrolled in the class, resulting in 192 grades.

Figure 5.

Linear regression analysis, 2017. Regression analysis comparing averaged table quiz scores and examination scores in each dissection unit in 2017. The maximum score (100%) of each table quiz was 5 points, or 5.5 points with the extra credit question. The maximum score (100%) of each examination was 50 points, or 51 points including the extra credit question. Each student is represented 4 times in this figure due to the 4 dissection blocks. In this year, 55 students were enrolled in the class, resulting in 220 grades.

In 2013, linear regression of the data showed an R2 of 0.1191, P < .001. In that year, 46 students were enrolled, which resulted in a recording of 184 grades across all 4 dissection units. The data points from the averaged table quizzes ranged from 2.25 to 5.17 points (45%-103.4%), and the examination scores ranged from 22 to 51 points (44%-102%). Linear regression of the 2015 data showed an R2 of 0.0395, P = .0057, with an enrollment of 48 students that year and a collection of 192 grades. The range of averaged table quiz scores was 3.5 to 5.4 points (70%-108%) and 28 to 51 points (56%-102%) for examination scores. The 2016 data showed an R2 of 0.2137, P < .001. The enrollment was 48 students, with a collection of 192 grades. The averaged table quiz scores ranged from 2 to 5.34 points (40%-108%), and the examination scores ranged from 33 to 51 points (66%-102%). Finally, the 2017 data had an R2 of 0.21304, P < .001, with 55 students enrolled that year and a collection of 220 grades. The range of averaged table quiz scores was 2 to 5.5 points (40%-110%), and the range of examination scores was 27 to 51 points (54%-102%).

Data analyzed from grades collected in 2012 showed no significant relationship. R2 from 2012 was 0.0001, P = .8774. In that year, 48 students were enrolled, and 192 grades were collected. The averaged table quiz scores ranged from 2.25 to 5.17 points (45%-103.4%) and the examination scores ranged from 17 to 50 points (34%-100%).

As mentioned previously, the range of averaged table quiz scores varied throughout the years. The data from 2012 and 2013 showed that the range of averaged table quiz scores was the same for those 2 years (58.4%). Surprisingly, the range decreased in 2015 (38%) and increased again in 2016 (68%), with 2017 representing the largest range (70%). In 2015, students performed better on the table quizzes, which resulted in this narrow range of data points. The range of examination scores decreased over the years from 2012 (66%) to 2016 (36%). In 2017, however, the range was again the largest (66%).

An analysis of the means of all table quizzes combined and all examination scores for each year revealed lower means for examination scores than for table quizzes in 2012 and 2015 (Figure 6), indicating that students performed better on table quizzes than on examinations in those years. The reverse relationship was observed in 2013, 2016, and 2017, showing that students performed better on examinations than on table quizzes. The highest table quiz mean was found in 2015 (92.483%) and the lowest in 2013 (78.140%).

Figure 6.

Table quiz and examination score means from 2012, 2013, 2015, 2016, and 2017. Statistical analysis comparing the means of all table quizzes and examination scores for each year ± standard deviation. The first bar of each color represents the mean for table quizzes (TQ), whereas the second bar illustrates the mean of all examination scores (E).

Discussion

In this study, the results indicate that there is a positive relationship between the table quizzes and the laboratory examination scores in the graduate-level human dissection course in the years 2013, 2015, 2016, and 2017. Surprisingly, 2012 did not indicate a significant correlation between the two. Throughout the table quizzes in 2012, faculty and staff focused solely on identification and dissection quality of the observed structures, using no specific grading rubric. The table quizzes took around 10 minutes and did not include an oral component because the students simply had to point to the anatomical structures. Beginning in 2013, faculty and staff added application questions to create a dialogue and feedback between students and faculty. This adjustment and the creation of a more active learning environment may have resulted in the positive relationship observed in subsequent years, which is supported by the literature.12,13,28-34 Because the instructors changed minimally over the years, the stronger correlations in later years may be explained by the instructors improving their skills regarding the table quizzes and their communication with the students. One critical component of this improvement was changing the table quizzes to a more conversation-based assessment and increasing the duration to 20 to 25 minutes with each group. Other studies also support the benefits of immediate feedback and have shown that it can result in higher performance and affect retrieval of information.26-33 Active learning is more engaging than memorization, which might aid in understanding the presented anatomical material on a deeper level.42,43

In our course, we let students self-select their groups, which may have had an impact on how thoroughly they dissected. This also may have influenced the overall mean table quiz and examination scores for each of the years, as seen in Figure 6. The literature suggests that students generally perform better on group examinations than under individual testing circumstances.44-46 This is supported by our data from 2012 and 2015. In addition, it is possible that the high stakes and high point totals of the individual examinations could have affected the different total mean scores.

Strategies to increase active learning in the gross anatomy laboratory and other scientific classes, such as frequent quizzes, have previously been shown to be successful.12,13,28-34,47 One particular study focused on a cadaveric classroom and showed a similar positive correlation between the dissection assessment and the final course grades.36 However, that study focused mainly on dissection quality and not on higher order thinking and anatomical knowledge gained through the weekly quizzes or on how to use this information for targeted intervention. The current study supports these findings while further examining the benefits of this assessment tool. The visual representation of the data (Figures 1-5) aids in tracking student progress and predicting student outcomes throughout the semester, which has previously been shown to help identify at-risk students.37-39 Performing regression analysis after the first examination was especially helpful, as faculty and staff were able to evaluate the students’ progress and provide early help if necessary. Teaching assistants were assigned to give additional instruction to groups who performed poorly on weekly table quizzes. The weekly examination of dissection quality and assessment of student knowledge may be a way of using the given laboratory time effectively to get to know the students and intervene in their learning progress if necessary.37-39 In addition, the weekly quizzes enabled the students to practice retrieval and recall anatomical structures and content within the classroom, as supported by the literature.21-25 These table quizzes also encouraged students to keep up with the material and stay motivated throughout the semester. While not formally assessed, the oral component served as feedback that played a critical role in enhancing student learning, as supported by previous studies.26,27,48,49 Oral examinations can serve as a reliable and objective means to monitor students’ progress in a cadaveric anatomy course.50 According to Johnson and colleagues,51 the supplementation of oral quizzes improved student learning in a laboratory dissection class. These examinations during the dissection served as “spot checks” on anatomical areas, which were followed by faculty advice on how to improve and maintain focus.51 This and other studies reported that the implementation of such assessments keeps students motivated and guides them throughout their dissection.52

Students in this study were required to learn a large amount of information in a short amount of time with no lecture component associated with the course, and it was critical that faculty create an active learning environment to engage students. Frequently, students in a self-directed learning situation feel overwhelmed and are less likely to find learning strategies on their own.53 The course described in this study is primarily self-directed instruction, and the weekly table quizzes are an important activity to engage students in active learning. This form of assessment is similar to the previously described spaced practice, during which the content of 1 dissection unit is spaced out over time and tested throughout the unit instead of only at the end.54,55 In addition to the retention of knowledge gained through weekly table quizzes, the dissection process may have enhanced students’ technical skills as described previously.36

Limitations of this study

To determine whether the table quizzes have an impact on students’ learning and knowledge retention, it would be beneficial to formally assess the oral component of this assessment tool. This study only demonstrated a relationship between dissection quality and examination scores. Analyzing the higher order thinking questions posed during the student-instructor interactions might help determine whether students retained the material and whether it increased their knowledge of human anatomy. Another potential limitation of this study is the lack of a control group. It would be beneficial to compare a group of students exposed to table quizzes and examinations with a group of students only taking examinations. However, eliminating table quizzes for a group of students would prevent faculty from using this quiz as an early gauge of student progress. In this course, the instructors used the table quizzes not only to prepare the students for their upcoming examination but also to interact with the students and recognize weaknesses. Before the table quizzes were implemented, instructors would rely solely on the first examination grade to recognize and reach out to academically at-risk students. While not a formal purpose of this assessment, it has proved invaluable in allowing faculty to intervene for students who are struggling with the material before they receive a failing grade on the first examination. As such, it would be difficult in the current course setup to create a control group without sacrificing student performance.

Another limitation of this study is that the statistical analysis revealed a coefficient of determination (R2) of no more than 0.21 (21% of variance explained), indicating that this model needs improvement. Comparing averaged table quiz and examination scores shows a potentially beneficial correlation but does not directly imply a causal link between improved table quiz scores and examination scores. It is difficult to capture the complexities leading to student examination success. Motivation, time spent dissecting, and understanding of dissection guides were not directly measured in this study. For example, a student who did poorly on table quizzes may make up for it on their individual examinations through increased study effort. In addition, it is important to keep the grading of the table quizzes as objective as possible. The subjectivity of faculty and staff was controlled through a set rubric but ultimately may have influenced group table quiz scores.

Future direction

The research team hopes to continue the implementation of this assessment tool in the dissection classroom, to adjust specific delivery methods, to analyze the results, and to ultimately improve human anatomy instruction. To better assess this method, it is essential to gather qualitative data on how students perceive the table quizzes throughout the semester. This type of data could strengthen the case for using table quizzes in the gross anatomy laboratory. In addition, focusing on the oral component of the table quizzes could reveal more aspects beneficial for teaching in the cadaveric laboratory as well as provide an opportunity to formally assess student knowledge. In the future, student success and feedback will continue to drive innovation in identifying distinct learning methods for optimal knowledge acquisition.

Conclusions

Educational research in the anatomical sciences indicates that assessment tools not only enhance a positive laboratory experience but also provide students with direction and guidance while working with their cadavers during their laboratory time.51,56 While current literature supports the use of assessment tools, further studies are needed to demonstrate the effectiveness of these educational tools in cadaveric laboratories. Thus, this longitudinal study implementing oral table quizzes over 5 years helps to support the use of assessment tools that evaluate dissection quality in cadaveric laboratories. The unique nature of the table quizzes provided the students with the opportunity to practice the retrieval of their knowledge and feel more guided throughout their dissection. In addition, they were able to interact with faculty and staff and receive immediate feedback on their performance throughout the course. This approach allows facilitators to assess dissection quality while also giving them the opportunity to introduce higher order questions.

This approach may be useful for instructors teaching human anatomy in a stand-alone laboratory setting without a lecture component and for instructors seeking ways to work more closely with their students. In addition, this assessment tool might help fill the gap in incorporating active learning strategies in a busy cadaveric laboratory.

Acknowledgments

The materials in this article have previously been presented in the form of a poster presentation at the conference of the American Association of Clinical Anatomists in July 2017. The abstract was published in Clinical Anatomy.

Footnotes

Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.

Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Author Contributions: BG, TC, CM, and NH conceived and planned the experiments. NH carried out the experiments. NH took the lead role in writing the manuscript. All authors contributed to the analysis and interpretation of the results. All authors provided evaluation to help shape the research and manuscript.

ORCID iD: Natascha Heise https://orcid.org/0000-0003-4459-814X

References

  • 1. Association of American Medical Colleges. About the AAMC. https://www.aamc.org/about/. Accessed December 1, 2018.
  • 2. Drake RL, Lowrie DJ, Prewitt CM. Survey of gross anatomy, microscopic anatomy, neuroscience, and embryology courses in medical school curricula in the United States. Anat Rec. 2002;269:118-122. [DOI] [PubMed] [Google Scholar]
  • 3. Drake RL, McBride JM, Lachman N, Pawlina W. Medical education in the anatomical sciences: the winds of change continue to blow. Anat Sci Educ. 2009;2:253-259. [DOI] [PubMed] [Google Scholar]
  • 4. Rizzolo LJ, Rando WC, O’Brien MK, Haims AH, Abrahams JJ, Stewart WB. Design, implementation, and evaluation of an innovative anatomy course. Anat Sci Educ. 2010;3:109-120. [DOI] [PubMed] [Google Scholar]
  • 5. Drake RL, McBride JM, Pawlina W. An update on the status of anatomical sciences education in United States Medical Schools. Anat Sci Educ. 2014;7:321-325. [DOI] [PubMed] [Google Scholar]
  • 6. McBride JM, Drake RL. National survey on anatomical sciences in medical education. Anat Sci Educ. 2017;11:7-14. [DOI] [PubMed] [Google Scholar]
  • 7. Kerby J, Shukur ZN, Shalhoub J. The relationships between learning outcomes and methods of teaching anatomy as perceived by medical students. Clin Anat. 2011;24:489-497. [DOI] [PubMed] [Google Scholar]
  • 8. Vasan NS, DeFouw DO, Compton S. Team-based learning in anatomy: an efficient, effective, and economical strategy. Anat Sci Educ. 2011;4:333-339. [DOI] [PubMed] [Google Scholar]
  • 9. Kamei RK, Cook S, Puthucheary J, Starmer CF. 21st century learning in medicine: traditional teaching versus team-based learning. Med Sci Educ. 2012;22:57-64. [Google Scholar]
  • 10. Prober CG, Heath C. Lecture halls without lectures—a proposal for medical education. N Engl J Med. 2012;366:1657-1659. [DOI] [PubMed] [Google Scholar]
  • 11. Rezaei AR. Frequent collaborative quiz taking and conceptual learning. Active Learn High Educ. 2015;16:187-196. [Google Scholar]
  • 12. Daniel DB, Broida J. Using web-based quizzing to improve exam performance: lessons learned. Teach Psychol. 2004;31:207-208. [Google Scholar]
  • 13. Marcell M. Effectiveness of regular online quizzing in increasing class participation and preparation. Int J Scholarship Teach Learn. 2008;2:7. [Google Scholar]
  • 14. Rezaei AR, Lovorn M. Reliability and validity of rubrics for assessment through writing. Assess Writ. 2010;15:18-39. [Google Scholar]
  • 15. Phelps RP. The effect of testing on student achievement, 1910–2010. Int J Test. 2012;12:21-43. [Google Scholar]
  • 16. Karpicke JD. Retrieval-based learning: active retrieval promotes meaningful learning. Curr Dir Psychol Sci. 2012;21:157-163. [Google Scholar]
  • 17. Roediger HL, III, Guynn MJ. Retrieval processes. In: Bjork EL, Bjork RA, eds. Memory. 1st ed. San Diego, CA: Academic Press; 1996:197-236. [Google Scholar]
  • 18. Tulving E, Pearlstone Z. Availability vs. accessibility of information in memory for words. J Verb Learn Verb Behav. 1966;5:381-391. [Google Scholar]
  • 19. Roediger HL., III Why retrieval is the key process in understanding human memory. In: Tulving E, ed. Memory, Consciousness, and the Brain: The Tallinn Conference. 1st ed. Philadelphia, PA: Psychology Press; 2000:52-75. [Google Scholar]
  • 20. Karpicke JD, Grimaldi PJ. Retrieval-based learning: a perspective for enhancing meaningful learning. Educ Psychol Rev. 2012;24:401-418. [Google Scholar]
  • 21. Mayer RE, Stull AT, DeLeeuw K, et al. Clickers in college classrooms: fostering learning with questioning methods in large lecture classes. Contemp Educ Psychol. 2009;34:51-57. [Google Scholar]
  • 22. Roediger HL, III, Agarwal PK, McDaniel MA, McDermott KB. Test-enhanced learning in the classroom: long-term improvements from quizzing. J Exp Psychol Appl. 2011;17:382-395. [DOI] [PubMed] [Google Scholar]
  • 23. Weinstein Y, Nunes LD, Karpicke JD. On the placement of practice questions during study. J Exp Psychol Appl. 2016;22:72-84. [DOI] [PubMed] [Google Scholar]
  • 24. Karpicke JD, Roediger HL., III Repeated retrieval during learning is the key to long-term retention. J Mem Lang. 2007;57:151-162. [Google Scholar]
  • 25. Karpicke JD, Roediger HL., III The critical importance of retrieval for learning. Science. 2008;319:966-968. [DOI] [PubMed] [Google Scholar]
  • 26. Kang SH, McDermott KB, Roediger HL., III Test format and corrective feedback modify the effect of testing on long-term retention. Eur J Cognit Psychol. 2007;19:528-558. [Google Scholar]
  • 27. Roediger HL, III, Butler AC. The critical role of retrieval practice in long-term retention. Trends Cogn Sci. 2011;15:20-27. [DOI] [PubMed] [Google Scholar]
  • 28. Armbruster P, Patel M, Johnson E, Weiss M. Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sci Educ. 2008;8:203-213. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29. Ettarh R. A practical hybrid model of application, integration, and competencies at interactive table conferences in histology (ITCH). Anat Sci Educ. 2016;9:286-294. [DOI] [PubMed] [Google Scholar]
  • 30. Mennin SP, Krackov SK. Reflections on relevance, resistance, and reform in medical education. Acad Med. 1998;73:S60-64. [DOI] [PubMed] [Google Scholar]
  • 31. Pascoe JM, Babbott D, Pye KL, Rabinowitz HK, Veit KJ, Wood DL. The UME-21 project: connecting medical education and medical practice. Fam Med. 2004;36:S12-S14. [PubMed] [Google Scholar]
  • 32. Woods NN. Science is fundamental: the role of biomedical knowledge in clinical reasoning. Med Educ. 2007;41:1173-1177. [DOI] [PubMed] [Google Scholar]
  • 33. Woods NN, Brooks LR, Norman GR. The role of biomedical knowledge in diagnosis of difficult clinical cases. Adv Health Sci Educ Theory Pract. 2007;12:417-426. [DOI] [PubMed] [Google Scholar]
  • 34. Hofer RE, Nikolaus OB, Pawlina W. Using checklists in a gross anatomy laboratory improves learning outcomes and dissection quality. Anat Sci Educ. 2011;4:249-255. [DOI] [PubMed] [Google Scholar]
  • 35. Halliday N, O’Donoghue D, Klump KE, Thompson B. Human structure in six and one-half weeks: one approach to providing foundational anatomical competency in an era of compressed medical school anatomy curricula. Anat Sci Educ. 2015;8:149-157. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36. Nwachukwu C, Lachman N, Pawlina W. Evaluating dissection in the gross anatomy course: correlation between quality of laboratory dissection and student outcomes. Anat Sci Educ. 2014;8:45-52. [DOI] [PubMed] [Google Scholar]
  • 37. Meier Y, Xu J, Atan O, van der Schaar M. Predicting grades. IEEE T Signal Proces. 2016;64:959-972. [Google Scholar]
  • 38. Elbadrawy A, Polyzou A, Ren Z, Sweeney M, Karypis G, Rangwala H. Predicting student performance using personalized analytics. Computer. 2016;49:61-69. [Google Scholar]
  • 39. Junco R, Clem C. Predicting course outcomes with digital textbook usage data. Internet High Educ. 2015;27:54-63. [Google Scholar]
  • 40. Detton AJ. Grant’s Dissector. 16th ed. Philadelphia, PA: Wolters Kluwer Health/Lippincott Williams & Wilkins; 2017. [Google Scholar]
  • 41. Netter FH. Atlas of Human Anatomy. 7th ed. Philadelphia, PA: Saunders/Elsevier Inc; 2018. [Google Scholar]
  • 42. Dangerfield P, Bradley P, Gibbs T. Learning gross anatomy in a clinical skills course. Clin Anat. 2000;13:444-447. [DOI] [PubMed] [Google Scholar]
  • 43. Lujan HL, DiCarlo SE. First-year medical students prefer multiple learning styles. Adv Physiol Educ. 2006;30:13-16. [DOI] [PubMed] [Google Scholar]
  • 44. Rao SP, Collins HL, DiCarlo SE. Collaborative testing enhances student learning. Adv Physiol Educ. 2002;26:37-41. [DOI] [PubMed] [Google Scholar]
  • 45. Giuliodori M, Lujan H, DiCarlo S. Collaborative group testing benefits high- and low-performing students. Adv Physiol Educ. 2008;32:274-278. [DOI] [PubMed] [Google Scholar]
  • 46. Leight H, Saunders C, Calkins R, Withers M. Collaborative testing improves performance but not content retention in a large-enrollment introductory biology class. CBE Life Sci Educ. 2012;11:392-401. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47. Kamuche FU. Do weekly quizzes improve student performance? Acad Exchange Q. 2005;9:188-193. [Google Scholar]
  • 48. McDaniel MA, Agarwal PK, Huelser BJ, McDermott KB, Roediger HL., III Test-enhanced learning in a middle school science classroom: the effects of quiz frequency and placement. J Educ Psych. 2011;103:399-414. [Google Scholar]
  • 49. McDaniel MA, Roediger HL, III, McDermott KB. Generalizing test-enhanced learning from the laboratory to the classroom. Psychon Bull Rev. 2007;14:200-206. [DOI] [PubMed] [Google Scholar]
  • 50. Lukić IK, Glunčić V, Katavić V, Petanjek Z, Jalšovec D, Marušić A. Weekly quizzes in extended-matching format as a means of monitoring students’ progress in gross anatomy. Ann Anat. 2001;183:575-579. [DOI] [PubMed] [Google Scholar]
  • 51. Johnson EO, Charchanti AV, Troupis TG. Modernization of an anatomy class: from conceptualization to implementation. A case for integrated multimodal-multidisciplinary teaching. Anat Sci Educ. 2012;5:354-366. [DOI] [PubMed] [Google Scholar]
  • 52. Percac S, McArdle PJ. Anatomy teaching: students’ perceptions. Surg Radiol Anat. 1997;19:315-317. [DOI] [PubMed] [Google Scholar]
  • 53. Smythe G, Hughes D. Self-directed learning in gross human anatomy: assessment outcomes and student perceptions. Anat Sci Educ. 2008;1:145-153. [DOI] [PubMed] [Google Scholar]
  • 54. Carpenter SK, Cepeda NJ, Rohrer D, Kang SH, Pashler H. Using spacing to enhance diverse forms of learning: review of recent research and implications for instruction. Educ Psychol Rev. 2012;24:369-378. [Google Scholar]
  • 55. Putnam AL, Sungkhasettee VW, Roediger HL., III Optimizing learning in college: tips from cognitive psychology. Perspectives Psychol Sci. 2016;11:652-660. [DOI] [PubMed] [Google Scholar]
  • 56. Sugand K, Abrahams P, Khurana A. The anatomy of anatomy: a review for its modernization. Anat Sci Educ. 2010;3:83-93. [DOI] [PubMed] [Google Scholar]
