Abstract
BACKGROUND
Medical education integrates skills training and simulation to prepare students for clinical tasks. A seminar on interventional radiology was restructured to include specific practical training utilizing a 3D-catheter model. We aimed to investigate the complex interplay between students' evaluations, their visual-spatial ability, and their practical performance.
METHODS
The seminar comprised a short plenary introduction followed by 3 practical training units. Students were tested for their visual-spatial ability and their catheter insertion performance. Students rated the seminar and their interest in the subject. Data were subjected to descriptive, factorial, regression, and moderating analysis.
RESULTS
A total of 141 medical students enrolled in the seminar. They rated its didactic and practical quality highly and expressed great interest in the subject. Male students outperformed female students in the cube perspective test. In the practical examination, males needed significantly less time on average (57.9 s) than females (73.1 s). However, there were no significant differences in the performance score (maximum of 5 attainable points): males 4.61, females 4.51. The seminar evaluation explained a large portion of the variance (48.6%) in students’ interest in the subject. Practical quality moderated the link between the cube perspective test and the practical examination (β = 0.12, P < .05): a high rating of practical quality could partly compensate for low cube perspective scores, enhancing performance in the practical examination.
CONCLUSIONS
Well-designed practical courses and a perceived high teaching quality may assist students with deficits in visual-spatial ability to acquire clinical-practical skills. Such initiatives not only enhance learning outcomes across diverse student groups but also stimulate interest in specialized fields like interventional radiology, thereby potentially guiding future career paths in medicine.
Keywords: psychometric evaluation, interventional radiology, medical education, visual-spatial ability, simulation in education, moderating analysis
Introduction
Over the past decades, there has been an ever-increasing effort to include simulation and hands-on practical training in healthcare education. 1 Clinically oriented skills training supervised by experienced instructors aims to help students and future doctors develop proficiency in patient care. Practical training addresses learners’ needs for competence, relatedness, and autonomy, thereby fostering intrinsic motivation, as outlined by self-determination theory. 2 Satisfying these needs is considered highly valuable, as intrinsic motivation outperforms extrinsic motivation and thus guides people's behavior toward effective learning. 3 There is also evidence that students’ interest in specialties for postgraduate training benefits from positive learning experiences during practical training in undergraduate education. 4
The field of radiology, similar to many others, significantly emphasizes the recruitment of highly motivated and skilled graduates. Research indicates that active engagement with medical students, especially in clinical activities, is a key factor in shaping their future career choices. 5 Recent research has shown that, due to the interdisciplinary relevance of imaging, there is a strong desire among medical students for increased radiology teaching in their curriculum. 6 This finding led teaching staff to proactively create positive experiences. 7 Vogel and Harendza 8 postulated a balanced mix of self-study and individual practice supervised by trained instructors to be an effective teaching method.
Recently, tests designed to predict radiological expertise have been developed and published. 9 Despite some studies indicating that radiologists may not have better visual skills compared to nonexperts, 10 Birchall et al suggested that visual-spatial ability is crucial in choosing future radiologists. 11 Studies in the past already demonstrated that visual-spatial ability affects performance in medical and surgical tasks such as in vascular surgery, endoscopy, or interventional radiology. 12 Louridas et al 13 tried to correlate visual-spatial ability with baseline laparoscopic proficiency in novice trainees in surgery. With respect to orientation in the 2D-3D space, they found that the ability to navigate a laparoscopic camera was linked to participants’ scores on a cube-comparison test which was used during the initial selection process. Although there is extensive research on predicting surgical proficiency, 14 the specific field of (interventional) radiology has not been thoroughly studied in terms of forecasting performance.
Recent studies highlight how advanced technologies can transform student learning outcomes. For instance, Rangier et al demonstrated that using interactive 3D image postprocessing software in undergraduate radiology education improves students’ diagnostic skills, radiological reasoning, and visual-spatial abilities. 15 Medical educators increasingly utilize 3D anatomical models and prints to depict anatomical structures, thereby reducing unnecessary cognitive strain.16,17
In a previous study, we developed and validated a questionnaire to assess the quality of didactic and practical courses. 18 Students expressed high satisfaction, appreciating both the theoretical knowledge gained and the practical skills acquired, such as catheter insertion into a 3D-model of the aorta. However, the relationship between student evaluations, gender, interest in the subject, practical skills delivery, and the impact of visual-spatial abilities on educational outcomes is still unclear. Despite the known importance of skills training in medical education, the best way to include interventional radiology in undergraduate courses is still unclear. Our study introduces a detailed seminar with practical skills training in the fourth year to examine these complex relationships. We investigated the following research questions:
- Do gender differences impact the performance in visual-spatial and practical skills tests?
- Does students’ evaluation of the seminar moderate the link between their visual-spatial ability and practical skills?
- Can students’ interest in interventional radiology be predicted from how they rate the seminar's didactic and practical components?
Methods
Design and Participants
This prospective cross-sectional study was conducted within a medical degree course leading to the German state examination (Würzburg), offering a standard 6-year curriculum comprising 2 preclinical years of teaching, 3 clinical years, and 1 practical year of training. In the fourth year, teaching in radiology comprises lecture series and 12 mandatory seminars for the entire student cohort. This study was conducted during the winter semester of 2018/2019 across 12 sessions of a 120-min interventional radiology seminar. Each session had up to 14 students, allowing all 141 semester students to participate. The study rationale was explained, and students were informed that participation was entirely voluntary and that choosing not to participate would not affect their degree course results. All participants signed informed consent statements. The reporting of this study conforms to the Standards for Quality Improvement Reporting Excellence for Education (SQUIRE-EDU). 19 The SQUIRE-EDU items and their corresponding examples for this study are listed in Supplement 1.
The seminar's format, previously detailed, includes an initial brief lecture followed by 3 practical training sessions. 18 In this study, the participants’ visual-spatial ability was measured at the beginning with a written mental rotation test and the cube perspective test. These 2 tests comprise 3 mental rotation figures (2 options each and a maximum of 2 points per figure) and 6 cube perspectives (1 point each), respectively.20,21 After completion of the 3 training units, students then took the practical examination to insert a catheter into a 3D-model of the aorta; the maximum time to complete the task was limited to 240 s. An illustration of the 3D-model of the aorta (produced by HumanX Medical LLC, Saint Petersburg, USA) used is presented in Figure 1.
Figure 1.
3D-model of the aorta with transfemoral catheter inserted into the truncus coeliacus (a. splenica).
Experienced resident doctors and consultants assessed student performance using a 5-item checklist; each item was rated up to a maximum of 1 point, and partial points could be awarded (Table 1). The maximum total score achievable was 5 points. Students exceeding the time limit were not considered for the calculation of the mean time needed, as they were stopped from completing the task. The survey was conducted at the end of the seminar, where students rated the didactic quality (5 items), practical quality (6 items), and their interest in the subject of interventional radiology (4 items). Responses were given on a 5-point Likert scale (1 = I do not agree at all, to 5 = I totally agree). The questionnaire used in the study, known as “Radio-Prak,” had been validated in a prior study. 18 This validation process demonstrated that the questionnaire is a highly reliable and valid tool for assessing the quality of clinical-practical seminars.
Table 1.
Checklist (Form With Rating Scales) to Assess Student Performance in the Practical Examination.a
1. The student placed the wire at the target location (splenic artery); the catheter lies in the coeliac trunk.a | [_] yes [_] no |
2. The student followed the rule “wire before catheter” strictly during the catheterization procedure.b | [_] always [_] partially [_] never |
3. I helped the student a lot during the exercise.c | Does not apply at all [_] [_] [_] [_] [_] applies completely |
4. The student demonstrated smooth handling skills. In order to probe the splenic artery successfully, she/he required:b | [_] one attempt [_] two attempts [_] >two attempts |
5. The student successfully completed the exercise within the time limit of 4 minutes.a | [_] yes [_] no |
To create the score, the following approach was used:
aYes = 1 point, no = 0 points.
bAlways / 1 attempt = 1 point, partially / 2 attempts = 0.5 points, never / more than 2 attempts = 0 points.
cDoes not apply at all = 1 point, selection left of center = 0.75 points, selection center = 0.5 points, selection right of center = 0.25 points, applies completely = 0 points.
Each item on the checklist allows for a maximum score of 1 point.
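The scoring rules above can be condensed into a small helper function. This is a hypothetical illustration derived from the table footnotes; the function name, signature, and argument encodings are ours and were not part of the original study materials:

```python
def checklist_score(wire_placed: bool,
                    wire_before_catheter: str,   # "always" | "partially" | "never"
                    help_rating: int,            # 1 (does not apply at all) .. 5 (applies completely)
                    attempts: int,               # attempts needed to probe the splenic artery
                    within_time_limit: bool) -> float:
    """Compute the 5-point checklist score described in the table footnotes."""
    score = 0.0
    score += 1.0 if wire_placed else 0.0                       # item 1: yes/no
    score += {"always": 1.0, "partially": 0.5, "never": 0.0}[wire_before_catheter]  # item 2
    score += (5 - help_rating) * 0.25                          # item 3: rating 1 -> 1.0, 5 -> 0.0
    score += 1.0 if attempts == 1 else 0.5 if attempts == 2 else 0.0  # item 4
    score += 1.0 if within_time_limit else 0.0                 # item 5
    return score
```

A flawless performance (all items at maximum) thus yields the full 5 points, matching the checklist's stated maximum.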
Statistical Analysis
Calculations and statistical procedures followed the recommendations of the World Health Organization 22 and the guidelines provided by the International Test Commission. 23 The software suite “IBM SPSS Statistics 26” was employed for the descriptive analysis and extended by the macro “PROCESS” developed by Hayes 24 for the factor and moderating analyses.
Descriptive Analysis
Descriptive analysis included item mean scores (M), standard deviations (SD), and minimum/maximum values. To analyze reliability, Cronbach's alpha was calculated to assess the internal consistency of the questionnaire subscales and the visual-spatial ability tests. A correlation analysis (Pearson correlation r) was performed to evaluate the relationships between the 6 measures (evaluation of the didactic and practical quality, interest in the subject of interventional radiology, visual-spatial ability test score, practical examination score, and time to complete the aortic model test). Analysis of variance (ANOVA) results were reported as an F-statistic with its associated degrees of freedom and P value. Significant results are indicated as *P < .05, **P < .01, and ***P < .001.
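For illustration, Cronbach's alpha can be computed directly from an item-response matrix. The sketch below is our own minimal implementation of the standard formula, not the SPSS procedure actually used in the study:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) response matrix."""
    k = items.shape[1]                            # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)         # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Pearson correlations between the 6 measures can be obtained analogously with `np.corrcoef` on the column-stacked measures.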
Factorial Validity
The factorial structure of the questionnaire was based on the exploratory factor analysis of the previous study. 18 A confirmatory factor analysis (CFA) was run and the following indices were checked for model fit: the comparative fit index (CFI), 25 the standardized root mean square residual (SRMR), 26 and the root mean square error of approximation (RMSEA). 27 CFI values greater than 0.9 were required to rule out misspecification; values greater than 0.95 indicated a very good model fit. SRMR values below 0.08 were considered good, and values below 0.05 pointed to a very good model fit. 28 RMSEA values needed to be below 0.05. Chi-square (χ2)/degrees of freedom (df) were used to test whether a specific model better suited the data.
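These cut-offs can be expressed as a small classification helper. This is our own illustrative function; the 0.05–0.08 "borderline" band for RMSEA is a common convention we assume here, not a threshold stated in the text:

```python
def assess_fit(cfi: float, srmr: float, rmsea: float) -> dict:
    """Classify CFA fit indices against the cut-off values used in this study."""
    return {
        "CFI": "very good" if cfi > 0.95 else "acceptable" if cfi > 0.90 else "poor",
        "SRMR": "very good" if srmr < 0.05 else "good" if srmr < 0.08 else "poor",
        "RMSEA": "good" if rmsea < 0.05 else "borderline" if rmsea < 0.08 else "poor",
    }
```

Applied to the values reported in the Results (CFI = 0.961, SRMR = 0.056, RMSEA = 0.054), this yields a very good CFI, with SRMR and RMSEA falling just outside their strictest thresholds.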
Regression Analysis
Regressions corresponding to the findings from the correlation analysis were calculated. As part of the regression analysis, group differences were determined by ANOVA. 29 First, the scores of the visual-spatial ability tests were set in relation to the practical performance scores and the time needed. Second, differences in interest in the subject of interventional radiology were calculated based on the assessment of the didactic and practical quality. Finally, the link between the seminar evaluation and the scores of the visual-spatial ability test and the practical examination was examined in terms of inference statistics through mediation and moderation analysis.
Moderating Analysis
Potential moderating effects of the test scores and values (visual-spatial ability, practical performance, time needed for the task) and the seminar evaluation on the regressive relationships were examined. The 4 requirements for robustness in moderating analysis according to Hayes 24 were scrutinized at the outset. First, the moderating analysis was based on linear regression. Second, the residuals should follow a normal distribution. Third, the calculations require homoscedasticity, a main requirement already needed for the ANOVA. Finally, the error relating to a given data point should be independent of the errors for other data points. We computed confidence intervals by bootstrapping with 5000 samples.
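The core of such a moderating analysis, a linear regression with an interaction term plus a percentile bootstrap for the interaction's confidence interval, can be sketched as follows. This is a simplified stand-in for the PROCESS macro, with mean-centered predictors; all function and variable names are ours:

```python
import numpy as np

def moderation_fit(x, m, y):
    """OLS of y on predictor x, moderator m, and their interaction.
    Returns [intercept, b_x, b_m, b_interaction]; x and m are mean-centered."""
    xc, mc = x - x.mean(), m - m.mean()          # centering eases interpretation
    X = np.column_stack([np.ones_like(xc), xc, mc, xc * mc])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def interaction_ci(x, m, y, n_boot=5000, seed=0):
    """Percentile bootstrap 95% CI for the interaction coefficient."""
    rng = np.random.default_rng(seed)
    n = len(y)
    boots = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)              # resample cases with replacement
        boots[b] = moderation_fit(x[idx], m[idx], y[idx])[3]
    return np.percentile(boots, [2.5, 97.5])
```

A moderating effect is supported when the bootstrapped interval for the interaction coefficient excludes zero; a negative coefficient corresponds to the flattening slopes described later in the Results.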
Results
A total of 141 medical students took part in the study, of whom 74 (52.5%) were female.
Descriptive Statistics of the Questionnaire
The 2 dimensions of the seminar quality (didactic and practical) and the interest in the subject of radiology were evaluated (Table 2). The item mean scores ranged between 3.40 and 4.90, with SDs between 0.37 and 1.00. Students generally rated the didactic and practical quality highly; however, the item “Thanks to the seminar, I improved my visual-spatial ability” had the lowest rating, achieving a mean item score of only 3.40. Interest in the subject was also generally rated highly, apart from one lower rating (mean item score of 3.46) for the item “Through the exercises, I feel well prepared for my later clinical work with patients.”
Table 2.
Descriptive Statistics of the Questionnaire Items Used for the Seminar Evaluation.a
Abbreviations | Wording of the item | M | SD | Min | Max |
---|---|---|---|---|---|
Didactic quality | |||||
Did1 | The topics were presented in an understandable manner. | 4.75 | 0.48 | 2 | 5 |
Did2 | The lecture part prepared me well for the following practical part. | 4.42 | 0.66 | 2 | 5 |
Did3 | The lecturer managed to arouse enthusiasm in the topic. | 4.64 | 0.56 | 2 | 5 |
Did4 | Practice and theory units were balanced well with each other. | 4.77 | 0.47 | 3 | 5 |
Did5 | The lecturer encouraged questions and active participation. | 4.74 | 0.54 | 2 | 5 |
Practical quality | |||||
Prac1 | I was able to develop my practical skills further. | 4.46 | 0.76 | 2 | 5 |
Prac2 | Thanks to the seminar, I improved my visual spatial ability. | 3.40 | 1.00 | 1 | 5 |
Prac3 | I was able to learn and practice well with the simulators. | 4.51 | 0.71 | 2 | 5 |
Prac4 | The practical part added substantial value to the seminar. | 4.90 | 0.37 | 3 | 5 |
Prac5 | I learned a lot during the seminar. | 4.43 | 0.65 | 3 | 5 |
Prac6 | After the course, I feel able to perform the task of probing a vessel on a vascular model. | 4.46 | 0.69 | 2 | 5 |
Interest in the subject of interventional radiology | |||||
Int1 | The seminar has sparked my interest in the subject of interventional radiology. | 4.20 | 0.83 | 2 | 5 |
Int2 | The seminar had a positive influence on my view of the subject of interventional radiology. | 4.49 | 0.73 | 1 | 5 |
Int3 | Through the exercises, I feel well prepared for my later clinical work with patients. | 3.46 | 0.91 | 1 | 5 |
Int4 | I understood the methodology of interventional radiology and its importance to everyday clinical practice. | 4.61 | 0.63 | 1 | 5 |
Abbreviations: SD, standard deviation; CFA, confirmatory factor analysis.
The abbreviations are listed for the use of the CFA.
The internal consistency of the questionnaire was considered very good, with an overall Cronbach's alpha of 0.87. The subscale reliabilities (Cronbach's alpha) were 0.75 (didactic quality), 0.72 (practical quality), and 0.68 (interest in the subject of interventional radiology).
Confirmatory Factor Analysis
The results of the CFA are illustrated in Figure 2. The different model-fit parameters indicated good results overall: The CFI reached a value of 0.961, ruling out misspecification. The RMSEA, at 0.054, only slightly exceeded the 0.05 cut-off, as did the SRMR, at 0.056, relative to the very-good threshold. Fourteen items contributed significantly to their respective factors: “didactic quality” (5 items), “practical quality” (6 items), and “interest in the subject of interventional radiology” (3 items). Only the loading of the item “Through the exercises, I feel well prepared for my later clinical work with patients” onto its corresponding superordinate factor was not significant.
Figure 2.
Factorial structure of the questionnaire and standardized factor loadings. Significant loadings are indicated by *P < .05.
Descriptive Test Results of Visual-Spatial Ability and Practical Skills
The student scores in the mental rotation test were generally higher (M = 3.71, SD = 1.67) than in the cube perspective test (M = 2.92, SD = 1.27). Cronbach's alpha for the mental rotation test was considered very good (0.83), whereas the cube perspective test showed low internal consistency (0.19), possibly due to the heterogeneity of test items or multidimensionality. The distribution of mental rotation scores showed 2 peaks, while the cube perspective scores more closely resembled a Gaussian curve. Male students outperformed the females in the cube perspective test, P < .05 (Figure 3).
Figure 3.
Relative numbers of students (y-axis) achieving point scores relating to performance (x-axis) in the mental rotation test (A) and the cube perspective test (B), differentiated according to gender; respective descriptive values and gender differences are listed below.
The skill performance of students was measured by 2 parameters: the score of a practical examination involving insertion of a catheter into a 3D-model of the aorta and the time needed to complete the task. A high proportion of students (62.0% of all male and 64.9% of all female students) achieved the maximum score, with no significant differences between male and female students (Figure 4). Completion times ranged from 14 to 174 s, with an average of 66.94 s (SD = 42.82). Male students were faster (57.9 s) than females (73.1 s), P < .05. Of note, data from 3 students had to be excluded, as they were unable to complete the task within the time limit.
Figure 4.
Relative numbers of students (y-axis) achieving point scores relating to performance (maximum of 5 points) (x-axis) in the practical examination to insert a catheter into a 3D-model of the aorta, differentiated according to gender; respective descriptive values and gender differences are listed below.
Correlation Analysis
A correlation coefficient matrix between the different measures is given in Table 3. Positive correlations were determined between the “cube perspective test” and the “practical examination” as well as the “time needed for the task.” The latter 2 even displayed the strongest correlation. The factor “didactic quality” correlated highly with both the factors “practical quality” and “interest in the subject.” Furthermore, “practical quality” correlated with “the interest in the subject.”
Table 3.
Correlations Between the Different Measures.
Cube perspective test | Practical examination | Time needed for the task | Didactic quality | Practical quality | Interest in the subject | |
---|---|---|---|---|---|---|
Cube perspective test | 1 | 0.37a | −0.23b | 0.00 | 0.08 | 0.05 |
Practical examination | 1 | −0.77a | 0.04 | 0.17 | 0.05 | |
Time needed for the task | 1 | −0.01 | 0.20 | −0.05 | ||
Didactic quality | 1 | 0.55a | 0.70a | |||
Practical quality | 1 | 0.68a | ||||
Interest in the subject | 1 |
aP < .001.
bP < .01.
Regression Analysis
The correlations with very high significance (P < .001) were subjected to regression analysis (ANOVA). The results are listed in Table 4. A large portion of the variance in “interest in the subject” was explained by either the evaluation of the “didactic quality” (48.6%) or the “practical quality” (45.7%) of the seminar. There was only a moderate relationship of 29.9% between the didactic and the practical quality and of 13.3% between the cube perspective test and the practical examination.
Table 4.
Results of Regression Analysis (ANOVA).
df | F-value | Significance of ANOVA | Regression coefficient | Constant | R² (=explained variance) | ||||
---|---|---|---|---|---|---|---|---|---|
Cube perspective test / Practical examination | 1 | 20.869 | <.001 | 0.560a | 7.298 | 13.3% | |||
Cube perspective test / Time needed for the task | 1 | 7.890 | .006 | −10.029a | 89.572 | 5.5% | |||
Didactic quality / Interest in the subject | 1 | 133.363 | <.001 | 1.005a | −0.509 | 48.6% | |||
Practical quality / Interest in the subject | 1 | 118.821 | <.001 | 0.826a | 0.571 | 45.7% | |||
Didactic quality /practical quality | 1 | 60.128 | <.001 | 0.463a | 2.633 | 29.9% |
aP < .001.
Moderating Analysis
The effect of the seminar evaluation on the relationship between the cube perspective test score and the practical performance is demonstrated in Figure 5. Evaluation of the didactic quality did not moderate this relationship. In contrast, evaluation of the practical quality negatively moderated the relationship between performance in the cube perspective test and the practical examination. This is illustrated by the 3 lines for different rating levels, whose gradient flattened with increasing ratings on the practical quality scale. In other words, students who rated the seminar as being of high practical quality (indicated by the green line) were able, to a certain degree, to compensate for particularly low scores in the cube perspective test and improve their performance in the practical examination. Of note, no significant moderating effects were found for the relationship between didactic or practical quality and the interest in the subject, nor between didactic and practical quality.
Figure 5.
Moderating effect of student evaluation of the didactic (A) and practical quality (B) on the relationship between performance in the cube perspective test and the practical examination. The 3 lines represent different levels of student rating: the mean scale score, on a scale from 1 to 5, are presented here in the relevant excerpt (average rating = red line), deviation + 1 SD (high rating = green line) and deviation −1 SD (low rating = blue line).
Discussion
In the methodological introduction of our study, we demonstrated that the questionnaire was a reliable, valid, and feasible tool to measure the teaching quality of a seminar which included practical elements of training. The questionnaire comprised 3 factors, which reflected the high degree of satisfaction students have with the teaching concept and their interest in the subject of interventional radiology. Furthermore, the students’ performance in practical skills and the impact of visual-spatial ability were assessed as influencing components on both the didactic and practical quality of the seminar and on interest in the subject of interventional radiology. In the following, we discuss the main findings in relation to the research questions posed at the outset.
The study initially aimed to examine the influence of gender-specific differences on actual performance in visual-spatial ability and practical skills, using specific tests to gauge this among participants. The literature outlines various methodologies for assessing visual-spatial ability. McGee identified 2 components of visual-spatial ability: spatial visualization, which encompasses the capacity to mentally manipulate, rotate, twist, or deform objects independently of one's own orientation; and spatial orientation, referring to the ability to conceptualize how an object looks from different viewpoints. 30 In our study, we included abbreviated versions of the cube perspective test and the mental rotation test, which are part of the aptitude test for entry into medical school in Germany, 31 as they serve as potential assessments of spatial orientation. Further, we examined dynamic spatial ability and environmental ability, as proposed in the literature, 32 by conducting a practical examination in which participants inserted a catheter into a 3D-model of the aorta employing the Seldinger technique.
In the present study, we observed that male students significantly outperformed females in the cube perspective test. However, there were no significant differences in the performance scores of the practical examination between male and female students, though males completed the task more quickly. This aligns with existing research that notes a male advantage in overall spatial performance. 33 Acknowledging these gender-based differences in spatial abilities (though not in general intelligence) underscores the need for educational strategies that address such disparities to promote fairness. As such, our seminar concept served its purpose: it compensated for the disadvantage reflected in the significantly diverging performance in the cube perspective test, in line with principles of diversity-sensitive education.
The second issue of the study addressed the question, as to whether evaluation of the seminar quality influences the relationship of visual-spatial ability and practical skills performance. We found that students who perceived the practical quality of the seminar to be high could to some extent compensate for a low score in the cube perspective test to the benefit of improving their performance in the practical examination. With respect to the practical quality, a significant moderating effect was noticeable, a phenomenon that did not extend to the evaluation of didactic quality. One has to assume that students rating the practical quality of the seminar highly were prone to being less dependent on their visual-spatial ability to solve the task of inserting the catheter into the 3D-model of the aorta. There is evidence in the literature of the beneficial effects of practical training on learners’ visual-spatial ability. Kass et al demonstrated that even brief training could remove the gender differences between participants with lasting effect. 34 Several studies have demonstrated that simulation training plays a crucial role in identifying personal strengths and weaknesses. 35 In our study, high ratings of the practical quality of the seminar were found to downscale the impact of visual-spatial ability on the performance in the catheter insertion into the aortic model. An earlier report had already noted that simulation training can not only lead to a lower error rate during specific procedures but also reduce the required time. 36 Thus, it can be concluded that all students, regardless of gender and their diverging spatial abilities, benefited from our seminar, especially those who highly valued the practical training aspect of their learning.
Finally, our study linked the seminar's practical aspects to students’ interest in interventional radiology, suggesting that improved practical training can influence specialty choice. This finding underscores the necessity for curriculum development that incorporates practical, student-centered learning. It also highlights the value of simulation and diverse teaching methods in enhancing learner outcomes and ultimately provides medical specialties with a way of generating interest among future specialists. The desired impact of educational preferences on career paths is already documented in the literature. Early teaching and learning experiences of undergraduates tend to have a large influence on career intentions. 37 Over the past few decades, there has been a marked increase in research into the specialty medical students select for postgraduate training. 38 Several studies revealed a positive influence of improved training through practical courses on the later choice of specialty. 39 According to the European Society of Radiology, exposure to interventional radiology in medical education varies greatly and depends heavily on the local curriculum. 40 As discovered in another recent large survey, the interest of medical students in the specific subject of interventional radiology is generally low compared to their interest in other hands-on specialties, such as surgery or anesthesiology, 41 even though demand for future (interventional) radiologists is growing as the discipline becomes increasingly present in modern health care. Nissim et al further mentioned that an improved understanding of the field results from greater exposure as part of the curriculum. Roff and McAleer suggested that the educational climate is predisposed favorably by progressive degrees of autonomy in the learning environment as students move forward toward graduation. 42 The Dundee Ready Education Environment Measure accentuated the importance of long-term relevance of the learning content to create a good learning atmosphere. 43 Additionally, the shift from teacher-oriented to problem-based and student-centered learning has a significant impact on student perception of the learning environment, as several papers over the last few decades have reported. 44 A recent study pointed out the positive effect of endovascular simulator training in improving students’ attitudes toward interventional radiology. 45 We found the preference of students for the subject of interventional radiology to be associated with the perceived didactic and practical quality of the seminar. These results may help to improve understanding of the interrelation of seminar quality and any potential rise in interest in the subject.
Limitations of the Study
Limitations of the study include its monocentric design, utilizing the questionnaire at a single university, which might limit the generalizability of the findings. The checklist for the assessment of performance, developed with input from experts in neuroradiology and medical education, was used for the first time in this study. While it is grounded in expert advice (subject matter expertise), it has not undergone traditional validation processes such as content or construct validity. Additionally, we noted a ceiling effect: approximately 60% of students scored the maximum 5 points on their first attempt. This high level of performance may be attributed to the assessment being conducted immediately after the educational intervention, possibly when students were most attentive and prepared to apply their newly acquired skills. We evaluated a seminar with high participant satisfaction, leading to right-skewed, non-normally distributed data. Using the questionnaire in more diverse settings could yield different results. Another potential problem is that some data were self-reported by students, raising the risk of common method variance. This occurs when study participants provide both independent and dependent variable data. While conclusions also came from objective assessments like the tests or the practical examination, the study's reliance on self-reported interest in interventional radiology may not accurately predict long-term career decisions.
Conclusions
Overall, our findings advocate for the inclusion of practical training to address and compensate for individual differences in cognitive abilities, thereby promoting equal learning opportunities and diversity-sensitive teaching approaches. A well-designed practical element in seminars and courses may assist students with deficits in visual-spatial ability in acquiring the necessary clinical-practical skills and lead to a more homogeneous distribution of competence in clinical procedures. Additionally, the perceived practical quality of the seminar was associated with interest in the subject of interventional radiology, suggesting that this approach could be used strategically to recruit future specialists.
Supplemental Material
Supplemental material, sj-docx-1-mde-10.1177_23821205241281647 for Bridging Visual-Spatial Ability and Skill Performance: The Impact of Perceived Quality of a Practical Seminar in Interventional Radiology Education by Jakob Bartels, Joy Backhaus, Ralph Kickuth, Friederika Fluck, Anne Marie Augustin and Sarah König in Journal of Medical Education and Curricular Development
Footnotes
Authors’ Contribution: All the authors were involved in the form and/or study design and contributed critically to the final preparation of this article, including approving the final version of the manuscript. Jakob Bartels: conceptualization, formal analysis, data curation, writing – original draft, visualization, investigation, review. Joy Backhaus: formal analysis, data curation, writing – original draft. Ralph Kickuth: writing – review, supervision, resources, investigation. Friederika Fluck: investigation, writing – original draft. Anne Marie Augustin: investigation, writing – original draft. Sarah König: conceptualization, methodology, writing – review, supervision, project administration.
Availability of Data and Materials: Data are available on request from the corresponding author (bartels_j@ukw.de).
Declaration of Conflicting Interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.
Ethics Approval and Consent to Participate: This study was based on anonymous data retrieved during the seminar. Data were collected from 3 parts: (1) the test of visual-spatial ability, (2) an examination of a practical skill, and (3) an evaluation survey. The study did not represent biomedical or epidemiological research on human subjects. The University of Würzburg Ethics Committee confirmed that no ethical approval of the study was required (20220901 01). The data were collected with student consent, with no personal information other than gender being recorded. Informed consent was obtained from all subjects. Results from the 3 parts of the study were matched using temporarily assigned random numbers, which were deleted immediately after collection and subsequent linkage of the datasets. Data were processed and stored in accordance with the data protection laws and regulations of Germany and the European Union. All methods were carried out in accordance with relevant guidelines and regulations, and with the Declaration of Helsinki.
ORCID iD: Jakob Bartels https://orcid.org/0009-0001-7471-8723
Supplemental Material: Supplemental material for this article is available online.
References
- 1. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE guide no. 82. Med Teach. 2013;35(10):e1511-e1530.
- 2. Gagné M, Deci EL. Self-determination theory and work motivation. J Organ Behav. 2005;26(4):331-362.
- 3. van Roy R, Zaman B. Why gamification fails in education and how to make it successful: introducing nine gamification heuristics based on self-determination theory. In: Serious Games and Edutainment Applications. Springer; 2017:485-509.
- 4. Werwick K, Spura A, Gottschalk M, et al. Für Chirurgie begeistern – Einflüsse der Famulatur aus Sicht Studierender auf eine spätere Fachpräferenz. Zentralbl Chir. 2017;142(6):550-559.
- 5. Xu Y, Pervez A, Theodoulou I, et al. Future interventional radiologists and where to find them—insights from five UK interventional radiology symposia for junior doctors and medical students. Cardiovasc Intervent Radiol. 2021;44(2):300-307.
- 6. Dmytriw AA, Mok PS, Gorelik N, Kavanaugh J, Brown P. Radiology in the undergraduate medical curriculum: too little, too late? Med Sci Educ. 2015;25(3):223-227.
- 7. Rutledge C, Walsh CM, Swinger N, et al. Gamification in action: theoretical and practical considerations for medical educators. Acad Med. 2018;93(7):1014-1020.
- 8. Vogel D, Harendza S. Basic practical skills teaching and learning in undergraduate medical education—a review on methodological evidence. GMS J Med Educ. 2016;33(4). doi:10.3205/zma001063
- 9. Waite S, Farooq Z, Grigorian A, et al. A review of perceptual expertise in radiology—how it develops, how we can test it, and why humans still matter in the era of artificial intelligence. Acad Radiol. 2020;27(1):26-38.
- 10. Nodine CF, Krupinski EA. Perceptual skill, radiology expertise, and visual test performance with NINA and WALDO. Acad Radiol. 1998;5(9):603-612.
- 11. Birchall D. Spatial ability in radiologists: a necessary prerequisite? Br J Radiol. 2015;88(1049):20140511.
- 12. Wanzel KR, Hamstra SJ, Anastakis DJ, Matsumoto ED, Cusimano MD. Effect of visual-spatial ability on learning of spatially-complex surgical skills. Lancet. 2002;359(9302):230-231.
- 13. Louridas M, Quinn LE, Grantcharov TP. Predictive value of background experiences and visual spatial ability testing on laparoscopic baseline performance among residents entering postgraduate surgical training. Surg Endosc. 2016;30(3):1126-1133.
- 14. Louridas M, Szasz P, de Montbrun S, Harris KA, Grantcharov TP. Can we predict technical aptitude? Ann Surg. 2016;263(4):673-691.
- 15. Rengier F, Häfner MF, Unterhinninghofen R, et al. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability. Eur J Radiol. 2013;82(8):1366-1371.
- 16. Fredieu JR, Kerbo J, Herron M, Klatte R, Cooke M. Anatomical models: a digital revolution. Med Sci Educ. 2015;25(2):183-194.
- 17. Yuen J. What is the role of 3D printing in undergraduate anatomy education? A scoping review of current literature and recommendations. Med Sci Educ. 2020;30(3):1321-1329.
- 18. Bartels J, Backhaus J, Kickuth R, Fluck F, König S, Augustin A. Making innovation in teaching measurable: psychometric validation of the "Radio-Prak": a questionnaire using the example of a clinical practical seminar in interventional radiology. Radiologe. 2020;60(4):342-350.
- 19. Ogrinc G, Armstrong GE, Dolansky MA, Singh MK, Davies L. SQUIRE-EDU (Standards for QUality Improvement Reporting Excellence in Education): publication guidelines for educational improvement. Acad Med. 2019;94(10):1461-1470.
- 20. Rengier F. Die TMS-Vorbereitung 2021 Band 3: Schlauchfiguren im Medizinertest mit Übungsaufgaben, Lösungsstrategien, Tipps und Methoden (Übungsbuch für den Test für Medizinische Studiengänge). SmartMedix Verlag; 2021:72-75.
- 21. Heidelberg U. Test für medizinische Studiengänge (TMS): Informationsbroschüre 2018. ITB Consulting GmbH; 2018:26-27.
- 22. World Health Organization. Process of translation and adaptation of instruments. 2007.
- 23. Muniz J, Elosua P, Hambleton RK. International Test Commission guidelines for test translation and adaptation. Psicothema. 2013;25(2):151.
- 24. Hayes AF. PROCESS: A Versatile Computational Tool for Observed Variable Mediation, Moderation, and Conditional Process Modeling. University of Kansas; 2012.
- 25. Bentler PM. Comparative fit indexes in structural models. Psychol Bull. 1990;107(2):238-246.
- 26. Jöreskog K, Sörbom D. LISREL (version 8.80) [computer software]. Scientific Software International, Inc; 2007.
- 27. Browne MW. Testing Structural Equation Models. Sage Publications, Inc; 1993.
- 28. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Model. 1999;6(1):1-55.
- 29. Miller RG, Jr. Beyond ANOVA: Basics of Applied Statistics. Chapman & Hall/CRC; 1997.
- 30. McGee MG. Human spatial abilities: psychometric studies and environmental, genetic, hormonal, and neurological influences. Psychol Bull. 1979;86(5):889-918.
- 31. Thomas S. Test für medizinische Studiengänge (TMS). Arbeiten aus dem Institut für Psychologie:247.
- 32. Halpern DF. Sex Differences in Cognitive Abilities. Psychology Press; 2013.
- 33. Contreras MJ, Rubio VJ, Peña D, Colom R, Santacreu J. Sex differences in dynamic spatial ability: the unsolved question of performance factors. Mem Cognit. 2007;35(2):297-303.
- 34. Kass SJ, Ahlers RH, Dugger M. Eliminating gender differences through practice in an applied visual spatial task. Hum Perform. 1998;11(4):337-349.
- 35. Kuehster CR, Hall CD. Simulation: learning from mistakes while building communication and teamwork. J Nurses Prof Dev. 2010;26(3):123-127.
- 36. Gehling KG. Simulation diagnostischer Angiographien zur Ausbildung in der interventionellen Neuroradiologie. Dissertation. Technische Universität München; 2020.
- 37. Ibrahim M, Fanshawe A, Patel V, et al. What factors influence British medical students' career intentions? Med Teach. 2014;36(12):1064-1072.
- 38. Schwartz RW, Haley JV, Williams C, et al. The controllable lifestyle factor and students' attitudes about specialty selection. Acad Med. 1990.
- 39. Kruschinski C, Wiese B, Eberhard J, Hummers-Pradier E. Einstellungen von Studierenden zur Allgemeinmedizin: Einflüsse von Geschlecht, Blockpraktikum und Gesamtcurriculum. GMS Z Med Ausbild. 2011;28(1):Doc16.
- 40. European Society of Radiology (ESR). Undergraduate education in radiology: a white paper by the European Society of Radiology. Insights Imaging. 2011;2(4):363-374.
- 41. Nissim L, Krupinski E, Hunter T, Taljanovic M. Exposure to, understanding of, and interest in interventional radiology in American medical students. Acad Radiol. 2013;20(4):493-499.
- 42. Roff S, McAleer S. What is educational climate? Med Teach. 2001;23(4):333-334.
- 43. Roff S, McAleer S, Harden RM, et al. Development and validation of the Dundee Ready Education Environment Measure (DREEM). Med Teach. 1997;19(4):295-299.
- 44. Weller JM. Simulation in undergraduate medical education: bridging the gap between theory and practice. Med Educ. 2004;38(1):32-38.
- 45. Stoehr F, Schotten S, Pitton M. Wie man Medizinstudenten für interventionelle Radiologie begeistert. Eur Radiol. 2020;30:4656-4663.