BMC Medical Education. 2025 Jul 23;25:1102. doi: 10.1186/s12909-025-07711-9

Prospective comparison of static versus dynamic images in abdominal ultrasound education - a randomised controlled trial

Johannes Matthias Weimer 1,2, Michael Eigenseher 3, Simon Alexander Stiehl 4, Dieter Nürnberg 4, Michael Ludwig 5, Marie Stäuber 1, Andreas Weimer 6, Roman Kloeckner 7, Klaus Dirks 8, Liv Lorenz 9, Holger Buggenhagen 1, Julia Weinmann-Menke 2, Daniel Merkel 3,4
PMCID: PMC12285136  PMID: 40702512

Abstract

Introduction

In medical ultrasound education, static and dynamic images help learners to understand sonoanatomical and sonopathological findings. However, there is currently no clear evidence or recommendations from professional associations regarding which presentation format is more effective for developing ultrasound competencies. This prospective, randomised, controlled study aimed to investigate the impact of static versus dynamic ultrasound images on learners’ acquisition of theoretical competencies in abdominal sonography.

Methods

Participants in certified ultrasound courses were randomised into two groups following an introductory session on ultrasound basics. Separately, both groups completed training covering normal findings and pathologies of the gallbladder, the liver and the pancreas. The study group underwent a digital training session (18 min total) for each of the three topics using dynamic images (video clips) while the control group received the same training session using static images. After the training, participants of both groups completed an online multiple-choice theory test, consisting of 54 questions with 4 answer options per question.

Results

A total of 145 datasets (69 control group, 76 study group) were included in the analysis. The study group achieved significantly higher overall theory test scores (p = 0.001) and performed significantly better in the total score of pathology findings (p < 0.001). No significant differences were observed in the total score of normal findings (p = 0.08). Multivariate regression analysis identified “group allocation dynamic,” “experience with > 30 ultrasound examinations,” and “employment in internal medicine” as significant positive predictors (p < 0.01) of theory test performance.

Conclusion

Dynamic images in ultrasound education improve comprehension of pathological findings over static images. These insights should inform the development and adaptation of future training programs and educational materials to enhance the quality of ultrasound education and diagnostic accuracy.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12909-025-07711-9.

Keywords: Ultrasound education, Static images, Dynamic images, Static vs. dynamic, Digital education, Still images, Motion images

Introduction

Background

Ultrasound diagnostics have become an indispensable tool in modern medicine, playing a central role in diagnosis and therapeutic decision-making across various medical specialties [1]. The continuous, unrestricted real-time visualisation of organs and structures provides additional diagnostic value compared to other imaging techniques, establishing its high relevance in addressing specific diagnostic questions [1]. Considering its pivotal role, comprehensive education in ultrasound diagnostics is essential for aspiring physicians. Currently, this education typically involves attending (certified) training courses and supervised clinical practice as part of medical education and professional development [2–4]. Professional societies issue recommendations for ultrasound training at both the undergraduate and postgraduate levels, addressing aspects such as the timing and scope of training, teaching methods and materials, and the qualifications of instructors and training institutions [2, 5–8].

Despite these guidelines, the teaching methods employed in ultrasound training vary widely, particularly regarding the use of didactic materials and the presentation of sonographic images and findings [2, 5–7, 9–12]. Trainees are often exposed to both static and dynamic ultrasound images to develop a fundamental understanding of sonoanatomy and sonopathology [9, 13, 14].

Research problem and question

Although the presentation of images is a crucial component of ultrasound education, there are no clear, evidence-based recommendations on whether static or dynamic images are more effective for teaching theoretical and practical competencies. Previous research in this field has primarily focused on the general effectiveness of digital teaching methods [12, 15–17].

Insights from cognitive science suggest that both static and dynamic images—used individually or in combination—can support learning, but their specific effects seem to depend on the context and task [18–21]. A multimedia approach may offer advantages in general learning scenarios [20, 22, 23]. For instance, in psychomotor training for dental students, participants using dynamic videos to learn origami folding techniques demonstrated faster and more accurate replication, likely due to a better intuitive understanding of movement sequences. Conversely, those using static images with audio instructions required more time, but achieved deeper cognitive processing [24]. The literature consistently highlights the benefits of both well-prepared educational films and images [20, 21, 25–28].

Research specifically addressing the advantages and disadvantages of static versus dynamic images in ultrasound education remains scarce. While the benefits of real-time imaging are often emphasised [29], there are currently no evidence-based guidelines on the optimal format (static vs. dynamic) for presenting findings in ultrasound education [2]. This lack of standardisation and the existing heterogeneity in training practices raise the question of which type of image presentation most effectively facilitates the acquisition of theoretical knowledge (e.g., understanding normal findings and pathologies) in ultrasound training. This study aims to address this gap by investigating the impact of static and dynamic images on the development of theoretical competencies among ultrasound learners participating in certified abdominal sonography courses. It is hypothesised that dynamic images, with their real-time representation and clinical relevance, may provide superior learning effects compared to static images. Static images, however, might offer unique advantages for detailed analysis of anatomical structures and measurements.

By systematically analysing the effects of both presentation formats on learners’ understanding and application of ultrasound diagnostics, this study seeks to provide evidence-based recommendations for the design of future training programs. Ultimately, this could enhance the quality of ultrasound education, improve patient safety, and support more effective clinical decision-making.

Material and methodology

Study design, participant recruitment, and inclusion criteria

This prospective, randomised, controlled study was conducted between December 2022 and November 2023 as part of basic ultrasound courses certified by the German Society for Ultrasound in Medicine (DEGUM) (see Fig. 1) [30]. Following a 90-minute theoretical introduction covering the basics of ultrasound and image formation, participants were randomly assigned to a control group (video training using static images) or a study group (video training using dynamic images).

Fig. 1.

Fig. 1

Illustration of the study process, including recruitment, materials, methodology, and measurement timeline

Participants were separated into two distinct rooms to avoid potential interference. Both groups received a standardised video training session that presented ultrasound normal findings and pathologies, either as static images (control group) or as dynamic images (study group). The training focused on three topics—gallbladder, liver, and pancreas—lasting a total of 18 min. Immediately after the training, all participants completed a digital theory test consisting of 54 multiple-choice questions (maximum duration: 25 min) and answered additional baseline characteristic questions.

Inclusion criteria were consent to participate, completion of the video training, and full completion of the theory test. The primary endpoint was the competence level, measured by performance in the theory test. Secondary endpoints included the analysis of potential influencing factors.

The study protocol was reviewed and approved by the Ethics Committee of the Medical University Theodor Fontane Brandenburg under the number E-01-20220427 (May 26, 2022).

Training and theory test

The video training for both groups was identical in duration, spoken text, and displayed captions, differing only in the format of the sonographic findings. In the control group, the ultrasound findings were demonstrated exclusively with static images, while in the study group, only dynamic images were used to present the same ultrasound findings. The training content covered three key topics, each lasting six minutes. The first topic focused on the gallbladder, addressing both normal findings and pathologies, such as gallstones, sludge, and cholecystitis. The second topic dealt with the liver, covering normal findings and various pathologies, including hyperechoic, hypoechoic, and isoechoic lesions. Finally, the third topic explored the pancreas, discussing its normal findings and pathologies related to the head, body, tail, and associated lesions. The training was delivered in a plenary format using a projector and audio system. All sonographic findings were sourced and curated from an online teaching platform [31].

Immediately after the training, participants completed a digital theory test in an online multiple-choice format [32]. The test comprised 54 questions (16 for the gallbladder, 20 for the liver, and 18 for the pancreas), each with four answer options, which were identical for all questions within a topic complex (for example questions/findings, see Fig. 2 and Supplement 1). Participants accessed the test by scanning a QR code with their smartphones or laptops. Each question had a maximum time limit of 30 s.

Fig. 2.

Fig. 2

Sample question of the online theory test on the topics (a) gallbladder normal finding and (b) pathology cholecystitis; (c) liver normal finding and (d) pathology haemangioma; (e) pancreas normal finding and (f) head tumour pathology

An interdisciplinary team of ultrasound experts and educators developed the video training and theory test. To ensure content validity, the materials were pretested in a proof-of-concept evaluation by five ultrasound experts (certified as DEGUM Level II or III). At the end of the test, participants answered questions on baseline characteristics such as professional position, prior ultrasound experience, specialty, and current employment role.

Statistical analysis

Data collection was carried out using the survey and test tool Survio (Survio®, Czech Republic). All data were stored in Microsoft Excel. All statistical analyses were performed in RStudio (RStudio Team (2020). RStudio: Integrated Development for R. RStudio, PBC, http://www.rstudio.com; last accessed 20 June 2024) with R 4.0.3 (A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, http://www.R-project.org; last accessed 20 June 2024). Where possible, a main scale score was computed as the average of the subscale scores. Binary and categorical baseline variables are given as absolute numbers and percentages. Continuous data are given as median and interquartile range (IQR) or as mean and standard deviation (SD). Categorical variables were compared using a chi-squared test and continuous variables using a t-test or the Mann-Whitney U test. These tests were also used to assess the influence of individual factors on the objective test results and for subgroup analyses. In addition, parametric (ANOVA) or non-parametric (Kruskal-Wallis) analyses of variance were calculated and further explored with pairwise post-hoc tests (either t-test or Mann-Whitney U test). Multivariate linear regression models were constructed to compare the influence of individual factors on the test results. In the multivariate linear regression analysis of the theory test results, group allocation (static vs. dynamic), prior ultrasound experience (> 30 independent examinations), qualification level (student/resident vs. specialist/senior physician), specialty, and workplace (i.e., clinical setting) were defined as possible influencing factors. P-values < 0.05 were considered statistically significant. A power analysis was performed to determine the sample size required to detect a statistically significant effect: based on an expected effect size of 0.6, a significance level of 0.05, and a desired power of 0.90, the required sample size was 120 participants (60 per group).
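The stated sample size can be reproduced with the standard two-sample formula. The sketch below is illustrative only (the authors do not name their power-analysis tool) and uses the normal approximation with Python's standard library; the exact t-based calculation, as implemented for example in G*Power or R's pwr package, rounds up to roughly the 60 per group reported.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float, power: float) -> int:
    """A priori sample size per group for a two-sided, two-sample t-test
    (normal approximation): n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ≈ 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ≈ 1.28 for power = 0.90
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Study parameters: d = 0.6, alpha = 0.05, power = 0.90
n = n_per_group(0.6, 0.05, 0.90)  # normal approximation gives 59 per group
```

The small gap between 59 (normal approximation) and 60 (reported) reflects the finite-sample t-distribution correction applied by dedicated power-analysis software.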

Results

Baseline characteristics of the study and control group

A total of 145 datasets (see CONSORT Fig. 3 and Supplement 2) were included in the statistical analysis (n = 76 in the control group, n = 69 in the study group). Both groups exhibited similar demographic profiles regarding qualifications, professional roles, and prior experience. However, there was a significant difference in the participants’ “current specialty” between the two groups. In the study group, a higher proportion of participants were employed in internal medicine (55% vs. 43%), whereas more participants in the control group worked in general medicine (20% vs. 6%). A majority of participants in both groups were residents (study: 68%, control: 63%) and worked in clinical settings (study: 78%, control: 62%). Approximately half of the participants in each group reported having conducted fewer than 30 independent ultrasound examinations.

Fig. 3.

Fig. 3

Flow diagram showing participant recruitment and data analysis according to CONSORT guidelines [30]

Theory test results

The results of the theory test are presented in Figs. 4 and 5 and Supplement 3. The study group achieved significantly higher scores in the overall theory test (study: 34.1 ± 6.3 vs. control: 30.7 ± 6.1, p = 0.001) and the total score of pathology findings (study: 23.5 ± 4.7 vs. control: 20.8 ± 4.6, p < 0.001). No significant differences were observed in the total score of normal findings (p = 0.08). These trends were also reflected in the liver and pancreas subcategories, but not in the gallbladder subcategory (Fig. 5).
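As a consistency check, the reported overall means and standard deviations correspond to an effect size close to the d = 0.6 assumed in the power analysis. The sketch below computes Cohen’s d with a pooled standard deviation; the pairing of group sizes with these means is an assumption taken from the abstract, as the results text does not state it explicitly.

```python
from math import sqrt

def cohens_d(m1: float, s1: float, n1: int,
             m2: float, s2: float, n2: int) -> float:
    """Cohen's d for two independent groups using the pooled SD."""
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Overall theory test: study (dynamic) 34.1 ± 6.3 vs. control (static) 30.7 ± 6.1
d = cohens_d(34.1, 6.3, 76, 30.7, 6.1, 69)  # ≈ 0.55
```

The resulting d ≈ 0.55 is slightly below the anticipated 0.6, consistent with the study nevertheless reaching significance given that recruitment exceeded the planned 120 participants.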

Fig. 4.

Fig. 4

Raincloud plot illustration of the theory test results of the control group (‘static’) and study group (‘dynamic’) for (a) the overall score and the competence areas (b) ‘normal findings’ and (c) ‘pathology findings’. Results are presented as percentages

Fig. 5.

Fig. 5

Violin plot representation of the theory test results of the control group (‘static’) and study group (‘dynamic’) in the (a–c) gallbladder, (d–f) liver, and (g–i) pancreas subcategories. The red dot marks the mean value

Both groups performed generally better on normal findings than on pathology findings, with significant differences observed in some subcategories. For instance, both groups performed significantly better in “liver normal findings” compared to “liver pathology findings” (p < 0.01).

Influencing factors and subgroup analysis

Membership in the study group (p < 0.01), having performed > 30 independent ultrasound examinations (p < 0.01), and current employment in internal medicine (p = 0.01) each had a significant influence on the theory test total score and on the ‘pathology findings’ total score, but not on the ‘normal findings’ total score. Figure 6 and Supplement 4 show the theory test results of the control and study groups across different levels of experience and qualification. In this subgroup analysis, participants in the study group tended to achieve higher scores in the intergroup comparison, although significance was only partially reached. In addition, participants with more experience and higher qualifications achieved better theory test results, in line with the regression model.

Fig. 6.

Fig. 6

Boxplot representation of the results of the theory test of the control group (‘static’) and study group (‘dynamic’) of (a) different levels of experience and (b) qualification. The red dot marks the mean value

In the subgroup analysis focusing on internists, the study group trained with dynamic images outperformed the control group trained with static images across multiple measures. For the overall score, internists in the dynamic group achieved significantly higher results compared to the static group (p = 0.04). While no significant differences were observed for normal findings (p = 0.4), internists in the dynamic group demonstrated significantly better performance in pathological findings (p = 0.02).

Discussion

Relevance of the study and key findings

Traditionally, ultrasound education combines theoretical instruction with hands-on practice to build competency [8]. The COVID-19 pandemic highlighted the need for innovative teaching methods and materials, such as blended learning, e-learning, webinars, and simulation-based training, which are now actively discussed in the literature [12, 33–36]. However, much of the focus has been on the type of teaching medium (digital vs. analogue) and instructional methods (traditional vs. innovative), with less attention given to the presentation format of ultrasound findings. The question of how static or dynamic images influence learning in ultrasound education is critical, as answering it provides an evidence-based foundation for developing future curricula and training standards. The current lack of standardised guidelines and recommendations by national and international ultrasound societies underscores the need for further research in this area [2, 37]. For the first time, this prospective, randomised, controlled study systematically investigated the impact of static versus dynamic images on the acquisition of theoretical competencies in abdominal ultrasound. The results demonstrated that participants trained with dynamic images achieved greater theoretical competency, particularly in understanding pathology findings. In contrast, both groups showed comparable results in learning normal findings. These outcomes highlight the need to integrate dynamic images more extensively into ultrasound education. A dual approach, using static images to teach sonoanatomy and dynamic images for complex pathology presentation, could enhance the efficiency of ultrasound training.

Static vs. dynamic images in ultrasound education

Assessing the effectiveness of static versus dynamic images in ultrasound education requires evaluation of the potential advantages and disadvantages of each modality.

Static images allow learners to focus on specific details, precise measurements, and characteristic features of pathologies [2, 21, 23, 27]. This format is particularly well-suited for teaching static features, such as gallbladder wall thickness or basic sonoanatomy [20, 23, 24]. In this study, the comparable performance between groups on the “total score: normal findings” and “gallbladder pathology finding score” subcategories could be attributed to these strengths. For beginners, static images provide a structured and comprehensible learning environment with minimal cognitive load, as they are not distracted by dynamic movements [20, 21, 23, 24, 27, 38]. Additionally, static images are less resource-intensive to produce and store, making them ideal for inclusion in textbooks, scripts, and digital teaching materials [15].

The primary limitation of static images is their inability to depict real-time dynamics [20, 23, 24, 27]. They cannot represent temporal or functional processes, such as lung movement, bowel motility, or the respiratory shift of liver borders. As a result, they do not fully reflect the reality of clinical ultrasound practice [23, 24]. Complex pathologies, such as tumour infiltration or vascular abnormalities, may also be difficult to comprehend without the additional information provided by dynamic imaging [39, 40]. These limitations likely contributed to the inferior performance of the control (static image) group in the “liver pathology” and “pancreas pathology” subcategories.

Dynamic images are ideal for demonstrating functional aspects and dynamic processes, such as blood flow, organ movement, or tissue response to pressure [20, 25, 41–44]. Advantageously, dynamic images can also be integrated into digital pathology atlases [31], e-learning platforms [15, 45], or simulators [46]. Such real-time imaging is essential in fields like echocardiography and duplex sonography [47, 48] and provides practical insights into clinical applications of lung and abdominal sonography. For instance, dynamic images help to assess respiratory shift, bowel motility, or the characterisation of liver tumours using contrast agents [41–43]. This practical, real-world representation of clinical scenarios likely explains the superior performance of the dynamic group in the “total score: pathology findings” subcategory. Previous studies in musculoskeletal ultrasound and middle-ear endoscopy are in line with our results and similarly support the value of dynamic images for training [49, 50]. Another aspect that could explain the results is that dynamic images provide a clearer spatial understanding of organ or lesion structures. The sweeping motion helps visualize 3D relationships, enabling learners to recognize subtle pathologies and mentally reconstruct complex anatomical environments. This enhances intuitive comprehension and diagnostic accuracy compared to the limited perspective of static images [51].

These advantages are also consistent with findings by Nicholls et al. [52] who identified core psychomotor and cognitive skills essential for ultrasound scanning—particularly the integration of visuomotor coordination and spatial awareness. Dynamic images directly support the development of these competencies by presenting movement, anatomical variation, and probe handling in a clinically realistic manner. The improved test performance for pathology recognition in our dynamic group reflects this alignment between instructional design and skill acquisition.

However, dynamic images also have limitations, including increased cognitive load and greater difficulty in analysing details [20, 23, 24]. For beginners, the movement and abundance of information presented in dynamic images may be overwhelming, potentially hindering learning outcomes [20, 21, 27, 53]. Identifying individual structures or static features within dynamic images can also be challenging, complicating detailed analysis [23]; however, such effects were not evident in the present study. Dynamic images also require more storage space, higher-performance devices, and greater bandwidth for digital teaching and documentation, although modern database structures increasingly accommodate large data volumes [54]. For documentation and presentation of findings, both modalities should be considered in order to capture both static measurements and dynamic organ findings [2, 55].

Artificial intelligence could play a significant role in managing, optimising, and even generating static or dynamic images [56], as well as supporting the analysis process for learners [57]. Advanced ultrasound technologies now enable 3D imaging of organ systems and vascular structures, which has previously been recommended for integration into ultrasound education to modernise training programs further [58].

Strengths and limitations

The randomised controlled design ensured high internal validity, while the standardised video training sessions minimised bias by controlling for content and duration across groups. The large sample size and subgroup analyses further strengthen the robustness of the findings. Additionally, the study’s focus on both normal findings and pathological conditions provides a comprehensive assessment of theoretical ultrasound competencies. Despite these strengths, the study has several limitations. Its focus on a single area of sonography restricts the generalisability of the results. Additionally, the use of cross-sectional data prevents the assessment of long-term learning effects [59]. Our additional subgroup analysis demonstrates that the superior performance of participants trained with dynamic images holds true within the internist subgroup. This finding reinforces the robustness of our results and suggests that the advantage of dynamic images is not solely due to the unequal distribution of professional groups. However, the higher proportion of internists in the study group remains a potential confounding factor and is noted as a limitation. While the linear regression analysis and subsequent subgroup analysis accounted for participants’ prior experience and professional background, other personal factors that may have influenced the test outcomes cannot be ruled out. These include a potential selection bias due to the voluntary participation of individuals with a particular interest in ultrasound training.

Conclusion

In summary, educators’ use of static or dynamic images in ultrasound training should be guided by specific learning objectives. Static images are particularly effective for teaching fundamental concepts and detailed sonoanatomy, whereas dynamic images excel in illustrating dynamic processes and complex pathologies. A combined approach that integrates both formats offers the greatest educational benefit by merging the precision of static images with the realism of dynamic images. Such dual strategies can optimise learning outcomes and enhance the overall effectiveness of ultrasound education. Since current training guidelines by professional societies do not explicitly address the modality of ultrasound findings used in instruction, these recommendations should be refined to provide clearer guidance on the appropriate use of imaging modalities.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 2 (104.9KB, pdf)
Supplementary Material 3 (92.8KB, pdf)
Supplementary Material 4 (26.2KB, pdf)

Acknowledgements

This study includes parts of one of the authors’ doctoral theses (M.E.). We thank all participating students and lecturers for supporting our study and C. Christe for her help in revising the figures. We would like to express our gratitude to Kay Stankov for his contributions to this publication. His dedicated efforts in consulting, supervising, and meticulously reviewing all statistical aspects have been instrumental in ensuring the rigor and accuracy of our research findings.

Author contributions

Conceptualization: J.W., M.E., S.S. and D.M.; methodology and software: J.W., S.S., M.E. and D.M.; validation: J.W., D.N., M.L., K.D. and D.M.; formal analysis: J.W. and D.M.; investigation: J.W., M.E. and D.M.; resources: J.W. and D.M.; data curation: J.W., M.E., M.L., M.S. and D.M.; writing—original draft preparation: J.W.; writing—review and editing: J.W., M.E., S.S., D.N., M.L., M.S., A.W., R.K., K.D., L.L., H.B., J.W.M. and D.M.; visualization: J.W.; supervision: J.W. and D.M.; project administration: J.W. and D.M. All authors have read and agreed to the published version of the manuscript.

Funding

Open Access funding enabled and organized by Projekt DEAL. This research received no external funding.

Data availability

Data cannot be shared publicly because of institutional and national data policy restrictions imposed by the Ethics Committee, since the data contain potentially identifying participant information. Data are available upon request from the Johannes Gutenberg University Mainz Medical Center (contact via weimer@uni-mainz.de) for researchers who meet the criteria for access to confidential data (please provide the manuscript title with your inquiry).

Declarations

Ethics approval and consent to participate

The study was approved by the Ethics Commission of Medical School Brandenburg (approval number E-01-20220427 from 08/08/2022). All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Informed written consent was obtained from all the participants.

Consent for publication

Not Applicable.

Competing interests

J.W. is a member of the Editorial Board of BMC Medical Education. The other authors declare no conflicts of interest.

Clinical trial number

Not Applicable.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Dietrich CF, Bolondi L, Duck F, Evans DH, Ewertsen C, Fraser AG, et al. History of ultrasound in medicine from its birth to date (2022), on occasion of the 50 years anniversary of EFSUMB. A publication of the European federation of societies for ultrasound in medicine and biology (EFSUMB), designed to record the historical development of medical ultrasound. Med Ultrason. 2022;24(4):434–50. [DOI] [PubMed] [Google Scholar]
  • 2.Wüstner M, Radzina M, Calliada F, Cantisani V, Havre RF, Jenderka KV, et al. Professional standards in medical Ultrasound - EFSUMB position paper (Long Version) - General aspects. Ultraschall Med. 2022;43(5):e36–48. [DOI] [PubMed] [Google Scholar]
  • 3.Weber MA, Delorme S. [Ultrasound training in the professional development of radiological specialists: concepts and challenges]. Radiologe. 2017;57(11):967–72. [DOI] [PubMed] [Google Scholar]
  • 4.Welle R, Seufferlein T, Kratzer W. [Current state of under- and postgraduate education in abdominal ultrasonography at German university hospitals. A panel study over 20 years]. Z Gastroenterol. 2021;59(3):225–40. [DOI] [PubMed] [Google Scholar]
  • 5.Dietrich CF, Hoffmann B, Abramowicz J, Badea R, Braden B, Cantisani V, et al. Medical student ultrasound education: A WFUMB position paper, part I. Ultrasound Med Biol. 2019;45(2):271–81. [DOI] [PubMed] [Google Scholar]
  • 6.Hoffmann B, Blaivas M, Abramowicz J, Bachmann M, Badea R, Braden B, et al. Medical student ultrasound education, a WFUMB position paper, part II. A consensus statement of ultrasound societies. Med Ultrason. 2020;22(2):220–9. [DOI] [PubMed] [Google Scholar]
  • 7.Hoppmann RA, Mladenovic J, Melniker L, Badea R, Blaivas M, Montorfano M et al. International consensus conference recommendations on ultrasound education for undergraduate medical students. The Ultrasound Journal. 2022;14(1):31. [DOI] [PMC free article] [PubMed]
  • 8.Recker F, Neubauer R, Dong Y, Gschmack AM, Jenssen C, Möller K, et al. Exploring the dynamics of ultrasound training in medical education: current trends, debates, and approaches to didactics and hands-on learning. BMC Med Educ. 2024;24(1):1311. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Neubauer R, Bauer CJ, Dietrich CF, Strizek B, Schäfer VS, Recker F. Evidence-based ultrasound Education?– A systematic literature review of undergraduate ultrasound training studies. Ultrasound Int Open. 2024;10(continuous publication). [DOI] [PMC free article] [PubMed]
  • 10.Prosch H, Radzina M, Dietrich CF, Nielsen MB, Baumann S, Ewertsen C, et al. Ultrasound curricula of student education in europe: summary of the experience. Ultrasound Int Open. 2020;6(1):E25–33. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Tarique U, Tang B, Singh M, Kulasegaram KM, Ailon J. Ultrasound curricula in undergraduate medical education: A scoping review. J Ultrasound Med. 2018;37(1):69–82. [DOI] [PubMed] [Google Scholar]
  • 12.Blank V, Strobel D, Karlas T. Digital training formats in ultrasound diagnostics for physicians: what options are available and how can they be successfully integrated into current DEGUM certified course concepts? Ultraschall Med. 2022;43(5):428–34. [DOI] [PubMed] [Google Scholar]
  • 13.Miles KA. Diagnostic imaging in undergraduate medical education: an expanding role. Clin Radiol. 2005;60(7):742–5. [DOI] [PubMed] [Google Scholar]
  • 14.Konge L, Albrecht-Beste E, Nielsen MB. Virtual-reality simulation-based training in ultrasound. Ultraschall Med. 2014;35(2):95–7. [DOI] [PubMed] [Google Scholar]
  • 15.Weimer JM, Recker F, Horn L, Kuenzel J, Dirks K, Ille C, et al. Insights into modern undergraduate ultrasound education: prospective comparison of digital and analog teaching resources in a flipped classroom concept - the DIVAN study. Ultrasound Int Open. 2024;10(continuous publication). [DOI] [PMC free article] [PubMed]
  • 16.Altersberger M, Pavelka P, Sachs A, Weber M, Wagner-Menghin M, Prosch H. Student perceptions of instructional ultrasound videos as preparation for a practical assessment. Ultrasound Int Open. 2019;5(3):E81–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Lien WC, Lin P, Chang CH, Wu MC, Wu CY. The effect of e-learning on point-of-care ultrasound education in novices. Med Educ Online. 2023;28(1):2152522. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Wagner I, Schnotz W. Learning from static and dynamic visualizations: what kind of questions should we ask? In: Lowe R, Ploetzner R, editors. Learning from dynamic visualization: innovations in research and application. Cham: Springer International Publishing; 2017. pp. 69–91. [Google Scholar]
  • 19.Castro-Alonso JC, Wong M, Adesope OO, Ayres P, Paas F. Gender imbalance in instructional dynamic versus static visualizations: a meta-analysis. Educ Psychol Rev. 2019;31(2):361–87. [Google Scholar]
  • 20.Bacha D, Sâadia B, Talbi G, Ferjaoui W, Atef M. Image in medical education: why, when and how to use it? A critical review of the literature. Biomedical Journal of Scientific & Technical Research; 2020.
  • 21.Daly CJ, Bulloch JM, Ma M, Aidulis D. A comparison of animated versus static images in an instructional multimedia presentation. Adv Physiol Educ. 2016;40(2):201–5. [DOI] [PubMed] [Google Scholar]
  • 22.Castro-Alonso JC, Wong RM, Adesope OO, Paas F. Effectiveness of multimedia pedagogical agents predicted by diverse theories: a meta-analysis. Educ Psychol Rev. 2021;33(3):989–1015. [Google Scholar]
  • 23.Ploetzner R, Berney S, Bétrancourt M. When learning from animations is more successful than learning from static pictures: learning the specifics of change. Instr Sci. 2021;49(4):497–514. [Google Scholar]
  • 24.Robertson D. Evaluating the effectiveness of motion vs still-image videos teaching origami: implications for e-learning in pre-clinical dental education. 2024.
  • 25.Hirschowitz BI. Use of motion pictures in teaching & diagnosis. Ann N Y Acad Sci. 1967;142(2):455–60. [DOI] [PubMed] [Google Scholar]
  • 26.Montague JF. What motion pictures can do for medical education. Ann Am Acad Pol Soc Sci. 1926;128:139–42. [Google Scholar]
  • 27.Norris EM. The constructive use of images in medical teaching: a literature review. JRSM Short Rep. 2012;3(5):33. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Dev P. Imaging and visualization in medical education. Comput Graphics Appl IEEE. 1999;19:21–31. [Google Scholar]
  • 29.Popescu BA, Andrade MJ, Badano LP, Fox KF, Flachskampf FA, Lancellotti P, et al. European association of echocardiography recommendations for training, competence, and quality improvement in echocardiography. Eur J Echocardiogr. 2009;10(8):893–905. [DOI] [PubMed] [Google Scholar]
  • 30.Moher D, Hopewell S, Schulz KF, Montori V, Gøtzsche PC, Devereaux PJ, et al. CONSORT 2010 explanation and elaboration: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c869. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Merkel D, Schneider C, Ludwig M. Internet-based digital video atlas of sonographic findings for clinical and educational purposes. J Ultrason. 2020;20(80):e24–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Höhne E, Recker F, Dietrich CF, Schäfer VS. Assessment methods in medical ultrasound education. Front Med (Lausanne). 2022;9:871957. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Bintaro S, Dietrich CF, Potthoff A. Principles for teaching sonography - current status. Z Gastroenterol. 2023. [DOI] [PubMed]
  • 34.Weimer AM, Berthold R, Schamberger C, Vieth T, Balser G, Berthold S et al. Digital transformation in musculoskeletal ultrasound: acceptability of blended learning. Diagnostics (Basel). 2023;13(20). [DOI] [PMC free article] [PubMed]
  • 35.Höhne E, Recker F, Schmok E, Brossart P, Raupach T, Schäfer VS. Conception and feasibility of a digital tele-guided abdomen, thorax, and thyroid gland ultrasound course for medical students (TELUS study). Ultraschall Med. 2021. [DOI] [PubMed]
  • 36.Ruppert J, Krüger R, Göbel S, Wolfhard S, Lorenz L-A, Weimer AM, et al. The effectiveness of e-learning in focused cardiac ultrasound training: a prospective controlled study. BMC Med Educ. 2025;25(1):806. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Seitz K. [Quality assurance in ultrasound diagnostics in Germany - a never-ending story]. Ultraschall Med. 2012;33:517–9. [DOI] [PubMed] [Google Scholar]
  • 38.Dreher SM, DePhilip R, Bahner D. Ultrasound exposure during gross anatomy. J Emerg Med. 2014;46(2):231–40. [DOI] [PubMed] [Google Scholar]
  • 39.Schenke SA, Petersen M, Görges R, Ruhlmann V, Zimny M, Richter JP, et al. Interobserver agreement in ultrasound risk stratification systems for thyroid nodules on static images versus cine-loop video sequences. Diagnostics (Basel). 2024;14(19). [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Van Holsbeke C, Yazbek J, Holland TK, Daemen A, De Moor B, Testa AC, et al. Real-time ultrasound vs. evaluation of static images in the preoperative assessment of adnexal masses. Ultrasound Obstet Gynecol. 2008;32(6):828–31. [DOI] [PubMed] [Google Scholar]
  • 41.Höner zu Siederdissen C, Potthoff A. [Sonographic diagnostics of liver tumors]. Internist (Berl). 2020;61(2):115–22. [DOI] [PubMed] [Google Scholar]
  • 42.Slowinska-Klencka D, Popowicz B, Klencki M. Real-time ultrasonography and the evaluation of static images yield different results in the assessment of EU-TIRADS categories. J Clin Med. 2023;12(18). [DOI] [PMC free article] [PubMed]
  • 43.Dietrich CF, Mathis G, Blaivas M, Volpicelli G, Seibel A, Wastl D, et al. Lung B-line artefacts and their use. J Thorac Dis. 2016;8(6):1356–65. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Ang E, Talib S, Thong M, Tze C. Using video in medical education: what it takes to succeed. Asia Pac Scholar. 2017;2:15–23. [Google Scholar]
  • 45.Weimer J, Recker F, Krüger R, Müller L, Buggenhagen H, Kurz S, et al. The effectiveness of digital vs. analogue teaching resources in a flipped classroom for undergraduate focus cardiac ultrasound training: a prospective, randomised, controlled single-centre study. Educ Sci. 2025;15(7):810. [Google Scholar]
  • 46.Nayahangan LJ, Dietrich CF, Nielsen MB. Simulation-based training in ultrasound - where are we now? Ultraschall Med. 2021;42(3):240–4. [DOI] [PubMed] [Google Scholar]
  • 47.Bell FE 3rd, Wilson LB, Hoppmann RA. Using ultrasound to teach medical students cardiac physiology. Adv Physiol Educ. 2015;39(4):392–6. [DOI] [PubMed]
  • 48.Hammoudi N, Arangalage D, Boubrit L, Renaud MC, Isnard R, Collet JP, et al. Ultrasound-based teaching of cardiac anatomy and physiology to undergraduate medical students. Arch Cardiovasc Dis. 2013;106(10):487–91. [DOI] [PubMed]
  • 49.Caputo V, Denoyelle F, Simon F. Educational endoscopic videos improve teaching of middle ear anatomy. Eur Arch Otorhinolaryngol. 2024;281(9):4649–55. [DOI] [PubMed] [Google Scholar]
  • 50.van de Stadt LA, Kroon FPB, Rosendaal FR, van der Heijde D, Reijnierse M, Riyazi N, et al. Real-time vs static scoring in musculoskeletal ultrasonography in patients with inflammatory hand osteoarthritis. Rheumatology (Oxford). 2022;61(Si):Si65–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Weimer J, Ruppert J, Vieth T, Weinmann-Menke J, Buggenhagen H, Künzel J, et al. Effects of undergraduate ultrasound education on cross-sectional image understanding and visual-spatial ability - a prospective study. BMC Med Educ. 2024;24(1):619. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Nicholls D, Sweet L, Hyett J. Psychomotor skills in medical ultrasound imaging: an analysis of the core skill set. J Ultrasound Med. 2014;33(8):1349–52. [DOI] [PubMed] [Google Scholar]
  • 53.Jamniczky HA, Cotton D, Paget M, Ramji Q, Lenz R, McLaughlin K, et al. Cognitive load imposed by ultrasound-facilitated teaching does not adversely affect gross anatomy learning outcomes. Anat Sci Educ. 2017;10(2):144–51. [DOI] [PubMed] [Google Scholar]
  • 54.Moore GE. Cramming more components onto integrated circuits. Proc IEEE. 1998;86(1):82–5.
  • 55.Ernst B, Dörsching C, Bozzato A, Gabrielpillai J, Becker S, Frölich M, et al. Structured reporting of head and neck sonography achieves substantial interrater reliability. Ultrasound Int Open. 2023;09:E26–32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Gordon M, Daniel M, Ajiboye A, Uraiby H, Xu NY, Bartlett R, et al. A scoping review of artificial intelligence in medical education: BEME Guide No. 84. Med Teach. 2024;46(4):446–70. [DOI] [PubMed] [Google Scholar]
  • 57.Zhao G, Kong D, Xu X, Hu S, Li Z, Tian J. Deep learning-based classification of breast lesions using dynamic ultrasound video. Eur J Radiol. 2023;165. [DOI] [PubMed]
  • 58.Guo J, Guo Q, Feng M, Liu S, Li W, Chen Y, et al. The use of 3D video in medical education: A scoping review. Int J Nurs Sci. 2023;10(3):414–21. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59.Weimer JM, Widmer N, Strelow K-U, Hopf P, Buggenhagen H, Dirks K, et al. Long-term effectiveness and sustainability of integrating peer-assisted ultrasound courses into medical school - a prospective study. Tomography. 2023;9(4):1315–28. [DOI] [PMC free article] [PubMed] [Google Scholar]

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

Supplementary Material 2 (104.9KB, pdf)
Supplementary Material 3 (92.8KB, pdf)
Supplementary Material 4 (26.2KB, pdf)

Data Availability Statement

Data cannot be shared publicly because of institutional and national data policy restrictions imposed by the ethics committee, since the data contain potentially identifying information about study participants. Data are available upon request from the Johannes Gutenberg University Mainz Medical Center (contact via weimer@uni-mainz.de) for researchers who meet the criteria for access to confidential data (please provide the manuscript title with your inquiry).
