Abstract
Objectives
Acute otitis media (AOM) is extremely prevalent among children but its diagnosis remains challenging. Our primary objective was to measure the impact of an e-learning module on medical students’ accuracy in diagnosing paediatric AOM.
Methods
This randomized controlled trial was performed at a single tertiary care paediatric emergency department (ED). Medical students on their paediatric rotation were randomized to a locally developed e-learning module or a small-group lecture on AOM. They then had to examine at least 10 ears of patients at risk for AOM. The primary outcome was diagnostic accuracy and secondary outcomes included knowledge test scores and learning modality preference.
Results
Between May 2017 and September 2018, 201 medical students were randomized. Eighty-three evaluated at least 10 ears and were included in the primary analysis. Diagnostic accuracies (76.5% for the e-learning group versus 76.4% for the lecture group, difference of 0.1%; 95%CI: –6.2 to 6.4%) and post-test scores (difference of 0.5/20 points; 95%CI: –0.8 to 1.2/20 points) were similar between the groups. Sixty-two per cent of participants preferred the e-learning module to the lecture, while 15% had no preference.
Conclusions
Diagnostic accuracy for AOM was similar between students exposed to an e-learning module and those who received a small-group lecture. E-learning was the preferred learning modality.
Keywords: Education, Medical, Otitis media, Online learning, Paediatrics
Acute otitis media (AOM) is one of the most common infections of childhood and a leading cause of antibiotic prescription. Medical students often find diagnosing AOM quite challenging. Previous studies report AOM diagnostic sensitivities between 52% and 75% for medical students and residents (1–3).
E-learning, defined as the use of Internet technologies for teaching and learning purposes (4), can include a broad array of tools such as videos and images, interactive clinical cases, modeling, quizzes, and feedback on performance (5). Several advantages stem from the use of e-learning, including interactivity, access to collective intelligence, easy updating of content, accessibility, and a user-adaptable rhythm (6). E-learning is also perceived as cost-effective, as its development is generally affordable and it requires few human resources compared with traditional teaching (4). Over the years, e-learning has been studied in various health domains (6–17), but only a few studies have evaluated the impact of specific e-learning programs on students’ performance of a clinical task, the highest level of Miller’s pyramid of clinical competence (18).
The primary objective of this study was to measure the impact of an original AOM e-learning module on medical students’ ability to diagnose AOM appropriately. Secondary objectives were to assess medical students’ knowledge of AOM, confidence in performing ear examination, and preferred learning modality.
Methods
This randomized controlled trial was performed in the paediatric ED of a tertiary care centre located in Montreal, Canada, between May 2017 and September 2018. The ED is staffed with paediatric emergency physicians, general paediatricians, and emergency physicians with specific expertise in paediatrics.
Study participants were third- and fourth-year medical students doing a 6-week rotation in general paediatrics. The intervention of interest was completion of an e-learning module on AOM. The module covered anatomy, epidemiology, pathophysiology, microbiology, diagnostic criteria, treatment options, and prognosis. It was designed in 2016–2017 using Adobe Captivate (Adobe Inc., San Jose, CA). It was mainly developed by two paediatric residents supported by a team composed of a technical educator, a multimedia designer, a paediatric otolaryngologist, and paediatricians with expertise in medical education. Once completed, the module was reviewed by a paediatric emergency physician and a paediatric infectious diseases specialist. It was then beta tested by 16 physicians: six paediatric residents, five paediatricians, three paediatric emergency physicians, one paediatric otolaryngologist, and one paediatric infectious diseases specialist. Based on their comments, improvements were made to the module before the beginning of the study. The mean time to complete the e-learning module was estimated at 30 minutes, and the lecture lasted approximately 60 minutes. The lecture was given by a paediatrician or a paediatric emergency fellow. A standardized PowerPoint presentation (Microsoft Inc., Redmond, WA) was used for every lecture. The educational content of the e-learning module was similar to that of the lecture. However, the module included videos that were not part of the lecture: video examples of AOM and a 5-minute video on the paediatric otoscopic examination. The lecture included image examples of AOM and an in-person demonstration of the ear examination technique.
On the first day of every rotation, a member of the research team presented the study to all eligible medical students during an introductory meeting. Medical students who were absent for this introduction and those who had already been recruited in a previous rotation were not eligible. Interested medical students signed a consent form, filled out a short questionnaire assessing baseline characteristics (year of training, specific training in otolaryngology, and prior paediatric rotations), and completed an electronic pretest. This test included five general questions on AOM (definition, pathophysiology, microbiology, and treatment) and 15 questions requiring interpretation of eardrum videos demonstrating different conditions. The videos were selected by the authors because their diagnoses were unanimous among eight paediatric otolaryngologists. A logbook was kept of medical students who were approached and those who declined to participate.
Simple computer-generated randomization was used to distribute the participants into two groups. The randomization sequence was generated by one of the authors and kept concealed in an opaque envelope until the introductory meeting. The intervention group completed the e-learning module and the control group received the lecture. Participants were not blinded to their assignment. However, they were asked not to reveal their allocation to the ED attending physicians.
The study took place in the ED during the ambulatory portion of the students’ paediatric rotation. Before their first ED shift, participants completed the e-learning module or received the lecture. They then completed an online post-test, which consisted of the same 20 questions as the pretest. During their ED shifts, participants were asked to examine as many ears as possible and to report whether they could visualize the eardrum and whether they thought an AOM was present. Each ear examination was then repeated by an attending physician, who determined the presence or absence of AOM. Approximately 3 weeks after either completing the module or receiving the lecture, participants completed another online test to evaluate knowledge retention (retention test). To ensure equity in learning opportunities, the group who had first received the lecture was then given access to the e-learning module, and the group who had first completed the module received the lecture; this crossover occurred after the retention test. Lastly, participants who had completed both methods were surveyed regarding their preferred learning modality using an appreciation questionnaire (Supplementary Appendix 1).
The primary outcome measure was the accuracy of diagnosis of AOM in comparison to the diagnosis made by the attending physician. Secondary outcome measures consisted of the following: post-test score, retention test score, participants’ confidence in performing ear examination using a 100 mm visual analog scale, and participants’ preferred learning method.
The primary analysis compared the diagnostic accuracy of medical students randomized to the e-learning module with that of students randomized to the lecture, using a Student’s t-test. To be included in the primary analysis, medical students had to evaluate a minimum of 10 ears of children aged less than 60 months to 12 years with fever and/or respiratory symptoms. Individual accuracy was a continuous variable obtained by dividing the number of ears for which the student and the attending physician agreed by the total number of ears evaluated by the medical student. Secondary analyses included comparison of post-test scores, retention test scores, and confidence scores, each using a Student’s t-test, and identification of medical students’ preferred learning modality using a simple proportion.
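As an illustration of this analysis, the sketch below computes each student’s individual accuracy and compares the two arms with a Student’s t-test; it is not the authors’ code, and the data frame, column names, and example values are hypothetical.

```python
# Minimal sketch, not the authors' analysis code; column names and data are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

# One row per examined ear: the student's call and the attending physician's call (reference).
exams = pd.DataFrame({
    "student_id":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "group":        ["e-learning"] * 6 + ["lecture"] * 6,
    "student_dx":   [1, 0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 0],   # 1 = student diagnosed AOM
    "attending_dx": [1, 0, 1, 1, 0, 0, 0, 0, 1, 1, 1, 0],   # 1 = attending diagnosed AOM
})

# Individual accuracy: ears with agreement divided by total ears evaluated by that student.
exams["agree"] = (exams["student_dx"] == exams["attending_dx"]).astype(int)
per_student = exams.groupby(["student_id", "group"])["agree"].mean().reset_index()

elearn = per_student.loc[per_student["group"] == "e-learning", "agree"]
lecture = per_student.loc[per_student["group"] == "lecture", "agree"]

# Student's t-test on per-student accuracies, plus a 95% CI for the difference in means.
t_stat, p_value = stats.ttest_ind(elearn, lecture, equal_var=True)
diff = elearn.mean() - lecture.mean()
n1, n2 = len(elearn), len(lecture)
pooled_var = ((n1 - 1) * elearn.var(ddof=1) + (n2 - 1) * lecture.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(pooled_var * (1 / n1 + 1 / n2))
ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, n1 + n2 - 2) * se
print(f"difference = {diff:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), p = {p_value:.3f}")
```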
Previous data suggested that the accuracy of ear evaluation by medical trainees is approximately 60% (3). A pilot evaluation of 11 medical students reported a mean accuracy of 72% and a between-student variability of ±19%. Upon discussion with experts in the field of paediatric emergency medicine, we agreed that the smallest clinically significant difference would be a 15% increase in AOM diagnostic accuracy. Based on this difference, a power of 90%, and an alpha of 0.05, the required sample size was 34 medical students in each group. To account for potential loss to follow-up, this number was increased to 40 students per arm.
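For reference, the short sketch below approximately reproduces this sample size estimate under the assumption that the ±19% pilot variability is treated as a standard deviation; it is an illustration, not the authors’ original calculation.

```python
# Minimal sketch, assuming the +/-19% pilot variability is the standard deviation.
from statsmodels.stats.power import TTestIndPower

effect_size = 15 / 19  # Cohen's d: smallest meaningful difference / assumed SD
n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.90, alternative="two-sided"
)
print(round(n_per_group))  # roughly 34-35 per group, before inflating to 40 for attrition
```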
This study was approved by the institution’s Research Ethics Board and was registered at http://www.clinicaltrials.gov (Identifier NCT03101605). To participate, medical students had to provide written informed consent. Parents of children who were examined in the context of the study were offered information on the research project.
Results
Between May 2017 and September 2018, a total of 252 medical students were invited to participate in the study, of whom 201 accepted (Supplementary Appendix 2). The reasons for declining participation were not recorded. Among the 201 medical students who signed the informed consent, 139 evaluated at least one child fulfilling the inclusion criteria. Of these, 83 evaluated at least 10 ears fulfilling the inclusion criteria and were thus included in the primary analysis. Supplementary Appendix 3 shows that baseline characteristics were similar between the two groups as well as between participants who were included in the primary analysis and those who were not. Slightly more fourth-year medical students were enrolled in the e-learning group than in the lecture group in the primary analysis (20% versus 8%). Overall, participants were mostly third-year medical students; 64% had previously received teaching on AOM and 35% had done a previous rotation in paediatrics.
There was no difference between the two interventions in AOM diagnostic accuracy (Table 1): mean accuracies were 76.5% in the e-learning group and 76.4% in the lecture group, a difference of 0.1% (95%CI: −6.2 to 6.4%). Sensitivities and specificities were also similar. The secondary analysis including all medical students who evaluated at least one eligible ear also showed no difference in diagnostic accuracy (difference: −0.8%; 95%CI: −6.5 to 4.8%; Table 2). Both groups showed a substantial knowledge improvement, with mean score gains of approximately 4.5/20 points (22.5%) (Table 3), but no clinically meaningful difference was found between the groups (difference: 0.5/20 points; 95%CI: −0.8 to 1.2/20 points). Three weeks after training, medical students randomized to the e-learning module had mean scores similar to those of students randomized to the lecture (difference between the groups: −0.2/20 points; 95%CI: −1.2 to 0.8/20 points).
Table 1. Diagnostic performance of medical students included in the primary analysis (at least 10 eligible ears evaluated)

| | E-learning group (intervention), % | Lecture group (control), % | Difference, % (95%CI) |
|---|---|---|---|
| Mean sensitivity | 67.7 | 63.6 | 4.1 (−9.0 to 17.3) |
| Mean specificity | 79.0 | 80.8 | −1.8 (−8.7 to 5.1) |
| Mean diagnostic accuracy | 76.5 | 76.4 | 0.1 (−6.2 to 6.4) |
Table 2. Diagnostic performance of all medical students who evaluated at least one eligible ear

| | E-learning group (intervention), % | Lecture group (control), % | Difference, % (95%CI) |
|---|---|---|---|
| Mean sensitivity | 70.0 | 68.0 | 2.0 (−10.4 to 14.4) |
| Mean specificity | 80.2 | 82.8 | −2.6 (−8.6 to 3.4) |
| Mean diagnostic accuracy | 78.7 | 79.5 | −0.8 (−6.5 to 4.8) |
Table 3. Test scores and confidence in ear examination

| | E-learning group (n=99) | Lecture group (n=102) | Difference (95%CI) |
|---|---|---|---|
| Mean difference between pre- and post-test scores, /20 | 4.6* | 4.1‡ | 0.5 (−0.8 to 1.2) |
| Mean difference between post- and retention test scores, /20 | 4.9¤ | 5.2§ | −0.2 (−1.2 to 0.8) |
| Mean increase in confidence in ear exam technique, pre–post-study participation, /10 | 3.6† | 4.1¥ | −0.4 (−1.3 to 0.3) |
| Mean increase in confidence in ear exam interpretation, pre–post-study participation, /10 | 2.9† | 2.8¥ | 0.1 (−0.7 to 0.9) |

*Missing data for 19 students; ‡missing data for 33 students; ¤missing data for 43 students; §missing data for 57 students; †missing data for 50 students; ¥missing data for 65 students.
Using a verbal numeric scale, both groups reported similar confidence in ear examination technique (difference between the groups: −0.4; 95%CI: −1.3 to 0.3) and in interpretation of the ear examination (difference between the groups: 0.1; 95%CI: −0.7 to 0.9; Table 3).
Of the 86 medical students who completed both learning modalities and responded to the appreciation questionnaire, 53 (62%) preferred the e-learning module to the lecture, 20 (23%) preferred the lecture, and 13 (15%) had no preference. The main reasons for preferring the e-learning module were the freedom to choose when and where to complete it (89%) and its shorter completion time compared with the lecture (34%). The main reason for preferring the lecture was access to a professor who could answer questions.
Discussion
Our study demonstrated similar clinical knowledge and clinical skills related to AOM in medical students who completed an e-learning module compared with students who were given a traditional lecture. To our knowledge, this is the first study to evaluate the clinical impact of an e-learning module on AOM in medical students. In 2017, Tarpada et al. (6) published a systematic review of e-learning in otolaryngology education and found that e-learning was associated with improved knowledge acquisition (19–24) and/or satisfaction (22,25–27) in a majority of studies. Only three of the 12 studies reviewed evaluated the clinical impact of e-learning (20,28,29). One possible explanation for the absence of a difference in knowledge and clinical skills between the e-learning and lecture groups in our study is that the lecture was given in small groups by a motivated teacher. Beyea et al. evaluated an e-learning module teaching residents the particle repositioning maneuver and found it superior to standard classroom instruction but comparable to small-group clinical instruction, which supports the hypothesis that small-group lectures might be superior to large-group lectures (28). Another possible explanation is limited student motivation and engagement with the e-learning module; the effect of engagement on performance was studied by Hussain et al., who used machine learning to identify low-engagement students in a virtual learning environment (30).
The erosion of physical examination skills is a concern that clinicians have raised about e-learning. In our study, the paediatric ear examination was demonstrated through a video in the e-learning module and in person in the lecture. Although the ear examination skill was not evaluated specifically, it is crucial to making the correct diagnosis. Thus, the similar diagnostic accuracies in the two groups suggest that learners were not disadvantaged by the video-based teaching.
Another important finding of our study is that a majority of students preferred the e-learning module over the lecture because it was faster to complete and could be done whenever and wherever desired. For the students who preferred the lecture, the main reason was access to an educator who could answer questions. Acosta et al. evaluated an interactive website for teaching optometry students, comparing it to a static website and to a blended approach of online and face-to-face teaching (7). Through a survey and a focus group, they found acceptance of online learning methods, but also that some in-person time with an educator could be beneficial. Blended learning, defined as the combination of traditional learning and e-learning, was evaluated in a systematic review and meta-analysis published in 2016 by Liu et al. (31); it appeared to be at least as effective as nonblended methods in health care education. Additionally, blended learning methods could be less expensive and more sustainable than resource-intensive traditional teaching in health care, where educators are often busy clinicians (4,8,32).
Limitations
There are limitations to our study. First, medical students’ diagnostic accuracy for AOM was measured against the ear examination performed by ED attending physicians rather than otolaryngologists. Second, the content of the lecture and the e-learning module differed slightly: the latter included video examples of AOM and a video demonstrating the paediatric ear examination technique. The authors believe that the image examples of AOM and the in-person demonstration of the ear examination technique in the lecture were comparable to the videos included in the e-learning module. The e-learning group also enrolled more fourth-year medical students (20% versus 8%). However, accuracies were similar in both arms, which suggests that the e-learning group was not favored by the small content differences or by the level of the medical students enrolled. Third, medical students were not blinded, which could have led to performance bias. In addition, knowing that they were participating in a study may have led students to perform more meticulous physical examinations, because they knew an attending physician would cross-check their findings (Hawthorne effect). This may have improved diagnostic accuracy in both groups. Nevertheless, the mean diagnostic accuracy found in the study was similar to that of the pilot study, when medical students did not know they were being observed. Finally, it is unclear whether knowledge retention was preserved long term, as it was tested only approximately 3 weeks after the intervention. This was due to the 6-week paediatric rotation during which medical students were recruited. Ideally, knowledge retention would have been tested much later, perhaps 6 months after the intervention. Unfortunately, this was not possible in the context of our study because it was decided to offer both learning methods to every medical student during their paediatric rotation.
Conclusions
In summary, this study showed no difference in clinical knowledge, clinical skills, or confidence in diagnosing AOM between medical students randomized to an e-learning module and those randomized to a traditional lecture. A majority of medical students preferred e-learning to a traditional lecture. Future studies should focus on evaluating new teaching modalities, such as blended learning, to improve AOM diagnostic accuracy.
Supplementary Material
Acknowledgements
We would like to thank Mr. Pierre Guimond (technical educator), Mr. Nicolas Guillemot (multimedia designer), and Dr. Marc Lebel (paediatric infectious diseases specialist), who helped design the e-learning module. We would also like to thank Mrs. Ramona Cook and Mrs. Maryse Lagacé, research nurses, who supervised participants throughout the study period.
Author Contributions: SM, MP, AL, BHN, CHZ, and JG conceptualized and designed the study. SM, MP, and JG contributed to acquisition of data. SM and JG managed study recruitment and carried out the initial analyses. SM drafted the initial manuscript. MP and JG contributed substantially to its revision. No payment in any form was given to anyone to produce the manuscript. All authors approved the final manuscript as submitted and agree to be accountable for all aspects of the work.
Funding: There are no funders to report for this submission.
Potential Conflicts of Interest: All authors: No reported conflicts of interest. All authors have submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Conflicts that the editors consider relevant to the content of the manuscript have been disclosed.
Ethics Board Approval and Clinical Trial Registration: This study was approved by the CHU Sainte-Justine Research Ethics Board and was registered at http://www.clinicaltrials.gov: Identifier NCT03101605.
Prior Presentations: The study protocol was presented in the form of a poster at the 2017 McGill and Université de Montréal joint Pediatric Research Day for Residents and Fellows in Montreal, Canada, and orally at the 2017 Pediatric Emergency Research Canada Annual Meeting in Banff, Canada. Preliminary data were presented orally at the 2018 Pediatric Emergency Research Canada Annual Meeting in Mont-Tremblant, Canada and at the 2018 McGill and Université de Montréal joint Pediatric Research Day for Residents and Fellows in Montreal, Canada. In May 2019, the study findings were presented orally at the Society for Academic Emergency Medicine annual meeting in Las Vegas, United States.
References
1. Benjamin DK, DeLong E, Steinbach WJ. Latent class analysis: An illustrative application for education in the assessment of resident otoscopic skills. Ambul Pediatr 2004;4(1):13–7.
2. Shaikh N, Stone MK, Kurs-Lasky M, Hoberman A. Interpretation of tympanic membrane findings varies according to level of experience. Paediatr Child Health 2016;21(4):196–8.
3. Mousseau S, Lapointe A, Gravel J. Diagnosing acute otitis media using a smartphone otoscope; a randomized controlled trial. Am J Emerg Med 2018;36(10):1796–801.
4. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med 2006;81(3):207–12.
5. Jayakumar N, Brunckhorst O, Dasgupta P, Khan MS, Ahmed K. e-Learning in surgical education: A systematic review. J Surg Educ 2015;72(6):1145–57.
6. Tarpada SP, Hsueh WD, Gibber MJ. Resident and student education in otolaryngology: A 10-year update on e-learning. Laryngoscope 2017;127(7):E219–24.
7. Acosta ML, Sisley A, Ross J, et al. Student acceptance of e-learning methods in the laboratory class in Optometry. PLoS One 2018;13(12):e0209004.
8. Bock A, Modabber A, Kniha K, Lemos M, Rafai N, Hölzle F. Blended learning modules for lectures on oral and maxillofacial surgery. Br J Oral Maxillofac Surg 2018;56(10):956–61.
9. Davis JS, Garcia GD, Wyckoff MM, et al. Knowledge and usability of a trauma training system for general surgery residents. Am J Surg 2013;205(6):681–4.
10. Gillan C, Papadakos J, Brual J, et al. Impact of high-fidelity e-learning on knowledge acquisition and satisfaction in radiation oncology trainees. Curr Oncol 2018;25(6):e533–8.
11. Haskins SC, Feldman D, Fields KG, et al. Teaching a point-of-care ultrasound curriculum to anesthesiology trainees with traditional didactic lectures or an online e-learning platform: A pilot study. J Educ Perioper Med 2018;20(3):E624.
12. Le TT, Rait MA, Jarlsberg LG, Eid NS, Cabana MD. A randomized controlled trial to evaluate the effectiveness of a distance asthma learning program for pediatricians. J Asthma 2010;47(3):245–50.
13. Levine DA, Funkhouser EM, Houston TK, et al. Improving care after myocardial infarction using a 2-year internet-delivered intervention: The Department of Veterans Affairs myocardial infarction-plus cluster-randomized trial. Arch Intern Med 2011;171(21):1910–7.
14. Maertens H, Madani A, Landry T, Vermassen F, Van Herzeele I, Aggarwal R. Systematic review of e-learning for surgical training. Br J Surg 2016;103(11):1428–37.
15. Tarpada SP, Morris MT, Burton DA. E-learning in orthopedic surgery training: A systematic review. J Orthop 2016;13(4):425–30.
16. Vaona A, Banzi R, Kwag KH, et al. E-learning for health professionals. Cochrane Database Syst Rev 2018;1:CD011736.
17. Wentzell S, Moran L, Dobranowski J, et al. E-learning for chest x-ray interpretation improves medical student skills and confidence levels. BMC Med Educ 2018;18(1):256.
18. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9 Suppl):S63–7.
19. Al-Khatib T, Fanous A, Al-Saab F, Sewitch M, Razack S, Nguyen LH. Pneumatic video-otoscopy teaching improves the diagnostic accuracy of otitis media with effusion: Results of a randomized controlled trial. J Otolaryngol Head Neck Surg 2010;39(6):631–4.
20. Glicksman JT, Brandt MG, Moukarbel RV, Rotenberg B, Fung K. Computer-assisted teaching of epistaxis management: A randomized controlled trial. Laryngoscope 2009;119(3):466–72.
21. Grasl MC, Pokieser P, Gleiss A, et al. A new blended learning concept for medical students in otolaryngology. Arch Otolaryngol Head Neck Surg 2012;138(4):358–66.
22. Kandasamy T, Fung K. Interactive Internet-based cases for undergraduate otolaryngology education. Otolaryngol Head Neck Surg 2009;140(3):398–402.
23. Nicholson DT, Chalk C, Funnell WR, Daniel SJ. Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model. Med Educ 2006;40(11):1081–7.
24. Venail F, Deveze A, Lallemant B, Guevara N, Mondain M. Enhancement of temporal bone anatomy learning with computer 3D rendered imaging software. Med Teach 2010;32(7):e282–8.
25. Alnabelsi T, Al-Hussaini A, Owens D. Comparison of traditional face-to-face teaching with synchronous e-learning in otolaryngology emergencies teaching to medical undergraduates: A randomised controlled trial. Eur Arch Otorhinolaryngol 2015;272(3):759–63.
26. Cabrera-Muffly C, Bryson PC, Sykes KJ, Shnayder Y. Free online otolaryngology educational modules: A pilot study. JAMA Otolaryngol Head Neck Surg 2015;141(4):324–8.
27. Hu A, Moore C, Yu E, et al. Evaluation of patient-perceived satisfaction with photodynamic therapy for Bowen disease. J Otolaryngol Head Neck Surg 2010;39(6):688–96.
28. Beyea JA, Wong E, Bromwich M, Weston WW, Fung K. Evaluation of a particle repositioning maneuver Web-based teaching module. Laryngoscope 2008;118(1):175–80.
29. Mendez A, Seikaly H, Ansari K, Murphy R, Cote D. High definition video teaching module for learning neck dissection. J Otolaryngol Head Neck Surg 2014;43:7.
30. Hussain M, Zhu W, Zhang W, Abidi SMR. Student engagement predictions in an e-learning system and their impact on student course assessment scores. Comput Intell Neurosci 2018;2018:6347186.
31. Liu Q, Peng W, Zhang F, Hu R, Li Y, Yan W. The effectiveness of blended learning in health professions: Systematic review and meta-analysis. J Med Internet Res 2016;18(1):e2.
32. Fuchs L, Gilad D, Mizrakli Y, Sadeh R, Galante O, Kobal S. Self-learning of point-of-care cardiac ultrasound—Can medical students teach themselves? PLoS One 2018;13(9):e0204087.