Abstract
Background
e-Learning is an underutilized tool in education for the health professions, and radiation medicine, given its reliance on technology for clinical practice, is well-suited to training simulation in online environments. The purpose of the present study was to evaluate the knowledge impact and user interface satisfaction of high-fidelity (hf) compared with low-fidelity (lf) e-learning modules (e-modules) in radiation oncology training.
Methods
Two versions of an e-module on lung radiotherapy (lf and hf) were developed. Radiation oncology residents and fellows were invited to be randomized to complete either the lf or the hf module through individual online accounts over a 2-week period. A 25-item multiple-choice knowledge assessment was administered before and after module completion, and user interface satisfaction was measured using the Questionnaire for User Interaction Satisfaction (quis) tool.
Results
Of 18 trainees, 8 were randomized to the lf module, and 10, to the hf module. Overall, knowledge assessment performance increased (11%, p < 0.05), with hf-group participants reporting a 13% improvement (p = 0.02), and senior participants reporting an almost 15% improvement (p < 0.01). Scores on the quis indicated that participants were satisfied with various aspects of the user interface.
Conclusions
The hf e-module had a greater impact on knowledge acquisition, and users expressed satisfaction with the interface in both the hf and lf situations. The use of e-learning in a competency-based curriculum could have educational advantages; participants expressed benefits and drawbacks. Preferences for e-learning integration in education for the health professions should be explored further.
Keywords: Radiation medicine, e-learning, radiation oncology training, user interaction satisfaction
INTRODUCTION
In education for the health professions, e-learning is an underutilized tool1,2. The instructional methods in e-learning provide a standardized, self-paced, safe, asynchronous learning environment that can support knowledge- and skill-building3,4. The e-learning approach can overcome financial, temporal, and infrastructure barriers to health care education5,6. Although current models of e-learning often make use of interactive components such as videos, animations, and quizzes, the ability of this approach to effectively engage the learner in active learning, at the appropriate time in their training, has not been clearly established in the literature4. Given the limitations of the predominant “low fidelity” (lf) approach, such as lack of user engagement and inattention to variation in learning styles, e-learning in the clinical environment has focused primarily on the delivery of didactic content6,7.
Clinical simulation has successfully modelled “high-fidelity” (hf) environments to support development and integration of knowledge, team-based competencies, and technical skills. The success of simulation has been demonstrated in areas such as emergency medicine, obstetrics, surgery, and radiation medicine8–11. Given its reliance on technology for clinical practice, including target delineation and image-guided radiation therapy image registration, radiation medicine is well-positioned to incorporate simulation in online environments12–14. Research has shown multiple benefits for trainees, including increased self-efficacy, better clinical judgment, enhanced clinical skills, and improved cognitive insights15–19.
Exploring the role of hf e-learning in training for the health professions is needed. Radiation oncology training in Canada is moving to a competency-based model20. Competency-based medical education (cbme) requires increased frequency and diversity of assessment tools. The need to adapt current training models to a cbme approach has created an urgent need for educators to explore the feasibility and integration of new teaching and assessment methods into training. The purpose of the present study was to evaluate the knowledge impact and user interface satisfaction for hf compared with lf e-learning modules (e-modules) in radiation oncology training. The study was conducted with research ethics board approval.
METHODS
e-Module Development
The two e-modules were developed using the Storyline software application (Articulate Global, New York, NY, U.S.A.). The clinical content was based on an early-stage lung cancer case treated with stereotactic body radiotherapy. The 5-section e-module addressed clinical decision-making, treating with radiation therapy, motion management in lung radiation therapy, treatment planning for lung stereotactic body radiotherapy, and general and lung-specific image guidance and treatment delivery.
Development of the e-modules involved defining the functional fidelity of the simulation component or technical skills exercise, which refers to the degree to which the skills required for a real task (for example, image registration) are captured in a simulated task21. The lf version presented text-based slides and optional readings. The hf version presented videos, imaging datasets, decision-making exercises, and knowledge assessment slides. The hf e-module contained 96 pages, 24 videos totalling 140 minutes, and 1 technical skills exercise; the lf e-module contained 86 pages, 9 videos totalling 15 minutes, and no such exercise. The technical skills exercise used a nonclinical version of the Elekta (Stockholm, Sweden) XVI software hosted on a virtual machine to practice image registration of computed tomography data. The e-module content underwent a comprehensive expert review by interprofessional members of the clinical lung radiotherapy team, including 4 radiation oncologists, radiation therapists, medical physicists, and dosimetrists.
Study Population and Recruitment
All radiation oncology residents (n = 28) and fellows (n = 28) at the Princess Margaret Cancer Centre, affiliated with the University of Toronto, were invited to participate in the study. Trainees were approached by a research assistant, and informed consent was obtained before the trainee could participate in the study. Participants were randomly assigned to either the hf or the lf e-module.
e-Module Access and Evaluation
Each participant was assigned a unique anonymized login. Participants had access for 14 days to complete the pre-test, the e-module, and the post-test.
The pre-module assessment consisted of a 25-item multiple-choice knowledge assessment developed by a lung radiation oncologist and used previously for other training purposes. The post-module assessment consisted of a second set of 25 multiple-choice questions. A single bank of multiple-choice questions and a corresponding answer key were created and then split in two to create the pre- and post-test assessments. The items in the pre- and post-module assessments differed, but were matched to ensure evaluation of the same knowledge domains and to facilitate comparisons. The knowledge assessments were scored by 1 investigator who was blinded to each participant’s assigned group. Participants also completed a modified Questionnaire for User Interaction Satisfaction [quis (version 7.0: University of Maryland Human–Computer Interaction Lab, College Park, MD, U.S.A.)], a validated tool designed to assess a user’s subjective satisfaction with a human–computer interface22. Allowing for modification based on appropriateness, 19 of the 27 quis items in four sections were used for the present study. Each item was measured on a 9-point general dislike–like Likert scale, where 1 represented “greatly dislike” and 9 represented “greatly like.”
Group differences for continuous variables were examined using t-tests. Proportions were compared using the Fisher exact test. Pre- and post-module assessments were compared using paired t-tests. All tests were 2-sided. The SAS software application (version 9.4: SAS Institute, Cary, NC, U.S.A.) was used for the data analysis.
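The statistical approach described above can be sketched in Python with SciPy; the study itself used SAS, and the arrays below are purely illustrative, not the study data:

```python
from scipy import stats

# Illustrative pre/post knowledge scores (out of 25) for one group.
# These are made-up values, NOT the study data.
pre = [12, 15, 14, 18, 13, 16, 11, 17]
post = [15, 17, 16, 20, 15, 19, 14, 18]

# Paired t-test: pre- vs. post-module scores within the same participants.
t_paired, p_paired = stats.ttest_rel(pre, post)

# Independent two-sample t-test: e.g., comparing scores between two groups.
group_a = [14, 16, 13, 18, 15]
group_b = [12, 14, 11, 15, 13]
t_ind, p_ind = stats.ttest_ind(group_a, group_b)

# Fisher exact test on a 2x2 table of counts,
# e.g., improved vs. not improved, by module type.
table = [[8, 2],   # group 1: improved, not improved
         [7, 1]]   # group 2: improved, not improved
odds_ratio, p_fisher = stats.fisher_exact(table)

# All three SciPy tests return 2-sided p values by default,
# matching the 2-sided testing described in the Methods.
print(round(p_paired, 4), round(p_ind, 4), round(p_fisher, 4))
```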
RESULTS
Study Participants
Of 56 eligible trainees, 18 participated in the study (response rate: 32%). Table I summarizes the participant training experiences. The range in postgraduate (pgy) experience was year 1 of a 5-year radiation oncology residency to year 2 of a 2-year fellowship. That variable was dichotomized as “junior” (pgy 1–3, n = 8) and “senior” (pgy 4–5 and fellows, n = 10). At the time of the study, 9 participants (50%) had had at least 1 postgraduate clinical rotation in lung cancer. More than half the study participants (n = 10) had some prior experience with e-learning.
TABLE I.
| Module type | Postgraduate year | Participants (n) | Lung rotation [n (%)] | e-Learning experience [n (%)] |
|---|---|---|---|---|
| High fidelity | Junior (1–3) | 4 | 1 (25) | 1 (25) |
| | Senior (4–5 or Fellow) | 6 | 4 (67) | 4 (67) |
| | Overall | 10 | 5 (50) | 5 (50) |
| Low fidelity | Junior (1–3) | 4 | 1 (25) | 2 (50) |
| | Senior (4–5 or Fellow) | 4 | 3 (75) | 3 (75) |
| | Overall | 8 | 4 (50) | 5 (63) |
Knowledge Acquisition
Knowledge acquisition was assessed using the pre- and post-tests, in which participants could achieve a maximum total score of 25. Overall, knowledge test scores increased significantly after the e-module, by an average of almost 11%. Senior participants demonstrated a significant increase in knowledge scores from pre- to post-test, improving by an average of just under 15% (p < 0.01); junior participants improved by an average of only 6%, an increase that was not statistically significant. Comparing pgy groups, a greater proportion of senior than junior participants improved their knowledge scores from pre- to post-test (100% vs. 63%), although that difference was not statistically significant. Participants who completed the hf e-module had a significant increase in their knowledge score (56% pre vs. 70% post, p = 0.02); those who completed the lf e-module had a nonsignificant increase to 64% from 56% (p = 0.12). Table II summarizes the results.
TABLE II.
| Variable | Participants (n) | Before module: mean (%) | After module: mean (%) | p Value | Increased total score [% (95% CI)] |
|---|---|---|---|---|---|
| Module type | | | | | |
| High fidelity | 10 | 14.1±3.8 (56.4) | 17.4±4.3 (69.6) | 0.018 | 80 (44–97) |
| Low fidelity | 8 | 14.0±4.0 (56.0) | 16.1±3.9 (64.4) | 0.117 | 88 (47–100) |
| p Value | | 0.958 | 0.525 | | >0.999 |
| Postgraduate year | | | | | |
| Senior (4–5 or Fellow) | 10 | 15.3±2.9 (61.2) | 19.0±3.2 (76.0) | 0.003 | 100 (69–100) |
| Junior (1–3) | 8 | 12.5±4.4 (50.0) | 14.1±3.5 (56.4) | 0.284 | 63 (24–91) |
| p Value | | 0.123 | 0.007 | | 0.069 |
| Prior clinical rotation in lung | | | | | |
| Yes | 9 | 16.2±2.3 (64.8) | 18.9±2.3 (75.6) | 0.034 | 89 (52–100) |
| No | 9 | 11.9±3.8 (47.6) | 14.8±4.5 (59.2) | 0.060 | 78 (40–97) |
| p Value | | 0.01 | 0.027 | | >0.999 |
| Overall | 18 | 14.1±3.8 (56.4) | 16.8±4.1 (67.2) | 0.003 | 83 (59–96) |
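The percentages in Table II follow directly from the raw mean scores out of a maximum of 25; a minimal arithmetic check in Python, using the published hf-group means as an example:

```python
MAX_SCORE = 25  # maximum total on each 25-item assessment

def pct(mean_score):
    """Convert a mean raw score (out of 25) to a percentage."""
    return 100 * mean_score / MAX_SCORE

# HF group: published mean scores of 14.1 (pre) and 17.4 (post).
pre_pct = pct(14.1)               # 56.4%
post_pct = pct(17.4)              # 69.6%
improvement = post_pct - pre_pct  # ~13.2 percentage points, the "13%" gain

print(pre_pct, post_pct, round(improvement, 1))
```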
User Interface Satisfaction
The 19 items of the modified quis were organized into four sections: overall user reaction, overall screen interface, terminology and system information, and learning. Table III summarizes the mean scores of the items within each of the four sections of the quis, overall and by e-module type and previous e-learning experience.
TABLE III.
| Variable | Participants (n) | Section 1 (6 items) | Section 2 (4 items) | Section 3 (5 items) | Section 4 (4 items) | Overall (19 items) |
|---|---|---|---|---|---|---|
| Module type | | | | | | |
| High fidelity | 10 | 6.2±1.4 | 7.4±0.7 | 7.6±0.8 | 6.9±1.7 | 7.0±0.9 |
| Low fidelity | 8 | 7.0±1.3 | 7.8±0.7 | 8.0±0.9 | 8.2±1.0 | 7.7±0.9 |
| p Value | | 0.251 | 0.292 | 0.326 | 0.076 | 0.128 |
| Previous exposure to e-learning | | | | | | |
| Yes | 10 | 6.8±1.6 | 7.5±0.6 | 7.8±0.8 | 8.0±1.5 | 7.5±1.0 |
| No | 8 | 6.3±1.2 | 7.5±0.9 | 7.8±0.8 | 7.0±1.5 | 7.1±0.9 |
| p Value | | 0.509 | 0.926 | 0.962 | 0.209 | 0.448 |
| Overall | 18 | 6.5±1.4 | 7.5±0.7 | 7.8±0.8 | 7.4±1.5 | 7.3±0.9 |
Sections: 1 = overall user reactions (satisfaction, stimulation, ease); 2 = screens (layouts, sequence, and so on); 3 = terminology or system information (communication with user); 4 = learning (intuitiveness, adaptability to interface).
The overall average quis rating was 7.3 (out of 9), suggesting that participants were fairly satisfied with the user interface of the e-module. The highest-rated section was section 3, terminology and system information (7.8 ± 0.8); the lowest-rated was section 1, overall user reactions (6.5 ± 1.4).
Examining mean scores across all sections of the quis, differences were evident when comparing e-module type and prior exposure to e-learning. Participants in the lf group gave an overall mean score of 7.7; the hf group scored the interface at 7.0. Every section was scored higher by participants in the lf group than by those in the hf group, with the largest margin of difference in section 4, learning, which assessed participant satisfaction with the intuitiveness and adaptability of the interface; there, the mean scores of the lf and hf participants differed by more than 1 point (8.2 ± 1.0 vs. 6.9 ± 1.7).
DISCUSSION
As an educational format, e-learning is viable for integration into radiation oncology training. Knowledge gain might, however, be better supported by hf e-modules, and careful attention to best practices in user interface design is needed.
The flexibility afforded by e-learning is well documented in the literature3,23. Implementing such a resource formally in the residency curriculum, likely as a pre-rotation tool, could maximize clinical interaction, because learners can master the “basics” offline in a self-directed manner.
A commonly noted apprehension of clinical faculty in the adoption of cbme is the burden of frequent individualized assessment and feedback20,24,25. The content created for the development of the e-modules was comprehensive, and it addressed the current curricular objectives for early-stage lung cancer management in Radiation Oncology at Princess Margaret Cancer Centre (which would otherwise have been delivered in a didactic teaching model). Once the approach is validated, offloading some of the traditional didactic teaching from the immediate clinical environment to e-learning might be a reasonable potential consideration, thus creating time to focus on assessment and feedback. In overall effectiveness, e-learning is generally comparable with more traditional approaches5; but, implemented across a training program, robust and tailored e-learning resources have the potential to contribute to standardization of experience in training, lessening reliance on clinical faculty to convey the foundational knowledge of the field and also mitigating other limitations (such as scheduling challenges) inherent in face-to-face synchronous educational approaches3. The addition of e-learning could also facilitate improved resource-sharing across education centres and training programs.
Resource implications in establishing and maintaining e-learning resources, including managing changes in software and the rapid pace of new scientific knowledge, cannot be ignored, but should be contextualized in light of faculty time savings and the enduring nature of many of the resources. Development and maintenance of e-learning modules can be centralized, handled on a rotating basis, or divided between multiple programs or educators, while benefiting a high volume of trainees. For similar reasons, there is additional potential to explore engagement of professionals in related disciplines (for radiation oncology, those disciplines might include radiation therapy or medical physics), both to enrich the scope of the content and to build interprofessional education and collaboration opportunities26. Greater integration of e-learning strategies into training curricula will increase familiarity with common tools and functionality. In the present study, participants with prior e-learning experience rated the system as more intuitive than did those without such experience. Similarly, attention should be paid to avoid compromising the educational value of e-learning with complex technological requirements.
No qualitative data were collected; that aspect was beyond the scope of the study. However, participant feedback detailed some of the drawbacks of e-learning, such as a sense of isolation arising from a lack of engagement or interactivity with faculty and other learners, which is neither an entirely novel criticism nor an insurmountable issue with the learning format27–29. With appropriate monitoring and facilitation from educators or clinical faculty, interactive forums or other opportunities for communication can be incorporated into e-learning to further heighten the interactivity explored in the present study through videos, quizzes, and hands-on clinical exercises. Such tools have been incorporated (with documented success) in developing fields such as moocs (massive open online courses), like those provided by Coursera (Mountain View, CA, U.S.A.)30,31.
Limitations of the present study include its small sample size, the magnitude of difference between the lf and hf e-modules, and the isolated nature of the e-learning experience. Although a response rate of one third of the trainees agreeing to participate should be considered acceptable (and the demographics were considered reasonable when compared with the study population), it is possible that the voluntary participation model conferred a selection bias—in that those who agreed to participate might have been more receptive to e-learning or more engaged in the clinical content area.
No measures were taken to control potential collegial cross-contamination of the pre- and post-test assessments; however, the study was optional and had no effect on program assessment for the trainees, so there would have been little motivation to share content. Given that the strength of e-learning is thought to lie in its role as a strategy that complements other curricular elements, assessing it in isolation from the clinical experience and from interaction with faculty limits the insights that can be gleaned. Additionally, improvements in knowledge acquisition were statistically significant only in senior trainees, which might reflect their greater experience: most would have completed a lung rotation and could potentially infer more from the modules based on prior clinical exposure. Although the results did not demonstrate the same improvement for the junior trainees, that is not to say that an e-learning experience would not benefit their knowledge acquisition. When to integrate e-learning resources into a clinical training program for optimal learning remains unknown; however, the present work constitutes proof-of-principle for the format and basic content, providing a starting point for future work that could better inform integration of e-learning strategies into a comprehensive cbme curriculum.
In providing e-learning resources in an enduring learning management system, trainees would have the advantage of being able to access information as personally required. Future studies could also be designed to collect data about the time and costs of developing such resources for a cost–benefit comparison of blended education models.
CONCLUSIONS
The hf version of e-learning had a greater impact on knowledge acquisition, and participants in both the hf and lf groups expressed satisfaction with the user interface. The use of e-learning in a competency-based curriculum could have educational advantages, and the expressed benefits, drawbacks, and preferences for e-learning integration should be explored further.
ACKNOWLEDGMENTS
The authors acknowledge the support of a University of Toronto Faculty of Medicine Educational Development Fund grant for the work done to complete this project.
Footnotes
CONFLICT OF INTEREST DISCLOSURES
We have read and understood Current Oncology’s policy on disclosing conflicts of interest, and we declare that we have none.
REFERENCES
- 1.Pucer P, Vavpotic D, Žvanut B. Improving the use of e-learning in health care curricula: presentation of best practices. J e-Learn High Educ Manag. 2016;2016:14–25. [Google Scholar]
- 2.Sheen ST, Chang WY, Chen HL, Chao HL, Tseng CP. e-Learning education program for registered nurses: the experience of a teaching medical center. J Nurs Res. 2008;16:195–201. doi: 10.1097/01.JNR.0000387306.34741.70. [DOI] [PubMed] [Google Scholar]
- 3.Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006;81:207–12. doi: 10.1097/00001888-200603000-00002. [DOI] [PubMed] [Google Scholar]
- 4.Petty J. Interactive, technology-enhanced self-regulated learning tools in healthcare education: a literature review. Nurse Educ Today. 2013;33:53–9. doi: 10.1016/j.nedt.2012.06.008. [DOI] [PubMed] [Google Scholar]
- 5.Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300:1181–96. doi: 10.1001/jama.300.10.1181. [DOI] [PubMed] [Google Scholar]
- 6.Gordon M, Chandratilake M, Baker P. Low fidelity, high quality: a model for e-learning. Clin Teach. 2013;10:258–63. doi: 10.1111/tct.12008. [DOI] [PubMed] [Google Scholar]
- 7.Gordon M, Chandratilake M, Baker P. Improved junior paediatric prescribing skills after a short e-learning intervention: a randomised controlled trial. Arch Dis Child. 2011;96:1191–4. doi: 10.1136/archdischild-2011-300577. [DOI] [PubMed] [Google Scholar]
- 8.Paige J, Kozmenko V, Morgan B, et al. From the flight deck to the operating room: an initial pilot study of the feasibility and potential impact of true interdisciplinary team training using high-fidelity simulation. J Surg Educ. 2007;64:369–77. doi: 10.1016/j.jsurg.2007.03.009. [DOI] [PubMed] [Google Scholar]
- 9.Paige JT, Kozmenko V, Yang T, et al. Attitudinal changes resulting from repetitive training of operating room personnel using high-fidelity simulation at the point of care. Am Surg. 2009;75:584–90. [PubMed] [Google Scholar]
- 10.Paige JT, Kozmenko V, Yang T, et al. High-fidelity, simulation-based, interdisciplinary operating room team training at the point of care. Surgery. 2009;145:138–46. doi: 10.1016/j.surg.2008.09.010. [DOI] [PubMed] [Google Scholar]
- 11.Robertson B, Schumacher L, Gosman G, Kanfer R, Kelley M, DeVita M. Simulation-based crisis team training for multi-disciplinary obstetric providers. Simul Healthc. 2009;4:77–83. doi: 10.1097/SIH.0b013e31819171cd. [DOI] [PubMed] [Google Scholar]
- 12.Deraniyagala R, Amdur RJ, Boyer AL, Kaylor S. Usability study of the EduMod eLearning Program for contouring nodal stations of the head and neck. Pract Radiat Oncol. 2015;5:169–75. doi: 10.1016/j.prro.2014.10.008. [DOI] [PubMed] [Google Scholar]
- 13.Gillespie EF, Panjwani N, Golden DW, et al. Multi-institutional randomized trial testing the utility of an interactive three-dimensional contouring atlas among radiation oncology residents. Int J Radiat Oncol Biol Phys. 2017;98:547–54. doi: 10.1016/j.ijrobp.2016.11.050. [DOI] [PubMed] [Google Scholar]
- 14.Gunther J, Liauw S, Choi S, Stepaniak C, Das P, Golden D. Post-operative prostate and seminal vesicle fossae contouring module: evaluation of medical student target delineation before and after a teaching intervention [Web resource] MedEdPORTAL. 2015;11:10199. [Downloadable from: https://www.mededportal.org/publication/10199; cited 15 February 2018] [Google Scholar]
- 15.Lehmann R, Bosse HM, Simon A, Nikendei C, Huwendiek S. An innovative blended learning approach using virtual patients as preparation for skills laboratory training: perceptions of students and tutors. BMC Med Educ. 2013;13:23. doi: 10.1186/1472-6920-13-23. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Bambini D, Washburn J, Perkins R. Outcomes of clinical simulation for novice nursing students: communication, confidence, clinical judgment. Nurs Educ Perspect. 2009;30:79–82. [PubMed] [Google Scholar]
- 17.Rogers PL, Jacob H, Thomas EA, Harwell M, Willenkin RL, Pinsky MR. Medical students can learn the basic application, analytic, evaluative, and psychomotor skills of critical care medicine. Crit Care Med. 2000;28:550–4. doi: 10.1097/00003246-200002000-00043. [DOI] [PubMed] [Google Scholar]
- 18.Adrales G, Chu U, Witzke D, et al. Evaluating minimally invasive surgery training using low-cost mechanical simulations. Surg Endosc. 2003;17:580–5. doi: 10.1007/s00464-002-8841-7. [DOI] [PubMed] [Google Scholar]
- 19.Schwid HA, Rooke GA, Carline J, et al. on behalf of the Anesthesia Simulator Research Consortium. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology. 2002;97:1434–44. doi: 10.1097/00000542-200212000-00015. [DOI] [PubMed] [Google Scholar]
- 20.Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32:638–45. doi: 10.3109/0142159X.2010.501190. [DOI] [PubMed] [Google Scholar]
- 21.Maran NJ, Glavin RJ. Low- to high-fidelity simulation—a continuum of medical education? Med Educ. 2003;37(suppl 1):22–8. doi: 10.1046/j.1365-2923.37.s1.9.x. [DOI] [PubMed] [Google Scholar]
- 22.Chin JP, Diehl VA, Norman KL. Development of an instrument measuring user satisfaction of the human–computer interface. In: O’Hare JJ, editor. CHI ’88; Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; Washington, DC, U.S.A.. 15–19 May 1988; New York, NY: Association for Computing Machinery; 1988. pp. 213–18. [DOI] [Google Scholar]
- 23.Alfieri J, Portelance L, Souhami L, et al. Development and impact evaluation of an e-learning radiation oncology module. Int J Radiat Oncol Biol Phys. 2012;82:e573–80. doi: 10.1016/j.ijrobp.2011.07.002. [DOI] [PubMed] [Google Scholar]
- 24.Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32:676–82. doi: 10.3109/0142159X.2010.500704. [DOI] [PubMed] [Google Scholar]
- 25.Hersh WR, Bhupatiraju RT, Greene P, Smothers V, Cohen C. Adopting e-learning standards in health care: competency-based learning in the medical informatics domain. AMIA Annu Symp Proc. 2006:334–8. [PMC free article] [PubMed] [Google Scholar]
- 26.Macdonald CJ, Stodel EJ, Chambers LW. An online interprofessional learning resource for physicians, pharmacists, nurse practitioners, and nurses in long-term care: benefits, barriers, and lessons learned. Inform Health Soc Care. 2008;33:21–38. doi: 10.1080/14639230801886824. [DOI] [PubMed] [Google Scholar]
- 27.Tyler-Smith K. Early attrition among first time eLearners: a review of factors that contribute to drop-out, withdrawal and non-completion rates of adult learners undertaking eLearning programmes. J Online Learn Teach. 2006;2:73–85. [Google Scholar]
- 28.O’Neill K, Singh G, O’Donoghue J. Implementing eLearning programmes for higher education: a review of the literature. J Inform Tech Educ. 2004;3:313–23. doi: 10.28945/304. [DOI] [Google Scholar]
- 29.Booth A, Carroll C, Papaioannou D, Sutton A, Wong R. Applying findings from a systematic review of workplace-based e-learning: implications for health information professionals. Health Inform Libr J. 2009;26:4–21. doi: 10.1111/j.1471-1842.2008.00834.x. [DOI] [PubMed] [Google Scholar]
- 30.Yuan L, Powell S. MOOCs and Open Education: Implications for Higher Education. A White Paper. Bolton, UK: JISC Centre for Educational Technology and Interoperability Standards; 2013. [Available online at: https://publications.cetis.org.uk/wp-content/uploads/2013/03/MOOCs-and-Open-Education.pdf; cited 15 February 2018] [Google Scholar]
- 31.Harder B. Are moocs the future of medical education? BMJ. 2013;346:f2666. doi: 10.1136/bmj.f2666. [DOI] [PubMed] [Google Scholar]