Abstract
Background
Internationally, healthcare systems face rising costs and the demands of an ageing population. System-wide improvement is needed to help address these issues; therefore, large-scale training of staff in improvement skills is required.
An established method of training at scale is digitally delivered training, including massive open online courses (MOOCs). Within the National Health Service in England, wide-scale variation exists in digital education and training standards. This study evaluates an education programme, known as MOOC School, that sought to address educational skills shortfalls by training subject matter experts and trainers in interactive, online learning techniques.
Methods
This evaluation assessed the MOOC School training programme’s impact on participants’ ability to design, develop and deliver online learning. A mixed-methods approach was used, with data collected from existing application and attendance records, surveys and interviews with volunteers who self-identified as having put the learning into practice. The study aimed to identify key success factors of a health educator digital upskilling programme.
Results
The MOOC School programme ran seven formal cohorts over 3 years with 96 participants, plus two informal, coaching-style cohorts with 14 participants the following year. The programme was well received, with 97% rating the course as good or very good. MOOC School helped participants to reach over 30 000 enrolments through courses they created with their teams after undertaking the training, helping to fill a significant training gap. Participants reported gaining important skills and insight into what is possible when delivering training in new ways.
Conclusions
The experience and plans of the participants support the need for more creative training practices and digitally literate health educators to deliver the training that is required. The findings of the evaluation highlight a way forward in defining the essential skills and knowledge needed to create high-quality digital learning at scale.
Keywords: Continuous quality improvement, Quality improvement, Healthcare quality improvement, Quality improvement methodologies, Education
WHAT IS ALREADY KNOWN ON THIS TOPIC
The need to upskill educators in digital learning methodologies and practices is driven by rapid advances in technology and practice. Collaborative digital learning is a successful medium for delivering skills and knowledge, such as improvement and transformation.
WHAT THIS STUDY ADDS
An evaluation of a digital learning skills train-the-trainer programme for improvement educators. Organised by levels of the Kirkpatrick learning evaluation framework, the evaluation highlights essential criteria and success factors.
HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY
The study recommends vital elements for inclusion in a successful train-the-trainer digital learning skills programme.
Introduction
Globally, health services are facing increasingly complex challenges. The origins of these challenges include a growing and ageing population, medical advances,1 2 patient backlogs caused by the COVID-19 pandemic and workforce shortages.3 To address these challenges, a quadruple aim framework for care has been established and promoted internationally to guide the redesign and transformation of healthcare services.4 The quadruple aims prioritise improving the quality of patient care, increasing population health, reducing per-capita cost and improving staff experience. Developing a culture of system-wide quality improvement (QI) is required to deliver the changes required at scale and pace.5 6 Systematic methods of improvement have been promoted to health systems internationally to support progress with the goals of the quadruple aim.7 Building improvement capability is considered a key component of transforming healthcare.7–11
Technology-enhanced learning, encompassing several variations of online learning including massive open online courses (MOOCs), is considered essential to building improvement capability. Representing a growing market for achieving wide-reaching capability development,12 MOOCs aim to disseminate knowledge and provide education online to limitless participants, using interactive online platforms to expand QI capability training across vast areas and nations. Providing learning solutions online offers equity and ease of access for those in remote or rural areas and those working flexibly. Online learning can meet training needs at scale, saving time and costs compared with in-person training events,13 as well as improving the participant experience.3
MOOCs are suggested as adaptable and cost-effective ways to educate health and care personnel in QI13 and have been used to teach QI across the globe.14–18 MOOCs have been used and evaluated successfully for several years within the National Health Service (NHS) as a method of delivering improvement training at scale.17 18
The digital competence of educators has been shown to be a leading factor in successful digital education.19–22 To help organisations self-assess and benchmark their current ability to design, develop and evaluate training, NHS England has published a suite of education and training skills standards.23 It is unclear to what extent these skills are in place among informal educators within the NHS, but studies have identified the value of training health educators in adult learning theory, pedagogy and technical skills, rather than assuming experts in their field already possess them.24 25 Fortunately, the digital competence of trainers can be effectively improved through skills-based training.26 To address this skills gap, a training programme to build interactive, online learning skills among NHS staff, entitled ‘MOOC School’, was developed.
This study evaluates the MOOC School training programme, assessing its effectiveness in building skills and knowledge in best practice online learning creation. The findings highlight a way forward in defining the essential skills and knowledge needed to create high-quality digital learning at scale and so further spread essential improvement skills. Overall, the study assesses a contemporary teaching approach that is integral to delivering extensive distance learning and could therefore be highly beneficial to educators, especially in the QI field, on a global scale.
MOOC School overview
MOOC School was an NHS training programme developed to build best practice technology enhanced learning skills in both formal and informal improvement educators across the wider health and care system. The programme aimed to equip participants with the knowledge and skills to create interactive MOOC style learning, based on connectivism theory. Connectivism emphasises the role of connecting and networking in the online learning process.27 It focuses on engaging the learner in activities and encouraging collaboration, sharing and debate with other participants to widen the learning experience. The curriculum covered essential skills for teaching and learning design—an overview of the sessions is provided in table 1. Supplementing trainers’ content knowledge with an understanding of pedagogy and technology is believed to provide the best quality digital training.28
Table 1. MOOC school curriculum of sessions.
| Week | Module | Topic |
|---|---|---|
| 1 | 1 | Induction |
| 1 | 2 | Teaching and learning skills for designing and planning |
| 2 | 3 | Designing assessment and certification of learning |
| 2 | 4 | Design features and developing materials |
| 3 | 5 | Design and collect evidence of evaluation and improvements |
| 3 | 6 | Social network and facilitation skills |
| 4 | 7 | Assessment planning, resources and going forward |
| 4 | 8 | (Optional) Using the QI Learning Platform |
MOOC, massive open online course.
MOOC School was initially delivered as a 4-week programme of 3-hour synchronous webinars delivered via Webex or MS Teams to small cohorts of 10–20 participants. The live webinars contained a mix of knowledge transfer sessions and participant practical collaboration exercises to practise skills. The live sessions were supported by homework and additional resources on a closed section of the internal QI learning platform (a social, learner experience platform). At the end of the programme, participants designed, developed and delivered a short online learning piece for assessment.
Later cohorts delivered the same learning via small, personal coaching sessions mixed with assisted development of live learning packages by the participants.
Methods
Education is moving increasingly into the digital arena29 due to advances in technology and the digital shift created by the 2020 pandemic and lockdowns.30 Consequently, health organisations are under pressure to upskill their educators in digital practices. MOOCs are an accepted method of delivering training to large audiences due to their scalability and ease of access.
The aim of this evaluation was to determine the key success factors in upskilling educators in digital learning practices. Mixed evaluation methods were applied to gain insights from both qualitative and quantitative sources. Attendance data and pre- and post-course surveys provided a statistical base, with qualitative data from surveys and interviews enhancing understanding and supporting conclusions.
Approach
Several models are available for evaluating learning; popular examples include Context, Input, Process, Product (CIPP), Brinkerhoff and Kirkpatrick. The CIPP model provides a systematic framework for evaluating educational programmes by examining their context, inputs, processes and products,31 and works best when implemented throughout the training delivery process. The Brinkerhoff Success Case Method32 focuses on identifying and analysing the most and least successful cases of learning transfer and impact from a training programme, to identify what works and what does not. While this model is useful for highlighting areas for improvement, it is less useful for understanding long-term impact and examining a programme across the full spectrum of learning levels.32 Among the most used evaluation models, Kirkpatrick’s model is highly popular and is particularly effective for evaluating online learning.33 34 The Kirkpatrick framework has been a staple of education evaluation since the 1950s. Kirkpatrick defines four levels of evaluation: initial reaction and satisfaction with the learning delivered, identifying what was learnt, determining how this changed student behaviour, and considering the wider results or benefits to the individual or organisation. The Kirkpatrick model is flexible and adaptable, making it easy to implement across different environments and fields, and has been recommended for evaluating MOOCs.35 It has been widely used in previous evaluations, has been demonstrated to be effective at evaluating MOOCs with an improvement theme13 17 18 and was therefore selected for this evaluation.
A mix of methods and data sets was used to evaluate the effectiveness and impact of the MOOC School internal training programme, including existing data sets, surveys and semi-structured qualitative interviews.
The MOOC School programme ran from 2018 to 2023 with nine cohorts of up to 20 participants each. Participants initially self-selected to apply and completed an application statement; potential candidates were asked to detail their experience, their reason for applying and how the learning would be put into practice. From over 200 applications, 127 applicants were offered a place, including 14 selected from internal teams for personal coaching; these participants form the focus of this evaluation. Successful participants came from a range of backgrounds: NHS and non-NHS, clinical and non-clinical, with and without a learning and development (L&D) background. Due to the relatively small stakeholder population, this study considered all successful applicants to the MOOC School programme, regardless of attendance record or completion status.
Data collection and analysis
A variety of new and existing data was collected; this formed the background analysis of participants in the programme and contributed to the overall understanding and triangulation of results.36 Quantitative analysis of the existing data sets provided further understanding of participants’ backgrounds and prior experience.
Data were collected from existing application forms and attendance records to form a demographic profile of the participants and their prior experience. A post-course survey was completed at the end of each course by those who finished it. The post-course survey asked participants to rate their knowledge and confidence, both as it had been before the course and as it was after the course. This produced ‘paired samples’ of before and after knowledge and confidence data, which were analysed using the Wilcoxon signed-rank test to measure the learning achieved. The Wilcoxon signed-rank test is used to test for differences between paired samples of non-parametric, ordinal data and has been used to analyse knowledge and confidence change in improvement-themed MOOCs.13 18
In this case, participants rated their knowledge and confidence in a number of areas on a 5-point, Likert-style scale from ‘very low’ to ‘very high’. The survey also requested 5-point Likert ratings of statements regarding the effectiveness of the training, from ‘strongly disagree’ to ‘strongly agree’.
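To illustrate the paired analysis described above, the following is a minimal sketch in Python; the ratings are hypothetical (not study data), the Likert responses are assumed to be encoded 1 (‘very low’) to 5 (‘very high’), and the percentage-change calculation shows one plausible way such summary figures can be derived.

```python
# Minimal sketch of a paired pre/post Likert analysis.
# The ratings below are hypothetical; real data would come from the
# post-course survey, encoded 1 ('very low') to 5 ('very high').
from scipy.stats import wilcoxon

knowledge_before = [2, 1, 3, 2, 2, 1, 3, 2, 2, 3]  # hypothetical paired samples
knowledge_after = [4, 3, 4, 3, 4, 3, 5, 4, 3, 4]

# Wilcoxon signed-rank test: non-parametric test for paired ordinal data
stat, p_value = wilcoxon(knowledge_before, knowledge_after)

# Relative change in mean rating: one plausible way of deriving summary
# figures such as an 'average percentage increase in knowledge'
mean_before = sum(knowledge_before) / len(knowledge_before)
mean_after = sum(knowledge_after) / len(knowledge_after)
pct_change = 100 * (mean_after - mean_before) / mean_before

print(f"W={stat:.1f}, p={p_value:.4f}, mean change={pct_change:.0f}%")
```

The signed-rank test suits this design because the ratings are ordinal and each participant supplies their own before and after pair, so a parametric paired t-test would rest on stronger assumptions than the data support.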
Qualitative data were also gathered via the post-course survey, with open and closed questions covering the participant’s experience and learning. A further follow-up survey was sent in 2022 to all those with experience of the formal training programme, regardless of completion status, to evaluate the lasting impact. Semi-structured interviews with those who reported via the follow-up survey that they had put the learning into practice explored course experiences in more detail.
Data sources were mapped to each of the Kirkpatrick evaluation levels of reaction, learning, behaviour and results, as shown in table 2, to ensure a comprehensive evaluation spread. Study findings were categorised and evaluated according to the Kirkpatrick evaluation levels.
Table 2. Study data mapped to Kirkpatrick evaluation model levels.
| Kirkpatrick level | Quantitative | Qualitative |
|---|---|---|
| 1-Reaction | | |
| 2-Learning | | |
| 3-Behaviour | | |
| 4-Results | | |
MOOC, massive open online course.
In addition to assessment against the levels of the Kirkpatrick framework, thematic analysis was used to identify themes related to key learning areas.36 This involved examining the qualitative data, such as interview transcripts and survey responses, to identify recurring patterns. Initial broad themes, or categories, were developed and later refined to form the thematic recommendations in the conclusion, as sketched below.
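As a minimal sketch of the tallying step once manual coding is complete, the excerpt codes below are illustrative only and are not study data; the theme labels are assumptions for the example.

```python
# Hypothetical illustration: tallying manually coded excerpts by theme.
from collections import Counter

# Each tuple: (data source, assigned theme code) - illustrative values only
coded_excerpts = [
    ("interview", "digital capabilities"),
    ("survey", "digital capabilities"),
    ("interview", "assessment and evaluation"),
    ("survey", "content creation"),
    ("interview", "digital capabilities"),
]

# Count how often each theme recurs across the coded data
theme_counts = Counter(theme for _, theme in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```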
Results
The MOOC School programme ran in either classroom or direct-coaching format from 2018 to 2023, with nine cohorts of up to 20 participants each. The findings were organised according to Kirkpatrick’s four levels of training evaluation: reaction, learning, behaviour and results.
Kirkpatrick level 1: reaction
Of the 127 places offered, 110 were accepted and attended one or more sessions, with 68% completing all or most sessions. Those who finished the course were invited to complete a post-course survey.
High satisfaction and engagement
The programme achieved high levels of participant satisfaction, with 97% rating it as good or very good.
93% of respondents indicated they were likely or very likely to recommend the course to others.
In the follow-up survey, the most common reason given for not completing the course was work pressure (66%), and these respondents indicated they would retake the course if the opportunity arose.
Positive feedback on course structure
Participants appreciated the course’s structured approach, particularly the focus on learning design and assessment, which helped in shaping their own work.
Kirkpatrick level 2: learning
Participants were asked in the post course survey to self-assess their knowledge and confidence before and after attending.
Increase in knowledge and confidence
Participants reported an average 55% increase in knowledge and an average 53% increase in confidence post-training.
The Wilcoxon signed-rank test showed the reported change in knowledge and confidence to be statistically significant (p<0.001; figure 1).
Figure 1. Chart of self-reported change in knowledge and confidence before and after the training.
The training effectively equipped participants to start their own digital learning projects, with 73% agreeing or strongly agreeing that the course positively impacted their ability to do so.
Value of understanding learning theory and connectivism principles
Participants found the sessions on learning assessment and connectivism principles particularly beneficial. These sessions moved trainers from being static presenters to dynamic educators.
Practising the exchange of ideas within a diverse group during the training provided valuable insights and inspiration.
Kirkpatrick level 3: behavioural change
Studies examining the pandemic’s impact on future working practices conclude that workers are likely to retain many of the digital practices developed over the last few years.37–40 A follow-up survey was sent in 2022, when the formal training programme ended, to all those who had attended at least one session, to identify behaviour change since participating in the course. In total, 113 applicants were offered places on the formal programme (cohorts 1–7 in table 3); 78 were sent the follow-up survey, the remainder being uncontactable or excluded due to conflict of interest. Thirty responses were received, 77% of them from participants who had completed the programme.
Table 3. Overview of programme cohorts.
| Cohort | Number of applications | Number of places offered | Number of participants | Number of completions | Evaluation responses (post-course) | Response rate (% of completers) |
|---|---|---|---|---|---|---|
| October 2018 (pilot) | 9 | 9 | 8 | 2 | 0 | NA |
| November 2019 | 82 | 13 | 11 | 9 | 8 | 89 |
| January 2020 (covid) | | 11 | 8 | 4 | 1 | 25 |
| May 2020 | 61 | 20 | 18 | 18 | 10 | 56 |
| October 2020 | | 18 | 18 | 15 | 12 | 80 |
| January 2021 | 78 | 22 | 18 | 15 | 9 | 60 |
| July 2021 | | 20 | 15 | 9 | 9 | 100 |
| October 2021 (coaching) | 6 | 4 | 4 | 4 | NA | NA |
| August 2022 (coaching) | 10 | 10 | 10 | 10 | NA | NA |
| Totals | 246 | 127 | 110 | 86 | 49 | 57% |
NA, not available.
Improved digital capabilities
73% of survey respondents and all interviewees reported increased ability and confidence to implement digital learning post-training.
Many participants were able to implement the training to create their own online learning programmes.
Barriers to implementation
Some participants faced barriers such as inadequate resourcing and cultural resistance within their workplaces, highlighting the need for organisational support to fully leverage digital training skills.
Future planning and adoption
Participants expressed intentions to incorporate more interactive and blended approaches, including wrap-around support, in their future training designs, aligning with postpandemic digital work practices.
There was a notable trend towards refining facilitation methods to enhance peer-to-peer interactions and collaborations.
Kirkpatrick level 4: results
Widespread impact and adoption
The programme’s methodology spread successfully beyond individual participants, with participants taking the learning back to their wider teams within the NHS and embedding it in their future delivery, thereby demonstrating significant organisational impact.
Some participants have developed and delivered new learning programmes to thousands of learners, indicating the programme’s scalability and effectiveness.
Creation of new courses
Several online programmes have been developed by MOOC School participants and their teams, demonstrating the long-term benefits of the training, including:
Four new MOOC courses on large-scale change and lean methodology were created within the NHS England Improvement Directorate, reaching over 15 000 participants.
Two new MOOC courses on elective care waiting list management were developed by cohort 9 and have reached over 11 000 participants in the first year and a half.
A new MOOC on disposable glove use ran once and reached over 1200 participants.
A new culture and leadership targeted course reached over 3000 participants.
Thematic analysis
Interviews and survey responses were categorised, and subsequent thematic analysis allowed the identification of overarching themes.36 A number of high-level themes emerged, each with separate subordinate components, as shown in table 4. Theme 1 concerned the digital capabilities of the participants both before and after attending the training and also identified potential barriers to putting the learning into practice. Themes 2 and 3 covered the pedagogical aspects of the training that were most valued. Theme 4 related to the interactive methodology being taught and demonstrated. Finally, theme 5 encompassed content creation options. Together, the five themes span the range of training evaluation elements, from participant reaction to the learning, through the learning that took place, to subsequent behaviour change.
Table 4. Thematic analysis data mapping.
| Theme | Subcategory | Applications | Attendance records | Assessment data | Post-course survey | Study survey | Interviews |
|---|---|---|---|---|---|---|---|
| 1-Digital capabilities | Not knowing where to start | X | X | X | X | X | |
| | Ability to implement | X | X | X | | | |
| | Barriers | X | | | | | |
| 2-Value of learning pedagogy | Background | X | X | X | | | |
| | Design methods | X | X | X | | | |
| | Learner assessment | X | X | X | | | |
| 3-Assessment and evaluation | Background | X | X | X | | | |
| | Design methods | X | X | X | | | |
| | Learner assessment | X | X | X | | | |
| 4-Engaged learning techniques | Experience the learning style | X | | | | | |
| | Put into practice | X | X | X | | | |
| | Future plans | X | | | | | |
| 5-Content creation | Lack of awareness of the variety of options | X | X | | | | |
| | Experience using alternative tools | X | | | | | |
Limitations
The study has several limitations.
Due to the closely supported, coaching nature of the delivery, the programme ran with relatively few participants, which may limit the generalisability of the study findings.
Participants self-selected to apply to the programme, indicating a pre-existing awareness of the need for digital learning skills. While beneficial, this self-selection may introduce bias, as participants were already aware of the need for the skills before undertaking the course, which could influence their perceptions and feedback.
The researcher formed part of the delivery team, so a limitation of the study may be the researcher’s own positivity bias and its possible impact on the responses participants gave. Steps were taken to ensure neutrality,36 including sharing outputs with a neutral team member, not involved in the design or delivery, for independent review and validation. Nonetheless, responses appeared reasonable on their face and aligned with the overall outcomes.
To focus on results and impact, the study only interviewed participants who reported in the survey that they had put the learning into practice. Further study of participants who have been unable to follow their learning with actions could identify and quantify the barriers encountered.
Implications and conclusions
Using the Kirkpatrick framework, the findings indicate that the programme evaluated well at a reaction level, with high satisfaction at level 1. At level 2, self-assessment demonstrated improvement in both knowledge and confidence. At levels 3 and 4, participants reported changes in their work practices and could show examples of new learning products being created as a result of the learning programme. In addition, both before and during the pandemic, there were exceptionally high levels of interest in joining the programme.
The levels of interest and satisfaction demonstrate both the underlying need for, and the usefulness of, training staff in new, innovative ways of delivering training. Although the programme has only run on a small scale to date, this study identified significant volumes of on-the-job training being delivered by staff without a formal educator background, and detected wide variation in their digital and pedagogical competencies. Thematic analysis of the findings, described above, led the researcher to a number of recommended themes for future training programmes of this type:
Theme 1: enhance digital learning capabilities in subject matter experts/educators
The evaluated learning programme sought to improve the digital capabilities of formal or informal educators in the NHS. The study found that before attending training, participants had a low understanding of how to move from traditional face-to-face training to delivering in an online environment. Evidence collected from the surveys and data sources illustrates that attending the training programme enabled participants to embark on projects to create their own learning offers. Training subject matter experts in the skills to move to digital learning delivery helps accelerate trainers’ overall digital capabilities.
Theme 2: adult education pedagogy
For informal subject matter expert trainers, learning about adult teaching theory and pedagogy as part of the curriculum strongly resonated. Teaching design skills covered course characteristics, schemes of work and detailed session plans. The structured approach to learning design and development was consistently cited across surveys and interviews as most helpful in shaping participants’ own work, giving them a clear framework to use. By incorporating these structured methodologies, participants felt more confident in their ability to shape and refine their own instructional practices, ultimately leading to more impactful and learner-centred future training programmes.
Theme 3: assessment and evaluation
Another area reported via the surveys and interviews to be particularly beneficial concerned sessions on learning assessment and evaluation, moving the trainer from a static presenting role to a full teaching role. These sessions focused on how and why a trainer collects evidence of learning through the design, considering the student viewpoint and experience and assessing learning with the ultimate aim of achieving the wider training goal. Teaching trainers about evaluation and assessment equips them with the skills to foster a more engaging, responsive and impactful learning environment, ultimately leading to better educational outcomes for all participants.
Theme 4: interactive learning principles
Participants reported via the surveys and interviews the benefits of learning within a diverse group: practising, exchanging ideas and commenting on each other’s contributions enriched the learning experience and fostered a deeper understanding of the subject matter. Participants also described the value of seeing interactive, engaged learning in action during the training, as it gave them ideas and inspiration. The study findings concur with the literature suggesting that learning that is purposeful, engaged and connectivist in nature can lead to more meaningful and sustained learning outcomes.27 41–44
Theme 5: content creation awareness
The final theme highlighted through interviews and surveys was the variety of content types that can be used to create a varied learning package. Participants reported not being aware of the options available, resulting in a reliance on traditional methods, such as text-based materials, which can limit engagement and effectiveness. By understanding the diverse range of content types available, such as videos, animations, interactive models, infographics, quizzes and generative artificial intelligence (AI), participants can design more engaging learning experiences. This not only enhances the learning but also caters to different learning styles, making the training more inclusive and effective. Content creation awareness is therefore crucial for developing comprehensive and impactful training programmes.
Summary
This study has uniquely evaluated an L&D offer aimed at equipping staff with the skills to create improvement skills training, rather than evaluating the online courses, or MOOCs, themselves. It provides insights to inform the design of digital training skills programmes that equip improvement experts to develop interactive online learning and so improve training outcomes. In summary, the study demonstrates the effectiveness of the evaluated programme in enhancing digital learning creation capabilities, adult education pedagogy, assessment and evaluation skills, connectivism learning principles and content creation awareness among staff creating training. The recommendations provided highlight the impact of innovative training methods on improving digital and pedagogical competencies and are of use to educational trainers and workforce training leads.
Footnotes
Funding: The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Provenance and peer review: Not commissioned; externally peer reviewed.
Patient consent for publication: Not applicable.
Ethics approval: The Health Research Authority online decision tool was used to determine that NHS research ethics was not required for the study as it was considered evaluation of service improvement.
Patient and public involvement: Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.
Data availability statement
All data relevant to the study are included in the article or uploaded as supplementary information.
References
1. NHS England. The NHS long term plan. 2019. http://www.longtermplan.nhs.uk
2. NHS England. Our 2023/24 business plan. 2023. https://www.england.nhs.uk/long-read/our-2023-24-business-plan
3. HEE. Educator workforce strategy. 2023. https://www.england.nhs.uk/long-read/educator-workforce-strategy
4. Sikka R, Morath JM, Leape L. The Quadruple Aim: care, health, cost and meaning in work. BMJ Qual Saf. 2015;24:608–10. doi:10.1136/bmjqs-2015-004160
5. Ham C, Berwick D, Dixon J. Improving quality in the English NHS: a strategy for action. The King’s Fund; 2016. https://www.kingsfund.org.uk/insight-and-analysis/reports/quality-improvement
6. Alderwick H, Charles A, Jones B, et al. Making the case for quality improvement: lessons for NHS boards and leaders. The King’s Fund; 2017. https://www.kingsfund.org.uk/publications/making-case-quality-improvement
7. Sampath B, Rakover J, Baldoza K, et al. Whole system quality: a unified approach to building responsive, resilient health care systems. IHI White Paper. Boston, MA: Institute for Healthcare Improvement; 2021. www.ihi.org
8. NHS England. NHS delivery and continuous improvement review. 2023. https://www.england.nhs.uk/long-read/nhs-delivery-and-continuous-improvement-review-recommendations
9. NHS England. NHS impact. 2023. https://www.england.nhs.uk/nhsimpact
10. Burgess N. Six key lessons from the NHS and Virginia Mason Institute partnership. 2022. https://www.wbs.ac.uk/news/six-key-lessons-from-the-nhs-and-the-virginia-mason-institute-partnership
11. Burgess N, Currie G, Crump B, et al. Leading quality improvement in the NHS: findings of the national evaluation of the NHS-VMI partnership. Warwick, UK: Warwick Business School; 2022. www.warwick.ac.uk
12. Verified Market Report. Global massive open online course (MOOC) platforms market 2019 by company, regions, type and application, forecast to 2024. 2019. https://www.verifiedmarketreports.com/product/global-massive-open-online-course-mooc-platforms-market-2019-by-company-regions-type-and-application-forecast-to-2024/
13. Smith IM. Building lean improvement skills at scale: an evaluation of a massive open online course in the English NHS. BMJ Open Qual. 2023;12:e002357. doi:10.1136/bmjoq-2023-002357
14. Reese D, Dolansky MA, Moore SM, et al. Quality improvement education innovation: evaluation of Coursera MOOC ‘Take the Lead on Healthcare Quality Improvement’. J Res Nurs. 2021;26:62–78. doi:10.1177/1744987120982644
15. Dwyer M, Prior SJ, Van Dam PJ, et al. Development and evaluation of a massive open online course on healthcare redesign: a novel method for engaging healthcare workers in quality improvement. Nurs Rep. 2022;12:850–60. doi:10.3390/nursrep12040082
16. Gleason KT, Commodore-Mensah Y, Wu AW, et al. Massive open online course (MOOC) learning builds capacity and improves competence for patient safety among global learners: a prospective cohort study. Nurse Educ Today. 2021;104:104984. doi:10.1016/j.nedt.2021.104984
17. Guest C, Wainwright P, Herbert M, et al. Driving quality improvement with a massive open online course (MOOC). BMJ Open Qual. 2021;10:e000781. doi:10.1136/bmjoq-2019-000781
18. Smith IM, Bayliss E, Mukoro F. Capability building for large-scale transformational change: learning from an evaluation of a national programme. BMJ Open Qual. 2021;10:e000980. doi:10.1136/bmjoq-2020-000980
19. Núñez-Canal M, de Obesso M de las M, Pérez-Rivero CA. New challenges in higher education: a study of the digital competence of educators in Covid times. Technol Forecast Soc Change. 2022;174:121270. doi:10.1016/j.techfore.2021.121270
20. Matthews B. Digital literacy in UK health education: what can be learnt from international research? Cont Ed Technology. 2021;13:ep317. doi:10.30935/cedtech/11072
21. O’Doherty D, Dromey M, Lougheed J, et al. Barriers and solutions to online learning in medical education: an integrative review. BMC Med Educ. 2018;18:130. doi:10.1186/s12909-018-1240-0
22. Hautz SC, Hoffmann M, Exadaktylos AK, et al. Digital competencies in medical education in Switzerland: an overview of the current situation. GMS J Med Educ. 2020;37:Doc62. doi:10.3205/zma001355
23. NHS Digital. Education and training standards and self-assessment. 2023. https://digital.nhs.uk/services/training-quality-improvement/education-and-training-standards-and-benchmarking
24. Khong ML, Chan E, Tanner JA, et al. COVID-19 – a covert catalyst for pedagogical stocktake and transformation: perspectives of a global hub. MedEdPublish. 2020;9:212. doi:10.15694/mep.2020.000212.1
25. Andrews TC, Speer NM, Shultz GV. Building bridges: a review and synthesis of research on teaching knowledge for undergraduate instruction in science, engineering, and mathematics. IJ STEM Ed. 2022;9. doi:10.1186/s40594-022-00380-w
26. Reisoğlu İ. How does digital competence training affect teachers’ professional development and activities? Technology, Knowledge and Learning. 2021.
27. Downes S. Recent work in connectivism. European Journal of Open, Distance and E-Learning. 2020;22:113–32. doi:10.2478/eurodl-2019-0014
28. Mishra P, Koehler MJ. Technological pedagogical content knowledge: a framework for teacher knowledge. Teachers College Record. 2006;108:1017–54. doi:10.1111/j.1467-9620.2006.00684.x
29. Karger T, Kalenda J, Vaculíková J, et al. Online learning platforms and resources in adult education and training: new findings from four European countries. International Journal of Lifelong Education. 2024;43:417–31. doi:10.1080/02601370.2024.2358896
30. Bryant J, Child F, Dorn E, et al. New global data reveal education technology’s impact on learning. McKinsey & Company; 2020. https://www.mckinsey.com/industries/education/our-insights/new-global-data-reveal-education-technologys-impact-on-learning
31. Yale Poorvu Center for Teaching and Learning. CIPP model. 2021. https://poorvucenter.yale.edu/CIPP
32. Deller J. Brinkerhoff model 101: methodology and goals. Kodo Survey; 2019. https://kodosurvey.com/blog/brinkerhoff-model-101-methodology-and-goals
33. Liu S, Zu Y. Evaluation models in curriculum and educational program: a document analysis research. J TECHNOL HUM. 2024:32–8. doi:10.53797/jthkkss.v5i1.4.2024
34. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. 3rd edn. San Francisco, CA: Berrett-Koehler; 2006.
35. Smith-Lickess SK, Woodhead T, Burhouse A, et al. Study design and protocol for a comprehensive evaluation of a UK massive open online course (MOOC) on quality improvement in healthcare. BMJ Open. 2019;9:e031973. doi:10.1136/bmjopen-2019-031973
36. Merriam SB, Tisdell EJ. Qualitative Research: A Guide to Design and Implementation. San Francisco, CA: Jossey-Bass; 2015.
37. Lund S, Madgavkar A, Manyika J, et al. The future of work after COVID-19. McKinsey Global Institute; 2021. https://www.mckinsey.com/featured-insights/future-of-work/the-future-of-work-after-covid-19
38. Aarts E, Fleuren H, Sitskoorn M, Wilthagen T, eds. The New Common. Cham: Springer International Publishing; 2021. https://link.springer.com/book/10.1007%2F978-3-030-65355-2
39. Teevan J, Hecht B, Jaffe S, et al. The new future of work: research from Microsoft into the pandemic’s impact on work practices. Microsoft Corporation; 2021. https://www.microsoft.com/en-us/research/publication/the-new-future-of-work-research-from-microsoft-into-the-pandemics-impact-on-work-practices/
40. Vyas L. “New normal” at work in a post-COVID world: work–life balance and labor markets. Policy and Society. 2022;41:155–67. doi:10.1093/polsoc/puab011
41. Liu D, Carter L, Lin J. Towards connectivism: exploring student use of online learning management systems during the Covid-19 pandemic. OLJ. 2024;28:1–25. doi:10.24059/olj.v28i2.4047
42. Yannier N, Hudson SE, Koedinger KR, et al. Active learning: “hands-on” meets “minds-on”. Science. 2021;374:26–30. doi:10.1126/science.abj9957
43. Abik M, Ajhoun R. Impact of technological advancement on pedagogy. Turkish Online Journal of Distance Education. 2012;13:11–21.
44. Whitton N. Game engagement theory and adult learning. Simulation & Gaming. 2011;42:596–609. doi:10.1177/1046878110378587