BMJ Open Quality. 2025 Aug 18;14(3):e003224. doi: 10.1136/bmjoq-2024-003224

Transforming improvement training at scale with essential digital training skills

Margaret Herbert 1, Iain M Smith 2,3, Cheryl Guest 4
PMCID: PMC12366557  PMID: 40829885

Abstract

Background

Healthcare systems worldwide are facing growing pressures from rising costs and ageing populations. System-wide improvement is needed to help address these pressures, which in turn requires training staff in improvement skills at scale.

An established method of training at scale is digitally delivered training, including Massive Open Online Courses (MOOCs). Within the National Health Service in England, wide-scale variation exists in digital education and training standards. This study evaluates an education programme, known as MOOC School, that sought to address educational skills shortfalls by training subject matter experts and trainers in interactive, online learning techniques.

Methods

This evaluation assessed the MOOC School training programme’s impact on participants’ ability to design, develop and deliver online learning. A mixed-methods approach was used, with data collected from existing application and attendance records, surveys and interviews with volunteers who self-identified as having put the learning into practice. The study aimed to identify key success factors of a health educator digital upskilling programme.

Results

The MOOC School programme ran seven formal cohorts over 3 years with 96 participants, followed by two informal, coaching-style cohorts with 14 participants the following year. The programme was well received, with 97% of respondents rating the course as good or very good. MOOC School helped participants to reach over 30 000 enrolments through courses they created with their teams after undertaking the training, filling a significant gap. Participants reported gaining important skills and insight into what is possible when delivering training in new ways.

Conclusions

The experience and plans of the participants support the need for more creative training practices and digitally literate health educators to deliver the training that is required. The findings of the evaluation highlight a way forward in defining the essential skills and knowledge needed to create high-quality digital learning at scale.

Keywords: Continuous quality improvement, Quality improvement, Healthcare quality improvement, Quality improvement methodologies, Education


WHAT IS ALREADY KNOWN ON THIS TOPIC

  • The need to skill educators in digital learning methodologies and practices is driven by the rapid advances in technology and practice. Collaborative digital learning is a successful medium for delivering skills and knowledge, such as improvement and transformation.

WHAT THIS STUDY ADDS

  • An evaluation of a digital learning skills train-the-trainer programme for improvement educators. Organised by levels of the Kirkpatrick learning evaluation framework, the evaluation highlights essential criteria and success factors.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

  • The study recommends vital elements for inclusion in a successful train-the-trainer digital learning skills programme.

Introduction

Globally, health services are facing increasingly complex challenges. The origins of these challenges include a growing and ageing population, medical advances,1 2 patient backlogs caused by the COVID-19 pandemic and workforce shortages.3 To address these challenges, a quadruple aim framework for care has been established and promoted internationally to guide the redesign and transformation of healthcare services.4 The quadruple aims prioritise improving the quality of patient care, increasing population health, reducing per-capita cost and improving staff experience. Developing a culture of system-wide quality improvement (QI) is required to deliver the changes needed at scale and pace.5 6 Systematic methods of improvement have been promoted to health systems internationally to support progress with the goals of the quadruple aim.7 Building improvement capability is considered a key component of transforming healthcare.7–11

Technology-enhanced learning, encompassing several variations of online learning including Massive Open Online Courses (MOOCs), is considered essential to building improvement capability. Representing a growing market for wide-reaching capability development,12 MOOCs aim to disseminate knowledge and provide education online to an unlimited number of participants, using interactive online platforms to expand QI capability training across vast areas and nations. Providing learning solutions online offers equity and ease of access for those in remote or rural areas and those working flexibly. Online learning can meet training needs at scale, saving time and costs compared with in-person training events,13 as well as improving the participant experience.3

MOOCs are suggested as adaptable and cost-effective ways to educate health and care personnel in QI13 and have been used to teach QI across the globe.14–18 MOOCs have been used and evaluated successfully for several years within the National Health Service (NHS) as a method of delivering improvement training at scale.17 18

The digital competence of educators has been shown to be a leading factor in successful digital education.19–22 To help organisations self-assess and benchmark their current ability to design, develop and evaluate training, NHS England has published a suite of education and training skills standards.23 It is unclear to what extent these skills are in place among informal educators within the NHS, but studies have identified the value of training health educators in adult learning theory, pedagogy and technical skills, rather than assuming that experts in their field already possess them.24 25 Fortunately, the digital competence of trainers can be effectively improved through skills-based training.26 To address this skills gap, a training programme to increase interactive, online learning skills among NHS staff, entitled ‘MOOC School’, was developed.

This study evaluates the MOOC School training programme, assessing its effectiveness in building skills and knowledge in best practice online learning creation. The findings of the evaluation highlight a way forward in defining the essential skills and knowledge needed to create high-quality digital learning at scale and so further spread essential improvement skills. Overall, the study assesses a contemporary teaching approach that is integral to delivering extensive distance learning. Consequently, it could be highly beneficial to educators on a global scale, especially in the QI field.

MOOC School overview

MOOC School was an NHS training programme developed to build best practice technology enhanced learning skills in both formal and informal improvement educators across the wider health and care system. The programme aimed to equip participants with the knowledge and skills to create interactive MOOC style learning, based on connectivism theory. Connectivism emphasises the role of connecting and networking in the online learning process.27 It focuses on engaging the learner in activities and encouraging collaboration, sharing and debate with other participants to widen the learning experience. The curriculum covered essential skills for teaching and learning design—an overview of the sessions is provided in table 1. Supplementing trainers’ content knowledge with an understanding of pedagogy and technology is believed to provide the best quality digital training.28

Table 1. MOOC school curriculum of sessions.

Week 1, Module 1: Induction
Week 1, Module 2: Teaching and learning skills for designing and planning
Week 2, Module 3: Designing assessment and certification of learning
Week 2, Module 4: Design features and developing materials
Week 3, Module 5: Design and collect evidence of evaluation and improvements
Week 3, Module 6: Social network and facilitation skills
Week 4, Module 7: Assessment planning, resources and going forward
Week 4, Module 8 (optional): Using the QI Learning Platform

MOOC, Massive Open Online Course.

MOOC School was initially delivered as a 4-week programme of 3-hour synchronous webinars delivered via Webex or MS Teams to small cohorts of 10–20 participants. The live webinars contained a mix of knowledge transfer sessions and participant practical collaboration exercises to practise skills. The live sessions were supported by homework and additional resources on a closed section of the internal QI learning platform (a social, learner experience platform). At the end of the programme, participants designed, developed and delivered a short online learning piece for assessment.

Later cohorts delivered the same learning via small, personal coaching sessions mixed with assisted development of live learning packages by the participants.

Methods

Education is moving increasingly into the digital arena29 due to advances in technology and the digital shift created by the 2020 pandemic and lockdowns.30 Consequently, health organisations are under pressure to upskill their educators in digital practices. MOOCs are seen as an accepted method of delivering training to large audiences due to their accessibility.

The aim of this evaluation was to determine the key success factors required to upskill educators in digital learning practices. Mixed evaluation methods were applied to gain insights from both qualitative and quantitative sources. Attendance data and pre and post course surveys provided a statistical base, with qualitative data from surveys and interviews enhancing understanding and supporting conclusions.

Approach

Several models are available for evaluating learning. Popular models include Context, Input, Process, Product (CIPP), Brinkerhoff and Kirkpatrick. The CIPP model provides a systematic framework for evaluating educational programmes by examining their context, input, process and product.31 The CIPP model works best when implemented throughout the training delivery process. The Brinkerhoff Success Case Method32 focuses on identifying and analysing the most and least successful cases of learning transfer and impact from a training programme, to identify what works and what does not. While this model is useful for highlighting areas for improvement, it is less useful for understanding long-term impact and examining a programme across the full spectrum of learning levels.32 Among the most used evaluation models, Kirkpatrick’s Model is highly popular and particularly effective for evaluating online learning.33 34 The Kirkpatrick framework has been a staple of education evaluation since the 1950s. Kirkpatrick defines four levels of evaluation: initial reaction to and satisfaction with the learning delivered; identifying what was learnt; determining how this changed student behaviour; and considering the wider results or benefits to the individual or organisation. The Kirkpatrick model is flexible and adaptable, making it easy to implement across different environments and fields, and has been recommended for evaluating MOOCs.35 The Kirkpatrick model has been widely used in previous evaluations, has been demonstrated to be effective at evaluating MOOCs with an improvement theme13 17 18 and was therefore selected for this evaluation.

A mix of methods and data sets was used to evaluate the effectiveness and impact of the MOOC School internal training programme, including existing data sets, surveys and semi-structured qualitative interviews.

The MOOC School programme ran from 2018 to 2023 with nine cohorts of up to 20 participants each. Participants in the training programme initially self-selected to apply and completed an application statement. Potential candidates were asked to detail their experience, their reason for applying and how the learning would be put into practice. From over 200 applications, 127 applicants were offered a place, including 14 selected from internal teams for personal coaching; these 127 participants form the focus of this evaluation. Successful participants came from a range of backgrounds, NHS and non-NHS, clinical and non-clinical, with and without a learning and development (L&D) background. Due to the relatively small stakeholder population, this study considered all successful applicants to the MOOC School programme, regardless of attendance record or completion status.

Data collection and analysis

A variety of new and existing data was collected; this formed the background analysis of participants in the programme and contributed to the overall understanding and triangulation of results.36 Quantitative analysis of the existing data sets provided further understanding of participants’ backgrounds and prior experience.

Qualitative data were collected via existing application forms and attendance data to form a demographic profile of the participants and their prior experience. A post course survey was completed at the end of each course by those who finished it. The post course survey asked participants to rate their knowledge and confidence, both as it had been before the course and as it was after the course. This produced ‘paired samples’ of before and after knowledge and confidence data, which were analysed using the Wilcoxon signed-rank test to measure the learning achieved. The Wilcoxon signed-rank test is used to test for differences between paired samples of non-parametric, ordinal data and has been used to analyse knowledge and confidence change in improvement-themed MOOCs.13 18

In this case, participants rated their knowledge and confidence in a number of areas on a 5-point, Likert-style scale from ‘very low’ to ‘very high’. The survey also requested a 5-point Likert rating of statements regarding the effectiveness of the training, from ‘strongly disagree’ to ‘strongly agree’.
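As an illustration of this analysis step, the minimal sketch below (in Python, using SciPy) applies the Wilcoxon signed-rank test to hypothetical paired before-and-after Likert ratings; it is not the study’s actual code or data.

```python
# Minimal sketch (hypothetical data, not the study's own analysis):
# testing paired before/after Likert ratings with the Wilcoxon signed-rank test.
from scipy.stats import wilcoxon

# Hypothetical 5-point self-ratings (1 = 'very low' ... 5 = 'very high')
# for the same participants before and after the course.
before = [2, 1, 3, 2, 2, 3, 1, 2, 2, 3]
after  = [4, 3, 4, 4, 3, 5, 3, 4, 4, 4]

# Non-parametric test on the paired differences, suitable for ordinal data;
# zero differences are handled by SciPy's default zero_method.
stat, p_value = wilcoxon(before, after)
print(f"W = {stat}, p = {p_value:.4f}")
```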

Qualitative data were also gathered during the post course survey with open and closed questions covering the participant’s experience and learning. A further follow-up survey was sent in 2022 to all those with experience of the formal training programme, regardless of completion status, to evaluate the lasting impact. Semi-structured interviews with those who reported via the follow-up survey that they had put the learning into practice explored course experiences in more detail.

Data sources were mapped to each of the Kirkpatrick evaluation levels of reaction, learning, behaviour and results, as shown in table 2, to ensure a comprehensive evaluation spread. Study findings were categorised and evaluated according to the Kirkpatrick evaluation levels.

Table 2. Study data mapped to Kirkpatrick evaluation model levels.

1-Reaction
  • Quantitative: application rates and attendance; participants’ post course ranking of the MOOC; participants’ post course likelihood to recommend to colleagues; participants’ session interaction
  • Qualitative: participants’ post course survey and comments; follow-up survey responses

2-Learning
  • Quantitative: self-assessment of knowledge before and after participation; ongoing assessment of participants during participation
  • Qualitative: student post course survey comments; follow-up survey responses

3-Behaviour
  • Quantitative: self-assessment of confidence before and after participation; summative student micro-teach assessments
  • Qualitative: pre and post course survey comments; follow-up interviews and survey responses

4-Results
  • Quantitative: micro-teach assessment results; follow-up interviews and survey responses
  • Qualitative: micro-teach assessment results; follow-up interviews and survey responses

MOOC, Massive Open Online Course.

In addition to assessment against the levels of the Kirkpatrick framework, thematic analysis of the qualitative data, such as interview transcripts and survey responses, was used to identify recurring patterns or themes related to key learning areas.36 Initial broad themes, or categories, were developed and later refined to form the thematic recommendations in the conclusion.

Results

The MOOC School programme ran from 2018 to 2023, delivered either as classroom-style webinars or direct coaching, with nine cohorts of up to 20 participants each. The findings were organised according to Kirkpatrick’s four levels of training evaluation: reaction, learning, behaviour and results.

Kirkpatrick level 1: reaction

Of the 127 places offered, 110 were accepted and attended one or more sessions, with 68% completing all or most sessions. Those who finished the course were invited to complete a post course survey.

High satisfaction and engagement

  • The programme achieved high levels of participant satisfaction, with 97% rating it as good or very good.

  • 93% of respondents indicated they were likely or very likely to recommend the course to others.

  • In the follow-up survey, the most common reason participants gave for not completing the course was work pressure (66%); these respondents indicated they would retake the course if the opportunity arose.

Positive feedback on course structure

Participants appreciated the course’s structured approach, particularly the focus on learning design and assessment, which helped in shaping their own work.

Kirkpatrick level 2: learning

Participants were asked in the post course survey to self-assess their knowledge and confidence before and after attending.

Increase in knowledge and confidence

  • Participants reported an average 55% increase in knowledge and an average 53% increase in confidence post-training.

The Wilcoxon signed-rank test showed the reported changes in knowledge and confidence to be statistically significant (p<0.001; figure 1).

Figure 1. Chart of self-reported change in knowledge and confidence before and after the training.


  • The training effectively equipped participants to start their own digital learning projects, with 73% agreeing or strongly agreeing that the course positively impacted their ability to do so.

Value of understanding learning theory and connectivism principles

  • Participants found the sessions on learning assessment and connectivism principles particularly beneficial. These sessions moved trainers from static presenters to dynamic educators.

  • Practising the exchange of ideas within a diverse group during the training provided valuable insights and inspiration.

Kirkpatrick level 3: behavioural change

Studies examining the pandemic’s impact on future working practices conclude that workers are likely to retain many of the digital practices developed over the last few years.37–40 A follow-up survey was sent in 2022, when the formal training programme ended, to all those who had attended at least one session, to identify behaviour change since participating in the course. In total, 113 applicants were offered places on the formal programme (cohorts 1–7 in table 3). Of these, 78 were sent the follow-up survey, the remainder being uncontactable or excluded due to conflict of interest. Thirty responses were received, 77% of them from participants who had completed the programme.

Table 3. Overview of programme cohorts.

Cohort | Applications | Places offered | Participants | Completions | Post course evaluation responses | Response rate (as % of completers)
October 2018 (pilot) | 9 | 9 | 8 | 2 | 0 | NA
November 2019 | 82 | 13 | 11 | 9 | 8 | 89
January 2020 (covid) | – | 11 | 8 | 4 | 1 | 25
May 2020 | 61 | 20 | 18 | 18 | 10 | 56
October 2020 | – | 18 | 18 | 15 | 12 | 80
January 2021 | 78 | 22 | 18 | 15 | 9 | 60
July 2021 | – | 20 | 15 | 9 | 9 | 100
October 2021 (coaching) | 6 | 4 | 4 | 4 | NA | NA
August 2022 (coaching) | 10 | 10 | 10 | 10 | NA | NA
Totals | 246 | 127 | 110 | 86 | 49 | 57%

NA, not available.

Improved digital capabilities

  • 73% of survey respondents and all interviewees reported increased ability and confidence to implement digital learning post-training.

  • Many participants were able to implement the training to create their own online learning programmes.

Barriers to implementation

  • Some participants faced barriers such as inadequate resourcing and cultural resistance within their workplaces, highlighting the need for organisational support to fully leverage digital training skills.

Future planning and adoption

  • Participants expressed intentions to incorporate more interactive and blended approaches, including wrap-around support, in their future training designs, aligning with postpandemic digital work practices.

  • There was a notable trend towards refining facilitation methods to enhance peer-to-peer interactions and collaborations.

Kirkpatrick level 4: results

Widespread impact and adoption

  • The programme’s methodology spread successfully beyond individual participants, with participants taking the learning back to their wider teams within the NHS and embedding the methodology in their future delivery model, demonstrating significant organisational impact.

  • Some participants have developed and delivered new learning programmes to thousands of learners, indicating the programme’s scalability and effectiveness.

Creation of new courses

Several online programmes have been developed by MOOC School participants and their teams, demonstrating the long-term benefits of the training, including:

  • Four new MOOC courses on large-scale change and lean methodology were created within the NHS England Improvement Directorate, reaching over 15 000 participants.

  • Two new MOOC courses on elective care waiting list management were developed by cohort nine and have reached over 11 000 participants in the first year and a half.

  • A new MOOC on disposable glove use ran once and reached over 1200 participants.

  • A new culture and leadership targeted course reached over 3000 participants.

Thematic analysis

Interviews and survey responses were categorised, and subsequent thematic analysis allowed the identification of overarching themes.36 A number of high-level themes emerged, each with separate subordinate components, as shown in table 4. Theme 1 concerned the digital capabilities of the participants both before and after attending the training and also identified potential barriers to putting the learning into practice. Themes 2 and 3 covered the pedagogical aspects of the training that were most valued. Theme 4 related to the interactive methodology being taught and demonstrated. Finally, the fifth theme encompassed content creation options. The five themes identified span the range of training evaluation elements, from participant reaction to the learning, through the learning that took place, to subsequent behaviour change.

Table 4. Thematic analysis data mapping.

Theme Subcategory Applications Attendance records Assessment data Post course survey Study survey Interviews
1-Digital capabilities Not knowing where to start X X X X X
Ability to implement X X X
Barriers X
2-Value of learning pedagogy Background X X X
Design methods X X X
Learner assessment X X X
3-Assessment and evaluation Background X X X
Design methods X X X
Learner assessment X X X
4-Engaged learning techniques Experience the learning style X
Put into practice X X X
Future Plans X
5-Content creation Lack of awareness of the variety of options X X
Experience using alternative tools X

Limitations

The study has several limitations.

Due to the closely supported, coaching nature of the delivery, the programme ran with relatively few participants, which may limit the generalisability of the study findings.

Participants self-selected to apply to the programme, indicating an existing awareness of the need for digital learning skills. This pre-existing awareness is beneficial, although it may introduce some bias: participants were already aware of the need for these skills before undertaking the course, and this could influence their perceptions and feedback.

The researcher formed part of the delivery team, so a limitation of the study may be the researcher’s own positivity bias and its possible impact on the responses participants gave. Steps were taken to ensure neutrality,36 including sharing outputs with a neutral team member, not involved in the design or delivery, for independent review and validation. Nonetheless, prima facie, responses seemed reasonable and aligned with the overall outcomes.

To focus on results and impact, the study only interviewed participants who reported in the survey that they had put the learning into practice. Further study of participants who have been unable to follow their learning with actions could identify and quantify the barriers encountered.

Implications and conclusions

Using the Kirkpatrick framework, the findings indicate that the programme evaluated well at a reaction level, with high satisfaction at level 1. At level 2, self-assessment demonstrated improvement in both knowledge and confidence. At levels 3 and 4, participants reported changes in their work practices and could show examples of new learning products being created as a result of the learning programme. In addition, both before and during the pandemic, there were exceptionally high levels of interest in joining the programme.

The levels of interest and satisfaction demonstrate both the underlying need for, and the usefulness of, training staff in new, innovative ways of delivering training. Although this programme has only run on a small scale to date, the study identified significant volumes of on-the-job training being delivered by staff who do not have a formal educator background, and it detected wide variation in digital and pedagogical competencies. Thematic analysis of the findings, described above, leads the researcher to recommend a number of themes for future training programmes of this type:

Theme 1: enhance digital learning capabilities in subject matter experts/educators

The evaluated learning programme sought to improve the digital capabilities of formal or informal educators in the NHS. The study found that before attending training, participants had a low understanding of how to move from traditional face-to-face training to delivering in an online environment. Evidence collected from the surveys and data sources illustrates that attending the training programme enabled participants to embark on projects to create their own learning offers. Training subject matter experts in the skills to move to digital learning delivery helps accelerate trainers’ overall digital capabilities.

Theme 2: adult education pedagogy

For informal subject matter expert trainers, learning about adult teaching theory and pedagogy as part of the curriculum resonated strongly with participants. Teaching design skills covered course characteristics, schemes of work and detailed session plans. The structured approach to learning design and development was consistently cited across surveys and interviews as the most helpful element in shaping participants’ own work, giving them a clear framework to use. By incorporating these structured methodologies, participants felt more confident in their ability to shape and refine their own instructional practices, ultimately leading to more impactful and learner-centred future training programmes.

Theme 3: assessment and evaluation

Another area reported via the surveys and interviews to be particularly beneficial concerned sessions on learning assessment and evaluation, moving the trainer from a static presenting role to a full teaching role. These sessions focused on how and why a trainer collects evidence of learning through the design, considering the student viewpoint and experience and assessing learning with the ultimate aim of achieving the wider training goal. Teaching trainers about evaluation and assessment equips them with the skills to foster a more engaging, responsive and impactful learning environment, ultimately leading to better educational outcomes for all participants.

Theme 4: interactive learning principles

Participants reported via the surveys and interviews the benefits of learning within a diverse group: practising, exchanging ideas and commenting on each other’s contributions enriched the learning experience and fostered a deeper understanding of the subject matter. Participants also described the value of seeing interactive, engaged learning in action during the training, as it gave them ideas and inspiration. The study findings concur with the literature that learning which is purposeful, engaged and connectivist in nature can lead to more meaningful and sustained learning outcomes.27 41–44

Theme 5: content creation awareness

The final theme highlighted through interviews and surveys was the variety of different types of content that could be used to create a varied learning package. Participants reported not being aware of the options that were possible, resulting in a reliance on traditional methods, such as text-based materials, which can limit engagement and effectiveness. By understanding the diverse range of content types available, such as videos, animations, interactive models, infographics, quizzes and generative AI (Artificial Intelligence), participants can design more engaging learning experiences. This not only enhances the learning but also caters to different learning styles, making the training more inclusive and effective. Therefore, content creation awareness is crucial for developing comprehensive and impactful training programmes.

Summary

This study has uniquely evaluated an L&D offer aimed at equipping staff with the skills to create improvement skills training, rather than evaluating the online courses, or MOOCs, themselves. It provides insights to help design digital training skills programmes that train improvement experts to develop interactive online learning and so improve training outcomes. In summary, this study demonstrates the effectiveness of the evaluated programme in enhancing digital learning creation capabilities, adult education pedagogy, assessment and evaluation skills, connectivist learning principles and content creation awareness among staff who create training. The recommendations provided highlight the impact of innovative training methods on improving digital and pedagogical competencies and will be of use to educational trainers and workforce training leads.

Supplementary material

online supplemental file 1
bmjoq-14-3-s001.docx (18.4KB, docx)
DOI: 10.1136/bmjoq-2024-003224

Footnotes

Funding: The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Provenance and peer review: Not commissioned; externally peer reviewed.

Patient consent for publication: Not applicable.

Ethics approval: The Health Research Authority online decision tool was used to determine that NHS research ethics was not required for the study as it was considered evaluation of service improvement.

Patient and public involvement: Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

References
