BMC Health Services Research. 2026 Jan 24;26:269. doi: 10.1186/s12913-025-13923-y

The Monash learning health system maturity matrix: codesign of a tool to measure and guide improvement in complex health system behaviour

Darren Rajit 1, Alison Johnson 1, Sandy Reeder 1, Dominique Cadilhac 3,4,5, Joanne Enticott 1,✉,#, Helena Teede 1,2,#
PMCID: PMC12911384  PMID: 41580818

Abstract

Background

Learning Health Systems (LHS) have proven efficacy in catalysing healthcare improvement, but adoption and scale-up remain challenging due to limited implementation guidance and evaluation tools. To address this gap, guide LHS implementation, and measure alignment with LHS principles, we aimed to codesign, iteratively refine, and apply an LHS Maturity Matrix (LHS-MM) based on the Monash LHS framework.

Methods

In this mixed methods study, the double diamond design and innovation model (discover, define, develop, deliver) was applied in the codesign of the LHS-MM. Insights and tools uncovered from a scoping review and the Monash LHS Framework were leveraged to develop an initial version of the LHS-MM. This was then refined through codesign with subject matter experts (n = 18) and potential users of the LHS-MM (n = 11), followed by triangulating insights from evidence-based implementation frameworks: the Consolidated Framework for Implementation Research (CFIR) and the Reach, Effectiveness, Adoption, Implementation and Maintenance (RE-AIM) framework. The LHS-MM was then evaluated in a test case for stroke.

Results

Tools uncovered in the discover and define stage from the scoping review included the Cincinnati Network Maturity Grid, which we then adapted to the Monash LHS framework to produce the initial draft. Codesign elevated the tool to focus on assessing complex systems behaviours related to LHS principles, with significant changes to assessment criteria, rating scale wording and scenarios for use. The LHS-MM assesses system-level LHS behaviours across eight components on a numerical, five-point scale (1–5), which can be visualised as a radar chart. Components include stakeholder engagement, priority identification, evidence-based information, evidence synthesis and guidelines, data systems, benchmarking, implementation, and healthcare improvement. The Australian Stroke LHS test case revealed ratings from 4/5 (Established) to 5/5 (Transformative), and informed system level opportunities for improvement.

Conclusion

Through an evidence-informed, iterative co-design process, we have created the Monash LHS-MM. Our test case example utilising the Monash LHS-MM illustrates opportunities for improving LHS fidelity and implementation, with further research needed beyond the Australian healthcare system.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12913-025-13923-y.

Keywords: Learning health systems, Evaluation, Implementation science, Healthcare improvement, Implementation fidelity, Maturity matrix

Introduction

Effective, sustained improvement in complex health systems has been elusive [1, 2]. Theory driven implementation science frameworks such as the Consolidated Framework for Implementation Research (CFIR) [3] are often used as static determinant frameworks, with limitations in operationalising delivery of improvement or catalysing iterative learning in complex systems over time [4, 5]. The Learning Health System (LHS) [6] has emerged in this context as a systems change framework that is complementary to the CFIR [7], positing that complex systems change requires aligning people, data and culture [8–11]. LHS approaches thus seek to close the evidence to practice gap by moving beyond the ‘what’ to a focus on the ‘how’: that is, how an improvement is contextualised, implemented, assessed, and sustained for impact [12] as a process model for implementation. Despite increasing interest, LHS research has been mainly theoretical, with relatively few high-quality empirical implementation studies, as captured in our recent scoping review [13]. Additionally, LHS research often lacks codesign [11], with limited involvement of stakeholders or end-users in prioritising problems and designing solutions [14], and most studies are academically focused on technological [15] or data [16] aspects of the LHS, with limited investigation of complex system, or socio-technical, factors [13]. Without consideration of such factors, there is a risk that a LHS may become overly defined by its constituent technology or discrete components, rather than a pragmatic approach towards healthcare system design delivered through whole-of-system commitment towards stakeholder centred healthcare improvement.

The evidence-based Monash LHS framework underpinning this work was developed through codesign with health system stakeholders and community end-users (Fig. 1) to address these limitations. Its development involved engaging with national and international experts and a systematic literature review analysing successful LHS case studies [11], followed by stakeholder engagement [17] and codesign [10]. As a result, the Monash LHS framework embeds theories of change for implementation. Firstly, we recognise health systems as complex adaptive systems, and position iterative healthcare improvement as an emergent system level behaviour that arises from the accumulation of complex non-linear processes and interactions [18]. Secondly, we posit that these underlying interactions can be grouped into four recognised evidence domains, with eight LHS components (Table 1). Each LHS component represents broad programs of work that provide the essential contributions to realising the full potential of a LHS. Specifically, this involves (i) robust stakeholder engagement and (ii) priority setting within the Stakeholder evidence domain, (iii) priority driven research generation and (iv) evidence synthesis within the Research evidence domain, (v) robust data collection systems that then facilitate (vi) benchmarking within the Data evidence domain, and (vii) theory driven implementation and (viii) evaluation within the Implementation evidence domain. Thirdly, we further posit that once these eight bodies of work are operationalised, aligned, and integrated through context specific processes across individual, organisation and system settings, iterative healthcare improvement arises as an emergent outcome (Table 1).

Fig. 1.

Fig. 1

The evidence-based Monash learning health system (LHS) framework [10]. LHS Domains: Evidence from Stakeholders (orange), Evidence from Research (green), Evidence from Practice and Data (light blue), and Evidence from Implementation (dark blue). LHS Components are numbered

Table 1.

Examples of underlying processes or tools that are integrated and implemented in a learning health system to drive direct system level behavioural outcomes. Collectively, these direct behavioural outcomes lead to the higher level, indirect emergent behaviour of cyclical healthcare improvement

LHS Evidence Domain LHS Components Processes and Tools (Examples for Illustrative Purposes) Direct System Level Behavioural Outcomes Indirect System Level Behavioural Outcome
Stakeholders Stakeholder engagement Evidence-based approaches to identify and map stakeholders [3], and co-design interventions [19], which are routinely employed Stakeholder groups (including consumers*) are mapped and empowered in the co-design or co-leading of initiatives Cyclical Healthcare Improvement
Priority setting Use of robust methods for Priority Setting and consensus building [20] LHS priorities are codesigned and agreed with stakeholders, ranked with formal consensus methods, inform downstream LHS activities, and are included in evaluation and outcome metrics
Research Evidence Based Information Use of approaches [21, 22] to access and integrate research evidence, and generate new research to address gaps, aligned with stakeholder priority Information from research evidence is routinely accessed and generated in a structured and systematic manner aligned with stakeholder priority areas
Evidence Synthesis Synthesis of new evidence as it occurs through the production of systematic reviews, guidelines, living evidence [23, 24] Synthesised research evidence is accessed and supported with robust systems for integration. Ongoing evidence synthesis is also fully informed by stakeholder priority with iterative processes in place for synthesis of new evidence as it is produced.
Data / Practice Data Information Systems Prioritised outcomes such as Patient Reported Outcome Measures (PROMs) and Patient Reported Experience Measures (PREMs) and other key outcome measures are collected, data flow systems and infrastructure are integrated, and data is accessible for analysis in priority areas [25] Relevant (as defined by stakeholder priority) data sources including PROMs and PREMs, and external data such as population-based data, are accessible across all sites, organisations and services.
Benchmarking Data driven dashboards and reports are produced and updated in a timely way [26] Data is analysed, benchmarked, and transparently reported to relevant stakeholders in ongoing cycles, and explicitly linked to stakeholder priority
Implementation Implementation Implementation frameworks [3] and evidence-informed strategies [27] are applied to inform implementation Proposed innovations or interventions have an associated implementation strategy, are aligned with stakeholder priority and have been adapted to local context with co-design. Theory driven approaches are applied during implementation, with consideration of organisational and system barriers, clear processes demarcating roles, responsibilities and project milestones, and with process outcomes being measured at the individual, local and systems level, and also being captured, reported upon, and sustained.
Healthcare Improvement Implementation efforts are scaled-up, and evidence based frameworks are routinely used to evaluate implementation outcomes [28] and inform improvement activity [29] Widespread evidence of ongoing implementation of innovations in healthcare with systematic, regular and evidence-based evaluation of reach, efficacy in health and economic outcomes, adoption, and maintenance in routine healthcare, with implementation and evaluation aligned to stakeholder priority, and evaluation driving improvement activity.

The Monash LHS framework has been applied in Australia by government, health services and nationally funded implementation research and health system partnerships at urban (MRFF2023389) [30], regional (RARUR000072) [31] and national levels [8]. However, ongoing stakeholder engagement with parties actively implementing the Monash LHS Framework (RARUR000072) highlighted limited guidance to monitor fidelity, benchmark and guide implementation [31]. Without such guidance, it was not possible to monitor the ongoing implementation process of the Monash LHS framework over time, particularly in reporting against grant objectives. This also meant that the embedded theory of change within the Monash LHS framework could not be appropriately evaluated. Therefore, in response to stakeholder need, we aimed to engage stakeholders to iteratively codesign an LHS Maturity Matrix (LHS-MM) aligned with the Monash LHS framework based on the Australian context. The matrix was designed to be integrated within a digital implementation toolkit [32], to (i) enable systematic self-assessment of implementation fidelity of the Monash LHS framework and (ii) guide and optimise ongoing implementation across diverse contexts or settings.

Method

This mixed methods study was underpinned by the validated codesign and innovation Double Diamond method [33], composed of four iteratively applied phases: Discover (scoping review, expert and user group codesign), Define (expert and user group codesign), Develop (integration of implementation science frameworks), and Deliver (small-scale testing and refinement in the Australian Stroke LHS [34, 35]).

The work was led by our interdisciplinary team including a researcher-in-residence biomedical engineer (DR), clinicians (HT, DC, AJ) and implementation and LHS experts (HT, AJ, DC, SR, and JE). Ethics approval (ID:19969), reporting, and design of this study align with the Standards for Reporting Qualitative Research (SRQR) Checklist [36] (S6).

Discover, & define: scoping review, and codesign involving experts and user groups

An initial scoping review aiming to capture existing LHS implementation and evaluation research, including case studies and tools, was conducted and is published elsewhere [37]. The US based Network Maturity Grid (NMG) tool [38] emerged as the most advanced [37]. It was developed over several years through evidence from stakeholders, a literature review, and multiple case examples [38, 39]. Here, we built on this tool in codesigning the LHS-MM, with permission for adaptation provided by the creators.

A purposive sample of experts with experience in health systems research, LHS, evaluation, implementation science, health service delivery and complex research, as well as consumer and community members, were engaged. Members of the network of expert advisors established as part of initial Monash LHS framework development [17] were invited (by AJ, HT, JE) to contribute to matrix refinement and to identify further potential experts. The interview schedule (S8) was informed by our systematic reviews on effective LHS models [11] and scoping review on implementation tools [37], and was designed to discover the need, purpose, and perceived usefulness of the tool, suggestions for wording and criteria refinements, suggestions for potential frameworks that could be integrated into the tool, and implementation considerations such as barriers and/or facilitators for its use.

The interviewer (AJ) introduced the Monash LHS framework and context, before exploring the outlined topics and then the prototype LHS-MM via a semi-structured interview. Input was solicited verbally and confirmed post interview via email and inline comments directly on prototype documents. The LHS-MM was revised iteratively, with changes to assessment criteria and maturity levels, evaluation matrix format, potential scenarios of use and target audiences, and possible integration into existing workflows. Successive dated versions of the tool were stored on secure file servers. Semi-structured interviews continued until saturation, when no further actionable changes were identified.

A workshop was then conducted with identified stakeholders and end-users representative of the target audience of the tool, that is, embedded researchers engaged in healthcare improvement activity. Here we engaged with embedded researchers and healthcare improvement project leads who were actively implementing the Monash LHS framework in a regional health system transformation program (MRFF RARUR000072) involving a diverse array of improvement projects. Participants were invited to apply the LHS-MM to assess LHS maturity in a group setting, with individual participants assessing each LHS component. Feedback and reflections were captured on experience using the LHS-MM, specifically on aspects of usability, use cases for the tool, and how the results of assessment could be leveraged going forward. Recommended refinements were captured via participatory engagement, with inputs captured verbally, in writing via email, and directly on tool documents during and after the workshop.

All feedback from expert and end user engagement was considered of equal importance, aligned to sharing of power in co-design [40], allowing the capture and integration of multiple perspectives across different subject matter experts and end users to further refine the tool. Throughout the codesign process, the co-authors met regularly and communicated via email to iteratively discuss and integrate evidence sources and build on the tool with successive dated versions stored on secure file servers.

Develop: integration with theory driven frameworks

The LHS-MM generated via integrating results from the scoping review, feedback from experts and insights from the workshop was then integrated with an evidence-based implementation framework (CFIR) [3], an evaluation framework (Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM)) [28], and the adherence construct of implementation fidelity [41]. These frameworks were selected a priori for their robust evidence base and extensive utilisation in implementation science [3, 42–44]. CFIR was integrated within the Implementation and Healthcare Improvement LHS components to incorporate a multi-level perspective on implementation and evaluation with increasing maturity; and RE-AIM was integrated within the Healthcare Improvement LHS component to incorporate a multifaceted perspective on evaluation with increasing LHS maturity. The adherence construct of implementation fidelity was integrated as the overarching framing for the tool and informed directions for usage. Additional iteration at this stage improved the specificity and clarity of criteria by reframing them as dot points, and ensured consistency in gradation of maturity levels across all LHS components.

Deliver: small scale test case – Australian stroke LHS

The LHS-MM was then tested with the Australian Stroke LHS Program [34] based on publicly available evidence. This program [34] is a national network of consumer-clinician alliances [35], data monitoring systems [45, 46], evidence synthesis bodies [47] and research centres [48] that have collectively led improvements in evidence-based stroke care in Australia. It was recently showcased as an exemplar of LHS approaches [34] aligning to the Monash LHS Framework [10, 35]. The test case was conducted independently by DR, with DC, a leader of the Australian Stroke LHS, providing additional context, clarification, and evidence.

Results

The results from each phase are summarised in Table 2. In summary, the scoping review generated an initial skeleton prototype that was adapted to the Monash LHS, with iterative refinement through the discover, define, develop and deliver phases of development.

Table 2.

Summarised results and contributions of each step in the mixed methods study to Monash LHS-MM development

Double Diamond Phase Step Contribution to LHS-MM Development
Discover, Define Scoping Review and adaptation of the tool to Monash LHS framework Uncovered the maturity grid assessment tool [38] that formed the basis of further adaptation
Adapting an Implementation Fidelity Tool for the Monash LHS Framework Basic structure of the Lannon et al. maturity matrix [38] was mapped to the Monash LHS Framework, resulting in an initial version for further iteration (S3)
Expert Feedback (n = 18) Changes in wording for maturity levels and assessment criteria, addition of options for different directions for usage (reflexive ratings vs. evidence-based), addition of background and instructions sheet within tool, recommendation for frameworks to inform assessment criteria in stakeholder evidence domain, and interim version for focus group testing (S4)
User (n = 11) Group Workshop The workshop reaffirmed the LHS-MM, with potential use cases beyond assessing implementation fidelity. However, no actionable changes to the tool itself were highlighted, and the interim version (S4) proceeded to the next step
Develop Integration of Theory Driven Frameworks Insights from the Process domain of CFIR [3] and broad RE-AIM [28] constructs were integrated, leading to wording changes to Implementation and Healthcare Improvement criteria and the current version of the Monash LHS-MM (S1–S2). Further iteration was applied to improve consistency of criteria, including reframing as dot points to improve clarity.
Deliver Test case with the Australian Stroke LHS Results from the Test case affirmed application of the LHS-MM as both an assessment tool and roadmap.

Collectively, this codesign process involved fundamental changes to structure, wording, rating criteria and conceptualisation of the LHS and its maturity. Differences from the Network Maturity Grid that formed the basis for the tool are summarised in S7, and wording and criteria changes are summarised in S9.

Monash LHS maturity matrix (LHS-MM)

The Monash LHS-MM tool includes a worksheet (S1) and report template (S2) and requires users to self-assess system behaviours across four LHS domains and eight LHS components (Table 1), each with five maturity levels and associated quantitative scores: Not Started (1), Beginning (2), Developing (3), Established (4) and Transformative (5). These criteria (full wording in S1) evaluate system level behaviours generated through underlying processes that are characteristic of an increasingly mature LHS. At higher maturity levels, fulfilment of assessment criteria in a LHS component is sequential, requiring prerequisites both from earlier maturity levels and from other LHS components within the current LHS domain. For example, in Fig. 2, “Transformative” behaviour within priority setting and ranking requires robust upstream stakeholder mapping and engagement. Fulfilment may also require prerequisites from other evidence domains, reflecting how LHS system behaviours build successively across domains. For example, in Fig. 2, “Transformative” behaviour in research evidence synthesis within the Research evidence domain requires robust upstream priority setting and ranking from the Stakeholder evidence domain.
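The sequential, prerequisite-based structure described above can be sketched in code. This is a minimal illustration only, not part of the published tool: the component names, the dependency map and the validation logic are assumptions made for the example.

```python
# Illustrative sketch of the LHS-MM scoring structure: eight components rated
# 1-5, with upstream prerequisites that gate the highest ("Transformative")
# rating. Names and the dependency map are invented for this example.

LEVELS = {1: "Not Started", 2: "Beginning", 3: "Developing",
          4: "Established", 5: "Transformative"}

# Hypothetical upstream dependencies: a component can only be rated
# Transformative if its listed prerequisites are also Transformative.
PREREQUISITES = {
    "Priority Identification": ["Stakeholder Engagement"],
    "Evidence Synthesis": ["Priority Identification"],
}

def validate(scores: dict) -> list:
    """Return warnings where a 5/5 rating lacks its upstream prerequisites."""
    warnings = []
    for component, score in scores.items():
        if score == 5:
            for upstream in PREREQUISITES.get(component, []):
                if scores.get(upstream, 1) < 5:
                    warnings.append(
                        f"{component} rated Transformative but {upstream} "
                        f"is only {LEVELS[scores.get(upstream, 1)]}")
    return warnings

scores = {"Stakeholder Engagement": 4, "Priority Identification": 5,
          "Evidence Synthesis": 3}
print(validate(scores))
```

Run against the sample scores, the check flags that Priority Identification cannot be Transformative while Stakeholder Engagement remains at Established.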

Fig. 2.

Fig. 2

Examples of interdependencies between LHS system behaviours captured within the Monash LHS-MM, where “Transformative” stakeholder mapping and engagement (first, top box) is required for “Transformative” priority setting and ranking (second, middle box), and “Transformative” evidence synthesis requires “Transformative” upstream priority setting (third, bottom box)

The result is a graphical view of LHS maturity (implementation fidelity) (hypothetical scores in Fig. 3), highlighting strengths, deficiencies, and routes for improvement or investment.

Fig. 3.

Fig. 3

Sample radar chart that is produced by the Monash LHS Maturity Matrix (LHS-MM), highlighting ability to track maturity over time
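The translation of eight component scores into a radar chart can be sketched without a plotting library by computing the polygon vertices directly; the axis order, orientation and scores below are illustrative assumptions, not the published chart.

```python
import math

# Illustrative sketch: converting eight LHS component scores (1-5) into the
# (x, y) vertices of a radar-chart polygon. Component order is a placeholder;
# the published tool produces the chart from a worksheet template.

COMPONENTS = ["Stakeholder engagement", "Priority identification",
              "Evidence-based information", "Evidence synthesis",
              "Data systems", "Benchmarking", "Implementation",
              "Healthcare improvement"]

def radar_vertices(scores):
    """Map scores onto equally spaced axes, starting at 12 o'clock."""
    n = len(scores)
    vertices = []
    for i, score in enumerate(scores):
        angle = math.pi / 2 - 2 * math.pi * i / n  # clockwise from top
        vertices.append((score * math.cos(angle), score * math.sin(angle)))
    return vertices

verts = radar_vertices([4, 3, 5, 4, 5, 5, 4, 3])
print(len(verts))  # one vertex per LHS component
```

Overlaying polygons from successive assessments gives the maturity-over-time view the figure caption describes.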

In the following sections, we provide further detail on assessment of LHS domains and components in the LHS-MM.

Stakeholder evidence domain

Stakeholder derived evidence is generated from individuals or groups who have a stake in both the problems uncovered and the solutions that are proposed within a system [49]. Measurement of maturity in this LHS domain focuses on how stakeholders are engaged (LHS Component: Engagement of People) for problem ideation and co-design of interventions, and how stakeholder priorities are integrated into decision making and project planning (LHS Component: Identifying Priorities). The Engagement of People scale is informed by the International Association for Public Participation (IAP2) Spectrum of Public Participation model [50], which was suggested during expert stakeholder co-design. The IAP2 model conceptualises the level of public participation on a five-stage spectrum: “inform”, “consult”, “involve”, “collaborate” and, lastly, “empower”. Likewise, the Engagement of People scale in the Monash LHS-MM maps stakeholder mapping and “informing” to the second level of maturity, “consulted” to the third level, “involvement” and “collaborative inclusion” to the fourth level, and “empowerment” for codesign and co-leading of initiatives to the highest (fifth) level. The Identifying Priorities scale is informed by the James Lind Alliance (JLA) approach to priority setting [51]. Specifically, the JLA approach [52] involves the establishment of a priority setting partnership with appropriate engagement of stakeholders, gathering potential research questions and uncertainties, and developing a long list of priorities before developing a revised short list in a final priority setting workshop. Accordingly, the Identifying Priorities scale in the LHS-MM maps the JLA process onto a scale. Thus, engagement with stakeholders to create a “long list” is set to the second level of maturity, with the development of a short list after “long list” development mapped to the third level.
The fourth level is concerned with the method by which priority setting is conducted, specifying the need for formal consensus methods such as Delphi processes [20] or nominal group technique [52], and the fifth level requires the formal operationalisation of these priorities into evaluation and outcome metrics within the system.
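The IAP2-to-maturity correspondence described above can be written as a simple lookup. This is an illustrative transcription of the text, not an official artefact of the LHS-MM.

```python
# Transcription of the mapping described above, from IAP2 participation
# stages to Monash LHS-MM "Engagement of People" maturity levels (1-5).
# Written as a lookup sketch for illustration only.

IAP2_TO_MATURITY = {
    "inform": 2,       # stakeholder mapping and informing
    "consult": 3,
    "involve": 4,
    "collaborate": 4,  # collaborative inclusion
    "empower": 5,      # codesign and co-leading of initiatives
}

print(IAP2_TO_MATURITY["empower"])
```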

Research evidence domain

Research derived evidence is generated by embedding high quality, peer-reviewed research. Measurement of maturity in this LHS domain focuses on how existing peer-reviewed research (LHS Component: Evidence Based Information) is being accessed, generated, incorporated and synthesised (LHS Component: Evidence Synthesis and Guidelines). At higher levels of maturity, integration of stakeholder priorities from the Stakeholder LHS domain is expected to guide evidence generation and synthesis [9], as well as new research evidence being integrated and synthesised as it is produced [24].

Data and practice derived evidence domain

Data derived evidence refers to facts, circumstances or perceptions that can be analysed to inform decisions. This can be both qualitative and quantitative data that is generated through routine health system functioning. Measurement in this LHS domain focuses on how relevant data is being captured and accessed (LHS Component: Data and Information Systems); before being analysed, reported, and used to inform healthcare improvement activities (LHS Component: Benchmarking). Relevant data is determined by alignment with stakeholder priorities and research evidence, as elicited from the prior LHS domains. At higher maturity levels, there is clear evidence that stakeholder priority and research evidence is being used to drive the way data is being accessed, analysed and benchmarked to inform improvement activity in an ongoing cycle.

Implementation derived evidence

Implementation derived evidence refers to (i) how implementation science and practice-based evidence is being used to inform and create the conditions necessary for sustainable change and innovation (LHS Component: Implementation), and (ii) how evidence from the other three LHS domains is being integrated and evaluated (LHS Component: Healthcare Improvement) to underpin ongoing healthcare improvement.

The Implementation component scale is informed by the Innovation domain of the CFIR framework [3], specifically the conceptual separation between the innovation to be implemented and its implementation strategy. This conceptual separation is required in the LHS-MM from the second level of maturity onwards. Thereafter, increasing maturity coincides with consideration of additional barriers and enablers towards implementation, in alignment with the Implementation process domain of CFIR [3]. Lastly, at the highest maturity level, implementation and outcomes should also be tracked at levels easily mapped to the individual, inner (local level within the LHS-MM) and outer (system level within the LHS-MM) setting domains of CFIR [3].

The Healthcare Improvement component scale is informed by the CFIR [3] and RE-AIM [28] frameworks. At higher maturity levels, implementation science and evidence from all LHS domains are being used at scale to (i) select and adapt novel innovations and (ii) co-design and operationalise implementation or de-implementation strategies for these innovations, in alignment with both the innovation and process domains of CFIR. Thereafter, evaluation from implementation activities is continuously used to inform cycles of healthcare improvement, allowing evidence from all LHS domains to be linked towards outcomes. Further, as maturity builds, it is expected that evaluation routinely drives healthcare improvement, with a gradual increase in sophistication and comprehensiveness. This starts with limited evidence of any evaluation at “Not Started”, before building towards the routine evaluation of relevant RE-AIM [28] constructs such as reach, effectiveness, adoption and maintenance across all levels. At the “Transformative” level, there is an expectation that evaluation activity also drives further improvement activity.

Monash LHS-MM as an implementation fidelity assessment tool

The LHS-MM is designed to assess the four subcategories (Content, Frequency, Duration and Coverage) of the “Adherence” construct of implementation fidelity in relation to the Monash LHS framework (Table 3). Implementation fidelity refers to the extent to which a complex intervention (the Monash LHS framework) has been implemented as intended by the developers, whereas “Adherence” is the “bottom line” of Implementation fidelity, or the extent to which those wishing to implement LHS principles have “adhered” to the Monash LHS framework as planned. Adherence is decomposed into four subcategories: Content, Frequency, Duration, and Coverage. “Content” refers to “what” is being assessed, in this case how the Monash LHS framework has been conceptualised into measurable attributes as detailed above. Frequency, Duration and Coverage collectively refer to the “dose”, or the extent to which the Monash LHS framework is being delivered.

Table 3.

The Monash LHS-MM approach to measure the adherence construct of implementation fidelity, as related to the Monash LHS framework

Adherence Subcategories Monash LHS-MM Assessment Approach
Content Breaks down the Monash LHS Framework and LHS cycle into system level behaviours across all LHS components and domains. Each component is assessed across five maturity levels, with criteria building on preceding maturity levels within LHS components, and between LHS components in other domains.
Frequency & Duration Frequency & Duration refer to how often and for how long the Monash LHS Framework is being applied within a context using the LHS-MM as an implementation strategy. For example, an implementor wishing to implement the Monash LHS framework may prespecify routine implementation fidelity evaluations every 3 months (frequency) for two years (duration) using the LHS-MM to investigate how changes to stakeholder engagement are improving the maturity of their LHS context. Both can be measured via successive usages of the tool due to its flexible format. Sequential radar charts can then be overlaid to display evolution of adherence to the LHS model over time.
Coverage “Coverage” is the extent, or depth to which the LHS has been delivered. Here the Monash LHS-MM measures this on a maturity scale from 1–5.
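The frequency-and-duration idea in Table 3 can be sketched as repeated assessments stored per round, with per-component change computed between rounds. The dates, component names and scores below are invented for illustration; this is not part of the published tool.

```python
# Hedged sketch: two hypothetical quarterly LHS-MM assessment rounds,
# each scoring components on the 1-5 maturity scale, and the change
# in maturity between rounds (the "adherence over time" view).

assessments = {
    "2024-Q1": {"Stakeholder engagement": 2, "Benchmarking": 3},
    "2024-Q2": {"Stakeholder engagement": 3, "Benchmarking": 3},
}

def maturity_change(before: dict, after: dict) -> dict:
    """Per-component score delta between two assessment rounds."""
    return {c: after[c] - before[c] for c in before}

delta = maturity_change(assessments["2024-Q1"], assessments["2024-Q2"])
print(delta)
```

A positive delta for a component indicates growing adherence on that axis; overlaying the corresponding radar charts gives the same information graphically.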

Test case – the Australian stroke learning health system program

Figure 4 summarises the overall LHS maturity level of the Australian Stroke LHS. A completed LHS-MM, accompanying report, and assessment rationale with evidence for each component is available (S5a–S5c). Notably, the Australian Stroke LHS was rated as Transformative (5/5) for Stakeholder Engagement, Evidence Based Information, Evidence Synthesis, Data Information Systems, Benchmarking and Implementation. However, Identifying Priorities was an area for improvement (Developing, 3/5) due to limited evidence for how stakeholder priorities were being ranked. Additionally, given that implementation of tools and processes associated with the LHS principles was still relatively nascent, evidence of evaluation of the ongoing LHS cycle was limited; thus a rating beyond Established (4/5) could not be assigned.

Fig. 4.

Fig. 4

Radar chart summarising results of the Monash LHS Maturity Matrix (LHS-MM) assessment of the Australian Stroke LHS Program

Application

The LHS-MM can be used as a brief, reflexive analysis with documentation to support maturity scores, as done in our test case (S5a-c), or as a deep analysis using mixed methods data collection such as interviews, for example within a realist evaluation [53]. Following the instructions in the LHS-MM worksheet (S1), the scores are completed and a summary sheet with a radar chart visually displays maturity scores across all eight components (Fig. 3), accompanied by a report template (S2) to record results. The tool can be applied using simple Word and Excel outputs and has also been incorporated into an online Monash LHS implementation toolkit, with further instructions and support information. The maturity assessment using the LHS-MM is best conducted iteratively, with benchmarking over time and across organisations, to guide targeted efforts to improve health system behaviours towards a mature LHS.
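The radar-chart summary described above can be reproduced with minimal code. The sketch below, in Python, maps the eight LHS-MM component scores to polar coordinates ready for plotting with any charting library; the scores shown are placeholders, not the published stroke assessment.

```python
import math

# The eight LHS-MM components; scores below are illustrative
# placeholders on the 1-5 maturity scale, not real assessment data.
COMPONENTS = [
    "Stakeholder engagement", "Priority identification",
    "Evidence-based information", "Evidence synthesis and guidelines",
    "Data systems", "Benchmarking", "Implementation",
    "Healthcare improvement",
]

def radar_points(scores):
    """Map one maturity score per component to (angle, radius) pairs
    for a radar chart, repeating the first point to close the polygon."""
    if len(scores) != len(COMPONENTS):
        raise ValueError("one score per component expected")
    n = len(scores)
    angles = [2 * math.pi * i / n for i in range(n)]
    points = list(zip(angles, scores))
    return points + points[:1]  # close the loop for plotting

round_1 = [5, 3, 5, 5, 5, 5, 5, 4]  # example scores only
pts = radar_points(round_1)
```

Successive assessment rounds produce further point sets that can be drawn on the same polar axes to show maturity evolving over time.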

Discussion

Building on the LHS, we have generated the Monash LHS-MM, codesigned from (i) evidence from a scoping review of existing frameworks and tools [37], and (ii) codesign with expert stakeholders and end-users through the discover, define and develop phases, including integration of evidence-based, theory-driven frameworks (CFIR [3], RE-AIM [28] and the Conceptual Framework of Implementation Fidelity [41]). The resultant LHS-MM, designed for measuring fidelity and guiding implementation of an evidence-based LHS framework to enhance healthcare improvement, was then delivered in a test case in the Australian Stroke LHS program [34]. As such, it is both an implementation guide and a monitoring tool, and can assist health services across diverse contexts to establish an LHS to enhance healthcare improvement and deliver impact.

The codesign process for the LHS-MM was underpinned by the conceptualisation of the LHS as a series of measurable, system-level behaviours. These behaviours are generalisable across contexts but delivered through flexible processes unique to each context. To address the need to guide and measure LHS implementation, we built upon work originating in the United States on the use of maturity matrices in LHSs (Network Maturity Grid) [38, 54]. Maturity matrices emerged from the engineering sector [55, 56] to assess and optimise behavioural outcomes from a systems perspective (key divergences in S7), and have also been used in public health settings for similar large-scale health transformation projects [57–59]. They rely on behaviourally anchored rating scales [60], where each level on the scale depicts model behaviour against which assessors can compare their own context.

The LHS-MM includes quantitative rating scales to assess LHS system behaviours by maturity from 1 to 5 across all eight LHS components (Table 3). Here, we focus on assessing behaviours without prescribing specific processes, allowing these to be employed in a bespoke manner depending on context. This allows flexible application of the LHS-MM across a range of healthcare settings and contexts. Throughout the codesign process, the matrix was perceived to be useful in understanding, planning, implementing, refining and evaluating LHS implementation fidelity and adherence to the LHS framework. Next steps in development include implementation and evaluation in large-scale national programs (APP1198561, APP2018718), which are underway across a range of LHS initiatives with embedded research and evaluation [61].

The Monash LHS-MM builds on contemporary maturity matrices [38, 57, 59, 62] and the LHS to integrate evidence-based implementation frameworks (CFIR [3] and RE-AIM [28]) to improve system-level capability in applying implementation science at scale. Where CFIR excels at helping implementors identify "what" contextual, multi-level factors are salient to implementation, the LHS-MM helps implementors quantitatively assess and chart a course to developing the system-level capabilities and behaviours that enable successful implementation and improvement activity; in this case, both the "how" and the "how well". For example, in our Australian stroke test case, a lack of maturity in priority setting was identified, and future investment could be targeted towards improving how stakeholder priorities are elicited via consensus, ranked, and shortlisted to inform downstream research, data and benchmarking, and implementation activity.

The major strength of the Monash LHS-MM is that it is a codesigned resource that can function as both an implementation guide and a monitoring tool. It can help health systems in different contexts establish an LHS to enhance healthcare improvement and deliver impact. As such, it has been integrated within an online implementation toolkit currently being developed to enable (i) ongoing workforce capacity building, and (ii) implementation of learning health systems that continuously learn from practice, adapt interventions, and drive improvement. This will help ensure that LHSs move beyond conceptual models into practical, scalable systems that can drive meaningful change. Since its development, the Monash LHS-MM has been ready for wider use and testing. It is being leveraged at a national scale within the Australian National Maternal LHS (MRF2042814) to elicit nationwide priority setting on maternal outcomes and to develop a national clinical quality registry and linked data ecosystem [63] to collectively improve pregnancy outcomes. Its versatility is also evidenced by its use beyond health, where it has been applied to advancing gender equity in the health leadership workforce [64].

There are several limitations to this study. The LHS-MM was codesigned to align with the Monash LHS Framework and the Australian context; therefore, it may require adaptation for other LHS models and settings. In refining the LHS-MM, it was delivered through an Excel worksheet and Word template, which may lack sophistication but enhances accessibility. It is now integrated in an online implementation toolkit, with codesigned embedded guidance and resources to enable better usability. Workshops with end-users were conducted in group settings; as such, in-group dynamics may have influenced the results. Further, there is an element of subjectivity to self-assessment. Whilst the LHS-MM aims to enable a pragmatic, reflexive self-assessment approach, this allows potential bias, which drove the more prescriptive descriptions of maturity that evolved over the codesign process. Further in-depth case studies are needed, using mixed methods and incorporating multiple perspectives across a range of settings, to assess interrater reliability, construct validity and the level of detail required for a more comprehensive "real-world" assessment. Future work should also consider the relative strengths and weaknesses of using either a reflexive approach, as conducted in the Stroke LHS case study, or a more in-depth, evidence-based approach to assessing implementation fidelity as part of a formative evaluation structured within a hybrid effectiveness-implementation study design [65]. This is underway across multiple large-scale, complex, funded initiatives underpinned by the CFIR, the LHS and the Monash LHS-MM.

Conclusion

Learning Health Systems (LHS) aim to deliver healthcare improvement by eliciting and integrating evidence-based system behaviours that are operationalised through a series of processes unique to each context. Given their complexity, implementation can be challenging, and tools to support LHS implementation and evaluate implementation fidelity are scarce. In response, we have codesigned the Monash LHS-MM through a scoping review and a four-phase Double Diamond codesign process involving stakeholder codesign, integration of evidence-based frameworks, and a test case. The resultant LHS-MM provides a structured approach to assessing LHS implementation fidelity and maturity over time, supporting stakeholders to assess and improve approaches to healthcare improvement. The LHS-MM has now been integrated into an online platform [32] aligned with the CFIR and is being tested and deployed across initiatives to benchmark and drive large-scale systems change within healthcare and beyond.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary Material 1 (26.6KB, xlsx)
Supplementary Material 2 (437.2KB, xlsx)
Supplementary Material 3 (45.2KB, docx)
Supplementary Material 4 (19.1KB, docx)
Supplementary Material 5 (33.9KB, docx)
Supplementary Material 6 (29.4KB, docx)
Supplementary Material 7 (1.6MB, xlsx)
Supplementary Material 8 (22.4KB, docx)
Supplementary Material 9 (16.2KB, docx)
Supplementary Material 10 (427.3KB, xlsx)

Acknowledgements

The authors would like to acknowledge the input and valuable feedback contributed by the subject matter experts and potential users of the tool who were engaged as part of maturity matrix development. Further, this work has leveraged the Deliver initiative (MRFF RARUR000072), which aims to leave a sustainable learning health system and a resultant legacy of better health outcomes and greater research capacity in regional settings in Australia.

Abbreviations

LHS

Learning Health System

LHS-MM

Learning Health System Maturity Matrix

CFIR

Consolidated Framework for Implementation Research

RE-AIM

Reach, Efficacy, Adoption, Implementation and Maintenance

BARS

Behaviourally Anchored Rating Scales

SRQR

Standards for Reporting Qualitative Research

ASHC

Academic Health Science Centres

PROMs / PREMs

Patient Reported Outcome/ Experience Measures

PBS

Pharmaceutical Benefits Scheme

MBS

Medicare Benefits Schedule

ASC

Australian Stroke Coalition

AuSDaT

Australian Stroke Data Tool

AuSCR

Australian Stroke Clinical Registry

DR

Darren Rajit

AJ

Alison Johnson

DC

Dominique Cadilhac

SR

Sandy Reeder

JE

Joanne Enticott

HT

Helena Teede

Author contributions

D.R, A.J, J.E and H.T contributed to conceptualisation. H.T obtained funding. A.J conducted the interviews. J.E and H.T contributed to supervision. D.C and S.R provided feedback on the maturity matrix during its development stage and D.C corroborated the stroke test case details. D.R. drafted the manuscript with H.T, and all authors contributed intellectually, revised and approved the manuscript.

Funding

D.R. is supported by an Australian Government Research Training Program (RTP) Scholarship. H.T. is funded by an NHMRC Fellowship. D.C is a grant holder of the Centre of Research Excellence to Accelerate Stroke Trial Innovation and Translation, Australian National Health and Medical Research Council, grant number 2015705, and co-leads the Learning Health System workstream for this grant. This work is also supported by the Australian Government Medical Research Future Fund. The funders of this work did not have any direct role in the design of the study, its execution, analyses, interpretation of the data or decision to submit results for publication.

Data availability

All data supporting the findings of this study is available within the paper and the accompanying supplementary information.

Declarations

Ethics approval and consent to participate

Potential participants were invited to take part in the study by an introductory email and then followed up by a project researcher to provide study information (including prototypes of the matrix), answer questions and organise mutually agreeable interview times to provide sufficient time for participants to consider involvement. Participants were fully informed on study aims and specific information sought from them before the start of the interview. Informed consent was obtained through implied consent demonstrated by participation in the interview after receiving this information and provided agreement to record the session. All interviews were recorded. This study was approved by the Monash University Human Research Ethics Committee (Project ID: 19969) who approved the method for consent. This study adhered to the Declaration of Helsinki, Ethical Principles for Medical Research Involving Human Participants.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Joanne Enticott and Helena Teede Joint senior author.

References

  • 1.Melder A, Robinson T, McLoughlin I, Iedema R, Teede H. An overview of healthcare improvement: unpacking the complexity for clinicians and managers in a learning health system. Intern Med J. 2020;50(10):1174–84. 10.1111/imj.14876. [DOI] [PubMed] [Google Scholar]
  • 2.Braithwaite J, Glasziou P, Westbrook J. The three numbers you need to know about healthcare: the 60-30-10 challenge. BMC Med. 2020;18(1):102. 10.1186/s12916-020-01563-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated consolidated framework for implementation research based on user feedback. Implement Sci. 2022;17(1):75. 10.1186/s13012-022-01245-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010;5(1):14. 10.1186/1748-5908-5-14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the consolidated framework for implementation research. Implement Sci. 2016;11(1):72. 10.1186/s13012-016-0437-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Institute of Medicine (US) Roundtable on Evidence-Based Medicine. The learning healthcare system: workshop summary [Internet]. Olsen L, Aisner D, McGinnis JM, editors. Washington (DC): National Academies Press (US); 2007 [cited 2024 Jul 1]. http://www.ncbi.nlm.nih.gov/books/NBK53494/ [PubMed]
  • 7.Safaeinili N, Brown-Johnson C, Shaw JG, Mahoney M, Winget M. CFIR simplified: pragmatic application of and adaptations to the consolidated framework for implementation research (CFIR) for evaluation of a patient‐centered care transformation within a learning health system. Learn Health Syst. 2019;4(1):e10201. 10.1002/lrh2.10201. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Cadilhac DA, Bravata DM, Bettger JP, et al. Stroke learning health systems: A topical narrative review with case examples. Stroke. 2023;54(4):1148–59. 10.1161/STROKEAHA.122.036216. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Rajit D, Johnson A, Callander E, Teede H, Enticott J. Learning health systems and evidence ecosystems: a perspective on the future of evidence-based medicine and evidence-based guideline development. Health Res Policy Syst. 2024;22(1):4. 10.1186/s12961-023-01095-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Enticott JC, Melder A, Johnson A, et al. A learning health system framework to operationalize health data to improve quality care: an Australian perspective. Front Med (Lausanne). 2021;8:730021–730021. 10.3389/fmed.2021.730021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Enticott J, Johnson A, Teede H. Learning health systems using data to drive healthcare improvement and impact: a systematic review. BMC Health Serv Res. 2021;21(1):200. 10.1186/s12913-021-06215-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.McEvoy MD, Dear ML, Buie R, et al. Embedding learning in a learning health care system to improve clinical practice. Acad Med. 2021;96(9):1311–4. 10.1097/ACM.0000000000003969. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Ellis LA, Sarkies M, Churruca K, et al. The science of learning health systems: scoping review of empirical research. JMIR Med Inf. 2022;10(2):e34907. 10.2196/34907. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Vargas C, Whelan J, Brimblecombe J, Allender S. Co-creation, co-design, co-production for public health - a perspective on definition and distinctions. Public Health Res Pract. 2022;32(2):3222211. 10.17061/phrp3222211. [DOI] [PubMed] [Google Scholar]
  • 15.McEvoy MD, Dear ML, Buie R, et al. Effect of smartphone App–Based education on clinician prescribing habits in a learning health care system: A randomized cluster crossover trial. JAMA Netw Open. 2022;5(7):e2223099. 10.1001/jamanetworkopen.2022.23099. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Vahidy FS. A learning health care System–Based approach for improving quality of care among patients with transient ischemic attack. JAMA Netw Open. 2020;3(9):e2016123. 10.1001/jamanetworkopen.2020.16123. [DOI] [PubMed] [Google Scholar]
  • 17.Enticott J, Braaf S, Johnson A, Jones A, Teede HJ. Leaders’ perspectives on learning health systems: a qualitative study. BMC Health Serv Res. 2020;20(1):1087. 10.1186/s12913-020-05924-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Rutter H, Savona N, Glonti K, et al. The need for a complex systems model of evidence for public health. Lancet. 2017;390(10112):2602–4. 10.1016/S0140-6736(17)31267-9. [DOI] [PubMed] [Google Scholar]
  • 19.Bird M, McGillion M, Chambers EM, et al. A generative co-design framework for healthcare innovation: development and application of an end-user engagement framework. Res Involv Engagem. 2021;7(1):12. 10.1186/s40900-021-00252-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Vogel C, Zwolinsky S, Griffiths C, Hobbs M, Henderson E, Wilkins E. A Delphi study to build consensus on the definition and use of big data in obesity research. Int J Obes. 2019;43(12):2573–86. 10.1038/s41366-018-0313-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Bello JO, Grant P. A systematic review of the effectiveness of journal clubs in undergraduate medicine. Can Med Educ J. 2023;14(4):35. 10.36834/cmej.72758. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Nyanchoka L, Tudur-Smith C, Thu VN, Iversen V, Tricco AC, Porcher R. A scoping review describes methods used to identify, prioritize and display gaps in health research. J Clin Epidemiol. 2019;109:99–110. 10.1016/j.jclinepi.2019.01.005. [DOI] [PubMed] [Google Scholar]
  • 23.McDonald S, Hill K, Li HZ, Turner T. Evidence surveillance for a living clinical guideline: case study of the Australian stroke guidelines. Health Info Libr J. Published online November 9, 2023. 10.1111/hir.12515. [DOI] [PMC free article] [PubMed]
  • 24.Rajit D, McDonald S, Tay CT, Du L, Enticott J, Teede H. Assessing the coverage of PubMed, Embase, OpenAlex, and semantic scholar for automated single-database searches in living guideline evidence surveillance: a case study of the international polycystic ovary syndrome guidelines 2023. J Clin Epidemiol. 2025;183:111789. 10.1016/j.jclinepi.2025.111789. [DOI] [PubMed] [Google Scholar]
  • 25.Bull C, Teede H, Watson D, Callander EJ. Selecting and implementing Patient-Reported outcome and experience measures to assess health system performance. JAMA Health Forum. 2022;3(4):e220326. 10.1001/jamahealthforum.2022.0326. [DOI] [PubMed] [Google Scholar]
  • 26.Weiner J, Balijepally V, Tanniru M. Integrating strategic and operational decision making using Data-Driven dashboards: the case of St. Joseph mercy Oakland hospital. J Healthc Manag. 2015;60(5):319. [PubMed] [Google Scholar]
  • 27.Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46(1):81–95. 10.1007/s12160-013-9486-6. [DOI] [PubMed] [Google Scholar]
  • 28.Holtrop JS, Estabrooks PA, Gaglio B, et al. Understanding and applying the RE-AIM framework: clarifications and resources. J Clin Translational Sci. 2021;5(1):e126. 10.1017/cts.2021.789. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.King DK, Shoup JA, Raebel MA, et al. Planning for implementation success using RE-AIM and CFIR frameworks: A qualitative study. Front Public Health. 2020;8:59. 10.3389/fpubh.2020.00059. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Best Practice for PIVC. Accessed June 17, 2024. https://sites.google.com/monash.edu/bestpracticepivc/home
  • 31.Deliver. Growing research in Western Victoria. April 4, 2024. Accessed June 17, 2024. https://deliver.westernalliance.org.au/.
  • 32.MCHRI. Implementation toolkit. Accessed November 17, 2025. https://www.mchri.org.au/guidelines-resources/health-professionals/implementation-toolkit/.
  • 33.The Double Diamond - Design Council. Accessed August 6, 2024. https://www.designcouncil.org.uk/our-resources/the-double-diamond/
  • 34.Teede H, Cadilhac DA, Purvis T, et al. Learning together for better health using an evidence-based learning health system framework: a case study in stroke. BMC Med. 2024;22(1):198. 10.1186/s12916-024-03416-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Australian Stroke Coalition. Position statement: stroke learning health system approach in Australia [Internet]. 2024 [cited 2024 Aug 26]. https://australianstrokecoalition.org.au/wp-content/uploads/2024/03/ASC-Pos-Statement_Stroke-Learning-Health-System-approach-in-Australia.pdf
  • 36.O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51. 10.1097/ACM.0000000000000388. [DOI] [PubMed] [Google Scholar]
  • 37.Rajit D, Reeder S, Johnson A, Enticott J, Teede H. Tools and frameworks for evaluating the implementation of learning health systems: a scoping review. Health Res Policy Syst. 2024;22(1):95. 10.1186/s12961-024-01179-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Lannon C, Schuler CL, Seid M, et al. A maturity grid assessment tool for learning networks. Learn Health Syst. 2021;5(2):e10232. 10.1002/lrh2.10232. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Van Citters AD, Buus-Frank ME, King JR, et al. The cystic fibrosis learning network: A mixed methods evaluation of program goals, attributes, and impact. Learn Health Syst. 2023;7(3):e10356. 10.1002/lrh2.10356. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.National Institute for Health and Care Research (NIHR). Guidance on co-producing a research project: learning for involvement [Internet]. [cited 2024 Aug 6]. https://www.learningforinvolvement.org.uk/content/resource/nihr-guidance-on-co-producing-a-research-project/
  • 41.Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007;2(1):40. 10.1186/1748-5908-2-40. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.Frontiers. Use of the reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) framework to guide iterative adaptations: applications, lessons learned, and future directions [Internet]. [cited 2024 Nov 14]. https://www.frontiersin.org/journals/health-services/articles/10.3389/frhs.2022.959565/full [DOI] [PMC free article] [PubMed]
  • 43.Muntinga ME, Van Leeuwen KM, Schellevis FG, Nijpels G, Jansen AP. From concept to content: assessing the implementation fidelity of a chronic care model for frail, older people who live at home. BMC Health Serv Res. 2015;15(1):18. 10.1186/s12913-014-0662-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Guerbaai RA, DeGeest S, Popejoy LL, et al. Evaluating the implementation fidelity to a successful nurse-led model (INTERCARE) which reduced nursing home unplanned hospitalisations. BMC Health Serv Res. 2023;23(1):138. 10.1186/s12913-023-09146-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Ryan O, Ghuliani J, Grabsch B, et al. Development, implementation, and evaluation of the Australian stroke data tool (AuSDaT): comprehensive data capturing for multiple uses. HIM J. 2024;53(2):85–93. 10.1177/18333583221117184. [DOI] [PubMed] [Google Scholar]
  • 46.Harris D, Cadilhac DA, Hankey GJ, Hillier S, Kilkenny M, Lalor E. National stroke audit: the Australian experience. Clin Audit. 2010;2:25–31. 10.2147/CA.S9435. [Google Scholar]
  • 47.English C, Hill K, Cadilhac DA, et al. Living clinical guidelines for stroke: updates, challenges and opportunities. Med J Aust. 2022;216(10) [cited 2024 Jul 1]. https://www.mja.com.au/journal/2022/216/10/living-clinical-guidelines-stroke-updates-challenges-and-opportunities#6. [DOI] [PMC free article] [PubMed]
  • 48.The Centre for Research Excellence to Accelerate Stroke Trial Innovation and Translation. [Title of webpage] [Internet]. 2023 Dec 14 [cited 2024 Jul 1]. https://stroke-trials-cre.org.au/
  • 49.Schiller C, Winters M, Hanson HM, Ashe MC. A framework for stakeholder identification in concept mapping and health research: a novel process and its application to older adult mobility and the built environment. BMC Public Health. 2013;13:428. 10.1186/1471-2458-13-428. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.International Association for Public Participation (IAP2). IAP2 Spectrum of Public Participation. https://iap2.org.au/wp-content/uploads/2020/01/2018_IAP2_Spectrum.pdf
  • 51.James Lind Alliance (JLA). JLA Guidebook. https://www.jla.nihr.ac.uk/jla-guidebook/
  • 52.Nygaard A, Halvorsrud L, Linnerud S, Grov EK, Bergland A. The James Lind alliance process approach: scoping review. BMJ Open. 2019;9(8):e027473. 10.1136/bmjopen-2018-027473. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Nurjono M, Shrestha P, Lee A, et al. Realist evaluation of a complex integrated care programme: protocol for a mixed methods study. BMJ Open. 2018;8(3):e017111. 10.1136/bmjopen-2017-017111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Maier AM, Moultrie J, Clarkson PJ. Assessing organizational capabilities: reviewing and guiding the development of maturity grids. IEEE Trans Eng Manage. 2012;59(1):138–59. 10.1109/TEM.2010.2077289. [Google Scholar]
  • 55.Fraser P, Moultrie J, Gregory M. The use of maturity models/grids as a tool in assessing product development capability. In: Proceedings of the IEEE International Engineering Management Conference; 2002; Cambridge, UK. Vol. 1. p. 244–249. 10.1109/IEMC.2002.1038431
  • 56.Moultrie J, Sutcliffe L, Maier A. A maturity grid assessment tool for environmentally conscious design in the medical device industry. J Clean Prod. 2016;122:252–65. 10.1016/j.jclepro.2015.10.108. [Google Scholar]
  • 57.Sharma KM, Jones PB, Cumming J, Middleton L. A self-assessment maturity matrix to support large-scale change using collaborative networks in the new Zealand health system. BMC Health Serv Res. 2024;24(1):838. 10.1186/s12913-024-11284-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.In LG, Hjm FDD. Assessment of the implementation fidelity of a strategy to scale up integrated care in five European regions: a multimethod study. BMJ Open. 2020;10(3). 10.1136/bmjopen-2019-035002. [DOI] [PMC free article] [PubMed]
  • 59.Grooten L, Vrijhoef HJM, Calciolari S, et al. Assessing the maturity of the healthcare system for integrated care: testing measurement properties of the SCIROCCO tool. BMC Med Res Methodol. 2019;19(1):63. 10.1186/s12874-019-0704-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Holland JR, Arnold DH, Hanson HR et al. Reliability of the behaviorally anchored rating scale (BARS) for assessing non-technical skills of medical students in simulated scenarios. Med Educ Online. 27(1):2070940. 10.1080/10872981.2022.2070940 [DOI] [PMC free article] [PubMed]
  • 61.Ng AH, Reeder S, Jones A, et al. Consumer and community involvement: implementation research for impact (CCIRI) – implementing evidence-based patient and public involvement across health and medical research in Australia – a mixed methods protocol. Health Res Policy Syst. 2025;23(1):25. 10.1186/s12961-025-01293-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Ramadan N, Arafeh M. Healthcare quality maturity assessment model based on quality drivers. Int J Health Care Qual Assur. 2016;29(3). 10.1108/IJHCQA-08-2015-0100. [DOI] [PubMed]
  • 63.Callander EJ, Enticott J, Mol BW. Maternal and neonatal outcomes and health system costs in standard public maternity care compared to private obstetric-led care: A population-level matched cohort study. BJOG: An International Journal of Obstetrics & Gynaecology.(n/a) 10.1111/1471-0528.18286 [DOI] [PMC free article] [PubMed]
  • 64.Advancing Women in Healthcare Leadership. About Advancing Women in Healthcare Leadership [Internet]. [cited 2025 Oct 21]. https://www.womeninhealthleadership.org/about-awhl
  • 65.Handley MA, Murphy LD, Sherwin EB, Shade SB. Practical application of hybrid effectiveness–implementation studies for intervention research. Int J Epidemiol. 2025;54(3):dyaf039. 10.1093/ije/dyaf039. [DOI] [PMC free article] [PubMed] [Google Scholar]
