ABSTRACT
Objective
Current QI reports within the literature frequently fail to provide sufficient information regarding interventions, and many publications do not report the use of a guiding model or framework. The objective of this scoping review was to synthesize the characteristics of hospital‐based QI interventions and assess their alignment with recommended quality goals.
Methods
This scoping review followed the JBI methodology for scoping reviews to synthesize existing literature on hospital‐based QI interventions and reporting, using the PRISMA Extension for Scoping Reviews. Included studies involved a hospital‐based QI intervention that was evaluated using the Quality Improvement Minimum Quality Criteria Set (QI‐MQCS) framework, reporting on hospital users' (i.e., practitioners' and patients') data. We searched Medline, CINAHL, Embase, and PubMed for primary research published between 2015 and 2024. Grey literature was also examined. A narrative synthesis guided the integration of findings.
Results
From 1398 identified records, 70 relevant records were included. Results indicate wide variation in the QI frameworks and methods used by the included studies. The QI interventions most frequently assessed were organizational‐focused (n = 59), followed by professional‐related interventions (n = 41) and patient‐care interventions (n = 24). Multiple facilitators and barriers were identified across organizational, professional, and patient care levels. Facilitators instrumental in driving successful QI initiatives included education, training, active leadership, and stakeholder engagement. Conversely, barriers such as time constraints, resource limitations, and resistance to change were highlighted.
Conclusion
Existing QI publications lack sufficient detail to replicate interventions. Using a model or framework to guide the conduct of a QI‐activity may support a more robustly designed and well‐conducted project. The variation of reporting characteristics suggests that future research should focus on the development of a pragmatic tool for use by front‐line clinicians to support consistent and rigorous conduct of QI projects.
Keywords: evidence implementation, hospital, quality improvement, scoping review
1. Introduction
Healthcare is dynamic and with increasing reports of Quality Improvement (QI) projects, understanding the characteristics and outcomes of these types of studies is beneficial. Since QI interventions tend to be complex and uniquely tailored to settings, the success of these interventions is often difficult to describe and measure [1].
Quality improvement is described as the systematic and continuous activity undertaken by organizations to ensure services meet users' requirements and needs [2, 3, 4]. This ongoing and evolving process is especially important in healthcare to ensure optimal patient and service outcomes by streamlining processes and identifying areas for improving the quality of care [2, 5]. QI initiatives aim to enhance the safety and efficiency of healthcare services, the overall health of patients, as well as professional development [3, 6]. They generally employ well‐established methodologies, such as Lean, Six Sigma, and Plan‐Do‐Study‐Act, each with its own unique approach yet sharing common features, such as iterative cycles [7]. QI projects in healthcare drive enhancements across various fronts. Through efficient resource allocation, they can maximize resource utilization and reduce waste [8, 9]. QI initiatives foster innovation by encouraging novel approaches to emerging challenges and promote patient‐centered care by engaging patients and addressing their needs holistically. Moreover, they cultivate a culture of continuous learning and improvement, utilizing data‐driven insights to refine healthcare delivery iteratively [5]. Examples of quality improvement projects in health services include: implementation of symptom electronic patient‐reported outcome measures (ePROMs) in a hemodialysis clinic, development of a protocol for standardizing operating room to intensive care unit handoffs in a mixed surgical population, and implementation of an evidence‐based toolkit for medication reconciliation in small hospitals [10, 11, 12].
In 2001, the Institute of Medicine (IOM) established six dimensions (i.e., safety, effectiveness, patient centered approach, timeliness, efficiency, and equitability) that when addressed might enable healthcare facilities to implement fundamental improvements to the health of the people [13]. Since publication of the IOM dimensions of quality care, new tools have emerged that consider other elements of healthcare. “The Triple Aim Care” (TAC) goals developed by Berwick et al. [14], for instance, recommend that healthcare organizations should not only focus on improving the health of populations but also on improving patient experience and reducing healthcare costs. Both TAC and the dimensions established by IOM, however, failed to consider the needs of healthcare providers. Consequently, in 2014, TAC was expanded to Quadruple Aim Care, which included a contemporary focus on the well‐being of clinicians and other healthcare staff [15].
The Quality Improvement Minimum Quality Criteria Set (QI‐MQCS) is one tool that allows the objective evaluation of QI studies using 16 predefined domains, including QI rationale, outcomes, sustainability, and the ability to spread [1]. Since the publication of these criteria, it is not known whether published QI projects, particularly those within hospital environments, adhere to the guidance. Adhering to these reporting criteria would allow the efficacy and safety of interventions to be compared and enable researchers to avoid work duplication and research waste [1]. However, the tool is seldom used or reported. Benefits of using such tools include more robust and complete project design and a greater likelihood of generating data that can be used by other organizations [1].
The benefits of QI projects in healthcare organizations can be considerable, but, despite the existence of the above‐mentioned QI methodology and quality indicator guidelines, many QI efforts remain unsuccessful [16, 17]. Failure factors relate to a range of challenges experienced in the design, delivery, and sustainability of proposed change, and include time constraints and competing priorities, leadership issues, work culture, the development of collaborative teams, and the process of patient engagement and patient experience [18, 19]. One major challenge in adopting QI projects is overcoming established routines and beliefs held by some clinicians, despite evidence for change. This can be a barrier to trying new ideas and ensuring ongoing active engagement to sustain practice changes [19].
This scoping review therefore aimed to synthesize characteristics of reported QI interventions in hospital settings, according to QI‐MQCS criteria [1]. The objectives of the review were to: explore the methodologies used to guide QI projects; identify the focus and outcomes of the interventions; examine the facilitators and barriers to QI project implementation; assess the alignment of the studies with the 16 domains of the QI‐MQCS framework and the six dimensions of the IOM framework; and determine whether the included studies reported strengths, limitations, and provided recommendations for future research.
2. Methods
This scoping review followed the Preferred Reporting Items for Systematic Reviews and Meta‐Analyses Extension for Scoping Reviews (PRISMA‐ScR) [20] and the Joanna Briggs Institute (JBI) framework and scoping review methodology [21]. See supplementary file for the PRISMA‐ScR checklist. This review was also registered in the Open Science Framework database (registration DOI: 10.17605/OSF.IO/MSKC9) on August 7, 2022.
2.1. Eligibility Criteria
The Population, Intervention, Comparator, Outcome, and Study design (PICOS) framework [22] guided the eligibility criteria. Initially, we planned to include QI interventions across all healthcare settings; however, due to the considerable volume of literature obtained from the initial search, it was later decided to focus exclusively on hospital settings. This decision also reflects the distinctions between primary and acute settings across many aspects, such as governance, staffing, and operating models. Studies were required to report on any domain of the QI‐MQCS framework [1]. Further, only quantitative studies, written in English and published from 2015 onward, were included. We applied this date restriction because 2015 was the year the QI‐MQCS framework was developed.
2.2. Information Sources and Search Strategy
The database searches used established keywords based on the research question and objectives. Medline, CINAHL, Embase, and PubMed were searched for peer‐reviewed articles published from January 2015 to August 6, 2022. An updated search was then undertaken covering August 7, 2022, to March 1, 2024. The search strategies were modified according to the requirements of each database. Across the four databases, studies were identified through a combination of keyword and subject searches. Google Scholar was used to scope grey literature within the same timeframe, using a combination of the terms “quality improvement,” “health service,” and “implementation”; the first 10 pages of output were checked for relevant articles. We included only publications in the English language due to resource constraints. See Appendix 1 for database‐specific search strategies.
2.3. Study Screening and Selection Process
The selected studies were uploaded into EndNote 20.1 [23] and duplicates were removed. The remaining studies were exported to Covidence [24], a web‐based collaboration software platform that allows for structured screening.
2.4. Data Extraction
Data extraction was completed by two independent reviewers, with disagreements resolved through conferencing. Data were extracted into an a priori standardized data extraction form, consistent with Tables 1–4 (available in the supplementary file). Following the JBI guidance [21], data extracted included the author, title, country, the intervention, the intervention focus (i.e., organizational, professional, or patient specific), the duration of the project implemented, outcomes measured, strengths and limitations identified, and recommendations for future research. Beyond these key characteristics, QI methodologies and the barriers to, and facilitators of, QI project implementation were also documented. Furthermore, we assessed whether the included publications addressed the six IOM dimensions of quality improvement [25] (safe, effective, patient‐centered, timely, efficient, and equitable) and whether the studies aligned with the 16 domains of the QI Minimum Quality Criteria Set (QI‐MQCS) framework (Organizational Motivation, Intervention Rationale, Intervention Description, Organizational Characteristics, Implementation, Study Design, Comparator Description, Data Sources, Timing, Adherence/Fidelity, Health Outcomes, Organizational Readiness, Penetration/Reach, Sustainability, Spread, and Limitations) [1]. When a publication adhered to at least the minimal standards for a domain, it received a “meeting the criteria” score. If only parts of a domain were addressed, a “partially meeting the criteria” score was assigned. Conversely, a “not meeting the criteria” score was given when a publication did not address the domain at all [1]. For studies in which the domain “fidelity” was not explicitly reported, adherence to the intervention was assumed to have been maintained if the number of participants remained constant. In addition, we appraised the quality of each publication separately.
A score of 0 or 1 was assigned for each of the 16 domains, resulting in a maximum possible score of 16. A domain scored 1 if it met the minimum requirement, that is, if it partially or fully met the criteria. In line with previous studies [1, 26, 27], publications that reported on 14 or more domains were considered high quality.
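The scoring rule above can be expressed concisely in code. The following is a minimal illustrative sketch, not part of the review protocol; the rating labels and the example publication are hypothetical, while the 16 domain names and the ≥ 14 high-quality threshold come from the text:

```python
# Sketch of the QI-MQCS scoring rule: each of the 16 domains scores 1 if it is
# partially or fully met, 0 otherwise; a total of 14+ indicates high quality.
QI_MQCS_DOMAINS = [
    "Organizational Motivation", "Intervention Rationale", "Intervention Description",
    "Organizational Characteristics", "Implementation", "Study Design",
    "Comparator Description", "Data Sources", "Timing", "Adherence/Fidelity",
    "Health Outcomes", "Organizational Readiness", "Penetration/Reach",
    "Sustainability", "Spread", "Limitations",
]

def qi_mqcs_score(ratings: dict) -> int:
    """Sum 1 point per domain rated 'met' or 'partial' (hypothetical labels)."""
    return sum(1 for d in QI_MQCS_DOMAINS if ratings.get(d) in ("met", "partial"))

def is_high_quality(ratings: dict) -> bool:
    """Publications reporting on 14 or more domains are considered high quality."""
    return qi_mqcs_score(ratings) >= 14

# Hypothetical publication: 12 domains fully met, 2 partially met, 2 not reported.
example = {d: "met" for d in QI_MQCS_DOMAINS[:12]}
example.update({QI_MQCS_DOMAINS[12]: "partial", QI_MQCS_DOMAINS[13]: "partial"})
print(qi_mqcs_score(example), is_high_quality(example))  # 14 True
```

Note that, per the rule above, a publication partially meeting every domain would score the same (16) as one fully meeting every domain; the binary scoring deliberately collapses "partial" and "full" into meeting the minimum requirement.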
2.5. Data Synthesis
We presented the data using descriptive analysis, with frequencies and percentages where possible, to report on the number of interventions, countries, and other measurable outcomes. For textual data, we used high‐level categorization, as guided by Peters et al. [21]. No critical appraisal was undertaken, in keeping with scoping review methodology.
3. Results
3.1. Study Selection
Of the 1398 records identified through the first and updated searches, 70 met all inclusion criteria and were included in the present review. Figure 1 details the results at each level and reasons for exclusion.
FIGURE 1.

PRISMA‐ScR diagram flow chart initial and updated search combined illustrating the process of study selection.
3.2. Study Characteristics
Study years ranged from 2016 to 2024. There was a gradual increase in the number of studies published from 2016 to 2021, followed by a decline in 2022 and 2023, as displayed in Figure 2.
FIGURE 2.

Publication frequency of included studies by year.
Note. No studies were identified in the first two months of 2024.
Table 1 in the supplementary file reports on the key characteristics of all included studies. Studies were most frequently conducted in the United States (n = 38; 54%), followed by Australia (n = 10; 14%), Canada (n = 6; 9%), and the UK (n = 4; 6%). Thus, most studies originated from Western English‐speaking countries, which is not surprising given that we only included studies published in English. China and the Netherlands each contributed two studies (3%), while the Dominican Republic, Germany, Mexico, Pakistan, Switzerland, and Vietnam were each represented by a single study. Additionally, one study included participants across various unspecified countries in Europe [28]. Intervention duration varied among the 55 (79%) studies reporting on study time frames: from 2 h for a teaching session [29] to 5 years for a large and complex multi‐site intervention [30]. The mean intervention duration across the 55 studies was 13.46 months. Figure 3 reports on the types of interventions reported in the studies as organizational, professional, or patient care oriented. Some studies reported on interventions that addressed more than one aspect of quality improvement.
FIGURE 3.

Types of interventions reported in the included studies.
3.2.1. Intervention Methodology
From the 70 included studies, 48 (69%) employed specific QI methods, with ten of these (14%) utilizing multiple frameworks. Among the 48 studies, six QI models and frameworks were most frequently cited: the Consolidated Framework for Implementation Research (CFIR; n = 8; 11%), the Plan‐Do‐Study‐Act (PDSA; n = 7; 10%) cycle, the Theoretical Domains Framework (TDF; n = 9; 13%), the Standards for Quality Improvement Reporting Excellence (SQUIRE; n = 6; 9%), the Knowledge‐to‐Action Framework [31, 32, 33] (KTA; n = 3; 4%), and the Donabedian Model [34, 35] (n = 2; 3%). Twenty‐two (31%) additional frameworks were each reported once; these included, but were not limited to, the Behavior Change Wheel, Integrated Promoting Action on Research Implementation in Health Services (i‐PARIHS), the Iowa model of evidence‐based practice, and Lean methodology.
3.2.2. Intervention Focus
The predominant focus of the QI interventions was on organizational aspects (n = 59, 84%). Forty‐one studies (58%) were professional‐related interventions, while patient care interventions were addressed in 24 studies (34%). Of note, 53 studies (76%) described multiple QI interventions: specifically, 17 (24%) studies included all three interventions (i.e., organizational, professional, and patient care initiatives); a further 17 (24%) studies assessed organizational and professional interventions; 14 (20%) studies assessed organizational and patient care interventions; and five studies (7%) assessed professional and patient care interventions. Seventeen studies (24%) assessed one type of QI intervention only (organizational (n = 11, 16%); patient care (n = 4; 6%); and professional [36, 37] (n = 2; 3%)).
3.3. Intervention Outcomes
A wide variety of outcomes were examined across different healthcare contexts. Patient‐related outcomes included length of hospital stay (n = 12; 17%), mortality rates (n = 4; 6%), readmissions (n = 4; 6%), and other health‐related outcomes (n = 16; 23%). Clinical process measure outcomes included QI intervention compliance (n = 21; 30%) and examination of QI implementation facilitators and barriers (n = 22; 31%). Additional clinician‐related outcomes included clinician experiences with the intervention [12, 38, 39] (n = 3; 4%) and clinician knowledge, skills, and adherence to best practices (n = 11; 16%). QI implementation outcomes related to the feasibility (n = 14; 20%), acceptability [40, 41, 42] (n = 3; 4%), and sustainability (n = 5; 7%) of the implemented interventions. Organizational outcomes examined procedures (n = 10; 14%), as well as team processes, including collaboration and communication among healthcare professionals [43, 44, 45] (n = 3; 4%) and training/education [19, 37, 46] (n = 3; 4%).
3.4. Facilitators and Barriers to the Quality Improvement Implementation Process
Facilitators and barriers across organizational, professional, and patient care levels provide valuable insights into the factors influencing the success of QI intervention implementation (see Table 2 in the supplementary file); a summary is provided in Table 1 below.
TABLE 1.
A summary of barriers and facilitators and how to address them.
| Category | Facilitators | Barriers | Strategies to address barriers |
|---|---|---|---|
| Organizational | | | |
| Professional | | | |
| Patient level | | | |
3.4.1. Facilitators of the Implementation Process
A little over half of the papers (n = 38; 54%) reported on facilitators of QI implementation. Thirty‐one (44%) studies reported organizational‐level facilitators of the implementation process. The most frequently mentioned organizational‐level facilitators were: education and training provision (n = 7, 10%); leadership and stakeholder interest and involvement (n = 7, 10%); the use of standardized procedures and guidelines (n = 6, 9%); continuous monitoring, audits, and feedback (n = 5, 7%); and the use of an assigned staff member to support implementation (n = 7, 10%).
Professional‐level facilitators were reported in 17 (24%) of the 70 studies, revealing several trends. The most frequently mentioned facilitators were the involvement and engagement of various healthcare professionals (n = 7, 10%). This involvement included staff participation, adherence to expected practices, optimistic outlooks on interventions, and active engagement in providing feedback. Teamwork, interdisciplinary coordination, and communication among staff members were mentioned in three studies (4%) [29, 47, 48]. Additionally, the willingness of healthcare workers to adopt new tools and their positive perceptions of the benefits of the intervention were identified as facilitators (n = 6, 9%). Professional prior experience and knowledge (n = 2, 3%) [49, 50] also facilitated the implementation process.
Patient‐level facilitators were rarely reported, with only three (4%) of the 70 studies discussing these factors. Key patient‐related facilitators included acceptability of the intervention and willingness to become involved [51], patient interaction [50], and awareness of the intervention and perceiving this as a positive experience [32, 52].
3.4.2. Barriers of the Implementation Process
A total of 58 (83%) papers reported on barriers to QI implementation, as shown in Table 3 in the supplementary file. Organizational barriers to the implementation of QI initiatives in healthcare settings were reported in 49 (70%) of the 70 included studies. The most frequently reported barriers were a lack of necessary resources such as equipment and support (n = 16; 23%), increased workload and workflow disruption (n = 8; 11%), lack of time (n = 6; 9%), and insufficient financial resources (n = 6; 9%). Inadequate staffing (n = 7; 10%) and high staff turnover (n = 7; 10%) were also identified as challenges. Several studies (n = 7; 10%) highlighted difficulties related to the integration of new processes into existing systems and workflows. Difficulties in data collection and/or reporting were mentioned in five (7%) studies. A lack of standardized protocols and tools (n = 4; 6%), technology limitations [29, 38, 53] (n = 3; 4%), and inadequate knowledge about the intervention [32, 36] (n = 2; 3%) were also identified as barriers to QI implementation.
Professional‐level barriers to QI implementation were reported in 29 (41%) of the 70 studies. The most frequently mentioned professional barriers included: a lack of knowledge, skills, confidence, and training (n = 11, 16%); time constraints and increased workload (n = 9, 13%); resistance to change and lack of engagement (n = 10, 14%); communication and coordination challenges [40, 41, 54] (n = 3, 4%); and concerns about patient expectations and receptiveness [11, 55] (n = 2; 3%).
Patient‐level barriers were reported in 14 of the 70 (20%) studies. Patients' lack of knowledge and understanding of the interventions and their importance was identified as a barrier [46, 56]. Financial constraints posed another barrier [57, 58]. Language barriers and limited health literacy also impeded implementation efforts, while skepticism about the use of health information further complicated the process [38, 56, 59]. Additionally, accessibility and technological barriers were identified, with patients experiencing difficulties in accessing technology platforms, completing pre‐measures, and navigating healthcare systems [38, 57].
3.5. Alignment With the QI‐MQCS Framework
Thirty‐seven studies (53%) were of high quality, with a threshold score of at least 14 out of the 16 QI domains (either fully or partially met). The median quality score of the 70 papers was 14 (IQR: 12–15). The majority of the studies provided characteristics of the intervention (n = 68; 97%), an implementation plan (n = 66; 94%), the rationale of the intervention (n = 65; 93%), data sources (n = 63; 90%), information on the limitations of the intervention (n = 57; 82%), organizational motivation (n = 57; 82%), and an explanation of the comparator (n = 56; 80%). The domains least sufficiently reported were spread of the intervention (n = 27; 38%) and sustainability (n = 31; 44%). Domains that were mostly only partially reported, with little description, were adherence/fidelity of the intervention (n = 34; 49%) and penetration of the intervention into the work routine (n = 20; 28%). The alignment of the papers with the 16 domains of the QI‐MQCS framework is shown in Figure 4.
FIGURE 4.

Alignment of the included studies with the 16 domains of the QI‐MQCS framework (%).
3.6. Alignment With the IOM Dimensions
The IOM dimensions addressed by the included studies are shown in Figure 5. Most of the included studies aligned well with these dimensions in terms of patient centeredness (n = 65; 93%) and effectiveness (n = 64; 91%). Sixty‐three percent (n = 44) of the studies related to the efficiency of the intervention and 60% (n = 42) reported on safety. The dimensions that were least addressed were equitability (n = 9; 13%) and timeliness (n = 19; 27%).
FIGURE 5.

Alignment of the included studies with the IOM dimensions (%).
3.7. Strengths of Included Studies
Strengths of the studies were reported in 25 out of 70 (36%) studies (see Table 4 in the supplementary file). A notable strength across the studies involved the use of evidence‐based approaches, validated knowledge translation theories, and the systematic application of frameworks in planning and evaluating implementation efforts (n = 6; 9%). Additionally, some studies (n = 6; 9%) employed mixed methods approaches and conducted evaluations prior to publishing findings, allowing for a comprehensive understanding of intervention feasibility and effectiveness. The demonstration of feasibility in specific settings was also a key strength, showcasing the potential for tailored implementation (n = 3; 4%) [38, 60, 61]. Furthermore, the involvement of multidisciplinary teams and stakeholders throughout the implementation process was a significant asset, facilitating collaborative efforts and ensuring relevance to diverse perspectives (n = 3; 4%) [61, 62, 63].
3.8. Limitations of Included Studies
Most studies (n = 62; 89%) reported on limitations, highlighting areas for improvement in future research (see Table 4 in the supplementary file). The lack of generalizability was reported in three studies [33, 42, 64] (n = 3; 4%), specifically due to the prevalence of single‐center or site‐specific studies (n = 22; 31%) and small sample sizes (n = 15; 21%), thereby restricting the broader applicability of findings. Additionally, the absence of control groups in many studies (n = 12; 17%) raised concerns about the potential influence of confounding factors on the observed outcomes. Similarly, 20 studies (29%) mentioned issues with the data collection process or the use of multiple interventions at the same time. Seasonal variations during study phases and short‐term follow‐up (n = 4; 6%), as well as resource and time constraints (n = 7; 10%) might have hindered the assessment of sustained effects over time. Finally, concerns about the influence of sampling bias (n = 10; 14%), including low response bias, as well as measurement bias (n = 16; 23%), including instrument, recall, response, and evaluation bias, and the Hawthorne effect, were acknowledged as limitations that could have influenced the validity and reliability of the results.
3.9. Recommendations for Future Research
Recommendations for future research were prevalent among the included studies, reported in 47 (67%) studies (see Table 4 in the supplementary file). Twelve studies (17%) emphasized the need for large‐scale, multicenter studies incorporating control groups to rigorously assess intervention effectiveness, advocating for designs such as randomized controlled trials. Conducting larger, long‐term evaluations to examine sustained efficacy over time emerged as another common recommendation, appearing in six studies (9%). Several studies (n = 6, 9%) recommended investigating additional barriers, facilitators, and solutions to implementation across diverse settings. Six studies (9%) underscored the importance of incorporating patient and healthcare provider perceptions, engagement, and education into interventions to enhance acceptability, adherence, and tailoring to stakeholder needs. Practitioner training on quality improvement interventions was highlighted as a critical area for future research (n = 2, 3%) [19, 65]. Cost‐effectiveness analysis was another recommendation, to evaluate the economic viability of interventions (n = 2, 3%) [31, 57]. Finally, the development, refinement, and validation of standardized tools and protocols for evaluation and measurement was proposed to establish robust and consistent methodologies (n = 7, 10%).
4. Discussion
This scoping review was undertaken to investigate the reporting quality and characteristics of QI projects undertaken in hospital settings. The findings highlight the variety of frameworks used to report these QI projects and, at times, the lack of frameworks to guide implementation and the reporting of their success in practice. The lack of a structured approach for guiding these QI projects may result in poor reporting and limit their uptake and transferability to other settings. Addressing this issue is crucial to enhancing the sustainability, scalability, and overall impact of QI efforts on patient outcomes and organizational efficiency.
A key feature of the 70 included studies was the variety of topics, reporting features, and project characteristics. It is notable that, despite the many frameworks, theories, and models developed to support QI projects, almost a third of the included studies did not report using one. These frameworks provide a scaffold for planning, implementation, and evaluation, guiding practitioners through each stage of the QI process. Moreover, they facilitate consistency and comparability across studies, fostering a shared understanding of best practices and promoting advancements in healthcare quality on a broader scale [66]. Thus, the adoption of QI frameworks serves as a cornerstone in promoting evidence‐based, effective, and sustainable improvements in healthcare delivery [67].
The alignment of included studies with Quality Improvement Minimum Quality Criteria Set (QI‐MQCS) and Institute of Medicine (IOM) dimensions offers a vital framework for evaluating QI research comprehensiveness and quality. While most studies aligned with key domains such as intervention characteristics and rationale, gaps persist in reporting aspects such as intervention integration into work routines (spread) and equitable healthcare outcomes. Ensuring alignment with established quality criteria sets is crucial for advancing the effectiveness and sustainability of QI efforts aimed at enhancing healthcare quality and patient outcomes [25]. A shorter, pragmatic tool may be more appropriate for health services to follow when developing or implementing QI projects and may contribute toward a more consistent approach to these types of projects [26, 68].
Context is a key consideration of QI projects, and this review has highlighted the diversity of contexts where these types of projects are being undertaken. In their review of systematic reviews, Kringos et al. [69] highlighted how context can influence overall effectiveness of QI interventions. Our current review identified variation in reporting contextual characteristics, which can be problematic for replication of projects in other contexts. Clear and thorough reporting of contextual factors is recommended for all QI projects and further work could be undertaken in this area to support reporting consistency [70, 71].
Most studies primarily evaluated organizational‐focused interventions, indicative of the acknowledged significance of organizational dynamics in shaping healthcare quality. This breadth of examination underscores the holistic approach adopted in QI assessments, recognizing the intricate interplay between different facets of healthcare quality. By encompassing a wide range of outcomes, QI evaluations aim to provide a comprehensive understanding of the effectiveness and impact of interventions, thus informing strategies for sustainable quality improvement in healthcare settings.
Several barriers and facilitators were presented in the included studies. To address organizational barriers in QI implementation, securing adequate resources is crucial. This includes allocating budgets for necessary equipment, updating technology, and ensuring sufficient staffing. Enhancing workflow efficiency through process mapping, task redistribution, and flexible scheduling can help mitigate increased workload and time constraints. Implementing pilot programs or phased rollouts can ease the integration of new processes into existing workflows. Training staff on standardized protocols and improving data collection systems ensures smoother implementation. Advocacy for funding and investment in staff retention programs further addresses financial constraints and high turnover, fostering a more stable and well‐supported organizational environment.
On a professional and patient level, targeted strategies aim to improve engagement and reduce resistance to change. Continuous professional development and mentoring programs equip staff with the knowledge and confidence to adopt interventions. Open communication channels and participatory decision‐making processes help address resistance and foster teamwork. For patients, addressing knowledge gaps through tailored educational materials, interpreters, and visual aids ensures inclusiveness and understanding. Building trust through transparent communication, offering financial support options, and simplifying technology platforms also remove barriers related to skepticism, financial constraints, and accessibility. These strategies collectively enhance the likelihood of successful QI implementation by aligning organizational, professional, and patient‐level efforts.
Future work should focus on developing a universal checklist‐style tool for QI projects that accounts for the holistic nature of QI interventions in healthcare settings and addresses minimum criteria such as a clear description of the intervention, its implementation plan, the framework guiding implementation, the rationale for the intervention, data sources, and sustainability. Such a tool has the potential to improve the reporting of QI projects and allow beneficial interventions to be replicated in other settings.
This study rigorously synthesized hospital‐based quality improvement (QI) interventions using JBI methodology and PRISMA‐ScR guidelines, ensuring a comprehensive and transparent approach. The broad search strategy, covering multiple databases and grey literature, captured a wide range of relevant studies while addressing nomenclature challenges, offering a comprehensive view of QI reporting quality. The QI‐MQCS framework allowed for a structured evaluation of interventions, while the categorization of facilitators and barriers provides actionable guidance for healthcare practitioners.
However, the study has limitations. The variability in reporting frameworks and methods across studies limits comparability and generalizability. Reliance on published literature may have excluded valuable unpublished QI projects, and the exclusive use of the QI‐MQCS framework may have excluded other valuable appraisal approaches. The lack of detailed reporting in many studies affects replicability. In addition, the study did not address geographical or cultural variations in QI implementation, which could limit its broader applicability. Furthermore, we included only studies published in English, which may have omitted important studies published in other languages. Finally, the included studies were not assessed for quality, as per the JBI guidance methodology.
The review results underscore the diverse landscape of QI interventions in healthcare, marked by varying intervention durations, adoption of QI frameworks, and assessment of outcomes across diverse healthcare contexts. Understanding implementation facilitators and barriers, alongside alignment with established quality criteria sets, is pivotal for enhancing the effectiveness and sustainability of QI endeavors aimed at improving healthcare quality and patient outcomes.
Conflicts of Interest
The authors declare no conflicts of interest.
Supporting information
Supporting Information
References
- 1. Hempel S., Shekelle P. G., Liu J. L., et al., “Development of the Quality Improvement Minimum Quality Criteria Set (QI‐MQCS): A Tool for Critical Appraisal of Quality Improvement Intervention Publications,” BMJ Quality & Safety 24, no. 12 (2015): 796. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2. Backhouse A. and Ogunlayi F., “Quality Improvement Into Practice,” BMJ 368 (2020): m865. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3. Knudsen S. V., Laursen H. V. B., Johnsen S. P., Bartels P. D., Ehlers L. H., and Mainz J., “Can Quality Improvement Improve the Quality of Care? A Systematic Review of Reported Effects and Methodological Rigor in Plan‐Do‐Study‐Act Projects,” BMC Health Services Research 19 (2019): 1–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4. Lynn J., Baily M. A., Bottrell M., et al., “The Ethics of Using Quality Improvement Methods in Health Care,” Annals of Internal Medicine 146, no. 9 (2007): 666–673. [DOI] [PubMed] [Google Scholar]
- 5. Engle R. L., Mohr D. C., Holmes S. K., et al., “Evidence‐based Practice and Patient‐Centered Care: Doing Both Well,” Health Care Management Review 46, no. 3 (2021): 174–184. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6. Hines K., Mouchtouris N., Knightly J. J., and Harrop J., “A Brief History of Quality Improvement in Health Care and Spinal Surgery,” Global Spine Journal 10, no. 1 (2020): 5S–9S. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7. Rubenstein L., Khodyakov D., Hempel S., et al., “How Can We Recognize Continuous Quality Improvement?,” International Journal for Quality in Health Care 26, no. 1 (2014): 6–15. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8. Baccei S. J., Henderson S. R., Lo H. S., and Reynolds K., “Using Quality Improvement Methodology to Reduce Costs While Improving Efficiency and Provider Satisfaction in a Busy, Academic Musculoskeletal Radiology Division,” Journal of Medical Systems 44 (2020): 1–7. [DOI] [PubMed] [Google Scholar]
- 9. Henry E. S., Robertshaw S., and Stephenson J., “Improving Accessibility to Outpatient Clinics for Adults With Suspected Seizures From the Emergency Department: A Quality Improvement Project,” Seizure: The Journal of the British Epilepsy Association 93 (2021): 160–168. [DOI] [PubMed] [Google Scholar]
- 10. Čerlinskaitė‐Bajorė K., Lam C. S. P., Sliwa K., et al., “Sex‐specific Analysis of the Rapid Up‐Titration of Guideline‐Directed Medical Therapies After a Hospitalization for Acute Heart Failure: Insights From the STRONG‐HF Trial,” European Journal of Heart Failure 25, no. 7 (2023): 1156–1165. [DOI] [PubMed] [Google Scholar]
- 11. Flythe J. E., Tugman M. J., Narendra J. H., et al., “Feasibility of Tablet‐Based Patient‐Reported Symptom Data Collection Among Hemodialysis Patients,” Kidney International Reports 5, no. 7 (2020): 1026–1039. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12. Lane‐Fall M. B., Pascual J. L., Peifer H. G., et al., “A Partially Structured Postoperative Handoff Protocol Improves Communication in 2 Mixed Surgical Intensive Care Units: Findings From the Handoffs and Transitions in Critical Care (HATRICC) Prospective Cohort Study,” Annals of Surgery 271, no. 3 (2020): 484–493. [DOI] [PubMed] [Google Scholar]
- 13. Institute of Medicine Committee on Quality of Health Care in America, Crossing the Quality Chasm: A New Health System for the 21st Century (Washington, DC: National Academies Press, 2001). [Google Scholar]
- 14. Berwick D. M., Nolan T. W., and Whittington J., “The Triple Aim: Care, Health, and Cost,” Health Affairs 27, no. 3 (2008): 759–769. [DOI] [PubMed] [Google Scholar]
- 15. Bodenheimer T. and Sinsky C., “From Triple to Quadruple Aim: Care of the Patient Requires Care of the Provider,” The Annals of Family Medicine 12, no. 6 (2014): 573–576. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16. Kellogg K. M., Hettinger Z., Shah M., et al., “Our Current Approach to Root Cause Analysis: Is It Contributing to Our Failure to Improve Patient Safety?,” BMJ Quality & Safety 26, no. 5 (2017): 381–387. [DOI] [PubMed] [Google Scholar]
- 17. Britt H., Miller G. C., Henderson J., et al., General Practice Activity in Australia 2015–16 (Sydney University Press, 2016).
- 18. Vaz N. and Araujo C., “Failure Factors in Healthcare Quality Improvement Programmes: Reviewing Two Decades of the Scientific Field,” International Journal of Quality and Service Sciences 14, no. 2 (2022): 291–310. [Google Scholar]
- 19. Jones B., Vaux E., and Olsson‐Brown A., “How to Get Started in Quality Improvement,” BMJ 364 (2019). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20. Tricco A. C., Lillie E., Zarin W., et al., “PRISMA Extension for Scoping Reviews (PRISMA‐ScR): Checklist and Explanation,” Annals of Internal Medicine 169, no. 7 (2018): 467–473. [DOI] [PubMed] [Google Scholar]
- 21. Peters M. D. J., Marnie C., Tricco A. C., et al., “Updated Methodological Guidance for the Conduct of Scoping Reviews,” JBI Evidence Synthesis 18, no. 10 (2021). [DOI] [PubMed] [Google Scholar]
- 22. McKenzie J. E., Brennan S. E., Ryan R. E., Thomson H. J., Johnston R. V., and Thomas J., “Defining the Criteria for Including Studies and How They Will be Grouped for the Synthesis,” Cochrane Handbook for Systematic Reviews of Interventions (2019): 33–65. [Google Scholar]
- 23. The EndNote Team, EndNote, EndNote 20 ed. (Philadelphia, PA: Clarivate, 2013). [Google Scholar]
- 24. Covidence Systematic Review Software (Melbourne, Australia: Veritas Health Innovation, 2023).
- 25. Wolfe A., “Institute of Medicine Report: Crossing the Quality Chasm: A New Health Care System for the 21st Century,” Policy, Politics, & Nursing Practice 2, no. 3 (2001): 233–235. [Google Scholar]
- 26. Inata Y., Nakagami‐Yamaguchi E., Ogawa Y., Hatachi T., and Takeuchi M., “Quality Assessment of the Literature on Quality Improvement in PICUs: A Systematic Review,” Pediatric Critical Care Medicine 22, no. 6 (2021): 553–560. [DOI] [PubMed] [Google Scholar]
- 27. Wollny K., Cui S., McNeil D., et al., “Quality Improvement Interventions to Prevent Unplanned Extubations in Pediatric Critical Care: A Systematic Review,” Systematic Reviews 11, no. 1 (2022): 259. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28. Clack L., Zingg W., Saint S., et al., “Implementing Infection Prevention Practices Across European Hospitals: An In‐Depth Qualitative Assessment,” BMJ Quality & Safety 27, no. 10 (2018): 771–780. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29. Rachamin Y., Grischott T., and Neuner‐Jehle S., “Implementation of a Complex Intervention to Improve Hospital Discharge: Process Evaluation of a Cluster Randomised Controlled Trial,” BMJ Open 11, no. 5 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30. Howard R., Delaney L., Kilbourne A. M., et al., “Development and Implementation of Preoperative Optimization for High‐Risk Patients With Abdominal Wall Hernia,” JAMA Network Open 4, no. 5 (2021). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31. Howell D., Rosberger Z., Mayer C., et al., “Personalized Symptom Management: A Quality Improvement Collaborative for Implementation of Patient Reported Outcomes (PROs) in ‘Real‐World’ Oncology Multisite Practices,” Journal of Patient‐Reported Outcomes 4, no. 1 (2020): 47. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32. Mackay H. J., Campbell K. L., van der Meij B. S., and Wilkinson S. A., “Establishing an Evidenced‐Based Dietetic Model of Care in Haemodialysis Using Implementation Science,” Nutrition & Dietetics 76, no. 2 (2019): 150–157. [DOI] [PubMed] [Google Scholar]
- 33. Moore J. E., Marquez C., Dufresne K., et al., “Supporting the Implementation of Stroke Quality‐Based Procedures (QBPs): A Mixed Methods Evaluation to Identify Knowledge Translation Activities, Knowledge Translation Interventions, and Determinants of Implementation Across Ontario.” BMC Health Services Research 18, no. 1 (2018): 466. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34. Rowe A. D., McCarty K., and Huett A., “Implementation of a Nurse Driven Pathway to Reduce Incidence of Hospital Acquired Pressure Injuries in the Pediatric Intensive Care Setting,” Journal of Pediatric Nursing 41 (2018): 104–109. [DOI] [PubMed] [Google Scholar]
- 35. Fabbruzzo‐Cota C., Frecea M., Kozell K., et al., “A Clinical Nurse Specialist–Led Interprofessional Quality Improvement Project to Reduce Hospital‐Acquired Pressure Ulcers,” Clinical Nurse Specialist: The Journal for Advanced Nursing Practice 30, no. 2 (2016): 110–116. [DOI] [PubMed] [Google Scholar]
- 36. Reinhorn M., Dews T., Warren J. A., et al., “Utilization of a National Registry to Influence Opioid Prescribing Behavior After Hernia Repair,” Hernia 26, no. 3 (2022): 847–853. [DOI] [PubMed] [Google Scholar]
- 37. Slater P. J., Osborne C. J., and Herbert A. R., “Ongoing Value and Practice Improvement Outcomes From Pediatric Palliative Care Education: The Quality of Care Collaborative Australia,” Advances in Medical Education and Practice 12 (2021): 1189–1198. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38. Bailey K., Lo L. A., Chauhan B., Formuli F., Peck J. R., and Burra T. A., “Using a Quality Improvement Approach to Implement Measurement‐Based Care (MBC) in Outpatient General Psychiatry,” The Joint Commission Journal on Quality and Patient Safety 49, no. 10 (2023): 563–571. [DOI] [PubMed] [Google Scholar]
- 39. Mueller S., Murray M., Schnipper J., and Goralnick E., “An Initiative to Improve Advanced Notification of Inter‐Hospital Transfers,” Healthcare (Amsterdam, Netherlands) 8, no. 2 (2020): 100423. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40. Byrnes A., Young A., Banks M., Mudge A., Clark D., and Bauer J., “Prospective Application of an Implementation Framework to Improve Postoperative Nutrition Care Processes: Evaluation of a Mixed Methods Implementation Study,” Nutrition & Dietetics 75, no. 4 (2018): 353–362. [DOI] [PubMed] [Google Scholar]
- 41. Giesler D. L., Krein S., Brancaccio A., et al., “Reducing Overuse of Antibiotics at Discharge Home: A Single‐Center Mixed Methods Pilot Study,” American Journal of Infection Control 50, no. 7 (2022): 777–786. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42. Mulchan S. S., Hinderer K. A., Walsh J., McCool A., and Becker J., “Feasibility and Use of a Transition Process Planning and Communication Tool Among Multiple Subspecialties Within a Pediatric Health System,” Journal for Specialists in Pediatric Nursing 27, no. 1 (2022): e12355. [DOI] [PubMed] [Google Scholar]
- 43. Raynor T., Marcet‐Gonzalez J., Roy K., et al., “Development and Implementation of a Pre‐Tracheostomy Multidisciplinary Conference: An Initiative to Improve Patient Selection,” International Journal of Pediatric Otorhinolaryngology 158 (2022). [DOI] [PubMed] [Google Scholar]
- 44. Xu Y.‐p., Z. P.‐y., B. Y.‐t., and Li S., “The Effect of Care Transition Pathway Implementation on Patients Undergoing Joint Replacement During the COVID‐19 Pandemic: A Quasi‐Experimental Study From a Tertiary Care Hospital Orthopedic Department in Beijing, China,” Journal of Orthopaedic Surgery & Research 16, no. 1 (2021): 1–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45. Filbrun A. G., Enochs C., Caverly L., et al., “Quality Improvement Initiative to Improve Pulmonary Function in Pediatric Cystic Fibrosis Patients,” Pediatric Pulmonology 55, no. 11 (2020): 3039–3045. [DOI] [PubMed] [Google Scholar]
- 46. Zhang Q., Yu Z. C., Qi B. C., Ni X. C., and Moola S., “Nutritional Screening and Nutritional Interventions in Patients Following Gastrointestinal Surgery in a General Surgical Ward: A Best Practice Implementation Project,” JBI Evidence Implementation 19, no. 4 (2021): 347–356. [DOI] [PubMed] [Google Scholar]
- 47. McArdle J., Sorensen A., Fowler C. I., Sommerness S., Burson K., and Kahwati L., “Strategies to Improve Management of Shoulder Dystocia Under the AHRQ Safety Program for Perinatal Care,” Journal of Obstetric, Gynecologic, and Neonatal Nursing 47, no. 2 (2018): 191–201. [DOI] [PubMed] [Google Scholar]
- 48. Presley C. A., Wooldridge K. T., Byerly S. H., et al., “The Rural VA Multi‐Center Medication Reconciliation Quality Improvement Study (R‐VA‐MARQUIS),” American Journal of Health‐System Pharmacy 77, no. 2 (2020): 128–137. [DOI] [PubMed] [Google Scholar]
- 49. Rutledge R. I., Romaire M. A., Hersey C. L., Parish W. J., Kissam S. M., and Lloyd J. T., “Medicaid Accountable Care Organizations in Four States: Implementation and Early Impacts,” Milbank Quarterly 97, no. 2 (2019): 583–619. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50. Eastin C., Moore B., Moulton A., et al., “Vaccine Acceptance During a Novel Student‐Led Emergency Department COVID‐19 Vaccination Program,” Western Journal of Emergency Medicine 24, no. 3 (2023): 436. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51. Becker S., Hagle M., Amrhein A., et al., “Implementing and Sustaining Bedside Shift Report for Quality Patient‐Centered Care,” Journal of Nursing Care Quality 36, no. 2 (2021): 125–131. [DOI] [PubMed] [Google Scholar]
- 52. Gill F. J., Leslie G. D., and Marshall A. P., “Parent Escalation of Care for the Deteriorating Child in Hospital: A Health‐Care Improvement Study,” Health Expectations 22, no. 5 (2019): 1078–1088. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53. Farley J. N., “A Pilot Quality Improvement Project to Introduce Utilization of an Electronic Bone Health Order Set in a Population of Hospitalized Pediatric Patients Identified at Risk for Fractures,” (Doctoral Dissertation, Georgetown University, 2016).
- 54. Jones W. D., Rodts M. F., and Merz J., “Influencing Discharge Efficiency: Addressing Interdisciplinary Communication, Transportation, and COVID‐19 as Barriers,” Professional Case Management 27, no. 4 (2022): 169–180. [DOI] [PubMed] [Google Scholar]
- 55. Ramsey A. T., Chiu A., Baker T., et al., “Care‐paradigm Shift Promoting Smoking Cessation Treatment Among Cancer Center Patients via a Low‐Burden Strategy, Electronic Health Record‐Enabled Evidence‐Based Smoking Cessation Treatment,” Translational Behavioral Medicine 10, no. 6 (2020): 1504–1514. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56. Ahmed S., Zidarov D., Eilayyan O., and Visca R., “Prospective Application of Implementation Science Theories and Frameworks to Inform Use of PROMs in Routine Clinical Care Within an Integrated Pain Network,” Quality of Life Research 30, no. 11 (2021): 3035–3047. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57. Chavarri‐Guerra Y., Soto‐Perez‐de‐Celis E., Ramos‐López W., et al., “Patient Navigation to Enhance Access to Care for Underserved Patients With a Suspicion or Diagnosis of Cancer,” The Oncologist 24, no. 9 (2019): 1195–1200. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58. Pooya S., Johnston K., Estakhri P., and Fathi A., “Successful Implementation of Enhanced Recovery After Surgery Program in a Safety‐Net Hospital: Barriers and Facilitators,” Journal of Perianesthesia Nursing 36, no. 5 (2021): 468–472. [DOI] [PubMed] [Google Scholar]
- 59. Turcinovic M., Singson R., Harrigan M., et al., “Physical Therapy for Hospitalized Patients with COVID‐19 in Isolation: Feasibility and Pilot Implementation of Telehealth for Delivering Individualized Therapy,” Archives of Rehabilitation Research and Clinical Translation 3, no. 2 (2021): 100113. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60. Bindraban R. S., van Beneden M., Kramer M. H. H., et al., “Association of a Multifaceted Intervention With Ordering of Unnecessary Laboratory Tests Among Caregivers in Internal Medicine Departments,” JAMA Network Open 2, no. 7 (2019): e197577. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61. Bates S. E., Isaac T. C. W., Marion R. L., Norman V., Gumley J. S., and Sullivan C. D., “Delayed Cord Clamping With Stabilisation at all Preterm Births—Feasibility and Efficacy of a Low Cost Technique,” European Journal of Obstetrics and Gynecology and Reproductive Biology 236 (2019): 109–115. [DOI] [PubMed] [Google Scholar]
- 62. Moniz M. H., Dalton V. K., Smith R. D., et al., “Feasibility and Acceptability of a Toolkit‐Based Process to Implement Patient‐Centered, Immediate Postpartum Long‐Acting Reversible Contraception Services,” American Journal of Obstetrics and Gynecology 226, no. 3 (2022): 394.e1‐.e16. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63. Spruce K. and Butler C., “Enhancing Outcomes for Outpatient Percutaneous Coronary Interventions,” Clinical Nurse Specialist: The Journal for Advanced Nursing Practice 31, no. 6 (2017): 319–328. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64. Ray‐Barruel G., Horowitz J., McLaughlin E., Flanders S., and Chopra V., “Barriers and Facilitators for Implementing Peripherally Inserted central Catheter (PICC) Appropriateness Guidelines: A Longitudinal Survey Study From 34 Michigan Hospitals,” PLoS ONE 17, no. 11 (2022): e0277302. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 65. Starr N., Gebeyehu N., Tesfaye A., et al., “Value and Feasibility of Telephone Follow‐Up in Ethiopian Surgical Patients,” Surgical Infections 21, no. 6 (2020): 533–539. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66. Picarillo A. P., “Introduction to Quality Improvement Tools for the Clinician,” Journal of Perinatology 38, no. 7 (2018): 929–935. [DOI] [PubMed] [Google Scholar]
- 67. Glickman S. W., Baggett K. A., Krubert C. G., Peterson E. D., and Schulman K. A., “Promoting Quality: The Health‐Care Organization From a Management Perspective,” International Journal for Quality in Health Care 19, no. 6 (2007): 341–348. [DOI] [PubMed] [Google Scholar]
- 68. Tzelepis F., Sanson‐Fisher R. W., Zucca A. C., and Fradgley E. A., “Measuring the Quality of Patient‐Centered Care: Why Patient‐Reported Measures Are Critical to Reliable Assessment,” Patient Preference and Adherence (2015): 831–835. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69. Kringos D. S., Sunol R., Wagner C., et al., “The Influence of Context on the Effectiveness of Hospital Quality Improvement Strategies: A Review of Systematic Reviews,” BMC Health Services Research 15 (2015): 1–13. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70. Mielke J., Leppla L., Valenta S., et al., “Unraveling Implementation Context: The Basel Approach for coNtextual ANAlysis (BANANA) in Implementation Science and Its Application in the SMILe Project,” Implementation Science Communications 3, no. 1 (2022): 102. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71. Coles E., Anderson J., Maxwell M., et al., “The Influence of Contextual Factors on Healthcare Quality Improvement Initiatives: A Realist Review,” Systematic Reviews 9 (2020): 1–22. [DOI] [PMC free article] [PubMed] [Google Scholar]
