2024 Oct 21;81(11):8050–8061. doi: 10.1111/jan.16571

Using Implementation Science to Implement Evidence‐Based Practice: A Discursive Paper

Audrey Chays‐Amania 1, Jocelyn Schwingrouber 1, Sébastien Colson 2
PMCID: PMC12535329  PMID: 39431403

ABSTRACT

Aim

The purpose of this manuscript is to offer an overview of knowledge regarding Evidence‐Based Practice (EBP) and implementation science. It addresses the question: What are the EBP implementation models used in nursing settings?

Design

Discursive paper.

Methods

The databases were searched with the following keywords: ‘Nursing Faculty’, ‘Nurse educator’, ‘Academic’, ‘Clinic’, ‘Evidence‐based implementation’, ‘Evidence‐based practice’, ‘Implementation’, ‘Implementation science’, ‘Undergraduate’, ‘Nurse’. The search strategy aimed to identify published studies. Ten databases were searched.

Results

There are specific implementation models for implementing EBP: the IOWA Model, the Stetler Model, the Johns Hopkins Nursing Evidence‐Based Practice Model, the Stevens Star Model, the Promoting Action on Research Implementation in Health Services (PARIHS) framework and the Advancing Research and Clinical practice through close Collaboration (ARCC) model. They were analysed according to the Nilsen classification. An evidence‐based implementation project must be structured: first, choose an implementation model; then identify one or more implementation strategies; and finally, plan the evaluation of implementation outcomes. The use of implementation science ensures successful implementation or, at the least, highlights barriers that need adjustment. Effective use of implementation science also facilitates the transfer of results to similar contexts.

Conclusion

Implementation science complements the EBP process perfectly and ensures the proper implementation of evidence.

Implication for the Profession

EBP mentors now have the entire structure of implementation science to succeed in implementing evidence‐based data in both academic and clinical settings.

Impact

The discursive paper addresses the difficulties of implementing evidence in academic or clinical settings. Implementation science is the bridge between evidence and practice. Nurses now have everything they need to implement evidence‐based practice successfully.

No Patient or Public Contribution

There was no patient or public involvement in the design or writing of this discursive article.

Keywords: EBP mentors, evidence‐based practice, implementation models, implementation outcomes, implementation science, implementation strategies, nurses

1. Introduction

In the early 2000s, the Institute of Medicine highlighted the devastating consequences of medical errors and called for patient‐centred clinical decision‐making based on evidence, together with much more developed collaboration between healthcare professionals and institutions to deliver quality care (Institute of Medicine 2001). The International Council of Nurses (ICN) took a stand for evidence‐based nursing practice as early as 2013 (ICN 2013).

The World Health Organisation (WHO) took up the issue by establishing global recommendations that clinical decisions made by healthcare professionals be based on evidence (World Health Organization 2015). The WHO went further in its recommendations by specifying that nursing science education should be based on the Evidence‐Based Practice (EBP) process (World Health Organization et al. 2017). The WHO argues for its position by highlighting the benefits at the population level of nursing practice within the healthcare system, research, and education (Table 1).

TABLE 1.

The benefits of EBP (World Health Organization et al. 2017, 9).

Beneficiary Benefits
General population • Improved conditions for patient‐centred care
• Patient preferences included in decision‐making
• Consistent health services leading to better equity
• Reduction in geographic variation
• Reduction in patients' length of stay
• Better patient outcomes
Nurses and midwives • Increased job satisfaction
• Empowerment
• Improved skills to integrate patient preferences into practice
• Support for professional growth
• Continuous career development through expert roles
Health‐care systems • Improvement in the quality of care
• Better outcomes for patients
• Increased patient safety
• Reduced costs
• Stronger basis for health‐care investment decisions
Research and education • Increased need for production and synthesis of robust evidence
• Competence development
• Integration of nursing and midwifery expert roles in health

Source: Nursing Research Foundation.

2. Background

2.1. Origin and History of EBP

EBP is a long‐standing practice. Traces have been found in ancient Egypt, Greece and the Arabian Peninsula (Sallam 2010). The principles of observation, analysis and documentation were encouraged, thus pushing aside the sacred in favour of a scientific approach.

Nursing practice was first traced, conceptualised, measured, evaluated, and finally disseminated by Florence Nightingale in Scutari, during the Crimean War (1853–1856). Until then, nursing care had been practiced without being conceptualised, based on empiricism and tradition, and was neither transmitted nor taught. Florence Nightingale meticulously recorded the care provided (creating care diagrams), systematically collected patient data, and performed the statistics necessary to objectify results. In doing so, she created the basis of EBP for the first time. By reducing hospital mortality from 42.7% to 2%, she objectively and concretely highlighted the critical importance of nursing care. Her spatial representation of the data in the form of coxcomb diagrams made the outcomes of care interventions very explicit (McDonald 2014).

In the 1970s, Archie Cochrane expressed his frustration with the complexity of gathering concrete available evidence to make informed medical decisions. He criticised the approach largely based on experience and revisited the fundamental principles stated by Aristotle. Although randomised clinical trials had been conducted by various medical teams worldwide, the major challenge lay in identifying and synthesising these trials to generate useful recommendations. He worked on establishing a precise, rigorous, and reproducible methodology: the systematic review of literature. Randomised controlled trials, the true ‘gold standard’ of research methods, are therefore collected, evaluated, and synthesised to produce evidence. The Cochrane Collaboration was founded in 1993.

Today, over 190 countries are members, including all European countries. The Collaboration produces quality systematic reviews that are freely accessible to healthcare professionals, families, care beneficiaries, and patients. However, producing evidence is not sufficient to deeply improve practice. David Sackett and Gordon Guyatt went further and conceptualised Evidence‐Based Medicine (Guyatt et al. 1992). Seeking evidence is only the first step (1). This evidence must then be implemented in clinical practice by drawing on (2) the clinician's expertise, (3) available resources and (4) patient preferences. Clinical decisions are made only through the prism of these four elements.

Alan Pearson, a nurse, was captivated by Cochrane's method. He too decided to seek evidence on which to base nursing care. However, although the method was clear, he pointed out the lack of evidence in his field of expertise (Lisy 2014). Indeed, randomised controlled trials cannot address all nursing clinical issues. The effectiveness of care is not the only criterion to consider within nursing expertise. To deepen practice, it is also necessary to understand the patient's lived experience and the meaning of this experience to them. Alan Pearson did not find these answers with this type of research method. He turned to qualitative and mixed methodologies and highlighted the crucial importance of these research results for nursing science. Although the levels of evidence from qualitative research are not at the top of the evidence pyramid, they are essential for nursing practice. He founded the Joanna Briggs Institute for Evidence‐Based Nursing (JBI) in 1996 to offer nurses the possibility of using other types of methodologies in systematic reviews and thus produce the evidence needed for informed and updated nursing practice. The JBI actively promotes the implementation of EBP and its integration into nursing practices (Lisy 2014). It is the only organisation that supports both the synthesis of evidence and its implementation for the nursing profession.

2.2. Definition and Process

The Sicily Statement clarified the scope of what evidence‐based practice means (Dawes et al. 2005). It is not limited to the application of epidemiological data and their consideration in clinical decision‐making. The process is much more comprehensive and complex. There are numerous definitions, but central points are common to all: clinical expertise, best available evidence, and patient preferences. The approach was initially developed in medicine, but the Sicily Statement extended the term to Evidence‐Based Practice (EBP) to include other healthcare professionals in a common approach to this method (Dawes et al. 2005).

David Sackett defines Evidence‐Based Medicine as ‘the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients’ (Sackett et al. 1996). The ICN adopted Melnyk's definition (2005), which states: ‘A problem solving approach to clinical decision making that incorporates a search for the best and latest evidence, clinical expertise and assessment, and patient preference values within a context of caring’ (International Council of Nurses 2012).

EBP is a dynamic process. Therefore, a methodology is necessary to update practices. The Sicily Statement was based on international work and led to the emergence of a five‐step approach to the EBP process (Dawes et al. 2005):

‘1. Translation of uncertainty to an answerable question

2. Systematic retrieval of best evidence available

3. Critical appraisal of evidence for validity, clinical relevance, and applicability

4. Application of results in practice

5. Evaluation of performance’

Melnyk and Fineout‐Overholt completed the five stages by adding a step 0 and a stage 6 (Melnyk and Fineout‐Overholt 2023):

‘Step 0: Cultivate a spirit of inquiry within an EBP culture and environment.

Step 1: Formulate the burning clinical PICOT question.

Step 2: Search for the best evidence.

Step 3: Critical appraisal of evidence.

Step 4: Integrate the evidence with clinical expertise and patient/family preferences to make the best clinical decision.

Step 5: Evaluate the outcomes of the practice change based on evidence.

Step 6: Disseminate the outcomes of the EBP change.’

Steps 0 to 3 sit within the first circle (research evidence); Steps 4 and 5 span the other three circles (Figure 1).

FIGURE 1.


EBP process for clinical decision making (Chays‐Amania 2023).

2.3. EBP vs. EBN or Evidence‐Based Practice in Nursing and Related Concepts

The Sicily Statement adopted the term Evidence‐Based Practice (Dawes et al. 2005). The term Evidence‐Based Nursing (EBN) refers specifically to nursing, although the ‘evidence‐based’ approach is identical across specialties. EBN is specific to our discipline as it explores fields relating exclusively to patient care. In EBN, the evidence used is different; for example, randomised controlled trials are not suitable for all nursing practices (DiCenso, Guyatt, and Ciliska 2005). Melnyk and Fineout‐Overholt (2023) use the term Evidence‐Based Practice in Nursing and Healthcare, referring to Sackett's definition of EBP (Melnyk and Fineout‐Overholt 2023).

The distinction between EBP and EBN also considers nursing clinical expertise. As it is specific to nursing, the term EBN is preferred by Estabrooks (Estabrooks 1998).

However, the terms EBP, EBP in nursing and EBN are often used interchangeably in studies, with EBP predominating regardless of the professional setting studied.

EBP should be differentiated from two other frequently confused concepts: research utilisation and quality improvement (Aquino‐Maneja et al. 2023). Research in nursing sciences aims to produce, explain, or predict phenomena to verify knowledge or create new knowledge (Christenbery 2017).

Quality improvement aims to identify organisational or structural problems to improve safety, effectiveness, and care quality for a specific population. It relies more on internal data (specific to the patient or their environment), which does not always involve research on external data (data from research). Specific tools are used, such as the Plan Do Check Act (PDCA) model, which allows planning an intervention, implementing it, checking the results, and acting accordingly (Christenbery 2017).

EBP is a problem‐solving approach, which is based on research utilisation and can sometimes be combined with quality improvement methods to enhance care quality. However, the EBP process requires critical appraisal of research studies and the synthesis of a significant amount of evidence contained in multiple studies. This allows practice recommendations to be made. Finally, combined with the nurse's clinical expertise and patient or family preferences, decision‐making takes place (Melnyk and Fineout‐Overholt 2023).

To sum up:

Research utilisation answers the question: What's the best thing to do?

EBP answers the question: Are we doing the best thing possible?

Quality improvement answers the question: Are we actually doing the best thing every time?

2.4. Barriers to EBP Implementation

Barriers exist that hinder the implementation of EBP. They are of different kinds (McNett et al. 2022).

First, nurses are poorly prepared for EBP due to a lack of knowledge, values, and skills to understand and use EBP. Even if they recognise the principle of EBP and understand its importance for care quality, they do not always feel able and qualified to practice it (Melnyk et al. 2018). The lack of coaching or support from executives and senior managers or directors is recognised as a barrier. Other more systemic factors are found, such as the lack of standardised processes. The lack of time and resources to practice EBP is also frequently cited (McNett et al. 2022).

Individual factors are identified, such as resistance to change or lack of collaboration, which do not allow for quality implementation (Tucker and Gallagher‐Ford 2019), as well as the absence of EBP mentors (specialists).

Healthcare managers and Advanced Practice Nurses (APNs) also sometimes lack EBP skills, preventing them from always supporting care teams in EBP implementation (McNett et al. 2022).

2.5. Implementation Science

EBP needs to be implemented. The concept of ‘implementation science’ allows the effectiveness of implementation strategies to be studied globally (McNett, Tucker, and Melnyk 2021) and bridges, as much as possible, the gap between the scientific production of evidence and its application in care practices (Institute of Medicine 2001; Melnyk 2018).

Implementation science is defined as: ‘The scientific study of methods to promote the systematic uptake of research findings and other evidence‐based practices into routine practice and, hence, to improve the quality and effectiveness of health services’ (Eccles and Mittman 2006). Implementation science, a true bridge between research findings and the adoption of evidence in care practices, is sometimes referred to as ‘knowledge transfer’ or ‘knowledge translation’ by other authors (Soicher, Becker‐Blease, and Bostwick 2020). It is a non‐linear, continuous, and interactive process. Implementation science concerns any domain involving innovation in practice.

Implementation science focuses on the systematic adoption of research into care practice (Eccles and Mittman 2006; Nilsen 2015). The observed differences between research outcomes that can improve patient care and current practices have raised questions about how implementation is carried out. Implementation science is a process of understanding, reflection, and evaluation of the different implementation stages. Evaluation can provide contextual information to understand if and how evidence is implemented in practice (Damschroder and Lowery 2013).

Implementation science is essential for steps 4 and 5 of the EBP process (van Achterberg, Schoonhoven, and Grol 2008). Indeed, it can take up to 17 years between the publication of evidence and actual change in practice (Balas and Boren 2000; Grimshaw et al. 2012); this has been reduced to 15 years in oncology (Melnyk 2021). EBP requires Evidence‐Based Implementation (McNett, Tucker, and Melnyk 2021; van Achterberg 2013).

Three steps are necessary for successful implementation: (1) choosing the appropriate implementation model, (2) defining one or more implementation strategies and (3) evaluating implementation outcomes.

3. Data Sources

An overview of EBP implementation models in academic or clinical nursing settings will address the following question: What EBP implementation models are used in nursing settings?

The databases were searched using the following keywords: ‘Nursing Faculty’, ‘Nurse educator’, ‘Academic’, ‘Clinic’, ‘Evidence‐based implementation’, ‘Evidence‐based practice’, ‘Implementation’, ‘Implementation science’, ‘Undergraduate’, ‘Nurse’.

The search strategy aimed to identify published studies. The keywords of the corresponding articles as well as the index terms (MeSH terms or descriptors) were identified to develop a comprehensive search strategy. Keywords were adapted to the thesaurus of each database or search engine. Finally, the bibliographies of selected articles were examined to identify additional studies.

Selected languages are English and French, as the research team is French and English‐speaking, with no publication date limit.

Searches were conducted on MEDLINE, the Cochrane Database of Systematic Reviews, the Allied and Complementary Medicine Database (AMED), the British Nursing Index (BNI), the Cumulative Index to Nursing and Allied Health Literature (CINAHL), the Excerpta Medica dataBASE (EMBASE), the Health Management Information Consortium (HMIC), EMCARE, PsycINFO and JBI Evidence Synthesis.

4. Overview of the Issue

Implementation models have been studied, at organisational or individual levels, to support EBP implementation (Schaffer et al. 2013). Many evaluation tools measure nurses' knowledge, skills, and attitudes (Leung et al. 2014). However, choosing an implementation model is complex. Criteria may be based on generalisability, reproducibility, or the context in which the model is used. Models can be compared with each other for a given care situation (S. Tucker et al. 2021). Several studies propose model choices based on the context: for primary health care (Huybrechts et al. 2021), teaching practices (Oermann et al. 2022) or Magnet hospitals (Speroni et al. 2020). Other studies have provided an overview of the implementation of recommendations in clinical settings (Pereira et al. 2022), but they are not specific to EBP implementation. EBP implementation in healthcare settings is well documented at the organisational level (Clavijo‐Chamorro et al. 2020), for health managers (Birken et al. 2018) and for APNs (Clarke et al. 2021). In their discussion paper, Schaffer and colleagues (2013) debated the relevance of certain specialised models for EBP implementation in nursing settings. An update of these studies is necessary to identify specialised EBP implementation models in nursing settings.

Nilsen (2015) highlighted five categories (process models, determinant frameworks, classic theories, implementation theories and evaluation frameworks), offering a clearer vision of the different types of implementation support (Nilsen 2015, 4).

5. Results

5.1. Implementation Models

This research identified specific EBP implementation models in nursing settings. They were analysed according to Nilsen's five categories (Nilsen 2015).

5.2. Process Model

5.2.1. The Stevens Star Model of Knowledge Transformation

This model was created to support clinicians in clinical decision‐making by drawing on different forms of knowledge. It is represented by a pictogram in which a star is surrounded by a circle. The five points of the star represent the five steps, from identifying evidence to implementation and evaluation.

The first step represents primary investigations; alone, they do not provide sufficient evidence to change practices. The second step therefore involves synthesising these data into a systematic review. In the third step, the results of these studies are adapted into recommendations to make them readable to the entire caregiving community. The fourth step involves integrating these recommendations into care practices, and the final step covers the evaluation of processes and outcomes.

The Stevens Star Model of Knowledge Transformation relies on transforming knowledge obtained through research into practice.

5.2.2. IOWA Model (Buckwalter et al. 2017)

This model is based on Rogers' theories (Diffusion of Innovations theory). It is recognised for its ease of use, especially among multidisciplinary teams. It was designed for front‐line practitioners and academic partners.

It is represented by an algorithm with feedback loops at key decision points, allowing a return to the previous step if necessary.

The first step is identifying a trigger. Three key decision points structure the process: Is this topic a priority? Is there sufficient evidence? Is the change appropriate for adoption in practice? The steps to follow are the seven steps of the EBP process. Feedback loops make it possible to account for the reality of the context: EBP implementation is not a linear process, and multiple back‐and‐forths are necessary to reach the set objective.

It is complemented by evaluation tools to develop a comprehensive implementation strategy. From its creation until 2021, over 8000 requests to use the model were processed; it has been cited over 800 times in 54 countries and translated into four languages (S. Tucker et al. 2021).

5.2.3. Stetler Model (Stetler 2001)

Initially developed for research utilisation, the model was later adapted to EBP. It proposes five phases based on critical thinking and can be used individually by healthcare professionals. The model considers the individual characteristics of its users.

A subsequent version extended its use to groups of practitioners. It is more oriented towards experienced practitioners, such as APNs, nurses with PhDs, and specialised nurses. It is represented as a flowchart complemented by a narrative table.

Phase 1: preparation, which identifies priorities, assesses the context and searches for evidence.

Phase 2: validation of synthesised evidence (systematic reviews, guidelines) related to the initial problem.

Phase 3: decision‐making based on the research results and identified needs, according to internal and external factors.

Phase 4: translation of evidence into the context and planning of the practice change implementation.

Phase 5: evaluation of planning and actions against the set objectives.

It can be used at both individual and collective levels. However, it is reserved for those with solid EBP skills (Melnyk and Fineout‐Overholt 2023).

5.2.4. Johns Hopkins Nursing Evidence‐Based Practice Model (JHNEBP) (Dang et al. 2022)

This model was initiated by Johns Hopkins Hospital, which observed the gap between research results and clinical nursing practice. Frontline nurses were involved in the project and emphasised the need for a practical guide for clinical decision‐making. Thus, the model was built first as a conceptual model and then as a process to support implementation. The core of the conceptual model is based on the Practice question, Evidence and Translation (PET) process. These three key elements are divided into 17 steps to guide users step by step. It is a dynamic process, represented by an arrow, in which updated practice feeds learning and vice versa. New questions may arise during the process and thus initiate the implementation cycle again. This model was integrated into the nursing students' program at Johns Hopkins University School of Nursing. This partnership allows nurses to pose clinical questions that undergraduate or graduate students can investigate, thus feeding the conceptual model's learning‐practice loop. Various tools for each step have been developed and should be used.

5.2.4.1. Determinant Frameworks
5.2.4.1.1. PARIHS (Promoting Action on Research Implementation in Health Services)

The PARIHS model was developed from the authors' experience and has been revised over time. Initially, it was based on three determinants:

  • Evidence

  • Context

  • Facilitation

These three elements were then divided into sub‐elements, each assessed on a continuum from low to high, with actions taken to improve them.

The PARIHS model was revisited as the Integrated‐Promoting Action on Research Implementation in Health Services (i‐PARIHS) Framework. The i‐PARIHS model considers the reality that implementing EBP or research results is a complex, unpredictable, and non‐linear process (Harvey and Kitson 2016). As a result, i‐PARIHS now integrates key elements from a broad organisational perspective, a contextual perspective, the characteristics of the participants, and those of the innovation itself. It is represented as a spiral to reflect this non‐linearity and emphasises the determinants and the actions that facilitate interventions.

It is a conceptual model that highlights the interactions between the different factors influencing the success of implementation. The phenomenon being complex, highlighting different determinants allows acting more specifically on them and improving implementation.

5.2.4.2. Evaluating Implementation

The ARCC Model (Advancing Research and Clinical practice through close Collaboration) (Melnyk and Fineout‐Overholt 2023) and the ARCC‐E model.

The ARCC model develops strategies for both individual and organisational change for sustainable EBP practice in a clinical context. The ARCC‐E model is the version of the ARCC model for the educational context. In both cases, it proposes EBP implementation at a systemic level.

It consists of several steps, all accompanied by validated assessment tools to identify barriers and facilitators and to propose a strategy to overcome or mitigate obstacles. It is based on the training of mentors to implement EBP sustainably.

Step 1: organisational assessment of readiness.

Step 2: identification of the system's potential strengths and barriers to EBP, based on the results of this assessment.

Step 3: use of EBP mentors.

Step 4: implementation of EBP.

Step 5: evaluation of the results achieved in practice.

The main feature of this model is the training of mentors, who act as genuine relays within the institution. The concept of mentorship allows the teacher‐learner relationship to be individualised.

Implementation models are valuable guides for supporting implementation. They answer the question ‘what to do?’. However, it is also necessary to use strategies to support EBP implementation, answering ‘how do we do it?’.

5.3. Implementation Strategies

Implementation strategies are defined as ‘methods or techniques used to enhance the adoption, implementation, and sustainability of an under‐utilised intervention’ (Pinnock et al. 2017). The fluctuation in the lexicon used to describe strategies for EBP implementation prompted researchers to identify and list these strategies. Indeed, the terms and definitions used often present inconsistencies, homonyms, and synonyms, leading to variations in meaning. Furthermore, published studies rarely describe implementation strategies, hindering the transfer of results both theoretically and practically. This is a barrier to developing systematic reviews and publishing recommendations (Powell et al. 2015).

Initial work attempted to list the strategies used; Cochrane then developed a checklist, Effective Practice and Organisation of Care (EPOC), created to guide systematic reviews regarding strategies and, more generally, to support quality management strategies. Other compilations of strategies were published, sometimes with very specific themes and imprecise definitions (Powell et al. 2012). Research has been conducted to establish a stable taxonomy of the identified strategies.

Checklists have been developed to enable authors to describe the strategies used precisely: Consolidated Standards Of Reporting Trials (CONSORT) for randomised controlled trials (Schulz, Altman, and Moher 2010); Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) for nonrandomised controlled trials, cohorts and quasi‐experimental studies (Des Jarlais et al. 2004); and Strengthening the Reporting of OBservational studies in Epidemiology (STROBE) for cross‐sectional studies (von Elm et al. 2007). Although essential and widely used, they do not provide the level of detail required to define and report the nature of the strategies used.

Proctor and colleagues made recommendations regarding the essential steps to describe strategies: naming them, defining them, and specifying (a) the actors, (b) the action, (c) the action target, (d) the temporality, (e) the dose, (f) the implementation outcome affected, (g) the justification (Proctor, Powell, and McMillen 2013).

For their part, Mazza et al. (2013) focused on creating a taxonomy listing various specific strategies for integrating and applying clinical practice guidelines.

Simultaneously, a taxonomy more focused on EBP implementation was developed on a larger scale. This involved a first phase of updating and harmonising to achieve coherent results. A first study identified 68 implementation strategies grouped into six processes: programming, training, funding, restructuring, quality management, and considering the political context (Powell et al. 2012). A second large‐scale study, the Expert Recommendations for Implementing Change (ERIC) project based on the Delphi method, including scientific experts and field researchers, stabilised the vocabulary used, identifying 73 distinct strategies (Powell et al. 2015). These were then synthesised into nine domains by Waltz and colleagues (Waltz et al. 2015):

  • Use evaluative and iterative strategies.

  • Provide interactive assistance.

  • Adapt and tailor to context.

  • Develop stakeholder interrelationships.

  • Train and educate stakeholders.

  • Support clinicians.

  • Engage consumers.

  • Utilise financial strategies.

  • Change infrastructure.

The nine domains are ranked by feasibility and importance. The ‘train and educate stakeholders’ domain, for example, comprises 11 specific strategies: the first seven are classified as grade I, meaning highly feasible and important, while the last four are classified as grade II, meaning feasible but less important.

Train and educate stakeholders

Grade I

19. Conduct ongoing training

55. Provide ongoing consultation

29. Develop educational materials

43. Make training dynamic

31. Distribute educational materials

71. Use train‐the‐trainer strategies

15. Conduct educational meetings

Grade II

16. Conduct educational outreach visits

20. Create a learning collaborative

60. Shadow other experts

73. Work with educational institutions

5.4. Implementation Outcomes

Implementation outcomes hold a significant place in implementation science. They are defined as ‘the effects of deliberate and purposive actions to implement new treatments, practices, and services’ (Proctor et al. 2011). The distinction between the clinical outcomes of evidence implementation (in terms of improving care quality, for example) and the outcomes of the implementation itself is fundamental. Several implementation models incorporate outcome evaluation. This is notably the case with the RE‐AIM model, focusing on public health strategy implementation (Glasgow et al. 1999), and the PRECEDE‐PROCEED model, focusing on health promotion (Green and Kreuter 2005).

Proctor's work is more specific to EBP (Proctor et al. 2011). The WHO recommends using Proctor's taxonomy to assess implementation outcomes (Peters et al. 2013). Figure 2 presents a heuristic model delineating different levels: intervention strategies, implementation strategies, and outcomes. Regarding outcomes, a clear distinction is made between implementation outcomes, service outcomes, and client outcomes.

FIGURE 2. 'Conceptual model of implementation research' (Proctor et al. 2009).

Proctor identifies three functions of implementation outcomes: they act as indicators of implementation success, as proximal indicators of implementation processes, and as intermediate steps linked to expected clinical outcomes. Indeed, a treatment can be highly effective but poorly implemented, and vice versa. It is therefore essential to differentiate between the outcomes of the implementation itself and those that come from the evidence.

Proctor's work defines a precise taxonomy of implementation outcomes, with eight categories identified (Proctor et al. 2011):

  • Acceptability: the perception of various stakeholders about the implementation (also known as satisfaction)

  • Adoption: the decision to try or use an innovation (also known as assimilation)

  • Appropriateness: the perceived adequacy and relevance of the EBP innovation or practice to the context (also known as compatibility)

  • Implementation cost: the financial effort made for implementation

  • Feasibility: the extent to which an innovation can be successfully used in a given setting

  • Fidelity: the degree to which the implementation of the innovation follows the prescribed method (also known as the integrity or quality of the program)

  • Penetration: the degree to which the innovation spreads within the setting (the number of people involved)

  • Sustainability: the maintenance of the practice over time

This work was complemented in 2022 by the publication of six recommendations for better understanding and use of the taxonomy (Lengnick‐Hall et al. 2022).

Recent work by Proctor highlighted the underutilisation of this taxonomy over the past 10 years, which makes it difficult to assess implementation outcomes (Proctor et al. 2023). Without a common vocabulary, evidence implementation results unfortunately cannot be transferred to a different context.

5.5. Clinical Application

Implementation projects that have used a model, applied strategies, and evaluated their implementation outcomes achieve successful results. Hammond's work demonstrates the success of such a project using the PARIHS model, three strategies from the ERIC project, and three implementation outcomes from Proctor's taxonomy (Hammond et al. 2020). Byrnes' project similarly describes the implementation of postoperative nutrition care processes that allow patients to recover faster; the implementation succeeded through the use of implementation science (Byrnes et al. 2018). Hanrahan led a project to modify several obsolete professional practices and achieved good results by following the IOWA model, applying seven strategies, and evaluating two implementation outcomes (Hanrahan et al. 2015). These examples are summarised in Table 2.

TABLE 2. Examples of implementation projects using implementation science.

Hammond et al. (2020)

EBP (practice, program, intervention, or policy implemented): Preoperative nasal decolonization of surgical patients with nasal povidone‐iodine (PI)

Model: i‐PARiHS

Strategies: Use evaluative and iterative strategies; Develop stakeholder interrelationships; Train and educate stakeholders

Implementation outcomes: Acceptability; Feasibility; Fidelity

Outcome: High

Byrnes et al. (2018)

EBP: Early diet upgrades, postoperative nutrition care processes from the ERAS (Enhanced Recovery After Surgery) guideline

Model: i‐PARiHS

Strategies: Use evaluative and iterative strategies; Provide interactive assistance; Adapt and tailor to the context; Develop stakeholder interrelationships; Support clinicians

Implementation outcomes: Acceptability; Adoption; Appropriateness; Feasibility; Fidelity

Outcome: High

Hanrahan et al. (2015)

EBP: Implementation of evidence‐based practice and removal of 'sacred cows'

Model: IOWA Model

Strategies: Use evaluative and iterative strategies; Provide interactive assistance; Adapt and tailor to the context; Develop stakeholder interrelationships; Train and educate stakeholders; Support clinicians; Engage consumers

Implementation outcomes: Adoption; Sustainability

Outcome: High

6. Discussion

EBP is essential for nursing practice (World Health Organization 2015). EBP can rely on concepts and propositions derived from nursing theories to initiate, implement, measure, and report on critical nursing phenomena (steps 0 to 3). Inductively, EBP informs research about questions raised by nurses that remain unanswered, and it should be able to feed a theory and evolve it based on observed phenomena. Finally, deductively, EBP allows research results to be implemented, evaluated, improved if necessary, and disseminated (steps 4 to 6).

The research process develops nursing knowledge and evolves clinical practices. EBP constitutes a true link between research results and the application of this evidence. This approach allows nurses to modify their practice based on their own science while developing critical analysis and clinical judgement (Cui et al. 2018). The profession thus reclaims nursing science, develops decision‐making autonomy, and highlights its clinical reasoning. As a result, nurses no longer practice based on tradition but in an informed manner. They master their practice and are proactive: research results feed practice through EBP and, conversely, EBP informs research about clinical phenomena that deserve exploration (Pepin et al. 2017).

One of the most important assets of the EBP culture is its potential for developing team empowerment, both structurally and psychologically (Laschinger et al. 2001). From a structural empowerment perspective, EBP impacts opportunities (new skills), information (new evidence), support (problem‐solving), formal power (reward for implementing innovation), and informal power (development of collaboration with peers and other healthcare professionals during implementation). From a psychological empowerment perspective, EBP develops the meaning given to practice, confidence in the ability to provide current and appropriate care, autonomy in practice, and the impact of one's practice (influence on care outcomes) (Teixeira et al. 2023).

EBP is indeed the foundation of nursing practice, enabling the delivery of updated, patient‐adapted care that aligns with the nurse's clinical expertise. However, it is no longer possible to consider an EBP project without first identifying a model, strategies, and implementation outcomes.

Implementation science perfectly complements the EBP approach, making it feasible, acceptable, and sustainable. This field is well structured, offering models, strategies, and outcomes to evaluate. It identifies favourable levers and helps overcome barriers encountered in the field during evidence implementation. The rigorous methods available allow evaluation of both the achievement of clinical objectives and the implementation process itself. Evaluating implementation outcomes can enable transfer to a similar context, provided the vocabulary used is standardised.

Implementation models are regularly updated. They are numerous and have different objectives, allowing selection based on the context studied. Once chosen, a model is relatively easy to use; however, time is needed to establish a comprehensive strategy. Mentors play an essential role in supporting teams at all EBP implementation stages, backed by implementation science (Kaba et al. 2023).

7. Conclusion

This discursive paper provides an overview of EBP and implementation science: the models used and their specifics, possible strategies, and outcomes to evaluate. The EBP mentor can thus choose the appropriate model for their practice context, identify the most relevant strategies, and now possesses all the elements needed to evaluate both clinical and implementation outcomes thanks to Proctor's taxonomy. Indeed, it is crucial for EBP mentors and champions to fully exploit implementation science to ensure the success of evidence implementation projects and their transferability to similar contexts. Countries that have not yet adopted EBP can benefit from the experiences of nurses who regularly implement evidence.


Author Contributions

All authors have agreed on the final version and meet at least one of the following criteria (recommended by the ICMJE): (1) substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; (2) drafting the article or revising it critically for important intellectual content. A.C.‐A. made substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data. A.C.‐A., J.S., and S.C. were involved in drafting the manuscript or revising it critically for important intellectual content and gave final approval of the version to be published. Each author participated sufficiently in the work to take public responsibility for appropriate portions of the content and agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Ethics Statement

The authors have nothing to report.

Conflicts of Interest

The authors declare no conflicts of interest.

Peer Review

The peer review history for this article is available at https://www.webofscience.com/api/gateway/wos/peer‐review/10.1111/jan.16571.

Acknowledgements

The authors have nothing to report.

Funding: The authors received no specific funding for this work.

Data Availability Statement

Data sharing not applicable: no new data were generated; the article describes entirely theoretical research.

References

  1. Aquino‐Maneja, E. M. , Failla K. R., Flores S. L., and Squier V. R.. 2023. “Research, Evidence‐Based Practice, and Quality Improvement Simplified.” Journal of Continuing Education in Nursing 54, no. 1: 40–48. 10.3928/00220124-20221207-09. [DOI] [PubMed] [Google Scholar]
  2. Balas, E. A. , and Boren S. A.. 2000. “Managing Clinical Knowledge for Health Care Improvement.” Yearbook of Medical Informatics 9, no. 1: 65–70. 10.1055/s-0038-1637943. [DOI] [PubMed] [Google Scholar]
  3. Birken, S. , Clary A., Tabriz A. A., et al. 2018. “Middle Managers’ Role in Implementing Evidence‐Based Practices in Healthcare: A Systematic Review.” Implementation Science: IS 13, no. 1: 149. 10.1186/s13012-018-0843-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Buckwalter, K. C. , Cullen L., Hanrahan K., et al. 2017. “Iowa Model of Evidence‐Based Practice : Revisions and Validation.” Worldviews on Evidence‐Based Nursing 14, no. 3: 175–182. 10.1111/wvn.12223. [DOI] [PubMed] [Google Scholar]
  5. Byrnes, A. , Young A., Mudge A., Banks M., Clark D., and Bauer J.. 2018. “Prospective Application of an Implementation Framework to Improve Postoperative Nutrition Care Processes: Evaluation of a Mixed Methods Implementation Study.” Nutrition & Dietetics 75, no. 4: 353–362. 10.1111/1747-0080.12464. [DOI] [PubMed] [Google Scholar]
  6. Christenbery, T. L. 2017. Evidence‐Based Practice in Nursing: Foundations, Skills, and Roles. New York: Springer Publishing Company. Incorporated. http://ebookcentral.proquest.com/lib/rcn/detail.action?docID=5185511. [Google Scholar]
  7. Chays‐Amania, A. 2023. EBP Process for Clinical Decision Making [Image]. [Google Scholar]
  8. CII . 2013. Prises de position CII, le domaine de la pratique des soins infirmiers. Geneva: ICN‐International Council of Nurses. https://www.icn.ch/fr/que‐faisons‐nous/prises‐de‐position. [Google Scholar]
  9. Clarke, V. , Lehane E., Mulcahy H., and Cotter P.. 2021. “Nurse Practitioners’ Implementation of Evidence‐Based Practice Into Routine Care: A Scoping Review.” Worldviews on Evidence‐Based Nursing 18, no. 3: 180–189. 10.1111/wvn.12510. [DOI] [PubMed] [Google Scholar]
  10. Clavijo‐Chamorro, M. Z. , Sanz‐Martos S., Gómez‐Luque A., Romero‐Zarallo G., and López‐Medina I. M.. 2020. “Context as a Facilitator of the Implementation of Evidence‐based Nursing: A Meta‐synthesis.” Western Journal of Nursing Research 43, no. 1: 60–72. 10.1177/0193945920914397. [DOI] [PubMed] [Google Scholar]
  11. Cui, C. , Li Y., Geng D., Zhang H., and Jin C.. 2018. “The Effectiveness of Evidence‐Based Nursing on Development of Nursing Students’ Critical Thinking: A Meta‐Analysis.” Nurse Education Today 65: 46–53. [DOI] [PubMed] [Google Scholar]
  12. Damschroder, L. J. , and Lowery J. C.. 2013. “Evaluation of a Large‐Scale Weight Management Program Using the Consolidated Framework for Implementation Research (CFIR).” Implementation Science: IS 8: 51. 10.1186/1748-5908-8-51. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Dang, D. , Dearholt S., Bissett K., Ascenzi J., and Whalen M.. 2022. “2022 EBP Models and Tools.” https://www.hopkinsmedicine.org/evidence‐based‐practice/ijhn_2017_ebp.html.
  14. Dawes, M. , Summerskill W., Glasziou P., et al. 2005. “Sicily Statement on Evidence‐Based Practice.” BMC Medical Education 5, no. 1: 1. 10.1186/1472-6920-5-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Des Jarlais, D. C. , Lyles C., Crepaz N., and the TREND Group . 2004. “Improving the Reporting Quality of Nonrandomized Evaluations of Behavioral and Public Health Interventions: The TREND Statement.” American Journal of Public Health 94, no. 3: 361–366. 10.2105/AJPH.94.3.361. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. DiCenso, A. , Guyatt G., and Cliska D., eds. 2005. Evidence‐Based Nursing: A Guide to Clinical Practice. St. Louis, MO: Elsevier Health Sciences. [Google Scholar]
  17. Eccles, M. P. , and Mittman B. S.. 2006. “Welcome to Implementation Science.” Implementation Science 1, no. 1: 1. 10.1186/1748-5908-1-1. [DOI] [Google Scholar]
  18. Estabrooks, C. A. 1998. “Will Evidence‐Based Nursing Practice Make Practice Perfect?” Canadian Journal of Nursing Research Archive 30, no. 1: 15–36. https://cjnr.archive.mcgill.ca/article/view/1422. [PubMed] [Google Scholar]
  19. Glasgow, R. E. , Vogt T. M., and Boles S. M.. 1999. “Evaluating the Public Health Impact of Health Promotion Interventions: The RE‐AIM Framework.” American Journal of Public Health 89, no. 9: 1322–1327. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Green, L. , and Kreuter M.. 2005. Health Program Planning: An Educational and Ecological Approach. McGraw‐Hill Education. [Google Scholar]
  21. Grimshaw, J. M. , Eccles M. P., Lavis J. N., Hill S. J., and Squires J. E.. 2012. “Knowledge Translation of Research Findings.” Implementation Science 7, no. 1: 50. 10.1186/1748-5908-7-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Guyatt, G. , Cairns J., Churchill D., and Cook D.. 1992. “Evidence‐based medicine. A new approach to teaching the practice of medicine.” JAMA 268, no. 17: 2420. 10.1001/jama.1992.03490170092032. [DOI] [PubMed] [Google Scholar]
  23. Hammond, E. N. , Brys N., Kates A., et al. 2020. “Nasal Povidone‐Iodine Implementation for Preventing Surgical Site Infections: Perspectives of Surgical Nurses.” PLoS One 15, no. 11: e0242217. 10.1371/journal.pone.0242217. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Hanrahan, K. , Wagner M., Matthews G., et al. 2015. “Sacred Cow Gone to Pasture: A Systematic Evaluation and Integration of Evidence‐Based Practice.” Worldviews on Evidence‐Based Nursing 12, no. 1: 3–11. [DOI] [PubMed] [Google Scholar]
  25. Harvey, G. , and Kitson A.. 2016. “PARIHS Revisited : From Heuristic to Integrated Framework for the Successful Implementation of Knowledge Into Practice.” Implementation Science 11, no. 1: 33. 10.1186/s13012-016-0398-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Huybrechts, I. , Declercq A., Verté E., Raeymaeckers P., and Anthierens S.. 2021. “The Building Blocks of Implementation Frameworks and Models in Primary Care: A Narrative Review.” Frontiers in Public Health 9: 675171. 10.3389/fpubh.2021.675171. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. International Council of Nurses . 2012. “Closing the Gap: From Evidence to Action: International Nurses Day 2012.” International Council of Nurses: 1–52. https://www.nursingworld.org/~4aff6a/globalassets/practiceandpolicy/innovation‐‐evidence/ind‐kit‐2012‐for‐nnas.pdf. [Google Scholar]
  28. Kaba, M. , Birhanu Z., Fernandez Villalobos N. V., et al. 2023. “Health Research Mentorship in Low‐ and Middle‐Income Countries : A Scoping Review.” JBI Evidence Synthesis 21, no. 10: 1912–1970. 10.11124/JBIES-22-00260. [DOI] [PubMed] [Google Scholar]
  29. Laschinger, H. K. S. , Finegan J., Shamian J., and Wilk P.. 2001. “Impact of Structural and Psychological Empowerment on Job Strain in Nursing Work Settings: Expanding Kanter's Model.” Journal of Nursing Administration 31, no. 5: 260–272. 10.1097/00005110-200105000-00006. [DOI] [PubMed] [Google Scholar]
  30. Lengnick‐Hall, R. , Gerke D. R., Proctor E. K., et al. 2022. “Six Practical Recommendations for Improved Implementation Outcomes Reporting.” Implementation Science: IS 17: 16. 10.1186/s13012-021-01183-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  31. Leung, K. , Trevena L., and Waters D.. 2014. “Systematic Review of Instruments for Measuring Nurses’ Knowledge, Skills and Attitudes for Evidence‐Based Practice.” Journal of Advanced Nursing 70, no. 10: 2181–2195. 10.1111/jan.12454. [DOI] [PubMed] [Google Scholar]
  32. Lisy, D. K. 2014. “Alan Pearson : A Pioneer of Evidence‐Based Care.” JBI Evidence Synthesis 12, no. 6: 1–2. 10.11124/jbisrir-2014-1841. [DOI] [Google Scholar]
  33. Mazza, D. , Bairstow P., Buchan H., et al. 2013. “Refining a Taxonomy for Guideline Implementation : Results of an Exercise in Abstract Classification.” Implementation Science: IS 8: 32. 10.1186/1748-5908-8-32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. McDonald, L. 2014. “Florence Nightingale, Statistics and the Crimean War.” Journal of the Royal Statistical Society: Series A (Statistics in Society) 177, no. 3: 569–586. 10.1111/rssa.12026. [DOI] [Google Scholar]
  35. McNett, M. , Tucker S., and Melnyk B. M.. 2021. “Evidence‐Based Practice Requires Evidence‐Based Implementation.” Worldviews on Evidence‐Based Nursing 18, no. 2: 74–75. 10.1111/wvn.12494. [DOI] [PubMed] [Google Scholar]
  36. McNett, M. , Tucker S., Thomas B., Gorsuch P., and Gallagher‐Ford L.. 2022. “Use of Implementation Science to Advance Nurse‐Led Evidence‐Based Practices in Clinical Settings.” Nurse Leader 20, no. 3: 297–305. 10.1016/j.mnl.2021.11.002. [DOI] [Google Scholar]
  37. McNett, M. , Tucker S., Zadvinskis I., et al. 2022. “A Qualitative Force Field Analysis of Facilitators and Barriers to Evidence‐Based Practice in Healthcare Using an Implementation Framework.” Global Implementation Research and Applications 2, no. 3: 195–208. 10.1007/s43477-022-00051-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Institute of Medicine . 2001. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press. 10.17226/10027. [DOI] [PubMed] [Google Scholar]
  39. Melnyk, B. M. 2018. “Breaking Down Silos and Making Use of the Evidence‐Based Practice Competencies in Healthcare and Academic Programs: An Urgent Call to Action.” Worldviews on Evidence‐Based Nursing 15, no. 1: 3–4. [DOI] [PubMed] [Google Scholar]
  40. Melnyk, B. M. , and Fineout‐Overholt E.. 2023. Evidence‐Based Practice in Nursing & Healthcare : A Guide to Best Practice, 5th ed. Philadelphia, PA: Wolters Kluwer. [Google Scholar]
  41. Melnyk, B. M. 2021. “The Current Research to Evidence‐Based Practice Time Gap Is Now 15 Instead of 17 Years: Urgent Action Is Needed.” Worldviews on Evidence‐Based Nursing 18, no. 6: 318–319. 10.1111/wvn.12546. [DOI] [PubMed] [Google Scholar]
  42. Melnyk, B. M. , Gallagher‐Ford L., Zellefrow C., et al. 2018. “The First U.S. Study on Nurses' Evidence‐Based Practice Competencies Indicates Major Deficits That Threaten Healthcare Quality, Safety, and Patient Outcomes.” Worldviews on Evidence‐Based Nursing 15, no. 1: 16–25. 10.1111/wvn.12269. [DOI] [PubMed] [Google Scholar]
  43. Nilsen, P. 2015. “Making Sense of Implementation Theories, Models and Frameworks.” Implementation Science 10, no. 1: 53. 10.1186/s13012-015-0242-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  44. Oermann, M. H. , Reynolds S. S., and Granger B. B.. 2022. “Using an Implementation Science Framework to Advance the Science of Nursing Education.” Journal of Professional Nursing 39: 139–145. [DOI] [PubMed] [Google Scholar]
  45. Pepin, J. , Ducharme F., and Kérouac S.. 2017. La Pensée Infirmière (Chenelière Éducation). [Google Scholar]
  46. Pereira, V. C. , Silva S. N., Carvalho V. K. S., Zanghelini F., and Barreto J. O. M.. 2022. “Strategies for the Implementation of Clinical Practice Guidelines in Public Health: An Overview of Systematic Reviews.” Health Research Policy & Systems 20, no. 1: 1–21. [DOI] [PMC free article] [PubMed] [Google Scholar]
  47. Peters, D. , Tran N., Adam T., Alliance for Health Policy and Systems Research , and World Health Organization . 2013. Implementation Research in Health: A Practical Guide/edited by David Peters … [et al]. World Health Organization. https://apps.who.int/iris/handle/10665/91758. [Google Scholar]
  48. Pinnock, H. , Barwick M., Carpenter C. R., et al. 2017. “Standards for Reporting Implementation Studies (StaRI) Statement.” BMJ i6795: i6795. 10.1136/bmj.i6795. [DOI] [PMC free article] [PubMed] [Google Scholar]
  49. Powell, B. J. , McMillen J. C., Proctor E. K., et al. 2012. “A Compilation of Strategies for Implementing Clinical Innovations in Health and Mental Health.” Medical Care Research and Review: MCRR 69, no. 2: 123–157. 10.1177/1077558711430690. [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Powell, B. J. , Waltz T. J., Chinman M. J., et al. 2015. “A Refined Compilation of Implementation Strategies : Results From the Expert Recommendations for Implementing Change (ERIC) Project.” Implementation Science: IS 10: 21. 10.1186/s13012-015-0209-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  51. Proctor, E. K. , Bunger A. C., Lengnick‐Hall R., et al. 2023. “Ten Years of Implementation Outcomes Research: A Scoping Review.” Implementation Science: IS 18, no. 1: 31. 10.1186/s13012-023-01286-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  52. Proctor, E. K. , Landsverk J., Aarons G., Chambers D., Glisson C., and Mittman B.. 2009. “Implementation Research in Mental Health Services : An Emerging Science With Conceptual, Methodological, and Training Challenges.” Administration and Policy in Mental Health and Mental Health Services Research 36, no. 1: 24–34. 10.1007/s10488-008-0197-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  53. Proctor, E. K. , Powell B. J., and McMillen J. C.. 2013. “Implementation Strategies : Recommendations for Specifying and Reporting.” Implementation Science 8, no. 1: 139. 10.1186/1748-5908-8-139. [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Proctor, E. K. , Silmere H., Raghavan R., et al. 2011. “Outcomes for Implementation Research: Conceptual Distinctions, Measurement Challenges, and Research Agenda.” Administration and Policy in Mental Health and Mental Health Services Research 38, no. 2: 65–76. 10.1007/s10488-010-0319-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Sackett, D. L. , Rosenberg W. M. C., Gray J. A. M., Haynes R. B., and Richardson W. S.. 1996. “Evidence Based Medicine: What It Is and What It isn't.” BMJ 312, no. 7023: 71–72. 10.1136/bmj.312.7023.71. [DOI] [PMC free article] [PubMed] [Google Scholar]
  56. Sallam, H. N. 2010. “Aristotle, Godfather of Evidence‐Based Medicine.” Facts, Views & Vision in ObGyn 2, no. 1: 11–19. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4154333/. [PMC free article] [PubMed] [Google Scholar]
  57. Schaffer, M. A. , Sandau K. E., and Diedrick L.. 2013. “Evidence‐Based Practice Models for Organizational Change: Overview and Practical Applications.” Journal of Advanced Nursing 69, no. 5: 1197–1209. 10.1111/j.1365-2648.2012.06122.x. [DOI] [PubMed] [Google Scholar]
  58. Schulz, K. F. , Altman D. G., and Moher D.. 2010. “CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials.” PLoS Medicine 7, no. 3: e1000251. 10.1371/journal.pmed.1000251. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Soicher, R. N. , Becker‐Blease K. A., and Bostwick K. C. P.. 2020. “Adapting Implementation Science for Higher Education Research : The Systematic Study of Implementing Evidence‐Based Practices in College Classrooms.” Cognitive Research: Principles and Implications 5: 54. 10.1186/s41235-020-00255-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  60. Speroni, K. G. , McLaughlin M. K., and Friesen M. A.. 2020. “Use of Evidence‐based Practice Models and Research Findings in Magnet‐Designated Hospitals Across the United States: National Survey Results.” Worldviews on Evidence‐Based Nursing 17, no. 2: 98–107. 10.1111/wvn.12428. [DOI] [PubMed] [Google Scholar]
  61. Stetler, C. B. 2001. “Updating the Stetler Model of Research Utilization to Facilitate Evidence‐Based Practice.” Nursing Outlook 49, no. 6: 272–279. 10.1067/mno.2001.120517. [DOI] [PubMed] [Google Scholar]
  62. Teixeira, A. C. , Nogueira A., and Barbieri‐Figueiredo M. d. C.. 2023. “Professional Empowerment and Evidence‐Based Nursing: A Mixed‐Method Systematic Review.” Journal of Clinical Nursing 32, no. 13‑14: 3046–3057. 10.1111/jocn.16507. [DOI] [PubMed] [Google Scholar]
  63. Tucker, S. , McNett M., Mazurek Melnyk B., et al. 2021. “Implementation Science: Application of Evidence‐Based Practice Models to Improve Healthcare Quality.” Worldviews on Evidence‐Based Nursing 18, no. 2: 76–84. 10.1111/wvn.12495. [DOI] [PubMed] [Google Scholar]
  64. Tucker, S. J. , and Gallagher‐Ford L.. 2019. “EBP 2.0 : From Strategy to Implementation.” American Journal of Nursing 119, no. 4: 50. 10.1097/01.NAJ.0000554549.01028.af. [DOI] [PubMed] [Google Scholar]
  65. van Achterberg, T. 2013. “Nursing Implementation Science : 10 Ways Forward.” International Journal of Nursing Studies 50, no. 4: 445–447. 10.1016/j.ijnurstu.2013.02.004. [DOI] [PubMed] [Google Scholar]
  66. van Achterberg, T. , Schoonhoven L., and Grol R.. 2008. “Nursing Implementation Science : How Evidence‐Based Nursing Requires Evidence‐Based Implementation.” Journal of Nursing Scholarship 40, no. 4: 302–310. 10.1111/j.1547-5069.2008.00243.x. [DOI] [PubMed] [Google Scholar]
  67. von Elm, E. , Altman D. G., Egger M., Pocock S. J., Gøtzsche P. C., and Vandenbroucke J. P.. 2007. “The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement : Guidelines for Reporting Observational Studies.” Lancet 370, no. 9596: 1453–1457. 10.1016/S0140-6736(07)61602-X. [DOI] [PubMed] [Google Scholar]
  68. Waltz, T. J. , Powell B. J., Matthieu M. M., et al. 2015. “Use of Concept Mapping to Characterize Relationships Among Implementation Strategies and Assess Their Feasibility and Importance : Results From the Expert Recommendations for Implementing Change (ERIC) Study.” Implementation Science: IS 10: 109. 10.1186/s13012-015-0295-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. World Health Organization . 2015. “Orientations stratégiques européennes relatives au renforcement des soins infirmiers et obstétricaux dans le cadre des objectifs de Santé 2020.” https://www.euro.who.int/fr/health‐topics/Health‐systems/nursing‐and‐midwifery/publications/2015/european‐strategic‐directions‐for‐strengthening‐nursing‐and‐midwifery‐towards‐health‐2020‐goals.
  70. World Health Organization , Jylhä V., Oikarainen A., Perälä M.‐L., and Holopainen A.. 2017. Facilitating Evidence‐Based Practice in Nursing and Midwifery in the WHO European Region, 1–35. https://iris.who.int/bitstream/handle/10665/353672/WHO‐EURO‐2017‐5314‐45078‐64291‐eng.pdf?sequence=1&isAllowed=y. [Google Scholar]



Articles from Journal of Advanced Nursing are provided here courtesy of Wiley
