Abstract
Background
There has long been debate about the balance between fidelity to evidence-based interventions (EBIs) and the need for adaptation for specific contexts or particular patients. The debate is relevant to virtually all clinical areas. This paper synthesises arguments from both fidelity and adaptation perspectives to provide a comprehensive understanding of the challenges involved, and proposes a theoretical and practical approach for how fidelity and adaptation can optimally be managed.
Discussion
There are convincing arguments in support of both fidelity and adaptations, representing the perspectives of intervention developers and internal validity on the one hand and users and external validity on the other. Instead of characterizing fidelity and adaptation as mutually exclusive, we propose that they may better be conceptualized as complementary, representing two synergistic perspectives that can increase the relevance of research and provide a practical way to approach the goal of optimizing patient outcomes. The theoretical approach proposed, the “Value Equation,” provides a method for reconciling the fidelity and adaptation debate by putting it in relation to the value (V) that is produced. The equation involves three terms: intervention (IN), context (C), and implementation strategies (IS). Fidelity and adaptation determine how these terms are balanced and, in turn, the end product – the value it produces for patients, providers, organizations, and systems. The Value Equation summarizes three central propositions: 1) the end product of implementation efforts should emphasize overall value rather than only the intervention effects, 2) implementation strategies can be construed as a method to create fit between EBIs and context, and 3) transparency is vital, not only for the intervention but for all four terms of the equation.
Summary
There are merits to arguments for both fidelity and adaptation. We propose a theoretical approach, a Value Equation, to reconciling the fidelity and adaptation debate. Although there are complexities in the equation and the propositions, we suggest that the Value Equation be used in developing and testing hypotheses that can help implementation science move toward a more granular understanding of the roles of fidelity and adaptation in the implementation process, and ultimately sustainability of practices that provide value to stakeholders.
Keywords: Adherence, Modifications, Fidelity, Evaluation, Sustainability theory, Theory, Sustainment, Validity
Background
Implementation science is defined as the scientific study of methods to promote the uptake of research findings into routine healthcare in clinical, organizational, or policy contexts [1]. The goal is to close the gap between what has been shown to be effective in rigorous trials (i.e., evidence-based interventions [EBIs], such as diagnostic tools and treatments) and what is done in clinical practice, so that patient and population health is improved.
If patients and ultimately populations are going to benefit from the best available evidence, fidelity (also denoted as adherence or treatment integrity), defined as the degree to which an intervention is carried out as it was described and originally tested and/or as the developer intended [2–8], is important in all steps of the research-to-practice pathway. During efficacy and effectiveness trials, when the main purpose is to separate effects of the intervention from other factors, high fidelity ensures that it is the intervention, not other factors, that produces the effect. In implementation studies, EBI fidelity is often a primary implementation outcome for determining if the methods used to promote uptake were successful [9]. In routine care, the degree to which EBIs are delivered as originally designed ultimately determines if patients and populations indeed receive the interventions that correspond with the best available evidence [2, 5, 10]. Overall, this makes understanding fidelity a central issue for implementation science.
However, throughout all the steps of the research-to-practice pathway, there are forces that pull away from fidelity. This has drawn increased attention to the role of adaptations, defined as the changes made to an intervention based on deliberate considerations to increase fit with patient or contextual factors at the system, organization, team, and individual clinician level [11–15]. Deliberate distinguishes adaptations from drift [16]. The central role of adaptations in implementation is increasingly being acknowledged, as evident from recent special issues [17] and themes in recent scientific meetings such as the US National Institutes of Health and Academy Health 9th Annual Conference on the Science of Dissemination and Implementation [10] and the 2018 Nordic Implementation Conference [18]. Yet, although there is global concern about fidelity-adaptation questions, the related conceptual and methodological issues are far from resolved.
Starting from routine care, there is ample evidence illustrating that adaptation of EBIs is the rule rather than the exception when they are used in real-world practice [12, 19, 20]. Factors at multiple levels, from the system, organization, and provider to the patient, can all influence the degree to which EBIs might require adaptation [14, 21]. For example, reasons for adaptation can include system- and organization-level concerns such as workforce readiness, organizational context, and the cost of purchasing EBIs from intervention developers and purveyors. In line with this, it has been noted that adaptability is an important factor that implementation strategies should address and that adaptation is likely to be needed to promote uptake [22]. This follows Everett Rogers’ seminal research stipulating that an innovation (e.g., an EBI) will almost always be reshaped to fit the organization or context in which it will be used [23]. In a concurrent development, research about cultural adaptations has also highlighted the need to tailor interventions based on the culture of the target populations [24], as well as the need to increase our understanding of cultural influences on implementation strategies and outcomes [25].
Recently, there has also been more discussion about the role that adaptations play earlier along the research-to-practice pathway [26, 27]. This includes questioning the assumption that fidelity automatically maximizes effectiveness [28]. It also includes showing that adaptation happens not only when EBIs are used in practice but also during trials, indicating that intervention researchers also need to attend to issues related to fidelity and adaptation [27]. There have been a number of efforts to improve the reporting of fidelity and adaptation (e.g., [12, 13, 29–31]) (see also Roscoe et al. (2019) [32] for a comparison of four classification taxonomies), but to date, neither adaptations nor the intervention as planned (i.e., fidelity) is sufficiently described or documented in effectiveness trials [33–36]. This leaves a gap in understanding the full scope of the fidelity and adaptation dilemma earlier in the research-to-practice pathway. Thus, fidelity and adaptation are concepts that implementation science by necessity needs to acknowledge and address. Yet, this is hindered by the plethora of terms used and the lack of clarity as to how the constructs can be conceptually organized. In Table 1, we propose a taxonomy that further refines the definitions of fidelity and adaptation according to the sub-components and dimensions to which fidelity and adaptation can refer. This may aid in identifying relevant constructs for assessment and measurement.
Table 1. Sub-components of fidelity and adaptation, with suggested dimensions to consider^a

Characteristics of the intervention

Content of intervention – What intervention activities and processes are performed? [4, 37]
Suggested dimensions to consider:
• The intervention core components, the specific active ingredients that make an EBI effective [7]; fidelity-consistent components [39]
• The logic or theory that explains how the intervention is intended to work, i.e., the intervention’s deep structure [40], intervention strength [38], program model [7], or function [41]
• Components that are part of the intervention but not central for producing the outcomes (surface structure [40], or adaptable periphery)
• Components that are not part of the intervention, which may be fidelity-consistent [5, 7, 39] or fidelity-inconsistent [39]
• Components that make the intervention uniquely distinguishable (program differentiation) [2]

Intervention delivery – How is the intervention delivered? [2, 37]
Suggested dimensions to consider:
• Number, length, and frequency of sessions
• Density: how the intervention is spaced out in time; the intensity of the intervention [28]
• The quality of delivery [2, 4]
• What the treatment provider plans and believes is delivered (dose delivered)
• What the recipient perceives that they have received (dose received) [42]
• The timing: when the various parts of the intervention are delivered in relation to the other parts [43]

Format – In which format and through which channels is the intervention delivered? [13, 24]
Suggested dimensions to consider:
• Format of delivery, e.g., one-to-one or group
• Channels of delivery, e.g., phone, internet, or face-to-face
• Location of delivery, e.g., school, non-profit organization, or church

Characteristics of the context of delivery

Conditions of the delivery context – What are the precise conditions related to the system, organization, and providers under which the intervention is delivered? [38]
Suggested dimensions to consider:
• Interventionist specifics (e.g., who is delivering the intervention; their training, competence level, personal attributes, and skills)
• Setting, e.g., primary care, hospital, community-based, workplace-based [13]
• Organizational factors, e.g., climate, leadership, mandate, history [44]
• System characteristics, e.g., reimbursement models, contracts, laws, policies, regulations, political climate [44]

Target population – What are the specifics of the target population? [7]
Suggested dimensions to consider:
• The health conditions that the intervention targets
• Cultural characteristics
• Age groups (e.g., children, youth, adults)
• Patients’ own health goals and specific needs
• Comorbidities

^a Fidelity and adaptation sometimes refer to implementation strategies rather than interventions, that is, to what extent the strategies chosen to facilitate the use of the intervention are adhered to or adapted. The focus of the current paper is on fidelity to and adaptations of interventions.
The issue of fidelity and adaptation has been controversial for decades (e.g., [45]). This debate deals with the longstanding tension between achieving internal and external validity [46]. Whereas some scholars emphasize the importance of drawing valid conclusions about the effects of an intervention, thereby prioritizing internal validity, others highlight the need for interventions to fit and function in the daily operations of different systems and organizations, thus highlighting the virtue of external validity. However, there has been little theoretical development that can guide how fidelity and adaptations should be managed and documented across the research-to-practice pathway and how this relates to implementation science [21]. The debate and the research on fidelity and adaptations have been split and fragmented across multiple fields and journals representing various clinical fields, disciplines, and/or settings in which EBIs are implemented. With some notable exceptions (e.g., [13, 28]), the debate has taken place in parallel silos, and there is currently a lack of an overview of the main arguments for fidelity on one side and adaptations on the other. This hampers the more comprehensive understanding needed to move toward a theoretical approach for how the fidelity and adaptation debate can be reconciled. This paper aims to synthesise the main arguments for fidelity and adaptation and, based on that, propose a theoretical and applied approach for how adaptation and fidelity can optimally be managed.
Five reasons fidelity is vital – and five reasons adaptations are also vital
As outlined above, the logic of the research-to-practice pathway stipulates that EBIs should be used as they were described and intended to be provided. This approach implies that fidelity to the intervention is central and any deviations problematic. However, there are also strong arguments for why adaptations are needed. Table 2 summarizes the more pervasive reasons and justifications found in the literature for fidelity and adaptations, respectively.
Table 2. Five arguments for fidelity and five arguments for adaptation

Argument 1
Fidelity … is vital for drawing valid conclusions by:
• increasing internal validity through the transparent and adherent use of EBIs [35]
• separating implementation failure from theory failure, i.e., distinguishing a lack of effects due to insufficient implementation from a lack of effects of an ineffective intervention [47]
• avoiding type-III errors: concluding that an intervention is not effective when it was actually poorly implemented [48]
Adaptation … improves intervention–context fit by:
• ensuring that EBIs can be implemented and used in/for systems, organizations, providers, or patients that are different from those in which the EBI was originally tested [49]
• increasing the acceptability, feasibility, and applicability of an EBI to a given context [29]
• increasing practical and/or value fit (philosophical and cultural) [12], e.g., creating fit with practical circumstances by changing from individual treatment to group format to align with a funding contract [50]

Argument 2
Fidelity … makes accumulation of knowledge possible by:
• making replication possible by ensuring the intervention remains the same across studies, thereby distinguishing between random and robust results [35, 51]
• allowing results from multiple studies to be synthesised in systematic reviews and meta-analyses
Adaptation … balances different outcomes by:
• focusing on a broader spectrum of objectives, e.g., not only the specific clinical outcome an EBI is evaluated against (e.g., symptom reduction, improved functioning) but also outcomes on other levels (patient, provider, organization, and system), such as reach, relevance, and costs [29, 49, 52]
• focusing on optimizing benefits over time rather than on sustained delivery alone (i.e., sustainment) [14, 28, 53, 54]

Argument 3
Fidelity … assures EBI effectiveness by:
• relying on studies, across different types of interventions and settings, showing that high fidelity can improve outcomes (e.g., [27, 37, 55–58]), at least in comparison to drift
Adaptation … assures EBI effectiveness by:
• relying on studies, across different types of interventions and settings, showing that adaptations can improve outcomes (e.g., [20, 59]), at least when there is large variation in client and provider characteristics

Argument 4
Fidelity … provides transparency and confidence by:
• ensuring that users, patients and their families, care providers, and funders (health systems, governments, insurers, and foundations) get what they are promised [47, 60]:
  – patients know that the EBI offered is also the EBI delivered, facilitating informed choices
  – subsequent providers can deliver appropriate care, trusting that the treatment as documented in clinical records is also the treatment delivered
  – funders get what they are paying for
  – systems allow fair comparison between organizations in a competitive market, and fair benchmarking of treatment outcomes
Adaptation … is necessary to address multiple diagnoses by:
• acknowledging that comorbidity is the rule rather than the exception in clinical practice, and that most EBIs have only been indicated (shown to be effective) for a very limited group of patients, primarily without comorbidities [61]
• allowing an “indication shift” so that EBIs can be used for groups for which evidence is lacking

Argument 5
Fidelity … provides equal care and reduces disparities by:
• decreasing unwanted variation between providers, organizations, geographical regions, and different target groups or individuals, e.g., between men and women [60]
• ensuring that decisions about adoption and use are made systematically, reducing the risk of gender and cultural biases
Adaptation … optimises the benefit for each patient by:
• translating mean effects into what is best for each individual in the group [62]
• taking individual patient variability into account by detecting the individuals who are likely to improve less (i.e., the tails of the distribution of effects), consistent with the personalized medicine movement [63]
Discussion
There are valid and reasonable arguments in support of fidelity, and there are valid and reasonable arguments in support of adaptation. However, many of the arguments seem to be contradictory and sometimes mutually exclusive. Much of the debate over the years has taken one or the other position, but the possibility that adaptation and fidelity can coexist has also been raised. These suggestions note that they can co-exist as long as the EBI core components are adhered to (e.g., [11, 24, 30]), and, more recently, that adaptation can improve fidelity by ensuring adherence to the key principles or elements underlying the EBI [64, 65]. Recent advancements in the conceptualization, measurement and documentation of adaptations have moved the field forward by aiding the empirical exploration of the relationship between fidelity, adaptations and outcomes (e.g., [14, 29, 31]).
Yet, with some notable exceptions (e.g., Chambers and Norton’s “Adaptome” model [26]), there have been few attempts at making theoretical propositions that address how fidelity and adaptation can be reconciled. In the following, we deconstruct the arguments for fidelity and adaptation to get at their underlying assumptions, and then make three propositions that reconcile fidelity and adaptation. The propositions and equation terms are summarized in the Value Equation, as shown in Table 3. The Value Equation states that the optimal value (V) is a product of the intervention (IN), the nature of the context (C) in which the intervention is being implemented, and the implementation strategies (IS). The Value Equation (V = IN * C * IS) terms are described in detail below.
Table 3. Terms of the Value Equation

| Term | Symbol | Specification |
|---|---|---|
| Value (V) | Vs | Value and fit of the intervention in the system context |
| | Vo | Value and fit of the intervention in the organizational context |
| | Vpr | Value and fit of the intervention for the provider |
| | Vpt | Value and fit of the intervention for the patient |
| Intervention (IN) | INf | Extent to which the intervention is carried out as it was described (fidelity) |
| | INfc | Fidelity-consistent adaptations |
| | INfi | Fidelity-inconsistent adaptations |
| Context (C) | Cs | System context |
| | Co | Organizational context |
| | Cpr | Provider context |
| | Cpt | Patient context |
| Implementation Strategy (IS) | ISc | Implementation strategy optimizing the context |
| | ISi | Implementation strategy optimizing the intervention |
Building the value equation
Table 3 summarizes the elements of the Value Equation. Written as a simple mathematical equation, its starting point is the assumption that it is (only) the EBI that produces the effect:

Effect = IN
Implicit here is that by adhering to the intervention as it was designed: 1) the effect is maximized; 2) it is clear what is being delivered; 3) there is little unwanted EBI variation between organizations, professionals, and patients; and 4) it is possible to accumulate knowledge across studies. Nevertheless, as described previously, adaptation happens. Thus, there is a need to specify the intervention as the extent to which the intervention was carried out as it was described (fidelity) (INf), as well as fidelity-consistent (INfc) and fidelity-inconsistent (INfi) adaptations [39] (see Table 3).
As the EBI moves along the research-to-practice pathway, the influence of contextual factors becomes increasingly recognizable. Thus, a second term is added to the equation: Context (C).

Effect = IN * C
Because many implementations take place in complex systems, with influences at the system, organization, provider, and patient levels, context needs to be further specified. Thus, we suggest that context be delineated as system context (Cs), organizational context (Co), provider context (e.g., professional discipline, training, attitudes toward the intervention) (Cpr), and patient context (e.g., target group) (Cpt).
The Value Equation proposes that, by acknowledging that context is indeed a term in the equation, the effects of an intervention by necessity need to be understood in relation to the context in which it is implemented. For example, even in efficacy trials, there are contextual factors that will influence the outcome (e.g., highly trained staff delivering the intervention, urban settings). Thus, an EBI is not effective in isolation; it is more or less effective for a certain group, in certain settings, and under certain conditions. When the EBI is used beyond that, the context term changes, and so does the expected effect. High fidelity may increase effects in certain contexts, and adaptation in others. The optimal answer lies in the configuration of both terms in the equation.
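The stepwise logic above can be sketched as a toy computational model. The sketch is purely illustrative: the scoring rules, the 0-1 scales, and the weights are our own assumptions for demonstration, not an operationalization proposed by the Value Equation itself.

```python
from dataclasses import dataclass

@dataclass
class Intervention:
    """The IN term: fidelity plus fidelity-consistent/-inconsistent adaptations."""
    fidelity: float        # INf: extent delivered as described (0-1)
    fc_adaptations: float  # INfc: fidelity-consistent adaptations (0-1)
    fi_adaptations: float  # INfi: fidelity-inconsistent adaptations (0-1)

    def score(self) -> float:
        # Hypothetical weighting: fidelity-consistent adaptations add to the
        # intervention score, fidelity-inconsistent adaptations subtract from it.
        return max(0.0, self.fidelity + 0.5 * self.fc_adaptations - 0.5 * self.fi_adaptations)

@dataclass
class Context:
    """The C term: fit at system, organizational, provider, and patient levels (each 0-1)."""
    system: float        # Cs
    organization: float  # Co
    provider: float      # Cpr
    patient: float       # Cpt

    def score(self) -> float:
        # Hypothetical aggregation: average fit across the four levels.
        return (self.system + self.organization + self.provider + self.patient) / 4

def effect(intervention: Intervention, context: Context) -> float:
    """Effect = IN * C: the same intervention yields different effects in different contexts."""
    return intervention.score() * context.score()
```

Running the same intervention through two contexts with different fit scores illustrates the point that an EBI is not effective in isolation: the expected effect changes with the context term.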
Implementation strategies create intervention–context fit
Implementation strategies are systematic processes used to adopt and integrate EBIs into clinical care [22]. Implementation strategies can be simple (e.g., clinical reminders) or complex and multicomponent (e.g., training + coaching + audit and feedback), and they vary with EBIs and contexts. We build on this notion to derive our first proposition: that implementation strategies are ways to create fit (i.e., appropriateness [9]) between an intervention and a specific context. We add a third term to the equation: Implementation Strategy (IS).
We argue that implementation strategies can optimize the effect of interventions in two ways: 1) by optimizing the outer system or inner organizational context so that it fits the intervention (ISc) [44], or 2) by optimizing the intervention so that it fits the context (ISi) (Table 3). Thus, in the first case, implementation strategies are concerned with increasing fidelity by enabling appropriate changes in the context (e.g., by increasing competence among staff and/or creating opportunities for the target behaviours through environmental restructuring, such as changing the reimbursement system to give clinicians the time needed [66, 67]). In the second case, the implementation strategies promote adaptations to achieve fit (e.g., removing components because they are perceived as culturally inappropriate, or tailoring based on patient preferences [12, 68]).
This proposition builds on the first argument for adaptation, stating that intervention–context fit is a necessary condition for implementation, but also invokes Elliott and Mihalic’s (2004) [69] notion that the need for intervention–context fit does not necessarily mean adaptation of the intervention; it may as well mean adaptation of the context to facilitate fidelity to the intervention. Thus, we build on previous work suggesting that adaptation and fidelity can co-exist (e.g., [11, 24, 30]), and add to that by explicitly proposing implementation strategies as the activities that optimize fit and reconcile fidelity and adaptation, whether those are concerned with modifying the intervention or the context, or both intervention and context.
The proposition to view implementation strategies as ways to create fit between an intervention and a specific context opens up new innovative approaches to choosing and matching implementation strategies, which has proven to be challenging [70]. The proposition aligns with recent suggestions to use user-centred design principles and community-academic partnerships for the purpose of creating fit between interventions and context, by engaging intervention developers and/or implementers and practitioners in a collaborative redesign process [71–73]. The value equation can aid this process by explicating which strategies are used, and why (if it is for the purpose of achieving fit by changing the context, or the intervention), and to what effect.
Moving from effect to multilevel value proposition
A compelling argument for both fidelity and adaptation is the potential to increase the effectiveness and public health impact of an intervention. Here, we make our second proposition: an intentional shift from focusing on the effect of an intervention to focusing on the value (V) it creates, making a final adjustment to the equation by exchanging effect for value. Expressed mathematically, the complete Value Equation becomes the following:

V = IN * C * IS
Value is broader than intervention effects alone. It reflects the optimization of a configuration of patient (Vpt), provider (Vpr), organization (Vo), and system (Vs) values and outcomes. Thus, value is a multicomponent, multilevel construct that represents the perceived or real benefit for each stakeholder and for stakeholders combined: a multilevel value proposition. For example, value for a service system may be increased population health, while for an organization, it may be optimized service delivery and decreased costs. Concurrently, a clinical professional may view value as being able to consider individual patient needs and outcomes, and patients may value their own improved functioning in daily life and/or clinical outcomes.
But what then is success of an EBI? By focusing on value, we suggest that implementation success can be defined as the ability to optimize value across the different levels and stakeholders. This perspective on implementation success aligns with recent definitions of sustainability, which highlight the ability to continuously deliver benefits as key part of the construct [74], with adaptations being a strategy to promote it [75]. The value equation proposes that effects on certain clinical outcomes are necessary but not sufficient. An EBI also needs to maximize value for individual providers, for the organization, and for the system. This shifts the focus on implementation from getting an EBI in place, to thinking about its value more broadly, and being more egalitarian in considering the needs of multiple stakeholders, including recently identified “bridging factors” to optimize implementation across context levels [76].
The equation, with its focus on value, also has implications for intervention developers. It implies moving from designing interventions that maximize efficacy to designing interventions that maximize value for multiple stakeholders. According to the Value Equation, the most efficacious intervention may not be the one that provides the most value. A less complex intervention that can be delivered by less skilled staff and that requires fewer implementation resources (e.g., supervision, re-organization of care) may result in higher value than an intervention that stands little chance of being used in practice [26]. This is consistent with approaches to maximizing public health impact: a given EBI may have a smaller effect size, but if it is sustained and reaches more patients, even a small effect size can have significant public health impact [77].
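One way to make the multilevel value configuration concrete is to treat V as an aggregate over stakeholder levels. The weighted average below is only one possible aggregation, and both the weights and the example scores are hypothetical; the paper deliberately leaves open how stakeholder values should be combined.

```python
# Hypothetical aggregation of the multilevel value configuration
# V = f(Vs, Vo, Vpr, Vpt). Equal weights are an assumption; in practice,
# stakeholders may weight (and even define) value differently.
def multilevel_value(v_system: float, v_org: float,
                     v_provider: float, v_patient: float,
                     weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    scores = (v_system, v_org, v_provider, v_patient)
    return sum(w * s for w, s in zip(weights, scores))
```

With invented scores, a highly efficacious intervention that creates little organizational value (low Vo) can end up with a lower overall V than a simpler intervention valued moderately at every level, mirroring the argument above.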
It is in relation to the multidimensional value configuration that fidelity and adaptation should be considered. Sometimes fidelity is a way to optimize value, sometimes adaptation is, and often it is a combination. This also means that fidelity might optimize one outcome and adaptation another. Furthermore, the different types of outcomes may be valued differently by different stakeholders. In this, we acknowledge that different stakeholders’ definitions of value may differ. In fact, they may often be misaligned, such as when an organization is required by the system to provide a service to a sub-population that does not request it. We suggest that the better implementers are at acknowledging and addressing these value conflicts, the higher the likelihood of successful and sustained implementation. Community–academic partnerships may be one bridging factor that can facilitate this process [78] by engaging stakeholders in jointly considering system, organization, and patient needs, increasing their understanding of each other’s agendas, and encouraging a transparent negotiation of how to best address different needs. Techniques such as concept mapping and co-created program logic (COP) may be useful to promote an understanding of divergent viewpoints and an effective dialogue [79, 80].
Similarly, by moving from focus only on treatment effect to a value configuration, we can reconcile arguments for fidelity and adaptation in relation to equity. We simply propose focusing on equity of the value achieved by the equation as a whole (i.e., for all stakeholders across levels) rather than only equity in relation to the intervention.
Transparency over all the value equation terms
One of the main arguments for fidelity is related to transparency: Fidelity to an EBI is needed for comparisons, accumulation of knowledge, and accountability. Our third proposition is that what is essential is transparent use. Thus, replication and accumulation of knowledge is still possible, but redefined to focus on transparency in relation to all terms in the Value Equation. In this, the Value Equation is consistent with recent calls for redefining replicability in clinical science (e.g., [81]). Requirements from funders to provide information on all equation terms would be helpful to push the development in this direction.
This proposition is consistent with calls for rigorous strategies to monitor, guide, and evaluate fidelity [82–84] as well as adaptation, as has increasingly been acknowledged (e.g., [17, 26, 29, 31, 85, 86]). The Value Equation adds to this by proposing transparent reporting of all equation terms, and justification of fidelity and adaptation based on how they promote fit between the EBI and context and how they impact value. In this way, users will be supported in assessing INfi and INfc in subsequent implementations. Otherwise, the risk is what can be called “adaptation neglect,” a syndrome in which adaptations pass unnoticed or undocumented regardless of how obvious they are.
Toward personalized value equations
One of the main arguments for fidelity is to enable accumulation of knowledge through replication, putting focus on only one of the terms of the equation. The Value Equation and the transparency proposition instead focus on all terms, thereby facilitating a gradual increase in the precision of the knowledge of what works for whom and when (i.e., specificity) [87]. This requires sophisticated processes and infrastructure. One way to achieve this may be to create databases of the different ways in which an intervention has been used, in what context, and to what effect [26, 86, 88]. Such data can thus form the basis for a gradually increasing understanding of what creates value for whom and show how the logic of the Value Equation can look in practice. For example, in the Paediatric Oncology Department of Karolinska University Hospital in Sweden, when a child does not respond as expected to a treatment protocol, adaptations are made, and both the adaptations and their effects are documented. Data from similar cases are accumulated, creating additional arms in the ongoing comparative trial. In this way, data on intervention*context configurations are collected.
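A minimal sketch of how such intervention*context records might be accumulated and queried. The record fields, configuration keys, and example data are hypothetical illustrations, not the schema of the Karolinska system or any existing database.

```python
from collections import defaultdict

# Hypothetical log of intervention*context configurations: each record
# documents what was delivered (including adaptations), in which context,
# and with what outcome.
records = []

def log_case(intervention, adaptations, context, outcome):
    """Document one delivered case, including any adaptations made."""
    records.append({"intervention": intervention,
                    "adaptations": tuple(sorted(adaptations)),
                    "context": context,
                    "outcome": outcome})

def mean_outcome_by_configuration():
    """Average outcome per unique intervention*context configuration,
    analogous to comparing arms defined by documented adaptations."""
    grouped = defaultdict(list)
    for r in records:
        grouped[(r["intervention"], r["adaptations"], r["context"])].append(r["outcome"])
    return {cfg: sum(v) / len(v) for cfg, v in grouped.items()}
```

Accumulating cases in this way lets adapted and unadapted deliveries of the same protocol be compared within the same context, which is the kind of systematized adaptation data the Adaptome approach calls for.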
Nevertheless, building databases that reflect the whole Value Equation may increase the administrative burden on clinical staff and organizations as a whole. A way to circumvent this risk may be to build a data infrastructure in which all stakeholders involved in the healthcare process (patients, providers, organizations, and system) are invited to share and use data for their specific needs, so that those entering the data also benefit from it in their daily operations [89]. Although such a development may seem utopian in many fragmented systems, there are examples of these learning healthcare systems, for instance at Cincinnati Children’s Hospital Medical Center [89] and in rheumatology care in Sweden [90]. Researchers may, for example, use the data for comparative effectiveness studies, and healthcare system representatives for benchmarking. However, the most transformative aspect may be when patients and providers can use the system at the point of care to track how an EBI is used (fidelity and adaptation) and what value it creates for the specific patient. This is consistent with recent applications of measurement-based care, where data related to intervention, context, and implementation are assessed in real time along with clinical data to guide clinical decision making [91].
Used in this way, the learning healthcare system [92] will provide the most precise version of “what works for whom, when” that we can think of: personalized value equations in patient- or provider-driven n = 1 studies [93]. Aggregation of all n = 1 studies will then provide the basis for accumulating knowledge of “what works for whom, when,” thereby bridging personalized medicine and the ideas for systematizing knowledge about adaptations outlined in the Adaptome [26].
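The aggregation step, pooling many n = 1 results into knowledge about “what works for whom,” could look something like the following sketch. The intervention name, patient profiles, and effect values are hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Illustrative: each n = 1 study yields a per-patient effect estimate
# for an intervention used under a given patient profile.
n_of_1_results = [
    # (intervention, patient_profile, effect)
    ("CBT", "adolescent", 0.7),
    ("CBT", "adolescent", 0.5),
    ("CBT", "adult", 0.2),
    ("CBT", "adult", 0.4),
]

def pool_by_profile(results):
    """Average single-case effects within each (intervention, profile)
    group, approximating 'what works for whom' across n = 1 studies."""
    by_profile = defaultdict(list)
    for intervention, profile, effect in results:
        by_profile[(intervention, profile)].append(effect)
    return {k: mean(v) for k, v in by_profile.items()}
```

In a real learning healthcare system the pooling would of course need appropriate statistical models rather than simple means; the sketch only illustrates the aggregation logic.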
Conclusions
In mathematics and statistics, we are used to thinking about how the different terms of an equation together determine the outcome. Implementation scientists can use the same approach to understand the product of an EBI, given the context in which it is used and the implementation strategies applied. The Value Equation is a theoretical proposition that reconciles the roles of adaptation and fidelity in the research-to-practice pathway. It states that the optimal value configuration that can be obtained (V) is a product of the intervention (IN), the nature of the context (C) in which the intervention is implemented, and how well the implementation strategy (IS) optimizes the intervention and the context. Fidelity and adaptation determine how these terms are mixed and, in turn, the end product: the value configuration it produces for multiple stakeholders.
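Read multiplicatively, and using the subscripted terms defined in the abbreviations list, the equation can be sketched symbolically as follows. This is an illustration of the logic only; the paper does not commit to a specific functional form.

```latex
% The Value Equation: value as a product of intervention, context,
% and implementation strategy (illustrative rendering, not a formal model)
\[
V = IN \times C \times IS
\]
% where each term is itself multilevel:
\[
V = (V_s, V_o, V_{pr}, V_{pt}), \quad
IN = (IN_f, IN_{fc}, IN_{fi}), \quad
C = (C_s, C_o, C_{pr}, C_{pt}), \quad
IS = (IS_c, IS_i)
\]
```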
The Value Equation contains three central propositions: 1) it positions implementation strategies as a way to create fit between EBIs and context, 2) it explicates that the product of implementation efforts should move from emphasizing intervention effects alone to emphasizing optimization of a multilevel value configuration, and 3) it shifts the focus from fidelity to transparency across all terms of the equation. While there are many complexities in each of these propositions and in each term of the equation, we suggest that the Value Equation be used to develop and test hypotheses that can ultimately help implementation science move toward a more granular understanding of how methods to promote the uptake of research findings can be optimized.
Abbreviations
- EBI/EBIs
Evidence-based intervention(s)
- V
Value
- Vs
Value and fit of the intervention in the system context
- Vo
Value and fit of the intervention in the organizational context
- Vpr
Value and fit of the intervention for the provider
- Vpt
Value and fit of the intervention for the patient
- IN
Intervention
- INf
Extent to which the intervention is carried out as it was described (fidelity)
- INfc
Fidelity-consistent adaptations
- INfi
Fidelity-inconsistent adaptations
- C
Context
- Cs
System context
- Co
Organizational context
- Cpr
Provider context (e.g., professional discipline, training, attitudes toward the EBI)
- Cpt
Patient context (e.g., target group)
- IS
Implementation Strategy
- ISc
Implementation strategy optimizing the context
- ISi
Implementation strategy optimizing the intervention
Authors’ contributions
UvTS and HH conducted the literature review and created the first version of the Value Equation, which was further refined with GAA. All authors contributed significantly to development of the paper. All authors read and approved the final version.
Funding
This work was funded by a research grant from the Swedish Research Council (project no. 2016–01261) and a visiting researcher grant from FORTE (DELG-2017/0024) after competitive peer-review processes. GAA was supported in part by the US National Institute on Drug Abuse (Grant # R01DA038466) and the National Institute of Mental Health (Grant # R01MH072961 and R03MH117493). The funders had no role in the design and conduct of the study or in the writing of the manuscript. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Swedish Research Council, Forte, or the US National Institutes of Health.
Availability of data and materials
NA
Ethics approval and consent to participate
NA
Consent for publication
NA
Competing interests
The authors declare that they have no competing interests.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Ulrica von Thiele Schwarz, Email: ulrica.schwarz@mdh.se, Email: ulrica.schwarz@ki.se.
Gregory A. Aarons, Email: gaarons@ucsd.edu
Henna Hasson, Email: Henna.hasson@ki.se.
References
- 1.Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1. doi: 10.1186/1748-5908-1-1. [DOI] [Google Scholar]
- 2.Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007;2:1–9. doi: 10.1186/1748-5908-2-40. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M, et al. Enhancing Treatment Fidelity in Health Behavior Change Studies: Best Practices and Recommendations From the NIH Behavior Change Consortium. Health Psychol. 2004;23:443–451. doi: 10.1037/0278-6133.23.5.443. [DOI] [PubMed] [Google Scholar]
- 4.Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res. 2003;18:237–256. doi: 10.1093/her/18.2.237. [DOI] [PubMed] [Google Scholar]
- 5.Hasson H. Systematic evaluation of implementation fidelity of complex interventions in health and social care. Implement Sci. 2010;5:67. doi: 10.1186/1748-5908-5-67. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Sechrest L, West SG, Phillips MA, Redner R, Yeaton W. Some neglected problems in evaluation research: Strength and integrity of treatments. Evaluation studies review annual. 1979;4:15–35. [Google Scholar]
- 7.Gearing RE, El-Bassel N, Ghesquiere A, Baldwin S, Gillies J, Ngeow E. Major ingredients of fidelity: a review and scientific guide to improving quality of intervention research implementation. Clin Psychol Rev. 2011;31:79–88. doi: 10.1016/j.cpr.2010.09.007. [DOI] [PubMed] [Google Scholar]
- 8.Perepletchikova F, Treat TA, Kazdin AE. Treatment integrity in psychotherapy research: analysis of the studies and examination of the associated factors. J Consult Clin Psychol. 2007;75(6):829. doi: 10.1037/0022-006X.75.6.829. [DOI] [PubMed] [Google Scholar]
- 9.Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76. doi: 10.1007/s10488-010-0319-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Chambers D, Simpson L, Neta G, von Thiele Schwarz U, Percy-Laurry A, Aarons GA, et al., editors. Proceedings from the 9th annual conference on the science of dissemination and implementation. Implement Sci; 2017;12(Suppl 1).
- 11.Lee SJ, Altschul I, Mowbray CT. Using planned adaptation to implement evidence-based programs with new populations. Am J Community Psychol. 2008;41:290–303. doi: 10.1007/s10464-008-9160-5. [DOI] [PubMed] [Google Scholar]
- 12.Moore J, Bumbarger B, Cooper B. Examining Adaptations of Evidence-Based Programs in Natural Contexts. J Prim Prev. 2013;34:147–161. doi: 10.1007/s10935-013-0303-6. [DOI] [PubMed] [Google Scholar]
- 13.Stirman S, Miller C, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8:65. doi: 10.1186/1748-5908-8-65. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7:32. doi: 10.1186/1748-5908-7-32. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Card JJ, Solomon J, Cunningham SD. How to adapt effective programs for use in new contexts. Health promot pract. 2011;12:25–35. doi: 10.1177/1524839909348592. [DOI] [PubMed] [Google Scholar]
- 16.Waller G. Evidence-based treatment and therapist drift. Behav Res Ther. 2009;47(2):119–127. doi: 10.1016/j.brat.2008.10.018. [DOI] [PubMed] [Google Scholar]
- 17.Bumbarger BK, Kerns SEU. Introduction to the Special Issue: Measurement and Monitoring Systems and Frameworks for Assessing Implementation and Adaptation of Prevention Programs. The Journal of Primary Prevention. 2019;40(1):1–4. doi: 10.1007/s10935-019-00537-4. [DOI] [PubMed] [Google Scholar]
- 18.von Thiele Schwarz U, Hasson H, Aarons GA, Sundell K. Usefulness of evidence: adaptation and adherence of evidence-based methods. Nordic Implementation Conference; 2018, May 29; Copenhagen, Denmark.
- 19.Aarons GA, Miller EA, Green AE, Perrott JA, Bradway R. Adaptation happens: a qualitative case study of implementation of The Incredible Years evidence-based parent training programme in a residential substance abuse treatment programme. Journal of Children's Services. 2012;7(4):233–245. doi: 10.1108/17466661211286463. [DOI] [Google Scholar]
- 20.Wiltsey Stirman S, Gamarra JM, Bartlett BA, Calloway A, Gutner CA. Empirical examinations of modifications and adaptations to evidence-based psychotherapies: Methodologies, impact, and future directions. Clin Psychol: Science Practice. 2017;24(4):396–420. doi: 10.1111/cpsp.12218. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12:111. doi: 10.1186/s13012-017-0640-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. doi: 10.1186/s13012-015-0209-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Rogers EM. Diffusion of innovations. New York: Simon and Schuster; 2010. [Google Scholar]
- 24.Castro FG, Barrera M, Jr, Martinez CR., Jr The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prev Sci. 2004;5:41–45. doi: 10.1023/B:PREV.0000013980.12412.cd. [DOI] [PubMed] [Google Scholar]
- 25.Cabassa L, Baumann A. A two-way street: bridging implementation science and cultural adaptations of mental health treatments. Implement Sci. 2013;8:90. doi: 10.1186/1748-5908-8-90. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Chambers DA, Norton WE. The adaptome: advancing the science of intervention adaptation. Am J Prev Med. 2016;51:S124–S131. doi: 10.1016/j.amepre.2016.05.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.von Thiele Schwarz U, Förberg U, Sundell K, Hasson H. Colliding ideals–an interview study of how intervention researchers address adherence and adaptations in replication studies. BMC Med Res Methodol. 2018;18:36. [DOI] [PMC free article] [PubMed]
- 28.Chambers D, Glasgow R, Stange K. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117. doi: 10.1186/1748-5908-8-117. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, et al. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front Public Health. 2018;6:102. doi: 10.3389/fpubh.2018.00102. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Pérez D, Van der Stuyft P, del Carmen ZM, Castro M, Lefèvre P. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions. Implement Sci. 2015;11:91. doi: 10.1186/s13012-016-0457-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Stirman SW, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14:58. doi: 10.1186/s13012-019-0898-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Roscoe JN, Shapiro VB, Whitaker K, Kim BE. Classifying changes to preventive interventions: applying adaptation taxonomies. J Prim Prev. 2019;40:89–109. doi: 10.1007/s10935-018-00531-2. [DOI] [PubMed] [Google Scholar]
- 33.Hoffmann TC, Erueti C, Glasziou PP. Poor description of non-pharmacological interventions: analysis of consecutive sample of randomised trials. BMJ. 2013;347:f3755. doi: 10.1136/bmj.f3755. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34.Leichsenring F, Steinert C, Ioannidis JP. Toward a paradigm shift in treatment and research of mental disorders. Psychol Med. 2019:1–7. [DOI] [PubMed]
- 35.Cox JR, Martinez RG, Southam-Gerow MA. Treatment integrity in psychotherapy research and implications for the delivery of quality mental health services. J Consult Clin Psych. 2019;87:221. doi: 10.1037/ccp0000370. [DOI] [PubMed] [Google Scholar]
- 36.Glasziou P, Meats E, Heneghan C, Shepperd S. What is missing from descriptions of treatment in trials and reviews? BMJ. 2008;336:1472. doi: 10.1136/bmj.39590.732037.47. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev. 1998;18:23–45. doi: 10.1016/S0272-7358(97)00043-3. [DOI] [PubMed] [Google Scholar]
- 38.Yeaton WH, Sechrest L. Critical dimensions in the choice and maintenance of successful treatments: strength, integrity, and effectiveness. J Consulting Clin Psychol. 1981;49:156. doi: 10.1037/0022-006X.49.2.156. [DOI] [PubMed] [Google Scholar]
- 39.Stirman SW, Gutner C, Crits-Christoph P, Edmunds J, Evans AC, Beidas RS. Relationships between clinician-level attributes and fidelity-consistent and fidelity-inconsistent modifications to an evidence-based psychotherapy. Implement Sci. 2015;10(1):115. doi: 10.1186/s13012-015-0308-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Resnicow K, Soler R, Braithwaite RL, Ahluwalia JS, Butler J. Cultural sensitivity in substance use prevention. J Community Psychol. 2000;28:271–290. doi: 10.1002/(SICI)1520-6629(200005)28:3<271::AID-JCOP4>3.0.CO;2-I. [DOI] [Google Scholar]
- 41.Hawe P. Lessons from complex interventions to improve health. Ann Rev Public Health. 2015;36:307–323. doi: 10.1146/annurev-publhealth-031912-114421. [DOI] [PubMed] [Google Scholar]
- 42.Steckler AB, Linnan L, Israel B. Process evaluation for public health interventions and research. San Francisco, California: Jossey-Bass; 2002. [Google Scholar]
- 43.von Thiele Schwarz U, Hasson H, Lindfors P. Applying a fidelity framework to understand adaptations in an occupational health intervention. Work. 2015;51(2):195–203. doi: 10.3233/WOR-141840. [DOI] [PubMed] [Google Scholar]
- 44.Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–274. doi: 10.1146/annurev-publhealth-032013-182447. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Castro FG, Yasui M. Advances in EBI development for diverse populations: Towards a science of intervention adaptation. Prev Sci. 2017;18:623–629. doi: 10.1007/s11121-017-0809-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Cook TD, Campbell DT, Shadish W. Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin; 2002. [Google Scholar]
- 47.Schoenwald SK, Garland AF, Chapman JE, Frazier SL, Sheidow AJ, Southam-Gerow MA. Toward the effective and efficient measurement of implementation fidelity. Adm Policy Ment Health. 2011;38:32–43. doi: 10.1007/s10488-010-0321-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Scanlon JW, Horst P, Nay JN, Schmidt RE, Waller A. Evaluability assessment: Avoiding type III and IV errors. Evaluation management. 1977:71–90.
- 49.Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research issues in external validation and translation methodology. Evaluation and the Health Professions. 2006;29:126–153. doi: 10.1177/0163278705284445. [DOI] [PubMed] [Google Scholar]
- 50.Aarons G, Hurlburt M, Horwitz S. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.Schmidt S. Shall we really do it again? The powerful concept of replication is neglected in the social sciences. Rev Gen Psychol. 2009;13:90. [Google Scholar]
- 52.Escoffery C, Lebow-Skelley E, Haardoerfer R, Boing E, Udelson H, Wood R, et al. A systematic review of adaptations of evidence-based public health interventions globally. Implement Sci. 2018;13:125. doi: 10.1186/s13012-018-0815-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.von Thiele Schwarz U, Lundmark R, Hasson H. The Dynamic Integrated Evaluation Model (DIEM): achieving sustainability in organizational intervention through a participatory evaluation approach. Stress Health. 2016;32(4):285–293. doi: 10.1002/smi.2701. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 54.Shediac-Rizkallah M, Bone L. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13:87–108. doi: 10.1093/her/13.1.87. [DOI] [PubMed] [Google Scholar]
- 55.Blakely CH, Mayer JP, Gottschalk RG, Schmitt N, Davidson WS, Roitman DB, et al. The fidelity-adaptation debate: Implications for the implementation of public sector social programs. Am J Community Psychol. 1987;15:253–268. doi: 10.1007/BF00922697. [DOI] [Google Scholar]
- 56.Hansen WB, Graham JW, Wolkenstein BH, Rohrbach LA. Program integrity as a moderator of prevention program effectiveness: Results for fifth-grade students in the adolescent alcohol prevention trial. J Stud Alcohol. 1991;52:568–579. doi: 10.15288/jsa.1991.52.568. [DOI] [PubMed] [Google Scholar]
- 57.Becker DR, Smith J, Tanzman B, Drake RE, Tremblay T. Fidelity of supported employment programs and employment outcomes. Psychiatr Serv. 2001;52(6):834–836. doi: 10.1176/appi.ps.52.6.834. [DOI] [PubMed] [Google Scholar]
- 58.Fauskanger Bjaastad J, Henningsen Wergeland GJ, Mowatt Haugland BS, Gjestad R, Havik OE, Heiervang ER, et al. Do clinical experience, formal cognitive behavioural therapy training, adherence, and competence predict outcome in cognitive behavioural therapy for anxiety disorders in youth? Clin psychol psychother. 2018;25:865–877. doi: 10.1002/cpp.2321. [DOI] [PubMed] [Google Scholar]
- 59.Sundell K, Beelmann A, Hasson H, von Thiele Schwarz U. Novel Programs, International Adoptions, or Contextual Adaptations? Meta-Analytical Results From German and Swedish Intervention Research. J Clin Child Adoles Psychol. 2015:1–13. [DOI] [PubMed]
- 60.Bond GR, Becker DR, Drake RE. Measurement of fidelity of implementation of evidence-based practices: Case example of the IPS Fidelity Scale. Clin Psychol: Science Practice. 2011;18:126–141. [Google Scholar]
- 61.Tinetti ME, Fried TR, Boyd CM. Designing health care for the most common chronic condition—multimorbidity. JAMA. 2012;307:2493–2494. doi: 10.1001/jama.2012.5265. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Tonelli M. The philosophical limits of evidence-based medicine. Acad Med. 1998;73:1234–1240. doi: 10.1097/00001888-199812000-00011. [DOI] [PubMed] [Google Scholar]
- 63.Joyner MJ, Paneth N. Seven questions for personalized medicine. JAMA. 2015;314:999–1000. doi: 10.1001/jama.2015.7725. [DOI] [PubMed] [Google Scholar]
- 64.Anyon Y, Roscoe J, Bender K, Kennedy H, Dechants J, Begun S, et al. Reconciling Adaptation and Fidelity: Implications for Scaling Up High Quality Youth Programs. The journal of primary prevention. 2019;40(1):35–49. doi: 10.1007/s10935-019-00535-6. [DOI] [PubMed] [Google Scholar]
- 65.Marques L, Valentine SE, Kaysen D, Mackintosh M-A, De Silva D, Louise E, et al. Provider fidelity and modifications to cognitive processing therapy in a diverse community health clinic: Associations with clinical change. J Consult Clin Psych. 2019;87(4):357. doi: 10.1037/ccp0000384. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Beh Health Ser & Research. 2017;44:177–194. doi: 10.1007/s11414-015-9475-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 67.Michie S, Atkins L, West R. The behaviour change wheel. A guide to designing interventions. Great Britain: Silverback Publishing; 2014. [Google Scholar]
- 68.Kakeeto M, Lundmark R, Hasson H, von Thiele Schwarz U. Meeting patient needs trumps adherence. A cross-sectional study of adherence and adaptations when national guidelines are used in practice. J Eval Clinic Practice. 2017;23:830–838. doi: 10.1111/jep.12726. [DOI] [PubMed] [Google Scholar]
- 69.Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5:47–53. doi: 10.1023/B:PREV.0000013981.28071.52. [DOI] [PubMed] [Google Scholar]
- 70.Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42. doi: 10.1186/s13012-019-0892-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Lyon AR, Bruns EJ. User-Centered Redesign of Evidence-Based Psychosocial Interventions to Enhance Implementation—Hospitable Soil or Better Seeds? JAMA Psych. 2019;76:3–4. doi: 10.1001/jamapsychiatry.2018.3060. [DOI] [PubMed] [Google Scholar]
- 72.Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, et al. Community-academic partnerships: A systematic review of the state of the literature and recommendations for future research. The Milbank Quarterly. 2016;94(1):163–214. doi: 10.1111/1468-0009.12184. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 73.Hasson H, Gröndal H, Hedberg Rundgren Å, Avby G, Uvhagen H, Von Thiele Schwarz U. How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: A study protocol. Implement Sci Communications. In press. [DOI] [PMC free article] [PubMed]
- 74.Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12:110. doi: 10.1186/s13012-017-0637-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 75.Stirman SW, Finley EP, Shields N, Cook J, Haine-Schlagel R, Burgess JF, et al. Improving and sustaining delivery of CPT for PTSD in mental health systems: a cluster randomized trial. Implement Sci. 2017;12(1):32. doi: 10.1186/s13012-017-0544-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1. doi: 10.1186/s13012-018-0842-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Rutledge T, Loh C. Effect sizes and statistical testing in the determination of clinical significance in behavioral medicine research. Ann Behav Med. 2004;27:138–145. doi: 10.1207/s15324796abm2702_9. [DOI] [PubMed] [Google Scholar]
- 78.Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adoles Psychol. 2014;43:915–928. doi: 10.1080/15374416.2013.876642. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 79.Green AE, Fettes DL, Aarons GA. A concept mapping approach to guide and understand dissemination and implementation. J Behav Health Serv Res. 2012;39(4):362–373. doi: 10.1007/s11414-012-9291-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 80.von Thiele Schwarz U, Richter A, Hasson H. Getting everyone on the same page. In: Organizational interventions for health and well-being. 2018. p. 42–67. [Google Scholar]
- 81.Tackett JL, Lilienfeld SO, Patrick CJ, Johnson SL, Krueger RF, Miller JD, et al. It’s time to broaden the replicability conversation: Thoughts for and from clinical psychological science. Persp Psychol Sci. 2017;12:742–756. doi: 10.1177/1745691617690042. [DOI] [PubMed] [Google Scholar]
- 82.Des Jarlais D, Lyles C, Crepaz N, TREND Group. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health. 2004;94:361–366. doi: 10.2105/AJPH.94.3.361. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMC Med. 2010;8:18. doi: 10.1186/1741-7015-8-18. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4(10):e296. doi: 10.1371/journal.pmed.0040296. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 85.Lewis CC, Lyon AR, McBain SA, Landes SJ. Testing and Exploring the Limits of Traditional Notions of Fidelity and Adaptation in Implementation of Preventive Interventions. J Prim Prev. 2019;40:137–141. doi: 10.1007/s10935-019-00539-2. [DOI] [PubMed] [Google Scholar]
- 86.DeRosier ME. Three Critical Elements for Real-Time Monitoring of Implementation and Adaptation of Prevention Programs. J Prim Prev. 2019;40:129–135. doi: 10.1007/s10935-019-00538-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Pawson R. The science of evaluation: a realist manifesto. Sage; 2013. [Google Scholar]
- 88.Berkel C, Gallo CG, Sandler IN, Mauricio AM, Smith JD, Brown CH. Redesigning Implementation Measurement for Monitoring and Quality Improvement in Community Delivery Settings. J Prim Prev. 2019;40:111–127. doi: 10.1007/s10935-018-00534-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 89.Lindblad S, Ernestam S, Van Citters A, Lind C, Morgan T, Nelson E. Creating a culture of health: evolving healthcare systems and patient engagement. QJM: Int J Med. 2017;110:125–129. doi: 10.1093/qjmed/hcw188. [DOI] [PubMed] [Google Scholar]
- 90.Ovretveit J, Keller C, Forsberg HH, Essén A, Lindblad S, Brommels M. Continuous innovation: developing and using a clinical database with new technology for patient-centred care—the case of the Swedish quality register for arthritis. Int J Qual Health Care. 2013;25:118–124. doi: 10.1093/intqhc/mzt002. [DOI] [PubMed] [Google Scholar]
- 91.Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cogn Behav Pract. 2015;22:49–59. doi: 10.1016/j.cbpra.2014.01.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 92.Atkins D, Kilbourne AM, Shulkin D. Moving from discovery to system-wide change: the role of research in a learning health care system: experience from three decades of health systems research in the Veterans Health Administration. Annu Rev Public Health. 2017;38:467–487. doi: 10.1146/annurev-publhealth-031816-044255. [DOI] [PubMed] [Google Scholar]
- 93.Riggare S, Unruh KT, Sturr J, Domingos J, Stamford JA, Svenningsson P, et al. Patient-driven N-of-1 in Parkinson’s Disease. Methods Inf Med. 2017;56:e123–e128. doi: 10.3414/ME16-02-0040. [DOI] [PMC free article] [PubMed] [Google Scholar]