Abstract
Neurological disorders are the leading cause of disability and the second leading cause of death globally. To address this enormous disease burden, scientists are pursuing innovative solutions to maintain and improve the quality of neurologic care. Despite the availability of many effective evidence-based practices, many patients with neurological disorders cannot access them (or receive them inefficiently after a long delay) and may be exposed to unnecessary, expensive, and potentially harmful treatments. To promote the systematic uptake of evidence-based practices into the real world, a new scientific study of methods has developed: Implementation Science. In implementation science research, transdisciplinary research teams systematically (using theories, models, and frameworks) assess local barriers to the adoption of evidence-based practices and examine potential solutions using implementation strategies (interventions that support the adoption of intended practices) targeting multiple levels of the healthcare system, including the patient, provider, clinic, facility, organization, or broader community and policy environment. The success of these strategies (implementation outcomes) is measured by the extent and quality of the implementation. Implementation studies can be either observational or interventional but are distinct from traditional efficacy or effectiveness studies. Traditional neuroscience research and clinical trials, conducted in controlled settings, focus on discovering new insights with little consideration of translating those insights into the everyday practice of a resource-constrained and dynamic healthcare system. Thus, neurologists must become familiar with Implementation Science to reduce the knowledge-practice gap, maximize healthcare value, and improve the management of brain disorders affecting public health.
Keywords: Comparative effectiveness research, Dissemination, Framework, Brain, Neuroscience, Quality
Background
Clinical innovation in biomedical research is expected to improve public health. However, the effectiveness of an innovation does not guarantee its rapid uptake in the healthcare system or the community. (1–3) Only 14% of clinically useful research findings are adopted into everyday practice, after an average delay of approximately 17 years. (4) This problem has persisted since the beginning of biomedical research. Before the 19th century, countless sailors died from scurvy during long voyages. Mariners had observed since the 1600s that citrus might cure scurvy, and Dr. James Lind proved the efficacy of citrus to prevent scurvy in a controlled trial in 1747. However, lemon juice did not become compulsory on ships for almost 50 years after the successful experiment. This delay in the uptake of evidence-based practices (EBPs) has only worsened in recent times, with the rapid pace of biological research outstripping the healthcare system's capacity to take up innovation. A study analyzing the quality of health care in the US showed that only 50% of people received recommended preventive care, 70% of patients received appropriate acute care, and only 60% received recommended chronic care. (5) Additionally, 20–30% of patients were noted to receive contraindicated acute and chronic care. Unfortunately, there has been a temporal trend of worsening healthcare delivery in recent times, with a higher rate of inappropriate medical treatment. (6) Recent challenges in getting EBPs promptly into the community can be seen starkly in the adoption of mask use in public places and the acceptance of highly efficacious vaccines to prevent coronavirus transmission.
As with other branches of medicine, there is an enormous gap between scientifically proven health innovations for neurological disorders and the successful implementation of these innovations in the real world. Many patients with neurological disorders do not receive evidence-based care (or receive it inefficiently after a long delay) and may be exposed to unnecessary, expensive, and potentially harmful treatments. For example, the American Academy of Neurology (AAN) suggested eight quality measures for providing high-quality care for people with epilepsy. (7) These measures include referral (or discussion of referral) to a comprehensive epilepsy center for all patients with intractable epilepsy and reproductive counseling for all women of childbearing potential with epilepsy at least once a year. However, a real-world study showed that only 48% of patients received an appropriate referral for epilepsy surgery, and only 46% of eligible patients were counseled about reproductive issues. (8) Another quality-care recommendation involves screening for anxiety and depression in people with epilepsy during every office visit, as more than 1 in 4 may have depression or anxiety. However, a survey showed that only 55% of neurologists routinely screened for these comorbidities. (9) An AAN practice guideline recommends informing all children and adults with epilepsy about the risk of Sudden Unexpected Death in Epilepsy (SUDEP) annually, with counseling that seizure freedom, particularly freedom from generalized tonic-clonic seizures, is strongly associated with decreased SUDEP risk. (10) However, awareness of SUDEP was extremely poor (14.3%) among patients, as health professionals infrequently discussed SUDEP with their patients. (11) Another example of a knowledge-practice gap can be seen in the management of status epilepticus, one of the most common neurological emergencies. Several consensus-based guidelines support prompt, early use of intravenous benzodiazepines as first-line therapy in seizure emergencies. (12) However, a multicenter, observational study from tertiary pediatric hospitals demonstrated that only one-third of patients received first-line treatment in a timely fashion. (13) Similarly, despite strong evidence from randomized controlled trials and practice parameters, the time to referral for epilepsy surgery is frequently over 20 years. (14) Although the examples above are drawn from the epilepsy field, these kinds of know-do gaps (the gap between what we know and what we do) are pervasive in all fields of neurology and medicine. A new field of science, Implementation Science, can systematically analyze these know-do gaps to identify barriers and potential solutions in the local context while producing generalizable knowledge. Our aim in this review is to introduce this new field of science to neurologists and neuroscience researchers.
Definition
Implementation Science (IS) is defined as 'the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice and, hence, to improve the quality and effectiveness of health services.' (15) IS studies are distinct from traditional efficacy and effectiveness studies (16) (Table 1). After an efficacy study is successful in a controlled setting, comparative effectiveness research, which measures research outcomes in overburdened, under-resourced clinics or hospitals, may follow. However, effectiveness trials are still primarily dependent on research support separate from the clinical infrastructure. Thus, improving the uptake of EBPs into clinical practice, or scaling an intervention to the population level, requires assistance from IS methodologies. IS focuses not only on patients but also on the provider, organization, and policy levels of healthcare to identify quality gaps, break down the research-practice barrier, and maximize the possible impact of evidence-based innovation. (17) To identify barriers across this broad landscape, IS is highly interdisciplinary. Clinicians and clinical scientists frequently share research concepts with social scientists, economists, systems engineers, and health services researchers and work closely with other operational partners (clinical staff, administrators) to form diverse and complementary research teams.
Table 1.
Difference among efficacy, effectiveness, and implementation studies

| | Efficacy studies | Effectiveness studies | Implementation studies |
|---|---|---|---|
| Purpose | New interventions are tested in tightly controlled trials to assess impact under ideal conditions | Clinical/preventive interventions are tested in community and/or organizational systems | Studies of strategies to help evidence-based practices reach a large portion of the population, be delivered with fidelity, and be maintained afterwards |
| Focus | Whether a clinical/preventive intervention could work under rigorous conditions | Whether a clinical/preventive intervention does work in a realistic context | How to bring evidence-based practices into community and/or service settings |
| Study examples | Efficacy of adjunctive cannabidiol to reduce drop seizures in patients with Lennox-Gastaut syndrome compared with placebo | Comparison of the efficacy, tolerability, and neuropsychological effects of ethosuximide, valproic acid, and lamotrigine in children with newly diagnosed childhood absence epilepsy | A study evaluating the effectiveness of the Epilepsy Centers of Excellence by describing changes in the quality of and access to epilepsy care before and after the initiative, using predefined quality indicators |
| Population and setting | Homogeneous group of subjects with careful monitoring and supervision to ensure high fidelity; stringent inclusion and exclusion criteria | Less strict in excluding patients; may include a more heterogeneous group of study participants with some commonly associated comorbidities | Unit of observation may be health professionals, clinics, organizations, or the policy level rather than only patients |
| Study staff | A highly trained research team | Clinicians, other practitioners, or trained individuals from the community deliver the clinical/preventive intervention with ongoing supervision supported by researchers | Local team with opinion leaders and champions |
| Outcomes | Primary health outcome with many secondary outcomes | Fewer outcomes are measured | Implementation outcomes relevant to the strategies (fidelity, acceptance, penetration, sustainability, etc.) |
Distinction from Quality Improvement and Dissemination Science
Implementation science and quality improvement (QI) share the common goal of solving real-world problems, and both approaches fall under the umbrella of 'improvement science.' However, IS differs from QI in that it starts with an underutilized EBP before detecting gaps at the provider, clinic, or healthcare-system level. (18) In contrast, a specific problem in the healthcare system leads to a specific QI process aimed at standardizing optimum practices and preventing errors. The QI process focuses on concrete, care-based interventions with patient-oriented endpoints based on a recognized gap in the system, whereas IS evaluates a broader process: from team and intervention development to practical execution. IS concentrates not on the ultimate results of the intervention but on the conception of the intervention strategy and the utility of a specific strategy in a specific setting/context. Thus, IS develops generalizable knowledge that can be applied beyond the individual system studied. Lastly, IS has a solid grounding in theory and may apply principles of hypothesis testing.
Like QI, dissemination research is frequently grouped with IS. However, in contrast to IS, dissemination science primarily focuses on widespread communication and educational strategies to spread evidence-based interventions. (19)
History of Implementation Science
In 1962, Everett Rogers, a prominent American sociologist and communication theorist, published the Diffusion of Innovations theory. (20) Drawing on examples from fields as varied as agriculture and healthcare, his theory posited that the spread of innovation is a social process that is highly dependent on context and not always concordant with the evidence supporting the innovation itself. The adoption of effective innovations into clinical practice was accelerated by the Quality Enhancement Research Initiative (QUERI), established in 1998 to improve US veterans' health. (21) This initiative catalyzed the emergence of IS as a new field. Over the last two decades, several countries and national health services (the USA, Canada, the UK, Australia, and others) have also taken the initiative to bring IS and associated strategies to the forefront to maximize the value of relatively fixed healthcare funds allotted for research. (22, 23) For example, the Fogarty International Center of the National Institutes of Health (NIH) now extensively supports research and research training in IS. (24) The Center for Global Health Studies at Fogarty supports innovative research in a wide variety of areas, including brain disorders. (25)
The Principles and Methods of Implementation Science
Theory, Model, and Framework
Implementation Science (IS), as a catalyst for health-system reform, depends on well-developed theory that enables generalizable knowledge to emerge out of seeming chaos. (26) Researchers and clinicians outside the IS field may need a basic understanding of theory-based implementation concepts and the associated terminology (theory, model, and framework) to select relevant approaches during research and to promote cross-disciplinary dialogue with other researchers. (27, 28)
Scientific theories are formally stated, testable propositions based on known facts that help predict future events. In contrast, a 'theory' in IS indicates a direct relation between two variables or constructs without high specificity. A 'theory' may operate inside a 'model.' A hypothetical example can help distinguish the terms 'theory' and 'model' as used in IS. A model may hypothesize a change process in which disseminating AAN practice parameters among neurologists will increase folic acid supplementation in women of reproductive age. Many 'theories' may operate inside this model: for example, disseminating the guideline will increase knowledge among providers (Theory 1), and increased provider knowledge will lead to folic acid supplementation (Theory 2). Each theory in the model can be assessed to support or reject this pathway of change. Based on the results, the model may be refined or rejected altogether.
On the other hand, frameworks provide a broad set of variables or constructs without specifying causal relationships. Frameworks facilitate planning of implementation work and can be adapted.
These frameworks provide a structure for evaluating different constructs or variables to determine their influence on the outcome of implementation endeavors (Table 2). (29–34) Although disease-specific models are not necessary, implementation models specific to neurology and brain disorders have started to emerge. The PRIME (Patient Reported ImpleMentation sciEnce) model has been utilized to identify patient journeys, treatment gaps, system delays, and care barriers associated with refractory epilepsy. (35)
Table 2.
Few commonly used frameworks with associated domains

| Frameworks | Domains |
|---|---|
| RE-AIM framework | Five domains are assessed in this framework: |
| | • Reach into the target population (proportion of eligible individuals who receive the intervention) |
| | • Effectiveness or efficacy of the intervention (whether the intervention has the intended effect) |
| | • Adoption of the innovation (proportion of settings, institutions, and staff that adopt the intervention) |
| | • Implementation (fidelity of the intervention as implemented; time and cost of delivering the intervention) |
| | • Maintenance of the intervention (sustainability of the intervention) |
| | Any intervention may be evaluated for public health impact using these five domains. For example, a global impact assessment of repository corticotropin injection for patients with infantile spasms may show a significant impact at the individual-intervention level, with a high level of reach (the extent to which target families accept the treatment) and efficacy. However, this expensive treatment can be adopted, implemented, and maintained in only a small number of high-resource countries, limiting its overall global public health impact. |
| iPARIHS (integrated Promoting Action on Research Implementation in Health Services) | This framework proposes that successful implementation of evidence into practice depends not only on the quality of the evidence but also on the context or setting where the evidence is introduced and the process of introducing (facilitating) the evidence into practice. Each of these three key factors consists of several sub-elements related to the implementation outcome: |
| | • Evidence has four bases: research, practitioner expertise and experience, community/group inclusion in decision-making, and the local context and environment |
| | • Key aspects of context include organizational culture, leadership roles, resources, and the relevance and fit of the innovation to the organization |
| | • Facilitation is an active intervention strategy that specifies the types of support required to bring about the intended change, the characteristics of the change, and the styles necessary for the facilitation role |
| The Consolidated Framework for Implementation Research (CFIR) | This framework has 39 constructs arranged in 5 major domains. It supports data collection for capacity and needs assessment before implementation and, during and after implementation, helps identify barriers and facilitators of the intervention and link performance objectives with intervention methods and implementation objectives: |
| | • Intervention characteristics |
| | • Outer setting |
| | • Inner setting |
| | • Characteristics of the individuals involved |
| | • The process of implementation |
| | The distinction between the outer and inner settings is not always clear. Generally, however, the outer setting includes the economic, political, and social context of the organization, and the inner setting includes the structural, political, or cultural context inside the organization in which the implementation process unfolds. |
Implementation Process (Figure 1)
Figure 1.
Implementation Process Flowchart
Choosing evidence-based practices
During the process of deciding on a new procedure, clinical guideline, decision-making tool, care pathway, or 'best practice' to implement, it is imperative to choose one that is credible, easily accessible, and simple to use and that offers support for complex problems involving interdependent actions, decisions, and routines among the stakeholders. Medical specialists (including neurologists) frequently adopt guidelines published by their own national specialty organization rather than guidelines from an independent national institute, a college of family physicians, or an international organization. (36) Interested neurology researchers may refer to the AAN, Child Neurology Society, and American Epilepsy Society websites for available guidelines and practice parameters. (37–39) For successful implementation, an innovation needs to be at least partially compatible with existing norms and values and adapted to the local context, with recommendations specific enough for definitive action. During the pre-implementation process, researchers should carefully evaluate the characteristics of an innovation that promote or hinder implementation (relative advantage, compatibility with perceived needs, values, and norms, complexity of use, trialability, adaptability, cost, duration, disruptiveness, magnitude). (40) In the absence of clear guidelines about an intervention, data from rigorous, well-done systematic reviews/meta-analyses can serve as a starting point.
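To make the weighing of these characteristics concrete, the toy sketch below scores hypothetical candidate EBPs on a subset of them. All candidate names, characteristics chosen, and ratings are illustrative assumptions rather than validated instruments or recommendations.

```python
# Hypothetical sketch: comparing candidate evidence-based practices (EBPs)
# on a subset of the innovation characteristics discussed above.
# Ratings (1-5) and candidates are invented for illustration only.

CHARACTERISTICS = [
    "relative_advantage", "compatibility", "simplicity",
    "trialability", "adaptability", "affordability",
]

def readiness_score(ratings):
    """Mean stakeholder rating across characteristics (higher = easier to implement)."""
    return sum(ratings[c] for c in CHARACTERISTICS) / len(CHARACTERISTICS)

candidates = {
    "Annual SUDEP counseling": {"relative_advantage": 5, "compatibility": 4,
        "simplicity": 4, "trialability": 5, "adaptability": 4, "affordability": 5},
    "Routine depression screening": {"relative_advantage": 4, "compatibility": 3,
        "simplicity": 3, "trialability": 4, "adaptability": 4, "affordability": 4},
}

for name, ratings in sorted(candidates.items(),
                            key=lambda kv: readiness_score(kv[1]), reverse=True):
    print(f"{name}: {readiness_score(ratings):.2f}")
```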
Quality indicators
After deciding on a particular evidence-based practice (EBP), a measurable element of practice performance (i.e., one or more quality indicators) should be determined to monitor and evaluate the important management, clinical, and support functions that affect patient outcomes. These indicators need to satisfy one or more of the following criteria: relevance, validity, reliability, feasibility, and target-group orientation. Indicators can be classified as structure indicators (such as the availability of a particular type of specialized clinic in a clinical network), process indicators (such as the number of referrals to an epilepsy clinic after a diagnosis of intractable epilepsy), or outcome indicators (such as the seizure-freedom rate after epilepsy surgery). (41, 42) Although externally driven quality measures focus on outcome indicators, IS and internal quality measures primarily target process indicators. Process indicators are preferred in IS because outcome measures cannot always be attributed to the patient-specific intervention, owing to confounders other than the intervention itself: one patient may deteriorate even after receiving evidence-based care, while another gets better despite not receiving quality care (a patient with drug-resistant epilepsy may continue to have seizures despite receiving appropriately indicated epilepsy surgery, while a similarly affected patient may have excellent seizure control despite not receiving an otherwise appropriate surgical referral and intervention). The determination of agreed-upon quality indicators is similar to QI efforts to identify measurable variables (or characteristics) before the study. Based on the determined indicators, baseline data are collected in the local context.
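As a concrete illustration, the sketch below computes a process indicator of the kind described above: the proportion of patients with intractable epilepsy who were referred to a comprehensive epilepsy center. The record structure is a hypothetical stand-in for data abstracted from chart review or the electronic health record.

```python
# Toy computation of a process indicator: referral rate to a comprehensive
# epilepsy center among patients with intractable epilepsy.
# Records are invented for illustration.

records = [
    {"id": 1, "intractable": True,  "referred": True},
    {"id": 2, "intractable": True,  "referred": False},
    {"id": 3, "intractable": False, "referred": False},  # not eligible
    {"id": 4, "intractable": True,  "referred": True},
]

eligible = [r for r in records if r["intractable"]]
referred = sum(r["referred"] for r in eligible)
print(f"Process indicator (referral rate): {referred / len(eligible):.0%}")  # 67%
```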
Diagnostic analysis of local context
In contrast to an efficacy study conducted in a controlled setting, a careful diagnostic analysis is performed of the many relevant local factors (determinants of change) that oppose or enhance (barriers and facilitators) the uptake of EBPs in everyday practice. (43, 44) This type of formative evaluation (in which data gathered during the evaluation are fed back to the implementation team and target group for necessary adaptation) may utilize various implementation frameworks for detailed analysis of stakeholders (cognition, behaviors, motivation), the local setting (teams, networks), the organization (structure, culture, resources), and other external controls (policy, regulations). These determinants of change link to various outcomes, and a single outcome may have various determinants. For identifying these determinants, a less utilized research methodology can be particularly valuable: in-depth qualitative study using semi-structured interviews, group interviews, and direct observation of activities in the local environment. (45) Qualitative research is essential for answering 'why' and 'how' questions and for gaining insight into human behavior. Surveys and pilot studies can also be useful tools to understand the barriers associated with determinants of change. (46) Data acquired from these methods are analyzed to understand the various barriers and facilitators in the local context.
Implementation strategies
After completion of the diagnostic analysis, implementation strategies (an integrated set, bundle, or package of discrete implementation interventions, ideally selected to address specific identified barriers to implementation success) are planned and prioritized in correspondence with each barrier. They may be directed towards the patient, provider, clinic, organization, or policy level. (47–53) However, the logical choice of intervention is not common practice, owing to implicit bias toward familiar but not necessarily effective or suitable interventions in the local context (for example, the frequently used passive educational intervention may not be significantly beneficial in most contexts). (54–56) Implementation strategies for change can be classified into educational, financial, organizational, and regulatory interventions. Different strategies and interventions are more appropriate and desirable during different phases of the change process (orientation, insight, acceptance, change, and maintenance). Moreover, different approaches may be needed for different subgroups within the target groups. For example, innovators may respond well to passive educational intervention, whereas the middle majority are more sensitive to the opinion of the local thought leader, and late adopters may need additional strategies involving formal regulation or financial incentives. Common implementation strategies at the provider level include education/training, audit and feedback, and performance incentives. Strategies targeting the provider, team, clinic, or organization levels may include QI techniques, organizational restructuring efforts, team-based performance incentives, learning collaboratives, or community engagement. Implementation strategies should be uniformly named, defined, and specified along the following dimensions: (1) actor (who enacts the strategy); (2) action (specific actions, steps, or processes); (3) action target; (4) temporality (timing and sequencing); (5) dosing; (6) implementation outcomes affected; and (7) justification (the theoretical, empirical, or pragmatic justification for the choice of implementation strategies). (57) Comprehensive, balanced, and multifaceted strategies targeting various barriers simultaneously are feasible and practical in most contexts, particularly if they are implemented as a pilot intervention with gradual scaling. (58) Although external research support is generally limited in Implementation Science, facilitation (guided efforts by internal or external organizational staff to support multiple levels of system change through provider- or team-based coaching) is increasingly recognized as essential in implementing intervention strategies. (59) Several other implementation strategies (computerized decision support, interactive education with other practice-reinforcing strategies, input from opinion leaders combined with educational outreach or performance feedback, and the availability of change champions) have demonstrated success in promoting the adoption, implementation, and sustainment of evidence-based interventions.
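The seven specification dimensions above lend themselves to a structured record. The sketch below is a minimal, hypothetical illustration of how a strategy might be specified for reporting; the field values are invented examples, not prescriptions from the cited literature.

```python
# Minimal sketch: specifying an implementation strategy along the seven
# reporting dimensions (actor, action, target, temporality, dose, outcomes
# affected, justification). All field values are hypothetical.

from dataclasses import dataclass, field

@dataclass
class ImplementationStrategy:
    actor: str                     # (1) who enacts the strategy
    action: str                    # (2) specific actions, steps, or processes
    action_target: str             # (3) at whom/what the action is directed
    temporality: str               # (4) timing and sequencing
    dose: str                      # (5) frequency and intensity
    outcomes_affected: list = field(default_factory=list)  # (6)
    justification: str = ""        # (7) theoretical/empirical/pragmatic rationale

audit_and_feedback = ImplementationStrategy(
    actor="External facilitator (QI-trained nurse)",
    action="Deliver facility-specific audit-and-feedback reports on referral rates",
    action_target="Neurologists at participating clinics",
    temporality="Monthly, beginning after baseline data collection",
    dose="One report plus a 30-minute review meeting per month",
    outcomes_affected=["adoption", "fidelity"],
    justification="Audit and feedback has empirical support for modest practice change",
)
print(audit_and_feedback.action_target)
```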
Implementation outcome
The successful implementation of an evidence-based health intervention, rather than the actual health outcome of the intervention, is the primary outcome of studies in Implementation Science. Implementation outcomes can be measured using various indicators, as described previously. Several implementation outcomes have been extensively studied, such as feasibility (the extent to which an innovation can practically be used in a given setting), acceptability (the extent to which an innovation is agreeable to stakeholders), appropriateness (the differential utility of EBPs among patients), adoption (the intention or action to try an innovation), penetration (the reach or integration of the innovation within a setting), and sustainability (the extent to which an innovation is maintained or routinized within a setting over time). (60) In outcome assessment, the distinction between intervention failure and implementation failure is vital: the former suggests an ineffective intervention in a new setting, whereas the latter suggests that a good intervention has been deployed incorrectly in the local context.
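Several of these outcomes reduce to simple proportions at the setting level, as the toy calculation below illustrates. The counts are invented, and the operationalizations (for example, sustainability as continued use at one year) are assumptions; published measures are more nuanced.

```python
# Toy setting-level calculations for three implementation outcomes.
# All counts are hypothetical.

clinics_offered = 20            # clinics where the EBP was introduced
clinics_adopting = 14           # clinics that began using it
eligible_patients = 500         # patients eligible for the EBP across clinics
patients_reached = 310          # patients who actually received it
clinics_sustaining_1yr = 9      # adopting clinics still using it a year later

print(f"Adoption:       {clinics_adopting / clinics_offered:.0%}")        # 70%
print(f"Penetration:    {patients_reached / eligible_patients:.0%}")      # 62%
print(f"Sustainability: {clinics_sustaining_1yr / clinics_adopting:.0%}") # 64%
```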
De-implementation
Besides implementing evidence-based care, there is rising interest in the IS field in the process of de-implementation: abandoning existing low-value practices to minimize patient harm, reduce unnecessary waste, and improve population health. (61) De-implementation can be more complex than the implementation of an EBP, as it may require a more nuanced understanding of patient, health-professional, and organizational characteristics, such as anxiety, fear, worry, inaccurate beliefs and social norms, cognitive dissonance, fear of medical malpractice, revenue challenges, competitive advantage, and liability issues. (62) For example, de-implementing the practice of routinely ordering brain MRI in patients with chronic primary headaches can be more complex than following an EBP guideline to obtain MRI in patients with focal epilepsy.
Type of studies
Early implementation studies were primarily observational studies of the determinants of implementation. Over time, IS did not remain limited to descriptive assessment of barriers to adopting EBPs and evolved toward specifying and testing optimal implementation strategies for the uptake of improvement initiatives. Among intervention studies, quasi-experimental designs can estimate the effect of an intervention despite the lack of randomization; examples are pre-post designs, interrupted time series, and stepped-wedge designs (in which all participants receive the intervention, but in a staggered fashion). (63) Randomized studies are also performed in IS; however, these are distinct from commonly performed efficacy and effectiveness studies. In contrast to efficacy and effectiveness trials (which primarily evaluate the health impact of the innovation), implementation trials evaluate strategies to enhance the adoption, implementation, sustainment, and scaling of evidence-based innovations in clinical practice (Table 1).
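The staggered crossover of a stepped-wedge design can be visualized with a short sketch. The sites, wave assignments, and number of periods below are hypothetical.

```python
# Minimal sketch of a stepped-wedge rollout schedule: every site eventually
# crosses over from control ('.') to intervention ('X'), one wave at a time.
# Sites, waves, and period count are hypothetical.

waves = [["A", "B"], ["C", "D"], ["E", "F"]]  # two sites activate per wave
n_periods = 4                                 # period 0 = all-control baseline

for wave_idx, wave in enumerate(waves):
    for site in wave:
        schedule = ["." if p <= wave_idx else "X" for p in range(n_periods)]
        print(site, " ".join(schedule))
```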
Additionally, controlled trials in implementation research frequently use cluster randomization at the clinic or site level rather than the individual level to mitigate 'contamination' (a provider educated about one intervention may unconsciously apply that knowledge to all patients). (63) Nonetheless, cluster randomization may raise confounding issues and require stratification to confirm similarity in critical variables, as well as sophisticated analytic models to account for the allocation scheme. Besides randomization at the provider, clinic, or organization level, randomized controlled designs for policy evaluation can also guide the distribution of scarce resources.
Another distinction between efficacy/effectiveness studies and IS studies is the frequent simultaneous use of multiple interventions. For multi-component implementation strategies, different combinations of implementation strategies can be tested in factorial or fractional-factorial designs. A variant of this design is the sequential, multiple-assignment randomized trial, which determines optimal sequences of strategies to maximize downstream clinical outcomes.
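In a full factorial design, each on/off combination of strategy components defines a study arm; the sketch below enumerates the arms for three hypothetical components.

```python
# Toy enumeration of arms in a 2x2x2 factorial design for a multi-component
# implementation strategy. Component names are hypothetical.

from itertools import product

components = ["audit_feedback", "external_facilitation", "provider_education"]

for arm, switches in enumerate(product([False, True], repeat=len(components)), 1):
    active = [c for c, on in zip(components, switches) if on]
    print(f"Arm {arm}: {' + '.join(active) if active else 'usual care'}")
```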
Although the efficacy-effectiveness-implementation spectrum may suggest a unidirectional flow, with IS coming into play late in the process, the flow should be much more iterative: experience gained from implementation research should guide efficacy and effectiveness studies to increase external validity without losing the fidelity of the clinical innovation. There is a growing push to integrate IS methodologies into clinical trial design. Hybrid effectiveness-implementation trials evaluate two different outcomes: the effectiveness of the intervention at the patient level and the effectiveness of implementation strategies in driving adoption by patients, providers, systems of care, and the community. (64) Hybrid type 1 studies explicitly collect data on the implementation process (which may be used subsequently in a later implementation trial) but primarily evaluate the effectiveness, or health outcome, of the EBP; hybrid type 2 studies place roughly equal emphasis on both. In contrast, hybrid type 3 studies focus primarily on implementation outcomes while also collecting effectiveness outcomes related to the uptake or fidelity of the intervention.
A real-world example of an implementation science study involving brain disorders
Patients with transient ischemic attack (TIA) are at high risk of recurrent vascular events, but quality care can reduce the risk of future events by 80%. (65) Despite solid scientific evidence, acute TIA care has been inadequately delivered. (66) One way to measure acute TIA care is to calculate the without-fail rate (WFR): the proportion of patients with TIA who received all of the care processes for which they were eligible, among seven processes (brain imaging, carotid artery imaging, neurology consultation, hypertension control, anticoagulation for atrial fibrillation, antithrombotics, and high/moderate-potency statins). (67) Unfortunately, the national average WFR in 2017 was only 34.3%. (67)
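The WFR is an all-or-none measure: a patient counts as a 'pass' only if every eligible process was received. The sketch below makes this concrete with invented patient data; the eligibility sets vary per patient (for example, anticoagulation applies only to patients with atrial fibrillation).

```python
# Toy without-fail rate (WFR) calculation: the share of patients who received
# *all* care processes for which they were eligible. Patient data are invented.

patients = [
    {"eligible": {"brain_imaging", "neuro_consult", "statin"},
     "received": {"brain_imaging", "neuro_consult", "statin"}},   # pass
    {"eligible": {"brain_imaging", "anticoagulation", "statin"},
     "received": {"brain_imaging", "statin"}},                    # fail
    {"eligible": {"brain_imaging", "carotid_imaging"},
     "received": {"brain_imaging", "carotid_imaging"}},           # pass
]

passed = sum(p["eligible"] <= p["received"] for p in patients)  # subset test
print(f"WFR: {passed / len(patients):.1%}")  # 66.7% on this toy data
```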
A recent study, the PREVENT (Protocol-Guided Rapid Evaluation of Veterans Experiencing New Transient Neurological Symptoms) program, implemented a locally adapted, multi-component QI intervention to improve the quality of TIA care. (67, 68) The study started with the collection of baseline data and detailed qualitative interviews among stakeholders. Data were also collected through observation and field notes and by the use of a structured electronic log/template system for rapid and systematic extraction of key concepts across various sources. Interview transcripts were coded using four Consolidated Framework for Implementation Research (CFIR) constructs (goals and feedback; planning; reflecting and evaluating; and champions) to determine the presence and influence of the constructs toward successful implementation. Barriers to providing high-quality acute TIA care were identified, including knowledge gaps, lack of performance data, lack of QI experience, inadequate care coordination, and insufficient information-technology support. (69) Intervention component mapping was done with multi-component strategies (a strategy bundle) by utilizing the CFIR constructs. The core strategies were: utilizing an interactive web-based system (the HUB) for facility-specific data audit and feedback; reflecting on and evaluating existing practices; planning and goal setting by the individual teams; using external facilitation by a nurse trained in Six Sigma methodology; building a community of practice through a collaborative monthly joint conference among all the teams for cross-conversation and discussion; establishing clinical programs, including clinical pathways and educational materials; and accelerating professional education. Implementation outcomes were the facility team's number of completed implementation activities, the level of team organization and cohesion measured by the Group Organization score, and a ≥15-point improvement in the WFR over the one year of implementation.
A stepped-wedge trial was conducted in 6 geographically diverse Veterans Affairs (VA) healthcare systems in three waves, with activation of 2 facilities per wave. (67) Site members used the HUB to identify barriers to providing quality TIA care, brainstorm solutions to tackle these barriers, prioritize and rank solutions on an impact-effort matrix, and develop a site-specific plan of action for short- and long-term goals. Interactions of the external facilitator with a site were categorized as education, quality monitoring, planning, or networking. Implementation success was attributed to an effective champion and the champion's effective engagement with the team in planning, goal setting, reflection, and evaluation of progress against the original plan. Among the 6 PREVENT sites, the WFR improved by 17.3 percentage points [from 36.7% (58 of 158 patients) at baseline to 54.0% (95 of 176 patients); OR = 2.10; 95% CI, 1.27–3.48; P = 0.004] during the 1-year implementation period. A further nonrandomized cluster trial compared these six centers with 36 matched control sites. (70) Compared with the 17.3-point absolute improvement among the 6 PREVENT sites, the WFR at the control sites improved by only 3.2 points over one year [from 38.6% (345 of 893 patients) at baseline to 41.8% (363 of 869 patients)]. This difference suggests that the successful implementation of the multifaceted strategies employed in the PREVENT program was associated with improved quality of TIA care compared with the control sites.
The benefits of Implementation Science methodologies in the field of child neurology are also gradually emerging. For example, a cost-effective, combined surgical approach (endoscopic third ventriculostomy and choroid plexus cauterization) to treat hydrocephalus in infancy was first noted to be effective in Uganda, with the potential to decrease the morbidity and mortality associated with lifetime shunt dependency. (71) The same technique is now being implemented in several US centers using Implementation Science methodologies to transfer the new insight to a completely different setting. (72)
Current status of Implementation Science in brain disorders
Owing to increasing public awareness of neurological disorders and advocacy efforts, the United States declared 1990–1999 the Decade of the Brain "to enhance public awareness of the benefits to be derived from brain research." (73) Nevertheless, the translation of neurologic research into consistent, high-quality, evidence-based neurologic care remains elusive. (74, 75) Moreover, there are substantial differences in the quality of care depending on socioeconomic and racial/ethnic status. (76) Only in the last ten years has greater attention been given to promoting Implementation Science in brain disorders, both for assessing the impact of existing evidence-based studies and for designing research studies to increase uptake of EBPs in real-world settings. In 2009, the National Institute of Neurological Disorders and Stroke convened a workshop to address the knowledge and training gaps in comparative effectiveness and implementation research; at the workshop, other National Institutes of Health (NIH) institutes presented their priority-setting mechanisms in this field of research. (77) Additionally, the Fogarty International Center provides funding opportunities to promote implementation research on neurological disorders in low- and middle-income countries. A pioneering program in IS, the VA's Quality Enhancement Research Initiative (QUERI), also studies brain disorders (stroke, epilepsy, and traumatic brain injury). Besides federal agencies, other non-profit organizations, such as the Patient-Centered Outcomes Research Institute, have emphasized the use of dissemination and implementation frameworks at the point of research topic selection (long before the research is conducted and evidence is ready to be shared). (78, 79)
Besides the progressively greater research priority given to IS, initiatives to increase awareness, education, training, and publication opportunities for non-specialists are also gradually maturing. Although more funding options are needed, some grant opportunities to pursue IS research are already available to neurology researchers, such as Clinical and Translational Science Awards, NIH career development awards, and VA health services research fellowships. Interested researchers can also benefit from the well-developed resources, programs, and training/workforce-development efforts of the consortium of NIH-funded Clinical and Translational Science Award institutions. (80) The Fogarty International Center's Learning Collaborative for Implementation Science in Global Brain Disorders provides training opportunities as part of the Global Brain Disorders Research Program's network meetings to increase grantees' capacity to conduct global implementation research. (81)
Besides formal training, a simple teaching tool (a single slide) is available that uses jargon-free, straightforward language to help new learners grasp vital definitions and concepts in IS and its relation to traditional effectiveness research. (82) There are also a dedicated conference (the annual Conference on the Science of Dissemination and Implementation in Health, co-hosted by the NIH and AcademyHealth) and journals (such as Implementation Science) in which to present and publish articles reporting not only the final results of implementation work but also the critical contextual, developmental, and supporting work necessary for the generation of generalizable knowledge. (83, 84)
Conclusion
Neurological disorders are the leading cause of disability and the second leading cause of death globally. To address this enormous disease burden, neurology researchers are pursuing sophisticated big-data analytics to develop new prevention and treatment strategies as part of the Learning Healthcare System. However, without the effective use of IS, we may fail to capitalize on these interventions to change practice, provide high-value healthcare, and make progress in reducing the burden of neurological diseases.
Acknowledgments
Disclosure: This work was supported by the Translational Research Institute (TRI), UL1 TR003107, through the National Center for Advancing Translational Sciences of the National Institutes of Health (NIH). The views expressed in this paper are those of the authors and do not necessarily reflect the position or policy of the United States Department of Veterans Affairs (VA), Veterans Health Administration (VHA), or the United States Government.
Footnotes
Conflict: The authors have no conflicts of interest to disclose.
Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
References
- 1.Røttingen J, Regmi S, Eide M, Young AJ, Viergever RF, Årdal C, et al. Mapping of available health research and development data: What’s there, what’s missing, and what role is there for a global observatory? The Lancet. 2013;382(9900):1286–307. [DOI] [PubMed] [Google Scholar]
- 2.Chalmers I, Bracken MB, Djulbegovic B, Garattini S, Grant J, Gülmezoglu AM, et al. How to increase value and reduce waste when research priorities are set. The Lancet. 2014;383(9912):156–65. [DOI] [PubMed] [Google Scholar]
- 3.Contopoulos-Ioannidis DG, Alexiou GA, Gouvias TC, Ioannidis JP. Life cycle of translational research for medical interventions. Science. 2008;321(5894):1298–9. [DOI] [PubMed] [Google Scholar]
- 4.Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearbook of Medical Informatics. 2000. [PubMed] [Google Scholar]
- 5.Schuster MA, McGlynn EA, Brook RH. How good is the quality of health care in the united states? Milbank Q. 2005;83(4):843. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Levine DM, Linder JA, Landon BE. The quality of outpatient care delivered to adults in the united states, 2002 to 2013. JAMA internal medicine. 2016;176(12):1778–90. [DOI] [PubMed] [Google Scholar]
- 7.Fountain NB, Van Ness PC, Swain-Eng R, Tonn S, Bever CT. Quality improvement in neurology: AAN epilepsy quality measures: Report of the quality measurement and reporting subcommittee of the american academy of neurology. Neurology. 2011;76(1):94–9. [DOI] [PubMed] [Google Scholar]
- 8.Wicks P, Fountain NB. Patient assessment of physician performance of epilepsy quality-of-care measures. Neurology: Clinical Practice. 2012;2(4):335–42. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.The black dog in your waiting room: Screening for depression in people with epilepsy [homepage on the Internet]. [cited 1/29/2021]. Available from: https://www.ilae.org/journals/epigraph/epigraph-vol-21-issue-4-fall-2019/the-black-dog-in-your-waiting-room-screening-for-depression-in-people-with-epilepsy
- 10.Harden C, Tomson T, Gloss D, Buchhalter J, Cross JH, Donner E, et al. Practice guideline summary: Sudden unexpected death in epilepsy incidence rates and risk factors: Report of the guideline development, dissemination, and implementation subcommittee of the american academy of neurology and the american epilepsy society. Epilepsy currents. 2017;17(3):180–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Xu Z, Ayyappan S, Seneviratne U. Sudden unexpected death in epilepsy (SUDEP): What do patients think? Epilepsy & Behavior. 2015;42:29–34. [DOI] [PubMed] [Google Scholar]
- 12.Glauser T, Shinnar S, Gloss D, Alldredge B, Arya R, Bainbridge J, et al. Evidence-based guideline: Treatment of convulsive status epilepticus in children and adults: Report of the guideline committee of the american epilepsy society. Epilepsy currents. 2016;16(1):48–61. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Gaínza-Lein M, Fernández IS, Jackson M, Abend NS, Arya R, Brenton JN, et al. Association of time to treatment with short-term outcomes for pediatric patients with refractory convulsive status epilepticus. JAMA neurology. 2018;75(4):410–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Berg AT, Langfitt J, Shinnar S, Vickrey BG, Sperling MR, Walczak T, et al. How long does it take for partial epilepsy to become intractable? Neurology. 2003;60(2):186–90. [DOI] [PubMed] [Google Scholar]
- 15.Eccles MP, Mittman BS. Welcome to implementation science. Implementation Science. 2006;1(1):1. [Google Scholar]
- 16.Lane-Fall MB, Curran GM, Beidas RS. Scoping implementation science for the beginner: Locating yourself on the “subway line” of translational research. BMC medical research methodology. 2019;19(1):133. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC psychology. 2015;3(1):32. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Lane-Fall MB, Fleisher LA. Quality improvement and implementation science: Different fields with aligned goals. Anesthesiology Clinics. 2018;36(1):xiii–xv. [Google Scholar]
- 19.Dearing JW, Kee KF. Historical roots of dissemination and implementation science. Dissemination and implementation research in health: Translating science to practice. 2012;55:71. [Google Scholar]
- 20.McGrath C, Zell D. The future of innovation diffusion research and its implications for management: A conversation with everett rogers. Journal of Management Inquiry. 2001;10(4):386–91. [Google Scholar]
- 21.Demakis JG, McQueen L, Kizer KW, Feussner JR. Quality enhancement research initiative (QUERI): A collaboration between research and clinical practice. Med Care. 2000:I17–25. [PubMed] [Google Scholar]
- 22.Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National institutes of health approaches to dissemination and implementation science: Current and future directions. Am J Public Health. 2012;102(7):1274–81. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Straus SE, Brouwers M, Johnson D, Lavis JN, Légaré F, Majumdar SR, et al. Core competencies in the science and practice of knowledge translation: Description of a canadian strategic training initiative. Implementation Science. 2011;6(1):127. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Sturke R, Vorkoper S, Bekker L, Ameyan W, Luo C, Allison S, et al. Fostering successful and sustainable collaborations to advance implementation science: The adolescent HIV prevention and treatment implementation science alliance. Journal of the International AIDS Society. 2020;23:e25572. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Engelgau MM, Rosenthal JP, Newsome BJ, Price L, Belis D, Mensah GA. Noncommunicable diseases in low-and middle-income countries: A strategic approach to develop a global implementation research workforce. Global heart. 2018;13(2):131–7. [DOI] [PubMed] [Google Scholar]
- 26.Fisher ES, Shortell SM, Savitz LA. Implementation science: A potential catalyst for delivery system reform. JAMA. 2016;315(4):339–40. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Nilsen P Making sense of implementation theories, models and frameworks. Implementation science. 2015;10(1):53. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Newhouse R, Bobay K, Dykes PC, Stevens KR, Titler M. Methodology issues in implementation science. Med Care. 2013:S32–40. [DOI] [PubMed] [Google Scholar]
- 29.Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: The RE-AIM framework. Am J Public Health. 1999;89(9):1322–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: Theoretical and practical challenges. Implementation science. 2008;3(1):1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation science. 2009;4(1):1–15. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Ajzen I The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211. [Google Scholar]
- 33.Crosby R, Noar SM. What is a planning model? an introduction to PRECEDE- PROCEED. J Public Health Dent. 2011;71:S7–S15. [DOI] [PubMed] [Google Scholar]
- 34.Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: Addressing the paradox of sustainment amid ongoing change. Implementation Science. 2013;8(1):117. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Rapport F, Shih P, Faris M, Nikpour A, Herkes G, Bleasel A, et al. Determinants of health and wellbeing in refractory epilepsy and surgery: The patient reported, ImpleMentation sciEnce (PRIME) model. Epilepsy & Behavior. 2019;92:79–89. [DOI] [PubMed] [Google Scholar]
- 36.Burgers JS, van Everdingen JJ. Evidence-based guideline development in the netherlands: The EBRO platform. Ned Tijdschr Geneeskd. 2004;148(42):2057–9. [PubMed] [Google Scholar]
- 37.AAN policy and guidelines [homepage on the Internet]. [cited 1/15/2021]. Available from: https://www.aan.com/policy-and-guidelines/guidelines/.
- 38.Clinical guidance. american epilepsy society [homepage on the Internet]. [cited 1/15/2021]. Available from: https://www.aesnet.org/clinical_resources/guidelines.
- 39.Practice parameters. CNS [homepage on the Internet]. [cited 1/15/2021]. Available from: https://www.childneurologysociety.org/resources/practice-parameters.
- 40.Grol R, Wensing M. Characteristics of successful innovations. Improving Patient Care: The Implementation of Change in Health Care. 2020:87–102. [Google Scholar]
- 41.Stelfox HT, Straus SE. Measuring quality of care: Considering measurement frameworks and needs assessment to guide quality indicator development. J Clin Epidemiol. 2013;66(12):1320–7. [DOI] [PubMed] [Google Scholar]
- 42.Mainz J. Defining and classifying clinical indicators for quality improvement. International Journal for Quality in Health Care. 2003;15(6):523–30. [DOI] [PubMed] [Google Scholar]
- 43.Hamilton S, McLaren S, Mulhall A. Assessing organisational readiness for change: Use of diagnostic analysis prior to the implementation of a multidisciplinary assessment for acute stroke care. Implementation Science. 2007;2(1):21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Samanta D Improving management of infantile spasms by adopting implementation science. Neuropediatrics. 2020. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Res. 2019;280:112516. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Smith JD, Hasan M. Quantitative approaches for the evaluation of implementation research studies. Psychiatry Res. 2020;283:112521. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying and reporting. Implementation Science. 2013;8(1):1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Harvey G, Kitson A. Translating evidence into healthcare policy and practice: Single versus multi-faceted implementation strategies–is there a simple answer to a complex question? International journal of health policy and management. 2015;4(3):123. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Kirchner JE, Smith JL, Powell BJ, Waltz TJ, Proctor EK. Getting a clinical innovation into practice: An introduction to implementation strategies. Psychiatry Res. 2020;283:112467. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50.Stetler CB, Legro MW, Wallace CM, Bowman C, Guihan M, Hagedorn H, et al. The role of formative evaluation in implementation research and the QUERI experience. Journal of general internal medicine. 2006;21(2):S1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: Results from the expert recommendations for implementing change (ERIC) project. Implementation Science. 2015;10(1):21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.USVA QUERI (US department of veterans affairs; quality enhancement research initiative, QUERI).[homepage on the Internet]. [cited 1/15/2021]. Available from: http://www.queri.research.va.gov/implementation/default.cfm.
- 53.Schmid AA, Andersen J, Kent T, Williams LS, Damush TM. Using intervention mapping to develop and adapt a secondary stroke prevention program in veterans health administration medical centers. Implementation Science. 2010;5(1):97. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 54.Forsetlund L, Bjørndal A, Rashidian A, Jamtvedt G, O’Brien MA, Wolf FM, et al. Continuing education meetings and workshops: Effects on professional practice and health care outcomes. Cochrane database of systematic reviews. 2009(2). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Audit and feedback: Effects on professional practice and health care outcomes. The Cochrane database of systematic reviews. 2006(2):CD000259. [DOI] [PubMed] [Google Scholar]
- 56.O’Brien MA, Oxman AD, Davis DA, Haynes RB, Freemantle N, Harvey EL. Educational outreach visits: Effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews. 2001;1. [DOI] [PubMed] [Google Scholar]
- 57.Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying and reporting. Implementation Science. 2013;8(1):1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Grol R, Wensing M, Eccles M, Davis D. Improving patient care: The implementation of change in health care. John Wiley & Sons; 2013. [Google Scholar]
- 59.Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, et al. Role of” external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the veterans health administration. Implementation Science. 2006;1(1):23. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(2):65–76. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Prasad V, Ioannidis JP. Evidence-based de-implementation for contradicted, unproven, and aspiring healthcare practices. Implementation Science. 2014;9:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Norton WE, Chambers DA. Unpacking the complexities of de-implementing inappropriate health interventions. Implementation Science. 2020;15(1):1–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Miller CJ, Smith SN, Pugatch M. Experimental and quasi-experimental designs in implementation research. Psychiatry Res. 2020;283:112452. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Landes SJ, McBain SA, Curran GM. Reprint of: An introduction to effectiveness-implementation hybrid designs. Psychiatry Res. 2020;283:112630. [DOI] [PubMed] [Google Scholar]
- 65.Hackam DG, Spence JD. Combining multiple approaches for the secondary prevention of vascular events after stroke: A quantitative modeling study. Stroke. 2007;38(6):1881–5. [DOI] [PubMed] [Google Scholar]
- 66.O’Brien EC, Zhao X, Fonarow GC, Schulte PJ, Dai D, Smith EE, et al. Quality of care and ischemic stroke risk after hospitalization for transient ischemic attack: Findings from get with the guidelines-stroke. Circulation: Cardiovascular Quality and Outcomes. 2015;8(6_suppl_3):S117–24. [DOI] [PubMed] [Google Scholar]
- 67.Damush TM, Miech EJ, Rattray NA, Homoya B, Penney LS, Cheatham A, et al. Implementation evaluation of a complex intervention to improve timeliness of care for veterans with transient ischemic attack. Journal of General Internal Medicine. 2020:1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 68.Bravata DM, Myers LJ, Homoya B, Miech EJ, Rattray NA, Perkins AJ, et al. The protocol-guided rapid evaluation of veterans experiencing new transient neurological symptoms (PREVENT) quality improvement program: Rationale and methods. BMC neurology. 2019;19(1):1–12. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.Damush TM, Miech EJ, Sico JJ, Phipps MS, Arling G, Ferguson J, et al. Barriers and facilitators to provide quality TIA care in the veterans healthcare administration. Neurology. 2017;89(24):2422–30. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70.Bravata DM, Myers LJ, Perkins AJ, Zhang Y, Miech EJ, Rattray NA, et al. Assessment of the protocol-guided rapid evaluation of veterans experiencing new transient neurological symptoms (PREVENT) program for improving quality of care for transient ischemic attack: A nonrandomized cluster trial. JAMA network open. 2020;3(9):e2015920. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Kulkarni AV, Schiff SJ, Mbabazi-Kabachelor E, Mugamba J, Ssenyonga P, Donnelly R, et al. Endoscopic treatment versus shunting for infant hydrocephalus in uganda. N Engl J Med. 2017;377(25):2456–64. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Fogarty-supported research revolutionizes hydrocephalus care [homepage on the Internet]. [cited 5/5/2021]. Availablefrom: https://www.fic.nih.gov/News/GlobalHealthMatters/september-october-2018/Pages/hydrocephalus-care.aspx
- 73.Project on the decade of the brain [homepage on the Internet]. [cited 1/0/2021].
- 74.Samanta D, Singh R, Gedela S, Perry MS, Arya R. Underutilization of epilepsy surgery: Part II: Strategies to overcome barriers. Epilepsy & Behavior. 2021:107853. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 75.Samanta D, Ostendorf AP, Willis E, Singh R, Gedela S, Arya R, et al. Underutilization of epilepsy surgery: Part I: A scoping review of barriers. Epilepsy & Behavior. 2021:107837. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Fiscella K, Franks P, Gold MR, Clancy CM. Inequality in quality: Addressing socioeconomic, racial, and ethnic disparities in health care. JAMA. 2000;283(19):2579–84. [DOI] [PubMed] [Google Scholar]
- 77.Vickrey BG, Hirtz D, Waddy S, Cheng EM, Johnston SC. Comparative effectiveness and implementation research: Directions for neurology. Ann Neurol. 2012;71(6):732–42. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 78.Fogarty international center [homepage on the Internet]. [cited 1/21/2021].
- 79.Esposito D, Heeringa J, Bradley K, Croake S, Kimmey L. PCORI dissemination and implementation framework. Washington, DC: Patient-Centered Outcomes Research Institute. 2015. [Google Scholar]
- 80.Dolor RJ, Proctor E, Stevens KR, Boone LR, Meissner P, Baldwin L. Dissemination and implementation science activities across the clinical translational science award (CTSA) consortium: Report from a survey of CTSA leaders. Journal of clinical and translational science. 2020;4(3):188–94. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 81.Sturke R, Malekzadeh A, Michels K, Glass R. Implementation science for brain disorders. The Lancet Neurology. 2020;19(8):645. [DOI] [PubMed] [Google Scholar]
- 82.Curran GM. Implementation science made too simple: A teaching tool. Implementation Science Communications. 2020;1(1):1–3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Eccles MP, Mittman BS. Welcome to implementation science. Implementation Science. 2006;1(1):1. [Google Scholar]
- 84.10th annual conference on the science of dissemination and implementation in health [homepage on the Internet]. [cited 1/15/2021]. Available from: https://academyhealth.org/events/site/10th-annual-conference-science-dissemination-and-implementation-health. [Google Scholar]

