Abstract
Sickle cell disease (SCD) results in end-organ damage and a shortened lifespan. Both the pathophysiology of the disease and the social determinants of health affect patient outcomes. Randomized controlled trials completed in this population have produced medical advances; however, the long gestation of these advances and their limited penetration into clinical practice have constrained improvements in care for many people with SCD. We discuss the role of implementation science in SCD and highlight the need for this science to shorten the time required to implement evidence-based care for more people with SCD.
Keywords: implementation science, public health, sickle cell disease
1 INTRODUCTION
The National Heart, Lung, and Blood Institute’s (NHLBI) Center for Translation Research and Implementation Science (CTRIS) was formed in 2014 to support research that facilitates the integration of evidence-based interventions into clinical and public health settings. This initiative recognizes that it is not enough for interventions to be proven effective through scientific advances, clinical trials, and expert systematic review panels; they must also be adopted in “real-world” settings.1 CTRIS aims to support research that works toward overcoming health inequities, a manifest challenge for those with sickle cell disease (SCD).
SCD presents numerous challenges as an inherited blood disorder that results in lifelong anemia, severe pain, organ damage, and often premature mortality.2 In addition to the biological component, SCD is compounded by social complexity; SCD predominantly affects African Americans and other underrepresented minorities, and treatment relies heavily on public insurance and healthcare programs.3 Hydroxyurea (HU) is the only agent approved by the U.S. Food and Drug Administration that reduces both acute pain episodes and risk of death.4–6 While the field has made major medical advances, including HU and transcranial Doppler (TCD) screening with prophylactic chronic transfusion for children at high risk of stroke,7 the median age of survival for individuals with SCD remains just 30–40 years, 30–45 years less than the average African American life expectancy.8 Population-based studies demonstrate that 25% of children with SCD in Georgia did not receive recommended pneumococcal vaccinations.9 In a cohort of pediatric patients with SCD on Medicaid in New York, the rate of HU use increased from 25% to 40% over 4 years10; while HU use did improve over the course of the study, the penetration of HU prescription varied by medical center, and prescription rates were suboptimal even at the “best” centers. Data are similarly suboptimal among adults; fewer than 23% of adults with severe sickle cell anemia (SCA; three or more hospitalizations or emergency visits) receive HU.11
SCD lacks an infrastructure for intervention trials. While the National Institutes of Health (NIH) has funded randomized trials in SCD, significant delays occur in implementing new methods of care. For example, the Stroke Prevention Trial in Sickle Cell Anemia (STOP) was published in 1998.7 Yet, when the Silent Cerebral Infarct Transfusion trial began in 2004, several of the academic sites were not performing TCD screening for children with SS or Sβ0-thalassemia between the ages of 2 and 16 years (M.R. DeBaun, personal communication). Seventeen years after STOP was published, only 40% of eligible children received TCD screening12 (Fig. 1). If we know what our patients need, why are they not receiving the quality care that they deserve? More investigation is needed to advance the care of people with SCD and to bridge the gap between research and practice.
FIGURE 1.
The quality gap: The children of today are not receiving the quality of care that they deserve; example of a timeline from completion of a trial to standard care
2 THE QUALITY GAP
An estimated 17 years pass before even 14% of original research comes to benefit patient care.13 Such estimates reflect “leakages” of research at each stage: completion of research, submission for publication, production of guidelines, and implementation of best practices.1,2 By the time guidelines and practices are implemented in usual care, the science is already out of date. One way to perceive this quality gap between research and practice is simply to think about the development of a child: should a youth wait throughout childhood to receive the best quality care available? By current scientific standards, children with SCD are not receiving the best available quality of care. In Figure 2, we present examples of two recent SCD trials14,15 to show that the timeline from efficacy trial to clinical practice can be even longer than 17 years. Implementation science has emerged with the goal of closing this “quality gap” and supporting the transition from research to usual care. This paper summarizes implementation science and relates it to SCD.
FIGURE 2.
Timelines of two randomized controlled trials in sickle cell disease
3 IMPLEMENTATION SCIENCE
Implementation science has emerged as an area of research and has made significant progress in terms of defining frameworks, methodology, and measures,16,17 including a dedicated journal (Implementation Science; http://implementationscience.biomedcentral.com/) and a recent issue of JAMA Pediatrics devoted to the subject.18 Funding agencies including the NIH, the Patient-Centered Outcomes Research Institute, and the Agency for Healthcare Research and Quality have invested in implementation research to accelerate the uptake of effective care and evidence-based practice. We use implementation science to refer to the field of integrating research into practice and/or policy.19 The NIH defines implementation research as “the scientific study of the use of strategies to adopt and integrate evidence-based health interventions into clinical and community settings in order to improve patient outcomes and benefit population health” (PAR-16-238). Other terms, such as knowledge translation, have also been used to describe the process of translating knowledge from research to practice.20
3.1 SCD and the need for implementation science
Since early landmark papers,21,22 funding for SCD has increased; however, much work remains.23 One lesson learned from other fields is that, while effectiveness and efficacy studies are extremely important, they are not sufficient to improve quality of care. There are numerous drawbacks to a stepwise research progression through efficacy, then effectiveness, and finally implementation science studies.24–26
Distinguishing between clinical studies and implementation science studies will clarify how implementation science may improve quality of life for those with SCD. The definitions of clinical trials and implementation studies must be made explicit, because similar terms are often used with ambiguous meanings.27 According to Flay: “Whereas efficacy trials are concerned with testing whether a treatment or procedure does more good than harm when delivered under optimum conditions, effectiveness trials are concerned with testing whether a treatment does more good than harm when delivered via a real-world program.”28 Clinical studies aim to evaluate the efficacy or effectiveness of a specific intervention. Eldh and colleagues27 define clinical interventions as intentional activities designed to produce a health-related outcome (e.g., HU helps reduce painful episodes and complications for patients with SCD), the “what” to be implemented.
Implementation studies require a paradigm shift.29 They extend efficacy and effectiveness research, which focuses on discovering what works, toward understanding how implementation works in specific contexts.30 Implementation designs can be uncontrolled (e.g., observing variations in program delivery) or controlled (e.g., testing different approaches to disseminating an efficacious technology).28 Implementation studies aim to change behaviors or settings at the organizational, practitioner, or patient level31 with the goal of enhancing the adoption of an intervention or guideline. A theoretical foundation is important to guide the implementation process and helps to separate variables that affect clinical outcomes (e.g., the effectiveness of HU) from implementation variables (e.g., the feasibility of an automated application to support prescription of HU by doctors).
The importance of distinguishing the implementation process from the clinical trial is captured in the apt title of Wandersman and colleagues’ manuscript: “Evidence-based interventions are necessary but not sufficient for achieving outcomes in each setting in a complex world.”32 Often, the assumption is made that if an intervention is proven effective, then its widespread use will improve outcomes.32,33 However, scaling up interventions without clear planning for the “how,” the implementation process, may prove unsuccessful.32 Accordingly, systematic reviews of randomized controlled trials show that interventions tailored to address contextual barriers are more likely to improve professional practice.34
How can the large gap between research and practice be narrowed if an intervention’s effectiveness still needs to be tested? A hybrid design can be used to accelerate the path among efficacy, effectiveness, and implementation studies.24 The three types of hybrid design vary in the relative emphasis placed on clinical effectiveness versus implementation aims within a single study, as well as in unit of analysis, unit of randomization, outcome measures, and targets of the tested interventions.24 Table 1 summarizes the main differences among the three types of hybrid design using SCD as an example.
TABLE 1.
Summary of key characteristics from different types of hybrid designs, adapted from Curran et al.,24 with examples from SCD field
| Study characteristic | Hybrid trial type 1 | Hybrid trial type 2 | Hybrid trial type 3 |
| --- | --- | --- | --- |
| Research aims | Primary aim: determine the effectiveness of an intervention. Secondary aim: understand the context for implementation | Co-aims: determine the effectiveness of an intervention and evaluate implementation outcomes of an intervention/implementation strategy | Primary aim: evaluate implementation outcomes of an intervention/implementation strategy (e.g., the acceptability, feasibility, or cost of an intervention). Secondary aim: evaluate clinical outcomes of the intervention |
| Examples of research questions | | | |
| Units of randomization | Patient, clinical unit | See type 1 for the clinical effectiveness aim and type 3 for the implementation aim; if not randomized, it can be a case study | Provider, clinical unit, facility, system |
| Comparison conditions | Placebo, treatment as usual, competing treatment | See type 1 for the clinical effectiveness aim and type 3 for the implementation aim; if not randomized, it can be a case study | Implementation as usual, competing implementation strategy |
| Measures | Primary aim: patient outcomes. Secondary aim: implementation outcomes (e.g., acceptability and feasibility of intervention), barriers, and facilitators to implementation | Clinical effectiveness aim: patient outcomes. Implementation aim: implementation outcomes | Primary aim: implementation outcomes (e.g., adoption of the intervention or guideline). Secondary aim: patient outcomes |
Below are four broad recommendations for investigators.
(1) Examine the quality gap
Guidelines should be developed based on quality standards suggested by the Institute of Medicine.35,36 Implementing guidelines should reduce inappropriate care and improve quality of care.37–39 Examining guideline implementation and adherence in clinical practice is crucial,40 because simply telling people what they need to do is not sufficient to change behavior and support guideline adherence.41,42 Factors that contribute to guideline nonadherence vary and include characteristics of the clinician, how the guidelines were written, and the system in which the guidelines are implemented.36,43
(2) Identify implementation strategy
Once gaps in guideline adherence have been identified, evidence is translated into practice44,45 via dissemination46 and implementation.47 The implementation strategy is the “how”; it is defined as “a systematic intervention to adopt and integrate evidence-based health innovations into usual care.”48 Current taxonomies49 include 72 strategies, ranging from financial (e.g., providing incentives for adoption of the intervention) and organizational infrastructure (e.g., adding equipment to a room) to electronic records (e.g., changing electronic records to better capture outcomes) and education (e.g., training of providers), among others. However, the evidence supporting the use of specific implementation strategies for specific interventions is an area of research that needs further development.50 Testing of implementation strategies should be guided by theories, conceptual models, and/or frameworks to ensure that essential contextual and process elements related to implementation are not overlooked.17,51,52 The choice of strategy and how to implement it should also be based on the identification of barriers and facilitators and on stakeholder input.53,54
(3) Select an implementation theory, conceptual model, and/or framework
From a practical and scientific standpoint, if there is large variability in the “how” of an intervention’s implementation, the effectiveness of the intervention becomes irrelevant. The process of implementation affects the outcomes of a trial.32 A thorough understanding of what an intervention is, what it does, and how it works is necessary for meaningful replication of interventions that were successful in their original context55 (p. 5).
The next question is how outcomes will be achieved in each setting. To achieve the same outcome in each setting, we need to account for contextual variables and select an implementation theory, conceptual model, or framework from the many available in the literature.17 The choice of model depends on the constructs targeted (e.g., change at the level of the system, community, organization, or individual) and should happen at the planning stage of the study. There are five categories of implementation science theories, models, and frameworks:56 (1) process frameworks guide the translation of research into practice (e.g., Pronovost and colleagues’ model,57 the knowledge-to-action model,20 intervention mapping53); (2) determinant frameworks specify domains of determinants that can act as barriers or enablers to implementation (e.g., PARIHS,58,59 the theoretical domains framework,60 the Consolidated Framework for Implementation Research [CFIR]61); (3) classic theories, drawn from fields outside implementation science, provide understanding of aspects of implementation (e.g., the theory of diffusion62); (4) implementation theories, developed within the field, provide understanding of the implementation process (e.g., organizational readiness,63 absorptive capacity64); and (5) evaluation frameworks specify aspects of implementation that can be evaluated (e.g., RE-AIM,65 Proctor’s implementation outcomes66). Additionally, some have proposed models specifically for guideline implementation.54,67 The chosen model should guide the study’s design, aims, methods, and evaluation17 and facilitate the quantification of mediators, moderators, and outcomes of implementation.17,68,69 Models can be adapted, but adaptations should be made with careful consideration, planning, and documentation.70–72
Selecting a framework to guide the study can be a challenge.73 Depending on the study question, one framework may be enough, but in other cases more than one framework is necessary to address the study purposes and conceptual levels.73 A frequent drawback of many currently published implementation science studies is the lack of a clear and explicit rationale for choosing frameworks.74 We hope that studies in SCD will better outline the rationale for their framework of choice.
One example of a formative cross-case qualitative study that provides a good justification for its framework of choice is Keith and colleagues’ study of 21 primary care practices participating in the Comprehensive Primary Care (CPC) initiative.75 The mixed-methods evaluation in this study was based on a well-known implementation framework, the CFIR.61 The CFIR is a conceptual framework with 39 constructs in five domains: intervention characteristics, inner setting, outer setting, characteristics of individuals, and implementation process. The authors used the CFIR to guide their qualitative analysis and to develop a checklist for observing the practices’ inner setting, practice members’ perceptions, each practice’s implementation process, and the outer setting. The CFIR was also used to guide data analysis and inform rapid-cycle evaluation of the implementation of CPC. The CFIR has been used to evaluate other studies,76 including guideline implementation,77 but it is important to note that it is one of many “determinant frameworks”56 and, depending on the research question, may be paired with process frameworks to inform the different stages of the implementation process.
(4) Measure implementation outcomes
Implementation studies must evaluate implementation outcomes. Proctor and colleagues16,66 proposed a taxonomy of implementation outcomes, including acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability. Evaluating implementation outcomes helps scholars disentangle implementation effectiveness from treatment effectiveness and determine, for example, whether an intervention failed because it was ineffective or because it was implemented incorrectly.51,66
The selection of implementation outcomes will depend on the level of analysis, which can be the policy, organization, or provider level. The measurement of these outcomes is still under development, as the field is testing psychometrically valid measures of implementation outcomes.78 There is a large movement to develop evidence-based measures of implementation with predictive validity.79,80 Initiatives that have moved the field forward include the Society for Implementation Research Collaboration (https://societyforimplementationresearchcollaboration.org), a group that has conducted an in-depth systematic review and synthesis of instruments,80,81 and the compilation of tools for qualitative and quantitative measurement of the CFIR (http://cfirguide.org/index.html).
3.2 Implementation study example
Our recently funded implementation study is one of the eight NHLBI-funded cooperative agreements that form the Sickle Cell Disease Implementation Consortium: Using Implementation Science to Optimize Care of Adolescents and Adults with Sickle Cell Disease.82 The consortium comprises eight studies that aim to improve the care of adolescent and adult patients with SCD, each using different frameworks and strategies. Our proposal focuses on maximizing screening for risk of stroke (p. 38 of the NHLBI Expert Panel Report83), cognitive screening, and educational support to improve outcomes of adolescents and young adults with SCD. Below, we share key components of our school-focused SCD study.
3.3 The quality gap exists beyond the “clinical” field of medicine
“Policies that address factors such as education could have a bigger influence on health than all medical advances combined.”84 Within the United States, adults without a high school diploma are likely to die 9 years sooner than college graduates. A 15-year-old male with SCA from a low-income family has a 70% risk of failing a grade in school,85 putting him at risk of dropping out. These statistics are a stark reality for the 100,000 Americans who have SCD. Over 90% of this population is African American,3 60%–70% live at or near poverty (based on Medicaid eligibility),85,86 and all suffer from a chronic disease associated with a high risk of central nervous system (CNS) injury.87
Children with SCD often lack supportive care. Approximately 50% of young children with SCD experience cognitive challenges, with or without cerebral infarction.88 The 2014 Evidence-Based Management guidelines from NHLBI state:
Silent CNS infarcts can present with non-focal signs such as developmental delays or poor or declining school performance in children or changes in social role or work functioning in adults. Throughout their lives, people with SCD should be considered for formal neurocognitive evaluation when assessments reveal any of these concerns.83
Given the prevalence of CNS injury from silent cerebral infarcts and strokes, cognitive impairment, and low educational attainment in the SCD population, a multidisciplinary approach is needed to investigate barriers and implement evidence-based interventions to improve health and educational outcomes. Based on our work with patients, families, and the community and on the existing literature,89–91 we prepared an intervention to promote cognitive screening and implementation of educational support services in order to maximize educational attainment and health of adolescents with SCD.
We hypothesize that better implementation of these recommended cognitive evaluations and translation to well-defined plans for educational support in schools will increase students’ success in educational attainment and promote better health. The long-term goal is to improve health and academic attainment of adolescents and adults with SCD by implementing evidence-based guidelines for neurocognitive testing and intervention.
3.4 The implementation framework
The CFIR61 will be used to capture implementation outcomes. We chose to focus on the characteristics of the intervention, evaluating its evidence strength, quality, and trialability, as well as school staff’s knowledge about the intervention. We will also focus on the inner setting, assessing networks and communication among school staff. Specific to the implementation process, we aim to evaluate how engaging clinical and educational leaders in the intervention shapes knowledge and support from leadership in both the clinical and educational settings, as leadership endorsement is essential for sustainability.92 All individuals involved in patients’ health (patients, families, community members, healthcare providers, educators, and healthcare system representatives) will need to be involved in the study for maximum benefit and sustainability. Our team has experience in engaging multiple levels of systems and community members to address disparities in education and healthcare.93–95 Because we will interact with different systems (hospital, school, families), our study will also use the Interactive Systems Framework96 to guide our implementation process, using the Evidence-Based System for Innovation Support97 to guide the development of our training and toolkits.
3.5 Implementation strategy
Our plan is to implement an evidence-based intervention that addresses current educational practices around SCD at school by maximizing assessments in both the clinic and the school. Implementing TCD screening, magnetic resonance imaging (MRI), and cognitive assessments should be standard care within pediatric SCD clinics. However, we know from peer-reviewed publications and personal correspondence that these standard practices lack 100% penetration.9,12 The path from the clinic to the school is cumbersome. At the clinic level, we can plan automated prompts within an electronic medical record for orders. At the caregiver level, a completed cognitive assessment report is provided, but caregivers are then responsible for delivering the report to the school so that accommodations can be made. Even when a report is given to the school, challenges continue. Most educators are unaware of the cognitive challenges associated with SCD. Educational support for both clinical teams and families is needed to improve these groups’ understanding of, and self-efficacy in adhering to, recommendations for standard care.
Our implementation strategies aim to improve the path from the clinic to the schools. We aim to support school teams in increasing their understanding of the morbidity associated with SCD by teaching the guidelines and screening practices used in clinics. By law, after a formal written request for an Individualized Educational Plan or 504 Plan is presented, the school has 60 days to set a date for the creation of such a plan. The process of reviewing cognitive assessment results and developing supportive interventions will assist students in school, in activities of daily living, and in health.
3.6 Measuring outcomes
Given the many stakeholders involved in the intervention, we planned measures for each target population. For patients, cognitive assessments and educational plans are the primary measures. For the clinical team, the SCD-specific outcomes are TCD screening, brain MRI, and blood transfusion or HU prescription. We will also evaluate leadership, recognizing that clinical leaders in medicine and nursing set the priority for following quality evidence-based guidelines. In the school setting, our team will assess leadership because principals, counselors, and lead teachers set the tone and expectations. The feasibility and acceptability of teacher training will be key to understanding the needs of students with SCD and to training future educators. Without assessing this process and the beliefs, understanding, and resources of school personnel and leadership, these methods of supporting students with SCD are unlikely to be instituted, and the students will be even less likely to succeed.
4 CONCLUSION
Complementing SCD research with implementation science can only enhance the impact of care. As Handley and colleagues state, “A key reason for the persistent gaps between evidence and practice across all areas of medicine is that there have been few attempts to identify the target factors critical for successful implementation of an evidence-based intervention.”98 The successful uptake of interventions will occur only with the use of implementation frameworks, the evaluation of implementation strategies, and the examination of implementation outcomes.
Our project involves the clinic and the school, as strategies to support learning will inevitably assist with healthy behaviors, including adherence to SCD-related medications, appointments, and recommendations. However, even interventions purely in the clinical setting benefit from implementation science. Unless we can deliver evidence-based medicine to people with SCD, it is challenging to ask funders to continue to invest in bench-to-bedside advancements. Transdisciplinary science that includes basic researchers for discovery, translational researchers to transfer basic findings to clinical application, and clinical researchers to prove efficacy is imperative to advancing care of people with SCD. This relatively new emphasis on implementation science will be the final variable to increase access to evidence-based care to the broader population of those with SCD.
Abbreviations
- CFIR
Consolidated Framework for Implementation Research
- CNS
central nervous system
- CPC
Comprehensive Primary Care
- CTRIS
Center for Translation Research and Implementation Science
- HU
hydroxyurea
- MRI
magnetic resonance imaging
- NHLBI
National Heart, Lung, and Blood Institute
- NIH
National Institutes of Health
- SCA
sickle cell anemia
- SCD
sickle cell disease
- TCD
transcranial Doppler
Footnotes
CONFLICT OF INTEREST
The authors declare that there is no conflict of interest.
References
- 1.National Heart Lung and Blood Institute. Center for Translation Research and Implementation Science. 2014 https:www.nhlbi.nih.gov/about/org/ctris.
- 2.Kanter J, Kruse-Jarres R. Management of sickle cell disease from childhood through adulthood. Blood Rev. 2013;27(6):279–287. doi: 10.1016/j.blre.2013.09.001. [DOI] [PubMed] [Google Scholar]
- 3.Hassell KL. Population estimates of sickle cell disease in the U.S. Am J Prev Med. 2010;38(4 Suppl):S512–S521. doi: 10.1016/j.amepre.2009.12.022. [DOI] [PubMed] [Google Scholar]
- 4.Charache S, Terrin ML, Moore RD, et al. Effect of hydroxyurea on the frequency of painful crises in sickle cell anemia. Investigators of the Multicenter Study of Hydroxyurea in Sickle Cell Anemia. N Engl J Med. 1995;332(20):1317–1322. doi: 10.1056/NEJM199505183322001. [DOI] [PubMed] [Google Scholar]
- 5.Wang WC, Ware RE, Miller ST, et al. Hydroxycarbamide in very young children with sickle-cell anaemia: a multicentre, randomised, controlled trial (BABY HUG) Lancet. 2011;377(9778):1663–1672. doi: 10.1016/S0140-6736(11)60355-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Steinberg MH, McCarthy WF, Castro O, et al. The risks and benefits of long-term use of hydroxyurea in sickle cell anemia: a 17.5 year follow-up. Am J Hematol. 2010;85(6):403–408. doi: 10.1002/ajh.21699. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Adams RJ, McKie VC, Hsu L, et al. Prevention of a first stroke by transfusions in children with sickle cell anemia and abnormal results on transcranial Doppler ultrasonography. N Engl J Med. 1998;339(1):5–11. doi: 10.1056/NEJM199807023390102. [DOI] [PubMed] [Google Scholar]
- 8.The Henry J. Kaiser Family Foundation. State health facts—life expectancy at birth (in years), by race/ethnicity. 2015 http://kff.org/other/state-indicator/life-expectancy-by-re/
- 9.Neunert CE, Gibson RW, Lane PA, et al. Determining adherence to quality indicators in sickle cell anemia using multiple data sources. Am J Prev Med. 2016;51(1 Suppl 1):S24–S30. doi: 10.1016/j.amepre.2016.02.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Anders DG, Tang F, Ledneva T, et al. Hydroxyurea use in young children with sickle cell anemia in New York state. Am J Prev Med. 2016;51(1 Suppl 1):S31–S38. doi: 10.1016/j.amepre.2016.01.001. [DOI] [PubMed] [Google Scholar]
- 11.Stettler N, McKiernan CM, Melin CQ, Adejoro OO, Walczak NB. Proportion of adults with sickle cell anemia and pain crises receiving hydroxyurea. JAMA. 2015;313(16):1671–1672. doi: 10.1001/jama.2015.3075. [DOI] [PubMed] [Google Scholar]
- 12.Hussain S, Nichols F, Bowman L, Xu H, Neunert C. Implementation of transcranial Doppler ultrasonography screening and primary stroke prevention in urban and rural sickle cell disease populations. Pediatr Blood Cancer. 2015;62(2):219–223. doi: 10.1002/pbc.25306. [DOI] [PubMed] [Google Scholar]
- 13.Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearb Med Inform. 2000;(1):65–70. https://www.schattauer.de/t3page/1214.html?manuscript=26379&L=1. [PubMed]
- 14.DeBaun MR, Gordon M, McKinstry RC, et al. Controlled trial of transfusions for silent cerebral infarcts in sickle cell anemia. N Engl J Med. 2014;371(8):699–710. doi: 10.1056/NEJMoa1401731. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Ware RE, Helms RW. Stroke with transfusions changing to hydroxyurea (SWiTCH) Blood. 2012;119(17):3925–3932. doi: 10.1182/blood-2011-11-392340. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012. [Google Scholar]
- 17.Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–350. doi: 10.1016/j.amepre.2012.05.024. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.JAMA Pediatr. 2015;169(4):297–412. doi: 10.1001/jamapediatrics.2014.2122. https://jamanetwork.com/journals/jamapediatrics/issue/169/4. [DOI] [PubMed] [Google Scholar]
- 19. Rabin BA, Brownson RC. Developing the terminology for dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012. pp. 23–51.
- 20. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24. doi: 10.1002/chp.47.
- 21. Scott RB. Sickle-cell anemia—high prevalence and low priority. N Engl J Med. 1970;282(3):164–165. doi: 10.1056/NEJM197001152820312.
- 22. Smith LA, Oyeku SO, Homer C, Zuckerman B. Sickle cell disease: a question of equity and quality. Pediatrics. 2006;117(5):1763–1770. doi: 10.1542/peds.2005-1611.
- 23. Homer CJ, Oyeku SO. Sickle cell disease: a roadmap for getting to excellence everywhere. Am J Prev Med. 2016;51(1 Suppl 1):S3–S4. doi: 10.1016/j.amepre.2015.10.018.
- 24. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–226. doi: 10.1097/MLR.0b013e3182408812.
- 25. Wells KB. Treatment research at the crossroads: the scientific interface of clinical trials and effectiveness research. Am J Psychiatry. 1999;156(1):5–10. doi: 10.1176/ajp.156.1.5.
- 26. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36(1):24–34. doi: 10.1007/s10488-008-0197-4.
- 27. Eldh AC, Almost J, DeCorby-Watson K, et al. Clinical interventions, implementation interventions, and the potential greyness in between-a discussion paper. BMC Health Serv Res. 2017;17(1):16–25. doi: 10.1186/s12913-016-1958-5.
- 28. Flay BR. Efficacy and effectiveness trials (and other phases of research) in the development of health promotion programs. Prev Med. 1986;15(5):451–474. doi: 10.1016/0091-7435(86)90024-1.
- 29. Landsverk J, Brown CH, Rolls Reutz J, Palinkas L, Horwitz SM. Design elements in implementation research: a structured review of child welfare and child mental health studies. Adm Policy Ment Health. 2011;38(1):54–63. doi: 10.1007/s10488-010-0315-y.
- 30. Damschroder L, Petersen D. Using Implementation Research to Guide Adaptation, Implementation, and Dissemination of Patient-Centered Medical Home Models. Rockville, MD: Agency for Healthcare Research and Quality; February 2013. AHRQ Publication No. 13-0027-EF.
- 31. Grimshaw JM, Thomas RE, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8(6):iii–iv, 1–72. doi: 10.3310/hta8060.
- 32. Wandersman A, Alia K, Cook BS, Hsu LL, Ramaswamy R. Evidence-based interventions are necessary but not sufficient for achieving outcomes in each setting in a complex world: empowerment evaluation, getting to outcomes, and demonstrating accountability. Am J Eval. 2016;37(4):544–561.
- 33. Atkins MS, Rusch D, Mehta TG, Lakind D. Future directions for dissemination and implementation science: aligning ecological theory and public health to close the research to practice gap. J Clin Child Adolesc Psychol. 2016;45(2):215–226. doi: 10.1080/15374416.2015.1050724.
- 34. Baker R, Camosso-Stefinovic J, Gillies C, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010;(3):CD005470. doi: 10.1002/14651858.CD005470.pub2.
- 35. Morton S, Berg A, Levit L, Eden J. Finding What Works in Health Care: Standards for Systematic Reviews. Washington, DC: National Academies Press; 2011.
- 36. Steinberg E, Greenfield S, Wolman DM, Mancher M, Graham R. Clinical Practice Guidelines We Can Trust. Washington, DC: National Academies Press; 2011.
- 37. Armstrong JJ, Goldfarb AM, Instrum RS, MacDermid JC. Improvement evident but still necessary in clinical practice guideline quality: a systematic review. J Clin Epidemiol. 2017;81:13–21. doi: 10.1016/j.jclinepi.2016.08.005.
- 38. Worrall G, Chaulk P, Freake D. The effects of clinical practice guidelines on patient outcomes in primary care: a systematic review. Can Med Assoc J. 1997;156(12):1705–1712.
- 39. Andrews E, Redmond H. A review of clinical guidelines. Br J Surg. 2004;91(8):956–964. doi: 10.1002/bjs.4630.
- 40. Dziedzic KS, Healey EL, Porcheret M, et al. Implementing the NICE osteoarthritis guidelines: a mixed methods study and cluster randomised trial of a model osteoarthritis consultation in primary care—the Management of OsteoArthritis in Consultations (MOSAICS) study protocol. Implement Sci. 2014;9:95–109. doi: 10.1186/s13012-014-0095-y.
- 41. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Qual Saf Health Care. 2006;15(6):433–436. doi: 10.1136/qshc.2006.018549.
- 42. Shiffman RN, Dixon J, Brandt C, et al. The GuideLine Implementability Appraisal (GLIA): development of an instrument to identify obstacles to guideline implementation. BMC Med Inform Decis Mak. 2005;5(1):23–30. doi: 10.1186/1472-6947-5-23.
- 43. Gurses AP, Marsteller JA, Ozok AA, Xiao Y, Owens S, Pronovost PJ. Using an interdisciplinary approach to identify factors that affect clinicians’ compliance with evidence-based guidelines. Crit Care Med. 2010;38:S282–S291. doi: 10.1097/CCM.0b013e3181e69e02.
- 44. Bhushan R, Lebwohl MG, Gottlieb AB, et al. Translating psoriasis guidelines into practice: important gaps revealed. J Am Acad Dermatol. 2016;74(3):544–551. doi: 10.1016/j.jaad.2015.11.045.
- 45. Aakhus E, Granlund I, Odgaard-Jensen J, Oxman AD, Flottorp SA. A tailored intervention to implement guideline recommendations for elderly patients with depression in primary care: a pragmatic cluster randomised trial. Implement Sci. 2016;11:32–46. doi: 10.1186/s13012-016-0397-3.
- 46. Flodgren G, Hall AM, Goulding L, et al. Tools developed and disseminated by guideline producers to promote the uptake of their guidelines. Cochrane Database Syst Rev. 2016;(8):CD010669. doi: 10.1002/14651858.CD010669.pub2.
- 47. Moreno EM, Moriana JA. User involvement in the implementation of clinical guidelines for common mental health disorders: a review and compilation of strategies and resources. Health Res Policy Syst. 2016;14(1):61–68. doi: 10.1186/s12961-016-0135-y.
- 48. Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–157. doi: 10.1177/1077558711430690.
- 49. Waltz TJ, Powell BJ, Chinman MJ, et al. Expert Recommendations for Implementing Change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9:39–50. doi: 10.1186/1748-5908-9-39.
- 50. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139–149. doi: 10.1186/1748-5908-8-139.
- 51. Proctor EK, Powell BJ, Baumann AA, Hamilton AM, Santens RL. Writing implementation research grant proposals: ten key ingredients. Implement Sci. 2012;7:96–108. doi: 10.1186/1748-5908-7-96.
- 52. Baumann AA, Powell BJ, Kohl PL, et al. Cultural adaptation and implementation of evidence-based parent-training: a systematic review and critique of guiding evidence. Children Youth Serv Rev. 2015;53:113–120. doi: 10.1016/j.childyouth.2015.03.025.
- 53. Eldredge LKB, Markham CM, Kok G, Ruiter RA, Parcel GS. Planning Health Promotion Programs: An Intervention Mapping Approach. San Francisco, CA: John Wiley & Sons; 2016.
- 54. Gagliardi AR, Alhabib S. Trends in guideline implementation: a scoping systematic review. Implement Sci. 2015;10:54–64. doi: 10.1186/s13012-015-0247-8.
- 55. Davidoff F, Dixon-Woods M, Leviton L, Michie S. Demystifying theory and its use in improvement. BMJ Qual Saf. 2015;24(3):228–238. doi: 10.1136/bmjqs-2014-003627.
- 56. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53–65. doi: 10.1186/s13012-015-0242-0.
- 57. Pronovost PJ, Berenholtz SM, Needham DM. Translating evidence into practice: a model for large scale knowledge translation. BMJ. 2008;337:a1714. doi: 10.1136/bmj.a1714.
- 58. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7(3):149–158. doi: 10.1136/qshc.7.3.149.
- 59. Rycroft-Malone J. Promoting action on research implementation in health services (PARIHS). In: Rycroft-Malone J, Bucknall T, editors. Models and Frameworks for Implementing Evidence-Based Practice: Linking Evidence to Action. Wiley-Blackwell; 2010. pp. 109–135.
- 60. Michie S, Atkins L, West R. The Behaviour Change Wheel: A Guide to Designing Interventions. London: Silverback Publishing; 2014. https://www.behaviourchangewheel.com/
- 61. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50–64. doi: 10.1186/1748-5908-4-50.
- 62. Rogers EM. Diffusion of Innovations. 5th ed. New York, NY: Free Press; 2003.
- 63. Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4(1):1–9. doi: 10.1186/1748-5908-4-67.
- 64. Zahra SA, George G. Absorptive capacity: a review, reconceptualization, and extension. Acad Manag Rev. 2002;27(2):185–203.
- 65. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327. doi: 10.2105/ajph.89.9.1322.
- 66. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76. doi: 10.1007/s10488-010-0319-7.
- 67. Shekelle P, Woolf S, Grimshaw JM, Schunemann HJ, Eccles MP. Developing clinical practice guidelines: reviewing, reporting, and publishing guidelines; updating guidelines; and the emerging issues of enhancing guideline implementability and accounting for comorbid conditions in guideline development. Implement Sci. 2012;7:62–68. doi: 10.1186/1748-5908-7-62.
- 68. Glasgow RE, Marcus AC, Bull SS, Wilson KM. Disseminating effective cancer screening interventions. Cancer. 2004;101(S5):1239–1250. doi: 10.1002/cncr.20509.
- 69. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am J Prev Med. 2010;38(4):443–456. doi: 10.1016/j.amepre.2009.12.035.
- 70. Cabassa LJ, Baumann AA. A two-way street: bridging implementation science and cultural adaptations of mental health treatments. Implement Sci. 2013;8(1):90–103. doi: 10.1186/1748-5908-8-90.
- 71. Powell BJ, Bosk EA, Wilen JS, Danko CM, Van Scoyoc A, Banman A. Evidence-based programs in “real world” settings: finding the best fit. In: Daro D, Cohn Donnelly A, Huang LA, Powell BJ, editors. Advances in Child Abuse Prevention Knowledge. Switzerland: Springer International AG; 2015. pp. 145–177.
- 72. Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8(1):65–76. doi: 10.1186/1748-5908-8-65.
- 73. Birken SA, Powell BJ, Presseau J, et al. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review. Implement Sci. 2017;12(1):2–15. doi: 10.1186/s13012-016-0534-z.
- 74. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010;5:14–19. doi: 10.1186/1748-5908-5-14.
- 75. Keith RE, Crosson JC, O’Malley AS, Cromp D, Taylor EF. Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement Sci. 2017;12(1):15–26. doi: 10.1186/s13012-017-0550-7.
- 76. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci. 2013;8:51–67. doi: 10.1186/1748-5908-8-51.
- 77. Breimaier HE, Heckemann B, Halfens RJ, Lohrmann C. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice. BMC Nurs. 2015;14:43–51. doi: 10.1186/s12912-015-0088-4.
- 78. Rabin BA, Lewis CC, Norton WE, et al. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:42–50. doi: 10.1186/s13012-016-0401-y.
- 79. Garner BR, Hunter SB, Funk RR, Griffin BA, Godley SH. Toward evidence-based measures of implementation: examining the relationship between implementation outcomes and client outcomes. J Subst Abuse Treat. 2016;67:15–21. doi: 10.1016/j.jsat.2016.04.006.
- 80. Lewis CC, Weiner BJ, Stanick C, Fischer SM. Advancing implementation science through measure development and evaluation: a study protocol. Implement Sci. 2015;10:102–111. doi: 10.1186/s13012-015-0287-0.
- 81. Lewis CC, Stanick CF, Martinez RG, et al. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. Implement Sci. 2015;10:2–19. doi: 10.1186/s13012-014-0193-x.
- 82. National Heart, Lung, and Blood Institute. NHLBI awards grants to help improve health outcomes of teens, adults with sickle cell [press release]. 2016. https://www.nhlbi.nih.gov/news/press-releases/2016/nhlbi-awards-grants-help-improve-health-outcomes-teens-adults-sickle-cell.
- 83. National Heart, Lung, and Blood Institute. Evidence-based management of sickle cell disease: expert panel report. 2014.
- 84. Woolf SH, Johnson RE, Phillips RL Jr, Philipsen M. Giving everyone the health of the educated: an examination of whether social change would save more lives than medical advances. Am J Public Health. 2007;97(4):679–683. doi: 10.2105/AJPH.2005.084848.
- 85. King AA, Rodeghier MJ, Panepinto JA, et al. Silent cerebral infarction, income, and grade retention among students with sickle cell anemia. Am J Hematol. 2014;89(10):E188–E192. doi: 10.1002/ajh.23805.
- 86. Panepinto JA, Owens PL, Mosso AL, Steiner CA, Brousseau DC. Concentration of hospital care for acute sickle cell disease-related visits. Pediatr Blood Cancer. 2012;59(4):685–689. doi: 10.1002/pbc.24028.
- 87. DeBaun MR, Armstrong FD, McKinstry RC, Ware RE, Vichinsky E, Kirkham FJ. Silent cerebral infarcts: a review on a prevalent and progressive cause of neurologic injury in sickle cell anemia. Blood. 2012;119(20):4587–4596. doi: 10.1182/blood-2011-02-272682.
- 88. Drazen CH, Abel R, Gabir M, Farmer G, King AA. Prevalence of developmental delay and contributing factors among children with sickle cell disease. Pediatr Blood Cancer. 2016;63(3):504–510. doi: 10.1002/pbc.25838.
- 89. King AA, Strouse JJ, Rodeghier MJ, et al. Parent education and biologic factors influence on cognition in sickle cell anemia. Am J Hematol. 2014;89(2):162–167. doi: 10.1002/ajh.23604.
- 90. King AA, Tang S, Ferguson FL, DeBaun MR. An education program to increase teacher knowledge about sickle cell disease. J Sch Health. 2005;75(1):11–14. doi: 10.1111/j.1746-1561.2005.tb00003.x.
- 91. King AA, White DA, McKinstry RC, Noetzel M, Debaun MR. A pilot randomized education rehabilitation trial is feasible in sickle cell and strokes. Neurology. 2007;68(23):2008–2011. doi: 10.1212/01.wnl.0000264421.24415.16.
- 92. Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9(1):45–54. doi: 10.1186/1748-5908-9-45.
- 93. Thompson VL, Drake B, James AS, et al. A community coalition to address cancer disparities: transitions, successes and challenges. J Cancer Educ. 2014;30(4):616–622. doi: 10.1007/s13187-014-0746-3.
- 94. James AS, Richardson V, Wang JS, Proctor EK, Colditz GA. Systems intervention to promote colon cancer screening in safety net settings: protocol for a community-based participatory randomized controlled trial. Implement Sci. 2013;8:58–65. doi: 10.1186/1748-5908-8-58.
- 95. Daley CM, Filippi M, James AS, et al. American Indian community leader and provider views of needs and barriers to mammography. J Community Health. 2012;37(2):307–315. doi: 10.1007/s10900-011-9446-7.
- 96. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3–4):171–181. doi: 10.1007/s10464-008-9174-z.
- 97. Wandersman A, Chien VH, Katz J. Toward an evidence-based system for innovation support for implementing innovations with quality: tools, training, technical assistance, and quality assurance/quality improvement. Am J Community Psychol. 2012;50(3–4):445–459. doi: 10.1007/s10464-012-9509-7.
- 98. Handley MA, Gorukanti A, Cattamanchi A. Strategies for implementing implementation science: a methodological overview. Emerg Med J. 2016;33(9):660–664. doi: 10.1136/emermed-2015-205461.