Implementation science is focused on understanding and accelerating the integration of research findings and research-based innovations into everyday practice settings to improve health. Here, we highlight the body of implementation science knowledge needed across the cancer control continuum, discuss selected theories and outcome measures drawn from implementation science, propose a framework for strategic implementation synthesizing the Expert Recommendations for Implementing Change (ERIC)1,2 and the Conceptual Model of Evidence-Based Practice Implementation,3 and illustrate the use of the framework to design an initiative to implement preoperative frailty assessment in older adult oncology patients. We conclude by suggesting resources for training, study design, and research funding, and highlight the challenges to be addressed to enhance the impact of implementation science on patient outcomes in oncology.
Importance of Implementation Science for Practice Transformation in Oncology
Typically, our research efforts conclude when the efficacy of an intervention has been confirmed and accumulated knowledge has been synthesized across a body of evidence. Efficacy studies do not usually provide information about the barriers and enablers of implementation or the effects of individual and organizational context on intervention efficacy. Moreover, studies of efficacy are generally not designed to manualize an intervention for use in routine practice, confirm the nature and extent of intervention adaptation that is permissible while preserving efficacy, or address sustainability of the intervention in routine practice settings. Although cancer care delivery research has been proposed as an emerging research focus to address these gaps,4 cancer care delivery research efforts to date seem insufficiently informed by implementation science theories and methods.5,6
At the same time, quality improvement (QI) efforts are often local, may not be disseminated in a manner that allows the reader to critically appraise the generalizability of the findings, and are typically not designed to isolate the underlying mechanisms of practice change.7 Significant challenges exist when introducing research-tested interventions into care delivery, and in many settings, practice change tends to be opportunistic rather than systematically planned. Multiple examples relevant to cancer control show that, despite a preponderance of evidence of benefit, gaps in care delivery persist. For example, we have empirically supported interventions for colorectal cancer screening, but recent data suggest that only approximately two thirds of people in the United States are up to date on screening and approximately one third do not receive appropriate follow-up testing after a positive screen.8-10 Similarly, human papillomavirus vaccination can prevent up to 95% of cervical cancers, but uptake among adolescents in the United States is well below that of other vaccines.11 As a third example, tobacco use remains the largest contributor to lung cancer, yet effective cessation interventions do not reach all those who might benefit from them.12 For cancer care interventions to continue to bend the curve on morbidity and mortality, the scientific enterprise must go well beyond testing the efficacy and effectiveness of interventions using randomized clinical trials. We need to consider the factors that affect the long-term population benefit of those interventions, such as the proportion of the population that is reached and the extent to which the intervention is adopted, implemented, and sustained in routine practice.13
Many forces are driving the need for practice improvement in oncology, including a burgeoning array of practice guidelines and care pathways14 and greater availability of metrics (eg, patient-reported outcomes and indicators of technical quality and care efficiency) that identify unwarranted practice variation, gaps in care quality, and adverse outcomes.15,16 Contemporary models for delivering and reimbursing cancer care are incentivizing the value of the care delivered rather than just the volume (eg, number of visits or procedures).17 At the same time, the provision of cancer care is complex, in part because it requires multimodal treatment and a team-based interdisciplinary care delivery model and is often distributed across geographically diverse settings. Implementation science offers theory, methods, and insights well suited to address these complexities of contemporary oncology practice.
Implementation science is concerned with three aspects: understanding the most effective techniques to improve the distribution and receipt of evidence (also known as dissemination); effectively incorporating new discoveries into clinical care delivery; and intervening on the determinants (eg, workflow, ease of implementation, feasibility, resource requirements) of successful and failed clinical implementation, thereby improving the quality and effectiveness of care and maximizing the efficiency of practice change in health systems.18,19 Issues of intervention fidelity and sustainability, and the influence of context on health care professional and organizational behaviors, are also central concerns of implementation science. Other terms (eg, dissemination and implementation research, diffusion research, knowledge translation, adoption, improvement science) have been used to describe the set of research activities encompassed by implementation science. Figure 1 maps these terms within an overarching frame of health services research.
Fig 1.
Model depicts relationships among components of implementation science.
Variation and inconsistency in terminology may create confusion, because different terms may be used synonymously or the same term may have different meanings to different researchers. In our efforts at the National Institutes of Health (NIH), we predominantly refer to this area of science as implementation science or dissemination and implementation research. At the National Cancer Institute, we also view studies of health communication that focus on effective ways of making evidence available, understandable, and actionable for both patients and providers as fitting within the scope of dissemination research. We have also seen the growth of improvement science, which emerged from the traditions of studies in the QI space. This we see as a subset of the work within implementation research, because it often relates to generalizable knowledge emerging from efforts to improve local clinical practice. We note a distinction between this work and QI, because traditional QI seeks to solve a local problem with quality of care rather than to derive universal knowledge affecting clinical practice more generally. Although Figure 1 represents our efforts to provide a conceptual map of some of these terms, we recognize distinctions that others may draw. We encourage others, particularly within the cancer care space, to define their terms so that discourse on the priorities and activities in implementation science to reduce the burden of cancer can be based on a shared conceptual foundation.
Theories, Models, and Frameworks for Implementation Science
Theory has a prominent role in implementation science, and to date, > 100 different implementation theories, models, or frameworks have been proposed. This work is summarized in nearly a dozen narrative or systematic reviews.20-25 Theory can be used to inform evaluation of barriers and facilitators, identify stakeholders, guide the selection of implementation strategies, and anticipate and manage implementation failures. Theory also provides a framework to measure the effectiveness and efficiency of implementation and to identify the factors that should be considered to achieve sustainability. Most implementation science theories, models, or frameworks acknowledge that local and national context influences implementation efforts, the characteristics of the evidence-based interventions themselves affect ease of implementation and sustainability, and processes of dissemination and implementation are multistep and multilevel in their complexity.21,26
Some frameworks may perform better than others for specific kinds of implementation efforts.23 For example, an organizing framework of interventions to promote and integrate evidence into health practices, systems, and policies called the AIMD (Aims, Ingredients, Mechanism, Delivery) Framework has been recently derived through consensus. This framework includes the following four key components: strategies and techniques (active ingredients), how they function (causal mechanisms), how they are delivered (mode of delivery), and what they aim to change (intended targets).20,27 This framework may be particularly useful in planning implementation research studies and in shaping population-level implementation strategies. In addition, some theories, models, or frameworks may be more conducive to integration and dialogue across disciplinary boundaries.28
Outcome Measures and Intervention Strategies
Implementation outcomes are the missing piece in the traditional conceptual model of outcomes research, which typically focuses on health interventions and their direct impact on health services outcomes and patient outcomes.29 Implementation outcomes include acceptability, feasibility, uptake and awareness, adoption and penetration, fidelity, costs, and sustainability.30-34 More than 100 measures of implementation outcomes are available,34,35 although most have had limited psychometric testing, and given overlap in the constructs measured, their construct validity may be underdeveloped.36 Some key constructs for implementation have had insufficient implementation outcome measure development.37 Several resources exist to facilitate selection of an implementation outcome measure,31 including an interactive Web site designed to assist in selecting an implementation model and associated outcome measures.38
There has been an increasing number of empirical investigations of strategies to improve the implementation of evidence-based cancer control interventions39-41 and evidence-based interventions across other areas of health care.42-44 Strategies that have been tested include provider training and supervision interventions, guidelines and care pathways, audit and feedback, organizational change strategies, learning collaboratives, and the use of technologic solutions such as clinician decision support and reminders or alerts to improve delivery of evidence-based practices.42 Importantly, most of these intervention strategies have had only limited testing in the context of cancer care delivery. The optimal method for delivering, targeting, and tailoring these interventions on the basis of practitioner characteristics, organizational context, and setting has had comparatively little systematic study.45,46 Similarly, little is known with respect to the active ingredients or mechanism of action of the strategies that have been shown to favorably affect provider behaviors and delivery of evidence-based care.27,47
The ERIC initiative identified 73 implementation strategies grouped into six clusters.1,2,48 ERIC provides a menu of strategies that allows researchers to compare and prioritize intervention options. Implementation strategies vary in complexity and may be single or multicomponent interventions. Typically, implementation strategies are multilevel and target the individual, team, organizational, and system levels.29
Various implementation strategies may be particularly fitting and timely at different phases of the change process.3 To guide the selection of implementation strategies for a given implementation need, we propose a Strategic Implementation Framework derived from the literature. As shown in Figure 2, the framework arranges implementation strategies along the continuum of change beginning with setting the stage and moving to active implementation and then to monitoring, supporting, and sustaining change. There is interplay between the strategies proposed for each stage such that components of active implementation, such as adjusting workflow, revising team structures, or conducting cyclical small tests of change, may occur alongside efforts to set the stage for change (eg, identify barriers and resources, build coalitions, and acquire resources). Similarly, active implementation may also include strategies that will ensure that evidence-based practices and improvements in care delivery are supported and sustained (eg, performance metrics, reward practices, policies, and disincentives).
Fig 2.
Strategic Implementation Framework. HIT, health information technology. Adapted from Waltz et al2 and Aarons et al,3 with permission.
To illustrate the use of the Strategic Implementation Framework, consider the problem of implementing preoperative frailty assessment in older adult oncology patients. Recognizing that frailty is the most important predictor of surgical outcomes, the American College of Surgeons recommended in 2012 that preoperative frailty assessment be incorporated into the care of older adults undergoing surgical treatment of cancer.49 Practice guidelines to evaluate and optimize components of frailty have been articulated.50-53 Both the National Comprehensive Cancer Network and the International Society of Geriatric Oncology recommend geriatric assessment to determine the best treatment of older adult patients with cancer.54-56 However, a recent survey of surgeons indicates that half of respondents do not consider preoperative frailty assessment to be mandatory, < 10% use available tools and practice guidelines for frailty assessment and optimization, and < 5% collaborate with geriatric specialty care.57 On the basis of these findings and guided by the Strategic Implementation Framework, implementation efforts should focus on identifying barriers and facilitators, as well as identifying champions and early adopters. Continued activities to share knowledge to prepare organizations and build coalitions are also critical. Simultaneously, these findings suggest the need to develop educational materials and training to support the use of established tools and guidelines and, importantly, the need to examine and revise team roles and structures, and to accommodate inclusion of geriatric frailty assessments in surgical team workflows. Conducting cyclical small tests of changes in workflow and roles will allow teams to build on their successes and make necessary adjustments. Establishment of learning collaboratives and the inclusion of reminders and alerts will also be helpful in monitoring, supporting, and sustaining these practice changes.
NIH Efforts to Encourage Implementation Science
For the past 12 years, the NIH has issued a series of solicitations for implementation science applications designed to stimulate research that improves our understanding of the optimal integration of evidence and effective interventions into the delivery of health care services. Sixteen of the constituent institutes and centers at the NIH participate in these funding opportunity announcements, which represent the NIH’s mechanism for jointly supporting studies that cut across health issue, stage of illness, and care setting. More than 150 projects have been funded through these trans-NIH funding opportunity announcements on dissemination and implementation research,58 which prioritize research on both improved uptake of evidence-based interventions and the de-implementation of practices deemed to be unnecessary, lacking an evidence base, or harmful. The most recent announcements have called for increased emphasis on the scale-up and spread of effective interventions, an improved understanding of factors supporting sustainability of effective health care practices, and studies that scaffold multiple evidence-based interventions to create an evidence-based system of care. There is also greater focus on building a mechanistic understanding of dissemination and implementation processes and on identifying targets for implementation and testable hypotheses as to whether these targets, when engaged, lead to higher rates of intervention uptake. Thus, the NIH is moving toward a more precise, explanatory, and contextualized approach, focused on determining the active component(s) in an implementation strategy, the mechanism(s) by which that strategy produces the desired changes, and how the essential elements of the strategy can be optimized for diverse providers, settings, and patient circumstances.
The NIH has partnered with AcademyHealth, in collaboration with the Agency for Healthcare Research and Quality, the Patient-Centered Outcomes Research Institute, and the Department of Veterans Affairs, to host a series of annual conferences advancing implementation science.59 These annual conferences bring together more than 1,000 investigators from diverse health-related settings to present study findings and discuss challenges and solutions for the field. In addition, multiple training opportunities exist for investigators interested in moving into implementation science, both in cancer care specifically and in health research more generally.60 There remains a need to build additional capacity for implementation science, particularly among clinician-scientists, practitioners, and clinical leaders, to ensure that implementation studies are rigorous and are directed to achieve improvements in the quality and value of cancer care.
Although significant advances have been made in recent years in leveraging insights from implementation science in oncology, there are specific challenges that deserve attention if we are going to maximize the impact of our research on cancer-related outcomes. First, although we need to continue to develop and test the efficacy of interventions in cancer care, our interventions should also be designed with dissemination and implementation in mind.61 The National Cancer Institute has recently launched a training program for interventionists, Speeding Interventions into Practice,62 to support a better understanding of the care delivery systems in which interventions are developed and to encourage entrepreneurial thinking to optimize the fit between interventions and practice settings. We have also seen increased attention on novel study designs such as hybrid effectiveness-implementation trials,63 where investigators can include implementation research questions as part of a clinical trial examining efficacy or effectiveness.
Second, we must question assumptions about the permanence of evidence-based practices, particularly when testing the effectiveness of interventions that will be incorporated for sustained delivery by a health care system that is continuously evolving. Thus, it has been suggested that in evaluating efficacy, researchers should consider building in the expectation that health care systems and interventions will need to evolve over time.64 Rather than limiting ourselves to only understanding how interventions can be adapted as needed within local settings, the field can view implementation as a chance to build up understanding about the best ways to fit an intervention into a practice setting. This bidirectionality is encouraged by three key features of the learning health care system. These include streams of systematically gathered patient data, including patient-reported outcomes, presented to clinicians in real time; the use of those data to support care delivery (rapid referral for needed services, monitoring, and clinician follow-up) and to motivate and direct QI efforts (reminders or alerts, audit and feedback, report cards); and active participation by stakeholders across different levels of the organization to optimize outcomes.65 These features of the learning health care system generate synergistic real-world implementation science evidence that can be used both to refine the intervention for ease of implementation and acceptability to patients and clinicians and to simultaneously confirm that the intervention favorably affects quality, patient outcomes, and cost.66
Third, implementation science in cancer care is inherently a multilevel, cross-setting, transdisciplinary endeavor. Changing cancer care through the implementation of evidence-based interventions requires recognition that influences on implementation exist at patient, provider, organization, and system levels, and that strategies to improve uptake will likely need to engage diverse individuals from different organizational levels and settings. In addition, implementation science must leverage expertise from multiple disciplines, including the health care professions (medicine, nursing, social work, psychology, pharmacy, and rehabilitation medicine), health outcomes researchers, practice managers, and experts in informatics, organizational behavior, engineering, marketing and communications, and health policy. Cancer care is also typically delivered across several different settings, from oncology to primary care to community services. Thus, practice improvement efforts in oncology must accommodate the coordination and handoffs that occur as a patient moves from one setting to another, and must consider the impact of differing organizational contexts on the delivery of evidence-based cancer care benefits.
As cancer care delivery and cancer research continue to evolve, there is great optimism about the role that implementation science can play in linking biomedical discovery and evidence-based health care delivery. The recent Cancer Moonshot initiative and the Blue Ribbon Panel report67 recognized the population health benefit that can accrue from effectively implementing what we know. We are confident that increased attention to the theories, methods, and tools of implementation science can be leveraged to drive practice transformation and support the delivery of evidence-based care, thereby reducing the burden of cancer across the cancer control continuum.
AUTHOR CONTRIBUTIONS
Conception and design: All authors
Collection and assembly of data: All authors
Data analysis and interpretation: All authors
Manuscript writing: All authors
Final approval of manuscript: All authors
Accountable for all aspects of the work: All authors
AUTHORS' DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST
Leveraging Implementation Science to Improve Cancer Care Delivery and Patient Outcomes
The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/journal/jop/site/misc/ifc.xhtml.
Sandra A. Mitchell
No relationship to disclose
David Chambers
No relationship to disclose
REFERENCES
1. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. doi:10.1186/s13012-015-0209-1
2. Waltz TJ, Powell BJ, Matthieu MM, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:109. doi:10.1186/s13012-015-0295-0
3. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23. doi:10.1007/s10488-010-0327-7
4. Kent EE, Mitchell SA, Castro KM, et al. Cancer care delivery research: Building the evidence base to support practice change in community oncology. J Clin Oncol. 2015;33:2705–2711. doi:10.1200/JCO.2014.60.6210
5. Unger JM, Hershman DL, Arnold KB, et al. Stepwise development of a cancer care delivery research study to evaluate the prevalence of virus infections in cancer patients. Future Oncol. 2016;12:1219–1231. doi:10.2217/fon-2015-0076
6. Adesoye T, Greenberg CC, Neuman HB. Optimizing cancer care delivery through implementation science. Front Oncol. 2016;6:1. doi:10.3389/fonc.2016.00001
7. Ogrinc G, Davies L, Goodman D, et al. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): Revised publication guidelines from a detailed consensus process. BMJ Qual Saf. 2016;25:986–992. doi:10.1136/bmjqs-2015-004411
8. Chubak J, Garcia MP, Burnett-Hartman AN, et al. Time to colonoscopy after positive fecal blood test in four U.S. health care systems. Cancer Epidemiol Biomarkers Prev. 2016;25:344–350. doi:10.1158/1055-9965.EPI-15-0470
9. Nadel MR, Berkowitz Z, Klabunde CN, et al. Fecal occult blood testing beliefs and practices of U.S. primary care physicians: Serious deviations from evidence-based recommendations. J Gen Intern Med. 2010;25:833–839. doi:10.1007/s11606-010-1328-7
10. Shapiro JA, Klabunde CN, Thompson TD, et al. Patterns of colorectal cancer test use, including CT colonography, in the 2010 National Health Interview Survey. Cancer Epidemiol Biomarkers Prev. 2012;21:895–904. doi:10.1158/1055-9965.EPI-12-0192
11. Printz C. HPV vaccine uptake remains low: Why some adolescents are not receiving the vaccine, and what can be done about it. Cancer. 2013;119:2947–2948. doi:10.1002/cncr.28278
12. Zhu SH, Lee M, Zhuang YL, et al. Interventions to increase smoking cessation at the population level: How much progress has been made in the last two decades? Tob Control. 2012;21:110–118. doi:10.1136/tobaccocontrol-2011-050371
13. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: The RE-AIM framework. Am J Public Health. 1999;89:1322–1327. doi:10.2105/ajph.89.9.1322
14. Ellis PG, O'Neil BH, Earle MF, et al. Clinical pathways: Management of quality and cost in oncology networks in the metastatic colorectal cancer setting. J Oncol Pract. 2017;13:e522–e529. doi:10.1200/JOP.2016.019232
15. Haggstrom DA, Doebbeling BN. Quality measurement and system change of cancer care delivery. Med Care. 2011;49(suppl):S21–S27. doi:10.1097/MLR.0b013e3181d59529
16. Blayney DW, McNiff K, Eisenberg PD, et al. Development and future of the American Society of Clinical Oncology's Quality Oncology Practice Initiative. J Clin Oncol. 2014;32:3907–3913. doi:10.1200/JCO.2014.56.8899
17. Miller P, Mosley K. Physician reimbursement: From fee-for-service to MACRA, MIPS and APMs. J Med Pract Manage. 2016;31:266–269.
18. Williams JK, Feero WG, Leonard DG, et al. Implementation science, genomic precision medicine, and improved health: A new path forward? Nurs Outlook. 2017;65:36–40. doi:10.1016/j.outlook.2016.07.014
19. Brownson RC, Colditz GA, Proctor EK (eds). Dissemination and Implementation Research in Health: Translating Science to Practice. New York, NY: Oxford University Press; 2012. doi:10.1093/acprof:oso/9780199751877.001.0001
20. Colquhoun H, Leeman J, Michie S, et al. Towards a common terminology: A simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9:51. doi:10.1186/1748-5908-9-51
21. Tabak RG, Khoong EC, Chambers DA, et al. Bridging research and practice: Models for dissemination and implementation research. Am J Prev Med. 2012;43:337–350. doi:10.1016/j.amepre.2012.05.024
22. Lokker C, McKibbon KA, Colquhoun H, et al. A scoping review of classification schemes of interventions to promote and integrate evidence into practice in healthcare. Implement Sci. 2015;10:27. doi:10.1186/s13012-015-0220-6
23. Mitchell SA, Fisher CA, Hastings CE, et al. A thematic analysis of theoretical models for translational science in nursing: Mapping the field. Nurs Outlook. 2010;58:287–300. doi:10.1016/j.outlook.2010.07.001
24. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53. doi:10.1186/s13012-015-0242-0
25. Moullin JC, Sabater-Hernandez D, Fernandez-Llimos F, et al. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13:16. doi:10.1186/s12961-015-0005-z
26. Glasgow RE, Vinson C, Chambers D, et al. National Institutes of Health approaches to dissemination and implementation science: Current and future directions. Am J Public Health. 2012;102:1274–1281. doi:10.2105/AJPH.2012.300755
27. Bragge P, Grimshaw JM, Lokker C, et al. AIMD: A validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. 2017;17:38. doi:10.1186/s12874-017-0314-8
28. Brunner JW, Sankaré IC, Kahn KL. Interdisciplinary priorities for dissemination, implementation, and improvement science: Frameworks, mechanics, and measures. Clin Transl Sci. 2015;8:820–823. doi:10.1111/cts.12319
29. Proctor EK, Landsverk J, Aarons G, et al. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36:24–34. doi:10.1007/s10488-008-0197-4
30. Clinton-McHarg T, Yoong SL, Tzelepis F, et al. Psychometric properties of implementation measures for public health and community settings and mapping of constructs against the Consolidated Framework for Implementation Research: A systematic review. Implement Sci. 2016;11:148. doi:10.1186/s13012-016-0512-5
31. Rabin BA, Lewis CC, Norton WE, et al. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:42. doi:10.1186/s13012-016-0401-y
32. Lewis CC, Fischer S, Weiner BJ, et al. Outcomes for implementation science: An enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci. 2015;10:155. doi:10.1186/s13012-015-0342-x
33. Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: A systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22. doi:10.1186/1748-5908-8-22
34. Martinez RG, Lewis CC, Weiner BJ. Instrumentation issues in implementation science. Implement Sci. 2014;9:118. doi:10.1186/s13012-014-0118-8
35. Rabin BA, Purcell P, Naveed S, et al. Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012;7:119. doi:10.1186/1748-5908-7-119
36. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76. doi:10.1007/s10488-010-0319-7
37. Neta G, Glasgow RE, Carpenter CR, et al. A framework for enhancing the value of research for dissemination and implementation. Am J Public Health. 2015;105:49–57. doi:10.2105/AJPH.2014.302206
38. Center for Research in Implementation Science and Prevention: Dissemination and Implementation Models in Health Research and Practice. http://www.dissemination-implementation.org
39. Brouwers MC, Garcia K, Makarski J, et al. The landscape of knowledge translation interventions in cancer control: What do we know and where to next? A review of systematic reviews. Implement Sci. 2011;6:130. doi:10.1186/1748-5908-6-130
40. Grunfeld E, Zitzelsberger L, Evans WK, et al. Better knowledge translation for effective cancer control: A priority for action. Cancer Causes Control. 2004;15:503–510. doi:10.1023/B:CACO.0000036448.40295.1d
41. Neta G, Sanchez MA, Chambers DA, et al. Implementation science in cancer prevention and control: A decade of grant funding by the National Cancer Institute and future directions. Implement Sci. 2015;10:4. doi:10.1186/s13012-014-0200-2
42. Colquhoun HL, Squires JE, Kolehmainen N, et al. Methods for designing interventions to change healthcare professionals' behaviour: A systematic review. Implement Sci. 2017;12:30. doi:10.1186/s13012-017-0560-5
- 43.Grant A, Dreischulte T, Guthrie B. Process evaluation of the data-driven quality improvement in primary care (DQIP) trial: Active and less active ingredients of a multi-component complex intervention to reduce high-risk primary care prescribing. Implement Sci. 2017;12:4. doi: 10.1186/s13012-016-0531-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Lau R, Stevenson F, Ong BN, et al. Achieving change in primary care—Causes of the evidence to practice gap: Systematic reviews of reviews. Implement Sci. 2016;11:40. doi: 10.1186/s13012-016-0396-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11:141. doi: 10.1186/s13012-016-0506-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Bunce AE, Gold R, Davis JV, et al. “Salt in the wound”: Safety net clinician perspectives on performance feedback derived from EHR data. J Ambul Care Manage. 2017;40:26–35. doi: 10.1097/JAC.0000000000000166. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Ivers NM, Sales A, Colquhoun H, et al. No more ‘business as usual’ with audit and feedback interventions: Towards an agenda for a reinvigorated intervention. Implement Sci. 2014;9:14. doi: 10.1186/1748-5908-9-14. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123–157. doi: 10.1177/1077558711430690. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Chow WB, Rosenthal RA, Merkow RP, et al. Optimal preoperative assessment of the geriatric surgical patient: A best practices guideline from the American College of Surgeons National Surgical Quality Improvement Program and the American Geriatrics Society. J Am Coll Surg. 2012;215:453–466. doi: 10.1016/j.jamcollsurg.2012.06.017. [DOI] [PubMed] [Google Scholar]
- 50.Huisman MG, Kok M, de Bock GH, et al. Delivering tailored surgery to older cancer patients: Preoperative geriatric assessment domains and screening tools—A systematic review of systematic reviews. Eur J Surg Oncol. 2017;43:1–14. doi: 10.1016/j.ejso.2016.06.003. [DOI] [PubMed] [Google Scholar]
- 51.Wozniak SE, Coleman J, Katlic MR. Optimal preoperative evaluation and perioperative care of the geriatric patient: A surgeon’s perspective. Anesthesiol Clin. 2015;33:481–489. doi: 10.1016/j.anclin.2015.05.012. [DOI] [PubMed] [Google Scholar]
- 52.Mohile SG, Velarde C, Hurria A, et al. Geriatric assessment-guided care processes for older adults: A Delphi consensus of geriatric oncology experts. J Natl Compr Canc Netw. 2015;13:1120–1130. doi: 10.6004/jnccn.2015.0137. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Knittel JG, Wildes TS. Preoperative assessment of geriatric patients. Anesthesiol Clin. 2016;34:171–183. doi: 10.1016/j.anclin.2015.10.013. [DOI] [PubMed] [Google Scholar]
- 54.Wildiers H, Heeren P, Puts M, et al. International Society of Geriatric Oncology consensus on geriatric assessment in older patients with cancer. J Clin Oncol. 2014;32:2595–2603. doi: 10.1200/JCO.2013.54.8347. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Decoster L, Van Puyvelde K, Mohile S, et al. Screening tools for multidimensional health problems warranting a geriatric assessment in older cancer patients: An update on SIOG recommendations. Ann Oncol. 2015;26:288–300. doi: 10.1093/annonc/mdu210. [DOI] [PubMed] [Google Scholar]
- 56.VanderWalde N, Jagsi R, Dotan E, et al. NCCN Guidelines Insights: Older Adult Oncology, Version 2.2016. J Natl Compr Canc Netw. 2016;14:1357–1370. doi: 10.6004/jnccn.2016.0146. [DOI] [PubMed] [Google Scholar]
- 57.Ghignone F, van Leeuwen BL, Montroni I, et al. The assessment and management of older cancer patients: A SIOG surgical task force survey on surgeons’ attitudes. Eur J Surg Oncol. 2016;42:297–302. doi: 10.1016/j.ejso.2015.12.004. [DOI] [PubMed] [Google Scholar]
- 58. National Institutes of Health: Department of Health and Human Services: Part 1. Overview information. https://grants.nih.gov/grants/guide/pa-files/PAR-16-238.html.
- 59.Chambers D, Simpson L, Neta G, et al. Proceedings from the 9th Annual Conference on the Science of Dissemination and Implementation. Implement Sci. 2017;12:48. [Google Scholar]
- 60.Proctor EK, Chambers DA. Training in dissemination and implementation research: A field-wide perspective. Transl Behav Med. doi: 10.1007/s13142-016-0406-8. 10.1007/s13142-016-0406-8 [epub ahead of print on November 11, 2016] [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Brownson RC, Jacobs JA, Tabak RG, et al. Designing for dissemination among public health researchers: Findings from a national survey in the United States. Am J Public Health. 2013;103:1693–1699. doi: 10.2105/AJPH.2012.301165. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62. National Cancer Institute: Speeding Research-Tested Interventions (SPRINT). http://www.nci-sprint.com/
- 63.Brown CH, Curran G, Palinkas LA, et al. An overview of research and evaluation designs for dissemination and implementation. Annu Rev Public Health. 2017;38:1–22. doi: 10.1146/annurev-publhealth-031816-044215. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Chambers DA, Azrin ST. Research and services partnerships: Partnership: A fundamental component of dissemination and implementation research. Psychiatr Serv. 2013;64:509–511. doi: 10.1176/appi.ps.201300032. [DOI] [PubMed] [Google Scholar]
- 65.Chambers DA, Feero WG, Khoury MJ. Convergence of implementation science, precision medicine, and the learning health care system: A new model for biomedical research. JAMA. 2016;315:1941–1942. doi: 10.1001/jama.2016.3867. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Fisher ES, Shortell SM, Savitz LA. Implementation science: A potential catalyst for delivery system reform. JAMA. 2016;315:339–340. doi: 10.1001/jama.2015.17949. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 67. National Cancer Institute: Cancer Moonshot Blue Ribbon Panel. https://www.cancer.gov/research/key-initiatives/moonshot-cancer-initiative/blue-ribbon-panel.


