Translational Behavioral Medicine. 2021 Nov 30;11(11):1980–1988. doi: 10.1093/tbm/ibab105

Four strategic areas to advance equitable implementation of evidence-based practices in cancer care

Katharine A Rendle, Rinad S Beidas
PMCID: PMC8634319  PMID: 34850931

Implications.

Practice: Most efforts to increase uptake of evidence-based practice in cancer treatment have not leveraged insights from behavioral economics, presenting an opportunity to learn rapidly how to improve equitable implementation of evidence-based care.

Policy: The focus of implementation science in cancer control should expand to understanding how outer setting factors influence the effectiveness of behavioral economics-informed approaches to equitable implementation.

Research: There are four strategic areas to advance implementation research in cancer care delivery: (1) leveraging insights from behavioral economics; (2) applying rapid bidirectional approaches to fail fast and learn quickly; (3) identifying mechanisms through which implementation strategies work using mixed methods; and (4) understanding and addressing health disparities by incorporating a health equity lens into implementation science.

Tremendous discoveries across the cancer control continuum over the past two decades have the potential to dramatically improve cancer care and outcomes now and in the future [1]. For example, following decades of cervical cancer research, the highly effective human papillomavirus (HPV) vaccine became the first cancer prevention vaccine recommended for routine immunization in 2006 and now stands as a cornerstone for the elimination of cervical cancer worldwide [2–4]. In 2013, supported by findings from the National Lung Screening Trial [5], the United States Preventive Services Task Force and other professional organizations first recommended annual lung cancer screening for high-risk adults, providing the opportunity to dramatically reduce lung cancer mortality through early detection [6–10]. In 2017, after over 50 years of basic and clinical research, the FDA approved two CAR T-cell therapies, a novel form of immunotherapy that harnesses the power of patients’ own immune cells to treat cancer [11–13]. Each of these revolutionary discoveries (and the many others not listed) is the result of many years of innovative research, multidisciplinary collaboration, and billions of research dollars used to establish the necessary evidence to warrant translation into practice [14].

Despite these remarkable scientific discoveries, there is a major “last mile” problem that threatens their promise. National uptake of HPV vaccination remains at 54% [15] and less than 10% of eligible adults in the USA receive lung cancer screening, suggesting a major implementation failure [16]. Although the data are early, access to CAR T-cell and other forms of immunotherapy has differed by race and insurance status [17, 18], and questions remain regarding cost-effectiveness [19–21]. These revolutionary discoveries and their subsequent suboptimal integration into practice highlight the pressing question of how best to implement new evidence-based practices—and de-implement outdated practices [22–24]—amid burgeoning discoveries in cancer research. It is not enough to discover. The same attention and resources given to discovery must be focused on how best to equitably implement our scientific discoveries to achieve the promise of these evidence-based practices for cancer care delivery and outcomes [25, 26].

Historically, the path from evidence to practice integration has been as long as the time needed to develop these discoveries, resulting in delayed and missed opportunities to translate evidence into improved outcomes at the individual and population levels. While much has been learned in the last two decades, there remain persistent challenges and inequities in the translation of evidence-based practice into care, with a 2021 review indicating that the 17-year gap has only been reduced to 15 years in cancer control [27, 28]. The rapidly growing field of implementation science offers frameworks and methods that can help to advance practice integration of these and future discoveries that are likely to emerge over the next decade [29].

In the last two decades, there has been substantial growth in dissemination and implementation research [29, 30]. This is reflected by an increase in support for implementation science across the National Institutes of Health (NIH) and the National Cancer Institute (NCI) in particular [31]. The NIH has made translational and implementation science a priority through increased training and funding initiatives, including the development of the Clinical and Translational Science Awards in 2006 and the Training Institute for Dissemination and Implementation Research in Health (TIDIRH) in 2011 [32]. At the NCI, 71 implementation science grants were funded between 2006 and 2019. The majority of funded proposals focused on prevention (46%) and screening (46%), with fewer studies examining treatment (13%) and survivorship (15%) [33]. In addition to these investigator-driven grants, there have been several large-scale efforts supported by the Cancer Moonshot and the 2021 NCI Strategic Plan, which explicitly called for increased integration of and funding for implementation science in cancer research [34]. These include three large research networks: Accelerating Colorectal Cancer Screening and follow-up through Implementation Science (ACCSIS; https://healthcaredelivery.cancer.gov/accsis), Improving the Management of Symptoms during and following Cancer Treatment (IMPACT; https://healthcaredelivery.cancer.gov/impact), and most recently the Implementation Science Centers in Cancer Control (ISC3; https://cancercontrol.cancer.gov/is/initiatives/isc3). ISC3 includes seven centers that support the rapid implementation of evidence-based strategies across the cancer control continuum with a focus on health equity [35]. These multipronged efforts reflect a tremendous opportunity to simultaneously advance the field of implementation science and cancer care.

In this commentary, building upon work across the field and drawing on our Penn ISC3, we outline four strategic areas that can dually improve cancer control and the science of implementation. First, strategies from behavioral economics—particularly those that can “nudge” individuals to make evidence-based choices [36, 37]—have not been applied widely within implementation science or cancer care, despite the success of these approaches in other clinical contexts [36, 38–41]. There is an untapped opportunity to leverage the rich literature from this burgeoning field, which bridges economics and psychology to elucidate how underlying cognitive biases can be harnessed to shape patient and clinician behaviors [42]. Second, implementation science has not yet widely applied rapid bidirectional approaches to “fail fast and learn quickly” in the search for effective implementation strategies [43, 44]. In the context of healthcare, where clinical workflows, payment models, and policies are rapidly changing, advancing these approaches within the field is key to achieving the promise of learning healthcare systems [45]. Third, efforts to identify the mechanisms through which implementation strategies improve the use of evidence-based practices are in their infancy, particularly with regard to the integration of mixed methods approaches [46–48]. Fourth, for the field to truly have equitable reach and impact at the population health level, implementation science must seek to understand and address persistent health inequities, and prioritize health equity by eliminating health disparities driven by social and structural determinants [49–52]. The continuous and iterative flow of data, ideas, and resources to and from the people, places, and systems we work with can help the field to rapidly identify and target persistent inequities. As such, similar to cancer research, implementation science is in a critical period of refocusing, sharpening, and innovation.

Behavioral Economics

Implementation science largely started as a field of “trial and error,” often lacking theoretical rationale for the specific choice of implementation strategies [53] and focused on testing broad or passive strategies such as consensus statements, guidelines, and trainings [47]. However, early empirical studies found that such passive or atheoretical strategies did not generally result in effective behavior change [54–56]. Over the past decade, implementation science has rapidly expanded into a robust scientific field, with over 150 conceptual frameworks [57], identification of over 600 implementation determinants [58], and well-defined implementation outcomes [59]. However, forward development of the field has also been hindered by certain key assumptions, including assumptions about how individuals make behavioral decisions that are rooted in a rational model of decision-making [60, 61].

A growing body of research from behavioral economics challenges the assumption that individuals make decisions “rationally”—that is, that individuals, given their preferences and resource constraints, will make decisions that maximize individual benefit and satisfaction. In contrast, behavioral economics argues that people often behave in predictably “irrational” yet modifiable ways, guided by emotion, cognitive biases, and common heuristics when making decisions [62]. Rooted in both neoclassical economics and cognitive and social psychology, behavioral economics includes a set of models and frameworks that recognize the complexity under which individuals tend to make decisions. Counter to traditional economic theory, behavioral economics posits that individuals (including clinicians and patients) do not always make decisions based upon complete information or to maximize expected utility in terms of personal benefit or satisfaction [61]. Rather, individuals make decisions under “bounded rationality,” shaped by emotion, habit, available information, time, and other constraints [63]. As a result, as decision-makers, we are influenced by myriad psychological, social, cognitive, and emotional factors, and we use a range of simplifying cognitive heuristics or shortcuts. Heuristics particularly relevant to medical decision-making include availability bias (the tendency to be influenced by recent or common examples) and status quo bias (the tendency to stick with a current approach even if new or better approaches are available) [60, 64]. These heuristics can be accommodated and harnessed through strategies such as choice architecture, which involves changing the environment to facilitate the desired (evidence-based) choice [65].

One of the most powerful ways to use choice architecture to affect behavior is through nudges, which involve making subtle changes to the way that choices are presented [66]. Nudges can take several forms, with differing levels of intensity and, correspondingly, of effect and burden [67]. Nudges can range from simply framing information in a specific way to changing default options within systems [37]. Modifying default options is a common approach to increasing the frequency of the desired behavior while preserving clinician and patient choice. This strategy simply sets the desired action as the default, such that it takes effect unless the decision-maker chooses to override it. In the context of cancer care, an example of a default option focused on de-implementation is changing the default medication for patients with cancer from a higher-cost to a lower-cost, but equally efficacious, medication [68]. Default options are a more potent approach than providing information but may increase the burden on clinicians and be infeasible or unnecessary in certain cases; thus, specific strategies should be tailored by stakeholders and context.
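
To make the default-option logic concrete, the brief sketch below (in Python) illustrates choice architecture in a simplified order-entry form: the lower-cost, equally efficacious option is pre-selected, all alternatives remain orderable, and the default takes effect only when the clinician does not actively override it. The therapeutic class, option labels, and mapping are hypothetical illustrations for this commentary, not an actual EMR configuration or the intervention tested in the cited study.

```python
# Minimal sketch of a default-option nudge for medication ordering.
# The therapeutic class, option names, and preferred-default mapping are
# hypothetical illustrations, not clinical recommendations or EMR code.

from dataclasses import dataclass
from typing import List, Optional

# Hypothetical mapping: for each therapeutic class, the equally efficacious,
# lower-cost option that the order set pre-selects by default.
PREFERRED_DEFAULT = {"antiemetic": "generic_option_a"}

# All options remain orderable; the nudge only changes which one is pre-selected.
AVAILABLE_OPTIONS = {"antiemetic": ["generic_option_a", "brand_option_b"]}


@dataclass
class OrderChoice:
    therapeutic_class: str
    preselected: str         # what the order set shows as already chosen
    alternatives: List[str]  # everything the clinician can still switch to


def build_order_choice(therapeutic_class: str) -> OrderChoice:
    """Construct the choice architecture: default pre-selected, alternatives visible."""
    options = AVAILABLE_OPTIONS[therapeutic_class]
    default = PREFERRED_DEFAULT[therapeutic_class]
    return OrderChoice(
        therapeutic_class=therapeutic_class,
        preselected=default,
        alternatives=[o for o in options if o != default],
    )


def place_order(choice: OrderChoice, clinician_override: Optional[str] = None) -> str:
    """The default takes effect unless the clinician actively chooses another option."""
    valid_options = [choice.preselected] + choice.alternatives
    if clinician_override in valid_options:
        return clinician_override
    return choice.preselected


if __name__ == "__main__":
    choice = build_order_choice("antiemetic")
    print(place_order(choice))                                       # generic_option_a
    print(place_order(choice, clinician_override="brand_option_b"))  # brand_option_b
```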

Several studies in other areas of healthcare have shown robust results when applying principles from behavioral economics to modify physician and patient behavior [38, 39, 69, 70]. However, to date, implementation science and behavioral economic strategies have seldom informed one another [71]. Often in implementation science, we apply kitchen-sink approaches that are very expensive and often not scalable (and, to date, have yielded disappointing findings) [56, 72]. Additionally, leading implementation frameworks, such as the Consolidated Framework for Implementation Research [73] and the Exploration, Preparation, Implementation, and Sustainment framework [74], do not explicitly include these psychological processes in understanding implementation determinants. Behavioral economics-informed strategies tend to be low-cost and scalable yet have been questioned for their long-term sustainability—an area where implementation science has had some focus. Furthermore, the widespread availability of electronic medical records (EMRs) allows for the deployment of implementation strategies based on behavioral economics [75, 76]. EMRs are used to document care, but they also can prompt and shape clinical decision-making in unobtrusive ways that do not disrupt workflow or add more tasks. In addition to supporting point-of-care clinician decisions, EMR platforms such as patient portals can be used to automatically communicate with patients prior to, during, or after visits. Potential disparities in technology availability and use should be considered when using these approaches [77]. Using the EMR to deliver implementation or de-implementation strategies has the added benefit of being easily modifiable when new data emerge [36]. Given this potential, increasing the integration of behavioral economics into implementation science could improve cancer care delivery, health equity, and outcomes for millions of Americans and their families [65, 78].
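
As one further illustration of EMR-based delivery, the sketch below shows a possible pattern under stated assumptions: an eligibility rule stored as a plain configuration object (so it can be modified as new data emerge), the patient portal as the preferred outreach channel, and text messaging and phone calls as fallbacks to limit digital-divide gaps. The field names, rule, and message text are hypothetical and do not represent any specific EMR product or Penn ISC3 code.

```python
# Minimal sketch of configurable, EMR-triggered patient outreach with fallback channels.
# Field names, the eligibility rule, and the message text are hypothetical assumptions.

from typing import Optional

# Keeping the rule and content in a plain configuration object makes the strategy
# easy to modify as new evidence or monitoring data emerge.
NUDGE_CONFIG = {
    "eligibility": {"upcoming_oncology_visit": True, "tobacco_use_documented": True},
    "message": "Your care team can connect you with free tobacco treatment support.",
}


def select_channel(patient: dict) -> str:
    """Prefer the portal, but fall back to text or phone to limit digital-divide gaps."""
    if patient.get("portal_active"):
        return "portal"
    if patient.get("mobile_phone_on_file"):
        return "sms"
    return "phone_call"  # final fallback handled by clinic staff


def generate_outreach(patient: dict, config: dict = NUDGE_CONFIG) -> Optional[dict]:
    """Return an outreach task if the patient meets the configured eligibility rule."""
    if all(patient.get(key) == value for key, value in config["eligibility"].items()):
        return {"channel": select_channel(patient), "message": config["message"]}
    return None


if __name__ == "__main__":
    patient = {
        "upcoming_oncology_visit": True,
        "tobacco_use_documented": True,
        "portal_active": False,
        "mobile_phone_on_file": True,
    }
    print(generate_outreach(patient))  # -> {'channel': 'sms', 'message': '...'}
```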

Rapid Bidirectional Learning

Critiques of implementation science have argued that the field is recreating the research-to-practice gap (i.e., implementation practice is not informed by research) and that it takes too long to get to action (i.e., too much time is spent on contextual inquiry) [79]. As such, there have been calls for more rapid learning (also called rapid implementation science) to align with the time frames called for by policy and health system decision makers, both in the methods used to collect contextual data and in generating recommendations [80, 81].

While these conversations started before the COVID-19 pandemic, the pandemic has amplified the need for rapid contextual inquiry and implementation—and also innovation in this area [82]. For example, two recent studies outside of oncology care applied the Consolidated Framework for Implementation Research to rapidly assess contextual determinants and identify effective strategies to increase the use of two clinical practices that changed due to the pandemic: the need to enhance family engagement strategies due to restricted COVID-19 visitation policies [83, 84] and the need to implement prone positioning for patients with severe acute respiratory distress syndrome [85]. These are only two examples of how the pandemic has spurred rapid change in the way healthcare is delivered and highlighted long-standing areas of need (e.g., addressing vaccine hesitancy and dismantling structural racism) [86, 87]. In the field of oncology, the COVID-19 pandemic has altered considerations of what types of cancer care can be safely delivered at home, including dramatic increases in the use of telehealth visits, increased use of cancer treatment at home, and continued expansion of oral therapies [88, 89]. The rapidly changing environment of healthcare systems during 2020 highlights why rapid contextual inquiry and implementation are needed more than ever; however, the need to respond quickly to ever-changing practice is not new and underpins the concept and potential of learning health systems and quality improvement science [45, 90].

As defined by the National Academy of Medicine, learning health systems are delivery systems in which “science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by-product of the delivery experience” [91]. Learning health systems draw from the data and people within the system to rapidly and continuously learn and improve practice. As such, learning health systems are an optimal environment for developing methods in implementation to learn faster and fail quicker [45]. Learning health systems are also ideal contexts to advance the science of de-implementation, given the growing need to identify effective and efficient ways to reduce, replace, remove, or restrict clinical practices that may become outdated based on new discoveries or evidence [22–24]. While the connections between implementation science and the science of learning health care systems have been noted, this remains an understudied area, particularly in oncology care, where great impact could be made [45]. For example, given the rapid increase in the availability, delivery, and costs of cancer therapies, there is a great need to rapidly evaluate and monitor the implementation of these therapies in real-world care to ensure effectiveness and equity. Beyond a robust data infrastructure that can help ascertain and monitor outcomes, this will require rapid approaches to identifying and addressing gaps in practice that may result in suboptimal effectiveness and unequal access. This will also require systematic coordination and co-creation across system and community stakeholders—all areas where insights from implementation science may help to translate the promise of learning health systems into widespread improvements.

Mixed Methods Approaches

The value of qualitative and mixed methods approaches has long been embraced by implementation scientists for understanding and overcoming putative barriers and facilitators (i.e., determinants) to implementation [92]. However, as the list of determinants has grown, there is a growing appreciation of the need to go beyond lists of variables toward a more nuanced understanding of the implementation process and causal theory [46]. A focus on targets and mechanisms is motivated by recent findings that many implementation strategies fail to engage their targeted mechanisms. For example, a recent systematic review of 88 randomized implementation trials found no evidence that any implementation strategy engaged its targeted mechanisms of action [93]. There has been little attention to these issues in oncology [29]. As such, there is a continued need to identify the causal mechanisms through which implementation strategies operate to improve how advances in cancer care delivery are translated into practice, including mechanisms that may be specific to populations and settings, particularly those that experience inequities.

Implementation science is still in the beginning stages of building causal theory to explain the mechanisms by which implementation strategies work [46, 47]. Due to the clustered nature of implementation trials, in which patients are nested within clinicians, clinicians within clinics, and clinics within healthcare systems, traditional statistical approaches to assessing moderators and mechanisms are often not feasible. This infeasibility is often due to limited power given the available sample size of clinics and/or collinearity of variables across levels, particularly in cancer care, where it may be difficult to recruit sufficient participants (patients, clinicians, and clinics) to power the analyses required to study complex cross-level mediation models. Furthermore, implementation strategies can be directed at multiple levels, including patients, clinicians, clinics, hospitals, and systems, and strategies at one level may ultimately target behavior and outcomes at other levels. For example, changes in organizational climate may affect clinician behavior and patient outcomes [94, 95]. Even in cases when a study may be adequately powered to conduct formal mediation analyses, adding qualitative inquiry to complement the quantitative analyses provides an opportunity to learn openly from participants—in unstructured and often unexpected ways—that may be missed if limited solely to measures determined a priori by researchers. This process necessitates approaches that offer the ability to collect rich contextual data to help identify for whom implementation strategies are most effective (i.e., moderation), including among subgroups experiencing inequities, and to identify how strategies might work (i.e., mechanisms of change) [46].

Configurational comparative methods (CCMs) are a category of case-based approaches that have received increased attention in implementation science as a way to identify mechanisms within trials [96, 97]. Drawing upon notions of complex causality, CCMs seek to characterize the relationships between conditions and specific outcomes in phenomena that are causally complex or limited in sample size. In CCMs, the term condition encompasses any of the multilevel mechanisms or “ingredients” (i.e., predictors, determinants, factors, and processes) that shape the success or failure of implementation strategies. CCMs evaluate which combinations of these conditions are required for implementation success. The two most common types of CCMs are qualitative comparative analysis (QCA) and coincidence analysis (CNA) [98–100]. QCA and CNA are both case-based analytic methods that draw upon notions of complex causality [97]. Distinct from variable-based methods, QCA and CNA do not assume linear or additive relationships and are predicated on the assumption that different pathways may result in the same outcome (i.e., equifinality) [96, 97]. QCA and CNA both involve multiple analytic steps that combine qualitative and quantitative coding and calibration to identify conditions shaping the effectiveness of implementation strategies [98]. The central difference between the two approaches lies in their reduction algorithms and how they account for multiple causal outcomes [98, 101]. While these approaches are promising, there is a substantial need to continue to develop methods to identify how and for whom strategies may be effective, particularly in cancer care, where patients often see multiple clinicians at multiple clinics across their care, and therefore hierarchical approaches to accounting for or modeling clustering effects do not reflect practice. Continuing to learn and collaborate with scientists with expertise in multilevel science can help to identify new approaches—for example, cross-classified multilevel modeling—that are underused in implementation science [102]. Accounting for the complexities of implementation—both from a measurement and intervention perspective—continues to be an area of growth and need in the field that can have a substantial impact on cancer control [103].
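
To show the configurational logic in miniature, the sketch below (in Python) constructs the truth-table step of a crisp-set QCA from hypothetical clinic-level data: each clinic is coded on a small set of binary conditions and an implementation outcome, cases are grouped by configuration, and consistency is computed per configuration. The condition names, codings, and outcomes are invented for illustration; a full QCA or CNA additionally involves calibration, consistency and coverage thresholds, and logical minimization, typically with dedicated software such as the R packages QCA or cna.

```python
# Minimal sketch of the truth-table step in crisp-set QCA.
# Clinics, conditions, and outcomes are hypothetical and for illustration only.

from collections import defaultdict

# Each case (e.g., a clinic) is coded 1/0 on candidate conditions and on the outcome
# (successful implementation of the evidence-based practice).
cases = [
    {"site": "A", "leadership_support": 1, "ehr_nudge": 1, "dedicated_staff": 1, "success": 1},
    {"site": "B", "leadership_support": 1, "ehr_nudge": 1, "dedicated_staff": 0, "success": 1},
    {"site": "C", "leadership_support": 0, "ehr_nudge": 1, "dedicated_staff": 1, "success": 0},
    {"site": "D", "leadership_support": 1, "ehr_nudge": 0, "dedicated_staff": 0, "success": 0},
    {"site": "E", "leadership_support": 0, "ehr_nudge": 0, "dedicated_staff": 1, "success": 0},
]
conditions = ["leadership_support", "ehr_nudge", "dedicated_staff"]

# Group cases by their configuration of conditions (one truth-table row per configuration).
rows = defaultdict(list)
for case in cases:
    configuration = tuple(case[c] for c in conditions)
    rows[configuration].append(case["success"])

# Consistency = share of cases in a configuration that achieved the outcome.
for configuration, outcomes in sorted(rows.items(), reverse=True):
    consistency = sum(outcomes) / len(outcomes)
    print(dict(zip(conditions, configuration)), f"n={len(outcomes)}", f"consistency={consistency:.2f}")

# In these invented data, only configurations combining leadership support with an EHR
# nudge reach a consistency of 1.0; such rows would be carried forward to logical
# minimization to express candidate "recipes" for implementation success.
```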

Health Equity

The NCI’s investment in implementation science to increase the use of evidence-based practices has had a tremendous impact on care, but many individuals across the USA are still not receiving optimal care, and disparities by race, ethnicity, income, geography, and other factors persist [104–106]. Racism, discrimination, and inequitable access to wealth, care, and other resources have oppressed and marginalized Black, Latinx, and Indigenous peoples in the USA [107]. These inequities have contributed to persistent disparities in cancer incidence and mortality by race, ethnicity, and other social characteristics that are not the reflection of biological differences but rather are driven by the interplay of complex structural and social factors. Despite some progress, cancer health disparities persist at every phase of the cancer continuum from prevention through survivorship [21]. For example, only 47% of Latinx adults are up-to-date with colorectal cancer screening, in contrast to 64% of White adults. The difference by educational attainment is even starker (a 71% completion rate among those with a college degree in contrast to 47% among those without a high school degree) [21]. For treatment, these disparities start at the point of clinical trials (i.e., disproportionately low enrollment of racially and ethnically diverse trial participants) and continue through differential access to treatment [108]. Applying principles from implementation science to target both core determinants of inequities and disparities at each phase can help to ensure all populations benefit from cancer innovation [21, 109].

Health equity has been underemphasized within the context of implementation science. Reasons include methodological, funding, and resource challenges (e.g., the need for large and diverse samples to understand disparities). Structural barriers have also limited the participation of marginalized populations in biomedical and healthcare research (e.g., costs associated with clinical trials) and, in some cases, have resulted in the continuation of outdated practices (e.g., screening outside recommended guidelines) because medical mistrust, discrimination, or other poor experiences with the U.S. healthcare system lead patients to distrust recommendations [110]. Thought leaders have suggested that the time is ripe for ensuring that health equity is a key focus of implementation science, with opportunities for extending a health equity lens to our frameworks, measures, implementation strategies, and study design and execution [49, 51, 52, 111].

The impact of COVID-19 and the racial injustices of 2020 have brought long-overdue attention to the immediate and historical effects of structural racism in the USA. As the field of implementation science works to integrate equity into existing frameworks, it can help by identifying effective strategies to ensure equitable implementation of evidence-based practices across sites and populations and by targeting contextual mechanisms that contribute to disparities in care [112]. This means not only identifying and assessing strategies that decrease persistent inequities in cancer outcomes and reach all patients equitably, but also developing effective strategies that directly target drivers of inequities, including medical discrimination, mistrust, unequal access to education and wealth, and structural racism [110, 113].

Penn Implementation Science Center in Cancer Control

The Penn Implementation Science Center in Cancer Control (Penn ISC3) grand challenge is to apply insights from behavioral economics to rapidly accelerate the pace at which evidence-based practices for cancer care are deployed and the extent to which they are delivered equitably, thereby increasing their reach and impact on the health and health equity of individuals with cancer. Learning with and from partners across a complex health system that spans 5 hospitals and 10 multispecialty outpatient clinics across Philadelphia and the greater Delaware Valley, Penn ISC3 aims to simultaneously advance the fields of implementation science and cancer care delivery by integrating the four strategic areas outlined above.

In our first two pilot projects, we are utilizing technology (including electronic health record-based alerts, text messaging, patient portals, and email) to deliver patient and clinician nudges to increase the use of two evidence-based practices in cancer care: tobacco cessation referral and serious illness conversations. During the first 6 months of our center, we worked closely with clinical and operational leaders and patients to develop the content (e.g., specific wording, messaging), frequency and timing (e.g., when the alerts fire and re-fire), and delivery modality (e.g., text messaging versus use of patient portals). Within these processes, we collaboratively listened and learned from the experiences of clinicians and researchers to reduce the burden. We also learned from the data—on who our patient population is and how they engage with the system including patient portals—to enhance equity and reach. These processes have set up a model by which we will develop future nudge strategies for Penn ISC3 and beyond.

In addition to leveraging bidirectional learning from our human and data systems, we have also integrated rapid cycle approaches (RCA) prior to launching each specific pilot project. Often used in engineering and technology [114], RCA applies a variety of strategies to test potential innovations more efficiently, less expensively, and more reliably than can be accomplished through the ramp-up to and conduct of traditional trials [115]. Penn is an international leader in applying RCA to improve healthcare delivery and outcomes [115, 116]. In Penn ISC3, we draw upon this expertise to rapidly assess and improve proposed strategies prior to study launch. For example, in our first projects, investigators used “fake back ends,” by which patients received text-based messages manually created by research staff prior to changing to automated text messages during the trial. As we move into the second phase of pilot projects, we will continue to use RCA to inform the development of strategies.
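
The “fake back end” approach can be pictured with a simple sketch: the patient-facing delivery call stays identical while the back end is swapped from a manual, staff-operated workflow during rapid-cycle testing to an automated one during the trial. The class and function names below are hypothetical and do not represent Penn ISC3 systems.

```python
# Minimal sketch of a "fake back end" (Wizard-of-Oz) setup for testing text-based nudges.
# Class names, messages, and identifiers are hypothetical illustrations.

from typing import Protocol


class MessageBackend(Protocol):
    def send(self, patient_id: str, message: str) -> None: ...


class ManualBackend:
    """Rapid-cycle testing phase: research staff review and send each message by hand."""
    def send(self, patient_id: str, message: str) -> None:
        print(f"QUEUE FOR STAFF REVIEW -> patient {patient_id}: {message}")


class AutomatedBackend:
    """Trial phase: messages are sent automatically once content and timing are settled."""
    def send(self, patient_id: str, message: str) -> None:
        print(f"AUTO-SEND -> patient {patient_id}: {message}")


def deliver_nudge(backend: MessageBackend, patient_id: str, message: str) -> None:
    # The delivery call is identical in both phases, so nothing downstream changes
    # when the fake back end is replaced with automation.
    backend.send(patient_id, message)


if __name__ == "__main__":
    deliver_nudge(ManualBackend(), "12345", "Reminder about your tobacco treatment referral.")
    deliver_nudge(AutomatedBackend(), "12345", "Reminder about your tobacco treatment referral.")
```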

In Penn ISC3, we have integrated an embedded-concurrent approach [117] to understand multilevel factors that impact the effectiveness of the proposed strategies across the pilot projects—both of which use a pragmatic factorial trial design and therefore are limited by what baseline data can be collected from participants. To help to understand mechanisms driving the success or failure of the nudges, we are collecting qualitative and quantitative data via surveys and semi-structured interviews from patients, clinicians, and clinical leaders involved in the pragmatic trials. These data will serve as multilevel inputs for conducting qualitative comparative analysis to identify potential mechanisms driving the success or failure of the implementation strategies. By harmonizing contextual data collection across the pilot projects, we hope to harness the strengths of pragmatic design to understand if proposed strategies work in routine care while accounting for potential interaction between strategies using a factorial design in the pilot trials.
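
For readers less familiar with factorial designs, the sketch below simulates a 2x2 factorial nudge trial (clinician nudge by patient nudge) and shows how cell proportions and a logistic regression with an interaction term separate each strategy's effect from their combination. The outcome name, effect sizes, and data are simulated assumptions and do not reflect Penn ISC3 results; a full analysis would also need to account for the clustering of patients within clinicians and clinics.

```python
# Minimal sketch of analyzing a simulated 2x2 factorial nudge trial.
# Effect sizes, the outcome ("referral"), and all data are hypothetical; a real analysis
# would also account for clustering (e.g., mixed-effects or GEE models).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 4000

# Randomize each simulated patient to one of the four factorial cells.
clinician_nudge = rng.integers(0, 2, n)
patient_nudge = rng.integers(0, 2, n)

# Hypothetical data-generating model: each nudge raises the log-odds of referral slightly.
log_odds = -1.0 + 0.4 * clinician_nudge + 0.3 * patient_nudge + 0.1 * clinician_nudge * patient_nudge
referral = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

df = pd.DataFrame(
    {"clinician_nudge": clinician_nudge, "patient_nudge": patient_nudge, "referral": referral}
)

# Cell proportions show each strategy's marginal effect and their combination.
print(df.groupby(["clinician_nudge", "patient_nudge"])["referral"].mean())

# A logistic regression with an interaction term estimates each nudge's effect and
# whether the two strategies reinforce or dampen one another.
model = smf.logit("referral ~ clinician_nudge * patient_nudge", data=df).fit(disp=False)
print(model.summary())
```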

Across the center, we have thought carefully about how different strategies may enhance or mitigate inequities in reach and impact. The process has been challenging, as the most low-touch approaches in behavioral economics often rely upon technology—which is not equally available to or used by all patients in our system or nationally. To understand how our approaches may impact equity at each phase, we have developed strategies for monitoring and evaluating the process throughout the study. First, we are tracking potential disparities in reach and penetration that might result from our implementation approaches using the extension of RE-AIM for equity [51]. Second, we are collecting electronic medical record and survey data to explore how social determinants of health (e.g., neighborhood-level poverty) and social needs (e.g., transportation) may moderate the effectiveness of proposed implementation strategies or may shape the experiences of patients. Third, we are oversampling patients who identify as Black and/or are from neighborhoods with high rates of poverty in our qualitative interviews to understand specific barriers that may contribute to persistent inequities in cancer. In our next phase of pilot projects, we will move from monitoring and understanding to prioritizing projects that specifically target the reduction of inequities as the primary focus.
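
One simple way to operationalize the reach-monitoring step described above is sketched below: reach is computed for each subgroup, compared against overall reach, and subgroups whose reach lags by more than a chosen threshold are flagged for review. The subgroup labels, counts, and threshold are hypothetical assumptions rather than Penn ISC3 data, and in practice such monitoring follows the RE-AIM extension for equity cited above rather than this exact calculation.

```python
# Minimal sketch of monitoring reach by subgroup to flag potential implementation disparities.
# Counts, subgroup labels, and the threshold are hypothetical illustrations.

# Eligible patients and patients reached by the implementation strategy, by subgroup.
eligible = {"group_1": 820, "group_2": 410, "group_3": 270}
reached = {"group_1": 615, "group_2": 246, "group_3": 135}

DISPARITY_THRESHOLD = 0.10  # flag subgroups whose reach lags overall reach by >10 points

overall_reach = sum(reached.values()) / sum(eligible.values())
print(f"Overall reach: {overall_reach:.1%}")

for group in eligible:
    group_reach = reached[group] / eligible[group]
    gap = overall_reach - group_reach
    flag = "REVIEW" if gap > DISPARITY_THRESHOLD else "ok"
    print(f"{group}: reach={group_reach:.1%}, gap vs overall={gap:+.1%} [{flag}]")
```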

Conclusion

In this commentary, we discussed four strategic areas that can help to advance equitable implementation of evidence-based practices in cancer prevention and care and shared how we are operationalizing these insights in Penn ISC3. While not exhaustive, we contend that each of these areas, in coordination with rapidly increasing advances in the prevention, early detection, and treatment of cancer, could dramatically reduce and in some cases even eliminate the cancer burden in the USA and globally. However, for this potential to be translated into reality, these discoveries need to be integrated into practice in efficient, equitable, and sustainable ways—or else their potential will be stalled or even lost. As such, the changing landscape of cancer discovery offers an exciting and vital opportunity for implementation science and cancer researchers to work together to increase the reach and equity of cancer innovation across the care continuum.

Acknowledgments

We would like to acknowledge the input from our Penn Medicine Implementation Lab Partners. We would also like to thank the following individuals who are part of our Penn ISC3 Center: Justin Bekelman, MD, Robert Schnoll, PhD, Alison Buttenheim, PhD, MBA, Frank Leone, MD, Mitesh Patel, MD, MBA, David Asch, MD, MBA, Samuel Takvorian, MD, MsHP, Paul Wileyto, PhD, Krisda Chaiyachati, PhD, MsHP, Rachel Shelton, ScD, MPH, Larry Shulman, MD, Peter Gabriel, MD, MSE, Adina Lieberman, MPH, Megan Reilly, MPH, Anna Marika-Bauer, BA, Susan Ware, BS, Callie Scott, MSc, Alicia Clifton, MDP, Tasnim Salam MBE, MPH, and Kelly Zentgraf, MS. We would also like to thank the members of our Data Safety Monitoring Board (Erin Aakhus, MD, MSHP, Kate Courtwright, MD, MSHP, Kit Delgado, MD, MS, Megan Lane-Fall, MD, MSHP, FCCM); the members of our External Advisory Board (Ross Brownson, PhD, Kristie Foley, PhD, Colleen McBride, PhD, Jamie Ostroff, PhD); and the members of our Internal Advisory Board (Karen Glanz, PhD, MPH, Carmen Guerra, MD, MSCE, David Miller, MBA, Katherine Nathanson, MD, Lynn Schuchter, MD, Roy Rosin, MBA, Kevin Volpp, MD, PhD). Last, but certainly not least, we thank our families for their patience, love, and support while we wrote this manuscript during the difficult time of COVID-19 in early hours, late nights, and weekends (Katharine: E, E, and O; Rinad: K, M, and E). This study was funded by National Cancer Institute P50CA244690 and National Cancer Institute UM1CA221939.

Compliance with Ethical Standards

Conflicts of Interest: Dr. Beidas reports royalties from Oxford University Press, has received consulting fees from the Camden Coalition of Healthcare Providers, currently consults for United Behavioral Health, and sits on the scientific advisory committee for Optum Behavioral Health.

Human Rights: This article does not contain any studies with human participants performed by any of the authors.

Informed Consent: This study does not involve human participants and informed consent was therefore not required.

Welfare of Animals: This article does not contain any studies with animals performed by any of the authors.

References

1. National Cancer Institute. Milestones in Cancer Research and Discovery. Available at https://www.cancer.gov/research/progress/250-years-milestones. Accessibility verified February 15, 2021.
2. Brisson M, Kim JJ, Canfell K, et al. Impact of HPV vaccination and cervical screening on cervical cancer elimination: A comparative modelling analysis in 78 low-income and lower-middle-income countries. Lancet. 2020;395(10224):575–590.
3. Lei J, Ploner A, Elfström KM, et al. HPV vaccination and the risk of invasive cervical cancer. N Engl J Med. 2020;383(14):1340–1348.
4. Medeiros LR, Rosa DD, da Rosa MI, Bozzetti MC, Zanini RR. Efficacy of human papillomavirus vaccines: A systematic quantitative review. Int J Gynecol Cancer. 2009;19(7):1166–1176.
5. Aberle DR, Adams AM, Berg CD, et al.; National Lung Screening Trial Research Team. Reduced lung-cancer mortality with low-dose computed tomographic screening. N Engl J Med. 2011;365(5):395–409.
6. Moyer VA; U.S. Preventive Services Task Force. Screening for lung cancer: U.S. Preventive Services Task Force recommendation statement. Ann Intern Med. 2014;160(5):330–338.
7. Wender R, Fontham ET, Barrera E Jr, et al. American Cancer Society lung cancer screening guidelines. CA Cancer J Clin. 2013;63(2):107–117.
8. Roberts H, Walker-Dilks C, Sivjee K, et al.; Lung Cancer Screening Guideline Development Group. Screening high-risk populations for lung cancer: Guideline recommendations. J Thorac Oncol. 2013;8(10):1232–1237.
9. Jaklitsch MT, Jacobson FL, Austin JH, et al. The American Association for Thoracic Surgery guidelines for lung cancer screening using low-dose computed tomography scans for lung cancer survivors and other high-risk groups. J Thorac Cardiovasc Surg. 2012;144(1):33–38.
10. Krist AH, Davidson KW, Mangione CM, et al.; US Preventive Services Task Force. Screening for lung cancer: US Preventive Services Task Force recommendation statement. JAMA. 2021;325(10):962–970.
11. Aamir S, Anwar MY, Khalid F, Khan SI, Ali MA, Khattak ZE. Systematic review and meta-analysis of CD19-specific CAR-T cell therapy in relapsed/refractory acute lymphoblastic leukemia in the pediatric and young adult population: Safety and efficacy outcomes. Clin Lymphoma Myeloma Leuk. 2020;21(4):e334–e347. doi: 10.1016/j.clml.2020.12.010.
12. Reagan PM, Friedberg JW. Axicabtagene ciloleucel and brexucabtagene autoleucel in relapsed and refractory diffuse large B-cell and mantle cell lymphomas. Future Oncol. 2021;17(11):1269–1283.
13. Neelapu SS, Locke FL, Bartlett NL, et al. Axicabtagene ciloleucel CAR T-cell therapy in refractory large B-cell lymphoma. N Engl J Med. 2017;377(26):2531–2544.
14. Eddy DM. Evidence-based medicine: A unified approach. Health Aff (Millwood). 2005;24(1):9–17.
15. Elam-Evans LD, Yankey D, Singleton JA, et al. National, regional, state, and selected local area vaccination coverage among adolescents aged 13–17 years – United States, 2019. MMWR Morb Mortal Wkly Rep. 2020;69(33):1109–1116.
16. Fedewa SA, Kazerooni EA, Studts JL, et al. State variation in low-dose CT scanning for lung cancer screening in the United States. J Natl Cancer Inst. 2020. doi: 10.1093/jnci/djaa170.
17. Haque W, Verma V, Butler EB, Teh BS. Racial and socioeconomic disparities in the delivery of immunotherapy for metastatic melanoma in the United States. J Immunother. 2019;42(6):228–235.
18. Verma V, Haque W, Cushman TR, et al. Racial and insurance-related disparities in delivery of immunotherapy-type compounds in the United States. J Immunother. 2019;42(2):55–64.
19. Lloyd-Williams H, Hughes DA. A systematic review of economic evaluations of advanced therapy medicinal products. Br J Clin Pharmacol. 2021;87(6):2428–2443. doi: 10.1111/bcp.14275.
20. Al-Qurayshi Z, Crowther JE, Hamner JB, Ducoin C, Killackey MT, Kandil E. Disparities of immunotherapy utilization in patients with stage III cutaneous melanoma: A national perspective. Anticancer Res. 2018;38(5):2897–2901.
21. AACR. Cancer Disparities Progress Report. 2020. Available at http://www.CancerDisparitiesProgressReport.org/. Accessibility verified February 2, 2021.
22. Norton WE, Chambers DA. Unpacking the complexities of de-implementing inappropriate health interventions. Implement Sci. 2020;15(1):2.
23. Norton WE, Chambers DA, Kramer BS. Conceptualizing de-implementation in cancer care delivery. J Clin Oncol. 2019;37(2):93–96.
24. Norton WE, McCaskill-Stevens W, Chambers DA, Stella PJ, Brawley OW, Kramer BS. DeImplementing ineffective and low-value clinical practices: Research and practice opportunities in community oncology settings. JNCI Cancer Spectr. 2021;5(2):pkab020.
25. Fisher ES, Shortell SM, Savitz LA. Implementation science: A potential catalyst for delivery system reform. JAMA. 2016;315(4):339–340.
26. Kilbourne AM, Glasgow RE, Chambers DA. What can implementation science do for you? Key success stories from the field. J Gen Intern Med. 2020;35(Suppl 2):783–787.
27. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: Understanding time lags in translational research. J R Soc Med. 2011;104(12):510–520.
28. Khan S, Chambers D, Neta G. Revisiting time to translation: Implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes Control. 2021;32(3):221–230.
29. Neta G, Sanchez MA, Chambers DA, et al. Implementation science in cancer prevention and control: A decade of grant funding by the National Cancer Institute and future directions. Implement Sci. 2015;10:4.
30. Kim DD, Cohen JT, Wong JB, et al. Targeted incentive programs for lung cancer screening can improve population health and economic efficiency. Health Aff (Millwood). 2019;38(1):60–67.
31. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: Current and future directions. Am J Public Health. 2012;102(7):1274–1281.
32. Meissner HI, Glasgow RE, Vinson CA, et al. The U.S. training institute for dissemination and implementation research in health. Implement Sci. 2013;8:12.
33. Neta G, Clyne M, Chambers DA. Dissemination and implementation research at the National Cancer Institute: A review of funded studies (2006–2019) and opportunities to advance the field. Cancer Epidemiol Biomarkers Prev. 2021;30(2):260–267.
34. Agus DB, Jaffee EM, Van Dang C. Cancer Moonshot 2.0. Lancet Oncol. 2021;22(2):164–165.
35. Oh A, Vinson CA, Chambers DA. Future directions for implementation science at the National Cancer Institute: Implementation science centers in cancer control. Transl Behav Med. 2020;11(2):669–675. doi: 10.1093/tbm/ibaa018.
36. Patel MS, Volpp KG, Asch DA. Nudge units to improve the delivery of health care. N Engl J Med. 2018;378(3):214–216.
37. Yoong SL, Hall A, Stacey F, et al. Nudge strategies to improve healthcare providers' implementation of evidence-based guidelines, policies and practices: A systematic review of trials included within Cochrane systematic reviews. Implement Sci. 2020;15(1):50.
38. Patel MS, Day SC, Halpern SD, et al. Generic medication prescription rates after health system-wide redesign of default options within the electronic health record. JAMA Intern Med. 2016;176(6):847–848.
39. Doshi JA, Lim R, Li P, et al. A synchronized prescription refill program improved medication adherence. Health Aff (Millwood). 2016;35(8):1504–1512.
40. Patel MS, Kurtzman GW, Kannan S, et al. Effect of an automated patient dashboard using active choice and peer comparison performance feedback to physicians on statin prescribing: The PRESCRIBE cluster randomized clinical trial. JAMA Netw Open. 2018;1(3):e180818.
41. Becker GS. The Economic Approach to Human Behavior. Chicago, IL: The University of Chicago Press; 1978.
42. Loewenstein G, Asch DA, Volpp KG. Behavioral economics holds potential to deliver better results for patients, insurers, and employers. Health Aff (Millwood). 2013;32(7):1244–1250.
43. Asch DA, Terwiesch C, Mahoney KB, Rosin R. Insourcing health care innovation. N Engl J Med. 2014;370(19):1775–1777.
44. Boustani M, Alder CA, Solid CA. Agile implementation: A blueprint for implementing evidence-based healthcare solutions. J Am Geriatr Soc. 2018;66(7):1372–1376.
45. Chambers DA, Feero WG, Khoury MJ. Convergence of implementation science, precision medicine, and the learning health care system: A new model for biomedical research. JAMA. 2016;315(18):1941–1942.
46. Lewis CC, Klasnja P, Powell BJ, et al. From classification to causality: Advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.
47. Williams NJ, Beidas RS. Annual Research Review: The state of implementation science in child psychology and psychiatry: A review and suggestions to advance the field. J Child Psychol Psychiatry. 2018;60(4):430–450. doi: 10.1111/jcpp.12960.
48. Lewis CC, Boyd MR, Walsh-Bailey C, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21.
49. Nooraie RY, Kwan BM, Cohn E, et al. Advancing health equity through CTSA programs: Opportunities for interaction between health equity, dissemination and implementation, and translational science. J Clin Transl Sci. 2020;4(3):168–175.
50. Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16(1):28.
51. Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: Addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8:134.
52. Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190.
53. Powell BJ, Beidas RS, Lewis CC, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–194.
54. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: An overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998;317(7156):465–468.
55. Green LW. Making research relevant: If it is an evidence-based practice, where’s the practice-based evidence? Fam Pract. 2008;25 Suppl 1:i20–i24.
56. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.
57. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: Models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–350.
58. Flottorp SA, Oxman AD, Krause J, et al. A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8:35.
59. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.
60. Kahneman D, Tversky A. Prospect theory: An analysis of decision under risk. Econometrica. 1979;47(2):263–292.
61. Thaler R, Sunstein C. Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, CT: Yale University Press; 2008.
62. Tversky A, Kahneman D. Judgment under uncertainty: Heuristics and biases. Science. 1974;185(4157):1124–1131.
63. Fiske ST, Taylor SE. Social Cognition: From Brains to Culture. 2nd ed. Thousand Oaks, CA: SAGE Publications; 2013.
64. Tversky A, Kahneman D. The framing of decisions and the psychology of choice. Science. 1981;211(4481):453–458.
65. Beidas RS, Volpp KG, Buttenheim A, et al. Transforming mental health delivery through behavioral economics and implementation science: Protocol for three exploratory projects. JMIR Res Protoc. 2019;8(2):e12121.
66. Saghai Y. Salvaging the concept of nudge. J Med Ethics. 2013;39(8):487–493.
67. Hepple B. Public Health: Ethical Issues. London, UK: Nuffield Council on Bioethics; 2007.
68. Takvorian SU, Ladage VP, Wileyto EP, et al. Association of behavioral nudges with high-value evidence-based prescribing in oncology. JAMA Oncol. 2020;6(7):1104–1106.
69. Volpp KG, Troxel AB, Pauly MV, et al. A randomized, controlled trial of financial incentives for smoking cessation. N Engl J Med. 2009;360(7):699–709.
70. Halpern SD, French B, Small DS, et al. Randomized trial of four financial-incentive programs for smoking cessation. N Engl J Med. 2015;372(22):2108–2117.
71. Beidas RS, Buttenheim AM, Mandell DS. Transforming mental health care delivery through implementation science and behavioral economics. JAMA Psychiatry. 2021. doi: 10.1001/jamapsychiatry.2021.1120.
72. Wensing M, Oxman A, Baker R, et al. Tailored Implementation For Chronic Diseases (TICD): A project protocol. Implement Sci. 2011;6:103.
73. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
74. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.
75. Hsiao CJ, Hing E. Use and characteristics of electronic health record systems among office-based physician practices: United States, 2001–2012. NCHS Data Brief. 2012;143(111):1–8.
76. Melnick ER, Sinsky CA, Krumholz HM. Implementing measurement science for electronic health record use. JAMA. 2021;325(21):2149–2150.
77. Graetz I, Gordon N, Fung V, Hamity C, Reed ME. The digital divide and patient portals: Internet access explained differences in patient portal use for secure messaging by age, race, and income. Med Care. 2016;54(8):772–779.
78. Lau-Min KS, Guerra CE, Nathanson KL, Bekelman JE. From race-based to precision oncology: Leveraging behavioral economics and the electronic health record to advance health equity in cancer care. JCO Precis Oncol. 2021;5:403–407.
79. Davis M, Beidas RS. Refining contextual inquiry to maximize generalizability and accelerate the implementation process. Implement Res Pract. 2021;2:1–4.
80. Glasgow RE, Battaglia C, McCreight M, Ayele RA, Rabin BA. Making implementation science more rapid: Use of the RE-AIM framework for mid-course adaptations across five health services research projects in the Veterans Health Administration. Front Public Health. 2020;8:194.
81. Smith J, Rapport F, O’Brien TA, et al. The rise of rapid implementation: A worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review. BMC Health Serv Res. 2020;20(1):449.
82. Øvretveit J. Implementation researchers can improve the responses of services to the COVID-19 pandemic. Implement Res Pract. 2020;1. doi: 10.1177/2633489520949151.
83. Taylor SP, Short RT 3rd, Asher AM, Taylor B, Beidas RS. A rapid pre-implementation evaluation to inform a family engagement navigator program during COVID-19. Implement Sci Commun. 2020;1(1):110.
84. Hart JL, Turnbull AE, Oppenheim IM, Courtright KR. Family-centered care during the COVID-19 era. J Pain Symptom Manage. 2020;60(2):e93–e97.
85. Klaiman T, Silvestri JA, Srinivasan T, et al. Improving prone positioning for severe acute respiratory distress syndrome during the COVID-19 pandemic: An implementation-mapping approach. Ann Am Thorac Soc. 2021;18(2):300–307.
86. Dror AA, Eisenbach N, Taiber S, et al. Vaccine hesitancy: The next challenge in the fight against COVID-19. Eur J Epidemiol. 2020;35(8):775–779.
87. Egede LE, Walker RJ. Structural racism, social risk factors, and COVID-19 – a dangerous convergence for Black Americans. N Engl J Med. 2020;383(12):e77.
88. Laughlin A, Begley M, Delaney T, et al. Accelerating the delivery of cancer care at home during the COVID-19 pandemic. NEJM Catalyst. 2020. doi: 10.1056/cat.20.0258. Available at https://catalyst.nejm.org/doi/full/10.1056/cat.20.0258.
89. Royce TJ, Sanoff HK, Rewari A. Telemedicine for cancer care in the time of COVID-19. JAMA Oncol. 2020;6(11):1698–1699.
90. Koczwara B, Stover AM, Davies L, et al. Harnessing the synergy between improvement science and implementation science in cancer: A call to action. J Oncol Pract. 2018;14(6):335–340.
91. Institute of Medicine Roundtable on Value & Science-Driven Health Care. 2012.
92. Nilsen P, Bernhardsson S. Context matters in implementation science: A scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. 2019;19(1):189.
93. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Adm Policy Ment Health. 2016;43(5):783–798.
94. Williams NJ, Ehrhart MG, Aarons GA, Marcus SC, Beidas RS. Linking molar organizational climate and strategic implementation climate to clinicians’ use of evidence-based psychotherapy techniques: Cross-sectional and lagged analyses from a 2-year observational study. Implement Sci. 2018;13(1):85.
95. Williams NJ, Beidas RS. Navigating the storm: How proficient organizational culture promotes clinician retention in the shift to evidence-based practice. PLoS One. 2018;13(12):e0209745.
96. Ragin CC. Redesigning Social Inquiry. Chicago, IL: University of Chicago Press; 2008.
97. Rihoux B, Ragin CC. Configurational Comparative Methods. Thousand Oaks, CA: SAGE Publications; 2008.
98. Whitaker RG, Sperber N, Baumgartner M, et al. Coincidence analysis: A new method for causal inference in implementation science. Implement Sci. 2020;15(1):108.
99. Kahwati L, Viswanathan M, Golin CE, Kane H, Lewis M, Jacobs S. Identifying configurations of behavior change techniques in effective medication adherence interventions: A qualitative comparative analysis. Syst Rev. 2016;5:83.
100. Kane H, Lewis MA, Williams PA, Kahwati LC. Using qualitative comparative analysis to understand and quantify translation and implementation. Transl Behav Med. 2014;4(2):201–208.
101. Baumgartner M, Ambühl M. Causal modeling with multi-value and fuzzy-set coincidence analysis. Political Sci Res Methods. 2018;8(3):526–542. doi: 10.1017/psrm.2018.45.
102. Barker KM, Dunn EC, Richmond TK, Ahmed S, Hawrilenko M, Evans CR. Cross-classified multilevel models (CCMM) in health research: A systematic review of published empirical studies and recommendations for best practices. SSM Popul Health. 2020;12:100661.
103. Taplin SH, Anhang Price R, Edwards HM, et al. Introduction: Understanding and influencing multilevel factors across the cancer care continuum. J Natl Cancer Inst Monogr. 2012;2012(44):2–10.
104. Nekhlyudov L, Levit L, Hurria A, Ganz PA. Patient-centered, evidence-based, and cost-conscious cancer care across the continuum: Translating the Institute of Medicine report into clinical practice. CA Cancer J Clin. 2014;64(6):408–421.
105. Toll BA, Brandon TH, Gritz ER, Warren GW, Herbst RS; AACR Subcommittee on Tobacco and Cancer. Assessing tobacco use by cancer patients and facilitating cessation: An American Association for Cancer Research policy statement. Clin Cancer Res. 2013;19(8):1941–1948.
106. Goldstein AO, Ripley-Moffitt CE, Pathman DE, Patsakham KM. Tobacco use treatment at the US National Cancer Institute’s designated Cancer Centers. Nicotine Tob Res. 2012;15(1):52–58.
107. Bailey ZD, Feldman JM, Bassett MT. How structural racism works – racist policies as a root cause of U.S. racial health inequities. N Engl J Med. 2021;384(8):768–773.
108. El-Deiry WS, Giaccone G. Challenges in diversity, equity, and inclusion in research and clinical oncology. Front Oncol. 2021;11:642112.
109. Rendle KA, Burnett-Hartman AN, Neslund-Dudas C, et al. Evaluating lung cancer screening across diverse healthcare systems: A process model from the Lung PROSPR Consortium. Cancer Prev Res (Phila). 2020;13(2):129–136.
110. Shelton RC, Brotzman LE, Johnson D, Erwin D. Trust and mistrust in shaping adaptation and de-implementation in the context of changing screening guidelines. Ethn Dis. 2021;31(1):119–132.
111. Sterling MR, Echeverría SE, Commodore-Mensah Y, Breland JY, Nunez-Smith M. Health equity and implementation science in heart, lung, blood, and sleep-related research: Emerging themes from the 2018 Saunders-Watkins Leadership Workshop. Circ Cardiovasc Qual Outcomes. 2019;12(10):e005586.
112. Galaviz KI, Breland JY, Sanders M, et al. Implementation science to address health disparities during the coronavirus pandemic. Health Equity. 2020;4(1):463–467.
113. Shelton RC, Winkel G, Davis SN, et al. Validation of the group-based medical mistrust scale among urban Black men. J Gen Intern Med. 2010;25(6):549–555.
114. Zahran S. Software Process Improvement: Practical Guidelines for Business Success. New York, NY: Addison-Wesley; 1998.
115. Asch DA, Rosin R. Innovation as discipline, not fad. N Engl J Med. 2015;373(7):592–594.
116. Hirshberg A, Vandertuyn M, Mahraj K. Rapid-cycle innovation testing of text-based monitoring for management of postpartum hypertension. J Clin Outcomes Manag. 2017;24:77–85.
117. Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: SAGE Publications; 2011.
