Translational Behavioral Medicine. 2017 Feb 22;7(2):380–384. doi: 10.1007/s13142-017-0474-4

Behavioral and Social Sciences at the National Institutes of Health: adoption of research findings in health research and practice as a scientific priority

William T. Riley
PMCID: PMC5526815  PMID: 28229306

Abstract

The National Institutes of Health’s Office of Behavioral and Social Sciences Research (OBSSR) recently released its Strategic Plan for 2017 to 2021. This plan highlights three scientific priorities: (1) improve the synergy of basic and applied behavioral and social sciences research, (2) enhance and promote the research infrastructure, methods, and measures needed to support a more cumulative and integrated approach to behavioral and social sciences research, and (3) facilitate the adoption of behavioral and social sciences research findings in health research and in practice. This commentary focuses on the challenges and opportunities to facilitate the adoption of research findings in health research and in practice. In addition to the ongoing NIH support for dissemination and implementation (D&I) research, we must address transformative challenges and opportunities such as better disseminating and implementing D&I research, merging research and practice, adopting more rigorous and diverse methods and measures for both D&I and clinical trials research, evaluating technology-based delivery of interventions, and transitioning from minimally adaptable intervention packages to planned adaptations rooted in behavior change principles. Beyond translation into practice and policy, the OBSSR Strategic Plan also highlights the need for translation of behavioral and social science findings into the broader biomedical research enterprise.

Keywords: Dissemination and implementation research, Implementation science, Practice adoption, Planned adaptation, Clinical research methodology, Clinical translation, Behavioral and social sciences research


“Why fund behavioral intervention research if the interventions found effective are not adopted in practice?” This was a recurring question among the National Institutes of Health (NIH) institute and center directors who provided input on the recently released Office of Behavioral and Social Sciences Research Strategic Plan 2017–2021 [1]. This practice adoption concern was echoed by the Strategic Planning expert panel composed of leaders in behavioral and social sciences research (BSSR). Based on these and other inputs, one of the OBSSR Strategic Plan’s three scientific priorities is to facilitate the adoption of behavioral and social sciences research findings in health research and in practice. The other two scientific priorities, (1) improve the synergy of basic and applied BSSR and (2) enhance and promote the research infrastructure, methods, and measures needed to support a more cumulative and integrated approach to BSSR, are described further elsewhere [2, 3].

Inadequate translation of research findings into practice is not unique to the behavioral and social sciences. The chasm between research and practice has been well-documented across the entire health delivery system [4, 5]. The translation of social and behavioral interventions into health practice and policy, however, presents unique challenges. First, the extensive market-driven system and the accompanying regulatory structure to facilitate the adoption of safe and effective medical interventions are essentially nonexistent for behavioral interventions. Second, health care insurance reimbursement seldom covers effective behavioral interventions although recent progress has been made in this regard via the Affordable Care Act [6]. Even when behavioral interventions are reimbursed, fee-for-service reimburses based on the time, not the quality or empirical basis of the services delivered. Third, compared to medical interventions, the settings in which behavioral and social interventions are delivered are much more diverse and contextually different, including not only the health care setting, but also communities, schools, workplaces, and policy settings, most of which have competing interests and constrained resources. Fourth, behavioral interventions are often complex and human resource intensive, requiring considerable training and time to deliver with fidelity.

NIH SUPPORT FOR DISSEMINATION AND IMPLEMENTATION RESEARCH

Other governmental agencies such as the Centers for Disease Control and Prevention (CDC) and the Substance Abuse and Mental Health Services Administration (SAMHSA) have primary responsibility for the dissemination and implementation of quality public health interventions, and an objective within the OBSSR scientific priority to facilitate adoption is to work more closely with these agency partners to ensure that research findings supported by the NIH are rapidly and readily translated into public health practice and policy. For the NIH to fully achieve its mission to enhance health, lengthen life, and reduce illness and disability, however, we cannot leave our research findings “at the water’s edge” and simply hope that others implement the findings supported by the NIH into practice and policy.

For over a decade, the NIH has supported dissemination and implementation (D&I) research to understand better how to facilitate the translation of research findings into practice. Between 2007 and 2014, 146 NIH research grants were awarded through D&I funding announcements [7]. The National Cancer Institute (NCI) recently reported funding 67 grants with an implementation science focus between 2000 and 2012, concluding that cancer-oriented implementation science is diverse and active but could be enhanced with greater focus on measurement development/harmonization and on better linking conceptual frameworks to D&I outcomes [8]. The NIH also has supported transdisciplinary training in D&I methodologies and approaches [9]. Potential future opportunities to advance population health D&I include collaborations between the behavioral and social science community and Clinical and Translational Science Awards (CTSAs) to address key issues such as data challenges, common theory of change, and the study of complex and dynamic processes [10]. In addition to support for D&I research, the NIH must capitalize on transformative challenges and opportunities to translate research findings into practice and policy.

D&I FOR D&I RESEARCH

Funding and training in D&I research alone are insufficient to achieve the OBSSR scientific priority to facilitate adoption of research findings into practice. A recent review of public health D&I research [11] showed that only 13% of these publications were original research or literature reviews, and of these, only about a quarter were intervention-focused; the remainder were descriptive/epidemiological in nature. Measurement research, often identified as a clear need in the field, made up less than 2% of publications. Moreover, the less common study types (systematic reviews, randomized controlled trials (RCTs), cohort studies) were the most frequently cited. Forums for sharing D&I research findings across disciplines have been proposed to increase the transdisciplinary influence of this research [12]. The limited influence of D&I research on researchers suggests an even more limited influence of D&I research on the practice and policy settings in which these D&I findings need to be adopted. Essentially, D&I research appears to have a D&I problem, and more progress is needed in implementing D&I findings in practice and policy settings to facilitate adoption of effective interventions.

FURTHER BLURRING OF RESEARCH AND PRACTICE

The classic medical D&I model is stepwise and discrete. An intervention is found effective in one or more large RCTs, then practice guidelines are developed and promulgated, and barriers to guideline adoption are addressed [13]. Ensuring the safety and effectiveness of an intervention before dissemination and implementation is standard practice in this model. It becomes less necessary, however, to follow this staged model when intervention risks are minimal and existing intervention options are either unavailable or of limited effectiveness, conditions often found for behavioral or public health interventions. Therefore, behavioral interventions are more amenable to being evaluated as they are being disseminated, offering a more real-world test of the intervention in the context in which it is intended to be delivered. Advances in learning healthcare systems [14] and in rigorous methodologies more appropriate for practice settings [15] provide research frameworks to evaluate intervention effectiveness in practice settings. Thirty years of failed practice implementation of motivational interviewing [16] indicates that sustained implementation of complex and human resource intensive interventions requires ongoing training and regular monitoring and benchmarking of outcomes. The continued and increasing integration of research within practice settings provides for routine outcome and quality metrics that facilitate continuous evaluation of interventions and their implementation in practice.

IMPROVED DIVERSITY, RIGOR, AND ACCEPTANCE OF D&I METHODOLOGIES

Traditional RCT designs have limited feasibility, practicality, and appropriateness for D&I research [17]. D&I research requires a range of methodological expertise beyond RCTs including mixed methods approaches [18] and quasi-experimental designs more amenable to the research constraints of practice settings [19]. A greater emphasis on cost-benefit analyses provides stakeholders with critical information for prioritizing services with constrained resources [20]. Northridge and Metcalf [21] make a compelling case for more system science approaches to understand and intervene better in the complex, dynamic, and interrelated systems in which social and behavioral interventions are implemented. Good science, including good system science, requires precise, accurate, and temporally dense measurement approaches, and improvements in implementation science measurement have been cited as a critical need of the field [8, 20, 22, 23].

Methodology and measurement advances are needed not only in D&I research proper but also in clinical trials research to facilitate the adoption of the findings from these trials. Intervention research supported by the NIH needs to consider practice/policy adoption from inception. Designing for dissemination, accelerating the clinical trials process, and involving stakeholders throughout intervention development and evaluation remain crucial to increasing the likelihood of practice adoption [24]. A continuing tension when designing for dissemination is that constraining interventions to fit current resources available to deliver them may limit effectiveness. If research shows that more intervention dose than current resources allow is necessary to produce a clinically meaningful effect, then practice and policy need to adapt to these increased resource demands.

Adjustments to interventions often occur during intervention trials, but these adjustments are seldom reported because they are perceived to conflict with procedures requiring that the intended intervention remain unchanged throughout the study. These intervention tweaks and adjustments, however, are often particularly useful information for those attempting to implement these interventions in practice [25]. Designs such as Continuous Evaluation of Evolving Intervention Technologies (CEEBIT) [26] provide for more planned evaluation of improved versions of interventions over time. Multiphase Optimization STrategy (MOST) designs provide those implementing interventions with important information on which combination and sequence of intervention components are critical to retain in any truncated version of the intervention [27]. These methodological and measurement advances are addressed not only in the OBSSR scientific priority to facilitate adoption, but also in the scientific priority to advance measurement and methods in the behavioral and social sciences.

EVALUATION OF TECHNOLOGY-BASED INTERVENTIONS

Based in part on implementation difficulties, technology-based behavioral interventions have been developed and implemented [28]. Technology-based interventions can be used to augment intervention providers but also can be used to bypass the provider, fully automating the intervention. Although intervention automation can fail to capture aspects of in-person interventions that are difficult to operationalize or deliver given current technology capabilities, there are implementation advantages of technology-based interventions. Technology-based interventions are delivered with fidelity. Servers, smartphones, and software never get tired, distracted, or place their professional judgment above the packaged intervention. Intervention development is resource intensive, but once these fixed development costs are covered, the variable costs of intervention delivery are minimal, greatly increasing reach and scalability. Technology-based interventions also can automate outcome measurement, providing the measurement infrastructure needed for continuous quality improvement testing of the intervention. These interventions, however, are not a panacea. Although they offer considerable promise, effectiveness studies to date have had mixed results [29, 30] and sustained engagement with technology-based interventions remains a challenge. Further evaluation of the use of technologies to implement effective interventions with reach and scalability is needed.

TRANSITION FROM PACKAGES TO PRINCIPLES

Atkins and colleagues [31] noted that one of the misguided assumptions of D&I in mental health services research is that evidence-based packages (EBPs) are the gold standard. They argue that EBPs fail to accommodate the realities of practice and that promoting packaged EBPs does not successfully integrate knowledge of settings and persons toward maximal impact. Deviation from the EBP produces unknown changes in effectiveness, placing the burden on practitioners to follow the EBP with full fidelity or risk diminishing effectiveness.

Planned adaptation [32] acknowledges the tension between implementing programs with fidelity and adapting programs to fit the population, setting, and context. Planned adaptation provides a framework to guide practitioners in adapting programs while encouraging researchers to provide information relevant to adaptation. Requiring practitioners and practice settings to follow a standard EBP is analogous to asking bridge builders to follow a standard bridge blueprint for all bridges they build. Instead, civil engineers apply principles from physics and other sciences and adapt bridge building to the setting (e.g., span distance, weather, footing characteristics), demands (e.g., traffic flow, ship mast clearance), and resources (e.g., funds available and skills of workers available). Continuing to transition from rigid EBPs to more planned adaptations and ultimately to interventions developed and evaluated by practitioners and policymakers based on principles of behavioral and social change will make research findings more relevant and flexibly implemented in practice.

IMPLEMENTATION OF BEHAVIORAL AND SOCIAL SCIENCE FINDINGS IN HEALTH RESEARCH

Although the focus of implementation science is on translation of research findings into practice and policy, there is a need to extend this work to the translation of behavioral and social sciences research within the broader health research enterprise. For example, medication clinical trials have suffered from inadequate adherence to the assigned medications. Research on medication adherence can be applied more systematically to ensure adequate medication adherence in these trials [33]. Another challenge of health research is the engagement of participants in longitudinal trials and observational cohort studies [34]. Principles of behavioral and social sciences such as altruism, persuasion, and motivation can be applied to improve the sustained engagement of participants in research. Therefore, the OBSSR scientific priority to facilitate adoption of behavioral and social science findings extends to the health research sector as well.

CONCLUSION

Through our foundational processes of communications, program coordination, training, and policy and evaluation, the OBSSR will work with our colleagues in the various NIH institutes, centers, and offices; our partner agencies dedicated to practice implementation; and the behavioral and social sciences research community to facilitate the adoption of behavioral and social sciences research findings into practice, policy, and health research. Our efforts need to extend beyond our continued support for rigorous D&I research to address some of the transformative challenges of the field including D&I of our D&I research findings, further integration of research into practice, advances in methods and measures, evaluation of technology-based intervention delivery, and the shift toward greater flexibility of intervention delivery based more on principles than packages. One sign that we are making progress is when the NIH can more readily connect the intervention research we fund to the implementation of effective interventions in practice and policy.

Footnotes

Implications

Practice: Transformational opportunities such as the merging of research into practice and planned adaptations of evidence-based interventions should provide practitioners with greater flexibility to adapt evidence-based interventions to the population, context, and resource constraints of the settings.

Policy: More rapid and readily available research findings from questions generated by policymakers and other research stakeholders should make behavioral and social sciences research more responsive to policy needs.

Research: The National Institutes of Health and Office of Behavioral and Social Sciences Research support for more rigorous and diverse methods and measures, for research designs more readily translated into practice, and for continued dissemination and implementation research should facilitate the adoption of behavioral and social sciences research into practice, policy, and the broader biomedical research enterprise.

References

  • 1.The Office of Behavioral and Social Sciences Research Strategic Plan 2017-2021. Retrieved November 23, 2016, from https://obssr.od.nih.gov/wp-content/uploads/2016/12/OBSSR-SP-2017-2021.pdf.
  • 2.Riley WT. Basic and applied behavioural and social sciences at the NIH. Nat Hum Behav. 2017;1:0023. doi: 10.1038/s41562-016-0023. [Google Scholar]
  • 3.Riley WT. Behavioral and social sciences at the National Institutes of Health: methods, measures, and data infrastructures as a scientific priority. Health Psychol. 2017;36(1):5–7. [DOI] [PubMed]
  • 4.Meslin EM, Blasimme A, Cambon-Thomsen A. Mapping the translational science policy “valley of death”. Clin Transl Med. 2013;2(1):1–8. doi: 10.1186/2001-1326-2-14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Woolf SH. The meaning of translational research and why it matters. JAMA. 2008;299(2):274–286. doi: 10.1001/jama.2007.26. [DOI] [PubMed] [Google Scholar]
  • 6.Siu AL, Bibbins-Domingo K, Grossman D. Evidence-based clinical prevention in the era of the Patient Protection and Affordable Care Act: the role of the US Preventive Services Task Force. JAMA. 2015;314(19):2021–2022. doi: 10.1001/jama.2015.13154. [DOI] [PubMed] [Google Scholar]
  • 7.Purtle J, Peters R, Brownson RC. A review of policy dissemination and implementation research funded by the National Institutes of Health, 2007-2014. Implement Sci. 2016;11:1. doi: 10.1186/s13012-015-0367-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Neta G, Sanchez MA, Chambers DA, Phillips SM, Leyva B, Cynkin L, et al. Implementation science in cancer prevention and control: a decade of grant funding by the National Cancer Institute and future directions. Implement Sci. 2015;10:4. doi: 10.1186/s13012-014-0200-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Proctor EK, Chambers DA. Training in dissemination and implementation research: a field wide perspective. Transl Behav Med. 2016 doi: 10.1007/s13142-016-0406-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Kuo T, Gase LN, Inkelas M, The Population Health and Policy Workgroup Dissemination, implementation, and improvement science research in population health: opportunities for public health and CTSAs. Clin Transl Sci. 2015;8(6):807–813. doi: 10.1111/cts.12313. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Wolfenden L, Milat AJ, Lecathelinais C, Skelton E, Clinton-McHarg T, Williams C, et al. A bibliographic review of public health dissemination and implementation research output and citation rates. Prev Med Rep. 2016;4:441–443. doi: 10.1016/j.pmedr.2016.08.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Brunner JW, Sankare IC, Kahn KL. Interdisciplinary priorities for dissemination, implementation, and improvement science: frameworks, mechanics, and measures. Clin Transl Sci. 2015;8(6):820–823. doi: 10.1111/cts.12319. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Fischer F, Lange K, Klose K, Greiner W, Kraemer A. Barriers and strategies in guideline implementation—a scoping review. Healthcare. 2016;4:36. doi: 10.3390/healthcare4030036. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Budrionis A, Bellika JG. The Learning Healthcare System: Where are we now? A systematic review. J Biomed Inform. 2016;64:87–92. doi: 10.1016/j.jbi.2016.09.018. [DOI] [PubMed] [Google Scholar]
  • 15.Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–226. doi: 10.1097/MLR.0b013e3182408812. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Hall K, Staiger PK, Simpson A, Best D, Lubman DI. After 30 years of dissemination, have we achieved sustained practice change in motivational interviewing? Addiction. 2016;111(7):1144–1150. doi: 10.1111/add.13014. [DOI] [PubMed] [Google Scholar]
  • 17.Nutbeam D, Bauman AE. Evaluation in a nutshell: a practical guide to the evaluation of health promotion programs. New York: McGraw-Hill; 2006. [Google Scholar]
  • 18.Southam-Gerow MA, Dorsey S. Qualitative and mixed methods research in dissemination and implementation science: introduction to the special issue. J Clin Child Adolesc Psychol. 2014;43(6):845–850. doi: 10.1080/15374416.2014.930690. [DOI] [PubMed] [Google Scholar]
  • 19.Rabin BA, Brownson RC, Kerner JF, Glasgow RE. Methodologic challenges in disseminating evidence-based interventions to promote physical activity. Am J Prev Med. 2006;31(4 Suppl):S24–S34. doi: 10.1016/j.amepre.2006.06.009. [DOI] [PubMed] [Google Scholar]
  • 20.Chan CKY, Oldenburg B, Viswanath K. Advancing the science of dissemination and implementation in behavioral medicine: evidence and progress. Int J Behav Med. 2015;22:277–282. doi: 10.1007/s12529-015-9490-2. [DOI] [PubMed] [Google Scholar]
  • 21.Northridge ME, Metcalf SS. Enhancing implementation science by applying best principles of system science. Health Res Policy Sys. 2016;14:74. doi: 10.1186/s12961-016-0146-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Rabin BA, Lewis CC, Norton WE, Neta G, Chambers D, Tobin JN, et al. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:42. doi: 10.1186/s13012-016-0401-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Glasgow RE, Riley WT. Pragmatic measures: what they are and why we need them. Am J Prev Med. 2013;45(2):237–243. doi: 10.1016/j.amepre.2013.03.010. [DOI] [PubMed] [Google Scholar]
  • 24.Riley WT, Glasgow RE, Etheridge L, Abernethy AP. Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise. Clin Transl Med. 2013;2(1):10. doi: 10.1186/2001-1326-2-10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Neta G, Glasgow RE, Carpenter CR, Grimshaw JM, Rabin BA, Fernandez ME, et al. A framework for enhancing the value of research for dissemination and implementation. Am J Pub Health. 2015;105(1):49–57. doi: 10.2105/AJPH.2014.302206. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Mohr DC, Cheung K, Schueller SM, Hendricks Brown C, Duan N. Continuous evaluation of evolving behavioral intervention technologies. Am J Prev Med. 2013;45(4):517–523. doi: 10.1016/j.amepre.2013.06.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Buscemi J, Janke A, Kugler KC, Duffecy J, Mielenz TJ, St. George SM, et al. Increasing public health impact of evidence-based intervention in behavioral medicine: new approaches and future directions. J Behav Med. 2016 doi: 10.1007/s10865-016-9773-3. [DOI] [PubMed] [Google Scholar]
  • 28.Linke SE, Larsen BA, Marquez B, Mendoza-Vasconez A, Marcus BH. Adapting technological interventions to meet the needs of priority populations. Prog Cardiovasc Dis. 2016;58(6):630–638. doi: 10.1016/j.pcad.2016.03.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Raaijmakers LC, Pouwels S, Berghuis KA, Nienhuis SW. Technology-based interventions in the treatment of overweight and obesity: a systematic review. Appetite. 2015;95:138–151. doi: 10.1016/j.appet.2015.07.008. [DOI] [PubMed] [Google Scholar]
  • 30.Fowler LA, Holt SL, Joshi D. Mobile technology-based interventions for adult users of alcohol: a systematic review of the literature. Addict Behav. 2016;62:25–34. doi: 10.1016/j.addbeh.2016.06.008. [DOI] [PubMed] [Google Scholar]
  • 31.Atkins MS, Rusch D, Mehta TG, Lakind D. Future directions for dissemination and implementation science: Aligning ecological theory and public health to close the research to practice gap. J Clin Child Adolesc Psychol. 2016;45(2):215–226. doi: 10.1080/15374416.2015.1050724. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Lee SJ, Altschul I, Mowbray CT. Using planned adaptation to implement evidence-based programs with new populations. Am J Community Psychol. 2008;41:290–303. doi: 10.1007/s10464-008-9160-5. [DOI] [PubMed] [Google Scholar]
  • 33.Atassi N, Yerramilli-Rao P, Szymonifka J, Yu H, Kearney M, Grasso D, et al. Analysis of start-up, retention, and adherence in ALS clinical trials. Neurology. 2013;81(15):1350–1355. doi: 10.1212/WNL.0b013e3182a823e0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Cobb EM, Meurer W, Harney D, Silbergleit R, Lake BP, Clark C, et al. Patient engagement in neurological clinical trials design: a conference summary. Clin Transl Sci. 2015;8(6):776–778. doi: 10.1111/cts.12297. [DOI] [PMC free article] [PubMed] [Google Scholar]

Articles from Translational Behavioral Medicine are provided here courtesy of Oxford University Press
