FASEB BioAdvances. 2021 Apr 7;3(8):626–638. doi: 10.1096/fba.2020-00124

Learning health systems: Driving real‐world impact in mental health and substance use disorder research

Amy M Kilbourne 1,2, Emily Evans 1, David Atkins 1
PMCID: PMC8332471  PMID: 34377958

Abstract

The Veterans Health Administration (VHA), under the U.S. Department of Veterans Affairs (VA), is one of the largest single providers of health care in the U.S. VA supports an embedded research program that addresses VA clinical priorities in close partnership with operations leaders, which is a hallmark of a Learning Health System (LHS). Using the LHS framework, we describe current VA research initiatives in mental health and substance use disorders that rigorously evaluate national programs and policies designed to reduce the risk of suicide and opioid use disorder (data to knowledge); test implementation strategies to improve the spread of effective programs for Veterans at risk of suicide or opioid use disorder (knowledge to performance); and identify novel research directions in suicide prevention and opioid/pain treatments emanating from implementation and quality improvement research (performance to data). Lessons learned are encapsulated into best practices for building and sustaining an LHS within health systems, including the need for early engagement with clinical leaders; pragmatic research questions that focus on continuous improvement; multi‐level, ongoing input from regional and local stakeholders; and business case analyses to inform ongoing investment in sustainable infrastructure to maintain the research‐health system partnership. Essential ingredients for supporting VA as an LHS include data and information sharing capacity, protected time for researchers and leaders, and governance structures to enhance health system ownership of research findings. For researchers, incentives to work with health system operations (e.g., retainer funding) are vital for LHS research to be recognized and valued by academic promotion committees.

Keywords: chronic disease management, implementation science, learning health systems, mental disorders, substance use disorders, Veterans

1. BACKGROUND

There is a well‐documented disconnect between health research and practice. Research findings often take years, if not decades, to be translated into routine clinical practice. 1 Research may not address the urgent needs of health systems, providers, or patients. 2 A recent report from the National Academy of Medicine on the Future of Health Services Research 3 points to the lack of alignment between academic research priorities and the needs of health systems and recommends that health researchers do more to address real‐world problems identified by health systems, providers, patients, and other stakeholders.

This gap between research and practice, together with care delivery challenges that existing research does not address, is especially problematic for persons suffering from noncommunicable diseases (NCDs), including chronic conditions such as mental health or substance use disorders. Mental health and substance use disorders are often considered “index” chronic conditions given the need for ongoing, coordinated care management across different providers. 4 Mental health (e.g., major depression, schizophrenia) and substance use disorders (e.g., alcohol, tobacco, and opioid use) are also significant risk factors for other chronic illnesses, notably cardiovascular disease and lung cancer. 5 , 6 Nearly half of the U.S. population has experienced a mental health or substance use disorder in their lifetime, 7 yet only a third receive adequate treatment. 8 Lack of treatment for these conditions can lead to preventable hospitalizations or even death from suicide. 9 Moreover, only a third of frontline providers have access to training in effective interventions for mental health or substance use disorders. 10 Many effective interventions that are designed and tested in academic research settings may not be practical for lower‐resourced clinical settings. 11 These barriers have been exacerbated by the COVID‐19 pandemic, which has delayed needed medical and mental health care and intensified the effects of social isolation.

Improving the quality and outcomes of care for mental health and substance use disorders requires redirecting research investments toward real‐world health care delivery problems and aligning incentives for researchers to work closely with health systems. However, the traditional metrics for advancing as an academic researcher emphasize research productivity based on grants and publications rather than health system impacts. This system does not incentivize researchers to pursue directions that may better address the needs of the health systems or populations they serve, 12 nor does it produce sufficiently timely results for the health systems or patients in need.

The Learning Health System (LHS) framework aligns researcher and health system incentives and priorities to increase the impact of research findings and interventions on health care and outcomes. The National Academy of Medicine has defined the LHS as a system in which “science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by‐product of the delivery experience”. 13 A fundamental mechanism of an LHS is embedding research (and researchers) within the health system to improve the impact and timeliness of research and facilitate uptake and implementation of the evidence generated. Embedded, or partnered, research is defined as the process by which researchers and health system (operations) partners “work together, with different roles, to use research both to solve practical problems and contribute to science”. 14 , 15

For these research‐operations partnerships to successfully address health care priorities related to NCDs, incentives, timing, and agendas of researchers and health system leaders need to be aligned. Current research funding mechanisms do not generally provide sustained support – for both resources and protected time – to successfully build and support the partnerships, data infrastructure, and rapid evaluations of current best practices needed to ensure an efficient and responsive research enterprise that results in meaningful impacts in clinical outcomes and care delivery. 16

This paper describes the Office of Research and Development's (ORD’s) efforts in the U.S. Department of Veterans Affairs (VA) to build an LHS, particularly in the areas of mental health and substance use disorder treatment, through funding mechanisms that align researcher and operations partner priorities and infrastructure needs. VA is a Cabinet‐level agency within the Executive Branch of the U.S. Government, within which the Veterans Health Administration is responsible for delivering health care to over 9 million Veterans. In addition to clinical care, its primary missions include research and clinician education/training. Over the past decade, VA has prioritized research focused on suicide prevention and opioid use disorder treatment. 17 As one of the largest providers of mental health and substance use disorder care in the U.S., and with an embedded research program that employs researchers affiliated with academic institutions, VA provides a unique example of building an LHS through research‐operations partnerships with support from research funding mechanisms. 18 We discuss the lessons learned from these partnered research‐operations initiatives and how they might inform the LHS community within and external to VA, especially concerning improving outcomes in mental health and substance use disorders.

2. SUICIDE PREVENTION AND OPIOID USE DISORDER IN THE VA

Suicide rates for Veterans are higher than for the general population, and suicide prevention is a high priority for VA. Between 2005 and 2017, 78,875 Veterans died by suicide, 19 more than the number of Americans killed in any major conflict except World War II and the Civil War. Evidence suggests 20 that the strongest predictors of death by suicide, either during service or after military separation, include current and past diagnoses of self‐inflicted injuries, major depression, bipolar disorder, substance use disorder, and other mental health conditions. Similarly, opioid use disorder (OUD) is a major cause of illness and death among Veterans. 21 , 22 Hence, ending Veteran suicide and addressing opioid use disorder are the focus of major U.S. national initiatives. 23 , 24

3. VA LEARNING HEALTH SYSTEM FRAMEWORK

Figure 1 provides an outline of VA’s LHS framework, adapted from previous frameworks. 25 , 26 It involves a progressive, three‐phase approach (data to knowledge, knowledge to performance, performance to data) that aligns people, processes, technologies, and policies to achieve continuous scientific learning in a health system:

FIGURE 1. VA Learning Health System Cycle: Suicide Prevention and OUD. Adapted from Friedman et al. Legend: OUD, opioid use disorder; STORM, Stratification Tool for Opioid Risk Management; REACH VET, Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment.

  • People include researchers, clinical operational leaders, patients, frontline providers, and other stakeholders who collaborate on a shared plan to address a clinical priority goal.

  • Processes include implementation of quality improvement (QI) strategies and rapid‐cycle evaluation methods to test and validate interventions in real‐world care delivery settings. Implementation strategies are methods designed to improve the quality of care by promoting uptake of evidence‐based practices among frontline providers, sites, and health systems, overcoming multilevel barriers to uptake such as costs (and other resources), care coordination, and operationalization of services.

  • Technology includes electronic data captured in real time on clinical, patient, provider, and system‐level outcomes. In the VA, the principal source of electronic data is the Corporate Data Warehouse, a compilation of longitudinal electronic health record data, including inpatient, outpatient, and emergency department visits, laboratory results, medications/treatments, diagnoses, and procedures for users of the VA health care system across all VA medical centers and community‐based outpatient clinics. More limited administrative data, including utilization, medications, and diagnoses, are available for care provided outside VA, either paid by Medicare for eligible older Veterans or paid by VA under a program to ensure more timely and convenient access to specific services. 27 (A schematic sketch of how such longitudinal data might be assembled into a patient‐level analytic cohort follows this list.)

  • Policies include governance for promoting continuous quality improvement; data ascertainment, curation, and analysis; and scientific discovery over time. In VA, governance is supported by the establishment of national resource centers to facilitate data infrastructure. For example, the VA Health Services Research and Development (HSR&D) program funds the VA Information Resource Center (VIREC), 28 which supports researchers in accessing and using VA data for research and quality improvement purposes, as well as the Health Economics Resource Center (HERC), 28 which supports researchers in determining the cost of VA care, assessing cost‐effectiveness, and evaluating the efficiency of VA programs and providers using common data elements.
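
Because the actual Corporate Data Warehouse schema is not described in this article, the following is a minimal sketch, assuming hypothetical pandas extracts (encounters, diagnoses, medications) with invented column names, of how longitudinal electronic health record data of this kind might be combined into a patient‐level analytic cohort for quality measurement or evaluation.

```python
# Illustrative sketch only: table and column names are hypothetical and do not
# reflect the actual Corporate Data Warehouse schema or any VA data standard.
import pandas as pd

def build_analytic_cohort(encounters: pd.DataFrame,
                          diagnoses: pd.DataFrame,
                          medications: pd.DataFrame,
                          index_codes: set) -> pd.DataFrame:
    """Assemble one row per patient from longitudinal EHR-style extracts.

    encounters:  patient_id, encounter_date, setting (inpatient/outpatient/ED)
    diagnoses:   patient_id, encounter_date, icd10_code
    medications: patient_id, fill_date, drug_class
    index_codes: diagnosis codes defining the condition of interest.
    """
    # Flag patients with at least one qualifying diagnosis; keep earliest date.
    flagged = (diagnoses[diagnoses["icd10_code"].isin(index_codes)]
               .groupby("patient_id")["encounter_date"].min()
               .rename("index_date"))

    # Count encounters by care setting as simple utilization measures.
    utilization = (encounters.groupby(["patient_id", "setting"])
                   .size().unstack(fill_value=0).add_prefix("n_"))

    # Count medication fills per patient as a crude treatment-exposure measure.
    rx_counts = medications.groupby("patient_id").size().rename("n_fills")

    # Left-join utilization and medication measures onto the flagged cohort.
    return (flagged.to_frame()
            .join(utilization, how="left")
            .join(rx_counts, how="left")
            .fillna(0)
            .reset_index())
```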

Below we describe VA research initiatives that align with the three LHS phases for VA’s priority goals of reducing the adverse impact of NCDs, focusing on mental health, substance (opioid) use, and suicide prevention. Lessons learned from these current initiatives can help inform a process for researchers and health system leaders to continue implementing core components of the LHS for clinical priorities going forward.

4. DATA TO KNOWLEDGE: IDENTIFYING AND UNDERSTANDING DRIVERS OF SUICIDE AND OUD TO OPTIMIZE PREVENTION EFFORTS

Data to knowledge involves the process of identifying and understanding determinants of health and health care quality gaps and evaluating interventions that might reduce or eliminate these gaps. VA national program offices such as the Office of Mental Health and Suicide Prevention (OMHSP) have used longitudinal electronic health record data to identify and understand the drivers of suicide and opioid use disorder risk, assess adherence to evidence‐based guidelines, track variation in practice, monitor performance, and reward improvement. 29 Data to knowledge also involves careful evaluation of interventions that are effective in real‐world practice. Strong partnerships enabled embedded researchers to conduct analyses of the determinants of increased risk for suicide 30 and opioid use disorder, 31 which informed the national interventions described below.

4.1. Reduce/Eliminate quality gaps: VA national program evaluations

OMHSP developed two data‐driven national interventions to identify patients at the highest risk of suicide (REACH VET) and of opioid use disorder and overdose (STORM) (Table 1). Recognizing the opportunity to conduct embedded research to determine the effectiveness of these programs in real‐world settings, HSR&D launched a program to evaluate the national rollouts of these two interventions with additional funding from the Office of Management and Budget (OMB), which strongly encouraged randomization to produce rigorous evaluations of new government programs. 32 Evaluation priorities focused on suicide prevention and opioid misuse were selected through a nomination process in which VA clinical leaders submitted priority evaluation topics to HSR&D; topics were peer‐reviewed for impact, clinical relevance, and the feasibility of building randomization into the rollout. Researchers then submitted evaluation funding proposals on these topics to HSR&D, which scientifically peer‐reviewed them and funded the top‐scoring proposals.

TABLE 1.

Learning health system‐focused initiatives in the VA for mental health and substance use disorders

Data to Knowledge. National data from VA national clinical leaders identified gaps in quality; program evaluations selected by leaders and led by researchers test interventions to reduce gaps in quality/outcomes.

  Mental health priority goal (suicide prevention): Identify gaps: lack of services for high‐risk Veterans. Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment (REACH VET): coordinators supported Veterans in the top 0.1% of suicide risk based on a national data algorithm and coordinated their care.

  Substance use disorders priority goal (opioid use disorder): Identify gaps: lack of access to effective management of opioid use disorder. Stratification Tool for Opioid Risk Management (STORM): uses a real‐time data dashboard to present individual patients’ level of risk, display patient‐specific clinical risk factors, and track the use of recommended risk mitigation strategies.

Knowledge to Performance. Partnered Implementation Initiatives, funded through quality improvement (VA Quality Enhancement Research Initiative), designed to improve quality for clinical priorities selected by health system leaders.

  Mental health priority goal (suicide prevention): Caring Contacts for Suicide Prevention in emergency department settings: mailing brief, non‐demanding expressions of care and concern over a year to Veterans screened for suicide risk.

  Substance use disorders priority goal (opioid use disorder): Partnered Implementation Initiative: Consortium to Disseminate and Understand Implementation of Opioid Use Disorder Treatment (CONDUIT): used implementation strategies to scale up and spread medication‐assisted treatment for opioid use disorder as well as non‐opioid pain treatments.

Performance to Data. Consortia of Research (COREs), funded through research to create a collaborative community of researchers and leaders and to support infrastructure that fosters discoveries, data needs, and dissemination products.

  Mental health priority goal (suicide prevention): The goal of the Suicide Prevention Research Impact NeTwork (SPRINT) is to accelerate suicide prevention research that will lead to improvements in care and, ultimately, reductions in suicide among Veterans.

  Substance use disorders priority goal (opioid use disorder): The goal of the Pain/Opioid CORE is to foster high‐quality, Veteran‐centered research to improve pain care and reduce opioid‐related harms.

4.2. REACH‐VET suicide prevention program evaluation

In 2016, VA implemented the Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment (REACH VET) program to support Veterans in the top 0.1% of suicide risk based on a national data algorithm. 33 REACH VET supported mental health coordinators at each VA facility to identify these Veterans and coordinate their care. Because REACH VET was already being rolled out nationally, the evaluation was designed to determine whether the program's effects varied across sites (and if so, why) and to compare a standard versus an enhanced implementation strategy for supporting sites not fully implementing REACH VET. The standard implementation strategy included policy memos, identification of a coordinator at each of the 140 participating VA health care sites, web‐based training, and educational and support materials. The enhanced implementation strategy is Facilitation (Table 2), which involves ongoing, individualized advice to frontline providers on implementing and embedding REACH VET into routine clinical care by overcoming organizational barriers. Evaluation of key outcomes is still in process, including the proportion of patients identified at each facility who receive the REACH VET intervention and the proportion of providers at each facility who participate in the program.
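
The REACH VET predictive model itself is not specified in this article; the sketch below, assuming a hypothetical column of predicted risk scores, only illustrates the mechanics of flagging a "top 0.1%" tier of patients for outreach, the kind of percentile‐based threshold the program description implies.

```python
# Illustrative sketch only: the real REACH VET model and its inputs are not
# described here; "predicted_risk" is a hypothetical model output.
import pandas as pd

def flag_top_risk_tier(scores: pd.DataFrame, tier: float = 0.001) -> pd.DataFrame:
    """Flag patients whose predicted risk falls in the top `tier` fraction
    (e.g., 0.001 = top 0.1%), as a facility outreach list might be built."""
    cutoff = scores["predicted_risk"].quantile(1.0 - tier)
    out = scores.copy()
    out["outreach_flag"] = out["predicted_risk"] >= cutoff
    return out

# Example with hypothetical scored patients:
# scored = pd.DataFrame({"patient_id": ids, "predicted_risk": model_scores})
# outreach_list = flag_top_risk_tier(scored).query("outreach_flag")
```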

TABLE 2.

Example of an implementation strategy (Facilitation) and its application in addressing barriers to implementation and uptake of mental health and substance use disorder care in VA

Facilitation component: Identify and engage stakeholders, including organizational leaders, local provider champions, and local opinion leaders.

  Brief description: The Facilitator helps champions (those who directly deliver the evidence‐based practice at sites) identify multi‐level stakeholders to build rapport and motivation and to align site and organization leaders’ points of influence. Site local opinion leaders (e.g., influencers who are not the practice champions) support local provider champions through publicity and resource sharing. Leadership support helps align the evidence‐based practice goal with the larger goals of the institution and garner additional protected time for champions.

  Key barriers addressed: Provider: local opinion leaders can help garner support for resources and protected time for provider champions. Site/clinic: local opinion leaders can overcome site operational inertia by identifying additional champions and opportunities where the evidence‐based practice can support other competing demands at the site. Organizational: leadership endorsement helps mitigate lack of organizational prioritization, competing demands, and limited incentives.

Facilitation component: Performance monitoring and goal setting, identifying process barriers, and building the business case.

  Brief description: The Facilitator benchmarks sites’ ongoing progress in implementing the evidence‐based practice and patient/provider outcomes, and provides feedback to the provider champion to build competency and confidence in delivering the evidence‐based practice. Monitoring over time can identify gaps and potential improvements in organizational and practice outcomes and foster organizational change through leadership advocacy and feedback.

  Key barriers addressed: Provider: monitoring and feedback promote provider self‐efficacy in delivering the evidence‐based practice and help identify other provider champions. Site/clinic: monitoring mitigates operational barriers by identifying and overcoming gaps in care and highlighting potential positive impacts on other site functions (e.g., patient experience, quality of care). Organizational: data can be used to communicate the impact of the evidence‐based practice on organizational priorities (e.g., patient experience, provider productivity, quality metrics, health care costs) and to build ongoing support.

Facilitation component: Clarify provider roles and team processes.

  Brief description: The Facilitator guides provider champions in process mapping and in defining the roles of providers within the site/clinic in delivering the evidence‐based practice. Providers outline the process by which patients receive the evidence‐based practice and who is responsible for which task/procedure. Providers, with support from site and organizational opinion leaders, embed evidence‐based practice components into the information technology system (e.g., patient identification and outcomes monitoring).

  Key barriers addressed: Provider: mitigate burnout due to duplication of effort and unbalanced burden of tasks. Site/clinic: enables identification of process streamlining or leveraging of other services. Organizational: mitigate resource constraints by leveraging existing tools and functions.

Facilitation component: Adapt the intervention and clinical processes to overcome barriers.

  Brief description: The Facilitator guides provider champions to identify feasibility issues in delivering the evidence‐based practice, confirm the core functions of the evidence‐based practice that cannot be changed, and garner local provider input on adapting mutable components such as mode of delivery (e.g., virtual, smartphone). Rapid‐cycle testing at sites is used to evaluate adaptations.

  Key barriers addressed: Provider: opportunities to adapt help mitigate barriers such as lack of time or enthusiasm. Site/clinic: mitigate resistance to change by enabling site input into adaptation and by demonstrating through rapid‐cycle testing how the evidence‐based practice can support site functions and other services.

Facilitation component: Transition to end‐user ownership and sustainment.

  Brief description: Site provider champions, with guidance from the Facilitator, work with site and organizational leaders to develop an action plan, including roles and responsibilities, for ongoing maintenance of the evidence‐based practice implementation. A learning collaborative is formed among champions across sites to share progress and sustainment strategies.

  Key barriers addressed: Provider: build self‐efficacy in practice change and implementation. Site/clinic: mitigate drift by building in automated clinical and information technology processes for maintaining the evidence‐based practice. Organizational: overcome the “voltage drop” that occurs after study or Facilitation support ends by building in quality measures and performance incentives to maintain the evidence‐based practice, along with protected time for champions to continue monitoring and the learning collaborative.

4.3. STORM opioid treatment policy evaluation

The Stratification Tool for Opioid Risk Management (STORM) focuses on reducing harmful opioid prescribing by using a real‐time data dashboard to present individual patients’ level of risk, display patient‐specific clinical risk factors, and track the use of recommended risk mitigation strategies (e.g., naloxone kits, reduction in opioid dosage) and non‐opioid pain treatments (e.g., physical therapy) for individual patients. VA national leadership rolled out the STORM policy notice requiring VA sites to complete case reviews for patients whom STORM identifies as at very high risk of harmful opioid use (i.e., top 1% of STORM risk scores). For the randomized evaluation, 34 researchers randomly assigned half the sites to receive notices of required additional support and oversight if they failed to meet an established percentage of case reviews, while the other half received the policy notice only. Researchers then used a stepped‐wedge cluster randomized design to further randomize sites to conduct case reviews for an expanded pool of patients (top 5% of STORM risk scores vs. 1%) up to 15 months after the notice was released. Primary evaluation outcomes included reductions in opioid prescribing and identification of which implementation strategies supported effectiveness across sites.
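
The actual STORM evaluation design is detailed in reference 34; the code below is only a generic sketch, using invented site identifiers and an assumed number of steps, of how sites might be randomized first to an oversight arm and then to stepped‐wedge crossover periods for the expanded case‐review pool.

```python
# Generic two-stage site randomization sketch: (1) half of sites to a
# policy-plus-oversight arm, (2) stepped-wedge assignment of the period at
# which each site expands case reviews. Site IDs and step count are invented.
import random

def randomize_sites(site_ids, n_steps=4, seed=42):
    rng = random.Random(seed)
    sites = list(site_ids)

    # Stage 1: split sites evenly into policy-only vs. policy-plus-oversight.
    rng.shuffle(sites)
    half = len(sites) // 2
    arm = {s: ("oversight" if i < half else "policy_only")
           for i, s in enumerate(sites)}

    # Stage 2: stepped wedge - each site gets a crossover step at which it
    # begins case reviews for the expanded (e.g., top 5%) risk pool.
    rng.shuffle(sites)
    step = {s: (i % n_steps) + 1 for i, s in enumerate(sites)}

    return [{"site": s, "arm": arm[s], "crossover_step": step[s]}
            for s in sorted(site_ids)]

# Example with hypothetical site identifiers:
schedule = randomize_sites([f"site_{i:03d}" for i in range(1, 21)])
```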

4.4. Challenges in deploying LHS data to knowledge

Two key challenges in supporting embedded research‐practice partnerships within an LHS were the timing and feasibility of randomized study designs. Health systems confronted with crises may not have the luxury of waiting for the research to be completed before acting, and researchers may not have the ability to choose their preferred study design. In those cases, quasi‐experimental study designs 35 are an alternative for estimating the effects of interventions without randomization. Key examples include non‐equivalent control group designs, in which investigators use electronic health record data to compare outcomes of patients receiving or not receiving a program, controlling for potential confounders that influence program receipt and the outcomes of interest. Stepped‐wedge designs (which may or may not use randomization) involve systematic roll‐out of a program to sites; investigators evaluate outcomes across specific time points until all sites receive the program. Repeated measurement at each step of the roll‐out enables each site to serve as its own control while also enabling between‐site comparisons. An interrupted time series involves outcome measurements from electronic health record data across multiple periods of time prior to, during, and after program deployment.
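
As a concrete illustration of the interrupted time series approach described above, the following is a minimal segmented‐regression sketch on simulated monthly data, assuming hypothetical column names; a real analysis would also address autocorrelation (e.g., with Newey‐West standard errors or ARIMA error models).

```python
# Minimal interrupted time series (segmented regression) sketch on simulated
# data. Column names and the data are hypothetical, not from any VA source.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_its(df: pd.DataFrame, start_period: int):
    """df needs columns: period (1..T) and outcome (e.g., a monthly event rate)."""
    d = df.copy()
    d["post"] = (d["period"] >= start_period).astype(int)        # level change
    d["time_since"] = np.maximum(0, d["period"] - start_period)  # slope change
    # outcome = b0 + b1*period (secular trend) + b2*post (level shift at start)
    #           + b3*time_since (change in trend after the program starts)
    return smf.ols("outcome ~ period + post + time_since", data=d).fit()

# Simulated example: gentle baseline trend, then a drop after month 24.
rng = np.random.default_rng(0)
period = np.arange(1, 49)
outcome = 10 - 0.02 * period - 1.5 * (period >= 24) + rng.normal(0, 0.3, 48)
model = fit_its(pd.DataFrame({"period": period, "outcome": outcome}), start_period=24)
print(model.params)  # b2 and b3 estimate the level and trend changes
```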

For example, even with the opportunity to embed research into their programs, VA leaders decided to move ahead with national implementation of the REACH VET suicide prevention program before randomization could occur among sites; the evaluation question was therefore modified to determine which implementation strategies improved REACH VET uptake (and to inform understanding of program sustainment). In contrast, for STORM, although implementation of the policy directive was delayed at the national level, this did not impair the researchers' ability to randomize sites. Nonetheless, these national program evaluations produced opportunities for clinical leaders to learn what would optimize uptake across local and regional VA health systems, especially by studying optimal implementation strategies.

5. KNOWLEDGE TO PERFORMANCE: THE VA QUALITY ENHANCEMENT RESEARCH INITIATIVE PARTNERED IMPLEMENTATION INITIATIVES

Knowledge to performance focuses on implementing effective interventions in real‐world care settings through a process that informs strategies to help sustain quality improvement gains over time. To accomplish this goal, HSR&D, through its Quality Enhancement Research Initiative (QUERI), funds Partnered Implementation Initiatives (PIIs) that support health system–researcher teams in applying implementation science to scale up and spread evidence‐based practices (EBPs) addressing priorities selected by VA regional health system leaders, with the ultimate goal of informing a quality improvement playbook for health system leaders. QUERI selects PII topics through a bottom‐up clinical topic nomination process, in which regional health system leaders from the 18 regional VA integrated service networks (VISNs) nominate clinical priorities that “keep them awake at night” and that QUERI could help address via quality improvement efforts. These leaders then select their top 2–3 priorities via a live‐voting process for QUERI to support through the PII mechanism.

To be eligible for PII funding, proposals had to be co‐led by a researcher and the VISN leader who selected the priority; deploy specific evidence‐based practices addressing the clinical priority across several sites from more than one VISN; apply specific implementation strategies designed to promote the uptake of effective interventions; and benchmark impact using VA national quality performance standards. PIIs also had to conduct a business case analysis for VISNs to sustain the implementation. 36 , 37 , 38 The business case analysis assessed outcomes across multiple stakeholders (e.g., provider turnover, employee satisfaction and engagement, consumer satisfaction), in addition to health care costs, to inform an implementation “playbook” that describes requirements for successful implementation over time.

The first cohort of PIIs focused on suicide prevention and opioid use disorder/pain treatments: Caring Contacts for Suicide Prevention in Non‐Mental Health Settings and Consortium to Disseminate and Understand Implementation of Opioid Use Disorder Treatment (CONDUIT). Both Caring Contacts and CONDUIT applied Facilitation to implement evidence‐based practices.

Facilitation (Table 2) is a previously established bundle of implementation strategies that helps providers address organizational barriers to evidence‐based practice uptake through interactive problem‐solving and mentorship. Derived from the Promoting Action on Research Implementation in Health Services (PARiHS) framework, 39 Facilitation has been used in the successful implementation of evidence‐based practices across NCDs. 40 , 41 , 42 Core components of Facilitation 43 , 44 are detailed in Table 2, including multi‐level stakeholder engagement, ongoing monitoring and feedback, operationalization of provider and team processes, adaptation to fit local needs, and transition to sustainment. Together these components address common barriers to implementation of evidence‐based practices at the provider, site, and organizational levels, notably limited coordination among clinicians, lack of operationalization of the evidence‐based practice, and organizational cost considerations.

5.1. Caring contacts for suicide prevention partnered implementation initiative

Caring Contacts is an evidence‐based suicide prevention intervention 45 that involves mailing brief, non‐demanding expressions of care and concern to Veterans screened for suicide risk in the emergency department. Caring Contacts is currently being implemented across 28 facilities in 9 VISNs using the same implementation strategy as REACH VET (virtual facilitation). Key outcomes include the number of Veterans reached and suicide‐related behavior, and the business case analysis will also focus on the role of geographic variation in outcomes.

5.2. CONDUIT: OUD/Pain partnered implementation initiative

CONDUIT’s 46 overall goal is to expand Veterans’ access to medications for OUD and evidence‐based pain treatments through the Facilitation implementation strategy. Effective medications for OUD are available, but their availability and use among Veterans vary across the VA. A key barrier is the lack of provider uptake of U.S. Drug Enforcement Administration waiver forms that enable providers who are not pain specialists to prescribe OUD medications such as buprenorphine/naloxone, methadone, and naltrexone. CONDUIT involves 6 VISNs and 57 sites and spans four care settings in the OUD continuum of care: Primary Care; Specialty Care; Acute Care (inpatient and Emergency Department); and Telehealth. The implementation strategies deployed at the CONDUIT sites include virtual facilitation, with trained internal facilitators added at local sites to embed EBPs as part of routine clinical care. Key outcomes include receipt of medication‐assisted treatment for OUD and non‐opioid pain treatment. The business case analysis will inform an implementation playbook and a communication strategy for Veterans and providers.

5.3. Challenges with LHS knowledge to performance

The focus on rapid implementation of EBPs for suicide prevention and OUD/pain treatment limited researchers’ ability to learn as much as possible from their efforts. While regional health system leaders welcomed the additional support from researchers, investigators lacked time to pursue new research ideas that arose from the implementation process or to compare different types of implementation strategies (e.g., virtual versus on‐site facilitation). Hence, PIIs could benefit from additional infrastructure support to build lasting partnerships and to create comprehensive data on the outcomes and mechanisms most likely to achieve long‐lasting effects on program sustainment.

6. PERFORMANCE TO DATA: BUILDING SUSTAINABLE STRUCTURES TO ALIGN RESEARCH AND PROGRAM PRIORITIES THROUGH CONSORTIA OF RESEARCH (COREs)

Performance to data is the process by which active efforts to improve health system quality and outcomes, guided by shared goals between researchers and clinical leaders, generate new questions or discoveries that can be the topic of additional study to inform continuous quality improvement. However, shared goals between researchers and health system leaders are not sufficient to support a successful LHS partnership. Building and sustaining effective partnerships requires trust, communication and time, all of which benefit from investment in infrastructure and mechanisms to foster this partnership. To this end, HSR&D established the Consortia of Research (COREs), the first two focusing on suicide prevention and opioid use disorder.

The COREs have five aims that support an LHS: (1) create a collaborative community of researchers; (2) review and assess the existing portfolio of research; (3) establish regular communication with clinical stakeholders to identify and align program and research priorities; (4) identify and address data needs to support improvement and research; and (5) distill and communicate important research findings back to clinical stakeholders. The COREs can help the LHS cycle run more efficiently, especially in the performance to data phase of identifying and formulating new research questions. As an extra incentive for partners, the COREs solicit and support rapid‐cycle projects (analyses, rapid improvement projects, pilots) to address time‐sensitive needs. Recent COREs include the Suicide Prevention Research Impact NeTwork (SPRINT) and the Pain/Opioid CORE.

6.1. SPRINT: Suicide prevention core

SPRINT 47 aims to accelerate suicide prevention research that will lead to improvements in care, and ultimately, reductions in suicide among Veterans. SPRINT has built a network of suicide prevention researchers and operations partners through OMHSP focused on research related to VA’s overall public health suicide prevention strategy. 48 Through this partnership, SPRINT worked to develop a “state of the science” inventory of information about VA and non‐VA health services suicide prevention research activities and the evidence base for suicide prevention interventions. SPRINT uses this inventory to create a focused research agenda on suicide prevention and provide recommendations for multi‐site projects. SPRINT’s research agenda leverages partnerships with communities to implement tailored, local prevention plans while also focusing on evidence‐based clinical strategies for intervention.

6.2. Pain/Opioid core

The Pain/Opioid CORE’s goals are to foster high‐quality, Veteran‐centered research to improve pain care and reduce opioid‐related harms. 49 Major themes of this CORE include the need for novel pain treatments, including studies of complementary and integrative health approaches, exercise/movement, and psychological and behavioral interventions, as well as novel OUD treatments and strategies for managing long‐term opioid therapy and tapering. VA leadership partners include VA’s National Pain Management program, Integrative Health Coordinating Center, Specialty Care Services, and OMHSP. The Pain/Opioid CORE built a coalition of VA leaders and researchers to develop a strategic plan that included short‐ and long‐term goals directly addressing operational partner priorities. Start‐up funds were essential to supporting these goals by incentivizing researchers to build operational partnerships, develop capacity for research studies, and conduct evidence syntheses to support a research program in this area.

7. DISCUSSION

Collectively, the VA Randomized Program Evaluations, Partnered Implementation Initiatives, and Consortia of Research provide a comprehensive array of research mechanisms focused on NCDs, notably mental health and substance use disorders. Each of these initiatives aligns research goals with the needs of local and national health system stakeholders. Assessments of these LHS‐inspired initiatives are still in process, but together the experiences point to several insights into building an LHS to support ongoing translation of research into practice; maintaining the capacity to do so through clinical and data infrastructures that address shared clinical priority goals; testing interventions to improve Veteran health; and discovering new treatment directions.

Despite this comprehensive overview of VA LHS components, limitations of our review include the lack of complete information on LHS impacts, notably on cost and quality of care over time. Nonetheless, several lessons can be drawn from these LHS initiatives (outlined in Table 3). First, engagement with health system leaders must occur as early as possible, not after a research proposal is fully developed. Partners who have a voice in setting priorities and discussing alternative ways of answering a question will be more invested in the research results. Second, while program partners want to know whether their programs “work” compared to some baseline, they are even more interested in making their programs work better or more consistently across different settings. Third, initiatives withstood common changes to clinical priorities by ensuring that regional leaders co‐led the work and that the initiative included input from multi‐level partners, including frontline clinicians, staff, Veterans, and family members. Fourth, while business case analysis is essential for determining the value of implementation strategies, it needs to measure outcomes beyond quality and cost to include provider and patient experiences.

TABLE 3.

LHS core values, lessons learned, and alignment of the VA research partnership

LHS value: Participatory leadership and transparency.
  Issues/challenges: Lack of alignment of priorities among health care leaders, frontline providers, and researchers.
  Recommended steps: Identify the full set of relevant stakeholders and establish channels of communication. Form a study team with clinical and research expertise, with engagement from local clinical leaders/providers. Specify priority questions early on, from the health system's perspective, that can be addressed through research.

LHS value: Scientific integrity.
  Issues/challenges: Lack of planning or resources to conduct rigorous evaluation.
  Recommended steps: Rigorously apply scientific methods and evaluation best practices using pragmatic designs (e.g., cluster randomization). Obtain external review of study methods.

LHS value: Standards for operating based on the input of multiple stakeholders.
  Issues/challenges: Competing demands of health care leaders and personnel.
  Recommended steps: Researchers and clinical operations leaders meet regularly and plan for sustainment. Cross‐functional teams garner input from multi‐level stakeholders on study execution and sustainment. Implement processes to clarify roles and data access and to ensure privacy, security, and confidentiality of data.

LHS value: Stakeholder‐focused.
  Issues/challenges: Changing health system priorities.
  Recommended steps: Focus on improving health care quality and outcomes for a problem affecting the health system. Formulate and refine questions of interest; plan a business case analysis that captures outcomes of interest across stakeholder groups and the value of implementation.

LHS value: Inclusiveness.
  Issues/challenges: Lack of communication regarding expectations and timing of research.
  Recommended steps: Establish agreements, including memoranda of understanding, data use agreements, publication and dissemination policies, and other study implementation processes that include stakeholder preferences. Clinical leaders co‐lead projects and obtain recognition as key partners in success.

LHS value: Adaptability.
  Issues/challenges: Limited time in a health care setting to invest in information technology or research infrastructures.
  Recommended steps: Use research funding to support infrastructures that maximize rigor, such as data ascertainment and analysis. Use rapid and iterative design and evaluation of improvement efforts.

LHS value: Accessibility and value.
  Issues/challenges: Lack of planning or tools for providers once the research funding ends.
  Recommended steps: Develop a “definition of done” and a hand‐off or ownership protocol that transfers research results to operations partners; for researchers, identify a route to other research funding opportunities. Disseminate products, including implementation playbooks, and make them available to clinical partners.

Fifth, investment in sustainable infrastructure is essential to maintain the research‐health system partnership, especially for data and information sharing capacity, protected time for researchers and leaders, and governance structures. Different funding models are possible to encourage LHS work. In the VA, we can use research funding to support more rigorous evaluations and advance science while furthering the goals of clinical partners; in other systems, clinical leaders have offered funding tied to their priority questions, with the understanding that this preliminary work can be used to leverage additional research support. Formal arrangements such as memoranda of understanding, data use agreements, and guidelines for publications and other dissemination products reinforce expectations. There also needs to be pragmatic data capture from multiple sources, especially from electronic health records, that can be replicated over time and embedded into workflows. Finally, to maximize sustainment, a proactive plan to clarify responsibility for continuing clinical processes once the research ends is vital. Similarly, researchers need to be incentivized to work with operations partners through retainer funding or through a special research funding mechanism that their academic promotion committees recognize.

Is the VA LHS generalizable beyond the VA? Key ingredients of the VA LHS include a national electronic health record, funding or protected time for researchers, and a health system with a commitment from operational leaders to participate in research or evaluation. U.S. organizations that share these ingredients include the Health Care Systems Research Network, 16 a consortium of health care systems that share common electronic data elements and an infrastructure to conduct research. Increasingly, federal research funding agencies have also funded health systems and academic health centers to support LHS components. For example, the U.S. National Institutes of Health's National Cancer Institute and National Center for Advancing Translational Sciences have explored the role of implementation science in building capacity to support and complement LHSs, including precision medicine approaches to transforming patient care. 16 , 50

Examples of health systems implementing these LHS core elements include New York University's Langone Health system, which employs rigorous, rapid‐cycle randomized testing of existing system‐wide initiatives to close ineffective programs and optimize valuable programs for consumers, clinicians, and support staff. 51 Cincinnati Children's Hospital formed pediatric learning health systems for NCDs. 52 Recent NIH initiatives such as the HEALing Communities initiative are also applying these LHS core elements by funding multilevel community‐based organizations in several states to reduce the opioid death rate by leveraging statewide electronic health data, building research and evaluation infrastructures across sites, and involving communities and organizations in the research and implementation process. 53

Moreover, several U.S. policies and laws encourage operationalization of LHS principles, notably through “meaningful use” of electronic health records 54 to promote data infrastructures, as well as the 21st Century CURES Act, 55 which enables the use of electronic data sources to assess the public health impacts of new treatments. The Foundations for Evidence‐based Policymaking Act of 2018 56 also requires federal agencies to use evidence and evaluation to inform budget decisions. These initiatives point to the urgent need to prepare researchers and health system leaders to better apply LHS principles and to work together to embed research initiatives that facilitate rapid discovery, adoption, and evaluation of clinical interventions and care delivery approaches.

8. CONCLUSIONS

Closing the gap between research and practice, especially for noncommunicable diseases such as mental health and substance use disorders, requires concerted coordination across different stakeholders, shared goals between practitioners and researchers, data infrastructure, and a commitment to making continuous learning (quality improvement and research) feasible and robust. Over the past decade, the VA has rapidly evolved to embody these core values of a Learning Health System and to implement initiatives that can inform the future of clinical research and operations and ultimately improve chronic illness outcomes and care experiences for patients, especially those with mental health and substance use disorders.

CONFLICT OF INTEREST

The authors declare no conflicts of interest.

AUTHOR CONTRIBUTIONS

A.M.K. drafted the manuscript; D.A. contributed content on current research and funding initiatives; E.E. provided critical edits and content related to the study findings. All authors reviewed and approved the manuscript's content.

ACKNOWLEDGMENTS

The U.S. Department of Veterans Affairs, Veterans Health Administration, Health Services Research & Development Service supported this work. The views expressed are those of the authors and do not necessarily represent the Department of Veterans Affairs’ views.

This article is part of the Global Voices for Prevention of Noncommunicable Diseases Special Collection.

REFERENCES

  • 1. Institute of Medicine . Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: The National Academies Press; 2001. 10.17226/10027 [DOI] [PubMed] [Google Scholar]
  • 2. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearb Med Inform. 2000;09(01):65‐70. PMID: 27699347. [PubMed] [Google Scholar]
  • 3. National Academy of Medicine . Future of Health Services Research. National Academies Press, 2018. https://nam.edu/the‐future‐of‐health‐services‐research‐special‐publication/ [PubMed] [Google Scholar]
  • 4. Woltmann E, Grogan‐Kaylor A, Perron B, Georges H, Kilbourne AM, Bauer MS. Comparative effectiveness of collaborative chronic care models for mental health conditions across primary, specialty, and behavioral health care settings: systematic review and meta‐analysis. Am J Psychiatry. 2012;169(8):790‐804. 10.1176/appi.ajp.2012.11111616. PMID: 22772364. [DOI] [PubMed] [Google Scholar]
  • 5. Kilbourne AM, Morden NE, Austin K, et al. Excess heart‐disease‐related mortality in a national study of patients with mental disorders: identifying modifiable risk factors. Gen Hosp Psychiatry. 2009c;31(6):555‐563. 10.1016/j.genhosppsych.2009.07.008. PMCID: PMC4033835. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6. Bauer UE, Briss PA, Goodman RA, Bowman BA. Prevention of chronic disease in the 21st century: elimination of the leading preventable causes of premature death and disability in the USA. Lancet. 2014;384(9937):45‐52. 10.1016/S0140-6736(14)60648-6. PMID: 24996589. [DOI] [PubMed] [Google Scholar]
  • 7. Kessler RC, Berglund P, Demler O, Jin R, Merikangas KR, Walters EE. Lifetime prevalence and age‐of‐onset distributions of DSM‐IV disorders in the National Comorbidity Survey Replication. Arch Gen Psychiatry. 2005;62(6):593‐602. 10.1001/archpsyc.62.6.593 [DOI] [PubMed] [Google Scholar]
  • 8. Kessler RC, Berglund P, Demler O, et al. National comorbidity survey replication. The epidemiology of major depressive disorder: results from the National Comorbidity Survey Replication (NCS‐R). JAMA. 2003;289(23):3095‐3105. 10.1001/jama.289.23.3095. PMID: 12813115. [DOI] [PubMed] [Google Scholar]
  • 9. Ilgen MA, Bohnert AS, Ignacio RV, et al. Psychiatric diagnoses and risk of suicide in veterans. Arch Gen Psychiatry. 2010;67(11):1152‐1158. 10.1001/archgenpsychiatry.2010.129. PMID: 21041616. [DOI] [PubMed] [Google Scholar]
  • 10. Insel TR. Translating scientific opportunity into public health impact: a strategic plan for research on mental illness. Arch Gen Psychiatry. 2009;66(2):128‐133. 10.1001/archgenpsychiatry.2008.540. PMID: 19188534. [DOI] [PubMed] [Google Scholar]
  • 11. Wells KB. Treatment research at the crossroads: the scientific interface of clinical trials and effectiveness research. Am J Psychiatry. 1999;156(1):5‐10. 10.1176/ajp.156.1.5. PMID: 9892291. [DOI] [PubMed] [Google Scholar]
  • 12. Braganza MZ, Kilbourne AM. The quality enhancement research initiative (QUERI) impact framework: measuring the real‐world impact of implementation science. J Gen Intern Med. 2021;36(2):396‐403. 10.1007/s11606-020-06143-z. PMID: 32875498. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13. Institute of Medicine . Best Care at a Lower Cost: the Path to Continuously Learning Health Care in America. Washington DC: The National Academies Press; 2013. https://www.nap.edu/catalog/13444/best‐care‐at‐lower‐cost‐the‐path‐to‐continuously‐learning [PubMed] [Google Scholar]
  • 14. Ovretveit J, Hempel S, Magnabosco JL, Mittman BS, Rubenstein LV, Ganz DA. Guidance for research‐practice partnerships (R‐PPs) and collaborative research. J Health Organ Manag. 2014;28(1):115‐126. 10.1108/JHOM-08-2013-0164. PMID: 24783669. [DOI] [PubMed] [Google Scholar]
  • 15. Gould MK, Sharp AL, Nguyen HQ, et al. Embedded research in the learning healthcare system: ongoing challenges and recommendations for researchers, clinicians, and health system leaders. J Gen Intern Med. 2020;35(12):3675–3680. 10.1007/s11606-020-05865-4. PMCID: PMC7728937. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16. Kilbourne AM, Jones PL, Atkins D. Accelerating implementation of research in learning health systems: lessons learned from VA health services research and NCATS clinical science translation award programs. J Clin Transl Sci. 2020;4(3):195‐200. 10.1017/cts.2020.25. PMID: 32695488; PMCID: PMC7348004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17. Office of Mental Health and Suicide Prevention . Veterans Health Administration. U.S. Department of Veterans Affairs. Mental Health Guidebook, 2019. https://www.mentalhealth.va.gov/docs/VA‐Office‐of‐Mental‐Health‐and‐Suicide‐Prevention‐Guidebook‐June‐2018‐FINAL‐508.pdf [Google Scholar]
  • 18. Kilbourne AM, Braganza MZ, Bowersox NW, et al. Research lifecycle to increase the substantial real‐world impact of research: accelerating innovations to application. Med Care. 2019;57:S206‐S212. 10.1097/MLR.0000000000001146. PMCID: PMC6750195. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19. Office of Mental Health and Suicide Prevention . Veterans Health Administration. U.S. Department of Veterans Affairs. Suicide Prevention Report, 2019. https://www.mentalhealth.va.gov/docs/data‐sheets/2019/2019_National_Veteran_Suicide_Prevention_Annual_Report_508.pdf [Google Scholar]
  • 20. Shen YC, Cunha JM, Williams TV. Time‐varying associations of suicide with deployments, mental health conditions, and stressful life events among current and former US military personnel: a retrospective multivariate analysis. Lancet Psychiatry. 2016;3(11):1039‐1048. 10.1016/S2215-0366(16)30304-2. PMID: 27697514. [DOI] [PubMed] [Google Scholar]
  • 21. Lin LA, Peltzman T, McCarthy JF, Oliva EM, Trafton JA, Bohnert ASB. Changing trends in opioid overdose deaths and prescription opioid receipt among veterans. Am J Prev Med. 2019;57(1):106‐110. 10.1016/j.amepre.2019.01.016. Epub 2019 May 22 PMID: 31128955. [DOI] [PubMed] [Google Scholar]
  • 22. Bohnert AS, Valenstein M, Bair MJ, et al. Association between opioid prescribing patterns and opioid overdose‐related deaths. JAMA. 2011;305(13):1315‐1321. 10.1001/jama.2011.370. PMID: 21467284. [DOI] [PubMed] [Google Scholar]
  • 23. U.S. Office of the President . President’s Roadmap to Empower Veterans and End a National Tragedy of Suicide (PREVENTS). Executive Order 13861. March 5, 2019. https://www.whitehouse.gov/presidential‐actions/executive‐order‐national‐roadmap‐empower‐veterans‐end‐suicide/
  • 24. U.S. Congress . Comprehensive Addiction and Recovery Act (CARA). 2016. https://www.govtrack.us/congress/bills/114/s524/summary
  • 25. Guise JM, Savitz LA, Friedman CP. Mind the gap: putting evidence into practice in the era of learning health systems. J Gen Intern Med. 2018;33(12):2237‐2239. 10.1007/s11606-018-4633-1. PMID: 30155611; PMCID: PMC6258636. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26. Friedman CP, Rubin JC, Sullivan KJ. Toward an Information Infrastructure for Global Health Improvement. Yearb Med Inform. 2017;26(01):16‐23. 10.1055/s-0037-1606526. PMID: 28480469; PMCID: PMC6239237. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27. Murphy PA, Cowper DC, Seppala G, Stroupe KT, Hynes DM. Veterans Health Administration inpatient and outpatient care data: an overview. Eff Clin Pract. 2002;3: PMID: 12166925. [PubMed] [Google Scholar]
  • 28. U.S. Department of Veterans Affairs . Health Services Research and Development. VA Information Resource Center (VIREC). www.virec.research.va.gov. Accessed February 15, 2021. [Google Scholar]
  • 29. Lemke S, Boden MT, Kearney LK, et al. Measurement‐based management of mental health quality and access in VHA: SAIL mental health domain. Psychol Serv. 2017;14(1):1‐12. 10.1037/ser0000097. PMID: 28134552. [DOI] [PubMed] [Google Scholar]
  • 30. Zivin K, Kim HM, McCarthy JF, et al. Suicide mortality among individuals receiving treatment for depression in the Veterans Affairs health system: associations with patient and treatment setting characteristics. Am J Public Health. 2007;97(12):2193‐2198. 10.2105/AJPH.2007.115477. PMCID: PMC2089109. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Conner KR, Bohnert AS, McCarthy JF, et al. Mental disorder comorbidity and suicide among 2.96 million men receiving care in the Veterans Health Administration health system. J Abnorm Psychol. 2013;122(1):256‐263. 10.1037/a0030163. PMID: 23088376. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. U.S. Government . Office of Management and Budget. Implementation of the Foundations for Evidence‐Based Policymaking Act. https://www.whitehouse.gov/wp‐content/uploads/2020/03/M‐20‐12.pdf
  • 33. Reger GM, McClure ML, Ruskin D, Carter SP, Reger MA. Integrating predictive modeling into mental health care: an example in suicide prevention. Psychiatr Serv. 2019;70(1):71‐74.PMID: 30301448. [DOI] [PubMed] [Google Scholar]
  • 34. Minegishi T, Frakt AB, Garrido MM, et al. Randomized program evaluation of the Veterans Health Administration Stratification Tool for Opioid Risk Mitigation (STORM): A research and clinical operations partnership to examine effectiveness. Subst Abus. 2019;40(1):14‐19.PMID: 30620691. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35. Goodrich DE, Miake‐Lye I, Braganza MZ, Wawrin N, Kilbourne AM. The QUERI Roadmap for Implementation and Quality Improvement [Internet]. Washington (DC): Department of Veterans Affairs (US); 2020. https://www.ncbi.nlm.nih.gov/books/NBK566223/. PMID: 33400452. [PubMed] [Google Scholar]
  • 36. Saldana L, Chamberlain P, Bradford WD, Campbell M, Landsverk J. The cost of implementing new strategies (COINS): a method for mapping implementation resources using the stages of implementation completion. Child Youth Serv Rev. 2014;1(39):177‐182. 10.1016/j.childyouth.2013.10.006. PMID: 24729650; PMCID: PMC3979632. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. Liu CF, Rubenstein LV, Kirchner JE, et al. Organizational cost of quality improvement for depression care. Health Serv Res. 2009;44(1):225‐244. 10.1111/j.1475-6773.2008.00911.x. PMCID: PMC2669623. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38. Leatherman S, Berwick D, Iles D, et al. The business case for quality: case studies and an analysis. Health Aff (Millwood). 2003;22(2):17‐30. 10.1377/hlthaff.22.2.17. PMID: 12674405. [DOI] [PubMed] [Google Scholar]
  • 39. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11:33. 10.1186/s13012-016-0398-2. PMID: 27013464; PMCID: PMC4807546. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40. Parchman ML, Anderson ML, Dorr DA, et al. A randomized trial of external practice support to improve cardiovascular risk factors in primary care. Ann Fam Med. 2019;17(Suppl 1):S40‐S49. 10.1370/afm.2407. PMID: 31405875; PMCID: PMC6827661. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41. Smith SN, Almirall D, Prenovost K, et al. Change in patient outcomes after augmenting a low‐level implementation strategy in community practices that are slow to adopt a collaborative chronic care model: a cluster randomized implementation trial. Med Care. 2019;57(7):503‐511. 10.1097/MLR.0000000000001138. PMID: 31135692; PMCID: PMC6684247. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42. Kirchner JE, Ritchie MJ, Pitcock JA, Parker LE, Curran GM, Fortney JC. Outcomes of a partnered facilitation strategy to implement primary care‐mental health. J Gen Intern Med. 2014;29(S4):904‐912. 10.1007/s11606-014-3027-2. PMID: 25355087; PMCID: PMC4239280. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015;10:21. 10.1186/s13012-015-0209-1. PMID: 25889199; PMCID: PMC4328074. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44. Miake‐Lye I, Mak S, Lam CA, et al. Scaling beyond early adopters: a content analysis of literature and key informant perspectives. J Gen Intern Med. 2021;36(2):383–395. 10.1007/s11606-020-06142-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45. Landes SJ, Kirchner JE, Areno JP, et al. Adapting and implementing caring contacts in a department of veterans affairs emergency department: a pilot study protocol. Pilot Feasibility Stud. 2019;5:115. PMID: 31624637; PMCID: PMC6785900. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46. U.S. Department of Veterans Affairs . Consortium to disseminate and understand implementation of opioid use disorder treatment. https://www.queri.research.va.gov/visn_initiatives/opioids.cfm
  • 47. U.S. Department of Veterans Affairs . Veterans Health Administration. Health Services Research and Development. Suicide Prevention Research Impact Network (SPRINT). https://www.hsrd.research.va.gov/centers/core/sprint.cfm
  • 48. U.S. Department of Veterans Affairs . National Strategy for Preventing Veteran Suicide. https://www.mentalhealth.va.gov/suicide_prevention/docs/Office‐of‐Mental‐Health‐and‐Suicide‐Prevention‐National‐Strategy‐for‐Preventing‐Veterans‐Suicide.pdf
  • 49. U.S. Department of Veterans Affairs . Veterans Health Administration. Health Services Research and Development. Pain and Opioid Consortium of Research. https://www.hsrd.research.va.gov/centers/core/pain‐opioid.cfm#:~:text=Principal%20Investigators%20for%20this%20CORE%20are%3A%20Alicia%20Heapy%2C,on%20improving%20care%20for%20and%20reducing%20opioid%20
  • 50. Chambers DA, Feero WG, Khoury MJ. Convergence of implementation science, precision medicine, and the learning health care system: a new model for biomedical research. JAMA. 2016;315(18):1941‐1942. 10.1001/jama.2016.3867. PMID: 27163980; PMCID: PMC5624312. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51. Horwitz LI, Kuznetsova M, Jones SA. Creating a learning health system through rapid‐cycle, randomized testing. N Engl J Med. 2019;381(12):1175‐1179. 10.1056/NEJMsb1900856. PMID: 31532967. [DOI] [PubMed] [Google Scholar]
  • 52. Forrest CB, Margolis P, Seid M, Colletti RB. PEDSnet: how a prototype pediatric learning health system is being expanded into a national network. Health Aff (Millwood). 2014;33(7):1171‐1177. 10.1377/hlthaff.2014.0127. PMID: 25006143. [DOI] [PubMed] [Google Scholar]
  • 53. National Institutes of Health . National Institute for Drug Abuse. HEALing Communities Initiative. https://heal.nih.gov/research/research‐to‐practice/healing‐communities
  • 54. Everson J, Rubin JC, Friedman CP. Reconsidering hospital EHR adoption at the dawn of HITECH: implications of the reported 9% adoption of a "basic" EHR. J Am Med Inform Assoc. 2020;27(8):1198‐1205. 10.1093/jamia/ocaa090. PMID: 32585689; PMCID: PMC7481034. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55. U.S. food and Drug Administration . 21st Century CURES Act. https://www.fda.gov/regulatory‐information/selected‐amendments‐fdc‐act/21st‐century‐cures‐act
  • 56. U.S. Congress . Foundations for Evidence‐based Policymaking Act of 2018. https://www.congress.gov/bill/115th‐congress/house‐bill/4174
