Learning Health Systems. 2025 Dec 16;10(1):e70048. doi: 10.1002/lrh2.70048

Structuring Collaboration Between Researchers and Operational Innovators: Diffusing New Practices Across the Veteran's Administration Healthcare System

Sudha R Raman 1, Jennifer M Gierisch 1,2,3, Miriam A Kirshner 2, Kristopher R Teague 4, Jaifred Christian F Lopez 1,2, Blake Henderson 4, Ryan J Vega 4,5, Beth Ripley 4,6, Amy M Kilbourne 7,8, George L Jackson 2,9,10
PMCID: PMC12812486  PMID: 41560986

ABSTRACT

Introduction

To support a learning health system, we aimed to develop processes and tools to enable collaboration between the Veterans Health Administration (VHA) operations and research sectors in the implementation of promising healthcare innovations.

Methods

The collaboration process involved: (1) holding foundational partnership meetings between operations and health research sector leadership; (2) reviewing existing research–operations collaboration frameworks; (3) developing a framework with criteria for systematically categorizing innovations into defined pathways that integrate research and operational considerations; and (4) adapting known participatory approaches to collect information for productive collaboration. Clearly delineating the skills, goals, and perspectives of the innovation and research sectors during partnership meetings enabled assessment of the value and trade‐offs of advancing an innovation into practice. The literature review of existing frameworks yielded heterogeneous objectives, domains, and criteria for evaluation. Across framework domains, commonalities (e.g., alignment with institutional goals), differences (e.g., attention to sustainment), and gaps (e.g., health equity focus) were noted. We developed five research–innovation partnership pathways and criteria to vet and categorize potential innovations (i.e., evidence for effectiveness and implementation, risks, equity, feasibility, and sustainability). We also developed a menu of participatory processes to elicit feedback on innovations.

Results

We applied this process with an innovation to refine and evaluate the process, pathways, and criteria. Diverse partners (clinicians, administrators, researchers) rated the innovation via surveys using the developed criteria and placed it on a pathway, which then helped guide next steps for evaluation. Overall, the process was feasible, and we were able to categorize and plan next steps for promising innovations.

Conclusion

This theoretically grounded, iterative process may serve as a blueprint to accelerate the implementation of healthcare innovations. Through these intentional, participatory processes to engage operations innovators and health services researchers, we can speed the delivery of promising innovations that impact patient care.

Keywords: biomedical, community based participatory research, delivery of health care, diffusion of innovation, evidence‐based practice, learning health system, quality improvement, translational research, veterans

1. Introduction

Recognizing the need to promote continuous improvement within health systems, the National Academy of Medicine (NAM) developed the concept of learning health systems (LHS), where “science, informatics, incentives and culture are aligned for continuous improvement, innovation and equity”, and where embedded discovery of best practices is a “by‐product of healthcare delivery.” [1, 2].

Despite a growing literature, including LHS frameworks about LHS needs, actions, and characteristics, as well as experience reports from individual health systems, there is no current consensus on the key components of an LHS [3]. In the Veterans Health Administration (VHA), which oversees the nation's largest integrated healthcare system, an LHS is envisioned as having the following components: effective collaborations across research–operations sectors; alignment of research activities with clinical and organizational priorities; adoption of a population perspective; and use of existing data sources and infrastructure to develop, implement, and evaluate promising innovations [4]. In the VHA, this requires an LHS to bridge the gap between two distinct enterprise activities: generating scientific knowledge (i.e., research) and the operational processes of daily clinical practice and quality improvement (i.e., operationally focused innovation) [5, 6, 7, 8].

Implementing an LHS requires careful consideration of how health systems interact with the research process; traditionally, research within health systems is carried out separately from routine care operations, with limited opportunity for interaction and collaboration [9]. Researchers are often grounded in conducting studies that address questions in a theory‐informed, systematic, and reproducible way, using established methods to generate generalizable knowledge. These projects are often funded by national sources and may take years to fund and complete [7]. In contrast, healthcare organizations often make rapid decisions about how to continually enhance services. Operational leaders focus on developing systems for the optimal provision of health services that balance key aspects of quality, costs, risks, and resources [10, 11]. Additionally, frontline clinicians and staff have clinical experience but may not have the tools, funding, and/or time to develop and test promising innovations [12]. Operationally focused innovation activities tend to involve rapid development, testing, and diffusion and are oriented toward improvements most relevant to the specific organization or clinical setting [13, 14, 15, 16]. This difference in the orientation of research and practice activities can create barriers to effective collaboration, for example, misaligned timing of research funding and health system decision‐making, and limited incentives in academia for health system‐based research and collaborations [9, 17, 18]. The bi‐directional, co‐learning activities within an LHS need to take advantage of all these perspectives in a true partnership to source promising innovations, develop relevant questions, interpret findings, and implement solutions, enhancing the quality of both science and health services [19, 20].

As a key participant in the development of the LHS concept, VHA has implemented a variety of initiatives that build and improve an LHS [21], making the agency an ideal venue for LHS‐related research. VHA has a well‐established peer‐reviewed research grant program operated by the intra‐agency Office of Research and Development (ORD) [4, 7]. In addition, over the past decade, the VHA has also developed an innovation infrastructure called the Innovation Ecosystem (IE), which includes programs that train frontline staff in how to develop innovations and identify and replicate successful innovations [13, 14, 15, 22, 23]. The Quality Enhancement Research Initiative (QUERI) is an example of a bridge between traditional research and operationally focused innovation. QUERI uses rigorous, implementation science‐based evaluation to understand which strategies work to advance evidence‐based practices in the VHA [24, 25, 26] (Table A).

Given these existing components, the VHA sought to better operationalize an LHS by focusing on building a way to increase meaningful engagement and collaboration [27]. However, the characteristics of potential partnerships between operationally focused innovation and research infrastructure are not as well defined, and even less is known about the practical criteria that can help build relationships and aid individual projects in deciding on the most suitable pathway forward. Therefore, the objective of this experience report is to detail the development of a process for collaboration between the operationally focused and research sectors in selecting, guiding, and accelerating the adoption of innovations.

2. Materials and Methods

Based on the methods of participatory research and engagement science [28, 29, 30], we developed a process of collaboration. Participatory research centers on the importance of bidirectional involvement of all partners with a stake in a particular issue to facilitate knowledge exchange. This knowledge can then inform effective and equitable innovations that are more feasible to implement and sustain [29, 31, 32]. The collaboration process involved five steps: (1) holding foundational partnership meetings between the Innovation Ecosystem (IE) and Office of Research and Development (ORD) leadership; (2) reviewing existing theories and frameworks about research–operations collaborations; (3) developing a systematic way to categorize innovations into five defined pathways that integrate health services research and operational considerations; (4) adapting known participatory methods into a menu of services that could be used to collect and assess information about research and operational considerations; and (5) applying the process with an innovation to refine it and assess its feasibility.

2.1. Step 1: Foundational Partnership Meetings

Between December 2020 and March 2021, three foundational partnership meetings were held between IE and ORD leadership to facilitate interactions in which partners develop an understanding of each other's perspectives, as well as their needs, preferences, and, where possible, barriers and facilitators to the intended collaboration. Participants, who could represent either sector, were nominated by the directors of each sector or had expressed an interest in participating. The initial two meetings focused on current activities, the definition of an operationally focused innovation, and understanding how greater collaboration could help advance an innovation into practice. In these meetings, we discussed how the VHA Office of Healthcare Innovation and Learning defines an innovation: the “practical implementation of ideas, methods, or devices that solves a problem, introduces new offerings or improves existing processes.” We discussed common examples of this definition, such as clinical delivery models that use evidence‐based practices, similar to those introduced through IE activities.

The third and final meeting was a structured 3‐h videoconference with 15 participants who represented a diverse group of innovation and research collaborators. Topics of discussion included the current structure, skills, activities, goals, and roles of both groups (IE and ORD) and potential collaborative models or infrastructure for the identification, replication and diffusion of innovations. The meeting was recorded and structured notes were taken for further analysis.

Based on the discussion in the first two meetings, we drafted a figure representing distinct pathways for collaboration between operationally focused innovation and research. Pathways varied by levels of effectiveness and implementation evidence. In the third meeting, we asked participants to react to this draft and to reflect on the value and tradeoffs of advancing an innovation into practice. The group agreed that the pathways had face validity. The meeting also established that several domains were important to evaluate for any innovation: alignment with VHA and target population priorities, robust evidence of effectiveness in multiple settings, knowledge of implementation issues required for widespread diffusion, potential positive impact on health equity, and sustainability of the innovation under competing demands on frontline resources.

2.2. Step 2: Reviewing Existing Frameworks About Research‐Operations Collaborations

Informed by our participatory methodology, we used the understanding of partner perspectives gathered in Step 1 to build a theoretical foundation for our collaboration model. The core work group conducted a targeted literature scan of existing, well‐established frameworks for intervention design, implementation and dissemination, LHS, and innovation evaluation that discussed or focused on collaboration between research and operational sectors in large health systems, including VHA‐specific guide documents and publications.

We included widely used frameworks from implementation and dissemination science and the LHS literature, as well as those developed or extensively used within the VHA, focusing on dimensions beyond adoption, such as sustainment and de‐implementation. We selected a convenience sample of 10 frameworks for in‐depth review [20, 25, 26, 33, 34, 35, 36, 37, 38, 39] (Table B).

Thus, the frameworks we selected were driven by what the partners were already using and reflected the context‐specific values, needs, and preferences expressed in Step 1. The frameworks had a range of objectives, domains, and criteria for evaluation. We assessed the framework objectives and categorized the frameworks into (1) those that focused on describing the design of interventions, conceptual elements, systems, and processes of adoption, and (2) those that centered on the evaluation of interventions or of evidence about interventions.

We prioritized the following 4 frameworks, which included discrete criteria that could be used to evaluate an innovation, the evidence for an intervention, and/or the potential for an intervention's success within a given environment. These 4 frameworks reflected the common values, needs, and preferences expressed by participants in Step 1 and represented the range of domains seen in the larger group of 10 frameworks (Table C). We chose the Non‐adoption, Abandonment, Scale‐up, Spread, Sustainability (NASSS) Framework for Health Care Technology because it is an evidence‐based, theory‐informed framework developed to study technology innovations in real time, with a focus on identifying both factors about the intervention and context and the complicated or complex interactions between domains [33]. The QUERI Impact Framework, created by VHA implementation scientists, focuses on the measurement of health, economic, policy, and cultural impacts of implementation and quality improvement [26]. Both of these frameworks can be considered evidence informed and have demonstrated rigor in their development. The Veterans Affairs (VA) Readiness to Implement Guide (Readiness to Implement) was chosen because it is actively used in the VHA by decision makers, its language is familiar, and it is innovation focused, with domains that consider de‐implementation, non‐adoption, and health equity [25]. Lastly, the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) Certainty of Evidence is a rating system with well‐documented criteria for evaluating the strength of a body of research in evidence synthesis science and has been adopted by the Cochrane Collaboration as a standard [38].

Review of the domains and criteria for evaluation within the 4 frameworks (NASSS, GRADE, QUERI Impact Framework, VA Readiness to Implement) revealed several commonalities. The frameworks that focused on intervention characteristics (all except the GRADE approach) shared the domain of ensuring that intervention objectives were aligned with the activities and goals of the larger unit (institution or health system). For example, the first domain of both the NASSS and the QUERI Impact Framework refers to the greater significance or priority of the disease or target population (Table C). Most frameworks emphasized the need for sufficient evidence of effectiveness and safety, either as a stand‐alone criterion (e.g., NASSS domain 4, VA Readiness to Implement criterion 2) or in combination with other characteristics (e.g., evidence of effectiveness and feasibility in the VA Readiness to Implement “high priority” rating). Several differences between the frameworks were noted; for example, only two have domains that rate the potential for either implementation in different settings or sustainment of change. In all frameworks, there was a clear lack of guidance about how to prioritize innovations (in relation to alternative interventions or usual care) and how to delineate the implications of the intervention for health equity.

2.3. Step 3: Drafting a Framework for Innovation Assessment

Having identified the domains deemed important for innovation assessment, we developed a systematic approach to (1) categorize innovations into 5 partnership pathways that define the role of research in evaluating and spreading innovations and (2) apply a series of evaluation criteria across important domains (current evidence for effectiveness and implementation, known and unknown risks, equity, feasibility, and sustainability), so that innovations could be ranked and placed on a partnership pathway by stakeholders in a transparent way (Figure 1).

FIGURE 1.

Pathways for collaboration between operationally focused innovation and research. The 5 levels represent varying levels of recommended research involvement, based on current evidence of the intervention's effectiveness, the level of evidence about implementation, and risk. The first pathway (top pathway in Figure 1) characterizes an innovation that is low risk and has strong evidence about both effectiveness and implementation, so that the innovation can be implemented with only a need for light collaboration between research and innovation, e.g., activities related to managing sustainability and ongoing evaluation. The second pathway characterizes an innovation that may have adequate formal evidence and evidence about implementation in various settings but may need research collaboration to refine evaluation metrics. The third pathway describes innovations that have strong evidence of effectiveness but may need research collaboration to generate additional implementation evidence in different settings. The fourth pathway characterizes an innovation with promising evidence that may need more evidence in different settings; these interventions may also have implementation risks that are no longer classified as low. The fifth pathway characterizes innovations that are established but are still generating foundational evidence of effectiveness and safety; research involvement may be more intensive as the group decides on the key elements, outcomes, and settings that need to be examined and on the most appropriate study designs.

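As an illustration of how the pathway logic above could operate, the following sketch maps three hypothetical inputs (level of effectiveness evidence, level of implementation evidence, and whether evaluation metrics are already refined) to a pathway number. The labels and the ordering of checks are our own simplifications for illustration, not part of the published framework:

```python
def assign_pathway(effectiveness: str, implementation: str, metrics_refined: bool) -> int:
    """Sketch of assigning an innovation to one of the five partnership pathways.

    effectiveness: "high", "promising", or "emerging" evidence of effectiveness
    implementation: "high" or "limited" evidence about implementation
    metrics_refined: whether evaluation metrics are already well defined
    Returns the pathway number (1 = lightest research involvement,
    5 = most intensive research involvement).
    """
    if effectiveness == "emerging":
        return 5  # still generating foundational effectiveness/safety evidence
    if effectiveness == "promising":
        return 4  # needs evidence in more settings; risk may no longer be low
    if implementation == "limited":
        return 3  # strong effectiveness, but needs implementation evidence
    if not metrics_refined:
        return 2  # adequate evidence; research helps refine evaluation metrics
    return 1      # low risk, strong evidence: light-touch collaboration
```

In practice, the inputs would come from the criteria ratings described in Step 3 rather than from categorical labels.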

FIGURE 2.

Innovation and research collaboration approaches.

Using a two‐stage consensus‐building approach within our research team, and informed by our research and innovation partnership meetings, two members of the research team (JMG, SRR) first independently determined potential domains. Through discussion with other core team members (GLJ, MAK), we came to consensus on 6 domains: “Priority of the innovation for the VA”, “Evidence of effectiveness and implementation”, “Risk and uncertainty level”, “Equity”, “Feasibility”, and “Potential for long‐term sustainability”. Using components from the frameworks examined in Step 2, we determined criteria within each domain that would enable reviewers to assess innovations and align research and innovation objectives. To reflect the various levels of potential impact of an intervention, each domain was assessed in reference to patients, providers, and the VA as a whole, where applicable. We noted that the GRADE system rated the quality of evidence on four levels and, aiming for consistency across the criteria, decided that each domain would be assessed similarly. We thus developed a method with 4 confidence levels ranging from “Very Low” to “High” (Table 1). Each item contributed 0–3 points, and the scores were averaged within domains. These criteria and rankings enabled us to organize and summarize reviewers' assessments across each of the domains.

TABLE 1.

Survey questions informing the selection of pathways for collaboration between operationally focused innovation and research.

Domain Items
Each is prefaced by “Based on what I know about the innovation, I am confident that…”
Priority of the innovation for the VA
  • the innovation is a priority for the VA as a whole

  • the innovation is a priority for Veterans

  • the innovation is a priority for VA staff

  • there is a VA Office that can move this innovation forward

Evidence of effectiveness and implementation …there is evidence …
  • of effectiveness in Veteran populations on outcomes that are important to the VA as whole

  • of effectiveness in Veteran populations on outcomes that are important to Veterans

  • of effectiveness in Veteran populations on outcomes that are important to VA staff, and other key VA stakeholders

  • that this innovation is more beneficial than current practice in the VA

  • on the best way to implement this innovation

To what degree do you believe there is evidence for the following aspects of this innovation?
Risk and uncertainty level
  • the benefits to patients outweigh the potential risks/harm

  • there are risks to the VA organization of using the innovation

  • there are risks to discontinuing the innovation

Risk involves negative impacts to patients, staff and the organization. It includes such things as monetary and infrastructure cost, patient safety, provider burden/burnout, trialability/reversibility of the innovation
Equity
  • this innovation improves equity across marginalized groups (e.g., Veterans who are women, reside in rural areas, LGBTQ+, Black, Indigenous, people of color (BIPOC), people with disabilities, or other traditionally marginalized or underserved groups of individuals)

Feasibility
  • we know enough about the feasibility of the innovation processes and components to judge whether it can or should be adopted

  • that the innovation is sufficiently customizable/adaptable to new settings

  • that the cost of the innovation is acceptable to VA Central office

  • that cost of the innovation is acceptable to VA facilities

  • that facilities implementing the innovation will have sufficient person power to do so

  • facilities implementing the innovation will have sufficient time to implement the innovation

  • facilities implementing the innovation will have sufficient information technology (IT) infrastructure to do so

  • facilities implementing the innovation will have sufficient funding/available money needed to do so

  • facilities implementing the innovation will be able to fit this into their current workflows

  • the VA currently has regulations or policies in place needed to support this innovation

  • Veterans Integrated Services Networks (VISNs) have the ability to support this innovation

  • VA Central Office has the ability to support this innovation

Potential for long‐term sustainability
  • The VA facilities implementing this innovation would be able to sustain it

  • VISNs would be able to support the sustainment of this innovation in the long term

  • VA Central Office would be able to support the sustainment of this innovation in the long term

  • this innovation could adapt to new organizational, technology, or clinical realities

  • the VA is resilient to handling critical events and adapting to unforeseen eventualities in relation to this innovation
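The scoring described in Step 3 (four confidence levels, 0–3 points per item, averaged within domains) can be sketched as follows. The intermediate level labels (“Low”, “Moderate”) are assumptions for illustration; the text specifies only the endpoints “Very Low” and “High”:

```python
# Hypothetical mapping of the four confidence levels to 0-3 points.
# Only the endpoints are given in the text; the middle labels are assumed.
LEVEL_POINTS = {"Very Low": 0, "Low": 1, "Moderate": 2, "High": 3}

def domain_scores(ratings: dict[str, list[str]]) -> dict[str, float]:
    """Average a reviewer's 0-3 item points within each domain."""
    return {
        domain: sum(LEVEL_POINTS[r] for r in items) / len(items)
        for domain, items in ratings.items()
    }

# Example: one reviewer's ratings for two (hypothetical) domains.
example = {"Equity": ["High"], "Feasibility": ["Moderate", "Low", "High"]}
# domain_scores(example) -> {"Equity": 3.0, "Feasibility": 2.0}
```

Averaging within domains keeps domains with many items (e.g., Feasibility) from outweighing domains with a single item (e.g., Equity).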

2.4. Step 4: Defining Participatory Collaborative Approaches

The goal of this step was to create a menu of potential participatory collaborative approaches to collect needed information from frontline innovation teams, stakeholders, and researchers, and to help teams understand and determine the next steps required to support their innovation. We used process flows, planning materials, and facilitated meeting methods based on existing processes for stakeholder engagement, e.g., the “studio” approach [40] and the “think tank” approach for research prioritization [41]. We assessed the potential scope of, and gaps in, these existing models; drafted a process to determine which type of engagement approach was needed; and identified which aspects of the innovation would need to be assessed. We then brought the innovation project team together with key innovation, research, and clinical stakeholders to assess the project against the criteria and pathways of the framework drafted in Step 3. We assessed the feasibility of our model with an innovation and then adapted the models and materials according to participant feedback.

After adapting these participatory models, we offered engagement options tailored to innovation needs (Figure 2). To determine what type of engagement was needed, the core team and innovation team met to gather information about the intervention, its history, the stage of evolution of the innovation and types of evidence available, and current status and challenges.

Participatory models focus on three key aspects to spread innovation across healthcare systems: defining near‐term research questions about effectiveness, assessing available evidence, and having innovators, operational partners and researchers jointly prioritize next steps. We thus used three approaches: a “think tank”, evidence synthesis, and a “studio” approach.

The think tank approach (a 2–5 h structured facilitated meeting) is a mechanism to prioritize a research agenda [42]. After a brief presentation about the innovation, attendees in a facilitated meeting generated research priorities and participated in online voting using a forced ranking prioritization method to prioritize potential research topics. We used virtual breakout groups to co‐generate potential next steps needed to address the top priorities. One priority that may result from a think tank is a need to synthesize the available literature about a given innovation. The VHA has a robust Evidence Synthesis Program (ESP), which provides independent syntheses of published evidence for the VHA to translate into evidence‐based clinical practice, policy, and research. The ESP conducts traditional systematic reviews and adapts its approach to meet operational partners' specific evidence needs and timelines with products such as evidence compendiums, evidence briefs, and evidence maps.

Originally developed by the Clinical and Translational Science Awards (CTSA) program at Vanderbilt University, the “studio” approach was designed to integrate the perspectives of community members into research studies [40]. It has since been expanded to work with health systems, which represent “communities” of staff and leaders who come together with the mission of providing high‐quality care. We adapted this approach by planning up to three studio sessions and varying the objectives depending on the innovation stage. The studio model helped the innovation champions frame considerations for implementation and evaluation (such as defining patient, provider, and system outcomes; finding required key partners and champions; and defining priority objectives and goals). Each session had similar multisector participants.

3. Results

3.1. Application of the Process With an Innovation

We applied the framework and participatory processes for developing collaboration with a clinician‐led innovation nominated by IE leadership. The objective of this activity was to refine the process we developed and assess its feasibility, rather than to evaluate the outcomes of the model or of the piloted innovation. We assessed the feasibility of the process and further refined the materials for the engagement options, pathways, and criteria. We administered surveys within and after sessions to key potential partners of the innovation (clinicians, administrators, researchers) to elicit feedback about both the innovation and the process. We inserted additional points of contact between innovators and researchers and used their feedback to improve prepared materials and meeting structure.

The innovation team had 2 clinician champions who had experience using a virtual reality (VR) intervention in multiple VHA settings. During the process, we determined what types of evidence would be needed to move from innovation to adoption to spread. During regular meetings with the innovators, we learned about the stage of evolution of the innovation and assessed the utility of three activities: (1) prioritizing research questions (think tank); (2) understanding the state of evidence (evidence synthesis); and (3) prioritizing work to move the innovation forward in research and/or implementation (studio model).

3.1.1. Use of the Think Tank

We partnered with an existing VHA entity, the VHA Extended Reality Network (XRN), to conduct a think tank to prioritize research/evaluation needs for implementing VR in VHA Skilled Nursing Facilities (SNFs). In February 2022, 25 contributors convened for a 4‐h virtual think tank and developed a draft list of priorities based on a single prompt. Following report‐outs of unique priorities identified by groups, all participants prioritized the curated list through two rounds of online voting using a forced ranking prioritization method. Lastly, virtual breakout groups were established to consider potential next steps needed to address the top priorities.

The think tank prioritization process yielded 4 key questions related to the use of VR in VHA SNFs and a range of next steps, including a systematic review, clarifying clinical goals related to the innovation, and setting priority outcomes and evaluation. In summary, the think tank process used a multi‐perspective panel to brainstorm and prioritize questions and related next steps to support the implementation of virtual reality in VHA SNFs, including through the planning and conduct of a new clinical demonstration project across one VHA region.
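The forced ranking prioritization used in the voting rounds can be illustrated with a simple Borda‐style tally, in which each ballot ranks every item and higher ranks earn more points. This is a generic sketch of one common forced‐ranking scheme, not a description of the specific voting tool the think tank used:

```python
from collections import defaultdict

def forced_rank_tally(ballots: list[list[str]]) -> list[tuple[str, int]]:
    """Aggregate forced rankings across ballots.

    Each ballot orders all items from most to least important; the item
    ranked k-th on a ballot of n items earns n - 1 - k points (Borda-style).
    Returns (item, total points) pairs sorted from highest to lowest total.
    """
    scores: dict[str, int] = defaultdict(int)
    for ballot in ballots:
        n = len(ballot)
        for k, item in enumerate(ballot):
            scores[item] += n - 1 - k
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Hypothetical ballots over three candidate priorities A, B, C.
ballots = [["A", "B", "C"], ["B", "A", "C"], ["A", "C", "B"]]
# forced_rank_tally(ballots) -> [("A", 5), ("B", 3), ("C", 1)]
```

Running two rounds, as the think tank did, lets participants re‐rank a curated shortlist after seeing the first‐round totals.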

3.1.2. Use of Evidence Synthesis

As noted in Table 2, the think tank discussion generated questions about the state of evidence in the specific setting (skilled nursing facilities). Through a partnership between the innovators and VA ESP experts, an evidence synthesis product was developed [43].

TABLE 2.

Detailed results of pilot process with a virtual reality‐based innovation.

Participants and objectives Resulting information or actions
Think tank
We partnered with the VHA Extended Reality Network (XRN), which at the time brought together over 1000 innovators at over 160 medical facilities seeking to use virtual reality (VR) and related technology to enhance patient outcomes and quality of life, to conduct a think tank to prioritize research/evaluation needs for implementing VR in VHA Skilled Nursing Facilities (SNFs). In February 2022, 25 contributors convened for a 4‐h virtual think tank, including VR experts, frontline clinicians, researchers, and VHA operational partners. Individually, and then in breakout sessions, contributors developed a draft list of priorities based on the prompt: “What are the most important questions to be addressed in the application of Virtual Reality for residents of skilled nursing facilities, like the VA Community Living Centers?” At the end of the think tank, we used the framework criteria to elicit ratings from both the innovators and participants and to place the innovation on one of the 5 pathways.

Resulting information or actions: The think tank prioritization process yielded four key questions related to the use of VR in VHA SNFs: (1) What is the state of the science on VR in the SNF setting? (2) From a patient perspective, what are the key challenges Veterans face in the SNF that VR could address to increase overall wellness and quality of life? (3) What are the short‐ and long‐term outcomes of VR for patients in the SNF? and (4) What is the best way to implement VR in the SNF? Potential next steps included: (1) conducting a systematic review of the impact of VR in the SNF; (2) clarifying clinical goals related to VR in the SNF; (3) identifying opportunities to study priority outcomes; and (4) conducting an implementation science‐informed evaluation of “naturalistic” efforts to implement VR in VA SNFs. We observed that the results of the criteria survey aligned with the main conclusions of the think tank meeting: there was high confidence in pursuing the study of VR in the CLC.

Because there was an active demonstration pilot of VR in the VISN 4 CLC, the decision was made to focus the primary research scope on: (1) conducting a systematic review of the impact of VR in the SNF and (2) clarifying clinical goals related to VR in the SNF.

Evidence synthesis activities

Participants and objectives: As noted above, the primary question that came out of the think tank on the use of extended reality in skilled nursing facilities was “What is the state of the science on VR in the SNF setting?” We worked with the XR Network and the VHA ESP program to arrange for the innovators to partner on the development of an evidence compendium outlining what research has been published on the use of extended reality therapies among older adults.

Resulting information or actions: The report, developed by the ESP site in Portland, Oregon, concluded that while some published research indicates promise for virtual reality to enhance outcomes among patients in residential settings, a robust research evidence base for such programs is lacking.
Studio

Participants and objectives: Using an iterative process of brainstorming and prioritization, each studio focused on and refined a different aspect of VR in the CLC.

Studio 1 asked: “What are the short‐ and long‐term outcomes of VR for patients in the CLC?”

Following Studio 1, we sent out “homework”: a Qualtrics survey to prioritize attributes of a VR platform.

Studio 2 asked: “Which VR platform will best suit the proposed outcomes for Veterans in the CLC?” For this studio, three VR companies whose platforms and attributes matched the identified needs were asked to present. We administered a survey after each presentation to determine confidence in each platform, and after all three vendors had presented, we asked participants to rank the platforms by preference.

Resulting information or actions: Studio 1 identified the primary patient‐level outcomes to be addressed by VR: improved connectedness and socialization, reduced challenging behaviors, and reduced pain intensity. We determined that the platform should have the following attributes: standalone, customizable, and able to use gaze control.

In Studio 2, using the survey results, we determined the top choice of platform.

We observed that the results of the criteria survey aligned with the main conclusions of the studio and think tank meetings.

3.1.3. Use of the Studio Model

The first session aimed to understand the innovation's significance, work progress, and evidence status. Innovators also discussed barriers and facilitators and then generated a prioritized list of solutions. In the second session, priorities were ranked, and for the top‐ranked priority, the stakeholders, key measures, and resources were identified. The last session used the framework criteria to elicit ratings from both the innovators and studio participants to place the innovation on a pathway. We observed that the criteria survey results aligned with the conclusions of the studio and think tank meetings (Table 2).

4. Discussion

We have outlined the design of a framework and engagement process that leverages existing infrastructure to support effective collaboration between the VHA operations and health research sectors in the implementation of promising healthcare innovations. We integrated domains from various implementation frameworks to inform a practical, goal‐oriented series of interactions between stakeholders. We developed specific criteria to help groups gather the right stakeholders and information, ultimately leading to decisions that can speed the process of bringing innovations to fruition. The process is meant to be pragmatic and transparent, with each component designed with a specific, decision‐relevant outcome in mind. With further development, this theoretically grounded, iterative process may serve as a blueprint to accelerate the implementation of healthcare innovations.

In creating a culture that supports an LHS, prioritizing relevant stakeholders in decision‐making about which innovations to support is crucial. An effective collaboration is context‐specific but aims to integrate the strengths of both innovation and research: operational innovators share their understanding of how to impact healthcare quality, and researchers lend their understanding of how to generate and integrate evidence. Our work can help ensure that decisions and criteria are transparent so that stakeholders can truly inform each step of the translation pathway. The proposed framework also aligns with how LHSs impact the quality of care and addresses several noted barriers to doing so, including competing priorities and lack of evaluation [44, 45]. We developed a collaboration process and practical evaluation criteria that consider stakeholder values and help decide on priority outcomes for evaluation. Through this process, an LHS can support the innovations most likely to improve priority outcomes.

This effort has important limitations. The model focuses on the process of collaboration and on a definition of innovation limited to care delivery innovations that use existing products or interventions; it would require tailoring to address innovations that involve product design and development. The review of frameworks was not systematic, unlike other efforts that used formal inclusion and exclusion criteria [46]; nonetheless, this work was intended to be pragmatic and is in line with other reports that used a comparative approach [47]. The results are optimized for the VHA and would need careful adaptation to generalize to other health systems. For example, this collaborative model requires skillful and agile project coordination, research expertise, organizational knowledge, a consensus‐oriented culture, and leadership support. To apply this process within an LHS with fewer resources, some aspects of innovation evaluation may remain critical (e.g., alignment of innovation goals with the organizational mission), while other aspects may require creative adaptation to either abridge processes or leverage external partnerships. The innovation to which we applied this process used all engagement options (think tank, evidence synthesis, and studio sessions) and thus involved a multi‐month process; the processes chosen, the amount of information needed, and the timeline of decision‐making may differ by innovation. Future research could systematically evaluate the progress and impact of innovations through the pathways identified in this work and consider additional pathways or adaptations as needed. Similarly, future work in other LHSs could explore which aspects of the process are particularly sensitive to organizational culture.

In addition, we learned important lessons through the studio process. For each of the efforts mentioned, no funding mechanism was immediately available for the next project; we learned the importance of identifying potential funding mechanisms or specific goals ahead of time to set clear expectations for all involved. Another key lesson was the need to continue expanding, in a systematic way, the process for identifying opportunities to collaborate. We are currently working to identify lessons from the effort outlined here and to apply this process within a single Veterans Integrated Services Network (VISN), where known stakeholders, more local strategic plans, and dedicated innovation staff can help tailor the process to this learning health system.

5. Conclusions

There are challenges in bridging traditional research and innovation approaches to building and evaluating promising innovations. Our description of this theoretically grounded, iterative process can inform the collaborative role that research can play in promoting practice‐based innovations. As the VHA moves intentionally toward improving as an LHS, strengthening connections between health services research and practice‐based innovation enterprises has the potential to speed the spread of effective innovations.

Disclosure

The views expressed in this article are those of the authors and do not represent the position or policy of the Department of Veterans Affairs, United States government, or other organizations with which the authors are affiliated.

Conflicts of Interest

The authors declare no conflicts of interest.

Supporting information

Data S1: Supporting Information.

LRH2-10-e70048-s001.docx (33.4KB, docx)

Acknowledgments

We would like to acknowledge Anne Lord Bailey, PharmD, BCPS, Arash Harzand, MD, Caitlin Rawlins, MSN, RN, and Amit Shah, MD, MSCR for their partnership in piloting this collaboration method.

Raman S. R., Gierisch J. M., Kirshner M. A., et al., “Structuring Collaboration Between Researchers and Operational Innovators: Diffusing New Practices Across the Veteran's Administration Healthcare System,” Learning Health Systems 10, no. 1 (2026): e70048, 10.1002/lrh2.70048.

Funding: This work was supported by the Department of Veterans Affairs Health Services Research and Development Service (SDR 20‐389) and Quality Enhancement Research Initiative (PEC‐17‐002 and QUE 20‐012). The Durham Center of Innovation to Accelerate Discovery and Practice Transformation (CIN 13‐410) at the Durham Veterans Affairs Health Care System also supported this work.

References

  • 1. McGinnis J. M., Fineberg H. V., and Dzau V. J., “Advancing the Learning Health System,” New England Journal of Medicine 385, no. 1 (2021): 1–5. [DOI] [PubMed] [Google Scholar]
  • 2. Institute of Medicine Roundtable on Evidence‐Based Medicine , The Learning Healthcare System: Workshop Summary, ed. Olsen L., Aisner D., and McGinnis J. M. (National Academies Press, 2007). [PubMed] [Google Scholar]
  • 3. Easterling D., Perry A. C., Woodside R., Patel T., and Gesell S. B., “Clarifying the Concept of a Learning Health System for Healthcare Delivery Organizations: Implications From a Qualitative Analysis of the Scientific Literature,” Learning Health Systems 6, no. 2 (2022): e10287. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4. Atkins D., Kilbourne A. M., and Shulkin D., “Moving From Discovery to System‐Wide Change: The Role of Research in a Learning Health Care System: Experience From Three Decades of Health Systems Research in the Veterans Health Administration,” Annual Review of Public Health 38 (2017): 467–487. [DOI] [PubMed] [Google Scholar]
  • 5. Kilbourne A. M., Schmidt J., Edmunds M., Vega R., Bowersox N., and Atkins D., “How the VA Is Training the Next‐Generation Workforce for Learning Health Systems,” Learning Health Systems 6, no. 4 (2022): e10333. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6. Jackson G. L., Damschroder L. J., White B. S., et al., “Balancing Reality in Embedded Research and Evaluation: Low vs High Embeddedness,” Learning Health Systems 6, no. 2 (2022): e10294. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7. Kilbourne A. M., Braganza M. Z., Bowersox N. W., et al., “Research Lifecycle to Increase the Substantial Real‐World Impact of Research: Accelerating Innovations to Application,” Medical Care 57, no. Suppl 10 (2019): S206–S212. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Gould M. K., Sharp A. L., Nguyen H. Q., et al., “Embedded Research in the Learning Healthcare System: Ongoing Challenges and Recommendations for Researchers, Clinicians, and Health System Leaders,” Journal of General Internal Medicine 35, no. 12 (2020): 3675–3680. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9. Psek W. A., Stametz R. A., Bailey‐Davis L. D., et al., “Operationalizing the Learning Health Care System in an Integrated Delivery System,” Egems 3, no. 1 (2015): 1122. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10. Berwick D. M., Nolan T. W., and Whittington J., “The Triple Aim: Care, Health, and Cost,” Health Affairs 27, no. 3 (2008): 759–769. [DOI] [PubMed] [Google Scholar]
  • 11. Sikka R., Morath J. M., and Leape L., “The Quadruple Aim: Care, Health, Cost and Meaning in Work,” BMJ Quality and Safety 24, no. 10 (2015): 608–610. [DOI] [PubMed] [Google Scholar]
  • 12. Goldstein K. M., Gierisch J. M., Tucker M., J. W. Williams, Jr. , Dolor R. J., and Henderson W., “Options for Meaningful Engagement in Clinical Research for Busy Frontline Clinicians,” Journal of General Internal Medicine 36, no. 7 (2021): 2100–2104. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13. Vega R. J. and Kizer K. W., “VHA's Innovation Ecosystem: Operationalizing Innovation in Health Care,” NEJM Catalyst Innovations in Care Delivery 1, no. 6 (2020). [Google Scholar]
  • 14. Jackson G. L., Cutrona S. L., White B. S., et al., “Merging Implementation Practice and Science to Scale up Promising Practices: The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) Program,” Joint Commission Journal on Quality and Patient Safety 47, no. 4 (2021): 217–227. [DOI] [PubMed] [Google Scholar]
  • 15. Vashi A. A., Orvek E. A., Tuepker A., et al., “The Veterans Health Administration (VHA) Innovators Network: Evaluation Design, Methods and Lessons Learned Through an Embedded Research Approach,” Health 8, no. Suppl 1 (2021): 100477. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16. Jackson G. L., Fix G. M., White B. S., et al., “Diffusion of Excellence: Evaluating a System to Identify, Replicate, and Spread Promising Innovative Practices Across the Veterans Health Administration,” Frontiers in Health Services 3 (2023): 1223277. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17. Moloney R. M., Tambor E. S., and Tunis S. R., “Patient and Clinician Support for the Learning Healthcare System: Recommendations for Enhancing Value,” Journal of Comparative Effectiveness Research 5, no. 2 (2016): 123–128. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18. Psek W., Davis F. D., Gerrity G., et al., “Leadership Perspectives on Operationalizing the Learning Health Care System in an Integrated Delivery System,” eGEMs 4, no. 3 (2016): 1233. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19. Cyril S., Smith B. J., Possamai‐Inesedy A., and Renzaho A. M., “Exploring the Role of Community Engagement in Improving the Health of Disadvantaged Populations: A Systematic Review,” Global Health Action 8 (2015): 29842. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20. Committee on the Learning Health Care System in A, Institute of M , Best Care at Lower Cost: The Path to Continuously Learning Health Care in America, ed. Smith M., Saunders R., Stuckhardt L., and McGinnis J. M. (National Academies Press, 2013). [PubMed] [Google Scholar]
  • 21. Kilbourne A. M., Braganza M. Z., Bravata D. M., et al., “The Translation‐To‐Policy Learning Cycle to Improve Public Health,” Learning Health Systems 8, no. 4 (2024): e10463. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22. Elnahal S. M., Clancy C. M., and Shulkin D. J., “A Framework for Disseminating Clinical Best Practices in the VA Health System,” Journal of the American Medical Association 317, no. 3 (2017): 255–256. [DOI] [PubMed] [Google Scholar]
  • 23. Vega R., Jackson G. L., Henderson B., et al., “Diffusion of Excellence: Accelerating the Spread of Clinical Innovation and Best Practices Across the Nation's Largest Health System,” Permanente Journal 23 (2019): 23. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24. Demakis J. G., McQueen L., Kizer K. W., and Feussner J. R., “Quality Enhancement Research Initiative (QUERI): A Collaboration Between Research and Clinical Practice,” Medical Care 38, no. Suppl 1 (2000): I17–I25. [PubMed] [Google Scholar]
  • 25. Kilbourne A. M., Goodrich D. E., Miake‐Lye I., Braganza M. Z., and Bowersox N. W., “Quality Enhancement Research Initiative Implementation Roadmap: Toward Sustainability of Evidence‐Based Practices in a Learning Health System,” Medical Care 57, no. Suppl 10 (2019): S286–S293. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26. Braganza M. Z. and Kilbourne A. M., “The Quality Enhancement Research Initiative (QUERI) Impact Framework: Measuring the Real‐World Impact of Implementation Science,” Journal of General Internal Medicine 36, no. 2 (2021): 396–403. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27. Damschroder L. J., Knighton A. J., Griese E., et al., “Recommendations for Strengthening the Role of Embedded Researchers to Accelerate Implementation in Health Systems: Findings From a State‐Of‐The‐Art (SOTA) Conference Workgroup,” Health 8, no. Suppl 1 (2021): 100455. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28. Oetzel J. G., Wallerstein N., Duran B., et al., “Impact of Participatory Health Research: A Test of the Community‐Based Participatory Research Conceptual Model,” BioMed Research International 2018 (2018): 7281405. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29. Wallerstein N. and Duran B., “Community‐Based Participatory Research Contributions to Intervention Research: The Intersection of Science and Practice to Improve Health Equity,” American Journal of Public Health 100, no. Suppl 1 (2010): S40–S46. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30. Wallerstein N., Duran B., Oetzel J. G., and Minkler M., Community‐Based Participatory Research for Health: Advancing Social and Health Equity, 3rd ed. (Jossey‐Bass, 2018). [Google Scholar]
  • 31. Forsythe L. P., Carman K. L., Szydlowski V., et al., “Patient Engagement in Research: Early Findings From the Patient‐Centered Outcomes Research Institute,” Health Affairs 38, no. 3 (2019): 359–367. [DOI] [PubMed] [Google Scholar]
  • 32. Mullins C. D., Abdulhalim A. M., and Lavallee D. C., “Continuous Patient Engagement in Comparative Effectiveness Research,” JAMA 307, no. 15 (2012): 1587–1588. [DOI] [PubMed] [Google Scholar]
  • 33. Greenhalgh T., Wherton J., Papoutsi C., et al., “Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale‐Up, Spread, and Sustainability of Health and Care Technologies,” Journal of Medical Internet Research 19, no. 11 (2017): e367. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34. Fischer M., Safaeinili N., Haverfield M. C., Brown‐Johnson C. G., Zionts D., and Zulman D. M., “Approach to Human‐Centered, Evidence‐Driven Adaptive Design (AHEAD) for Health Care Interventions: A Proposed Framework,” Journal of General Internal Medicine 36, no. 4 (2021): 1041–1048. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35. Damschroder L. J., Reardon C. M., Widerquist M. A. O., and Lowery J., “The Updated Consolidated Framework for Implementation Research Based on User Feedback,” Implementation Science 17, no. 1 (2022): 75. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36. Glasgow R. E., Harden S. M., Gaglio B., et al., “RE‐AIM Planning and Evaluation Framework: Adapting to New Science and Practice With a 20‐Year Review,” Frontiers in Public Health 7 (2019): 64. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. Rogers E., Diffusion of Innovations (Free Press, 1969). [Google Scholar]
  • 38. Guyatt G., Oxman A. D., Akl E. A., et al., “GRADE Guidelines: 1. Introduction‐GRADE Evidence Profiles and Summary of Findings Tables,” Journal of Clinical Epidemiology 64, no. 4 (2011): 383–394. [DOI] [PubMed] [Google Scholar]
  • 39. Weiner B. J., “A Theory of Organizational Readiness for Change,” Implementation Science 4, no. 1 (2009): 67. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40. Joosten Y. A., Israel T. L., Williams N. A., et al., “Community Engagement Studios: A Structured Approach to Obtaining Meaningful Input From Stakeholders to Inform Research,” Academic Medicine 90, no. 12 (2015): 1646–1650. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41. Lewinski A. A., Sullivan C., Allen K. D., et al., “Accelerating Implementation of Virtual Care in an Integrated Health Care System: Future Research and Operations Priorities,” Journal of General Internal Medicine 36, no. 8 (2021): 2434–2442. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42. Huppert J. S., Fournier A. K., Bihm J. L., et al., “Prioritizing Evidence‐Based Interventions for Dissemination and Implementation Investments: AHRQ's Model and Experience,” Medical Care 57, no. Suppl 3 (2019): S272–S277. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43. Apaydin E. A. A., Williams B. E., and Parr N. J., Evidence Compendium: Research on Extended Reality‐Based Therapies Among Older Adults (Health Services Research and Development Service, Office of Research and Development, Department of Veterans Affairs, 2023). [Google Scholar]
  • 44. Morain S. R., Kass N. E., and Grossmann C., “What Allows a Health Care System to Become a Learning Health Care System: Results From Interviews With Health System Leaders,” Learning Health Systems 1, no. 1 (2017): e10015. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45. Foley T. J. and Vale L., “What Role for Learning Health Systems in Quality Improvement Within Healthcare Providers?,” Learning Health Systems 1, no. 4 (2017): e10025. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46. Hanney S., Kanya L., Pokhrel S., Jones T., and Boaz A., What Is the Evidence on Policies, Interventions and Tools for Establishing and/or Strengthening National Health Research Systems and Their Effectiveness? (World Health Organization Regional Office for Europe, 2020). [PubMed] [Google Scholar]
  • 47. Sheikh K. A., Learning Health Systems: Pathways to Progress (World Health Organization, 2021). [Google Scholar]

