Author manuscript; available in PMC 2022 Jun 23.
Published in final edited form as: Healthc (Amst). 2021 Jun 23;8(Suppl 1):100477. doi: 10.1016/j.hjdsi.2020.100477

The Veterans Health Administration (VHA) Innovators Network: Evaluation Design, Methods and Lessons Learned Through an Embedded Research Approach

Anita A Vashi 1,2,3, Elizabeth A Orvek 4,5, Anaïs Tuepker 6,7, George L Jackson 8,9,10, Allison Amrhein 11, Brynn Cole 11, Steven M Asch 1,12, Allen L Gifford 13,14, Jennifer Lindquist 8, Nell J Marshall 1, Summer Newell 6, Melissa A Smigelsky 15, Brandolyn S White 8, Lindsay K White 4, Sarah L Cutrona 4,16
PMCID: PMC8244154  NIHMSID: NIHMS1701670  PMID: 34175094

Abstract

Background:

Collaboration between researchers, implementers and policymakers improves uptake of health systems research. In 2018, researchers and VHA Innovators Network (iNET) leadership used an embedded research model to conduct an evaluation of iNET. We describe our evaluation design, early results, and lessons learned.

Methods:

This mixed-methods evaluation incorporated primary data collection via electronic survey, descriptive analysis using existing VA datasets (examining associations between facility characteristics and iNET participation), and qualitative interviews to support real-time program implementation and to probe perceived impacts, benefits and challenges of participation.

Results:

We developed reporting tools and collected data regarding site participation, providing iNET leadership rapid access to needed information on projects (e.g., target populations reached, milestones achieved, and barriers encountered). Secondary data analyses indicated iNET membership was greater among larger, more complex VA facilities. Of the 37 iNET member sites, over half (n=22) did not have any of the six major types of VA research centers; thus iNET is supporting VA sites not traditionally served by research innovation pathways. Qualitative findings highlighted enhanced engagement and perceived value of social and informational networks.

Conclusions:

Working alongside our iNET partners, we supported and influenced iNET’s development through our embedded evaluation’s preliminary findings. We also provided training and guidance aimed at building capacity among iNET participants.

Implications:

Embedded research can yield successful collaborative efforts between researchers and partners. An embedded research team can help programs pivot to ensure effective use of limited resources. Such models inform program development and expansion, supporting strategic planning and demonstrating value.

Background

Learning health systems, committed to a culture of continuous learning and improvement, often seek innovative solutions to the many challenges they face. For learning health care systems to truly learn from the experience of patient care and embed best practices in the delivery process, evaluation tools are needed. Yet many such systems lack the expertise to conduct such evaluations.

One promising solution that better aligns evidence, practice and policy is the embedded research model [1]. Embedded research can be thought of as a “partnership between academic researchers and decision-makers to assist in strengthening the development of policy, practice and social innovation, or the co-production of knowledge [2].” By increasing ownership and acceptability, research findings and evidence-based strategies are better integrated into health care implementation and policy solutions. The relationship between implementers and researchers can vary by degree of embeddedness and can range from a dichotomized research-practice approach to a more deeply immersive embedded research model [3].

The Department of Veterans Affairs (VA) Quality Enhancement Research Initiative (QUERI) has embraced the embedded research concept. QUERI’s mission is three-fold: 1) enable rapid translation of research findings and evidence-based treatments into clinical practice; 2) increase the impact of research findings through bi-directional partnerships and rigorous evaluations; and 3) promote implementation science and support VA’s transformation to a learning healthcare system. One component of QUERI’s portfolio is the national Partnered Evaluation Initiatives (PEIs) [4]. A healthcare system operations partner provides primary funding to conduct specific evaluations with potential high impact on VA national policy. These PEIs work closely with operational leaders across VA to provide expertise and conduct time-sensitive evaluations of services and programs, enhancing program design and rollout for continuous innovation and improvement. In 2018, a QUERI PEI was funded to evaluate the VHA Innovators Network (iNET), a national program whose mission is to build and empower a community of VA employees who actively move VA forward using innovation practices. Here we describe the organization of our embedded evaluation team, evaluation design, early results, and lessons learned.

Methods

VHA Innovators Network (iNET)

iNET, launched in 2015, builds the innovation capacity of the VA by: (1) training VA employees on innovation-related competencies; (2) providing an innovation resourcing and acceleration pathway; and (3) supporting the VHA Diffusion of Excellence, a parallel program [5,6], by fueling promising practice identification and implementation.

VA medical facilities are recruited and competitively apply to join iNET. Having begun with 8 sites in 2016, iNET has grown to include over 30 sites. At each participating network site, one or more trained Innovation Specialists facilitate the local innovation program and work with employees to develop innovative projects aimed at resolving unmet needs of Veterans and employees. iNET provides Innovation Specialists employed by member sites with coaching, facilitation, and training on core competencies which include human-centered design, project management, and lean methodologies.

Complementary to this support for local leadership development, iNET also supports projects initiated by frontline employees through the Spark-Seed-Spread (S-S-S) investment program. This provides a structure that allows ideas to be developed, refined, and tested utilizing three tiers of financial support, based on idea maturation. Spark funding supports “Proof of Concept Projects” to help innovators develop initial proof of concept “prototypes,” where there is a strong problem statement and potentially some preliminary evidence or strong theory of action. Seed funding is for “Pilot Projects” to help further develop proof of concept and perform pilot testing. Spread funding is for “Implementation and Scaling Projects” to help spread or scale innovation projects to other populations, clinics, or sites in VA. To date, iNET has resourced 205 Spark, 162 Seed, and 64 Spread projects.

Additionally, iNET provides centralized support and resources to all sites to reduce barriers to innovation. For example, the network assists with contracting, purchasing, and forming community and university collaborations. Resources are also allocated to generate interest, promote virtual connections, and showcase collaborations and successes.

QUERI Partnered Evaluation

In January of 2017, QUERI announced an open-ended call for proposals to support a comprehensive evaluation of iNET. The decision to release the call was based on a VHA leadership request and was informed by iNET’s alignment with the QUERI mission. Assessing iNET’s impact was viewed as necessary to inform further operationalization and sustainability of iNET processes and to support VA’s national efforts to develop a culture of innovation and implement innovation development pathways. After a competitive, peer-reviewed grant process, funding was provided jointly by QUERI, which manages the grant process and provides scientific oversight, and iNET, which works with the evaluators to set strategic direction for the evaluation and utilizes the results in programmatic decision making.

Organizational Governance of the Partnered Evaluation Team

In 2017, two research teams were awarded a six-month planning period. During this time each team conducted regular calls and held in-person meetings with evaluation team members and iNET leadership to better understand the goals of the then-nascent program and the goals for an evaluation. This early work was iterative and included bidirectional information sharing, allowing iNET leadership to teach researchers about the program and, in turn, allowing the evaluation teams to explain the required capacity, as well as the strengths and limitations, associated with proposed evaluation activities. At the conclusion of the planning period, the two research teams decided to join forces to compete for a three-year evaluation grant spanning from April 2018 to March 2021, which we were subsequently awarded. Together with iNET leadership, we formulated evaluation questions that 1) inform key aspects of iNET development; 2) aid in strategic planning; and 3) provide the potential to demonstrate value of the program depending on the results. The final scope of work was detailed in a memorandum of understanding between QUERI, iNET leadership and the evaluation team. This three-year evaluation is administratively overseen as a non-research quality improvement activity.

Partnered Evaluation Goals

Early on, as iNET was seeking to establish itself and grow nationwide, its primary concern was program adoption. Thus, initial evaluation efforts focused on understanding characteristics of iNET sites and specialists as well as understanding expectations for impact at the site and project level. As the program evolved, iNET leadership’s focus shifted toward gaining a better understanding of the impact of the program (implementation success) and gathering evidence to support program maintenance (outcome measures), both for internal quality improvement and to justify ongoing investment. Consequently, later stages of our evaluation have focused on measuring the impact of iNET for selected projects. Our team included expertise in data management, implementation science, clinical care, and qualitative and quantitative methods. This range of expertise, and the collaborative relationship we had established with iNET leadership at the start of the evaluation, allowed us to adapt successfully to evolving goals. Our evaluation work supported our ongoing collaborative relationship, providing tools with immediate utility to iNET and information upon which leadership could rapidly act.

Evaluation Methods

This mixed-methods evaluation incorporates primary data collection via electronic survey, secondary data analysis using existing VA datasets, and qualitative interviews.

Primary quantitative data collection (survey reporting tools):

Based on an initial review of iNET data collection strategies, observation of training sessions, and interviews with stakeholders, we identified a gap in tracking progress and measuring success across diverse projects and sites. iNET leaders desired metrics that could be shared with stakeholders and a data collection solution that would be suitable for respondents with varied experience in project and data management. Through an iterative approach with regular input from our partners, we created a centralized data collection system comprising four surveys, each focused on a specific level of the iNET program (sites, Innovation Specialists, current project leads and program alumni).

Data collected from iNET sites included innovation related activities (e.g., training events, community partnerships, etc.). Data from Innovation Specialists included information about each respondent’s role, background characteristics, and ongoing professional development. Project lead data included information on projects including target population, focus areas, progress/milestones and success metrics. Alumni data captured the status of projects after iNET funding ended.

The data collection system was built within the VA Research Electronic Data Capture (REDCap) program [7], using skip logic and incorporating information from project applications to minimize respondent burden. Before launching data collection, the evaluation team held calls and met in person with network participants to explain the need for these data and promote buy-in. Participant hesitations included concerns about additional burden, questions about the necessity of this reporting, and lack of clarity on how to procure the requested data. We provided reassurance that surveys would be designed to minimize respondent burden wherever possible and shared all planned questions in advance to avoid surprises. We also reviewed with participants the utility of the data within the organizational structure of the VA (e.g., reporting back to their facility directors, demonstrating change on VA-specific measures of performance) and discussed potential VA sources for requested data. Survey administration began in January 2019 on a quarterly basis. We performed frequency counts and descriptive analyses of collected data. We also grouped free-text responses into broader categories for ease of interpretation.
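To illustrate this descriptive approach, the brief Python sketch below shows how frequency counts and keyword-based grouping of free-text responses could be applied to a quarterly export; the file and column names are hypothetical and do not reflect the actual iNET REDCap schema or analysis code.

import pandas as pd

# Hypothetical quarterly REDCap export with one row per project report.
projects = pd.read_csv("inet_project_leads_q1_fy2019.csv")

# Frequency counts for categorical fields (e.g., funding tier, target population).
print(projects["funding_tier"].value_counts())        # Spark / Seed / Spread
print(projects["target_population"].value_counts())

# Group free-text barrier descriptions into broader categories for interpretation.
barrier_keywords = {
    "contract": "Contracting/purchasing",
    "hiring": "Staffing",
    "software": "Information technology",
    "approval": "Approvals/governance",
}

def categorize(text):
    text = str(text).lower()
    for keyword, category in barrier_keywords.items():
        if keyword in text:
            return category
    return "Other"

projects["barrier_category"] = projects["barriers_freetext"].apply(categorize)
print(projects["barrier_category"].value_counts())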

Secondary quantitative data (existing nationwide VA data):

Concurrent with primary data collection, we used data from existing VA data sources to look for associations between macro-level facility characteristics and level of engagement in iNET. The purpose of this effort was to determine the reach of the program in relation to the types of facilities represented in the VA. Facility-level characteristics of interest included: 1) degree of facility complexity and number of patients served; 2) quality indicators included in VA’s primary quality measurement system, the Strategic Analytics for Improvement and Learning (SAIL) [8] (e.g., employee satisfaction, patient satisfaction, and 30-day readmission rates); and 3) the presence of large research centers and programs at the facility. Characteristics were calculated for iNET sites (n=37), sites that applied but were not accepted to iNET (n=28), and the remaining 81 facilities that have not applied for the program.
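As an illustration of this comparison (not the actual analysis code; the input file and field names below are hypothetical stand-ins for the VA administrative sources used), a short Python sketch summarizing facility characteristics by participation group could look like the following.

import pandas as pd

# Hypothetical facility-level extract combining complexity, SAIL, and research-center data.
facilities = pd.read_csv("va_facility_characteristics_fy2018.csv")

# inet_status: "member" (n=37), "applied_not_accepted" (n=28), or "non_applicant" (n=81).
summary = facilities.groupby("inet_status").agg(
    n_facilities=("station_id", "nunique"),
    mean_unique_patients=("unique_patients_fy2018", "mean"),
    mean_sail_stars=("sail_star_rating", "mean"),
    pct_with_research_center=("has_major_research_center", "mean"),  # proportion if coded 0/1
)
print(summary)

# Distribution of facility complexity levels (1a most complex ... 3 least complex) by group.
print(pd.crosstab(facilities["complexity_level"], facilities["inet_status"], normalize="columns"))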

Primary qualitative data (interviews with iNET participants):

Qualitative evaluation components were designed to facilitate relatively rapid analysis in support of real-time program implementation [9]. In the first year of the evaluation, we conducted semi-structured interviews with three key groups of program participants (Innovation Specialists, Project Investees, and Leadership Champions) to 1) identify themes from their experiences that could formatively shape the evolving iNET program content, and 2) inform our evaluation efforts to develop relevant impact metrics. Interview questions were designed to probe the perceived impacts, benefits and challenges of network participation, as well as systemic or structural supports observed or desired to sustain these impacts. We developed a purposive sampling strategy to maximize diversity on site characteristics of interest, including facility size, geographic location, first-time versus repeat applicant, prior innovation approach/experience, and involvement of community/Veteran partners. Site applications to join the network were reviewed against these criteria. A total of 50 interviews were conducted across 15 sites.

Interview guide development was iterative and included soliciting feedback from program participants and operational partners [Appendix]. Team members experienced in qualitative interviewing and partnered evaluation conducted all interviews over the telephone.

Conventional thematic analysis through coding of recorded transcripts was performed, and results of that analysis are now being prepared for publication. In order to provide timely, formative feedback to operational partners, however, a more rapid analysis used templated notetaking during interviews to identify themes with practical implications for implementation both within and across the three interviewee groups [9].
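For readers less familiar with templated rapid analysis, the Python sketch below illustrates one way templated interview notes can be arranged into a domain-by-group matrix for quick review; the note content and field names are entirely hypothetical, and this is not the evaluation team’s actual analysis code.

import pandas as pd

# Hypothetical templated notes: one short summary per analytic domain per interview.
templated_notes = [
    {"group": "Innovation Specialist", "domain": "Perceived benefits", "summary": "Access to peer network"},
    {"group": "Innovation Specialist", "domain": "Challenges", "summary": "Delays in fund dispersal"},
    {"group": "Project Investee", "domain": "Perceived benefits", "summary": "Project funding and salary support"},
    {"group": "Leadership Champion", "domain": "Challenges", "summary": "Need for national dissemination contacts"},
]

notes = pd.DataFrame(templated_notes)

# Arrange summaries into a matrix of analytic domains (rows) by interviewee groups (columns).
matrix = notes.pivot_table(
    index="domain", columns="group", values="summary",
    aggfunc=lambda summaries: "; ".join(summaries),
)
print(matrix)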

Early Results

Primary quantitative data collection:

Primary data collected during the first three quarters of FY2019 are reported here (data collected subsequently are not reported). Our reporting tools have provided iNET leadership with the ability to describe the implementation of their program across diverse sites, Innovation Specialists and S-S-S projects (Table 1). Data collected illustrate the growth of the program from 8 sites in Year 1 to 37 sites in Year 4. Most sites (74%) identified one individual as their Innovation Specialist, with the remaining having more than one. The data collection also captured needed information on projects, including target populations reached, milestones achieved, and barriers encountered.

Table 1.

Characteristics of VA iNET’s Implementation as of June 2019.

Number of iNET sites active in each program year
  Year 1: 8
  Year 2: 22
  Year 3: 32
  Year 4: 37

Average number of Spark-Seed-Spread projects per site (a)
  Spark: 5.72
  Seed: 4.61
  Spread: 1.61
  Funded projects (all levels): 11.94

Active Innovation Specialists
  Total Innovation Specialists: 46

Specialist distribution across 34 sites (b)
  Sites with 1 specialist: 25
  Sites with 2 specialists: 8
  Sites with 3 specialists: 1

Specialist FTE (b, c)
  Average FTE per site: 0.85
  Total FTE across all iNET sites: 28.74

Affiliated department (d)
  Director’s office or Chief of Staff: 8 (19%)
  QSV (Quality Management, Quality Safety & Innovation): 9 (21%)
  Systems redesign: 4 (10%)
  Patient care services: 3 (7%)
  Extended care & rehabilitation: 4 (10%)
  Informatics: 2 (5%)
  Local innovation center: 3 (7%)
  Research: 2 (5%)
  Other: 7 (17%)

FY19 Spark-Seed-Spread distribution across 90 projects
  Spark: 44 (49%)
  Seed: 31 (34%)
  Spread: 15 (17%)

Spark-Seed-Spread focus area (e)
  Veteran Experience and/or Veterans Satisfaction: 46
  Veteran-Centered Care/Veteran Wellness: 42
  Improvement in Clinical Outcome: 31
  Access to Care: 28
  VA costs: 22
  Patient Safety: 15
  Quality of Care through Employee Education: 15
  Efficiency of Care Administration: 14
  Timeliness of Care: 13

Spark-Seed-Spread primary target population (e)
  Veterans: 83
  Veteran caregivers (non-employee): 12
  VA employees: 40

(a) Average number of projects across the 34 sites that have been eligible for S-S-S funding.

(b) Data provided by 44 specialists across 34 sites.

(c) FTE = Full Time Equivalent; 1.0 FTE is one full-time employee.

(d) Data provided by 42 specialists across 34 sites.

(e) Categories are not mutually exclusive.

Secondary quantitative data analysis:

iNET membership is greater among larger, more complex VA facilities than among smaller, less complex facilities; however, we did not find meaningful differences in overall quality between iNET member facilities and non-iNET member facilities as measured by the SAIL star rating, employee or patient satisfaction, or hospital readmissions data (Table 2). iNET appears to extend innovation capacity to facilities without major research centers. Of the 37 iNET sites, 22 (59%) do not have any of the six major types of research centers found across the VA (Table 2). Eleven (52%) of the 21 VA facilities that host VA health services research Centers of Innovation have not applied to the network.

Table 2.

Characteristics of iNET Sites vs Non iNET Sites.

Values are shown as: Active iNET Sites | Non-iNET Sites

Active sites (a): 37 | 109

VA facility complexity (b)
  (Most complex) 1a: 18 (49%) | 21 (19%)
  1b: 5 (14%) | 15 (14%)
  1c: 8 (22%) | 24 (22%)
  2: 4 (11%) | 16 (15%)
  (Least complex) 3: 2 (5%) | 26 (24%)
  Complexity level not separately calculated: 0 (0%) | 7 (6%)

Facility urban/rural classification (c)
  Metropolitan core [1]: 31 (84%) | 81 (74%)
  Metropolitan core-less urban [2, 3]: 3 (8%) | 12 (11%)
  Micropolitan [4-6]: 2 (5%) | 14 (13%)
  Small town or rural area [7-10]: 1 (3%) | 2 (2%)

VA SAIL star rating (facility’s overall quality) (d)
  (Lowest overall quality) *: 3 (8%) | 6 (6%)
  **: 7 (19%) | 28 (26%)
  ***: 14 (38%) | 42 (39%)
  ****: 9 (24%) | 19 (17%)
  (Highest overall quality) *****: 4 (11%) | 14 (13%)

Presence of a major research program (e)
  Sites with a major research program: 15 (41%) | 24 (22%)

Number of unique patients served in fiscal year 2018
  Average number per site: 60,722 | 16,932
  Range: 14,599-141,053 | 8,155-130,292

(a) The 146 total VA sites represent the VA sites/healthcare systems for which the VA produced Strategic Analytics for Improvement and Learning (SAIL) quality reports in fiscal year (FY) 2018.

(b) The Facility Complexity Model is a data-driven model that relies on data from VHA corporate databases along with information from VA Central Office program offices to identify workload and programs (e.g., teaching, research, and complex clinical programs such as cardiac surgery and neurosurgery) at each facility for the purposes of comparing facility complexity. Facilities are categorized into one of five groups: 1a (most complex), 1b, 1c, 2, and 3 (least complex). 1a facilities (highest complexity) are those with high volume, high risk patients, the most complex clinical programs, and large research and teaching programs. Figures are based on the fiscal year 2017 calculation of complexity levels done by the VA.

(c) Urban/rural classification is based on the US Department of Agriculture rural-urban commuting area (RUCA) codes, which use data from the 2010 decennial census and the 2006-10 American Community Survey. The codes are based on population density, urbanization, and daily commuting. Numbers in brackets indicate the RUCA code(s) used for each category.

(d) The Strategic Analytics for Improvement and Learning (SAIL) metrics for facility performance improvement include nine quality domains, one efficiency domain and one capacity domain. Overall quality is represented by the combination of the quality domains. The star rating for overall quality (1 to 5 stars) is assessed as the relative performance compared to other VA facilities. Ratings are from the 4th quarter of FY2018.

(e) Defined as having one or more of the following types of VA research programs or centers: Cooperative Studies Program; Geriatric Research Education and Clinical Center; Health Services Research & Development Center of Innovation; Mental Illness Research, Education and Clinical Center (headquarters facility); Quality Enhancement Research Initiative program and/or partnered evaluation; or Rehabilitation Research & Development Center.

Primary qualitative data analysis:

An early finding from qualitative interviews was the strong perceived value, across all three participating populations (Investees, Specialists, and Champions), of the network as a network. Not surprisingly, Investees valued opportunities for project funding and salary support, but they equally emphasized intangible yet significant and consistent benefits resulting from engagement in the network. Benefits accrued from the creation of social and informational networks, from new opportunities for sharing information and experiences, and from the feeling of engagement in work and professional networks. Interviewees identified common problem areas (such as challenges in fund dispersal, or a need for more connections to national program offices that could help with dissemination). Specialists and Investees reported diverse professional backgrounds and differing levels of familiarity with innovation work, and this influenced their reception of, and satisfaction with, different components of the training provided.

Discussion

Evaluation impacts

Our evaluation’s preliminary findings have supported and influenced iNET’s development. By designing our data collection system to accommodate requests for rapid feedback of actionable findings, and by managing our team to accommodate prompt responses to requests for data, we have been able to support iNET’s need to produce data-driven presentations for regional VA leaders (who have invested in this program) and for VA Central Office leadership. Quarterly project-level data identifying barriers or delays in achieving expected milestones have also allowed iNET leadership to reach out, either directly or through the Innovation Specialists, to provide assistance with troubleshooting. To support capacity building, we designed the REDCap-based data system to allow direct data access for iNET leadership and Specialists. Further, we have provided training on use of this tool so that Innovation Specialists and iNET leaders can download and access the collected data.

Both our quantitative data (primary and secondary data) and our qualitative data informed iNET leadership’s preparation for site visits, facilitating identification of site strengths and support needs. For example, our quantitative results suggest that iNET is accessing and supporting new populations within VA not traditionally served by research innovation pathways. These findings have been presented to iNET leadership and may influence future efforts to target recruitment of sites into the network.

Insights gleaned from rapid qualitative analyses of facility director interviews identified qualities leadership felt were associated with successful Innovation Specialists. iNET leadership then provided these insights to directors at sites seeking to hire staff for an Innovation Specialist position to inform hiring decisions. Rapidly reporting common challenges and differing experiences with training to the operational partners helped them prioritize which problems to solve or, at a minimum, acknowledge. Qualitative research methods are sometimes criticized as being too slow to be useful for ongoing program development and evaluation, yet our work using rapid methods demonstrated that strong and easily identified themes could be delivered in time to help operational partners make decisions. As more detailed coding of interview transcripts progressed, we continued to have conversations that pushed operational partners and researchers alike to think more deeply about how to define the program’s “value” and “outcomes.” Additionally, our finding of the importance of the network as a network informed ongoing development of the structure and format of networking opportunities (for example, continuing to invest in in-person meetings despite additional expense, and developing regional meetings to make it easier for more people to attend and network within their geographic region). Subsequently, the evaluation team began collecting data for use in a network analysis of the characteristics of relationships within iNET [10].
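To give a sense of the descriptive measures such network data could support, the brief Python sketch below uses the networkx library on an entirely hypothetical edge list of reported working relationships; it is illustrative only and not the evaluation team’s actual analysis.

import networkx as nx

# Hypothetical reported working relationships between iNET participants.
edges = [
    ("specialist_A", "investee_1"),
    ("specialist_A", "investee_2"),
    ("specialist_B", "investee_2"),
    ("specialist_B", "leadership_champion_1"),
]
graph = nx.Graph(edges)

print("Density:", nx.density(graph))                      # overall connectedness of the network
print("Degree centrality:", nx.degree_centrality(graph))  # which participants are most connected
print("Connected components:", nx.number_connected_components(graph))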

Use of embedded research strategies

Our partnered evaluation utilized many strategies that are characteristic of successful embedded research teams [1]. From the start, we prioritized relationship building with iNET partners through frequent interactions. We have led and participated in calls, attended in-person events, and connected directly with Innovation Specialists and iNET leadership. When possible, participation in immersive experiences such as VA Demo Day (a showcase event) and regional summits allowed us to better understand the complexities faced by iNET leaders. In turn, this allowed us to propose different solutions that respond to the interests of a range of stakeholders.

Changes in iNET’s leadership (there have been two leadership teams prior to the current one) forced us to frequently redefine and adapt project scope to respond to evolving needs while striving to maintain the academic goals of our evaluation. Leadership teams had to account for differing contexts within VA as the program moved between VA offices and shifted its role, its priorities and its responsibilities. The evaluation team used a reflexive approach in which we consciously considered our position as evaluators in order to maintain clear boundaries around our role and capacity.

From an early stage, our partners articulated a desire for rapid feedback of actionable findings to inform practice. Consequently, we designed data collection tools and methods with the intention of being able to provide information rapidly. Our results and insights have directly helped improve the effectiveness of the iNET program. Again, using a reflexive approach, we realized that one of the most important contributions we could make to this nascent program would be to work interactively with iNET leadership to build capacity for data management and self-evaluation (defining and measuring program success).

Lessons learned

Like other evaluators, we have met a number of challenges and limitations through our work as embedded researchers partnered with iNET and have learned important lessons along the way [11]. The partnered evaluation team was formed two years after iNET launched. Thus, the evaluators initially had to spend considerable time “catching up.” Moreover, when we began, many operational decisions had already been made that might have benefitted from evaluator input. The high degree of iNET leadership turnover required us to adapt and apply strong listening skills. Turnover in iNET leadership teams impacted the goals, pace, and scope of the evaluation. Further, as the program continues to expand, we have had to re-evaluate and re-negotiate our original aims and shift focus as needs have changed. For example, as the program matures there is an increasing need to demonstrate value, as opposed to pursuing a more detailed analysis of who participates and why. Moreover, we often struggled to achieve an appropriate balance between conducting an independent evaluation (the job we were hired to do) and supporting operations (which both we and our partners desired). Evaluating such a geographically dispersed program also posed challenges. We often had to settle for being ‘virtually’ embedded, working to ensure team presence at in-person gatherings scheduled throughout the year. This also meant listening in on group calls and tracking listservs and other online chat functions to understand the context in which iNET members function. However, since all our team members are also VA employees, we had the advantage of being embedded in the same healthcare system as our stakeholders.

Limitations

Results from our evaluation should be considered in the context of its limitations. We discuss here findings from an ongoing evaluation; data collection is not yet complete. Also, our qualitative sample was not designed to compare experiences of participants with non-participants at the same site; thus our finding of strong positive effects on employee experience may be moderated by other site programs or characteristics that we were not able to consider. While this partnered evaluation is ongoing, the challenges, successes and lessons learned may serve as a valuable and realistic example of an embedded research model.

Conclusion

Embedded research can be effective in producing successful collaborative efforts between researchers and operational and policy partners. A successful embedded research team, engaged early in program development and deployment, can help programs pivot when necessary to ensure effective use of limited resources. Such models not only inform program development and expansion but can also aid in strategic planning and in demonstrating value where it exists. In other scenarios, results from an embedded research team may provide the evidence necessary to help the program grow or spread. In turn, results from innovative programs, presented in scholarly reports by the evaluation team, can facilitate dissemination and help other systems facing similar challenges.

Supplementary Material

Supp File 2
Supp File 3
Supp File 1

Acknowledgments

This work was funded by an evaluation planning grant and is currently funded by a full evaluation grant (PEC 18-015: Innovators Network-Population factors, Organizational Capacity, Workflow and Resources (INPOWR)). Funding was provided by the VA QUERI program and the VHA Innovators Network (iNET). This manuscript does not represent the views of the VA or the United States government.

Source of Funding:

Funding was provided by the VA Quality Enhancement Research Initiative (QUERI) and the VHA Innovators Network (PEC 18-015); corresponding Principal Investigator: Sarah Cutrona, MD, MPH.

References

1. Vindrola-Padros C, Pape T, Utley M, Fulop NJ. The role of embedded research in quality improvement: a narrative review. BMJ Qual Saf. 2017;26:70-80. doi: 10.1136/bmjqs-2015-004877.
2. Marshall M, Pagel C, French C, Utley M, Allwood D, Fulop N, Pope C, Banks V, Goldmann A. Moving improvement research closer to practice: the researcher-in-residence model. BMJ Qual Saf. 2014;23:801-805. doi: 10.1136/bmjqs-2013-002779.
3. Churruca K, Ludlow K, Long JC, Braithwaite J, Best S, Taylor N. The time has come: Embedded implementation research for health care improvement. J Eval Clin Pract. 2019;25:373-380. doi: 10.1111/jep.13100.
4. Veterans Affairs Quality Enhancement Research Initiative. Partnered evaluations. https://www.queri.research.va.gov/national_partnered_evaluations. Accessed August 11, 2020.
5. Elnahal SM, Clancy CM, Shulkin DJ. A framework for disseminating clinical best practices in the VA health system. JAMA. 2017;317:255-256. doi: 10.1001/jama.2016.18764.
6. Vega R, Jackson GL, Henderson B, et al. Diffusion of excellence: Accelerating the spread of clinical innovation and best practices across the nation’s largest health system. Perm J. 2019;23:18.309. doi: 10.7812/TPP/18.309.
7. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377-381. doi: 10.1016/j.jbi.2008.08.010.
8. Veterans Health Administration. Strategic Analytics for Improvement and Learning (SAIL) - Quality of Care. https://www.va.gov/qualityofcare/measure-up/strategic_analytics_for_improvement_and_learning_sail.asp. Accessed August 11, 2020.
9. Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Res. 2019;280:112516. doi: 10.1016/j.psychres.2019.112516.
10. Borgatti SP, Everett MG, Johnson JC. Analyzing Social Networks. London, UK: SAGE Publications Ltd; 2018.
11. Vindrola-Padros C, Eyre L, Baxter H, Cramer H, George B, Wye L, Fulop NJ, Utley M, Phillips N, Brindle P, Marshall M. Addressing the challenges of knowledge co-production in quality improvement: learning from the implementation of the researcher-in-residence model. BMJ Qual Saf. 2019;28:67-73. doi: 10.1136/bmjqs-2017-007127.
