Abstract
Implementation efforts to increase adoption of health technologies (e.g., telehealth, mobile health, electronic health records, patient portals) have commonly focused on increasing the adoption of specific health technologies in specific service lines. To facilitate adoption of multiple health technologies across a hospital setting, four Virtual Health Resource Centers (VHRCs) were established to provide clinical adoption support to healthcare staff and patients at four hospitals in a large healthcare system. This study spanned a 3-year period; the first half comprised pre-implementation efforts and the second half implementation efforts. To compare sites to the national population, a binomial regression was used, allowing adjustment for relevant covariates (e.g., differences in number of enrollees, level of facility complexity). The pre-implementation phase and the initial year-and-a-half of the implementation phase resulted in an increase in internal facilitators’ knowledge and skills related to virtual care technologies and an increase in facilitator and site capacity, and high levels of adherence to implementation strategies were maintained across sites. Virtual care utilization increased across all sites and across the healthcare system during the implementation phase; however, a comparison to the increase in national-level virtual care utilization metrics yielded no meaningful difference. While many implementation strategies aim to increase the adoption of a particular health technology product (e.g., a particular app or remote monitoring use case), the establishment of VHRCs may increase efficiencies in delivering virtual care training and consultation to healthcare staff and patients, which may increase capacity and decrease barriers to adoption. However, given the impact of the COVID-19 pandemic on the need for rapid technology adoption and the decrease in in-person care and services, the longer-term impact of establishing VHRCs on the sustained adoption of health technologies is not yet known.
Keywords: Implementation science, Training, Healthcare system, Telehealth, Mobile health, Virtual care
Introduction
Virtual care health technologies such as synchronous and asynchronous telehealth, mobile applications, text-messaging applications, wearable/implantable biosensors, and patient-generated health data, deployed in the setting of an integrated electronic health record (EHR) with advanced analytic capabilities, create the potential for substantial benefits in the delivery and coordination of clinical care (Brown et al., 2014; Hermes et al., 2019; Hilty et al., 2019; Keyworth et al., 2018; Rathbone & Prescott, 2017). Barriers to implementation of virtual care include systemic barriers at the organizational level (governance models, lack of guidance regarding clinical integration), environmental barriers (low levels of digital health literacy, lack of understanding of benefits, lack of training, learning new workflows), lack of infrastructure (poor wi-fi, lack of logistics for supplying equipment and access, poor integration of EHR systems), logistical barriers (scheduling, staffing), lack of resources (space, equipment, funding, time), challenges with collaboration (bureaucratic delays, communication), and policy issues (Armstrong et al., 2018; Hilty et al., 2018; Maheu et al., 2018; Muir et al., 2020; Torous et al., 2018; Zhou et al., 2019).
Implementation science, including the analysis of barriers and facilitators of adoption, provides a conceptual framework for systematically identifying and evaluating strategies in order to determine which implementation approaches are most effective and efficient (Kirchner et al., 2020; Morris et al., 2011; Nilsen, 2015). Facilitators of virtual care adoption include clinical staff (dedicated clinicians, experience with innovation, motivation, training), other staff (champions, coordinators, leadership support, external facilitators), and having an implementation strategy (tailored approach, pilot sites, standardized procedures) (Armstrong, 2019; Armstrong et al., 2020; Gould et al., 2019; Muir et al., 2020).
With an estimated 9.1 million Veteran enrollees and over 400,000 full-time health care professionals and support staff at over 1200 health care facilities, supported by an annual budget of over $68 billion, the Veterans Health Administration (VHA) is the largest integrated health care system in the United States and one of the largest in the world. While the complexities of delivering care across such a large and diverse system create logistical challenges, the scope and integrated nature of the system also provide opportunities to create efficiencies of scale by standardizing operations, automating basic clinical activities, improving access to critical resources, and empowering both Veterans and healthcare staff. Health technologies enhance care coordination across VHA and promote Veterans’ ownership of and engagement with their own care. In accordance with VHA’s 2018–2024 strategic plan, the VHA aims to increase implementation and dissemination efforts to more fully integrate best practices in the use of virtual care technologies to support the delivery of Veteran care.
While the utilization of virtual care health technologies across VHA has increased significantly over the past decade, evidence for their integration into the daily workflows of clinical teams across the spectrum of care, as well as their adoption by patients, leaves much opportunity for advancement. A limitation in the advancement of implementation studies in the area of health technologies is that many have focused on increasing the adoption of an individual technology product or functionality (e.g., clinical video telehealth or a single mobile health app) within a specific clinical workflow (e.g., primary care), within a single clinic or hospital (Armstrong et al., 2020; Lindsay et al., 2015; Quanbeck et al., 2018; Yakovchenko et al., 2019). This approach, while simplifying analyses, creates challenges in generalizing lessons learned to the system level, and a focus on an individual technology may not be the most efficient method of increasing overall virtual care utilization. This study aimed to leverage implementation science to increase the adoption of multiple health technologies through the establishment of Virtual Health Resource Centers (VHRCs) at four sites in a large healthcare system.
Methods
The aim of this mixed methods study was to test three multi-faceted approaches to increase health technology adoption: (a) evaluating and building facilitator and site capacity at four sites; (b) measuring site adherence to 73 discrete implementation strategies; and (c) establishing VHRCs and evaluating their impact on health technology utilization metrics. Strategies were selected based on their perceived ability to decrease barriers and increase the adoption of health technologies in a large healthcare system. Implementation efforts focused on interdisciplinary VHA staff adoption of virtual care tools. The effectiveness-implementation hybrid design evaluated the use of facilitation specialists and compared within-site and between-site effects on several usage metrics (Landes et al., 2020).
Participants
Participants included VHA staff on the implementation team directly involved in facilitating implementation of virtual care at study site locations. The interdisciplinary implementation team included twelve staff members in fiscal year (FY) 2019: two physicians, four nurses, one psychologist, one dietician, one human factors engineer, one program manager, and one healthcare specialist. The implementation team increased in FY20 to eighteen staff members, and included the additions of four healthcare specialists, one social worker, and one nurse.
Implementation team member roles were identified based on published recommendations (Ritchie et al., 2020; Rathbone & Prescott, 2017) and included external facilitators (experts in general implementation strategies and tools, with expertise or credible knowledge about the clinical innovation and its evidence base); internal clinical leadership (individuals and groups of stakeholders who can impact the implementation of the clinical innovation); and site internal facilitators, who were familiar with site-level organizational structures, procedures, and culture along with the national-level clinical processes within the healthcare network.
Procedures
The current study included activities completed over a 3-year period, from April 2018 to March 2021, comprising a one-and-a-half-year pre-implementation phase (April 2018–September 2019) and the first year-and-a-half of the implementation phase (October 2019–March 2021). Implementation models included the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework (Harvey & Kitson, 2016) and seventy-three discrete implementation strategies selected from the Expert Recommendations for Implementing Change (ERIC) compilation (Powell et al., 2015). The i-PARIHS framework served as the overarching framework, while the discrete ERIC strategies and related activities were tracked throughout the pre-implementation and implementation phases.
The steps completed in the pre-implementation phase included the following: needs assessment and gap analysis for the program, funding to support the program, and program mission and goals (April–June 2018); program evaluation plan, site selection, and site memoranda of understanding signed by VA central office and site leadership (July–September 2018); and staff hiring for sites (October 2018–March 2019). In summary, pre-implementation efforts included a needs assessment and gap analysis of current virtual care implementation efforts across the VHA, accessing funding, establishment of the program mission, goals, and evaluation plan, site selection, and staff hiring.
Site identification was based on several factors, including clinician and patient mix and value; location, size, and type of VHA facility; existing relationships and availability of an on-site clinical leader; and resource availability and collaborative opportunities near the site. Once sites were selected, a site profile process was conducted, including a needs and gap analysis and facility leadership approval. The needs assessment and gap analysis were used to identify opportunities with the greatest potential to increase the adoption and utilization of virtual care technologies at the respective site. Preliminary personnel and site assessments were conducted and identified key baseline site performance metrics, patient population characteristics, and barriers and facilitators to adoption of virtual care technologies. Table 3 includes a comparison of key site operational metrics and highlights the diversity of the sites in terms of facility complexity, number of facilities included, number of patients, and rurality. Sites identified included medical centers and surrounding facilities in St. Cloud, Minnesota (Site A), San Diego, California (Site B), New Orleans, Louisiana (Site C), and Tampa, Florida (Site D). Sites identified were diverse, providing optimum opportunity to evaluate current implementation strategies across a variety of settings.
Table 3. Operational and health technology utilization metrics, nationally and by site, at timepoint 1 (end of pre-implementation phase) and timepoint 2 (one-and-a-half years into the implementation phase)
| National | | | Site A (St. Cloud, MN) | | | Site B (San Diego, CA) | | | Site C (New Orleans, LA) | | | Site D (Tampa, FL) | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Time point* | 1 | 2 | % difference | 1 | 2 | % difference | 1 | 2 | % difference | 1 | 2 | % difference | 1 | 2 | % difference
Operational metrics
Facility complexity categorization | n/a | n/a | n/a | 3 | 3 | n/a | 1A | 1A | n/a | 1B | 1B | n/a | 1A | 1A | n/a | ||||||||||||||||
# of facilities included | 1255 | 1299 | n/a | 4 | 4 | n/a | 7 | 7 | n/a | 9 | 9 | n/a | 15 | 15 | n/a | ||||||||||||||||
# enrollees | 9,100,000 + | 9,093,691 | 2.2% | 38,944 | 32,046 | (− 17.7%) | 86,140 | 72,931 | (− 15.3%) | 46,743 | 43,478 | (− 7.0%) | 100,329 | 95,836 | (− 4.5%) | ||||||||||||||||
Operating beds | 20,354 | 35,681 | 75.3% | 388 | 389 | 0.3% | 272 | 272 | n/a | 150 | 168 | 12.0% | 499 | 499 | n/a | ||||||||||||||||
Outpatient visits | 81,305,962 | 45,965,521 | (− 43.5%) | 440,059 | 216,676 | (− 50.8%) | 993,457 | 522,244 | (− 47.3%) | 694,783 | 367,013 | (− 94.7%) | 1,437,900 | 863,301 | (− 40.0%)
Unique inpatient admissions | 487,600 | 237,075 | (− 51.4%) | 2491 | 576 | (− 76.9%) | 6642 | 3133 | (− 52.8%) | 2927 | 1783 | (− 39.1%) | 10,753 | 4368 | (− 59.4%)
% Rural/highly rural | 29.3% | 32.5% | 10.9% | 75.3% | 76.0% | 0.9% | 5.1% | 4.7% | (− 7.8%) | 19.4% | 18.4% | (− 5.2%) | 8.4% | 9.5% | 13.1% | ||||||||||||||||
Health technology utilization metrics
Synchronous video telehealth | Primary care providers using VA Video Connect | 63.7% | 93.6% | 46.9% | 87.8% | 98.8% | 12.5% | 61.9% | 96.8% | 56.4% | 53.7% | 99.1% | 84.5% | 94.7% | 99.1% | 4.6% | |||||||||||||||
Mental health providers using VA Video Connect | 62.4% | 97.0% | 55.4% | 60.7% | 91.6% | 50.9% | 58.4% | 97.6% | 67.1% | 55.2% | 93.2% | 68.8% | 90.1% | 95.3% | 5.8% | ||||||||||||||||
Online patient health portal | My HealtheVet (% premium accounts) | 41.1% | 46.2% | 12.4% | 47.1% | 51.3% | 8.9% | 48.5% | 55.2% | 13.8% | 45.1% | 45.8% | 1.6% | 48.6% | 53.2% | 9.5% | |||||||||||||||
Automated text message platform | Annie (# (%) unique veterans) | 5247 | 30,342 | 478.3% | 472 | 729 | 54.4% | 86 | 589 | 584.9% | 19 | 248 | 1205.3% | 42 | 582 | 1285.7% | |||||||||||||||
Mobile health apps | PTSD Coach (# lifetime downloads) | 473,283 | 770,282 | 62.8% | No user authentication is required for these mobile health apps, so utilization metrics are not available by any specific site or individual | ||||||||||||||||||||||||||
Mindfulness Coach | 376,242 | 836,386 | 122.3% | ||||||||||||||||||||||||||||
CBT-i Coach | 399,163 | 714,243 | 78.9% | ||||||||||||||||||||||||||||
Insomnia Coach | 0** | 72,563 | n/a | ||||||||||||||||||||||||||||
COVID Coach | 0** | 199,310 | n/a |
Timepoint 1: end of pre-implementation phase = September 2019 (end of quarter 4, fiscal year (FY) 2019); timepoint 2: one-and-a-half years after implementation phase began = end of quarter 2, FY2021
Following leadership engagement and signing of memoranda of understanding, preliminary site needs and readiness assessments were conducted to identify baseline site performance metrics, patient population, barriers, and facilitators. A site capacity survey was administered to site internal facilitators measuring: motivations/interest in virtual care; current level of knowledge of mobile health core competencies; current level of knowledge and experience using VHA products; perception of site capacity; perception of site leadership support; perception of site provider support; perception of the potential impact of virtual care at the site; barriers, limitations, and need for support; and knowledge of VHA and military health systems and patient populations (Table 2).
Table 2. Internal facilitator site capacity survey results at the preliminary (pre), 3-month post, and 6-month post assessments, across sites and by site
Facilitator or site focused | Knowledge area | Item | Sub-item | Across sites | | | Site A (St. Cloud, MN) | |
---|---|---|---|---|---|---|---|---|---
 | | | | Pre (N = 11) | Post (N = 8) | 6-month Post (N = 7) | Pre (N = 2) | Post (N = 2) | 6-month Post (N = 2)
Questions to facilitator regarding their own knowledge and readiness | Knowledge base of target audience (% rated high or very high level of knowledge) | Rate your current level of knowledge on VA health care: (response options: very high, high, moderate, low, very low) | Veteran Health Needs | 72.7% | 87.5% | 100.0% | 50.0% | 50.0% | 100.0% |
VHA System | 63.6% | 87.5% | 100.0% | 50.0% | 100.0% | 100.0% | |||
Knowledge base of virtual care core competencies (% rated high or very high) | Rate how strongly you agree/disagree with the following statements regarding adoption of virtual care tools and program in clinical practice at your facility (response options: very high, high, moderate, low, very low) | Clinical Integration of Virtual Care | 72.7% | 100.0% | 100.0% | 50.0% | 100.0% | 100.0% | |
Evidence Base for Virtual Care | 63.6% | 100.0% | 100.0% | 50.0% | 100.0% | 100.0% | |||
Security and Privacy of Virtual Care | 63.6% | 100.0% | 100.0% | 50.0% | 100.0% | 100.0% | |||
Ethical Issues with Virtual Care | 72.7% | 75.0% | 100.0% | 50.0% | 50.0% | 100.0% | |||
Cultural Considerations with Virtual Care | 63.6% | 75.0% | 100.0% | 50.0% | 50.0% | 100.0% | |||
Readiness to use virtual care (% in action or maintenance stage) | Please select the item below that best describes where you are in terms of integration of virtual health technologies into clinical care | I do not intend on integrating virtual health technologies into my clinical practice (precontemplation) | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | |
I am intending to begin integration virtual health technologies into my clinical practice in the next 6 months (contemplation) | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | |||
I am planning on integrating virtual health technologies into my clinical practice next month (preparation) | 9.1% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | |||
I started integrating virtual health technologies into my clinical practice within the past 6 months (action) | 18.2% | 37.5% | 0.0% | 50.0% | 0.0% | 0.0% | |||
I started integrating virtual health technologies into my clinical practice more 6 months ago (sustainment) | 72.7% | 62.5% | 100.0% | 50.0% | 100.0% | 100.0% | |||
Product knowledge (% that had either used the product, prescribed, or recommended the product in clinical care, or trained others on how to use the product) | Average rate of product knowledge across suite of 56 virtual care products and programs (% that had either used the product, prescribed, or recommended the product in clinical care, or trained others on how to use the product) | 21.8% | 34.7% | 42.4% | 21.4% | 34.1% | 41.5% | ||
Product knowledge on key products (% that had either used the product, prescribed, or recommended the product in clinical care, or trained others on how to use the product) | Annie (VHA's automated health text messaging web platform) | 36.7% | 62.5% | 85.7% | 50.0% | 100.0% | 50.0% | ||
Mindfulness Coach (iOS and Android app) | 36.7% | 50.0% | 57.1% | 0.0% | 50.0% | 50.0% | |||
My HealtheVet (VHA's patient web portal) | 63.6% | 75.0% | 100.0% | 50.0% | 50.0% | 100.0% | |||
PTSD Coach (iOS and Android app) | 36.7% | 50.0% | 71.4% | 50.0% | 50.0% | 50.0% | |||
Rx Refill (iOS and Android app) | 27.3% | 37.5% | 42.9% | 0.0% | 0.0% | 0.0% | |||
VA Video Connect (live video telehealth) | 72.7% | 87.5% | 100.0% | 50.0% | 50.0% | 100.0% | |||
Questions to facilitator regarding perceptions of site capacity | Quality of facility climate and support regarding virtual care implementation (% agree or strongly agree) | Rate how strongly you agree/disagree with the following statements regarding adoption of virtual care adoption in clinical practice at your facility (response options: strongly agree, agree, neither agree nor disagree, disagree, strongly disagree) | Virtual care can be integrated into care at your site | 81.8% | 100.0% | 100.0% | 50.0% | 100.0% | 100.0% |
Virtual care is supported by scientific evidence | 72.7% | 100.0% | 100.0% | 50.0% | 100.0% | 100.0% | |||
Virtual care implementation is supported by your facility leadership | 72.7% | 75.0% | 85.7% | 50.0% | 50.0% | 100.0% | |||
Site barriers to implementation (% rated barrier as moderate, high, or very high) | Time constraints/too busy | 100.0% | 87.5% | 85.7% | 100.0% | 100.0% | 100.0% | ||
Don’t know virtual care tools enough | 81.8% | 62.5% | 57.1% | 100.0% | 50.0% | 50.0% | |||
Need more training | 72.7% | 62.5% | 57.1% | 50.0% | 50.0% | 50.0% | |||
Unclear policies regarding use | 72.7% | 50.0% | 42.9% | 50.0% | 50.0% | 0.0% | |||
Concerns about security and privacy | 72.7% | 62.5% | 57.1% | 50.0% | 50.0% | 50.0% | |||
Need more evidence that proves effectiveness of virtual care | 63.6% | 37.5% | 28.6% | 50.0% | 50.0% | 50.0% | |||
Don’t know how to choose which virtual care tool to use | 63.6% | 50.0% | 42.9% | 50.0% | 50.0% | 50.0% | |||
Virtual care will interfere with the patient/client relationship | 45.5% | 37.5% | 28.6% | 50.0% | 50.0% | 0.0% | |||
No compensation for use of virtual care with patients | 36.7% | 37.5% | 42.9% | 50.0% | 50.0% | 50.0% | |||
Lack of connectivity in the clinic | 36.7% | 25.0% | 28.6% | 50.0% | 50.0% | 50.0% | |||
Employer restrictions/lack of employer support | 27.3% | 25.0% | 14.3% | 50.0% | 0.0% | 0.0% | |||
Cost or resources associated with use | 18.2% | 25.0% | 14.3% | 50.0% | 50.0% | 50.0% | |||
Don’t think that virtual care will work | 18.2% | 12.5% | 14.3% | 0.0% | 0.0% | 0.0% |
Table 2 (continued). Rows follow the same order and labels as in the panel above.

Facilitator or site focused | Site B (San Diego, CA) | | | Site C (New Orleans, LA) | | | Site D (Tampa, FL) | |
---|---|---|---|---|---|---|---|---|---
 | Pre (N = 3) | Post (N = 2) | 6-month Post (N = 2) | Pre (N = 3) | Post (N = 2) | 6-month Post (N = 2) | Pre (N = 3) | Post (N = 2) | 6-month Post (N = 1)
Questions to facilitator regarding their own knowledge and readiness | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% |
33.3% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | |
66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | |
66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | |
66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | |
66.7% | 50.0% | 100.0% | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | |
66.7% | 50.0% | 100.0% | 66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | |
0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.00% | 0.0% | 0.0% | 0.0% | |
0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | |
33.3% | 0.0% | 0.0% | 0.0% | 0.0% | 0.00% | 0.0% | 0.0% | 0.0% | |
33.3% | 50.0% | 0.0% | 33.3% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | |
33.3% | 50.0% | 100.0% | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | |
22.0% | 35.1% | 42.8% | 20.7% | 32.9% | 40.3% | 22.5% | 35.2% | 43.8% | |
33.3% | 50.0% | 100.0% | 33.3% | 50.0% | 100.0% | 33.3% | 50.0% | 100.0% | |
0.0% | 50.0% | 50.0% | 33.3% | 50.0% | 50.0% | 33.3% | 50.0% | 100.0% | |
66.7% | 33.3% | 100.0% | 66.7% | 33.3% | 100.0% | 66.7% | 33.3% | 100.0% | |
33.3% | 33.3% | 50.0% | 33.3% | 33.3% | 100.0% | 33.3% | 50.0% | 100.0% | |
33.3% | 33.3% | 50.0% | 33.3% | 33.3% | 50.0% | 33.3% | 33.3% | 100.0% | |
66.7% | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | |
Questions to facilitator regarding perceptions of site capacity | 100.0% | 100.0% | 100.0% | 66.7% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% |
66.7% | 100.0% | 100.0% | 66.7% | 100.0% | 50.0% | 66.7% | 100.0% | 100.0% | |
66.7% | 100.0% | 100.0% | 66.7% | 50.0% | 50.0% | 100.0% | 100.0% | 100.0% | |
100.0% | 50.0% | 100.0% | 100.0% | 50.0% | 50.0% | 100.0% | 50.0% | 100.0% | |
100.0% | 50.0% | 50.0% | 66.7% | 50.0% | 50.0% | 66.7% | 50.0% | 100.0% | |
100.0% | 50.0% | 50.0% | 66.7% | 50.0% | 50.0% | 66.7% | 50.0% | 100.0% | |
66.7% | 50.0% | 50.0% | 100.0% | 50.0% | 50.0% | 66.7% | 50.0% | 100.0% | |
100.0% | 50.0% | 50.0% | 66.7% | 50.0% | 50.0% | 66.7% | 100.0% | 100.0% | |
66.7% | 50.0% | 50.0% | 66.7% | 0.0% | 50.0% | 66.7% | 50.0% | 100.0% | |
66.7% | 50.0% | 50.0% | 66.7% | 50.0% | 50.0% | 66.7% | 50.0% | 0.0% | |
0.0% | 50.0% | 50.0% | 33.3% | 50.0% | 50.0% | 33.3% | 0.0% | 0.0% | |
0.0% | 0.0% | 50.0% | 66.7% | 50.0% | 50.0% | 66.7% | 50.0% | 100.0% | |
33.3% | 50.0% | 50.0% | 33.3% | 0.0% | 0.0% | 33.3% | 0.0% | 0.0% | |
33.3% | 50.0% | 50.0% | 33.3% | 50.0% | 0.0% | 0.0% | 0.0% | 0.0% | |
0.0% | 0.0% | 0.0% | 33.3% | 50.0% | 50.0% | 0.0% | 0.0% | 100.0% | |
33.3% | 0.0% | 0.0% | 33.3% | 50.0% | 50.0% | 0.0% | 0.0% | 0.0% |
Capacity building assessments were adapted from the following standardized surveys: the Organizational Readiness to Change Assessment Tool (ORCA; Helfrich et al., 2009), the Checklist to Assess Organizational Readiness for Evidence-Informed Practice (CARI; Barwick, 2011), and a readiness for change measure (Prochaska et al., 1994, p. 68). These assessments were modified in alignment with recommendations provided by the i-PARIHS framework. Modifications consisted of additional items specific to the aim of this study. The additional items measured: motivations/interest in virtual care; current level of knowledge of mobile health core competencies; current level of knowledge of the VHA, military health system, and patient health needs; current level of knowledge and experience using VHA products; quality of organizational climate and support regarding virtual care; perception of the potential impact of connected health at the site; and barriers, limitations, and need for support. Items are included in Table 2.
Between July and September 2019, the lead external facilitator held one-on-one meetings with each site’s internal facilitators in order to develop individual capacity building plans and to inform the development of the broader capacity building strategy across sites. Processes included the administration of a preliminary assessment to each site internal facilitator, with results used to assess knowledge and skill gaps, site-level barriers to implementation, and the quality of facility climate and support regarding virtual care implementation. During this component of the process, implementation and dissemination plans were developed and deployed to facilitate the implementation process. Individual implementation plan meetings with sites included the following: (1) explanation of the site capacity building process, (2) discussion of identified needs and possible solutions, (3) discussion regarding identified strengths, (4) discussion regarding identified opportunities, and (5) discussion regarding bridging identified gaps. Implementation plans for each site included: (1) training preferences/goals, (2) prioritized topics (individual/group/resources), (3) types of promotional/training resources available and ideas for additional ones to develop, and (4) agreement on goals and development of a site road map to achieve those goals.
After individual and site capacity building plans were created based on the preliminary results, individual and site training plans were developed and delivered to increase all internal facilitators’ base level of knowledge and skills across the areas measured. Based on the results of the initial needs assessment and gap analysis, as well as the site assessment and capacity building results, implementation strategies were selected across all sites and an implementation plan was developed to provide an overall roadmap for the development of VHRCs at each site; the VHRCs were launched between March 2020 and July 2020, marking the start of the implementation phase at each site. Post site capacity building assessments were delivered 3 months after the preliminary assessment and provided information regarding progress toward knowledge and skills gained, as well as further training needed. Continued training and site capacity building efforts were provided based on internal facilitators’ responses. A 6-month post assessment was also delivered in order to assess progress toward goals.
Implementation strategies were tracked across all sites, and key implementation strategies (training, marketing, and encounters (e.g., consultations) at the VHRC) were tracked in further detail. Primary activities included the delivery and evaluation of training and marketing events to increase awareness of virtual care tools and programs. Training and marketing events were assessed based on the type of event: lower-impact events aimed at increasing awareness were evaluated by reach (number of attendees and number of events) and qualitative feedback provided by attendees, while higher-impact delivery methods were evaluated with course evaluations and follow-up assessments (when possible) to measure awareness, knowledge, and skills. In developing the training and marketing plan, the implementation team remained cognizant of VHA organizational structures, ensured trainings were tailored to local sites, and used a phased approach to allow for the refinement of procedures at systemic and site levels.
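As a purely illustrative sketch of how reach metrics of this kind can be summarized (the event log fields and figures below are hypothetical and are not the study's actual tracking system), an event log can be aggregated by event type as follows:

```python
import pandas as pd

# Hypothetical event log; field names and values are illustrative only.
events = pd.DataFrame([
    {"site": "A", "event_type": "webinar",       "impact": "high", "attendees": 85},
    {"site": "B", "event_type": "webinar",       "impact": "high", "attendees": 40},
    {"site": "B", "event_type": "resource fair", "impact": "low",  "attendees": 150},
    {"site": "D", "event_type": "office hours",  "impact": "low",  "attendees": 12},
])

# Reach = number of events and total attendees, summarized by event type.
reach = (
    events.groupby("event_type")
          .agg(n_events=("attendees", "size"), total_attendees=("attendees", "sum"))
          .reset_index()
)
print(reach)
```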
Evaluation
Evaluation was guided by the mixed-methods RE-AIM Qualitative Evaluation for Systematic Translation (RE-AIM QuEST) framework. Impact on virtual care utilization was calculated as the percent increase from baseline (prior to the implementation phase, at the end of the fourth quarter of FY19) to the end of the first year of the implementation phase (end of the fourth quarter of FY20). Rates of key telehealth metrics across five key domains were assessed (telehealth use, patients using telemental health, clinical video telehealth to offsite locations, primary care providers using VA Video Connect, and mental health providers using VA Video Connect). The rate of utilization was also analyzed for individual products (VA Video Connect, My HealtheVet, Annie, PTSD Coach, Mindfulness Coach, CBT-i Coach, Insomnia Coach, and COVID Coach). VA Video Connect, My HealtheVet, and Annie require secure login credentials, so site-level metrics could be calculated. For PTSD Coach, Mindfulness Coach, CBT-i Coach, Insomnia Coach, and COVID Coach (and other VA mobile apps that do not require login credentials), site-level utilization metrics are not available.
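As a simple illustration of this calculation (not the study's analysis code), the percent increase from baseline can be computed directly from the two time points reported for each metric; the example below reproduces the national figure for primary care providers using VA Video Connect from Table 3.

```python
def percent_change(baseline: float, followup: float) -> float:
    """Percent increase from baseline (timepoint 1) to follow-up (timepoint 2)."""
    return (followup - baseline) / baseline * 100

# National % of primary care providers using VA Video Connect (Table 3): 63.7% -> 93.6%
print(round(percent_change(63.7, 93.6), 1))  # 46.9, matching the % difference in Table 3
```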
The scope of the virtual care technologies included the suite of VHA mobile health applications (e.g., PTSD Coach, Mindfulness Coach, Annie, My VA Images, Rx Refill), the suite of products and programs offered in VHA telehealth (e.g., home telehealth, remote patient monitoring, VA Video Connect), and the VHA’s web-based patient portal (My HealtheVet), which includes secure (encrypted) messaging between VHA staff and Veterans, access to VHA medical records, and prescription refills.
Results
Discrete implementation strategies utilized across sites are detailed in Table 1, which shows the activities completed and resources developed for each discrete strategy. The percentages of implementation strategies used by the national program team and across sites were consistent. The national program team used 87.7% (64/73) of available strategies, Sites A and D used 79.5% (58/73) of strategies, and Sites B and C used 80.8% (59/73) of strategies. The only strategy not used at all sites was strategy 72 (site visits): in-person visits were completed for Sites B and C, but due to restrictions related to the COVID-19 pandemic, in-person site visits were put on hold for Sites A and D. Five of the seventy-three strategies were used only at the national program level to support efforts at sites and across the VHA enterprise: 1) access new funding, 2) alter incentive/allowance structures, 3) alter patient/consumer fees, 22) create or change credentialing and/or licensure standards, and 34) fund and contract for the clinical innovation.
Table 1. Discrete ERIC implementation strategies, activities completed, and resources developed
Strategy # | Strategy | Activities completed |
---|---|---|
1 | Access new funding | Funding provided through VA’s Central Office of Connected Care budget |
2 | Alter incentive/allowance structures | Primary care leadership and program performance-based performance goals |
3 | Alter patient/consumer fees | Co-pay was eliminated for patients using VA Video Connect for a telehealth visit as a national initiative |
4* | Assess for readiness and identify barriers and facilitators | Site readiness assessed at all sites; barriers/facilitators identified; site implementation plans developed
5* | Audit and provide feedback | Created clinical performance and compliance data reports and delivered leadership briefings
6* | Build a coalition | Learning collaboratives established; Leadership meetings held; Collaboration with implementation stakeholders; Identified champions |
7* | Capture and share local knowledge | Evaluated implementation strategies, document results in implementation reports |
8* | Centralize technical assistance | Establishment of Virtual Health Resource Centers (VHRCs) providing one-stop support for virtual care technologies; establishment of toll-free number (1-844-813-4361) for centralized access
9 | Change accreditation or membership requirements | n/a |
10 | Change liability laws | n/a |
11 | Change physical structure and equipment | Acquired space in facility and equipment for staff for VHRC |
12 | Change record systems | Created and implemented EHR templates and consults for facilities |
13* | Change service sites | Change delivery of clinical services to be provided ‘anywhere to anywhere’ |
14* | Conduct cyclical small tests of change | Implementation pilots incorporating virtual tools and services into a specific clinic |
15* | Conduct educational meetings | Weekly education strategy and planning meetings; community of practice |
16* | Conduct educational outreach visits | Developed and distributed implementation toolkits, training, and promotional materials to sites
17* | Conduct local consensus discussions | Established interdisciplinary learning collaboratives; Established steering committee |
18* | Conduct local needs assessment | Site level needs assessments and gap analysis |
19* | Conduct ongoing training | Delivery of training and marketing events |
20* | Create a learning collaborative | Established learning collaboratives
21 | Create new clinical teams | Evaluated existing clinical workflows and guided improvements to increase efficiency and effectiveness in using health technologies; team-based approach |
22 | Create or change credentialing and/or licensure standards | VA expanded access to care by allowing healthcare providers to provide care through telehealth across state lines |
23* | Develop a formal implementation blueprint | Developed implementation plans based on recommendations in VA Facilitation and Implementation Guide (Ritchie et al., 2017) |
24* | Develop academic partnerships | Partnered with local universities to deliver marketing and training opportunities; Mentorship of interns |
25* | Develop an implementation glossary | Developed at national level, deployed at each site |
26* | Develop and implement tools for quality monitoring | Created implementation and quality monitoring tools; developed at national level, deployed at each site |
27* | Develop and organize quality monitoring systems | Developed at national level, deployed at each site; quarterly reports regarding progress toward goals |
28 | Develop disincentives | n/a |
29* | Develop educational materials | Targeted training to increase awareness and knowledge of health technologies |
30* | Develop resource sharing agreements | Centers of Excellence were established in partnership with facilities to increase access to resources related to virtual care implementation |
31* | Distribute educational materials | Targeted training to increase product knowledge (Annie, My VA Images, Patient Viewer, Mental Health Checkup, VA Video Connect, Self-Health VA Apps (PTSD Coach, Mindfulness Coach, etc.)); weekly group meetings; weekly individual and site meetings
32* | Facilitate relay of clinical data to providers | Established learning collaboratives; created leadership reports |
33* | Facilitation | Weekly national and site meetings |
34 | Fund and contract for the clinical innovation | All products implemented were developed and funded by VHA
35* | Identify and prepare champions | Identified and trained local clinical champions; trained internal facilitators on knowledge and skills in facilitation; Recruited members to learning collaboratives; Provided ongoing training and collaboration with these individuals; Workflow analysis |
36* | Identify early adopters | Identified early adopters at local sites and conducted field tests to understand their experience with the health technology |
37* | Increase demand | Provided regularly scheduled presentations to service leadership and staff; Marketing and education provided to Veterans and VHA staff |
38* | Inform local opinion leaders | Identified and collaborated with local opinion leaders; provided quarterly reports on progress toward goals |
39* | Intervene with patients/consumers to enhance uptake and adherence | Established ongoing information/education sessions; expanded virtual outreach options; developed ongoing training and marketing efforts
40* | Involve executive boards | Reports to site executive leadership; Provided leadership briefings |
41* | Involve patients/consumers and family members | Outreach to Veterans to recruit peer-trainers for connected devices pilot; Established Virtual Health Resource Centers providing services to Veterans and family members; Field testing with Veterans |
42 | Make billing easier | n/a |
43* | Make training dynamic | Developed training materials based on Adult Learning Theory with the aim of increasing learner engagement; evaluated training and marketing materials and delivery for satisfaction
44* | Mandate change | Communicated to stakeholders existing mandates and alignment of strategic goals: VHA strategic goals include meeting needs of Veterans, enhancing Veteran experience, modernizing systems, and improving patient experience; Performance goals related to health technology use |
45* | Model and simulate change | Facilitation training specific to integration of health technologies in clinical care |
46* | Obtain and use patients/consumers and family feedback | Established VHRCs to meet the need of VHA staff and Veterans in integration of health technologies providing individualized consultation services; data reports on results |
47* | Obtain formal commitments | Signed Memoranda of Understanding (MOU) at each site |
48* | Organize clinician implementation team meetings | Implementation project plans and reports that include results, lessons learned, improved clinical workflows, and road maps |
49 | Place innovation on fee for service lists/formularies | n/a |
50* | Prepare patients/consumers to be active participants | Established VHRCs to meet the need of VHA staff and Veterans in integration of health technologies providing individualized consultation services |
51* | Promote adaptability | Assessed needs for service lines regarding barriers for use of health technologies, and implementation plans, to include marketing and training, specific to the site and service line needs |
52* | Promote network weaving | Identified staff willing to collaborate to promote utilization of virtual health; Established national Connected Care Community of Practice engaging VHA staff |
53* | Provide clinical supervision | Leveraged train the trainer model to provide mentorship and guidance to champions |
54* | Provide local technical assistance | Developed VHRCs providing technical assistance to Veterans and VHA staff; Internal facilitators partnered with local personnel to increase reach |
55* | Provide ongoing consultation | Developed VHRCs providing consultation to Veterans and VHA staff |
56* | Purposely reexamine the implementation | Provided outcome monitoring (aka control plan) in each implementation plan; post training follow-up completed to encourage adoption of virtual tools and detect barriers to utilization |
57* | Recruit, designate, and train for leadership | Established process to educate and train local champions |
58* | Remind clinicians | Assessed and modified clinical workflows to increase efficiency and effectiveness |
59 | Revise professional roles | n/a |
60* | Shadow other experts | Identified subject matter experts on various health technologies and effective implementation strategies for team members to shadow and improve knowledge and skills |
61* | Stage implementation scale up | Completed eight implementation studies to assess implementation strategies for combinations of health technologies; Established VHRC Implementation Consult Service to provide guidance and resources to additional VHA sites wanting to build VHRC |
62 | Start a dissemination organization | n/a |
63* | Tailor strategies | Met with service lines and multiple disciplines to identify barriers to implementation; Incorporated feedback into implementation planning; Completed site specific assessment |
64* | Use advisory boards and workgroups | Connected Care Steering Committee consisting of all service lines established to discuss implementation strategies and expansion
65* | Use an implementation advisor | Implementation team completed formal training in facilitation and implementation; Implementation advisor review and guidance regarding implementation plans |
66 | Use capitated payments | n/a |
67* | Use data experts | Applied data analytics and informatics project planning and monitoring of implementation efforts; Worked with local group practice managers to develop local data reports based on electronic health records entered for virtual programs |
68* | Use data warehousing techniques | Utilized healthcare system data warehouse to create reports to track site progress toward goals |
69* | Use mass media | Collaborated with Public Affairs Office (national and at facilities) to promote virtual programs across platforms (e.g., GovDelivery email, social media, blog posts, videos, radio, etc.) |
70 | Use other payment schemes | n/a |
71* | Use train-the-trainer strategies | Staff education and training using a train-the-trainer model |
72 | Site visits | In-person site visits were completed for two sites. Due to COVID-19 pandemic, in-person site visits were put on hold |
73* | Work with educational institutions | Partnered with academic institutions and provided mentorship for three graduate student interns
Strategy # | Resources developed |
---|---|
1 | Performance work statement |
2 | Performance goals |
3 | VA policy |
4* | Clinical Coach Quick Guide and Implementation Plan for each site |
5* | Data reports and presentations of progress toward performance goals |
6* | Agendas, meeting notes, presentations |
7* | Implementation reports created; development of implementation team SharePoint site for knowledge management |
8* | Videos developed and placed on YouTube publicizing services and phone number
9 | n/a |
10 | n/a |
11 | Created VHRC Toolkit establishing minimum, standard, and optimum parameters for space, equipment, and materials needed |
12 | Creation of: scheduling clinic template, sample progress notes template to document use of tools, referral order for VHRC; VVC national blood pressure template; Digital divide consult; Clinical video telehealth consult |
13* | n/a |
14* | Created implementation project plans, implementation reports including results and lessons learned |
15* | Development of meeting agendas and notes, presentations |
16* | Development and delivery of education and training and training materials |
17* | Meeting agendas and notes; gathering of feedback and ideas for advancing the use of virtual tools and services
18* | Needs assessment, gap analysis, site implementation plan |
19* | Developed PowerPoints on training plans; preparatory email; follow-up emails; developed control plan to ensure VVC visits completed and barriers addressed post training
20* | Meeting agenda and notes; presentations through Connected Care Community of Practice |
21 | Meeting agenda and notes; presentations; new clinical workflows |
22 | n/a |
23* | Developed national implementation strategic plan and implementation plan for each site based on site specific needs
24* | Office of Connected Care Internship Orientation Guide |
25* | Site implementation guide; VA Mobile Health Practice Guide (Armstrong et al., 2021) |
26* | Performance goals; tracking spreadsheets |
27* | Developed tracking systems for activities and Power BI dashboards to provide access to results across sites |
28 | n/a |
29* | VA Virtual Care Toolkit: Clinicians Guide and Prescription Pad; VA Mobile Health Practice Guide 1st Edition; VA Virtual Care Best Practices training series (8 one-hour trainings) |
30* | Central SharePoint site developed to provide access to resources across sites |
31* | Training and marketing materials (flyers, training slides, wallet cards, etc.) |
32* | Meeting agenda and notes; reports; presentations |
33* | Meeting agendas and notes; presentations |
34 | n/a |
35* | Developed purpose statements, meeting notes and agendas, presentations |
36* | Results of field tests with early adopters |
37* | Training and marketing materials |
38* | Data and progress reports; presentations |
39* | Training and marketing materials |
40* | Data and progress reports; presentations |
41* | Marketing and training materials for Veterans and family members; Field testing reports; newsletters; blogs; social media posts |
42 | n/a |
43* | Reports including feedback from learners and level of satisfaction |
44* | Training and marketing materials; establishment of performance measures |
45* | Presentations; scripts; checklists; scenarios to role play |
46* | VHRC customer service feedback; training feedback |
47* | n/a |
48* | Implementation plans and reports |
49 | n/a |
50* | Marketing and promotional items to increase effectiveness of sharing nationally developed market materials (backpacks, folders, etc.); |
51* | Presentations; flyers; videos; clinical workflows |
52* | Ongoing collaboration through trainings and meetings |
53* | Implementation toolkit; training and marketing materials specific to clinical disciplines; train the trainer materials |
54* | System to track all encounters related to technical assistance |
55* | System to track all encounters related to consultation |
56* | Implementation project plans and reports; tracking spreadsheets |
57* | Training and marketing materials; tracking system for champions |
58* | Clinical workflows |
59 | n/a |
60* | Training and marketing materials |
61* | Implementation project reports including results and lessons learned; Development of VHRC Implementation Road Map
62 | n/a |
63* | Implementation project reports including results and lessons learned |
64* | Meeting agendas and notes; presentations |
65* | Updated implementation strategy and plans |
66 | n/a |
67* | Metrics and analytics reports |
68* | Metrics and analytics reports |
69* | Marketing materials (videos, blogs, social media posts, email, text, etc.) |
70 | n/a |
71* | Meeting agendas and notes, presentation, implementation toolkits |
72 | Meeting agendas and notes; presentations; updated implementation plans based on lessons learned |
73* | Office of Connected Care Internship Orientation Guide |
*Strategies that were used at the national program level and at all sites. (Compilation of implementation strategies from the Expert Recommendations for Implementing Change (ERIC); Powell et al., 2015)
Eight of the seventy-three strategies tracked were not used by any site. These included strategies: 9) change accreditation or membership requirements, 10) change liability laws, 28) develop disincentives, 42) make billing easier, 49) place innovation on fee for service lists/formularies, 62) start a dissemination organization, 66) use capitated payments, and 70) use other payment schemes. These were identified in the pre-implementation phase as strategies that were either irrelevant in the context of the VHA (e.g., those involving changes in billing or payment structure), unnecessary to create because an equivalent resource already exists in the VHA (e.g., start a dissemination organization), or not identified as a priority during the initial year of implementation (e.g., work with educational institutions), but that may be integrated as the program matures in subsequent implementation years. Training, marketing, and consultations (e.g., encounters delivered through the VHRCs) were delivered as specific implementation strategies and involved internal and external facilitators. Activities completed and resources developed for each strategy used are included in Table 1.
Eleven internal facilitators across sites responded to a preliminary site capacity assessment and met with the external facilitator to build individual and site capacity building implementation plans. A total of eight internal facilitators participated in the post-assessment, which was administered approximately three months after the preliminary assessment (M = 86.4 days, SD = 7.8). A total of seven internal facilitators responded to the assessment delivered six months (M = 187 days, SD = 12.7) after the post-assessment. Results are included in Table 2 and show that, across all sites, there was an increase over time in knowledge base of the target audience, knowledge base of virtual care core competencies, quality of facility climate and support regarding virtual care implementation, readiness to use virtual care tools, and product knowledge.
When comparisons are made across sites, Sites A and B had lower baseline knowledge of the target audience (based on the preliminary assessment) and a lower level of baseline readiness to use virtual care compared to Sites C and D. Internal facilitators’ motivation and interest in virtual care were high across sites, but level of knowledge in the following areas was inconsistent across sites: knowledge of the VHA healthcare system and Veteran healthcare needs, knowledge of core competencies in the integration of virtual care, knowledge of individual virtual care products, and knowledge of implementation science and facilitation processes. However, results show that knowledge in these areas increased across all sites at the 3- and 6-month intervals. Additionally, results of the preliminary assessment indicated variation across sites in the quality of organizational climate and support, the perception of the potential impact of virtual care at the site, and barriers, limitations, and need for support.
Of the knowledge areas assessed, specific product knowledge was the weakest area across sites at the preliminary assessment (20.7–22.5% of respondents across sites had either used the product, prescribed or recommended the product in clinical care, or trained others on how to use the product; Table 2). Site capacity building efforts aimed to increase product knowledge across a suite of fifty-six virtual care products commonly used in the VHA. Results showed that training efforts served to nearly double product knowledge by the 6-month assessment (40.3–43.8% across sites).
During the implementation phase, a total of 442 education/training events were delivered to a total of 21,653 attendees. This included 17,203 attendees across 307 webinars, 1598 attendees across 82 presentations, 1241 attendees across 10 communities of practice, 1143 attendees across 16 train-the-trainer events, and 468 attendees across 27 office hours. During the same time, a total of 51 marketing/promotional events were delivered to a total of 3538 attendees, including 1498 attendees across 8 resource fairs and 2040 attendees across 43 outreach events. Training effectiveness and satisfaction were measured for several training efforts. While marketing and promotional events (e.g., outreach events, resource fairs, conference booths) were not evaluated beyond the number of events and attendees, education and training events were evaluated. A Connected Care Community of Practice (CoP) series (one-hour webinars aimed at increasing knowledge and awareness of virtual care technologies) training VHA staff on the use of virtual care tools was delivered across ten events to a total of 793 attendees. Results showed that 98.6% of CoP attendees reported that they were satisfied or very satisfied with the training, 98.9% agreed or strongly agreed that they would “use the information and skills learned in my work,” and 97.9% that they would “share information learned with others.” A series of fourteen one-hour live webinar training sessions, “VA Connected Care for Telehealth Teams,” was delivered to a total of 1475 telehealth leads and facilitators across the VHA. The majority (97.2%) of attendees agreed or strongly agreed that they “learned new knowledge and/or skills from this learning activity,” 98.1% agreed or strongly agreed that the “content of this training was relevant to the needs of my healthcare team,” and 98.6% agreed or strongly agreed that they would “be able to apply the knowledge and/or skills learned from this activity to improve my performance as a member of my healthcare team.” A series of fourteen training sessions aimed at increasing knowledge and skills on the use of the VHA’s automated text-message platform (Annie) was developed and delivered to 2768 clinicians across twenty training events. Results showed an increase in use of Annie from 6% before training to 63% 1 month after training; increased knowledge about Annie (17–22% before training, 41–56% after training, and 63–89% 1 month after training); and increased satisfaction with Annie (an average of 40% satisfied with Annie before training, 75% after training, and an average of 83% 1 month after training).
In addition to training and marketing efforts, a total of 9182 encounters (e.g., individual consultations with Veterans, family/caregivers, or VHA staff on the use of virtual care) were provided through the VHRCs. The breakdown of service recipients included 92.8% Veterans, 7.1% VHA staff, and 0.1% family members or caregivers of Veterans. Encounter methods included 47.5% via telephone, 13.8% via in-person meeting, 37.2% via live video, and 1.5% listed as ‘other’. Services provided through the VHRCs included support for a single product or service or a combination of products and services. The encounter analysis showed 48.9% of encounters in support of My HealtheVet, 26.2% in support of telehealth, and 24.9% in support of mobile health apps or web-based tools or in support of devices (smartphones, tablets, activity trackers, glucometers, pulse oximeters, etc.).
Changes in virtual care utilization across the first year-and-a-half of the implementation phase are detailed in Table 3. National and site utilization metrics were compared to determine changes over the implementation period. Results were inconclusive: none of Site A’s increases met or exceeded the national level, all health technology utilization metrics exceeded the national level at Site B and some did at Site C, and Site D exceeded the national level of increase in only one area. A binomial regression, which allowed for adjustment of relevant covariates (e.g., differences in number of enrollees, level of facility complexity), showed that the comparison of health technology utilization metrics at sites with a VHRC to the national level did not provide statistically significant evidence of a meaningful difference (95% CI, p = 0.23). Operational metrics, which are typically relatively stable over time, decreased during the implementation phase, with the number of enrollees decreasing by 10.2% across all study sites, outpatient visits decreasing by 44.8%, inpatient admissions decreasing by 56.8%, and a slight increase in the percentage of patients living in rural or highly rural locations.
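The paper does not report the exact model specification. The sketch below shows one plausible way such a binomial regression could be set up in Python with statsmodels, using simulated counts and hypothetical variable names (vhrc_site, phase, complexity), with the VHRC-by-phase interaction as the term of interest; it is illustrative only, not the study’s analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated, illustrative data only: one row per facility per phase.
# Variable names and values are hypothetical, not the study's dataset.
rng = np.random.default_rng(0)
facilities = [
    ("Site A", 1, "3"), ("Site B", 1, "1A"), ("Site C", 1, "1B"), ("Site D", 1, "1A"),
    ("Comp 1", 0, "3"), ("Comp 2", 0, "1A"), ("Comp 3", 0, "1B"), ("Comp 4", 0, "1A"),
]
rows = []
for name, vhrc, complexity in facilities:
    enrollees = int(rng.integers(30_000, 100_000))
    for phase in (0, 1):  # 0 = pre-implementation, 1 = implementation
        rate = 0.05 * (3.0 if phase else 1.0) * (1.1 if (vhrc and phase) else 1.0)
        rows.append({"facility": name, "vhrc_site": vhrc, "phase": phase,
                     "complexity": complexity,
                     "users": int(enrollees * rate), "enrollees": enrollees})
df = pd.DataFrame(rows)
df["nonusers"] = df["enrollees"] - df["users"]

# Design matrix: VHRC indicator, phase, facility complexity (adjustment covariate),
# and the VHRC-by-phase interaction, which carries the effect of interest.
exog = pd.get_dummies(df[["vhrc_site", "phase", "complexity"]],
                      columns=["complexity"], drop_first=True).astype(float)
exog["vhrc_x_phase"] = df["vhrc_site"] * df["phase"]
exog = sm.add_constant(exog)

# Binomial GLM on (users, non-users): models the proportion of enrollees using the
# technology, implicitly weighting each facility by its number of enrollees.
endog = np.asarray(df[["users", "nonusers"]])
result = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(result.summary())
```

The authors’ actual comparison units, covariates, and model form may differ; this sketch only illustrates how enrollee counts and facility complexity can enter a binomial model as adjustments.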
Discussion
The current study described the year-and-a-half pre-implementation phase and the first year-and-a-half of the implementation phase of a study aimed at increasing virtual care adoption at four sites across a large healthcare system by implementing VHRCs. Discrete implementation strategies used across sites were tracked, key areas of knowledge and readiness were assessed, and site-specific implementation plans were developed and delivered. Results demonstrated that facilitator knowledge and readiness across key domains increased, the quality of climate and support at study sites improved, perceived barriers to the integration of health technology across sites decreased, and discrete implementation strategies can be applied with similar adherence rates across varied sites. This strategy served as a basic guide, helped to streamline action at individual sites during system-wide implementation, and increased the efficiency with which facilitators expanded the reach of resources delivered at sites. Thus, systematically targeting key implementation strategies with the aim of increasing site capacity was an effective process. However, the impact of the COVID-19 pandemic on the launch of the in-person VHRCs may have affected this analysis, and other factors may also have been at play. While the need to pivot to virtual care modalities during COVID-19 drastically increased the need for virtual care (Connolly et al., 2021; Ferguson et al., 2021), it required a shift away from in-person delivery of services to virtual platforms, preventing many in-person interactions (Veterans Health Administration, 2021; U.S. Department of Veterans Affairs, 2021).
While increased engagement with virtual care raised utilization metrics as well as healthcare staff and patient familiarity with these tools, the pace of this increase in use could never have been anticipated. Additionally, due to social distancing protocols and stay-at-home orders (Centers for Disease Control, 2020), the vision of a VHRC as an ‘on-site, in-person’ resource located within the hospital was not possible. Operational metrics nationally, and across all sites, showed decreases in the number of enrollees, outpatient visits, and inpatient admissions. With increased demand for virtual care, our team shifted quickly to provide training on virtual care tools to patients and healthcare staff. This, at least anecdotally, supported the on-site telehealth and My HealtheVet teams in better meeting training needs. While health technology utilization increased across sites, few site-level increases outpaced the national increases. Although efforts were made to demonstrate that implementation strategies shown to be effective at a smaller scale (single technology, limited target audience) can be generalized to better meet the needs of staff and patients, given the small sample size and the lack of control sites beyond the national comparison, we were unable to draw this conclusion at this time. Because the current study describes only the initial year-and-a-half of the implementation phase, it may be that, over time, differences can be detected across sites to better determine which specific strategies increase virtual care adoption.
Further, the necessity of virtual care during the COVID-19 pandemic likely drove healthcare staff and patients to adopt virtual health care tools at a higher rate than pre-pandemic. The pandemic created a substantial need for virtual care delivery, which drove adoption of many of these tools and, in turn, made it challenging to evaluate the impact of efforts to increase adoption of virtual care during this time. Another impact of COVID-19 was the surge in need for acute care, which drained resources from organizations trying to implement new practices and technologies. In addition to the impact of COVID-19 on the ability to measure the effect of establishing VHRCs, it is also important to note that virtual care across the VHA involves many teams engaged in training and marketing activities. While increases in utilization at the study sites, and comparison with sites not included in the study, provide an indication of the impact of the implementation efforts described, increases in utilization cannot be credited to these efforts alone.
Future efforts include continuing to build capacity at established VHRCs and developing a VHRC implementation consult service to guide other facilities interested in building a VHRC. While more research is needed on the value of site-level, regional, or nationwide VHRCs, several additional VHA sites have expressed interest in developing a VHRC, demonstrating broader interest in replicating this model. Future efforts will also aim to improve evaluation and reporting processes to better assess the impact of VHRCs. For example, a lesson learned during the implementation phase was the need to collect customer feedback regarding satisfaction with services provided through the VHRC. As a result, feedback items were developed and a collection process was implemented in January 2021, providing an additional source of feedback to support the continuous improvement process. Additionally, centrally located online platforms for entering data on VHRC activities across sites have been established, as well as data dashboards allowing VHRC sites to view reports showing progress toward established target goals.
Having an impact at the organizational or program level requires investment in training that influences the culture to realize the benefits of these health technologies, and an overarching plan of action that aligns technology and utilization strategies across the organization or program. While implementation studies aimed at evaluating the integration of a single innovation into a single clinic or hospital make it easier to evaluate impact, their results are difficult to generalize to other innovations. In addition, single-site, single-product implementation does not reflect the reality of health technology integration across a healthcare system, which is varied, complex, and needs to allow flexibility for staff and patients to choose the health technology tool or tools that best promote patient-centric care.
Acknowledgements
The authors thank the VA’s Office of Connected Care’s graduate student interns, Rosemarie Sapigao, Shelby Smout, and Anne Legoute for their contributions to this work. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States government.
Author Contribution
CA, NW, and JM conceived and designed the study. CA, BD, AJ, BB, KZ, NF, GB, ML, and WC were responsible for the survey data collection. CA, NW, BD, WC, and JM contributed to the analysis and interpretation of the survey data. ML, BV, BD, and JH provided information such as history and usage statistics. All authors revised the manuscript critically for important intellectual content. CA had principal responsibility for drafting the manuscript. All authors approved the final version.
Funding
This work was supported by the Department of Veterans Affairs, Veterans Health Administration (VHA), Office of Health Informatics (10A7), and Contract GS35F0251V, awarded to Iron Bow Technologies, LLC.
Declarations
Conflict of Interest
The authors declare no competing interests. The authors alone are responsible for the content and writing of the paper.
References
- Armstrong, C. M. (2019). Mobile health provider training: Results and lessons learned from year four of training on core competencies for mobile health in clinical care. Journal of Technology in Behavioral Science. https://doi.org/10.1007/s41347-019-00089-8. https://rdcu.be/btchQ
- Armstrong, C. M., Ciulla, R. P., Edwards-Stewart, A., Hoyt, T., & Bush, N. (2018). Best practices of mobile health in clinical care: The development and evaluation of a competency-based provider training program. Professional Psychology: Research and Practice, 49(5–6), 355–363. https://doi.org/10.1037/pro0000194
- Armstrong, C. M., Ciulla, R. P., Williams, S. A., & Micheel, L. (2020). An applied test of knowledge translation methods using a mobile health solution. Military Medicine, 185, 526–535. https://doi.org/10.1093/milmed/usz196
- Armstrong, C. M., McGee-Vincent, P., Juhasz, K., Owen, J., Avery, T., Jaworski, B., Jamison, A., Cone, W., Gould, C., Ramsey, K., Mackintosh, M. A., & Hilty, D. M. (2021). VA Mobile Health Practice Guide (1st ed.). U.S. Department of Veterans Affairs, Washington, DC. https://connectedcare.va.gov/sites/default/files/va-mobile-health-practice-guide.pdf
- Barwick, M. (2011). Checklist to Assess Organizational Readiness (CARI) for EIP implementation. Hospital for Sick Children, Toronto, ON.
- Brown, W., Giguere, R., Ibitoye, M., Carballo-Diéguez, A., & Cranston, R. D. (2014). Successfully addressing challenges to implementing a multinational SMS-based reminder and data collection system in a biomedical HIV prevention trial. AIDS Research and Human Retroviruses, 30(S1), A87. https://doi.org/10.1089/aid.2014.5159.abstract
- Centers for Disease Control. (2020). Social distancing: Limiting close face-to-face contact with others. Accessed June 15, 2021: https://www.cdc.gov/coronavirus/2019-ncov/prevent-getting-sick/social-distancing.html
- Connolly, S. L., Stolzmann, K. L., Heyworth, L., Weaver, K. R., Bauer, M. S., & Miller, C. J. (2021). Rapid increase in telemental health within the Department of Veterans Affairs during the COVID-19 pandemic. Telemedicine Journal and E-Health, 27(4), 454–458. https://doi.org/10.1089/tmj.2020.0233
- Ferguson, J. M., Jacobs, J., Yefimova, M., Greene, L., Heyworth, L., & Zulman, D. M. (2021). Virtual care expansion in the Veterans Health Administration during the COVID-19 pandemic: Clinical services and patient characteristics associated with utilization. Journal of the American Medical Informatics Association, 28(3), 453–462. https://doi.org/10.1093/jamia/ocaa284
- Gould, C. E., Kok, B. C., Ma, V. K., Zapata, A., Owen, J. E., & Kuhn, E. (2019). Veterans Affairs and the Department of Defense mental health apps: A systematic literature review. Psychological Services, 16(2), 196–207. https://doi.org/10.1037/ser0000289
- Harvey, G., & Kitson, A. (2016). PARIHS revisited: From heuristic to integrated framework for the successful implementation of knowledge into practice. Implementation Science, 11(1), 33. https://doi.org/10.1186/s13012-016-0398-2
- Helfrich, C. D., Li, Y.-F., Sharp, N. D., & Sales, A. E. (2009). Organizational readiness to change assessment (ORCA): Development of an instrument based on the Promoting Action on Research in Health Services (PARIHS) framework. Implementation Science, 4, 38. https://doi.org/10.1186/1748-5908-4-38
- Hermes, E. D., Lyon, A. R., Schueller, S. M., & Glass, J. E. (2019). Measuring the implementation of behavioral intervention technologies: Recharacterization of established outcomes. Journal of Medical Internet Research, 21(1), e11752. https://doi.org/10.2196/11752
- Hilty, D. M., Chan, S., Torous, J., Luo, J., & Boland, R. J. (2019). Mobile health, smartphone/device, and apps for psychiatry and medicine: Competencies, training, and faculty development issues. Psychiatric Clinics of North America, 42, 513–534. https://doi.org/10.1016/j.psc.2019.05.007
- Hilty, D. M., Sunderji, N., Suo, S., Chan, S., & McCarron, R. M. (2018). Telepsychiatry and other technologies for integrated care: Evidence base, best practice models and competencies. International Review of Psychiatry, 30(6), 292–309. https://doi.org/10.1080/09540261.2019.1571483
- Keyworth, C., Hart, J., Armitage, C. J., & Tully, M. P. (2018). What maximizes the effectiveness and implementation of technology-based interventions to support healthcare professional practice? A systematic literature review. BMC Medical Informatics and Decision Making, 18(1), 93. https://doi.org/10.1186/s12911-018-0661-3
- Kirchner, J. E., Smith, J. L., Powell, B. J., Waltz, T. J., & Proctor, E. K. (2020). Getting a clinical innovation into practice: An introduction to implementation strategies. Psychiatry Research, 283. https://doi.org/10.1016/j.psychres.2019.06.042
- Landes, S. J., McBain, S. A., & Curran, G. M. (2020). An introduction to effectiveness-implementation hybrid designs. Psychiatry Research, 283, 112630. https://doi.org/10.1016/j.psychres.2019.112630
- Lindsay, J. A., Kauth, M. R., Hudson, S., Martin, L. A., Ramsey, D. J., Daily, L., & Radar, J. (2015). Implementation of video telehealth to improve access to evidence-based psychotherapy for posttraumatic stress disorder. Telemedicine Journal and E-Health, 21(6), 467–472. https://doi.org/10.1089/tmj.2014.0014
- Maheu, M., Drude, K., Hertlein, K., Lipschutz, R., Wall, K., & Hilty, D. M. (2018). An interdisciplinary framework for telebehavioral health competencies. Journal of Technology in Behavioral Science, 2, 190–210. https://doi.org/10.1007/s41347-017-0038-y
- Morris, Z., Wooding, S., & Grant, J. (2011). The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine, 104(12), 510–520. https://doi.org/10.1258/jrsm.2011.110180
- Muir, S. D., Boer, K., Nedelijkovic, M., & Meyer, D. (2020). Barriers and facilitators of videoconferencing psychotherapy implementation in veteran mental health care environments: A systematic review. BMC Health Services Research, 20, 999. https://doi.org/10.1186/s12913-020-05858-3
- Nilsen, P. (2015). Making sense of implementation theories, models and frameworks. Implementation Science, 10(53), 1–13. https://doi.org/10.1186/s13012-015-0242-0
- Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., Proctor, E. K., & Kirchner, J. E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(1), 21. https://doi.org/10.1186/s13012-015-0209-1
- Prochaska, J. O., Norcross, J. C., & DiClemente, C. C. (1994). Changing for good: A revolutionary six-stage program for overcoming bad habits and moving your life positively forward. HarperCollins.
- Quanbeck, A., Gustafson, D. H., Marsch, L. A., Chih, M., Kornfield, R., McTavish, F., Johnson, R., Brown, R. T., Mares, M., & Shah, D. V. (2018). Implementing a mobile health system to integrate the treatment of addiction into primary care: A hybrid implementation-effectiveness study. Journal of Medical Internet Research, 20(1), e37. https://doi.org/10.2196/8928
- Rathbone, A. L., & Prescott, J. (2017). The use of mobile apps and SMS messaging as physical and mental health interventions: Systematic review. Journal of Medical Internet Research, 19, e295.
- Ritchie, M. J., Dollar, K. M., Miller, C. J., Smith, J. L., Oliver, K. A., Kim, B., Connolly, S. L., Woodward, E., Ochoa-Olmos, T., Day, S., Lindsay, J. A., & Kirchner, J. E. (2020). Using implementation facilitation to improve healthcare (version 3). Veterans Health Administration, Behavioral Health Quality Enhancement Research Initiative (QUERI). https://www.queri.research.va.gov/tools/implementation/faclitation-manual.pdf
- Torous, J., Wisniewski, H., Liu, G., & Keshavan, M. (2018). Mental health mobile phone app usage, concerns, and benefits among psychiatric outpatients: Comparative survey study. JMIR Mental Health, 5(4), e11715.
- U.S. Department of Veterans Affairs. (2021). COVID-19 pandemic response: Weekly report. https://www.va.gov/health/docs/VA_COVID_Response.pdf
- Veterans Health Administration. (2021, March 11). Review of Veterans Health Administration’s COVID-19 response and continued pandemic readiness. Report #20–02717–85.
- Yakovchenko, V., Hogan, T. P., Houston, T. K., Richardson, L., Lipschitz, J., Petrakis, B. A., Gillespie, C., & McInnes, D. K. (2019). Automated text messaging with patients in Department of Veterans Affairs specialty clinics: Hybrid type 2 effectiveness implementation study. Journal of Medical Internet Research, 21(7), e14750. http://www.jmir.org/2019/7/e14750
- Zhou, L., Bao, J., Watzlaf, V., & Parmanto, B. (2019). Barriers to and facilitators of the use of mobile health apps from a security perspective: Mixed-methods study. JMIR mHealth and uHealth, 7(4), e11223. https://doi.org/10.2196/11223