The Permanente Journal. 2023 Aug 7;27(3):79–91. doi: 10.7812/TPP/23.008

Supporting Veteran’s Administration Medical Center Directors’ Decisions When Adopting Innovative Practices: Development and Implementation of the “QuickView” and “WishList” Tools

Sarah L Cutrona 1,2,, Lindsay White 1, Danielle Miano 1, Laura J Damschroder 3, Timothy P Hogan 1,4, Allen L Gifford 1,5, Brandolyn White 6, Heather A King 6,7, Marilla A Opra Widerquist 3, Elizabeth Orvek 1,2, Kathryn DeLaughter 1, Andrea L Nevedal 8, Caitlin M Reardon 3, Blake Henderson 9, Ryan Vega 10, George L Jackson 6,7
PMCID: PMC10502382  PMID: 37545198

Abstract

Background

Since 2015, the Veterans Health Administration (VHA) Diffusion of Excellence Program has supported spread of practices developed by frontline employees. Shark Tank–style competitions encourage “Sharks” nationwide (VHA medical center/regional directors) to bid for the opportunity to implement practices at their institutions.

Methods

The authors evaluated bidding strategies (2016–2020), developing the “QuickView” practice comparator to promote informed bidding. Program leaders distributed QuickView and revised versions in subsequent competitions. Our team utilized in-person observation, online chats after the competition, bidder interviews, and bid analysis to evaluate QuickView use. Bids were ranked based on demonstrated understanding of resources required for practice implementation.

Results

Sharks stated that QuickView supported preparation before the competition and suggested improvements. Our revised tool reported necessary staff time and incorporated a “WishList” from practice finalists detailing minimum requirements for successful implementation. Bids from later years reflected increased review of facilities’ current states before the competition and increased understanding of the resources needed for implementation. Percentage of bids describing local need for the practice rose from 2016 to 2020: 4.7% (6/127); 62.1% (54/87); 78.3% (36/46); 80.6% (29/36); 89.7% (26/29). Percentage of bids committing specific resources rose following QuickView introduction: 81.1% (103/127) in 2016, 69.0% (60/87) in 2017, then 73.9% (34/46) in 2018, 88.9% (32/36) in 2019, and 89.7% (26/29) in 2020.

Discussion

In the years following QuickView/WishList implementation, bids reflected increased assessment before the competition of both local needs and available resources.

Conclusion

Selection of a new practice for implementation requires an understanding of local need, necessary resources, and fit. QuickView and WishList appear to support these determinations.

Introduction

The Veterans Health Administration (VHA) Diffusion of Excellence (DOE) program, within the VHA Innovation Ecosystem, 1 supports adoption, implementation, and spread of innovative practices developed by frontline VHA employees. The program uses a novel approach to generate interest in practice uptake among medical center directors, hosting live national Shark Tank–style competitions annually. 1–5 “Sharks” (VA medical center directors and regional directors from across the US) bid against each other; winners receive 6 months of facilitation to implement the practice at their home institutions. Facilitation includes support from the practice originator and from the DOE program. Since 2015, there have been 3278 practices submitted for bidding at Shark Tank–style events and 87 practices chosen for implementation within sites nationwide. Practices are diverse and wide-ranging in scope, seeking to improve veteran care through clinical, administrative, health informatics/telehealth, and veteran experience-focused initiatives. 6–10 The program’s goals align with VHA’s commitment to innovation and to the principles of a learning health care system. The VHA is not the only health care system to use innovation contests to engage frontline employees, but the national scope of the DOE program presents a unique opportunity to understand the impact of this strategy across diverse sites and multiple years. 11–13

The decision to adopt a new practice in the health care setting is influenced by a range of factors. Medical center/health system directors and other organizational leaders must respond to sociopolitical and external factors including changes in policy, environmental influences, shifting needs of the population served, and top-down mandates to maintain or improve performance on selected quality metrics. 14 At the organizational or local level, leaders seeking to adopt new practices must account for the unique needs, capabilities, and available resources at the sites where the practice will be implemented. 15–17 When selecting between available options, leaders must also understand the salient characteristics of an innovative practice, including complexity of implementation, necessary resources for implementation, and performance in comparable settings. 15,16,18–22

Question of Interest

In the early years of the Diffusion of Excellence Program, the authors noted that Sharks sometimes had difficulty identifying and selecting innovations that fit their site’s unique needs. In addition, these VA medical center leaders were not always aware of the resources needed to support implementation success for a chosen practice. In this paper, the authors describe their process for developing, implementing, and revising the QuickView and WishList tools to promote informed bidding, including understanding of the resources needed for implementation. The authors also describe their evaluation of Shark/director bids (2016–2020) to understand whether and how bids evolved over time.

Methods

Per regulations outlined in VHA Program Guide 1200.21, this work has been designated a nonresearch quality-improvement activity. This work was supported by the VA Quality Enhancement Research Initiative (QUERI). This paper is the result of an embedded, long-term, partnered evaluation of the DOE program. By design, embedded work considers the perspectives of both the independent evaluators and operational leaders of the program.

Setting

With over 431,680 employees providing services to 9.26 million veterans/patients enrolled in VA health care across 1306 sites of care in 2022, the VHA is the largest fully integrated health care delivery system in the United States. 23

Overview of Shark Tank–style Bidding Process

VHA Shark Tank–style competitions are held annually. 3 For each competition, applications describing key aspects of candidate practices are submitted by employees and reviewed by program staff and subject-matter experts from across VHA (frontline clinical and administrative staff with relevant expertise). 2 The applicant pool is narrowed to 100 semifinalists and then 15–20 finalist practices are selected for inclusion in the Shark Tank–style competition. Review criteria include demonstrated value, replication feasibility, and alignment with VHA priorities. In advance of the live bidding date, Sharks are provided with a list and information on the 15–20 finalist practices selected for the competition. On the day of the Shark Tank–style competition, finalist practice owners give brief presentations pitching their projects. Sharks engage (either in person or via virtual chat functions) to place bids (commitment of resources and institutional support), which are reviewed by program leadership after the event; sites with winning bids receive 6 months of facilitation to implement the practice at their home institutions.

QuickView Tool—Development Goals and Theoretical Framework

Our goal in developing the QuickView tool was to create a resource that summarized key data about the 15–20 finalist projects to support Shark bid preparation prior to Shark Tank–style events and to facilitate structured comparisons of projects during competitions. Data were summarized using words, color coding, and easily understood icons. Our selection of data elements for inclusion in the QuickView was guided by literature on consumer decision support and visual representation of data 24–26 and informed by cognitive fit theory, 27 which addresses the fit between the type of problem to be solved and the information displayed and links this fit to the efficiency of problem-solving performance. As described in the following, the revised QuickView tool included an additional element, the WishList, which used the practice leads’ own words to provide granular details on 1) minimum requirements for successful implementation and 2) additional resources that would strengthen the likelihood of success. (See Appendices A and B for images of QuickView and WishList tools.)

Process Overview

Our process can be understood in 3 phases: 1) preimplementation of QuickView (needs assessment and tool development activities); 2) implementation of QuickView (with postimplementation evaluation and tool revision); 3) implementation of revised QuickView with WishList (with postimplementation evaluation). Following phase 3, tool management was transitioned to DOE leadership for sustained impact; QuickView and WishList distribution was managed by the DOE team, and subsequently elements were incorporated into the program’s website. Table 1 provides an overview of the evaluation team activities associated with these phases, as well as dates and corresponding Shark Tank–style events.

Table 1:

QuickView and WishList bidding support tools: development, implementation and evaluation

Phase Evaluation team activity
Phase 1.
Preimplementation of QuickView
Years: 2016–2018
Shark Tank–style competitions: second, third, and lead-up to fourth
Needs assessment Observation of medical center directors during preparatory phone calls before the competition
Direct observation of live Shark Tank–style bidding (in-person and virtual)
Post-Tank feedback from Sharks solicited via online chat function
Review of submitted Shark Tank–style bids
Tool development Review of data available in previous Shark Tank–style applications
Review of data available in 2018 applications; development of QuickView tool
Phase 2.
Implementation of QuickView
Years: 2018
Shark Tank–style competition: fourth, with follow-up period
QuickView tool implemented Team provided electronic versions of QuickView in advance of the Shark Tank–style competition; hard copies were available to those in person
Postimplementation evaluation Direct observation of live Shark Tank–style bidding (in-person and virtual)
Post-competition feedback from Sharks solicited via online chat function
Medical center director interviews
Review of submitted bids post-QuickView implementation
Tool revision Review of and alignment with Diffusion Marketplace (DOE program’s online platform)
Follow-up surveys sent to finalist project leads
QuickView revised and WishList developed
Phase 3.
Implementation of revised QuickView with WishList
Years: 2019–2020
Shark Tank–style competitions: fifth and sixtha
Revised tool implemented Team provided electronic versions of QuickView in advance of the Shark Tank–style competition; hard copies were available to those in-person
Postimplementation evaluation (with comprehensive bid review) Observation of live Shark Tank–style bidding (in person and virtual)
Observation of bid-selection process by DOE leadership
Review of submitted bids postimplementation of revised tools
Comprehensive review of submitted bids (2016–2020)

Transition: Provided DOE leadership with detailed documentation of approach to QuickView and WishList creation (Appendix C)

Sustainment: Creation and distribution of QuickView and WishList adopted by DOE leadership (no longer evaluation-team led); also informed design for the program’s Diffusion Marketplace, an online platform facilitating practice uptake and spread nationwide.

a

In 2020, prior to the sixth Shark Tank–style competition, the evaluation team transitioned tool management to the DOE program.

DOE, Diffusion of Excellence.

Phase 1: Preimplementation of QuickView

Needs Assessment

Our preparation for tool development began with an assessment of Shark and program team needs; our goal in doing this assessment was to improve bid quality. Team members joined preparatory calls in the months leading up to both the second and third Shark Tank–style competitions and attended these 2 live Shark Tank–style competitions both in person and virtually, seeking to understand the experience of participants. Templated notes were taken at these live events, then reviewed and summarized to distill key points. Using a criterion sampling method 28 (seeking to include Sharks participating in the Shark Tank–style competition), team members facilitated an online chat with Shark Tank–style participants (administered over a Skype platform immediately after the third Shark Tank–style event; n = 17).

Key findings from our needs assessment are summarized in Table 2; based on our assessment, the team suggested that a summary tool could provide useful support and that it should be distributed on preparatory calls in the weeks leading up to a Shark Tank–style competition (when Sharks are introduced to the candidate practices as well as to the expectations for the bidding process).

Table 2:

Summary of findings from needs assessment

Shark strategies for pre-competition preparation
Joined informational calls
Reviewed project pitch videos
Communicated with subject-matter experts
Reviewed practice descriptions with local leadership team
Sharks’ needs for future Shark Tank–style competitions
They wanted to understand better:
How the practice aligns with VISN (regional) priorities
Prior practice success
Resources required to implement the practice
How the practice might solve an existing local problem
Whether the practice is already being done at bidder’s own facility
How bids are evaluated to determine winners
Shark suggestions (would have been helpful to have before/during the Shark Tank–style competition)
Summaries of practices
Tools to facilitate earlier and more thorough practice review
Access to project videos earlier
Access to more detailed practice information

VISN, Veterans Integrated Service Network (a regional designation within the Veterans Administration).

Our team solicited input from program leadership on the purpose of this tool and agreed upon the goal of providing information without introducing bias: our tool should facilitate comparison without recommending one practice over another.

Tool Development

Six hundred twenty-two applications were submitted for the 2018 Shark Tank–style competition; 100 semifinalists were chosen several months before the competition and 20 finalists were invited to participate just weeks before the QuickView tool needed to be ready for distribution (19 finalists participated in the competition).

Candidate applications were examined to identify available data; this assessment was then refined once the list of finalists was released. An initial review of all submitted applications for the third (2017) and fourth (2018) Shark Tank–style competitions indicated that the quality of information candidates provided about their respective practices varied widely. The team therefore focused on the top 100 practices (semifinalists) for 2018 and performed an in-depth review of a random selection of semifinalist practices (n = 20). In selecting data elements for the QuickView grid, team members sought where possible to include information highlighted by our needs assessment (see Shark needs and suggestions, Table 2), informed by additional considerations detailed in Table 3. Once the finalist list was available, the 20 finalist applications (none of which had been selected for review in the first round) were also reviewed; 40% of the semifinalist practice submissions thus received in-depth review (n = 40). Data elements were ultimately included only if they could be extracted for every finalist practice and if there was sufficient variation across practices (see Appendix A, QuickView, for included elements).
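The inclusion rule described above (retain a data element only if it can be extracted for every finalist practice and shows variation across practices) can be expressed as a simple filter. The sketch below is purely illustrative: the field names, example records, and logic are hypothetical assumptions, not the team's actual extraction process, which was performed manually on application data.

```python
# Illustrative sketch of the QuickView inclusion rule described in the text:
# keep a candidate data element only if (1) it is present for every finalist
# application (availability) and (2) it varies across practices
# (differentiation). Field names and example records are hypothetical.

finalist_applications = [
    {"care_setting": "inpatient", "uses_telehealth": "yes", "vha_priority": "access"},
    {"care_setting": "outpatient", "uses_telehealth": "no", "vha_priority": "access"},
    {"care_setting": "outpatient", "uses_telehealth": "yes", "vha_priority": "access"},
]

candidate_elements = ["care_setting", "uses_telehealth", "vha_priority", "annual_cost"]

def include_element(element, applications):
    """Availability: non-empty in every application.
    Differentiation: more than one distinct value across applications."""
    values = [app.get(element) for app in applications]
    if any(v in (None, "") for v in values):  # fails availability
        return False
    return len(set(values)) > 1               # uniform values fail differentiation

selected = [e for e in candidate_elements if include_element(e, finalist_applications)]
print(selected)  # ['care_setting', 'uses_telehealth']
# 'vha_priority' is excluded (no variation); 'annual_cost' is excluded (missing).
```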

Table 3:

Considerations during design process for QuickView

Characteristics of data elements selected for inclusion
Concept Question
Availability Is data element present in every finalist project application?
Reproducibility Is data element collected through discrete menu items vs free text (requiring interpretation)?
Ease of access (given need for timely turnaround)
Differentiation Is there sufficient variation across finalist practices to merit inclusion of data element?
Do included data elements capture what is distinct between practices?
Generalizability Do data elements capture broadly applicable concepts and VA priorities?
Is design easy to adapt for future years?
Importance to Sharks Are we including relevant information drawn from applications?
Are we including items of potential use to Sharks, not available in applications (eg, characteristics of the site where practice was successfully implemented)?
Importance for successful implementation Does information contribute to improved “fit” between practice and implementing site?
Considerations for layout design
  • Number of practices displayed per page (5 practices)

  • Number of data elements displayed per practice (11 elements)

  • Consistent formatting to facilitate comparison across practices

  • Practices listed in order of Shark Tank–style presentation for easy use

Considerations for best practices in visual representation of data 16,17
  • Induce the viewer to think about substance rather than the graphic design

  • Avoid distorting the data

  • Encourage the eye to compare different pieces of data

  • Present many numbers in a small space

  • Make large data sets coherent

  • Serve a purpose: description, exploration, tabulation, or decoration

  • Simple representation of missing data (the authors avoided empty boxes, used M = missing data)

  • Judicious use of visual techniques (used words, colors, icons, numbers, and simple repeated charts)

  • Minimize need for viewer to reference key on separate page

VA, Veterans Administration.

Phase 2: Implementation of QuickView

QuickView Tool Implemented

The DOE implemented QuickView, distributing the tool via email to Sharks before the fourth competition. The team joined preparatory calls to introduce the tool and provide suggestions for its use. Colored hard copies were provided to bidders at the live Shark Tank–style event.

Postimplementation Evaluation

The team conducted direct observation of live Shark Tank–style bidding. Team members attended the live in-person event and took structured notes, observing Sharks’ consultation of the QuickView prior to and during the live bidding. QuickView was used by medical center directors and by support staff accompanying (or representing) them.

The team solicited feedback immediately following the live national Shark Tank–style event (see Table 4). Sharks were provided paper copies (if they were attending the competition in person) or invited to respond to questions on the chat function supported by the virtual platform. There were 47 registrants in total for the Shark Tank–style competition, though the team was unable to confirm the actual number of attendees. Ten Shark Tank–style participants contributed feedback, 4 via virtual chat and 6 in person (in-person attendees submitted written, survey-style responses because of limitations in our ability to reach them through the virtual platform). Seven participants provided comments in response to our question on the utility of the QuickView tool (1 via chat; 6 via paper survey). Responses were collected and reviewed. A single question focused on utility of the QuickView tool: “If you used the Finalist Quick View, in what ways was it useful (or not useful)? The Finalist QuickView was the 4-page document summarizing finalist practices.” Another question focused more generally on suggestions for improvement in the Shark Tank–style experience overall: “What do you feel could have been better overall?” Additional questions provided further insights into the Sharks’ needs when preparing for the Shark Tank–style competition and determining whether to place a bid.

Table 4:

Findings from post-QuickView implementation feedback (collected via paper survey and online “chat” after the fourth Shark Tank–style competition, 2018)

Shark (medical center director) responses to:
If you used the Finalist Quick View, in what ways was it useful (or not useful)? What do you feel could have been bettera?
“The QuickView was good” “Asking the participants to prepare a slide for the Sharks with their ideal Shark/facility/bid would have been helpful”
“Very helpful in prepwork with ELT [Executive Leadership Team] in deciding what we wanted to bid on”
“Allowed me to get acquainted with the projects—very nicely done” “Many of these were practices we have been doing at our facility for a while”
“Useful” “Query the interest of the Sharks before bidding” [to prevent practices with no bids]
“Great snapshot view,”
“yes”
“Really good tool, maybe a color coding element”
a

Responses shown here are limited to those focused on optimizing practice/site match

The team conducted interviews with medical center directors (see Table 5). Interviews (n = 20) were conducted by phone and were recorded with permission from the interviewees, then professionally transcribed. These interviews were part of a larger focus on understanding medical center director decision making surrounding implementation of innovative practices within their facility. Reported here are only those themes that emerged in response to a question focused on the QuickView. Interviewees were asked, “Describe the process you went through to prepare for the Shark Tank” and were then asked a follow-up: “You received a document that provided a Finalist Quick View of each practice. Did you use this to help prepare?” Transcripts from the interviews were reviewed and coded, identifying a priori and emergent themes.

Table 5:

Medical center director interviewees describe experience of using QuickView tool

Positive feedback about QuickView
Used QuickView to supplement existing process; helpful not to need to create their own summary “Well, I mean again, who the innovation specialist is presenting this to is to the quad and chief of quality, safety, and value. And we don't have time to go through every single thing in detail. So my innovations specialist having that one sheet really just helps summarize it for us of why it’s important, what it will be able to accomplish, what resources it would take, that was very useful in that sense. Because my innovation specialist didn't have to create that themselves. They can start with that and add their notes to it ... it would be innovation specialist who was doing that [creating a summary] ... But I think giving us that one-pager, at least we all start with the standardized one sheet. And then again, my innovation specialist might give a little couple more comments but we all start with the same information ... But instead of taking it from a PowerPoint or even a video presentation of something, it helps a lot.”
Nice quick snapshot of each individual practice
Useful, comprehensive, easy to read
“In addition our existing process…we were able [to] still utilize it. It was a nice quick snapshot of each individual practice.”
[asked about ways to improve QuickView] “… nothing really comes to mind, it was pretty comprehensive and easy to read.”
Helped to further clarify what the projects were “I think it was definitely helpful to further explain the innovation idea ... I think it helped further clarify what exactly the project is. I think that may have been one that we didn't bid as high on once we found out what it really was. I think that for clarification purposes, I think it was helpful.”
Wanted more information/did not recall using QuickView
Difficulty remembering whether or not they had received a QuickView
Wanted more detailed synopsis
  • [Describing what they would find helpful]

  • “So maybe like a summary page of the project that has an outline of the start to finish, might be something that gives you just—you know, almost like an issue brief that says this is what we did, these were our improvements and this is what our resources were, so that it’s very easy to review to see if it would be a match or not.”

[Interviewer described QuickView tool]
”I believe—I do remember that. I think it was just a very brief synopsis of what the practice was. But I think what I’m referring to would be a little more detailed synopsis of what the practice is to where—you know, you just had a little bit more information.”
Discussing general considerations for bidding “[W]e were in a tight budget year. And so resources were an issue for us to consider. And then the factor of, well, what’s ... particular areas have strengths or weaknesses and which do we think would be the best fit? And then lastly, I know that one common mistake is for people to try too much and to try too many things. And so, we had to focus it, and focus your efforts because the worst thing you can do is bite off more than you can chew.”

Tool Revision

The tool was reviewed for alignment with the program’s online platform. Since our initial development of the QuickView tool, the program had begun designing a web-based platform known as the Diffusion Marketplace, 9 the goal of which was to facilitate cross-VA sharing of information on practices developed through DOE and related programs. As part of our QuickView revision, the team familiarized itself with the Diffusion Marketplace to better align the revised QuickView tool with both the content and the data visualization employed on that site.

Brief surveys were collected from finalist leads before the competition.

To create the revised QuickView for 2019 (see Appendix B), our team started with the 2018 final template for QuickView and assessed corresponding questions on the Diffusion Marketplace, making decisions on what needed to be presented differently in QuickView to ensure alignment. Once the 15 finalists were announced, the existing questions were populated where possible.

At our recommendation, the DOE program agreed to administer a brief survey to the 15 finalists several weeks prior to the 2019 competition, to supplement application data. The questionnaire consisted of 5 content areas (see Appendix C). Questions were structured to align with Diffusion Marketplace content where possible, to enhance the coherence of program messaging and to minimize the potential confusion in those looking at both resources. The questionnaire was created in REDCap and emailed to the finalists.

The team also elicited input from program leadership on the representation of newly collected data elements (focused on staff and hours). The revised QuickView includes new data elements (number of full-time equivalents and person hours necessary to implement practice) as well as replacement elements (“ease of replication” was replaced by “complexity of implementation” to align better with Marketplace data elements; both self-reported).

Paired with the revised QuickView, a supplemental tool called the WishList was also developed. Our survey asked finalists: “If you could help write the winning bid for your own practice, what would that bid look like? 1) Must-have: assuming none of the required elements are in place yet, please list the minimum requirements a new site will need to provide in order to implement your practice; 2) nice-to-have: What additional commitments would help make your project implementation a success?” Verbatim responses were provided to Shark Tank–style bidders (see Appendix B).

Phase 3: Implementation of Revised QuickView With WishList

Revised Tool Implemented

The DOE implemented the revised QuickView with WishList (Appendix B), distributing them via email to Sharks before the fifth competition, with hard copies available to those attending the competition in person. Team members again joined preparatory calls to introduce the QuickView tool and provide suggestions for its use in pre-competition preparation.

Postimplementation Evaluation (With Comprehensive Review of Submitted Bids)

Live Shark Tank–style bidding and bid selection by program leadership were directly observed. Team members attended the live in-person event (fifth competition) and additionally observed the process by which leaders selected bids after the competition, using QuickView for reference.

Submitted bids were reviewed at 4 stages, with preliminary reviews conducted in 2017 (after the third competition), 2018 (after the fourth competition), and 2019 (after the fifth competition). In 2020, a comprehensive review of all submitted bids (2016–2020) was conducted, and bids received in each of the 6 cycles of the VHA Shark Tank–style competition were analyzed (Table 6). Bids were ranked (scale 1–4) based on criteria that our evaluation team identified through preliminary review of bids from all 6 years. Ranking was intended to reflect the extent of understanding of resources required for implementation. Lower rankings reflected 1) the absence of any mention of resources committed, 2) a nonspecific statement about the bidder’s intent to provide needed resources, or 3) a statement about intent to provide needed resources that identified relevant departments or teams but gave no further details. The highest score (4) was assigned to bids providing a list of relevant available or committed resources (eg, number of full-time equivalents in personnel, access to specialized materials, space). Other bid characteristics were coded as present/absent. These included 1) details describing the site’s need for the practice, 2) reference to leadership/champion support, 3) commitment of travel funds, and 4) inclusion of “sweeteners” (promises of extra goods or experiences such as meals with local foods, tickets to regional sports events, lunch with the director, etc). Bid word count was also calculated.

Table 6:

Characteristics of bids placed in Shark Tank–style competition, by year

Values are shown as number of bids (%) for each competition year, in the order 2016; 2017; 2018; 2019; 2020.

Number of bids placed (total): 127; 87; 46; 36; 29

Extent to which bid reflects understanding of resources needed for implementation (ranked from 1, lowest, to 4, highest; rankings are mutually exclusive):
1—Absence of any mention of resources committed: 9 (7.1); 11 (12.6); 6 (13.0); 1 (2.8); 1 (3.4)
2—General statement (specifics not provided), intent to provide needed resources: 8 (6.3); 6 (6.9); 4 (8.7); 2 (5.6); 1 (3.4)
3—Statement with some specifics, intent to provide needed resources with identification of relevant departments or teams: 7 (5.5); 10 (11.5); 2 (4.3); 1 (2.8); 1 (3.4)
Total number of bids ranked 1–3: 24 (18.9); 27 (31.0); 12 (26.1); 4 (11.1); 3 (10.3)
4—Statement with specific details, intent to provide needed resources with a list of relevant resources (eg, FTE for administrative or clinical staff, access to specialized materials, space): 103 (81.1); 60 (69.0); 34 (73.9); 32 (88.9); 26 (89.7)

Other characteristics of bids:
Bid references leadership support or project champion: 46 (36.2); 17 (19.5); 10 (21.7); 13 (36.1); 14 (48.3)
Bid commits funds for travel: 57 (44.9); 16 (18.4); 25 (54.3); 12 (33.3); 1 (3.5)
Bid provides details on site’s assessed need for practice: 6 (4.7); 54 (62.1); 36 (78.3); 29 (80.6); 26 (89.7)
Bid adds in “sweeteners”a: 24 (18.9); 9 (10.3); 15 (32.6); 6 (16.7); 0 (0)
Bid word count (mean): 20.9; 42.8; 70.2; 212.4; 264.7
a

Sweeteners were statements promising goods or experiences outside what was formally needed to implement the practice. In some cases, access to leadership was offered, with varying degrees of seriousness (eg, lunch with director, signed photo of director, “an old photo…unsigned”). They often expressed local/regional identity by including local food specialties (eg, lobster rolls, BBQ, maple syrup) or trips to local sports teams or participation in recreational activities.

Two reviewers (team members who had not been involved in QuickView development) read each bid, assigned a classification, and then met to review coding. Where they differed, coding was discussed. Consensus was reached in all cases.
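For concreteness, the tallying step that turns coded bids into the per-year counts and percentages reported in Table 6 can be sketched as follows. The bid records, field names, and values below are hypothetical stand-ins for the team's coding data, not the actual dataset.

```python
# Illustrative tally of coded bids into per-year counts and percentages,
# mirroring the structure of Table 6. In the evaluation, two reviewers coded
# each bid and resolved differences by consensus before any tallying.
from collections import Counter, defaultdict

coded_bids = [  # hypothetical example records
    {"year": 2019, "resource_rank": 4, "describes_local_need": True,  "sweetener": False},
    {"year": 2019, "resource_rank": 2, "describes_local_need": False, "sweetener": True},
    {"year": 2020, "resource_rank": 4, "describes_local_need": True,  "sweetener": False},
]

rank_counts = defaultdict(Counter)  # year -> Counter of resource ranks (1-4)
need_counts = Counter()             # year -> bids describing local need
totals = Counter()                  # year -> total bids placed

for bid in coded_bids:
    year = bid["year"]
    totals[year] += 1
    rank_counts[year][bid["resource_rank"]] += 1
    if bid["describes_local_need"]:
        need_counts[year] += 1

for year in sorted(totals):
    n = totals[year]
    pct_rank4 = 100 * rank_counts[year][4] / n
    pct_need = 100 * need_counts[year] / n
    print(f"{year}: {n} bids, {pct_rank4:.1f}% rank 4, {pct_need:.1f}% describe local need")
```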

Transition and Sustainment

In 2020, prior to the sixth Shark Tank–style event, the evaluation team transitioned tool management to the DOE program. The DOE program led creation and distribution of the QuickView/WishList for the sixth Shark Tank–style event (an entirely virtual event in 2020). (See Appendix C.) Our team followed up with program leadership after the competition to collect feedback.

Results

The results of our postimplementation evaluations for phase 2 (implementation of QuickView) and phase 3 (implementation of revised QuickView and WishList, comprehensive review of bids from 2016–2020) are reported here. Findings related to transition and sustainment are also reported.

Phase 2. Implementation of QuickView—Postimplementation Evaluations

Direct Observation of Live Shark Tank–Style Bidding

The QuickView tool was used by Sharks and DOE leadership during the live event (Sharks referred to QuickView when practices were being presented by finalists).

Post-Competition Feedback

Ten Shark Tank–style participants contributed feedback, with 7 commenting directly on the QuickView tool. All respondents who addressed the QuickView (n = 7) described finding the tool helpful to get acquainted with projects in advance of the Shark Tank–style competition. Respondents commented on its function as a “snapshot view” and described it as a tool for engaging executive leadership in bidding/adoption decisions (Table 4). Sharks noted major considerations when bidding, including 1) the cost of the finalist practices and 2) the potential benefit of an overview slide. On cost, 1 Shark shared: “[o]nce realized the cost, changed my mind about bidding”; another Shark observed that it “seemed that some practices required a lot of FTE or $ to test at our facility.” On overviews, a Shark suggested: “Asking the participants to prepare a slide for the Sharks with their ideal Shark/facility/bid would have been helpful” (Table 4). These comments directly informed our revised QuickView and WishList, implemented in 2019.

Interviews With Medical Center Directors

The team conducted 20 interviews with medical center directors and executive staff (28 total individuals) across the country in late 2018 and early 2019, approximately 2–5 months after the competition; of these, 9 directors had participated in Shark Tank–style events. Themes that emerged from this review are included in Appendix D, along with representative quotes. Several directors described ways they had integrated the QuickView into pre-competition preparation; one respondent did not initially remember using QuickView and, once reminded of the tool’s characteristics, criticized it for providing “just a very brief synopsis of what the practice was,” calling for more detailed information.

Our evaluation had identified several data elements that Sharks wished to have during bid preparation. Briefly, these included information on cost/number of full-time equivalents for personnel as well as information on the contents of an ideal bid (which populated the WishList described in the following). In some cases, this information was present in the applications (derived from open-ended questions) but was inconsistent across practices. In other cases, the information was completely unavailable.

Phase 3. Implementation of Revised QuickView With WishList—Postimplementation Evaluations

Direct Observation of Live Shark Tank–Style Bidding and Bid-Selection (After the Competition)

As in the previous year’s Shark Tank–style competition, team members noted that the QuickView tool was used by Sharks and DOE leadership during the live event and by DOE leadership and support staff for reference in sorting and prioritizing candidate bids following the competition.

Comprehensive Review of Submitted Bids (2016–2020)

In later years, bids reflected increased pre-competition review of facilities’ current state, increased understanding of the resources needed for implementation, and increased length. Percentage of bids describing local need for the practice rose consistently from 2016 through 2020: 4.7% (6/127), 62.1% (54/87), 78.3% (36/46), 80.6% (29/36), 89.7% (26/29). Percentage of bids in which specific resources committed to the practice were described rose in each year following our 2018 QuickView implementation: 81.1% (103/127) in 2016, 69.0% (60/87) in 2017, then 73.9% (34/46) in 2018, 88.9% (32/36) in 2019, 89.7% (26/29) in 2020. Average word count rose from 20.9 (2016) to 264.7 (2020), and overall number of submitted bids dropped from 127 (2016) to 29 (2020).

The team identified a number of bids that included elements our team labeled sweeteners: statements promising goods or experiences outside of what was formally needed to implement the practice. As noted in observations of Shark Tank–style interactions, sweeteners appeared to serve several roles. These were generally low-cost or free items promised in a friendly or humorous manner. As part of a national gathering of VHA medical center leaders, adding a sweetener to a bid was in some instances an expression of local or regional pride. Practice originators (who would often be traveling to the new implementing site) were offered local food specialties (“lobster rolls for your team,” “Texas-sized steak dinner,” “gallon of maple syrup”), trips to or clothing from local sports teams (“Gator T-shirt,” “Durham Bulls baseball game,” “round of golf at Augusta National,” “wide screen TV to watch Husker football”), and recreational activities (“We do have the Opry to offer!” “best fishing and seafood in the US!” “Two free passes to the Rock and Roll Hall of Fame,” “come to the bayou country with warm weather and Cajun cooking, great southern hospitality”). Sweeteners sometimes appeared in consecutive bids and appeared to respond to each other, expressing friendly competition. In some cases, access to leadership was offered, with varying degrees of seriousness (eg, lunch with director, “signed photo of director,” “an old photo…unsigned”). Sweeteners were common in the earlier years, peaking at 32.6% in 2018 and falling to zero by 2020.

Transition and Sustainment

In a follow-up call several weeks after the competition, our team spoke to program leadership to ascertain any feedback received on the tool. The director of the DOE program (author BH) reported using QuickView during the process of selecting winning bids and emphasized the influence of these tools on the development of the Diffusion Marketplace online platform.

Discussion

Sharks reported that QuickView supported pre-competition preparation. In response to Shark input, our revised tool added information on necessary full-time equivalents and person hours and incorporated practice finalists’ descriptions of minimum requirements for successful implementation. Our evaluation indicates that implementation of the QuickView and WishList tools coincided with increases in the level of preparation reflected in Shark bids. In the years following QuickView implementation, Shark bids increasingly reflected assessment of local need for the practice as well as an understanding of (and commitment to providing) resources needed for implementation. Whether directly caused by the QuickView tool or introduced by conditions changing in parallel with QuickView implementation, this trend reflects the VA’s ongoing efforts to provide institutional support for identifying and implementing innovative practices.

Interestingly, over the years the authors studied, the number of bids placed per Shark Tank–style event decreased. This drop in submitted bids (culminating in 2020) was likely influenced by a variety of external factors that the authors were unable to capture in our study. It is possible that the evolution of the Shark Tank–style application process, in conjunction with system-wide learning about how to improve on the bidding process, could have contributed both to lower overall numbers of submitted bids and to increased sophistication in bid development over time.

Although there are many advantages to this trajectory, our team also noted that the use of sweeteners as a bid element declined in recent years. These sweeteners were mainly low-cost or no-cost items used to express humor and hospitality, to create team-building motivational experiences (shared meals, shared nonwork events), or to provide local souvenirs as tokens of goodwill and local pride. The source of payment for these was not explicitly specified, but they were often framed as small gifts or expressions of hospitality from the directors. In resource-stretched environments, the motivation and goodwill generated by these gestures may play an important role in implementation success; their decline is therefore a trend worth monitoring. The DOE program’s facilitation support may benefit from explicitly addressing these factors, including their impact on team building and their influence on implementation success. An alternate explanation to keep in mind is that reduced use of sweeteners could result from an increased level of understanding (on the part of Sharks and their teams) of the technical and operational considerations of practice implementation. Taken together with our findings of increased bid sophistication, this would prompt us to interpret the decline in use of sweeteners as a signal that the goodwill and sense of camaraderie initially conveyed through small expressions of humor and hospitality are now being translated into preparation and teamwork supporting informed, high-quality bidding (with the goal of successful implementation of the selected practice).

The QuickView tool may also have had an effect at the organizational level: visually appealing graphics provided to Sharks and passed along to advisory team members may have helped focus attention on prebid preparation. One respondent specifically described the ways that the QuickView supported presentations to local leadership (to “the quad and chief of quality, safety, and value”). This respondent described using the tool as a starting point (“we all start with the standardized one sheet…they can start with that and add their notes to it”). Although the data collected on this effect were limited, our team did hear from Sharks that having summary sheets explaining “why it’s important, what it will be able to accomplish, what resources it would take, that was very useful.”

Leadership decisions on adoption of new practices within a medical center must respond to factors and influences that extend far beyond the local environment. These often include organizational conditions (eg, facility size, complexity of care delivered, broader network influences 29 ), as well as regional and national leadership priorities. 13,28 As demonstrated in our previous work, 3 medical center directors may focus on considerations such as support of senior leadership or the ability of a practice to affect performance measures; these may be seen by decision makers as more influential than questions of available space or resources. Implementing these practices, however, is often a ground-level, highly local effort. Tools to support optimal fit between practice and health care settings must seek to describe practices in the context of leadership priorities and also facilitate assessment of local needs and allocation of the resources needed for successful implementation. Assessment of the possibility of sustainment beyond periods of facilitated support can shed light on the extent to which initial fit of a practice translates to sustained activities over months to years, an important future topic for study. 30

Limitations

Our evaluation has some limitations. Changes in bid content were very likely influenced by factors beyond our QuickView and WishList tools. The trend over the first 5 years of Shark Tank–style competitions toward greater resource awareness in submitted bids may be consistent with an impact from QuickView implementation efforts but may also be consistent with gradual system learning and with interventions outside QuickView, such as the evolution of the application process for Shark Tank–style competitions. Over the time period evaluated, the DOE program underwent changes in leadership and an evolution in Shark Tank–style format. In the earliest years of the competition, bids could be called out verbally (followed by written submission) or submitted via an online chat function visible to other bidders, emphasizing the sense that participants were engaging in a friendly, competitive event. The format evolved, incorporating more sophisticated virtual platforms. Because of the COVID-19 pandemic, participation in 2020 was entirely virtual. Decreased opportunity for interaction between Sharks may have decreased the use of sweeteners (less of the atmosphere of competitive camaraderie that had been encouraged at in-person events) and may have influenced the number of bidders for the same reason. Guidance provided to Sharks on pre-competition preparation (based on our evaluation findings) also became more explicit over time and may have influenced the trends in the percentage of bids describing local need for the practice. Given these limitations, our embedded evaluation team will continue in upcoming years to investigate the relationships between what facilities bid in the Shark Tank–style competition, the organizational goals of bidding facilities, and the ability of sites to implement and sustain innovative practices successfully.

Conclusion

Over 5 years, our team has worked alongside DOE program leadership to provide evidence-based guidance on ways to support Shark decision making when adopting new practices. Our findings from this evaluation indicate that selection of a new practice for implementation within a medical center should begin with a thorough understanding of local need for the practice, clear expectations regarding resources required, and an assessment of fit between site and practice. QuickView and WishList support these determinations and have also informed design for the program’s Diffusion Marketplace, an online platform facilitating practice uptake and spread nationwide.

Our findings have important implications that extend beyond the VA. Health system leaders seeking to implement innovative practices at their own institutions may benefit from a systematic approach to gathering and presenting data for practice comparison and supporting assessment of local need and fit prior to practice selection by leadership. Clear lines of communication allowing practice originators to share with leadership views on must-haves (minimum necessary requirements for implementation) and nice-to-haves (additional resources that would support implementation success) may also aid in informed practice selection.

Disclaimer

The views expressed in this paper are those of the authors and do not reflect the position or policy of the Department of Veterans Affairs or the US Government.

Supplementary Material

tpp_23.008-suppl-01.docx (1.8MB, docx)

Acknowledgments

The authors would like to acknowledge the invaluable contribution made by Thomas K Houston, MD, to the initial idea of implementing comparison grids for purposes of decision support.

Footnotes

Author Contributions: SLC led the implementation and evaluation efforts as well as manuscript writing. LW led preliminary review of bid data and design/revision of the QuickView tool. DM, KD, and EO coded bid data. EO and LW led implementation of QuickView and WishList tools and managed evaluation efforts. GLJ, LJD, TPH, ALG, BW, HAK, MAOW, ALN, and CMR contributed to design and implementation of QuickView, guided evaluation, and provided input on manuscript writing. BH and RV provided input on the QuickView tool design, implementation, and revision. All authors read and approved the final manuscript.

Conflicts of Interest: None declared

Funding: The evaluation was funded by the VHA QUERI [PEC-17-002] with additional funding subsequently provided by the VHA Office of Rural Health through the VHA Diffusion of Excellence program.

References

  • 1. Vega RJ, Kizer KW. VHA’s innovation ecosystem: Operationalizing innovation in health care. NEJM Catalyst. 2020;1(6):1. doi: 10.1056/CAT.20.0263
  • 2. Vega R, Jackson GL, Henderson B, et al. Diffusion of excellence: Accelerating the spread of clinical innovation and best practices across the nation’s largest health system. Perm J. 2019;23:23. doi: 10.7812/TPP/18.309
  • 3. Jackson GL, et al. Merging implementation practice and science to scale up promising practices: The Veterans Health Administration (VHA) diffusion of excellence (DoE) program. Jt Comm J Qual Patient Saf. 2021;47(4):217–227. doi: 10.1016/j.jcjq.2020.11.014
  • 4. Nevedal AL, Reardon CM, Jackson GL, et al. Implementation and sustainment of diverse practices in a large integrated health system: A mixed methods study. Implement Sci Commun. 2020;1:61. doi: 10.1186/s43058-020-00053-1
  • 5. Clancy C. Creating world-class care and service for our nation’s finest: How veterans health administration diffusion of excellence initiative is innovating and transforming veterans affairs health care. Perm J. 2019;23. doi: 10.7812/TPP/18.301
  • 6. Kobe EA, Lewinski AA, Jeffreys AS, et al. Implementation of an intensive telehealth intervention for rural patients with clinic-refractory diabetes. J Gen Intern Med. 2022;37(12):3080–3088. doi: 10.1007/s11606-021-07281-8
  • 7. Lange TM, Hilgeman MM, Portz KJ, Intoccia VA, Cramer RJ. Pride in all who served: Development, feasibility, and initial efficacy of a health education group for LGBT veterans. J Trauma Dissociation. 2020;21(4):484–504. doi: 10.1080/15299732.2020.1770147
  • 8. Nichols LO, Chang C, Lummus A, et al. The cost-effectiveness of a behavior intervention with caregivers of patients with Alzheimer’s disease. J Am Geriatr Soc. 2008;56(3):413–420. doi: 10.1111/j.1532-5415.2007.01569.x
  • 9. US Dept of Veterans Affairs. Diffusion Marketplace (va.gov). Accessed 9 April 2023.
  • 10. Crowley MJ, Tarkington PE, Bosworth HB, et al. Effect of a comprehensive telehealth intervention vs telemonitoring and care coordination in patients with persistently poor type 2 diabetes control. JAMA Intern Med. 2022;182(9):943. doi: 10.1001/jamainternmed.2022.2947
  • 11. Terwiesch C, Mehta SJ, Volpp KG. Innovating in health delivery: The Penn Medicine innovation tournament. Healthc (Amst). 2013;1(1–2):37–41. doi: 10.1016/j.hjdsi.2013.05.003
  • 12. Jung OS, Jackson J, Majmudar M, McCree P, Isselbacher EM. Engaging frontline employees using innovation contests: Lessons from Massachusetts General Hospital. Healthc (Amst). 2022;10(2):100615. doi: 10.1016/j.hjdsi.2022.100615
  • 13. Kaitz J, DeLaughter K, Deeney C, et al. Leveraging organizational conditions for innovation: A typology of facility engagement in the veterans health administration shark tank-style competition. Perm J. 2023;27(2):43–50. doi: 10.7812/TPP/22.154
  • 14. Brockhoff K, Chakrabarti AK, Hauschildt J. The dynamics of innovation. In: Organizational adaptation and innovation: The dynamics of adopting innovation types. Berlin: Springer; 1999. doi: 10.1007/978-3-662-03988-5
  • 15. Berwick DM. Disseminating innovations in health care. JAMA. 2003;289(15):1969–1975. doi: 10.1001/jama.289.15.1969
  • 16. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. doi: 10.1186/1748-5908-4-50
  • 17. Williams JW, Jackson GL. Utilizing evidence to address the health and health care needs of veterans. North Carolina Medical Journal. 2015;76(5):294–298. doi: 10.18043/ncm.76.5.294
  • 18. Rogers EM. Diffusion of Innovations. 4th ed. New York, NY: The Free Press; 1995.
  • 19. Jackson GL, Williams JW. Does PCMH “work”?--The need to use implementation science to make sense of conflicting results. JAMA Intern Med. 2015;175(8):1369–1370. doi: 10.1001/jamainternmed.2015.2067
  • 20. Jackson GL, Cutrona SL, Kilbourne A, White BS, Everett C, Damschroder LJ. Implementation science: Helping healthcare systems improve. JAAPA. 2020;33(1):51–53. doi: 10.1097/01.JAA.0000615508.92677.66
  • 21. Jackson GL, Damschroder LJ, White BS, et al. Balancing reality in embedded research and evaluation: Low vs high embeddedness. Learning Health Systems. 2022;6(2). doi: 10.1002/lrh2.10294
  • 22. Vashi AA, Orvek EA, Tuepker A, et al. The Veterans Health Administration (VHA) innovators network: Evaluation design, methods and lessons learned through an embedded research approach. Healthc (Amst). 2021;8(Suppl 1):100477. doi: 10.1016/j.hjdsi.2020.100477
  • 23. National Center for Veterans Analysis and Statistics. Accessed 7 October 2023. www.va.gov/vetdata/docs/pocketcards/fy2021q2.pdf
  • 24. Tang H, Lee CBP, Choong KK. Consumer decision support systems for novice buyers--A design science approach. Inf Syst Front. 2017;19(4):881–897. doi: 10.1007/s10796-016-9639-9
  • 25. Tufte E. The Visual Display of Quantitative Information. 2nd ed. Cheshire, CT: Graphics Press; 1983.
  • 26. Nicholson K. The truthful art: Data, charts and maps for communication. Technology | Architecture + Design. 2017;1(2):243–245. doi: 10.1080/24751448.2017.1354630
  • 27. Vessey I. Cognitive fit: A theory-based analysis of the graphs versus tables literature. Decision Sciences. 1991;22(2):219–240. doi: 10.1111/j.1540-5915.1991.tb00344.x
  • 28. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm Policy Ment Health. 2015;42(5):533–544. doi: 10.1007/s10488-013-0528-y
  • 29. Anderson N, Potočnik K, Zhou J. Innovation and creativity in organizations: A state-of-the-science review, prospective commentary, and guiding framework. J Manage. 2014;40(5):1297–1333. doi: 10.1177/0149206314527128
  • 30. Reardon CM, Damschroder L, Opra Widerquist MA, et al. Sustainment of diverse evidence-informed practices disseminated in the veterans health administration (VHA): Initial development and piloting of a pragmatic survey tool. Implement Sci Commun. 2023;4(1):6. doi: 10.1186/s43058-022-00386-z

