Abstract
Introduction
Implementation researchers could draw from participatory research to engage patients (consumers of healthcare) in implementation processes and possibly reduce healthcare disparities. There is little consumer involvement in healthcare implementation, partly because no formal guidance exists. We will create and pilot a toolkit of methods to engage consumers from the US Veterans Health Administration (VHA) in selecting and tailoring implementation strategies. This toolkit, Consumer Voice, will provide guidance on what, when, where, how and why an implementer might engage consumers in implementing treatments. We will pilot the toolkit by implementing the Safety Planning Intervention for suicide prevention with rural veterans, a population with suicide disparities. The Safety Planning Intervention is effective for reducing suicidal behaviours.
Methods and analysis
In Aim 1, we will use participatory approaches and user-centred design to develop Consumer Voice and its methods. In Aim 2, we will pilot Consumer Voice by implementing the Safety Planning Intervention in two clinics serving rural VHA patients. One site will receive a current implementation strategy (Implementation Facilitation) only; the second will receive Implementation Facilitation plus Consumer Voice. We will use mixed methods to assess feasibility and acceptability of Consumer Voice. We will compare sites on preliminary implementation (reach, adoption, fidelity) and clinical outcomes (depression severity, suicidal ideation, suicidal behaviour). In Aim 3, we will evaluate Aim 2 outcomes at 20 months to assess sustained impact. We will gather qualitative data on sustainability of the Safety Planning Intervention.
Ethics and dissemination
These studies are overseen by the Institutional Review Board at the Central Arkansas Veterans Healthcare System. We plan to use traditional academic modalities of dissemination (eg, conferences, publications). We plan to disseminate findings through meetings with other trainers in implementation practice so they may adopt Consumer Voice. We plan to share results with local community boards.
Keywords: mental health, public health, quality in health care
Strengths and limitations of this study.
Rigorous, iterative process based on user-centred design.
Includes consumers/patients in several steps of toolkit development.
Consumers/patients are involved on the research team that makes decisions about Consumer Voice.
Researchers will have difficulty detecting patient-level differences in outcomes due to relative infrequency of suicide.
Provides a toolkit on how to engage consumers/patients in implementation practice that can generalise to other settings.
Introduction
Healthcare disparities are significant differences in receipt of, access to, quality of, or outcomes of healthcare between marginalised groups and reference groups.1 Healthcare disparities persist in the USA and in the Veterans Health Administration (VHA) for several marginalised groups who have experienced societal oppression.2 3 One reason disparities persist is that clinical interventions target patient factors only—patients’ individual attitudes, behaviours—and although these are necessary targets to reduce disparities, they are not sufficient.4 We must also intervene on broader structures—for example, cultures of change, policies, organisational climate. Implementation scientists can address these broad, organisational factors contributing to disparities by using implementation strategies.
Implementation strategies are implementation interventions to address known barriers to uptake of a clinical intervention.5 Implementation strategies are commonly targeted at providers, clinics, hospitals or systems, such as provider training, performance data feedback or securing new funding streams.6 For example, to reduce racial disparities in guideline-concordant cardiovascular disease care, one possible implementation strategy is to plan for, act on and re-evaluate quality improvement efforts among patients by race. Typically, implementers—researchers, quality improvement personnel, facilitators—select and tailor implementation strategies. Tailoring a strategy involves refinements or tweaks so that it fits better with local context and more precisely targets implementation barriers.7 Although existing implementation strategies have improved care for the general population,5 they may not be sufficient to improve care for marginalised populations.8 One potential solution to reduce healthcare disparities is to engage marginalised patients (referred to as consumers) in selecting and tailoring implementation strategies to better fit their needs.
Participatory approach to engage consumers in implementation
Participatory research is an approach in which consumers are actively engaged in the research process. Consumers might be informants, discussants, or partners in research with varying degrees of decision-making power and trust with healthcare or academic staff. Among marginalised populations, participatory research has enhanced retention in health disparities research,9 improved fidelity to clinical care,10 improved health outcomes9 and reduced inequities in access to, satisfaction with, and quality of care.11 In fact, the Agency for Healthcare Research and Quality recommends participatory research as a ‘gold standard’ to reduce disparities.12
Although implementers often engage healthcare staff, using participatory approaches to involve consumers throughout implementation does not often occur.13 In the most robust example of using a participatory approach to enhance implementation, quality improvement that included community members was more effective than technical assistance without a participatory approach for uptake of a depression intervention across diverse US healthcare settings.14
A participatory approach to implementation shares principles with participatory research, such as work funded by the Patient-Centered Outcomes Research Institute in which consumers identify the research outcomes most important to them. Yet engaging consumers in implementation is distinguishable from participatory research by its focus beyond the intervention and its outcomes to the broader factors that determine whether patients, organisations and providers are willing or able to implement the intervention. A distinctive feature of this study is that we will engage consumers to shape strategies for increasing uptake of an intervention, rather than, as is more typical, to determine components of an intervention or its outcomes.
Gap in implementation and purpose of the current study
Implementation scientists need to reduce disparities in uptake and reach of interventions.15 Although consumer engagement in implementation has nascent evidence of improving healthcare among marginalised populations,16 methods for involving consumers in selecting and tailoring implementation strategies are not well synthesised or documented. Thus, participatory approaches to implementation are used less frequently than ideal, not well operationalised or reported, and not well studied as potential mechanisms for decreasing healthcare disparities. A Cochrane review of consumer engagement in healthcare called for greater specificity on how consumers are engaged and what resources are needed so processes and positive effects can be replicated.17
The purpose of this study is to augment a conventional method of selecting and tailoring implementation strategies (Implementation Facilitation) with consumer engagement and assess feasibility, acceptability and preliminary impact of consumer engagement on implementation and clinical outcomes. We will systematically develop a toolkit, Consumer Voice, to guide processes for engaging consumers in selecting and tailoring implementation strategies. We will pilot it by implementing the Safety Planning Intervention to prevent suicide among rural VHA patients.
Conventional strategy: Implementation Facilitation
To evaluate Consumer Voice, we will pair it with a conventional implementation strategy, Implementation Facilitation.18–20 Implementation Facilitation is defined as ‘a process of interactive problem solving and support that occurs in the context of a recognized need for improvement and a supportive interpersonal relationship’.21 22 Implementation Facilitation involves methods to select and tailor implementation strategies and does not usually involve interfacing with consumers.
Intervention to be implemented: Safety Planning Intervention among rural VHA patients
Approximately six VHA patients die by suicide daily.23 Compared with urban dwelling US veterans, rural dwelling US veterans are more likely to consider suicide and less likely to access mental healthcare.24 Safety Planning Intervention is a suicide prevention intervention, effective at reducing suicidal ideation and suicidal behaviours, inside and outside VHA.25 26 Safety Planning Intervention is a one-session, clinical intervention for patients with suicidal thoughts or behaviours. Patients and providers collaboratively create a safety plan with prompts populated in the VHA electronic health record and a copy given to patients.25 A complete Safety Planning Intervention safety plan consists of six types of coping skills that patients can use when suicidal thoughts arise.
Specific aims
Using a participatory approach, we will develop a toolkit (Consumer Voice) containing methods to engage consumers in selecting and tailoring implementation strategies.
Using a two-arm design, we will pilot feasibility and acceptability of Consumer Voice and its preliminary impact on implementation and clinical outcomes by implementing Safety Planning Intervention.
We will compare Implementation Facilitation to Implementation Facilitation plus Consumer Voice on sustainability of Safety Planning Intervention and assess factors that enhance or hinder sustainability of Safety Planning Intervention.
Methods and analysis
Patient and public involvement
The development of this research question was informed by patient and public opinion through our VHA centre’s Veterans Research Council. The lead author met with the council as a group, presented research ideas, integrated some of their feedback while retaining final decision-making power, and returned to the council a second time to refine ideas before submitting this research for external funding. We also incorporated patient and public involvement in the design of Aim 1, especially recruitment strategies and locales suited to patients, by consulting with three patient representatives working in community organisations serving VHA patients in our US state.
Although this is a protocol, we began early components of the study and added two community member consultants (Veterans) to our research team, which makes decisions about the form and function of Consumer Voice. For dissemination, we plan to share results with our local community boards, such as the community service organisations serving VHA patients and our local Veterans Research Council. We also plan to create an infographic of key results and distribute it on social media from our research centre.
Theoretical approach
We will use the Health Equity Implementation Framework27 (see figure 1) to inform this research. This framework posits domains that predict successful implementation and reductions in implementation disparities. Within each domain are several determinants or specific factors that are measurable and, together in constellation with other determinants, clarify barriers, facilitators, moderators or mediators to equitable and successful implementation. The framework also proposes a process—Implementation Facilitation—by which change in each domain would occur.28 29
Some examples of domains in the Health Equity Implementation Framework are described below. Innovation refers to the treatment, intervention, practice, or new ‘thing’ to be implemented (ie, the Safety Planning Intervention), adopted by providers and staff, and delivered to patients.30 Recipients are individuals who influence implementation and those who are affected by its outcomes (ie, rural VHA patients, VHA staff and providers), at the individual and collective team levels.29 Cultural factors of recipients are characteristics unique to a particular group in the implementation effort (eg, patients, staff, providers) based on their lived experience. Some examples are implicit bias, socioeconomic status, stress related to discrimination, health literacy, health beliefs, or trust in the healthcare staff or patient group.31 32 Economies include how innovations are marketed and acquired (ie, government-controlled healthcare at low cost) and other market forces that change demand for the Safety Planning Intervention (eg, it becomes offered at local urgent care clinics outside of VHA). Physical structures are where people have to visit to get healthcare and what environmental elements people may be exposed to that exacerbate or minimise the health problem.33 One factor in rural areas can be lack of confidentiality for suicide screening in a small town with few providers where many residents know each other.
We will use the Health Equity Implementation Framework to: (1) identify barriers/facilitators to using Consumer Voice (Aim 1), (2) identify barriers/facilitators for Safety Planning Intervention implementation among rural VHA patients that will guide Implementation Facilitation and Consumer Voice at local clinics (Aim 2) and its sustainability (Aim 3), and (3) interpret results from Aims 1, 2 and 3.
Setting
To reach a subset of rural VHA patients at risk for suicide, we will target rural VHA community-based outpatient clinics in Arkansas that house primary care and mental healthcare. One reason to implement suicide prevention in these primary care settings is that many veteran suicide deaths occur among veterans who are not engaged in mental healthcare but do seek primary care.23 Suicide prevention in primary care will therefore reach more high-risk, rural veterans than suicide prevention in mental healthcare alone.
Study design, processes and planned analyses by specific aim
Aim 1: using a participatory approach, develop a toolkit (Consumer Voice) containing methods to engage consumers in selecting and tailoring implementation strategies
We will build a toolkit for use in engaging consumers in selecting and tailoring implementation strategies. Consumer Voice will be a multimedia manual showcasing who, what, when, where, how and why implementers should engage veterans (as consumers of healthcare) in implementing new or improved healthcare services. Our team will build the first draft of Consumer Voice based on a complete environmental scan of existing examples of consumer engagement in implementation activities.34
User-centred design
We will build Consumer Voice to expand Implementation Facilitation by using a QUALITATIVE→quantitative→QUALITATIVE structure through three sequential steps in which qualitative data will be given more weight (figure 2).35 Drawing from user-centred design,36 we will use an iterative approach to engage end-users (implementers) and other stakeholders in initial prototype testing and then mini-pilot tests of Consumer Voice. Our three sequential steps are: (1) conduct individual qualitative interviews and cocreation sessions with diverse stakeholders, (2) ask implementers to pilot Consumer Voice briefly in their own work and reconvene through a Delphi process to achieve consensus on components, and (3) reconvene diverse stakeholders again in a nominal group technique process to clarify the most feasible and important components for the final prototype of Consumer Voice. Within each step, we will use a variety of user-centred design methods such as interviews about user perspectives, applying process maps to visualise system-level implementation activities needed for Consumer Voice, cocreation sessions in which stakeholders develop some aspects of Consumer Voice alongside our team, and experience sampling (ie, implementers briefly pilot using Consumer Voice in their work).36
Step 1: stakeholder qualitative interviews
We will conduct interviews with key stakeholders (see table 1) to refine operational definitions of consumer engagement in implementation methods, preferences or needs, potential barriers to and facilitators of using these methods, and technical resources needed for Consumer Voice. We expect to achieve saturation between 12 and 20 total interviews.37
Table 1.
Stakeholder group | Location | Method |
Consumers (rural veterans who experienced suicide risk and caregivers/families; n=5) | In town of clinic: hotel lobbies, churches, coffee shops, Veterans Service Organizations* | Video, face-to-face or telephone |
Community members involved in suicide prevention (Veterans Service Officers, chaplains, n=3) | In town of clinic: hotel lobbies, churches, coffee shops, Veterans Service Organizations* | Video, face-to-face or telephone |
Clinic mental health providers and leadership (n=3) | Clinic or hospital | Video, face-to-face or telephone |
Safety Planning Intervention clinical champions at two VHA facilities, and at the national level (n=3) | – | Telephone or video |
Consumer engagement researchers (n=3) | – | Telephone or video |
Implementers who would be the end-users of Consumer Voice (n=3) | – | Telephone or video |
Interview length=45–60 min.
*These suggestions were derived from three key informant interviews with Veterans Service Officers in the state of Arkansas.
VHA, Veterans Health Administration.
To recruit stakeholders, we will use a respondent-driven, non-probabilistic approach, reaching out to existing contacts in each stakeholder group about potential participation. These contacts will serve as referral agents who suggest other stakeholders in any group for recruitment. We have built connections and partnerships with two veteran community groups. Stakeholders will be offered financial payment as an incentive.
Interview guide
The interview guide will be structured to assess preferred types of engagement and technical resources; see sample questions in table 2. Interviews will be audio recorded and approximately 45–60 min long, and interviewers will take notes during the interview.
Table 2.
Interview topic | Sample questions |
Preferred types of engagement | What activities or strategies would you like to be involved in when VHA is designing how they will implement a new treatment? Are there kinds of involvement you would be opposed to? Please tell me about your concerns. If we were to ask you to (insert type of engagement, for example, come to a VHA clinic twice in 3 months to act as a mock patient), would you do this? Why? Why not? |
Technical resources needed | To explain the way a new treatment might get implemented, would you prefer a video, for it to be written down, or for someone to talk about it verbally with you? Why do you prefer this approach over the others? Take a look at these materials to orient veterans to what we are doing (show prototype)—what do you think about this? What needs to change? What would you keep? |
VHA, Veterans Health Administration.
Qualitative analysis
We will use a Rapid Assessment Process to analyse qualitative data from stakeholder interviews. The time required for this approach can range from 4 days to 6 weeks.38 This method is useful for an implementation study in which there is a time-sensitive demand for creation and modification of an implementation product (Consumer Voice), yet a need for rigour in the analysis.39 The analysis will blend inductive and deductive approaches, using directed content analysis,40 allowing a framework to guide the analysis deductively while leaving room for emerging information. We will use the Health Equity Implementation Framework to create summary templates to categorise barriers and facilitators. We will present results to veteran community groups focused on suicide prevention, whose feedback will inform the next iteration of Consumer Voice.
Step 2: Delphi process with implementation experts
We will ask implementers to use a beta version of Consumer Voice in their own work as an uncontrolled pilot. Then, using a modified Delphi process that will produce quantitative data from voting, we will generate consensus on Consumer Voice through rounds of discussion and voting with those implementers.41
We will use respondent-driven sampling to identify up to 12 implementation experts by advertising on Twitter and approaching professional implementation networks. We will ask these participants to reach out to one other potential participant through e-mail or social media.
Experts will be engaged in two to three 60 min virtual Delphi sessions, using online polling and discussion via videoconferencing platforms, Microsoft PowerPoint and telephone calls to reach consensus. The sessions will follow this cycle: (1) present the draft version of Consumer Voice, elicit discussion from participants based on their experience, and vote on which components to include (70% agreement achieves consensus)42; and (2) present group results back to participants and elicit further discussion before voting again on which components to include. Implementers will receive the final version of Consumer Voice and monetary payment as an incentive.
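The 70% voting rule above is simple to operationalise. As an illustrative sketch only (not part of the protocol; the component names and vote counts below are hypothetical), one round of Delphi votes could be tallied like this:

```python
# Illustrative sketch: tally one Delphi voting round and flag which
# Consumer Voice components reach the 70% agreement threshold.
# Component names and ballots are hypothetical examples.

CONSENSUS_THRESHOLD = 0.70  # 70% agreement achieves consensus

def tally_round(votes):
    """votes: dict mapping component -> list of True/False 'include' votes.
    Returns dict mapping component -> (agreement_rate, reached_consensus)."""
    results = {}
    for component, ballots in votes.items():
        rate = sum(ballots) / len(ballots)
        results[component] = (rate, rate >= CONSENSUS_THRESHOLD)
    return results

# Example: 10 implementers vote on two hypothetical toolkit components.
votes = {
    "engagement timeline": [True] * 8 + [False] * 2,  # 80% agree
    "mock patient visits": [True] * 6 + [False] * 4,  # 60% agree
}
results = tally_round(votes)
# "engagement timeline" reaches consensus; "mock patient visits" does not
# and would be re-discussed and re-voted in the next session.
```

Components falling short of the threshold would simply be carried into the next round of discussion and voting, as described in the cycle above.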
We will also administer a one-time set of three questionnaires produced by Weiner et al,43 four questions each, assessing feasibility, acceptability and appropriateness of the beta version of Consumer Voice. Responses are on a Likert scale ranging from 1 (completely disagree) to 5 (completely agree). Example items include ‘Consumer Voice is appealing to me’ (acceptability), ‘Consumer Voice seems fitting’ (appropriateness) and ‘Consumer Voice seems doable’ (feasibility).
Step 3: nominal group technique to finalise Consumer Voice
Finally, we will use the nominal group technique with stakeholders to prioritise final components of Consumer Voice after step 2 (post-Delphi version) based on stakeholder rankings of importance and feasibility. The nominal group technique is a participatory research method in which exploratory questions about a topic are presented to small stakeholder groups to generate ideas, develop consensus and set priorities for guidelines, particularly for research areas that are underdeveloped.44
We will host two to three 2-hour meetings with subsets of the diverse stakeholder groups in table 1. Each stakeholder will attend only one meeting. We will offer very small groups (eg, 2–4 individuals), varying locations that can be private and confidential, and even individual interviews should a stakeholder prefer not to discuss these topics with others. Our sampling is consistent with recommendations for the nominal group technique: emphasis is on involving people from different roles/locations to ensure heterogeneity of viewpoints.45
Participants will be provided an explanation of nominal group technique, key terms used in discussion, and a draft of the Consumer Voice toolkit. Participants will also be provided with preprinted forms that specify exploratory response questions. The exploratory questions will be honed through initial individual stakeholder interviews; they will likely resemble: (1) ‘What do you think are the most important and feasible ways to engage VHA consumers in implementing a healthcare intervention?’; (2) ‘What are other methods or ways to engage VHA consumers in implementation?’. Participants will be able to select, adapt and suggest new methods in their lists. Participants will be provided 15 min to brainstorm in silence, followed by an oral round of listing ideas on flipcharts, serial discussion of each idea, group ranking of priorities, group discussion of rankings and re-ranking until consensus is reached.
Analysis to finalise Consumer Voice
The analytic plan is to use and connect data gathered after each of the three steps to form iterative prototypes of Consumer Voice.35 After each step, an analysis team (authors ENW, IAB, JEK, CW, veteran consultants) will meet to integrate data gathered from the prior step, using brainstorming and consensus, and decide how to integrate changes into the next Consumer Voice prototype. Data may take the form of suggested visual changes, stakeholder needs, suggested methods, activities or archival examples of consumer engagement in implementation. The analysis team will categorise the function of each consumer engagement method/activity on a continuum from least intensive to most intensive (eg, from informing consumers to partnering with them).46 One likely challenge is that findings from different stakeholder groups may diverge. The analysis team will work to resolve discrepancies during mixed-methods analysis between each step.47 In the final joint nominal group technique session, we will present remaining discrepancies to diverse stakeholders and elicit feedback on how to resolve them, lending priority to different groups based on the function or form of the discrepancy (eg, clinical expert opinions will be given priority on components of clinical intervention delivery).
Aim 2: using a two-arm design, we will pilot feasibility and acceptability of Consumer Voice and its preliminary impact on implementation and clinical outcomes by implementing Safety Planning Intervention
We will use the Consumer Voice toolkit to conduct engagement meetings, events and interactions with rural veterans who have experienced suicidal thoughts or behaviour, and their families, in selecting and tailoring implementation strategies for the Safety Planning Intervention. During and after these interactions, we will conduct a mixed-methods process evaluation of the Consumer Voice toolkit and process. We will conduct a pilot study using an effectiveness-implementation hybrid type 2 design comparing Implementation Facilitation only with Implementation Facilitation plus Consumer Voice on implementation and clinical outcomes.48
Sites
Within our VHA regional system, one community-based outpatient clinic (referred to as ‘clinic’) will be the ‘standard care’ clinic at which Implementation Facilitation alone is used; the second clinic will be the ‘implementation site’ at which Implementation Facilitation plus Consumer Voice is used (table 3). We randomly assigned each site’s implementation condition. The sites are matched on clinic size and percentage of veterans defined as rural. One possible challenge is that sites might drop out of the study. If a site is unable to participate, mental health leaders at our VHA facility have identified alternate sites for this study.
Table 3.
Phase | 0–4 months: planning | 5–8 months: pre-implementation | 9–12 months: implementation | 13–18 months: sustainability | 19–22 months: observation |
Anticipated implementation activities at clinics | Facilitator becomes familiar with updates to Safety Planning Intervention rollout, consults with local and national leadership, assesses implementation barriers and facilitators | Facilitator visits site and works collaboratively with stakeholders to adapt and complete an implementation checklist for planning. Select and tailor strategies to prepare to implement the Safety Planning Intervention | The Safety Planning Intervention is implemented according to implementation plan using strategies | Continued Safety Planning Intervention implementation and monitoring. Facilitator assists stakeholders in completing a written Sustainability Action Plan adapted to their clinic | The Safety Planning Intervention continues with natural implementation without facilitator involvement |
Data collection | Collect feasibility and acceptability data | Collect feasibility and acceptability data | Collect feasibility and acceptability data | Month 13: collect data on reach, effectiveness, adoption and implementation | Re-collect data on reach, effectiveness, adoption and implementation |
Timeline for Safety Planning Intervention implementation and data collection
Implementation will occur in four phases: planning, pre-implementation, implementation and sustainability. Although these time periods are short compared with larger trials, they will allow sufficient time to determine feasibility and acceptability of Consumer Voice (table 3).
Implementation strategies across implementation phases
There will be one facilitator, who will use conventional Implementation Facilitation at the standard care clinic and Implementation Facilitation plus Consumer Voice at the implementation clinic. Implementation Facilitation and Consumer Voice will occur on the same timeline, although we anticipate additional or different activities at the implementation clinic, where Consumer Voice is used in conjunction with Implementation Facilitation. The facilitator will track their weekly time and activities related to Implementation Facilitation using pre-established tracking logs,49 and will record key Implementation Facilitation events using a pre-established checklist from our preliminary work.50 The facilitator will use these logs to document whether clinics receive the same amount and type of Implementation Facilitation activities.51
One anticipated challenge is that Consumer Voice participants may drop out over the course of designing for implementation. In preparation, we will track retention as an outcome for the process evaluation (see table 4). If dropout occurs, we will apply similar techniques as in original recruitment, identify new participants and spend 1–2 hours orienting them to consumer engagement and implementation, the development of Consumer Voice, and study progress to date.
Table 4.
Key feasibility questions | Construct | Measure |
Is recruitment possible for consumer engagement participation? Are the eligibility criteria to participate too strict? Is recruitment reaching rural veterans at risk for suicide and their families and community members? | Recruitment capability and sample | |
Recruitment rate to engage consumers in implementation | # of consumers who attended one event, meeting, or interaction out of consumers approached* | |
Eligibility criteria of consumers | Reasons for missed engagement* | |
Sample characteristics of consumers | Demographics of consumers engaged: age, war era, race, gender, income, rural/urban residence, mental health condition(s)† | |
How appropriate are Consumer Voice toolkit and consumer engagement interactions for the intended population and purpose of implementation? | Data collection procedures and outcome measures | |
Completion of consumer engagement events, meetings or interactions | Complete measures, interviews or meetings†; length of time to complete measures, interviews or meetings† |
Consumer Voice materials are at suitable reading level | Rating from Flesch Reading Ease† | |
Usefulness of Consumer Voice toolkit | Investigator-created Likert scale items administered to independent implementers and facilitator in this study† | |
Does the research team have resources and ability to manage consumer engagement participation? | Resources to manage and execute consumer engagement | |
Ability to manage consumer engagement meetings, events or interactions | % scheduled interactions successfully completed by facilitator† | |
Adequate resources | % interactions impeded by lack of space, technology, funding, staff† | |
Facilitator skills related to consumer engagement or ethical issues | # and type of consultations needed to execute methods* |
*Qualitative data collection.
†Quantitative data collection.
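Table 4 proposes rating the reading level of Consumer Voice materials with the Flesch Reading Ease score. As an illustration of how that score is computed (206.835 − 1.015 × words/sentence − 84.6 × syllables/word), the sketch below uses a rough vowel-group heuristic of our own for syllable counting; in practice a dedicated readability tool would give more accurate counts:

```python
# Illustrative sketch of the Flesch Reading Ease score named in table 4.
# The syllable counter is a crude heuristic (our assumption), so scores
# are approximate; higher scores indicate easier-to-read text.
import re

def count_syllables(word):
    # Count groups of consecutive vowels; drop a common silent trailing 'e'.
    word = word.lower()
    if word.endswith("e") and not word.endswith(("le", "ee")):
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))
```

Short sentences of short words score high (roughly 90+ is very easy reading), so toolkit drafts scoring low could be flagged for plain-language revision before use with consumers.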
Mixed-methods process evaluation of Consumer Voice
Because the end-users of Consumer Voice will be implementers, who will use Consumer Voice methods to engage consumers, we need to assess feasibility and acceptability of the Consumer Voice toolkit and methods with implementers as well as consumers. We will conduct a mixed-methods process evaluation of Consumer Voice.52 We will use a qualitative+QUANTITATIVE design; data will be collected simultaneously and greater weight will be given to quantitative measures.35 53
Procedures
The function of these mixed-methods data will be convergence, which involves integrating them to answer the same question: is Consumer Voice feasible and acceptable to all stakeholders?35 To assess feasibility, we will administer brief surveys at consumer engagement events to all consumers and healthcare professionals and use logs to track data in real time during these interactions. To assess acceptability, we will use (1) the same surveys and logs used for feasibility data collection to assess retention and physical safety, and (2) brief qualitative interviews with consumers and healthcare professionals to assess burden and satisfaction. We will also attempt to interview consumers who responded to initial recruitment but did not attend, or who dropped out, about their reasons for non-attendance. This strategy strengthens external validity by ensuring broader variability in the data.54
Measures
We will assess feasibility outcomes suggested by Orsmond and Cohn,55 as seen in table 4. Acceptability outcomes were designed based on recommendations from Proctor and colleagues, as seen in table 5.56 We will again administer Weiner's three questionnaires43 (four questions each) assessing the feasibility, acceptability and appropriateness of Consumer Voice, as used in Aim 1.
Table 5.
Key acceptability questions | Construct | Measure |
Were consumers engaged enough to continue attending consumer engagement meetings? | Retention | Original participants attend 66% of consumer engagement meetings, events or interactions* |
Do consumers feel burden of consumer engagement in implementation is reasonable? | Burden | Risk/benefit of burden is such that consumer would attend a meeting or event again† |
Are consumers satisfied with consumer engagement meetings? | Satisfaction | Consumer would recommend participation to another consumer† |
Are consumers safe while participating in consumer engagement meetings? | Safety | # of adverse events reported to IRB* |
We will deploy Consumer Voice to guide consumer engagement meetings.
*Quantitative data collection.
†Qualitative data collection.
Analysis
To integrate data, we will merge information from the quantitative and qualitative datasets.35 We will use descriptive statistics to analyse quantitative data. Qualitative data from surveys and interviews will be extracted into summary templates aligned with the Health Equity Implementation Framework.27 The coding team will analyse data using a blend of inductive and deductive approaches through the Rapid Assessment Process described in Aim 1.38 39 As one way to triangulate data to answer questions about the acceptability and feasibility of Consumer Voice, some qualitative categories can be quantified (eg, 0=not satisfied, 1=somewhat satisfied) and converged with quantitative data. Another way to triangulate will be for the mixed-methods analytic team to meet together to present, review, discuss and integrate findings from quantitative and qualitative data.
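As one illustration of the quantitising step described above, coded qualitative satisfaction categories can be mapped to ordinal values and compared against quantitative survey scores on the same scale. This sketch uses hypothetical category labels, codes and a 0.5-point convergence threshold, none of which are the study's actual codebook:

```python
from statistics import mean

# Hypothetical ordinal coding of qualitative satisfaction categories
# (labels and values are illustrative, not the study's actual codebook).
CATEGORY_CODES = {"not satisfied": 0, "somewhat satisfied": 1, "satisfied": 2}

interview_codes = ["satisfied", "somewhat satisfied", "satisfied", "not satisfied"]
survey_scores = [2, 1, 2, 2, 1]  # same 0-2 scale from quantitative surveys

quantitized = [CATEGORY_CODES[c] for c in interview_codes]

# Convergence check: do the two data sources point to a similar satisfaction level?
qual_mean, quant_mean = mean(quantitized), mean(survey_scores)
converges = abs(qual_mean - quant_mean) <= 0.5
print(f"qualitative mean={qual_mean:.2f}, quantitative mean={quant_mean:.2f}, converge={converges}")
```

In the actual analysis, the analytic team's joint review of findings, rather than a fixed numeric threshold, would judge whether the datasets converge.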
Assessing preliminary impact of Consumer Voice
As part of the pilot, we will also assess implementation outcomes of reach, adoption and fidelity to Safety Planning Intervention and clinical outcomes of patient depression, suicidal ideation and suicidal behaviour. This pilot will not have enough statistical power to detect a conclusive effect of Consumer Voice on implementation or clinical outcomes. The pilot study will allow us to obtain SD estimates of clinical outcomes for sample size determination of future trials.
Measures
To evaluate preliminary implementation and clinical outcomes, we will use Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM)57 as a framework. We will collect these data from both clinics during month 13, after the implementation phase. Reach is defined by Safety Planning Intervention being used with the targeted patient population (ie, rural veterans with suicidal ideation or behaviour). Effectiveness is conceptualised as whether veterans' depression symptoms, suicidal ideation and suicidal behaviour are different because of exposure to Safety Planning Intervention. Adoption is conceptualised as Safety Planning Intervention uptake by providers in primary care and specialty mental healthcare roles at each clinic. Implementation is conceptualised as high fidelity to the Safety Planning Intervention. We will randomly select 30% of rural veterans exposed to Safety Planning Intervention from both clinics to assess implementation fidelity. Using this sample, we will conduct chart reviews of the safety plans created in the medical record to assess the number of Safety Planning Intervention steps completed; a complete safety plan involves six steps. Table 6 lists planned outcomes and the sources from which we will collect data to evaluate them.
Table 6.
RE-AIM measure (Population) | Operational definition | Data source |
Reach (Veterans) | % of rural patients with a safety plan documented in the electronic health record; demographics of patients reached (age, gender, race, ethnicity) | VHA Administrative Data |
Effectiveness (Patients at either site receiving the Safety Planning Intervention) | Change in depression symptoms, aggregated by site; change in number of self-directed violent behaviours | VHA Administrative Data (Patient Health Questionnaire-9) |
Adoption (Providers) | % of providers who complete a safety plan with a patient out of total providers at clinics; # of Safety Planning Intervention safety plans completed by each provider (including providers who did not complete any) | VHA Administrative Data |
Implementation (Sites) | # of Safety Planning Intervention safety plans completed 100% (6 of 6 steps completed equals optimal fidelity) | Chart review of a random 30% of veterans exposed to Safety Planning Intervention |
Maintenance (Sustainability) | Repeat reach, effectiveness, adoption and implementation measures at 24 months | VHA Administrative Data; chart review |
RE-AIM, Reach, Effectiveness, Adoption, Implementation, Maintenance; VHA, Veterans Health Administration.
One possible challenge with the adoption measure is that overall adoption may be very low; if so, we may need to increase the percentage of chart reviews to identify differences in adoption between sites. One limitation of the implementation measure is that it is a basic fidelity assessment that does not capture the quality of completed Safety Planning Intervention safety plans. If a fidelity measurement focused on quality is needed, we will use rating tools created by the VHA Safety Planning Intervention training group.
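The fidelity assessment described above (a random 30% chart-review sample scored against the six Safety Planning Intervention steps) can be sketched as follows; the chart data, clinic size and random seed are all hypothetical:

```python
import random

SAFETY_PLAN_STEPS = 6  # a complete Safety Planning Intervention safety plan has six steps
random.seed(2024)  # fixed seed so this hypothetical example is reproducible

def fidelity_pct(steps_completed: int) -> float:
    """Per-chart fidelity: documented steps out of six, as a percentage."""
    return 100.0 * steps_completed / SAFETY_PLAN_STEPS

# Hypothetical chart-review data: patient id -> steps documented in the record.
charts = {pid: random.randint(0, SAFETY_PLAN_STEPS) for pid in range(1, 101)}

# Random 30% sample of veterans exposed to the intervention.
sample_ids = random.sample(sorted(charts), k=round(0.30 * len(charts)))

scores = [fidelity_pct(charts[pid]) for pid in sample_ids]
optimal = sum(s == 100.0 for s in scores)  # plans with all 6 of 6 steps documented
print(f"sampled {len(sample_ids)} charts; mean fidelity {sum(scores) / len(scores):.1f}%; "
      f"{optimal} plans at optimal fidelity")
```

A quality-focused fidelity measure would replace the simple step count with ratings from the VHA training group's tools.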
Patient sample
To assess clinical effectiveness outcomes, we will analyse a sample of rural VHA patients within both clinics. We will include patients who screen positive on a suicidal ideation question at primary care appointments (ie, Patient Health Questionnaire-2 plus Item 9 (suicidal ideation)). Because data on patient clinical effectiveness will be extracted directly from VHA administrative data, patients sampled in each clinic will represent a convenience sample (vs a random sample); therefore, there is no predetermined sample size.
Analysis
To assess the preliminary impact of Consumer Voice on the implementation outcomes of reach, adoption and implementation fidelity, we will calculate descriptive statistics. We will not compute effect sizes from this pilot study due to concerns about inflation of type I and type II errors in small sample sizes.55 58 We will describe variance in the outcomes detailed in table 6, including CIs around each point estimate (eg, mean, SD).58 59
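The descriptive approach above (point estimates with CIs rather than effect sizes) might look like the following; the per-provider counts are hypothetical, and the interval uses a normal approximation, whereas a pilot analysis would likely use a slightly wider t-based interval:

```python
from math import sqrt
from statistics import mean, stdev

def normal_ci(data, z=1.96):
    """Mean and SD with a normal-approximation 95% CI around the mean."""
    m, sd, n = mean(data), stdev(data), len(data)
    half = z * sd / sqrt(n)
    return m, sd, (m - half, m + half)

# Hypothetical per-provider counts of completed safety plans at one site.
plans_per_provider = [0, 1, 1, 2, 3, 3, 4, 6]
m, sd, (lo, hi) = normal_ci(plans_per_provider)
print(f"mean={m:.2f}, SD={sd:.2f}, 95% CI=({lo:.2f}, {hi:.2f})")
```

Reporting the interval alongside the point estimate conveys the uncertainty of the pilot without implying a hypothesis test.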
To evaluate the hypothesis that Consumer Voice will improve patient clinical outcomes, we will conduct inferential statistics. We will conduct an analysis of covariance comparing differences in patient outcomes between the clinic receiving standard Implementation Facilitation and the clinic receiving Implementation Facilitation plus Consumer Voice at month 13, while controlling for each site's baseline level of depression during the 4-month planning period. Independent variables will be implementation assignment and time. The dependent variables will be depression symptoms, suicidal ideation and self-directed violence.
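The covariance-adjusted comparison above can be illustrated with a small ordinary-least-squares sketch in which a site indicator and the baseline depression score predict the month-13 outcome; in practice this analysis would be run in a statistical package with appropriate diagnostics. All data here are synthetic and noise-free, so the adjusted site effect is recovered exactly:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def ancova(group, baseline, outcome):
    """OLS fit of outcome = b0 + b1*group + b2*baseline via the normal equations.
    b1 is the between-site difference adjusted for baseline."""
    X = [[1.0, g, b] for g, b in zip(group, baseline)]
    XtX = [[sum(x[i] * x[j] for x in X) for j in range(3)] for i in range(3)]
    Xty = [sum(x[i] * y for x, y in zip(X, outcome)) for i in range(3)]
    return solve3(XtX, Xty)

# Synthetic noise-free data: the Consumer Voice site (group=1) improves the
# month-13 depression score by 2 points after adjusting for baseline.
group = [0, 0, 0, 0, 1, 1, 1, 1]
baseline = [10, 14, 18, 22, 9, 13, 19, 23]
outcome = [10 + 0.8 * b - 2 * g for g, b in zip(group, baseline)]
b0, b1, b2 = ancova(group, baseline, outcome)
print(f"adjusted site effect = {b1:.2f}")
```

Because the pilot is underpowered, the point estimate of the adjusted site effect would inform planning of a fully powered trial rather than support a conclusive inference.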
Aim 3: Evaluate sustainability of Safety Planning Intervention
One metric of the impact of Consumer Voice is how well Safety Planning Intervention is sustained in a clinic that used Implementation Facilitation only compared with a clinic that received Implementation Facilitation plus Consumer Voice.
Design
Therefore, we will use mixed-methods (QUANTITATIVE+qualitative)35 to compare the two clinics on: (1) repeated implementation and clinical outcomes at months 19–22 (observation period) and (2) barriers and facilitators to Safety Planning Intervention sustainment. Quantitative data collection to assess implementation and clinical outcomes will precede qualitative data collection by 1 month to document the fidelity of Safety Planning Intervention at a later timepoint. Qualitative data will be used to assess stakeholder perceptions of Safety Planning Intervention sustainability barriers and facilitators.
Measures and analysis of quantitative data
For maintenance (sustainability), we will again measure outcomes guided by the RE-AIM framework for reach, effectiveness, adoption and implementation in month 19, as reviewed in table 6. For analysis, we will repeat the analyses described in Aim 2.
Sampling and recruitment for qualitative interviews
We will interview again a subset of the stakeholders listed in table 1. The purpose of the interviews will be to assess barriers to and facilitators of Safety Planning Intervention sustainment at each clinic, to document differences between the contexts of the standard care clinic (Implementation Facilitation only) and the implementation clinic (Implementation Facilitation plus Consumer Voice). Participants will be sampled purposively, by selecting those who were most informative during the stakeholder interviews and nominal group technique in Aim 1 and implementation in Aim 2, and those who presented 'negative cases' in Aim 2 (ie, preliminary results that did not fit with the majority of information used to implement Safety Planning Intervention).54 The interview guide will be semi-structured, with questions aligned to the Health Equity Implementation Framework.
Analysis of qualitative data
We will use a blended inductive-deductive analysis40 through the Rapid Assessment Process described in Aim 1.39 Initially, the analysis will be deductive and focused on these specific questions: How has Safety Planning Intervention been sustained? Which implementation strategies contributed to its sustainment? How has consumer involvement affected Safety Planning Intervention sustainment? We anticipate these interviews may elucidate potential mechanisms of change that we would investigate in a subsequent, fully powered trial of Consumer Voice.
We will also compare Sustainability Action Plans completed for each site in the sustainability phase, including any updates to the plans. Specifically, we will code for three criteria within each Sustainability Action Plan: (1) communication between consumers and clinic, (2) consumers involved in developing or reviewing the plan and (3) consumers being sampled for some metric of sustainability (eg, consumer satisfaction, use of Safety Planning Intervention). These criteria were informed by a consumer partnering subscale of a reliable, quantitative sustainability measure.60
Ethics and dissemination
Ethics
A major innovation of this study is the integration of a participatory research approach with implementation science; this novel combination has potential ethical pitfalls. Participatory research and implementation science come from distinct research traditions: although they share goals, they differ in their ethical approaches.61
Regarding motivation, both approaches aim to improve society. Participatory research is geared more towards creating social change and building capacity among users; implementation science is geared more towards applying knowledge to help users, although not explicitly towards building their capacity. Regarding social location, both approaches want knowledge users involved in the healthcare system. Participatory research is rooted in grassroots, user-led action (equalising power differentials), whereas implementation science is rooted in decisions made by healthcare professionals; its main aim is not to equalise power between researchers and users, although this may occur. Both approaches propose that users should be engaged in an ethical manner, although what counts as ethical is sometimes defined by consumers in participatory research but by researchers in implementation science.
As we conduct this work, we will have to recruit and retain multiple stakeholders and pay careful attention to consumers' input and the processes used, to create an implementation strategy and toolkit that truly exemplifies the strengths of both traditions. We will need to work collaboratively, inclusively and with respect for people living in rural communities, using best practices suggested by experienced community-engaged researchers, such as using a variety of participation strategies, allowing extra time for building trust, being a regular presence in the community, and including local customs in interventions or implementation.62
Dissemination
This pilot focuses narrowly on Safety Planning Intervention implementation, and Consumer Voice will require adaptation for other evidence-based practices. Although we are collecting preliminary data on implementation and patient outcomes, we will be unable to draw strong conclusions about these research questions.
We plan to use traditional academic modalities of dissemination, including conference presentations and journal publications. We also plan to disseminate findings through meetings with other trainers and teachers in implementation practice so they may adapt or adopt Consumer Voice to meet their needs. Although VHA has no publicly available data repositories, we will make data from our studies available on request.
Supplementary Material
Acknowledgments
Thank you to the Veterans Research Council at the VA Center for Mental Healthcare and Outcomes Research, along with Veteran Service Officers in the state of Arkansas, for their collective feedback on elements of this research design. ENW is a fellow with the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis; through an award from the National Institute of Mental Health (5R25MH08091607).
Footnotes
Contributors: ENW conceptualised the study and manuscript, and prepared all written materials, tables and figures. CW helped design methods and integration of mixed methods and edited the manuscript. SJL helped conceptualise the involvement of implementing Safety Planning Intervention and edited the manuscript. LRMH refined the study design and methods and edited the manuscript. KLD helped design analytic plans for qualitative components and edited the manuscript. SO designed analytic plans for quantitative components and edited the manuscript. IAB helped refine methods for Aim 1 (Developing Consumer Voice), edited the manuscript and developed supporting documentation, such as declarations, abbreviations, references and supplemental files. JEK helped develop conceptualisations of the study and manuscript and edited the manuscript.
Funding: This work was supported by Career Development Award Number IK2 HX003065 from the US Department of Veterans Affairs Health Services Research and Development (HSRD) Service (ENW).
Disclaimer: The views expressed in this article are those of the author and do not necessarily represent the views of the US Department of Veterans Affairs.
Competing interests: None declared.
Patient and public involvement: Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.
Provenance and peer review: Not commissioned; externally peer reviewed.
Ethics statements
Patient consent for publication
Not applicable.
References
- 1.Kilbourne AM, Switzer G, Hyman K, et al. Advancing health disparities research within the health care system: a conceptual framework. Am J Public Health 2006;96:2113–21. 10.2105/AJPH.2005.077628
- 2.Atkins D, Kilbourne A, Lipson L. Health equity research in the Veterans Health Administration: we've come far but aren't there yet. Am J Public Health 2014;104:S525–6. 10.2105/AJPH.2014.302216
- 3.VA Office of Health Equity. National veteran health equity report—FY2013. Washington, DC: US Department of Veterans Affairs, 2016. http://www.va.gov/healthequity
- 4.Yancey A, Glenn BA, Ford CL. Dissemination and implementation research among racial/ethnic minority and other vulnerable populations. In: Dissemination and implementation research in health: translating science into practice. 2nd edn. New York, NY: Oxford University Press, 2018: 449–70.
- 5.Baker R, Camosso-Stefinovic J, Gillies C, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev 2015;308:CD005470. 10.1002/14651858.CD005470.pub3
- 6.Waltz TJ, Powell BJ, Fernández ME, et al. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci 2019;14:42. 10.1186/s13012-019-0892-4
- 7.Baumann AA, Cabassa LJ, Stirman SW. Adaptation in dissemination and implementation science. In: Dissemination and implementation research in health. 2nd edn. New York, NY: Oxford University Press, 2017: 285–300.
- 8.Lion KC, Raphael JL. Partnering health disparities research with quality improvement science in pediatrics. Pediatrics 2015;135:354–61. 10.1542/peds.2014-2982
- 9.Tiwari T, Sharma T, Harper M, et al. Community based participatory research to reduce oral health disparities in American Indian children. J Fam Med 2015;2.
- 10.Fettes D, Haine-Schlagel R. Mixed methods adaptation and pilot testing of a toolkit to enhance parent participation in home visitation programs. 32nd annual San Diego conference on child and family maltreatment. San Diego, CA, 2018.
- 11.Patzer RE, Paul S, Plantinga L, et al. A randomized trial to reduce disparities in referral for transplant evaluation. J Am Soc Nephrol 2017;28:935–42. 10.1681/ASN.2016030320
- 12.Agency for Healthcare Research and Quality. AHRQ activities using community-based participatory research to address health care disparities [online]. Rockville, MD, 2014. http://www.ahrq.gov/research/findings/factsheets/minority/cbprbrief/index.html
- 13.Holt CL, Chambers DA. Opportunities and challenges in conducting community-engaged dissemination/implementation research. Transl Behav Med 2017;7:389–92. 10.1007/s13142-017-0520-2
- 14.Wells KB, Jones L, Chung B, et al. Community-partnered cluster-randomized comparative effectiveness trial of community engagement and planning or resources for services to address depression disparities. J Gen Intern Med 2013;28:1268–78. 10.1007/s11606-013-2484-3
- 15.Cabassa LJ. The role of implementation science in reducing healthcare disparities. Implementation Research Institute. St. Louis: Washington University, 2018.
- 16.Wells KB, Jones L, Chung B, et al. Community-partnered cluster-randomized comparative effectiveness trial of community engagement and planning or resources for services to address depression disparities. J Gen Intern Med 2013;28:1268–78. 10.1007/s11606-013-2484-3
- 17.Anderson LM, Adeney KL, Shinn C, et al. Community coalition-driven interventions to reduce health disparities among racial and ethnic minority populations. Cochrane Database Syst Rev 2015;19:CD009905. 10.1002/14651858.CD009905.pub2
- 18.Kirchner JE, Ritchie MJ, Pitcock JA, et al. Outcomes of a partnered facilitation strategy to implement primary care-mental health. J Gen Intern Med 2014;29:904–12. 10.1007/s11606-014-3027-2
- 19.Kilbourne AM, Almirall D, Goodrich DE, et al. Enhancing outreach for persons with serious mental illness: 12-month results from a cluster randomized trial of an adaptive implementation strategy. Implement Sci 2014;9:163. 10.1186/s13012-014-0163-3
- 20.Dickinson WP, Dickinson LM, Nutting PA, et al. Practice facilitation to improve diabetes care in primary care: a report from the EPIC randomized clinical trial. Ann Fam Med 2014;12:8–16. 10.1370/afm.1591
- 21.Ritchie MJ, Dollar KM, Miller C. Using implementation facilitation to improve healthcare: implementation facilitation training manual (version 3) [online]. Veterans Health Administration, Behavioral Health Quality Enhancement Research Initiative (QUERI), 2020. https://www.queri.research.va.gov/tools/implementation.cfm
- 22.Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci 2015;10:21. 10.1186/s13012-015-0209-1
- 23.US Department of Veterans Affairs. VA national suicide data report 2005–2016 [online], 2018. https://www.mentalhealth.va.gov/suicide_prevention/Suicide-Prevention-Data.asp
- 24.McCarthy JF, Blow FC, Ignacio RV, et al. Suicide among patients in the Veterans Affairs health system: rural-urban differences in rates, risks, and methods. Am J Public Health 2012;102:S111–7. 10.2105/AJPH.2011.300463
- 25.Stanley B, Brown GK, Brenner LA, et al. Comparison of the safety planning intervention with follow-up vs usual care of suicidal patients treated in the emergency department. JAMA Psychiatry 2018;75:894. 10.1001/jamapsychiatry.2018.1776
- 26.Stanley B, Chaudhury SR, Chesin M, et al. An emergency department intervention and follow-up to reduce suicide risk in the VA: acceptability and effectiveness. Psychiatr Serv 2016;67:680–3. 10.1176/appi.ps.201500082
- 27.Woodward EN, Matthieu MM, Uchendu US, et al. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implement Sci 2019;14:26. 10.1186/s13012-019-0861-y
- 28.Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci 2015;10:53. 10.1186/s13012-015-0242-0
- 29.Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci 2015;11. 10.1186/s13012-016-0398-2
- 30.Curran GM. Implementation science made too simple: a teaching tool. Implement Sci Commun 2020;1:27. 10.1186/s43058-020-00001-z
- 31.Betancourt JR, Green AR, Carrillo JE, et al. Defining cultural competence: a practical framework for addressing racial/ethnic disparities in health and health care. Public Health Rep 2003;118:293–302. 10.1016/S0033-3549(04)50253-4
- 32.National Research Council, Institute of Medicine. Policies and social values. In: Woolf SH, Aron L, eds. US health in international perspective: shorter lives, poorer health [online]. Washington, DC: National Academies Press, 2013.
- 33.Thomson K, Hillier-Brown F, Todd A, et al. The effects of public health policies on health inequalities in high-income countries: an umbrella review. BMC Public Health 2018;18:869. 10.1186/s12889-018-5677-1
- 34.Melgar Castillo A, Woodward EN, True G. Examples and challenges of engaging consumers in implementation science activities: an environmental scan. Oral symposium presented at: 13th annual conference on the science of dissemination and implementation. Washington, DC, 2020.
- 35.Palinkas LA, Aarons GA, Horwitz S, et al. Mixed method designs in implementation research. Adm Policy Ment Health 2011;38:44–53. 10.1007/s10488-010-0314-z
- 36.Dopp AR, Parisi KE, Munson SA, et al. A glossary of user-centered design strategies for implementation experts. Transl Behav Med 2019;9:1057–64. 10.1093/tbm/iby119
- 37.Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods 2006;18:59–82.
- 38.Beebe J. Rapid assessment process: an introduction. Walnut Creek, CA: AltaMira Press, 2001.
- 39.Hamilton A. Qualitative methods in rapid turn-around health services research. HSRD Cyberseminar. US Department of Veterans Affairs, 2013.
- 40.Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005;15:1277–88. 10.1177/1049732305276687
- 41.Helmer-Hirschberg O. Analysis of the future: the Delphi method [online]. Santa Monica, CA: RAND Corporation, 1967. https://www.rand.org/pubs/papers/P3558.html
- 42.Domlyn AM, Wandersman A. Community coalition readiness for implementing something new: using a Delphi methodology. J Community Psychol 2019;47:882–97. 10.1002/jcop.22161
- 43.Weiner BJ, Lewis CC, Stanick C, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci 2017;12:108. 10.1186/s13012-017-0635-3
- 44.Rankin NM, McGregor D, Butow PN, et al. Adapting the nominal group technique for priority setting of evidence-practice gaps in implementation science. BMC Med Res Methodol 2016;16:110. 10.1186/s12874-016-0210-7
- 45.Van de Ven AH, Delbecq AL. The nominal group as a research instrument for exploratory health studies. Am J Public Health 1972;62:337–42. 10.2105/AJPH.62.3.337
- 46.Carman KL, Dardess P, Maurer M, et al. Patient and family engagement: a framework for understanding the elements and developing interventions and policies. Health Aff 2013;32:223–31. 10.1377/hlthaff.2012.1133
- 47.Moffatt S, White M, Mackintosh J, et al. Using quantitative and qualitative data in health services research - what happens when mixed method findings conflict? [ISRCTN61522618]. BMC Health Serv Res 2006;6:28. 10.1186/1472-6963-6-28
- 48.Curran GM, Bauer M, Mittman B, et al. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care 2012;50:217–26. 10.1097/MLR.0b013e3182408812
- 49.Ritchie MJ, Liu CF, Townsend JC. Time and cost of "extreme" implementation facilitation to address challenging clinical contexts. 4th biennial conference of the Society for Implementation Research Collaboration. Seattle, WA, 2017.
- 50.Woodward EN, Pitcock JA, Ritchie M. Implementing integrated primary care in late adopter sites: the impact of key events on repeated measures of adoption. 9th annual conference on the science of dissemination and implementation in health. Washington, DC, 2016.
- 51.Thabane L, Ma J, Chu R, et al. A tutorial on pilot studies: the what, why and how. BMC Med Res Methodol 2010;10:1. 10.1186/1471-2288-10-1
- 52.Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol 2015;3:32. 10.1186/s40359-015-0089-9
- 53.Creswell JW, Klassen AC, Plano Clark VL. Best practices for mixed methods research in the health sciences [online]. Washington, DC: Office of Behavioral and Social Sciences Research, National Institutes of Health, 2011. https://obssr.od.nih.gov/training/online-training-resources/mixed-methods-research/
- 54.Morse JM. Critical analysis of strategies for determining rigor in qualitative inquiry. Qual Health Res 2015;25:1212–22. 10.1177/1049732315588501
- 55.Orsmond GI, Cohn ES. The distinctive features of a feasibility study: objectives and guiding questions. OTJR 2015;35:169–77. 10.1177/1539449215578649
- 56.Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health 2011;38:65–76. 10.1007/s10488-010-0319-7
- 57.Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 1999;89:1322–7. 10.2105/AJPH.89.9.1322
- 58.Leon AC, Davis LL, Kraemer HC. The role and interpretation of pilot studies in clinical research. J Psychiatr Res 2011;45:626–9. 10.1016/j.jpsychires.2010.10.008
- 59.Dobkin BH. Progressive staging of pilot studies to improve phase III trials for motor interventions. Neurorehabil Neural Repair 2009;23:197–206. 10.1177/1545968309331863
- 60.Luke DA, Calhoun A, Robichaux CB, et al. The program sustainability assessment tool: a new instrument for public health programs. Prev Chronic Dis 2014;11:130184. 10.5888/pcd11.130184
- 61.Jull J, Giles A, Graham ID. Community-based participatory research and integrated knowledge translation: advancing the co-creation of knowledge. Implement Sci 2017;12:150. 10.1186/s13012-017-0696-3
- 62.Grunbaum J. Challenges in improving community engagement in research. In: Principles of community engagement. 2nd edn. National Institutes of Health, 2011.