Abstract
Research has produced a steady stream of evidence-based practices (EBP) that can promote youth behavioral health, but widespread implementation is often poor. To narrow the “science to practice gap,” an implementation strategy was developed to enhance school-based mental health providers’ intentions to implement EBP. The current study adopted a user-testing approach to inform the iterative development of this implementation strategy, which consisted of strategic education, social influence techniques, and motivational interviewing. Segments of the implementation strategy were demonstrated live for a representative sample of the intended audience. Participants rated each segment on acceptability, appropriateness, and likely impact on intentions to use EBP. Ratings were tallied in real time and presented to participants to spur discussion. Qualitative input was coded via conventional content analysis. Results indicated that implementation strategies may need to be tailored to the specific EBP. In addition, implementation goal setting was well received by some participants but not others, highlighting the difficulty of conducting motivational interviewing in group settings. Participants also perceived themselves as familiar with EBP and as strong advocates of school mental health services. The paper concludes with a discussion of how this research has influenced the ongoing development of the strategy and implications for EBP implementation efforts.
Keywords: implementation strategy, school mental health providers, user testing, iterative development
In the United States, one in five children experience mental health problems that impair their academic performance and interpersonal relationships, and increase the risk for negative outcomes in adulthood (Perou et al., 2013). Fortunately, a number of mental health interventions have been developed over the past several decades that, when implemented with fidelity, are effective in reducing symptoms and improving quality of life (Chorpita et al., 2011). Schools provide an ideal setting for mental health services. School mental health (SMH) services can be provided individually or in small groups by school-employed mental health staff (e.g., school social workers) or by contracted mental health agencies co-located in schools. Schools provide a convenient access point for services, reducing barriers to treatment that plague traditional outpatient settings, including transportation, stigma, lack of insurance coverage, and other issues related to cost (Juszczak et al., 2003). SMH services have been expanding, to the point where schools now represent the largest provider of mental health care to youth (Burns et al., 1995; Leaf et al., 1996; Merikangas et al., 2011). Despite the surge of interest in SMH, a significant gap between typical and optimal care remains, and most youths do not receive high-quality, evidence-based care (Owens et al., 2014). Research indicates that SMH services are frequently not based on evidence for effectiveness (Owens et al., 2014). When evidence-based practices (EBP) are implemented, adoption is often uneven and fidelity is typically insufficient to produce the intended therapeutic effect (Ennett et al., 2003; Evans & Weist, 2004; Odom et al., 1995; Owens et al., 2014).
Much attention has been devoted to understanding this research-to-practice gap. This previous work has identified a number of barriers to high-quality services in schools at the organizational and individual level. At the organizational level, these barriers include lack of funding, poor leadership, high staff turnover, and inadequate training (Forman & Barakat, 2011; Forman et al., 2009). At the individual level, care providers are less likely to implement evidence-based practices that they do not perceive as acceptable or appropriate. In addition, competing time demands and unsupportive social norms have been identified as individual-level barriers to implementation of evidence-based practices (Dart et al., 2012; Forman et al., 2012; Gonzalez et al., 2004; Grol & Grimshaw, 2003). Although the importance of attending to and addressing organizational barriers is well-recognized (Aarons et al., 2011; Beidas & Kendall, 2010), some studies indicate that individual factors may be more predictive of implementation fidelity (Locke et al., 2019).
Even with this emergent knowledge base regarding barriers to implementation, few strategies have been developed and tested to specifically address these barriers. Some organizational strategies, such as ongoing training, modifying incentive systems, or improving climate and culture, have been found to yield encouraging results (Aarons et al., 2015; Glisson & Schoenwald, 2005), but they are often time-consuming and expensive. Because implementation ultimately rests on the motivation, decisions, and behavior change of individual practitioners (Michie et al., 2011), it is critical to develop implementation strategies that address specific individual-level barriers to implementation (Powell et al., 2019).
To address this need, we previously developed an implementation strategy called Beliefs and Attitudes for Successful Implementation in Schools (BASIS). BASIS is a group-based, in-person professional development training that targets individual-level factors that influence EBP implementation. Because BASIS targets individual-level factors, it is intended to be a cost-effective “first line” strategy that may be used prior to or in conjunction with more expensive or complex organizational implementation strategies.
BASIS is grounded in the Theory of Planned Behavior (TPB), which has increasingly been applied to the prediction of implementation behaviors (Godin et al., 2008). The central tenet of TPB is that one of the best predictors of behavior is a person’s behavioral intentions (Ajzen, 1991), defined as an individual’s motivation or conscious plan to exhibit a particular behavior. Behavioral intentions, in turn, are a function of an individual’s attitudes (cognitive appraisals or evaluations of the behavior in question), subjective norms (perception of the social pressure to perform the behavior), and perceived behavioral control (confidence in one’s ability to perform the behavior). A growing body of research links attitudes, subjective norms, and perceived behavioral control to implementation-relevant behavior change and effective delivery of EBP (Borntrager et al., 2009; Glanz & Bishop, 2010). Grounded in the TPB (see Figure 1), BASIS aims to increase implementation intentions by shifting provider attitudes, subjective norms, and perceived behavioral control.
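As a hedged illustration of this structure (a standard weighted-additive rendering of the TPB commonly used in the literature, not an equation presented in this paper), behavioral intention can be written as:

```latex
% Weighted-additive form of the Theory of Planned Behavior (illustrative only):
% BI  = behavioral intention
% A_B = attitude toward the behavior
% SN  = subjective norm
% PBC = perceived behavioral control
% w_1..w_3 = empirically estimated weights
BI = w_1 \cdot A_B + w_2 \cdot SN + w_3 \cdot PBC
```

Under this reading, BASIS attempts to shift each of the three right-hand terms so that intentions, and downstream implementation behavior, increase.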
Figure 1.
Initial BASIS components, hypothesized mechanisms of change, and target outcomes.
BASIS Components
BASIS integrates three empirically supported approaches to behavior change (strategic education, social influence techniques, and motivational interviewing) to target the three mechanisms of change outlined by the TPB.
Strategic education.
Attitudes toward a behavior have been shown to arise from a combination of knowledge about the outcomes of behavior and evaluations of the associated consequences (Ajzen & Fishbein, 2005). Strategic education is one of the most common active ingredients of evidence-based interventions (Chorpita et al., 2007). Alone, strategic education has been found to lead to small but meaningful changes in some behaviors (Lukens & McFarlane, 2004).
Social influence techniques.
EBP trainings are most often delivered in group-based settings, but little is known about how to leverage social expectations and norms to facilitate providers’ engagement in training and post-training use of novel practices. BASIS incorporates carefully crafted social proofing messages about social norms (e.g., data or testimonials describing the behavior or attitudes of others), an approach that has been shown to be effective in reducing a variety of problem behaviors (Perkins et al., 1999).
Motivational interviewing.
Motivational Interviewing (MI) is an empirically tested intervention that has been used to improve EBP implementation among teachers and primary care providers (Reinke et al., 2011). It involves empathic, supportive, and person-centered communication strategies to elicit and elaborate change talk among participants (Miller & Rollnick, 2002).
Prior Studies of BASIS
The current version of BASIS was adapted from a preliminary protocol, which was delivered to 1,181 teachers and administrators in 62 schools, along with training in universal behavioral health EBPs. Pre-post surveys showed that the intervention led to significantly more favorable attitudes towards EBP at post-intervention, with a large effect size (Cook et al., 2015). Attitudes, in turn, were associated with two measures of intervention fidelity with moderate effect sizes. Schools that showed the greatest improvements in beliefs and attitudes had the highest quality implementation. However, the original BASIS strategy was lengthy (i.e., two days of training), was intended for teachers rather than clinicians, and was not constructed through a careful iterative process.
The Current Study
The primary aim of the current study was to gather user feedback on a condensed version of BASIS for use with SMH clinicians. A mixed methods demonstration study was conducted with 12 SMH providers. Providers were exposed to components of BASIS, one at a time, and asked to rate each component on acceptability, appropriateness, and likely effectiveness. These ratings were tallied in real time and followed by focus group discussions to delineate explanations for high and low ratings and to solicit recommendations for refinements. Analyses focused on identifying factors that impact clinicians’ perceptions of the acceptability, appropriateness, and likely effectiveness of implementation strategies.
Method
Participants
The user-testing literature consistently indicates that only small samples are required to detect usability problems (Albert & Tullis, 2008; Macefield, 2009). In keeping with these recommendations, participants were recruited via a purposive snowball sampling approach designed to balance experienced and novice users (Macefield, 2009). We recruited participants from two school districts in the Pacific Northwest. Administrative contacts at these districts were asked to nominate potential participants based on the following criteria: spending more than 50% of their time providing SMH services, ability to clearly articulate ideas in a solution-oriented manner, and ability to represent not just their own perspective but those of other SMH clinicians. Seventeen nominations were received, and twelve providers were successfully recruited into the study. Most participants were female (11 of 12) and Caucasian (11 of 12; 1 participant was Hispanic/Latinx). The sample included 6 counselors, 5 psychologists, and 1 social worker. Most were fully embedded mental health practitioners in elementary (n = 7), middle (n = 2), or high schools (n = 3). On average, providers had been in their current role for 10.16 years (SD = 8.19, range = 1–20 years). Participants were recruited from one rural (n = 8) and one urban (n = 4) district. The rural district served primarily Caucasian students (82.0%), with 28.1% of the student body eligible for free and reduced-price lunch. The urban district served primarily students of color (22.5% Caucasian), with 55.7% eligible for free and reduced-price lunch.
Procedures
School district administrators provided names and email addresses of eligible SMH providers, who were then contacted by study staff. An introductory email explained the study and procedures. Those who replied to the introductory email were invited to attend a one-day meeting and asked to sign a consent form prior to the start of the demonstration. The intervention demonstration occurred in one in-person session lasting seven hours. Participants were first provided with an overview of the purpose of the project, the aims of BASIS, and the guiding theoretical framework. BASIS was presented in segments, and information was provided about the proximal target of each segment (e.g., to shift providers’ attitudes about EBP, to enhance self-efficacy in implementing EBP). After each segment, participants provided quantitative ratings of the segment using PollEv, a real-time polling program. After all participants submitted ratings, frequency scores were displayed for participants to spur focus group discussions. These procedures are consistent with a sequential, mixed-methods approach focused on data explanation and elaboration (Palinkas et al., 2011), in which qualitative methods are used to yield insights into quantitative variations. For the purposes of the discussion, participants were split randomly into two focus groups of six participants each. Discussions were videotaped and transcribed for coding.
BASIS Intervention
The BASIS implementation strategy is a group-based, interactive professional development session, lasting approximately three hours. BASIS was delivered by a licensed school psychologist with experience consulting with schools on evidence-based social, emotional and behavioral interventions. BASIS was delivered in six segments, with each segment representing a core intervention component. The segments are summarized in Table 1. Across all segments, MI facilitation principles and testimonials were embedded throughout. For instance, the BASIS protocol embedded open-ended discussion questions designed to elicit change talk. Consistent with MI, the facilitator’s role was to reflect participants’ change talk, affirm their self-efficacy, and draw common themes using summarizing statements. In terms of testimonials, videos and quotes from experts, SMH providers, students, and parents were used to attest to the benefits of EBP, normalize the difficulty of implementing EBP in schools, and share specific strategies for overcoming implementation barriers. Although BASIS is designed to be largely program-agnostic, the current demonstration paired it with the Cognitive Behavioral Intervention for Trauma in Schools (CBITS; Stein et al., 2003). In other words, SMH providers would receive BASIS in conjunction with CBITS training. CBITS is an evidence-based cognitive-behavioral intervention for students who have witnessed or experienced traumatic life events and was developed to be used by SMH clinicians.
Table 1.
Chronological summary of the BASIS intervention
| Order | Title | Description |
|---|---|---|
| 1 | Understanding Student Mental Health, Connection to Education, & Access Gap | • Defining the access gap as the number of students with mental health problems who are unable to access care • Facilitated discussions to evoke participants’ sense of responsibility to close the access gap |
| 2 | Vulnerabilities to Adopting and Implementing Non-Evidence-Based Practices | • Individuals commonly adopt non-evidence-based practices because of cognitive biases (e.g., “doing something is better than nothing.”) |
| 3 | Common Myths about Evidence-Based Practice | • “Fact or fiction” exercise around common myths about EBPs (e.g., “EBPs are inflexible.”) |
| 4 | What Does It Mean When Something is Evidence-Based? | • Standards for scientific evidence • Resources for determining the evidence base of available programs (e.g., Blueprints for Healthy Youth Development registry) • Defining fidelity and dimensions of fidelity |
| 5 | Values-Based Action | • Facilitated values-clarification exercise • Participants set value-congruent goals |
| 6 | Individualized Implementation Planning + Common Implementation Barriers | • Participants work together to anticipate barriers that may arise in implementation and identify strategies to overcome those barriers. • Participants answer standard MI “ruler” questions • Participants develop specific plans for overcoming implementation barriers |
Measures
Quantitative ratings.
For each BASIS segment, participants completed quantitative ratings using select items modified from the Treatment Acceptability Rating Form-Revised (TARF-R; Reimers & Lee, 1991) and the Intervention Rating Profile-15 (Martens et al., 1985). Items assessed acceptability (“This segment is likely to be acceptable and satisfactory to school-based mental health providers”), appropriateness (“This segment would be appropriate for school-based mental health providers and the settings in which they work”), perceived impact on EBP implementation generally (“This segment would effectively increase school-based mental health providers’ intentions to use evidence-based practices”), and perceived impact on CBITS implementation specifically (“This segment would effectively increase school-based mental health providers’ intentions to use CBITS”). All participants completed ratings using a live polling program, and frequencies were displayed for all participants.
Qualitative focus groups.
Semi-structured focus group questions elicited explanations for high/low quantitative ratings (i.e., “Why do you think it was rated this way?”). Additional questions solicited specific recommendations to enhance the segment (i.e., “Which pieces are unnecessary/counterproductive?”; “What suggestions do you have for improving this segment?”). Focus groups also included questions tailored to each segment, which asked specifically about the targeted mechanism (e.g., “Is there anything else that you can think of that would help increase participants’ sense of self efficacy about being able to deliver an EBP?”).
Analytic strategy
Quantitative data were summarized descriptively for integration with qualitative codes within a mixed methods approach (Palinkas et al., 2011). Focus group discussions were video recorded, transcribed, and coded using conventional content analysis (Hsieh & Shannon, 2005) and qualitative coding software (NVivo QSR 10). Three trained coders independently coded two transcripts to identify potential codes. As a group, coders discussed recurring themes and used an integrated approach to develop the coding scheme. An initial codebook was developed, trialed, and revised through discussion over subsequent transcript reviews. Three major iterations of the codebook were trialed prior to arriving at a stable set of codes. Thirty-six percent of the transcripts were randomly selected and coded by all three coders to determine inter-rater reliability. Reliability was based on the number of coded words agreed upon and was excellent (percent agreement = 97.3% on parent codes, range = 94.0–100.0; 99.9% on subcodes, range = 99.9–100.0). This process yielded six codes, summarized in Table 2. Three of these codes pertained to BASIS content: testimonials, activities, and information provided about the EBP. The remaining three codes pertained to the delivery of BASIS content, including comments about visuals (e.g., graphics, diagrams, and images), delivery or structure, and appropriateness for the target audience.
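To make the inter-rater reliability criterion concrete, the sketch below (illustrative Python, not the authors’ analysis code; the code labels and six-word excerpt are hypothetical) computes word-level percent agreement between two coders, the index reported above.

```python
# Illustrative sketch of word-level percent agreement (assumed procedure, not the
# authors' code): each word in a transcript excerpt carries the parent code assigned
# by a coder, and agreement is the share of words given the same code by both coders.
from typing import List

def percent_agreement(coder_a: List[str], coder_b: List[str]) -> float:
    """Percentage of words assigned the same code by both coders."""
    assert len(coder_a) == len(coder_b), "coders must code the same word sequence"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical codes applied to a six-word excerpt by two coders
coder_a = ["Visuals", "Visuals", "Activities", "Activities", "Activities", "Target Audience"]
coder_b = ["Visuals", "Visuals", "Activities", "Activities", "Target Audience", "Target Audience"]
print(f"{percent_agreement(coder_a, coder_b):.1f}% agreement")  # -> 83.3% agreement
```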
Table 2.
Codes, subcodes, and definitions
| Codes and subcodes | Definitions |
|---|---|
| Insufficient EBP Information | Refers to comments regarding the need to include more information about the specific EBP. |
| Visuals | Comments about graphs, diagrams, and/or images presented that were confusing, offensive, or not relevant for the audience. |
| Testimonials | Comments about the testimonial slides and suggestions on how to improve or revise the content to make it more engaging and relevant for the audience. |
| Activities | Refers to any comments about specific activities that participants identified as irrelevant to the segment, not engaging, a poor use of time, etc. |
| Target Audience | Refers to comments about anchoring language and structure of BASIS presentation within the appropriate context across different audiences (e.g., mental health providers, teachers, administrators, etc.). Includes activities or language used that was identified as demotivating, rudimentary, condescending, redundant, etc. for the target audience. |
Results
Table 3 summarizes the quantitative ratings and counts of qualitative codes, by segment. Ratings in bold indicate marked deviation from the mean. For quantitative ratings, we focused on those that were .5 SD below or above the mean of all segments. Because counts of qualitative mentions were more dispersed than quantitative ratings, we focus on counts that were 1 SD below or above the mean. We discuss ratings in combination with the qualitative codes for each segment below. Quantitative data are described first, and qualitative data are used to lend insight into variation in ratings.
Table 3.
Quantitative ratings, and counts of qualitative themes, by segment
| Segment | Acceptable | Appropriate | Effective (EBP) | Effective (CBITS) | Insuff. EBP info | Visuals | Testimonials | Delivery/Structure | Activities | Target audience |
|---|---|---|---|---|---|---|---|---|---|---|
| 1. Understanding Student Mental Health, Connection to Education, & Access Gap | 3.25 | 3.42 | 3.00 | -- | 15 | 0 | 0 | 10 | 0 | 11 |
| 2. Vulnerabilities to Adopting and Implementing Non-Evidence-Based Practices | 3.17 | 3.70 | 3.20 | -- | 7 | 0 | 0 | 12 | 3 | 3 |
| 3. Common Myths about Evidence-Based Practice | 3.00 | 3.50 | 3.08 | -- | 0 | 8 | 3 | 7 | 0 | 0 |
| 4. What Does It Mean When Something is Evidence-Based? | 3.08 | 3.17 | 2.92 | 2.58 | 22 | 2 | 2 | 1 | 3 | 7 |
| 5. Values-Based Action | 2.64 | 2.75 | 2.83 | 2.33 | 3 | 0 | 0 | 0 | 2 | 7 |
| 6. Individualized Implementation Planning + Common Implementation Barriers | 3.08 | 3.17 | 2.58 | 2.42 | 10 | 1 | 1 | 12 | 5 | 0 |
| Mean across all segments | 3.04 | 3.29 | 2.94 | 2.44 | 9.50 | 1.83 | 1.00 | 7.00 | 2.17 | 4.67 |
| SD across all segments | 1.03 | 0.80 | 0.92 | 1.03 | 7.37 | 2.85 | 1.15 | 4.90 | 1.77 | 4.03 |
Note. Possible range for all ratings is 0–4. 0 = Not at all. 1 = Barely. 2 = Somewhat. 3 = Moderately. 4 = Extremely. Numbers in bold indicate relatively low or high values, defined as ± .5 SD for quantitative ratings and ± 1 SD for qualitative mentions.
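As an illustration of the flagging rule described in the table note (a minimal Python sketch, not part of the study’s analyses; the means and SDs are taken from Table 3 as reported), the following marks segment values that deviate from the grand mean by at least k standard deviations:

```python
# Minimal sketch of the +/- k SD flagging rule (k = 0.5 for quantitative ratings,
# 1.0 for qualitative mention counts); assumed for illustration, not the authors' code.
def flag(values, grand_mean, grand_sd, k):
    """True for each value at least k * SD away from the mean across all segments."""
    return [abs(v - grand_mean) >= k * grand_sd for v in values]

# Appropriateness ratings for segments 1-6 and the summary rows, from Table 3
appropriateness = [3.42, 3.70, 3.50, 3.17, 2.75, 3.17]
print(flag(appropriateness, grand_mean=3.29, grand_sd=0.80, k=0.5))
# -> [False, True, False, False, True, False]: segments 2 and 5 are flagged,
#    consistent with the segment-level results reported below.
```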
The Access Gap
Quantitative ratings suggested that participants perceived this segment to be moderately to extremely acceptable and appropriate (M = 3.25 and 3.42, respectively), and moderately effective in increasing intentions to use EBPs (M = 3.00). There were relatively high numbers of comments during the focus groups regarding appropriateness for the target audience (11 mentions, > 1 SD above the mean across all segments). SMH providers perceived themselves as strong advocates for mental health services in educational settings. Several mentioned previous trainings that had emphasized the connection between mental health and academic performance. Thus, the initial overview of connections between mental health and education in this segment was perceived as “preaching to the choir”: “you’re looking at an audience of mental health providers, a lot of us already know that information.” Some portions, however, were perceived as appropriate for SMH providers. In particular, providers appreciated being reminded of the importance of their role: “I have to say, we’re well versed in a lot of this stuff, but just some of the statistics [about the prevalence of mental health problems] was like an affirmation. I was just thinking to myself, we have to do something, we have to do something.” Other participants noted particular statistics, statements, or examples that they found to be memorable or moving. One participant, for example, stated that the reminder of the connection between adverse childhood events and substance use and suicide evoked a “visceral response.”
Vulnerabilities to Adopting Non-EBPs
Participants rated this segment as moderately to extremely acceptable (M = 3.17), appropriate (M = 3.70), and likely to be effective (M = 3.20). This segment had a relatively high score on appropriateness (> .5 SD above the mean across all segments) and a relatively high number of codes related to presentation delivery/structure (> 1 SD above the mean). Participants enjoyed the illustrations of cognitive pitfalls: “I thought that the examples were fun and they were good reminders.” The examples spurred some to think of additional popular practices and beliefs that are not based in evidence (e.g., the belief that vaccinations cause autism). Three participants expressed that they wished the examples had been “extended.” For instance, one suggested we include a school-based practice that was relevant to mental health providers. Similarly, another suggested “inviting people to think about how they personally have been vulnerable by not using evidence-based practices in their own building.” Some participants wanted the facilitator to segue into the potential benefits of EBP: “after you talk about bloodletting, talk about when they looked at sanitation and hand-washing.” The information presented seemed to resonate with participants on a personal level and provoked them to reflect on their own practices: “Certainly it helped increase my intention … I just saw myself grabbing at all these gazillion curriculum available for counselors … arguably it’s almost worse providing something that isn’t evidence-based, because not only are they not getting the support they need, but we’re also wasting our time and energy in something that isn’t going to work.”
Common Myths about EBPs
This segment was rated as moderately acceptable (M = 3.00) and effective (M = 3.08), and moderately to extremely appropriate (M = 3.50). Relative to other segments, it received a high number of mentions about visual materials. A number of the graphics used on the slides were confusing to participants. Several participants also commented that the organization of the slides did not make clear whether the “myths” were being refuted or confirmed: “It gave the myth, and then it repeated the myth on the page where it was being refuted. And that was part of my confusion— it’s like, ‘Oh. No. They are saying here’s why we shouldn’t buy in to that.’”
What Does Evidence-Based Mean?
Quantitative ratings for this segment were moderate for acceptability (M = 3.08), appropriateness (M = 3.17), and likely effectiveness in increasing use of EBP (M = 2.92). In this segment, we asked participants for the first time to rate potential impact on their intentions to use CBITS in particular. That item was rated in the somewhat to moderate range (M = 2.58). On the qualitative codes, this segment received a high number of mentions under “insufficient EBP information.” Providers wanted the implementation strategy to be tailored to the specific EBP being implemented: “we’ve heard it all before, we’ve all studied it a bit, we’ve all looked at evidence-based programs, so you know… just seemed like a bit of overkill if you’re going to be explaining this to school people.” Another participant said, “I think most school employees get it, evidence based practices are what we should be doing, but, now what is it about this one, that would increase our, our desire to use this, and do it with fidelity.” Specifically, participants asked for data about the impact of CBITS on students, how it is contextually appropriate for the school setting, and the details of its implementation. Although the facilitator made it clear that BASIS is distinct from CBITS training, participants inquired about the details of CBITS implementation, such as eligibility criteria, how to orient parents to the program, and whether consultation would be provided after training.
It should be noted that, while participants felt they were inundated with EBP information, the in-depth explanations of fidelity and list of resources for identifying third-party reviews of evidence introduced in this segment were novel to many. Engaging with familiar terms like “fidelity” at this deeper level allowed providers to draw connections between EBP and student outcomes, and prompted some to be thoughtful in their selection of practices. “It prompted me to take a little bit more time before I try and solve the problem, making sure I pick something that’s precise, that is actually the best thing for the problem.”
Values-Based Action
Participants found this segment to be somewhat to moderately acceptable (M = 2.64) and appropriate (M = 2.75). They believed it would have a moderate impact on intention to use EBP (M = 2.83) and somewhat less impact on intention to use CBITS (M = 2.33). The appropriateness score for this segment was lower than the segment average by > .5 SD. The qualitative codes showed that participants had mixed reactions to a goal setting activity. The facilitator asked participants to specify their professional values and set goals for engaging in the upcoming CBITS training and implementing CBITS post-training. Some participants found this activity valuable: “it’s hard to take a new program and learn it well enough so you feel like you’re ready to jump in there and do it well enough. But also, all the things you have to do to make it work— there’s just recognition of that, but then, you know, you break it down, we set those — we think about our goals, and who we are, and why we would want to use a good program and so it’s almost like you walked us through a little mini process of, ‘You can do this!’” Another participant said: “I don’t set goals... regularly. You know, actually write goals down. I have goals in mind all the time but you know, just that, walking through that... is a reminder that it’s important to do and that you can chunk it out. It feels like a task that’s so big, I don’t even want to try. This is what’s motivating about it. You know, you chunk it out and it is doable.” A number of participants, however, were not receptive to the exercise. One participant said: “I felt like we were being like CBT’d.”
Common Implementation Barriers and Individualized Implementation Planning
The last segment of BASIS was perceived to be moderately acceptable (M = 3.08) and appropriate (M = 3.17). Participants rated it as having somewhat to moderate impact on their intentions to use EBPs (M = 2.58) and to use CBITS (M = 2.42). This segment received a relatively high number of codes under presentation delivery/structure and activities. Most participants were spurred to problem-solve barriers. One participant said: “I’m glad that barriers were addressed. As long as you don’t let it get out of control and [focus on] all the horrible reasons why we can’t do anything. You know, I think we tend to go down that path. And I liked how [the facilitator] basically gave you a general framework for solving the problem.” The activities gave participants the opportunity to brainstorm ways to talk to supervisors about their caseloads, make time to conduct a group-based EBP, and leverage partnerships with teachers, administrators, and parents to implement a new EBP. Some participants felt that being able to interact with their colleagues, and specifically to connect around professional values and practice, was motivating.
Discussion
Although a number of evidence-based interventions exist to address student mental health difficulties, the quality of implementation of such interventions in schools is often suboptimal (Owens et al., 2014). The current project represents a real-world effort to iteratively develop a pragmatic, theoretically grounded implementation strategy to enhance SMH providers’ adoption of and fidelity to EBP. Below, we discuss the main themes that arose from providers’ feedback in the context of ongoing BASIS protocol development and their relevance to SMH more generally.
First, provider input suggested that implementation strategies need to be tailored to the EBP. We envisioned BASIS as an implementation strategy that can supplement any EBP training. However, we received feedback that this may limit the impact of BASIS in some ways. For instance, participants wanted very detailed information about the requirements of implementation for CBITS. Based on this feedback, we revised the BASIS protocol to be a “book-ended” experience. In the revised protocol, a pre-training session (2.5 hours) is aimed at enhancing engagement in training. A small amount of tailoring to EBP is done in the pre-training session. For example, a trauma-focused intervention may call for an overview of the prevalence of adverse childhood events and impact on adult outcomes. After the EBP training is completed, a 45-minute post-training session is delivered focused almost exclusively on implementation planning. By this point, participants have a clear idea of the implementation requirements of the EBP and can engage in a grounded discussion about potential barriers and methods to address those barriers. Although some degree of tailoring (to individuals, EBP, contexts) is likely to be advantageous (Lewis et al., 2018; Powell et al., 2017), tailoring to a specific EBP may make implementation strategies like BASIS less pragmatic, due to the additional time and energy required to tailor content. Our revisions attempt to find a middle ground by maintaining an EBP-agnostic approach while still tailoring targeted content to a specific EBP once SMH providers have had an opportunity to receive training and develop an understanding of the behaviors they will have to perform.
Second, participants had mixed reactions to the implementation goal setting activity. Throughout the BASIS development process, we were aware that goal setting, anticipating barriers, and creating plans to overcome barriers are relatively directive activities, and therefore potentially inconsistent with an MI approach. Thus, we anticipated that many participants might find the exercise challenging and that some at lower levels of readiness for change might experience resistance. Conducting MI in groups can be challenging because participants are likely to be at varying stages of change readiness. A key benefit of conducting MI in groups, of course, is the potential for savings in cost and time. As it is impractical to wait for every participant to be ready for change, the current recommended practice is to be responsive to the readiness of the majority (i.e., move to planning when more than 50% of participants are past precontemplation; Wagner & Ingersoll, 2012). In BASIS, we also hoped to prevent resistance by normalizing that participants may be in various stages of readiness. As an example, we provided a range of implementation goals representing a range of commitment levels (e.g., from ‘review training materials’ to ‘implement a group’). Although more work is needed to understand the optimal group readiness level, it is clear that some participants benefited from the goal setting activity, which is consistent with research showing planning to be a critical step for bridging implementation intentions and committed action (Oettingen & Gollwitzer, 2010).
A final cross-cutting theme across the segments was that trainings need to build on and advance participants’ knowledge. Participants had negative reactions if they felt the information delivered was redundant or covered in previous trainings. Given that having some experience with content is also likely to engage trainees, a balance between novelty and familiarity is likely optimal (Lyon et al., 2010). In the current study, participants appreciated more advanced content that built on their foundational knowledge. For example, many were not aware of the standards for evaluating evidence, or that there were public, objective, third-party reviews of EBP. Although many had heard the term “fidelity” repeatedly in prior trainings, they appreciated the analogy of EBP core components as a “recipe,” where all active ingredients are required and interact together to produce outcomes.
Of course, it is difficult for any facilitator, particularly those not indigenous to a given system, to perfectly calibrate the training to providers’ existing knowledge. In addition, there is likely to be variability among trainees in their level of knowledge, and what is redundant to some may be novel to others. In our revision of BASIS, the facilitator acknowledges at the outset that some of the information may be familiar to participants and invites them to take the opportunity to further internalize that knowledge in order to advance their advocacy of mental health services. We also built in more opportunities for participant reflection and discussion around familiar content. In these situations, the facilitator’s role is less didactic. Instead, the facilitator focuses on setting the stage for the discussion in a way that elicits change talk and encourages providers to advocate for particular ideas. We adopted the MI structure of “Elicit, Provide, Elicit” to convey information, where the facilitator introduces a topic, asks what participants already know, provides information to confirm knowledge or fill in gaps, and then elicits participant reactions to the information provided.
Limitations
Although the current study was formative by design, several limitations should be considered. First, although our sample size was small, a sample of 12 has been demonstrated to be adequate for data saturation (Guest et al., 2006). Second, our sample may not fully represent all types of school and mental health roles that could provide meaningful input about the development of an implementation strategy. Nevertheless, all participants routinely provided mental health services in their school. Collectively, they represented a range of years of experience and training backgrounds. Third, this study only examined participant perceptions and not actual behavior. Finally, some of our qualitative analyses emphasized the frequency with which particular themes were mentioned during discussions. Although this may be one meaningful way of determining their relative importance, it is also possible that some infrequently mentioned themes are the most critical to the successful application of the BASIS protocol. Despite these limitations, the knowledge gleaned from this study has been instrumental to the iterative development of BASIS, and may help guide other implementation efforts focused on the adoption of EBP in schools and other settings.
Conclusion
Trainings on EBP often assume that participants are ready to fully engage in the training and integrate novel information into their routine practice. Little attention is paid to motivation to change prior to EBP training. The current results show the potential value of targeting trainees’ attitudes, social norms, and perceived behavioral control in order to increase intentions to implement EBP.
Acknowledgments
Informed consent was conducted for all participants. The authors have no conflicts of interest to disclose. This publication was made possible by funding from grant R21 MH108714, awarded to the second and last authors from the National Institute of Mental Health (NIMH).
Footnotes
Compliance with ethical standards: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research board.
Contributor Information
Mylien T. Duong, University of Washington, Seattle, WA
Clayton R. Cook, University of Minnesota, Minneapolis, MN
Kristine Lee, University of Washington, Seattle, WA.
Chayna J. Davis, University of Washington, Seattle, WA
Cheryl A. Vázquez-Colón, University of Washington, Seattle, WA
Aaron R. Lyon, University of Washington, Seattle, WA
References
- Aarons GA, Ehrhart MG, Farahnak LR, & Hurlburt MS (2015). Leadership and organizational change for implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10, 11. 10.1186/s13012-014-0192-y
- Aarons GA, Hurlburt MS, & Horwitz SM (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. 10.1007/s10488-010-0327-7
- Ajzen I (1991). The theory of planned behaviour. Organizational Behaviour and Human Decision Processes, 50. 10.1016/0749-5978(91)90020-t
- Ajzen I, & Fishbein M (2005). The influence of attitudes on behavior. In Albarracín D, Johnson BT, & Zanna MP (Eds.), The Handbook of Attitudes (pp. 173–221). Erlbaum.
- Albert W, & Tullis T (2008). Measuring the user experience: Collecting, analyzing, and presenting usability metrics (1st ed.). Morgan Kaufmann.
- Beidas RS, & Kendall PC (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17(1), 1–30. 10.1111/j.1468-2850.2009.01187.x
- Borntrager C, Chorpita BF, McMillan-Higa C, & Weisz J (2009). Provider attitudes towards evidence based practices: Are the concerns with the evidence or the manuals? Psychiatric Services, 60, 677–681.
- Burns BJ, Costello EJ, Angold A, Tweed D, Stangl D, Farmer EM, & Erkanli A (1995). Children’s mental health service use across service sectors. Health Affairs, 14(3), 147–159. 10.1377/hlthaff.14.3.147
- Chorpita BF, Becker KD, Daleiden EL, & Hamilton JD (2007). Understanding the common elements of evidence-based practice: Misconceptions and clinical examples. Journal of the American Academy of Child & Adolescent Psychiatry, 46(5), 647–652. 10.1097/chi.0b013e318033ff71
- Chorpita BF, Daleiden EL, Ebesutani C, Young J, Becker KD, Nakamura BJ, Phillips L, Ward A, Lynch R, Trent L, Smith RL, Okamura K, & Starace N (2011). Evidence-based treatments for children and adolescents: An updated review of indicators of efficacy and effectiveness. Clinical Psychology: Science and Practice, 18(2), 154–172. 10.1111/j.1468-2850.2011.01247.x
- Dart EH, Cook CR, Collins TA, Gresham FM, & Chenier JS (2012). Test driving interventions to increase treatment integrity and student outcomes. School Psychology Review, 41(4), 467–481.
- Ennett ST, Ringwalt CL, Thorne J, Rohrbach LA, Vincus A, Simons-Rudolph A, & Jones S (2003). A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science, 4(1), 1–14. 10.1023/A:1021777109369
- Evans SW, & Weist MD (2004). Commentary: Implementing empirically supported treatments in the schools: What are we asking? Clinical Child and Family Psychology Review, 7(4), 263–267. 10.1007/s10567-004-6090-0
- Forman SG, & Barakat NM (2011). Cognitive-behavioral therapy in the schools: Bringing research to practice through effective implementation. Psychology in the Schools, 48(3), 283–296. 10.1002/pits.20547
- Forman SG, Fagley NS, Chu BC, & Walkup JT (2012). Factors influencing school psychologists’ “willingness to implement” evidence-based interventions. School Mental Health, 4(4), 207–218. 10.1007/s12310-012-9083-z
- Forman SG, Olin SS, Hoagwood KE, Crowe M, & Saka N (2009). Evidence-based interventions in schools: Developers’ views of implementation barriers and facilitators. School Mental Health, 1(1), 26.
- Glanz K, & Bishop DB (2010). The role of behavioral science theory in development and implementation of public health interventions. Annual Review of Public Health, 31(1), 399–418. 10.1146/annurev.publhealth.012809.103604
- Glisson C, & Schoenwald SK (2005). The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research, 7(4), 243–259. 10.1007/s11020-005-7456-1
- Godin G, Bélanger-Gravel A, Eccles M, & Grimshaw J (2008). Healthcare professionals’ intentions and behaviours: A systematic review of studies based on social cognitive theories. Implementation Science, 3, 36. 10.1186/1748-5908-3-36
- Gonzalez JE, Nelson JR, Gutkin TB, & Shwery CS (2004). Teacher resistance to school-based consultation with school psychologists: A survey of teacher perceptions. Journal of Emotional and Behavioral Disorders, 12(1), 30–37. 10.1177/10634266040120010401
- Grol R, & Grimshaw J (2003). From best evidence to best practice: Effective implementation of change in patients’ care. The Lancet, 362(9391), 1225–1230.
- Guest G, Bunce A, & Johnson L (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18(1), 59–82. 10.1177/1525822X05279903
- Hsieh HF, & Shannon SE (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288. 10.1177/1049732305276687
- Juszczak L, Melinkovich P, & Kaplan D (2003). Use of health and mental health services by adolescents across multiple delivery sites. Journal of Adolescent Health, 32(Suppl 6), 108–118. 10.1016/S1054-139X(03)00073-9
- Leaf PJ, Alegria M, Cohen P, Goodman SH, Horwitz SM, Hoven CW, Narrow WE, Vaden-Kiernan M, & Regier DA (1996). Mental health service use in the community and schools: Results from the four-community MECA study. Journal of the American Academy of Child & Adolescent Psychiatry, 35(7), 889–897.
- Lewis CC, Puspitasari A, Boyd MR, Scott K, Marriott BR, Hoffman M, Navarro E, & Kassab H (2018). Implementing measurement based care in community mental health: A description of tailored and standardized methods. BMC Research Notes, 11(1), 76. 10.1186/s13104-018-3193-0
- Locke J, Lee K, Cook CR, Frederick L, Vázquez-Colón C, Ehrhart MG, Aarons GA, Davis C, & Lyon AR (2019). Understanding the organizational implementation context of schools: A qualitative study of school district administrators, principals, and teachers. School Mental Health, 11(3), 379–399. 10.1007/s12310-018-9292-1
- Lukens EP, & McFarlane WR (2004). Psychoeducation as evidence-based practice: Considerations for practice, research, and policy. Brief Treatment and Crisis Intervention, 4(3), 205–225. 10.1093/brief-treatment/mhh019
- Lyon AR, Stirman SW, Kerns SEU, & Bruns EJ (2010). Developing the mental health workforce: Review and application of training approaches from multiple disciplines. Administration and Policy in Mental Health and Mental Health Services Research, 38(4), 238–253. 10.1007/s10488-010-0331-y
- Macefield R (2009). How to specify the participant group size for usability studies: A practitioner’s guide. Journal of Usability Studies, 5(1), 34–45.
- Martens BK, Witt JC, Elliott SN, & Darveaux DX (1985). Teacher judgments concerning the acceptability of school-based interventions. Professional Psychology: Research and Practice, 16(2), 191–198. 10.1037/0735-7028.16.2.191
- Merikangas KR, He J, Burstein M, Swendsen J, Avenevoli S, Case B, Georgiades K, Heaton L, Swanson S, & Olfson M (2011). Service utilization for lifetime mental disorders in U.S. adolescents: Results of the National Comorbidity Survey-Adolescent Supplement (NCS-A). Journal of the American Academy of Child and Adolescent Psychiatry, 50(1), 32–45. 10.1016/j.jaac.2010.10.006
- Michie S, Van Stralen M, & West R (2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6(1), 1.
- Miller WR, & Rollnick S (2002). Motivational interviewing: Preparing people for change (2nd ed.). Guilford Publications, Inc.
- Odom SL, McLean ME, Johnson LJ, & LaMontagne MJ (1995). Recommended practices in early childhood special education: Validation and current use. Journal of Early Intervention, 19(1), 1–17. 10.1177/105381519501900101
- Oettingen G, & Gollwitzer PM (2010). Strategies of setting and implementing goals: Mental contrasting and implementation intentions. In Maddux JE & Tangney JP (Eds.), Social psychological foundations of clinical psychology (pp. 114–135). Guilford Press.
- Owens JS, Lyon AR, Brandt NE, Masia Warner C, Nadeem E, Spiel C, & Wagner M (2014). Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health, 6(2), 99–111. 10.1007/s12310-013-9115-3
- Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, & Landsverk J (2011). Mixed method designs in implementation research. Administration and Policy in Mental Health, 38(1), 44–53. 10.1007/s10488-010-0314-z
- Perkins HW, Meilman PW, Leichliter JS, Cashin MA, & Presley CA (1999). Misperceptions of the norms for the frequency of alcohol and other drug use on college campuses. Journal of American College Health, 47, 253–258. 10.1080/07448489909595656
- Perou R, Bitsko RH, Blumberg SJ, Pastor P, Ghandour RM, Gfroerer JC, Hedden SL, Crosby AE, Visser SN, Schieve LA, Parks SE, Hall JE, Brody D, Simile CM, Thompson WW, Baio J, Avenevoli S, Kogan MD, Huang LN, & Centers for Disease Control and Prevention (CDC). (2013). Mental health surveillance among children—United States, 2005–2011. MMWR Supplements, 62(2), 1–35.
- Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, & Mandell DS (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194. 10.1007/s11414-015-9475-6
- Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, McHugh SM, & Weiner BJ (2019). Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7. 10.3389/fpubh.2019.00003
- Reimers TM, & Lee J (1991). Parental acceptability of treatments for children’s hypercholesterolemia. Journal of Behavioral Medicine, 14(3), 225–239.
- Reinke WM, Herman KC, & Sprick R (2011). Motivational interviewing for effective classroom management: The classroom check-up (Riley-Tillman TC, Ed.). Guilford Press.
- Stein BD, Jaycox LH, Kataoka S, Rhodes HJ, & Vestal KD (2003). Prevalence of child and adolescent exposure to community violence. Clinical Child and Family Psychology Review, 6(4), 247–264. 10.1023/B:CCFP.0000006292.61072.d2
- Wagner C, & Ingersoll K (2012). Motivational interviewing in groups (Rollnick S, Miller WR, & Moyers TB, Eds.; Vol. 1). Guilford Press.

