Abstract
Guided by implementation science scholarship and school mental health research, the current study uses qualitative and quantitative data to illuminate the barriers, opportunities, and processes underlying the implementation of a teacher consultation and coaching model (BRIDGE) in urban elementary schools. Data come from five public elementary schools, 12 school mental health staff (BRIDGE consultants), and 18 teachers participating in a classroom-randomized trial of BRIDGE. Findings from directed content analysis of teacher focus group and interview data suggest that aspects of the BRIDGE intervention model, school organization and classroom contexts, and teachers/consultants and their relationship were relevant as implementation facilitators or barriers. In addition, case study analysis of intervention materials and fidelity tools from classrooms with moderate-to-high dosage and adherence suggests variation in consultation and coaching by initial level of observed classroom need. Results illuminate the need for implementation research to extend beyond simple indicators of fidelity to the multiple systems and variation in processes at play across levels of the implementation context.
Keywords: implementation processes, school mental health, teacher consultation, classroom intervention, mixed method
In recent years, increasing pressure has been placed on schools to select social-emotional and behavioral interventions for students that have demonstrated effects on target outcomes (e.g., academic, behavioral; http://www.casel.org/policy/). Education and prevention scientists champion this recommendation, as evidence-based interventions (EBIs) should produce intended benefits. Yet, as noted recently in reports of the Institute of Education Sciences (Sparks, 2013) and the U.S. Department of Health and Human Services (Durlak, 2013), whether interventions benefit students depends upon their successful implementation. As conceptualized by Dane and Schneider (1998), implementation involves multiple elements, including dosage or exposure (i.e., duration and frequency of receipt) and adherence (i.e., delivery as intended), among others. Although researchers increasingly document implementation outcomes (Durlak & DuPre, 2008; Gottfredson et al., 2015), and scholars have proposed conceptual models of implementation systems (Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009; Graczyk, Domitrovich, Small, & Zins, 2006; Wandersman et al., 2008), implementation systems and processes in schools remain understudied (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005). Calls have been made to examine implementation as a dynamic process (Berwick, 2008; Hoagwood, Atkins, & Ialongo, 2013; Saunders, Evans, & Joshi, 2005), explore heterogeneity in implementation and effects (Seidman, 2012), and use implementation data to inform program refinements (Cappella, Reinke, & Hoagwood, 2011).
Aligned with these calls, and guided by implementation science scholarship (Fixsen et al., 2005), conceptual models (Damschroder et al., 2009), and school mental health research (e.g., Fazel, Hoagwood, Stephan, & Ford, 2014; Forman, Olin, Hoagwood, Crowe, & Saka, 2009), the current study uses qualitative and quantitative data to illuminate the barriers, opportunities, and processes underlying the implementation of a teacher consultation and coaching model in urban elementary schools—BRIDGE (citations omitted). The overall aim is to move education science and policy beyond what constitutes an evidence-based intervention toward an understanding of effective implementation of EBIs as a means toward positive outcomes for more students in schools.
BRIDGE Teacher Consultation and Coaching: Intervention Overview
The BRIDGE teacher consultation and coaching intervention is founded in research on elementary age children’s behavioral difficulties (Reid, Gonzalez, Nordness, Trout, & Epstein, 2004) and the challenges teachers face when working with these children (Reinke et al., 2011). Children with behavioral difficulties (e.g., inattention, hyperactivity, and/or oppositional behavior) often experience academic underachievement and relational problems in schools (Hughes & Cavell, 1999; Miles & Stipek, 2006). Left unaddressed, these problems can lead to a cycle of maladjustment that worsens over time (Reid et al., 2004). Teachers also struggle to manage and motivate students with behavioral difficulties (Reinke et al., 2011) and refer these students to support services at high rates (Cappella et al., 2008). Most support services involve individual assessment or individual, group, or family treatment outside the classroom (Brener, Weist, Adelman, Taylor, & Vernon-Smiley, 2007; Kutash, Duchnowski, & Lynn, 2006). Rarely are these services complemented with regular and systematic consultation to teachers to assist them in creating a classroom context where children with behavioral difficulties can succeed.
Recent studies provide evidence that effective classrooms, such as those with emotionally supportive (i.e., warm, responsive) and organized (i.e., consistent, structured) teaching practices, can facilitate social-emotional development and academic performance, particularly among children with adjustment problems (Brock, Nishida, Chiong, Grimm, & Rimm-Kaufman, 2008; Hamre & Pianta, 2005). This research is grounded in a theory-based lens through which to assess and understand effective classrooms—the Classroom Assessment Scoring System (CLASS: Pianta et al., 2008). According to the CLASS, teachers who create emotional support in the classroom are aware of and responsive to students’ needs, encourage warm and positive interactions, and provide appropriate levels of autonomy. Organized classrooms (i.e., high behavioral support) have clear and productive routines, engaging and varied instructional methods, and positive and proactive behavior management (Pianta, Hamre, & Allen, 2012).
In randomized trials of classroom-focused, social-emotional and behavioral interventions, researchers find effects on observed classroom emotional support and organization, and on children’s social and academic adjustment over time (Brown, Jones, LaRusso, & Aber, 2010; Jones, Brown, & Aber, 2011; Raver et al., 2008; 2011). Intervention effects on children appear to be mediated through improvements to classroom emotional support or organization (McCormick et al., 2015). Though promising, these demonstration trials use externally-funded coaches, facilitators, or consultants to provide training and support to teachers. Given the financial costs, it may be difficult for schools in low-income communities to sustain these supports over time.
In response to the research evidence and practical realities, BRIDGE was developed for school-based staff to deliver to teachers as a part of their regular daily activities (citation omitted). BRIDGE is based on two models: (a) MyTeachingPartner (MTP: Pianta, Mashburn, Downer, Hamre, & Justice, 2008; Allen, Pianta, Gregory, Mikami, & Lun, 2011), a teacher professional development program, and (b) Links to Learning (L2L: Atkins et al., 2008; 2015; Cappella et al., 2008), a mental health services model. Like MTP, BRIDGE is rooted in the CLASS, a standardized and validated observational tool for understanding and assessing effective classroom practices (Pianta, La Paro, & Hamre, 2008). The CLASS lens enables the teacher and mental health practitioner (i.e., BRIDGE consultant) to view the classroom similarly when reflecting on classroom interactions, and organizes consultation and coaching within a set of empirically-based dimensions of effective teaching. Like L2L, BRIDGE operates at two tiers – (1) overall classroom interactions (universal) and (2) specific interactions with children with behavioral difficulties (targeted) – in order to respond to teachers’ needs to work effectively across the classroom and with students with behavioral challenges. Unlike other interventions, BRIDGE involves a classroom continuous quality improvement cycle (Park, Hironaka, Carver, & Nordstrum, 2013) implemented monthly to individualize intervention to classrooms and students.
Overall, BRIDGE goals are to improve classroom emotional support and organization, and promote the academic and psychosocial adjustment of students with and without behavioral difficulties. To reach these goals, school-based mental health staff (i.e., BRIDGE consultants) receive training in (a) the CLASS observation system (Pianta et al., 2008), (b) evidence-based strategies targeting emotional support and classroom organization (Berryhill & Prinz, 2003; Embry, 2004), and (c) approaches to effective consultation and coaching (Miller & Rollnick, 2002; Reinke, Lewis-Palmer, & Merrell, 2008). After training, BRIDGE consultants are paired with teachers in their schools, engage in initial interviews, and conduct baseline classroom observations using an adapted CLASS framework (Pianta et al., 2008) and principles of functional behavior assessment (FBA; Watson & Skinner, 2001). Then, consultant-teacher pairs meet to reflect on classroom and student needs and choose specific dimension(s) of teaching practice within domains of emotional support and organization on which to focus the consultation and coaching cycles. A full cycle involves: (1) classroom and student observation, (2) teacher consultation, and (3) implementation of evidence-based strategies from the BRIDGE Tips and Tools manual (see Table 1). This cycle is repeated monthly to improve teaching practices across the classroom and with students with behavioral difficulties.
Table 1.
Examples of links between CLASS domains/dimensions and classwide and targeted evidence-based strategies in BRIDGE
| CLASS Domain | CLASS Dimension | Indicators of Effective Practice | Classwide Evidence-Based Strategies (selected) | Targeted Evidence-Based Strategies (selected) |
|---|---|---|---|---|
| Emotional Support | Positive Climate | Positive, respectful relationships; Enthusiasm and enjoyment | | |
| | Teacher Sensitivity | Teachers recognize and address student needs; Appropriate levels of support | | |
| | Regard for Student Perspectives | Emphasis on student interests; Meaningful leadership roles; Appropriate autonomy | | |
| Classroom Organization | Behavior Management | Clear, consistent rules; Proactive intervention, redirection of misbehavior | | |
| | Productivity | Teacher is prepared; Clear routines; Smooth transitions; Maximal learning time | | |
| | Instructional Learning Formats | Effective facilitation; Variety of modalities for instruction; Student engagement | | |
References: Barrish, Saunders, & Wolf, 1969; Berryhill & Prinz, 2003; Blechman, Taylor, & Schrader, 1981; Brophy & Good, 1986; Durlak et al., 2011; Embry, 2002; 2004; Kazdin, 1982; Kelley, 1990; Kern, Childs, Dunlap, Clarke, & Falk, 1994; Kraemer, Davies, Arndt, & Hunley, 2012; Lyman, 1987; Mooney, Ryan, Uhing, Reid, & Epstein, 2005; Pianta, Mashburn, et al., 2008; Reddy, Newman, De Thomas, & Chun, 2009; Rohrbeck et al., 2003; Skinner et al., 2000; Walberg & Paik, 2000.
BRIDGE Intervention Effects and Implementation Questions
In the 2010–11 school year, intent-to-treat effects of BRIDGE on classroom and child outcomes were assessed in a classroom randomized trial in five urban schools (citation omitted). At the classroom level, BRIDGE had a positive impact on observed teaching practices in classrooms with low levels of emotional support at the start of the year. At the student level, children with and without behavioral difficulties in BRIDGE classrooms benefited in terms of their relational closeness to teachers, social experiences with peers, and academic self-concept. In addition, children in BRIDGE classrooms who had behavioral problems at the start of the year were less likely to experience peer victimization at the end of the year than comparable children in control classrooms. No significant effects were detected for teacher practices of classroom organization, students’ aggression, or students’ behavioral regulation.
Across this trial, implementation dosage and adherence were adequate (citation omitted). On average, teachers participated in one cycle of observation/consultation/implementation (with coaching) during each month of the intervention. With the exception of viewing video segments, consultation sessions included the primary BRIDGE content, and strategy implementation involved use of classroom strategies from the BRIDGE Tips and Tools Manual delivered at minimum levels of adherence (see, for a detailed description, citation omitted). However, significant heterogeneity in implementation was observed. This is not unusual in intervention trials in which the intervention is delivered by school-based personnel rather than paid consultants, university personnel, or research staff (e.g., Aber et al., 1997; Atkins et al., 2015). Moreover, BRIDGE was designed to match classroom and student need, increasing the likelihood of variation in implementation. We also found moderated effects for students with behavioral difficulties and classrooms with low baseline levels of emotional support, suggesting that baseline levels of classroom and student need mattered.
Motivated by these results and responding to calls to explore implementation processes in real world contexts (Hoagwood et al., 2013), we aim in the current study to: (1) explore barriers to and facilitators of BRIDGE implementation, and (2) describe variation in implementation dosage and adherence by pre-intervention classroom need. Our inquiry was guided by the Consolidated Framework for Implementation Research (Damschroder et al., 2009), an implementation framework that integrates across published theories to identify common constructs related to effective implementation in healthcare settings. In this model, Damschroder and colleagues (2009) identify five key domains of implementation: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. We use this model as a guide, considering characteristics of BRIDGE (content, implementation), school organizational factors, classroom contextual factors, and characteristics of teachers and consultants and their relationship (see Figure 1 for a conceptual model). The overall goals are to expand understanding of implementation variability and processes in school mental health, inform refinements to implementation systems in school-based interventions, and increase our success at enhancing teaching practices and student outcomes among students with and without behavioral problems in urban schools.
Figure 1.
Conceptual model of implementation analysis of teacher focus groups and interviews.
Method
Setting and Participants
This study draws data from five public elementary schools in a large urban center invited to participate in an experimental trial of BRIDGE across one school year. The schools were linked to a partner community agency for school-based mental health services on the basis of location (proximity to the agency) and economic disadvantage (free/reduced lunch eligibility: 89–99%). According to district records, participating schools enrolled mainly Latino and African-American students (87% and 11%, respectively), with four of the five schools serving mainly Latino students (89–99%) and one school serving mainly African-American students (69%).
Participants in the current study were school-based mental health professionals – BRIDGE consultants – and classroom teachers. Consultants were recruited during informational meetings in the agency and schools. Resource mapping procedures (Adelman & Taylor, 2006) were used to determine the school-based mental health staff whose roles and time in the school enabled implementation. Interested individuals met with researchers to determine whether their time and role were sufficient to accommodate the consultation practice, as well as to discuss intervention implementation and study design. Across the five schools, 12 of 18 eligible mental health professionals consented to participate. The consultants were mostly female (82%), and identified as Latino (33%), White (50%), Black (8%), and mixed/other (8%). Roles in the school included counselor, social worker, and psychologist.1 Consultants were employed by the school district (n = 7) or the community agency (n = 5) providing services at the school.
A letter of invitation to participate was distributed to all teachers (N = 154) in the five schools. Given the number of consultants, a total of 48 teachers could be accommodated in this trial. Within two weeks of receipt of the invitation, researchers met with the teachers who expressed interest (n = 44); 36 of these teachers provided informed consent. After baseline data were collected in the fall, researchers used a random numbers table to randomize teachers within schools to BRIDGE (intervention; n = 18 classrooms) or training-only (comparison; n = 18 classrooms) conditions. Consultants were paired with teachers of intervention classrooms based on existing relationships in the school and delivered BRIDGE consultation and coaching as a part of their ongoing school activities.
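To make the assignment procedure concrete, the sketch below shows one way to reproduce within-school (stratified) randomization in Python; the roster, school and teacher identifiers, and fixed seed are illustrative assumptions, and the study itself used a random numbers table rather than code.

```python
import random

# Hypothetical roster of consented teachers; school and teacher IDs are illustrative.
teachers = [
    {"teacher_id": f"T{i:02d}", "school": f"School{(i % 5) + 1}"} for i in range(1, 37)
]

def randomize_within_schools(roster, seed=2010):
    """Assign roughly half of each school's consented teachers to BRIDGE
    (intervention) and the rest to training only (comparison)."""
    rng = random.Random(seed)  # fixed seed only so the example is reproducible
    assignments = {}
    for school in sorted({t["school"] for t in roster}):
        ids = [t["teacher_id"] for t in roster if t["school"] == school]
        rng.shuffle(ids)
        half = len(ids) // 2
        assignments.update({tid: "BRIDGE" for tid in ids[:half]})
        assignments.update({tid: "training only" for tid in ids[half:]})
    return assignments

assignments = randomize_within_schools(teachers)
print(sum(v == "BRIDGE" for v in assignments.values()), "classrooms assigned to BRIDGE")
```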
For the current study, the 18 lead teachers from the classrooms randomly assigned to BRIDGE were included in analysis. These teachers led regular education classes (n = 12) or special education/combined classes (n = 6). Similar numbers of younger grade teachers (K–2nd grade: n = 9) and older grade teachers (3rd–5th grade: n = 9) were included. The teachers were primarily female (83%) and identified as Latino (56%), White (28%), or Black (17%). On average, teachers had worked in their current school for 8.82 years (SD = 5.47).
Procedures and Data
All procedures were approved by the university and school district institutional review boards. Qualitative and quantitative data were collected over one school year to (1) assess existing implementation barriers and facilitators through the eyes of classroom teachers, and (2) gain insight into the variation in implementation between participating classrooms.
Data included: (a) focus groups (n = 3) and interviews (n = 3) with intervention teachers, (b) pre-intervention classroom observations in BRIDGE classrooms (n = 18), (c) initial teacher interviews (n = 14), (d) fidelity measures (n = 123), and (e) implementation materials (n = 149). Following research board guidelines, schools received small monetary gifts for teacher participation and the mental health agency received a training stipend.
Focus groups and semi-structured interviews
Three teacher focus groups were conducted by researchers at the end of the trial (see Morgan, 1998). Trained researchers led the 90-minute focus groups in order to gather feedback on implementation processes and gain insight into perceived barriers and facilitators of intervention implementation. Focus groups were audio-recorded and transcribed. Semi-structured individual interviews were conducted with intervention teachers unable to attend a focus group. Researchers took detailed notes during interviews. Transcribed focus groups and semi-structured interview notes were coded for analysis (see below for codebook development details).
Pre-intervention classroom observations
A systematic classroom observation measure – the Classroom Assessment Scoring System (CLASS: Pianta, La Paro, et al., 2008) – was used to assess classroom practices. This is the same tool that guides BRIDGE intervention with teachers. Ten dimensions are scored on a seven-point scale ranging from 1 or 2 (low) to 6 or 7 (high). Each dimension contains a detailed overall description, behaviorally-anchored scale points, and behavioral indicators (see Mashburn et al., 2008). Each dimension was coded four times per teacher during each of two observational periods, and these codes were averaged to calculate dimension scores. The dimension scores factor into three domains: Emotional Support, Classroom Organization, and Instructional Support (Pianta, La Paro, et al., 2008).
The current study focused on the CLASS domains underlying BRIDGE: Emotional Support (Positive Climate, Negative Climate – reverse, Teacher Sensitivity, and Regard for Student Perspectives) and Classroom Organization (Behavior Management, Productivity, and Instructional Learning Formats). Current study internal reliabilities were α = .79 (Emotional Support) and α = .86 (Classroom Organization). CLASS domain scores coded prior to intervention were used to assess classroom need at the beginning of implementation.
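As an illustration of how these scores aggregate, the sketch below computes domain scores as the mean of their constituent dimension scores and estimates internal consistency with Cronbach's alpha; the column names and classroom values are hypothetical, and the reverse-scoring of Negative Climate follows the description above.

```python
import pandas as pd

# Hypothetical pre-intervention CLASS dimension scores (1-7 scale), one row per classroom.
dims = pd.DataFrame({
    "positive_climate":        [4.9, 2.8, 5.9, 4.8],
    "negative_climate":        [2.0, 4.5, 1.2, 1.5],   # reverse-scored below
    "teacher_sensitivity":     [5.0, 2.5, 6.0, 4.7],
    "regard_for_perspectives": [4.6, 1.2, 5.6, 4.0],
    "behavior_management":     [3.4, 2.9, 6.1, 5.3],
    "productivity":            [3.7, 2.7, 6.0, 5.1],
    "instructional_formats":   [3.4, 2.9, 5.9, 5.1],
})

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of dimension (item) scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Reverse-score Negative Climate on the 1-7 scale so that higher = more supportive.
dims["negative_climate_rev"] = 8 - dims["negative_climate"]

emotional_support_items = dims[["positive_climate", "negative_climate_rev",
                                "teacher_sensitivity", "regard_for_perspectives"]]
organization_items = dims[["behavior_management", "productivity",
                           "instructional_formats"]]

# Domain scores are the mean of their constituent dimension scores.
dims["emotional_support"] = emotional_support_items.mean(axis=1)
dims["classroom_organization"] = organization_items.mean(axis=1)

print(dims[["emotional_support", "classroom_organization"]].round(2))
print("alpha, Emotional Support:", round(cronbach_alpha(emotional_support_items), 2))
print("alpha, Classroom Organization:", round(cronbach_alpha(organization_items), 2))
```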
Initial teacher interview
As the first step to implementation, BRIDGE consultants conducted structured individual interviews with their assigned classroom teachers. The dyads met at a convenient time in a quiet space in the school. Consultants first introduced the BRIDGE consultation procedure and reviewed dimensions of quality classroom interactions aligned with the CLASS. Then, consultants asked questions regarding teachers’ views on their overall classroom interactions (behavioral, academic) as well as questions about interactions with specific children with behavioral difficulties. The consultant records from these teacher interviews were included in classroom case studies.
Fidelity measures
Three checklists were used as research tools to assess implementation dosage and adherence. Approximately once a month, teachers and consultants completed the Monthly Intervention Checklist indicating the CLASS dimension(s) of focus and the specific intervention strategies discussed. Teachers also completed the Weekly Strategy Checklist, which reported the strategies (targeted and classroom-wide) that they used in their classroom each week. Lastly, consultants completed the Post-Consultation Checklist after each consultation session. This checklist indicated what occurred in each consultation session, such as discussing the recent classroom observation, problem-solving challenges to implementing a classwide or targeted strategy, or discussing how to use a new strategy. These fidelity measures were included in the data analyzed for the classroom case studies.
Intervention materials
Multiple forms were created to facilitate and document intervention delivery for the BRIDGE consultants and their supervisors. Among the provided forms, BRIDGE consultants chose to use the ones they found most helpful for implementation. These forms included tools to assist consultants’ observation of teachers’ classroom practice and implementation of classroom strategies. Specifically, the Classroom and Child Observation Form documented consultants’ observation of classwide and target student interactions (CLASS and FBA lenses) in an open-ended format. The Tips and Tools Implementation Form was used by consultants to observe and provide feedback to teachers about their implementation of the specific classwide and targeted strategies chosen during the consultation sessions.
Other tools were designed to facilitate productive consultation and follow up. The Consultation Preparation Form was designed to guide consultants in (a) reflecting on what they observed in the classroom, (b) choosing specific CLASS dimensions of focus, and (c) deciding relevant classroom and targeted strategies for teachers to implement. The Consultation Notes Form enabled consultants to record discussions with teachers and next steps for implementation (e.g., modeling of a classwide strategy, implementation of a targeted strategy, etc.). University Consultant Reports were written records completed by university consultants who provided supervision to BRIDGE consultants. Each of these tools was included in the case study analysis.
Analytic Plan
The overall goal was to explore the BRIDGE implementation process, including barriers to and facilitators of implementation and heterogeneity related to pre-intervention classroom need. This goal was achieved through (a) content analysis of teacher focus group and interview data, and (b) case study analysis of BRIDGE implementation materials.
Content analysis of focus groups and interviews
Teachers’ reflections after the BRIDGE trial were content analyzed using a directed approach (Hsieh & Shannon, 2005) to understand the multilevel barriers to and facilitators of implementation (see Figure 1). Development of a codebook and qualitative analysis of focus group transcriptions and interview notes were guided by the implementation framework of Damschroder and colleagues (2009), which characterizes key aspects of implementation, including the characteristics of the intervention, the outer setting or organizational context, the inner setting or micro-context, the characteristics of the individuals involved, and the implementation processes.2
Two researchers independently reviewed focus group and interview data to identify salient themes within these topics: (a) BRIDGE intervention components/implementation processes, (b) school organization, (c) classroom context, and (d) characteristics of teachers/consultants and their relationship. Following this independent review, researchers created a preliminary coding manual and independently identified discrete units of text that corresponded to specific categories within each theme. Researchers compared independent codes, discussed coding challenges, and resolved discrepancies, a process that led to coding manual revisions.
The two researchers then recoded all text using the revised manual to identify the prevalence of each theme and category, and normative examples. Researchers met with the principal investigator to review all codes and resolve discrepancies. An external auditor who was familiar with the project was trained and independently coded 30% of the text using the revised manual (e.g., Hill et al., 1997). Over 80% of the external codes matched the original codes; discrepancies were resolved via discussion.
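The auditor check described above reduces to a simple percent-agreement calculation; a minimal sketch follows, assuming hypothetical code labels rather than the study's actual coding data.

```python
# Hypothetical category labels assigned to the same text units by the original
# coders and by the external auditor (30% subsample).
original_codes = ["dosage", "content", "structure", "support", "composition",
                  "dosage", "buy-in", "relationship", "content", "structure"]
auditor_codes  = ["dosage", "content", "structure", "support", "organization",
                  "dosage", "buy-in", "relationship", "content", "peer support"]

matches = sum(o == a for o, a in zip(original_codes, auditor_codes))
agreement = matches / len(original_codes)
print(f"Percent agreement: {agreement:.0%}")  # 80% in this toy example
```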
Case studies
Four classrooms were selected to illustrate implementation in practice. These classrooms were selected based on (a) perceived need for intervention, (b) the extent to which the intervention was implemented (dosage, adherence), and (c) the availability of intervention materials and fidelity measures.
To assess need, classrooms were rank ordered based on the sum of their pre-intervention scores on the CLASS domains of emotional support and classroom organization (scale from 1–7). The top tertile of classrooms was classified as “low-need” (i.e., high emotional support and organization) and the bottom tertile was classified as “high-need” (i.e., low emotional support and organization). To assess implementation, a dosage/adherence index was created that summed exposure to consultation sessions (1 = low; 2 = moderate; 3 = high) and exposure to classroom strategies weighted by intensity level of the strategy (1 = low intensity, e.g., “catch ‘em being good”, and 2 = high intensity, e.g., Good Behavior Game; Barrish, Saunders, & Wolf, 1969; see Table 1).3 Classrooms at the low end of the resulting 0–9 range participated in 1–2 consultation sessions and implemented 1–2 low intensity classroom and/or targeted strategies (with basic adherence to the BRIDGE model). Classrooms at the high end of the range participated in 4–6 consultation sessions and implemented 2–5 classroom and/or targeted strategies that ranged from low to high intensity (see Figure 2).
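To make the two classification steps concrete, the sketch below reproduces the logic of the need ranking and the dosage/adherence index using hypothetical classroom data; the mapping of consultation counts to exposure levels and the cap that keeps the index within the reported 0–9 range are illustrative assumptions rather than the study's exact scoring rules.

```python
import pandas as pd

# Hypothetical classroom-level data; CLASS domain scores are on the 1-7 scale.
classrooms = pd.DataFrame({
    "classroom": ["C1", "C2", "C3", "C4", "C5", "C6"],
    "emotional_support":      [2.8, 3.4, 4.9, 5.0, 5.9, 6.2],
    "classroom_organization": [2.9, 3.1, 3.5, 5.2, 6.0, 6.3],
    "n_consultations":        [4, 2, 5, 1, 6, 3],
    # Strategies implemented, weighted by intensity (1 = low, 2 = high).
    "strategy_weights":       [[1, 1], [1], [2, 1, 1], [1], [1, 2, 1], [2]],
})

# Need ranking: sum the two CLASS domains; lower sums indicate higher need.
classrooms["class_sum"] = (classrooms["emotional_support"]
                           + classrooms["classroom_organization"])
classrooms["need"] = pd.qcut(classrooms["class_sum"], q=3,
                             labels=["high-need", "moderate", "low-need"])

def dosage_adherence_index(n_consultations, strategy_weights):
    """Sum consultation exposure (1 = low, 2 = moderate, 3 = high) and
    intensity-weighted strategy exposure (assumed thresholds and cap)."""
    if n_consultations <= 2:
        consult_exposure = 1
    elif n_consultations <= 4:
        consult_exposure = 2
    else:
        consult_exposure = 3
    strategy_exposure = min(sum(strategy_weights), 6)  # cap keeps index within 0-9
    return consult_exposure + strategy_exposure

classrooms["dosage_adherence"] = [
    dosage_adherence_index(n, w)
    for n, w in zip(classrooms["n_consultations"], classrooms["strategy_weights"])
]
print(classrooms[["classroom", "need", "dosage_adherence"]])
```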
Figure 2.
BRIDGE intervention classrooms’ pre-intervention need and implementation dosage/adherence.
Note: Starred classrooms were included in case study analysis.
As indicated by a visual analysis of Figure 2, pre-intervention classroom need and BRIDGE dosage/adherence were unrelated. Some moderate-high (i.e., adequate) dosage/adherence classrooms had low initial CLASS scores (i.e., high-need) whereas others had high initial CLASS scores (i.e., low-need). To explore potential differences in intervention received between classrooms that varied by need, we chose two high-need classrooms and two low-need classrooms for a case study analysis. All were moderate-high dosage/adherence classrooms in order to have sufficient data (intervention materials, fidelity measures) from multiple reporters (consultant, teacher, and supervisor) for analysis.
Aligned with developmental evaluation approaches (Patton, 2011), we analyzed these records according to the case study framework of Yin (2009). Co-authors reviewed these documents to identify implementation patterns aligned with key components of the conceptual model. Twice-monthly meetings over a four-month period were used to discuss patterns and build consensus. Disagreements were largely resolved via discussion. In isolated cases where discussion did not resolve the disagreement, the first author’s perspective was weighed more heavily. Written results were then distributed to co-authors for refinement and reviewed by a subset of BRIDGE consultants for a member check. Lastly, findings were integrated with the focus group analysis and interpreted in the context of the overall implementation framework.
Results I: Focus Groups and Interviews
Findings from teacher focus groups and interviews that explored barriers and facilitators of BRIDGE implementation are summarized by major themes aligned with Damschroder and colleagues (2009): (a) BRIDGE intervention model; (b) school organization; (c) classroom micro-context; and (d) implementer characteristics and relationships. We present the themes and sub-categories within each theme that emerged in our directed content analysis (see Table 2).
Table 2.
Themes and categories derived from teacher focus groups and interviews on barriers and facilitators to BRIDGE implementation.
| Theme | % of codes | Category | % of codes |
|---|---|---|---|
| 1. BRIDGE intervention | 34.34% | a. Dosage | 14.14% |
| | | b. Content | 12.12% |
| | | c. Support | 6.06% |
| | | d. Model | 2.02% |
| 2. School organization | 32.32% | a. Structure | 12.12% |
| | | b. School support | 10.10% |
| | | c. Incentive/buy-in | 5.05% |
| | | d. Peer support | 5.05% |
| 3. Classroom micro-context | 24.24% | a. Classroom composition | 10.10% |
| | | b. Classroom organization | 10.10% |
| | | c. Target children | 4.04% |
| 4. Teacher and consultant characteristics/relationships | 9.09% | a. Competence/experience | 3.03% |
| | | b. Professional role/readiness | 2.02% |
| | | c. Relationship quality | 4.04% |
BRIDGE Intervention
Several of the teachers spoke about barriers and facilitators associated with components of the BRIDGE intervention (34% of codes). One category that emerged was the dosage of the intervention received (14% of codes), with much of the discussion focused on inadequate dosage. For example, one teacher stated: “At that point we didn’t see the consultants until you know maybe a week or within two weeks. You know that’s when the continuity got lost.” The specific content of the BRIDGE intervention was another category within this theme (12% of codes). One teacher talked about the evidence-based classroom strategies: “The suggestions she gave me were good. And I tried some of them. You know … and for most of the class they worked – except for a few.” Teachers mentioned the CLASS observation framework and the focus of the consultation (i.e., individual or classwide). For example, one suggested: “I think for me, these dimensions cover a lot of what should be going on in classrooms.”
Teacher participants spoke about the personal and professional support associated with the intervention (6% of codes). One teacher who had experienced many difficulties earlier in the year said: “I felt support, I felt like I’m not alone.” Teachers also discussed the overall goals of the BRIDGE model (2% of teacher codes). For example, one commented on the difficulty of reaching the goal of self-reflection: “This program points at both the teacher and the child—which is a little sensitive because people don’t want to look in the mirror and see themselves.”
School Organization
Most teachers mentioned the organizational context of the schools in which BRIDGE was implemented (32% of codes), with specific attention to the barriers within the school structure (12% of codes) and the support teachers felt they received from the school (10% of codes). For instance, teachers spoke about the lack of time for consultation to take place. One teacher spoke about changes to her schedule: “It was just time constraints. I think that a few times we wanted to meet and couldn’t meet because prep got changed.” Another teacher spoke of the number of programs within the school as another barrier, stating “They have so many programs that they don’t have time to really implement this well.” The lack of support from the school for the teachers overall was noted in regard to teachers’ experiences within their classrooms. For example, one teacher spoke about her desire for more help saying, “… You know in a kindergarten class I think that you should have an extra set of hands.”
Teachers discussed the lack of buy-in from the school, or incentive to support teachers in implementation, as another factor that impacted implementation of BRIDGE (5% of codes). One teacher suggested the school needed to change priorities in order to fully commit to BRIDGE: “With this school, that’s just not the way it is.… I mean a big change would have to be made and it’s just not a priority.” Another teacher spoke about the importance of the whole school’s “buy-in” for the implementation of an intervention to be successful, stating:
“There has to be a more consistent school-wide culture that constantly reinforced what the expected behavior is that doesn’t turn into this wishy-washy thing depending on who the teacher is and all these other factors in some cases excuses, that would also help support the teachers to enforce those expectations.”
Teachers also spoke about peer professional support in their school (5% of codes). One teacher expressed the need for support from colleagues who share similar classroom experiences: “I think I would want someone who has recent experience in the classroom and someone who has demonstrated that they have overcome some of their behavioral issues. I feel like that kind of feedback would be of value.” Yet, most teachers indicated a lack of time for this: “Even at grade level meetings, we already have a set agenda so there’s very little time to discuss other things.”
Classroom Micro-context
Teachers identified barriers and facilitators to implementation related to the classroom micro-context (24% of codes). Of the categories under this theme, teachers mainly spoke about the whole classroom composition (10% of codes) and organizational structure (10% of codes).
In these categories, teachers spoke about difficulties and issues that characterized their classrooms, and particular needs with respect to daily interactions with students. For example, one teacher spoke about her students: “… they don’t start their morning in a good way, you know … they come in with all this baggage they have to deal with.” Another teacher commented that in the beginning of the year, “We’d get one misbehavior and then we’d need a lot of strategies to maintain [the students] and we absolutely had it up to here.” Teachers also spoke about strategies already in place prior to the introduction of BRIDGE (e.g., Stoplight), which helped or hindered implementation by creating complementary or competing organizational systems.
A third category within this theme pertained to target students (4% of codes). Specifically, teachers discussed the student-teacher relationship and target student characteristics that affected interactions across the classroom. For example, one teacher reported, “Many students need counseling and other services and they are not getting them. Several students have problems reading and interacting with one another … [this] makes the whole class difficult.”
Implementer Characteristics and Relationships
Teacher participants also described facilitators and barriers related to characteristics of the individual teacher or consultant. Specifically, teachers reflected on the consultants’ competence and experience (3% of codes). One teacher spoke about her consultant being generally knowledgeable but questioned her classroom experience, commenting, “I have the impression that she’s read the stuff and knows a lot of the stuff but she hasn’t really been in the classroom.” Teachers also spoke about their own professional role and readiness (2% of codes). One teacher described herself as: “… Very open to have someone who could help me with behavior management. Even give me feedback on my teaching. For me, this is my 4th year teaching, I am still new enough where I can still learn, but old enough that I am getting in the groove of things.” Finally, teachers mentioned the quality of the teacher-consultant relationship (4% of codes) as a facilitator or a barrier. For instance, one teacher spoke about the level of comfort she had with her consultant, stating “I had a very good relationship with her. I felt very comfortable when I met with her – when we could get a chance to talk.”
Summary of Focus Group and Interview Results
In sum, teachers reflected on implementation barriers and facilitators related to aspects of the BRIDGE intervention, school organization, classroom micro-context, and implementer characteristics and relationships. Each of these themes and their sub-categories emerged through directed content analysis and provide information to consider in refinements of the intervention model and implementation system.
Results II: Case Studies
The intervention materials and fidelity measures collected during implementation provide data for case studies of moderate-high dosage classrooms that varied in level of need for intervention. Students in the four case study classrooms were primarily from low-income Latino families, with a small subset classified as English language learners. Classrooms ranged from 12–25 students in kindergarten through grade three.4 In three of the four classrooms, teachers identified as Latino and spoke Spanish and English; the teachers in the fourth classroom identified as White and spoke English only. Our analysis focuses on the content and pattern of BRIDGE implementation in each of these four classrooms. We first present results from the high-need classrooms and then from the low-need classrooms. For sources and relevant dimensions of the specific classwide and targeted strategies described below, see Table 1.
High-Need Classrooms: “How Much is Enough”
Ms. S’s classroom was observed to have a moderate level of emotional support (4.88) and a low level of organization (3.50) prior to the intervention. During the initial interview with her BRIDGE consultant, Ms. S reported difficulties in managing the classroom as a whole. During the initial observations conducted by the BRIDGE consultant, negative interactions were recorded around management of student behaviors (e.g., “[Ms. S is] reactive when it got rowdy,” “lots of kids wandering even when whole group is on rug”) and unproductive transitions (e.g., “lots of waiting and wandering,” “transitions slow because of wandering”).
Ms. O’s classroom, on the other hand, was observed to have low levels of both emotional support (2.75) and classroom organization (2.83) prior to intervention. At the initial interview, Ms. O reported difficulties in managing students’ inattentive and disruptive behaviors. Also in the interview, Ms. O expressed challenges in addressing the range of behavioral and academic needs among students and concerns about the mismatch between their family backgrounds and the peer and school context. At the start of BRIDGE implementation, the consultant noted in an initial classroom observation “yelling” to redirect or punish misbehavior and “many students waiting” for the teacher to deliver instructions or for academic activities to begin.
Specific dimensions of practice
Much of the BRIDGE consultation and coaching in these high-need classrooms focused on a small number of CLASS dimensions. Specifically, in Ms. S’s classroom, 71% of the consultations focused on behavior management and productivity, with implementation of the Good Behavior Game (Barrish et al., 1969; Embry, 2002) as the primary way to improve behavior and maximize learning time. In Ms. O’s classroom, consultations focused mainly on positive climate (86% of the consultations) and productivity (57% of the consultations), with classwide strategies implemented to encourage positive interactions among students (positive climate) and increase productivity for students who finished activities early (productivity). Although the goal was for teachers and consultants to cycle through more than two of the CLASS dimensions over the course of BRIDGE implementation, this did not occur in the high-need classrooms.
Classwide needs and strategies
Implementation data suggest BRIDGE consultants working with high-need classrooms focused primarily on classwide needs and strategies. Ms. S’s consultations involved coaching to implement classwide strategies such as the Good Behavior Game (GBG: Embry, 2002), Think-Pair-Share (Lyman, 1987), and positive peer reporting or “tootling” (Cihak, Kirk, & Boon, 2009; Skinner, Cashwell, & Skinner, 2000). Selected targeted strategies were used by the BRIDGE consultant but not fully integrated into teacher practice. Similarly, in Ms. O’s classroom, consultation focused on classwide strategies to improve teacher-student and peer interactions (e.g., “peer tootling”; Lambert, Tingstrom, Sterling, Dufrene, & Lynne, 2014) and to maximize learning time and establish routines (e.g., activity box; Greenwood, 1997). The BRIDGE consultant also provided tips for modifying specific students’ behaviors, but these were not the primary focus of consultation and coaching.
Implementation difficulties
The teachers experienced difficulties implementing new strategies in their classrooms. For example, the BRIDGE consultant observed Ms. S when she implemented the Good Behavior Game (Embry, 2002) and noted that the teacher did not state the rules of the game, did not implement the strategy during key times, and did not reinforce the rules consistently. In Ms. O’s case, the BRIDGE consultant found the teacher sometimes implemented the strategies in unintended ways. For example, the consultant and teacher agreed to implement positive peer reporting to improve positive climate in the classroom. The strategy involves students writing positive statements about what a target student says or does. Ms. O thought the strategy would be useful to teach new vocabulary. This creative appropriation of the strategy for an unintended purpose may have been an effective instructional tool, but the negative peer interactions remained unaddressed.
Active and ongoing coaching
In part due to these implementation difficulties, BRIDGE consultants played an active and ongoing role in coaching and modeling. The teachers expressed low levels of confidence and competence in implementing new strategies without coaching support. Therefore, BRIDGE consultants were asked to introduce and implement strategies the first time they were used. After the introduction, however, teachers continued to request support in implementing strategies, including co-leading classwide strategies (e.g., GBG: Ms. S) and leading targeted strategies (e.g., behavioral contract: Ms. S and Ms. O).
Summary of implementation in high-need classrooms
In sum, the intervention materials and fidelity measures from high-need classrooms revealed a focus on a small number of dimensions of effective teaching practice, implementation of classwide strategies rather than targeted student strategies, difficulties in using strategies as intended and with fidelity, and need for active and ongoing coaching support.
Low-Need Classrooms: “Always Can Use Extra Help”
Ms. H’s and Ms. B’s classrooms represent moderate-high dosage, low-need classrooms. In a pre-intervention observation, Ms. H and her co-teacher were observed to have high scores in both emotional support (5.88) and classroom organization (6.00). In the initial observation, the BRIDGE consultant found no significant classwide issues, noting “very positive climate”, “routines in place”, and “children are very engaged.” During the initial interview, Ms. H mentioned two children with inattentiveness and mild behavioral difficulties and one child with significant academic and social challenges who experienced peer victimization. The BRIDGE consultant observed this child “consistently calling for attention, lacking self-awareness.” When this child gave an incorrect answer, it was met with “giggling from other students.”
Similarly, Ms. B’s classroom was positive and well-organized prior to intervention, as indicated by moderate to high CLASS scores (emotional support = 4.75; classroom organization = 5.17). In an initial observation, the BRIDGE consultant noted a need for higher productivity (e.g., “Transition a little lengthy: takes a while for them to settle down and begin work. It is pretty clear, though, what needs to get done”) and teacher sensitivity (“maybe more aware of girls in back of room”). Yet the consultant found no significant overall difficulties across the classroom. Individual student challenges were seen, however, with three students who displayed inattentive and disruptive behaviors, including “tease other kids,” “not staying in seat,” and “drift off really easily.”
Focus on target students
BRIDGE implementation in these low-need classrooms was focused on managing the behaviors of specific targeted students. Because the classrooms had adequate pre-intervention levels of warmth and emotional support with well-established routines and positive and proactive behavioral management practices, most of the consultation and coaching was focused on targeted strategies for students with behavioral difficulties. For example, Ms. H’s classroom had three target students: One child experienced inattention, another was identified as having a poor self-concept and behavioral difficulties, and the third was victimized by classmates (e.g., teased, bullied) because of language, social, and developmental challenges. The BRIDGE consultant worked with the teachers to choose targeted strategies, such as a self-monitoring card (Mooney, Ryan, Uhing, Reid, & Epstein, 2005) for inattention, positive performance feedback (Warren et al., 2006) for the student with low self-concept, and positive peer reporting (Skinner et al., 2000) for the student who experienced teasing and bullying (to accompany individual treatment). Similarly, Ms. B and the BRIDGE consultant chose targeted strategies, such as self-monitoring, good news notes, and behavioral contracts, to help the three target students stay engaged during academic activities (e.g., Mooney et al., 2005). The goal of the consultation was primarily to determine the most effective strategies for each student.
Teacher and consultant as partners
BRIDGE implementation in low-need classrooms was conducted in partnership. Teachers and consultants made joint decisions about strategies to implement for the targeted students, with teachers taking a leadership role in the implementation and consultants taking a supporting role. When Ms. H implemented a targeted strategy, the consultant did not model or co-lead; instead, the consultant monitored the student’s progress and provided feedback on implementation quality. In Ms. B’s case, the teacher asked the consultant to introduce the strategies to target students at the start; then, the teacher led subsequent and ongoing implementation of these strategies. When an initial strategy did not work, Ms. B worked with the consultant to identify additional tools to supplement or replace the strategy.
Effective modification and implementation of strategies
Low-need teachers worked with consultants to tailor BRIDGE strategies to fit the needs of targeted students. For example, in Ms. H’s classroom, the teacher modified the self-monitoring procedure to increase the teacher’s role in co-monitoring students’ attention and off-task behaviors. This modification was designed to improve the strategy’s efficacy for a student who experienced difficulty with self-reflection. Similarly, although Ms. B implemented the good news note (Blechman, Taylor, & Schrader, 1981) consistently with target students at the start of the consultation, she shifted to a daily report card (Kelley, 1990) when she and the consultant recognized that parent reinforcement was needed (e.g., “each day the students would take the card home and get it signed by the parent”). These planned modifications occurred in collaboration, maintained the original goal (e.g., improving student behavior), and preserved the effectiveness of the strategy.
Summary of implementation in low-need classrooms
Overall, these case studies reveal that, in classrooms with moderate-to-high pre-intervention emotional support and classroom organization scores, consultation and coaching focused primarily on targeted student interactions. In addition, consultation sessions involved collaborative decision-making between the teacher and consultant. Lastly, implementation of targeted strategies was appropriate and effective.
Discussion
Responding to calls for increased understanding of implementation processes in practice (Hoagwood et al., 2013), the current study used qualitative and quantitative data to illuminate the barriers, opportunities, and processes underlying the implementation of BRIDGE teacher consultation and coaching in urban elementary schools (citation omitted). Unlike many well-studied, classroom-focused intervention models, BRIDGE is implemented by existing school-based mental health staff and responsive to individual classroom and target student needs. Results from directed content analysis of teacher focus group and interview data suggest that aspects of the BRIDGE intervention model, school organization and classroom contexts, and teachers/consultants and their relationship were relevant as implementation facilitators or barriers. In addition, case study analysis of permanent products from high-implementation classrooms suggests variation in consultation and coaching by initial level of classroom need. Results illuminate the need for implementation research to extend beyond simple indicators of dosage and fidelity to the multiple systems and variation in processes at play across levels of the implementation context.
Implementation Barriers and Facilitators
In the context of a classroom-randomized trial of BRIDGE effects on classrooms and students, we gathered qualitative data from teachers to understand their experience of BRIDGE implementation in their schools. Guided by Damschroder and colleagues’ (2009) implementation model, we coded four major themes and emergent categories within each theme.
A major focus of teachers’ comments was specific aspects of the BRIDGE model, including concerns about the frequency and consistency of consultation and coaching, remarks on components that were appealing (e.g., classroom strategies) or challenging (e.g., self-reflection), and the presence or absence of support received from BRIDGE consultants. These themes are not surprising, as prior scholarship suggests the intervention itself – including its components, logistics, and content – is a factor in initial engagement and ongoing implementation (Graczyk et al., 2006). Interestingly, given recent discussion as to whether relational support or concrete information is more important to provide in effective teacher consultation (e.g., Knotek & Hylander, 2014), teachers in the current study spontaneously mentioned both factors when describing their interest and progress. The main barrier cited – inconsistencies in implementation dosage due to logistical constraints – requires shifts in future trials to better integrate the intervention with the regular schedule and daily fabric of schooling.
School organization and classroom contexts were also mentioned as barriers to implementation. Lack of support from the school (e.g., time, peer professional learning interactions, supportive culture) made it difficult to implement BRIDGE well. Some scholars suggest that a supportive and organized context (e.g., “organizational readiness”; Weiner, Amick, & Lee, 2008) is necessary for consistent and high-quality implementation of EBIs. Our data do not test organizational readiness, but they do indicate that school organizational structure and culture may be critical to consider in implementation. In addition, classroom composition and practices were mentioned with relative frequency. As seen in other teacher reports (Reinke et al., 2011), teachers in the current study cited students’ problem behaviors and poverty-related stress as challenges to effective teaching. Some teachers indicated these factors made their classroom ripe for BRIDGE, a two-tier intervention focused on classrooms and target students and responsive to teachers’ strengths, needs, and practices. Others suggested the level of student need made it difficult to fully implement BRIDGE. This may add to the call for linking across all three tiers of intervention (universal, targeted, and intensive) so students with the most significant needs receive aligned services (Cappella et al., 2008).
Finally, teachers discussed their own willingness to receive support, their consultants’ competence, and the teacher-consultant relationship. Some research indicates that teacher traits, such as openness and emotional competence, are predictive of uptake and implementation of evidence-based practices (see Durlak & DuPre, 2008), a pattern reflected in our data as well. In addition, and also replicating prior research (e.g., Rohrbach, Grana, Sussman, & Valente, 2006), the extent of the consultant’s expertise was seen as relevant. BRIDGE consultants ranged in their roles in the school and relationships with teachers. Consultants with more competence in classroom practices and better relationships with teachers were seen as more effective implementers. This may speak to selecting staff with these characteristics (e.g., lead or mentor teachers) as BRIDGE consultants, or alternatively, to training staff who are equipped to acquire them (e.g., school mental health personnel with teacher consultation experience).
Implementation Heterogeneity by Classroom Need
In related analyses, we identified four classrooms with moderate-high levels of BRIDGE implementation and varying levels of initial need. These classrooms served as the sample for a case study analysis. We examined intervention materials and fidelity tools to determine the content and pattern of implementation across classrooms with different levels of observed need.
In exploration of data from high-need classrooms (low initial CLASS scores) versus low-need classrooms (moderate-high initial CLASS scores), we identified several areas of divergence in implementation. First, in high-need classrooms, consultant-teacher dyads focused on a limited number of CLASS dimensions throughout the intervention. The goal in BRIDGE was for each dyad to cycle through several dimensions of emotional support and classroom organization over the intervention period, moving to a new dimension once assessments indicated sufficient progress. High-need classrooms instead focused on approximately two dimensions (e.g., behavior management, positive climate) for the entire period. Results suggest the difficulties observed by consultants and/or raised by teachers within these dimensions of practice were substantial enough to require multiple cycles of assessment, feedback, and action. Thus, depth of intervention was greater in high-need classrooms and breadth of intervention was greater in low-need classrooms.
Second, high-need classrooms focused on implementation of classwide strategies to prevent behavior problems and promote a positive classroom climate (e.g., Good Behavior Game, positive peer reporting). Low-need classrooms emphasized implementation of targeted strategies to impact the behavior and well-being of specific students with behavioral problems (e.g., daily report card, self-monitoring). This finding is aligned with the public health model in schools (Kutash, Duchnowski, & Lynn, 2006; Nastasi, 2004), whereby universal intervention is implemented first to strengthen the setting, and targeted or intensive intervention follows to address specific needs. Although BRIDGE consultants were not instructed to administer intervention in this manner, the pattern matches existing models. In fact, it may benefit BRIDGE and other interventions to make these processes and goals explicit in the future.
Third, teachers in high-need classrooms had difficulty using strategies as intended and with fidelity, whereas use of targeted strategies in low-need classrooms was generally appropriate and effective. This may reflect the relative difficulty of implementing classwide versus targeted strategies: the Good Behavior Game requires a broader and deeper set of skills to implement well than does a self-monitoring card (see Becker, Bradshaw, Domitrovich, & Ialongo, 2013). Alternatively, it may reflect the level of teacher skills and the prevalence of student behavioral problems across the classroom. Under this account, teachers in high-need classrooms struggle to use their skills to meet classroom demands and thus require more active and ongoing coaching to implement complex classwide strategies well, whereas teachers in low-need classrooms have the requisite skills and/or a more manageable group of students and may therefore be better able to work with their consultant to choose targeted strategies and implement them with minimal support. The BRIDGE model was designed to provide flexible coaching. However, it is clear here that the intensity of implementation support (i.e., coaching) was greater for consultants working in high-need classrooms. It may be helpful in the future to consider classroom need prior to creating teacher-consultant pairs in order to ensure that consultants have the time and capability to meet classroom needs.
Limitations and Future Directions
Several limitations are important to consider. First, the sample is small, and the study involves one public school system and one intervention model. These results are not generalizable beyond this set of teachers and schools or this teacher consultation and coaching model. The current findings are aligned with other research, and the intervention model is based on a widely used observation system and evidence-based classroom strategies; nevertheless, it is important to conduct research in other settings to understand implementation processes across contexts and populations. Second, although this study has a strong foundation in implementation science and school mental health, findings are descriptive and exploratory and should be interpreted as such. Third, due to the need for sufficient data to analyze, the classroom case studies included only classrooms with moderate-to-high implementation. Future work should use other kinds of data to enable inclusion of low-implementation classrooms in case study analysis. Fourth, both teachers and consultants were critical to BRIDGE implementation, yet this study focuses solely on teachers. In a future trial, it will be critical to incorporate consultant strengths, concerns, and experiences into the analysis. Lastly, because BRIDGE is based on a specific system for understanding effective classrooms (the CLASS), the classrooms in the case studies were split into high and low levels of classroom need as assessed by the CLASS. It may be important in future work to define and measure classroom needs differently to determine whether similar implementation patterns emerge.
Still, this study has practical implications for classroom interventions such as BRIDGE and for implementation science. First, the current findings suggest that teacher consultation and coaching should be integrated not only into the daily activities of existing school personnel but also into school organizational structures (e.g., meeting times, professional development) and their social climates. Second, models that provide both relational support and functional support can be implemented; these supports may engage different teachers, perhaps leading to more implementation overall. Third, in high-poverty schools, prevention and intervention scientists need to attend more fully to the third (intensive) tier. Given the levels of student need, linking two-tier programs to the third tier may reduce teacher stress and increase implementation of effective practices. Fourth, classroom interventions may be more efficient when priorities and personnel are explicitly based on initial classroom need. For example, high-need classrooms may benefit from classwide coaching by a lead or mentor teacher, whereas low-need classrooms may benefit from targeted student consultation by a school mental health professional.
Finally, the current study is an example of how to use existing data from intervention trials to illuminate the process of implementation. As a field, we know a great deal about effective programs and practices, and researchers increasingly document implementation outcomes such as dosage and adherence. We know less, however, about patterns, processes, and heterogeneity in implementation, particularly for intervention models delivered by existing school and community resources. Conducting systematic research on implementation processes, using commonly available data and guided by strong theoretical frameworks, will inform future implementation research and the installation of evidence-based implementation systems. Taking this step promises to move the field beyond a basic understanding of implementation outcomes toward a theoretically rich, data-based, and practically relevant understanding of effective implementation of EBIs as a means toward positive outcomes for more students in schools.
Acknowledgments
This research was supported, in part, by a Center grant from the National Institute of Mental Health (NIMH 1P20MH078458-01A2; PI: Atkins). The opinions expressed are those of the authors and do not represent the views of the NIMH. We would like to thank the LINKS Center Investigators, research assistants from New York University, and the school and community collaborators who contributed time and expertise to this effort.
Funding: This study was funded by NIMH 1P20MH078458-01A2 (Principal Investigator: Marc S. Atkins).
Footnotes
In three instances, university consultants with education levels similar to those of the community and school professionals (master's level) supplemented BRIDGE delivery when staff left the school (one case) or took maternity leave (two cases).
In our analysis, the implementation process domain was embedded within the intervention characteristics domain.
Exposure to consultation sessions or classroom strategies was only counted if minimal levels of adherence were met (e.g., the primary elements of the consultation session or classroom strategy were present).
Specific class sizes and grades are removed to protect confidentiality.
Conflict of Interest: Elise Cappella serves on the Evaluation Advisory Committee of Good Shepherd Services (unrelated to this study); Caroline Bilal declares that she has no conflict of interest; Daisy R. Jackson declares that she has no conflict of interest; Ha Yeon Kim declares that she has no conflict of interest; Sibyl Holland declares that she has no conflict of interest; Marc S. Atkins declares that he has no conflict of interest.
Compliance with Ethical Standards:
Ethical approval: All procedures performed in this study were in accordance with the ethical standards of the university institutional and school district research committees and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent: Informed consent was obtained from all individual participants included in the study.
Contributor Information
Elise Cappella, New York University
Daisy R. Jackson, The Reeds Center
Ha Yeon Kim, Harvard University
Caroline Bilal, New York University
Sibyl Holland, Harvard University
Marc S. Atkins, University of Illinois at Chicago
References
- Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health & Mental Health Services Research. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Aber JL, Jones SM, Brown JL, Chaudry N, Samples F. Resolving conflict creatively: Evaluating the developmental effects of a school-based violence prevention program in neighborhood and classroom context. Development and Psychopathology. 1998;10(02):187–213. doi: 10.1017/s0954579498001576. [DOI] [PubMed] [Google Scholar]
- Adelman HS, Taylor L. Mapping a school’s resources to improve their use in preventing and ameliorating problems. In: Franklin C, Harris MB, Allen-Meares P, editors. The school services sourcebook: A guide for school-based professionals. New York: Oxford University Press; 2006. pp. 977–990. [Google Scholar]
- Allen JP, Pianta RC, Gregory A, Mikami AY, Lun J. An interaction-based approach to enhancing secondary school instruction and student achievement. Science. 2011;333(6045):1034–1037. doi: 10.1126/science.1207998. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Atkins MS, Frazier S, Leathers S, Graczyk P, Talbott E, Gibbons R. Teacher key opinion leaders and mental health consultation in urban low-income schools. Journal of Consulting and Clinical Psychology. 2008;76:905–908. doi: 10.1037/a0013036. [DOI] [PubMed] [Google Scholar]
- Atkins MS, Shernoff ES, Frazier SL, Schoenwald SK, Cappella E, Marinez-Lora A, Bhaumik D. Re-designing community mental health services for urban children: Supporting schooling to promote mental health. Journal of Consulting and Clinical Psychology. 2015;83(5):839–852. doi: 10.1037/a0039661. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Barrish HH, Saunders M, Wolf MM. Good Behavior Game: Effects of individual contingencies for group consequences on disruptive behavior in a classroom. Journal of Applied Behavior Analysis. 1969;2:119–124. doi: 10.1901/jaba.1969.2-119. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Becker KD, Bradshaw CP, Domitrovich C, Ialongo NS. Coaching teachers to improve implementation of the good behavior game. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40(6):482–493. doi: 10.1007/s10488-013-0482-8. [DOI] [PubMed] [Google Scholar]
- Berryhill JC, Prinz RJ. Environmental interventions to enhance student adjustment: Implications for prevention. Prevention Science. 2003;4(2):65–87. doi: 10.1023/a:1022994514767. [DOI] [PubMed] [Google Scholar]
- Berwick DM. The science of improvement. Journal of the American Medical Association. 2008;299(10):1182–1184. doi: 10.1001/jama.299.10.1182. [DOI] [PubMed] [Google Scholar]
- Blechman EA, Taylor CJ, Schrader SM. Family problem solving versus home notes as early intervention with high-risk children. Journal of Consulting and Clinical Psychology. 1981;49:919–926. doi: 10.1037/0022-006X.49.6.919. [DOI] [PubMed] [Google Scholar]
- Brener ND, Weist M, Adelman H, Taylor L, Vernon-Smiley M. Mental health and social services: Results from the school health policies and programs study 2006. Journal of School Health. 2007;77(8):486–499. doi: 10.1111/j.1746-1561.2007.00231.x. [DOI] [PubMed] [Google Scholar]
- Brock LL, Nishida TK, Chiong C, Grimm KJ, Rimm-Kaufman SE. Children’s perceptions of the classroom environment and social and academic performance: A longitudinal analysis of the contribution of the Responsive Classroom approach. Journal of School Psychology. 2008;46:129–149. doi: 10.1016/j.jsp.2007.02.004. [DOI] [PubMed] [Google Scholar]
- Brophy J, Good T. Teacher behavior and student achievement. In: Wittrock M, editor. Handbook of research on teaching. 3. New York: Macmillan; 1986. [Google Scholar]
- Brown JL, Jones SM, LaRusso MD, Aber JL. Improving classroom quality: Teacher influences and experimental impacts of the 4R program. Journal of Educational Psychology. 2010;102:153–167. doi: 10.1037/a0018160. [DOI] [Google Scholar]
- Cappella E, Frazier SL, Atkins MS, Schoenwald SK, Glisson C. Enhancing schools’ capacity to support children in poverty: An ecological model of school-based mental health services. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(5):395–409. doi: 10.1007/s10488-008-0182-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cappella E, Reinke WM, Hoagwood KE. Advancing intervention research in school psychology: Finding the balance between process and outcome for social and behavioral interventions. School Psychology Review. 2011;40(4):455–464. [Google Scholar]
- Cihak DF, Kirk ER, Boon RT. Effects of classwide positive peer “tootling” to reduce the disruptive classroom behaviors of elementary students with and without disabilities. Journal of Behavioral Education. 2009;18(4):267–278. [Google Scholar]
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50. doi: 10.1186/1748-5908-4-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clinical Psychology Review. 1998;18:23–45. doi: 10.1016/s0272-7358(97)00043-3. [DOI] [PubMed] [Google Scholar]
- Durlak J. ASPE Research Brief. US Department of Health and Human Services; 2013. The importance of quality implementation for research, practice, and policy. [Google Scholar]
- Durlak JA, DuPre EP. Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology. 2008;41(3–4):327–350. doi: 10.1007/s10464-008-9165-0. [DOI] [PubMed] [Google Scholar]
- Durlak JA, Weissberg RP, Dymnicki AB, Taylor RD, Schellinger KB. The impact of enhancing students’ social and emotional learning: A meta-analysis of school-based universal interventions. Child Development. 2011;82:405–432. doi: 10.1111/j.1467-8624.2010.01564.x. [DOI] [PubMed] [Google Scholar]
- Embry DD. The Good Behavior Game: A best practice candidate as a universal behavioral vaccine. Clinical Child and Family Psychology Review. 2002;5:273–297. doi: 10.1023/a:1020977107086. [DOI] [PubMed] [Google Scholar]
- Embry DD. Community-based prevention using simple, low-cost, evidence-based kernels and behavior vaccines. Journal of Community Psychology. 2004;32(5):575–591. [Google Scholar]
- Fixsen D, Naoom S, Blase K, Friedman R, Wallace F. Implementation research: A synthesis of the literature (FMHI Publication No. 231) Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network; 2005. [Google Scholar]
- Forman SG, Olin SS, Hoagwood KE, Crowe M, Saka N. Evidence-based interventions in schools: Developers’ views of implementation barriers and facilitators. School Mental Health. 2009;1(1):26–36. [Google Scholar]
- Gottfredson DC, Cook TD, Gardner FE, Gorman-Smith D, Howe GW, Sandler IN, Zafft KM. Standards of evidence for efficacy, effectiveness, and scale-up research in prevention science: Next generation. Prevention Science. 2015:1–34. doi: 10.1007/s11121-015-0555-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Graczyk PA, Domitrovich CE, Small M, Zins JE. Serving all children: An implementation model framework. School Psychology Review. 2006;35(2):266–274. [Google Scholar]
- Greenwood J. Activity box: A resource book for teachers of young students. Cambridge, England: Cambridge University Press; 1997. [Google Scholar]
- Hamre BK, Pianta RC. Can instructional and emotional support in the first-grade classroom make a difference for children at risk of school failure? Child Development. 2005;76:949–967. doi: 10.1111/j.1467-8624.2005.00889.x. [DOI] [PubMed] [Google Scholar]
- Hoagwood K, Atkins M, Ialongo N. Unpacking the black box of implementation: The next generation for policy, research and practice. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40(6):451–455. doi: 10.1007/s10488-013-0512-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qualitative Health Research. 2005;15:1277–1288. doi: 10.1177/1049732305276687. [DOI] [PubMed] [Google Scholar]
- Hughes JN, Cavell TA. Influence of the teacher-student relationship in childhood conduct problems: A prospective study. Journal of Clinical Child Psychology. 1999;28(2):173–184. doi: 10.1207/s15374424jccp2802_5. [DOI] [PubMed] [Google Scholar]
- Jones SM, Brown JL, Lawrence Aber J. Two-year impacts of a universal school-based social-emotional and literacy intervention: An experiment in translational developmental research. Child Development. 2011;82(2):533–554. doi: 10.1111/j.1467-8624.2010.01560.x. [DOI] [PubMed] [Google Scholar]
- Kazdin AE. The token economy: A decade later. Journal of Applied Behavior Analysis. 1982;15(3):431–445. doi: 10.1901/jaba.1982.15-431. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kelley ML. School-home notes: Promoting children’s classroom success. New York: Guilford; 1990. [Google Scholar]
- Kern L, Childs KE, Dunlap G, Clarke S, Falk GD. Using assessment-based curricular intervention to improve the classroom behavior of a student with emotional and behavioral challenges. Journal of Applied Behavior Analysis. 1994;27(1):7–19. doi: 10.1901/jaba.1994.27-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Knotek SE, Hylander IE. Research issues in mental health consultation and consultee-centered approaches. In: Erchul WP, Sheridan SM, editors. Handbook of research in school consultation. New York: Psychology Press; 2014. pp. 153–179. [Google Scholar]
- Kraemer EE, Davies SC, Arndt KJ, Hunley S. A comparison of the mystery motivator and the Get ’Em On Task interventions for off-task behaviors. Psychology in the Schools. 2012;49(2):163–175. [Google Scholar]
- Kutash K, Duchnowski A, Lynn N. School-based mental health: An empirical guide for decision makers. University of South Florida, The Louis De La Parte Florida Mental Health Institute, Department of Child and Family Studies, Research and Training Center for Children’s Mental Health; 2006. Retrieved on June 16, 2015 from http://rtckids.fmhi.usf.edu/rtcpubs/study04/ [Google Scholar]
- Lambert AM, Tingstrom DH, Sterling HE, Dufrene BA, Lynne S. Effects of tootling on classwide disruptive and appropriate behavior of upper-elementary students. Behavior Modification. 2014. doi: 10.1177/0145445514566506. [DOI] [PubMed] [Google Scholar]
- Lyman F. Think-pair-share: An expanding teaching technique. MAA-CIE Cooperative News. 1987;1:1–2. [Google Scholar]
- Mashburn AJ, Pianta RC, Hamre BK, Downer JT, Barbarin A, Bryant D, Howes C. Measures of classroom quality in prekindergarten and children’s development of academic, language, and social skills. Child Development. 2008;79:732–749. doi: 10.1111/j.1467-8624.2008.01154.x. [DOI] [PubMed] [Google Scholar]
- Miles SB, Stipek D. Contemporaneous and longitudinal associations between social behavior and literacy achievement in a sample of low-income elementary school children. Child Development. 2006;77(1):103–117. doi: 10.1111/j.1467-8624.2006.00859.x. [DOI] [PubMed] [Google Scholar]
- Miller WR, Rollnick S. Motivational interviewing: Preparing people for change. New York: Guilford Press; 2002. [Google Scholar]
- Mooney P, Ryan JB, Uhing BM, Reid R, Epstein MH. A review of self-management interventions targeting academic outcomes for students with emotional and behavioral disorders. Journal of Behavioral Education. 2005;14:203–221. [Google Scholar]
- Morgan DL. Focus groups as qualitative research. Sage; Newbury Park, CA: 1998. [Google Scholar]
- Nastasi BK. Meeting the challenges of the future: Integrating public health and public education for mental health promotion. Journal of Educational & Psychological Consultation. 2004;15:295–312. doi: 10.1207/s1532768xjepc153&4_6. [DOI] [Google Scholar]
- Owens JS, Richerson L, Beilstein EA, Crane A, Murphy CE, Vancouver JB. School-based mental health programming for children with inattentive and disruptive behavior problems: First-year treatment outcome. Journal of Attention Disorders. 2003;9:261–274. doi: 10.1177/1087054705279299. [DOI] [PubMed] [Google Scholar]
- Park S, Hironaka S, Carver P, Nordstrum L. Continuous improvement in education. Carnegie Foundation for the Advancement of Teaching; 2013. Retrieved from: http://www.carnegiefoundation.org/resources/publications/continuous-improvement-education/ [Google Scholar]
- Patton MQ. Developmental evaluation: Applying complexity concepts to enhance innovation and use. Guilford Press; 2011. [Google Scholar]
- Pianta RC, Hamre BK, Allen JP. Teacher-student relationships and engagement: Conceptualizing, measuring, and improving the capacity of classroom interactions. In: Christenson SL, Reschly AL, Wylie C, editors. Handbook of research on student engagement. New York: Springer; 2012. pp. 365–386. [Google Scholar]
- Pianta RC, La Paro KM, Hamre BK. Classroom Assessment Scoring System Manual. Baltimore, MD: Brooks Publishing Co; 2008a. [Google Scholar]
- Pianta RC, Mashburn AJ, Downer JT, Hamre BK, Justice L. Effects of web-mediated professional development resources on teacher-child interactions in pre-kindergarten classrooms. Early Childhood Research Quarterly. 2008b;23:431–451. doi: 10.1016/j.ecresq.2008.02.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Raver CC, Jones SM, Li-Grining CP, Metzger M, Champion KM, Sardin L. Improving preschool classroom processes: Preliminary findings from a randomized trial implemented in Head Start settings. Early Childhood Research Quarterly. 2008;23:10–26. doi: 10.1016/j.ecresq.2007.09.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Raver CC, Jones SM, Li-Grining C, Zhai F, Bub K, Pressler E. CSRP’s impact on low-income preschoolers’ preacademic skills: Self-regulation as a mediating mechanism. Child Development. 2011;82:362–378. doi: 10.1111/j.1467-8624.2010.01561.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Reddy LA, Newman E, De Thomas CA, Chun V. Effectiveness of school-based prevention and intervention programs for children and adolescents with emotional disturbance: A meta-analysis. Journal of School Psychology. 2009;47(2):77–99. doi: 10.1016/j.jsp.2008.11.001. [DOI] [PubMed] [Google Scholar]
- Reid R, Gonzalez JE, Nordness PD, Trout A, Epstein MH. A meta-analysis of the academic status of students with emotional/behavioral disturbance. The Journal of Special Education. 2004;38:130–143. [Google Scholar]
- Reinke WM, Lewis-Palmer T, Merrell K. The classroom check-up: A classwide teacher consultation model for increasing praise and decreasing disruptive behavior. School Psychology Review. 2008;37(3):315–332. [PMC free article] [PubMed] [Google Scholar]
- Reinke WM, Stormont M, Herman KC, Puri R, Goel N. Supporting children’s mental health in schools: Teacher perceptions of needs, roles, and barriers. School Psychology Quarterly. 2011;26:1–13. doi: 10.1037/a0022714. [DOI] [Google Scholar]
- Rohrbach LA, Grana R, Sussman S, Valente TW. Type II translation: Transporting prevention interventions from research to real-world settings. Evaluation and the Health Professions. 2006;29:302–333. doi: 10.1177/0163278706290408. [DOI] [PubMed] [Google Scholar]
- Rohrbeck CA, Ginsburg-Block MD, Fantuzzo JW, Miller TR. Peer-assisted learning interventions with elementary school students: A meta-analytic review. Journal of Educational Psychology. 2003;95(2):240–257. [Google Scholar]
- Saunders RP, Evans MH, Joshi P. Developing a process-evaluation plan for assessing health promotion program implementation: a how-to guide. Health Promotion Practice. 2005;6(2):134–147. doi: 10.1177/1524839904273387. [DOI] [PubMed] [Google Scholar]
- Seidman E. An emerging action science of social settings. American Journal of Community Psychology. 2012;50:1–16. doi: 10.1007/s10464-011-9469-3. [DOI] [PubMed] [Google Scholar]
- Skinner CH, Cashwell TH, Skinner AL. Increasing tootling: The effects of a peer-monitored group contingency program on students’ reports of peers’ prosocial behaviors. Psychology in the Schools. 2000;37:263–270. [Google Scholar]
- Walberg HJ, Paik SJ. Effective educational practices. Brussels, Belgium: International Academy of Education; 2000. Educational Practices Series–3. [Google Scholar]
- Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, Saul J. Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology. 2008;41(3–4):171–181. doi: 10.1007/s10464-008-9174-z. [DOI] [PubMed] [Google Scholar]
- Warren JS, Bohanon-Edmonson HM, Turnbull AP, Sailor W, Wickham D, Griggs P, Beech SE. School-wide positive behavior support: Addressing behavior problems that impede student learning. Educational Psychology Review. 2006;18(2):187–198. [Google Scholar]
- Watson TS, Skinner CH. Functional behavioral assessment: Principles, procedures, and future directions. School Psychology Review. 2001;30:156–172. [Google Scholar]
- Weiner B, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Medical Care Research and Review. 2008;65(4):379–436. doi: 10.1177/1077558708317802. [DOI] [PubMed] [Google Scholar]
- Yin RK. Case study research: Design and methods. 3. Sage Publications; Thousand Oaks: 2003. [Google Scholar]


