Behavior Analysis in Practice
2022 Feb 4;15(3):924–937. doi: 10.1007/s40617-021-00663-8

Planning Positive Reinforcement Cycles in Behavior Intervention Plans

Kathleen N. Zimmerman, Jessica N. Torelli, Jason C. Chow
PMCID: PMC9582084  PMID: 36465599

Abstract

Behavior analysts partner with educators in schools to support the creation of behavior intervention plans (BIPs). Assessment and intervention planning often focuses on the relational contingencies between the student and their environment, with little attention paid to the relational contingencies contacted by the educator. In this article, we posit that planning should simultaneously include contingencies for both the student and the educator as BIPs are created. Specifically, we aim to explore a dual-pathway intervention plan in which student and educator access to reinforcement is simultaneously designed to increase both educators’ implementation of high-quality instruction and students’ engagement and performance. Procedural steps outlining the duality of intervention planning for both the student and the educator, as well as a theoretical model for considering contextual and reinforcement contingencies for both parties, will be detailed in a step-by-step guide to support readers’ creation and implementation of plans to support improved educator and student performance. Planning for supporting both the educator and student may increase sustained, high-quality instruction and improved student outcomes for students with behavioral support needs.

Supplementary Information

The online version contains supplementary material available at 10.1007/s40617-021-00663-8.

Keywords: Behavior intervention plans, Fidelity, School-based behavior analysis


Behavior intervention plans (BIPs) are detailed support plans designed to teach students new skills or behaviors (replacement behaviors) to use instead of challenging behavior to gain access to desired items, activities, or individuals or to escape undesired items, activities, or individuals (see Steege et al., 2019, for a detailed account of processes to conduct BIPs in schools). BIPs are informed by the results of functional behavior assessments (FBAs; e.g., Steege et al., 2019). FBA is a process used to identify variables influencing challenging behavior—namely, establishing operations and reinforcers (e.g., Hanley, 2012). BIPs include antecedent strategies to prevent challenging behavior, instructional strategies to teach replacement behaviors, and consequence strategies to reinforce replacement behaviors and decrease challenging behaviors. Extensive evidentiary support exists for the effectiveness of BIPs created using FBA; specifically, BIPs that include skill instruction, antecedent supports, and reinforcement strategies effectively decrease challenging behavior and increase prosocial behaviors (e.g., Cho & Blair, 2017; Dunlap & Carr, 2007; Lloyd & Kennedy, 2014; Nahgahgwon et al., 2010; Turton et al., 2011). An effective BIP is one that results in a socially valid reduction in challenging behavior and an increase in replacement behaviors across contexts and implementers.

School teams develop BIPs based on results from FBAs to ensure a comprehensive plan is created detailing multiple aspects of the student’s performance. Team-based approaches are widely accepted as standard practice for FBA and BIP development (e.g., Gable et al., 2014; Kern et al., 2004; Scott et al., 2008). Although little guidance exists for who composes the team (Collins & Zirkel, 2017), including a behavior specialist (e.g., Board Certified Behavior Analyst) is recommended to provide expertise in behavioral theory related to the assessment and intervention-planning process (e.g., Benazzi et al., 2006; Scott et al., 2008). School teams are responsible for training educators to effectively implement the BIP (Gable et al., 2014); behavior analysts are suggested training experts for school personnel (Hirsch et al., 2017). However, there is limited guidance on the specific role of the behavior analyst in creating plans to support both BIP content and BIP implementation fidelity.

Educator BIP Fidelity

Implementation fidelity, or treatment integrity, is the extent to which behavioral procedures are implemented as planned (Gresham et al., 1993) and is necessary to produce changes in student behavior (e.g., Biggs et al., 2008; Truscott et al., 2003) or to adequately draw conclusions about the relation between a BIP and changes in outcomes for students (Wilkinson, 2007). Despite the extensive research evidence supporting the effectiveness of BIPs, educators in school settings often fail to implement BIPs with fidelity (Bambara et al., 2012) and fail to collect treatment integrity data (McIntyre et al., 2007; Vollmer et al., 2008). Surveys of educational leaders report fidelity estimates average around 68% (Cook et al., 2012), with significant decreases in fidelity within 7–10 days of BIP implementation, even if initial fidelity estimates are high (Johnson et al., 2014). Improving fidelity is an important goal, as higher fidelity has been shown to result in improved outcomes (Fiske, 2008; St Peter Pipkin et al., 2010). Low-fidelity implementation can be impacted by limited educator training in preservice programs (Simonsen et al., 2013) or limited in-service support to organize procedures, data, or staffing of BIPs (Fixsen et al., 2005; Moore et al., 2017).

Fidelity may also be impacted by contextual factors such as competing behaviors or contingencies (e.g., other responsibilities, needs of other students, curricular requirements, failure to contact reinforcement during initial BIP implementation) or environmental stimuli (e.g., absence of needed materials; students’ continuous access to preferred stimuli; removal of aversive stimuli, including the student who is engaging in challenging behavior; Collier-Meek et al., 2017). A detailed discussion of factors impacting fidelity, components of fidelity, and strategies to promote fidelity can be found in DiGennaro Reed et al. (2014). Many of these complex factors may be outside the educator’s control and related to resources, school values or priorities, or available training.

Unfortunately, educators report multiple barriers to implementing BIPs. A survey of 600 K–12 educators found the most common barriers included BIPs not addressing the cause of the behavior, inconsistent staff implementation, inadequate resources, ineffective plans, and lack of training (Robertson et al., 2020). Specifically, educators report the lack of time and resources in their schools as “unsupportive of their work in general” (Robertson et al., 2020, p. 151). Given high fidelity is related to BIP effectiveness (Bruhn et al., 2015; Cook et al., 2012), and low fidelity results in poor student outcomes (St Peter Pipkin et al., 2010), improving educators’ implementation fidelity of BIPs is an essential component of improving student outcomes.

Studies support several strategies to increase educators’ implementation fidelity of BIPs, including self-monitoring (e.g., Mouzakitis et al., 2015; Pinkelman & Horner, 2017), performance feedback (e.g., Mouzakitis et al., 2015), modeling and role-play (e.g., Madzharova et al., 2018), multimedia anchored instruction (e.g., Holcomb et al., 2020), practice-based professional development (e.g., Hirsch et al., 2020), and behavior skills training (e.g., Fiske, 2008; Hogan et al., 2015). Each of these components can improve educators’ fidelity of BIP implementation, allowing the student access to reinforcement contingent on exhibiting the replacement behavior. However, each of these strategies focuses on delivering performance feedback for educators’ implementation of BIP steps, rather than explicitly using planned positive consequences identified by the educator as tools to enhance BIP creation, implementation, and sustainability, thereby increasing educators’ access to positive reinforcement by enhancing their success in the classroom.

The purpose of this article is to augment behavior analysts’ considerations for BIP creation, training, and implementation. Specifically, we present an argument for considering reinforcement contingencies for both the student for whom the BIP was created and the educators implementing the BIP to increase the likelihood BIPs are implemented with fidelity to improve the student’s outcomes. First, we provide a discussion of reinforcement cycles for both students and educators in the classroom. Next, we position educator behavior in the three-term contingency and suggest methods for identifying and planning for reinforcing stimuli for educators. We then present methods for reinforcing both educator and student behavior through behavioral skills training. Finally, we detail an example of data collection tools and reinforcement cycles behavior analysts can use to increase educators’ access to reinforcement for implementing BIPs with fidelity and in turn improving students’ outcomes.

Improving the Efficacy of BIPs Through the Positive Reinforcement Cycle

Interactions between teachers and students who engage in persistent challenging behavior often can be characterized by a negative reinforcement cycle. Gunter and Coutinho (1997) theorized that in this cycle, students engage in challenging behavior to escape or avoid academic instruction. Teachers reinforce these challenging behaviors by avoiding or delaying instruction, and they experience negative reinforcement when they escape challenging behavior by not providing instructional demands. Collier-Meek et al. (2017) defined the process of considering factors in the environment and student behaviors that influence educator fidelity as the competing-pathway model. For example, when a student engages in disruption to avoid recalling math facts, the educator may call on another student to answer the question. Over time, the educator provides fewer instructional demands to avoid these instances of disruption. The student avoids increasingly more instructional demands through more disruptive behavior. Although not all challenging behavior is negatively reinforced, this negative reinforcement cycle may characterize student–teacher interactions more broadly for students who engage in challenging behavior. As evidence, studies have found educators have fewer academic interactions with students who engage in the most challenging behavior (e.g., Carr et al., 1991; Sutherland et al., 2004; Wehby et al., 1998).

A successful BIP creates a context that replaces this negative reinforcement cycle for challenging behavior with a positive reinforcement cycle for replacement behaviors (e.g., Collier-Meek et al., 2017). Understanding student and educator contingencies within the positive reinforcement cycle provides a framework for promoting effective coaching relationships between behavior analysts and educators serving students who engage in persistent challenging behavior. Figure 1 provides an example framework for conceptualizing the positive reinforcement cycle that may help explain both student and educator behaviors. Within this example, the educator beginning to implement effective and preventative behavior management strategies detailed in the BIP serves as an entry point to the positive reinforcement cycle (top-right circle of diagram). This change in educator behavior improves the quality of their instruction and provides new opportunities to reinforce students’ alternative behaviors, including engagement and prosocial behaviors. Because students now receive reinforcement for alternative behaviors over challenging behavior, challenging behavior decreases and the frequency and duration of high-quality instruction increases (bottom-right circle). Continued opportunities to reinforce alternative behaviors, combined with continued decreases in interruptions to instruction, promote improvement in the student–educator relationship (bottom-left circle). At this point in the cycle, interactions have become predominantly positive, with the educator recognizing and reinforcing the student’s engagement in instructional activities and prosocial communication, rather than avoiding the student’s disruptive behavior. Educators now face fewer opportunities to avoid student interactions and more opportunities to have positive student interactions because engagement occurs more often and challenging behavior occurs less often. 
Furthermore, student outcomes improve as students access high-quality instruction that supports their learning and access to reinforcement. Educators’ behaviors in implementing effective behavior management practices may now potentially be reinforced by improved student performance (top-left circle). Essentially, educators set the stage for increased access to high-quality instruction by improving their own practice and relationships with their students (Chow et al., 2020). These improvements are bolstered by the potential reinforcement or positive consequences that educators access from the successful implementation of BIPs.

Fig. 1 An example of a positive reinforcement cycle for an educator and student. Note. The cycle may differ by educator and student pairing. An educator beginning to implement behavior management strategies may serve as an entry point to shift from a negative reinforcement cycle to a positive reinforcement cycle.

Implementation of both academic instruction and behavioral support becomes easier for the educator as positive consequences and potentially reinforcement contingencies develop around positive interactions rather than challenging behavior. Educator behavior, including their fidelity of BIP implementation, is potentially reinforced by improved student performance, increased positive student interactions, increased instructional time, or decreased challenging behavior. Students’ engagement and prosocial behaviors are reinforced by access to preferred items, activities, or individuals or removal of aversive items, activities, or individuals, as well as new reinforcement contingencies associated with engaging with and responding to instruction. The likelihood of student engagement and prosocial behaviors also increases as learning becomes more predictable through BIP implementation. The likelihood of an educator’s implementation of BIP procedures increases as the educator’s implementation behaviors are likely reinforced by overall improved performance from the target student, and likely others in the instructional environment. In the remainder of the article, we identify ways to plan for potential reinforcement—or, at a minimum, positive consequences—of educator behaviors to aid in the creation of a positive reinforcement cycle for students and educators.

Planning for Positive Consequences for Educators

Reinforcement of educator-implemented countertherapeutic contingencies creates a need to consider the interaction between educator and student access to reinforcement during coaching. An effective plan should go beyond describing replacement behaviors for the educator, which the BIP already details in the form of consequence strategies for the student's challenging and replacement behaviors. In addition, an effective plan should consider ways to frequently monitor and reinforce educators' replacement behaviors early in implementation. At a minimum, planning for positive consequences for engaging in BIP implementation should occur to support educator implementation. This may include the removal of aversive or punishing stimuli (e.g., property destruction as a result of challenging behavior) or the addition of positive consequences that could serve as reinforcers (e.g., efficient instruction may create more time for educator breaks or moments of independence in the classroom; more positive interactions with students may improve relationships between educators and students). Frequent access to positive consequences that may serve as reinforcers of educators' behaviors may be necessary to establish a positive reinforcement cycle and thus therapeutic contingencies for both the educator and the student. Initially, these positive consequences might need to be provided or enhanced by the behavior analyst because behavioral strategies may not have an immediate effect on student behavior, and the delay to reinforcement might punish teacher implementation of the BIP.

Behavior analysts who coach educators on BIP implementation may need to consider more than a behavior-skills-training approach to the components of the BIP to produce long-term behavior change (e.g., DiGennaro et al., 2007; Sanetti & Collier-Meek, 2015). Educators have a history of presenting functional reinforcers for challenging behavior, giving rise to the behavior that necessitated the BIP. Although challenging behavior may have a much longer history of reinforcement than the student's experience in their current classroom, educators may also have a long history of reinforcing challenging behavior to produce temporary reductions in disruption. These temporary methods of reinforcement are often reactive, which can interfere with the proactive behavior support planning required to effectively address the function of the behavior (Chow & Gilmour, 2016). Educators' histories of negative reinforcement cycles suggest behavior analysts must carefully and collaboratively reflect with educators as they begin BIP planning as a team to better understand educator strengths, as well as strategies that may be aversive (e.g., an educator prefers not to wear an audible interval timer to cue praise), to increase the likelihood both the student and the educator contact positive consequences for changing their responding. A detailed discussion of how to identify and evaluate factors in the environment that may impact fidelity by contributing to the access or removal of reinforcing stimuli for educators can be found in Collier-Meek et al. (2017).

Collecting Data

Before developing the BIP, behavior analysts typically collect baseline data on the student's target behavior to determine its function. Behavior analysts may also consider measuring the educator's behaviors (Collier-Meek et al., 2017) and contextual variables that may influence challenging behavior during baseline data collection. We suggest behavior analysts engage in a collaborative discussion about also identifying contextual and instructional variables around the educator's classroom interactions, to build a comprehensive picture of the educator's strengths and of the barriers, including challenging behavior, that may prevent those instructional and management strengths from being demonstrated in the classroom (Collier-Meek et al., 2017).

Eliciting the educator’s input on the contexts under which they feel successful delivering instruction and behavior management strategies can support collaborative planning for the BIP that increases the likelihood the educator accesses programmed reinforcing consequences. Prior to collecting these data, the team should transparently discuss the purpose of data collection and explain how contextual and environmental variables, including educator preferences, can contribute to a BIP that is both effective for the student and feasible for the educator (O’Neill et al., 2015). Indirect and direct observations, similar to the FBA process, can be used to gather these data, as well as to identify the educator’s preferences.

Indirect Observations

Indirect observations can include an environmental assessment and educator interviews. Environmental and ecobehavioral assessments include assessment of classroom environmental conditions related to the physical space, ratios of students and educators, educator–student relationships, student groupings, and time spent in instruction (Gettinger et al., 2011; Mason et al., 2014). Environmental assessments can provide additional contextual information to support the creation of positive reinforcement cycles by ensuring that planning for the environmental context includes preferred arrangements or items for educators and students. A comprehensive discussion of methods for assessing classroom environments can be found in Gettinger et al. (2011); a comprehensive discussion of ecobehavioral assessment in the context of FBAs can be found in Mason et al. (2014).

Interview

Behavior analysts should reflect with the educator to identify the contexts under which the educator feels successful in the classroom and the specific setting events or stimuli that support their success. A structured interview can be used to elicit the conditions under which the educator feels successful and unsuccessful in the classroom, as well as the conditions under which challenging behavior occurs for the target student (Jessel et al., 2016). Specifically, a structured, open-ended interview can be used to hypothesize conditions important for a functional analysis or FBA (see Hanley, 2012, for specific interview questions). Supplemental File 1 provides readers with a sample educator interview template to gather information about an educator’s preferences, contextual variables, and instructional variables that may inhibit or prohibit their implementation of replacement behaviors and strategies for the BIP, as well as antecedents or consequences that may maintain their behavior and that could be reinforcing challenging behavior for the target student.

The interview can also elicit the instructional contexts that the educator desires in the classroom but perhaps views as unrealistic due to challenging behavior. These "desired" instructional contexts could be used to reinforce the educator's implementation of the BIP by arranging contingencies to support student behavior in these preferred contexts and providing additional consultative support during these activities to support initial success (e.g., BIP coaching and modeling). For example, if the educator states they want to engage in more collaborative student-led activities, but the behavior analyst does not observe those interactions, then the behavior analyst can initiate a structured conversation to identify potentially reinforcing consequences for the educator and the supports needed to promote student engagement in this instructional context. The responses to these interview questions can then be used to identify BIP elements that elicit or result in the contexts in which educators can access positive consequences for plan implementation.

Interviews can also function as reflection opportunities following direct observations. For example, when reflecting on the classroom observation, the educator could note they feel instructional delivery is successful when students are engaged in answering questions and making positive comments to each other. The behavior analyst notes that when opportunities to respond are frequent for all students (e.g., whiteboards to show math responses), the educator delivers more praise statements and has more positive interactions with students, acknowledging correct responses and having supportive conversations with students who respond incorrectly, relative to other instructional arrangements. Collaboratively, the behavior analyst and educator can work together to identify barriers to delivering quality instruction that may be attributed to the context (e.g., seating arrangements), the educator's skills (e.g., the educator does not know how to respond to verbal aggression), or student behaviors (e.g., the student runs away from instruction). Then, they can program antecedent, instructional, and consequence strategies in the student BIP that align with educator-identified reinforcing consequences achieved during successful times of the day. These positive consequences may be the removal of aversive stimuli (e.g., decreases in challenging behavior as replacement behaviors are reinforced) or the addition of preferred stimuli (e.g., increases in positive interactions between students and the educator).

Direct Observations

Direct observations can allow behavior analysts to first determine if the baseline classroom environmental context is sufficient to elicit appropriate student behavior to promote improved academic outcomes. Kestner et al. (2019) detail five required classroom management areas needed for student success: activity pacing (including opportunities to respond), appropriate curricula, feedback and reinforcement, instruction, and transitions. A detailed tutorial and data collection form to evaluate the classroom context in each of these areas can be found in Kestner et al. We recommend evaluating classroom context using these procedures to gain an understanding of the overall classroom environment prior to creating an individual student BIP.

Additional information may also be needed related to the specific interactions between the educator and the target student, the frequency of challenging behavior exhibited by the target student relative to their peers, and the environmental contexts in which challenging behavior is likely to occur as a result of educator behaviors. Supplemental File 2 provides readers with a direct observation tool to gather information about educator and student interactions that can be tailored for class-wide data collection or collection regarding specific interactions between a student and the educator. This direct observation tool allows the behavior analyst to simultaneously collect data on teacher and student behaviors in the context of each instructional activity. Teacher behaviors, measured using momentary time sampling (MTS) with 30-s intervals, include instructional delivery, appropriate redirections, delivery of reinforcement, attending to challenging behavior, active supervision, transitions, and administrative tasks (see Table 1 for definitions of each behavior). Student behaviors include alternating intervals of measuring target student engagement and class-wide engagement (again using MTS with 30-s intervals); alternating intervals also measure instances of challenging behavior for the target student and the entire class using event recording. Definitions of each behavior are detailed in Table 1. A sample of how these data are analyzed is detailed in the case study that follows; a blank data collection form is provided in Supplemental File 3.

Table 1.

Definitions of observed behaviors

Engagement
- In assigned location in the classroom (e.g., at desk when students are working)
- Looking at the instructor (or speaker)
- Attending to or manipulating instructional materials appropriately
- Following directives within 10 s
- Speaking in a large group when called on by the teacher
- Transitioning appropriately with voice off, hands to self, and in line
- Appropriately waiting and oriented to the speaker or task (e.g., raising a hand to answer a question)

Challenging behavior
- Elopement or leaving the area (e.g., walking around the room when the task is to stay at the table; leaving the instructional area)
- Physical aggression (e.g., hitting, kicking, pushing)
- Property destruction (e.g., tearing materials, throwing materials or furniture)
- Verbal aggression (e.g., name calling, making threats)
- Active noncompliance (failing to follow a teacher directive within 10 s)

Educator behaviors
- Delivering academic instruction, providing an opportunity to respond, or reinforcing an academic response
- Delivering behavioral instruction and behavioral directives or reinforcing appropriate behavior
- Delivering reinforcement of behaviors other than challenging behavior through verbal praise, tangible delivery, or early cessation of an activity
- Providing active supervision (moving between student areas and monitoring task completion without directly interacting with students)
- Redirecting appropriately (e.g., gesturing nonverbally, telling the student what to do rather than what not to do, withholding reinforcement rather than delivering punishment)
- Transitioning/waiting (e.g., monitoring a transition, waiting for students to line up)
- Completing administrative tasks (e.g., taking attendance or organizing materials without directly interacting with students)
- Attending to challenging behavior (e.g., telling the student to stop, moving materials away from the student, delivering a reprimand)
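As an illustrative sketch only (not part of the article or its supplemental files), the MTS and event-recording data described above might be summarized as follows; the function name, variable names, and observation data here are all hypothetical:

```python
# Hypothetical summary of a direct observation using 30-s momentary time
# sampling (MTS) and event recording. Each True/False entry records whether
# the behavior was observed at the moment an interval ended.

def percent_intervals(samples):
    """Percentage of MTS intervals in which the behavior was scored."""
    if not samples:
        return 0.0
    return 100.0 * sum(samples) / len(samples)

# Hypothetical 10-min observation: alternating 30-s intervals score the
# target student's engagement and class-wide engagement, per the tool above.
target_engaged = [True, True, False, True, True, True, False, True, True, True]
class_engaged = [True, True, True, False, True, True, True, True, False, True]

# Challenging behavior is counted with event recording, not MTS.
target_challenging_events = 3

print(f"Target engagement: {percent_intervals(target_engaged):.0f}% of intervals")
print(f"Class engagement: {percent_intervals(class_engaged):.0f}% of intervals")
print(f"Challenging behavior: {target_challenging_events} events in 10 min")
```

Percentage-of-intervals summaries like this let the team compare the target student's engagement to the class-wide level within the same activity, one of the comparisons the direct observation tool is designed to support.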

Establishing Rapport

During this observational period, behavior analysts should work to build a supportive and collaborative relationship with the educator. Establishing an effective partnership with the educator can increase the likelihood they will implement the BIP with fidelity, thus improving student outcomes (Wehby et al., 2012). Rather than using this time as an opportunity to “catch” the educator engaging in maladaptive or ineffective practice, behavior analysts should identify the environmental contexts that support the educator in engaging in strong instructional practices, as well as the contexts that may prohibit successful delivery of instruction. These moments of countertherapeutic cycles with a student, as well as positive cycles, can be used to support a collaborative dialogue with the educator to identify programmed consequences that may potentially function as reinforcers for the educator to implement the BIP.

If an educator expresses reluctance or apprehension toward the behavior analyst's presence or collaboration in the classroom, the behavior analyst may benefit from finding ways to pair their presence with a decrease in the educator's aversive tasks, a decrease in challenging behavior, or an increase in engagement. For example, during classroom visits, the behavior analyst might redirect a small group of off-task students or offer to monitor students during independent work while the educator takes a restroom break or makes a caregiver phone call. Small actions like picking up a piece of trash, providing a student with behavior-specific praise, or complimenting the educator's behavior accumulate over time so that the educator looks forward to the behavior analyst's classroom visits, and thus these actions improve the collaborative relationship.

Establishing Feedback Modality and Schedule

After collaboratively creating programmed positive consequences for BIP implementation by the educator, select a feedback modality and schedule. Ask the educator how they would like to debrief (e.g., email, text, phone, in person) and when (e.g., in the moment, during their planning period, at the end of the observation, at the end of the day). Think of the feedback schedule as a schedule of evaluating the planned positive consequences for educator implementation of the BIP (or educator replacement behaviors). Plan for providing a dense schedule of positive consequences initially and for fading these potentially reinforcing stimuli over time. Determine the steps and criteria for fading consequences based on educator behaviors. For example, during early sessions, the behavior analyst might initially reinforce all educator behaviors using unobtrusive reinforcers, such as a thumbs-up, a head nod and smile, or an “OK” hand signal, while providing higher quality positive consequences identified by the educator during collaborative reflections. Following two consecutive sessions or visits with high levels of fidelity of at least 85%, the behavior analyst might fade potential reinforcement to every three behaviors on average, focusing on reinforcing behaviors the educator implemented with lower fidelity during initial sessions. Following two more consecutive sessions or visits with high fidelity, they might further fade potential reinforcement to provide one unobtrusive, during-session positive consequence in a previous area of concern (e.g., thumbs-up) and one additional positive consequence (e.g., positive email to the principal, monitoring the class for a brief break).
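The example fading rule above (advance to a thinner schedule after two consecutive sessions at or above 85% fidelity) can be sketched as a simple decision rule. This is an illustrative sketch only; the 85% criterion comes from the example in the text, and the function name and session data are hypothetical:

```python
# Hypothetical decision rule for fading positive consequences based on
# the educator's session-by-session implementation fidelity (percentage
# of BIP steps implemented correctly).

FIDELITY_CRITERION = 85.0  # example goal a team might collaboratively set

def fading_steps_earned(session_fidelity):
    """Count how many fading steps have been earned, where each step
    requires two consecutive sessions at or above the criterion."""
    steps, streak = 0, 0
    for score in session_fidelity:
        streak = streak + 1 if score >= FIDELITY_CRITERION else 0
        if streak == 2:
            steps += 1
            streak = 0  # the next step requires two new consecutive sessions
    return steps

# Hypothetical fidelity scores across five coaching visits
scores = [70, 88, 90, 86, 92]
print(fading_steps_earned(scores))  # two qualifying pairs -> 2 steps earned
```

A rule like this makes the fading criteria explicit and auditable, so the behavior analyst and educator can agree in advance on exactly when consequences thin from continuous delivery to an intermittent schedule.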

Although precise percentages for “low” and “high” fidelity are not consistently agreed on in the literature, we suggest these values as an example of goals a team may collaboratively set for a BIP. Parametric analyses of some behavior-analytic interventions have been conducted to identify the critical components that must be implemented for an intervention to be effective, which can inform necessary levels of fidelity. Although beyond the scope of this article, we encourage readers to explore a recent review of parametric analyses and systematic manipulations of fidelity on student outcomes (see Brand et al., 2019).

Coaching Steps

Coaching should include evidence-based strategies to promote educator fidelity, including behavior skills training and performance feedback (e.g., Novak et al., 2019). The behavior analyst will often be tasked with supporting plan implementation. Begin by emphasizing each stakeholder’s strengths and contributions as the plan is collaboratively developed and iteratively refined through reflective, data-based conversations. Behavior skills training is an evidence-based method for training educators to implement behavioral strategies, including BIPs (e.g., Courtemanche et al., 2020; Hogan et al., 2015; Parsons et al., 2013). Begin behavior skills training by providing an overview of the steps for implementing the BIP. Describe each part of the plan and what to do if replacement or challenging behavior occurs. Provide a level of background and rationale for BIP components appropriate to the educator’s interest and willingness to implement the plan; some educators may only want to know the implementation steps, whereas others may want a rationale for each procedure. Encourage other team members who collaboratively developed the plan alongside the behavior analyst to share rationales for plan components (e.g., the educator may share that increasing opportunities for collaborative group work is important so she can engage in more positive interactions with students rather than only reprimands or redirections). Preemptively address the educator’s concerns about parts of the plan that might be more difficult to implement and describe how the team will collaboratively address these concerns (e.g., modeling, role-play, changing the plan if it is too difficult to implement).

Second, model BIP implementation, asking the educator to choose how they would like to see the plan demonstrated (e.g., role-play between the collaborative team members, behavior analyst demonstration with the student, or both). Give the educator an opportunity to ask questions and tailor their coaching experience to match their preferences in instructional delivery and feedback. Third, role-play BIP implementation, focusing on more difficult aspects of the plan, such as how to reinforce replacement behaviors and how to respond to challenging behavior. Finally, provide feedback, including behavior-specific praise and corrective feedback until the educator demonstrates mastery.

After behavior skills training, determine who will initially implement each component of the BIP. It may often be helpful and increase buy-in for the behavior analyst to implement all components of the BIP during the first activity, while the educator provides class instruction and observes the behavior analyst and the student. Behavior analysts may also consider a gradual release of BIP responsibilities to the educator. For example, initially, the educator may only provide reinforcement to the student, strengthening the relationship between the student and educator. Next, the educator may provide prompts and reinforcement, while the behavior analyst responds to challenging behavior, often the most difficult aspect of the plan to implement. Finally, the educator may implement all aspects of the plan independently.

During these initial sessions, it is critically important to frequently provide positive consequences for the educator’s implementation of the BIP as planned. In addition to ensuring the educator accesses positive consequences early and often, the behavior analyst should maximize positive feedback while minimizing corrective feedback, establishing a high ratio of positive to corrective feedback so that they remain a positive collaborator for the educator. Because student behavior may not immediately change, behavior analysts may also support educators initially with contrived reinforcers or preferred stimuli or activities. For example, adding opportunities for the educator to visit the restroom, fill a water bottle, or make copies; offering to run an errand for the educator; or walking a student to the hallway restroom may be contrived ways to support the educator. Asking the educator, while building rapport, about ways they prefer to be acknowledged can help the behavior analyst identify contrived preferred stimuli or reinforcers. When observing improvements in student behavior with BIP implementation, make the connection between educator and student behaviors explicit. For example, if the educator is delivering the BIP as designed, verbalize the link between the educator’s implementation behavior and the reduction in challenging behavior through behavior-specific praise toward the educator or collaborative reflection on changes in student data. Connecting changes in the target student’s performance to the educator’s implementation of the BIP will help the educator see the relation between plan fidelity and the positive reinforcement cycle experienced by both student and educator.

Performance Feedback

Continued performance feedback after initial plan training is essential to maintaining the educator’s fidelity of BIP implementation. Performance feedback through coaching or consultation is an evidence-based practice for improving intervention fidelity in school settings (Fallon et al., 2015). Although performance feedback is not a novel contribution of this article, feedback is essential to educator and student success when using a BIP. The frequency of performance feedback should align with (a) your billed hours or consultation time with the client and (b) educator and student performance. Performance feedback can occur in person (e.g., Hogan et al., 2015), via video-based feedback (e.g., DiGennaro-Reed et al., 2010), via email (e.g., Barton et al., 2020), or via text message (e.g., Barton et al., 2019). Feedback should include steps of the BIP that are being implemented correctly, areas for improvement, data reviews for the educator and student, and collaborative goal setting (e.g., Pantermuehl & Lechago, 2015). Feedback can also include video models (e.g., Weston et al., 2019) of individual steps of the BIP to support educator implementation. Finally, feedback should be faded over time as the educator maintains BIP implementation fidelity or as the student’s needs change and new plans are created to address the student’s support needs.

Case Example

Sam is a second-grade student who engages in frequent elopement from the classroom during whole-group instructional activities. Sam receives special education services as a student with an emotional and behavioral disorder in an inclusive general-education setting for literacy, math, behavioral skills, and social skills support. The coaching team includes Ms. Wolfe, a behavior analyst who supported the collaborative individualized education program (IEP) team in conducting an FBA to explore the contingencies maintaining Sam’s elopement, and Mr. Davids, Sam’s general-education teacher. During the FBA planning meeting, Ms. Wolfe shared with the team the importance of considering controllable environmental and contextual factors, such as educator responding, in addition to specific instances of elopement and challenging behavior. When Ms. Wolfe started this case, she collaboratively created operational definitions for the educator behaviors and academic contexts she planned to observe, in addition to defining student behaviors. Table 1 summarizes her definitions of challenging behavior, alternative behavior, and educator behaviors; specifically, she defined academic and behavioral instruction, transitions, attending to challenging behavior, reinforcement delivery, active supervision, and administrative tasks as educator behaviors (see Table 1). Ms. Wolfe completed several classroom observations to collect data on Sam’s classroom contextual factors, Sam’s performance, and Mr. Davids’s interactions surrounding Sam’s challenging behavior (see Supplemental File 3 for an example data sheet). She used the definitions presented in Table 1 to collect the data provided in Supplemental File 2 and followed the process checklist in Supplemental File 4.

After collecting data from a variety of sources, Sam’s team hypothesized the function of both his challenging behavior and Mr. Davids’s responses to Sam’s challenging behavior. Table 2 includes a summary of the setting events, target behaviors, precursor behaviors, and hypothesized functions of both Sam’s and Mr. Davids’s performance. The results of the FBA indicated Sam’s elopement was maintained by escape from task demands delivered in group instructional settings. The FBA data also noted that when Sam displayed precursor behaviors to elopement (standing, pacing, repeating “no”), Mr. Davids increased the frequency of task demands and attention provided to Sam before eventually threatening to call the office. Sam would then elope from the classroom and stay in the office for the remainder of class, providing a negative reinforcement contingency for Mr. Davids’s continued delivery of task demands as Sam’s challenging behavior escalated.

Table 2.

Functional behavior assessment student and educator data at a glance

Student: Sam S., 8-yr-old male, 2nd grade, Liberty Elementary School. Teacher: Mr. Davids.

Target behavior operational definition (student): Left targeted instructional area (eloped) without educator permission through (a) individual permission or (b) group transition, as defined by the student’s body exiting the plane of the instructional area (e.g., edge of bookshelves marking instructional area; perimeter of group carpet).

Educator response: Stops delivering verbal directives and reprimands; does not follow the student; calls administration on a walkie-talkie.

Antecedents/setting events (student): Asked a question or given a task direction by the teacher in front of the group; redirected by teacher to answer the question or complete the task.

Antecedents/setting events (educator): Delivers task directions to the student (e.g., clean up, walk to the board to mark a response); redirects the student to join the instructional activity and complete the task or answer the question.

Precursor behaviors (student): Stood up and paced in a circle; said “no” repeatedly at an increasingly loud volume.

Precursor behaviors (educator): Tells the student to sit down and stop, and answers the question; delivers a threat: “If you don’t stop then I’m calling the office.”

Hypothesized function (student): Escape from task demands in a group instructional setting; eloped to escape task demands, as he stays in the office for the remainder of class.

Hypothesized function (educator): Escape from disruption of instruction; elopement provides negative reinforcement through removal of the aversive disruption, as the student stays in the office for the remainder of class.

BIP

When the IEP team collaboratively created the BIP, Mr. Davids noted he often attended to challenging behaviors during intervals in which the target behavior occurred, and that Sam was removed from the classroom following consecutive intervals with the target behavior. These data, along with data on Sam’s challenging and alternative behavior, helped inform the BIP and a plan for creating positive consequences for Mr. Davids’s BIP implementation (replacement behaviors).

Table 3 provides a brief summary of the BIP the team created for Sam and Mr. Davids. Ms. Wolfe broke down the goals for both Sam and Mr. Davids at the beginning of the document. She then organized the at-a-glance document into antecedent, instructional, and consequence strategies. Fidelity data–collection steps for each component of the plan are embedded in the document to support Mr. Davids in remembering the implementation steps before, during, and after instruction. The BIP summary also directly connects Mr. Davids’s behaviors to the behaviors he expects from Sam to support the connection between changes in educator behavior and student performance.

Table 3.

Behavior intervention plan at a glance


Note. OTR opportunities to respond, FR fixed ratio

Ms. Wolfe interviewed Sam about strategies that were helpful and asked for his preferences in intervention selection. Sam noted he was working to earn an Apple Watch© for good school performance from his uncle. He said he liked how his uncle could use the watch to see what he had planned for the day and see messages from friends. Ms. Wolfe asked Sam if he would like to learn to use a calendar like his uncle, and he said he would if it was electronic like the watch. Ms. Wolfe found a protocol for using a digital schedule to increase Sam’s engagement and decrease challenging behavior during whole-group instruction (Zimmerman et al., 2020). She created a visual schedule with Mr. Davids for Sam to use during group instructional settings (Fig. 2). The visual schedule shows two behaviors to get Sam in the location for group instruction: sit down and look at the board. Then, a checklist for three task directions and a reminder to request a break complete the schedule. Sam can choose any three task directions to follow and then ask Mr. Davids to take a break with the digital schedule.

Fig. 2.


Sam’s visual schedule for group instruction. Note. Schedule format adapted from Choiceworks© application digital schedule. Organization adapted from Zimmerman et al. (2020). All images Creative Commons licensed

Meanwhile, Mr. Davids’s replacement behaviors are to implement the visual schedule with fidelity. Fidelity steps are broken down into tasks he needs to complete before instruction (antecedent), steps to implement during ongoing instruction (instructional procedure), and steps to complete after Sam finishes each part of his schedule (consequence). This involves making sure Sam has the schedule at the start of group instruction and reminding him of the break contingency (antecedent), prompting Sam to check off items following completion and providing behavior-specific praise following task completion (instruction), and providing contingent access to a break every time (fixed-ratio 1 schedule; consequence). Ms. Wolfe will reinforce Mr. Davids’s fidelity with behavior-specific praise, positive affirmations, and opportunities for a break (e.g., restroom break, errand) during the class transition to the next activity when Ms. Wolfe is conducting consultative observations.

Ms. Wolfe also planned for instructional procedures to teach Sam to request a break, which he seldom does independently. Mr. Davids will teach Sam to request a break using embedded trials in group instruction. The visual schedule will serve as a prompt for Sam to request the break, and Mr. Davids will use constant time-delay procedures similar to those used in Zimmerman et al. (2020) to increase the ease with which Mr. Davids can embed break-request instruction into ongoing academic instruction. A gestural prompt will be used so Mr. Davids can provide Sam with a discrete prompt while also delivering some planned attention outside the context of challenging behavior.

Finally, Ms. Wolfe planned a consequence strategy to minimize reinforcement of challenging behavior using differential reinforcement of Sam’s academic responses and his break requests. Specifically, when Sam engages in replacement behaviors, Mr. Davids will reinforce all responses to group opportunities to respond by providing behavior-specific praise. Once Sam completes three tasks, Mr. Davids will reinforce break requests and minimize attention to Sam while he is on his break, giving Mr. Davids a break from Sam’s challenging behavior. Sam will be able to stay on break for the remainder of the whole-group activity. When Mr. Davids engages in his replacement behaviors by correctly implementing these strategies, Ms. Wolfe will provide positive affirmations and will give Mr. Davids a break, either by implementing the consequence strategies or by monitoring Mr. Davids’s class for a few minutes. The team collaboratively decided that these “breaks” provided by Ms. Wolfe will be faded over time, and Mr. Davids will encounter more opportunities for naturally occurring breaks as students become more successful.
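The break contingency in Sam’s plan is itself a small, well-defined rule: three completed tasks, then every break request is honored (a fixed-ratio 1 schedule). As a rough illustration only, it could be sketched as follows; the class and method names are hypothetical, and the three-task criterion comes from the visual schedule described above.

```python
# Hypothetical sketch of Sam's break contingency: after three completed
# tasks, every break request is honored (fixed-ratio 1 schedule).

TASKS_BEFORE_BREAK = 3  # criterion from the visual schedule in the plan

class BreakContingency:
    def __init__(self):
        self.tasks_completed = 0

    def record_task_completion(self):
        """Educator checks off a task and delivers behavior-specific praise."""
        self.tasks_completed += 1

    def break_request(self):
        """Return True if the break is granted (FR1 once criterion is met)."""
        return self.tasks_completed >= TASKS_BEFORE_BREAK

session = BreakContingency()
session.record_task_completion()
session.record_task_completion()
print(session.break_request())  # -> False (only two tasks completed)
session.record_task_completion()
print(session.break_request())  # -> True (criterion met; break granted)
```

Writing the contingency out this explicitly can make it easier for the team to agree on exactly when the break is earned and to spot ambiguities (e.g., whether a partially completed task counts) before implementation begins.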

Performance Feedback

Ms. Wolfe will observe for Mr. Davids’s correct implementation of the constant time delay and his delivery of reinforcement immediately following prompted and unprompted break requests. She will reinforce Mr. Davids’s accurate implementation with behavior-specific praise and positive affirmations and by providing Mr. Davids with a break during independent work in the classroom so he can use the restroom, run an errand, or make copies. Mr. Davids and Ms. Wolfe decided on three observations per week with performance feedback delivered via email; during the first 2 days of the plan’s implementation, however, Ms. Wolfe committed to being in the classroom to model plan implementation and deliver in-the-moment coaching and feedback using Mr. Davids’s AirPods for bug-in-ear coaching (Owens et al., 2020). The team agreed to meet after 4 weeks of implementation to review data on Sam’s and Mr. Davids’s performance under the new BIP, which addresses both of their behaviors and explicitly shows Mr. Davids how his implementation fidelity provides access to the same reinforcing consequences he contacted before the plan was implemented.

Conclusion

This discussion of positive reinforcement cycles for both educators and students provides considerations for behavior analysts working to support BIP implementation in school-based contexts. The sample materials outline a method for collecting data on both educator and student performance to ensure BIPs include both educator and student access to positive consequences and potential reinforcement through engaging in replacement prosocial behaviors (student) and implementation fidelity behaviors (educator). Explicitly planning for both students’ and educators’ access to positive consequences and potential reinforcement through the behavioral change outlined in a BIP may increase the fidelity of BIP implementation and improve student performance. It also creates an opportunity for behavior analysts to maximize the technologies available in the assessment and treatment of challenging behavior to support both student and adult clients in school-based settings.

Supplementary Information

ESM 1 (DOCX 17.3 KB)

ESM 2 (DOCX 33.6 KB)

ESM 3 (DOCX 24 KB)

ESM 4 (DOCX 20.4 KB)

Funding

There are no financial disclosures to report.

Declarations

Conflict of interest

There are no potential conflicts of interest to disclose.

Informed consent

Research involving human subjects or animals was not conducted; therefore, informed consent was not required to complete this article.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Anderson CM, Rodriguez BJ, Campbell A. Functional behavior assessment in schools: Current status and future directions. Journal of Behavioral Education. 2015;24(3):338–371. doi: 10.1007/s10864-015-9226-z. [DOI] [Google Scholar]
  2. Barton EE, Rigor MN, Pokorski EA, Velez M, Domingo M. Using text messaging to deliver performance feedback to preservice early childhood teachers. Topics in Early Childhood Special Education. 2019;39(2):88–102. doi: 10.1177/0271121418800016. [DOI] [Google Scholar]
  3. Barton EE, Velez M, Pokorski EA, Domingo M. The effects of email performance-based feedback delivered to teaching teams: A systematic replication. Journal of Early Intervention. 2020;42(2):143–162. doi: 10.1177/1053815119872451. [DOI] [Google Scholar]
  4. Benazzi L, Horner RH, Good RH. Effects of behavior support team composition on the technical adequacy and contextual fit of behavior support plans. Journal of Special Education. 2006;40:160–170. doi: 10.1177/00224669060400030401. [DOI] [Google Scholar]
  5. Brand D, Henley AJ, Reed FDD, Gray E, Crabbs B. A review of published studies involving parametric manipulations of treatment integrity. Journal of Behavioral Education. 2019;28(1):1–26. doi: 10.1007/s10864-018-09311-8. [DOI] [Google Scholar]
  6. Biggs BK, Vernberg EM, Twemlow SW, Fonagy P, Dill EJ. Teacher adherence and its relation to teacher attitudes and student outcomes in an elementary school-based violence prevention program. School Psychology Review. 2008;37:533–549. doi: 10.1080/02796015.2008.12087866. [DOI] [Google Scholar]
  7. Carr EG, Taylor JC, Robinson S. The effects of severe behavior problems in children on the teaching behavior of adults. Journal of Applied Behavior Analysis. 1991;24(3):523–535. doi: 10.1901/jaba.1991.24-523. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Chow JC, Gilmour AF. Designing and implementing group contingencies in the classroom: A teacher’s guide. Teaching Exceptional Children. 2016;48(3):137–143. doi: 10.1177/0040059915618197. [DOI] [Google Scholar]
  9. Chow, J. C., Cunningham, J., & Wallace, E. (2020). Interaction-centered model for language and behavioral development. In Handbook of research on emotional & behavioral disabilities: Interdisciplinary developmental perspectives on children and youth. Routledge.
  10. Collier-Meek MA, Sanetti LM, Fallon LM. Incorporating applied behavior analysis to assess and support educators’ treatment integrity. Psychology in the Schools. 2017;54(4):446–460. doi: 10.1002/pits.22001. [DOI] [Google Scholar]
  11. Collins LW, Zirkel PA. Functional behavior assessments and behavior intervention plans: Legal requirements and professional recommendations. Journal of Positive Behavior Interventions. 2017;19(3):180–190. doi: 10.1177/1098300716682201. [DOI] [Google Scholar]
  12. Cook C, Mayer G, Wright D, Kraemer B, Wallace M, Dart E, Restori A. Exploring the link among behavior intervention plans, treatment integrity, and student outcomes under natural educational conditions. Journal of Special Education. 2012;46:3–16. doi: 10.1177/0022466910369941. [DOI] [Google Scholar]
  13. Courtemanche, A. B., Turner, L. B., Molteni, J. D., & Groskreutz, N. C. (2020). Scaling up behavioral skills training: Effectiveness of large-scale and multiskill trainings. Behavior Analysis in Practice. Advance online publication. [DOI] [PMC free article] [PubMed]
  14. DiGennaro Reed FD, Hirst JM, Howard VJ. Behavior analytic techniques to promote treatment integrity. In: Sanetti LMH, Kratochwill TR, editors. Treatment integrity: A foundation for evidence-based practice in applied psychology. American Psychological Association; 2014. pp. 203–226. [Google Scholar]
  15. DiGennaro FD, Martens BK, Kleinmann AE. A comparison of performance feedback procedures on teachers’ treatment implementation integrity and students’ inappropriate behavior in special education classrooms. Journal of Applied Behavior Analysis. 2007;40(3):447–461. doi: 10.1901/jaba.2007.40-447. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. DiGennaro-Reed FD, Codding R, Catania CN, Maguire H. Effects of video modeling on treatment integrity of behavioral interventions. Journal of Applied Behavior Analysis. 2010;43:291–295. doi: 10.1901/jaba.2010.43-291. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Dunlap G, Carr E. Positive behavior support and developmental disabilities. In: Odom S, Horner R, Snell M, Blacher J, editors. Handbook of developmental disabilities. Guilford Press; 2007. pp. 469–482. [Google Scholar]
  18. Fiske KE. Treatment integrity of school-based behavior analytic interventions: A review of the research. Behavior Analysis in Practice. 2008;1(2):19–25. doi: 10.1007/BF03391724. [DOI] [PMC free article] [PubMed] [Google Scholar]
  19. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (Publication #231). Louis de la Parte Florida Mental Health Institute.
  20. Gettinger M, Schienebeck C, Seigel S, Vollmer L. Assessment of classroom environments. In: Bray MA, Kehle TJ, editors. The Oxford handbook of school psychology. Oxford University Press; 2011. pp. 260–283. [Google Scholar]
  21. Gresham FM, Gansle KA, Noell GH, Cohen S, Rosenblum S. Treatment integrity of school-based behavioral intervention studies: 1980–1990. School Psychology Review. 1993;22(2):254–272. doi: 10.1080/02796015.1993.12085651. [DOI] [Google Scholar]
  22. Gunter PL, Coutinho MJ. Negative reinforcement in classrooms: What we’re beginning to learn. Teacher Education and Special Education. 1997;20(3):249–264. doi: 10.1177/088840649702000306. [DOI] [Google Scholar]
  23. Hanley GP. Functional assessment of problem behavior: Dispelling myths, overcoming implementation obstacles, and developing new lore. Behavior Analysis in Practice. 2012;5(1):54–72. doi: 10.1007/BF03391818. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Hirsch SE, Bruhn AL, Lloyd JW, Katsiyannis A. FBAs and BIPs: Avoiding and addressing four common challenges related to fidelity. Teaching Exceptional Children. 2017;49(6):369–379. doi: 10.1177/0040059917711696. [DOI] [Google Scholar]
  25. Hirsch SE, Bruhn AL, Randall K, Dunn M, Shelnut J, Lloyd JW. Developing and implementing FBA-BIPs in elementary classrooms: A conceptual replication. Journal of Special Education Apprenticeship. 2020;9(2):1–25. [Google Scholar]
  26. Hogan A, Knez N, Kahng S. Evaluating the use of behavioral skills training to improve school staffs’ implementation of behavior intervention plans. Journal of Behavioral Education. 2015;24(2):242–254. doi: 10.1007/s10864-014-9213-9. [DOI] [Google Scholar]
  27. Holcomb C, Baker JN, More C. Digital behavior intervention plans: Effects on general education teacher fidelity of implementation. Journal of Special Education Technology. 2020;35(3):155–166. doi: 10.1177/0162643419854502. [DOI] [Google Scholar]
  28. Jessel J, Hanley GP, Ghaemmaghami M. Interview-informed synthesized contingency analyses: Thirty replications and reanalysis. Journal of Applied Behavior Analysis. 2016;49(3):576–595. doi: 10.1002/jaba.316. [DOI] [PubMed] [Google Scholar]
  29. Kestner KM, Peterson SM, Eldridge RR, Peterson LD. Considerations of baseline classroom conditions in conducting functional behavior assessments in school settings. Behavior Analysis in Practice. 2019;12(2):452–465. doi: 10.1007/s40617-018-0269-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Lloyd BP, Kennedy CH. Assessment and treatment of challenging behaviour for individuals with intellectual disability: A research review. Journal of Applied Research in Intellectual Disabilities. 2014;27(3):187–199. [DOI] [PubMed]
  31. Lloyd B, Weaver E, Staubitz J. A review of functional analysis methods conducted in public school classroom settings. Journal of Behavioral Education. 2016;25(3):324–356. doi: 10.1007/s10864-015-9243-y. [DOI] [Google Scholar]
  32. Madzharova MS, Sturmey P, Yoo JH. Using in-vivo modeling and feedback to teach classroom staff to implement a complex behavior intervention plan. Journal of Developmental and Physical Disabilities. 2018;30(3):329–337. doi: 10.1007/s10882-018-9588-y. [DOI] [Google Scholar]
  33. Martens BK, DiGennaro FD. Behavioral consultation. In: Erchul WP, Sheridan SM, editors. Handbook of research in school consultation: Empirical foundations for the field. Routledge; 2008. [Google Scholar]
  34. Mason R, Eagle J, Dowd-Eagle S, Wills HP, Hanson B, Mason B. Functional behavior assessment and intervention design in ecobehavioral consultation. In: Lee SW, Niileksela CR, editors. Ecobehavioral consultation in schools: Theory and practice for school psychologists, special educators, and school counselors. Routledge; 2014. [Google Scholar]
  35. McIntyre LL, Gresham FM, DiGennaro FD, Reed DD. Treatment integrity of school-based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis. 2007;40(4):659–672. doi: 10.1901/jaba.2007.659-672. [DOI] [PMC free article] [PubMed] [Google Scholar]
  36. Moore TC, Wehby JH, Oliver RM, Chow JC, Gordon JR, Mahany LA. Teachers’ reported knowledge and implementation of research-based classroom and behavior management strategies. Remedial and Special Education. 2017;38(4):222–232. doi: 10.1177/0741932516683631. [DOI] [Google Scholar]
  37. Mouzakitis A, Codding R, Tryon G. The effects of self-monitoring and performance feedback on the treatment integrity of behavior intervention plan implementation and generalization. Journal of Positive Behavior Interventions. 2015;17:223–234. doi: 10.1177/1098300715573629. [DOI] [Google Scholar]
  38. Novak MD, Reed FDD, Erath TG, Blackman AL, Ruby SA, Pellegrino AJ. Evidence-based performance management: Applying behavioral science to support practitioners. Perspectives on Behavior Science. 2019;42(4):955–972. doi: 10.1007/s40614-019-00232-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. O’Neill RE, Albin RW, Storey K, Horner RH, Sprague JR. Functional assessment and program development. 3. Cengage Learning; 2015. [Google Scholar]
  40. Owens TL, Lo YY, Collins BC. Using tiered coaching and bug-in-ear technology to promote teacher implementation fidelity. Journal of Special Education. 2020;54(2):67–79. doi: 10.1177/0022466919852706. [DOI] [Google Scholar]
  41. Pantermuehl RM, Lechago SA. A comparison of feedback provided in vivo versus an online platform on the treatment integrity of staff working with children with autism. Behavior Analysis in Practice. 2015;8(2):219–222. doi: 10.1007/s40617-015-0059-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Parsons MB, Rollyson JH, Reid DH. Teaching practitioners to conduct behavioral skills training: A pyramidal approach for training multiple human service staff. Behavior Analysis in Practice. 2013;6(2):4–16. doi: 10.1007/BF03391798. [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. Pinkelman SE, Horner RH. Improving implementation of function-based interventions: Self-monitoring, data collection, and data review. Journal of Positive Behavior Interventions. 2017;19(4):228–238. doi: 10.1177/1098300716683634. [DOI] [Google Scholar]
  44. Robertson RE, Kokina AA, Moore DW. Barriers to implementing behavior intervention plans: Results of a statewide survey. Journal of Positive Behavior Interventions. 2020;22(3):145–155. doi: 10.1177/1098300720908013. [DOI] [Google Scholar]
  45. Sanetti LM, Collier-Meek M. Data-driven delivery of implementation supports in a multi-tiered framework: A pilot study. Psychology in the Schools. 2015;52:815–828. doi: 10.1002/pits.21861. [DOI] [Google Scholar]
  46. Simonsen B, MacSuga AS, Fallon LM, Sugai G. The effects of self-monitoring on teachers’ use of specific praise. Journal of Positive Behavior Interventions. 2013;15:5–15. doi: 10.1177/1098300712440453. [DOI] [Google Scholar]
  47. St Peter Pipkin C, Vollmer TR, Sloman KN. Effects of treatment integrity failures during differential reinforcement of alternative behavior: A translational model. Journal of Applied Behavior Analysis. 2010;43(1):47–70. doi: 10.1901/jaba.2010.43-47. [DOI] [PMC free article] [PubMed] [Google Scholar]
  48. Sutherland KS, Singh NN, Conroy M, Stichter JP. Learned helplessness and students with emotional or behavioral disorders: Deprivation in the classroom. Behavioral Disorders. 2004;29(2):169–181. doi: 10.1177/019874290402900208.
  49. Truscott SD, Richardson RD, Cohen C, Frank A, Palmeri D. Does rational persuasion influence potential consultees? Psychology in the Schools. 2003;40:627–640. doi: 10.1002/pits.10132.
  50. Vollmer T, Sloman K, Pipkin C. Practical implications of data reliability and treatment integrity monitoring. Behavior Analysis in Practice. 2008;1(2):4–11. doi: 10.1007/BF03391722.
  51. Wehby JH, Maggin DM, Partin TCM, Robertson R. The impact of working alliance, social validity, and teacher burnout on implementation fidelity of the good behavior game. School Mental Health. 2012;4(1):22–33.
  52. Wehby JH, Symons FJ, Canale JA, Go FJ. Teaching practices in classrooms for students with emotional and behavioral disorders: Discrepancies between recommendations and observations. Behavioral Disorders. 1998;24(1):51–56. doi: 10.1177/019874299802400109.
  53. West EA, Billingsley F. Improving the system of least prompts: A comparison of procedural variations. Education and Training in Developmental Disabilities. 2005;40(2):131–144.
  54. Weston R, Davis TN, Radhakrishnan S, O’Guinn N, Rivera G. Comparing performance feedback and video self-monitoring within a BST package to train pre-service behavior analysts to conduct preference assessments. Journal of Behavioral Education. 2019:1–13.
  55. Wilkinson LA. Assessing treatment integrity in behavioral consultation. International Journal of Behavioral Consultation and Therapy. 2007;3(3):420. doi: 10.1037/h0100816.
  56. Zimmerman K, Ledford J, Gagnon K, Martin J. Social stories and visual supports interventions for students at risk for emotional and behavioral disorders. Behavioral Disorders. 2020;45(4):207–223. doi: 10.1177/0198742919874050.


Supplementary Materials

ESM 1 (DOCX 17.3 KB)

ESM 2 (DOCX 33.6 KB)

ESM 3 (DOCX 24 KB)

ESM 4 (DOCX 20.4 KB)
