Behavior Analysis in Practice. 2021 Jul 12;14(3):856–872. doi: 10.1007/s40617-021-00612-5

Teaching Future School Personnel to Train Parents to Implement Explicit Instruction Interventions

Sara Kupzyk, Zachary C. LaBrot
PMCID: PMC8458530  PMID: 34631388

Abstract

Students with disabilities are less likely to be proficient with basic academic skills compared to peers, indicating a need for more quality instructional time. Parent tutoring has been identified as a promising practice for supplementing instruction to improve child outcomes. However, educators are not sufficiently prepared to collaborate with and provide guidance to parents in how to support academic goals at home. We describe how an academic assessment and intervention clinic trains future school personnel to work with families to develop and implement explicit instruction parent tutoring interventions. A case example illustrates the process.

Keywords: parent tutoring, academic interventions, explicit instruction


National data indicate that less than one fourth of children are proficient with reading, writing, and math (National Assessment of Educational Progress [NAEP], 2011, 2020). The results for children with disabilities are even more concerning, with large discrepancies in performance compared to students without disabilities. For example, fourth graders with disabilities scored 42 points lower in reading and 31 points lower in math than peers without disabilities, with similar outcomes at the eighth-grade level. Overall, children with disabilities need more intense, explicit instruction, coordinated across home and school settings, to increase academic performance and goal attainment.

Meaningful home–school collaboration is associated with benefits for children with disabilities (e.g., improved learning outcomes, attendance), their families (e.g., higher self-efficacy for helping, lower stress), and their teachers (e.g., higher self-efficacy, increased retention/job satisfaction; Garbacz et al., 2017; Marotz & Kupzyk, 2018). Given these benefits, it is not surprising that parental involvement is required under federal legislation (e.g., Individuals with Disabilities Education Act, Title I). Among the many ways parents can be involved in their children’s education, programs that employ parent tutoring focused on a single skill area (e.g., reading, math) have the largest effect on student academic success (Fishel & Ramirez, 2005; Erion, 2006). Parent tutoring involves training a parent (or other relevant caregiver, tutor, etc.) to deliver a brief, focused intervention that provides additional opportunities for students to receive instruction and practice with key academic skills. Parent tutoring has been effective in (1) improving math skills in students at risk for math problems (Heller & Fantuzzo, 1993); (2) increasing reading skills in students with attention-deficit/hyperactivity disorder (ADHD; Hook & DuPaul, 1999); (3) improving reading and vocabulary skills of students who have limited English proficiency (Cooke et al., 2009; Kupzyk et al., 2011b); and (4) increasing reading skills in students at risk for or identified with learning disabilities (Doǧanay Bilgi, 2020; Erion & Hardy, 2019; Gortmaker et al., 2007; Kupzyk & Daly, 2017; Persampieri et al., 2006; Valleley et al., 2005; Zhou et al., 2019). Parent tutoring can also promote skill generalization and maintenance because students have more opportunities to respond across multiple exemplars and with diverse instructors.
In addition, as parents gain skills in assisting with one academic goal, they may be more confident with helping and better able to apply those skills to assist with new goals. This can be particularly meaningful as parents are their children’s first and most constant teachers.

In education, “explicit instruction” refers to instruction focused on clear instructional sequences and outcomes that are fully transparent (i.e., explicit) to teachers and learners. Both the design and delivery of explicit instruction are unambiguous, structured, systematic, and scaffolded (Archer & Hughes, 2010; Goeke, 2009). Likewise, Direct Instruction emphasizes carefully designed, research-informed lessons featuring clearly defined and prescribed communication sequences and manageable learning increments (Kame'enui, 2021; Rolf & Slocum, this issue; Slocum & Rolf, this issue). A foundational tenet of Direct Instruction is that clear instruction, without misinterpretation, can greatly improve and accelerate learning (National Institute for Direct Instruction, n.d.; Twyman, 2021).

Direct Instruction and procedures characterized as “explicit instruction” share many features, such as targeting specific academic skills, clearly specifying instruction and feedback, encouraging active learner engagement, and monitoring student performance. Even more closely related may be “little di” (Rosenshine & Stevens, 1986), a term used to describe variables found to be closely related to student achievement, including engaged time, small group instruction, and specific and immediate feedback. These procedures (i.e., “little di,” “capital DI,” and those known as “explicit instruction”) feature evidence-based teaching practices that have been shown to significantly affect student performance and outcomes across a variety of settings.

Elements of explicit instruction (Archer & Hughes, 2010; Carnine et al., 2017; Hughes et al., 2017) are key to the effectiveness of parent-tutoring interventions. Interventions taught to parents are designed to target specific academic skills based on assessment of current performance and the sequence of skill development. Furthermore, the interventions include a clear academic focus, explicit instructions, strategies to maintain high levels of student engagement (e.g., active responding, frequent feedback, perky pace), review of previous skills taught, and monitoring of student performance. Data are used to inform decision making over time to determine if the intervention should be continued, modified, or discontinued.

It is unfortunate that educators and other school support staff often do not receive explicit training in how to develop such interventions or collaborate with families. Less than half of special education teachers report feeling confident in engaging diverse families in the education of students (Fowler et al., 2019). It is therefore not surprising that parents report dissatisfaction with the special education process, in particular that their perspectives and input are not valued by educators (Tucker & Schwartz, 2013). Parents of children with disabilities also report feeling unsure of how to support their children’s learning at home (Jacobs et al., 2016). Providing parents with knowledge and skills in using explicit instruction interventions with their child can lead to improved child outcomes and increased parent confidence in helping their child.

To successfully collaborate with parents and implement parent tutoring, school-based professionals (e.g., school psychologists, special education teachers) need skills in (1) building relationships and using effective communication, (2) identifying skill needs to target for instruction, (3) designing and evaluating interventions, (4) training parents, and (5) monitoring progress with academic skills and treatment integrity. This article describes our academic assessment and intervention clinic that trains school psychology graduate students in these skills. The clinic process and a case example are presented.

Clinical Model: Explicit Instruction Delivered through a Behavior Analytic Consultation and Training Model

Our academic assessment and intervention clinic is a community-based program that serves families of students (ages 5–18) with disabilities (e.g., autism, intellectual disorder, specific learning disorder) who are demonstrating academic difficulties. Families are typically referred to the clinic by psychologists, pediatricians, or other families. The clinic is staffed by specialist-level graduate students in school psychology who meet weekly with families as part of a practicum experience. Sessions with families are scheduled at 4:15 and 5:15, followed by group supervision and class. The practicum is taught and supervised by a psychology faculty member assisted by a predoctoral psychology intern. Clinic staff assist with initial scheduling. We also established community collaborations with the state parent training and information center, respite services, and parent resource coordinators (parents of children with disabilities who have received training to aid other parents in navigating services) to provide information sessions to parents while their children are participating in sessions. The clinic operates on funds from local grants (faculty time, subcontracts for time for community partners, and scholarships for families in need) and income from fees for services (sliding fee schedule ranging from $20–$60 per session).

The clinic’s primary goals are to (1) provide effective academic assessment and intervention to children and adolescents with and without various developmental and learning disabilities, (2) train parents to deliver evidence-based academic interventions to their children, and (3) prepare school psychology graduate students to serve as behavior analytic clinicians and consultants. Embedded within our clinical model are the fundamental components of explicit instruction (Archer & Hughes, 2010) delivered through a behavior analytic training model (Dufrene et al., 2016). Explicit instruction is an ideal method to address students’ academic referral concerns given its foundation in matching students’ instruction to their current academic skill level and progression through instruction via data-based decision making (Archer & Hughes, 2010). However, once-a-week outpatient clinical services are insufficient for promoting long-term, significant academic outcomes for struggling learners. Therefore, we adopt a behavior analytic consultation model to prepare students’ parents to deliver evidence-based academic interventions.

Behavioral consultation has empirical support for improving students’ academic outcomes (Sheridan et al., 1996). Behavioral consultation includes four phases: problem identification, problem analysis, plan implementation, and plan evaluation (Dufrene et al., 2016). Figure 1 shows the phases of behavioral consultation and the specific steps of the clinical process we use to address academic difficulties. Problem identification involves identifying and operationally defining a student’s academic problem through interviews with relevant persons (e.g., parents, other caregiver) and indirect and direct assessment of academic deficits. Problem analysis involves developing and testing the effectiveness of academic interventions delivered through explicit instruction as well as determining whether a student’s academic deficit is due to a skill or performance deficit. Plan implementation consists of developing an intervention protocol and using behavioral skills training to teach parents or other related caregivers to implement the selected academic interventions. Finally, plan evaluation includes monitoring a student’s response to an academic intervention in addition to the parent’s treatment integrity (Dufrene et al., 2016; Erchul & Martens, 2012). This section describes the activities conducted at each phase in more detail.

Fig. 1. Consultation Phases and Steps in the Clinic Process

Problem Identification

During the problem identification phase, information is gathered using interviews and indirect and direct assessment to operationally define the problem and inform intervention selection. During the initial meeting with a student and his/her parent it is critical to establish a warm and supportive relationship. Research demonstrates that confidence in the clinician and previous experiences with various therapies are moderators for good treatment integrity (Fiske, 2008). This can be achieved by engaging in rapport building, clearly and thoroughly describing each step in the clinical process, and answering any parent and student questions. Establishing collaborative relationships with the student and his/her parent(s) increases the likelihood that the treatment plan will align with their values, skills, and available resources and will ideally result in good treatment integrity (Kupzyk & Shriver, 2016).

Interview and Indirect Assessment of Academic Deficits, Student Behavior, and Contextual Variables

When assessing academic referral concerns, it is important to first gather information from relevant individuals who are familiar with a student’s academic difficulties. In our clinic, this most often involves a semi-structured interview with the student’s parent(s), student, and teachers when possible. We ask parents to bring copies of previous evaluations, report cards, and other relevant information (e.g., work samples) to the initial appointment. During the appointment, parents are asked to sign a release of information to allow the clinician to obtain records from the school and communicate with the teacher to coordinate services. Obtaining records and establishing collaborations with teachers is useful because parents may not fully understand or share the results of evaluations conducted, related implications for learning, or goals outlined within school plans. The primary goal of the interviews is to clarify and clearly define the student’s academic problems (Dufrene et al., 2016). This allows for a more precise development of a direct academic skills assessment. Important variables to initially assess include the student’s educational history, current performance and behavior during work tasks, and other behaviors and environmental variables that might be impeding skill development. See Fig. 2 for sample interview questions.

Fig. 2. Sample Interview Questions

Educational history

Information is gathered related to a student’s success or struggles with learning in past grade levels, whether a student has ever repeated a grade or has a history of receiving special services (e.g., special education services, 504 accommodations), extent of schooling received, and frequency of moves, which can affect continuity of instruction. The clinicians also review the student’s academic records. Review of a student’s grades, multidisciplinary team reports (overview of testing completed and any school verifications and recommendations for programming), and individualized education program (IEP) will delineate previous and current intervention attempts. This gives the clinician the opportunity to discuss with the parent whether these supports have been helpful for improving academic outcomes. To further develop a more precise direct assessment, clinicians should ascertain when a student’s academic concern(s) began and how long the problem has persisted. This can allow clinicians to estimate the approximate grade level a student is performing in a given academic domain. That is, if a parent reports that a student began to struggle with reading fluency in third grade, a clinician may begin their direct assessment with third grade-level materials. Overall, this type of information provides a general background of a student’s educational history that could indicate whether academic difficulties are related to specific learning difficulties or environmental factors (e.g., poor previous instruction; Burns et al., 2017).

Current performance and work tasks

Questions are asked about the student’s current instruction as well as homework requirements (e.g., subjects, time, difficulty) to help clinicians understand the task-demand requirements for a student. Past homework samples can be reviewed for commonly made mistakes or feedback from the student’s teacher. This can aid clinicians in deciding what grade-level materials to use for direct assessment (Dufrene et al., 2016). It is also critical to assess a student’s current performance across each academic area (i.e., reading, writing, mathematics, and organization) as difficulties in these areas may be interrelated (Burns et al., 2017). For example, difficulties with solving mathematics problems (e.g., word problems) could be related to difficulties with reading comprehension (Dufrene et al., 2016).

Clinicians should also take the time to interview the student (see Figure 2 for sample questions). The purpose of this interview is to establish rapport with the student and gain insight about the student’s understanding of their academic difficulties. Clinicians can collect valuable data from students about their perspective of an academic problem, including the student’s perspective on what they find most difficult about school. Further, clinicians could ask what the student believes would be most and least helpful in addressing academic difficulties (Dufrene et al., 2016).

Other behaviors and environmental variables

Sometimes students who present with academic difficulties also present with behavioral difficulties. Because our academic clinic is an outpatient clinic, it is not always feasible to directly assess a student’s behavior in the classroom. However, concerns with problem behavior are discussed in the interview with a parent (e.g., teacher reports, frequency of problems, when problems are most likely to occur, behavior intervention support plan at school). Classroom behaviors such as on-task behavior, classwork completion, participation, classwork refusal, and disruptive behavior are all behavioral targets to be assessed (Dufrene et al., 2016). Home behaviors to be assessed include homework completion, disruptive behaviors during homework routines, and noncompliance with parent instructions. Clinicians should also evaluate behavior management interventions implemented by the parents and their relative success. In addition, contingencies setting the occasion for and maintaining a student’s behavior (e.g., delay or escape from homework, access to preferred tangibles that interfere with homework) should be considered as this information may help a clinician develop an effective academic intervention package and related behavior management strategies to increase the success of the home tutoring plan (Dufrene et al., 2016).

In addition to assessing student-related variables, clinicians should assess a student’s current environment. Environmental variables may be partially or fully responsible for a student’s academic difficulties. For example, a student’s daily routines, sleep schedule, and living arrangements can affect academic performance. These factors may also be targeted and incorporated into a comprehensive academic intervention.

Direct Assessment of Academic Deficits

Data collected during the indirect assessment should lead to a targeted, hypothesis-driven direct assessment of academic difficulties. Direct skills assessment should include assessment of each area of academic concern indicated by the parent or teacher. Further, for each area of academic concern, direct skills assessment should begin at the suspected grade level of performance in order to conserve time and resources and to decrease the likelihood of frustration for the student (Dufrene et al., 2016). As a fundamental tenet of explicit instruction, decisions to decrease or increase assessment difficulty should be data-based. That is, assessment difficulty should be decreased if the student performs below the instructional range at the initial grade level assessed, or increased if performance falls within the mastery range. This process should continue until the student’s instructional level of performance is identified. This type of data-based decision making allows for precise identification of a student’s academic performance level, which serves as the target for intervention (Shapiro, 2011).
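The survey-level logic above can be sketched as a simple decision loop. This is an illustrative sketch only: the function names, cutoff scores, and probe data below are hypothetical, and actual decisions would rely on published instructional-range criteria and clinical judgment.

```python
def find_instructional_level(score_at, start_grade, frustration_cutoff,
                             mastery_cutoff, min_grade=1, max_grade=8):
    """Return the grade level at which probe scores fall in the instructional range."""
    grade = start_grade
    while min_grade <= grade <= max_grade:
        score = score_at(grade)          # administer a probe at this grade level
        if score < frustration_cutoff:   # frustration range: drop a level
            grade -= 1
        elif score >= mastery_cutoff:    # mastery range: try a harder level
            grade += 1
        else:                            # instructional range: stop here
            return grade
    return max(min(grade, max_grade), min_grade)

# Illustrative probe results (words correct per minute, by grade level)
probe_scores = {1: 95, 2: 80, 3: 52, 4: 31}
level = find_instructional_level(probe_scores.get, start_grade=3,
                                 frustration_cutoff=40, mastery_cutoff=70)
```

With these made-up scores, the search stops at grade 3, where performance sits between the frustration and mastery cutoffs.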

The most common measure of direct academic skills used in our clinic is curriculum-based measurement (CBM; Shinn, 1989). CBM provides a brief, direct measurement of students’ performance in relevant academic skills (e.g., reading fluency, math facts fluency). CBMs are divided by grade level, and research indicates that they are predictive of more generalized outcomes (e.g., performance on tests; Goffreda et al., 2009). CBMs range from early academic skills (e.g., letter–sound fluency, number identification) to more advanced academic skills (e.g., reading comprehension, mathematics computation fluency). Further, for each academic skill, several probes are available at each grade level, providing materials for intervention as well as for socially valid progress monitoring (Dufrene et al., 2016).
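For example, CBM oral reading fluency probes are typically scored as words read correctly per minute. A minimal sketch of that arithmetic, with hypothetical probe values:

```python
def words_correct_per_minute(words_read, errors, seconds):
    """Score a timed oral reading probe: (words read - errors) scaled to one minute."""
    return (words_read - errors) * 60 / seconds

# Hypothetical 1-min probe: 112 words attempted, 4 errors
wcpm = words_correct_per_minute(words_read=112, errors=4, seconds=60)  # 108.0
```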

Although CBM is a commonly utilized measure, other measures of academic skills are sometimes needed. For instance, a task analysis of precursor academic skills (e.g., letter/number identification, letter sounds, number values) may be necessary to determine what stage of early literacy or numeracy a student has achieved. In addition, standardized diagnostic tools that lead to a standardized academic intervention can be used to measure academic skill level. For example, the CORE Phonics Survey (Park et al., 2014) is a standardized supplemental phonics assessment that measures sequential phonics concepts across grade levels. Students scoring in the “strategic” range at a given level require targeted, intensified instruction, whereas students scoring in the “intensive” range require basic phonics instruction. Students progress to higher level phonics concepts as data indicate mastery of the lower level concepts.

Instructional hierarchy

After identifying the student’s instructional level of academic skill performance, an instructional hierarchy (IH) can be used to determine the stage of learning and inform selection of intervention components (Haring et al., 1978; Daly et al., 1996). The IH used by our clinic conceptualizes learning as a hierarchy through which students progressively pass through four stages: acquisition, fluency, generalization, and adaptation (Daly et al., 1996). This is an ideal model to guide instruction because the IH holds that student learning progresses through a logical and predictable sequence. To apply the IH, clinicians attend to certain dimensions of students’ academic responses (e.g., response time, accuracy) to identify the stage of learning for a particular academic skill (Ardoin & Daly, 2007; Daly et al., 1996). The acquisition stage is characterized by slow and effortful responding with many errors and indicates a need for modeling, prompting, and feedback for every response. Performance in the fluency stage is indicated by accurate but slow responding. Intervention components found effective at the fluency stage include repeated practice, programmed contingencies for improving the rate of accurate responding, and performance feedback. As students become fluent (i.e., able to use the skill with speed and accuracy) with a new skill, they progress to the generalization stage, in which they learn to use the skill in novel contexts (e.g., being taught when to use the skill, practicing across contexts). Finally, students progress to the adaptation stage, in which they alter responding to meet novel demands. Effective intervention components for the generalization and adaptation stages include repeated practice and performance feedback across novel academic stimuli, repeated practice of an academic skill within the context of other skills, and explicitly teaching students when a skill should and should not be used (i.e., discrimination training).
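The mapping from response dimensions to IH stage can be summarized as a small classifier. The accuracy and rate cutoffs here are placeholders rather than empirically derived criteria, and distinguishing generalization from adaptation additionally requires probing novel contexts, which a rate/accuracy rule alone cannot capture.

```python
def ih_stage(accuracy, rate, mastery_rate):
    """Classify stage of learning from accuracy (proportion correct) and
    rate (responses per minute). Cutoffs are illustrative placeholders."""
    if accuracy < 0.90:        # slow, error-prone responding
        return "acquisition"   # -> modeling, prompting, feedback on every response
    if rate < mastery_rate:    # accurate but slow
        return "fluency"       # -> repeated practice, contingencies, feedback
    # Fast and accurate; whether the student is generalizing or adapting must
    # be judged by probing novel contexts and demands, not by rate alone.
    return "generalization/adaptation"
```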

Taken together, the indirect and direct assessment process is essential for developing a targeted intervention. Explicit instruction involves delivering intervention at a student’s current level of academic performance, and progressively increasing level of performance in a logical sequence as students respond to instruction (Archer & Hughes, 2010). In our clinical model, the indirect and direct assessment is the beginning of the data-based decision-making process. At the conclusion of the problem identification stage, a clinician will have identified a student’s instructional level of performance in a given academic skill that will serve as the starting point for an intervention as well as one or more potentially effective academic intervention components.

Problem Analysis

The goals of the problem analysis phase are to determine which of the potential academic interventions are most effective and whether academic skills problems are due to skill deficits, performance deficits, or a combination of the two. In an explicit instruction teaching model, the most effective academic intervention that maps onto the specific skill deficit and instructional level is used to set the occasion for accurate and progressively more fluent responding (Archer & Hughes, 2010). Therefore, during the problem analysis phase, conducting a brief experimental analysis (BEA) provides an empirical method for testing hypotheses about potential interventions for an individual (Daly & Martens, 1999; Duhon et al., 2004; VanDerHeyden & Burns, 2009). A BEA uses an alternating treatments design or adapted alternating treatments design to compare the effects of intervention components on academic responding. The student is asked to perform an academic skill over a series of brief (20–30 min) sessions in which one or more interventions or intervention components are present or absent in random succession. Data are collected on performance of the skill across the conditions, and visual analysis is used to determine which component(s) are most effective. In particular, data are examined to determine whether skill performance under one academic intervention is higher than baseline levels of performance and outcomes of the other academic interventions (e.g., differentiated responding; Dufrene et al., 2016; Duhon et al., 2004). A withdrawal condition or extended analysis can be used to verify the effectiveness of an academic intervention (Daly et al., 1999; Dufrene & Warzak, 2007).
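As a numerical companion to visual analysis (not a replacement for it), BEA session data can be summarized by comparing each condition's median performance to baseline. The condition names and scores below are hypothetical:

```python
from statistics import median

# Hypothetical words-correct-per-minute scores from brief BEA sessions
sessions = {
    "baseline":                  [32, 35, 30],
    "listening passage preview": [48, 52, 50],
    "repeated reading":          [41, 44, 43],
}

def rank_conditions(data, control="baseline"):
    """Rank intervention conditions by median gain over the control condition."""
    base = median(data[control])
    gains = {cond: median(scores) - base
             for cond, scores in data.items() if cond != control}
    return sorted(gains.items(), key=lambda kv: kv[1], reverse=True)

best, gain = rank_conditions(sessions)[0]
```

In practice the clinician would still inspect level, trend, and overlap graphically before concluding that the top-ranked condition produced differentiated responding.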

When designing a BEA or selecting an intervention (see Table 1 for a list of common parent tutoring interventions/intervention components, citations, and online resources), it is important to ensure that the interventions include elements of explicit instruction and fit within the family’s context. Although a comprehensive description of the elements of explicit instruction (see Archer & Hughes, 2010; Carnine et al., 2017) is beyond the scope of this article, it is important to describe how we incorporate explicit instruction into parent tutoring interventions. First, the direct skills assessment and use of the IH aid clinicians in breaking down complex academic skills (e.g., narrowing reading difficulties to the issue of reading fluency) and logically sequencing academic intervention targets (e.g., reducing errors, then increasing the rate of responding). A central goal is to create high levels of student engagement through structured teacher–student interactions. This is accomplished by establishing a clear academic focus, using concise instructions, providing multiple opportunities to respond, giving immediate feedback, maintaining a good pace of instruction, and monitoring student performance.

Table 1. Sample Interventions/Treatment Components and Online Resources

Academic Skill Focus: Potential Evidence-Based Interventions for Parent Implementation

Reading fluency
- Repeated Reading (Meyer & Felton, 1999)
- Listening Passage Preview (Rose & Sherry, 1984; Van Bon et al., 1991)
- Phrase Drill Error Correction (Begeny et al., 2006)
- Performance Feedback/Graphing (Eckert et al., 2006)

Sight word fluency
- Strategic Incremental Rehearsal (Phipps et al., 2020)

Decoding skills
- Teach Your Child to Read in 100 Easy Lessons (Engelmann et al., 1983)
- Sound Partners (Marchand-Martella et al., 2002)

Reading comprehension
- Prereading: Predicting (McCallum et al., 2010); Activating Background Knowledge (Hansen & Pearson, 1983); Asking Questions (Davey & McBride, 1996); Preteaching Vocabulary (Burns et al., 2004)
- During reading: Comprehension Monitoring/Click or Clunk (Babbs, 1984); Summarizing (Berkowitz, 1986); Story Map (Gardill & Jitendra, 1999)
- Postreading: Summarizing (Berkowitz, 1986); Asking/Answering Questions (McCallum et al., 2010)

Writing accuracy
- Simple Sentence Writing Program (Datchuk, 2016)
- Self-Regulated Strategy Development and Mnemonics (Chalk et al., 2005)

Writing fluency
- Planning (e.g., graphic organizers, story maps; Li, 2007)
- Repeated Writing (Graham et al., 2001)
- Editing (Bos & Vaughn, 2002)

Spelling accuracy
- Cover-Copy-Compare (Skinner et al., 1997)
- Taped Spelling Intervention (McCallum et al., 2014)
- Explicit instruction of morphology (prefixes, suffixes, root words, letter–sound correspondence, syllable patterns; Joshi et al., 2008)

Math computation skills
- Cover-Copy-Compare (Skinner et al., 1997)
- Detect-Practice-Repair (Poncy et al., 2013)
- Strategic Incremental Rehearsal (Burns, 2005)
- Strategy Instruction (Montague et al., 2000)
- Schema Instruction (Xin, 2008)
- Cognitive Strategy Instruction (Krawec et al., 2013)

Math fact fluency
- Explicit Timing (Duhon et al., 2015)
- Taped Problems (Windingstad et al., 2009)

Organization: Brief Description and Website

National Center on Intensive Intervention
- Literacy strategies: https://intensiveintervention.org/intervention-resources/literacy-strategies
- Math strategies: https://intensiveintervention.org/intervention-resources/mathematics-strategies-support-intensifying-interventions

Florida Center for Reading Research
- Activities by grade level and reading skill: https://www.fcrr.org/student-center-activities

Reading Rockets
- Activities for understanding and supporting reading at home: https://www.readingrockets.org/audience/parents

Intervention Central
- Academic and behavior interventions and materials: https://www.interventioncentral.org/response-to-intervention

Measures & Interventions for Numeracy Development (MIND)
- Math materials, protocols, training videos: https://brianponcy.wixsite.com/mind

CEEDAR Center
- Information and tools for families, including modeling, using clear directions, providing support, helping with staying on task, giving feedback, and goal setting: https://ceedar.education.ufl.edu/family-guide-to-at-home-learning/

Academic intervention sessions begin with stating the goals and expectations with clear and concise verbiage (e.g., “Today you are going to practice solving some math problems. When I show you a flashcard with a problem, try your best to solve it. If you do not say the answer after five seconds, I will tell you the answer and you will repeat it.”). After this, clinicians engage in explicit teaching of academic skills that begins with modeling, then makes the transition to guided rehearsal, and finally independent student responding with contingent reinforcement (i.e., model-lead-test; e.g., “Four plus four equals eight. Now let’s say it together, ‘four plus four equals eight.’ Now you say it on your own. . . . Great job solving that problem!”). This explicit style of training occurs at a relatively swift pace so as to keep the student engaged.
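The model-lead-test sequence with a 5-s response window can be sketched as a single trial routine. The function and its inputs are hypothetical placeholders standing in for the live clinician–student interaction:

```python
def model_lead_test(item, answer, get_response):
    """One explicit-teaching trial: model, lead (guided rehearsal), test."""
    steps = [
        ("model", f"{item} equals {answer}."),                        # clinician models
        ("lead", f"Let's say it together: {item} equals {answer}."),  # guided rehearsal
    ]
    response = get_response(item, timeout_s=5)  # 5-s window for independent response
    if response == answer:
        steps.append(("test", "correct: deliver praise"))             # contingent reinforcement
    else:
        # Error correction: restate the model and have the student repeat it
        steps.append(("test", f"error correction: {item} equals {answer}; repeat"))
    return steps
```

A trial such as `model_lead_test("four plus four", "eight", get_response)` mirrors the scripted exchange above; `get_response` would be the student's actual (timed) answer.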

To ensure retention of the skills trained through the academic intervention, we recommend that parents implement the intervention several times per week (e.g., three to five times). Research indicates that brief, distributed, and cumulative practice is more beneficial for improving academic skills than massed practice (i.e., practicing for long durations of time; Seabrook et al., 2005).

Regarding contextual fit, the interventions developed and evaluated in the clinic are typically brief (10–20 min) given family time constraints and other daily demands. We aim to keep interventions simple yet effective, because complex interventions are less likely to be implemented with integrity (Fiske, 2008; Gresham, 1989). Therefore, it is prudent to include only the intervention components necessary to improve academic outcomes. In addition, it is important for clinicians to consider parent and student acceptability of the intervention (e.g., whether they like the intervention and view it as effective) as well as preference (e.g., choice, buy-in, similarity to other interventions used), because these factors can influence the extent to which the intervention will be implemented as designed. When intervention plans are developed collaboratively, the intervention is more likely to match the environment where it will be used and to be appropriate for the skill level and needs of the implementer (e.g., parent reading proficiency, math skills; Kelleher et al., 2008). Care is taken to actively engage parents throughout the process to gain an understanding of their goals, preferences, and skills to inform the development of the parent tutoring intervention.

Overall, use of BEAs increases the clinician’s confidence in decision making by providing more evidence for the effectiveness of the intervention. In fact, interventions selected using BEAs are more effective than those selected based on initial assessments of the academic concern (Wagner et al., 2017). BEA procedures have been developed for early writing (Burns et al., 2009), reading fluency (Doǧanay Bilgi, 2020), math (Mellott & Ardoin, 2019), and spelling (McCurdy et al., 2016).

Furthermore, a BEA can determine whether a student’s academic problem is due to a skill deficit, a performance deficit (i.e., the student possesses the skill but is not motivated to perform it), or a combination of the two (Duhon et al., 2004). Clinicians can assess motivation by including a reward condition in the BEA (Dufrene et al., 2016). To do this, clinicians must first conduct a brief preference assessment to determine potentially reinforcing stimuli. To conserve time and resources, we train our clinicians to conduct brief, indirect preference assessments by inquiring from both students and their parents about potentially reinforcing stimuli or activities (e.g., “What type of prize would you like to work for?”; “Can you make a list of activities you would like to do with me?”); however, more structured preference assessments (e.g., multiple stimulus without replacement, paired stimulus) are used when needed (see Chazin & Ledford, 2016, for sample protocols and videos). In addition, an indirect and direct functional assessment (Sterling-Turner et al., 2001) can be conducted to determine the function(s) of challenging session behavior, and those contingencies can inform reward selection. For example, functional assessment data may indicate that escape from task demands maintains disruptive session behavior. In this case, a 10- to 15-min break could serve as a reward in the reward condition of a BEA.

During the reward condition, students can earn one of the reinforcing stimuli they selected if they meet a specific criterion established by the clinician. For example, a student could earn their reward contingent upon a criterion of reading 30% more words than their baseline (Dufrene et al., 2016; Duhon et al., 2004). If a student meets or exceeds this criterion, a performance deficit is more probable than a skill deficit (Dufrene et al., 2016). It should also be noted that students might demonstrate a skill and performance deficit, indicating the need for academic intervention and reinforcement for meeting a predetermined criterion.
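The decision rule above can be sketched in code. This is a minimal illustration, not part of any published protocol; the function name and the 30% default gain are assumptions drawn from the example criterion in the text:

```python
def classify_deficit(baseline_score: float, reward_score: float,
                     criterion_gain: float = 0.30) -> str:
    """Interpret a BEA reward-condition result.

    If performance under reward meets or exceeds baseline plus the
    criterion gain (here, 30% above baseline), a performance deficit
    is more probable; otherwise a skill deficit (or a combined
    deficit) remains the working hypothesis.
    """
    criterion = baseline_score * (1 + criterion_gain)
    if reward_score >= criterion:
        return "performance deficit likely"
    return "skill deficit likely"

# Baseline of 40 words correct per min -> criterion of 52
print(classify_deficit(40, 55))  # performance deficit likely
print(classify_deficit(40, 45))  # skill deficit likely
```

Note that a student who meets the criterion may still need academic intervention alongside reinforcement, as the text cautions; the function only flags which hypothesis the reward condition supports.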

Plan Implementation

Protocol Development

Once an evidence-based intervention has been identified, clinicians should develop a comprehensive yet concise step-by-step protocol for parents to follow, which also serves to promote and monitor treatment integrity (see Fig. 3 for a sample protocol). Intervention protocols should clearly delineate intervention steps in language the parent can easily understand and implement. Each day the home tutoring intervention is delivered, parents are asked to write the date, record how long the instruction took (dosage), and check whether they completed each step in the protocol (adherence). The protocols are returned to the clinician each session. The data are used to help determine whether the frequency and intensity of an academic intervention need to be modified to improve a student’s academic outcomes and whether additional supports are needed to increase treatment integrity.
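The returned checklists lend themselves to simple summary calculations. The sketch below shows one hypothetical way to compute adherence and dosage per session; the field names are illustrative, not taken from the clinic's actual protocol:

```python
def summarize_integrity(sessions):
    """Summarize adherence and dosage from returned protocol checklists.

    Each session is a dict with a 'date', the 'minutes' of instruction
    (dosage), and a list of booleans marking which protocol 'steps'
    were completed (adherence). Field names are assumptions.
    """
    return [
        {
            "date": s["date"],
            "minutes": s["minutes"],
            "adherence_pct": round(100 * sum(s["steps"]) / len(s["steps"]), 1),
        }
        for s in sessions
    ]

# One week of home tutoring with a five-step protocol
week = [
    {"date": "Mon", "minutes": 15, "steps": [True, True, True, True, False]},
    {"date": "Wed", "minutes": 12, "steps": [True, True, True, True, True]},
]
print(summarize_integrity(week))
```

A clinician reviewing this summary could see at a glance whether sessions are happening often enough (frequency), lasting long enough (dosage), and following the protocol (adherence).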

Fig. 4.

Fig. 4

Clinician Training the Parent Checklist. Note. The therapist followed the checklist during the parent training meeting. The checklist was used as a treatment adherence measure within the clinic

Parent Training

Following protocol development, parents are trained to implement the parent tutoring intervention as outlined in the protocol (see Fig. 4 for a sample parent training checklist). At the beginning of the training session, the clinician provides an agenda for the meeting, describes the assessment and intervention selection results, provides a rationale for the intervention (i.e., why the intervention is likely to result in goal attainment over time), and then uses behavioral skills training (BST) to teach the steps included in the protocol. Behavioral skills training (Miltenberger, 2016) is an evidence-based behavior analytic training method that has demonstrated effectiveness for training parents in a variety of skills (Gortmaker et al., 2007; LaBrot et al., 2020; Shayne & Miltenberger, 2013). Our clinicians utilize BST because it makes direct use of the protocol (i.e., instructions), allows parents to see the intervention being implemented (i.e., modeling), gives parents an opportunity to implement the intervention with their student in the presence of the clinician (i.e., rehearsal), and allows an opportunity for the clinician to deliver immediate performance feedback to the parent.

Based on our clinical model, parent training of the intervention protocol typically only occurs once during a scheduled 50-min session, with ongoing feedback and practice provided as needed. Emphasis is placed on teaching parents to deliver reinforcement for children’s effort and engagement in academic intervention as well as improvement in academic skills. This is done to promote parents’ generalization of foundational skills across academic interventions as children’s academic needs change over time. Therefore, it is essential that BST with the parent is comprehensive and thorough. To ensure thorough training, our clinicians follow specific BST guidelines (e.g., DiGennaro-Reed et al., 2018; Miltenberger, 2016).

Instructions

Clinicians provide both written (i.e., protocol/checklist of steps) and verbal instructions to parents that match the parents’ needs (e.g., reading level, language, learning needs). It is not uncommon for students to engage in distracting behavior while clinicians verbally explain the intervention protocol. For this reason, it is often helpful to meet with parents individually or provide the student with an activity while initially reviewing the intervention protocol. Furthermore, when giving instructions, clinicians use straightforward language (i.e., avoid jargon) and check for parent understanding (Banks et al., 2018).

Modeling

It is essential to model the intervention protocol for parents with absolute precision. Modeling intervention steps incorrectly may lead to the parent implementing the intervention with less than optimal treatment integrity. To further enhance parents’ understanding, clinicians should verbally name each intervention step while demonstrating it. Providing the family with a video model of the clinician delivering the intervention to the child to take home (e.g., email the video, send it on a flash drive) can also be useful, as it allows the parent to review the intervention steps at their convenience following training. In addition, the parent can reference it if questions arise regarding intervention implementation.

Rehearsal and feedback

To increase the likelihood of generalization following training, clinicians strive to make the training setting as similar as possible to the one in which the intervention will be implemented (e.g., table, chairs, student, siblings) and incorporate common stimuli (e.g., protocol, binder, letter cards, books; Shapiro et al., 1999). In addition, the clinician should provide or ensure that the parent has all of the resources needed to implement the intervention. During the training, a mastery criterion level of performance (e.g., 80% or higher) for parents’ rehearsal of the intervention should be established to facilitate integrity of implementation. This sometimes means that the parent must practice implementing the intervention more than once. For this reason, we recommend ensuring that there is ample time for parents to practice intervention implementation (e.g., 30–45 min). Directly observing the parent deliver the intervention also provides an opportunity to determine whether prerequisite skills need to be taught to the parent or whether intervention changes are needed to enhance fit for the parent so they can meet the mastery criterion.

Performance feedback can be provided throughout or at the end of the parent’s rehearsal of intervention implementation. Regardless of when it is provided, feedback should be based on the protocol checklist (e.g., components completed, errors of commission and omission) as well as the quality of implementation (e.g., preparation, fluency, enthusiasm, pace). Performance feedback should include praise for intervention components implemented correctly and corrective feedback for components not implemented, implemented incorrectly, or added in a way that might interfere with the effectiveness of the intervention. Throughout the training session, it is important to give the parent time to ask any follow-up questions they may have to ensure adequate understanding of the protocol.

Guidelines for establishing implementation goals

At the end of the parent training, clinicians and parents collaboratively establish intervention implementation goals. Ideally, the academic intervention should be implemented three to five times per week to allow students multiple opportunities to practice and receive feedback and reinforcement for academic skills engagement. However, given families’ existing time constraints (e.g., other siblings, after-school activities), we recommend establishing an achievable goal with each individual family, as goal setting has been found to be an effective antecedent intervention that leads to improved treatment integrity (Cohrs et al., 2016). A plan for coaching and feedback can also be discussed so that the parent knows support is available and how to access coaching in and outside of sessions (Reinke et al., 2014).

Plan Evaluation

Developing SMART Goals to Monitor Progress

Determining progress toward and achievement of clinically significant behavior change involves developing measurable and meaningful goals. SMART goals align well with explicit instruction in that goals for academic skill attainment are operationally defined and lend themselves to data-based decisions about instructional alterations. The acronym SMART stands for the key elements of effective learning goals: Specific, Measurable, Action Words, Realistic/Relevant, and Time-Limited (Hedin & DeSpain, 2018).

Specific refers to operationally defining the targeted academic skill area and how the student’s progress will be measured (e.g., reading fluency as defined by the rate of words read correctly per min). Measurable means that the targeted academic skill can be objectively observed visually, auditorily, or via permanent product (e.g., digits calculated correctly per min). Action words refers to words that specify the desired direction of performance of an academic skill (e.g., increase words read correctly per min, decrease errors per min) and the specific level of attainment (e.g., instructional level of reading fluency at the third-grade level). Realistic and relevant refers to setting a goal that is achievable. For example, it may not be reasonable to expect that a fifth-grade student reading fluently at a first-grade instructional level will read fluently at a fifth-grade instructional level within 3 months. However, it is reasonable that the fifth-grade student could read fluently at a second-grade instructional level within 3 months. Finally, time-limited requires that a goal have a discrete start and end date. By doing this, clinicians can judge the relative effectiveness of a goal by determining whether a student has achieved it by the end date. Taken together, an example SMART goal may read, “In 12 weeks [time to reach goal], when given a third-grade oral reading fluency probe [progress monitoring tool], Suzy will increase [action word specifying direction] words read aloud [specific skill performed] to 45 words within 1 min [objective measurement] with 95% accuracy [realistic and relevant goal].”
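Because a SMART goal is operationally defined, goal attainment at the end date reduces to a mechanical check. The sketch below encodes the example goal for Suzy; the function and its default thresholds are illustrative, mirroring the numbers in the example rather than any standard:

```python
def smart_goal_met(words_correct_per_min: float, accuracy_pct: float,
                   target_wcpm: float = 45.0,
                   target_accuracy: float = 95.0) -> bool:
    """Check the example SMART goal at the end date: at least 45 words
    read correctly within 1 min, with at least 95% accuracy, on a
    third-grade oral reading fluency probe."""
    return (words_correct_per_min >= target_wcpm
            and accuracy_pct >= target_accuracy)

print(smart_goal_met(47, 96))  # True: goal attained
print(smart_goal_met(47, 90))  # False: rate met, accuracy not met
```

Encoding the goal this way makes the data-based decision explicit: both the rate and the accuracy criteria must be met before the goal counts as attained.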

At the end of the prespecified date, clinicians review the goal with the parent(s) and discuss goal attainment and any changes needed to the intervention approach (e.g., intervention components, frequency, duration) or instructional level of the materials (e.g., mastered targeted skills and move to higher level skills, teaching earlier skills). Specific guidelines for these changes to an academic intervention package are discussed below.

Progress Monitoring and Ongoing Support

Progress monitoring should include monitoring students’ response to an academic intervention as well as treatment integrity of the intervention (Dufrene et al., 2016). We typically monitor progress with direct measurement of specific academic skills at the grade level at which intervention is being implemented and at the student’s actual grade level. Primary effectiveness of the intervention is assessed most directly with performance data obtained at the grade level at which the intervention is being implemented. Measuring at the student’s actual grade level may indicate whether academic skills are generalizing across skill levels.

In our clinic, progress monitoring is completed weekly after parents have been trained to implement an academic intervention. Over time, however, as the clinician begins to fade services and treatment integrity data suggest that parents have reliably implemented the intervention with good integrity, parents can be trained to collect progress monitoring data to provide to the clinician and inform future decision making (e.g., continue the intervention, change to a new skill, change to a new intervention).

Assessment of treatment integrity should determine the extent to which the intervention is being implemented as intended (Dufrene et al., 2016). This can be done by requesting that parents return the intervention checklists, which indicate the number of days the intervention was implemented (frequency) and the start and stop time of intervention implementation (dosage). Another way to monitor treatment integrity is to request that the parent audio or video record the intervention sessions at home and submit them for review by the clinician. In addition, the clinician can monitor treatment integrity by having the parent implement the intervention in the clinic using the intervention protocol.

Data-Based Decision Making

Data collected over time may indicate an increase, no change, or a decline in the targeted skill. If a student’s data show an increase, the clinician and parent examine the data to determine whether the intervention should be discontinued because the goal has been met (at which point intervention may begin on another skill target) or continued because the goal has not been met or additional practice is needed to increase the likelihood of skill maintenance.

If a student’s data indicate no improvement or a decline in performance, it is important to determine the level of treatment integrity to ensure that the lack of improvement is not due to poor integrity. This is essential to consider because applied treatments are at high risk for integrity failures (McIntyre et al., 2007). Multiple dimensions of treatment integrity should be considered when determining the extent to which the intervention is being implemented as planned, including adherence, dosage, quality, and engagement (Hagermoser Sanetti & Kratochwill, 2009). Adherence typically refers to the percentage of steps completed as outlined in the intervention protocol. Dosage entails the duration and frequency of sessions. Quality involves how well the parent implements the protocol (e.g., level of enthusiasm when providing praise, fluency in implementing the steps). Lastly, engagement refers to whether the student is actively engaged in or responding to the treatment components. For example, parents can be asked to report the level of engagement on a scale (e.g., all of the time to none of the time; all responses correct to no correct responses), or clinicians can conduct direct observations if the intervention is done in the clinic.

There are several possible factors that can contribute to poor treatment integrity, such as intervention complexity, consultation skills of the clinician who trained the parents, time constraints, access to resources, a parent’s previous experiences with academic and behavioral interventions, and behavioral difficulties exhibited by the student (Fiske, 2008). When there is evidence of poor treatment integrity, it is important to assess for these factors, engage in problem solving, and provide ongoing support. Clinicians can use a specific problem-solving approach (see Kupzyk & Shriver, 2016) to identify strategies related to the parents’ skills, resources, and performance to increase the relevant dimensions of treatment integrity (e.g., adherence, dosage, quality, engagement).

For example, if a parent is omitting protocol steps or implementing them incorrectly, brief follow-up meetings can be arranged with the parent to practice implementation of the intervention (DiGennaro-Reed et al., 2018). This provides the parent additional opportunities to practice intervention implementation and the clinician opportunities to provide feedback on it. Likewise, clinicians can provide weekly, ongoing feedback on intervention implementation until a mastery criterion is achieved. If a parent has difficulty implementing a more complex intervention, the clinician might remove an intervention component if doing so does not compromise the overall effectiveness of the intervention and gradually reintroduce the component as the parent begins to master the intervention protocol.

If treatment integrity is acceptable and the student’s data show limited progress, the clinician can (1) make sure the skill target is appropriate for the student, (2) modify the intervention, and (3) consider alternative interventions to implement. Modifications to the intervention might include increasing the amount of guided practice or opportunities to respond, changing the method for correcting errors, or enhancing motivation to participate in the intervention. If modifications are made or a different intervention is selected, clinicians can again use a BEA to examine the effectiveness of the intervention in the clinic setting.

Summary

The described clinical model incorporates explicit instruction for addressing academic deficits within a behavioral consultation and training framework. The general clinical framework involves indirect and direct assessment of academic deficits, BEA of potentially effective academic interventions, BST to train parents to deliver the most effective academic intervention identified through BEA, and continual progress monitoring with ongoing support. The components of this clinical model are behavior analytic in nature and are supported by the research literature as well as clinical outcome data we have collected. Below we describe a case study that provides some evidence for the effectiveness of our clinical model of training parents to use explicit instruction during home tutoring sessions. We outline the clinical process as we describe the case study and provide clinical outcome data to demonstrate its effectiveness.

Case Example

William (pseudonym used throughout) was a 17-year-old male who lived with his biological mother (Ms. Cooper; worked full time), his mother’s partner (Ms. Barnes; stay-at-home), and younger brother. He was diagnosed with a mild intellectual disability, autism spectrum disorder, and ADHD. William received special education services at a local public high school. During the initial interview, William’s mother reported concern with his early reading skills. In particular, she described that they had continued to work several days per week at home on reading skills for years in coordination with his teachers, but had made little progress. For the last couple of years, the mother and teachers had primarily been working on sight word reading.

Indirect and Direct Assessment of Academic Deficits

Following the interview, we conducted a direct assessment of academic skills using AIMSweb curriculum-based measurement early literacy measures. William was administered three 1-min letter sound fluency (LSF) probes, which measure a student’s skill with letter–sound correspondence, and nonsense word fluency (NWF) probes, which measure a student’s skill with reading nonreal words (i.e., saying the sounds in the word or reading the whole word; early phonics skills). William demonstrated proficient performance on the LSF probes based on first-grade percentile scores (i.e., above the 75th percentile compared to the first-grade sample). His performance on the initial NWF probe indicated that he had difficulty stating and blending the sounds of letters presented in a nonsense word. His performance on the NWF probes was characterized by slow and effortful responding with many errors, indicating he was within the acquisition stage of the instructional hierarchy and needed modeling, prompting, and feedback for every response. In subsequent sessions, to establish a baseline level of performance, additional NWF probes were completed and the median score for each assessment session was graphed. William’s baseline performance across three assessment sessions ranged from 8 to 11 sounds correct per min, indicating significant difficulty with applying basic phonics skills. We also assessed the number of the first 100 Fry Instant Words (i.e., the most common words used in texts, rank ordered by when children usually learn them; Fry, 1999) he was able to read, as this was a focus of instruction at school and home. He read 21 of the first 100 words correctly.
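The graphed baseline value for each assessment session is simply the median of that session's probes. A minimal sketch, using illustrative probe scores rather than William's actual data:

```python
from statistics import median

def session_score(probe_scores):
    """Median correct sounds per min across the probes administered
    in one assessment session (the value that gets graphed)."""
    return median(probe_scores)

# Three hypothetical baseline sessions, three NWF probes each
baseline_sessions = [[8, 9, 10], [10, 11, 12], [7, 8, 9]]
print([session_score(s) for s in baseline_sessions])  # [9, 11, 8]
```

Using the median rather than a single probe or the mean guards against one unusually easy or hard probe distorting the session's score.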

Intervention Selection and Initial Trial by Clinician

Given William’s deficits in basic early literacy skills, Sound Partners (Vadasy et al., 2000), a research-supported, phonics-based tutoring program designed for tutors with minimal training or experience, was selected. The program includes a systematic scope and sequence for introducing and reviewing phonemic awareness skills, letter–sound correspondence, blending sounds to form words, reading in context, and writing. To make the program more explicit, we incorporated a model-lead-test procedure (see Figure 3). This involved modeling the task, prompting the child to perform the task with the parent, and then prompting the child to perform the task independently. Furthermore, behavior management strategies were incorporated based on the results of a BEA, which indicated he would benefit from attention and periodic incentives for meeting expectations. The intervention was evaluated in the clinic by the graduate student clinician prior to training the mother to ensure that William would respond well to the tutoring plan.

Fig. 3.

Fig. 3

Parent Protocol. Note. The written protocol was reviewed with the parent during training and sent home as a prompt to complete each step during home tutoring. The parent was asked to check off each step completed and return the form to the clinician. A goal is set with the family to tutor 3–7 times per week

Parent Training

A student clinician trained William’s mother to implement the Sound Partners program at home by following a training checklist (see Figure 4). The clinician provided a rationale and instructions for the intervention, modeled each component, had the mother practice, provided feedback, and developed a plan for use of the tutoring program at home. We reviewed the importance of being prepared for instruction, moving at a brisk pace, and providing immediate feedback. The written protocol with an embedded checklist was provided to William’s mother to complete during each session as a measure of treatment adherence. In addition, the home sessions were audio recorded so that additional feedback could be provided on implementation. After the parent completed at least 80% of steps correctly and felt comfortable implementing the Sound Partners intervention, we also taught William’s mother a more systematic approach, strategic incremental rehearsal (Kupzyk et al., 2011a), for working on the Fry Instant Words. Ms. Cooper also taught Ms. Barnes to use the intervention; however, the treatment integrity data indicated poor adherence. Therefore, Ms. Barnes attended the clinic for a training session with the clinician.

Progress Monitoring

William and his mother returned to the clinic for progress monitoring on a weekly basis. During the progress monitoring sessions, the clinician administered the relevant Sound Partners Mastery Tests and three NWF probes, graphed the data, and shared them with his mother. After six progress-monitoring clinic sessions, William’s performance on the NWF measure had increased to 22 correct sounds per min, and he was able to correctly read 44 of the first 100 Fry Instant Words (see Fig. 5). Furthermore, he showed improvement on the Sound Partners Mastery Tests, receiving 86% correct on Mastery Test 5, which is completed following Lesson 52. This assessment entailed William reading full sentences, which he had not previously been able to do.

Fig. 5.

Fig. 5

Progress Monitoring Graph for William. Note. William’s performance was measured during weekly clinic sessions using AIMSweb Nonsense Word Fluency probes. The data demonstrate an improvement in his skills following the introduction of home tutoring by his mother

Time was reserved during progress monitoring sessions for William’s mother to ask questions and for the clinician to provide feedback and practice any steps in the program with the mother to increase integrity. Based on the audio recordings and discussion, his mother was prepared for home sessions, implemented sessions regularly, and provided enthusiastic praise. Specific feedback and additional training were provided to increase use of behavior management strategies (e.g., clearer expectations, use of more effective directions, ignoring minor inappropriate behavior) and instructional practices (e.g., follow the script and use model-lead-test when introducing content and correcting errors). William’s mother showed an improvement in adherence following the feedback session. Given William’s progress and his mother’s skill in using the intervention, and an upcoming family move, it was agreed that services be terminated. The clinician provided guidance on continued use of the tutoring program and the family was encouraged to contact the clinic with any follow-up questions.

Conclusion

The goal of our training model is to increase the skills of future school personnel to collaborate effectively with families to increase learning opportunities for children with disabilities. Incorporating such programming into courses and experiences for undergraduate and graduate students in special education might lead to increased confidence with engaging families in meaningful collaborations. We are expanding this program to develop a cohesive, yet manageable, online distance training series for preservice teachers. As we have created the distance training, we have also modified the procedures so that meetings with families can be conducted remotely using video technology (interested readers can contact the first author for information and resources about the distance training). School or clinic personnel can provide clear training and guidance to parents in how to assist their children with academic skills at home and thus improve parents’ confidence in doing so. Most important, academic outcomes can be improved through coordinated school and home interventions that involve parent tutoring programs based on the elements of explicit instruction.

Acknowledgment

We thank Dr. Mark Shriver for starting an academic clinic, inspiring students, and providing guidance in supporting students and families.

Funding

This program was supported in part by a grant from the United Way of the Midlands 2017-2020.

Declarations

Ethics Approval for Human Subjects

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. The study was approved by the University of Nebraska Medical Center Institutional Review Board.

Consent to Participate

Written informed consent was obtained from the parents.

Consent to Publish

The parents signed informed consent regarding publishing their data.

Conflict of Interest

The authors declare that they have no conflict of interest.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Archer, A. L., & Hughes, C. A. (2010). Explicit instruction: Effective and efficient teaching. Guildford Press.
  2. Ardoin SP, Daly EJ., III Introduction to the special series: Close encounters of the instructional kind—how the instructional hierarchy is shaping instructional research 30 years later. Journal of Behavioral Education. 2007;16(1):1–6. doi: 10.1007/s10864-006-9027-5. [DOI] [Google Scholar]
  3. Babbs PJ. Monitoring cards help improve comprehension. The Reading Teacher. 1984;38(2):200–204. [Google Scholar]
  4. Banks BM, Shriver MD, Chadwell MR, Allen KD. An examination of behavioral treatment wording on acceptability and understanding. Behavioral Interventions. 2018;33(3):260–270. doi: 10.1002/bin.1521. [DOI] [Google Scholar]
  5. Begency JC, Daly EJ, III, Vallely RJ. Improving oral reading fluency through response opportunities: A comparison of phrase drill error correction with repeated readings. Journal of Behavioral Education. 2006;15(4):229–235. doi: 10.1007/s10864-006-9028-4. [DOI] [Google Scholar]
  6. Berkowitz SJ. Effects of instruction in text organization on sixth-grade students’ memory for expository reading. Reading Research Quarterly. 1986;21:161–178. doi: 10.2307/747843. [DOI] [Google Scholar]
  7. Bos, C. S., & Vaughn, S. (2002). Strategies for teaching students with learning and behavior problems. Allyn & Bacon.
  8. Burns MK. Using incremental rehearsal to increase fluency of single-digit multiplication facts with children identified as learning disabled in mathematics computation. Education & Treatment of Children. 2005;28(3):237–249. [Google Scholar]
  9. Burns MK, Dean VJ, Foley S. Preteaching unknown key words with incremental rehearsal to improve reading fluency and comprehension with children identified as reading disabled. Journal of School Psychology. 2004;42(4):303–314. doi: 10.1016/j.jsp.2004.04.003. [DOI] [Google Scholar]
  10. Burns MK, Ganuza Z, London R. Brief experimental analysis of written letter formation: Single-case demonstration. Journal of Behavioral Education. 2009;18:20–34. doi: 10.1007/s10864-008-9076-z. [DOI] [Google Scholar]
  11. Burns, M. K., Riley-Tillman, T. C., & Rathvon, N. (2017). Effective school interventions: Evidence-based strategies for improving student outcomes. Guilford Press.
  12. Carnine, D., Silbert, J., Kame’enui, E. J., Slocum, T., Tarver, S. G. (2017). Direct instruction reading (6th ed.). Pearson.
  13. Chalk JC, Hagan-Burke S, Burke MD. The effects of self-regulated strategy development on the writing process for high school students with learning disabilities. Learning Disability Quarterly. 2005;28(1):75–87. doi: 10.2307/4126974. [DOI] [Google Scholar]
  14. Chazin, K. T., & Ledford, J. R. (2016). Preference assessments. In Evidence-based instructional practices for young children with autism and other disabilities. http://ebip.vkcsites.org/preference-assessments
  15. Cooke NL, Mackiewicz SM, Wood CL, Helf S. The use of audio prompting to assist mothers with limited English proficiency in tutoring their pre-kindergarten children on English vocabulary. Education & Treatment of Children. 2009;32(2):213–229. doi: 10.1353/etc.0.0057. [DOI] [Google Scholar]
  16. Cohrs CM, Shriver MD, Burke RV, Allen KD. Evaluation of increasing antecedent specificity in goal statements on adherence to positive behavior-management strategies. Journal of Applied Behavior Analysis. 2016;49(4):768–779. doi: 10.1002/jaba.321. [DOI] [PubMed] [Google Scholar]
  17. Daly EJ, III, Lentz FE, Jr, Boyer J. The instructional hierarchy: A conceptual model for understanding the effective components of reading interventions. School Psychology Quarterly. 1996;11(4):369–386. doi: 10.1037/h0088941. [DOI] [Google Scholar]
  19. Daly EJ, III, Martens BK, Hamler KR, Dool EJ, Eckert TL. A brief experimental analysis for identifying instructional components needed to improve oral reading fluency. Journal of Applied Behavior Analysis. 1999;32(1):83–94. doi: 10.1901/jaba.1999.32-83. [DOI] [Google Scholar]
  20. Datchuk SM. Writing simple sentences and descriptive paragraphs: Effects of an intervention on adolescents with writing difficulties. Journal of Behavioral Education. 2016;25(2):166–188. doi: 10.1007/s10864-015-9236-x. [DOI] [Google Scholar]
  21. Davey B, McBride S. Effects of question-generation training on reading comprehension. Journal of Educational Psychology. 1986;78:256–262. doi: 10.1037/0022-0663.78.4.256. [DOI] [Google Scholar]
  22. DiGennaro-Reed FD, Blackman AL, Erath TG, Brand D, Novak MD. Guidelines for using behavioral skills training to provide teacher support. Teaching Exceptional Children. 2018;50(6):373–380. doi: 10.1177/0040059918777241. [DOI] [Google Scholar]
  23. Doǧanay Bilgi A. Evaluating the effect of parent reading interventions on improving reading fluency of students with reading difficulties. Behavioral Interventions. 2020;35(2):217–233. doi: 10.1002/bin.1708. [DOI] [Google Scholar]
  24. Dufrene BA, Warzak WJ. Brief experimental analysis of Spanish reading fluency: An exploratory evaluation. Journal of Behavioral Education. 2007;16(2):143–154. doi: 10.1007/s10864-006-9007-9. [DOI] [Google Scholar]
  25. Dufrene BA, Zoder-Martell KA, Dieringer ST, LaBrot ZC. Behavioral analytic consultation for academic referral concerns. Psychology in the Schools. 2016;53(1):8–23. doi: 10.1002/pits.21885. [DOI] [Google Scholar]
  26. Duhon GJ, House S, Hastings K, Poncy B, Solomon B. Adding immediate feedback to explicit timing: An option for enhancing treatment intensity to improve mathematics fluency. Journal of Behavioral Education. 2015;24(1):74–87. doi: 10.1007/s10864-014-9203-y. [DOI] [Google Scholar]
  27. Duhon GJ, Noell GH, Witt JC, Freeland JT, Dufrene BA, Gilbertson DN. Identifying academic skill and performance deficits: The experimental analysis of brief assessments of academic skills. School Psychology Review. 2004;33(3):429–443. doi: 10.1080/02796015.2004.12086260. [DOI] [Google Scholar]
  28. Eckert TL, Dunn EK, Ardoin SP. The effects of alternate forms of performance feedback on elementary-aged students’ oral reading fluency. Journal of Behavioral Education. 2006;15(3):148–161. doi: 10.1007/s10864-006-9018-6. [DOI] [Google Scholar]
  29. Engelmann S, Haddox P, Bruner E. Teach your child to read in 100 easy lessons. Simon & Schuster; 1983. [Google Scholar]
  30. Erchul WP, Martens BK. School consultation: Conceptual and empirical bases of practice. 3. Springer; 2012. [Google Scholar]
  31. Erion J. Parent tutoring: A meta-analysis. Education & Treatment of Children. 2006;29:79–106. [Google Scholar]
  32. Erion J, Hardy J. Parent tutoring, instructional hierarchy, and reading: A case study. Preventing School Failure: Alternative Education for Children and Youth. 2019;63(4):382–392. doi: 10.1080/1045988X.2019.1627998. [DOI] [Google Scholar]
  33. Fishel M, Ramirez L. Evidence-based parent involvement interventions with school-aged children. School Psychology Quarterly. 2005;20(4):371–402. doi: 10.1521/scpq.2005.20.4.371. [DOI] [Google Scholar]
  34. Fiske KE. Treatment integrity of school-based behavior analytic interventions: A review of the research. Behavior Analysis in Practice. 2008;1(2):19–25. doi: 10.1007/BF03391724. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Fowler, S. A., Coleman, M. R. B., & Bogdan, W. K. (2019). The state of the special education profession survey report. Council for Exceptional Children.
  36. Fry, E. (1999). 1000 instant words. Teacher Created Resources.
  37. Garbacz SA, Herman KC, Thompson AM, Reinke WM. Family engagement in education and intervention: Implementation and evaluation to maximize family, school, and student outcomes. Journal of School Psychology. 2017;62:1–10. doi: 10.1016/j.jsp.2017.04.002. [DOI] [PubMed] [Google Scholar]
  38. Gardill MC, Jitendra AK. Advanced story map instruction: Effects on the reading comprehension of students with learning disabilities. Journal of Special Education. 1999;33(1):2–17. doi: 10.1177/002246699903300101. [DOI] [Google Scholar]
  39. Goeke, J. L. (2009). Explicit instruction: A framework for meaningful direct teaching. Pearson.
  40. Goffreda CT, Diperna JC, Pedersen JA. Preventive screening for early readers: Predictive validity of the dynamic indicators of basic early literacy skills (DIBELS). Psychology in the Schools. 2009;46(6):539–552. doi: 10.1002/pits.20396. [DOI] [Google Scholar]
  41. Gortmaker VJ, Daly EJ, III, McCurdy M, Persampieri MJ, Hergenrader M. Improving reading outcomes for children with learning disabilities: Using brief experimental analysis to develop parent-tutoring interventions. Journal of Applied Behavior Analysis. 2007;40(2):203–221. doi: 10.1901/jaba.2007.105-05. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Graham S, Harris KR, Larsen L. Prevention and intervention of writing difficulties for students with learning disabilities. Learning Disabilities Research & Practice. 2001;16:74–84. doi: 10.1111/0938-8982.00009. [DOI] [Google Scholar]
  43. Gresham FM. Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review. 1989;18(1):37–50. doi: 10.1080/02796015.1989.12085399. [DOI] [Google Scholar]
  44. Hagermoser Sanetti LM, Kratochwill TR. Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review. 2009;38(4):445–459. [Google Scholar]
  45. Hansen J, Pearson PD. An instructional study: Improving the inferential comprehension of good and poor fourth-grade readers. Journal of Educational Psychology. 1983;75:821–829. doi: 10.1037/0022-0663.75.6.821. [DOI] [Google Scholar]
  46. Haring, N. G., Lovitt, T. C., Eaton, M. D., & Hansen, C. L. (1978). The fourth R: Research in the classroom. Charles E. Merrill.
  47. Heller LR, Fantuzzo JW. Reciprocal peer tutoring and parent partnership: Does parent involvement make a difference? School Psychology Review. 1993;22:517–534. doi: 10.1080/02796015.1993.12085670. [DOI] [Google Scholar]
  48. Hedin L, DeSpain S. SMART or not? Writing specific, measurable IEP goals. Teaching Exceptional Children. 2018;51(2):100–110. doi: 10.1177/0040059918802587. [DOI] [Google Scholar]
  49. Hook CL, DuPaul GJ. Parent tutoring for students with attention-deficit/hyperactivity disorder: Effects on reading performance at home and school. School Psychology Review. 1999;28:60–75. doi: 10.1080/02796015.1999.12085948. [DOI] [Google Scholar]
  50. Hughes CA, Morris JR, Therrien WJ, Benson SK. Explicit instruction: Historical and contemporary contexts. Learning Disabilities Research & Practice. 2017;32(3):140–148. doi: 10.1111/ldrp.12142. [DOI] [Google Scholar]
  51. Jacobs M, Woolfson L, Hunter S. Attributions of stability, control and responsibility: How parents of children with intellectual disabilities view their child’s problematic behaviour and its causes. Journal of Applied Research in Intellectual Disabilities. 2016;29(1):58–70. doi: 10.1111/jar.12158. [DOI] [PubMed] [Google Scholar]
  52. Kame’enui, E. J. (2021, forthcoming). Ode to Zig (and the Bard): Toward a more complete logical-empirical model of Direct Instruction. Perspectives on Behavior Science. [DOI] [PMC free article] [PubMed]
  53. Kelleher C, Riley-Tillman TC, Power TJ. An initial comparison of collaborative and expert-driven consultation on treatment integrity. Journal of Educational & Psychological Consultation. 2008;18:294–324. doi: 10.1080/10474410802491040. [DOI] [Google Scholar]
  54. Krawec J, Huang J, Montague M, Kressler B, Melia de Alba A. The effects of cognitive strategy instruction on knowledge of math problem-solving processes of middle school students with learning disabilities. Learning Disability Quarterly. 2013;36(2):80–92. doi: 10.1177/0731948712463368. [DOI] [Google Scholar]
  55. Kupzyk S, Daly EJ., III Teachers training parents as reading tutors. Contemporary School Psychology. 2017;21:140–151. doi: 10.1007/s40688-016-0113-y. [DOI] [Google Scholar]
  56. Kupzyk S, Daly EJ, III, Andersen MN. A comparison of two flashcard methods for improving sight-word reading. Journal of Applied Behavior Analysis. 2011;44:781–792. doi: 10.1901/jaba.2011.44-781. [DOI] [PMC free article] [PubMed] [Google Scholar]
  57. Kupzyk S, McCurdy M, Hofstadter KL, Berger L. Recorded readings: A taped parent-tutoring intervention. Journal of Behavioral Education. 2011;20:87–102. doi: 10.1007/s10864-011-9123-z. [DOI] [Google Scholar]
  58. Kupzyk S, Shriver MD. A systematic framework for addressing treatment integrity in school settings. Psychology in the Schools. 2016;53(9):954–970. doi: 10.1002/pits.21955. [DOI] [Google Scholar]
  59. Joshi RM, Treiman R, Carreker S, Moats LC. How words cast their spell: Spelling is an integral part of learning the language, not a matter of memorization. American Educator. 2008;43:6–43. [Google Scholar]
  60. LaBrot ZC, Kupzyk S, Strong-Bak W, Pasqua JL, Mahon J. Examination of group-based behavioral skills training for parents of children with intellectual and neurodevelopmental disorders. Child & Family Behavior Therapy. 2020;42(2):98–124. doi: 10.1080/07317107.2020.1738715. [DOI] [Google Scholar]
  61. Li D. Story mapping and its effects on the writing fluency and word diversity of students with learning disabilities. Learning Disabilities: A Contemporary Journal. 2007;5(1):77–93. [Google Scholar]
  62. Marchand-Martella NE, Martella RC, Nelson JR, Waterbury L, Shelley SA, Cleanthous C, Hartfield D. Implementation of the sound partners reading program. Journal of Behavioral Education. 2002;11(2):117–130. doi: 10.1023/A:1015483326686. [DOI] [Google Scholar]
  63. Marotz, L., & Kupzyk, S. (2018). Parenting today’s children: A developmental perspective. Cengage.
  64. Miltenberger, R. G. (2016). Behavior modification: Principles and procedures (6th ed.). Cengage Learning.
  65. McCallum RS, Krohn KR, Skinner CH, Hilton-Prillhart A, Hopkins M, Waller S, Polite R. Improving reading comprehension of at-risk high-school students: The art of reading program. Psychology in the Schools. 2010;48(1):78–86. doi: 10.1002/pits.20541. [DOI] [Google Scholar]
  66. McCallum E, Schmitt AJ, Evans SN, Schaffner KN, Long KH. An application of the taped spelling intervention to improve spelling skills. Journal of Evidence-Based Practices for Schools. 2014;14(1):51–81. [Google Scholar]
  67. McCurdy M, Clure LF, Bleck AA, Schmitz SL. Identifying effective spelling interventions using a brief experimental analysis and extended analysis. Journal of Applied School Psychology. 2016;32(1):46–65. doi: 10.1080/15377903.2015.1121193. [DOI] [Google Scholar]
  68. McIntyre LL, Gresham FM, DiGennaro FD, Reed DD. Treatment integrity of school-based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis. 2007;40(4):659–672. doi: 10.1901/jaba.2007.659-672. [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. Mellott JA, Ardoin SP. Using brief experimental analysis to identify the right math intervention at the right time. Journal of Behavioral Education. 2019;28(4):435–455. doi: 10.1007/s10864-019-09324-x. [DOI] [Google Scholar]
  70. Meyer MS, Felton RH. Repeated reading to enhance fluency: Old approaches and new directions. Annals of Dyslexia. 1999;49:283–306. doi: 10.1007/s11881-999-0027-8. [DOI] [Google Scholar]
  71. Montague M, Warger C, Morgan TH. Solve it! Strategy instruction to improve mathematical problem solving. Learning Disabilities Research & Practice. 2000;15(2):110–116. doi: 10.1207/SLDRP1502_7. [DOI] [Google Scholar]
  72. National Assessment of Educational Progress (NAEP). (2011). Writing assessment. U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. https://nces.ed.gov/nationsreportcard/pdf/main2011/2012470.pdf
  73. National Assessment of Educational Progress (NAEP). (2020). Results from the 2019 Mathematics and Reading Assessments. U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. https://www.nationsreportcard.gov/
  74. National Institute for Direct Instruction. (n.d.). Basic philosophy of Direct Instruction (DI). https://www.nifdi.org/what-is-di/basic-philosophy.html
  75. Park Y, Benedict AE, Brownell MT. Construct and predictive validity of the CORE phonics survey: A diagnostic assessment for students with specific learning disabilities. Exceptionality. 2014;22(1):3–50. doi: 10.1080/09362835.2013.865534. [DOI] [Google Scholar]
  76. Persampieri M, Gortmaker V, Daly EJ, III, Sheridan SM, McCurdy M. Promoting parent use of empirically supported reading interventions: Two experimental investigations of child outcomes. Behavioral Interventions. 2006;21:31–57. doi: 10.1002/bin.210. [DOI] [Google Scholar]
  77. Phipps, L., Robinson, E. L., & Grebe, S. (2020). An evaluation of strategic incremental rehearsal on sight word acquisition among students with specific learning disabilities in reading. Journal of Behavioral Education. Advance online publication. 10.1007/s10864-020-09398-y.
  78. Poncy BC, Fontenelle SF, Skinner CH. Using detect, practice, and repair (DPR) to differentiate and individualize math fact instruction in a class-wide setting. Journal of Behavioral Education. 2013;22(3):211–228. doi: 10.1007/s10864-013-9171-7. [DOI] [Google Scholar]
  79. Reinke WM, Stormont M, Herman KC, Newcomer L. Using coaching to support teacher implementation of classroom-based interventions. Journal of Behavioral Education. 2014;23:150–167. doi: 10.1007/s10864-013-9186-0. [DOI] [Google Scholar]
  80. Rolf, K. R., & Slocum, T. (this issue). Features of Direct Instruction: Interactive lessons. [DOI] [PMC free article] [PubMed]
  81. Rose TL, Sherry L. Relative effects of two previewing procedures on LD adolescents’ oral reading performance. Learning Disabilities Quarterly. 1984;7:39–44. doi: 10.2307/1510259. [DOI] [Google Scholar]
  82. Rosenshine, B., & Stevens, R. (1986). Teaching functions. In M. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 376–391). Macmillan.
  83. Seabrook R, Brown G, Solity J. Distributed and massed practice: From laboratory to classroom. Applied Cognitive Psychology. 2005;19(1):107–122. doi: 10.1002/acp.1066. [DOI] [Google Scholar]
  84. Shapiro, E. S. (2011). Academic skills problems: Direct assessment and intervention. Guilford Press.
  85. Shapiro ES, Miller DN, Sawka K, Gardill CM, Handler M. Facilitating the inclusion of students with EBD into general education classrooms. Journal of Emotional & Behavioral Disorders. 1999;7:83–93. doi: 10.1177/106342669900700203. [DOI] [Google Scholar]
  86. Sheridan SM, Welch M, Orme SF. Is consultation effective? A review of outcome research. Remedial & Special Education. 1996;17(6):341–354. doi: 10.1177/074193259601700605. [DOI] [Google Scholar]
  87. Shayne R, Miltenberger RG. Evaluation of behavioral skills training for teaching functional assessment and treatment selection skills to parents. Behavioral Interventions. 2013;28(1):4–21. doi: 10.1002/bin.1350. [DOI] [Google Scholar]
  88. Shinn, M. R. (1989). Curriculum-based measurement: Assessing special children. Guilford Press.
  89. Skinner CH, McLaughlin TF, Logan P. Cover, copy, and compare: A self-managed academic intervention effective across skills, students, and settings. Journal of Behavioral Education. 1997;7:296–306. doi: 10.1023/A:1022823522040. [DOI] [Google Scholar]
  90. Slocum, T., & Rolf, K. R. (this issue). Features of Direct Instruction: Content analysis. [DOI] [PMC free article] [PubMed]
  91. Sterling-Turner HE, Robinson SL, Wilczynski SM. Functional assessment of distracting and disruptive behaviors in the school setting. School Psychology Review. 2001;30(2):211–226. doi: 10.1080/02796015.2001.12086110. [DOI] [Google Scholar]
  92. Tucker V, Schwartz I. Parents’ perspectives of collaboration with school professionals: Barriers and facilitators to successful partnerships in planning for students with ASD. School Mental Health. 2013;5:3–14. doi: 10.1007/s12310-012-9102-0. [DOI] [Google Scholar]
  93. Twyman, J. S. (2021, forthcoming). The evidence is in the design. Perspectives on Behavior Science. [DOI] [PMC free article] [PubMed]
  94. Vadasy PF, Jenkins JR, Pool K. Effects of tutoring in phonological and early reading skills on students at risk for reading disabilities. Journal of Learning Disabilities. 2000;33(6):579–590. doi: 10.1177/002221940003300606. [DOI] [PubMed] [Google Scholar]
  95. Valleley RJ, Begeny JC, Shriver MD. Collaborating with parents to improve children’s reading. Journal of Evidence Based Practices for Schools. 2005;6:19–41. [Google Scholar]
  96. Van Bon WHJ, Boksebeld LM, Font Freide TAM, Van den Hurk JM. A comparison of three methods of reading-while-listening. Journal of Learning Disabilities. 1991;24:471–476. doi: 10.1177/002221949102400805. [DOI] [PubMed] [Google Scholar]
  97. VanDerHeyden AM, Burns MK. Performance indicators in math: Implications for brief experimental analysis of academic performance. Journal of Behavioral Education. 2009;18(1):71–91. doi: 10.1007/s10864-009-9081-x. [DOI] [Google Scholar]
  98. Wagner DL, Coolong-Chaffin M, Deris AR. Comparing brief experimental analysis and teacher judgment for selecting early reading interventions. Journal of Behavioral Education. 2017;26:348–370. doi: 10.1007/s10864-017-9281-8. [DOI] [Google Scholar]
  99. Windingstad S, Skinner CH, Rowland E, Cardin E, Fearrington JY. Extending research on a math fluency building intervention: Applying taped problems in a second-grade classroom. Journal of Applied School Psychology. 2009;25(4):364–381. doi: 10.1080/15377900903175861. [DOI] [Google Scholar]
  100. Xin YP. The effect of schema-based instruction in solving mathematics word problems: An emphasis on prealgebraic conceptualization of multiplicative relations. Journal for Research in Mathematics Education. 2008;39(5):526–551. doi: 10.5951/jresematheduc.39.5.0526. [DOI] [Google Scholar]
  101. Zhou Q, Dufrene BA, Mercer SH, Olmi DJ, Tingstrom DH. Parent-implemented reading interventions within a response to intervention framework. Psychology in the Schools. 2019;56(7):1139–1156. doi: 10.1002/pits.22251. [DOI] [Google Scholar]
