Author manuscript; available in PMC: 2025 May 17.
Published in final edited form as: J Educ Psychol Consult. 2024 May 17;34(3):265–289. doi: 10.1080/10474412.2024.2352464

Intervention Plan Quality Matters: Using COMPASS to Collaboratively Develop Student-Centered, Evidence-based Intervention Plans using an EBPP Approach

Lindsey N Ogle 1, Blaine A Garman-McClaine 2, Lisa A Ruble 1
PMCID: PMC11280249  NIHMSID: NIHMS1995779  PMID: 39072267

Abstract

The quality of interventions for children with autism has improved thanks in part to the widespread dissemination of evidence-based practices (EBPs); however, teachers still report challenges developing focused interventions targeting the core challenges of students with autism. Tested in three randomized trials, COMPASS is a consultation-based implementation strategy that prioritizes shared decision-making in the development of goals and intervention plans using an evidence-based practice in psychology approach. To successfully train COMPASS consultants, a 16-item Intervention Plan Quality Scale (IPQS) was developed and tested with a set of nine school-based COMPASS-trained consultants who provided a total of 28 consultations. Results revealed that the IPQS had acceptable reliability and concurrent validity and was successful in helping consultant trainees develop high-quality plans over four feedback sessions. Overall, the IPQS was helpful for fidelity monitoring and appears to partially mediate child goal attainment outcomes through teacher adherence implementing the intervention plans.


Public schools are the primary setting where children with autism receive services (Stichter et al., 2016; Zemantic et al., 2022). Individualized services are outlined and specified in a student’s Individualized Education Program (IEP), a “roadmap” and promise to caregivers for what to expect. The IEP must include but is not limited to a child’s (a) present level of performance, (b) measurable annual goals (including progress toward meeting such goals), (c) specially designed instruction and related services that are evidence-based, and (d) where the child will receive most services in relation to their nondisabled peers (i.e., least restrictive environment). Prior to Endrew F. v. Douglas County School District (Endrew, 2017), schools could develop IEPs that resulted in only minimal educational benefit for students with disabilities and still meet the legal requirements of the Individuals with Disabilities Education Act (IDEA, 2004; Yell & Bateman, 2020). Endrew, however, established a new educational benefit standard for student progress that states “a school must offer an IEP reasonably calculated to enable a child to make progress appropriate in light of the child’s circumstances” (Endrew F. v. Douglas County School District, 2017, p. 2) and challenged schools to ensure that all children “have the chance to meet challenging objectives” (p. 3). This raises the question: how do teachers develop educational programming that meets this higher standard? Fortunately, with tested strategies such as consultation to support special education teachers in developing effective IEPs, many teachers can design high-quality educational programming that can meet the standard required by Endrew.

A high-quality IEP for students with autism includes goals and objectives that target underlying developmental differences in areas of social, communication, and independent learning/self-management skills (National Research Council, 2001). Example skills might include taking turns with peers, initiating requests for help, or starting and completing a task independently, skills that support self-determination and quality of life. For many teachers, developing these types of goals and the high-quality IEPs that address them may be difficult (Dale et al., 2022; Findley et al., 2022; Knight et al., 2019; Odom et al., 2022; Ruble et al., 2010). Although students with autism share a diagnostic label, each one has a different set of skills, interests, and strengths that need to be accounted for in a personalized IEP. To address the heterogeneity in autism, teachers need support in identifying and writing personalized IEP goals and intervention plans that appropriately challenge students (Ogle et al., 2023; Ruble et al., 2022).

Although autism intervention research has produced a menu of evidence-based practices (EBPs; Hume et al., 2021) for teachers to select and use, adopting, implementing, and sustaining EBPs in schools is challenging (Domitrovich et al., 2008; Forman et al., 2009; Odom et al., 2022; Owens et al., 2014). There is also more to improving student outcomes than exclusively delivering an evidence-based practice (Ruble et al., 2018a; 2018b). As highlighted by the framework for evidence-based practice in psychology (EBPP; APA Presidential Task Force on Evidence-Based Practice, 2006), practitioners must also consider professional expertise and student characteristics, needs, and preferences (McGrew et al. 2015). Professional expertise incorporates multiple competencies that improve the impact of an EBP, including but not limited to (a) assessment and intervention planning, (b) progress monitoring, (c) acquisition of new skills, and (d) understanding individual and cultural nuances of EBP impact (APA Presidential Task Force on Evidence-Based Practice, 2006). Providing individualized services that align directly with student characteristics and preferences is at the heart of special education services. However, the research evidence surrounding “what works for whom” is less clear when selecting from a menu of EBPs for students with autism. This gap in the literature can create challenges for teachers adhering to all three tenets of EBPP and suggest the need for high-quality, evidence-based implementation strategies to achieve socially valid student outcomes.

Implementation strategies are designed to increase the adoption, implementation, and sustainability of an EBP (Curran et al., 2012; Proctor et al., 2013). Implementation strategies such as toolkits, checklists, and protocols can support practitioners in selecting the best available EBPs, determining what skills to support and how to support their development and growth based on child and family characteristics, and adequately assessing and monitoring student progress (Ruble et al., 2015). When combined with consultation, these strategies can support the implementation of EBPs by helping teachers apply their knowledge of EBPs to the specific contexts of their students and gain opportunities for performance feedback and student progress monitoring.

In addition to increasing teachers’ effectiveness in developing and implementing high-quality intervention plans, consultation can also offer opportunities for meaningful caregiver engagement. Caregiver involvement in the IEP process has been a consistent legal requirement, with federal law mandating caregivers be equal partners and informed decision-makers in their child’s IEP development (Turnbull et al., 2007). Positive caregiver-teacher collaboration is essential for students’ success, and research has consistently linked it to improved academic and social-emotional student outcomes (Newman & Dusenbury, 2015; Ruble et al., 2019). Meaningfully involving caregivers early in the IEP process from the bottom up while identifying goals and developing ideas for intervention plans through consultation can help caregivers and teachers work together to develop an appropriately ambitious, evidence-based intervention plan based on team alliance (Azad et al., 2021; Dale et al., 2022; Ruble et al., 2022a; Ruble et al., 2022b; Ogle et al., 2023).

One comprehensive consultation-based implementation strategy that supports teachers and caregivers in developing and implementing high-quality goals and intervention plans for students with autism is the Collaborative Model for Competence and Success (COMPASS; Ruble et al., 2012; 2015; 2022). COMPASS is an innovative second-generation, comprehensive intervention (Odom et al., 2010) that brings together those who interact most with the child – parents/caregivers and teachers – and supports them in making shared decisions about the child’s educational programming (Ruble et al., 2012; See Figure 1). COMPASS consists of an initial consultation in which goals and intervention plans are mutually developed with caregiver and teacher input followed by teacher coaching with performance feedback and progress monitoring to support the teacher’s implementation of the intervention plans. COMPASS incorporates appropriate EBPs individualized to contextual factors and implementation strategies such as checklists, toolkits, and regular progress monitoring through goal attainment scaling. However, previous research on COMPASS has noted that developing high quality intervention plans is a particularly challenging skill for consultant trainees (Ruble et al., 2022).

Figure 1.

COMPASS Implementation Framework Adapted from Dunst et al. (2013)

What are the elements of high-quality intervention plans?

High-quality intervention plans include: 1) specific, measurable, attainable, relevant, and time-bound (SMART) goals addressing the core social, communication, and learning/self-management skills, 2) an evidence-based systematic teaching sequence that utilizes an EBPP approach to adapt appropriate EBPs to the specific strengths, needs, and interests of the student and environment, 3) clear and specific procedures outlining who, when, and where the intervention will be implemented and how it will be measured, and 4) plans for maintenance, self-direction, and generalization of the target skills. To support the implementation of these elements, a template of the intervention plan includes prompts for each of these elements (See Appendix 1). We detail these elements in the next section.

SMART Goals Addressing Core Needs of Students with Autism

In addition to selecting goals that address the core social, communication, and learning skills of students with autism, the goals themselves should be SMART (Specific, Measurable, Attainable, Relevant, and Time-bound). Writing SMART, individualized IEP goals is a particularly challenging skill for many teachers (Hedin & DeSpain, 2018), one that we have also observed in our own work. To facilitate goal writing compliant with the requirements of IDEA (2004), goals follow a specific format including: a) the condition or circumstance, b) behavior, c) criteria or frequency, d) measurement, and e) timeline (See Table 1; e.g., “When arriving to the table for group work, Student will independently initiate a greeting to one familiar peer in his class for 4 out of 5 opportunities in a single week as measured by a frequency checklist by the end of the school year.”). Using this format ensures that the goals contain the details necessary to be observable and measurable.

Table 1.

Writing SMART IEP Goals

| Goal Element | Question | Example |
| --- | --- | --- |
| Condition/Circumstance | In what circumstance do you want to see the behavior? | When given a verbal greeting (Hi Matt!) |
| Behavior | What is the behavior you want to see? | Matt will return the greeting by saying “Hi” independently |
| Criteria/Frequency | How will you know if the goal is achieved? | four times per day for five days in a row |
| Measurement | How will you measure the behavior? | as measured by a frequency checklist |
| Timeline | When do you want the skill to be accomplished? | by the end of the school year |

Effective Teaching Sequence that Embeds EBPs within an EBPP Framework

Embedded within a high-quality intervention plan is a systematic teaching sequence (See Figure 2) that is informed by an evidence-based practice in psychology (EBPP) framework that requires the adaptation of EBPs to the student’s personal and environmental strengths, interests, and challenges (Ruble et al., 2022a). Regardless of the skill being targeted or the specific EBP used, systematic instruction that includes a step-by-step, evidence-based teaching sequence provides the foundation for effective instruction (McLeskey et al., 2017; Ruble et al., 2022a).

Figure 2.

Common Elements of an Effective Teaching Sequence

First, teachers must identify meaningful activities designed to teach the target behavior and design a teaching sequence that allows for the instruction and demonstration of the behavior described in the goal. For example, if the student’s goal is to engage in social initiation with peers, a teaching sequence focused on responding to a peer would not give the student an opportunity to demonstrate the target behavior of social initiation. Aligning educational activities to IEP goals targeting social-emotional skills is a consistent challenge for teachers of autistic students, as it requires a high degree of creativity and knowledge of relevant EBPs to design meaningful activities to teach these skills (Brock et al., 2019; Knight et al., 2019). Next, it is essential to describe how the teacher, peer(s), or environment will acquire and maintain the student’s attention. Difficulty establishing joint attention, or shared attention, is a core deficit of students with autism (Craig et al., 2016; Murza et al., 2016), and describing specific strategies for obtaining joint attention is important in a systematic teaching sequence to ensure that the teacher does not mistakenly overlook this important challenge (Ruble et al., 2020).

After describing how the student’s attention will be focused, it is then essential to describe an appropriate response-prompting procedure to increase the probability of correct responding and thus opportunities for positive reinforcement (Walker et al., 2008). For example, it is common for teachers to provide a cue and then repeat the cue using the same prompting strategy if the student does not respond. This type of constant prompting can lead to prompt dependence (Hume et al., 2009), a phenomenon that limits the self-direction of the target skills. While simultaneous prompting has value in some situations (Kurt et al., 2008), this procedure should quickly transition to either constant or progressive time-delay prompting to prevent prompt dependence (Hume et al., 2009; Kurt et al., 2008; Westling et al., 2021). Thus, it is important in a systematic teaching sequence to describe the specific response-prompting procedures, including the latency period between the cue and controlling prompt, to prevent prompt dependency and increase opportunities for self-direction.

Finally, a successful teaching sequence should conclude the activity with reinforcement or correction (Ruble et al., 2020). Reinforcement is an EBP (AFIRM, 2015) that increases the likelihood that the learner will demonstrate the targeted behavior in the future by building a relationship between the demonstration of a behavior and a reward. Reinforcers can be natural (e.g., spoken praise) or artificial (e.g., token, edible, break) as well as unconditioned (e.g., food as a reward for waiting in the lunch line) or conditioned (e.g., earning tokens toward a reward) (Westling et al., 2021). Reinforcement should be as natural and unconditioned as possible from the beginning or should systematically change over time from more conditioned, artificial reinforcement to unconditioned, natural reinforcement (Muharib et al., 2021; Westling et al., 2021). The reinforcement schedule, or the frequency with which reinforcement is provided following a correct response, should also be described in a systematic teaching sequence. Reinforcement schedules can either be (1) ratio schedules that reward a student based on the number of correct responses on a fixed or variable schedule or (2) interval schedules that reward a student after a passage of time on a fixed or variable schedule (Westling et al., 2021). Variable schedules of reinforcement, which appear less predictable from the student’s perspective, promote the most durable response. Thus, transitioning from a fixed to a variable reinforcement strategy is recommended as the student begins to demonstrate mastery of the target skill to improve the maintenance of the skill (Muharib et al., 2021).
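To make the schedule distinctions above concrete, the following minimal sketch (our illustration, not part of COMPASS or the IPQS; the function names and parameter values are hypothetical) shows when reinforcement would be delivered under fixed-ratio, variable-ratio, and fixed-interval schedules:

```python
import random

# Illustrative sketch of the reinforcement schedules described above.
def fixed_ratio(n_correct, ratio):
    """Reinforce every `ratio`-th correct response (e.g., FR-3: every 3rd)."""
    return n_correct > 0 and n_correct % ratio == 0

def variable_ratio(ratio, rng=random):
    """Reinforce a correct response with probability 1/ratio, so reinforcement
    arrives on average every `ratio`-th response but is unpredictable."""
    return rng.random() < 1 / ratio

def fixed_interval(seconds_since_last_reward, interval):
    """Reinforce the first correct response after `interval` seconds elapse."""
    return seconds_since_last_reward >= interval

# Under an FR-3 schedule, the 3rd and 6th correct responses are reinforced.
print([n for n in range(1, 7) if fixed_ratio(n, 3)])  # [3, 6]
```

Consistent with the recommendation above, a plan might begin on a fixed schedule during acquisition and shift toward a variable schedule as the student approaches mastery.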

In previous research, we evaluated these features of a systematic teaching sequence by observing the teacher’s implementation of the intervention plans using a six-item index measure called the common elements of teaching sequences (CETS; Ruble et al., 2020) and found that teachers’ use of the common elements increased over time with repeated opportunities for coaching. More importantly, students whose teachers used more common elements had higher goal attainment outcomes and higher ratings of engagement as measured by an independent rater.

Clear and Specific Procedures

To support the implementation of the intervention plans, high-quality plans include clear and specific procedures outlining 1) who will be involved, for what reason, and where; 2) how often the intervention will be implemented; 3) how the student’s progress will be measured (e.g., frequency checklist, time sampling data sheet); and 4) what materials the teacher needs to implement the plans (e.g., social story, video model, trained peers, a paraprofessional). Pre-teaching activities, or activities necessary to implement the step-by-step teaching sequence such as training peers or paraprofessionals, writing a social story, or reviewing a specific EBP, are also essential to describe in the intervention plan. Together these procedures ensure that the educational team is on the same page about exactly how and when the plans will be implemented and how they will be measured, supporting both implementation fidelity and performance feedback.

Plans for Maintenance, Self-direction, and Generalization

Last, high-quality intervention plans should include specific plans for the maintenance, self-direction, and generalization of the target skills (McLeskey et al., 2017; Westling et al., 2021). The pivotal skills targeted in the intervention plans, once mastered, provide the foundation for the development of more advanced skills (Koegel & Koegel, 2006), and it is important to clearly describe what those more advanced skills are in the intervention plans. For example, for a skill targeting social initiation with familiar peers in the classroom, we want that skill to generalize to other people and other settings and eventually progress toward the conversational skills that underlie the development of friendships and other close relationships, which is an essential component of a high quality of life (Ruble & Dalrymple, 1996).

Research Questions

This study comes from a larger investigation focused on developing and testing a COMPASS training package (Ogle et al., 2023; Ruble et al., 2022a). In that study, we found that we could train consultants to implement COMPASS with acceptable implementation outcomes (i.e., adherence, acceptability, feasibility, appropriateness). However, unlike the other features of COMPASS (e.g., quality of delivery; adherence to procedures implementing COMPASS) that required only one session of fidelity feedback, developing high quality intervention plans required at least four feedback opportunities to reach fidelity (Ruble et al., 2022).

The current study provides a more detailed understanding of the development and usefulness of a measure we developed during the training to guide intervention plan development, performance feedback to consultant trainees, and teacher adherence. The Intervention Plan Quality Scale (IPQS) was developed to address this need. We examined three research questions:

  1. Can we develop a measure of intervention plan quality for training community-based COMPASS consultants that demonstrates acceptable reliability and validity?

  2. What elements of the Intervention Plan Quality Scale did consultants demonstrate the most difficulty with at their first consultation, and how did this change as they gained more experience?

  3. Are scores on the Intervention Plan Quality Scale associated with teacher adherence to the implementation of intervention plans and student goal attainment outcomes?

Method

Participants

As stated previously, participants were recruited as part of a larger participant-informed implementation effectiveness (Curran et al., 2012) study testing a COMPASS training package for school and community-based consultants and analyzing both implementation outcomes (Ruble et al., 2022) and the impact of different types and dosages of performance feedback on child outcomes (Ogle et al., 2023). Participants included nine school-based consultant trainees (CTs) who each implemented COMPASS with up to four different students with autism (N = 28) in collaboration with the student’s special education teacher (N = 28) and caregiver (N = 28) for a total of 93 participants. All participants were recruited across two school years (2018–2019 & 2019–2020) from a mid-southern state in the United States in a stepwise fashion, with consultants recruited first, followed by teachers, then caregivers/students. All participants received an initial COMPASS consultation in which intervention plans were developed and were then assigned to one of four follow-up conditions: four coaching sessions, two coaching sessions, four emailed feedback opportunities, or initial consultation only (no follow-up). Recruitment was negatively impacted by the COVID-19 pandemic in the spring of 2020 as schools closed before some consultations were able to take place and participants were unable to participate in the study as planned.

To be eligible to participate, students needed to be formally diagnosed with autism, receiving special education services under the category of autism, and meet cut-off scores for autism on the Social Communication Questionnaire – Current (Rutter, Bailey, & Lord, 2003). Students were between the ages of three and 12 years old (M = 7.39, SD = 2.67); five (18%) were female and 23 (82%) were male. Student socioeconomic status varied with 20% of caregivers reporting less than $25,000 in total annual household income, 26% reporting between $25,000 and $49,999, 36% reporting between $50,000 and $100,000, and 3% reporting greater than $100,000. The majority were White (N = 25, 89%), two (7%) were Black, and one (4%) was multi-racial. Mean student adaptive behavior based on the Composite Standard Score of the Vineland II – Teacher Form (Sparrow et al., 2005) was 54.79 (SD = 19.43) as rated by their teacher.

All special education teachers (N = 28) were licensed in their state with 25 (89%) licensed in special education, two (7%) licensed in Elementary Education, and one (3%) with an alternative certification. All teachers were White and 93% were female with a mean age of 38 years (SD = 9.82). Teachers reported between one and 31 years of experience (M = 11.14, SD = 8.14) teaching. For classroom placements, 32% reported teaching in inclusion classrooms, 32% in resource classrooms, 53% in self-contained classrooms, 7% in specialized schools. Teachers had between one and nine students with autism on their caseload at baseline (M = 3.36, SD = 2.33). Most reported some training or professional development specific to working with students with autism with 36% reporting more than five professional development opportunities. Mean scores from the Autism Self-efficacy Scale for Teachers (ASSET; Ruble et al., 2013) administered at baseline revealed somewhat high ratings (M = 3.28 out of a possible 4.0, SD = .55; 1 = Not at all Certain to 4 = Very Certain).

All CTs were White women (M = 37 years old, SD = 7.96) who had an average of 10 years (SD = 6.31) of consultation experience in public schools. Seven had prior teaching experience, eight had master’s degrees, one had a bachelor’s degree, and two were advanced doctoral students who provided school consultation in rural areas as part of their university-affiliated autism clinic and were from different universities than the researchers. All CTs were familiar with the 27 identified EBPs in autism and had a relatively high degree of self-efficacy in teaching students with autism, with mean scores similar to teachers (ASSET; Ruble et al., 2013; M = 3.32, SD = .42) at baseline. All CTs were trained and received supervision by the researchers in COMPASS consultation and coaching.

Procedures

Initial Consultation.

All teacher-caregiver dyads received an initial COMPASS consultation facilitated by a COMPASS-trained CT. Consultations occurred between October and January and generally lasted three hours, with most CTs, teachers, and caregivers opting to meet over two 1½-hour sessions a few days apart. Prior to the consultation, caregivers and teachers independently completed the COMPASS profile (available for electronic completion online at www.compassforautism.org), which was then compiled into a single document prior to the consultation. During the consultation, this profile was used to collaboratively identify and come to consensus on three IEP goals in the domains of social skills, communication skills, and independent learning skills. Once goals were identified, intervention plans to meet each goal were then collaboratively developed using the student’s personal and environmental challenges and supports to personalize and contextualize EBPs to maximize the student’s goal attainment, and the plans were written into the template (Appendix 1). While not required, we encouraged teams to update the student’s IEP to include the new goals and objectives in either a separate IEP meeting or via a written addendum to the IEP as deemed appropriate by the student’s IEP team. We confirmed that 18 of 28 (64%) IEPs were updated with the new goals by the end of the academic year.

Feedback to CTs.

Immediately following the consultation, CTs, teachers, and caregivers completed various implementation outcome measures (adherence, acceptability, & appropriateness), and audio recordings of the consultation session were sent to the researchers along with the draft of the completed intervention plan. The researchers listened to the recording and reviewed feedback from the caregivers and teachers and the intervention plan before scoring the fidelity measures and providing performance feedback. Feedback was generally provided by the researchers to the consultants within one week of the consultation via Zoom. CTs were encouraged to update the intervention plans after receiving feedback from the researchers and share the updated plans with the teacher and caregiver for their final review before asking the teacher to implement the plans.

Follow-up Teacher Coaching.

After intervention plans were developed in the initial consultation, teachers received various types and dosages of follow-up from their CT to support the successful implementation of the intervention plans; Ogle et al. (2023) examines the impact of these different types and dosages of follow-up on teacher and student outcomes. All teachers who received follow-up from their CT submitted video recordings of their implementation of each intervention plan, and these videos were used by the researchers to rate teachers’ adherence in implementing the intervention plans.

Measures

Consultation is a complex intervention that has an indirect relationship with student outcomes mediated by the teacher’s adherence to the intervention plan (Wong et al., 2017; See Figure 1). Thus, various measures were included to assess the CTs’ implementation fidelity, including the CTs’ adherence and quality of delivery in implementing COMPASS. A new measure of intervention plan quality was developed as an additional implementation fidelity monitoring tool, as we observed that CTs had difficulty completing this task well. The teacher’s adherence to the intervention plans and the student outcome measure of goal attainment progress were also assessed.

Intervention Plan Quality Scale (IPQS).

The IPQS is a 16-item researcher-developed yes-no checklist (KR-20 = 0.72) that measures the quality of COMPASS intervention plans (See Table 2). It was completed by both the researchers and CTs after each initial consultation based on the intervention plans developed by the CT, teacher, and caregiver to provide focused performance feedback during supervision. As mentioned earlier, the IPQS (Table 2) assesses the presence of evidence-based principles and common elements of high-quality intervention plans, including SMART goals, clear and specific procedures, an effective teaching sequence, and plans for maintenance, self-direction, and generalization (Ruble et al., 2020; 2022). It was developed following DeVellis’s (2017) guidelines for scale development: identifying the construct we sought to measure, identifying items and a response format, followed by pilot testing and evaluation. Concurrent validity, or the expected agreement between measures taken at the same time (DeVellis, 2017), was established with other measures of COMPASS implementation fidelity, including the CT’s adherence in implementing the consultation and quality of delivery based on audio recordings of the consultation. The total percentage of elements included in the plans was calculated for each consultation.
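For readers unfamiliar with the KR-20 coefficient reported for the IPQS and the other checklists, it can be computed directly from the matrix of dichotomous (yes/no) item scores. The sketch below is a generic illustration; the ratings matrix is hypothetical, not the study’s data:

```python
# Kuder-Richardson Formula 20 (KR-20) for a dichotomous yes/no checklist.
def kr20(scores):
    """scores: list of rows (one per consultation), each a list of 0/1 item scores."""
    n_items = len(scores[0])
    totals = [sum(row) for row in scores]
    mean_total = sum(totals) / len(totals)
    # Population variance of the total scores.
    var_total = sum((t - mean_total) ** 2 for t in totals) / len(totals)
    # Sum of p*q across items (p = proportion of "yes" ratings on the item).
    pq = 0.0
    for j in range(n_items):
        p = sum(row[j] for row in scores) / len(scores)
        pq += p * (1 - p)
    return (n_items / (n_items - 1)) * (1 - pq / var_total)

# Hypothetical ratings: 5 consultations scored on a 6-item yes/no checklist.
ratings = [
    [1, 1, 0, 1, 1, 0],
    [1, 0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 0],
]
print(round(kr20(ratings), 2))  # 0.84 for this illustrative matrix
```

Values near the study’s 0.72 reflect acceptable internal consistency for a short checklist of this kind.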

Table 2.

Researcher Reported COMPASS Intervention Plan Quality Checklist (IPQS) Adherence by Order of Consultation

| Item | 1st (n = 9) | 2nd (n = 8) | 3rd (n = 6) | 4th (n = 4) |
| --- | --- | --- | --- | --- |
| SMART Goals | | | | |
| 1. The goals are SMART (Specific, Measurable, Attainable, Relevant, and Time-bound) | 56% | 50% | 67% | 100% |
| Clear and Specific Procedures | | | | |
| 2. The teaching plans for each of the target skills are clear and specific | 22% | 37% | 50% | 75% |
| 3. The teaching plans list who, where, and when they will be implemented | 89% | 100% | 83% | 100% |
| 4. The teaching plans list the resources and materials, including any modifications or accommodations, needed to implement the plans for each of the target skills | 100% | 100% | 100% | 100% |
| 5. The teaching plans describe the data collection system that will be used to monitor progress | 100% | 88% | 100% | 75% |
| 6. Pre-teaching activities (activities that address prerequisite knowledge or skills) are described in the teaching plans for each of the target skills | 44% | 37% | 50% | 75% |
| Effective Teaching Sequence | | | | |
| 7. Personal and environmental challenges and supports of the student are addressed for each skill | 56% | 50% | 83% | 75% |
| 8. At least one evidence-based practice for children with ASD is used for each skill | 89% | 88% | 83% | 50% |
| 9. There is a plan to engage the student in meaningful, goal-directed activities for each skill | 78% | 88% | 100% | 100% |
| 10. The teaching plans discuss how the teacher/peer/environment will obtain the student’s attention at the start and maintain it throughout the teaching sequences for each skill | 11% | 62% | 20% | 75% |
| 11. The teaching plans discuss how the teacher/peer will make an initial request or set up the environment in such a way that the child can understand the goals of the activities | 78% | 75% | 100% | 100% |
| 12. The teaching plans remind the teacher to provide sufficient time (3–5 sec) for the student to perform each of the target skills after the initial requests | 11% | 13% | 33% | 50% |
| 13. The teaching plans remind the teacher to provide sufficient time (3–5 sec) following each prompt to perform each of the target skills | 0% | 13% | 33% | 50% |
| 14. The teaching plans describe how the child will be reinforced for completing each skill | 44% | 25% | 100% | 75% |
| Plans for Maintenance, Self-Direction, and Generalization of Skills | | | | |
| 15. The teaching plans describe in appropriate detail how the teacher will scaffold the skills for each goal to support self-direction and generalization | 22% | 38% | 67% | 100% |
| 16. There is a plan for maintenance, self-direction, and generalization for each goal | 33% | 25% | 20% | 75% |
| Total Mean | 52% | 56% | 68% | 80% |

CT Consultation Adherence.

The CT’s adherence in implementing the consultation was assessed using a researcher-developed 25-item, yes-no checklist (KR-20 = 0.72; Authors, 2012). This checklist was rated by the researchers and included items such as a) teacher and caregivers attend the entire meeting, b) planning for the student’s program is based on input from all participants, and c) interactive problem solving is implemented by team members providing input and ideas. The total percentage of elements observed was calculated for each consultation.

CT Quality of Delivery.

CT quality of delivery, or the CT's process skills during the consultation, was rated by the researchers using a 27-item, researcher-developed yes/no checklist (KR-20 = 0.86; Authors, 2022). Process skills are general consultation group-management and communication skills, not specific to a single consultation method, that include strategies for sharing information effectively and efficiently among all participants and checking for understanding by summarizing and paraphrasing. The total percentage of elements observed was calculated for each consultation.

Teacher Adherence Implementing the Intervention Plans.

Teacher adherence was assessed by the researchers after each coaching or e-feedback session based on teacher-made videos of their implementation of the intervention plans (Ogle et al., 2023). The number of observable elements in each intervention plan was identified, and the percentage of elements observed was calculated. This percentage was converted to a 0–4 scale: 0 = 0% of elements observed, 1 = 1–25%, 2 = 26–50%, 3 = 51–75%, and 4 = 76–100%. Following the procedures used in previous studies of COMPASS (Ruble et al., 2010, 2013, 2018, 2022), adherence for the three goals was then averaged to obtain an overall mean score for each session. Two coders, a postdoctoral scholar and a doctoral student research assistant, independently rated 20% of sessions, resulting in a sample interrater agreement of .92.
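The percentage-to-scale conversion and per-session averaging described above can be sketched as follows (a minimal illustration of the scoring arithmetic; the function names are ours, not from the study):

```python
def adherence_level(pct_observed):
    """Map the percentage of intervention-plan elements observed to the 0-4 scale."""
    if pct_observed == 0:
        return 0
    if pct_observed <= 25:
        return 1
    if pct_observed <= 50:
        return 2
    if pct_observed <= 75:
        return 3
    return 4  # 76-100% of elements observed


def session_adherence(goal_percentages):
    """Average the 0-4 levels across a teacher's three goals for one session."""
    levels = [adherence_level(p) for p in goal_percentages]
    return sum(levels) / len(levels)
```

For example, element percentages of 100%, 40%, and 60% across the three goals map to levels 4, 2, and 3, giving a session mean of 3.0.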

Student Goal Attainment.

Student goal attainment (Ruble et al., 2021) was assessed from the teacher-made videos of intervention plan implementation using goal attainment scaling (GAS) on a 5-point scale: 0 = present level of performance, 1 = progress, 2 = annual goal, 3 = exceeding annual goal, and 4 = greatly exceeding annual goal. GAS has been used in multiple studies evaluating the impact of EBPs for children with autism (Odom et al., 2013; Ruble et al., 2021; Sam et al., 2022). This study provides further evidence for the use of GAS as a standardized measure of progress toward individualized goals (Odom et al., 2013). Goals were developed in collaboration with the researchers and tested for psychometric equivalency (i.e., level of difficulty, measurability, and equidistance of scaled objectives) following procedures developed by Ruble et al. (2013). Two coders independently coded 20% of sessions, resulting in a sample interrater agreement of .96 for student goal attainment. Ratings for each student's three goals were then averaged to obtain a final goal attainment score.
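The interrater agreement figures reported for teacher adherence (.92) and goal attainment (.96) are exact-agreement proportions between the two coders' ratings on the double-coded sessions. A minimal sketch of that calculation (function and variable names are illustrative, not from the study):

```python
def exact_agreement(coder_a, coder_b):
    """Proportion of double-coded sessions with identical ratings from both coders."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must rate the same set of sessions")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)
```

For instance, GAS ratings of [2, 1, 0, 3] and [2, 1, 1, 3] agree on three of four sessions, for an exact agreement of .75.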

Analysis

For the first research question, Kuder–Richardson 20 (KR-20) and exact agreement were used to estimate reliability, and Spearman's rank correlation was used to assess concurrent validity with the other fidelity measures (i.e., adherence and quality of delivery). For the second question, the percentage of elements observed in each consultation's intervention plans was calculated at the item level and averaged, and descriptive statistics were used to describe item-level scores. For the last question, Pearson correlations were used to determine the associations among the IPQS, overall mean teacher adherence in implementing the three intervention plans at the first and fourth coaching sessions, and mean final student goal attainment at the conclusion of the study.
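For dichotomous (yes/no) checklist items such as those on the IPQS, KR-20 is computed from the item difficulties and the variance of respondents' total scores. A minimal sketch using the standard KR-20 formula with the sample variance of totals (an assumption on our part; the study does not report its computational details):

```python
from statistics import variance


def kr20(responses):
    """KR-20 internal consistency for dichotomous items.

    responses: list of respondents, each a list of 0/1 item scores.
    """
    k = len(responses[0])          # number of items
    n = len(responses)             # number of respondents
    # Sum of p*q across items, where p is the proportion scoring 1 on the item
    pq_sum = 0.0
    for j in range(k):
        p = sum(r[j] for r in responses) / n
        pq_sum += p * (1 - p)
    totals = [sum(r) for r in responses]
    total_var = variance(totals)   # sample variance of total scores
    return (k / (k - 1)) * (1 - pq_sum / total_var)
```

With four respondents scored on three items as [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]], this formula yields approximately .94.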

Results

Reliability and Validity of the IPQS

The IPQS had acceptable internal consistency (KR-20 = .72) and interrater reliability (percent agreement = 94%). Spearman's rank correlation was computed to assess the relationship between the IPQS and two other consultation implementation fidelity measures. There was a significant positive correlation between the CT's adherence to the COMPASS consultation protocol and the IPQS, r(27) = .62, p = .001, but only a small, non-significant correlation between quality of delivery of the consultation and the IPQS, r(27) = .14, p = .520. That is, CTs who adhered more closely to the COMPASS initial consultation protocol also developed plans that better reflected the principles of a high-quality intervention plan as measured by the IPQS, whereas quality of delivery of COMPASS had minimal association with the IPQS.

Most Challenging Elements of the IPQS for CTs

The percentage of elements included in the intervention plans, as identified with the IPQS, averaged 52% (N = 9) at the first consultation, 56% (n = 8) at the second, 68% (n = 6) at the third, and 80% (n = 4) by the fourth, demonstrating consistent improvement with practice and feedback from the researchers. Looking more closely at the individual items (see Table 2), writing SMART goals was a challenging skill for CTs, with just 56% writing high-quality goals at the first consultation; this improved considerably by the fourth consultation, when 100% of the goals met the standard. Including clear and specific procedures, conversely, was a relative strength: at the first consultation, 89% of CTs included sufficient detail about who would implement the interventions, when, and where, along with the materials (100%) and data collection systems (100%) to be used. Pre-teaching activities (e.g., peer and paraprofessional training, development of materials, review of EBPs) were included in only 44% of intervention plans at the first consultation but in 75% at the fourth. The overall clarity and specificity of the intervention plans also increased, from just 22% at the first consultation to 75% by the fourth.

Including the elements of an effective teaching sequence, on the other hand, was the skill CTs had the most difficulty mastering (see Table 2). CTs consistently included at least one EBP in the plans (75–89%) and wrote plans that were goal-directed and focused on the target skill (78–100%). However, the student's personal and environmental challenges and supports were addressed in only 56% of plans at the first consultation, improving to 75% by the fourth. While plans consistently (78–100%) discussed how teachers or peers would make an initial request (e.g., verbal, picture, or gesture) or set up the environment (e.g., structured workstation, visual schedule) so that the child could understand the expected behavior, only 11% identified how the student's attention would be obtained and maintained throughout the teaching sequence, improving to 75% by the fourth consultation. Reminders to the teacher to provide sufficient time (e.g., 3–5 seconds) for the student to perform each target skill after the initial cue (11%) and after each prompt (0%) were rarely included at the first consultation and improved to only 50% in both cases by the fourth.

Plans describing how the student would be reinforced for completing the skill increased from 44% at the first consultation to 75% by the fourth. Just 22% of plans included specific instructions for scaffolding skills toward increased self-direction and generalization at the first consultation, but 100% included scaffolding at the fourth. The inclusion of plans for maintenance, self-direction, and generalization of the target skills increased from just 33% at the first consultation to 75% at the fourth.

Association of IPQS with Outcome Measures

Scores from the IPQS were positively correlated with teacher adherence to the intervention plans at the final coaching session (see Table 3). That is, the higher the quality of the intervention plans, the better teachers adhered to them. This association was observed only at the final coaching session, however, not at the first. The IPQS did not significantly correlate with student goal attainment outcomes at the end of the year.

Table 3.

Means, Standard Deviations, and Spearman’s Rank Correlations

Mean SD IPQS r (df)
Intervention Plan Quality Scale (IPQS) 61.60 19.24 ---
Consultant Adherence 84.65 15.43 .62** (27)
Consultant Quality of Delivery 81.29 16.88 .14 (27)
Teacher Adherence (First Coaching Session) 2.01 1.03 .10 (18)
Teacher Adherence (Last Coaching Session) 2.77 1.06 .61** (18)
Student Goal Attainment 1.54 .96 .15 (24)

Note: * Indicates p < .05. ** Indicates p < .01.

Discussion

In training novice consultants to implement COMPASS, we needed a way to systematically support standardized development of, and performance feedback on, the overall quality of the intervention plans, a skill that presented the most challenge compared with other areas (Ruble et al., 2023). The IPQS may also be helpful more broadly for evaluating the quality of intervention plans developed to teach a specific skill. Findings from our preliminary testing indicate that the IPQS is a sufficiently reliable measure of intervention plan quality in COMPASS. Both internal consistency and interrater reliability were acceptable, and the IPQS was correlated with CTs' consultation adherence, demonstrating the expected concurrent validity. The IPQS demonstrated only a small, non-significant positive correlation with CTs' quality of delivery, which emphasizes relationship skills such as active listening, asking open-ended questions, and summarizing information as the consultation moves from one topic to the next. These skills appear to be somewhat distinct from the skills needed to write high-quality intervention plans. While both skill sets comprise high-quality consultation, the former is more general and not specific to COMPASS, whereas the latter is specific to COMPASS and most critical for teacher feedback during coaching.

Usefulness of the IPQS in Improving the Quality of Plans Developed by COMPASS-trained CTs

In terms of its usefulness in providing performance feedback to CTs, the IPQS helped the researchers give more targeted feedback to improve the quality of the plans developed. CTs demonstrated consistent improvements in their ability to collaboratively develop high-quality intervention plans as they gained feedback and experience, increasing from an overall mean score of 52% to 80%. While these scores are preliminary and the small sample size limits generalizability, the IPQS demonstrates some utility in giving targeted feedback to CTs that is worthy of future research.

Looking closely at the elements of the intervention plan by rating, the procedural elements (e.g., describing who, when, and where the teaching sequence would be implemented and with what materials) were relatively easy for CTs to include. Surprisingly, writing SMART goals was quite challenging for CTs, which may be related to the difficulties they also demonstrated with their teaching sequences. While CTs showed they could select an appropriate EBP or EBPs to target the specific skill, they did not always individualize the practice to the unique needs of the student and environment. In another study with a different sample, Ruble et al. (2022) found that, on average, four different EBPs are used in COMPASS intervention plans and that planned adaptations of the EBPs to match the needs of students and their personalized context were necessary. These findings have important implications for the successful implementation of EBPs individualized to the student's needs, as they demonstrate that teachers need more support in learning how to implement EBPs with their specific students in mind. Future research should consider hybrid effectiveness-implementation designs to assess the clinical effectiveness of interventions alongside their implementation outcomes (Curran et al., 2012).

Furthermore, while most CTs designed a teaching sequence appropriate for demonstrating the target behavior of the goal, intervention plans often lacked many of the details necessary to implement a systematic teaching sequence. Many omitted plans for establishing and maintaining the student's attention, as well as appropriate response-prompting procedures, including which specific prompts would be used, the latency period between the cue and the controlling prompt, and how the student would be reinforced for completing the skill. While these results are strictly preliminary given the small sample size, more research is needed to explore strategies for training CTs to incorporate all elements of a systematically designed teaching sequence personalized to a particular student's personal and environmental challenges and supports.

Checklists such as the IPQS can also support the sustainment of complex consultation interventions such as COMPASS. The consistent and deep involvement of the teacher during COMPASS is likely to have a positive impact on the use of the IPQS as a tool for developing and implementing high-quality intervention plans for students with autism more broadly. Research suggests that checklists are most successful when there is strong leadership in an organization (Gillespie & Marshall, 2015), a critical yet often missing component in school-based EBP implementation (Melgarejo et al., 2020; Stadnick et al., 2019). However, the bottom-up approach of COMPASS may help compensate for absent or suboptimal leadership. Thus, the IPQS may facilitate the development, implementation, and sustainment of high-quality intervention plans among special education teachers supporting students with autism.

The Relationship between the IPQS and Teacher and Student Outcomes

The IPQS was associated with teacher adherence to the intervention plans at the final coaching session. This finding confirms the importance of consistent performance feedback in increasing teachers' adherence in implementing the intervention plans, as found in other studies (Ruble et al., 2013, 2022). Consistent with prior research, ongoing consultation leads to teacher behavior change (Beidas et al., 2012; Ruble et al., 2013), and, like CTs, teachers benefit from consistent performance feedback (Noell et al., 2005). Another analysis of COMPASS with a larger sample (Ogle et al., 2023) revealed that teachers need at least two coaching sessions to demonstrate adherence to the intervention plans, but four coaching sessions were necessary to observe correlations between teacher adherence and child goal attainment outcomes (Ruble et al., 2013). Rather than correlating directly with child goal attainment outcomes, the IPQS appears to partially mediate child goal attainment through teacher adherence in implementing the intervention plans. This interpretation is supported by a different study of COMPASS: using serial mediation, Wong et al. (2017) found that COMPASS consultation quality had an indirect effect on student goal attainment outcomes, mediated by teacher adherence to the intervention plans (i.e., teaching quality) and the student's engagement during instruction. Thus, these findings are consistent with theory (Dunst et al., 2013) and prior research indicating that the relationship between consultation and student goal attainment is indirect and mediated by the teacher's adherence to the intervention plans. More research is needed to confirm this relationship with a larger and more diverse sample of students, teachers, and CTs, as serial mediation could not be conducted in the current study due to the small sample size.

Limitations and Future Directions

We recognize that the conclusions from this study are greatly limited by the small sample size and limited racial diversity. Study recruitment of teachers, students, and caregivers was negatively affected by the COVID-19 pandemic, which limited the number of participants and thus the number of consultations the CTs could implement as originally planned. This study was part of a larger study that developed and validated a training package for novice COMPASS consultants and tested different follow-up conditions after the initial consultation in which intervention plans were developed (Ruble et al., 2022; Ogle et al., 2023). All CTs were intended to work with four sets of teachers, caregivers, and students; however, due to pandemic-related school closures, not all sessions could be implemented. Thus, we could not fully investigate how CTs' scores on the IPQS changed over time as they gained more practice and experience. Despite these challenges, the sample in this study is comparable to other studies of novel consultation and coaching interventions (Wainer et al., 2017). Furthermore, an overwhelming majority of study participants were White. While this was representative of the largely rural and suburban mid-Southern state in which the study was conducted, the lack of racial diversity further limits the generalizability of our findings. As such, we view the results of this study as preliminary, yet worthy of future research.

We also recognize that the COMPASS process requires a significant time commitment: roughly three hours for the initial consultation in which goals and intervention plans are developed, plus an additional four hours of coaching support in traditional COMPASS. In their meta-analysis of teacher coaching interventions, Kraft et al. (2018) reported that 48% of studies involved more than 10 hours of direct consultation or coaching, with just 27% reporting less than 10 hours. The seven total hours of consultation and coaching in COMPASS is thus among the shortest of effective consultation and coaching interventions. Nevertheless, more research is needed to determine whether the IPQS can be used independently of COMPASS, as the IPQS has broader applicability to intervention plans more generally.

Developing high-quality, individualized intervention plans based on pivotal social, communication, and learning skill goals collaboratively developed with caregivers is a challenging skill. Teachers often need support in developing these plans, and consultation interventions such as COMPASS, with a trained expert in EBPs and EBPP decision-making, are proven methods of providing that support; Love and Cai (2023) conducted an independent replication of COMPASS in Australian schools with similar outcomes. In this study, we tested a single measure, specific and concise enough to be meaningful yet broad enough to encompass the wide variety of EBPs and teaching strategies used with students with autism, to provide more focused performance feedback to our COMPASS consultant trainees. Although more research is needed to refine and validate this measure with a larger sample, the results provide directions for future research. We highlighted specific areas where consultants and autism trainers may struggle when developing high-quality intervention plans and high-quality IEPs. Writing SMART goals, individualizing EBPs to the specific strengths and challenges of the student, and writing teaching sequences that incorporate specific instructions for gaining and maintaining the student's attention along with response-prompting procedures were noteworthy areas for further research and training. The need for high-quality intervention planning personalized to the student and their context will not go away soon, especially as trainers continue to prioritize specific EBPs over EBPP-informed approaches.

Acknowledgements

This work was supported by grant Number R34MH073071 from the National Institute of Mental Health. The authors would like to acknowledge the students, caregivers, teachers, consultant trainees, and special education administrators who participated in the study, as well as the graduate student assistants who aided with various aspects of this study including Kahyah Pinkman, Jordan Findley, Rebecca Stayton, and Mary Hoffman.

Funding:

This work was supported by grant Number R34MH073071 from the National Institute of Mental Health. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute of Mental Health or the National Institutes of Health.

Appendix 1. COMPASS Intervention Plan Template

[Figure: COMPASS intervention plan template]

Footnotes

Declaration of Interests Statement

The authors report there are no competing interests to declare.

References

1. AFIRM Team. (2015). Reinforcement: EBP brief packet. Chapel Hill, NC: National Professional Development Center on Autism Spectrum Disorders, FPG Child Development Center, University of North Carolina. Retrieved from https://afirm.fpg.unc.edu/reinforcement
2. APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. The American Psychologist, 61, 271–285. 10.1037/0003-066x.61.4.271
3. Azad GF, Marcus SC, & Mandell DS (2021). Partners in School: Optimizing communication between parents and teachers of children with autism spectrum disorder. Journal of Educational and Psychological Consultation, 31(4), 438–462. 10.1080/10474412.2020.1830100
4. Beidas RS, Edmunds JM, Marcus SC, & Kendall PC (2012). Training and consultation to promote implementation of an empirically supported treatment: A randomized trial. Psychiatric Services, 63(7), 660–665. 10.1176/appi.ps.201100401
5. Brock ME, Dynia JM, Dueker SA, & Barczak MA (2020). Teacher-reported priorities and practices for students with autism: Characterizing the research-to-practice gap. Focus on Autism and Other Developmental Disabilities, 35(2), 67–78. 10.1177/1088357619881217
6. Craig F, Margari F, Legrottaglie AR, Palumbi R, De Giambattista C, & Margari L (2016). A review of executive function deficits in autism spectrum disorder and attention-deficit/hyperactivity disorder. Neuropsychiatric Disease and Treatment, 12, 1191. 10.2147/NDT.S104620
7. Curran GM, Bauer M, Mittman B, Pyne JM, & Stetler C (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50(3), 217–226. 10.1097/MLR.0b013e3182408812
8. Domitrovich CE, Bradshaw CP, Poduska JM, Hoagwood K, Buckley JA, Olin S, … & Ialongo NS (2008). Maximizing the implementation quality of evidence-based preventive interventions in schools: A conceptual framework. Advances in School Mental Health Promotion, 1(3), 6–28. 10.1080/1754730X.2008.9715730
9. Dunst CJ, Trivette CM, & Raab M (2013). An implementation science framework for conceptualizing and operationalizing fidelity in early childhood intervention studies. Journal of Early Intervention, 35(2), 85–101. 10.1177/1053815113502235
10. Dunst CJ (2015). Improving the design and implementation of in-service professional development in early childhood intervention. Infants & Young Children, 28(3), 210–219. 10.1097/IYC.0000000000000042
11. DeVellis RF (2017). Scale development: Theory and applications. Los Angeles, CA: Sage Publications.
12. Endrew F., a Minor, by and Through his Parents and Next Friends, Joseph F. et al., v. Douglas County School District RE-1, 580 U.S. ____ (2017). https://www.supremecourt.gov/opinions/16pdf/15-827_0pm1.pdf
13. Findley JA, Ruble LA, & McGrew JH (2022). Individualized education program quality for transition age students with autism. Research in Autism Spectrum Disorders, 91, 101900. 10.1016/j.rasd.2021.101900
14. Forman SG, Olin SS, Hoagwood KE, Crowe M, & Saka N (2009). Evidence-based interventions in schools: Developers' views of implementation barriers and facilitators. School Mental Health, 1, 26–36. 10.1007/s12310-008-9002-5
15. Gillespie BM, & Marshall A (2015). Implementation of safety checklists in surgery: A realist synthesis of evidence. Implementation Science, 10(1), 1–14. 10.1186/s13012-015-0319-9
16. Individuals with Disabilities Education Improvement Act, 20 U.S.C. § 1400 (2004).
17. Knight VF, Huber HB, Kuntz EM, Carter EW, & Juarez AP (2019). Instructional practices, priorities, and preparedness for educating students with autism and intellectual disability. Focus on Autism and Other Developmental Disabilities, 34(1), 3–14. 10.1177/1088357618755694
18. Koegel RL, & Koegel LK (2006). Pivotal response treatments for autism: Communication, social, & academic development. Paul H. Brookes Publishing.
19. Kraft MA, Blazar D, & Hogan D (2018). The effect of teacher coaching on instruction and achievement: A meta-analysis of the causal evidence. Review of Educational Research, 88(4), 547–588. 10.3102/0034654318759268
20. Kurt O, & Tekin-Iftar E (2008). A comparison of constant time delay and simultaneous prompting within embedded instruction on teaching leisure skills to children with autism. Topics in Early Childhood Special Education, 28(1), 53–64. 10.1177/0271121408316046
21. Love AM, & Cai RY (2023). Adapting COMPASS in Australia. In Ruble LA & McGrew JH (Eds.), COMPASS and innovative education for students with autism (pp. 69–88). Springer. 10.1007/978-3-031-31395-0
22. McLeskey J, Barringer M-D, Billingsley B, Brownell M, Jackson D, Kennedy M, … Ziegler D (2017). High-leverage practices in special education. Arlington, VA: Council for Exceptional Children & CEEDAR Center.
23. Melgarejo M, Lind T, Stadnick NA, Helm JL, & Locke J (2020). Strengthening capacity for implementation of evidence-based practices for autism in schools: The roles of implementation climate, school leadership, and fidelity. American Psychologist, 75(8), 1105–1115. 10.1037/amp0000649
24. Muharib R, Walker VL, Alresheed F, & Gerow S (2021). Effects of multiple schedules of reinforcement on appropriate communication and challenging behaviors: A meta-analysis. Journal of Autism and Developmental Disorders, 51, 613–631. 10.1007/s10803-020-04569-2
25. Murza KA, Schwartz JB, Hahs-Vaughn DL, & Nye C (2016). Joint attention interventions for children with autism spectrum disorder: A systematic review and meta-analysis. International Journal of Language & Communication Disorders, 51(3), 236–251. 10.1111/1460-6984.12212
26. National Research Council. (2001). Educating children with autism. National Academies Press.
27. Newman J, & Dusenbury L (2015). Social and emotional learning (SEL): A framework for academic, social, and emotional success. In Prevention science in school settings (pp. 287–306). New York, NY: Springer.
28. Noell GH, Witt JC, Slider NJ, Connell JE, Gatti SL, Williams KL, … & Duhon GJ (2005). Treatment implementation following behavioral consultation in schools: A comparison of three follow-up strategies. School Psychology Review, 34(1), 87–106. 10.1080/02796015.2005.12086277
29. Odom SL, Collet-Klingenberg L, Rogers SJ, & Hatton DD (2010). Evidence-based practices in interventions for children and youth with autism spectrum disorders. Preventing School Failure: Alternative Education for Children and Youth, 54(4), 275–282. 10.1080/10459881003785506
30. Odom SL, Cox AW, & Brock ME (2013). Implementation science, professional development, and autism spectrum disorders. Exceptional Children, 79(2), 233–251. 10.1177/0014402913079002081
31. Odom SL, Sam AM, & Tomaszewski B (2022). Factors associated with implementation of a school-based comprehensive program for students with autism. Autism, 26(3), 703–715. 10.1177/13623613211070340
32. Ogle LN, Ruble LA, Toland MD, & McGrew JH (2023). Type and dosage of performance feedback following COMPASS consultation on teacher and student outcomes. Remedial and Special Education, 45(1), 30–43. 10.1177/07419325231164755
33. Owens JS, Lyon AR, Brandt NE, Masia Warner C, Nadeem E, Spiel C, & Wagner M (2014). Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health, 6, 99–111. 10.1007/s12310-013-9115-3
34. Ruble LA, & Dalrymple NJ (1996). An alternative view of outcome in autism. Focus on Autism and Other Developmental Disabilities, 11(1), 3–14.
35. Ruble L, Dalrymple N, & McGrew JH (2012). The Collaborative Model for Promoting Competence and Success for Students with Autism. NY: Springer.
36. Ruble L, Dalrymple N, & McGrew J (2010). The effects of consultation on individualized education program outcomes for young children with autism: The collaborative model for promoting competence and success. Journal of Early Intervention, 32(4), 286–301. 10.1177/1053815110382973
37. Ruble LA, Love AMA, Wong VW, Grisham-Brown JL, & McGrew JH (2020). Implementation fidelity and common elements of high-quality teaching sequences for students with autism spectrum disorder in COMPASS. Research in Autism Spectrum Disorders, 71, 101493. 10.1016/j.rasd.2019.101493
38. Ruble L, McGrew J, Dale B, & Yee M (2022). Goal attainment scaling: An idiographic measure sensitive to parent and teacher report of IEP goal outcome assessment for students with ASD. Journal of Autism and Developmental Disorders, 52, 3344–3352. 10.1007/s10803-021-05213-3
39. Ruble L, McGrew J, Johnson L, & Pinkman K (2023). Matching autism interventions to goals with planned adaptations using COMPASS. Remedial and Special Education, 44(5), 365–380. 10.1177/07419325221134122
40. Ruble L, McGrew JH, & Toland MD (2012b). Goal attainment scaling as an outcome measure in randomized controlled trials of psychosocial interventions in autism. Journal of Autism and Developmental Disorders, 42, 1974–1983. 10.1007/s10803-012-1446-7
41. Ruble LA, McGrew JH, Toland MD, Dalrymple NJ, & Jung LA (2013a). A randomized controlled trial of COMPASS web-based and face-to-face teacher coaching in autism. Journal of Consulting and Clinical Psychology, 81(3), 566–572. 10.1037/a0032003
42. Ruble LA, McGrew JH, Toland M, Dalrymple N, Adams M, & Snell-Rood C (2018). Randomized control trial of COMPASS for improving transition outcomes of students with autism spectrum disorder. Journal of Autism and Developmental Disorders, 48, 3586–3595. 10.1007/s10803-018-3623-9
43. Ruble L, McGrew J, Rispoli K, & Pinkman K (2024). Parent and teacher alliance and autism spectrum disorder: Relationship matters. Manuscript under review.
44. Ruble L, Ogle L, & McGrew J (2022). Practice makes proficient: Evaluation of implementation fidelity following COMPASS consultation training. Psychology in the Schools, 60(3), 743–760. 10.1002/pits.22800
45. Ruble LA, Toland MD, Birdwhistell JL, McGrew JH, & Usher EL (2013b). Preliminary study of the autism self-efficacy scale for teachers (ASSET). Research in Autism Spectrum Disorders, 7(9), 1151–1159. 10.1016/j.rasd.2013.06.006
46. Rutter M, Bailey A, & Lord C (2003). The social communication questionnaire: Manual. Western Psychological Services.
47. Sam AM, Steinbrenner JR, Odom SL, Nowell SW, Waters V, Perkins Y, … & Rogers HJ (2022). Promoting paraeducators' use of evidence-based practices for students with autism. Exceptional Children, 89(3), 314–331. 10.1177/00144029221135572
48. Sparrow SS, Balla DA, & Cicchetti DV (2005). Vineland adaptive behavior scales: Survey forms manual. AGS Publishers.
49. Stadnick NA, Meza RD, Suhrheinrich J, Aarons GA, Brookman-Frazee L, Lyon AR, … & Locke J (2019). Leadership profiles associated with the implementation of behavioral health evidence-based practices for autism spectrum disorder in schools. Autism, 23(8), 1957–1968. 10.1177/1362361319834398
50. Stichter JP, Riley-Tillman TC, & Jimerson SR (2016). Assessing, understanding, and supporting students with autism at school: Contemporary science, practice, and policy. School Psychology Quarterly, 31(4), 443–449. 10.1037/spq0000184
51. Turnbull AP, Zuna N, Turnbull HR III, Poston D, & Summers JA (2007). Families as partners in educational decision making: Current implementation and future directions. In Odom SL, Horner RH, Snell ME, & Blacher J (Eds.), Handbook of Developmental Disabilities (pp. 570–590). New York: Guilford Press.
52. Walker G (2008). Constant and progressive time delay procedures for teaching children with autism: A literature review. Journal of Autism and Developmental Disorders, 38, 261–275. 10.1007/s10803-007-0390-4
53. Westling DL, Carter EW, Da Fonte A, & Kurth JA (2021). Teaching students to learn, generalize, and maintain skills. In Teaching students with severe disabilities. Hoboken, NJ: Pearson.
54. Wong VW, Ruble LA, Yu Y, & McGrew JH (2017). Too stressed to teach? Teaching quality, student engagement, and IEP outcomes. Exceptional Children, 83(4), 412–427. 10.1177/0014402917690729
  55. Yell ML, & Bateman D (2020). Defining educational benefit: An update on the US Supreme Court’s ruling in Endrew F. v. Douglas County School District (2017). Teaching Exceptional Children, 52(5), 283–290. 10.1177/0040059920914259 [DOI] [Google Scholar]
  56. Zemantic PK, Kurtz-Nelson EC, Barton H, Safer-Lichtenstein J, & McIntyre LL (2022). Family empowerment: Predicting service utilization for children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 52, 4986–4993. 10.1007/s10803-021-05329-6 [DOI] [PubMed] [Google Scholar]
