. Author manuscript; available in PMC: 2020 Oct 8.
Published in final edited form as: Autism. 2020 May 20;24(7):1713–1725. doi: 10.1177/1362361320919248

Does Implementing a New Intervention Disrupt Use of Existing Evidence-based Autism Interventions?

Melanie Pellecchia 1, Rinad S Beidas 1,2,3, Gwendolyn Lawson 1, Nathaniel J Williams 4, Max Seidman 1, John R Kimberly 5, Carolyn C Cannuscio 6, David S Mandell 1
PMCID: PMC7541454  NIHMSID: NIHMS1579886  PMID: 32431162

Abstract

This study examines how the introduction of TeachTown:Basics, a computer-assisted intervention for students with autism spectrum disorder (ASD), influenced teachers’ use of other evidence-based practices (EBPs). In a randomized controlled trial that enrolled 73 teachers nested within 58 schools, we used 3-level hierarchical linear models to evaluate changes in teachers’ use of EBPs across the school year for those who received TeachTown:Basics versus those assigned to control. Both groups received training and implementation support to deliver three well-established EBPs for ASD. Qualitative interviews were conducted with 25 teachers who used TeachTown:Basics to better understand their experience. Compared with teachers in the control group, teachers in the TeachTown:Basics group reported significantly less growth over the 9-month period in their use of EBPs that require one-to-one instruction (ps < .05), but no difference in their reported use of EBPs that do not involve one-to-one instruction (p = .637). Qualitative interviews indicated that teachers viewed TeachTown:Basics as an effective substitute for one-to-one instruction because it was less burdensome, despite the lack of support for TeachTown:Basics’ effectiveness. Before introducing new practices, education leaders should carefully consider both evidence of effectiveness and the potential impact on the use of other EBPs.

Trial Registration:

NCT02695693

Keywords: Computer-Assisted Intervention, School-Based Implementation, Autism Spectrum Disorder

Lay Abstract

Interventions for children with ASD are complex and often are not implemented successfully within schools. When new practices are introduced in schools, they often are layered on top of existing practices, with little attention paid to how the new practices affect the use of existing ones. This study evaluated how introducing a computer-assisted intervention (CAI), called TeachTown:Basics, affected the use of other evidence-based practices in autism support classrooms. We compared how often teachers reported using a set of evidence-based practices in classrooms that either had access to TeachTown:Basics or did not have the program. We found that teachers who had access to the computer-assisted intervention reported using the other evidence-based practices less often as the school year progressed. Teachers also reported that they liked the computer-assisted intervention, found it easy to use, and felt that it helped overcome challenges to implementing other evidence-based practices. This is important because the computer-assisted intervention did not improve child outcomes in a previous study, and it indicates that teachers may use interventions that are appealing and easier to implement even when there is no evidence to support their effectiveness. These findings support the idea that an intervention’s complexity and how well it fits within the classroom affect how teachers use it, and they highlight the need to develop school-based interventions that both appeal to practitioners and improve child outcomes.

BACKGROUND

The implementation science literature is filled with compilations of research focused on developing and evaluating strategies and mechanisms to improve the implementation of evidence-based practices (EBPs) in community settings (Bunger et al., 2017; Powell, Proctor, & Glass, 2013; Powell et al., 2015; Weiner et al., 2017). This work is critically important to ensuring that EBPs are implemented in usual care. But what happens when innovations are introduced in service systems that already have evidence-based practices in place? An innovation is an idea, practice, or object that is perceived as new by an individual or other unit of adoption (Rogers, 2003). Often, innovations are layered on top of existing EBPs. Yet little attention has been paid to the effects of introducing new practices on top of existing ones. Research-based answers to this question can help with planning for more effective and sustained implementation. In this paper, we took advantage of a randomized trial of a new computer-assisted intervention for students with autism spectrum disorder (ASD) to examine the consequences of implementing new practices when other EBPs are already in place.

Many efficacious interventions for children with ASD are expensive and complex, consisting of multiple components that require extensive training to implement with fidelity (Odom, Boyd, Hall, & Hume, 2010; Pellecchia et al., 2015). Schools, where most children with ASD receive the bulk of their interventions, often do not have the resources and organizational structures to implement many ASD interventions the way they were designed (Dingfelder & Mandell, 2011). Increasingly, school districts are trying new methods to increase the extent to which students with ASD receive evidence-based interventions. In response to the intensive nature of these interventions, many schools have turned to computer-assisted interventions (CAI), instructional material presented by means of a computer instead of by a teacher. Recent technological advances and a dramatic drop in cost have created enthusiasm for the potential of CAI to improve access to evidence-based interventions for students with ASD (Ploog, 2010). Many software packages have been developed in recent years to address the core skill domains targeted in EBPs for ASD, such as academic skills (Khowaja & Salim, 2013; Knight, McKissick, & Saunders, 2013), communication and language development (Ploog, Scharf, Nelson, & Brooks, 2013), daily living skills (Ramdoss et al., 2012; Self, Scudder, Weheba, & Crumrine, 2007), and social skills (Ramdoss et al., 2012; Wainer & Ingersoll, 2011). Most CAIs incorporate some version of the instructional strategies found in EBPs for students with ASD, such as the use of visual cues, systematic prompts and prompt fading, reinforcement, repeated trials, and video modeling (Ramdoss et al., 2012; Wainer & Ingersoll, 2011). CAI is particularly appealing to under-resourced schools because it could provide many elements of autism treatment that require additional staff to implement, such as providing immediate reinforcement, or collecting frequent data on students’ progress (Ramdoss et al., 2011). 
CAI also may allow more students to receive individualized instruction while freeing up teachers to provide concurrent group instruction. Despite these potential advantages, some autism treatment researchers have voiced concerns with the increased use of CAI as an instructional strategy for children with ASD, including fewer opportunities for social interaction and verbal communication (Bernard-Opitz, Ross, & Tuttas, 1990), and increases in challenging behavior related to perseverations on computer use common in children with ASD (Ramdoss et al., 2012; Ramdoss et al., 2011). Further, the efficacy of CAI in improving child outcomes is questionable at best. While some small-scale studies have suggested promise for these interventions (Ramdoss et al., 2012; Ramdoss et al., 2011; Wainer & Ingersoll, 2011), our group recently conducted a large randomized field trial of a CAI, TeachTown:Basics, and found: 1) no overall differences in children’s cognitive or language outcomes between the treatment and control groups after an academic year; and 2) that students who spent more time using TeachTown:Basics over the course of the school year had worse language outcomes than a control group (Pellecchia et al., 2019). No studies have examined how introducing CAI affects the use of existing evidence-based practices. Introducing CAI may free up teachers to provide more one-to-one intervention when some students are engaged in computer work. On the other hand, teachers may reduce their use of existing EBPs because they think that CAI is a reasonable and easier substitute, which is concerning, given the lack of evidence for the efficacy of CAI on improving child outcomes.

It is important to consider how introducing innovations like CAI will affect the use of EBPs already in place. Layering additional innovations on top of existing EBPs, without systematic sustainment efforts, could have a range of undesirable consequences, including implementation fatigue and discontinued use of existing EBPs, or poor implementation of innovations. These undesired consequences may be especially likely in schools for children with ASD given that many autism interventions often are not implemented in schools the way they were designed to be implemented (Kasari & Smith, 2013).

For the past ten years, teachers in the School District of Philadelphia have received training and coaching in the use of several EBPs for children with ASD. These practices, described below, are considered best practice for school-aged children with ASD (National Research Council, 2001; Volkmar & Weisner, 2009). As in many schools in large urban districts, Philadelphia schools often lack the infrastructure to support rigorous EBP implementation, which has led to variable implementation fidelity across teachers (Mandell et al., 2013; Pellecchia et al., 2015). Recognizing this problem, and in response to the need to better support students with ASD, the district implemented TeachTown:Basics (Whalen et al., 2010), a CAI, in its kindergarten-through-second grade autism support classrooms, as a supplement to existing EBPs. The district hoped to alleviate teacher burden associated with implementing EBPs in their classrooms by providing CAI as a complement to established EBPs. We partnered with the district to conduct a randomized trial of the effectiveness and implementation of TeachTown:Basics. In this paper, we examine data from our trial to assess whether introducing TeachTown:Basics affected the use of existing EBPs in these same classrooms, capitalizing on a rare opportunity to examine how implementing a new intervention may affect the use of existing practices. Because TeachTown:Basics is designed for students to use on their own, and because it is derived from evidence-based one-to-one interventions for children with ASD, we hypothesized that introducing TeachTown:Basics would (a) reduce teachers’ use of one-to-one EBPs for ASD, and (b) not affect teachers’ use of other EBPs.

METHODS

Setting

The School District of Philadelphia is the eighth largest district in the country. Most (69%) of its students are ethnic minorities, and 75% live below the Federal poverty line. In the year this study was conducted, the district operated 91 classrooms for students with an educational classification of autism in kindergarten-through-second grades. These classrooms enrolled an average of 9 students, with a lead teacher, assistant teacher, and additional support staff as needed.

Recruitment and Randomization

Details regarding the randomized effectiveness trial can be found elsewhere (Pellecchia et al., 2019). The CONSORT diagram (Figure 1) describes the flow of recruitment and retention. Of the 91 eligible kindergarten-through-second-grade autism support classrooms, 73 (80%) were enrolled in the study. Inclusion criteria were that the teacher taught in a kindergarten-through-second grade autism support classroom in Philadelphia, the school principal agreed to allow the classroom teacher to participate, and the teacher consented. Teachers were randomly assigned prior to the start of the school year, using a random number generator in SAS, to one of two conditions: TeachTown:Basics and Control.

Figure 1.

Trial Profile

Overview of TeachTown:Basics Intervention

TeachTown:Basics is an intervention for children with ASD that includes two components: 1) computer-based lessons targeted toward each child’s individualized goals; and 2) off-computer interpersonal activities that are delivered by the teachers. The program includes processes for automatic data collection and reporting, skill acquisition tracking, and a note system for communication with the child’s team. The curriculum is designed for children with a nonverbal cognitive ability equivalent to that of children aged 2–7 years.

Computer-assisted instruction.

TeachTown:Basics is a CAI that delivers instruction directly to the child, not a technology designed to aid the teacher in delivering instruction to the child. The computer lessons incorporate the principles of applied behavior analysis, using a discrete trial format, in which the student is provided with a specific instruction and selects the correct response. Correct responses are immediately reinforced using animated rewards and verbal praise. The lessons use specific prompting procedures, such as fading of visual cues and highlighting the correct answers, to promote success. The curriculum progresses through five levels of difficulty, and students move through the curriculum at their own pace. The curriculum content addresses six domains: 1) adaptive skills; 2) cognitive skills; 3) language arts; 4) language development; 5) mathematics; and 6) social emotional skills. Progress monitoring is built into the software. Students complete pre-tests and must demonstrate mastery before they are automatically advanced to the next lesson. Students are expected to spend 20 minutes per day using the software with classroom staff or on their own. The software is delivered via a desktop computer or electronic tablet. Each classroom had at least two computers or tablets for delivery of the CAI.

Off-line activities: interpersonal lessons.

TeachTown:Basics also includes off-line lessons that are delivered by the teachers via direct instruction. These lessons address the same areas targeted in the CAI activities, and are specifically designed to promote expressive language and interaction skills. Lesson plans and cues for instructional delivery are included with the TeachTown:Basics program. The program recommends that teachers use the off-line components with students for 20–30 minutes each day as a supplement to the CAI.

Teacher training in TeachTown:Basics.

Teachers received one full day of didactic training in the TeachTown:Basics program at the start of the school year. The didactic training included experiential components and practice using the software. Teachers also had access to online training webinars in TeachTown:Basics and in-classroom consultation two or three times across the school year. The program developers provided this training and consultation.

Overview of Existing Evidence-based Practices

Treatments based in applied behavior analysis have the most evidence to support their effectiveness for students with ASD (National Autism Center, 2009). All teachers in the district’s kindergarten-through-second grade autism support classrooms receive training in the use of a package of EBPs based on the principles of applied behavior analysis, including discrete trial training, pivotal response training, classroom behavior management strategies, and visual supports. This training occurs during the 4–6 professional development days that precede and occur during the academic year. New teachers receive additional in-classroom coaching once a month. Most participating teachers (n = 64 out of 73) reported receiving this training and coaching prior to the study. During the course of this study, coaching was augmented so that all participating teachers (not just new teachers) received monthly coaching.

Discrete Trial Training.

Discrete trial training (DTT: Lovaas, 1987; Smith, 2001) is implemented using an intensive one-to-one teaching session in a setting free from distractions. Discrete trial training generally involves the repeated practice of the same response for several successive teaching episodes and the use of reinforcers that are functionally unrelated to the response (e.g., providing access to a treat for correctly identifying a car). Instruction within DTT involves breaking down complex skills into small component parts, and teaching each component part individually. For example, to teach a student with autism to play appropriately with toys, an instructor may first teach the student to imitate actions with objects, such as stacking a block when provided with an imitative cue.

Pivotal Response Training.

Pivotal response training (PRT: Koegel, Koegel, Harrower, & Carter, 1999; Koegel, 1988) incorporates one-to-one teaching within sessions that are loosely structured. PRT sessions rely on capturing and contriving a child’s motivation to guide instruction. Teaching sessions are initiated and paced by the child, take place in a variety of naturalistic and play-based settings, and use items and activities that are highly preferred by the child. During PRT the child chooses the instructional object or activity, and the reinforcer is related to the response (e.g., providing access to a toy car for correctly identifying a car).

Visual Schedules.

Visual schedules are used to increase independence and decrease frustration during transitions (Dettmer, Simpson, Myles, & Ganz, 2000). Visual schedules are used throughout the day. A daily visual schedule for each child is posted in prominent locations and reviewed daily. Students are cued to “check schedules” during transitions and are taught to transition to the next scheduled activity independently.

Teacher training and coaching in EBPs.

Master’s level consultants with expertise in applied behavior analysis provided teachers and classroom staff intensive training and coaching in the use of these EBPs throughout the school year. Training included workshops at the start of the school year, hands-on work in the classrooms with teachers to set up classrooms and plan student lessons at the start of the school year, quarterly, half-day workshops during the school year, and ongoing in-classroom coaching for two to three hours per month during the year.

Measures

Demographic Survey.

Teachers completed a form at the beginning of the school year that included questions about demographics, education, experience teaching special education, and other specialized training.

Use of TeachTown:Basics.

Teachers’ use of TeachTown:Basics with their students was measured using electronic logs generated by the software. The TeachTown company provided anonymized data on the frequency with which teachers logged each student into the software. These data provided a measure of the number of minutes each student used the software each month throughout the school year and, aggregated to the classroom level, allowed us to measure penetration and use for all students in the classroom.

Use of EBPs.

Teachers’ use of discrete trial training, pivotal response training, and visual schedules was measured monthly via self-report. Teachers were interviewed monthly throughout the school year by a consultant familiar with their use of each EBP, and asked to report the frequency of use of each EBP for each student in their classroom (e.g., “How many times did you implement DTT with Johnny last week?”). Interviews took place during teachers’ regularly and individually scheduled coaching visits, which generally occurred monthly and at the same point in each month. Teachers reported on their use of each EBP with each student in their classroom for one week of each month; that report was recorded as an estimate of their use for the month. Use of each EBP was coded on a Likert scale ranging from 0 to 4 with the following criteria: 0 (less than one time per week), 1 (one time per week), 2 (two to four times per week), 3 (one time per day), and 4 (two times per day). Teachers’ overall use of each EBP was calculated as the mean of use across all students in the classroom for the month.
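The coding scheme just described can be sketched in code. This is a minimal illustration only: the numeric thresholds for the daily categories (which assume a 5-day school week) and the function names are our assumptions, not part of the study protocol.

```python
def code_ebp_use(times_per_week: float) -> int:
    """Map a reported weekly frequency of one EBP for one student to the
    0-4 scale (thresholds for daily use assume a 5-day school week)."""
    if times_per_week < 1:
        return 0  # less than one time per week
    elif times_per_week < 2:
        return 1  # one time per week
    elif times_per_week < 5:
        return 2  # two to four times per week
    elif times_per_week < 10:
        return 3  # one time per day
    else:
        return 4  # two times per day


def classroom_mean_use(weekly_counts: list) -> float:
    """A teacher's overall monthly use of an EBP: the mean of coded use
    across all students in the classroom."""
    return sum(code_ebp_use(c) for c in weekly_counts) / len(weekly_counts)


# e.g., four students who received DTT 0, 1, 3, and 5 times in the reported week
print(classroom_mean_use([0, 1, 3, 5]))  # → 1.5
```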

Quantitative Analyses

For the purpose of this study, we conceptualized undesirable consequences as a decrease in the use of established EBPs over time. Therefore, the outcome of interest was the change in the use of discrete trial training (DTT), pivotal response training (PRT), and visual schedules (VS) over the course of the academic year. We hypothesized that, compared with teachers in the control condition, teachers in the TeachTown:Basics condition would decrease their use of EBPs that involve one-to-one instruction (i.e., DTT and PRT) over time. To test this hypothesis, we used 3-level hierarchical linear models (Raudenbush & Bryk, 2002), with repeated measures at level 1 nested within teachers at level 2, nested within schools at level 3. Teachers’ individual growth trajectories in EBP use comprised the level 1 model, variation in growth parameters between teachers within a school was captured in the level 2 model, and variation among schools in the growth parameters was captured in the level 3 model. We used HLM software for Windows (Version 7.03) to estimate the models via full maximum likelihood estimation.

Preliminary analyses indicated there was significant variance in teachers’ use of VS (p = .002; ICC = .24) and PRT (p < .001; ICC = .19) at the school level, but not DTT (p > .50; ICC = .001). Given the significant variance at level 3 for two of the practices, we used 3-level models to account for nesting within schools. Preliminary analyses confirmed that change in teachers’ EBP use over time was best modeled as a linear growth trajectory rather than a quadratic trajectory for all outcomes (all ps ≥ .37). Treatment condition (TeachTown:Basics vs. control) was included in the models at level 2 because randomization occurred at the teacher level. Because the treatment groups did not differ significantly on any demographic variables of interest, we did not include covariates in the models. To maximize interpretability, time point and treatment condition were uncentered. The models estimated differences between the TeachTown:Basics and control groups in baseline EBP use and rates of change in EBP use per month. To examine the size of the effect, we calculated the percent of variance in teacher-level slopes explained by treatment group membership, computed as: (τ slope variance, null model − τ slope variance, conditional model) / τ slope variance, null model.
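The effect-size formula above amounts to a one-line computation; a minimal sketch, using made-up variance estimates rather than the study’s actual values:

```python
def pseudo_r_squared(tau_null: float, tau_conditional: float) -> float:
    """Proportion of teacher-level slope variance explained by treatment
    group: (tau_null - tau_conditional) / tau_null."""
    return (tau_null - tau_conditional) / tau_null


# e.g., if adding treatment condition reduced the slope variance from .050
# to .044, group membership would explain 12% of the slope variance
print(round(pseudo_r_squared(0.050, 0.044), 3))  # → 0.12
```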

During the course of the year, two teachers in the control group and one teacher in the intervention group were replaced. All three new teachers consented to participate and remained in the condition to which the original teacher was assigned. We conducted a sensitivity analysis removing those three classrooms, and found no meaningful difference in the magnitude or statistical significance of the observed effects.

Mixed-Methods Approach

Semi-structured interviews were conducted with 25 teachers from the TeachTown:Basics group at the end of the school year to elaborate upon the quantitative findings and to understand teachers’ perspectives toward the introduction of TeachTown:Basics in their classrooms. Teachers were purposively sampled by selecting teachers with high and low fidelity for each EBP in order to learn about facilitators and barriers to the use of EBPs in autism support classrooms. Interviews were conducted by research assistants under the direction of experts in qualitative research. During the interviews, we queried teachers’ rationale for using TeachTown:Basics; how they viewed the program in relation to the other EBPs (e.g., complement, replacement); and the utility and effectiveness of each. Interviews were digitally recorded, professionally transcribed, and loaded into NVivo 10.0 software for data management and analysis.

Qualitative Analysis

Interviews were analyzed using an integrated approach that combined a priori questions with concepts derived inductively through a close reading of the transcripts (Bradley, Curry, & Devers, 2007). Transcripts were analyzed to identify themes related to teachers’ use of TeachTown:Basics and existing EBPs, as well as facilitators and barriers to using each approach. De-identified transcripts were entered into NVivo 10.0 software for analysis. Members of the research team developed a qualitative codebook through a collaborative and iterative process. First, the team read through several of the interviews and noted recurrent concepts. Next, they discussed commonalities among their observations and used the overlapping insights to guide the initial framework for the codebook. The codebook included operational definitions for each code and sample quotes. Coders independently summarized key findings for each of the selected codes, including quotes that corroborated or diverged from the key findings. The resulting summary memos were used to guide team discussions through which cross-cutting themes were identified; rare disagreements between coders were resolved through discussion and consensus.

RESULTS

Sample Characteristics

Table 1 shows the demographic characteristics of the 73 teachers in the sample. Teachers were predominantly female (96%) and white (81%); the most common level of educational attainment was a graduate or professional degree (85%). Teachers in the TeachTown:Basics group did not differ significantly from those in the control group on any measured demographic variable.

Table 1.

Demographic Characteristics of Teachers in the Sample

Variable | Total Sample (N = 73) | TeachTown Group (N = 36) | Control Group (N = 37) | Significance test
Age (in years), M (SD) | 37.33 (10.76) | 36.97 (8.31) | 37.68 (12.82) | t = .269, p = .789
Years experience teaching special education, M (SD) | 8.30 (6.89) | 8.20 (5.02) | 8.41 (8.41) | t = .119, p = .906
Gender, N (%) | | | |
 Female | 70 (95.9) | 36 (100) | 34 (97.3) | t = −1.44, p = .16
 Male | 2 (2.7) | 0 (0) | 2 (5.4) |
 Not provided | 1 (1.4) | 0 (0) | 1 (2.7) |
Race, N (%) | | | |
 White | 59 (80.8) | 29 (80.6) | 30 (81.1) | t = .302, p = .763
 Black | 11 (15.1) | 6 (16.7) | 5 (13.5) | t = −.323, p = .747
 Asian | 1 (1.4) | 1 (2.8) | 0 (0) | t = −1.0, p = .321
 American Indian/Alaskan Native | 1 (1.4) | 1 (2.8) | 0 (0) | t = −1.0, p = .324
 Not provided | 1 (1.4) | 1 (2.8) | 1 (2.6) |
Educational attainment, N (%) | | | |
 College | 9 (12.3) | 4 (11.1) | 5 (13.5) | t = .352, p = .726
 Graduate/Professional | 62 (84.9) | 32 (88.9) | 30 (81.1) | t = −.674, p = .502
 Other | 1 (1.4) | 0 (0) | 1 (2.7) | t = 1.0, p = .321
 Not provided | 1 (1.4) | 0 (0) | 1 (2.7) |

Use of EBP at Baseline and Changes in EBP Use over Time

Table 2 shows the results of the HLM analyses. The treatment and control groups differed at baseline and in their rates of change over time in their reported use of Discrete Trial Training and Pivotal Response Training, but not in their use of Visual Schedules.

Table 2.

Results of Three-Level Hierarchical Linear Model Analyses Examining Teacher Use of EBPs Across Time

Parameter | Discrete Trial Training, b (SE) | Pivotal Response Training, b (SE) | Visual Schedules, b (SE)
Fixed effects | | |
 Intercept | .369** (.116) | .174 (.097) | 2.387*** (.257)
 Treatment Condition | .351* (.168) | .271* (.131) | −.700 (.358)
 Time | .071** (.024) | .082*** (.021) | −.010 (.041)
 Treatment Condition x Time | −.073* (.034) | −.082** (.031) | −.026 (.055)
Pseudo-R-Squared | .112 | .144 | .084

Note. Pseudo-R-Squared is the reduction in teacher-level slope variance, calculated as (τ slope variance, null model − τ slope variance, conditional model) / τ slope variance, null model.

* p < .05; ** p < .01; *** p < .001

Discrete Trial Training.

At baseline, teachers in the TeachTown:Basics group reported significantly higher use of Discrete Trial Training than did teachers in the control group (b = .351, p = .04), representing a standardized mean difference (Cohen’s d) of .45. Compared with teachers in the control group, teachers in the TeachTown:Basics group showed significantly less growth in DTT use (Treatment Condition x Time; b = −.073, p = .034), such that the average rate of growth in DTT use for TeachTown:Basics group teachers was −.002 points per month (see Figure 2). Across time, teachers in the control group reported significant increases in their use of DTT, at the average rate of .07 points per time point (b = .071, p = .004). In the control group, teachers’ use of DTT increased by 154% from baseline to the 8-month follow-up. In contrast, in the TeachTown:Basics group, teachers’ reported use of DTT decreased by 2% from baseline to the 8-month follow-up. Treatment condition explained 11.2% of the variance in teachers’ rates of change in DTT use during the study period. Model-estimated growth trajectories in DTT use for the control group and TeachTown:Basics group are displayed in Figure 2.
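As an arithmetic check, the 154% and 2% figures follow directly from the fixed effects reported in Table 2; a minimal sketch using the DTT coefficients (the 8-month horizon is taken from the text):

```python
# Fixed effects from Table 2, DTT column
intercept, treatment, time, interaction = 0.369, 0.351, 0.071, -0.073
months = 8  # baseline to 8-month follow-up

control_baseline = intercept                 # control group starts at .369
control_change = time * months               # grows .071 points per month
tt_baseline = intercept + treatment          # TeachTown group starts at .720
tt_change = (time + interaction) * months    # slope = .071 - .073 = -.002/month

print(round(100 * control_change / control_baseline))  # → 154 (% increase)
print(round(100 * tt_change / tt_baseline))            # → -2 (% decrease)
```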

Figure 2.

Changes in use of discrete trial training across the school year for both groups.

Note. n = 36 teachers in the TeachTown:Basics group; n = 37 teachers in the Control group

Pivotal Response Training.

Teachers in the TeachTown:Basics group reported greater PRT use at baseline (b = .27, p = .04), representing a standardized mean difference (Cohen’s d) of .45. Teachers in the TeachTown:Basics group showed significantly less growth in PRT use (Treatment Condition x Time; b = −.08, p = .009), such that the average rate of growth for TeachTown:Basics group teachers was −.001 points per time point. Teachers in the control group reported significant increases in their use of PRT across time, at the average rate of .08 points per time point (b = .08, p < .001). Based on these results, in the TeachTown:Basics group, teachers’ reported use of PRT decreased by 1.8% from baseline to the 8-month follow-up whereas in the control group, teachers’ use of PRT increased by 377.0% from baseline to the 8-month follow-up. Treatment condition explained 14.4% of the variance in teachers’ rates of change in PRT use over time. Estimated growth trajectories in PRT use for the control group and TeachTown:Basics group are displayed in Figure 3.

Figure 3.

Changes in reported use of pivotal response training across the school year for both groups.

Note. n = 36 teachers in the TeachTown:Basics group; n = 37 teachers in the Control group.

Visual Schedules.

Teachers in the TeachTown:Basics and control groups did not differ in their reported use of Visual Schedules at baseline (b = −.70, p = .05), although the control group teachers showed marginally greater use (Cohen’s d = −.25). There was no significant change in the use of VS across time among control group teachers (b = −.01, p = .76), and there were no differences in the rate of change between TeachTown:Basics teachers versus control (Treatment Condition x Time; b = −.02, p = .63). Estimated growth trajectories in VS use for the Control Group and TeachTown:Basics group are displayed in Figure 4.

Figure 4.

Changes in reported use of visual schedules across the school year for both groups.

Note. n = 36 teachers in the TeachTown:Basics group; n = 37 teachers in the Control group.

Qualitative Findings

Several themes emerged regarding teachers’ perspectives toward the use of TeachTown:Basics. An overarching finding is that teachers viewed TeachTown:Basics as a replacement for, rather than a supplement to, some existing EBPs. Table 3 includes illustrative quotes related to each theme.

Table 3.

Selected Quotes from Semi-Structured Interviews with Teachers.

Theme Quote Implications for Future Research or Practice
TeachTown is more effective than one-to-one instruction. They get DTT more effectively when they use it through the TeachTown program because we try to do it with [name of student] but I can’t think of DTT with him that I’ve done effectively.

I would love to see TeachTown replace DTT. Because when the kids see things on the computer screen, they remember. When they hear things on videos, like the counting –They remember the stuff. And if they just do it every single day, they’re learning. And remembering.
Teachers’ self-efficacy for implementing EBPs involving one-to-one instruction is low. Methods to improve teachers’ self-efficacy for these practices, paired with evaluations of how these increases in self-efficacy relate to changes in implementation, are needed.

Although teachers believed TeachTown was more effective than teacher-delivered instruction, our effectiveness data did not support this belief. The effectiveness of computer-assisted interventions should be further studied and potential refinements should be considered.
TeachTown is easier to implement than one-to-one instruction. It’s made data collection much easier because I can just print out all the reports.

I like the data collection component of it. It, you know, collects data for me. I can use that data for progress monitoring. That’s awesome.
Teachers were more likely to use tools that were easier to implement. Methods to simplify EBPs for children with ASD are needed in order to improve their implementation in schools.
TeachTown is appealing to students and parents. TeachTown has been a very good outlet for them to go and they enjoy it. So they’re not like getting off there and saying “I’m done” or, “I don’t wanna do this anymore.”

I printed out some of the graphs from TeachTown just to show the parents. It’s neat to see it instead of me just telling them or showing them. Just graphs always make it neater to see. So they liked it. They were excited about it.
Teachers value their students’ and parents’ views regarding the acceptability of classroom interventions. EBPs that are perceived as enjoyable and appealing are viewed more favorably by teachers. Methods to make EBPs for ASD more appealing and enjoyable for consumers are needed and may increase implementation.
TeachTown helps teachers overcome staffing problems. We were short staffed a lot in the last couple weeks, so we were using the one DTT station. We were putting them on TeachTown and that was working out pretty well too. Then they weren’t getting DTT with a person that day. They’re just getting it on TeachTown which is fine. I mean, it’s more or less the same concept.

It was definitely very helpful when we had staff out because we didn’t have to interrupt another part of the rotation. So, you know, if they were in the blue area and that staff member was out, it was definitely easy that they still were doing something academic in their area and they didn’t have to interrupt seat work.
Staffing shortages were described as prevalent in under-resourced school settings, and as a barrier to implementing EBPs. Interventions that require high rates of one-to-one instruction are not feasible in these settings. Effective interventions that are designed for implementation within the staffing and resource constraints pervasive in under-resourced schools are needed.
TeachTown helps teachers manage challenging student behavior. I think it’s, at least, given me a little bit of peace of mind that when we’re dealing with one of the kids’ behaviors, at least the kids can interact in something educational and something that they enjoy.

Sometimes that child that might be exhibiting those negative behaviors is on TeachTown with the headphones and then I can get some other work done with some of the other kids. So I can use that as a way to just get him or her to do their work on the computer, give them a chance to calm down and focus, and then work with some of the other kids.
Teachers of students with ASD face many challenges related to their students’ clinical presentations, which impact their ability to implement EBPs. Systematic efforts to design and evaluate implementation strategies to address the broad array of challenges faced by teachers in these settings are needed.

Summary of themes:

TeachTown:Basics was viewed as more effective than one-to-one interpersonal instruction.

Teachers thought that the TeachTown:Basics program was more effective than classroom staff at delivering discrete trial training. Teachers described the EBPs that require one-to-one instruction as complex, and reported that they were not confident in their ability to implement these EBPs accurately with all of their students.

TeachTown:Basics is easier to implement than one-to-one interpersonal instruction.

Teachers reported that aspects of TeachTown:Basics, such as automatic data collection, made it easier to incorporate individualized programming into their daily routines. Overwhelmingly, teachers described ongoing data collection as burdensome and as a task that they often neglected to complete because they could not find a way to integrate data collection into the workflow of their classroom. They appreciated that TeachTown:Basics was a tool to help them more easily collect data. Teachers also described the automated data collection and progress reporting included within TeachTown:Basics as a welcome and easy method to report student progress to parents during parent-teacher conferences.

Students and parents like TeachTown:Basics.

Teachers repeatedly stated that students and parents liked the TeachTown:Basics program and indicated this as an important reason for using it. Teachers reported that students liked spending time on the computer and enjoyed the animations and the videos used as reinforcers. Teachers also reported that parents liked the automated and visual representations of their child’s progress.

TeachTown:Basics helps teachers cope with staffing problems.

Staffing shortages were commonly described as a barrier to implementing the existing EBPs, which often require one-to-one or small group instruction. Teachers expressed that using TeachTown:Basics enabled them to provide their students with individualized instruction through the computer, even when they were short-staffed.

TeachTown:Basics helps teachers manage challenging student behavior.

Teachers often indicated that challenging student behavior interfered with their ability to implement EBPs throughout the school day. They also reported that some children who exhibited challenging behavior enjoyed using the TeachTown:Basics program, and were less prone to disruptive behavior when they were using the computer program. Teachers reported that they would often use TeachTown:Basics as a method to manage those students’ behavior while facilitating instruction for other students in the class.

DISCUSSION

Successful implementation of EBPs in community settings is often the result of years of thoughtful, systematic, and community-partnered effort. The results of this study suggest that even when implementation efforts are executed purposefully, unintended consequences are possible and can negatively affect implementation outcomes. These results demonstrate that introducing unproven innovations that are easier to implement and more attractive to consumers may reduce the use of more labor-intensive practices with an established evidence base.

Intervention characteristics, such as adaptability, complexity, and fit, are critical factors in many implementation science frameworks (Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009; Rogers, 2003), and there is broad support in the literature for the idea that attributes of an intervention contribute to its successful implementation. During interviews, teachers described TeachTown:Basics as a method to overcome barriers to implementing evidence-based one-to-one instruction. They highlighted the ease of using TeachTown:Basics relative to other EBPs and its advantages when classrooms are under-staffed or when challenging student behavior is present. Overwhelmingly, teachers viewed TeachTown:Basics as an efficient alternative to complex one-to-one interventions that are difficult to integrate into their classrooms. The challenges of implementing EBPs for children with ASD in public schools and community settings are well established (Brookman-Frazee, Taylor, & Garland, 2010; Mandell et al., 2013), and teachers appear to have viewed TeachTown:Basics as an attractive alternative. Teachers may view interventions that are easier to implement as more effective, despite a lack of evidence that they improve child outcomes. This hypothesis highlights the importance of designing EBPs that are easier to implement in order to promote uptake.

Given teachers’ views toward TeachTown:Basics, it is not surprising that, over the course of the school year, teachers who were given access to the program reported decreases in their use of established one-to-one evidence-based practices – specifically, discrete trial training and pivotal response training. In contrast, teachers who did not have access to TeachTown:Basics reported increasing their use of DTT and PRT by 154% and 377%, respectively. These changes in EBP use over the school year provide clear evidence that introducing a new, potentially competing practice can suppress growth in the use of established evidence-based practices. By the end of the year, teachers in the control group reported using pivotal response training twice as often, and discrete trial training 1.3 times as often, as teachers in the TeachTown:Basics group. This difference in the use of evidence-based interventions that incorporate individual instruction was not matched by a difference in the use of visual schedules, which suggests that teachers saw TeachTown:Basics specifically as a replacement for one-to-one intervention. Discrete trial training and pivotal response training are more complex than visual schedules and require dedicated instructional time and staff support to implement. Our findings suggest that these characteristics of DTT and PRT drove teachers’ decisions to replace them with TeachTown:Basics. It is important to note that the district’s intention was to supplement existing EBPs with TeachTown:Basics, rather than to replace established practices with the CAI. Our findings indicate that layering additional practices on top of existing practices likely led to implementation fatigue and suppressed the use of some or all of the previously established practices.

Several study limitations should be noted. Perhaps most importantly, our measure of EBP use was collected via teacher self-report, rather than direct observation. Although self-report measures of intervention use lack the rigor of direct observation, it was not feasible, given budgetary constraints and the scale of the study, to frequently and directly observe teachers’ use of each EBP. We have no reason to expect that any limitations of this measure would introduce differential reporting between the experimental and control groups. Second, we do not have a measure of how often classroom assistants implemented any of the EBPs with students. Teacher reports and direct observations in classrooms indicate that, overwhelmingly, teachers were tasked with implementing the individual EBPs with students while assistants engaged other students in small group activities. However, it is possible that a small number of classroom assistants implemented the EBPs with students, and we do not have a measure of how often those sessions occurred.

Despite these limitations, this study has important implications for the implementation and sustainment of EBPs in community practice. Much effort among implementation scientists has focused on developing strategies to improve the implementation of new practices within community settings, yet little attention has been paid to systematically evaluating the potential unintended consequences of these implementation efforts. Careful consideration of such unintended consequences is warranted to ensure that EBPs are sustained, especially when the newly introduced innovations have limited support for their effectiveness. Intervention characteristics likely influence decisions to adopt or discontinue use of a practice, perhaps even more than evidence of effectiveness. Our study shows that teachers will likely decrease their use of EBPs they view as burdensome in favor of innovations that are easier to use, even in the absence of rigorous evidence supporting those innovations’ effectiveness. It is possible that EBPs for ASD are too complex to be implemented as designed in under-resourced settings, resulting in low fidelity (Brookman-Frazee et al., 2010; Mandell et al., 2013; Pellecchia et al., 2015; Stahmer et al., 2015). Teachers working in under-resourced settings face many daily stressors (Abel & Sewell, 1999; Darling-Hammond, 2003), and implementing EBPs for students with ASD may be viewed as an additional burden. Introducing any new practice imposes a burden on the practitioner that may result in a rebalancing of which practices the practitioner uses and with what frequency. TeachTown:Basics was easier to implement, more appealing, and likely benefited teachers by reducing their burnout and daily overload; it makes sense that teachers decreased their use of more difficult EBPs when they had access to it. Reducing teacher burnout is a potentially critical and often overlooked aspect of school-based implementation efforts. In fact, helping teachers improve their implementation of EBPs for ASD may help reduce burnout (Ouellette et al., 2018). Computer-assisted interventions may also have some utility for teachers and students during particularly challenging or stressful times. Given the high acceptance of computer-assisted interventions by both teachers and students, using these approaches for brief respite, or as a complement to other evidence-based practices, may reduce teacher burnout and allow students to engage in a preferred activity during classroom routines. A critical missing element for improving the implementation of EBPs for ASD in schools may be the development of interventions that are easier to implement and appealing to the consumer, while still effective at improving child outcomes. Implementation efforts should incorporate elements of user-centered design (Da Silva, Martin, Maurer, & Silveira, 2011) by partnering with, and learning from, teachers about the types of interventions that are most feasible to implement.

In addition to identifying strategies to support sustained implementation of EBPs for autism in schools, a discussion of systematic efforts to de-implement interventions that are not successful or have iatrogenic effects is also warranted. Implementation scientists recently have called attention to the concept of de-implementation, the study of methods to systematically discontinue or reduce the use of low-value or non-evidence-based practices, as an important under-studied aspect of implementation science (Davidson, Ye, & Mensah, 2017; Prasad & Ioannidis, 2014; Wang, Maciejewski, Helfrich, & Weiner, 2018). The systematic de-implementation of ineffective clinical practices is an essential aspect of quality assurance (Hahn, Munoz-Plaza, Wang, Garcia-Delgadillo, Mittman, & Gould, 2017), and has been posited as a critical component of efforts to improve the implementation of evidence-based practices in routine care (Prasad & Ioannidis, 2014). Our longstanding community-academic partnership with the school district provided a forum to quickly disseminate our research findings to district administrators and leaders, which led to the district-wide de-implementation of TeachTown:Basics within autism support classrooms. Researchers engaged in community-partnered research have an obligation to share research findings with stakeholders and support those stakeholders in integrating research findings into policy and practice (Wells & Jones, 2009). Our findings led to the de-implementation of TeachTown:Basics, but continued work is needed to support the district in the implementation of EBPs that are feasible and accepted by teachers and staff.

CONCLUSIONS

The temptation to adopt innovations that have created some “buzz” and enthusiasm before they have undergone rigorous evaluations of effectiveness is substantial in the school-based implementation of EBPs for ASD. Administrators and leaders in education should carefully consider the evidence supporting the effectiveness of new practices, the intervention-setting fit for new and current practices, and the associated burden and potential trade-offs of implementing new practices, before introducing them. This may be especially true in settings where teachers have fewer resources. Efforts should focus on methods to improve the implementation of interventions that have an established evidence base, rather than layering additional programs onto resource-challenged teachers. Conversely, researchers should be tasked with dismantling complicated interventions (Pellecchia et al., 2015) and testing modular approaches (Powell et al., 2015) to make evidence-based practices easier to implement. The key to successful long-term implementation of EBPs in schools may be for researchers to focus on understanding the characteristics of effective interventions that are most feasible and appealing for this setting.

Acknowledgements:

This study was funded by the National Institutes of Health: 5R01MH106175-02. We are grateful to our colleague (Steven C. Marcus, PhD) for his comments on an earlier version of this paper and his feedback regarding the analytic methods. We also would like to acknowledge and thank the teachers, staff, and administrators of The School District of Philadelphia autism support classrooms for their tireless work and diligent efforts in partnering with us on the implementation of the procedures described in this study.

Funding: This study was funded by the National Institutes of Health: 5R01MH106175-02.

Footnotes

DECLARATIONS

Ethics approval and consent: All procedures described in this protocol are in accordance with the ethical standards of the institutional and/or national research committee and with the 1975 Helsinki declaration and its later amendments or comparable ethical standards. All study procedures were approved by the University of Pennsylvania Institutional Review Board.

Competing Interests: The authors declare that they have no competing interests.

REFERENCES

  1. Aarons GA, Hurlburt M, & Horwitz S. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23.
  2. Abel MH, & Sewell J. (1999). Stress and burnout in rural and urban secondary school teachers. The Journal of Educational Research, 92(5), 287–293.
  3. Bernard-Opitz V, Ross K, & Tuttas M. (1990). Computer assisted instruction for autistic children. Annals of the Academy of Medicine, Singapore, 19(5), 611–616.
  4. Bradley EH, Curry LA, & Devers KJ (2007). Qualitative data analysis for health services research. Health Services Research, 42(4), 1758–1772.
  5. Brookman-Frazee L, Taylor R, & Garland AF (2010). Characterizing community-based mental health services for children with autism spectrum disorders and disruptive behavior disorders. Journal of Autism and Developmental Disorders, 40(10), 1188–1201.
  6. Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, & Shea C. (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15. doi: 10.1186/s12961-017-0175-y
  7. National Autism Center. (2009). National Standards Report. Randolph, MA: National Autism Center.
  8. National Research Council. (2001). Educating Children with Autism. Washington, DC: National Academy Press.
  9. Curran GM, Bauer M, Mittman B, Pyne JM, & Stetler C. (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50(3), 217–226.
  10. Da Silva TS, Martin A, Maurer F, & Silveira M. (2011). User-centered design and agile methods: A systematic review. Paper presented at the Agile Conference (AGILE), 2011.
  11. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1).
  12. Darling-Hammond L. (2003). Keeping good teachers: Why it matters, what leaders can do. Educational Leadership, 60(8), 6–13.
  13. Davidson KW, Ye S, & Mensah GA (2017). De-implementation science: A virtuous cycle of ceasing and desisting low-value care before implementing new high-value care. Ethnicity and Disease, 27(4), 463–468.
  14. Dettmer S, Simpson RL, Myles BS, & Ganz JB (2000). The use of visual supports to facilitate transitions of students with autism. Focus on Autism and Other Developmental Disabilities, 15(3), 163–169.
  15. Dingfelder HE, & Mandell DS (2011). Bridging the research-to-practice gap in autism intervention: An application of diffusion of innovation theory. Journal of Autism and Developmental Disorders, 41(5), 597–609.
  16. Hahn EE, Munoz-Plaza CE, Wang J, Garcia-Delgadillo J, Mittman BS, & Gould MK (2016). Working towards de-implementation: A mixed-methods study in breast cancer surveillance care. Journal of Patient Centered Research and Reviews, 3, 177–178.
  17. Kasari C, & Smith T. (2013). Interventions in schools for children with autism spectrum disorder: Methods and recommendations. Autism, 17(3), 254–267. doi: 10.1177/1362361312470496
  18. Khowaja K, & Salim SS (2013). A systematic review of strategies and computer-based intervention (CBI) for reading comprehension of children with autism. Research in Autism Spectrum Disorders, 7(9), 1111–1121.
  19. Knight V, McKissick BR, & Saunders A. (2013). A review of technology-based interventions to teach academic skills to students with autism spectrum disorder. Journal of Autism and Developmental Disorders, 43(11), 2628–2648.
  20. Koegel LK, Koegel RL, Harrower JK, & Carter CM (1999). Pivotal response intervention I: Overview of approach. Journal of the Association for Persons with Severe Handicaps, 24(3), 174–185.
  21. Koegel RL (1988). How to Teach Pivotal Behaviors to Children with Autism: A Training Manual.
  22. Lovaas OI (1987). Behavioral treatment and normal educational and intellectual functioning in young autistic children. Journal of Consulting and Clinical Psychology, 55(1), 3–9.
  23. Mandell DS, Stahmer AC, Shin S, Xie M, Reisinger E, & Marcus SC (2013). The role of treatment fidelity on outcomes during a randomized field trial of an autism intervention. Autism, 17(3), 281–295. doi: 10.1177/1362361312473666
  24. Odom SL, Boyd BA, Hall LJ, & Hume K. (2010). Evaluation of comprehensive treatment models for individuals with autism spectrum disorders. Journal of Autism and Developmental Disorders, 40(4), 425–436.
  25. Ouellette RR, Pellecchia M, Beidas RS, Wudeman R, Xie M, & Mandell DS (2018). Boon or burden: The effect of implementing evidence-based practices on teachers’ emotional exhaustion. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-018-0894-6
  26. Pellecchia M, Connell JE, Beidas RS, Xie M, Marcus SC, & Mandell DS (2015). Dismantling the active ingredients of an intervention for children with autism. Journal of Autism and Developmental Disorders, 45(9), 2917–2927.
  27. Pellecchia M, Marcus SC, Spaulding CS, Seidman M, Xie M, Rump K, . . . Mandell DS. (2018). Randomized trial of a computer-assisted intervention for children with autism in schools. Under review.
  28. Ploog B. (2010). Educational computer games and their applications to developmental disabilities. In Edvardsen F. & Kulle H. (Eds.), Educational Games: Design, Learning and Applications (pp. 281–297). Hauppauge, NY: Nova Science Publishers, Inc.
  29. Ploog BO, Scharf A, Nelson D, & Brooks PJ (2013). Use of computer-assisted technologies (CAT) to enhance social, communicative, and language development in children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 43(2), 301–322.
  30. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen C, Proctor EK, & Mandell DS (2015). Methods to improve the selection and tailoring of implementation strategies. Journal of Behavioral Health Services and Research, 44(2), 177–194.
  31. Powell BJ, Proctor EK, & Glass JE (2013). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24(2), 192–212.
  32. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, . . . Kirchner JE (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(1), 21. doi: 10.1186/s13012-015-0209-1
  33. Prasad V, & Ioannidis JP (2014). Evidence-based de-implementation for contradicted, unproven, and aspiring healthcare practices. Implementation Science, 9(1), 1.
  34. Ramdoss S, Lang R, Fragale C, Britt C, O’Reilly M, Sigafoos J, . . . Lancioni GE. (2012). Use of computer-based interventions to promote daily living skills in individuals with intellectual disabilities: A systematic review. Journal of Developmental and Physical Disabilities, 24(2), 197–215. doi: 10.1007/s10882-011-9259-8
  35. Ramdoss S, Lang R, Mulloy A, Franco J, O’Reilly M, Didden R, & Lancioni G. (2011). Use of computer-based interventions to teach communication skills to children with autism spectrum disorders: A systematic review. Journal of Behavioral Education, 20(1), 55–76.
  36. Ramdoss S, Machalicek W, Rispoli M, Mulloy A, Lang R, & O’Reilly M. (2012). Computer-based interventions to improve social and emotional skills in individuals with autism spectrum disorders: A systematic review. Developmental Neurorehabilitation, 15(2), 119–135.
  37. Raudenbush SW, & Bryk AS (2002). Hierarchical Linear Models: Applications and Data Analysis Methods. Thousand Oaks, CA: Sage Publications, Inc.
  38. Rogers E. (2003). Diffusion of Innovations (5th ed.). New York, NY: Free Press.
  39. Self T, Scudder RR, Weheba G, & Crumrine D. (2007). A virtual approach to teaching safety skills to children with autism spectrum disorder. Topics in Language Disorders, 27(3), 242–253.
  40. Smith T. (2001). Discrete trial training in the treatment of autism. Focus on Autism and Other Developmental Disabilities, 16(2), 86–92.
  41. Stahmer AC, Rieth S, Lee E, Reisinger EM, Mandell DS, & Connell JE (2015). Training teachers to use evidence-based practices for autism: Examining procedural implementation fidelity. Psychology in the Schools, 52(2), 181–195. doi: 10.1002/pits.21815
  42. Volkmar FR, & Weisner LA (2009). Educational interventions. In A Practical Guide to Autism: What Every Parent, Family Member, and Teacher Needs to Know. Hoboken, NJ: John Wiley and Sons, Inc.
  43. Wang V, Maciejewski ML, Helfrich CD, & Weiner BJ (2018). Working smarter not harder: Coupling implementation to de-implementation. Healthcare, 6(2), 104–107.
  44. Wainer AL, & Ingersoll BR (2011). The use of innovative computer technology for teaching social communication to individuals with autism spectrum disorders. Research in Autism Spectrum Disorders, 5(1), 96–107.
  45. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, . . . Halko H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12(1), 108. doi: 10.1186/s13012-017-0635-3
  46. Wells KB, & Jones L. (2009). “Research” in community-partnered, participatory research. JAMA, 302(3), 320–321.
  47. Whalen C, Moss D, Ilan AB, Vaupel M, Fielding P, Macdonald K, . . . Symon J. (2010). Efficacy of TeachTown: Basics computer-assisted intervention for the intensive comprehensive autism program in Los Angeles Unified School District. Autism, 14(3), 179–197.
