2024 Oct 9;29(1):207–221. doi: 10.1177/13623613241273081

Use of telemediated caregiver coaching to increase access to naturalistic developmental behavioral interventions within a statewide early intervention system

Kathleen Simcoe 1, J Alacia Stainbrook 1, Kate T Chazin 1, Elaina Schnelle 2, Liliana Wagner 1, Madison Hooper 3, A Pablo Juárez 1, Zachary Warren 1,3
PMCID: PMC11656621  PMID: 39381960

Abstract

Despite the clear efficacy and appeal of naturalistic developmental behavioral interventions for families of young children, they are often difficult for families to access due to the limited availability of trained service providers. In recent years, telehealth has emerged as an effective tool for overcoming issues related to access, especially in rural and underserved communities. However, while telehealth offers a strategy to connect with families, it does not address the limited availability of trained providers. In this article, we provide an overview of a statewide model developed to increase access to naturalistic developmental behavioral interventions for families while building the capacity of early intervention providers. Through this model, expert consultants connect to caregivers and providers via telehealth to provide information and coaching over a limited series of visits. Collectively, child, caregiver, and provider outcomes support the effectiveness, acceptability, and feasibility of this model while demonstrating that services can be provided successfully to diverse participants.

Lay abstract

Many families seek access to evidence-based therapy to support their child’s learning. Naturalistic developmental behavioral intervention is a set of practices that use a child’s natural motivation and interest to teach skills in everyday routines. Many families find naturalistic developmental behavioral interventions appealing and they have been proven to be effective. However, families may not be able to enroll in naturalistic developmental behavioral intervention–based programs due to the limited availability of trained service providers. Telehealth is the use of technology to engage with care providers, including doctors and therapists. Telehealth is an effective tool for improving access to services, especially for people in rural and underserved communities. Telehealth offers a way for providers to connect with families but it does not address the low numbers of trained providers. In this article, we share a statewide model developed to increase access to naturalistic developmental behavioral interventions for families while increasing training opportunities for early intervention providers. Through this model, expert consultants worked with caregivers and providers via telehealth for a brief series of visits. During these visits, consultants taught caregivers and providers strategies based in naturalistic developmental behavioral interventions. Feedback from caregivers and providers, along with improvement in child skills, show that this model was effective and acceptable.

Keywords: autism, early intervention, naturalistic developmental behavioral interventions, telehealth


The prevalence of autism continues to increase with approximately 1 in 36 children now receiving diagnoses; this includes more children receiving an evaluation before age 4 (Maenner et al., 2021; Shaw et al., 2023). With this increase in early identification comes increased need for more accessible and effective early intervention practices, including a modality for reaching families and providers in rural and under-resourced areas.

Naturalistic developmental behavioral interventions

Naturalistic developmental behavioral interventions (NDBIs) refer to empirically supported interventions that are rooted in the principles of applied behavior analysis (ABA) and developmental theory (Schreibman et al., 2015). NDBIs are particularly appealing to families of young children, given that they can be implemented in naturalistic contexts, involve shared control between the adult and child, and capitalize on naturally occurring motivation (Schreibman et al., 2015; Vivanti & Zhong, 2020). Despite the effectiveness of NDBIs (Rogers et al., 2019; Stahmer et al., 2020; Wetherby et al., 2018) and the appeal for families of young children, NDBIs are not widely implemented across service delivery systems (D’Agostino et al., 2023).

Systematically training caregivers to implement NDBIs offers one possible solution to increase access. Evidence suggests caregiver coaching improves implementation of evidence-based practices (e.g. NDBIs) and promotes skill growth for young children (Akamoglu & Meadan, 2018; Heidlage et al., 2019; Kasari et al., 2014); furthermore, its effects are often amplified because of the longevity of a caregiver’s relationship with their child (Hampton et al., 2020; Kasari et al., 2014; Lundahl et al., 2006). In addition, coaching within daily routines may facilitate generalization to novel activities (Kashinath et al., 2006; McDuffie et al., 2013; Moore et al., 2014). Despite the benefits of caregiver coaching, barriers remain in accessing providers trained to coach caregivers in NDBIs, especially in rural and under-resourced communities (Antezana et al., 2017; Drahota et al., 2020; Mello et al., 2016; Wallace-Watkin et al., 2022).

Telemediated caregiver coaching

Telehealth offers the potential for overcoming the barriers identified above (Antezana et al., 2017; Corona et al., 2020, 2021; Juarez et al., 2018; Knutsen et al., 2016; Olsen et al., 2012; Zwaigenbaum & Warren, 2020). Telemediated service delivery has become increasingly popular since the COVID-19 pandemic (Shaver, 2022), with growing evidence to support its use across intervention services (Ellison et al., 2021; Knutsen et al., 2016; Lindgren et al., 2016; Olsen et al., 2012; Wacker et al., 2013) including NDBIs (D’Agostino et al., 2020; Ingersoll et al., 2016; Vismara et al., 2018).

Several limitations in this literature constrain our understanding of the broader utility of telemediated caregiver coaching for NDBI-based programming. First, few large-scale studies have combined telehealth with NDBI-based caregiver coaching, particularly across varying geographic areas (e.g. rural vs urban) and with families from diverse backgrounds. Thus, we have limited data to understand for whom these interventions are effective and preferred, and whether outcomes are affected by characteristics of the children and families. Second, although utilizing telehealth to coach caregivers in NDBIs has the potential to address important barriers for families, telemediated coaching alone does not address the shortage of providers capable of training caregivers (Yingling et al., 2021) nor the ongoing coaching and professional development opportunities required by providers (Kyzer et al., 2014; Rogers et al., 2022; Wainer et al., 2017).

Infants and toddlers with developmental delays and disabilities are eligible for early intervention services through Part C of the Individuals with Disabilities Education Act (IDEA; “Early Intervention Program for Infants and Toddlers with Disabilities,” 2011). These programs are already embedded within communities and offer an ideal platform for helping families access services. Although early intervention providers (EIPs) are trained to deliver general strategies to promote child development, they may be more limited in their understanding of evidence-based practices to address the specific needs of children with autism or related developmental profiles, NDBI strategies, and manualized curricula (Hendrix et al., 2023; Spiker et al., 2000; Stahmer et al., 2005). However, equipping EIPs with the expertise to coach caregivers in NDBI-based strategies within their ongoing visits may alleviate some of the barriers families experience in accessing specialized care.

Caregiver and provider support services

We developed the Caregiver and Provider Support Services (CAPSS) program to (a) increase access to NDBI-based support services following a diagnostic evaluation for autism and (b) build EIPs’ capacity to serve the growing number of autistic children (Maenner et al., 2020). CAPSS was initially developed as an in-person program through which trained consultants from an academic medical center provided caregiver coaching in NDBI strategies. Beginning in July 2020, the program permanently transitioned to a telehealth-only model of service delivery, following data suggesting that the telehealth model resulted in similar child outcomes to in-person services, and increased the capacity of our program by 25% (Corona et al., 2021).

The purpose of this article is to provide an overview of our telemediated NDBI coaching model, to examine differences that may emerge between demographic groups, and to assess the initial feasibility of and interest in incorporating EIPs into the CAPSS model. We also examine the impact of this model on (a) child outcomes, (b) caregiver and EIP perceptions of acceptability and effectiveness, (c) reported fidelity to the treatment model, and (d) provider perceptions of the long-term impact on their service provision.

Method

Overview of service model

CAPSS was developed over the course of an 8-year partnership between our university-based medical center and statewide Part C system. The model consists of six 1-hour telehealth visits between a behavioral consultant (hereafter: consultant) and a family (i.e. child and at least one caregiver). The family’s EIP participates in at least two sessions. The consultant provides services via telehealth from a university-based medical center, the family is typically based at home, and the family’s EIP has the option to participate with the family in person or via telehealth.

Caregiver education

Prior to the start of services, caregivers, the EIP, and the consultant collaboratively select one of five domains to target throughout the CAPSS program (i.e. functional communication, social play skills, toilet training, sleep, or dangerous behavior), each with associated curriculum modules. If caregivers would like to target more than one topic, they are encouraged to select their highest priority, and EIPs are encouraged to address additional topics with families after participation in the CAPSS program. The five curriculum modules were developed by our team and guided by the principles of NDBIs. Previous versions of this model utilized a well-known manualized curriculum (e.g. Early Start Denver Model caregiver training; Rogers et al., 2012). However, the unique nature of this approach (e.g. six 1-hour visits, the need for materials that are easily accessed by caregivers and EIPs) and the priorities identified by our statewide system led us to create our own modules. The specific NDBI strategies taught vary depending on the focus area, the family’s priorities, and the family’s existing skillset. Each module is divided into six lessons that correspond to the six visits included in the series; each lesson requires 15–20 min to complete. The open-access modules include interactive online courses with specific objectives and printable materials (see Figure 1).

Figure 1.

Example of curriculum-specific treatment fidelity checklist (communication curriculum).

EIP collaboration

The family’s EIP was required to join a minimum of two sessions; frequently, EIPs participated in all six sessions. During visits, consultants and EIPs collaborated to share expertise on intervention strategies and coach caregivers during naturalistic routines with their child. In addition to collaboration during visits, consultants engaged EIPs by discussing family goals prior to visits, coordinating a plan for each session, and developing a plan for follow-up (see Figure 2).

Figure 2.

Visual representation of the CAPSS service delivery model.

Family-guided routines-based intervention

A family-guided routines-based intervention (FGRBI) framework (Woods, 2021) was used throughout each of the six visits. FGRBI is a responsive framework employed to coach caregivers on the use of evidence-based strategies to promote child learning within everyday routines. FGRBI had been adopted by our state Part C system; therefore, most EIPs had received some training in this approach. The FGRBI model consists of four components in each session: (1) setting the stage for coaching, (2) observing caregiver implementation and embedding opportunities to practice evidence-based strategies, (3) problem-solving and planning next steps, and (4) reflecting on the session and reviewing the action plan. An in-depth description of these components is available in the FGRBI Key Indicators Manual (Woods, 2021).

Cultural responsiveness

Consultants made adaptations to ensure the intervention was responsive to the unique needs of each family. First, consultants utilized interpreters in sessions if the family primarily spoke a language other than English. Second, goals were selected based on family priority, including cultural priority. For example, caregivers were encouraged to teach culturally relevant gestures (e.g. raising arms for “help”) rather than focusing on a specific gesture identified by the consultant (e.g. ASL sign for “help”). Third, consultants received ongoing training on implicit bias and cultural sensitivity from the medical center where they were based, the Part C system, and continuing education related to professional licensing requirements.

Case example

The following is a brief example of a typical CAPSS session: The consultant joined the telehealth session and greeted 2-year-old August, his mother, and his EIP, all present in the home. August’s mother shared recent updates and demonstrated the progress they were making with joint play by engaging in a back-and-forth block routine. She demonstrated the three strategies that had been introduced previously: following August’s lead in play, taking short turns, and adding interest with sound effects. These strategies addressed the family’s goals of increasing August’s participation in play and attention to others. Afterward, the team reflected on the routine, discussing which NDBI strategies worked well and what August’s mother wanted to target during the rest of the session. She said that August had been enjoying bath time and she’d like to increase the back-and-forth engagement during that routine. The consultant and EIP helped August’s mother reflect on how she might apply the previously learned strategies to bath time. The consultant reminded August’s mother of a tip sheet about strategies for building new routines and showed a video example from the online module of a bath time routine. They brainstormed together to identify a new strategy, sitting at eye level, for August’s mother to try during his next bath.

Participants

Consultants

The services described in this article were provided by 10 consultants from a university-based medical center with training and experience implementing NDBIs, including seven Board Certified Behavior Analysts® (BCBAs®), two speech-language pathologists, and one early childhood educator. Consultants had an average of 15 years of experience (range = 3–25) working with families and young children with autism and other developmental delays. A doctoral-level BCBA (BCBA-D®) and two BCBAs provided ongoing supervision via direct observation and feedback. Consultants had prior experience with various NDBI models (e.g. Early Start Denver Model; Rogers et al., 2012; Enhanced Milieu Teaching; Hancock et al., 2016), along with specific knowledge of caregiver coaching and adult education.

Early intervention providers

Participating EIPs included 80 individuals from 20 different agencies across the state. All EIPs were required by the state to have a degree in early childhood education or special education, child and family studies, early intervention, or a related field and were required to participate in continuing education for 30 h per year. EIPs participated in sessions by sharing prior knowledge of family needs and interventions, preparing families for sessions, and collaborating with consultants to individualize recommended strategies. Our center offers additional training focused on capacity building that is not described here.

Families

Children and their caregivers were eligible for participation if they were enrolled in Part C services and receiving developmental therapy from an EIP, had received an autism evaluation (regardless of evaluation outcome), and the child was under 33 months of age at the time of referral. This allowed for service completion before the child’s third birthday, when children typically transitioned out of Part C. Data for this article were collected from families who opted to participate in services between July 2020 and June 2022 (n = 327). Of these, 47 families discontinued services before three sessions, and 46 families completed three or more sessions but did not provide post-intervention data.

Data analyzed for this article are drawn from 234 families who completed at least one post-intervention measure (n = 229) and/or had treatment fidelity scores available for analysis (n = 5). Because of the community-based nature of this program, we did not always receive complete datasets, as there was no monetary incentive for families to complete surveys. We felt it important to include as much data as possible to increase transparency, so we did not remove data from analyses if a dataset was incomplete. Where pre- and post-intervention data are reported, datasets were matched to ensure that families completed both measures for an accurate comparison. All families had a participating child between the ages of 17 and 35 months at the start of services (M = 28.35 months, SD = 4.31 months), and most children (83.8%) had an autism diagnosis. Additional demographic data are presented in Table 1.
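As noted above, pre/post comparisons were run only on matched datasets, that is, families with both a pre- and a post-intervention score on a given measure. A minimal sketch of that matching step, using hypothetical family IDs and scores (not study data):

```python
# Hypothetical sketch of pre/post matching: retain only families with
# both a pre- and post-intervention score for a paired comparison.
# Family IDs and scores are illustrative, not the program's actual data.

def matched_pairs(pre_scores, post_scores):
    """pre_scores/post_scores: dicts mapping family ID -> score on one measure."""
    ids = sorted(set(pre_scores) & set(post_scores))  # families with both measures
    return [(pre_scores[i], post_scores[i]) for i in ids]

pre = {"fam01": 60, "fam02": 65, "fam03": 70}
post = {"fam01": 70, "fam03": 78}  # fam02 did not return a post measure
pairs = matched_pairs(pre, post)   # [(60, 70), (70, 78)]
```

Families with a pre-intervention score but no post-intervention score (like the hypothetical fam02) simply drop out of the paired analysis rather than being imputed, consistent with the approach described above.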

Table 1.

Participant demographics.

Participant characteristics Participants included in analysis Participants excluded from analysis
n (%) n (%)
Child race
 Asian 5 (2.1%) 0
 Black or African American 29 (12.4%) 4 (4.3%)
 White 180 (76.9%) 15 (16.1%)
 Multiple races 18 (7.7%) 0
 Other/unknown 2 (0.8%) 74 (79.6%)
Child ethnicity
 Hispanic or Latino 12 (5.1%) 14 (15.1%)
 Not Hispanic or Latino 221 (94.4%) 15 (16.1%)
 Unknown 1 (0.4%) 64 (68.8%)
Caregiver income
 Less than US$25,000 a year 42 (17.9%) 1 (1.1%)
 US$25,000 to US$50,000 a year 57 (24.3%) 3 (3.2%)
 US$50,000 to US$75,000 a year 39 (16.7%) 0
 US$75,000 to US$100,000 a year 19 (8.1%) 0
 US$100,000 to US$125,000 a year 23 (9.8%) 0
 US$125,000 to US$150,000 a year 5 (2.1%) 0
 Over US$150,000 a year 9 (3.8%) 1 (1.1%)
 Prefer not to say 32 (13.7%) 0
 Unknown 8 (3.4%) 88 (94.6%)
Caregiver education
 Less than high school diploma 9 (3.8%) 2 (2.2%)
 High school graduate (or GED equivalent) 62 (26.5%) 2 (2.2%)
 Some college or associate degree 93 (39.7%) 1 (1.1%)
 College degree 50 (21.4%) 0
 Graduate degree(s) 16 (6.8%) 0
 Unknown 4 (1.7%) 88 (94.6%)
Home location
 Rural 145 (62%) 27 (29%)
 Non-rural 89 (38%) 25 (26.9%)
 Unknown 0 41 (44.1%)
Child diagnosis
 Autism 196 (83.8%) 42 (45.2%)
 Developmental delay 19 (8.1%) 3 (3.2%)
 Other (e.g., Language Delay, Down syndrome) 19 (8.1%) 2 (2.2%)
 Unknown 0 46 (49.5%)

GED: general equivalency diploma.

To run post hoc analyses, participants were grouped into categories based on demographic data. Participants were identified as “rural” (62%) or “urban” (38%) based on their address to examine potential group differences by geographic location. The HRSA Rural Grants Eligibility Analyzer (Health Resources and Services Administration, 2023) was used to classify areas as “rural” versus “urban.” To examine potential differences between racial and ethnic groups, participants were grouped as “White” (77%) or “Black, Indigenous, People of Color” (BIPOC; 23%). Data on ethnicity (Hispanic/Latino or Not Hispanic/Latino) were collected separately from data on race, so participants who identify as Hispanic/Latino are present in both the White and BIPOC groups. Thus, we separately report data comparing Hispanic/Latino participants (5.1%) to non-Hispanic/Latino participants (94.8%).

Measures

Data collection and analysis

Data were collected and managed using Research Electronic Data Capture (REDCap), a secure, web-based software platform hosted at the university-based medical center (Harris et al., 2009, 2019). Measures were sent directly to relevant stakeholders (i.e. caregivers, EIPs, and/or consultants) before the start of services and/or at the conclusion of services.

We collected data specific to child, caregiver, EIP, and consultant outcomes. Pre-intervention and post-intervention data were collected for the following child outcome measures: Communication and Symbolic Behavior Scales Developmental Profile Caregiver Questionnaire (CSBS-DP; Wetherby & Prizant, 2003), MacArthur-Bates Communicative Development Inventory (MCDI) Short Form (Fenson et al., 1993), and Clinical Global Impression (CGI) Scale. Only post-intervention data were collected for caregiver and EIP acceptability surveys. Post-intervention measures were only sent to families who completed three or more sessions.

In addition, treatment fidelity data were reported by consultants after each session. In December 2022, EIPs were asked to complete an additional impact survey reflecting on the services provided between July 2020 and June 2022.

Child outcomes

Child measures focused on caregiver and consultant perception of severity of challenges, improvement over time, and development of communication skills. All child measures were completed by the caregiver before and immediately following intervention; the CGI scale was also completed by the consultant.

To measure social communication, caregivers were asked to complete the CSBS-DP. The CSBS-DP is a 41-item caregiver-completed, norm-referenced questionnaire that measures communication across three domains: social communication, speech, and symbolic communication.

To measure language, caregivers were asked to complete the MCDI Short Form (Level 1). The MCDI is a caregiver-completed, norm-referenced questionnaire that measures receptive and expressive language development, as well as communicative actions and gestures. We used the MCDI Short Form (Level 1), which is norm-referenced for young children ages 8 to 18 months, although it is also used with older children with language delays.

To measure severity of challenges, caregivers and consultants completed the CGI. The CGI is a well-researched instrument used to assess the severity of the impact of a diagnosis or disability on a person’s daily functioning (Busner & Targum, 2007). For all CGI scales, ratings are on a 7-point Likert-type scale, with higher scores indicating a greater impact and lower scores indicating lesser impact. We created a CGI scale to rate challenges experienced by a child and their family across daily routines (i.e. 1 indicates “no challenges” and 7 indicates “very severe challenges”). Our CGI assessed the presence of challenges across seven domains: child participation in caregiving routines, play-based routines, verbal communication, nonverbal communication, social interactions, restricted or narrow interests, and adaptive behavior.

Acceptability survey (caregivers and EIPs)

To measure acceptability, we collected data on caregiver and EIP satisfaction with services. Caregivers and EIPs independently completed a 15-item survey at the conclusion of services. This survey included 12 closed-ended items and 3 open-ended items, intended to measure satisfaction with the consultant, services, and outcomes. Responses for closed-ended items were reported on a 4-point Likert-type scale, ranging from “strongly disagree” to “strongly agree.”

Treatment fidelity

After each session, consultants completed a three- to five-item checklist self-assessing adherence to curriculum-specific objectives. Treatment fidelity checklists were specific to each curriculum module. Each objective was scored as “discussed” if the consultant reviewed content related to that item with the family, and as “achieved” if the caregiver demonstrated knowledge or application related to the item. It was possible for an objective to be scored as “discussed” but “not achieved” if a caregiver did not demonstrate the skill. Thus, the “discussed” score provided a measure of consultant implementation fidelity, and the “achieved” score provided a measure of caregiver implementation fidelity. See Figure 1 for a sample treatment fidelity checklist.
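The fidelity percentages reported later (Tables 3–5) can be read as the share of checklist objectives meeting each criterion within a session. A minimal sketch of that calculation, using a hypothetical checklist structure rather than the program’s actual data model:

```python
# Hypothetical sketch of per-session fidelity percentages derived from a
# "discussed"/"achieved" checklist. The item structure is illustrative.

def fidelity_scores(checklist):
    """checklist: list of dicts with boolean 'discussed' and 'achieved' flags,
    one per curriculum-specific objective (three to five per session)."""
    n = len(checklist)
    discussed = 100 * sum(item["discussed"] for item in checklist) / n
    achieved = 100 * sum(item["achieved"] for item in checklist) / n
    return discussed, achieved

# Example session: one objective is discussed but not achieved because the
# caregiver has not yet demonstrated the skill, and one is not discussed.
session = [
    {"discussed": True, "achieved": True},
    {"discussed": True, "achieved": False},
    {"discussed": True, "achieved": True},
    {"discussed": False, "achieved": False},
]
discussed_pct, achieved_pct = fidelity_scores(session)  # 75.0, 50.0
```

Because an objective can be discussed without being achieved, the “achieved” percentage can never exceed the “discussed” percentage under this scoring scheme.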

Impact survey

Following the completion of both service years (December 2022), EIPs completed a 12-item survey as a self-reflection on the long-term impact of services. The survey assessed improvements in knowledge, comfort, and application of NDBI strategies to support young children and caregivers following participation in services. Responses for closed-ended items were reported on a 5-point Likert-type scale, ranging from “not at all improved” to “extremely improved.”

Analytic plan

Post-intervention data analyzed for this article are drawn from 229 families who completed a minimum of three sessions with a consultant and at least one post-intervention measure. We analyzed child outcomes, as well as consultant, caregiver, and EIP satisfaction. Treatment fidelity was analyzed for all families for whom it was available, which included five additional families who did not complete any post-intervention measures. Due to missing data from some families, sample size differs across measures. For the CSBS-DP, MCDI Short Form (Level 1), and CGI, we used paired-sample t-tests to compare respondent ratings prior to and following intervention. For satisfaction and impact surveys, the mode was calculated for each response and within categories. Independent-samples t-tests (Student’s t-test (t) or Welch’s t-test (t*), depending on whether the assumption of homogeneity of variance was satisfied, as indicated by Levene’s test) were used to compare groups (e.g. rural vs non-rural locations) across measures. Effect size was estimated using Cohen’s d, or Cohen’s d with the Welch approximation for the degrees of freedom. Small, medium, and large effect sizes correspond to 0.2, 0.5, and 0.8, respectively. To analyze group differences for income and education level, where it was not possible to create two dichotomous groups, analyses of variance (ANOVAs) were used for comparisons. Only one response fell in the “Less than high school diploma” group for caregiver education, so that score was removed from group comparisons.
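As a worked illustration of the paired pre/post comparisons described above, the paired t statistic and Cohen’s d for paired data can both be computed from the per-family difference scores. The sketch below uses only illustrative numbers, not study data:

```python
# Minimal sketch of a paired-sample t statistic and Cohen's d for paired
# data, as used for the CSBS-DP, MCDI, and CGI pre/post comparisons.
# Scores below are illustrative, not study data.
import math
import statistics

def paired_t_and_d(pre, post):
    """Return (t, d) for matched pre/post scores; df for t is n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_diff = statistics.mean(diffs)
    sd_diff = statistics.stdev(diffs)            # sample SD of the differences
    t = mean_diff / (sd_diff / math.sqrt(n))     # paired t statistic
    d = mean_diff / sd_diff                      # Cohen's d for paired data
    return t, d

pre = [60, 65, 70, 55]
post = [70, 72, 78, 60]
t, d = paired_t_and_d(pre, post)  # t ≈ 7.21, d ≈ 3.60 for this toy sample
```

Note that t = d × √n under this formulation, which is why large paired samples can yield highly significant t values even for modest effect sizes.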

Community involvement

The CAPSS program was developed over the last 8 years with significant input from statewide policymakers within the Part C system, caregivers of children with autism and developmental delays (including autistic caregivers), and community providers. Our center also participates in a Community Advisory Council made up of individuals with disabilities, family members, state and community agency representatives, policymakers, and other community members.

Results

A total of 234 families completed an average of 5.43 sessions (SD = 1.05, range = 1–6). An analysis was conducted to determine whether group differences existed between those who met inclusion criteria and those who did not. Pearson’s chi-square test or Fisher’s exact test (when the expected frequency of a cell was less than 5) was used to evaluate the independence of two categorical variables (e.g. dropped out and non-rural). For most variables (i.e. rural vs non-rural location, child race, child diagnosis, and caregiver income), no significant differences were found between participants and those who dropped out. For child ethnicity (p < 0.001) and caregiver education level (p = 0.032), Fisher’s exact test indicated significant differences, and Cramér’s V showed a moderate and a small association, respectively, between these variables and completion.
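The test-selection rule above (Pearson’s chi-square unless any expected cell frequency falls below 5, in which case Fisher’s exact test) can be sketched as follows; the counts are hypothetical, not study data:

```python
# Illustrative sketch of the test-selection rule: Pearson's chi-square
# unless any expected cell frequency is below 5, then Fisher's exact test.
# Observed counts are hypothetical.

def choose_test(table):
    """table: 2x2 list of observed counts, e.g. rows = dropped/completed,
    columns = rural/non-rural. Returns (test name, expected frequencies)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    # Expected frequency for each cell: (row total * column total) / grand total
    expected = [[r * c / n for c in col_totals] for r in row_totals]
    small = any(e < 5 for row in expected for e in row)
    return ("Fisher's exact" if small else "Pearson's chi-square"), expected

name, exp = choose_test([[20, 30], [10, 40]])  # all expected cells >= 5
name2, _ = choose_test([[2, 30], [5, 63]])     # smallest expected cell < 5
```

In practice a statistics library (e.g. SciPy’s `chi2_contingency` and `fisher_exact`) would run the chosen test; the sketch shows only the decision rule itself.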

Group comparisons indicated that attendance did not differ among caregiver income levels (F*(7, 42.14) = 1.465, p = 0.206), or caregiver education levels (F*(4, 39.29) = 2.512, p = 0.057). EIPs (n = 80) participated in an average of 3.85 sessions (SD = 1.76; range = 0–6), with 89% participating in at least two sessions and 63% participating in four or more sessions.

Child outcomes

Following CAPSS, caregivers reported significant improvements in communication and symbolic behavior on the CSBS-DP. Prior to intervention, participants received an average score of 66.22 (SD = 25.35), which increased to 77.68 (SD = 29.99) following intervention (t(131) = –6.55; p < 0.001, d = 0.57). See Table 2 for weighted raw scores by domain. There were no significant differences between groups on the CSBS-DP when comparing rural versus non-rural participants (see Table 3), White versus BIPOC participants (see Table 4), or Hispanic/Latino versus non-Hispanic/Latino participants (see Table 5).

Table 2.

Communication and Symbolic Behavior Scale Developmental Profile (CSBS-DP) composite scores (n = 131).

Weighted raw score Pre Mean (SD) Post Mean (SD) t df p
Social composite 26.91 (8.02) 31.28 (7.92) 8.40 130 < 0.001
Speech composite 15.78 (9.52) 19.96 (10.94) 8.36 130 < 0.001
Symbolic composite 23.42 (11.29) 27.44 (13.27) 5.45 130 < 0.001
Total 66.11 (25.26) 78.68 (28.63) 8.95 130 < 0.001

Table 3.

Rural versus non-rural group comparisons.

Variable Rural Non-rural Test statistic p-value Effect size
n Mean (SD) n Mean (SD)
Child age (months) 145 28.45 (4.05) 89 28.18 (4.71) t*(165.35) = –0.446 0.656 0.061
Attendance (%) 145 87.45 (21.55) 89 90.45 (21.99) t(232) = 1.026 0.306 0.138
Pre-treatment CSBS-DP 139 65.02 (26.20) 83 70.57 (26.07) t(220) = 1.531 0.127 0.212
Post-treatment CSBS-DP 83 77.15 (27.51) 49 78.57 (34.05) t*(84.75) = 0.248 0.805 0.046
CSBS-DP change 83 13.58 (14.36) 48 9.44 (24.80) t*(65.58) = –1.060 0.293 0.205
EI provider satisfaction 64 3.91 (0.41) 27 3.82 (0.61) t(89) = –0.804 0.424 0.184
Caregiver satisfaction 84 3.74 (0.48) 49 3.83 (0.33) t*(127.69) = 1.220 0.225 0.208
Fidelity discussed 115 90.24 (17.13) 67 83.05 (21.09) t*(116.49) = –2.372 0.019 0.374
Fidelity achieved 113 81.60 (20.94) 67 73.00 (27.15) t*(112.49) = –2.230 0.028 0.355

CSBS-DP: Communication and Symbolic Behavior Scale Developmental Profile; EI: early intervention.

t represents an independent t-test and t* represents a Welch’s t-test.

Table 4.

White versus BIPOC group comparisons.

Variable White BIPOC Test statistic p-value Effect size
n Mean (SD) n Mean (SD)
Child age (months) 180 28.05 (4.38) 54 29.33 (3.95) t(232) = –1.932 0.055 0.300
Attendance (%) 180 89.24 (19.65) 54 86.41 (27.63) t(232) = 0.841 0.401 0.131
Pre-treatment CSBS-DP 174 65.46 (26.80) 78 73.01 (23.36) t(220) = –1.774 0.078 0.289
Post-treatment CSBS-DP 105 76.55 (30.22) 27 82.06 (29.19) t(130) = –0.850 0.397 0.183
CSBS-DP change 105 12.55 (19.88) 26 10.10 (14.29) t(129) = 0.592 0.555 0.130
EI provider satisfaction 74 3.91 (0.39) 17 3.76 (0.75) t*(17.99) = 0.804 0.432 0.253
Caregiver satisfaction 109 3.75 (0.46) 24 3.89 (0.25) t*(62.43) = –2.179 0.033 0.397
Fidelity discussed 146 87.92 (18.53) 36 86.23 (20.80) t(180) = 0.479 0.633 0.089
Fidelity achieved 144 79.23 (21.90) 36 75.08 (30.13) t*(44.67) = 0.777 0.441 0.158

BIPOC: Black, Indigenous, People of Color; CSBS-DP: Communication and Symbolic Behavior Scale Developmental Profile; EI: early intervention.

t represents an independent t-test and t* represents a Welch’s t-test.

Table 5.

Hispanic/Latino versus non-Hispanic/Latino.

Variable Hispanic/Latino Non-Hispanic/Latino Test statistic p-value Effect size
n Mean (SD) n Mean (SD)
Child age (months) 12 28.75 (3.89) 221 28.35 (4.33) t(231) = 0.314 0.754 0.093
Attendance (%) 12 86.17 (25.33) 221 88.75 (21.61) t(231) = –0.399 0.690 0.118
Pre-treatment CSBS-DP 11 75.82 (30.62) 211 66.64 (25.98) t(220) = 1.132 0.259 0.350
Post-treatment CSBS-DP 7 84.64 (28.78) 125 77.29 (30.11) t(130) = 0.630 0.530 0.245
CSBS-DP change 7 13.93 (16.27) 124 11.96 (19.07) t(129) = 0.267 0.790 0.104
EI provider satisfaction 6 4.00 (0.00) 84 3.87 (0.49) t(83) = 2.363 0.020 0.365
Caregiver satisfaction 6 4.00 (0.00) 127 3.76 (0.44) t*(126) = 6.045 < 0.001 0.759
Fidelity discussed 10 76.08 (20.48) 171 88.19 (18.74) t(179) = –1.977 0.049 0.643
Fidelity achieved 10 64.95 (32.05) 169 79.07 (23.05) t(177) = –1.839 0.068 0.598

CSBS-DP: Communication and Symbolic Behavior Scale Developmental Profile; EI: early intervention.

t represents an independent t-test and t* represents a Welch’s t-test.
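The two test types reported in Tables 3 to 5 differ only in their variance assumption, and both can be reproduced with standard tools. Below is a minimal sketch using scipy; the group scores are simulated for illustration (the study's raw data are not public), and the `cohens_d` helper is one common pooled-variance effect-size choice, not necessarily the exact formula used by the authors.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative group scores only, loosely matching the Table 3 summary statistics
rural = rng.normal(78.6, 34.0, size=49)
non_rural = rng.normal(77.2, 27.5, size=83)

# Student's t-test assumes equal variances (reported as "t" in the tables)
t_student, p_student = stats.ttest_ind(non_rural, rural, equal_var=True)

# Welch's t-test drops that assumption and uses adjusted degrees of
# freedom (reported as "t*" in the tables)
t_welch, p_welch = stats.ttest_ind(non_rural, rural, equal_var=False)

def cohens_d(a, b):
    """Cohen's d using a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(t_welch, p_welch, cohens_d(non_rural, rural))
```

Welch's test is typically preferred when group sizes and variances are unequal, as is the case for most comparisons in these tables.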

Group comparisons indicated that post-treatment CSBS-DP scores did not differ among diagnostic groups (F(3, 128) = 1.104, p = 0.350), caregiver income levels (F(7, 118) = 0.432, p = 0.880), or caregiver education levels (F(3, 125) = 0.506, p = 0.679). When controlling for pre-treatment CSBS-DP scores, post-treatment CSBS-DP scores did not differ among diagnostic groups (F(3, 127) = 2.489, p = 0.063), caregiver income levels (F(7, 117) = 1.040, p = 0.407), or caregiver education levels (F(3, 124) = 1.181, p = 0.320). Group comparisons indicated that changes in pre- to post-treatment CSBS-DP scores did not differ among diagnostic groups (F(3, 127) = 0.881, p = 0.453), caregiver income levels (F(7, 118) = 0.768, p = 0.615), or caregiver education levels (F(3, 36.083) = 1.532, p = 0.223).
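The multi-group comparisons above follow the standard one-way ANOVA pattern, with degrees of freedom F(k − 1, N − k) for k groups and N total participants. A minimal sketch with scipy, using fabricated scores (not the study's data); the group sizes here are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical post-treatment scores for three groups (illustrative only)
groups = [rng.normal(77, 30, size=n) for n in (60, 40, 32)]

# One-way ANOVA: tests whether any group mean differs from the others
f_stat, p_value = stats.f_oneway(*groups)

# Degrees of freedom as conventionally reported: F(k - 1, N - k)
df_between = len(groups) - 1
df_within = sum(len(g) for g in groups) - len(groups)
print(f"F({df_between}, {df_within}) = {f_stat:.3f}, p = {p_value:.3f}")
```

The covariate-adjusted comparisons (controlling for pre-treatment CSBS-DP scores) would instead require an ANCOVA, e.g. regressing post-treatment scores on the pre-treatment covariate plus a group factor.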

On the MCDI Short Form, caregivers reported significant improvements in the number of words children were able to say and understand following intervention (n = 131; M = 27.03, SD = 28.87; t(140) = −7.28; p < 0.001).

For the CGI, caregivers (n = 145; M(pre) = 3.87; M(post) = 3.19; p < 0.001) and consultants (n = 145; M(pre) = 4.31; M(post) = 3.46; p < 0.001) reported a significant decrease in negative impacts across domains (see Table 6). Although caregivers and consultants differed in their absolute pre- and post-treatment ratings, both groups rated significant improvement (p < 0.001) from pre- to post-treatment.

Table 6.

Clinical global impressions scale.

Pre-treatment, Mean (SD) Post-treatment, Mean (SD) p-value
Caregiver 3.9 3.2 < 0.001
 Caregiver routines 3.4 (1.3) 3 (1.2) < 0.001
 Play 3.2 (1.4) 2.6 (1.1) < 0.001
 Verbal communication 5.2 (1.6) 4.4 (1.6) < 0.001
 Nonverbal communication 3.7 (1.5) 3 (1.2) < 0.001
 Social interactions 3.8 (1.5) 3 (1.3) < 0.001
 Restricted interests 4 (1.6) 3.1 (1.3) < 0.001
 Challenging behavior 3.9 (1.5) 3.2 (1.3) < 0.001
Consultant 4.3 3.5 < 0.001
 Caregiver routines 4.2 (1.1) 3.3 (1.1) < 0.001
 Play 4.3 (1.1) 3.3 (1.2) < 0.001
 Verbal communication 5.1 (1.1) 4.3 (1.3) < 0.001
 Nonverbal communication 4.2 (1.3) 3.3 (1.4) < 0.001
 Social interactions 4.2 (1.1) 3.3 (1.2) < 0.001
 Restricted interests 4.3 (1.2) 3.5 (1.2) < 0.001
 Challenging behavior 4 (1.3) 3.1 (1.1) < 0.001
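The pre/post contrasts in Table 6 follow a paired design, since the same rater scores the same child at both time points. A minimal sketch of such a paired-samples test with scipy; the ratings are simulated for illustration and do not reproduce the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical paired pre/post severity ratings for 145 caregivers
pre = rng.normal(3.9, 1.2, size=145)
post = pre - rng.normal(0.7, 0.8, size=145)  # simulated improvement

# Paired-samples t-test: are post-treatment ratings significantly lower?
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```

A positive t statistic here indicates lower (improved) post-treatment ratings, matching the direction of the Table 6 results.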

Caregiver acceptability

Satisfaction surveys were sent to caregivers who completed at least three sessions (n = 280); a total of 133 caregivers (48% return rate) completed satisfaction surveys. Overall, caregivers reported high acceptability, with most respondents selecting “strongly agree” for each question. The majority of caregivers reported that they were pleased with the outcomes for themselves and their child (83% reported “strongly agree”), they would recommend the services to other families (83% reported “strongly agree”), the consultant with whom they worked was knowledgeable about intervention (83% reported “strongly agree”), and the consultant communicated clearly (84% reported “strongly agree”). Caregiver levels of satisfaction did not vary significantly between rural and non-rural participants (see Table 3). BIPOC participants (M (SD) = 3.89 (0.25), t*(62.43) = −2.18, p < 0.05) reported significantly higher levels of satisfaction than White participants, as did Hispanic/Latino participants (M (SD) = 4.00 (0.00), t*(126) = 6.045, p < 0.001) compared to non-Hispanic/Latino participants (see Tables 4 and 5). Group comparisons indicated that caregiver satisfaction did not differ among caregiver income levels (F(7, 120) = 1.312, p = 0.251) or caregiver education levels (F(4, 126) = 0.362, p = 0.835).

EIP acceptability

A total of 104 satisfaction surveys were completed by EIPs regarding their participation in CAPSS. Note that this number exceeds the total number of EIPs (n = 80); EIPs completed one satisfaction survey following each collaboration with a consultant, and some EIPs participated in multiple collaborations. The majority of EIPs reported high levels of satisfaction with consultants’ knowledge about interventions (94% reported “strongly agree”) and understanding of the specific needs of families (94% reported “strongly agree”). EIPs also reported that they were highly satisfied with child and family outcomes following participation (87% reported “strongly agree”). Finally, 92% of EIPs reported that they would recommend services to other caregivers of children with autism or related developmental disabilities/delays. There were no significant differences between groups on the EIP satisfaction survey when comparing EIPs who served rural versus non-rural participants (see Table 3) or White versus BIPOC participants (see Table 4). However, EIPs who served Hispanic/Latino families (M (SD) = 4.00 (0.00)) reported significantly higher levels of satisfaction than those who served non-Hispanic/Latino families (t(83) = 2.363, p < 0.05; see Table 5).

Treatment fidelity

Overall, consultants reported discussing an average of 89.50% (SD = 15.30%) of treatment objectives. On average, 79.60% (SD = 21.50%) of these objectives were achieved. Treatment fidelity did not vary between White and BIPOC participants (see Table 4) or between Hispanic/Latino participants and non-Hispanic/Latino participants (see Table 5). However, rural participants discussed (M (SD) = 90.24 (17.13); t*(116.49) = −2.37, p < 0.05) and achieved (M (SD) = 81.60 (20.94); t*(112.49) = −2.23, p < 0.05) significantly more objectives than non-rural participants (see Table 3). Group comparisons indicated that achieved treatment fidelity did not differ among diagnostic groups (F(3, 176) = 0.440, p = 0.725), caregiver income levels (F*(7, 32.20) = 1.549, p = 0.187), or caregiver education levels (F(4, 172) = 1.712, p = 0.150).

EIP impact

A total of 34 EIPs (43% return rate) completed impact surveys. The majority of EIPs reported overall improvements in knowledge, comfort, and application of evidence-based strategies. EIPs self-reported the highest ratings on items related to the impact of direct collaboration with consultants. Specifically, EIPs reported that collaboration with consultants improved their knowledge of effective caregiver coaching strategies (85% reported “very improved” or “extremely improved”), their comfort engaging in the coaching process with caregivers (80% reported “very improved” or “extremely improved”), and their comfort serving other families who participate in these services (78% reported “very improved” or “extremely improved”). See Table 7 for an overview of EIP perceptions.

Table 7.

Participation impact on early intervention providers (EIPs) as reported by EIPs (n = 33).

Measure % EIP response
Not at all Slightly Somewhat Very Extremely
General knowledge of evidence-based practices (EBPs) 0 0 18 39 38
Comfort teaching caregivers to use/implement EBPs 0 3 24 36 36
Knowledge of effective caregiver coaching strategies 0 6 9 58 30
Comfort engaging in the coaching process with caregivers 0 3 18 61 21

Discussion

This article adds to existing literature (Corona et al., 2021; D’Agostino et al., 2020; Ingersoll et al., 2016; Vismara et al., 2018) indicating that coaching caregivers to use NDBI strategies via telehealth can yield meaningful child and caregiver outcomes. In addition, the results described here suggest that using telehealth technology to partner with community-based providers (i.e. EIPs) may be an effective strategy for increasing access to NDBIs. These data represent findings from the initial phase of a multi-phase project to increase the capacity of EIPs to provide evidence-based, autism-focused services to families.

This service delivery model was deemed feasible and acceptable to both caregivers and EIPs. Furthermore, EIPs consistently reported that their knowledge, comfort, and application of evidence-based practices improved following participation. This is critical both for maintaining the knowledge and skills gained through intervention and for sustaining the model across other families over time.

Preliminary data indicate no significant advantage for privileged groups in child outcomes, caregiver or EIP treatment acceptability, treatment fidelity, or EIP impact. In fact, the only significant group differences favored traditionally underserved groups: (a) higher satisfaction ratings for BIPOC (vs White) and Hispanic/Latino (vs non-Hispanic/Latino) caregivers and (b) higher procedural fidelity scores for rural (vs non-rural) participants. Importantly, the BIPOC and Hispanic/Latino samples were notably smaller than the White and non-Hispanic/Latino samples, so these results should be interpreted cautiously. One plausible explanation for the differences in satisfaction and procedural fidelity is that these populations have historically had lower levels of access to services, which may foster greater appreciation for the services provided and greater dedication to following recommendations. In addition, child outcomes, attendance, and treatment fidelity did not vary by family income or caregiver education level.

Limitations and future directions

Several factors limit the internal validity of this work. First, this article describes a large-scale implementation of telehealth-based caregiver coaching that aimed to bridge the gap between clinical research and typical community service provision. As such, the methodology and data lack the rigor of a clinical research study: the design relied on pre-post and post-only measures, lacked a comparison group and randomization, and included non-equivalent group sizes. Although the results are promising, the observed outcomes must be interpreted cautiously. While child outcomes improved over the course of the program, without a comparison group it is possible that improvements were driven by confounding factors, including child maturation and post-diagnosis caregiver adjustments. Similarly, although caregivers and EIPs indicated high social acceptability of the intervention, we cannot determine how this level of acceptability compares to that of other interventions or business as usual. Direct observational data were not collected on behaviors or specific skills gained by caregivers or EIPs; future iterations of this program should examine these directly.

Second, data were only included for families who completed at least one post-measure and/or had treatment fidelity data available. Data from families who withdrew from the program and/or did not complete final paperwork were excluded, and that group disproportionately included families with lower levels of caregiver education or Hispanic/Latino ethnicity. Third, all outcomes were limited to caregiver and EIP perceptions. We limited data collection measures due to the community-based nature of this program and concerns about over-burdening local providers with data collection. To improve internal validity, future studies might include direct observational measures of intended outcomes along with measures of continued strategy implementation by EIPs.

Fourth, we do not fully understand why satisfaction rates were higher for BIPOC (vs White) and Hispanic/Latino (vs non-Hispanic/Latino) caregivers. Due to the relatively small sample of families from BIPOC and/or Hispanic/Latino backgrounds, the generalizability of this finding is limited. Because of the community-based nature of this program, we accepted all eligible participants who were referred and did not attempt to create comparable groups. Earlier, we discussed decreased access to services as one possible reason for increased satisfaction. It is also possible that culturally based differences in response to the intervention contributed to these differential results. Fifth, the packet of post-intervention measures was sent to the 280 families who completed three or more sessions (86% of total families). Limiting post-intervention measures to these families ensured that we only collected data from families who likely completed enough of the program to observe measurable behavioral differences. However, this prevented us from collecting information from families who dropped out before three sessions; thus, our treatment acceptability estimates may be inflated. Consultants typically sent three to five reminders to complete final paperwork, but no incentives were provided for completion. Future program iterations might incentivize final paperwork completion to increase the likelihood of more complete datasets. They might also send treatment acceptability paperwork to all participants, regardless of the number of sessions completed.

Recommendations for practitioners

Practitioners may consider low-intensity, telehealth-based service models like the one described here to “bridge the gap” between diagnostic evaluations and access to higher-intensity services. Teaching caregivers NDBI strategies to support priority goals in the home may help ameliorate caregiver stress related to service access while simultaneously supporting the child’s development of critical skills in the natural environment. Furthermore, using telehealth to deliver these low-intensity services both reduces costs to providers (e.g. reduced travel time) and increases access for families living in rural and underserved communities.

Practitioners may also consider partnering with their state Part C system when feasible. Teaming and collaboration are recommended practices in the provision of early childhood services (Division for Early Childhood, 2014; “Early Intervention Program for Infants Toddlers with Disabilities,” 2011) and may build the capacity of local service providers, who can in turn provide increased access to NDBIs in rural and under-resourced communities. When consultants and local EIPs collaborate to serve a family, information relevant to the support needs of the family can be shared tri-directionally. Consultants can use online resources and telehealth technology to observe EIPs and families in intervention sessions. Local EIPs often have a greater understanding of the child and family history, as well as local resources and services, and will remain involved with the individualized family service plan (IFSP) team for the duration of the child’s time in Part C programs. This family- and community-specific knowledge enhances the consultants’ understanding of potential opportunities and barriers that may inform recommendations. It will be important for future work to study the determinants affecting implementation and use of this model with all parties (caregivers, EIPs, consultants) in order to adapt the program to best meet the needs of individual communities and systems.

The CAPSS program combines telehealth-based caregiver coaching on NDBIs with capacity-building strategies for EIPs to increase both family access and provider capacity. These results indicate that this telehealth-based approach to disseminating NDBIs is an effective, feasible, and acceptable model for supporting families in underserved communities.

Acknowledgments

The authors wish to acknowledge and thank all consultants, Part C providers, and families who participated in this program.

Footnotes

Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported by funding from the Tennessee Department of Intellectual and Developmental Disabilities through the Tennessee Early Intervention System (Grant no. 70785), and REDCap is funded by grant UL1 TR000445 from NCATS/NIH. The first four authors contributed equally to the development of this article.

References

  1. Akamoglu Y., Meadan H. (2018). Parent-implemented language and communication interventions for children with developmental delays and disabilities: A scoping review. Review Journal of Autism and Developmental Disorders, 5(3), Article 15.
  2. Antezana L., Scarpa A., Valdespino A., Albright J., Richey J. A. (2017). Rural trends in diagnosis and services for autism spectrum disorder. Frontiers in Psychology, 8, Article 590. 10.3389/fpsyg.2017.00590
  3. Busner J., Targum S. D. (2007). The clinical global impressions scale: Applying a research tool in clinical practice. Psychiatry, 4(7), 28–37. https://www.ncbi.nlm.nih.gov/pubmed/20526405
  4. Corona L., Hine J., Nicholson A., Stone C., Swanson A., Wade J., Wagner L., Weitlauf A., Warren Z. (2020). TELE-ASD-PEDS: A telemedicine-based ASD evaluation tool for toddlers and young children. https://vkc.vumc.org/vkc/triad/tele-asd-peds
  5. Corona L. L., Stainbrook J. A., Simcoe K., Wagner L., Fowler B., Weitlauf A. S., Juarez A. P., Warren Z. (2021). Utilization of telemedicine to support caregivers of young children with ASD and their Part C service providers: A comparison of intervention outcomes across three models of service delivery. Journal of Neurodevelopmental Disorders, 13(1), Article 38. 10.1186/s11689-021-09387-w
  6. D’Agostino S., Douglas S. N., Horton E. (2020). Inclusive preschool practitioners’ implementation of naturalistic developmental behavioral intervention using telehealth training. Journal of Autism and Developmental Disorders, 50(3), 864–880. 10.1007/s10803-019-04319-z
  7. D’Agostino S., Dueñas A. D., Bravo A., Tyson K., Straiton D., Salvatore G. L., Pacia C., Pellecchia M. (2023). Toward deeper understanding and wide-scale implementation of naturalistic developmental behavioral interventions. Autism, 27(1), 253–258. 10.1177/13623613221121427
  8. Division for Early Childhood. (2014). DEC recommended practices in early intervention/early childhood special education 2014. https://www.dec-sped.org/dec-recommended-practices
  9. Drahota A., Sadler R., Hippensteel C., Ingersoll B., Bishop L. (2020). Service deserts and service oases: Utilizing geographic information systems to evaluate service availability for individuals with autism spectrum disorder. Autism, 24(8), 2008–2020. 10.1177/1362361320931265
  10. Early Intervention Program for Infants Toddlers with Disabilities: Final Regulations #2011-22783. (2011). Federal Register: The Daily Journal of the United States Government.
  11. Ellison K. S., Guidry J., Picou P., Adenuga P., Ellison K. S. (2021). Telehealth and autism prior to and in the age of Covid-19: A systematic and critical review of the last decade. Clinical Child and Family Psychology Review, 24(2), 599–630. 10.1007/s10567-021-00358-0
  12. Fenson L., Dale P. S., Reznick J. S., Thal D., Bates E., Hartung J. P., Pethick S., Reilly J. S. (1993). The MacArthur Communicative Development Inventories: User’s guide and technical manual. Singular Publishing Group.
  13. Hampton L. H., Kaiser A. P., Fuller E. A. (2020). Multi-component communication intervention for children with autism: A randomized controlled trial. Autism, 24(8), 2104–2116. 10.1177/1362361320934558
  14. Hancock T. B., Ledbetter-Cho K., Howell A., Lang R. (2016). Enhanced Milieu Teaching. In Lang R., Hancock T., Singh N. (Eds.), Early intervention for young children with autism spectrum disorder: Evidence-based practices in behavioral health (pp. 177–218). Springer. 10.1007/978-3-319-30925-5_7
  15. Harris P. A., Taylor R., Thielke R., Payne J., Gonzalez N., Conde J. G. (2009). Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42(2), 377–381. 10.1016/j.jbi.2008.08.010
  16. Harris P. A., Taylor R., Minor B. L., Elliott V., Fernandez M., O’Neal L., McLeod L., Delacqua G., Delacqua F., Kirby J., Duda S. N., & REDCap Consortium (2019). The REDCap consortium: Building an international community of software platform partners. Journal of Biomedical Informatics, 95, 103208. 10.1016/j.jbi.2019.103208
  17. Health Resources and Services Administration. (2023, November). Rural health grants eligibility analyzer. https://data.hrsa.gov/tools/rural-health
  18. Heidlage J. K., Cunningham J. E., Kaiser A. P., Trivette C. M., Barton E. E., Frey J. R., Roberts M. Y. (2019). The effects of parent-implemented language interventions on child linguistic outcomes: A meta-analysis. Early Childhood Research Quarterly, 50, Article 17. 10.1016/j.ecresq.2018.12.006
  19. Hendrix N., Chatson E., Davies H., Demetri B., Xiang Y., Yohannes M., Buck A., Harper S., Stapel-Wax J., Pickard K. (2023). Early intervention provider-reported NDBI use and relationships with provider- to system-level implementation determinants. Journal of Autism and Developmental Disorders. Advance online publication. 10.1007/s10803-023-06203-3
  20. Ingersoll B., Wainer A. L., Berger N. I., Pickard K. E., Bonter N. (2016). Comparison of a self-directed and therapist-assisted telehealth parent-mediated intervention for children with ASD: A pilot RCT. Journal of Autism and Developmental Disorders, 46(7), 2275–2284.
  21. Juarez A. P., Weitlauf A. S., Nicholson A., Pasternak A., Broderick N., Hine J., Stainbrook J. A., Warren Z. (2018). Early identification of ASD through telemedicine: Potential value for underserved populations. Journal of Autism and Developmental Disorders, 48(8), 2601–2610. 10.1007/s10803-018-3524-y
  22. Kasari C., Siller M., Huynh L. N., Shih W., Swanson M., Hellemann G. S., Sugar C. A. (2014). Randomized controlled trial of parental responsiveness intervention for toddlers at high risk for autism. Infant Behavior and Development, 37(4), 711–721. 10.1016/j.infbeh.2014.08.007
  23. Kashinath S., Woods J., Goldstein H. (2006). Enhancing generalized teaching strategy use in daily routines by parents of children with autism. Journal of Speech, Language, and Hearing Research, 49(3), 466–485.
  24. Knutsen J., Wolfe A., Burke B. L., Hepburn S., Lindgren S., Coury D. (2016). A systematic review of telemedicine in autism spectrum disorders. Review Journal of Autism and Developmental Disorders, 3, 330–344.
  25. Kyzer K. B., Chiu C., Kemp P., Aldersey H. M., Turnbull A. P., Lindeman D. P. (2014). Feasibility of an online professional development program for early intervention practitioners. Infants and Young Children, 27(2), 174–191.
  26. Lindgren S., Wacker D., Suess A., Schieltz K., Pelzel K., Kopelman T., Lee J., Romani P., Waldron D. (2016). Telehealth and autism: Treating challenging behavior at lower cost. Pediatrics, 137(Supplement), S167–S175. 10.1542/peds.2015-2851o
  27. Lundahl B., Risser H. J., Lovejoy M. C. (2006). A meta-analysis of parent training: Moderators and follow-up effects. Clinical Psychology Review, 26(1), 86–104.
  28. Maenner M. J., Shaw K. A., Baio J., Washington A., Patrick M., Dirienzo M., Christensen D. L., Wiggins L. D., Pettygrove S., Andrews J. G., Lopez M., Hudson A., Baroud T., Schwenk Y., White T., Rosenberg C. R., Lee L.-C., Harrington R. A., Huston M., . . . Dietz P. M. (2020). Prevalence of autism spectrum disorder among children aged 8 years: Autism and Developmental Disabilities Monitoring Network, 11 sites, United States, 2016. Morbidity and Mortality Weekly Report Surveillance Summaries, 69(4), 1–12. 10.15585/mmwr.ss6904a1
  29. Maenner M. J., Shaw K. A., Bakian A. V., Bilder D. A., Durkin M. S., Esler A., Furnier S. M., Hallas L., Hall-Lande J., Hudson A., Hughes M. M., Patrick M., Pierce K., Poynter J. N., Salinas A., Shenouda J., Vehorn A., Warren Z., Constantino J. N., . . . Cogswell M. E. (2021). Prevalence and characteristics of autism spectrum disorder among children aged 8 years: Autism and Developmental Disabilities Monitoring Network, 11 sites, United States, 2018. Morbidity and Mortality Weekly Report Surveillance Summaries, 70(11), 1–16. 10.15585/mmwr.ss7011a1
  30. McDuffie A. M. W., Oakes A., Haebig E., Weismer S. E., Abbeduto L. (2013). Distance video-teleconferencing in early intervention: Pilot study of a naturalistic parent-implemented language intervention. Topics in Early Childhood Special Education, 33(3), 172–185. 10.1177/0271121413476348
  31. Mello M. P., Goldman S. E., Urbano R. C., Hodapp R. M. (2016). Services for children with autism spectrum disorder: Comparing rural and non-rural communities. Education and Training in Autism and Developmental Disabilities, 51(4), 355–365.
  32. Moore H. W., Barton E. E., Chironis M. (2014). A program for improving toddler communication through parent coaching. Topics in Early Childhood Special Education, 33(4), 10. 10.1177/0271121413497520
  33. Olsen S., Fiechtl B., Rule S. (2012). An evaluation of virtual home visits in early intervention: Feasibility of “virtual intervention.” The Volta Review, 112, 267–281.
  34. Rogers S., Estes A., Vismara L., Munson J., Zierhut C., Greenson J., Dawson G., Rocha M., Sugar C., Senturk D., Whelan F., Talbott M. (2019). Enhancing low-intensity coaching in parent implemented early start Denver model intervention for early autism: A randomized comparison treatment trial. Journal of Autism and Developmental Disorders, 49(2), 632–646. 10.1007/s10803-018-3740-5
  35. Rogers S. J., Dawson G., Vismara L. A. (2012). An early start for your child with autism: Using everyday activities to help kids connect, communicate, and learn. Guilford Press.
  36. Rogers S. J., Stahmer A., Talbott M., Young G., Fuller E., Pellecchia M., Barber A., Griffin E. (2022). Feasibility of delivering parent-implemented NDBI interventions in low-resource regions: A pilot randomized controlled study. Journal of Neurodevelopmental Disorders, 14(3). 10.1186/s11689-021-09410-0
  37. Schreibman L., Dawson G., Stahmer A. C., Landa R., Rogers S. J., McGee G. G., Kasari C., Ingersoll B., Kaiser A. P., Bruinsma Y., McNerney E., Wetherby A., Halladay A. (2015). Naturalistic developmental behavioral interventions: Empirically validated treatments for autism spectrum disorder. Journal of Autism and Developmental Disorders, 45(8), 2411–2428. 10.1007/s10803-015-2407-8
  38. Shaver J. (2022). The state of telehealth before and after the COVID-19 pandemic. Primary Care: Clinics in Office Practice, 49(4), 517–530. 10.1016/j.pop.2022.04.002
  39. Shaw K. A., Bilder D. A., McArthur D., Williams A. R., Amoakohene E., Bakian A. V., Durkin M. S., Fitzgerald R. T., Furnier S. M., Hughes M. M., Pas E. T., Salinas A., Warren Z., Williams S., Esler A., Grzybowski A., Ladd-Acosta C. M., Patrick M., Zahorodny W., Maenner M. J. (2023). Early identification of autism spectrum disorder among children aged 4 years: Autism and Developmental Disabilities Monitoring Network, 11 sites, United States, 2020. Morbidity and Mortality Weekly Report Surveillance Summaries, 72(1), 1–15. 10.15585/mmwr.ss7201a1
  40. Spiker D., Hebbeler K., Wagner M., Cameto R., McKenna P. (2000). A framework for describing variations in state early intervention systems. Topics in Early Childhood Special Education, 20(4), 195–207.
  41. Stahmer A. C., Collings N. M., Palinkas L. A. (2005). Early intervention practices for children with autism: Descriptions from community providers. Focus on Autism and Other Developmental Disabilities, 20(2), 66–79. 10.1177/10883576050200020301
  42. Stahmer A. C., Rieth S. R., Dickson K. S., Feder J., Burgeson M., Searcy K., Brookman-Frazee L. (2020). Project ImPACT for Toddlers: Pilot outcomes of a community adaptation of an intervention for autism risk. Autism, 24(3), 617–632. 10.1177/1362361319878080
  43. Vismara L. A., McCormick C. E., Wagner A. L., Monlux K., Nadhan A., Young G. S. (2018). Telehealth parent training in the Early Start Denver Model: Results from a randomized controlled study. Focus on Autism and Other Developmental Disabilities, 33(2), 67–79.
  44. Vivanti G. Z., Zhong H. N. (2020). Naturalistic developmental behavioral interventions for children with autism. In Vivanti G., Bottema-Beutel K., Turner-Brown L. (Eds.), Clinical guide to early interventions for children with autism: Best practices in child and adolescent behavioral health care (pp. 93–130). Springer. 10.1007/978-3-030-41160-2_6
  45. Wacker D. P., Lee J. F., Dalmau Y. C. P., Kopelman T. G., Lindgren S. D., Kuhle J., Pelzel K. E., Waldron D. B. (2013). Conducting functional analyses of problem behavior via telehealth. Journal of Applied Behavior Analysis, 46(1), 31–46. 10.1002/jaba.29
  46. Wainer A. L., Pickard K., Ingersoll B. R. (2017). Using Web-based instruction, brief workshops, and remote consultation to teach community-based providers a parent-mediated intervention. Journal of Child and Family Studies, 26, 1592–1602.
  47. Wallace-Watkin C., Sigafoos J., Waddington H. (2022). Barriers and facilitators for obtaining support services among underserved families with an autistic child: A systematic qualitative review. Autism, 27(3), 588–601. 10.1177/13623613221123712
  48. Wetherby A. M., Prizant B. M. (2003). Communication and Symbolic Behavior Scales (Normed ed.). Paul A. Brookes.
  49. Wetherby A. M., Woods J., Guthrie W., Delehanty A., Brown J. A., Morgan L., Holland R. D., Schatschneider C., Lord C. (2018). Changing developmental trajectories of toddlers with autism spectrum disorder: Strategies for bridging research to community practice. Journal of Speech, Language, and Hearing Research, 61(11), 2615–2628. 10.1044/2018_JSLHR-L-RSAUT-18-0028
  50. Woods J. (2021). FGRBI key indicators manual (6th ed.). http://fgrbi.com/
  51. Yingling M. E., Ruther M. H., Dubuque E. M., Mandell D. S. (2021). County-level variation in geographic access to Board Certified Behavior Analysts among children with Autism Spectrum Disorder in the United States. Autism, 25(6), 1734–1745. 10.1177/13623613211002051
  52. Zwaigenbaum L., Warren Z. (2020). Commentary: Embracing innovation is necessary to improve assessment and care for individuals with ASD: A reflection on Kanne and Bishop (2020). Journal of Child Psychology and Psychiatry, 62(2), 143–145. 10.1111/jcpp.13271

Articles from Autism are provided here courtesy of SAGE Publications
