Author manuscript; available in PMC: 2023 Mar 1.
Published in final edited form as: Acad Pediatr. 2021 Apr 23;22(2):263–270. doi: 10.1016/j.acap.2021.04.017

Reducing Barriers to Autism Screening in Community Primary Care: A Pragmatic Trial using Web-Based Screening

Kyle J Steinman a,b,c,d,#, Wendy L Stone e,#, Lisa V Ibañez e, Shana M Attar e
PMCID: PMC8536796  NIHMSID: NIHMS1696836  PMID: 33901728

Abstract

Objective:

To determine whether an intervention addressing both logistical and knowledge barriers to early screening for autism spectrum disorder (ASD) increases evidence-based screening during 18-month well-child visits and PCPs’ perceived self-efficacy in caring for children with ASD.

Methods:

Forty-six primary care providers (PCPs) from 10 diverse practices across four counties in Washington State participated. PCPs attended a two-hour training workshop on early recognition and care for toddlers with ASD and use of a REDCap-based version of the Modified Checklist for Autism in Toddlers–Revised with Follow-up (webM-CHAT-R/F) that provided automated presentation and scoring of follow-up questions. Data were collected at baseline and six months following each county’s training window. PCPs’ screening methods, rates, and perceived self-efficacy regarding ASD care were measured by self-report, and webM-CHAT-R/F use was measured via REDCap records.

Results:

At follow-up, 8 of the 10 practices were using the webM-CHAT-R/F routinely at 18-month visits. The proportion of PCPs reporting routine M-CHAT screening increased from 82% at baseline to 98% at follow-up (16% increase, 95% CI 3%–28%; McNemar exact p=.02). The proportion using the M-CHAT-R/F follow-up interview questions increased from 33% to 82% (49% increase, 95% CI 30%–68%; McNemar exact p<.001). Significant increases in self-efficacy were found for all seven areas assessed (ps≤.008).

Conclusions:

This brief intervention increased PCPs’ self-reported valid use of the M-CHAT-R/F at 18 months and their self-efficacy regarding ASD care. Combining educational information with a web-based ASD screen incorporating the M-CHAT-R/F follow-up questions may increase universal ASD screening with improved fidelity.

Keywords: autism spectrum disorder, screening, primary care


The increasing prevalence of autism spectrum disorder (ASD)1 and the demonstrated effectiveness of early, specialized intervention for toddlers with ASD2–5 have led the American Academy of Pediatrics (AAP) to recommend universal ASD screening at 18 and 24 months6. However, the routine use of evidence-based ASD screening tools by primary care providers (PCPs) is impeded by several obstacles, including limited time during well-child visits, limited knowledge about early ASD symptoms, and discomfort communicating ASD concerns to families7–10.

The Modified Checklist for Autism in Toddlers (M-CHAT) is one of the most widely used ASD screening tools11,12. Its current iteration, the Modified Checklist for Autism in Toddlers-Revised with Follow-Up (M-CHAT-R/F), includes a follow-up interview that is critical for reducing false positives13. Unfortunately, most PCPs omit the follow-up interview questions, due to inadequate time and/or personnel to complete them or simply a lack of awareness of this critical step12,14. In addition, scoring errors have been associated with use of the paper-based format of the M-CHAT-R/F15. Previous studies have addressed accuracy, time, personnel, and follow-up protocol awareness barriers by developing digital versions of the M-CHAT-R/F that provide automated presentation and scoring of the follow-up questions15–18. However, these studies either took place within a single setting or provider network15–17, employed a prescribed online delivery system workflow set by the researcher18, or involved additional costs (i.e., a fee-for-use product and the need for PCPs to conduct follow-up questions during visit time18), which limit their generalizability to more diverse “usual care” practice conditions.

The present study employed a pragmatic trial approach19 to assess the effectiveness of an intervention for increasing the use of evidence-based ASD screening (i.e., adherence to AAP universal screening guidelines at 18-month-old well-child checks and employing the M-CHAT-R/F follow-up questions when indicated). PCPs were enrolled from diverse practices and the manner in which the intervention was implemented was free to vary across sites.20 In addition, our approach to improving early ASD screening was broader than in previous studies, as it addressed potential knowledge gaps as well as barriers in communicating with families. Our intervention combined access to an open-source, web-based version of the M-CHAT-R/F with automated scoring and implementation of the follow-up questions (i.e., the webM-CHAT-R/F), along with a two-hour office-based workshop on early behavioral characteristics of ASD, strategies for communicating with parents about ASD concerns, and local referrals and resources.

Methods

Study Approach:

This study was part of a broader, interrupted time series pragmatic study designed to increase ASD screening and expedite specialized intervention for toddlers in underserved (rural and/or geographically-isolated) communities in Washington State21. Four counties with at least one interested PCP practice were included. Counties represented the East, Central, and Western parts of the state and diverse demographics (e.g., the proportion of Hispanic households ranged from 6% to 50%, and population density ranged from 31 to 267 people per square mile22). Ten PCP practices participated. Within each practice there was a physician who served as a point person for ongoing communication about the project. These were all practicing physicians who maintained active caseloads. Because this was a pragmatic trial, and we were interested in the feasibility of implementation of the webM-CHAT-R/F for “usual practice” settings, the point people received no financial compensation. The study was approved by the University of Washington’s Institutional Review Board.

Consistent with a community participatory research framework23, research staff met with PCP practices prior to study initiation to discuss common goals and obtain input regarding study procedures and workshop content. PCP recruitment/enrollment occurred from July 2015-July 2017. PCPs completed baseline measures upon enrollment. Counties were randomly assigned to the timing of their training workshops using a stepped wedge approach with four consecutive three-month blocks. This randomization established an interrupted time-series design for examining providers’ screening practices and perceived self-efficacy in caring for toddlers with ASD before and after the “interruption” (i.e., the training workshops). Within each three-month training window, individual practices within a county received a two-hour workshop followed by a technical assistance visit. Follow-up data from providers at each practice were collected six months after their county’s training window ended and occurred from October 2017-August 2018.

WebM-CHAT-R/F:

The M-CHAT-R/F is a well-validated, open-source screening tool designed for 16-30-month-olds13 that was selected for use due to its strong evidence base and psychometric properties24. It comprises two stages: a 20-item parent-report checklist that yields an initial low-, medium-, or high-risk score, and a structured follow-up interview, traditionally administered by the clinician, that clarifies parent responses for initial medium-risk scores and converts them to a final low-risk (screen negative) or high-risk (screen positive) result. The webM-CHAT-R/F provides automated scoring of the initial 20 items and automated presentation and scoring of the follow-up questions, as appropriate, without the need for administration by the clinician. The webM-CHAT-R/F was developed for this study using the REDCap (Research Electronic Data Capture) platform25 and a tablet interface. REDCap is a HIPAA-compliant, open-source application for building and managing online surveys and databases. Enrolled providers were able to access information only for those patients in their practice. The REDCap platform is hosted by the University of Washington and can be accessed for free by partnering institutions.
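The two-stage decision logic that the webM-CHAT-R/F automates can be sketched in a few lines. This is an illustrative sketch only: the risk bands (low <3 failed items, medium 3–7, high ≥8) are those reported in this study's Results, the follow-up cutoff of two or more items still failed is taken from the published M-CHAT-R/F scoring guidelines rather than from this paper, and the function names are ours.

```python
def initial_risk(failed_items):
    """Band an initial M-CHAT-R/F failed-item count into a risk category.

    Thresholds (low <3, medium 3-7, high >=8) match those reported in
    this study's Results section.
    """
    if failed_items <= 2:
        return "low"
    if failed_items <= 7:
        return "medium"
    return "high"


def final_result(failed_items, followup_failed=None):
    """Two-stage screen: only initial medium-risk scores trigger follow-up.

    The follow-up cutoff (>=2 items still failed -> screen positive) is an
    assumption based on the published M-CHAT-R/F scoring guidelines; it is
    not a detail reported in this paper.
    """
    band = initial_risk(failed_items)
    if band == "low":
        return "screen negative"
    if band == "high":
        return "screen positive"  # follow-up questions are skipped
    # Medium risk: the follow-up interview decides the final result.
    if followup_failed is None:
        raise ValueError("medium-risk score requires follow-up responses")
    return "screen positive" if followup_failed >= 2 else "screen negative"
```

For example, a child failing 5 initial items but only 1 on follow-up finishes as a screen negative, mirroring the conversion of initial medium-risk scores described above.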

Parents completed the webM-CHAT-R/F on a tablet in the waiting area, and the final screen displayed a color-coded banner that indicated the results (e.g., yellow if negative or purple if positive). PCPs and medical staff could log into the REDCap database to obtain specific information about the failed items. The webM-CHAT-R/F was programmed in both English and Spanish. Specific features of the tablet interface were customized to each practice as requested (e.g., adding text-to-speech capabilities for families with low literacy; customizing the color of the final screen).

Training workshops:

A two-hour training workshop, conducted by authors WLS and LVI, was held at each PCP practice. Attendance was open to all PCPs and practice staff. In addition to introducing the webM-CHAT-R/F, workshop content addressed several challenges described in the literature7–9,26 and those mentioned by PCPs during preliminary meetings: the early behavioral features of ASD, the importance of universal ASD screening at 18 months, the importance of using the M-CHAT-R/F follow-up questions to reduce false positives, strategies for talking to families about ASD concerns, and local resources for toddlers with ASD. Workshops included video examples of “red flag” behaviors in toddlers with ASD and group discussion, as well as shared decision-making materials for use with families. Providers were advised which billing code to use for reimbursement (CPT® code 96110). Continuing Medical Education (CME) and Maintenance of Certification (MOC) credits were available. After the workshop, research staff provided a technical assistance visit to create logins and provide instructions for accessing REDCap records. Additional materials and supplies were provided as needed (e.g., Wi-Fi hotspots, rolling carts). Research staff were available throughout the project for technical assistance but did not directly oversee or incentivize webM-CHAT-R/F use.

Data Collection:

Confidential surveys were administered at baseline (upon enrollment, before training) and follow-up (6 months after training). Demographic information collected from PCPs comprised age, gender, years in practice, professional title (physician, physician assistant, or nurse practitioner), and provider type (Pediatrics or Family Practice). Primary outcomes were adoption rates and feasibility of the webM-CHAT-R/F and PCPs’ perceived self-efficacy regarding their clinical care for toddlers with ASD. Adoption of the webM-CHAT-R/F was measured using both self-report and objective data. Self-reports at baseline and follow-up compared: (1) the percent of 18-month visits in which the M-CHAT (any version) was used; and (2) the frequency with which the M-CHAT-R/F was used, with administration of the follow-up questions when indicated. REDCap records provided an objective measure of the number of webM-CHAT-R/Fs conducted within each practice at 18-month visits (16-20 months, inclusive; Figure 1) and allowed for calculation of aggregate screening results. Feasibility was measured via self-report, using items adapted from the Usage Rating Profile-Intervention Revised (URP-IR)27, a well-validated measure for assessing key aspects of implementation. PCPs rated five statements (Table 1) on a scale from Strongly Agree to Strongly Disagree. Perceived self-efficacy regarding clinical care for toddlers with ASD concerns was measured via self-report at baseline and follow-up using seven items developed for this study (Figure 2). PCPs rated their level of knowledge, comfort, or confidence regarding early identification, screening, and referral via options ranging from Not at all to Extremely.

Figure 1. Run charts for webM-CHAT-R/F use by county and practice for the first 6 months post-training.


Each line represents an individual practice. Month 0 (0m) represents the end of the training window for each county. See Results for characterization of run chart curves.

Table 1.

Feasibility of the webM-CHAT-R/F

Item | Strongly Agree n (%) | Agree n (%) | Disagree n (%) | Strongly Disagree n (%) | Total n^a
I have the resources I need to implement the online M-CHAT-R/F with my patients. | 19 (42) | 17 (38) | 6 (13) | 3 (7) | 45
The online M-CHAT-R/F fits well with my current practices. | 20 (44) | 13 (29) | 8 (18) | 4 (9) | 45
The online M-CHAT-R/F is well matched to what is expected in my profession. | 23 (52) | 14 (32) | 5 (11) | 2 (5) | 44
The materials needed to implement the online M-CHAT-R/F are reasonable. | 16 (37) | 20 (47) | 5 (12) | 2 (5) | 43
Use of the M-CHAT-R/F is consistent with the mission of my work setting. | 22 (50) | 20 (45) | 1 (2) | 1 (2) | 44

^a Total n<46 due to participants not providing a response to that question.

Figure 2. Changes in perceived self-efficacy.


Shading within each bar represents the number of PCPs rating their degree of self-efficacy at the baseline (BL) and 6-month follow-up (FU) period. Wilcoxon signed-rank test, p-values listed above each item. Survey Questions: (a) How knowledgeable do you feel about recognizing the signs of autism in children under 2 years old?; (b) How knowledgeable do you feel about selecting and using autism screening tools for children under 2 years old?; (c) How confident do you feel in the results of the autism-specific screening tools?; (d) How comfortable do you feel discussing autism concerns with parents of children under 2 years old?; (e) How comfortable do you feel making a referral for assessment and intervention services for children under 2 years old with autism concerns?; (f) How knowledgeable do you feel about the diagnostic assessment resources in your community for children under 2 years old?; (g) How knowledgeable do you feel about the early intervention resources in your community for children under 2 years old? *p < .01; **p < .001. ^a n=44 for BL; 2 providers provided no response at BL; statistical analysis compared BL to FU in 44 subjects.

Secondary outcomes comprised self-report of general obstacles to screening and specific questions about helpful and challenging aspects of the webM-CHAT-R/F. Perceived obstacles to screening were measured at baseline and follow-up via five common barriers described in previous research: time, staffing, familiarity with tools, comfort talking to families about ASD, and availability of community resources8,9 (Figure 3). PCPs rated each item as a Major obstacle, Minor obstacle, or Not an obstacle. Helpful and challenging aspects of webM-CHAT-R/F use were assessed with two open-ended questions. Two raters (WLS and LVI) independently categorized the responses and then used consensus coding to identify themes emerging from each question.

Figure 3. Changes in perceived obstacles to screening.


Shading within each bar represents the number of PCPs rating each potential obstacle to screening at the baseline (BL) and 6-month follow-up (FU) period. Wilcoxon signed-rank test, *p < .05; **p < .01. ^a n=45 for FU; 1 provider provided no response at FU; statistical analysis compared BL to FU in 45 subjects.

Statistical Analyses:

Data were analyzed at the level of the provider using Stata/MP 15.1. Data from all PCPs with completed surveys at both time points were included in analyses, regardless of workshop attendance. Demographic variables for PCPs completing surveys only at baseline versus at both time points were compared using the Mann-Whitney test or Fisher’s exact test as appropriate. Changes over time were compared using the Wilcoxon signed-rank test (ordinal data) and the McNemar exact test (nominal data). Hypothesis tests were 2-sided with the threshold for significance set at p<.05.
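The McNemar exact test used for the nominal outcomes reduces to a two-sided binomial test on the discordant pairs, which is straightforward to reproduce. Below is a minimal sketch using only the Python standard library; the discordant-pair counts are ours for illustration, as the study's raw paired data are not reported.

```python
from math import comb

def mcnemar_exact(b, c):
    """Two-sided exact McNemar test on discordant pair counts.

    b = pairs that changed no->yes, c = pairs that changed yes->no.
    Under the null hypothesis each discordant pair flips either way with
    probability 0.5, so the exact p-value is a two-sided binomial tail.
    """
    n = b + c
    k = min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(p, 1.0)

# Hypothetical example (counts are ours, not the study's): 8 PCPs switched
# from not screening to screening, 1 switched the other way.
print(mcnemar_exact(8, 1))  # 0.0390625
```

The ordinal comparisons used the Wilcoxon signed-rank test, available in Python as `scipy.stats.wilcoxon` for readers without Stata.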

Results

Of the 59 PCPs who enrolled in the study, 2 were missing baseline data and 11 were missing follow-up data, resulting in a final sample of 46. The number of PCPs enrolled per practice ranged from 2–19. PCPs had a median age of 42 years (range: 28–64) and a median of 11 years in practice (range: <1–35). The majority were physicians (78%) and were female (78%). Forty were Pediatrics providers and six were Family Practice providers. The training workshop was attended by 42 of 46 providers; workshops were also attended by medical assistants, office managers, and behavioral health consultants. There were no significant differences (all ps≥.08; Fisher’s exact test) on any outcome measure between providers who attended the training workshop and those who did not. There were no significant differences between the PCPs who completed surveys at both time points vs. only at baseline for age (median 42 vs. 55 years; p=.41), gender (female 78% vs. 64%; p=.44), years in practice (median 9 vs. 11 years; p=.54), or professional title (45% physician, 9% physician assistant, 45% nurse practitioner vs. 78% physician, 7% physician assistant, 15% nurse practitioner; p=.07). The median time between the end of the training window and completion of the six-month follow-up surveys ranged from 5.7 to 7.1 months across the counties.

Adoption.

The proportion of PCPs who reported using either the M-CHAT or M-CHAT-R/F regularly during 18-month well-child exams (i.e., at least 75% of the time) increased from 82% at baseline to 98% at follow-up (16% increase, 95% CI 3%–28%; McNemar exact p=.02; n=45, one participant did not respond at baseline). The proportion using the M-CHAT-R/F including the follow-up questions for initial medium-risk scores increased from 33% at baseline to 82% at follow-up (49% increase, 95% CI 30%–68%, McNemar exact p<.001; n=45, one participant did not respond at baseline). Across all counties, the webM-CHAT-R/F was used for 659 children at 18-month visits, 12% of which were completed in Spanish. Though not a focus of this study, this tool was also used for 366 children at 24-month visits (8% in Spanish).

WebM-CHAT-R/F run charts (Figure 1) illustrate the cumulative number of 18-month screenings conducted within each practice (range 1–262), as well as their latencies to initiating use of the webM-CHAT-R/F following the training window. Practices began implementing the webM-CHAT-R/F at different intervals after the training. Note that four practices (two in County A and two in County B) began using the webM-CHAT-R/F before their county’s training window ended. Eight of the 10 practices had non-flat curves, demonstrating continued use of the webM-CHAT-R/F from the time they began implementation until the end of 6 months. The practice in County A represented by the long-dashed line shows a plateauing of use when the practice experienced technical issues including Wi-Fi connectivity problems, followed by resumption of use when these were resolved. While the long-dashed line in County B does show a flattening of the slope during the last month, 4 administrations did occur during this last interval, reflecting month-to-month variability. Less steep (but still upward) slopes reflect smaller practices and/or fewer enrolled providers (e.g., County B, short-dashed line, 1 provider) and a practice with all Family Practice providers, who see far fewer young children (County B, long-dashed line). One practice (dotted line in County A) never adopted the webM-CHAT-R/F, and one (dotted line in County B) used it in the first month, stopped when they moved offices, and subsequently returned to their prior method, which used behavioral consultants to administer the follow-up interview.

Screening results.

Of the 659 screened, 604 children (91.7%) scored low-risk (less than 3) on the first screening stage, 48 (7.3%) scored at medium-risk (3-7) on the first stage and therefore received at least one follow-up question, and 7 (1.1%) scored high-risk (8 or higher) on the first stage and therefore did not receive any follow-up questions. Of the 48 who scored initial medium-risk, 48% (n=23) changed to low-risk after the follow-up questions were administered, while 52% (n=25) received high-risk as a final designation. Thus, 4.9% of this sample (32 of 659) screened positive.
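The screen-positive rate reported here can be reproduced directly from the counts in this paragraph; a quick arithmetic check (variable names are ours):

```python
# Counts reported for the 659 children screened at 18-month visits.
screened = 659
initial_low, initial_medium, initial_high = 604, 48, 7
medium_to_high = 25  # initial medium-risk children confirmed high-risk by follow-up

# The three first-stage bands account for every child screened.
assert initial_low + initial_medium + initial_high == screened

# Final screen positives = initial high-risk + follow-up-confirmed high-risk.
positives = initial_high + medium_to_high
rate = round(100 * positives / screened, 1)
print(positives, rate)  # 32 4.9
```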

Feasibility.

At follow-up, the majority of PCPs (73%-95%) rated webM-CHAT-R/F feasibility items as either Agree or Strongly Agree (Table 1).

Perceived self-efficacy.

Significant improvements in PCPs’ self-efficacy in caring for young children with ASD were found for all seven areas assessed (Figure 2): (a) Wilcoxon signed-rank p<.001, (b) p<.001, (c; n=44, 2 did not respond to this item) p<.001, (d) p<.001, (e) p=.008, (f) p=.002, and (g) p<.001.

Obstacles to screening.

Significant decreases in obstacle ratings were found at follow-up for four of the five items (Figure 3): familiarity with screening tools (Wilcoxon signed-rank p=.007); time during well-child appointments to conduct screening (p=.046); staff to conduct screening (p=.024); and comfort talking to parents about ASD concerns (p=.001). No changes were found for lack of community resources for diagnosis and treatment (p=.16).

Helpful and challenging aspects of webM-CHAT-R/F use.

The themes and number of endorsements for each are described below for the 36 PCPs who responded to the open-ended questions. Helpful aspects fell into four major themes: the automated scoring, display, and interpretation of results (n=10); the automated presentation of the follow-up questions (n=9); parent satisfaction and usability (n=9); and general ease and/or speed of use (n=7). Challenging aspects included 5 themes: workflow and time challenges (n=9); problems with online technology (n=8); difficulty reviewing specific failed items (n=6); lengthy initial start-up and staff training (n=4); and lack of integration with electronic medical records (EMR) systems (n=2).

Discussion

Previous research has indicated that formal screening is more effective than clinical judgement alone in identifying ASD28,29. This study employed a pragmatic approach to effectively increase evidence-based ASD screening at 18 months, increase fidelity of use of the M-CHAT-R/F (i.e., administering the follow-up interview questions as recommended), and increase PCP self-efficacy regarding ASD care across “usual practice” settings. Provider-reported routine screening for ASD increased from 82% at baseline to 98% at follow-up. At baseline, only 33% of enrolled PCPs reported using the M-CHAT-R/F follow-up interview questions, which are critical for reducing false positive results13 and preventing unnecessary referrals to an already-overburdened service delivery system. At the six-month follow-up, this percentage had risen to 82%, demonstrating that the webM-CHAT-R/F can improve PCPs’ fidelity in using the most well-validated ASD screening tool across diverse settings and practice-determined workflows.

PCPs provided high ratings for the feasibility of the webM-CHAT-R/F and its fit with their mission, practices, and resources. Although we targeted 18-month visits only, REDCap records indicated that the webM-CHAT-R/F was also used at 24-month visits. In addition, PCPs reported decreases from baseline to follow-up in the extent to which they perceived time, staffing, communication, and familiarity with available tools as obstacles to screening. Not surprisingly, no significant changes in the availability of community resources for diagnostic or treatment services were reported, highlighting the well-recognized access issues faced by underserved communities such as these30–33.

There was considerable variability in the length of time it took practices to initiate webM-CHAT-R/F use. Four practices began implementation immediately after their training was completed, while two practices took three months or longer to initiate implementation. One practice attributed their delay in using the webM-CHAT-R/F to the considerable time needed to discuss and develop their new workflow, assess their staff capacity, and implement their plan, thus highlighting the importance of having a clear blueprint before engaging in quality improvement efforts within healthcare systems (e.g., Plan-Do-Study-Act Cycle34). Informal communication with practices revealed that webM-CHAT-R/F implementation was also influenced by staff turnover, office moves, and changes in EMR systems.

While use of the webM-CHAT-R/F appeared to be feasible for the majority of practices, the two practices that discontinued use provide important information for future implementation efforts. One practice had access to behavioral health consultants who had been administering the follow-up interview in person to parents for positive initial paper screens prior to study initiation; after a brief trial, they preferred to revert to their original system. Notably, this practice had not been conducting ASD screening at all prior to our introductory meeting to describe the study, pointing to the potential impact of community-research partnerships. The second practice comprised two PCPs within a larger practice who initially chose to use the webM-CHAT-R/F only for children with positive initial (paper) screens to avoid practice-wide workflow disruption; however, they found that positive screens occurred too rarely for them to retain this separate protocol.

The overall screen positive rate of the webM-CHAT-R/F in this sample (4.9%) was higher than rates found in prior studies of the M-CHAT-R/F (1.9%–3%)13,15 and warrants further study. Our screen positive (i.e., medium- or high-risk) rate on the first stage of the webM-CHAT-R/F (8%) was similar to those found in previous studies (7%)13,35. However, in our study a larger proportion (52%) of those receiving the follow-up questions continued to screen at high risk relative to other studies, which ranged from 30%–35%13,15. This variability across studies might be an artifact of differences in the format (e.g., paper vs. digital), sample size, population demographics, and/or age ranges included. For example, our study analyzed data only for 18-month-olds, who may be more susceptible to over-identification relative to 24-month-olds36.

Qualitative analysis provided valuable insight into the advantages and challenges associated with webM-CHAT-R/F use. Consistent with other studies employing digital versions of the M-CHAT15–18, PCPs appreciated the automated scoring and presentation of the follow-up interview questions (e.g., “[It is] much faster than the paper version and easier to score”) and described parental satisfaction with the tablet-based approach (e.g., “Parents have fatigue with filling out paperwork, this is a refreshing change”). In contrast, challenges to implementation included the initial startup and staff training time (e.g., “Took us a while to get our system down”), workflow issues (e.g., “Everything else we use is still paper...having one thing digital is awkward and cumbersome”), and the web-based technology itself, including issues with Wi-Fi access, parental reluctance to provide information online, and the need to log into REDCap to see the specific failed items.

This study has several limitations that warrant discussion and provide directions for future research. First, the PCPs who participated in this study may have been a selective group, as over 80% were already conducting some form of ASD screening at 18-month well-child visits at study initiation. As a result, our findings may not generalize to other community practice settings where ASD screening is less common12. Second, our sample was too small to systematically examine PCP-, practice-, and county-level characteristics that might indicate for whom this intervention would be best suited. Third, we were unable to directly measure the level of activity of the point person at each practice, so could not evaluate the effect of this person’s activities on implementation of the webM-CHAT-R/F.

In addition, our use of a pragmatic design impeded our ability to gather certain types of information. Importantly, despite extensive efforts and multiple strategies for obtaining objective data for the denominator of 18-month-old visits seen, the practices were unable to provide these data, preventing us from obtaining information about the proportion of 18-month-old visits for which the webM-CHAT-R/F was used. We were unable to obtain data about billing and reimbursement, though it is unlikely that reimbursement motivated screening use given the low rate of reimbursement in Washington State. Finally, we were unable to collect data regarding referral practices following positive screens, which is an acknowledged critical next step in the screening process. Recent studies underscore the importance of a comprehensive approach for promoting early detection, as positive screens on the M-CHAT-R/F do not necessarily translate into service referrals26,37.

To our knowledge, this intervention is unique in conceptualizing ASD screening as broader than the screening tool itself; rather, it is a process that involves judgement, decision-making, and communication with families about ASD risk and potential resources and referrals37,38. While additional research is needed, our results suggest promising avenues for scalability and dissemination. The REDCap platform was selected because it is open-source and available at no cost through universities and other community organizations. The workshop materials and presentations have the potential to be “scaled up” through approaches such as tele-mentoring, online training modules, or Project ECHO39,40, which have demonstrated improvements in provider practices. Collaboration with professional organizations (e.g., local AAP chapters) and community partners leading MOC Quality Improvement projects (e.g., Department of Health) may provide a feasible route and meaningful incentive for PCPs to access additional training in ASD screening. We encourage others to explore strategies for improving early ASD screening that incorporate both knowledge and communication barriers in addition to those related to the screening tool itself.

What’s New.

Combining broader education about the ASD detection process with access to a screener with automated scoring improves adherence to universal screening best practices and improves PCP perceived self-efficacy regarding ASD care.


Acknowledgements:

The authors are grateful to numerous individuals at University of Washington (UW) who contributed to this project: Lead research assistants Juan Pablo Espinosa, BS, Elyanah Posner, BA, Allycen Kurup, BS, Roya Baharloo, BA, and John Hershberger, BA; READi Lab team members Catherine Dick, MS, Trent DesChamps, MS, and Hannah Neiderman, BA; Co-Investigators Shannon Dorsey, PhD, Chuan Zhou, PhD, Ann Vander Stoep, PhD, and Kathleen Myers, MD, MPH, MS; CHDD faculty Kate Orville, MPH and Amy Carlsen, RN; and Bas de Veer, MSc for REDCap consultation. Joel Tieder, MD, MPH and James W. Stout, MD, MPH at Seattle Children’s Hospital assisted with the Maintenance of Certification (MOC) project, and Diana Robins, PhD, Drexel University, consulted on webM-CHAT-R/F development. Finally, we extend our deep appreciation to our county liaisons and to the primary care providers and office staff, who contributed their valuable time to participate in this study.

Funding Source:

This study was funded by the National Institute of Mental Health [Grant # R01 MH104302 (Stone, PI)]. NIMH did not have input on any aspect of the study design or activities or the decision to submit the report for publication.

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Clinical Trial Registration: This trial is registered on ClinicalTrials.gov (NCT02409303; posted April 6, 2015).

Conflict(s) of Interest: The authors have no conflicts of interest.

Declarations of Interest: None.
