Abstract
Background/aims
The COVID-19 pandemic has highlighted the need for accessible support for children with developmental disabilities. This study explored online literacy instruction with supplementary parent-led shared book reading (SBR) for children with autism.
Methods
Twenty-one children with autism (5–12 years) completed a battery of assessments (T1) before being assigned to ability-matched Instruction (n = 10) and Control (n = 11) groups. Instruction group participants completed 16 h of ABRACADABRA instruction, working 1:1 with a researcher online, plus SBR activities at home with a parent, over 8 weeks. All participants were reassessed after the instruction period (T2), and parents of children in the Instruction group were interviewed regarding their views and experiences.
Results
Quantitative analyses showed no significant improvements in reading for Instruction group children relative to Control group children. However, each child successfully participated in 16 online instruction sessions and qualitative data revealed that parents were generally positive about the program, with some observing improvements in their child’s literacy skills and reading confidence.
Conclusions and Implications
While it appears children with autism can participate in online literacy instruction, sixteen hours of online ABRACADABRA instruction with parent-led SBR may not be effective in improving their reading skills. Further research is required to explore whether more intensive and/or extended online instruction may be feasible and effective, and to improve uptake of parent-led book reading activities at home.
Keywords: Autism, Telehealth, Online instruction, COVID-19, Literacy, Reading instruction
What this paper adds
The COVID-19 pandemic has brought into focus a clear need for accessible learning support for children with developmental disabilities. This study explores the effect and experience of online literacy instruction for children with autism using an evidence-based program (ABRACADABRA). ABRACADABRA was designed for face-to-face instruction but adapted here for online delivery and used alongside supplementary parent-led shared book reading activities. In contrast to previous ABRACADABRA studies, our quantitative analyses showed that children with autism aged 5–12 years who received online ABRACADABRA instruction with supplementary parent-led reading activities did not achieve statistically significant gains in reading as compared to a wait control group of children with autism. However, our qualitative data showed that some parents observed improvements in reading skills and confidence for children who received instruction. Moreover, parents tended to reflect positively on the literacy instruction and provided numerous recommendations for improving online learning support for children with autism in the future. These results highlight key considerations for the implementation and evaluation of online literacy instruction for children with autism and their families.
1. Introduction
Literacy provides a foundation for educational success and is a key contributor to long-term occupational and financial outcomes, as well as health and wellbeing (DeWalt, Berkman, Sheridan, Lohr, & Pignone, 2004). Yet, many children, including those with developmental disabilities such as autism spectrum disorder (ASD), cerebral palsy, and Down syndrome, do not have access to effective reading instruction in part due to misconceptions regarding their capacity for learning and pseudoscientific instruction programs (Griffiths, Taylor, Henderson, & Barrett, 2016; Machalicek et al., 2010). School and clinic closures during the COVID-19 pandemic have led to further reductions in support for these children (Dickinson & Yates, 2020). Hybrid service delivery models involving online professional support and parent-led activities in the home may have the potential to reverse these trends. Here, we report findings from a trial of online literacy instruction for autistic children using the ABRACADABRA program with supplementary parent-led shared book reading in the home.1 These findings are part of a broader project on online literacy instruction for children with developmental disabilities including autism, Down syndrome and cerebral palsy. Findings relating to literacy instruction for children with Down syndrome and cerebral palsy are in preparation at the time of writing and will be published elsewhere.
1.1. Literacy instruction for children with autism
An early review by Whalon, Otaiba, and Delano (2009) concluded that autistic children can benefit from literacy instruction designed for any beginning reader that focuses on one or more of the five key components of reading according to the US National Reading Panel (NRP): phonemic awareness, phonics, vocabulary, reading fluency and reading comprehension (National Institute of Child Health & Human Development, 2000). Bailey and Arciuli (2019) provided an updated review of the research on evidence-based literacy instruction for children with autism and an analysis of research quality (for additional commentary on this field including directions for future research see Arciuli & Bailey, 2021). Consistent with the findings of Whalon et al., results showed improvements in specific aspects of reading for children who received instruction targeting one or more of the five key skills recommended by the NRP. A quasi-experimental study by Bailey, Arciuli, and Stancliffe (2017) was the only study to be awarded a research quality rating of adequate or strong, and the only one to explore the effects of comprehensive reading instruction using the computer-based ABRACADABRA program (ABRA).
ABRA is a free web application designed to target the five NRP key reading skills via game-based learning (Centre for the Study of Learning & Performance, 2009) and shown to improve the phonemic awareness, phonics, listening and reading comprehension skills of typically developing children (Abrami, Lysenko, & Borokhovski, 2020). The web application comprises 33 activities arranged in four domains: alphabetics (phonological awareness and phonics), reading fluency, reading comprehension and spelling (example ABRA activities are described in Appendix A). While there are some general guidelines regarding classroom implementation of the program (see Head, Pillay, Wade, & Warwick, 2018), the ABRA activities are not presented in any prescribed sequence but are selected for use based on the needs of individual children. Most activities are equipped with customisable difficulty settings which permit instruction tailored to the individual’s skill level. For example, the basic decoding ABRA activity includes seven “levels” where easier targets are presented first (e.g., 3-phoneme words containing only short vowels) progressing to more difficult targets in the higher levels (e.g., 5-phoneme words with consonant blends and short or long vowel sounds).
In the study by Bailey et al. (2017), twenty children with autism aged 5–12 years were assigned to an Instruction (n = 11) or Control group (n = 9) matched on age and baseline measures of language, adaptive ability, reading accuracy and comprehension. Children in the Instruction group received 26 h of ABRA instruction over 13 weeks working 1:1 with a trained program facilitator in their family home, while children in the Control group continued their regular learning activities (business as usual). Results showed that children in the Instruction group achieved significantly greater gains in word- and passage-level reading accuracy and reading comprehension from pre- to post-instruction compared to the Control group, with relative gains on all outcome measures associated with large effect sizes.
Arciuli and Bailey (2019) investigated the effects of reading instruction using ABRA in schools for groups of children with autism. This study involved a total of 23 children with autism aged 5–9 years. Participants were assigned to an Instruction (n = 11) or Control group (n = 12) matched on age and baseline measures of oral language, adaptive ability, reading accuracy and comprehension. Unlike the previous study which involved 13 weeks of 1:1 instruction at home, children in the Instruction group received 9 weeks of instruction working in small groups in a school library with their regular classroom teachers. Results comparing pre- and post-instruction assessment scores again showed improvements in word- and passage-level reading accuracy for the Instruction group relative to the Control group, with large effect sizes. In contrast to the statistically significant and large improvements observed in the earlier study, gains in reading comprehension for the Instruction group did not reach statistical significance. Taken together, results from the previous ABRA studies suggest that the program is effective in improving word- and passage-level reading accuracy skills for children with autism when used 1:1 in a home setting or in groups in a school setting. Findings regarding reading comprehension were inconsistent across the studies. Notably, children in the group-based study completed the 26 h of instruction in fewer weeks (9 weeks vs. 13 weeks) by participating in some longer sessions (90 min. vs. 60 min.). Contemporary research indicates that children with literacy learning difficulties can achieve gains in reading over similarly brief periods, but that instruction must be delivered intensively over shorter, more frequent sessions to be optimally effective (e.g., daily 20–30 min sessions over 10 weeks; Reynolds, Wheldall, & Madelaine, 2010). Thus, differences in instruction delivery, among other factors, may have contributed to disparate findings regarding reading comprehension.
Questions regarding the optimisation of ABRA for children with autism warrant further investigation. One possibility is that parent-led shared book reading (SBR) activities in the home could be partnered with ABRA to increase the frequency of children’s literacy learning opportunities. SBR refers to an adult reading aloud with a child while promoting interaction and supporting the child’s language and literacy development (e.g., by asking questions about the text; National Early Literacy Panel, 2008). A recent meta-analysis by Boyle, McNaughton, and Chapin (2019) considered the single-subject research on SBR interventions for children with autism. Overall, SBR interventions were found to have a moderate positive impact across multiple reading and related skills, including listening comprehension and participation in shared reading, and intervention effects were found to be similar across most child outcomes irrespective of whether SBR was conducted by a teacher or parent. This is consistent with other studies showing the feasibility and positive impacts of SBR for children with autism (e.g., Akemoglu & Tomeny, 2021), though it should be noted that studies in this field have generally included very small samples.
1.2. Online instruction for children with autism
Interest in remote online instruction for children with autism predates the COVID-19 pandemic. An early systematic review of telehealth support services by Boisvert, Lang, Andrianopoulos, and Boscardin (2010) identified five studies on online assessment and instruction services for children with autism. Results from these studies, collectively involving 46 children aged 2–11 years, showed that professionals were able to use telehealth to assist carers and educators to deliver support services, including diagnostic assessments and behavioural and early interventions. An updated review by Sutherland, Trembath, and Roberts (2018) reported results generally consistent with these findings.
A recent review of remote online literacy instruction for children and adolescents without autism by Furlong, Serry, Bridgman, and Erickson (2021) found that evidence-based procedures may be feasibly presented online. Seven of the nine studies included in this review investigated the effects of online literacy instruction on reading and spelling skills, including knowledge of letter names and sounds, vocabulary, reading fluency, accuracy and comprehension. These studies tended to involve small samples (n = 3–25, with one study including 61 participants) and all participants were children or adolescents with learning difficulties. While there were some inconsistent findings, the results showed that 2–4 online literacy instruction sessions each week over 8–12 weeks were generally effective in improving children’s reading and spelling skills. Note that there are two issues to consider here in relation to ‘dosage’: intensity in terms of sessions per week as well as overall duration of the instruction period.
Of the works included in the Furlong et al. (2021) review, the pre-experimental study by Houge and Geier (2009) was one of the few to directly assess learning outcomes following online comprehensive literacy instruction. In this study, children and adolescents with literacy learning difficulties received 8 weeks (16 sessions) of individualised online tutoring targeted at word study (phonemic awareness and phonics), reading fluency, vocabulary and writing skills as well as implementation of reading comprehension strategies. The instruction sessions drew on a range of resources (i.e., published approaches and learning materials) and followed a routine structure: 5 min repeated reading, story summarising and oral retell, 15−20 min word study, vocabulary, sentence writing and question answering, 5−10 min written retell, and finally 5 min shared reading. Paired samples t-tests showed statistically significant gains in participants’ passage reading accuracy, fluency and comprehension skills and word spelling skills from pre- to post-instruction as measured using the Gray Oral Reading Test – 4th edition (Wiederholt & Bryant, 2001) and the Test of Written Spelling – 4th edition (Larsen, Hammill, & Moats, 1999), respectively. These results demonstrate that online comprehensive literacy instruction can be effective in improving multiple aspects of reading and spelling for children with literacy learning difficulties in as little as 16 sessions delivered over 8 weeks. Whether this is the case for children with autism is unclear.
1.3. The current study
The current study was funded by a COVID-19 Collaborative Research Grant designed to support rapid research on effective supports during the COVID-19 pandemic. The aims of this study were to (i) establish a new hybrid model of literacy instruction for children with developmental disabilities by adapting the ABRA literacy program for clinician-led online delivery with parent support supplemented by parent-led reading activities outside of the ABRA sessions, (ii) pilot the new hybrid model of literacy instruction with autistic children, and examine efficacy (quantitatively), and (iii) explore parents’ experiences of the new hybrid model and their perspectives on the outcomes (qualitatively).
We hypothesised that online ABRA instruction would lead to improvements in word and nonword reading accuracy, based on the previous ABRA study conducted over a similar time period (Arciuli & Bailey, 2019) and a previous online instruction study with a similar number of sessions over a similar duration (Houge & Geier, 2009). We were less certain about whether we would see improvements in reading comprehension skills for children with autism relative to a control group because Arciuli and Bailey (2019) did not observe gains in comprehension; however, we anticipated that the addition of parent-led SBR in the current study would support gains in comprehension.
2. Method
This research was approved by the Human Research Ethics Committee at Flinders University (Project ID: 2069) and conducted over three phases following an adapted sequential exploratory mixed methods design (Creswell & Plano Clark, 2018). In Phase One, parents of children with developmental disabilities were invited to be interviewed to identify ways of adapting ABRA instruction for clinician-led online delivery and encouraging parent-led shared reading activities. Parents of two children with Down syndrome came forward to participate and provided recommendations regarding the administration of home reading activities and online literacy instruction for children with developmental disabilities (see Appendix B for Phase One methods and findings). In Phase Two, we piloted our co-designed hybrid program, focusing on children’s reading skills in an Instruction group versus a Control group. In Phase Three, we explored parents’ views on the instruction period via semi-structured interviews.
To be eligible for Phase Two, participants met the following criteria: (i) 5–12 years of age, (ii) parent-reported clinical diagnosis of autism spectrum disorder using Diagnostic and Statistical Manual of Mental Disorders criteria (DSM-5; American Psychiatric Association, 2013), (iii) no serious hearing or vision impairments, (iv) able to communicate using sentences, (v) able to identify at least 1 letter of the alphabet, (vi) able to sustain attention for at least 15 min, (vii) living in Australia, and (viii) speaks English as a first language.
Twenty-one children with autism met criteria and were included in the study. Participants were assigned to an Instruction (ABRA group; n = 10) or Control group (n = 11). Children assigned to the Control group did eventually receive 8 weeks of instruction consistent with the Instruction group, but only after the post-instruction assessments had been completed. To ensure similar abilities across groups, we identified pairs of participants who achieved similar scores on baseline assessments of adaptive ability, socialisation, phonological awareness, reading accuracy and reading comprehension (see Measures section). One participant from each pairing was randomly assigned to the Instruction group, and the other participant was assigned to the Control group. A single unmatched participant was assigned to the Control group as this assignment resulted in closer group matching than if the participant had been placed in the Instruction group. Participant information by group is shown in Table 1. Participants’ scores varied within each group, reflecting the broad inclusion criteria.
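The matching and assignment procedure described above can be sketched as follows. This is an illustrative sketch only, not the study's actual procedure or code: it assumes the baseline scores have been combined into a single composite for pairing, and all names are hypothetical.

```python
import random

def assign_matched_pairs(participants, seed=None):
    """Pair participants by similarity on a baseline composite score,
    then randomly split each pair across Instruction and Control groups.
    `participants` is a list of (id, composite_score) tuples."""
    rng = random.Random(seed)
    # Sort by composite score so adjacent participants form the closest pairs
    ordered = sorted(participants, key=lambda p: p[1])
    pairs = [ordered[i:i + 2] for i in range(0, len(ordered) - 1, 2)]
    instruction, control = [], []
    for a, b in pairs:
        # Randomly decide which member of the pair receives instruction
        if rng.random() < 0.5:
            a, b = b, a
        instruction.append(a)
        control.append(b)
    if len(ordered) % 2 == 1:
        # Odd participant out: placed in the Control group, as in the study
        control.append(ordered[-1])
    return instruction, control
```

With 21 participants, this yields 10 matched pairs plus one unmatched participant assigned to the Control group, reproducing the 10/11 split reported above.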
Table 1.
Scores for Each Pre-Instruction Baseline Measure by Group.
Control group: n = 11; ABRA Instruction group: n = 10.

| Measure | Control M | Control SD | Control range | ABRA M | ABRA SD | ABRA range | t(19) | p | Cohen’s d |
|---|---|---|---|---|---|---|---|---|---|
| Adaptive ability | 76.27 | 22.49 | 21–114 | 73.00 | 9.19 | 58–92 | .43 | .673 | .19 |
| Socialisation | 79.18 | 12.11 | 57–97 | 73.50 | 14.65 | 51–95 | .97 | .343 | .42 |
| Phonological awareness | 19.64 | 9.30 | 5–31 | 18.50 | 12.23 | 0–33 | .24 | .812 | .11 |
| Word-level reading accuracy | 34.55 | 12.22 | 9–51 | 30.90 | 13.54 | 5–48 | .65 | .524 | .28 |
| Word and nonword reading accuracy | 59.36 | 27.37 | 23–101 | 52.60 | 39.27 | 0–96 | .46 | .650 | .20 |
| Passage-level reading accuracy | 40.09 | 21.01 | 15–73 | 32.60 | 26.59 | 0–75 | .72 | .480 | .31 |
| Passage-level reading comprehension | 16.91 | 9.35 | 2–33 | 13.10 | 11.44 | 0–34 | .84 | .412 | .36 |
| Everyday reading comprehension | 13.00 | 5.95 | 0–20 | 10.00 | 8.67 | 0–20 | .93 | .363 | .40 |
Note. Adaptive ability: Vineland Adaptive Behavior Scale – 2nd edition (VABS-2), Adaptive Behavior Composite standard score; Socialisation: VABS-2, Socialisation standard score; Phonological awareness: Comprehensive Test of Phonological Processing – 2nd edition (CTOPP-2), Elision subtest standard score; Word-level reading accuracy: Wide Range Achievement Test – 4th edition (WRAT-4), Word Reading subtest raw score; Word and nonword reading accuracy: Castles and Coltheart Test – 2nd edition (CC-2), combined regular word, irregular word and nonword raw score; Passage-level reading accuracy and reading comprehension: Neale Analysis of Reading Ability – 3rd edition (NARA-3), Accuracy and Comprehension composite raw scores; Everyday reading comprehension: Test of Everyday Reading Comprehension (TERC), Reading comprehension raw score.
As shown in Table 1, independent samples t tests showed no statistically significant differences across groups in terms of adaptive ability, socialisation, phonological awareness, reading accuracy, or reading comprehension. The groups were also matched for age (mean Instruction group age = 9.40 years, range = 5.83–12.75 years; mean control group age = 9.08 years, range = 5.67–12.42 years; t(19) = 0.343, p = .736). Across the sample, children presented as highly diverse in their reading skills but tended to perform below the age and year-of-schooling adjusted averages of typically developing children without autism on standardised measures of word reading accuracy (mean percentile rank = 36.82, SD = 30.69), passage reading accuracy (mean percentile rank = 27.71, SD = 26.12), and passage reading comprehension (mean percentile rank = 35.47, SD = 31.47).
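The group comparisons in Table 1 can be reproduced from the summary statistics alone. Below is a minimal sketch of the pooled-variance independent-samples t statistic and Cohen's d; it assumes d was computed using the pooled standard deviation, which matches the reported values within rounding.

```python
import math

def pooled_t_and_d(m1, sd1, n1, m2, sd2, n2):
    """Independent-samples t (pooled variance, df = n1 + n2 - 2)
    and Cohen's d, computed from group means, SDs and sizes."""
    # Pooled variance weights each group's variance by its degrees of freedom
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    sp = math.sqrt(sp2)
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    d = (m1 - m2) / sp
    return t, d

# Adaptive ability row from Table 1: Control (n = 11) vs. ABRA Instruction (n = 10)
t, d = pooled_t_and_d(76.27, 22.49, 11, 73.00, 9.19, 10)
# t ≈ .43 and d ≈ .19, matching the table within rounding
```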
2.1. Measures
A series of individually administered standardised assessments were used to obtain measures of phonological awareness, reading accuracy and reading comprehension. Children’s adaptive ability was assessed via semi-structured parent interview. Interview and assessment sessions were conducted online using Zoom.
2.1.1. Adaptive ability, including socialisation
The Survey Interview Form of the Vineland Adaptive Behavior Scales – 2nd edition was used to assess adaptive ability: socialisation, communication, and daily living skills (VABS-2; Sparrow, Cicchetti, & Balla, 2005). Test items relating to motor skills were also administered to children aged 6 years and younger. Parents were asked to indicate whether their child always, sometimes or never demonstrates behaviours which relate to child development in the target domains. As reported in the test manual, individuals with health impairments, brain injury, multiple impairments and/or autism comprised 1.7 % of the normative sample for this assessment. Test-retest reliability coefficients based on data from this normative sample show good consistency (e.g., for VABS-2 Adaptive Behaviour Composite [ABC] scores r = 0.92). Correlations between VABS-2 ABC scores and other established measures of adaptive ability were also high for this normative sample (r = .70–.78). In our current sample, the VABS-2 had high internal consistency for children aged seven years or older (Cronbach’s α = 0.96), and 6 years or younger (Cronbach’s α = 0.97).
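Internal consistency coefficients like the Cronbach's α values reported throughout this section are computed from item-level scores. A minimal sketch of the standard formula follows (illustrative only; this is not the study's analysis code):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
    `item_scores` is a list of items, each a list of scores
    (one score per participant, in the same participant order)."""
    k = len(item_scores)
    n = len(item_scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in item_scores)
    # Each participant's total score across all items
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))
```

When items are perfectly consistent (every item ranks participants identically), α approaches 1; uncorrelated items push α toward 0.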
2.1.2. Phonological awareness
The Elision subtest of the Comprehensive Test of Phonological Processing – 2nd edition was used to assess participants’ phonological awareness (CTOPP-2; Wagner, Torgesen, Rashotte, & Pearson, 2013). This subtest involved participants repeating a word said aloud by the Researcher but omitting a specified syllable or phoneme (e.g., “say ‘tan’ without saying /t/”). As reported in the test manual, children with disabilities comprised 5% of the normative sample for this test. Test-retest reliability statistics reported in the assessment manual show good consistency (r = 0.73) and correlations between Elision subtest scores and other phonological awareness measures provide evidence of validity (r = .69–.85). The Elision subtest had high internal consistency for children included in the current study (Cronbach’s α = 0.96).
2.1.3. Word-level reading accuracy
The Word Reading subtest of the Wide Range Achievement Test – 4th edition was used to assess participants’ ability to identify letters and real words (WRAT-4; Wilkinson & Robertson, 2006). Participants were asked to identify a list of single letters by name followed by a series of single words. As reported in the test manual, children with disabilities comprised 5% of the normative sample for the WRAT-4. Correlations between Word Reading subtest scores and other established measures of letter and word reading accuracy reported in the test manual provide evidence of validity (r = .76–.77). Internal consistency for the Word Reading subtest was high for children in this study (Cronbach’s α = 0.97).
2.1.4. Word and nonword reading accuracy
The Castles and Coltheart Test - 2nd edition was used to measure participants’ ability to read phonetically regular words, phonetically irregular words and nonwords (CC-2; Castles et al., 2009). Participants were asked to read aloud a series of single regular and irregular words and nonwords. Information on the number of children with autism or other disabilities in the CC-2 normative sample is not currently available. Data from the normative sample show acceptable internal consistency (rs = 0.90) and substantial inter-rater reliability (κ = 0.91; Bennett et al., 2020). Internal consistency for the CC-2 was high in this study (Cronbach’s α = .98).
2.1.5. Passage-level reading accuracy
The Reading Accuracy Composite raw score from the Neale Analysis of Reading Ability – 3rd edition was used to assess passage-level reading accuracy (NARA-3; Neale, 1999). In this assessment, participants read a series of passages of increasing length and complexity. The NARA-3 manual does not state the number of children with ASD, or other disabilities, in its normative sample. Test-retest statistics based on this normative sample showed good consistency for the Reading Accuracy Composite (r = 0.95). Correlations between all composite scores and other established reading measures reported in the assessment manual provide evidence of validity (r = .70–.77). Internal consistency for the Reading Accuracy Composite was high in the current sample (Cronbach’s α = .89).
2.1.6. Passage-level reading comprehension
The Reading Comprehension Composite score from the NARA-3 was used to assess passage-level reading comprehension. Participants were asked questions relating to information conveyed in the abovementioned passages. Test-retest statistics based on this normative sample showed good consistency for the Reading Comprehension Composite (r = 0.93) and correlations between all composite scores and other established reading measures were strong as noted above. Internal consistency for the Reading Comprehension Composite was high in this study (Cronbach’s α = .86).
2.1.7. Everyday reading comprehension
Participants’ ability to read text in everyday situations was assessed using the Test of Everyday Reading Comprehension (TERC; McArthur et al., 2012). Children were asked to read instances of text in real-world situations, such as a text message or shopping list, and respond to questions about the conveyed content. The TERC normative sample included children with developmental dyslexia and typically developing children. Data from the normative sample show high inter-rater reliability (r = 0.99) and significant correlations between TERC scores and associated literacy measures (McArthur et al., 2013; Wheldall & McMurtry, 2014). Internal consistency for the TERC was high in the current sample (Cronbach’s α = 0.96).
2.2. Procedure
Assessment and instruction sessions were conducted by a single researcher online using Zoom. Instruction outcomes were assessed both quantitatively and qualitatively.
2.2.1. Pre-instruction assessment
Baseline measures of adaptive ability, phonological awareness, reading accuracy and reading comprehension were collected over two 20- to 30-minute assessment sessions. Session One involved assessments of phonological awareness, regular and irregular word reading accuracy, nonword reading accuracy, and everyday reading comprehension. Session Two involved assessments of word-level and passage-level reading accuracy and passage-level reading comprehension. Adaptive abilities were assessed during a single 20-minute parent interview. All sessions were conducted on a one-to-one basis, with parents supervising the assessments and assisting where necessary (e.g., adjusting seat height or computer volume).
2.2.2. Post-instruction assessment
The post-instruction assessment followed the same procedures as the pre-instruction assessment and included seven outcome measures. We considered those measures used in the previous studies involving ABRA and children with autism as our primary outcome measures: (i) WRAT-4 word reading accuracy, (ii) NARA-3 passage-level accuracy, and (iii) NARA-3 passage-level comprehension. Measures designed for computerised delivery were included as secondary outcome measures: (i) CC-2 word and nonword reading accuracy and (ii) TERC everyday reading comprehension.
2.2.3. ABRA instruction
Parents of children assigned to the Instruction group attended a single 60-minute online information session on Zoom prior to engaging with the ABRA program. These sessions were conducted on a 1:1 basis and provided parents with information on the ABRA program and its use in the current study, and on the parent-led shared book reading activities. Child participants then completed two 60-minute ABRA instruction sessions each week over a period of eight weeks. Note that the 8-week instruction period was slightly shorter than in the previous ABRA studies involving children with autism (9 weeks in Arciuli & Bailey, 2019; 13 weeks in Bailey et al., 2017). This shorter period was necessary to complete the project within the rapid research grant guidelines and was anticipated to be a sufficient period of time in view of gains achieved by children who have received online literacy instruction with a similar number of sessions and over a similar duration (e.g., Houge & Geier, 2009). Online delivery of these sessions involved parents joining a Zoom session organised by the Researcher. Participants were permitted shared screen control to enable them to directly interact with the ABRA program running on the Researcher’s computer. In this way, the Researcher (program facilitator and clinician) was able to guide participants through the individualised literacy program while allowing them to navigate the program interface as if it were running on their own device.
Each ABRA session comprised approximately: (i) 15 min of word-level computer activities targeting alphabetics, high-frequency word identification, or word spelling skills; (ii) 20 min of passage-level computer activities targeting reading fluency or comprehension skills, (iii) 15 min of shared book reading with the Researcher which aimed to revisit skills targeted during the preceding computer activities, and (iv) a 10-minute reward activity. Participants were provided breaks as part of the 60-minute session protocol as deemed appropriate by parents and the Researcher. Word and passage-level computer activities were hosted on the ABRA web application. Researcher-led book reading activities were conducted using the Fitzroy Reader iPad application (Berryman & O’Carroll, 2012).
The pre-instruction assessment data was used to identify learning goals, activities, and difficulty settings appropriate for instruction (see Head et al., 2018, for details regarding ABRA goal setting and learning activities). These were revised after each session. A performance criterion of 65–85 % accuracy was employed to identify activities of appropriate content and difficulty for instruction. Skill mastery was set at 85 % accuracy for each word-level activity, maintained over three sessions. Once mastery was achieved on a given activity, children progressed to another level or skill.
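The selection and mastery rules above can be expressed as simple checks. The following is an illustrative sketch only: the thresholds come from the text, but the function names and representation are hypothetical.

```python
def appropriate_for_instruction(accuracy):
    """An activity is selected for instruction when session accuracy
    falls within the 65-85% performance criterion band."""
    return 0.65 <= accuracy <= 0.85

def mastery_achieved(session_accuracies, threshold=0.85, sessions_required=3):
    """Skill mastery: accuracy at or above 85%, maintained over the
    last three consecutive sessions."""
    recent = session_accuracies[-sessions_required:]
    return (len(recent) == sessions_required
            and all(a >= threshold for a in recent))
```

Under these rules, a child scoring 90% on an activity would progress past it, while scores below 65% would prompt an easier activity or level; once the mastery check passes, the child moves on to another level or skill.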
2.2.4. Shared book reading
In addition to the online ABRA sessions, parents were asked to complete two shared book reading activities each week of the instruction period. These activities could be completed at any time during the week and did not involve the Researcher. Activities focused on texts from the Fitzroy Reader iPad application and were conducted in line with the following general guidelines: (i) children and parents take turns reading each page of the story, (ii) reading errors are recorded by parents during shared reading, (iii) children or parents ask a question about the text every second page, and (iv) each story concludes with the child answering three questions about its content and practicing any recorded reading errors. Parents were encouraged to provide feedback on children’s reading accuracy and answers to comprehension questions. A copy of these guidelines and teaching materials was provided to parents in the form of a shared reading manual. Note that the Fitzroy readers were selected for use in this study consistent with the previous ABRA studies involving children with autism (Arciuli & Bailey, 2019; Bailey et al., 2017). Using these readers permitted a degree of knowledge and control over the complexity of texts that children were attempting to read during the parent-led activities.
The parent-led shared book reading activities were designed to target specific skills introduced in the week’s ABRA sessions (e.g., encouraging children to monitor their own reading comprehension). These targets were discussed with parents verbally during weekly consultation sessions which were conducted online using Zoom. Targets were then added to a reading log (see Appendix C for example entry) which was used to communicate the week’s reading activities to parents and for parents to complete and return via email to the Researcher showing the week’s progress.
2.3. ABRA implementation fidelity
For each participant in the Instruction group, two ABRA instruction sessions were selected at random and recorded for fidelity assessment. These were submitted to an independent Researcher with extensive knowledge of the ABRA program along with a corresponding session plan for review. This Researcher used the Phase Two Fidelity Rating Form to evaluate the quality of the literacy instruction and the extent to which sessions adhered to the guidelines of the ABRA program (see Appendix D).
Implementation fidelity ratings were received for nine of the ten participants in the Instruction group. The tenth participant declined to have their sessions video recorded. Table 2 shows a summary of the implementation fidelity ratings. Mean ratings show that all key elements of ABRA instruction were delivered with high fidelity.
Table 2.
Means and Standard Deviations for ABRA Fidelity Ratings.
| Fidelity item | M | SD |
|---|---|---|
| Researcher is familiar with the ABRA program/lesson content | 5 | 0 |
| Lesson has clear goals and objectives | 4.89 | .33 |
| Lesson is planned ahead of time | 5 | 0 |
| Lesson content is balanced (includes alphabetics, word and text activities) | 5 | 0 |
| Lesson includes introduction | 4.56 | .73 |
| Lesson includes demonstration | 5 | 0 |
| Researcher monitors child’s navigation of the program | 5 | 0 |
| Lesson includes conclusion | 4 | 1.73 |
| Lesson includes ABRA and non-ABRA activities | 5 | 0 |
| Lesson includes computer and non-computer activities | 5 | 0 |
| Researcher manages student behaviour | 5 | 0 |
| Learning environment is appropriately organised | 5 | 0 |
| Lesson is at least 30 min in duration | 5 | 0 |
Note. Fidelity ratings are based on 9 video recorded sessions as one participant did not consent to the recording. Each item was rated using a six-point scale: 0 = not applicable, 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree.
2.4. Data analysis
A series of 2 × 2 ANOVAs (Time: pre-, post-instruction × Group: Instruction vs. Control) was used to evaluate the effects of ABRA instruction on participants’ reading abilities. Time was a within-participants factor and Group was a between-participants factor. The dependent variables of interest were the same as those used in the previous studies involving ABRA and children with autism (WRAT-4 word-level reading accuracy, NARA-2 passage-level reading accuracy, NARA-2 passage-level reading comprehension) as well as additional measures designed for computerised delivery (CC-2 word and nonword reading accuracy, and TERC everyday reading comprehension). Due to the number of tests conducted, we adopted a conservative alpha of .01. Regarding effect sizes, we interpreted ηp² of .01 as a small effect, .06 as a medium effect, and .14 as a large effect (Richardson, 2011).
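As an illustration of how the reported effect sizes relate to the F statistics, partial eta squared can be recovered from a reported F value and its degrees of freedom using the standard identity ηp² = (F × df_effect) / (F × df_effect + df_error). The sketch below (illustrative Python; the helper names are ours, not part of the study materials) applies this identity together with Richardson’s (2011) benchmarks:

```python
# Illustrative check: recover partial eta squared from a reported F statistic.
#   eta_p^2 = (F * df_effect) / (F * df_effect + df_error)

def partial_eta_squared(f_stat: float, df_effect: int, df_error: int) -> float:
    """Compute partial eta squared from an F statistic and its df."""
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

def interpret(eta_sq: float) -> str:
    """Apply Richardson's (2011) benchmarks: .01 small, .06 medium, .14 large."""
    if eta_sq >= 0.14:
        return "large"
    if eta_sq >= 0.06:
        return "medium"
    if eta_sq >= 0.01:
        return "small"
    return "negligible"

# Main effect of Time on word-level reading accuracy, F(1, 19) = 12.33,
# reproduces the reported eta_p^2 of approximately .39 (a large effect).
eta = partial_eta_squared(12.33, 1, 19)
```

This kind of back-calculation is useful only as a consistency check on reported statistics; it does not replace running the ANOVA on the raw data.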
2.5. Phase three
Parents of children in the ABRA Instruction group were invited to a post-instruction interview conducted via Zoom or phone. The interview questions focused on parents’ thoughts about the ABRA program, including the software, the videoconferencing technology, and the instructional techniques and strategies, followed by questions about how their child experienced the program, changes observed following instruction, benefits and challenges, and recommendations for developing the ABRA program. Similar questions were asked about the parent-led shared book reading activities. The interviews ranged from 26 to 61 min in duration (M = 43 min).
Interview data were analysed using reflexive thematic analysis (Braun & Clarke, 2021). Interview transcripts were imported into NVivo and common themes and subthemes were identified.
3. Results
In line with the sequential mixed-methods approach, the Phase Two quantitative results are presented first, followed by the Phase Three qualitative results. Mean pre/post assessment raw scores for each of the outcome measures are provided by group in Table 3. Generally, children in both groups showed improvements in their reading abilities from pre- to post-instruction assessment.
Table 3.
Pre- and Post-Instruction Raw Scores for Each Outcome Measure by Group.
| Measure | Pre, Control (n = 11): M | SD | Range | Pre, ABRA (n = 10): M | SD | Range | Post, Control (n = 11): M | SD | Range | Post, ABRA (n = 10): M | SD | Range |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Word-level reading accuracy | 34.55 | 12.22 | 9–51 | 30.90 | 13.54 | 5–48 | 36.45 | 10.17 | 16–50 | 33.20 | 13.17 | 8–49 |
| Word and nonword reading accuracy | 59.36 | 27.37 | 23–101 | 52.60 | 39.27 | 0–96 | 61.55 | 25.91 | 28–94 | 58.20 | 38.18 | 0–101 |
| Passage-level reading accuracy | 40.09 | 21.01 | 15–73 | 32.60 | 26.59 | 0–75 | 40.27 | 18.37 | 16–72 | 36.40 | 28.14 | 0–83 |
| Passage-level reading comprehension | 16.91 | 9.35 | 2–33 | 13.10 | 11.44 | 0–34 | 19.00 | 10.23 | 4–35 | 15.00 | 11.35 | 0–38 |
| Everyday reading comprehension | 13.00 | 5.95 | 0–20 | 10.00 | 8.67 | 0–20 | 15.27 | 4.47 | 6–19 | 12.20 | 7.98 | 0–20 |
Note. Word-level reading accuracy: WRAT-4, Word Reading subtest; Word and nonword reading accuracy: CC-2, combined regular word, irregular word and nonword raw score; Passage-level reading accuracy and reading comprehension: NARA-3, Reading accuracy and comprehension composite raw scores; Everyday reading comprehension: TERC, Reading comprehension raw score.
3.1. Phase two quantitative results
Each child in the Instruction group attended 16 instruction sessions for a total of 160 sessions. Sessions were considered ‘fully complete’ if the child completed the assigned word-level computer task, passage-level computer task and Researcher-led shared book reading task. Sessions were considered ‘partially complete’ when children completed two of these activities and ‘incomplete’ when children completed one or none of these activities. Of the 160 instruction sessions, 126 (78.75 %) were fully complete (range for individual participants = 10–16), 34 (21.25 %) were partially complete (range for individual participants = 1–6) and 0 were incomplete. Reasons for partial completion included participants arriving late for their session (n = 8), participants taking longer than expected to complete assigned tasks (n = 16), technological issues (n = 5), and unexpected interruptions such as a participant’s tooth falling out (n = 2). Results of the quantitative analyses relating to the effects of ABRA instruction on participants’ reading abilities (Time x Group ANOVAs) are summarised in Table 4.
Table 4.
Phase Two Time (Pre-, Post-instruction assessment) x Group (Instruction vs. Control) ANOVA Results.
| Outcome measure | Time: F(1, 19) | Time: p | Time: ηp² | Time × Group: F(1, 19) | Time × Group: p | Time × Group: ηp² |
|---|---|---|---|---|---|---|
| Word-level reading accuracy | 12.33 | <.01 | .39 | 0.11 | .75 | .01 |
| Word and nonword reading accuracy | 10.21 | <.01 | .35 | 1.97 | .18 | .09 |
| Passage-level reading accuracy | 3.75 | .07 | .17 | 3.09 | .10 | .14 |
| Passage-level reading comprehension | 6.40 | .02 | .25 | 0.02 | .91 | <.01 |
| Everyday reading comprehension | 9.87 | .01 | .34 | 0.003 | .96 | <.01 |
Note. Word-level reading accuracy: WRAT-4, Word Reading subtest raw scores; Word and nonword reading accuracy: CC-2, combined regular word, irregular word and nonword raw score; Passage-level reading accuracy and reading comprehension: NARA-3, Reading accuracy and comprehension raw scores; Everyday reading comprehension: TERC, Reading comprehension raw score.
Significant main effects of Time were found for two out of five outcome measures. None of the Time x Group interactions were statistically significant. This suggests that, while the sample improved on some measures during the instruction period, children in the Instruction group did not achieve significantly greater improvements relative to children in the Control group. Effect sizes associated with these comparisons were small in magnitude for all measures except for word and non-word reading accuracy (medium effect size) and passage reading accuracy (large effect size).
3.2. Examination of adherence and quality of SBR activities from parent logs
Parents were asked to complete one reading log entry for each of the 16 shared book reading activities, noting their child’s estimated reading accuracy and comprehension along with other comments (see Appendix C). Entries which described children’s reading accuracy and comprehension were deemed fully complete even if other cells (e.g., completion date, example errors, etc.) were left empty. Entries which noted either reading accuracy or comprehension were deemed partially complete and entries which noted neither reading accuracy nor comprehension were deemed incomplete. Of the 160 possible entries, 109 (68 %) were fully complete, 21 (13 %) were partially complete and 30 (19 %) were incomplete.
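The adherence percentages above follow directly from the raw entry counts. As a quick illustration of the arithmetic (Python used here only for the check; it is not part of the study materials):

```python
# Tally of shared book reading (SBR) log adherence, using the counts
# reported in the text (160 possible entries in total).
counts = {"fully complete": 109, "partially complete": 21, "incomplete": 30}
total = sum(counts.values())  # 160 possible reading log entries

# Percentage of entries in each category, rounded to whole numbers.
percentages = {status: round(100 * n / total) for status, n in counts.items()}
# fully complete: 68 %, partially complete: 13 %, incomplete: 19 %
```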
3.3. Phase three qualitative results
As can be seen in Table 5, six overarching themes and several sub-themes were identified in the parent interview data (see Appendix E for additional example comments).
Table 5.
Overarching Themes and Subthemes in Parent Interviews.
| Theme |
|---|
| Remote intervention using ABRA was perceived positively by parents and children |
| Facilitator played an important role in the success of the program |
| Parents perceived improvements in children's literacy and self-confidence |
| Parents benefited from observing facilitator interact with their children |
| Responses to parent-led shared book reading were mixed |
| Parents made recommendations for improving the program |

Note. Sub-theme entries are not reproduced here; the six overarching themes correspond to Sections 3.3.1–3.3.6.
3.3.1. Remote intervention using ABRA was perceived positively by parents and children
Parents reported that the ABRA software/program was appealing, with attractive visuals, and was entertaining: “she found the characters interesting… They were quite confident and bright colors and different” (P11). The flexibility and timing of the sessions were seen as a benefit for the children and families. According to parents, the children looked forward to the weekly online sessions and were keen to participate. Children learned to interact with the facilitator and to switch between the ABRA program on their home computer/laptop/iPad and the Zoom screen. Parents highlighted that the opportunity to participate in online literacy training sessions during COVID-19 was extremely valuable, either because families lived in regional areas where they typically would not have access to intensive training programs with a skilled facilitator, or because families were housebound due to COVID-19.
3.3.2. Facilitator played an important role in the success of the program
All parents commented positively about the facilitator: “The best part was the practitioner component, when [Facilitator] was actively engaging” (P4). They also noted that his dedication, knowledge and skills in planning and implementing the online literacy instruction contributed to the overall experience. Parents also valued the style and traits of the facilitator.
3.3.3. Parents perceived improvements in children's literacy and self-confidence
Parents’ views on the impact of the ABRA program on their child’s reading skills were positive. Parents noted changes in their children’s willingness and ability to read texts not used in the ABRA sessions, such as signs and text on the television. Parents also reported that children used strategies practiced in the instruction sessions to read text encountered in everyday life: “He used to look at me when it was a difficult word. But…he tries by himself and he'll say, ‘Wait, I'll, I'm trying’. And then he finally gets it and he’s proud and he gives me a proud, ‘look at myself’ and ‘I can do it’” (P6).
3.3.4. Parents benefited from observing facilitator interact with their children
Many parents reflected positively on their own learning while observing the facilitator supporting their child: “my perspective…has changed about learning stuff that it can be fun. We can have fun while learning. Always in his school, the teachers used to say, ‘You must have fun while learning’, but I never understood it. But when I saw [Facilitator] doing that, that's when I came to know…” (P6). Parents identified the following strategies as being particularly useful: routinely asking questions in SBR, making errors deliberately while reading to check children’s engagement and modelling “sounding out” (decoding) of hard to read words.
3.3.5. Responses to parent-led shared book reading were mixed
Parents described their views on the benefits and challenges of parent-led SBR. In terms of benefits, some parents reported that SBR had a positive effect on children’s reading enjoyment and motivation and provided opportunity to try strategies that they had observed during the ABRA sessions. Others did not find the SBR activities to be of value and found it a “chore”. Some children reportedly lacked motivation or did not like reading books on the iPad. Some parents also mentioned that the books were not always well suited to their child in terms of their age and interests, and found it difficult to find the time to complete the activities: “a little bit difficult. Trying to find the time…the logs not so bad because it's only… 5 or 10 min…but it was a challenge to try and find the spare time” (P20).
3.3.6. Parents made recommendations for improving the program
Parents provided various suggestions for future online implementation of the program as well as specific recommendations for ABRA and parent-led shared book reading. Parents suggested that access to the instruction materials could be improved using a central website and user login rather than email links which were used in this project to share these resources. Regarding ABRA, parents suggested that the program could be usefully extended to target additional skills, such as grammar, spelling and story writing. Some parents noted that it would be useful to have access to ABRA outside the session times: “… I think the sessions were run just the right amount of time, but I think maybe some … more activities to do as a family at home” (P23). Parents suggested that reading materials used in the shared book reading activities could be better tailored to the needs and interests of individual children.
4. Discussion
This sequential mixed-methods study explored the effect and experience of online literacy instruction using the ABRACADABRA program with supplementary parent-led shared book reading (SBR) activities for children with autism and their families over three phases. In Phase One, based on recommendations from parent interviews, we established a new hybrid model of literacy instruction for children with developmental disabilities comprising online researcher-led instruction using the ABRA program and parent-led shared book reading activities in the family home. In Phase Two, we piloted our new hybrid model with autistic children over 8 weeks, comparing an Instruction group (n = 10) to a Control group (n = 11). In the third and final phase, we explored parents’ views and experiences.
4.1. Effects of literacy instruction and shared book reading
To our knowledge, this is the first investigation of online ABRA delivery for autistic children. Our records show that children with autism were able and willing to complete the majority of online literacy instruction sessions over the 8-week instruction period (79 % of sessions fully complete), though participation was sometimes limited by scheduling issues, slower than anticipated task completion, technological issues and other disruptions (21 % of sessions partially complete). Participation in the parent-led SBR activities was considerably less consistent (68 % of activities fully complete, 13 % partially complete, 19 % incomplete). These results show that autistic children may feasibly participate in remote online ABRA instruction. The feasibility of hybrid instruction including online sessions and supplementary SBR in the home is less clear and could be usefully explored in future studies.
Our quantitative analyses showed that children with autism who received 8 weeks of literacy instruction did not achieve significantly greater gains in reading accuracy or comprehension as compared to a Control group of children with autism. This finding stands in contrast to previous results showing improvements in the reading skills of children with autism following face-to-face instruction using the ABRA program (Arciuli & Bailey, 2019, reported gains in reading accuracy; Bailey et al., 2017, reported gains in both reading accuracy and reading comprehension). While it is difficult to draw definitive conclusions, there are several factors that may have contributed to the current result beyond online administration of the program which was the focus in this study.
One potentially important factor is instruction dosage. Instruction group children in the current study received considerably fewer hours of direct literacy instruction using the ABRA program over fewer weeks (16 h over 8 weeks) than those in the previous ABRA studies (26 h over 9–13 weeks). Contemporary research suggests that it may be possible to achieve gains in reading over such short periods, but that instruction must be delivered more intensively, using shorter, more frequent sessions than in the current study, to be optimally effective (Reynolds et al., 2010). We anticipated that the parent-led SBR activities would bolster the effects of the online ABRA instruction sessions by providing additional literacy learning opportunities. Our results suggest that this was not the case; however, parent feedback and the shared reading logs showed that around one third of the parent-led SBR activities were not attempted or were only partially completed. Given this low adherence, it is difficult to determine whether parent-led SBR was useful.
A second consideration is the characteristics of the current sample versus those in the previous ABRA studies (Arciuli & Bailey, 2019; Bailey et al., 2017). Children who received instruction in the current study were substantially older (mean age = 113 months) relative to those in the previous studies (mean age = 88 months in Arciuli & Bailey, 2019, and 90 months in Bailey et al., 2017) and tended to demonstrate higher reading skills at baseline. For example, Instruction group mean passage-level reading accuracy raw scores (NARA-2) were 32.60 (SD = 26.59) in the current study, 10.91 (SD = 12.21) in the study by Arciuli and Bailey (2019), and 20.09 (SD = 18.64) in Bailey et al. (2017). Differences in mean reading comprehension raw scores (NARA-2) were even greater: 13.10 (SD = 11.44) in the current study, 1.82 (SD = 2.14) in Arciuli and Bailey (2019), and 5.00 (SD = 5.33) in Bailey et al. (2017). It is possible that Instruction group children in the current study were older or more skilled in a way that led to less benefit from ABRA which is designed to target fundamental reading skills.
Finally, the ABRA program was delivered with high fidelity in the current study. However, as participants used their own computer and internet connection to access the online instruction sessions, it was not possible to control all aspects of instruction delivery. For example, bandwidth limitations restricted access to some instruction sessions for children living in remote areas. Parents’ computer literacy also had the potential to influence the instruction sessions. For example, on one occasion, a participant successfully convinced his parent that his computer speakers were not working by using the mute function in Zoom. Such instances were usually addressed swiftly using the message function in Zoom or by the researcher calling parents over the phone but were nonetheless disruptive.
4.2. Experience of online literacy instruction and parent-led shared book reading
Our interview data showed that parents were very positive about the ABRA program. This is consistent with previous findings showing that parents tend to reflect positively and see advantages in instruction programs and interventions delivered online (e.g., Reaven, Blakeley-Smith, Nichols, & Hepburn, 2011). Interestingly, some parents also reported improvements in their child’s reading skills and reading confidence/motivation, suggesting that participation in the program may have improved aspects of reading which were not captured by the pre- and post-assessment battery we used here.
While there were some positive comments, it is important to acknowledge that several parents reflected negatively on the assigned SBR activities, mostly in relation to time demands. This is consistent with data collected using the SBR logs which showed only partial completion of these activities. It is also consistent with previous studies that have reported low engagement and high attrition rates for parent-led SBR activities (e.g., Justice, Skibbe, McGinty, Piasta, & Petrill, 2011). Further research is required to identify ways of optimising parent-led SBR, especially with regard to improving accessibility and encouraging uptake and adherence. For example, user-friendly apps to collect reading log data may be beneficial. Reducing demands in terms of the duration and frequency of reading activities might also have a positive impact on uptake and adherence.
4.3. Limitations
A key limitation of the current study is the relatively small sample of children with autism. It is difficult to draw definitive conclusions about the efficacy and generalisability of online literacy instruction for children with autism based on this sample. However, we note that samples of fewer than 10 children are commonplace in the field and that the current sample (10 Instruction group participants and 11 Control group participants) comprised a similar number of participants to those in the previously published ABRA research (Arciuli & Bailey, 2019, 11 Instruction group participants and 12 Control group participants; Bailey et al., 2017, 11 Instruction group participants and 9 Control group participants). Given the medium to large effect sizes associated with some of our analyses, it is possible that future studies involving larger samples may return a different pattern of results due to greater statistical power and sampling a broader range of abilities. A second key limitation is that children’s diagnostic status was not verified using an independent autism assessment; rather, children were accepted as having autism based on parents reporting a previous formal diagnosis from a health professional using DSM criteria. Third, as noted above, the current study delivered fewer hours of ABRA instruction per child than our previous studies (a similar overall instruction period but lower intensity than in Arciuli & Bailey, 2019), and there was evidence of adherence issues in the parent-led SBR activities. These issues should be taken into consideration when interpreting the results and provide important lessons for future studies. In addition, while there is some evidence that online and face-to-face assessment yield similar results when working with autistic children (Sutherland et al., 2018), some of the measures used in the current study were not designed for online delivery.
A final consideration is that the same literacy materials were used at pre- and post-instruction assessment. It is therefore possible that practice effects may have contributed to pre- to post-instruction assessment scores; however, we note that participants were not provided any feedback on their performance at either assessment, and it is likely that any practice effects would have influenced Instruction and Control groups in the same way.
4.4. Conclusion and directions for future research
Quantitative data revealed that 16 h of online ABRA literacy instruction delivered over an 8-week period alongside parent-led SBR may not be effective in improving autistic children’s reading skills. However, children were able to participate in the online instruction sessions, and qualitative data revealed that parents reflected on these sessions positively and provided important recommendations regarding their delivery in future studies. Additional research is required to determine why this online ABRA trial was less effective than previous ABRA trials, including the possibility that the amount of ABRA instruction was insufficient and that the small sample may have left the study underpowered to detect statistically significant effects. This is imperative given the possibility of continued disruptions to clinical/educational services due to COVID-19 and the potential of online instruction to increase support services for autistic children.
Data availability
Data will be made available on request.
CRediT authorship contribution statement
Benjamin Bailey: Conceptualisation, Methodology, Formal analysis, Investigation, Data curation, Writing, Project administration, Funding acquisition. Darryl Sellwood: Conceptualisation, Methodology, Formal analysis, Data curation, Writing, Funding acquisition. Fiona Rillotta: Conceptualisation, Methodology, Formal analysis, Data curation, Writing, Funding acquisition. Pammi Raghavendra: Conceptualisation, Methodology, Formal analysis, Data curation, Writing, Funding acquisition. Joanne Arciuli: Conceptualisation, Methodology, Formal analysis, Oversight of Investigation and Project administration, Data curation, Writing, Funding acquisition.
Funding statement
This work was supported by a COVID-19 Collaborative Research Grant awarded by the Caring Futures Institute at Flinders University.
Footnote
There has been a great deal of discussion regarding the use of person-first versus identity-first language with regard to the autism community (e.g., Bottema-Beutel, Kapp, Lester, Sasson, & Hand, 2021; Gernsbacher, 2017; Kenny et al., 2016). Although we did not survey the current sample about their preferences, we have tried to show awareness of the importance of this issue and willingness to accommodate a range of preferences by using person-first and identity-first terminology interchangeably throughout our paper.
Declaration of Competing Interest
None.
Appendix A. Example ABRA Activities
| Module | Example activity | Description |
|---|---|---|
| Alphabetics | Syllable counting | Identify number of syllables in a word |
| | Letter bingo | Identify letter on bingo card when given letter name |
| | Letter sound search | Find the letter corresponding to a sound from the computer |
| | Auditory blending | Blend sounds from the computer to form a whole word |
| | Basic decoding | “Sound out” written words and match them to pictures |
| | Word changing | Change individual letters in a given word to create a new word |
| Reading fluency | Tracking | Use cursor to highlight words in text as they are read aloud |
| | Reading with expression | Take turns reading text with correct and incorrect expression |
| | Speed reading | Get feedback on whether you are reading too fast or too slow |
| | Practice | Practice tracking and reading with speed and expression |
| Reading comprehension | Prediction | Predict what is going to happen next in the story |
| | Comprehension monitoring | Identify words on the page that do not make sense |
| | Vocabulary | Match words to sentences with correct usage |
| | Summarising | Answer questions relating to key information in text and then an overall summary |
| | Story elements | Answer questions about characters, setting, complication, resolution and conclusion |
| Writing | Spelling words | Type words of increasing length and complexity |
| | Spelling sentences | Type sentences of increasing length and complexity |
Appendix B. Phase One Method and Findings
Phase One Method
Participants. Flyers inviting parents to participate in the Phase One interviews, which were geared toward optimising literacy instruction for children with developmental disabilities, were distributed widely online and via the researchers’ personal and professional networks. Two parents volunteered to take part in Phase One. Participant One was the mother of a 7-year-old child with Down syndrome who attended a mainstream school. Participant Two was the mother of a 9-year-old child with Down syndrome and intellectual disability who attended a mainstream school.
Procedure. Participants were asked to view two informational videos on the Concordia University website hosting the ABRACADABRA program (https://literacy.concordia.ca/resources/abra/parent/en/video_using_abra.php). Together, these contained approximately 10 min of footage on the ABRACADABRA program and its use with the general school-aged population. Parents were also asked to complete a purpose-built questionnaire providing information about themselves and their child.
Each participant took part in a one-to-one 30−45 min interview with one of two members of the research team using Zoom. In this interview, parents were asked about their child’s computer skills and home reading activities as well as their own views on the potential uses of ABRACADABRA. The interviews were recorded and analysed using thematic analysis as proposed by Braun & Clarke (2021) to identify key points to guide the delivery of online ABRACADABRA and supplementary parent-led reading activities in Phase Two.
Phase One Findings
Home reading activities. Participants reported that parents, grandparents, and/or siblings were confident reading with their child and regularly engaged in shared reading at home. Specifically, parents reported working through readers and practicing sight words. Parents also reported using the following strategies when reading with their child: slowing down reading speed, asking questions about the story, encouraging their child to point to unfamiliar words and ask for help, asking their child to sound out words, and re-reading books to assist learning through repetition. Materials used in the home included readers from school and allied health clinics, and laminated sight word picture cards. One child also used audio books.
Recommendations for the proposed online ABRA instruction sessions. Participants offered numerous recommendations but the following were key to guiding the research team in their development of the hybrid model piloted in Phase Two: (i) encourage families to establish a daily reading routine including a set time for shared reading activities, (ii) clearly communicate the time requirement and other demands of the instruction period, (iii) be flexible with regard to the types of hardware parents and children use to optimally engage with the program (e.g., use of specialist computer equipment for children with sensory or physical limitations). Other recommendations such as adding Australian accents to the ABRA program could not be implemented (only the ABRA creators at Concordia University can change the program).
Appendix C. Example Shared Book Reading Log Entry
[Figure: example shared book reading log entry]
Note. Unshaded cells were completed by the Researcher and parents were asked to enter information into the shaded cells following the SBR activity.
Appendix D. Phase Two Fidelity Rating Form
1. Did the researcher experience any technical difficulties? How was this managed? How much time was taken to resolve the difficulties?
2. Did the researcher introduce the lesson? What did the researcher do during the introduction? Was the child engaged? Was the introduction effective?
3. How do you feel the ABRA lesson progressed today? Was the child engaged? What aspects were you particularly pleased/displeased with?
4. What is the researcher’s comfort level with the ABRA program and activities that were being used during this session?
5. Did the researcher conclude the lesson? What did the researcher do during the conclusion? Was the child engaged? Was the conclusion effective?
6. Please share any other observations you may have had regarding both the teacher and the children using ABRA (i.e., engagement, organisation, teaching style, interruptions).
Key

| 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Not applicable | Strongly disagree | Disagree | Neutral | Agree | Strongly agree |

| Item | 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| Researcher is familiar with the ABRA program/lesson content | | | | | | |
| Lesson has clear goals and objectives | | | | | | |
| Lesson is planned ahead of time | | | | | | |
| Lesson content is balanced (includes alphabetics, word and text activities) | | | | | | |
| Lesson includes introduction | | | | | | |
| Lesson includes demonstration | | | | | | |
| Researcher monitors child’s navigation of the program | | | | | | |
| Lesson includes conclusion | | | | | | |
| Lesson includes ABRA and non-ABRA activities | | | | | | |
| Lesson includes computer and non-computer activities | | | | | | |
| Researcher manages student behaviour | | | | | | |
| Learning environment is appropriately organised | | | | | | |
| Lesson is at least 30 min in duration | | | | | | |
Appendix E. Example Parent Comments by Theme
| Theme | Quote |
|---|---|
References
- Abrami P.C., Lysenko L., Borokhovski E. The effects of ABRACADABRA on reading outcomes: An updated meta-analysis and landscape review of applied field research. Journal of Computer Assisted Learning. 2020;36(3):260–279. doi: 10.1111/jcal.12417.
- Akemoglu Y., Tomeny K.R. A parent-implemented shared-reading intervention to promote communication skills of preschoolers with autism spectrum disorder. Journal of Autism and Developmental Disorders. 2021;51(8):2974–2987. doi: 10.1007/s10803-020-04757-0.
- American Psychiatric Association. Diagnostic and statistical manual of mental disorders. 5th ed. Washington, DC: Author; 2013.
- Arciuli J., Bailey B. Efficacy of ABRACADABRA literacy instruction in a school setting for children with autism spectrum disorders. Research in Developmental Disabilities. 2019;85:104–115. doi: 10.1016/j.ridd.2018.11.003.
- Arciuli J., Bailey B. The promise of comprehensive early reading instruction for children with autism and recommendations for future directions. Language, Speech, and Hearing Services in Schools. 2021;52(1):225–238. doi: 10.1044/2020_LSHSS-20-00019.
- Bailey B., Arciuli J. Reading instruction for children with autism spectrum disorders: A systematic review and quality analysis. Review Journal of Autism and Developmental Disorders. 2019;7:127–150. doi: 10.1007/s40489-019-00185-8.
- Bailey B., Arciuli J., Stancliffe R.J. Effects of ABRACADABRA literacy instruction on children with autism spectrum disorder. Journal of Educational Psychology. 2017;109(2):257–268. doi: 10.1037/edu0000138.
- Bennett C., Cullinane M., Bennetts S.K., Love J., Hackworth N.J., Mensah F.K., … Westrupp E.M. Tablet-based adaptation and administration of the Castles and Coltheart Reading Test 2 for a large longitudinal study. PLoS ONE. 2020;15(9):e0239420. doi: 10.1371/journal.pone.0239420.
- Berryman F., O’Carroll P. The Fitzroy Readers. 11th ed. Victoria, Australia: Fitzroy Programs Pty Ltd; 2012.
- Boisvert M., Lang R., Andrianopoulos M., Boscardin M.L. Telepractice in the assessment and treatment of individuals with autism spectrum disorders: A systematic review. Developmental Neurorehabilitation. 2010;13(6):423–432. doi: 10.3109/17518423.2010.499889.
- Bottema-Beutel K., Kapp S.K., Lester J.N., Sasson N.J., Hand B.N. Avoiding ableist language: Suggestions for autism researchers. Autism in Adulthood: Knowledge, Practice, and Policy. 2021;3(1):18–29. doi: 10.1089/aut.2020.0014.
- Boyle S.A., McNaughton D., Chapin S.E. Effects of shared reading on the early language and literacy skills of children with autism spectrum disorders: A systematic review. Focus on Autism and Other Developmental Disabilities. 2019;34(4):205–214. doi: 10.1177/1088357619838276.
- Braun V., Clarke V. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qualitative Research in Psychology. 2021;18(3):328–352. doi: 10.1080/14780887.2020.1769238.
- Castles A., Coltheart M., Larsen L., Jones P., Saunders S., McArthur G. Assessing the basic components of reading: A revision of the Castles and Coltheart Test with new norms. 2009. Retrieved from www.motif.org.au.
- Centre for the Study of Learning and Performance. The Learning Toolkit (Version 2.27) [Web application]. Montreal, Canada: Concordia University; 2009. Retrieved from http://doe.concordia.ca/cslp/ICTLTK.php.
- Creswell J.W., Plano Clark V.L. Designing and conducting mixed methods research. 3rd ed. Thousand Oaks, CA: SAGE; 2018.
- DeWalt D.A., Berkman N.D., Sheridan S., Lohr K.N., Pignone M.P. Literacy and health outcomes. Journal of General Internal Medicine. 2004;19(12):1228–1239. doi: 10.1111/j.1525-1497.2004.40153.x.
- Dickinson H., Yates S. More than isolated: The experience of children and young people with disability and their families during the COVID-19 pandemic. Victoria, Australia: ACIE; 2020. Retrieved from https://apo.org.au/node/305856.
- Furlong L., Serry T., Bridgman K., Erickson S. An evidence-based synthesis of instructional reading and spelling procedures using telepractice: A rapid review in the context of COVID-19. International Journal of Language & Communication Disorders. 2021. Advance online publication. doi: 10.1111/1460-6984.12619.
- Gernsbacher M.A. Editorial perspective: The use of person-first language in scholarly writing may accentuate stigma. Journal of Child Psychology and Psychiatry. 2017;58(7):859–861. doi: 10.1111/jcpp.12706.
- Griffiths P.G., Taylor R.H., Henderson L.M., Barrett B.T. The effect of coloured overlays and lenses on reading: A systematic review of the literature. Ophthalmic and Physiological Optics. 2016;36(5):519–544. doi: 10.1111/opo.12316.
- Head J., Pillay V., Wade A., Warwick L. Literacy within the Learning Toolkit+: A guide for regional trainers and teachers. 2018. Retrieved from https://www.concordia.ca/research/learning-performance/tools/learning-toolkit.html#resources.
- Houge T.T., Geier C. Delivering one-to-one tutoring in literacy via videoconferencing. Journal of Adolescent & Adult Literacy. 2009;53(2):154–163. doi: 10.1598/JAAL.53.2.6.
- Justice L.M., Skibbe L.E., McGinty A.S., Piasta S.B., Petrill S. Feasibility, efficacy, and social validity of home-based storybook reading intervention for children with language impairment. Journal of Speech, Language, and Hearing Research. 2011;54(2):523–538. doi: 10.1044/1092-4388(2010/09-0151).
- Kenny L., Hattersley C., Molins B., Buckley C., Povey C., Pellicano E. Which terms should be used to describe autism? Perspectives from the UK autism community. Autism. 2016;20:442–462. doi: 10.1177/1362361315588200.
- Larsen S.C., Hammill D.D., Moats L.C. Test of Written Spelling. 4th ed. Austin, TX: PRO-ED; 1999.
- Machalicek W., Sanford A., Lang R., Rispoli M., Molfenter N., Mbeseha M.K. Literacy interventions for students with physical and developmental disabilities who use aided AAC devices: A systematic review. Journal of Developmental and Physical Disabilities. 2010;22(3):219–240. doi: 10.1007/s10882-009-9175-3.
- McArthur G., Jones K., Anandakumar T., Larsen L., Castles A., Coltheart M. The Test of Everyday Reading Comprehension. 2012. Retrieved from www.motif.org.au.
- McArthur G., Jones K., Anandakumar T., Larsen L., Castles A., Coltheart M. A test of everyday reading comprehension (TERC). Australian Journal of Learning Difficulties. 2013;18(1):35–85. doi: 10.1080/19404158.2013.779588.
- National Early Literacy Panel. Developing early literacy: Report of the National Early Literacy Panel. Washington, DC: National Institute for Literacy; 2008.
- National Institute of Child Health and Human Development. Report of the National Reading Panel: Teaching children to read. Washington, DC: U.S. Government Printing Office; 2000.
- Neale M.D. Neale Analysis of Reading Ability. 3rd ed. Melbourne, Australia: Australian Council for Educational Research; 1999.
- Reaven J., Blakeley-Smith A., Nichols S., Hepburn S. Facing your fears: Group therapy for managing anxiety in children with high-functioning autism spectrum disorders. Baltimore, MD: Brookes; 2011.
- Reynolds M., Wheldall K., Madelaine A. Components of effective early reading interventions for young struggling readers. Australian Journal of Learning Difficulties. 2010;15(2):171–192. doi: 10.1080/19404150903579055.
- Richardson J.T. Eta squared and partial eta squared as measures of effect size in educational research. Educational Research Review. 2011;6:135–147. doi: 10.1016/j.edurev.2010.12.001.
- Sparrow S.S., Cicchetti D.V., Balla D.A. Vineland Adaptive Behavior Scales. 2nd ed. Circle Pines, MN: AGS Publishing; 2005.
- Sutherland R., Trembath D., Roberts J. Telehealth and autism: A systematic search and review of the literature. International Journal of Speech-Language Pathology. 2018;20(3):324–336. doi: 10.1080/17549507.2018.1465123.
- Wagner R.K., Torgesen J.K., Rashotte C.A., Pearson N.A. Comprehensive Test of Phonological Processing. 2nd ed. Austin, TX: PRO-ED; 2013.
- Whalon K.J., Al Otaiba S., Delano M.E. Evidence-based reading instruction for individuals with autism spectrum disorders. Focus on Autism and Other Developmental Disabilities. 2009;24(1):3–16. doi: 10.1177/1088357608328515.
- Wheldall K., McMurtry S. Preliminary evidence for the validity of the new Test of Everyday Reading Comprehension. Australian Journal of Learning Difficulties. 2014;19(2):173–178. doi: 10.1080/19404158.2014.979525.
- Wiederholt J.L., Bryant B.R. Gray Oral Reading Test. 4th ed. Canada: Pearson; 2001.
- Wilkinson G.S., Robertson G.J. Wide Range Achievement Test. 4th ed. Lutz, FL: Psychological Assessment Resources; 2006.
Data Availability Statement
Data will be made available on request.