Author manuscript; available in PMC 2016 Jan 1. Published in final edited form as: Aust J Learn Diffic. 2015 Jun 15;20(1):39–53. doi: 10.1080/19404158.2015.1047871

Utility of the Spelling Sensitivity Score to Analyze Spellings of Children with Specific Language Impairment

Krystal L Werfel 1, Hannah Krimm 2
PMCID: PMC4581535  NIHMSID: NIHMS699317  PMID: 26413194

Abstract

The purpose of this study was to examine the utility of the Spelling Sensitivity Score (SSS) beyond percentage correct scoring in analysing the spellings of children with specific language impairment (SLI). Participants were 31 children with SLI and 28 children with typical language in grades 2 through 4. Spellings of individual words were scored using two methods: (a) percentage correct and (b) SSS. Children with SLI scored lower than children with typical language when spelling was analysed with percentage correct scoring and with SSS scoring. Additionally, SSS scoring highlighted group differences in the nature of spelling errors. Children with SLI were more likely than children with typical language to omit elements and to represent elements with an illegal grapheme in words, whereas children with typical language were more likely than children with SLI to represent all elements with correct letters.


Children with specific language impairment (SLI) exhibit unexplained difficulties in acquiring spoken language (Tomblin et al., 1997). Although research clearly has demonstrated that children with SLI have compromised literacy outcomes (Catts, Fey, Tomblin, & Zhang, 2002), research on spelling outcomes in particular has been limited. Poor spelling is associated with poor academic performance, as well as negative perception of an individual's general capabilities including intelligence and attention to detail (Figueredo & Varnhagen, 2005; Marshall & Powers, 1969). In light of the poor spelling performance of children with SLI as well as the sparse research base on mechanisms underlying poor spelling outcomes for this population, there is a need for increased study of the spelling skills of children with SLI.

Spelling Theories and Instruction

Across theories of spelling development, researchers propose that learning to encode word spellings necessitates use of linguistic knowledge (Apel, Masterson, & Hart, 2004; Bourassa & Treiman, 2001; Gentry, 1978). Indeed, numerous studies have shown that linguistic knowledge is an important contributor to spelling performance (Apel, Wilson-Fowler, Brimo, & Perrin, 2012; Bourassa & Treiman, 2001; Read, 1971; Stahl & Murray, 1994; Walker & Hauerwas, 2006; Werfel, 2012). Spelling skill is dependent on a variety of types of linguistic knowledge and awareness, including phonological awareness (Stahl & Murray, 1994; Treiman, 1991), morphological knowledge (Apel et al., 2012; Treiman & Cassar, 1996; Treiman, Cassar, & Zukowski, 1994), and orthographic knowledge (Treiman, 1994; Walker & Hauerwas, 2006), among others. It is therefore unsurprising that spelling deficits are abundant in children with SLI, who exhibit deficits across these types of linguistic knowledge and awareness (Bishop & Adams, 1990; Cordewener, Bosman, & Verhoeven, 2012; Wagovich, Pak, & Miller, 2012; Young et al., 2002).

Despite the strong theoretical and empirical linguistic basis for spelling, spelling continues to be taught widely in English-speaking countries as a memorization task; little to no time is spent developing and applying linguistic skills in spelling instruction (Carreker, Joshi, & Boulware-Gooden, 2010; Coltheart & Prior, 2007; Schlagal, 2002). Further, spelling performance on individual words typically is assessed as simply correct or incorrect, reflecting the widespread focus on memorization. In contrast, a consideration of error type often can provide information about underlying linguistic skills. The percentage correct scoring method focuses only on whole-word accuracy and fails to capture the specific linguistic nature of spelling errors, which can inform instruction or intervention practices (e.g., Masterson & Apel, 2010). The purpose of this study was to compare the utility of two spelling scoring methods: (a) traditional percentage correct and (b) the Spelling Sensitivity Score (SSS), a linguistic-based spelling scoring system.

Spelling of Children with SLI

Children with SLI score lower on spelling measures than children with typical language (Bishop & Adams, 1990; Cordewener et al., 2012; Wagovich et al., 2012; Young et al., 2002). Spelling deficits in children with SLI are evident in elementary school (e.g., age 8; Bishop & Adams, 1990), and difficulties with spelling persist into young adulthood (Young et al., 2002). Additionally, it appears that the nature of English spelling errors may differ qualitatively for children with SLI and children with typical language (Mackie & Dockrell, 2004; Silliman, Bahr, & Peters, 2006).

In a study of British English spelling, Mackie and Dockrell (2004) broadly classified spelling errors in two ways: phonological inaccuracy and/or orthographic inaccuracy. They reported a clinically meaningful difference in the proportion of spelling errors made by 9- to 12-year-old children with SLI compared to language-matched controls. The effect size between the two groups was large, d = 0.94. In addition to a higher proportion of errors, the nature of spelling errors made by children with SLI also differed compared to chronological age-matched and language age-matched controls. Children with SLI produced spellings with more phonologically-inaccurate errors (e.g., "nan" for "can") and spellings with more orthographically-illegal errors (e.g., "ckan" for "can") than either control group. Additionally, the authors used a separate standardized scoring system to evaluate omission of inflectional word endings. Children with SLI produced spellings characterized by widespread omission of inflectional endings, including progressive –ing, regular past tense –ed, third person singular –s, and plural –s. The only omission error the control groups produced was omission of plural –s. Thus, children with SLI produce proportionally more spelling errors than children with typical language, and these errors appear to differ in nature from errors seen in typical development.

In American English spelling, Silliman, Bahr, and Peters (2006) likewise reported that the spelling errors of children with SLI were qualitatively different than the spelling errors of chronological age-matched and spelling age-matched controls. The authors developed the Phonological, Orthographic, and Morphological Assessment of Spelling (POMAS) to evaluate children's spellings. The POMAS is a spelling measure that consists of 30 words selected to sample different types of linguistic knowledge: phonological, orthographic, and morphological. Errors first were coded broadly into one of these three linguistic knowledge categories. Errors then were coded further for specific linguistic features within the broad category (e.g., consonant blends, vowel patterns, morphological patterns such as past tense -ed). Children with SLI made diffuse errors across all categories. They made more errors in representing the phonological structure of words and omitted morphological markers more often than either control group. Thus, Silliman et al. provide additional evidence that children with SLI produce spellings with errors that do not follow typical developmental patterns.

In contrast to typical educational practices, the two studies of spelling errors of children with SLI used qualitative scoring systems to analyse children's misspellings. Although both spelling systems provide insight into the nature of spelling difficulties in children with SLI, each has limitations for use in clinical practice. Mackie and Dockrell's (2004) scoring system is broad and largely incomplete, lacking a provision for morphological errors. In contrast, the POMAS is very detailed, and teachers are unlikely to have adequate linguistic knowledge to correctly classify spelling errors using such a system (Moats, 1994).

The Spelling Sensitivity Score

Similar to the scoring systems used by Mackie and Dockrell (2004) and Silliman et al. (2006), the Spelling Sensitivity Score (SSS; Masterson & Apel, 2010) allows for more fine-grained analysis of the linguistic nature of spelling errors than traditional percentage correct scoring. The SSS is more comprehensive than Mackie and Dockrell's scoring system, accounting for morphological errors in addition to phonological and orthographic errors, and it requires less extensive explicit knowledge of linguistic structure than the POMAS. SSS scoring considers phonological, orthographic, and morphological factors concurrently and evaluates spellings at the level of individual elements within a word. For these reasons, SSS scoring may prove useful as a formative spelling assessment for identifying children's linguistic weaknesses reflected in their spelling errors.

That children with SLI are poorer spellers than their peers is undisputed. The purpose of this study was to examine the utility of SSS scoring for comprehensive examination of the linguistic nature of spelling errors made by children with SLI.

Method

The Institutional Review Board approved this study. Parents or guardians provided consent for children to participate, and all participants assented prior to each testing session. The data presented here are part of a larger study of the linguistic basis of spelling for children with SLI and children with typical language.

Participants

Participants were 31 children with SLI and 28 children with typical language in grades 2 through 4. The distribution of grade levels between the two groups was not significantly different (χ²(2, N = 59) = 1.113, p = .573). All children scored within normal limits on the Test of Nonverbal Intelligence – 4th Edition (Brown, Sherbenou, & Johnsen, 2010), passed a bilateral hearing screening prior to participation, and spoke English as a first language as determined by parent report. Children with SLI scored at least one standard deviation below the mean, and children with typical language scored within normal limits, on the Core Language Index of the Clinical Evaluation of Language Fundamentals – 4th Edition (Semel, Wiig, & Secord, 2003). Children in the typical language group also had reading and spelling scores within normal limits, as indicated by the Woodcock Reading Mastery Test – III (Woodcock, 2011) and the Test of Written Spelling – 4th Edition (TWS-4; Larsen, Hammill, & Moats, 1999), to avoid including children with additional disorders known to affect these skills (e.g., dyslexia). Children with diagnoses such as intellectual disability, autism, or hearing loss, determined by school and/or parent report, were excluded. Table 1 displays demographic information for participants.

Children were recruited from public and private elementary schools in a southeastern US state. School speech-language pathologists sent consent forms home with children receiving special education services for speech, language, and/or reading. We recruited broadly within these categories because in the most recent and comprehensive epidemiological study of SLI, Tomblin et al. (1997) reported that SLI is largely undiagnosed in school-age children in the US. Additionally, children with SLI have higher rates of speech impairment (Shriberg, Tomblin, & McSweeny, 1999) and reading disability (Catts et al., 2002) than the general population. Children referred during this process who fit criteria for the typical language group (i.e., language, reading, and spelling scores within normal limits; n = 5) were added to the typical language group. Whenever possible, the remaining children with typical language (n = 23) were recruited from the general education classrooms of the children with SLI. For each child with SLI, consent forms were sent home with three children in the same classroom who were judged by the teacher to have typical language development. Nine of the children with SLI were recruited from a school for children with learning disabilities; therefore, children with typical language and literacy skills were not available for recruitment from their classrooms.

Procedures

Children individually completed a battery of speech, language, and literacy measures as part of a larger study (details of assessment battery available from first author on request). Measures were administered in a randomized order. Total assessment time was approximately four hours per child; testing did not last for more than two hours at a time, with one exception per school request. The analysis set for the present study was the first 20 words on the TWS-4 (Larsen et al., 1999). To avoid floor effects, all children began with item 1 and continued at least through item 20. If a child did not reach ceiling per the published protocol (5 consecutive incorrect spellings) before item 20, administration was continued until they reached ceiling. Only the first 20 spellings were used in the analyses reported below. Standard scores are reported in the demographic information in Table 1 but were not used in these analyses.

Analysis

For the initial analysis, spellings were scored using two methods: percentage correct and SSS.

Percentage correct

To compute percentage correct scores, correct spellings received a score of 1 and incorrect spellings received a score of 0. For each child, the number of correct spellings was divided by 20 and the result was multiplied by 100.
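For illustration only, a minimal Python sketch of this computation (not part of the study's procedures); the target and response lists are hypothetical.

```python
# Percentage-correct scoring: 1 point per conventionally spelled word,
# divided by the number of items (20 in the present study) and multiplied by 100.
targets = ["want", "birds", "knife"]      # hypothetical target words
responses = ["want", "berds", "nife"]     # one child's hypothetical spellings

n_correct = sum(t == r for t, r in zip(targets, responses))
percent_correct = n_correct / len(targets) * 100
print(f"{percent_correct:.1f}% correct")  # 33.3% for this toy example
```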

SSS

To compute SSS scores, words were entered into the Computerized Spelling Sensitivity Score program (Masterson & Hrbec, 2011). The program divided words into elements and assigned scores to each element and word as described below. The SSS method resulted in two scores: SSS Word Score (SSS-W) and SSS Element Score (SSS-E).

First, each word was divided into component elements. Base words were parsed phonemically; each phoneme was considered one element (e.g., “want” is parsed into 4 elements: w, a, n, t). In multi-morphemic words, base words were parsed phonemically but each affix was considered one element regardless of the number of phonemes in the affix (e.g., “wanted” is parsed into 5 elements: w, a, n, t, ed).

Next, each element was assigned a score from 0 to 3. Scoring was as follows: (a) conventional spelling = 3 (e.g., “ir” for birds), (b) incorrect spelling in which the element was represented with a legal but incorrect grapheme = 2 (e.g., “er” for birds), (c) incorrect spelling in which the element was represented with an illegal grapheme = 1 (e.g., “u” for birds), and (d) the element was omitted = 0.

Then, each word was assigned a word score and an element score. The word score was the score of the lowest-scoring element in the word (e.g., word score = 2 for "berds"). The element score was the average of the scores of the elements within the word (e.g., element score = 2.75 for "berds"). See Table 2 for examples of SSS scoring.

Following calculation of individual word and element scores, the SSS-W and SSS-E scores were calculated by averaging the word and element scores, respectively, for all words that each participant spelled.1
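To make the arithmetic concrete, the sketch below assumes that words have already been parsed into elements and that each element has already been assigned a 0–3 score (steps performed by the Computerized SSS program); the scored spellings are the Table 2 examples "birds," "berds," and "buds." This is an illustrative sketch, not the authors' software.

```python
# SSS arithmetic: word score = lowest element score in the word;
# element score = mean of the element scores within the word;
# SSS-W and SSS-E = means of those two scores across all words a child spelled.
from statistics import mean

# Element scores (0-3) for three spellings of "birds" (elements b, ir, d, s)
scored_spellings = {
    "birds": [3, 3, 3, 3],   # all elements spelled conventionally
    "berds": [3, 2, 3, 3],   # "er" is a legal but incorrect grapheme for the vowel element
    "buds":  [3, 1, 3, 3],   # "u" is an illegal grapheme for the vowel element
}

word_scores = {w: min(e) for w, e in scored_spellings.items()}
element_scores = {w: mean(e) for w, e in scored_spellings.items()}

sss_w = mean(word_scores.values())      # mean word score across the sample
sss_e = mean(element_scores.values())   # mean element score across the sample

print(word_scores)        # {'birds': 3, 'berds': 2, 'buds': 1}
print(element_scores)     # {'birds': 3, 'berds': 2.75, 'buds': 2.5}
print(sss_w, sss_e)       # 2 and 2.75 for this three-word sample
```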

Reliability

To ensure reliability of percentage correct scores, children's spellings were scored online by the first author. A research assistant trained to administer and score the TWS-4 double-checked all scoring. Next, scores were entered into a database, and a research assistant double-checked data entry. Calculations of percentage correct were performed in Microsoft Excel. Disagreements throughout this process were rare and were resolved by mutual consensus of the authors.

To ensure reliability of SSS scores, the following process was followed. First, children's spellings were entered into a .csv file in Excel. A research assistant double-checked entry of the children's spellings. Next, the .csv file was uploaded to the Computerized SSS program, and the first author or a trained research assistant manually double-checked the program's automatic scoring of each word. Finally, a second trained research assistant triple-checked the SSS scoring of all words. Data were exported to a database, and a research assistant double-checked the accuracy of the export. Again, disagreements throughout this process were rare and were resolved by mutual consensus of the authors. Thus, final scores for each word were agreed on by both authors.

Comparisons

Three independent-samples t-tests were used to compare mean group spelling performance for each scoring method (percentage correct, SSS-W, SSS-E). A Bonferroni correction was applied to guard against inflation of Type I error across multiple comparisons, resulting in an alpha level of .02. As a follow-up analysis, mixed-effects model comparisons were used to examine group differences in word score error types.
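A minimal sketch of one such comparison (scipy) with a Bonferroni-adjusted alpha; the score arrays are hypothetical placeholders, not the study data.

```python
# Independent-samples t-test for one scoring method, evaluated against a
# Bonferroni-corrected alpha (.05 / 3 comparisons ≈ .017, reported as .02).
from scipy import stats

alpha = 0.05 / 3

sli = [55.0, 40.0, 65.0, 30.0, 70.0]   # hypothetical percentage-correct scores
tl = [80.0, 75.0, 95.0, 70.0, 85.0]

t, p = stats.ttest_ind(sli, tl)
print(f"t = {t:.2f}, p = {p:.4f}, significant at corrected alpha: {p < alpha}")
```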

Results

Percentage correct

When spellings were analysed using percentage correct, children with SLI were less accurate than children with typical language (see Figure 1). Children with SLI spelled on average 52.26% of words correctly (SD = 25.06; range = 15.00 – 95.00). In contrast, children with typical language spelled on average 78.93% of words correctly (SD = 16.35; range = 40.00 – 100.00). This difference was statistically significant (p < .001). Cohen's d was 1.26, indicating a large effect of language status on spelling accuracy.
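For reference, Cohen's d can be recovered from the reported group means and standard deviations; the pooled-standard-deviation formula used below is an assumption, since the paper does not state which variant was used.

```python
# Cohen's d from summary statistics, using the pooled standard deviation.
import math

m_sli, sd_sli, n_sli = 52.26, 25.06, 31   # SLI group, percentage correct
m_tl, sd_tl, n_tl = 78.93, 16.35, 28      # typical-language group

pooled_sd = math.sqrt(((n_sli - 1) * sd_sli**2 + (n_tl - 1) * sd_tl**2)
                      / (n_sli + n_tl - 2))
d = (m_tl - m_sli) / pooled_sd
print(round(d, 2))   # ~1.25, in line with the reported 1.26
```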

Figure 1. Percentage of correct spellings by language status. Error bars represent one standard deviation. SLI = specific language impairment. *p < .001.

SSS-W

When children's spellings were analysed using SSS-W, children with SLI demonstrated lower levels of linguistic knowledge in word spellings than children with typical language (see Figure 2). Children with SLI had a mean SSS-W score of 1.96 (SD = 0.63; range = 0.90 – 2.95). Children with typical language had a mean SSS-W score of 2.62 (SD = 0.33; range = 1.75 – 3.00). This difference was statistically significant (p < .001). Cohen's d was 1.31, indicating a large effect of language status on SSS-W scores.

Figure 2. Mean SSS Word Score by language status. Error bars represent one standard deviation. SSS = Spelling Sensitivity Score. SLI = specific language impairment. *p < .001.

SSS-E

When children's spellings were analysed using SSS-E, children with SLI demonstrated lower levels of linguistic knowledge in element spellings than children with typical language (see Figure 3). Children with SLI had an average SSS-E score of 2.51 (SD = 0.37; range = 1.61 – 2.99). Children with typical language had an average SSS-E score of 2.86 (SD = 0.13; range = 2.52 – 3.00). This difference was statistically significant (p < .001). Cohen's d was 1.26, indicating a large effect of language status on SSS-E scores.

Figure 3. Mean SSS Element Score by language status. Error bars represent one standard deviation. SSS = Spelling Sensitivity Score. SLI = specific language impairment. *p < .001.

Follow-up analyses

Mean performance across scoring methods revealed group differences in spelling performance, and SSS-W and SSS-E scores highlighted possible qualitative differences in the nature of spelling errors for children with SLI. To examine the specific nature of group differences, we conducted four mixed-effects model comparisons, one for each word score (i.e., 0, 1, 2, 3). In each model, the outcome was a dichotomous variable indicating whether or not a word received that score. Because each participant spelled multiple words and each word was spelled by multiple participants, it was necessary to control for the effects of repeated measures. To accomplish this, each participant was coded with a participant ID and each word was coded with a word ID, and both were entered as random effects in the null models. The predictor of interest, group, was entered as a fixed effect. Group was a significant predictor compared to the null model for word score 0 (ΔAIC = −16.56, χ² = 18.56, p < .001), word score 1 (ΔAIC = −10.85, χ² = 12.85, p < .001), and word score 3 (ΔAIC = −17.5, χ² = 19.46, p < .001). Group was not a significant predictor for word score 2 (ΔAIC = 1.85, χ² = 0.15, p = .70).
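A minimal sketch of the model-comparison arithmetic (ΔAIC and the likelihood-ratio χ² test) given the log-likelihoods and parameter counts of a null model and a model adding group. Fitting the logistic mixed-effects models themselves is not shown, and the log-likelihood values below are hypothetical, chosen only so that the output matches the word score 0 statistics reported above.

```python
# Compare a null model and a model adding group as a fixed effect:
# change in AIC, likelihood-ratio chi-square, and its p-value.
from scipy.stats import chi2

def compare_models(ll_null, k_null, ll_full, k_full):
    """Return (delta AIC, LR chi-square, p) for two nested models."""
    aic_null = 2 * k_null - 2 * ll_null
    aic_full = 2 * k_full - 2 * ll_full
    lr = 2 * (ll_full - ll_null)          # likelihood-ratio statistic
    p = chi2.sf(lr, df=k_full - k_null)   # chi-square test on the added parameter(s)
    return aic_full - aic_null, lr, p

d_aic, lr, p = compare_models(ll_null=-500.00, k_null=3, ll_full=-490.72, k_full=4)
print(round(d_aic, 2), round(lr, 2), p)   # -16.56, 18.56, ~1.6e-05 (p < .001)
```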

Positive β weights indicate that children with SLI were more likely to receive the word score outcome. Negative β weights indicate that children with typical language were more likely to receive the word score outcome. Therefore, children with SLI were more likely than children with typical language to omit elements and to represent elements with illegal letters in word spellings. Children with typical language were more likely than children with SLI to spell all elements in words correctly. There was no difference between children with SLI and children with typical language for representing elements incorrectly but legally. Table 3 displays model comparison statistics, and Figure 4 displays the distribution of word scores by group.

Figure 4. Distribution of SSS word scores by group.

Discussion

The findings of the present study replicate previous findings that the spelling of children with SLI is less accurate than that of their same-age peers with typical language. On a list of words that children with typical language spelled, on average, nearly 80% correctly, children with SLI spelled only about half correctly. No child with SLI spelled all words correctly, and fewer than 25% of children with SLI scored 80% or above. In contrast, four children with typical language spelled all words correctly, and more than 50% of children with typical language scored 80% or above. That children with SLI score lower than their same-age peers with typical language on spelling measures is undisputed (Bishop & Adams, 1990; Cordewener et al., 2012; Young et al., 2002). However, simply knowing that children with SLI produce a larger proportion of inaccurate spellings does not inform instruction or intervention.

The Spelling Sensitivity Score used in this study is one approach to scoring spellings that can provide insight into the types of linguistic knowledge that children exhibit through their spellings. The scoring system is designed to characterize spellings by inclusion and/or legality of linguistic elements. Masterson and Apel (2010) explained that SSS-W scores of 0 – .99 indicate a lack of awareness of the phonological and/or morphological structure of words. SSS–W scores of 1 – 1.99 indicate a lack of orthographic pattern knowledge. SSS-W scores of 2 – 2.99 indicate generally intact orthographic pattern knowledge with a need for “fine-tuning” word-specific orthographic representations. Thus, SSS scoring offers a quantitative score that provides information about the areas of linguistic knowledge that drive individual children's spellings.
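A small sketch of these interpretation bands as a lookup; the band descriptions are paraphrased from Masterson and Apel (2010) as summarized above.

```python
# Map a mean SSS-W score to Masterson and Apel's (2010) interpretation bands.
def interpret_sss_w(score: float) -> str:
    if score < 1:
        return "lack of awareness of phonological and/or morphological structure"
    if score < 2:
        return "lack of orthographic pattern knowledge"
    if score < 3:
        return "orthographic pattern knowledge intact; word-specific fine-tuning needed"
    return "all elements spelled conventionally"

print(interpret_sss_w(1.96))   # SLI group mean in this study
print(interpret_sss_w(2.62))   # typical-language group mean
```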

The mean SSS-W score for children with SLI in this study was 1.96. According to Masterson and Apel's (2010) interpretation guide, this score indicates deficits in orthographic pattern knowledge. The follow-up analysis in the present study supported this interpretation. Children with SLI were more likely than children with typical language to represent elements with illegal letters. Twenty-five percent of spellings of children with SLI contained at least one element represented illegally, compared to thirteen percent of spellings of children with typical language. For example, the most common misspelling of the word "knife" in the group of children with SLI was "nif," which contains one element spelled legally but inaccurately ("n" for "kn") and one element spelled illegally ("i" for "i_e"). Other misspellings included spellings with even less orthographic accuracy: "life," "tif," and "mth."

Additionally, the follow-up analyses demonstrated that children with SLI were more likely than children with typical language to omit elements in words. Sixteen percent of spellings of children with SLI contained at least one omission, compared to two percent of spellings of children with typical language. The SSS-W score, however, did not reflect this finding. That children with SLI omit elements more frequently than children with typical language was illustrated only when words were analysed individually rather than in terms of average performance across words and participants. This illustrates a limitation of SSS-W scores and suggests that clinicians using the SSS may be advised to analyse on the level of individual word scores (e.g., number or percentage of words in a spelling sample that received each word score) for a more fine-grained interpretation of a child's linguistic strengths and weaknesses. Specifically, the SSS assigns an average score to word-specific phenomena and thus must be interpreted carefully.
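A brief sketch of the word-score breakdown suggested here, computed over a hypothetical 20-word sample.

```python
# Percentage of words in a child's sample that received each word score (0-3),
# a finer-grained view than the single averaged SSS-W value.
from collections import Counter

word_scores = [3, 3, 2, 0, 1, 3, 2, 3, 0, 3,   # hypothetical word scores
               2, 3, 1, 3, 3, 2, 3, 0, 3, 2]   # for a 20-word sample

counts = Counter(word_scores)
for score in range(4):
    print(f"word score {score}: {100 * counts[score] / len(word_scores):.0f}% of words")
```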

In contrast to children with SLI, children with typical language had a mean SSS-W score of 2.62, which indicates adequate orthographic pattern knowledge but occasional deficits in word-specific orthographic knowledge. That is, children with typical language generally represented all elements in words with legal letters; however, they did not always produce conventional spellings of words. Using the above example, when children with typical language misspelled the word "knife," they overwhelmingly represented it as "nife," which contains one element spelled inaccurately ("n" for "kn") but all elements spelled legally. The follow-up analysis confirmed this finding for children with typical language. Children with typical language were less likely than children with SLI to omit elements or spell them illegally and more likely to spell words conventionally.

Children with SLI had lower SSS-E scores than children with typical language; however, both groups had mean scores within the 2 – 2.99 range. Although a mean SSS-E score of 2.50 suggests that children with SLI generally spell elements with legal or correct graphemes, element scores followed the same pattern as word scores: 5.5% of elements were omitted, 14.8% were represented illegally, 5.5% were represented legally but inaccurately, and 73.3% were correct. As with SSS-W scores, simply using the SSS-E score without an individual examination of element scores results in incomplete interpretation of children's spellings.

Our findings replicate previous work that has characterized spelling errors in children with SLI. Mackie and Dockrell (2004) reported that 9- to 12-year-old children with SLI produced a greater proportion of phonologically inaccurate and orthographically illegal spellings than children with typical language. Likewise, Silliman et al. (2006) reported that spelling errors of 6- to 11-year old children with SLI indicated deficits in phonological and orthographic knowledge in spelling, as well as deficits in representing morphological elements. Additionally, our analyses suggest that children with SLI are more likely than children with typical language to omit elements in their spellings. This finding extends Silliman et al.'s (2006) finding that children with SLI omitted more morphological markers than children with typical language.

Implications for Intervention

These converging findings provide guidance in selecting approaches to spelling interventions for children with SLI. Once an appropriate intervention has been identified, we hypothesize that the SSS will be a useful tool for monitoring progress in spelling achievement for children with SLI, as it is for children with typical language (Masterson & Apel, 2010).

There is growing evidence that linguistic-based spelling instruction is effective for children who struggle to master spelling skills. In a recent meta-analysis, Weiser and Mathes (2011) reported robust effects of instruction that links linguistic knowledge to spelling new words on reading and spelling outcomes for at-risk students. The authors argued that this linguistic focus is the "missing link" for students who struggle with literacy (p. 193). Apel and Masterson (2001) described the effectiveness of a multilinguistic spelling intervention program that focused on the phonological, orthographic, and morphological structure of words.

Despite the linguistic basis of spelling and the evidence supporting linguistic-based instruction, Schlagal (2002) reported that spelling instruction in the United States continues to focus primarily on memorizing the spellings of individual words, with little focus on linguistic attributes. Likewise, Coltheart and Prior (2007) reported that literacy instruction in Australia is primarily based on the whole language approach, with little focus on explicit teaching of the linguistic structure of words. Both of these reports followed government-sponsored meta-analyses that recommended explicit, systematic literacy instruction rooted in linguistic knowledge (NICHD, 2000; Rowe, 2005). Ten years after the report of the National Reading Panel in the US, teachers still reported using memorization-based strategies rather than linguistic-based strategies when teaching spelling (Carreker et al., 2010).

The focus on holistic percentage-correct scoring of children's spellings may contribute, in part, to the widespread use of memorization-based instructional approaches. When the level of analysis of children's spellings is simply right or wrong, misspellings provide little guidance as to how to approach instruction. Additionally, several researchers in the US have reported that teachers possess insufficient knowledge of linguistic structure to provide highly effective literacy instruction (e.g., Carreker et al., 2010; Cunningham, Perry, Stanovich, & Stanovich, 2004; Moats, 1994; Spencer, Schuele, Guillot, & Lee, 2008), and less than 45% of pre-service teachers in Australia believe they are sufficiently prepared to teach spelling or phonics (Louden & Rohl, 2006). Such findings point to a lack of teacher preparedness to analyse misspellings beyond correct/incorrect for the purpose of guiding instruction. It is difficult to design an appropriate intervention informed by the areas of weakness that children's misspellings reveal when an educator has little knowledge of linguistic structure.

Limitations and Future Directions

As with all studies, our findings have some limitations and provide direction for future research. We evaluated the spellings of second, third, and fourth graders as a single group. It is possible that misspellings of children with SLI indicate deficits with different types of linguistic knowledge across grades, and it would be interesting to study each grade individually in the future. Additionally, research should utilize a longitudinal study design to evaluate the validity of the SSS for measuring growth in spelling skills for children with SLI and to study the role of linguistic knowledge in children's spellings across development. A longitudinal study could also further clarify whether spellings of children with SLI are better characterized as following a delayed but intact developmental trajectory or a fundamentally disordered one. Finally, it would be valuable to evaluate the feasibility of classroom teachers implementing the SSS to analyse children's misspellings and thus tailor instruction.

Conclusion

Our findings converge with previous studies of spelling in children with SLI. Overall, children with SLI spell words less accurately than their peers with typical language. Further, SSS scoring is a useful tool for analysing the spellings of children with SLI insofar as it allows for concurrent analysis of the linguistic knowledge represented in spellings.

The SSS scoring system provides a quantitative score that offers much more information about students’ spellings than a traditional percentage correct scoring method. A metric such as SSS may aid educators in identifying areas of linguistic difficulty captured in children's misspellings. SSS-W and SSS-E scores are limited, however, in that they provide a mean score for categorical variables. Although this mean score provides a general overview of spelling skill, additional consideration of individual word scores and element scores may more explicitly guide educators in selecting appropriate interventions based on spelling errors. For example, if a child tends to omit elements in words, phonological awareness training, including phoneme counting, may be advisable. Conversely, if elements tend to be represented illegally, intervention that focuses on sound/symbol relationships and orthographic patterns may be more appropriate. Future research should explicitly examine the effect of such targeted interventions on spelling for children with SLI.

Table 1.

Demographic Information

Variable Group Mean SD Range t p d
Age (months) SLI 111.97 12.64 88 – 136 2.431 .018 0.63
TL 104.29 11.53 86 – 127
Language SLI 71.19 9.81 40 – 84 14.373 <.001 3.74
TL 108.21 9.96 88 – 124
Reading SLI 79.45 13.42 55 – 108 8.656 <.001 2.26
TL 109.93 13.58 89 – 145
Spelling SLI 85.16 12.11 68 – 114 7.111 <.001 1.85
TL 109.00 13.64 87 – 140

Note. SLI = specific language impairment; TL = typical language.

Table 2.

Example of SSS Word and Element Scoring

Spelling Word Score Element Score
birds 3 3
berds 2 2.75
buds 1 2.5
bds 0 2.0

Note. SSS = Spelling Sensitivity Score. The target word "birds" is parsed into four elements: b, ir, d, s.

Table 3.

Model comparisons for mixed-effects models for each word score

Model β (group) ΔAIC χ² p
1. word score 0, null model — — — —
2. word score 0, including group 0.13 −16.56 18.56 < .001
3. word score 1, null model — — — —
4. word score 1, including group 0.13 −10.85 12.85 < .001
5. word score 2, null model — — — —
6. word score 2, including group 0.01 1.85 0.15 .70
7. word score 3, null model — — — —
8. word score 3, including group −0.27 −17.5 19.46 < .001

Note: Δ AIC = change in Akaike information criterion when group was added as a predictor.

Acknowledgements

This research was supported by the 2012 Jeanne S. Chall Research Fellowship (PI: Werfel) from the International Reading Association, a Preparation of Leadership Personnel grant (H325D080075; PI: Schuele) from the US Department of Education, and the Vanderbilt CTSA grant UL1 RR024975-01 from NCRR/NIH. The content is solely the responsibility of the authors and does not necessarily represent the official views of the International Reading Association, US Department of Education, or National Institutes of Health. The authors thank Lucy D'Agostino McGowan, whose guidance in data analysis was very helpful.

Footnotes

1. For consistency, we use the terms "word score" and "element score" when referring to individual words and "SSS-W" and "SSS-E" when referring to mean scores across words.

Contributor Information

Krystal L. Werfel, University of South Carolina

Hannah Krimm, Vanderbilt University.

References

  1. Apel K, Masterson J. Theory-guided spelling assessment and intervention: A case study. Language, Speech, and Hearing Services in Schools. 2001;32:182–195. doi: 10.1044/0161-1461(2001/017)
  2. Apel K, Masterson J, Hart P. Integration of language components in spelling: Instruction that maximizes students' learning. In: Silliman E, Wilkinson L, editors. Language and literacy learning in schools. New York, NY: The Guilford Press; 2004.
  3. Apel K, Wilson-Fowler E, Brimo D, Perrin N. Metalinguistic contributions to reading and spelling in second and third grade students. Reading and Writing. 2012;25:1283–1305. doi: 10.1007/s11145-011-9317-8
  4. Bishop D, Adams C. A prospective study of the relationship between specific language impairment, phonological disorders, and reading achievement. Journal of Child Psychology and Psychiatry. 1990;31:1027–1050. doi: 10.1111/j.1469-7610.1990.tb00844.x
  5. Bourassa D, Treiman R. Spelling development and disability: The importance of linguistic factors. Language, Speech, and Hearing Services in Schools. 2001;32:172–181. doi: 10.1044/0161-1461(2001/016)
  6. Brown L, Sherbenou R, Johnsen S. Test of Nonverbal Intelligence. 4th ed. Austin, TX: Pro-Ed; 2010.
  7. Carreker S, Joshi RM, Boulware-Gooden R. Spelling-related teacher knowledge: The impact of professional development on identifying appropriate instructional activities. Learning Disability Quarterly. 2010;33:148–158. doi: 10.1177/073194871003300304
  8. Catts H, Fey M, Tomblin JB, Zhang X. A longitudinal investigation of reading outcomes in children with language impairments. Journal of Speech, Language, and Hearing Research. 2002;45:1142–1157. doi: 10.1044/1092-4388(2002/093)
  9. Coltheart M, Prior M. Learning to read in Australia. Canberra: Academy of the Social Sciences in Australia; 2007.
  10. Cordewener K, Bosman A, Verhoeven L. Characteristics of early spelling of children with specific language impairment. Journal of Communication Disorders. 2012;45:212–222. doi: 10.1016/j.jcomdis.2012.01.003
  11. Cunningham A, Perry K, Stanovich K, Stanovich P. Disciplinary knowledge of K-3 teachers and their knowledge calibration in the domain of early literacy. Annals of Dyslexia. 2004;54:139–167. doi: 10.1007/s11881-004-0007-y
  12. Figueredo L, Varnhagen CK. Didn't you run the spell checker? Effects of type of spelling error and use of a spell checker on perceptions of the author. Reading Psychology. 2005;26:441–458.
  13. Gentry JR. Early spelling strategies. The Elementary School Journal. 1978;79:88–92.
  14. Larsen S, Hammill D, Moats L. Test of Written Spelling. 4th ed. Austin, TX: Pro-Ed; 1999.
  15. Louden W, Rohl M. "Too many theories and not enough instruction": Perceptions of preservice teacher preparation for literacy teaching in Australian schools. Literacy. 2006;40:66–78.
  16. Mackie C, Dockrell J. The nature of written language deficits in children with SLI. Journal of Speech, Language, and Hearing Research. 2004;47:1469–1480. doi: 10.1044/1092-4388(2004/109)
  17. Marshall J, Powers J. Writing neatness, composition errors, and essay grades. Journal of Educational Measurement. 1969;6:97–101.
  18. Masterson J, Apel K. The spelling sensitivity score: Noting developmental changes in spelling knowledge. Assessment for Effective Intervention. 2010;36:35–45. doi: 10.1177/1534508410380039
  19. Masterson J, Hrbec B. Computerized Spelling Sensitivity System [computer software]. Springfield, MO: Missouri State University Language-Literacy Lab; 2011.
  20. Moats L. The missing foundation in teacher education: Knowledge of the structure of spoken and written language. Annals of Dyslexia. 1994;44:81–102. doi: 10.1007/BF02648156
  21. NICHD. Report of the National Reading Panel: Teaching children to read: Reports of the subgroups (00-4754). Washington, DC: U.S. Government Printing Office; 2000.
  22. Read C. Preschool children's knowledge of English phonology. Harvard Educational Review. 1971;41:1–34.
  23. Rowe K. Teaching reading: Report and recommendations. Australian Government Department of Education, Science, and Training, National Inquiry into the Teaching of Literacy; 2005.
  24. Schlagal B. Classroom spelling instruction: History, research, and practice. Reading Research and Instruction. 2002;42:44–57.
  25. Semel E, Wiig E, Secord W. Clinical Evaluation of Language Fundamentals. 4th ed. San Antonio, TX: The Psychological Corporation; 2003.
  26. Shriberg L, Tomblin JB, McSweeny J. Prevalence of speech delay in 6-year-old children and comorbidity with language impairment. Journal of Speech, Language, and Hearing Research. 1999;42:1461–1481. doi: 10.1044/jslhr.4206.1461
  27. Silliman E, Bahr R, Peters M. Spelling patterns in preadolescents with atypical language skills: Phonological, morphological, and orthographic factors. Developmental Neuropsychology. 2006;29:93–123. doi: 10.1207/s15326942dn2901_6
  28. Spencer E, Schuele CM, Guillot K, Lee M. Phonemic awareness skill of speech-language pathologists and other educators. Language, Speech, and Hearing Services in Schools. 2008;39:512–520. doi: 10.1044/0161-1461(2008/07-0080)
  29. Stahl S, Murray B. Defining phonological awareness and its relationship to early reading. Journal of Educational Psychology. 1994;86:221–234.
  30. Tomblin J, Records N, Buckwalter P, Zhang X, Smith E, O'Brien M. Prevalence of specific language impairment in kindergarten children. Journal of Speech, Language, and Hearing Research. 1997;40:1245–1260. doi: 10.1044/jslhr.4006.1245
  31. Treiman R. Children's spelling errors on syllable-initial consonant clusters. Journal of Educational Psychology. 1991;83:346–360.
  32. Treiman R. Use of consonant letter names in beginning spelling. Developmental Psychology. 1994;30:567–580.
  33. Treiman R, Cassar M. Effects of morphology on children's spelling of final consonant clusters. Journal of Experimental Child Psychology. 1996;63:141–170. doi: 10.1006/jecp.1996.0045
  34. Treiman R, Cassar M, Zukowski A. What types of linguistic information do children use in spelling? The case of flaps. Child Development. 1994;65:1318–1337. doi: 10.1111/j.1467-8624.1994.tb00819.x
  35. Wagovich S, Pak Y, Miller M. Orthographic word knowledge growth in school-age children. American Journal of Speech-Language Pathology. 2012;21:140–153. doi: 10.1044/1058-0360(2012/10-0032)
  36. Walker J, Hauerwas L. Development of phonological, morphological, and orthographic knowledge in young spellers: The case of inflected verbs. Reading and Writing. 2006;19:819–843.
  37. Weiser B, Mathes P. Using encoding instruction to improve the reading and spelling performances of elementary students at risk for literacy difficulties: A best-evidence synthesis. Review of Educational Research. 2011;81:170–200. doi: 10.3102/0034654310396719
  38. Werfel K. Contributions of linguistic knowledge to spelling performance in elementary school children with and without language impairment (Unpublished doctoral dissertation). Nashville, TN: Vanderbilt University; 2012.
  39. Woodcock R. Woodcock Reading Mastery Tests. 3rd ed. San Antonio, TX: Pearson; 2011.
  40. Young A, Beitchman J, Johnson C, Douglas L, Atkinson L, Escobar M, Wilson B. Young adult academic outcomes in a longitudinal sample of early identified language impaired and control children. Journal of Child Psychology and Psychiatry. 2002;43:635–645. doi: 10.1111/1469-7610.00052
