Abstract
The Short Sensory Profile (SSP) is one of the most commonly used measures of sensory features in children with autism spectrum disorder (ASD), but psychometric studies in this population are limited. Using confirmatory factor analysis, we evaluated the structural validity of the SSP subscales in ASD children. Confirmatory factor models exhibited poor fit, and a follow-up exploratory factor analysis suggested a 9-factor structure that only replicated three of the seven original subscales. Secondary analyses suggest that while reliable, the SSP total score is substantially biased by individual differences on dimensions other than the general factor. Overall, our findings discourage the use of the SSP total score and most subscale scores in children with ASD. Implications for future research are discussed.
Keywords: Autism Spectrum Disorder, Sensory, Short Sensory Profile, Factor Analysis, Psychometric, Validity
Introduction
Atypical responses to sensory stimuli have long been observed in individuals with autism spectrum disorder (ASD) and are now considered a defining feature of the condition (American Psychiatric Association, 2013). Numerous studies have demonstrated ASD-control differences in sensory processing across modalities and assessment instruments (Baum, Stevenson, & Wallace, 2015; Ben-Sasson et al., 2009; Boudjarane, Grandgeorge, Marianowski, Misery, & Lemonnier, 2017; Hazen, Stornelli, O’Rourke, Koesterer, & McDougle, 2014; Marco, Hinkley, Hill, & Nagarajan, 2011; Mikkelsen, Wodka, Mostofsky, & Puts, 2018; Moore, 2015; Robertson & Baron-Cohen, 2017; Schauder & Bennetto, 2016). However, findings across studies are extremely heterogeneous, frequently contradictory, and do not converge on a specific pattern of sensory features as a hallmark of ASD (Schaaf & Lane, 2015).
In light of the clinical heterogeneity of sensory features and of ASD itself, some researchers have advocated for taking an “individual differences” approach, whereby within-group variation is emphasized, and specific patterns of sensory processing are used to stratify ASD into clinically meaningful subgroups (Uljarević et al., 2017). This perspective has become increasingly popular in recent years, with a proliferation of studies that utilize cluster analysis or Gaussian mixture models to define subgroups of individuals with ASD based on reported sensory symptoms (Ausderau et al., 2014a, 2016; Ben-Sasson et al., 2008; Elwin, Schröder, Ek, Wallsten, & Kjellin, 2017; Hand, Dennis, & Lane, 2017; Lane, Young, Baker, & Angley, 2010; Lane, Dennis, & Geraghty, 2011; Lane, Molloy, & Bishop, 2014b; Liss, Saulnier, Fein, & Kinsbourne, 2006; Uljarević, Lane, Kelly, & Leekam, 2016). Unfortunately, these investigations have produced disparate classification schemes that seem to be largely dependent on the questionnaire used to assess sensory symptoms (for a review, see DeBoth & Reynolds, 2017). Although the fractionation of ASD into sensory subtypes is a laudable goal, it is hampered by the large number of instruments created to measure sensory features and the relative dearth of empirical studies supporting the reliability and validity of these instruments in the ASD population (Burns, Dixon, Novack, & Granpeesheh, 2017). Thus, the current study aims to supplement the existing literature by evaluating the structure and psychometric properties of a commonly used sensory measure, the Short Sensory Profile (SSP; McIntosh, Miller, Shyu, & Dunn, 1999), in a large sample of children with ASD.
The Short Sensory Profile: Development and Prior Studies
The SSP is a shortened form of Dunn’s Sensory Profile caregiver questionnaire (SP; Dunn, 1999) originally developed as a screening tool to identify children with sensory processing difficulties (McIntosh et al., 1999). The SSP contains 38 items organized into seven subscales. The SSP total score and the score on each subscale can be used to classify children’s level of sensory abnormality (Typical, Probable Difference, or Definite Difference) based on score percentiles from a large normative sample of children without disabilities.
The 38-item SSP was derived from psychometric analyses of SP questionnaires completed by the caregivers of 117 children aged 3–17 years: 21 with clinically diagnosed Sensory Modulation Disorder (SMD), 24 with Fragile X Syndrome, 35 with other developmental disabilities, including but not limited to ASD, and 37 with typical development (TD). From the 125-item SP, 27 items with content outside of the domain of sensory modulation were eliminated. Sixty additional items were removed based on: (a) lack of differentiation between SMD and TD groups, (b) low item-total correlations with conceptually similar sections of the long-form SP, (c) poor item loadings in a Principal Component Analysis (PCA) using the normative TD sample from the long-form SP validation, and (d) increases in Cronbach’s (1951) alpha upon removal of the item. The final 38 items were again subjected to PCA in the large normative sample, the results of which were reported by McIntosh and colleagues (1999).
Based on the above methodology, the authors of the SSP concluded that the questionnaire was made up of seven subscales (McIntosh et al., 1999): Tactile Sensitivity (TAC; 7 items), Taste/Smell Sensitivity (TSM; 4 items), Movement Sensitivity (MOV; 3 items), Underresponsive/Seeks Sensation (USS; 7 items), Auditory Filtering (AFL; 6 items), Low Energy/Weak (LEW; 6 items), and Visual/Auditory Sensitivity (VAS; 5 items). Reliability of each subscale was assessed by calculating Cronbach’s alpha for the 117-child sample (α = 0.82–0.89). Correlations between the unit-weighted subscale scores in the sample were large on average (mean reported r = 0.53, range 0.25–0.72; Cohen, 1992). Neither alpha values nor subscale correlations were reported for the larger normative TD sample. Lastly, construct validity was assessed in a small study evaluating skin conductance responses to sensory stimuli in children with SMD (n = 19) and TD children (n = 19). Results showed that children with abnormal skin conductance responses scored lower on the SSP than those with typical skin conductance (McIntosh et al., 1999).
Beyond the initial construction of the SSP, two other studies have investigated the factor structure of the SSP in non-ASD populations (Engel-Yeger, 2010; Ee, Loh, Chinna, & Marret, 2016). Engel-Yeger (2010) administered a Hebrew translation of the SSP to the parents of 395 TD Israeli children aged 3–10 years, reporting an 8-component solution with item 1 of the measure occupying its own factor and four of the AFL items loading on the USS factor. In the only confirmatory factor analysis of the SSP, Ee et al. (2016) fit a correlated 7-factor SSP model to data from a Malay translation of the measure in a sample of 419 TD children. Initial model fit indices were subpar (CFI = 0.87, TLI = 0.86, RMSEA = 0.06), and post-hoc correlation of error terms was required before the authors deemed model fit adequate (CFI = 0.92, TLI = 0.91, RMSEA = 0.05). Neither translation study was able to validate the original factor structure of the SSP, suggesting the possibility that the original SSP factor structure was misspecified.
Use of the SSP in Children with ASD
Despite the limited empirical support for the factor structure of the SSP, the questionnaire has become one of the most commonly used instruments to measure parent-reported sensory features in children with ASD. In a review of 93 studies assessing sensory processing in ASD, Burns and colleagues (2017) noted that the SSP was the second most utilized measure (28.0% of studies) after the full-length SP. Moreover, all SP measures (including the SP, SSP, as well as infant/toddler and adolescent/adult adaptations) together were employed in the vast majority (79.6%) of published studies (Burns et al., 2017). The SSP has served as the sensory phenotyping measure of choice in large-scale ASD projects such as the Autism Speaks Autism Treatment Network (Lajonchere, Jones, Coury, & Perrin, 2012) and the EU-AIMS Longitudinal European Autism Project (Charman et al., 2017), likely due to its reduced participant burden relative to lengthier surveys and its widespread use in the ASD literature. Given the wide-scale adoption of the measure, SSP total scores are also commonly utilized in convergent validity testing of newer sensory measures (Neil, Green, & Pellicano, 2017; Siper, Kolevzon, Wang, Buxbaum, & Tavassoli, 2017; Tavassoli et al., 2016). Previous investigations have used item- and subscale-level SSP data to compare the sensory processing of children with ASD to both TD peers (e.g. Tomchek & Dunn, 2007) and children with other intellectual and developmental disabilities (O’Brien, Tsermentseli, & Cummins, 2009; Green, Chandler, Charman, Simonoff, & Baird, 2016a; Rogers, Hepburn, & Wehner, 2003). In addition, profiles of SSP subscale scores from children with ASD have been used in a number of studies to define phenotypic subgroups of children with ASD (Hand et al., 2017; Lane et al., 2010, 2011, 2014b; Tomchek, Little, Myers, & Dunn, 2018; Uljarević et al., 2016). Most frequently, the SSP total score is used as a dimensional measure of overall sensory atypicality in ASD (or in some cases, specifically as an index of sensory hyperresponsiveness), serving to explain individual differences in other phenotypic, neuroimaging, or physiological measures (Chen, Rodgers, & McConachie, 2009; Corbett, Muscatello, & Blain, 2016; Glod, Riby, Honey, & Rodgers, 2015; Hegarty et al., 2018; Johnson et al., 2014; Mazurek et al., 2013, 2014; Mazurek & Petroski, 2015; McCormick et al., 2014; Neil, Olsson, & Pellicano, 2016; Orekhova et al., 2012; Samson et al., 2014; Wigham, Rodgers, South, McConachie, & Freeston, 2015).
To date, only one factor analytic study of the SSP has been conducted in a sample of children with ASD (Tomchek, Huebner, & Dunn, 2014). Using data from 400 ASD children 3 to 6 years of age, the investigators conducted a PCA with varimax rotation on the 38 items of the SSP. Amidst ambiguous evidence suggesting the extraction of between 6 and 11 components, the 6-component structure was chosen due to its interpretability and the fact that the other solutions contained factors with only one or two salient loadings. The loading matrix from the ASD sample only partially resembled the structure in TD children (McIntosh et al., 1999), with only the LEW and TSM factors containing the same items. The other four reported factors were interpreted as: (a) tactile and movement sensitivity (TMV), containing all items from the TAC and MOV subscales, (b) auditory and visual sensitivity (AVS), containing the original VAS items plus three AFL items representing a reduced ability to concentrate in ambient noise, (c) sensory seeking/distractibility (SSD), containing items from the USS and AFL scales that largely tapped the constructs of inattention and hyperactivity, and (d) hypo-responsivity (HYP), containing two AFL items about the child seemingly ignoring speech, as well as the two USS items “Doesn’t seem to notice when face or hands are messy” and “Leaves clothing twisted on body” (Tomchek et al., 2014). Based on the above findings, the authors acknowledged the divergence of this factor structure from that proposed by McIntosh et al. (1999), speculating that the latent factors underlying SSP responses may be qualitatively different in ASD children. While preliminary and limited to preschool children, these findings call into question the validity of the original SSP factor structure in children with ASD. In the case that the SSP truly does measure different constructs in ASD and TD children, ASD-TD group differences in SSP total and subscale scores may not actually reflect differences in the underlying latent constructs. Given the potential implications of the study by Tomchek et al., confirmatory analyses in an independent sample of ASD children across the SSP age range are required to specifically test the fit of the original 7-factor SSP model in children with ASD.
Present Investigation
The primary aim of the current study was to assess the fit of the 7-factor SSP model in a sample of children with ASD using confirmatory factor analysis (CFA). The fit of this model was also compared to that of the factor model proposed by Tomchek et al. (2014). Acceptable CFA model fit is necessary for valid inferences to be made from the SSP subscale scores in this population. In particular, the models used by Lane et al. (2010, 2011, 2014b) and others to derive SSP-based sensory subtypes utilized subscale scores as their inputs and thus relied on the assumption that these scores are valid in the ASD population. Furthermore, valid comparisons of SSP scores between children with ASD and other groups require the establishment of measurement invariance, an assumption of which is that the items on the questionnaire tap the same latent constructs in both groups (i.e., “configural invariance”; Gregorich, 2006; Vandenberg & Lance, 2000). Although this study was not designed to test for measurement invariance between ASD and TD children on the SSP, an ill-fitting 7-factor model reduces the interpretability of differences in subscale scores between ASD children and TD children, as they could reflect something other than true differences in the latent constructs that those subscales purportedly measure.
Methods
Participants
This investigation was a secondary analysis of questionnaire data completed by the caregivers of 388 children with ASD, pooled from two independent datasets (Table 1). The first dataset consisted of children who participated in a number of behavioral and neuroimaging studies at Vanderbilt University Medical Center between the years 2007 and 2017. All data collection procedures were approved by the university institutional review board. ASD diagnoses were confirmed using gold-standard measures, including the Autism Diagnostic Observation Schedule (ADOS; Lord et al., 2000, original algorithm) and the Autism Diagnostic Interview-Revised (ADI-R; Lord, Rutter, & Le Couteur, 1994). As a part of the assessment battery, the caregivers of these children filled out the long-form SP, from which the SSP items were extracted. Children were included in this study if: (a) they met ASD criteria per the study protocol, (b) they were between 2 and 12 years old at the time of assessment, and (c) they had complete data for all 38 SSP items on their SP caregiver questionnaires. In total, 127 children met these criteria and were included (106 males, mean age 8.55 years). The second dataset consisted of questionnaire data from the National Database for Autism Research (NDAR; Hall, Huerta, McAuliffe, & Farber, 2012). Children were included in the investigation if: (a) they were classified as having an “Autism Spectrum” phenotype, according to NDAR convention, (b) they were between 2 and 12 years old at the time of assessment, and (c) complete item-level data were available for the SSP or corresponding long-form SP items. In total, 261 children met these criteria and were included in the study (204 males, mean age 6.74 years).
Table 1.
 | Vanderbilt | NDAR | Total Sample |
---|---|---|---|
Sample Size | 127 | 261 | 388 |
Sex | |||
Male | 106 | 204 | 310 |
Female | 21 | 57 | 78 |
Chronological Age | |||
Years [M (SD)] | 8.55 (2.07) | 6.74 (3.15) | 7.34 (2.97) |
Min–Max | 3.58–12.83 | 2.17–12.75 | 2.17–12.83 |
Verbal IQ | |||
High | 16 | 13 | 29 |
Average | 73 | 46 | 119 |
Low | 21 | 78 | 99 |
Unavailable | 17 | 124 | 141 |
Performance IQ^a |||
High | 22 | 6 | 28 |
Average | 69 | 30 | 99 |
Low | 19 | 63 | 82 |
Unavailable | 17 | 162 | 179 |
SSP Total [M (SD)] | 128.24 (23.11) | 136.67 (21.01) | 133.91 (22.04) |
Sensory Classification^b |||
Definite difference | 87 | 157 | 244 |
Probable difference | 19 | 53 | 72 |
Typical | 21 | 51 | 72 |
Note. NDAR = National Database for Autism Research; SSP = Short Sensory Profile
^a High: IQ > 100; Average: 85 ≤ IQ ≤ 100; Low: IQ < 85; per NDAR convention
^b Based on the SSP total score
The combined sample was 80.0% male and had a mean age of 7.34 years (range 2.16–12.83 years). Though the SSP was originally designed to be used for children 3–10 years of age (Dunn, 1999), the broader 2–12-year inclusion range was chosen to match that used in one of the most recent investigations of SSP-based ASD subtypes (Hand et al., 2017) and a CFA study of a comparable sensory questionnaire for children with ASD (Ausderau et al., 2014b). Moreover, the vast majority of children in our sample (86%) fell within the recommended age range for the SSP, and only 3% of the sample was younger than three years of age. To best represent the entire population of children with ASD seen in research settings, no children were excluded on the basis of IQ, functional status, or known genetic condition. Verbal and nonverbal IQ data were available for a subset of the children from each dataset and are presented in Table 1, categorized according to the NDAR convention of Low (<85), Average (85–100), or Above Average (>100).
Measures
The SSP (McIntosh et al., 1999) is a 38-item caregiver questionnaire derived from the long-form SP (Dunn, 1999). Its items (Table 2) represent observable child behaviors that are scored on a 1–5 rating scale based on their frequency (1 being “Always” and 5 being “Never”); notably, lower scores indicate a higher frequency of the endorsed behaviors. The SSP total score and the score on each of the instrument’s seven subscales can be used to classify children into the categories of “Typical Performance,” “Probable Difference,” or “Definite Difference” based on scores from a large normative sample.
Table 2.
Item Content | SSP Factor^a | ASD Factor^b | Endorsed^c N (%) | rcorr^d |
---|---|---|---|---|
1. Expresses distress during grooming | TAC | TMV | 167 (43%) | 0.45 |
2. Prefers long-sleeved clothing when it is warm or short sleeves when it is cold | TAC | TMV | 43 (11%) | 0.42 |
3. Avoids going barefoot, especially in sand or grass | TAC | TMV | 52 (13%) | 0.34 |
4. Reacts emotionally or aggressively to touch | TAC | TMV | 28 (7%) | 0.46 |
5. Withdraws from splashing water | TAC | TMV | 36 (9%) | 0.45 |
6. Has difficulty standing in line or close to other people | TAC | TMV | 84 (22%) | 0.52 |
7. Rubs or scratches out a spot that has been touched | TAC | TMV | 30 (8%) | 0.44 |
8. Avoids certain tastes or food smells that are typically part of children’s diets | TSM | TSM | 145 (37%) | 0.44 |
9. Will only eat certain tastes | TSM | TSM | 147 (38%) | 0.39 |
10. Limits self to particular food textures/temperatures | TSM | TSM | 146 (38%) | 0.43 |
11. Picky eater, especially regarding food textures | TSM | TSM | 179 (46%) | 0.42 |
12. Becomes anxious or distressed when feet leave the ground | MOV | TMV | 28 (7%) | 0.46 |
13. Fears falling or heights | MOV | TMV | 44 (11%) | 0.40 |
14. Dislikes activities where head is upside down | MOV | TMV | 27 (7%) | 0.33 |
15. Enjoys strange noises/seeks to make noise for noise’s sake | USS | SSD | 130 (34%) | 0.37 |
16. Seeks all kinds of movement and this interferes with daily routines | USS | SSD | 200 (52%) | 0.50 |
17. Becomes overly excitable during movement activity | USS | SSD | 171 (44%) | 0.45 |
18. Touches people and objects | USS | SSD | 166 (43%) | 0.33 |
19. Doesn’t seem to notice when face or hands are messy | USS | HYP | 122 (31%) | 0.32 |
20. Jumps from one activity to another so that it interferes with play | USS | SSD | 114 (29%) | 0.22 |
21. Leaves clothing twisted on body | USS | HYP | 65 (17%) | 0.35 |
22. Is distracted or has trouble functioning if there is a lot of noise around | AFL | AVS | 193 (50%) | 0.53 |
23. Appears to not hear what you say | AFL | HYP | 217 (56%) | 0.38 |
24. Can’t work with background noise | AFL | AVS | 48 (12%) | 0.47 |
25. Has trouble completing tasks when the radio is on | AFL | AVS | 85 (22%) | 0.49 |
26. Doesn’t respond when name is called but you know the child’s hearing is OK | AFL | HYP | 189 (49%) | 0.37 |
27. Has difficulty paying attention | AFL | SSD | 251 (65%) | 0.46 |
28. Seems to have weak muscles | LEW | LEW | 88 (23%) | 0.49 |
29. Tires easily, especially when standing or holding particular body position | LEW | LEW | 83 (21%) | 0.59 |
30. Has a weak grasp | LEW | LEW | 87 (22%) | 0.45 |
31. Can’t lift heavy objects | LEW | LEW | 71 (18%) | 0.53 |
32. Props to support self | LEW | LEW | 61 (16%) | 0.54 |
33. Poor endurance/tires easily | LEW | LEW | 89 (23%) | 0.52 |
34. Responds negatively to unexpected or loud noises | VAS | AVS | 108 (28%) | 0.42 |
35. Holds hands over ears to protect ears from sound | VAS | AVS | 116 (30%) | 0.40 |
36. Is bothered by bright lights after others have adapted to the light | VAS | AVS | 41 (11%) | 0.47 |
37. Watches everyone when they move around the room | VAS | AVS | 53 (14%) | 0.35 |
38. Covers eyes or squints to protect eyes from light | VAS | AVS | 65 (17%) | 0.41 |
Note. ASD = autism spectrum disorder; rcorr = Corrected item-total correlation; SSP = Short Sensory Profile; TAC = Tactile sensitivity; TSM = Taste/smell sensitivity; MOV = Movement sensitivity; USS = Underresponsive/seeks sensation; AFL = Auditory filtering; LEW = Low energy/weak; VAS = Visual/auditory sensitivity; TMV = Tactile and movement sensitivity; SSD = Sensory seeking/distractibility; HYP = Hypo-responsivity; AVS = Auditory and visual sensitivity
^a Based on the original factor analysis by McIntosh et al. (1999)
^b Based on the factor analysis in an ASD population by Tomchek et al. (2014)
^c Response of 1 “Always” or 2 “Frequently”
^d Correlations less than 0.4 appear in bold
Reliabilities of the SSP subscales, as estimated by Cronbach’s (1951) coefficient alpha, have been reported to range from α = 0.82–0.89 in the sample of 117 children used to develop the questionnaire. Preliminary evidence of the SSP’s validity comes from a small study wherein children with aberrant electrodermal responses to sensory stimuli were found to have more atypical scores on the SSP than children with typical responses (McIntosh et al., 1999). However, a later study using the same psychophysiological protocol in a combined group of ASD, TD, and SMD children failed to find a relationship between the SSP and physiological measures (Schoen et al., 2009). Likewise, in a study of toddlers with ASD, no relationship was found between SSP scores and skin conductance magnitudes (McCormick et al., 2014). Though studies thus far question the relation between SSP scores and physiologic measures of sensory responsiveness in ASD children, these investigations are limited in their ability to detect weaker linear or nonlinear relationships that may still exist between parent-reported sensory symptoms and psychophysiological indicators.
Statistical Analysis
Descriptive statistics.
Item-level descriptive statistics were calculated, including item means and standard deviations, percent item endorsement (defined as a score of 1 “Always” or 2 “Frequently”), and (corrected) item-total correlations. Subscale-level descriptive statistics were also calculated, including coefficient alpha, the more appropriate reliability coefficient omega total (ωt; McDonald, 1999; McNeish, 2017) with bias-corrected accelerated bootstrap confidence intervals (Kelley & Pornprasertmanit, 2016), internal consistency (as assessed by average inter-item correlation; Davenport et al., 2015), and the range of inter-item correlations within each subscale. Clark & Watson (1995) suggest that for an optimal balance of internal consistency and construct validity, the average inter-item correlation, as well as virtually all inter-item correlations within a scale, should lie in the range of 0.15–0.50. SSP subscales with mean inter-item correlations greater than 0.5 were examined further for redundant item content (Boyle, 1991).
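These subscale-level statistics can be reproduced with standard R routines. The following is a minimal sketch (not our exact analysis script), assuming a data frame `ssp` holding the 38 item scores in columns named item1–item38; the bootstrapped omega interval follows the approach of Kelley and Pornprasertmanit (2016) as implemented in the MBESS package.

```r
## Minimal sketch of the subscale descriptives (illustrative only).
## Assumes a data frame `ssp` with item scores in columns item1...item38.
library(psych)   # alpha(): coefficient alpha and item-total correlations
library(MBESS)   # ci.reliability(): omega total with bootstrap CIs

tac_items <- paste0("item", 1:7)   # Tactile Sensitivity subscale (items 1-7)
tac <- ssp[, tac_items]

a <- psych::alpha(tac)
a$total$raw_alpha                      # Cronbach's alpha
a$total$average_r                      # average inter-item correlation
a$item.stats$r.drop                    # corrected item-subscale correlations
range(cor(tac)[lower.tri(cor(tac))])   # range of inter-item correlations

# Omega total with a bias-corrected and accelerated (BCa) bootstrap interval
MBESS::ci.reliability(data = tac, type = "omega",
                      interval.type = "bca", B = 1000)
```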
Confirmatory factor analysis.
To address the primary aim of the study, the previously proposed 7-factor (McIntosh et al., 1999) and 6-factor (Tomchek et al., 2014) structures of the SSP were tested using CFA models. In these confirmatory models, latent variables (i.e., common factors) are postulated to be responsible for the covariance between manifest variables (i.e., SSP item scores), and each item’s variance is decomposed into common variance (variance shared with other items, presumably caused by underlying latent variables) and error variance (see Babyak & Green, 2010 for an accessible introduction to CFA methodology). The 7- and 6-factor models were constructed such that each item loaded onto only one common factor, corresponding to its subscale assignment in Table 2. Latent variables in these CFA models were allowed to correlate with one another. Correlated error terms were not allowed in the models, as no such relationships were hypothesized a priori (see also Hermida, 2015 for arguments against the practice of correlating error terms more generally).
Confirmatory factor models were estimated in R (R core team, 2017) using the lavaan package (Rosseel, 2012). Due to the ordered-categorical nature of the manifest variables, we utilized a diagonally-weighted least squares (DWLS) estimator with a robust mean and variance corrected test statistic (i.e., the “WLSMV” estimator; Asparouhov & Muthén, 2010; see also Li, 2016). Model fit was evaluated using the chi-square test of exact fit, but given the test’s high likelihood of rejecting models that differ trivially from the population structure (e.g. Bentler, 1990), several additional fit indices were also calculated, including the comparative fit index (CFI; Bentler, 1990), Tucker-Lewis Index (TLI; Tucker & Lewis, 1973), the root mean square error of approximation (RMSEA; Steiger, 1990), and the weighted root mean square residual (WRMR; Yu, 2002). Widely-accepted rules of thumb based on the fit index guidelines of Hu and Bentler (1999) suggest that CFI/TLI values of > 0.95 and RMSEA values of < 0.06 indicate good model fit (though see also Marsh, Hau, & Wen, 2004; McNeish, An, & Hancock, 2018; Tomarken & Waller, 2003). The WRMR is a less well-studied fit index, but recent simulation work has supported the assertion by Yu (2002) that values below 1.0 generally suggest good model fit (DiStefano, Liu, Jiang, & Shi, 2017). It is worth acknowledging that aside from the WRMR, these fit index guidelines were not derived using DWLS estimators, though a small amount of research has suggested similar cutoff values are appropriate for ordinal data (Yu, 2002). Moreover, the Hu and Bentler cutoff criteria are routinely used to evaluate WLSMV-estimated CFA models in the applied literature (e.g. Reeve et al., 2007; Yerys et al., 2017). In addition to the calculation of fit indices, the residual correlation matrix of each model was examined to determine areas of localized model misfit. Off-diagonal elements of the residual correlation matrix with absolute values greater than 0.2 suggest additional unmodeled factors (Reeve et al., 2007) and contribute to the holistic judgment of whether a model is appropriate.
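To make these estimation settings concrete, the sketch below shows how the original 7-factor model can be specified and fit in lavaan. It is an illustration under the same item-naming assumption as above (columns item1–item38), not necessarily the exact script used for the reported analyses.

```r
## Illustrative lavaan specification of the 7-factor SSP model.
library(lavaan)

ssp_7f <- '
  TAC =~ item1 + item2 + item3 + item4 + item5 + item6 + item7
  TSM =~ item8 + item9 + item10 + item11
  MOV =~ item12 + item13 + item14
  USS =~ item15 + item16 + item17 + item18 + item19 + item20 + item21
  AFL =~ item22 + item23 + item24 + item25 + item26 + item27
  LEW =~ item28 + item29 + item30 + item31 + item32 + item33
  VAS =~ item34 + item35 + item36 + item37 + item38
'

# Latent factors are allowed to correlate by default in cfa()
fit_7f <- cfa(ssp_7f,
              data      = ssp,
              ordered   = paste0("item", 1:38),  # treat items as ordinal
              estimator = "WLSMV")               # DWLS with robust corrections

# Robust fit indices and residual correlations used to judge (local) misfit
fitMeasures(fit_7f, c("chisq.scaled", "df", "cfi.scaled", "tli.scaled",
                      "rmsea.scaled", "wrmr"))
residuals(fit_7f, type = "cor")
```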
Exploratory factor analysis.
As we ultimately found both the 7-factor and 6-factor models unfit for the data, we followed our CFA with an exploratory factor analysis (EFA) to better understand the relationships between the SSP items in this sample (Gorsuch, 1997). This analysis was conducted in R (R Core Team, 2017) using the psych package (Revelle, 2017). Because of the ordinal nature of the data, polychoric correlations were computed and used in the EFA (Holgado Tello, Chacón Moscoso, Barbero García, & Vila Abad, 2010). We also sought to improve upon the EFA methodology used in previous samples (McIntosh et al., 1999; Tomchek et al., 2014) by employing more theoretically sound and accurate means of factor extraction, rotation, and determination of the number of factors (Norris & Lecavalier, 2010; Preacher & MacCallum, 2003). One methodological improvement is factor extraction by principal axis factoring (PAF) rather than PCA, as the latter method assumes no measurement error and tends to estimate parameters less accurately than other extraction methods under a number of data conditions (see also Borsboom, 2006; Fabrigar, Wegener, MacCallum, & Strahan, 1999; Floyd & Widaman, 1995 for a more extensive discussion of PCA as an extraction method).
Previous studies have also used the varimax criterion for factor rotation (Kaiser, 1959), which assumes factors to be orthogonal and thus completely uncorrelated. Preacher and MacCallum (2003) argue that such orthogonality assumptions are difficult to justify, as an obliquely rotated solution will resemble an orthogonal one if the optimal simple structure solution truly is represented by uncorrelated factors. In fact, it is quite possible that the uninterpretable 7- and 8-factor solutions encountered by Tomchek et al. (2014) would have been more intelligible had the solutions been rotated obliquely. Thus, for our investigation, we chose to use the oblique Geomin rotation (Yates, 1987; Browne, 2001) because of its minimization of item complexity and favorable performance under conditions of simple structure (Sass & Schmitt, 2010).
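As an illustration, a principal axis factoring of the polychoric correlations with oblique Geomin rotation can be requested through the psych package roughly as follows. This is a sketch under the same `ssp` assumption as the earlier examples, with the number of factors set to the nine ultimately retained (see Results); the geominQ rotation requires the GPArotation package.

```r
## Illustrative EFA sketch: PAF extraction with oblique Geomin rotation.
library(psych)        # fa()
library(GPArotation)  # provides the Geomin rotations used by fa()

efa9 <- fa(ssp,
           nfactors = 9,
           fm       = "pa",       # principal axis factoring
           rotate   = "geominQ",  # oblique Geomin rotation
           cor      = "poly")     # factor the polychoric correlation matrix

print(efa9$loadings, cutoff = 0.40)  # pattern matrix, salient loadings only
efa9$communality                     # item communalities (h2)
efa9$Phi                             # factor intercorrelations
```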
In order to determine the number of factors to extract, several methods were employed including the minimum average partial method (MAP; Velicer, 1976) and Horn’s (1965) parallel analysis, both of which are considered among the most accurate methods available (Velicer, Eaton, & Fava, 2000). These methods have been modified for application to ordinal data and remain effective in recovering the correct factor structure of a polychoric correlation matrix (Garrido, Abad, & Ponsoda, 2011, 2013; Yang & Xia, 2015). The modified parallel analysis procedure we used compared the eigenvalues of the reduced SSP item polychoric correlation matrix (based on PAF) to the median eigenvalues of 200 similarly-obtained correlation matrices based on simulated data derived from uncorrelated variables with equal item thresholds to the SSP data. The number of factors was also calculated based on the Empirical Kaiser Criterion (EKC; Braeken & van Assen, 2017), a newer eigenvalue-based methodology that has shown promising results when compared to parallel analysis. As these methods often disagree on the number of factors to extract, all suggested factor solutions were explored, and the optimal solution was chosen based on factor interpretability and examination of the residual correlation matrix. These procedures were all conducted in R, with the MAP and parallel analysis procedures implemented in the psych package (nfactors and fa.parallel functions; Revelle, 2017) and the EKC procedure implemented using a custom script written by the first author.
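Parallel analysis and the MAP test are available directly in the psych package, whereas the EKC was computed with a custom routine. The sketch below illustrates our reading of these procedures under the same assumptions as the earlier examples; the `ekc()` helper is a hypothetical reimplementation of the Braeken and van Assen (2017) reference eigenvalues, not the exact script used in the study.

```r
## Illustrative sketch of the factor-retention procedures.
library(psych)

# Parallel analysis and the Velicer MAP test on polychoric correlations
R_poly <- polychoric(ssp)$rho
fa.parallel(ssp, fm = "pa", fa = "fa", cor = "poly", n.iter = 200)
vss(R_poly, n = 10, fm = "pa", n.obs = nrow(ssp))$map   # MAP values by factors

# Empirical Kaiser Criterion (our reading of Braeken & van Assen, 2017)
ekc <- function(R, n) {
  p    <- ncol(R)
  evs  <- eigen(R, symmetric = TRUE, only.values = TRUE)$values
  refs <- numeric(p)
  for (j in seq_len(p)) {
    # remaining variance spread over the remaining dimensions, scaled by the
    # asymptotic bound (1 + sqrt(p/n))^2 and floored at 1 (the Kaiser rule)
    refs[j] <- max(((p - sum(evs[seq_len(j - 1)])) / (p - j + 1)) *
                     (1 + sqrt(p / n))^2, 1)
  }
  # retained factors = length of the initial run of eigenvalues > reference
  below <- which(evs <= refs)
  list(n_factors   = if (length(below)) below[1] - 1 else p,
       eigenvalues = evs,
       reference   = refs)
}
ekc(R_poly, n = nrow(ssp))$n_factors
```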
Calculation of reliability and general factor saturation.
In addition to determining the factor structure of the SSP, we also sought to assess the utility of the SSP total score as a measure of overall sensory responsiveness. Thus, as a secondary analysis, we used the best-fitting SSP model to calculate model-based reliability coefficients that also assessed the general factor saturation (i.e., the degree to which variance in the SSP total score reflects one unified construct, ostensibly “sensory atypicality”). Tomchek and colleagues (2014) reported the alpha coefficient of the total score in their ASD sample (α = 0.89), but this estimate of scale reliability assumes that the scale in question is unidimensional (Gignac, 2014) and is not appropriate to determine the general factor saturation of a multidimensional composite score (Brunner, Nagy, & Wilhelm, 2012; Cho, 2016; Gignac, 2014; Revelle & Zinbarg, 2009; Rodriguez, Reise, & Haviland, 2016a, 2016b; Zinbarg et al., 2005). Thus, in order to more accurately assess the general factor saturation of the SSP, we estimated an exploratory bifactor model (Reise, 2012) from the best-fitting EFA solution using the Schmid-Leiman orthogonalization (Schmid & Leiman, 1957). This procedure transforms the oblique EFA solution with m correlated factors into an orthogonal m + 1 factor solution represented by one “general factor” common to all items and m orthogonal “group” factors with salient loadings on only a subset of the items (Reise, 2012).
From this bifactor model, a number of useful psychometric indices can be calculated (Rodriguez et al., 2016a, 2016b), including: (a) coefficient omega total (ωt; McDonald, 1999; Revelle & Zinbarg, 2009; Zinbarg, Revelle, Yovel, & Li, 2005), a measure of composite scale reliability, (b) coefficient omega hierarchical (ωH; McDonald, 1999; Revelle & Zinbarg, 2009; Zinbarg et al., 2005), an estimate of the proportion of total score variance that can be attributed to the general factor, and (c) the explained common variance (ECV; Reise, Scheines, & Widaman, 2013; Sijtsma, 2009), the proportion of common variance attributable to the general factor, which indexes the degree to which multidimensional data can be considered essentially unidimensional. The degree of general factor saturation (i.e., ωH) can be used as an index of total score interpretability when compared to bifactor ωt, the proportion of variance explained by all general and group factors (Rodriguez et al., 2016a). The difference between ωH and ωt reflects the proportion of variance in total scores due to variance in latent factors other than the general factor, indicating the level of non-random error (i.e., bias) inherent in considering total scale scores as indicators of the latent general factor. Such indices are readily calculated in R using the omega routine in the psych package (Revelle, 2017).
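The sketch below shows how these bifactor indices can be obtained from the Schmid-Leiman output of the psych omega routine, again assuming the hypothetical `ssp` data frame; the element names reflect our understanding of recent psych versions, and the options shown may differ from those of our final analysis.

```r
## Illustrative sketch: bifactor indices from a Schmid-Leiman solution.
library(psych)

om <- omega(ssp,
            nfactors = 9,     # group factors from the retained EFA solution
            fm       = "pa",  # principal axis extraction
            poly     = TRUE,  # factor the polychoric correlations
            sl       = TRUE)  # Schmid-Leiman orthogonalization

om$omega_h     # omega hierarchical: general factor saturation
om$omega.tot   # omega total: reliability of the unit-weighted total score

# Explained common variance (ECV) computed from the Schmid-Leiman loadings
sl  <- om$schmid$sl
g   <- sl[, "g"]                       # general factor loadings
grp <- sl[, grep("^F", colnames(sl))]  # group factor loadings
sum(g^2) / (sum(g^2) + sum(grp^2))     # share of common variance on g
```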
Although guidelines on the values of ωH and ωt necessary for total score interpretability have not been firmly established, Rodriguez and colleagues (2016a) calculated these indices for a number of published studies and reported a mean ωt of 0.94 and a mean ωH of 0.80 (mean difference = 0.14), which they deemed satisfactory to interpret total scale scores as unbiased indicators of general factors. However, some specific cases from their investigation (ωt = 0.90 and ωH = 0.52; ωt = 0.89 and ωH = 0.58; ωt = 0.88, ωH = 0.57) were highlighted as examples of total scale scores that were significantly biased by the presence of multidimensionality. Using these examples to guide our interpretations, we sought to use the omega coefficients from our bifactor model to assess the bias inherent in SSP total scores.
Results
Descriptive Statistics
SSP item endorsement frequencies and corrected item-total correlations are presented in Table 2. The percentage of the sample endorsing an item as “Always” or “Frequently” occurring varied widely, with endorsement values ranging from 7% (“Dislikes activities where head is upside down”) to 65% (“Has difficulty paying attention”). Of the 38 items, all but four were endorsed in excess of 10%. Corrected item-total and item-subscale correlations were deemed satisfactory for most items based on the common rule-of-thumb of r > 0.4 (Ware & Gandek, 1998). However, the item-total correlations of several items, primarily those in the USS scale, suggest that they may be only weakly related to the general factor underlying the SSP total score.
Based on the SSP total score, 244 (62.9%) of the children were classified as having a “Definite Difference,” and an additional 72 (18.6%) were characterized as having a “Probable Difference” in overall sensory processing. Classifications based on subscale scores were more variable, with the percentage of the sample labeled as having a “Definite Difference” ranging from 16.0% (MOV) to 63.4% (USS). Although the percentage of children with elevated SSP scores was large in this sample, it was lower than that reported in a large sample of younger children with ASD (83.6% definite, 11.4% probable; Tomchek & Dunn, 2007). However, the rate of “Definite Difference” classification was similar to that reported in a sample of 9–14-year-old children with ASD (Green et al., 2016).
Subscale alpha and omega total values (Table 3) were all satisfactory (> 0.7) based on commonly cited recommendations for internal consistency reliability (Nunnally & Bernstein, 1994). Average inter-item correlations for most scales conformed to the guidelines of Clark & Watson (1995), though the large mean inter-item correlations of TSM (mean r = 0.77) and LEW (mean r = 0.67) indicated that these subscales may be composed of exceptionally homogeneous items. It is worth clarifying that these scale-level reliability and internal consistency coefficients assume that the original dimensionality of the SSP is correct and are therefore not interpretable if an alternate factor model is deemed more appropriate for the data (Cortina, 1993; Crutzen & Peters, 2017; Davenport, Davison, & Liou, 2015). Moreover, the single-factor CFA models utilized to derive the values of omega total did not display goodness of fit for most subscales (Supplementary Table S1), thereby violating the unidimensionality assumptions of both alpha and omega coefficients (McNeish, 2017). The Pearson correlations between unit-weighted subscale scores in our sample (mean r = 0.32, range 0.14–0.50) were smaller than those reported by McIntosh and colleagues.
Table 3.
Scale | M (SD) | Min–Max | α (95% CI) | ωt (95% CI) | Average IIC | IIC Range |
---|---|---|---|---|---|---|
TAC | 27.73 (5.18) | 9–35 | 0.74 (0.70–0.74) | 0.76 (0.71–0.80) | 0.30 | 0.21–0.51 |
TSM | 12.65 (5.48) | 4–20 | 0.93 (0.92–0.94) | 0.94 (0.92–0.95) | 0.77 | 0.70–0.80 |
MOV | 12.92 (2.56) | 3–15 | 0.74 (0.70–0.77) | 0.76 (0.70–0.81) | 0.49 | 0.37–0.62 |
USS | 21.93 (5.44) | 7–35 | 0.74 (0.70–0.77) | 0.76 (0.71–0.80) | 0.29 | 0.16–0.52 |
AFL | 17.83 (4.49) | 6–30 | 0.78 (0.75–0.81) | 0.95 (0.89–1.0) | 0.38 | 0.19–0.76 |
LEW | 22.81 (6.81) | 6–30 | 0.93 (0.91–0.94) | 0.97 (0.95–0.99) | 0.67 | 0.55–0.83 |
VAS | 18.04 (3.96) | 6–25 | 0.72 (0.68–0.76) | 0.91 (0.84–0.98) | 0.34 | 0.17–0.71 |
Note. α = Cronbach’s coefficient alpha; ωt = Coefficient omega total; IIC = Inter-item correlation; TAC = Tactile sensitivity; TSM = Taste/smell sensitivity; MOV = Movement sensitivity; USS = Underresponsive/seeks sensation; AFL = Auditory filtering; LEW = Low energy/weak; VAS = Visual/auditory sensitivity
Confirmatory Factor Analysis
Model fit for both the 7-factor and 6-factor SSP models was inadequate based on conventional fit criteria (Table 4). For both models, the chi-square test was highly significant (p < 0.001), rejecting the null hypothesis of exact model fit. All other fit indices failed to reach the a priori cutoff values (i.e., CFI/TLI > 0.95, RMSEA < 0.06, WRMR < 1.0), further suggesting that neither factor model adequately fit the data in our sample. Fit indices were approximately equal between the 7-factor and 6-factor models, and given the poor fit of both, no further testing was done to evaluate the superiority of one factor model over the other. Furthermore, examination of residual correlation matrices revealed 19 and 18 residual correlations with absolute values greater than 0.2 in the 7- and 6-factor models, respectively, indicating many inter-item correlations that were not adequately explained by the proposed factor models. The pattern of residuals indicated numerous areas of local misfit in both models, with items 23, 26, 36, and 38 fitting particularly poorly into their assigned subscales across factor models (Supplementary Tables S3–S4). Model fit indices were similarly poor when the Vanderbilt and NDAR datasets were analyzed separately, and they did not change appreciably when the analyses were restricted to children in the 3–10 age range recommended by the SSP authors (Supplementary Table S2). Thus, all subsequent analyses focused exclusively on the combined dataset.
Table 4.
SSP Model | χ2 | df | CFI | TLI | RMSEA (90% CI) | WRMR |
---|---|---|---|---|---|---|
7F | 1900.07** | 644 | 0.922 | 0.915 | 0.071 (0.067–0.075) | 1.603 |
6F | 1868.75** | 650 | 0.924 | 0.918 | 0.070 (0.066–0.073) | 1.623 |
Note. SSP = Short Sensory Profile; 7F = Original 7-factor model from McIntosh et al. (1999); 6F = 6-factor model from Tomchek et al. (2014); df = degrees of freedom; CFI = comparative fit index; TLI = Tucker-Lewis Index; RMSEA = root mean square error of approximation; WRMR = weighted root mean square residual; CI = confidence interval
**p < 0.001
Exploratory Factor Analysis
Because the confirmatory factor models evidenced poor fit, we followed up by fitting exploratory factor models to the item-level SSP data. Parallel analysis with PAF estimation, the Velicer MAP, and the EKC were all used to estimate the number of factors to extract, arriving at slightly different conclusions. Parallel analysis indicated an 8-factor solution (first 9 eigenvalues of the reduced polychoric correlation matrix: 9.52, 3.37, 2.09, 1.45, 1.07, 0.78, 0.64, 0.48, 0.29; first 9 simulated median eigenvalues: 0.70, 0.59, 0.53, 0.48, 0.43, 0.39, 0.35, 0.32, 0.28), as did the Velicer MAP (average partial correlation = 0.02 at 8 factors). The EKC, however, indicated that a 9-factor solution was most appropriate (first 10 eigenvalues of the polychoric correlation matrix: 10.38, 4.11, 2.94, 2.18, 1.81, 1.56, 1.41, 1.32, 1.07, 0.98; first 10 EKC eigenvalues: 1.72, 1.29, 1.13, 1.01, 1.00, 1.00, 1.00, 1.00, 1.00, 1.00). Thus, both 8- and 9-factor EFA models were estimated and compared based on the residual correlation matrices and the interpretability of the rotated solutions.
The 8-factor model explained 60% of the total variance in SSP item scores (Supplementary Table S5). Item communalities were generally high (mean h2 = 0.60, range = 0.25–0.88), but eight of the items (1, 2, 3, 15, 18, 19, 21, 37) were found to have low communalities (h2 < 0.4), indicating either poor relation to the other items or the necessity for additional factors (Costello & Osborne, 2005). Factor determinacy (FD) statistics for all factors (range 0.91–0.98) were adequate to allow the interpretation of factor scores (ρ > 0.90; Grice, 2001; Rodriguez et al., 2016a, 2016b). Based on item loadings greater than 0.4, the eight rotated factors were interpreted (in order of variance explained) as: (1) Low energy/weakness, (2) Taste/smell sensitivity, (3) Hyperactivity/inattention, (4) Auditory sensitivity (encompassing both distraction by ambient noise and distress at loud noises), (5) Tactile sensitivity, (6) Movement Sensitivity, (7) Visual sensitivity, with significant cross-loadings from auditory items 34 and 35, (8) Hyporesponsiveness to speech, containing only items 23 and 26. Factor scores were modestly intercorrelated (mean r = 0.22, range −0.002–0.42). Of these factors, only Low energy/weak, Taste/smell sensitivity, and Movement sensitivity contained identical items to their original subscales. Though we chose to retain the name of the Taste/smell sensitivity factor, our examination of the item content leads us to believe that the items more accurately represent food selectivity. Indeed, all four items mention food or eating, and two of the items specifically query about the child’s refusal of food due to non-chemosensory qualities (i.e., texture and temperature). Notably, seven of the 38 items (1, 2, 5, 15, 19, 21, 37) failed to load onto any factor with a magnitude greater than 0.4, and one item (5: “Withdraws from splashing water”) had no loadings greater than 0.30. Examination of the residual correlation matrix revealed a single value greater than 0.2 (items 34 and 35, rresid = 0.23), indicating that the two items loading onto both the auditory and visual sensitivity factors were poorly modeled by this solution.
The 9-factor model was structurally very similar to the 8-factor model, with the principal difference being that items 34 (“Responds negatively to unexpected or loud noises”) and 35 (“Holds hands over ears to protect ears from sound”) loaded onto their own factor rather than cross-loading on the Auditory sensitivity and Visual sensitivity factors (Table 5). The nine factors explained 62% of the variance in SSP item scores, and item communalities were approximately the same (mean h2 = 0.62, range = 0.29–0.91). The communalities of items 1, 2, 3, 15, 18, 19, 21, and 37 did not improve with the extraction of an additional factor, indicating those items are poorly related to the rest of the scale. Likewise, all seven items with primary loadings < 0.4 remained as such in the 9-factor model. FD statistics, including that of the ninth factor (ρ = 0.91) remained sufficient for interpretation (range: 0.91–0.98). The factors were thus interpreted as: (1) Low energy/weakness, (2) Taste/smell sensitivity, (3) Hyperactivity/inattention, (4) Tactile sensitivity, (5) Movement sensitivity, (6) Auditory distractibility, (7) Hyporesponsiveness to speech, (8) Visual sensitivity, and (9) Noise distress.
Table 5.
Item | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | h2 |
---|---|---|---|---|---|---|---|---|---|---|
TAC | ||||||||||
1 | −0.01 | 0.23 | 0.02 | 0.34 | 0.12 | −0.10 | 0.18 | 0.07 | 0.07 | 0.35 |
2 | 0.12 | −0.03 | −0.06 | 0.32 | 0.17 | 0.17 | 0.16 | −0.02 | 0.00 | 0.33 |
3 | 0.10 | 0.08 | −0.18 | 0.47 | 0.07 | 0.11 | −0.02 | 0.03 | −0.06 | 0.36 |
4 | 0.05 | 0.06 | 0.03 | 0.73 | 0.02 | −0.04 | 0.04 | −0.03 | 0.01 | 0.58 |
5 | 0.17 | −0.03 | −0.06 | 0.25 | 0.24 | 0.11 | −0.02 | 0.25 | −0.04 | 0.45 |
6 | −0.02 | 0.10 | 0.20 | 0.65 | 0.08 | −0.01 | 0.03 | −0.04 | 0.01 | 0.58 |
7 | 0.04 | −0.08 | 0.25 | 0.57 | −0.03 | 0.09 | −0.13 | 0.06 | 0.06 | 0.53 |
TSM ||||||||||
8 | 0.08 | 0.87 | 0.00 | −0.02 | −0.05 | 0.09 | 0.04 | 0.02 | −0.03 | 0.80 |
9 | −0.05 | 0.91 | 0.04 | 0.01 | 0.06 | −0.04 | 0.01 | −0.04 | 0.02 | 0.85 |
10 | 0.04 | 0.92 | 0.02 | −0.01 | 0.02 | −0.02 | −0.02 | −0.02 | 0.05 | 0.87 |
11 | −0.02 | 0.93 | −0.06 | 0.05 | −0.04 | 0.07 | −0.02 | 0.04 | 0.00 | 0.88 |
MOV | ||||||||||
12 | −0.07 | 0.02 | 0.09 | 0.06 | 0.93 | 0.01 | 0.00 | 0.05 | −0.05 | 0.92 |
13 | 0.19 | 0.01 | −0.08 | −0.05 | 0.51 | 0.10 | −0.04 | 0.17 | 0.00 | 0.48 |
14 | 0.02 | −0.02 | −0.01 | 0.06 | 0.72 | −0.01 | −0.06 | 0.00 | 0.07 | 0.57 |
USS | ||||||||||
15 | 0.01 | −0.05 | 0.36 | 0.24 | −0.05 | −0.05 | 0.22 | 0.13 | 0.00 | 0.38 |
16 | 0.02 | 0.00 | 0.60 | 0.23 | 0.01 | 0.16 | 0.06 | −0.01 | −0.05 | 0.60 |
17 | −0.07 | 0.04 | 0.66 | 0.30 | −0.06 | 0.03 | −0.12 | 0.08 | 0.02 | 0.59 |
18 | 0.00 | 0.04 | 0.43 | −0.01 | −0.02 | −0.05 | 0.14 | 0.26 | −0.02 | 0.35 |
19 | 0.21 | 0.00 | 0.31 | −0.19 | 0.27 | −0.02 | 0.17 | −0.01 | 0.01 | 0.29 |
20 | −0.14 | −0.04 | 0.70 | 0.01 | 0.02 | 0.07 | −0.01 | −0.03 | −0.02 | 0.49 |
21 | 0.19 | 0.02 | 0.36 | −0.02 | 0.26 | −0.14 | 0.04 | −0.11 | 0.18 | 0.30 |
AFL | ||||||||||
22 | −0.01 | 0.00 | 0.13 | 0.02 | −0.03 | 0.82 | 0.02 | 0.03 | 0.06 | 0.80 |
23 | 0.03 | −0.01 | 0.05 | −0.01 | −0.03 | 0.13 | 0.84 | 0.02 | 0.06 | 0.82 |
24 | −0.06 | 0.04 | −0.01 | 0.08 | 0.17 | 0.61 | 0.05 | −0.01 | 0.11 | 0.57 |
25 | 0.06 | 0.07 | 0.05 | 0.03 | 0.01 | 0.71 | −0.02 | −0.01 | 0.05 | 0.62 |
26 | −0.07 | 0.07 | 0.01 | 0.11 | 0.00 | 0.06 | 0.84 | 0.03 | −0.01 | 0.81 |
27 | 0.08 | 0.02 | 0.43 | −0.14 | 0.10 | 0.35 | 0.22 | −0.01 | −0.01 | 0.52 |
LEW | ||||||||||
28 | 0.95 | 0.04 | 0.06 | 0.00 | −0.04 | 0.01 | −0.01 | 0.01 | −0.14 | 0.88 |
29 | 0.79 | −0.04 | −0.04 | 0.15 | 0.05 | 0.04 | 0.01 | −0.02 | 0.12 | 0.82 |
30 | 0.84 | 0.07 | 0.10 | −0.01 | 0.00 | −0.12 | −0.02 | −0.02 | −0.02 | 0.69 |
31 | 0.88 | 0.09 | −0.01 | −0.01 | 0.03 | 0.03 | −0.03 | 0.01 | −0.04 | 0.82 |
32 | 0.76 | −0.06 | 0.03 | 0.16 | −0.01 | −0.05 | 0.11 | 0.04 | 0.05 | 0.69 |
33 | 0.84 | −0.04 | −0.08 | 0.02 | 0.00 | 0.10 | −0.02 | 0.05 | 0.06 | 0.81 |
VAS | ||||||||||
34 | 0.01 | 0.05 | 0.00 | 0.09 | −0.01 | 0.02 | −0.01 | 0.05 | 0.73 | 0.64 |
35 | 0.00 | 0.01 | 0.01 | −0.04 | 0.00 | 0.08 | 0.00 | 0.02 | 0.82 | 0.74 |
36 | 0.00 | −0.04 | 0.04 | 0.08 | 0.01 | 0.05 | 0.07 | 0.73 | 0.10 | 0.73 |
37 | 0.05 | 0.01 | 0.33 | 0.10 | 0.10 | 0.06 | −0.32 | 0.23 | 0.04 | 0.32 |
38 | 0.01 | 0.06 | 0.00 | −0.04 | 0.03 | −0.08 | 0.00 | 0.95 | 0.01 | 0.87 |
Note. Factor loadings greater than 0.4 and communalities (h2) less than 0.4 are presented in bold.
TAC = Tactile sensitivity; TSM = Taste/smell sensitivity; MOV = Movement sensitivity; USS = Underresponsive/seeks sensation; AFL = Auditory filtering; LEW = Low energy/weak; VAS = Visual/auditory sensitivity; F1–9 = Factors 1–9
Notably, item 27 (“Has difficulty paying attention”), exhibited a cross-loading of λ = 0.35 on the “Auditory distractibility” factor. In the residual correlation matrix, all off-diagonal values had an absolute value of less than 0.2. Because of the favorable residual values and interpretable factor structure, the 9-factor solution was chosen as the best representation of the latent constructs underlying SSP item responses in this sample.
An exploratory bifactor model was then estimated by subjecting the Geomin-rotated 9-factor model to a Schmid-Leiman orthogonalization (Supplementary Table S6). As this model is simply a transformation of the correlated-factors model, item communalities and total percentage of variance explained remained unchanged. Loadings onto the general factor were moderate on average (mean λ = 0.43, range = 0.19–0.58), though five items exhibited general factor loadings smaller than 0.3. Notably, the FD statistic for the general factor (ρ = 0.86) was below the recommended value for interpretation of the corresponding factor score. Though this finding does provide some evidence against the interpretation of unit-weighted SSP total scores, no specific level of FD is required for the valid interpretation of model-based statistics such as ωH, ωt, and ECV. Omega coefficients indicated high composite score reliability (ωt = 0.96) with significantly lower general factor saturation (ωH = 0.68). Furthermore, the large difference between omega coefficients (ωt - ωH = 0.28) was similar to the values presented by Rodriguez and colleagues (2016a) as indicative of a total score biased by multidimensionality. The ECV was calculated to be 0.31, indicating that the questionnaire would be poorly approximated with a unidimensional model, as over two thirds of common variance is due to factors other than the general factor. To confirm that the omega statistics and ECV were not biased by the somewhat arbitrary choice of the 9-factor structure over the 8-factor structure, we calculated them again using a Schmid-Leiman transformation of the 8-factor solution and obtained very similar results (ωH = 0.67, ωt = 0.96, ECV = 0.33).
Discussion
The present investigation represents a critical step forward in improving the measurement of sensory features associated with ASD. The SSP is one of the most widely used sensory phenotyping measures in autism research, and the scores derived from the questionnaire have been extensively used to explore the clinical and psychobiological correlates of sensory features (e.g., Glod et al., 2015), as well as the very nature of sensory abnormalities in ASD (Lane et al., 2010, 2011, 2014b; Tomchek & Dunn, 2007; Tomchek et al., 2014; Uljarević et al., 2016). Despite the frequent use of the measure in children with ASD, no study to date has employed CFA to validate the factor structure of the SSP in this population. Employing a wide variety of psychometric methods, we sought to fill this gap in the literature by: (a) providing item- and subscale-level descriptive statistics specific to this population, (b) testing the factorial validity of prior PCA-derived SSP factor models (7 original subscales: McIntosh et al., 1999; 6-factor alternative model from ASD sample: Tomchek et al., 2014), (c) deriving a more accurate factor model for the SSP items, and (d) determining the validity and utility of the SSP total score as an overall measure of sensory processing abnormality in ASD. Overall, we found the SSP item data from our sample to be a poor fit to either of the previously proposed factor models, and our results strongly caution against the interpretation of SSP subscale and total scores in children with ASD. These findings also highlight several important issues in psychological measurement for the broader autism research community, providing insight into the ways that newer measures of clinical phenomena may be evaluated and improved.
Using robust estimation techniques appropriate for ordinal data, we conducted a confirmatory factor analysis of the SSP item data in a large sample of children with ASD. According to standard fit indices, both the original 7-factor model and the 6-factor model of Tomchek and colleagues (2014) failed to adequately explain the data in this sample. This misfit was confirmed by examination of the residual correlation matrices, which revealed a number of unacceptably large residuals (|r| > 0.2) in both models. Follow-up exploratory factor models suggested that the internal structure of the SSP is best represented by nine correlated factors in this population, only three of which (TSM, MOV, LEW) correspond to original subscales of the SSP. These findings indicate that for all but the aforementioned three subscales, unit-weighted scores from the SSP do not measure the same latent constructs as in TD children and are thus unfit for group-level comparison or individual-score interpretation in children with ASD. Furthermore, the lack of structural validity for most SSP subscales complicates the interpretation of proposed sensory subtypes that have been created on the basis of these subscale scores (e.g. Lane et al., 2010, 2011, 2014b; Hand et al., 2017; Tomchek et al., 2018). Of particular interest, the two subscales thought to explain the majority of variance in sensory subtype groupings (TSM and LEW; Hand et al., 2017; Lane et al., 2014b) were both extremely homogeneous in content and well-replicated across factor-analytic studies (Engel-Yeger, 2010; McIntosh et al., 1999; Tomchek et al., 2014). It is thus possible that the dominant role played by these two scales in sensory subtype classification is more a statistical artifact than a true phenomenon. Were the sensory measure used for subtyping to contain more reliable scales assessing hyperresponsiveness in the visual and auditory domains, a very different set of subtypes may have emerged. Further research is certainly required to determine whether proposed sensory subtypes are robust to methodological factors and replicable across measurement tools.
As a secondary aim of our investigation, we employed a bifactor model to calculate several newly-established psychometric indices assessing the unidimensionality, composite reliability, and general factor saturation of the SSP. The ECV indicated that the SSP factor structure was not well approximated by a single general factor and that non-general group factors accounted for the majority of common variance. The composite reliability of the SSP total as estimated by ωt was 0.96, indicating that 96% of the variance in unit-weighted total scores can be explained by the general and group factors of the model. However, the value of coefficient ωH was 0.68, indicating that only 68% of unit-weighted total score variance could be accounted for by the general factor, with the remaining 28% of true-score variance due to the group factors.
It is clear from the omega coefficients that the SSP total score is reliable and does measure a general sensory factor to some degree in children with ASD, but several findings complicate the use of the SSP total as a measure of this general factor. Most troublingly, nearly a third of total score variance is accounted for by the group factors (i.e., factors other than the general factor), causing the total score to be substantially biased by the multidimensionality of its constituent parts. As an example, a hypothetical test could exhibit a Pearson correlation of 0.5 with the SSP total and still share no variance with the SSP general factor. Moreover, when fitting the bifactor model, a large number of SSP items, particularly those from the TSM and USS scales, loaded poorly onto the modeled general factor. Not only did these loadings result in a poor FD statistic for the general factor, but they also led us to question the degree to which several of the SSP factors (namely food selectivity, hyperactivity/inattention, and hyporesponsiveness to speech) are truly related to the broader construct of atypical sensory reactivity in ASD. Due to the way that the SSP was constructed, the items represent a heterogeneous combination of sub-constructs (Miller, Anzalone, Lane, Cermak, & Osten, 2007; Schaaf & Lane, 2015) that are unequally distributed across both sensory modalities and subtypes of sensory processing dysfunction (e.g. the SSP taps constructs of visual hyperresponsiveness, auditory hyperresponsiveness, and auditory hyporesponsiveness, but not visual hyporesponsiveness). The SSP is thus likely too limited in content to comprehensively assess multiple sensory response patterns across modalities, while also being too broad in scope to reflect a single facet of sensory processing such as hyperresponsiveness, as has been noted by several authors before us (Ausderau et al., 2014b; Green et al., 2016; Schoen, Miller, & Green, 2008).
Despite lacking factorial validity in this sample, the SSP subscales displayed reliability and internal consistency values that exceeded conventional cutoffs. This finding exemplifies the inadequacy of coefficient alpha and other internal consistency metrics for establishing the dimensionality of a measure in the absence of factor-analytic studies (Crutzen & Peters, 2017). Because this study failed to support the validity of the SSP factor structure, these coefficients are interpretable only for those factors that were preserved in the best-fitting EFA model (i.e., TSM, MOV, and LEW). Of these three subscales, the TSM and LEW subscales have been reproduced in all factor-analytic studies of the SSP (Engel-Yeger, 2010; McIntosh et al., 1999; Tomchek et al., 2014; see also Hand et al., 2017), leading us to conclude that these two subscales likely do measure the same latent constructs in ASD and TD samples (though strict measurement invariance has not yet been established). Given the large mean inter-item correlations of these two subscales (TSM: mean r = 0.77; LEW: mean r = 0.67), they likely represent overly narrow constructs that sacrifice construct validity by not fully encompassing the domain of interest (Boyle, 1991; Clark & Watson, 1995). For instance, despite the LEW scale ostensibly measuring the construct termed “Sensory-Based Motor Disorder” in the taxonomy by Miller et al. (2007), scores on the LEW subscale were unrelated to both gross and fine motor skills in a large sample of preschool children with ASD (Tomchek et al., 2015). Thus, in the absence of convergent validity, it is unclear whether the score of a scale made of highly redundant items is an accurate representation of the construct of interest.
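The point is easy to demonstrate by simulation; the sketch below (not drawn from our data) generates twelve items that load on two uncorrelated factors, yet the item set still yields a coefficient alpha above the conventional 0.70 threshold.

```r
library(psych)
set.seed(1)

n  <- 1000
f1 <- rnorm(n)   # latent trait A
f2 <- rnorm(n)   # latent trait B, uncorrelated with A

# Six items per trait, each with a standardized loading of 0.7
make_items <- function(f) sapply(1:6, function(i) 0.7 * f + rnorm(n, sd = sqrt(1 - 0.7^2)))
items <- cbind(make_items(f1), make_items(f2))

# Alpha is high (around 0.78) despite the items measuring two unrelated constructs
alpha(items)$total$raw_alpha

# Factor-analytic checks recover the true two-dimensional structure
fa.parallel(items, fa = "fa")   # suggests two factors
omega(items, nfactors = 2)      # omega_h falls far below omega_t
```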
Although the high mean inter-item correlation of the TSM calls into question the scale’s adequacy for measuring the broad construct of chemosensory hyperresponsiveness, our examination of the item content leads us to believe that the scale actually measures the much narrower construct of food selectivity. Food selectivity is prevalent in children with ASD and may be influenced by sensory sensitivity, but such behaviors are complex in nature and likely also influenced by a host of other non-sensory factors (Cermak, Curtin, & Bandini, 2010, 2014; Kuschner et al., 2015; Luisier et al., 2015; see also Stafford, Tsang, López, Severini, & Iacomini, 2017). Due to its limited scope and overly homogeneous item content, the TSM scale of the SSP is likely to be a poor measure of chemosensory hyperresponsiveness in general. Nevertheless, we believe that this 4-item subscale holds promise as a brief dimensional measure of food selectivity in children with ASD. The validity of the TSM score for this indication is supported by large correlations between TSM scores and other measures of food selectivity across samples of children with ASD (Chistol et al., 2018; Lane, Geraghty, & Young, 2014a; Nadon, Feldman, Dunn, & Gisel, 2011; Smith, 2016), children with other developmental disabilities (Engel-Yeger, Hardal-Nasser, & Gal, 2016), and TD children (Coulthard & Blissett, 2009; Nederkoorn, Jansen, & Havermans, 2015).
Beyond replicating the TSM, MOV, and LEW factors, the EFA allowed us to explore the range of constructs underlying the SSP. Although a fair number of items failed to load strongly onto any factor, the remaining items could be grouped into factors representing Hyperactivity/inattention, Tactile sensitivity, Auditory distractibility, Hyporesponsiveness to speech, Visual sensitivity, and Noise distress. Three of the factors (Hyporesponsiveness to speech, Visual sensitivity, and Noise distress) were represented by only two items apiece, producing subscales that were too unreliable to adequately represent the underlying constructs. Nevertheless, this factor model allows us to better understand the specific constructs being measured by the SSP in this population, which may inform future attempts to develop valid measures of sensory features in ASD.
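For illustration, an exploratory solution of this kind can be obtained in R with the psych package (Revelle, 2017). The data frame name (ssp), the use of polychoric correlations, and the minres/oblimin settings shown below are common choices offered as a sketch and may not match the exact specifications described in our Methods.

```r
library(psych)

# Polychoric correlations among the ordinal SSP items (ssp is a placeholder data frame)
pc <- polychoric(ssp)$rho

# Factor retention guidance: parallel analysis on the polychoric matrix
fa.parallel(pc, n.obs = nrow(ssp), fa = "fa")

# Nine-factor EFA with an oblique rotation, as in the solution described above
efa_9 <- fa(pc, nfactors = 9, n.obs = nrow(ssp),
            fm = "minres", rotate = "oblimin")
print(efa_9$loadings, cutoff = 0.40)

# Correlations among the rotated factors
round(efa_9$Phi, 2)
```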
In particular, the fractionation of the AFL and VAS subscales into four separate constructs provides some explanation beyond methodological differences for the divergence of our factor solution from that reported by Tomchek and colleagues (2014). In the aforementioned study, potential factor solutions were rejected based on the presence of two-item factors, causing the investigators to reduce the number of factors until all constructs represented by only two items were subsumed into larger subscales. In doing so, the “auditory and visual sensitivity” factor was created, consisting of the Auditory distractibility, Noise distress, and Visual sensitivity items combined into one heterogeneous scale. It is relatively uncontroversial that responsiveness to visual stimuli and responsiveness to auditory stimuli represent different constructs, but conceptual distinctions also exist between hyperresponsiveness to soft and to loud noises (Landon, Shepherd, & Lodhia, 2016; Levitin, Cole, Lincoln, & Bellugi, 2005; Phillips & Carr, 1998). Factor analysis of a larger number of auditory sensory behaviors in children with ASD has also found that “Difficulty in Background Noise” and “Aversive Reactions” represent separate dimensions of auditory reactivity (Egelhoff & Lane, 2013). The correlations between factor scores further support the distinctiveness of these constructs, with Auditory distractibility and Noise distress correlating at r = 0.43 and neither correlating more than r = 0.36 with Visual sensitivity scores. Thus, we conclude that the SSP’s limited item coverage forced Tomchek et al. (2014) to sacrifice dimensional accuracy in order to preserve the reliability of their factor-analytically derived subscales.
We also want to emphasize that a number of behaviors suggestive of “sensory seeking” (e.g. USS items 16, 17, 20) are equally interpretable as reflecting hyperactivity, as they covaried strongly with an item specifically addressing inattention (AFL item 27). This interpretation is supported by the finding that children with ADHD have more typical scores than children with SMD on all but the USS and AFL subscales of the SSP (Miller, Nielsen, & Schoen, 2012). Disentangling parent-reported sensory features from ADHD symptoms in children with ASD may prove difficult (see Wodka et al., 2016), but the establishment of measures that discriminate sensory features from comorbid psychopathology will greatly advance the validity of sensory research moving forward.
This study had a number of strengths, most notably the methodologic rigor of its psychometric analyses and the use of more advanced model-based indices to determine the composite reliability and general factor saturation of the SSP questionnaire in this population. As factor-analytic studies in this field have not historically employed best-practice EFA methods (Norris & Lecavalier, 2010), we believe that a greater adherence to published methodological guidelines will strengthen the psychometric properties of the resulting instruments and improve the replicability of results. Also, the sample of children with ASD employed was relatively large and geographically diverse, likely approximating the typical participants in ASD research. However, there were also several limitations to the current investigation. Due to the nature of experimental tasks in our lab, the majority of subjects taken from our in-house dataset were verbal, intellectually able, and behaviorally regulated enough to tolerate neuroimaging studies, potentially leading our current dataset to under-sample more severely impaired children. Moreover, as sociodemographic data varied in scope and format between NDAR studies, we were unable to determine or report the racial/ethnic or socioeconomic breakdown of the subjects.
Methodologically, our decision to include only ASD children weakened the claims that SSP total or subscale scores are not comparable between ASD and TD children, as no tests of measurement invariance were performed. Though it is still the case that a different factor solution in ASD and TD samples invalidates cross-group score comparisons, there remains the possibility that the SSP factor structure in TD children is also better explained by the 9-factor model derived in this study. However, even in the case of factorial equivalence, as has been tentatively established for the TSM, MOV, and LEW scales, there is still the possibility of ASD-TD differences in factor loadings or intercepts that invalidate the comparison of scores across groups (cf. Frazier et al., 2014; Murray, Booth, McKenzie, Kuenssberg, & O’Donnell, 2014; Wicherts & Dolan, 2010). In addition, while a large portion of the sample possessed data for the long-form SP, this study did not attempt to evaluate the factor structure or psychometric utility of subscales and broad “sensory quadrant” scores derived from the SP in ASD children. It remains possible that scales from the long-form measure possess factorial invariance in ASD and TD populations, though additional factor-analytic studies are required to test this hypothesis.
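Should ASD and TD data become available in a single dataset, measurement invariance could be evaluated in the usual sequential fashion. The sketch below is a minimal illustration using lavaan, assuming a combined data frame (dat) with a grouping variable (dx) and, for simplicity, treating items as continuous; ordinal items would additionally require equality constraints on thresholds. The model and variable names are placeholders.

```r
library(lavaan)

# Placeholder three-factor model for the replicated subscales
model <- '
  TSM =~ ssp1 + ssp2 + ssp3 + ssp4
  MOV =~ ssp5 + ssp6 + ssp7
  LEW =~ ssp8 + ssp9 + ssp10 + ssp11 + ssp12 + ssp13
'

configural <- cfa(model, data = dat, group = "dx")
metric     <- cfa(model, data = dat, group = "dx",
                  group.equal = "loadings")
scalar     <- cfa(model, data = dat, group = "dx",
                  group.equal = c("loadings", "intercepts"))

# Compare nested models; a non-significant decrement in fit supports invariance
anova(configural, metric, scalar)
```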
As a final limitation, our best-fitting factor structure contained a number of subscales that were too short to be psychometrically useful. We thus acknowledge that we have discouraged the use of both the SSP total score and most subscale scores without providing a suitable alternative for analyzing SSP data that have already been collected. Researchers hoping to add a sensory measure to their study protocol are instead encouraged to consider newer measures such as the Sensory Experiences Questionnaire 3.0 (SEQ; Ausderau et al., 2014b), the Sensory Processing Scale Inventory (Schoen, Miller, & Sullivan, 2017), or the Sensory Profile 2 (Dunn, 2014), all of which attempt to address the shortcomings of the SSP and original SP. These instruments all assess three core sensory domains (i.e., hyperresponsiveness, hyporesponsiveness, and sensory seeking) with items that cover a wider range of sensory modalities, along with additional theoretical constructs unique to some of the measures. However, while all three newer-generation instruments have been used in samples of children with ASD (Ausderau et al., 2016; Little, Ausderau, Sideris, & Baranek, 2015; Little, Dean, Tomchek, & Dunn, 2018; Tavassoli et al., 2018), there are no published reports specifically examining the psychometric properties of these measures in an ASD population. Of the three, we tentatively recommend the SEQ because a single published CFA study has confirmed its proposed factor structure in children with ASD and established measurement invariance between several important sub-populations (Ausderau et al., 2014b). The SEQ also includes a number of items measuring enhanced perceptual ability, a construct that is lacking from the other aforementioned measures and may be positively correlated with functional outcomes (Little et al., 2015). Researchers interested in assessing specific sub-constructs of sensory processing may potentially employ the four cross-modal SEQ subscales (Hyperresponsiveness: 31 items, Hyporesponsiveness: 18 items, Sensory Seeking: 31 items, Enhanced Perception: 12 items) individually. However, SEQ score profiles incorporating all four subscale scores are valuable in predicting clinical outcomes (Ausderau et al., 2016), leading us to suggest that clinicians and researchers employ the full-length questionnaire when possible. Despite our preliminary recommendation, additional research is needed to directly compare the reliability and validity of these measures in the ASD population and establish the invariance of their measurement models between diagnostic groups. Future psychometric analyses of these and other sensory questionnaires would also benefit from the estimation of bifactor models and the calculation of indices such as the omega coefficients.
Conclusion
The SSP has long served as one of the primary measures of sensory features among children with ASD, despite the relative absence of psychometric evidence supporting its use. Employing rigorous psychometric methods in a large sample of children with ASD, we sought to evaluate whether the SSP total score and subscale scores were appropriate for use in this population. Our analyses failed to support the validity of the proposed SSP structures in children with ASD, leading us to recommend against the interpretation of SSP subscale scores in children diagnosed with ASD. Furthermore, while the SSP total score is measured reliably and does seem to measure a general sensory factor of sorts, its interpretation is complicated by limited content validity and substantial bias due to multidimensionality. For these reasons, we also recommend against the use of the SSP total score as a measure of general sensory features in ASD. Follow-up exploratory analyses demonstrated that a solution of nine correlated factors best explains responses to the SSP items, but several of the constructs are represented by too few items to be psychometrically or clinically useful. Only the TSM, MOV, and LEW subscales were replicated in our analysis, and this study provides preliminary evidence of factorial invariance for those subscales. The TSM subscale in particular holds promise as a dimensional measure of food selectivity, but its content is likely too narrow for it to provide an accurate measure of generalized chemosensory hyperresponsiveness. Researchers hoping to collect a phenotypic measure of sensory features would be better served by one of several newer-generation questionnaires (Ausderau et al., 2014b; Dunn, 2014; Schoen et al., 2017). We believe that future research into the sensory manifestations of ASD would benefit greatly from the psychometric interrogation of existing sensory measures and the wider use of newer model-based indices to evaluate scale quality (Rodriguez et al., 2016a, 2016b).
Supplementary Material
Acknowledgements
The work described was supported by the National Institutes of Health under the following award numbers: T32GM007347, R01 MH102272, K01 MH 90232, R21 MH 109225, R21 MH10132, UL1 TR000445 from NCATS/NIH, and U54 HD083211. Data and/or research tools used in the preparation of this manuscript were obtained from the NIH-supported National Database for Autism Research (NDAR). NDAR is a collaborative informatics system created by the National Institutes of Health to provide a national resource to support and accelerate research in autism. This manuscript reflects the views of the authors and may not reflect the opinions or views of the NIH or of the Submitters submitting original data to NDAR. The authors would like to thank Dr. Grace Baranek for her input regarding our recommendations for the use of the Sensory Experiences Questionnaire.
Funding: This study was funded by the National Institutes of Health under the following award numbers: T32GM007347, R01 MH102272, K01 MH 90232, R21 MH 109225, R21 MH10132, UL1 TR000445 from NCATS/NIH, and U54 HD083211.
Compliance with Ethical Standards
Conflict of interest: The authors declare that they have no conflicts of interest with respect to their authorship or the publication of this article.
Ethical approval: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent: Informed consent was obtained from all individual participants included in the study.
References
- Ashburner J, Ziviani J, & Rodger S (2008). Sensory processing and classroom emotional, behavioral, and educational outcomes in children with autism spectrum disorder. The American Journal of Occupational Therapy, 62(5), 564–573. 10.1186/s13229-015-0060-x [DOI] [PubMed] [Google Scholar]
- Asparouhov T, & Muthén B (2010). Simple Second Order Chi-Square Correction: Technical Appendix Related to New Features in Mplus Version 6. Los Angeles, CA: Muthén & Muthén. [Google Scholar]
- Ausderau KK, Furlong M, Sideris J, Bulluck J, Little LM, Watson LR,& Baranek GT. (2014a). Sensory subtypes in children with autism spectrum disorder: latent profile transition analysis using a national survey of sensory features. Journal of Child Psychology and Psychiatry, and Allied Disciplines, 55(8), 935–944. 10.1111/jcpp.12219 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ausderau K, Sideris J, Furlong M, Little LM, Bulluck J, & Baranek GT (2014b). National survey of sensory features in children with ASD: factor structure of the sensory experience questionnaire (3.0). Journal of Autism and Developmental Disorders, 44(4), 915–925. 10.1007/s10803-013-1945-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ausderau KK, Sideris J, Little LM, Furlong M, Bulluck JC, & Baranek GT (2016). Sensory subtypes and associated outcomes in children with autism spectrum disorders. Autism Research, 9(12), 1316–1327. 10.1002/aur.1626 [DOI] [PubMed] [Google Scholar]
- Babyak MA, & Green SB (2010). Confirmatory factor analysis: An introduction for psychosomatic medicine researchers. Psychosomatic Medicine, 72(6), 587–597. 10.1097/PSY.0b013e3181de3f8a [DOI] [PubMed] [Google Scholar]
- Baum SH, Stevenson RA, & Wallace MT (2015). Behavioral, perceptual, and neural alterations in sensory and multisensory function in autism spectrum disorder. Progress in Neurobiology, 134, 140–160. 10.1016/j.pneurobio.2015.09.007 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ben-Sasson A, Cermak SA, Orsmond GI, Tager-Flusberg H, Kadlec MB, & Carter AS (2008). Sensory clusters of toddlers with autism spectrum disorders: Differences in affective symptoms. Journal of Child Psychology and Psychiatry, and Allied Disciplines, 49(8), 817–825. 10.1111/j.1469-7610.2008.01899.x [DOI] [PubMed] [Google Scholar]
- Ben-Sasson A, Hen L, Fluss R, Cermak SA, Engel-Yeger B, & Gal E (2009). A meta-analysis of sensory modulation symptoms in individuals with autism spectrum disorders. Journal of Autism and Developmental Disorders, 39(1), 1–11. 10.1007/s10803-008-0593-3 [DOI] [PubMed] [Google Scholar]
- Bentler PM (1990). Comparative fit indexes in structural models. Psychological Bulletin, 107(2), 238–246. 10.1037/0033-2909.107.2.238 [DOI] [PubMed] [Google Scholar]
- Black KR, Stevenson RA, Segers M, Ncube BL, Sun SZ, Philipp-Muller A, et al. (2017). Linking anxiety and insistence on sameness in autistic children: The role of sensory hypersensitivity. Journal of Autism and Developmental Disorders, 47(8), 2459–2470. 10.1007/s10803-017-3161-x [DOI] [PubMed] [Google Scholar]
- Borsboom D (2006). The attack of the psychometricians. Psychometrika, 71(3), 425–440. 10.1007/s11336-006-1447-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Boudjarane MA, Grandgeorge M, Marianowski R, Misery L, & Lemonnier E (2017). Perception of odors and tastes in autism spectrum disorders: A systematic review of assessments. Autism Research, 10(6), 1045–1057. 10.1002/aur.1760 [DOI] [PubMed] [Google Scholar]
- Boyle GJ (1991). Does item homogeneity indicate internal consistency or item redundancy in psychometric scales? Personality and Individual Differences, 12(3), 291–294. 10.1016/0191-8869(91)90115-r [DOI] [Google Scholar]
- Braeken J, & van Assen MALM (2017). An empirical Kaiser criterion. Psychological Methods, 22(3), 450–466. 10.1037/met0000074 [DOI] [PubMed] [Google Scholar]
- Browne MW (2001). An overview of analytic rotation in exploratory factor analysis. Multivariate Behavioral Research, 36(1), 111–150. 10.1207/s15327906mbr3601_05 [DOI] [Google Scholar]
- Brunner M, Nagy G, & Wilhelm O (2012). A tutorial on hierarchically structured constructs. Journal of Personality, 80(4), 795–795. 10.1111/j.1467-6494.2012.00749_1.x [DOI] [PubMed] [Google Scholar]
- Burns CO, Dixon DR, Novack M, & Granpeesheh D (2017). A Systematic Review of Assessments for Sensory Processing Abnormalities in Autism Spectrum Disorder. Review Journal of Autism and Developmental Disorders, 4(3), 209–224. 10.1007/s40489-017-0109-1 [DOI] [Google Scholar]
- Cermak SA, Curtin C, & Bandini L (2014). Sensory sensitivity and food selectivity in children with autism spectrum disorders In Patel VR, Preedy VR, and Martin CR (Eds.), Comprehensive Guide to Autism (pp. 2061–2076). New York, NY: Springer; 10.1007/978-1-4614-4788-7_126 [DOI] [Google Scholar]
- Cermak SA, Curtin C, & Bandini LG (2010). Food selectivity and sensory sensitivity in children with autism spectrum disorders. Journal of the American Dietetic Association, 110(2), 238–246. 10.1016/j.jada.2009.10.032 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Charman T, Loth E, Tillmann J, Crawley D, Wooldridge C, Goyard D,& Buitelaar JK.(2017). The EU-AIMS Longitudinal European Autism Project (LEAP): Clinical characterisation. Molecular Autism, 8(1), 27 10.1186/s13229-017-0145-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chen Y-H, Rodgers J, & McConachie H (2009). Restricted and repetitive behaviours, sensory processing and cognitive style in children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 39(4), 635–642. 10.1007/s10803-008-0663-6 [DOI] [PubMed] [Google Scholar]
- Chistol LT, Bandini LG, Must A, Phillips S, Cermak SA, & Curtin C (2018). Sensory Sensitivity and Food Selectivity in Children with Autism Spectrum Disorder. Journal of Autism and Developmental Disorders, 48(2), 583–591. 10.1007/s10803-017-3340-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cho E (2016). Making reliability reliable. Organizational Research Methods, 19(4), 651–682. 10.1177/1094428116656239 [DOI] [Google Scholar]
- Cohen J (1992). A power primer. Psychological Bulletin, 112(1), 155–159. [DOI] [PubMed] [Google Scholar]
- Corbett BA, Muscatello RA, & Blain SD (2016). Impact of sensory sensitivity on physiological stress response and novel peer interaction in children with and without autism spectrum disorder. Frontiers in Neuroscience, 10, 1–9. 10.3389/fnins.2016.00278 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Corbett BA, Schupp CW, Levine S, & Mendoza S (2009). Comparing cortisol, stress, and sensory sensitivity in children with autism. Autism Research, 2(1), 39–49. 10.1002/aur.64 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cortina JM (1993). What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology, 78(1), 98–104. 10.1037//0021-9010.78.1.98 [DOI] [Google Scholar]
- Costello AB, & Osborne JW (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research and Evaluation, 10(7), 1–9. [Google Scholar]
- Coulthard H, & Blissett J (2009). Fruit and vegetable consumption in children and their mothers. Moderating effects of child sensory sensitivity. Appetite, 52(2), 410–415. 10.1016/j.appet.2008.11.015 [DOI] [PubMed] [Google Scholar]
- Cronbach LJ (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297–334. 10.1007/BF02310555 [DOI] [Google Scholar]
- Crutzen R, & Peters G-JY (2017). Scale quality: Alpha is an inadequate estimate and factor-analytic evidence is needed first of all. Health Psychology Review, 11(3), 242–247. 10.1080/17437199.2015.1124240 [DOI] [PubMed] [Google Scholar]
- Davenport EC, Davison ML, Liou PY, & Love QU (2015). Reliability, dimensionality, and internal consistency as defined by Cronbach: Distinct albeit related concepts. Educational Measurement: Issues and Practice, 34(4), 4–9. 10.1111/emip.12095 [DOI] [Google Scholar]
- DeBoth KK, & Reynolds S (2017). A systematic review of sensory-based autism subtypes. Research in Autism Spectrum Disorders, 36, 44–56. 10.1016/j.rasd.2017.01.005 [DOI] [Google Scholar]
- DiStefano C, Liu J, Jiang N, & Shi D (2017). Examination of the weighted root mean square residual: Evidence for trustworthiness? Structural Equation Modeling: a Multidisciplinary Journal, 25(3), 453–466. 10.1080/10705511.2017.1390394 [DOI] [Google Scholar]
- Dunn W (1999). Sensory profile: User’s manual. San Antonio, TX: Psychological Corporation. [Google Scholar]
- Dunn W (2014). Sensory profile 2: User’s manual. San Antonio, TX: Psychological Corporation. [Google Scholar]
- Ee SI, Loh SY, Chinna K, & Marret MJ (2015). Cross-cultural adaptation and psychometric properties of the Malay version of the Short Sensory Profile. Physical and Occupational Therapy in Pediatrics, 36(2), 117–130. 10.3109/01942638.2015.1040574 [DOI] [PubMed] [Google Scholar]
- Egelhoff K, & Lane AE (2013). Brief report: Preliminary reliability, construct validity and standardization of the Auditory Behavior Questionnaire (ABQ) for children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 43(4), 978–984. 10.1007/s10803-012-1626-5 [DOI] [PubMed] [Google Scholar]
- Elwin M, Schroder A, Ek L, Wallsten T, & Kjellin L (2017). Sensory clusters of adults with and without autism spectrum conditions. Journal of Autism and Developmental Disorders, 47(3), 579–589. 10.1007/s10803-016-2976-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Engel-Yeger B (2010). The applicability of the Short Sensory Profile for screening sensory processing disorders among Israeli children. International Journal of Rehabilitation Research, 33(4), 311–318. 10.1097/mrr.0b013e32833abe59 [DOI] [PubMed] [Google Scholar]
- Engel-Yeger B, Hardal-Nasser R, & Gal E (2016). The relationship between sensory processing disorders and eating problems among children with intellectual developmental deficits. British Journal of Occupational Therapy, 79(1), 17–25. 10.1177/0308022615586418 [DOI] [Google Scholar]
- Fabrigar LR, Wegener DT, MacCallum RC, & Strahan EJ (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4(3), 272–299. 10.1037/1082-989X.4.3.272 [DOI] [Google Scholar]
- Floyd FJ, & Widaman KF (1995). Factor analysis in the development and refinement of clinical assessment instruments. Psychological Assessment, 7(3), 286–299. 10.1037/1040-3590.7.3.286 [DOI] [Google Scholar]
- Frazier TW, Ratliff KR, Gruber C, Zhang Y, Law PA, & Constantino JN (2014). Confirmatory factor analytic structure and measurement invariance of quantitative autistic traits measured by the social responsiveness scale-2. Autism, 18(1), 31–44. 10.1177/1362361313500382 [DOI] [PubMed] [Google Scholar]
- Garrido LE, Abad FJ, & Ponsoda V (2011). Performance of Velicer’s minimum average partial factor retention method with categorical variables. Educational and Psychological Measurement, 71(3), 551–570. 10.1177/0013164410389489 [DOI] [Google Scholar]
- Garrido LE, Abad FJ, & Ponsoda V (2013). A new look at Horn’s parallel analysis with ordinal variables. Psychological Methods, 18(4), 454–474. 10.1037/a0030005 [DOI] [PubMed] [Google Scholar]
- Gignac GE (2014). On the inappropriateness of using items to calculate total scale score reliability via coefficient alpha for multidimensional scales. European Journal of Psychological Assessment, 30(2), 130–139. 10.1027/1015-5759/a000181 [DOI] [Google Scholar]
- Glod M, Riby DM, Honey E, & Rodgers J (2015). Psychological correlates of sensory processing patterns in individuals with autism spectrum disorder: A systematic review. Review Journal of Autism and Developmental Disorders, 2(2), 199–221. 10.1007/s40489-015-0047-8 [DOI] [Google Scholar]
- Gorsuch RL (1997). Exploratory factor analysis: Its role in item analysis. Journal of Personality Assessment, 68(3), 532–560. 10.1207/s15327752jpa6803_5 [DOI] [PubMed] [Google Scholar]
- Green D, Chandler S, Charman T, Simonoff E, & Baird G (2016). Brief Report: DSM-5 sensory behaviours in children with and without an autism spectrum disorder. Journal of Autism and Developmental Disorders, 46(11), 3597–3606. 10.1007/s10803-016-2881-7 [DOI] [PubMed] [Google Scholar]
- Grice JW (2001). Computing and evaluating factor scores. Psychological Methods, 6(4), 430–450. 10.1037/1082-989x.6.4.430-450 [DOI] [PubMed] [Google Scholar]
- Hand BN, Dennis S, & Lane AE (2017). Latent constructs underlying sensory subtypes in children with autism: A preliminary study. Autism Research, 10(8), 1364–1371. 10.1002/aur.1787 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hall D, Huerta MF, McAuliffe MJ, & Farber GK (2012). Sharing heterogeneous data: The national database for autism research. Neuroinformatics, 10(4), 331–339. 10.1007/s12021-012-9151-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hazen EP, Stornelli JL, O’Rourke JA, Koesterer K, & McDougle CJ (2014). Sensory symptoms in autism spectrum disorders. Harvard Review of Psychiatry, 22(2), 112–124. 10.1097/01.HRP.0000445143.08773.58 [DOI] [PubMed] [Google Scholar]
- Hegarty JP II, Gu M, Spielman DM, Cleveland SC, Hallmayer JF, Lazzeroni LC, et al. (2018). A proton MR spectroscopy study of the thalamus in twins with autism spectrum disorder. Progress in Neuro-Psychopharmacology & Biological Psychiatry, 81, 153–160. 10.1016/j.pnpbp.2017.09.016 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hermida R (2015). The problem of allowing correlated errors in structural equation modeling: Concerns and considerations. Computational Methods in Social Sciences, 3(1), 5–17. [Google Scholar]
- Holgado Tello FP, Chacón Moscoso S, Barbero García I, & Vila Abad E (2010). Polychoric versus Pearson correlations in exploratory and confirmatory factor analysis of ordinal variables. Quality & Quantity, 44(1), 153–166. 10.1007/s11135-008-9190-y [DOI] [Google Scholar]
- Horn JL (1965). A rationale and test for the number of factors in factor analysis. Psychometrika, 30(2), 179–185. 10.1007/BF02289447 [DOI] [PubMed] [Google Scholar]
- Johnson CR, Turner K, Stewart PA, Schmidt B, Shui A, Macklin E,& Hyman SL (2014). Relationships between feeding problems, behavioral characteristics and nutritional quality in children with ASD. Journal of Autism and Developmental Disorders, 44(9), 2175–2184. 10.1007/s10803-014-2095-9 [DOI] [PubMed] [Google Scholar]
- Kaiser HF (1959). Computer program for varimax rotation in factor analysis. Educational and Psychological Measurement, 19(3), 413–420. 10.1177/001316445901900314 [DOI] [Google Scholar]
- Kelley K, & Pornprasertmanit S (2016). Confidence intervals for population reliability coefficients: Evaluation of methods, recommendations, and software for composite measures. Psychological Methods, 21(1), 69–92. 10.1037/a0040086 [DOI] [PubMed] [Google Scholar]
- Kuschner ES, Eisenberg IW, Orionzi B, Simmons WK, Kenworthy L, Martin A, & Wallace GL (2015). A preliminary study of self-reported food selectivity in adolescents and young adults with autism spectrum disorder. Research in Autism Spectrum Disorders, 15, 53–59. 10.1016/j.rasd.2015.04.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lajonchere C, Jones N, Coury DL, & Perrin JM (2012). Leadership in health care, research, and quality improvement for children and adolescents with autism spectrum disorders: Autism treatment network and autism intervention research network on physical health. Pediatrics, 130(Supplement 2), S62–S68. 10.1542/peds.2012-0900C [DOI] [PubMed] [Google Scholar]
- Landon J, Shepherd D, & Lodhia V (2016). A qualitative study of noise sensitivity in adults with autism spectrum disorder. Research in Autism Spectrum Disorders, 32, 43–52. 10.1016/j.rasd.2016.08.005 [DOI] [Google Scholar]
- Lane AE, Dennis SJ, & Geraghty ME (2011). Brief report: Further evidence of sensory subtypes in autism. Journal of Autism and Developmental Disorders, 41(6), 826–831. 10.1007/s10803-010-1103-y [DOI] [PubMed] [Google Scholar]
- Lane AE, Geraghty ME, Young GS, & Rostorfer JL (2014a). Problem eating behaviors in autism spectrum disorder are associated with suboptimal daily nutrient intake and taste/smell sensitivity. ICAN: Infant, Child, & Adolescent Nutrition, 6(3), 172–180. 10.1177/1941406414523981 [DOI] [Google Scholar]
- Lane AE, Molloy CA, & Bishop SL (2014b). Classification of children with autism spectrum disorder by sensory subtype: A case for sensory-based phenotypes. Autism Research, 7(3), 322–333. 10.1002/aur.1368 [DOI] [PubMed] [Google Scholar]
- Lane AE, Young RL, Baker AEZ, & Angley MT (2010). Sensory processing subtypes in autism: Association with adaptive behavior. Journal of Autism and Developmental Disorders, 40(1), 112–122. 10.1007/s10803-009-0840-2 [DOI] [PubMed] [Google Scholar]
- Levitin DJ, Cole K, Lincoln A, & Bellugi U (2005). Aversion, awareness, and attraction: Investigating claims of hyperacusis in the Williams syndrome phenotype. Journal of Child Psychology and Psychiatry, 46(5), 514–523. 10.1111/j.1469-7610.2004.00376.x [DOI] [PubMed] [Google Scholar]
- Li C-H (2016). The performance of ML, DWLS, and ULS estimation with robust corrections in structural equation models with ordinal variables. Psychological Methods, 21(3), 369–387. 10.1037/met0000093 [DOI] [PubMed] [Google Scholar]
- Liss M, Saulnier C, Fein D, & Kinsbourne M (2006). Sensory and attention abnormalities in autistic spectrum disorders. Autism, 10(2), 155–172. 10.1177/1362361306062021 [DOI] [PubMed] [Google Scholar]
- Little LM, Ausderau K, Sideris J, & Baranek GT (2015). Activity participation and sensory features among children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 45(9), 2981–2990. 10.1007/s10803-015-2460-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Little LM, Dean E, Tomchek S, & Dunn W (2018). Sensory processing patterns in autism, attention deficit hyperactivity disorder, and typical development. Physical and Occupational Therapy in Pediatrics, 38(3), 243–254. 10.1080/01942638.2017.1390809 [DOI] [PubMed] [Google Scholar]
- Lord C, Risi S, Lambrecht L, Cook EH, Leventhal BL, DiLavore PC,& Rutter M (2000). The autism diagnostic observation schedule-generic: A standard measure of social and communication deficits associated with the spectrum of autism. Journal of Autism and Developmental Disorders, 30(3), 205–223. [PubMed] [Google Scholar]
- Lord C, Rutter M, & Le Couteur A (1994). Autism Diagnostic Interview-Revised: A revised version of a diagnostic interview for caregivers of individuals with possible pervasive developmental disorders. Journal of Autism and Developmental Disorders, 24(5), 659–685. [DOI] [PubMed] [Google Scholar]
- Luisier A-C, Petitpierre G, Ferdenzi C, Clerc Bérod A, Giboreau A, Rouby C, & Bensafi M (2015). Odor perception in children with autism spectrum disorder and its relationship to food neophobia. Frontiers in Psychology, 6, 1830 10.3389/fpsyg.2015.01830 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Marco EJ, Hinkley LBN, Hill SS, & Nagarajan SS (2011). Sensory processing in autism: a review of neurophysiologic findings. Pediatric Research, 69(5 Pt 2), 48R–54R. 10.1203/PDR.0b013e3182130c54 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Marsh HW, Hau K-T, & Wen Z (2004). In search of golden rules: Comment on hypothesis-testing approaches to setting cutoff values for fit indexes and dangers in overgeneralizing Hu and Bentler’s (1999) findings. Structural Equation Modeling: A Multidisciplinary Journal, 11(3), 320–341. 10.1207/s15328007sem1103_2 [DOI] [Google Scholar]
- Mazurek MO, & Petroski GF (2015). Sleep problems in children with autism spectrum disorder: Examining the contributions of sensory over-responsivity and anxiety. Sleep Medicine, 16(2), 270–279. 10.1016/j.sleep.2014.11.006 [DOI] [PubMed] [Google Scholar]
- Mazurek MO, Keefer A, Shui A, & Vasa RA (2014). One-year course and predictors of abdominal pain in children with autism spectrum disorders: The role of anxiety and sensory over-responsivity. Research in Autism Spectrum Disorders, 8(11), 1508–1515. 10.1016/j.rasd.2014.07.018 [DOI] [Google Scholar]
- Mazurek MO, Vasa RA, Kalb LG, Kanne SM, Rosenberg D, Keefer A, & Lowery LA (2013). Anxiety, sensory over-responsivity, and gastrointestinal problems in children with autism spectrum disorders. Journal of Abnormal Child Psychology, 41(1), 165–176. 10.1007/s10802-012-9668-x [DOI] [PubMed] [Google Scholar]
- McCormick C, Hessl D, Macari SL, Ozonoff S, Green C, & Rogers SJ (2014). Electrodermal and behavioral responses of children with autism spectrum disorders to sensory and repetitive stimuli. Autism Research, 7(4), 468–480. 10.1002/aur.1382 [DOI] [PMC free article] [PubMed] [Google Scholar]
- McDonald RP (1999). Test theory: A unified treatment. Psychology Press. [Google Scholar]
- McIntosh DN, Miller LJ, Shyu V, & Dunn W (1999). Development and validation of the short sensory profile. Sensory Profile User’s Manual, 59–73. [Google Scholar]
- McNeish D, An J, & Hancock GR (2018). The thorny relation between measurement quality and fit index cutoffs in latent variable models. Journal of Personality Assessment, 100(1), 43–52. 10.1080/00223891.2017.1281286 [DOI] [PubMed] [Google Scholar]
- McNeish D (2017). Thanks coefficient alpha, We’ll take it from here. Psychological Methods. 10.1037/met0000144 [DOI] [PubMed] [Google Scholar]
- Mikkelsen M, Wodka EL, Mostofsky SH, & Puts NAJ (2018). Autism spectrum disorder in the scope of tactile processing. Developmental Cognitive Neuroscience, 29, 140–150. 10.1016/j.dcn.2016.12.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Miller LJ, Anzalone ME, Lane SJ, Cermak SA, & Osten ET (2007). Concept evolution in sensory integration: A proposed nosology for diagnosis. The American Journal of Occupational Therapy, 61(2), 135–140. [DOI] [PubMed] [Google Scholar]
- Miller LJ, Nielsen DM, & Schoen SA (2012). Attention deficit hyperactivity disorder and sensory modulation disorder: A comparison of behavior and physiology. Research in Developmental Disabilities, 33(3), 804–818. 10.1016/j.ridd.2011.12.005 [DOI] [PubMed] [Google Scholar]
- Murray AL, Booth T, McKenzie K, Kuenssberg R, & O’Donnell M (2014). Are autistic traits measured equivalently in individuals with and without an autism spectrum disorder? An invariance analysis of the Autism Spectrum Quotient Short Form. Journal of Autism and Developmental Disorders, 44(1), 55–64. 10.1007/s10803-013-1851-6 [DOI] [PubMed] [Google Scholar]
- Nadon G, Feldman DE, Dunn W, & Gisel E (2011). Association of sensory processing and eating problems in children with autism spectrum disorders. Autism Research and Treatment, 2011, 1–8. 10.1155/2011/541926 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nederkoorn C, Jansen A, & Havermans RC (2015). Feel your food. The influence of tactile sensitivity on picky eating in children. Appetite, 84, 7–10. 10.1016/j.appet.2014.09.014 [DOI] [PubMed] [Google Scholar]
- Neil L, Green D, & Pellicano E (2017). The psychometric properties of a new measure of sensory behaviors in autistic children. Journal of Autism and Developmental Disorders, 47(4), 1261–1268. 10.1007/s10803-016-3018-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Neil L, Olsson NC, & Pellicano E (2016). The relationship between intolerance of uncertainty, sensory sensitivities, and anxiety in autistic and typically developing children. Journal of Autism and Developmental Disorders, 46(6), 1962–1973. 10.1007/s10803-016-2721-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Norris M, & Lecavalier L (2010). Evaluating the use of exploratory factor analysis in developmental disability psychological research. Journal of Autism and Developmental Disorders, 40(1), 8–20. 10.1007/s10803-009-0816-2 [DOI] [PubMed] [Google Scholar]
- Nunnally JC, & Bernstein IH (1994). Psychometric theory. New York, NY: MacGraw-Hill. [Google Scholar]
- O’Brien J, Tsermentseli S, & Cummins O (2009). Discriminating children with autism from children with learning difficulties with an adaptation of the Short Sensory Profile. Early Child Development and Care, 179(4), 383–394. 10.1080/03004430701567926 [DOI] [Google Scholar]
- O’Donnell S, Deitz J, Kartin D, Nalty T, & Dawson G (2012). Sensory processing, problem behavior, adaptive behavior, and cognition in preschool children with autism spectrum disorders. The American Journal of Occupational Therapy, 66(5), 586–594. 10.5014/ajot.2012.004168 [DOI] [PubMed] [Google Scholar]
- Orekhova EV, Tsetlin MM, Butorina AV, Novikova SI, Gratchev VV, Sokolov PA, et al. (2012). Auditory cortex responses to clicks and sensory modulation difficulties in children with autism spectrum disorders (ASD). PloS One, 7(6), e39906 10.1371/journal.pone.0039906 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Phillips DP, & Carr MM (1998). Disturbances of loudness perception. Journal of the American Academy of Audiology, 9(5), 371–379. [PubMed] [Google Scholar]
- Preacher KJ, & MacCallum RC (2003). Repairing Tom Swift’s electric factor analysis machine. Understanding Statistics, 2(1), 13–43. 10.1207/s15328031us0201_02 [DOI] [Google Scholar]
- R Core Team (2017). R: A language and environment for statistical computing R Foundation for Statistical Computing, Vienna, Austria: https://www.R-project.org/. [Google Scholar]
- Reeve BB, Hays RD, Bjorner JB, Cook KF, Crane PK, Teresi JA,& PROMIS Cooperative Group. (2007). Psychometric evaluation and calibration of health-related quality of life item banks: Plans for the patient-reported outcomes measurement information system (PROMIS). Medical Care, 45(5), S22–S31. 10.1097/01.mlr.0000250483.85507.04 [DOI] [PubMed] [Google Scholar]
- Reise SP (2012). The rediscovery of bifactor measurement models. Multivariate Behavioral Research, 47(5), 667–696. 10.1080/00273171.2012.715555 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Reise SP, Scheines R, & Widaman KF (2013). Multidimensionality and structural coefficient bias in structural equation modeling: A bifactor perspective. Educational and Psychological Measurement, 73(1), 5–26. 10.1177/0013164412449831 [DOI] [Google Scholar]
- Revelle W, & Zinbarg RE (2009). Coefficients alpha, beta, omega, and the glb: Comments on Sijtsma. Psychometrika, 74(1), 145–154. 10.1007/s11336-008-9102-z [DOI] [Google Scholar]
- Revelle W (2017). psych: Procedures for Personality and Psychological Research, Northwestern University, Evanston, Illinois, USA [Google Scholar]
- Robertson CE, & Baron-Cohen S (2017). Sensory perception in autism. Nature Reviews. Neuroscience, 18(11), 671–684. http://doi.org/10.1038/nrn.2017.112 [DOI] [PubMed] [Google Scholar]
- Rodriguez A, Reise SP, & Haviland MG (2016a). Applying bifactor statistical indices in the evaluation of psychological measures. Journal of Personality Assessment, 98(3), 223–237. 10.1080/00223891.2015.1089249 [DOI] [PubMed] [Google Scholar]
- Rodriguez A, Reise SP, & Haviland MG (2016b). Evaluating bifactor models: Calculating and interpreting statistical indices. Psychological Methods, 21(2), 137–150. 10.1037/met0000045 [DOI] [PubMed] [Google Scholar]
- Rogers SJ, Hepburn S, & Wehner E (2003). Parent reports of sensory symptoms in toddlers with autism and those with other developmental disorders. Journal of Autism and Developmental Disorders, 33(6), 631–642. [DOI] [PubMed] [Google Scholar]
- Rosseel Y (2012). Lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1–36. http://doi.org/10.18637/jss.v048.i02 [Google Scholar]
- Samson AC, Phillips JM, Parker KJ, Shah S, Gross JJ, & Hardan AY (2014). Emotion dysregulation and the core features of autism spectrum disorder. Journal of Autism and Developmental Disorders, 44(7), 1766–1772. 10.1007/s10803-013-2022-5 [DOI] [PubMed] [Google Scholar]
- Sass DA, & Schmitt TA (2010). A comparative investigation of rotation criteria within exploratory factor analysis. Multivariate Behavioral Research, 45(1), 73–103. 10.1080/00273170903504810 [DOI] [PubMed] [Google Scholar]
- Schmid J, & Leiman JM (1957). The development of hierarchical factor solutions. Psychometrika, 22(1), 53–61. 10.1007/BF02289209 [DOI] [Google Scholar]
- Schoen SA, Miller LJ, & Green KE (2008). Pilot study of the sensory over-responsivity scales: Assessment and inventory. The American Journal of Occupational Therapy, 62(4), 393–406. [DOI] [PubMed] [Google Scholar]
- Schoen SA, Miller LJ, Brett-Green BA, & Nielsen DM (2009). Physiological and behavioral differences in sensory processing: A comparison of children with autism spectrum disorder and sensory modulation disorder. Frontiers in Integrative Neuroscience, 3, 1–11. 10.3389/neuro.07.029.2009 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schoen SA, Miller LJ, & Sullivan J (2017). The development and psychometric properties of the Sensory Processing Scale Inventory: A report measure of sensory modulation. Journal of Intellectual & Developmental Disability, 42(1), 12–21. 10.3109/13668250.2016.1195490 [DOI] [Google Scholar]
- Sijtsma K (2009). On the use, the misuse, and the very limited usefulness of Cronbach’s alpha. Psychometrika, 74(1), 107–120. 10.1007/s11336-008-9101-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Siper PM, Kolevzon A, Wang AT, Buxbaum JD, & Tavassoli T (2017). A clinician-administered observation and corresponding caregiver interview capturing DSM-5 sensory reactivity symptoms in children with ASD. Autism Research, 10(6), 1133–1140. 10.1002/aur.1750 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Smith JA (2016). Sensory processing as a predictor of feeding/eating behaviors in children with autism spectrum disorder. The Open Journal of Occupational Therapy, 4(2), 1–11. http://doi.org/10.15453/2168-6408.1197 [Google Scholar]
- Stafford LD, Tsang I, López B, Severini M, & Iacomini S (2017). Autistic traits associated with food neophobia but not olfactory sensitivity. Appetite, 116, 584–588. 10.1016/j.appet.2017.05.054 [DOI] [PubMed] [Google Scholar]
- Tavassoli T, Bellesheim K, Siper PM, Wang AT, Halpern D, Gorenstein M, et al. (2016). Measuring sensory reactivity in autism spectrum disorder: Application and simplification of a clinician-administered sensory observation scale. Journal of Autism and Developmental Disorders, 46(1), 287–293. 10.1007/s10803-015-2578-3 [DOI] [PubMed] [Google Scholar]
- Tavassoli T, Miller LJ, Schoen SA, Jo Brout J, Sullivan J, & Baron-Cohen S (2018). Sensory reactivity, empathizing and systemizing in autism spectrum conditions and sensory processing disorder. Developmental Cognitive Neuroscience, 29, 72–77. 10.1016/j.dcn.2017.05.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Tomarken AJ, & Waller NG (2003). Potential problems with “well fitting” models. Journal of Abnormal Psychology, 112(4), 578–598. [DOI] [PubMed] [Google Scholar]
- Tomchek SD, & Dunn W (2007). Sensory processing in children with and without autism: A comparative study using the short sensory profile. The American Journal of Occupational Therapy, 61(2), 190–200. [DOI] [PubMed] [Google Scholar]
- Tomchek SD, Huebner RA, & Dunn W (2014). Patterns of sensory processing in children with an autism spectrum disorder. Research in Autism Spectrum Disorders, 8(9), 1214–1224. 10.1016/j.rasd.2014.06.006 [DOI] [Google Scholar]
- Tomchek SD, Little LM, & Dunn W (2015). Sensory pattern contributions to developmental performance in children with autism spectrum disorder. The American Journal of Occupational Therapy, 69(5), 6905185040p1–10. 10.5014/ajot.2015.018044 [DOI] [PubMed] [Google Scholar]
- Tomchek SD, Little LM, Myers J, & Dunn W (2018). Sensory subtypes in preschool aged children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 48(6), 2139–2147. 10.1007/s10803-018-3468-2 [DOI] [PubMed] [Google Scholar]
- Tucker LR, & Lewis C (1973). A reliability coefficient for maximum likelihood factor analysis. Psychometrika, 38(1), 1–10. 10.1007/BF02291170 [DOI] [Google Scholar]
- Uljarević M, Lane A, Kelly A, & Leekam S (2016). Sensory subtypes and anxiety in older children and adolescents with autism spectrum disorder. Autism Research, 9(10), 1073–1078. 10.1002/aur.1602 [DOI] [PubMed] [Google Scholar]
- Uljarević M, Baranek G, Vivanti G, Hedley D, Hudry K, & Lane A (2017). Heterogeneity of sensory features in autism spectrum disorder: Challenges and perspectives for future research. Autism Research, 10(5), 703–710. 10.1002/aur.1747 [DOI] [PubMed] [Google Scholar]
- Vandenberg RJ, & Lance CE (2000). A review and synthesis of the measurement invariance literature: Suggestions, practices, and recommendations for organizational research. Organizational Research Methods, 3(1), 4–70. 10.1177/109442810031002 [DOI] [Google Scholar]
- Velicer WF (1976). Determining the number of components from the matrix of partial correlations. Psychometrika, 41(3), 321–327. [Google Scholar]
- Velicer WF, Eaton CA, & Fava JL (2000). Construct explication through factor or component analysis: A review and evaluation of alternative procedures for determining the number of factors or components In Goffin RD & Helmes E (Eds.), Problems and solutions in human assessment (pp. 41–71). Boston, MA: Springer; 10.1007/978-1-4615-4397-8_3 [DOI] [Google Scholar]
- Ware JE Jr., & Gandek B (1998). Methods for testing data quality, scaling assumptions, and reliability: The IQOLA project approach. Journal of Clinical Epidemiology, 51(11), 945–952. 10.1016/S0895-4356(98)00085-7 [DOI] [PubMed] [Google Scholar]
- Wicherts JM, & Dolan CV (2010). Measurement invariance in confirmatory factor analysis: An illustration using IQ test performance of minorities. Educational Measurement: Issues and Practice, 29(3), 39–47. 10.1111/j.1745-3992.2010.00182.x [DOI] [Google Scholar]
- Wiggins LD, Robins DL, Bakeman R, & Adamson LB (2009). Brief report: Sensory abnormalities as distinguishing symptoms of autism spectrum disorders in young children. Journal of Autism and Developmental Disorders, 39(7), 1087–1091. 10.1007/s10803-009-0711-x [DOI] [PubMed] [Google Scholar]
- Wigham S, Rodgers J, South M, McConachie H, & Freeston M (2015). The interplay between sensory processing abnormalities, intolerance of uncertainty, anxiety and restricted and repetitive behaviours in autism spectrum disorder. Journal of Autism and Developmental Disorders, 45(4), 943–952. 10.1007/s10803-014-2248-x [DOI] [PubMed] [Google Scholar]
- Wodka EL, Puts NAJ, Mahone EM, Edden RAE, Tommerdahl M, & Mostofsky SH(2016). The role of attention in somatosensory processing: A multi-trait, multi-method analysis. Journal of Autism and Developmental Disorders, 46(10), 3232–3241. 10.1007/s10803-016-2866-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yang Y, & Xia Y (2015). On the number of factors to retain in exploratory factor analysis for ordered categorical data. Behavior Research Methods, 47(3), 756–772. 10.3758/s13428-014-0499-2 [DOI] [PubMed] [Google Scholar]
- Yates A (1987). Multivariate exploratory data analysis: A perspective on exploratory factor analysis. Albany: State University of New York Press. [Google Scholar]
- Yerys BE, Nissley-Tsiopinis J, de Marchena A, Watkins MW, Antezana L, Power TJ, & Schultz RT (2017). Evaluation of the ADHD Rating Scale in Youth with Autism. Journal of Autism and Developmental Disorders, 47(1), 90–100. 10.1007/s10803-016-2933-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- Yu C-Y (2002). Evaluating cutoff criteria of model fit indices for latent variable models with binary and continuous outcomes. (Unpublished doctoral dissertation). University of California, Los Angeles. [Google Scholar]
- Zinbarg RE, Revelle W, Yovel I, & Li W (2005). Cronbach’s α, Revelle’s β, and Mcdonald’s ωH: Their relations with each other and two alternative conceptualizations of reliability. Psychometrika, 70(1), 123–133. 10.1007/s11336-003-0974-7 [DOI] [Google Scholar]