Implementation Research and Practice. 2022 Aug 4;3:26334895221116065. doi: 10.1177/26334895221116065

Construct validity of the school-implementation climate scale

Andrew J Thayer 1, Clayton R Cook 2, Chayna Davis 3, Eric C Brown 4, Jill Locke 3, Mark G Ehrhart 5, Gregory A Aarons 6, Elissa Picozzi 3, Aaron R Lyon 3
PMCID: PMC9924285  PMID: 37091097

Abstract

Background

Implementation climate is an organizational construct theorized to facilitate the adoption and delivery of evidence-based practices. Within schools, teachers often are tasked with implementing universal prevention programs. Therefore, they are ideal informants when assessing school implementation climate for initial and continuous implementation improvement efforts. The purpose of this study was to examine the construct validity (i.e., factor structure and convergent/divergent validity) of a school-adapted measure of strategic implementation climate called the School Implementation Climate Scale (SICS).

Methods

Confirmatory factor analyses of SICS data, collected from 441 teachers in 52 schools, were used to compare uncorrelated and correlated first-order factor models and a second-order hierarchical model. Correlations with other school measures were examined to assess the convergent and divergent validity of the SICS.

Results

Results demonstrated acceptable internal consistency for each SICS subscale (αs > 0.80 for all subscales) and construct validity of the hypothesized factor structure of the SICS with three new scales. The hierarchical second-order factor structure with eight first-order factors was found to best model the SICS data. Correlations with other school measures were in the expected direction and magnitude.

Conclusions

Results from this study provide psychometric evidence that supports the use of the SICS to inform implementation research and practice in schools.

Plain Language Summary

Schools are busy trying to implement various universal programs and systems to help support kids in their growth. Beginning and sustaining these efforts is quite challenging, and there is a need for tools and ideas to help those implementation efforts. One concept is implementation climate, which is broadly the school staff’s perception of the implementation support for a given practice. However, no measure currently exists to help schools assess their implementation climate. The goal of our study was to adapt a measure of implementation climate used in other settings to the school environment. We used feedback from educational experts to make changes and used various analyses to determine if the newly adapted measure was psychometrically sound. Findings suggest the new measure is usable to guide implementation efforts in schools.

Keywords: schools, evidence-based practices, organizational climate, implementation climate


The adoption, high-quality delivery, and sustainment of evidence-based practices (EBPs) are important priorities for schools seeking to prevent and address a wide range of social, emotional, behavioral, and academic needs (Lyon & Bruns, 2019). EBP implementation is subject to a constellation of inner and outer organizational factors (Aarons et al., 2011). Outer context factors comprise the environment external to the organization, which exerts a distal influence on implementation efforts. Inner context factors are those within the organizational context where implementation ultimately occurs, such as organizational climate and culture, and are more proximal to EBP implementation (Moullin et al., 2019). Considering this, researchers have developed and validated tools that measure key aspects of the inner context that can be used for data-informed decisions, as identified in frameworks such as Exploration, Preparation, Implementation and Sustainment (EPIS; Moullin et al., 2019). Implementation climate represents a strategic aspect of organizational climate most directly linked to implementation. Past research in behavioral health has provided strong support for a measure of implementation climate—the Implementation Climate Scale (ICS; Ehrhart et al., 2014)—and a direct adaptation of that measure has shown promise in school settings (Lyon et al., 2018). However, this past research did not include a rigorous evaluation of the items for school settings or the possible emergence of new dimensions in such settings. Thus, the purpose of this study was to continue this measurement development effort by performing a thorough measure adaptation process for school settings.

General Organizational Climate Versus Strategic Implementation Climate

There are two main categories of organizational climate research: general and strategic. Broadly defined, general organizational climate is “the shared meaning organizational members attach to the events, policies, practices, and procedures they experience and the behaviors they see being rewarded, supported, and expected” (Ehrhart et al., 2014, p. 69). General organizational climate reflects the aggregate, shared perceptions of individuals in a given setting, in contrast to its individual-level counterpart, psychological climate, which reflects the psychological experience of single employees (Glick, 1985; James & Jones, 1974). Several studies have linked general organizational climate to broad performance outcomes (e.g., Mitchell et al., 2010; Ostroff, 1993; Schulte et al., 2009). Applied to schools, general organizational climate is an important determinant of teacher job satisfaction (Xiaofu & Qiwen, 2007), commitment to the school's values and mission (Riehl & Sipple, 1996), and, more broadly, EBP implementation fidelity (Williams et al., 2019). However, general organizational climate does not reflect staff's perceptions of actions and experiences specifically related to implementation efforts, and thus lacks the specificity needed to identify the particular climate elements and mechanisms of successful implementation efforts. To fully capture those specific elements and mechanisms, a granular view of climate that focuses strictly on implementation may be helpful.

In contrast to general climate, strategic organizational climate captures the extent to which an organization's policies, practices, procedures, and systems support a specific organizational process or outcome. There are numerous strategic climates including strategic implementation climate, which represents school staff's perceptions that EBP implementation is expected, supported, integrated, and rewarded (Ehrhart et al., 2014; Williams et al., 2018). Strategic implementation climate is considered more proximal than general climate to implementation outcomes because it reflects how staff think and feel about their experiences adopting and delivering an EBP. Schools actively pursuing EBP implementation are likely to include policies, supports, recognition systems, and communications from leadership that create the conditions that school staff experience and perceive related to an implementation effort.

To date, researchers have demonstrated that strategic implementation climate is a key determinant of successful EBP implementation (Williams et al., 2018). Considering its importance, Ehrhart et al. (2014) developed and validated the ICS in the context of EBP implementation in community mental health settings. The ICS included six dimensions measuring staff perceptions of the implementation climate: Focus on EBPs, representing how much implementing EBPs is an organizational priority; Educational Support for EBPs, representing how much training and materials are provided to support EBP implementation; Recognition for EBPs, representing how much the organization values and recognizes providers for EBP implementation; Rewards for EBPs, representing how the organization provides financial compensation or benefits for EBP implementation; Selection for EBPs, representing how the organization considers prior EBP experience when recruiting, selecting, and hiring employees; and Selection for Openness, representing how the organization considers applicant openness to new practices, adaptability, and flexibility when recruiting, selecting, and hiring employees. According to this research, organizations with low levels of strategic implementation climate do not demonstrate that EBP implementation is valued (Ehrhart et al., 2014). In these cases, there is limited focus on EBPs, limited support provided, and/or few forms of recognition and acknowledgment for staff who invest in and improve their EBP implementation. Considering that schools are an ideal setting for the delivery of universal social, emotional, behavioral, and academic prevention programs, strategic implementation climate may be a critical construct to examine in schools (Locke et al., 2016).

Prior Research on the Implementation Climate Scale in Schools

Although there are well-established measures of organizational climate (Patterson et al., 2005) and school climate (You et al., 2014), none assesses strategic implementation climate. Recently, researchers evaluated the original ICS with minor wording changes for the school context using informants representing a single behavioral health consultant per school (Lyon et al., 2018). The ICS measured the degree to which school-based behavioral health providers perceived that school leaders and systems expected, supported, and recognized staff for EBP implementation. Initial findings demonstrated a five-factor solution with a general second-order factor. The Rewards scale of the original ICS resulted in poor factor loadings and low internal consistency. An item review conducted by the researchers suggested that the items for the Rewards scale did not fit the school setting (e.g., financial incentives are not provided in schools) and would need to be redesigned with stakeholder feedback. Overall, this first attempt at applying the ICS to the school setting demonstrated strong psychometric properties, with evidence supporting acceptable internal consistency of five scales (αs > 0.7) and strong correlations with other measures of the inner school implementation context, including implementation leadership (r = .85) and implementation citizenship behavior (r = .82; Lyon et al., 2018).1

Although this research provided support for the use of the ICS in school settings, the items reflected the original ICS wording and subscales as developed in a community mental health context, without prior analysis to determine if the dimensions and items were appropriate for schools. Schools are unique systems involving multiple levels of administration, varied resource pools and allocations, multiple systems of support for students, and often overlapping role responsibilities between staff. As has been the case in past research adapting the ICS to other contexts (Ehrhart et al., 2016), the ICS may need adaptation to capture the idea of a school-specific implementation climate. As part of a federally funded project, researchers (Locke et al., 2019) held a series of focus groups with three different educational informant groups representing different systemic levels of school implementation efforts—district administrators, principals, and teachers—to adapt the ICS and additional organizational instruments for use in educational research and practice.

Purpose of this Research

The above qualitative adaptation study with educational stakeholders informed the current measurement validation study in elementary schools that were implementing one of two evidence-based universal prevention programs: School-Wide Positive Behavior Intervention and Supports (SW-PBIS; Horner et al., 2010) and Promoting Alternative THinking Strategies (PATHS; Greenberg et al., 1995). The study had three primary aims related to validating the School Implementation Climate Scale (SICS). The first aim was to adapt, refine, and remove items to better capture the current or new ICS dimensions. The second aim was to validate the hypothesized scales by conducting confirmatory analyses of the SICS. Within this aim, we recognized the need to examine whether structural validity differed if the new SICS referred to a general EBP within the item stems (e.g., “implementing a universal EBP”) or a specific referent (e.g., “implementing PATHS”). Both approaches to item development and measurement exist across studies and disciplines and, to the best of our knowledge, have not been directly compared within schools. Considering that schools are often integrating various efforts and systems (McIntosh & Goodman, 2016), it would be relevant to know whether the SICS could apply to all implementation efforts or would need to be tailored to each effort. We hypothesized that a seven-factor model capturing four of the original ICS factors plus three additional factors could be identified with acceptable reliability and construct validity. We also hypothesized that the structural validity of the measures would change based on the EBP referent. The third aim was to examine convergent and divergent validity with appropriate measures. Within this aim, we hypothesized that the SICS would evidence moderate associations with a measure of general organizational climate. We also hypothesized that the SICS would demonstrate moderate convergent validity with staff attitudes towards teaching, as both constructs are perceptual, measured using similar methods, and associated with the environmental condition of the school. Finally, we hypothesized that the SICS would evidence divergent associations with demographic characteristics, as the demographic makeup of a school should have some, but negligible, effect on a school's ability to support an implementation effort.

Method

Setting and Participants

Settings. Schools were recruited for participation if they were actively implementing one of two evidence-based universal prevention programs: SW-PBIS (n = 39 schools) or PATHS (n = 13 schools). The study included 441 teachers from 52 elementary schools in Washington, Ohio, and Illinois. These states were selected through established partnerships with intermediary organizations assisting the research team. Because students were not direct participants in this study, we gathered student population data from school websites. These websites often reported only aggregate data—a limitation we discuss later in this paper. Overall, schools in our sample served a racially/ethnically (Non-White 66%; range: 21%–100%) and socioeconomically diverse (low-income status 57%; range: 4%–100%) student population.

Teacher-level demographics. There is not yet an empirically supported consensus on the number of respondents within an organization needed to complete measures of climate. Current conventions within the literature on mental health organizations average nine to ten respondents per organization (Glisson et al., 2008). Using that as a benchmark, we randomly recruited, on average, nine teachers per school to complete the SICS. Complete demographic information for participants is shown in Table 1. As depicted in the table, most teachers identified as female, had at least a Master's degree, had an average of 11.6 years of experience in their current role (SD = 7.0), and were predominantly White. All included participants provided consent to the study.

Table 1.

Teacher demographics for the SICS samples by type of program.

Teacher characteristic SW-PBIS schools frequency (%) PATHS schools frequency (%) Total frequency (%)
Age
 18 to 24 years old 7 (3.2) 14 (6.3) 21 (4.8)
 25 to 34 years old 65 (29.8) 64 (29.0) 129 (29.4)
 35 to 44 years old 58 (26.6) 63 (28.5) 121 (27.6)
 45 to 54 years old 56 (25.7) 47 (21.3) 103 (23.5)
 55 to 64 years old 31 (14.2) 30 (13.6) 61 (13.9)
 65 to 74 years old 1 (0.5) 3 (1.4) 4 (0.9)
 Total 218 (100.0) 221 (100.0) 439 (100.0)
Gender
 Male 27 (12.4) 19 (8.6) 46 (10.5)
 Female 190 (87.2) 201 (91.4) 391 (89.3)
 Other 1 (0.5) 0 (0.0) 1 (0.2)
 Total 218 (100.0) 220 (100.0) 438 (100.0)
Race
 American Indian or Alaskan Native 7 (3.2) 1 (0.5) 8 (1.8)
 Asian 1 (0.5) 5 (2.3) 6 (1.4)
 Black or African American 14 (6.5) 8 (3.7) 22 (5.1)
 Native Hawaiian or Pacific Islander 0 (0.0) 1 (0.5) 1 (0.2)
 White or Caucasian 179 (82.5) 184 (85.2) 363 (83.8)
 Multiracial 11 (5.1) 10 (4.6) 21 (4.8)
 Other 5 (2.3) 7 (3.2) 12 (2.8)
 Total 217 (100.0) 216 (100.0) 433 (100.0)
Ethnicity
 Latino/Hispanic 14 (6.4) 17 (7.7) 31 (7.1)
 Non-Latino/Hispanic 204 (93.6) 203 (92.3) 407 (92.9)
 Total 218 (100.0) 220 (100.0) 438 (100.0)
Highest degree earned
 Bachelors 72 (33.0) 68 (30.9) 140 (32.0)
 Masters 145 (66.5) 152 (69.1) 297 (67.8)
 Doctoral 1 (0.5) 0 (0.0) 1 (0.2)
 Total 218 (100.0) 220 (100.0) 438 (100.0)
Grade
 K – 2nd 92 (42.0) 99 (44.6) 191 (43.3)
 3rd – 5th and other 127 (58.0) 123 (55.4) 250 (56.7)
 Total 219 (100.0) 222 (100.0) 441 (100.0)
Years in current role, M ± SD (n) 11.9 ± 6.9 (218) 11.3 ± 7.1 (220) 11.6 ± 7.0 (438)
Years at school, M ± SD (n) 7.0 ± 6.1 (218) 6.9 ± 5.9 (220) 6.9 ± 6.0 (438)

Note. SW-PBIS = School-Wide Positive Behavior Intervention and Supports; PATHS = Promoting Alternative THinking Strategies; SICS = School Implementation Climate Scale.

Procedures

This study took place as part of a large-scale, federally funded measurement adaptation and development project creating school-based tools for organizational constructs. Prior to conducting validation studies, the measures underwent a series of revisions to adapt them for use in schools by increasing the relevance, fit, and acceptability of each measure (Locke et al., 2019). The measures and constructs were adapted first via an expert summit and then via mixed-methods focus group sessions with key educator stakeholder groups. This was an iterative process drawing upon the knowledge of these content experts, adjusting the ICS, and then re-examining the measure with the experts until consensus was reached on what items and scales to test. Based on information from these focus groups, four adaptations were made. First, although the original ICS had a Rewards factor, stakeholders indicated that financial incentives, unlike in other service settings (Steele et al., 2009), are rare and inappropriate in the school context; thus, new reward items appropriate to schools were needed. Second, the two implementation climate dimensions related to selection—Selection for EBP and Selection for Openness—were viewed as assessing information that was typically inaccessible to teachers and were therefore dropped. Third, stakeholders recommended changes to item wording throughout to increase comprehensibility and appropriateness for use with educators (Hambleton, 1996). Fourth, based on stakeholder input and a review of relevant literature, three additional scales for a school-adapted version of the ICS were identified: Existing Supports to Deliver EBPs, Use of Data to Support EBPs, and Integration of EBPs. Existing Supports to Deliver EBPs addresses the extent to which staff perceived that the school provides support such as professional development, coaching, and structured meetings to help them learn about and apply an EBP. This differed from the original educational support scale because, in schools, educational support is often structural and ongoing rather than the time-limited trainings, conferences, and materials referenced in the original scale (Glisson et al., 2008). Use of Data to Support EBPs captures shifts in schools to incorporate data-based decision-making processes and tools as a way of creating accountability and encouraging school staff to implement an EBP. Integration of EBP indicates the extent to which EBP is integrated with other school systems and processes, including improvement plans, performance evaluations, and other ongoing work.

Institutional Review Board approval was obtained from the University of Washington Human Subjects Division and, when applicable, from partnering school districts’ research and evaluation departments. Recruitment of specific schools involved working with central administrators and communicating with site-based administrators regarding the project's benefits and data collection procedures. School administrators or an appointed liaison from each partnering school then recruited 4–12 teachers to participate in data collection. Teachers’ contact information was obtained so that research staff could contact them and send them a link to the survey.

To facilitate data collection, a web-based survey was constructed using Qualtrics software (Qualtrics; Provo, UT). Data were collected during the 2017 fall academic semester. Each teacher was provided a 1-month window to complete the survey from the time they were sent the initial email. Across all partnering schools, an average of 88% of respondents who were sent emails completed the online surveys, resulting in a total sample of 441 teachers.

Measures

School Implementation Climate Scale. The initial SICS (Lyon et al., 2018) was based on the original ICS, which is a six-factor, 18-item measure developed to assess an organization's strategic implementation climate to support translating EBPs into routine practice (Ehrhart et al., 2014). SICS items were scored on a five-point scale (0 = not at all to 4 = very great extent). The original scale development work supported six factors (Focus on EBP, Educational Support for EBP, Recognition for EBP, Rewards for EBP, Selection for EBP, and Selection for Openness) with three items per factor (coefficient α values ranging from 0.81 to 0.91). Four of these factors were retained for the SICS (Focus on EBP, Educational Support for EBP, Recognition for EBP, Rewards for EBP). Furthermore, three new subscales were added (Use of Data to Support EBPs, Existing Supports to Deliver EBP, Integration of EBP), resulting in seven total dimensions. In the preliminary development studies, keeping the measure relatively brief was a primary goal in order to minimize administration time (Ehrhart et al., 2014). Based on feedback from the key educator stakeholder groups, four items each were developed to represent the Use of Data to Support EBPs and EBP Integration scales, and three items were developed to create the Existing Supports to Deliver EBP scale. This resulted in an initial 23-item SICS measure with seven subscales. Two versions of the SICS were created—one that references a specific EBP (e.g., PBIS or PATHS) and another that refers generally to EBPs. These two versions were used to test whether the measurement of school implementation climate for the purpose of guiding EBP implementation needs to consider the specific EBP(s) being implemented.
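To make the scoring structure concrete, the following is a minimal sketch of how subscale and composite scores could be computed from item-level responses on the 0–4 scale. The pandas DataFrame layout and column names are hypothetical, and the use of item means (with an overall composite as the mean of the seven subscale scores, reflecting the second-order structure) is an illustrative assumption rather than the authors' published scoring procedure.

```python
# A minimal SICS scoring sketch. Column names below are hypothetical; the
# actual item wording appears in Table 2.
import pandas as pd

# Hypothetical mapping of the seven subscales to item columns (three each).
SICS_SUBSCALES = {
    "focus": ["focus_1", "focus_2", "focus_3"],
    "educational_support": ["edsup_1", "edsup_2", "edsup_3"],
    "recognition": ["recog_1", "recog_2", "recog_3"],
    "rewards": ["reward_1", "reward_2", "reward_3"],
    "use_of_data": ["data_1", "data_2", "data_3"],
    "existing_supports": ["exsup_1", "exsup_2", "exsup_3"],
    "integration": ["integ_1", "integ_2", "integ_3"],
}

def score_sics(responses: pd.DataFrame) -> pd.DataFrame:
    """Return per-respondent subscale scores and an overall composite."""
    scores = pd.DataFrame(index=responses.index)
    for subscale, items in SICS_SUBSCALES.items():
        scores[subscale] = responses[items].mean(axis=1)  # item mean, 0-4
    # The second-order factor structure motivates a single overall score.
    scores["sics_total"] = scores[list(SICS_SUBSCALES)].mean(axis=1)
    return scores
```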

Organizational Health Inventory for Elementary Schools (OHI-E). The OHI-E (Hoy & Tarter, 1997) was administered as a molar climate measure that captures school staff perceptions of the health and climate of a school. Two scales from the OHI-E were used in this study: Teacher Affiliation (e.g., Teachers identify with this school) and Academic Emphasis (e.g., Teachers receive adequate supplies for their classrooms). Other OHI-E scales were not included because they contained items measuring aspects of principal leadership that overlapped with SICS items without corresponding to any one specific subscale. The OHI-E has demonstrated acceptable internal consistency and stability in prior research (e.g., Bevans et al., 2007). In this study, coefficient alphas for the two scales also were acceptable (αs = 0.84 and 0.90, respectively).

Public School Teacher Questionnaire. We included the Public School Teacher Questionnaire (PSTQ) to measure teachers’ attitudes towards teaching and to assess convergent validity with the SICS. The scale includes nine items that assess different attitudes towards the teaching profession (e.g., The teaching profession is something that I enjoy and feel competent doing). Items are rated on a four-point scale ranging from 1 = strongly disagree to 4 = strongly agree. The scale has demonstrated acceptable-to-good psychometric properties (Rimm-Kaufman & Sawyer, 2004). The PSTQ demonstrated acceptable internal consistency using data in this study (α = 0.81).

Data Analytic Approach

The data analytic procedure for this study involved first assessing the construct validity of the new SICS subscales using item response theory. We reviewed item information coverage of the underlying trait for each SICS scale with the goal of limiting each scale to the three most informative items (De Ayala, 2013), for consistency with the original ICS and to keep the overall measure as short and practical as possible. Then, a series of confirmatory factor analyses (CFAs) using weighted least squares means and variances (WLSMV) estimation with delta parameterization for the ordered-categorical scale items was conducted. The fit of each model was evaluated across several indices, including the χ2 statistic, the comparative fit index (CFI; Hu & Bentler, 1999), the Tucker-Lewis index (TLI; Tucker & Lewis, 1973), and the root mean square error of approximation (RMSEA; Bollen & Long, 1993; Rigdon, 1996). Model fit was considered good when CFI and TLI were greater than 0.95 and RMSEA was less than or equal to 0.05. Standardized factor loadings (λs) less than 0.55 were deemed poor and flagged for further examination (Tabachnick & Fidell, 2018). All analyses were conducted using Mplus v8.0 (Muthén & Muthén, 2019). Because the teacher data were nested within schools, we conducted the analyses using the TYPE = COMPLEX option in Mplus, which accounts for the influence of clustering on parameter standard errors (Muthén & Satorra, 1995).
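Estimation itself was carried out in Mplus; as a purely illustrative convenience, the helper below encodes the fit criteria just described (CFI and TLI > 0.95, RMSEA ≤ 0.05, standardized loadings ≥ 0.55) for screening reported model output. It is a sketch, not part of the authors' analysis pipeline.

```python
# A small screening helper applying the fit criteria stated above. It does
# not estimate anything; it only checks reported fit statistics against the
# CFI/TLI > 0.95, RMSEA <= 0.05, and lambda >= 0.55 benchmarks.
def screen_fit(cfi: float, tli: float, rmsea: float,
               loadings: list[float]) -> dict:
    return {
        "cfi_ok": cfi > 0.95,
        "tli_ok": tli > 0.95,
        "rmsea_ok": rmsea <= 0.05,
        "weak_loadings": [lam for lam in loadings if lam < 0.55],
    }

# Example using the correlated first-order model reported in the Results
# (the loadings shown are the reported minimum and maximum):
print(screen_fit(cfi=0.966, tli=0.957, rmsea=0.073, loadings=[0.721, 0.965]))
# CFI and TLI pass their cutoffs, RMSEA exceeds 0.05, and no loadings fall
# below 0.55.
```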

Three CFA models were fit to the data: (a) a first-order factor model with correlations among the seven first-order factors constrained to zero, (b) a first-order factor model that allowed the correlations among the factors to be freely estimated, and (c) a hierarchical second-order factor model that assumed a general SICS factor in place of the first-order interfactor correlations. We compared the first two models using a chi-square difference test appropriate for WLSMV estimation of the categorical items (Brown, 2006), and compared the second two models using Marsh's (1991) Target Coefficient 2 (TC2) fit statistic, which assesses the “fit of only the higher-order portion” of the model (Marsh, 1991, p. 290). TC2 values greater than 0.90 were considered indicative of a viable hierarchical factor structure with the second-order SICS factor. In addition to the TC2, our theory, measurement development process, and projected function of the SICS as a brief, holistic measure of a school's overall implementation climate meant that we would prioritize the model with a second-order factor composite should a comparison of the two models be indeterminate.
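For readers unfamiliar with nested-model comparison, the sketch below shows the generic chi-square difference computation. Note that under WLSMV the raw difference of model chi-squares is not itself chi-square distributed, which is why an adjusted test (Brown, 2006; implemented in Mplus as the DIFFTEST procedure) was used here; the adjusted Δχ2(21) = 2,620.07 reported in the Results accordingly differs from a naive subtraction of the two model chi-squares.

```python
# A naive nested-model chi-square difference test, for illustration only.
# With WLSMV estimation this simple difference is NOT chi-square
# distributed; the adjusted DIFFTEST procedure must be used in practice.
from scipy.stats import chi2

def chisq_diff_test(chisq_restricted: float, df_restricted: int,
                    chisq_free: float, df_free: int):
    """Return (delta chi-square, delta df, p-value) for nested ML models."""
    delta = chisq_restricted - chisq_free
    ddf = df_restricted - df_free
    p_value = chi2.sf(delta, ddf)  # survival function = 1 - CDF
    return delta, ddf, p_value
```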

Next, a multigroup structural equation model was fit to determine whether the best underlying factor structure of the SICS was invariant across versions of the scale. We employed the same fit indices as in the CFAs to determine the best model fit.

Convergent validity was assessed via correlations between the SICS scales and the OHI-E and PSTQ measures; these correlations were expected to be small to moderate. Divergent validity was assessed via correlations between the SICS scales and school-level demographic variables, which were expected to be negligible.
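As a sketch of how these validity correlations could be assembled, the function below correlates SICS subscale scores (e.g., from the scoring sketch above) with a set of external measures. The DataFrame layout and column contents are assumptions for illustration, not the authors' analysis code.

```python
# Correlate each SICS subscale with each external measure (OHI-E scales,
# PSTQ total, or school demographics). Rows of the returned DataFrame are
# SICS scales; columns are the external measures.
import pandas as pd

def validity_correlations(sics_scores: pd.DataFrame,
                          external: pd.DataFrame) -> pd.DataFrame:
    combined = pd.concat([sics_scores, external], axis=1)
    corr = combined.corr()  # Pearson correlations by default
    return corr.loc[sics_scores.columns, external.columns]
```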

Results

Table 2 presents summary statistics for the SICS items and scales (i.e., item means and standard deviations, and coefficient alpha for each scale). Across all scales, coefficient alphas were high (αs = 0.81–0.90); a computational sketch of coefficient alpha follows Table 2.

Table 2.

SICS dimension items and summary statistics.

SICS subscale M (SD) Coefficient α
Focus on EBP 0.87
One of this school's main goals is to use EBP effectively. 3.02 (0.93)
People in this school believe that the implementation of EBP is important. 2.72 (0.92)
Using EBP is a top school priority. 2.73 (0.99)
Educational Support for EBP 0.82
This school supports attendance at conferences, workshops, or seminars focusing on EBP. 2.56 (1.12)
This school provides access to EBP trainings or in-services. 2.44 (1.08)
This school provides access to EBP materials (e.g., lesson plans, literature, etc.). 2.36 (1.12)
Recognition for EBP 0.87
Teachers/school staff who use EBP are seen as experts. 2.25 (1.08)
Teachers/school staff who use EBP are held in high esteem in this school. 2.25 (1.13)
Teachers/school staff who use EBP are more likely to be recommended for career development opportunities (e.g., recognized as an exemplar, promoted to another position, etc.). 1.76 (1.25)
Rewards for EBP 0.83
This school provides small perks or incentives (e.g., coffee cards) to teachers/school staff who use EBP. 0.63 (1.07)
The teachers/staff who are better at using EBP are more likely to get additional resources to support their work. 1.02 (1.17)
This school provides opportunities to accumulate extra release time or reductions in other duties for the use of EBP. 0.87 (1.16)
Use of Data to Support EBPs 0.88
In this school, teachers/staff review data on barriers to implementation to problem solve and develop action plans. 2.22 (1.10)
This school collects data about how well EBP are being implemented (e.g., fidelity assessments). 2.02 (1.16)
This school provides data-driven feedback to staff about their delivery of EBP. 1.83 (1.20)
Existing Supports to Deliver EBPs 0.90
This school uses professional development time to support staff to use EBP over time. 2.28 (1.12)
This school provides follow-up support after professional development to help teachers/school staff deliver EBP with fidelity. 1.86 (1.14)
This school devotes structured meetings (e.g., professional learning communities, grade-level meetings) to problem-solve delivering EBP with fidelity. 2.12 (1.21)
EBP Integration 0.81
This school's continuous improvement efforts integrate the use of EBP. 2.54 (1.10)
This school connects implementation of EBP to teachers’/school staff's performance evaluations. 1.62 (1.29)
This school integrates the implementation of EBP with other ongoing work. 2.44 (1.07)

Note. EBP = evidence-based practice; SICS = School Implementation Climate Scale.
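The coefficient alphas in Table 2 are standard Cronbach's alphas. For reference, a minimal computation is sketched below; it assumes complete (non-missing) item-level responses for a single subscale.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance
# of total scores), where k is the number of items in the subscale.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Rows are respondents, columns are one subscale's items."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```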

Confirmatory Factor Analyses

Examination of item information curves from the first-order correlated factor and second-order SICS CFA models indicated that one item in each of the two four-item scales (Use of Data to Support EBPs and EBP Integration) was redundant and was deleted for not contributing sufficient unique information to its scale, resulting in 21 items (seven dimensions with three items each). The comparison between the uncorrelated first-order factor model, χ2(189, N = 441) = 7,678.83, p < .001, CFI = 0.356, TLI = 0.284, RMSEA = 0.300 (90% CI = 0.294, 0.306), and the correlated first-order factor model, χ2(168, N = 441) = 563.66, p < .001, CFI = 0.966, TLI = 0.957, RMSEA = 0.073 (90% CI = 0.066, 0.080), indicated a significantly better fit for the correlated first-order factor model, Δχ2(21, N = 441) = 2,620.07, p < .001. As shown in Figure 1, standardized factor loadings for the correlated first-order factor model were large (0.721 ≤ λs ≤ 0.965). Interscale correlations are shown in the top portion of Table 3. Except for a low correlation between Rewards for EBP and Focus on EBP (r = .424), interfactor correlations between Rewards for EBP and the other factors (i.e., Educational Support for EBP, Use of Data to Support EBPs, Existing Supports to Deliver EBPs, and EBP Integration) were generally moderate (.500 ≤ rs < .700). Most interfactor correlations were in the large (.700 ≤ rs < .900) or very large (.900 ≤ rs ≤ 1.00) range (Mukaka, 2012).

Figure 1. Factor loadings and fit statistics of the final first-order factor-only measurement model.

Table 3.

Correlations among SICS scales and with OHI-E and PSTQ measures.

SICS
SICS A B C D E F G
A. Focus on EBP
B. Educational Support for EBP .810**
C. Recognition for EBP .742** .784**
D. Rewards for EBP .424** .554** .705**
E. Use of Data to Support EBPs .670** .713** .663** .605**
F. Existing Supports to Deliver EBPs .683** .888** .721** .639** .854**
G. EBP Integration .802** .835** .764** .639** .796** .929**
OHI-E
Academic Emphasis .399** .412** .393** .225** .332** .364** .361**
Teacher Affiliation .503** .392** .369** .143** .323** .353** .429**
PSTQ  
Total Score .315** .381** .357** .196** .319** .351** .350**
School Demographics
School Size .179** .093 .081 .033 −.019 .060 .107*
% White .084 .079 .018 −.056 .029 .169** .074
% Non-White −.184** −.178** −.109* .024 −.131** −.197** −.130**
% Transitional Bilingual .058 .011 .004 −.012 .060 −.042 −.010
% Special Education −.108* −.060 .000 −.010 −.019 −.035 −.041
% Attendance Rates .052 .104* .161** .157** .063 .031 .099

Note. EBP = evidence-based practices; SICS = School Implementation Climate Scale; OHI-E = Organizational Health Inventory for Elementary Schools; PSTQ = Public School Teacher Questionnaire.

*Correlation is significant at the p < .05 level (2-tailed).

**Correlation is significant at the p < .01 level (2-tailed).

The hierarchical second-order SICS model is shown in Figure 2. This model also indicated acceptable fit to the data, χ2(182, N = 441) = 716.73, p < .001, CFI = 0.954, TLI = 0.947, and RMSEA = 0.082 (90% CI = 0.075, 0.088). Only small differences in standardized factor loadings from the correlated first-order factor model were observed (i.e., Δ < 0.088); thus, standardized factor loadings for the first-order factors in the hierarchical second-order factor model also were large (0.658 ≤ λs ≤ 0.997). Standardized factor loadings on the second-order SICS factor were large as well (0.680 ≤ λs ≤ 0.952). A comparison of the correlated first-order factor model with the hierarchical second-order factor model resulted in a TC2 statistic of 0.973, suggesting a preference for the hierarchical second-order factor model.2

Figure 2. Factor loadings and fit statistics of the final second-order composite factor measurement model.

Multigroup SEM. Two separate versions of the SICS were tested against each other—one in which the items included general EBP as the referent, and the other included the name of a specific EBP (i.e., PBIS or PATHS) as the referent. The hierarchical second-order model was fit to both response groups, first allowing free estimation of all parameters. This resulted in χ2(364, N = 224) = 1,186.456, p < .001, CFI = 0.963, TLI = 0.957, and RMSEA = 0.101 (90% CI [0.095, 0.108]). A second model with all loadings constrained to be equal across both groups fit the data significantly worse, χ2(470, N = 118) = 1,014.442, p < .001, Δχ2(106) = 187.196, p < .001. However, the CFI and TLI fit indices were larger at 0.975 and 0.978, respectively. Considering that the chi-square test is notably influenced by sample size (Maruyama, 1997), these results suggested that the measurement model was invariant across the general- and specific-referent versions of the SICS.

Convergent and Divergent Validity

Table 3 depicts correlations between the SICS scales and the OHI-E, PSTQ, and school demographic measures as indicators of convergent and divergent validity. Correlations between SICS scales and the OHI-E and PSTQ measures in Table 3 were generally small to moderate (.300 ≤ rs < .500) and statistically significant (ps < .01), consistent with a priori hypotheses of convergent validity. Only the correlation between Teacher Affiliation and Rewards for EBP (r = .143) and that between the PSTQ total score and Rewards for EBP (r = .196) fell below this range (i.e., rs < .300). Correlations between SICS scales and school demographics were all small in magnitude, consistent with a priori hypotheses and supporting the divergent validity of the SICS scales.

Discussion

Organizational climate reflects staff perceptions of their work environment based on their shared experiences, which influences how they feel and function within a given setting (Schneider, 1990). When applied to EBP implementation, strategic implementation climate refers to staff perceptions that the use of an EBP is expected, supported, prioritized, and recognized/rewarded within their school. Assessing the influence of implementation strategies on implementation climate represents a promising practice for translating EBPs into routine practice. However, the ability to do so hinges upon the existence of psychometrically sound, contextually appropriate, and relevant measures of implementation climate.

Structural, Convergent, and Divergent Validity of the SICS

The aim of this study was to confirm the underlying factor structure of the school-adapted version of the ICS with the three additional scales deemed relevant for school-based EBP implementation by research and practice experts: Existing Supports to Deliver EBPs, Use of Data to Support EBPs, and Integration of EBP. Confirmatory factor analyses of the seven SICS scales with a second-order global implementation climate factor provided a good-fitting model for the data. Internal consistency estimates for all seven three-item scales were acceptable. Further, convergent and divergent relationships with other measures were consistent with prior research and our hypotheses, providing evidence for the construct validity of the SICS.

Rewarding EBP Implementation in Schools

Despite attempts to adapt the Rewards construct to better align with the realities of the school context, interfactor correlations involving this construct were the lowest observed (rs = .42–.71), and the construct demonstrated a notably smaller factor loading than the other constructs. While monetary rewards may be more relevant in other settings (Kondo et al., 2016), fiscal incentives for EBP implementation are rare in schools (Locke et al., 2019), which prompted adaptation of the original items toward rewards such as earning small perks based on implementation, reductions in other duties, and receiving extra resources contingent upon implementation. The smaller loading of Rewards on the second-order SICS factor (i.e., λ = 0.69) could be due to the low prevalence of rewards being used in schools, creating limited variability between respondents on these items. This is consistent with prior research on the ICS demonstrating weaker interfactor correlations and smaller internal consistency estimates for Rewards than for other scales (Ehrhart et al., 2014; Ehrhart et al., 2016). This dimension appears less relevant in the education sector because teacher behavior is more likely to improve as a function of desired outcomes for students (e.g., quality academic instruction) rather than external reward structures (Loyalka et al., 2019).

SICS as a Pragmatic Measure

From a real-world implementation standpoint, Glasgow and Riley (2013) argued that implementation-oriented measures must be pragmatic; that is, measures should be actionable, sensitive to change, and important, yet not burdensome, to stakeholders. Findings from this study and previous studies of the ICS suggest that SICS meets or approaches each of the criteria of a pragmatic measure (Lewis et al., 2018). First, per focus group participants, the SICS includes important constructs for school-based implementation (Locke et al., 2019). Furthermore, the multigroup SEM analysis demonstrated that the SICS measurement properties are invariant across general and specific EBP references. This gives the SICS flexibility to meet the needs of each school's unique implementation context. Second, with lack of time being cited consistently as one of the biggest barriers to implementation (McGoey et al., 2014), the SICS includes only three items per subscale, reducing the time burden. Third, research from industrial/organizational psychology indicates that the constructs assessed in the SICS represent actionable targets for implementation strategies aimed at facilitating EBP uptake and use (e.g., Ehrhart et al., 2013). Last, prior intervention research demonstrated that the SICS's predecessor—the ICS—was sensitive to detecting changes in implementation climate, suggesting that the SICS may be sensitive to changes in school-based implementation climate (Aarons, 2017). This is an area that warrants additional research. Collectively, the above suggests that the SICS may be a low-burden, actionable, change-sensitive, and pragmatic measure. However, researchers should continue to explore the actual use of the SICS in real-world implementation conditions to gather evidence that demonstrates its pragmatic qualities.

Limitations and Future Directions

There are several limitations to consider when interpreting these findings. The first limitation is the sample, which was bound geographically to certain regions of the United States. Implementation climate is likely to vary across states and counties with different cultural and legislative emphases on EBP implementation, as well as different funding available to support EBP materials, training, and coaching. A follow-up study including a sample of schools from diverse geographical, political, and cultural communities is warranted to determine the stability and generalizability of the factor structure. Similarly, we were only able to access public sources for student demographic data, which were quite limited for our participant sites. We were unable to capture the number, or percentage, of students who identified with particular ethnic and racial communities, received special education services or similar wraparound supports, or experienced inequitable opportunity access based on their communities. These qualities of the student communities within a school may influence the dynamics between teacher and student(s) when implementing a universal program and may also influence teachers’ perceptions of climate. It is unclear exactly what those influences and outcomes might be, and a future study involving implementation climate would benefit from the direct collection of student data on a variety of population characteristics to determine how implementation differs in response to differences in student population.

Another limitation is that the change sensitivity of the SICS was not assessed. This psychometric quality was beyond the current scope and purpose, which was to establish the general factor structure of the SICS. The present evidence therefore supports using the SICS to assess implementation climate at a single point in time, ideally while the school is preparing large-scale implementation efforts. For schools undergoing change efforts to improve implementation climate, repeated SICS data could inform decision makers as to whether those change efforts are working. Future research should use the SICS as a progress monitoring instrument and link changes in implementation climate to changes in implementation and student outcomes.

The validity evidence in this study is confined to the measures used to establish evidence of convergent and divergent validity. Although the observed relationships were consistent with hypothesized associations within and between implementation constructs and external measures, this study did not include measures of actual program implementation (e.g., dosage) or student outcomes (e.g., student behavior), as these were beyond its scope. However, previous studies have demonstrated the relationship between the ICS—the measure upon which the SICS was developed—and implementation outcomes (e.g., Williams et al., 2018). Future research should aim to establish the magnitude of the relationships among school implementation climate, implementation outcomes, and student outcomes.

Finally, the focus groups for revising the initial ICS to fit the school context lacked certain stakeholders with potentially relevant insights, including community partners and related service providers. This was intentional, as these individuals are not directly involved in the implementation of universal EBPs; nevertheless, they may have valuable insight into the implementation climate.

Implications for School-Based Implementation Research and Practice

As schools face increasing demands from legislation (e.g., Every Student Succeeds Act [ESSA], 2015), professional organizations (e.g., NASP, 2010), and communities to use EBPs to address students’ social, emotional, behavioral, and academic needs, research is clear that schools need to prepare deliberately for implementation (Lyon & Bruns, 2019). The exploration and preparation phases of the EPIS framework are important phases of the implementation process, involving a comprehensive evaluation of the implementation context to establish organizational readiness for change. Organizational readiness creates the context for individuals within the system to exert greater effort and display more responsiveness to training and consultative support aimed at supporting effective implementation (Weiner, 2009). Without deliberate attention and action during this preparation phase, implementation efforts are likely to fail (Kotter, 2012). The SICS provides critical information that can be used during the preparation phase to create organizational readiness for change by identifying malleable organizational intervention targets. Empirical research on school organizational readiness for change is lacking, including understanding of the role of general and implementation-specific organizational climate, as well as the effects of strategies on improving aspects of the implementation climate prior to initiating active implementation efforts.

Moreover, the SICS has utility during subsequent phases of the implementation process: implementation and sustainment (Aarons et al., 2011). The SICS provides a straightforward way of assessing a school's implementation climate while implementation is happening to examine factors that may obstruct or enable successful, sustainable delivery of an EBP. Once these factors are identified, data can inform the tailoring of site-specific action plans to address aspects of the school implementation climate by increasing staff perceptions that EBPs are expected, rewarded, recognized, and supported in their setting (Powell et al., 2019). Thus, the SICS can be used as part of a continuous quality improvement process during the active implementation phase.

Conclusion

Factors associated with the immediate context in which implementation happens influence the successful translation of EBPs into routine practice. Implementation climate has relevance at various phases of the implementation process, ranging from preparation to sustainment. The goal of this research was to provide a psychometrically sound and pragmatic instrument that can be used to inform implementation research and practice in schools. Although more research is needed to identify strategies to effect change in implementation climate, the SICS has potential as a useful tool to monitor the impact of those efforts and to assess how changes in implementation climate exert a cascade of effects on both implementation and student outcomes.

Supplemental Material

sj-pdf-2-irp-10.1177_26334895221116065 - Supplemental material for Construct validity of the school-implementation climate scale

Supplemental material, sj-pdf-2-irp-10.1177_26334895221116065 for Construct validity of the school-implementation climate scale by Andrew J. Thayer, Clayton R. Cook, Chayna Davis and Eric C. Brown, Jill Locke, Mark G. Ehrhart, Gregory A. Aarons, Elissa Picozzi, Aaron R. Lyon in Implementation Research and Practice

Acknowledgments

The authors wish to thank the teachers, administrators, and school staff who participated in this study.

1.

Implementation leadership and citizenship behavior are both inner context factors critical to successful implementation. Implementation leadership is the degree to which formal and informal leaders are knowledgeable about EBP, communicate, establish plans, monitor and support execution, and encourage staff through the implementation process. Implementation citizenship behavior refers to staff behaviors that are “above and beyond” expected responsibilities for successful implementation.

2.

Comparisons of TC2 statistics from models with and without adjustment for the clustering of teachers in schools, and with WLSMV, ML, and MLR estimation, consistently indicated a preference for the second-order factor model.

Footnotes

The authors declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Dr. Aaron Lyon is an Associate Editor of Implementation Research and Practice, and thus Dr. Lyon was not involved in any aspect of the peer review process for this manuscript.

Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This project and publication were supported by the Institute of Education Sciences (Grants R305A160114 and R305A200023). Gregory Aarons was supported by the National Institute of Mental Health (Grant R03MH117493) and the National Institute on Drug Abuse (Grant R01DA049891).

Ethical Approval: Institutional Review Board approval was obtained from the University of Washington Human Subjects Division and partnering school districts’ research and evaluation departments, when applicable.

Supplemental Material: Supplemental material for this article is available online.

References

1. Aarons G. A. (2017). Preliminary cohort 1 findings for the leadership and organizational change for implementation (LOCI) strategy. Addiction Health Services Research Conference (AHSR).
2. Aarons G. A., Hurlburt M., Horwitz S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. 10.1007/s10488-010-0327-7
3. Every Student Succeeds Act of 2015, Pub. L. No. 114-95 (2015).
4. Bevans K., Bradshaw C., Miech R., Leaf P. (2007). Staff- and school-level predictors of school organizational health: A multilevel analysis. Journal of School Health, 77(6), 294–302. 10.1111/j.1746-1561.2007.00210.x
5. Bollen K. A., Long J. S. (Eds.). (1993). Testing structural equation models. Sage Publications.
6. Brown T. (2006). Confirmatory factor analysis for applied research. The Guilford Press.
7. De Ayala R. J. (2013). The theory and practice of item response theory. Guilford Publications.
8. Ehrhart M. G., Aarons G. A., Farahnak L. R. (2014). Assessing the organizational context for EBP implementation: The development and validity testing of the implementation climate scale (ICS). Implementation Science, 9(1), 1–11. 10.1186/s13012-014-0157-1
9. Ehrhart M. G., Schneider B., Macey W. H. (2013). Organizational climate and culture: An introduction to theory, research, and practice. Routledge.
10. Ehrhart M. G., Torres E. M., Wright L. A., Martinez S. Y., Aarons G. A. (2016). Validating the implementation climate scale (ICS) in child welfare organizations. Child Abuse & Neglect, 53, 17–26. 10.1016/j.chiabu.2015.10.017
11. Glasgow R. E., Riley W. T. (2013). Pragmatic measures: What they are and why we need them. American Journal of Preventive Medicine, 45(2), 237–243. 10.1016/j.amepre.2013.03.010
12. Glick W. H. (1985). Conceptualizing and measuring organizational and psychological climate: Pitfalls in multilevel research. Academy of Management Review, 10(3), 601–616. 10.5465/amr.1985.4279045
13. Glisson C., Landsverk J., Schoenwald S., Kelleher K., Hoagwood K. E., Mayberg S., Green P. (2008). Assessing the organizational social context (OSC) of mental health services: Implications for research and practice. Administration and Policy in Mental Health and Mental Health Services Research, 35(1), 98–113. 10.1007/s10488-007-0148-5
14. Greenberg M. T., Kusche C. A., Cook E. T., Quamma J. P. (1995). Promoting emotional competence in school-aged children: The effects of the PATHS curriculum. Development and Psychopathology, 7(1), 117–136. 10.1017/S0954579400006374
15. Hambleton R. (1996). Guidelines for adapting educational and psychological tests. National Center for Education Statistics. https://files.eric.ed.gov/fulltext/ED399291.pdf
16. Horner R. H., Sugai G., Anderson C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1–14. 10.17161/foec.v42i8.6906
17. Hoy W. K., Tarter C. J. (1997). The road to open and healthy schools: A handbook for change (middle and secondary school ed.). Corwin.
18. Hu L., Bentler P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. 10.1080/10705519909540118
19. James L. R., Jones A. P. (1974). Organizational climate: A review of theory and research. Psychological Bulletin, 81(12), 1096. 10.1037/h0037511
20. Kondo K. K., Damberg C. L., Mendelson A., Motu’apuaka M., Freeman M., O’Neil M., Relevo R., Low A., Kansagara D. (2016). Implementation processes and pay for performance in healthcare: A systematic review. Journal of General Internal Medicine, 31(1), 61–69. 10.1007/s11606-015-3567-0
21. Kotter J. P. (2012). Leading change. Harvard Business Review Press.
22. Lewis C. C., Mettert K. D., Dorsey C. N., Martinez R. G., Weiner B. J., Nolen E., Stanick C., Halko H., Powell B. J. (2018). An updated protocol for a systematic review of implementation-related measures. Systematic Reviews, 7(1), 66. 10.1186/s13643-018-0728-3
23. Locke J., Beidas R. S., Marcus S., Stahmer A., Aarons G. A., Lyon A. R., Cannuscio C., Barg F., Dorsey S., Mandell D. S. (2016). A mixed methods study of individual and organizational factors that affect implementation of interventions for children with autism in public schools. Implementation Science, 11(1), 135. 10.1186/s13012-016-0501-8
24. Locke J., Lee K., Cook C. R., Frederick L., Vázquez-Colón C., Ehrhart M. G., Aarons G. A., Davis C., Lyon A. R. (2019). Understanding the organizational implementation context of schools: A qualitative study of school district administrators, principals, and teachers. School Mental Health, 11(3), 379–399. 10.1007/s12310-018-9292-1
25. Loyalka P., Sylvia S., Liu C., Chu J., Shi Y. (2019). Pay by design: Teacher performance pay design and the distribution of student achievement. Journal of Labor Economics, 37(3), 621–662. 10.1086/702625
26. Lyon A. R., Bruns E. J. (2019). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11(1), 106–114. 10.1007/s12310-018-09306-w
27. Lyon A. R., Cook C. R., Brown E. C., Locke J., Davis C., Ehrhart M., Aarons G. A. (2018). Assessing organizational implementation context in the education sector: Confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science, 13(1), 1–14. 10.1186/s13012-017-0705-6
28. Marsh H. W. (1991). Multidimensional students’ evaluations of teaching effectiveness: A test of alternative higher-order structures. Journal of Educational Psychology, 83(2), 285–296. 10.1037/0022-0663.83.2.285
29. Maruyama G. (1997). Basics of structural equation modeling. Sage.
30. McGoey K. E., Rispoli K. M., Venesky L. G., Schaffner K. F., McGuirk L., Marshall S. (2014). A preliminary investigation into teacher perceptions of the barriers to behavior intervention implementation. Journal of Applied School Psychology, 30(4), 375–390. 10.1080/15377903.2014.950441
31. McIntosh K., Goodman S. (2016). Integrated multi-tiered systems of support: Blending RTI and PBIS. Guilford Publications.
32. Mitchell M. M., Bradshaw C. P., Leaf P. J. (2010). Student and teacher perceptions of school climate: A multilevel exploration of patterns of discrepancy. Journal of School Health, 80(6), 271–279. 10.1111/j.1746-1561.2010.00501
33. Moullin J. C., Dickson K. S., Stadnick N. A., Rabin B., Aarons G. A. (2019). Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implementation Science, 14(1), 1. 10.1186/s13012-018-0842-6
34. Mukaka M. M. (2012). A guide to appropriate use of correlation coefficient in medical research. Malawi Medical Journal, 24(3), 69–71.
35. Muthén B. O., Satorra A. (1995). Complex sample data in structural equation modeling. Sociological Methodology, 25, 267–316. 10.2307/271070
36. Muthén L. K., Muthén B. O. (2019). Mplus user's guide (8th ed.). Muthén & Muthén.
37. National Association of School Psychologists (NASP). (2010). Standards for school psychology: Ethical and professional practices for school psychologists. Bethesda, MD: Author.
38. Ostroff C. (1993). The effects of climate and personal influences on individual behavior and attitudes in organizations. Organizational Behavior and Human Decision Processes, 56(1), 56–90. 10.1006/obhd.1993.1045
39. Patterson M. G., West M. A., Shackleton V. J., Dawson J. F., Lawthom R., Maitlis S., Robinson D. L., Wallace A. M. (2005). Validating the organizational climate measure: Links to managerial practices, productivity, and innovation. Journal of Organizational Behavior, 26(4), 379–408. 10.1002/job.312
40. Powell B. J., Fernandez M. E., Williams N. J., Aarons G. A., Beidas R. S., Lewis C. C., McHugh S. M., Weiner B. J. (2019). Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7(3), 1–24. 10.3389/fpubh.2019.00003
41. Riehl C., Sipple J. W. (1996). Making the most of time and talent: Secondary school organizational climates, teaching task environments, and teacher commitment. American Educational Research Journal, 33(4), 873–901. 10.3102/00028312033004873
42. Rigdon E. E. (1996). CFI versus RMSEA: A comparison of two fit indexes for structural equation modeling. Structural Equation Modeling: A Multidisciplinary Journal, 3(4), 369–379. 10.1080/10705519609540052
43. Rimm-Kaufman S. E., Sawyer B. E. (2004). Primary-grade teachers’ self-efficacy beliefs, attitudes toward teaching, and discipline and teaching practice priorities in relation to the “Responsive Classroom” approach. The Elementary School Journal, 104(4), 321–341. 10.1086/499756
44. Schneider B. (1990). Organizational climate and culture (Vol. 4). Pfeiffer.
45. Schulte M., Ostroff C., Shmulyian S., Kinicki A. (2009). Organizational climate configurations: Relationships to collective attitudes, customer satisfaction, and financial performance. Journal of Applied Psychology, 94(3), 618–634. 10.1037/a0014365
46. Steele J., Murnane R. J., Willet J. B. (2009). Do financial incentives help low-performing schools attract and keep academically talented teachers? Evidence from California. https://www.nber.org/papers/w14780.pdf
47. Tabachnick B. G., Fidell L. S. (2018). Using multivariate statistics (7th ed.). Pearson.
48. Tucker L. R., Lewis C. (1973). A reliability coefficient for maximum likelihood factor analysis. Psychometrika, 38(1), 1–10. 10.1007/BF02291170
49. Weiner B. J. (2009). A theory of organizational readiness for change. Implementation Science, 4(1), 1–9. 10.1186/1748-5908-4-67
50. Williams N. J., Ehrhart M. G., Aarons G. A., Marcus S. C., Beidas R. S. (2018). Linking molar organizational climate and strategic implementation climate to clinicians’ use of evidence-based psychotherapy techniques: Cross-sectional and lagged analyses from a 2-year observational study. Implementation Science, 13(1), 1–13. 10.1186/s13012-018-0781-2
51. Williams N. J., Frank H. E., Frederick L., Beidas R. S., Mandell D. S., Aarons G. A., Green P., Locke J. (2019). Organizational culture and climate profiles: Relationships with fidelity to three evidence-based practices for autism in elementary schools. Implementation Science, 14(1), 15. 10.1186/s13012-019-0863-9
52. Xiaofu P., Qiwen Q. (2007). An analysis of the relation between secondary school organizational climate and teacher job satisfaction. Chinese Education & Society, 40(5), 65–77. 10.2753/CED1061-1932400507
53. You S., O’Malley M. D., Furlong M. J. (2014). Preliminary development of the brief–California school climate survey: Dimensionality and measurement invariance across teachers and administrators. School Effectiveness and School Improvement, 25(1), 153–173. 10.1080/09243453.2013.784199
