Abstract
Research conducted to date has highlighted barriers to the initial adoption of universal behavior screening in schools. However, little is known about the experiences of those implementing these procedures, and no studies have examined the experiences of educators at different stages of implementing tiered systems of support. Universal screening is foundational to a successful Comprehensive, Integrated, Three-Tiered (Ci3T) model of prevention—an integrated tiered system addressing academics, behavior, and social and emotional well-being. Therefore, we solicited the perspectives of Ci3T Leadership Team members at different stages of Ci3T implementation through an online survey designed to capture (1) current school-based screening practices and (2) individual beliefs regarding those practices. A total of 165 Ci3T Leadership Team members representing five school districts in three geographic regions of the United States, all of whom were participating in an Institute of Education Sciences network grant examining integrated tiered systems, reported the screening procedures were generally well understood and feasible to implement. At the same time, results highlighted that continued professional learning may be beneficial in two areas: (1) integrating multiple sources of data (e.g., screening data with other data collected as regular school practices) and (2) using those multiple data sources to determine next steps for intervention. We discuss educational implications, limitations, and directions for future inquiry.
Keywords: systematic screening, tiered systems, implementation stages, professional learning
Schools offer an advantageous setting for identifying and supporting students’ social-emotional and behavioral concerns, given that most children and adolescents in the United States attend school. Drawing on innovations in public health, schools have adopted a variety of tiered systems of support, often focused on preventing academic (e.g., Response to Intervention [RTI]; Fuchs et al., 2012) or behavioral concerns (e.g., Positive Behavioral Interventions and Supports [PBIS]; Sugai & Horner, 2009) through systems-level, coordinated delivery of evidence-based practices. Fortunately, school-based, three-tiered prevention models are becoming increasingly interdisciplinary, drawing on recommendations from implementation and prevention science alike (Lyon, 2016; Simeonsson, 1994). Examples of integrated tiered systems include the Interconnected Systems Framework (ISF; Barrett et al., 2013; integrating PBIS with school-based mental health supports), Multi-Tiered System of Supports (MTSS; McIntosh & Goodman, 2016; integrating one or more academic or behavior domains), and the Comprehensive, Integrated, Three-tiered (Ci3T) model of prevention (Lane & Menzies, 2003; Lane, Menzies et al., 2020a; integrating academic, behavioral, and social domains). Across tiered systems, primary (Tier 1) prevention efforts are expected to meet most students’ educational needs. Secondary (Tier 2) interventions are additive, addressing the needs of the 10%–15% of students demonstrating some level of risk (e.g., academic failure, challenging behavior). Tertiary (Tier 3) interventions are intended for the 3%–5% of students with intensive intervention needs. Ci3T, in particular, is an integrated tiered system emphasizing (1) meeting students’ academic, behavioral, and social-emotional well-being needs across three tiers of support; (2) assessment of treatment integrity and social validity at the school and classroom levels to aid in the interpretation of student outcomes; and (3) application of data-informed professional learning to support stakeholders’ ability to implement core components of the Ci3T model (Lane et al., 2018a).
Although a successful Ci3T model of prevention is predicated on several critical components as described above (integrating academic, behavior, and social domains across Tiers 1, 2, and 3; see also Ci3T.org), at its foundation is universal screening (see Lane et al., 2021, for additional information on unique features of Ci3T). Universal screening is a process by which schools conduct brief assessments of all students in the population, typically in the fall, winter, and spring. Although most frequently viewed as a way to detect students who may require additional academic and/or behavioral supports, universal screening also serves the important purposes of examining the impact of Tier 1 efforts over time when implemented as designed (with treatment integrity), informing the use of teacher-delivered, low-intensity strategies (see Lane et al., 2015), and determining professional learning needs. That is, when an evidence-based core curriculum is implemented with fidelity, it is expected that fewer than 20% of students will require supplemental or intensive intervention supports to be successful. Thus, when universal screening data indicate a larger proportion of the student population demonstrates risk, Tier 1 efforts likely need to be strengthened. Likewise, if screening data for certain classrooms or grade levels show a larger proportion of students demonstrating risk, teachers can incorporate validated, low-intensity strategies, such as higher rates of behavior-specific praise (Royer et al., 2019), instructional choice (Royer et al., 2017), precorrection (Ennis et al., 2017), and active supervision (Allen et al., 2020), into daily instructional activities to maximize engagement and limit disruptions. In this way, systematic screening data can be used with other sources of data to inform instruction in several ways: shifting Tier 1 practices; using teacher-delivered, low-intensity strategies; and connecting students to validated Tier 2 and Tier 3 interventions according to individual students’ needs (Lane et al., 2021). To accomplish these tasks, data structures and procedures for using data are key.
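To illustrate how such a decision rule might be operationalized, consider the following sketch. It is not drawn from the study's procedures; the column names (`classroom`, `risk_level`), the risk-category coding, and the 20% value (taken from the guideline described above) are assumptions for illustration.

```python
import pandas as pd

def flag_tier1_concerns(screening: pd.DataFrame, threshold: float = 0.20) -> pd.Series:
    """Flag schoolwide and classroom-level risk proportions exceeding a guideline."""
    # Hypothetical coding: each student is rated "low", "moderate", or "high" risk
    at_risk = screening["risk_level"].isin(["moderate", "high"])

    schoolwide = at_risk.mean()
    if schoolwide > threshold:
        # More than ~20% of students at risk suggests strengthening Tier 1 efforts
        print(f"Schoolwide risk {schoolwide:.0%} exceeds {threshold:.0%}: review Tier 1.")

    # Classrooms above the threshold are candidates for teacher-delivered,
    # low-intensity strategies (e.g., behavior-specific praise, precorrection)
    by_class = at_risk.groupby(screening["classroom"]).mean()
    return by_class[by_class > threshold]
```

In practice, a team would interpret such proportions alongside treatment integrity and other schoolwide data rather than acting on the screener alone.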
In practice, Ci3T leadership teams facilitate data collection and integrated data-based decision making at the building level. This includes ensuring screening procedures occur as planned and all students are screened (Common et al., 2021; Lane et al., 2018b). Although use of a standardized screening tool is at the core of universal screening (e.g., Student Risk Screening Scale for Internalizing and Externalizing; SRSS-IE; Drummond, 1994; Lane & Menzies, 2009), faculty and staff also draw on multiple sources of data (e.g., attendance, nurse visits, grades) to obtain a complete picture of student performance. They have the opportunity to review data independently to shape instructional practices (e.g., using the Secondary [Tier 2] Intervention Grid) as well as to engage in data-informed decision making with other professionals (e.g., in professional learning communities). To date, there is evidence to suggest behavior screening measures are socially valid (Lane et al., 2014; Oakes et al., 2016). For example, Lane and colleagues (Lane et al., 2014; Oakes et al., 2016) found behavior screeners were rated favorably in terms of feasibility and utility, particularly with respect to time investment, ease of use, and cost (for free-access screeners). Although such social validity data are promising, more research is needed to understand stakeholders’ experiences of the behavior screening process as a whole. That is, screening is a process that moves from collection to summarization and analysis of data before those data are used to inform decision making and next steps for intervention at any tier. As such, it is important to understand the experiences of stakeholders navigating this process, including both perceived benefits and challenges, particularly as Ci3T leadership teams advance through implementation stages, knowing implementation varies over time (Fixsen et al., 2005).
Briesch et al. (2021) conducted the first qualitative study examining systematic screening for behavior with Ci3T leadership team members (21 individual interviews; 17 individuals in focus groups) from three geographic regions to learn about their views on the benefits, challenges, and opportunities of behavior screening within Ci3T systems. Results of a thematic analysis suggested several perceived strengths: detecting students at the first sign of concern, having data to examine the overall functioning of their integrated Ci3T model, and normalizing and destigmatizing the process of early detection. Participants also identified barriers with screening efforts, including concerns about buy-in, reliability of data (referred to as consistency), and how to use the data to inform instruction.
Although useful in identifying what additional professional learning may be needed to empower educators with current information on the reliability, validity, and functional utility of behavior screening data, this study was limited in two ways. First, the sample of interview and focus group participants was intentionally small given the nature of the study. Although this allowed for the collection of rich qualitative data, takeaways were derived from a fairly small and targeted sample of participants. As such, the extent to which results are representative of the views of the typical Ci3T leadership team member is unknown. Second, the sample size made it impossible to explore perceptions of Ci3T leadership team members at different stages of implementation. The perceptions and needs of educators may well differ between initial and more experienced implementers: findings from implementation science indicate adoption of new practices is a dynamic and ongoing process, with more than 1 year of implementation often needed for individual skills and organizational capacity to develop before a practice becomes fully operational (Fixsen et al., 2005).
Critical to the successful implementation of a Ci3T model in general, and universal screening in particular, is the availability of high-quality professional learning to ensure school staff have not only the requisite knowledge and skills but also the confidence to employ them. Thus, although universal behavior screening is promising (Oakes et al., 2014), there is a need to build on lessons learned from qualitative inquiry to understand whether additional supports may be needed to ensure effective and sustained implementation of behavior screening practices. This will likely include having the knowledge and resources to access and implement universal behavior screeners meaningfully (Lane, Powers et al., 2020b).
Purpose
We conducted this survey study to answer four key questions intended to advance the knowledge base and to identify professional learning needs related to implementing behavior screening practices in integrated tiered systems such as Ci3T, with the goal of identifying priority directions for enhancing resources and professional learning about behavior screening. In particular, we sought to identify whether knowledge and professional learning needs shifted as schools gained experience in using behavior screening practices. Research on the adoption of new practices suggests patterns of implementation are likely to change over time, with 2–4 years often necessary for organizations to reach full implementation (e.g., Bertram et al., 2011; Fixsen et al., 2005).
To accomplish this, we followed the same data analytic plan used by Common et al. (2021) to examine Ci3T professional learning avenues and priorities for schools at different stages of Ci3T implementation. Research questions were: (1) What is participants’ general level of understanding of behavior screening practices? Are there differences in understanding across participants in schools at different stages (i.e., 1, 2–3, or 4–6 years of implementation experience) of Ci3T implementation? (2) How do individuals view the usability of behavior screening practices, in terms of understanding, willingness to change, feasibility, family-school collaboration, and external support? Are there differences in these constructs across participants in schools at different stages of Ci3T implementation? (3) What are the perceived strengths and barriers of universal behavior screeners? Are there differences in perceived strengths and barriers between respondents in schools at different stages of Ci3T implementation? (4) What areas do respondents prioritize for professional learning, and what are their preferred avenues for accessing this training? Are there differences in professional learning priorities and avenues between respondents in schools at different stages of Ci3T implementation?
Method
Participants
Participants included 165 Ci3T leadership team members from 27 elementary schools across five districts in three geographic regions (Northwest, k = 1; Midwest, k = 3; Northeast, k = 1), all of whom were participating in an IES-funded network grant (Project ENHANCE: R324N190002) exploring integrated tiered systems of support. Eight schools were in their first year of Ci3T implementation (n = 36 participants), six were in their 2nd or 3rd year (n = 35), and 13 were in their 4th through 6th year (n = 94). Ci3T leadership team members included general education teachers, special education teachers, related service providers, staff, building administrators, and district administrators (see Table 1 for participant characteristics). Most participants were female (88.41%; n = 122), White (97.73%; n = 129), and had earned a minimum of a master’s degree (76.69%; n = 102). All Ci3T leadership teams were leading their schools’ Ci3T implementation efforts, with support from research team members serving as external Ci3T coaches.
Table 1.
Participant Demographics by Implementation Stage
| Variable/level | Implementation stage: 1 year (n = 36) | 2–3 years (n = 35) | 4–6 years (n = 94) | Total (N = 165) |
|---|---|---|---|---|
| Sex % (n) | ||||
| Male | 10.71 (3) | 16.67 (5) | 10.00 (8) | 11.59 (16) |
| Female | 89.29 (25) | 83.33 (25) | 90.00 (72) | 88.41 (122) |
| Age M (SD) | 43.27 (8.70) | 39.55 (9.98) | 43.95 (10.16) | 42.86 (9.94) |
| Ethnicity and Race % (n) | ||||
| Hispanic | 14.81 (4) | 0.00 (0) | 7.50 (6) | 7.30 (10) |
| American Indian / Alaska Native | 0.00 (0) | 0.00 (0) | 0.00 (0) | 0.00 (0) |
| Asian or Asian / Pacific Islander | 0.00 (0) | 0.00 (0) | 0.00 (0) | 0.00 (0) |
| Black | 4.17 (1) | 0.00 (0) | 0.00 (0) | 0.76 (1) |
| White | 91.67 (22) | 100.00 (29) | 98.73 (78) | 97.73 (129) |
| Other | 4.17 (1) | 0.00 (0) | 1.27 (1) | 1.52 (2) |
| Decline | 0.00 (0) | 0.00 (0) | 1.27 (1) | 0.76 (1) |
| Geographic Region | ||||
| Midwest | 38.89 (14) | 100.00 (35) | 64.89 (61) | 66.67 (110) |
| Northeast | 0.00 (0) | 0.00 (0) | 35.11 (33) | 20.00 (33) |
| Northwest | 61.11 (22) | 0.00 (0) | 0.00 (0) | 13.33 (22) |
| Highest degree obtained % (n) | ||||
| Bachelor’s degree | 15.38 (4) | 30.00 (9) | 23.38 (18) | 23.31 (31) |
| Master’s degree | 38.46 (10) | 50.00 (15) | 40.26 (31) | 42.11 (56) |
| Master’s degree + 30 | 42.31 (11) | 13.33 (4) | 24.68 (19) | 25.56 (34) |
| Doctoral, Education specialist, J.D. degree | 3.85 (1) | 6.67 (2) | 11.69 (9) | 9.02 (12) |
| Primary role (non-mutually exclusive) % (n) | ||||
| Teacher | 25.00 (7) | 51.72 (15) | 34.18 (27) | 36.03 (49) |
| Special education teacher | 17.86 (5) | 6.90 (2) | 12.66 (10) | 12.50 (17) |
| Related service provider | 10.71 (3) | 10.34 (3) | 20.25 (16) | 16.18 (22) |
| Staff (non-instructional) | 21.43 (6) | 10.34 (3) | 12.66 (10) | 13.97 (19) |
| Building administrator | 25.00 (7) | 20.69 (6) | 17.72 (14) | 19.85 (27) |
| District administrator | 0.00 (0) | 0.00 (0) | 2.53 (2) | 1.47 (2) |
| Do you provide instruction to students (e.g., whole class, small group, 1:1)? | 57.14 (16) | 70.00 (21) | 69.62 (55) | 67.15 (92) |
| Grade-level taught (non-mutually exclusive) n | ||||
| Early childhood | 1 | 4 | 2 | 7 |
| Pre-kindergarten | 4 | 10 | 5 | 19 |
| Kindergarten | 17 | 14 | 38 | 69 |
| 1 | 18 | 15 | 39 | 72 |
| 2 | 16 | 13 | 44 | 73 |
| 3 | 15 | 16 | 42 | 73 |
| 4 | 19 | 13 | 44 | 76 |
| 5 | 18 | 14 | 43 | 75 |
| 6 | 0 | 8 | 16 | 24 |
| Mixed grade class | 2 | 0 | 2 | 4 |
| Years of experience M (SD) | 15.29 (8.19) | 14.83 (10.33) | 17.94 (9.29) | 16.72 (9.36) |
| Professional learning | ||||
| Have you had a course in classroom management? % (n) | 88.46 (23) | 89.66 (26) | 87.18 (68) | 87.97 (117) |
| Have you had a professional development or other training in academic screenings? % (n) | 50.00 (13) | 66.67 (20) | 75.64 (59) | 68.66 (92) |
| Have you had a professional development or other training in behavior screenings? % (n) | 44.00 (11) | 70.00 (21) | 66.67 (52) | 63.16 (84) |
Years implementing refers to the number of years Ci3T implementation has been in place at the school level. Dash (–) = data not reported due to small n. Ci3T = Comprehensive, Integrated, Three-tiered model of prevention. Percentages are based on the number of respondents who completed a given item.
Procedures
After securing university and district approvals, the principal investigators extended an invitation to participate to all district- and building-level Ci3T leadership team members (n = 238) from the five partner districts. District leaders and principals had previously committed to participating in the overall 5-year project and had either designed or implemented their Ci3T model during the prior academic year (i.e., 2018–2019). Participating districts used the Student Risk Screening Scale for Internalizing and Externalizing (SRSS-IE; Drummond, 1994; Lane & Menzies, 2009) three times per year (fall, winter, and spring) and academic screening tools that varied by district. Recruitment emails sent via the Qualtrics online survey platform included an individualized link taking participants to an approved informational letter that explained the purpose and voluntary nature of the study. All participants received the survey invitation concurrent with their district’s first session in the Ci3T Implementation Professional Learning Series, the timing of which ranged from September 5 to November 4, 2019, with dates established relative to the onset of the school year. Up to three reminder emails were sent to those who had not yet completed the survey (Dillman et al., 2008).
Of the 238 Ci3T leadership team members invited to participate, 174 opened the survey link and 168 respondents completed at least one question (a 71% response rate). All district response rates exceeded 70% with the exception of one district in the Midwest (55%), and all were well within acceptable ranges to support generalizability of results (Babbie, 1990). We excluded responses from three participants because they did not complete more than one survey question, yielding the analytic sample of 165. All surveys were completed between September 2019 and January 2020; as such, data collection was not disrupted by the onset of the COVID-19 pandemic. Project staff downloaded all data from Qualtrics, entered data into project-specific databases, and completed a series of accuracy checks prior to analyzing the survey data using SAS. In addition to completing this survey of universal screening practices, some Ci3T leadership team members provided additional information on their understanding and perceptions of universal screening for behavior via semi-structured interviews and focus groups, with findings reported by Briesch et al. (2021; see the introduction for a summary).
Measures
The survey consisted of four sections: (1) understanding current behavioral screening practices (Table 2); (2) perceived usability of behavior screening practices (Table 3); (3) views of screening practices (Tables 4, 5 and 6); and (4) demographic information. Section 1 consisted of questions designed to better understand what the current behavior screening process looks like, from data collection to linking assessment with intervention. In addition to one open-ended question (i.e., How are universal behavior screening data used to connect students to Tier 2 and Tier 3 supports?), questions included forced-choice (e.g., How often did your school building conduct universal behavioral screenings during the 2018–19 school year?) and select-all-that-apply lists (e.g., In your school, which data sources are used at Tier 1 for universal behavior screening?). Questions within this section were created for this project by members of the research team with input from advisory board members and Ci3T district and school leaders.
Table 2.
Understanding Current Behavioral Screening Practices
| Variable/Level | Implementation stage: 1 year | 2–3 years | 4–6 years | Overall Sample |
|---|---|---|---|---|
| In your school, which data sources are used at Tier 1 for universal behavior screening? (more than one option possible) % (n) | ||||
| Discipline referrals | 69.44 (25) | 74.29 (26) | 72.34 (68) | 72.12 (119) |
| Screening tools | 66.67 (24) | 88.57 (31) | 86.17 (81) | 82.42 (136) |
| Classroom observation | 77.78 (28) | 71.43 (25) | 81.91 (77) | 78.79 (130) |
| Interview | 22.22 (8) | 14.29 (5) | 15.96 (15) | 16.97 (28) |
| Adult nomination | 25.00 (9) | 31.43 (11) | 44.68 (42) | 37.58 (62) |
| Peer nomination | 0.00 (0) | 5.71 (2) | 4.26 (4) | 3.64 (6) |
| Which of the following has your school adopted for behavioral screening for monitoring all students? % (n) | ||||
| BASC-3 BESS | 0.00 (0) | 0.00 (0) | 2.20 (2) | 1.25 (2) |
| DESSA | 0.00 (0) | 0.00 (0) | 0.00 (0) | 0.00 (0) |
| SAEBERS | 2.86 (1) | 2.94 (1) | 0.00 (0) | 1.25 (2) |
| SSIS-PSG | 0.00 (0) | 0.00 (0) | 1.10 (1) | 0.63 (1) |
| SSIS SEL | 0.00 (0) | 2.94 (1) | 1.10 (1) | 1.25 (2) |
| SDQ | 0.00 (0) | 0.00 (0) | 1.10 (1) | 0.63 (1) |
| SRSS | 25.71 (9) | 17.65 (6) | 24.18 (22) | 23.13 (37) |
| SRSS-IE | 68.57 (24) | 88.24 (30) | 89.01 (81) | 84.38 (135) |
| SSBD | 0.00 (0) | 0.00 (0) | 0.00 (0) | 0.00 (0) |
| None | 2.86 (1) | 0.00 (0) | 0.00 (0) | 0.63 (1) |
| How often did your school building conduct universal behavioral screenings during the 2018–19 school year? % (n) | ||||
| One time per year | 5.71 (2) | 0.00 (0) | 1.10 (1) | 1.88 (3) |
| Two times per year | 11.43 (4) | 17.65 (6) | 4.40 (4) | 8.75 (14) |
| Three times per year | 40.00 (14) | 82.35 (28) | 93.41 (85) | 79.38 (127) |
| Other | 42.86 (15) | 0.00 (0) | 1.10 (1) | 10.00 (16) |
| How often is your school building planning to conduct universal behavioral screenings during the 2019–20 school year? % (n) | ||||
| One time per year | 0.00 (0) | 0.00 (0) | 1.10 (1) | 0.63 (1) |
| Two times per year | 11.43 (4) | 14.71 (5) | 3.30 (3) | 7.50 (12) |
| Three times per year | 71.43 (25) | 85.29 (29) | 95.60 (87) | 88.13 (141) |
| Other | 17.14 (6) | 0.00 (0) | 0.00 (0) | 3.75 (6) |
| Who provides information used in universal behavioral screenings? % (n) | ||||
| Teacher | 97.14 (34) | 97.06 (33) | 96.70 (88) | 96.88 (155) |
| Student (e.g., self-report) | 5.71 (2) | 0.00 (0) | 2.20 (2) | 2.50 (4) |
| Student support personnel (e.g., school psychologist, social worker, counselor) | 20.00 (7) | 14.71 (5) | 8.79 (8) | 12.50 (20) |
| School staff (e.g., paraprofessional, classroom aides, lunchroom supervisors) | 8.75 (3) | 8.82 (3) | 7.69 (7) | 8.13 (13) |
| Parent/Guardian | 5.71 (2) | 0.00 (0) | 2.20 (2) | 2.50 (4) |
| Other | 2.86 (1) | 2.94 (1) | 0.00 (0) | 1.25 (2) |
| After universal behavior screenings are conducted, how are data reviewed? % (n) | ||||
| By individual school staff | 28.57 (10) | 52.94 (18) | 50.55 (46) | 46.25 (74) |
| By a group | 77.14 (27) | 82.35 (28) | 75.82 (69) | 77.50 (124) |
| Don't know/Prefer not to answer | 20.00 (7) | 5.88 (2) | 15.38 (14) | 14.38 (23) |
| You indicated that universal behavior screening data are reviewed by individual school staff. % (n) | ||||
| Teacher | 90.00 (9) | 72.22 (13) | 67.39 (31) | 71.62 (53) |
| Student support personnel | 40.00 (4) | 77.78 (14) | 80.43 (37) | 74.32 (55) |
| School administrator | 90.00 (9) | 94.44 (17) | 76.09 (35) | 82.43 (61) |
| Other | 20.00 (2) | 16.67 (3) | 6.52 (3) | 10.81 (8) |
| You indicated that universal behavior screening data are reviewed by a group. % (n) | ||||
| Teachers | 46.15 (12) | 64.29 (18) | 50.00 (34) | 52.46 (64) |
| All teachers from a specific grade level | 30.77 (8) | 82.14 (23) | 54.41 (37) | 55.74 (68) |
| Student support personnel | 61.54 (16) | 92.86 (26) | 77.94 (53) | 77.87 (95) |
| School administrator | 80.77 (21) | 82.14 (23) | 75.00 (51) | 77.87 (95) |
| Parent/Guardian | 3.85 (1) | 7.14 (2) | 1.47 (1) | 3.28 (4) |
| Other | 19.23 (5) | 14.29 (4) | 11.76 (8) | 13.93 (17) |
| How are screening data used to connect students to secondary (Tier 2) and tertiary (Tier 3) supports? – comment provided. % (n) | 97.14 (34) | 100.00 (34) | 98.90 (90) | 98.75 (158) |
BASC-3 BESS = Behavior Assessment System for Children 3rd Edition: Behavioral and Emotional Screening Systems (Kamphaus & Reynolds, 2015), DESSA = Deveraux Student Strengths Assessment (Naglieri et al., 2014), SAEBERS = Social, Academic, and Emotional Behavior Risk Screener (Kilgus et al., 2013), SSIS-PSG = Social Skills Improvement System-Performance Screening Guide (Elliott & Gresham, 2008), SSIS SEL = Social Skills Improvement System Social-Emotional Learning Edition (Gresham & Elliott, 2015), SDQ = Strengths and Difficulties Questionnaire (Goodman, 2001), SRSS = Student Risk Screening Scale (Drummond, 1994), SRSS-IE = Student Risk Screening Scale—Internalizing and Externalizing (Drummond, 1994; Lane & Menzies, 2009), SSBD = Systematic Screening for Behavior Disorders (Walker et al., 2014)
Table 3.
URP-NEEDS Descriptive Statistics
| URP-NEEDS Subscales and Items | 1 year (n = 31) M (SD) | 2–3 years (n = 30) M (SD) | 4–6 years (n = 90) M (SD) | Overall (N = 151) M (SD) | Significance Testing |
|---|---|---|---|---|---|
| Ci3T-URP-NEEDS overall score | 4.19 (0.63) | 4.21 (0.82) | 4.43 (0.54) | 4.34 (0.63) | F(2, 148) = 2.42, p = 0.09, R² = 0.03 |
| Understanding subscale | 4.02* (0.80) | 4.30 (0.82) | 4.40* (0.68) | 4.30 (0.74) | F(2, 148) = 3.17, p = 0.04, R² = 0.04 |
| School personnel understand the procedures for universal behavior screening. | 4.32 (1.14) | 4.83 (0.95) | 4.83 (0.89) | 4.73 (0.97) | |
| The current universal behavior screening approach offers a good way to identify a child’s behavior problem. | 4.81 (0.65) | 4.27 (1.17) | 4.59 (0.92) | 4.57 (0.94) | |
| School personnel know how to use universal behavior screening data to document student improvements. | 3.39 (1.23) | 3.80 (1.30) | 3.84 (1.18) | 3.74 (1.22) | |
| The current universal behavior screening approach is effective for addressing a variety of problems. | 4.23 (0.99) | 4.13 (1.36) | 4.42 (0.87) | 4.32 (1.01) | |
| School personnel are knowledgeable about the purpose and goals of universal behavior screening. | 3.84 (1.10) | 4.37 (0.85) | 4.41 (0.97) | 4.28 (1.00) | |
| School personnel are familiar with what can be done to prevent or treat behavioral difficulties in school. | 3.74 (1.12) | 4.07 (1.01) | 4.06 (1.02) | 3.99 (1.04) | |
| School personnel understand how goals for universal behavior screening fit with a system of student supports. | 3.84 (1.16) | 4.20 (1.06) | 4.24 (1.05) | 4.15 (1.08) | |
| School personnel understand how to use universal behavior screening data to guide decisions about student supports. | 3.65 (1.17) | 4.00 (1.14) | 4.11 (1.21) | 3.99 (1.20) | |
| School personnel are confident in their ability to carry out universal behavior screening. | 4.19 (1.14) | 4.63 (0.93) | 4.63 (0.91) | 4.54 (0.97) | |
| School personnel know how to carry out universal behavior screening. | 4.16 (1.10) | 4.73 (1.01) | 4.86 (0.84) | 4.69 (0.97) | |
| Willingness to Change subscale | 4.24 (1.01) | 4.38 (0.89) | 4.54 (0.73) | 4.45 (0.83) | F(2, 148) = 1.56, p = 0.21, R² = 0.02 |
| School personnel like to use new strategies to help address the behavioral needs of students. | 4.19 (1.22) | 4.53 (1.04) | 4.68 (0.86) | 4.55 (0.99) | |
| School personnel are willing to use new and different types of behavioral strategies developed by researchers. | 4.39 (1.05) | 4.47 (0.94) | 4.60 (0.88) | 4.53 (0.93) | |
| School personnel would try a new strategy to address the behavioral needs of students even if it were very different than what they are used to doing. | 4.16 (1.10) | 4.30 (0.99) | 4.44 (1.03) | 4.36 (1.04) | |
| School personnel are willing to change how they operate to meet the behavioral needs of students. | 4.23 (1.12) | 4.23 (0.94) | 4.42 (0.86) | 4.34 (0.93) | |
| Feasibility subscale | 4.60 (0.47) | 4.82 (0.87) | 4.91 (0.55) | 4.82 (0.62) | F(2, 148) = 2.97, p = 0.05, R² = 0.04 |
| The total time required for staff to carry out universal behavior screening is manageable for school personnel. | 4.74 (0.63) | 4.87 (1.04) | 5.18 (0.77) | 5.03 (0.82) | |
| The amount of time required of school personnel for record keeping related to universal behavior screening is reasonable. | 4.39 (0.67) | 4.57 (1.10) | 4.54 (0.86) | 4.52 (0.88) | |
| The preparation of materials needed for universal behavior screening is reasonable for school personnel. | 4.58 (0.72) | 4.97 (0.81) | 4.98 (0.67) | 4.89 (0.72) | |
| The materials needed for universal behavior screening are reasonable for school personnel. | 4.68 (0.65) | 4.87 (0.94) | 4.92 (0.67) | 4.86 (0.73) | |
| Family–School Collaboration subscale | 3.95 (1.20) | 3.84 (1.36) | 4.12 (1.20) | 4.03 (1.23) | F(2, 148) = 0.66, p = 0.52, R² = 0.01 |
| Regular home–school communication is needed in order to execute universal behavior screening. | 3.87 (1.26) | 3.93 (1.39) | 4.24 (1.36) | 4.11 (1.35) | |
| A positive home–school relationship is needed to carry out universal behavior screening. | 4.19 (1.38) | 4.10 (1.49) | 4.26 (1.33) | 4.21 (1.36) | |
| Parental collaboration is needed in order to implement universal behavior screening. | 3.77 (1.31) | 3.50 (1.48) | 3.87 (1.35) | 3.77 (1.37) | |
| Consultative and Community (External) Supports subscale | 4.17 (1.04) | 3.70 (1.29) | 4.18 (1.10) | 4.08 (1.14) | F(2, 148) = 2.15, p = 0.12, R² = 0.03 |
| A positive relationship with community agencies is important to carry out universal behavior screening. | 4.13 (1.38) | 3.93 (1.53) | 4.21 (1.36) | 4.14 (1.40) | |
| Ongoing assistance from external consultants is necessary to successfully use a universal behavior screening approach. | 4.23 (1.23) | 3.57 (1.45) | 4.22 (1.33) | 4.09 (1.35) | |
| School personnel need consultative support in order to carry out universal behavior screening. | 4.16 (1.24) | 3.60 (1.45) | 4.10 (1.25) | 4.01 (1.30) | |
* denotes statistically significant differences in multiple comparisons
Table 4.
Perceptions of Strengths of Universal Behavior Screeners
| Variable/level [response anchors] | Rating: 1 | 2 | 3 | 4 | 5 | Stage 1 M (SD) | Stage 2–3 M (SD) | Stage 4–6 M (SD) | Overall M (SD) |
|---|---|---|---|---|---|---|---|---|---|
| To what extent do you think the universal behavior screening measure(s) used in your school target the behaviors that are most relevant to student success? [1 = not at all; 3 = targets somewhat; 5 = targets completely] | 1 | 4 | 62 | 78 | 5 | 3.50 (0.82) | 3.73 (0.64) | 3.50 (0.57) | 3.55 (0.64) |
| To what degree do you think the universal behavior screening measure(s) used in your school provides information to guide intervention decisions? [1 = does not provide enough information to guide decisions; 3 = provides the right amount of information to guide decisions; 5 = provides more information than is needed to guide decisions] | 5 | 41 | 57 | 46 | 1 | 3.17 (0.83) | 2.97 (0.85) | 2.92 (0.87) | 2.98 (0.86) |
| Generally, how successfully do you think the universal behavior screening measure(s) is/are being used in your school? [1 = not successfully; 3 = somewhat successfully; 5 = very successfully] | 8 | 24 | 79 | 37 | 2 | 2.77 (0.90) | 3.10 (0.88) | 3.05 (0.77) | 3.01 (0.82) |
| Overall, to what degree do you think universal behavior screening has been effective at identifying student challenges in your school? [1 = not effective; 3 = somewhat effective; 5 = very effective] | 4 | 22 | 60 | 52 | 12 | 3.03 (0.85) | 3.37 (0.93) | 3.37 (0.92) | 3.31 (0.91) |

Rating columns show the number of respondents selecting each point on the 5-point Likert-type scale; M (SD) columns are by implementation stage (years)
Table 5.
Top Rated Perceived Barriers
| Variable/level | Implementation stage: 1 year | 2–3 years | 4–6 years | Overall Sample |
|---|---|---|---|---|
| To what extent do you perceive the following factors as being potential barriers to implementing universal behavior screening? (rank order) | ||||
| Teachers' concerns that the screening measure does not reflect all of their concerns | 11 | 6 | 22 | 39 |
| Financial costs in purchasing screening materials | 1 | 2 | 3 | 6 |
| Availability of trained staff to provide support to teachers | 5 | 5 | 12 | 22 |
| Availability of trained staff to summarize and interpret data | 3 | 2 | 3 | 8 |
| The extra work involved for teachers to complete the measure(s) | 4 | 2 | 5 | 11 |
| The extra work involved for staff to manage the screening data | 1 | 3 | 1 | 5 |
| Timely access for teachers to use screening results | 0 | 0 | 5 | 5 |
| Potential stigmatization of students who are identified through screening | 0 | 0 | 0 | 0 |
| Parental concerns involving consent | 0 | 0 | 0 | 0 |
| Concerns about the measure(s) overidentifying particular groups of students (e.g., gender, race/ethnicity, and socioeconomic status) | 1 | 0 | 3 | 4 |
| Ability of the school to provide follow-up services to those students identified as in need | 4 | 10 | 32 | 46 |
Numbers refer to the number of respondents who rated the given concern as their #1 concern (rank ordered)
Table 6.
Professional Learning: Needed Areas and Preferred Venues
| Variable/level | Implementation stage: 1 year | 2–3 years | 4–6 years | Total |
|---|---|---|---|---|
| In what areas do you believe that you could benefit from additional training around universal behavior screening? % (n) (yes) | ||||
| Deciding which behaviors our school should screen for | 26.67 (8) | 16.67 (5) | 23.86 (21) | 22.97 (34) |
| Deciding what screening measure is best for our school to use | 23.33 (7) | 20.00 (6) | 19.32 (17) | 20.27 (30) |
| Understanding how to use data from multiple sources to identify at-risk students | 66.67 (20) | 60.00 (18) | 50.00 (44) | 55.41 (82) |
| Understanding how to make intervention decisions based on universal behavior screening data | 93.33 (28) | 93.33 (28) | 81.82 (72) | 86.49 (128) |
| Other | 3.33 (1) | 0.00 (0) | 3.41 (3) | 2.70 (4) |
| Identify the ways in which you would prefer to receive training on universal behavior screening—(rank order) | ||||
| Provided with materials to study on my own, such as information sheet or manual | 6.67 (2) | 20.00 (6) | 15.91 (14) | 14.86 (22) |
| In service workshop | 40.00 (12) | 43.33 (13) | 39.77 (35) | 40.54 (60) |
| Externally sponsored conference or workshop | 10.00 (3) | 23.33 (7) | 9.09 (8) | 12.16 (18) |
| On-line module(s) | 6.67 (2) | 3.33 (1) | 7.95 (7) | 6.76 (10) |
| Individualized coaching | 23.33 (7) | 20.00 (6) | 11.36 (10) | 15.54 (23) |
| Professional learning community | 13.33 (4) | 13.33 (4) | 15.91 (14) | 14.86 (22) |
For the rank-order item, numbers refer to the number of respondents who rated the given avenue as their top preference; for the training-needs item, percentages (n) refer to respondents selecting each area
In Section 2, respondents were provided with the Usage Rating Profile for Supporting Students’ Behavioral Needs (URP-NEEDS; Chafouleas et al., 2018). The URP-NEEDS aims to assess multiple factors beyond social validity alone (e.g., feasibility, understanding) that may interact to determine whether or not a school will use a particular approach to identifying and supporting the social, emotional, and behavioral needs of students. Respondents were asked to consider their school’s approach to universal behavior screening when rating their level of agreement (i.e., 1 = Strongly Disagree to 6 = Strongly Agree) with 24 statements designed to assess perceived usability. The URP-NEEDS yields an overall total score (α = .91), as well as five subscale scores: understanding (10 items; α = .98), willingness to change (4 items; α = .88), feasibility (4 items; α = .79), family–school collaboration (3 items; α = .89), and external support (3 items; α = .80; all alpha coefficients reported for the current sample).
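As a rough illustration of how a multi-subscale measure like this is scored, the sketch below computes per-respondent subscale means and Cronbach's alpha from Likert-type item responses. The item-to-subscale mapping shown is hypothetical (chosen only to match the reported subscale sizes); the actual URP-NEEDS scoring key is defined by Chafouleas et al. (2018).

```python
import pandas as pd

# Hypothetical item labels q1..q24 grouped to match the reported subscale sizes
SUBSCALES = {
    "understanding": [f"q{i}" for i in range(1, 11)],                 # 10 items
    "willingness_to_change": [f"q{i}" for i in range(11, 15)],        # 4 items
    "feasibility": [f"q{i}" for i in range(15, 19)],                  # 4 items
    "family_school_collaboration": [f"q{i}" for i in range(19, 22)],  # 3 items
    "external_support": [f"q{i}" for i in range(22, 25)],             # 3 items
}

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def score_subscales(responses: pd.DataFrame) -> pd.DataFrame:
    """Mean rating (1 = Strongly Disagree to 6 = Strongly Agree) per subscale."""
    return pd.DataFrame(
        {name: responses[cols].mean(axis=1) for name, cols in SUBSCALES.items()}
    )
```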
Section 3 consisted of seven questions designed to assess respondents’ personal beliefs about their school’s current universal behavior screening practices. First, respondents were asked about the degree to which they believed their universal behavior screening measure (1) targets the behaviors most relevant to student success; (2) provides information to guide intervention decisions; (3) is used successfully; and (4) is effective at identifying student challenges. All four questions used a 5-point Likert-type scale (Table 4). Next, respondents were asked to rank order the extent to which they perceived different factors to be potential barriers to implementing universal behavior screening (e.g., financial costs, extra work involved; Table 5). A select-all-that-apply list was used to ask about the areas in which respondents believed they could benefit from additional training around universal behavior screening (Table 6). The final question asked respondents to identify the ways in which they would prefer to receive training regarding universal behavior screening by rank ordering a set of available options (Table 6). As in Section 1, these questions were created for this project by the research team with input from advisory board members and Ci3T district and school leaders.
Section 4 featured basic demographic information to describe the sample (Table 1). Items included: sex, age, ethnicity (Hispanic: yes vs. no), race, highest degree obtained, primary roles in the school (with the option to indicate more than one role), whether or not they provided instruction to students, grade level taught, years of experience, and professional learning experience with classroom management, academic screening, and behavior screening.
Design and Analysis
This was a survey study with a single time point of administration. We used descriptive and inferential statistics to answer our research questions. We computed descriptive statistics to summarize responses to closed-ended questions related to (1) current behavior screening practices; (2) perceptions of usability and implementation of screening; (3) strengths and barriers; and (4) professional learning priorities and preferred avenues. Responses to open-ended questions were analyzed using thematic analysis (TA; Braun & Clarke, 2006). After training in the qualitative analysis procedures, two research assistants independently applied data-driven codes to participant responses and then organized similar codes into larger themes; to constitute a theme, a similar pattern of response was required across five or more respondents (see the sketch below). After coders reviewed and confirmed themes, the first and sixth authors met to cross-check the individual analyses and finalize themes.
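As a toy sketch of that theme threshold (with hypothetical code labels, not the study's actual codes), counting how many respondents expressed each code and retaining only those expressed by five or more might look like this:

```python
from collections import Counter

# One set of data-driven codes per respondent (hypothetical labels)
coded_responses = [
    {"grouping_by_need"},
    {"grouping_by_need", "multiple_data_sources"},
    {"others_use_data"},
    {"unsure"},
    {"grouping_by_need"},
    # ... one entry per respondent
]

# Count respondents per code, then keep codes meeting the >= 5 threshold
code_counts = Counter(code for response in coded_responses for code in response)
candidate_themes = {code: n for code, n in code_counts.items() if n >= 5}
```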
When exploring differences across schools at different stages of Ci3T implementation, we created three groups of schools: initial (Year 1), mid-level (Years 2–3), and experienced (Years 4–6). The sample size was not sufficient to conduct comparisons across each year of implementation. We conducted a series of one-way ANOVAs to compare mean levels, contrasting subgroups on URP-NEEDS overall and subscale scores. We used Tukey multiple comparisons (α = .05) to determine differences in mean scores for all proposed comparisons. Finally, we conducted chi-square analyses to assess whether the barriers identified by participants or professional learning preferences varied significantly by implementation year. We analyzed all available data from respondents. Visual inspection of completion patterns showed no distinct patterns of missing item-level data (e.g., stopping at the half-way point, or after a specific survey section).
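The authors report analyzing the survey data in SAS; as a non-authoritative sketch of the same analytic plan in Python, the code below runs a one-way ANOVA across the three implementation-stage groups, follows up with Tukey multiple comparisons at α = .05, and uses a chi-square test to ask whether a categorical response (e.g., top-ranked barrier) varies by stage. Variable names (`stage`, `urp_total`, `top_barrier`) are hypothetical.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def analyze(df: pd.DataFrame) -> None:
    # One-way ANOVA: URP-NEEDS score by implementation stage (1, 2-3, 4-6 years)
    groups = [g["urp_total"].dropna() for _, g in df.groupby("stage")]
    f_stat, p_val = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_val:.3f}")

    # Tukey multiple comparisons (alpha = .05) for all pairwise stage contrasts
    print(pairwise_tukeyhsd(df["urp_total"], df["stage"], alpha=0.05))

    # Chi-square test of whether, e.g., the top-ranked barrier varies by stage
    table = pd.crosstab(df["stage"], df["top_barrier"])
    chi2, p, dof, _ = stats.chi2_contingency(table)
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```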
Finally, in addition to deriving summaries of participant responses, we also descriptively examined the degree to which reported universal screening practices aligned with intended practices within a Ci3T model. For example, District Ci3T leadership team members developed an assessment schedule that featured conducting universal behavior screening of all students three times per year (i.e., fall, winter, spring). If a respondent indicated screenings took place only twice per year, for example, this was noted as an alignment issue.
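A minimal sketch of this alignment check, assuming a simple dictionary of intended district practices (field names and values here are illustrative):

```python
INTENDED = {"screener": "SRSS-IE", "screenings_per_year": 3}

def alignment_issues(response: dict) -> list[str]:
    """Note each reported practice that departs from the intended practice."""
    return [
        f"{field}: reported {response.get(field)!r}, intended {intended!r}"
        for field, intended in INTENDED.items()
        if response.get(field) != intended
    ]

# Example: a respondent reporting twice-yearly screening with the SRSS
# would yield two noted alignment issues:
# alignment_issues({"screener": "SRSS", "screenings_per_year": 2})
```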
Results
Current Behavior Screening Practices
When asked which behavioral screening tool(s) their school adopted for monitoring all students, most respondents (84.38%) indicated use of the Student Risk Screening Scale for Internalizing and Externalizing (SRSS-IE; Drummond, 1994; Lane & Menzies, 2009); 23.13% indicated using the Student Risk Screening Scale (SRSS; Drummond, 1994), an apparent alignment issue given assessment schedules for all districts documented use of the SRSS-IE. Respondents in schools beyond the 1st year of Ci3T implementation appeared to have the most clarity on which screening tools were used. Of 1st-year implementers, 68.57% accurately identified the SRSS-IE as the screener used in their school; among more seasoned implementers (Years 2–3 and 4–6), the percentage accurately identifying the screener ranged from 88.24% to 89.01%. Beyond use of standardized behavior screeners, respondents frequently endorsed the use of classroom observations (78.79%) and office discipline referrals (72.12%) within the screening process (i.e., in addition to use of a systematic screening tool). Less frequently endorsed sources included adult nomination (37.58%), interviews (16.97%), and peer nomination (3.64%).
Most respondents (79.38%) reported their school screened students three times in the 2018–2019 academic year, again with respondents from schools beyond their first year of implementation having the greatest clarity on actual screening practices (given confirmation all schools were conducting universal behavior screenings three times per year). When asked who provides information used in universal behavior screenings, nearly all respondents (96.88%) indicated teachers, with respondents clear on this practice regardless of years of implementation experience. Much smaller percentages of respondents reported the involvement of student support personnel (e.g., school psychologist, school social worker, counselor; 12.50%), other school staff (e.g., paraprofessionals, classroom aides, lunchroom supervisors; 8.13%), or parents (2.50%).
When asked how universal behavior screening data are reviewed after they are collected, 46.25% of respondents indicated data were reviewed by individual school staff, 77.50% indicated data were reviewed by a group, and 14.38% did not know or preferred not to answer. Respondents at schools at later stages of Ci3T implementation (i.e., more years of implementation experience) more often reported the data being reviewed by individual school staff (52.94% Years 2–3; 50.55% Years 4–6), which is an intended practice in Ci3T models (e.g., teachers being able to review data independently to inform instructional practices). This practice was less familiar to respondents at schools in their first year of Ci3T implementation, with 28.57% of these respondents indicating universal behavior screening data are reviewed individually by school staff.
More than 75% of respondents indicated universal behavior screening data were reviewed by a group, such as a grade-level team. Respondents indicated these teams most often included student support personnel and administrators (77.87% each) and included teachers approximately half of the time (52.46%).
Respondents were also asked to describe how universal behavior screening data were used to connect students to Tier 2 and Tier 3 supports. Analysis focused on responses from individuals representing schools beyond the 1st year of implementation, given that 1st-year implementers were still within their first screening window. Ninety-nine participants responded, with thematic analysis of responses identifying four overarching themes. First, the greatest percentage of respondents (n = 61; 62%) noted that universal behavior screening data were used to create intervention groups according to student need. Some respondents described grouping students according to level of need (i.e., Tier 2 or Tier 3). For example, one respondent explained “the team looks at the most severe students from the screening and then puts them in tier 2 and then we talk about students from tier 2 who need more intensive intervention and put them in tier 3” (N.B.: Ci3T professional learning emphasizes that Tiers 2 and 3 are not separate locations but rather a menu of support options targeted to the student’s level of need). Others noted forming groups according to the type of need (e.g., internalizing, specific skill deficit). For example, another respondent explained that those students identified through screening “are able to be identified whether they are [exhibiting] internalizing or externalizing behaviors and this allows us to provide them with the supports that they need such as social work, counselor pull out, etc.” Second, several respondents (n = 16; 16%) indicated screening data were used—sometimes in combination with other data sources—to identify students who were at risk for academic or behavioral challenges and to develop a plan to address difficulties. As one respondent noted, “we study the data to see which youth may need additional services beyond the tier one supports and come up with a plan for those students.” Third, six respondents (6%) indicated others in the school building used the data (e.g., administrator, mental health team). One respondent explained, “to the best of my knowledge, these students are vetted through our psychologists/vice principal/classroom teachers and then next steps are considered,” whereas another indicated that the “mental health team uses data but I’m not clear about the structure.” Finally, a small number of respondents (n = 5; 5%) indicated they were unsure how or whether screening data were used. As one respondent noted, “I’m not really sure that they have been unless individual teachers are reaching out for support after completing the screening.”
Perceptions of Usability and Implementation of Screening
In terms of respondents’ perceived usability of the behavior screening process (Table 3), on average, the 151 respondents who completed the URP-NEEDS measure indicated slight agreement that collaboration and communication with families (M = 4.03; SD = 1.23) and consultative and community support (M = 4.08; SD = 1.14) were needed to support behavior screening. Agreement was slightly stronger that school personnel (1) understood how to carry out the screening process (M = 4.30; SD = 0.74); (2) were willing to use new strategies to address student behavior problems (M = 4.45; SD = 0.83); and (3) found behavior screening to be feasible, requiring a reasonable amount of time and resources to implement (M = 4.82; SD = 0.62).
Results of a series of one-way ANOVAs contrasting ratings by implementation stage for overall usability as well as each of the URP-NEEDS subscales indicated only one statistically significant difference between respondents in schools with different levels of experience implementing their Ci3T model: respondents in schools in Years 4–6 of implementation (i.e., experienced schools) reported statistically significantly higher levels of understanding (M = 4.40; SD = 0.68) compared to respondents in schools in Year 1 of implementation (M = 4.02; SD = 0.80), with implementation stage accounting for 4% of the variance: F(2, 148) = 3.17, p = 0.04, R² = 0.04 (1 < 4–6).
Strengths and Barriers
In terms of respondents’ personal beliefs regarding their school’s current universal behavior screening practices, respondents reported mild agreement that the screener used targeted behaviors most relevant to student success (M = 3.55; SD = 0.64) and that screening was effective in identifying student challenges (M = 3.31; SD = 0.91; noting the 1–5 range). Respondents were more decidedly neutral concerning whether the screener provided information to guide intervention decisions (M = 2.98; SD = 0.86) and whether the screener was being used successfully (M = 3.01; SD = 0.82). Results were comparable between respondents at schools at different stages of implementation, with respondents at schools in their first year of implementation reporting slightly lower scores regarding the extent to which their screener was being used successfully (M = 2.77; SD = 0.90) relative to respondents in Years 2–3 (M = 3.10; SD = 0.88) or 4–6 (M = 3.05; SD = 0.77).
When asked to rank order various factors in terms of the degree to which they served as potential barriers to implementing universal behavior screening (see Table 5), the three barriers most often ranked as the top concern by the 148 respondents were (1) the ability of the school to provide follow-up services to students identified as in need (31%); (2) teachers’ concerns that the screening measure does not reflect all of their concerns (26%); and (3) the availability of trained staff to provide support to teachers (15%). Relatively few respondents viewed either time (i.e., time to complete the measure, access the results, or manage the data) or costs as barriers. No statistically significant differences were identified across years of implementation based on chi-square goodness-of-fit tests.
Professional Learning Preferences: Priorities and Avenues
Finally, regarding the areas in which respondents believed they could benefit from additional training around universal behavior screening, 86.49% selected understanding how to make intervention decisions based on universal behavior screening data and over half of respondents (55.41%) selected understanding how to use data from multiple sources to identify students at behavioral risk. These priorities were comparable for respondents with different levels of experience implementing their Ci3T model.
When asked to rank order the ways in which Ci3T leadership team members would prefer to receive professional learning, the most preferred avenue was an in-service workshop (40.54% of respondents ranked this option first); least preferred were externally sponsored conferences or workshops and on-line modules. In Table 6, we report the avenues respondents rated as their top preference, with chi-square analyses indicating no statistically significant differences between those with different levels of experience implementing their Ci3T model.
Discussion
To date, our understanding of successes and challenges related to school-based universal behavior screening has largely been limited to examinations of barriers to adoption and installation (e.g., Bruhn et al., 2014). The current study therefore targeted a sample of 165 educators already implementing universal behavior screening (i.e., Ci3T leadership team members) to better understand stakeholders’ views of the screening process and associated professional learning needs. Respondents overwhelmingly indicated that results of their adopted screener (i.e., the SRSS-IE) were reviewed in conjunction with other sources of data (i.e., office discipline referrals, classroom observations) three times across the academic year. These data were most typically reviewed by a team—but also by individual teachers—and used both to identify individual students in need of supports and to group students according to shared areas of need (e.g., Tier 2 vs. Tier 3; internalizing vs. externalizing). In schools that had been implementing their Ci3T model for several years, data were increasingly used by individual educators for decision making (e.g., classroom-based instructional decisions). A hallmark of Ci3T is that teachers have timely access to data for adjusting Tier 1 instructional practices as well as for determining appropriate Tier 2 or Tier 3 interventions or supports (Lane et al., 2020a; Oakes et al., 2022).
When asked about their personal beliefs regarding universal behavior screening, responses from Ci3T leadership team members highlighted both benefits and barriers. With regard to data collection procedures, respondents indicated the process was both well understood by school staff and feasible. As expected, understanding increased with schools’ years of Ci3T implementation. This was encouraging, as it may indicate that the ongoing professional learning in which Ci3T leadership teams participate (covering the systems, structures, and practices in their Ci3T model, including screening procedures and data use throughout implementation) translates into the professional learning they are able to provide their faculty and staff. Within the URP-NEEDS, respondents reported they knew why universal behavior screening is important and that school staff were confident in carrying out the procedures. Furthermore, the strongest overall endorsement on the URP-NEEDS was for the item indicating the time required to carry out universal behavior screening was manageable. Such findings regarding perceived feasibility are consistent with prior research indicating schools have considered the SRSS-IE to be a feasible behavior screening tool with respect to the amount of time it takes to complete the measure and the tool’s user-friendliness (Oakes et al., 2016). In the present study, this was particularly true given that many schools reserved professional learning time for completing the screener and data were made immediately available.
In contrast, respondents as a group reported somewhat less confidence surrounding data use. Mean responses indicated respondents were uncertain (i.e., neither agreed nor disagreed) whether the screener was being used successfully (M = 3.01; SD = 0.82) and whether it was effective in identifying student challenges in their school (M = 3.31; SD = 0.91). Although respondents as a group reported they believed the universal behavior screener provided the right amount of information to guide intervention decisions, professional learning needs were also identified in this area. In particular, nearly 90% of Ci3T leadership team members reported they could personally benefit from additional training related to data use (i.e., how to use screening data to make decisions regarding intervention). It is important to note that professional learning related to the use of data for decision making remained a priority even as schools progressed through implementation stages.
Related to data use, survey results suggested one potential area in which Ci3T leadership team members might benefit from ongoing professional learning: the integration of multiple data sources within the screening process. About a quarter of respondents felt not all behaviors of concern might be detected through their current screening tool (the SRSS-IE), and over half of respondents expressed a desire for additional training around how to use multiple sources of data to identify and respond to students demonstrating behavioral risk. As noted in the results, those in their 1st year of implementation were less likely to indicate as a barrier the ability of school personnel to provide follow-up services to students identified as in need. This difference may be due to initial priorities resting on accomplishing the behavior screening process with integrity; it may be that attention shifts to how to use the behavior screening data in subsequent years.
Overall, these concerns highlight the importance of professional learning for Ci3T leadership team members around how to use the results of the SRSS-IE in conjunction with other measures to best identify and respond to students in need of Tier 2 or Tier 3 supports. Respondents overwhelmingly reported collecting multiple sources of screening data (e.g., office discipline referrals, observations), but they may benefit from additional guidance around how to use those data sources as part of the screening process and how to use these data to tease apart student-level from system-level concerns. Furthermore, findings highlight the importance of ensuring that teams understand how to use these multiple sources of data to guide the selection of Tier 2 and Tier 3 supports. Fortunately, these resources are being developed as part of Project ENHANCE, with free-access resources to support the selection and installation of systematic screening tools currently available on ci3t.org and pbis.org (Ma et al., 2022; Oakes, Buckman et al., 2021a; Oakes, Lane et al., 2021b; Oakes et al., 2022; Rollenhagen et al., 2021).
Limitations and Future Directions
Although results of the current survey study provide useful initial directions for enhancing professional development supports, limitations must be noted. First, due to the number of school districts participating in the study (n = 5), the effects of year of implementation cannot be disentangled from school district and geographic region, as was the case in the Common et al. (2021) study examining Ci3T professional learning avenues and priorities. For example, all Year 1 implementers were from the same district in the Northwest, and all Years 2–3 implementers were from the Midwest. As such, the analyses concerning year of implementation must be considered preliminary, calling for replication before generalizing these findings. Second, the pool of participants consisted of schools engaged in a larger grant project related to Ci3T; thus, the extension of these findings to schools implementing other systems is unknown. Future work could explore whether these findings hold for other integrated tiered systems. Third, to maximize feasibility (and response rate), the survey largely consisted of closed-ended items. As such, there was no opportunity to probe participant responses to clarify meaning or to more fully understand the experiences of school staff implementing universal behavior screening. We encourage others conducting inquiry into the application of screening procedures to build on this study and the qualitative work by Briesch et al. (2021). Finally, although self-report is not a concern for those items assessing personal beliefs, there is the possibility of inaccuracies in respondents’ descriptions of screening procedures, reports of their colleagues’ perceptions, and accounts of the scope of screening data usage. Although Ci3T leadership teams are representative of the school faculty and staff, they may not be fully aware of how all educators are using these data to support students, especially those educators using the data in their individual classrooms to improve their practices. We noted, for example, that although there was general consensus among members of each Ci3T leadership team regarding what universal behavior screening procedures looked like in their building, a small percentage of respondents either (1) reported not knowing the answers to particular questions (e.g., what screening measure was used, how screening data were reviewed) or (2) provided responses that were discrepant from those of the larger team, indicating a misunderstanding.
Conclusions
We conducted this study during the 2019–2020 academic year to identify priority directions for enhancing resources and professional learning about behavior screening. We surveyed district- and building-level Ci3T Leadership Team members about their use of behavior screening practices within Ci3T as well as their perceptions regarding the purpose, value, and usability of behavior screening. Overall, respondents reported behavior screening procedures to be clear and feasible, while simultaneously reporting less confidence in how to integrate multiple sources of screening data to support data-informed decision making. Results from this study, together with the identified professional learning needs around building staff capacity (Common et al., 2021) and leadership (Royer et al., 2022) within integrated tiered systems, will be used to inform future professional learning offerings developed to enhance and sustain Ci3T implementation.
Funding
The author(s) disclosed receipt of the following financial support for research, authorship, and/or publication of this article: This article was supported by funding provided by the Institute of Education Sciences, U.S. Department of Education (R324N190002: PI Lane). Opinions expressed herein do not necessarily reflect the position of the U.S. Department of Education, and as such, endorsements should not be inferred.
Declarations
Conflicts of Interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Acknowledgments
Special thanks are extended to all participants for their time, and appreciation is extended to research team members for their assistance with the project.
References
- Allen, G. E., Common, E. A., Germer, K. A., Lane, K. L., Buckman, M. M., Oakes, W. P., & Menzies, H. M. (2020). A systematic review of the evidence base for active supervision in PK-12 settings. Behavioral Disorders, 45(3), 167–182. https://doi.org/10.1177/0198742919837646
- Babbie, E. (1990). Survey research methods (2nd ed.). Wadsworth.
- Barrett, S., Eber, L., & Weist, M. (Eds.). (2013). Advancing educational effectiveness: Interconnecting school mental health and school-wide positive behavior support. University of Oregon.
- Bertram, R. M., Blase, K., Shern, D., Shea, P., & Fixsen, D. (2011). Implementation opportunities and challenges for prevention and health promotion initiatives. National Association of State Mental Health Directors.
- Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77–101. https://doi.org/10.1191/1478088706qp063oa
- Briesch, A. M., Chafouleas, S. M., Iovino, E. A., Abdulkerim, N., Sherod, R. L., Oakes, W. P., Lane, K. L., Common, E. A., Royer, D. J., & Buckman, M. (2021). Exploring directions for professional learning to enhance behavior screening within a comprehensive, integrated three-tiered model of prevention. Journal of Positive Behavior Interventions. https://doi.org/10.1177/10983007211050424
- Bruhn, A. L., Woods-Groves, S., & Huddle, S. (2014). A preliminary investigation of emotional and behavioral screening practices in K–12 schools. Education & Treatment of Children, 37(4), 611–634. https://doi.org/10.1353/etc.2014.0039
- Chafouleas, S. M., Briesch, A. M., McCoach, D. B., & Riley-Tillman, T. C. (2018). Usage rating profile for supporting students' behavioral needs. University of Connecticut.
- Common, E. A., Buckman, M. M., Lane, K. L., Oakes, W. P., Royer, D. J., Chafouleas, S., Briesch, A., & Sherod, R. (2021). Project ENHANCE: Assessing professional learning needs for implementing Comprehensive, Integrated, Three-tiered (Ci3T) models of prevention. Education & Treatment of Children, 44, 125–144. https://doi.org/10.1007/s43494-021-00049-z
- Dillman, D. A., Smyth, J. D., & Christian, L. M. (2008). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). John Wiley & Sons.
- Drummond, T. (1994). The Student Risk Screening Scale (SRSS). Josephine County Mental Health Program.
- Elliott, S. N., & Gresham, F. M. (2008). Social Skills Improvement System: Performance screening guide. Pearson.
- Ennis, R. P., Royer, D. J., Lane, K. L., & Griffith, C. E. (2017). A systematic review of precorrection in PK–12 settings. Education & Treatment of Children, 40, 465–496. https://doi.org/10.1353/etc.2017.0021
- Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. University of South Florida, National Implementation Research Network.
- Fuchs, L. S., Fuchs, D., & Compton, D. L. (2012). Smart RTI: A next-generation approach to multilevel prevention. Exceptional Children, 78(3), 263–279. https://doi.org/10.1177/001440291207800301
- Goodman, R. (2001). Psychometric properties of the Strengths and Difficulties Questionnaire (SDQ). Journal of the American Academy of Child & Adolescent Psychiatry, 40(11), 1337–1345. https://doi.org/10.1097/00004583-200111000-00015
- Gresham, F., & Elliott, S. N. (2015). Social Skills Improvement System (SSiS)—Social emotional learning edition. PsychCorp Pearson Education.
- Kamphaus, R. W., & Reynolds, C. R. (2015). Behavior Assessment System for Children, Third Edition (BASC-3): Behavioral and Emotional Screening System (BESS). PsychCorp Pearson Education.
- Kilgus, S. P., Chafouleas, S. M., Riley-Tillman, T. C., & von der Embse, N. P. (2013). Social, Academic, and Emotional Behavior Risk Screener©: Teacher rating form. University of Missouri.
- Lane, K. L., & Menzies, H. M. (2003). A school-wide intervention with primary and secondary levels of support for elementary students: Outcomes and considerations. Education & Treatment of Children, 26, 431–451.
- Lane, K. L., & Menzies, H. M. (2009). Student Risk Screening Scale for Internalizing and Externalizing. http://www.ci3t.org/screening
- Lane, K. L., Menzies, H. M., Ennis, R. P., & Oakes, W. P. (2015). Supporting behavior for school success: A step-by-step guide to key strategies. Guilford Press.
- Lane, K. L., Menzies, H. M., Oakes, W. P., & Kalberg, J. R. (2020a). Developing a schoolwide framework to prevent and manage learning and behavior problems (2nd ed.). Guilford Press.
- Lane, K. L., Oakes, W. P., Cantwell, E. D., & Royer, D. J. (2018a). Building and installing comprehensive, integrated, three-tiered (Ci3T) models of prevention: A practical guide to supporting school success V1.2. KOI Education.
- Lane, K. L., Oakes, W. P., & Menzies, H. M. (2018b). Comprehensive, integrated, three-tiered (Ci3T) models of prevention: The role of systematic screening to inform instruction. In P. C. Pullen & M. J. Kennedy (Eds.), Handbook of response to intervention and multi-tiered systems of support (pp. 63–75). Routledge.
- Lane, K. L., Oakes, W. P., & Menzies, H. M. (2021). Considerations for systematic screening PK–12: Universal screening for internalizing and externalizing behaviors in the COVID-19 era. Preventing School Failure: Alternative Education for Children & Youth, 65(3), 275–281. https://doi.org/10.1080/1045988X.2021.1908216
- Lane, K. L., Powers, L., Oakes, W. P., Buckman, M. M., Sherod, R., & Lane, K. S. (2020b). PBIS Forum 19 Practice Brief: Universal screening—systematic screening to shape instruction: Lessons learned and practicalities. University of Oregon. https://www.pbis.org/resource/universal-screening-systematic-screening-to-shape-instruction
- Lane, K. L., Richards-Tutor, C., Oakes, W. P., & Connor, K. (2014). Initial evidence for the reliability and validity of the Student Risk Screening Scale with elementary age English learners. Assessment for Effective Intervention, 39(4), 219–232. https://doi.org/10.1177/1534508413496836
- Lyon, A. R. (2016). Implementation science and practice in the education sector. Substance Abuse & Mental Health Services Administration. https://education.uw.edu/sites/default/files/Implementation%20Science%20Issue%20Brief%20072617.pdf
- Ma, Z., Sherod, R., Lane, K. L., Buckman, M. M., & Oakes, W. P. (2022). Interpreting universal behavior screening data: Questions to consider. University of Oregon.
- McIntosh, K., & Goodman, S. (2016). Integrated multi-tiered systems of support: Blending RTI and PBIS. Guilford Press.
- Naglieri, J. A., LeBuffe, P. A., & Shapiro, V. B. (2014). Devereux Student Strengths Assessment K–8th Grade: A universal screening and progress monitoring system for social-emotional competencies. Aperture Education.
- Oakes, W. P., Buckman, M. M., Lane, K. L., & Sherod, R. L. (2021a). Selecting a universal behavior screening tool: Questions to consider. University of Oregon.
- Oakes, W. P., Lane, K. L., Cox, M., & Messenger, M. (2014). Logistics of behavior screenings: How and why do we conduct behavior screenings at our school? Preventing School Failure, 58(3), 159–170. https://doi.org/10.1080/1045988X.2014.895572
- Oakes, W. P., Lane, K. L., & Ennis, R. P. (2016). Systematic screening at the elementary level: Considerations for exploring and installing universal behavior screening. Journal of Applied School Psychology, 32, 214–233. https://doi.org/10.1080/15377903.2016.1165325
- Oakes, W. P., Lane, K. L., Ma, Z., Sherod, R., & Perez-Clark, P. (2022). Installing a universal behavior screening tool: Questions to consider. University of Oregon.
- Oakes, W. P., Lane, K. L., Sherod, R. L., Adams, H. R., & Buckman, M. M. (2021b). Guidance for systematic screening: Lessons learned from practitioners in the field. University of Oregon.
- Rollenhagen, J., Buckman, M. M., Oakes, W. P., & Lane, K. L. (2021). Screening coordinator training manual: A guide for installing the Student Risk Screening Scale – Internalizing and Externalizing (SRSS-IE) in your school or district [Training manual]. Ci3T Strategic Leadership Team. www.ci3t.org/screening
- Royer, D. J., Lane, K. L., Cantwell, E. D., & Messenger, M. (2017). A systematic review of the evidence base for instructional choice in K-12 settings. Behavioral Disorders, 42, 89–107. https://doi.org/10.1177/0198742916688655
- Royer, D. J., Lane, K. L., Dunlap, K. D., & Ennis, R. P. (2019). A systematic review of teacher-delivered behavior-specific praise on K-12 student performance. Remedial & Special Education, 40(2), 112–128. https://doi.org/10.1177/0741932517751054
- Royer, D. J., Oakes, W. P., Chafouleas, S. M., Briesch, A. M., Lane, K. L., Buckman, M. M., & Sherod, R. L. (2022). Project ENHANCE component 2: Leadership—interviews and focus groups [Manuscript submitted for publication].
- Simeonsson, R. J. (1994). Risk, resilience & prevention: Promoting the well-being of all children. Paul H. Brookes.
- Sugai, G., & Horner, R. H. (2009). Responsiveness-to-intervention and school-wide positive behavior supports: Integration of multi-tiered system approaches. Exceptionality, 17(4), 223–237. https://doi.org/10.1080/09362830903235375
- Walker, H. M., Severson, H. H., & Feil, E. G. (2014). Systematic Screening for Behavior Disorders (SSBD) technical manual: Universal screening for PreK–9 (2nd ed.). Pacific Northwest Publishing.
