NIHPA Author Manuscript; available in PMC: 2024 Nov 10.
Published in final edited form as: Health Promot Pract. 2021 Jun 1;23(5):843–851. doi: 10.1177/15248399211014501

Development and Interrater Reliability of an Observational School Environment Checklist: A Practical, Comprehensive Tool to Assess Healthy Eating and Physical Activity Opportunities in Schools

Hannah Lane 1, Katherine Campbell 2, Anne Zhang 3, Rachel Deitch 4, Aaron Litz 4, Jasmia Shropshire 4, Lindsey Turner 5, Erin Hager 4
PMCID: PMC11550863  NIHMSID: NIHMS2031464  PMID: 34060358

Abstract

Introduction.

Comprehensive, objective assessment of schools’ eating and physical activity environments is critical to developing and evaluating policies and interventions to reduce pediatric obesity inequities; however, few tools exist that describe the entire school comprehensively and are feasible with restricted resources. This study describes the development and reliability of the observational school environment checklist (OSEC), a comprehensive observational audit tool.

Method.

We developed the OSEC through iterative adaptations of existing instruments and pilot testing. The tool assesses four focus areas: cafeteria, lobby/hallway, gym, and outdoor areas. For reliability testing, two trained auditors independently completed the OSEC and met to resolve disagreements. For items with poor agreement, a third independent coder coded photographs taken during auditing. Percent agreement and Cohen’s kappa were calculated for all items and across four evidence-based constructs: atmosphere, accessibility, attractiveness, and advertising.

Results.

After iterative development, the 88-item OSEC was tested for reliability in 18 schools. Items with poor (<80%) agreement or redundancy were discarded or reworded (n = 16 items). All four constructs had acceptable agreement, ranging by focus area: 72.3% (attractiveness), 86.3% to 97.1% (atmosphere), 82.9% to 100% (accessibility), and 92.9% (advertising). Cohen’s kappa ranges were acceptable: 0.66–0.91 (atmosphere), 0.60–1.00 (accessibility), 0.46 (attractiveness), and 0.77 (advertising). After adding similar items across domains (n = 49) to improve comprehensiveness, the final tool contained 121 binary items.

Implications.

The OSEC reliably and comprehensively captures the school environment. It requires few resources or expertise to administer, has acceptable reliability, and can assess atmosphere, accessibility, attractiveness, and advertising in school areas where students engage in eating and physical activity.

Keywords: child/adolescent health, school health, health promotion, obesity


Schools are a logical setting to provide opportunities for healthy eating and physical activity behaviors for children and to promote lifelong health. More than 95% of children attend school for a large portion of childhood, and their interactions and experiences at school are highly influential on their behaviors (National Center for Education Statistics, 2019; Story et al., 2009). Comprehensive approaches such as the Centers for Disease Control and Prevention’s (CDC) Whole School, Whole Community, Whole Child (WSCC) model, which seeks to align health and education through policy, practice, and environmental change across many areas of the school, are needed. Environments that promote healthy eating and physical activity are critical to a WSCC approach, as they not only reduce risk for pediatric obesity but are also critical for academic outcomes, student learning, and social interactions (CDC, 2020). School-based efforts are of particular importance for children from socially disadvantaged families, who may have few other opportunities to consume nutritious meals and be physically active outside of school. Optimizing the school environment to enable healthy behaviors is an important and equitable approach to health promotion and obesity prevention (Kumanyika, 2019). Such changes include promoting the availability of nutritious foods and beverages, marketing healthful items, and enacting policies and structures that ensure safe, fun spaces for children to be frequently active throughout the day (Kumanyika, 2019; Micha et al., 2018; Morton et al., 2016; Sallis & Glanz, 2009; Story et al., 2009).

For school stakeholders and researchers to describe the extent to which schools’ environments are health promoting, as well as to monitor the implementation of new policies and practices, it is important to rigorously assess the areas of the school that are most critical for child health. A variety of methods have been used, such as administrator or food service surveys/interviews, checklists, or observational audits (Lane et al., 2020; O’Halloran et al., 2020; Patel et al., 2020; Saluja et al., 2018). Observational audits can be advantageous over other methods due to their objectivity and reduction in reporting bias (Glanz et al., 2015); however, existing research-tested audit tools are either comprehensive but burdensome for researchers and schools or are brief but specific to one area of the school (e.g., vending or outdoor play area; Lane et al., 2020). There is a need for reliable and valid observational audit tools that assess all relevant school areas, can be completed expediently to minimize burden on schools, and are practical for intervention efforts. Practicality is crucial to ensure that research findings are relevant both to research and to the interests of school stakeholders, such as members of a school health council, and can be directly linked to policy and practice recommendations.

PURPOSE

This study describes the development, interrater reliability testing, and scoring of the observational school environment checklist (OSEC), an observational audit tool used to comprehensively, feasibly, and practically assess schools’ environments for healthy eating and physical activity.

METHOD

Tool Development

We initially developed the OSEC to meet a shared goal across two U.S. Department of Agriculture–funded, school-based randomized controlled intervention trials in Maryland, taking place in eight total districts across the two studies. This goal was to measure variations in schools’ baseline environments and to identify changes in the environment across areas of the school that may result from the interventions. The first trial, Project SELECT (Student Engagement, Lunchroom Environment, and Culinary Training), is a study across three school years (2016–17 to 2018–19) that aimed to increase lunch and breakfast participation rates and selection/consumption of specific vegetable subgroups in 23 schools in five school districts. The second trial, Wellness Champions for Change, is a study across five school years (2016–17 to 2020–21) that aims to support teams of teachers, administrators, staff, parents, and students to implement their district’s local wellness policy (i.e., written documents that guide district and school-level efforts to establish a health-promoting school environment; Lane et al., 2018; Local School Wellness Policy Implementation Under the Healthy, Hunger-Free Kids Act of 2010, 2016). This trial takes place in 33 schools serving predominantly low- and middle-income students in five districts (two that overlap with SELECT).

Our research team developed the OSEC through an iterative process (Figure 1). The goal was to quantitatively document the extent to which the physical characteristics of common areas of the school (e.g., cafeteria, lobby/hallways, gym, outdoor play areas) promote healthy eating and opportunities for physical activity. We wanted to ensure that our tool achieved this goal with minimal burden placed on schools, which face many resource constraints. First, we examined items from existing instruments that measured relevant aspects of the school environment and/or had been previously administered in low- or middle-income schools. These instruments included the smarter lunchrooms self-assessment scorecard (California Department of Education, 2020; Hanks et al., 2013), several research-tested tools that assessed water availability and quality and the condition of outdoor play areas (Jones et al., 2010; Kenney et al., 2016), and the school observation environmental checklist, a tool developed and previously administered by the research team in urban schools serving primarily students from lower income households (a similar demographic to schools in our studies). We extracted specific items from these tools that (1) could be measured objectively and (2) aligned with policies and/or best practices for promoting healthy eating and physical activity during the school day. Items were a mix of dichotomous (yes/no) response options, scaled response options (poor, fair, good condition), and guided comments to provide more specific and detailed information. We incorporated photo taking into the protocol to allow for cross-validation during coding and data entry (thereby reducing the amount of time spent at the school) and to provide visual support for reporting findings.

FIGURE 1. Development, Reliability Testing, and Modification Process for the Observational School Environment Checklist (OSEC)

We shared Version 1 with a team of school wellness experts, including staff from the state Department of Education Office of School and Community Nutrition Programs, Extension (Food and Nutrition and Health and Wellness education outreach), and school wellness researchers from other universities. Experts provided feedback during a formal team meeting, leading to minor modifications. The tool was pilot tested by research assistants in two urban schools (one elementary and one elementary/middle, both ~500 students enrolled), with which we have strong existing partnerships. Research assistants completed the tool on-site, accompanied by a school staff member. Their responses were compared to assess item agreement, and the research team debriefed to develop training and administration protocols. Based on this pilot testing, we added clarifications to the auditor training and expanded the descriptions of certain items for Version 2, which contained 88 binary items and was assessed for interrater reliability. Throughout the tool development and pilot-testing process, we updated the instruction manual to ensure that auditors were consistently and adequately trained and that objectivity could be maintained. The manual’s content focused on how to prepare for the observation, interpret each item, capture the environment with photographs, and score the results. The manual also included guidelines for interacting with staff and students while collecting data (see Supplemental file).

Scoring

The scoring protocol was designed to be directly linked to best practice recommendations for a WSCC approach (CDC, 2020), simple for school stakeholders to understand and use, and adaptable for other research purposes (Glanz et al., 2015; Lane et al., 2020). As such, items that had acceptable (>80%) agreement during reliability testing were sorted into one of four constructs: atmosphere, access, attractiveness, and advertising. These constructs represent characteristics of the school environment in four focus areas that have been previously associated with healthy eating and physical activity (Frerichs et al., 2015; Hanks et al., 2013; Morton et al., 2016; Sallis et al., 2001; Story et al., 2009; Table 1). Binary item responses are scored 0 or 1: responses that support a health-promoting environment are scored 1, and all other responses are scored 0. An overall score can be calculated for each construct by focus area by summing binary responses, with a higher score indicating the endorsement of more health-promoting environment best practices in that area.
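The summing logic above can be sketched in a few lines of Python. This is an illustrative sketch only; the item names are hypothetical and not taken from the actual instrument.

```python
from typing import Dict

def construct_score(responses: Dict[str, int]) -> int:
    """Sum binary OSEC item responses for one construct in one focus area.

    1 = response supports a health-promoting environment, 0 = it does not.
    """
    assert all(v in (0, 1) for v in responses.values()), "items must be binary"
    return sum(responses.values())

# Hypothetical example: three cafeteria "access" items observed at one school.
cafeteria_access = {
    "menu_visible": 1,
    "salad_available": 0,
    "sliced_fruit": 1,
}
print(construct_score(cafeteria_access))  # → 2
```

A higher construct score simply counts how many best practices were observed in that area.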

TABLE 1.

Reliability Statistics for Each Construct of the Observational School Environment Checklist, Including Ranges Across Focus Areas (n = 75 items)

Construct       No. of Items   % Agreement   Kappa Statistic^a (95% Confidence Interval)
Atmosphere      32             86.3–97.1     0.71 [0.61, 0.82] to 0.92 [0.75, 1.08]
Access          26             82.9–100      0.61 [0.49, 0.72] to 1.00 [1.00, 1.00]
Attractiveness   4             72.3          0.47 [0.25, 0.69]
Advertising     13             92.9          0.77 [0.67, 0.87]
^a Cohen’s kappa statistics were calculated using α = .05.

For outdoor play areas, where many different amenities are present and similar items are asked within each area, we developed two scoring scenarios depending on the context of tool use. The first scenario follows the straight scoring for other school areas. Binary items indicating the presence of each of the eight outdoor area amenities (basketball court, other court/blacktop area, outdoor playing field(s), baseball field, tennis court, outdoor track, playground, bike rack) are summed to create a single score for access (e.g., a school with a basketball court, tennis court, and track would receive an access score of 3 for the outdoor play area). For atmosphere, items within each amenity are summed, and the highest amenity score is reported for outdoor play areas. The second scenario was developed to enable better comparison between schools, if desired, by addressing considerable variation in outdoor amenities depending on school size and type (e.g., elementary schools may have a playground whereas middle schools have a running track). In this scenario, the eight amenities are combined into four “sections” of outdoor areas: general areas (bike racks), field areas (outdoor playing fields, baseball fields), hard court areas (basketball, blacktop, tennis), and auxiliary areas (track, playground). Then, an access score is calculated for each section based on both presence of each amenity as well as available equipment, if relevant. A separate atmosphere score is calculated for each section. The instruction manual provides more detail for these protocols (see Supplemental file).
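The two outdoor scenarios can be summarized with a short sketch. The amenity and section groupings follow the text; the data and simplifications are hypothetical (in particular, the equipment component of the scenario-two access score is omitted here).

```python
AMENITIES = ["basketball_court", "other_court", "playing_field", "baseball_field",
             "tennis_court", "track", "playground", "bike_rack"]

# Section groupings for scenario two, as described in the text.
SECTIONS = {
    "general":    ["bike_rack"],
    "field":      ["playing_field", "baseball_field"],
    "hard_court": ["basketball_court", "other_court", "tennis_court"],
    "auxiliary":  ["track", "playground"],
}

def scenario_one(presence, atmosphere_items):
    """Access = count of amenities present; atmosphere = highest per-amenity sum."""
    access = sum(presence.get(a, 0) for a in AMENITIES)
    atmosphere = max((sum(items) for items in atmosphere_items.values()), default=0)
    return access, atmosphere

def scenario_two_access(presence):
    """One access score per section (presence only; equipment items omitted)."""
    return {s: sum(presence.get(a, 0) for a in ams) for s, ams in SECTIONS.items()}

# Hypothetical school with a basketball court, tennis court, and track.
presence = {"basketball_court": 1, "tennis_court": 1, "track": 1}
print(scenario_one(presence, {})[0])   # → 3, matching the text's example
print(scenario_two_access(presence))   # hard_court and auxiliary sections score
```

Scenario two trades a single summary number for section-level scores that are comparable across schools with different amenity mixes.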

Interrater Reliability Testing

We evaluated interrater reliability in all schools participating in the first cohort of the two trials during the 2017–2018 school year. School-level demographic data (e.g., size, type, student body race/ethnicity and income) were collected from the Maryland State Department of Education and the National Center for Education Statistics, including the free/reduced-price meal eligibility rate and geographic location (urban/suburban/rural; National Center for Education Statistics, 2020).

Auditors were members of both projects’ research staff. Training consisted of an ~1-hour in-person session and a practice audit. Six auditors were cleared to collect data after two senior members of the research team determined that they were using the tool as instructed (see Supplemental file). Two auditors independently completed the tool worksheet and took photographs to align with each item at each school. Time to complete the OSEC varied depending on school size and time of day, but it took no longer than 2 hours, including a 30-minute lunch period (see Supplemental file). Figure 2 displays an excerpt from the tool worksheet, procedure manual, scoring guide, and an example photograph. Auditors met within 1 week to resolve disagreements and reach consensus on differing responses.

FIGURE 2. Excerpt From the Observational School Environment Checklist (OSEC) Tool and Corresponding Procedure Manual, Sample Photo, and Scoring-Guide Excerpt

Interrater reliability was assessed using item-by-item percent agreement. Category percent agreement and Cohen’s kappa (κ) scores assessed construct reliability across each of the four focus areas. Percent agreement allows for explicit reporting of agreement by item or category, whereas Cohen’s κ accounts for chance agreement (McHugh, 2012). Cohen’s κ was calculated using R (R Core Team, 2020). We used Cohen’s criteria for agreement (0.01–0.2 = slight, 0.21–0.40 = fair, 0.41–0.60 = moderate, 0.61–0.80 = substantial, 0.80–1.00 = almost perfect; McHugh, 2012). When items had <80% agreement, a third trained auditor who was not familiar with the school coded a subset of items using photographs. The research team met to discuss items with <80% agreement and the reliability of the four constructs, and to decide whether objectivity could be improved by revising items, converting open-ended or Likert-type items to binary items, or providing more training, or whether the items needed to be discarded.
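For two raters coding binary items, the two statistics above reduce to a few lines. The analysis here was done in R; this Python sketch with hypothetical ratings illustrates the same calculations.

```python
def percent_agreement(r1, r2):
    """Proportion of items on which the two raters gave the same code, as a %."""
    return 100 * sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' binary codes: (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n  # observed agreement
    # Expected chance agreement from each rater's marginal proportion of 1s.
    p1, p2 = sum(r1) / n, sum(r2) / n
    p_e = p1 * p2 + (1 - p1) * (1 - p2)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes for 10 items from two independent auditors.
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]
print(percent_agreement(rater1, rater2))          # → 90.0
print(round(cohens_kappa(rater1, rater2), 2))     # → 0.78 ("substantial")
```

Kappa is lower than raw agreement because it discounts the agreement expected by chance given each rater's marginal rates.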

Finalizing the Tool and Protocol

After reliability testing, final modifications were made to the Version 3 OSEC tool, training and procedure manual, and scoring guide. These documents were compiled into a single information manual, which includes in-person training materials, a definition guide, a detailed data collection manual, management and scoring protocol for binary items, and helpful recommendations based on the initial experiences collecting data in schools (e.g., how to approach cafeteria staff). The manual also includes a report card for disseminating the score to partnering schools following the study. Figure 2 depicts several excerpts of this manual describing items related to the cafeteria. The full information manual is currently available for download at http://www.marylandschoolwellness.org/ (see Supplemental file).

Results

The OSEC was completed by two raters per school in 18 schools, which represented two school districts. Thirteen schools were in an urban locale, and five were suburban. Twelve were elementary or elementary/middle schools, and six were middle schools. All schools served predominantly lower income students and students belonging to a minority racial/ethnic group (Table 2).

TABLE 2.

Demographic Characteristics of Observational School Environment Checklist Study Sample and Prevalence of Environmental Characteristics (n = 18 schools)

Demographics n (%) or M (Range)
No. of schools 18
Size (No. of students)  670 (344–943)
School type
 Elementary    3 (16.7)
 Elementary/middle    9 (50.0)
 Middle    6 (33.3)
Geographic location
 Urban   13 (72.2)
 Suburban     5 (27.8)
Student race/ethnicity^a
 % Black or African American 59.8 (6–95)
 % White 11.1 (5–39)
 % Hispanic or Latino 25.6 (5–72)
 % Asian  6.2 (5–11)
 Percent free/reduced-price meal eligibility^a 80.3 (59–95)
School environmental characteristics
 Dining area menu  6 (33.3)
 Salad available for students  6 (33.3)
 Sliced/cut fruit  6 (33.3)
 Hot and cold vegetables available 10 (56.0)
 White milk on display 10 (56.0)
 Gym  18 (100.0)
 Playground 10 (56.0)
 Outdoor tennis court  7 (38.9)
 Outdoor track  3 (16.7)
^a Values <5% or >95% are capped at 5% and 95% due to state reporting protocols.

Interrater Reliability of Version 2

Item interrater reliability was first assessed, followed by reliability of the four constructs. Most individual items (71/88) had >80% agreement between the raters. For the 17 items with low agreement, disagreements could be resolved for 13 of the items when the raters reviewed the photographs, and we determined that future objectivity could be improved through better training. The remaining four items were determined to be too subjective and were discarded (e.g., Is the lunch period noisy?). An additional nine items with good agreement were discarded due to repetition (e.g., too many questions asking about menu presence) prior to assessing construct reliability. This left 75 items for scale reliability. Percent agreement for the remaining 75 items was acceptable for atmosphere (n = 32 items; 86.3%–97.1% agreement), access (n = 26 items; 82.9%–100%), and advertising (n = 13 items; 92.9%). Cohen’s κ ranges were substantial or almost perfect (>0.61). For attractiveness (n = 4; 72.3%), agreement was lower but still acceptable. Cohen’s κ was 0.47, or moderate agreement (McHugh, 2012). Table 1 describes mean and range scores for constructs across school focus areas.

Modifications From Version 2 to Version 3

In addition to eliminating 13 items with poor reliability or repetition as described above, we discarded, added, or modified several other items. First, we discarded an additional three items because they did not fit the needs of the tool or were repetitive, so that 16 total items were removed. Second, we duplicated existing items that were asked in only one focus area so that they were asked across all areas (e.g., whether promotional posters or water sources were present), which improved comprehensiveness and consistency. This resulted in 49 added items. Finally, we modified four items that, while they had adequate agreement, still required clarification. Thus, Version 3 (the final version) contained 121 items. Other modifications included changing open-ended/Likert-type items to dichotomous yes/no options to eliminate confusion and to improve ease of score interpretation. Comment boxes and guided selection choices (e.g., poor, fair, good condition) were retained but were not included in the binary item scoring protocol or described in this article.

Audit Scores for Version 2

Table 2 describes the number and proportion of schools endorsing example binary items. All 18 schools had a gym, about half (56%) had access to a playground, and some had auxiliary spaces, such as tennis courts (39%) and/or tracks (17%). In the cafeteria, one third of schools had a menu visible in the dining area (33%), had a salad option available (33%), and had sliced or cut fruit on the service line (33%) for students to select on the day of the school visit. For vegetable and milk availability, slightly over half of the schools had both hot and cold vegetables for students to select (56%), and had white milk on the front of the display (56%). Table 3 describes the average score for atmosphere, advertising, and access across the four focus areas, with the four outdoor sections described separately. Attractiveness items were only present in the cafeteria and thus are not reported for other areas.

TABLE 3.

Average Scores for Atmosphere, Access, Attractiveness, Advertising Across the Four Focus Areas for Final Version (Version 3) of the Tool (n = 121 items)

Focus Area Construct No. of items M (SD) Score Range
Cafeteria Atmosphere 13 9.94 (1.70)  6–13
Advertising 16 4.72 (2.35) 1–9
Access 11 6.50 (2.33)  1–10
Attractiveness  4 1.89 (0.76) 0–3
Lobby/hallways Atmosphere 12 9.72 (3.94)  1–12
Advertising  9 3.44 (1.15) 1–6
Access  3 1.00 (0.84) 0–2
Attractiveness 0
Gym Atmosphere  5 3.61 (1.14) 2–5
Advertising  3 2.72 (0.67) 2–4
Access  4 1.67 (1.03) 0–3
Attractiveness  0
Outdoor areas (General, fields, auxiliary areas, courts)a General atmosphere  2 1.11 (0.32) 1–2
Court atmosphere 12 3.94 (2.19)  2–10
Field atmosphere  8 3.33 (3.12) 0–8
Auxiliary atmosphere  7 1.67 (1.37) 0–4
Advertising  0
General access  1 0.44 (0.51) 0–1
Court access  4 1.61 (0.78) 0–3
Field access  4 2.11 (1.68) 0–4
Auxiliary access  3 1.22 (0.88) 0–2
Attractiveness  0
^a General = bike rack; courts = basketball, blacktop, tennis; fields = outdoor playing field, baseball; auxiliary areas = outdoor track, playground.

DISCUSSION

This study describes the development, reliability testing, and iterative modification process for the OSEC, a comprehensive and literature-based tool for assessing components of the cafeteria, lobby/hallways, and indoor and outdoor play areas that may influence students’ physical activity and eating behaviors at school (Frerichs et al., 2015; Story et al., 2009). In addition to describing the tool, we provide information on training, scoring, and dissemination to allow for broad use for collaborative research with health promotion professionals and other stakeholders interested in evaluating or improving schools’ health-promoting environments.

The OSEC differs from existing research tools in its comprehensiveness, lower burden of administration due to photo validation, and simple scoring protocol. It is also distinct from continuous quality improvement tools such as the school health index or the smarter lunchrooms scorecard (CDC, 2019; Hanks et al., 2013), which are similarly comprehensive, easy to administer, and linked to best practice recommendations, but have not been tested for reliability, which limits their ability to make comparisons between schools or systematically track improvement. Moreover, continuous quality improvement tools are useful components of interventions that aim to implement changes in a particular setting or organization but are difficult to use in situations where schools are not receiving interventions (e.g., control schools in a randomized controlled trial) or where policy implementation is being monitored. The comprehensiveness of the OSEC allows users to acquire a broad picture of the health-promoting environment, which can inform upstream efforts to change policies, systems, and environments. The ease of administration reduces the burden for researchers, school staff, or other stakeholders who may be interested in the results. The use of mostly binary items improves its usability and reliability. Finally, the OSEC differs from previously validated audit tools in that it is easy to score and interpret, and all the materials were designed for easy dissemination (Lane et al., 2020). Future studies should focus on understanding administration, scoring, and interpretation among nonresearchers.

Given that school environments are audited for a variety of purposes (e.g., monitor implementation of interventions or programs, compare schools with one another, examine whether schools’ environments align with policy requirements or recommendations for best practices), it is important for tools to be flexible (Glanz et al., 2015; Glasgow & Riley, 2013; Lane et al., 2020; Morton et al., 2016). The OSEC can help define school or school district needs related to creating a more health-promoting environment or selecting and evaluating interventions or modifications to school environments. Additionally, this tool can monitor implementation of federal policies. Federal law requires school districts to write local wellness policies (guidelines to create a health-promoting environment based on best practices for healthy eating and physical activity), which are implemented at the school level. Districts are also required to monitor and report on this implementation (Local School Wellness Policy Implementation Under the Healthy, Hunger-Free Kids Act of 2010, 2016). The OSEC is a research-tested tool with a simple scoring rubric, which captures the data needed to meet these reporting requirements and provides shareable information. Future research should capture not only use by nonresearchers but also the extent to which findings translate to policy and practice.

The wide range of scores across attractiveness, atmosphere, advertising, and access in the 18 schools in this study demonstrates the OSEC’s potential to detect differences across individual schools in multiple areas of the school environment. Similar to other audit tool reliability studies, however, we were unable to broadly establish the validity of nonbinary variables, which introduced subjectivity (e.g., condition of outdoor play areas; Jones et al., 2010). With the exception of the cafeteria, these items were removed from the tool to encourage objective observations that were feasible to score. Future research should consider augmenting the OSEC with other measures, such as teacher or student surveys or qualitative methods, to assess condition and use of health-promoting infrastructure.

This study has several strengths. First, tool development was an iterative process that prioritized both the needs of researchers and feasible, minimally burdensome administration. Second, the applicability of the scoring was supported by use across two different studies and in both urban and suburban environments, suggesting potential broad use, although more testing is needed in rural areas. The two studies measured different health-related outcomes, which allowed the tool to be developed with the intention to assess multiple outcomes and take a comprehensive approach for measuring the health-promoting school environment. Third, the research team prioritized complete availability of materials (tool, training manual, and scoring protocol) so that other researchers or school officials can use the tool consistently, enabling comparisons across schools and school districts, which can increase generalizability of the findings.

Limitations

Several study limitations should be considered. First, percent agreement for the constructs was calculated in reference to the 75 items remaining in the tool after the revision process; therefore, further testing should be done when modified versions of the OSEC are used. The tool can be adapted and revised to meet the needs of the researcher or stakeholder and to address variability that exists in school environments, but amended versions should be tested to ensure the constructs hold true and the scoring procedure can be implemented. Additional testing is especially warranted for attractiveness, which demonstrated only moderate agreement. We only discuss the reliability and scoring protocol for the binary items, so further investigation of the continuous and open-ended questions should occur. Second, while the scores ranged across schools, we did not have adequate sample size to examine statistical differences by demographic and school characteristics, nor was it within our scope to assess the tool’s ability to detect changes in the environment over time or between school types (e.g., middle vs. elementary). The ability to measure changes, particularly those resulting from policy implementation and intervention efforts, is a key area for future research. Additionally, resource constraints and school restrictions limited our auditing team to two auditors per school (with a third conducting photo review), and our time frame (<2 hours) is an estimate. Future reliability studies may benefit from having more auditors per school and recording exact time per audit. Finally, while the tool was designed to be usable for school stakeholders, additional research is required to evaluate actual use by nonresearchers.

Implications for Practice and Future Research

The OSEC audit tool can be administered with few resources, has acceptable interrater reliability, and provides a method to comprehensively assess the areas where students can engage in healthy eating and physical activity at school. Future studies can use this reliable tool to assess associations between school environments and student behaviors. Results from the OSEC can inform and assess the progress of school-based interventions and improvement plans that aim to create a healthier environment for students.

Supplementary Material

Supplemental material

Authors’ Note:

This study was supported by funding from the following sources: National Institutes of Health, National Institute of Diabetes and Digestive and Kidney Diseases (Grant ID: F32DK115146 and T35DK095737), and National Heart, Lung, and Blood Institute (Grant ID: K12HL13830); U.S. Department of Agriculture (USDA) Agriculture and Food Research Initiative (Grant ID: 2016-68001-2492) and a USDA 2016 Team Nutrition Training Grant.

Footnotes

Supplemental Material

Supplemental material for this article is available at https://journals.sagepub.com/home/hpp.
