Author manuscript; available in PMC: 2021 Nov 9.
Published in final edited form as: J Safety Res. 2020 Jul 9;74:279–288. doi: 10.1016/j.jsr.2020.06.010

The association between subcontractor safety management programs and worker perceived safety climate in commercial construction projects

Ann Marie Dale a, Ryan Colvin a, Marco Barrera a, Jaime R Strickland a, Bradley A Evanoff a
PMCID: PMC8577185  NIHMSID: NIHMS1636565  PMID: 32951793

Abstract

Problem:

Safety management programs (SMPs) are designed to mitigate risk of workplace injuries and create a safe working climate. The purpose of this project was to evaluate the relationship between contractors’ SMPs and workers’ perceived safety climate and safety behaviors among small and medium-sized construction subcontractors.

Methods:

Subcontractor SMP scores on 18 organizational and project-level safety items were coded from subcontractors’ written safety programs and interviews. Workers completed surveys to report perceptions of their contractor’s safety climate and the safety behaviors of coworkers, crews, and themselves. The associations between SMP scores and safety climate and behavior scales were examined using Spearman correlation and hierarchical linear regression models (HLM).

Results:

Among 78 subcontractors working on large commercial construction projects, we found striking differences in SMP scores between small, medium, and large subcontractors (p<0.001), related to a number of specific safety management practices. We observed only weak relationships between SMP scales and safety climate scores reported by 746 workers of these subcontractors (β=0.09, p=0.04 by HLM). We saw no differences in worker reported safety climate and safety behaviors by contractor size.

Discussion:

SMP only weakly predicted safety climate scales of subcontractors, yet there were large differences in the quality and content of SMPs by size of employers.

Summary:

Future work should determine the best way to measure safety performance of construction companies and determine the factors that can lead to improved safety performance of construction firms.

Practical applications:

Our simple assessment of common elements of safety management programs used document review and interviews with knowledgeable representatives. These methods identified specific safety management practices that differed between large and small employers. In order to improve construction safety, it is important to understand how best to measure safety performance in construction companies to gain knowledge for creating safer work environments.

Keywords: leading indicators, injury prevention, construction, safety climate, safety management systems

PROBLEM

Construction is the most hazardous industry in the US, with the highest number of fatalities of any industry and high rates of nonfatal injuries (McCoy, Kucera, Schoenfisch, Silverstein, & Lipscomb, 2013; U.S. Bureau of Labor Statistics (BLS), 2013). Safety in construction is complex, due to rapidly changing work environments, unique hazards of the industry, and the organizational issues of coordinating interactions between multiple contractors (National Institute for Occupational Safety and Health (NIOSH), 2013a; Ringen, Seegal, & Englund, 1995). Further complicating work organization is the large number of small construction contractors, whose owners often lack sufficient safety resources, and whose workers suffer a disproportionate number of fatalities compared to the overall sector (CPWR, 2007).

Recent approaches to improve construction safety have focused on measuring the effect of safety initiatives through leading and lagging indicators of safety. The construction industry has traditionally measured safety using “lagging” indicators such as fatalities, recordable injuries, lost time, and safety incidents. Although these measures are easy to collect, easy to understand, and can be used to benchmark against other employers, they are measures of after-the-fact failures of safety programs, rather than measures of program effectiveness. These lagging indicators of safety are insensitive to change, do not provide timely information, and do not measure barriers to change or actions taken to address workplace hazards (Grabowski, Ayyalasomayajula, Merrick, & Mccafferty, 2007; Hinze, Thurman, & Wehle, 2013; Trethewy, 2003). Leading indicators such as safety walk-throughs and inspections, preplanning task logs, and reports of safety behavior provide more timely and relevant safety information, and have been linked to injury prevention, particularly in those programs which focus on positive performance and depend on worker participation (Choudhry, Fang, & Lingard, 2009; Institute for Work and Health, 2011; Laitinen, Marjamaki, & Paivarinta, 1999; Mikkelsen, Spangenberg, & Kines, 2010; Toellner, 2001).

Safety climate, a measure of workers’ shared perceptions regarding the importance of safety in their organization, has been accepted as a leading indicator of the underlying risk of worker injuries on construction projects (Hecker & Goldenhar, 2014). Commonly proposed factors of safety climate include management commitment to safety, safety communication, and worker involvement in safety efforts (Cigularov, Chen, & Rosecrance, 2010; Ismail, Doostdar, & Harun, 2012; Kines, et al., 2010). A number of researchers have used safety climate measures in construction (Cheyne, Cox, Oliver, & Tomas, 1998; Dedobbeleer & Béland, 1991; National Institute for Occupational Safety and Health (NIOSH), 2013c; Zohar, 2000), but there is no single accepted measure of safety climate for the construction industry. Safety climate measures have predicted injuries and safety behaviors of workers in several industries (Choudhry, et al., 2009; Gittleman, et al., 2010; Sokas, Jorgensen, Nickels, Gao, & Gittleman, 2009), although this literature is mixed. A recent study of the S-CAT construction-specific safety climate assessment in 49 firms found that the overall safety climate score and 7 of 8 subscales were significantly correlated with the firms’ recordable injury rates (Probst, Goldenhar, Byrd, & Betit, 2019). However, other recent studies in construction have not found a robust relationship between safety climate measures and injuries or measures of safety practices (Marín, Lipscomb, Cifuentes, & Punnett, 2017; Sparer, Murphy, Taylor, & Dennerlein, 2013; Versteeg, Bigelow, Dale, & Chaurasia, In press).

A key element of a company’s safety performance is its Safety Management Program, which describes the actions and policies taken to reduce or eliminate exposure to safety risks that lead to injuries and illnesses (Vinodkumar & Bhasi, 2011). Traditional safety programs are reactive, consisting of safety elements that respond to safety laws and regulations and to requirements placed on workplaces by general contractors (Hadjimanolis & Boustras, 2013; Herrero, Saldaña, del Campo, & Ritzel, 2002). These programs comprise a group of discrete activities that are not integrated with the other organizational management activities of the business. Improved safety performance has been achieved in the high-risk oil and gas industry by integrating safety into all organizational policies, procedures, and practices (Mearns, Whitaker, & Flin, 2003; Skogdalen, Utne, & Vinnem, 2011). These safety management systems apply safety in a consistent manner throughout the organization, proactively identify potential risks, actively involve workers, and have explicit safety activities independent of other organizational management activities (Hsu, Li, & Chen, 2010). There is an underlying assumption that safety management programs will reduce the risks that ultimately lead to worker injuries. OSHA (2016) has published a set of recommended practices for safety and health programs specific to the construction industry to help reduce the hazards that lead to worker injuries; these practices serve as a guide to effective safety management programs. All safety management programs contain many of the same safety initiatives and elements, but the integration of these activities into safety management systems is needed to produce a consistently safe environment.

There is growing interest in describing the elements of effective safety management programs. There is consensus for including some elements: leadership commitment, regular and frequent hazard recognition and controls, employee training, safety communication, and program evaluation (Occupational Safety & Health Administration, 2016). However, there is no commonly accepted scale or checklist that can be used to compare safety management programs across different construction companies, nor is there consensus on which specific safety program elements are most important to drive workplace safety. Figure 1 shows a conceptual model illustrating the simple relationship between safety management programs and leading and lagging indicators (Neal, Griffin, & Hart, 2000). Similar models have been used in other safety climate research (Marín, et al., 2017; Neal, et al., 2000; Sparer, et al., 2013) to examine whether improved safety management practices would drive improved leading indicators of safety such as safety climate and safety performance, which should in turn be linked to lower injury rates.

Figure 1: The relationship between safety management programs and leading and lagging indicators.

The purpose of this study was to examine the association between safety management programs and worker-perceived safety among subcontractors on large commercial construction projects. To test the hypothesis that a measure of safety management programs would be positively associated with measures of worker-perceived safety, we scored the safety management programs (SMPs) of subcontractors in commercial construction and examined the agreement between SMP scores and five measures of safety reported on worker questionnaires.

METHODS

Study design

We partnered with six general contractors with construction projects of at least 12 months in duration to gain access to a group of subcontractors from a variety of trades. Eligible subcontractors planned to work on the project for at least 30 days, and to employ at least two workers on the project. The general contractors informed the subcontractors about the research prior to their start at the jobsite, and provided access to workers on the job site. Subcontractors were asked to participate in two interviews, to provide a copy of their written safety program and other safety documents, and to provide access to their workers to collect surveys.

Workers were invited to complete surveys at two points in time: when they first entered the job site, and after working on the project for at least 30 days. The surveys took 10–15 minutes to complete; workers provided informed consent and were compensated for survey completion. The Institutional Review Board at Washington University approved all research activities.

Subcontractor interviews

Each recruited subcontractor was asked to provide a knowledgeable representative to participate in semi-structured interviews to learn about their safety policies, practices, and programs. The interviewee was most often the safety director, superintendent, project manager, or owner of the company. Interviews were approximately 30 minutes in duration, conducted in person or by phone, and were audio-recorded with consent from the interviewee. The questions covered the safety background of the company safety representative, a review of a checklist of safety policies, practices, and programs in the company safety program, and a description of any new safety equipment or activities implemented on the current project due to expectations of the general contractor. Each contractor provided an estimate of the average number of total workers employed in the past year. For analysis, contractors were assigned an employer size a priori, based on the distribution of employers found on commercial construction projects, as follows: large (>200 workers), medium (51–200 workers), and small (0–50 workers). All interviews were transcribed and used in conjunction with the written safety plans to code 18 items from the subcontractor safety management program.
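For clarity, the a priori size assignment described above can be expressed as a simple rule; the sketch below is illustrative only (the function name is ours, not part of the study's analysis).

```python
def assign_employer_size(avg_workers):
    """Assign the a priori size category used in the study:
    small (0-50 workers), medium (51-200), large (>200)."""
    if avg_workers <= 50:
        return "small"
    if avg_workers <= 200:
        return "medium"
    return "large"

# Example: the median subcontractor in this sample (65 workers) is "medium".
print(assign_employer_size(65))  # medium
```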

Subcontractor safety management program

We developed an 18-item Construction Safety Management Program Checklist (SMP checklist), drawn primarily from the model program developed by the St. Louis Council of Construction Consumers (2011) and similar to items described in other studies (Marín, et al., 2017; Sparer, et al., 2013). These items covered four domains: management commitment (7 items), worker participation (5 items), hazard identification (3 items), and training (3 items). Each item was scored “yes” if the item criteria were verified or stated as part of the safety program, “no” if not met, and “partial” if the description met a portion of the “yes” criteria. Some items were scored “partial” if the criteria met OSHA regulations and “yes” if the safety program exceeded OSHA regulations. The coding criteria are listed in Table 1. Most criteria required verification from the written safety plan or a detailed description from the interviewee. We requested that the interviewee provide examples of safety practices to verify the item was enacted and not just stated in their safety plan. Two of the authors (MB and AMD), who regularly visited the job sites and conducted the interviews, coded each SMP checklist item using information from the written safety plan and the transcribed interviews (described below). The authors listed the supporting statement from the interview and/or source page from the safety plan for each item; each author then coded the item as Y, partial, or N. After completion, the authors discussed any disagreements in coding to reach consensus. The final item scores were assigned values of 1 for “yes,” 0.5 for “partial,” and 0 for “no.” The summated score for each subcontractor (possible range 0–18) was standardized from 0 to 100 for use in analyses.
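To make the scoring rule concrete, the sketch below (illustrative only; the variable and function names are ours, not the study's) converts 18 coded items into the standardized 0–100 checklist score described above.

```python
# Coded response for each of the 18 checklist items: "yes", "partial", or "no".
ITEM_VALUES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def smp_checklist_score(coded_items):
    """Sum item values (possible range 0-18) and rescale to 0-100."""
    assert len(coded_items) == 18, "the checklist has 18 items"
    raw = sum(ITEM_VALUES[item] for item in coded_items)
    return raw / 18 * 100

# Example: 12 "yes", 4 "partial", and 2 "no" give a raw score of 14,
# or about 77.8 on the standardized 0-100 scale.
example = ["yes"] * 12 + ["partial"] * 4 + ["no"] * 2
print(round(smp_checklist_score(example), 1))  # 77.8
```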

Table 1.

Safety program constructs and 18 safety elements used to compute safety checklist score, N=78 subcontractors

Item Constructs Elements Response Definition/Source Overall N = 78 Small N = 32 Medium N = 32 Large N = 14 p-value**
1 Organizational Management Written Safety Program Y Company provided copy of their written safety program or interviewee stated they have a written program 74 (94.9) 30 (93.8) 31 (96.9) 13 (92.9) 0.361
N Interviewee reported they do not have a written safety program 4 (5.1) 2 (6.2) 1 ( 3.1) 1 ( 7.1)
2 Safety Goal Y Written program or interviewee stated the company safety objective or goal focused on preserving the health and safety of workers 61 (78.2) 25 (78.1) 23 (71.9) 13 (92.9) 0.273
Partial Stated safety objective is only to reduce injuries or Experience Modification Rate (EMR) 9 (11.5) 3 (9.4) 5 (15.6) 1 ( 7.1)
N No company objective or goal related to safety 8 (10.3) 4 (12.5) 4 (12.5) 0 ( 0.0)
3 Monitor Progress in Safety Y Written safety program or interviewee stated company monitors progress toward safety goals via leading indicators (monitors safety activities and performance of safety program). 11 (14.1) 3 (9.4) 3 ( 9.4) 5 (35.7) 0.009
Partial Company monitors progress in safety via OSHA log and Experience Modification Rate only. 53 (67.9) 20 (62.5) 24 (75.0) 9 (64.3)
N Interviewee states company does not monitor improvement in safety 14 (17.9) 9 (28.1) 5 (15.6) 0 ( 0.0)
4 Safety Budget Y Written program or interviewee states company willingly provides resources to improve safety such as for safety equipment, initiatives, programs, manpower, over and above PPE only. 22 (28.2) 5 (15.6) 9 (28.1) 8 (57.1) 0.259
Partial Written program or interviewee states budget for safety only covers PPE and other OSHA required equipment 42 (53.8) 22 (68.8) 17 (53.1) 3 (21.4)
N Interviewee states company does not cover costs for any safety items or activities 14 (17.9) 5 (15.6) 6 (18.8) 3 (21.4)
5 Designated Safety Representative Y Company has a designated safety person who is responsible for the written safety program, oversees safety in all company activities, conducts audits on projects, etc. This role must be performed the majority of the time. 26 (33.3) 2 (6.2) 14 (43.8) 10 (71.4) 0.000
Partial Designated safety person performs some company level safety activities but these activities are performed only part-time in conjunction with other roles 43 (55.1) 24 (75.0) 16 (50.0) 3 (21.4)
N There is no designated person responsible for safety or designated person only monitors OSHA log and collecting safety documents from projects (i.e. Weekly TBT). 9 (11.5) 6 (18.8) 2 ( 6.2) 1 ( 7.1)
6 Management Jobsite Inspections Y Management conducts job site inspections or audits that are clearly described (set frequency, defined process, results in action plan) as stated in written program or interview 41 (52.6) 10 (31.2) 21 (65.6) 10 (71.4) 0.001
Partial Only mentions management inspections without describing any details as stated in written program or interview 22 (28.2) 11 (34.4) 7 (21.9) 4 (28.6)
N Management does not conduct job site inspections or audits 15 (19.2) 11 (34.4) 4 (12.5) 0 ( 0.0)
7 Enforcement Policy Y Enforcement Policy/Disciplinary action clearly detailed by interviewee or in written program 55 (70.5) 17 (53.1) 25 (78.1) 13 (92.9) 0.027
Partial Only mentions having an enforcement policy but gives no detail in interview or written program 11 (14.1) 8 (25.0) 2 ( 6.2) 1 ( 7.1)
N No enforcement policy stated in interview or written program 12 (15.4) 7 (21.9) 5 (15.6) 0 ( 0.0)
8 Training New Hire Orientation* Y Describes a formal in person or online orientation for new hires that reviews the company safety policies and programs 37 (47.4) 9 (28.1) 19 (59.4) 9 (64.3) 0.015
Partial Written safety program is only distributed to new hires and requires worker to sign a statement that it has been received 27 (34.6) 14 (43.8) 10 (31.2) 3 (21.4)
N No company new hire orientation stated in interview or written program 14 (17.9) 9 (28.1) 3 ( 9.4) 2 (14.3)
9 Toolbox Talks Y Toolbox talks delivered regularly, mandated by company, regardless of general contractor expectations as stated by interview or in written program. 57 (73.1) 18 (56.2) 26 (81.2) 13 (92.9) 0.013
N Only delivers TBT when mandated by general contractor or not stated in interview or written program 21 (26.9) 14 (43.8) 6 (18.8) 1 ( 7.1)
10 Other Safety Trainings* Y Worker trainings delivered by union or company on a regular basis or as required by general contractors over and above worker initial training as stated in interview or described in written program 76 (97.4) 31 (96.9) 31 (96.9) 14 (100.0) 0.805
N No information on training beyond TBT by union or company stated in interview or described in written program 2 (2.6) 1 (3.1) 1 ( 3.1) 0 ( 0.0)
11 Worker Participation Substance Use Policy Y Substance Use Policy clearly detailed by interviewee or in written program. May defer to union policy. 76 (97.4) 30 (93.8) 32 (100.0) 14 (100.0) 0.235
N No Substance Use Policy stated or clearly described in interview or written program 2 (2.6) 2 (6.2) 0 ( 0.0) 0 ( 0.0)
12 Stop Work Rights* Y Workers have the authority to stop work if they feel work is unsafe, without fear of retribution, stated in interviews or specifically detailed in the written safety program. This is a formal right and not just assumed to be a worker right by management. 40 (51.3) 12 (37.5) 18 (56.2) 10 (71.4) 0.082
N No policy stated in the written safety program or interview. 38 (48.7) 20 (62.5) 14 (43.8) 4 (28.6)
13 Worker Safety Suggestions Encouraged* Y Company seeks worker input on safety improvement through a formal process as stated in interview or in written program 33 (42.3) 6 (18.8) 16 (50.0) 11 (78.6) 0.000
Partial State they listen to worker suggestions but do not have a process to formally ask for worker suggestions as stated in interview or written program 40 (51.3) 21 (65.6) 16 (50.0) 3 (21.4)
N No formal process stated to seek worker suggestions or input for company safety activities in interview or written program 5 (6.4) 5 (15.6) 0 ( 0.0) 0 ( 0.0)
14 Worker Suggestions Used* Y Interviewee describes specific changes made to company safety activities or program in response to worker suggestions 17 (21.8) 4 (12.5) 6 (18.8) 7 (50.0) 0.033
Partial Interviewee reports company has made changes but gave no examples of worker ideas used for company safety 19 (24.4) 7 (21.9) 10 (31.2) 2 (14.3)
N Interviewee reports no changes were made to company safety from worker suggestions 42 (53.8) 21 (65.6) 16 (50.0) 5 (35.7)
15 Safety Committee Y Company has a formal meeting to discuss company safety issues on a regular basis with representation from management and workers (including foreman) as stated in interview or written program 11 (14.1) 1 (3.1) 4 (12.5) 6 (42.9) 0.000
Partial Company safety meetings only include management representation 24 (30.8) 7 (21.9) 12 (37.5) 5 (35.7)
N No company meetings for safety stated in the interview or written program 43 (55.1) 24 (75.0) 16 (50.0) 3 (21.4)
16 Hazard Identification/ Controls Job Hazard Analysis Y Company has a job hazard analysis document (JSA/JHA/PTP) used in all projects regardless of general contractor expectations as stated in interview and written program 43 (55.1) 12 (37.5) 18 (56.2) 13 (92.9) 0.002
N No job hazard analysis document or process consistently used in projects as stated in interview or written program 35 (44.9) 20 (62.5) 14 (43.8) 1 ( 7.1)
17 Incident Investigation Y Company has a formal incident investigation procedure stated in interview or in written program 47 (60.3) 15 (46.9) 21 (65.6) 11 (78.6) 0.073
Partial Company incident investigation procedure is not formal or well described as stated in interview 24 (30.8) 12 (37.5) 10 (31.2) 2 (14.3)
N No company investigations of incidents stated in interview or written program 7 (9.0) 5 (15.6) 1 ( 3.1) 1 ( 7.1)
18 PPE Policy Y Company has a PPE policy stated in interview or written program 75 (96.2) 30 (93.8) 32 (100.0) 13 (92.9) 0.343
N Company has no PPE policy or only follows the PPE policy from the general contractor as stated in interview or written program 3 (3.8) 2 (6.2) 0 ( 0.0) 1 ( 7.1)

Constructs and items derived from the St. Louis Council of Construction Consumers, 2011

* Information may come from worker interviews in addition to subcontractor representative interviews and the written safety program.

** p-value from one-way ANOVA.

Worker Survey of safety scales

The worker surveys covered information about the type of trade, tenure in trade, duration of time employed by their subcontractor, injuries while on the project, and perception of safety and safety performance related to different trades on the project. The workers’ perceptions of safety were measured by five safety climate and behavior scales across different levels of the project organization: 1) perception of safety climate of the general contractor (10 items) adapted from Zohar and Luria (2005); 2) safety climate of their subcontractor (11 items) (Zohar & Luria, 2005); 3) safety behaviors of co-workers (8 items) (Brondino, Silva, & Pasini, 2012); 4) perceptions of their own safety behaviors (6 items) (Neal & Griffin, 2006); and 5) specific safety behaviors of their crews (6 items) (Kaskutas, et al., 2010). The coworker safety behavior items addressed worker perceptions of coworkers’ general attitudes toward safety (e.g., care about others’ safety awareness; encourage each other to work safely); crew behavior items referred to specific behaviors (e.g., use proper personal protective equipment (hard hats and safety glasses) at all times; my crew always works behind guard rails or is tied off (personal fall arrest system)). These safety climate and behavior scales were standardized to scores from 0 to 100.
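The paper does not specify the exact rescaling formula; a common approach, assumed here for illustration, is to average the Likert items within a scale and linearly map the mean onto 0–100 (item anchors of 1–5 are an assumption).

```python
def rescale_to_100(item_responses, item_min=1, item_max=5):
    """Average the Likert items in a scale and linearly rescale the mean to 0-100.
    item_min and item_max are the anchors of the response scale (assumed 1-5)."""
    mean_score = sum(item_responses) / len(item_responses)
    return (mean_score - item_min) / (item_max - item_min) * 100

# Example: mostly 4s and 5s on an 11-item subcontractor safety climate scale
# yields a score in the 80s, similar to the means reported in Table 2.
responses = [4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5]
print(round(rescale_to_100(responses), 1))  # 84.1
```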

Data Analysis

We selected all subcontractors with a measure of safety management program (summated score of 18 items) and with at least one worker survey completed after working on the construction project for at least 30 days. Ten subcontractors worked on projects for more than one of the general contractors (one small-sized subcontractor, four medium-sized subcontractors, and five large-sized subcontractors); for these subcontractors, we included data from the project with the largest number of completed worker surveys in the analysis.

Among the subcontractors, we examined descriptive data for the workers (sex, age, race, and trade) and the distribution of safety scales across all subcontractors, both overall and stratified by contractor size (small, medium, and large). The relationship between contractor size and safety checklist score was assessed using one-way ANOVA, with post-hoc pairwise comparisons made using Tukey’s HSD tests.
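A minimal sketch of that comparison, assuming the SMP checklist scores are held in a data frame with one row per subcontractor (the file and column names are illustrative, not the authors' own), is shown below.

```python
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# One row per subcontractor: 'smp_score' (standardized 0-100 checklist score)
# and 'size' (small / medium / large). File and column names are hypothetical.
df = pd.read_csv("subcontractor_smp.csv")

# One-way ANOVA of SMP checklist score across the three size categories
groups = [g["smp_score"].to_numpy() for _, g in df.groupby("size")]
f_stat, p_value = f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey's HSD post-hoc pairwise comparisons between size categories
tukey = pairwise_tukeyhsd(endog=df["smp_score"], groups=df["size"])
print(tukey.summary())
```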

The relationships between the five safety climate and behavior scales (general contractor safety climate, subcontractor safety climate, coworker safety behaviors, crew safety behaviors, and self-rated safety behaviors) and subcontractor-level variables (SMP checklist score and subcontractor size) were also assessed. Since safety climate and behavior scales were measured at the individual worker level, for these analyses each worker was assigned their subcontractor’s SMP checklist score and size. We conducted several hierarchical models to account for clustering of worker-level safety climate scales within subcontractors when examining subcontractor size and SMP checklist score. First, we examined the worker safety scale scores by subcontractor size with means and standard deviations. Then we conducted hierarchical linear models with safety scale scores as the dependent variable and subcontractor size as the independent variable, with subcontractor as a random effect to account for within-subcontractor clustering. Next, we assessed the relationship between safety climate scales and SMP checklist score with Spearman correlation coefficients. Then we constructed hierarchical linear models relating safety scales to SMP checklist scores, with subcontractor as a random effect to account for clustering. A random effect for general contractor, random slopes for general contractor and subcontractor, and a fixed effect for subcontractor size were also considered for the hierarchical models, but ultimately not included due to lack of significance and model fit issues.
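Under the same assumption about variable names, the sketch below illustrates the worker-level analyses: a Spearman correlation between one safety scale and the SMP score, and a mixed (hierarchical) linear model with a random intercept for subcontractor.

```python
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.formula.api as smf

# One row per worker: each safety scale (0-100), the subcontractor id, and the
# SMP checklist score of that worker's subcontractor. Names are hypothetical.
workers = pd.read_csv("worker_surveys.csv")

# Spearman correlation between a worker-level safety scale and the SMP score
rho, p = spearmanr(workers["sub_safety_climate"], workers["smp_score"])
print(f"Spearman r = {rho:.2f}, p = {p:.3f}")

# Hierarchical linear model: the random intercept for subcontractor accounts
# for clustering of workers within the same employer.
model = smf.mixedlm("sub_safety_climate ~ smp_score",
                    data=workers,
                    groups=workers["subcontractor_id"])
result = model.fit()
print(result.summary())  # the smp_score slope corresponds to beta in Table 3
```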

RESULTS

We recruited 78 subcontractors working on the projects managed by six large general contractors. Of the 78 subcontractors, 14 were categorized as large (>200 employees), 32 medium (51–200 employees), and 32 small (0–50 employees). The median number of employees was 65 (range seven to 2500).

A total of 746 workers completed questionnaires. These workers were predominantly male (98%) and Caucasian (87%), had a mean age of 39 years (SD 11) and a mean tenure with their current subcontractor of 4 years (SD 5.9), and were employed in 18 different construction trades including Electrical (19%), Carpentry (12%), Ironwork (9%), Pipefitting (9%), Sheet metal (7%), and Drywall (6%).

Table 1 shows the 18 safety elements that comprise the SMP checklist along with the distribution of subcontractor responses to the items. The majority of the subcontractors fully met the criteria for several items: union- or contractor-provided safety training (97.4%), substance use policy (97.4%), PPE policy (96.2%), and written safety program (94.9%). The items most often not implemented, coded as “no,” were the following: no company safety committee (55.1%); no job hazard analysis consistently used on projects (44.9%); no worker-suggested changes made to the safety program (53.8%); and no policy stating workers’ stop-work rights (48.7%). Some items were scored as “partially met” if the activity met OSHA requirements but did not exceed these requirements. Among the items where a large portion of employers achieved but did not exceed OSHA compliance requirements were the following: the company monitors safety goals only through lagging indicators (OSHA log and Experience Modification Rate) (67.9%), and the company reports that the safety budget covers only PPE and other OSHA-required equipment (53.8%).

The 18 safety items were categorized by employer size. A one-way ANOVA of SMP checklist scores across contractor size groups showed highly statistically significant differences in SMP scores, as shown in Table 2. These differences were evident for the management-level activities, with smaller contractors having fewer safety management practices to encourage or monitor progress toward safety. Smaller contractors were less likely to use leading indicators (safety activities and safety behaviors), employ a designated safety representative to oversee safety in all company activities, or conduct management audits on projects. Few small contractors sought worker suggestions for safety improvement, and few incorporated these suggestions into their safety programs. Fewer small contractors mandated typical safety activities, such as toolbox talks when not required by the general contractor, or robustly enforced their safety policies. Safety activities such as inspections of equipment or jobsite audits were often less formal among small contractors. In contrast, large contractors more often demonstrated management commitment and worker involvement through active safety processes such as regular safety committee meetings that included workers, monitoring progress toward safety goals with leading indicators, and a budget for safety resources (safety budget for new programs and equipment, employing safety personnel).

Table 2.

Safety climate and safety behavior scales, and safety checklist scores by subcontractor size

Total Small Medium Large P*
Subcontractor safety checklist
 Mean ± SD 68.3 ± 19.3 58.1 ± 17.4 71.8 ± 16.2 83.7 ± 17.7 <0.0001
 Range 5.6–100.0 5.6–86.1 38.9–94.4 33.3–100.0

Safety climate/behavior scales
 Subcontractor safety climate, mean ± SD 82.8 ± 15.6 81.3 ± 15.3 83.2 ± 15.9 83.4 ± 15.5 0.27
 Coworker safety behaviors, mean ± SD 81.8 ± 15.5 81.2 ± 14.9 82.0 ± 15.5 82.1 ± 16.1 0.56
 Self-safety behaviors, mean ± SD 83.1 ± 15.1 82.5 ± 14.7 83.6 ± 14.6 82.9 ± 16.1 0.59
 Crew safety behaviors, mean ± SD 84.9 ± 14.2 84.8 ± 13.5 85.2 ± 14.3 84.5 ± 14.4 0.56
 General Contractor safety climate, mean ± SD 75.0 ± 17.8 74.6 ± 16.1 74.3 ± 19.1 76.3 ± 17.0 0.99
* Safety checklist score: one-way ANOVA. Pairwise p-values: L-M = 0.08; L-S < 0.0001; M-S = 0.005.

Safety climate/behavior scales: p-values from mixed linear models with the safety scale as dependent variable and subcontractor size as independent variable. Random intercepts for subcontractors were included to account for within-subcontractor clustering.

Although there were notable differences in safety program elements across contractors of various sizes, we found little difference in worker-reported safety climate and behavior scales by contractor size, as shown in Table 2. Safety climate and behavior measures at the subcontractor, crew, and coworker levels were similar across contractor sizes. In general, workers rated their own safety behaviors and those of their coworkers, crews, and subcontractors higher than they rated the safety climate of their general contractors. Hierarchical linear regression models for each safety scale by contractor size showed no statistical difference in safety scale scores across the size groups.

An assessment of the relationship between SMP checklist scores and the five safety climate scale metrics reported by workers showed only weak correlations (Table 3). The only statistically significant relationship was found between subcontractor safety climate and SMP checklist score (r=0.08, p=0.02 by Spearman correlation, and β=0.09, p=0.04 by HLM).
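To put the size of this association in context, a back-of-the-envelope calculation (an illustration only, assuming the HLM coefficient of 0.09 represents the change in climate score per one-point change in the standardized SMP checklist score, as the Table 3 footnote indicates, and applies linearly across the observed range) compares the predicted climate difference between the mean SMP scores of small and large subcontractors with the variability of the climate scale:

```latex
% Predicted difference in subcontractor safety climate between the mean SMP
% checklist scores of small (58.1) and large (83.7) subcontractors (Table 2),
% using the HLM slope beta = 0.09 from Table 3:
\Delta\widehat{\mathrm{climate}} = \beta \times \Delta\mathrm{SMP}
  = 0.09 \times (83.7 - 58.1) \approx 2.3 \ \text{points}
```

A predicted difference of roughly 2.3 points is small relative to the within-group standard deviations of about 15 points reported for the climate scales in Table 2, consistent with characterizing the association as statistically significant but weak.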

Table 3.

Relationship between safety climate and safety behavior scales to subcontractor safety checklist scores

Spearman correlation coefficient Linear regression model*

Safety climate/performance scores N r p Beta** p
Subcontractor safety climate 732 0.08 0.02 0.09 0.04
Coworker safety behaviors 733 0.07 0.06 0.08 0.09
Self-safety behaviors 728 0.06 0.13 0.05 0.20
Crew safety behaviors 729 0.06 0.13 0.07 0.11
General contractor safety climate 739 0.03 0.49 −0.004 0.96
* Mixed linear models with the safety score as dependent variable and subcontractor checklist score as independent variable. Random intercepts for subcontractor were included to account for within-subcontractor clustering.

** Change in safety score per unit change in safety checklist score.

DISCUSSION

Our study sought to examine the associations between a measure of subcontractors’ safety management programs (the SMP checklist) and their workers’ reported safety climate on five different scales. Among 78 subcontractors working on large commercial construction projects, we found striking differences in SMP scores between small, medium, and large subcontractors, related to a number of specific safety management practices that differed between large and small subcontractors. However, we observed only weak relationships between SMP scores and safety climate scores reported by 746 workers at these subcontractors. We saw no differences in worker-reported safety climate and safety behaviors between small, medium, and large subcontractors. Workers also rated the safety climate of their subcontractor higher than they rated the safety climate of the general contractors on the current project.

The differences in SMP scores between small and large contractors are consistent with other literature. The majority of construction employers are small, employing fewer than 50 employees (Bureau of Labor Statistics, 2018). Small contractors often lack sufficient safety resources and have a disproportionate number of fatalities compared to the overall sector (CPWR, 2007). Small construction employers have been slower to adopt high-performing safety management programs compared to employers in other high-risk industries (Hasle, Kines, & Andersen, 2009). Even though small contractors may desire to create a safer work environment for their employees, their lack of resources and awareness are barriers to implementing recommended health and safety management programs (National Institute for Occupational Safety and Health (NIOSH), 2013b).

In contrast to SMP scores, differences in worker reported safety climate were not seen across different sizes of subcontractors; overall, the reported perceptions of safety climate were high across subcontractors, and lower for the general contractor. Lingard et al (2009) found high levels of within workgroup homogeneity on safety climate dimensions, but significant between‐group differences in perception of supervisory leadership and coworkers’ actual safety behavior, and noted that aggregating safety climate data at the organizational level can mask important differences. Our study did not have enough data from each subcontractor to study the effects of workgroup size and within‐group interactions. Safety climate reflects workers’ perceptions of their employer and co-workers; in stably employed workers (mean tenure of 4 years with the current subcontractor), the workers’ perceptions of their own employer may be better than their perception of the general contractor, on whose site the workers had been for only a month.

Our study found only weak relationships between measures of safety climate and the employers’ SMP checklist scores. Worker-perceived subcontractor safety climate was significantly associated with the subcontractor’s SMP score, while coworker safety behaviors trended toward statistical significance. While safety climate is promoted as a leading indicator of safety, the literature is mixed with regard to its ability to predict safety incidents or injuries. Some studies have found a link between safety performance and safety climate measures (Christian, Bradley, Wallace, & Burke, 2009; Johnson, 2007); other studies have shown relatively weak associations (Givehchi, Hemmativaghef, & Hoveidi, 2017).

A recent study of 401 construction workers employed by 68 companies found only weak correlations between workers’ safety climate scores and their employers’ safety performance scores (Sparer, et al., 2013). In a sample of 25 commercial construction companies in Colombia, Marín et al. (2017) examined the relationship between workers’ safety climate perceptions and safety management practices reported by company safety officers. Their study found a moderate correlation between contractors’ 3-year injury rates and safety management practice scores. Although injury rates and the safety management practice scores were not related to the overall safety climate measure used in their study, two dimensions of safety climate - management safety empowerment and workers’ safety commitment - were moderately correlated with the employers’ safety management practices. This latter finding is similar to the results of our current study, which found weak correlations with some but not all of the safety scales tested.

Several reasons may exist for the weak associations observed in some studies between these two leading indicators of construction safety. Existing safety management plans may not be effectively implemented or communicated to workers by their supervisors, resulting in poorer worker perceptions of safety climate. Alternatively, supervisors may actively model and communicate safety to their employees even in the absence of a robust safety plan. Marín and colleagues (2017) suggested that safety climate may be a parallel outcome of workplace safety practices, rather than a determinant of workers’ safety behaviors or outcomes as suggested by the models proposed by Zohar (2005) and Neal (2000). Some authors have suggested that important moderating or mediating variables lie between safety climate and safety performance (Kongsvik, Almklov, & Fenstad, 2010). It is also important to note that safety in construction is uniquely complicated, due to rapidly changing work environments, unique hazards of the industry, and the organizational issues of coordinating interactions between multiple contractors (National Institute for Occupational Safety and Health (NIOSH), 2013a; Ringen, et al., 1995). Most measures of safety climate were developed in more stable work environments such as manufacturing, which have markedly different social and organizational structures than construction. The low associations observed between safety climate and other leading or lagging indicators of safety in the construction industry are likely due in part to the unique nature of work organization in construction.

While there is growing agreement on how to define and assess safety climate, the many different questionnaires and their dimensions still pose a challenge to the study of safety climate, particularly in construction (Hecker & Goldenhar, 2014). As discussed by Sparer and colleagues (2013), it is not clear whether workers’ perceptions of safety climate are more strongly influenced by their current worksite or by their past work experience with the same employer, or whether safety climate is more strongly influenced by workers’ unions, by their immediate workgroup, by their subcontractor, or by their worksite controlled by the general contractor. To date, there is little work that examines workers’ perceptions of safety climate in the context of the multiple and fluid organizational layers found in the construction industry. It is likely that well-validated industry specific tools will perform better in construction than will tools developed in other industries (Probst, Goldenhar, Byrd, & Betit, 2019).

Limitations of this study include its cross-sectional nature, which did not allow us to examine the relationship between SMP scores and safety climate scales over time. This study is nested within a larger study that will examine the “flow down” of safety practices from general contractors to subcontractors and provide measures of safety climate over time among different sized subcontractors during their tenure on a large commercial site; future analyses will allow longitudinal assessment. Another limitation is that the current study recruited subcontractors working for general contractors who were recognized in the local industry as having strong safety practices. These large general contractors aim to hire subcontractors with stronger SMPs, likely limiting the variation within our sample and providing a sample of subcontractors with more robust programs - most subcontractors in our sample had the majority of the measured safety management program elements in place, and worker-reported safety climate was high across all scales. On the large commercial projects studied, we defined small contractors as those with 50 or fewer employees, which differs from other common categorizations, so results may not be generalizable to much smaller contractors or to non-commercial construction projects. Stronger associations between SMP scores and measures of safety climate might be found in a sample of smaller contractors, on types of construction other than commercial, or among contractors who did not predominantly employ members of construction trade unions. Another limitation of our study is that we were unable to compare SMP scores and safety climate scales to data on injuries or safety incidents.

Another potential limitation is the use of a safety management program score created by our study team. We adapted our scoring criteria from guidelines promulgated by an influential regional council, and items were similar to those described in other studies. Not all safety elements were of the same quality across safety management programs of different subcontractors; we used a three level scoring of program elements, but a finer gradation may have improved the precision of our results. There is currently no nationally or internationally recognized standard for assessing safety program management in construction, and some literature is based on proprietary assessment methods, making it difficult to directly compare results. While written safety management programs commonly contain many of the same elements, a consistently safe environment can only be achieved by enacting and integrating these activities at the workplace. Review of written safety management plans alone cannot assess safety practices that are espoused versus those that are implemented. Our study used manager interviews to inform the scoring of our SMP checklist; additional work in this field is required to develop assessments of safety management programs to inform safety improvement in construction.

SUMMARY

Our study demonstrated large differences in safety program management between small, medium, and large construction subcontractors working on large commercial projects, and described specific program elements accounting for these differences. We found that a measure of safety program management only weakly predicted safety climate among employees of these subcontractors. Future work is needed to understand how best to measure safety performance of construction companies, factors leading to improved safety performance of small and medium construction firms, and the barriers and facilitators to improving safety.

Research Highlights:

  • Safety management programs differed in small, medium, and large construction firms

  • Specific safety program elements were less likely to be present in small contractors

  • Worker-reported safety climate did not differ between different sized contractors

  • There was little association between safety management programs and safety climate

Acknowledgements:

The authors would like to acknowledge the construction workers and contractors who participated in this research study. Without their participation, these studies would not be possible. We would also like to thank Anna M. Kinghorn and Alexandra O’Brien from the Washington University in St. Louis Occupational Safety and Health Research Lab for their contributions to data collection, transcription, and assistance with manuscript preparation.

Funding: This work was supported by the National Institute for Occupational Safety and Health [Grant number 2U60OH009762-06/-IISCE] and by CPWR-The Center for Construction Research and Training.

Footnotes

Declarations of Interest: None

REFERENCES

  1. Brondino M, Silva SA, & Pasini M. (2012). Multilevel approach to organizational and group safety climate and safety performance: Co-workers as the missing link. Safety Science, 50(9), 1847–1856.
  2. Bureau of Labor Statistics. (2018). Employment, Hours, and Earnings from the Current Employment Statistics survey (National). Washington, DC: U.S. Department of Labor.
  3. Cheyne A, Cox S, Oliver A, & Tomas J. (1998). Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255–271.
  4. Choudhry R, Fang D, & Lingard H. (2009). Measuring Safety Climate of a Construction Company. Journal of Construction Engineering and Management-ASCE, 135(9), 890–899.
  5. Christian MS, Bradley JC, Wallace JC, & Burke MJ (2009). Workplace Safety: A Meta-Analysis of the Roles of Person and Situation Factors. Journal of Applied Psychology, 94(5), 1103–1127.
  6. Cigularov KP, Chen PY, & Rosecrance J. (2010). The effects of error management climate and safety communication on safety: A multi-level study. Accident Analysis and Prevention, 42(5), 1498–1506.
  7. CPWR. (2007). The Construction Chart Book: The US Construction Industry and Its Workers. Silver Spring, MD: CPWR-The Center for Construction Research and Training.
  8. Dedobbeleer N, & Béland F. (1991). A safety climate measure for construction sites. Journal of Safety Research, 22(2), 97–103.
  9. Gittleman JL, Gardner PC, Haile E, Sampson JM, Cigularov KP, Ermann ED, Stafford P, & Chen PY (2010). [Case Study] CityCenter and Cosmopolitan Construction Projects, Las Vegas, Nevada: lessons learned from the use of multiple sources and mixed methods in a safety needs assessment. Journal of Safety Research, 41(3), 263–281.
  10. Givehchi S, Hemmativaghef E, & Hoveidi H. (2017). Association between safety leading indicators and safety climate levels. Journal of Safety Research, 62, 23–32.
  11. Grabowski M, Ayyalasomayajula P, Merrick J, & McCafferty D. (2007). Accident precursors and safety nets: Leading indicators of tanker operations safety. Maritime Policy & Management, 34(5), 405–425.
  12. Hadjimanolis A, & Boustras G. (2013). Health and safety policies and work attitudes in Cypriot companies. Safety Science, 52, 50–56.
  13. Hasle P, Kines P, & Andersen LP (2009). Small enterprise owners’ accident causation attribution and prevention. Safety Science, 47(1), 9–19.
  14. Hecker S, & Goldenhar L. (2014). Understanding Safety Culture and Safety Climate in Construction: Existing Evidence and a Path Forward. CPWR-The Center for Construction Research and Training, pp. 1–19.
  15. Herrero SG, Saldaña MAM, del Campo MAM, & Ritzel DO (2002). From the traditional concept of safety management to safety integrated with quality. Journal of Safety Research, 33(1), 1–20.
  16. Hinze J, Thurman S, & Wehle A. (2013). Leading indicators of construction safety performance. Safety Science, 51(1), 23–28.
  17. Hsu Y-L, Li W-C, & Chen K-W (2010). Structuring critical success factors of airline safety management system using a hybrid model. Transportation Research Part E: Logistics and Transportation Review, 46(2), 222–235.
  18. Institute for Work and Health. (2011). Benchmarking organizational leading indicators for the prevention and management of injuries and illnesses: Final report. Toronto, Ontario: Institute for Work & Health.
  19. Ismail Z, Doostdar S, & Harun Z. (2012). Factors influencing the implementation of a safety management system for construction sites. Safety Science, 50(3), 418–423.
  20. Johnson S. (2007). The predictive validity of safety climate. Journal of Safety Research, 38(5), 511–521.
  21. Kaskutas V, Dale AM, Lipscomb H, Gaal J, Fuchs M, & Evanoff B. (2010). Fall prevention in apprentice carpenters. Scandinavian Journal of Work, Environment and Health, 36(3), 258–265.
  22. Kines P, Andersen LPS, Spangenberg S, Mikkelsen KL, Dyreborg J, & Zohar D. (2010). Improving construction site safety through leader-based verbal safety communication. Journal of Safety Research, 41(5), 399–406.
  23. Kongsvik T, Almklov P, & Fenstad J. (2010). Organisational safety indicators: Some conceptual considerations and a supplementary qualitative approach. Safety Science, 48(10), 1402–1411.
  24. Laitinen H, Marjamaki M, & Paivarinta K. (1999). The validity of the TR safety observation method on building construction. Accident Analysis and Prevention, 31(5), 463–472.
  25. Marín LS, Lipscomb H, Cifuentes M, & Punnett L. (2017). Associations between safety climate and safety management practices in the construction industry. American Journal of Industrial Medicine, 60(6), 557–568.
  26. McCoy AJ, Kucera KL, Schoenfisch AL, Silverstein BA, & Lipscomb HJ (2013). Twenty years of work-related injury and illness among union carpenters in Washington State. American Journal of Industrial Medicine, 56(4), 381–388.
  27. Mearns K, Whitaker S, & Flin R. (2003). Safety climate, safety management practice and safety performance in offshore environments. Safety Science, 41, 641–680.
  28. Mikkelsen KL, Spangenberg S, & Kines P. (2010). Safety walkarounds predict injury risk and reduce injury rates in the construction industry. American Journal of Industrial Medicine, 53(6), 601–607.
  29. National Institute for Occupational Safety and Health (NIOSH). (2013a). National Academies NIOSH Program Review: Construction. Centers for Disease Control and Prevention.
  30. National Institute for Occupational Safety and Health (NIOSH). (2013b). NIOSH Program Portfolio: Small Business Assistance and Outreach.
  31. National Institute for Occupational Safety and Health (NIOSH). (2013c). Safety Climate: Evaluation Survey. Centers for Disease Control and Prevention (CDC).
  32. Neal A, & Griffin MA (2006). A study of the lagged relationships among safety climate, safety motivation, safety behavior, and accidents at the individual and group levels. Journal of Applied Psychology, 91(4), 946–953.
  33. Neal A, Griffin MA, & Hart PM (2000). The impact of organizational climate on safety climate and individual behavior. Safety Science, 34(1–3), 99–109.
  34. Occupational Safety and Health Administration. (2016). Recommended Practices for Safety & Health Programs in Construction. Washington, DC: U.S. Department of Labor.
  35. Probst TM, Goldenhar LM, Byrd JL, & Betit E. (2019). The Safety Climate Assessment Tool (S-CAT): A rubric-based approach to measuring construction safety climate. Journal of Safety Research, 69, 43–51.
  36. Ringen K, Seegal J, & Englund A. (1995). Safety and health in the construction industry. Annual Review of Public Health, 16, 165–188.
  37. Skogdalen JE, Utne IB, & Vinnem JE (2011). Developing safety indicators for preventing offshore oil and gas deepwater drilling blowouts. Safety Science, 49(8–9), 1187–1199.
  38. Sokas R, Jorgensen E, Nickels L, Gao W, & Gittleman J. (2009). An Intervention Effectiveness Study of Hazard Awareness Training in the Construction Building Trades. Public Health Reports, 124, 161–168.
  39. Sparer EH, Murphy LA, Taylor KM, & Dennerlein JT (2013). Correlation between safety climate and contractor safety assessment programs in construction. American Journal of Industrial Medicine, 56(12), 1463–1472.
  40. Toellner J. (2001). Improving safety and health performance: Identifying and measuring leading indicators. Professional Safety, 46(9), 42–47.
  41. Trethewy RW (2003). OHS performance - improved indicators for construction contractors. Journal of Construction Research, 4(1), 17–27.
  42. U.S. Bureau of Labor Statistics (BLS). (2013). Incidence rates of nonfatal occupational injuries and illnesses by industry and case types, 2011. Bureau of Labor Statistics, United States Department of Labor.
  43. Versteeg K, Bigelow P, Dale AM, & Chaurasia A. (In press). Utilizing construction safety leading and lagging indicators to measure project safety performance: A case study. Safety Science.
  44. Vinodkumar M, & Bhasi M. (2011). A study on the impact of management system certification on safety management. Safety Science, 49(3), 498–507.
  45. Zohar D. (2000). A group-level model of safety climate: Testing the effect of group climate on microaccidents in manufacturing jobs. Journal of Applied Psychology, 85(4), 587–596.
  46. Zohar D, & Luria G. (2005). A multilevel model of safety climate: Cross-level relationships between organization and group-level climates. Journal of Applied Psychology, 90(4), 616–628.
