Proceedings (Baylor University. Medical Center). 2016 Oct;29(4):367–370. doi: 10.1080/08998280.2016.11929472

Development and testing of Baylor Scott & White Health's “Attitudes and Practices of Patient Safety Survey”

Sunni A Barnes 1, Jan Compton 1, Margaret Saldaña 1, Kristen M Tecson 1, Chizuko Hastings 1, Donald A Kennerly 1
PMCID: PMC5023285  PMID: 27695163

Abstract

Improving the quality of patient care requires a culture attuned to safety. We describe the development, implementation, and psychometric evaluation of the Attitudes and Practices of Patient Safety Survey (APPSS) within the Baylor Scott & White Health system. The APPSS was designed to enable safety culture data to be collected and aggregated at the unit level to identify high-priority needs. The survey, with 27 Likert-scale core questions divided into 4 concept domains and 2 open-ended questions, was administered electronically to employees with direct patient care responsibilities (n = 16,950). The 2015 response rate was 50.4%. The Cronbach's α values for the four domains ranged from 0.78 to 0.90, indicating strong internal consistency. Confirmatory factor analysis results were mixed but were comparable to those of established safety culture surveys. Over the years, the adaptability of the APPSS has proven helpful to administrative and clinical leaders alike, and the survey responses have led to the creation of programs to improve the organization's patient safety culture. In conclusion, the APPSS provides a reliable measure of patient safety culture and may be useful to other health care organizations seeking to improve the quality and safety of the care they provide.


Patient safety culture is a key element of health care quality and safety (1, 2). Since 2007, hospitals seeking Joint Commission accreditation have been required to demonstrate that “leaders regularly evaluate the culture of safety using reliable tools” (3). Thus, it was not surprising that tools to measure patient safety culture proliferated. By 2006, 12 distinct surveys were used by health care organizations to assess safety culture (4, 5). Most surveys showed substantial limitations in psychometric properties (4) and, as a result, more recent research tends to rely on one of four safety culture questionnaires that have demonstrated acceptable psychometric properties: the Hospital Survey on Patient Safety Culture (HSOPSC), the Safety Attitudes Questionnaire (SAQ), the Patient Safety Climate in Healthcare Organizations, and the Hospital Safety Climate Scale (6). Such consistency offers the advantage of comparability between hospitals; however, in the context of operational quality improvement, these surveys may not meet the needs of health care organizations. Such was the situation that the Baylor Health Care System (now part of Baylor Scott & White Health [BSWH]) encountered in 2002. There was a need for a tailored tool to collect data that organizational leaders could use to develop improvement initiatives by identifying high-priority patient safety needs. Here we describe the development, psychometric evaluation, and deployment of the Attitudes and Practices of Patient Safety Survey (APPSS).

METHODS

BSWH, formed through the 2013 merger of Baylor Health Care System and Scott & White Healthcare, is the largest not-for-profit health care system in Texas and one of the largest in the United States. It includes 49 owned, operated, joint-ventured, and affiliated hospitals, >500 patient care sites, >6000 affiliated physicians, >38,000 employees, an accountable care organization, and the Scott & White health plan. The data presented here are from the 23 acute care hospitals BSWH fully owned in the summer of 2015.

The first version of the APPSS in 2002 sought to incorporate domains that were thought to contribute to safety. Domains were derived from the general safety science and patient safety literature. A draft item list was constructed, incorporating both items used in existing survey instruments (that did not have copyright or intellectual property limitations) and newly developed items. The draft item list was reviewed for clarity and relevance by quality and safety professionals as well as hospital employees who provide direct patient care. Revisions were made based on their feedback to provide face and content validity to the survey items. After revision, each item was assigned to a primary concept domain by a content expert panel. Additionally, a summary item intended to represent the respondent's overall assessment of patient safety in his or her hospital was included: “I would feel safe being treated in my facility as a patient.” A job satisfaction item was also included (“How satisfied are you with your work at Baylor?”) to investigate whether a link exists between patient safety culture and professional satisfaction.

A frequency scale (“always,” “most of the time,” “sometimes,” “rarely,” and “never”) was used for as many items as possible. This approach was based on the idea that individuals vary widely in how they rate their agreement, but tend to favor positive responses. Furthermore, it was thought that employees would be more responsive to a quantitative rating scale describing how often a desirable activity took place. For example, knowing that good professional teamwork took place only “sometimes” (the central option of the 5-point frequency Likert scale) was thought to raise more concern than a “neutral” response (the midpoint of the level of agreement response options). Frontline staff indicated that making a selection on a frequency scale was easier than selecting among agreement options.

Following pilot testing in 2003, the APPSS was revised to reduce the number of items in order to alleviate the time burden associated with completing the survey. In 2008, the acquisition of eSurvey design software (SNAP Surveys, Portsmouth, NH) facilitated the use of additional survey methods such as branched chain items. Other changes were made based on results of cognitive testing and focus groups conducted following the first few deployments of the survey. By 2013, a core set of 27 standard Likert-scale items (17 of which have response-triggered follow-up items) and 2 open-ended questions covering four domains had been established (Table 1). A limited number of custom questions to investigate contemporary issues impacting safety (e.g., implementation of electronic medical records) can be added in each deployment.

Table 1.

Items in the Attitudes and Practices of Patient Safety Survey by concept domain

Domain Item
Leadership
  • Senior facility leaders promote patient safety.

  • Senior facility leaders make rounds on my unit/department to discuss various topics, including patient safety concerns.

  • My unit leader rounds with patients to provide feedback to staff and improve patient care.

  • This facility deals effectively with poor performance among the following staff*.

  • This facility deals effectively with disruptive behavior among the following staff*.

Reporting and feedback
  • Recognizing and reporting errors and hazardous conditions is encouraged and valued.

  • I do not worry about being punished when I report errors or near misses.

  • I have seen positive changes in practice as a result of reporting errors and near misses.

  • My unit/department receives feedback about patient safety events such as falls, skin breakdowns, infections, medication errors, and other errors or near misses.

Resources
  • My work environment is safe and secure.

  • I have what I need to deliver safe care.

  • On my unit/department the staffing is adequate to provide safe, consistent care.

  • I am able to complete hourly rounds on my patients.

  • I am able to access translation services through a certified translation staff member or language line to meet the needs of non–English-speaking patients and family members when needed.

Teamwork
  • In my unit/department people treat each other with respect.

  • There is cooperation among the facility units/departments that need to work together.

  • Doctors, nurses, and other clinical staff work together as an effective team.

  • When patients are transferred from one unit/department to another, all information needed for safe care is exchanged.

  • During shift change, all information needed for safe care is exchanged.

  • The timeout process occurs as I would like if I or a family member were the patient.

*

Composite of four items asking individually about physicians, nurses, managers, and other staff.

The APPSS forms part of a 2-year patient safety monitoring and improvement cycle, with patient safety site visits being conducted in the years it is not deployed (7). It is deployed electronically to enhance data integrity (8) and enable more efficient data management. During each deployment, employees with direct patient care responsibilities receive an email inviting them to participate in the online survey; the invitation includes a URL link that takes them directly to the survey. Employee data from the human resources databases are linked to the URL, eliminating the need for respondents to enter data related to their role, location, tenure, or other relevant characteristics. The survey period lasts for 3 weeks, during which the eSurvey software e-mails weekly reminders to employees who have not yet submitted their response.

The internal consistency of the instrument was assessed using Cronbach's α coefficient of reliability. We also assessed convergent and divergent validity of the survey items, checking that the corrected item-to-domain correlation (which indicates the item's correlation to the concept domain to which it is assigned) was higher than the item-to-nondomain correlation (which indicates the item's correlation to the other concept domains). We used the corrected item-total correlation, meaning that the item was removed from the domain score before the correlation was calculated. Correlations among the concept domains were also examined to ensure that the domains were sufficiently unique to be considered separate constructs. Finally, we fit a confirmatory factor model to examine the four predefined domains of leadership, teamwork, resources, and reporting and feedback. All analyses were conducted using SAS version 9.3 (SAS Institute, Cary, NC).
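The reliability and item-level checks described above can be sketched in a few lines. The following Python snippet is illustrative only — the data are simulated 5-point frequency responses for a hypothetical 4-item domain, not APPSS responses — and computes Cronbach's α plus the corrected item-total correlation (each item removed from the domain score before correlating):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the *other* items in its
    domain, i.e., the item is removed from the domain score first."""
    out = []
    for j in range(items.shape[1]):
        rest = np.delete(items, j, axis=1).sum(axis=1)
        out.append(np.corrcoef(items[:, j], rest)[0, 1])
    return np.array(out)

# Simulated data: 500 respondents, 4 items sharing a latent attitude,
# rounded and clipped to a 1-5 frequency scale.
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
items = np.clip(np.round(3 + latent[:, None]
                         + rng.normal(scale=0.8, size=(500, 4))), 1, 5)
print(round(cronbach_alpha(items), 2))
print(corrected_item_total(items).round(2))
```

A corrected item-to-nondomain correlation would be computed the same way, substituting the summed score of a different domain for `rest`.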

RESULTS

In the 2015 deployment, 16,950 employees across 23 facilities were invited to participate, and 8548 responses were received (response rate = 50.4%). The Cronbach's α values for the APPSS domains indicated strong internal consistency, as shown in Table 2: all values fell in the 0.75 to 0.80 range (very good) or the ≥0.80 range (excellent) (9). Table 2 also shows the corrected item-to-domain and item-to-nondomain correlations for each concept domain (8). For each domain, the former was larger than the latter, indicating that the items within each domain measured the same underlying concept. The moderate to strong corrected item-to-domain correlations, together with the smaller item-to-nondomain correlations, indicate desirable construct validity, complementing the face and content validity established through expert review and focus groups in the early years of development and deployment. Table 3 shows the correlations among the patient safety concept domains; all were <0.80, indicating that the domains were sufficiently distinct to be considered unique constructs and to avoid problems with multicollinearity (10). Together, the results in Tables 2 and 3 support the use of the four concept domains and the assignment of items to them.

Table 2.

Performance of the concept domains of the Attitudes and Practices of Patient Safety Survey

Concept domain*          Cronbach's α   Average corrected item-to-domain correlation   Average item-to-nondomain correlation   Number of items
Leadership               0.90           0.58                                           0.47                                    11
Reporting and feedback   0.78           0.60                                           0.48                                     4
Resources                0.78           0.56                                           0.46                                     5
Teamwork                 0.85           0.64                                           0.46                                     6
*

Twenty-six core items roll up to a concept domain, and one core item serves as the overall rating.

Table 3.

Intercorrelation between concept domains of the Attitudes and Practices of Patient Safety Survey (2013 version)

Concept domain           Leadership   Reporting and feedback   Resources   Teamwork
Leadership               1.00         0.64                     0.59        0.59
Reporting and feedback   —            1.00                     0.59        0.57
Resources                —            —                        1.00        0.62
Teamwork                 —            —                        —           1.00

Table 4 shows the correlation between the concept domains and the overall patient safety rating at the individual respondent level, at the unit level, and at the hospital level. The concept domains that were the strongest predictors of an employee's overall view of the safety of care were “Resources” and “Teamwork.” The highest level of analysis (i.e., the hospital level) showed the highest correlations; however, the correlations for the unit analysis and the individual analysis were very similar.

Table 4.

Correlation between concept domains of the Attitudes and Practices of Patient Safety Survey and overall patient safety rating

Concept domain           Correlation with overall safety rating
                         Individual   Unit   Hospital
Leadership               0.49         0.48   0.68
Reporting and feedback   0.50         0.51   0.75
Resources                0.54         0.56   0.89
Teamwork                 0.56         0.56   0.88
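The pattern of higher correlations at higher levels of analysis is expected whenever individual ratings combine a shared unit-level signal with individual-level noise: averaging responses within a unit (or hospital) cancels much of the noise, strengthening the observed correlation. A toy simulation illustrates the mechanism (illustrative only; the numbers are not based on APPSS data):

```python
import numpy as np

# 50 units, 100 respondents each; both the domain score and the overall
# rating share a unit-level "quality" signal plus individual noise.
rng = np.random.default_rng(1)
n_units, per_unit = 50, 100
unit_quality = np.repeat(rng.normal(size=n_units), per_unit)  # shared signal
domain_score = unit_quality + rng.normal(scale=2.0, size=n_units * per_unit)
overall_rating = unit_quality + rng.normal(scale=2.0, size=n_units * per_unit)

def unit_means(x: np.ndarray) -> np.ndarray:
    """Average each unit's respondents (rows follow np.repeat order)."""
    return x.reshape(n_units, per_unit).mean(axis=1)

r_individual = np.corrcoef(domain_score, overall_rating)[0, 1]
r_unit = np.corrcoef(unit_means(domain_score), unit_means(overall_rating))[0, 1]
print(round(r_individual, 2), round(r_unit, 2))  # unit-level r is much larger
```

The same mechanism operates again when units are aggregated to hospitals, which is consistent with the hospital-level correlations in Table 4 being highest.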

Results of the confirmatory factor analysis are shown in Table 5. The goodness-of-fit index and Bentler-Bonett normed fit index indicated a fairly good fit. In contrast, the root-mean-square error of approximation (RMSEA), at 0.09, was higher than is desirable, and the large chi-square statistic with its correspondingly small P value formally rejected the model. This did not cause much alarm, however: with a sample this large, the chi-square test is overpowered and has been shown to reject models that fit adequately (11).

Table 5.

Results of confirmatory factor analysis for the Attitudes and Practices of Patient Safety Survey (2013 version)

Fit summary variable Value
Chi-square 23162.01
Pr > chi-square <0.0001
Goodness of fit index 0.81
Root mean square error of approximation 0.09
Bentler-Bonett normed fit index 0.82
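The sample-size sensitivity of the chi-square test can be made concrete with the standard RMSEA point estimate, RMSEA = sqrt(max(χ² − df, 0) / (df·(N − 1))). A minimal Python sketch; note that the degrees of freedom used below (293, from 26 items on 4 correlated factors: 351 sample moments minus 58 free parameters) are an assumption, as df was not reported:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """RMSEA point estimate from the model chi-square, its degrees of
    freedom, and the sample size."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def chi2_for(rmsea_val: float, df: int, n: int) -> float:
    """Chi-square implied by a fixed RMSEA at sample size n
    (the inverse of the formula above)."""
    return df * (1 + rmsea_val**2 * (n - 1))

# ASSUMED df = 293; chi2 and n follow the 2015 deployment figures.
print(round(rmsea(23162.01, 293, 8548), 3))

# The same modest misfit (RMSEA = 0.05) yields a far larger chi-square
# at n = 8548 than at n = 200, so "significance" is nearly guaranteed:
for n in (200, 8548):
    print(n, round(chi2_for(0.05, 293, n), 1))
```

This is why fit indices that adjust for sample size (RMSEA, GFI, NFI) are usually weighted more heavily than the chi-square P value when N runs into the thousands.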

DISCUSSION

The APPSS satisfies this organization's need for a safety culture survey because it provides meaningful data that enables leaders to make informed decisions regarding the implementation of improvement initiatives targeting patient safety. Our evaluation shows it to have acceptable psychometric properties, making it a feasible alternative to the safety culture questionnaires that have dominated the safety culture research literature.

The reliability of the APPSS was comparable to that reported for the 12 dimensions of the HSOPSC, whose Cronbach's α values range from 0.44 to 0.84 (7, 12). The overall model fit for the HSOPSC was superior to that of the APPSS, with goodness-of-fit and normed fit index values >0.90 and an RMSEA of 0.04 (7). The SAQ also shows better model fit, with a comparative fit index of 0.90 and an RMSEA of 0.03, but only after 10 items were dropped to obtain satisfactory fit (13). The intercorrelations between concept domains were more consistent for the APPSS, ranging from 0.57 to 0.64 (Table 3), than for the SAQ, which range from −0.28 to 0.95 (13).

The APPSS was developed prior to the availability of the HSOPSC and its comparative database. While using the HSOPSC would enable comparison to other hospitals, BSWH has chosen to continue using the APPSS. Factors influencing this decision include the “deeper dive” that the APPSS provides into specific issues through the use of response-dependent “breakout” items, the use of the frequency scale responses, the avoidance of negatively worded items, and the inclusion of questions asking for free-text responses through which respondents can address specific issues. Additionally, with the APPSS, hospital leaders have the ability to add, delete, and/or revise question items as necessary for the evaluation of current safety issues. Further, while the APPSS contains a similar number of items to the HSOPSC, they are divided into four concept domains rather than 12 dimensions (7), and this reduced number of domains is felt to be more manageable and actionable for leaders without expertise in patient safety culture.

The “breakout” survey items have been particularly helpful within BSWH in pointing to actionable deficits as perceived by frontline staff. For example, one facility that showed poor responses on a question addressing senior leaders' promotion of patient safety initiated regular patient safety leader rounding that both increased the visibility of the leadership's involvement in patient safety and offered staff an opportunity to speak up about their concerns. Another system-level example comes from the inclusion of a question, with a breakout, about the patient safety implications of the adoption of electronic medical records in BSWH hospitals in the 2013 version of the APPSS. Responses indicated staff had concerns about the use of electronic medical records that warranted further exploration, which led to the development of a separate survey to measure user experience with the electronic medical record (14).

The major limitation of our evaluation of the APPSS is that this instrument has been exclusively developed and used within BSWH and operates with a response rate of approximately 50%, so generalizability to other institutions has not been demonstrated. Given that terminology can be very specific to a particular region or institution, further work is needed to ensure respondents in organizations other than BSWH interpret the APPSS items in the same way. A related limitation is that comparison data to other institutions are not readily available.

ACKNOWLEDGMENTS

The authors wish to acknowledge the early contributions of Nettie Kuhn-Verdi, Elaine Nelson, and Linda Gerbig to the development of the APPSS and to thank members of the medical writing group for help drafting this article.

References

  • 1.Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System: A Report from the Committee on Quality of Healthcare in America. Washington, DC: National Academies Press; 1999. [PubMed] [Google Scholar]
  • 2.Shojania KG, Duncan BW, McDonald KM, Wachter RM, Markowitz AJ. Making health care safer: a critical analysis of patient safety practices. Evid Rep Technol Assess (Summ). 2001:i–x, 1–668. [PMC free article] [PubMed]
  • 3.The Joint Commission. 2010 Comprehensive Accreditation Manual for Hospitals: The Official Handbook. Oak Brook, IL: Joint Commission Resources; 2009. [Google Scholar]
  • 4.Colla JB, Bracken AC, Kinney LM, Weeks WB. Measuring patient safety climate: a review of surveys. Qual Saf Health Care. 2005;14(5):364–366. doi: 10.1136/qshc.2005.014217. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Flin R, Burns C, Mearns K, Yule S, Robertson EM. Measuring safety climate in health care. Qual Saf Health Care. 2006;15(2):109–115. doi: 10.1136/qshc.2005.014761. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Jackson J, Sarac C, Flin R. Hospital safety climate surveys: measurement issues. Curr Opin Crit Care. 2010;16(6):632–638. doi: 10.1097/MCC.0b013e32833f0ee6. [DOI] [PubMed] [Google Scholar]
  • 7.Compton J. Achieving Safe Health Care: Delivery of Safe Patient Care at Baylor Scott & White Health. Boca Raton, FL: Productivity Press; 2015. [Google Scholar]
  • 8.Bowling A. Research Methods in Health. Buckingham, UK: Open University Press; 1997. [Google Scholar]
  • 9.DeVellis RF. Scale Development: Theory and Applications. Thousand Oaks, CA: Sage; 2003. [Google Scholar]
  • 10.O'Brien RM. A caution regarding rules of thumb for variance inflation factors. Qual Quant. 2007;41(5):673–690. [Google Scholar]
  • 11.Singer SJ, Falwell A, Gaba DM, Meterko M, Rosen A, Hartmann CW, Baker L. Identifying organizational cultures that promote patient safety. Health Care Manage Rev. 2009;34(4):300–311. doi: 10.1097/HMR.0b013e3181afc10c. [DOI] [PubMed] [Google Scholar]
  • 12.Blegen MA, Gearhart S, O'Brien R, Sehgal NL, Alldredge BK. AHRQ's hospital survey on patient safety culture: psychometric analyses. J Patient Saf. 2009;5(3):139–144. doi: 10.1097/PTS.0b013e3181b53f6e. [DOI] [PubMed] [Google Scholar]
  • 13.Sexton JB, Helmreich RL, Neilands TB, Rowan K, Vella K, Boyden J, Roberts PR, Thomas EJ. The Safety Attitudes Questionnaire: psychometric properties, benchmarking data, and emerging research. BMC Health Serv Res. 2006;6:44. doi: 10.1186/1472-6963-6-44. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Xiao Y, Montgomery DC, Philpot LM, Barnes SA, Compton J, Kennerly D. Development of a tool to measure user experience following electronic health record implementation. J Nurs Adm. 2014;44(7/8):423–428. doi: 10.1097/NNA.0000000000000093. [DOI] [PubMed] [Google Scholar]
