Abstract
Background:
A particularly useful model for examining implementation of quality improvement interventions in health care settings is the PARIHS (Promoting Action on Research Implementation in Health Services) framework developed by Kitson and colleagues. The PARIHS framework proposes three elements (evidence, context, and facilitation) that are related to successful implementation.
Purposes:
An evidence-based program focused on quality enhancement in health care, termed TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety), has been widely promoted by the Agency for Healthcare Research and Quality, but research is needed to better understand its implementation. We apply the PARIHS framework in studying TeamSTEPPS implementation to identify elements that are most closely related to successful implementation.
Methodology/Approach:
Quarterly interviews were conducted over a 9-month period in 13 small rural hospitals that implemented TeamSTEPPS. Interview quotes related to each of the PARIHS elements were identified using directed content analysis. Transcripts were also scored quantitatively, and bivariate regression analysis was used to explore relationships between the PARIHS elements and successful implementation, operationalized as planning activities.
Findings:
The current findings provide support for the PARIHS framework and identify two of the three PARIHS elements (context and facilitation) as important contributors to successful implementation.
Practice Implications:
This study applies the PARIHS framework to TeamSTEPPS, a widely used quality initiative focused on improving health care quality and patient safety. By focusing on small rural hospitals that undertook this quality improvement activity of their own accord, our findings represent effectiveness research in an understudied segment of the health care delivery system. By identifying context and facilitation as the most important contributors to successful implementation, these analyses provide a focus for efficient and effective sustainment of TeamSTEPPS efforts.
Keywords: critical access hospital, implementation science, PARIHS, quality improvement, TeamSTEPPS
Implementation science is a relatively new area of research aimed at identifying conceptual models, methods, and strategies best suited to facilitate the adoption and use of evidence-based interventions for performance improvement (Lobb & Colditz, 2013). In health care, a vast gap exists between research and practice, despite demonstrable benefits of many interventions, and our knowledge of what moves innovations from discovery to routine practice is still emerging (Glasgow et al., 2012). In this article, we build on an implementation science framework to study the implementation of TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety), a large and complex quality improvement intervention, in 13 hospitals. TeamSTEPPS is an evidence-based teamwork training system designed through collaboration between the Agency for Healthcare Research and Quality (AHRQ) and the Department of Defense’s Patient Safety Program that provides clear research-based content on which a patient safety initiative can be built (Clancy & Tornberg, 2007). However, from the perspective of implementation science, the extent to which the research-based content of TeamSTEPPS actually gets applied in practice is not well understood. We thus adopt the specific case of TeamSTEPPS initiatives to explore how different elements of an implementation science framework are manifest in the intervention process and how they relate to the success of a specific implementation project.
Conceptual Framework
A search for conceptual models in implementation science identifies a number of models that are applicable for studying quality and performance initiatives (Damschroder et al., 2009; Feldstein & Glasgow, 2008; Kilbourne, Neumann, Pincus, Bauer, & Stall, 2007; Kitson, Harvey, & McCormack, 1998). A particularly useful model for examining implementation of quality improvement interventions in health care settings is the PARIHS (Promoting Action on Research Implementation in Health Services) framework developed by Kitson and colleagues (Kitson et al., 1998; Rycroft-Malone et al., 2002; Kitson et al., 2008). The primary elements of the PARIHS framework are three components or predictors of successful implementation: evidence, context, and facilitation. Evidence is defined as knowledge that supports the effectiveness of an intervention. Context is defined as the environment or setting in which the intervention is implemented. Facilitation is defined as the technique or process used by a person (i.e., the facilitator) to help others change their attitudes, skills, or behaviors and thereby improve the likelihood of success of the intervention (Kitson et al., 1998). A key function of facilitation is to assess the implementation context, promote understanding and acceptance of the evidence, and thus develop the most appropriate approach to transform the context and evidence elements from a poor condition (i.e., weak context and/or weak evidence) to a more favorable condition (i.e., strong context and strong evidence; Kitson et al., 2008).
A systematic literature review (Helfrich et al., 2010) examined 18 studies that applied the PARIHS framework and identified key areas for future research, such as defining subelements and examining the nature of dynamic relationships among elements. Several conceptual and methodological limitations of the original PARIHS framework have also been the focus of recent efforts to extend this line of research. First, although the PARIHS framework has considerable face validity and conceptual relevance for implementation science, a challenge in applying it is the limited research on how to define and measure successful implementation, the framework’s dependent measure. Second, the adoption of interventions may occur at multiple levels in an organization (Kyratsis, Ahmad, & Holmes, 2012). Although recent research has refined the PARIHS framework for individual- or task-level adoption of evidence-based practice (Stetler, Damschroder, Helfrich, & Hagedorn, 2011), there is still a lack of understanding about how elements of the PARIHS framework manifest and facilitate the adoption of interventions at the organizational level. An opportunity to apply the PARIHS framework to identify elements that are most closely related to successful implementation at the organizational level arose as we studied TeamSTEPPS.
TeamSTEPPS stands for Team Strategies and Tools to Enhance Performance and Patient Safety. It is an evidence-based quality improvement intervention that emphasizes improving health care team performance through flexible training designed to enhance team knowledge, skills, and attitudes (King et al., 2008). TeamSTEPPS involves a comprehensive curriculum that spells out key actions for a culture change toward teamwork including team leadership, mutual performance monitoring, backup behaviors, adaptability, team/collective orientation, shared mental models, mutual trust, and closed-loop communication. AHRQ has been actively disseminating TeamSTEPPS since 2006. As a result, TeamSTEPPS has been implemented in many U.S. hospitals. These training programs have been linked to enhanced staff knowledge, skill, and attitudes (Jones, Podila, & Powers, 2013; Sawyer, Laubach, Hudak, Yamamura, & Pocrnich, 2013) as well as improved clinical outcomes (Capella et al., 2010; Spiva et al., 2014). Yet, evidence suggests that TeamSTEPPS interventions are not universally effective (Armour Forse, Bramble, & McQuillan, 2011; Coburn & Gage-Croll, 2014; Sheppard, Williams, & Klein, 2013). One likely explanation for differences in effectiveness is variability in implementation.
Research concerning why TeamSTEPPS is implemented more effectively in some settings than in others is relatively scarce. TeamSTEPPS is a complex organization-level intervention, which creates challenges for organizations that adopt and seek to sustain it. Given the importance of TeamSTEPPS and its aim to enhance patient care quality and safety, knowledge of what facilitates its implementation in practice is needed to help health care organizations choose effective implementation strategies. In this article, we apply the PARIHS framework in studying TeamSTEPPS implementation in 13 small rural hospitals to identify elements that are most closely related to successful implementation. We extend research on the PARIHS framework by exploring how its subelements relate to different aspects of successful implementation.
Methods
Design
The study used a prospective design.
Setting
Through the Iowa Department of Public Health, annual TeamSTEPPS master trainer training is offered free of charge to critical access hospitals, which are very small rural hospitals. We recruited all six hospitals that sent staff to these “train the trainer” sessions in 2011 and all eight hospitals that sent staff to training in 2012. One hospital suspended TeamSTEPPS activities after the first quarter, leaving 13 hospitals in our study sample.
Sample
We visited each participating hospital quarterly to gather information about implementation processes and outcomes. During each quarterly visit, semistructured interviews averaging 30–45 minutes were conducted with key personnel for the TeamSTEPPS implementation. This usually involved several staff who had attended TeamSTEPPS master trainer training plus an executive sponsor who supported the team in their efforts. Across the 13 hospitals and three quarters, 130 interviews were conducted with 73 individuals. The job positions of the interviewees in each of the 13 hospitals are shown in Table 1.
Table 1.
Characteristics of interviewees at the 13 hospitals studied
Hospital | Total number of interviews | Interviewees with executive positions | Interviewees with nonexecutive positions
---|---|---|---
1 | 3 single interviews | 1 Patient Care Coordinator | 1 Registered Nurse
2 | 3 interviews: 3 group interviews (9 interviewees) | 1 Director of Nursing; 1 Director of Senior Services; 1 Director of Emergency Services | 1 Registered Nurse
3 | 8 single interviews | 1 Director of Quality; 1 Operating Room Manager | 1 Radiology Technician
4 | 8 interviews: 1 group interview (4 interviewees) and 7 single interviews | 1 Director of Nursing | 1 Clinic Support Services; 1 Radiology Services; 1 Surgical Services; 1 Medical/Surgical
5 | 9 single interviews | 1 Chief Executive Officer; 1 Director of Nursing; 1 Director of Quality; 1 Director of Laboratory | 1 Registered Nurse
6 | 9 interviews: 2 group interviews (4 interviewees) and 7 single interviews | 1 Director of Nursing; 1 Director of Clinical Quality; 1 Director of Laboratory | 2 Registered Nurses; 1 Respiratory Therapist
7 | 10 single interviews | 1 Director of Inpatient Services; 1 Director of Outpatient Services | 2 Quality Services
8 | 10 interviews: 1 group interview (2 interviewees) and 9 single interviews | 1 Director of Quality; 1 Director of Laboratory | 1 Emergency Department/Services and Satellite Clinic Manager; 1 Patient Safety/Infection Control
9 | 13 single interviews | 1 Director of Quality and Patient Safety; 1 Assistant Director of Quality; 1 Director of Professional Development | 2 Registered Nurses; 1 Licensed Practical Nurse
10 | 13 interviews: 4 group interviews (5 interviewees) and 9 single interviews | 1 Director of Nursing | 2 Registered Nurses; 1 Medical Records; 1 Quality/Utilization Review
11 | 13 interviews: 2 group interviews (6 interviewees) and 11 single interviews | 1 Chief Executive Officer; 1 Director of Nursing; 1 Operating Room Manager | 2 Registered Nurses; 1 Emergency Room Manager
12 | 13 interviews: 2 group interviews (8 interviewees) and 11 single interviews | | 4 Registered Nurses; 4 Nonclinical Services; 2 Quality Specialists; 2 Pharmacy; 2 Diagnostic Imaging; 1 Medical Records
13 | 19 single interviews | 1 Chief Clinical Officer; 1 Quality Coordinator; 1 Outpatient Surgery Center Coordinator | 4 Registered Nurses
Data Collection/Instruments
Interviewers gathered information on how implementation events such as planning, training, and tool implementation were proceeding; facilitators and barriers; outcomes; and factors affecting progress. As shown in Table 2, the interview guide tracked progress generally and did not specifically probe about PARIHS framework content. All interviews were conducted by four faculty researchers with extensive experience in conducting interviews with health care personnel. Transcriptions of interview recordings were anonymized by masking all hospital and interviewee identities before analysis. The research protocol was approved by the University of Iowa Institutional Review Board.
Table 2.
Interview guide
Initial quarter interview guide
Follow-up quarterly interview guide
We adopted a two-part approach to analyzing the data obtained through the interviews. First, to clarify the nature of the PARIHS elements related to TeamSTEPPS implementation in small rural hospitals, we identified exemplar quotes to capture thoughts and experiences. Second, to better understand patterns in the interview data, we coded constructs and enumerated relationships.
As shown in Tables 3, 4, and 5, we identified exemplar quotes for the three primary elements of the PARIHS framework (evidence, context, and facilitation) as well as the four subelements of each (Kitson et al., 2008). Each table also provides a definition of the corresponding element. Exemplar quotes were identified by a set of coders, consisting of one advanced undergraduate, two graduate students, and a staff researcher, who were trained to identify and extract relevant quotes for the 12 subelements using directed content analysis (Hsieh & Shannon, 2005). Two coders independently identified and compiled lists of quotes for each subelement, and then two coauthors rated the quotes based on the richness of information they provided.
Table 3.
Selection of exemplar quotes for the evidence element of the PARIHS framework
Evidence: knowledge derived from a variety of sources that have been subjected to testing and have been found to be credible
Subelement | Concept and components | Specific items related to concept | Exemplar quotes
---|---|---|---
Research | | |
Clinical experience | | |
Patient experience | | |
Information/data from local context | | |
Table 4.
Selection of exemplar quotes for the context element of the PARIHS framework
Context: the environment or setting in which the proposed change is to be implemented
Subelement | Concept and components | Specific items related to concept | Exemplar quotes
---|---|---|---
Receptive context | | |
Culture | | |
Leadership | | |
Evaluation | | |
Table 5.
Selection of exemplar quotes for the facilitation element of the PARIHS framework
Facilitation: the process of enabling (making easier) the implementation of evidence into practice; facilitation depends on the person (the facilitator) carrying out the role with the appropriate skills, personal attributes, and knowledge
Subelement | Concept and components | Specific items related to concept | Exemplar quotes
---|---|---|---
Role of facilitator | | |
Goal and purpose | | |
Characteristics and style | | |
Skills and attributes | | |
We assessed patterns in the interview results through additional coding. Coding forms were developed that listed constructs and definitions for each subelement, as shown in the first column of Tables 3, 4, and 5. A separate set of coders, consisting of three advanced undergraduates, one graduate student, and one nurse consultant, was trained to examine the interview transcripts and identify evidence of the PARIHS framework using directed content analysis and quantitizing (Hsieh & Shannon, 2005; Sandelowski, 2001; Sandelowski, Voils, & Knafl, 2009). For each hospital, three coders independently assigned a score ranging from 0 (no evidence) to 4 (high evidence) for each of the 12 subelements of the PARIHS framework. A higher score indicates that a subelement was highly evident in a hospital’s experience with TeamSTEPPS implementation as reflected in the interviews. Scores were averaged across the three coders.
We also coded the outcome element of the PARIHS framework—successful implementation. The originators of the PARIHS framework have been relatively mute on how to measure successful implementation, but Stetler and colleagues developed a guide for measuring this element for evidence-based practice implementations (Stetler et al., 2011). Consistent with their suggestion that successful implementation involves development of an implementation plan, TeamSTEPPS requires implementation teams to develop a TeamSTEPPS Action Plan and specifies 10 steps the plan should include (TeamSTEPPS Implementation Guide, 2015). Thus, for successful implementation, coders independently scored each hospital on its planning activities. To do so, the 10 steps of the TeamSTEPPS Action Plan were broken down into 16 specific items, spanning four planning phases, which constituted the subelements for planning activities. The four subelements were problem identification (why the interventions chosen are needed), intervention design (what specifically will be done and how it will be measured), implementation strategy (how it will be done), and reinforcement plan (how to communicate and engage key people). Scores of 1 for “yes” or 0 for “no” were assigned to each item indicating whether a planning activity was completed. Item scores were then summed to create subelement scores, ranging from 0 to 4. The summary planning activities score was calculated as an average of subelement scores.
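To illustrate the scoring arithmetic described above, the following is a minimal sketch in Python (not the study's actual analysis code). The hospital, coder ratings, and item codes are hypothetical, and the sketch assumes four Action Plan items per planning subelement, consistent with the reported 0–4 range.

```python
# Minimal sketch of the scoring arithmetic; all values are hypothetical.
from statistics import mean

# PARIHS predictor scoring: three coders each rate a subelement from 0 to 4;
# the subelement score is the mean across coders, and the element summary
# score is the mean of its four subelement scores.
coder_ratings = {  # hypothetical ratings for one hospital (evidence element)
    "research": [1, 1, 0],
    "clinical_experience": [3, 2, 3],
    "patient_experience": [2, 1, 1],
    "local_data": [2, 3, 2],
}
evidence_subelement_scores = {k: mean(v) for k, v in coder_ratings.items()}
evidence_summary = mean(evidence_subelement_scores.values())

# Outcome scoring: each Action Plan item is coded 1 (completed) or 0 (not
# completed); items are summed within their planning subelement (assumed
# four items each, giving 0-4), and the planning summary score is the mean
# of the four subelement scores.
planning_items = {  # hypothetical item codes for the same hospital
    "problem_identification": [1, 1, 1, 0],
    "intervention_design": [1, 1, 0, 0],
    "implementation_strategy": [1, 0, 0, 0],
    "reinforcement_plan": [0, 0, 0, 0],
}
planning_subelement_scores = {k: sum(v) for k, v in planning_items.items()}
planning_summary = mean(planning_subelement_scores.values())

print(round(evidence_summary, 2), round(planning_summary, 2))
```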
Validity and Reliability
Standards for evaluating and scoring subelements of the PARIHS framework were discussed among the coauthors and coders, and coders worked independently once agreement on the standards was reached. Coders met weekly with the coauthors to review coding agreement and discuss issues that arose. The coauthors reviewed and reconciled differences in data with the coders to ensure that the coding accurately reflected the framework constructs. Intraclass correlations (LeBreton & Senter, 2008) were computed between pairs of coders and ranged from .51 to .90, with summary score values of .83, .78, .85, and .89 for evidence, context, facilitation, and planning activities, respectively.
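For readers who wish to reproduce this kind of reliability check, the sketch below computes an intraclass correlation for one pair of coders on hypothetical summary scores for 13 hospitals. The article does not state which ICC form was used, so the example assumes a two-way, single-rater, absolute-agreement form; a different form would change the exact values.

```python
# Hedged sketch of a pairwise intraclass correlation; data are hypothetical.
import numpy as np

def icc_two_way_single(x):
    """Two-way, single-rater, absolute-agreement ICC for an
    n_targets x k_raters matrix of scores."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-hospital means
    col_means = x.mean(axis=0)   # per-coder means
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

coder_a = [2.1, 1.8, 2.5, 3.0, 1.2, 2.7, 2.0, 1.5, 2.9, 3.1, 1.9, 2.4, 2.2]
coder_b = [2.3, 1.6, 2.4, 2.8, 1.4, 2.9, 2.1, 1.7, 3.0, 2.9, 2.0, 2.6, 2.0]
print(round(icc_two_way_single(np.column_stack([coder_a, coder_b])), 2))
```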
Analysis
Scores for the three PARIHS elements (evidence, context, and facilitation) and for successful implementation (planning activities) were analyzed with descriptive statistics, and relationships between predictor and dependent variables were then examined with bivariate regression analysis using SAS Version 9.3. Because of the small sample size (N = 13) and the clearly directional (positive) expected relationships, one-tailed tests with p < .05 were accepted as evidence of a relationship.
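As an illustration of this analysis, the sketch below (hypothetical data, not the study's SAS code) regresses a planning activities score on a PARIHS element score, converts the default two-tailed p value to a one-tailed value for the expected positive direction, and computes the critical correlation implied by a one-tailed alpha of .05 with N = 13, which matches the r > .476 cutoff reported with Table 7.

```python
# Minimal sketch of the bivariate analysis; scores are hypothetical.
import numpy as np
from scipy import stats

facilitation = np.array([2.0, 3.1, 1.5, 2.8, 2.2, 3.5, 1.1, 2.6, 3.0, 1.8, 2.4, 3.3, 2.9])
planning = np.array([1.4, 2.5, 1.0, 2.2, 1.6, 2.8, 0.8, 1.9, 2.6, 1.2, 1.7, 2.4, 2.3])

# Bivariate (simple) regression; rvalue is the Pearson correlation.
res = stats.linregress(facilitation, planning)
r, p_two_tailed = res.rvalue, res.pvalue
# For an expected positive relationship, halve the two-tailed p when r > 0.
p_one_tailed = p_two_tailed / 2 if r > 0 else 1 - p_two_tailed / 2

# Critical r for a one-tailed test at alpha = .05 with df = N - 2 = 11.
t_crit = stats.t.ppf(0.95, df=11)
r_crit = t_crit / np.sqrt(t_crit**2 + 11)   # approximately .476

print(round(r, 3), round(p_one_tailed, 4), round(r_crit, 3))
```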
Findings
The distribution of scores indicated a range of values across the sample of 13 hospitals. The means and standard deviations across hospitals for each of the PARIHS primary elements (reflected in summary scores) and subelements are shown in Table 6. For the PARIHS elements, the summary scores from the interview content were highest for facilitation (2.52), followed by context (2.27), and lowest for evidence (1.77). For successful implementation, the planning activities scores showed considerable variability across subelements. In particular, problem identification (3.32) had the highest score, followed by intervention design (2.12), whereas implementation strategy and reinforcement plan were scored considerably lower (0.96 and 0.65, respectively).
Table 6.
Scores for PARIHS framework main elements and subelements in 13 hospitals
PARIHS framework elements (predictor variables) | Mean score (0–4) | Standard deviation |
---|---|---|
Evidence summary score | 1.77 | 0.36 |
Research | 0.81 | 0.78 |
Clinical experience | 2.58 | 0.57 |
Patient experience | 1.46 | 0.88 |
Information/data from local context | 2.23 | 0.83 |
Context summary score | 2.27 | 0.58 |
Receptive context | 1.85 | 0.63 |
Culture | 2.73 | 0.93 |
Leadership | 2.25 | 0.88 |
Evaluation | 2.23 | 0.81 |
Facilitation summary score | 2.52 | 0.89 |
Role of facilitator | 2.58 | 1.00 |
Goal/purpose | 2.41 | 1.00 |
Characteristics/style | 2.62 | 1.00 |
Skills/attributes | 2.46 | 1.20 |
Planning activities (dependent variable) | Mean score (0–4) | Standard deviation |
Planning activities summary score | 1.76 | 0.73 |
Problem identification | 3.32 | 1.09 |
Intervention design | 2.12 | 0.85 |
Implementation strategy | 0.96 | 1.14 |
Reinforcement plan | 0.65 | 0.58 |
N = 13 hospitals. PARIHS = Promoting Action on Research Implementation in Health Services.
As shown in Table 7, bivariate regression analysis yielded a number of relationships between descriptions of the PARIHS elements and descriptions of successful implementation. Among the three PARIHS elements, both facilitation and context showed significant relationships with planning activities, but none emerged for evidence. In particular, for context, the leadership subelement was related to the problem identification subelement of planning activities, and the context summary score was related to the intervention design subelement of planning activities. For facilitation, multiple relationships were identified between facilitation subelements and planning activities subelements, particularly the problem identification and intervention design subelements. Likewise, the facilitation summary score and the planning activities summary score were related to each other and to multiple subelements.
Table 7.
Bivariate relationships between PARIHS framework predictor elements and successful implementation planning activities elements in 13 hospitals
 | Planning activities subelements and summary score | | | | |
---|---|---|---|---|---|
 | Problem identification | Intervention design | Implementation strategy | Reinforcement plan | Planning summary score |
Evidence | |||||
Research | .326 | .163 | .201 | −.145 | .22 |
Clinical experience | .185 | .454 | .387 | .24 | .402 |
Patient experience | .289 | −.162 | −.23 | .152 | .001 |
Information/local data | −.435 | .137 | .141 | −.093 | −.086 |
Evidence summary score | .176 | .251 | .206 | .056 | .232 |
Context | |||||
Receptive context | .232 | .036 | −.154 | −.071 | .023 |
Culture | .218 | .442 | .166 | .321 | .34 |
Leadership | .509* | .474 | .247 | .142 | .455 |
Evaluation | .255 | .355 | .439 | .261 | .424 |
Context summary score | .432 | .49* | .271 | .254 | .462 |
Facilitation | |||||
Role of facilitator | .556** | .656** | .477* | .08 | .604** |
Goal/purpose | .291 | .671** | .544* | .222 | .564** |
Character/style | .532* | .646** | .276 | .177 | .532* |
Skills/attributes | .359 | .519* | .257 | .111 | .409 |
Facilitation summary score | .51* | .731** | .452 | .172 | .617** |
N = 13 hospitals. PARIHS = Promoting Action on Research Implementation in Health Services.
* p < .10, two-tailed; p < .05, one-tailed (cutoff r > .476).
** p < .05, two-tailed; p < .025, one-tailed (cutoff r > .553).
Discussion
Research findings about what facilitates the implementation of evidence-based quality improvement interventions are needed to assist health care organizations in developing implementation strategies. The PARIHS framework (Kitson et al., 1998; Rycroft-Malone et al., 2002; Kitson et al., 2008) has considerable face validity and conceptual relevance for examining implementation of quality improvement interventions in health care settings. The three predictors of successful implementation in the PARIHS framework (evidence, context, and facilitation) were identified from a considerable body of theory and research. However, as summarized in a systematic literature review (Helfrich et al., 2010), evidence directly testing the PARIHS framework is limited and indeed difficult to obtain, given that samples of multiple settings undertaking a common intervention are rare.
For the current study, the PARIHS framework was applied to studying the process by which 13 small rural hospitals implemented TeamSTEPPS. The first step in our analysis examined the extent to which each of the PARIHS framework elements was discussed by those involved in TeamSTEPPS as evident in their hospital’s implementation experience. Our analysis indicated that the implementation of TeamSTEPPS in small rural hospitals reflected all three predictor elements, although these hospitals exhibited greater discussion of facilitation and context and less discussion of evidence in promoting implementation. An analysis of subelements showed that all components of facilitation and context were rated relatively high, but for evidence, the subelement of research was rated low compared with other subelements. This lack of reliance on evidence may have resulted from the nature of TeamSTEPPS, which (a) was built on largely nonclinical evidence (i.e., teamwork and change management) that is beyond the implementation teams’ expertise and (b) was established by national institutions such as AHRQ and accepted by the hospitals as an “evidence-based” approach. Thus, establishing and assessing research evidence supporting its usage were not the focus of the implementation teams. The prominent presence of facilitation and context elements in the implementation of TeamSTEPPS is partly because of the unique environment in which small and rural hospitals operate their quality improvement initiatives. Small and rural hospitals often have inadequate infrastructures and resources to support quality improvement activities (Casey & Moscovice, 2004; Paez, Schur, Zhao, & Lucado, 2013). For a large, complex intervention like TeamSTEPPS, these hospitals’ implementation context is uniformly weak. Thus, facilitation activities that transform the weak context into a strong context are particularly important for successful implementation in this environment (Kitson et al., 2008).
A particular challenge in examining the PARIHS framework is the limited research on how to define and measure successful implementation, the framework’s dependent variable. We adopted a recently published guide (Stetler et al., 2011) to capture successful implementation, breaking it into planning activities components using the TeamSTEPPS Action Plan as a guide. Planning activities scores showed considerable variability across subelements; hospitals engaged in some initial planning steps, but the low scores for later subelements indicate that few hospitals showed evidence of more sustained planning. Shortcomings in planning have also been noted in previous studies using the PARIHS framework (Sharp, Pineros, Hsu, Starks, & Sales, 2004). Further research using the PARIHS framework needs to provide clear definitions of how successful implementation is conceptualized, although the definition is likely to vary across studies depending on the type of quality improvement activity being implemented.
A second aspect of our analyses examined patterns of relationships in the interview data. We specifically identified multiple components of the PARIHS framework that were related to successful implementation of TeamSTEPPS planning activities. In their systematic literature review of studies using the PARIHS framework, Helfrich et al. (2010) identified only a few studies that examined the nature of dynamic relationships among components. In two case studies, Rycroft-Malone et al. (2004) found support for all three PARIHS components. Two studies examining nursing utilization of research found that both context and facilitation played a role in promoting implementation (Cummings, Estabrooks, Midodzi, Wallin, & Hayduk, 2007; Estabrooks, Midodzi, Cummings, & Wallin, 2007) as did a study of implementing evidence-based practice (Ellis, Howard, Larson, & Robertson, 2005). Sharp et al. (2004), in their study of six Veterans Health Administration medical centers, found that context was the strongest component predicting successful implementation. Thus, previous research has found support for all three components separately, with the strongest previous support for context and facilitation, as we also observed.
The importance of facilitation in implementation has been supported by previous research (Ellis et al., 2005; Harvey et al., 2002; Kitson et al., 1998; Rycroft-Malone et al., 2004; Sharp et al., 2004). In the PARIHS model, facilitation is placed on a continuum from being task-oriented to being holistic. It is often equated with identifying clinical practice champions as content experts; however, facilitators in our hospitals more frequently took on project management roles. Sharp et al. (2004) studied barriers to implementation and recommended that a recognized leader be assigned the responsibility for coordinating activities and facilitating communication. Consistent with this recommendation, hospitals in our study that showed progress in implementing TeamSTEPPS often had designated facilitators who operated as coordinators.
Practice Implications
These findings hold important implications for small rural hospitals. First, the originators of the PARIHS framework indicate that all three components must be present for successful implementation to take place. Our findings suggest that this is not necessarily true. In particular, our analyses indicated that none of the facets of successful implementation were related to evidence. This is potentially important for small rural hospitals, which may face extra challenges in establishing evidence, that is, deriving knowledge from a variety of tested and credible sources that support the effectiveness of the intervention. Staff in small hospitals often lack the time and skills to evaluate the evidence base of an intervention themselves (Ellis et al., 2005; Milner, Estabrooks, & Myrick, 2005; Rycroft-Malone et al., 2004). Our findings strongly indicate that, in hospitals with a less supportive evidence base, good facilitation can help overcome weak organizational resources for establishing evidence. Carefully selected facilitators with the right knowledge and skills, such as nurse educators (Milner et al., 2005), can help bridge the gaps between research and practice. Although it takes time and effort to change the elements of context, a well-developed facilitation approach can both drive the change process and contribute to transformation of the organization. Finally, because facilitation also plays an important role in sustaining these efforts (Conklin & Stolee, 2008), it is important to consider the implications of turnover among facilitators.
The findings from the current study show considerable support for the PARIHS framework; however, limitations must be recognized. First, successful implementation has not been clearly defined by the originators of the PARIHS framework and is thus subject to variation in how it is operationalized across studies. Second, how components of the framework are analyzed in qualitative research is always subject to judgment. Third, implementation studies often involve relatively small samples, which in our study limited the power of the statistical analysis. Balancing these limitations are several strengths of the current study. First, the use of a clear conceptual framework permitted relationships to be explored and linked to existing research. Second, the sample of 73 interviewees at 13 hospitals provided a wealth of data for analysis. Third, the intervention being examined, TeamSTEPPS, is being widely promoted for enhancing patient safety, which increases the significance of the findings presented here. Finally, studying factors that promote implementation in hospitals that are motivated to undertake quality initiatives has ready applicability in health care practice.
Acknowledgments
The authors thank Michelle Martin, Jill Scott-Cawiezell, Tom Vaughn, and Kelli Vellinga for contributing to the research project.
The research protocol was approved by the University of Iowa Institutional Review Board.
This research was supported by Grant number R18HS018396 from the Agency for Healthcare Research and Quality (AHRQ). The content is solely the responsibility of the authors and does not necessarily represent the official views of the AHRQ.
Footnotes
The authors have disclosed that they have no significant relationship with, or financial interest in, any commercial companies pertaining to this article.
References
- Armour Forse R, Bramble JD, & McQuillan R (2011). Team training can improve operating room performance. Surgery, 150, 771–778.
- Capella J, Smith S, Philp A, Putnam T, Gilbert C, Fry W, … Remine S (2010). Teamwork training improves the clinical care of trauma patients. Journal of Surgical Education, 67, 439–443.
- Casey MM, & Moscovice I (2004). Quality improvement strategies and best practices in critical access hospitals. The Journal of Rural Health, 20(4), 327–334.
- Clancy CM, & Tornberg DN (2007). TeamSTEPPS: Assuring optimal teamwork in clinical settings. American Journal of Medical Quality, 22(3), 214–217.
- Coburn AF, & Gage-Croll Z (2014). Improving hospital patient safety through teamwork: The use of TeamSTEPPS in critical access hospitals. Retrieved from http://flexmonitoring.org/documents/PolicyBrief21_TeamSTEPPS.pdf
- Conklin J, & Stolee P (2008). A model for evaluating knowledge exchange in a network context. The Canadian Journal of Nursing Research, 40(2), 116–124.
- Cummings GG, Estabrooks CA, Midodzi WK, Wallin L, & Hayduk L (2007). Influence of organizational characteristics and context on research utilization. Nursing Research, 56, S24–S39.
- Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 50.
- Ellis I, Howard P, Larson A, & Robertson J (2005). From workshop to work practice: An exploration of context and facilitation in the development of evidence-based practice. Worldviews on Evidence-Based Nursing, 2, 84–93.
- Estabrooks CA, Midodzi WK, Cummings GG, & Wallin L (2007). Predicting research use in nursing organizations. A multilevel analysis. Nursing Research, 56, S7–S23.
- Feldstein AC, & Glasgow RE (2008). A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Joint Commission Journal on Quality and Patient Safety, 34(4), 228–243.
- Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, & Hunter C (2012). National Institutes of Health approaches to dissemination and implementation science: Current and future directions. American Journal of Public Health, 102(7), 1274–1281.
- Harvey G, Loftus-Hills A, Rycroft-Malone J, Titchen A, Kitson A, McCormack B, & Seers K (2002). Getting evidence into practice: The role and function of facilitation. Journal of Advanced Nursing, 37, 577–588.
- Helfrich CD, Damschroder LJ, Hagedorn HJ, Daggett GS, Sahay A, Ritchie M, … Stetler CB (2010). A critical synthesis of literature on the promoting action on research implementation in health service (PARIHS) framework. Implementation Science, 5, 82.
- Hsieh HF, & Shannon SE (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288.
- Jones F, Podila P, & Powers C (2013). Creating a culture of safety in the emergency department: The value of teamwork training. The Journal of Nursing Administration, 43(4), 194–200.
- Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, & Stall R (2007). Implementing evidence-based interventions in health care: Application of the replicating effective programs framework. Implementation Science, 2, 42.
- King HB, Battles J, Baker DP, Alonso A, Salas E, Webster J, … Salisbury M (2008). TeamSTEPPS: Team strategies and tools to enhance performance and patient safety. In Henriksen K, Battles JB, Keyes MA, & Grady ML (Eds.), Advances in patient safety: New directions and alternative approaches (Vol. 3). Rockville, MD: Agency for Healthcare Research and Quality. Retrieved from http://www.ncbi.nlm.nih.gov/books/NBK43686/
- Kitson A, Harvey G, & McCormack B (1998). Enabling the implementation of evidence based practice: A conceptual framework. Quality in Health Care, 7(3), 149–158.
- Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, & Titchen A (2008). Evaluating the successful implementation of evidence into practice using the PARiHS framework: Theoretical and practical challenges. Implementation Science, 3, 1.
- Kyratsis Y, Ahmad R, & Holmes A (2012). Making sense of evidence in management decisions: The role of research-based knowledge on innovation adoption and implementation in healthcare. Study protocol. Implementation Science, 7, 22.
- LeBreton JM, & Senter JL (2008). Answers to 20 questions about interrater reliability and interrater agreement. Organizational Research Methods, 11(4), 815–852.
- Lobb R, & Colditz GA (2013). Implementation science and its application to population health. Annual Review of Public Health, 34, 235–251.
- Milner M, Estabrooks CA, & Myrick F (2005). Research utilization and clinical nurse educators: A systematic review. Journal of Evaluation in Clinical Practice, 12(6), 639–655.
- Paez K, Schur C, Zhao L, & Lucado J (2013). A national study of nurse leadership and supports for quality improvement in rural hospitals. American Journal of Medical Quality, 28(2), 127–134.
- Rycroft-Malone J, Harvey G, Seers K, Kitson A, McCormack B, & Titchen A (2004). An exploration of the factors that influence the implementation of evidence into practice. Journal of Clinical Nursing, 13, 913–924.
- Rycroft-Malone J, Kitson A, Harvey G, McCormack B, Seers K, Titchen A, & Estabrooks C (2002). Ingredients for change: Revisiting a conceptual framework. Quality & Safety in Health Care, 11, 174–180.
- Sandelowski M (2001). Real qualitative researchers do not count: The use of numbers in qualitative research. Research in Nursing & Health, 24, 230–240.
- Sandelowski M, Voils CI, & Knafl G (2009). On quantitizing. Journal of Mixed Methods Research, 3(3), 208–222.
- Sawyer T, Laubach VA, Hudak J, Yamamura K, & Pocrnich A (2013). Improvements in teamwork during neonatal resuscitation after interprofessional TeamSTEPPS training. Neonatal Network, 32(1), 26–33.
- Sharp ND, Pineros SL, Hsu C, Starks H, & Sales AE (2004). A qualitative study to identify barriers and facilitators to implementation of pilot interventions in the Veterans Health Administration (VHA) Northwest Network. Worldviews on Evidence-Based Nursing, 1, 129–139.
- Sheppard F, Williams M, & Klein VR (2013). TeamSTEPPS and patient safety in healthcare. Journal of Healthcare Risk Management, 32(3), 5–10.
- Spiva L, Robertson B, Delk ML, Patrick S, Kimrey MM, Green B, & Gallagher E (2014). Effectiveness of team training on fall prevention. Journal of Nursing Care Quality, 29(2), 164–173.
- Stetler CB, Damschroder LJ, Helfrich CD, & Hagedorn HJ (2011). A guide for applying a revised version of the PARIHS framework for implementation. Implementation Science, 6, 99.
- TeamSTEPPS Implementation Guide. (2015). TeamSTEPPS 2.0 implementation guide (pp. 52–56). Retrieved from http://www.ahrq.gov/professionals/education/curriculum-tools/teamstepps/instructor/essentials/implguide.pdf