Author manuscript; available in PMC: 2018 Mar 15.
Published in final edited form as: Adv Sch Ment Health Promot. 2017 Mar 15;10(2):127–146. doi: 10.1080/1754730x.2017.1295814

Intent to sustain use of a mental health innovation by school providers: What matters most?

Melanie Livet 1, Mary Yannayon 1, Kelly Kocher 1, Janey McMillen 1
PMCID: PMC5796670  NIHMSID: NIHMS912810  PMID: 29403540

Abstract

Despite innovations being routinely introduced in schools to support the mental health of students, few are successfully maintained over time. This study explores the role of innovation characteristics, individual attitudes and skills, and organizational factors in school providers’ decisions to continue use of Centervention, a technology-based tool that supports implementation of evidence-based mental health interventions (EBIs). Data were collected from 44 providers through online surveys following use of Centervention over a one-year period. When considered with individual and organizational factors, experience with Centervention (usability, usefulness, and satisfaction) was found to be the most influential predictor of intent to sustain use. Results reinforce the importance of (1) differentiating between factors that predict initial adoption vs. those that enable sustainability and (2) tailoring sustainability decision models to the nature of the innovation. They also support the need to incorporate strategies to enhance provider experience during implementation of an innovation.

Keywords: sustainability, technology-based tool, school providers, organizational factors, innovation characteristics


Despite innovations (e.g., new practice, technology, or program) being routinely introduced in schools to support the mental health and well-being of students, only about half are successfully sustained over time (Cooper, Bumbarger, & Moore, 2015; Scheirer, 2005; Stirman et al., 2012). Discontinuing effective practices, technology, or programs leads to costly cycles of repeated innovation implementations, increased resistance to change, and interruptions in needed student services (McIntosh, Martinez, Ty, & McClain, 2013). Despite the need for stakeholders to understand the long-term impact of their investments, successfully predicting continued use of effective innovations promoting mental health remains a challenge (Loh, Friedman, & Burdick, 2013; Owens et al., 2014; Racine, 2006; Stroul & Manteuffel, 2007). The majority of published literature has focused on examining the variables critical to adoption and initial implementation of the innovation in a diversity of settings, including schools (Stirman, et al., 2012). Limited attempts have been made to explore predictors of sustainability (Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004; Johnson, Hays, Center, & Daley, 2004; Loh, et al., 2013; Stirman, et al., 2012). Factors that influence initial implementation, however, may not be the same as those involved in decisions to continue long-term use (Lyon et al., 2013; Massey, Armstrong, Boroughs, & Henson, 2005; Scheirer, 2013). Furthermore, most research has not considered the nature of the innovation to be sustained. Scheirer (2013) argues that differentiating between the types of innovations (e.g., implemented by individual providers vs. requiring coordination among multiple staff) is crucial, as factors predicting sustainability are likely to differ.
This study specifically investigates influences on school providers’ intent to continue using a web-based tool, Centervention, designed to support long-term implementation of evidence-based mental health interventions (EBIs).

Centervention as the Innovation

Centervention (3C Institute, 2014) was developed in light of the need for easy-to-use, accessible, low-cost support tools to accelerate successful introduction of mental health EBIs in school settings (Levin, Hennessy, & Petrila, 2010; President’s New Freedom Commission on Mental Health, 2003). Aligned with the evidence-based system for innovation support (EBSIS) model (Wandersman, Chien, & Katz, 2012), Centervention offers school providers access to four core implementation supports (Appendix A): (1) downloadable EBI resources and materials, available through an online Resource Center; (2) online EBI training, with accompanying knowledge quizzes and availability of Continuing Education credit (CEUs); (3) technical assistance (TA), to support providers with EBI implementation and use of Centervention, through online features such as Ask an Expert, Quick Tips, Q&As, as well as live support; and (4) quality assurance/improvement (QA/QI) feedback loops, including adherence tracking (to monitor provider fidelity to the EBI) and progress monitoring of students participating in the EBI (in the form of real-time data reports). Each of these core supports serves to reinforce the knowledge and skills acquired through the preceding component. The informational resources and accompanying training provide the basic information necessary to implement the EBI. These two strategies are supplemented by ongoing access to TA and reinforced through use of QA/QI processes (Wandersman et al., 2012). In addition to being grounded in theory, Centervention is also evidence-based. Based on results from a multilevel analysis, the extent to which providers used each of the Centervention supports was found to significantly impact both their adherence to the EBI and the level of student engagement during the EBI sessions. Greater adherence to the EBI was in turn predictive of better socio-emotional outcomes for participating students (Livet, Yannayon, Sheppard, Kocher, Upright, & McMillen, 2016).

Operationalizing Intent to Sustain Use

With the understanding that sustainability is a dynamic and complex process (Johnson, et al., 2004; McIntosh, Horner, & Sugai, 2009; McIntosh, Mercer, et al., 2013; Stirman, et al., 2012), this study focuses on the point in time when an individual reflects on his or her past experience with the innovation and re-evaluates whether to continue or discontinue use (Rogers, 1995, 2003). Intent to sustain use is defined as the decision to maintain use of Centervention to support implementation of mental health EBIs beyond the first 12 months, in order to continue achieving desirable student outcomes (Scheirer & Dearing, 2011). Also grounded in the social-ecological model, intent to sustain is assumed to be the result of the interconnection between multiple forces, with individuals embedded in organizations which, in turn, operate within a broader context (Scheirer & Dearing, 2011; Stirman, et al., 2012). With individual provider-implemented innovations like Centervention, the decision to continue or discontinue use is likely to be influenced by the individual provider’s motivation and perceptions of these forces, assuming continued need for and access to the innovation (Han & Weiss, 2005; Scheirer, 2013).

Background Literature

Common to both the sustainability and technology acceptance literature, both of which guided the identification of the most important variables for the current study, are three sets of factors influencing continued use of any innovation: (1) characteristics of the innovation and its fit with user expectations; (2) individual attitudes and skills related to the innovation; and (3) organizational facilitators and barriers (e.g., sufficient resources and support, lack of competing priorities) (McIntosh et al., 2013; Scheirer & Dearing, 2011; Stirman, et al., 2012; Venkatesh & Bala, 2008). These factors are believed to operate synergistically to impact decisions and use of an innovation (Briesch, Chafouleas, Neugebauer, & Riley-Tillman, 2013). Broader contextual factors (e.g., educational policies, district initiatives) have also been found to impact sustainability (Owens et al., 2014; Scheirer & Dearing, 2011), but are beyond the scope of this study.

The importance of innovation characteristics has been extensively investigated in relation to initial adoption of technology in schools and other settings (Straub, 2009; Zhao & Frank, 2003). Few studies found these characteristics to be associated with sustainability, due to either lack of influence or lack of attention to these constructs (Stirman, et al., 2012). In studies reporting a significant relationship, sustained use and intent to continue use seemed to be most consistently related to the individual’s overall experience with the innovation, including perceived usefulness (benefits, demonstrable outcomes) and usability (Lippert & Forman, 2005; Lyon, et al., 2013; McIntosh et al., 2013). School providers needed to experience the effects of the innovation and weigh the costs and burden of continued implementation before deciding to continue long-term use.

Individual providers’ beliefs and skill set have also been found to contribute to continued use (McIntosh et al., 2009). The provider’s level of comfort with technology, attitudes toward and existing knowledge of the innovation content, and readiness to continue implementing the practice have been found to influence the sustainability of innovations in schools (Cooper, et al., 2015; Lyon, et al., 2013; Scheirer, 2005).

Finally, despite the growing interest in the role of organizational climate and culture, there have been fewer studies than expected examining the influence of these concepts on decisions to continue using an innovation (Stirman, et al., 2012). School-based programs were found more likely to be sustained in more favorable organizational contexts (Rohrbach, Grana, Sussman, & Valente, 2006). Organizational barriers (e.g., time availability), stability of resources, and a supportive working climate (e.g., availability of professional development opportunities, support from peers and principals) impacted the provider’s decision to continue use (Owens et al., 2014).

Current Study

This study aims to understand the predictors of school providers’ decision to continue use of Centervention to support mental health EBIs beyond the initial implementation period (assuming continued free access to the tool and need to continue implementing these EBIs). Providers had an opportunity to use Centervention over the course of a year prior to this study. The influence of Centervention on the predictors of interest following the initial year of implementation was first assessed, prior to inclusion in the predictive models. Based on previous literature, experience with the innovation, provider readiness to maintain the practice, attitudes towards EBIs and technology, and organizational factors directly impacting the providers’ ability to continue implementation (support, resources, and stressors) were hypothesized to significantly predict intent to sustain use.

Method

Study findings are based on data from the Centervention efficacy trial, a year-long investigation of the benefits of using Centervention to implement mental health EBIs in schools. Only data relevant to the current study were analyzed.

Participants and Settings

Participants were 44 school-based providers (counselors, classroom teachers) from 28 elementary schools in North Carolina and Florida. Out of the 48 participating providers, two were unable to implement the EBI within the designated timeframe, and two had insufficient data to be included in the current study. The sample was predominantly female (95%) with a racial distribution of 86% White and 7% African American, and 2% of participants reporting Latino/Hispanic ethnicity. The majority (73%) had graduate degrees with 27% having more than 20 years of professional experience. Schools were initially recruited through project staff’s personal connections and the 3C Institute (Centervention’s purveyor) newsletter. This quarterly newsletter disseminates information about the Institute’s work and research study opportunities to its extensive network of partners and customers, including schools. To ensure commitment to participating and implementing the EBIs offered through Centervention, schools had to apply to participate in the study. Feasibility of each school plan for completing study requirements (e.g., proposed timeline, signed agreement from school administration) was then reviewed by a project staff member. Both the written application and a phone interview with the school liaison (most often the counselor responsible for EBI implementation) were used as part of this review process. 
School-identified providers were also asked to select at least one of the following social-emotional health EBIs available in Centervention: Social Skills Group Intervention (S.S.GRIN) (N=16) (DeRosier, 2004; DeRosier, 2007; DeRosier & Marcus, 2005), Adventures in Emotional Literacy (AEL) (N=15) (Brightwood, DeRosier, Maschauer, & Wilson, 2009; Brightwood, DeRosier, & Maschauer, 2009; Craig, Leary, Parker, McMillen, & DeRosier, in preparation), LifeStories for Kids (LSK) (N=13) (Brightwood & DeRosier, 2007; DeRosier & Brightwood, 2007; DeRosier & Mercer, 2007), and Positive Action (PA) (N=3) (Flay & Allred, 2003).

Procedure

Prior to the efficacy trial, participants were provided with a list of study expectations and an overview of the Centervention functionalities during an orientation webinar. Although participating schools and providers had already made the decision to adopt and use Centervention to facilitate implementation of the EBIs over a one-year period, extent of use was voluntary. Providers were encouraged to: (1) use the online and print EBI resources and materials provided (e.g., EBI toolkit, story videos for use during EBI sessions, content knowledge checks); (2) complete the online EBI training accessible through Centervention, with opportunities to earn CEUs upon successful training quiz completion as an additional incentive; and (3) enter the weekly data on their fidelity to the EBI and on each participating student’s progress (used by Centervention to automatically create real-time reports for viewing). TA was available through use of online features such as Q&As or by calling or emailing project staff (live TA). Providers used all four components of Centervention, with variability in training completion (although all participants spent time viewing the training), an average of five reports being downloaded for each EBI student group, minimal needs for TA, and very high satisfaction with the provided EBI resources (Livet et al., 2016). In addition, participants were expected to complete surveys prior to study participation, and one-year post-implementation. Relevant to this study were data related to intent to sustain use and experience with the tool, collected post-implementation, and data related to provider attitudes/readiness and organizational factors, collected both pre- and post- study. Providers implemented their EBI(s) either during Fall 2013 or Spring 2014 for 9 or 10 weeks (depending on the EBI). Unlimited free access to Centervention was also available for an additional school year, beyond the one-year study.

Measures1 (Appendix B)

Intent to sustain use

Intent to continue using Centervention was measured post-implementation by asking participants if they were likely to sustain use of a number of Centervention components and tools, given free access and the need to continue delivering the available EBIs (e.g., “I am likely to continue using the tools for monitoring student progress”). The overall score represented intent to sustain use across all Centervention components. This measure, which was specifically created for this study, comprised 6 items (5-point Likert scale, from “strongly disagree” to “strongly agree,” with higher scores indicating greater agreement) and had good reliability in the current sample (Cronbach’s alpha = .85).
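Cronbach’s alpha, reported throughout this section, can be computed directly from an item-response matrix. A minimal numpy sketch, using fabricated Likert responses rather than the study’s data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated responses: 8 respondents x 6 Likert items (1-5)
responses = [
    [4, 4, 5, 4, 3, 4],
    [2, 2, 1, 2, 2, 3],
    [5, 4, 5, 5, 4, 5],
    [3, 3, 3, 2, 3, 3],
    [4, 5, 4, 4, 4, 4],
    [1, 2, 2, 1, 2, 1],
    [3, 4, 3, 3, 3, 4],
    [5, 5, 4, 5, 5, 5],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Because the items above track each respondent’s overall level closely, the resulting alpha is high, which is the pattern the 6-item intent measure (alpha = .85) reflects.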

Experience with the tool (usability, usefulness, and satisfaction)

Perceived usability of Centervention was assessed using a 10-item scale that included items from the Post-Study System Usability Questionnaire (PSSUQ) and the After-Scenario Questionnaire (ASQ) (Lewis, 1995). The original scales were modified to include only the items that were deemed relevant by the study team. In addition, to add specificity to response options and ease interpretation, Lewis’ 7-point Likert scale was changed to a 5-point Likert with descriptive Likert scale anchors for each option (with higher scores indicating more positive ratings). Based on results from the exploratory factor analysis conducted with this sample to assess the underlying structure of the 10 items selected2, this measure included two subscales, ease of use (6 items) and task efficiency (4 items), with Cronbach’s alphas of .92 and .89 respectively. Perceived usefulness consisted of 23 items created for this study to assess how different functionalities supported the provider in accomplishing implementation tasks (e.g., “the student center supported me in tracking student progress”) and had excellent internal consistency (Cronbach’s alpha=.96). Finally, tool satisfaction ratings were obtained by combining three items created for this study that assessed value to the provider, satisfaction with the tool, and willingness to recommend to colleagues (Cronbach’s alpha=.91). Responses for all three measures ranged from “strongly disagree” to “strongly agree” on a 5-point Likert scale, with higher scores indicating higher levels of agreement. These data were collected post-implementation, once participants had an opportunity to use Centervention.

Provider attitudes and readiness

The Evidence-Based Practice Attitude Scale (EBPAS) was used to measure school provider attitudes towards EBIs (Aarons, 2004). The scale is composed of four subscales, representative of four distinct constructs: general openness towards EBIs (openness), willingness to implement EBIs given their intuitive appeal (appeal), willingness to implement EBIs if required (requirements), and perceived divergence from usual practices (divergence). Minor wording changes were made to the original items to reflect the study context (e.g., “students” instead of “clients”). Responses ranged from “not at all,” to “a very great extent” using a 5-point Likert scale. The items were scored to reflect positive attitudes, with higher scores indicating greater agreement on the openness, appeal, and requirements subscales, but lower agreement on the divergence subscale (i.e., higher agreement towards lower divergence). The scale has demonstrated good internal consistency reliability (Aarons, 2004). Participants also completed a 4-item measure (5-point Likert scale, from “strongly disagree” to “strongly agree”), created specifically for this study, to assess comfort with technology (e.g., “I use technology (iPad, Kindle, Facebook, smartphones, etc.) frequently in my personal life”) (Cronbach’s alpha=.91). Finally, the Organizational Readiness for Implementing Change (ORIC) measure (Shea, Jacobs, Esserman, Bruce, & Weiner, 2014) was adapted to assess individual (rather than organizational) readiness to continue implementing this innovation through minor wording changes (e.g., “I am motivated to implement these innovations” rather than “People who work here are motivated to implement this change”). The measure comprises two subscales, commitment (5 items) and change efficacy (7 items), with good reliability (Shea et al., 2014). Responses are scored on a 5-point Likert scale, from 1 (“strongly disagree”) to 5 (“strongly agree”).
These data were collected both prior to and after the implementation year, allowing a thorough description of the influence of Centervention, prior to conducting predictive analyses.

Organizational factors

Perceptions of the schools’ structural readiness to implement change were assessed by exploring resources, organizational stressors, and support (5-point Likert scale, from “strongly disagree” to “strongly agree”). The items were scored to indicate positive organizational conditions, with higher scores reflecting greater agreement that resources and support were present, but lower agreement that organizational stress levels were high (i.e., higher agreement towards lower organizational stress). The resources items evaluated providers’ perceptions of whether each school had the necessary staff, time, and facilities to implement new innovations (4 items; Cronbach’s alpha=.75). The organizational stressors questions assessed the perceived stress and workload of school providers (3 items; Cronbach’s alpha=.80). Willingness to help one another, encouragement for change, and opportunities for professional development comprised the support scale (3 items; Cronbach’s alpha=.82). These items originated from the Texas Christian University Organizational Readiness for Change Assessment (TCU-ORC-S version) (Lehman, Greener, & Simpson, 2002). The items were selected based on relevance to the study, from both the TCU-ORC-S resources scales (offices, staffing, and training) and organizational climate scales (cohesion, stress, and change). Results of an exploratory factor analysis conducted with the current dataset revealed the three subscales described above3. Like the provider attitudes and readiness variables, these data were collected both pre- and post-implementation.

Data Analysis

Analyses were conducted using SPSS v.22. To test instrument reliability, only post-implementation Cronbach’s alphas were used, since the majority of the analyses used post-survey data only. To better understand participants’ overall experience with Centervention over the first year of implementation, descriptive statistics, including means, standard deviations, and frequencies, were initially computed for the post-survey measures of interest. Centervention’s influence on providers’ perceptions was further explored by running paired sample t-tests for pre- and post-survey measures of both provider attitudes and readiness, and organizational factors.
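The pre/post comparisons described here are standard paired-sample t-tests. Although the study used SPSS, the computation itself is simple; a numpy-only sketch with fabricated ratings (not the study’s data):

```python
import math
import numpy as np

def paired_t(pre, post):
    """Paired-sample t-test: t = mean(d) / (sd(d) / sqrt(n)), where d = pre - post."""
    d = np.asarray(pre, dtype=float) - np.asarray(post, dtype=float)
    n = len(d)
    t = d.mean() / (d.std(ddof=1) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Fabricated pre/post commitment ratings for 8 hypothetical providers
pre = [4.8, 4.6, 5.0, 4.4, 4.9, 4.7, 4.5, 4.8]
post = [4.5, 4.4, 4.6, 4.3, 4.4, 4.5, 4.1, 4.6]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

A positive t here indicates a pre-to-post decrease, matching the direction of the significant commitment and resources results reported below; the p-value would then be looked up against a t distribution with n − 1 degrees of freedom.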

To predict intent to sustain use of Centervention, only post-survey data were used. The study focus was on predicting intent to continue use after providers had an opportunity to use Centervention for a year, essentially using post-data as the baseline (regardless of how participants initially responded prior to this experience). To identify the specific variables to be included as predictors in subsequent analyses, Pearson’s bivariate correlations were conducted between the dependent variable (DV; intent to sustain use) and each of the independent variables (IVs; perceived usability - including ease of use and task efficiency; perceived usefulness; tool satisfaction; attitudes towards EBIs including all four subscales; comfort with technology; individual readiness – including commitment and change efficacy; and perceptions of school structural readiness - including resources, organizational stressors, and support)4. Only variables that significantly correlated with intent to sustain use were included in the multiple regression analyses. As guided by the ecological model that underscores the importance of evaluating multiple levels of influence simultaneously, multiple regression was selected to assess the predictive strength of each included IV on the DV, in the context of the other significant IVs. R-squared estimates are reported to indicate the percentage of variance accounted for by a particular model. The relative importance of each IV was reviewed by examining beta weights (standardized multiple regression coefficients) and uniqueness indices. Uniqueness indices for each predictor were assessed by reviewing the R-squared differences between each of the original regression models and each of the restricted or nested models.
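The uniqueness-index procedure (the full model’s R-squared minus the R-squared of a nested model with one predictor removed) can be sketched compactly in numpy. The data below are synthetic, constructed only to illustrate the logic, not drawn from the study:

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least squares fit with an intercept."""
    X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
    Xb = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    resid = y - Xb @ beta
    return 1 - resid.var() / y.var()

def uniqueness_indices(X, y):
    """Uniqueness index per predictor: R^2(full model) - R^2(model without it)."""
    X = np.asarray(X, dtype=float)
    full = r_squared(X, y)
    return np.array([full - r_squared(np.delete(X, j, axis=1), y)
                     for j in range(X.shape[1])])

# Synthetic example: y driven mostly by the first of three correlated predictors
rng = np.random.default_rng(1)
x1 = rng.normal(size=60)
X = np.column_stack([x1,
                     x1 + rng.normal(scale=0.8, size=60),  # correlated with x1
                     rng.normal(size=60)])                 # independent noise
y = 0.8 * x1 + rng.normal(scale=0.5, size=60)
print(uniqueness_indices(X, y).round(2))
```

As in the study’s results, a predictor correlated with others can show a small uniqueness index even when it predicts the outcome well on its own, because much of its explanatory power is shared.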
Multicollinearity was checked by examining two diagnostic indices, the tolerance and the Variance Inflation Factor (VIF) (i.e., the reciprocal of the tolerance; it measures how much the variance of the estimated regression coefficients is inflated compared to having uncorrelated predictors). A tolerance value below or equal to .40 (and a VIF greater than 2.5) indicates multicollinearity (Allison, 1999).
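Both diagnostics follow directly from their definitions: regress each predictor on the remaining predictors, take tolerance = 1 − R², and VIF = 1/tolerance. A minimal numpy sketch on synthetic data (not the study’s):

```python
import numpy as np

def tolerance_and_vif(X):
    """Tolerance (1 - R^2 of each predictor on the others) and VIF (1/tolerance)."""
    X = np.asarray(X, dtype=float)
    tol = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        Xb = np.column_stack([np.ones(len(X)), others])
        beta, *_ = np.linalg.lstsq(Xb, X[:, j], rcond=None)
        resid = X[:, j] - Xb @ beta
        tol.append(resid.var() / X[:, j].var())  # residual share = 1 - R^2_j
    tol = np.array(tol)
    return tol, 1.0 / tol

# Columns 1 and 2 are nearly collinear; column 3 is roughly independent
X = np.array([[1, 1.1, 0.2], [2, 1.9, -0.5], [3, 3.1, 1.0],
              [4, 3.9, -1.2], [5, 5.2, 0.4], [6, 5.8, -0.3]])
tol, vif = tolerance_and_vif(X)
flagged = (tol <= 0.40) | (vif > 2.5)  # Allison (1999) cutoffs
print(tol.round(3), vif.round(2), flagged)
```

With these data the first two predictors are flagged and the third is not, illustrating how the cutoffs behave.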

Results

Understanding Overall Experience with and Influence of the Innovation on Providers: Descriptive Statistics and t-tests

When asked whether they intended to continue using Centervention post-study (N=42), 59.5% of participants indicated they would, 38.1% were unsure, and 2.4% reported they would not. More than 77% (N=44) of respondents “agreed” or “strongly agreed” that Centervention was easy to use (ease of use). More than 77% (N=44) also “agreed” or “strongly agreed” that this tool helped support their EBI implementation efforts (perceived usefulness). In addition, more than 64% (N=44) of participants “agreed” or “strongly agreed” that the tool was valuable, that they were satisfied with it, and that they would recommend its use to colleagues (tool satisfaction). Task efficiency was the only tool experience variable rated more neutrally, with a mean of 3.15 (“neither agree nor disagree”).

There were no significant changes from pre- to post-implementation for provider attitudes towards EBIs (t(43)=1.80, p=ns), change efficacy (t(41)=1.37, p=ns), and comfort with technology (t(43)=-.72, p=ns). Post-test means on these variables still indicated positive ratings, with means above 3.5 on 5-point Likert scales (pre-EBPAS: M=3.84, SD=.60; post-EBPAS: M=3.68, SD=.57; pre-change efficacy: M=4.56, SD=.53; post-change efficacy: M=4.48, SD=.55; pre-comfort with technology: M=4.59, SD=.73; post-comfort with technology: M=4.63, SD=.71). Likewise, although willingness to commit to innovation implementation significantly decreased after this first year (t(41)=3.06, p<.05), participants still reported agreeing that they were committed based on a post-test mean of 4.36 (SD=.71) (pre-commitment: M=4.68, SD=.58). There was also a significant decrease in pre- and post-implementation resources scores (t(41)=3.84, p<.01), although participants still “agreed” they had the resources necessary to implement the innovation (pre-resources: M=4.21, SD=.87; post-resources: M=3.65, SD=.91). Finally, participants’ perceptions of organizational stressors and support remained unchanged over the course of the implementation year (t(39)=-.56, p=ns, and t(41)=1.74, p=ns, respectively) (pre-organizational stressors: M=2.49, SD=.95; post-organizational stressors: M=2.61, SD=1.13; pre-support: M=4.42, SD=.69; post-support: M=4.25, SD=.87). Table 1 summarizes post-test means and standard deviations for the variables of interest.

Table 1.

Descriptive Statistics: Intent to Sustain Use and post-test IVs

Variable N M1 SD
Intent to sustain use 42 3.57 .61
Experience with the tool
Perceived usability
  Ease of use 44 3.94 .78
  Task efficiency 44 3.15 1.10
Perceived usefulness 44 3.92 .60
Tool satisfaction 44 3.73 .86
Provider attitudes and readiness
Evidence-based practice attitude scale
  EBPAS - Openness 44 4.00 .72
  EBPAS – Divergence (RC)2 44 3.51 .78
  EBPAS - Appeal 44 3.65 .86
  EBPAS - Requirements 42 3.42 1.51
Comfort with technology 44 4.63 .71
Individual readiness (IR)
  IR - commitment 42 4.36 .71
  IR - change efficacy 42 4.48 .55
Organizational factors
Resources 42 3.65 .91
Organizational stressors (RC) 40 2.61 1.13
Support 43 4.25 .87
1 Means above 3.5 reflected agreement or strong agreement in the positive direction (e.g., higher ease of use). For the two reverse-coded subscales (divergence and organizational stressors), means above 3.5 reflected agreement or strong agreement that there was lower divergence (high convergence) and lower organizational stress levels (positive levels of organizational stressors).

2 RC = reverse coded

Predicting Sustained Use of Centervention: Correlations and Multiple Regression Analyses

Intent to sustain use was found to significantly and positively correlate with all of the tool experience variables (perceived usability, perceived usefulness, and tool satisfaction), individual readiness (both change efficacy and commitment), and resources (Table 2). Attitudes toward EBIs, comfort with technology, organizational stressors, and support were not found to correlate significantly with intent to sustain use (Table 2). Consequently, the tool experience variables, the individual readiness subscales, and resources were identified as potential predictors to be included in the multiple regression. Because the number of predictors was limited to a maximum of four based on sample size (Maxwell, 2000), a tool experience variable was created by averaging the scores from the perceived usability, perceived usefulness, and tool satisfaction scales (Cronbach’s alpha=.91). Likewise, both the change efficacy and commitment subscales were combined to recreate the original individual readiness scale score prior to inclusion. Intent to sustain use was therefore regressed on the linear combination of tool experience, individual readiness, and resources. The model significantly predicted 44% of the variance in the DV (adjusted R2=.39; F(3, 37)=9.67, p<.01), with no multicollinearity issues (VIF below 2.5, and tolerance index above .40). Only tool experience had a significant beta coefficient (β=.46, p<.01). Based on uniqueness indices, tool experience accounted for 16% of the variance, individual readiness for 3%, and resources for 2%, indicating a fair amount of shared variance among predictors. As a further exploratory analysis, simple regressions with the nonsignificant variables as sole predictors of intent to sustain use were conducted.
When entered separately, both individual readiness and resources predicted a significant portion of the variance in intent to sustain use (individual readiness: R2=.22; F(1, 39)=10.69, p<.01; and resources: R2=.18; F(1, 39)=8.65, p<.01). In other words, when all three variables were entered in combination, experience with the tool was the only significant predictor of intent to continue use. However, when evaluated in isolation, individual readiness and resources significantly predicted intent to continue use.

Table 2.

Correlations of Intent to Use with IVs

Intent to sustain use
Pearson r Sig. (2-tailed)
Experience with the tool
Perceived usability
   Ease of use .37* .02
   Task efficiency .33* .03
Perceived usefulness   .58** .00
Tool satisfaction   .64** .00
Provider attitudes and readiness
Evidence-based practice attitude scale
   EBPAS - Openness .30 .06
   EBPAS - Divergence .07 .66
   EBPAS - Appeal .06 .69
   EBPAS - Requirements −.18 .28
Comfort with technology .01 .97
Individual readiness (IR)
  IR - commitment   .48** .00
  IR - change efficacy   .40** .01
Organizational factors
Resources   .43** .01
Organizational stressors −.26 .12
Support .07 .68
* p ≤ .05

** p ≤ .01

Discussion

This study aimed to explore the role of innovation characteristics, individual attitudes and skills, and organizational factors in school providers’ decisions to continue use of Centervention, a technology-based system to support implementation of mental health EBIs. Participating providers had the opportunity to use Centervention over the course of a year prior to this study. Experience with and influence of Centervention on the factors of interest were first examined based on descriptive statistics. Predictive models were then assessed to identify the most influential factors of intent to maintain use of the tool. With the sustainability literature being in its infancy (Aarons, Hurlburt, & Horowitz, 2011; Greenhalgh, et al., 2004; Johnson, et al., 2004; Loh, et al., 2013; Scheirer, 2013), this study adds to the current body of research by exploring intent to sustain use for a specific type of innovation, an individual provider-implemented technology-based tool.

Overall Experience with and Influence of the Innovation: Summary

Based on the descriptive data, Centervention was rated positively overall (i.e., easy to use, supportive of providers’ EBI implementation efforts, and valuable). The ease of completing Centervention tasks was rated neutrally, possibly because a few software flaws were still being resolved (e.g., slow upload of the tracking feature). Alternatively, study participation required regular data collection and entry, a potentially burdensome task for participants. Providers’ attitudes towards EBIs, sense of efficacy and commitment to implementing the innovation, comfort with technology, and organizational factors (resources, stressors, and support) were still rated highly after the first year of implementation, despite the significant decreases observed on the commitment and resources variables. It is possible that, despite a positive experience with the tool, participants gained a better understanding of the amount of time and resources required to implement this innovation.

Predicting Intent to Sustain Use: Key Findings

As expected based on previous literature (Forman, Olin, Hoagwood, Crowe, & Saka, 2009; Greenhalgh, et al., 2004; Han & Weiss, 2005; Holahan, Aronson, Jurkat, & Schoorman, 2004; Johnson, et al., 2004; Lyon, et al., 2013; Scheirer, 2005; Stirman, et al., 2012), intent to sustain use was positively related to a number of individual and organizational factors, including the providers’ change efficacy and commitment to maintaining the new practice over time, and the resources available to sustain the innovation. However, school providers’ attitudes toward technology and EBIs, organizational stressors (e.g., stress and workload), and a supportive working climate did not influence their decision to continue use of Centervention. It is possible that since providers had had a year of experience interacting with the tool, general attitudes toward technology and EBIs were no longer a meaningful predictor of intent to sustain use. Their experience and success with the innovation were more relevant than any beliefs they might have held previously. Initial resistance to change has been found to be mitigated by users experiencing the benefits of the innovation (Ertmer & Ottenbreit-Leftwich, 2010; Lyon, et al., 2013; Spotts & Bowman, 1995). Furthermore, with the amount of stress and workload stable over time and unlikely to change, providers may have adjusted their daily routines to accommodate use of Centervention. Previous research suggests that those who are committed to and intend to continue using an innovation seem to find a way to make it part of their routine practice (Lyon, et al., 2013; Meyer & Goes, 1988). Finally, implementation of Centervention to deliver mental health EBIs rested on individual providers, with assistance from the Centervention purveyors. Having a supportive working climate might be more important for innovations that require coordination among multiple staff, changes in school-wide procedures, or other broader scale changes (Scheirer, 2013).

While a number of individual and organizational factors were positively correlated with intent to sustain use, further analysis revealed that school providers’ experience with Centervention was by far the most influential correlate and predictor of intent to maintain use. School providers’ decisions to continue use were strongly influenced by the tool’s usefulness and usability and by overall satisfaction with it. In other words, experiencing the benefits and effects of the innovation was the deciding factor, over other key individual and organizational considerations. This is not to say that factors such as provider readiness and the resources dedicated to the innovation are unimportant, but rather that tool experience might be more salient to individual providers’ decision-making process about continuing use (Greenhalgh, et al., 2004; Han & Weiss, 2005; Lyon, et al., 2013; McIntosh, et al., 2009; Racine, 2006; Scheirer, 2005).

Study Implications

These results further highlight the need to differentiate between factors that might predict initial adoption and implementation, and those that enable sustainability (Aarons, et al., 2011; Lyon, et al., 2013; Massey, et al., 2005; Scheirer, 2013). For instance, while attitudes toward technology might be key to the initial adoption of online tools like Centervention, they might not be as important as actual experience with the tool when deciding to continue or discontinue use. Likewise, sustainability decision models need to be tailored to the nature of the innovation to be maintained (Cooper, et al., 2015; Scheirer, 2013). For example, having a supportive work climate might be more important for organization-wide changes that require coordination of multiple staff than for innovations that are implemented by individual providers.

The study also yields findings with real-world implications for sustaining technology-based mental health support tools among individual school providers. The decision to maintain use of the new practice will be facilitated first and foremost by ensuring a positive experience for the provider, assuming s/he holds some responsibility for this decision. S/he has to recognize the benefits associated with the innovation (e.g., rewards outweigh the necessary effort) and experience the desired outcomes (e.g., making one’s job easier, changes in students’ behaviors) before committing to maintaining use. Purveyors and school administrators can facilitate continued use by implementing strategies to enhance the provider experience. These strategies could include provision of ongoing support through access to technical assistance and the development of a community of practice among the providers implementing the same innovation; engaging providers in the continuous quality improvement of the innovation by incorporating their feedback into new versions of the tool; and providing opportunities for providers to showcase the positive changes that result from use of the innovation.

Limitations

First, this study focused specifically on intent to sustain use. The decision to continue or discontinue the innovation represents one point in time in a dynamic and complex process. It is possible that additional factors play a role in predicting actual sustained use, with further differences emerging depending on the length of time over which the innovation is maintained (Adelman & Taylor, 2003; Hargreaves, 2002). Notably, intent to sustain use was asked of the provider, with the understanding that this decision was the provider’s responsibility and that s/he planned to continue implementing available EBIs. Actual sustained use might also be dictated by factors outside of the provider’s purview (e.g., funding cuts, educational policies, changes in school priorities). Second, both the intent to sustain use measure and the tool experience scales relate to the broader concept of “use” and might therefore be naturally correlated. However, the scales were conceptually distinct, focusing on two different aspects of use: intent to continue using (future behavior) vs. tool usability, usefulness in supporting implementation efforts, and overall satisfaction (current experience). Previous research on the relationship between the usability of a technology and behavioral intent to use that technology has typically conceptualized these variables as distinct and measured them using scales similar to those used in this study (Venkatesh & Bala, 2008). Third, the sample included a diversity of provider types (e.g., teachers, counselors). While all study participants committed to using Centervention to facilitate EBI implementation, it is possible that differing workloads and priorities influenced each provider’s experience with the innovation. While this kind of sample increases the generalizability of the findings, future research could investigate intent to sustain use across diverse provider types.
Finally, the small sample size and the focus on one type of organization, schools, may limit the generalizability of the findings.

Conclusion

When considered alongside a number of individual and organizational factors, experience with the innovation was found to be the most influential predictor of intent to sustain use of a provider-implemented technology tool for mental health EBIs. To increase the likelihood of being maintained, the innovation needs to, at a minimum, be easy to use, offer notable advantages and benefits, and produce demonstrable results for the mental health and well-being of students, as experienced by the providers. While these findings add to the current body of literature on the sustainability process, further research is needed to understand (1) how these predictors change over time, as the innovation is being sustained; (2) which sets of predictors are most important for which types of innovations, settings, and providers; (3) how these predictors relate to each other; and (4) how use of an implementation support tool like Centervention changes over time as it is being sustained (increased familiarity with the tool might eventually lead to streamlined use). Ensuring that effective mental health innovations are successfully sustained over time will require further refinement of existing sustainability models, with recommendations tailored to each type of innovation, setting, and provider. Whereas the existing literature has primarily focused on initial adoption and implementation, gaining a thorough understanding of the sustainability process will help ensure the long-term impact of mental health innovations in school settings.

Acknowledgments

This research was supported by a grant from the National Institute of Mental Health (2R44MH084375-03).

Appendix A Centervention

Centervention is a HIPAA-compliant, web-based tool that supports quality implementation of evidence-based interventions (EBIs). It features downloadable intervention resources (in addition to a paper-and-pencil toolkit), online training, ongoing technical assistance, and QA/QI feedback loops (provider fidelity tracking and real-time student progress monitoring).

Online Training

EBI training, which includes competency quizzes and CE credit, is always available for initial training, booster sessions, or onboarding new staff.


EBI Resources

In addition to the paper-and-pencil toolkit, the online Resource Center provides the essential and supplemental program materials in one convenient location. Materials may be viewed, downloaded, and printed. Additional links to related websites are also available.


Technical Assistance (TA)

TA is provided through online features such as Ask an Expert (which allows providers to ask specific questions about challenges and issues related to EBI implementation), Quick Tips, Q&As, and Provider Tips. Providers can also access the Help Center, which features frequently asked questions (FAQs), help videos, and online help manuals, or request personalized live technical assistance via email or phone.


QA/QI Feedback Loops

Adherence Tracking

Tracking the provider’s adherence to the EBI ensures ongoing quality and enhances accountability. Adherence reports can be downloaded and printed for each group.


Student Outcomes Progress Monitoring

Monitoring student outcomes over the course of EBI implementation gives the provider an opportunity to make data-driven adjustments to maximize EBI results. Group and individual participant progress monitoring reports can be downloaded and printed at any time.


Appendix B

Survey Measures (see notes 1 and 2 below)

Intent to sustain use (6 items; α = .85)
Response options: 5-point Likert scale (1 = strongly disagree; 2 = disagree; 3 = neither agree nor disagree; 4 = agree; 5 = strongly agree)
Example items: “I am likely to continue… using the tools for monitoring student progress”; “…using the intervention online training to refresh my memory”

Experience with the tool (same 5-point agreement scale)
- Perceived usability – Ease of use (6 items; α = .92): “I felt comfortable using Centervention”; “Centervention was simple to use”
- Perceived usability – Task efficiency (4 items; α = .89): “I was able to complete the necessary tasks quickly using Centervention”; “Centervention had all of the functions and capabilities I expected it to have”
- Perceived usefulness (23 items; α = .96): “The student center supported me in tracking student progress”; “The Resource Center materials provided me with helpful information about using the interventions with the students”
- Tool satisfaction (3 items; α = .91): “Centervention is valuable to me as a provider”; “Centervention is a resource I would highly recommend to colleagues”

Provider attitudes and readiness
- Evidence-Based Practice Attitude Scale (Aarons, 2004); response options: 5-point Likert scale (1 = not at all; 2 = to a slight extent; 3 = to a moderate extent; 4 = to a great extent; 5 = to a very great extent)
  - EBPAS-Openness (4 items): “I am willing to try new types of interventions even if I have to follow a manual”
  - EBPAS-Divergence (RC; see note 3) (4 items): “Teaching/counseling experience is more important than using manualized interventions”
  - EBPAS-Appeal (4 items): “it [the intervention] made sense to you?”
  - EBPAS-Requirements (3 items): “it [the intervention] was required by your principal?”
- Comfort with Technology (4 items; α = .91); 5-point agreement scale: “I use technology (iPad, Kindle, Facebook, smartphones, etc.) frequently in my personal life”; “Technology makes my job easier”
- Individual Readiness (Shea et al., 2014)
  - IR-Commitment (5 items): “I am committed to implementing new types of innovations”
  - IR-Change efficacy (7 items): “I can keep the momentum going while implementing innovations”

Organizational factors
- Resources (4 items; α = .75): “This school has the staff needed to implement innovations”; “Facilities at this school are adequate for implementing innovations”
- Organizational stressors (RC; see note 4) (3 items; α = .80): “Some staff members at this school resist any type of change”; “The heavy workload at this school reduces innovation effectiveness”
- Support (3 items; α = .82): “Staff members at this school are always quick to help one another when needed”; “The budget at this school allows staff to attend professional conferences each year”
1. Composite scores for each scale and subscale were calculated by averaging results across the relevant items.

2. The survey, including all items, is available upon request.

3. RC = reverse coded; higher scores indicate lower perceived divergence.

4. RC = reverse coded; higher scores indicate lower organizational stress.

Footnotes

Conflict of Interest: The authors who conducted this study are employees of the 3C Institute. The tool investigated in this study is commercially available. However, the data collected and described in this paper do not impact direct sales, since the focus is on understanding predictors of the intent to sustain use of the tool.

1. Composite scores for all of the scales and subscales were computed by averaging the results across the items relevant to the specific scale/subscale, for ease of interpretation and based on recommendations from the scale developers for validated scales.
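As a minimal sketch of the composite scoring described in this note, each respondent’s item responses are averaged, with reverse-coded items flipped first on the 5-point scale; the simulated responses and the choice of which items are reverse coded are hypothetical, not the study data.

```python
import numpy as np

# Hypothetical responses: 44 providers x 6 items on a 1-5 Likert scale
rng = np.random.default_rng(2)
responses = rng.integers(1, 6, size=(44, 6)).astype(float)

# Flip reverse-coded (RC) items before averaging: on a 1-5 scale, x -> 6 - x
rc_items = [4, 5]  # illustrative assumption: the last two items are RC
responses[:, rc_items] = 6 - responses[:, rc_items]

# Composite score per respondent = mean across the scale's items
composite = responses.mean(axis=1)
```

Averaging (rather than summing) keeps the composite on the original 1–5 response metric, which is what makes the scale means directly interpretable.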

2. An exploratory factor analysis using principal component analysis (Schonemann, 1990; Velicer & Jackson, 1990) with varimax rotation was conducted with this sample (Costello & Osborne, 2005) to identify similar groups of variables and assess the underlying structure of the 10 items selected to assess usability. The Kaiser-Meyer-Olkin measure verified the sampling adequacy for the analysis (KMO = .86); Bartlett’s Test of Sphericity was significant (p < .01), indicating that the correlations between items were sufficiently large; and communalities were greater than .30. Based on eigenvalues > 1 and an examination of the scree plot, a two-factor solution seemed optimal, explaining 78% of the variance. After rotation, item loadings were above .74, and cross-loadings below .38. The items that clustered on the first component related to ease of use and ease of navigation, while the second component included items focused on task efficiency.

3. Two exploratory factor analyses using the same methods outlined in Footnote 2 were conducted on the 10 items using varimax rotation, with pre- and post-data, respectively. Based on the results of the Kaiser-Meyer-Olkin measure, Bartlett’s Test of Sphericity, and the communalities, all underlying assumptions were successfully met. A three-factor solution emerged based on eigenvalues > 1 and an examination of the scree plot, accounting for 69% and 73% of the variance, respectively. Item loadings were all above .65, with cross-loadings below .40. The first factor indexed organizational resources, the second included items related to organizational support, and the third clustered items on organizational stressors.

4. To ensure that intent to continue use of Centervention was not simply a function of differences in initial rates of adoption, intent to continue use was correlated with all of the available usage variables (e.g., number of QA/QI reports downloaded, percent of online training completed, number of issues for which live TA was initiated by the providers, and number of times the online TA was accessed). None of the usage variables were significantly correlated with intent to sustain use, thereby ruling out this alternative explanation.
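A check of this kind (correlating an outcome with each usage variable and testing for significance) can be sketched as below; the data, variable names, and distributions are hypothetical stand-ins, not the study data, and the fixed critical t value is an assumption for df = 42 at a two-tailed α of .05.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two 1-D arrays."""
    x = x - x.mean()
    y = y - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

n = 44              # sample size in the study
T_CRIT = 2.018      # approximate two-tailed .05 critical t for df = n - 2 = 42

rng = np.random.default_rng(1)
# Hypothetical stand-ins for the usage variables named in the footnote
usage = {
    "qaqi_reports_downloaded": rng.poisson(5, n).astype(float),
    "pct_training_completed": rng.uniform(0, 100, n),
    "live_ta_issues": rng.poisson(2, n).astype(float),
    "online_ta_accesses": rng.poisson(3, n).astype(float),
}
intent = rng.normal(4.0, 0.5, n)  # hypothetical intent-to-sustain composite (1-5)

for name, values in usage.items():
    r = pearson_r(intent, values)
    t = r * np.sqrt((n - 2) / (1 - r**2))  # t statistic for H0: rho = 0
    verdict = "significant" if abs(t) > T_CRIT else "n.s."
    print(f"{name}: r = {r:.2f}, t = {t:.2f} ({verdict})")
```

In the study’s logic, a non-significant r for every usage variable is what rules out initial adoption rates as an alternative explanation for intent to sustain use.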

References

  1. 3C Institute. Centervention. Retrieved October 31, 2014, from https://Centervention.org/
  2. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research. 2004;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65.
  3. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration & Policy in Mental Health and Mental Health Services Research. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7.
  4. Adelman HS, Taylor L. On sustainability of project innovations as systemic change. Journal of Educational and Psychological Consultation. 2003;14(1):1–25.
  5. Allison PD. Multiple regression: A primer. Pine Forge Press; 1999.
  6. Briesch AM, Chafouleas SM, Neugebauer SR, Riley-Tillman TC. Assessing the influences on intervention implementation: Revision of the Usage Rating Profile-Intervention. Journal of School Psychology. 2013;51:81–96. doi: 10.1016/j.jsp.2012.08.006.
  7. Brightwood LH, DeRosier ME. Lifestories for Kids: Enhancing character development and social skills through storytelling (Grades 3–5). Cary, NC: 3C Institute for Social Development; 2007.
  8. Brightwood LH, DeRosier ME, Maschauer EL, Wilson ME. Adventures in Emotional Literacy (AEL): Building social relationships through enhanced communication, cooperation, and confidence (Grades K–2). Cary, NC: 3C Institute for Social Development; 2009.
  9. Brightwood LH, DeRosier ME, Maschauer EL. Adventures in Emotional Literacy (AEL): Building social relationships through enhanced communication, cooperation, and confidence (Grade 3). Cary, NC: 3C Institute for Social Development; 2009.
  10. Cooper BR, Bumbarger BK, Moore JA. Sustaining evidence-based prevention programs: Correlates in a large-scale dissemination initiative. Prevention Science. 2015;16:145–157. doi: 10.1007/s11121-013-0427-1.
  11. Costello AB, Osborne JW. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research, and Evaluation. 2005;10(7). Available online: http://pareonline.net/pdf/v10n7.pdf
  12. Craig AB, Leary KA, Parker J, McMillen JS, DeRosier ME. Effectiveness of an emotional literacy intervention for the elementary classroom. 2016. Manuscript in preparation.
  13. DeRosier ME. Building relationships and combating bullying: Effectiveness of a school-based social skills group intervention. Journal of Clinical Child and Adolescent Psychology. 2004;33:196–201. doi: 10.1207/S15374424JCCP3301_18.
  14. DeRosier ME. Social Skills Group Intervention (S.S.GRIN): Group interventions and exercises for enhancing children’s communication, cooperation, and confidence (Grades K–5). Cary, NC: 3C Institute for Social Development; 2007.
  15. DeRosier ME, Brightwood LH. Lifestories for Kids: Enhancing character development and social skills through storytelling (Grades K–2). Cary, NC: 3C Institute for Social Development; 2007.
  16. DeRosier ME, Marcus SR. Building friendships and combating bullying: Effectiveness of S.S.GRIN at one-year follow-up. Journal of Clinical Child and Adolescent Psychology. 2005;34:140–150. doi: 10.1207/s15374424jccp3401_13.
  17. DeRosier ME, Mercer S. Improving student social behavior: The effectiveness of a school-based character education program. Journal of Research in Character Education. 2007;5(2):131–148.
  18. Ertmer PA, Ottenbreit-Leftwich AT. Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education. 2010;42(3):255–284.
  19. Flay BR, Allred CG. Long-term effects of the Positive Action program. American Journal of Health Behavior. 2003;27:6–21.
  20. Forman SG, Olin SS, Hoagwood KE, Crowe M, Saka N. Evidence-based interventions in schools: Developers’ views of implementation barriers and facilitators. School Mental Health. 2009;1(1):26–36.
  21. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly. 2004;82(4):581–629. doi: 10.1111/j.0887-378X.2004.00325.x.
  22. Han SS, Weiss B. Sustainability of teacher implementation of school-based mental health programs. Journal of Abnormal Child Psychology. 2005;33(6):665–679. doi: 10.1007/s10802-005-7646-2.
  23. Hargreaves A. Sustainability of educational change: The role of social geographies. Journal of Educational Change. 2002;3(3–4):189–214.
  24. Holahan PJ, Aronson ZH, Jurkat MP, Schoorman FD. Implementing computer technology: A multiorganizational test of Klein and Sorra’s model. Journal of Engineering and Technology Management. 2004;21(1):31–50.
  25. Johnson K, Hays C, Center H, Daley C. Building capacity and sustainable prevention innovations: A sustainability planning model. Evaluation and Program Planning. 2004;27:135–149.
  26. Lehman WEK, Greener JM, Simpson DS. Assessing organizational readiness for change. Journal of Substance Abuse Treatment. 2002;22(4):197–209. doi: 10.1016/s0740-5472(02)00233-7.
  27. Levin BL, Hennessy KD, Petrila J. Mental health services: A public health perspective. 3rd ed. New York: Oxford University Press; 2010.
  28. Lewis JR. IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction. 1995;7(1):57–78.
  29. Lippert SK, Forman H. Utilization of information technology: Examining cognitive and experiential factors of post-adoption behavior. IEEE Transactions on Engineering Management. 2005;52(3):363–381.
  30. Livet M, Yannayon ME, Sheppard K, Kocher K, Upright J, McMillen JS. Effectiveness of a web-based implementation support tool: Testing the EBSIS model. Manuscript submitted for publication; 2016.
  31. Loh LC, Friedman SR, Burdick WP. Factors promoting sustainability of education innovations: A comparison of faculty perceptions and existing frameworks. Education for Health. 2013;26(1):32–38. doi: 10.4103/1357-6283.112798.
  32. Lyon AR, Ludwig K, Romano E, Leonard S, Stoep AV, McCauley E. “If it’s worth my time, I will make the time”: School-based providers’ decision-making about participating in an evidence-based psychotherapy consultation program. Administration & Policy in Mental Health and Mental Health Services Research. 2013;40:467–481. doi: 10.1007/s10488-013-0494-4.
  33. Massey OT, Armstrong K, Boroughs M, Henson K. Mental health services in schools: A qualitative analysis of challenges to implementation, operation, and sustainability. Psychology in the Schools. 2005;42(4):361–372.
  34. Maxwell SE. Sample size and multiple regression analysis. Psychological Methods. 2000;5(4):434. doi: 10.1037/1082-989x.5.4.434.
  35. McIntosh K, Horner RH, Sugai G. Sustainability of systems-level evidence-based practices in schools: Current knowledge and future directions. In: Sailor W, Dunlap G, Sugai G, Horner R, editors. Handbook of Positive Behavior Support. New York: Springer Science + Business Media, LLC; 2009.
  36. McIntosh K, Martinez RS, Ty SV, McClain MB. Scientific research in school psychology: Leading researchers weigh in on its past, present, and future. Journal of School Psychology. 2013;51(3):267–318. doi: 10.1016/j.jsp.2013.04.003.
  37. McIntosh K, Mercer SH, Hume AE, Frank JL, Turri MG, Mathews S. Factors related to sustained implementation of schoolwide positive behavior support. Exceptional Children. 2013;79(3):293–311.
  38. Meyer AD, Goes JB. Organizational assimilation of innovations: A multilevel contextual analysis. Academy of Management Journal. 1988;31(4):897–923.
  39. Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, Wagner M. Implementation science in school mental health: Key constructs in a developing research agenda. School Mental Health. 2014:99–111. doi: 10.1007/s12310-013-9115-3.
  40. President’s New Freedom Commission on Mental Health. Achieving the promise: Transforming mental health care in America. Rockville, MD: Department of Health and Human Services; 2003.
  41. Racine DP. Reliable effectiveness: A theory on sustaining and replicating worthwhile innovations. Administration & Policy in Mental Health and Mental Health Services Research. 2006;33:356–387. doi: 10.1007/s10488-006-0047-1.
  42. Rogers EM. Diffusion of innovations. 4th ed. New York, NY: Free Press; 1995.
  43. Rogers EM. Diffusion of innovations. 5th ed. New York: Free Press; 2003.
  44. Rohrbach LA, Grana R, Sussman S, Valente TW. Type II translation: Transporting prevention interventions from research to real-world settings. Evaluation & the Health Professions. 2006;29(3):302–333. doi: 10.1177/0163278706290408.
  45. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. American Journal of Evaluation. 2005;26:320–347.
  46. Scheirer MA. Linking sustainability research to intervention types. American Journal of Public Health. 2013;103(4):e73–e80. doi: 10.2105/AJPH.2012.300976.
  47. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. American Journal of Public Health. 2011;101(11):2059–2067. doi: 10.2105/AJPH.2011.300193.
  48. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: A psychometric assessment of a new measure. Implementation Science. 2014;9:7. doi: 10.1186/1748-5908-9-7.
  49. Schonemann PH. Facts, fictions, and common-sense about factors and components. Multivariate Behavioral Research. 1990;25:47–51. doi: 10.1207/s15327906mbr2501_5.
  50. Spotts TH, Bowman MA. Faculty use of instructional technologies in higher education. Educational Technology. 1995;35(2):56–64.
  51. Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science. 2012;7:17. doi: 10.1186/1748-5908-7-17.
  52. Straub ET. Understanding technology adoption: Theory and future directions for informal learning. Review of Educational Research. 2009;79(2):625–649.
  53. Stroul BA, Manteuffel BA. The sustainability of systems of care for children’s mental health: Lessons learned. Journal of Behavioral Health Services & Research. 2007;34(3):237–259. doi: 10.1007/s11414-007-9065-3.
  54. Velicer WF, Jackson DN. Component analysis versus common factor analysis: Some further observations. Multivariate Behavioral Research. 1990;25:97–114. doi: 10.1207/s15327906mbr2501_12.
  55. Venkatesh V, Bala H. Technology Acceptance Model 3 and a research agenda on interventions. Decision Sciences. 2008;39(2):273–315.
  56. Wandersman A, Chien VH, Katz J. Toward an evidence-based system for innovation support for implementing innovations with quality: Tools, training, technical assistance, and quality assurance/quality improvement. American Journal of Community Psychology. 2012;50:445–459. doi: 10.1007/s10464-012-9509-7.
  57. Zhao Y, Frank KA. Factors affecting technology uses in schools: An ecological perspective. American Educational Research Journal. 2003;40(4):807–840.