Author manuscript; available in PMC: 2006 Sep 12.
Published in final edited form as: Ment Health Serv Res. 2004 Jun;6(2):61–74. doi: 10.1023/b:mhsr.0000024351.12294.65

Mental Health Provider Attitudes Toward Adoption of Evidence-Based Practice: The Evidence-Based Practice Attitude Scale (EBPAS)

Gregory A. Aarons
PMCID: PMC1564126  NIHMSID: NIHMS10919  PMID: 15224451

Abstract

Mental health provider attitudes toward organizational change have not been well studied. Dissemination and implementation of evidence-based practices (EBPs) into real-world settings represent organizational change that may be limited or facilitated by provider attitudes toward adoption of new treatments, interventions, and practices. A brief measure of mental health provider attitudes toward adoption of EBPs was developed and attitudes were examined in relation to a set of provider individual difference and organizational characteristics.

Methods

Participants were 322 public sector clinical service workers from 51 programs providing mental health services to children and adolescents and their families.

Results

Four dimensions of attitudes toward adoption of EBPs were identified: (1) intuitive Appeal of EBP, (2) likelihood of adopting EBP given Requirements to do so, (3) Openness to new practices, and (4) perceived Divergence of usual practice with research-based/academically developed interventions. Provider attitudes varied by education level, level of experience, and organizational context.

Conclusions

Attitudes toward adoption of EBPs can be reliably measured and vary in relation to individual differences and service context. EBP implementation plans should include consideration of mental health service provider attitudes as a potential aid to improve the process and effectiveness of dissemination efforts.

Keywords: evidence-based practice, attitudes, dissemination, mental health, child, adolescent, organization, services

INTRODUCTION

Mental health service providers are on the front line of delivering services to youth and families. However, treatments and interventions being used in usual care are often not based on evidence of efficacy or effectiveness (Hoagwood & Olin, 2002). Although most evidence-based models do not capture the richness and complexity of the provider–consumer relationship (Margison, 2001; Williams & Garner, 2002), providing services with evidence of effectiveness is an important priority. If the most efficacious and effective interventions are to be disseminated and implemented in community-based settings, a better understanding of provider attitudes is needed in order to more effectively tailor dissemination and implementation (DI) efforts to provider individual differences in the service context. The present study is a response to the call for a better understanding of the context into which evidence-based practices (EBPs) are likely to be disseminated (e.g., Burns, Hoagwood, & Mrazek, 1999; Glisson, 2002; Hoagwood, Burns, Kiser, Ringeisen, & Schoenwald, 2001; Schoenwald & Hoagwood, 2001). This study describes the development of the Evidence-Based Practice Attitude Scale (EBPAS),4 a preliminary exploration of mental health service provider attitudes toward the adoption of EBPs in community mental health settings.

Theoretical models that include attitudes have been proposed to explain and improve the dissemination process. For example, Rogers (1995) notes that studies of the diffusion process span technologies such as use of the steel axe, agricultural innovation, teaching innovation, and medical/health innovation, and disciplines including anthropology, sociology, economics, medicine, and marketing. Attitudes toward innovation can be a precursor to the decision of whether or not to try a new practice, and the affective component of attitudes can impact decision processes regarding innovation (Candel & Pennings, 1999; Frambach & Schillewaert, 2002; Rogers, 1995). Still, little is known about behavioral health service provider attitudes toward adoption of EBPs or even how best to measure such attitudes. Service provider attitudes toward organizational change in community practice have been studied, but constrained samples have limited the generalizability of such studies (Addis, 2002). For example, Lehman, Greener, and Simpson (2002) examined staff attributes in regard to organizational change in substance abuse treatment settings. Most studies in mental health contexts have examined doctoral level licensed psychologists’ attitudes regarding use of treatment manuals and research-based information (Addis & Krasnow, 2000; Morrow-Bradley & Elliott, 1986; Prochaska & Norcross, 1983). Highly educated (i.e., doctoral level) providers’ concerns regarding manual-based treatments in traditional clinical settings have also been studied (Addis, Wade, & Hatgis, 1999). However, in public mental health services, the majority of providers do not have doctoral level training (Aarons, Woodbridge, & Carmazzi, 2003) and attitudes of these providers have not been well studied.

Evidence suggests that the DI of innovation such as EBP must take into account the complexity inherent in real-world service settings (Fraser & Greenhalgh, 2001; Hasenfeld, 1992; Henggeler & Schoenwald, 2002; Jankowicz, 2000; Simpson, 2002). For example, in regard to regulatory concerns, service providers often work in programs that are subject to federal, state, and county policies and regulations. In regard to contracting, programs may have to compete for contracts and service provision is often subject to the terms of such contracts. Services also take place within organizational contexts that vary in regard to the quality of leadership and supervision, organizational norms and expectations, and climate (Glisson, 2002). Common methods of social service technology transfer (e.g., treatment manuals, off-site training sessions) often fail to take into account such complexity and thus may lack effectiveness (Addis, 2002; Backer, David, & Soucy, 1995; Backer, Liberman, & Kuehnel, 1986; Henggeler & Schoenwald, 2002; Strupp & Anderson, 1997). Thus, it is necessary to understand and consider attitudes toward adoption of EBPs of providers who are embedded within the complex organizational context of mental health service systems (e.g., Burns et al., 1999; Garland, Kruse, & Aarons, 2003; Glisson, 1992, 2002; Hoagwood et al., 2001).

The extant literature suggests at least four potentially important domains of provider attitudes toward adoption of EBPs. First, the intuitive appeal of innovation is important to consider in organizational change. This notion is supported by studies of persuasion processes and provider efficacy (Cialdini, Bator, & Guadagno, 1999; Tormala & Petty, 2002; Watkins, 2001). For example, studies have shown that providers are more at ease with information derived from colleagues than with information from research articles or books (Cohen, Sargent, & Sechrest, 1986; Morrow-Bradley & Elliott, 1986), and attitudes toward adoption of EBPs will likely be influenced by the appeal of an EBP, including the information source (Frambach & Schillewaert, 2002).

Second, requirements to provide services in a specified way based on organizational policies or funding exigencies may or may not be followed by service providers. For example, there is variability in the degree to which providers adopt and comply with new practices even when “required” by supervisors or agency mandates (Garland et al., 2003). Although some providers may be more or less compliant with required changes, individual and organizational variability can affect the degree to which innovations are adopted and sustained in practice (Glisson, 2002). Compliance with requirements differs from openness (i.e., willingness to try new experiences or consider new ways of doing things; McCrae & Costa, 2003) in that it denotes how employees respond to organizational rules and regulations. For example, an employee may be high on the characteristic of openness, but may also resist authority.

Third, openness to change in general has been identified as an important component of workplace climate that can impact innovation in mental health service programs (Anderson & West, 1998). Individual differences in openness are related to both organizational characteristics and job performance (Barrick & Mount, 1991). Business and organizational literatures have shown that openness to innovation may be important in developing the characteristics of “learning organizations” that are more responsive and adaptive to internal and environmental contingencies (Anderson & West, 1998; Birleson, 1999; Fiol & Lyles, 1985; Garvin, 1993).

Finally, a divergence may occur when there is a perceived difference between current and new practices. For example, mandated use of evidence-based assessment protocols is often perceived as incongruent with or unneeded in clinical practice (Garland et al., 2003). Even where systems are in place to make the use of an EBP relatively seamless, there may be skepticism in regard to the use of such practices when they are perceived by providers to come from the culture of research and evaluation or when imposed by mandate. Similar “process resistance” has been documented in business sector studies (Garvin, 1993).

Thus, these four domains (intuitive Appeal, attitudes toward organizational Requirements, Openness to innovation, and perceived Divergence of research-based innovation) are likely to be important in understanding the process of DI of EBPs, but no measures are currently available to assess these constructs. Further, it is likely that these domains represent measurably distinct aspects of attitudes toward adoption of EBPs. For example, general openness to innovation is likely to be more akin to an attitudinal disposition rather than being contingent upon requirements in the workplace. The attitude of general openness is expected to differ from the attitude of appeal, which is conditional upon the intuitive positive perception of an EBP. However, it is likely that perceived divergence of current practice with EBPs would be inversely associated with more favorable attitudes such as openness and appeal. It is expected that these domains can be identified and associations between domains examined.

Provider attitudes toward innovation and change are likely to interact with both individual differences (e.g., professional experience, training) and contextual factors such as organizational structure and organization type (Anderson & West, 1998; Birleson, 1999; Damanpour, 1991; Glisson, 2002). Studies support the contention that DI efforts for EBPs should take into account the education, training, and experience of service providers in order to facilitate the DI process (Ball et al., 2002; Strosahl, 1998). First, educational attainment has been found to be positively associated with endorsement of evidence-based treatment services and adoption of innovation (Loy, 1968; Ogborne, Wild, Braun, & Newton-Taylor, 1998). Second, a natural transition in the training of most clinical and case management professionals occurs during an internship or practicum experience. There is evidence that those still completing their education (e.g., interns) and transitioning into professional roles may be more flexible in regard to learning new interventions. For example, Ogborne et al. (1998) found that certified counselors were more likely than noncertified counselors to adhere to traditional conceptions of the causes and treatment of addictive disorders. Interns in specialty mental health clinics report more positive attitudes toward using evidence-based assessment protocols (Garland et al., 2003). Interns are providers whose training is still in progress and who may be less influenced by a long history of practice. As such, it is likely that interns would be more open to adoption of EBPs relative to providers who have been practicing for more protracted periods. Third, the primary discipline in which a service worker is trained may also affect perceptions and use of empirical data or practices. In some cases specialized training may actually limit acquisition of new skills (Pithouse & Scourfield, 2002); however, specialized training that spans professional disciplines has the potential to positively affect breadth of practice (Amodeo, 2000). Still, there is variability by discipline in the emphasis on research and on combining practice and research, and this is becoming more important with the increasing demand to document evidence of effectiveness in practice (Thyer & Polk, 1997; Turnbull & Dietz-Uhler, 1995).

Contextual variation such as program type, organizational structure, and the presence of written policies regarding recommended practice may be important in understanding adherence (or lack thereof) to practice change (Glisson, 2002; Strupp & Anderson, 1997). First, the type of services to be delivered or program type (e.g., outpatient, residential) may be related to adoption of innovation, and there is evidence that organizational innovativeness varies by type of organization (Damanpour, 1991). For example, there is variability in the mission, consumer population, and service staff of different types of mental health service programs. Second, in regard to organizational structure, organizations with high levels of bureaucracy and red tape may be less flexible in responding to change or promoting internal change relative to more flexible organizations (Frambach & Schillewaert, 2002). Baldassare and colleagues (2000) reported that local governments are often perceived as unresponsive and that there is a need for more responsive services, and recommended expanding the use of government contracting with private sector agencies and nonprofits in the provision of public services. Prager (1986) also found that social workers employed in more bureaucratic agencies were less flexible and made more restrictive long-term care decisions compared to social workers employed in agencies with flatter managerial structures. Finally, in working with mental health programs we have observed that some organizations have written policies specifying the use of specific interventions for specific disorders. Such practice policies can be assessed by program manager reports of whether or not a program has written policies specifying the use of particular interventions for a given mental health problem or disorder. Formalized policies may acquaint service providers with new technologies and demonstrate organizational support for matching treatments to disorders. Thus, as noted above, it is important to understand provider attitudes toward adoption of EBPs in relation to both individual difference and organizational/contextual factors.

The primary purpose of this study was to develop a brief measure assessing behavioral health service provider attitudes toward adoption of EBPs. A second goal was to examine the association of attitudes toward adoption of EBPs with provider education level, professional status (i.e., intern vs. staff), primary discipline, and organizational context. It was hypothesized that distinct aspects of attitudes toward adoption of EBPs could be identified among mental health service providers in regard to (1) Appeal of EBPs, (2) Requirements for the use of EBPs, (3) Openness to innovation, and (4) perceived Divergence of EBP with usual practice. Three hypotheses were tested for individual-level variables: that more open attitudes toward adoption of EBPs would be associated with (1) higher educational attainment and (2) being an intern rather than a professional service provider, and (3) that EBPAS scores would vary by primary discipline. Three hypotheses were also tested for organizational characteristics, specifying that more favorable attitudes toward adoption of EBPs would be associated with (1) programs providing less restrictive services, (2) a less bureaucratic organizational structure, and (3) the presence of formalized practice policies.

METHODS

Participants were 322 clinical and case management service providers and 51 program managers from 51 public sector programs providing mental health services to children and adolescents and their families in San Diego County, CA. Fifty-one of fifty-four contacted organizations agreed to participate in the study, representing an organizational participation rate of 94.4%. Program managers from nonparticipant programs (k = 3) cited heavy workloads and time constraints as reasons for not participating. Of 348 potential provider participants, three actively declined participation and 11 surveys were not returned, leaving 334 returned surveys (96.0% participation rate). Twelve surveys were excluded because of missing data on key variables, resulting in a final sample size of 322 providers.

Eighty percent of respondents were full-time employees and primary disciplines included marriage and family therapy (33.9%), social work (32.3%), psychology (22.4%), psychiatry (1.6%), and “other” (9.9%; e.g., criminology, drug rehabilitation, education, public health). Interns were less prevalent in the service system (24.9%) relative to fully employed staff (75.1%), and interns represented disciplines of marriage and family therapy (46.8%), social work (24.7%), psychology (20.8%), psychiatry (1.3%), and “other” (6.5%).

Participant programs were publicly funded child/adolescent mental health programs providing outpatient treatment (52.9%), day treatment (23.5%), case management (11.8%), wraparound services (7.8%), and inpatient treatment (3.9%). Most programs were contracted with the County to provide services (83.7%) in contrast to operating under County administration structure (16.3%). Fewer programs reported having written policies regarding interventions for specific disorders (14.3%) relative to those without such policies (85.7%). There was substantial variability among programs in regard to the number of unduplicated clients served per year (M = 257.6; SD = 452.8) and the number of clinical and/or case management service staff employed (M = 6.6; range = 1–31).

Scale Development Procedure

An initial pool of items was generated on the basis of the literature reviewed above, consultation with mental health service providers, and child and adolescent services researchers with experience working with clinicians to implement evidence-based protocols (e.g., Garland et al., 2003). A total of 18 items were identified for use in the initial survey. The items assessed openness to innovation, rigidity related to academic training, perceptions of the utility of research-based interventions and manualized interventions, consistency in therapeutic practices over time, interest in using new interventions, perception of the importance of requirements and empirical support for interventions, and divergent attitudes to adoption of EBPs. Of the 18 items, 5 were developed to relate to Appeal, 3 to Requirements, 5 to Openness, and 5 to Divergence. Respondents were asked to indicate their agreement with the items pertaining to their attitudes about adopting new or different types of therapy/interventions. Response options were as follows: 0 = not at all, 1 = to a slight extent, 2 = to a moderate extent, 3 = to a great extent, and 4 = to a very great extent. The EBPAS items and scoring are presented in Appendix A.

Measures

Provider surveys were used to assess potential scale items and individual-level variables. Program manager interviews were used to assess organizational level variables.

The provider survey incorporated questions regarding provider demographics. Education level was coded as ordered categories from low to high attainment: some high school, high-school graduate, some college, college graduate, some graduate work, master’s degree, and doctoral degree (PhD, MD, or equivalent). Professional status indicated whether the respondent was an intern or employed professional and was coded as “0” for staff and “1” for interns. Primary discipline was identified as marriage and family therapy, social work, psychology, psychiatry, or “other.” The “other” category included disciplines not mentioned above (e.g., criminal justice, drug rehabilitation, education, public health). Psychiatrists were included in the “other” category for analyses because of the low number of participants indicating psychiatry as primary discipline (n = 5). Primary discipline was dummy coded with psychology as the reference group.
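
This coding scheme maps directly onto standard data-frame operations. Below is a minimal sketch in Python/pandas; the column and category names are hypothetical, since the original survey’s variable names are not published.

```python
import pandas as pd

# Hypothetical provider records; names and values are illustrative only.
providers = pd.DataFrame({
    "education": ["master's degree", "college graduate", "doctoral degree"],
    "status": ["staff", "intern", "staff"],
    "discipline": ["social work", "psychology", "marriage and family therapy"],
})

# Education as ordered categories, coded 0 (lowest) to 6 (highest attainment).
edu_order = ["some high school", "high-school graduate", "some college",
             "college graduate", "some graduate work", "master's degree",
             "doctoral degree"]
providers["education_level"] = pd.Categorical(
    providers["education"], categories=edu_order, ordered=True).codes

# Professional status: 0 = staff, 1 = intern.
providers["intern"] = (providers["status"] == "intern").astype(int)

# Dummy-code primary discipline, dropping psychology as the reference group.
discipline_dummies = pd.get_dummies(providers["discipline"], prefix="disc")
providers = pd.concat(
    [providers, discipline_dummies.drop(columns="disc_psychology")], axis=1)
```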

The program manager interview included questions regarding types of services provided by the program (e.g., outpatient, day treatment, case management, wraparound, and residential/inpatient), type of organizational structure denoting level of bureaucracy (i.e., contract provider or County administered program), and the presence of practice policies (i.e., the presence of written policies for treatment of specific youth mental health problems). For this set of analyses, providers working in the least restrictive type of program (i.e., outpatient) served as the reference group for comparison with other groups. Organizational structure was categorized as low bureaucracy for programs providing services under contract with the County and as high bureaucracy for programs that were part of the County administrative structure. More bureaucratic structure was exemplified by more levels in the administrative and organizational hierarchy and by staff who were part of collective bargaining groups and had civil service protections against job loss. Practice policies were assessed with four items indicating whether or not the program had written policies specifying the use of particular interventions for the treatment of Attention Deficit Hyperactivity Disorder, Conduct Disorder, Depression, or Anxiety disorders. The four items were dichotomously coded to indicate whether policies were present (1) or absent (0). The variable used for analysis was a single dichotomous variable created to indicate whether or not the program had one or more written policies regarding treatment practices for the mental health disorders noted above.
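
As a sketch of the program-level coding (again with hypothetical column names), the four disorder-specific policy items collapse into the single any-policy indicator described above, and the contract-versus-County distinction becomes the low/high bureaucracy variable:

```python
import pandas as pd

# One row per program; 1 = written policy present for that disorder.
programs = pd.DataFrame({
    "policy_adhd":       [0, 1, 0],
    "policy_conduct":    [0, 1, 0],
    "policy_depression": [0, 0, 0],
    "policy_anxiety":    [0, 1, 0],
    "contract_provider": [1, 1, 0],  # 1 = contract with County, 0 = County run
})

policy_items = ["policy_adhd", "policy_conduct",
                "policy_depression", "policy_anxiety"]
# A program is coded 1 if it has one or more written practice policies.
programs["any_practice_policy"] = programs[policy_items].any(axis=1).astype(int)

# Low bureaucracy = contract provider; high bureaucracy = County administered.
programs["low_bureaucracy"] = programs["contract_provider"]
```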

Another issue requiring consideration in assessing attitudes toward adoption of EBPs is a possible lack of familiarity with the concept of EBP among those providing care in public mental health systems. Program managers were asked the degree to which they were familiar with the terms “evidence-based practice” and “empirically supported treatment” on the same scale (0–4) described above. However, because it was presumed that there might be a lack of familiarity with these terms, provider survey questions regarding attitudes toward adoption of EBP were couched in more descriptive terms invoking the notion of research-based and/or manualized approaches to service provision that characterize most EBP models.

Survey Procedure

Programs were participants in a study of organizational factors in child and adolescent mental health services in San Diego County. A program manager was contacted at each program and the study was described in detail. Permission was sought to interview each program manager and to survey service providers who worked directly with youth and families. For participant programs, interviews and provider survey sessions were scheduled at the program site at a time designated by the program manager. Interviews were conducted with each program manager individually and surveys were administered to groups of providers. The principal investigator and/or project coordinator conducted program manager interviews. The project coordinator and/or a trained research assistant administered provider surveys and were available during the survey session to answer any questions that arose. On completion of the survey, providers handed in the packet to the survey administrator at which time the surveys were checked for completeness. Any missing responses were then completed by the respondent, if possible. A few surveys were left for completion for providers who were not in attendance at the survey session. Such surveys were either mailed back in a prepaid envelope or picked up at a later time by a research assistant. Participants received a verbal and written description of the study and informed consent was obtained prior to the survey. This study was approved by the appropriate institutional review boards.

Analyses

Level of familiarity with the term “evidence-based practice” among mental health professionals was assessed first. On the same 0–4 scale described above, a mean score was computed in order to assess the degree to which program managers were familiar with the term EBP. Although service providers did not complete this measure, most program managers were the immediate clinical supervisors for the providers and had administrative as well as clinical supervision duties. Thus, this gives a rough indication, at the program level, of clinical supervisors’ familiarity with the concept of EBP at the time of survey administration. This issue was examined as support for the decision to word survey questions in a way that was easily understandable and did not rely on the use of professional jargon.

Next, two separate factor analytic procedures were conducted. First, the sample was divided by randomly selecting approximately 50% of cases from within each program and assigning cases to either an exploratory (n = 159) or confirmatory (n = 163) analysis group. Group sample sizes were not identical due to an uneven number of providers in each program. Exploratory factor analyses (EFAs) were conducted using Principal Axis Factoring in order to partition systematic and error variance in the solution (Fabrigar, Wegener, MacCallum, & Strahan, 1999; Nunnally & Bernstein, 1994). Because it was expected that evidence-based practice attitude subscales would be related, promax oblique rotation was used allowing for factor intercorrelations. To promote simple structure, items were retained on a factor if they loaded at least .30 on the primary factor and less than .30 on all other factors (e.g., Fabrigar et al., 1999). Item-total correlations and scale reliabilities were also used to assess scale structure. Second, a confirmatory factor analysis (CFA) was conducted on the other half of the sample to test the factor structure derived in the EFA. Commonly accepted rules of thumb for fit indices in CFA include a comparative fit index (CFI) >.90 representing acceptable fit (Dunn, Everitt, & Pickles, 1993) and root mean square error of approximation (RMSEA) values <.10 representing good fit (Kelloway, 1998). Fit measures recommended by Hu and Bentler (1999) indicating excellent fit include CFI and Tucker–Lewis Index (TLI) values near .95 or greater, a RMSEA value near .06 or less, and a standardized root mean square residual (SRMR) near .08 or less. Use of these indices in combination provides a more comprehensive evaluation of model fit. On the basis of the item loadings, subscale scores were computed by assigning each item to the scale indicated in the factor analyses and computing a subscale mean and a total mean. The mean scores were then used in subsequent regression analyses.
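
The split-half factor-analytic step can be sketched as follows. This is not the original Mplus code; it is a minimal Python illustration using the third-party factor_analyzer package, and it assumes a DataFrame `items` holding the 18 candidate item responses plus a `program` identifier. The confirmatory model would then be fit on the held-out half in SEM software, as the paper did in Mplus.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Randomly assign roughly half of the providers within each program to the
# exploratory half; the remaining providers form the confirmatory half.
efa_index = items.groupby("program").sample(frac=0.5, random_state=0).index
efa_half = items.loc[efa_index].drop(columns="program")
cfa_half = items.drop(index=efa_index).drop(columns="program")

# Principal axis factoring with promax (oblique) rotation so that the
# attitude factors may intercorrelate.
fa = FactorAnalyzer(n_factors=4, method="principal", rotation="promax")
fa.fit(efa_half)
loadings = pd.DataFrame(fa.loadings_, index=efa_half.columns,
                        columns=["F1", "F2", "F3", "F4"])

# Simple-structure retention rule from the text: keep an item if it loads
# at least .30 on its primary factor and below .30 on all other factors.
abs_loadings = loadings.abs()
primary = abs_loadings.max(axis=1)
secondary = abs_loadings.apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
retained_items = loadings.index[(primary >= .30) & (secondary < .30)]
```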

Finally, regression analyses were conducted in order to examine organization level clustering effects and the association of EBPAS subscale and total scores with (1) provider characteristics (i.e., education level, professional status, primary discipline) and (2) organizational characteristics (i.e., program type, organizational structure, and presence of practice policies). A multiple stage analytic approach was adopted for these analyses. To assess the magnitude of clustering effects and intraclass correlations (ICCs), hierarchical linear model analyses were conducted (Bryk & Raudenbush, 1992; Hedeker, Gibbons, & Davis, 1991). First, a base model was estimated including only the intercept and dependent variable in the model. This allowed for an assessment of the magnitude of clustering effects. Second, individual provider characteristics representing Level 1 were entered and ICCs assessed again. Third, organizational characteristics at Level 2 were entered. Fourth, on the basis of the results of this procedure, the appropriate regression model was estimated. One-tailed significance tests were used for directional hypotheses and two-tailed tests for other hypotheses. Mplus (Muthén & Muthén, 1998) was used for factor analytic and multilevel regression analyses; however, multilevel regression models were also examined for convergence and consistency with MIXREG (Hedeker & Gibbons, 1996).
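
The clustering check at the first stage amounts to estimating an intercept-only random-effects model for each EBPAS score and computing the intraclass correlation from its variance components. Below is a minimal sketch with statsmodels (not the Mplus/MIXREG used in the study), assuming a DataFrame `data` with an `appeal` score and a `program` identifier:

```python
import statsmodels.formula.api as smf

# Base (intercept-only) model with a random intercept for program.
null_model = smf.mixedlm("appeal ~ 1", data=data,
                         groups=data["program"]).fit()

between_var = float(null_model.cov_re.iloc[0, 0])  # program-level variance
within_var = null_model.scale                      # residual variance
icc = between_var / (between_var + within_var)
print(f"Appeal ICC = {icc:.3f}")  # near-zero ICCs argue for standard regression
```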

RESULTS

Familiarity with the term “evidence-based practice” among program managers was low. The mean familiarity rating was 1.4 (SD = 1.39) indicating only a low level of familiarity with even the terminology of EBP. When another descriptor was provided (i.e., “empirically supported treatment”), this rating did not change. Although not a direct assessment of subordinate service provider knowledge, this indicates a relatively low level of familiarity with the notion of EBP and empirically supported treatment and supports the use of more general language in the scale development.

Factor analyses were conducted next. The EFA suggested a four-factor solution in accordance with examination of the scree plot, simple structure criteria, item-total correlations, and Cronbach’s alpha analysis of internal consistency reliability. Fifteen of the original eighteen items were retained and the EFA model accounted for 63% of the variance in the data. Table 1 shows overall means, standard deviations, item-total correlations, eigenvalues, internal consistency reliabilities, and item loadings for each of the scales. Cronbach’s alphas ranged from .59 to .90 with an overall scale alpha of .77. The factors represented four subscales of attitudes toward adoption of EBPs in keeping with hypothesized dimensions. Appeal (four items; α = .80) is the extent to which the provider would adopt a new practice if it is intuitively appealing, makes sense, could be used correctly, or is being used by colleagues who are happy with it. Requirements (three items; α = .90) is the extent to which the provider would adopt a new practice if it is required by an agency, supervisor, or state. Openness (four items; α = .78) is the extent to which the provider is generally open to trying new interventions and would be willing to try or use new types of therapy. Divergence (four items; α = .59) is the extent to which the provider perceives research-based interventions as not clinically useful and less important than clinical experience. Item analyses showed that the reliability coefficient for the Divergence scale would not have been improved by removing items from the scale. Although the internal consistency reliability for the Divergence scale was not optimal (i.e., <.60), such attitudes have been reported as an important construct in previous studies (Garland et al., 2003) and so the subscale was retained.

Table 1.

EBPAS Subscale and Item Means, Standard Deviations, Item-Total Correlations, Exploratory Factor Analysis Loadings, Eigenvalues, and Cronbach’s Alpha

| EBPAS subscales and total | Mean | SD | Item-total correlation | EV | α | Scale 1 | Scale 2 | Scale 3 | Scale 4 |
|---|---|---|---|---|---|---|---|---|---|
| 1. Requirements | 2.47 | 0.88 | | 4.31 | .90 | | | | |
| Agency required | 2.44 | 0.94 | .89 | | | .98 | | | |
| Supervisor required | 2.38 | 0.95 | .80 | | | .86 | | | |
| State required | 2.60 | 1.02 | .71 | | | .72 | | | |
| 2. Appeal | 2.90 | 0.67 | | 2.22 | .80 | | | | |
| Makes sense | 3.04 | 0.79 | .69 | | | | .86 | | |
| Intuitively appealing | 2.82 | 0.87 | .66 | | | | .84 | | |
| Get enough training to use | 3.13 | 0.80 | .53 | | | | .53 | | |
| Colleagues happy with intervention | 2.62 | 0.94 | .57 | | | | .48 | | |
| 3. Openness | 2.49 | 0.75 | | 1.56 | .78 | | | | |
| Will follow a treatment manual | 2.46 | 1.02 | .60 | | | | | .93 | |
| Like new therapy types | 2.52 | 0.95 | .54 | | | | | .63 | |
| Therapy developed by researchers | 2.62 | 0.89 | .63 | | | | | .56 | |
| Therapy different than usual | 2.39 | 0.99 | .54 | | | | | .54 | |
| 4. Divergence | 1.34 | 0.67 | | 1.36 | .59 | | | | |
| Research-based treatments not useful | 0.83 | 0.90 | .42 | | | | | | .65 |
| Will not use manualized therapy | 0.97 | 0.95 | .39 | | | | | | .47 |
| Clinical experience more important | 2.23 | 1.08 | .37 | | | | | | .45 |
| Know better than researchers | 1.35 | 1.07 | .32 | | | | | | .39 |
| EBPAS total | 2.30 | 0.45 | | | .77 | | | | |

Note. N = 322 for means, standard deviations, Cronbach’s alpha, and item-total correlations; n = 159 for exploratory factor analysis and eigenvalues; SD = standard deviation; EV = eigenvalue; α = Cronbach’s alpha; factor loadings <.28 are not shown.
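
The internal consistency values in Table 1 follow the standard Cronbach formula. A minimal sketch, using hypothetical 0–4 responses to the three Requirements items:

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical responses to the three Requirements items (0-4 scale).
requirements = pd.DataFrame({
    "agency_required":     [2, 3, 4, 1, 2],
    "supervisor_required": [2, 3, 4, 2, 2],
    "state_required":      [3, 3, 4, 1, 3],
})
print(round(cronbach_alpha(requirements), 2))
```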

Next, a CFA was conducted using data from the other randomly selected half of the sample, specifying the factor structure identified in the EFA. As shown in Fig. 1, CFA items were constrained to load only on the primary factor indicated in the previous analysis, thus providing a highly stringent test of the factor structure. As in the EFA, factor intercorrelations were allowed. The CFA factor loadings confirmed the EFA-based a priori factor structure, and the model demonstrated good fit (χ2(84) = 144.92, CFI = .93, TLI = .92, RMSEA = .067, SRMR = .077), further supporting the EBPAS factor structure. Factor intercorrelations ranged from r = .03 to r = .50: Appeal had a strong positive correlation with Openness, a moderate positive correlation with Requirements, and a moderate negative correlation with Divergence; the Requirements scale was moderately negatively correlated with Divergence; and Openness had no significant correlation with Divergence or Requirements.

Fig. 1. Confirmatory Factor Analysis Model of the Evidence-Based Practice Attitude Scale (EBPAS). n = 163, model fit (χ2(84) = 144.92, CFI = .93, TLI = .92, RMSEA = .067, SRMR = .077); *p < .05, **p < .01; all factor loadings are significant at p < .01.

The following sections examine each of the EBPAS subscale and total scores in relation to provider and organizational characteristics. Sample size for regression models varied slightly because of missing responses. Table 2 shows means and standard deviations for each of the EBPAS scales by each of the predictor variables used in the regression analyses below.

Table 2.

Means and Standard Deviations of Evidence-Based Practice Attitude Scale Scores by Predictor Variables

| Predictor | n | Appeal M (SD) | Requirements M (SD) | Openness M (SD) | Divergence M (SD) | EBPAS total M (SD) |
|---|---|---|---|---|---|---|
| Education level | | | | | | |
| Some college | 10 | 2.45 (0.47) | 2.13 (0.61) | 2.53 (0.61) | 1.53 (0.80) | 2.40 (0.37) |
| College graduate | 62 | 2.66 (0.81) | 2.49 (0.90) | 2.51 (0.67) | 1.27 (0.62) | 2.60 (0.50) |
| Some graduate work | 35 | 3.04 (0.64) | 2.83 (0.70) | 2.75 (0.88) | 1.47 (0.81) | 2.79 (0.47) |
| Master’s degree | 182 | 2.97 (0.61) | 2.49 (0.86) | 2.46 (0.74) | 1.32 (0.63) | 2.65 (0.44) |
| PhD/MD | 31 | 2.89 (0.72) | 2.09 (1.09) | 2.41 (0.79) | 1.37 (0.82) | 2.50 (0.53) |
| Professional status | | | | | | |
| Staff | 238 | 2.85 (0.69) | 2.43 (0.85) | 2.46 (0.72) | 1.38 (0.68) | 2.59 (0.46) |
| Intern | 79 | 3.08 (0.59) | 2.59 (0.97) | 2.59 (0.83) | 1.21 (0.62) | 2.76 (0.50) |
| Primary discipline | | | | | | |
| Social work | 101 | 2.89 (0.70) | 2.39 (0.86) | 2.51 (0.76) | 1.23 (0.65) | 2.64 (0.46) |
| MFT | 106 | 2.95 (0.65) | 2.43 (0.91) | 2.42 (0.69) | 1.43 (0.62) | 2.59 (0.47) |
| Psychology | 70 | 2.85 (0.64) | 2.51 (0.95) | 2.53 (0.86) | 1.30 (0.68) | 2.65 (0.51) |
| Psychiatry | 5 | 3.40 (0.89) | 2.67 (0.82) | 2.95 (1.01) | 1.65 (0.68) | 2.84 (0.51) |
| Other | 31 | 2.83 (0.68) | 2.72 (0.74) | 2.56 (0.64) | 1.50 (0.81) | 2.65 (0.43) |
| Program type | | | | | | |
| Inpatient | 6 | 3.00 (0.45) | 2.61 (1.08) | 2.83 (0.38) | 1.00 (0.79) | 2.86 (0.30) |
| Day treatment | 67 | 2.79 (0.75) | 2.60 (0.77) | 2.47 (0.71) | 1.50 (0.65) | 2.59 (0.45) |
| Outpatient | 134 | 3.01 (0.61) | 2.40 (0.94) | 2.43 (0.76) | 1.30 (0.66) | 2.63 (0.49) |
| Case management | 55 | 2.70 (0.63) | 2.38 (0.81) | 2.40 (0.74) | 1.35 (0.66) | 2.53 (0.45) |
| Wraparound | 60 | 2.94 (0.74) | 2.57 (0.91) | 2.72 (0.75) | 1.28 (0.71) | 2.74 (0.47) |
| Organization structure | | | | | | |
| High bureaucracy | 37 | 2.75 (0.55) | 2.09 (0.77) | 2.13 (0.53) | 1.44 (0.73) | 2.38 (0.41) |
| Low bureaucracy | 285 | 2.92 (0.69) | 2.52 (0.88) | 2.54 (0.76) | 1.33 (0.66) | 2.66 (0.47) |
| Practice policies | | | | | | |
| No | 254 | 2.86 (0.69) | 2.46 (0.91) | 2.49 (0.72) | 1.38 (0.68) | 2.61 (0.47) |
| Yes | 52 | 3.12 (0.61) | 2.58 (0.76) | 2.75 (0.71) | 1.19 (0.63) | 2.82 (0.44) |

Note. N = 322; n = number of service providers for each variable and may total less than 322 because of some missing data; M = mean; SD = standard deviation; MFT = marriage & family therapy; practice policies refer to having written policies regarding appropriate treatment for specific disorders.

Five regression models were conducted examining the association of independent variables with each of the five EBPAS scales. When ICCs for a dependent variable are negligible across organizational units and cluster size is small, there is little reason to conduct multilevel analysis (Kreft & de Leeuw, 1998; Snijders & Bosker, 1999). For the present study, ICCs and design effects were small for all dependent variables including Appeal (ICC = .099), Openness (ICC = .071), Requirements (ICC = .016), Divergence (ICC = .043), and the EBPAS total score (ICC = .101). Despite the small ICCs and design effects, multilevel regression analyses were attempted. When Level 1 predictors were added, ICCs were further reduced for Appeal (ICC = .063), Openness (ICC = .065), Divergence (ICC = .027), and the EBPAS total score (ICC = .09). Because of the negligible cross-organization variability and lack of residual variance, models that included both Level 1 and Level 2 predictors did not converge except for the Appeal scale. This indicates that remaining cross-organization variability of EBPAS scores was accounted for by predictor variables in the model. Thus, because of negligible clustering effects leading to nonconvergence of multilevel models, standard multiple linear regression analyses were conducted to examine the association of individual difference and contextual variables with EBPAS scale scores. The following analyses are reported for each EBPAS subscale and for the EBPAS total score.
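
The paper reports these ICCs and design effects as small without showing the arithmetic. As an illustrative check (my calculation, not the paper’s, using the conventional Kish design-effect formula and an average cluster size of roughly 322/51 ≈ 6.3 providers per program), even the largest ICC yields a design effect below the common rule-of-thumb threshold of 2 for ignoring clustering:

```latex
\mathrm{DEFF} = 1 + (\bar{m} - 1)\,\rho = 1 + (6.3 - 1)(0.101) \approx 1.54 < 2
```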

As hypothesized, scores on the Appeal scale were positively associated with higher educational attainment (β = .106, SE β = .042, p < .05) and intern status (β = .169, SE β = .098, p < .05), indicating that providers with higher educational attainment and interns endorsed more positive attitudes toward adoption of EBPs given their intuitive appeal. In regard to organizational variables, providers working in case management programs scored lower on the Appeal scale than those in outpatient programs (β = −.225, SE β = .114, p < .05), and providers working in programs with written practice policies scored higher on the Appeal scale (β = .219, SE β = .113, p < .05). For the Appeal scale, multilevel regression and standard regression models produced consistent results. Predictors accounted for 9.1% of the variance in Appeal scale scores.

Scores on the Openness scale were positively associated with intern status (β = .201, SE β = .105, p < .05), indicating higher Openness scores for interns. Openness was also positively associated with working in a wraparound program relative to an outpatient program (β = .298, SE β = .130, p < .05). Higher Openness scores were also found for providers working in low bureaucracy programs (β = .303, SE β = .138, p < .05) and in programs with written practice policies (β = .370, SE β = .121, p < .01). Predictors accounted for 9.0% of the variance in Openness scale scores.

Providers from day treatment programs scored higher on the Requirements scale than those from outpatient programs (β = .286, SE β = .150, p < .05), and providers working in less bureaucratic programs also scored higher on the Requirements scale (β = .342, SE β = .172, p < .05); both findings indicate more positive attitudes toward adopting EBPs when required to do so. Predictors accounted for 5.9% of the variance in the Requirements scale scores.

Interns scored lower on the Divergence scale than staff (β = −.216, SE β = .098, p < .05), indicating less perceived divergence between EBP and current practice. The model accounted for 6.4% of the variance in Divergence scale scores.

Finally, interns scored higher on the EBPAS total score (β = .182, SE β = .068, p < .05), indicating more positive global attitudes toward adoption of EBPs. Other positive associations were found for providers working in wraparound programs (β = .171, SE β = .083, p < .05), in less bureaucratic organizations (β = .209, SE β = .088, p < .05), and in programs with written policies regarding interventions for youth mental health problems (β = .248, SE β = .078, p < .01). Predictors accounted for 11.0% of the variance in EBPAS total scores.

DISCUSSION

The primary finding in this study is that attitudes toward adoption of EBPs can be identified and assessed among behavioral health care providers. The EBPAS subscales represent four distinct constructs involving willingness to adopt EBPs given their intuitive appeal, willingness to adopt new practices if required, general openness toward new or innovative practices, and perceived divergence of usual practice with academically developed or research-based practices. The EBPAS demonstrated good internal consistency reliability. Further study will be needed to examine the temporal reliability of the EBPAS and provide a more extensive assessment of validity.

Openness and Appeal scales were moderately to highly correlated, suggesting that openness to using innovations may be facilitated by the intuitive appeal of a given EBP. The Requirements scale was moderately positively associated with both Appeal and Openness, suggesting that providers with positive attitudes toward adoption of EBPs may also be more likely to comply with changes in practice that are part of work requirements. The moderate association between the Appeal and Requirements scales suggests that DI efforts may benefit from carefully balancing required changes in practice with making new innovations appealing to providers in order to facilitate the adoption of EBPs in real-world settings. The present findings converge with other studies showing that openness to innovation can be an important component of mental health program and organizational context in the development of a learning organization (e.g., Anderson & West, 1998; Birleson, 1999; Garvin, 1993). The Divergence scale was moderately negatively associated with Requirements, suggesting that those who perceive EBPs as being of little relevance are less likely to adopt a practice even when it is a job requirement.

The most consistent individual difference finding across EBPAS scales was that interns endorsed more positive attitudes toward adoption of EBPs relative to professional providers, consistent with hypotheses. Level of educational attainment was also associated with positive attitudes toward adopting EBPs given their intuitive appeal. Specifically, interns scored higher on the Appeal, Openness, and total EBPAS scales and lower on the Divergence scale, whereas higher educational attainment was associated with higher scores on the Appeal scale. Although education and internship overlap, level of educational attainment and internship represent related but qualitatively different aspects of a mental health provider’s professional development trajectory. This pattern suggests that although professional education leads to a conditional openness to EBPs, professional internships may be an especially opportune stage of service providers’ professional development in which to “plant seeds” and reinforce the value of the use of EBPs. Related research has shown that during preprofessional status, workers may be especially predisposed to the acquisition of new practices because of more malleable knowledge structures (e.g., Day, Arthur, & Gettman, 2001; Rentsch & Klimoski, 2001). Such flexibility may facilitate the effectiveness of training in EBPs. It follows that a staged model of education and acquisition of professional skills and attitudes could serve as a template for recommending optimal times to promote a well-considered flexibility in practice.

No significant differences were found in attitudes toward adoption of EBPs across discipline. It may be that there are other factors that interact with provider discipline in complex ways. For example, providers with different personality characteristics may respond to organizational constraints in complex ways and such characteristics are important to consider in understanding and improving job performance, workforce development, and personnel selection (Barrick & Mount, 1991). Such complex constraints should be explored further with sample sizes large enough to comprehensively evaluate subgroup results.

In addition to provider characteristics, program characteristics were related to EBPAS scores. In contrast to outpatient providers, behavioral health care providers working in wraparound programs were more open, and those working in case management programs less open, to adoption of EBPs, suggesting that it is important to consider the programmatic context into which EBPs are to be disseminated. This finding requires further study to delineate how program type interacts with provider attitudes.

Level of bureaucracy was also associated with attitudes toward adoption of EBPs. Providers working in low bureaucracy programs were more predisposed to adoption of EBPs, scoring higher on the Openness and Requirements scales. This indicates a general openness to new practices and willingness to engage in new practices when required to do so. One implication for tailoring DI efforts to specific operational contexts is that some organizations may be poised to respond to environmental contingencies such as changes in contracting and practice demands, whereas others may be less flexible in regard to changes in policies or procedures that allow for practice change. The notion of the organization as an adaptive system extends the research on learning organizations and holds promise as an explanatory model of change in behavioral health services (Jankowicz, 2000). To survive, organizations must be able to adapt to market demands. However, in public mental health services, these demands are less likely to be market driven than in the private sector. It is plausible that a more market-driven approach could lead to more receptivity to the use of new and innovative technologies. For example, contracts could be structured in a way that requires use of EBPs in usual care, with funding and professional status serving as incentives. Further research should examine these issues and the potential for cross-fertilization between business models and public sector service models. This may lead to more responsive organizations in which providers and provider organizations can grow and thrive while providing high quality, effective services for consumers.

The impact of providers working in settings with written policies for treatment of youth mental health problems has not been well studied. In this study, the presence of written policies regarding treatment of mental disorders was part of internal program initiatives that appear to predispose providers to being more open to new practices. Although this confirmed the hypothesis posed above, further work is needed to examine what types of policies, and under what conditions, lead to more innovation and adaptability in organizations. However, it is likely that top-down models of imposing practice policies may engender higher Divergence, especially where high caseloads and administrative demands compete with the provision of services (Garland et al., 2003).

This study differs from previous inquiries into clinician perceptions of psychotherapy practices (e.g., Cohen et al., 1986; Morrow-Bradley & Elliott, 1986) in that it includes providers from a number of different disciplines, with widely varying levels of education, practicing in real-world publicly funded mental health service settings. Thus, the ecological validity of this study is well supported. The present study provides preliminary support for the face and content validity of the EBPAS. Construct validity was explored in relation to both individual difference and organizational context variables. However, concurrent and predictive validity of the measure have yet to be assessed and further study is required.

Although structured approaches (e.g., manualization) may aid in the dissemination of EBPs, additional factors must be considered in order to most effectively change treatment practices. In keeping with other findings, this study suggests that both provider individual differences and contextual variation are important in understanding attitudes toward EBP (e.g., Glisson, 2002; Strupp & Anderson, 1997). However, there may be optimal times in a career trajectory in which to facilitate an ongoing openness to innovation. Further research should examine training models in various disciplines in order to identify components of training programs that would increase openness to adoption of EBPs while maintaining professional judgment and a balanced “scientist/practitioner” perspective on the judicious use of empirically supported interventions.

The EBPAS has practical utility in a number of ways. First, the scale taps provider attitudes likely to be related to elements of practice that may facilitate or hinder the adoption of EBPs in real-world settings. As suggested by findings for the Appeal scale, having a positively perceived local opinion leader to introduce and guide change in practice may facilitate receptivity and change in provider behavior (Denton, Smith, Faust, & Holmboe, 2001). Second, organizational factors may impact provider attitudes in ways that could facilitate or hinder DI efforts (Glisson, 2002). Finally, because the EBPAS is extremely brief (i.e., 15 items), taking about 1–2 min to complete, the measure can be used efficiently for research and in real-world practice settings to better understand the service context prior to DI of EBPs.

Community mental health settings exemplify complex systems, and many factors will affect adoption of EBPs in such settings. The experience, training, and primary discipline of the provider may affect what procedures are used to serve clients. For example, marriage and family therapists may be more open to adoption of interventions that are focused on family issues, psychologists may be more focused on individual treatment, and psychiatrists may be more focused on pharmacological treatments. The context of health care and behavioral health care is highly complex and inquiry into theory and change requires an equally complex perspective (Fraser & Greenhalgh, 2001). In addition to issues addressed in the present study, EBP dissemination efforts should consider the potential impact of financing structures, contractual constraints, political forces, and consumer preferences on how mental health programs provide services.

Provider attitudes toward innovation and EBP represent just one aspect of the complex landscape of health service delivery. Although priority areas for treatment research have been identified (e.g., Burns et al., 1999; Kazdin, Siegel, & Bass, 1990) and DI issues are being critically considered (e.g., Schoenwald & Hoagwood, 2001), priority areas for DI research have only begun to be delineated and explored. The study of attitudes toward EBP has the potential to facilitate a more thorough understanding of how service providers respond to change in organizational processes. The conceptual shift from treatment as usual to EBP values represents an important evolution in the culture of clinical practice. Although the use of EBP has been called for in medicine and behavioral health care, the link between provider attitudes toward adoption of EBPs and contextual variation has not been well studied. To better understand these links, measurement of these constructs must be refined. The present study represents an initial attempt to identify attitudinal constructs in a practical way. Attitudes toward adoption of EBPs clearly involve multiple domains that are associated with important individual and contextual constraints. Organizational and provider individual differences in attitudes toward EBPs should be examined further and be considered in tailoring DI strategies to be most effective for particular provider groups and across service settings.

Acknowledgments

This work was supported by NIMH grant MH01695. The author thanks the program managers and mental health providers who participated in this study and provided their valued perspectives and time. The author also thanks Drs. Ann Garland, Kristin Hawley, and John Landsverk for their comments on a previous version of this manuscript and Drs. Michael Hurlburt and May Yeh for their consultation and suggestions.

APPENDIX A: EVIDENCE-BASED PRACTICE ATTITUDE SCALE ITEMS AND SCORING INSTRUCTIONS

Instructions

The following questions ask about your feelings about using new types of therapy, interventions, or treatments. Manualized therapy, treatment, or intervention refers to any intervention that has specific guidelines and/or components that are outlined in a manual and/or that are to be followed in a structured or predetermined way. Indicate the extent to which you agree with each item using the following scale.

0 = Not at All; 1 = To a Slight Extent; 2 = To a Moderate Extent; 3 = To a Great Extent; 4 = To a Very Great Extent

| Item | Subscale | Question |
|---|---|---|
| 1 | 3 | I like to use new types of therapy/interventions to help my clients. |
| 2 | 3 | I am willing to try new types of therapy/interventions even if I have to follow a treatment manual. |
| 3 | 4 | I know better than academic researchers how to care for my clients. |
| 4 | 3 | I am willing to use new and different types of therapy/interventions developed by researchers. |
| 5 | 4 | Research based treatments/interventions are not clinically useful. |
| 6 | 4 | Clinical experience is more important than using manualized therapy/interventions. |
| 7 | 4 | I would not use manualized therapy/interventions. |
| 8 | 3 | I would try a new therapy/intervention even if it were very different from what I am used to doing. |

For questions 9–15: If you received training in a therapy or intervention that was new to you, how likely would you be to adopt it if:

| Item | Subscale | Question |
|---|---|---|
| 9 | 2 | it was intuitively appealing? |
| 10 | 2 | it “made sense” to you? |
| 11 | 1 | it was required by your supervisor? |
| 12 | 1 | it was required by your agency? |
| 13 | 1 | it was required by your state? |
| 14 | 2 | it was being used by colleagues who were happy with it? |
| 15 | 2 | you felt you had enough training to use it correctly? |

Note: Subscale 1 = Requirements; 2 = Appeal; 3 = Openness; 4 = Divergence.

Scoring the Subscales

The score for each subscale is created by computing a total or mean score for the items that load on a given subscale. For example, Items 11, 12, and 13 constitute subscale 1.

Computing the Total Scale Score

For the total score, all items from the Divergence subscale (Subscale 4) must be reverse scored before being used in computing the EBPAS total score.
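
A minimal sketch of these scoring rules (subscale means, plus reverse-scored Divergence items for the total), assuming responses are stored in hypothetical columns q1 through q15 on the 0–4 scale:

```python
import pandas as pd

SUBSCALES = {
    "requirements": ["q11", "q12", "q13"],        # Subscale 1
    "appeal":       ["q9", "q10", "q14", "q15"],  # Subscale 2
    "openness":     ["q1", "q2", "q4", "q8"],     # Subscale 3
    "divergence":   ["q3", "q5", "q6", "q7"],     # Subscale 4
}

def score_ebpas(responses: pd.DataFrame) -> pd.DataFrame:
    scores = pd.DataFrame(index=responses.index)
    for name, item_cols in SUBSCALES.items():
        scores[name] = responses[item_cols].mean(axis=1)
    # Reverse-score Divergence items (x -> 4 - x) before computing the total.
    reversed_divergence = 4 - responses[SUBSCALES["divergence"]]
    nonreversed = responses[SUBSCALES["requirements"]
                            + SUBSCALES["appeal"]
                            + SUBSCALES["openness"]]
    scores["ebpas_total"] = pd.concat(
        [nonreversed, reversed_divergence], axis=1).mean(axis=1)
    return scores
```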

Footnotes

4. The Evidence-Based Practice Attitude Scale and detailed scoring instructions may be obtained from the author.

References

  1. Aarons, G. A., Woodbridge, M., & Carmazzi, A. (2003). Examining leadership, organizational climate and service quality in a children’s system of care. Proceedings of the 15th Annual Research Conference, A System of Care for Children’s Mental Health: Expanding the Research Base, Tampa, FL.
  2. Addis, M. E. (2002). Methods for disseminating research products and increasing evidence-based practice: Promises, obstacles, and future directions. Clinical Psychology: Science and Practice, 9, 367–378.
  3. Addis, M. E., & Krasnow, A. D. (2000). A national survey of practicing psychologists’ attitudes toward psychotherapy treatment manuals. Journal of Consulting and Clinical Psychology, 68, 331–339. doi:10.1037//0022-006x.68.2.331
  4. Addis, M. E., Wade, W. A., & Hatgis, C. (1999). Barriers to dissemination of evidence-based practices: Addressing practitioners’ concerns about manual-based psychotherapies. Clinical Psychology: Science and Practice, 6, 430–441.
  5. Amodeo, M. (2000). The therapeutic attitudes and behavior of social work clinicians with and without substance abuse training. Substance Use and Misuse, 35, 1507–1536. doi:10.3109/10826080009148228
  6. Anderson, N. R., & West, M. A. (1998). Measuring climate for work group innovation: Development and validation of the team climate inventory. Journal of Organizational Behavior, 19, 235–258.
  7. Backer, T. E., David, S. L., & Soucy, G. (1995). Reviewing the behavioral science knowledge base on technology transfer (NIDA Research Monograph 155, NIH Publication No. 95-4035). Rockville, MD: National Institute on Drug Abuse.
  8. Backer, T. E., Liberman, R. P., & Kuehnel, T. G. (1986). Dissemination and adoption of innovative psychosocial interventions. Journal of Consulting and Clinical Psychology, 54, 111–118. doi:10.1037//0022-006x.54.1.111
  9. Baldassare, M., Shires, M. A., Hoene, C., & Koffman, A. (2000). Risky business: Providing local public services in Los Angeles County (ISBN 1-58213-022-1). San Francisco: Public Policy Institute of California.
  10. Ball, S., Bachrach, K., DeCarlo, J., Farentinos, C., Keen, M., McSherry, T., et al. (2002). Characteristics, beliefs and practices of community clinicians trained to provide manual-guided therapy for substance abusers. Journal of Substance Abuse Treatment, 23, 309–318. doi:10.1016/s0740-5472(02)00281-7
  11. Barrick, M. R., & Mount, M. K. (1991). The Big Five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44, 1–26.
  12. Birleson, P. (1999). Turning child and adolescent mental-health services into learning organizations. Clinical Child Psychology and Psychiatry, 4, 265–274.
  13. Bryk, A. S., & Raudenbush, S. W. (1992). Hierarchical linear models: Applications and data analysis methods. Thousand Oaks, CA: Sage.
  14. Burns, B. J., Hoagwood, K., & Mrazek, P. J. (1999). Effective treatment for mental disorders in children and adolescents. Clinical Child and Family Psychology Review, 2, 199–254. doi:10.1023/a:1021826216025
  15. Candel, M. J. J. M., & Pennings, J. M. E. (1999). Attitude-based models for binary choices: A test for choices involving an innovation. Journal of Economic Psychology, 20, 547–569.
  16. Cialdini, R. B., Bator, R. J., & Guadagno, R. E. (1999). Normative influences in organizations. In L. L. Thompson, J. M. Levine, & D. M. Messick (Eds.), Shared cognition in organizations: The management of knowledge (pp. 195–211). Mahwah, NJ: Erlbaum.
  17. Cohen, L. H., Sargent, M. M., & Sechrest, L. B. (1986). Use of psychotherapy research by professional psychologists. American Psychologist, 41, 198–206. doi:10.1037//0003-066x.41.2.198
  18. Damanpour, F. (1991). Organizational innovation: A meta-analysis of effects of determinants and moderators. Academy of Management Journal, 34, 555–590.
  19. Day, E. A., Arthur, W., Jr., & Gettman, D. (2001). Knowledge structures and the acquisition of a complex skill. Journal of Applied Psychology, 86, 1022–1033. doi:10.1037/0021-9010.86.5.1022
  20. Denton, G. D., Smith, J., Faust, J., & Holmboe, E. (2001). Comparing the efficacy of staff versus housestaff instruction in an intervention to improve hypertension management. Academic Medicine, 76, 1257–1260. doi:10.1097/00001888-200112000-00022
  21. Dunn, G., Everitt, B., & Pickles, A. (1993). Modeling covariances and latent variables using EQS. London: Chapman & Hall.
  22. Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., & Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4, 272–299.
  23. Fiol, C. M., & Lyles, M. A. (1985). Organizational learning. Academy of Management Review, 10, 803–813.
  24. Frambach, R. T., & Schillewaert, N. (2002). Organizational innovation adoption: A multi-level framework of determinants and opportunities for future research. Journal of Business Research, 55, 163–176.
  25. Fraser, S. W., & Greenhalgh, T. (2001). Complexity science: Coping with complexity: Educating for capability. BMJ, 323, 799–803. doi:10.1136/bmj.323.7316.799
  26. Garland, A. F., Kruse, M., & Aarons, G. A. (2003). Clinicians and outcome measurement: What’s the use? Journal of Behavioral and Health Services Research, 30, 393–405. doi:10.1007/BF02287427
  27. Garvin, D. A. (1993). Building a learning organization. Harvard Business Review, 71, 78–91.
  28. Glisson, C. (1992). Structure and technology in human service organizations. In Y. Hasenfeld (Ed.), Human services as complex organizations (pp. 184–202). Thousand Oaks, CA: Sage.
  29. Glisson, C. (2002). The organizational context of children’s mental health services. Clinical Child and Family Psychology Review, 5, 233–253. doi:10.1023/a:1020972906177
  30. Hasenfeld, Y. (Ed.). (1992). Human services as complex organizations. Newbury Park, CA: Sage.
  31. Hedeker, D. R., & Gibbons, R. D. (1996). MIXREG: A computer program for mixed-effects regression analysis with autocorrelated errors. Computer Methods and Programs in Biomedicine, 49, 229–252. doi:10.1016/0169-2607(96)01723-3
  32. Hedeker, D. R., Gibbons, R. D., & Davis, J. M. (1991). Random regression models for multicenter clinical trials data. Psychopharmacology Bulletin, 27, 73–77.
  33. Henggeler, S. W., & Schoenwald, S. K. (2002). Treatment manuals: Necessary, but far from sufficient [Commentary]. Clinical Psychology: Science and Practice, 9, 419–420.
  34. Hoagwood, K., Burns, B. J., Kiser, L., Ringeisen, H., & Schoenwald, S. K. (2001). Evidence-based practice in child and adolescent mental health services. Psychiatric Services, 52, 1179–1189. doi:10.1176/appi.ps.52.9.1179
  35. Hoagwood, K., & Olin, S. S. (2002). The NIMH Blueprint for Change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry, 41, 760–767. doi:10.1097/00004583-200207000-00006
  36. Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55.
  37. Jankowicz, D. (2000). From “learning organization” to “adaptive organization.” Management Learning, 31, 471–490.
  38. Kazdin, A. E., Siegel, T. C., & Bass, D. (1990). Drawing on clinical practice to inform research on child and adolescent psychotherapy: Survey of practitioners. Professional Psychology: Research and Practice, 21, 189–198.
  39. Kelloway, E. K. (1998). Using LISREL for structural equation modeling: A researcher’s guide. Thousand Oaks, CA: Sage.
  40. Kreft, I., & de Leeuw, J. (1998). Introducing multilevel modeling. Thousand Oaks, CA: Sage.
  41. Lehman, W. E. K., Greener, J. M., & Simpson, D. D. (2002). Assessing organizational readiness for change. Journal of Substance Abuse Treatment, 22, 197–209. doi:10.1016/s0740-5472(02)00233-7
  42. Loy, J. W., Jr. (1968). Social psychological characteristics of innovators. American Sociological Review, 34, 73–82.
  43. Margison, F. (2001). Practice-based evidence in psychotherapy. In C. Mace, S. Moorey, et al. (Eds.), Evidence in the psychological therapies: A critical guide for practitioners (pp. 174–198). New York: Brunner-Routledge.
  44. McCrae, R. R., & Costa, P. T., Jr. (2003). Personality in adulthood: A five-factor theory perspective (2nd ed.). New York: Guilford Press.
  45. Morrow-Bradley, C., & Elliott, R. (1986). Utilization of psychotherapy research by practicing psychotherapists. American Psychologist, 41, 188–197. doi:10.1037//0003-066x.41.2.188
  46. Muthén, L. K., & Muthén, B. O. (1998). Mplus user’s guide. Los Angeles: Muthén & Muthén.
  47. Nunnally, J., & Bernstein, I. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
  48. Ogborne, A. C., Wild, T. C., Braun, K., & Newton-Taylor, B. (1998). Measuring treatment process beliefs among staff of specialized addiction treatment services. Journal of Substance Abuse Treatment, 15, 301–312. doi:10.1016/s0740-5472(97)00196-7
  49. Pithouse, A., & Scourfield, J. (2002). Ready for practice? The DipSW in Wales: Views from the workplace on social work training. Journal of Social Work, 2, 7–27.
  50. Prager, E. (1986). Bureaucracy’s impact on decision making in long-term care. Health and Social Work, 11, 275–285. doi:10.1093/hsw/11.4.275
  51. Prochaska, J. O., & Norcross, J. C. (1983). Contemporary psychotherapists: A national survey of characteristics, practices, orientations, and attitudes. Psychotherapy: Theory, Research and Practice, 20, 161–173.
  52. Rentsch, J. R., & Klimoski, R. J. (2001). Why do ‘great minds’ think alike? Antecedents of team member schema agreement. Journal of Organizational Behavior, 22, 107–120.
  53. Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.
  54. Schoenwald, S. K., & Hoagwood, K. (2001). Effectiveness, transportability, and dissemination of interventions: What matters when? Psychiatric Services, 52, 1190–1197. doi:10.1176/appi.ps.52.9.1190
  55. Simpson, D. D. (2002). A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment, 22, 171–182. doi:10.1016/s0740-5472(02)00231-3
  56. Snijders, T. A. B., & Bosker, R. J. (1999). Multilevel analysis: An introduction to basic and advanced multilevel modeling. Thousand Oaks, CA: Sage.
  57. Strosahl, K. (1998). The dissemination of manual-based psychotherapies in managed care: Promises, problems, and prospects. Clinical Psychology: Science and Practice, 5, 382–386.
  58. Strupp, H. H., & Anderson, T. (1997). On the limitations of therapy manuals. Clinical Psychology: Science and Practice, 4, 76–82.
  59. Thyer, B. A., & Polk, G. (1997). Social work and psychology professors’ scholarly productivity: A controlled comparison of cited journal articles. Journal of Applied Social Sciences, 21, 105–110.
  60. Tormala, Z. L., & Petty, R. E. (2002). What doesn’t kill me makes me stronger: The effects of resisting persuasion on attitude certainty. Journal of Personality and Social Psychology, 83, 1298–1313. doi:10.1037//0022-3514.83.6.1298
  61. Turnbull, J. E., & Dietz-Uhler, B. (1995). The Boulder Model: Lessons from clinical psychology for social work training. Research on Social Work Practice, 5, 411–429.
  62. Watkins, M. (2001). Principles of persuasion. Negotiation Journal, 17, 115–137.
  63. Williams, D. D. R., & Garner, J. (2002). The case against ‘the evidence’: A different perspective on evidence-based medicine. British Journal of Psychiatry, 180, 8–12. doi:10.1192/bjp.180.1.8
