Health Services Research. 2011 Nov 18;47(3 Pt 1):1068–1090. doi: 10.1111/j.1475-6773.2011.01352.x

Management Systems, Patient Quality Improvement, Resource Availability, and Substance Abuse Treatment Quality

Dail Fields 1, Paul M Roman 2,3, Terry C Blum 4
PMCID: PMC3290740  NIHMSID: NIHMS331909  PMID: 22098342

Abstract

Objective

To examine the relationships among general management systems, patient-focused total quality management/continuous quality improvement (TQM/CQI) processes, resource availability, and multiple dimensions of substance use disorder (SUD) treatment quality.

Data Sources/Study Setting

Data come from a nationally representative sample of 221 SUD treatment centers surveyed through the National Treatment Center Study (NTCS).

Study Design

The design was a cross-sectional field study using latent variable structural equation models. The key variables are management practices, TQM/continuous quality improvement (CQI) practices, resource availability, and treatment center performance.

Data Collection

Interviews and questionnaires provided data from treatment center administrative directors and clinical directors in 2007–2008.

Principal Findings

Patient-focused TQM/CQI practices fully mediated the relationship between internal management practices and performance. The effects of TQM/CQI on performance are significantly larger for treatment centers with higher levels of staff per patient.

Conclusions

Internal management practices may create a setting that supports implementation of the specific patient-focused practices and protocols inherent to TQM/CQI processes. However, the positive effects of internal management practices on treatment center performance occur through the use of specific patient-focused TQM/CQI practices and have more impact when greater supporting resources are present.

Keywords: Internal management practices, quality management, substance use disorder treatment quality, behavioral health services


High performance management practices and total quality management/continuous quality improvement (TQM/CQI) processes are two theory-based approaches that may enhance health care organizations’ performance (Dean and Bowen 2003; Ghoshal and Bartlett 2005; Huselid 2009; Shortell et al. 1998; Reed, Lemak, and Montgomery 2010; Kaynak 2009). Some argue that good internal management practices provide the greatest boost to organizational performance (Shortell et al. 1998), while others suggest that management practices impact performance primarily through facilitation of specific patient-focused approaches inherent in TQM/CQI processes (Kaynak 2009; Khatri et al. 2009). However, environmental strains may alter the effectiveness of either of these approaches, especially in complex health care organizations (Lozeau, Langley, and Denis 2007; D'Aunno 2008). Health care leaders may thus have to choose whether to enhance general management approaches or concentrate on implementation of specific patient-focused TQM/continuous process improvement (CPI) processes. An organization may adopt a high-performance management philosophy, but not see expected improvements in quality of patient care because an essential focus on patients has not been explicitly emphasized (Lozeau, Langley, and Denis 2007; Khatri et al. 2009).

This study examines the impact of both general management practices and specific patient-focused TQM/CPI processes on performance within organizations that face complex and constant environmental turbulence, that is, a nationally representative sample of specialized substance use disorder (SUD) treatment centers. These organizations comprise a significant component of the health care industry, with estimated gross expenditures on SUD treatment in 2006 exceeding $20 billion (Kimberly and McLellan 2010; Mark et al. 2002). This growth partly reflects the legitimation of treatment as a component of the “War on Drugs” in the United States (DeGrandpre 1994), but also the growing institutionalization of treatment as the preferred societal response to SUDs. Federal legislation has recently created “parity” for reimbursement of treatment for SUDs (Barry, Huskamp, and Goldman 1994), and expansion of SUD treatment coverage is a significant inclusion in Federal health care reform (Buck 2008).

Growth of SUD treatment organizations has been driven by both public and private funding, but this ever-changing mix and erratic funding levels create an uncertain resource environment. Central to this uncertainty are shifting medical and legal ambiguities surrounding the treatment of SUDs. For example, medications that aid the recovery process, recently expanded in variety and effectiveness, potentially conflict with long-standing regimens of treatment based on 12-step principles, advocating combinations of self-help, social support, and spirituality (Abraham and Roman 2010). In addition, a substantial proportion of SUD patients may be processed through the criminal justice system and face being imprisoned or re-imprisoned for nonconformity to prescribed treatment (Nolan 2002; Henderson and Taxman 2001). These anomalies both contribute to and are exacerbated by limitations in third-party treatment reimbursement yet to be significantly impacted by the aforementioned “parity” legislation (Smith, Lee, and Davidson 1997).

Public funding is substantial but flows through multi-layered fragmented decision making as each state and territory has substantial autonomy in allocating and controlling expenditure of Federal funding, including Medicaid. These decisions can be both fluid and unpredictable as they are variably impacted by politically based legislative agendas (DeGrandpre 1994; Smith, Lee, and Davidson 1997) and high turnover in state-level leadership. Uncertainty also stems from recent pressure from policy and funding agencies for re-engineering SUD treatment through implementing evidence-based practices (EBPs) (Lamb, Greenlick, and McCarty 1993; Miller et al. 2004).

Substance use disorder treatment centers offer a rich opportunity for re-examining previous research findings about the respective effects of internal management practices and specific patient-focused TQM/CPI practices in a context of substantial environmentally generated stress and uncertainty, but where better performance in patient care may buffer persistent environmental turbulence. In this study, we examine relationships among the use of high performance management practices, patient-focused TQM/CQI processes, and multiple dimensions of treatment center performance in a nationally representative sample of 221 SUD treatment centers surveyed in 2007–2008. As these treatment centers operate with high environmental uncertainty, we also examine how resource availability influences the effects of internal management practices and patient-focused TQM/CPI processes on performance (Zinn and Flood 1997; Pollack and D'Aunno 1999).

Literature Review

A central theoretical question in this study is whether organizational performance is predicted primarily by general management practices or primarily through implementation of specific patient-focused TQM/CQI processes. Across industries, effective management of organizations’ human capital has been suggested as the only truly sustainable source of competitive advantage, performance, and survival (Huselid 2009; Shortell et al. 1998; Snell and Youndt 1995; Evans and Davis 2000). However, an alternative view suggests internal management practices may impact performance largely through application of specific management disciplines such as TQM/CQI processes. The specific patient focus of these processes may limit the distractions inherent to environmental turbulence and other external pressures (Anderson, Rungtusanatham, and Schroeder 2009; Dean and Bowen 2003; Kaynak 2009; Khatri et al. 2009).

Internal Management Practices

High-performance management practices may include elements of a commitment-based approach, including self-managed teams, decentralized decision making, knowledge sharing, inter-functional coordination, and extensive employee training (Guthrie 2001; Evans and Davis 2000; Khatri et al. 2009). Of particular interest here are the four components of internal management articulated and examined by Ghoshal and Bartlett (2005): discipline (rational decision making and clear standards of performance), stretch (establishment of ambitious objectives), trust (fairness in organizational decisions and employee involvement), and support (freedom of initiative in work tasks).

There is evidence that management approaches such as fairness and employee participation may be positively related to higher quality of patient care and fewer errors within hospitals (Shortell et al. 1998; Khatri et al. 2009). Other studies found that patient engagement in SUD treatment was higher in treatment programs with clear goals and long-term plans (discipline), trial of new techniques (stretch), including staff views in organizational decision making (trust), and support of staff autonomy in job direction (support) (Broome et al. 1981; Greener et al. 1996).

Total Quality Management/Continuous Quality Improvement

Numerous studies of manufacturing and services industries show that use of TQM/CQI enhances outcomes for clients (Dean and Bowen 2003; Shortell et al. 1998; Reed, Lemak, and Montgomery 2010; Milne, Blum, and Roman 1982). The general characteristics of TQM/CQI practices include a focus on service to customers, objectives of continuous improvement, and structured problem solving by employees (Westphal, Gulati, and Shortell 1995). However, the principal distinguishing characteristic of TQM/CQI is its focus on clients, including knowledge of customer viewpoints, measurement of service and operational results, and information management (Dean and Bowen 2003; Waldman 1994; Shortell et al. 1998; Westphal, Gulati, and Shortell 1995; Chong et al. 2011; Kennedy and Fiss 2005). While the TQM/CQI terminology has been applied to a wide range of management tactics, its essence lies in a focus on patient outcomes; consistent with previous studies in health care settings, those are the elements we focus on in this study (Shortell et al. 1998; Shortell, Bennett, and Byck 1998).

Previous studies have found that, after controlling for size and ownership type, TQM/CQI implementation positively predicted better hospital internal management, patient outcomes, and financial and accreditation performance (Shortell et al. 1998; Douglas and Judge 2006). Using data collected in 2002–2004, Fields and Roman (2010) found that TQM/CQI practices explained significant incremental variance in comprehensive care and implementation of EBPs, both indicators of organizational performance in SUD treatment.

Performance of SUD Treatment Centers

Although patient outcomes might be the preferred measures of performance for SUD treatment organizations, such measures are complicated by differing medical baselines at treatment entry, the difficulty of obtaining adequately representative data, and potentially biased self-reports of behavior. Alternatively, performance may be measured by evidence of processes known to lead to better patient outcomes, such as comprehensive care, use of EBPs, and minimal patient waiting time prior to treatment (Carr et al. 2007; Harris et al. 2008; Fields and Roman 2005). Comprehensive care includes detailed treatment plans, effective core treatment services, and linking patients with needed medical, psychiatric, and social services (NIDA 2000; Etheridge and Hubbard 2000; Weisner and McLellan 1994). EBPs in SUD treatment include cognitive behavioral therapy, motivational enhancement therapy, and contingency management, as well as five medications: disulfiram, naltrexone, acamprosate, buprenorphine, and selective serotonin reuptake inhibitors (SSRIs) (Lamb, Greenlick, and McCarty 1993; Ball and Young 1988; Institute of Medicine 1995; Knudsen, Ducharme, and Roman 2007). Shorter wait times have been empirically associated with lower rates of pretreatment attrition, lower dropout rates, and less relapse to drug use (Carr et al. 2007).

Effects of Treatment Center Resources

Resource availability may enhance the effects that general management practices and specific TQM/CPI processes have on treatment center performance (Bourgeois 1981; Meyer 2007; Zinn and Flood 1997). As Picone and colleagues (1981) note, residual resources can be applied to patient care and/or improvement of management processes. In addition, more slack resources available in a treatment center may limit the effects of external pressures, which can corrupt the implementation of TQM/CPI practices, thus reducing their positive effects on patient-focused performance (Westphal, Gulati, and Shortell 1995; Lozeau, Langley, and Denis 2007; Kennedy and Fiss 2005). Based on previous studies, we index resource availability as the number of full-time employees per patient in treatment (Pollack, D'Aunno, and Lamar 2003; Pollack and D'Aunno 1999).

Study Hypotheses

Management practices may impact organizational performance primarily through implementation of specific TQM/CQI practices (Easton and Jarrell 2000; Kaynak 2009; Goonan and Stoltz 1994; Khatri et al. 2009). For example, Anderson, Rungtusanatham, and Schroeder (2009) have shown that many aspects of Deming's management practices create an environment conducive to implementation of specific patient-focused techniques integral to TQM/CQI. Kaynak (2009) found that general management practices, employee relations, and training/development affected outcomes through specific TQM/CPI processes such as collection and use of quality data and process improvement efforts. In hospitals, studies have found that patient-focused TQM/CPI practices mediated the effects of general management philosophies and organizational culture on quality of patient care and hospital performance (Shortell et al. 1998; Khatri et al. 2009). Thus, our primary research hypothesis is:

H1: Patient-focused TQM/CQI practices mediate the effects of internal management practices on treatment center performance.

We test this hypothesis by comparing the fit of the mediated model with an alternative model specifying direct effects of both internal management practices and patient-focused TQM/CQI processes (Anderson and Gerbing 2010). Although the simultaneous relationship of both constructs with organizational performance is rarely examined, studies in different industries have found possible direct effects of internal management practices on performance (Huselid 2009; Snell and Youndt 1995; Kehoe and Wright 2003). Other studies of SUD treatment centers suggest possible direct relationships between management practices and higher levels of patient involvement and shorter waiting periods to enter treatment (Greener et al. 1996; Carr et al. 2008).

Greater levels of resources may limit the effects of environmental turbulence, allowing treatment centers to focus TQM/CPI efforts on improving patient care. Treatment centers’ application of relative resource munificence to patient-centered performance is particularly significant because most SUD patients may have limited information about treatment alternatives. Treatment center resource limitations may also reduce the availability of employee time to apply internal management practices or TQM/CQI practices. That is, smaller ratios of employees to patients may stretch staff so thin that providing comprehensive care and using EBPs become lower priorities than just keeping up with patients. Our second research hypothesis is:

H2: Treatment center resource availability moderates the effects of patient-focused TQM/CQI practices such that the impact of TQM/CQI on performance is significantly larger when resources are greater.

Methods

Sample and Procedures

Data were collected in the National Treatment Center Study (NTCS), a family of national surveys of SUD treatment providers. Sample eligibility required the offering of alcohol and drug treatment at American Society of Addiction Medicine Level 1 outpatient services or higher. The NTCS uses a two-stage random sample of treatment centers, first stratifying counties by size and then randomly sampling within strata. In the second stage, using national and state directories, all eligible urban, suburban, and rural SUD treatment facilities in the sampled counties were enumerated. Centers were then proportionately sampled across strata, with telephone screening used to establish eligibility for the study. Facilities screened as ineligible were replaced by random selection from the same geographic stratum. Recruitment continued for an 18-month period, resulting in 345 privately funded treatment centers in the sample. Eligibility requirements excluded counselors in private practice, halfway houses, and transitional living facilities; programs offering exclusively methadone maintenance, court-ordered driver education, or detoxification services; and programs located in correctional and Veterans Health Administration facilities.

Two definitional criteria distinguish the NTCS. First, the unit of analysis is the organization, rather than the service delivery unit. Thus, treatment centers offering multiple treatment modalities contribute data on all available treatments, of particular importance when assessing comprehensiveness of services. Second, programs are defined as private sector if at least 50 percent of their annual operating revenues come from commercial insurance, patient fees (including Medicaid and Medicare), and income sources other than government grants or contracts.

Data used in this study were collected during 2007–2008 through face-to-face interviews with administrative and clinical directors, coupled with written questionnaires. Data about internal management practices and patient-focused TQM/CQI practices were provided by the administrative director. Information about patient care was provided by the clinical director. Some measures were collected in questionnaires completed by the administrative director. Use of questionnaire data reduced the sample to 221 centers, or 64 percent of the 345 that participated in interviews. Following Goodman and Blum (1996), we estimated a logistic regression model to assess the representativeness of the 221 treatment centers relative to the total sample. This analysis revealed no significant differences between centers included and not included in the analysis for comprehensive care, use of EBPs, waiting time to treatment, TQM/CQI practices, and a group of treatment center characteristics (listed in the description of control variables below). From a practical standpoint, the centers in our study sample were somewhat smaller on average than centers not returning administrator questionnaires.

Quantitative, empirical investigations of organizations are often faced with a lack of archival data about constructs of interest, and thus often rely on data reported by key informants, as is the case here (Kumar, Stern, and Anderson 1993). Key informants have been used in research studies to describe organizational variables including innovation adoption, environmental influences, power of major suppliers and customers, human resource practices, and quality management practices (Phillips 1981; Blum, Fields, and Goodman 1994; Fields, Goodman, and Blum 2005; Fields and Roman 2005). As top managers, the key informants in our sample were knowledgeable about the areas being studied, and were able and willing to communicate about these areas (Kumar et al. 1993).

Measures

In this study, we have conceptualized internal management practices, TQM/CQI processes, and treatment center performance as latent random variables as defined by Bollen (2010). This definition is consistent with widely used views that latent variables are constructs that cannot be directly measured in a sample of respondents, such as our key informants. As such, these latent variables are represented through actual values of observed random variables (Bollen 2010). We examine the association of these measured indicators with the latent constructs in our analyses (Jöreskog and Sörbom 1996; Bollen 2010). Consequently, the following discussion describes the measured random variables used to represent each latent variable.

Treatment Center Performance (Latent Variable)

Treatment center performance was measured using three indicators describing the extent to which the center provided comprehensive care, the extent of use of EBPs, and the average time in days between patient contact with the center and entry into treatment.

Comprehensive Care

Following Ducharme and colleagues (2001), the extent of comprehensive care was measured using an index of core and ancillary services. This index was the sum of indicators of whether a center provides: (1) use of the addiction severity index (ASI) at intake/assessment; (2) patient random drug testing; (3) one or more 12-step groups; (4) use of any of five pharmacotherapies (buprenorphine, naltrexone, methadone, disulfiram, and/or SSRIs); (5) aftercare/continuing care; (6) childcare services; (7) transportation assistance; (8) dedicated treatment for HIV/AIDS patients; and (9) integrated care for patients with dual diagnosis of both addiction and psychiatric conditions. In addition, the index included the mean of five additional variables describing on a 0-to-1 scale the extent to which a center links patients with primary medical care, employment, financial, family, and legal services.
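The index construction just described (sum of nine binary indicators plus the mean of five 0-to-1 linkage ratings) can be sketched for a single hypothetical center; the service profile below is invented for illustration:

```python
# Hypothetical service profile; keys paraphrase the nine binary
# indicators and five 0-to-1 linkage ratings described above.
binary_services = {
    "asi_at_intake": 1, "random_drug_testing": 1, "twelve_step_groups": 1,
    "any_pharmacotherapy": 0, "aftercare": 1, "childcare": 0,
    "transportation": 1, "hiv_aids_treatment": 0, "dual_diagnosis_care": 1,
}
linkage_ratings = {  # extent of linkage on a 0-to-1 scale
    "medical": 1.0, "employment": 0.5, "financial": 0.0,
    "family": 0.5, "legal": 1.0,
}

# Index = sum of binary indicators + mean of the five linkage ratings
index = sum(binary_services.values()) + \
        sum(linkage_ratings.values()) / len(linkage_ratings)
print(index)  # 6 binary services + mean linkage of 0.6 -> 6.6
```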

Use of EBPs

This variable was measured with an index based on indicators of the use of any of three FDA-approved medications for the treatment of SUDs as of 2004 (disulfiram, naltrexone, and buprenorphine); SSRIs; and behavioral interventions, including manualized motivational enhancement therapy, contingency management, dual-focus schema therapy, cognitive behavioral therapy, the Matrix Model, multi-systemic therapy, community reinforcement, and supportive-expressive psychotherapy (NIDA 2000; Emmelkamp and Vedel 2006).

Waiting Time to Enter Treatment

This variable was measured as the clinical director's response to the question “For a new patient, what is the average length of time (in days) between the time they first contact the center and the time they attend their first treatment session?” asked during the on-site interview.

Internal Management Practices and Patient-Focused TQM/CQI Practices

The internal management practices used by each treatment center were described by indicators corresponding to the four major aspects of effective management practices of discipline, “stretch,” support, and trust (Ghoshal and Bartlett 2005). The items used in the scales for each measure are shown in Appendix SA2.

Discipline was measured by a 9-item scale (α = 0.83) based on items used previously by Khandwalla to describe managerial approach (Milne, Blum, and Roman 1982).

Stretch was measured by a 6-item scale (α = 0.89) adapted from a measure of entrepreneurial orientation developed by Anderson, Covin, and Slevin. Factor analysis of the scale indicated a single dimension.

Support was measured using a 6-item scale (α = 0.86) that described the extent to which the center supports training and development of treatment staff, encourages novel ideas by staff, and supports new and changing technology.

Trust was measured using a 5-item scale (α = 0.73) based on a measure used previously by Hage (Milne, Blum, and Roman 1982). As the items were designed to measure centralized control of work decisions, the responses were reverse coded to reflect trust in employee choices about job-related matters.

The extent to which each center uses patient-focused TQM/CQI practices was measured using indicators of three elements specified by the Baldrige award criteria. These are measurement of performance, customer knowledge, and information management. The items used in the scales for each measure are shown in Appendix SA2.

Measurement of performance was captured by a 6-item scale (α = 0.72) describing the extent to which a center measures quality of treatment, financial performance, efficiencies, and referral source satisfaction.

Customer knowledge was measured with an 11-item scale (α = 0.79) describing the extent to which a center collects and uses data about patient, referral source, and third-party payer satisfaction levels to influence decisions about patient treatment and retention.

Information management was measured with a 4-item scale (α = 0.69) that describes the extent to which the center has computerized records for patient demographics, assessment, and medical record information.

Organizational Resources

The relative resource availability for each center was measured as the ratio of total full-time equivalent (FTE) employees in the center to the average patient census across all modalities of treatment offered.

Control Variables

In the analysis we controlled for characteristics that could be related to center performance (Ducharme et al. 2001; Carr et al. 2007; Fields and Roman 2005). These variables were: hospital based (Y/N); nonprofit status (Y/N); offering both inpatient and outpatient treatment (Y/N); percentage of referrals from social service agencies; percentage of counselors with master's degrees; and percentage of patients with opiates as the primary drug problem.

Results

The correlations among the study variables are shown in Table 1.

Table 1.

Correlations among the Measured Variables (N = 221 Treatment Centers)

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16
1. Comprehensive care
2. Use of EBPs 0.71
3. Time to treatment −0.16 −0.17
4. Information management 0.19 0.19 −0.07
5. Measure performance 0.21 0.13 −0.17 0.14
6. Customer knowledge 0.15 0.04 −0.15 0.24 0.44
7. Discipline 0.03 0.05 0.08 0.06 0.08 0.19
8. Trust −0.09 0.01 0.07 0.03 −0.02 0.05 0.17
9. Stretch 0.18 0.09 −0.05 0.06 0.19 0.16 0.35 0.08
10. Support 0.10 0.21 −0.09 0.07 0.21 0.04 0.41 0.14 0.33
11. Hospital 0.25 0.15 −0.11 0.14 −0.04 −0.18 −0.19 −0.04 −0.06 −0.09
12. Nonprofit 0.20 0.09 0.18 0.10 −0.02 0.01 −0.06 −0.06 0.00 −0.08 0.25
13. Both in/outpatient 0.53 0.28 −0.18 0.14 0.15 0.10 −0.12 −0.06 0.15 0.01 0.24 0.23
14. Pct. social service referrals 0.16 0.08 0.04 0.10 0.07 0.09 −0.05 −0.03 −0.06 0.10 0.01 0.07 0.07
15. Pct. counselors w/masters 0.22 0.35 −0.15 0.00 0.06 −0.17 −0.01 0.09 −0.10 0.01 0.14 0.14 0.05 −0.11
16. Pct. opiate patients 0.26 0.22 −0.20 0.10 0.27 0.20 0.02 0.01 0.19 0.22 −0.05 −0.10 0.18 0.22 0.03
17. FTEs/patient 0.08 0.05 −0.06 0.05 0.07 0.01 −0.10 0.03 0.07 0.01 −0.03 −0.01 0.19 −0.04 0.06 0.01

Notes. Correlations larger than 0.13 are significant at p < .05; correlations larger than 0.18 are significant at p < .01.

EBPs, evidence-based practices; FTEs, full-time equivalents.
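The significance thresholds in the table note follow from the standard t-test for a correlation coefficient, |r| > t / sqrt(t² + df) with df = N − 2. A quick check reproduces the p < .05 cutoff of about 0.13; the p < .01 cutoff computes to roughly 0.17, slightly below the 0.18 stated:

```python
from scipy import stats

n = 221
df = n - 2
r_crit = {}
for alpha in (0.05, 0.01):
    # Two-tailed critical t, converted to a correlation threshold:
    # |r| = t / sqrt(t^2 + df)
    t = stats.t.ppf(1 - alpha / 2, df)
    r_crit[alpha] = t / (t**2 + df) ** 0.5
    print(f"alpha = {alpha}: |r| > {r_crit[alpha]:.3f}")
```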

We tested hypothesis 1 by comparing the fit of alternative structural models of latent random variables connecting treatment center internal management practices, the use of patient-focused TQM/CQI practices, and treatment center performance. We then used the process described by Edwards and Lambert (2007) and Marsh, Wen, and Hau (2004) to test hypothesis 2 concerning the moderating effects of resource levels. We estimated the parameters and fit of the measurement and structural models using LISREL 8.8 (Scientific Software International, Lincolnwood, IL, USA).

Measurement and Structural Models

First, we examined a measurement model for the exogenous variables patient-focused TQM/CPI practices and management practices. All of the loadings of the measured indicators on the two latent variables were statistically significant and the model showed a close fit to the data as detailed in Figure 1.

Figure 1.

Parameter Estimates and Goodness-of-Fit Measures for Measurement Model of Exogenous Latent Variables

Notes. **p < .01. Measurement model fit: χ2(13) = 17.50, n.s.; root mean squared error of approximation (RMSEA) = 0.04; normed fit index (NFI) = 0.92; comparative fit index (CFI) = 0.98; goodness-of-fit index (GFI) = 0.98; ratio of chi-squared/d.f. = 1.35.

To check for discriminant validity of the exogenous measures, we performed a confirmatory factor analysis, comparing the fit of this model to the fit of a one-factor model. The fit of the two-factor model was significantly better (Δχ2 = 42.67, d.f. = 1, p < .01). To cross-validate the measurement model for the exogenous latent variables, we randomly split the sample and compared the parameters of the measurement models in each subsample (Anderson and Gerbing 2010). None of the loadings of the measured indicators differed significantly in the cross-validation subsamples (average t values across seven loadings = 0.44, d.f. = 17, n.s.). As an additional test for the TQM/CPI latent variable, we re-estimated the measurement model allowing correlated error terms among the indicators. The fit of this model did not differ significantly (Δχ2 = 0.15, d.f. = 3, n.s.). Finally, we examined the measurement model for treatment center performance. In this model all loadings of the measured indicators on the latent performance variable were statistically significant and the model closely fit the data.
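The two chi-square difference tests above can be checked directly from the chi-square survival function; the Δχ² and d.f. values are those reported in the text:

```python
from scipy import stats

# Two-factor vs. one-factor CFA: delta chi-square = 42.67 on 1 d.f.
p_two_factor = stats.chi2.sf(42.67, 1)
print(f"two-factor improvement: p = {p_two_factor:.1e}")  # well below .01

# Correlated-error respecification of the TQM/CQI measurement model:
# delta chi-square = 0.15 on 3 d.f.
p_corr_err = stats.chi2.sf(0.15, 3)
print(f"correlated errors: p = {p_corr_err:.2f}")  # n.s.
```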

We then proceeded to estimate the parameters and fit of alternative structural models specified as (a) the hypothesized model with effects of internal management practices mediated by patient-focused TQM/CPI practices and (b) independent direct effects for internal management practices and patient-focused TQM/CPI practices (Anderson and Gerbing 2010). We included the control variables as direct indicators of treatment center performance in these structural models.

The fit of each of the structural models is summarized in Table 2. The hypothesized model fit the data better than the comparative model, supporting hypothesis 1. The path from management practices to performance in the mediated model was not significant, and thus patient-focused TQM/CPI practices fully mediated the effects of internal management practices. The indirect effects of internal management practices on performance are significant (Sobel test statistic = 2.31, p = .02). The mediated model is shown in Figure 2.
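The Sobel statistic used for the indirect effect is the product of the two path coefficients divided by its approximate standard error. The path estimates and standard errors below are hypothetical; the article reports only the resulting statistic (z = 2.31, p = .02), not the raw paths:

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel test statistic for the indirect effect a*b."""
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

# Hypothetical paths: a = management -> TQM/CQI, b = TQM/CQI -> performance.
z = sobel_z(a=0.40, se_a=0.12, b=0.50, se_b=0.15)
p = math.erfc(abs(z) / math.sqrt(2))  # two-tailed p from the normal tail
print(f"z = {z:.2f}, p = {p:.3f}")
```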

Table 2.

Fit of Alternative Structural Models

Model χ2 d.f. RMSEA NFI CFI GFI Δχ2 (d.f.)
Model (a)—mediation of internal management practices by TQM/CQI practices 203.18 115 0.059 0.74 0.87 0.90
Model (b)—independent effects of management systems and TQM/CQI 237.96 117 0.070 0.72 0.82 0.89 34.72(2)**

Notes: **p < .01.

TQM/CQI, Total Quality Management/Continuous Quality Improvement; RMSEA, root mean squared error of approximation; NFI, normed fit index; CFI, comparative fit index; GFI, goodness-of-fit index.

Figure 2.

Path Coefficients for Best Fit Structural Model: Mediation of Internal Management Practices by Total Quality Management/Continuous Quality Improvement (TQM/CQI) Practices (N = 221)

Hypothesis 2, which predicted that the effects of patient-focused TQM/CQI practices on performance were moderated by the level of treatment center resources, was tested using the unconstrained procedure developed by Marsh and colleagues (2004) for testing continuous moderating effects in latent variable models (Edwards and Lambert 2007). As the resource measure had a single indicator and TQM/CQI practices had three measured indicators, we estimated models for all three pairs of indicators (Marsh et al. 2004). The model used to evaluate the interaction is illustrated in Figure 2. The results showed a significant positive moderating effect for center resources (β = 0.30, p < .01). At higher levels of resources (mean FTEs/patient plus 1 standard deviation), the direct effect of TQM/CQI on performance (0.59) is positive and more than twice as large as the direct effect of TQM/CQI at low resource levels (0.28). This difference is significant (t = 2.54, p < .05), supporting hypothesis 2. The moderating effects of resource levels are illustrated in Figure 3.

Figure 3. Effects of Total Quality Management (TQM) on Performance for Different Resource Levels

Discussion

This study examined the relationships among SUD treatment centers’ high-performance internal management practices, use of patient-focused TQM/CQI processes, and performance in providing comprehensive patient care, implementing EBPs, and limiting waiting time before entry into treatment. The model with superior fit to the data showed that patient-focused TQM/CQI fully mediated the relationship between management practices and performance. The squared multiple correlations indicated that the mediated model, including control variables, accounted for 23 percent of the variation in treatment center performance (Jöreskog and Sörbom 1996).

The findings suggest that, to the extent that higher levels of patient-focused performance influence funding sources, the use of specific TQM/CQI practices may buffer treatment center environmental stress and uncertainty. Of particular importance is the mediating effect of TQM/CQI on the impact of internal management practices. This perspective is consistent with the view that management practices, including the key elements we tested (discipline, trust, stretch, and support), may create the culture or ethos that in turn supports implementation of patient-focused practices and protocols. However, TQM/CQI disciplines hold the keys to higher levels of treatment performance (Anderson, Rungtusanatham, and Schroeder 1994; Ghoshal and Bartlett 1994; Khatri et al. 2007). That is, internal management practices may improve patient outcomes through the application of TQM/CQI processes in part because these specific practices limit the effects of environmental turbulence and other external pressures and keep the focus specifically on customer/patient results.

The effects of TQM/CQI practices differed depending on levels of center resources, with TQM/CQI having approximately twice the effect on performance in higher resource centers. These strikingly different results suggest that resource availability may be critical for treatment centers to realize the benefits of management practices and TQM/CQI processes. Resource constraints may limit both immediate and longer term gains in center performance from TQM/CQI processes, possibly because external forces and pressures limit or corrupt the implementation of specific patient-focused practices. Care should be taken to note, however, that while additional resources enhance desirable impacts, the data do not show that large amounts of resources are a necessary condition for TQM/CQI to produce any substantive impact on patient quality outcomes. Thus, it appears important to encourage perseverance in the use of quality tools even with slender resources. That these practices are adopted by a substantial percentage of centers suggests that internal cultures of patient focus and professionalism may be enhanced by the presence of TQM/CQI processes across varying levels of resources. However, our results also suggest that “shallow” or “ceremonial” attempts to demonstrate the presence of TQM/CQI practices are unlikely to produce substantial quality improvement. Given current emphases on accreditation and certification, partial implementation of these strategies as a form of window-dressing may serve short-term purposes but likely will not have long-term impacts on center performance.

Nonetheless, since a resource “advantage” is associated with improved performance, which in turn should lead to advantages in obtaining more resources, the pattern is well described as “the rich get richer.” Given the unpredictability of the external environment in terms of resource flow, it is logical that more extensive resources or “organizational slack” may also provide an effective buffer against environmental uncertainty. These findings suggest it is important for treatment centers to tap multiple resource bases to acquire what may be protective “slack,” with the caution suggested by other studies that SUD treatment centers attempting to “serve too many masters” may ultimately defeat organizational goals and increase the risk of closure (Shane, Blum, and Roman 1997). In addition, the importance of examining these relationships longitudinally cannot be overemphasized, since the full implementation of TQM/CQI practices may have longer term payoffs as “pay for performance” becomes a more widely implemented external strategy among funding agencies (Bremer et al. 2008). Moreover, treatment for SUDs is only partially one-on-one; treatment processes may involve group participation as well as other social interaction among patients. Through these routine treatment interactions, TQM/CQI may have synergistic effects that produce a more rapid accumulation of positive outcomes than an accumulation of isolated one-on-one occurrences would.

This study has both strengths and limitations that must be considered when interpreting the results. First, we used process measures of treatment quality rather than patient outcomes, relying on clinical research suggesting that comprehensive care, EBPs, and shorter time to first treatment lead to higher levels of successful treatment outcomes for patients with SUDs. Second, the data were obtained through interviews and questionnaires completed by the administrative and clinical directors of each center. These directors had an average of 7.5 years of experience in administrative or clinical management within the current center, and only 6 percent had been in their role for 1 year or less. The average program in our sample had 28 full-time-equivalent employees, so administrators had a reasonable opportunity to observe actual practices in the center. However, it is possible that directors did not have complete information about actual processes and services provided. Finally, the sample size may have limited our ability to detect differences in the split-sample validation of the two-factor measurement model for the exogenous variables.

Future studies of treatment centers should obtain more detailed data about the way TQM processes are actually implemented. As Lozeau and colleagues (2002) have noted, the gap between the requirements of specific techniques and prevailing organizational practices may determine the extent to which a technique is altered to fit organizational circumstances. TQM/CQI practices may be implemented in centers for reasons, and in forms, that vary, suggesting that future studies should consider possible effects of institutionalization, which may alter relationships among these variables and center performance (Westphal, Gulati, and Shortell 1997).

From a policy perspective, encouraging treatment centers to implement high-quality management practices along with patient-focused TQM/CQI practices may increase both the quality and possibly the cost-effectiveness of treatment services over time. At the same time, helping centers secure greater resources may prove a wise investment as the positive effects of patient-focused TQM/CQI practices are realized.

Acknowledgments

Joint Acknowledgment/Disclosure Statement: Data collection and analyses were supported by research grant R37DA013110 awarded to the University of Georgia Research Foundation by the National Institute on Drug Abuse.

Disclosures: None.

Disclaimers: None.

SUPPORTING INFORMATION

Additional supporting information may be found in the online version of this article:

Appendix SA1: Author Matrix.


Appendix SA2: Scale Measures for Internal Management Practices and TQM/CQI Practices.


Please note: Wiley-Blackwell is not responsible for the content or functionality of any supporting materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.

References

  1. Abraham AJ, Roman PM. Early Adoption of Injectable Naltrexone for Alcohol-Use Disorders: Findings in the Private-Treatment Sector. Journal of Studies on Alcohol and Drugs. 2010;71(3):460–6. doi: 10.15288/jsad.2010.71.460.
  2. Anderson BS, Covin JG, Slevin DP. Understanding the Relationship between Entrepreneurial Orientation and Strategic Learning Capability: An Empirical Investigation. Strategic Entrepreneurship Journal. 2009;3:218–40.
  3. Anderson JC, Gerbing DW. Structural Equation Modeling in Practice: A Review and Recommended Two-Step Approach. Psychological Bulletin. 1988;103(3):411–23.
  4. Anderson JC, Rungtusanatham M, Schroeder RG. A Theory of Quality Management Underlying the Deming Management Method. Academy of Management Review. 1994;19(3):472–509.
  5. Ball S, Young J. Dual Focus Schema Therapy for Personality Disorders and Substance Dependence: Case Study Results. Cognitive and Behavioral Practice. 2000;7:270–81.
  6. Barry CL, Huskamp HA, Goldman HH. A Political History of Federal Mental Health and Addiction Insurance Parity. Milbank Quarterly. 2010;88:404–33. doi: 10.1111/j.1468-0009.2010.00605.x.
  7. Blum TC, Fields DL, Goodman JS. Organization Level Determinants of Women in Management. Academy of Management Journal. 1994;37(2):241–68.
  8. Bollen K. Latent Variables in Psychology and the Social Sciences. Annual Review of Psychology. 2002;53:605–34. doi: 10.1146/annurev.psych.53.100901.135239.
  9. Bourgeois LJ. On the Measurement of Organizational Slack. Academy of Management Review. 1981;6(1):29–39.
  10. Bremer RW, Scholle SH, Keyser D, Houtsinger JVK, Pincus HA. Pay for Performance in Behavioral Health. Psychiatric Services. 2008;59:1419–29. doi: 10.1176/ps.2008.59.12.1419.
  11. Broome KM, Flynn PM, Knight DK, Simpson DD. Program Structure, Staff Perceptions, and Client Engagement in Treatment. Journal of Substance Abuse Treatment. 2007;33:149–58. doi: 10.1016/j.jsat.2006.12.030.
  12. Buck JA. The Looming Expansion and Transformation of Public Substance Abuse Treatment under the Affordable Care Act. Health Affairs. 2011;30:1402–10. doi: 10.1377/hlthaff.2011.0480.
  13. Carr CJ, Xu J, Redko C, Lane DT, Rapp RC, Goris J, Carlson RG. Individual and System Influences on Waiting Time for Treatment. Journal of Substance Abuse Treatment. 2008;34:192–201. doi: 10.1016/j.jsat.2007.03.005.
  14. Chong P, Calingo LM, Reynolds G, Fisher D. Using an Innovative Approach to Shorten Coaching and Assessment Time When Applying the Baldrige Health Care Criteria for Performance Excellence in a Substance Abuse Treatment Setting. TQM & Business Excellence. 2003;14(10):1121–9.
  15. D'Aunno T. The Role of Organization and Management in Substance Abuse Treatment. Journal of Substance Abuse Treatment. 2006;31:221–33. doi: 10.1016/j.jsat.2006.06.016.
  16. Dean J, Bowen D. Management Theory and Total Quality: Improving Research and Practice through Theory Development. Academy of Management Review. 1994;19(3):394–419.
  17. DeGrandpre RJ. The Cult of Pharmacology: How America Became the World's Most Troubled Drug Culture. Durham, NC: Duke University Press; 2006.
  18. Douglas TJ, Judge W. Total Quality Management Implementation and Competitive Advantage: The Role of Structural Control and Exploration. Academy of Management Journal. 2001;44(1):158–69.
  19. Ducharme LJ, Mello H, Roman PM, Knudsen H, Johnson JA. Service Delivery in Substance Abuse Treatment: Reexamining Comprehensive Care. Journal of Behavioral Health Services & Research. 2007;34(2):121–36. doi: 10.1007/s11414-007-9061-7.
  20. Edwards J, Lambert L. Methods for Integrating Moderation and Mediation: A General Analytical Framework Using Moderated Path Analysis. Psychological Methods. 2007;12(1):1–22. doi: 10.1037/1082-989X.12.1.1.
  21. Easton GS, Jarrell SL. The Effects of Total Quality Management on Corporate Performance. In: Cole R, Scott WR, editors. The Quality Movement and Organization Theory. Thousand Oaks, CA: Sage Publications; 2000. pp. 237–70.
  22. Emmelkamp P, Vedel E. Evidence-Based Treatment for Alcohol and Drug Abuse. New York: Routledge; 2006.
  23. Etheridge RM, Hubbard R. Conceptualizing and Assessing Treatment Structure and Process in Community-Based Drug Dependency Treatment Programs. Substance Use & Misuse. 2000;35:1757–95. doi: 10.3109/10826080009148240.
  24. Evans WR, Davis WD. High-Performance Work Systems and Organizational Performance: The Mediating Role of Internal Social Structure. Journal of Management. 2005;31(5):758–75. doi: 10.1177/0149206305279370.
  25. Fields D, Goodman J, Blum T. Human Resource Dependence and Organizational Demography: A Study of Minority Employment in Private Sector Companies. Journal of Management. 2005;31:167–85.
  26. Fields DL, Roman PM. Total Quality Management and Performance in Substance Abuse Treatment Centers. Health Services Research. 2010;45(6):1630–50. doi: 10.1111/j.1475-6773.2010.01152.x.
  27. Ghoshal S, Bartlett CA. Linking Organizational Context and Managerial Action: The Dimensions of Quality of Management. Strategic Management Journal. 1994;15:91–112.
  28. Goodman JS, Blum TC. Assessing the Non-Random Sampling Effects of Subject Attrition in Longitudinal Research. Journal of Management. 1996;22(4):627–52.
  29. Goonan KJ, Stoltz PK. Leadership and Management Principles for Outcomes-Oriented Organizations. Medical Care. 2004;42(4):31–8. doi: 10.1097/01.mlr.0000120782.03031.b4.
  30. Greener JM, Joe GW, Simpson DD, Rowan-Szal GA, Lehman WEK. Influence of Organizational Functioning on Client Engagement in Treatment. Journal of Substance Abuse Treatment. 2007;33:139–47. doi: 10.1016/j.jsat.2006.12.025.
  31. Guthrie JP. High-Involvement Work Practices, Turnover, and Productivity: Evidence from New Zealand. Academy of Management Journal. 2001;44:180–90.
  32. Harris A, Humphreys K, Bowe T, Kivlahan D, Finney J. Measuring the Quality of Substance Use Disorder Treatment: Evaluating the Validity of the Department of Veterans Affairs Continuity of Care Performance Measure. Journal of Substance Abuse Treatment. 2009;36(3):294–305. doi: 10.1016/j.jsat.2008.05.011.
  33. Henderson CE, Taxman FS. Competing Values among Criminal Justice Administrators: The Importance of Substance Abuse Treatment. Drug and Alcohol Dependence. 2009;103(suppl 1):S7–16. doi: 10.1016/j.drugalcdep.2008.10.001.
  34. Huselid M. The Impact of Human Resource Management Practices on Turnover, Productivity, and Corporate Financial Performance. Academy of Management Journal. 1995;38(3):635–72.
  35. Institute of Medicine. Committee on Crossing the Quality Chasm: Adaptation to Mental Health and Addictive Disorders: Improving the Quality of Health Care for Mental and Substance-Use Conditions. Washington, DC: National Academy of Sciences; 2005.
  36. Jöreskog KG, Sörbom D. LISREL 8 User's Reference Guide. Chicago: Scientific Software International; 1996.
  37. Kaynak H. The Relationship between Total Quality Management Practices and Their Effects on Firm Performance. Journal of Operations Management. 2003;21:405–35.
  38. Kehoe RR, Wright PM. The Impact of High-Performance Human Resource Practices on Employees' Attitudes and Behaviors. Journal of Management. 2010. doi: 10.1177/0149206310365901.
  39. Kennedy MT, Fiss PC. Institutionalization, Framing, and Diffusion: The Logic of TQM Adoption and Implementation Decisions among U.S. Hospitals. Academy of Management Journal. 2009;52(5):897–918.
  40. Khatri N, Halbesleben JRB, Petroski GF, Meyer W. Relationship between Management Philosophy and Clinical Outcomes. Health Care Management Review. 2007;32(2):128–39. doi: 10.1097/01.HMR.0000267789.17309.18.
  41. Kimberly JR, McLellan AT. The Business of Addiction Treatment: A Research Agenda. Journal of Substance Abuse Treatment. 2006;31:213–9. doi: 10.1016/j.jsat.2006.06.018.
  42. Knudsen H, Ducharme LJ, Roman PM. The Adoption of Medications in Substance Abuse Treatment: Associations with Organizational Characteristics and Technology Clusters. Drug and Alcohol Dependence. 2007;87:164–74. doi: 10.1016/j.drugalcdep.2006.08.013.
  43. Kumar N, Stern L, Anderson JC. Conducting Interorganizational Research Using Key Informants. Academy of Management Journal. 1993;36(6):1633–51.
  44. Lamb S, Greenlick MR, McCarty D. Bridging the Gap between Practice and Research: Forging Partnerships with Community-Based Drug and Alcohol Treatment. Washington, DC: National Academy Press; 1998.
  45. Lozeau D, Langley A, Denis JL. The Corruption of Managerial Techniques by Organizations. Human Relations. 2002;55(5):537–64.
  46. Mark TL, Levit KR, Vandivort-Warren R, Coffey RM, Buck JA. Trends in Spending for Substance Abuse Treatment, 1986–2003. Health Affairs. 2007;26(4):1118–28. doi: 10.1377/hlthaff.26.4.1118.
  47. Marsh H, Wen Z, Hau K-T. Structural Equation Models of Latent Interactions: Evaluation of Alternative Estimation Strategies and Indicator Construction. Psychological Methods. 2004;9(3):275–300. doi: 10.1037/1082-989X.9.3.275.
  48. Meyer AD. Adapting to Environmental Jolts. Administrative Science Quarterly. 1982;27:515–35.
  49. Miller WR, Sorensen JL, Selzer JA, Brigham GS. Disseminating Evidence-Based Practices in Substance Abuse Treatment: A Review with Suggestions. Journal of Substance Abuse Treatment. 2006;31:25–39. doi: 10.1016/j.jsat.2006.03.005.
  50. Milne SH, Blum TC, Roman PM. Quality Management in a Health Care Setting: A Study of Substance Abuse Treatment Centers. Advances in the Management of Organizational Quality. 2000;5:215–48.
  51. National Institute on Drug Abuse. Principles of Drug Addiction Treatment (NIH Publication 99-4180). Washington, DC: National Institutes of Health; 1999.
  52. Nolan JL. Drug Courts: In Theory and in Practice. New York: Aldine de Gruyter; 2002.
  53. Phillips LW. Assessing Measurement Error in Key Informant Reports: A Methodological Note on Organizational Analysis in Marketing. Journal of Marketing Research. 1981;18(4):395–415.
  54. Picone G, Sloan F, Chou S, Taylor D. Does Higher Hospital Cost Imply Higher Quality of Care? Review of Economics and Statistics. 2003;85(1):51–62.
  55. Pollack HA, D'Aunno T. HIV Testing and Counseling in the Nation's Outpatient Substance Abuse Treatment System, 1995–2005. Journal of Substance Abuse Treatment. 2010;38:307–16. doi: 10.1016/j.jsat.2009.12.004.
  56. Pollack HA, D'Aunno T, Lamar B. Outpatient Substance Abuse Treatment and HIV Prevention: An Update. Journal of Substance Abuse Treatment. 2006;30:39–47. doi: 10.1016/j.jsat.2005.09.002.
  57. Reed R, Lemak DJ, Montgomery JC. Beyond Process: TQM Content and Firm Performance. Academy of Management Review. 1996;21(1):173–202.
  58. Shane S, Blum TC, Roman PM. Organizational Survival: The Private Alcoholism Treatment Center. In: Frontiers of Entrepreneurial Research. Boston: Babson College Center for Entrepreneurship; 1997. pp. 7–31.
  59. Shortell S, Bennett C, Byck G. Assessing the Impact of Continuous Quality Improvement on Clinical Practice: What It Will Take to Accelerate Progress. Milbank Quarterly. 1998;76(4):593–624. doi: 10.1111/1468-0009.00107.
  60. Shortell S, O'Brien J, Carman J, Foster R, Hughes E, Boerstler H, O'Connor E. Assessing the Impact of Continuous Quality Improvement/Total Quality Management: Concept versus Implementation. Health Services Research. 1995;30:377–89.
  61. Smith DE, Lee DR, Davidson LD. Health Care Equality and Parity for Treatment of Addictive Disease. Journal of Psychoactive Drugs. 2010;42(2):121–8. doi: 10.1080/02791072.2010.10400684.
  62. Snell SA, Youndt MA. Human Resource Management and Firm Performance: Testing a Contingency Model of Executive Controls. Journal of Management. 1995;21(4):711–37.
  63. Waldman DA. The Contributions of Total Quality Management to a Theory of Work Performance. Academy of Management Review. 1994;19(3):510–36.
  64. Weisner C, McLellan AT. Report of the Blue Ribbon Task Force on the National Institute on Drug Abuse (NIDA) Health Services Research. Washington, DC: National Institutes of Health; 2004.
  65. Westphal JD, Gulati R, Shortell SM. Customization or Conformity: An Institutional and Network Perspective on the Content and Consequences of TQM Adoption. Administrative Science Quarterly. 1997;42:366–94.
  66. Zinn J, Flood A. Commentary: Slack Resources in Health Care Organizations – Fat to Be Trimmed or Muscle to Be Exercised? Health Services Research. 2009;44(3):812–20. doi: 10.1111/j.1475-6773.2009.00970.x.

