Author manuscript; available in PMC: 2022 Feb 1.
Published in final edited form as: Health Aff (Millwood). 2021 Feb;40(2):317–325. doi: 10.1377/hlthaff.2020.00378

Admission Practices And Cost Of Care For Opioid Use Disorder At Residential Addiction Treatment Programs In The US

Tamara Beetham 1, Brendan Saloner 2, Marema Gaye 3, Sarah E Wakeman 4, Richard G Frank 5, Michael Lawrence Barnett 6
PMCID: PMC8638362  NIHMSID: NIHMS1749692  PMID: 33523744

Abstract

Acute, short-term, residential care for opioid use disorder has grown rapidly, with policymakers advocating to increase the availability of “treatment beds”. However, there are concerns about high costs and misleading recruitment practices at many programs. We conducted an audit survey of 613 residential treatment programs nationally, posing as uninsured, cash-paying individuals using heroin and seeking residential addiction treatment. One-third of callers were offered admission before clinical evaluation, usually within one day. Most programs required upfront payments for admission, with for-profit programs requesting roughly three times as much upfront ($17,434) as nonprofits ($5,712). Recruitment techniques (e.g., offering paid transportation) were used frequently by for-profit, but not nonprofit, programs. Practices including admission offers during the call, high upfront payments, and recruitment techniques were still common among programs with third-party accreditation and state licenses. These findings raise concerns that residential treatment programs, including those with third-party accreditation and state licenses, may be admitting a clinically and financially vulnerable population for costly treatment without assessing appropriateness for other care settings.


Acute, short-term, residential care – also referred to as “rehab” – accounts for over one-quarter of national spending on substance use treatment, and has grown rapidly,1–3 with nearly one million admissions in 2018.4 The current addiction and opioid use disorder crisis has led numerous policymakers to promote treatment by increasing the availability of residential treatment options and the number of “treatment beds” across the US.5–7 One motivation behind these policy proposals is that residential programs are necessary as a mechanism to remove people with substance use disorders from exposure to addictive substances while providing a therapeutic setting.8

Despite enthusiasm for residential treatment, there is often inadequate consideration of what these organizations deliver. This is particularly the case for patients with opioid use disorder (OUD), many of whom can benefit from care in an outpatient setting with medication instead of residential treatment. In fact, residential treatment programs may, on average, result in higher risk of overdose and mortality than outpatient treatment9–11 due to their frequent focus on “detoxification” and abstinence-based ancillary therapies rather than evidence-based opioid agonist treatments (OAT) (i.e., methadone or buprenorphine) that are proven to reduce overdose and mortality.12

Clinical ambiguity about the benefits of residential treatment is compounded by concerns about high costs and misleading recruiting practices in some residential programs. Numerous anecdotal reports have documented facilities engaging in deceptive marketing and financial schemes with treatment approaches of unclear effectiveness.13–17 These observations raise questions about the ability of existing mechanisms, such as third-party accreditation by agencies like The Joint Commission or state licensing bodies, to protect vulnerable consumers in this marketplace.18,19 Together, questions about the effectiveness of residential treatment and the significant financial implications of receiving it call for further assessment of the conduct of these programs nationally.

To address the need for systematic evidence on residential treatment costs and admission processes nationally, we conducted an audit study in which trained staff posed as uninsured young adults using heroin and seeking residential treatment for OUD, allowing us to systematically gather data on programs across the US. Audit techniques avoid the social desirability bias present in existing self-reported surveys20 and enable collection of data, such as recruitment techniques or treatment cost, that are not otherwise available.

STUDY DATA AND METHODS

Study Design

We conducted an audit study to collect data on the admission practices, costs, and wait times communicated to adult patients seeking admission to a residential treatment program for OUD. By posing as individuals seeking care, we captured the information that programs currently give patients to guide their decisions about initiating treatment. We followed the American Association for Public Opinion Research (AAPOR) standard definitions for tracking response rates.21 The Harvard T.H. Chan School of Public Health Institutional Review Board determined that our analysis was not human subjects research.

Study Population and Sample Selection

In June 2019, we gathered publicly available data on all residential substance use treatment programs nationally from the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Treatment Services Locator.22 We excluded programs providing only short-term “detoxification,” transitional or “halfway” houses, and programs run by the federal or tribal governments. When programs in an affiliated group shared the same identifying information, one program was randomly selected.

The Treatment Locator allows programs to opt out and excludes programs that are not under the regulation of a state agency.23 To capture a broader range of program types actively recruiting patients, we also included programs that purchased search engine advertisements. We used data compiled by the search analytics company SpyFu24 to identify keywords related to residential addiction treatment purchased for advertisement on the Google search engine and selected the top 10 keywords by number of advertisers from July 2018 to June 2019 (e.g., “heroin treatment”; Exhibit A1).25 These keywords yielded 159 unique web domains for residential treatment centers, 82 of which had no matching entry in the Treatment Locator data.
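As an illustration of these two steps, the sketch below (not the study's code; the DataFrame, column names, and example values are hypothetical) ranks keywords by the number of unique advertiser domains and identifies advertiser domains with no Treatment Locator match.

```python
# Illustrative sketch of the advertising-sample steps described above.
# Column names and example values are hypothetical.
import pandas as pd

# One row per (keyword, advertiser domain) pair compiled from search-ad data.
ads = pd.DataFrame({
    "keyword": ["heroin treatment", "heroin treatment", "rehab near me"],
    "domain": ["exampletreatment.com", "otherrehab.org", "exampletreatment.com"],
})
locator_domains = {"otherrehab.org"}  # domains already listed in the Treatment Locator

# Rank keywords by the number of unique advertiser domains and keep the top 10.
top_keywords = (
    ads.groupby("keyword")["domain"]
       .nunique()
       .sort_values(ascending=False)
       .head(10)
)

# Advertiser domains with no matching Treatment Locator entry.
ad_only_domains = set(ads["domain"]) - locator_domains

print(top_keywords)
print(ad_only_domains)
```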

Combining the Treatment Locator sample with the advertising sample yielded a sampling frame of 1,436 non-federal short- and long-term residential treatment programs. From this, we randomly selected an equal proportion of for-profit and non-profit (including public) programs to obtain a target sample of approximately 600.
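The following minimal sketch illustrates one way such a stratified draw could be done, under the assumption that "equal proportion" means applying the same sampling fraction within the for-profit and nonprofit (including public) strata; the DataFrame and its columns are hypothetical, and this is not the authors' procedure.

```python
# Illustrative stratified random draw; assumes one row per program with a
# boolean 'for_profit' column. Hypothetical, not the authors' code.
import pandas as pd

def draw_sample(frame: pd.DataFrame, target: int, seed: int = 0) -> pd.DataFrame:
    # Sample the same fraction of programs from each ownership stratum.
    fraction = target / len(frame)
    return frame.groupby("for_profit").sample(frac=fraction, random_state=seed)

# Toy frame standing in for the 1,436-program sampling frame.
toy = pd.DataFrame({"program_id": range(10), "for_profit": [True] * 6 + [False] * 4})
print(draw_sample(toy, target=6))
```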

Data Collection

Three trained research staff called residential programs at their publicly listed intake phone numbers from June 27, 2019, to September 30, 2019. The lead script designer (TB) trained the other two research staff to ensure consistency in data collection and performed regular data audits. A standardized script was developed to simulate a 27-year-old individual seeking care who was actively using heroin, reported no other health conditions, and lacked insurance. The caller was interested in entering a residential treatment program and inquired about relevant information related to paying for treatment, typical treatment experiences, and whether immediate admission was possible. Callers were depicted as uninsured, cash-paying patients because of widespread concerns about the expense of treatment programs and the fact that over 40% of individuals seeking addiction treatment use their own or their family's resources to pay for treatment.4

We performed open-ended pilot calls with programs outside the study sample to refine the initial script and define the categorization of responses to questions for a standardized data collection form. The categories used to classify interview responses for data collection were iteratively refined in the first month of calling to minimize the need to classify responses in a miscellaneous “other” category.

A program was included in the sample if it could be reached within three call attempts on separate days during business hours in the program’s time zone (Monday–Friday, 9:00am to 5:00pm). Programs were defined as out of sample if the clinical setting was outside the study scope (e.g., outpatient only), if they served only special populations outside the scope of our script (e.g., veterans or adolescents only), if they did not accept uninsured/cash-pay patients, or if the phone number did not work or was incorrect. Callers ensured that a bed would not be held for them at the end of the call, and voicemails were not left. A program was considered a non-respondent if it could not be reached after three attempts on separate days and was not otherwise excluded from the sample.
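To make the call-disposition rules above concrete, here is a minimal sketch of how a call record could be classified; the record fields and category labels are hypothetical and only illustrate the logic described in this paragraph.

```python
# Illustrative disposition logic for audit calls; field names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CallRecord:
    reached: bool                    # was a live intake contact completed?
    attempts: int                    # call attempts on separate business days
    exclusion_reason: Optional[str]  # e.g. "outpatient_only", "special_population",
                                     # "no_cash_pay", "bad_number", or None

def disposition(rec: CallRecord) -> str:
    # Out of sample: setting out of scope, special populations only,
    # no uninsured/cash-pay patients accepted, or non-working number.
    if rec.exclusion_reason is not None:
        return "out_of_sample"
    # Respondent: reached within three attempts on separate days.
    if rec.reached and rec.attempts <= 3:
        return "respondent"
    # Non-respondent: not reached after three attempts and not otherwise excluded.
    if not rec.reached and rec.attempts >= 3:
        return "non_respondent"
    return "pending"  # fewer than three attempts so far and not yet reached

# Example: a program reached on the second attempt counts as a respondent.
print(disposition(CallRecord(reached=True, attempts=2, exclusion_reason=None)))
```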

Study Outcomes

We report on three outcomes from the audit calls: admissions, financing, and recruitment techniques used by programs. In a separate study under review, we also report on outcomes related to provision of medication treatment for opioid use disorder. For admissions, we measured three possible responses: an offer of admission (directly or pending an intake, which was typically presented as a perfunctory process that would not impact admission), being told that further screening was necessary before admission, or a denial of admission (including placement on a waitlist). We elicited the time until next available bed and whether the program had any clinical criteria that could disqualify the caller from being admitted, such as other psychiatric illness or prior medical history.

In the financing domain, we obtained information on the cost per day following an admission, whether any upfront payment was required, and, if so, the amount. We asked what forms of payment the programs accepted besides insurance, whether programs discounted their “standard” price (asking for the caller's budget, presumably to provide a lower rate, or offering an income-based sliding scale, a scholarship, or a willingness to negotiate cost), and what options were available for financing the treatment episode (such as accepting credit cards or long-term payment plans).

Finally, we obtained information on the recruitment techniques used by the program to persuade callers to enter their program. While these techniques were not specifically elicited, they were a prominent feature of the admissions process during calls. Callers recorded 1–2 sentence verbatim statements for every recruiting technique used by a facility. The techniques were grouped into thematic categories based on pilot calls by two investigators (TB and MLB) and refined by consensus.
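For illustration, a per-call record covering these three outcome domains might be structured as in the sketch below; the field and category names are hypothetical and do not reproduce the study's actual data collection form.

```python
# Illustrative per-call data record for the three outcome domains.
from dataclasses import dataclass, field
from typing import List, Optional

ADMISSION_STATUSES = {"offered", "further_screening", "waitlist_or_denied"}

@dataclass
class AuditCallRecord:
    # Admissions domain
    admission_status: str                        # one of ADMISSION_STATUSES
    days_until_bed: Optional[int] = None         # time until next available bed
    disqualifying_criteria: List[str] = field(default_factory=list)
    # Financing domain
    cost_per_day: Optional[float] = None
    upfront_payment: Optional[float] = None
    payment_methods: List[str] = field(default_factory=list)    # e.g. "cash", "credit_card"
    discount_offered: bool = False               # sliding scale, scholarship, negotiation
    financing_offered: List[str] = field(default_factory=list)  # e.g. "credit_card", "payment_plan"
    # Recruitment domain: 1-2 sentence verbatim statements per technique
    recruitment_statements: List[str] = field(default_factory=list)

record = AuditCallRecord(admission_status="offered", days_until_bed=0,
                         cost_per_day=750.0, upfront_payment=17_000.0,
                         payment_methods=["cash", "credit_card"])
print(record.admission_status, record.upfront_payment)
```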

Study Variables

We examined the distribution of outcomes by programs’ for-profit versus nonprofit (including public) ownership. We also stratified our sample according to whether programs were accredited by The Joint Commission (TJC) or the Commission on Accreditation of Rehabilitation Facilities (CARF), as well as across programs with and without a license from a state health department or state mental health agency.12 We collected other characteristics of responding programs, including data source (Treatment Locator, search advertisement, or both), Census division, short- versus long-term residential services, and availability of medically supervised withdrawal management.

Statistical Analysis

We calculated rates of the outcomes and tested for differences across groups using z-tests for proportions, t-tests, and chi-squared tests. Some outcomes, such as wait time and cost-related measures, were missing for some facilities due to variation in the information given to callers (see Exhibit notes). Statistical tests were performed on the set of non-missing data. All p-values should be interpreted as exploratory given the presence of multiple comparisons across different subgroups. Analyses were performed in Stata version 14 (StataCorp, College Station, TX). We considered a P value of 0.05 or less to be statistically significant.
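As an illustration of the kinds of tests described here (not the authors' Stata code), the sketch below applies a two-proportion z-test and a chi-squared test to the admission-offer counts by profit status reported in the results, and a Welch t-test to the reported wait-time means and SDs; the per-group sample sizes used for the t-test are hypothetical placeholders.

```python
# Illustrative significance tests; counts come from the exhibits, t-test nobs are hypothetical.
import numpy as np
from scipy import stats

offered = np.array([94, 28])    # admission offered: for-profit, nonprofit
totals = np.array([226, 142])   # programs contacted in each group

# Two-proportion z-test with a pooled standard error.
p1, p2 = offered / totals
p_pool = offered.sum() / totals.sum()
se = np.sqrt(p_pool * (1 - p_pool) * (1 / totals[0] + 1 / totals[1]))
z = (p1 - p2) / se
p_z = 2 * stats.norm.sf(abs(z))

# Chi-squared test on the equivalent 2x2 contingency table.
table = np.array([offered, totals - offered]).T
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

# Welch t-test for mean wait time from summary statistics
# (means/SDs as reported; nobs values below are hypothetical).
t, p_t = stats.ttest_ind_from_stats(mean1=2, std1=4, nobs1=200,
                                    mean2=14, std2=22, nobs2=118, equal_var=False)

print(f"z = {z:.2f} (p = {p_z:.4f}); chi2 = {chi2:.2f} (p = {p_chi2:.4f}); t-test p = {p_t:.4f}")
```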

Limitations

This study has several limitations. First, non-respondents differed from respondents in several ways, most notably a higher rate of nonprofit programs among non-respondents. This could bias our results for nonprofit programs if the non-respondents had systematically different outcomes compared to the nonprofit respondents. However, because we surveyed a wide swath of all programs in the US, we believe that these results still capture key practices of for-profit and nonprofit programs nationally, though the overall prevalence of practices may be sensitive to the non-response rate. Second, because our script simulated an uninsured, cash-paying individual, we were unable to capture outcomes relevant to insured patients, such as the rate of insurance acceptance by programs, which could substantially defray the costs we collected. However, the quoted costs are broadly relevant given that over 40% of individuals seeking addiction treatment use their own or their family's resources to pay for treatment.4 Third, the data collection process for this study was complex given the multi-faceted interaction between caller and admissions staff while using the script. This resulted in missing data, which could bias our results, and possible variation in the recording of data. We mitigated these limitations through careful training and regular data audits to monitor data consistency and quality.

RESULTS

From the 613 programs sampled, 160 were excluded as out of sample, and we completed contact with 368 programs (Exhibit A2).25 This yielded a response rate of 81%, representing 26% of all 1,436 non-federal programs nationally. There were differences between respondents and non-respondents; for example, non-respondent programs were less likely than respondents to be for-profit (27% vs. 61%, respectively, Exhibit 1), and non-respondents were more likely to offer only stays over 30 days in length (40% vs. 13%).
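The arithmetic behind these response-rate figures is sketched below (a simple completed-contact rate; the AAPOR standard definitions cited in the methods distinguish additional disposition codes).

```python
# Response-rate arithmetic using the counts reported above.
sampled = 613          # programs called
out_of_sample = 160    # excluded as out of sample
respondents = 368      # completed contacts
frame_size = 1436      # non-federal residential programs nationally

eligible = sampled - out_of_sample         # 453 in-sample programs
response_rate = respondents / eligible     # approximately 0.81
share_of_frame = respondents / frame_size  # approximately 0.26
print(f"Response rate: {response_rate:.0%}; share of national frame: {share_of_frame:.0%}")
```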

Exhibit 1:

Characteristics of Respondent and Non-Respondent Programs

All Programs Respondents Non-Respondents
Total (n) 453 368 85

Data Source n % n % n %
SAMHSA Treatment Locator Only 364 80% 284 77% 80 94%
Internet Advertising Only 45 10% 42 11% 3 4%
Both 44 10% 42 11% 2 2%

Profit Status
For-profit 249 55% 226 61% 23 27%
Not For-profit 174 38% 120 33% 54 64%
Public (Local and State) 30 7% 22 6% 8 9%

Offers Detoxification
Yes 192 42% 163 44% 29 34%
No 216 48% 163 44% 53 62%
Missing 45 10% 42 11% 3 4%

Residential Stay Length
Over 30 days 83 18% 49 13% 34 40%
30 days or less 121 27% 100 27% 21 25%
Both 204 45% 177 48% 27 32%
Missing 45 10% 42 11% 3 4%

CARF Accredited
Yes 134 30% 97 26% 37 44%
No 319 70% 271 74% 48 56%

Joint Commission Accredited
Yes 198 44% 178 48% 20 24%
No 255 56% 190 52% 65 76%

Census Division
East North Central 40 9% 30 8% 10 12%
East South Central 21 5% 20 5% 1 1%
Middle Atlantic 41 9% 29 8% 12 14%
Mountain 47 10% 37 10% 10 12%
New England 27 6% 13 4% 14 16%
Pacific 107 24% 94 26% 13 15%
South Atlantic 85 19% 70 19% 15 18%
West North Central 39 9% 34 9% 5 6%
West South Central 46 10% 41 11% 5 6%

Source: Authors’ analysis of audit survey data merged with facility data from the Substance Abuse and Mental Health Services Administration (SAMHSA).

Program Admissions and Recruitment Techniques

Among the programs contacted, 122 (33%) offered admission during the call either immediately or pending an intake evaluation, 166 (45%) required further clinical or financial screening, and 79 (22%) put the caller on a waitlist or declined admission (Exhibit 2). The majority of programs (202 [64%]) had beds available the same day as the call or the following day regardless of admission acceptance, though many had longer wait times (mean [SD] wait time 6 [15] days). The most common potentially disqualifying criterion for admission was psychiatric history, including active comorbidities or suicidal ideation (150 [41%]). Recruitment techniques were common, with 160 (43%) programs using at least one technique.

Exhibit 2:

Admission Practices by Profit Status

All Profit Status
Non-profit For-profit P-value1

Facilities (n) 368 142 226

Admission status, n (%)2
Offered admission 122 33% 28 20% 94 42% <0.001
Further screening required 166 45% 61 43% 105 47% 0.49
Waitlist or no admission 79 22% 53 37% 26 12% <0.001

Potentially disqualifying criteria for admission, n (%)*3
None 33 9% 16 11% 17 8% 0.22
Psychiatric or substance use history 150 41% 35 25% 115 51% <0.001
Medical history (including pregnancy) or medications 127 35% 41 29% 86 38% 0.07
Legal history or prior violent behavior 70 19% 26 18% 44 19% 0.78
Financial screening 27 7% 11 8% 16 7% 0.81
Age 2 1% 1 1% 1 0% 0.74
Unwilling to disclose 90 24% 42 30% 48 21% 0.07

Wait time4
Able to get in same day or next day, n (%) 202 64% 41 36% 161 79% <0.001
Wait time in days, mean (SD) 6 (15) 14 (22) 2 (4) <0.001

Source: Authors’ analysis of audit survey data merged with facility data.

Notes: Categories flagged with an asterisk (*) indicate that multiple answers per facility were possible.

1 Chi-squared test was performed for all admission status categories, all factors evaluated for admission, and ability to get in the same or next day. t test was performed for mean wait time.

2 Facilities with missing data on admission status are excluded (n=1).

3 Facilities with missing data on potentially disqualifying factors for admission are excluded (n=11). “Psychiatric or substance use history” includes psychiatric comorbidities such as eating disorders, severity of comorbidities including suicidal ideation, and severity of substance use disorder.

4 Facilities with unknown or missing data on first available bed date are excluded (n=50).

For-profit programs offered admission over the phone more frequently than nonprofits (94 [42%] vs. 28 [20%], respectively, p<0.001, Exhibit 2). Most for-profit programs had a bed available the same day as the call or the following day (161 [79%] vs. 41 [36%] of nonprofits, p<0.001). For-profit programs had a similar profile to nonprofit programs on potentially disqualifying criteria except for psychiatric history, which was cited more frequently by for-profit programs (115 [51%] vs. 35 [25%] for nonprofits, p<0.001).

Exhibit 3:

Recruitment Techniques by Profit Status

All Profit Status
Non-profit For-profit P-value1

Facilities (n) 368 142 226

Recruitment Techniques
Any non-clinical recruitment approach 160 43% 13 9% 147 65% <0.001
Cost or luxury-based reasoning, n (%)*
Promotes luxury amenities 75 20% 4 3% 71 31% <0.001
Suggests that cost reflects quality 13 4% 2 1% 11 5% 0.08
Travel-based inducements, n (%)*
Car transportation 49 13% 5 4% 44 19% <0.001
Pay for or book flight to facility 8 2% 0 0% 8 4% 0.02
Encourages or normalizes travel 12 3% 1 1% 11 5% 0.03
General transportation 3 1% 0 0% 3 1% 0.17
Attempts to extend contact after call, n (%)*
Scheduler offers to talk to family or suggests family supporting treatment financially 80 22% 8 6% 72 32% <0.001
Offers to email or text 72 20% 4 3% 68 30% <0.001
Other non-clinical recruitment approaches*
Pressure to commit to treatment (e.g., life or death situation) 16 4% 1 1% 15 7% 0.01
Other 3 1% 0 0% 3 1% 0.17

Source: Authors’ analysis of audit survey data merged with facility data.

Notes: Categories flagged with an asterisk (*) indicate that multiple answers per facility were possible.

1 Chi-squared test was performed for all recruitment technique categories.

For-profit programs were the dominant users of recruitment techniques (147 [65%] of for-profit programs using any technique vs. 13 [9%] of nonprofit programs, p<0.001, Exhibit 3). To illustrate the range of recruitment language used, we compiled typical examples of these statements across four common themes (Exhibit A3).25 Common recruitment techniques used by for-profit programs included promotion of luxury amenities such as gourmet food (71 [31%] of for-profit programs), offering car transportation (44 [19%]), offering to maintain contact with the caller by email or text after the call (68 [30%]), and offering to contact the caller’s family or suggesting that family should finance admission (72 [32%]).

Admission Cost and Financing

The average daily cost for residential treatment among respondents was $618 (SD 468, Exhibit 4), though the average daily cost at for-profit programs was more than twice that of nonprofit programs ($758 vs. $357, p<0.001). A higher proportion of for-profit programs required upfront payment than nonprofit programs (88% vs. 50%, p<0.001), and for-profit programs requested larger upfront payments. For-profit programs on average asked for $17,434 upfront ($758 per day for 23 days) vs. $5,712 ($357 per day for 16 days) at nonprofit programs.
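To make the relationship among the daily cost, upfront payment, and days-requested figures concrete, the sketch below applies the definitions given in the Exhibit 4 notes to a hypothetical quoted price (it is illustrative only, not the authors' calculation code).

```python
# Illustrative cost measures per the Exhibit 4 definitions; figures are hypothetical.

def cost_per_day(total_cost: float, total_days: float) -> float:
    # Total cost of treatment divided by the total days of treatment initially quoted.
    return total_cost / total_days

def days_requested_upfront(upfront_payment: float, daily_cost: float) -> float:
    # Upfront payment expressed as a number of treatment days' cost.
    return upfront_payment / daily_cost

# Hypothetical program quoting $21,000 for a 28-day stay with $17,000 due upfront.
daily = cost_per_day(21_000, 28)                                # $750 per day
print(daily, round(days_requested_upfront(17_000, daily), 1))   # about 22.7 days' cost upfront
```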

Exhibit 4:

Treatment Cost and Financing Offered, by Profit Status

All Profit Status
Non-profit For-profit P-value1

Programs (n) 368 142 226

Cost of treatment
Cost per day, mean (SD)2 $618 ($468) $357 ($350) $758 ($463) <0.001
Any upfront payment required, n (%)3 264 74% 68 50% 196 88% <0.001
Treatment days' cost requested upfront, mean (SD)4 20 (14) 16 (15) 23 (13) <0.001

Payment accepted without insurance? n (%) * 5
Yes, cash (including check, debit card, wire, etc.) 252 71% 78 58% 174 79% <0.001
Yes, credit card or loan 222 62% 52 39% 170 77% <0.001
Yes, public grants or vouchers 60 17% 47 35% 13 6% <0.001
Don't know 17 5% 11 8% 6 3% 0.02

Approaches offered to finance admissions, n (%) *
 Any 104 28% 21 15% 83 37% <0.001
Encourages credit card 53 14% 8 6% 45 20% <0.001
Payment plan 54 15% 16 11% 38 17% 0.14
Other 14 4% 1 1% 13 6% 0.01

Source: Authors’ analysis of audit survey data merged with facility data.

Notes: Categories flagged with an asterisk (*) indicate that multiple answers per program were possible.

1 Chi-squared test was performed on any upfront cost due, payment methods accepted, and payment assistance offered. t test was performed for mean cost per day and mean treatment days’ cost requested upfront.

2 Cost per day is calculated as the total cost of treatment divided by the total days of treatment initially quoted. 41 facilities without enough data to calculate cost per day are excluded (35 are missing data or have unknown total cost of treatment, 1 is missing data on total days of treatment, and 5 are missing both).

3 Facilities with missing data on upfront costs are excluded (n=9).

4 Treatment days’ cost requested upfront is calculated as total cost of treatment due upfront divided by cost per day. 75 facilities without enough data to calculate treatment days’ cost are excluded (41 are the facilities excluded from the cost per day calculation; for the remaining 34 facilities, the cost of treatment due upfront is either missing or unknown).

5 Facilities with missing data on payment methods accepted, or that only stated that most patients are insured, are excluded (n=12).

Most programs accepted debt-based payments such as credit cards or loans (222 [62%]) from the uninsured/cash-pay callers (Exhibit 4). This was skewed towards for-profit programs, roughly three-quarters of which accepted credit cards or loans (170 [77%]), vs. a minority of nonprofit programs (52 [39%], p<0.001). To finance admission, 83 (37%) for-profit programs encouraged some form of debt-based financing such as credit card use or a “payment plan,” compared with 21 (15%) nonprofit programs (p<0.001).

Outcomes by Accreditation and State Licensing

The majority of responding programs had accreditation from either TJC or CARF (266 [72%], Exhibits A4–A6).25 Over one-third of accredited programs offered admission to callers (98 [37%]), while over half (141 [53%]) used recruitment techniques, both at higher rates than non-accredited programs. Accredited programs had an average cost of $703 per day, with 80% asking for upfront payment of 20 days of treatment, or $14,060 on average. These costs exceeded those of unaccredited programs ($355 per day, p<0.001). Among the 336 programs (91%) with state licensing, one-third offered admission to callers (109 [33%]) and 238 (73%) asked for an average of 20 days of treatment costs upfront. A total of 141 (42%) licensed programs used recruitment techniques, a lower rate than among non-licensed facilities (20 [63%], p=0.02).

DISCUSSION

In this audit study, many residential programs offered rapid access to addiction treatment, but admission came at a substantial upfront cost. The rapid access available to callers, albeit cash-paying ones, suggests that overall treatment capacity exists. However, rapid admission was often offered before any formal clinical evaluation to establish the necessity for potentially costly residential care compared with care in another treatment setting. Programs used a wide range of recruitment techniques to persuade callers to commit to admission before any clinical evaluation, including payment for transportation and financing admission with credit cards or loans. These findings raise concerns that residential treatment programs may be admitting a clinically and financially vulnerable population for costly treatment without assessing appropriateness for other care settings.

There were many contrasts between the practices of for-profit and nonprofit programs. One of the starkest differences between these groups was the use of recruiting techniques, such as continuing contact with callers by email or text after calls, which were almost entirely concentrated among for-profit programs. The financial incentive for active recruitment at the program level is clear given that, on average, for-profit programs requested over $17,000 in advance payment. These costs would be unmanageable, however, for typical patients with OUD, 60% of whom earn less than $25,000 annually.26 Compounding the problem is that some for-profit programs encouraged the use of debt or family members for payment, which could cause significant long-term financial burden.1,27 Another important distinction between for-profit and nonprofit programs was that for-profits were more likely to screen out patients with additional psychiatric comorbidities, a group that encompasses roughly 40% of individuals with substance use disorder.28 This suggests that for-profit programs also seek to recruit patients with low clinical complexity, possibly due to a lack of the clinical expertise or staffing necessary to manage more complicated patients.

Accredited programs were more likely than unaccredited programs to offer admission before full clinical evaluation and to actively recruit patients with inducements such as paid transportation. Accreditation from TJC and CARF is advertised as demonstrating adherence to rigorous quality standards, but descriptions of those standards are not publicly available, making it difficult to assess whether the admission practices we observed run counter to them.29,30 With respect to state licensing, licensed programs were less likely than non-licensed programs to engage in practices such as recruitment techniques, but because over 90% of the facilities in our sample held state licenses, licensed programs largely determined the prevalence of admission practices across the whole sample. The prevalence of active recruitment and of admission offers without clinical evaluation among both licensed and accredited programs raises questions about the adequacy of these systems to provide oversight of the admission practices of residential programs.

Our findings fill an important gap in the literature on the nature of the residential addiction treatment industry in the US. We are not aware of a similar national survey focused on residential admission practices, or of other audit studies of substance use treatment programs beyond outpatient physician practices.31,32 The closest analogue to this work is the National Survey of Substance Abuse Treatment Services (N-SSATS), an annual self-reported survey conducted by SAMHSA, but that survey does not assess admission practices such as cost, bed availability, or clinical screening.

In conclusion, we found a pattern of high costs and active recruitment, frequently without clinical evaluation, in a national audit of residential treatment programs. Some of these behaviors were more common among for-profit programs, but others were similar across both for-profit and nonprofit organizations. These findings raise concern about investing in access to residential treatment as a policy priority without broad reform. Substantially increased oversight and transparent measures for accountability may be required before significant reform can occur in residential programs.

Supplementary Material

online appendix

REFERENCES
