Abstract
Connecting people to mental health and substance abuse services is critical, given the extent of unmet need. The way health plans structure access to care can play a role. This study examined treatment entry procedures for specialty behavioral health care in private health plans, and their relationship with behavioral health contracting arrangements, focusing primarily on initial entry into outpatient treatment. The data source was a nationally representative health plan survey on behavioral health services in 2003 (N = 368 plans with 767 managed care products; 83% response rate). Most health plan products initially authorized six or more outpatient visits if authorization was required, did not routinely conduct telephonic clinical assessment, had standards for timely access, and monitored wait time. Products with carve-outs differed on several treatment entry dimensions. Findings suggest that health plans focus on timely access and typically do not heavily manage initial entry into outpatient treatment.
Keywords: mental health, substance abuse, managed care, access, health plans
Introduction
There is a high level of unmet need for mental health and substance abuse services. For example, in one national survey, only 41% of those who currently met criteria for selected behavioral health disorders received any treatment in the past year, and of those who did, only one third received minimally adequate treatment.1 Thus, an important priority in the behavioral health field is to improve access to appropriate care. This goal is underlined by the Institute of Medicine’s report on improving care for mental health and substance use conditions2 as well as the President’s New Freedom Commission on Mental Health.3 While many factors such as stigma and financial barriers may be related to unmet need, the way that systems of care are structured can also affect access to initial and continuing treatment.
Health plans and the managed behavioral healthcare organizations (MBHOs) with which they frequently contract for specialty behavioral health services can play an important role. Enrollment in managed behavioral health care programs offered by MBHOs totaled 164 million in 2002,4 including health plan and employer “carve-out” contracts with MBHOs. By 2005, 95% of privately insured persons were enrolled in managed care plans.5 In 2003, 72% of private health plans’ commercial managed care products contracted with an MBHO, including 80% of health maintenance organization (HMO), 54% of preferred provider organization (PPO), and 84% of point-of-service products.6 Thus, health plans and their MBHO vendors are in a position to affect the treatment entry experience for most of the privately insured population.
Health plans can manage or influence utilization in many ways, including initial access. Managed care techniques focus on either providers or patients, and may be financial or non-financial (administrative) in nature.7,8 Administrative techniques, such as utilization management, may nonetheless have financial implications (e.g., a service may not be covered unless it is approved in advance). The treatment entry process in health plans entails a variety of administrative techniques. Enrollees may or may not be required to obtain a referral or authorization for care, and if they do, the experience may differ depending on whether clinical assessment is conducted over the phone. Plans may use different medical necessity criteria for their prior authorization procedures, and may authorize different numbers of visits in the initial allocation. Whether or not enrollees are required to seek authorization, they may call the health plan or MBHO for information regarding appropriate providers, and health plans may consider different provider characteristics when providing referral information.
Other administrative techniques include attempts to ensure timely treatment by maintaining formal standards for maximum waiting time to first appointment or service, and by utilizing various methods to monitor actual wait time. Plans may also choose to require network providers to use standardized assessment instruments. In all of these areas, when a health plan contracts with an MBHO, it may be the MBHO’s structure and procedures that define treatment entry. There are some indications that managed care in general,9,10 and behavioral health care in particular,6 are moving towards fewer administrative controls and relying more on financial techniques such as patient cost-sharing.
Specific aspects of the treatment entry process can influence behavioral health care utilization. For example, studies have found that initial authorization of a greater number of outpatient sessions is associated with higher total outpatient utilization, and that the number of visits authorized is correlated with likelihood of termination at milestone visits,11,12 although another study found no such effect.13 Examining a different aspect of treatment entry, moving from in-person behavioral health evaluations to a call center with minimal assessment and routine authorization resulted in a higher proportion of enrollees using specialty mental health care.14 Another study created a managed care index that included prior authorization for specialty care as well as primary care gatekeeping and financial incentives.15 More-managed health plans were associated with reduced referrals to psychiatrists but no difference in the likelihood of referral to mental health specialists overall. More-managed plans were also associated with better outcomes for a sample of primary care patients with depressive symptoms in the same study.
In a previous analysis of the same nationally representative health plan survey data examined in the current study, it was found that 58% of health plans’ commercial managed care products required prior authorization for outpatient mental health care in 2003.6 More than half of all products required that the enrollee and/or specialty provider contact the plan, and primary care gatekeeping was extremely rare. Furthermore, products that contracted with MBHOs were more likely to require prior authorization. However, that analysis did not address other key aspects of the treatment entry process such as the number of visits initially authorized, medical necessity criteria used, or standards for timely access. The 1999 round of this same survey of health plans found that products contracting with MBHOs were more likely to employ several treatment management approaches, but it did not include the full range of initial access procedures.16
This study goes beyond prior work to provide a multifaceted and more in-depth examination of treatment entry for mental health and substance abuse services, using nationally representative data. Specifically, we examine the following research questions:
What are the prevalence and characteristics of health plans’ treatment entry structure and practices?
How do these vary by behavioral health contracting arrangement?
Methods
Data and Sample
A nationally representative survey of 368 commercial health plans was conducted regarding alcohol, drug abuse, and mental health services in 2003 (83% response rate). This telephone survey included an administrative module addressing behavioral health contracting, benefits, network management, and provider payment, and a clinical module addressing primary care screening and treatment, entry into specialty care, utilization management, prescription drug formularies, quality improvement, and other clinically oriented topics. Each plan was asked about its top three commercial managed care products. “Product” was defined as any package, plan, or contract that was similar in out-of-network coverage, referrals, and primary care physicians. Respondents were asked to group together any packages or plans that were similar except for differences in cost-sharing, such as copayments.
The telephone survey was conducted between April 2003 and April 2004, regarding health plan procedures and benefits during 2003. Typically there were two respondents per health plan (executive director and medical director, or their designees). For some national or regional plans, respondents at the corporate headquarters level were interviewed regarding multiple sites. In some cases the health plans referred us to the MBHO for more detailed information. Items were asked at the product level within each market-area-specific plan. The survey was a follow-up to an earlier round conducted in 1999. Data collection procedures were similar in both rounds, as was much of the key content and survey structure.
The study is linked to the Community Tracking Study (CTS), a longitudinal study of health system change, funded by the Robert Wood Johnson Foundation.17 The CTS sample design contained strata for large metropolitan, small metropolitan, and non-metropolitan market areas. Within strata, nearly all sites were randomly selected with probability proportional to size (with nine sites selected with certainty). The primary sampling units for our survey were the 60 market areas selected for the CTS to be nationally representative. Our second sampling stage consisted of selecting health plans within market areas. Health plans serving multiple markets were defined as separate health plans for the study and data were collected with reference to the specific market area.
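The probability-proportional-to-size (PPS) logic used for site selection can be illustrated with a small sketch. The site sizes and number of draws below are hypothetical, not the CTS data; the point is only that sites whose inclusion probability reaches 1.0 are taken with certainty, as the nine largest CTS sites were.

```python
# Illustrative PPS inclusion probabilities (hypothetical sizes, not CTS data).
sizes = {"site_A": 500, "site_B": 120, "site_C": 80, "site_D": 300}
n_draws = 2
total = sum(sizes.values())  # 1000

# Inclusion probability proportional to size, capped at 1.0 (certainty site).
incl_prob = {s: min(1.0, n_draws * sz / total) for s, sz in sizes.items()}
certainty_sites = [s for s, p in incl_prob.items() if p == 1.0]

# site_A has inclusion probability 2 * 500 / 1000 = 1.0, so it is selected
# with certainty; the remaining sites are sampled with probability < 1.
print(incl_prob)
print(certainty_sites)
```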
While this paper presents results from 2003, we also describe the 1999 sampling strategy as it formed the basis for the 2003 sample. The 1999 sample frame was based on the CTS Follow-Back Survey, which used information from household survey respondents to survey insurers regarding health plan characteristics. The Follow-Back Survey yielded approximately 1,000 entities categorized as managed care plans across all sites. Based on information from Web searches and industry directories, we excluded entities that were exclusively indemnity plans as well as health plans that were no longer present within market areas. This left 944 health plans as the sample frame, from which 720 market-specific health plans were selected with equal probability and without replacement across the 60 sites. The health plan sample was allocated to each market area based on estimated health plan enrollment in each site. The allocation to each site was proportionally distributed between PPO-only and HMO/multi-type plans. Of the 720 sampled plans, 247 were categorized as ineligible because they had closed, merged or were otherwise unreachable, had fewer than 300 subscribers in the market area, did not offer comprehensive health-care products, served only Medicaid/Medicare, or for several other reasons. This left 473 eligible plans, of which 434 (92%) responded. They reported on 787 eligible products. For the medical director portion, 417 plans responded (88% response rate) regarding 752 products.
For the 2003 survey, we augmented the 1999 sample of 720 health plans with a national sample of health plans that were not previously operating in the sample sites. These additional plans were identified using several sources including databases maintained by the National Association of Insurance Commissioners and Interstudy. Of the 110 new health plans identified, we selected 94, using an allocation to maintain approximately the same sampling rates as those for the health plans selected for the 1999 survey. Thus, the 2003 sample frame consisted of 1,054 health plans, of which 814 health plans were sampled, including all 1999 eligible sampled plans as well as a sample of 1999 ineligibles and of new plans. Any plans sampled in 1999 that were ineligible in the earlier period, but that met criteria in 2003, were included as eligible for 2003.
Of the 814 health plans in our 2003 sample frame, 373 were categorized as ineligible according to the same criteria as in the previous round. This left 441 eligible health plans, of which 368 (83%) responded and reported on 812 eligible products. Of these, 347 health plans completed the medical director portion of the survey, reporting on 771 products, for a 79% response rate. Four products that were “consumer-driven plans” (a new category in 2003) were excluded from the current analyses.
Hypotheses
While a major contribution of this analysis is to describe health plan practices, two overall study hypotheses are considered here. First, that utilization management practices would reflect moderate levels of management, in keeping with the general move in health care towards less overt administrative control. Second, that plans contracting with MBHOs would have significantly different treatment entry practices (tending towards more management) than other plans, based on the literature and the rationale for carve-outs.
Measures
Measures derived from the survey data are described below.
Contracting arrangement
Specialty (contracted with an MBHO for specialty behavioral health services); comprehensive (contracted with a single vendor for both general medical and behavioral health provider networks); and internal-only (all behavioral health services provided by plan employees or through a network of providers directly administered by the plan).
Provider Information
Provider characteristics considered by phone center staff when choosing provider names to give to enrollees: geographical access, appointment availability, knowledge of language other than English, race, ethnicity, cultural competence, gender, type of practitioner (e.g., social worker), specialized expertise in substance abuse or in treating adolescents.
Prior authorization
Whether prior authorization was required to initiate specialty outpatient behavioral health care; asked separately for mental health and substance abuse. If required, the typical number of outpatient visits initially approved for routine cases.
Medical necessity criteria
If prior authorization was required for any level of behavioral health care, what medical necessity criteria set(s) were used (Milliman and Robertson,18 Interqual,19 self-developed by plan or behavioral health vendor, American Society of Addiction Medicine (ASAM), other criteria, left to clinical judgment); asked separately for mental health and substance abuse.
Clinical assessment
For all products, whether behavioral health specialty providers in the network were required to use a standardized instrument, tool, or protocol to assess mental health and substance abuse problems. For those products that required prior authorization for outpatient care through a call center, under what circumstances was clinical assessment conducted over the phone: routinely, only if high-risk or urgent/emergent, not conducted.
Wait time to first appointment
Whether plan had a formal standard for the maximum waiting time from request for behavioral health treatment to the initial appointment; asked separately for routine, urgent, and emergency treatment. For plans that had a standard, the maximum amount of time specified in days or hours, and methods used to monitor wait time.
Wait time monitoring methods
For plans that had a wait time standard, the methods used to monitor actual waiting time: patient surveys, complaint analysis, “secret shopper” method (calling for appointment without disclosing identity), other ways of contacting providers, claims analysis, or other methods.
Statistical Analysis
The findings reported are national estimates. Data presented here were weighted to be representative of health plans’ commercial managed care products in the continental U.S. The sampling weights were computed from the inverse of the selection probabilities, which were in turn computed from each stage of selection: site selection and the selection of health plans in each site. The site selection probability is unity (1.0) for the nine sites selected with certainty and less than 1.0 for the other 51 sites. The plan selection probability (conditional on the selection of the site) is the selection rate in each site. We computed the base weight for all health plans using the inverse of the product of site selection and selection of plans in each site.
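The base-weight construction described above can be sketched as follows. The selection probabilities shown are hypothetical, not the study's actual values; the sketch only demonstrates that the base weight is the inverse of the product of the two stage-specific selection probabilities.

```python
# Illustrative base-weight computation (hypothetical probabilities).

# Stage 1: site selection probability (1.0 for sites selected with certainty).
site_prob = {"site_A": 1.0, "site_B": 0.35}

# Stage 2: plan selection rate within each site, conditional on the site.
plan_prob = {("site_A", "plan_1"): 0.8, ("site_B", "plan_2"): 0.5}

def base_weight(site, plan):
    """Base weight = inverse of the product of the two selection probabilities."""
    p = site_prob[site] * plan_prob[(site, plan)]
    return 1.0 / p

print(base_weight("site_A", "plan_1"))  # 1 / (1.0 * 0.8) = 1.25
print(base_weight("site_B", "plan_2"))  # 1 / (0.35 * 0.5)
```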
Statistical analyses were implemented using SUDAAN software20 for accurate estimation of the sampling variance given the complex sampling design. When comparing results by contracting arrangement, pairwise t-tests were used to determine significance. Corrections were made for multiple comparisons by using the Bonferroni correction for analyses involving pairwise tests for all three contracting arrangements. For some variables, particularly those that are conditional, the comprehensive contracting subsample of responding products was small. Results are not presented separately for comprehensive products when the unweighted N was less than 30 products.
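The Bonferroni adjustment for the three pairwise contracting-arrangement comparisons can be sketched as below. The unadjusted p-values are hypothetical; in the study itself, the pairwise t-tests and their variances were computed in SUDAAN to account for the complex survey design.

```python
# Sketch of Bonferroni-corrected pairwise comparisons across the three
# contracting arrangements. P-values are hypothetical placeholders.
from itertools import combinations

arrangements = ["specialty", "comprehensive", "internal"]
pairwise_p = {  # hypothetical unadjusted p-values from pairwise t-tests
    ("specialty", "comprehensive"): 0.004,
    ("specialty", "internal"): 0.030,
    ("comprehensive", "internal"): 0.200,
}

n_tests = len(list(combinations(arrangements, 2)))  # 3 pairwise tests
alpha = 0.05

for pair, p in pairwise_p.items():
    adjusted = min(p * n_tests, 1.0)  # Bonferroni: multiply p by number of tests
    significant = adjusted < alpha
    print(pair, adjusted, significant)
```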
Results
Sample Characteristics
Table 1 displays characteristics of the weighted sample for this analysis. Of products offered by health plans, approximately 29% were HMOs, 36% were POS products and 35% were PPOs. Seventy-three percent of products had specialty contracts with MBHOs, 12% had comprehensive contracts, and 15% had internal behavioral health arrangements. Over three quarters of products were for-profit. Plans were distributed across all four Census regions and enrollment categories. The distribution of contracting arrangement and other variables differed significantly across product types (p < .01 for all).
Table 1.
Sample Characteristics
| | Total | HMO | POS | PPO |
|---|---|---|---|---|
| Number of products, unweighted | 767 | 247 | 261 | 259 |
| Number of products, weighted | 7,530 | 2,209 | 2,619 | 2,702 |
| Weighted % of products | ||||
| Contracting Arrangement* | ||||
| Internal | 15.2 | 16.2 | 13.1 | 16.6 |
| Specialty | 72.7 | 80.1 | 83.7 | 55.0 |
| Comprehensive | 12.1 | 3.7 | 3.2 | 28.4 |
| Tax Status* | ||||
| Non-profit | 14.4 | 17.9 | 11.7 | 14.2 |
| For Profit | 85.6 | 82.1 | 88.3 | 85.8 |
| Region* | ||||
| West | 18.9 | 21.0 | 19.5 | 16.5 |
| Northeast | 10.4 | 11.7 | 9.9 | 9.8 |
| Midwest | 27.2 | 25.8 | 21.5 | 34.4 |
| South | 43.5 | 41.6 | 49.1 | 39.2 |
| Market Area Enrollment* | ||||
| < 10,000 | 36.2 | 32.2 | 38.1 | 37.6 |
| 10,000 – 50,000 | 27.1 | 30.8 | 33.3 | 17.7 |
| > 50,000 | 20.6 | 24.5 | 18.5 | 19.5 |
| Unknown | 16.1 | 12.5 | 10.2 | 25.2 |
Notes: HMO = health maintenance organization, POS = point-of-service plan, PPO = preferred provider organization. Totals may not equal 100% due to rounding.
*p < .01
Provider Information
For the large majority of products, numerous provider characteristics were taken into consideration when choosing provider names to give to enrollees who called the phone center (Table 2). Nearly all products considered geographical access, knowledge of language other than English, type/discipline of practitioner, specialized expertise in substance abuse treatment, and specialized expertise in treating adolescents. Over 80% considered race/ethnicity/cultural competence and 90% considered gender. Appointment availability was considered by about 60% of products. Specialty-contracted products were significantly more likely than products with internal behavioral health arrangements to consider several of these factors, with the differences particularly large in terms of race/ethnicity/cultural competence (92% versus 49%, p <.01).
Table 2.
Provider Characteristics Plans Consider When Enrollees Call for Referral Information
| | Percent of Products (weighted) | | |
|---|---|---|---|
| | Contracting Arrangement | | |
| | All Products | Specialty | Internal |
| N – Weighted | 7,004 | 5,348 | 1,071 |
| Access | |||
| Geographical access | 98.6 | 98.2 | 100.0 |
| Appointment availability | 59.9 | 66.7 b | 47.8b |
| Training/Expertise | |||
| Type of practitioner (e.g., social worker) | 90.0 | 97.9 b | 73.4 b |
| Specialized expertise in substance abuse | 93.8 | 94.7 | 90.7 |
| Specialized expertise with adolescents | 91.3 | 97.2 | 91.4 |
| Other Characteristics | |||
| Knowledge of language other than English | 90.4 | 95.9 | 90.9 |
| Race, ethnicity, cultural competence | 81.8 | 92.2a | 49.4 a |
| Gender | 89.9 | 93.3 b | 70.8b |
Notes: Within each category, percentages were calculated based on non-missing responses. Missing data range from 1.5% to 2.7%. The “all products” column includes comprehensive products, but due to small N in this subset, results are not presented separately for comprehensive products. N for this table consists of the 95% of products that made provider information available to enrollees through a designated phone number at the health plan or MBHO.
Pairs that share superscript letter within a row are significantly different, a: p<.01 b: p<.05
Prior Authorization
Just over 41% of all products did not require prior authorization for mental health or substance abuse services (Table 3). Specialty-contracted products were significantly less likely than comprehensive products to forego this requirement (p<.01). For both mental health and substance abuse, the most common number of visits initially authorized was six to eight (42%–43% of all products). This range was especially common in specialty products. Fewer than 5% of all products typically authorized fewer than six visits, but this was more common in products with internal behavioral health arrangements (16% for mental health, 20% for substance abuse).
Table 3.
Prior Authorization and Initial Visit Allocations for Outpatient Mental Health and Substance Abuse Services
| | MENTAL HEALTH | | | | SUBSTANCE ABUSE | | | |
|---|---|---|---|---|---|---|---|---|
| | Percent of products (weighted) | | | | Percent of products (weighted) | | | |
| | Contracting Arrangement | | | | Contracting Arrangement | | | |
| | Total | Specialty | Comprehensive | Internal | Total | Specialty | Comprehensive | Internal |
| N – Weighted | 7,530 | 5,471 | 912 | 1,147 | 7,530 | 5,471 | 912 | 1,147 |
| Initial outpatient authorization: | ||||||||
| No initial approval required | 41.4 | 36.9b | 64.9b | 44.2 | 41.7 | 36.9a | 66.8a | 44.6 |
| Approval varies by patient/presenting problem | 0.9 | 1.1a,c | 0a | 0.1c | 0.8 | 1.0a | 0a | 1.0 |
| 1 – 5 visits | 4.2 | 2.6a,b | 0a,c | 16.3b,c | 4.8 | 2.7a | 0a | 20.4a |
| 6 – 8 visits | 43.0 | 50.2a | 31.4 | 15.6a | 42.4 | 50.1a,c | 29.1 | 13.4c |
| 10 – 12 visits | 3.9 | 2.0a | 0a | 17.6a | 2.9 | 2.0a | 0a,b | 10.2b |
| 15 – 20 visits | 1.8 | 1.2 | 3.6 | 3.7 | 0.9 | 0.7 | 0.7 | 2.3 |
| 30 visits | 1.3 | 1.8 | 0 | 0 | 2.1 | 2.3 | 2.9 | 0 |
| Other | 3.4 | 4.2b | 0.1b,a | 2.5a | 4.4 | 4.3 | 0.5 | 8.1 |
| Mean # outpatient visits authorized (SE) | 8.0 (14.4) | 8.1 (14.7) | 8.9 (12.9) | 6.9 (13.4) | 7.9 (16.5) | 8.1 (15.9)a | 9.9 (24) | 5.7 (14.1)a |
| Median # outpatient visits authorized | 8.0 | 8.0 | 8.0 | 7.0 | 8.0 | 8.0 | 8.0 | 6.0 |
Pairs that share superscript letter within a row (within mental health or substance abuse) are significantly different, a, c: p<.01, b: p<.05
Note: Percentages were calculated based on non-missing responses. Missing data < 6% except 11.4% for internal products.
When authorization was required and a typical number of visits was reported, the mean and median number of visits initially authorized were approximately eight for both substance abuse and mental health. There was no significant difference in means by contracting arrangement for mental health, but internal products typically authorized fewer substance abuse visits than specialty products (six versus eight visits, p <.01). The mean number of visits initially approved increased significantly since 1999, when it was approximately seven for both substance abuse and mental health (data not shown, p <.05). All products indicated that they always authorize at least one outpatient visit (data not shown).
Medical Necessity Criteria
Of products that required prior authorization for any level of mental health services, nearly three-quarters used self-developed criteria to determine that services were medically necessary (Table 4). The remainder primarily used two of the best-known commercial criteria sets (12% used Milliman and Robertson, 9% used InterQual). Eight percent indicated that these decisions were left to clinical judgment. There were significant differences by contracting arrangement. Eighty-nine percent of specialty-contracted products used self-developed criteria, while most comprehensive products used Milliman and Robertson, and half of products with internal behavioral health arrangements used InterQual (p<.01 for most pairwise comparisons). One-quarter of internal products left these decisions to clinical judgment. Some plans indicated more than one set of medical necessity criteria, as the categories were not mutually exclusive.
Table 4.
Medical Necessity Criteria and Standardized Assessment
| | MENTAL HEALTH | | | | SUBSTANCE ABUSE | | | |
|---|---|---|---|---|---|---|---|---|
| | Percent of products (weighted) | | | | Percent of products (weighted) | | | |
| | Contracting Arrangement | | | | Contracting Arrangement | | | |
| | Total | Specialty | Comprehensive | Internal | Total | Specialty | Comprehensive | Internal |
| Medical necessity criteria used | ||||||||
| N – Weighted | 7,233 | 5,410 | 881 | 942 | 7,314 | 5,410 | 881 | 1,023 |
| Milliman and Robertson | 11.8 | 2.8a,c | 55.3a,b | 33.4b,c | 11.6 | 2.7a,c | 55.8a,b | 30.5c,b |
| Interqual | 8.9 | 1.4a | 13.0c | 50.8a,c | 6.8 | 0.9a | 9.1c | 37.7a,c |
| Self-developed | 72.2 | 89.0a,c | 19.1a | 35.4c | 51.3 | 60.9a | 1.6a | 31.6a |
| Left to clinical judgment | 8.2 | 6.3a | 2.5c | 23.4a,c | 7.2 | 3.4a | 34.7a,b | 9.3b |
| Other Criteria | 7.5 | 6.8 | 17.7 | 4.3 | -- | -- | -- | -- |
| ASAM | -- | -- | -- | -- | 53.2 | 64.1a,c | 0.0a,b | 28.9c,b |
| Standardized assessment instrument required | ||||||||
| N – Weighted | 7,530 | 5,471 | 912 | 1,147 | 7,530 | 5,471 | 912 | 1,147 |
| Has requirement | 19.4 | 22.2a | 0.2a,b | 17.9b | 19.1 | 22.0a | 0.2a,b | 16.9b |
Pairs that share superscript letter within a row (within mental health or substance abuse) are significantly different, a,c: p<.01, b: p<.05
Note: Percentages were calculated based on non-missing responses. Missing data < 6%, except medical necessity criteria for comprehensive products (24.7%) and the assessment tool item for comprehensive (26.1%) and internal (11%) products. Plans could report more than one criteria set used. Blank cells indicate the item was not applicable or not asked. N for medical necessity criteria consists of the 96% of products that required prior authorization for some type of behavioral health services.
For substance abuse prior authorization, the most commonly reported medical necessity criteria were the American Society of Addiction Medicine (ASAM) patient placement criteria (53%), followed by self-developed criteria (51%). Criteria varied significantly by contracting arrangement with specialty-contracted products significantly more likely to use ASAM or self-developed criteria, and comprehensive or internal products more likely to use one of the two named commercial criteria sets. Comprehensive products were also significantly more likely to leave these decisions to clinical judgment (35%), and none reported using ASAM criteria.
Standardized Clinical Assessment
Only 19% of all products required network providers to use standardized instruments to assess mental health or substance abuse problems (Table 4). There was significant variation by contracting arrangement, with comprehensive products almost never requiring this (p<.01 compared to specialty-contracted products, p <.05 compared to products with internal behavioral health arrangements). About one-quarter of products that required prior authorization through a call center conducted telephonic clinical assessment on a routine basis, while three-quarters did so only if the caller presented with high-risk or urgent/emergent problems (data not shown). Products that contracted with an MBHO were much less likely than products with internal behavioral health arrangements to conduct routine assessment (16% versus 76%, p< .01).
Standards for Maximum Wait Time to Initial Appointment
Close to 90% of products had formal standards for maximum wait time to first appointment, varying only slightly by the acuity of treatment need (Table 5). This represents a significant increase since 1999, when 74% had such a standard for routine treatment (data not shown, p <.01). For routine, urgent and emergency treatment, specialty-contracted products were much more likely, and comprehensive products much less likely to have such a standard.
Table 5.
Health Plans’ Formal Standards for Maximum Wait Time to Initial Appointment
| | Percent of Products (weighted) | | | |
|---|---|---|---|---|
| | Contracting Arrangement | | | |
| | All Products | Specialty | Comprehensive | Internal |
| Routine Treatment | ||||
| Product has formal standard for maximum wait time | ||||
| N - Weighted | 7,530 | 5,471 | 912 | 1,147 |
| % with formal standard | 86.5 | 95.3a | 44.8a,b | 75.9b |
| If has standard, timeframe specified | ||||
| N - Weighted | 6,363 | 5,163 | – | 811 |
| Mean days, maximum wait time (SE) | 8.5 (.09) | 8.2 (.1) | – | 8.7 (.25) |
| Median days, maximum wait time | 7.0 | 7.0 | – | 10.0 |
| Products with maximum wait time of: | ||||
| 1 – 5 days | 3.0 | 2.4 | – | 8.1 |
| 7 days | 56.0 | 62.1a | – | 39.4a |
| 10 days | 32.4 | 28.3 | – | 41.4 |
| 14 days | 8.6 | 7.2 | – | 11.2 |
| Urgent treatment | ||||
| Product has formal standard for maximum wait time | ||||
| N - Weighted | 7,530 | 5,471 | 912 | 1,147 |
| % with formal standard | 87.6 | 95.4a | 44.8a,c | 82.3c |
| If has standard, timeframe specified | ||||
| N - Weighted | 6,387 | 5,172 | – | 827 |
| Mean number of hours, maximum wait time (SE) | 47.4 (.5) | 47.3 (.4) | – | 44.9 (1.6) |
| Median hours | 48.0 | 48.0 | – | 48.0 |
| Products with maximum wait time: Within 24 hours | 14.1 | 9.4a | – | 34.9a |
| Within 48 hours | 74.4 | 84.1a | – | 43.2a |
| Within 72 hours | 11.5 | 6.5b | – | 21.9b |
| Emergency treatment | ||||
| Product has formal standard for maximum wait time | ||||
| N - Weighted | 7,530 | 5,471 | 912 | 1,147 |
| % products with formal standard | 88.4 | 96.2a,b | 47.5a,c | 81.5b,c |
| If has standard, timeframe specified | ||||
| N – Weighted | 6,445 | 5,215 | – | 818 |
| “Immediately” | 41.2 | 39.9 | – | 31.7 |
| 1 – 4 hours | 28.6 | 33.2a | – | 8.4a |
| 6 – 8 hours | 24.0 | 25.1 | – | 23.1 |
| 12 hours | 2.8 | 0.2a | – | 20.4a |
| 24 hours | 3.0 | 1.6 | – | 13.0 |
| 72 hours | 0.4 | 0.0b | – | 3.4b |
Notes: Percents are based on non-missing data. Missing data <5% except 6.9% routine treatment standard, internal products; 12.4% urgent and emergency treatment, internal products. The “all products” column includes comprehensive products, but due to small N in some subsets, some results are not presented separately for comprehensive products.
Pairs that share superscript letter within a row are significantly different, a, c: p < .01 b: p<.05
Among products that had a formal standard for routine treatment, the mean number of days specified as a maximum was 8.5 and over half of all products specified exactly seven days. There was no significant difference in means by contracting arrangement, but a higher percentage of specialty products specified seven days (62% versus 39%, p <.01). For urgent treatment, among products that had a formal standard, three quarters specified a maximum waiting time of 48 hours. Internal products were more likely than specialty products to specify 24 hours or 72 hours. For emergency treatment, about 41% of products indicated that care must be received “immediately,” while about one quarter each specified 1–4 hours and 6–8 hours. Specialty-contracted and internal products were not significantly different in terms of specifying “immediate” emergency treatment but specialty products were more likely to specify the 1–4-hour time frame (33% versus 8%, p <.01).
Methods Used to Monitor Wait Time to Initial Appointment
Complaint analysis (96%) and patient surveys (90%) were the most common methods reported for monitoring waiting time to initial appointment, among those products that had a formal standard (Table 6). Sixty-nine percent reported using “secret shopper” methods and 80% reported using other ways of contacting providers. Three quarters of products used claims analysis and more than half reported other methods, primarily online scheduling of appointments and site visits. Products that contracted with MBHOs were consistently more likely to utilize most of these methods compared to products with internal arrangements.
Table 6.
Methods Used to Monitor Waiting Time to Initial Appointment
Among products with formal standards for maximum wait time:

| Monitoring method | All products | Specialty | Internal |
|---|---|---|---|
| N (weighted) | 6,526 | 5,215 | 896 |
| Patient surveys | 89.6 | 97.0a | 49.9a |
| Complaint analysis | 95.7 | 98.4 | 89.5 |
| “Secret shopper” | 68.5 | 74.3a | 39.3a |
| Other ways of contacting providers | 80.4 | 88.6a | 41.7a |
| Analysis of claims | 73.8 | 86.6a | 9.0a |
| Other methods* | 54.9 | 62.7a | 14.6a |
| None | 1.9 | 0.2 | 9.6 |

Cell entries are percents of products (weighted); “Specialty” and “Internal” refer to contracting arrangement.
*Other methods include appointments scheduled online (46% of other methods) and site visits to review appointments (24% of other methods).
Notes: Within each category, percentages were calculated based on non-missing responses. Missing data account for 1% of specialty products and 6% of internal products. The “all products” column includes comprehensive products, but due to small N in this subset, results are not presented separately for comprehensive products.
Pairs that share a superscript letter within a row are significantly different, p < .01.
Discussion
The overall picture of access to specialty behavioral health outpatient care in these plans suggests that the tendency toward less overt management of health care (often viewed as a result of consumer backlash) is partially evident in behavioral health as well. Prior authorization is typically required, but the procedures involved are not nearly as stringent as they could be. In the large majority of plans that did require prior authorization, at least six visits were typically approved in the initial allocation. Furthermore, the mean number of visits typically authorized increased modestly but significantly, from seven to eight, between 1999 and 2003, adding to previous evidence of some loosening of management on the part of managed care plans.6 Larger initial visit allocations can reduce the time and costs, for health plans and providers, of managing relatively inexpensive, short-term outpatient care. They also give patients certainty regarding the minimum number of visits they will be allowed, which may assist in treatment engagement.
Also, most plans do not routinely conduct clinical assessment over the phone, which could raise concerns about appropriate matching of patients to providers. However, a study in one managed care organization found that switching from in-person evaluation to a call-in center with routine authorization and minimal assessment did not negatively affect treatment engagement or other performance indicators that could reflect inappropriate matching.21 The switch was associated with increases in family treatment for child mental health patients, as is often recommended.
There are several possible contributors to the treatment entry findings in our study. Some procedures that anecdotally may be difficult for consumers, such as having to provide detailed personal information to a health plan over the phone, are largely absent for outpatient care according to our data. This may reflect health plans’ inclination to avoid unnecessarily triggering consumer or purchaser dissatisfaction. The routine nature of the procedures in place, including typically allowing six to eight initial visits, may also reflect plans’ determination that tighter management of early outpatient care is simply not financially worthwhile. In any case, moderate rather than heavy management of access is in line with an increasing focus on consumer choice.
It also appears that few health plans are motivated to require providers to use standardized assessment tools for either mental health or substance abuse treatment. An abundance of standardized instruments is available for assessing severity and guiding treatment planning, such as the Beck Depression Inventory22 and the Addiction Severity Index,23,24 although some can be time-consuming and many are specific to single disorders (so that multiple instruments might be needed). The rarity of requiring such instruments may reflect either health plans’ reluctance to engage in tight management of initial outpatient care or confidence that network providers have the expertise to conduct effective assessments without standardized instruments. Plans may also recognize that any single health plan generally accounts for only part of each provider’s caseload and thus may have limited influence. It may also simply be a matter of feasibility: most health plans have no way to readily monitor the assessments that network providers conduct. However, some health plans and MBHOs have implemented systems for tracking treatment progress and outcomes25,26 and thus could require the use of particular assessment tools with a better likelihood of compliance.
Most plans appear to view timely access as very important, judging from the fact that a large majority have formal standards for maximum waiting time, most have relatively quick timelines, and most consider appointment availability in the referral process. The use of multiple monitoring methods suggests that these standards may not exist just “on paper.” In particular, the fact that two thirds report using a proactive, sometimes controversial “secret shopper” approach suggests a serious focus on timely entry into care. The prevalence of these standards has increased since 1999. Health plan or MBHO accreditation standards (e.g., those of the National Committee for Quality Assurance) related to behavioral health service access or to conducting patient satisfaction surveys may help motivate some of this activity. Furthermore, from a marketing perspective, patients’ complaints about inability to obtain appointments, or about “phantom networks” with little actual availability, can be highly detrimental. At the same time, 40% of plans did not consider appointment availability when giving enrollees provider names. There are likely feasibility challenges in this area for health plans whose network providers are not tightly connected in terms of information technology, which the survey did not address. Improvements in this area could contribute to easier access for enrollees.
For continuing care and higher levels of care, medical necessity criteria take on importance. Products contracting with MBHOs by and large used self-developed criteria (quite likely developed by the MBHO rather than the health plan), while products with other behavioral health arrangements were more likely to use one of the two best-known commercial criteria sets or to leave the decision to clinical judgment. Precisely how these self-developed and commercial criteria sets differ in their effects on various aspects of behavioral health care is not known. A review of the literature on the effects of utilization review criteria in general found that numerous utilization review instruments or methods lack reliability and validity, and also noted problems in the practical application of the criteria.27 One noted difference between two of the most popular commercial products is that the Milliman and Robertson criteria incorporate explicit service quantities such as target length of stay, while InterQual provides clinical criteria and indications for types and levels of care without specifying target service quantities in the same way.28 These approaches have been termed, respectively, length-of-stay guidelines and algorithms considering specific clinical attributes.27 Our findings raise the intriguing question, not answerable with our data, of how the application of different medical necessity criteria sets across contracting arrangements may result in different utilization patterns.
Specialty contracting, the predominant way of arranging behavioral health services in health plans, brings with it a different way of structuring treatment entry, in several respects that have not previously been examined on a national basis. Products that contract with MBHOs are more likely to consider several provider characteristics when enrollees call for referral information. They are also more likely to require prior authorization (compared to comprehensive products), conduct clinical assessment over the phone only in high-risk cases, use self-developed medical necessity criteria, require providers to use standardized assessment instruments (relative to comprehensive products), have formal standards for timely access at all levels of acuity, and employ multiple methods to monitor actual waiting time. This is consistent with the expectation that health plans contracting with MBHOs may do so in order to take advantage of a “ready-made” specialty behavioral health care system, which may include specialized management approaches, procedures, information technology, and staffing to perform oversight and member assistance.
There were few differences in prior authorization-related practices for mental health versus substance abuse. Health plans appear to conduct prior authorization for outpatient care very similarly across all of behavioral health, in terms of both requiring prior authorization and the number of visits typically authorized. An exception is medical necessity criteria, largely due to the use of ASAM criteria, which apply only to substance abuse. Also, specialty-contracted products initially authorized a higher number of substance abuse treatment visits than did internal products, while this contrast did not exist for mental health.
This analysis focused primarily on health plans’ treatment entry procedures for initiating outpatient behavioral health care. Lengthy or more intensive types of treatment are more costly than the brief initial phase of outpatient care, and many concerns about managed care have focused on access to those types of treatment. The process of accessing ongoing treatment or higher levels of care may have different characteristics, and will be important to examine in its own right. Thus, it is possible that there are substantial access barriers not investigated within the scope of this analysis.
Limitations of the study include the fact that actual implementation can differ from policy; we are not able to determine from our data how practices and procedures were carried out. For example, we do not know how provider “appointment availability” was ascertained by plans or how accurately plans measured it. Respondents may also have varied in their interpretation of terms such as “appointment availability.” In addition, we neither surveyed enrollees nor collected enrollee-level data. It is also important to note that other factors, such as financial barriers, size of the provider network, and degree of choice, can affect access, although they are not the focus of this analysis. We focused primarily on initial outpatient treatment; barriers may be greater for other types of care. These data describe treatment entry as it was at the time of data collection, and changes may have occurred since then. However, these nationally representative findings present a detailed picture of the specialty behavioral health treatment entry process (primarily for initial outpatient care) in health plans at a point in time when the managed care “backlash” was already well underway. Furthermore, the examination of the relationship between treatment entry procedures and a variety of behavioral health contracting arrangements helps to shed light on the question of what difference health plan carve-outs make in terms of treatment entry.
Implications for Behavioral Health
When individuals decide to seek mental health or substance abuse treatment, it represents a window of opportunity during which it is important to respond. This detailed look at health plans’ management of initial access to specialty outpatient behavioral health care suggests that, according to health plans, there are relatively few barriers and some facilitators in place. Consideration of multiple provider characteristics when enrollees call for referral information reflects the potential for responding to enrollee preferences and requests. Prior authorization is usually required for outpatient care, and some individuals may find it hard to make even a single call to a third party such as a health plan in order to access care. However, prior authorization does not typically involve the collection of detailed clinical information over the phone, and the mean number of visits provided in the initial allocation suggests only moderate management during the early phase of outpatient treatment. The fact that the number of visits initially allocated has increased significantly (though modestly) since 1999 suggests a trend in this direction. Both providers and consumers are likely to view this positively.
Health plans’ reported focus on timely access to care is also encouraging. Monitoring actual wait time could result in health plans maintaining networks of providers that are sufficient in relation to need. The results also show that health plans’ contracting arrangements are strongly associated with treatment entry practices, and thus whether a plan “carves out” care has real implications for enrollees and providers.
Plans certainly vary, however, both within and across contracting categories, with a nontrivial minority having more stringent procedures. Furthermore, there may be greater access barriers in terms of longer-term outpatient treatment or higher levels of care. Use of different medical necessity criteria suggests there may be systematically different approaches to these aspects of treatment. There is a need to learn more about how health plans manage care in these areas. In addition, linking health plans’ treatment entry characteristics to utilization and consumer experience of care would be a useful next step.
Acknowledgments
This study was funded by the National Institute on Alcohol Abuse and Alcoholism grant #R01 A010869 and the National Institute on Drug Abuse grant #R01 DA10915. The authors thank respondents from the participating health plans for their time; Frank Potter and team at Mathematica Policy Research for sampling design and statistical consultation, and for fielding the survey; Galina Zolotusky for programming assistance; Grant Ritter for statistical consultation; Dominic Hodgkin for helpful review of an earlier draft; Terri White for research project coordination; and Michele Hutcheon for manuscript preparation.
References
- 1. Wang PS, Lane M, Olfson M, et al. Twelve-month use of mental health services in the United States: results from the National Comorbidity Survey Replication. Archives of General Psychiatry. 2005 Jun;62(6):629–640. doi: 10.1001/archpsyc.62.6.629.
- 2. Institute of Medicine. Improving the Quality of Health Care for Mental and Substance-Use Conditions. Washington, DC: National Academies Press; 2006.
- 3. New Freedom Commission on Mental Health. Achieving the Promise: Transforming Mental Health Care in America. Final Report. Rockville, MD; 2003.
- 4. Oss M, Jardine E, Pesare M. OPEN MINDS Yearbook of Managed Behavioral Health and Employee Assistance Program Market Share in the United States, 2002–2003. Gettysburg, PA: OPEN MINDS; 2002.
- 5. Gabel J, Claxton G, Gil I, et al. Health benefits in 2005: premium increases slow down, coverage continues to erode. Health Affairs (Millwood). 2005 Sep–Oct;24(5):1273–1280. doi: 10.1377/hlthaff.24.5.1273.
- 6. Horgan CM, Garnick DW, Merrick EL, et al. Changes in how health plans provide behavioral health services. Journal of Behavioral Health Services & Research. 2007 Sep 14; doi: 10.1007/s11414-007-9084-0. E-pub ahead of print.
- 7. Grembowski DE, Diehr P, Novak L, et al. Measuring the “managedness” and covered benefits of health plans. Health Services Research. 2000;35(3):707–731.
- 8. Landon BE, Wilson IB, Cleary PD. A conceptual model of the effects of health care organizations on the quality of medical care. Journal of the American Medical Association. 1998;279:1377–1382. doi: 10.1001/jama.279.17.1377.
- 9. Robinson JC. The end of managed care. Journal of the American Medical Association. 2001;285(20):2622–2628. doi: 10.1001/jama.285.20.2622.
- 10. Robinson JC. Reinvention of health insurance in the consumer era. Journal of the American Medical Association. 2004;291(15):1880–1886. doi: 10.1001/jama.291.15.1880.
- 11. Liu X, Sturm R, Cuffel BJ. The impact of prior authorization on outpatient utilization in managed behavioral health plans. Medical Care Research and Review. 2000 Jun;57(2):182–195. doi: 10.1177/107755870005700203.
- 12. Howard R. The sentinel effect in an outpatient managed care setting. Professional Psychology: Research and Practice. 1998;29(3):262–268.
- 13. Compton SN, Cuffel BJ, Burns BJ, et al. Datapoints: effects of changing from five to ten preauthorized outpatient sessions. Psychiatric Services. 2000 Oct;51(10):1223. doi: 10.1176/appi.ps.51.10.1223.
- 14. Hodgkin D, Merrick EL, Horgan CM, et al. Does type of gatekeeping model affect access to outpatient specialty mental health services? Health Services Research. 2007 Feb;42(1 Pt 1):104–123. doi: 10.1111/j.1475-6773.2006.00609.x.
- 15. Grembowski DE, Martin D, Patrick DL, et al. Managed care, access to mental health specialists, and outcomes among primary care patients with depressive symptoms. Journal of General Internal Medicine. 2002 Apr;17(4):258–269. doi: 10.1046/j.1525-1497.2002.10321.x.
- 16. Merrick EL, Horgan CM, Garnick DW, et al. Managed care organizations’ use of treatment management strategies for outpatient mental health care. Administration and Policy in Mental Health. 2006 Jan;33(1):104–114. doi: 10.1007/s10488-005-0024-0.
- 17. Kemper P, Blumenthal D, Corrigan JM, et al. The design of the community tracking study: a longitudinal study of health system change and its effects on people. Inquiry. 1996 Summer;33(2):195–206.
- 18. Milliman and Robertson Care Guidelines. Accessed June 11, 2008. http://www.milliman.com/expertise/healthcare/products-tools/milliman-care-guidelines/index.php.
- 19. InterQual Criteria. Accessed June 11, 2008. http://www.mckesson.com/en_us/McKesson.com/For%2BHealthcare%2BProviders/Hospitals/InterQual%2BCriteria%2BProducts/InterQual%2BCriteria%2BProducts.html.
- 20. Research Triangle Institute. SUDAAN User’s Manual: Release 8.0. Research Triangle Park, NC: Research Triangle Institute; 2002.
- 21. Merrick EL, Hodgkin D, Horgan CM, et al. Changing mental health gatekeeping: effects on performance indicators. Journal of Behavioral Health Services & Research. 2007 Jul 27; doi: 10.1007/s11414-007-9077-z.
- 22. Beck AT, Steer RA, Brown GK. Manual for the Beck Depression Inventory-II (BDI-II). San Antonio, TX: Psychological Corporation; 1996.
- 23. McLellan AT, Luborsky L, Woody GE, et al. An improved diagnostic evaluation instrument for substance abuse patients: the Addiction Severity Index. Journal of Nervous and Mental Disease. 1980 Jan;168(1):26–33. doi: 10.1097/00005053-198001000-00006.
- 24. Treatment Research Institute. Addiction Severity Index, 5th edition. http://www.tresearch.org/asi.htm.
- 25. PacifiCare Behavioral Health. Accessed November 28, 2007. http://www.pbhi.com/Members_public/Shared_Cust_Member/Provider_Directory/H4O_FAQ.asp.
- 26. Rowland C. Insurer seeks personal details on mental health form. Boston Globe. 2006 Nov 11.
- 27. Wickizer TM, Lessler D. Utilization management: issues, effects, and future prospects. Annual Review of Public Health. 2002;23:233–254. doi: 10.1146/annurev.publhealth.23.100901.140529.
- 28. Lippman H. Myths and management in American hospitals: the bottom line on length of stay. Business and Health. 2001 Apr 1.
