Abstract
The objective of this study was to evaluate the cost of serving one additional youth in the Big Brothers Big Sisters of America (BBBS) program. We used a marginal cost approach, which offers a significant improvement over previous methods based on average total cost estimates. The data consisted of 92 months of records, from January 2008 to August 2015, obtained from program administrators at one BBBS site in the Mid-Atlantic. Results show that the BBBS marginal cost of serving one additional youth was $80 per month of mentoring, irrespective of program type. Offering services for the average match duration of 19 months cost $1,503 per additional youth. The marginal costs per treated program participant were $1,199 for school-based and $3,301 for community-based programs. These marginal cost estimates are in the range of youth mentoring programs with significant returns on investment but are substantially higher than prior BBBS unit cost estimates produced with less robust estimation methods. This cost analysis can better inform policy makers and donors on the cost of expanding the scale of local BBBS programs and can suggest opportunities for cost savings.
Keywords: youth mentoring, youth development, benefit-cost, sustainability, decision-making, Big Brothers, Big Sisters of America
1. INTRODUCTION
Big Brothers Big Sisters of America (BBBS) is the oldest, largest, and best-known youth mentoring program in the US [1]. To facilitate the financial sustainability and expansion of these programs, and to compare their economic and health impact to other youth interventions, this paper estimates the cost per additional client. This information is relevant for decision makers, donors, and researchers because it quantifies the cost of expanding the scale of these mentoring programs.
1.1. Background on youth mentoring
The presence of a positive, trusted adult role model is a protective factor against maladaptive outcomes among youth [2]. Relationships like these, particularly when they are intentional and involve an adult acting in a helping capacity, are often referred to as mentoring. Mentoring can be defined by the nature of the relationship (i.e., natural versus volunteer) or by the degree to which the relationship is explicitly defined (i.e., informal versus formal) [3]. This paper focuses on mentoring programs that aim to create formal mentoring relationships between a youth and a volunteer adult. Mentoring programs can significantly improve academic achievement and school attendance, as well as reduce delinquency, aggression, and drug use [2, 4, 5]. Involvement in mentoring programs also improves decision-making skills and fosters future planning abilities [2, 4, 5]. Additionally, there is evidence to suggest that mentoring programs show the greatest effects for youth who are more likely to have low grades, misdemeanors, or other less favorable behavioral outcomes. Youth who are less likely to transition successfully into adulthood can be identified either by pre-existing problem behaviors or by exposure to environmental risk [2].
In the past 10 years, BBBS programs have served over 2 million youth ages 6–18. In 2016, the majority of these were children of color (68%), from low-income families (78%), who lived in a single-parent home (61%) [6]. A large-scale randomized controlled trial involving 959 youth ages 10–16 provided evidence of the program’s efficacy for reducing problem behaviors (i.e., aggression, alcohol and drug use, truancy) and improving school achievement and the quality of relationships with parents and peers [7]. The original research on the effects of BBBS mentoring focused on community-based mentoring (CBM), in which a carefully screened community volunteer is paired with a youth for a one-on-one relationship. More recently, however, school-based mentoring (SBM) has been growing in popularity due to the ease of recruiting groups of volunteers and of meeting with youth in schools. Recent studies of the effects of BBBS SBM found that mentored youth performed better academically and had improved perceptions of their own academic ability [8]. Additional research has focused on understanding correlates of program efficacy, with some work suggesting the importance of longer-term relationships [9].
1.2. Cost estimates of BBBS
Past cost studies of BBBS programs have typically divided total annual cost by the number of youth served to estimate the average cost (AC) of the program [10–14]. Average cost studies have also used the bottom-up ingredients approach to model the average cost of serving just one youth [15–18]. In other studies, the cost estimation method has been unclear [7, 19–23]. Excluding volunteers’ time and adjusting for inflation to 2016 USD, past studies have estimated the average annual cost per youth at $1,647 (ranging from $35 to $3,618 for small and large programs) [16], or between $437–$1,672 for SBM programs and $1,285–$1,676 for CBM programs [15].
While average cost (AC) estimates are a common and convenient measure of program unit cost, they are not an accurate estimate of how costs change when one additional youth is served. AC estimates overestimate costs by assuming that all operating costs, payroll, and fixed costs like rent and insurance increase with every unit increase in the caseload. The ingredients approach tries to correct for this overestimation by adjusting estimates with data from qualitative surveys of program managers and administrators that identify the key operating resources, and their quantities and values, needed to match and support just one youth and mentor. The underlying assumption of the ingredients approach is that the cost-per-unit estimates in the model can reflect the right quantity, value, and mix of each resource and how these vary at different levels of supply of mentoring services. Program AC and ingredients-approach estimates are common because program evaluations often do not report sufficient information to produce better cost-per-unit estimates, most likely because their focus is on youth outcomes rather than cost [24].
In contrast, marginal cost (MC) analysis, also referred to as incremental cost analysis, provides estimates that can help decision makers understand the cost of expanding the scale of their program by enrolling one more youth. For many decision makers, the benefits of youth mentoring accrue because “one more youth” was enrolled, so it is appropriate to know the cost of enrolling just “one more youth”. The MC approach takes into account that some operating costs change (e.g., match support supplies) while others do not (e.g., administration). This is similar to the ingredients approach, but MC estimates are produced by using statistical techniques to analyze variability in all program expenditure data as the number of clients changes. Specifically, this technique measures the association between changes in the program caseload and changes in expenditures over a long period of time, producing a cost-per-unit estimate after controlling for external factors that may affect estimates (e.g., changes to the local cost of living, seasonality, improvements in technology). MC analysis is possible when frequent (e.g., monthly or quarterly) multi-year data are available on total program expenditure and youth served and, if available, on the quantities of key resources used and their prices.
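The difference between AC and MC can be illustrated with a simple hypothetical cost function (all numbers below are illustrative assumptions, not estimates from this study):

```python
# Hypothetical program: $10,000/month in fixed costs (rent, administration)
# plus $80 in variable costs per youth-month of mentoring.
FIXED = 10_000
VARIABLE_PER_YOUTH = 80

def total_cost(youth: int) -> int:
    """Monthly total cost for a given caseload (hypothetical function)."""
    return FIXED + VARIABLE_PER_YOUTH * youth

n = 500
avg_cost = total_cost(n) / n                       # AC: spreads fixed costs over all youth
marginal_cost = total_cost(n + 1) - total_cost(n)  # MC: cost of one *additional* youth

print(avg_cost)       # 100.0 -> AC overstates the cost of expansion
print(marginal_cost)  # 80    -> MC isolates only what actually changes
```

In this sketch, using the AC of $100 to budget an expansion would overstate the cost of each added youth by $20, because fixed costs do not grow with the caseload.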
1.3. MC Estimates
While cost estimates are not the only element to consider in program planning or policy decisions, they can help leaders weigh the economic value of funding one program versus another [16]. Inaccurate program unit cost estimates can result in inefficient allocation of scarce donor and public resources. For example, qualitative surveys administered to 52 mentoring programs, including programs from every state, suggested that many managers believed that expanding their programs could decrease the cost per youth [16]. Increasing a program’s caseload would decrease the average cost only if the program is running at less than full capacity (i.e., requires no additional fixed or staffing investments to increase the total caseload). If this condition is met, then expansion without additional resources makes sense to reduce the average cost and maximize the use of resources. However, if a program is running at full capacity, then additional resources would be needed to expand the program while maintaining the ability to produce intense and long-lasting mentor relationships that result in successful matches and positive youth outcomes [9, 25].
On the other hand, overestimated unit costs could inhibit investment in programs because they would compare unfavorably to other youth program alternatives. For example, one study showed that increasing a BBBS cost per unit from about $1,300 to $3,300 would reduce the program’s benefit-to-cost ratio (BCR) (i.e., an analysis of whether the lifelong benefits to youth and the community exceed the intervention’s cost) from about $10 to $4 [26, 27]. There is currently uncertainty around BBBS’s BCR. While various studies have reported a positive BCR [10, 20, 28], WSIPP’s 2018 updated BCRs decreased significantly, to less than one dollar. The decrease is mainly due to the exclusion of economic benefit estimates derived from studies that did not meet WSIPP’s revised criteria for strong evidence. This does not mean that the excluded studies have no value; rather, it means that less confidence can be placed in those results until stronger study designs increase confidence in cause-and-effect inferences [29]. Because the BCR is a ratio of benefits to costs, this further emphasizes the importance of improved cost estimates such as those provided by an MC analysis.
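The sensitivity of a BCR to its cost denominator can be checked with simple arithmetic (the benefit figure below is back-solved for illustration only; it is not a value reported in the cited studies):

```python
# If lifetime benefits per youth are held fixed at the level implied by a
# BCR of about 10 at a ~$1,300 unit cost, the implied benefit is ~$13,000
# (an illustrative back-solved figure, not a published estimate).
benefits = 10 * 1_300

print(round(benefits / 1_300, 1))  # 10.0 -> BCR at the lower unit cost
print(round(benefits / 3_300, 1))  # 3.9  -> roughly the $4 figure cited above
```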
1.4. Study Objectives
This analysis seeks to contribute to the literature on BBBS youth mentoring programs by producing statistical estimates of the MC of scaling up a program. This MC analysis can inform program managers, policy makers, and donors on the cost of scaling BBBS and reveal important cost drivers. The specific aims of this study are to 1) estimate the MC of BBBS mentoring across all mentoring programs, 2) understand differences in MC between the two versions of the program, community-based and school-based mentoring, and 3) separate the cost of making a “new match” (i.e., recruiting, interviewing, and training mentors and youth) from that of supporting “returning”, or continuing, matches (i.e., follow-up calls and mentor support).
2. METHODOLOGY
2.1. Data
The data consisted of monthly records from January 2008 to August 2015, for a total of 92 monthly observations. Data were obtained from one BBBS program site in the Mid-Atlantic serving a primarily African American urban population; data from additional BBBS program sites that were contacted were not available. The costing perspective for this study was the program perspective, chosen because we are interested in assessing how the use of detailed program expenditure and caseload data can improve marginal cost estimates. A societal perspective, which would also consider the costs and savings to program participants, volunteers, and the community, is beyond the scope of this study. Primary cost data were obtained from financial administrators and included all BBBS program expenditures by month and type: wages, program services, match support, marketing, transportation, supplies, administrative fees, and miscellaneous (which included accounting and bank fees, affiliate dues, and insurance). All expenditure data were inflation-adjusted to 2016 US dollars using average consumer prices [30]. Since the program acquired labor and capital at competitive market prices, the expenditures were assumed to reflect the actual cost of these inputs.
Our analysis takes the perspective of a BBBS program manager who is interested in the financial cost. BBBS programs focus on financial costs and do not pay volunteers; hence, volunteer time was assigned a price of $0 from the BBBS decision-maker’s perspective. We note that, from a societal perspective, volunteer time does have an opportunity cost, and a model of economic costs built for a different purpose than ours would properly include the cost of volunteer time.
Data also included an aggregated count of non-identifiable new youth matches from 2005 to 2016, with match entry and exit dates and mentoring program (CBM or SBM). As the data were existing, non-identifiable program records, these analyses were exempt from human subjects requirements. Youth socio-demographic data were not available, so disaggregating results by demographic characteristics was not possible.
This costing analysis complies with the consolidated health economic evaluation reporting standards (CHEERS) checklist, see appendix A.
2.2. Descriptive analysis
To answer our research questions, we first conducted descriptive and time-trend analyses of both monthly expenditures and youth caseload counts from January 2008 to August 2015, the 92-month period for which all the data were available. The total number of youth active (i.e., enrolled and receiving mentoring) in a given month (e.g., March 2014) was the sum of youth records with entry dates on or before that month and exit dates on or after that month. Match length was the number of months between entry and exit dates. A microsimulation of the BBBS youth population flow was developed using data on entry and exit dates, reason for exit, and match length from all 2005–2016 records to derive the distribution of match lengths by program type and reason for exit. The appropriate distribution for each category of client from the most recent period (i.e., the last four years) was used to impute (i.e., fill in) the missing exit data for youth who had not yet exited the program at the time of the study. Match length had to be imputed for 35%, 28%, 17%, 11%, and 6% of records for 2015, 2014, 2013, 2011, and 2010, respectively, and for 3%–0% of records from prior years. Imputed data were used only in the match length trend analysis, not in the marginal cost regression analysis. Charts and tabulations over time of monthly expenditures, total active youth, and match length for new youth were cross-checked with the program administrators to validate the accuracy of the data.
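The active-caseload and match-length calculations described above can be sketched as follows (the records below are hypothetical; actual BBBS records also carried program type and reason for exit):

```python
from datetime import date

# Hypothetical match records: (entry_date, exit_date).
records = [
    (date(2014, 1, 15), date(2014, 6, 1)),
    (date(2013, 11, 3), date(2015, 2, 28)),
    (date(2014, 4, 20), date(2014, 4, 25)),
]

def active_in_month(records, year, month):
    """A youth is active in a month if the entry date falls on or before
    that month and the exit date falls on or after it (Section 2.2 rule)."""
    return sum(
        1
        for entry, exit_ in records
        if (entry.year, entry.month) <= (year, month) <= (exit_.year, exit_.month)
    )

def match_length_months(entry, exit_):
    """Match length = number of months between entry and exit dates."""
    return (exit_.year - entry.year) * 12 + (exit_.month - entry.month)

print(active_in_month(records, 2014, 3))  # 2: the first two matches span March 2014
```

Summing such monthly counts over all records yields the monthly caseload series used in the trend analysis and in the regressions below.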
2.3. Statistical strategy
Ordinary least squares regression analysis was applied to the 92 months of data. In this cost analysis the dependent variable was total program expenditure each month and the principal independent variable was total youth each month receiving mentoring. Our approach followed the method of cost function analysis originally applied to hospital cost functions [31]. Other control variables that could be associated with changes to program costs were included in the model, such as season (to control for differences in expenditure due to seasonal changes) and year effects (to control for differences in expenditures due to secular trends, such as the cost of living in the area). This analysis generates estimates of the marginal cost for one month of mentoring. The following estimation equations were used:
Equation 1: TotalCost_t = β0 + β1(Total_youth_t) + β2(Controls_t) + ε_t

Equation 2: TotalCost_t = γ0 + γ1(New_t) + γ2(Returning_t) + γ3(Controls_t) + ε_t
where TotalCost_t is total expenditure in the t-th month, β0 is a constant, Total_youth_t is the total BBBS caseload receiving mentoring in the t-th month, β1 is interpreted as the marginal cost for one month of mentoring per additional youth in the caseload, and β2 is a vector of coefficients on control variables including year and season. Season was captured by three dummy variables, fall (Sept.-Nov.), winter (Dec.-Feb.), and spring (Mar.-May), with summer (Jun.-Aug.) omitted as the base for comparison. To produce estimates by program (i.e., CBM and SBM), we ran models that varied from Equation 1 by including variables measuring the total youth enrolled in each of these mentoring programs instead of a single variable for the total number of BBBS youth (see Appendix B for Equation 1b). Likewise, to separate the cost of a new match month from a returning month of mentoring, Equation 2 includes two output parameters, “New” and “Returning” youth in the t-th month, instead of a single parameter for the total number of BBBS youth; γ1 and γ2 are interpreted as the marginal cost for the first month and each returning month of mentoring, respectively, per additional youth in the caseload, and γ3 is the vector of coefficients on the control variables. We also produced both a new and a returning cost estimate for each mentoring program by including variables for the counts of new matches and returning youth by CBM and SBM, respectively (see Appendix B for Equation 2b). Parameters βi and γi can be estimated when the variation between these outputs and total expenditure is large enough to differentiate their values; in that case the data produce statistically significant estimates with 95% confidence intervals (C.I.).
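The estimation of Equation 1 can be sketched with ordinary least squares on synthetic data (all values below are simulated for illustration only; the actual analysis used the program's 92 months of expenditure records):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 92  # 92 monthly observations, as in the study

# Synthetic data: a true MC of $80 per youth-month, a $60,000 baseline,
# a year trend, and random noise (values are illustrative assumptions).
youth = rng.integers(500, 1600, size=T).astype(float)
year = np.repeat(np.arange(8), 12)[:T].astype(float)
season = np.tile(np.repeat(np.arange(4), 3), 8)[:T]  # 4 seasons of 3 months
cost = 60_000 + 80 * youth + 5_000 * year + rng.normal(0, 2_000, T)

# Design matrix: constant, total youth, year, and three season dummies
# (the fourth season omitted as the comparison base, as in the paper).
X = np.column_stack([
    np.ones(T),
    youth,
    year,
    (season == 0).astype(float),
    (season == 1).astype(float),
    (season == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
print(round(beta[1], 1))  # beta_1 recovers a value close to the true MC of 80
```

The coefficient on the caseload variable, β1, is the monthly MC estimate; the year and season terms absorb secular and seasonal variation, as described above.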
While coefficients βi, and γi are MC estimates for one month of mentoring, the MC for the total duration of enrollment is estimated using the following equations:
Equation 3a: MC_match = β1 × L

Equation 3b: MC_match = γ1 + γ2 × (L − 1)

where L is the match length in months.
Equation 3a is the product of the monthly marginal cost, β1, and the match length in months, L. For the models with two outputs, Equation 3b sums the cost of the first (“new match”) month, γ1, and the cost of the remaining L − 1 returning months at γ2 per month, to estimate the MC for the total duration of enrollment.
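As a worked check of these formulas (using the rounded point estimates later reported in Table 2; the helper functions below are illustrative only):

```python
def mc_total_single(beta1: float, match_len: float) -> float:
    """Equation 3a: total MC = monthly MC x match length (in months)."""
    return beta1 * match_len

def mc_total_two_part(gamma1: float, gamma2: float, match_len: float) -> float:
    """Equation 3b: one 'new match' month plus (L - 1) 'returning' months."""
    return gamma1 + gamma2 * (match_len - 1)

# Reproducing the one-year estimate for both programs (Table 2, Equation 2):
# $471.60 for the match month plus 11 returning months at $68.83 each.
one_year = mc_total_two_part(471.60, 68.83, 12)
print(round(one_year, 2))  # 1228.73, matching the table's one-year total
```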
We examined scatter plots to identify outliers and assess regression functional form, and we tested the robustness of results by running multiple specifications. For example, we added an interaction term between season and total active youth; we used month dummies instead of season dummies to improve data alignment and smooth the cycles; we used year fixed effects; and we lagged expenditures by one quarter or one year. We also varied total expenditure by excluding costs from different cost categories (e.g., wages, marketing, transportation, supplies). Specification tests included a Ramsey test for model specification errors or omitted-variables bias, tests for multicollinearity between specific variables, and tests for heteroscedasticity [32–34]. The Akaike information criterion was used to determine the best-fitting specification.
3. Results
3.1. Overview of Youth Caseload by Program
Table 1 shows a summary of the monthly BBBS data during the study period. On average, 963 youth received mentoring services each month. Of this total, 37 were new youth and the remaining 926 were returning youth. Of the new youth, 25 were enrolled in CBM and 12 in SBM programs; of the returning youth, 693 were enrolled in CBM and 233 in SBM programs. The overall number of mentors interviewed was 1.9 times the number of new matches. (Each potential mentor is interviewed to determine whether he or she is a good fit for a prospective mentee.) The ratio of interviews to matches was higher for CBM (2.1) than for SBM (1.4), suggesting more resources are needed to produce a CBM match. The average match length was 19 months (i.e., 1.6 years); for CBM it was 20 months and for SBM 11 months. Note that these match length values are the average of 92 monthly averages.
Table 1.
Summary of BBBS Monthly Program Data

Monthly data between Jan. 2008 and Aug. 2015, community-based mentoring (CBM) and school-based mentoring (SBM).

| Program Data | | Monthly Average | Std. Dev. | Monthly Min. | Monthly Max. |
|---|---|---|---|---|---|
| Total Youth Caseload | Both programs | 963.0 | 355.3 | 497.0 | 1,561.0 |
| | CBM | 718.7 | 253.6 | 384.0 | 1,137.0 |
| | SBM | 244.3 | 105.9 | 95.0 | 424.0 |
| New Youth Matched | Both programs | 37.3 | 24.2 | 10.0 | 131.0 |
| | CBM | 25.5 | 11.0 | 7.0 | 48.0 |
| | SBM | 11.8 | 18.4 | 0.0 | 90.0 |
| Returning Youth | Both programs | 925.7 | 342.4 | 482.0 | 1,502.0 |
| | CBM | 693.1 | 246.0 | 369.0 | 1,090.0 |
| | SBM | 232.6 | 101.6 | 88.0 | 412.0 |
| Mentor Interviews | Both programs | 72.0 | 32.3 | 17.0 | 161.0 |
| | CBM | 54.8 | 19.6 | 16.0 | 105.0 |
| | SBM | 17.1 | 25.6 | 0.0 | 106.0 |
| Match Length (in months) | Both programs | 18.9 | 3.1 | 13.5 | 40.4 |
| | CBM | 20.0 | 3.4 | 13.5 | 28.1 |
| | SBM | 10.6 | 8.5 | 0.0 | 40.4 |
| Program Cost (inflation-adjusted to 2016 USD) | Total Cost from 1/08 to 8/15 (TC) | 211,318.9 | 40,138.5 | 92,575.0 | 345,180.0 |
| | Average Cost per Month (AC) | 250.6 | 105.5 | 127.5 | 594.2 |
| | Total Variable Cost (TVC)* | 190,530.4 | 37,872.7 | 81,126.0 | 292,253.0 |
| | Average Variable Cost per Month (AVC)* | 222.1 | 84.1 | 120.9 | 510.9 |
Note: Data are averages over 92 months of actual program data, including all 12 months for each year from 2008 through 2014, except 2015, which had monthly data only through August. * TVC and AVC exclude fixed costs (e.g., equipment, rent, and utilities).
Figure 1 provides details on the BBBS youth caseload and match length over the years of data. Both new and returning youth caseloads for both CBM and SBM reached a peak around mid-2010 and steadily declined until late 2013, after which they were fairly stable for the remainder of the study period. The caseload data counted every youth in the program every month during the study period, so no estimates or assumptions about how length of stay affected caseload numbers were needed. Program staff reported that the sharp decline in the youth caseload after mid-2010 was driven by large reductions in program funding. These financial limitations forced the program to reduce its caseload, partially by closing returning matches of less than two years’ duration. Figure 1 also shows the regular seasonal oscillations characteristic of the school year in the SBM outputs. Comparing match lengths between the community- and school-based programs, for CBM the match length was 1–12 months for 30% of the caseload, 13–24 months for 40%, and longer than 24 months for the remaining 30%. For SBM, the match length was 1–12 months for 45% of the caseload, 13–24 months for 40%, and longer than 24 months for the remaining 15%.
Figure 1.
BBBS Program New Youth by Month
Note: This figure shows the monthly time trend of the total BBBS program youth caseload between January 2008 and August 2015, by mentoring program (community-based vs. school-based mentoring) and by match type (new youth matches vs. returning youth matches). The bottom two time trends, with solid circular and triangular data points, are new youth matches. The top two time trends, with open circular and triangular data points, are returning youth (total monthly youth caseload minus new youth matches). The decreasing trend in both new and returning youth is consistent with decreased program funding after 2010.
3.2. Overview of Monthly Program Expenditures
The last section of Table 1 summarizes the monthly total cost (TC) and average cost per month (AC) over the study period, at $211,319 and $251, respectively. Removing fixed costs (e.g., equipment, rent, and utility costs) produced similar values for total variable cost (TVC) and average variable cost (AVC), $190,530 and $222, respectively.
The pie graph in Figure 2a shows that the largest portion of program expenditures was wages (76%, S.D.: $33,705), followed by rent with utilities (5%, S.D.: $3,106), miscellaneous (5%, S.D.: $3,706; including telecommunications, conferences and meetings, copier services, and others), and administration fees (4%, S.D.: $10,655; including accounting and bank fees, affiliate dues, and insurance). Match support (including contract services such as website modification, a match support contractor, and volunteer recruitment), office supplies, marketing, and other categories were each 2% or less. The bar graph in Figure 2b shows the 92 monthly total program expenditures from the first month of data, January 2008, to the last, August 2015. Monthly total cost increased by about 9% annually over the first two years, reaching a peak around March 2009, then decreased by about 12% annually over the following two years, and finally increased again by about 10% annually over the last two years. Each column shows the proportion of expenditure on wages versus all other costs.
Figure 2a.
Percent Distribution of BBBS Program Expenditures by Cost Category
Note: This pie graph describes the percent distribution of average monthly BBBS total expenditure by cost category according to expenditure records from the BBBS study site and study period, Jan. 2008 to Aug. 2015. The percent distribution of expenditures by cost category was consistent over the study period.
Figure 2b.
BBBS Program Expenditure by Month and Cost Category
Comparing the time trends of monthly total expenditure (Figure 2b) and caseload (Figure 1) shows a decline in both during 2010–2013; total expenditures then recovered slightly, while the youth caseload stopped decreasing but did not grow. The 2008–2015 average monthly rates of change for expenditures and youth caseload were 3.8% and −0.8%, respectively.
3.3. Marginal Cost Results
The MC of adding one youth to the BBBS caseload, regardless of mentoring program type, was $80 (99% significance level, C.I.: $47 to $112) for one month of mentoring and $1,503 per treated program participant for the average match length (19 months) (see the first row of Table 2). The analysis also showed that, during the study period, total program expenditure increased annually by an average of 5% ($10,170; 99% significance level, C.I.: $4,552 to $14,263; value not shown in tables) due to changes in program procedures and real costs. Seasonal dummies were not statistically significant in any of the analyses.
Table 2.
Marginal Cost Estimates

| Equation | Type of Mentoring | Mentoring Period | One-Month Regression Estimate | [95% Confidence Interval] | AIC | Average Match Length (Eq. 3a–b) | One-Year Match Length (Eq. 3a–b) |
|---|---|---|---|---|---|---|---|
| 1 | Both programs | All months | $79.64 *** | [47.19 to 112.08] | 2205 | $1,503.22 | $955.68 |
| 1b | CBM | All months | $115.60 * | [−1.51 to 232.63] | 2207 | $2,315.69 | $1,387.20 |
| | SBM | All months | - | [−305.86 to 287.80] | | $841.39 † | $955.68 † |
| 2 | Both programs | New match month | $471.60 ** | [23.91 to 919.40] | 2204 | $471.60 | $471.60 |
| | | Returning months | $68.83 *** | [33.91 to 103.76] | | $658.35 | $757.13 |
| | | Total | | | | $1,129.95 | $1,228.73 |
| 2b | CBM | New match month | $1,398.00 *** | [403.77 to 2,391.31] | 2204 | $1,398.00 | $1,398.00 |
| | | Returning months | $100.00 * | [−14.62 to 214.62] | | $1,903.19 | $1,100.00 |
| | | Total | | | | $3,301.19 | $2,498.00 |
| | SBM | New match month | - | [−689.06 to 592.45] | | $471.60 | $471.60 |
| | | Returning months | - | [−315.36 to 268.42] | | $727.18 | $757.13 |
| | | Total | | | | $1,198.78 † | $1,228.73 † |
Note: The Akaike information criterion (AIC) is an estimator of the relative quality of statistical models; the lower the number, the better the model fit. Asterisks indicate the estimate’s level of statistical significance: *** greater than 99%, ** greater than 95%, * greater than 90%; “-” means the value was not statistically significant.
† The one-month cost estimates for SBM were not statistically significant. The average and one-year match length estimates for SBM are conservative estimates assuming the one-month values are at most equal to those from the non-disaggregated regressions (i.e., SBM values in Equation 1b use the “one month” estimate from Equation 1, and those in Equation 2b use the “one month” estimates from Equation 2).
The second row in Table 2 shows statistically significant estimates (after controlling for both the CBM and SBM youth monthly caseloads) for the monthly MC of adding one more youth for one more month of CBM. The estimates of the monthly MC for SBM were not statistically significant, which suggests that most of the variation in costs and caseload is driven by changes in CBM. Plots of monthly costs against youth counts in SBM and CBM (not shown here) also suggested that the temporal variation of the SBM monthly caseload was low and that most variation was due to CBM caseloads. Statistically significant estimates for SBM may therefore be obtainable in datasets with a larger sample size. The MC of adding one youth to the CBM caseload for one month was $116 (90% significance level, C.I.: −$1.5 to $233), and for the average match length (20 months) it was $2,316. The statistically insignificant results for SBM mean we cannot reject a zero, or more likely small, marginal cost for SBM. The wide confidence interval for SBM (C.I.: −$306 to $288) indicates that the true cost was at most $288; thus, we lack precision around a very small SBM MC estimate. We can conservatively infer that the SBM cost is below the upper C.I. bound and less than the CBM cost (as suggested by BBBS program staff), which gives it an upper bound of $116. Assuming that the SBM monthly MC is at most equal to the cost from the combined mentoring programs analysis in Table 2, Equation 1 (i.e., $80), a conservative SBM estimate for the average match length (11 months) was $841.
The analyses separating the monthly MC of a returning youth from that of a new match showed lower AIC values (2204 vs. 2205 or 2207) and a better fit for the costing model (Table 2, Equations 2–2b) than the prior results aggregating both types of youth (Table 2, Equations 1–1b). Combining both mentoring program types, the MC of adding one youth to the BBBS caseload was $472 (95% significance, C.I.: $24 to $919) for the first month (the “match month”) and $69 (99% significance, C.I.: $34 to $104) for each subsequent “returning” month.
Further disaggregating the “new” vs. “returning” month analysis by SBM or CBM, the MC of adding one youth to the CBM caseload was $1,398 (99% significance level, C.I.: $404 to $2,391) for the first month (the “match month”) and much lower, $100 (90% significance level, C.I.: −$15 to $215), for each subsequent “returning” month. Based on this better-fitting model, the MC for the CBM average match length (20 months) was $3,301 per treated program participant. The respective “new” and “returning” monthly MC estimates for SBM, also part of Equation 2b, were not statistically significant. As in Equation 1b, we cannot reject that the cost may be zero or close to zero.
4. DISCUSSION
4.1. Summary of results
Statistical analysis using regression models showed that the cost of providing one youth with BBBS mentoring varies depending on the costing model and the type of mentoring program. The costing model that accounted for differences in program expenditures between the first (“match”) month and the following (“returning”) months of mentoring scored better on the statistical model-fit test and showed higher marginal cost estimates than the model treating expenditures for all mentoring months the same. The former costing model most likely fit better because it more closely reflects the way the BBBS program allocates resources to each match. In particular, the model accounts for the large differences in expenditure levels between the first and returning months of mentoring, explained by the significant initial staff time allocated to recruiting and screening “Bigs” (big brothers and sisters, or mentors), finding suitable matches for “Littles” (little brothers and sisters, or mentees), and recruiting Littles and obtaining parental permission. Likewise, after controlling for the high cost of the first match month, the model shows there are significant expenditures during returning months of mentoring, consistent with the program’s efforts to provide continued support to mentors, conduct monthly follow-up calls, and monitor match activities.
We also found statistically insignificant results for SBM. One interpretation of the lack of significant results is that the marginal cost of SBM could be zero or close to zero. However, we know that the program uses staff time to make and support matches, so it is unlikely that the cost is zero, particularly for a new match month. Based on the upper bounds of the confidence intervals, the SBM cost may be $593 or less for the first month and $268 or less for each subsequent returning month. The most conservative (highest-cost) estimate for SBM assumes that the “new” and “returning” estimates from the “Both programs” analysis in Table 2, Equation 2, which fall within SBM’s confidence intervals, are the MC estimates for SBM. In this case, the MC for the SBM average match length (11 months) would be $1,199 per treated program participant. The same table shows the equivalent MC estimates for only one year of mentoring.
The marginal cost of a year of CBM was also significantly higher than that of SBM. The difference can be explained by the additional staff time and program resources needed to create a CBM match. The CBM program targets youth in the community who are less likely to transition successfully into adulthood; compared to youth in SBM, these youth are more likely to perform poorly in school, have misdemeanors, or have other poor behavioral outcomes. BBBS must invest more resources in recruiting and screening Bigs willing to mentor youth outside school settings, particularly to match older mentees, and in supporting and following up with mentors or parents who may move and cannot be located through schools as in SBM. Higher costs for CBM are consistent with data showing that for every successful Big-and-Little match there were 2.1 Big interviews in the CBM program compared to 1.4 in SBM.
Cost estimates from this analysis may also help BBBS programs assess how costs change as they look to enhance youth behavioral outcomes (i.e., their return on investment) or scale up. For example, given that labor accounted for the largest portion of costs, changes to the amount of time staff devote to mentoring activities will change costs. Program coordinators from the study site reported that while much emphasis had been placed on allocating resources to making new matches, a process that is long and requires many interviews per successful match, a new national BBBS model plans to shift resources, particularly staff time, toward better supporting returning youth. New match support activities will include quarterly follow-ups with all key BBBS actors (i.e., mentors, mentees, and mentee guardians) during the second and later years of the match. This change will likely increase the marginal cost of returning months. However, the increased costs may be offset by larger returns on investment from improvements in youth behavioral outcomes. Changes to the length of match relationships will also affect costs. Evidence from prior literature suggests that match relationships lasting a year or longer produce better youth outcomes than matches that terminate earlier [9]. The distribution of match lengths in this study revealed large variation. Programs looking to lengthen matches would increase the cost per participant but could also improve youth outcomes. To understand how the cost per treated program participant changes with match length, the parameters in the study equations can be adjusted from the mean match length to any other length.
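That adjustment amounts to costing a match as one first month plus (months − 1) returning months. The per-month figures below are not taken directly from Table 2; they are back-solved from two totals reported in this paper ($1,199 for the 11-month SBM average match and $1,503 for the 19-month overall average match), under the simplifying assumption that both totals share the same first-month and returning-month costs, and are intended only to show the mechanics.

```python
def total_mc(mc_new: float, mc_returning: float, months: int) -> float:
    """Marginal cost of one match: first month plus (months - 1) returning months."""
    return mc_new + (months - 1) * mc_returning

# Back-solve the two per-month parameters from two reported totals:
#   mc_new + 10 * mc_ret = 1199   (11-month average match)
#   mc_new + 18 * mc_ret = 1503   (19-month average match)
mc_ret = (1503 - 1199) / (19 - 11)   # 38.0 per returning month
mc_new = 1199 - 10 * mc_ret          # 819.0 for the first month

# Re-cost a hypothetical policy that extends the average match to 24 months:
print(total_mc(mc_new, mc_ret, 24))  # 819 + 23 * 38 = 1693.0
```

The same two-line calculation lets a program trade off longer matches (more returning months) against more new enrollments (more expensive first months).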
4.2. Prior Evidence
Compared to recent literature, the unit cost from this study for SBM ($1,199) is within the range of costs from prior studies ($437–$1,672) [12, 13, 15], after controlling for the effect of both CBM and SBM caseloads on expenditures. However, the CBM estimate from this study ($3,301) is substantially higher than those of prior studies ($1,285–$1,676), after the same controls. The difference between the prior studies and our estimates may be attributed to differences in the methods used to estimate the unit cost. As noted in the introduction, several prior studies used average estimates rather than MC estimates to value the cost of scaling the program by one more youth, and other studies reporting MC estimates used ingredients-based costing methods, which are less robust than regression-based analysis. The difference may also be explained by the number of mentoring months included in the estimates, as most prior studies report annual estimates. However, for the CBM program, both our annual ($2,498) and average-match-length MC estimates (see Table 2), as well as our average cost (see Table 1; $250 × 12 months = $3,000), are substantially higher than the literature’s annual values. The cost difference may arise from differences in program staff-to-youth ratios and/or cost of living between our study site in the Mid-Atlantic and the prior literature’s study sites (mostly in Washington State). However, the median hourly wages for community and social service jobs in the study site and Washington State (a proxy for cost of living) are similar ($22.77 vs. $22.92 per hour, respectively) [35], and program staff reported that BBBS wages at this study site were on the lower end of national BBBS averages. Thus, given that most program expenditures are wages, the cost difference likely arises from differences in staff-to-youth ratios.
4.3. Opportunities for Cost Saving
The time trends in the data and the comparison of AC to MC suggest that during the last half of the 8-year study period the program could have been more efficient, either by adding more youth participants without incurring additional fixed and staffing costs or by cutting costs. For example, the time trends showed that the monthly youth caseload decreased significantly more over time than expenditures did. Given that most of the cost is wages, this means that, particularly over the last four years, the program kept staff expenditures roughly constant regardless of changes to the youth caseload. In fact, the AC increased by 85% between 2008 and 2015 and, for the last four years of the study period (2012 to 2015), was larger than the MC. Based on economic theory, AC values larger than MC suggest that the program was running at less than full capacity and thus could have expanded to lower the AC, consistent with economies of scale.
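The economies-of-scale logic can be made concrete with a stylized cost function. Assuming a fixed monthly cost plus a constant marginal cost per mentor-month (the figures below are illustrative assumptions, not the study’s estimates), average cost always exceeds marginal cost and falls toward it as the caseload grows:

```python
FIXED_COST = 20_000.0   # hypothetical fixed monthly cost (staff, rent, overhead)
MC = 80.0               # hypothetical marginal cost per mentor-month

def average_cost(q: int) -> float:
    """Average cost per youth when serving q youth in a month."""
    return (FIXED_COST + MC * q) / q

# AC exceeds MC at every caseload and shrinks toward MC as q grows,
# so adding youth lowers the cost per youth served.
for q in (100, 200, 400):
    print(q, average_cost(q))  # 280.0, 180.0, 130.0
```

Whenever observed AC sits above MC, as in 2012–2015 here, the fixed costs are being spread over too few youth, which is the sense in which the program was below full capacity.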
However, it may be difficult for programs to reduce staffing costs while also maintaining highly trained recruiters and match support staff throughout the year. For example, as observed in the time trends and as reported by program staff, after 2011 the program underwent structural changes that required caseload reductions and cost cuts. These early staff cuts could have resulted in a loss of staff expertise, reducing the program’s ability to maintain the same staff-to-youth ratios as in previous years. One opportunity for cost saving is outsourcing more of the human resources operations for recruiting Bigs, on an as-needed basis, to firms that specialize in these tasks, particularly for CBM matches, which require about twice as many interviews per match as SBM matches. At the same time, an opportunity for maximizing resources (i.e., expanding when the program is running at less than full capacity) may be to allocate more resources to retaining returning youth (i.e., investing in the quality and duration of matches) rather than to enrolling new youth, which is more expensive than serving returning youth.
4.4. Limitations
This study was limited by having data from only one BBBS site; thus, caution should be used in generalizing these estimates to other BBBS program sites. BBBS programs in urban areas with similar caseload sizes, costs of living, and youth from similar socioeconomic backgrounds may have similar marginal costs. This is also the first BBBS cost study to use marginal cost estimation techniques, which are an improvement over the estimates currently available in the literature. Our estimates exclude the time cost of volunteers. While this component of program cost is important and should be included from a societal perspective [17], its estimation is beyond the scope of this study.
Lastly, the data from this study site had a downward trend in costs and caseload over time due to large financial cuts, which may or may not be characteristic of expenditure fluctuations at other BBBS programs across the country. However, this fluctuation enriches the statistical analysis of marginal cost estimates by allowing more month-to-month variability between costs and caseload throughout the study period, and it is this variation in the dataset that allows regression analysis to capture the marginal change in costs for an additional youth. Thus, the downward trend, as opposed to a constant trend, helped the analysis evaluate costs at different levels of output and draw estimates with 95% confidence intervals for CBM and the average mentoring program, but the data did not vary enough to draw estimates for SBM.
5. CONCLUSIONS
This study uses statistical analysis to derive marginal cost estimates of scaling up a BBBS program by one more youth. Results show that the cost per treated program participant is $1,198.78 per new enrollee in SBM and $3,301.19 per new enrollee in CBM, assuming the enrollment lasts as long as the program average. This new evidence may help youth program managers, donors, and policy makers better understand the variation in youth exposure times, the cost of BBBS scale-up, and opportunities for cost savings.
Highlights:
This study provides new evidence on the cost per Big Brothers Big Sisters of America (BBBS) participant using statistical analysis. The cost per youth for one year, or for the average length, of community-based mentoring is significantly larger than estimates from prior literature. The data also reveal opportunities for cost saving.
Using a marginal cost approach to evaluate the cost of serving one additional youth in the Big Brothers Big Sisters of America (BBBS) program offers a significant improvement over previous methods based on average total cost estimates.
The marginal cost per treated program participant in community-based programs is about three times the cost in school-based programs.
The marginal costs per treated program participant for both the first month of mentoring and each additional returning month of mentoring are significant but the former is substantially larger.
This cost analysis can better inform policy makers and donors on the cost of expanding the scale of local BBBS programs as well as suggest opportunities for cost savings.
Acknowledgments
This research was supported by the National Institute on Minority Health and Health Disparities (grant P20MD000198). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Abbreviations:
- BBBS: Big Brothers Big Sisters of America
- CBM: Community-based mentoring
- SBM: School-based mentoring
- AC: Average cost
- MC: Marginal cost
- BCR: Benefit-to-cost ratio
- TC: Total cost
- SD: Standard deviation
- CI: Confidence interval
APPENDIX A
Consolidated Health Economic Evaluation Reporting Standards – CHEERS Checklist
| Section/Item | Item No. | Recommendation | Reported on Page No./Line No. |
|---|---|---|---|
| Title and abstract | | | |
| Title | 1 | Identify the study as an economic evaluation or use more specific terms such as “cost-effectiveness analysis”, and describe the interventions compared. | Page No. 1, Line No. 1 |
| Abstract | 2 | Provide a structured summary of objectives, perspective, setting, methods (including study design and inputs), results (including base case and uncertainty analyses), and conclusions. | Part of different document |
| Introduction | | | |
| Background and objectives | 3a | Provide an explicit statement of the broader context for the study. | Page No. 2–6, Line No. 29–129 |
| | 3b | Present the study question and its relevance for health policy or practice decisions. | Page No. 6, Line No. 120–129 |
| Methods | | | |
| Target population and subgroups | 4 | Describe characteristics of the base case population and subgroups analysed, including why they were chosen. | Page No. 7, Line No. 135 |
| Setting and location | 5 | State relevant aspects of the system(s) in which the decision(s) need(s) to be made. | Page No. 7, Line No. 135 |
| Study perspective | 6 | Describe the perspective of the study and relate this to the costs being evaluated. | Page No. 7, Line No. 139 |
| Comparators | 7 | Describe the interventions or strategies being compared and state why they were chosen. | Page No. 7, Line No. 179–211 |
| Time horizon | 8 | State the time horizon(s) over which costs and consequences are being evaluated and say why appropriate. | Page No. 7, Line No. 162–164 |
| Discount rate | 9 | Report the choice of discount rate(s) used for costs and outcomes and say why appropriate. | N/A |
| Choice of health outcomes | 10 | Describe what outcomes were used as the measure(s) of benefit in the evaluation and their relevance for the type of analysis performed. | Page No. 9, Line No. 181 |
| Measurement of effectiveness | 11a | Single study-based estimates: Describe fully the design features of the single effectiveness study and why the single study was a sufficient source of clinical effectiveness data. | Page No. 9, Line No. 180 |
| | 11b | Synthesis-based estimates: Describe fully the methods used for identification of included studies and synthesis of clinical effectiveness data. | N/A |
| Measurement and valuation of preference based outcomes | 12 | If applicable, describe the population and methods used to elicit preferences for outcomes. | N/A |
| Estimating resources and costs | 13a | Single study-based economic evaluation: Describe approaches used to estimate resource use associated with the alternative interventions. Describe primary or secondary research methods for valuing each resource item in terms of its unit cost. Describe any adjustments made to approximate to opportunity costs. | Page No. 9, Line No. 188–223 |
| | 13b | Model-based economic evaluation: Describe approaches and data sources used to estimate resource use associated with model health states. Describe primary or secondary research methods for valuing each resource item in terms of its unit cost. Describe any adjustments made to approximate to opportunity costs. | N/A |
| Currency, price date, and conversion | 14 | Report the dates of the estimated resource quantities and unit costs. Describe methods for adjusting estimated unit costs to the year of reported costs if necessary. Describe methods for converting costs into a common currency base and the exchange rate. | Page No. 7, Line No. 145 |
| Choice of model | 15 | Describe and give reasons for the specific type of decision-analytical model used. Providing a figure to show model structure is strongly recommended. | Page No. 9, Line No. 182–183 |
| Assumptions | 16 | Describe all structural or other assumptions underpinning the decision-analytical model. | Page No. 10, Line No. 212–232 |
| Analytical methods | 17 | Describe all analytical methods supporting the evaluation. This could include methods for dealing with skewed, missing, or censored data; extrapolation methods; methods for pooling data; approaches to validate or make adjustments (such as half cycle corrections) to a model; and methods for handling population heterogeneity and uncertainty. | Page No. 8–9, Line No. 179–211 |
| Results | | | |
| Study parameters | 18 | Report the values, ranges, references, and, if used, probability distributions for all parameters. Report reasons or sources for distributions used to represent uncertainty where appropriate. Providing a table to show the input values is strongly recommended. | Page No. 11–16, Line No. 236 & 307 |
| Incremental costs and outcomes | 19 | For each intervention, report mean values for the main categories of estimated costs and outcomes of interest, as well as mean differences between the comparator groups. If applicable, report incremental cost-effectiveness ratios. | Page No. 16–19, Line No. 309 & 307–358 |
| Characterising uncertainty | 20a | Single study-based economic evaluation: Describe the effects of sampling uncertainty for the estimated incremental cost and incremental effectiveness parameters, together with the impact of methodological assumptions (such as discount rate, study perspective). | Page No. 18, Line No. 345–358 |
| | 20b | Model-based economic evaluation: Describe the effects on the results of uncertainty for all input parameters, and uncertainty related to the structure of the model and assumptions. | N/A |
| Characterising heterogeneity | 21 | If applicable, report differences in costs, outcomes, or cost-effectiveness that can be explained by variations between subgroups of patients with different baseline characteristics or other observed variability in effects that are not reducible by more information. | N/A |
| Discussion | | | |
| Study findings, limitations, generalisability and current knowledge | 22 | Summarise key study findings and describe how they support the conclusions reached. Discuss limitations and the generalisability of the findings and how the findings fit with current knowledge. | Page No. 19, Line No. 361; Page No. 20, Line No. 464 |
| Other | | | |
| Source of funding | 23 | Describe how the study was funded and the role of the funder in the identification, design, conduct, and reporting of the analysis. Describe other non-monetary sources of support. | Part of different document |
| Conflicts of interest | 24 | Describe any potential for conflict of interest of study contributors in accordance with journal policy. In the absence of a journal policy, we recommend authors comply with International Committee of Medical Journal Editors recommendations. | Part of different document |
APPENDIX B
Below is a list of all four full regression equations tested in the analysis of marginal costs.
Equation 1: TotalCost_t = β0 + β1 Total_youth_t + ε_t

Equation 1b: TotalCost_t = β0 + β1 Total_youth_t + β2 X_t + ε_t

Equation 2: TotalCost_t = β0 + γ1 New_youth_t + γ2 Returning_youth_t + ε_t

Equation 2b: TotalCost_t = β0 + γ1 New_youth_t + γ2 Returning_youth_t + γ3 X_t + ε_t

where TotalCost_t is total expenditure in the tth month; β0 is a constant; Total_youth_t is the total BBBS caseload receiving mentoring in the tth month; New_youth_t and Returning_youth_t are the numbers of youth in their first month and in returning months of mentoring, respectively; β1 is the marginal cost of one month of mentoring per additional youth in the caseload; γ1 and γ2 are interpreted as the marginal costs of the first month and of each returning month of mentoring, respectively, per additional youth in the caseload; X_t is the vector of control variables, including year and season, with coefficient vectors β2 and γ3 (included in Equations 1b and 2b); and ε_t is the error term.
Footnotes
Declarations of interest: none
REFERENCES
1. Big Brothers Big Sisters of America. 2018 [cited 2018]; Available from: http://www.bbbs.org/history/.
2. DuBois DL, et al. How effective are mentoring programs for youth? A systematic assessment of the evidence. Psychological Science in the Public Interest, 2011. 12(2): p. 57–91.
3. DuBois D and Karcher M. Youth mentoring in contemporary perspective. The Handbook of Youth Mentoring, 2014. 2: p. 3–13.
4. Eby LT, et al. Does mentoring matter? A multidisciplinary meta-analysis comparing mentored and non-mentored individuals. Journal of Vocational Behavior, 2008. 72: p. 254–267.
5. Sipe CL. Mentoring programs for adolescents: a research summary. Journal of Adolescent Health, 2002. 31(6 Suppl): p. 251–260.
6. Big Brothers Big Sisters of America and BBBS Federation. Bigger Impact: 2022. 2017: Tampa, FL.
7. Grossman JB and Tierney JP. Does Mentoring Work? An Impact Study of the Big Brothers Big Sisters Program. Evaluation Review, 1998. 22(3): p. 403–426.
8. Herrera C, et al. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring. Child Development, 2011. 82(1): p. 346–361.
9. Grossman JB and Rhodes JE. The Test of Time: Predictors and Effects of Duration in Youth Mentoring Relationships. American Journal of Community Psychology, 2002. 30(2): p. 199–219.
10. Moodie ML and Fisher J. Are youth mentoring programs good value-for-money? An evaluation of the Big Brothers Big Sisters Melbourne Program. BMC Public Health, 2009. 9(1): p. 41–49.
11. WSIPP. Mentoring: Big Brothers Big Sisters School-Based (including volunteer costs). 2018, Washington State Institute for Public Policy.
12. WSIPP. Mentoring: Big Brothers Big Sisters School-Based (taxpayer costs only). 2018, Washington State Institute for Public Policy.
13. WSIPP. Mentoring: Big Brothers Big Sisters Community-Based (taxpayer costs only). 2018, Washington State Institute for Public Policy.
14. WSIPP. Mentoring: Big Brothers Big Sisters Community-Based (including volunteer costs). 2018, Washington State Institute for Public Policy.
15. Herrera C, et al. Making a Difference in Schools: The Big Brothers Big Sisters School-Based Mentoring Impact Study. 2007, Public/Private Ventures.
16. Fountain DL and Arbreton A. The Cost of Mentoring, in Contemporary Issues in Mentoring, Grossman JB, Editor. 1998, Public/Private Ventures: Philadelphia, PA: p. 48.
17. Grossman JB and Furano K. Making the Most of Volunteers. Law and Contemporary Problems, 1999. 62(4): p. 199–218.
18. Belfield CR. Estimating the Rate of Return to Educational Investments: A Case Study Using the Big Brothers Big Sisters Mentoring Program. 2003.
19. Aos S, et al. The Comparative Costs and Benefits of Programs to Reduce Crime. 2001, Washington State Institute for Public Policy: Olympia, WA.
20. Aos S, et al. Benefits and Costs of Prevention and Early Intervention Programs for Youth. 2004, Washington State Institute for Public Policy: Olympia, WA.
21. Miller TR and Levy DT. Cost-Outcome Analysis in Injury Prevention and Control: Eighty-Four Recent Estimates for the United States. Medical Care, 2000. 38(6): p. 562–582.
22. Greenwood PW. Cost-Effective Violence Prevention through Targeted Family Interventions. Annals of the New York Academy of Sciences, 2004. 1036(1): p. 201–214.
23. Grossman JB, Resch N, and Tierney JP. Making a Difference: An Impact Study of Big Brothers/Big Sisters (Re-issue of 1995 Study). 2000, Public/Private Ventures.
24. Washington State Institute for Public Policy. Benefit-Cost Technical Documentation. 2017: Olympia, WA.
25. Freedman M. The Kindness of Strangers: Adult Mentors, Urban Youth, and the New Volunteerism. 1993, San Francisco: Jossey-Bass.
26. WSIPP. Mentoring: Big Brothers Big Sisters Community-Based (including volunteer costs). 2017, Washington State Institute for Public Policy.
27. WSIPP. Mentoring: Big Brothers Big Sisters Community-Based (taxpayer costs only). 2017, Washington State Institute for Public Policy.
28. Anton PA and Temple J. Analyzing the Social Return on Investment in Youth Mentoring Programs: A Framework for Minnesota. 2007, Wilder Research: Saint Paul, Minnesota.
29. Washington State Institute for Public Policy. Benefit-Cost Technical Documentation. 2018: Olympia, WA.
30. IMF National Statistics Office. Inflation, average consumer prices (Index), 2017 Monthly Data. 2017 [cited 2017 March]; Available from: https://www.imf.org/external/pubs/ft/weo/2018/01/weodata/weorept.aspx?pr.x=77&pr.y=13&sy=2008&ey=2018&scsm=1&ssd=1&sort=country&ds=.&br=1&c=111&s=PCPI&grp=0&a=.
31. Grannemann TW, Brown RS, and Pauly MV. Estimating hospital costs: A multiple-output analysis. Journal of Health Economics, 1986. 5(2): p. 107–127.
32. Ramsey JB. Tests for Specification Errors in Classical Linear Least-Squares Regression Analysis. Journal of the Royal Statistical Society, Series B (Methodological), 1969. 31(2): p. 350–371.
33. Breusch TS and Pagan AR. A Simple Test for Heteroscedasticity and Random Coefficient Variation. Econometrica, 1979. 47(5): p. 1287–1294.
34. Cook RD and Weisberg S. Diagnostics for Heteroscedasticity in Regression. Biometrika, 1983. 70(1): p. 1–10.
35. Bureau of Labor Statistics. May 2017 State Occupational Employment and Wage Estimates. 2017 [cited 2018]; Available from: https://www.bls.gov/oes/current/oessrcst.htm.