Author manuscript; available in PMC: 2023 Jul 27.
Published in final edited form as: Eval Program Plann. 2020 Jun 26;82:101845. doi: 10.1016/j.evalprogplan.2020.101845

A longitudinal evaluation of government-sponsored job skills training and basic employment services among U.S. baby boomers with economic disadvantages

Sehun Oh a,*, Diana M DiNitto b, Daniel A Powers c
PMCID: PMC10372812  NIHMSID: NIHMS1908429  PMID: 32623184

Abstract

Job skills training is a cost-effective strategy for improving employment among individuals who have low income and employment barriers, but few U.S. government-sponsored employment program participants have received such training. To better understand long-term gains from job skills training, this study compared employment and earnings trajectories between program participants who received job skills training and those who received basic services only. Using data from the National Longitudinal Survey of Youth 1979, we estimated 33-year employment and earnings trajectories among U.S. baby-boomer cohorts while accounting for baseline group heterogeneity using inverse propensity score weighting. We found increases in employment rates over the life course, especially among Black women. Job skills training also increased earnings by up to 69.6 % compared to basic services only. Despite the long-term gains in employment and earnings, job skills training participation is not sufficient to address gender as well as racial and ethnic gaps in full-time employment. Findings reinforce the importance of incorporating job skills training as an essential service element of government-sponsored employment programs to improve long-term labor market outcomes among Americans with economic disadvantages.

Keywords: Job skills training, Employment, Earnings, Government-sponsored employment programs, Baby boomers, Life course perspective

1. Introduction

Job skills training is a cost-effective strategy for improving the labor market outcomes and economic well-being of populations with low incomes and employment barriers. In the United States, federal, state, and local governments have played key roles in improving the employability and economic self-sufficiency of un- and under-employed workers, including individuals receiving public assistance, by sponsoring job training programs. Such programs include voluntary programs (e.g., those developed under the Manpower Development and Training Act [MDTA, PL 87–415] and Comprehensive Employment and Training Act [CETA, PL 93–203], Job Corps, and Registered Apprenticeship) and mandatory welfare-to-work programs (e.g., the Job Opportunities and Basic Skills [JOBS] programs under the Family Support Act of 1988) (DiNitto & Johnson, 2015). For instance, CETA, which replaced MDTA in 1973 amidst economic recession and high unemployment, offered low-income youth and long-term unemployed individuals classroom instruction in basic education (e.g., high school equivalency preparation) and occupational skills (e.g., typing, keypunch), on-the-job training (e.g., automobile repair, machine tool operation), and subsidized work experience in public agencies or nonprofit organizations (Bloom & McLaughlin, 1982). Participants spent an average of 20 weeks in training and work opportunities targeted to low-skilled and entry-level jobs.

Though job training programs share a common goal of improving employability, not all program participants have had access to opportunities for accumulating the knowledge and skills necessary to achieve economic self-sufficiency. Instead, educational or training services have often been available only as a last resort to those unable to find jobs after receiving minimal initial services that provided little-to-no opportunity to acquire knowledge or skills sufficient to secure a modest-paying job. For instance, welfare-to-work programs for adults receiving cash assistance, such as the former JOBS programs, mostly offered basic educational services (e.g., adult basic education, General Educational Development [GED] preparation, and English as a second language) rather than higher education or job skills training (Bishop, 2004; Greenberg, Strawn, & Plimpton, 2000). Similarly, Workforce Investment Act (WIA, PL 105–220) programs served individuals with employment barriers using a three-tiered (core, intensive, and training) service approach, with intensive and training services available only to participants with greater employment barriers such as limited work histories and layoffs (O’Shea & King, 2001). As a result, few workers with economic disadvantages had opportunities to develop substantial knowledge and job skills, leaving them vulnerable in the labor market (Heinrich, Mueser, Troske, Jeon, & Kahvecioglu, 2013).

In contrast, government-sponsored employment programs that offer pre-employment education and training and teach specific job skills through classroom training, on-the-job training (OJT), and work experience have been found effective in improving employability and economic self-sufficiency (Greenberg, Michalopoulos, & Robins, 2003, 2006). For example, OJT, which provides one-on-one training at worksites, has helped participants with labor market disadvantages (e.g., racial and ethnic minorities with low incomes) improve their employability, not only by teaching occupation-specific job skills but also by addressing employment barriers such as limited social networks. Because employers rely heavily on insider referrals to fill new job openings (Braddock & McPartland, 1987; Fuller, 2017), participating in training activities at worksites increases trainees’ access to information about employment opportunities. At the same time, employers are able to acquire more information about participants, which may reduce the chances of employers making decisions based on implicit bias and gender, racial, and ethnic stereotypes that adversely affect applicants with low incomes and applicants of racial and ethnic minority status, who more often lack references and prior work experience (Fitzgerald, 2000; Fuller, 2017).

In their meta-analysis of earnings gains from 31 evaluations of 15 government-sponsored employment programs operated between 1964 and 1998, Greenberg et al. (2003) found that OJT produced the largest gains in annual earnings during the first several years after program completion, ranging from $1443 to $1644 for women and $312 to $874 for men (in 1999 dollars). Classroom job skills training also led to comparable earnings gains ($1295 to $1787 for women and $26 to $636 for men), but OJT overall resulted in larger gains across regions with varying local labor market conditions (Greenberg et al., 2003). Although these earnings gains may seem modest, they were large enough to compensate for government-sponsored training costs over the long run (which averaged $7080 for men and $6591 for women). Non-White participants had greater earnings gains than White participants, suggesting potential advantages for racial and ethnic minorities with low incomes in addressing employment barriers.

Despite the potential gains in labor market outcomes, study follow-up periods have been relatively short, making it unclear whether these gains would persist, and if they did, what these trajectories would look like over participants’ life-course. Understanding longer-term program effects is important in assessing the overall efficiency of job training programs compared to incurred costs, including program costs and participants’ personal costs (e.g., forgone earnings, time, and effort incurred during the training). The issue of efficiency is not only relevant to making decisions about program funding but also to individuals’ decisions about program participation as rational choices (Becker, 2009; Ehrenberg & Smith, 2017), potentially affecting both a program’s sustainability and participant recruitment.

1.1. Long-term impacts of job skills training on labor market outcomes

Views about the long-range earnings trajectories of populations with economic disadvantages who participated in employment training programs are conflicting, and few studies provide insights into these programs’ long-term effects. One hypothesis suggests that earnings gaps between program participants who received job skills training and those who received basic services but not job skills training gradually converge as the latter group gains work experience that narrows the skills gap (Greenberg et al., 2003). Another hypothesis suggests that gains persist over time, as job skills training and subsequent employment are likely to lead participants to further employment and training opportunities (Dannefer, 2003; Merton, 1968; O’Rand, 1996). This view is consistent with human capital theory and life-course theory’s cumulative advantage and disadvantage mechanism in that knowledge and skills not only increase individual productivity and earnings potential in the short term but also facilitate future learning, leading to improved long-term earning potential (Ehrenberg & Smith, 2017; Oreopoulos & Salvanes, 2009).

Resolving such debates about the effects of job skills training requires an empirical assessment of the long-term employment and earnings outcomes of comparable groups receiving basic services and job skills training. Experimental evaluation studies have generally followed participants for no more than five years after program completion (Greenberg et al., 2003, Greenberg et al., 2006). Though Schochet (2018) examined the Job Corps’ impacts on participants over a 20-year period, the program’s focus on job training as a primary service element does not allow for testing the gains from job skills training compared to basic services. The nonexperimental evaluation studies that have been conducted using administrative data or national surveys also have substantial limitations such as lack of comparable treatment and comparison groups or contextual factors needed to account for group heterogeneity. They also usually lack the longitudinal data to assess labor market outcomes over participants’ careers. Methodological improvements such as propensity score-based methods (Dickinson, Johnson, & West, 1986; Rosenbaum & Rubin, 1983), regression discontinuity design (Cook & Campbell, 1979; Judd & Kenny, 1981; Marcantonio & Cook, 2010), selection models (Heckman, 1979), and instrumental variables approach (Angrist & Imbens, 1995) can address comparability issues to some extent. In addition, longitudinal studies using nationally-representative samples, such as the National Longitudinal Survey of Youth 1979 (NLSY79) and the Panel Study of Income Dynamics (PSID), have now followed study participants for over 40 years. However, to the authors’ knowledge, few studies have used these data to examine program participants’ labor market outcomes over their careers while accounting for sample selection into programs.

2. The present study

The present study examined life-course labor market outcomes resulting from participation in government-sponsored employment programs. Data came from NLSY79, an ongoing cohort study of a representative sample of late baby boomers (i.e., those born between 1957 and 1964) sponsored by the U.S. Department of Labor. Employment rates and earnings of participants who received job skills training services (“job skills training group”) were compared to participants who received other services not involving job skills development such as adult basic education, GED preparation, and job readiness training (“basic service group”). To reduce the potential bias from selection into employment programs and types of services received, inverse probability of treatment weighting (IPTW) was used (Austin, 2011; Rosenbaum & Rubin, 1983) prior to the main analyses. Specifically, inverse probability of treatment (IPT) weights, i.e., the reciprocal of the probability of receiving the treatment, were estimated and used to reweight the sample such that baseline characteristics are balanced between treatment and comparison groups.

3. Methods

3.1. Data and sample

NLSY79 offers a unique longitudinal data set with which to examine training participation and long-term labor market outcomes among populations with economic disadvantages. During the 40 years since data were first collected in 1979, NLSY79 has provided detailed information on participants’ government-sponsored employment program enrollment status, types of services received, and labor market outcomes. Since this birth cohort is now nearing retirement age, NLSY79 is also a good source of data on program participants’ life-course labor market outcomes. In addition, NLSY79 allows for assessing the effects of U.S. government-sponsored employment and training programs for baby-boomer cohorts since Congress first established such programs, beginning with CETA in 1973. Conclusive assessments of those programs from the early years are limited, as evaluations primarily used non-experimental study designs with comparison groups drawn from secondary data sets such as the Current Population Survey. Reliance on other data sources for comparison groups has resulted in substantial variations in outcomes reported when different selection processes and statistical methods were used (Bassi, 1983; Bloom & McLaughlin, 1982; Dickinson et al., 1986). Using NLSY79 to establish both treatment and comparison groups will help in obtaining more reliable program effects from government-sponsored employment programs for populations experiencing economic disadvantages in the United States.

NLSY79 contains data on 12,686 late baby boomers. Baseline and follow-up surveys were conducted annually between 1979 and 1994 and biennially since 1996. The information collected includes respondents’ education and training, labor market outcomes, and a wide array of physical and behavioral health outcomes. A multistage area probability sampling design was employed to recruit three independent samples at the time of the initial survey: (1) the main sample (n=6,111) representing noninstitutionalized civilian youth, (2) a supplementary sample (n=5,295) oversampling Hispanic, Black, and economically disadvantaged non-Black, non-Hispanic youth, and (3) a military sample (n=1,280) of those serving in the four branches of the U.S. military. After excluding the military and supplementary samples, which have not been interviewed since 1984 and 1990, respectively, the analytic sample in the present study is limited to 1496 respondents who reported participating in government-sponsored employment programs between 1980 and 1986. The sample’s labor market outcomes were drawn from multiple time points at approximately four-to-six-year intervals: 1987, 1992, 1998, 2004, 2010, and 2014. Overall, 69.8 % of the main sample at baseline was retained for the interview conducted in 2014. To adjust for attrition across the multiple survey years used in this study, customized longitudinal weights were retrieved from the National Longitudinal Surveys’ custom weighting program (https://www.nlsinfo.org/weights/nlsy79), which generates longitudinal weights as if non-respondents had participated in each survey round of interest. More details about NLSY79’s study procedures and design are available from the U.S. Bureau of Labor Statistics (2019).

3.2. Measures

3.2.1. Labor market outcomes

Labor market outcomes were assessed using two measures. (1) Employment status, a dichotomous variable (0=not employed, 1=employed), was created based on average work hours throughout the calendar year; respondents were asked about work hours for each week prior to surveys, and those reporting any work hours were considered employed. As a supplementary measure of employment status, a categorical variable differentiating full-time from part-time employment (0=not working, 1=worked less than 35 hours per week, 2=worked 35 or more hours per week) was also created. (2) Annual labor market earnings included total wages, salary, commissions, or tips from all jobs before deductions for taxes or anything else for the years 1987, 1992, 1998, 2004, 2010, and 2014.
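As a minimal sketch of the two employment measures described above (the column name and example values are illustrative stand-ins, not actual NLSY79 variables):

```python
import numpy as np
import pandas as pd

# Hypothetical data; avg_weekly_hours stands in for the weekly work-hour reports.
df = pd.DataFrame({"avg_weekly_hours": [0.0, 12.5, 40.0]})

# (1) Dichotomous employment status: any reported work hours -> employed.
df["employed"] = (df["avg_weekly_hours"] > 0).astype(int)

# Supplementary categorical measure: 0 = not working,
# 1 = part-time (< 35 hours/week), 2 = full-time (35+ hours/week).
df["emp_cat"] = np.select(
    [df["avg_weekly_hours"] == 0, df["avg_weekly_hours"] < 35],
    [0, 1], default=2)
```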

3.2.2. Government-sponsored employment program participation and service receipt status

Based on questions about participation in government-sponsored employment programs and types of services received, two mutually exclusive groups (basic service and job skills training) were created. For each year, respondents were asked if they received any education or training services for one month or longer in a government-sponsored program such as CETA, the Job Corps, or any other government-sponsored programs such as MDTA training, Opportunities Industrialization Centers (OIC), Service, Employment, and Redevelopment (SER) Jobs for Progress, Urban League, and Vocational Rehabilitation. Those who participated in any of the programs were asked about types of services received, including classroom training in reading, writing, or arithmetic; English as a second language; GED; job counseling; college prep; classroom skills training; OJT; and work experience. Based on the responses, those who received classroom skills training, OJT, or work experience were classified as the job skills training (treatment) group; those who received classroom training in reading, writing, or arithmetic; English as a second language; GED; job counseling; or college prep, but not job skills training, were classified as the basic service (comparison) group. To better understand the job skills that treatment group members gained, the types of occupations for which they were trained were also examined.
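The mutually exclusive grouping described above can be sketched as follows; the service flags are illustrative stand-ins for the NLSY79 service items, and the precedence rule encodes the classification of anyone receiving job skills services into the treatment group:

```python
# Hypothetical service flags; not the actual NLSY79 item names.
JOB_SKILLS = {"classroom_skills_training", "ojt", "work_experience"}
BASIC = {"basic_education", "esl", "ged", "job_counseling", "college_prep"}

def classify(services):
    """Return 'job skills training', 'basic service', or None (no services)."""
    services = set(services)
    if services & JOB_SKILLS:          # any job skills service -> treatment group
        return "job skills training"
    if services & BASIC:               # basic services only -> comparison group
        return "basic service"
    return None

classify({"ged", "ojt"})  # job skills services take precedence over basic ones
```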

3.2.3. Sociodemographic characteristics

Individual and family characteristics were also examined. Individual characteristics at baseline and during the years when labor market outcome data were collected included age cohort (0=ages 14–16, 1=ages 17–19, 2=ages 20–22), sex (0=male, 1=female), race (Black, Hispanic, White), school enrollment status (0=no, 1=yes), highest grade completed (0=0–8, 1=9–11, 2=12, 3=13+), current marital status (0=no, 1=yes), number of biological, step, or adopted children in the household (0=none, 1=one, 2=two, 3=three+), age of youngest child (0=no child, 1=1–5, 2=6+), health limitations (0=no, 1=yes), spouse’s earnings (in dollars), and urbanicity of residence (0=urban, 1=rural). To balance the distributions of baseline characteristics between the basic service and job skills training groups, 36 covariates measured in 1979 (and their missingness indicators) were used, including Armed Forces Qualification Test percentile score; Rosenberg self-esteem score; religious background; expected highest grade completion; job training aspiration; occupational aspiration at age 35; alcohol, marijuana, and other illicit drug use; if relevant, age when first stopped by police (excluding minor traffic offenses); if relevant, age when first booked or charged; ever convicted, sentenced, or on probation; school enrollment and attendance status; highest grade completed; employment status; occupation; marital status; any health limitation; and government-sponsored job training program participation in 1980 or earlier. Family characteristics collected at baseline included whether parents lived together when the participant was age 14, mother’s and father’s birthplaces, mother’s and father’s highest school grades completed, urbanicity of residence, mother’s and father’s working statuses, family poverty status, and net household income.

3.3. Analytic strategy

3.3.1. Propensity score analysis

Propensity score analysis was used to estimate treatment effects by accounting for group heterogeneity between those in the treatment and comparison groups. Propensity score analysis uses a potential outcomes approach (also known as a counterfactual approach), which recognizes that we can only observe outcomes of the treatment an individual actually received, not alternative outcomes, i.e., potential or counterfactual outcomes (Holland, 1986; Rubin, 1974; Sobel, 1994). When individuals select into the treatment condition, estimates made without appropriate adjustments will not capture true program effects. To circumvent this potential selection bias, propensity score analysis produces a one-dimensional summary of observed covariates for each respondent (i.e., the predicted probability of the respondent being in the treatment condition), which can be used to balance covariate distributions between treatment and comparison groups (Michalopoulos, Bloom, & Hill, 2004; Rosenbaum, 2002; Rosenbaum & Rubin, 1983). For further discussion of the assumptions and technical considerations of propensity score analysis, see, for example, McCaffrey et al. (2013). Of the various propensity score analysis methods, IPTW was used in this study; it balances the covariate distributions of the treatment and comparison groups using weights computed from estimated propensity scores. A main advantage of this weighting approach over matching or stratification methods is that the researcher need not make arbitrary decisions (e.g., about matching algorithms, the ratio of matches, or replacement) that can affect estimation results.

The treatment effect estimand of interest in this study is the average treatment effect in the population (ATE). ATE measures how the average outcome would change if everyone in the analytic sample received job skills training versus if everyone received basic services only (McCaffrey et al., 2013). Because recipients of both basic services and job skills training share economically disadvantaged backgrounds, our primary interest is to assess program effects among all program participants regardless of their particular job skills training experience. Therefore, ATE was used to estimate the average treatment effect of job skills training for all participants in government-sponsored employment programs.
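Under the ATE estimand, the IPT weights take the standard form 1/e(x) for treated units and 1/(1 − e(x)) for comparison units, where e(x) is the estimated propensity score; a minimal sketch:

```python
import numpy as np

def ipt_weights_ate(treated, pscore):
    """ATE-targeting IPT weights: 1/e for treated, 1/(1-e) for comparison."""
    treated = np.asarray(treated, dtype=float)
    pscore = np.asarray(pscore, dtype=float)
    return treated / pscore + (1.0 - treated) / (1.0 - pscore)

# Illustrative values: a treated unit with e = 0.25 gets weight 4,
# a comparison unit with e = 0.25 gets weight 1/0.75.
w = ipt_weights_ate([1, 0, 1], [0.25, 0.25, 0.5])
```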

Propensity score analysis proceeded as follows. First, the IPT weights were estimated based on propensity scores generated from the ps() function of the Stata twang macros (Ridgeway, McCaffrey, Morral, Burgette, & Griffin, 2017). The command uses generalized boosted modeling, an effective algorithm for estimating the nonlinear relationship between the treatment condition (i.e., job skills training receipt in this case) and a large set of covariates (McCaffrey, Ridgeway, & Morral, 2004). Sampling weights were also included as a baseline covariate in the propensity score models to capture unmeasured factors that may relate to the probability of responding to the survey (Korn & Graubard, 1991; Pfeffermann, 1993). Second, covariate balance between treatment and comparison groups was examined using two balance metrics: the absolute standardized mean difference (also referred to as the effect size) and the Kolmogorov-Smirnov (KS) statistic. Satisfactory covariate balance is considered achieved if the effect size is less than 0.1, per Rubin’s (2001) threshold, and no baseline covariate has a significant KS statistic. If covariate balance is not achieved, alternate propensity score models are estimated, generally by adding higher-order terms and interaction terms. Lastly, the estimated IPT weights were used to reweight the models for the main statistical analyses.
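As a rough analogue of this workflow (twang’s ps() is Stata/R tooling; the sketch below substitutes scikit-learn’s gradient boosting on simulated data, and computes an unweighted KS statistic rather than twang’s weighted version):

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                    # stand-in baseline covariates
t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # treatment depends on X[:, 0]

# Boosted propensity model, then ATE-targeting IPT weights.
e = GradientBoostingClassifier(random_state=0).fit(X, t).predict_proba(X)[:, 1]
w = t / e + (1 - t) / (1 - e)

# Balance diagnostics: weighted absolute standardized mean difference
# (checked against Rubin's 0.1 rule of thumb) and a KS statistic per covariate.
def std_mean_diff(x, t, w):
    m1 = np.average(x[t == 1], weights=w[t == 1])
    m0 = np.average(x[t == 0], weights=w[t == 0])
    pooled_sd = np.sqrt((x[t == 1].var() + x[t == 0].var()) / 2)
    return abs(m1 - m0) / pooled_sd

for j in range(X.shape[1]):
    es = std_mean_diff(X[:, j], t, w)
    ks = ks_2samp(X[t == 1, j], X[t == 0, j]).statistic  # unweighted KS
    print(f"covariate {j}: |SMD| = {es:.3f}, KS = {ks:.3f}")
```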

3.3.2. Data analysis

Statistical analysis was conducted in phases. First, missing values for individual and familial sociodemographic characteristics were multiply imputed using chained equations (MICE) (Buuren & Groothuis-Oudshoorn, 2010). This approach imputes each incomplete variable with a separate model conditional on all other data. Under the missing at random (MAR) assumption, it produces asymptotically unbiased estimates and standard errors and is asymptotically efficient (White, Royston, & Wood, 2011). Using 20 multiply imputed data sets, consolidated outputs in each of the subsequent analyses were obtained by combining the estimates using “Rubin’s rules” (Rubin, 2004). Second, descriptive statistics for baseline and time-varying individual and family sociodemographic characteristics were examined, stratified by job skills training service receipt status. Third, the long-term treatment effects of job skills training services after program participation were obtained by estimating growth curve models based on repeated employment and earnings measures at approximately four-to-six-year intervals from 1987 to 2014. To estimate labor market outcome trajectories from the year of program participation, the number of years since program participation and its higher-order term were included in growth curve models of employment status and earnings. Optimal growth curve models were determined using Singer’s (1998) sequential model-fitting process, consisting of an unconditional means model, an unconditional growth model including time variables, a fixed-effects model, a fixed-effects model with interactions, and a final model with different error covariance structures. Because model fit indices are not available for analyses involving multiply imputed datasets, the most parsimonious model with significant fixed effects and/or random effect variance components was selected as the final model.
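Rubin’s rules for pooling an estimate across M imputed data sets can be sketched as follows (the numbers are illustrative, not study estimates):

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Combine per-imputation estimates and variances via Rubin's rules."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    qbar = estimates.mean()            # pooled point estimate
    ubar = variances.mean()            # within-imputation variance
    b = estimates.var(ddof=1)          # between-imputation variance
    total_var = ubar + (1 + 1 / m) * b
    return qbar, np.sqrt(total_var)    # pooled estimate and pooled SE

est, se = pool_rubin([0.50, 0.54, 0.46], [0.01, 0.01, 0.01])
```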
For employment status trajectories, the final model with covariate adjustment is as follows:

Level 1: η_it = β_0i + β_1i T_it + β_2i T_it² + β_3i X_it

Level 2: β_0i = γ_00 + γ_01 Z_i + u_0i
         β_1i = γ_10 + γ_11 Z_i
         β_2i = γ_20
         β_3i = γ_30

where η_it = log-odds of being employed at time t for individual i,

T_it = years since program participation for individual i at time t,

X_it = covariates for individual i at time t, and

Z_i = job skills training participation indicator for individual i.

In this growth model of employment status, time-varying covariates were entered (in Models 2 and 3 of Table 4) as fixed effects; the time slopes β_1i and β_2i showed no evidence of random variation between individuals and were therefore treated as fixed. For earnings trajectories, the final model is similar to the employment model except for a few differences in the level-1 equation:

Level 1: Y_it = β_0i + β_1i T_it + β_2i T_it² + β_3i X_it + r_it

where Y_it = log earnings for individual i at time t, and r_it ~ N(0, σ²).
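A minimal sketch of fitting such a random-intercept growth model for log earnings, using simulated data and statsmodels rather than the authors’ software; variable names and coefficient values are illustrative assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n, waves = 200, 6
d = pd.DataFrame({
    "id": np.repeat(np.arange(n), waves),
    "T":  np.tile(np.arange(0, 30, 5), n),             # years since program
    "Z":  np.repeat(rng.binomial(1, 0.25, n), waves),  # job skills training
})
u0 = np.repeat(rng.normal(0, 0.5, n), waves)           # random intercepts u_0i
d["log_earn"] = (9.5 + 0.04 * d["T"] - 0.001 * d["T"]**2
                 + 0.3 * d["Z"] + 0.01 * d["Z"] * d["T"]
                 + u0 + rng.normal(0, 0.3, len(d)))

# Time, time squared, training indicator Z, and a Z-by-time interaction,
# with a person-level random intercept (groups = id).
fit = smf.mixedlm("log_earn ~ T + I(T**2) + Z + T:Z",
                  d, groups=d["id"]).fit()
print(fit.params)
```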

Table 4.

Inverse Probability of Treatment Weighted Mixed Effects Logistic Models of Employment Status among Government-Sponsored Employment Program Participants, National Longitudinal Survey of Youth 1979.

Model 1: IPTW Model
Model 2: IPTW w/ Regression Adjustment 1
Model 3: IPTW w/ Regression Adjustment 2
Odds Ratio 95 % CI Odds Ratio 95 % CI Odds Ratio 95 % CI

Fixed Effects
Job-skills training
 No 1.000 1.000 1.000
 Yes 1.426* 1.013–2.006 1.449* 1.042–2.013 1.579** 1.179–2.114
Year 0.997*** 0.954–0.979 0.966*** 0.953–0.979 0.977*** 0.964–0.990
Year² 0.999* 0.997–1.000 0.999* 0.997–1.000 0.999 0.998–1.001
Sex
 Male 1.000 1.000
 Female 0.440*** 0.319–0.608 0.427 0.318–0.574
Race
 Black 0.437*** 0.311–0.614 0.532*** 0.391–0.725
 Hispanic 0.614* 0.422–0.893 0.730 0.518–1.029
 White 1.000
Highest grade completed
 0–8 1.000
 9–11 1.451 0.874–2.406
 12 2.240** 1.315–3.817
 13+ 4.454*** 2.591–7.656
Age of youngest child
 Age 0 1.067 0.642–1.774
 Ages 1 –5 0.524*** 0.385–0.712
 Ages 6+ 1.119 0.813–1.542
No. child in Household 1.000
Health limitation
 No 1.000
 Yes 0.124*** 0.079–0.194
Spouse’s earnings (in 2018 dollars)
 < $20,000 1.185 0.726–1.935
 $20,000-$39,999 2.532** 1.319–4.859
 $40,000-$ 74,999 1.486 0.835–2.645
 $75,000 + 1.000
Random Effects
 Intercept 3.021*** 2.265–3.777 2.689*** 2.016–3.362 1.743 1.226–2.260
F-test Statistics 12.23*** 13.33*** 11.83***

Note. IPTW = Inverse Probability of Treatment Weighting.

p < .1. * p < .05. ** p < .01. *** p < .001.

The final earnings model also accounted for selection into the labor market using Heckman’s (1979) two-step estimator. The selection problem arises because earnings are observed only for respondents who were selected into the labor market. Consistent with Heckman’s two-step approach, we fit a probit model of employment status to generate an inverse Mills ratio and then included it as a regressor in the log earnings model. More details about the two-step estimator are available in Heckman (1979). Lastly, all growth curve models were weighted using the product of the longitudinal survey weights and the IPT weights so that study findings can be generalized to the national population, as DuGoff, Schuler, and Stuart (2014) suggest.

4. Results

4.1. Government-sponsored employment programs and services

Table 1 displays information on the government-sponsored employment programs in which the 1496 participants included in this study were enrolled and the types of services they received between 1980 and 1986. Though the names of the programs were not available for most participants, 11.7 % were reported to have been enrolled in CETA programs, 3.5 % in vocational rehabilitation, and 3.2 % in the Job Corps. As main service types, 21.4 % of participants received classroom skills training (95 % CI = 19.1 %–24.0 %), 20.8 % received job counseling (95 % CI = 18.5 %–23.3 %), and 18.0 % received basic education (95 % CI = 15.9 %–20.4 %). The most common support services program participants received were transportation assistance (12.3 %, 95 % CI = 10.6 %–14.3 %), meal assistance (7.8 %, 95 % CI = 6.5 %–9.5 %), and health care (6.6 %, 95 % CI = 5.4 %–8.1 %).

Table 1.

Government-Sponsored Employment Program Types and Services Participants Received, National Longitudinal Survey of Youth 1979.

Programs n % (95 % CI) Services n % (95 % CI)

CETA 209 11.7 (9.8–13.7) Job counseling 410 20.8 (18.5–23.3)
Job Corps 76 3.2 (2.4–4.3) General Educational Development program 225 9.9 (8.4–11.6)
Vocational rehabilitation 50 3.5 (2.6–4.8) Basic education 366 18.0 (15.9–20.4)
Other programs 48 2.3 (1.6–3.2) ESL 64 2.6 (1.9–3.5)
Not specified 1,210 84.5 (82.2–86.5) College prep 148 7.5 (6.1–9.1)
Classroom skills training 411 21.4 (19.1–24.0)
Job placement, work experience, on-the-job training 235 12.3 (10.5–14.3)
Any support services 321 16.1 (14.1–18.4)
Health care 143 6.6 (5.4–8.1)
Child care 91 3.7 (2.9–4.8)
Transportation 248 12.3 (10.6–14.3)
Lodging 105 5.3 (4.2–6.8)
Meals 157 7.8 (6.5–9.5)
Any other services 89 5.4 (4.2–7.0)

Notes. CETA = Comprehensive Employment and Training Act. Other programs included apprenticeship, Manpower Development Training Act (MDTA) programs, Opportunities Industrial Centers (OIC), Service, Employment, and Redevelopment (SER) Jobs for Progress, and Urban League. Percents and 95 % confidence intervals were weighted to represent the population.

Table 2 shows that most classroom skills training participants were trained for office or administrative support (28.3 %, 95 % CI = 23.5 %–33.6 %), technical, operative, or craft (24.8 %, 95 % CI = 20.2 %–30.3 %), and service (15.9 %, 95 % CI = 12.3 %–20.4 %) occupations. Most OJT and work experience participants were trained for technical, operative, or craft (38.0 %, 95 % CI = 28.6 %–48.4 %), office or administrative support (20.2 %, 95 % CI = 14.1 %–28.1 %), and professional or management (19.3 %, 95 % CI = 11.6 %–30.4 %) occupations.

Table 2.

Types of Occupations for Which Job Skills Training Participants Were Trained by Training Types, National Longitudinal Survey of Youth 1979.

Classroom Skills Training (n = 395; 81.6 %)
On-the-Job Training or Work Experience (n = 156; 32.5 %)
n % (95 % CI) n % (95 % CI)

Office & administrative support 146 34.7 (30.0–40.8) 42 20.2 (14.1–28.1)
Technical, operative, & craft 116 30.4 (24.8–36.7) 54 38.0 (28.6–48.4)
Service 81 19.5 (15.2–24.8) 30 18.1 (11.9–26.6)
Professional & management 60 18.7 (13.9–24.7) 21 19.3 (11.6–30.4)
Laborers 12 2.4 (1.2–4.8) 7 4.2 (1.7–9.6)
Sales 9 1.7 (0.7–4.0) 3 0.8 (0.3–2.7)
Farming, fishing, & forestry

Notes. Information on occupation types targeted by on-the-job training programs was not collected after 1982. Of the government-sponsored program participants, 122 reported receiving both classroom skills training and on-the-job training or work experience.

4.2. Sociodemographic characteristics by job skills training service receipt status

Table 3 displays the sociodemographic characteristics of government-sponsored employment program participants by job skills training receipt status. Of all program participants, 25.2 % (n = 465) received job skills training (i.e., classroom skills training, OJT, and/or work experience), while 74.8 % (n = 1,031) received basic services only. Overall, job skills training participants came from more disadvantaged sociodemographic groups. They were more likely to be Black (37.6 %; 95 % CI = 32.7 %–42.7 %), to have completed fewer years of schooling (25.2 % had 0–8 years of schooling, 54.0 % had 9–11 years), and to be unemployed (24.1 %, 95 % CI = 19.7 %–29.2 %) in 1979. Moreover, a larger share of job skills training recipients had mothers and fathers who had completed fewer than 12 years of schooling (46.9 % of mothers and 44.9 % of fathers), and approximately 74.6 % had an annual household income lower than $20,000 in 1979.

Table 3.

Demographic Characteristics of Government-Sponsored Employment Program Participants by Job Skills Training Receipt Status, National Longitudinal Survey of Youth 1979.

Basic Services (n = 1,031; 74.8 %)
Job Skills Training (n = 465; 25.2 %)
χ2
% 95 % CI % 95 % CI

Individual Characteristics (Baseline)
Age
 14–16 32.5 29.2–36.1 40.2 34.8–45.9 3.9*
 17–19 38.4 34.8–42.1 38.7 33.2–44.4
 20–22 29.1 25.6–32.8 21.1 16.4–26.8
Sex
 Male 54.6 50.8–58.4 53.3 47.6–59.0 0.1
 Female 45.4 41.6–49.2 46.7 41.0–52.4
Race and ethnicity
 Black 17.1 15.1–19.3 37.6 32.7–42.7 43.1***
 Hispanic 9.6 8.4–11.0 10.1 8.1–12.5
 White 73.3 70.6–75.8 52.3 46.6–57.9
Highest grade completed
 0–8 19.4 16.7–22.5 25.2 20.5–30.4 8.3***
 9–11 42.2 38.5–45.9 54.0 48.1–59.8
 12 24.0 20.8–27.6 14.4 10.5–19.4
 13+ 14.4 11.7–17.6 6.4 3.8–10.8
Employment status
 Employed 51.9 48.1–55.6 38.5 31.8–43.5 12.0***
 Unemployed 13.5 11.3–16.1 24.1 19.7–29.2
 Not in labor force 34.6 31.2–38.3 38.4 33.1–44.0
Marital status
 Married 8.5 6.6–11.1 7.5 4.9–11.5 0.8
 Separated, divorced, or widowed 1.2 0.7–2.0 0.5 0.2–1.4
 Never married 90.3 87.7–92.4 92.0 88.0–94.7
Health limitation
 No 96.4 94.8–97.5 94.1 90.4–96.4 2.6
 Yes 3.6 2.5–5.2 6.0 3.7–9.6
Family Characteristics (Baseline)
Parents presence at age 14
 Both parents 92.3 90.2–94.0 92.6 89.2–94.9 0.1
 Single parent 3.3 2.2–4.9 3.5 1.9–6.5
 Other 4.4 3.2–6.0 3.9 2.4–6.2
Mother’s highest grade completed
 0–8 14.8 12.5–17.4 16.0 12.8–19.8 5.5**
 9–11 20.5 17.6–23.8 30.9 25.8–36.5
 12 46.1 42.1–50.1 41.9 35.9–48.1
 13+ 18.6 15.6–22.1 11.3 7.8–16.1
Father’s highest grade completed
 0–8 20.0 17.2–23.1 21.6 17.5–26.5 4.7**
 9–11 14.5 11.9–17.6 23.3 18.3–29.2
 12 38.6 34.7–42.7 37.8 31.6–44.5
 13+ 26.9 23.3–30.9 17.2 12.5–23.3
Annual household income
 <$10,000 30.5 26.9–34.2 42.3 36.3–48.6 7.6***
 $10,000-$19,999 28.5 24.7–32.5 32.3 26.5–38.8
 $20,000-$29,999 20.7 17.3–24.5 19.0 14.0–25.2
 $30,000+ 20.4 16.9–24.4 6.4 3.5–11.4

Note. Proportions and confidence intervals were adjusted for sample weights. χ2 values are corrected weighted Pearson chi-square statistics.

* p < .05. ** p < .01. *** p < .001.

Unadjusted employment rates and earnings of basic service and job skills training service recipients from 1987 to 2014 (not shown) were also examined. Among females, job skills training service recipients had employment rates that were 10 percentage points higher than rates for basic service recipients, and the difference grew noticeably larger from 1998 on, but no significant group differences were observed among males. Among males, basic service recipients reported consistently higher median earnings than job skills training service recipients over time, but there were no notable group differences among female government-sponsored employment program participants. Furthermore, among those with and without job skills training, males had higher median earnings than females over the study period.

4.3. Propensity score analysis and covariate balance

Propensity score models using both the absolute standardized bias and the Kolmogorov-Smirnov test statistic as stopping rules produced satisfactory covariate balance between the basic service and job skills training groups. Before the IPTW adjustment, about 30 % of baseline covariates had standardized mean differences greater than 0.1. After the IPTW adjustment, all effect sizes for the 36 observed covariates and their missingness indicators were less than 0.1, and the Kolmogorov-Smirnov test statistics were statistically insignificant, indicating that satisfactory covariate balance had been achieved. Thus, no further refinement of the propensity score models was necessary.
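The balance diagnostic described here can be sketched with simulated data. The standardized mean difference and ATT-style weights below follow the general IPTW recipe; the data are hypothetical (not the study's), and for self-containment the propensity score is computed from the simulation design rather than estimated with boosted models as in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data (NOT the study's data): a ~25% treated group and
# one baseline covariate whose distribution differs by treatment status.
n = 2000
t = rng.binomial(1, 0.25, n)                      # 1 = job skills training
x = rng.normal(loc=0.4 * t, scale=1.0, size=n)    # covariate shifted for treated

def std_mean_diff(x, t, w=None):
    """Absolute standardized mean difference, the balance metric used in the paper."""
    w = np.ones_like(x) if w is None else w
    m1 = np.average(x[t == 1], weights=w[t == 1])
    m0 = np.average(x[t == 0], weights=w[t == 0])
    s = np.sqrt((x[t == 1].var() + x[t == 0].var()) / 2)  # pooled SD
    return abs(m1 - m0) / s

# True propensity score implied by the simulation design (logit form);
# in practice this would be estimated, e.g., with generalized boosted models.
ps = 1 / (1 + np.exp(-(-1.179 + 0.4 * x)))

# ATT-style inverse probability of treatment weights:
# treated units get weight 1, comparison units get ps / (1 - ps).
w = np.where(t == 1, 1.0, ps / (1 - ps))

print(f"SMD before weighting: {std_mean_diff(x, t):.3f}")   # exceeds the 0.1 threshold
print(f"SMD after weighting:  {std_mean_diff(x, t, w):.3f}")  # pulled toward 0
```

In this sketch the weighting shrinks the standardized mean difference from well above the conventional 0.1 threshold to near zero, mirroring the covariate balance the authors report.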

4.4. Average treatment effects of job skills training on labor market outcomes

Table 4 shows the average treatment effects of job skills training on employment status over the 33-year study period after IPTW adjustments. The coefficient estimates suggest that job skills training was associated with 42.6 %–57.9 % higher odds of employment compared to receiving basic services only. When regressors were included to adjust for remaining imbalances (Models 2 & 3), being female, being Black, having completed fewer years of schooling, having preschool-aged children (ages 1–5), and having health limitations predicted lower odds of employment. Differences in predicted employment rates between the job skills training and basic services groups are depicted in Fig. 1. Overall, job skills training had positive impacts on employment rates, with increases over time for all racial and ethnic groups and both genders. The largest employment effects were observed among Black women, whose employment rate was 5.9 percentage points higher at Year 1 following program participation and 9.0 percentage points higher at Year 33; the smallest effects were among White men (1.6 percentage points higher in Year 1 and 4.0 percentage points higher in Year 33).
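As an arithmetic aside, the reported 42.6 %–57.9 % higher odds translate into employment-rate differences that depend on the comparison group's baseline rate. The snippet below illustrates the conversion with an assumed 80 % baseline employment rate (an illustrative figure, not one reported in the paper):

```python
import math

def prob_from_odds(odds):
    """Convert odds back to a probability."""
    return odds / (1 + odds)

for odds_ratio in (1.426, 1.579):          # "42.6%-57.9% higher odds"
    b = math.log(odds_ratio)               # the corresponding logit coefficient
    base_rate = 0.80                       # assumed baseline employment rate
    base_odds = base_rate / (1 - base_rate)
    new_rate = prob_from_odds(base_odds * odds_ratio)
    print(f"OR = {odds_ratio}: b = {b:.3f}, "
          f"employment {base_rate:.0%} -> {new_rate:.1%} "
          f"(+{100 * (new_rate - base_rate):.1f} pp)")
```

At this assumed baseline, the odds ratios imply employment-rate gains of roughly five to six percentage points, the same order of magnitude as the subgroup differences depicted in Fig. 1.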

Fig. 1.

Estimated Differences in Employment Rates by Job Skills Training Receipt Status among Government-Sponsored Employment Program Participants from Inverse Probability of Treatment Weighted Mixed Effects Logistic Regression Models, National Longitudinal Survey of Youth 1979.

Note. The predicted values of employment rates were plotted based on Model 2 of the growth curve models in Table 4.

For annual earnings, Model 1 indicated that job skills training was marginally significant (b = .404, p < .10; see Table 5). Once sociodemographic characteristics were controlled (Model 2), job skills training had positive effects on annual earnings (b = .528, p < .01). When the inverse Mills ratio was also included (Model 3), the magnitude of the job skills training coefficient decreased slightly but remained significant (b = .495, p < .05). Holding other factors constant, these coefficients imply that job skills training multiplied average annual earnings by a factor of approximately e^0.404 ≈ 1.50 to e^0.528 ≈ 1.70 relative to what program participants would have earned had they received basic services but not job skills training. Fig. 2 presents a visual comparison of the 30-year post-program earnings trajectories by job skills training receipt status based on Model 3 estimates in Table 5. Under the assumption that job skills training's effects on earnings are uniform over the post-program years, the job skills training group was predicted to earn more than the basic service group throughout, with differences ranging from $10,764 in Year 1 to $19,921 in Year 19.
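Because the outcome is log earnings, each coefficient is a multiplicative effect: exp(b) is the earnings ratio and exp(b) − 1 the proportional gain. A quick check of the two coefficients cited above (the larger one matching the 69.6 % figure in the abstract):

```python
import math

# Log-earnings coefficients from Table 5 (Models 1 and 2); each implies a
# multiplicative effect on earnings, so exp(b) - 1 is the proportional gain.
for b in (0.404, 0.528):
    ratio = math.exp(b)
    print(f"b = {b}: earnings ratio {ratio:.3f}, i.e. +{100 * (ratio - 1):.1f} %")
```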

Table 5.

Estimated Differences in Earnings by Job Skills Training Receipt Status among Government-Sponsored Employment Program Participants from Inverse Probability of Treatment Weighted Mixed Effects Regression Models, National Longitudinal Survey of Youth 1979.

Model 1: IPTW Model
Model 2: IPTW Model w/ Regression Adjustment
Model 3: IPTW Model w/ Inverse Mills Ratios
log(earnings) SE log(earnings) SE log(earnings) SE

Fixed Effects
Job-skills training 0.404 0.237 0.528** 0.194 0.495* 0.194
Year −0.014 0.010 −0.019 0.012 0.009 0.012
Year2 −0.004*** 0.001 −0.002* 0.001 −0.002* 0.001
Sex (ref: Male) −1.507*** 0.195 −0.625 0.380
Race (ref: White)
 Black −0.893*** 0.230 −0.335 0.304
 Hispanic −0.478 0.248 −0.161 0.275
Highest grade completed (ref: 0–8)
 9–11 0.289 0.496 −0.188 0.485
 12 1.223* 0.508 0.109 0.582
 13+ 2.172*** 0.506 0.376 0.774
Age of youngest child (ref: no child)
 Age 0 0.180 0.286 0.339 0.290
 Ages 1–5 −1.039*** 0.283 −0.201 0.460
 Ages 6+ 0.142 0.179 0.140 0.176
Health limitation (ref: no) −3.071*** 0.448 0.249 1.510
Inverse Mills ratio −5.945** 2.177
Constant 7.938*** 0.169 7.685*** 0.609 10.249*** 1.038
Random Effects
 Level-1 error 3.500 0.083 3.427 0.105 3.415 0.098
 Intercept 2.709 0.099 2.064 0.105 2.034 0.103
F-test Statistics 6.24*** 21.28*** 24.25***

Notes. IPTW = Inverse Probability of Treatment Weighting. Insignificant covariates (school enrollment status, urbanicity of residence) were removed from the model with regression adjustments. The final models include marital status and spousal income in estimating the inverse Mills ratios.

p < .1. * p < .05. ** p < .01. *** p < .001.

Fig. 2.

Predicted Trajectories of Post-program Earnings by Job Skills Training Receipt Status among Government-Sponsored Employment Program Participants from Inverse Probability of Treatment Weighted Mixed Effects Regression Models, National Longitudinal Survey of Youth 1979.

Note. The predicted earnings trajectories were plotted based on the estimates in Model 3 of the growth curve models in Table 5. The current figure assumes that the earnings effects of job skills training are uniform across the post-program years.
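Model 3's selection correction follows the general Heckman (1979) two-step logic: a probit model of selection into positive earnings yields a linear index z, and λ(z) = φ(z)/Φ(z) is added to the earnings equation as a regressor. The sketch below shows the mechanics of the ratio itself (the general technique, not the authors' exact specification):

```python
import math

def norm_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    """Standard normal CDF Phi(z), via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def inverse_mills_ratio(z):
    """lambda(z) = phi(z) / Phi(z), evaluated at the probit selection index z."""
    return norm_pdf(z) / norm_cdf(z)

# The correction term shrinks toward 0 as selection into positive earnings
# becomes near-certain, and grows when selection is unlikely:
for z in (-1.0, 0.0, 1.0, 2.0):
    print(f"z = {z:+.1f}: IMR = {inverse_mills_ratio(z):.3f}")
```

A significantly negative coefficient on this term, as in Table 5, indicates that unobservables raising the chance of reporting earnings are associated with lower earnings, so omitting the correction would bias the other coefficients.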

5. Discussion

This study examined the average treatment effects of job skills training on employment and annual earnings among late baby boomers who participated in U.S. government-sponsored employment programs. After accounting for group heterogeneity at baseline, findings show that job skills training led to persistent gains in labor market outcomes (employment and annual earnings) over the 33-year study period.

Consistent with prior studies, job skills training participants reported higher employment rates during the first several years after program participation. Although prior evidence on the employment effects of job skills training services for the baby-boomer generation is lacking, evaluations of more recent government-sponsored employment programs show that employment rates increased by up to six percentage points during the first five years after program participation (Bloom, Orr, Cave, Bell, & Doolittle, 1993, 1997; Hamilton et al., 2001). Specifically, Bloom et al. (1997) studied Job Training Partnership Act (JTPA) program-eligible applicants who were randomly assigned to a treatment group (i.e., allowed to participate in the program) or a control group between 1987 and 1989. They found that in the 18 months following participation, men and women had employment rates of 86.4 % and 78.9 %, respectively, which were 2.8 and 2.1 percentage points higher than those of men and women in the control group. Similarly, the National Evaluation of Welfare-to-Work Strategies (NEWWS), which assessed JOBS programs, reported an 89.6 % employment rate among participants during the first five years after participation, with improvements as large as 5.8 percentage points over the control group at the Riverside site in California, which emphasized education and job training services (Hamilton et al., 2001; Michalopoulos & Schwartz, 2000). More importantly, the current study shows that the previously well-documented short-term gains in employment among participants in government-sponsored job skills training persisted over the 33-year study period.

Employment rates are important, but employment alone does not necessarily guarantee improvements in economic self-sufficiency; therefore, this study also investigated earnings. Job skills training resulted in persistent earnings gains among baby boomers who experienced economic disadvantages relative to their basic service counterparts. For example, in Year 1 after program participation, the basic service group's average earnings were $15,850 (in 2018 dollars). The findings suggest that the job skills training group is expected to gain an additional $10,764 per year (or a 64 % earnings increase), holding other factors constant. In comparison, in an examination of 1975–1976 CETA participants' post-program earnings in 2018 dollars, Bassi (1983) found significant first-year earnings gains relative to comparison groups generated from Current Population Survey data, especially among poor subgroups, that varied by the type of skills training received: minority women (classroom training: $2,617; work experience: $4,228; OJT: $6,402) and minority men (OJT: $7,900). Earnings gains were smaller among all groups of non-poor participants except White women (Bassi, 1983). Although differences in program design, analytic samples, and years of program operation require caution when comparing studies, the early post-program earnings gains from job skills training identified in the present study are consistent with Bassi's (1983) findings.

Despite gains for each racial and ethnic as well as gender subgroup, labor market disadvantages across these subgroups persisted even after job skills training. In particular, females and Blacks experienced larger gains in employment rates than their male and White counterparts, but their employment rates remained lower. Supplementary analyses also show that among job skills training participants, over 70 % of employed White and Hispanic males worked full-time (35 or more hours per week) over their careers, but only one-third to one-half of employed Black males or females worked full-time. Researchers have offered various reasons for these disproportionate outcomes, including discriminatory practices throughout training and employment processes. For instance, most female trainees have been referred to clerical-medical, hospitality, and child care job training, with few referred to training for jobs that offer greater job stability and higher wages, such as electronics, computer programming, or truck driving (Negrey, Um'rani, Golin, & Gault, 2000). Even when referred to training for skilled trades occupations, female participants were less likely than their male counterparts to complete training because of a lack of family-friendly training environments. For example, trainees in some occupations such as correctional officer had to leave their families for training camps for up to four months, spend considerable time commuting to worksites (e.g., construction sites), or attend classes after work (Reed et al., 2012). These schedules often conflicted with child care responsibilities. Moreover, based on a nationally representative employer sample, Braddock and McPartland (1987) found that disadvantaged members of racial and ethnic minority groups faced barriers throughout the employment process, from the candidate stage to the promotion stage.
These studies point to the additional challenges racial and ethnic minorities and women face in the labor market as they attempt to become economically self-sufficient following participation in job skills training programs.

This study has several limitations. First, the quasi-experimental design leaves the estimated employment and earnings benefits of government-sponsored job skills training vulnerable to confounding. Any group heterogeneity not captured by baseline individual and family characteristics may have biased the treatment effect estimates. In particular, the lack of neighborhood and regional characteristics (e.g., regional unemployment, racial and ethnic composition, training center accessibility, and job vacancies) might have confounded the estimates. Despite the potential impacts of unobserved differences between treatment and comparison groups, this study likely provides lower bounds of the true labor market effects of job skills training, because individuals receiving job skills training came from more disadvantaged backgrounds. Second, because more detailed program and service information was not available for most respondents who participated in government-sponsored employment programs, treatment effects could not be estimated for each program. Because programs vary in training models, target populations, and main and support services, treatment effects may also differ across programs. Even within a program, treatment effects can differ due to site-specific characteristics such as differences in service providers and facility quality (Hamilton et al., 2001). However, this study sought to appraise the benefits of the job skills training approach versus the 'job-first' approach (i.e., offering basic services for quick employment), not the specific, short-term program effects that numerous prior studies have already examined. Lastly, employment quality, such as workplace benefits, retention, promotion opportunities, and work satisfaction, could not be examined because detailed employment-related information was unavailable.
Future studies should examine job-specific characteristics to better inform policymakers and researchers of other benefits and challenges that populations with economic disadvantages experience after completing job skills training.

6. Lessons learned

The findings of this study underscore the importance of incorporating job skills training as a main activity in employment programs that serve Americans who face economic disadvantages and employment barriers. Since government-sponsored employment programs adopted a 'job-first' approach for low-income Americans, especially TANF participants, job training opportunities have been virtually unavailable. Empirical evidence consistently suggests that the job-first approach does not lead to gains in overall earnings or economic self-sufficiency for low-income Americans, while the present study found that job skills training can produce consistent gains in employment and earnings for more than 30 years after program participation. In particular, women as well as racial and ethnic minorities, who face the most employment disadvantages, benefited. However, it is also important to note that, except for White and Hispanic men, most job skills training recipients were employed part-time rather than full-time. An excess of part-time employment, coupled with low earnings and a lack of benefits such as health insurance and retirement plans, increases the likelihood of experiencing material hardship over the life course. As more low-skilled jobs are lost to automation in the United States, this study provides evidence that, to improve job stability and economic self-sufficiency, government-sponsored employment programs need to focus on providing job skills training. At the same time, further measures are needed to address racial and ethnic as well as gender disparities in full-time employment.

Biographies

Sehun Oh, PhD, is Assistant Professor at The Ohio State University College of Social Work. His scholarly interests revolve around the intersection of poverty, social welfare policies, and behavioral health. Currently, he investigates how higher education and job skills training positively influence various life domains including economic self-sufficiency and behavioral health among individuals and families in need.

Diana M. DiNitto, PhD, is Cullen Trust Centennial Professor in Alcohol Studies and Education and Distinguished Teaching Professor at the Steve Hicks School of Social Work at the University of Texas at Austin. Her scholarly work focuses on social welfare policy and substance abuse, including polysubstance use and co-occurring mental disorders.

Daniel Powers, PhD, is Professor in the Department of Sociology at The University of Texas at Austin. His research focuses on demography, racial/ethnic differences in infant mortality, and statistics/methods. Most of his substantive work is intertwined with his methodological interests in advanced statistical methods including longitudinal data analysis, dynamic modeling and regression decomposition.

Footnotes

Declaration of Competing Interest

The authors have no conflicts to disclose.

Author note

Research reported in this publication was supported by the Society for Social Work and Research (SSWR) under the 2019 SSWR Doctoral Fellows Award. The content is solely the responsibility of the authors and does not necessarily represent the official views of the SSWR. This paper is based on a chapter from the first author's dissertation, which has been embargoed until May 2026 to allow for publication.

CRediT authorship contribution statement

Sehun Oh: Conceptualization, Methodology, Formal analysis, Writing - original draft, Writing - review & editing. Diana M. DiNitto: Conceptualization, Writing - original draft, Writing - review & editing. Daniel A. Powers: Methodology, Writing - review & editing.

References

  1. Angrist JD, & Imbens GW (1995). Two-stage least squares estimation of average causal effects in models with variable treatment intensity. Journal of the American Statistical Association, 90(430), 431–442. [Google Scholar]
  2. Austin PC (2011). An introduction to propensity score methods for reducing the effects of confounding in observational studies. Multivariate Behavioral Research, 46(3), 399–424. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Bassi LJ (1983). The effect of CETA on the postprogram earnings of participants. The Journal of Human Resources, 18(4), 539–556. [Google Scholar]
  4. Becker GS (2009). Human capital: A theoretical and empirical analysis, with special reference to education. Chicago, IL: University of Chicago Press. [Google Scholar]
  5. Bishop SW (2004). Work, welfare and case management services: An analysis of the Missouri FUTURES program. Journal of Poverty, 8(1), 43–60. [Google Scholar]
  6. Bloom HS, & McLaughlin MA (1982). CETA training programs: Do they work for adults? Washington, DC: Congressional Budget Office. [Google Scholar]
  7. Bloom HS, Orr LL, Bell SH, Cave G, Doolittle F, Lin W, … Bos JM (1997). The benefits and costs of JTPA title II-A programs: Key findings from the National Job Training Partnership Act study. Journal of Human Resources, 32(3), 549–576. [Google Scholar]
  8. Bloom HS, Orr LL, Cave G, Bell SH, & Doolittle F (1993). The National JTPA Study. Title II-A impacts on earnings and employment at 18 months. Bethesda, MD: Abt Associates. [Google Scholar]
  9. Braddock JH, & McPartland JM (1987). How minorities continue to be excluded from equal employment opportunities: Research on labor market and institutional barriers. Journal of Social Issues, 43(1), 5–39. [Google Scholar]
  10. van Buuren S, & Groothuis-Oudshoorn K (2010). MICE: Multivariate imputation by chained equations in R. Journal of Statistical Software, 10(2), 1–68. [Google Scholar]
  11. Cook TD, & Campbell DT (1979). Quasi-experimentation: Design and analysis issues for field settings. Chicago, IL: Rand McNally. [Google Scholar]
  12. Dannefer D (2003). Cumulative advantage/disadvantage and the life course: Cross-fertilizing age and social science theory. The Journals of Gerontology Series B: Psychological Sciences and Social Sciences, 58(6), S327–S337. [DOI] [PubMed] [Google Scholar]
  13. Dickinson KP, Johnson TR, & West RW (1986). An analysis of the impact of CETA programs on participants’ earnings. The Journal of Human Resources, 21(1), 64–91. [Google Scholar]
  14. DiNitto DM, & Johnson DH (2015). Social welfare: Politics and public policy (8th ed.). Boston, MA: Pearson. [Google Scholar]
  15. DuGoff EH, Schuler M, & Stuart EA (2014). Generalizing observational study results: Applying propensity score methods to complex surveys. Health Services Research, 49(1), 284–303. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Ehrenberg RG, & Smith RS (2017). Modern labor economics: Theory and public policy. New York, NY: Routledge. [Google Scholar]
  17. Fitzgerald J (2000). Community colleges as labor market intermediaries: Building career ladders for low wage workers. Washington, DC: US Department of Education. [Google Scholar]
  18. Fuller S (2017). Segregation across workplaces and the motherhood wage gap: Why do mothers work in low-wage establishments? Social Forces, 96(4), 1443–1476. [Google Scholar]
  19. Greenberg DH, Michalopoulos C, & Robins PK (2003). A meta-analysis of government-sponsored training programs. ILR Review, 57(1), 31–53. [Google Scholar]
  20. Greenberg M, Strawn J, & Plimpton L (2000). State opportunities to provide access to post-secondary education under TANF. Washington, DC: Center for Law and Social Policy. [Google Scholar]
  21. Greenberg DH, Michalopoulos C, & Robins PK (2006). Do experimental and non-experimental evaluations give different answers about the effectiveness of government-funded training programs? Journal of Policy Analysis and Management, 25(3), 523–552. [Google Scholar]
  22. Hamilton G, Freedman S, Gennetian L, Michalopoulos C, Walter J, Adams-Ciardullo D, & Ahluwalia S (2001). How effective are different welfare-to-work approaches? Five-year adult and child impacts for eleven programs. National Evaluation of Welfare-to-Work Strategies. Washington, DC: US Department of Health and Human Services & US Department of Education. [Google Scholar]
  23. Heckman JJ (1979). Sample selection bias as a specification error. Econometrica, 47(1), 153–161. [Google Scholar]
  24. Heinrich CJ, Mueser PR, Troske KR, Jeon K-S, & Kahvecioglu DC (2013). Do public employment and training programs work? IZA Journal of Labor Economics, 2, 6. [Google Scholar]
  25. Holland PW (1986). Statistics and causal inference. Journal of the American Statistical Association, 81(396), 945–960. [Google Scholar]
  26. Judd CM, & Kenny DA (1981). Estimating the effects of social intervention. New York, NY: Cambridge University Press. [Google Scholar]
  27. Korn EL, & Graubard BI (1991). Epidemiologic studies utilizing surveys: Accounting for the sampling design. American Journal of Public Health, 81(9), 1166–1173. [DOI] [PMC free article] [PubMed] [Google Scholar]
  28. Marcantonio RJ, & Cook TD (2010). Convincing quasi-experiments: The interrupted time series and regression-discontinuity designs. In Wholey JS, Hatry HP, & Newcomer KE (Eds.). Handbook of practical program evaluation. San Francisco, CA: Jossey-Bass. [Google Scholar]
  29. McCaffrey DF, Griffin BA, Almirall D, Slaughter ME, Ramchand R, & Burgette LF (2013). A tutorial on propensity score estimation for multiple treatments using generalized boosted models. Statistics in Medicine, 32(19), 3388–3414. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. McCaffrey DF, Ridgeway G, & Morral AR (2004). Propensity score estimation with boosted regression for evaluating causal effects in observational studies. Psychological Methods, 9(4), 403–425. [DOI] [PubMed] [Google Scholar]
  31. Merton RK (1968). The Matthew effect in science: The reward and communication systems of science are considered. Science, 159(3810), 56–63. [PubMed] [Google Scholar]
  32. Michalopoulos C, & Schwartz C (2000). What works best for whom: Impacts of 20 Welfare-to-Work programs by subgroup (Executive summary), National Evaluation of Welfare-to-Work Strategies. Washington, DC: US Department of Health and Human Services & US Department of Education. [Google Scholar]
  33. Michalopoulos C, Bloom HS, & Hill CJ (2004). Can propensity score methods match the findings from a random assignment evaluation of mandatory welfare-to-work programs? Review of Economics and Statistics, 86(1), 156–179. [Google Scholar]
  34. Negrey C, Um’rani A, Golin S, & Gault B (2000). Job training under welfare reform: Opportunities for and obstacles to economic self-sufficiency among low-income women. Georgetown Journal on Poverty Law & Policy, 7, 347–362. [Google Scholar]
  35. O’Rand AM (1996). The precious and the precocious: Understanding cumulative disadvantage and cumulative advantage over the life course. The Gerontologist, 36(2), 230–238. [DOI] [PubMed] [Google Scholar]
  36. O’Shea DP, & King CT (2001). The Workforce Investment Act of 1998: Restructuring workforce development initiatives in states and localities. Albany, NY: Nelson A. Rockefeller Institute of Government. [Google Scholar]
  37. Oreopoulos P, & Salvanes KG (2009). How large are returns to schooling? Hint: Money isn’t everything. Cambridge, MA: National Bureau of Economic Research. [Google Scholar]
  38. Pfeffermann D (1993). The role of sampling weights when modeling survey data. International Statistical Review, 61(2), 317–337. [Google Scholar]
  39. Reed D, Liu AY-H, Kleinman R, Mastri A, Reed D, Sattar S, … Ziegler J (2012). An effectiveness assessment and cost-benefit analysis of registered apprenticeship in 10 states. Oakland, CA: Mathematica Policy Research. [Google Scholar]
  40. Ridgeway G, McCaffrey D, Morral A, Burgette L, & Griffin BA (2017). Toolkit for Weighting and Analysis of Nonequivalent Groups: A tutorial for the twang package. Santa Monica, CA: Rand Corporation. [Google Scholar]
  41. Rosenbaum PR (2002). Observational studies (2nd ed.). New York, NY: Springer. [Google Scholar]
  42. Rosenbaum PR, & Rubin DB (1983). The central role of the propensity score in observational studies for causal effects. Biometrika, 70(1), 41–55. [Google Scholar]
  43. Rubin DB (1974). Estimating causal effects of treatments in randomized and non-randomized studies. Journal of Educational Psychology, 66(5), 688–701. [Google Scholar]
  44. Rubin DB (2001). Using propensity scores to help design observational studies: Application to the tobacco litigation. Health Services and Outcomes Research Methodology, 2(3–4), 169–188. [Google Scholar]
  45. Rubin DB (2004). Multiple imputation for nonresponse in surveys. New York, NY: John Wiley & Sons. [Google Scholar]
  46. Schochet PZ (2018). National Job Corps Study: 20-year follow-up study using tax data. Princeton, NJ: Mathematica Policy Research. [Google Scholar]
  47. Singer JD (1998). Using SAS PROC MIXED to fit multilevel models, hierarchical models, and individual growth models. Journal of Educational and Behavioral Statistics, 23(4), 323–355. [Google Scholar]
  48. Sobel ME (1994). Causal inference in latent variable models. In von Eye A, & Clogg CC (Eds.). Latent variables analysis: Applications for developmental research (pp. 3–35). Thousand Oaks, CA: Sage Publications, Inc. [Google Scholar]
  49. White IR, Royston P, & Wood AM (2011). Multiple imputation using chained equations: Issues and guidance for practice. Statistics in Medicine, 30(4), 377–399. [DOI] [PubMed] [Google Scholar]
