. Author manuscript; available in PMC: 2014 Dec 23.
Published in final edited form as: Educ Res. 2014 June/July;43(5):230–241. doi: 10.3102/0013189X14540207

Intended and Unintended Effects of State-Mandated High School Science and Mathematics Course Graduation Requirements on Educational Attainment

Andrew D Plunk 1, William F Tate 2, Laura J Bierut 1, Richard A Grucza 1
PMCID: PMC4275121  NIHMSID: NIHMS621235  PMID: 25541563

Abstract

Mathematics and science course graduation requirement (CGR) increases in the 1980s and 1990s might have had both intended and unintended consequences. Using logistic regression with Census and American Community Survey (ACS) data (n = 2,892,444), we modeled CGR exposure on (a) high school dropout, (b) beginning college, and (c) obtaining any college degree. Possible between-groups differences were also assessed. We found that higher CGRs were associated with higher odds of dropping out of high school, but results for the college-level outcomes varied by group. Some groups were less likely to enroll, whereas others who began college were more likely to obtain a degree. Increased high school dropout was consistent across the population, but some potential benefit was also observed, primarily for those reporting Hispanic ethnicity.

Keywords: dropouts, educational policy, high schools, higher education, mathematics education, policy analysis, regression analysis, science education

Background

Building on state standards-based policies of the past three decades, the Common Core State Standards and the National Research Council's (2011) Framework for K–12 Science Education provide guidelines for the mathematics, science, and engineering ideas and practices that all students should learn and be familiar with at the completion of high school. These guidelines call for changes in mathematics and science curricula and upgrades in high school mathematics and science course graduation requirements (CGRs). Forty-five states and the District of Columbia have adopted the Common Core Standards in mathematics, but many of these states have yet to move from curricular guidelines to actually mandating higher standards in the form of CGRs; only 11 of the Common Core states were classified as having high school CGRs in mathematics aligned with the standards, which call for coursework including Algebra II/Integrated Math III plus an additional course (The Center for Public Education & Change the Equation, 2013).

Research on the impact of these changes has been mixed. Allensworth, Nomi, Montgomery, and Lee (2009) evaluated the effects of Chicago Public Schools requiring Algebra I as a ninth-grade course, finding reduced differences in ninth-grade coursework by race/ethnicity and that credits earned in ninth grade mathematics increased. Their other findings were not as positive, including higher failure rates, grade reductions, and no change in test scores. In a related study, Nomi (2012) examined the Algebra-for-All policy in the Chicago Public Schools and found that schools developed more mixed-ability classrooms by eliminating remedial coursework when implementing algebra courses. Again, not all findings were positive; peer skill levels declined for high skill students, and as a result their test scores decreased. These studies examined both the intended and unintended consequences of a graduation requirement upgrade in mathematics at the school district level.

Surprisingly, studies of the impact of science and mathematics education in tandem are rarely done (Heck, 1998; Tate & Malancharuvil-Berkes, 2006), despite the longstanding push in both fields to upgrade state CGRs and the heavy emphasis on science, technology, engineering, and mathematics (STEM) education in public policy (National Research Council, 2007). Furthermore, few researchers have examined the intended and unintended consequences (e.g., school dropout) of mandated, state-wide increases in mathematics and science CGRs. For example, studies of the National Science Foundation's Statewide Systemic Initiatives have informed the STEM education literature, but these have rarely focused on the unintended consequences of state science and mathematics curriculum or graduation requirement mandates (Webb, Kane, Kaufman, & Yang, 2001; Zucker, Blank, & Gross, 1995). This raises important social justice issues. State-wide mandates could disadvantage some students in the short term, especially those from resource-poor districts that may struggle to implement large-scale curricular changes.

However, limited evaluation of these policies has not been an obstacle to further reform. For example, Arizona, Minnesota, North Carolina, Ohio, and Tennessee have made Algebra II a high school graduation requirement even as Florida has removed Algebra II as a requirement for the high school diploma, and Texas has a house bill calling for similar change (Robelen, 2013). Although research to date has not included all relevant information, it is feasible to inform current policy debates with evidence from past practices. Historical data can provide needed insight on both the intended and unintended consequences of increasing mathematics and science CGRs at the state level.

Reforms in Response to A Nation at Risk

In 1981, the Reagan administration's Secretary of Education T. H. Bell established the National Commission on Excellence in Education (NCEE, 1983) to “define the problems afflicting American education and to provide solutions” (p. iii). One of the NCEE's charges included studying the relationship between requirements for college admission and student achievement during high school. In their report, A Nation at Risk: The Imperative for Educational Reform, the NCEE described a number of risk factors related to high school mathematics and science education. For example, only a third of 17-year-olds in the United States could solve multiple-step mathematics problems on national assessments, and science achievement trends for 17-year-olds were also on the decline. Additionally, in the period from 1975 to 1980, public 4-year colleges experienced a 72% increase in remedial mathematics courses, which constituted one-quarter of all mathematics offerings taught in these institutions (NCEE, 1983).

The NCEE was also dissatisfied with the rate at which students were migrating from college preparatory programs to general track coursework. They reported that an estimated one-fourth of credits earned by students classified in the general track were in physical and health education, work study, remedial English and mathematics, and personal service and development courses. Another concern was the extensive individual choice allowed in selecting high school coursework, which the commission labeled “cafeteria-style curriculum management,” in which “appetizers and desserts can easily be mistaken for main courses” (p. 18), that resulted in students opting out of college preparatory mathematics and science courses. The NCEE concluded that lax high school graduation requirements communicated lowered expectations to students, a problem that was contributing to the “disturbing inadequacies” of the American education system (NCEE, 1983).

Within a year of the commission's report, 26 states had raised their graduation requirements and many of the rest were considering a change (Kornhaber & Orfield, 2001). From 1980 to 1989, 42 states increased high school graduation requirements in mathematics, science, or both (Center for Policy Research in Education, 1989). These changes were in response to calls for more cognitively demanding high school coursework in support of improved achievement and, in theory, to better prepare high school students for the demands of college (Tate, 2006). This thinking has been supported by research; for example, mathematics achievement scores are associated with the number of mathematics courses taken and the amount of time spent studying advanced mathematics (Miller, 1997; Noble & Schnelker, 2007; Secada, 1992; Tate, 1997). Similarly, Perkins, Kleiner, Roey, and Brown (2004) reported a positive relationship between secondary science and mathematics coursework and National Assessment of Educational Progress (NAEP) achievement outcomes.

The Effect of Mandated Mathematics and Science CGRs

The positive impact of rigorous coursework when chosen by students is not controversial, but there has been ongoing debate over the effects of requiring a more difficult high school curriculum for everyone. Some have argued that requiring more demanding mathematics and science coursework for all high school students provides students with greater access and opportunity to learn cognitively challenging content aligned with college readiness (ACT, 2006; National Research Council, 2011). However, other scholars propose that more challenging graduation requirements might promote increased high school dropout (Center for Policy Research in Education, 1989; Tate, Jones, Thorne-Wallington, & Hogrebe, 2012).

These are important considerations, as education has a broad impact on both individuals and society. For example, better health has been linked to higher educational attainment (Freudenberg & Ruglis, 2007), and high dropout rates have been linked to increased public assistance use (Waldfogel, Garfinkel, & Kelly, 2007) and higher crime (Lochner & Moretti, 2001). Furthermore, with respect to STEM policy specifically, Tate et al. (2012) argued that dropping out of school is a major threat to science and mathematics learning and college readiness, which they labeled the “empty seat” problem, wherein if the seat is empty there can be no mathematics and science learning by default. It is in this context that we seek to better understand the connection between state-mandated mathematics and science CGRs, high school dropout, and college attainment. By examining the impact of state-mandated mathematics and science CGRs in tandem, we anticipate making a unique contribution to current STEM education policy discussions.

To our knowledge, there have been no prior studies investigating state-mandated mathematics and science CGRs in tandem, and only a few others looking at these policies more generally. Of these, some specifically examined state requirements, whereas in others it was unclear whether CGRs were set at the state or local district level. Methodologies have also differed across these studies, and results have been mixed. Lillard and DeCicca (2001) examined the relationship between total CGRs across all subjects and high school dropout using both aggregate and individual-level data, finding a strong association. Their aggregate data were based on a variety of sources, whereas their individual-level analyses utilized the High School and Beyond (HSB) and National Educational Longitudinal Survey of 1988 (NELS:88); the original administrations and the first and second follow-ups of each were used. Using individual-level data, Hoffer (1997) examined mathematics CGRs by comparing schools with a three-course requirement to those that required two using the second follow-up of the NELS:88 and found no impact on graduation rates. Two other studies with similar methodology used aggregate state-level data to assess the impact of mathematics requirements in the context of content standards on a variety of outcomes, including high school graduation and college enrollment rates; both suggested that higher mathematics CGRs were associated with decreased high school graduation rates (Daun-Barnett & St. John, 2012; St. John, 2006). Findings for other outcomes were mixed; the earlier analysis suggested that there was not an association for going on to college (St. John, 2006), whereas the later study suggested that college enrollment increased (Daun-Barnett & St. John, 2012).

An extensive literature has examined the degree to which pre-collegiate indicators such as high school grades and standardized admissions test scores (ACT or SAT) predict grade point averages among college freshmen (Beatty, Greenwood, & Linn, 1999). Zwick and Sklar (2005) further demonstrated that pre-collegiate indicators also have value in predicting college graduation: Better high school grades and SAT scores were associated with a higher probability of obtaining a college degree. Similarly, work by Nicholls, Wolfe, Besterfield-Sacre, and Shuman (2010) suggests that the probability of earning an undergraduate STEM degree can be predicted using data on eighth-grade coursework and standardized test scores from high school.

We seek to extend this literature by examining both the unintended and intended effects of required state-wide STEM coursework in the form of science and mathematics CGRs that were mandated across the United States from the mid-1980s to the mid-1990s after the NCEE report. To this end, we examined three educational attainment outcomes: (a) dropping out of high school, (b) beginning college but not necessarily completing a degree, and (c) obtaining any college degree after having begun college.

Past research on mathematics CGRs provides a rationale for and justifies our work but was also limited in several ways. Most importantly, previous work focused on mathematics requirements, which does not necessarily speak to STEM policy more generally. Studies using aggregated data also have limited ability to assess differential impact on the basis of gender or race/ethnicity, an important consideration given the indiscriminate impact of policy mandates. Additionally, it was not always clear when a state or local requirement was assessed in past research. For example, the NELS:88 does report whether there was a course requirement for respondents, but the school administrator questionnaire used to assess that information did not specify whether the requirement was state or local (National Center for Education Statistics, 1991).

Using individual-level data allowed us to investigate several factors that potentially influenced how CGRs affected educational attainment. These included individual characteristics, such as sex and race/ethnicity, and several environmental factors. State-level income disparity and citizen political ideology were included in all models; our high school dropout analysis additionally controlled for the impact of state-mandated exit exams, and analyses for our college-level outcomes included the average cost of a 4-year public college and the amount of state-funded need-based aid. In addition to our main analyses, we also explored additional covariates in a series of confirmatory analyses using a second data set.

The changes in CGRs in response to A Nation at Risk differed between states and were often incremental within states over time, allowing for comparisons based both on whether there was a statewide mandate and also by the size of the requirement (e.g., two vs. six courses). These incremental differences both within and between states and across time form the basis of a natural experiment that allows us to assess the impact of changing state-mandated mathematics and science CGRs.

Method

Source Data

The sample for the main analyses was obtained from the Integrated Public Use Microdata Series site (Ruggles et al., 2010) using the 1990 and 2000 decennial Censuses and the 2001–2011 waves of the ACS, a yearly survey administered by the U.S. Census Bureau that is sampled from the same inventory of known living quarters as the decennial Census (U.S. Census Bureau, 2009). These data were combined and then restricted to the graduation years 1980 to 1999, which we estimated by adding 18 to each respondent's year of birth. In effect, this created repeated cross-sections based both on Census/ACS wave (i.e., 1990, 2000, and 2001–2011) and imputed graduation year (1980–1999). Individuals from states that never mandated math and science CGRs were excluded, as were those who immigrated to the United States after age 8. We also restricted our analyses to those who had at least begun high school, since we were investigating the impact of policy affecting coursework taken at that level. To focus on the period of change and to limit the impact of possible extraneous factors, we further restricted the sample to a window extending from 2 years before to 2 years after the policy change in each state, which restricted the period of analysis from 1983 to 1999. Focusing on periods of policy variation, both to limit the impact of reduced within-state variation over time and to seek out comparable controls, is a strategy common to our analytic method (Bertrand, Duflo, & Mullainathan, 2002; Meyer, 1995).
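The imputation and windowing steps described above can be sketched as follows. This is an illustrative sketch, not the authors' code; the function names and the example policy-change year are hypothetical.

```python
# Illustrative sketch of the sample-restriction logic described above.
# Function names and the example policy-change year are hypothetical.

def impute_graduation_year(birth_year):
    """Estimate high school graduation year as year of birth + 18."""
    return birth_year + 18

def in_policy_window(grad_year, policy_change_year,
                     years_before=2, years_after=2):
    """Keep respondents whose imputed graduation year falls within the
    window around their state's CGR policy change."""
    return (policy_change_year - years_before
            <= grad_year
            <= policy_change_year + years_after)

# A respondent born in 1972 in a state that raised its CGR in 1988:
grad_year = impute_graduation_year(1972)   # 1990
keep = in_policy_window(grad_year, 1988)   # 1990 falls within 1986-1990
```

A respondent imputed to graduate in 1993 in the same state would fall outside the window and be excluded.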

The ACS and Census do not retrospectively track where an individual resided when they were of high school age, which required that we estimate CGR exposure. We did this both for the full sample and for a subset that was less likely to have migrated between states. Analyses of the full sample (n = 2,892,444) assigned exposure based solely on the state in which an individual resided at the time of census/survey. The subsample was more restrictive and consisted of those whose state of residence was the same as their state of birth, which we labeled “likely nonmovers” (n = 1,837,119). Unlike the full sample, likely nonmovers are not representative of the general population, but those who live in the state in which they were born are much less likely to have migrated between states. Migration has been declining across demographic groups in the United States since the 1980s, but still remains correlated with several relevant factors, such as age and education (Molloy, Smith, & Wozniak, 2011). The two populations thus represent a potential tradeoff between generalizability and degree of confidence in determining policy exposure; we explore these differences in the Results section.

Outcome Measures and Covariates

Our main outcome measures were three variables constructed from the educational attainment item in the Census/ACS. The first, “high school dropout,” was based on having reported starting high school, but having failed to earn a diploma or GED. This is based on high school “completion” rather than “graduation” due to inclusion of GED recipients (Heckman & LaFontaine, 2010). The second outcome, “college enrollment,” was based on indicating having taken any college coursework, irrespective of receiving a degree, while the third, “any college degree,” indicated that an associate degree or higher had been completed. Individuals who were still attending high school were excluded from the high school dropout analyses. High school dropouts were excluded from analyses of college enrollment, and the “any college degree” analyses were limited to those who had at least begun college.
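Because each analysis is conditioned on the previous stage, the three outcomes are nested. A minimal sketch of that coding logic, with hypothetical ordered attainment categories standing in for the Census/ACS item:

```python
# Hypothetical ordered attainment categories standing in for the
# Census/ACS educational attainment item (not the actual codes).
LEVELS = [
    "began high school, no diploma/GED",   # 0
    "high school diploma/GED",             # 1
    "some college, no degree",             # 2
    "associate degree or higher",          # 3
]

def code_outcomes(attainment):
    """Return the nested outcome indicators; a key is present only
    when the respondent is eligible for that analysis."""
    level = LEVELS.index(attainment)
    outcomes = {"high_school_dropout": level == 0}
    if level >= 1:                      # completed high school
        outcomes["college_enrollment"] = level >= 2
    if level >= 2:                      # began college
        outcomes["any_college_degree"] = level >= 3
    return outcomes
```

For example, `code_outcomes("some college, no degree")` marks a respondent as a non-dropout who enrolled in college but earned no degree, while a dropout contributes only to the high school analysis.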

We included several state-level variables based on imputed graduation year. Each state's Gini coefficient, a measure of income disparity (University of Texas Inequality Project, 2012), was used in all models to control for the impact of socioeconomic factors that might have affected educational outcomes when each respondent was of high school age (instead of, for example, using income or poverty status as assessed later by the Census or ACS, as income later in life would be directly related to earlier education and could potentially confound our analyses). A measure of citizen political ideology developed by Berry, Ringquist, Fording, and Hanson (1998) was also included in all analyses to model differences in political climate that could potentially affect policy adoption and funding. Additional state-level covariates varied depending on the outcome: Analysis of high school dropout included a dummy variable denoting whether a state mandated an exit exam to receive a diploma, and our college-level analyses included the average cost of a 4-year public college and amount of state-funded need-based aid. Some states changed their exit exam requirements during this period (Kober et al., 2006), and exit exam status has been used as a covariate in previous CGR research (e.g., Lillard & DeCicca, 2001). College cost and state-funded need-based aid have both been found to be strongly associated with college-level outcomes in previous research (Heller, 1997). State need-based aid data were obtained from various years of the Digest of Education Statistics (Snyder & Hoffman, 1991a, 2002). Average cost was calculated for each state and year combination using the Enrollments and Institutional Characteristics surveys of the Integrated Postsecondary Education Data System (National Center for Education Statistics, 2013) and checked against published data when available (e.g., Snyder & Hoffman, 1991b).
Preliminary models assessed average cost for both 4- and 2-year colleges; there were no major differences between the two, and average 4-year college cost was included in final analyses. In all models, race/ethnicity was coded as non-Hispanic White, non-Hispanic Black, Hispanic, and other, with the final category excluded from stratified analyses due to high within-group heterogeneity. Additional covariates included dummy variables for state, imputed graduation year, Census/ACS wave, and sex.

Graduation Requirement Exposure Coding

We obtained state mathematics and science CGR data for the graduation years 1980 to 1999 from the Education Commission of the States (ECS), a nonprofit organization that tracks education policy (Education Commission of the States, 2012). During this period, six states never instituted statewide mathematics and science CGRs (Colorado, Iowa, Massachusetts, Michigan, Nebraska, and Wyoming); individuals from those states were excluded from our analyses. Another eight began without a statewide CGR but adopted one during the period and were included. Most states had separate mathematics and science CGRs, whereas others required a total number of courses combined from the two subjects; a total combined mathematics and science CGR was calculated for all states and years during this period and was the value used in our analyses. CGRs ranged from zero (no mandated statewide CGR) to six courses, with multiple values possible for states at different times due to incremental policy changes. Figure 1 graphically depicts these changes over time, and Table S1 (available on the journal website) lists the number of states that exhibited each level of CGR at least once during this period. Preliminary analyses assessed both categorical and interval-scale coding specifications for the policy exposure variable; results did not differ substantially between specifications, and there were no obvious nonlinear relationships based on threshold of exposure. To aid in interpretation, a 0–6 interval scale was used in subsequent analyses, in which response to CGR changes was modeled to change at a constant rate for each level of exposure. Additionally, although several of our college-level conditional analyses featured smaller sample sizes, the impact of a specific threshold of exposure did not substantively change.
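The exposure coding reduces each state-year to a single value on the 0–6 scale. A small sketch of that reduction (the function and field names are ours for illustration, not ECS's):

```python
def combined_cgr(math_req=None, science_req=None, total_req=None):
    """Total combined math and science CGR for a state-year.
    States reported either separate subject requirements or a single
    combined total; 0 denotes no statewide mandate (hypothetical
    field names, for illustration only)."""
    if total_req is not None:
        return total_req
    return (math_req or 0) + (science_req or 0)

# Separate requirements: three math courses plus two science courses
assert combined_cgr(math_req=3, science_req=2) == 5
# A state reporting only a combined total
assert combined_cgr(total_req=4) == 4
# No statewide mandate
assert combined_cgr() == 0
```

Treating the resulting 0–6 value as an interval scale, as in the main analyses, models each additional required course as shifting the log-odds of each outcome by a constant amount.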

Figure 1. State mathematics and science CGRs, combined, for the years 1980, 1990, and 1999. States without shading did not have a mandated statewide requirement, while the darkest shade of gray denotes a requirement of six courses; those states without a requirement by 1999 were excluded from our analyses.

Statistical Methods

Our analytic method is a variation of the “difference-in-differences” approach, which models exposure to a policy change by comparing pre- and postintervention differences in an outcome for affected groups to those for unexposed comparison groups (Wooldridge, 2010). A fixed-effects regression model was used, allowing intercepts to differ both between groups and across time periods to control for unobserved time-invariant heterogeneity (Allison, 2009; Wooldridge, 2010). In addition to performing full-sample analyses, we also ran a series of analyses conditioned on sex and race/ethnicity to investigate possible differences in estimates between demographic groups.

Logistic regression was used for all analyses to estimate the probability of the three educational attainment outcomes based on exposure to mandated state mathematics and science CGRs. The basic structure of the regression models was:

Y_ist = A_s + B_t + β_1 X_1ist + … + β_n X_nist + β GRADREQ_st + ε_ist

where Y_ist is an educational attainment outcome (e.g., high school dropout status) for individual i in state s in graduation year t; A_s and B_t are the fixed effects for states and graduation years, respectively; X_1 through X_n represent the covariates for each individual, whereas GRADREQ is the CGR for each state and graduation year. Its coefficient, β, is the coefficient of interest in our analyses, allowing us to take advantage of incremental changes in CGRs to estimate an average effect over time while controlling for time- and state-invariant factors.
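As a numerical illustration of this model, the sketch below simulates a small panel (the data and coefficient values are arbitrary, not the study's) and fits the fixed-effects logit by Newton-Raphson, with state and graduation-year dummies playing the roles of A_s and B_t and the 0–6 GRADREQ scale as the coefficient of interest:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_states, n_years = 20000, 5, 6

# Simulated panel: state, graduation year, one covariate, and a 0-6 CGR
state = rng.integers(0, n_states, n)
year = rng.integers(0, n_years, n)
x1 = rng.normal(size=n)
gradreq = rng.integers(0, 7, n).astype(float)

# Arbitrary "true" parameters for the simulation only
state_fe = np.linspace(-0.5, 0.5, n_states)   # A_s
year_fe = np.linspace(-0.3, 0.3, n_years)     # B_t
true_beta = 0.10                              # coefficient on GRADREQ

xb = -2.0 + state_fe[state] + year_fe[year] + 0.5 * x1 + true_beta * gradreq
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-xb))).astype(float)

# Design matrix: intercept, state and year dummies (first level dropped),
# the individual covariate, and GRADREQ as the final column
X = np.column_stack(
    [np.ones(n)]
    + [(state == s).astype(float) for s in range(1, n_states)]
    + [(year == t).astype(float) for t in range(1, n_years)]
    + [x1, gradreq]
)

# Newton-Raphson on the logit log-likelihood
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    gradient = X.T @ (y - p)
    hessian = X.T @ (X * (p * (1.0 - p))[:, None])
    beta += np.linalg.solve(hessian, gradient)

beta_gradreq = beta[-1]   # estimated GRADREQ coefficient (log-odds scale)
```

With a sample this size the estimate lands close to the simulated 0.10; exponentiating the coefficient gives the change in odds per additional required course, the interpretation used for the interval-scale coding above.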

Confirmatory Analyses

Although our main analyses were designed to assess outcomes related to policy exposure over time, they were also limited in several ways. The Census and ACS assess current status at the time of survey, even though characteristics at the time of exposure (at high school age, likely years earlier) would be more useful. We also estimated CGR exposure in our main analyses, which at best introduces random measurement error and at worst could lead to spurious associations if between-state migration is linked to one of the educational attainment outcomes (Grucza et al., 2012, 2013; Plunk, Cavazos-Rehg, Bierut, & Grucza, 2013). We address these issues in two ways.

First, we assessed past migration for both our full sample and for likely nonmovers. Second, we conducted a series of secondary analyses to assess the contemporaneous impact of CGR changes, in contrast to our retrospective main analyses. We performed these secondary analyses to (i) verify findings from our main Census/ACS analyses using other data, (ii) validate our approach to assigning policy exposure by using data for which we did not have to estimate exposure, and (iii) assess how other factors not available in Census/ACS data might affect how CGR exposure impacts educational attainment. To do this we created a repeated cross-sectional data set from the 1986–2000 March Supplements of the Current Population Survey (CPS) obtained from the Integrated Public Use Microdata Series (King et al., 2010). In addition to the inclusion criteria used in our main analyses, these data were restricted to 18–19-year-olds in each CPS wave who had been exposed to a CGR change, starting in 1996 because that was the first year the school attendance item was available. This allowed us to assess the impact of the changes as they were occurring and without having to estimate where respondents resided. Our outcome was high school dropout, which we determined based on respondents indicating that they had not completed high school and were no longer in school, at ages 18 and 19. These analyses featured additional covariates, as we had more information about respondents when they were exposed. We included additional dummy variables for socioeconomic indicators, such as poverty status and maternal education, which in past research have been predictive of high school dropout (Oreopoulos, Page, & Stevens, 2006; Pong & Ju, 2000). These variables were not available for all CPS respondents, and it is possible that restricting the sample by conditioning on them introduces some selection bias (Oreopoulos et al., 2006).
We chose to include only maternal education, rather than also requiring paternal education, as a much larger proportion of respondents had complete data for that item (n = 11,950).

All analyses were performed using version 2.15.2 of the statistical language R (R Development Core Team, 2012). The survey package in R was also used (Lumley, 2012). Two-way cluster-robust standard errors, used to adjust for correlation of observations both within state and time (Petersen, 2009), were obtained in all analyses using R code based on the work of Arai (2009).
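The two-way clustering described above follows the decomposition V(state) + V(time) − V(state × time) (Petersen, 2009). A sketch of that decomposition for the linear case is below; the helper names are ours (the authors' implementation was R code based on Arai, 2009), so this is illustrative only.

```python
import numpy as np

def cluster_cov(X, resid, groups):
    """One-way cluster-robust (sandwich) covariance for OLS."""
    bread = np.linalg.inv(X.T @ X)
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(groups):
        idx = groups == g
        score = X[idx].T @ resid[idx]   # score summed within cluster
        meat += np.outer(score, score)
    return bread @ meat @ bread

def two_way_cluster_cov(X, resid, g1, g2):
    """Two-way (e.g., state and year) cluster-robust covariance:
    V(g1) + V(g2) - V(g1 intersected with g2)."""
    g12 = np.array([f"{a}|{b}" for a, b in zip(g1, g2)])
    return (cluster_cov(X, resid, g1)
            + cluster_cov(X, resid, g2)
            - cluster_cov(X, resid, g12))

# Toy example: OLS residuals clustered by state and by year
rng = np.random.default_rng(1)
n = 400
states = rng.integers(0, 10, n)
years = rng.integers(0, 8, n)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)
b = np.linalg.lstsq(X, y, rcond=None)[0]
V = two_way_cluster_cov(X, y - X @ b, states, years)
```

Subtracting the intersection term prevents observations that share both a state and a year from being double-counted in the standard errors.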

Results

Sample Description

Table 1 compares the full sample to likely nonmovers on several demographic variables. A greater proportion of likely nonmovers displayed lower educational attainment and were Black or Hispanic, compared to the full sample. The distributions of mathematics and science CGRs and mandated exit exams were similar,1 and the proportion of men and women did not differ between the two; likewise, the mean age (33) and age range (19–40) were the same, as were the imputed graduation year mean (1988) and range (1983–1999). Ranges for the citizen political ideology, income disparity, average 4-year college cost, and state-funded need-based aid covariates were identical between samples. Mean values were also similar, but likely nonmovers exhibited higher amounts of state-funded need-based aid on average.

Table 1. Demographic Characteristics by Sample Subset.

|                                           | Full Sample n | Full Sample % | Likely Nonmovers n | Likely Nonmovers % |
|-------------------------------------------|---------------|---------------|--------------------|--------------------|
| Sample size                               | 2,892,444     | 100.00        | 1,837,119          | 100.00             |
| Educational attainment                    |               |               |                    |                    |
| No high school diploma                    | 291,022       | 10.06         | 208,671            | 11.36              |
| High school diploma/GED                   | 862,010       | 29.80         | 602,787            | 32.81              |
| Some college, no degree                   | 770,759       | 26.65         | 489,055            | 26.62              |
| Associate degree                          | 255,626       | 8.84          | 164,750            | 8.97               |
| Bachelor degree                           | 507,277       | 17.54         | 274,973            | 14.97              |
| Graduate degree                           | 205,750       | 7.11          | 96,883             | 5.27               |
| Race/ethnicity                            |               |               |                    |                    |
| White                                     | 2,260,017     | 78.14         | 1,406,165          | 76.54              |
| Black                                     | 361,695       | 12.50         | 252,475            | 13.74              |
| Hispanic                                  | 270,732       | 9.36          | 178,479            | 9.72               |
| Sex                                       |               |               |                    |                    |
| Men                                       | 1,473,437     | 50.94         | 940,914            | 51.22              |
| Women                                     | 1,419,007     | 49.06         | 896,205            | 48.78              |
| Education policy variables                |               |               |                    |                    |
| Math and science graduation requirement   |               |               |                    |                    |
| NA                                        | 294,604       | 10.19         | 182,297            | 9.92               |
| Two courses                               | 491,186       | 16.98         | 326,941            | 17.80              |
| Three courses                             | 313,198       | 10.83         | 209,189            | 11.39              |
| Four courses                              | 1,197,708     | 41.41         | 757,896            | 41.25              |
| Five courses                              | 335,524       | 11.60         | 208,228            | 11.33              |
| Six courses                               | 260,224       | 9.00          | 152,568            | 8.30               |
| Mandatory exit exam                       | 280,183       | 9.69          | 169,473            | 9.22               |
| Mean average 4-year college cost          | $1,642        |               | $1,685             |                    |
| Mean state-funded need-based aid          | $76,040       |               | $88,060            |                    |
| Other state-level covariates              |               |               |                    |                    |
| Mean political ideology                   | 49.10         |               | 49.91              |                    |
| Mean income disparity                     | 0.40          |               | 0.40               |                    |

Note. These data are based on the 1990 and 2000 Censuses and the 2001 to 2011 American Community Survey. “Likely nonmovers” are those who, when surveyed, lived in the same state in which they were born. State-funded need-based aid is reported in thousands.

Percentages for the three educational attainment outcomes by CGR for the full sample can be seen in Table 2 (see Table S2, available on the journal website, for likely nonmovers). There was a general trend for the high school dropout rate to increase as states required more mathematics and science coursework, although this varied by demographic group—those reporting Hispanic ethnicity were the only group that did not exhibit their highest dropout rates for the highest CGR. There were also large between-groups differences in magnitude, with Blacks and Hispanics dropping out at over twice the rate of Whites. These differences persisted for the other two outcomes; Whites in our sample who began college completed a degree 58.2% of the time, whereas Blacks and Hispanics completed a college degree at 43.5% and 45.1%, respectively (Table 2). The impact of increasing CGRs was less pronounced on enrolling in college and completing a college degree, although between-groups differences in magnitude remained (Table 2 and Table S2, available on the journal website).

Table 2. Educational Attainment Outcome Rates by Demographic Group and Combined Math and Science Course Graduation Requirement, Full Sample.

Overall Sex and Race/Ethnicity
Sex Race/Ethnicity


White Men White Women Black Men Black Women Hispanic Men Hispanic Women
Men Women White Black Hispanic
Failing to complete high school, conditioned on having begun
Overall 10.06 11.52 8.65 8.14 16.40 17.63 9.39 6.91 19.10 14.09 20.04 15.35
By CGR
NA 8.63 9.90 7.40 6.66 16.09 16.74 7.79 5.55 18.04 14.34 18.94 14.62
Two courses 8.90 10.18 7.67 7.66 15.97 17.65 8.90 6.46 18.07 14.18 19.42 15.98
Three courses 9.82 11.23 8.48 8.24 16.68 15.70 9.45 7.05 19.58 14.25 18.03 13.53
Four courses 10.46 11.98 8.99 8.31 16.06 17.74 9.54 7.08 18.70 13.76 20.39 15.19
Five courses 10.76 12.20 9.40 8.34 15.60 19.80 9.62 7.11 18.05 13.50 21.80 17.96
Six courses 11.41 13.27 9.64 9.72 18.86 13.60 11.23 8.23 23.10 15.34 15.84 11.48
Starting college, conditioned on having completed high school
Overall 66.86 63.52 69.99 68.73 57.98 61.51 65.68 71.65 51.69 63.07 57.90 64.75
By CGR
NA 69.33 66.19 72.32 70.99 59.73 63.92 68.07 73.81 53.69 64.91 60.62 66.94
Two courses 64.81 61.73 67.68 65.81 56.87 62.43 62.96 68.52 50.83 61.78 59.90 64.71
Three courses 65.65 62.60 68.48 67.53 56.16 59.71 64.88 70.07 49.39 61.47 56.51 62.54
Four courses 68.22 64.75 71.47 70.55 59.40 61.62 67.40 73.57 53.15 64.54 57.62 65.22
Five courses 66.36 62.92 69.50 69.17 56.98 58.06 65.92 72.24 50.94 61.85 55.01 60.71
Six courses 63.82 59.91 67.40 65.09 56.47 65.77 61.50 68.50 49.58 61.69 62.25 68.94
Obtaining any college degree, conditioned on having begun college
Overall 55.69 54.29 56.87 58.20 43.51 45.10 56.71 59.51 40.34 45.60 43.84 46.12
By CGR
NA 57.89 57.00 58.67 60.27 43.73 47.53 59.18 61.25 41.31 45.45 47.80 47.30
Two courses 57.83 56.64 58.85 59.39 44.72 50.19 58.15 60.48 42.05 46.50 48.36 51.71
Three courses 55.35 53.83 56.63 57.43 42.44 48.82 55.87 58.81 38.51 44.91 47.07 50.22
Four courses 54.73 53.20 56.03 57.69 43.39 43.25 56.10 59.09 40.08 45.63 41.50 44.64
Five courses 53.69 52.39 54.76 56.46 42.78 43.61 54.99 57.72 39.79 44.76 43.32 43.85
Six courses 56.59 54.75 58.09 58.76 44.20 54.22 56.72 60.49 40.97 46.16 52.63 55.52

Note. Percentages are based on the 1990 and 2000 Censuses and the 2001 to 2011 American Community Surveys (n = 2,892,444).

High School Dropout

Results predicting high school dropout from mandated statewide mathematics and science CGRs are summarized in Table 3 for both the full sample and likely nonmovers. Table 3 presents CGR estimates broken down by demographic group and reports the change in predicted probability of dropping out of high school for those exposed to the highest CGR relative to the average for each group; estimates for the other covariates in the model are included in Table S2 (available on the journal website).

Table 3. High School Dropout by Math and Science Course Graduation Requirement.

Full Sample: b, SE, n, Δ P; Likely Nonmovers: b, SE, n, Δ P
All respondents 0.0146*** 0.0041 2,892,444 0.82 0.0170*** 0.0046 1,837,119 1.07
Conditioned on sex
Men 0.0149*** 0.0042 1,419,007 0.94 0.0173*** 0.0053 896,205 1.23
Women 0.0144** 0.0061 1,473,437 0.71 0.0168** 0.0062 940,914 0.91
Conditioned on race/ethnicity
White 0.0169*** 0.0046 2,260,017 0.79 0.0177*** 0.0054 1,406,165 0.92
Black 0.0022 0.0035 361,695 0.18 0.0066 0.0051 252,475 0.62
Hispanic 0.0099 0.0065 270,732 0.88 0.0136* 0.0068 178,479 1.23
Conditioned on sex and race/ethnicity
White men 0.0138** 0.0049 1,120,000 0.73 0.0124* 0.0056 695,913 0.74
White women 0.0212** 0.0069 1,140,017 0.86 0.0252*** 0.0075 710,252 1.14
Black men 0.0080 0.0053 167,137 0.75 0.0176* 0.0074 113,989 1.88
Black women −0.0034 0.0050 194,558 −0.24 −0.0045 0.0056 138,486 −0.36
Hispanic men 0.0140* 0.0056 131,870 1.38 0.0252*** 0.0047 86,303 2.58
Hispanic women 0.0045 0.0099 138,862 0.35 −0.0001 0.0154 92,176 −0.01

Note. Only individuals who at least began high school were included in these analyses, which are based on the 1990 and 2000 Census and the 2001 to 2011 American Community Survey. “Likely nonmovers” differ by generalizability and propensity for between-state migration and are those who, when surveyed, lived in the same state in which they were born; “full sample” is the broader sample with exposure determined only by state of residence at the time of survey. Δ P is the change in predicted probability of dropping out of high school for those with the highest CGR, expressed as a change in percentage points.

p < .10. *p < .05. **p < .01. ***p < .001.

Exposure to the highest CGR (six courses) was associated with a 0.82 percentage point increase in the probability of dropping out of high school in the full sample (b = 0.015, SE = 0.004, p < .001) and a 1.07 percentage point increase for likely nonmovers (b = 0.017, SE = 0.005, p < .001). In conditional analyses, Black and Hispanic men saw the greatest increases in their probability of dropping out of high school, 1.88 and 2.58 percentage points, respectively, for those exposed to the highest CGR (b = 0.018, SE = 0.007, p = .018; b = 0.025, SE = 0.005, p < .001).
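The conversion from a logit coefficient to the Δ P values reported above can be illustrated with a short sketch. The baseline dropout rate (10.06%, Table 2), the per-course coefficient (0.0146, Table 3), and the assumed sample-average CGR of about 2.5 courses are illustrative inputs, and the calculation ignores the covariate adjustment used in the actual models, so it approximates rather than reproduces the reported 0.82-point figure.

```python
import math

def logistic(x):
    """Inverse logit: map a log-odds value back to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def delta_pp(baseline_p, b_per_course, course_shift):
    """Change in predicted probability, in percentage points, when the
    CGR covariate shifts by `course_shift` courses, holding everything
    else at values that reproduce `baseline_p`."""
    logit0 = math.log(baseline_p / (1.0 - baseline_p))
    p1 = logistic(logit0 + b_per_course * course_shift)
    return (p1 - baseline_p) * 100.0

# Illustrative only: baseline ~10.06%, b = 0.0146 per course, and a
# shift from an assumed average of ~2.5 courses to the six-course max.
print(round(delta_pp(0.1006, 0.0146, 6 - 2.5), 2))  # ≈ 0.47 percentage points
```

Because the logistic link is nonlinear, the percentage-point change depends on where the baseline probability sits, which is why the paper reports Δ P separately for each demographic group rather than a single marginal effect.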

We also observed significant associations when assessing the contemporaneous impact of CGR changes on high school dropout in our secondary analyses with CPS data. Using the same covariates as in our main analyses, exposure to the highest CGR was associated with a 9.91 percentage point increase in the probability of dropping out of high school compared to those without a CGR (b = 0.125, SE = 0.047, p = .008; Table S4, available on the journal website). The additional covariates used in these analyses did not have a large impact on the CGR estimate.

College Enrollment and Degree Completion

Analyses predicting the probability of college enrollment from exposure to state mathematics and science CGRs showed no significant associations for either the full sample or likely nonmovers. In conditional analyses, however, some subgroups were less likely to go on to college when exposed to higher CGRs. Black women and Hispanic men and women (both together and when analyzed separately) exhibited significant associations across both sample subsets. We noted the largest impact for likely nonmover Hispanic men and women, for whom the probability of going to college decreased by 5.21 and 3.32 percentage points, respectively, with exposure to the highest CGR (b = −0.035, SE = 0.013, p = .008; b = −0.024, SE = 0.010, p = .022; Table 4 and Table S5, available on the journal website).

Table 4. College-Level Outcomes by Math and Science Course Graduation Requirement.

College Enrollment After Completing High School

Full Sample: b, SE, n, Δ P; Likely Nonmovers: b, SE, n, Δ P
All respondents −0.0044 0.0028 2,606,938 −0.58 −0.0025 0.0028 1,628,448 −0.35
Conditioned on sex
Men −0.0029 0.0027 1,258,158 −0.40 −0.0018 0.0026 778,220 −0.27
Women −0.0061 0.0039 1,348,780 −0.77 −0.0032 0.0045 850,228 −0.43
Conditioned on race/ethnicity
White −0.0028 0.0033 2,077,740 −0.37 0.0003 0.0036 1,276,939 0.05
Black −0.0053 0.0056 302,914 −1.43 −0.0098 0.0053 205,000 −1.46
Hispanic −0.0147*** 0.0013 226,284 −4.17 −0.0288** 0.0091 146,509 −4.20
Conditioned on sex and race/ethnicity
White men −0.0011 0.0036 1,015,684 −0.14 0.0027 0.0033 621,021 0.38
White women −0.0050 0.0040 1,062,056 −0.62 −0.0023 0.0051 655,918 −0.29
Black men −0.0040 0.0097 135,474 −0.59 −0.0115 0.0082 88,641 −1.72
Black women −0.0067** 0.0023 167,440 −0.95 −0.0089* 0.0044 116,359 −1.29
Hispanic men −0.0104* 0.0045 107,000 −1.52 −0.0350** 0.0132 68,558 −5.21
Hispanic women −0.0190** 0.0065 119,284 −2.64 −0.0235* 0.0103 77,951 −3.32
Obtaining any degree after starting college

Full Sample: b, SE, n, Δ P; Likely Nonmovers: b, SE, n, Δ P
All respondents 0.0029* 0.0014 1,396,909 0.42 0.0028 0.0020 799,449 0.42
Conditioned on sex
Men 0.0036*** 0.0009 639,839 0.54 0.0046 0.0029 357,743 0.69
Women 0.0021 0.0024 757,070 0.30 0.0014 0.0032 441,706 0.20
Conditioned on race/ethnicity
White 0.0013 0.0015 1,150,104 0.19 −0.0011 0.0018 648,334 −0.16
Black 0.0101 0.0059 140,882 1.49 0.0105* 0.0056 86,210 1.52
Hispanic 0.0106 0.0099 105,923 1.58 0.0398*** 0.0107 64,905 5.91
Conditioned on sex and race/ethnicity
White men 0.0040*** 0.0007 536,730 0.58 0.0034 0.0034 297,043 0.50
White women −0.0014 0.0030 613,374 −0.20 −0.0051 0.0041 351,291 −0.75
Black men 0.0050 0.0045 55,940 0.72 −0.0058 0.0153 32,208 −0.80
Black women 0.0135 0.0098 84,942 2.01 0.0194* 0.0093 54,002 2.87
Hispanic men 0.0013 0.0075 47,169 0.19 0.0430* 0.0200 28,492 6.36
Hispanic women 0.0179 0.0162 58,754 2.68 0.0356 0.0240 36,413 5.32

Note. Based on the 1990 and 2000 Censuses and the 2001 to 2011 American Community Surveys. “Likely nonmovers” differ by generalizability and propensity for between-state migration and are those who, when surveyed, lived in the same state in which they were born; “full sample” is the broader sample, with exposure determined only by state of residence at the time of survey. Δ P is the change in predicted probability of the outcome for those with the highest CGR, expressed as a change in percentage points.

p < .10. *p < .05. **p < .01. ***p < .001.

The any-college-degree analyses yielded a significant association for the full sample, in which the probability of completing a degree for those exposed to the highest CGR increased by 0.42 percentage points (b = 0.003, SE = 0.001, p = .047), but not for likely nonmovers (b = 0.003, SE = 0.002, p = .162; Table 4 and Table S6, available on the journal website). Results for conditional analyses varied greatly by sample subset. In the full sample, White men were the only demographic group to yield a significant association, and the size of the effect was small (b = 0.004, SE = 0.001, p < .001). The likely nonmover subset exhibited a pattern similar to the college enrollment analyses: there was a significant association for Blacks that persisted when conditioning on Black women, for whom the probability of completing a degree increased by 2.87 percentage points with exposure to the highest CGR (b = 0.020, SE = 0.009, p = .037), and a significant association for Hispanics that persisted when conditioning on Hispanic men, for whom the probability of completing a college degree increased by 6.36 percentage points (b = 0.043, SE = 0.020, p = .031).

Cross-State Migration

We assessed past migration to quantify the degree to which we could be assigning incorrect CGR exposure because of a change in state of residence. The Census and ACS track migration differently: both ask where an individual lived at a point in the past, but the Census asks where each person lived 5 years prior, whereas the ACS asks where respondents lived 1 year ago. In the Census-derived portion, past 5-year migration rates were 14.05% for the full sample and 3.73% for likely nonmovers; in the ACS-derived portion, past-year migration was 3.83% for the full sample and 1.32% for likely nonmovers (Table S7, available on the journal website).

Conclusion

Summary of Findings

We found that individuals who were exposed to higher CGRs were more likely to drop out of high school. Findings for this outcome were robust and consistent both in our main analyses and in secondary analyses using a dataset that is less prone to measurement error and that allowed us to assess individual-level factors unavailable in Census/ACS data. We did not observe significant associations in the college enrollment analyses until we conditioned on sex and race/ethnicity; CGR increases were then associated with a decrease, rather than an increase, in the likelihood that Black women and Hispanic men and women would begin college after completing high school. However, higher CGRs were also associated with some benefit for these groups: likely nonmover Black women and Hispanic men and women who graduated from high school and then started college were more likely to obtain a degree.

Policy Implications

The results of our high school dropout analyses are broadly meaningful for policy across multiple domains. For example, high school dropout has been associated with increased crime; Lochner and Moretti (2001) estimated that a 1% reduction in the national high school dropout rate would have resulted in 400 fewer murders and 8,000 fewer assaults, and that a 1% reduction for all men ages 20–60 would save the United States as much as $1.4 billion per year in reduced costs from crime incurred by victims and society at large. These estimates highlight the relevance of our high school dropout findings, which are comparable in magnitude to the 1% reduction that Lochner and Moretti modeled.

Results from our college enrollment and degree attainment analyses are more difficult to interpret. First, we did not observe any meaningful benefit for college enrollment associated with higher CGRs; significant associations were limited to Black women and Hispanic men and women, for whom higher CGRs were associated with less college enrollment. Furthermore, between-samples differences based on likelihood of migration raise questions about generalizability. We did observe meaningful effect sizes, most notably for those reporting Hispanic ethnicity, but these results were largest and most consistent for likely nonmovers. For example, we noted 1.52 and 2.64 percentage point reductions in college enrollment for Hispanic men and women, respectively, in the full sample, and 5.21 and 3.32 point reductions, respectively, when restricted to likely nonmovers (Table 4). The impact of migration status was more pronounced in the college degree analyses, which did not produce statistically significant estimates for the full sample but suggested 6.36 and 5.32 percentage point increases in the probability of obtaining a college degree for likely nonmover Hispanic men and women, respectively. These are very large effects; the college degree attainment result for Hispanic men represents a 15% increase over the average for that subgroup (see Table S2, available on the journal website), but it is difficult to know how these findings might generalize to other individuals of Hispanic ethnicity who were more likely to migrate between states.

However, our results could suggest that migration status serves as a proxy for unmeasured factors relevant to education for Hispanic men and women. Likely nonmovers were by definition born in the United States, whereas the full sample was restricted only to those who immigrated before the age of 8, and past research indicates that immigration and migration present distinct challenges for Hispanic youth (Rumbaut & Portes, 2001). We caution against broadly applying these results, but they have interesting implications if likely nonmover status is interpreted as a proxy for a more stable environment among those who reported Hispanic ethnicity. Interpreted this way, our findings could offer support for the greater effectiveness of education policy for a population when specific factors associated with perpetuated social disadvantage are addressed.

Potential Limitations and Conclusion

We make several assumptions, chief among them that changes in mandated state-level mathematics and science CGRs were driven by a national trend and thus constituted a plausible natural experiment. Statistically, this idea is expressed in assumptions about potential relationships between measured and unmeasured factors, which, if violated, could bias us toward false positive associations. We explore how reasonable these assumptions are, and the circumstances under which they would likely be violated, in the Supplementary Material (available on the journal website). Additionally, our analytic approach could lead us to underestimate the true positive impact of higher CGRs if that impact operates through something like school culture, which could take several years to change. Our study assessed only the short-term impact associated with the period of CGR increases and thus cannot speak to benefits (or other negative consequences) that required more time to develop.

Insofar as our assumptions hold, this study implies that changes in state mathematics and science CGRs had a large impact on the educational attainment of the high school students who were exposed to them. However, we observed no evidence of broad benefit related to increases in mathematics and science CGRs; higher requirements were significantly associated with greater college degree attainment only for certain demographic groups. Instead, higher CGRs were consistently associated with increased high school dropout across the whole population and with decreases, rather than increases, in college enrollment. Despite being based on historical data, this study highlights the importance of anticipating unintended consequences when making broad policy changes across diverse populations, and it remains relevant today. In particular, implementing the Common Core Standards and the NRC Framework for K–12 Science Education recommendations for more demanding content and more rigorous graduation requirements in science and mathematics will likely also require appropriate academic and psychosocial supports for higher risk students.

Supplementary Material

Supplemental Material
Supplemental Table 1
Supplemental Table 2
Supplemental Table 3
Supplemental Table 4
Supplemental Table 5
Supplemental Table 6
Supplemental Table 7

Acknowledgments

Sources of funding included T32DA07313 (A.D.P.), R01DA031288 (R.A.G.), and the Washington University Institute for Public Health.

Biography

Andrew D. Plunk, PhD, MPH, is a postdoctoral research fellow in the Department of Psychiatry, Washington University School of Medicine, 660 South Euclid Avenue, St. Louis, MO 63110; plunka@psychiatry.wustl.edu. His research focuses on education and drug policy, especially their impact on adolescent alcohol and tobacco use.

William F. Tate, PhD, MPE, is the Edward Mallinckrodt Distinguished University Professor in Arts & Sciences, Dean of the Graduate School of Arts & Sciences and Vice Provost of Graduate Education at Washington University in St. Louis, Campus Box 1183, St. Louis, MO 63130-4899; wtate@wustl.edu. His research is focused on using epidemiological and geospatial modeling to explain the influence of social policy on education, health, and human development.

Laura J. Bierut, MD, is an alumni endowed professor in psychiatry at Washington University School of Medicine in St. Louis; bierutl@psychiatry.wustl.edu. Her research focuses on influences of policy and genetics on behavior.

Richard A. Grucza, PhD, is an epidemiologist in the Department of Psychiatry at Washington University Medical School, 660 South Euclid Avenue, Saint Louis, MO 63110. His research focuses on alcohol and drug abuse, particularly as related to public policy.

Footnotes

1

Note: The distribution of CGR exposure did change in several conditional analyses, but likely did not bias our results. For example, in our college degree analysis conditioned on likely nonmover Hispanics (n = 64,905), 7,764 (12.0%) were exposed to no CGR, 3,568 (5.5%) were exposed to two courses, 2,825 (4.4%) to three, 37,843 (58.3%) to four, 11,213 (17.3%) to five, and 1,692 (2.6%) to six.

References

1. ACT. College and Workforce Training Readiness. Iowa City, IA: Author; 2006.
2. Allensworth E, Nomi T, Montgomery N, Lee VE. College preparatory curriculum for all: Academic consequences of requiring algebra and English I for ninth graders in Chicago. Educational Evaluation and Policy Analysis. 2009;31(4):367–391.
3. Allison PD. Fixed effects regression models. Thousand Oaks, CA: Sage; 2009.
4. Arai M. Cluster-robust standard errors using R. 2009. Retrieved from http://people.su.se/∼ma/clustering.pdf.
5. Beatty A, Greenwood M, Linn RL, editors. Myths and tradeoffs: The role of tests in undergraduate admissions. Washington, DC: National Academies Press; 1999.
6. Berry WD, Ringquist EJ, Fording RC, Hanson RL. Measuring citizen and government ideology in the American states, 1960–93. American Journal of Political Science. 1998;42(1):327–348.
7. Bertrand M, Duflo E, Mullainathan S. How much should we trust differences-in-differences estimates? (Working Paper No. 8841). Cambridge, MA: National Bureau of Economic Research; 2002. Retrieved from http://www.nber.org/papers/w8841.
8. Center for Policy Research in Education. Graduating from high school: New standards in the states. A "bulletin" special. CPRE Policy Briefs (RB-02-04/89). 1989. Retrieved from http://www.eric.ed.gov/ERICWebPortal/detail?accno=EJ385300.
9. The Center for Public Education & Change the Equation. Out of sync: Many Common Core states have yet to define a Common Core-worthy diploma. The Center for Public Education. 2013. Retrieved from http://changetheequation.org/sites/default/files/GradReqs_v5.pdf.
10. Daun-Barnett N, St. John EP. Constrained curriculum in high schools: The changing math standards and student achievement, high school graduation and college continuation. Education Policy Analysis Archives. 2012;20(5).
11. Education Commission of the States. About ECS. 2012. Retrieved October 23, 2012, from http://www.ecs.org/html/aboutECS/home_aboutECS.htm.
12. Freudenberg N, Ruglis J. Reframing school dropout as a public health issue. Preventing Chronic Disease. 2007;4(4). Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2099272/
13. Grucza RA, Hipp PR, Norberg KE, Rundell L, Evanoff A, Cavazos-Rehg P, Bierut LJ. The legacy of minimum legal drinking age law changes: Long-term effects on suicide and homicide deaths among women. Alcoholism: Clinical and Experimental Research. 2012;36(2):377–384. doi: 10.1111/j.1530-0277.2011.01608.x.
14. Grucza RA, Plunk AD, Hipp PR, Cavazos-Rehg P, Brownson RC, Bierut LJ. Long-term effects of laws governing youth access to tobacco. American Journal of Public Health. 2013;103(8):1493–1499. doi: 10.2105/AJPH.2012.301123.
15. Heck DJ. Evaluating equity in statewide systemic initiatives: Asking the right questions. Journal of Women and Minorities in Science and Engineering. 1998;4(2):161–181.
16. Heckman JJ, LaFontaine PA. The American high school graduation rate: Trends and levels. The Review of Economics and Statistics. 2010;92(2):244–262. doi: 10.1162/rest.2010.12366.
17. Heller DE. Student price response in higher education: An update to Leslie and Brinkman. The Journal of Higher Education. 1997;68(6):624–659. doi: 10.2307/2959966.
18. Hoffer T. High school graduation requirements: Effects on dropping out and student achievement. The Teachers College Record. 1997;98(4):584–607.
19. King M, Ruggles S, Alexander T, Flood S, Genadek K, Schroeder M, Vick R. Integrated Public Use Microdata Series, Current Population Survey: Version 3.0 (Machine-readable database). Minneapolis: University of Minnesota; 2010.
20. Kober N, Zabala D, Chudowsky N, Chudowsky V, Gayler K, McMurrer J. State high school exit exams: A challenging year. Washington, DC: Center on Education Policy; 2006.
21. Kornhaber ML, Orfield G. High-stakes testing policies: Examining their assumptions and consequences. In: Raising standards or raising barriers? Inequality in high stakes testing in public education. New York: Century Foundation Press; 2001. pp. 1–18.
22. Lillard DR, DeCicca PP. Higher standards, more dropouts? Evidence within and across time. Economics of Education Review. 2001;20(5):459–473.
23. Lochner L, Moretti E. The effect of education on crime: Evidence from prison inmates, arrests, and self-reports (Working Paper No. 8605). Cambridge, MA: National Bureau of Economic Research; 2001.
24. Lumley T. Survey: Analysis of complex survey samples. Journal of Statistical Software. 2012;9(8).
25. Meyer BD. Natural and quasi-experiments in economics. Journal of Business & Economic Statistics. 1995;13(2):151–161. doi: 10.1080/07350015.1995.10524589.
26. Miller MLS. An American imperative: Accelerating minority educational advancement. New Haven, CT: Yale University Press; 1997.
27. Molloy R, Smith CL, Wozniak A. Internal migration in the United States. The Journal of Economic Perspectives. 2011;25(3):173–196. doi: 10.1257/jep.25.3.173.
28. National Center for Education Statistics. School questionnaire, NELS:88 first follow-up. 1991. Retrieved from http://nces.ed.gov/surveys/nels88/pdf/07_F1_School_Administrator.pdf.
29. National Center for Education Statistics. IPEDS Data Center. 2013. Retrieved August 29, 2013, from http://nces.ed.gov/ipeds/data-center/
30. National Commission on Excellence in Education. A nation at risk: The imperative for educational reform. Washington, DC: Author; 1983.
31. National Research Council. Rising above the gathering storm: Energizing and employing America for a brighter economic future. Washington, DC: National Academies Press; 2007.
32. National Research Council. Expanding underrepresented minority participation: America's science and technology talent at the crossroads. Washington, DC: The National Academies Press; 2011.
33. Nicholls GM, Wolfe H, Besterfield-Sacre M, Shuman LJ. Predicting STEM degree outcomes based on eighth grade data and standard test scores. Journal of Engineering Education. 2010;99(3):209–223.
34. Noble J, Schnelker D. Using hierarchical modeling to examine course work and ACT score relationships across high schools. Iowa City, IA: ACT, Inc.; 2007.
35. Nomi T. The unintended consequences of an algebra-for-all policy on high-skill students: Effects on instructional organization and students' academic outcomes. Educational Evaluation and Policy Analysis. 2012;34(4):489–505. doi: 10.3102/0162373712453869.
36. Oreopoulos P, Page ME, Stevens AH. The intergenerational effects of compulsory schooling. Journal of Labor Economics. 2006;24(4):729–760.
37. Perkins R, Kleiner B, Roey S, Brown J. The high school transcript study: A decade of change in curricula and achievement, 1990–2000. Washington, DC: National Center for Education Statistics; 2004.
38. Petersen MA. Estimating standard errors in finance panel data sets: Comparing approaches. Review of Financial Studies. 2009;22(1):435–480. doi: 10.1093/rfs/hhn053.
39. Plunk AD, Cavazos-Rehg P, Bierut LJ, Grucza RA. The persistent effects of minimum legal drinking age laws on drinking patterns later in life. Alcoholism: Clinical and Experimental Research. 2013;37(3):463–469. doi: 10.1111/j.1530-0277.2012.01945.x.
40. Pong SL, Ju DB. The effects of change in family structure and income on dropping out of middle and high school. Journal of Family Issues. 2000;21(2):147–169. doi: 10.1177/019251300021002001.
41. R Development Core Team. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing; 2012.
42. Robelen EW. Questions arise about algebra 2 for all students. Education Week. 2013;32(35):1–29.
43. Ruggles S, Alexander JT, Genadek K, Goeken R, Schroeder MB, Sobek M. Integrated Public Use Microdata Series: Version 5.0 (Machine-readable database). Minneapolis: University of Minnesota; 2010.
44. Rumbaut RG, Portes A. Ethnicities: Children of immigrants in America. Berkeley: University of California Press; 2001.
45. Secada WG. Race, ethnicity, social class, language, and achievement in mathematics. In: Grouws DA, editor. Handbook of research on mathematics teaching and learning. New York: Macmillan; 1992.
46. Snyder TD, Hoffman CM. Digest of education statistics 1990. Washington, DC: National Center for Education Statistics, Office of Educational Research and Improvement; 1991a.
47. Snyder TD, Hoffman CM. Digest of education statistics 1991. Washington, DC: National Center for Education Statistics, Office of Educational Research and Improvement; 1991b.
48. Snyder TD, Hoffman CM. Digest of education statistics 2001. Washington, DC: National Center for Education Statistics, Office of Educational Research and Improvement; 2002.
49. St. John EP. Education and the public interest: School reform, public finance, and access to higher education. 1st ed. New York: Springer; 2006.
50. Tate WF. Race-ethnicity, SES, gender, and language proficiency trends in mathematics achievement: An update. Journal for Research in Mathematics Education. 1997;28(6):652–679.
51. Tate WF. Do the math: Cognitive demand makes a difference. Research Points: Essential Information for Education Policy. 2006;4(2):1–4.
52. Tate WF, Jones BD, Thorne-Wallington E, Hogrebe MC. Science and the city: Thinking geospatially about opportunity to learn. Urban Education. 2012;47(2):399–433.
53. Tate WF, Malancharuvil-Berkes E. A contract for excellence in scientific education: May I have your signature please? Journal of Teacher Education. 2006;57(3):278–285.
54. University of Texas Inequality Project. A panel of estimated family income inequality, by state and year, 1969–2004. 2012. Retrieved December 18, 2012, from http://utip.gov.utexas.edu/data/Yearly%20approximations%20of%20state%20Gini%20coefficients%201969%20-%202004.xls.
55. U.S. Census Bureau. American Community Survey design and methodology. Washington, DC: Author; 2009.
56. Waldfogel J, Garfinkel I, Kelly B. Welfare and the costs of public assistance. In: Belfield CR, Levin HM, editors. The price we pay: Economic and social consequences of inadequate education. Washington, DC: Brookings Institution Press; 2007. pp. 160–174.
57. Webb NL, Kane J, Kaufman D, Yang JH. Study of the impact of the Statewide Systemic Initiatives Program. Madison: Wisconsin Center for Education Research; 2001.
58. Wooldridge JM. Econometric analysis of cross section and panel data. 2nd ed. Cambridge, MA: MIT Press; 2010.
59. Zucker AA, Blank R, Gross S. Evaluation of the National Science Foundation's Statewide Systemic Initiatives (SSI) Program: Second year report: Cross-cutting themes. Arlington, VA: National Science Foundation; 1995.
60. Zwick R, Sklar JC. Predicting college grades and degree completion using high school grades and SAT scores: The role of student ethnicity and first language. American Educational Research Journal. 2005;42(3):439–464.
