Public Opinion Quarterly. 2015 Dec 31;80(1):90–113. doi: 10.1093/poq/nfv047

Sticker Shock

How Information Affects Citizen Support for Public School Funding

Beth E. Schueler and Martin R. West

Abstract

This study examines the role of information in shaping public opinion in the context of support for education spending. While there is broad public support for increasing government funding for public schools, Americans tend to underestimate what is currently spent. We embed a series of experiments in a nationally representative survey administered in 2012 (n = 2,993) to examine whether informing citizens about current levels of education spending alters public opinion about whether funding should increase. Providing information on per-pupil spending in a respondent’s local school district reduces the probability that he or she will express support for increasing spending by 22 percentage points on average. Informing respondents about state-average teacher salaries similarly depresses support for salary increases. These effects are larger among respondents who underestimate per-pupil spending and teacher salaries by a greater amount, consistent with the idea that the observed changes in opinion are driven, at least in part, by informational effects, as opposed to priming alone.

Information and Opinion: The Case of School Funding

It is well established that the American public is often ill informed regarding public affairs (Bartels 1996). However, the consequences of its lack of knowledge for democratic representation are less well understood. Some scholars worry that when public information is imperfect, polls and voting behavior may not accurately reflect what public opinion would be if citizens were fully informed (Althaus 1998, 2003; Gilens 2001; Simon 2011). Others contend that citizens are able to rely on heuristics or “mental shortcuts” to make up for a lack of specific knowledge (Lupia and McCubbins 1998; Simon 2011, 9). These shortcuts provide a mechanism for citizens to express viewpoints that are in line with their interests, even without perfect information. In the aggregate, the result is a collectively rational public that makes sound policy judgments (Page and Shapiro 1992).

A series of recent studies cast doubt on the theory of collective rationality by demonstrating that the provision of information can alter public opinion. For instance, collective preferences shift when scholars simulate fully informed survey responses using measures of general political knowledge (Althaus 1998) as well as policy-specific information (Gilens 2001). In an experimental setting, Gilens (2001) finds that informing survey respondents that the crime rate has declined reduces support for prison construction. Similarly, Americans are less likely to support cutting foreign aid when they are told about current spending levels (Gilens 2001; Kull et al. 2011).

Our study contributes to this literature by examining the role of information in shaping public preferences about education finance, a salient policy issue distinguished by the frequency with which the public is asked to address it at the ballot box. Spending on elementary and secondary education accounts for approximately 25 percent of state budgets (Center on Budget and Policy Priorities 2014) and education spending overall makes up nearly half of local budgets (US Government Accountability Office 2014). 1 Candidates’ commitment to public education is often a top-tier issue in gubernatorial and even presidential elections (Henig 2013). In 2012 alone, voters from the 15 states for which comprehensive data are available considered at least 1,228 school bond or tax measures, approving 72 percent of them. 2 The approved bond measures amounted to more than $22 billion in additional spending (Ballotpedia 2013). The direct role citizens play in determining education spending levels may increase their incentive to acquire sufficient information to make decisions consistent with their preferences.

Surveys routinely suggest that there is broad support among Americans for increasing spending on public education (Berkman and Plutzer 2005). On the 2012 Education Next-Program on Education Policy and Governance (EdNext-PEPG) Survey, 63 percent of respondents supported increasing government funding for their local district and 64 percent supported increasing teacher salaries in their state, while only a small minority favored decreases (Howell, West, and Peterson 2013). When the 2013 Phi Delta Kappa/Gallup poll asked respondents to identify “the biggest problems that the public schools of your community must deal with,” 35 percent said “lack of financial support,” a larger percentage than those who mentioned the next seven most common responses combined (Bushaw and Lopez 2013). 3

As Berkman and Plutzer (2005) explain, the strong overall support for educational spending, as well as differences in support across social groups, appear to be rooted in self-interest or values. For instance, parents of school-aged children are more likely to support funding increases while homeowners, consistent with their short-term self-interest as property-tax payers, are less supportive. Value-based differences stem from variation in what economists call a “taste for education” or what Moe (2001) describes as a generalized commitment to the principle of public education. The former could account for greater support for school spending among the highly educated, while the latter could explain higher levels of support among groups traditionally affiliated with liberal political parties, such as African Americans. The largest cleavage in support for funding increases in recent decades has been the 15-percentage-point gap between seniors and non-seniors, but Plutzer and Berkman (2005) demonstrate that this gap is not the result of changes in individuals’ self-interest as they age. In fact, cohorts of citizens tend to become more supportive of education spending as they age; younger cohorts are simply more supportive than older cohorts.

At the same time, evidence indicates that the typical American is poorly informed about current levels of school spending. The 2007 EdNext-PEPG survey asked respondents to estimate per-pupil spending in their local school district and average teacher salaries in their state. On average, respondents estimated that their district spent only 54 percent of the actual amount of spending that year. Similarly, respondents underestimated the average teacher salary in their state by over 30 percent (Howell and West 2008). Americans’ underestimation of current school spending raises the question of whether their apparent support for spending increases is based on incomplete information.

In this study, we exploit a series of experiments embedded in the 2012 EdNext-PEPG Survey to examine the effects of providing information about current educational spending and teacher salaries on support for increasing public school funding, teacher salaries, and taxes to fund public schools. We find that providing this information sharply reduces support for increased spending and teacher salaries. Similarly, respondents are less supportive of funding increases when asked about their preference for raising taxes to fund education, as opposed to increased funding in the abstract.

The effects of providing spending and salary information that we document could be driven by the information content itself or by priming. Priming occurs when a survey question heightens the salience of a particular consideration and pushes other relevant considerations to the background of respondents’ minds in a way that influences their survey responses (Krosnick 2002; Simon 2011). We demonstrate that the effects of informing respondents about current spending and salary levels are larger among those who underestimate actual levels by a greater amount, consistent with information effects. The presence of effects for respondents who accurately estimate spending and salaries suggests that priming may indeed play a role but does not fully explain the opinion changes among typical respondents.

Methods

SAMPLE

Our data come from the 2012 EdNext-PEPG Survey, conducted by the polling firm Knowledge Networks®. The sample (n = 2,993) was drawn from the probability-based KnowledgePanel®, which is constructed using sampling frames that cover more than 99 percent of the US population. 4 Knowledge Networks® drew 4,151 panel members from the KnowledgePanel®, and 2,993 of these panelists ultimately completed the survey, for a final-stage completion rate of 72.2 percent. Knowledge Networks® reported a recruitment rate of 16 percent and a profile rate of 63.5 percent, resulting in a cumulative response rate of 7.3 percent, calculated as described in Callegaro and DiSogra (2008). The sample is stratified and nationally representative of the adult population (aged 18 and over) in the United States.

Knowledge Networks® conducted the survey between April 27 and May 11, 2012. Participants could complete the survey in English or Spanish. The survey (provided in the online appendix) covered a variety of education-related topics. Throughout our analyses, we use post-stratification population weights provided by Knowledge Networks® to adjust for survey nonresponse and over-sampling and ensure that the sample reflects the characteristics of the national population. Appendix table A1 confirms that the weighted sample characteristics correspond closely to those of the national public in the 2012 Current Population Survey.
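As a concrete illustration of how such weights enter an analysis, the sketch below computes a weighted support share in Python. It is a minimal example under assumed names: the file, the "weight" column (standing in for the Knowledge Networks® post-stratification weight), and the 1–5 "preference" outcome defined under Measures below are all hypothetical.

```python
import pandas as pd

# Hypothetical file and column names: "weight" is the post-stratification
# weight; "preference" is the 1-5 support scale defined under Measures.
df = pd.read_csv("ednext_pepg_2012.csv")

# Weighted share answering "increase" (4) or "greatly increase" (5).
support = (df["preference"] >= 4).astype(float)
weighted_share = (support * df["weight"]).sum() / df["weight"].sum()
print(f"Weighted share supporting an increase: {weighted_share:.1%}")
```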

PROCEDURES

To gauge respondents’ knowledge of educational spending, survey administrators asked all respondents the following questions before asking about their support for spending:

  • (1) Based on your best guess, what is the average amount of money spent each year for a child in public schools in your school district?

  • (2) Based on your best guess, what is the average salary of a public school teacher in your state?

Respondents could not go back and adjust their answers to any questions after they were submitted.

To assess information and tax-wording effects, administrators experimentally assigned respondents to one of the following four questions regarding school funding:

  • (1A) Do you think that government funding for public schools in your district should increase, decrease, or stay about the same?

  • (1B) According to the most recent information available, $[NUMBER] is being spent each year per child attending public schools in your district. Do you think that government funding for public schools in your district should increase, decrease, or stay about the same?

  • (1C) Do you think that taxes to fund public schools should increase, decrease, or stay about the same?

  • (1D) According to the most recent information available, $[NUMBER] is being spent each year per child attending public schools in your district. Do you think that taxes to fund public schools should increase, decrease, or stay about the same?

Versions (1B) and (1D) included the most recent available data on average annual per-pupil spending in the respondent’s school district during 2009–10, as reported in the National Center for Education Statistics’ Common Core of Data.

Administrators also experimentally assigned respondents to one of two questions regarding teacher salaries:

  • (2A) Do you think that teacher salaries should increase, decrease, or stay about the same?

  • (2B) Teachers in your state are paid an average annual salary of $[NUMBER]. Do you think that teacher salaries should increase, decrease, or stay about the same?

Question (2B) included the average annual teacher salary in the respondent’s state during the 2010–11 school year, as reported in the 2011 edition of the Digest of Education Statistics.

For both per-pupil spending and teacher salaries, the information provided to respondents was average dollar amounts at the lowest level of aggregation for which comparable data were available on a nationwide basis. Recent studies have documented substantial variation in per-pupil spending across schools within the same district, but reliable school-level spending data are not routinely published (Roza 2010). In addition, the fact that education budgets are ultimately determined at the district level makes this the relevant unit of analysis for questions about school spending levels. Individual teacher salaries vary within states due to differences in district pay scales and characteristics such as experience and educational credentials. According to the 2011 Digest, the median teacher nationally had 15 years of experience, and the average salary for teachers with 15–19 years of experience was $58,260, slightly higher than the average for all teachers of $56,069. 5 This suggests that using median teacher salaries, had they been available at the state level, would not have conveyed to respondents a very different message.

Online appendix B and online appendix table B1 demonstrate that randomization was generally successful in creating groups with similar observed characteristics and that the response rates to the items used in this study did not differ significantly across experimental conditions.

MEASURES

Outcomes:

The outcomes of interest were subjects’ responses to the survey questions regarding their preferences for government funding for local public schools, higher teacher salaries, and higher taxes to support local schools. For each experiment, we created an ordered categorical variable (“PREFERENCE”) indicating the subject’s response to the questions (greatly decrease = 1; decrease = 2; stay about the same = 3; increase = 4; greatly increase = 5).

Question predictors:

Our primary variables of interest were indicators for which version of the survey questions each subject answered. For both experiments, we relied on a dichotomous indicator (“INFORMED”) for whether the question included information on current spending levels (yes = 1; no = 0). For the per-pupil funding and taxation experiment, we created a dichotomous indicator (“TAX”) for whether the question asked about increasing taxes (yes = 1; no = 0), as well as an interaction variable identifying respondents who were asked about increasing taxes after being provided with spending information.

We also used respondents’ estimates of current per-pupil spending and teacher salaries to create two variables capturing the degree to which a respondent underestimated current spending and salaries. We first excluded extreme outliers who estimated $0 for per-pupil funding (n = 18) or average teacher salary (n = 5), or greater than $50,000 for per-pupil funding (n = 18). We then calculated the underestimation variables as the difference between the actual amount for a respondent’s district or state and his or her estimate. Positive values therefore represent underestimates, while negative values represent overestimates. We divided that number by $1,000 so that the (“UNDERESTIMATE”) variables represent the difference between the true amount and the respondent’s estimate in thousands of dollars.
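A minimal sketch of this variable construction in Python, with hypothetical column names for the respondent estimates and the actual district and state amounts:

```python
import pandas as pd

df = pd.read_csv("ednext_pepg_2012.csv")  # hypothetical file and column names

# Exclude the extreme outliers described above: $0 estimates for either
# quantity and per-pupil estimates above $50,000.
df = df[(df["spend_estimate"] > 0) & (df["spend_estimate"] <= 50_000)]
df = df[df["salary_estimate"] > 0]

# UNDERESTIMATE = (actual - estimate) / 1,000: positive values are
# underestimates, negative values overestimates, in thousands of dollars.
df["underest_spending"] = (df["actual_spending"] - df["spend_estimate"]) / 1_000
df["underest_salary"] = (df["actual_salary"] - df["salary_estimate"]) / 1_000
```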

Covariates:

In all but the unconditional models, we included dichotomous variables (“TEACHER,” “PARENT,” and “HOMEOWNER”) indicating whether a subject was a member of that group (yes = 1; no = 0). We also included continuous indicators of age and education in years, and household income in thousands of dollars. We generated the continuous education variable from categorical responses by assigning each respondent his or her approximate, implied years of education based on highest degree or grade level attainment (e.g., high school graduate = 12; bachelor’s degree = 16). For income, we assigned each respondent the midpoint for each response category range (e.g., $50,000 to $59,999 = $55,000).
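The recoding might look like the following sketch; the category labels and the years assigned to them are illustrative assumptions rather than the survey's actual response options:

```python
# Approximate years of schooling implied by highest attainment (assumed labels).
educ_years = {
    "Less than high school": 10,
    "High school graduate": 12,
    "Some college": 14,
    "Bachelor's degree": 16,
    "Graduate degree": 18,
}
df["educ_years"] = df["educ_category"].map(educ_years)

# Midpoint of each household-income bracket, stored in thousands of dollars.
income_mid = {
    "$40,000 to $49,999": 45.0,
    "$50,000 to $59,999": 55.0,
    "$60,000 to $74,999": 67.5,
}
df["income_k"] = df["income_category"].map(income_mid)
```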

DATA-ANALYSIS PLAN

The outcome for all research questions was a categorical variable with a natural ordering (i.e., “greatly increase” is higher than “increase”), so we used ordered logistic regression to model the odds (expressed below as a ratio of probabilities) that a respondent will give an answer that is more supportive of funding, salaries, or taxation for education funding. This allowed us to avoid the problematic assumption that the difference between all pairs of adjacent answer choices (e.g., “increase” and “stay the same” versus “stay the same” and “decrease”) reflects the same difference in support for increased funding.
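A sketch of this specification using statsmodels' OrderedModel is below. It is an unweighted simplification (the published estimates apply the survey weights, which OrderedModel does not take directly), and it reuses the hypothetical column names from the sketches above:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Outcome: 1 = greatly decrease ... 5 = greatly increase, as ordered categories.
preference = df["preference"].astype(
    pd.CategoricalDtype(categories=[1, 2, 3, 4, 5], ordered=True)
)

exog = df[["informed", "tax", "informed_x_tax", "teacher", "parent",
           "homeowner", "age", "educ_years", "income_k"]]

# Proportional-odds (ordered logit) model; the cutpoints are estimated
# alongside the slopes, so no intercept is included.
result = OrderedModel(preference, exog, distr="logit").fit(
    method="bfgs", disp=False
)

# Exponentiating the slope coefficients gives odds ratios like those in table 3.
print(np.exp(result.params.iloc[: exog.shape[1]]))
```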

Per-pupil funding:

To examine whether disclosing current per-pupil funding caused a shift in public opinion and whether asking about increasing taxes to fund schools altered levels of spending support, we relied on the following model:

\[
\frac{\Pr(\mathrm{PREFERENCE}_i > k)}{\Pr(\mathrm{PREFERENCE}_i \le k)} = e^{\mu_k + \beta_1 \mathrm{INFORMED}_i + \beta_2 \mathrm{TAX}_i + \beta_3 (\mathrm{INFORMED}_i \times \mathrm{TAX}_i) + \gamma X_i} \qquad (1)
\]

where i represents a particular respondent, PREFERENCEi is the response given by subject i to the survey question, and k represents a particular answer choice. 6 On the right-hand side of equation (1), µ is a constant that varies depending on the value of k (i.e., the answer choice). The ordinal outcome PREFERENCEi is a categorical representation of a latent continuous opinion scale in that we cannot know for sure whether the distance between the answer choices “greatly increase” and “increase” is the same as the distance between the answer choices “increase” and “stay the same.” Therefore, µ is the estimated cutpoint on the latent continuous variable used to differentiate answer choice k from answer choice k + 1 (e.g., the cutpoint between “increase” and “greatly increase”).

INFORMEDi indicates whether respondents saw a version of the question with a current per-pupil spending estimate. Therefore, β1 is our estimate of the causal impact of providing information on funding preferences for respondents asked about increasing funding for public schools. TAXi indicates whether a respondent was asked about increasing taxes to fund schools, and INFORMED×TAXi allows us to compare the size of the information effect for those who were asked about increasing taxes to fund schools versus those who were simply asked about increasing school funding. When reporting the information effect for those asked about increasing taxes, we sum the INFORMEDi and INFORMED×TAXi coefficients. Finally, Xi is a vector of the covariates described above.
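Because table 3 reports odds ratios rather than log-odds coefficients, summing β1 and β3 corresponds to multiplying the two ratios. A quick check using the table 3 point estimates:

```python
# exp(b1 + b3) = exp(b1) * exp(b3): combine the "Informed" and
# "Informed x Tax question" odds ratios reported in table 3.
or_informed, or_interaction = 0.47, 1.38
combined = or_informed * or_interaction
print(round(combined, 2))  # 0.65: information still cuts the odds of a more
# supportive answer, but less sharply in the tax condition.
```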

Teacher salaries:

To address whether disclosing current teacher salaries caused a shift in opinion, we relied on the following model:

\[
\frac{\Pr(\mathrm{PREFERENCE}_i > k)}{\Pr(\mathrm{PREFERENCE}_i \le k)} = e^{\mu_k + \beta_1 \mathrm{INFORMED}_i + \gamma X_i} \qquad (2)
\]

where INFORMEDi represents whether respondents saw a version of the question with an estimate of current average teacher salaries. Therefore, β 1 is the parameter of interest that allows us to estimate the causal impact of the question type on funding preferences.

Results

We begin the presentation of our results by documenting the base level of support for spending and salary increases among respondents who were not provided with information, as well as the degree to which respondents underestimated actual spending and salary levels. We then examine how support for funding and salary increases responds to the provision of information on current spending levels, as well as to being asked about support for raising taxes to increase school funding. Finally, we consider how the effect of providing information on current expenditures varies with the degree to which respondents underestimated actual spending and salaries.

PUBLIC SUPPORT FOR AND KNOWLEDGE OF PUBLIC SCHOOL FUNDING

We find broad support for educational funding increases among respondents in the uninformed conditions. Table 1 presents the percentage of respondents, overall and within subgroups, who gave each answer choice for the per-pupil funding and teacher salary questions, in both the uninformed and informed conditions. The top panel indicates that a majority of respondents in the uninformed condition support raising per-pupil spending (62.5 percent) as well as teacher salaries (63.7 percent). Slightly larger percentages of teachers and parents support increases when compared to the sample as a whole. Although homeowners are less enthusiastic about increases, even they express majority support for boosting funding and salaries. Not surprisingly, teachers are more supportive of salary increases than the average respondent. Only a small percentage of all respondents support cutting spending (8.9 percent), and even fewer prefer decreasing teacher salaries (4.5 percent).

Table 1.

Percent of Respondents Giving Each Answer Choice for the Per-Pupil Funding and Teacher Salary Increase Support Questions in the Uninformed and Informed Conditions

Attitudes concerning per-pupil funding: preferences of uninformed respondents

| | n | Greatly increase | Increase | Stay the same | Decrease | Greatly decrease | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Full sample | 752 | 15.8 | 46.7 | 28.5 | 6.3 | 2.6 | 100% |
| Teachers | 126 | 31.1 | 36.8 | 26.9 | 5.2 | 0.0 | 100% |
| Parents | 202 | 16.0 | 48.7 | 26.4 | 4.7 | 4.3 | 100% |
| Homeowners | 533 | 12.7 | 45.1 | 32.6 | 6.6 | 3.0 | 100% |

Attitudes concerning per-pupil funding: preferences of informed respondents

| | n | Greatly increase | Increase | Stay the same | Decrease | Greatly decrease | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Full sample | 756 | 8.7 | 34.2 | 43.9 | 10.6 | 2.7 | 100% |
| Teachers | 108 | 26.8 | 33.3 | 31.5 | 7.1 | 1.3 | 100% |
| Parents | 203 | 13.9 | 31.7 | 45.6 | 5.8 | 3.0 | 100% |
| Homeowners | 576 | 7.5 | 30.6 | 46.1 | 12.7 | 3.1 | 100% |

Attitudes concerning teacher salaries: preferences of uninformed respondents

| | n | Greatly increase | Increase | Stay the same | Decrease | Greatly decrease | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Full sample | 1,509 | 13.0 | 50.7 | 31.8 | 3.3 | 1.2 | 100% |
| Teachers | 229 | 42.1 | 43.2 | 10.3 | 2.0 | 2.4 | 100% |
| Parents | 413 | 17.4 | 46.6 | 33.9 | 1.1 | 1.1 | 100% |
| Homeowners | 1,119 | 11.8 | 47.8 | 34.7 | 4.2 | 1.5 | 100% |

Attitudes concerning teacher salaries: preferences of informed respondents

| | n | Greatly increase | Increase | Stay the same | Decrease | Greatly decrease | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Full sample | 1,464 | 6.3 | 30.4 | 55.3 | 5.6 | 2.4 | 100% |
| Teachers | 230 | 26.5 | 46.7 | 24.1 | 2.4 | 0.3 | 100% |
| Parents | 374 | 10.5 | 24.5 | 56.3 | 7.6 | 1.1 | 100% |
| Homeowners | 1,066 | 5.7 | 30.1 | 56.3 | 5.7 | 2.3 | 100% |

The survey respondents as a whole, however, dramatically underestimated their local districts’ per-pupil expenditures as well as average teacher salaries in their states. Over 87 percent of respondents underestimated per-pupil spending, and 89 percent underestimated salaries, often by a substantial amount. Table 2 shows that the average respondent estimated that his or her district spent $6,189 annually per pupil, while actual average spending was more than twice that amount ($12,628). Nearly half (47.68 percent) of respondents estimated $3,000 or less. Similarly, for statewide teacher salary, the average respondent estimated $36,063 while the true average in respondents’ states was $55,042.

Table 2.

Actual and Estimated District Per-Pupil Funding and State-Average Teacher Salaries

| | n | Per-pupil funding: Actual mean | Estimated mean | Estimated median | Teacher salaries: Actual mean | Estimated mean | Estimated median |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Full sample | 2,993 | $12,628 | $6,189 | $3,200 | $55,042 | $36,063 | $37,000 |
| Teachers | 461 | $12,914 | $5,577 | $3,500 | $55,830 | $43,997 | $45,000 |
| Parents | 794 | $12,190 | $5,811 | $3,000 | $55,062 | $34,828 | $35,000 |
| Homeowners | 2,197 | $12,471 | $6,497 | $3,500 | $54,530 | $38,915 | $37,000 |
| College graduates | 1,179 | $13,326 | $7,140 | $5,000 | $56,066 | $42,923 | $40,000 |

Note.—For the per-pupil funding question, we excluded 18 respondents who answered $0 and 18 who answered > $50,000. For the teacher salary question, we excluded 5 respondents who answered $0 for average annual teacher salary.

Underestimation was the rule across all subgroups of survey respondents. College graduates were the most accurate subgroup for per-pupil spending, but they still estimated only $7,140 on average. Teachers were most accurate for salaries, presumably reflecting their greater familiarity with educator pay schedules, but still underestimated by about $11,833 on average. Despite the pervasiveness of underestimation, however, respondents’ estimates were nonetheless somewhat responsive to variation in actual local spending and salary levels. Specifically, each additional dollar was associated with a $0.18 higher per-pupil spending estimate (p < .001) and a $0.50 higher salary estimate (p < .001), respectively. 7
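This responsiveness can be checked with a simple regression of respondents' estimates on the actual amounts in their districts. The unweighted sketch below, using the hypothetical column names from earlier, would recover a slope near $0.18 per actual dollar for spending on data like ours (footnote 7 notes the pattern survives covariate adjustment):

```python
import statsmodels.api as sm

# Slope of respondents' spending estimates on actual district spending;
# a coefficient of ~0.18 corresponds to the responsiveness reported above.
X = sm.add_constant(df["actual_spending"])
print(sm.OLS(df["spend_estimate"], X, missing="drop").fit().params)
```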

The effect of information on support for public school funding:

Providing an estimate of current local per-pupil spending caused public opinion on educational funding to shift notably. Table 1 provides unconditional differences between experimental groups, illustrating that support for increased funding drops from 62.5 percent in the uninformed to 42.9 percent in the informed condition. Table 3 presents results for the ordered logistic regression models. The coefficients are odds ratios, representing the odds of giving a particular answer choice (e.g., “greatly increase”) versus the odds of giving an answer that is one increment less supportive of spending on the five-point ordered categorical scale. The first row of table 3 therefore indicates that providing spending information cuts the odds roughly in half that a respondent will give an answer that is more supportive of funding (β = .47; p < .001). Teachers and parents were more supportive of higher school funding, while older respondents and homeowners were less supportive. Our results concerning information effects, however, are unaltered by the inclusion of demographic covariates. 8

Table 3.

The Effect of Information and Taxation Question Wording on Support for Increased Public School Funding and Increased Teacher Salaries

| | Per-pupil funding b (SE) | Per-pupil funding, underestimation b (SE) | Teacher salaries b (SE) | Teacher salaries, underestimation b (SE) |
| --- | --- | --- | --- | --- |
| Informed^a | 0.47 (0.05)*** | 0.54 (0.06)*** | 0.35 (0.03)*** | 0.50 (0.05)*** |
| Tax question | 0.35 (0.04)*** | 0.35 (0.04)*** | | |
| Informed × Tax question | 1.38 (0.19)* | 1.37 (0.19)* | | |
| Underestimation^b | | 1.02 (0.01)*** | | 1.02 (0.00)*** |
| Teacher | 1.97 (0.45)** | 1.88 (0.43)** | 3.77 (0.88)*** | 3.82 (0.89)*** |
| Parent | 1.21 (0.10)* | 1.23 (0.11)* | 0.98 (0.09) | 0.92 (0.08) |
| Homeowner | 0.64 (0.05)*** | 0.64 (0.05)*** | 0.64 (0.05)*** | 0.68 (0.06)*** |
| Age in years | 0.99 (0.00)* | 1.00 (0.00)* | 1.00 (0.00) | 1.00 (0.00) |
| Education in years | 1.02 (0.01) | 1.02 (0.01) | 1.05 (0.02)** | 1.07 (0.02)*** |
| Income in thousands | 1.00 (0.00) | 1.00 (0.00) | 1.00 (0.00) | 1.00 (0.00)* |
| Underestimation × Informed | | 0.98 (0.01)** | | 0.98 (0.00)*** |
| Cut 1 | –4.50 (0.24) | –4.38 (0.24) | –4.43 (0.25) | –3.75 (0.28) |
| Cut 2 | –2.98 (0.22) | –2.83 (0.22) | –3.12 (0.22) | –2.46 (0.26) |
| Cut 3 | –0.61 (0.21) | –0.47 (0.22) | –0.28 (0.21) | 0.41 (0.25) |
| Cut 4 | 1.56 (0.21) | 1.70 (0.22) | 2.14 (0.21) | 2.87 (0.25) |
| N | 2,973 | 2,921 | 2,973 | 2,921 |
| Pseudo R-squared | 0.038 | 0.040 | 0.044 | 0.053 |

Note.—These ordered logit estimates are expressed in odds ratios. The coefficients represent the odds of giving response k (e.g., “greatly increase”) versus the odds of giving response k – 1 (e.g., “increase”).

^a A dichotomous indicator for whether the survey question included a current spending estimate/teacher salary estimate (yes = 1; no = 0).

^b Difference between actual spending in respondent’s district/teacher salaries in respondent’s state and respondent’s estimate, in thousands of dollars.

*p < .05; **p < .01; ***p < .001

Table 4 displays the predicted probability of giving each answer choice for both the informed and uninformed groups. Spending information reduces the probability that a respondent will answer “greatly increase” or “increase,” while the probability of selecting “stay the same” or one of the decrease options grows. Also, the change in the predicted probability of giving each answer choice due to information is comparable in magnitude to the effect of being a teacher, though in the opposite direction. Overall, informing respondents of current spending reduces the probability that they will support funding increases by 22 percentage points. All else equal, teachers are 19 percentage points more likely to favor increased funding. Although providing information certainly dampens citizen support for increases in spending, it is important to note that only a small minority (13.81 percent) of informed respondents favor cutting back.

Table 4.

Change in Predicted Probabilities of Giving Each Answer Choice Associated with Information and Group Membership

Attitudes concerning per-pupil public school funding

| | Greatly increase | Increase | Stay the same | Decrease | Greatly decrease |
| --- | --- | --- | --- | --- | --- |
| Informed^a | –0.064 | –0.152 | 0.129 | 0.065 | 0.022 |
| Teacher | 0.069 | 0.123 | –0.131 | –0.046 | –0.015 |
| Parent | 0.011 | 0.026 | –0.023 | –0.011 | –0.004 |
| Homeowner | –0.033 | –0.073 | 0.068 | 0.029 | 0.009 |

Attitudes concerning teacher salaries

| | Greatly increase | Increase | Stay the same | Decrease | Greatly decrease |
| --- | --- | --- | --- | --- | --- |
| Informed^a | –0.095 | –0.162 | 0.209 | 0.034 | 0.014 |
| Teacher | 0.169 | 0.122 | –0.251 | –0.029 | –0.011 |
| Parent | –0.002 | –0.003 | 0.004 | 0.001 | 0.000 |
| Homeowner | –0.044 | –0.066 | 0.092 | 0.013 | 0.005 |

Note.—These estimates represent the change in the predicted probabilities of giving each answer choice associated with the difference between the minimum and maximum value of the independent variables, holding all other independent variables constant at their means.

^a A dichotomous indicator for whether the survey question included an estimate of current spending (yes = 1; no = 0).
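A sketch of how such predicted-probability changes can be computed from the ordered logit fitted earlier: toggle the information indicator between 0 and 1 while holding the remaining covariates at their means, keeping the interaction column consistent. Column names remain the hypothetical ones introduced under Methods.

```python
# Mean covariate profile from the estimation sample.
means = exog.mean()

uninformed, informed_row = means.copy(), means.copy()
uninformed["informed"], uninformed["informed_x_tax"] = 0.0, 0.0
informed_row["informed"] = 1.0
informed_row["informed_x_tax"] = informed_row["tax"]  # TAX held at its mean

# Predicted probabilities for the five answer choices under each profile.
probs = np.asarray(result.predict(pd.DataFrame([uninformed, informed_row])))
print(probs[1] - probs[0])  # change from uninformed to informed, per choice
```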

The effect of tax salience on support for public school funding:

Respondents also are less supportive of increasing taxes to fund public schools than of increasing school funding in the abstract. Figure 1 shows the probability that respondents supported higher funding when they were and were not asked about tax increases, in both the uninformed and informed conditions. Our estimate of the tax effect is larger in magnitude than the information effect (χ2 = 9.17; p < .05). As shown in table 3, the taxation wording cuts the odds by almost two-thirds that a respondent will give an answer more supportive of higher funding (e.g., “increase”) rather than a less supportive one (e.g., “stay the same”) (β = .35; p < .001). Even so, those favoring funding cuts remain a minority among both uninformed (14.07 percent) and informed (16.04 percent) respondents answering the taxation question. Interestingly, information still dampened funding support among respondents in the tax-wording condition (χ2 = 19.27; p < .05), although the effect of providing information is modestly smaller among those asked about tax increases (β = 1.38; p < .05).

Figure 1. The Effect of Information on the Probability of Supporting Increased Funding for the Taxation and Non-Taxation Questions (Unconditional Model, N = 2,993).

The effect of information on support for higher teacher salaries:

With regard to teacher salaries, a key driver of overall educational spending, we find that providing information about current salaries also shifts opinion on whether those salaries should increase. In table 1, only 36.7 percent of informed respondents support higher salaries, as compared with 63.7 percent of respondents in the uninformed condition. According to the ordered logistic regression models in table 3, including information in the question reduces the odds by almost two-thirds that a respondent gives an answer choice that is more supportive of higher salaries (e.g., answering “increase” versus “stay the same,” or “greatly increase” versus “increase”) (β = .35; p < .001). Table 4 indicates that informing respondents of current salaries reduces support for salary increases by approximately 25 percentage points. As expected, teachers are more supportive of salary increases than non-teachers. 9 The change in the predicted probability of giving each answer choice associated with information is slightly smaller in magnitude than the effect of being a teacher, which increases the probability of favoring salary increases by 28 percentage points. 10

Information vs. priming:

It is possible that the effects of telling respondents about current spending and salary levels are an artifact of priming rather than pure information effects. Priming occurs when a particular consideration is brought to the forefront of respondents’ minds, pushing to the background other potentially relevant considerations, in ways that ultimately affect how respondents answer the question (Miller and Krosnick 1996; Krosnick 2002; Simon 2011). In this study, survey administrators brought current spending and salary levels to the forefront of respondents’ minds by including this information in the questions. In doing so, they may have pushed to the background of respondents’ minds other relevant considerations, such as their direct observations of resource needs in their local schools, how funds are allocated or distributed, or how much it costs to adequately educate a child. To the degree that priming is occurring, it is likely inflating our estimates of the role played by the provision of information per se. 11

The design of our survey experiments does not permit us to definitively identify the mechanisms for the changes in opinion among respondents provided with information on current spending and salary levels. To shed light on the extent to which the effects are likely to be informational, however, we examined how the effect of providing information varied with the respondents’ preexisting knowledge, as measured by the degree to which they underestimated current per-pupil spending and teacher salaries before they were asked about funding support. We hypothesize that any information effects should be larger for low-knowledge respondents because the information gap is wider for them. Respondents who are very familiar with the details of educational finance receive less new information from our survey than those with little knowledge in this area. In other words, if the effect of disclosing per-pupil funding and salaries is at least partially informational, we would expect that the more accurate a respondent’s preexisting estimate of spending and salaries, the smaller the effect will be of including these details in the survey question.

Taken to its extreme, this logic would imply that the provision of information should have no effect on the opinions of respondents with accurate prior spending knowledge. Of course, it would be incorrect to suggest that the provision of current spending or salary levels contains no informational content for respondents whose estimates corresponded to reality. The experimental prompt is likely to have strengthened their confidence in their prior knowledge, and perhaps also their willingness to put that knowledge to use when asked to evaluate spending increases. Even so, the magnitude of any shift in opinion among respondents with accurate prior knowledge may provide an upper bound on the extent to which the overall effects of the information prompt reflect other types of effects, such as priming.

To implement this idea, we relied on the following model:

\[
\frac{\Pr(\mathrm{PREFERENCE}_i > k)}{\Pr(\mathrm{PREFERENCE}_i \le k)} = e^{\mu_k + \beta_1 \mathrm{INFORMED}_i + \beta_2 \mathrm{TAX}_i + \beta_3 (\mathrm{INFORMED}_i \times \mathrm{TAX}_i) + \beta_4 \mathrm{UNDERESTIMATE}_i + \beta_5 (\mathrm{INFORMED}_i \times \mathrm{UNDERESTIMATE}_i) + \gamma X_i} \qquad (3)
\]

where UNDERESTIMATEi represents the amount by which respondent i underestimated current spending or average teacher salaries. Again, this variable represents the difference between actual and estimated per-pupil spending or salaries in thousands of dollars. Positive values therefore indicate underestimation and negative values represent overestimation, although since few respondents overestimated, this variable can roughly be interpreted to represent inaccuracy. The INFORMED×UNDERESTIMATEi interaction allows us to see whether the effect of providing information varies depending on the accuracy of respondents’ estimates.
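Continuing the earlier sketch, equation (3) adds the underestimation measure and its interaction with the information indicator; the column names remain hypothetical:

```python
# Underestimation (in $1,000s) and its interaction with the information
# treatment, appended to the covariates used in the earlier model.
df["informed_x_underest"] = df["informed"] * df["underest_spending"]
exog3 = exog.join(df[["underest_spending", "informed_x_underest"]])

result3 = OrderedModel(preference, exog3, distr="logit").fit(
    method="bfgs", disp=False
)
print(np.exp(result3.params.iloc[: exog3.shape[1]]))  # cf. table 3, column 2
```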

The effect of providing information regarding both current per-pupil spending and average teacher salaries is indeed larger, on average, among respondents who underestimate what is currently spent. We present the full “underestimation” model results in table 3. The two panels of figure 2 show the predicted probability of supporting increased funding (or salaries) as a function of the degree to which a respondent underestimated current spending (or salaries), for respondents in the informed condition (the black lines) and respondents in the uninformed condition (the gray lines). For each additional $1,000 by which respondents underestimated either per-pupil spending or teacher salaries, the estimated effect of information on the probability of supporting higher spending or salaries is one percentage point larger, on average. This pattern is consistent with an informational explanation, since the effects are larger among lower-knowledge respondents than among those who take the survey with greater preexisting information.

Figure 2. The Relationship Between Underestimation and Support for Increased Per-Pupil Funding and Salaries (Among 5th to 95th Percentiles of Estimates, Based on Unconditional Model).

However, for respondents who accurately estimated spending in their district, including the current spending amount in the question still decreases the probability that they support funding increases by 17 percentage points, on average. This suggests that priming could play a role since even respondents whose spending estimates were accurate are affected by the experiment. As discussed above, however, the experimental prompt may have had informational content even for this group. Moreover, the effect of providing this information is larger among respondents who gave the median estimate (underestimating by approximately $8,000). For this group, information reduced the probability of support for increases by 22 percentage points.

A similar pattern emerges for the salary experiment. Among respondents who accurately estimated salaries, the provision of actual salary information dampened the probability of supporting raises for teachers by 17 percentage points. However, among those who gave the median estimate (underestimating by approximately $14,500), providing this information decreased the probability of supporting increases by 23 percentage points. This is again consistent with the idea that the effect we are detecting is in part informational since the effect for accurate estimators is not large enough to account for the entire effect among those with more limited preexisting information.

Discussion

Although a majority of Americans support increased funding for public education, this study suggests that public opinion in this area is not grounded in accurate perceptions of current spending levels. We find that the public experiences a strikingly large “sticker shock” when provided with information about current spending levels. Furthermore, we find that the public is more enthusiastic about boosting funding in the abstract than about increasing taxes to do so.

These findings are consistent with evidence that providing citizens with issue-specific information can shift opinion in consequential ways. Interestingly, our results parallel those of Gilens (2001) and Kull et al. (2011) on public preferences for foreign aid spending, although the shift runs in the opposite direction: a majority of Americans favor reduced spending on economic and military aid, yet they tend to overestimate current spending, and fewer oppose existing funding levels when provided with actual expenditure estimates.

Our results diverge from those of Kuklinski et al. (2000), who found no effect of providing information regarding welfare policy on opinion, as well as Nyhan and Reifler’s findings (2010) that correcting misperceptions regarding Iraqi possession of weapons of mass destruction, the effect of federal tax cuts on government revenues, and President G. W. Bush’s position on stem cell research not only failed to diminish misperceptions among the most ideologically committed respondents, but in some cases increased them. One possible explanation for this divergence relates to Kuklinski et al.’s distinction between being uninformed and misinformed, which they define as confidently holding wrong beliefs. They argue that misinformation regarding welfare policy makes it difficult for citizens to update their views when presented with the facts. Similarly, Nyhan and Reifler focused on misperceptions regarding politicized policy issues. In the case of education spending, it could be that Americans are simply uninformed rather than misinformed, and that information is more powerful in shifting opinion on issues where there is a lack of knowledge rather than strongly held wrong beliefs.

Readers should be cautious about concluding that, were the public to become better informed about current spending levels, support for higher funding would drop. The method used in our study to communicate information regarding spending may not have accurately simulated the way that information is acquired and applied in the real world, limiting our ability to generalize these findings beyond the experimental context (Althaus 1998; Gaines, Kuklinski, and Quirk 2007). However, Barabas and Jerit (2010) find that the effects of experimentally providing survey respondents with information on the relative financial health of Medicare and Social Security and a new citizenship test have similar, although larger, effects on respondents’ policy knowledge when compared to natural experiments among subgroups exposed to news sources with greater coverage of the same information. 12

To the extent that our findings are externally valid, they may shed light on the likely consequences of proposals to increase transparency about school finance in order to promote efficiency in the use of public resources, for example by incorporating spending information into educational accountability systems (see, e.g., Boser 2011). The accountability programs developed under the federal No Child Left Behind Act and its state-level predecessors focus on improving test scores and closing achievement gaps without considering whether schools achieve those goals in a cost-effective manner. Chingos, Henderson, and West (2012) demonstrate that the grades citizens assign their neighborhood public schools are correlated with publicly available information on test-score performance in those schools and that ratings issued by educational accountability systems have a causal effect on public perceptions of school quality. Our results suggest that increasing the transparency of educational expenditures may make it more difficult to sustain public support for higher school funding and teacher salaries, especially if such efforts focus primarily on current spending levels.

An important avenue for future research would be to examine the effects of providing citizens with additional information about school finance, such as how funds are allocated between different types of programs or expenses, how funds are distributed between different schools and students, or how much a respondent’s local schools spend relative to the state or national average. This research is especially relevant since political actors are likely to present budgetary information in the context of broader political frames and with additional information supporting their views on the merits of funding decisions. Although it may be contentious and difficult to summarize and present this type of information, research in this area has the potential to improve understanding of the effects of greater budget transparency and of how the public formulates opinions on major domestic spending issues.

Supplementary Data

Supplementary data are freely available online at http://poq.oxfordjournals.org/


Appendix Table A1.

Comparing Our Sample Demographics to the 2012 Current Population Survey

| | CPS | Our sample: Unweighted | Our sample: Weighted |
| --- | --- | --- | --- |
| Gender | | | |
| Female | 51.9 | 53.52 | 52.04 |
| Household income | | | |
| Less than $5,000 | 2.57 | 1.9 | 2.48 |
| $5,000 to $7,499 | 1 | 1.47 | 2.15 |
| $7,500 to $9,999 | 1.8 | 1.7 | 1.89 |
| $10,000 to $12,499 | 2.09 | 2.94 | 3.21 |
| $12,500 to $14,999 | 2 | 2.64 | 2.43 |
| $15,000 to $19,999 | 4.36 | 3.27 | 3.3 |
| $20,000 to $24,999 | 4.93 | 4.54 | 3.83 |
| $25,000 to $29,999 | 4.77 | 5.51 | 5.87 |
| $30,000 to $34,999 | 5.03 | 4.98 | 5.29 |
| $35,000 to $39,999 | 4.77 | 5.81 | 5.85 |
| $40,000 to $49,999 | 8.8 | 7.15 | 6.41 |
| $50,000 to $59,999 | 8.01 | 9.59 | 9.71 |
| $60,000 to $74,999 | 10.63 | 10.56 | 9.17 |
| $75,000 to $84,999 | 6.1 | 8.09 | 8.07 |
| $85,000 to $99,999 | 7.18 | 8.02 | 6.85 |
| $100,000 and over | 25.95 | 21.82 | 23.49 |
| Race/ethnicity^a | | | |
| White | 79.72 | 59.97 | 66.57 |
| Black/African American | 12.14 | 18.38 | 11.59 |
| Other | 6.6 | 2.94 | 4.01 |
| Two or more races | 1.56 | 1.97 | 3.17 |
| Hispanic | 14.78 | 16.74 | 14.66 |

^a CPS reports race categories without specifying Hispanic/non-Hispanic, while Knowledge Networks specifies “non-Hispanic” for each of the race categories (which likely explains the race group differences).

Footnotes

1.

The state estimate is based on 2012 spending. The local estimate is based on 2009 spending and includes both K–12 and higher-education spending.

2.

These states include Arizona, California, Colorado, Florida, Illinois, Michigan, Missouri, New Mexico, New Jersey, New York, Ohio, Oregon, Texas, Washington, and Wisconsin.

3.

These problems, which were identified by a combined 33 percent of respondents, were discipline, overcrowding, parental support, testing, fighting, teachers, and drugs.

4.

Knowledge Networks® offers to provide panelists free Internet access and a WebTV device that connects to a telephone and television, so the sample is not limited to subjects who previously owned a computer or had access to the Internet. When recruiting the panel, Knowledge Networks® sends an advance mailing and follows up with at least 15 dial attempts; it updates the panel quarterly.

5.

Median teacher experience (reported in table 74 of the 2011 Digest) is from 2006 and average salaries by experience level (table 80) are from 2007–08, while the national average for all teachers (table 84) is from 2009–10.

6.

If, for example, we allow k to equal the answer choice “stay the same,” then the model estimates the combined odds of giving an answer greater than “stay the same” (either “increase” or “greatly increase”) versus the odds of giving an answer less than or equal to “stay the same” (either “stay the same,” “decrease,” or “greatly decrease”).

7.

This evidence of responsiveness persists in models that include the covariates outlined in our Methods section; results available upon request.

8.

Nor do we find evidence that the information effect varies consistently by stakeholder group; results available upon request.

9.

Although we hypothesized that the salary information effect might be smaller for teachers, this did not appear to be the case (log-odds = .3949; p > .05).

10.

Because all respondents participated in two distinct experiments (one on per-pupil funding and one on salaries) and because the first experiment was two pronged (one with tax language and one without), we checked for interactions between the various permutations of experimental conditions. We examined whether the effect of salary information on support for higher salaries varied depending on whether respondents received information in the per-pupil funding experiment. We find that respondents who were provided with information about per-pupil spending then estimated that teachers earned $1,933.47 more, on average, than those who did not receive per-pupil spending information (p = .004). Despite this mean difference in salary estimates, the information effect on salary support did not vary depending on receipt of per-pupil spending information (β = .981; p > .05). This interaction did not depend on whether respondents in the per-pupil experiment saw taxation wording (β = 1.051; p > .05). We present these results in online appendix table C1.

11.

Yet another alternative interpretation of our findings relates to social desirability bias (Edwards 1957). Participants who were uninformed or were not asked about taxation may have felt pressure to express support for increased funding, given the breadth of public support for public education. This pressure could be mitigated when respondents were prompted to think about other social costs, such as public spending and taxation.

12.

They also find that information effects on policy opinions are not replicated in the natural experiment and conclude that belief integration is more complex in the real world than in experimental contexts. However, the authors point out that statistical power limits their ability to draw strong inferences. Additionally, it is not clear that the information provided in these studies was as closely related to the opinions gauged as it was in our study (i.e., information on a new citizenship test may have less influence on citizen views about the effect of immigration on the economy than information on funding levels has on opinions regarding funding increases).

References

  1. Althaus Scott. 1998. “Information Effects in Collective Preferences.” American Political Science Review 92:545–58.
  2. ———. 2003. Collective Preferences in Democratic Politics: Opinion Surveys and the Will of the People. New York: Cambridge University Press.
  3. Ballotpedia. 2013. “Approval Rates of Local School Bond and Tax Elections (2012).” http://ballotpedia.org/wiki/index.php/Approval_rates_of_local_school_bond_and_tax_elections_%282012%29.
  4. Barabas Jason, Jerit Jennifer. 2010. “Are Survey Experiments Externally Valid?” American Political Science Review 104:226–42.
  5. Bartels Larry. 1996. “Uninformed Votes: Information Effects in Presidential Elections.” American Journal of Political Science 40:194–230.
  6. Berkman Michael, Plutzer Eric. 2005. Ten Thousand Democracies: Politics and Public Opinion in America’s School Districts. Washington, DC: Georgetown University Press.
  7. Boser Ulrich. 2011. “Return on Investment: A District-by-District Evaluation of US Educational Productivity.” Washington, DC: Center for American Progress.
  8. Bushaw William, Lopez Shane. 2013. “Which Way Do We Go? The 45th Annual Phi Delta Kappa/Gallup Poll of the Public’s Attitudes Toward the Public Schools.” Kappan 95(1):8–25.
  9. Callegaro Mario, DiSogra Charles. 2008. “Computing Response Metrics for Online Panels.” Public Opinion Quarterly 72:1008–1032.
  10. Center on Budget and Policy Priorities. 2014. “Where Do Our State Tax Dollars Go?” Policy Basics. http://www.cbpp.org/files/policybasics-statetaxdollars.pdf.
  11. Chingos Matthew, Henderson Michael, West Martin. 2012. “Citizen Perceptions of Government Service Quality: Evidence from Public Schools.” Quarterly Journal of Political Science 7:411–45.
  12. Edwards Allen. 1957. The Social Desirability Variable in Personality Assessment and Research. New York: Dryden.
  13. Gaines Brian, Kuklinski James, Quirk Paul. 2007. “The Logic of the Survey Experiment Reexamined.” Political Analysis 15:1–20.
  14. Gilens Martin. 2001. “Political Ignorance and Collective Policy Preferences.” American Political Science Review 95:379–96.
  15. Henig Jeffrey R. 2013. “The Rise of Education Executives in the White House, State House, and Mayor’s Office.” In Education Governance for the Twenty-First Century, edited by Patrick McGuinn and Paul Manna, 178–208. Washington, DC: Brookings Institution Press.
  16. Howell William, West Martin. 2008. “Is the Price Right?” Education Next 8(3):37–41.
  17. Howell William, West Martin, Peterson Paul. 2013. “Reform Agenda Gains Strength.” Education Next 13(1):8–19.
  18. Krosnick Jon. 2002. “Is Political Psychology Sufficiently Psychological? Distinguishing Political Psychology from Psychological Political Science.” In Thinking About Political Psychology, edited by Kuklinski James, 187–216. Cambridge, UK: Cambridge University Press.
  19. Kuklinski James, Quirk Paul, Jerit Jennifer, Schwieder David, Rich Robert. 2000. “Misinformation and the Currency of Democratic Citizenship.” Journal of Politics 62:790–816.
  20. Kull Steven, Ramsay Clay, Lewis Evan, Subias Stefan. 2011. How the American Public Would Deal with the Budget Deficit. College Park, MD: Program for Public Consultation, University of Maryland.
  21. Lupia Arthur, McCubbins Matthew D. 1998. The Democratic Dilemma: Can Citizens Learn What They Need to Know? New York: Cambridge University Press.
  22. Miller Joanne, Krosnick Jon. 1996. “News Media Impact on the Ingredients of Presidential Evaluations: A Program of Research on the Priming Hypothesis.” In Political Persuasion and Attitude Change, edited by Mutz Diana, Sniderman Paul, Brody Richard, 79–100. Ann Arbor: University of Michigan Press.
  23. Moe Terry. 2001. Schools, Vouchers, and the American Public. Washington, DC: Brookings.
  24. Nyhan Brendan, Reifler Jason. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32:303–30.
  25. Page Benjamin, Shapiro Robert. 1992. The Rational Public: Fifty Years of Trends in Americans’ Policy Preferences. Chicago: University of Chicago Press.
  26. Plutzer Eric, Berkman Michael. 2005. “The Graying of America and Support for Funding the Nation’s Schools.” Public Opinion Quarterly 69:66–86.
  27. Roza Marguerite. 2010. Educational Economics: Where Do School Funds Go? Washington, DC: Urban Institute Press.
  28. Simon Adam. 2011. Mass Informed Consent: Evidence on Upgrading Democracy with Polls and New Media. Lanham, MD: Rowman & Littlefield.
  29. US Census Bureau. 2013. State and Local Government Finances Summary: 2011. Governments Division Briefs. http://www2.census.gov/govs/local/summary_report.pdf.
  30. US Department of Education. 2010. Highlights from PISA 2009: Performance of US 15-Year-Old Students in Reading, Mathematics, and Science Literacy in an International Context. http://nces.ed.gov/pubs2011/2011004.pdf.
  31. ———. 2013. Public School Graduates and Dropouts from the Common Core of Data: School Year 2009–10. http://nces.ed.gov/pubs2013/2013309rev.pdf.
  32. US Government Accountability Office. 2014. State and Local Government Expenditures: 1977–2009. http://www.gao.gov/assets/670/664569.pdf.
