Abstract
We use the state-mandated stay-at-home orders during the coronavirus pandemic as a setting to study whether political beliefs inhibit compliance with government orders. Using geolocation data sourced from smartphones, we find that residents of Republican counties are less likely to stay completely at home after a state order is implemented relative to residents of Democratic counties. Debit card transaction data shows that Democrats are more likely to shift to remote spending after state orders are implemented. Differences in factors such as Covid-19 risk exposure, geography, and county characteristics do not explain away our findings, suggesting that political beliefs are an important determinant of the effectiveness of government mandates. Political alignment with the officials issuing the orders may partially explain these partisan differences.
Keywords: COVID-19, Coronavirus, Political polarization, Geolocation data, Credit card transaction data
1. Introduction
Both the World Health Organization and the Centers for Disease Control and Prevention have recognized social distancing as the most effective way to slow the spread of the novel coronavirus. Early evidence from China and the 1918 US flu pandemic also highlights the importance of mandatory social distancing policies in fighting the spread of the disease (Kraemer et al. (2020); Correia et al. (2020); Chudik et al. (2020)). Observing the effectiveness of social distancing in Asia and Europe, most US states also issued stay-at-home and shelter-in-place orders. Over the first wave of the pandemic, roughly 316 million Americans were subject to some level of social distancing requirement. In this paper, we leverage geolocation tracking data sourced from smartphones as well as debit card transaction data to analyze the effectiveness of state-level social distancing policies and show that political beliefs are an important limitation on whether people adhere to these orders.
Due to the recent increase in political polarization in the U.S. (Boxell et al., 2020), it is possible that political beliefs would heterogeneously affect compliance with social distancing orders. Affective polarization has been shown to influence other historically nonpolitical choices, such as dating (Iyengar et al., 2012), shopping behavior (Painter, 2020), and hiring (Gift and Gift, 2015).1 However, there is less empirical evidence examining whether people are guided by political beliefs regarding a choice where the consequences could be an increase in hospitalizations and deaths of their neighbors.
Media reports have suggested that people from more Democratic areas show more distrust in President Trump's initial message regarding the pandemic and are more proactive about social distancing.2 The press has also proposed that President Trump initially downplayed the severity of the coronavirus pandemic, suggesting that Republicans may not take social distancing orders seriously.3 Supporting this conjecture, survey evidence from Pew Research shows that 83% of Republicans agree that Trump is “doing an excellent/good job responding to the coronavirus outbreak” whereas only 18% of Democrats agree.4 Whether this survey and media coverage are representative of the U.S. population is an important question, as the answer may help policymakers better allocate time and resources during this pandemic and provide guidance for future national emergencies.
To analyze whether partisanship affects adherence to social distancing orders, we create a measure of social distancing based on the daily GPS location of a large sample of smartphones in the US. From this data we measure social distancing as the percentage of people who stay home for an entire day relative to all people identified in a county.5 We also collect data on debit card transactions, government-sanctioned social distancing orders, county-level demographics, and county-level voting results from the 2016 presidential election. Combining these datasets allows us to study whether partisanship affects adherence to social distancing orders through a difference-in-differences framework. Our focus is on state-level social distancing orders, as this setting enables the use of state × date fixed effects, allowing us to study how counties with different political beliefs respond to the same order while controlling for other factors that may influence social distancing behavior.
We document a striking difference in compliance with state-level social distancing orders based on partisanship. A one standard deviation increase in the county-level share of votes for Donald Trump in the 2016 election is associated with a 0.3 percentage point (pps) lower percentage of people who stay at home after a state social distancing order relative to the average county, which increases distancing by 1.9pps. This result holds across a battery of robustness tests, including the use of county and date fixed effects, state × date fixed effects, and controls for county demographics (e.g., population and income), weather, other local policies (e.g., closing schools), reports of county-level coronavirus cases and deaths, and other characteristics that may influence social distancing behavior (e.g., social capital and the ability to work from home). We also find consistent results using a nearest-neighbor matching model in which we match Republican and Democratic counties based on population, population density, income, and age. Event-study analyses suggest the effect we find is driven by the stay-at-home orders and not by pre-existing differences in social distancing behavior.
We confirm our results are robust to the use of the median dwell time at home as our measure of social distancing. While the percentage of people who stay completely at home captures the extensive margin of compliance, time spent at home focuses more on the intensive margin. Median dwell time also has the benefit of not counting citizens who leave the house for short essential trips (e.g., going to the grocery store) as non-compliant. Using the natural log of the daily median dwell time in a county as our dependent variable, we find a one standard deviation increase in the share of the vote that went to President Trump is associated with a 2.5% lower amount of time spent at home after a social distancing order is enacted, relative to the average county. This finding is robust to the same host of robustness checks employed when using the share of people who stay completely at home as our dependent variable.
We next explore whether the partisan divide is driven by differences in political beliefs or differences in risk exposure. In particular, Democratic areas tend to be more densely populated than Republican areas, which can confound our interpretation. However, when we exclude the top 10% or top 25% of counties based on population density we continue to find the partisan effect. We also estimate subsample regressions of our main specification based on population density. We find the partisan effect holds in the top three quintiles of population density when analyzing the percent of people staying completely at home and in all five quintiles when analyzing median dwell time. Thus it appears that risk exposure does play a role, but does not drive out the political belief mechanism.
While documenting the partisan differences in adherence to social distancing orders is the main contribution of our paper, we also document consistent evidence using consumption data. Specifically, we analyze debit card transaction data to examine whether partisanship influences remote consumption in relation to social distancing orders. We identify remote transactions as those that do not require physically visiting a store (e.g., Amazon.com or Instacart). We find that Democratic counties increase the proportion of their spending on remote transactions more than Republican counties after a state policy is implemented. In our baseline specification, a one standard deviation decrease in the Trump vote share is associated with a 0.2pps increase in a county’s fraction of remote spending relative to the average county after a state policy is in place. This result illustrates one channel through which Democratic counties adjust their behavior more than Republican counties in response to social distancing orders. Further, our consistent results across both geolocation and debit card data suggest that biases in any single dataset are not driving our results.
Our final tests focus on whether the political affiliation of the governor announcing a state-level social distancing order affects compliance. If Republicans' weaker response to social distancing orders is due to President Trump's early dismissal of the pandemic, we may likewise find that Democrats' response to orders varies with the political affiliation of the official who gives the order. We identify “aligned” counties as those with the same political affiliation as the governor and “misaligned” counties as those with conflicting political identities. We find that misaligned counties have a 0.3pps lower response to state-level social distancing orders relative to aligned counties. This difference is driven by misaligned Democratic counties, which have a 0.7pps lower response relative to aligned Democratic counties. The difference between aligned and misaligned Republican counties is statistically insignificant. We note that these tests are cross-state comparisons and thus do not allow the use of state × date fixed effects, making it more difficult to rule out unobservables related to the state order itself. Nevertheless, these results provide suggestive evidence that political alignment is one mechanism through which adherence to government orders can be maximized.
Our findings are consistent with recent theory in political economy. In particular, the model of Gitmez et al. (2020) finds that political preferences can induce individuals to seek out information that downplays the severity of a pandemic which could lead to variation in response to orders from political leaders. More generally, Eliaz and Spiegler (2020) formalize a model that suggests that competing narratives can induce political disagreement. The aforementioned media reports and surveys regarding the coronavirus pandemic show that competing narratives have been prevalent in this setting and our empirical findings are consistent with these narratives driving the political differences in response to government mandates.
Our findings are also related to the empirical literature studying how political beliefs can influence behavior. Examining politically-charged fake news, Long et al. (2019) find that conservative-media dismissals of the dangers of hurricanes Harvey and Irma led to lower evacuation rates for conservatives relative to liberals. Painter (2020) shows that consumers respond along partisan lines when firms issue political statements. There is also evidence that political polarization can influence economic expectations (Gerber and Huber (2010); McConnell et al. (2018)) and cause voters to vote for below average candidates (Duell and Valasek, 2019). We extend this literature to the recent pandemic setting, showing that partisan beliefs can influence responses to government orders even amidst the presence of potentially significant health consequences.
Several concurrent papers also examine partisan differences in social distancing in response to the coronavirus. Allcott et al. (2020) use survey and geolocation data to study partisan differences in social distancing and also find that Democratic counties are more likely to socially distance. Barrios and Hochberg (2020) use Google search data and an alternative source of geolocation data to measure the perception of risk that Republicans and Democrats feel regarding COVID-19. Engle et al. (2020) and Andersen (2020) also use GPS data to show an association between the county-level Trump vote share and social distancing.6 We differ from these studies by focusing on how partisanship affects the response to state-level social distancing orders, which we argue is more relevant to policy-makers. Additionally, our paper is the first to show consistent evidence across both geolocation and debit card data and is the only one to present political misalignment as a potential mechanism.7
2. Data
The primary datasets we use in this study are (1) geolocation data from SafeGraph, (2) debit card transaction data from Facteus, (3) the timing and location of government-sanctioned social distancing orders from the New York Times, and (4) county-level election results and demographics from various sources.
2.1. Geolocation data
To create a measure of social distance compliance, we rely on anonymized location data from SafeGraph covering daily movements from January 2020 until April 23, 2020. SafeGraph partners with mobile application services that have opt-in consent from users to collect location data. These partnerships allow SafeGraph to see location data from approximately 35 million unique devices in a given month. To preserve anonymity, the data is aggregated to the census block group (CBG) level and all CBGs with fewer than five observations are omitted. This geolocation data is advantageous as it allows us to see the movement behavior of a large sample of Americans. Further, prior studies using SafeGraph data find the data are generally representative of the US population (Chen et al., 2019) and, in particular, representative of voting patterns in the U.S. (Chen and Rohla, 2018).
From the SafeGraph data we create the following variable to track social distancing:
$$\%\,\text{Completely at Home}_{i,t} = \frac{\text{Devices Home}_{i,t}}{\text{Total Devices}_{i,t} - \text{Workers}_{i,t}} \tag{1}$$
where Devices Home is the number of devices in county i on day t that never left home. Home is measured as the common nighttime location of each mobile device over a 6 week period to a Geohash-7 granularity (about 153 square meters). Total Devices is the total number of devices identified in county i on day t, and Workers is the number of devices that leave home and go to another location for more than three hours during the period of 8 am to 6 pm local time.8 A higher value of % Completely at Home indicates that more residents in the area are complying with the social distancing order. Our measure captures the extensive margin of compliance as it only identifies those who are home the entire day as compliant. To provide evidence on whether the intensive margin of social distancing differs between partisan groups, we also use the natural log of the median dwell time at home to measure social distancing in additional tests. Panel A of Table 1 shows the average % Completely at Home (Dwell Time) is 35.2% (633 minutes) across the entire sample, 33.1% (606 minutes) for the before state-policy sample, and 44.2% (748 minutes) for the after state-policy sample.
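As a concrete illustration, the sketch below computes Eq. (1) from a county-day table of device counts. It is a minimal example assuming hypothetical column names (devices_home, total_devices, workers) rather than SafeGraph's actual field names.

```python
import pandas as pd

# Hypothetical county-day aggregates; column names are illustrative,
# not SafeGraph's actual schema.
panel = pd.DataFrame({
    "county_fips": ["21067", "21067", "29189"],
    "date": pd.to_datetime(["2020-03-20", "2020-03-21", "2020-03-20"]),
    "devices_home": [5200, 5500, 41000],      # devices that never left home
    "total_devices": [15000, 15100, 120000],  # all devices observed in the county
    "workers": [2000, 1900, 15000],           # devices away >3 hours, 8am-6pm
})

# Eq. (1): share of non-worker devices that stayed completely at home
panel["pct_completely_home"] = (
    panel["devices_home"] / (panel["total_devices"] - panel["workers"])
)
print(panel[["county_fips", "date", "pct_completely_home"]])
```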
Table 1.
Summary statistics.
Panel A: Social Distancing Behavior
 | All | | Before State Policy | | After State Policy | |
 | Mean | SD | Mean | SD | Mean | SD |
Completely at Home | 0.352 | 0.086 | 0.331 | 0.075 | 0.442 | 0.075 |
Dwell Time at Home (Minutes) | 633.204 | 149.700 | 605.924 | 138.471 | 748.391 | 140.223 |
Remote Spending | 0.083 | 0.093 | 0.080 | 0.093 | 0.098 | 0.094 |
Observations | 335,344 | 271,131 | 64,213 | |||
Panel B: County Demographics
 | All | | Dem Counties | | Rep Counties | |
 | Mean | SD | Mean | SD | Mean | SD |
Population (000s) | 87.20 | 304.55 | 276.88 | 662.30 | 45.20 | 80.12 |
Population/Sq.Mi. | 76.06 | 892.02 | 282.37 | 2065.08 | 29.70 | 67.79 |
Age | 41.50 | 5.35 | 38.76 | 5.61 | 42.10 | 5.09 |
Income (000s) | 52.16 | 13.30 | 56.25 | 19.08 | 51.26 | 11.44 |
Precipitation | 3.09 | 8.26 | 3.20 | 8.39 | 3.07 | 8.23 |
Max Temp. | 53.26 | 16.76 | 53.15 | 16.32 | 53.29 | 16.86 |
Min Temp. | 32.49 | 15.13 | 33.11 | 15.04 | 32.35 | 15.15 |
Social Capital | 0.03 | 1.27 | -0.21 | 1.02 | 0.09 | 1.31 |
Telework (MSA level) | 0.42 | 0.08 | 0.45 | 0.08 | 0.41 | 0.08 |
Belief in Science | 0.48 | 0.05 | 0.55 | 0.04 | 0.46 | 0.03 |
Bachelor’s + | 0.21 | 0.09 | 0.29 | 0.13 | 0.19 | 0.07 |
Male | 0.50 | 0.02 | 0.50 | 0.02 | 0.50 | 0.02 |
Black | 0.09 | 0.14 | 0.20 | 0.23 | 0.07 | 0.10 |
White | 0.85 | 0.16 | 0.70 | 0.23 | 0.88 | 0.11 |
Native American | 0.02 | 0.07 | 0.04 | 0.12 | 0.02 | 0.04 |
Asian | 0.01 | 0.03 | 0.03 | 0.05 | 0.01 | 0.01 |
Observations | 335,344 | 60,798 | 274,546 | |||
Panel C: Matched Sample Covariates
 | Democratic | Republican | t-stat |
Median Household Income | 55,456 | 56,340 | 0.6 | |||
Median Age | 38.3 | 38.6 | 0.68 | |||
Population (100,000) | 3.4 | 2.6 | 1.63 | |||
Population Density | 411.5 | 213.2 | 1.39 |
Note: This table reports summary statistics of our social distancing and population data. The unit of observation is a county-day and the sample covers January 1st to April 23rd, 2020. Panel A reports summary statistics on social distancing behavior and remote spending for all observations as well as split into the periods before and after a state policy is in place. Panel B reports summary statistics for our baseline demographic variables for all observations as well as split by political affiliation. All variables are defined in Table A.1. Panel C reports the averages for variables used in the matching specification in Table 2. Covariates are reported separately for Democratic and Republican counties. We use two-to-one nearest neighbor matching with replacement. The final column of Panel C reports whether the difference between the two averages is statistically significant. Data sources: smartphone geolocation data from SafeGraph Inc., debit card transaction data from Facteus, and county demographics from the 2018 American Community Survey.
Though the SafeGraph data is extensive and useful for our setting, it does have some limitations. The data is nationally representative but relies on smartphones to track location, and as of 2018, 23% of American adults did not own a smartphone.9 Thus inferences from the geolocation data can only be drawn about those who own smartphones. Further, some smartphones may exit the sample if the phone is permanently turned off or the apps used to track location are deleted from the phone. Date fixed effects help address this limitation. Finally, the data is generated through intermittent and somewhat random “pings” to smartphones and is not monitored continuously throughout the day. This means short trips outside the home may be missed if the phone is not pinged during that time. This could introduce bias, as residents of more densely populated areas (which tend to be Democratic) are able to make short trips out of the house whereas residents of rural areas (which tend to be Republican) must make longer trips for daily necessities (e.g., groceries). We address this potential bias using multiple approaches to control for population and population density.
2.2. Debit card transaction data
Our debit card data comes from Facteus, a data aggregation firm. The dataset is sourced from over 12 million debit cards and covers daily transactions at the zip-code level from January 2020 to April 17 2020. The data is primarily sourced from payroll cards, government cards, and challenger banks and therefore is primarily made up of younger consumers in the middle- to lower-income brackets.10 This differs from the SafeGraph data, which skews slightly towards higher income individuals. While neither dataset identically represents the entire U.S., consistent results in both datasets would suggest that no one bias in either dataset is driving the results. We note that transaction data similar to Facteus has been used in other research to document changes in spending behavior during the pandemic (Chetty et al. (2020); Baker et al. (2020a); Baker et al. (2020b); Chen et al. (2020)). In particular, Baker et al. (2020a) show that the level of spending on certain categories differs among Republicans and Democrats, with Republicans spending more at retail shops and restaurants. Our intent with the transaction data is not to repeat this work, but rather to document one potential channel that enables Democrats to practice social distancing more than Republicans.
One effective way for Americans to comply with social distancing orders is to shift their consumption toward purchases that do not require face-to-face transactions. Because the debit card data allows us to see the amount spent on each brand at the zip-code level, we are able to create a proxy for the amount spent remotely by identifying brands that operate primarily online or by mail. We manually identify brands in the Facteus database that generate the vast majority of their revenue through remote transactions. These brands include online retailers (e.g., Amazon), grocery delivery services (e.g., Instacart), and shipment services (e.g., FedEx).11 We then create the following measure for the proportion of consumption spent remotely:
$$\%\,\text{Remote Spending}_{i,t} = \frac{\text{Remote Spending}_{i,t}}{\text{Total Spending}_{i,t}} \tag{2}$$
where Remote Spending is the dollar amount spent at firms that do not require face-to-face transactions in county i on date t and Total Spending is the total amount spent in county i on date t. Studying the proportion of transactions that are done remotely allows us to adjust for the potential that overall spending is also affected by social distancing orders. We note that the Facteus data does not differentiate spending between the website and brick-and-mortar locations of a single brand.12 Therefore some transactions that happen remotely may not be recognized as remote in our data, and our measure can be thought of as a lower bound on remote spending. For a given county-day the average number of transactions we observe is 1,639, with an average total spent of $55,820 and an average proportion of the amount spent remotely of 8.3%. Panel A of Table 1 shows the average proportion spent remotely is 8% before state policies are enacted and 9.8% after state policies are put in place, suggesting the U.S. population as a whole increased remote spending while under stay-at-home orders.
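The sketch below illustrates one way to construct Eq. (2) from brand-level transaction records. The column names and the small remote-brand list are hypothetical placeholders for the manually identified list described above.

```python
import pandas as pd

# Hypothetical county-day-brand transaction records (not Facteus' actual schema).
tx = pd.DataFrame({
    "county_fips": ["21067", "21067", "21067", "29189"],
    "date": pd.to_datetime(["2020-03-20"] * 4),
    "brand": ["Amazon", "Kroger", "Instacart", "FedEx"],
    "amount": [120.0, 80.0, 45.0, 30.0],
})

# Illustrative subset of brands classified as remote (no in-store visit required).
remote_brands = {"Amazon", "Instacart", "FedEx"}
tx["remote_amount"] = tx["amount"].where(tx["brand"].isin(remote_brands), 0.0)

# Eq. (2): remote dollars divided by total dollars for each county-day
daily = tx.groupby(["county_fips", "date"])[["remote_amount", "amount"]].sum()
daily["pct_remote_spending"] = daily["remote_amount"] / daily["amount"]
print(daily["pct_remote_spending"])
```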
2.3. Government social distancing orders
We use the social distancing policy data assembled by the New York Times as it is comprehensive and provides precise information on both the timing and geography of social distancing orders.13 Importantly for our study, it also provides official documentation for each order, allowing us to identify the policy announcer in each case. California, the most populous state, was the first to issue a statewide stay-at-home order, effective March 19. Since then, a total of 42 states have issued social distancing orders. We merge the political affiliation of all governors with the NYT data as it is not included in their report. We also gather daily data on the number of reported cases and deaths in each county from the NYT.14
There are also instances of counties issuing local orders before state-wide social distancing orders were put in place. In these cases, the NYT also collects information at the city/county level. This data is not useful in our analysis, however, as most city/county-level orders are not made by elected political officials. For example, county-level social distancing orders in Missouri have largely been made by public health officials. For this reason, we exclude all counties that implemented a county-level social distancing order from our analysis. Excluding these counties is also beneficial as they were likely to have unofficial local policies (e.g., closing parks to the public) that would be difficult to systematically identify.
2.4. Political affiliation and other demographic data
Our setting also requires a proxy for the political preference of U.S. residents. We use the results of the 2016 U.S. Presidential election to measure a county’s political preference. Specifically, we collect county-level voting data from the MIT Election Data and Science Lab (MIT, 2018) and use the vote share won by Donald Trump to measure the degree to which a county leans Republican or Democrat. We also collect county-level demographic data from the 2018 American Community Survey database. This data includes county-level characteristics for population, population density, income, age, education, gender, and race.
Subsequent to the release of the first draft of this paper, new studies have documented other characteristics that are associated with social distancing behavior. In cases where data is available, we ensure our results are robust to controlling for these other factors. These characteristics include belief in science (Brzezinski et al., 2020), social capital (Ding et al., 2020), the ability to work from home (Dingel and Neiman, 2020), and local weather (Kapoor et al., 2020). We include the formal definitions and sources of all variables used in Table A.1. of the appendix.
We present summary statistics for demographic variables in Panel B of Table 1. We present means and standard deviations for the entire sample as well as split by partisanship, where we label counties as Republican (Democratic) if greater than (less than) 50% of the vote in that county went to Trump in the 2016 presidential election. As is often documented, we find Democratic counties have larger populations, are more densely populated, younger, wealthier, more educated, and have a larger share of minorities as residents. Democratic counties also have lower social capital, are better able to work from home, and have a higher belief in science. Democratic and Republican counties are similar in terms of daily precipitation and temperature.
3. Results
3.1. Partisan differences in adherence to social distancing orders
3.1.1. Main results
We examine whether political beliefs affect the response to state-level social distancing orders using the following generalized difference-in-differences estimation:
$$\%\,\text{Completely at Home}_{i,t} = \beta_1\,\text{State Policy}_{s,t} + \beta_2\,\text{State Policy}_{s,t} \times \text{Trump Vote}_{i} + \Gamma' X_{i,t} + \alpha_i + \delta_t + \varepsilon_{i,t} \tag{3}$$
where % Completely at Home is the percentage of smart devices that were completely at home in county i on day t, State Policy is an indicator equal to one if a state-level social distancing order is in effect in state s on day t,15 and Trump Vote is the county-level vote share that went to Donald Trump in the 2016 election. We z-score Trump Vote to have a mean of zero and standard deviation of one. The coefficient on the interaction term, β2, captures the marginal response to social distancing orders based on how much a county leans Republican or Democrat. Our baseline estimate includes controls for the one-day lag of the natural log of the cumulative number of cases and deaths due to the coronavirus in a county. Additionally, we include interactions of county-level controls for population, population density, age, and income with the state-policy indicator to dynamically control for these variables. We also include as controls dummy variables that identify when a state closed k-12 schools, day cares, gyms, and movie theatres and banned nursing home visits, non-essential business, and sit-in restaurants.16
We include county fixed effects to control for time-invariant local factors like county size or exposure to certain industries. We also include date fixed effects to control for common factors across time like the release of national coronavirus-related news on a certain day. Further, we include state × date fixed effects in certain specifications to capture state-specific trends as well as any other state policies not controlled for. State × date fixed effects also ensure that no characteristics of the state policy itself (e.g., the tone of the order) influence our results, as they allow us to compare counties under the same order but with different political beliefs. We double-cluster standard errors at the county and date level.17
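For readers who wish to replicate the design, the sketch below estimates a stripped-down version of Eq. (3) with county and date fixed effects and two-way clustered standard errors. It assumes a county-day DataFrame with hypothetical column names (pct_home, state_policy, trump_vote_z, and lagged case/death controls) and omits the full control set for brevity.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

def estimate_eq3(df: pd.DataFrame):
    """Minimal version of Eq. (3): county and date fixed effects,
    standard errors double-clustered by county and date.
    Expects columns: county_fips, date, pct_home, state_policy,
    trump_vote_z, ln_cases_lag, ln_deaths_lag (names are illustrative)."""
    df = df.copy()
    df["policy_x_trump"] = df["state_policy"] * df["trump_vote_z"]
    df = df.set_index(["county_fips", "date"])  # entity, time panel index

    mod = PanelOLS.from_formula(
        "pct_home ~ state_policy + policy_x_trump"
        " + ln_cases_lag + ln_deaths_lag"
        " + EntityEffects + TimeEffects",
        data=df,
    )
    # Two-way clustering by county (entity) and date (time)
    return mod.fit(cov_type="clustered", cluster_entity=True, cluster_time=True)

# res = estimate_eq3(panel_df); print(res.summary)
```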
We report the results of estimating Eq. (3) in Table 2. In column (1) we analyze how political partisanship affects adherence to social distancing orders while controlling for local cases, deaths, and other state-level closures. The average county by vote share increased the percentage of people who stay at home by 1.7pps after a state policy, a 5% increase relative to the mean.18 Consistent with the argument that Republicans were influenced by Trump's early dismissal of the pandemic, we find that a higher vote share to Trump is associated with a lower proportion of people staying completely at home. Specifically, a one standard deviation increase in the vote share to Trump is associated with a 0.6pps decrease (35% lower) in the proportion of people staying completely at home after a state policy relative to a county with an average vote share to Trump.
Table 2.
Partisan Response to Social Distancing Orders - % Completely at Home.
 | % Completely at Home | | | |
 | (1) | (2) | (3) | (4) |
State Policy | 0.017*** | 0.019*** | | 0.027*** |
 | (4.46) | (5.23) | | (4.35) |
State Policy × Trump Vote | -0.006*** | -0.003** | -0.003*** | |
 | (-4.89) | (-2.44) | (-3.10) | |
State Policy × Republican | | | | -0.008** |
 | | | | (-2.12) |
Matching | No | No | No | Yes |
Base Demographic Controls | No | Yes | Yes | No |
Health Controls | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes |
Date FE | Yes | Yes | No | Yes |
State × Date FE | No | No | Yes | No |
R² | 0.708 | 0.751 | 0.814 | 0.786 |
Observations | 332,368 | 321,741 | 321,629 | 82,202 |
Note: This table reports regression results from estimating Eq. (3). The unit of observation is a county-day. The dependent variable is the daily % of people who stay completely at home in a county. State Policy equals one if the underlying county is in a state that has a social distancing order in place on the day of observation and equals zero otherwise. Base demographic controls include the interaction of population, population density, age, and income with State Policy. Health controls include the natural log of the one-day lag of cumulative coronavirus cases and deaths in a county. Other closures are dummy variables that identify when a state closed k-12 schools, day cares, gyms, and movie theaters and banned nursing home visits, non-essential business, and sit-in restaurants. Matching specifications involve a two-to-one match with replacement of Republican counties to Democratic counties based on population, population density, age, and income. Definitions of dependent variables and controls can be found in Table A.1. All continuous variables are z-scored to a mean of zero and standard deviation of one. t-statistics, based on standard errors double-clustered at the county and date level, are reported in parentheses. *** p<0.01, ** p<0.05, * p<0.1.
We also analyze this effect in event time in Fig. 1. We interact our state policy variable with indicator variables for how far a date is from the state policy enactment and report the resulting coefficients. By construction, these coefficients capture the time series of differences in social distancing compliance between treated and control counties. We first present the aggregate response to state-level social distancing orders in Panel A. The baseline result shows little difference between the social distancing in our treatment and control counties before state policies are enacted and a significant jump in the difference once a state policy goes into effect. On day zero, counties with state policies practice social distancing 2.5pps more than counties with no policy. This difference attenuates as counties move further away from the date of the state-policy order. In Panel B, we conduct the event study on subsamples of Republican and Democratic counties.19 The partisan split event study shows a significant difference in the response to state policies when comparing Republican counties (>50% Trump) and Democratic counties, with Democratic counties responding more. Further, the Democratic counties' response persists through the entire sample, whereas the Republican county response attenuates toward zero.
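The event-study coefficients can be constructed by interacting treatment with dummies for days relative to order enactment. The sketch below builds those event-time indicators under the same hypothetical column names as the earlier regression sketch; the binning window is an illustrative choice.

```python
import pandas as pd

def add_event_time_dummies(df: pd.DataFrame, window: int = 14) -> pd.DataFrame:
    """Create event-time indicators around each state's order date.
    Expects columns: date, policy_start_date (NaT for never-treated counties).
    The +/- `window` binning is an illustrative choice."""
    df = df.copy()
    rel = (df["date"] - df["policy_start_date"]).dt.days
    rel = rel.clip(lower=-window, upper=window)  # bin the endpoints
    for k in range(-window, window + 1):
        if k == -1:
            continue  # omit day -1 as the reference period
        df[f"evt_{k}"] = (rel == k).astype(float)
    return df

# The evt_k dummies can then replace State Policy in the fixed-effects
# regression, tracing out the path of coefficients plotted in Fig. 1.
```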
Fig. 1.
Changes in Social Distancing around State Policies. This figure plots coefficients from a regression of social distancing on the interaction of the state policy indicator with event-time dummies for shelter-in-place orders, where day 0 is the first date the state order went into effect. Panel A plots the entire sample. Panel B plots subsamples of Rep. and Dem. counties, where Rep. counties are those where Trump received over 50% of the vote in the 2016 Presidential election. Controls are the same as used in Table 2 column (2). County and date fixed effects are included. Standard errors are double-clustered at the county and date level.
In column (2) of Table 2 we include interactions of the state policy indicator with controls for county-level population, population density, age, and income in order to adjust for the potential that these demographics influence our results. We find these controls do attenuate the effect, but the effect remains important, with the coefficient on State Policy × Trump Vote equal to -0.003 and still statistically significant. We find consistent results when including state × date fixed effects in column (3), which allows us to rule out the potential that differences in the orders themselves could be influencing our results. We view the results in columns (2) and (3) as our baseline estimates and now proceed to ensure these results are robust.
3.1.2. Robustness of main results
Our first robustness check relaxes the functional form assumptions of our model by using nearest neighbor matching. Specifically, we construct a two-to-one match of Republican (>50% Trump vote) counties with Democratic counties based on a county’s median income, median age, population, and population density. We find these covariates match well between Republican and Democratic counties, as all comparisons of differences in these variables are insignificant at conventional levels (see Panel C of Table 1). Using our matched sample, we interact State Policy with an indicator for whether a county is Republican. We find results consistent with our baseline estimates, as under this specification Republican counties have a % Completely at Home rate that is 0.8pps lower than their matched Democratic counties after a state policy is implemented.
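A minimal sketch of the matching step, assuming a county-level DataFrame with the four covariates and a Republican indicator: it pairs each Democratic county with its two nearest Republican counties (with replacement) in standardized covariate space, mirroring the two-to-one match described above.

```python
import pandas as pd
from sklearn.neighbors import NearestNeighbors

def match_counties(counties: pd.DataFrame) -> pd.DataFrame:
    """Two-to-one nearest neighbor matching with replacement.
    Expects columns: county_fips, republican (0/1), income, age,
    population, pop_density (names are illustrative)."""
    covars = ["income", "age", "population", "pop_density"]
    # Standardize covariates so no single one dominates the distance metric
    z = (counties[covars] - counties[covars].mean()) / counties[covars].std()

    dem = counties["republican"] == 0
    rep = counties["republican"] == 1

    nn = NearestNeighbors(n_neighbors=2).fit(z[rep])
    _, idx = nn.kneighbors(z[dem])  # two nearest Republican counties per Dem county

    matched_rep = counties[rep].iloc[idx.ravel()]
    return pd.concat([counties[dem], matched_rep])

# matched = match_counties(county_df)  # then re-estimate Eq. (3) on this sample
```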
Though we believe our baseline specification is reasonable and defensible, it is important for readers to understand how the reported relationship between partisanship and adherence to social distancing orders may differ under alternative specifications. Therefore, in the spirit of Simonsohn et al. (2019), we next create a specification curve that reports results for 52 alternative specifications. This exercise helps mitigate the possibility that any one arbitrary choice in our model is driving the results. In these specifications we include our baseline controls and various combinations of controls that could also affect social distancing behavior. These controls include the number of people in a county with a bachelor's degree or higher, the racial makeup of a county (percent White, Black, Asian, and Native American), the proportion of residents who believe climate change is human-induced (belief in science), the fraction of residents who are male, a county’s social capital, the proportion of residents who can reasonably work from home (measured at the MSA level), and the daily precipitation, maximum temperature, and minimum temperature in a county.20 We also vary the fixed effects used. In addition to specifications with county and date fixed effects, we present results using county and state × date fixed effects. Time-invariant controls are interacted with the state policy indicator in all models using county fixed effects.
Fig. 2 reports the coefficient on State Policy × Trump Vote, along with 90% and 95% confidence intervals, for each specification, where % Completely at Home is the dependent variable. The remaining panels indicate which controls are included in each specification. The coefficient on State Policy × Trump Vote is relatively stable and is significant at the 10% level in all but one specification and at the 5% level in all but two.21 This specification curve highlights that the relationship between partisanship and adherence to social distancing orders is robust to alternative specifications and suggests that our baseline estimates are unlikely to be due to data mining.22
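The sketch below outlines the mechanics of a specification curve: loop over combinations of optional controls, re-estimate the model, and collect the interaction coefficient with its confidence interval. It reuses the hypothetical column names from the earlier regression sketch, uses a small illustrative control set rather than the paper's full list, and is not the authors' exact code.

```python
from itertools import combinations
import pandas as pd
from linearmodels.panel import PanelOLS

# Illustrative time-invariant county controls; the paper's full set is larger.
OPTIONAL_CONTROLS = ["educ_ba", "pct_white", "belief_science", "social_capital"]

def spec_curve(df: pd.DataFrame) -> pd.DataFrame:
    """Re-estimate the baseline model under many control combinations and
    collect the State Policy x Trump Vote coefficient with its 95% CI."""
    df = df.copy()
    df["policy_x_trump"] = df["state_policy"] * df["trump_vote_z"]
    # Time-invariant controls are interacted with the policy indicator,
    # since their levels are absorbed by the county fixed effects.
    for c in OPTIONAL_CONTROLS:
        df[f"{c}_x_policy"] = df[c] * df["state_policy"]
    df = df.set_index(["county_fips", "date"])

    rows = []
    for k in range(len(OPTIONAL_CONTROLS) + 1):
        for ctrls in combinations(OPTIONAL_CONTROLS, k):
            rhs = " + ".join(["state_policy", "policy_x_trump"]
                             + [f"{c}_x_policy" for c in ctrls])
            res = PanelOLS.from_formula(
                f"pct_home ~ {rhs} + EntityEffects + TimeEffects", data=df
            ).fit(cov_type="clustered", cluster_entity=True, cluster_time=True)
            ci = res.conf_int().loc["policy_x_trump"]
            rows.append({"controls": ctrls,
                         "beta": res.params["policy_x_trump"],
                         "ci_low": ci["lower"], "ci_high": ci["upper"]})
    return pd.DataFrame(rows).sort_values("beta")
```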
Fig. 2.
Specification Curve for the Partisan Response to Social Distancing. This figure plots the coefficient on State Policy × Trump Vote for various model specifications. The darker (lighter) grey bands indicate 90% (95%) confidence intervals. The dependent variable is % Completely at Home. The bottom panels indicate which covariates and fixed effects are included in each specification. In all models with county fixed effects, we interact the time-invariant controls with the State Policy indicator. Standard errors are double-clustered at the county and date level in all specifications. Control variables are described in Table A.1.
One concern regarding our main dependent variable is that it identifies anyone leaving the house for any reason as non-compliant. The measure therefore may misclassify people who leave the house for essential reasons (e.g., doctor visits). We thus ensure our model is robust to using the median dwell time at home as the dependent variable. With this measure we assume people who stay home for a larger portion of the day are attempting to socially distance more than those who stay home less.
The results using the natural log of the median dwell time as the dependent variable are shown in Table 3. Consistent with our baseline results, we find that counties with a higher Trump vote share stay home less relative to the average county after a state policy is implemented. A one standard deviation increase in Trump vote share is associated with 2.5% less time spent at home than the average county after a state order is implemented. We find this result is robust to the inclusion of controls for population, density, age, and income as well as the inclusion of state × date fixed effects. Further, we find consistent results under our nearest neighbor matching specification, which shows that Republican counties are home for 6% less time each day than Democratic counties after state policies are put in place, relative to before policies are implemented.23
Table 3.
Partisan Response to Social Distancing Orders - Median Dwell Time.
 | Ln(Dwell Time at Home) | | | |
 | (1) | (2) | (3) | (4) |
State Policy | 0.011 | 0.025*** | | 0.056*** |
 | (1.33) | (3.41) | | (4.72) |
State Policy × Trump Vote | -0.025*** | -0.029*** | -0.025*** | |
 | (-6.36) | (-8.14) | (-7.41) | |
State Policy × Republican | | | | -0.060*** |
 | | | | (-6.68) |
Matching | No | No | No | Yes |
Base Demographic Controls | No | Yes | Yes | No |
Health Controls | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes |
Date FE | Yes | Yes | No | Yes |
State × Date FE | No | No | Yes | No |
R² | 0.680 | 0.749 | 0.775 | 0.798 |
Observations | 332,368 | 321,741 | 321,629 | 82,202 |
Note: This table reports regression results from estimating Eq. (3), but using the natural log of the daily median dwell time at home in a county as the dependent variable. The unit of observation is a county-day. Definitions of dependent variables and controls can be found in Table A.1. All continuous variables are z-scored to a mean of zero and standard deviation of one. t-statistics, based on standard errors double-clustered at the county and date level, are reported in parentheses. *** p<0.01, ** p<0.05, * p<0.1.
3.1.3. Mechanisms: political beliefs or risk exposure?
Though we control for population and population density in our main specification (and use both variables as characteristics for our nearest neighbor matching tests), it is still possible that extremely densely populated areas may be confounding our results, as these areas are almost exclusively Democratic. For example, there are no Republican counties with the same population density as Manhattan. Exposure to a virus will be much higher in densely populated areas, which could confound our interpretation that differences in political beliefs are the driving force behind the heterogeneous response to stay-at-home orders. In this section, we examine to what extent risk exposure explains our results.
First, to ensure highly dense areas are not biasing our results, we re-estimate Eq. (3) while excluding the most densely populated areas. We report these tests in Table 4. We exclude the top 10% of counties by population density in Panel A and the top 25% in Panel B. These subsamples have much more comparable population densities across Republican and Democratic counties: when excluding the top 10% (25%), Republican counties have a population density of 20.9 (14.6) people per square mile and Democratic counties have 27 (15.2) people per square mile. In both subsamples we find coefficients on State Policy × Trump Vote that are nearly identical to those in our full sample tests, easing doubts that densely populated areas are driving our baseline results.
Table 4.
Partisan Response to Social Distancing Orders - Density Robustness.
Panel A: Exclude Top 10% of Counties by Population Density
 | (1) | (2) | (3) | (4) |
 | % At Home | % At Home | Ln(Dwell) | Ln(Dwell) |
State Policy | 0.029*** | | 0.025*** | |
 | (7.06) | | (2.73) | |
State Policy × Trump Vote Share | -0.003*** | -0.003*** | -0.030*** | -0.024*** |
 | (-2.87) | (-2.94) | (-8.10) | (-6.63) |
Base Demographic Controls | Yes | Yes | Yes | Yes |
Health Controls | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes |
Date FE | Yes | No | Yes | No |
State × Date FE | No | Yes | No | Yes |
R² | 0.730 | 0.797 | 0.741 | 0.768 |
Observations | 289,623 | 289,511 | 289,623 | 289,511 |
Panel B: Exclude Top 25% of Counties by Population Density
 | (1) | (2) | (3) | (4) |
 | % At Home | % At Home | Ln(Dwell) | Ln(Dwell) |
State Policy | 0.044*** | | 0.009 | |
 | (6.70) | | (0.46) | |
State Policy × Trump Vote Share | -0.004*** | -0.004*** | -0.031*** | -0.026*** |
 | (-3.49) | (-3.39) | (-8.06) | (-6.49) |
Base Demographic Controls | Yes | Yes | Yes | Yes |
Health Controls | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes |
Date FE | Yes | No | Yes | No |
State × Date FE | No | Yes | No | Yes |
R² | 0.705 | 0.776 | 0.731 | 0.758 |
Observations | 241,324 | 241,324 | 241,324 | 241,324 |
Note: This table reports results of our baseline model after excluding the top 10% (Panel A) and the top 25% (Panel B) of counties by population density. t-statistics, based on standard errors double-clustered at the county and date level, are reported in parentheses. *** p<0.01, ** p<0.05, * p<0.1.
We next test whether there are any instances where the risk exposure channel may completely drive out the beliefs channel. To do so, we estimate Eq. (3) on quintile subsamples based on population density. We hypothesize that in the most rural areas the threat of exposure will be low enough that political beliefs will not be a driving factor. The results are shown in Table 5. We find no significant difference in the percent of people who stay at home based on the share of the vote to Trump when examining the two least dense quintiles. The effect is significant in the middle quintile and becomes stronger in more densely populated areas. In Panel B, we repeat our tests using median dwell time as the dependent variable. Here we find a partisan effect regardless of the population density quintile we examine. Altogether, these tests suggest that risk exposure is a factor in compliance with government orders but does not entirely account for the partisan beliefs channel we propose in this paper.
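The quintile subsamples used in these tests can be formed and re-estimated as in the sketch below, which again relies on the hypothetical panel_df columns and estimate_eq3 helper from the earlier regression sketch.

```python
import pandas as pd

# Assign counties to population-density quintiles (Q1 = least dense)
county_density = panel_df.groupby("county_fips")["pop_density"].first()
quintile = (
    pd.qcut(county_density, 5, labels=[1, 2, 3, 4, 5])
    .rename("density_q")
    .reset_index()
)
panel_df = panel_df.merge(quintile, on="county_fips")

# Re-estimate the baseline specification within each quintile
for q in range(1, 6):
    sub = panel_df[panel_df["density_q"] == q]
    res = estimate_eq3(sub)  # helper defined in the earlier sketch
    print(q, round(res.params["policy_x_trump"], 4))
```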
Table 5.
Partisan Response to Social Distancing Orders - Density Quintiles.
Panel A: % Completely at Home
 | (1) | (2) | (3) | (4) | (5) |
 | Q1 (least dense) | Q2 | Q3 | Q4 | Q5 (most dense) |
State Policy | 0.007 | 0.029*** | 0.017*** | 0.008 | 0.006 |
(1.16) | (4.44) | (3.35) | (1.40) | (1.16) | |
State Policy × Trump Vote | 0.001 | 0.001 | -0.004* | -0.007*** | -0.009*** |
 | (0.28) | (0.25) | (-1.73) | (-3.41) | (-3.15) |
Health Controls | Yes | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes | Yes |
Date FE | Yes | Yes | Yes | Yes | Yes |
R² | 0.568 | 0.751 | 0.794 | 0.831 | 0.875 |
Observations | 67,133 | 66,376 | 66,870 | 64,814 | 56,548 |
Panel B: Median Dwell Time
 | (1) | (2) | (3) | (4) | (5) |
 | Q1 (least dense) | Q2 | Q3 | Q4 | Q5 (most dense) |
State Policy | 0.000 | 0.023** | 0.018** | 0.019** | 0.004 |
(0.00) | (2.30) | (2.43) | (2.31) | (0.44) | |
State Policy × Trump Vote | -0.028*** | -0.027*** | -0.026*** | -0.022*** | -0.022*** |
 | (-3.60) | (-4.74) | (-6.63) | (-5.64) | (-4.13) |
Health Controls | Yes | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes | Yes |
Date FE | Yes | Yes | Yes | Yes | Yes |
R² | 0.679 | 0.803 | 0.840 | 0.856 | 0.873 |
Observations | 67,133 | 66,376 | 66,870 | 64,814 | 56,548 |
Note: This table reports results of our baseline model on quintile subsamples based on population density. Panel A reports results using % Completely at Home as the dependent variable. In Panel B the dependent variable is the median dwell time at home. t-statistics, based on standard errors double-clustered at the county and date level, are reported in parentheses. *** p<0.01, ** p<0.05, * p<0.1.
3.2. Changes in remote spending around social distancing orders
We next test whether there is a difference in remote consumption for Republican and Democratic counties around state policies using the following difference-in-differences estimation:
$$\%\,\text{Remote Spending}_{i,t} = \beta_1\,\text{State Policy}_{s,t} + \beta_2\,\text{State Policy}_{s,t} \times \text{Trump Vote}_{i} + \Gamma' X_{i,t} + \alpha_i + \delta_t + \varepsilon_{i,t} \tag{4}$$
As mentioned previously, moving consumption towards remote transactions is one way to help adhere to social distancing orders. Therefore a significant difference in the fraction of remote spending along partisan lines would suggest a difference in effort to comply with social distancing orders.
We first present an event study analysis of the difference in remote spending around state policies in Panel A of Fig. 3 using a model analogous to the one used in Fig. 1. Prior to the enactment of state policies, there is no significant difference in the percent of remote spending for the treated Republican and Democratic counties. After state policies are implemented, there is a steady and significant increase in the percent of remote spending for Democratic counties, persisting through the entire post-period. In contrast, there is no significant change in remote spending for the treated Republican counties.
Fig. 3.
Partisan Differences in Remote Spending. This figure plots coefficients from a regression of the fraction of remote spending on the interaction of the state policy indicator with event-time dummies for shelter-in-place orders, where day 0 is the first date the state order went into effect. This test is run on subsamples of Republican and Democratic counties. Controls are the same as used in Table 2 column (2). County and date fixed effects are included. Standard errors are double-clustered at the county and date level. More detail on controls can be found in Fig. 2 and Table A.1.
We also study changes in remote spending around state-level social distancing orders in Table 6. The models mimic those used previously but use the fraction of remote spending in a county as the dependent variable. We continue to find that more Republican counties are less likely to switch to remote spending after social distancing orders are implemented. Specifically, a one standard deviation increase in Trump vote share is associated with a 0.2pps lower fraction spent on remote transactions after a state policy is implemented, compared to the average county. We also confirm this result holds in the nearest neighbor matching specification in column (4). These results suggest that remote spending is one channel through which Democrats adjust their behavior to comply with social distancing orders relative to Republicans.24
Table 6.
Partisan Response in Remote Shopping Behavior.
 | % Remote Spending | | | |
 | (1) | (2) | (3) | (4) |
State Policy | 0.001 | 0.001 | | 0.005* |
 | (0.51) | (0.61) | | (1.77) |
State Policy × Trump Vote | -0.002*** | -0.002*** | -0.002*** | |
 | (-3.00) | (-3.29) | (-2.84) | |
State Policy × Republican | | | | -0.005*** |
 | | | | (-2.63) |
Matching | No | No | No | Yes |
Base Demographic Controls | No | Yes | Yes | No |
Health Controls | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes |
County FE | Yes | Yes | Yes | Yes |
Date FE | Yes | Yes | No | Yes |
State × Date FE | No | No | Yes | No |
R² | 0.365 | 0.365 | 0.384 | 0.394 |
Observations | 297,561 | 291,400 | 291,294 | 77,042 |
Note: This table reports regression results from estimating Eq. (4). The unit of observation is a county-day. Definitions of dependent variables and controls can be found in Table A.1. All continuous variables are z-scored to a mean of zero and standard deviation of one. t-statistics, based on standard errors double-clustered at the county and date level, are reported in parentheses. *** p<0.01, ** p<0.05, * p<0.1.
3.3. The effects of political misalignment on compliance with social distancing orders
Our results thus far document that Republican counties make less of an effort to follow social distancing orders relative to Democratic counties. This is in line with the intuition that Republicans are more likely to follow direction from President Trump, who was initially dismissive of the dangers of the pandemic, than from state-level officials. To test whether the political party of the official giving a social distancing order affects compliance across the political spectrum, we revisit the geolocation data and create the variable Misalign, which indicates whether the political affiliation of a county is misaligned with the political affiliation of the person who issues a state policy order. For example, Misalign equals one for a Republican county in Colorado, where the Democratic Governor Jared Polis issued a stay-at-home order, and equals zero for a Democratic county in Colorado.25 Our final tests use our baseline specification but instead interact the state policy indicator with the misalignment variable. A significant coefficient on State Policy × Misalign would suggest that the political party of the policy issuer affects compliance with social distancing orders.
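Constructing the misalignment indicator amounts to a simple join of county partisanship with the governor's party; the sketch below assumes hypothetical column names for both inputs.

```python
import pandas as pd

def add_misalign(counties: pd.DataFrame, governors: pd.DataFrame) -> pd.DataFrame:
    """Flag counties whose partisan lean conflicts with their governor's party.
    counties: county_fips, state, trump_vote_share (2016 share, 0-1)
    governors: state, governor_party ('R' or 'D')  -- names are illustrative."""
    df = counties.merge(governors, on="state", how="left")
    df["county_party"] = (df["trump_vote_share"] > 0.5).map({True: "R", False: "D"})
    df["misalign"] = (df["county_party"] != df["governor_party"]).astype(int)
    return df
```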
Table 7 reports the misalignment results. We find that misaligned counties respond less to social distancing orders relative to aligned counties. After a state policy is enacted, the proportion of people who completely stay home is 0.3pps lower in misaligned counties relative to aligned counties. We next estimate the misalignment test on subsamples of Republican and Democratic counties. We find no significant difference in behavior between aligned and misaligned Republican counties (column 2) but a significant difference in behavior for Democratic counties (column 3). Specifically, misaligned Democratic counties respond 0.7pps less than aligned Democratic counties. The insignificant result for Republican counties is consistent with the idea that Republicans rely more on President Trump’s message than on any state-level official. Comparing columns (2) and (3) suggests misaligned Democratic counties are still more responsive to state policies than Republican counties, implying political misalignment is at best a partial explanation for the heterogeneous responses by partisanship.
Table 7.
The Effect of Misaligned Political Beliefs on Adherence to Social Distancing Orders.
 | % Completely at Home | | | | |
 | (1) | (2) | (3) | (4) | (5) |
 | Full Sample | Rep | Dem | Dem | Dem |
State Policy | 0.019*** | 0.019*** | 0.037*** | 0.021** | 0.023** |
(5.00) | (4.62) | (4.32) | (2.24) | (2.37) | |
State Policy × Misalign | -0.003* | -0.002 | -0.007** | -0.009*** | -0.008** |
 | (-1.87) | (-1.07) | (-2.06) | (-2.71) | (-2.50) |
Base Demographic Controls | Yes | Yes | Yes | Yes | Yes |
Robustness Controls | No | No | No | Yes | Yes |
Health Controls | Yes | Yes | Yes | Yes | Yes |
Other Closures | Yes | Yes | Yes | Yes | Yes |
Trump Vote Control | No | No | No | No | Yes |
County FE | Yes | Yes | Yes | Yes | Yes |
Date FE | Yes | Yes | Yes | Yes | Yes |
R² | 0.750 | 0.731 | 0.818 | 0.829 | 0.829 |
Observations | 321,741 | 262,713 | 59,028 | 58,916 | 58,916 |
Note: This table reports the impact of misaligned political beliefs between residents and the policy announcer (the governor) on social distancing behavior. The unit of observation is a county-day. Misalign equals one if the county is Democratic (Republican) and the governor is Republican (Democratic) and equals zero otherwise. Robustness controls include the number of people in a county with a bachelor's degree or higher, the racial makeup of a county (percent White, Black, Asian, and Native American), the proportion of residents who believe climate change is human-induced, the fraction of residents who are male, a county’s social capital, the proportion of residents who can reasonably work from home (measured at the MSA level), and the daily precipitation, maximum temperature, and minimum temperature in a county. Definitions of dependent variables and controls can be found in Table A.1. t-statistics, based on standard errors double-clustered at the county and date level, are reported in parentheses. *** p<0.01, ** p<0.05, * p<0.1.
As mentioned in the introduction, we note that the cross-state comparisons required in these tests do not allow us to use state × date fixed effects, making it difficult to rule out the influence of differences in the state orders themselves on social distancing. Reassuringly, additional tests that include the full set of robustness controls (column 4) suggest the results in the Democratic subsample are not driven by other county characteristics. Further, we find consistent evidence when we include the State Policy × Trump Vote interaction (column 5), suggesting the effect is not driven by differences in how Democratic a county is. Nevertheless, there are still concerns regarding unobservable factors and we therefore view these findings as complementary to our main results, which can rule out these unobservables.
4. Conclusion
Social distancing is one of the most effective ways to mitigate the spread of the novel coronavirus. Yet in the U.S., social distancing orders are predominantly issued by political officials, making partisanship an important determinant of whether social distancing orders are adhered to.
In this paper, we study political limitations to government-mandated orders intended to get people to practice social distancing. Our results suggest that Republicans are less likely to follow social distancing orders relative to Democrats. These results hold up to a battery of robustness tests, including controlling for local health factors, population density, and other factors shown in the literature to influence social distancing. We also provide evidence that Democrats are more likely to switch to remote consumption after social distancing orders are implemented. It appears the political party of the official giving the order could partially explain this difference. Our results show that political beliefs influence behavior even in the midst of a public health crisis. In addition, the results speak to a recurring issue facing public officials regarding how to maximize adherence to government mandates and highlight the importance of bipartisan support in order for policies to be most effective.
Marcus Painter received $2,500 from the Institute for the Study of Free Enterprise at the University of Kentucky in relation to this study. He has no other declarations of interest to disclose. Tian Qiu received $2,500 from the Institute for the Study of Free Enterprise at the University of Kentucky in relation to this study. He has no other declarations of interest to disclose.
Footnotes
We thank SafeGraph Inc. and Facteus for data access. Academics and others working for the public good can access the geolocation and debit card data used in this paper here: https://www.safegraph.com/covid-19-data-consortium. We thank Will Gerken and Austin Wright for helpful comments. We received financial support from the Institute for the Study of Free Enterprise in relation to this study. *Corresponding author. Address: 3674 Lindell Blvd, St. Louis, MO 63108.
See Iyengar et al. (2019) for more on the consequences of affective polarization.
“Without guidance from the top, Americans have been left to figure out their own coronavirus solutions.” Washington Post. March 15, 2020.
“Analyzing the Patterns in Trump's Falsehoods About Coronavirus.” New York Times. March 27, 2020.
“Polling Shows Signs of Public Trust in Institutions amid the Pandemic.” Pew Research Center. April 17, 2020. Note: Poll conducted March 19–24.
Our results are also robust to using a measure of daily at-home dwell time.
Outside of partisanship, Briscese et al. (2020) provide survey evidence that expectations regarding the duration of social distancing measures affect compliance in Italy and Wright et al. (2020) show that income is also associated with compliance to social distancing orders. Regarding consumption data, Baker et al. (2020a) document differences in spending between Republicans and Democrats during the pandemic but do not study shifts in remote spending nor do they focus on social distancing orders.
Alexander and Karger (2020) build upon our finding in subsequent work. They use different datasources to show that both movement and spending at restaurants is reduced in Democratic counties more than in Republican counties after a stay-at-home order.
We use three hours in order to adjust for both part-time and full-time workers. We also confirm our results are robust to the inclusion of workers in the social distancing measure in Figure A.1 in the internet appendix. Full documentation for the SafeGraph social distancing data can be found here: https://docs.safegraph.com/docs/social-distancing-metrics.
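For readers constructing a similar county-day measure, a minimal sketch is shown below. It assumes a SafeGraph-style daily file already keyed by county, with hypothetical column names (device_count, completely_home_device_count, and counts of devices showing part-time and full-time work behavior); the actual field names and the exact worker adjustment should be taken from the SafeGraph documentation linked above.

```python
import pandas as pd

def stay_home_share(df: pd.DataFrame, exclude_workers: bool = True) -> pd.DataFrame:
    """Share of devices staying completely at home for each county-day.

    Assumes hypothetical columns: county_fips, date, device_count,
    completely_home_device_count, part_time_work_devices, full_time_work_devices.
    """
    daily = df.groupby(["county_fips", "date"], as_index=False).agg(
        devices=("device_count", "sum"),
        home=("completely_home_device_count", "sum"),
        part_time=("part_time_work_devices", "sum"),
        full_time=("full_time_work_devices", "sum"),
    )
    if exclude_workers:
        # Drop devices exhibiting work behavior from the denominator, mirroring the
        # adjustment for part-time and full-time workers described in the footnote above.
        daily["denominator"] = daily["devices"] - daily["part_time"] - daily["full_time"]
    else:
        daily["denominator"] = daily["devices"]
    daily["stay_home_pct"] = 100 * daily["home"] / daily["denominator"]
    return daily[["county_fips", "date", "stay_home_pct"]]
```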
Mobile Fact Sheet. Pew Research Center. June, 2019.
We provide a breakdown of representativeness by generation in Table A.2. in the internet appendix.
The complete list of identified remote firms is shown in Table A.2. in the internet appendix.
A notable exception is Walmart, which is separated into brick and mortar spending and Walmart.com spending.
“See Which States and Cities Have Told Residents to Stay at Home.” New York Times. April 20, 2020.
“We're Sharing Coronavirus Case Data for Every U.S. County.” New York Times. March 28, 2020. Note: Updated through April 27, 2020.
We exclude from our analysis days where a state policy went into effect at 12pm or later.
This data is from Raifman et al. (2020). Note: we are using the April 27, 2020 version of this data.
We show that our main results are robust to clustering at the state level as well as double-clustering at the state and date level in Table A.3.
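As a rough illustration of what state-level clustering looks like in this kind of difference-in-differences design, the sketch below uses the linearmodels package with county and date fixed effects and standard errors clustered by state. All variable names (stay_home_pct, post_order, republican, state) are placeholders, not the exact specification estimated in the paper.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

# df: one row per county-day with hypothetical columns
#   county, date (datetime), state, stay_home_pct,
#   post_order (1 after the state's order), republican (1 if the county voted Republican in 2016)
df["post_x_rep"] = df["post_order"] * df["republican"]
panel = df.set_index(["county", "date"])

# County and date fixed effects absorb time-invariant county traits and common daily shocks;
# the Republican main effect is absorbed by the county fixed effects.
mod = PanelOLS.from_formula(
    "stay_home_pct ~ post_order + post_x_rep + EntityEffects + TimeEffects",
    data=panel,
)
# Cluster standard errors by state (state labels converted to numeric codes for clustering).
res = mod.fit(cov_type="clustered", clusters=panel["state"].astype("category").cat.codes)
print(res.summary)
```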
This arguably small change in magnitude is consistent with other research that finds a large portion of social distancing is done voluntarily before government orders are put in place (e.g., Sheridan et al. (2020) and Goolsbee and Syverson (2021)). Generally consistent with our findings, Alexander and Karger (2020) find people travel 9% less after stay-at-home orders are implemented.
Goodman-Bacon (2018) recommends subsample tests as the simplest way to incorporate a third difference when there is variation in treatment timing.
The definitions and sources of these variables are available in Table A.1.
We note the one insignificant specification includes telework as a control variable. This variable is only available at the MSA level and therefore its inclusion reduces our sample size to about one-third of the full sample.
We also present a specification curve using a variant of the social distancing measure in which workers are included in Figure A.1 and find consistent results.
We repeat the specification curve exercise with the natural log of dwell time as the dependent variable and report the results in Figure A.2. We again find the effect is largely robust to alternative specifications, with 49 of the 52 models statistically significant at the 5% level or higher. The three insignificant models do not include either set of fixed effects and have few controls, and are thus more vulnerable to omitted variable bias. The coefficient of interest is also reasonably stable throughout the various specifications.
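A specification curve of this sort can be generated mechanically: estimate the model under every combination of fixed effects and control sets, collect the coefficient of interest and its p-value, and plot the sorted estimates (Simonsohn et al. (2019) describe the full procedure). The sketch below, using statsmodels and hypothetical variable and control names, illustrates the bookkeeping only; it is not the set of specifications used in the paper.

```python
import itertools
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical control sets and fixed-effect choices; df is assumed to have no missing values.
CONTROL_SETS = {
    "none": [],
    "health": ["cases_per_capita", "hospital_beds"],
    "demographics": ["median_income", "pct_over_65", "pop_density"],
}
FIXED_EFFECTS = {"none": "", "county": " + C(county)", "county_date": " + C(county) + C(date)"}

results = []
for (c_name, controls), (fe_name, fe_terms) in itertools.product(
    CONTROL_SETS.items(), FIXED_EFFECTS.items()
):
    rhs = " + ".join(["post_order", "post_order:republican"] + controls) + fe_terms
    fit = smf.ols(f"stay_home_pct ~ {rhs}", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["state"]}
    )
    results.append(
        {
            "controls": c_name,
            "fixed_effects": fe_name,
            "coef": fit.params["post_order:republican"],
            "pvalue": fit.pvalues["post_order:republican"],
        }
    )

spec_curve = pd.DataFrame(results).sort_values("coef")
print((spec_curve["pvalue"] < 0.05).sum(), "of", len(spec_curve), "specifications significant")
```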
We present a specification curve using our remote spending measure as the dependent variable in Figure A.3. We find the coefficient of interest is significant in the majority of specifications and is reasonably stable. We also present evidence from estimating Eq. (4) using aggregate spending as the dependent variable in Table A.4. The results suggest that Republicans are more likely to keep spending levels up relative to Democrats after lockdown orders. However, the results are not robust to the inclusion of fixed effects and thus could be due to unobserved variables.
This measure is analogous to the ideological mismatch measure used in Kempf and Tsoutsoura (2018).
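As a concrete illustration, a political alignment indicator of this kind can be built by flagging a county as aligned when its 2016 majority party matches the party of the governor issuing the order. The short sketch below uses hypothetical table and column names and is only one way such a measure could be coded.

```python
import pandas as pd

# county_votes: county_fips, state, rep_vote_share (2016 presidential election)
# governors:    state, governor_party ("R" or "D") at the time of the order
counties = county_votes.merge(governors, on="state", how="left")
counties["county_party"] = counties["rep_vote_share"].gt(0.5).map({True: "R", False: "D"})
# Aligned = county majority party matches the party of the official issuing the order.
counties["aligned"] = (counties["county_party"] == counties["governor_party"]).astype(int)
```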
Supplementary material associated with this article can be found, in the online version, at doi:10.1016/j.jebo.2021.03.019.
Appendix A. Supplementary materials
Supplementary Raw Research Data. This is open data under the CC BY license http://creativecommons.org/licenses/by/4.0/
References
- Alexander, D., Karger, E., 2020. Do stay-at-home orders cause people to stay at home? effects of stay-at-home orders on consumer behavior.
- Allcott, H., Boxell, L., Conway, J., Gentzkow, M., Thaler, M., Yang, D. Y., 2020. Polarization and public health: partisan differences in social distancing during the coronavirus pandemic. NBER Working Paper (w26946).
- Andersen, M., 2020. Early evidence on social distancing in response to covid-19 in the united states. Available at SSRN 3569368.
- Baker S.R., Farrokhnia R.A., Meyer S., Pagel M., Yannelis C. Technical Report. National Bureau of Economic Research; 2020. How does Household Spending Respond to an Epidemic? Consumption During the 2020 Covid-19 Pandemic.
- Baker S.R., Farrokhnia R.A., Meyer S., Pagel M., Yannelis C. Technical Report. National Bureau of Economic Research; 2020. Income, Liquidity, and the Consumption Response to the 2020 Economic Stimulus Payments.
- Barrios J.M., Hochberg Y. Technical Report. National Bureau of Economic Research; 2020. Risk Perception through the Lens of Politics in the Time of the COVID-19 Pandemic.
- Boxell L., Gentzkow M., Shapiro J.M. Technical Report. National Bureau of Economic Research; 2020. Cross-Country Trends in Affective Polarization.
- Briscese G., Lacetera N., Macis M., Tonin M. Technical Report. National Bureau of Economic Research; 2020. Compliance with COVID-19 Social-Distancing Measures in Italy: The Role of Expectations and Duration.
- Brzezinski A., Kecht V., Van Dijcke D., Wright A.L. 2020. Belief in Science Influences Physical Distancing in Response to Covid-19 Lockdown Policies.
- Chen, H., Qian, W., Wen, Q., 2020. The impact of the covid-19 pandemic on consumption: learning from high frequency transaction data. Available at SSRN 3568574.
- Chen M.K., Haggag K., Pope D.G., Rohla R. Technical Report. National Bureau of Economic Research; 2019. Racial Disparities in Voting Wait Times: Evidence from Smartphone Data.
- Chen M.K., Rohla R. The effect of partisanship and political advertising on close family ties. Science. 2018;360(6392):1020–1024. doi: 10.1126/science.aaq1433.
- Chetty R., Friedman J.N., Hendren N., Stepner M., et al. Technical Report. National Bureau of Economic Research; 2020. How did covid-19 and stabilization policies affect spending and employment? A new real-time economic tracker based on private sector data.
- Chudik, A., Pesaran, M. H., Rebucci, A., 2020. Voluntary and mandatory social distancing: evidence on covid-19 exposure rates from chinese provinces and selected countries. Available at SSRN 3576703.
- Correia, S., Luck, S., Verner, E., 2020. Pandemics depress the economy, public health interventions do not: evidence from the 1918 flu. Working Paper.
- Ding, W., Levine, R., Lin, C., Xie, W., 2020. Social distancing and social capital: why us counties respond differently to covid-19. Available at SSRN 3624495.
- Dingel J.I., Neiman B. Technical Report. National Bureau of Economic Research; 2020. How Many Jobs can be Done at Home?
- Duell D., Valasek J. Political polarization and selection in representative democracies. J. Econ. Behav. Org. 2019;168:132–165.
- Eliaz K., Spiegler R. A model of competing narratives. Am. Econ. Rev. 2020;110(12):3786–3816.
- Engle, S., Stromme, J., Zhou, A., 2020. Staying at home: mobility effects of covid-19. Working Paper.
- Gerber A.S., Huber G.A. Partisanship, political control, and economic assessments. Am. J. Pol. Sci. 2010;54(1):153–173.
- Gift K., Gift T. Does politics influence hiring? Evidence from a randomized experiment. Polit. Behav. 2015;37(3):653–675.
- Gitmez A., Sonin K., Wright A.L. Centre for Economic Policy Research; 2020. Political Economy of Crisis Response.
- Goodman-Bacon A. Technical Report. National Bureau of Economic Research; 2018. Difference-in-Differences with Variation in Treatment Timing.
- Goolsbee A., Syverson C. Fear, lockdown, and diversion: comparing drivers of pandemic economic decline 2020. J. Public Econ. 2021;193:104311. doi: 10.1016/j.jpubeco.2020.104311.
- Iyengar S., Lelkes Y., Levendusky M., Malhotra N., Westwood S.J. The origins and consequences of affective polarization in the United States. Annu. Rev. Polit. Sci. 2019;22:129–146.
- Iyengar S., Sood G., Lelkes Y. Affect, not ideology: a social identity perspective on polarization. Public Opin. Q. 2012;76(3):405–431.
- Kapoor, R., Rho, H., Sangha, K., Sharma, B., Shenoy, A., Xu, G., 2020. God is in the rain: the impact of rainfall-induced early social distancing on covid-19 outbreaks. Available at SSRN 3605549.
- Kempf E., Tsoutsoura M. Technical Report. National Bureau of Economic Research; 2018. Partisan professionals: Evidence from credit rating analysts.
- Kraemer M.U.G., Yang C.-H., Gutierrez B., Wu C.-H., Klein B., Pigott D.M., Open COVID-19 Data Working Group, du Plessis L., Faria N.R., Li R., Hanage W.P., Brownstein J.S., Layan M., Vespignani A., Tian H., Dye C., Pybus O.G., Scarpino S.V. The effect of human mobility and control measures on the COVID-19 epidemic in China. Science. 2020. doi: 10.1126/science.abb4218.
- Long E.F., Chen M.K., Rohla R. Political storms: emergent partisan skepticism of hurricane risks. 2019. doi: 10.1126/sciadv.abb7906.
- McConnell C., Margalit Y., Malhotra N., Levendusky M. The economic consequences of partisanship in a polarized era. Am. J. Pol. Sci. 2018;62(1):5–18.
- MIT, 2018. County Presidential Election Returns 2000–2016. 10.7910/DVN/VOQCHQ.
- Painter, M., 2020. Consumer response to corporate political statements: evidence from geolocation data. Working Paper.
- Raifman J., Nocka K., Jones D., Bor J., Lipson S., Jay J., Chan P., Galea S., et al. 2020. COVID-19 US state policy database.
- Sheridan A., Andersen A.L., Hansen E.T., Johannesen N. Social distancing laws cause only small losses of economic activity during the covid-19 pandemic in Scandinavia. Proc. Natl. Acad. Sci. 2020;117(34):20468–20473. doi: 10.1073/pnas.2010068117.
- Simonsohn, U., Simmons, J. P., Nelson, L. D., 2019. Specification curve: descriptive and inferential statistics on all reasonable specifications. Available at SSRN 2694998.
- Wright A.L., Sonin K., Driscoll J., Wilson J. 2020. Poverty and Economic Dislocation Reduce Compliance with Covid-19 Shelter-in-Place Protocols.