J Public Econ. 2020 Nov 11;194:104322. doi: 10.1016/j.jpubeco.2020.104322

Economic preferences and compliance in the social stress test of the COVID-19 crisis

Stephan Müller 1, Holger A. Rau 1
PMCID: PMC9186354  PMID: 35702336

Abstract

In a survey study, we analyze whether economic preferences and pre-crisis social responsibility predict social compliance with the policy regulations implemented during the COVID-19 crisis. Results show that economic preferences are closely related to compliance with policies fighting the crisis. Risk tolerance negatively affects citizens’ avoidance of crowds, whereas patience helps them to do so and to stay home. Present-biased subjects engage in panic buying. Risk tolerance is negatively related to the fear of COVID-19, and trust resonates positively with a positive perception of the media coverage. Pre-crisis socially responsible behavior, measured by fare evasion, voter turnout, and support of vaccination, is also positively related to social compliance. Our findings offer insights which may help policy-makers and organizations to identify risk groups and regions for the allocation of scarce medical or surveillance resources, such as vaccines, masks, and law enforcement.

Keywords: Compliance, COVID-19, Experiment, Preferences, Social responsibility

1. Introduction

Many heads of state consider the COVID-19 pandemic the greatest challenge since World War II. Worldwide, policy measures have been implemented that mainly target breaking the chain of transmission. Over time, measures were tightened from initially soft instruments, such as disinfection guidelines and behavioral recommendations, to closed borders and curfews. No matter how strongly public administrations encourage the right behavior, or how severe the potential punishment in case of civil disobedience, the success of these measures ultimately depends on the ongoing social compliance of the people.

Social compliance constitutes individual adherence to the regulations and recommendations regarding contributions to public health. It is related to the literature on individual contributions to the provision of public goods 2 and to the research on individual compliance, for example, with respect to tax obligations (e.g., Allingham and Sandmo, 1972, Alm et al., 1992, Kirchler, 2007), process standards in firms (e.g., Pierce et al., 2015, Staats et al., 2017, Sheedy et al., 2019), or vaccination (e.g., Bronchetti et al., 2015, Hansen and Schmidtblaicher, 2019). However, only a small share of this literature focuses on individual drivers of compliance in a public-good context. Furthermore, very little is known about the role of individual preferences for compliance in a crisis situation unlike anything most citizens have ever experienced.

The importance of individual preferences is demonstrated by politicians’ requests for social obedience in the crisis. For instance, politicians appeal to citizens to refrain from panic buying to secure the provision of daily goods. Moreover, politicians impose regulations on public gatherings and curfews to increase social distancing, the key instrument to break the chain of transmission. Sticking to the prescribed and recommended behavior reduces the risk of getting infected, but also requires patience and self-discipline. Thus, social compliance constitutes a behavior which is likely to resonate with individual time and risk preferences. This idea is supported by evidence from health-related contexts, highlighting that patience positively affects adherence to physical-activity advice (Van Der Pol et al., 2017), whereas more risk-tolerant subjects are less likely to adhere to medications (Simon-Tuval et al., 2018). Compliance also requires citizens to trust the appropriateness of the measures and the reliability of the provided information (Antinyan et al., 2020). Since compliance generates positive externalities for other members of society, it might furthermore depend on citizens’ willingness to take on social responsibility.

The first question of this paper is to what extent key economic preferences (risk, time, trust, and honesty) predict citizens’ social compliance with policy measures in the COVID-19 crisis. To answer it, we present results of a survey study applying preference measures that have proven to correlate strongly with incentivized measures and have demonstrated external validity (see Falk et al., 2016, Falk et al., 2018). Second, we relate social compliance to the social responsibility participants took on before the crisis (e.g., fare evasion, voter turnout). We focus on a subject pool of students who represent a particularly relevant group, as they account for about 25% of all German citizens between the age of 20 and 29, an age cohort that has repeatedly been reported to be least compliant with social-distancing regulations (Brouard et al., 2020, Daoust, 2020, Moore et al., 2020). Thus, even if students are not fully representative of the population, we look at a very important group when it comes to ending the pandemic. 3 Moreover, with a student subject pool we can exploit data on preferences and attendance behavior from before the crisis. For example, we exploit data on attendance rates for previous experiments that participants had registered for and find a link between their attendance rate and their answers on pre-crisis social responsibility. Moreover, student samples have been shown to exhibit less measurement error and to correlate very well with representative samples of the whole population (Snowberg and Yariv, forthcoming).

In the first block of our survey, without any reference to the COVID-19 crisis, we elicited preferences with respect to risk, time, trust, trustworthiness, and honesty. To measure social responsibility, we conduct a principal component analysis on three items related to subjects’ willingness to go along with society’s rules and to their contributions to the common good. The items are: fare evasion in public transportation, individual turnout, and subjects’ agreement to a law on compulsory measles vaccination. The second block is contextual to COVID-19. We captured behavioral compliance with COVID-19-specific political measures with the help of four items. We asked about behavior related to staying at home, avoiding crowds, the willingness to get tested for the COVID-19 virus, and panic buying. This block also includes items on the perception of the crisis, the media coverage, and the appropriateness of political measures.

First, our results highlight that key economic preferences are correlated with individual compliance with the regulations and politicians’ public appeals. We thereby extend the growing literature applying established laboratory measures of individual preferences to explain behavior in the field (Falk et al., 2013, Cappelen et al., 2015, Snowberg and Yariv, 2018) to the context of a social crisis. We also add to the literature on individual drivers of compliant behavior. Our data show that more pronounced risk tolerance correlates with a lower propensity to avoid crowds. The results also demonstrate a relation between patience and compliance with the behavior prescribed by public authorities. Moreover, we find that present-biased participants engage in panic buying. Second, we find a significantly positive relation between social responsibility and social compliance. That is, people with higher social responsibility are more likely to behave in accordance with the policy regulations fighting the crisis. This provides evidence for the significant role of social capital (Putnam, 1995, Knack and Keefer, 1997, Bjørnskov, 2006) in overcoming an exceptional threat to society. Regarding subjects’ perception of COVID-19, we find that risk tolerance is negatively related to the fear posed by the COVID-19 virus, whereas citizens’ perceived appropriateness of the media coverage resonates with a measure of generalized trust.

The correlations could be used now and in future crises for the prediction problem faced by a policy-maker who wishes to allocate surveillance or scarce medical resources most efficiently among citizens. Suppose, for example, the policy-maker wants to identify target regions for medical resources and knows about a positive correlation between individual turnout and compliance with the COVID-19 regulations. The policy-maker can use this insight together with available regional data on voter turnout to predict which regions can be expected to show lower compliance; these should be regions that showed a low turnout. Since the problem is one of predicting the right target for policy interventions, the policy-maker does not need to know what the cause-and-effect relation between compliance and individual turnout is (see Kleinberg et al., 2015 for the formal argument). Thus, our results on the relation between economic preferences and pre-crisis social responsibility, on the one hand, and social compliance, on the other hand, can be used to identify target groups for the various policy measures. Regarding unobservable economic preferences, the identification of target groups can build on research which reveals information on the distribution of preferences across occupations, space, or socioeconomic classes (e.g., Bonin et al., 2007, Masclet et al., 2009, Fouarge et al., 2014). For example, our finding that risk-tolerant participants are less likely to avoid crowds identifies workers who predominantly encounter financial and social risks and perform professional, managerial, or administrative work (Hill et al., 2019). This suggests that fines should vary with income, or that informational campaigns targeting this group should highlight the individual and social risks to increase social compliance.

More generally, the policy-maker can use our and similar correlations between social compliance and observable variables (e.g., turnout, fare evasion, and socioeconomics), or variables (e.g., risk attitudes, time preferences) that are known to correlate with observables such as socioeconomics, occupational choices, and speeding, to predict areas or socioeconomic groups at high risk. For this prediction, a policy-maker could assign an aggregate score to a region or a group based on different observables, as sketched below. This risk assessment can be used to design policy measures tailored to the identified region or group of citizens. We elaborate more on this in the conclusion.
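To illustrate the kind of aggregation we have in mind, the following minimal sketch scores hypothetical regions on two observable proxies of social responsibility (turnout and fare evasion). All values and variable names are our own illustrative assumptions, not data from this study.

```python
import pandas as pd

# Hypothetical regional observables (illustrative values, not from this study).
regions = pd.DataFrame({
    "region": ["A", "B", "C"],
    "turnout": [0.76, 0.58, 0.65],       # share of eligible voters who voted
    "fare_evasion": [0.03, 0.09, 0.05],  # estimated share of rides without a valid ticket
})

# Standardize the observables, flip the sign of fare evasion so that higher values
# indicate higher predicted compliance, and average them into one aggregate score.
z = (regions[["turnout", "fare_evasion"]] - regions[["turnout", "fare_evasion"]].mean()) / \
    regions[["turnout", "fare_evasion"]].std()
regions["predicted_compliance"] = (z["turnout"] - z["fare_evasion"]) / 2

# Regions with the lowest score are candidate targets for surveillance or campaigns.
print(regions.sort_values("predicted_compliance"))
```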

2. Data and study design

The data for this study were collected in an online survey on March 16 and 17, 2020. At this time, politicians strongly recommended social-distancing measures, such as staying at home and avoiding crowds. Moreover, policy measures to fight COVID-19 were in place. The measures concentrated on the closing of the German borders, schools, day-care centers, bars, restaurants, discotheques, gyms, and public institutions. The German government restricted visiting times in hospitals and rest homes. Our study analyzes citizens’ direct responses to these drastic measures.

The study focuses on subjects who had signed up to the database for experiments at the University of Göttingen. In our sample, 95% of the participants are students under the age of 30. We sent subjects an invitation e-mail to participate in an online study, which was administered with the “Google Forms” tool. They were told that the study would last 10–15 min and that they would receive a €5 Amazon voucher if they completed it. Importantly, we did not mention the study topic; participants did not receive any indication that it was about COVID-19. Our study is divided into two blocks: we first elicit participants’ preferences, followed by contextual questions focusing on the COVID-19 crisis (see the appendix for the questions). To address measurement error, we applied different packages of questions in each block. We discuss the validity of the measures in detail in Section 3.3.

In the first block, we applied a package on general preferences, asking verbal questions on risk tolerance, time preferences, generalized trust, trustworthiness, and honesty. For risk, trust, trustworthiness, and honesty, participants answered on Likert scales (0 = lowest degree; 10 = highest degree). To measure time preferences, participants were asked about the level of immediate compensation in Euros required to forego a payment of €1000 in six months. Afterward, they were asked about the level of compensation required in six months to forego a payment of €1000 in twelve months. We calculate patience in the form of discount factors by dividing the answers by 1000. We use the mean of the two discount factors as a measure for patience (Meier and Sprenger, 2010), i.e., more (less) patient subjects have a higher (lower) discount factor. Eliciting a discount factor for the near and for the far future allows us to check whether subjects are present-biased. To study preferences on social responsibility, we designed a second package asking about participants’ behavior in social life and their attitudes towards social duties. The questions concentrate on three scenarios about contributions to the common good before the crisis. First, we focus on free-riding behavior in the domain of public services, such as public transportation. We ask how often participants committed fare evasion in public transportation (0 = never before; 5 = always). To measure participation in politics, we ask for individual turnout. Recently, in Germany, there was a vivid debate on compulsory vaccination, as an increasing number of citizens show vaccination hesitancy. To address this, we ask for people’s willingness to take precautions to protect their own health and the health of fellow citizens. We focused on their agreement to a law on compulsory measles vaccination before children go to kindergarten or school (0 = lowest degree; 10 = highest degree). Based on the three answers, we conduct a principal component analysis to construct a social-responsibility factor.
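As a minimal sketch of this construction (our own illustration; the paper does not publish code), the two survey answers map into discount factors, a patience measure, and a present-bias flag as follows:

```python
def patience_and_present_bias(x_now, x_6m):
    """Patience measure and present-bias flag from the two time-preference answers.

    x_now : amount (in EUR) demanded today to forgo EUR 1000 in six months
    x_6m  : amount (in EUR) demanded in six months to forgo EUR 1000 in twelve months
    """
    d_near = x_now / 1000.0   # discount factor for the near trade-off
    d_far = x_6m / 1000.0     # discount factor for the far trade-off
    patience = (d_near + d_far) / 2.0  # mean discount factor (Meier and Sprenger, 2010)
    present_biased = d_far > d_near    # heavier discounting of the near future
    return patience, present_biased

# Example: a respondent demanding EUR 850 today but EUR 950 in the far trade-off.
print(patience_and_present_bias(850, 950))  # (0.9, True)
```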

The second block consisted of contextual questions on compliance during the COVID-19 crisis and on subjects’ perception of the crisis. To reduce measurement error and to increase the validity of our measures, we included multiple items. We focused on three domains of compliance during the crisis and compute an index of social compliance during COVID-19. Regarding social-distancing behavior, we asked participants whether they had increased the time spent at home since the COVID-19 crisis started (0 = lowest degree; 3 = highest degree). We addressed social distancing in a further question, asking participants whether they had started to avoid crowds since the crisis began (0 = lowest degree; 10 = highest degree). We also asked for the likelihood that subjects would take a COVID-19 test when having symptoms (0 = lowest degree; 10 = highest degree). We analyze subjects’ behavior regarding the recommendations during the crisis with a question focusing on panic buying: we asked whether participants had increased their purchases of durable food during the crisis (0 = lowest degree; 4 = highest degree). The contextual block also contained a package on participants’ perception of the crisis. Analyzing the perception of the crisis is important, as empirical evidence shows that citizens who misperceive the spreading speed of the virus behave less compliantly (Banerjee et al., 2020). In our analysis, we focused on three dimensions: (i) participants’ fear of COVID-19; (ii) perception of media reporting; (iii) acceptance of policy measures. The detailed questions and results concerning the COVID-19-perception analyses are presented in the working-paper version of this article (Müller and Rau, 2020). Finally, we elicited demographics (age, gender, nationality, field of study/profession, disposable monthly income).

2.1. Data analysis and construct validity

For our data analysis, we standardize all variables except the dummy variables. Furthermore, we use a compliance index as the outcome variable in our main regressions. Our approach is similar to Stango et al. (2017), i.e., we take the arithmetic mean of different compliance variables that we think are theoretically connected. To do so, we elicit four dimensions of compliance to compute an index. The items are: staying at home, avoidance of crowds, COVID-19 testing, and panic buying. We believe that subjects’ propensity to stay home and to avoid crowds is theoretically related to compliance, as the social-distancing policy measures in Germany strongly recommended that subjects stay home and avoid crowds. Moreover, staying home is clearly connected to the avoidance of crowds, as people who do not leave their houses cannot attend big events. We believe that not engaging in panic buying is also related to compliance in the COVID-19 crisis, as politicians repeatedly gave this behavioral recommendation. The same holds for taking a COVID-19 test when observing the corresponding symptoms. As all these dimensions relate to situations where subjects may show compliance with policy measures or policy recommendations during the crisis, we expect them to be theoretically connected. To analyze the reliability of the compliance index, we applied Cronbach’s alpha. The internal consistency was 0.544 for the four items. It turned out that Cronbach’s alpha increased to 0.624 when removing panic buying. Therefore, we removed panic buying from the index and present the results on panic buying separately in our subsequent analysis.
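The reliability check can be reproduced with the standard Cronbach's alpha formula. The sketch below uses randomly generated placeholder data, so it will not reproduce the reported values of 0.544 and 0.624; only the computation itself is shown.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Placeholder item matrix; columns: staying home, avoiding crowds,
# COVID-19 testing, panic buying (random values for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(185, 4))

alpha_all = cronbach_alpha(X)              # alpha for all four items
alpha_wo_panic = cronbach_alpha(X[:, :3])  # alpha after dropping panic buying
print(alpha_all, alpha_wo_panic)
```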

For our main analysis, we use regression models in which we apply a set of preference items and a set of items representing behaviors concerning subjects’ social responsibility. For the set of preference measures, we conduct a principal component analysis (PCA) to identify specific types of relevant preference combinations.

The items intended to capture participants’ pre-crisis social responsibility focus on free-riding behavior in the public domain. In other words, the questions relate to behavior which has proven to resonate with social preferences. We believe that this type of social preferences might also drive social-compliance behavior. First, Ayal et al. (2019) find in a field experiment that a descriptive social norm reduces fare evasion. Second, Dawes et al. (2011) highlight the role of social preferences for political participation. They find, among others, that subjects who were most interested in increasing total welfare in the dictator game were more likely to participate in politics than subjects with selfish preferences. Third, we opted for the item on compulsory vaccination because of the ongoing heated debate on this topic (the compulsory law on measles vaccination of school kids). For example, Cappelen et al. (2010) highlight the role of social preferences for people’s decisions about childhood vaccination. Therefore, we think that this question highly resonates with citizens’ attitude toward contributions to public health, which is a particularly relevant context for our topic. Moreover, applying multiple measures reduces the effects of measurement error. However, a large set of controls is needed to entirely eliminate measurement error (Gillen et al., 2019). Therefore, we follow Gillen et al. (2019) and conduct a further principal component analysis to test the construct validity of our social-responsibility measure.

Factors were extracted on the basis of eigenvalues above one. A loading of 0.30 or greater was used to identify items. For the set of preferences, the PCA identified two components with eigenvalues exceeding one. We applied a varimax rotation. As a result, two items load positively and very strongly on component one, i.e., trustworthiness (0.65) and honesty (0.64), whereas trust loads with 0.39. The component can be interpreted as characteristics of a trustworthy and honest person. Therefore, we call the first component “PC1: trustworthiness & honesty” in our further analyses. Second, it turns out that patience loads positively and very strongly (0.73) on component two, whereas risk tolerance loads negatively (−0.66). Thus, we can interpret this component as characteristics of a person who is patient and less risk tolerant. We call this component “PC2: patience & risk tolerance” in our further analyses. We conducted another PCA for our measures of social responsibility (fare evasion, turnout, agreement to vaccination). Here, one component has an eigenvalue above one. It turns out that “agreement to vaccination” (0.69) and “turnout” (0.37) load positively, whereas “fare evasion” (−0.62) enters negatively. We interpret persons who score high on this component as socially responsible. We call this component “PC3: social responsibility” in our further analyses.
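The extraction and rotation steps can be sketched as follows. This is our own minimal implementation of a PCA on the correlation matrix with the Kaiser criterion and a varimax rotation, run on placeholder data; it illustrates the procedure, not the paper's exact software or output.

```python
import numpy as np

def varimax(loadings, gamma=1.0, n_iter=100, tol=1e-6):
    """Varimax rotation of an (items x components) loadings matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(n_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
        )
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return loadings @ R

# Placeholder standardized answers; columns: risk tolerance, patience, trust,
# trustworthiness, honesty (random values for illustration only).
rng = np.random.default_rng(1)
X = rng.normal(size=(185, 5))

corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals > 1                                   # Kaiser criterion
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep]) # raw component loadings
rotated = varimax(loadings)
print(np.round(rotated, 2))  # loadings with |value| >= 0.30 identify an item
```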

3. Results

In this section, we first present our sample and compare participants’ preferences to the preferences measured in earlier experiments. Next, we demonstrate our findings on the predictive power of economic preferences and social responsibility for social compliance in the COVID-19 stress test.

3.1. Sample and comparison with old data

In total, 197 subjects participated in the survey. We drop seven subjects whose answer to the time-preference question exceeded €1000. There are only five non-German subjects in the database. This may bias the data and the social-responsibility factor, e.g., if those subjects were not entitled to vote in any election. We indeed find that four of these subjects were not allowed to vote. It is also possible that non-German subjects had different experiences with the pandemic in their home countries (or followed other media than Germans), which may further affect the data. To control for this, we would have to include interactions in our regressions, which is problematic as the number of non-German subjects is very low (n = 5). Thus, we drop these subjects. This yields a sample of 185 subjects (52% female) with a mean age of 22.86. The fields of study were balanced, with only 19 percent economics students.

We find that the mean risk tolerance of our participants is not significantly different from the data of participants in a laboratory experiment (Grosch et al., 2020), which was run before the COVID-19 crisis at the University of Göttingen (two-sided t-test, p=0.784). For time preferences, we also find that participants’ mean patience does not differ from the data of an experiment (Rau, 2020), which was conducted at the University of Göttingen in December 2019 (two-sided t-test, p=0.646). Participants’ mean trust level is similar to that in a sample of Dutch citizens (two-sided t-test, p=0.702), which was collected in November 2018 (Riedl et al., 2019). The similarity of risk, trust, and time preferences before and during the COVID-19 crisis excludes COVID-19-specific sample-selection effects at the level of our subject pool. We discuss this in more detail in Section 4. We summarize the means of our preference elicitations and control variables in Table 2 in the appendix. Moreover, in the appendix we present pairwise correlations of the variables (Table 4) and an overview (Fig. 2 and Fig. 3) of the distributions.
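Comparisons of this kind are standard two-sample t-tests. A minimal sketch (with placeholder arrays standing in for the survey answers and the pre-crisis laboratory data, which we cannot reproduce here) would look as follows:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
# Placeholder arrays in place of the survey answers and the pre-crisis lab data.
risk_survey = rng.integers(1, 11, size=185)
risk_lab = rng.integers(1, 11, size=150)

# Two-sided two-sample t-test on the difference in means.
t_stat, p_value = ttest_ind(risk_survey, risk_lab)
print(t_stat, p_value)
```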

Fig. 2. Summary statistics of preference questions in the survey.

Fig. 3. Summary statistics of contextual COVID-19 questions in the survey.

3.2. Main results: social compliance under COVID-19

We start with our main results regarding the impact of economic preferences on social compliance during the COVID-19 crisis. Fig. 1 displays participants’ answers in the four dimensions of the social-compliance index. The diagram presents the answers that fall into the above-median categories of the corresponding dimension. It reports the share of above-median answers given by subjects who revealed a low- or high-type preference. This classification is based on a median split. 4
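One plausible way to reproduce shares of this kind is sketched below on placeholder data; the exact plotting choices behind Fig. 1 are not documented here, so this is an illustration of the median-split logic only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
# Placeholder data: one preference item and one compliance item (0-10 scales).
df = pd.DataFrame({
    "risk_tolerance": rng.integers(0, 11, size=185),
    "avoid_crowds": rng.integers(0, 11, size=185),
})

# Median split (footnote 4): below/equal median -> low type, above median -> high type.
df["risk_type"] = np.where(
    df["risk_tolerance"] > df["risk_tolerance"].median(), "high", "low"
)

# Share of above-median compliance answers within each preference type.
above_median = df["avoid_crowds"] > df["avoid_crowds"].median()
print(df.assign(above_median=above_median).groupby("risk_type")["above_median"].mean())
```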

Fig. 1. Effects of economic preferences on social compliance (n = 185).

The diagram gives a first indication of the possible relations between economic preferences and compliance. It suggests that participants with an above-median risk tolerance are less likely to increase staying home and less often avoid crowds. They are also less likely to engage in panic buying. In the same vein, trusting people are apparently not very susceptible to panic buying during the crisis. The diagram reveals a further pattern for time preferences: more patient subjects are apparently more likely to stay home and to avoid crowds. A similar effect can be found for social responsibility. Socially responsible participants tend to behave more compliantly with respect to staying at home and avoiding crowds during the crisis. Finally, we observe that social responsibility may also have a positive effect on the probability that subjects take a COVID-19 test. To get deeper insights and to test for statistical significance, we turn to regressions (see Table 1).

Table 1.

OLS regressions on social compliance, on its three dimensions, and on panic buying.

                                             Social Compliance
                                  Compliance index      Disaggregated index
                                                        Staying    Avoid.     COVID-19   Panic
                                                        home       crowds     testing    buying
                                  (1)        (2)        (3)        (4)        (5)        (6)
PC1: trustworthiness & honesty    0.056      0.061
                                  (0.071)    (0.074)
PC2: patience & risk tolerance    0.177∗∗    0.195∗∗
                                  (0.073)    (0.077)
PC3: social responsibility        0.196∗∗∗   0.201∗∗∗   0.186∗∗    0.170∗∗    0.102      0.051
                                  (0.073)    (0.075)    (0.075)    (0.073)    (0.080)    (0.077)
Trustworthiness                                         −0.017     0.065      0.031      −0.097
                                                        (0.088)    (0.085)    (0.093)    (0.090)
Honesty                                                 0.039      0.048      0.050      0.019
                                                        (0.087)    (0.084)    (0.092)    (0.089)
Trust                                                   0.005      −0.042     0.005      −0.042
                                                        (0.076)    (0.074)    (0.081)    (0.078)
Patience                                                0.164∗∗    0.189∗∗    0.043      0.090
                                                        (0.077)    (0.074)    (0.081)    (0.079)
Present bias                      0.120      0.141      0.139      0.073      0.190      0.455
                                  (0.217)    (0.222)    (0.226)    (0.218)    (0.238)    (0.232)
Risk tolerance                                          0.024      −0.181∗∗   −0.034     −0.185∗∗
                                                        (0.080)    (0.078)    (0.085)    (0.083)
Day two                           0.498∗∗∗   0.528∗∗∗   0.586∗∗∗   0.579∗∗∗   0.034      0.071
                                  (0.150)    (0.156)    (0.156)    (0.151)    (0.165)    (0.160)
Constant                          −0.330∗∗∗  −0.126     −0.145     −0.185     0.019      0.155
                                  (0.123)    (0.215)    (0.217)    (0.210)    (0.230)    (0.223)
Controls                          No         Yes        Yes        Yes        Yes        Yes
obs.                              185        183        183        183        183        183
R2                                0.110      0.128      0.148      0.188      0.035      0.088

Standard errors in parentheses
∗∗∗p < 0.01, ∗∗p < 0.05, ∗p < 0.1

Controls: Gender, age, disposable income, dummies which control whether their main information source is social media, whether they vote for left-wing parties, whether they stated no voter preferences, whether they are econ students.

Models (1)–(2) incorporate the preference and social-responsibility components obtained from the principal component analysis. PC1: trustworthiness & honesty is the first component, with high positive loadings of subjects’ trustworthiness and honesty. PC2: patience & risk tolerance is the second component, which includes a positive loading of patience and a negative loading of risk tolerance. The third component our models apply is PC3: social responsibility, which includes positive loadings of “agreement to vaccination” and “turnout,” whereas “fare evasion” enters negatively. The models also include present bias, an indicator dummy variable which equals one for persons who behave time-inconsistently, i.e., who indicated a higher discount factor for the far future than for the near future.

Table 1 presents OLS regressions which analyze the effects of economic preferences and social responsibility. The models focus on the compliance index (models (1)–(2)) and on disaggregated analyses of the three dimensions of compliance (models (3)–(5)). Finally, model (6) focuses on subjects’ tendency to engage in panic buying. In models (3)–(6), we aim to get deeper insights into the distinct effects of the economic preferences on the three dimensions of social compliance and on panic buying. Therefore, we apply disaggregated analyses, where we substitute PC1 and PC2 by our data on established preference measures (trustworthiness, honesty, trust, patience, risk tolerance). 5 Models (2)–(6) always apply the same control variables: gender, age, subjects’ disposable income, and dummies controlling for whether their main information source is social media, whether they vote for left-wing parties, whether they stated no voter preferences, and whether they are econ students. The regressions focus on 183 subjects, as we lose two observations because of missing information. 6 The survey was conducted within two days (March 16 and 17, 2020) in the very dynamic time after Angela Merkel announced the drastic policy measures to fight COVID-19. During these days, events unfolded rapidly and the media reported more and more new cases of COVID-19. At this time, a growing number of online articles and special broadcasts covered the crisis and discussed the policy measures. This process may have affected behavior. To control for it, we always include a dummy variable (day two), which equals one when the survey was completed on day two (March 17, 2020).
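For readers who want to re-estimate models of this form, the following sketch shows the structure of, e.g., column (2) in statsmodels. The variable names and the randomly generated frame are our own placeholders; the study's data set and exact variable labels are not published as code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 183
# Placeholder frame mimicking the analysis data: standardized outcome and regressors
# plus the dummies described above (illustrative values, not the study's data).
df = pd.DataFrame({
    "compliance_index": rng.normal(size=n),
    "pc1": rng.normal(size=n), "pc2": rng.normal(size=n), "pc3": rng.normal(size=n),
    "present_bias": rng.integers(0, 2, size=n),
    "day_two": rng.integers(0, 2, size=n),
    "female": rng.integers(0, 2, size=n), "age": rng.normal(size=n),
    "income": rng.normal(size=n), "social_media": rng.integers(0, 2, size=n),
    "left_wing": rng.integers(0, 2, size=n), "no_vote_pref": rng.integers(0, 2, size=n),
    "econ": rng.integers(0, 2, size=n),
})

controls = "female + age + income + social_media + left_wing + no_vote_pref + econ"
model_2 = smf.ols(
    "compliance_index ~ pc1 + pc2 + pc3 + present_bias + day_two + " + controls,
    data=df,
).fit()
print(model_2.summary())  # structure comparable to column (2) of Table 1
```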

We find that PC1 is not significant, i.e., trustworthiness and honesty do not affect compliance in the crisis. Focusing on economic preferences, the regressions clearly show that time preferences and risk preferences are very important determinants of compliant behavior. That is, in models (1) and (2) the coefficient of PC2 is positive and significant. Thus, more patient and less risk-tolerant subjects show a higher degree of social compliance. The result is also robust to the inclusion of controls. In model (2) it can be seen that a one standard deviation (sd) increase in PC2 leads to a 0.195 sd increase in our compliance index. Models (3) and (4) reveal significant effects of time preferences, which show that our main finding is driven by the fact that patient people are more likely to stay home and to avoid crowds during the crisis. Model (3) highlights that a one sd increase in patience leads to a 0.164 sd increase on the staying-home scale. The effect of patience is even more pronounced for the avoidance of crowds, i.e., a one sd increase leads to a 0.189 sd increase on the avoidance-of-crowds scale. Risk tolerance is also predictive of compliant behavior. It has a similarly strong effect as patience on the avoidance of crowds, i.e., a one sd increase in risk tolerance leads to a 0.181 sd decrease on this scale. Model (6) reveals that more risk-tolerant subjects engage less in panic buying, i.e., a one sd increase in risk tolerance leads to a 0.185 sd decrease on this scale. Interestingly, present-biased subjects are significantly more prone to panic buying (p=0.051); being present-biased leads to a 0.455 sd increase on this scale. This confirms the literature finding that present bias leads to undisciplined behavior (e.g., Meier and Sprenger, 2010). We do not find that economic preferences or social responsibility affect subjects’ willingness to take a COVID-19 test (model (5)). We observe time dynamics in the social compliance of our subjects. The coefficients of day two in models (1)–(4) are all positive, of similar magnitude, and highly significant. That is, subjects behaved more compliantly on the second day after Merkel announced the COVID-19 measures. We observe the strongest effect of these time dynamics for “staying home.”

Next, we focus on social responsibility and analyze whether persons who score high on this component (PC3) also show a high degree of compliance in the COVID-19 crisis. Focusing on the relation between social responsibility and compliance, we indeed find a significantly positive correlation. This can be seen in models (1) and (2), where the coefficients of PC3 are positive and significant. Model (2) highlights that the effect of PC3: social responsibility is similar in magnitude to the effect of PC2: patience & risk tolerance. That is, a one sd increase in social responsibility leads to a 0.201 sd increase in the compliance index. Models (3) and (4) emphasize that the finding is confirmed in two of the three dimensions of compliance. That is, subjects with higher social responsibility are more likely to stay at home and to avoid crowds.

3.2.1. Perception of COVID-19

We briefly report the findings on the predictive power of economic preferences for citizens’ perception of COVID-19. 7 The results show that risk tolerance significantly and negatively affects subjects’ perceived fear of COVID-19, i.e., more risk-tolerant subjects are less terrified by the virus. At the same time, subjects who are more socially responsible are more afraid of the virus. We find that trusting subjects are less likely to perceive the media as exaggerating their reporting on the COVID-19 crisis. Finally, we find a significantly positive relation between the social-responsibility component and subjects’ acceptance of the COVID-19 policy measures.

3.3. Robustness checks

As outlined in the introduction, our findings generate policy implications, which only require correlation. Thus, it does not matter whether, for example, social image concerns drive the correlation between social responsibility and social compliance, as long as participants’ answers to our survey questions are externally valid.

First, we can test this in our data, as members of the subject pool of the Göttingen Laboratory of Behavioral Economics were informed at the time of their registration that no-shows of registered participants might lead to a cancellation of the experiment. The intention is to make them aware of the responsibility they take on when they register for an experiment. Thus, we believe that the revealed attendance rate at least partially reflects participants’ social responsibility. Hence, it should correlate with our principal component on social responsibility, which we obtained from the survey. For subjects who had participated in laboratory experiments before, we have data on their revealed reliability before the crisis, i.e., on their number of registrations (n) and show-ups (k). 8 For these participants, we can test the external validity of our survey measure of social responsibility. We have to exclude participants who never participated in experiments before. 9

Note that someone who registered and showed up for five experiments sends a less noisy signal than someone who registered and showed up only once. Put differently, any pattern of reliability can be generated by any unobserved true reliability rate θ ∈ (0,1). However, different levels of the true reliability θ have different probabilities of generating a given pattern of reliability. For n=10 and k=9, low θs are very unlikely to generate this pattern. To estimate the true reliability rate, we weighted each of the potentially true θs with the probability of generating the observed reliability pattern. 10
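The estimator in footnote 10 has a simple numerical implementation and, under a uniform weighting of the candidate θs, the closed form (k + 1)/(n + 2). The sketch below is our own illustration of that formula:

```python
from math import comb
from scipy.integrate import quad

def theta_hat(n, k):
    """Likelihood-weighted mean reliability (the formula in footnote 10)."""
    num, _ = quad(lambda t: t * comb(n, k) * t**k * (1 - t)**(n - k), 0, 1)
    den, _ = quad(lambda t: comb(n, k) * t**k * (1 - t)**(n - k), 0, 1)
    return num / den

# Example: n = 10 registrations and k = 9 show-ups give an estimate of about 0.83,
# matching the closed form (k + 1) / (n + 2).
print(theta_hat(10, 9), (9 + 1) / (10 + 2))
```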

Based on the estimated reliability θ̂, we generate a dummy variable which equals one if the estimated reliability is above the median and zero otherwise. We thereby classify participants with an estimated reliability above 75% as reliable. Similarly, we generate a dummy which equals one if social responsibility is above the median. Using dummy variables has the advantage of reducing measurement error and the dependency on the metric scale or the form of the relationship (e.g., linearity). We have attendance data available for more than half of the subjects in the sample used in the regression analysis presented in Table 1. In our data, reliable subjects show a higher self-reported social responsibility (χ²(1, N=97)=2.901; p=0.089). This is confirmed by an OLS regression on social responsibility. Controlling for socioeconomics, we find a positive and significant coefficient for participants’ estimated reliability θ̂ (β=1.903; p=0.068, see Table 5 in the appendix). The significant correlation between revealed past reliability and our social-responsibility component underlines the external validity of this measure.

Second, we discuss the external validity of our survey measures of time preferences, risk preferences, and trust. We elicit these preferences verbally and without incentives (Falk et al., 2016, Falk et al., 2018) and test how well they translate to participants’ behavior during the pandemic. Several studies have demonstrated that these non-incentivized measures correlate with incentivized measures. This holds for risk tolerance (Dohmen et al., 2011, Falk et al., 2016, Grosch et al., 2020) and for trust (Falk et al., 2016). 11 We validated the verbal time-preference measure in an experiment (Rau, 2020) at the University of Göttingen and showed that it correlates with the incentivized multiple-price-list measure of Andreoni and Sprenger (2012) (Spearman’s ρ=0.273, p<0.001). These preference measures have proven to predict individual behavior and many important economic outcomes across countries (Dohmen et al., 2012, Alan and Ertac, 2018, Falk et al., 2018).

Third, for our measure of social compliance we provide indirect evidence for the truthfulness of the answers in two ways. In a first step, we make use of our honesty measure as a robustness check. In another data set (Grosch et al., 2020), we show that this unincentivized measure correlates 12 with the incentivized measure of lying preferences based on the paradigm of Fischbacher and Föllmi-Heusi (2013), which in turn has been shown to predict economic behavior outside the laboratory (e.g., Potters and Stoop, 2016, Hanna and Wang, 2017, Dai et al., 2018). In our data, we find no correlation of honesty with social compliance (Spearman’s ρ=0.006, p=0.935), with time preferences (Spearman’s ρ=−0.020, p=0.783), or with risk preferences (Spearman’s ρ=−0.009, p=0.907). Indeed, honesty only correlates with the answers on fare evasion. The negative correlation we find (see Table 4) has been reported before by Dai et al. (2018), who show in the context of public transportation that lab measures of dishonesty predict fraud in the field. Thus, subjects answered consistently with this finding on real-life fare evasion, which suggests truthfulness. Finally, our findings are very plausible from the perspective of economic theory and are suggested by empirical evidence. For example, time preferences reflect subjects’ impatience, i.e., their preference for immediate utility over delayed utility (Frederick et al., 2002). Empirical evidence suggests that more patient subjects are more successful in social domains which require a high degree of self-control (Alan and Ertac, 2018). Some of the prescribed measures of the COVID-19 stress test are characterized by trading off immediate against delayed utility. This applies to the directives to stay at home and to avoid public events with large crowds. Therefore, one might expect that more patient citizens are more likely to stay at home and to avoid crowds. Present-biased subjects face self-control problems, as disproportionate preferences for immediate consumption make it hard for them to delay instantaneous gratification (Meier and Sprenger, 2010). Thus, such citizens may have problems following through with their intended compliance with politicians’ appeal not to panic buy when faced with tempting consumption opportunities.

4. Conclusion

The success of the policy measures to fight the COVID-19 pandemic, and therefore the economic severity of the crisis, depends to a large extent on citizens’ compliance. This study provides insights into the individual drivers of citizens’ compliance with the public regulations and behavioral recommendations. We focus, on the one hand, on standard measures of economic preferences with respect to time, risk, trust, and honesty as potential explanations of participants’ compliance. 13 On the other hand, we focus on participants’ social responsibility taken on before the crisis. Our main findings are that patience and social responsibility are related to higher social compliance, whereas risk-tolerant and present-biased participants show lower compliance in the form of avoidance of crowds and panic buying, respectively. Our student subject pool is likely to reduce measurement error (Snowberg and Yariv, forthcoming) and allows us to exploit data from the same subject pool before the crisis.

A general concern with student samples is the potentially limited generalizability of our results. First, we exploit pre-crisis data from the same subject pool on preference measures and showed in Section 3.1 that we find no statistically significant differences from the survey measures. Thus, we can rule out a sample-selection effect at the level of our subject pool. Second, our data are obtained from a subject pool which mainly consists of students. Applying a student subject pool is important for our analysis in several respects. On the one hand, it allows us to compare the elicited preferences in the survey to earlier data, which we obtained in controlled laboratory experiments with students from the same subject pool before the crisis started. On the other hand, student subject pools are less prone to measurement error, which can bias the analysis (Snowberg and Yariv, forthcoming).

The strength of this approach comes at the cost of potential limitations in terms of the generalizability of our results. In this regard, the main concern is that student participants might not be representative of the whole population, or may be prone to selection effects. In our case, however, students are representative in the sense that they account for about 25% of all German citizens between the age of 20 and 29, an age cohort which has been reported to be least compliant with social-distancing regulations (Brouard et al., 2020, Daoust, 2020, Moore et al., 2020). 14 Thus, we look at a part of the population which is of special importance concerning the current pandemic and the prevention of future pandemics. Furthermore, Snowberg and Yariv (forthcoming) show in a large study that behavior measured in a student subject pool correlates with that of a representative U.S. population. Moreover, evidence emphasizes that the preferences we focus on are not only predictive for student samples in the lab, but also for non-student samples in the field. There is evidence that risk attitudes correlate with financial decisions. This was shown in the laboratory with student samples (e.g., Fellner and Maciejovsky, 2007, Eckel and Füllbrunn, 2015) and in the field (Dohmen et al., 2011). For time preferences, it was shown that smokers have higher discount rates than nonsmokers, both for student samples in the lab (Harrison et al., 2018) and for samples in the field (e.g., Kang and Ikeda, 2014). Regarding compliance, results on tax evasion also highlight the consistency between lab and field findings. For instance, Fonseca and Myles (2011) report that the majority of tax-evasion laboratory studies find that the fine rate and the probability of audit have a positive effect on compliance. The latter is also found for Minnesota tax payers (Slemrod et al., 2001).

Next, we elaborate on the potential policy implications of our results. Our results provide valuable insights for the short-run crisis management of policy-makers in the current and in future crises. The findings facilitate the identification of target groups for the allocation of surveillance (e.g., law enforcement) or medical resources (e.g., vaccines, masks), and the design of target-group-specific information campaigns. Our first predictor, social responsibility, comprises several (potentially) observable variables. That is, federal government agencies have access to fine-grained data on voter turnout and general vaccination rates, and municipalities can provide regional data on fare evasion as well as opinion polls containing local information about anti-vaccination movements. Together with information about other related observable measures of social responsibility, such as tax evasion, speeding, and littering, policy-makers could predict regions of low social compliance.

Regarding our second predictor, there is plenty of research on the distribution of (not directly observable) economic preferences across different dimensions. At the level of occupational groups, individual risk preferences and patience are informative for workers’ selection into jobs (e.g., Bonin et al., 2007, Masclet et al., 2009, Fouarge et al., 2014). In light of this, our findings help to identify risk groups in the workplace. Our results revealed that risk-tolerant persons behave less compliantly, which points to workers in high-income branches who predominantly encounter financial and social risks (Hill et al., 2019). This suggests that fines may vary by income. Furthermore, policy-makers may run informational campaigns on compliance in professional fields which attract risk-tolerant persons. At the global level, risk preferences and patience vary to a high degree (Falk et al., 2018). In this respect, our results generate predictions for differences in compliance across countries. 15 This may provide important insights for organizations which operate worldwide. For instance, the WHO may target health education at countries characterized by a low degree of patience or a high degree of risk tolerance. The information on countries that are at risk may help to improve the screening of infectious diseases worldwide. We are aware that applying the preference results to predict compliance across countries requires complex additional analyses with appropriate data sets. In this respect, our results are a first promising starting point for future research focusing on compliance across countries. We also encourage replication studies and more research to fill in the blank spots on the map of correlations between (indirectly) observable characteristics and social compliance with policy regulations in times of a crisis.

Finally, to the extent that the observed correlations between economic preferences and social compliance constitute causal relations, our findings offer a second type of policy implication related to the endogeneity of preferences. For example, regarding time preferences, Alan and Ertac (2018) conducted a randomized educational intervention on children’s intertemporal choices. The treated children became more patient in incentivized experimental tasks, the results persisted almost three years after the intervention, and the students were less likely to receive a low “behavior grade.” If patience causes higher compliance, such an intervention might not only generate private benefits for the students, but also positive externalities in times of a crisis.

Acknowledgements

We thank Volker Benndorf, Kerstin Grosch, Hannes Müller, Emmanuel Peterle, and Amélie Wieczorek for their helpful comments. We thank Peter Werner for providing us with their data. We also want to thank the editor Erik Snowberg and two anonymous referees for very helpful comments. Financial support from the University of Göttingen is gratefully acknowledged.

Footnotes

3

We want to thank an anonymous referee for raising this point.

4

If subjects’ preferences were below/equal (above) median, they become a low (high) type.

5

In Table 3 of the appendix, we also present regressions on the compliance index and panic buying, where we substitute PC3 by the social-responsibility items.

6

One subject stated to be neither female nor male. Another subject did not enter the disposable income and argued that he has an unlimited amount of money because his parents pay for him.

7

A detailed analysis is reported in Müller and Rau (2020).

8

Cases of excused absence count as show-ups.

9

First, we wanted a measure of taking responsibility before the crisis. Second, registration for a laboratory experiment in advance and showing up at the laboratory at the date of the experiment is not comparable to a situation where registration allows an immediate online participation via a link to the survey included in the invitation.

10

That is, we applied the following formula: \(\hat{\theta}(n,k)=\dfrac{\int_0^1 \theta\,\binom{n}{k}\,\theta^{k}(1-\theta)^{n-k}\,d\theta}{\int_0^1 \binom{n}{k}\,\theta^{k}(1-\theta)^{n-k}\,d\theta}\).

11

Falk et al. (2016) find in their validation study a Spearman’s rank correlation coefficient of ρ=0.410 (p<0.001) for risk preferences. A similar result is found by Grosch et al. (2020) (Spearman’s ρ=0.345,p<0.001). For trust Falk et al. (2016) find a Spearman’s rank correlation coefficient of ρ=0.283 (p<0.001).

12

In the data of Grosch et al. (2020), honesty in the die-game correlates with the unincentivized verbal question we use in our survey (Spearman’s ρ=0.185,p=0.002).

13

Campos-Mercade et al. (2020) find that social preferences in the form of prosocial behavior have a positive effect on health behaviors during the COVID-19 pandemic.

14

The Federal Statistical Office of Germany reports that in 2019 there were about 2.9 million students and about 9.8 million citizens aged between 20 and 29 years. Source: https://www-genesis.destatis.de/genesis/online. Moreover, 84.3% of all students are between 20 and 29 years old. Source: Statista 2020.

15

Chavarría et al. (2020) report data from interviews in an Indonesian sample, showing that economic preferences and particularly disease knowledge explain protective health behavior against COVID-19.

Appendix A. Tables and figures

Table 2 presents an overview of the means of our preference elicitations and our control variables. Besides subjects’ sociodemographics, we also asked them about their media and voter preferences. That is, we asked for their main information source (TV, newspapers, social media, or friends). Based on that, we build a dummy variable “social media,” which equals one when subjects stated that their main information source was social media. Regarding voter preferences, we asked about the party they vote for. The dummy “left-wing party voter” equals one if subjects stated that they vote for either “The Left,” “The Greens,” or the “SPD.” If subjects did not reveal any voter preferences, the dummy “no voter preferences indicated” equals one.

Table 2.

Preferences and controls of study participants (n = 185).

                                mean     sd       min    max
Preferences
Risk                            4.97     2.10     1      10
Patience                        0.90     0.19     0      1
Trust                           5.76     2.41     0      10
Trustworthiness                 8.31     1.60     1      10
Honesty                         7.99     1.40     2      10
Controls
Age                             22.89    4.47     18     67
Female                          0.52     0.51
Econ                            0.19     0.40
Disposable income               435.87   327.82   0      2500
Main source: social media       0.13     0.34
Left-wing party voter           0.59     0.49
No voter preferences indicated  0.23     0.42

Table 3.

OLS regressions on the compliance index and on panic buying (PCs substituted by preference measures and social-responsibility items).

Compliance index Panic buying
(1) (2)
Trustworthiness 0.029 −0.085
(0.089) (0.089)
Honesty 0.075 0.009
(0.089) (0.090)
Patience 0.168∗∗ 0.086
(0.078) (0.078)
Risk tolerance −0.093 −0.176∗∗
(0.082) (0.082)
Vaccination 0.160∗∗ 0.098
(0.076) (0.076)
Turnout 0.097 −0.155∗∗
(0.074) (0.074)
Fare evasion −0.064 −0.062
(0.080) (0.080)
Present bias 0.180 0.413
(0.228) (0.229)
Trust −0.009 −0.034
(0.077) (0.078)
Day two 0.528∗∗∗ 0.073
(0.157) (0.158)
Constant −0.131 0.138
(0.219) (0.220)



Controls Yes Yes
obs. 183 183
R2 0.138 0.121




Standard errors in parentheses
∗∗∗p < 0.01, ∗∗p < 0.05, ∗p < 0.1

Controls: Gender, age, disposable income, dummies which control whether their main information source is social media, whether they vote for left-wing parties, whether they stated no voter preferences, whether they are econ students.

Table 4.

Pairwise correlations between our elicitations across category and across demographics.

Patience Risk tol. Pres. bias Trust Trustw. Honesty Fare evas. Turnout Vaccin. Compl. index Income Female Age Econ
Patience 1
Risk tol. −0.210∗∗∗ 1
Pres. bias −0.239∗∗∗ 0.168∗∗ 1
Trust 0.092 0.014 0.073 1
Trustw. −0.080 −0.041 0.046 0.244∗∗∗ 1
Honesty −0.091 −0.061 0.002 0.183∗∗ 0.533∗∗∗ 1
Fare evas. 0.030 0.195∗∗∗ 0.012 −0.062 −0.104 −0.168∗∗ 1
Turnout 0.033 0.003 −0.070 0.059 0.063 −0.023 −0.020 1
Vaccin. 0.010 −0.056 −0.013 −0.034 0.018 −0.075 −0.133 0.068 1
Compl. index 0.126 −0.052 −0.056 0.044 0.082 0.081 −0.014 0.156∗∗ 0.125 1
Income −0.094 0.063 0.116 −0.037 −0.049 −0.008 0.096 −0.073 −0.091 0.007 1
Female 0.026 −0.309∗∗∗ −0.077 0.046 0.023 0.117 −0.122 0.006 0.078 0.020 −0.150∗∗ 1
Age −0.057 0.036 0.053 −0.059 −0.010 0.005 −0.007 0.061 −0.007 0.027 0.561∗∗∗ −0.162∗∗ 1
Econ 0.020 0.197∗∗∗ 0.054 0.044 −0.078 −0.076 0.099 0.083 −0.070 −0.052 −0.110 −0.099 0.126 1

Table 5.

External validity of social responsibility.

Social Responsibility
Reliability (θ̂) 1.903
(1.028)
Female 0.180
(0.222)
Econ −0.503∗∗
(0.249)
Age 0.238∗∗
(0.116)
Income −0.388∗∗∗
(0.122)



obs. 96
R2 0.152



Standard errors in parentheses
∗∗∗p < 0.01, ∗∗p < 0.05, ∗p < 0.1

We had to drop the observation of one participant who did not provide the information on the disposable income.

Appendix B. Questions of the online survey

B.1. Preferences part

[Risk Tolerance].

  • How do you assess yourself: Are you a person who is prepared to take risks in general, or do you avoid taking risks? (0 = not at all prepared to take risks; 10 = very prepared to take risks)

[Time Preferences].

  • How much money do you want to receive today, such that you give up a sure payment of €1000 in 6 months? (Please enter a money amount between €0 and €1000)

  • How much money do you want to receive in 6 months, such that you give up a sure payment of €1000 in 12 months? (Please enter a money amount between €0 and €1000)

[Honesty]

  • How do you assess yourself: Are you an honest person? (0 = not at all honest; 10 = very honest)

[Trust and Trustworthiness].

  • How well does the following statement describe you as a person? (0 = does not describe me at all; 10 = describes me perfectly)

    As long as I am not convinced otherwise, I assume that people have only the best intentions.

  • How well does the following statement describe you as a person? (0 = does not describe me at all; 10 = describes me perfectly)

    I consider myself to be a trustworthy person.

[Fare Evasion].

  • How often did you use public transportation services without having a valid ticket? (never before; rarely; occasionally; frequently; very frequently; always)

[Agreement to Vaccination].

  • How much do you agree to the law of compulsory measles vaccination, which came into effect on March 1, 2020? Under this law, all kids have to show proof of all recommended measles vaccinations before they go to kindergarten or school. (0 = do not agree at all; 10 = completely agree)

[Participation in Election].

  • Have you participated in the last parliamentary/state election? (yes/no)

B.2. Contextual COVID-19 part

[Fear of COVID-19].

  • How much are you afraid of the Corona virus? (0 = not at all afraid; 10 = very much afraid)

[Staying at Home].

  • Have you reduced going outside because of the Corona virus? (no; yes, I go out less often; yes, I go out much less often; yes, I go out very much less often)

[COVID-19 testing].

  • Imagine that you experience symptoms which are typical of the COVID-19 virus. How likely is it that you contact your family doctor/public health department by phone? (0 = unlikely; 10 = very likely)

[Suspected Case of COVID-19].

  • Do you know any suspected case of Corona in your personal environment? (yes; no)

[Purchases of Food].

  • Did you change your purchases of durable food (such as noodles, rice, or pesto) because of the Corona virus? (I buy much less of it; I buy less of it; no change in consumption; I buy more of it; I buy much more of it)

[Avoidance of Crowds].

  • How strongly do you avoid large crowds in public (public-transportation services, bars, restaurants, etc.)? (0 = not at all; 10 = completely)

[Main Information Source].

  • What is your main source of information? (TV news; print media; online newspapers; social media (Twitter, Facebook/Instagram); family, friends, fellow students/colleagues)

[Media Reporting].

  • How do you perceive the general media reporting of the Corona virus? (very understated; understated; adequate; exaggerated; very exaggerated)

[Self-assessed Likelihood of Becoming Infected with the COVID-19 virus].

  • What do you think is the probability that you will be infected with the virus within the next four weeks? (Please enter a value between 0 and 100)

[Agreement to Policy Measures].

  • How appropriate are the policy measures (Educational work, school closures, travel bans, etc.) in the context of the Corona virus, which were decided by the federal government? (0 = not appropriate; 10 = fully appropriate)

B.3. Socio demographics

  • What is your age?

  • What is your gender?

  • What is your nationality?

  • What is your field of study/job (if not a student)?

  • What is your monthly free disposable income (after the deduction of all regular payments, such as rent)?

B.4. Politics

  • What is the party that you sympathize most with? (CDU, SPD, The Greens, FDP, The Left, AfD, NPD, no information)

References

  1. Alan S., Ertac S. Fostering patience in the classroom: results from randomized educational intervention. J. Polit. Econ. 2018;126(5):1865–1911.
  2. Allingham M.G., Sandmo A. Income tax evasion: a theoretical analysis. J. Public Econ. 1972;1(3–4):323–338.
  3. Alm J., Jackson B., McKee M. Institutional uncertainty and taxpayer compliance. Am. Econ. Rev. 1992;82(4):1018–1026.
  4. Andreoni J., Sprenger C. Estimating time preferences from convex budgets. Am. Econ. Rev. 2012;102(7):3333–3356.
  5. Antinyan A., Bassetti T., Corazzini L., Pavesi F. Trust in the healthcare system and COVID-19 treatment in the developing world: survey and experimental evidence from Armenia. Working Paper. 2020.
  6. Ayal S., Celse J., Hochman G. Crafting messages to fight dishonesty: a field investigation of the effects of social norms and watching eye cues on fare evasion. Organ. Behav. Hum. Decis. Process. 2019.
  7. Banerjee, R., Bhattacharya, J., Majumdar, P., 2020. Exponential-growth prediction bias and compliance with safety measures in the times of COVID-19. arXiv preprint arXiv:2005.01273.
  8. Bjørnskov C. The multiple facets of social capital. Eur. J. Polit. Econ. 2006;22(1):22–40.
  9. Bonin H., Dohmen T., Falk A., Huffman D., Sunde U. Cross-sectional earnings risk and occupational sorting: the role of risk attitudes. Lab. Econ. 2007;14(6):926–937.
  10. Bowles S., Hwang S.-H. Social preferences and public economics: mechanism design when social preferences depend on incentives. J. Public Econ. 2008;92(8–9):1811–1820.
  11. Bronchetti E.T., Huffman D.B., Magenheim E. Attention, intentions, and follow-through in preventive health behavior: field experimental evidence on flu vaccination. J. Econ. Behav. Org. 2015;116:270–291.
  12. Brouard, S., Vasilopoulos, P., Becher, M., 2020. Sociodemographic and psychological correlates of compliance with the COVID-19 public health measures in France. Can. J. Polit. Sci./Revue canadienne de science politique 1–6.
  13. Campos-Mercade, P., Meier, A., Schneider, F., Wengström, E., 2020. Prosociality predicts health behaviors during the COVID-19 pandemic. University of Zurich, Department of Economics, Working Paper, (346).
  14. Cappelen A., Mæstad O., Tungodden B. Demand for childhood vaccination – insights from behavioral economics. Forum Develop. Stud. 2010;37(3):349–364.
  15. Cappelen A.W., Nygaard K., Sørensen E.Ø., Tungodden B. Social preferences in the lab: a comparison of students and a representative population. Scand. J. Econ. 2015;117(4):1306–1326.
  16. Chavarría, E., Diba, F., Marcus, M.E., Reuter, A., Rogge, L., Vollmer, S., et al., 2020. Knowing versus doing: protective health behavior against COVID-19 in Indonesia. Technical report, Discussion Papers, University of Göttingen.
  17. Dai Z., Galeotti F., Villeval M.C. Cheating in the lab predicts fraud in the field: an experiment in public transportation. Manage. Sci. 2018;64(3):1081–1100.
  18. Daoust J.-F. Elderly people and responses to COVID-19 in 27 countries. PLoS One. 2020;15(7):e0235590. doi: 10.1371/journal.pone.0235590.
  19. Dawes C.T., Loewen P.J., Fowler J.H. Social preferences and political participation. J. Polit. 2011;73(3):845–856.
  20. Dohmen T., Falk A., Huffman D., Sunde U., Schupp J., Wagner G.G. Individual risk attitudes: measurement, determinants, and behavioral consequences. J. Eur. Econ. Assoc. 2011;9(3):522–550.
  21. Dohmen T., Falk A., Huffman D., Sunde U. The intergenerational transmission of risk and trust attitudes. Rev. Econ. Stud. 2012;79(2):645–677.
  22. Eckel C.C., Füllbrunn S.C. Thar she blows? Gender, competition, and bubbles in experimental asset markets. Am. Econ. Rev. 2015;105(2):906–920.
  23. Falk A., Meier S., Zehnder C. Do lab experiments misrepresent social preferences? The case of self-selected student samples. J. Eur. Econ. Assoc. 2013;11(4):839–852.
  24. Falk, A., Becker, A., Dohmen, T.J., Huffman, D., Sunde, U., 2016. The preference survey module: a validated instrument for measuring risk, time, and social preferences. IZA DP, 9674.
  25. Falk A., Becker A., Dohmen T., Enke B., Huffman D., Sunde U. Global evidence on economic preferences. Quart. J. Econ. 2018;133(4):1645–1692.
  26. Fellner G., Maciejovsky B. Risk attitude and market behavior: evidence from experimental asset markets. J. Econ. Psychol. 2007;28(3):338–350.
  27. Fischbacher U., Föllmi-Heusi F. Lies in disguise – an experimental study on cheating. J. Eur. Econ. Assoc. 2013;11(3):525–547.
  28. Fischbacher U., Gächter S. Social preferences, beliefs, and the dynamics of free riding in public goods experiments. Am. Econ. Rev. 2010;100(1):541–556.
  29. Fonseca, M., Myles, G.D., 2011. A survey of experiments on tax compliance. Technical report, mimeo.
  30. Fouarge D., Kriechel B., Dohmen T. Occupational sorting of school graduates: the role of economic preferences. J. Econ. Behav. Org. 2014;106:335–351.
  31. Frederick S., Loewenstein G., O'Donoghue T. Time discounting and time preference: a critical review. J. Econ. Lit. 2002;40(2):351–401.
  32. Gerber A.S., Green D.P., Larimer C.W. Social pressure and voter turnout: evidence from a large-scale field experiment. Am. Polit. Sci. Rev. 2008;102(1):33–48.
  33. Gillen B., Snowberg E., Yariv L. Experimenting with measurement error: techniques with applications to the Caltech Cohort Study. J. Polit. Econ. 2019;127(4):1826–1863.
  34. Grosch, K., Müller, S., Rau, H., Zhurakhovska, L., 2020. Measuring (social) preferences with simple and short questionnaires. mimeo.
  35. Hanna R., Wang S.-Y. Dishonesty and selection into public service: evidence from India. Am. Econ. J.: Econ. Policy. 2017;9(3):262–290.
  36. Hansen P.R., Schmidtblaicher M. A dynamic model of vaccine compliance: how fake news undermined the Danish HPV vaccine program. J. Bus. Econ. Stat. 2019:1–21.
  37. Harrison G.W., Hofmeyr A., Ross D., Swarthout J.T. Risk preferences, time preferences, and smoking behavior. South. Econ. J. 2018;85(2):313–348.
  38. Hill T., Kusev P., Van Schaik P. Choice under risk: how occupation influences preferences. Front. Psychol. 2019;10:2003. doi: 10.3389/fpsyg.2019.02003.
  39. Kang M.-I., Ikeda S. Time discounting and smoking behavior: evidence from a panel survey. Health Econ. 2014;23(12):1443–1464. doi: 10.1002/hec.2998.
  40. Kirchler E. The Economic Psychology of Tax Behaviour. Cambridge University Press; 2007.
  41. Kleinberg J., Ludwig J., Mullainathan S., Obermeyer Z. Prediction policy problems. Am. Econ. Rev.: Pap. Proc. 2015;105(5):491–495. doi: 10.1257/aer.p20151023.
  42. Knack S., Keefer P. Does social capital have an economic payoff? A cross-country investigation. Quart. J. Econ. 1997;112(4):1251–1288.
  43. Masclet D., Colombier N., Denant-Boemont L., Loheac Y. Group and individual risk preferences: a lottery-choice experiment with self-employed and salaried workers. J. Econ. Behav. Org. 2009;70(3):470–484.
  44. Meier S., Sprenger C. Present-biased preferences and credit card borrowing. Am. Econ. J.: Appl. Econ. 2010;2(1):193–210.
  45. Moore, R.C., Lee, A., Hancock, J.T., Halley, M., Linos, E., 2020. Experience with social distancing early in the COVID-19 pandemic in the United States: implications for public health messaging. medRxiv. doi:10.1101/2020.04.08.20057067. https://www.medrxiv.org/content/early/2020/04/11/2020.04.08.20057067.
  46. Müller, S., Rau, H.A., 2020. Economic preferences and compliance in the social stress test of the COVID-19 crisis. cege Discussion Papers, Number 391 – April 2020.
  47. Pierce L., Snow D.C., McAfee A. Cleaning house: the impact of information technology monitoring on employee theft and productivity. Manage. Sci. 2015;61(10):2299–2319.
  48. Potters J., Stoop J. Do cheaters in the lab also cheat in the field? Eur. Econ. Rev. 2016;87:26–33.
  49. Putnam R.D. Bowling alone: America's declining social capital. J. Democracy. 1995;6:64–78.
  50. Rau, H.A., 2020. Time preferences in decisions for others. cege Discussion Papers, Number 395 – June 2020.
  51. Riedl, A., Schmeets, H., Werner, P., 2019. Preferences for solidarity and attitudes towards the Dutch pension system: evidence from a representative sample. NETSPAR Design Paper, 128.
  52. Sheedy E., Zhang L., Tam K.C.H. Incentives and culture in risk compliance. J. Bank. Finance. 2019;107:105611.
  53. Simon-Tuval T., Shmueli A., Harman-Boehm I. Adherence of patients with type 2 diabetes mellitus to medications: the role of risk preferences. Curr. Med. Res. Opin. 2018;34(2):345–351. doi: 10.1080/03007995.2017.1397506.
  54. Slemrod J., Blumenthal M., Christian C. Taxpayer response to an increased probability of audit: evidence from a controlled experiment in Minnesota. J. Public Econ. 2001;79(3):455–483.
  55. Snowberg, E., Yariv, L., 2018. Testing the waters: behavior across participant pools. American Economic Review, forthcoming.
  56. Staats B.R., Dai H., Hofmann D., Milkman K.L. Motivating process compliance through individual electronic monitoring: an empirical examination of hand hygiene in healthcare. Manage. Sci. 2017;63(5):1563–1585.
  57. Stango, V., Yoong, J., Zinman, J., 2017. The quest for parsimony in behavioral economics: new methods and evidence on three fronts. Technical report, National Bureau of Economic Research.
  58. Van Der Pol M., Hennessy D., Manns B. The role of time and risk preferences in adherence to physician advice on health behavior change. Eur. J. Health Econ. 2017;18(3):373–386. doi: 10.1007/s10198-016-0800-7.
