Author manuscript; available in PMC: 2020 Aug 25.
Published in final edited form as: Surv Res Methods. 2019 Dec 10;13(3):289–304.

Does Benefit Framing Improve Record Linkage Consent Rates? A Survey Experiment

Joseph W Sakshaug 1, Jens Stegmaier 2, Mark Trappmann 3, Frauke Kreuter 4
PMCID: PMC7447194  NIHMSID: NIHMS1594060  PMID: 32849920

Abstract

Survey researchers are increasingly seeking opportunities to link interview data with administrative records. However, obtaining consent from all survey respondents (or certain subgroups) remains a barrier to performing record linkage in many studies. We experimentally investigated whether emphasizing different benefits of record linkage to respondents in a telephone survey of employee working conditions improves respondents’ willingness to consent to linkage of employment administrative records relative to a neutral consent request. We found that emphasizing linkage benefits related to “time savings” yielded a small, albeit statistically significant, improvement in the overall linkage consent rate (86.0 percent) relative to the neutral consent request (83.8 percent). The time savings argument was particularly effective among “busy” respondents. A second benefit argument related to “improved study value” did not yield a statistically significant improvement in the linkage consent rate (84.4 percent) relative to the neutral request. This benefit argument was also ineffective among the subgroup of respondents considered to be most likely to have a self-interest in the study outcomes. The article concludes with a brief discussion of the practical implications of these findings and offers suggestions for possible research extensions.

Keywords: administrative data, framing, interviewer-respondent interaction, questionnaire design, telephone survey

1. Introduction

Linking survey data with administrative records has become an increasingly attractive option in the field of survey research. Several high-profile surveys, such as the UK Household Longitudinal Study (Buck & McFall, 2012), the US National Health Interview Survey (Parsons et al., 2014), and the German Study “Labour Market and Social Security” (PASS; Trappmann et al., 2019; Trappmann, Beste, Bethmann, and Müller, 2013), engage in linking interview data to one or more of a variety of administrative data types (e.g. employment, education, health). Recently, several high-profile committees and organizations have endorsed record linkage as a way to improve official statistics and evidence-based policymaking. For example, the US National Academies of Sciences, Engineering, and Medicine (2017) recommended that “federal statistical agencies redesign current data collection efforts and estimation using multiple data sources, adapt current statistical methods to combine data sources, and … develop the new methods needed for design and analysis using multiple data sources (p.2).” The US Commission on Evidence-Based Policymaking (2017) advocated for greater use of record linkage, stating that it “will be an increasingly vital aspect of the evidence-building community’s capacity to meet future demand from policymakers (p.2).” Further, the Longitudinal Studies Strategic Review (2018) panel advised the UK’s Economic and Social Research Council (ESRC) to “promote, facilitate, and negotiate administrative data linkage for researchers … to meet increasing demands for longitudinal information including at local authority level and for particular subgroups (p.8).”

Linking survey data with administrative data offers several benefits to researchers, sponsors, and potentially to the survey respondents themselves. For researchers, record linkage increases scientific opportunities and strengthens the value of social science studies by making more information available on the surveyed units. Administrative records often contain important substantive variables recorded longitudinally over the life course (e.g. un/employment durations, lifetime earnings, medical expenditures). Many of these variables would be difficult—or near-impossible—to collect with high accuracy from respondent self-reports alone. For sponsors of survey research, the ability to merge interview data with existing administrative data is attractive from a cost perspective as it enhances the data profile of survey respondents at a fraction of the cost of primary data collection. Finally, for respondents, if the questionnaire is designed with linkage in mind then a shorter and more parsimonious interview can be conducted, leading to time savings and reduced burden for respondents. In some cases, interview duration and respondent burden can be immediately reduced if linkage is offered as an alternative to answering a subset of questionnaire items during the interview (e.g. Michaud, Dolson, Adams, & Renaud, 1995).

Although the benefits of linking administrative data to survey data are appealing to many stakeholders, what is unclear is the extent to which they resonate with the gatekeepers of these data—the survey respondents—whose data are linked and for whom authorization (or consent) for the linkage is typically sought. The new European General Data Protection Regulations have given the informed consent process renewed visibility, for example, by forcing the private sector to ask for consent to link data sources (General Data Protection Regulation, 2016). Researchers might be able to use the “legitimate interest” clause as a legal basis for linking records without asking for consent, though explicit consent might legitimize the use of data, in particular for those with more sensitive content1.

Convincing respondents that record linkage is a worthy endeavor is not an easy task as evidenced by large variation in linkage consent rates across studies, disciplines, and target populations (da Silva et al., 2012; Sakshaug & Kreuter, 2012). Furthermore, Fulton (2012) reports a declining trend in linkage consent rates in long-running repeated cross-sectional studies in the US, suggesting that respondents’ perception of the benefits of linkage are potentially being outweighed by other factors (e.g. concerns about sharing confidential data; see Sala, Knies, & Burton, 2014). However, trends in linkage consent rates for the National Health Interview Survey have improved based on changes to the linkage consent questions (Miller, Gindi, & Parker, 2011).

The purpose of the present study is to investigate the impact of highlighting particular benefits of record linkage on respondents’ willingness to consent to a linkage request. Apart from a main effect, we also look at effects for subgroups that can be expected to be responsive to the particular benefits mentioned, thus investigating the potential for tailoring the linkage request.

2. Background

Administering survey requests in a way that makes salient factors which are believed to be attractive to sample members is a well-known strategy for improving survey participation rates (Groves, Singer, & Corning, 2000). Efforts to improve linkage consent rates in surveys have adopted similar strategies, namely, by emphasizing specific benefits of linkage during the linkage request. The idea of framing a data sharing request in terms of its expected benefits was experimentally tested in a nationally-representative telephone and face-to-face study by Bates, Wroblewski, and Pascale (2012). Respondents were asked to consider a hypothetical proposal in which the US Census Bureau would collect demographic information from government administrative records for people who did not return their census forms. Respondents were randomly allocated into separate framing groups which emphasized different benefits of the proposal. In one group, the expected cost savings incurred from utilizing government records instead of collecting census forms was emphasized. In the second group, the expected reduction in respondent burden that would result from substituting information from government records in lieu of filling out and mailing back a census form was emphasized. In the third group (control), no benefits of data sharing were emphasized in the proposal. The authors found that both benefit arguments (cost savings and reduced burden) elicited more positive feelings towards the proposal compared to the control group, with the cost savings argument yielding slightly more positive feelings than the reduced burden argument.

In an actual application of obtaining linkage consent, Pascale (2011) experimented with three benefit framing arguments for the linkage consent question in a U.S. telephone study conducted by the U.S. Census Bureau. The first argument stated that linking surveys to government administrative records would improve the accuracy of the research results, the second argument emphasized the cost savings that would result if the survey data were linked to government records, and the last argument was that record linkage would result in time savings for the respondent by producing additional statistical data “without taking up your time with more questions.” After being read the benefit framing arguments, respondents were asked if they had any objections to the linkage. Contrary to expectations, there were no statistically significant differences in “no objection” rates between the three benefit arguments (cost savings: 85.3 percent; time savings: 83.6 percent; improved accuracy: 83.0 percent). Sakshaug, Tutz, and Kreuter (2013) also report the lack of a benefit framing effect in a linkage consent experiment embedded within a telephone study in Germany. Respondents in the benefit condition were presented with a time savings argument (“To keep the interview as brief as possible… ”) whereas respondents in the complementary condition received no benefit argument. Both versions yielded virtually identical consent rates (time savings: 95.5 percent; neutral: 95.6 percent). However, a subsequent replication of this experiment in a web survey yielded a positive, though modest, effect of the time savings argument (time savings: 61.4 percent; neutral: 55.4 percent) (Sakshaug & Kreuter, 2014).

Rather than emphasizing the benefits that would be gained if linkage consent were obtained, an alternative benefit framing strategy is to emphasize the benefits of linkage that would be unrealized if consent is not given. Kreuter, Sakshaug, and Tourangeau (2016) experimented with this strategy in a U.S. telephone study by randomly assigning respondents to a gain-framing condition and a loss-framing condition. In the gain-framing condition, respondents were told that the information that they provided in the survey would be “a lot more valuable” if it could be linked to administrative records. In the loss-framing condition, respondents were instead told that their survey information would be “much less valuable” if it could not be linked to records. In line with the social psychological literature on gain and loss framing (Kahneman & Tversky, 1979, 1984), the loss-framing condition yielded a higher consent rate (66.8 percent) than the gain-framing condition (56.1 percent). However, attempts to replicate these results in telephone studies in Germany have yielded no statistically significant differences between gain- and loss-framing conditions (Kreuter, Sakshaug, Schmucker, Singer, & Couper, 2015; Sakshaug, Wolter, & Kreuter, 2015).

3. Research Gaps and Hypotheses

The above literature review suggests that the strategy of emphasizing the benefits of record linkage to survey respondents does not consistently improve linkage consent rates over a neutral framing strategy. However, the evidence base from which this conclusion is drawn is based on a very small number of studies. Even fewer studies of actual linkage consent have experimentally tested the effectiveness of a benefit framing argument against a neutral comparison group (Sakshaug & Kreuter, 2014; Sakshaug et al., 2013). A limitation of omitting a neutral framing condition is that even if no differences are observed between two (or more) benefit framing arguments, both arguments could still be more effective than no benefit argument. Although the two studies cited above included a control group in their experiments, they considered only one benefit argument (time savings). To date, no linkage consent experiment has simultaneously tested multiple benefit framing arguments with comparison to a neutral control group.

Furthermore, no study to date has investigated whether linkage consent rates might be improved by tailoring the wording of the linkage consent question to attributes of individual respondents. In line with the general survey participation literature (Groves et al., 2000), it is highly plausible that different respondents place different importance on features of the linkage consent request. The survey introduction or questions asked prior to the linkage consent question might then take the role that the doorstep interaction takes in the process of gaining cooperation to help identify who can be persuaded by which argument (Groves & McGonagle, 2001). Thus, concerning the two experimental conditions tested in the present study (time savings and improved value of the study) some past findings about tailoring the survey request or refusal conversion might carry over to the linkage consent request (Groves & Couper, 1998; Morton-Williams, 1993).

For example, the argument of time savings should be attractive for those who are busy. There is evidence that busyness (or claiming to be busy) is related to survey nonresponse (Bates, Dahlhamer, & Singer, 2008; Vercruyssen, Roose, & van de Putte, 2011; Vercruyssen, van de Putte, & Stoop, 2011) and rushing through the questionnaire (Dahlhamer, Simile, & Taylor, 2008), and one of the standard arguments of interviewers is to emphasize that the interview will not take long (Groves & McGonagle, 2001). Thus, we assume that the benefit argument of time savings should work best for people who are busy. The argument of an improved value of the study should be more likely to attract those who have a high interest in the study producing high-quality data. Independent of the study topic, this is generally assumed for people with high education: they are more likely to participate in scientific surveys, and this is attributed to their increased commitment to the value of scientific investigation (Goyder, 1985). Thus, highly educated respondents should be more likely to respond to the increased study value argument. Additionally, persons who might personally benefit from the study results might be more likely to see the value of a higher quality study. The study at hand is about workers’ rights. To respondents, it is communicated as being about working conditions, working hours, and working time requirements. Thus, respondents who experience bad working conditions are likely to have a self-interest in the study being of improved value.

In this article, we address these research gaps and contribute to the relatively scarce literature on inducing linkage consent through benefit framing. The present study reports the results of a linkage consent experiment embedded within an employee survey on workers’ rights in which respondents were randomly allocated to one of three conditions, including two conditions which highlight a particular benefit of record linkage (time savings and improved value of the study) and a neutral control condition. Specifically, we examine whether there is a potential advantage of emphasizing different benefits of linkage on the linkage consent rate and whether subgroups of respondents that can be identified from their previous responses react differently to the benefits that are emphasized.

Based on the above arguments, the following hypotheses are tested:

  1. The time savings benefit argument should lead to an overall higher linkage consent rate than the neutral condition that does not specify a particular benefit of linkage;

  2. The improved study value benefit argument should lead to an overall higher linkage consent rate than the neutral condition that does not specify a particular benefit of linkage;

  3. The time savings argument should have a more positive effect on busy respondents;

  4. The improved study value argument should have a more positive effect for respondents with a high level of education and for those with a self-interest in the study being of improved value.

4. Data and Methods

4.1. Survey Data Collection

The linkage consent experiment was implemented on a sample of employees who participated in the “Diversity of Employment Relationships” (abbreviated as VA for “Vielfalt der Arbeitsverhältnisse”) survey. The VA survey was an employer-employee survey conducted in Germany and jointly sponsored by the German Federal Ministry of Labor and Social Affairs (BMAS) and the Institute for Employment Research (IAB) in Nuremberg. Like many other countries, Germany has adopted various measures of labor market deregulation. As a consequence, since the 1980s, the share of flexible, non-standard employment has increased strongly. There is a controversial debate on the benefits and risks of non-standard employment for individuals. On the one hand, non-standard jobs can be beneficial for unemployed individuals. On the other hand, there is ample evidence that non-standard jobs provide comparatively unfavorable employment conditions. However, more research is needed into equal treatment regarding basic employment rights. The VA survey therefore aimed at providing new data to address the question of whether standard and non-standard workers are treated equally with regard to paid sick leave, paid vacation, and paid public holidays, as prescribed by German labor law.

The sample for the VA survey was drawn from an IAB employment database containing the universe of all German workers who are liable to social security contributions. Nontarget groups, such as apprentices, workers in private households, and a few other worker groups (e.g. family workers) were removed from the sampling frame. Workers of the temporary agency worker industry were further removed as the data did not allow identifying the establishment where the temporary agency worker is actually working (which would be crucial for the employer-employee-design). Workers of extraterritorial organizations (e.g. embassies) have also been removed as the conditions for the implementation of the applicable labor law can be very different due to the diplomatic privilege. Finally, the database was restricted to workers from establishments with a minimum of 11 employees who are social security contributors.2

The IAB database contains an establishment identifier. Thereby, a random sample of 3,003 establishments was drawn from the IAB employment database on the reference date 31st December 2012. Establishments were drawn with probability-proportional-to-size (PPS) sampling with the number of employees used as the measure of size. Thus, establishments employing a larger number of employees were sampled with higher probabilities of selection compared to establishments with fewer employees. Within selected establishments, the workforce was stratified into four groups (marginal part-time worker, part-time worker, worker with fixed-term contracts, and other workers) and within each stratum a random sample of employees was then drawn from the IAB employment database and invited to take part in the employee survey. A total of 48,006 individuals were selected into the employee sample.
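The PPS selection step described above can be illustrated with a small sketch using systematic sampling on the cumulated measure of size. This is a minimal, hypothetical illustration with invented establishment sizes; it is not the IAB's actual sampling routine.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented employee counts (measure of size) for ten hypothetical establishments.
sizes = np.array([12, 250, 40, 1800, 95, 30, 600, 15, 410, 75])

def pps_systematic(sizes, n):
    """Systematic PPS: selection probability is proportional to establishment size."""
    cum = np.cumsum(sizes)            # cumulative size boundaries
    interval = cum[-1] / n            # sampling interval on the size scale
    start = rng.uniform(0, interval)  # random start within the first interval
    points = start + interval * np.arange(n)
    # Index of the establishment whose cumulative range contains each point.
    return np.searchsorted(cum, points)

sample = pps_systematic(sizes, n=3)
print(sample)  # indices of the three selected establishments
```

Note that with systematic PPS an establishment larger than the sampling interval can be hit more than once; production routines handle such certainty selections separately.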

Data collection was conducted between November 2013 and April 2014 by the survey institute infas. All interviews were conducted by computer-assisted telephone interviewing (CATI). The IAB employment database contains names, addresses and, to some extent, telephone numbers for employees. The contact information was enriched by infas on the basis of their own commercial directory research. Telephone numbers could be found for about 69 percent of the sample. Valid interviews were conducted with a total of 7,561 employees from 1,110 participating establishments. This yielded a raw employee response rate of 15.8 percent (Response Rate 1; American Association for Public Opinion Research, 2016). However, the employee survey used a screening procedure to exclude persons who were not employed at the time of the survey.3 Therefore, not all of the workers in the sampling frame were actually eligible to take part in the survey. Hence, the study also reported an American Association for Public Opinion Research (2016) Response Rate 3 of 24.7 percent, which accounts for the screening process. The response rate is similar to those of other telephone surveys of employed populations in Germany (e.g. Apel et al., 2012; Eckman et al., 2014). Full details of the data collection procedures and outcomes can be found in the methods report (Schütz, Harand, Kleudgen, Aust, & Weißpflug, 2014, available upon request).4

The interview took, on average, thirty minutes to complete and included questions on several topics related to, among others, employment rights, (desired) working hours, and general working conditions.

4.2. Experimental Design

All respondents were asked for explicit consent to link their survey answers to federal employment records of the IAB.5 The consent request was administered after having asked several, rather general questions about the worker’s employment relationship (after the first third of the questionnaire). Each respondent was read the following statement as part of the linkage consent request [English translation; see appendix for original German version].

“We would like to include in the evaluation of the survey, extracts from data that are available at the Institute for Employment Research of the Federal Employment Agency in Nuremberg. For example, information about previous periods of employment and unemployment. For the purpose of merging this data to the interview data I would like to ask you for your consent. It is absolutely certain that all data protection regulations are strictly adhered to. Your consent is of course voluntary. You can also withdraw it at any time. Do you agree?”

A linkage consent experiment was carried out to determine whether prefacing the above statement with a statement highlighting a specific benefit of record linkage would improve the linkage consent rate. Respondents were randomly allocated with equal probability to one of three experimental conditions: 1) time savings; 2) improved study value; and 3) control. Respondents assigned to the time savings condition (n = 2,580) were read the following statement immediately prior to being read the standard consent statement (above): “To keep the interview as short as possible … ” Respondents assigned to the improved study value condition (n = 2,417) were read a different prefacing statement emphasizing the improved value of the study if linkage were to occur: “The informative value of this study can be significantly improved if we can supplement your information with additional data.” Respondents allocated to the control condition (n = 2,564) were simply read the above standard consent statement without any prefacing words.
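The equal-probability allocation to the three arms can be sketched as follows. This is a minimal illustration; the condition labels and respondent IDs are ours, not taken from the survey software.

```python
import random
from collections import Counter

random.seed(1)  # fixed seed so the sketch is reproducible

# The three arms of the experiment (labels are ours).
conditions = ["time_savings", "improved_study_value", "control"]

def assign(respondent_ids):
    """Assign each respondent to one arm with equal probability."""
    return {rid: random.choice(conditions) for rid in respondent_ids}

allocation = assign(range(7561))  # 7,561 respondents, as in the VA survey
print(Counter(allocation.values()))  # roughly equal group sizes
```

Simple random assignment like this yields group sizes that vary by chance around n/3, which is consistent with the slightly unequal arm sizes reported above.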

4.3. Operationalization and Statistical Analysis

We hypothesize that both benefit framing conditions should motivate respondents to consent to linkage at a higher rate than respondents in the control condition (hypothesis 1 for the time savings argument and hypothesis 2 for the improved study value argument). In order to test this hypothesis (positive main effect of the time savings and improved study value arguments) it is sufficient to compare linkage consent rates between treatment groups.

For hypotheses 3 and 4, the hypothesized constructs are operationalized as follows. For hypothesis 3, busyness is measured using two indicators collected in the survey: actual weekly working hours and number of children under the age of 14 (Vercruyssen, Roose, Carton, & van de Putte, 2014; Vercruyssen, Roose, & van de Putte, 2011; Vercruyssen, van de Putte, & Stoop, 2011). The original survey questions can be found in the appendix. For hypothesis 4, education is measured using three levels of general school degrees in Germany (low, intermediate, high).6 Having a self-interest in the improved value of the study is measured by the number of worker’s rights that are withheld by the employer.7 Although the linkage consent question was asked before the respondents knew about the workers’ rights questions, the introduction presented at the beginning of the interview made it clear that this topic would be covered in the study.

“Our study ‘Diversity of Working Conditions’—a survey of employees in Germany—examines how different the working conditions, working hours and working time requirements of employees in German companies are. Above all, we are interested in the personal assessments and experiences of the employees. I would like to interview you now.”

In order to address hypotheses 3 and 4 the treatment effect for each of the two benefit framing conditions is examined relative to the control condition for the subgroup variables of interest. However, as these additional variables are unconfounded with the treatment, but potentially confounded with each other, a logistic regression model of linkage consent on the treatment variable, the hypothesized subgroup variables, and interaction terms between the treatment and these subgroup variables is fitted. The regression model also includes some additional control variables. The interaction terms will inform us whether there are subgroup differences in the treatment effects.

The logistic regression model can be expressed as follows.

\[
\ln\!\left(\frac{\pi_i}{1-\pi_i}\right) = \alpha + \beta X_i + \lambda Y_i + \delta Z_i + \Omega X_i Z_i
\]

where πi is the conditional probability of consent to data linkage for respondent i, α is the model intercept, β is the vector of model parameters for the experimental covariates Xi (time savings, improved study value, and control group), λ is the vector of model parameters for the control covariates Yi (age, sex, and immigrant background), δ is the vector of model parameters for the hypothesized indicator covariates Zi (education, number of worker’s rights withheld, actual weekly working hours, and number of children), and Ω is the vector of model parameters for the interaction between the experimental covariates Xi and the hypothesized indicator covariates Zi.
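A model of this form could be fitted, for instance, with `statsmodels` in Python. The sketch below uses synthetic data; all variable names and category labels are our illustrative stand-ins, and the paper's actual estimation used Stata's survey commands with design-based standard errors.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Synthetic data standing in for the survey; names and categories are ours.
df = pd.DataFrame({
    "consent": rng.integers(0, 2, n),
    "condition": rng.choice(["control", "time_savings", "improved_value"], n),
    "hours": rng.choice(["le_25", "h25_39", "h40_plus"], n),
    "children": rng.choice(["none", "one", "two_plus"], n),
    "education": rng.choice(["low", "intermediate", "high"], n),
    "rights_withheld": rng.choice(["none", "one_plus"], n),
    "age_group": rng.choice(["a15_32", "a33_45", "a46_53", "a54_84"], n),
    "female": rng.integers(0, 2, n),
    "immigrant": rng.integers(0, 2, n),
})

# Logit of consent on treatment (X), controls (Y), indicators (Z), and X-by-Z
# interactions, mirroring ln(pi/(1-pi)) = a + bX + lY + dZ + O*XZ above.
model = smf.logit(
    "consent ~ C(condition) * (C(hours) + C(children) + C(education)"
    " + C(rights_withheld)) + C(age_group) + female + immigrant",
    data=df,
).fit(disp=0)

print(model.params.filter(like=":"))  # the interaction coefficients of interest
```

The `*` operator in the formula expands to main effects plus interactions, so the interaction coefficients printed at the end correspond to the Ω terms that carry the subgroup hypotheses.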

All analyses were performed using the survey (svy) commands in Stata/MP 14.2 and account for weighting,8 stratification, and clustering of employees within interviewers and establishments.

5. Results

Appendix Table 1 shows the compositional distribution of respondents allocated to each experimental condition. There are no statistically significant differences between the experimental conditions with respect to the distribution of the following respondent characteristics: age (in years; 15–32, 33–45, 46–53, 54–84), sex, immigrant background (1st/2nd generation), and the hypothesized indicator variables: actual weekly working hours (25 or less, 25–39, 40 or more), number of children (0, 1, 2 or more), education (low, intermediate, high), and number of worker’s rights withheld (0, 1 or more). This suggests that the random assignment procedure yielded balanced samples.
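Balance checks of this kind amount to chi-square tests of independence between each characteristic and the experimental condition. A minimal sketch follows; the row counts are invented for illustration, except that the column totals mirror the reported group sizes (2,564 / 2,580 / 2,417).

```python
# Hypothetical 2 x 3 balance-check table: a binary respondent characteristic
# (rows) by experimental condition (columns). Counts are illustrative only.
observed = [
    [850, 860, 810],
    [1714, 1720, 1607],
]

def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for an r x c table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = sum(
        (obs - row_totals[i] * col_totals[j] / grand) ** 2
        / (row_totals[i] * col_totals[j] / grand)
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

stat, dof = chi_square(observed)
print(stat, dof)  # a small statistic here indicates balanced groups
```

A statistic well below the chi-square critical value (5.99 at df = 2, α = 0.05) would, as in the paper, indicate no detectable imbalance across conditions.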

5.1. Linkage Consent Rates by Subgroup

A total of 6,370 out of 7,561 respondents consented to record linkage for an overall linkage consent rate of 84.7 percent (weighted). Similar consent rates have been reported in other IAB-sponsored surveys (Bender, Fertig, Gorlitz, Huber, & Schmucker, 2008; Christoph et al., 2008; Sakshaug et al., 2013). Table 1 shows overall linkage consent rates for each respondent subgroup. Linkage consent rates are similar within most subgroups with the exceptions of sex, education, and actual working hours: linkage consent rates are statistically significantly higher among females, respondents with “Intermediate” or “High” levels of education, and those who work 40 or more hours per week.

Table 1.

Overall Linkage Consent Rates by Respondent Characteristics and Experimental Conditions.

                                Overall          Control          Time savings     Improved study value
Respondent characteristics      %       SE       %       SE       %       SE       %       SE
Age (in years)
 15–32 84.46 1.15 84.80 1.88 85.15 1.83 83.42 1.87
 33–45 83.38 1.03 81.40 1.93 85.39 1.53 83.30 1.72
 46–53 84.66 0.93 83.48 1.75 85.81 1.50 84.81 1.68
 54–84 86.38 0.95 85.78 1.63 87.59 1.66 85.73 1.65
Sex
 Female 86.42* 0.73 85.53* 1.15 87.80* 1.16 85.88 1.34
 Male 83.00* 0.73 82.03* 1.20 84.17* 1.10 82.84 1.22
Immigrant background
 Yes 83.06 1.11 81.69 2.22 84.79 1.89 82.63 1.88
 No 85.30 0.56 84.63 0.93 86.45 0.91 84.82 1.02
Hypothesized indicators
Actual weekly working hours
 25 or less 82.86 1.00 83.17 1.91 83.32 1.57 82.01* 1.69
 25 – 39 84.14 1.05 85.08 1.65 85.06 1.59 82.23* 1.73
 40 or more 85.78 0.67 83.34 1.25 87.77 1.14 86.28* 1.19
# of children
 0 85.57 0.62 85.72* 1.04 86.03 1.01 84.94 1.13
 1 83.14 1.35 82.68* 2.25 85.98 2.13 80.36 2.49
 2 or more 84.93 1.41 79.79* 2.53 90.39 1.84 83.99 2.46
Education
 Low 81.76* 1.22 81.22 2.08 81.27* 2.11 82.91 1.95
 Intermediate 85.11* 0.80 83.43 1.44 87.55* 1.20 84.25 1.34
 High 85.84* 0.79 85.37 1.34 86.99* 1.29 85.19 1.56
# of worker’s rights withheld
 0 85.04 0.54 83.58 0.97 86.50 0.88 85.08 0.96
 1 or more 83.30 1.50 83.14 2.30 83.99 2.36 82.74 2.24

N 7,561 2,564 2,580 2,417
* p < 0.05, two-sided, chi-square test

5.2. Linkage Consent Rates by Experimental Conditions

We hypothesized that respondents who received either of the two benefit wording statements would consent to linkage at a higher rate than respondents who did not receive a benefit wording statement (control). Linkage consent rates for each experimental condition are shown in Figure 1. The “time savings” condition (86.0 percent) yields the highest consent rate overall, followed by the “improved study value” condition (84.4 percent), and the control condition (83.8 percent). The consent rate difference between the time savings condition and the control condition is statistically significant (t(348) = 1.94; p = 0.027, one-sided). The consent rate of the improved study value condition, on the other hand, does not differ much from the consent rate of the control condition (t(344) = 0.53; p = 0.298, one-sided). Thus, these results provide modest support for hypothesis 1, but not hypothesis 2.

Figure 1. Linkage Consent Rates by Experimental Condition. Error Bars are 95% Confidence Intervals.
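As a rough plausibility check, the headline comparison can be approximated with a simple unweighted two-proportion z-test using the reported rates and group sizes. The paper's own tests are design-based t-tests accounting for weights, stratification, and clustering, so the exact statistics differ from this sketch.

```python
from math import sqrt
from statistics import NormalDist

def one_sided_z(p1, n1, p2, n2):
    """One-sided two-proportion z-test of H1: p1 > p2, using pooled variance."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 1 - NormalDist().cdf(z)

# Time savings (86.0 percent, n = 2,580) vs. control (83.8 percent, n = 2,564).
z, p = one_sided_z(0.860, 2580, 0.838, 2564)
print(round(z, 2), round(p, 3))
```

The unweighted approximation also lands in significant territory at the 5-percent level, consistent with the design-based result reported above for the time savings condition.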

5.3. Treatment Effects Across Respondent Subgroups

This section now examines the possibility that benefit framing may differentially affect willingness to consent for specific respondent subgroups relevant to hypotheses 3 and 4. Figure 2 shows treatment effects (deviations in consent rates between the two experimental conditions and the control condition) across the respondent subgroups. Similar to the overall consent rate results presented in Section 5.2, the figure shows that the magnitude of the effect of the two experimental groups is generally small and close to zero for most of the relevant subgroups. The largest effects can be seen for the time savings argument: respondents with two or more children and respondents who work at least 40 hours per week—both hypothesized indicators of busyness—are more likely to be persuaded by the time savings argument. Thus, there is some evidence that tailoring the linkage consent request for these subgroups may increase respondents’ willingness to give consent. No such effect is seen for the improved study value argument.

Figure 2. Consent Rate Deviations Between the Two Benefit Framing Conditions and the Control Condition for Respondent Subgroups.

5.4. Linkage Consent Model with Experimental Condition and Subgroup Interactions

As previously noted, looking only at the main effects of the experimental conditions on the indicators of the hypothesized constructs ignores the potential confounding of the indicator variables with each other. To mitigate the effects of confounding and formally test hypotheses 3 and 4, a logistic regression model of linkage consent on control variables and the interaction between the experimental conditions and indicators of the two hypothesized constructs (busyness and self-interest in the improved value of the study) is presented. The focus will thus be on the four interaction effects (two indicators for each argument).

The results, presented in Table 2, show a positive interaction between the time savings argument and both busyness indicators. That is, respondents who work forty or more hours per week or have two or more children are more likely to consent to record linkage when presented with the time savings argument. These results are consistent with the notion that the time savings argument has a more positive effect on busy respondents, i.e. hypothesis 3 is supported.

Table 2.

Logistic Regression of Linkage Consent on Respondent Subgroups and Interactions Between Experimental Conditions and Respondent Subgroups.

                    Main effect terms    Interaction terms
                                         Time savings    Improved study value
Covariates          Coef.   SE           Coef.   SE      Coef.   SE
Experimental conditions
 Control ref. - - - - -
 Time savings −0.50 0.30 - - - -
 Improved study value −0.26 0.31 - - - -
Control variables
Age (in years)
 15–32 ref. - - - - -
 33–45 0.002 0.14 - - - -
 46–53 0.08 0.15 - - - -
 54–84 0.21 0.15 - - - -
Sex
 Male ref. - - - - -
 Female 0.29* 0.11 - - - -
Immigrant background
 Yes −0.15 0.11 - - - -
 No ref. - - - - -
Hypothesized indicators
Actual weekly working hours
 25 or less ref. - ref. - ref. -
 25–39 0.14 0.22 0.04 0.30 −0.20 0.30
 40 or more −0.16 0.20 0.56* 0.28 0.44 0.27
# of children
 0 ref. - ref. - ref. -
 1 −0.21 0.20 0.20 0.29 −0.12 0.27
 2 or more −0.38 0.20 0.91* 0.29 0.42 0.29
Education
 Low ref. - ref. - ref. -
 Intermediate 0.10 0.21 0.51* 0.26 0.25 0.30
 High 0.31 0.21 0.11 0.27 0.05 0.30
# of worker’s rights withheld
 0 ref. - ref. - ref. -
 1 or more −0.05 0.23 0.07 0.29 −0.19 0.31
 Intercept term 1.49* 0.26 - - - -
*

p < 0.05, two-sided

Turning now to hypothesis 4, the model results show no statistically significant interaction between the improved study value condition and education; thus, there is no evidence to support the notion that the improved study value argument has a positive effect on the most highly educated respondents. Similarly, there is no statistically significant interaction between the improved study value argument and respondents whose employee rights are being withheld or who do not work their preferred hours.
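To make the interaction structure of Table 2 concrete, the following sketch plugs the tabled point estimates into the logistic model for an illustrative respondent profile. It deliberately omits the remaining covariates and is not the authors' estimation code; the helper names are ours:

```python
import math

def consent_prob(beta, x):
    """Logistic model: P(consent) = 1 / (1 + exp(-x'beta))."""
    return 1.0 / (1.0 + math.exp(-sum(b * v for b, v in zip(beta, x))))

def row(time_savings, study_value, hours40, kids2):
    """Intercept, condition dummies, busyness indicators, and the four
    condition-by-indicator interaction terms shown in Table 2."""
    return [1.0, time_savings, study_value, hours40, kids2,
            time_savings * hours40, time_savings * kids2,
            study_value * hours40, study_value * kids2]

# Point estimates from Table 2 (other covariates omitted for illustration)
beta = [1.49, -0.50, -0.26, -0.16, -0.38, 0.56, 0.91, 0.44, 0.42]

# A "busy" respondent (40+ hours, 2+ children) under control vs. time savings
busy_control = consent_prob(beta, row(0, 0, 1, 1))
busy_time = consent_prob(beta, row(1, 0, 1, 1))
```

For this profile the positive interactions outweigh the negative main effect of the time savings condition, so the predicted consent probability rises (roughly 0.72 to 0.87), mirroring hypothesis 3.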

6. Discussion

For researchers interested in collecting survey data and linking them to administrative data, it is useful to consider design strategies that optimize the linkage consent rate and minimize the risk of non-consent bias. Designing the linkage consent request in a way that highlights particular benefits of record linkage to survey respondents is one relatively straightforward strategy for this purpose. The present study found mixed success for this strategy in terms of improving respondents' willingness to give linkage consent. Specifically, the study showed that respondents to a workers' rights survey who were presented with a time savings argument were more likely to consent to record linkage than those given a neutral request. However, emphasizing a separate benefit related to the improved value of the study did not improve respondents' likelihood of consent. A small interview monitoring exercise suggested that these findings were unlikely to have been caused by deviant interviewer behavior (see appendix for details).

In addition, the study revealed statistically significant interactions of benefit framing with particular subgroups that could be utilized for tailoring the consent request. Specifically, the time savings argument had a positive effect on “busy” respondents: those who work at least 40 hours per week and/or have multiple children were more likely to give consent when presented with it. Contrary to our expectations, the improved study value argument had no effect on respondents who experience a withholding of their worker's rights and would be most likely to personally benefit from the improved value of the study. The study also found no effect of the improved study value argument on respondents with the highest level of education, whom we hypothesized would be more receptive to the idea of contributing to the value of a scientific study.

While these findings bring useful insights to survey practitioners, there are a few study issues worth discussing. For instance, the experiment was embedded within a low response rate survey, which may have contributed to the high overall consent rate. Although the response rate is similar to that of other telephone surveys of employed populations in Germany (e.g. Apel et al., 2012; Sakshaug et al., 2013), the respondents are likely to be more cooperative with the consent request than the non-respondents would have been had they been posed the consent question. Busyness is one of the reasons why people choose not to participate in surveys (Bates et al., 2008; Vercruyssen, Roose, & van de Putte, 2011; Vercruyssen, van de Putte, & Stoop, 2011); thus, while a higher response rate would be expected to coincide with a lower consent rate, we would also expect the effect of the time savings argument to be stronger if busier and generally less cooperative people were recruited into the survey. The estimated effect of the time savings argument reported in this study may therefore be considered a lower bound of the actual phenomenon. In contrast, people concerned with improving the value of scientific studies are likely to cooperate with scientific survey requests anyway, so we would not expect the null effect of the improved study value argument to change given a higher response rate.

A second issue is that multiple hypotheses have been tested on the same dataset. Although only three hypotheses have been put forward, six indicators have been tested for significance to reject or support them. This multiple testing increases the probability of rejecting at least one null hypothesis by chance (alpha error). Formal methods have been developed to correct for multiple testing (e.g. the Bonferroni correction; Dunn, 1961), but these substantially decrease the statistical power of each single test, which is an issue given the number of cases available. Rather than applying the Bonferroni correction, we note that while both indicators (education, withheld worker's rights) lack significance for the improved study value argument, both indicators of busyness (2+ children, 40+ working hours) show a significant effect for the time savings argument. This is a consistent pattern that is very unlikely to be the result of an alpha error. If the strict Bonferroni correction is nevertheless applied (dividing the p-level by the number of hypotheses), only one of the two effects (2+ children) retains significance.
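The correction can be sketched as follows, approximating two-sided p-values for the four Table 2 interaction estimates with a Wald z-test (a normal approximation, so the values may differ slightly from the model output):

```python
import math

def wald_p(coef, se):
    """Approximate two-sided p-value from a Wald z-test (normal approximation)."""
    z = abs(coef / se)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

# Interaction estimates (coef, SE) from Table 2
tests = {
    "time savings x 40+ hours":   (0.56, 0.28),
    "time savings x 2+ children": (0.91, 0.29),
    "study value x 40+ hours":    (0.44, 0.27),
    "study value x 2+ children":  (0.42, 0.29),
}
alpha = 0.05
# Bonferroni: compare each p-value against alpha divided by the number of tests
survives = {name: wald_p(c, s) < alpha / len(tests) for name, (c, s) in tests.items()}
```

Under this approximation, only the time savings by 2+ children interaction survives the corrected threshold, in line with the pattern reported above.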

Although benefit framing has been shown to have a positive effect on attitudes towards data sharing in hypothetical scenarios (Bates, Wroblewski, & Pascale, 2012), its effects have been lackluster in actual linkage applications, particularly in telephone studies (e.g. Kreuter et al., 2015; Sakshaug et al., 2013). The present study provides another data point suggesting that emphasizing the benefits of record linkage is unlikely to have a dramatic impact on respondents' decision-making when considering the linkage request. However, the benefit framing effects observed for specific subgroups point to a possible strategy of tailoring the linkage consent request to specific individuals. Just as surveys tailor their recruitment strategies to address people who cite time pressures and “busyness” as reasons for not participating in survey research (Olson, Lepkowski, & Garabrant, 2011), one could emphasize the time savings argument for respondents who have previously been identified as “busy” at the time of the survey request, or on the basis of answers to earlier survey questions.

A possible extension of this research relates to the optimal delivery of the benefit argument by interviewers during the linkage consent request. One plausible explanation for the small (less than 3 percentage point) main effect of benefit framing is inadequate salience of the benefit statement during its delivery. The benefit statement was read to respondents at the very beginning of the consent request and was followed by a longer statement describing the administrative data, data protection aspects, and the voluntary nature of the request. The salience of the benefit argument could therefore have diminished over the course of delivering the full statement and lost its influence by the time respondents were prompted for an answer. Furthermore, interviewers were not instructed to add any vocal emphasis when reading the benefit statements, which could have further reduced their salience. Experimenting with different levels of vocal emphasis when delivering the benefit argument, as well as with the proximity of the benefit argument to the point at which respondents are asked for an answer, are both topics for future research (see e.g. Sakshaug, Schmucker, Kreuter, Couper, & Singer, 2019).

The effect of consent framing in self- versus interviewer-administered modes is another relevant topic for future research. On the one hand, self-administered modes, such as web, ensure that the entire framing argument is presented to respondents, whereas in interviewer-administered modes there is no assurance that the argument is read or emphasized as intended. On the other hand, respondents in self-administered surveys may be less likely to read the complete consent statement. Thus, benefit framing arguments may be more (or perhaps less) salient to respondents under alternative survey modes.

Appendix A. Fieldwork materials

Linkage Consent Statement and Question Administered to All Respondents

“Wir würden gerne bei der Auswertung der Befragung Auszüge aus Daten einbeziehen, die beim Institut für Arbeitsmarkt- und Berufsforschung der Bundesagentur für Arbeit in Nürnberg vorliegen. Dabei handelt es sich zum Beispiel um Informationen zu vorausgegangenen Zeiten der Beschäftigung und der Arbeitslosigkeit.

Zum Zweck der Zuspielung dieser Daten an die Interviewdaten möchte ich Sie herzlich um Ihr Einverständnis bitten. Dabei ist absolut sichergestellt, dass alle datenschutzrechtlichen Bestimmungen strengstens eingehalten werden. Ihr Einverständnis ist selbstverständlich freiwillig. Sie können es auch jederzeit wieder zurückziehen.

Sind Sie damit einverstanden?”

Benefit Arguments Administered to Respondents Assigned to the Treatment Conditions

  • Time savings.

“Um das Interview im Folgenden möglichst kurz zu halten… ” [“In order to keep the following interview as short as possible… ”]

  • Improved study value:

“Die Aussagekraft dieser Studie lässt sich deutlich verbessern, wenn wir Ihre Angaben mit weiteren Daten ergänzen können.” [“The informative value of this study can be considerably improved if we can supplement your responses with additional data.”]

Survey Questions Used in Analysis

  1. Number of children under the age of 14:
    “Und nun zur Größe Ihres Haushaltes. Wie viele Personen leben in Ihrem Haushalt, Kinder und Sie selbst mit eingeschlossen? Dazu zählen auch Personen, die normalerweise im Haushalt wohnen, aber vorübergehend abwesend sind, aus beruflichen oder persönlichen Gründen. Nicht dazu zählen dagegen Mitbewohner aus Wohngemeinschaften, Untermieter oder Hausangestellte.”
    Anzahl der Personen □□
    [If Filter Q > 1] “Und wie viele Personen davon sind Kinder unter 14 Jahren?”
  2. Actual weekly working hours:
    “Und wie viele Stunden arbeiten Sie im Durchschnitt tatsächlich pro Woche? Bitte berücksichtigen Sie nun auch regelmäßig geleistete bezahlte und unbezahlte Überstunden, Mehrarbeit, Vor- und Nacharbeitszeiten, Bereitschaftsdienste sowie Arbeit von zu Hause oder unterwegs.”

    Stunden □□□

  3. Education:
    “Welchen höchsten allgemeinbildenden Schulabschluss haben Sie?”
    • Keinen Abschluss beziehungsweise noch keinen Abschluss

    • Hauptschulabschluss, Volksschulabschluss

    • Mittlere Reife, Realschulabschluss, Fachschulreife, POS

    • Fachhochschulreife, Abschluss einer Fachoberschule

    • Abitur, Hochschulreife, EOS, Berufsausbildung mit Abitur

  4. Age:
    “Bevor wir mit dem eigentlichen Interview beginnen, sagen Sie mir bitte zunächst, wann Sie geboren sind! Nennen Sie mir dazu bitte den Monat und das Jahr.”

    Monat □□

    Jahr □□□□

  5. Immigrant background (1st/2nd generation):
    “Sind Sie in Deutschland geboren?”
    • Ja

    • Nein
      “Ist Ihr Vater in Deutschland geboren? ”
    • Ja

    • Nein
      “Ist Ihre Mutter in Deutschland geboren?”
    • Ja

    • Nein

  6. Number of workers’ rights withheld by employer:

    This variable was generated using three survey questions (see below) referring to paid vacation, paid sick leave, and paid public holidays, which are all legal rights of employees in Germany. For each question, a binary indicator was generated which indicated whether a specified worker right was illegally withheld (1) or not (0) from the respondent. An additive index of the number of workers’ rights withheld was then generated by summing the three binary indicator variables. More details about the variable generation process can be found in Fischer et al. (2015, p.319).

    1. Paid vacation
      “Erhalten Sie bezahlten Urlaub?”
      • Ja

      • Nein

        [If Filter Q = “Nein”]
        “Aus welchem Grund erhalten Sie keinen bezahlten Urlaub? Sollten mehrere der folgenden Gründe auf Sie zutreffen, nennen Sie uns bitte den Hauptgrund. Erhalten Sie keinen bezahlten Urlaub … ”
      • Weil Sie noch nicht lange genug im Betrieb beschäftigt sind

      • Weil Ihnen in Ihrer Tätigkeit kein bezahlter Urlaub zusteht

      • Weil die Personal- oder Auftragslage momentan keinen Urlaub zulässt.

      • Weil die Personal- oder Auftragslage dauerhaft keinen Urlaub zulässt.

    2. Paid sick leave:
      “Wenn Sie sich bei Ihrem Arbeitgeber krank melden, welche Auswirkungen hat dies? Ich lese Ihnen verschiedene Möglichkeiten vor, bitte sagen Sie mir welche für Sie zutreffen”
      • Sie bekommen für die Zeit der Krankheit Ihren regulären Lohn

      • Sie müssen die Zeit der Krankheit unentgeltlich nacharbeiten

      • Sie müssen die Zeit der Krankheit über Ihr Arbeitszeit- oder Urlaubskonto ausgleichen

      • Sie müssen eine Vertretung organisieren, die Ihre Arbeit übernimmt

    3. Paid public holidays
      “Was geschieht, wenn einer Ihrer Arbeitstage auf einen gesetzlichen Feiertag fällt und Sie deshalb arbeitsfrei haben? Ich lese Ihnen wieder mehrere Antwortmöglichkeiten vor.”
      • Sie erhalten für den Tag ganz normal Ihren Lohn

      • Die Arbeitszeit wird Ihnen wie an einem Arbeitstag gutgeschrieben

      • Sie müssen die ausgefallene Arbeitszeit unentgeltlich vor- oder nacharbeiten

      • Die ausgefallene Arbeitszeit wird von Ihrem Arbeitszeit- oder Urlaubskonto abgezogen.
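A minimal sketch of the additive index described under item 6 (the function names are ours; the analysis dichotomizes the index into 0 versus 1 or more):

```python
def rights_withheld_index(vacation_denied, sick_leave_denied, holiday_denied):
    """Number of the three legal worker's rights illegally withheld (0-3)."""
    return int(vacation_denied) + int(sick_leave_denied) + int(holiday_denied)

def any_right_withheld(vacation_denied, sick_leave_denied, holiday_denied):
    """Binary version used in the analysis: 0 vs. 1 or more rights withheld."""
    return rights_withheld_index(vacation_denied, sick_leave_denied, holiday_denied) >= 1
```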

Appendix B. Compositional Distribution of Respondents Allocated to Each Experimental Condition

Table B1.

Distribution of Respondent Characteristics Within Experimental Conditions.

Respondent characteristics Experimental conditions
Control
Time savings
Improved study value
% SE % SE % SE
Control variables
Age in years (χ2 = 9.298; df = 6; p = 0.368)
 15–32 17.01 0.83 18.83 0.88 19.21 0.83
 33–45 26.32 1.03 27.17 1.02 27.70 1.09
 46–53 30.77 1.04 28.43 1.25 27.73 1.16
 54–84 25.90 1.06 25.58 1.10 25.36 1.06
Sex (χ2 = 1.850; df = 2; p = 0.491)
 Female 49.32 1.29 51.19 1.15 49.97 1.12
 Male 50.68 1.29 48.81 1.15 50.03 1.12
Immigrant background (χ2 = 2.926; df = 2; p = 0.329)
 Yes 18.21 0.84 19.31 0.84 20.11 1.05
 No 81.79 0.84 80.69 0.84 79.89 1.05
Hypothesized indicators
Actual weekly working hours (χ2 = 0.834; df = 4; p = 0.951)
 25 or less 19.67 0.94 19.61 0.83 19.22 0.86
 25–39 25.15 1.01 24.85 0.98 25.92 1.07
 40 or more 55.17 1.31 55.54 1.19 54.85 1.36
# of children (χ2 = 10.920; df = 4; p = 0.083)
 0 71.64 1.06 67.78 1.12 68.60 1.21
 1 15.75 0.91 17.71 0.98 16.12 0.96
 2 or more 12.61 0.85 14.51 0.87 15.27 0.95
Education (χ2 = 10.558; df = 4; p = 0.135)
 Low 20.11 0.93 19.77 0.86 19.17 0.97
 Intermediate 39.55 1.22 40.03 1.14 36.70 1.32
 High 40.34 1.22 40.20 1.17 44.13 1.38
Number of worker’s rights withheld (χ2 = 0.820; df = 2; p = 0.735)
 0 84.76 0.83 84.24 0.84 83.79 0.92
 1 or more 15.24 0.83 15.76 0.84 16.21 0.92

N 2,564 2,580 2,417
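The chi-square statistics in Table B1 test whether respondent composition differs across the experimental conditions. As a sketch, the Pearson statistic for the sex distribution can be reproduced approximately by converting the reported percentages and column Ns back into counts (rounding makes the result approximate; `chi2_stat` is our own helper):

```python
def chi2_stat(table):
    """Pearson chi-square statistic for an r x c contingency table of counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Sex by condition: counts approximated as percentage x column N from Table B1
sex = [[1265, 1321, 1208],   # female
       [1299, 1259, 1209]]   # male
stat = chi2_stat(sex)        # ~1.85 on (2-1)(3-1) = 2 df, close to the 1.850 reported
```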

Appendix C. Behavior Coding of Interviewer-Respondent Interaction

To explore the possibility that interviewers failed to read the benefit arguments (or other parts of the consent statement) exactly as scripted in the questionnaire, we implemented a small interviewer monitoring exercise in which three study investigators visited the infas call center on four separate days (November 5th, 6th, 25th, and December 9th) and listened to a set of interviews being conducted in real time while simultaneously coding specific behaviors of the respondent-interviewer interaction. Interviewers are routinely monitored for quality control purposes, but none of the interviewers were made aware that they would be observed by one of the study investigators on a given day. The investigators coded whether interviewers read the linkage consent statements/question exactly as scripted in the questionnaire, and respondents were coded on their reaction to the consent statements/question. A total of 49 interviews were observed and coded. Given the small sample size, our intention is not to provide conclusive and generalizable estimates of behaviors, but rather a basic qualitative impression of whether the telephone interaction adversely affected the implementation of the consent experiment.

The results of the interviewer coding exercise are shown in Appendix Table C1. Overall, we do not find strong evidence of interviewer deviance with respect to reading the consent statement as instructed. Interviewers read the scripted consent statements/question verbatim in more than two-thirds (n = 35) of interviews, and only one of the script deviations affected the benefit statement. These (albeit limited) results provide some reassurance that the benefit wording experiment was not compromised by a failure to read the benefit statement as scripted. Even when interviewers read the consent statements correctly, there remains the possibility that they intervened and provided non-neutral feedback that could have influenced respondents' decision to consent. In some (n = 9) instances respondents hesitated to answer the consent question and interviewers intervened with unscripted feedback, which included mentioning a non-scripted benefit of linkage. This deviant behavior led to a positive consent decision in about half (n = 5) of those interviews. Although these observations indicate that a small number of interviewers deviated from their instructions and provided non-neutral feedback to hesitant respondents, the behavior occurred evenly across the three experimental conditions (n = 3 per condition). Thus, we conclude from this small behavior coding exercise that interviewer deviance was unlikely to be a factor in the benefit framing experiment.

Table C1.

Observed Behaviors Coded During the Interviewer-Respondent Interaction.

n %
Experimental conditions
Control 21 42.9
Time savings 14 28.6
Improved study value 14 28.6
Interviewer delivery of consent question
Read script as worded 35 71.4
Changed script, but not benefit wording 13 26.5
Changed script and benefit wording 1 2.0
Respondent reaction to consent question
R provided answer without hesitation 31 63.2
R hesitated, Interviewer gave neutral feedback 9 18.4
R hesitated, Interviewer gave additional benefit 9 18.4

Total 49 100.0

Footnotes

1

Race, ethnic origin, politics, religion, and trade union membership are examples of special category data where explicit consent is advised, see also https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/ful-basis-for-processing/special-category-data/.

2

This threshold was chosen based on design reasons and content-related considerations given the needs of the German Federal Ministry of Labor and Social Affairs (BMAS).

3

The information in the IAB database refers to the situation of the workers approximately 11 months prior to the start of data collection.

4

The dataset used for this project is a cross-sectional worker-level survey dataset made available by the German Federal Employment Agency (Bundesagentur für Arbeit) and the Institute for Employment Research (Institut für Arbeitsmarkt- und Berufsforschung, IAB). The data are confidential and subject to restricted access due to data protection legislation. Researchers can access the data of this project at the IAB. We make all programs and a read-me file with instructions to replicate our results available upon request. Please refer to IAB Project 1495 (“Situation atypisch Beschäftigter und Arbeitszeitwünsche von Teilzeitbeschäftigten. Quantitative und qualitative Erhebung sowie begleitende Forschung”).

5

The administrative database contains information on workers’ age, sex, education, occupation, wages, and industry affiliation, among others. It does not contain information on employment rights and working conditions. Our analysis relies solely on the survey data.

6

Education “Low” refers to people who finished their highest level of schooling with or without a degree. The “Intermediate” group refers to people who received a Mittlere Reife or Realschulabschluss, or a Polytechnische Oberschule. The “High” group includes those who received a Fachhochschulreife or Abitur.

7

The number of withheld worker’s rights variable is constructed as the sum of three binary indicator variables denoting whether employees were illegally denied paid vacation, paid sick leave, or paid public holidays. More information about this variable’s construction is provided in the appendix.

8

The weighting was carried out by poststratifying on relevant characteristics of the sampling scheme to the target population using IAB administrative data available for every population unit. The basis for the calculation of the weighting factors was the distribution of the workforce across 5 establishment size classes and 4 employee groups (marginal part-time worker, part-time worker, worker with fixed-term contract, and other workers).
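The weighting step described above amounts to assigning each poststratification cell the ratio of its population share to its sample share. A minimal sketch with hypothetical counts (the actual design uses 5 establishment size classes x 4 employee groups = 20 cells; the function name is ours):

```python
def poststrat_weights(pop_counts, sample_counts):
    """Cell weight = population share / sample share, per poststratification cell."""
    pop_total = sum(pop_counts.values())
    samp_total = sum(sample_counts.values())
    return {cell: (pop_counts[cell] / pop_total) / (sample_counts[cell] / samp_total)
            for cell in sample_counts}

# Hypothetical two-cell example (cell labels and counts are illustrative only)
w = poststrat_weights({"small, part-time": 800, "large, other": 1200},
                      {"small, part-time": 60, "large, other": 40})
```

Weighting the sample cells by `w` reproduces the population shares exactly.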

Contributor Information

Joseph W. Sakshaug, Institute for Employment Research (IAB), Ludwig Maximilian University of Munich, and University of Mannheim

Jens Stegmaier, Institute for Employment Research (IAB).

Mark Trappmann, Institute for Employment Research (IAB), and University of Bamberg.

Frauke Kreuter, Institute for Employment Research (IAB), University of Mannheim, and University of Maryland.

References

  1. American Association for Public Opinion Research. (2016). Standard definitions. Final dispositions of case codes and outcome rates for surveys. Retrieved from https://www.aapor.org/AAPOR_Main/media/publications/Standard-Definitions20169theditionfinal.pdf
  2. Apel H, Bachmann R, Bender S, vom Berge P, Fertig M, Frings H, … Wolter S. (2012). Arbeitsmarktwirkungen der Mindestlohneinführung im Bauhauptgewerbe. Journal for Labour Market Research, 45(3/4), 257–277.
  3. Bates N, Dahlhamer J, & Singer E (2008). Privacy concerns, too busy, or just not interested. Using doorstep concerns to predict survey nonresponse. Journal of Official Statistics, 24(4), 591–612.
  4. Bates N, Wroblewski M, & Pascale J (2012). Public attitudes toward the use of administrative records in the U.S. census. Does question frame matter? Technical Report, Survey Methodology Series #2012–04, United States Census Bureau. Retrieved from https://www.census.gov/srd/papers/pdf/rsm2012-04.pdf
  5. Bender S, Fertig M, Gorlitz K, Huber M, & Schmucker A (2008). WeLL—unique linked employer-employee data on further training in Germany. Ruhr Economic Papers (No. 67).
  6. Buck N & McFall S (2012). Understanding Society. Design overview. Longitudinal and Life Course Studies, 3(1), 5–17.
  7. Christoph B, Müller G, Gebhardt D, Wenzig C, Trappmann M, Achatz J, … Gayer C (2008). Codebook and documentation of panel study “Labour Market and Social Security” (PASS). Vol. 1. Introduction and overview, wave 1 (2006/2007). Number 05. FDZ Datenreport. Documentation on Labour Market Data 200805_en.
  8. da Silva M, Coeli C, Ventura M, Palacios M, Magnanini M, Camargo T, & Camargo KJ (2012). Informed consent for record linkage. A systematic review. Journal of Medical Ethics, 38(10), 639–642.
  9. Dahlhamer J, Simile C, & Taylor B (2008). Do you really mean what you say? Doorstep concerns and data quality in the National Health Interview Survey (NHIS). Proceedings of the Joint Statistical Meetings of the American Statistical Association, 1484–1491.
  10. Dunn O (1961). Multiple comparisons among means. Journal of the American Statistical Association, 56, 52–64.
  11. Eckman S, Kreuter F, Kirchner A, Jäckle A, Tourangeau R, & Presser S (2014). Assessing the mechanisms of misreporting to filter questions in surveys. Public Opinion Quarterly, 78(3), 721–733.
  12. Fulton J (2012). Respondent consent to use administrative data. PhD dissertation, University of Maryland, College Park, MD.
  13. General Data Protection Regulation. (2016). Regulation (EU) 2016/679 of the European parliament and of the council. Retrieved from http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX.32016R0679%5C&from=en
  14. Goyder J (1985). Nonresponse on surveys. A Canada–United States comparison. Canadian Journal of Sociology, 10(3), 231–251.
  15. Groves R & Couper M (1998). Nonresponse in household interview surveys. New York: John Wiley and Sons.
  16. Groves R & McGonagle K (2001). A theory-guided interviewer training protocol regarding survey participation. Journal of Official Statistics, 17(2), 249–265.
  17. Groves R, Singer E, & Corning A (2000). Leverage-saliency theory of survey participation. Description and an illustration. Public Opinion Quarterly, 64(3), 299–308.
  18. Kahneman D & Tversky A (1979). Prospect theory. An analysis of decisions under risk. Econometrica, 47(2), 263–291.
  19. Kahneman D & Tversky A (1984). Choices, values, and frames. American Psychologist, 39(4), 341–350.
  20. Kreuter F, Sakshaug J, Schmucker A, Singer E, & Couper M (2015). Privacy, data linkage, and informed consent. Presentation at the 70th Annual Conference of the American Association for Public Opinion Research, Hollywood, FL, May.
  21. Kreuter F, Sakshaug J, & Tourangeau R (2016). The framing of the record linkage consent question. International Journal of Public Opinion Research, 28(1), 142–152.
  22. Longitudinal Studies Strategic Review. (2018). 2017 report to the Economic and Social Research Council. Retrieved from https://esrc.ukri.org/files/news-events-and-publications/publications/longitudinal-studies-strategic-review-2017/
  23. Michaud S, Dolson D, Adams D, & Renaud M (1995). Combining administrative and survey data to reduce respondent burden in longitudinal surveys. Paper presented at the Joint Statistical Meetings of the American Statistical Association, Orlando, FL, USA.
  24. Miller D, Gindi R, & Parker J (2011). Trends in record linkage refusal rates: Characteristics of National Health Interview Survey participants who refuse record linkage. Presentation at the Joint Statistical Meetings, Miami, FL, August.
  25. Morton-Williams J (1993). Interviewer approaches. Aldershot: Dartmouth Publishing Company.
  26. Olson K, Lepkowski J, & Garabrant D (2011). An experimental examination of the content of persuasion letters on nonresponse rates and survey estimates in a nonresponse follow-up study. Survey Research Methods, 5(1), 21–26.
  27. Parsons V, Moriarity C, Jonas K, Moore T, Davis K, & Tompkins L (2014). Design and estimation for the National Health Interview Survey, 2006–2015. National Center for Health Statistics. Vital and Health Statistics, 2(165).
  28. Pascale J (2011). Requesting consent to link survey data to administrative records: Results from a split-ballot experiment in the Survey of Health Insurance and Program Participation. Technical Report, Survey Methodology Series #2011–03, United States Census Bureau. Retrieved from https://www.census.gov/srd/papers/pdf/ssm2011-03.pdf
  29. Sakshaug J & Kreuter F (2012). Assessing the magnitude of non-consent biases in linked survey and administrative data. Survey Research Methods, 6(2), 113–122.
  30. Sakshaug J & Kreuter F (2014). The effect of benefit wording on consent to link survey and administrative records in a web survey. Public Opinion Quarterly, 78(1), 166–176.
  31. Sakshaug J, Schmucker A, Kreuter F, Couper MP, & Singer E (2019). The effect of framing and placement on linkage consent. Public Opinion Quarterly, 83(S1), 289–308. doi: 10.1093/poq/nfz018
  32. Sakshaug J, Tutz V, & Kreuter F (2013). Placement, wording, and interviewers. Identifying correlates of consent to link survey and administrative data. Survey Research Methods, 7(2), 133–144.
  33. Sakshaug J, Wolter S, & Kreuter F (2015). Obtaining record linkage consent: Results from a wording experiment in Germany. Survey Insights: Methods from the Field. Retrieved from http://surveyinsights.org/?p=7288
  34. Sala E, Knies G, & Burton J (2014). Propensity to consent to data linkage. Experimental evidence on the role of three survey design features in a UK longitudinal panel. International Journal of Social Research Methodology, 17(5), 455–473.
  35. Schütz H, Harand J, Kleudgen M, Aust N, & Weißpflug A (2014). Methodenbericht: Situation atypisch Beschäftigter und Arbeitszeitwünsche von Teilzeitbeschäftigten. Bonn: Institut für angewandte Sozialwissenschaft (Infas).
  36. Trappmann M, Bähr S, Beste J, Eberl A, Frodermann C, Gundert S, … Wenzig C. (2019). Data resource profile. Panel Study Labour Market and Social Security (PASS). International Journal of Epidemiology, 48(5), 1411–1411g. doi: 10.1093/ije/dyz041
  37. Trappmann M, Beste J, Bethmann A, & Müller G (2013). The PASS panel survey after six waves. Journal for Labour Market Research, 46(4), 275–281.
  38. US Commission on Evidence-Based Policymaking. (2017). The promise of evidence-based policymaking: Report of the Commission on Evidence-Based Policymaking. Retrieved from https://www.cep.gov/content/dam/cep/report/cep-final-report.pdf
  39. US National Academies of Sciences, Engineering, and Medicine. (2017). Federal statistics, multiple data sources, and privacy protection: Next steps. Washington, DC: The National Academies Press. doi: 10.17226/24893
  40. Vercruyssen A, Roose H, Carton A, & van de Putte B (2014). The effect of busyness on survey participation. Being too busy or feeling too busy to cooperate? International Journal of Social Research Methodology, 17(4), 357–371.
  41. Vercruyssen A, Roose H, & van de Putte B (2011). Underestimating busyness. Indications of nonresponse bias due to work–family conflict and time pressure. Social Science Research, 40(6), 1691–1701.
  42. Vercruyssen A, van de Putte B, & Stoop I (2011). Are they really too busy for survey participation? The evolution of busyness and busyness claims in Flanders. Journal of Official Statistics, 27(4), 619–632.
