Public Opinion Quarterly. 2019 Jun 20;83(Suppl 1):289–308. doi: 10.1093/poq/nfz018

The Effect of Framing and Placement on Linkage Consent

Joseph W. Sakshaug, Alexandra Schmucker, Frauke Kreuter, Mick P. Couper, Eleanor Singer
PMCID: PMC6639764  PMID: 31337925

Abstract

Numerous surveys link interview data to administrative records, conditional on respondent consent, in order to explore new and innovative research questions. Optimizing the linkage consent rate is a critical step toward realizing the scientific advantages of record linkage and minimizing the risk of linkage consent bias. Linkage consent rates have been shown to be particularly sensitive to certain design features, such as where the consent question is placed in the questionnaire and how the question is framed. However, the interaction of these design features and their relative contributions to the linkage consent rate have never been jointly studied, raising the practical question of which design feature (or combination of features) should be prioritized from a consent rate perspective. We address this knowledge gap by reporting the results of a placement and framing experiment embedded within separate telephone and Web surveys. We find a significant interaction between placement and framing of the linkage consent question on the consent rate. The effect of placement was larger than the effect of framing in both surveys, and the effect of framing was only evident in the Web survey when the consent question was placed at the end of the questionnaire. Both design features had negligible impact on linkage consent bias for a series of administrative variables available for consenters and non-consenters. We conclude this research note with guidance on the optimal administration of the linkage consent question.

Introduction

One of Eleanor Singer’s key contributions to the survey methodological literature—and, indeed, social science literature more broadly—relates to the issue of informed consent. Her contributions in this area covered a wide range of topics, including the effect of the consent request on survey participation (e.g., Singer 1978, 2003; Singer, von Thurn, and Miller 1995; Sakshaug et al. 2016), consent to paradata capture (Singer and Couper 2010; Couper and Singer 2013), and attitudes toward administrative data linkages (e.g., Singer, Bates, and van Hoewyk 2011). A core element underlying her work was that the issue of consent in whatever form is not only an ethical one but is also subject to empirical investigation. This paper continues that tradition, focusing on consent to link survey data to administrative records, a topic that is gaining increasing attention (see, e.g., Groves and Harris-Kojetin 2017; Fobia et al. 2019).

Many large-scale surveys link interview data to administrative databases in order to enhance research opportunities in the social sciences. Linking administrative data to surveys allows researchers to study many policy-relevant topics, including lifetime employment and earnings, medical expenditures, and government benefit programs. Although administrative data linkages are viewed as cost-effective and useful supplements to survey data, their value is predicated on obtaining linkage consent from respondents. Respondent consent is not automatic, and some evidence suggests that linkage consent rates have declined over time similar to survey participation rates (Fulton 2012). Moreover, there is evidence that linkage non-consent can introduce bias in linked-data estimates (Sakshaug and Kreuter 2012; Sala, Burton, and Knies 2012; Yang, Fricker, and Eltinge 2019).

Efforts to optimize linkage consent rates have focused on two design aspects: the framing of the linkage consent question and its placement in the questionnaire. Regarding placement, the linkage consent question is typically administered at the end of the questionnaire. The rationale for this placement comes from questionnaire design guidelines, which recommend administering sensitive questions toward the end of the survey, at which point the respondent is most familiar with the study content and has established rapport with the interviewer (Sudman and Bradburn 1982). However, experimental studies show that administering the linkage consent request at the end of the survey is suboptimal from a consent rate perspective. Sakshaug, Tutz, and Kreuter (2013) conducted a German telephone survey experiment of 2,400 respondents who were asked to link their interview data to employment history records. They found that 95.6 percent of those who received the request at the beginning of the survey gave consent versus 86.0 percent of those who received the request at the end of the survey. Among 2,179 respondents participating in the fourth wave of the Innovation Panel of the UK Household Longitudinal Study, Sala, Knies, and Burton (2014) showed that those experimentally assigned to receive a request to link to administrative state benefit records earlier in the questionnaire consented at a rate of 65 percent compared to 58 percent of those that received the request at the end of the survey. Similar patterns were found in a Web-only establishment survey in Germany. When asked for consent to link federal employment records, Sakshaug and Vicari (2018) reported consent rates of 61.3 percent, 52.3 percent, and 45.2 percent for those randomly assigned to receive the linkage request at the beginning, middle, or end of the survey, respectively.

Regarding question framing, the most common strategy is to emphasize the benefits of linkage to respondents, such as to help meet the scientific goals of the study, reduce costs, or minimize respondent burden. This “gain framing” strategy has shown promise in hypothetical data sharing experiments. For example, in a multi-mode US survey of 4,011 respondents asked about their attitudes toward using government records to obtain Decennial Census information, Bates, Wroblewski, and Pascale (2012) reported that 48.2 percent of respondents expressed at least some positivity toward the hypothetical proposal if it was framed in terms of cost savings, 43.5 percent if it was framed in terms of reducing respondent burden, and 37.6 percent among the control group (no benefit framing; see also Fobia et al. 2019, in this issue). However, in practice, the effects of benefit framing have been mixed in data sharing applications. Pascale (2011) experimented with three benefit framing arguments (improved accuracy, reduced costs, and reduced respondent burden) in a US telephone study of 3,318 respondents who were asked if they objected to the linkage of their interview data with government records, but found no significant differences in objection rates between the framing groups. The aforementioned telephone study by Sakshaug, Tutz, and Kreuter (2013) also yielded no significant differences in linkage consent rates between respondents randomized to a time-saving argument and a neutral framing condition. However, a replication of this experiment conducted in a Web-only survey of 1,194 respondents in Germany who were asked for consent to link employment history records revealed a slight advantage of the time-savings argument, which yielded a consent rate of 61.6 percent compared to 55.4 percent in the neutral framing condition (Sakshaug and Kreuter 2014).

An alternative framing strategy emphasizes the consequences (or losses) of not consenting to linkage. The notion of framing a decision in terms of losses was conceptualized in a series of experiments by Kahneman and Tversky (1979, 1984), who showed that people become risk seeking when faced with choices that are framed in terms of sure losses and risk averse when the same choices are framed in terms of sure gains. Kreuter, Sakshaug, and Tourangeau (2016) tested this framing strategy in a linkage consent experiment embedded within a telephone survey of 750 Maryland residents by emphasizing the diminished value of the collected interview data if consent to link to voting records was not provided (loss frame) versus emphasizing the enhanced value of the interview data if consent was provided (gain frame). In line with Kahneman and Tversky (1979, 1984), a higher consent rate was obtained under the loss-framing strategy (66.8 percent vs. 56.1 percent).1

For the survey designer, it is useful to know that both placing the linkage consent request at the beginning of the questionnaire and loss-framing the request can positively impact the consent rate. However, what is unknown is how these two design features interact. For example, emphasizing the diminished value of the non-linked interview data (loss frame) is likely to be more salient to respondents at the end of the survey, after they have answered all survey items, than at the beginning, when they have answered none. This argument is in line with Sakshaug, Wolter, and Kreuter (2015), who showed in a German telephone survey of 1,521 employees that loss-framing the request for consent to link to employment history records was less effective than gain-framing the request when the framing emphasis (diminished vs. enhanced value) was put on the ensuing interview data. This study, however, did not vary the placement of the request—both framing versions were implemented at the approximate midpoint of the questionnaire. Thus, it remains unclear to what extent the placement of the consent question affects the saliency of the gain-loss framing and which combination of placement and framing maximizes the linkage consent rate and minimizes the risk of consent bias.

We address this knowledge gap through a linkage consent experiment in which the placement and framing of the consent question were varied in separate telephone and Web surveys.2 In addition to assessing the joint impact of placement and framing on the linkage consent rate, we also assess whether these design features differentially impact linkage consent bias based on a selection of federal administrative variables available for both consenters and non-consenters. The aim of this investigation is to provide guidance to the survey practitioner on the optimal design of the linkage consent question from both a consent rate and consent bias perspective.

Data and Methods

TELEPHONE AND WEB SURVEYS

We administered linkage consent experiments in two separate survey implementations using samples of named individuals drawn from register data of the Federal Employment Agency of Germany (in German: Bundesagentur für Arbeit; which we will refer to as the BA). The BA register covers all working-age individuals who make social security contributions or utilize employment-related support services offered by the BA (vom Berge, Burghardt, and Trenkle 2013). The telephone sample (n = 7,001) was drawn from the BA register using a reference date of December 31, 2012, which at the time covered about 89 percent of the German civilian labor force between the ages of 15 and 64.3 Telephone numbers were acquired through BA records and address matching to commercial databases for 65.9 percent of sampled individuals. Individuals without a matched telephone number are treated as non-respondents. Fieldwork occurred between October 9 and November 19, 2014, and yielded 677 interviews for a response rate of 9.7 percent (Response Rate 1; AAPOR 2016).

The Web sample (n = 4,952) was drawn from the BA register using the same reference date as the telephone sample. Invitations were mailed to all households, and the Web survey ran from November 11, 2014, to February 12, 2015. A total of 651 interviews were completed for an RR1 response rate of 13.2 percent. An additional 28 Web respondents broke off the survey before the consent question was presented. These cases are excluded from further analysis. Both surveys were introduced under the theme “Challenges in the German Labor Market 2014.” Each questionnaire contained similar content and covered several topics, including employment history, job-seeking activities, and social media usage.

EXPERIMENTAL DESIGN

The linkage consent experiment consisted of a fully crossed 2 × 2 factorial design of framing and placement. Respondents were randomly assigned to receive a gain- or loss-framing version of the linkage consent question at the beginning or end of the survey. Table 1 shows the distribution of respondents to experimental conditions.

Table 1.

Number of respondents allocated to framing and placement conditions

Telephone survey

                    Framing
Placement           Gain    Loss    Total (placement)
Beginning            181     178     359
End                  141     177     318
Total (framing)      322     355     677

Web survey

                    Framing
Placement           Gain    Loss    Total (placement)
Beginning            169     156     325
End                  151     175     326
Total (framing)      320     331     651

Following a general prefacing statement (see Supplementary Online Material), the linkage consent question was presented. Each version of the consent question is shown in table 2. Following Kreuter, Sakshaug, and Tourangeau (2016) and Sakshaug, Wolter, and Kreuter (2015), the gain-framing version emphasized that the interview data would be “more useful” if consent was provided, whereas the loss-framing version emphasized that the interview data would be “less useful” if consent was not provided. These framing words were bolded in the Web survey. In the beginning-placement condition, the gain/loss-framing emphasis was put on the ensuing interview data “that you will give us in the course of the interview,” whereas the framing emphasis in the end-placement condition was put on the interview data “that you have already given us.”

Table 2.

Wording of linkage consent question (English translation) by framing and placement conditions

Beginning placement, gain frame: "The information that you will give us in the course of the interview will be more useful if you agree to link with the data of the Federal Agency. Are you consenting to the transmission of the information?"

Beginning placement, loss frame: "Unfortunately, the information you will give us in the course of the interview will be less useful if you disagree to link with the data of the Federal Agency. Are you consenting to the transmission of the information?"

End placement, gain frame: "The information that you have already given us in the course of the interview is more useful if you agree to link with the data of the Federal Agency. Are you consenting to the transmission of the information?"

End placement, loss frame: "Unfortunately, the information that you have already given us in the course of the interview is less useful if you disagree to link with the data of the Federal Agency. Are you consenting to the transmission of the information?"

Note.—The original German text is provided in Supplementary Online Table S1. The gain- and loss-framing arguments “more useful” and “less useful” appeared in bold font in the Web survey.

MATCHING CONSENT TO ADMINISTRATIVE VARIABLES

In order to identify the administrative records corresponding to the consenting and non-consenting respondents, the linkage consent indicator from the survey was directly merged to the register data. Like many survey paradata variables (e.g., number of contact attempts, time stamps), the linkage consent indicator is not considered a substantive survey variable and therefore can be linked to the administrative data without respondent consent. This procedure was approved by the legal team of the Institute for Employment Research of the Federal Employment Agency of Germany and has been used in previous methodological studies on linkage consent bias (e.g., Sakshaug and Kreuter 2012; Sakshaug et al. 2017).

STATISTICAL ANALYSIS

We use chi-squared tests to assess the interaction between placement and framing on the linkage consent rate. We also conduct chi-squared tests of consent rate differences between conditions within a single factor (e.g., gain vs. loss framing within the beginning-placement condition).
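A minimal sketch of one such pairwise test, assuming a standard Pearson chi-squared test on a 2 × 2 consent-by-condition table (consent counts are reconstructed by rounding from Tables 1 and 3, so the p-value is approximate):

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic and two-sided p-value (df = 1,
    no continuity correction) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # chi-squared(1) survival function
    return chi2, p

# One pairwise comparison from Tables 1 and 3: telephone survey,
# beginning placement, gain vs. loss framing. Consent counts are
# reconstructed by rounding (181 * 0.917, 178 * 0.871).
gain_yes, gain_no = 166, 181 - 166   # gain frame: consent / no consent
loss_yes, loss_no = 155, 178 - 155   # loss frame: consent / no consent

chi2, p = chi2_2x2(gain_yes, gain_no, loss_yes, loss_no)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

Consistent with the results reported below, this framing difference within the beginning-placement condition of the telephone survey is not significant at the 0.05 level (p ≈ 0.15).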

To assess the impact of placement and framing on linkage consent bias, we make use of seven dichotomized administrative variables extracted from the BA register on December 31, 2012, the same reference date from which both the telephone and Web survey samples were drawn. The variables are sex (male), age (≥ 46 years), received non-university vocational training, currently employed, at least one employer change since 2008, average daily wage between 0 and 70 EUR, and at least one welfare benefit receipt since 2008.4 Descriptive estimates of each variable are provided in Appendix Tables A1 and A2 (telephone) and A3 and A4 (Web). These variables, which have been used extensively in methodological studies based on the BA data (Kreuter, Müller, and Trappmann 2010; West, Kreuter, and Jaenichen 2013; Kirchner 2015), are merged to all respondents with a 100 percent match rate using unique IDs from the sampling frame.

Linkage consent bias is assessed by comparing the estimated proportion of the kth(=1,2,,7) administrative variable (Pk) based on respondents who consented to the linkage (Pk,consenters), and the corresponding proportion based on all respondents (Pk,respondents):

Linkage ​​ ​​ Consent ​​ ​​ Biask=Pk,consentersPk,respondents

A summary measure of average absolute linkage consent bias is also reported, which is calculated as the average of the absolute values of all consent bias estimates:

$$\mathrm{Avg.\ Abs.\ Linkage\ Consent\ Bias} = \frac{\sum_{k=1}^{7}\left|P_{k,\mathrm{consenters}} - P_{k,\mathrm{respondents}}\right|}{7}$$
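Applied to the overall Web survey estimates reported in Appendix Tables A3 and A6, the two formulas reduce to a few lines of code (a sketch; values are the published rounded percentages):

```python
# Overall Web survey estimates from Appendix Table A3 (in percent); the
# signed biases therefore match the "Overall" Web column of Table A6.
p_consenters  = [51.9, 60.8, 43.9, 84.3, 34.9, 34.6, 51.5]  # consenters
p_respondents = [53.2, 61.4, 43.6, 84.2, 33.3, 35.1, 51.9]  # all respondents

# Signed linkage consent bias for each of the k = 1..7 variables
biases = [pc - pr for pc, pr in zip(p_consenters, p_respondents)]

# Average absolute linkage consent bias
avg_abs_bias = sum(abs(b) for b in biases) / len(biases)
print(f"Avg. abs. linkage consent bias: {avg_abs_bias:.1f} percentage points")
```

The result matches the 0.7 percentage points reported for the overall Web survey in Table A6.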

Results

Linkage consent rates for each experimental condition and survey are presented in table 3. The overall consent rate in the telephone and Web surveys is 81.8 and 77.3 percent, respectively.5 Although this difference is statistically significant (p = 0.039), it is not as extreme as the 10–40-percentage-point differences found in other linkage consent studies involving self- and interviewer-administered survey modes (Burton 2016; Sakshaug et al. 2017; Thornby et al. 2017).
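The mode comparison can be reproduced approximately with a pooled two-proportion z-test, whose squared statistic equals the Pearson chi-squared statistic for the corresponding 2 × 2 table (a sketch; consent counts are reconstructed by rounding the reported rates, so the p-value is approximate):

```python
import math

# Consent counts reconstructed from the reported overall rates (81.8 and
# 77.3 percent) and respondent counts (677 telephone, 651 Web).
tel_n, web_n = 677, 651
tel_yes = round(tel_n * 0.818)   # telephone consenters
web_yes = round(web_n * 0.773)   # Web consenters

# Pooled two-proportion z-test; z**2 equals the Pearson chi-squared
# statistic for the corresponding 2x2 consent-by-mode table.
p_pool = (tel_yes + web_yes) / (tel_n + web_n)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / tel_n + 1 / web_n))
z = (tel_yes / tel_n - web_yes / web_n) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

This lands close to the reported p = 0.039.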

Table 3.

Linkage consent rates by framing and placement conditions

Telephone survey

                    Framing
Placement           Gain        Loss        Total (placement)
Beginning           91.7 (2.1)  87.1 (2.5)  89.1 (1.7)
End                 72.3 (3.8)  74.6 (3.3)  73.6 (2.5)
Total (framing)     82.9 (2.1)  80.9 (2.1)  81.8 (1.5)

Web survey

                    Framing
Placement           Gain        Loss        Total (placement)
Beginning           80.5 (3.1)  85.9 (2.8)  83.1 (2.1)
End                 65.6 (3.9)  76.6 (3.2)  71.5 (2.5)
Total (framing)     73.4 (2.5)  81.0 (2.2)  77.3 (1.6)

Note.—Parenthetical entries are standard errors.

FRAMING AND PLACEMENT ON THE LINKAGE CONSENT RATE

We now examine the impact of framing and placement on the linkage consent rate. We find a significant interaction between both factors in each survey (p < 0.01). To our surprise, framing the consent request in terms of gains or losses does not lead to statistically significant differences in the linkage consent rate, except in the Web survey where loss-framing yields an 11-percentage-point increase over gain-framing when the request is made at the end of the survey (p = 0.028). Regarding placement, table 3 shows that this design feature has a larger effect on the consent rate than framing. The results confirm the advantage of asking for consent at the beginning of the survey as opposed to the end. In both surveys, the superiority of beginning-placement is evident regardless of framing condition (p < 0.05).

FRAMING AND PLACEMENT ON LINKAGE CONSENT BIAS

Next, we examine the effects of framing and placement on linkage consent bias. Estimates of average absolute linkage consent bias across the seven administrative variables are presented in figure 1 (a tabular version is provided in Appendix Table A5). Individual consent biases are presented in Appendix Tables A6 and A7. The figure shows statistically significant (p < 0.05), but substantively small, average absolute linkage consent biases for every framing and placement condition, ranging from 0.85 percentage points (Gain-Beginning) to 2.96 percentage points (Gain-End) in the telephone survey, and 1.08 percentage points (Loss-Beginning) to 2.31 percentage points (Loss-End) in the Web survey. The figure also shows no statistically significant differences in consent bias between the experimental conditions in either survey. Thus, we conclude that framing and placement of the linkage consent question do not differentially impact linkage consent bias.

Figure 1.

Average absolute linkage consent bias by framing and placement conditions. Error bars are 95 percent confidence intervals.

Discussion

This is the first study to examine the combined and interactive effects of framing and placement of the linkage consent question. We found a significant interaction between the framing (gain vs. loss) and placement (beginning vs. end) of the linkage consent question, indicating that both factors (and their combination) can positively influence the linkage consent rate. However, the importance of both design features varied in separate telephone and Web survey implementations. In both surveys, placement had a stronger effect than framing on the linkage consent rate: requesting linkage consent at the beginning of the survey always yielded a higher consent rate than requesting consent at the end of the survey regardless of framing condition. The effect of framing was evident only in the Web survey, where loss-framing yielded a higher consent rate than gain-framing, but only when the consent request came at the end of the survey. Finally, despite differences in consent rates, we found no statistically significant differences in average linkage consent bias between the placement and framing conditions.

The results are in line with other studies showing that end-placement of the linkage consent question is suboptimal from a consent rate perspective (Sakshaug et al. 2013; Sala, Knies, and Burton 2014; Sakshaug and Vicari 2018). The Web survey finding that loss-framing is more effective than gain-framing when the consent question is asked at the end of the survey replicates the results of Kreuter, Sakshaug, and Tourangeau (2016). The presence of a framing effect in the Web survey, but not the telephone survey, is consistent with other linkage consent studies showing mixed framing effects in self- and interviewer-administered modes (Pascale 2011; Bates, Wroblewski, and Pascale 2012; Sakshaug, Tutz, and Kreuter 2013; Sakshaug and Kreuter 2014). Given these consistencies with the literature, we expect our results to be generalizable to other surveys with different populations and to linkages involving other administrative data types that are performed in a research context.

The Web survey framing effect could be due to the visual nature of the mode, which ensures that the entire consent statement is presented to respondents. In contrast, in the telephone mode there is no assurance that interviewers read, or respondents listen closely to, the entire statement. Alternatively, one might expect smaller wording effects on the Web because respondents may be less likely to read the entire statement; the fact that we nonetheless find larger framing effects on the Web is important as more surveys shift their data collection activities online. Nevertheless, more research is needed to better understand the extent to which consent statements are fully read online.

Based on the study results, we now provide some general guidance on the optimal administration of the linkage consent request. Most importantly, we suggest that the linkage consent question be asked as early as possible in the survey, as this design decision has the most consistent impact on maximizing the linkage consent rate. How the consent question is framed—whether in terms of gains or losses—is less important if an optimal placement is used. However, if the survey is implemented online and it is only possible to request linkage consent at the end of the survey, then the suggestion is to loss-frame the request by emphasizing the negative consequences of not consenting to linkage. While these suggestions are likely to maximize consent, they are unlikely to significantly impact linkage consent bias, as we showed here.

In conclusion, it is of some concern to see linkage consent rates vary to this extent by placement and framing. To us, this suggests that attitudes toward linkage are not as strongly held as regulations requiring consent for linkage might assume. Thus, further research is needed into the understanding of the requests themselves and how informed such consent is, consistent with Eleanor Singer’s long-standing work in this area.

Supplementary Material

nfz018_suppl_Supplementary-Material

Appendix

Table A1.

Percentage estimates of administrative variables by framing and placement conditions (main effects) among telephone survey respondents and consenters

Telephone survey respondents Telephone survey consenters
Framing Placement Framing Placement
Administrative variables Overall Gain Loss Beginning End Overall Gain Loss Beginning End
Sex (Male) 49.9 53.4 46.8 49.3 50.6 51.1 55.1 47.4 49.1 53.9
Age ≥ 46 years 56.6 54.0 58.9 56.3 56.9 56.5 52.4 60.3 56.3 56.8
Received non-university vocational training 42.1 41.9 42.3 43.2 40.9 44.4 43.8 45.0 44.1 44.9
Currently employed 76.1 76.1 76.1 75.8 76.4 77.4 76.8 78.1 78.1 76.5
At least one employer change since 2008 46.5 46.5 46.4 47.3 45.5 47.1 49.4 45.0 47.7 46.4
Avg. daily wage between 0–70 EUR 50.2 51.6 48.9 50.3 50.0 50.9 51.5 50.4 50.4 51.8
At least one welfare benefit receipt since 2008 65.1 65.2 65.1 63.2 67.3 64.6 65.2 64.1 62.8 67.1

Table A2.

Percentage estimates of administrative variables by framing and placement conditions (cross-classification) among telephone survey respondents and consenters

Telephone survey respondents Telephone survey consenters
Placement: Beginning Placement: End Placement: Beginning Placement: End
Administrative variables Gain frame Loss frame Gain frame Loss frame Gain frame Loss frame Gain frame Loss frame
Sex (Male) 55.3 43.3 51.1 50.3 54.6 43.2 55.9 52.3
Age ≥ 46 years 53.0 59.5 55.3 58.2 52.1 60.7 52.9 59.9
Received non-university vocational training 42.0 44.4 41.8 40.1 41.8 46.5 47.1 43.2
Currently employed 75.7 75.8 76.6 76.3 77.0 79.4 76.5 76.5
At least one employer change since 2008 49.1 45.5 43.3 47.3 50.3 44.9 48.0 45.2
Avg. daily wage between 0–70 EUR 49.4 51.3 54.4 46.2 50.4 50.4 53.3 50.5
At least one welfare benefit receipt since 2008 69.1 57.3 60.3 72.9 69.7 55.5 57.8 74.2

Table A3.

Percentage estimates of administrative variables by framing and placement conditions (main effects) among Web survey respondents and consenters

Web survey respondents Web survey consenters
Framing Placement Framing Placement
Administrative variables Overall Gain Loss Beginning End Overall Gain Loss Beginning End
Sex (Male) 53.2 55.3 51.1 51.7 54.6 51.9 56.6 47.8 51.5 52.4
Age ≥ 46 years 61.4 61.6 61.3 58.8 64.1 60.8 62.6 59.3 58.9 63.1
Received non-university vocational training 43.6 46.9 40.5 42.2 45.1 43.9 47.7 40.7 43.0 45.1
Currently employed 84.2 84.1 84.3 83.4 85.0 84.3 85.5 83.2 85.2 83.3
At least one employer change since 2008 33.3 34.4 32.2 35.7 30.9 34.9 35.5 34.3 37.1 32.3
Avg. daily wage between 0–70 EUR 35.1 35.3 34.9 35.8 34.4 34.6 33.6 35.5 34.1 35.2
At least one welfare benefit receipt since 2008 51.9 53.4 50.5 53.9 50.0 51.5 52.3 50.8 52.4 49.8

Table A4.

Percentage estimates of administrative variables by framing and placement conditions (cross-classification) among Web survey respondents and consenters

Web survey respondents Web survey consenters
Placement: Beginning Placement: End Placement: Beginning Placement: End
Administrative variables Gain frame Loss frame Gain frame Loss frame Gain frame Loss frame Gain frame Loss frame
Sex (Male) 52.7 50.6 58.3 51.4 54.4 48.5 59.6 47.0
Age ≥ 46 years 57.4 60.3 66.2 62.3 58.1 59.7 68.7 59.0
Received non-university vocational training 49.1 34.6 44.4 45.7 52.2 33.6 41.4 47.8
Currently employed 85.2 81.4 82.8 86.9 89.7 80.6 79.8 85.8
At least one employer change since 2008 36.2 35.2 32.4 29.6 37.3 36.8 33.0 31.8
Avg. daily wage between 0–70 EUR 36.9 34.5 33.6 35.2 33.9 34.4 33.3 36.5
At least one welfare benefit receipt since 2008 56.2 51.3 50.3 49.7 55.9 50.0 47.8 51.5

Table A5.

Average absolute linkage consent bias by framing and placement conditions

Telephone survey Web survey
Framing Framing
Placement Gain Loss Overall (placement) Placement Gain Loss Overall (placement)
Beginning 0.9 (0) 1.4 (1) 0.6 (1) Beginning 2.1 (2) 1.1 (0) 1.1 (1)
End 3.0 (3) 2.1 (1) 1.5 (2) End 1.9 (0) 2.3 (2) 1.1 (0)
Overall (framing) 1.3 (1) 1.5 (2) 1.0 (2) Overall (Framing) 1.2 (0) 1.4 (2) 0.7 (0)

Note.—Parenthetical entries denote the number of statistically significant (p < 0.10) linkage consent biases out of seven administrative variables.

Table A6.

Signed bias by framing and placement conditions (main effects)

Telephone survey Web survey
Framing Placement Framing Placement
Administrative variables Overall Gain Loss Beginning End Overall Gain Loss Beginning End
Sex (Male) 1.2 1.6 0.6 –0.2 3.2 –1.3 1.3 –3.3 –0.2 –2.2
Age ≥ 46 years –0.1 –1.6 1.4 –0.0 –0.1 –0.6 1.0 –2.0 0.1 –1.0
Received non-university vocational training 2.3 1.9 2.7 0.9 4.0 0.3 0.8 0.2 0.8 –0.0
Currently employed 1.4 0.7 2.0 2.4 0.1 0.1 1.5 –1.1 1.8 –1.7
At least one employer change since 2008 0.7 2.9 –1.4 0.4 0.9 1.6 1.1 2.1 1.4 1.4
Avg. daily wage between 0–70 EUR 0.8 –0.1 1.6 0.0 1.8 –0.5 –1.7 0.6 –1.6 0.7
At least one welfare benefit receipt since 2008 –0.5 –0.1 –1.0 –0.4 –0.2 –0.4 –1.1 0.3 –1.5 –0.2
Avg. Abs. linkage consent bias 1.0 1.3 1.5 0.6 1.5 0.7 1.2 1.4 1.1 1.1

Note.—Boldface entries denote statistically significant biases at the 0.10 level.

Table A7.

Signed bias by framing and placement conditions (cross-classification)

Telephone survey Web survey
Placement: Beginning Placement: End Placement: Beginning Placement:End
Administrative variables Gain frame Loss frame Gain frame Loss frame Gain frame Loss frame Gain frame Loss frame
Sex (Male) –0.7 –0.0 4.8 2.0 1.8 –2.1 1.3 –4.4
Age ≥ 46 years –0.9 1.1 –2.4 1.7 0.7 –0.6 2.5 –3.3
Received non-university vocational training –0.2 2.1 5.2 3.1 3.1 –1.0 –3.0 2.1
Currently employed 1.3 3.5 –0.1 0.3 4.5 –0.8 –3.0 –1.0
At least one employer change since 2008 1.2 –0.6 4.7 –2.1 1.2 1.6 0.6 2.2
Avg. daily wage between 0–70 EUR 1.0 –0.9 –1.1 4.3 –3.0 –0.1 –0.2 1.4
At least one welfare benefit receipt since 2008 0.6 –1.8 –2.4 1.4 –0.3 –1.3 –2.9 1.8
Avg. Abs. linkage consent bias 0.9 1.4 3.0 2.1 2.1 1.1 1.9 2.3

Note.—Boldface entries denote statistically significant biases at the 0.10 level.

Joseph W. Sakshaug is distinguished researcher, head of the Data Collection and Data Integration Unit, and acting head of the Statistical Methods Research Department at the Institute for Employment Research, Nuremberg, Germany, and professor in the School of Social Sciences at the University of Mannheim, Mannheim, Germany.

Alexandra Schmucker is senior researcher at the Institute for Employment Research, Nuremberg, Germany. Frauke Kreuter is professor at the Joint Program in Survey Methodology, University of Maryland, College Park, MD, USA; full professor of Statistics and Social Science Methodology at the University of Mannheim, Germany, and head of the Statistical Methods Research Department (on leave) at the Institute for Employment Research, Nuremberg, Germany.

Mick P. Couper is research professor in the Survey Research Center at the Institute for Social Research, University of Michigan, Ann Arbor, MI, USA, and research professor at the Joint Program in Survey Methodology at the University of Maryland, College Park, MD, USA.

Eleanor Singer, who died on June 3, 2017, was research professor emerita in the Survey Research Center at the Institute for Social Research, University of Michigan, Ann Arbor, MI, USA.

Footnotes

1. Tourangeau and Ye (2009) conducted a similar framing experiment in which respondents were asked for consent to complete a follow-up interview. The authors also found that loss-framing the follow-up interview request yielded a higher consent rate than gain-framing the request.

2. We do not have any specific hypotheses regarding differences in placement and framing effects between the two survey modes, but given that prior research has found differences in linkage consent rates between self- and interviewer-administered modes (e.g., Burton 2016; Sakshaug et al. 2017; Thornby et al. 2017), we wanted to be sure to implement the experiments in both modes.

3. Sources: Integrated Employment Biographies Sample (http://fdz.iab.de/en/FDZ_Individual_Data/Integrated_Employment_Biographies.aspx), own calculations; Bundesagentur für Arbeit, Statistik: Dokumentation “Bezugsgröße 2012” (http://statistik.arbeitsagentur.de/Statischer-Content/Grundlagen/Berechnung-Arbeitslosenquote/Dokumentation/Generische-Publikationen/Dokumentation-der-Bezugsgroesse-2012.pdf); Statistisches Bundesamt (2015): Bevölkerung: Deutschland, Stichtag, Altersjahre, Wiesbaden 2015, own calculations.

4. Numeric variables were dichotomously coded using somewhat arbitrary cut-points, with preference given to the approximate median value of the distribution.

5. The linkage consent estimates are unadjusted for nonresponse. A sensitivity analysis that adjusted for basic information from the sampling frame (sex, age, education, and employment status) yielded nearly identical weighted estimates and the same study conclusions.

References

  1. American Association for Public Opinion Research (AAPOR). 2016. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, 9th ed. Available at https://www.aapor.org/AAPOR_Main/media/publications/Standard-Definitions20169theditionfinal.pdf.
  2. Bates, Nancy, Monica J. Wroblewski, and Joanne Pascale. 2012. “Public Attitudes Toward the Use of Administrative Records in the U.S. Census: Does Question Frame Matter?” Technical Report, Survey Methodology Series #2012-04, US Census Bureau. Available at https://www.census.gov/srd/papers/pdf/rsm2012-04.pdf, accessed August 30, 2018.
  3. Burton, Jonathan. 2016. “Results for Web/Face-to-Face Linkage Consent Questions in the Innovation Panel.” Paper presented at the Mixing Modes and Measurement Methods in Longitudinal Studies Workshop, University College London, London.
  4. Couper, Mick P., and Eleanor Singer. 2013. “Informed Consent for Web Paradata Use.” Survey Research Methods 7:57–67.
  5. Fobia, Aleia Clark, Jessica Holzberg, Casey Eggleston, Jennifer Hunter Childs, Jenny Marlar, and Gerson Morales. 2019. “Attitudes Towards Data Linkage for Evidence-Based Policymaking.” Public Opinion Quarterly 83(Special Issue):264–79.
  6. Fulton, Jenna. 2012. “Respondent Consent to Use Administrative Data.” Unpublished dissertation, University of Maryland. Available at https://drum.lib.umd.edu/handle/1903/13601, accessed August 30, 2018.
  7. Groves, Robert M., and Brian A. Harris-Kojetin. 2017. Innovations in Federal Statistics: Combining Data Sources While Protecting Privacy. Washington, DC: National Academies Press.
  8. Kahneman, Daniel, and Amos Tversky. 1979. “Prospect Theory: An Analysis of Decision under Risk.” Econometrica 47:263–91.
  9. ———. 1984. “Choices, Values, and Frames.” American Psychologist 39:341–50.
  10. Kirchner, Antje. 2015. “Validating Sensitive Questions: A Comparison of Survey and Register Data.” Journal of Official Statistics 31:31–59.
  11. Kreuter, Frauke, Gerrit Müller, and Mark Trappmann. 2010. “Nonresponse and Measurement Error in Employment Research: Making Use of Administrative Data.” Public Opinion Quarterly 74:880–906.
  12. Kreuter, Frauke, Joseph W. Sakshaug, and Roger Tourangeau. 2016. “The Framing of the Record Linkage Consent Question.” International Journal of Public Opinion Research 28:142–52.
  13. Pascale, Joanne. 2011. “Requesting Consent to Link Survey Data to Administrative Records: Results from a Split-Ballot Experiment in the Survey of Health Insurance and Program Participation.” Technical Report, Survey Methodology Series #2011-03, US Census Bureau. Available at https://www.census.gov/srd/papers/pdf/ssm2011-03.pdf, accessed August 30, 2018.
  14. Sakshaug, Joseph W., Sebastian Hülle, Alexandra Schmucker, and Stefan Liebig. 2017. “Exploring the Effects of Interviewer- and Self-Administered Survey Modes on Record Linkage Consent Rates and Bias.” Survey Research Methods 11:171–88.
  15. Sakshaug, Joseph W., and Frauke Kreuter. 2012. “Assessing the Magnitude of Non-Consent Biases in Linked Survey and Administrative Data.” Survey Research Methods 6:113–22.
  16. ———. 2014. “The Effect of Benefit Wording on Consent to Link Survey and Administrative Records in a Web Survey.” Public Opinion Quarterly 78:166–76.
  17. Sakshaug, Joseph W., Alexandra Schmucker, Frauke Kreuter, Mick P. Couper, and Eleanor Singer. 2016. “Evaluating Active (Opt-In) and Passive (Opt-Out) Consent Bias in the Transfer of Federal Contact Data to a Third-Party Survey Agency.” Journal of Survey Statistics and Methodology 4:382–416.
  18. Sakshaug, Joseph W., Valerie Tutz, and Frauke Kreuter. 2013. “Placement, Wording, and Interviewers: Identifying Correlates of Consent to Link Survey and Administrative Data.” Survey Research Methods 7:133–44.
  19. Sakshaug, Joseph W., and Basha J. Vicari. 2018. “Obtaining Record Linkage Consent from Establishments: The Impact of Question Placement on Consent Rates and Bias.” Journal of Survey Statistics and Methodology 6:46–71.
  20. Sakshaug, Joseph W., Stefanie Wolter, and Frauke Kreuter. 2015. “Obtaining Record Linkage Consent: Results from a Wording Experiment in Germany.” Survey Insights: Methods from the Field 6:1–12.
  21. Sala, Emanuela, Jonathan Burton, and Gundi Knies. 2012. “Correlates of Obtaining Informed Consent to Data Linkage: Respondent, Interview, and Interviewer Characteristics.” Sociological Methods & Research 41:414–39.
  22. Sala, Emanuela, Gundi Knies, and Jonathan Burton. 2014. “Propensity to Consent to Data Linkage: Experimental Evidence on the Role of Three Survey Design Features in a UK Longitudinal Panel.” International Journal of Social Research Methodology 17:455–73.
  23. Singer, Eleanor. 1978. “Informed Consent: Consequences for Response Rate and Response Quality in Social Surveys.” American Sociological Review 43:144–62.
  24. ———. 2003. “Exploring the Meaning of Consent: Participation in Research and Beliefs About Risks and Benefits.” Journal of Official Statistics 19:273–85.
  25. Singer, Eleanor, Nancy Bates, and John van Hoewyk. 2011. “Concerns About Privacy, Trust in Government, and Willingness to Use Administrative Records to Improve the Decennial Census.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Phoenix, AZ, USA. Available at http://www.asasrms.org/Proceedings/y2011/Files/400168.pdf.
  26. Singer, Eleanor, and Mick P. Couper. 2010. “Ethical Considerations in Internet Surveys.” In Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies, edited by Marcel Das, Peter Ester, and Lars Kaczmirek, 133–62. New York: Taylor and Francis.
  27. Singer, Eleanor, Dawn R. von Thurn, and Esther R. Miller. 1995. “Confidentiality Assurances and Response.” Public Opinion Quarterly 59:66–77.
  28. Sudman, Seymour, and Norman M. Bradburn. 1982. Asking Questions: A Practical Guide to Questionnaire Design. San Francisco: Jossey-Bass.
  29. Thornby, Marie, Lisa Calderwood, Mehul Kotecha, Kelsey Beninger, and Alessandra Gaia. 2017. “Collecting Multiple Data Linkage Consents in a Mixed Mode Survey: Evidence and Lessons Learnt from Next Steps.” Centre for Longitudinal Studies Working Paper 2017/13. London: Institute of Education.
  30. Tourangeau, Roger, and Cong Ye. 2009. “The Framing of the Survey Request and Panel Attrition.” Public Opinion Quarterly 73:338–48.
  31. vom Berge, Philipp, Anja Burghardt, and Simon Trenkle. 2013. “Stichprobe der Integrierten Arbeitsmarktbiografien: Regionalfile 1975–2010 (SIAB-R 7510).” FDZ-Datenreport 09/2013. Nürnberg: Institut für Arbeitsmarkt- und Berufsforschung (IAB).
  32. West, Brady T., Frauke Kreuter, and Ursula Jaenichen. 2013. “Interviewer Effects in Face-to-Face Surveys: A Function of Sampling, Measurement Error, or Nonresponse?” Journal of Official Statistics 29:277–97.
  33. Yang, Daniel, Scott Fricker, and John Eltinge. 2019. “Methods for Exploratory Assessment of Consent-to-Link in a Household Survey.” Journal of Survey Statistics and Methodology 7:118–55.


Supplementary Materials

nfz018_suppl_Supplementary-Material
