Abstract
BACKGROUND
Online study recruitment is increasingly popular, but we know little about the decision making that goes into joining studies in this manner. In GeneScreen, a genomic screening study that utilized online education and consent, we investigated participants’ perceived ease when deciding to join and their understanding of key study features.
METHODS
Individuals were recruited via mailings that directed them to a website where they could learn more about GeneScreen, consent to participate, and complete a survey.
RESULTS
Participants found it easy to decide to join GeneScreen and had a good understanding of study features. Multiple regression analyses revealed that ease of deciding to join was related to greater genetic self-efficacy, fewer concerns about genetic screening, trust in and lack of frustration with the website, and spending less time on the website. Understanding of study features was related to more frequent Internet use and to accessing information about the GeneScreen conditions.
CONCLUSIONS
The ease of deciding to join a genomic screening study and comprehension of its key features should be treated as different phenomena in research and practice. There is a need for a more nuanced understanding of how individuals respond to web-based consent information.
Keywords: electronic consent, genetic screening, informed consent, genetic research
INTRODUCTION
In traditional informed consent procedures for clinical research, participants meet face-to-face with a study team member to review information in a consent form. This interaction permits discussion of participants’ concerns and evaluation of their capacity to consent, their understanding of study features, and the voluntariness of their participation. However, this traditional model has problems. The length and readability of consent forms may inhibit understanding, and face-to-face interaction is time intensive and potentially costly for research studies [1, 2]. In contrast, online education and consent (sometimes referred to as “e-consent”) is likely to reach more people at a faster rate and to let participants proceed at their own pace. It also may permit participants to choose how much information they review about a study and in what order, offering them more freedom to investigate elements of a study they deem most important [3].
Recent surveys of U.S. biobankers [4] and U.K. clinical trial researchers [5] found that though few were currently using e-consent, most would be interested in doing so. Companies such as Apple (http://www.apple.com/researchkit/) and Sage Bionetworks (http://sagebase.org/governance/participant-centered-consent-toolkit/) have produced toolkits to help researchers develop apps for online education and e-consent to increase recruitment and enrollment.
Two recent studies of online education and e-consent compared understanding of information conveyed in the consent materials among participants randomized to receive either traditional paper consent or interactive online education and e-consent [2, 6]. Both studies found that participants randomized to online education and e-consent had better understanding of the information than those using paper informed consent. Participants also spent more time with online consent materials, perhaps in part because these materials were designed to be relatively interactive (they provided feedback on questions that participants answered) [2]. Another project that used e-consent without interactivity concluded that very few participants thoroughly read the consent form online before agreeing to participate [7], suggesting that using online education and e-consent in a study does not, on its own, ensure greater user engagement. It also raises the question of whether participants may vary in the extent to which they engage with a website and learn the information it presents.
The Multiplex Initiative (MI), which was motivated by technological advances in genetic sequencing and the increase in direct-to-consumer testing, examined whether online education could convey information necessary to help participants make an informed decision regarding genetic testing for common, complex diseases [8]. The MI surveyed a random sample of approximately 2000 healthy adults associated with the Henry Ford Health System and sent a follow-up brochure directing them to the MI website offering free genetic multiplex testing. The website provided education about the study and testing in three modules, then in a fourth module asked whether respondents would like to participate in the genetic testing [8, 9]. Of the participants who logged onto the website, those who viewed more pages on the website were more likely to say the decision to get testing was easy and to have blood drawn for testing [8]. This finding highlighted the possibility that participants’ behavior on the website may be associated with their decision outcomes. The study also explored who did and did not visit the website and who sought testing. Findings indicated that 32% of people who were mailed the brochure visited the website, and 43% of those who went to the website ultimately consented to a blood draw for genetic testing (14% of all who were mailed the brochure). Participants who logged onto the website and chose testing were significantly more likely to be White, female, and college educated [9].
Only a few other studies have addressed use of online education and e-consent for genetic research. One examined the feasibility of e-consent for genetic testing for lung cancer susceptibility among 116 smokers who were relatives of patients diagnosed with lung cancer [10]. The 44 who logged on to the study’s website and provided saliva for testing reported more knowledge about cancer genetics and more frequent Internet use than those who did not log on. A study in the UK conducted qualitative interviews with 42 members of the general public who were asked to try a website for a hypothetical large-scale gene-environment study [11]. Findings indicated that approximately 3 out of 4 participants would be willing to use e-consent, as long as trust is established quickly and people have online access to the amounts and types of information they desire.
We report here on results from GeneScreen, a study that screened healthy adults for rare conditions that are related to clinically important genetic variants and that have prevention and/or treatment options, such as Hereditary Breast and Ovarian Cancer, Lynch Syndrome, and Long QT Syndrome. Its testing panel included 17 genes for 11 conditions estimated to occur in 1–2% of the general population [12, 13]. This targeted approach, which we label “preventive genomic screening,” contrasts with clinical genome or exome sequencing for diagnostic purposes that may discover actionable variants but that also confronts a deluge of secondary findings for which no medical treatment is warranted [14]. GeneScreen used online education and e-consent.
In the present study, we examined two research questions motivated by the nascent literature on e-consent to genomic testing: What factors were associated with 1) ease of deciding to join and 2) understanding the main study features? We examined participants’ sociodemographic characteristics, website behavior on the GeneScreen site, attitudes, and other factors we hypothesized would be associated with these two outcomes. Specifically, based on the MI [8, 9] and those studies that compared online with traditional paper consent [2, 6], we hypothesized that participants’ behavior on the website (including amount of time spent on the website) would predict ease of deciding to join the study and understanding of main study features, and that frequency of Internet use [10] and trust in the website [8, 11] might be positively related to both outcomes. The MI, from which we drew several of our measures, also raised the issue of who would respond to recruitment for genomic sequencing, given that participants who agreed to testing in the MI were mainly White, female, and well-educated.
METHODS
Individuals approached to participate
GeneScreen recruited participants at two sites: 1) a hospital-based general medicine clinic at the University of North Carolina-Chapel Hill (UNC) (n = 436) and 2) a research biobank associated with Kaiser Permanente Northwest (KPNW) (n = 650). At UNC, we recruited participants using patient lists linked to seven internists who referred adult patients (over age 18) who were healthy (not suffering serious illness) and who in their clinical opinion had sufficient decisional capacity to provide informed consent. At KPNW, eligible individuals were adult biobank members who had a DNA sample stored and available. Recruitment was designed to yield a diverse population. Specifically, in each location, we approached equal numbers of men and women and, where possible, equal numbers from selected racial/ethnic groups (Black, White, Other) and adult age groups (18–40, 41–60, 61 and over) for recruitment. Although we did not specify that individuals had to have Internet access to participate, those without Internet access would not have been able to complete the study activities and thus were essentially excluded.
Procedures
Individuals were mailed a letter and brochure that provided: 1) an overview of the study, 2) an invitation to visit the study website to get additional information, and 3) a request to complete a 30-minute online survey regardless of whether they chose to have screening. When individuals first logged on to the website, they confirmed their eligibility (over 18, English speaker, UNC patient or KPNW member) and then proceeded to the content of the website, which included education on how genes affect health, the goals of the GeneScreen study, a description of the screening test, and a decision aid (i.e., five questions to consider before choosing whether to participate). The website also presented a summary of key study features required for informed consent, 14 of which could be expanded to display more information (see Supplemental Table 1). Two IRB experts helped to write these summaries so that the most essential information was visible without having to expand the statements. For example, one statement read, “What does GeneScreen test for? It examines a person’s DNA in 17 genes for mutations that can cause one of 11 genetic health conditions.” This statement could be clicked on to reveal more information about the screening test. Additionally, the phrase “11 genetic health conditions” in the statement was a hyperlink to the list of conditions and their detailed descriptions.
Consent for GeneScreen was provided online regardless of whether individuals expanded any of the summaries of study features required for informed consent. Participants who consented to screening (“joiners”) or decided against joining (“decliners”) then completed a survey assessing their decision making and understanding. For participants at UNC, a $30 incentive was offered for completion of the survey. At KPNW, no incentive was offered to joiners because, in anticipation of a larger study with biobank members, we wanted to test the rate at which joiners would be willing to complete a survey without compensation. At the end of the survey, individuals were asked whether they would be willing to be contacted for a telephone interview. A subset of those who agreed were contacted for an interview that lasted on average 30 minutes. Every interviewed participant received $30 (See Figure 1 and Supplemental Figure 1 for recruitment overview). Participants who were willing to complete a telephone interview (77%) did not differ demographically from those who were not willing. However, we oversampled men and UNC participants to complete the interviews because we wanted interviewees from each of three age categories, two gender categories, and three race categories at both recruitment sites. Because of this oversampling, interviewed participants reported better health, higher incomes, and more education in the survey than those not interviewed. Interviews were audio recorded, transcribed verbatim, and coded for factors related to when and why participants made their decision to join the study.
Figure 1.
Recruitment and participation outcomes
GeneScreen was approved by the institutional review boards of UNC and KPNW.
Measures
Outcome variables
Ease of deciding to join the study was measured with a question developed for this study: “How easy or difficult was it for you to decide to join GeneScreen?” Responses were given on a six-point scale where 1 = extremely difficult and 6 = extremely easy.
Understanding of main study features was assessed with five statements that described study components that were deemed by the study team to have key importance (see Results section for the statements). Participants could answer “true,” “false,” or “don’t know.” Correct responses were scored as 1 and incorrect and “don’t know” were scored as 0. An understanding score was computed by summing the number of correct responses, resulting in possible scores of 0 to 5. After each item, participants saw a message that indicated whether they had answered the question correctly and, if not, provided and explained the correct answer. Participants could not go back and change their answer after seeing the correct answer.
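For readers implementing a similar measure, the scoring rule above can be expressed in a few lines of code. The sketch below is illustrative only: the item identifiers and response labels are hypothetical stand-ins, not the study’s actual variable names.

```python
# Illustrative answer key for the five true/false items (identifiers are hypothetical).
ANSWER_KEY = {
    "harmful_variants_only": "true",
    "mutation_found_in_most": "false",
    "much_higher_risk": "true",
    "preventable_or_treatable": "true",
    "same_care_if_negative": "true",
}

def score_understanding(responses: dict) -> int:
    """Score 1 per correct answer; incorrect and "don't know" responses score 0 (range 0-5)."""
    return sum(1 for item, key in ANSWER_KEY.items() if responses.get(item) == key)

# Example: one "don't know" response yields a score of 4.
example = {"harmful_variants_only": "true", "mutation_found_in_most": "false",
           "much_higher_risk": "don't know", "preventable_or_treatable": "true",
           "same_care_if_negative": "true"}
print(score_understanding(example))  # 4
```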
Predictor variables—website behavior
Time spent on the website was measured as minutes from log on to consent, as automatically captured by the website. Twenty-three participants logged on more than once, so their scores were calculated manually by examining how long they spent on specific pages. Three participants spent more than an hour on the site, suggesting that they had left the website open while engaging in unrelated tasks. We truncated these scores to 60 minutes. Despite this correction, the variable was skewed (M = 9.489, s.d. = 9.44, skewness = 3.152). We used a 90th percentile winsorizing procedure [15], resulting in a distribution with a skewness of 1.340 and a maximum value of 25.78.
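A minimal sketch of the truncation and winsorizing steps described above is given below, assuming NumPy and illustrative values; the exact implementation used in the study is not reported.

```python
import numpy as np

def prepare_time_on_website(minutes, cap=60.0, upper_percentile=90):
    """Cap sessions longer than `cap` minutes, then winsorize values above the
    chosen upper percentile by setting them to that percentile."""
    x = np.minimum(np.asarray(minutes, dtype=float), cap)  # truncate "left the tab open" outliers
    cutoff = np.percentile(x, upper_percentile)             # e.g., the 90th percentile
    return np.where(x > cutoff, cutoff, x)

# Illustrative minutes from log-on to consent
print(prepare_time_on_website([3.2, 7.5, 12.0, 95.0, 8.8]))
```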
Participants had access to enriched information when they clicked to open a webpage or expand a topic to get additional information. These items were designed to supplement the most critical information, which was available without additional clicking. From these behaviors, we created three variables. First, participants could click to open the “health conditions” page (HC page), which showed a list and brief description of each of the 11 health conditions associated with the 17 genes included on the GeneScreen panel. We created an HC page variable that indicated whether each participant clicked to open the page (= 1) or not (= 0). Second, participants who opened the HC page could click to expand each of the 11 health conditions to get additional information about it. Because most participants did not click on any of the health conditions, we dichotomized this variable into 0 = clicked no items and 1 = clicked 1 or more items. Third, one page displayed bullet points of key study features required for informed consent (see Supplemental Table 1); 14 of these could be expanded to display additional information. This variable was also skewed (2.432), so we recoded it such that 0 = clicked no items; 1 = clicked 1 item; 2 = clicked 2 or more items.
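The three derived variables can be coded along the lines of the sketch below. The column names are hypothetical; the raw click counts would come from the website’s usage logs.

```python
import pandas as pd

def code_website_behavior(raw: pd.DataFrame) -> pd.DataFrame:
    """Derive the three website-behavior variables from raw click counts."""
    coded = pd.DataFrame(index=raw.index)
    # 1) Health conditions (HC) page: opened = 1, not opened = 0
    coded["hc_page_opened"] = raw["opened_hc_page"].astype(int)
    # 2) Any individual health condition expanded (1) vs. none (0)
    coded["any_condition_expanded"] = (raw["n_conditions_expanded"] >= 1).astype(int)
    # 3) Study features expanded: 0 = none, 1 = one, 2 = two or more
    coded["features_expanded_cat"] = raw["n_features_expanded"].clip(upper=2)
    return coded

# Illustrative input
raw = pd.DataFrame({"opened_hc_page": [True, False, True],
                    "n_conditions_expanded": [3, 0, 0],
                    "n_features_expanded": [5, 1, 0]})
print(code_website_behavior(raw))
```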
Trust in the website was measured with one item adapted from Kaphingst and colleagues [8]: “I trusted the information on the website.” Responses ranged from 1 = strongly disagree to 6 = strongly agree.
Frustration with the website was measured with one item developed for this study, “Using the website was frustrating.” Responses ranged from 1 = strongly disagree to 6 = strongly agree.
Possible control variables
Genetic self-efficacy was assessed with a 6-item scale developed by Kaphingst and colleagues [8] to measure participants’ confidence that they could understand genetics and how genetics affect health or that they could access this information if needed. Responses ranged from 1 = strongly disagree to 6 = strongly agree, and the average was computed to create a genetic self-efficacy score. Higher scores indicated greater genetic self-efficacy. The scale had good internal reliability (α = 0.887).
Worry about genetic screening was assessed with a six-item scale developed for this study to measure the extent to which participants worried about privacy, having a mutation, their health, and their family’s health. Responses were provided on a scale from 1 = not at all worried to 5 = extremely worried. An average was calculated to yield a worry about genetic screening score. The scale had good internal reliability (α = 0.810).
State anxiety was assessed with the GAD-2 [16]. It is the average of two items that asked how often over the past two weeks participants felt anxious or unable to control worry. Responses ranged from 0 = not at all to 3 = nearly every day.
Trait anxiety was measured with the neuroticism subscale of the brief version of the Big Five Personality Inventory [17]. This scale asks participants how much they agree with statements about whether or not they see themselves as anxious or relaxed (1 = strongly disagree; 6 = strongly agree). Higher scores indicate higher levels of anxiety.
Frequency of Internet use was assessed with a single item developed for this study: “In a typical month, how often do you use the Internet (for instance, for email, getting information, paying bills, using social media like Facebook, or shopping)?” Responses were provided on a scale from 1 = a few times a month or less to 4 = several times a day.
Understanding of genetics was assessed with 15 items from the University of North Carolina Genomic Knowledge Scale [18], which measures knowledge in three domains thought to be critical for informed decision making in genomic sequencing: the structure and function of genes, how they are inherited in families, and their relation to health. The original scale includes 25 items. We dropped six items addressing understanding of whole exome sequencing because they were not relevant to GeneScreen. To reduce burden, we dropped an additional four items that were not relevant to this study, with guidance from the measure’s developers to minimize adverse effects on reliability. For each of 15 statements (e.g., “Gene variants can have positive effects, harmful effects, or no effects at all”), participants could answer “true,” “false,” or “don’t know.” We scored correct responses as 1 and incorrect responses and “don’t know” responses as 0. Possible scores ranged from 0 to 15. As with the GeneScreen knowledge items, participants were provided the correct answer after responding and were not permitted to change their response.
Analysis plan
First, we computed descriptive statistics (see Tables 1, 2, and 3) to evaluate the distributional and psychometric properties of each variable. We evaluated missing data and applied accepted methods for transforming or recoding variables, when necessary. Second, we calculated bivariate correlations (Tables 4 and 5) to identify variables that were significantly associated with each outcome variable (i.e., ease of deciding to join, understanding of main study features). These analyses allowed us to identify potential covariates from among sociodemographic, medical, and website behavior variables. Third, we conducted two multivariate linear regression analyses to test our hypotheses, one for each of the two outcomes (Tables 6 and 7). In bivariate and multivariate analyses, missing data were handled with an expectation-maximization imputation [19]. Finally, because the sites (UNC and KPNW) differed across some demographic and other variables measured in GeneScreen, we explored whether site contributed to variation in bivariate or multivariate associations with the two outcomes. Site was not a significant predictor in these subsequent analyses, so we did not include it in the final models.
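To make the modeling step concrete, the sketch below fits the ease-of-joining model (cf. Table 6) after multivariate imputation. It is a simplified illustration: the column names are hypothetical, and scikit-learn’s IterativeImputer is used here as a practical stand-in for the expectation-maximization imputation reported in the study [19].

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Hypothetical column names for the survey variables described above.
PREDICTORS = ["genetic_self_efficacy", "worry", "state_anxiety", "trait_anxiety",
              "time_on_website", "trust_in_website", "frustration_with_website"]
OUTCOME = "ease_of_joining"

def fit_ease_of_joining_model(survey: pd.DataFrame):
    """Impute missing values across the model variables, then regress ease of
    deciding to join on the hypothesized predictors and controls."""
    cols = PREDICTORS + [OUTCOME]
    imputed = pd.DataFrame(IterativeImputer(random_state=0).fit_transform(survey[cols]),
                           columns=cols, index=survey.index)
    X = sm.add_constant(imputed[PREDICTORS])
    return sm.OLS(imputed[OUTCOME], X).fit()

# Usage (with a suitably coded survey DataFrame):
# model = fit_ease_of_joining_model(survey_df)
# print(model.summary())
```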
Table 1.
Demographic characteristics of GeneScreen participants (n = 262)
| Variable | Category | Frequency | Percentage |
|---|---|---|---|
| Genderᵃ | Male | 82 | 31.3 |
| | Female | 180 | 68.7 |
| Age | 18–40 | 43 | 16.4 |
| | 41–60 | 89 | 34.0 |
| | 61+ | 130 | 49.6 |
| Raceᵇ | White | 207 | 78.7 |
| | African-American | 13 | 4.9 |
| | Native American | 2 | 0.8 |
| | Asian-American | 17 | 6.5 |
| | Mixed | 17 | 6.5 |
| | Other | 5 | 1.9 |
| | Missing | 1 | |
| Hispanic | Yes | 25 | 9.5 |
| | No | 236 | 89.7 |
| | Missing | 1 | |
| Educationᶜ | Less than high school | 1 | 0.4 |
| | High school or GED | 24 | 9.1 |
| | Some college (vocational, associate’s degree, or other) | 75 | 28.5 |
| | Four-year college degree | 63 | 24.0 |
| | Graduate or professional degree | 99 | 37.6 |
| Incomeᵈ | Less than $24,999 | 14 | 5.3 |
| | $25,000 to $49,999 | 41 | 15.6 |
| | $50,000 to $74,999 | 52 | 19.8 |
| | $75,000 to $99,999 | 38 | 14.4 |
| | $100,000 to $124,999 | 30 | 11.4 |
| | $125,000 to $149,999 | 23 | 8.7 |
| | $150,000 to $174,999 | 16 | 6.1 |
| | $175,000 to $199,999 | 7 | 2.7 |
| | $200,000 or more | 29 | 11.0 |
| | Missing | 12 | |

ᵃ UNC participants were more likely to be women, p = .032.

ᵇ KPNW participants were more likely to be White, p < .001.

ᶜ UNC participants were more educated, p < .001.

ᵈ UNC participants had higher incomes, p = .001.
Table 2.
Descriptives of GeneScreen variables
| Variable | Mean | s.d. |
|---|---|---|
| Ease of deciding to join the study (range 1–6) | 5.24 | 0.87 |
| Understanding of main study features (range 1–5) | 3.93 | 0.98 |
| Time on the website (range 1.48 – 25.78 minutes) | 8.68 | 6.33 |
| Trust in the website (range 1–6) | 5.16 | 0.67 |
| Frustration with website (range 1–6) | 2.20 | 1.13 |
| Genetic self-efficacy (range 1–6) | 5.25 | 0.79 |
| Worry about genetic screening (range 1–5)ᵃ | 1.83 | 0.70 |
| State anxiety (range 0–3) | 0.36 | 0.61 |
| Trait anxiety (range 1–6) | 2.66 | 1.11 |
| Frequency of Internet use (range 1–4)ᵇ | 3.72 | 0.59 |
| Understanding of genetics (range 0–15)ᶜ | 12.18 | 2.66 |

ᵃ Participants at UNC expressed more worry, p = .005.

ᵇ Participants at UNC used the Internet more often, p = .004.

ᶜ Participants at UNC had higher scores on baseline genetic knowledge, p = .011.
Table 3.
Frequencies of correct answers for understanding of main study features (n = 262)
| Main study feature | Correct response | Answered correctly | Answered incorrectly | Don’t know |
|---|---|---|---|---|
| GeneScreen only looks for gene variants that are harmful. | True | 137 (52.3%) | 79 (30.2%) | 46 (17.6%) |
| GeneScreen will find a mutation in most of the people who are tested. | False | 245 (93.5%) | 10 (3.8%) | 7 (2.7%) |
| A mutation found by GeneScreen causes a person to have a much higher risk for disease than someone without it. | True | 190 (72.5%) | 55 (21.0%) | 17 (6.5%) |
| The mutations found by GeneScreen cause diseases that can be prevented or treated. | True | 199 (76.0%) | 38 (14.5%) | 25 (9.5%) |
| Even if you get negative results from GeneScreen, you should still get the same medical care recommended for anyone of your age and family history. | True | 258 (98.5%) | 3 (1.1%) | 1 (0.4%) |
Table 4.
Bivariate correlations among indicators of website interaction and attitudes (n = 262)
| | Time on website | Health conditions page opened | Number of health conditions clicked | Number of study features clicked | Trust in website | Frustration with website |
|---|---|---|---|---|---|---|
| Time on website | 1 | .168** | .214** | .377** | .062 | .119 |
| Health conditions page opened | | 1 | .343** | .107 | .145* | −.077 |
| Number of health conditions clicked | | | 1 | .176** | .147** | −.020 |
| Number of study features clicked | | | | 1 | .057 | −.056 |
| Trust in website | | | | | 1 | −.305** |
| Frustration with website | | | | | | 1 |

Note: * p < .05; ** p < .01.
Table 5.
Bivariate correlations with two outcome variables (n = 262)
| | Ease of deciding to join the study | Understanding of main study features |
|---|---|---|
| Time on website | r = −.183, p = .003 | n.s. |
| Health conditions page opened | n.s. | r = .220, p <.001 |
| Number of health conditions clicked | n.s. | n.s. |
| Number of study features clicked | n.s. | n.s. |
| Trust in website | r = .386, p < .001 | n.s. |
| Frustration with website | r = −.252, p <.001 | n.s. |
| Genetic self-efficacy | r = .320, p <.001 | n.s. |
| Worry about genetic screening | r = −.330, p <.001 | n.s. |
| State anxiety | r = −.194, p = .002 | n.s. |
| Trait anxiety | r = −.153, p = .013 | n.s. |
| Understanding of genetics | n.s. | r = .200, p = .001 |
| Frequency of Internet use | n.s. | r = .249, p < .001 |
| Education | n.s. | r = .186, p = .003 |
| Income | n.s. | r = .164, p = .008 |
| Age | n.s. | r = −.138, p =.025 |
Table 6.
Multivariate model of associations with ease of joining GeneScreen (n = 262)
| | B | SE of B | t | p |
|---|---|---|---|---|
| Constant | 3.67 | .519 | 7.071 | <.0001 |
| Genetic self-efficacy | .162 | .067 | 2.407 | .017 |
| Worry about genetic screening | −.257 | .068 | −3.748 | <.0001 |
| State anxiety | −.046 | .091 | −0.508 | .611 |
| Trait anxiety | −.088 | .050 | −1.759 | .080 |
| Time on the website | −.020 | .007 | −2.734 | .007 |
| Trust in the website | .357 | .076 | 4.701 | <.0001 |
| Frustration with the website | −.084 | .043 | −1.970 | .050 |
Table 7.
Multivariate model of associations with understanding main study features (n = 262)
| | B | SE of B | t | p |
|---|---|---|---|---|
| Constant | 1.821 | .602 | 3.027 | .003 |
| Income | .021 | .027 | .780 | .436 |
| Age | .000 | .004 | −.047 | .963 |
| Education | .188 | .127 | 1.483 | .139 |
| Baseline genetic knowledge | .046 | .024 | 1.958 | .051 |
| Internet usage | .283 | .104 | 2.714 | .007 |
| Health conditions page opened | .382 | .131 | 2.924 | .004 |
RESULTS
Participant characteristics
Two hundred sixty-two participants completed the joiner survey and sent in saliva (UNC) or permitted use of banked DNA (KPNW) for sequencing. Participants ranged in age from 24 to 89 (M = 59.20, s.d. = 15.32). As seen in Table 1, two-thirds (68.7%) were women. They were highly educated (61.8% had a college degree or more) and largely non-Hispanic White. Although African American participants were slightly younger than the rest of the sample (M = 49.9, s.d. = 15.8 compared to M = 59.6, s.d. = 15.3, respectively, p = .026), they did not differ from White participants or those of other races on education and income. Participation rates differed for UNC (16.5%) and KPNW (30%) (Supplemental Figure 1). Overall, the enrollment rate was 24.5%.
Descriptive statistics of outcome and predictor variables
Tables 2 and 3 provide descriptive data for the variables used in these analyses. Overall, participants reported that deciding to join the GeneScreen study was easy (80% responded that it was very or extremely easy to decide). They also understood most of the five main study features; the mean number of correct items was 3.93 out of five.
The average amount of time participants spent on the website was fairly short: 8.67 minutes. Additionally, only 71 participants (27%) opened the HC page. Of those 71, most (n = 60, 84.5%) did not click to expand any health conditions, and only 5 clicked on all 11 conditions. Likewise, two-thirds of our study participants did not click to read more about any of the key study features (n = 171, 65.3%). Thirty-two participants clicked on 1 item (12.2%) and the remaining participants clicked on 2 or more (n = 59, 22.5%).
On average, participants agreed that the website was trustworthy and did not find it frustrating. They reported having high levels of genetic self-efficacy and were not particularly worried about genetic screening. Levels of anxiety were low on average. Participants reported that they use the Internet regularly (between every day or two and several times per day). They had a good understanding of genetics, on average, based on their scores on the UNC Genomic Knowledge Scale.
Bivariate associations
The two outcome variables—ease of deciding to join GeneScreen and understanding of key study features—were not correlated with one another (r = 0.073, p = 0.23). Many variables that measured interaction with the website were intercorrelated, as seen in Table 4, and most were also correlated with ease of deciding to join, as seen in Table 5.
Hypothesis tests
First, we examined predictors of ease of deciding to join the study to test the hypothesis that participants who interacted more with the GeneScreen website would find it easier to decide to join. Based on bivariate analyses, we used a multivariate regression model that included the control variables of self-efficacy, worry, state anxiety, and trait anxiety, as well as the predictor variables of time on the website, trust in the website, and frustration with the website. The overall model was significant (F(7,254) = 15.630, p < .0001) with an R2 of .282. Participants who had greater genetic self-efficacy, less worry, more trust in information on the website, and less frustration with the website reported that the decision to join was easier. Importantly, however, contrary to what we had predicted (but consistent with the bivariate association), participants who spent less time on the GeneScreen website reported that the decision was easier to make. See Table 6 for the full model.
Next we examined predictors of understanding of main study features to test the hypothesis that participants who had more interaction with the GeneScreen website would have more understanding of the main study features. Based on bivariate analyses, we tested a multivariate regression model that included the control variables of income, education, age, understanding of genetics, and frequency of Internet use as well as the predictor variable of opening the health conditions page. Notably, no other measure of website behavior (time on the website, number of health conditions clicked, number of consent items clicked, trust in the website, and frustration with the website) was correlated with this outcome. The overall model was significant (F(6,255) = 5.532, p <.0001) with an R2 of .363. In support of our hypothesis, participants who opened the health conditions page had higher scores on the measure of GeneScreen understanding than those who did not. In addition, participants who reported more frequent Internet usage also had greater understanding of main study features. See Table 7 for the full model.
Qualitative interview data
During the fifty qualitative interviews, participants were asked when they made the decision to join GeneScreen. Unexpectedly, thirty interviewees (60%) made the decision to join the study before they visited the website. Participants stated they decided upon receiving the letter and brochure, often because their “physician asked” or “Kaiser asked,” and they trusted these healthcare providers. For these participants, the website did not influence their decision-making. For instance, one participant said that she had made up her mind before visiting the website and was “just going on the website answering the questions as a formality because it was required.” However, 12 participants who were interviewed (24%) decided to join GeneScreen after visiting the website. As one stated, “I think when I went to the website I was probably about seventy percent sure that I would do it, and then I read about the study and all the information that you had there, and I saw it, and that’s when I went the extra thirty percent in my decision.” For the remaining eight participants who were interviewed (16%), it was unclear when they made the decision to join.
DISCUSSION
In this paper, we explored factors associated with 1) ease of deciding to join a preventive genomic screening study that used online education and e-consent (that is, deciding to accept screening) and 2) understanding the main study features. We found that the majority of GeneScreen participants reported that their decision making was easy—and, in fact, interviews revealed that many made their decision before they had even viewed the website. Scores for understanding the five main study features were high. These two outcomes were not correlated, indicating that participants’ reported ease of deciding to join the study (and, therefore, to have the genomic screening) was unrelated to their understanding of key study features. Furthermore, the factors that emerged in multivariate analyses as significantly associated with these outcomes were entirely distinct. Specifically, ease of deciding to join the study was related to several website perceptions and behaviors, genetic self-efficacy, and worry about genetic screening, whereas understanding main study features was related to just one specific website behavior (opening the health conditions page) and frequency of Internet use. Taken together, these findings show that the relative ease or difficulty of deciding to join a preventive genomic screening study and understanding key features of that study may be entirely different phenomena that should be treated as such in research and practice.
Participants who reported that they had an easier time deciding to join were those who felt more confident about their ability to get and use genetic information; were less worried about issues such as privacy, having a mutation, and the implications findings might have for their own and their family’s health; were more trusting of the information on the website; and, contrary to our hypothesis, spent less time on the website, not more time. Findings from our qualitative analyses can help shed light on why this may have occurred. Specifically, the subset of 50 participants who were interviewed often reported that they made the decision to join early in the process, after reading the recruitment letter and brochure that was sent by their physician and/or health care institution. Thus, many participants appear to have made a quick decision after relatively little deliberation and may therefore have been less inclined to take the time to seek more detailed information on the website. Similarly quick decision making was reported by Desch and colleagues [7], who found that college students presented with an online consent form describing a minimal-risk genetic study took far less time than expected—less than one-tenth of the time the investigators identified as the minimum predicted reading time—to consent to the study.
These findings highlight the potential importance of incorporating health decision-making theories that distinguish between two processes (often called “dual process models”): 1) decision making that is slow, deliberative, effortful, and conscious and 2) decision making that is fast, automatic, and unconscious, or intuitive [20]. Each has its pros and cons, and evidence suggests that, under some circumstances, deliberative decision making can have unintended negative consequences (e.g., because it raises negative emotions that bias subsequent reasoning or places insufficient weight on factors that are difficult to articulate) whereas intuitive decision making can result in better decision outcomes [21]. However, in cases where informed consent is being implemented online, it may be important to take specific steps to increase the possibility that participants will fully engage with information that is provided to them (i.e., to specifically promote slow, deliberative information processing and decision making). The online environment poses distinct challenges in this regard. When education and informed consent procedures are completed in person, it is often possible for study staff to see that the individual does not fully understand information or has become distracted. Staff can then take action to evaluate understanding and/or to re-engage the individual. However, other methods will be needed to ensure that participants receive and understand critical information when it is provided online (e.g., use of features designed to enhance engagement, such as interactive exercises and multimedia presentation of information) [2].
Interestingly, Kaphingst and colleagues [8] found that MI participants who spent longer time on that study’s website reported greater ease in deciding to join. The way participants used our website appears to have been quite different from the way that participants used the MI study’s website. It may be worth considering differences in website design as one explanation. Websites vary in their ability to engage participants in active learning [22, 23], and it may be that those with greater interactivity [2, 6] and other features associated with engagement in online information are better able to ensure participants learn and consider important pros and cons of the decision to accept genomic screening.
In fact, as noted above, our qualitative interview findings revealed that many participants felt at relative ease joining the study and had decided to join even before they visited the website and read the online consent materials, at least in part because they had received a recruitment letter signed by their physician or a representative of their healthcare institution’s biobank. They appeared to perceive this letter as signaling their providers’ or healthcare institution’s support for a decision to join the study and, importantly, many said they trusted their provider or healthcare institution. In the quantitative survey findings, it may be that our measure of trust in the website (which predicted greater ease of deciding to join) served as a proxy for individuals’ trust in their physician or health care institution. A number of studies, including several review studies, have found that people’s trust in the research enterprise affects decision making regarding consent to participate in genetic research (most of these studies involve consent to participate in a biobank) [e.g., 24–30]. Kelly and colleagues surveyed individuals in a research registry on attitudes toward consent for research use of their genetic and medical information [31]. They found that the majority of respondents thought that providing the name of the study’s lead researcher on the consent form would enable them to trust the researcher and provide them a person to contact about the study if necessary. These findings indicate that care must be taken in the wording of these letters to promote participants’ motivation to carefully weigh the pros and cons of joining rather than being guided to a quick decision by their trust in their provider or healthcare institution.
With respect to participants’ understanding of study features, only two factors were associated with understanding when controlling for other potential predictors: self-reported frequency of Internet usage and opening the health conditions page. No other measures of website behaviors (including time spent on the website or clicking on study features to get more information about them) or perceptions about the website (i.e., that it presented trustworthy data and was not frustrating to use) were related to understanding study features. It is important to note that the website’s summary statements (that is, the information that participants received without having to click to display more detailed information) included enough information about the study to support participants’ decision making and allow them to do well on the outcome measure of knowledge. Furthermore, information in the recruitment letter and brochure may have provided individuals with an adequate understanding of the GeneScreen study so that they did not have to further investigate information on the website. Answers to four of the five questions used to measure understanding GeneScreen features were addressed in these materials.
It is also possible that individuals who use the Internet frequently have higher literacy, more generally, and thus find it easier to engage with and gain knowledge from printed and online information provided about the study, compared to their less Internet-experienced peers. It is unclear why understanding was associated with opening the health conditions page, which merely showed a list of the health conditions covered in the GeneScreen test. It may be that participants with a better understanding of study features are also at least somewhat more oriented toward seeking and engaging with health information, and thus were more likely to want to see the list of health conditions. Taken together, these findings suggest that individuals with a greater tendency to seek and learn or retain health information may have been relatively advantaged in understanding key study features. However, these findings should be considered with caution given the nature of our sample (who were mostly well educated) and the fact that the items we used to measure study understanding did not thoroughly address all aspects of information available on the website.
This study has several limitations. First, our participants were recruited from two sites with different rates of participation. Enrollment by the KPNW Biobank members was nearly twice that of the participants recruited from the rosters of the UNC general medicine clinic (30% vs. 16.5%). There are several possible explanations for these different consent rates. The higher rate at KPNW likely reflects the fact that these participants had already consented to have their specimen in a biobank for broad future research purposes and thus can be assumed to have some interest in clinical research and comfort with their specimens being used for research. Additionally, participation involved the passive step of allowing the research biobank to use a sample they were already storing, unlike the UNC joiners who had to provide a saliva sample. Finally, recruits at UNC were far more likely to be from demographic groups that are typically underrepresented in clinical research. For example, 34% of the people who received our letters and brochures at UNC were African American compared to only 3% of the KPNW Biobank recruits. Likewise, 49% of UNC recruits were men, but men comprised only 34% of KPNW recruits. Interestingly, the KPNW participation rate was higher even though they were not provided survey incentives. It is not possible to disentangle the impact of site versus incentives on participation rates, but it raises interesting questions to explore in future studies.
Second, despite our attempt to recruit a diverse sample, the majority of participants were female, White, non-Hispanic, and well educated. Over 50 percent of the sample also had annual incomes of at least $75,000. The demographics of GeneScreen joiners were thus typical of many clinical research participants and “early adopters” of new health information [32]. One important finding was that African American joiners did not differ demographically from the rest of GeneScreen participants (except they were slightly younger), indicating that socioeconomic status was a more valid predictor of response to recruitment for this sample than race or ethnicity. Additionally, of course, participants had to have access to the Internet in order to join GeneScreen, which may explain part of the lack of diversity in our sample. Although approximately 90% of people in the United States use the Internet, gaps in usage remain among certain demographic groups. For example, Internet use is positively related to income and education and inversely related to age; there is no significant relationship between Internet use and race or ethnicity [33].
Third, although our recruitment plan included inviting people to take a decision-making survey even if they did not wish to participate in genetic screening, we were not able to achieve an adequate “decliner” response rate. Thus, several important research questions that required comparing responses of joiners to decliners remain unanswered. Additionally, other studies have compared e-consent with traditional in-person paper consent [2, 6]. These studies found that e-consent, when it includes an interactive component, takes longer than paper consent but yields greater understanding of the study. We did not design our study to have a comparison group with in-person paper consent, so we do not know how participants’ understanding of the study features or ease of decision making may have been different.
Despite these limitations, our study contributes to an important, but still limited area of research. Our findings suggest the promise of online education and e-consent for preventive genomic screening targeting the general adult population. Importantly, participants had a generally good understanding of study features, regardless of whether they had sought more information available to them on the website, but findings may be limited to a subgroup of individuals who are relatively well educated and Internet-savvy. Thus, it may be worthwhile to further investigate the best ways to ensure that participants access information available to them on a website such as this one, enhancing understanding and facilitating decision making to the greatest extent possible. Sorting out when and how individuals acquire information about features of an online study like GeneScreen is a rich area for future research.
Supplementary Material
Acknowledgments
The authors would like to thank those who kindly participated in the GeneScreen study. Thank you also to the reviewers for their helpful comments on an earlier draft of this article. Research for this study was funded by the National Institutes of Health (NIH) Grant 2P50HG004488 (Henderson, PI), “Center for Genomics and Society” (CGS). The views expressed are those of the authors alone, and do not necessarily reflect views of NIH or all CGS investigators. The NW Biobank resource was made possible with support from the Oregon Clinical and Translational Research Institute (OCTRI), grant number UL1 RR024140 from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH), NIH Roadmap for Medical Research; the MJ Murdock Charitable Trust, and institutional support from the Center for Health Research (CHR) Community Benefits funds, and Kaiser Permanente Northwest (KPNW).
References
1. Grady C. The changing face of informed consent. N Engl J Med. 2017;376(9):856–9. doi: 10.1056/NEJMra1603773.
2. Simon C, Klein DW, Schartz HA. Interactive multimedia consent for biobanking: a randomized trial. Genet Med. 2016;18:57–64. doi: 10.1038/gim.2015.33.
3. Boutin NT, Mathieu K, Hoffnagle AG, Allen NL, Castro VM, Morash M, O’Rourke PP, Hohmann EL, Herring N, Bry L, Slaugenhaupt SA. Implementation of electronic consent at a biobank: an opportunity for precision medicine research. J Pers Med. 2016;6(2):17. doi: 10.3390/jpm6020017.
4. Simon C, Klein DW, Schartz HA. Traditional and electronic informed consent for biobanking: a survey of US biobanks. Biopreserv Biobank. 2014;12(6):423–429. doi: 10.1089/bio.2014.0045.
5. Stevens N, Edwards L, Balayah Z, Hooper R, Knowles C. Risk based survey evidence supports electronic informed consent as a recruitment method for UK clinical trials. J Clin Epidemiol. 2016;77:134–6. doi: 10.1016/j.jclinepi.2016.05.005.
6. Rowbotham MC, Astin J, Greene K, Cummings SR. Interactive informed consent: randomized comparison with paper consents. PLoS ONE. 2013;8(3):e58603. doi: 10.1371/journal.pone.0058603.
7. Desch K, Li J, Kim S, Laventhal N, Metzger K, Siemieniak D, Ginsburg D. Analysis of informed consent document utilization in a minimal-risk genetic study. Ann Intern Med. 2011;155(5):316–322. doi: 10.1059/0003-4819-155-5-201109060-00009.
8. Kaphingst KA, McBride CM, Wade C, Alford SH, Brody LC, Baxevanis AD. Consumers’ use of web-based information and their decisions about multiplex genetic susceptibility testing. J Med Internet Res. 2010;12(3):e41. doi: 10.2196/jmir.1587.
9. Hensley Alford S, McBride CM, Reid RJ, Larson EB, Baxevanis AD, Brody LC. Participation in genetic testing research varies by social group. Public Health Genom. 2011;14(2):85–93. doi: 10.1159/000294277.
10. O'Neill SC, Sanderson SC, Lipkus IM, Bepler G, Bastian LA, McBride CM. The feasibility of online genetic testing for lung cancer susceptibility: uptake of a web-based protocol and decision outcomes. Genet Med. 2008;10(2):121–30. doi: 10.1097/GIM.0b013e31815f8e06.
11. Wood F, Kowalczuk J, Elwyn G, Mitchell C, Gallacher J. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination. J Med Ethics. 2011;37(8):487–92. doi: 10.1136/jme.2010.040352.
12. Adams MC, Evans JP, Henderson GE, Berg JS; GeneScreen Investigators. The promise and peril of genomic screening in the general population. Genet Med. 2015;18(6):593–9. doi: 10.1038/gim.2015.136.
13. Lázaro-Muñoz G, Conley JM, Davis AM, Prince AER, Cadigan RJ. Which results to return: subjective judgments in selecting medically actionable genes. Genet Test Mol Biomarkers. 2017;21(3):184–94. doi: 10.1089/gtmb.2016.0397.
14. Berg JS, Khoury MJ, Evans JP. Deploying whole genome sequencing in clinical practice and public health: meeting the challenge one bin at a time. Genet Med. 2011;13:499–504. doi: 10.1097/GIM.0b013e318220aaba.
15. Tukey JW. The future of data analysis. Ann Math Stat. 1962;33(1):1–67.
16. Löwe B, Wahl I, Rose M, Spitzer C, Glaesmer H, Wingenfeld K, Schneider A, Brähler E. A 4-item measure of depression and anxiety: validation and standardization of the Patient Health Questionnaire-4 (PHQ-4) in the general population. J Affect Disord. 2009;122:86–95. doi: 10.1016/j.jad.2009.06.019.
17. Rammstedt B, John OP. Measuring personality in one minute or less: a 10-item short version of the Big Five Inventory in English and German. J Res Pers. 2007;41:203–212.
18. Langer MM, Roche MI, Brewer NT, Berg JS, Khan CM, Leos C, Moore E, Brown M, Rini C. Development and validation of a genomic knowledge scale to advance informed decision-making research in genomic sequencing. MDM Policy & Practice. 2017;2(1):1–13. doi: 10.1177/2381468317692582.
19. Newman DA. Missing data: five practical guidelines. Organ Res Methods. 2014;17:372–411.
20. Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–78. doi: 10.1146/annurev.psych.59.103006.093629.
21. de Vries M, Fagerlin A, Witteman HO, Scherer LD. Combining deliberation and intuition in patient decision support. Patient Educ Couns. 2013;91(2):154–160. doi: 10.1016/j.pec.2012.11.016.
22. Danaher BG, Seeley JR. Methodological issues in research on web-based behavioral interventions. Ann Behav Med. 2009;38(1):28–39. doi: 10.1007/s12160-009-9129-0.
23. Glasgow RE, Christiansen SM, Kurz D, King DK, Woolley T, Faber AJ, Estabrooks PA, Strycker L, Toobert D, Dickman J. Engagement in a diabetes self-management website: usage patterns and generalizability of program use. J Med Internet Res. 2011;13(1):e9. doi: 10.2196/jmir.1391.
24. Trinidad SB, Fullerton SM, Bares JM, Jarvik GP, Larson EB, Burke W. Genomic research and wide data sharing: views of prospective participants. Genet Med. 2010;12(8):486–495. doi: 10.1097/GIM.0b013e3181e38f9e.
25. Haddow G. “We only did it because he asked us”: gendered accounts of participation in a population genetic data collection. Soc Sci Med. 2009;69(7):1010–1017. doi: 10.1016/j.socscimed.2009.07.028.
26. Dixon-Woods M, Ashcroft RE, Jackson CJ, Tobin MD, Kivits J, Burton PR, Samani NJ. Beyond “misunderstanding”: written information and decisions about taking part in a genetic epidemiology study. Soc Sci Med. 2007;65(11):2212–2222. doi: 10.1016/j.socscimed.2007.08.010.
27. Cadigan RJ, Davis AM. Deciding whether to participate in a biobank: the concerns of healthy volunteers. In: Kaye J, Stranger M, editors. Principles and practice in biobank governance. Surrey: Ashgate; 2009. pp. 117–133.
28. Hoeyer K. Donors’ perceptions of consent to and feedback from biobank research: time to acknowledge diversity? Public Health Genom. 2010;13(6):345–352. doi: 10.1159/000262329.
29. Lipworth W, Forsyth R, Kerridge I. Tissue donation to biobanks: a review of sociological studies. Sociol Health Illn. 2011;33(5):792–811. doi: 10.1111/j.1467-9566.2011.01342.x.
30. Nobile H, Vermeulen E, Thys K, Bergmann MM, Borry P. Why do participants enroll in population biobank studies? A systematic literature review. Expert Rev Mol Diagn. 2013;13(1):35–47. doi: 10.1586/erm.12.116.
31. Kelly SE, Spector TD, Cherkas LF, Prainsack B, Harris JM. Evaluating the consent preferences of UK research volunteers for genetic and clinical studies. PLoS One. 2015;10(3):e0118027. doi: 10.1371/journal.pone.0118027.
32. Lewis KL, Han PK, Hooker GW, Klein WM, Biesecker LG, Biesecker BB. Characterizing participants in the ClinSeq genome sequencing cohort as early adopters of a new health technology. PLoS ONE. 2015;10(7):e0132690. doi: 10.1371/journal.pone.0132690.
33. Pew Research Center. Internet & Technology: Internet/broadband fact sheet. January 12, 2017. Available at: http://www.pewinternet.org/fact-sheet/internet-broadband/. Accessed August 29, 2017.