Introduction
Newborn screening is conducted by every state and territory in the US and represents one of the most successful public health programs of the modern era. These programs collect blood for testing soon after birth from almost every infant born in the US. Many states save the residual bloodspots for several uses, including biomedical research (Rothwell, Johnson, Riches, & Botkin, 2019). Lawsuits have been filed against several state programs over the lack of informed consent for this practice (Lewis, Goldenberg, Anderson, Rothwell, & Botkin, 2011). In response, some states have implemented a formal consent process for storing residual bloodspots and making them available to investigators for research. However, obtaining informed consent for bloodspot research is particularly challenging because of the hectic environment of the postnatal period and the relatively abstract nature of future, unspecified research on the biospecimens. Biobanks derived from newborn screening programs are particularly valuable because the specimens can represent the whole population of infants in a state or area. Low consent rates undermine the representativeness of these biobanks; it is therefore important to establish an effective informed decision-making process for research uses of residual bloodspots in order to maintain public trust and support.
The state of Michigan developed the Michigan BioTrust for Health biobank. Newborn screening specimens collected after May 1, 2010, are made available for research with parental permission through an opt-in consent process. A staff member at the birth facility typically gives parents a BioTrust brochure and then requests a “yes” or “no” decision with a signature on a short consent form. Current data indicate that approximately 67.5% of parents opt in to BioTrust participation under this approach (Atkinson, 2019). However, some parents miss the opportunity to document their consent decision due, in part, to the competing clinical duties of health professionals, the demands of caring for a newborn, or a sense that they have not received enough information to make an informed decision.
Identifying ways to improve comprehension of biospecimen research is a national priority (Sanderson et al., 2017). Researchers have used interactive, multimedia informed consent platforms in other contexts to improve comprehension, with mixed results (Palmer, Lanouette, & Jeste, 2012). These inconsistencies are due in part to the lack of guidance from conceptual and theoretical foundations in adult learning and decision-making during the development of these consent tools (Flory & Emanuel, 2004). This research developed and evaluated theory-based, multimedia consent interventions to assist parents in making an informed decision about the retention and use of residual bloodspots within an established biobank repository.
Methods
The study was approved by the institutional review board at the University of Utah and Spectrum Health System. Women who gave birth in hospitals located in Lansing, Ann Arbor, and Grand Rapids, Michigan, were recruited. These locations were chosen to represent diverse populations in the state and had existing consent procedures for the Michigan BioTrust.
Development of the Education Tools
The educational tools were produced by the Genetic Science Learning Center (GSLC) at the University of Utah, which used two conceptual frameworks to guide tool development. Multimedia Learning Theory postulates that learning and retention are enhanced when a succinct combination of words and pictures is presented together with narration (Mayer, 2002). Cognitive Load Theory states that comprehension improves when information builds on existing knowledge, is provided in small “chunks” of no more than 7 (±2) topics, and incorporates active learning strategies (Paas, Renkl, & Sweller, 2003). We developed two tools: (1) a 6-minute video that incorporated all of the required consent elements as well as key informational elements about residual bloodspot research identified from stakeholder focus groups (Rothwell et al., 2017); and (2) an interactive app that included the same information presented as nine key points with links to optional, additional information. The video and app were validated with input from the Michigan Department of Health and Human Services BioTrust Community Advisory Board and employees of the Michigan Department of Health and Human Services (link to video: https://vimeo.com/learngenetics/review/206501906/8232c39bf7).
Study Design and Participants
Eligible participants were English-speaking adults, either the mother or father of the newborn, in the postnatal hospital environment whose infant was not in the neonatal intensive care unit. An implementation assessment of each hospital environment was conducted prior to study recruitment. At each of the three hospitals, 30 observations of the newborn screening and BioTrust consent processes were conducted (n = 90). This enabled the research team to incorporate typical hospital practice into the study protocol. Standard health department protocols for birthing hospitals dictate that patients are provided with the brochure-based educational material about the Michigan BioTrust prior to discharge home. This brochure may be provided at the same time as a large packet of other written information. After newborn screening is completed, a healthcare staff member reminds the family about the information describing the storage and use of the residual bloodspots. The staff member asks for a decision and a signature to document the parental decision. The consent form is attached to the back of the filter paper card used for collection of the newborn screening bloodspots and consists of a checkbox documenting a “Yes” or “No” decision and a signature line.
All participants were provided with the Michigan BioTrust informational brochure and consent form in the standard manner. At two of the hospitals, a healthcare provider then asked parents if they were interested in participating in this study; if they agreed, a research assistant (RA) later entered the patient room. At the third hospital, the RA, who was also an employee of that hospital, was the first person to approach parents and ask whether they were interested. If a parent agreed, the RA conducted the consent process for this study, and the parent was randomized to one of the three study groups. After randomization, each participant was provided an electronic tablet to collect basic demographic information and, for those assigned to the video or app groups, to allow them to watch the video or interact with the app. After they completed watching the video, interacting with the app, or reading the brochure on the electronic tablet, participants completed the follow-up surveys within REDCap. If the parent was in the control group, the Michigan BioTrust brochure was provided again and they were asked to complete the survey on an electronic tablet when they were ready. During study participation, the RA left the patient room to allow participants ample time to explore the brochure, video, or app. Participants were told they could pause their participation for any reason. The hospital staff were familiar with the study but were not directly involved in the interventions. Participants were given a $25 gift card for completion of this phase of the study.
Two to three weeks after birth, participants were re-contacted. Follow-up emails with a link to a REDCap survey were sent to those who provided email addresses. If a response was not received after one week, another follow-up email was sent. After two email attempts, a follow-up phone call was initiated. For those who did not provide email addresses, only phone calls, with up to five attempts, were conducted.
The post-intervention and follow-up surveys were similar. At both time points, participants completed a 34-item knowledge survey that was modified from an existing survey about the quality of informed consent for biobank participation (Ormond, Cirino, Helenowski, Chisholm, & Wolf, 2009; Rothwell et al., 2014). The knowledge survey consisted of two parts: Part A comprised 20 knowledge questions about the consent elements, and Part B comprised 14 questions about self-assessed understanding of the consent elements. Additional questions were asked about attitudes toward newborn screening and residual biospecimen research, as well as satisfaction with the content, amount, and clarity of the educational information presented. Self-report questions about the decision to participate in the BioTrust were also included in both the post and follow-up surveys. The Short Form Trust in Medical Research (Hall et al., 2006) was administered only in the post survey to measure participants’ levels of trust toward medical research. The Decision Regret Scale (Brehaut et al., 2003) was administered only in the follow-up survey to capture any regret about the decision to participate in the BioTrust. An additional $25 gift card was provided to those who completed the follow-up survey.
Analysis
All analyses were conducted with SPSS version 22. Univariate analysis of variance with Tukey post hoc adjustment was conducted to test the effect of group assignment (video, app, or brochure) on knowledge, attitudes, and behaviors. The Pearson chi-square test, or Fisher's exact test when minimum expected cell counts were violated, was used to test the association of group assignment with categorical outcome variables. A sensitivity analysis was conducted with the following assumptions: sample size of each group = 180, alpha = 0.05, power = 0.80, and two-tailed testing; results showed that the study would be able to detect a small effect size.
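For readers who want to reproduce this style of analysis outside SPSS, the sketch below is a minimal illustration in Python (SciPy and statsmodels) of the same family of tests: one-way ANOVA with Tukey post hoc comparisons, a Pearson chi-square test, and an ANOVA power calculation. The simulated data, group labels, and effect size are placeholder assumptions rather than study data; SPSS remains the software used for the reported results.

```python
# Illustrative sketch only: the published analyses were run in SPSS version 22.
# All data below are simulated placeholders (group means loosely echo the
# reported knowledge scores); nothing here is study data.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd
from statsmodels.stats.power import FTestAnovaPower

rng = np.random.default_rng(42)
n_per_group = 180  # sample size per group assumed in the sensitivity analysis

# Simulated proportion-correct knowledge scores for the three arms
df = pd.DataFrame({
    "group": np.repeat(["brochure", "video", "app"], n_per_group),
    "knowledge": np.concatenate([
        rng.normal(0.65, 0.12, n_per_group),  # usual care (brochure)
        rng.normal(0.71, 0.12, n_per_group),  # video
        rng.normal(0.69, 0.12, n_per_group),  # app
    ]),
})

# Univariate ANOVA: effect of group assignment on knowledge scores
samples = [g["knowledge"].to_numpy() for _, g in df.groupby("group")]
f_stat, p_val = stats.f_oneway(*samples)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Tukey HSD post hoc pairwise comparisons between the three arms
print(pairwise_tukeyhsd(df["knowledge"], df["group"], alpha=0.05))

# Pearson chi-square for group assignment vs. a categorical outcome
# (simulated consent decision); stats.fisher_exact covers sparse 2x2 tables
decision = rng.choice(["yes", "no"], size=len(df), p=[0.85, 0.15])
chi2, p, dof, _ = stats.chi2_contingency(pd.crosstab(df["group"], decision))
print(f"Chi-square: chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Sensitivity-style check: power for an illustrative effect size (Cohen's f)
# with 180 per group, alpha = 0.05, and three groups
power = FTestAnovaPower().power(effect_size=0.15, nobs=3 * n_per_group,
                                alpha=0.05, k_groups=3)
print(f"Power to detect Cohen's f = 0.15: {power:.2f}")
```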
Results
Between March 2018 and June 2018, 684 individuals were approached at the recruitment sites (see Figure 1). Only 1.32% of individuals approached declined to participate, and 675 were enrolled and randomized. The usual care group had 227 participants initially enrolled, the video group had 223, and the app group had 225. Participants who completed the follow-up survey included 176 in the usual care group (78.2% retention), 178 in the video group (80.9% retention), and 178 in the app group (79.5% retention). Participants who completed both the post survey and the follow-up survey were similar in demographics with respect to ethnicity, race, education, marital status, income, pregnancy history, sex, and language. Age differed significantly (p < .05) and was added to the analyses as a covariate. A summary of the participants’ demographic characteristics is presented in Table 1. The average time between the post survey and follow-up survey was 18.77 days (SD 7.04).
Figure 1.
CONSORT Diagram
Table 1.
Demographics
| Characteristic | Control (n = 225) | Video (n = 219) | App (n = 224) | Total (N = 669) |
|---|---|---|---|---|
| Age at enrollment (years, SD) | 30.62 (4.73) | 30.08 (4.89) | 29.42 (5.02) | 30.04 (4.89)* |
| Gender | | | | |
| Female | 192 (94.1%) | 179 (90.4%) | 182 (89.7%) | 553 (91.4%) |
| Male | 12 (5.9%) | 18 (9.1%) | 21 (10.3%) | 51 (8.4%) |
| Race | | | | |
| White | 176 (78.9%) | 167 (77.0%) | 176 (79.6%) | 519 (78.5%) |
| Non-White | 47 (21.1%) | 50 (23.0%) | 45 (20.4%) | 142 (21.5%) |
| Ethnicity | | | | |
| Hispanic | 9 (4.1%) | 11 (5.1%) | 12 (5.4%) | 32 (4.9%) |
| Non-Hispanic | 213 (95.9%) | 203 (94.9%) | 210 (94.6%) | 626 (95.1%) |
| Income | | | | |
| Under $24,999 | 15 (7.1%) | 13 (6.3%) | 18 (8.5%) | 46 (7.3%) |
| $25,000 - $50,000 | 27 (12.8%) | 39 (18.8%) | 32 (15.2%) | 98 (15.6%) |
| $50,001 - $100,000 | 72 (34.1%) | 66 (31.7%) | 63 (29.9%) | 201 (31.9%) |
| $100,001 - $150,000 | 46 (21.8%) | 44 (21.2%) | 46 (21.8%) | 136 (21.6%) |
| Over $150,000 | 31 (14.7%) | 26 (12.5%) | 25 (11.8%) | 82 (13.0%) |
| Not sure/did not answer | 20 (9.5%) | 20 (9.6%) | 27 (12.8%) | 67 (10.6%) |
| Education | | | | |
| Less than college graduate | 90 (40.0%) | 93 (42.3%) | 107 (47.8%) | 290 (43.3%) |
| College graduate and above | 135 (60.0%) | 127 (57.7%) | 117 (52.2%) | 379 (56.7%) |
| Relationship | | | | |
| Married or living with partner | 197 (87.9%) | 178 (80.9%) | 185 (83.0%) | 560 (84.0%) |
| Significantly involved with partner but not living together | 17 (7.6%) | 29 (13.2%) | 28 (12.6%) | 74 (11.1%) |
| Single / not significantly involved | 9 (4.0%) | 13 (5.9%) | 9 (4.0%) | 31 (4.6%) |
| Other | 1 (0.4%) | 0 (0.0%) | 1 (0.4%) | 2 (0.3%) |
| Given birth before | | | | |
| Yes | 146 (65.2%) | 144 (65.5%) | 131 (58.7%) | 421 (63.1%) |
| No | 78 (34.8%) | 76 (34.5%) | 92 (41.3%) | 246 (36.9%) |

* p < 0.05
Knowledge Outcomes
We hypothesized, based on our conceptual frameworks, that the video and app groups would have higher knowledge scores immediately after the consent process and higher retention of knowledge after two weeks. There was a significant effect of group membership on the 20-item biobanking knowledge scores (F(2, 646) = 12.15, p < 0.001). Pairwise comparisons of estimated marginal means showed a higher percentage of correct responses for the video group (71%, SEM = 0.01) compared with the usual care group (65%, SEM = 0.01), p < 0.001, and for the app group (69%, SEM = 0.01) compared with the usual care group (65%, SEM = 0.01), p = 0.003. The app and video groups did not differ significantly (p = 0.061); see Table 2. There was no time effect (F(1, 578) = 0.09, p = 0.769), indicating no decrease in knowledge scores over the follow-up period. For the outcome Self-Assessment of Understanding, there was also a statistically significant effect of group membership (F(2, 655) = 4.36, p = 0.013). Pairwise comparisons of estimated marginal means showed higher understanding for the video group (4.23, SEM = 0.05) compared with the usual care group (4.04, SEM = 0.05), p = 0.003. The app group (4.12, SEM = 0.05) did not differ significantly from either the control group (p = 0.200) or the video group (p = 0.096). Self-Assessment of Understanding decreased slightly over time (in-hospital = 4.17, SEM = 0.029, versus 2-week follow-up = 4.09, SEM = 0.032), p = 0.008.
Table 2:
Knowledge, Understanding, and Attitudes
| Survey item | Usual care (N = 176) | Video (N = 178) | App (N = 178) |
|---|---|---|---|
| Knowledge – 20 items (% correct) | 64.8 (0.009)^a | 70.9 (0.009)^b | 68.5 (0.009)^b |
| Self-Assessment of Understanding – 14 items | 4.04 (0.045)^a | 4.23 (0.046)^b | 4.12 (0.045)^a,b |
| How satisfied were you with the information you just received about the Michigan BioTrust? (1 – Highly dissatisfied, 5 – Highly satisfied) | 3.98 (0.054)^a | 4.23 (0.053)^b | 4.18 (0.053)^b |
| How clearly was that information presented about the Michigan BioTrust? (1 – Many things were unclear, 4 – Everything was clear) | 2.88 (0.054)^a | 3.15 (0.054)^b | 3.14 (0.054)^b |
| How would you rate the amount of information you just received about the Michigan BioTrust? (1 – A lot less than I needed, 3 – About right, 5 – Much more than I needed) | 2.92 (0.058)^a | 3.07 (0.057)^a,b | 3.13 (0.057)^b |
| Was there any additional information that you were hoping to learn about but were unable to find? | | | |
| Yes | 10 (5.7%) | 9 (5.0%) | 19 (10.7%) |
| No | 166 (94.3%) | 169 (95.0%) | 159 (89.3%) |
Post hoc analysis: groups with the same superscript are not significantly different at the p < 0.05 level.
Attitude Outcomes
Learning preferences vary, and this is evident when seeking consent for research participation during clinical care. Questions were asked about satisfaction, amount of information, and clarity (see Table 2). The usual care group (brochure-based education) was the least satisfied with the information received (p = 0.002), reported that the information provided was the least clear (p < 0.001), and reported that not enough information was provided (mean = 2.92, SE = 0.06). The video group reported that the amount of information was about right (mean = 3.07, SE = 0.06), and the app group reported slightly too much (mean = 3.17, SE = 0.06). Within the app, we tracked whether participants clicked on additional information. The results demonstrated that most participants (76%) did not click on additional information such as short videos, question and answer links, or the comprehension quiz.
Behavioral Outcomes
There are concerns that information about the potential research use of residual bloodspots could negatively impact support for newborn screening. Additional questions were asked to measure the impact of information about residual biospecimen research on attitudes toward newborn screening and biospecimen research. For the question, “From your experience, and what you understand about newborn screening, how supportive are you of this program?”, respondents in the video group reported the highest support (98% responding “Very supportive” or “Moderately supportive”), followed by the usual care group (95%) and then the app group (91%), p = 0.107; see Table 3. For the question, “From your experience, and what you understand about the Michigan BioTrust, how supportive are you of this program?”, respondents in the video group reported the highest support (96% responding “Very supportive” or “Moderately supportive”), followed by the control group (95%) and then the app group (92%), p = 0.236.
Table 3:
Support, Partner Role, and Decision to participate at Post Survey*
| | Usual care (N = 176) | Video (N = 178) | App (N = 178) |
|---|---|---|---|
| From your experience, and what you understand about newborn screening, how supportive are you of this program? | | | |
| Not supportive at all | 3 (1.8%) | 0 (0.0%) | 3 (1.8%) |
| A little supportive | 5 (3.0%) | 3 (1.8%) | 12 (7.1%) |
| Moderately supportive | 39 (23.2%) | 36 (21.3%) | 34 (20.0%) |
| Very supportive | 121 (72.0%) | 130 (76.9%) | 121 (71.2%) |
| From your experience, and what you understand about the Michigan BioTrust, how supportive are you of this program? | | | |
| Not supportive at all | 4 (2.4%) | 0 (0.0%) | 3 (1.8%) |
| A little supportive | 4 (2.4%) | 7 (4.1%) | 10 (6.0%) |
| Moderately supportive | 51 (30.5%) | 42 (24.7%) | 45 (26.8%) |
| Very supportive | 108 (64.7%) | 121 (71.2%) | 110 (65.5%) |
| Please tell us about your partner’s role in your decision. | | | |
| Me alone | 27 (15.3%) | 30 (16.9%) | 19 (10.6%) |
| Mostly me | 19 (10.7%) | 22 (12.4%) | 26 (14.5%) |
| My partner and me equally | 131 (74.0%) | 124 (69.7%) | 129 (72.1%) |
| Mostly my partner | 0 (0.0%) | 2 (1.1%) | 4 (2.2%) |
| My partner alone | 0 (0.0%) | 0 (0.0%) | 1 (0.6%) |
| What decision did you make (or are you planning to make) about consent for the Michigan BioTrust? | | | |
| Yes, I will participate | 191 (84.9%) | 187 (85.0%) | 176 (78.9%) |
| No, I will not participate | 13 (5.8%) | 18 (8.2%) | 20 (9.0%) |
| Don’t know | 21 (9.3%) | 15 (6.8%) | 27 (12.1%) |

* For each item, there are missing percentages in the 1–4% range.
Questions were asked about the role of the parental partner during the decision-making process (scale 1–5: 1 = Me alone, 3 = My partner and me equally, 5 = My partner alone) and about self-reported decisions regarding participation in the biobank. There were no significant differences between the study group means on the role of the partner in the decision process. The results showed that the mother tended to be the decision-maker more than the partner, but the majority reported equal participation: usual care = 74.0%, video = 69.7%, and app = 72.1%. For the self-reported decision to participate in the BioTrust, there were no differences between the study groups. At the post survey, 84.9% in the usual care group said they would participate (5.8% no; 9.3% don’t know), 85.0% in the video group (8.2% no; 6.8% don’t know), and 78.9% in the app group (9.0% no; 12.1% don’t know). See Table 3 for more details. However, participants in this study reported a significantly higher rate of participation in the BioTrust compared with the overall 2017 state of Michigan consent rates (yes 67.5%, no 21.6%, and consent decision not documented 10.9%).
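The comparison with the statewide rate can be illustrated with a simple proportion test. The manuscript does not specify which test was used, so the sketch below is only an assumed approach: a one-sample z-test for proportions (statsmodels) comparing the pooled self-reported “yes” rate at the post survey against the 2017 statewide opt-in rate of 67.5%. The counts come from Table 3; whether “don’t know” responses belong in the denominator is a judgment call made here purely for illustration.

```python
# Hypothetical illustration (not necessarily the authors' method): one-sample
# z-test of the pooled self-reported "yes" rate against the 2017 statewide
# BioTrust opt-in rate of 67.5%. Counts are taken from Table 3 of this paper.
from statsmodels.stats.proportion import proportions_ztest

yes_count = 191 + 187 + 176                                    # "Yes, I will participate" across arms
n_total = (191 + 13 + 21) + (187 + 18 + 15) + (176 + 20 + 27)  # all responses, incl. "don't know"
state_rate = 0.675                                             # 2017 Michigan opt-in rate

z_stat, p_val = proportions_ztest(count=yes_count, nobs=n_total, value=state_rate)
print(f"Observed rate = {yes_count / n_total:.3f}, z = {z_stat:.2f}, p = {p_val:.4g}")
```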
Discussion
The research use of biospecimens collected in clinical settings, as well as the use of electronic medical record (EMR) data, is proliferating (Grady et al., 2015). However, these practices raise a number of new challenges for obtaining informed consent. One challenge is how to integrate consent into clinical workflow, without adding to the time pressures of other clinical duties, while still providing sufficient education for a thoughtful, informed decision by participants (D'Abramo, Schildmann, & Vollmann, 2015). The purpose of this study was to address this problem of consent during clinical care. Results demonstrated the efficacy and acceptability of using an educational video and an interactive app for consent to research use of leftover newborn screening bloodspots within the clinical setting. However, while the use of an electronic tool to aid the consent process may help ease time-constraint concerns, this study did not expressly test implementation. The results provide support for future feasibility studies to implement these innovative video- or app-based education tools into existing consent processes.
A primary outcome of this research was that the use of an electronic consent education tool, regardless of format (video or app), improved knowledge about the elements of consent for biobanking, improved satisfaction with the consent process, and did not negatively impact support for the newborn screening program. However, there was one key difference in acceptability between the two electronic interventions. While the brochure group reported that not enough information was provided, the video group reported that the amount was about right, and the app group reported that it was slightly too much. This result was surprising, given that the app allowed users to skip over text or click on buttons for more information, while the video required the user to consume the information passively. The video may have been easier for parents in the postnatal environment because they were asked merely to listen and watch, a much easier task when sleep deprived or attending to a newborn.
Another important finding was the significant increase in documented consent decisions, as well as a higher participation rate in the biobank research, for participants in this study compared with previous state rates. These increases may be an artifact of increased support for the biobank consent process after the implementation assessment was completed; significant time and effort were invested, in collaboration with clinical staff, to streamline this study into clinical care prior to any data collection. It is also possible that the research study itself raised awareness about the standard consent process. Allowing the electronic tablet to remain in the patient room until completion may make the educational tool more visible, allow participants to pause it when necessary, and enable delivery independent of healthcare providers’ time. The video and app may also be easier to re-engage with than a brochure that can become lost among other paperwork or competing clinical demands.
Future studies need to assess the feasibility of multimedia presentations in patient rooms, including ways for patients to review the information on their own devices. One important limitation of this study is that, despite efforts to target diverse populations, the sample had limited diversity. Further, health literacy was not measured and may have affected how each intervention was comprehended. Also, the use of research assistants and financial incentives limits the generalizability of these approaches to real-world hospital environments; therefore, future feasibility studies are needed. Finally, the current consent process in the state of Michigan requires a documented decision about research participation and a signature on the BioTrust consent form found on the back of the Guthrie card, after receipt of a companion brochure. This precluded our ability to assess the sole use of an entirely electronic consent process. Future studies will evaluate these consent tools in more diverse settings and with an e-consent process.
Best Practices
Offering modalities of patient education beyond brochure-based efforts can increase patient knowledge and satisfaction. As advancing technology and genomics continue to increase the importance of research with biospecimens collected in the clinical setting, improving patient education and engagement will be critical for continued support. This is particularly evident with leftover bloodspots from newborn screening because the resulting biobanks can capture biospecimens from almost all children in the US without restriction by race, gender, income, or geographic area. Educational tools about the retention of leftover newborn screening bloodspots that are integrated into the clinical setting, and that reduce the amount of clinical time required, will be essential for educating the public about this important need.
Research Agenda
Within the app, we designed several “layers” through which participants could explore additional information beyond what is required in the consent process. These included additional videos and question boxes (such as “Why are there usually leftover blood spots after newborn screening?” and “Why is it helpful to have representation from all groups of people in research?”) within each key point of the consent process. The app also offered an optional comprehension quiz at the end. We found that parents rarely used these opportunities for additional information. It may be that the app offered too much information and overwhelmed participants during the consent process, potentially interfering with knowledge acquisition (Mayer, Heiser, & Lonn, 2001). Future research will need to identify how to balance giving individuals options to explore more information while not providing too much information within the clinical context.
Educational Implications
This study raises questions about the best approach for educating patients in the clinical setting and about when a video or an interactive app may be more or less appropriate. Allowing patients the option to review larger volumes of information may not be useful in the clinical setting when there are competing demands. Providing information before and after encounters in the clinical facility, such as during prenatal care, may be a better approach. However, the study does show that ratings of satisfaction, clarity, and amount of information were significantly higher in the video and app groups than among those who received brochure-based education only. Offering different mechanisms of education in the clinical setting, suited to many levels of health literacy, may be necessary to improve patient understanding.
Acknowledgments
Sources of Support: All phases of this study were supported by an NIH grant from the National Institute of Child Health and Human Development (5R01 HD082148).
Footnotes
No Disclaimers
Clinical Trial Registration: The Effect of Electronic Informed Consent Information (EICI) on Residual Newborn Specimen Research; Number: NCT03141307, Web link: https://clinicaltrials.gov/ct2/show/NCT03141307?term=NCT03141307&rank=1
References
- Atkinson S (2019, February 28). [Michigan Department of Health and Human Services Newborn Screening Follow-up Program].
- Brehaut JC, O'Connor AM, Wood TJ, Hack TF, Siminoff L, Gordon E, & Feldman-Stewart D (2003). Validation of a decision regret scale. Med Decis Making, 23(4), 281–292. doi: 10.1177/0272989x03256005
- D'Abramo F, Schildmann J, & Vollmann J (2015). Research participants' perceptions and views on consent for biobank research: a review of empirical data and ethical analysis. BMC Med Ethics, 16, 60. doi: 10.1186/s12910-015-0053-5
- Flory J, & Emanuel E (2004). Interventions to improve research participants' understanding in informed consent for research: a systematic review. JAMA, 292(13), 1593–1601. doi: 10.1001/jama.292.13.1593
- Grady C, Eckstein L, Berkman B, Brock D, Cook-Deegan R, Fullerton SM, … Wendler D (2015). Broad consent for research with biological samples: Workshop conclusions. Am J Bioeth, 15(9), 34–42. doi: 10.1080/15265161.2015.1062162
- Hall MA, Camacho F, Lawlor JS, Depuy V, Sugarman J, & Weinfurt K (2006). Measuring trust in medical researchers. Med Care, 44(11), 1048–1053. doi: 10.1097/01.mlr.0000228023.37087.cb
- Lewis MH, Goldenberg A, Anderson R, Rothwell E, & Botkin J (2011). State laws regarding the retention and use of residual newborn screening blood samples. Pediatrics, 127(4), 703–712. doi: 10.1542/peds.2010-1468
- Mayer RE (2002). Multimedia learning. Psychology of Learning and Motivation, 41, 85–139.
- Mayer RE, Heiser J, & Lonn S (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93(1), 187–198. doi: 10.1037/0022-0663.93.1.187
- Ormond KE, Cirino AL, Helenowski IB, Chisholm RL, & Wolf WA (2009). Assessing the understanding of biobank participants. Am J Med Genet A, 149A(2), 188–198. doi: 10.1002/ajmg.a.32635
- Paas F, Renkl A, & Sweller J (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1–4. doi: 10.1207/S15326985EP3801_1
- Palmer BW, Lanouette NM, & Jeste DV (2012). Effectiveness of multimedia aids to enhance comprehension of research consent information: a systematic review. IRB, 34(6), 1–15.
- Rothwell E, Goldenberg A, Johnson E, Riches N, Tarini B, & Botkin JR (2017). An assessment of a shortened consent form for the storage and research use of residual newborn screening blood spots. J Empir Res Hum Res Ethics, 12(5), 335–342. doi: 10.1177/1556264617736199
- Rothwell E, Johnson E, Riches N, & Botkin JR (2019). Secondary research uses of residual newborn screening dried bloodspots: a scoping review. Genet Med, 21(7), 1469–1475. doi: 10.1038/s41436-018-0387-8
- Rothwell E, Wong B, Rose NC, Anderson R, Fedor B, Stark LA, & Botkin JR (2014). A randomized controlled trial of an electronic informed consent process. J Empir Res Hum Res Ethics, 9(5), 1–7. doi: 10.1177/1556264614552627
- Sanderson SC, Brothers KB, Mercaldo ND, Clayton EW, Antommaria AHM, Aufox SA, … Holm IA (2017). Public attitudes toward consent and data sharing in biobank research: A large multi-site experimental survey in the US. Am J Hum Genet, 100(3), 414–427. doi: 10.1016/j.ajhg.2017.01.021

