Author manuscript; available in PMC: 2018 Mar 12.
Published in final edited form as: J Empir Res Hum Res Ethics. 2014 Oct 2;9(5):1–7. doi: 10.1177/1556264614552627

A Randomized Controlled Trial of an Electronic Informed Consent Process

Erin Rothwell 1, Bob Wong 1, Nancy C Rose 1,2, Rebecca Anderson 1, Beth Fedor 2, Louisa A Stark 1, Jeffrey R Botkin 1
PMCID: PMC5847281  NIHMSID: NIHMS947029  PMID: 25747685

Abstract

A pilot study assessed an electronic informed consent model within a randomized controlled trial (RCT). Participants who were recruited for the parent RCT project were randomly selected and randomized to either an electronic consent group (n = 32) or a simplified paper-based consent group (n = 30). Participants in the electronic consent group reported significantly higher understanding of the purpose of the study, alternatives to participation, and whom to contact with questions or concerns about the study. However, participants in the paper-based control group reported higher mean scores on some survey items. This research suggests that an electronic informed consent presentation may improve participant understanding for some aspects of a research study.

Keywords: informed consent, randomized controlled trials, electronic


Obtaining adequate informed consent from potential research participants is an ethical challenge in research. Institutional review boards (IRBs) spend a great deal of time reviewing consent forms and, to a lesser extent, evaluating the consent process. However, the research literature provides substantial evidence that many participants do not understand the information conveyed to them during the process of consent, including the study purpose, procedures, risks, benefits, and their rights (Agre & Rapkin, 2003; Cohn & Larson, 2007; Flory & Emanuel, 2004; Henry et al., 2009; Palmer, Lanouette, & Jeste, 2012; Ryan, Prictor, McLaughlin, & Hill, 2008). As many as 30% to 44% of participants who consented to clinical trials did not understand their key components (Daugherty, Banik, Janish, & Ratain, 2000; Howard & DeMets, 1981; Joffe, Cook, Cleary, Clark, & Weeks, 2001a; Verheggen, Jonkers, & Kok, 1996). In one study, only 50% of the parents interviewed after informed consent about enrollment of their child in an oncology randomized controlled trial (RCT) understood the concept of randomization (Kodish et al., 2004).

A particular challenge to participant understanding is the ever lengthening and complex consent form. The required elements of informed consent in the human subjects regulations (45CFR46) include the following:

  1. purpose of the research and procedures;

  2. description of any reasonably foreseeable risks;

  3. benefits associated with participation;

  4. alternatives to participation;

  5. confidentiality assurances;

  6. for research involving more than minimal risk, an explanation as to whether any compensation and medical treatments are available if injury occurs;

  7. an explanation of whom to contact for answers to questions about the research and research subjects’ rights, and whom to contact in the event of a research-related injury to the subject; and

  8. a statement that participation is voluntary, refusal to participate will involve no penalty or loss of benefits to which the subject is otherwise entitled, and the subject may discontinue participation at any time (Department of Health and Human Services [DHHS], 2009).

There is no standard process by which these elements are presented and described to potential research participants. Furthermore, documentation of comprehension is not required and is rarely obtained before a participant signs a consent form. These factors have contributed to barriers for promoting informed choice for research participation (Rowbotham, Astin, Greene, & Cummings, 2013).

To address these barriers, some researchers have used interactive, multimedia informed consent platforms to improve comprehension but have had mixed results. Flory and Emanuel (2004) conducted a systematic review of 12 multimedia consent interventions. They found limited success in improving research participants’ understanding and suggested that the most effective way to improve consent was a one-on-one conversation. However, a more recent review by Palmer et al. (2012) found improved comprehension through the use of multimedia consent tools in 10 of 20 studies reviewed. Both reviews stated that conceptually grounded and methodologically rigorous research is needed to understand when multimedia consent tools are effective. The general consensus is that the value of multimedia consent tools is promising but remains unclear.

The purpose of this study was to explore the appropriateness of an electronic informed consent model within an RCT. The research question was whether an electronic informed consent process was more effective than a simplified paper-based consent process within an RCT for a minimal risk study. We hypothesized that the electronic informed consent process would result in significantly higher mean scores on survey items about the study-specific details of this research project.

Method

Participants

This study was a pilot substudy within a multicenter RCT. The parent study was designed to assess prenatal education about newborn screening and dried bloodspot retention by state health departments following newborn screening. In the parent study, participating women in Utah, New York, and California who were 32 to 38 weeks pregnant were randomly assigned to one of three study groups with each group receiving different education materials. The participants were assessed 2 to 4 weeks postnatally on knowledge and attitudes about newborn screening and bloodspot retention.

During January to March 2014, participants who were recruited within one of the study sites (Utah) were randomly selected for participation in this substudy. Of the total participants recruited (n = 240) in the parent study at this site, 62 participants (26%) were randomized to a control group or an electronic consent group. Inclusion criteria for the multicenter RCT were English- or Spanish-speaking women with full-term pregnancies who gave birth with normal birth outcomes. Spanish-speaking women were excluded from this pilot substudy. Participant demographics are listed in Table 1. All participants were female and 61% had given birth before. Of those who had given birth before, 31% had given birth to one child and 19.4% had given birth to two or more children.

Table 1.

Participant Characteristics.

n %
Marital Status
 Married or living together 58 93.55
 Partner, but not living together 2 3.23
 Other 2 3.23
Ethnicity
 Non-Hispanic 56 90.32
 Hispanic 5 8.06
 No answer 1 1.61
Race
 Asian 4 6.45
 White 52 83.87
 Native Hawaiian/Pacific Island 2 3.23
 Other 1 1.61
 No answer 3 4.84
Income
 Less than US$24,999 2 3.23
 US$25,000-US$50,000 14 22.58
 US$50,001-US$100,000 19 30.65
 >US$ 100,001 17 27.42
 Not sure 2 3.23
 No answer 8 12.90
Education
 Some high school 1 1.61
 High school diploma or General Education Diploma 5 8.06
 Some college 11 17.74
 Associate’s degree 6 9.68
 Bachelor’s degree 26 41.94
 Graduate degree 13 20.97

Procedures

Potential research participants for the RCT were identified through medical chart review and approached by a nurse researcher in the waiting room of the clinic. For this substudy, participants were randomly selected after they agreed to participate in the RCT. For the electronic consent model (intervention group), participants watched a 5-min video presentation on an iPad and received a paper copy of the consent form. For the control group, a nurse researcher administered the informed consent procedure verbally, and participants were encouraged to also read the paper consent form that they received. After the consent process, all participants answered demographic questions and then were randomized to one of the three study groups in the parent RCT. Immediately following, participants in the substudy completed a brief, 14-item survey about the consent process. Participants who were randomized to the electronic consent group were asked whether they would allow a researcher to conduct a telephone interview about the video. All participants (100%) in the electronic consent group agreed to be interviewed and all interviews were completed approximately 1 week after the consent process. The interviews lasted approximately 15 min and were audio recorded.
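The two-arm allocation described above can be sketched as a simple coin-flip randomization. This is only an illustration: the paper does not describe the actual allocation mechanism, and the function name, arm labels, and seed below are assumptions.

```python
import random

def allocate(participant_ids, seed=2014):
    """Assign each participant to one of two consent arms by a fair
    coin flip (simple, unstratified randomization).

    Illustrative sketch only; the study's actual allocation procedure
    is not described in the paper.
    """
    rng = random.Random(seed)
    arms = {"electronic": [], "paper": []}
    for pid in participant_ids:
        # Each participant is independently assigned to one arm.
        arms[rng.choice(["electronic", "paper"])].append(pid)
    return arms

# 62 substudy participants, as in the pilot
groups = allocate(range(1, 63))
```

Simple randomization of this kind yields roughly, but not exactly, equal arms; the substudy's 32/30 split is consistent with that.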

The informed consent form used in the parent study at this study site was simplified in anticipation of this pilot study. To gain IRB approval for an electronic informed consent process, the video-based information needed to be consistent with the paper-based consent. Therefore, the research team focused on the development and approval of a simplified paper-based consent prior to video development. All of the required elements were included in the paper-based consent form but it used simple, short sentences with bulleted text to highlight key aspects of the research. Overall, it was shorter than the typical consent form approved by the IRBs of the two institutions (of the research study and research site) because it used bulleted text as opposed to complete sentences within paragraphs. See Table 2 for an example section of the simplified paper-based consent form.

Table 2.

Example of Simplified Paper-Based Consent Form.

You will be randomly assigned to one of three groups. This means that you or your health care provider cannot choose your group. Your assignment to a group is up to chance, like rolling dice.
Everyone will receive information on newborn screening. In addition, some mothers will view one or two 5-min videos near the end of their pregnancy.
  • Some mothers will view a video about newborn screening and a second video about leftover blood spots.

  • Other mothers will view a video about newborn screening only.

You will be asked to do a 15- to 20-min telephone survey 2 to 6 weeks after your due date. The survey is about your experiences, knowledge, and attitudes on newborn screening and leftover blood spots.
  • Some mothers will be asked if we can contact their partner to do a similar survey.

  • A few mothers will be asked for a telephone interview about their choices with newborn screening and leftover blood spots.

The text of the paper-based consent document served as the narration for the 5-min electronic informed consent video, minus contact details such as addresses and telephone numbers, which were available in the paper form that individuals also received. Video footage (B-roll), photographs, graphics, and animations were used to visually illustrate the information in the narration and were closely aligned with it.

Theoretical Frameworks

Two theoretical frameworks guided this research. Mayer’s (1997) principles of multimedia learning were used in the development of the electronic informed consent video. Five principles have been proposed to guide the development of multimedia education materials to enhance learning and retention (Mayer, 1997; Mayer & Moreno, 2002). Each of the following principles was used in the development of the intervention:

  1. Multiple representation principle: It is better to present an explanation in words and pictures than solely in words.

  2. Contiguity principle: It is best to present corresponding words and pictures contiguously rather than separately.

  3. Split-attention principle: Present words as auditory narration rather than as visual onscreen text.

  4. Individual differences principle: The first three principles apply more to low-knowledge than high-knowledge learners.

  5. Coherence principle: Use few rather than many extraneous words and pictures.

In addition to these principles of multimedia learning, cognitive load theory has served as a major construct for understanding cognitive processes and has heavily influenced instructional design for adult learning (Paas, Renkl, & Sweller, 2003). Research has clearly demonstrated that the more a person has to learn in a specified length of time, the less information is understood and retained. On average, a person can only retain seven items (e.g., the digits in a phone number) plus or minus two (Miller, 1956). Improved comprehension is achieved through building upon existing knowledge (schemas) and presenting small chunks of new information in a manner that promotes active learning.

Survey and Interview Instruments

The survey used in this study was based on a survey that measured subjective understanding of informed consent within clinical trials (Joffe et al., 2001a, 2001b). An example of one of the questions is as follows: When you signed the consent form to participate in this study, how well did you understand the following aspects of this study? If you did not understand the item at all, please circle 1. If you understood it very well, please circle 5. If you understood it somewhat, please circle a number between 1 and 5. Table 4 lists the survey questions. This survey has demonstrated good test-retest reliability (intraclass correlation coefficient [ICC] = .66; Joffe et al., 2001b). Whenever possible, survey items were modified to directly reflect information in the informed consent form for this particular study. The survey took approximately 1 min to complete.

Table 4.

Mean and Standard Deviations of Survey Questions by Group.

Item | Control (n = 30): M, SD | Electronic Informed Consent (n = 32): M, SD | p value
1. The fact that your research participation involves follow-up phone calls. 4.90 0.403 5.00 0.000 .167
2. What the researchers are trying to find in the research study. 4.52 0.829 4.19 1.091 .094
3. How long you will be in this research. 4.53 0.937 4.69 0.644 .528
4. The steps for participating in this study. 4.67 0.606 4.84 0.448 .202
5. Which of these education approaches in this study are new. 3.97 1.245 3.84 1.247 .088
6. The possible risks and discomforts from participation. 4.83 0.531 4.78 0.491 .300
7. The possible benefits to your baby and you from participation in this study. 4.77 0.430 4.66 0.653 .402
8. How your participation in this study may benefit other parents. 4.67 0.661 4.38 1.008 .341
9. The alternatives to participation in this study. 4.37 1.098 4.88 0.421 .047*
10. The effect of participation on your clinical care. 4.59 0.867 4.78 0.608 .469
11. Who to contact if you are upset because of participation in this study. 4.03 1.402 4.41 0.798 .002*
12. Whom you should contact if you have questions or concerns about this study. 4.13 1.332 4.34 0.971 .009*
13. The fact that participation in this study is voluntary. 5.00 0.000 4.97 0.177 .283
14. Overall, how well did you understand this study when you signed the consent form. 4.63 0.669 4.72 0.581 .019*
* Significant at p < .05.

The qualitative interview guide was developed by the members of the research team with experience in research ethics, informed consent, health education, and qualitative research. Table 3 lists the questions used in the interviews. The survey and interview in this pilot study were purposively designed to be brief to decrease the research burden on participants already enrolled in a RCT.

Table 3.

Telephone Interview Questions.

  1. Can you tell me what you remember about the video informed consent video?

  2. What did you like the best about the video?

  3. What did you like the least about the video?

  4. What would change in the video to improve how information about the research study was communicated?

  5. Now, I would like to ask a few questions about the actual research study. Can you tell me what the purpose of the research study was?

  6. Can you explain to me how your group in the study was chosen?

  7. Can you tell me what you think the risks are for this study?

  8. Can you tell me what you think are the benefits of this study?

Analyses

An ANCOVA was used to analyze the survey data, with “recruiter” (two different recruiters) as a covariate to control for any recruiter differences. Each of the survey items was analyzed as a separate dependent variable. We recognize that conducting multiple univariate analyses may inflate the Type I error rate, but given the pilot nature of this study, we deemed this acceptable. Table 4 lists the results of these analyses.
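As a rough sketch, the per-item comparison described above (a linear model with a group term and a recruiter covariate, with the group effect assessed by an F test of nested models) might look like the following. The data, function name, and 0/1 coding are illustrative assumptions, not the study's actual analysis code.

```python
import numpy as np

def ancova_group_f(scores, group, recruiter):
    """F statistic for the group term in an ANCOVA-style model.

    Compares the residual sum of squares of the full model
    (intercept + group + recruiter) against the reduced model
    (intercept + recruiter). Both group and recruiter are coded 0/1.
    """
    y = np.asarray(scores, dtype=float)
    X_full = np.column_stack([np.ones_like(y), group, recruiter])
    X_reduced = np.column_stack([np.ones_like(y), recruiter])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return resid @ resid

    rss_full, rss_reduced = rss(X_full), rss(X_reduced)
    df_resid = len(y) - X_full.shape[1]
    # F with (1, df_resid) degrees of freedom; the p value would come
    # from the corresponding F distribution.
    return (rss_reduced - rss_full) / (rss_full / df_resid)

# Simulated illustrative data (not the study's): 30 control, 32 electronic
rng = np.random.default_rng(0)
group = np.repeat([0, 1], [30, 32])
recruiter = rng.integers(0, 2, size=62)
scores = 4.2 + 0.4 * group + 0.1 * recruiter + rng.normal(0, 0.5, 62)
F = ancova_group_f(scores, group, recruiter)
```

In practice this would typically be run once per survey item; a formula interface such as statsmodels' `ols("score ~ C(group) + C(recruiter)")` is an equivalent, more idiomatic route.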

The interview recordings were transcribed and a member of the research team verified transcription by listening to the recordings while reading the transcripts. A content analysis was used to analyze the data from 15 interviews. A coding template was created from reading the transcripts and the semistructured interview guide. One of the researchers then systematically applied this template to the transcripts. Codes were grouped together into categories based on similarity for each of the questions in the interview. Results are presented using the participants’ own words to represent their personal experiences during the consent process.

Results

The survey results are presented in Table 4. Out of the 14 survey items, 4 showed a statistically significant difference between the two groups, including the item about overall understanding of the study. The electronic consent group reported higher understanding for the following: “The alternatives to participation in this study” (4.88 ± 0.42 vs. 4.37 ± 1.10, p = .047); “Who to contact if you are upset because of participation in this study” (4.41 ± 0.80 vs. 4.03 ± 1.40, p = .002); “Whom you should contact if you have questions or concerns about this study” (4.34 ± 0.97 vs. 4.13 ± 1.33, p = .009); and “Overall, how well did you understand this study when you signed the consent form” (4.72 ± 0.58 vs. 4.63 ± 0.67, p = .019). Some survey items had higher mean scores for the paper-based consent group. These items focused on what the researchers were trying to find in this study, which education approaches were new, benefits, risks, and voluntary participation.

The qualitative results supported the survey results in that several participants stated the video was easy to understand and held their attention more than a paper-based approach would have. Representative quotes included the following: “The video makes it stick”; “I liked that it was easy to understand”; and “It was a video. It made it more real. Instead of reading something and you just skim over it. It made you think about things more.”

When asked what they would change about the video, two categories were most frequently reported. Several participants stated that the 5-min video was too long (“This is getting a little long” and “The length [shorten it]”). Other participants thought that the disclosure about risks for this minimal risk study should be changed. For example, one participant stated, “That was the one part that I thought was funny. I didn’t think there were any risks [in the parent study].” Another participant stated, “I remember them saying if you got upset you can talk to someone. That was kinda funny because I don’t see myself upset by this.”

Participants in the electronic informed consent group were asked about other study-specific details such as what the study was about, the benefits of the study, and how a participant was assigned to one of the three study groups. All of the participants were able to recall accurate and detailed responses about the study. These responses indicated that study-specific details were understood. Representative quotes are included below:

To find how to better inform people about newborn screening and the purpose of it and why it is important.

To see how well we are educating people about the newborn screening testing after birth and then the leftover bloodspots.

The whole study was to find a better way to educate people about genetic screening and what it is. That there would be three groups and one group didn’t do anything, the next group watched one video and the third group watched three videos.

To inform parents what is going to happen with their bloodspots after the test were done on that particular child, that it could be kept [by the] state and used for additional studies.

Discussion

The research community is dependent on the integrity of the investigator to obtain informed consent, and yet the current format of the informed consent document complicates this process (Bailey et al., 2013). These concerns are magnified by the increasing length and complexity of consent forms, which further contributes to the lack of participant understanding (Henry et al., 2009). Identifying new mechanisms to improve research participant comprehension during the consent process is necessary to promote the ethical conduct of research. However, along with other research evidence, this study demonstrates that different formats of informed consent may not be applicable to all research contexts (Bailey et al., 2014). For example, in this study, we found that for 6 of the 14 items on the survey, participants in the simplified paper-based consent group had higher mean scores than those in the electronic informed consent group. These 6 items focused on study-specific details, including the purpose of the research to test a new education approach, benefits, risks, and voluntary participation. However, these differences were not significant and there was large variability in responses on these survey items.

Interestingly, these survey results contradict the accurate, study-specific details that participants in the electronic consent group provided in response to interview questions. One possible explanation for these discrepancies is the way in which the research assistants presented the informed consent process. For the electronic consent group, the research assistants gave the iPad to the participant and allowed her to watch the video on her own (the research assistant sat near her in the clinic room). These participants also were given the paper consent form after they watched the video, but may have been less likely to read it without the research assistant sitting with them.
For the paper-based consent group, the research assistants verbally highlighted the study-specific requirements and then allowed the woman to read the paper-based consent while the research assistant sat nearby. Therefore, participants in the latter group had two opportunities to receive information about the study: verbally and in writing. The general conclusions from this pilot suggest that the electronic informed consent process resulted in higher overall understanding of the study than the simplified paper-based consent, but more research is warranted.

Another finding that may inform the consent process came from the participants’ qualitative responses during the interviews. They reported that some information presented during the electronic informed consent process seemed misplaced or overly long. This suggests that it might be most effective to deliver some elements of the consent process in electronic format and others on paper. A mixed-format approach could tailor which content is presented electronically and which on paper to improve overall comprehension. For example, several studies have suggested that information about privacy and confidentiality, while legally required, is cumbersome and distracts research participants from understanding the requirements of study participation (Kass, Chaisson, Taylor, & Lohse, 2011).

There are several limitations to this pilot study. Although significant differences were found between the two consent formats for a few survey items, more research with a larger and more diverse sample is needed. Furthermore, it is unknown whether the shortened paper-based consent form improved or decreased participant understanding compared with the original consent form; future research should include a traditional paper-based consent group for comparison. In addition, this pilot study was conducted with women who were around 36 to 38 weeks gestation and some interviews were conducted with women who had given birth a few days earlier. These recently postpartum women may not have recalled as much detail about the consent process as women who were interviewed later and had more time to recover from giving birth. Also, this was a minimal risk study, which may have led participants in both study groups to be less attentive to or concerned about the study-specific details. Therefore, studies involving greater than minimal risk may be needed to fully evaluate the effectiveness of electronic consent. Finally, the intervention for the parent RCT also used a video-based educational intervention in two of the three study groups. Although the survey was administered immediately after obtaining consent, the addition of more videos may have influenced the interview responses.

Best Practices

Multimedia consent approaches have been tested with mixed results on participant comprehension (Flory & Emanuel, 2004; Palmer et al., 2012). Reviews of these studies as well as this research study recommend additional research on what format might be the most applicable for improved participant knowledge about the consent process. Investigating hybrid approaches to consent (electronic and paper) may yield a better understanding of how to more effectively inform participants during the consent process.

Research Agenda

A more nuanced investigation about the role of electronic informed consent on participant comprehension needs to be conducted. This should be further explored within both minimal and more than minimal risk research studies. A hybrid approach to informed consent that combines electronic and paper-based consent approaches may address some limitations of the current informed consent process.

Educational Implications

This research suggests that a mixed-format approach to consent may improve its efficacy. Identifying ways to improve research participant comprehension of the protocol during informed consent will require additional research. Future education about informed consent may want to include an assessment of participant comprehension.

Acknowledgments

We thank the Genetic Science Learning Center team at the University of Utah for their participation in helping to develop the text for the simplified paper-based consent form and for producing the electronic informed consent video.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The University of Utah Annette Poulson Cumming College of Nursing provided funding for this research. The parent study was funded by the National Institutes of Health (1R01HG006266-02).

Biographies

Erin Rothwell is an associate professor (research) in the College of Nursing and Division of Medical Ethics and Humanities at the University of Utah. Her research focuses on ethical issues with participant comprehension and informed consent with newborn screening, prenatal testing, and biobanking. She was the principal investigator of this study and was involved in all aspects of development and conduct of the research, and writing of the article.

Bob Wong is an associate professor (research) in the College of Nursing and the director of statistics for the Emma Eccles Jones Nursing Research Center. His interests range from psoriasis, to oncology symptom management, to newborn screening, and longitudinal statistical analysis, data collection systems, and data visualization. He was involved with data analysis and manuscript review.

Nancy C. Rose is a professor of obstetrics and gynecology at the University of Utah and director of Reproductive Genetics and Intermountain Healthcare. She is interested in public health as it applies to prenatal screening and diagnosis. She was involved with study oversight, patient recruitment, and manuscript review.

Rebecca Anderson is the assistant director of the GeneSIS Center at the University of Utah, and an associate of the Division of Medical Ethics and Humanities. Her research is focused on the ethical, legal, and social implications of genetic technology with a particular emphasis on newborn screening and biobanking. She participated in drafting the International Society of Nurses in Genetics position statement revision on Informed Decision-Making and Consent, and was involved with study design, instrument development, and manuscript review.

Beth Fedor is a clinical research coordinator at Intermountain Healthcare in Women and Newborn studies. Her research includes exercise immunology and women and newborn issues. She was involved in patient recruitment, study processes, and manuscript review.

Louisa A. Stark is an associate professor (research) of human genetics and directs the Genetic Science Learning Center (GSLC) at the University of Utah. Her research focuses on developing and testing the efficacy of educational interventions, including materials for patients and the public. She participated in developing the text for the simplified paper-based consent form, oversaw the GSLC’s production of the electronic informed consent, assisted in the design of the study, and participated in manuscript review.

Jeffrey R. Botkin is a professor of pediatrics and the associate vice president for research at the University of Utah. His research focuses on the ethical, legal, and social issues in human genetics. He is the principal investigator on the parent study for this project and assisted in the design and conduct of the study and in drafting the manuscript.

Footnotes

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

  1. Agre P, Rapkin B. Improving informed consent: A comparison of four consent tools. IRB. 2003;25(6):1–7. [PubMed] [Google Scholar]
  2. Bailey DB, Bann C, Bishop E, Guarda S, Barnum L, Roche M. Can a decision aid enable informed decisions in neonatal nursery recruitment for a fragile X newborn screening study? Genetics in Medicine. 2013;15:299–306. doi: 10.1038/gim.2012.135. [DOI] [PubMed] [Google Scholar]
  3. Bailey DB, Raspa M, Wheeler A, Edwards A, Bishop E, Bann C, Appelbaum PS. Parent ratings of ability to consent for clinical trials in fragile X syndrome. Journal of Empirical Research on Human Research Ethics. 2014;9(3):18–28. doi: 10.1177/1556264614540591. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Cohn E, Larson E. Improving participant comprehension in the informed consent process. Journal of Nursing Scholarship. 2007;39:273–280. doi: 10.1111/j.1547-5069.2007.00180.x. [DOI] [PubMed] [Google Scholar]
  5. Daugherty CK, Banik DM, Janish L, Ratain MJ. Quantitative analysis of ethical issues in phase I trials: A survey interview of 144 advanced cancer patients. IRB. 2000;22:6–14. [PubMed] [Google Scholar]
  6. Department of Health and Human Services. Code of Federal Regulations: Title 45 Public Welfare—Part 46 Protection of Human Subjects. 2009 Retrieved from http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html#46.116. [PubMed]
  7. Flory J, Emanuel E. Interventions to improve research participants’ understanding in informed consent for research: A systematic review. Journal of the American Medical Association. 2004;292:1593–1601. doi: 10.1001/jama.292.13.1593. [DOI] [PubMed] [Google Scholar]
  8. Henry J, Palmer BW, Palinkas L, Glorioso DK, Caligiuri MP, Jeste DV. Reformed consent: Adapting to new media and research participant preferences. IRB. 2009;31(2):1–8. [PMC free article] [PubMed] [Google Scholar]
  9. Howard JM, DeMets D. How informed is informed consent: The BHAT experience. Controlled Clinical Trials. 1981;2:287–303. doi: 10.1016/0197-2456(81)90019-2. [DOI] [PubMed] [Google Scholar]
  10. Joffe S, Cook EF, Cleary PD, Clark JW, Weeks JC. Quality of informed consent in cancer clinical trials: A cross-sectional survey. The Lancet. 2001a;358:1772–1777. doi: 10.1016/S0140-6736(01)06805-2. [DOI] [PubMed] [Google Scholar]
  11. Joffe S, Cook EF, Cleary PD, Clark JW, Weeks JC. Quality of informed consent: A new measure of understanding among research subjects. Journal of the National Cancer Institute. 2001b;93:139–147. doi: 10.1093/jnci/93.2.139. [DOI] [PubMed] [Google Scholar]
  12. Kass NE, Chaisson L, Taylor HA, Lohse J. Length and complexity of US and international HIV consent forms from federal HIV network trials. Journal of General Internal Medicine. 2011;26:1324–1328. doi: 10.1007/s11606-011-1778-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Kodish E, Eder M, Noll RB, Ruccione K, Lange B, Angiolillo A, Drotar D. Communication of randomization in childhood leukemia trials. Journal of the American Medical Association. 2004;291:470–475. doi: 10.1001/jama.291.4.470. [DOI] [PubMed] [Google Scholar]
  14. Mayer RE. Multimedia learning: Are we asking the right questions? Educational Psychologist. 1997;32:1–19. [Google Scholar]
  15. Mayer RE, Moreno R. Animation as an aid to multimedia learning. Educational Psychology Review. 2002;14:87–98. [Google Scholar]
  16. Miller GA. The magic number seven plus or minus two: Some limits on our capacity to process information. Psychological Review. 1956;63:81–97. [PubMed] [Google Scholar]
  17. Paas F, Renkl A, Sweller J. Cognitive load theory and instructional design: Recent developments. Educational Psychologist. 2003;38:1–4. [Google Scholar]
  18. Palmer BW, Lanouette NM, Jeste DV. Effectiveness of multimedia aids to enhance comprehension of research consent information: A systematic review. IRB. 2012;34(6):1–15. [PMC free article] [PubMed] [Google Scholar]
  19. Rowbotham MC, Astin J, Greene K, Cummings SR. Interactive informed consent: Randomized comparisons with paper consents. PLoS ONE. 2013;8(3):e58603. doi: 10.1371/journal.pone.0058603. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Ryan RE, Prictor MJ, McLaughlin KJ, Hill SJ. Audio-visual presentation of information for informed consent for participation in clinical trials. Cochrane Database of Systematic Reviews. 2008;1 doi: 10.1002/14651858.CD003717.pub2. Article CD003717. [DOI] [PubMed] [Google Scholar]
  21. Verheggen FWSM, Jonkers R, Kok G. Informed consent in clinical trials. Health Policy. 1996;36:131–153. doi: 10.1016/0168-8510(95)00805-5. [DOI] [PubMed] [Google Scholar]
