Author manuscript; available in PMC 2020 Aug 17. Published in final edited form as: Patient Educ Couns. 2018 Dec 21;102(5):944–951. doi: 10.1016/j.pec.2018.12.022

Improving quality of the informed consent process: Developing an easy-to-read, multimodal, patient-centered format in a real-world setting

Karen A Lindsley
PMCID: PMC7429926  NIHMSID: NIHMS1554230  PMID: 30635222

Abstract

Objective:

To develop a patient-centered informed consent and assessment tool written at a 6th grade-level that is multimodal, affordable, transportable, and readily modifiable for protocol updates.

Methods:

This quality improvement initiative was performed in two phases on an actively recruiting study at a pediatric diabetes clinic. In phase I, 38 volunteers underwent the standard-paper consent process, completed a comprehension assessment, and provided feedback. Using this feedback and the structure of the Plan-Do-Study-Act cycle, a multimodal consent and assessment were developed. In phase II, volunteers were randomized to the standard (n = 25) or the multimodal consent (n = 25), and all completed the same comprehension assessment via touch-screen tablet. Primary outcomes were comparisons of the individual and total comprehension assessment scores.

Results:

Total comprehension scores were higher in the multimodal versus the standard consent group (p < 0.001) and on the elements of benefits (p < 0.001), risks (p < 0.001), volunteerism (p < 0.012), results (p < 0.001), confidentiality (p < 0.004) and privacy (p < 0.001).

Conclusion:

A multimodal consent and assessment presented sequentially on a touch-screen tablet were patient-centered enhancements to standard consent.

Practice Implications:

Multimodal standardization of delivery with improved readability may strengthen the informed consent process.

Keywords: Informed consent, multimodal, patient-centered, quality improvement, PDSA, affordable, real-world, questionnaire development

1. Introduction

A properly executed informed consent (IC) is a continuous process, not a singular event. The ‘process’ includes ongoing, interactive discussions providing volunteers with information sufficient to support and maintain informed decision-making (45 CFR §46.116 and 21 CFR §50.20). Required consent elements include study purpose, risks, benefits, all procedures, costs, compensation, volunteerism, withdrawal and confidentiality (21 CFR §50.25). Consent information grows increasingly complex as science, technology and genetic personalization evolve, yet IC must still inform with respect for persons, justice and beneficence [1]. Despite the legal mandate for clarity, volunteers often do not understand the information provided in the standard-paper informed consent process [2–4]. Using technology to provide information centered on the patient, with clear, concise audio and visual communication, enhances volunteer comprehension [2,5,6].

Methods to heighten patient engagement in research may improve research relevance, impact and understanding [7]. Patient-centered interventions to enhance consent comprehension are described in the literature [5,6,8–10]. They include consistent multimodal (MM) delivery, improved readability, and a comprehension assessment (CA) followed by an interactive review. Multimodal delivery is defined as communication given using at least two simultaneous modes, such as visual, aural, spatial or tactile. Improved readability implies reducing the difficulty, or reading grade level, of written material. An interactive CA review includes a back-and-forth dialogue regarding the study consent and the comprehension questions. This initiative sought to introduce these literature findings into a real-world setting (an actively recruiting research study in an outpatient clinic) in a practical, effective and affordable manner [5,6,8–10]. This MM format was then compared with the standard-paper consent and process by volunteers participating in a research study. The purpose of this improvement initiative was to develop and evaluate a consistently presented (standardized) multimodal-video consent with visual text and narration script written at a 6th grade level.

1.1. Multimodal technology and IC comprehension

Despite standardization of written templates and provider training on IC delivery [11–13], comprehension remains sub-optimal [13]. Measures reported in the consent literature that positively affected study understanding (multimodal format, improved readability and a post-consent comprehension assessment) were synthesized in the development of this IC video. Although video and technology use has grown explosively in funded research seeking improved volunteer consent understanding [5,6,8–10], this information is poorly integrated into real-world IC processes [14]. Technologically advanced professional platforms such as ePRISM simplify, personalize, and improve consent process comprehension [10], but are costly and not widely implemented. Multimodal formats have been used in IC improvement studies [2,5,6,8–10], and a recent trial demonstrated that MM approaches (slideshows with voice-over) best improved volunteer comprehension of study consent [5]. In addition to MM technology, improving IC readability may enhance volunteer comprehension.

1.2. Literacy, readability and IC comprehension

Adult literacy and learning [8,11] are variable, and literacy skills in the United States fluctuate widely, with thirteen percent of adults reading at the highest proficiency and eighteen percent at the lowest level [15]. The National Institutes of Health (NIH) recommends that IC be written at the 6th to 8th grade level [16]. However, the consent literature demonstrates that neither IC readability nor volunteer literacy is routinely assessed [4,17], and comprehension remains sub-optimal [18]. Research consents continue to be long, complex and hard to read [3,19,20], and volunteers report that IC failed to inform them [21]. An intentionally designed, easy-to-read MM consent was developed for this quality improvement. This initiative sought to test understanding of consent by exposing volunteers to one of two formats (standard or easy-to-read MM) and then comparing post-consent comprehension scores between exposure groups.

1.3. Post-assessment, interaction and IC comprehension

Without a structured post-assessment, understanding is assumed or inconsistently reviewed, which may lead to incomplete volunteer understanding [19]. Combining standardized media explanations with a post-assessment to document the consent process and interaction [3,22] may improve comprehension [3,23]. Increasing volunteer interaction [23] naturally, with a back-and-forth discussion of the assessment questions, stimulates meaningful learning [3]. Incorporating an easy-to-read post-consent comprehension assessment may also provide a standardized, non-threatening method to document and support the FDA-mandated process of consent.

2. Methods

2.1. Structure of quality improvement initiative within actively recruiting research setting

The Type 1 Diabetes TrialNet Pathway to Prevention (TN01) study is a diabetes risk screening initiative conducted by a worldwide network of diabetes centers (www.trialnet.org). The goal of TN01 is to identify family members of persons diagnosed with type 1 diabetes (T1D) who are at high risk for disease development by testing blood for pancreatic autoantibodies. Antibody (AB) identification in relatives supports the risk prediction and staging of T1D [24]. The TN01 screening study serves as a gateway to diabetes prevention or intervention trials. The primary research study for family volunteers was the initial antibody screening. The sample for this single-center quality improvement initiative was derived solely from volunteers expressing interest in the primary antibody screening study (AB study). After a volunteer expressed interest in the antibody study, they were asked whether they were also interested in the additional consent QI initiative. This initiative is a secondary study of consent comprehension and of MM format development and usability, conducted in a real-world study setting.

The QI initiative was conducted in two phases. In Phase I, gaps in volunteer understanding of the antibody study elements following the standard consent process were identified. A multimodal consent and comprehension assessment were then developed to address the identified gaps in volunteer IC understanding. Thirty-eight antibody study volunteers participating in the standard consent process completed the initial comprehension assessment. Patient-centered feedback was sought and used directly to refine the final CA questions.

In Phase II of the QI initiative, fifty additional TN01 screening volunteers were recruited. They were randomly assigned to undergo consent via the standard or MM format. The MM format involved the visual, aural and tactile senses; the touch-screen was employed to support the tactile (touch) modality of learning. The consent video was embedded into the introduction of the comprehension assessment assigned to the MM format, so only the MM group was exposed to the consent video. Following standard or MM consent, all volunteers completed the identical comprehension assessment on the same Samsung touch-screen tablet. Using the same tablet for all post-assessments was an intentional measure to reduce testing format variation. Demographic data, usability rankings of the touch-screen tablet and post-consent CA responses were captured on the touch-screen tablet via Survey Monkey®. After comprehension testing, but prior to performance of any study procedures, the MM volunteers were consented or ‘re-consented’ with the standard-paper consent process. All volunteers were given ample time for questions and interaction prior to study performance to ensure compliance with all federal, local and study guidelines.

2.2. Plan-Do-Study-Act cycle (PDSA)

The PDSA cycle is a process of continuous improvement through a ‘test of change’ [20,25]. One uses the PDSA cycle to plan and test a change based upon evidence, then to evaluate the impact and act on the change, if reasonable. Specifically, P represents the plan to test the change, D is do (implement the change), S is study (observe the change) and A is act (determine whether the desired change was achieved) [20,25]. PDSA supported improvement with learning and actionable information [26] by guiding ongoing, incremental change to the comprehension assessment and MM video. PDSA contrasts with typical research performance, where modifications are delayed until extensive data are collected [27]. Revision of the comprehension assessment and MM consent continued until saturation was obtained, or until volunteers and health team members had no further feedback for improvement.

2.3. Preparation and development

2.3.1. Comprehension assessment development

A literature review for questionnaires measuring volunteer comprehension of informed consent was conducted. The process for question development and scoring was modeled upon the Deaconess Informed Consent Comprehension Test (DICCT) [11] and other consent assessments [28]. Concurrently, the QI team identified frequently misunderstood study elements (45 CFR §46.116) from a pool of questions based on the consent literature, published consent tests and local antibody-study volunteer feedback. Comprehension assessment questions were then consolidated around the most frequently misunderstood consent elements of the primary study. Nine distinct questions evolved and were refined to read at the 6th grade level. Diabetes clinic general staff (medical assistants, phlebotomists) then pre-tested the CA questions for readability, structure and content; changes were incorporated.

Next, the local QI team verified that each question evaluated one consent element. Question stems were enhanced for clarity and brevity and were not designed to deceive the volunteer. The question regarding study purpose was intentionally specific due to repeated observations of volunteers having difficulty understanding the antibody or immune purpose of the study with the standard consent.

The Flesch Kincaid, SMOG, Fry Graph and Gunning Fog readability assessment tools were used to calculate and support ongoing reductions to the CA reading level (http://www.readabilityformulas.com/freetests/six-readability-formulas.php; http://www.readabilityformulas.com/freetests/fry-graph.php). Four experienced researchers (two physicians and two registered-nurse coordinators) evaluated each question for face validity and determined that the questions measured the intended study elements. The PDSA cycle supported ongoing CA revisions using the test scores and feedback from the thirty-eight volunteers, multiple diabetes researchers and clinical staff. The consent elements queried and the CA questions formed the foundation of the teaching outline for the multimodal slides (visual text and narration script). Questions on demographics, format preference and evaluation of MM and tablet usability were modeled after questionnaires by Tait et al. [29] and added to the tablet assessment.
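The readability formulas behind these tools are published and straightforward to compute. The following minimal Python sketch is an illustration only, not the calculators cited above (which use more careful sentence and syllable counting); it shows how Flesch Kincaid Grade Level, Flesch Reading Ease and Gunning Fog scores are derived from counts of sentences, words, syllables and complex words:

    import re

    def count_syllables(word):
        """Naive syllable estimate: count groups of consecutive vowels."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def readability(text):
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        complex_words = sum(1 for w in words if count_syllables(w) >= 3)
        wps = len(words) / len(sentences)      # average words per sentence
        spw = syllables / len(words)           # average syllables per word
        return {
            "flesch_kincaid_grade": 0.39 * wps + 11.8 * spw - 15.59,
            "flesch_reading_ease": 206.835 - 1.015 * wps - 84.6 * spw,
            "gunning_fog": 0.4 * (wps + 100 * complex_words / len(words)),
        }

    print(readability("We will test your blood. The test shows your risk. You may stop at any time."))

Automated counters can differ slightly from hand-scored values, which is one reason several scales were applied and compared (Table 1).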

2.3.2. Development of the multimodal consent tool

Use of commonly available software (Microsoft programs) and affordable tablet technology for delivery of information was central to the real-world applicability of this quality improvement. A multimodal video of the study-specific IC elements was developed using a narrated PowerPoint® presentation saved as a video. The narration (audio script) complemented, but did not mirror, the visual slide text. The MM text and narration were both reduced to a sixth grade level, below the reading level of the standard-paper informed consent (Table 1). The same readability tools (Gunning Fog, Flesch Kincaid Grade Level and Reading Ease, SMOG Index, and Fry Graph) were used in grade-level reduction of the quality improvement tools (the MM video and the comprehension assessment). The video was approximately four minutes in length.

Table 1.

Readability tools and scores of Quality Improvement tools

Readability Scale | Multimodal Script (audio), Grade Level | Multimodal Text (visual), Grade Level | Comprehension Assessment, Grade Level | Standard-paper Consent, Grade Level

Gunning Fog (1) | 6.2 | 5.6 | 5.1 | 11
Flesch Kincaid (1) | 3.4 | 3.6 | 2.9 | 8.6
SMOG Index (1) | 4.5 | 5 | 4.4 | 8.3
Fry Graph (2) | 3 | 2.5 | 2 | 9.0
Flesch Kincaid Reading Ease (1) | 91.7 (very easy to read) | 81.3 (easy to read) | 83.4 (easy to read) | 64.3 (average)

2.4. Quality improvement approval, design and setting

The Emory University Institutional Review Board (IRB) approved conduct of the primary study, TrialNet: Pathway to Prevention, and, via amendment, determined this secondary informed-consent initiative to be quality improvement. Additionally, this secondary initiative was acknowledged by the trial network as site-specific quality improvement. This secondary QI initiative tested and compared study comprehension of volunteers after MM or standard consent exposure. Touch-screen tablet usability and future consent-format preferences were also rated by volunteers. The setting was a busy urban outpatient pediatric diabetes clinic.

2.5. Recruitment for antibody study and quality improvement initiative

To recruit for the primary antibody study, a trained researcher reviewed pending clinic appointments for children diagnosed with T1D under the authority of a partial Health Insurance Portability and Accountability Act (HIPAA) waiver. Potential antibody study volunteers were first- or second-degree relatives of the T1D proband, such as parents, children or siblings. Eligible families expressing interest in the antibody study were met by researchers at the appointment. Antibody study participation involved either self-consent and provision of a blood sample (adult) or provision of consent and assent (parent and minor). Quality improvement volunteers were recruited only from potential antibody study participants. If a family did not express interest in learning about the antibody study, they were not approached or recruited as potential participants in the secondary QI initiative. No compensation was given for participation in either the antibody study or the QI initiative.

2.6. Quality improvement sample

Fifty volunteers (siblings or parents providing consent for themselves or minors), ages 18 to 56 years, completed the QI initiative. They were a convenience sample based upon researcher availability, the clinic schedule and the pool of potential antibody study volunteers. Self-reported demographic variables collected were gender, age, race, ethnicity, and highest grade-level achieved. Demographic data were not collected on volunteers declining participation in the QI initiative. However, the acceptance rate for secondary QI participation was high (50 of 51, 98%) among antibody volunteers approached.

2.7. Inclusion and exclusion

Family members age 18 and older who were native English speakers and were participating in the antibody study (either giving blood or consenting for a minor) were also asked about interest in the QI initiative. Families with prior antibody study participation were excluded as QI volunteers due to repeat exposure to the consent content and process. Only families scheduled for and experiencing routine visits were approached for the QI. Families with complex or stressful visits (requiring social service, support system, drug or pregnancy referrals) were not approached for the QI initiative. This exclusion was to avoid any additional burden, or perception of burden, on volunteers.

2.8. Consent format assignment and follow-up

Quality improvement volunteers were randomly assigned to one of two formats: 1) the standard-paper IC and process, followed by the comprehension assessment on a touch-screen tablet (n = 25); or 2) the four-minute MM consent and the comprehension assessment on a touch-screen tablet (n = 25). Randomization was done by pulling a paper slip labeled for MM or standard IC assignment. To ensure volunteers were informed according to regulatory guidelines, all were re-engaged with reading and interactive discussion of the standard consent after the assessment. No antibody study procedures were completed until all volunteers’ questions were answered and the standard-paper IC process was completed (Figure 1).
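Drawing labeled paper slips from a fixed pool of 25 per arm is equivalent to shuffling a list of 50 assignment labels. The brief Python sketch below illustrates that allocation logic; the label names and seed are illustrative and not part of the original procedure:

    import random

    def draw_slips(n_per_arm=25, seed=None):
        """Shuffle a pool of labeled 'slips', mimicking a paper-slip draw
        for multimodal versus standard consent assignment."""
        rng = random.Random(seed)
        slips = ["multimodal"] * n_per_arm + ["standard"] * n_per_arm
        rng.shuffle(slips)
        return slips

    assignments = draw_slips(seed=2018)
    print(assignments[:6])
    print(assignments.count("multimodal"), assignments.count("standard"))  # 25 25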

Figure 1. Flowchart of Volunteers in Quality Improvement Initiative.

2.9. Analysis

Descriptive statistics were used to describe the background characteristics of the QI sample. Group differences in gender, age, race, ethnicity, and highest grade-level achieved were examined to assess adequacy of randomization. Comprehension was measured with nine multiple-choice questions after first exposure to either the standard-paper IC or the MM consent. Each question was assigned one point if answered correctly and zero if answered incorrectly. An overall IC score and an individual score for each consent element were calculated. The two-sample t-test, Chi-square test or Fisher’s exact test was used in analysis, as appropriate. Raw data were initially captured in Survey Monkey®, then exported into and analyzed with SPSS statistical software, version 24.0 (SPSS Inc., Chicago, IL, USA). An alpha of less than 0.05 was considered statistically significant. Touch-screen tablet usability (‘very easy’ to ‘very difficult’) and preference (‘strongly agree’ to ‘strongly disagree’) were measured with 1-to-5 Likert scales. Volunteers exposed to the MM consent (n = 25) rated the value of the video in helping understanding (‘strongly disagree’ to ‘strongly agree’) and the volume of information (‘useless’ to ‘way too much’) on 1-to-5 Likert scales.
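The analysis itself was run in SPSS; for readers who prefer open-source tools, the same tests are available in scipy. The sketch below scores hypothetical data and applies a Welch two-sample t-test to total scores and a chi-square or Fisher’s exact test to one item’s correct/incorrect counts. All numbers are illustrative, not study data:

    import numpy as np
    from scipy import stats

    # Hypothetical total scores (number correct out of 9) per volunteer.
    mm_total = np.array([8, 9, 7, 8, 6, 9, 8, 7, 9, 8])
    std_total = np.array([4, 3, 5, 4, 2, 6, 4, 3, 5, 4])

    # Two-sample t-test, equal variance not assumed (Welch's test).
    t_stat, p_total = stats.ttest_ind(mm_total, std_total, equal_var=False)

    # One item's correct/incorrect counts per group (rows: MM, standard).
    item_counts = np.array([[17, 8],
                            [6, 19]])
    chi2, p_chi, dof, expected = stats.chi2_contingency(item_counts, correction=False)

    # Fisher's exact test for items with sparse cells.
    odds_ratio, p_fisher = stats.fisher_exact(item_counts)

    print(f"total: p={p_total:.4f}  chi-square: p={p_chi:.4f}  Fisher: p={p_fisher:.4f}")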

3. Results

3.1. Sample characteristics

Fifty unique volunteers, primarily parents (40 female, 10 male), participated (mean age = 36.5 years, SD = 8.5, range 18–56). Volunteers were Caucasian (60%), African American (32%), and mixed or other race (8%); 90% were non-Hispanic. Highest grade-level completed ranged from 11th grade through graduate school (Table 2). There was no statistical difference in age, gender, race, ethnicity or education between groups.

Table 2.

Characteristics of QI study population.

Characteristic | Total (n = 50) | Multimodal (n = 25) | Paper (n = 25) | P-value

Age, M (SD) | 36.5 (8.5) | 37.8 (9.9) | 35.2 (6.9) | 0.29 (1)
Male, n (%) | 10 (20) | 5 (20) | 5 (20) | 1.00 (2)
Race, n (%) | | | | 0.52 (3)
 Caucasian | 30 (60) | 17 (68) | 13 (52) |
 African American | 16 (32) | 6 (24) | 10 (40) |
 Other or multi-racial | 4 (8) | 2 (8) | 2 (8) |
Hispanic, n (%) | 5 (10) | 1 (4) | 4 (16) | 0.17 (3)
Education, n (%) | | | | 0.19 (3)
 High school or less | 14 (28) | 4 (16) | 10 (40) |
 Community college / some college | 14 (28) | 8 (32) | 6 (24) |
 College graduate | 17 (34) | 9 (36) | 8 (32) |
 Graduate school | 5 (10) | 4 (16) | 1 (4) |

(1) Two-sample t-test, equal variance not assumed. (2) Chi-square test. (3) Fisher’s exact test.

3.2. Comprehension

After first consent exposure, overall comprehension scores (percent of questions answered correctly) across both groups ranged widely, from 44% to 100% (M = 62.9%, SD = 25.0). Mean scores differed widely between the multimodal (M = 82.2%, SD = 15.4) and standard consent (M = 43.6%, SD = 16.0) groups, and the difference was statistically significant (p < 0.001). A significant difference was also found between groups on comprehension of the individual elements of benefits (p < 0.001), risks or harm (p < 0.001), volunteerism (p < 0.012), meaning of results (p < 0.001), confidentiality (p < 0.004), privacy (p < 0.001) and what consent allows (p < 0.002). There was no significant difference in correct responses regarding comprehension of cost (p = 0.49) or study purpose (p = 0.38) (Table 3). There was no correlation between total comprehension score and the sample characteristics of age, gender, race, ethnicity or education in either group; however, older age, Caucasian race, non-Hispanic ethnicity and higher education trended toward higher scores.

Table 3.

Comprehension Assessment questions: Total and intervention group scores

Question | Total (n = 50) | Multimodal (n = 25) | Paper (n = 25) | P-value

Q09 – Purpose of study | 31 (62) (a) | 17 (68) | 14 (56) | 0.38 (2)
Q10 – What consent allows | 23 (46) | 17 (68) | 6 (24) | 0.002 (2)
Q11 – Any benefits | 18 (36) | 17 (68) | 1 (4) | <0.001 (2)
Q12 – Any study costs to participant | 48 (96) | 25 (100) | 23 (92) | 0.49 (3)
Q13 – Any risks or harms | 19 (38) | 18 (72) | 1 (4) | <0.001 (2)
Q14 – What is true of volunteers | 36 (72) | 22 (88) | 14 (56) | 0.012 (2)
Q15 – How results will be received | 26 (52) | 19 (76) | 7 (28) | 0.001 (2)
Q16 – Confidentiality | 42 (84) | 25 (100) | 17 (68) | 0.004 (3)
Q17 – Privacy | 40 (80) | 25 (100) | 15 (60) | <0.001 (2)
Total percent score | 62.9 (25.0) (b) | 82.2 (15.4) | 43.6 (16.0) | <0.001 (1)

(1) Two-sample t-test, equal variance not assumed. (2) Chi-square test. (3) Fisher’s exact test.
(a) Cells represent n (%) who answered correctly.
(b) For total percent score, cells represent M (SD) for score out of 100% on the 9 questions.

3.3. Multimodal content ratings

Only volunteers randomized to the multimodal group rated the information in the MM video, using Likert scales. Twenty-four of twenty-five volunteers rated the amount of information presented in the MM video; responses were ‘just right’ (83.3%), ‘too little’ (8.3%) and ‘useless’ (8.3%). The video’s value for improving study understanding was rated as ‘strongly agree’ (37.5%), ‘agree’ (33%), ‘neither agree nor disagree’ (16.7%) or ‘strongly disagree’ (12.5%).

3.4. Overall tablet format preference

All volunteers completed the comprehension assessment, usability ratings and entry of demographic data on the same tablet to reduce format bias. Forty-nine of fifty QI volunteers evaluated touch-screen tablet usability and format preference. Usability was rated as ‘very easy’ (75%) or ‘easy’ (22%) by ninety-seven percent of volunteers and as ‘neutral’ (2%) or ‘very difficult’ (2%) by four percent. Eighty-four percent of volunteers preferred using the tablet to a standard-paper format (‘strongly agree’, 59.2%; ‘agree’, 24.5%); the remainder were neutral or disagreed (‘neither agree nor disagree’, 6.1%; ‘disagree’, 2%; ‘strongly disagree’, 8.2%). Eighty-two percent of volunteers would select a tablet format for consent in the future; verbal or written formats were selected by eighteen percent (verbal, 8%; written, 10%). After the first volunteer entered data into Survey Monkey®, it was noted that not all questions regarding format were marked as ‘mandatory’, and thus some data for one volunteer were not captured. Settings were revised to make all answers mandatory, and no further omissions were noted.

4. Discussion and Conclusions

4.1. Discussion

Providing an IC process that meets ethical and legal guidelines, yet is understandable to the non-medical public, presents an ongoing challenge. This challenge is consistent with changes to the Common Rule reflecting the growing recognition of the need for clarity in consent provision. Local research team members are trained using the Process and Quality of informed Consent for clinical Research (P-QIC) essential elements [30]. Mandatory practice is to conduct IC using an IRB- and sponsor-approved template administered only by trained professionals. Standardized templates include all mandatory language and IC elements. A consent documentation checklist further itemizes the elements reviewed and is validated after content discussion with consenter signature, time and date (FDA 21 CFR §50.27). Despite these rigorous efforts in consent training, standardized templates and checklists, the IC process underperforms; comprehension scores after first review of the standard IC were low. Following the assessment, however, questions were interactively reviewed with both consent exposure groups. If testing had then been repeated, scores might have been different or improved. As the QI initiative was designed to test affordable MM development, usability and change (or non-inferiority) in comprehension between the two formats, repeat testing was not conducted.

In this QI initiative, the informed consent visual text, narration script and comprehension assessment questions were refined to reduce complexity and length and to lower the reading level to the sixth grade. Presenting the consent at a reading level lower than the standard-paper consent may have improved understanding, and the MM format may have reached more learning styles than the paper format. Learning styles may be evolving, and MM may better facilitate concentration than reading. The tactile nature of the MM video on the touch-screen tablet may also have improved concentration or retention. As the consent process was not timed, the length of the consent interactions, rather than the format, may have contributed to the differences in comprehension scores. Combining improved readability with the MM touch-screen experience may have contributed to the improved overall comprehension scores in the MM group. This finding is consistent with recent MM consent intervention reports [12,13].

Additional factors may have influenced the higher MM scores. Volunteers may focus more fully on MM than on reading and discussion of the standard-paper consent. Despite rigorous researcher training, the MM may have provided a more targeted or standardized delivery than individual researchers. The MM focused specifically on the required elements and those identified as complex, rather than on all of the consent text. In contrast to human personnel, a recorded presentation may be less influenced by environmental factors such as clinic pace, noise or other distractions. Grade-level completion may not serve as an accurate proxy for reading level, and there was no correlation between education and CA scores. Alternatively, the reading-level reduction of the comprehension assessment may have leveled the impact of education on testing differences.

As we were seeking a transportable MM presentation, a case-protected 6 x 9.5 Samsung touch-screen tablet was used; it was easily carried by researchers on- and off-site and functioned with Wi-Fi or cellphone hot spots in outpatient clinic areas. The tablet used for the MM presentation and assessment was rated by volunteers as ‘easy to use’, and the MM format was preferred for future IC. This is consistent with high acceptability and usability ratings of MM in the literature [6,31]. Touch-screen computers were highly accepted even in very low literacy populations due to similarities with cellphones [18].

Total supply cost, using existing computer programs (narrated PowerPoint and Survey Monkey®) and a case-protected tablet, was under $300. Using a MM format developed with existing personnel skills and technology was intentional, to contain cost while retaining flexibility to modify material for updates to the protocol, site contact information, IRB requirements, or federal IC regulations. Many MM formats and enhanced IC explanations cited in the literature were developed by funded professional teams using high-tech methods [5,6,10]. As encouraged by the Institute of Medicine [14], this improvement initiative applied existing tools and published findings to affordably support the IC process at the local level in an urban diabetes population.

4.1.1. Strengths

This was a QI initiative assessing a reduced-reading-level, MM-enhanced informed consent process compared with the standard consent and process. The QI volunteers were ‘real’, as all participated in consent for the antibody (primary) study. The research and clinic conditions were real and varied daily with patient volume, providers assigned, Wi-Fi connectivity, availability of QI members and room space, but represented the true outpatient clinic consenting environment. Findings translated from robustly funded clinical trials to improve the consent process were the basis of tool and intervention development. The quality improvement initiative measured not only consent element understanding but also tablet usability and learning-tool preferences.

4.1.2. Limitations

Grade-level completion, rather than a reading assessment, was collected as a proxy and may not accurately reflect literacy. Although not statistically significant in this QI, the educational level of the MM volunteers was higher than that of those randomized to the standard consent and may have impacted the post-consent comprehension scores. Parents ages thirty-seven and older (95%), versus adult siblings or other relations (5%), comprised the largest percentage of QI initiative volunteers. Adult males were underrepresented, as a larger proportion of mothers accompanied the T1D probands to clinic visits.

Antibody volunteers agreed to participate in the additional QI initiative and thus may have been inadvertently primed to pay more attention to the novel MM process. Research personnel participating in the antibody consent process were notified that the QI initiative measured volunteer comprehension, not researcher competency. The intent of this notification was to reduce unintended practice change; however, both the performance of the primary research IC process and the quality improvement itself may induce practice change in researchers. The consent tools were validated with study-specific content, local volunteers, clinic staff, diabetes professionals and experienced researchers from one healthcare entity. Therefore, whether the results of this quality improvement initiative are translatable to other study populations is unknown.

4.2. Implications

Findings suggest that developing a multimodal IC narrated at or below the 6th grade level has the potential to improve volunteer comprehension and is readily adaptable to local needs. Readability scales guiding reduction of the grade level of written material are freely available on the internet. A touch-screen tablet is easy to use and was preferred by volunteers for future IC. The tablet is readily transported by researchers and able to connect to data-collection programs via Wi-Fi or cellphone hot spots. Costs of the MM consent are limited to tablet purchase and employee time to develop the presentation. The ability to tailor information, affordability, transportability and high acceptance support effective use in larger groups, satellite clinics and off-site venues. The “Framework for Spread” extends dissemination of practical applications of interventions cited in the literature for improvement within and outside the local system [32]. This work documents a method for developing and testing an affordable (modifiable with existing staff and technology resources), transportable, multimodal process that is scalable for consent improvement.

There is real-world applicability to this initiative. The QI sample comprised volunteers consenting for an actual antibody-study protocol that serves as a gateway to an array of intervention trials. Improving understanding of disease risk and study elements from the first interaction may have a lasting impact on volunteer trust and relationship-building with the research team. Potentially, future trial participation may be linked to initial comprehension of the baseline research. Informed consent presentation has been updated with changes to the US Common Rule, which now requires that a simplified explanation of key and impactful study elements be included at the beginning of the consent form. Combining commonly available programs and technology with existing readability tools in a MM format may support those Common Rule changes and the intended goal of improved IC comprehension.

4.3. Conclusions

Translation of methods from arduous, complex or funded trials to local sites to improve IC understanding is supported and may be applied to larger trials. A multimodal consent tool designed to address identified areas of low comprehension improved comprehension and was usable and liked by participants. Consent processes should be evidence-based and patient-centered; assessing this tool in the context of a quality improvement initiative within an actual study provided valuable evidence of its real-world impact.

The PDSA cycle facilitates rapid improvement of tools to enhance the IC process and is well suited to QI initiatives. A narrated PowerPoint saved as a video file supports ongoing modification for site individualization or required updates and is affordable. The comprehension assessment, narration script and visual slide text may all be written at reduced grade levels. Readability scales such as Flesch Kincaid, SMOG, Gunning Fog and the Fry Graph assess and guide reduction of grade-level writing, with free automatic formulas available online. Health initiatives such as Healthy People 2020 [33] provide guidance to further reduce the complexity of essential health documents. Presenting information in a MM format expands reach to volunteers with a wider variety of learning styles. Touch-screen tablets are easily transported to off-site settings and expand standardized delivery for both teaching and assessing IC comprehension outcomes.

References

  • [1].Williams JK, Anderson CM, Omics research ethics considerations, Nurs. Outlook. (2018). doi: 10.1016/j.outlook.2018.05.003. [DOI] [PubMed] [Google Scholar]
  • [2].Abujarad F, Alfano S, Bright TJ, Kannoth S, Grant N, Gueble M, Peduzzi P, Chupp G, Building an Informed Consent Tool Starting with the Patient: The Patient-Centered Virtual Multimedia Interactive Informed Consent (VIC)., AMIA ... Annu. Symp. Proceedings. AMIA Symp. 2017 (2017) 374–383. http://www.ncbi.nlm.nih.gov/pubmed/29854101 (accessed September 30, 2018). [PMC free article] [PubMed] [Google Scholar]
  • [3].Roberts KJ, Revenson TA, Urken ML, Fleszar S, Cipollina R, Rowe ME, Dos Reis LL, Lepore SJ, Testing with feedback improves recall of information in informed consent: A proof of concept study., Patient Educ. Couns. 99 (2016) 1377–81. doi: 10.1016/j.pec.2016.03.014. [DOI] [PubMed] [Google Scholar]
  • [4].Larson E, Foe G, Lally R, Reading Level and Length of Written Research Consent Forms, Clin. Transl. Sci. 8 (2015) 355–6. doi: 10.1111/cts.12253. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [5].Kraft SA, Constantine M, Magnus D, Porter KM, Lee SS-J, Green M, Kass NE, Wilfond BS, Cho MK, A randomized study of multimedia informational aids for research on medical practices: Implications for informed consent, Clin. Trials. 14 (2017) 94–102. doi: 10.1177/1740774516669352. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [6].Tait AR, Voepel-Lewis T, Chetcuti SJ, Brennan-Martinez C, Levine R, Enhancing patient understanding of medical procedures: evaluation of an interactive multimedia program with in-line exercises., Int. J. Med. Inform. 83 (2014) 376–84. doi: 10.1016/j.ijmedinf.2014.01.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [7].Faulkner M, Alikhaani J, Brown L, Cruz H, Davidson D, Gregoire K, Berdan L, Rorick T, Jones WS, Pletcher MJ, Exploring Meaningful Patient Engagement in ADAPTABLE (Aspirin Dosing: A Patient-centric Trial Assessing Benefits and Long-term Effectiveness)., Med. Care. 56 Suppl 10 Suppl 1 (2018) S11–S15. doi: 10.1097/MLR.0000000000000949. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [8].Winter M, Kam J, Nalavenkata S, Hardy E, Handmer M, Ainsworth H, Lee WG, Louie-Johnsun M, The use of portable video media vs standard verbal communication in the urological consent process: a multicentre, randomised controlled, crossover trial., BJU Int. 118 (2016) 823–828. doi: 10.1111/bju.13595. [DOI] [PubMed] [Google Scholar]
  • [9].Antal H, Bunnell HT, McCahan SM, Pennington C, Wysocki T, Blake KV, A cognitive approach for design of a multimedia informed consent video and website in pediatric research, J. Biomed. Inform. 66 (2017) 248–258. doi: 10.1016/j.jbi.2017.01.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [10].Spertus JA, Bach R, Bethea C, Chhatriwalla A, Curtis JP, Gialde E, Guerrero M, Gosch K, Jones PG, Kugelmass A, Leonard BM, McNulty EJ, Shelton M, Ting HH, Decker C, Improving the process of informed consent for percutaneous coronary intervention: patient outcomes from the Patient Risk Information Services Manager (ePRISM) study., Am. Heart J. 169 (2015) 234–241.e1. doi: 10.1016/j.ahj.2014.11.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [11].Miller CK, O’Donnell DC, Searight HR, Barbarash RA, The Deaconess Informed Consent Comprehension Test: an assessment tool for clinical research subjects, Pharmacotherapy. 16 (n.d.) 872–8. http://www.ncbi.nlm.nih.gov/pubmed/8888082 (accessed October 3, 2017). [PubMed] [Google Scholar]
  • [12].Bowers N, Eisenberg E, Montbriand J, Jaskolka J, Roche-Nagle G, Using a multimedia presentation to improve patient understanding and satisfaction with informed consent for minimally invasive vascular procedures., Surgeon. 15 (2017) 7–11. doi: 10.1016/j.surge.2015.09.001. [DOI] [PubMed] [Google Scholar]
  • [13].Kang EY, Fields HW, Kiyak A, Beck FM, Firestone AR, Informed consent recall and comprehension in orthodontics: traditional vs improved readability and processability methods., Am. J. Orthod. Dentofacial Orthop. 136 (2009) 488.e1–13; discussion 488–9. doi: 10.1016/j.ajodo.2009.02.018. [DOI] [PubMed] [Google Scholar]
  • [14].Institute of Medicine, Best Care at Lower Cost: The Path to Continuously Learning Health Care in America, National Academies Press, Washington, D.C, 2013. doi: 10.17226/13444. [DOI] [PubMed] [Google Scholar]
  • [15].National Center for Education Statistics U.S. Department of Education, Adult Literacy, (n.d.). https://nces.ed.gov/fastfacts/display.asp?id=69.
  • [16].National Institutes of Health U.S. Department of Health & Human Services, How do I develop consent forms and who reviews them?, (n.d.). https://www.nhlbi.nih.gov/research/funding/research-support/crg/funding/consent-forms.htm.
  • [17].Montalvo W, Larson E, Participant comprehension of research for which they volunteer: a systematic review., J. Nurs. Scholarsh. an Off. Publ. Sigma Theta Tau Int. Honor Soc. Nurs. 46 (2014) 423–31. doi: 10.1111/jnu.12097. [DOI] [PubMed] [Google Scholar]
  • [18].Afolabi MO, McGrath N, D’Alessandro U, Kampmann B, Imoukhuede EB, Ravinetto RM, Alexander N, Larson HJ, Chandramohan D, Bojang K, A multimedia consent tool for research participants in the Gambia: a randomized controlled trial., Bull. World Health Organ. 93 (2015) 320–328A. doi: 10.2471/BLT.14.146159. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [19].Bhupathi PA, Ravi GR, Comprehensive Format of Informed Consent in Research and Practice: A Tool to uphold the Ethical and Moral Standards., Int. J. Clin. Pediatr. Dent. 10 (2017) 73–81. doi: 10.5005/jp-journals-10005-1411. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [20].Langley GJ, Moen RD, Nolan KM, Nolan TW, Norman CL, Provost LP, The Improvement Guide: A Practical Approach to Enhancing Organizational Performance, 2nd ed., Jossey-Bass Publishers, San Francisco, CA, 2009. [Google Scholar]
  • [21].Manta CJ, Ortiz J, Moulton BW, Sonnad SS, From the Patient Perspective, Consent Forms Fall Short of Providing Information to Guide Decision Making, J. Patient Saf. (2016) 1. doi: 10.1097/PTS.0000000000000310. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [22].Batuyong ED, Jowett AJL, Wickramasinghe N, Beischer AD, Using multimedia to enhance the consent process for bunion correction surgery, ANZ J. Surg. 84 (2014) 249–254. doi: 10.1111/ans.12534. [DOI] [PubMed] [Google Scholar]
  • [23].Foe G, Larson EL, Reading Level and Comprehension of Research Consent Forms: An Integrative Review., J. Empir. Res. Hum. Res. Ethics. 11 (2016) 31–46. doi: 10.1177/1556264616637483. [DOI] [PubMed] [Google Scholar]
  • [24].Insel RA, Dunne JL, Atkinson MA, Chiang JL, Dabelea D, Gottlieb PA, Greenbaum CJ, Herold KC, Krischer JP, Lernmark Å, Ratner RE, Rewers MJ, Schatz DA, Skyler JS, Sosenko JM, Ziegler A-G, Staging presymptomatic type 1 diabetes: a scientific statement of JDRF, the Endocrine Society, and the American Diabetes Association., Diabetes Care. 38 (2015) 1964–74. doi: 10.2337/dc15-1419. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [25].Deming WE, The new economics for industry, government, and education, The MIT Press, Cambridge, MA, 2000. [Google Scholar]
  • [26].Reed JE, Card AJ, The problem with Plan-Do-Study-Act cycles, BMJ Qual. Saf. 25 (2016) 147–152. http://qualitysafety.bmj.com/content/25/3/147.abstract. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [27].Etchells E, Ho M, Shojania KG, Value of small sample sizes in rapid-cycle quality improvement projects, BMJ Qual. Saf. 25 (2016) 202–206. http://qualitysafety.bmj.com/content/25/3/202.abstract. [DOI] [PubMed] [Google Scholar]
  • [28].Shafiq N, Malhotra S, Ethics in clinical research: Need for assessing comprehension of informed consent form?, Contemp. Clin. Trials. 32 (2011) 169–172. doi: 10.1016/j.cct.2010.12.002. [DOI] [PubMed] [Google Scholar]
  • [29].Tait AR, Voepel-Lewis T, Levine R, Using digital multimedia to improve parents’ and children’s understanding of clinical trials., Arch. Dis. Child. 100 (2015) 589–93. doi: 10.1136/archdischild-2014-308021. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [30].Cohn EG, Jia H, Smith WC, Erwin K, Larson EL, Measuring the process and quality of informed consent for clinical research: development and testing., Oncol. Nurs. Forum. 38 (2011) 417–22. doi: 10.1188/11.ONF.417-422. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [31].Tou S, Tou W, Mah D, Karatassas A, Hewett P, Effect of preoperative two-dimensional animation information on perioperative anxiety and knowledge retention in patients undergoing bowel surgery: a randomized pilot study, Colorectal Dis. 15 (2013) e256–65. doi: 10.1111/codi.12152. [DOI] [PubMed] [Google Scholar]
  • [32].Massoud MR, Neilsen GA, Nolan K, Schall MW, Sevin C, Framework for Spread: From Local Improvements to System-Wide Change, Cambridge, MA, 2006. www.IHI.org. [Google Scholar]
  • [33].Office of Disease Prevention and Health Promotion., Healthy People 2020, (2010). https://www.healthypeople.gov/.
