Contemp Clin Trials. 2021;111:106598. doi: 10.1016/j.cct.2021.106598

A/B design testing of a clinical trial recruitment website: A pilot study to enhance the enrollment of older adults

Hailey N Miller a,b, Timothy B Plante c, Kelly T Gleason b,d, Jeanne Charleston b,f, Christine M Mitchell e,f, Edgar R Miller III e, Lawrence J Appel e, Stephen P Juraschek g,*

Abstract

Introduction:

Online tools are increasingly utilized in clinical trial recruitment. A/B testing is an effective technology used in political campaigns and commercial marketing to improve contributions or sales. However, to our knowledge, A/B testing has not been described in the context of clinical trial recruitment.

Methods:

Two A/B testing experiments were implemented on the recruitment website of the Study To Understand Fall Reduction and Vitamin D in You (STURDY), a response-adaptive, two-stage, randomized controlled trial. Commercial A/B platforms randomized web-users to different versions of the trial’s website landing page; Experiment 1 included two infographic versions and Experiment 2 included three video versions. We compared web-user engagement metrics between each version and the original landing page. We determined the effect of each version compared to the original landing page on the likelihood of a web-user to (1) request more information about the trial, (2) complete a screening visit, or (3) enroll in the trial.

Results:

A total of 2605 and 374 web-users visited the trial’s website during Experiments 1 and 2, respectively. Response to the online interest form significantly differed by infographic version in Experiment 1. The number of individuals who engaged with website content and pages significantly differed by video version in Experiment 2.

Conclusion:

In a pilot study implementing A/B testing of a clinical trial recruitment website, different versions of the website led to differences in web-user engagement and interest in the trial. A/B testing tools offer a promising approach to test the effectiveness of clinical trial recruitment materials and to optimize recruitment campaigns.

Keywords: A/B testing, Recruitment methods, Randomized controlled trial, Clinical trials

1. Introduction

Recruitment is an essential component of the successful completion of clinical trials [1]. However, only 31–56% of clinical trials meet their intended recruitment goals [2–5]. Failure to meet recruitment goals can result in trial termination, unused resources, and underpowered sample sizes [6–10]. Further, it can contribute to the pronounced underrepresentation of diverse racial and ethnic groups in research [11–13].

The meteoric rise in internet use has spurred interest in understanding the role of digital-based approaches in clinical trial recruitment, including online advertisements [14–16], social media campaigns [17–19], electronic medical records [19–22], email and text messages [23,24], and other internet-related tools [25,26]. These novel strategies are capable of delivering focused advertisements to a subset of individuals who might meet criteria for clinical trial enrollment, possibly reducing the costs and time associated with non-digital-based advertisements [19]. Digital recruitment modalities can also be refined rapidly, often in real time, based upon the performance of specific design characteristics. Moreover, when these refinements are applied based on recruitment yields among underrepresented groups, they have the potential to improve the representativeness of trial populations.

A/B testing is a widely applied marketing strategy that compares engagement or response between at least two versions of media (for example, a webpage) to determine if one version more effectively achieves a pre-specified outcome (for example, contacting the study team) than the other [27,28]. A/B testing can be used to test the performance of minor aesthetic differences, such as the color of a website button (blue versus red), or more substantive differences, such as two different photos on a website home page [29]. Importantly, multiple A/B tests can be deployed in parallel or in sequence, allowing for iterative, fast-paced, and empirically-driven improvements of website or media content [30].
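
To make this mechanism concrete, the following minimal Python sketch simulates a two-arm A/B test; it is not drawn from the STURDY campaign, and the response rates and sample size are invented for illustration. Web-users are randomized between two versions, and a pre-specified outcome is then compared between arms.

```python
import random

# Toy A/B test: randomize simulated web-users between two page versions and
# compare how often each achieves a pre-specified outcome (e.g., contacting
# the study team). The underlying rates are invented for illustration.
random.seed(42)
TRUE_RESPONSE_RATE = {"A": 0.15, "B": 0.18}  # hypothetical per-version rates

outcomes = {"A": [], "B": []}
for _ in range(2000):
    version = random.choice(["A", "B"])              # the randomization step
    outcomes[version].append(random.random() < TRUE_RESPONSE_RATE[version])

for version, results in outcomes.items():
    print(version, f"{sum(results) / len(results):.3f}")  # observed rate per arm
```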

A/B testing has been successfully deployed in non-medical fields to improve media content performance, including political campaigns [31], and is ubiquitous in website design and optimization. A/B testing has also been used in the health industry to identify strategies for quality improvement [32]. However, there is little to no literature describing how A/B testing may be utilized in clinical trial recruitment to optimize digital content and increase recruitment response rates, including responses from underrepresented groups.

In this context, we piloted two A/B testing experiments, using two A/B testing software platforms, in the Study To Understand Fall Reduction and Vitamin D in You (STURDY) recruitment campaign. The purpose of these experiments was to understand if A/B testing could be used to improve the effectiveness of the STURDY recruitment website. Specifically, we tested if adding an infographic (Experiment 1) or a video invitation from a STURDY staff member (Experiment 2) to the website landing page was associated with higher website engagement or recruitment rates (i.e., contact with the study team, screening participation, and enrollment). Our underlying hypothesis was that A/B testing would identify website versions that optimized engagement and recruitment despite lack of consensus among the investigative team about which version might work best.

2. Methods

2.1. Study design

Details of the STURDY trial design have previously been reported [33] and its main results have been published [34]. In brief, STURDY was a response-adaptive, double-masked, two-stage, randomized controlled trial that examined the effects of four doses of vitamin D supplements on falls in adults aged 70 years and older. Participants were randomly assigned to one of the four vitamin D doses and were followed for up to two years, including completing three in-person and six telephone data collection visits. Participants were incentivized with $40 for each of three follow-up visits. In-person visits were conducted at two community-based research clinics, located in Baltimore, Maryland, and Hagerstown, Maryland. Additional details on the trial’s characteristics can be found in Table 1.

Table 1.

STURDY design and trial characteristics.

Trial design: Bayesian response-adaptive, double-masked, two-stage (dose-finding and confirmatory), randomized trial

Duration of follow-up: Up to 2 years of pill-taking and outcome ascertainment post-randomization

Primary inclusion criteria: Age 70 and older; elevated risk of falling^a; serum 25-hydroxyvitamin D level 10–29 ng/mL

Primary exclusion criteria: Use of vitamin D supplements (>1000 IU/day); use of calcium supplements (>1200 mg/day); cognitive impairment^b; hyper- or hypocalcemia (serum Ca2+ > 10.5 mg/dL or < 8.5 mg/dL); kidney, ureteral, or bladder stones made of calcium compounds; use of any form of oral or injected calcitriol

Number of data collection visits: 3 pre-randomization (1 via telephone, 2 in clinic); 1 randomization (in clinic); 9 post-randomization (3 in clinic, 6 via telephone)

Recruitment goal: 1200 participants (40% Black/African American; 60% women)

Enrollment incentives: $40 per in-clinic post-randomization visit, along with study-branded items (e.g., blanket, glasses cloth)

Trial sites: Hagerstown, MD; Baltimore, MD

a. Defined by a “yes” response to one of the following questions: 1. Have you fallen and hurt yourself in the past year? 2. Have you fallen 2 or more times in the past year? 3. Are you afraid that you might fall because of your balance or walking problems? 4. Do you have difficulty maintaining your balance when bathing, dressing, or getting in and out of a chair? 5. Do you use a cane, walker, or other device when walking inside or outside your home?

b. Defined by Mini-Mental State Exam (MMSE) score < 24.

2.2. Recruitment population

The recruitment goal for STURDY was to enroll up to 1200 participants, with a goal of 40% Black or African American and 60% women. However, these goals were ultimately not achieved when the trial was terminated early for futility [34]. Recruitment targeted community-dwelling older adults, ages 70 years and older, who were at an elevated risk of falling and had a serum 25-hydroxyvitamin D level between 10 and 29 ng/mL. Fall risk was determined based upon a series of 5 questions regarding fall history, fear of falling, difficulty maintaining balance during activities of daily living, and use of walking assistance devices. Full details of these questions and additional details of the trial’s eligibility criteria can be found in Table 1.

2.3. Recruitment procedures and timeline

Recruitment for STURDY occurred between 2015 and 2019. STURDY’s recruitment campaign incorporated several recruitment methods, including mass mailing of brochures, patient portal messaging, in-person recruitment at health fairs or senior residences, radio and newspaper advertisements, and a community outreach ambassador program. The trial and all recruitment strategies, including the versions of recruitment materials used for A/B testing, were approved by a Johns Hopkins University School of Medicine Institutional Review Board (IRB).

All recruitment methods provided a URL or link to the trial’s website (sturdystudy.org). Digital methods of recruitment, such as patient portal messaging, provided a direct link to the website. Non-digital methods of recruitment, such as newspaper advertisements or brochures, required individuals to type the website URL into their web browser.

The landing page of the website included a heading (STURDY Study), a picture of a woman walking with a walking aid, and a button to “Learn More.” Questions regarding eligibility, such as “are you age 70 or older?”, were listed directly under the heading. The website also included facts about falls in older adults. There were a total of 6 menu items to which a visitor might navigate, labeled “Home”, “About”, “Meet the Team”, “Contact”, “Participate”, and “Staff Login.” A screenshot of the original landing page can be found in Appendix Fig. A1. To better understand the website’s performance and applicability of A/B testing in research recruitment, two experiments were performed. A schematic visual of each experiment is depicted in Fig. 1.

Fig. 1. Experiment design diagrams.

a. Digital randomization completed by A/B testing platform.

b. User data includes the number of webpage visitors and the number of clicks on the webpage.

c. There was only 1 place to access the form on the website. The forms were identical, but unique to each version in order to track response rates.

d. User data includes the number of webpage visitors, the bounce rate, and the goal conversion rate for each visit.

2.3.1. Recruitment website A/B Experiment 1: landing page with unique infographics

The purpose of Experiment 1 was to determine if A/B testing could be utilized to enhance study recruitment and enrollment by understanding the effectiveness of the STURDY trial’s landing page. Experiment 1 was conducted using Optimizely [35], an online software for A/B testing, between December 2016 and March 2018. As shown in Fig. 1, web-users accessing the trial’s website were randomized equally to one of three landing pages: the original landing page, the version A infographic, or the version B infographic. Version A and B infographics are displayed in Fig. 2. The landing pages were identical with the exception of the infographic displayed.
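
Optimizely handled this randomization internally, and its exact mechanism is not described here. As a generic, hypothetical sketch of how a platform can deterministically assign each web-user to one of three landing pages (so that a returning visitor sees the same version), consider the following; the version labels and visitor IDs are illustrative, not Optimizely’s API.

```python
import hashlib

# Hypothetical sketch of deterministic visitor-to-version assignment.
VERSIONS = ["original", "infographic_A", "infographic_B"]

def assign_version(visitor_id: str) -> str:
    # Hash the visitor ID into one of three roughly equal buckets so the same
    # visitor is always shown the same landing page on repeat visits.
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VERSIONS[int(digest, 16) % len(VERSIONS)]

print(assign_version("visitor-0001"))  # stable across repeat visits
```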

Fig. 2. Experiment 1 infographics.

Note: Infographics were formatted to grayscale for publication.

2.3.2. Recruitment website A/B Experiment 2: landing page with unique videos

The purpose of Experiment 2 was to determine if video invitations from distinct staff members enhanced study recruitment and enrollment and whether any effect differed by staff member. Experiment 2 was conducted using Google Analytics in February 2019. It is important to note that Experiment 2 was abbreviated because the parent clinical trial in which these A/B experiments were embedded was terminated early for futility. In this experiment, web-users were randomized to one of four landing pages, one of which was the original landing page and three of which included a video. The landing pages were identical with the exception of the video displayed.

Each video contained study team member(s) who described the purpose and processes of the STURDY trial in participant-friendly language. These team members included the principal investigator, a research coordinator, a former study participant, and the director of clinical operations. The scripts between the videos were similar (Appendix Table A1). Additional details on these videos and team members can be found in Table 2.

Table 2.

Description of videos for Experiment 2.

Version | Length of video | No. of person(s) | Gender of person(s) | Race of person(s) | Relation of person(s) to study
Original | No video | - | - | - | -
Version A | 2:50 | 1 | Male | White American | Principal Investigator
Version B | 2:35 | 2 | Female; Female | Black American; Black American | Research Coordinator; Director of Clinical Operations
Version C | 2:09 | 2 | Female; Female | Black American; Black American | Former Study Participant; Research Coordinator

2.4. Development of infographic and video content

The infographics and videos for these experiments were developed by student volunteers, who used existing IRB-approved language and materials to develop the content. The study personnel who delivered the video messages were selected from the actual personnel working on STURDY, as we wanted to demonstrate how A/B testing could be used in the context of a real-world recruitment campaign. We did not employ standardized actors, as used in some behavioral studies intended to evaluate factors related to participant engagement.

2.5. Participant engagement metrics

There was an embedded online form (hosted via Qualtrics) on each version of the landing page that allowed individuals to express interest in the trial. Accessing and completing the form did not require web-users to leave the trial website. A web-user was directed to the embedded form on the landing page if they clicked “Learn More,” located on various pages, or “Participate” in the navigation menu. The locations of “Learn More” and “Participate” were identical on each landing page version. If the web-user clicked “Learn More” while on the landing page, the page refreshed and directed the web-user to the bottom of the page, where the form was embedded. The location, display, and formatting of the form were also identical between landing page versions; however, a unique Qualtrics form was created for each version to track response rates. Following completion of the interest form, individuals were contacted via phone by study team personnel to complete pre-screening. Individuals who were eligible following pre-screening were invited to an in-person screening visit at one of the two research clinics. Those who remained eligible and interested were invited to a second screening visit and, subsequently, a randomization visit. Verbal consent was obtained during the pre-screening phone call, and written informed consent was obtained at the in-person screening visit.

For both experiments, A/B testing software captured a series of metrics to measure web-user engagement. In Experiment 1, the number of unique visitors and the number of clicks on the webpage were captured. In Experiment 2, three metrics were captured: the number of unique visitors; the proportion of visitors who exited the webpage without visiting or clicking on other pages, such as “Meet the Team” or “Participate” (bounce rate); and the proportion of visitors who returned to the landing page or visited the “Meet the Team” page following initial entry to the webpage (goal conversion rate). Google Analytics calculated the bounce rate by dividing the number of single-page sessions (numerator) by the number of all sessions (denominator), and the goal conversion rate by dividing the number of sessions that completed a goal (numerator) by the number of all sessions (denominator). In Experiment 2, the goal was defined as completed if a web-user visited the landing page beyond their initial visit (by clicking “Learn More,” “Home,” or “Participate”) or clicked “Meet the Team” on the website.
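
As a minimal sketch of these two metrics under the definitions above, the following Python snippet computes bounce rate and goal conversion rate from per-session page-view logs; the session data and page names are invented for illustration.

```python
# Each session is the ordered list of pages a web-user viewed.
sessions = [
    ["Landing"],                           # single-page session -> bounce
    ["Landing", "Meet the Team"],          # goal completed
    ["Landing", "Contact"],                # engaged, but no goal
    ["Landing", "Learn More", "Landing"],  # returned to landing -> goal
]

def bounce_rate(sessions):
    # Single-page sessions divided by all sessions.
    return sum(len(s) == 1 for s in sessions) / len(sessions)

def goal_conversion_rate(sessions):
    # Sessions that reach "Meet the Team" or revisit the landing page
    # after the initial entry, divided by all sessions.
    def hit_goal(s):
        return "Meet the Team" in s[1:] or "Landing" in s[1:]
    return sum(hit_goal(s) for s in sessions) / len(sessions)

print(bounce_rate(sessions))           # 0.25
print(goal_conversion_rate(sessions))  # 0.5
```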

2.6. Statistical analysis

Participant engagement metrics for both Experiment 1 and Experiment 2 were tabulated. All metrics were reported as aggregate count data, with bounce rate and goal conversion rate reported as percentages.

2.6.1. Recruitment website A/B Experiment 1: landing page with unique infographics

Bivariate associations between landing page version and engagement outcomes were examined using logistic regression. Landing page version, treated as a categorical variable (original landing page, version A infographic, version B infographic), was the independent variable in all models. Separate models were used for each of the three outcomes: (1) completion of the online interest form, (2) completion of the first in-person screening visit, and (3) completion of the randomization visit.
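
As an illustration, the odds ratios this model produces can be approximately reproduced from the aggregate counts later reported in Table 3. The sketch below uses Python with statsmodels; the paper’s analyses were performed in Stata, so this is an assumed equivalent rather than the original code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Aggregate counts from Table 3: (visitors, interest-form completions).
counts = {"Original": (814, 137), "Version A": (920, 118), "Version B": (871, 141)}

# Expand to one row per web-user with a binary outcome.
rows = []
for version, (n, completed) in counts.items():
    rows += [{"version": version, "completed": 1}] * completed
    rows += [{"version": version, "completed": 0}] * (n - completed)
df = pd.DataFrame(rows)

# Logistic regression with the original landing page as the reference group.
model = smf.logit(
    "completed ~ C(version, Treatment(reference='Original'))", data=df
).fit(disp=False)
print(np.exp(model.params))      # odds ratios (~0.73 for A, ~0.95 for B)
print(np.exp(model.conf_int()))  # 95% confidence intervals
```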

2.6.2. Recruitment website A/B Experiment 2: landing page with unique videos

Differences in bounce rate and goal conversion rate between the original landing page and each video version were evaluated using two-sample tests of proportions. Odds ratios (OR) compared completion of the online interest form for each version versus the original landing page (reference group).

Two-tailed P-values < 0.05 were considered statistically significant. Statistical procedures were performed using Stata statistical software, version 15.1 (StataCorp, College Station, TX).
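
For example, the Experiment 2 comparison of goal conversion rates between the original landing page and the version C video (Section 3.2) can be approximated with a two-sample test of proportions. The counts below are back-calculated from the percentages in Table 4 (13.3% of 98 sessions ≈ 13; 2.7% of 113 sessions ≈ 3) and are therefore an assumption for illustration; Stata’s exact procedure may differ slightly.

```python
from statsmodels.stats.proportion import proportions_ztest

# Goal conversions vs. total sessions, back-calculated from Table 4.
count = [13, 3]     # original landing page, version C video
nobs = [98, 113]
z_stat, p_value = proportions_ztest(count, nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")  # ~0.005, near the reported 0.004
```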

3. Results

3.1. Experiment 1: landing page with unique infographics

A total of 2605 web-users accessed the trial’s website during Experiment 1, which tested the performance of three landing page versions (the original landing page and infographic versions A and B). Each version received approximately one-third of the total visitors, consistent with the allocation scheme (Table 3). Among those who accessed the website, roughly two-thirds of users clicked somewhere on the landing page, indicating engagement with the website.

Table 3.

Experiment 1 engagement results from Optimizely.

Version | Visitors, N (%) | N clicks on page^a | N completed online interest form (%) | OR of completing the form (95% CI) | N completed SV (%) | OR of completing SV (95% CI) | N randomized (%) | OR of completing randomization (95% CI)
Original | 814 (31.2) | 546 | 137 (16.8) | Reference | 59 (7.2) | Reference | 16 (2.0) | Reference
Version A | 920 (35.3) | 600 | 118 (12.8) | 0.73 (0.56–0.95) | 52 (5.7) | 0.77 (0.52–1.13) | 17 (1.8) | 0.94 (0.47–1.87)
Version B | 871 (33.4) | 562 | 141 (16.2) | 0.95 (0.74–1.23) | 54 (6.2) | 0.85 (0.58–1.24) | 18 (2.1) | 1.05 (0.53–2.08)

Note: Visitor, click, and interest-form columns reflect engagement before contact with the study team; screening visit and randomization columns reflect outcomes after contact with the study team.

a. Counted each time a web-user clicked somewhere on the webpage. OR = Odds Ratio; CI = Confidence Interval; SV = Screening Visit.

A total of 137, 118, and 141 web-users completed the online interest form from the original landing page, version A infographic, and version B infographic, respectively. Web-users randomized to the version A infographic were significantly less likely to complete the form compared to those randomized to the original landing page (OR: 0.73, p = 0.019, 95% CI: 0.56, 0.95). The odds of completing the interest form did not significantly differ between the version B infographic and the original landing page (OR: 0.95, p = 0.723, 95% CI: 0.74, 1.23).

A total of 59, 52, and 54 web-users completed the screening visit and 16, 17, and 18 web-users went on to be randomized in STURDY from the original landing page, version A infographic, and version B infographic, respectively. The odds of a web-user completing the screening visit did not significantly differ between the version A infographic and the original landing page (OR: 0.77, p = 0.18, 95% CI: 0.52, 1.13) or the version B infographic and the original landing page (OR: 0.85, p = 0.39, 95% CI: 0.58, 1.24). Similarly, the odds of a web-user completing the randomization visit did not significantly differ between the version A infographic and the original landing page (OR: 0.94, p = 0.86, 95% CI: 0.47, 1.87) or the version B infographic and the original landing page (OR: 1.05, p = 0.88, 95% CI: 0.53, 2.08).

3.2. Experiment 2: landing page with unique videos

A total of 374 web-users accessed the trial’s website during Experiment 2, which tested the performance of four landing page versions (the original landing page and video versions A, B, and C) (Table 4). Version C received the largest proportion of web-users (30.2%), followed by Version B (29.1%), the original landing page (26.2%), and Version A (14.4%).

Table 4.

Experiment 2 engagement results from Google Analytics.

Version | Visitors, N (%) | Bounce rate^a, % | Goal conversion rate^b, % | N completed online interest form (%)^c | OR completed form (95% CI)
Original | 98 (26.2) | 74.5 | 13.3 | 4 (4.1) | Reference
Version A | 54 (14.4) | 77.8 | 14.8 | 2 (3.7) | 0.90 (0.08–6.55)
Version B | 109 (29.1) | 75.2 | 6.4 | 2 (1.8) | 0.44 (0.04–3.16)
Version C | 113 (30.2) | 82.3 | 2.7* | 1 (0.9) | 0.21 (0.00–2.18)

Note: OR = Odds Ratio; CI = Confidence Interval.

* Indicates a significant difference from the reference group at p < 0.01.

a. Bounce rate was calculated by dividing the number of single-page sessions by the number of all sessions.

b. Goal conversion rate was calculated by dividing the number of sessions that completed a goal (i.e., by clicking “Meet the Team” or by visiting the landing page beyond the initial visit via “Learn More”, “Participate”, or “Home”) by the number of all sessions.

c. Data on screening visits and randomizations completed were not collected due to early trial termination, as advised by the Data and Safety Monitoring Board.

The bounce rate, or the percentage of web-users with a single-page session, was 74.5% or greater for all landing page versions, with the original landing page experiencing the lowest bounce rate (74.5%) and the version C video experiencing the highest bounce rate (82.3%). There were no significant differences in bounce rates between the original landing page and any of the video versions.

The goal conversion rate, or the percentage of web-users who refreshed or navigated back to the landing page or the “Meet the Team” page, was highest for the version A video (14.8%), followed by the original landing page (13.3%), the version B video (6.4%), and lastly, the version C video (2.7%). The goal conversion rate did not significantly differ between the version A video and the original landing page (p = 0.79) or the version B video and the original landing page (p = 0.10). However, the version C video had significantly fewer goal conversions compared to the original landing page (p = 0.004).

Approximately 2.4% (n = 9) of web-users completed the online interest form, with the highest proportion resulting from web-users randomized to the original landing page (n = 4). However, the odds of responding to the interest form did not significantly differ between the video versions and the original landing page, although the numbers were small due to early study termination.

4. Discussion

In our pilot study that assessed the role of A/B testing in improving the effectiveness of a trial’s website landing page, we found that subtle differences in website content affected web-users’ response and engagement rates. In one experiment, the addition of one infographic (version A) reduced the rate at which potential enrollees completed an interest form. In another experiment, a video featuring staff (video C) also altered engagement with the study website. In both experiments, we found that the original landing page yielded the highest response rate to the online interest form. Together, these experiments illustrate how A/B testing may be applied to empirically evaluate recruitment content and improve participation in clinical trials.

In the era of digital ubiquity, the Internet has proven to be an effective tool for disseminating information [36]. The Internet has been rapidly adopted by all age groups in the United States, with an estimated 99% of individuals 18–29 years old and 75% of individuals 65 years and older using the Internet in 2021 [37]. As such, online recruitment strategies, such as study websites, have the potential to efficiently recruit participants into studies [38–42]. In fact, a previous recruitment study conducted by our group in a trial among adults with gout found that electronic and digital-based recruitment methods were responsible for over 50% of participant inquiries into the trial, 22% of which occurred via the trial’s website, the second most commonly used method of inquiry after telephone [19]. Furthermore, online recruitment efforts can be more cost-effective than traditional strategies [19,42,43], can reach diverse ethnic groups [41], and can even be used as a randomization tool [44]. Despite widespread adoption of the Internet, a recent Cochrane Review investigating strategies to improve recruitment to randomized trials discussed few Internet-based methods [45]. Common reasons cited by investigators for not using internet-based methods to recruit trial participants include lack of experience with technology-based approaches, insufficient planning in the development stages, misconceptions about how and which participants might respond to digital campaigns, and the high cost of certain forms of internet advertising [46].

We designed our recruitment campaign to leverage several online recruitment strategies. In fact, all of our recruitment sources, with the exception of in-person recruitment, directed participants to our study website. In our experience, the use of a study website resulted in a strong response yield among web-users. Specifically, in Experiment 1, 15.2% of web-users who visited the website completed the online interest form. Importantly, utilizing the website allowed potential participants to learn about the study without using staff time and to inquire at times of day that were most convenient for them. These benefits are likely to translate to reduced personnel costs, as little to no effort was required from study staff to maintain the study website and online form. Further, utilizing the website for participant recruitment, rather than solely conventional advertisements (e.g., brochures), permitted us the flexibility to test the effectiveness of IRB-approved recruitment material in a low-cost and time-efficient manner.

Unlike conventional advertisements, website content can be varied in real time and multiple versions can exist simultaneously. This flexibility offers opportunities to compare performance and optimize the content being displayed to participants. This is especially important during the development of recruitment materials, as they often serve as the trial’s first impression to potential participants and thus represent a critical step in the recruitment process. However, recruitment materials and strategies are often developed and implemented by the study investigator(s) and staff without consultation from key stakeholders or testing within the target population. This approach may result in suboptimal, and hence less effective, recruitment messaging and materials. A/B testing provides real-time, frequent data on the performance of recruitment versions. As such, it offers a promising and empirical approach to optimize and refine content throughout the recruitment process. For example, in our study, we were able to identify that infographic A resulted in significantly fewer inquiries via the online form than the original landing page. Even though our versions were quite similar, our results illustrate how small changes can result in meaningful differences, and more importantly, how this tool could be used to evaluate more substantive changes to recruitment materials.

A/B testing may also be leveraged to support diversity in clinical research by refining recruitment materials such that underrepresented groups have a higher response rate. Several racial and ethnic groups have been historically underrepresented in clinical trials, including Hispanic/Latino Americans and Black/African Americans, who together represent over 30% of the US population [47]. Despite the promotion of more equitable and diverse research by national organizations [48], inclusion of these groups in research continues to be disproportionately low [49,50]. Multiple barriers have been discussed in the literature as to why this remains true, such as mistrust of research, limited awareness of opportunities, lack of transportation, or economic constraints [51–53]. However, one area that is rarely discussed in the literature is the influence of the recruitment strategies and materials used in clinical research.

Our group’s previous study investigating the demographic differences in enrollees by recruitment method highlighted the potential bias recruitment strategies may introduce to trials [19]. More specifically, some recruitment outreach methods may never reach underrepresented groups in research or may do so at a disproportionately low rate [19]. Something we did not test, however, was the effectiveness of the recruitment materials that were reaching underrepresented populations. It is increasingly commonplace in research to test whether a specific treatment or intervention may be more or less effective in various populations, such as by age, race, gender, or ethnicity. However, this is not the norm in clinical trial recruitment and represents a critical gap in the literature. As such, a secondary aim of Experiment 2 was to understand if the effectiveness of landing page versions differed by demographic characteristics of web-users. Although we were not able to answer this important question due to early termination of the STURDY trial, our study highlights a data-driven approach that can be leveraged to test the effectiveness of recruitment materials in diverse populations and promote inclusivity in clinical trial recruitment.

5. Limitations and strengths

This study has limitations that should be considered. First, this was a pilot study, and the sample sizes of the experiments were small. Also, the experiments were conducted well after the start of recruitment; ideally, A/B testing would be conducted at the start of recruitment. Second, the randomization allocation in Experiment 2 was unequal across landing page versions. The reason for this imbalance is unclear but is likely incidental to the early termination of the study. Third, this study was limited to a few geographic areas. Multicenter trials could expand on the A/B testing used in this pilot to understand the effectiveness of recruitment materials by geographic location, potentially going so far as to deliver versions based on the zip code associated with a web-user’s IP address. Fourth, our experiments only examined the applicability of A/B testing to features of the website landing page. There are several other scenarios in which A/B testing strategies may be utilized to optimize recruitment and retention in clinical trials, such as the level of detail included on websites or even within survey instruments. Fifth, our second A/B test compared videos of study personnel that differed in many ways (e.g., the race, gender, and voice of the staff), any of which may contribute to differences in performance. The video characteristics, such as the length of the video, were also different. As such, drawing conclusions about these individual factors would be inappropriate. However, this limitation does not apply to simpler use cases, such as page color or font, where only one factor is isolated during the A/B test. Sixth, the tools utilized in this manuscript are rapidly changing. For example, Optimizely no longer offers a free platform, which was part of the rationale for not using it in Experiment 2 [35]. Nevertheless, the principles explored in our paper are replicable, since our primary measure of success (completion of an embedded online interest form) does not depend on the referring A/B testing platform. Last, the development of the website and site maintenance were performed by volunteer members of the investigative team. The time spent creating the A/B testing experiments and tracking response rates was not documented, but represents an important consideration for future research on A/B testing. Nevertheless, one of the greatest expenses for digital recruitment, with the exception of electronic record-based approaches, continues to be advertisements that direct participants to the study website [17].

This study also has several strengths. To the best of our knowledge, this was the first study to disseminate findings regarding the application of A/B testing platforms in a clinical trial recruitment campaign of older adults. The A/B testing tools collected detailed data on web-user engagement, allowing us to compare the effectiveness of website landing pages, recruitment infographics, and recruitment videos. In doing so, this study highlights an empirical approach to test research recruitment materials and procedures. This approach may also be used in the future to inform the development and modification of research recruitment materials. For example, a study team may use the results from A/B testing on their website to inform the development of print materials, such as brochures or flyers. It could also be used to test the effectiveness of the written content on recruitment materials, such as the level of detail included in a recruitment message. Testing these differences in the beginning stages of a trial may lead to enhanced recruitment approaches and, thus, more timely and cost-effective recruitment.

6. Conclusion

In conclusion, this pilot study using A/B testing platforms on a clinical trial recruitment website provides a foundational understanding of the role of this approach in optimizing content and enhancing recruitment. Future research is needed to understand the efficacy of this approach in clinical trials and whether such changes can improve the recruitment of underrepresented groups in clinical research.

Acknowledgements

We would like to thank Olive Tang, Manik Arora, Seamus Wang, and Simon Zhang for their work in developing the infographics utilized in Experiment 1 and Dr. David Reiss and his students at Towson University for recording the videos utilized in Experiment 2.

Funding

STURDY is supported by the NIH/NIA (U01AG047837). SPJ is supported by the NIH/NHLBI (K23HL135273). STURDY was also supported by the Johns Hopkins Institute for Clinical and Translational Research, which is funded in part by the National Center for Advancing Translational Sciences (NCATS) (Grant No. UL1TR003098).

Appendix

Fig. A1. Original landing page.

Table A1.

Experiment 2 video scripts and links.

Version A: https://www.youtube.com/watch?v=jGYJqEyQUyI

0:00–0:22
Every year one in three individuals over the age of 70 will fall. The consequences of these falls can be catastrophic with $23 billion healthcare dollars spent helping patients recover from these falls each year.
According to the Centers for Disease Control and Prevention, in 2013, 58 people for every 100,000 US adults over the age of 65 died from a fall. This number has increased substantially over the past 10 years.
Dr. Appel is a physician and researcher at Johns Hopkins who is very much interested in the topic of falls. Let’s hear from him…
0:57–1:19
As Dr. Appel mentioned the effect of falls can be extremely unfortunate and are not just limited to an injury - falls in people over the age of 70 can result in deaths.
So we need to dig deep into the topic, and Dr. Appel has taken the first step to preventing falls with his STURDY study. Let us hear about the main investigator’s vision of STURDY.
1:43–1:58
However, we don’t know if it really works. If it does work we don’t know at what dose. Should people take a mega dose each day or a small supplement? Dr. Appel how will STURDY address this question during the trial?
2:19–
The success of clinical trials would not be possible without volunteers like you! We have made it as easy as possible for people to participate with our 2 convenient locations. One is at ProHealth in Gwenn Oak, located near the Social Security complex. The other is in Hagerstown, Maryland, again, very conveniently located just off of Route 40. Both sites have free parking.
By agreeing to participate, you can make an incredible difference by contributing to our understanding of vitamin D and falls.
Version B: https://www.youtube.com/watch?v=Ydf_mkDX6OE
0:00–0:34
Every year one in three individuals over the age of 70 will fall. The consequences of these falls can be catastrophic with $23 billion healthcare dollars spent helping patients recover from these falls each year. Furthermore, some falls can lead to long-term disability or even death.
Therefore, the team is planning to perform the STURDY Trial to examine how vitamin D may affect risk of falling. Let us hear directly from Letitia, the study coordinator, about what the STURDY study is.
0:48–0:54
That seems pretty interesting Letitia. Can you briefly go through the entire STURDY process?
1:17–1:24
Thank you Letitia. Let us now go to Jeanne, who is the Director of Clinical Operations at ProHealth, our Baltimore location, and ask her more about the process.
1:38–1:50
Because these individuals are volunteers, we actually try to make it as easy as possible for people to participate, so we have two clinical centers in Gwenn Oak and Hagerstown.
1:51–1:55
So what motivates the Johns Hopkins’ team and participants to look at the problem of falls?
2:30–
Thank you for watching! We’d love to see you at Prohealth or Hagerstown!
Version C: https://www.youtube.com/watch?v=TjrzNd27lJA
0:00–0:18
Every year one in three individuals over the age of 70 will fall. The consequences of these falls can be catastrophic with $23 billion healthcare dollars spent helping patients recover from these falls each year.
Let us ask Eva, a STURDY participant about why falls matter to her.
0:33–0:47
Based on Eva’s and other participants’ concern, a team at Johns Hopkins started looking into the effects of Vitamin D on falls and designed the STURDY study. Let us ask the STURDY study coordinator to describe the study in detail.
1:23–1:42
That sounds straightforward for the participant. Moreover, because these individuals are volunteers we actually try to make it as easy as possible for people to participate, so we have two clinical centers in Gwenn Oak and Hagerstown. Letitia works at the Prohealth center and no one could better describe the center than her. So Letitia what are your views on the center?
1:01
As Letitia said, people enjoy coming to our clinical centers and many keep coming back for different studies. Let us return to Eva to hear about her motivation to participate in more studies.
2:05
Thank you for watching! We’d love to see you at Prohealth or Hagerstown!

Footnotes

Declaration of Competing Interest

The authors declare that there are no conflicts of interest.

Clinical trial registration: This trial is registered at ClinicalTrials.gov under number NCT02166333: https://clinicaltrials.gov/ct2/show/NCT02166333

References

  • [1].Fogel DB, Factors associated with clinical trials that fail and opportunities for improving the likelihood of success: a review, Contemp. Clin. Trials Commun 11 (2018) 156–164, 10.1016/j.conctc.2018.08.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [2].Sully BGO, Julious SA, Nicholl J, A reinvestigation of recruitment to randomised, controlled, multicenter trials: a review of trials funded by two UK funding agencies, Trials. 14 (2013) 166, 10.1186/1745-6215-14-166. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [3].Walters SJ, Bonacho dos Anjos Henriques-Cadby I, Bortolami O, et al., Recruitment and retention of participants in randomised controlled trials: a review of trials funded and published by the United Kingdom Health Technology Assessment Programme, BMJ Open 7 (3) (2017), e015276, 10.1136/bmjopen-2016-015276. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [4].Campbell MK, Snowdon C, Francis D, et al. , Recruitment to randomised trials: strategies for trial enrollment and participation study. The STEPS study, Health Technol. Assess 11 (48) (2007) iii, ix–105, 10.3310/hta11480. [DOI] [PubMed] [Google Scholar]
  • [5].McDonald AM, Knight RC, Campbell MK, et al. , What influences recruitment to randomised controlled trials? A review of trials funded by two UK funding agencies, Trials. 7 (2006) 9, 10.1186/1745-6215-7-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [6].Carlisle B, Kimmelman J, Ramsay T, MacKinnon N, Unsuccessful trial accrual and human subjects protections: an empirical analysis of recently closed trials, Clin. Trials 12 (1) (2015) 77–83, 10.1177/1740774514558307. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [7].Easterbrook PJ, Matthews DR, Fate of research studies, J. R. Soc. Med 85 (2) (1992) 71–76. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [8].Holden G, Rosenberg G, Barker K, Tuhrim S, Brenner B, The recruitment of research participants: a review, Soc. Work Health Care 19 (2) (1993) 1–44, 10.1300/J010v19n02_01. [DOI] [PubMed] [Google Scholar]
  • [9].Yancey AK, Ortega AN, Kumanyika SK, Effective recruitment and retention of minority research participants, Annu. Rev. Public Health 27 (2006) 1–28, 10.1146/annurev.publhealth.27.021405.102113. [DOI] [PubMed] [Google Scholar]
  • [10].Stensland KD, McBride RB, Latif A, et al. , Adult cancer clinical trials that fail to complete: an epidemic? J. Natl. Cancer Inst 106 (9) (2014) 10.1093/jnci/dju229. [DOI] [PubMed] [Google Scholar]
  • [11].Sardar MR, Badri M, Prince CT, Seltzer J, Kowey PR, Underrepresentation of women, elderly patients, and racial minorities in the randomized trials used for cardiovascular guidelines, JAMA Intern. Med 174 (11) (2014) 1868–1870, 10.1001/jamainternmed.2014.4758. [DOI] [PubMed] [Google Scholar]
  • [12].Oh SS, Galanter J, Thakur N, et al. , Diversity in clinical and biomedical research: a promise yet to be fulfilled, PLoS Med. 12 (12) (2015), e1001918, 10.1371/journal.pmed.1001918. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [13].Al Hadidi S, Mims M, Miller-Chism CN, Kamble R, Participation of African American persons in clinical trials supporting U.S. food and drug administration approval of cancer drugs, Ann. Intern. Med (June 2020), 10.7326/M20-0410. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [14].Frampton GK, Shepherd J, Pickett K, Griffiths G, Wyatt JC, Digital tools for the recruitment and retention of participants in randomised controlled trials: a systematic map, Trials. 21 (1) (2020) 478, 10.1186/s13063-020-04358-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [15].van Gelder MMHJ, van de Belt TH, Engelen LJLPG, Hooijer R, Bredie SJH, Roeleveld N, Google AdWords and Facebook ads for recruitment of pregnant women into a prospective cohort study with long-term follow-up, Matern. Child Health J 23 (10) (2019) 1285–1291, 10.1007/s10995-019-02797-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [16].Gross MS, Liu NH, Contreras O, Muñoz RF, Leykin Y, Using Google AdWords for international multilingual recruitment to health research websites, J. Med. Internet Res 16 (1) (2014), e18, 10.2196/jmir.2986. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [17].Juraschek SP, Plante TB, Charleston J, et al. , Use of online recruitment strategies in a randomized trial of cancer survivors, Clin. Trials 15 (2) (2018) 130–138, 10.1177/1740774517745829. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [18].Whitaker C, Stevelink S, Fear N, The use of Facebook in recruiting participants for Health Research purposes: a systematic review, J. Med. Internet Res 19 (8) (2017), e290, 10.2196/jmir.7071. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [19].Miller HN, Charleston J, Wu B, et al. , Use of electronic recruitment methods in a clinical trial of adults with gout, Clin. Trials (September 2020), 10.1177/1740774520956969, 1740774520956969. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [20].Miller HN, Gleason KT, Juraschek SP, et al. , Electronic medical record-based cohort selection and direct-to-patient, targeted recruitment: early efficacy and lessons learned, J. Am. Med. Inform. Assoc 26 (11) (2019) 1209–1217, 10.1093/jamia/ocz168. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [21].Plante TB, Gleason KT, Miller HN, et al. , Recruitment of trial participants through electronic medical record patient portal messaging: a pilot study, Clin. Trials 17 (1) (2020) 30–38, 10.1177/1740774519873657. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [22].Gleason KT, Ford DE, Gumas D, et al. , Development and preliminary evaluation of a patient portal messaging for research recruitment service, J. Clin. Transl. Sci 2 (1) (2018) 53–56, 10.1017/cts.2018.10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [23].Leonard A, Hutchesson M, Patterson A, Chalmers K, Collins C, Recruitment and retention of young women into nutrition research studies: practical considerations, Trials. 15 (2014) 23, 10.1186/1745-6215-15-23. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [24].Gupta A, Calfas KJ, Marshall SJ, et al. , Clinical trial management of participant recruitment, enrollment, engagement, and retention in the SMART study using a marketing and information technology (MARKIT) model, Contemp. Clin. Trials 42 (2015) 185–195, 10.1016/j.cct.2015.04.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [25].Meehan A, Bundorf MK, Klimke R, et al. , Online consent enables a randomized, controlled trial testing a patient-centered online decision-aid for Medicare beneficiaries to meet recruitment goal in short time frame, J. Patient Exp 7 (1) (2020) 12–15, 10.1177/2374373519827029. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [26].Thiel DB, Platt J, Platt T, et al. , Testing an online, dynamic consent portal for large population biobank research, Public Health Genomics. 18 (1) (2015) 26–39, 10.1159/000366128. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [27].Kohavi R, Thomke S, The Surprising Power of Online Experiments. Harvard Business Review. https://hbr.org/2017/09/the-surprising-power-of-online-experiments, 2017. [Google Scholar]
  • [28].Gallo A, A Refresher on A/B Testing. Harvard Business Review. https://hbr.org/2017/06/a-refresher-on-ab-testing, 2017. Accessed July 28, 2020. [Google Scholar]
  • [29].Hanington J, The ABCs of A/B Testing. salesforce.com. https://www.pardot.com/blog/abcs-ab-testing/, 2012. Accessed May 7, 2021. [Google Scholar]
  • [30].Kohavi R, Longbotham R, Sommerfield D, Henne RM, Controlled experiments on the web: survey and practical guide, Data Min. Knowl. Disc 18 (1) (2009) 140–181, 10.1007/s10618-008-0114-1. [DOI] [Google Scholar]
  • [31].Nisbett RE, Mindware: Tools for Smart Thinking, Farrar, Straus and Giroux, Spain, 2015. [Google Scholar]
  • [32].Horwitz LI, Kuznetsova M, Jones SA, Creating a learning health system through rapid-cycle, randomized testing, N. Engl. J. Med 381 (12) (2019) 1175–1179, 10.1056/NEJMsb1900856. [DOI] [PubMed] [Google Scholar]
  • [33].Michos ED, Mitchell CM, Miller ER 3rd, et al. , Rationale and design of the study to understand fall reduction and vitamin D in you (STURDY): a randomized clinical trial of vitamin D supplement doses for the prevention of falls in older adults, Contemp. Clin. Trials 73 (2018) 111–122, 10.1016/j.cct.2018.08.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [34].Appel LJ, Michos ED, Mitchell CM, et al. , The effects of four doses of vitamin D supplements on falls in older adults : a response-adaptive, randomized clinical trial, Ann. Intern. Med 174 (2) (2021) 145–156, 10.7326/M20-3812. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [35].Optimizely: Extraordinary Experiences through Experimentation. https://www.optimizely.com, 2021. Accessed January 13, 2021.
  • [36].Cline RJW, Haynes KM, Consumer health information seeking on the internet: the state of the art, Health Educ. Res 16 (6) (2001) 671–692, 10.1093/her/16.6.671. [DOI] [PubMed] [Google Scholar]
  • [37].Pew Research Center, Internet/Broadband Fact Sheet. https://www.pewresearch.org/internet/fact-sheet/internet-broadband/, 2021. Accessed May 6, 2021.
  • [38].Arab L, Hahn H, Henry J, Chacko S, Winter A, Cambou MC, Using the web for recruitment, screen, tracking, data management, and quality control in a dietary assessment clinical validation trial, Contemp. Clin. Trials 31 (2) (2010) 138–146, 10.1016/j.cct.2009.11.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [39].Tweet MS, Gulati R, Aase LA, Hayes SN, Spontaneous coronary artery dissection: a disease-specific, social networking community-initiated study, Mayo Clin. Proc 86 (9) (2011) 845–850, 10.4065/mcp.2011.0312. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [40].Quach S, Pereira JA, Russell ML, et al. , The good, bad, and ugly of online recruitment of parents for health-related focus groups: lessons learned, J. Med. Internet Res 15 (11) (2013), e250, 10.2196/jmir.2829. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [41].Graham AL, Milner P, Saul JE, Pfaff L, Online advertising as a public health and recruitment tool: comparison of different media campaigns to increase demand for smoking cessation interventions, J. Med. Internet Res 10 (5) (2008), e50, 10.2196/jmir.1001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [42].Brøgger-Mikkelsen M, Ali Z, Zibert JR, Andersen AD, Thomsen SF, Online patient recruitment in clinical trials: systematic review and Meta-analysis, J. Med. Internet Res 22 (11) (2020), e22179, 10.2196/22179. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [43].Clayforth C, Pettigrew S, Mooney K, Lansdorp-Vogelaar I, Rosenberg M, Slevin T, A cost-effectiveness analysis of online, radio and print tobacco control advertisements targeting 25–39 year-old males, Aust. N. Z. J. Public Health 38 (3) (2014) 270–274, 10.1111/1753-6405.12175. [DOI] [PubMed] [Google Scholar]
  • [44].Jones RB, Goldsmith L, Hewson P, Williams CJ, Recruitment to online therapies for depression: pilot cluster randomized controlled trial, J. Med. Internet Res 15 (3) (2013), e45, 10.2196/jmir.2367. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [45].Treweek S, Pitkethly M, Cook J, et al. , Strategies to improve recruitment to randomised trials, Cochrane Database Syst. Rev 2 (2) (2018), MR000013, 10.1002/14651858.MR000013.pub6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [46].Coert RMH, Timmis JK, Boorsma A, Pasman WJ, Stakeholder perspectives on barriers and facilitators for the adoption of virtual clinical trials: qualitative study, J. Med. Internet Res 23 (7) (2021) e26813, 10.2196/26813. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [47].U.S Census Bureau, American Community Survey; Table PST045219. Generated Using census.gov/quickfacts. https://www.census.gov/quickfacts/fact/table/US/PST045219, 2019. Accessed December 4, 2020.
  • [48].Coakley M, Fadiran EO, Parrish LJ, Griffith RA, Weiss E, Carter C, Dialogues on diversifying clinical trials: successful strategies for engaging women and minorities in clinical trials, J. Women’s Health 21 (7) (2012) 713–716, 10.1089/jwh.2012.3733. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [49].Khan MS, Shahid I, Siddiqi TJ, et al. , Ten-year trends in enrollment of women and minorities in pivotal trials supporting recent US Food and Drug Administration approval of novel Cardiometabolic drugs, J. Am. Heart Assoc 9 (11) (2020), e015594, 10.1161/JAHA.119.015594. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [50].Prasanna A, Miller HN, Wu Y, et al. , Recruitment of black adults into cardiovascular disease trials, J. Am. Heart Assoc 10 (17) (2021), e021108, 10.1161/JAHA.121.021108. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [51].Durant RW, Wenzel JA, Scarinci IC, et al. , Perspectives on barriers and facilitators to minority recruitment for clinical trials among cancer center leaders, investigators, research staff, and referring clinicians: enhancing minority participation in clinical trials (EMPaCT), Cancer 120 Suppl (07) (2014) 1097–1105, 10.1002/cncr.28574. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • [52].Ford JG, Howerton MW, Lai GY, et al. , Barriers to recruiting underrepresented populations to cancer clinical trials: a systematic review, Cancer. 112 (2) (2008) 228–242, 10.1002/cncr.23157. [DOI] [PubMed] [Google Scholar]
  • [53].Harris Y, Gorelick PB, Samuels P, Bempong I, Why African Americans may not be participating in clinical trials, J. Natl. Med. Assoc 88 (10) (1996) 630–634. [PMC free article] [PubMed] [Google Scholar]
