J Clin Transl Sci. 2024 Feb 6;8(1):e40. doi: 10.1017/cts.2024.19

Table 2.

Empowering the Participant Voice infrastructure and use case implementation at five participating sites

Column definitions:
- Site
- Scope of fielding: breadth of survey participation
- Selection of sample: census of all eligible participants, random sample, or other
- Timing of survey delivery: post-consent (0–2 months), end of study participation, annual, or other/unspecified
- Frequency of survey fielding (months): how often surveys are sent
- Survey platform: how survey invitations are sent
- Response rate: survey response rate for the site, May 2023
- Early efforts to increase response rate or representativeness of response: sites met regularly with stakeholders and implemented their recommendations aimed at increasing the reach of and response to the survey

Site A
Scope of fielding: Study level, with study principal investigator agreement; invited by the site team
Selection of sample: Census (100–200 per fielding)
Timing of survey delivery: Post-consent, end of study, unspecified
Frequency of survey fielding: Rolling
Survey platform: Email, telephone (pilot)
Response rate: 31.4%
Early efforts:
- Motivation question pilot: Tested surveys with and without optional questions to check for a negative impact on response rate; the questions slightly increased the response rate from 23% to 28%. EFFECTIVE (conclusive).
- Awareness campaign: Distributed flyers. NOT EFFECTIVE.
- Partnered with community satellite: UNDERWAY.
- Telephone outreach to Latino study: Response rate to the email invitation was 16%; response rate to the telephone call was 31%. EFFECTIVE (expensive).

Site B
Scope of fielding: Enterprise, leadership decision; RCTs only
Selection of sample: Random (500 per fielding)
Timing of survey delivery: End of study
Frequency of survey fielding: 6 months
Survey platform: Email, SMS
Response rate: 18.4%
Early efforts:
- SMS: Tested SMS survey invitations to increase participation of younger participants and people of color (POC). Overall EFFECTIVE for younger participants; NOT EFFECTIVE for POC.
- Expanded cohort: Pilot sending surveys to participants beyond RCTs. NEUTRAL.

Site C
Scope of fielding: Enterprise, only studies listed in the central CTMS
Selection of sample: Census (1000 per fielding)
Timing of survey delivery: Post-consent, end of study
Frequency of survey fielding: 2 months
Survey platform: Email, paper follow-up
Response rate: 20.3%
Early efforts:
- Raffle incentivization: Participants who return the survey have a 1:25 chance to win a $50 gift card; increased the response rate from 18% to 30%. EFFECTIVE.
- Paper surveys: Community advisors recommended sending paper surveys to Black participants to increase response rate (8%). NOT EFFECTIVE (expensive).

Site D
Scope of fielding: Enterprise, all studies across the institution
Selection of sample: Census (100–400)
Timing of survey delivery: Post-consent, end of study, annual
Frequency of survey fielding: 2 months
Survey platform: Email
Response rate: 22.4%
Early efforts:
- Brand recognition: Inserted branded graphics from brochures into the email survey invitation to increase response rate. NOT EFFECTIVE.
- Study team ambassadors: Targeted return of results to cultivate team members as ambassadors who encourage survey response. UNDERWAY.
- Public return-of-results page: Positive response from community advisors.
- Results page on study business cards (for participants): UNDERWAY.

Site E
Scope of fielding: Enterprise, all studies across the institution
Selection of sample: Census (3000–6000)
Timing of survey delivery: End of study
Frequency of survey fielding: 6 months
Survey platform: Portal, SMS
Response rate: 15.4%
Early efforts:
- Expand platforms: Current response rate using the portal (15%) is lower than in the pre-project pilot test (30%); use of other platforms requires an institutional policy change. UNDERWAY.
- Enhance representativeness: The response cohort was 84% White and 13% Black, compared with the population sent the survey (74% White, 19% Black). Instituted an Institutional Equity in Research Experience Committee to address how to better reach underrepresented communities. UNDERWAY.