J Clin Transl Sci. 2024 Feb 6;8(1):e40. doi: 10.1017/cts.2024.19

Table 1. Feedback provided by stakeholders engaged locally at participating sites throughout the design and implementation phases of the project

1. During Project Design Phase: Early Engagement*
Stakeholder themes | Action | Impact

Anticipated Value:
Assess current research participation experience, overall and for underrepresented groups | Design At-a-Glance Dashboard with built-in scoring and filtering by participant characteristics | Able to analyze and act on data for groups, including those affected by disparities; opportunity for transparency and trust building
Benchmark internally and with other CTSAs | Aggregate data to the EPV Consortium Dashboard | Evidence-based benchmarks, public-facing
Identify and sustain high scorers | Local analysis and actions | Evidence for best practices
Identify opportunities for enterprise-wide innovations | Consortium Dashboard and Learning Collaborative | Shared evidence base and use cases; opportunities to conduct multi-site clinical translational science
Measure pre/post innovation to assess impact | Dashboard views of data over time; custom reports | Clinical translational science; accountability to stakeholders
Participants feel that their concerns matter C | Communicate before and after survey fielding | Charge sites to return meaningful results
Participants can compare their experience to others’ C | Return results publicly | Charge sites to return meaningful results

Concerns:
Will groups engage? | Engage early, manage fears, and leverage community engagement and outreach expertise at each site | Imperfect sampling still provides valuable information
How to prioritize findings? | Develop a performance improvement workflow with stakeholders | Local autonomy
Will benchmarks compare apples to apples? | Standards optimize comparability | Validated tools, adherence to standards, filters
Risk of negative scores and reputational harm to the investigator or the institution | Local governance and data-sharing decisions; Data Use Agreement | Experiences are real even if unmeasured; better to know
Teams might perceive scores as punitive | Constructive performance improvement models | Ability to share use cases and best practices
Are the questions relevant to participants? C | Core questions from validated participant-centered research; free-text fields for additional input | Sites may add custom questions; free-text fields are retained, and some sites link to a formal complaint workflow
The response might damage the relationship with the research team C | Communicate privacy protections early and often | Dashboard design suppresses results in any cell with <5 responses that could risk re-identification of an individual (see the sketch after the table notes)
Lack of transparency and accountability for results and actions taken C | Communicate the plan to return results; share results and actions; engage stakeholders in analysis and action | Sites develop public-facing websites for return of results and workflows for performance improvement
Potential for tokenism C | Engage community and trusted proxies; be accountable | Public return-of-results pages; aim for transparency

Stakeholders included institutional and community members, such as institutional and research leadership, investigators and faculty, privacy/IRB staff, research coordinators and nurses, patients, research participants, community representatives and liaisons, and others. Comments and themes were not linked to specific individuals during reporting.

C Themes raised at an engagement meeting that included one or more community members, advocates, research participants, patients, or patient representatives. Community meetings ranged in attendance from 5 to >50.

* Attendee roles and affiliations were not tabulated at stakeholder meetings held in the first 6 months of the project.
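
The <5-response threshold cited in the table is a standard small-cell suppression rule for reducing re-identification risk. The Python sketch below illustrates how such a rule might be applied to grouped response counts; the function and field names are hypothetical and are not taken from the EPV dashboard implementation.

```python
# Minimal sketch of the small-cell suppression rule described in Table 1:
# any dashboard cell with fewer than 5 responses is withheld, since very
# small cells could risk re-identification of an individual participant.
# All names here are hypothetical, not from the EPV dashboard itself.

MIN_CELL_SIZE = 5  # suppression threshold noted in the table

def suppress_small_cells(cell_counts):
    """Return a copy of cell_counts with sub-threshold cells set to None."""
    return {
        group: (count if count >= MIN_CELL_SIZE else None)
        for group, count in cell_counts.items()
    }

# Example: survey response counts filtered by a participant characteristic.
counts = {"group_a": 42, "group_b": 3, "group_c": 11}
print(suppress_small_cells(counts))
# {'group_a': 42, 'group_b': None, 'group_c': 11}
```

Setting suppressed cells to None rather than zero keeps "result withheld" distinguishable from "no responses received" in downstream displays.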