Contemporary Clinical Trials Communications. 2025 Apr 11;45:101483. doi: 10.1016/j.conctc.2025.101483

Novel approaches to recruiting clinical sites for embedded pragmatic clinical trials: Insights from the AIM-back trial

Tyler L Cope a,b, Steven Z George a,b,c, S Nicole Hastings b,d,e, Courtni France d, Christa Tumminello d, Cynthia J Coffman d,f, Ashley Choate d, Trevor A Lentz a,c,g
PMCID: PMC12049813  PMID: 40321970

Abstract

Background

Embedded pragmatic clinical trials (ePCTs) assess interventions in real-world settings. Best practices for recruiting clinical sites into ePCTs are unknown, especially for sites that are not known to the study team or familiar with clinical research. We describe the site recruitment process for AIM-Back, an ePCT of two nonpharmacologic pathways for low back pain within the Veterans Health Administration (VA).

Methods

During the planning phase of the AIM-Back trial, we aimed to recruit 18–20 sites. Eligible sites required provider capacity, administrative support, and geographic separation to avoid contamination. Our three-step approach involved: (1) lead (VA personnel) identification through existing VA contacts, data repositories of VA clinicians, and promotional outreach at events and listservs; (2) lead engagement via tailored communications emphasizing participation benefits; and (3) virtual meetings with administrators and clinicians.

Results

We identified 184 leads across 53 VA healthcare systems. Leads from 40 systems responded to outreach, and recruitment meetings were conducted with 23 systems involving primary care, physical therapy, research staff, and leadership. We met our recruitment goal, securing participation agreements from 19 sites, with a median timeline from outreach to participation agreement of 3.7 months. Common reasons for non-participation included infrastructure and resource constraints, resistance to new clinical programs, and competing programs.

Conclusion

AIM-Back's recruitment highlights the complexities of ePCT site recruitment for trials engaging new clinical research sites. Our innovative three-step recruitment approach provides an example for similarly designed trials. Future ePCTs should consider comprehensive recruitment strategies to ensure clinician buy-in and study feasibility and to broaden the networks of sites available for conducting ePCTs.

Highlights

  • Site recruitment influences generalizability and feasibility in embedded pragmatic trials.

  • The AIM-Back trial employed a 3-step approach to recruit 19 Veterans Affairs sites.

  • Adapting business sales strategies can support site recruitment via enhanced lead generation.

  • Expanding beyond familiar site networks is critical to identifying sufficient trial sites.

  • Engaging clinicians at non-research sites may promote site participation and retention.

1. Introduction

Pragmatic clinical trials (PCTs) assess the effectiveness of treatments in usual care settings. Pragmatism in PCTs exists on a continuum, with benefits and drawbacks for each study design element [1]. On the more pragmatic end of the continuum, embedded PCTs (ePCTs) use existing healthcare providers to deliver trial interventions and collect outcomes through real-world data sources, such as electronic health records or insurance claims data [2]. Benefits of ePCTs include the ability to evaluate intervention effectiveness in the usual care setting, assess implementation, and optimize the generalizability of findings [3]. Challenges of ePCTs include recruiting sites that have the resources, infrastructure, and willingness to participate while also representing the target patient population. Compounding the challenge is that many ePCTs also require the participation of clinical and administrative staff without the incentive of trial-related financial compensation [4,5].

In ePCTs, recruiting clinical sites and participants that accurately reflect real-world conditions is essential for ensuring the trial results’ generalizability and external validity [6]. Recruitment often involves engaging less experienced clinical sites without dedicated research staff. Outreach to these potential partners often includes educating clinical and administrative staff about the nuances of ePCT participation. Gaining their buy-in and trust as study partners requires different engagement and communication strategies than investigators use to recruit sites with established on-site research staff.

Investigators typically select sites for trial participation based on their proven recruitment and performance capabilities within their known network [7]. However, there is limited guidance on effective methods for recruiting new sites into ePCTs, especially those outside one's existing network and with less experience in clinical trial participation [8]. In addition, the National Institutes of Health Pragmatic Trials Collaboratory emphasizes the importance of deliberate site recruitment to enhance the generalizability and scalability of ePCTs, underscoring the need for guidance on effective strategies to support expanded site participation in future trials [9].

In this paper, we describe our experiences recruiting and enrolling sites for the Improving Veteran Access to Integrated Management of Back Pain (AIM-Back) Trial, an ePCT evaluating two nonpharmacologic clinical pathways for low back pain treatment within the Veterans Health Administration (VHA) [10]. Unlike many trials where sites are already known to the research team, we planned to recruit 18–20 sites, many of which were new collaborators, had limited research staff, and had no prior experience with participating in ePCTs. This paper aims to: 1) describe our three-step recruitment approach, 2) report site recruitment results, and 3) discuss lessons learned.

2. Methods

2.1. Overview of trial design

AIM-Back (NCT04411420) was a cluster-randomized clinical trial co-led by Duke University and the Durham, North Carolina VA, where the unit of randomization was the clinical site—either a VA main medical center (VAMC) or a community-based outpatient clinic (CBOC) within a Veterans Affairs healthcare system (VA HCS). A VA HCS typically includes one or more VAMCs and multiple CBOCs reporting to a single leadership team within a geographic region. A single VA HCS could contain multiple participating sites in the trial (Fig. 1).

Fig. 1.

Fig. 1

Illustrates potential site arrangements to fulfill participation requirements. VA Medical Centers (VAMCs) (A) and Community Based Outpatient Clinics (CBOCs) (B) within a VA Healthcare System (HCS) needed a participating primary care provider (PCP) clinic alongside a local VA physical therapy (PT) clinic to refer Veterans as part of the pathway. Sites without a local VA PT clinic (C) were deemed ineligible for participation.

Participating sites were randomly assigned to implement one of two embedded low back pain care pathways, either the Sequenced Care Pathway (SCP) or the Pain Navigator Pathway (PNP), as part of their regular clinical operations. Both pathways were designed to improve access to and use of guideline-concordant care for low back pain. To date, these methods of improving access to low back pain care have not been compared directly. During the site recruitment process and prior to randomization, sites interested in participating had to be willing to implement either of the two care pathways.

The initiation of both pathways began with a referral from a VA primary care provider at a participating site. The SCP started with an evaluation and treatment by a local VA physical therapist, followed by six weeks of telehealth sessions promoting physical activity [11]. Veterans followed up at 6 weeks with their physical therapist, at which time they were administered the STarT Back Screening Tool to assess for persistent risk of disability [12]. Veterans at medium or high risk of persistent disability received an additional six weeks of psychologically-informed care through telehealth [11]. The PNP involved a Pain Navigator (via telehealth) who helped Veterans choose and access non-pharmacologic treatment options like acupuncture, physical therapy, or chiropractic care, with follow-ups at 6 and 12 weeks to revise the plan of care as needed [13]. Full pathway details are in the published protocol [10].

The care pathways in this study did not include experimental treatments but rather tested two different approaches to make existing, guideline-concordant care more accessible. As confirmed by the VA in a memorandum, sites would implement a new way to organize care and collect outcomes data through processes already established as standard of care. For this reason, participation in AIM-Back did not require patient-level consent. Staff were trained to integrate the pathways into standard care without financial incentives for sites, staff, or Veterans. Veterans could decline participation if offered an AIM-Back referral. However, research-related surveys were collected on a subset of Veterans in the AIM-Back program. Survey completion did constitute research, and individual consent was required to collect surveys per Durham VA and Duke University institutional review board protocols.

The cluster randomized design required a minimum of 16 sites (eight per pathway), each enrolling 105 Veterans. To account for potential site attrition, we aimed to recruit 18–20 clinical sites. Sites were recruited and launched in two blocks to minimize the time between site agreement and pathway launch. The first block included ten sites (five randomized to each pathway), with the remaining nine sites (4 SCP, 5 PNP) launching between 9 and 12 months later. Training and launch for each site took 2–3 months, with our research team training 3–4 sites concurrently.

2.2. Requirements for site participation

To participate in AIM-Back, sites had to: 1) have an associated VA physical therapy clinic with both the capacity and willingness to participate if assigned to the SCP pathway; 2) demonstrate administrative and clinical commitment to implementing either pathway, confirmed by a signed agreement from the Medical Center Director; and 3) ensure no geographical overlap in referral patterns and no shared clinical staff with other sites to prevent contamination (Fig. 1). To ensure sites would have enough volume to meet our enrollment goals, eligible sites had to have seen at least 800 unique patients with low back pain in a primary care setting in the previous year. We considered sites with lower annual volume on a case-by-case basis, such as those with a strong clinical champion or those that were part of a larger health system with other qualifying clinics. We also set a 5000-patient upper limit to avoid overwhelming the caseloads of Pain Navigators or SCP telehealth providers.

2.3. Site recruitment process

To recruit sites, we used a stepwise approach to identify potential leads and engage key gatekeepers and partners. In this paper, 'leads' refer to clinicians or administrators within the VA HCS. Given our goal to recruit at least 16 sites, we needed to expand beyond our existing network to identify a large number of leads and convert them into participating sites. Our AIM-Back site recruitment team was led by two operations personnel with dedicated effort and expertise in implementation science, supported by the co-principal investigators and a co-investigator.

We found parallels between our recruitment needs and business sales lead prospecting, which informed our use of an adapted "business sales funnel" model [[14], [15], [16]]. This model visualizes the recruitment process as a funnel, starting with a wide pool of leads that narrows through stages of engagement and selection, recognizing that many leads will drop out along the way (Fig. 2). In business sales, this approach helps organizations systematically identify, qualify, and convert potential customers into buyers. Similarly, in site recruitment, the funnel framework provides a structured method to generate initial interest, assess feasibility, and guide prospective sites through the decision-making process toward participation.
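As an illustration only (not part of the study's methods or code), the funnel's stage-to-stage attrition can be expressed as a short calculation. The stage counts below are the health-system-level figures reported in the Results (53 systems contacted, 40 responding, 23 with recruitment meetings, 10 participating):

```python
# Illustrative sketch of the site recruitment funnel as ordered stages,
# using the HCS-level counts reported in the Results section.
funnel = [
    ("Health systems contacted", 53),
    ("Responded to outreach", 40),
    ("Recruitment meeting held", 23),
    ("Signed participation agreement", 10),
]

def conversion_rates(stages):
    """Return (stage name, rate from prior stage, rate from top of funnel)."""
    top = stages[0][1]
    rates = []
    for (_, prev_n), (name, n) in zip(stages, stages[1:]):
        rates.append((name, n / prev_n, n / top))
    return rates

for name, step_rate, overall in conversion_rates(funnel):
    print(f"{name}: {step_rate:.1%} of prior stage, {overall:.1%} overall")
```

Framing the process this way makes the rate-limiting stages explicit: the largest drop in this example occurs between the recruitment meeting and the signed agreement, which is where the paper's emphasis on clear next steps and value propositions applies.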

Step 1

Identifying Recruitment Leads

Fig. 2.

Fig. 2

'Site Recruitment Funnel' conceptual model depicting the progression from a pool of leads to participating sites. This model highlights key rate-limiting stages in the recruitment process, with opportunities at each stage to optimize speed and conversion rates.

To identify leads, we used three different strategies concurrently.

  • A)

    Warm Market Engagement

In this strategy, we identified leads through the research team's personal and professional connections, which could include existing or prior research partnerships. This "warm market" approach relies on established relationships to facilitate initial outreach and build trust. In the second recruitment block, we re-engaged leads from the first block who had shown interest but had not immediately agreed to participate in AIM-Back.

  • B)

    Leveraging Data

In addition to Warm Market Engagement, the study team expanded its list of leads by leveraging larger datasets of healthcare provider contact information. The aim was to use various platforms and data sources to build a roster of potential leads for our team to contact. Using the VA Clinical Data Warehouse, a repository storing all clinical data within the VHA, the team identified primary care providers at potential VA sites, focusing on those who treated a high volume of Veterans for low back pain (based on International Classification of Diseases-10 codes) in the prior year. These high-volume providers were prioritized as leads. Similarly, LinkedIn was used to generate a list of potential contacts, identifying primary care providers and outpatient physical therapists employed at VA HCSs nationwide.

  • C)

    Promotional Outreach

The warm market and data-driven strategies involved our team proactively identifying and contacting individual leads. Unlike those strategies, the Promotional Outreach strategy aimed to increase the trial's visibility and passively generate interest, encouraging interested individuals to initiate contact with our study team. We presented at conferences, VA webinars, and events such as the VA's Physical Medicine and Rehabilitation National Pain Call, the Pain Community of Practice Call, the VA's Health Services Research and Development National Call, the Telehealth Think Tank, and a Veterans Integrated Service Network (VISN) 6 session, focusing on the trial's mission and participation criteria. Additional outreach through social media and email listservs (e.g., physical therapy school alumni and professional groups) further promoted the trial.

Step 2

Approaching Recruitment Leads

After identifying leads, we used diverse communication channels to engage with them, including personalized emails, phone calls, and direct messaging via VA Microsoft Teams and LinkedIn. These modes facilitated high-volume, tailored, and targeted outreach to encourage engagement. We created templates for content in the subject line and message body that were specific to the lead type. For example, when approaching primary care providers, the subject lines highlighted the prospect of collaboration with Duke University. This approach aimed to evoke curiosity and leverage the reputation of a well-known university to encourage the lead to open the email. For primary care provider leads, the value proposition in the body of the email focused on the opportunity for an easy and efficient care pathway to address a well-known clinical challenge—low back pain management. Notably, the messaging centered on how we could assist the clinicians with this challenge instead of focusing on requesting help with research.

Step 3

Engaging and Selecting Clinical Sites

For leads who responded positively to the initial contact, we provided additional information about the project via email and requested to schedule a recruitment meeting. Recruitment meetings engaged site leads and key stakeholders to assess feasibility and secure buy-in. Leadership from primary care, physical therapy, and upper-level administration was prioritized to understand site priorities and approve necessary staffing adjustments, such as reallocating personnel for navigator roles if the site was randomized to the PNP or adapting PT workflows for the SCP. Practicing PCPs and PTs were also encouraged to attend to evaluate implementation feasibility, assess workflow fit, and foster early engagement. All meetings were conducted virtually.

During site recruitment meetings, the study team delivered presentations tailored to the attendees, highlighting the value proposition and emphasizing the potential benefits of participation. Our messaging emphasized a collaborative approach, focusing on 'how we can assist in implementing this program to support your clinicians,' rather than merely outlining our research objectives, such as enrollment goals.

After calls, we outlined specific next steps to expedite participation decisions, schedule subsequent meetings, and secure commitment from administrators in physical therapy and primary care. For any VA HCS interested in participating, the study team encouraged partners to identify which of their eligible sites would participate in AIM-Back, considering levels of staffing, referral overlap, and provider interest. A signed participation agreement from the VA HCS Medical Director was required for sites to participate, necessitating navigating through a hierarchical communication structure to ultimately receive the Medical Director's approval. Sites were then randomized and trained before beginning enrollment.

2.4. Data analysis

Throughout this process, we tracked the number and type of leads, the methods of outreach used, recruitment meeting details, and the roles of leads and call participants. Here, we report descriptive results of this process, as well as participating clinical site characteristics, the time between initial contact and finalization of site participation agreements, and the time between site launch and minimum Veteran enrollment at each site (n = 65). Due to the small sample size of participating clinical sites, we do not report any inferential statistics.

3. Results

3.1. Lead identification and recruitment meetings

Using our three strategies over two recruitment blocks, we identified 184 leads from 53 VA HCS. Warm Market Engagement yielded 34 leads, Leveraging Data strategies yielded 128 leads, and Promotional Outreach resulted in 22 leads. Leads from 40 of the 53 health systems responded to the initial contact (75 % response rate) (Fig. 3). Among the most frequently represented groups, primary care providers were contacted in 33 VA HCS (62.3 %), followed by physical therapists in 23 (43.4 %), with research investigators or staff in 3 (5.7 %) and other professionals, such as chiropractors, nurses, or administrators, in 2 (3.8 %). Of the 40 HCS responding to the initial outreach, recruitment meetings were conducted with 23 (57.5 %) of these systems. Of these 23 VA HCS, representation included physical therapists in 18 VA HCS (78.3 %), primary care providers in 14 (60.9 %), other professionals such as chiropractors, nurses, or administrators in 8 (34.8 %), research investigators or staff in 6 (26.1 %), and executive leadership in 3 (13.0 %).

Fig. 3.

Fig. 3

AIM-Back site recruitment CONSORT diagram illustrating the recruitment process, starting with contact with individual leads at the VAHCS level, followed by site selection (VAMCs and/or CBOCs) upon VAHCS agreement. The diagram also depicts the final enrollment of Veterans across participating sites.

3.2. Non-participation reasons

Ten VA HCS ultimately agreed to participate in AIM-Back. Among the 43 non-participating health systems, reasons included no response (n = 13), lost communication (n = 13), and outright declination (n = 16); one additional health system was interested but did not participate because we had already reached our recruitment target. Of the 16 VA HCS that declined, reasons included insufficient infrastructure or personnel (n = 2), resource constraints (n = 4), participation in competing trials (n = 3), reluctance to alter existing programs (n = 6), and no reason identified (n = 1).

3.3. Site participation

Nineteen sites from 10 VA healthcare systems were randomized to participate. Two sites from one VA HCS that agreed to participate dropped out soon after initiating the program due to reluctance by participating providers to alter existing care delivery at the sites, resulting in 17 sites ultimately completing enrollment. The median time from initial contact to participation was 3.8 months (block 1) and 3.6 months (block 2). Participating sites originated from all three of the lead generation strategies, with the most generated from the Promotional Outreach strategy (n = 9) (Table 1).

Table 1.

Distribution of clinic characteristics across randomized sites.

Variable Frequency of Randomized Clinics (n)/Percentage (%)
Recruitment Lead Generation Strategy
Warm Market Engagement 4 (21.1 %)
Leveraging Data 6 (31.6 %)
Promotional Efforts 9 (47.4 %)
Site Size by LBP Volume in Referring PCP Clinics
<1000 4 (21.1 %)
1000–1999 5 (26.3 %)
2000–2999 5 (26.3 %)
3000–3999 3 (15.8 %)
4000–4999 2 (10.5 %)
Site Location by Centers for Disease Control and Prevention (CDC) Geographic Region [17]
Northeast 1 (5.3 %)
South 8 (42.1 %)
Midwest 9 (47.4 %)
West 1 (5.3 %)
Site Geographical Setting
Urban 17 (89.5 %)
Rural 2 (10.5 %)

3.4. Site distribution and characteristics

Participating sites were distributed across the United States, with the highest concentrations in the South (n = 8 sites) and Midwest (n = 9 sites), and most clinical sites were located in urban settings (n = 17) [17].

3.5. Participant enrollment and site performance

A total of 1817 Veterans were enrolled across 17 clinics, exceeding the trial enrollment goal of 1680 Veterans. These enrollment numbers reflect the active sites after two sites dropped out. Sixteen of the 17 participating clinics met the minimum enrollment goal of 65 Veterans. The median time to meet the minimum enrollment goal among these 16 clinics was 14.5 months, with enrollment pace varying from 3.5 to 23.5 months.

4. Discussion

This paper presents the results of a novel three-step approach to ePCT site recruitment that involved leveraging existing relationships and expanding beyond our familiar network using concepts from the business sales sector. Recruiting and selecting sites is critical for ePCTs, as it promotes clinician buy-in, generalizability, and feasibility of intervention integration [18]. A key finding is the relative success of our three lead identification strategies. The "warm market" strategy was a valuable starting point for initiating site recruitment but resulted in only four participating sites. In contrast, our strategies of engaging individual leads identified through repositories of provider information and promoting AIM-Back participation to larger audiences accounted for an additional 15 participating sites. Exhausting our warm network naturally led us to greater use of other strategies over time. We met our recruitment goals and had minimal site attrition (n = 2), suggesting the approach's effectiveness. This approach may be generalizable to other investigators, particularly those lacking extensive institutional relationships or the ability to collaborate with large clinical research organizations.

4.1. Factors contributing to recruitment success

AIM-Back's recruitment strategy focused on identifying clinicians from a broad range of VA Healthcare Systems as leads, regardless of prior research experience, rather than following traditional methods of partnering with investigators or research centers with established trial experience [19]. By focusing recruitment on the clinician rather than the researcher, we were able to foster a sense of investment and collaboration in the program. This approach may contribute to greater traction during implementation and higher intervention fidelity.

While common in business, lead generation strategies for clinical trial recruitment remain underexplored [14,20,21]. In applying the site recruitment funnel conceptual model (Fig. 2), we generated many leads—regardless of their familiarity with our team—to establish a "wide top-of-funnel." The funnel model provided a structured approach for systematically engaging these leads and ultimately narrowing our extensive list to more interested and qualified potential sites. As leads move through the funnel, various bottlenecks and rate-limiting factors may arise, such as the failure of leads to open and respond to emails, the inability to convert meetings to actionable next steps, and the failure to secure participation commitment. Thus, proactively identifying and addressing bottlenecks may optimize the approach's yield.

We approached email outreach as a critical rate-limiting factor in the recruitment process. We assumed that successfully converting an email lead into a recruitment meeting, and ultimately into a participating site, would hinge on whether recipients opened the initial emails and found the content compelling enough to respond. Accordingly, we prioritized tactics that we believed would optimize both open rates and response rates. Drawing on email marketing best practices, we personalized subject lines in an effort to promote interest and influence the lead to open the email [22]. Email marketing research suggests that to maximize engagement and response, email content should be brief, build trust, remain relevant, appeal emotionally to the recipient, and conclude with a clear, informative call to action [23,24]. Our approach to message content involved continuously refining these elements, focusing on clear value propositions that addressed the challenges of managing low back pain in primary care, providing concise explanations of the trial's benefits, and including a clear next step to schedule a recruitment meeting. Thirteen of the 43 non-participating HCS did not respond to our emails (30.2 %). Because we did not use email tracking software, we were unable to determine whether non-responses occurred because leads never opened the emails or because they opened them and chose not to respond. Given the importance of email open rates, future site recruitment efforts for similar studies should consider using email engagement tracking tools to improve outreach effectiveness.

During site recruitment meetings, which often involved a mix of administrators and clinicians, we focused on conveying the value of the clinical pathways to the site. A benefit of ePCTs is that sites may continue the care pathway post-trial. This was a strong selling point for many sites looking for ways to better implement the VA stepped care model for back pain. Because our research team provided minimal support for clinical delivery, many potential sites saw the value in our team helping to initiate a new clinical pathway and then allowing them to continue it independently. As part of our pitch, we highlighted how the new clinical pathway could seamlessly fit into their existing care setting, reduce wait times, and offer an "easy button" for referring to evidence-informed treatments for low back pain, all aligning with VA priorities for more personalized pain care [4,6,18,25].

4.2. Recruitment challenges

The recruitment process demanded more planning and time investment than simply relying on sites with existing relationships. A dedicated operational team member led the trial's site recruitment efforts in collaboration with the trial's statistical team. Coordinating meetings with clinical providers while meeting site selection deadlines was a significant challenge. Future trials aiming to recruit many sites without pre-existing connections must account for the time required for recruitment, as inefficiencies can substantially impact trial costs.

A major recruitment challenge in ePCTs is garnering support and enthusiasm for participation at all levels, from administrators to clinical staff. Each group needed to be willing to support modifications to workflows, even if only minimal ones. In addition to practical challenges, like getting all decision-makers on the same call, we often found that each decision-maker was motivated by a different value proposition. In some locations, administrators and leadership drove participation because they saw the program as beneficial to the institution's priorities, even when there was limited clinician buy-in. While this 'top-down' approach has benefits, it can be unsuccessful if frontline providers are not similarly bought in. Notably, two sites with high-level approval struggled to engage the clinicians tasked with delivering the intervention, leading to their withdrawal and unmet enrollment targets. Pathway champions who emerged at each site helped align all parties and secure buy-in from 'gatekeepers,' but not every site had a clear champion. Champions are important for any clinical study but were particularly important in AIM-Back because most sites lacked research staff with whom we could engage about site-specific project needs [[26], [27], [28]].

Sites that declined participation offer several anecdotal insights. First, infrastructure and staffing limitations, such as the inability to repurpose a staff member as a Pain Navigator or the lack of a physical therapy clinic, highlighted the need for early assessments of site capacity to meet intervention requirements. However, efficiencies were gained by identifying common reasons for non-participation early in the recruitment process. Sites that did not respond to initial contact, lost communication, or declined participation were quickly identified, allowing for prioritization toward more promising recruitment leads. Second, the COVID-19 pandemic further constrained resources, underscoring the importance of budgeting for flexible recruitment timelines to face unforeseen obstacles. Additionally, resistance to changing established practices at some sites pointed to the value of targeting locations with a culture of innovation and adaptability. Finally, some sites were engaged in similar initiatives, highlighting the value of assessing the landscape of ongoing trials to identify potential conflicts or competing priorities.

4.3. Considerations

Sites were recruited to deliver AIM-Back pathways without knowing which pathway they would implement, as participation required acceptance of random assignment. Furthermore, we employed various recruitment strategies, but the extent to which these specific strategies or types of recruitment leads directly contributed to a site's recruitment performance remains unclear. Numerous site- and participant-specific characteristics contribute to recruitment success, and all but one site achieved our minimum recruitment target. Consequently, it is not possible to determine whether one recruitment strategy was more effective than another in producing better trial recruitment results, and much larger samples across various trials would likely be required to more definitively establish best practices for site recruitment in ePCTs.

It is also important to note that our recruitment efforts primarily focused on acquiring sufficient participating clinics rather than ensuring the final group of participating sites would be representative of the broader VA health system. If representativeness were a primary goal, investigators would likely need to plan for a longer site recruitment phase and assemble a more targeted pool of initial leads.

Finally, while the VA's centralized provider data access simplifies lead identification, this may not always translate seamlessly to non-VA settings, though the site recruitment funnel model remains widely adaptable. Success outside the VA, particularly in larger healthcare systems, may require leveraging electronic health record data or internal provider repositories to build a robust pool of clinicians to approach for participation. Accessing these networks often involves cultivating relationships with key decision-makers, aligning with institutional priorities, and navigating complex administrative structures to secure buy-in.

4.4. Recommendations for future recruitment efforts

The business sales funnel model informed our recruitment strategy, but executing that strategy was an iterative process. Our experiences underscore the need for pre-planned, systematic site recruitment approaches and suggest several steps for investigators in future trials. First, investigators should identify comprehensive sources of provider information to generate leads beyond their warm network, including professional directories, specialty group lists, hospital directories, commercial databases, electronic healthcare systems, and professional networking platforms [29]. Utilizing these resources ensures a broad pool of leads. Second, investigators should optimize their recruitment funnel by planning for potential bottlenecks and rate-limiting factors, such as low email open and response rates. This involves more granular tracking, testing subject lines (something we were unable to do in this trial), and refining message content to enhance engagement. Third, investigators should present a value proposition that aligns with site priorities and minimizes workflow disruptions. Emphasizing benefits to stakeholders, such as enhancing daily practice or supporting organizational goals, rather than focusing on what the organization can provide for the study team, may increase clinician and administrative buy-in [30].

Conversely, investigators should avoid several common pitfalls. First, do not underestimate the number of leads required: in our study, 184 leads yielded 10 participating VA healthcare systems (roughly 18 leads per system), with varying numbers of selected sites within those systems. Even with a compelling intervention, natural barriers and attrition necessitate a substantial pool of potential sites to achieve enrollment targets and maintain recruitment timelines. Second, avoid bypassing direct engagement with implementing clinicians and relying solely on higher-level administrative approval, as this may lead to false assumptions about clinicians' willingness and capacity to carry out the intervention effectively [31]. Without clinician commitment, particularly for interventions that rely on consistent adoption, enrollment may stagnate, fidelity may suffer, and program implementation may falter [6]. Third, do not conclude recruitment meetings with stakeholders without a clear call to action and defined next steps. Without specific action items, momentum can be lost, leading to prolonged negotiations and delays in securing participation agreements.
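The lead-pool arithmetic above can be sketched as a simple back-of-the-envelope calculation. The following is an illustrative Python sketch, not trial software: the stage names and conversion rates are hypothetical assumptions chosen only to show how working backward through a funnel yields a required lead count, loosely echoing the trial's observed ratio of 184 leads to the final participating sites.

```python
import math

def required_leads(target_sites, stage_rates):
    """Estimate how many leads to generate for a recruitment funnel.

    Works backward through the funnel: divide the target by each
    stage's conversion rate, from the last stage to the first.
    stage_rates are illustrative assumptions, not AIM-Back data.
    """
    needed = float(target_sites)
    for rate in reversed(stage_rates):
        needed /= rate
    return math.ceil(needed)

# Hypothetical stages: lead responds, meeting held, agreement signed.
rates = [0.5, 0.5, 0.4]
print(required_leads(19, rates))  # 190 leads needed for 19 sites
```

A planning tool like this also makes bottlenecks explicit: improving any single stage rate (for example, email response) shrinks the required lead pool multiplicatively.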

5. Conclusion

The AIM-Back trial's site recruitment illustrates the complexities of recruiting clinical sites for ePCTs when these sites include partners that are not already known to the investigators or affiliated with academic medical centers. By expanding beyond existing connections, using innovative lead generation, and adapting outreach, we recruited and randomized 19 sites for participation. Our three-step approach emphasized involving clinicians, demonstrating value to clinical partners, and ensuring site readiness. Future ePCTs should prioritize well-planned recruitment methods that foster clinician buy-in and ensure study feasibility.

CRediT authorship contribution statement

Tyler L. Cope: Writing – review & editing, Writing – original draft, Visualization, Supervision, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. Steven Z. George: Writing – review & editing, Writing – original draft, Supervision, Project administration, Methodology, Investigation, Funding acquisition, Conceptualization. S. Nicole Hastings: Writing – review & editing, Writing – original draft, Supervision, Project administration, Methodology, Investigation, Funding acquisition, Conceptualization. Courtni France: Writing – review & editing, Writing – original draft, Resources, Project administration, Methodology, Investigation, Formal analysis, Data curation. Christa Tumminello: Writing – review & editing, Writing – original draft, Visualization, Resources, Project administration. Cynthia J. Coffman: Writing – review & editing, Writing – original draft, Resources, Methodology, Investigation, Formal analysis, Data curation. Ashley Choate: Writing – review & editing, Writing – original draft, Project administration, Methodology, Investigation, Data curation, Conceptualization. Trevor A. Lentz: Writing – review & editing, Writing – original draft, Visualization, Supervision, Project administration, Methodology, Investigation, Conceptualization.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This work is supported through cooperative agreement 4UH3AT009790-03 from the NIH, National Center for Complementary and Integrative Health (NCCIH). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health, the U.S. Department of Veterans Affairs, or the United States Government.

This manuscript is a product of the Pain Management Collaboratory. For more information about the Collaboratory, visit https://painmanagementcollaboratory.org/.

This material is the result of work supported with resources and the use of facilities at the Duke Clinical Research Institute and the Durham VA Health Care System in Durham, North Carolina.

Footnotes

Appendix A

Supplementary data to this article can be found online at https://doi.org/10.1016/j.conctc.2025.101483.

Appendix A. Supplementary data

The following is the Supplementary data to this article.

Multimedia component 1
mmc1.docx (12.8KB, docx)

Data availability

Data will be made available on request.

References

  • 1. Zwarenstein M., Al-Jaishi A., Garg A.X. Promoting Both Internal and External Validity: Designing the Trial to Match its Intention. 2021.
  • 2. Ford I., Norrie J. Pragmatic trials. N. Engl. J. Med. Aug. 2016;375(5):454–463. doi: 10.1056/NEJMra1510059.
  • 3. Cocoros N.M., et al. Pragmatic guidance for embedding pragmatic clinical trials in health plans: large simple trials aren't so simple. Clin. Trials Lond. Engl. Jun. 2023;20(4):416. doi: 10.1177/17407745231160459.
  • 4. Weinfurt K.P., et al. Pragmatic clinical trials embedded in healthcare systems: generalizable lessons from the NIH Collaboratory. BMC Med. Res. Methodol. 2017;17(1). doi: 10.1186/s12874-017-0420-7.
  • 5. Garcia C.J., et al. Practical challenges in the conduct of pragmatic trials embedded in health plans: lessons of IMPACT-AFib, an FDA-Catalyst trial. Clin. Trials Lond. Engl. Jun. 2020;17(4):360. doi: 10.1177/1740774520928426.
  • 6. Messner D.A., Moloney R., Warriner A.H., Wright N.C., Foster P.J., Saag K.G. Understanding practice-based research participation: the differing motivations of engaged vs. non-engaged clinicians in pragmatic clinical trials. Contemp. Clin. Trials Commun. Dec. 2016;4:136–140. doi: 10.1016/j.conctc.2016.08.003.
  • 7. Gheorghe A., Roberts T.E., Ives J.C., Fletcher B.R., Calvert M. Centre selection for clinical trials and the generalisability of results: a mixed methods study. PLoS One. Feb. 2013;8(2). doi: 10.1371/journal.pone.0056560.
  • 8. Johnson A.M., et al. Hospital recruitment for a pragmatic cluster-randomized clinical trial: lessons learned from the COMPASS study. Trials. Jan. 2018;19:74. doi: 10.1186/s13063-017-2434-1.
  • 9. NIH Collaboratory. Rethinking Clinical Trials: The Living Textbook. Accessed January 8, 2024. [Online]. Available: https://rethinkingclinicaltrials.org/.
  • 10. George S.Z., et al. Improving veteran access to integrated management of back pain (AIM-Back): protocol for an embedded pragmatic cluster-randomized trial. Pain Med. Malden Mass. Dec. 2020;21(Suppl 2):S62–S72. doi: 10.1093/pm/pnaa348.
  • 11. Lentz T.A., et al. If you build it, will they come? Patient and provider use of a novel hybrid telehealth care pathway for low back pain. Phys. Ther. Feb. 2024;104(2). doi: 10.1093/ptj/pzad127.
  • 12. Butera K.A., Lentz T.A., Beneciuk J.M., George S.Z. Preliminary evaluation of a modified STarT Back screening tool across different musculoskeletal pain conditions. Phys. Ther. Aug. 2016;96(8):1251–1261. doi: 10.2522/ptj.20150377.
  • 13. France C., et al. The implementation of a pain navigator program in the Department of Veterans Affairs' (VA) health care systems: a cluster randomized pragmatic clinical trial. Pain Med. Nov. 2024;25(Supplement_1):S83–S90. doi: 10.1093/pm/pnae074.
  • 14. Aretz F. Developing the Marketing and Sales Process by Implementing the Business Funnel. Satakunta University of Applied Sciences; 2016. [Online]. Available: https://www.theseus.fi/handle/10024/121934.
  • 15. Sharma K.K., Tomar M., Tadimarri A. Optimizing sales funnel efficiency: deep learning techniques for lead scoring. J. Knowl. Learn. Sci. Technol. Nov. 2023;2(2):261–274. doi: 10.60087/jklst.vol2.n2.p274.
  • 16. Conde R. Necessary condition analysis for sales funnel optimization. J. Mark. Anal. Mar. 2025. doi: 10.1057/s41270-025-00388-5.
  • 17. Geographic division or region - Health, United States. [Online]. Available: https://www.cdc.gov/nchs/hus/sources-definitions/geographic-region.htm.
  • 18. Worsley S.D., et al. Series: pragmatic trials and real world evidence: paper 2. Setting, sites, and investigator selection. J. Clin. Epidemiol. Aug. 2017;88:14–20. doi: 10.1016/j.jclinepi.2017.05.003.
  • 19. Gehring M., et al. Factors influencing clinical trial site selection in Europe: the survey of attitudes towards trial sites in Europe (the SAT-EU study). BMJ Open. Nov. 2013;3(11). doi: 10.1136/bmjopen-2013-002957.
  • 20. Vasilieva E. The sales funnel in unit-economy indicators as an effective tool of technological entrepreneurship. In: Proceedings of the International Conference on Technology & Entrepreneurship in Digital Society. Real Economy Publishing House; 2020. pp. 15–19.
  • 21. Wilson R.D. Using online databases for developing prioritized sales leads. J. Bus. Ind. Mark. 2003;18(4/5):388. doi: 10.1108/08858620310480287.
  • 22. Conceição A., Gama J. Main factors driving the open rate of email marketing campaigns. In: Kralj Novak P., Šmuc T., Džeroski S., editors. Discovery Science. Lecture Notes in Computer Science. Cham: Springer International Publishing; 2019. pp. 145–154.
  • 23. de Carvalho A.T.N. Effects of Message Design and Content on the Performance of Email Marketing Campaigns. Master's thesis; 2014. [Online]. Available: https://repositorio.ucp.pt/handle/10400.14/18224.
  • 24. Ureña R., Kou G., Dong Y., Chiclana F., Herrera-Viedma E. A review on trust propagation and opinion dynamics in social networks and group decision making frameworks. Inf. Sci. Apr. 2019;478:461–475. doi: 10.1016/j.ins.2018.11.037.
  • 25. Tuzzio L., et al. Pragmatic clinical trials offer unique opportunities for disseminating, implementing, and sustaining evidence-based practices into clinical care: proceedings of a workshop. Healthc. Amst. Neth. Mar. 2019;7(1):51–57. doi: 10.1016/j.hjdsi.2018.12.003.
  • 26. Weir A., Presseau J., Kitto S., Colman I., Hatcher S. Strategies for facilitating the delivery of cluster randomized trials in hospitals: a study informed by the CFIR-ERIC matching tool. Clin. Trials Lond. Engl. Aug. 2021;18(4):398–407. doi: 10.1177/17407745211001504.
  • 27. Damschroder L.J., Aron D.C., Keith R.E., Kirsh S.R., Alexander J.A., Lowery J.C. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement. Sci. 2009;4:50. doi: 10.1186/1748-5908-4-50.
  • 28. Harvey G., Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement. Sci. Mar. 2016;11:33. doi: 10.1186/s13012-016-0398-2.
  • 29. Saxena N.K., Saxena R. Leveraging social media in the world of sales: an untapped potential. Int. J. Adv. Comput. Res. 3(3).
  • 30. Topazian R., et al. Physicians' perspectives regarding pragmatic clinical trials. J. Comp. Eff. Res. Aug. 2016;5(5):499–506. doi: 10.2217/cer-2016-0024.
  • 31. Whicher D.M., Miller J.E., Dunham K.M., Joffe S. Gatekeepers for pragmatic clinical trials. Clin. Trials Lond. Engl. Oct. 2015;12(5):442–448. doi: 10.1177/1740774515597699.


Articles from Contemporary Clinical Trials Communications are provided here courtesy of Elsevier
