Author manuscript; available in PMC: 2024 Jan 1.
Published in final edited form as: J Autism Dev Disord. 2022 Jan 28;53(1):359–369. doi: 10.1007/s10803-022-05444-y

Utilization of a Best Practice Alert (BPA) at Point-of-Care for Recruitment into a US-based Autism Research Study

Andrea R Simon 1,2, Kelli L Ahmed 2,3, Danica L Limon 2,4, Gabrielle F Duhon 1,2, Gabriela Marzano 1,2, Robin P Goin-Kochel 1,2
PMCID: PMC9329488  NIHMSID: NIHMS1780305  PMID: 35089434

Abstract

Provider referral is one of the most influential factors in research recruitment. To ease referral burden on providers, we adapted the Best Practice Alert (BPA) in the EPIC Electronic Health Record (EHR) and assessed its utility in recruiting pediatric patients with autism spectrum disorder (ASD) for the national SPARK study. During a year-long surveillance period, 1,203 patients (64.0% of those with a recorded response) were Interested in SPARK and 223 enrolled. Another 754 participants not recruited via the BPA also enrolled; 35.5% of these participants completed their participation, compared to 58.3% of BPA-referred participants. Results suggest that (a) a BPA can successfully engage providers in the study-referral process and (b) families who learn about research through their providers may be more engaged and effectively retained.

Keywords: Recruitment, Best Practice Alert (BPA), Autism Spectrum Disorder (ASD), SPARK, Electronic Health Record (EHR), Enrollment


Participant recruitment and enrollment are crucial parts of clinical research and among the most challenging objectives in studies involving neurodevelopmental populations (Ahmed et al., 2020; Clinical Trial Transformation Initiative, 2018). Historical data reflect hurdles in attaining sufficient participant enrollment throughout the previous decade, resulting in high rates of timeline extensions and subsequent financial ramifications (Huang et al., 2018; Tufts University, 2020). Moreover, under-enrollment weakens the scientific value of a study by limiting statistical power and causing delays in services, interventions, and dissemination of information (Carlisle et al., 2014; Kadam et al., 2016). To address these issues, a 2016 report from the Tufts Center for the Study of Drug Development announced a research and development shift toward more patient-centered recruitment and retention methods within clinical trials. Their approach included enhanced collaboration between healthcare providers and investigators through the use of electronic health records (EHRs; Applied Clinical Trials, 2016).

EHRs provide a secure online infrastructure for housing a patient’s diagnostic and treatment history. They also aid clinical decision making and communication with patients, which leads to better healthcare outcomes and reduced overall care costs (Garrett & Seidman, 2011). Likewise, EHRs can support clinical-research efforts by facilitating participant prescreening, assessment, and data collection (Kruse et al., 2018; Raman et al., 2018). Numerous clinical-research trials have used EHRs as a cost-effective and time-efficient way to identify eligible participants for studies by matching study inclusion criteria with patient data (Devoe et al., 2019; Lai & Afseth, 2019; Raman et al., 2018). A recent review of this literature showed that use of EHRs can also extend the reach of studies to underrepresented racial and ethnic minorities compared to conventional recruitment methods; these efforts were even more successful when alert notifications were employed (Lai & Afseth, 2019). These alerts, often referred to as Best Practice Alerts (BPAs), appear once a patient becomes eligible for a study and prompt the healthcare provider to discuss research opportunities during the patient’s clinical visit. This strategy may ease referral burden by reminding and supporting providers in communications about the study.

As recipients of study-related BPAs, providers are the patient’s first point-of-contact about the study and are crucial to recruitment success and enrollment (Devoe et al., 2019; Rollman et al., 2008). However, alert fatigue and providers’ limited knowledge about the studies for which they are recruiting are common limitations to this approach (Embi et al., 2005; Embi & Leonard, 2012), either of which may reduce providers’ use of or responsiveness to the BPA. Research staff often lack the bandwidth to personally recruit across a large clinic or hospital setting; they further lack a clinical relationship with patients. For these reasons, supports like the BPA that extend the reach and resonance of research opportunities are needed. It is likely that patients have a greater sense of trust in information coming from their provider, which may increase their receptiveness to participate in research. Further research is needed on how to increase provider engagement in BPA-research recruitment while minimizing burden.

Although evidence supports the use of EHRs and BPAs in the recruitment of patients for research, to our knowledge, these methods have yet to be applied to studies involving certain vulnerable populations, including pediatric populations and patients with developmental disabilities, such as autism spectrum disorder (ASD). Moreover, many BPA-recruitment studies focus strictly on enrollment rather than overall completion rates; thus, information is needed to understand the potential influence of BPA recruitment on participant engagement and completion.

Rationale

Simons Foundation Powering Autism Research for Knowledge (SPARK).

SPARK is the nation’s largest study on ASD, with a mission to advance the pace of autism research and understand more about ASD to improve lives (Feliciano et al., 2018). SPARK has engaged more than 30 clinical sites across the country to facilitate enrollment of at least 50,000 individuals with ASD and their biological parents. The [site] team is currently the statewide SPARK site for [state]. Briefly, participation in SPARK involves online registration and consent, as well as provision of a saliva sample for genetic analysis (whole exome sequencing), all of which can be done remotely. In order to leverage recruitment resources most effectively, the [site] team surveyed SPARK participants affiliated with their site to ask about the ways in which they learned about SPARK prior to enrolling and which methods were most influential in their decision to participate (Ahmed et al., 2020). Among the 374 respondents (response rate = 26.8%), the most commonly reported methods were social media (37.2%), seeing a flyer or other print material (24.0%), email (23.0%), and speaking with a medical provider (22.0%). Interestingly, 80.0% of those who had learned about the study by speaking with a medical provider also endorsed this method as most influential in their decision to enroll. Comparatively, while many people learned about SPARK by social media, email, or print materials, smaller proportions also endorsed these as most influential in their enrollment decision (54.0%, 47.0%, and 40.0% respectively).

Objective

SPARK has an extremely large enrollment objective and broad participant inclusion criteria—features that make it well-suited for BPA outreach. Likewise, a BPA fosters ethical recruitment practices because it prompts providers to share information about study opportunities more impartially, which may subsequently increase both sample size and sample diversity. For these reasons, and based on the aforementioned pilot survey, in-clinic SPARK recruitment practices were revised to better engage providers in introducing research opportunities to families, which led to the development of a BPA. The primary objectives for the current study were to (a) evaluate the utility of a BPA in recruiting participants into the SPARK study, (b) assess providers’ responsiveness to and perceptions about the BPA, and (c) compare rates of study completion between participants recruited through the BPA versus other methods. Overall, we hypothesized that the BPA would be an effective and acceptable method for increasing participant enrollment in and completion of the SPARK study.

Methods

Best Practice Alert (BPA)

In collaboration with the EPIC-EHR Reporting Workbench team, we developed a BPA that would alert providers during clinical visits if a patient was eligible for the SPARK study. The BPA included a sentence about the purpose of the study, general eligibility criteria, three bullet points describing participation and potential benefits, and a link to the study website (Figure 1).

Figure 1. Screenshot of Best Practice Alert

The trigger criteria for the BPA were as follows:

  1. The patient had an ASD diagnosis (including Asperger syndrome, autism/autistic disorder, or pervasive developmental disorder–not otherwise specified [PDD-NOS]).

  2. The patient was currently being seen for an encounter in Neurology, Psychiatry, Psychology, or Developmental Pediatrics. Encounter types of Note Only, Telephone Only, or Orders Only were excluded.

  3. The patient was not already linked to the study in the EHR. Patients were linked if (a) they had already enrolled in the study or (b) they had already recorded a response to the BPA during a previous visit.

  4. The family did not require the use of an interpreter.

Importantly, a patient could trigger the BPA either by meeting the aforementioned criteria at the moment the encounter began or when a change made to their record during the visit satisfied these criteria (e.g., a new ASD diagnosis was recorded).
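For readers considering a similar build, the trigger logic amounts to a single predicate over encounter data. The sketch below is a minimal illustration in Python; the class and field names (Patient, Encounter, linked_to_study, needs_interpreter) are hypothetical stand-ins, since the actual rule was configured in the EPIC Reporting Workbench rather than written as application code.

```python
# Minimal sketch of the four trigger criteria, assuming hypothetical
# class and field names; the actual rule was configured in the EPIC
# Reporting Workbench rather than written as application code.
from dataclasses import dataclass

ASD_DIAGNOSES = {"autistic disorder", "asperger syndrome", "pdd-nos"}
ELIGIBLE_DEPARTMENTS = {"Neurology", "Psychiatry", "Psychology",
                        "Developmental Pediatrics"}
EXCLUDED_ENCOUNTER_TYPES = {"Note Only", "Telephone Only", "Orders Only"}

@dataclass
class Patient:
    linked_to_study: bool = False    # already enrolled or prior BPA response
    needs_interpreter: bool = False

@dataclass
class Encounter:
    patient: Patient
    diagnoses: list
    department: str
    encounter_type: str

def bpa_should_fire(encounter: Encounter) -> bool:
    """Evaluate the trigger criteria; re-evaluated whenever the chart changes."""
    return (
        any(dx.lower() in ASD_DIAGNOSES for dx in encounter.diagnoses)   # 1
        and encounter.department in ELIGIBLE_DEPARTMENTS                 # 2
        and encounter.encounter_type not in EXCLUDED_ENCOUNTER_TYPES     # 2
        and not encounter.patient.linked_to_study                        # 3
        and not encounter.patient.needs_interpreter                      # 4
    )

# A new ASD diagnosis recorded mid-visit satisfies criterion 1 and
# would cause the alert to fire during that same encounter.
visit = Encounter(Patient(), ["Autistic Disorder"], "Neurology", "Office Visit")
assert bpa_should_fire(visit)
```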

Once the alert message was triggered, providers were prompted to briefly describe the SPARK study to the family. They could then either dismiss the BPA or record a response of Interested, Declined, or Enrolled. Interested indicated that the patient wanted to receive additional information about the study from the research team. Declined indicated that the patient was not interested in the study at that time. Enrolled indicated that the patient said they were already participating in SPARK. Dismissing the BPA removed it from the screen; however, it would fire again the next time the patient’s chart was opened. Any response other than “dismiss” automatically linked the patient to the study in the EHR under the selected status (i.e., Interested, Declined, Enrolled). Concurrently, the response sent the research team an inbox pool message with the patient’s contact information. Once a patient was linked to the study in the EHR, they could not trigger the BPA to fire again. Figure 2 outlines this workflow, and a code sketch follows the figure. The BPA was activated for all primary provider types, including medical doctors, psychologists, residents, fellows, and social workers. Hereafter, “patient” refers to an individual who triggered the BPA, and “participant” refers to a patient who enrolled in the SPARK study and provided consent.

Figure 2. BPA Workflow
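A minimal sketch of the response handling just described, assuming hypothetical helper functions (link_patient_to_study, send_inbox_pool_message) for actions that EPIC performed automatically:

```python
# Sketch of the response handling shown in Figure 2. Both helper
# functions are hypothetical stand-ins for actions that EPIC performed
# automatically; they are not part of the production build.

VALID_RESPONSES = {"Interested", "Declined", "Enrolled"}

def link_patient_to_study(patient, status: str) -> None:
    # Linking under any recorded status suppresses all future firings.
    patient.linked_to_study = True
    patient.study_status = status

def send_inbox_pool_message(patient, response: str) -> None:
    # Notifies the research team; the real message included the patient's
    # contact information, visit date, department, and provider name.
    print(f"Inbox pool message sent: {response}")

def handle_bpa_response(patient, response: str) -> None:
    if response == "Dismiss":
        # Dismissal only clears the screen; the alert fires again the
        # next time the patient's chart is opened.
        return
    if response not in VALID_RESPONSES:
        raise ValueError(f"Unknown BPA response: {response}")
    link_patient_to_study(patient, status=response)  # no further firings
    send_inbox_pool_message(patient, response)       # alerts the study team
```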

BPA Education

Prior to launching the BPA, in-person educational presentations were conducted with each department that would receive the alert to explain the SPARK study, describe how to respond to the BPA, and answer questions. Suggested language was provided for how to verbally introduce the study to families in 30 seconds or less by emphasizing that questions would be addressed by the study team when the patient was contacted. Print materials about SPARK and how to use the BPA were distributed to all clinical sites within each department and shared via email. The BPA launched on 9/12/2018 in the departments of Psychiatry and Neurology as a pilot effort to assess the alert’s efficacy and collect informal provider feedback. On 10/17/2018, the BPA was expanded into two additional departments—Psychology and Developmental and Behavioral Pediatrics. Collectively, the BPA launched in 22 different clinic locations within the [site] Hospital System.

Follow-Up Efforts

BPA referrals were routinely transferred from the automated EHR inbox messaging system into an Excel spreadsheet to track follow-up efforts and referral patterns over time. Once documented, all referrals with a status of Interested or Enrolled were then contacted by email, phone, and/or a mailed letter with IRB-approved recruitment language. During the initial months of the BPA, various follow-up schedules were implemented, and outcomes (i.e., date of contact, patient response, study enrollment status) were carefully tracked in the spreadsheet (Figure 3).

Figure 3. Post-BPA Follow-up Workflow

SPARK Participation

Interested patients received an invitation email with a link to visit the SPARK website to learn more about the study and enroll online. The enrollment process took approximately 30 to 40 minutes, and informed consent was obtained electronically. Those who consented to provide a DNA sample for genetic analysis were mailed a saliva-collection kit with instructions on how to collect the saliva sample and ship it to the lab for processing. Returned kits that passed quality-control checks were documented in the SPARK study portal by the lab. This information was then used by the study team to update the participant’s study status in the EHR to “completed,” at which time their participation in SPARK was considered complete.

Provider Feedback

The BPA was intended to encourage provider referrals to SPARK while reducing burdens associated with traditional research-referral processes. To assess provider burden, a brief questionnaire was distributed via REDCap to measure provider attitudes and solicit feedback. Email invitations to complete the online questionnaire were sent to 140 providers who had received the SPARK BPA three or more times. Following the initial invite, providers received up to three email reminders ten days apart. Completion of the questionnaire was incentivized with a chance to win a catered lunch for their department, up to a $500 value. The survey was open from 11/20/19 to 3/16/20.

Data Analysis

Information from an EHR-inbox message (triggered by a BPA response) included the patient’s response (Interested, Enrolled, or Declined) and their contact information, the visit date and department, and the provider’s name. This information was transferred to an Excel spreadsheet for follow-up and tracking purposes. For the current analysis, only referrals from the first 12 months following the BPA launch (9/12/18–9/12/19) were examined. Because participation in the SPARK study could be completed entirely from home, most participants returned their saliva samples within two months, thereby completing their participation. For this reason, we allowed a two-month timeframe for the return of saliva kits (i.e., until 11/12/19) when analyzing how many participants completed their SPARK participation. Demographic information for patients who responded to the alert, including race, ethnicity, and payor status, was exported from the EHR and used to characterize the study sample (see demographics in Table 1).
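The structure of each tracked referral and the two analysis windows can be summarized as follows; this is an illustrative sketch, and the field names are assumptions rather than the actual spreadsheet columns.

```python
# Illustrative record structure for the tracking spreadsheet and the
# two analysis windows described above; field names are assumptions,
# not the actual spreadsheet columns.
from dataclasses import dataclass
from datetime import date

@dataclass
class Referral:
    response: str      # "Interested", "Enrolled", or "Declined"
    visit_date: date
    department: str
    provider: str

BPA_LAUNCH = date(2018, 9, 12)
REFERRAL_CUTOFF = date(2019, 9, 12)   # first 12 months of BPA referrals
KIT_DEADLINE = date(2019, 11, 12)     # 2-month allowance for kit returns

def in_analysis_window(referral: Referral) -> bool:
    """Keep only referrals from the first year following the BPA launch."""
    return BPA_LAUNCH <= referral.visit_date <= REFERRAL_CUTOFF
```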

Table 1.

Demographic Characteristics of Patients Who Responded to the BPA (n=1,879)

Characteristic n %

Race
White 1249 66.5
Black 318 16.9
Asian 108 5.8
Native Hawaiian or Other Pacific Islander 3 0.2
American Indian or Alaskan Native 5 0.3
Two or more races 33 1.8
Unknown 163 8.7
Ethnicity
Hispanic 725 38.6
Non-Hispanic 1038 55.2
Unknown 116 6.2
Payor Type
Self-Pay 194 10.3
Medicaid 814 43.3
Private 849 45.2
Unknown 22 1.2

Descriptive statistics were employed to (a) calculate total number of BPA responses; frequencies by race, ethnicity, and payor status; frequencies by type of response and department; and the range of referrals across providers; (b) calculate numbers of SPARK enrollments, enrollment rates following post-referral contact, and enrollment frequencies by racial group; and (c) characterize providers’ responses to items in the feedback questionnaire. Fisher’s exact tests were used to compare the proportions of BPA and non-BPA families who completed their SPARK participation during the study timeframe.
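As a concrete illustration of the primary comparison, the snippet below runs the BPA versus non-BPA completion analysis as a 2×2 Fisher’s exact test using the counts reported in the Results (130 of 223 BPA participants and 268 of 754 non-BPA participants completed the study); SciPy’s fisher_exact is used here for illustration, as the original analysis software is not specified.

```python
# 2x2 Fisher's exact test on completion by recruitment source, using
# counts reported in the Results: 130/223 BPA vs. 268/754 non-BPA
# participants completed the study.
from scipy.stats import fisher_exact

table = [
    [130, 223 - 130],   # BPA: complete, incomplete
    [268, 754 - 268],   # non-BPA: complete, incomplete
]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")  # p < .0001, per the paper
```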

Results

BPA-response Frequencies

Between 9/12/18 and 9/12/19, a total of 3,634 unique patients triggered the alert and 1,879 (51.7%) patients had a response recorded, with an average of 23 responses per week (Figure 4). Of these, 1,203 (64.0%) were Interested, 646 (34.4%) Declined further contact, and 30 (1.6%) responded Enrolled, which indicated that they were already participating in the study but were not affiliated with the [site] site.

Figure 4. Number of Responses per Week (Mean = 23)

Race, ethnicity, and payor information for all BPA respondents (n=1,879) was pulled from the EHR and is presented in Table 1. The majority of patients were reportedly white and non-Hispanic, with relatively equal proportions relying on Medicaid versus private insurance.

The BPA fired for 335 different providers, of whom 112 (33.4%) responded to the alert at least once (see Table 2). The total number of BPA responses recorded per provider ranged from 1 to 248.

Table 2.

Provider Engagement by Department (n = 112)

Department n %

Neurology 49 43.8
Developmental Pediatrics 11 9.8
Psychiatry 23 20.5
Psychology 4 3.6
Other Advanced Practitioner* 25 22.3

* Includes social workers, nurse practitioners, residents, and fellows who did not explicitly belong to any specialty.

SPARK Enrollment and Completion Rates

A total of 223 patients registered for the SPARK study after receiving the BPA; 220 (98.6%) had responded Interested and 3 (1.3%) had responded Declined but enrolled independently, without contact from the study team. Among these BPA participants, 130 (58.3%) completed the study by returning their saliva-collection kit. An additional 754 participants with ASD enrolled in the study as a result of other recruitment strategies (e.g., social media, word of mouth, online search), which were not halted during the BPA surveillance period because of recruitment demands; however, none of these participants received the BPA. Of these, 268 (35.5%) returned saliva kits, thereby completing the study. A significantly higher proportion of BPA participants (58.3%) versus non-BPA participants (35.5%) completed the study by returning their saliva kit (Fisher’s exact p < .0001). Furthermore, among BPA participants, a significantly higher proportion of non-white participants (68.3%) completed the study compared to white participants (54.1%; p = .028). Comparatively, among non-BPA participants, completion rates for non-white (39.7%) versus white (35.6%) participants did not differ (Fisher’s exact p = .174; see Table 3).

Table 3.

Completion Rate by Recruitment Source and Race

                 BPA, n (%)     Non-BPA, n (%)

Non-White
 Incomplete      20 (31.7)      108 (60.3)
 Complete        43 (68.3)*      71 (39.7)

White
 Incomplete      68 (45.9)      278 (64.4)
 Complete        80 (54.1)*     154 (35.6)

Unknown
 Incomplete       5 (41.7)      100 (69.9)
 Complete         7 (58.3)       43 (30.1)

Grand Total     223             754

Note. Analyses compared proportions of complete and incomplete participants by racial group within BPA-status groups (BPA or Non-BPA).

* Significant at p < .05.

Provider Feedback

A total of 43/140 providers (30.7%) responded to the BPA-feedback questionnaire (see items and results in Table 4). Most providers (83.7%) felt sufficiently educated about SPARK before the BPA launched in their department, and the majority (69.7%) felt “comfortable” or “very comfortable” introducing the SPARK study to a family based on information provided in the alert’s text. Twenty-two providers (51.2%) reported spending less than 1 minute discussing the study, with the remaining 21 (48.8%) spending between 1–5 minutes discussing the study. Respondents were also encouraged to provide open-ended feedback and/or ask questions. Approximately 25% reported that the SPARK BPA was too “poppy,” meaning that when a provider chose to dismiss the BPA, it continued to “pop-up” as they moved through the chart. The purpose of the dismiss option was to silence the alert if discussing research was not appropriate at that time. However, it was only silenced for a short time to encourage providers to record a viable response, which resulted in its firing multiple times during a visit. Additionally, some providers shared that they were uncomfortable discussing research with their patients (9.3%), did not feel that research opportunities should be offered to patients at the point of care (14.0%), or could not find a proper time to discuss the study (i.e., had no time/was not the right time, 55.9%; see Table 4).

Table 4.

Provider Feedback Survey Responses (n = 43)

Response n %

1. Do you feel like you were educated enough about SPARK before you started receiving the SPARK BPA?
 Yes 36 83.7
 No 7 16.3

2. How comfortable are you introducing the SPARK study to a family, based on the information provided in the BPA?
 Not comfortable at all 4 9.3
 Somewhat comfortable 9 20.9
 Comfortable 13 30.2
 Very comfortable 17 39.5

3. How long do you typically spend talking to the patient about SPARK?
 Less than 1 minute 22 51.2
 1–5 minutes 21 48.8
 6–10 minutes 0 0
 More than 10 minutes 0 0

4. Would you like future studies to use a BPA to help you identify potentially eligible families?
 Yes 18 41.9
 No 7 16.3
 Depends on the study 18 41.9

5. What is the most common reason you choose “Dismiss”?
 Not sure if the patient has ASD 9 20.9
 Don’t have time to talk to patients about research 10 23.3
 It’s not the right time to discuss with the patient 14 32.6
 NA – I have not selected dismiss 3 7.0

6. Had you ever referred a family to participate in any research study at TCH or BCM before you started receiving the SPARK BPA?
 Yes 23 53.5
 No 15 34.9

7. How much do you agree with this statement: “Research opportunities should routinely be offered to patients as a standard part of clinical care”?
 Strongly disagree 2 4.7
 Disagree 4 9.3
 Agree 23 53.5
 Strongly agree 14 32.6

Discussion

BPA Response

During the first twelve months following the launch of the BPA effort, the alert fired for 3,634 unique children, demonstrating its expansive reach within a pediatric population across a large health system. Furthermore, a total of 1,879 BPA responses were recorded during this time, indicating provider action (versus dismissal) in 51.7% of cases. This suggests that many providers are willing and able to quickly and consistently communicate study information at the point-of-care with the aid of EHR tools like a BPA, even when the study pursues a vulnerable population. The substantial range in the total number of BPA responses recorded by each provider (1 to 248 responses) is likely attributable to variability in patient load, provider specialty, provider type, individual practice patterns, and familiarity with the study team.

The finding that 64.0% of patient responses were Interested indicates that a majority of patients are receptive to study opportunities introduced at the point-of-care, even when receiving only a brief amount of study information. The BPA’s Decline response also allowed for quick identification of 646 uninterested patients, avoiding expensive, time-consuming, and/or unwanted contact efforts toward this group (e.g., mailings, cold calls). Relatedly, the nature of the BPA provided a less intrusive method for informing families about the study while preserving their privacy, creating an opportunity to decline interest before the study team reviewed their medical records (the traditional approach for identifying patients who meet study eligibility criteria). Moreover, learning about the study through a trusted provider increased the relevance of the study opportunity when study staff contacted Interested families by phone call or email.

SPARK Enrollment Rates

The BPA led to the successful enrollment of 223 SPARK participants with ASD in a 12-month period, which accounted for 23.0% of all SPARK enrollments at our site that year. Among those who responded to the BPA as Interested, approximately one in five eventually enrolled. This may be viewed as a lower-than-desired conversion rate. However, given the number of patients being contacted and that the primary contact method was email, this enrollment rate is quite high, especially when compared to other point-of-care recruitment efforts. For example, in the 12-month period prior to the launch of the SPARK BPA, providers referred 175 patients to the study team during clinical visits, which led to 51 enrollments, less than a quarter of those enrolled via the BPA over an equivalent period. It is also possible that some patients responded Interested when, in fact, they were not (i.e., acquiescence bias), or that some providers recorded an Interested response without actually discussing the study, lessening the impact of follow-up efforts. Finally, the timeliness of the study team’s follow-up may have influenced an individual’s likelihood of participating. Further research is needed to understand how to improve the conversion rate of Interested patients to Enrolled participants.

Reviewing contact methods and follow-up efforts showed that emailing patient families after a provider referral via the BPA was an effective form of contact about the study. While phone calls were also effective, they were more time consuming. An important caveat was balancing the rate of BPA referrals against the study team’s capacity for efficient follow-up. Additionally, 27 participants (12.1%) enrolled on their own prior to any contact by the study team, suggesting that having providers introduce the study and direct patients to a study website is sufficient for recruiting some participants. This may also be a testament to the importance of website appeal and autonomous enrollment processes when working with sensitive or highly burdened patient populations.

BPA vs. Non-BPA Participant Completion Rates

Recruiting participants via providers at the point-of-care is known to be influential in the decision to join a research study (Baer et al., 2012; Baquet et al., 2006; Getz, 2017). Our results demonstrated that participants recruited via their medical provider (i.e., the BPA) were significantly more likely to fully complete their participation compared to those recruited in other ways, especially among non-white participants. This is a particularly valuable observation, as the success of any study depends on participants completing their participation. Furthermore, this highlights a potential opportunity to use BPA-facilitated recruitment to improve representation of diverse populations in clinical research. In fact, in our companion project, where the SPARK BPA was extended into five primary pediatric care practices selected because they serve relatively high numbers of families of color and children with ASD, we found that a significantly higher proportion of families of color seen in primary care were Interested compared to families of color seen in subspecialty clinics (Duhon et al., in press). It may be that families who learn about the study in this way view it as a recommendation from their doctor, similar to other types of recommendations intended to benefit their child; as such, they may be motivated to complete participation in order to realize the full benefits of this recommendation.

Provider Feedback

The majority of providers who responded to the BPA survey felt sufficiently educated about the BPA, felt comfortable sharing information about SPARK with families, and spent only a few minutes introducing the study. These metrics suggest that the BPA may be an acceptable strategy for engaging providers in study recruitment without competing with other visit needs. However, a common complaint was that the BPA fired too frequently during the same clinical visit, which led to frustration and sometimes requests to remove the BPA. To address this concern, the study team frequently reminded providers that dismissing the BPA was not a true response; rather, it acted as a “snooze button” that only removed the alert from the screen temporarily. By taking a moment to record a response before moving on, providers could prevent the alert from triggering further. However, this messaging may not have reached many of those with this concern because of the challenges of disseminating information to a large number of providers. Timed-system “snoozes” could be implemented to lengthen the interval between dismissing the alert and its readiness to fire again, thereby reducing provider frustration. It is also possible to offer a “silent” BPA that does not “pop up” at all but is simply included in a specified “research” tab in the EHR. However, this type of alert requires the provider to remember to check a tab that is not typically in their workflow, a potential referral burden that we aimed to avoid.
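As a rough sketch of the timed-snooze idea proposed above, a configurable suppression interval could gate re-firing after a dismissal; the interval shown is illustrative, not a value from this study.

```python
# Hypothetical timed-snooze check: after a dismissal, suppress the BPA
# for a set interval instead of re-firing on the next chart open.
from datetime import datetime, timedelta
from typing import Optional

SNOOZE_INTERVAL = timedelta(hours=4)  # illustrative value, not from this study

def alert_may_fire(last_dismissed: Optional[datetime], now: datetime) -> bool:
    """Fire if never dismissed or if the snooze interval has elapsed."""
    return last_dismissed is None or now - last_dismissed >= SNOOZE_INTERVAL
```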

Some providers also shared that they did not feel comfortable discussing research opportunities during the clinical visit, did not see research as a priority or beneficial to their patients, or could not find the right time to share study information. The [hospital system] is an academic medical center, meaning that clinical care is integrated with education and research in order to provide the highest quality of evidence-based care. For this reason, it would seem customary for providers to introduce study opportunities to families, especially with the BPA as a prompt. However, there are different professional tracks within the academic medical setting (e.g., clinician-educator, researcher), and it may be that those not involved in research themselves are less knowledgeable about and comfortable with the research process, leading to reduced likelihood of their broaching this topic with patients. At the same time, they may perceive the family’s reason for pursuing clinical care as the priority and assume that (a) they are not interested in learning about research studies at this time or in this context and/or (b) that research would place an additional burden on the family. It is possible that education/training and provider modeling of how/when to introduce research opportunities to families could address some of these issues.

Overall, it is important to be sensitive to provider practice patterns when implementing a BPA in order to minimize disruptions, maximize yield, and maintain positive provider relationships. Many providers may be resistant to new tasks during clinical visits because of competing demands, limited time, and general hesitation regarding research. Study teams should make informed decisions about how a BPA operates and who is required to respond, and should understand visit processes prior to implementation in order to avoid disruptions to patient care. Furthermore, researchers should make concerted efforts to secure provider buy-in and acclimate providers to the BPA process prior to launch.

Limitations and Future Directions

This study yielded novel and valuable information about the utility of a BPA in engaging providers at the point of care to refer families of children with ASD to a research study. Nevertheless, several limitations and challenges should be noted. First, because the SPARK study criteria were broad and the BPA ran 24/7 in many clinic locations, a large number of potentially eligible families accumulated rapidly. For this reason, follow-up schedules were not always consistent and varied with study staff availability. A potential solution to better systematize follow-up is exploring the functionality of tools built into the EHR, such as MyChart, which can integrate with the BPA to automatically send patients with Interested responses the IRB-approved follow-up messages without study staff involvement. This would afford more secure outreach and further protect patient privacy while at the same time ensuring that the family received well-timed information about the study.

Another limitation of the BPA was difficulty attaining provider buy-in, which was evidenced in the discrepancy between the number of times the BPA fired and the number of responses ultimately recorded. Additionally, while educational presentations were conducted with each department prior to receiving the BPA, some providers still may not have been adequately informed, leading to incorrect use of the mechanism. We attempted to address this limitation by regularly circulating BPA educational tools, information, and FAQs via email and during faculty meetings, as well as enlisting the aid of department heads in distributing this information to their sections. However, further work is needed to understand how to (a) most effectively inform providers about new alerts and (b) motivate and empower users to discuss research opportunities with patients at the point of care. Consistent communication, including progress updates and recognition of provider efforts, is essential to the adoption of a BPA across a large hospital system. At the same time, researchers should be sensitive to “alert fatigue” among providers who are already burdened by EHR processes. Trigger criteria that are too broad, or alerts that require too much effort from the provider, may lead to frustration and decreased or incorrect engagement. For example, frustrated providers may have opted to immediately decline the BPA as a way to eliminate the alert.

Conclusions

Summary

Recruiting participants for research studies can be challenging, thereby contributing to delays in building the evidence base for advanced healthcare practices. This study shows that leveraging the EHR can significantly benefit study recruitment and completion for vulnerable, hard-to-reach populations by overcoming barriers, such as geographic limitations and limited provider engagement, in a cost-efficient, timely manner. Furthermore, findings corroborate existing evidence that families who learn about research opportunities at the doctor’s office may be more engaged and more likely to complete their participation. More research is needed to explore how the EHR can facilitate recruitment of other patient populations, such as diverse and underrepresented groups, in order to more rapidly advance high-quality healthcare.

Implications

Researchers and clinicians alike are increasingly interested in leveraging the EHR to expedite workflows, from retrospective data analysis and participant recruitment to biometric data collection and intervention delivery. Yet, despite the rapidly evolving digital-health landscape, EHR innovation to support clinical research lags far behind support for healthcare services. EHR vendors should continue to invest in the development of tools designed to facilitate these goals, not only for the benefit of clinical investigators and their patients/research participants, but also for the continued improvement of high-quality outcomes within the healthcare system.

Acknowledgements

This work was supported by a grant from the Simons Foundation (SFARI #385052, RPG-K). We are grateful to all of the families in SPARK, the SPARK clinical sites, and SPARK staff. We appreciate obtaining access to phenotypic data on SFARI Base. Approved researchers can obtain the SPARK population dataset described in this study by applying at https://base.sfari.org. We appreciate obtaining access to recruit participants through SPARK research match on SFARI Base. This work was also partially supported by the Intellectual and Developmental Disabilities Research Center (1U54 HD083092) at Baylor College of Medicine. This project was initiated by the authors without specific or supplemental funding. The sponsors played no role in designing, executing, or writing up the results of this analysis.

Footnotes

Robin P. Goin-Kochel has served as a paid consultant in the design of clinical trials for Yamo Pharmaceuticals. All other authors have no known conflicts of interest to disclose.

Publisher's Disclaimer: This AM is a PDF file of the manuscript accepted for publication after peer review, when applicable, but does not reflect post-acceptance improvements, or any corrections. Use of this AM is subject to the publisher’s embargo period and AM terms of use. Under no circumstances may this AM be shared or distributed under a Creative Commons or other form of open access license, nor may it be reformatted or enhanced, whether by the Author or third parties. See here for Springer Nature’s terms of use for AM versions of subscription articles: https://www.springernature.com/gp/open-research/policies/accepted-manuscript-terms

References

  1. Ahmed KL, Simon AR, Dempsey JR, Samaco RC, & Goin-Kochel RP (2020). Radio vs. Facebook: Evaluating two common strategies for research-participant recruitment into autism studies. Journal of Medical Internet Research. Epub ahead of print. 10.2196/16752
  2. Applied Clinical Trials. (2016, January 26). Tufts CSDD report on new patient recruitment/retention approaches. Retrieved from http://www.appliedclinicaltrialsonline.com/tufts-csdd-report-new-patient-recruitmentretention-approaches
  3. Baer AR, Michaels M, Good MJ, & Schapira L (2012). Engaging referring physicians in the clinical trial process. Journal of Oncology Practice, 8(1), e8. 10.1200/JOP.2011.000476
  4. Baquet CR, Commiskey P, Mullins CD, & Mishra SI (2006). Recruitment and participation in clinical trials: Socio-demographic, rural/urban, and health care access predictors. Cancer Detection and Prevention, 30(1), 24–33. 10.1016/j.cdp.2005.12.001
  5. Carlisle B, Kimmelman J, Ramsay T, & Mackinnon N (2014). Unsuccessful trial accrual and human subjects protections: An empirical analysis of recently closed trials. Clinical Trials: Journal of the Society for Clinical Trials, 12(1), 77–83. 10.1177/1740774514558307
  6. Clinical Trial Transformation Initiative. (2018, August 10). Recruitment. Retrieved from https://www.ctti-clinicaltrials.org/projects/recruitment
  7. Devoe C, Gabbidon H, Schussler N, Cortese L, Caplan E, Gorman C, Jethwani K, Kvedar J, & Agboola S (2019). Use of electronic health records to develop and implement a silent best practice alert notification system for patient recruitment in clinical research: Quality improvement initiative. JMIR Medical Informatics, 7(2), e10020. 10.2196/10020
  8. Duhon GF, Simon AR, Limon DL, Ahmed KL, Marzano G, & Goin-Kochel RP (in press). Use of a best practice alert (BPA) to increase diversity within a US-based autism research cohort. Journal of Autism and Developmental Disorders.
  9. Embi PJ, Jain A, Clark J, Bizjack S, Hornung R, & Harris CM (2005). Effect of a clinical trial alert system on physician participation in trial recruitment. Archives of Internal Medicine, 165(19), 2272. 10.1001/archinte.165.19.2272
  10. Embi PJ, & Leonard AC (2012). Evaluating alert fatigue over time to EHR-based clinical trial alerts: Findings from a randomized controlled study. Journal of the American Medical Informatics Association, 19(e1). 10.1136/amiajnl-2011-000743
  11. Feliciano P, Daniels AM, Green Snyder LA, Beaumont A, Camba A, Esler A, … Chung WK (2018). SPARK: A US cohort of 50,000 families to accelerate autism research. Neuron, 97(3), 488–493. 10.1016/j.neuron.2018.01.015
  12. Garrett P, & Seidman J (2011, January 4). EMR vs EHR - What’s the difference? Retrieved from https://www.healthit.gov/buzz-blog/electronic-health-and-medical-records/emr-vs-ehr-difference
  13. Getz KA (2017). Examining and enabling the role of health care providers as patient engagement facilitators in clinical trials. Clinical Therapeutics, 39(11), 2203–2213. 10.1016/j.clinthera.2017.09.014
  14. Huang GD, Bull J, McKee KJ, Mahon E, Harper B, & Roberts JN (2018). Clinical trials recruitment planning: A proposed framework from the Clinical Trials Transformation Initiative. Contemporary Clinical Trials, 66, 74–79. 10.1016/j.cct.2018.01.003
  15. Kadam R, Borde S, Madas S, Salvi S, & Limaye S (2016). Challenges in recruitment and retention of clinical trial subjects. Perspectives in Clinical Research, 7(3), 137. 10.4103/2229-3485.184820
  16. Kruse CS, Stein A, Thomas H, & Kaur H (2018). The use of electronic health records to support population health: A systematic review of the literature. Journal of Medical Systems, 42(11). 10.1007/s10916-018-1075-6
  17. Lai YS, & Afseth JD (2019). A review of the impact of utilising electronic medical records for clinical research recruitment. Clinical Trials, 16(2), 194–203. 10.1177/1740774519829709
  18. Raman SR, Curtis LH, Temple R, Andersson T, Ezekowitz J, Ford I, James S, Marsolo K, Mirhaji P, Rocca M, Rothman RL, Sethuraman B, Stockbridge N, Terry S, Wasserman SM, Peterson ED, & Hernandez AF (2018). Leveraging electronic health records for clinical research. American Heart Journal, 202, 13–19. 10.1016/j.ahj.2018.04.015
  19. Rollman BL, Fischer GS, Zhu F, & Belnap BH (2008). Comparison of electronic physician prompts versus waitroom case-finding on clinical trial enrollment. Journal of General Internal Medicine, 23, 447–450. 10.1007/s11606-007-0449-0
  20. Tufts University. (2020, January 28). Drug developers are making strides in streamlining patient recruitment and retention for clinical trials, according to Tufts Center for the Study of Drug Development [Press release]. Retrieved from https://csdd.tufts.edu/csddnews
