Abstract
Background
eSource software is used to automatically copy a patient’s electronic health record data into a clinical study’s electronic case report form. However, there is little evidence to assist sponsors in identifying the best sites for multi-center eSource studies.
Methods
We developed an eSource site readiness survey. The survey was administered to principal investigators, clinical research coordinators, and chief research information officers at Pediatric Trial Network sites.
Results
A total of 61 respondents were included in this study (clinical research coordinator, 22; principal investigator, 20; and chief research information officer, 19). Clinical research coordinators and principal investigators ranked medication administration, medication orders, laboratory, medical history, and vital signs data as having the highest priority for automation. While most organizations used some electronic health record research functions (clinical research coordinator, 77%; principal investigator, 75%; and chief research information officer, 89%), only 21% of sites were using Fast Healthcare Interoperability Resources standards to exchange patient data with other institutions. Respondents generally gave lower readiness for change ratings to organizations that did not have a separate research information technology group and where researchers practiced in hospitals not operated by their medical schools.
Conclusions
Site readiness to participate in eSource studies is not merely a technical problem. While technical capabilities are important, organizational priorities, structure, and the site’s support of clinical research functions are equally important considerations.
Keywords: clinical research, electronic health record, eSource, information extraction, data reuse, organizational readiness
1. Introduction
Clinical trial costs continue to rise while productivity is declining [1–4]. These trends are particularly evident in large multi-center clinical studies [5–6]. Because most clinical trial costs are site-based (incurred for work performed by site personnel or for site management and monitoring), there is a growing interest in the use of information technologies to reduce clinical site workloads [5–8]. The reuse of medical record data to auto-populate study databases has been an ongoing area of interest [9–11].
eSource software is used to copy a patient’s electronic health record data into a clinical study’s electronic case report form (eCRF) [12,13]. While there has been considerable interest in this technology, the actual number of eSource studies conducted to date is quite small. A recent systematic review found only 14 clinical studies that used direct electronic health record (EHR) data extraction through eSource [13,14]. Eight of the 14 studies were conducted at a single site with a single EHR. Four of the remaining six multi-site studies were part of the same European pilot study. Thus, while there is some evidence of eSource use at sites with sophisticated information technologies, there is little evidence for sponsors to consult when selecting sites for multi-center eSource studies. We conducted a study to determine whether it is possible to identify clinical research sites that have the personnel, policies, procedures, and information technologies to support eSource use in clinical trials. We also sought to identify barriers to eSource implementation at study sites.
2. Methods
2.1. Study Design
Investigators from the University of Arkansas for Medical Sciences, the University of Texas Health Science Center at San Antonio, and the Duke Clinical Research Institute developed the Site EHR-to-eCRF eSource Questionnaire for assessing a clinical site’s readiness to participate in an eSource study. This questionnaire collects factual information about clinical research sites as well as key research staff perceptions of their environment and data needs. The eSource site readiness survey contains three components that are completed by site principal investigators (PIs), clinical research coordinators (CRCs), and informatics leadership, such as Chief Research Information Officers (CRIOs) or organizational research informatics leadership. Because this study is descriptive and hypothesis generating, there is no primary endpoint. Rather, we use results from the survey to identify elements that likely are predictive of site success in an eSource-based clinical study. This study was approved by the Duke University Medical Center Institutional Review Board (Pro00102679). Informed consent was obtained from all survey respondents.
2.2. eSource Site Readiness Survey
The site readiness survey is divided into 8 sections that evaluate different aspects of site readiness to use eSource software in multi-center clinical trials. All respondents (CRC, PI, and CRIO) complete a set of core questions, with additional questions addressed to one or more respondent types. The total number of survey questions differed by respondent role: CRC (n=318), PI (n=191), and CRIO (n=136).
- Section 1: Site and Respondent Information – collects information about the site’s participation in other research activities and the respondent’s role and clinical research experience.
- Section 2: Potential eSource Benefits to the Site – asks about the site’s number of trials and enrollees in the prior year and ways in which eSource might benefit that site.
- Section 3: Site Enthusiasm and Organizational Support for eSource – asks about the respondent’s enthusiasm for using eSource and their perceptions of their organization’s context for change. Questions regarding key opinion leaders, senior leadership/clinical management, clinical investigators, research staff members, and information technology (IT) staff members were derived from the Organizational Readiness to Change Assessment (ORCA) [15].
- Section 4: Site Electronic Health Record (EHR) – asks about the site’s EHR capabilities to support eSource data collection, the site’s Fast Healthcare Interoperability Resources (FHIR®) capabilities, and their use in clinical care.
- Section 5: Site EHR Use in Clinical Trials – collects information on the site’s use of EHR research functions and how research data are managed.
- Section 6: Human and Organizational Resources – asks about the site’s IT personnel and procedures to support research.
- Section 7: Organizational Procedures – focuses on institutional policies, procedures, and processes that might support or inhibit eSource use.
- Section 8: Site Preferences for EHR-to-eCRF eSource – asks about the institution’s preferences for eSource software installation and how to link the site’s EHR to a study’s electronic data capture (EDC) system.
2.3. Survey Administration
Survey administration began when Pediatric Trial Network (PTN) leadership sent an email describing the study to site PIs within the network to solicit their involvement. For those agreeing to participate, PTN leadership worked with site PIs to identify CRCs and CRIOs at their site who would be interested and eligible to participate. Eligibility to participate was defined as: (1) being a CRC, PI, or CRIO at a PTN site; (2) stating a willingness to complete the survey and participate in an interview with the study team to review their responses; and (3) providing written informed consent within the study’s Research Electronic Data Capture (REDCap) system. Before survey administration began, the PTN contracted with each study site to compensate it for each of the three respondents. There was no direct compensation for individual survey respondents.
The study team contacted potential respondents to determine their interest in participating and provided access to the REDCap-based survey. Informed consent was administered through the REDCap survey. After each survey was completed, the study team reviewed the responses and contacted respondents to schedule a follow-up interview. These interviews were designed to clarify survey responses and to obtain additional information for the qualitative analyses. When follow-up interviews could not be scheduled, the study team emailed respondents questions about survey responses that required clarification.
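As an illustration of how completed responses can be pulled from a REDCap project for this kind of review, the sketch below uses REDCap’s standard records-export API; the URL and token are placeholders, and the sketch does not necessarily reflect the study team’s actual workflow.

```python
import requests

# Placeholder values; a real REDCap project issues its own API URL and
# per-user, per-project API tokens.
REDCAP_API_URL = "https://redcap.example.edu/api/"
API_TOKEN = "replace-with-project-api-token"

def export_survey_records() -> list:
    """Export all records from the REDCap project as JSON for review."""
    payload = {
        "token": API_TOKEN,
        "content": "record",     # standard REDCap records-export request
        "format": "json",
        "type": "flat",          # one row per record
        "rawOrLabel": "label",   # return answer labels rather than raw codes
    }
    response = requests.post(REDCAP_API_URL, data=payload, timeout=60)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    records = export_survey_records()
    print(f"Exported {len(records)} survey records for review")
```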
2.4. Analysis
The analysis population includes all respondent surveys completed as of July 31, 2021. Survey results are presented as n (%) for discrete variables and means with standard deviations in parentheses for continuous variables. Rank variables are averaged and presented as ordinals (e.g., 1, 2, 3, etc.). Missing values are not imputed. Analyses were conducted using IBM SPSS Statistics version 27 (International Business Machines, Armonk, New York). We conducted content analysis using an open coding approach of the interview data. This analysis was guided by the overarching research question and focused on identifying themes related to organizational and information systems characteristics at the sites [16].
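As a small illustration of the descriptive summaries described above (the study itself used SPSS), the sketch below computes n (%), mean (SD), and average-rank values for a toy set of responses; the column names and values are invented for illustration.

```python
import pandas as pd

# Toy survey extract: one row per respondent (column names and values are invented).
responses = pd.DataFrame({
    "role": ["CRC", "CRC", "PI", "CRIO"],
    "ehr_fhir_enabled": ["Yes", "No", "Yes", "Yes"],  # discrete variable
    "years_in_role": [7.0, 3.5, 12.0, 8.0],           # continuous variable
    "rank_med_admin": [1, 2, 1, 3],                   # rank given to one data type
})

# Discrete variables: n (%), as reported in Tables 4-6.
counts = responses["ehr_fhir_enabled"].value_counts()
pct = (100 * counts / len(responses)).round().astype(int)
print(counts.astype(str) + " (" + pct.astype(str) + ")")

# Continuous variables: mean (SD), as reported in Table 2.
years = responses["years_in_role"]
print(f"Years in role: {years.mean():.1f} ({years.std():.1f})")

# Rank variables: average the ranks across respondents; data types are then
# ordered by these averages and reported as ordinals (1, 2, 3, ...), as in Table 3.
print(f"Medication administration, mean rank: {responses['rank_med_admin'].mean():.2f}")
```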
3. Results
Thirty site PIs were approached for this study and twenty-three (77%) agreed to participate (Table 1). The first survey was completed on January 27, 2020 and the final interview was completed on June 7, 2021. Both survey completion and subsequent interviews were delayed when site activities were reduced due to COVID-19. Surveys were completed by 62 respondents (CRC, 23; PI, 20; and CRIO, 19). The overall response rate at participating sites was 90% (CRC, 100%; PI, 87%; CRIO, 83%). One respondent withdrew consent between their survey completion and follow-up contact. The 61 remaining consented survey respondents comprise this study’s analytic population.
Table 1.
Study population.
| Survey sites | n |
|---|---|
| Sites approached | 30 |
| Sites declining to participate | 7 |
| Sites responding positively | 23 |

| Survey respondents | CRC | PI | CRIO | Total respondents |
|---|---|---|---|---|
| Respondents consented | 23 | 20 | 19 | 62 |
| Surveys completed | 23 | 20 | 19 | 62 |
| Respondent withdrawal | 1 | 0 | 0 | 1 |
| Study population | 22 | 20 | 19 | 61 |
CRC, clinical research coordinator; CRIO, chief research information officer; PI, principal investigator; PTN, Pediatric Trial Network.
Most sites were affiliated with a hospital or health care system (CRC, 91%; PI, 100%; and CRIO, 100%), and most of their institutions had a Clinical and Translational Science Award (CRC, 68%; PI, 80%; and CRIO, 68%) (Table 2). The average respondent experience in their role at their present institution was greater than 7 years (CRC, 7.2 years; PI, 12.1 years; and CRIO, 8.0 years). Sites participated in more than ten clinical trials during the previous 12 months on average (CRC, 11.9 clinical trials; PI, 12.0 clinical trials), and most had at least one PTN study (CRC, 77%; PI, 85%). Overall, sites averaged more than two PTN trials in the previous 12 months (CRC, 2.5 studies with 57.0 enrollees; PI, 2.2 studies with 30.9 enrollees).
Table 2.
Site and respondent information.
| Description | CRC | PI | CRIO |
|---|---|---|---|
| Site is affiliated with hospital or health care system – no. (%) | 20 (91) | 20 (100) | 19 (100) |
| Institution has Clinical and Translational Science Award – no. (%) | 15 (68) | 16 (80) | 15 (68) |
| Years in this role at your institution – mean (SD) | 7.2 (6.5) | 12.1 (8.5) | 8.0 (7.8) |
| Years total experience in this role – mean (SD) | 10.3 (6.7) | 14.2 (8.7) | 10.2 (7.4) |
| Number of clinical trials participated in during past 12 months – mean (SD) | 11.9 (13.7) | 12.0 (18.5) | |
| Site participated in PTN studies during past 12 months – no. (%) | 17 (77) | 17 (85) | |
| Number of PTN studies participated in during past 12 months – mean (SD) | 2.5 (2.2) | 2.2 (1.7) | |
| Total number of patients enrolled in PTN studies during past 12 months – mean (SD) | 57.0 (122.9) | 30.9 (53.0) | |
CRC, clinical research coordinator; CRIO, chief research information officer; PI, principal investigator; PTN, Pediatric Trial Network; SD, standard deviation.
CRCs reported that in the previous 12 months 58.8% of their study data were first documented in the site’s EHR, with 17.5% first documented in paper study-specific worksheets, 10.6% in the study EDC system, and 13.1% in other sources. However, the percent of data first documented in the EHR varied greatly by site (range, 0%–100%). Nonetheless, the amount of study data first documented in EHRs effectively sets a limit on the amount of participant data that can be copied from the site’s EHR to a study’s EDC without changing clinical site procedures.
During interviews, CRCs and PIs noted that the same patient data can be captured in multiple EHR fields. One PI said that she instructs medical center staff where to document participant study information in the EHR. However, a large site CRC said that it was not realistic to stipulate where medical center staff would document study related information in the EHR.
CRCs and PIs ranked medication administration, medication order sets, laboratory, medical history, and vital signs data types as having the highest priority for automation, defined as electronically copying data from the EHR to the EDC forms (ranked 1 through 5). Most data types were required for inclusion in any eSource solution (Table 3). CRCs also ranked these data types by the time required to locate and manually copy (enter) data from their EHR to a study EDC form for a single outpatient visit and for a 3-day inpatient stay. While medication administration and medication orders ranked highest for manual data collection time, other data types with lower priorities for automation were also noted as being time consuming. In interviews, some CRCs said that they did not rank adverse events and other time-consuming data types as high priority because they did not think their collection could be automated.
Table 3.
Respondent data type rankings.
| Data types | Priority for automation: CRC | Priority for automation: PI | Must be included to use eSource software: CRC | Must be included to use eSource software: PI | CRC time for entry EHR-to-EDC: Outpatient visit | CRC time for entry EHR-to-EDC: 3-Day inpatient |
|---|---|---|---|---|---|---|
| Medication administration | 1 | 2 | 2 | 4 | 1 | 1 |
| Medication order sets | 2 | 1 | 4 | 3 | 2 | 2 |
| Lab | 3 | 3 | 1 | 1 | 6 | 6 |
| Medical history | 4 | 5 | 5 | 6.5 | 3 | 4 |
| Vital signs | 5 | 4 | 3 | 2 | 10 | 9 |
| Non-medication orders | 6 | 6 | 7.5 | 6.5 | 9 | 5 |
| Diagnosis | 7 | 9 | 9.5 | 8 | 8 | 8 |
| Procedures | 8 | 8 | 7.5 | 10.5 | 5 | 7 |
| Adverse events | 9 | 10 | 9.5 | 9 | 4 | 3 |
| Demographic | 10 | 7 | 6 | 5 | 11 | 11 |
| Other time-consuming | 11 | 11 | 11 | 10.5 | 7 | 10 |
CRC, clinical research coordinator; EDC, electronic data capture; EHR, electronic health record; PI, principal investigator. Priority for automation values are average rankings across respondents for each of the eleven data categories. Must be included values are percent agreement. CRC time for entry EHR-to-EDC values are rankings of the time required to manually enter data from the EHR into the EDC.
Many eSource technologies access EHR data via Health Level Seven Fast Healthcare Interoperability Resources (HL7 FHIR®), an international standard for exchanging health care data. However, only 42% of site EHRs were FHIR enabled, with another 21% developing this capability (Table 4). Similarly, only 21% of sites were using FHIR to exchange patient data with other institutions for routine care, with 32% developing this capability. CRCs reported that 77% of sites provided guest/temporary log-in to the EHR with read-only access. Lastly, 42% of sites require that EHR-to-EDC automation software requesting data from an EHR FHIR server be installed behind the institution’s firewall.
Table 4.
Site electronic health record access.
| Description | CRC | CRIO |
|---|---|---|
| Is the EHR at your site FHIR-enabled, i.e., is the EHR capable of sending data to and receiving data from other institutions for routine clinical care/treatment purposes? | | |
| Yes (in use now) | | 8(42) |
| No, but in development now | | 4(21) |
| Does your institution use FHIR to exchange patient data with other organizations for routine clinical care/treatment purposes? | | |
| Yes (in use now) | | 4(21) |
| No, but in development now | | 6(32) |
| Which method do you use to provide access to source data? | | |
| We print paper copies of the EHR data | 6(27) | |
| We provide a guest / temporary log-in to the EHR with read-only access | 17(77) | |
| We provide “over the shoulder access” to the EHR | 6(27) | |
| We provide access to a shadow chart | 4(18) | |
| We do not provide any access to source data in the EHR/chart | 0(0) | |
| Where would software that requests data from the EHR FHIR server need to be installed to comply with your institution’s policies? | | |
| Our policies require that such software be installed behind our institution’s firewall | | 8(42) |
| Our policies allow such software to be hosted by the Sponsor for the study | | 1(5) |
| Our policies allow such software to be hosted by a third party for the study (i.e., a data coordinating center; not the Sponsor or the site) | | 1(5) |
| All of the above would comply with our institution’s policies | | 9(47) |
CRC, clinical research coordinator; CRIO, chief research information officer; EHR, electronic health record; FHIR®, Fast Healthcare Interoperability Resources.
Cell values are n (%).
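To make the FHIR-based access summarized in Table 4 concrete, the sketch below shows the kind of REST query an EHR-to-EDC tool might issue against a site’s FHIR R4 server to retrieve one participant’s medication administration records. The base URL, access token, and patient ID are placeholders, and production eSource products typically add SMART-on-FHIR authorization and mapping of the returned resources onto eCRF fields; whether such software may run outside the institution’s firewall is the policy question posed in the final section of Table 4.

```python
import requests

# Placeholder values; a real deployment would obtain the base URL from the site
# and an access token through SMART-on-FHIR / OAuth 2.0 authorization.
FHIR_BASE = "https://ehr.example-site.org/fhir/R4"
ACCESS_TOKEN = "replace-with-oauth2-token"
PATIENT_ID = "example-patient-id"

def fetch_medication_administrations(patient_id: str) -> list:
    """Return MedicationAdministration resources for one study participant."""
    response = requests.get(
        f"{FHIR_BASE}/MedicationAdministration",
        params={"patient": patient_id},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()  # a FHIR searchset Bundle
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for resource in fetch_medication_administrations(PATIENT_ID):
        # An eSource tool would map these fields onto the study's eCRF.
        print(resource.get("effectiveDateTime"), resource.get("status"))
```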
Most respondents were affiliated with organizations that used EHR research functions (CRC, 77%; PI, 75%; and CRIO, 89%) (Table 5). However, the types of research functions used differed by respondent type (CRC, PI, or CRIO) and institution. Generally, CRIOs reported greater EHR research function availability than did CRCs or PIs. For example, only 36% of CRCs used data from an institutional clinical or research data warehouse to assess feasibility, whereas 95% of CRIOs said this function was available. Similarly, only 27% of CRCs used data from an institutional clinical or research data warehouse to find potential study patients, whereas 84% of CRIOs said this function was available.
Table 5.
Site EHR use in clinical studies.
| Description | CRC | PI | CRIO |
|---|---|---|---|
| Does your institution use any research functionality in your EHR such as using alerts to help with patient screening, associating patients with a study, scheduling research visits, getting patients to complete questionnaires on a patient portal, separating research and routine care billing? | 17(77) | 15(75) | 17(89) |
| Does your site use any of the following to assess study feasibility? | |||
| Data from an institutional clinical or research data warehouse | 8(36) | 10(50) | 18(95) |
| Data directly from the EHR via manual chart review | 10(46) | 10(50) | 15(79) |
| Data directly extracted from the EHR via electronic methods | 10(46) | 11(55) | 16(84) |
| Does your site use any of the following to find or screen potential study patients? | |||
| Data from an institutional clinical or research data warehouse | 6(27) | 4(20) | 16(84) |
| Data directly from the EHR via manual chart review | 17(77) | 10(50) | 15(79) |
| Data directly from the EHR through added study decision support rules such as Best Practice Advisories (BPAs) and in basket messages | 6(27) | 2(10) | 12(63) |
CRC, clinical research coordinator; CRIO, chief research information officer; EHR, electronic health record; PI, principal investigator.
Cell values are n (%).
Table 6 reports the percentage of respondents who agreed or strongly agreed with each organizational readiness to change assessment statement. Generally, opinion leaders, clinical investigators, and research staff were perceived as supportive of change, whereas senior leadership/clinical management and IT staff members were perceived as less supportive. Respondent interviews revealed important exceptions to these trends. A few PIs practiced in organizations that did not see clinical research as a major component of their mission and whose peers were not active researchers. Similarly, some CRIOs felt that PIs and research staff were less receptive to technological change. Instances where senior leadership/clinical management received neutral or unfavorable assessments typically occurred in organizations where the clinics/hospitals were not owned by the medical school or where the organization did not have a Clinical and Translational Science Award. In these cases, senior leadership/clinical management tended to focus on patient care and had less interest in supporting clinical research. CRC and PI perceptions of IT staff members typically differed depending on how their organization’s health IT and EHR support groups were organized. When clinical research was supported by a centralized IT group focused on patient care and business systems, PI and CRC perceptions were less favorable. However, when there was a separate research IT support group, PI and CRC perceptions were generally positive. Only a few organizations currently had IT personnel with the skills required to support eSource studies.
Table 6.
Site enthusiasm and organizational support.
| Description | CRC | PI | CRIO |
|---|---|---|---|
| Opinion leaders in my organization | |||
| Believe that the current research processes and tools can be improved | 19(86) | 18(90) | 16(84) |
| Are willing to try new research processes and tools | 17(77) | 17(85) | 14(74) |
| Encourage and support changes in research processes and tools | 18(82) | 15(75) | 16(84) |
| Work cooperatively with senior leadership/clinical management to make appropriate changes | 14(64) | 13(65) | 16(84) |
| Senior leadership/clinical management in my organization | |||
| Reward innovation and creativity to improve research processes and tools | 12(55) | 10(50) | 13(69) |
| Solicit opinions of clinical investigators regarding decisions about research processes and tools | 12(55) | 7(35) | 15(79) |
| Solicit opinions of research staff regarding decisions about research processes and tools | 11(50) | 10(50) | 12(63) |
| Solicit opinions of health IT staff regarding decisions about research processes and tools | 11(50) | 9(45) | 16(84) |
| Seek ways to improve research processes and tools | 16(73) | 16(80) | 15(79) |
| Provide effective management for continuous improvement of research processes and tools | 12(55) | 9(45) | 14(74) |
| Clearly define areas of responsibility and authority for clinical and clinical research managers and staff | 12(55) | 11(55) | 13(69) |
| Promote team building to solve problems that arise in research projects | 13(59) | 13(65) | 15(79) |
| Promote communications among investigators, clinical staff, IT staff and study staff | 16(73) | 11(55) | 15(79) |
| Clinical investigators in my organization | |||
| Have a sense of personal responsibility for improving research processes and tools | 18(82) | 16(80) | 12(63) |
| Cooperate to maintain and improve effectiveness of research processes and tools | 18(82) | 14(70) | 14(74) |
| Are willing to innovate and/or experiment to improve research processes and tools | 21(95) | 18(90) | 15(79) |
| Are receptive to change in research processes and tools | 18(92) | 19(95) | 12(63) |
| Research staff members such as study coordinators, research nurses and study assistants in my organization | |||
| Have a sense of personal responsibility for improving research processes and tools | 20(91) | 17(85) | 17(89) |
| Cooperate to maintain and improve effectiveness of research processes and tools | 20(91) | 18(90) | 17(89) |
| Are willing to innovate and/or experiment to improve research processes and tools | 19(86) | 17(85) | 16(84) |
| Are receptive to change in research processes and tools | 19(86) | 17(85) | 15(79) |
| IT staff members in my organization | |||
| Have a sense of personal responsibility for improving research processes and tools | 11(50) | 8(40) | 15(79) |
| Cooperate to maintain and improve effectiveness of research processes and tools | 14(64) | 9(45) | 15(79) |
| Are willing to innovate and/or experiment to improve research processes and tools | 12(55) | 9(45) | 15(79) |
| Are receptive to change in research processes and tools | 10(45) | 10(50) | 15(79) |
CRC, clinical research coordinator; CRIO, chief research information officer; PI, principal investigator.
Cell values are n (%).
Using this study’s results, we identified four criteria for eSource study site selection: (1) the site must have experience in using its FHIR® server to exchange clinical or research data; (2) site clinical research personnel must have regular interactions with the health information technology group that supports their EHR; (3) the research site (e.g., clinic or hospital) must be able to use patient EHR data for research; and (4) the research site must have executive support for the clinical research mission. Sites not meeting these criteria may experience difficulties participating in eSource studies.
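When sponsors triage many candidate sites, these four criteria can be applied as a simple screen. The sketch below is one illustrative encoding; the field names and the all-or-nothing rule are assumptions made for illustration rather than part of the survey instrument.

```python
from dataclasses import dataclass

@dataclass
class SiteReadiness:
    # Field names are illustrative; they paraphrase the four criteria above.
    fhir_exchange_experience: bool       # (1) has used its FHIR server to exchange data
    research_it_relationship: bool       # (2) research staff interact regularly with EHR IT
    ehr_data_usable_for_research: bool   # (3) patient EHR data may be used for research
    executive_research_support: bool     # (4) leadership supports the research mission

def meets_esource_criteria(site: SiteReadiness) -> bool:
    """All-criteria screen; sites failing any criterion may struggle with eSource."""
    return all([
        site.fhir_exchange_experience,
        site.research_it_relationship,
        site.ehr_data_usable_for_research,
        site.executive_research_support,
    ])

# Example: a site with FHIR experience but no regular research IT relationship.
candidate = SiteReadiness(True, False, True, True)
print(meets_esource_criteria(candidate))  # False -> flag for follow-up before selection
```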
4. Discussion
The typical eSource site readiness survey respondent was an experienced researcher with 10 or more years total experience in their role (7 years or more experience at their institution), and most had participated in PTN studies within the previous 12 months. Although there was variability among CRC and PI respondents with regard to preferences for eSource automation, the consensus was that first priority should be given to medication administration, medication order sets, laboratory, and vital signs data types. Despite CRCs spending more time manually collecting other data types (e.g., adverse events), they recognized that these data might not be easily automated and thus did not rate them as being of high priority for eSource automation.
While most respondents were willing to participate in an eSource study, a number of factors might prevent sites from being selected for this type of study. First, only 44% of sites had EHRs that were FHIR® enabled, and only 22% used FHIR to exchange patient data with other institutions. Additionally, 44% of sites had policies that would require software that requested EHR data from a FHIR® server to be installed behind their institution’s firewall. Since many eSource solutions are cloud-based, this requirement would be problematic. Most respondents (>75%) said that their institutions used EHR research functions. However, there were significant differences between the EHR research functions CRIOs said were available and those that CRCs actually used. Hence, a site research team’s experience with EHR research functions may be a more important indicator of site readiness than the mere availability of these technologies. Our study also identified several organizational characteristics that might inhibit the uptake of eSource solutions at sites. First, there was a clear difference between organizations that focused only on patient care and those that actively supported clinical research. Organizations whose senior leaders did not see clinical research as a major component of their mission would not be good candidates for eSource studies. Similarly, there were differences between organizations that prioritized research and created separate research IT organizations and those that saw research IT as a secondary priority behind patient care and business information systems support.
Most sites in our study are not currently able to participate in eSource studies. There are a variety of reasons for this situation, including information technology immaturity, organizational priorities and policies, and current research organization and staffing. Clearly, the situation at many of these organizations is fluid, and more sites likely will become eSource study candidates over time. For the present, however, eSource studies will need to include non-eSource sites. This adds a level of complexity that will have to be balanced against the benefits that eSource technologies provide. Sponsors and sites need to understand these benefits better to determine whether eSource is appropriate for specific studies and sites [17,18]. For the present, eSource likely will have its greatest use in large late-phase clinical trials and registries.
Previous clinical research site questionnaires have sought to determine the feasibility of a direct EHR-to-EDC data transfer or to evaluate the suitability of a site’s computer systems for providing clinical research data [19,20]. While these instruments may provide useful information, they do not address clinical site readiness to participate in an eSource clinical study, and the information they provide addresses only certain facets of what is a multi-dimensional problem. Specifically, eSource clinical site readiness assessment is a socio-technical problem that requires knowledge of a clinical research site’s information systems and their uses, its policies and practices, and its research personnel’s training and experience.
4.1. Limitations
Our study included respondents from a single pediatric research network that mainly conducts pharmacokinetic/pharmacodynamic studies. While we believe our results may be generalizable to other settings, we recognize that there likely are differences between studies conducted in other therapeutic areas and in different patient populations. Additionally, some of this study’s key results reflect respondent impressions of their organization’s commitment to research and of its organization and staffing for research. Future versions of the site readiness survey should seek to quantify these impressions. Lastly, the environment for eSource-based clinical studies is changing. More organizations are using FHIR® to share clinical data and may become more willing to entertain cloud-based software solutions. Thus, the number of sites that are able to participate in eSource-based clinical studies likely will increase in the near term.
5. Conclusions
Site readiness to participate in eSource studies is not merely a technical problem. While computer capabilities are important, organizational priorities, such as clinical care versus research, availability of a research IT group to support EHR eSource studies, and the site’s previous use of EHR clinical research functions are equally important considerations.
Acknowledgements
Pediatric Trials Network (PTN) Steering Committee Members: Daniel K. Benjamin Jr., Christoph Hornik, Kanecia Zimmerman, Cheryl Alderman, Zoe Sund, Phyllis Kennel, and Rose Beci, Duke Clinical Research Institute, Durham, NC; Chi Dang Hornik, Duke University Medical Center, Durham, NC; Gregory L. Kearns, Texas Christian University and UNTHSC School of Medicine, Fort Worth, TX; Matthew Laughon, The University of North Carolina at Chapel Hill, Chapel Hill, NC; Ian M. Paul, Penn State College of Medicine, Hershey, PA; Janice Sullivan, University of Louisville, Louisville, KY; Kelly Wade, Children’s Hospital of Philadelphia, Philadelphia, PA; Paula Delmore, Wichita Medical Research and Education Foundation, Wichita, KS.
The Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD)
PTN Publications Committee: Chaired by Thomas Green, Ann & Robert H. Lurie Children’s Hospital of Chicago, Chicago, IL.
Funding
This work was supported in part by the Duke University-Vanderbilt University Medical Center Trial Innovation Center (U24TR001608) and the Vanderbilt University Medical Center Recruitment Innovation Center (U24TR001579), part of the NCATS Trial Innovation Network, an innovative collaboration funded by the National Center for Advancing Translational Sciences (NCATS) to address critical roadblocks in clinical research and accelerate the translation of novel interventions into life-saving therapies. This study was also funded by the National Institute of Child Health and Human Development (HHSN275201800031). This work is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Role of the sponsor
The study sponsor played no role in study design; in the collection, analysis, and interpretation of data; in writing the report; or in submitting the article for publication.
Abbreviations
- CRCs
clinical research coordinators
- CRIOs
chief research information officers
- eCRF
electronic case report form
- EDC
electronic data capture
- EHR
electronic health record
- FHIR®
Fast Healthcare Interoperability Resources
- IT
information technology
- PIs
principal investigators
- PTN
Pediatric Trial Network
- REDCap
Research Electronic Data Capture
Footnotes
Declaration of interests
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Please see Acknowledgments for full listing of committee members.
Data availability
Data will be made available upon request.
References
- [1] DiMasi JA, Grabowski HG, Hansen RW, Innovation in the pharmaceutical industry: new estimates of R&D costs, J. Health Econ. 47 (2016) 20–33.
- [2] Moore TJ, Zhang H, Anderson G, Alexander GC, Estimating costs of pivotal trials for novel therapeutic agents approved by the US Food and Drug Administration, 2015–2016, JAMA Intern. Med. 178 (11) (2018) 1451–1457.
- [3] Paul SM, Mytelka DS, Dunwiddie CT, et al., How to improve R&D productivity: the pharmaceutical industry’s grand challenge, Nat. Rev. Drug Discov. 9 (3) (2010) 203–214.
- [4] Wouters OJ, McKee M, Luyten J, Estimated research and development investment needed to bring a new medicine to market, 2009–2018, JAMA 323 (9) (2020) 844–853.
- [5] Eisenstein EL, Collins R, Cracknell BS, et al., Sensible approaches for reducing clinical trial costs, Clin. Trials 5 (1) (2008) 75–84.
- [6] Moore TJ, Heyward J, Anderson G, Alexander GC, Variation in the estimated costs of pivotal clinical benefit trials supporting the US approval of new therapeutic agents, 2015–2017: a cross-sectional study, BMJ Open 10 (6) (2020) e038863.
- [7] Eisenstein EL, Lemons PW 2nd, Tardiff BE, Schulman KA, Jolly MK, Califf RM, Reducing the costs of phase III cardiovascular clinical trials, Am. Heart J. 149 (3) (2005) 482–488.
- [8] Sertkaya A, Wong HH, Jessup A, Beleche T, Key cost drivers of pharmaceutical clinical trials in the United States, Clin. Trials 13 (2) (2016) 117–126.
- [9] James S, Rao SV, Granger CB, Registry-based randomized clinical trials - a new clinical trial paradigm, Nat. Rev. Cardiol. 12 (5) (2015) 312–316.
- [10] Kellar E, Bornstein S, Caban A, et al., Optimizing the use of electronic data sources in clinical trials: the technology landscape, Ther. Innov. Regul. Sci. 51 (5) (2017) 551–567.
- [11] Lauer MS, D’Agostino RB Sr., The randomized registry trial - the next disruptive technology in clinical research?, N. Engl. J. Med. 369 (2013) 1579–1581.
- [12] Parab AA, Mehta P, Vattikola A, et al., Accelerating the adoption of eSource in clinical research: a TransCelerate point of view, Ther. Innov. Regul. Sci. 54 (5) (2020) 1141–1151.
- [13] Garza M, Myneni S, Nordo A, et al., eSource for standardized health information exchange in clinical research: a systematic review, Stud. Health Technol. Inform. 257 (2019) 115–124.
- [14] Garza M, Myneni S, Fenton SH, Zozus MN, eSource for standardized health information exchange in clinical research: a systematic review of progress in the last year, JSCDM 1 (2) (2021), doi: 10.47912/jscdm.66.
- [15] Helfrich CD, Li YF, Sharp ND, Sales AE, Organizational readiness to change assessment (ORCA): development of an instrument based on the Promoting Action on Research in Health Services (PARIHS) framework, Implement. Sci. 4 (2009) 38, doi: 10.1186/1748-5908-4-38.
- [16] Ezzy D, Qualitative Analysis, first ed., Routledge, London, 2002; eBook published 2013, doi: 10.4324/9781315015484.
- [17] Eisenstein EL, Garza MY, Rocca M, et al., eSource-enabled vs. traditional clinical trial data collection methods: a site-level economic analysis, Stud. Health Technol. Inform. 270 (2020) 961–965.
- [18] Marquis-Gravel G, Roe MT, Turakhia MP, et al., Technology-enabled clinical trials: transforming medical evidence generation, Circulation 140 (17) (2019) 1426–1436.
- [19] Griffon N, Pereira H, Djadi-Prat J, et al., Performances of a solution to semiautomatically fill eCRF with data from the electronic health record: protocol for a prospective individual participant data meta-analysis, Stud. Health Technol. Inform. 270 (2020) 367–371.
- [20] eClinical Forum eSRA Team, Investigator Site eSource-Readiness Assessment: a tool for common assessment across sites and sponsors, V. 2021.2, June 22, 2021.
