Author manuscript; available in PMC: 2021 Apr 8.
Published in final edited form as: Data Basics. 2019 Summer;25(2):16–22.

Case Study: Electronic Data Capture System Validation at an Academic Institution

Dinesh Pal Mudaranthakam 1,2, Ron Krebill 1, Ravi D Singh 3, Cathy Price 1, Jeffrey Thompson 1,2, Byron Gajewski 1,2, Devin Koestler 1,2, Matthew S Mayo 1,2
PMCID: PMC8032204  NIHMSID: NIHMS1648975  PMID: 33842930

Introduction:

There has been a great amount of innovation in research informatics since the transition from paper records to digital formats. The impetus for such initiatives was to increase the efficacy, reliability, and portability of research data. In the first generation of innovation, data were transcribed into commercially available databases or spreadsheets that provided the capacity to construct a harmonized data table. While this methodology has rudimentary utility, its adaptability is significantly limited: each study has unique design and operational characteristics that may not fit a pre-existing database or spreadsheet data architecture. Furthermore, clinical research data carry other important considerations, such as regulatory compliance, that these initial approaches cannot address.

Electronic data capture (EDC) was a progeny of the federal Paperwork Reduction Acts of 1980 and 1995. In a broad sense, both acts were road maps for standardizing information collection and optimizing information storage structures that could be shared among federal departments to support the functions of the federal government. By October 2003, federal agencies could warehouse and maintain digital transactions from individuals or entities. Outside the federal government, however, most academic and private service entities were slower to adopt EDC systems, in part because lack of human and financial capital resources, changing regulatory policy, and ongoing technological change were significant barriers to EDC implementation.

Electronic systems are an efficient platform for capturing and warehousing data. However, it is also important that users of the system and regulatory bodies can trust the data in these systems. A properly validated EDC system provides the financial study sponsor and regulatory bodies a level of confidence that the data are trustworthy at each functional level: data capture, warehouse management, and data export. The validation process ensures a robust evaluation of transcription from the user interface to the database and from the database to an exported external file; a minimal sketch of such a round-trip check appears after the checklist below. Furthermore, for research data that must comply with United States Food and Drug Administration (FDA) guidelines, the FDA has published system validation guidance for each component of the system. These requirements are codified in 21 Code of Federal Regulations Part 11 [1],[2] (21 CFR Part 11). 21 CFR Part 11 comprises the following subparts, which constitute the bare-minimum checklist prescribed by the FDA:

  • Subpart A – General Provisions

    • Scope

    • Implementation

    • Definitions

  • Subpart B – Electronic Records

    • Controls for closed systems

    • Controls for open systems

    • Signature manifestations

    • Signature/record linking

  • Subpart C – Electronic Signatures

    • General requirements

    • Electronic signatures and controls

    • Controls for identification codes/passwords

For the full text of 21 CFR Part 11, refer to https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=11
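As an illustration of the capture-to-export evaluation described above, here is a minimal sketch of a round-trip check: a value entered through the capture layer is read back from storage and again from an export file, and all three must agree. The schema, field names, and helper function are hypothetical assumptions for illustration, not part of eResearch.

```python
# Hypothetical round-trip check: capture -> storage -> export must agree.
import csv
import sqlite3
import tempfile

def round_trip_check(field: str, entered_value: str) -> bool:
    """Verify a captured value survives storage and export unchanged."""
    with sqlite3.connect(":memory:") as db:
        db.execute("CREATE TABLE crf (field TEXT, value TEXT)")
        # 1. The capture layer writes the entered value to the database.
        db.execute("INSERT INTO crf VALUES (?, ?)", (field, entered_value))
        # 2. Read the value back from storage.
        stored = db.execute(
            "SELECT value FROM crf WHERE field = ?", (field,)
        ).fetchone()[0]
        # 3. Export to a flat file, then re-read the exported value.
        with tempfile.NamedTemporaryFile("w+", newline="",
                                         suffix=".csv") as f:
            csv.writer(f).writerow([field, stored])
            f.seek(0)
            exported = next(csv.reader(f))[1]
    return entered_value == stored == exported

assert round_trip_check("systolic_bp", "128")
```

A real OQ script would exercise the production capture, database, and export interfaces rather than an in-memory stand-in, but the pass criterion is the same: no silent alteration at any hop.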

One of the critical components of 21 CFR Part 11 is the implementation of electronic signatures. The FDA provides guidance for the validation and implementation of electronic signatures. In general, an electronic signature must be constructed from at least two unique identification components: an identification code and a user password, both directly associated with a unique system user. When a system complies with 21 CFR Part 11, electronic signatures are considered legally binding and the equivalent of a person's handwritten signature [3].
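As an illustration, the sketch below shows one plausible way to bind the two required components, an identification code and a password, into a signature record using a salted hash so the password itself is never stored. This is an assumption-laden sketch, not the eResearch vendor's implementation.

```python
# Hypothetical two-component electronic signature record (21 CFR Part 11
# requires at least an identification code plus a password); illustrative
# sketch only, not the vendor's actual mechanism.
import hashlib
import hmac
import os
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ESignature:
    user_id: str      # unique identification code
    signed_at: str    # timestamp, retained for the audit trail
    salt: bytes
    digest: bytes     # salted hash derived from user_id + password

def sign(user_id: str, password: str) -> ESignature:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac(
        "sha256", f"{user_id}:{password}".encode(), salt, 100_000)
    return ESignature(user_id, datetime.now(timezone.utc).isoformat(),
                      salt, digest)

def verify(sig: ESignature, user_id: str, password: str) -> bool:
    candidate = hashlib.pbkdf2_hmac(
        "sha256", f"{user_id}:{password}".encode(), sig.salt, 100_000)
    return hmac.compare_digest(candidate, sig.digest)

record = sign("jdoe", "correct horse battery staple")
assert verify(record, "jdoe", "correct horse battery staple")
assert not verify(record, "jdoe", "wrong password")
```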

In 2012, The University of Kansas Cancer Center and the Department of Biostatistics jointly adopted a commercial electronic data capture system known as eResearch. The EDC system allows researchers to enter research data directly, which reduces the issues associated with capturing research data on paper. The goal of EDC implementation was to gradually expand the services of the Biostatistics and Informatics Shared Resource (BISR) to highly regulated FDA-governed and pharmacological treatment trials. The groundwork for 21 CFR Part 11 certification began in 2016 and was completed in April 2018. The purpose of this paper is to provide an overview of achieving system validation (21 CFR Part 11 compliance) for an electronic data capture system at an academic institution.

Materials and Methods:

To evaluate our EDC system’s capacity to comply with the requirements of 21 CFR Part 11, the BISR contracted an independent IT compliance consultant to conduct a gap analysis. The gap analysis identified current EDC capabilities that aligned with the requirements of 21 CFR Part 11, as well as compliance deficits. This three-day process included the majority of the stakeholders who interacted with or supported the system, including data managers, information security staff, system engineers, application administrators, quality assurance specialists, and regulatory experts. The external team of experts interviewed all stakeholders and summarized what the system needed to become 21 CFR Part 11 compliant. A compliant system is expected to perform according to the defined user and functional requirements, be secure from unauthorized or accidental change, and accurately record authorized changes while maintaining an audit trail of user actions. The summary report of the gap analysis served as a template for the Validation/Evaluation Plan once BISR decided to move forward with achieving compliance.
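To make the audit trail requirement concrete, the following is a minimal sketch assuming a hash-chained, append-only log, so any retroactive edit breaks verification. The field names and chaining scheme are illustrative assumptions, not the eResearch implementation.

```python
# Minimal sketch of an append-only audit trail for record changes;
# field names are illustrative, not the eResearch schema.
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only log; each entry is hash-chained to the previous one."""
    def __init__(self):
        self.entries = []

    def record(self, user, record_id, field, old, new):
        prev_hash = self.entries[-1]["hash"] if self.entries else ""
        entry = {
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "record_id": record_id,
            "field": field,
            "old_value": old,
            "new_value": new,
            "prev_hash": prev_hash,
        }
        # Hash is computed over the entry body, then stored alongside it.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute every hash; any tampering breaks the chain."""
        prev = ""
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if body["prev_hash"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

trail = AuditTrail()
trail.record("jdoe", "PT-001", "weight_kg", "71", "72")
assert trail.verify()
```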

System Validation:

System validation is a set of actions used to check the compliance of any electronic system element with its purpose and functions. These actions are planned and carried out throughout the life cycle of the system. In this case, validation was performed after the system had been successfully deployed at the University of Kansas Medical Center [4]. BISR was responsible for the validation since it both owns and manages the system [1]. An annual review is conducted by the external consulting team to verify that system upgrades and any other changes were performed according to the standard operating procedures.

Method:

An evaluation plan was developed after the eResearch system was deemed a system requiring evaluation. Evaluation ensured that the system performs according to the defined user and functional requirements, is secure from unauthorized or accidental change, and accurately records authorized changes while keeping a compliant audit trail. The resources and responsibilities for this effort to verify that eResearch complies with all requirements of 21 CFR Part 11 are summarized below (see Table 01) [5].

Table 01:

Summary of Evaluation Package Resources & Responsibilities

Role and Responsibility

PS: Package Sponsor; QC: Quality Control
 • Provides consulting and expert knowledge on the compliance process, as well as GCP training.
 • Trains users on how to set up the testing environment and monitor testing conditions.

PM: Package Manager
 • Identifies and leads a Package Team.
 • Approves the Test Plan and the Test Summary Report, and manages the testing environment.
 • Drives the Package process and identifies ad-hoc members as needed.
 • Drives item preparation, establishes and manages the package archive, and checks the quality of documents in production for their ability to pass audits.

TC: Test Coordinator
 • Authors the Test Plan and other test documentation, including the Test Summary Report.
 • Identifies and trains testers in formal testing practices.
 • Manages the formal testing process.

T: Testers
 • Execute test scripts in formal testing.

SUP: Supplier(s) of products, services, platforms, or consulting that support the system
 • Perform assigned tasks as specified in the contractual agreement, in accordance with standards and guidelines.
 • Provide documented evidence for their contribution to the Evaluation package.

Through the validation process, the following documents were developed and the testing was performed (see Figure 01) [8].

Figure 01:

System Validation Lifecycle

Validation Plan (Lifecycle: Plan): The validation plan describes the scope of the project, including which modules within the eResearch system were in scope and which were out of scope because stakeholders were not actively using them at the time; the order of activities; and the individuals responsible for planning, execution, testing, and approval.

Validation Risk Assessment (Lifecycle: Plan & Risk analysis): This document identified risks and, for each risk found, the actions taken to mitigate it.

User Requirement Specification (URS) (Lifecycle: Specify): URS addressed the technical controls, procedural controls, capacities, accuracy, security, fault tolerance, physical environment, and training requirements.

Functional Requirement Specification (FRS) (Lifecycle: Specify): The FRS addressed the functional and procedural controls of the system, how it operated, and the expected functionality.

Test Plan (Lifecycle: Plan): The test plan addresses all testable user and functional requirements, along with the best practices [6] to be followed during the execution of each test.

Installation Qualification (Lifecycle: Verify): The IQ package validates the system after the software has been successfully installed. In our case, we inherited this package from our vendor, Velos, as they were responsible for installing the application on premises.

Operational Qualification (Lifecycle: Verify): The OQ package certifies that the system is operational for regular business as expected. Both IQ and OQ involve the execution of a defined set of tests using test scripts that contain the instructions, expected results, and acceptance criteria, along with a section to record the results of each script's execution.
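As an illustration of that structure, the sketch below models a test script as a simple record with the same parts: instructions, expected result, acceptance criteria, and a section for the outcome. The IDs, steps, and field names are hypothetical, not the actual KUMC scripts.

```python
# Illustrative structure for an OQ/IQ test script; all IDs, steps, and
# field names are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestScript:
    script_id: str
    requirement_id: str            # URS/FRS item this script verifies
    instructions: List[str]        # steps the tester follows in order
    expected_result: str
    acceptance_criteria: str
    actual_result: str = ""        # recorded during execution
    passed: Optional[bool] = None  # filled in by the tester
    tester: str = ""

oq_001 = TestScript(
    script_id="OQ-001",
    requirement_id="URS-12",
    instructions=[
        "Log in as a user without access to the study.",
        "Attempt to open study KU-2018-01.",
    ],
    expected_result="Access is denied and the attempt is logged.",
    acceptance_criteria="No study data are displayed; an audit entry exists.",
)

# During formal testing, the tester records the observed outcome:
oq_001.actual_result = "Access denied; audit entry present."
oq_001.passed = True
oq_001.tester = "tester01"
```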

Master Traceability Matrix (Lifecycle: Verify and Report): The Master Traceability Matrix shows the relationship between each user requirement and its corresponding test script(s), confirming that every URS and FRS item was successfully tested.
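A minimal sketch of how such a matrix can be assembled and checked for coverage gaps follows; the requirement and script IDs are invented for illustration.

```python
# Minimal traceability matrix sketch: every URS/FRS item must map to at
# least one test script; IDs below are hypothetical.
requirements = ["URS-01", "URS-02", "FRS-01", "FRS-02"]
test_scripts = {              # script id -> requirements it covers
    "OQ-001": ["URS-01"],
    "OQ-002": ["URS-02", "FRS-01"],
}

# Invert the mapping: requirement -> list of scripts that cover it.
matrix = {req: [sid for sid, reqs in test_scripts.items() if req in reqs]
          for req in requirements}
untested = [req for req, scripts in matrix.items() if not scripts]

for req, scripts in matrix.items():
    print(f"{req}: {', '.join(scripts) or 'NOT COVERED'}")

# A gap like FRS-02 here would have to be closed (or the requirement
# descoped) before the Validation Summary Report could declare the
# system qualified.
assert untested == ["FRS-02"]
```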

Validation Summary Report (Lifecycle: Report): The Validation Summary Report summarizes the results of the software validation project, including a summary of the plan's execution and the decision as to whether the system qualified.

Results:

As a requirement of the validation exercise at KUMC, the BISR team successfully developed the required Standard Operating Procedures (SOPs) and the test scripts that tested every URS and FRS item (Table 02) [9].

Table 02:

List of Standard Operating Procedures and Testing modules

SOPs developed:
 • Procedure for Controlled Documents Management
 • Service and Repair
 • Server Back Up/Monitoring
 • System End User Training
 • Good Data Management Practices
 • Vendor Access
 • Change Management
 • Document Standards
 • Controlling End-User Access and Access Request Form
 • Resolving Patient Record Duplication
 • Server SSL Cipher Strength Levels and Security Analysis
 • Calendar Creation
 • Electronic Case Report Development
 • Procedure for Incident Management
 • Disaster Recovery
 • Adding a New Study

URS Test Matrix:
 • User/Access Management
 • Study Management
 • Study Team Management
 • Patient Management
 • Patient Schedule Management
 • Patient Adverse Event
 • Patient Form Library
 • Reports
 • Quality Checks
 • Help and Library
 • Ad-Hoc Query Reporting
 • Notification and Milestones

FRS Test Matrix:
 • Users/Access
 • Customization (code developed for an internal purpose)
 • Randomization
 • System Features
 • Study Access Controls
 • Patient Status Controls
 • Custom Reports
 • Audit Reports

Additionally, the team performed and documented a complete disaster recovery test to validate the backup and recovery capabilities. The team was also trained in Good Clinical Practice as well as in the importance of following all the controlled procedures to maintain the system in a validated state.
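One common step in such a test is verifying that restored files are byte-identical to the originals. The sketch below shows a checksum comparison of that kind; the paths are hypothetical, and the paper does not detail the exact procedure used, so this is an illustrative assumption.

```python
# Hypothetical backup/restore verification step: compare SHA-256 checksums
# of source files against their restored copies.
import hashlib
from pathlib import Path
from typing import List

def sha256sum(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source_dir: str, restored_dir: str) -> List[Path]:
    """Return relative paths whose restored copy is missing or differs."""
    src, dst = Path(source_dir), Path(restored_dir)
    mismatches = []
    for f in src.rglob("*"):
        if f.is_file():
            rel = f.relative_to(src)
            target = dst / rel
            if not target.exists() or sha256sum(f) != sha256sum(target):
                mismatches.append(rel)
    return mismatches

# Example (hypothetical paths): an empty result means a clean restore.
# assert verify_restore("/data/eresearch", "/restore/eresearch") == []
```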

Table 03 summarizes the time and cost investment required to get the system certified [7].

Table 03:

Time and Cost analysis for system validation

Activity | Time | Cost
Gap Analysis | 30 days | ~$8,000
Develop and review Standard Operating Procedures (SOPs) | 60 days (2 full-time employees spending a couple of hours every day; consultant review) | ~$19,000
Develop Test Scripts | 20 days (1 full-time employee spending a couple of hours every day; consultant review) | ~$3,000
Execution of Test Scripts | 20 days (included multiple stakeholders such as the regulatory team, study coordinators, and administrators) | ~$3,000
Training | 1 day (in-person training for 3 hours, covering basic and advanced Part 11 training) | ~$1,500
External Consultant | 150 days (worked with the BISR team at every step) | ~$26,200
Grand Total | ~150 days to develop and implement required elements for system validation and certification | ~$60,700

Note: Table 03 is a rough estimate of how much a one-time 21 CFR Part 11 electronic data capture system validation would cost at a research institute. This cost could vary across institutes depending upon team size, skills, and network infrastructure. The table does not include the annual review or the additional day-to-day cost of complying with processes that might be added to achieve system compliance.

Conclusion:

The overall validation process yielded benefits as well as challenges. The greatest challenge was maintaining momentum to keep the system validation project on schedule. Not only did the process reinforce the importance of having these controls in place, but it also introduced efficiencies in our day-to-day process for managing and maintaining the electronic data capture system. The process also encouraged the team to be more flexible in covering cross-functional responsibilities as the need arose to take on additional tasks. Following the specified procedures also ensured that end users were generating reliable data that could be used to advance their research with confidence.

Along with the benefits, there were a few challenges the team had to overcome to complete the system validation process promptly. These included identifying a sponsor for the system validation, and receiving timely stakeholder input and active involvement was sometimes difficult due to conflicting schedules.

Acknowledgments/Funding

The research reported in this publication was supported by the National Cancer Institute Cancer Center Support Grant P30 CA168524 and used the BISR core.

Research Support:

The Validation was supported by the National Cancer Institute (NCI) Cancer Center Support Grant P30 CA168524 and used the Biostatistics and Informatics Shared Resource (BISR).

References:
