Author manuscript; available in PMC: 2023 Apr 1.
Published in final edited form as: J Patient Saf. 2022 Apr 1;18(3):e672–e679. doi: 10.1097/PTS.0000000000000911

Challenges and Barriers to Adverse Event Reporting in Clinical Trials: A Children’s Oncology Group Report

Tamara P Miller 1,2, Melissa Zeilner Marx 3, Christopher Henchen 4, Nicholas P DeGroote 1, Sally Jones 5, Jenny Weiland 6, Beth Fisher 1, Adam J Esbenshade 7, Richard Aplenc 8,9, Christopher C Dvorak 10, Brian T Fisher 9,11
PMCID: PMC8940729  NIHMSID: NIHMS1728460  PMID: 34570002

Abstract

Objective:

Adverse event (AE) reporting is crucial for determining the safety of clinical trials. AEs are captured manually by clinical research associates (CRAs) and research nurses (RNs), and prior studies have shown underreporting. Understanding AE reporting training, processes, and institution-level differences is necessary to improve AE capture.

Methods:

A 26-item questionnaire regarding AE reporting training, identification, tracking, and challenges was distributed to all Children’s Oncology Group (COG) CRAs and RNs from February 15 to March 11, 2019; because COG rosters do not identify who performs AE reporting, recipients were surveyed regardless of whether they report AEs. Results were tabulated. Institutions were grouped by self-reported full-time equivalents and compared using chi-square tests.

Results:

Of 1315 CRAs and 2703 RNs surveyed, 509 (12.7%) responded. Of those, 369 (64.9%) individuals, representing 71.8% of COG institutions, report AEs. Only data from respondents who report AEs were analyzed. There was a range in AE training; COG training modules were most common (79.7%). There was wide variability in AE ascertainment; only 51.2% use standardized approaches at their site. There was no standard AE tracking method; larger sites more commonly use spreadsheets (p=0.002) and smaller sites more commonly use paper (p=0.028). The greatest AE reporting challenges were differences in requirements between protocols (70%) and difficulty matching AE definitions with clinical documentation (53%). At least half of respondents endorsed six of 13 proposed tools for improving reporting, including online AE reporting modules (75.3%), tip sheets for interpreting CTCAE definitions (67.5%), and standardized AE tracking forms (66.9%). Only half of respondents reported that all colleagues at their site followed the same AE reporting practices, and there was no dominant AE tracking approach across respondents.

Discussion:

There is wide variability in AE reporting training and practices. Numerous challenges exist, including differences between trials, difficulty interpreting AE definitions, and difficulty engaging clinicians.

Conclusion:

Respondents are eager for additional central resources. These results provide a roadmap for areas of potential improvement.

Introduction

Advances in cancer therapies have markedly improved cancer survival rates in adults and children.1, 2 While life-saving, cancer therapy regimens can have toxicities leading to significant morbidity and even mortality. To fully understand the efficacy of a treatment regimen, safety and tolerability outcomes need to be considered alongside primary outcomes such as remission rates. On cooperative group clinical trials, the safety and tolerability of a chemotherapy regimen are defined by the documentation of adverse events (AEs), which trials are mandated to report. On most trials, AEs are captured through manual identification and reporting. This process is typically performed by clinical research associates (CRAs) and research nurses, who review the medical record and identify and grade AEs that meet the definitions established in the National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE).3

The most recent version of CTCAE, version 5 (v5), lists more than 800 AEs and includes a definition and grading schema for each AE (https://ctep.cancer.gov/protocoldevelopment/electronic_applications/ctc.htm).3 For example, in CTCAE v5, sepsis is defined as “a disorder characterized by the presence of pathogenic microorganisms in the blood stream that cause a rapidly progressing systemic reaction that may lead to shock” and grade 3 sepsis is described as “blood culture positive with signs or symptoms; treatment indicated.”3 In an effort to reduce the burden of AE reporting, the protocol of each pediatric oncology clinical trial often identifies the AEs and AE grades that are required (e.g. sepsis, febrile neutropenia, mucositis, neuropathy, and laboratory-based AEs such as hyperkalemia) for that trial.3-5

Prior publications have demonstrated evidence of underreporting using the current AE reporting methods.5-7 This underreporting means that clinicians and patients do not have an accurate understanding of the risk of AEs for patients participating in these clinical trials or receiving chemotherapy regimens similar to those assessed on the trials. This vulnerability to underreporting is likely multifactorial. One important potential source of underreporting is differential understanding of AE reporting procedures among CRAs and research nurses across institutions. This variation in understanding may arise from different requirements for AE capture between trial protocols,8 frequently changing CTCAE definitions due to periodic publication of new versions,9 and variability in center-level training and in the methods CRAs and research nurses use to capture AEs. For example, CTCAE v4 had no option for bacteremia and CRAs and research nurses were guided to use “infections and infestations, other” to report it; a bacteremia term was added in CTCAE v5.3, 10 This has led to confusion regarding reporting of bacteremia even after CTCAE v5 was published. However, there are limited publications documenting variation in understanding of AE reporting procedures. Specifically, there has been no assessment of the approaches centers use for training CRAs and research nurses on AE reporting, or of how CRAs and research nurses differ in their processes for actual AE capture.

Understanding variation in AE training by center and in the AE capture processes used will enable identification of opportunities for improvement and increase comprehensive capture of AEs on clinical trials. The objective of this project was to describe current AE training and reporting practices across pediatric centers belonging to the Children’s Oncology Group (COG), a clinical trials consortium for pediatric cancer. A questionnaire was developed and distributed to CRAs and research nurses performing AE reporting at COG centers. COG maintains only rosters of all CRAs and nurses, regardless of their activities; therefore, there was no way to directly survey the CRAs and nurses who specifically perform AE tasks. As such, all members of those rosters were surveyed and asked at the start of the survey whether they report AEs. Only responses from those who report AEs were analyzed. The goal of the questionnaire was to identify variability in, and potential areas to improve, manual reporting of AEs.

Methods

Cohort

The project cohort consisted of all CRAs and research nurses present on either the COG CRA or nursing committee rosters. As these rosters do not distinguish who reports AEs, the questionnaire was distributed to all individuals on the rosters. Using the full rosters was the only way to reach the individuals who report AEs, even though a large proportion of those on the lists were known not to be involved in AE reporting.

Questionnaire Development

The COG Toxicity Task Force, CRA Steering Committee, and Nursing Committee developed a 26-item questionnaire intended to capture center-level AE training and reporting practices as well as individual CRA or research nurse details (Supplemental Figure 1). The COG CRA roster includes all CRAs affiliated with COG, regardless of whether their role includes AE reporting. The COG nursing roster includes all nurses, and the vast majority of nurses are not involved in AE reporting. There is no list of who reports AEs within COG, but it is known that at each site either CRAs or research nurses perform this task. Therefore, it was anticipated that a large fraction of the individuals surveyed would not respond because they do not report AEs. To delineate this within the questionnaire, an initial question asked whether the respondent reports AEs; if the answer was no, the survey ended with capture of only COG institution, discipline (CRA, nursing), and background education (CRA, nursing, other). Per COG recommendation, identifying the institution was not required to complete the questionnaire. For those who report AEs, the questionnaire assessed the following: processes for learning about and identifying AEs, frequency of medical record review for AEs, average time spent per patient and per day reporting AEs, systems for tracking AEs, training received on AE reporting, and institutional protocols for AE identification, tracking, and training. Respondents ranked their top five greatest AE reporting challenges from a list of twelve options, one of which was “other” with a free text field. The study team created a list of 13 potential tools to improve AE reporting, and respondents identified which would be beneficial. For categorization purposes, data were captured on the types of trials the respondent follows (multi-select response including COG-pharmaceutical trials, Phase I, Phase II, Phase III, Survivorship/Long-term follow-up, Biology studies, and Other), years of experience (<1, 1-2, 3-5, 5-10, >10 years), and institution size based on self-reported estimated full-time equivalent (FTE) CRAs or research nurses who work with COG trials (1, 2-5, 5-10, 10-20, >20).

Questionnaire Distribution

The questionnaire was built in a Research Electronic Data Capture (REDCap™) survey form. A link to the REDCap™ survey was distributed via email by COG communications to all members of the CRA and nursing rosters on February 15, 2019. Reminder emails were sent weekly. The questionnaire closed on March 11, 2019.

Data Cleaning and Analyses

Data were downloaded from REDCap™ as Excel 2016® csv files, which were imported into STATA (StataCorp, College Station, TX) and SAS (SAS Institute, Cary, NC) for cleaning and analyses. Data cleaning primarily consisted of mapping each “other” response to a provided option, when appropriate, by consensus opinion of two authors. Incomplete surveys were included in analyses for the questions that were answered. Distributions of respondent characteristics and questionnaire responses were tabulated for each question. Because institution name was not a required question and was therefore not available for all respondents, reported FTE was used as a proxy for institution size. Institutions were grouped by CRA or research nurse FTE size as 1-5 or greater than 5 (5+) for comparisons of survey responses using chi-square tests. Respondents who did not answer the FTE question or answered “I don’t know” were excluded from analyses based on FTE size. The question regarding whether all CRAs or research nurses use the same tracking system included an option of “I am the only CRA or research nurse.” Responses to this question and the FTE question were compared; if the two responses did not consistently indicate that there was only one CRA or research nurse, the responses were excluded due to inconsistency.

A two-sided p-value of less than 0.05 was considered statistically significant. Analyses were performed using STATA or SAS Enterprise. The project was deemed exempt by the Children’s Healthcare of Atlanta Institutional Review Board because it was classified as process improvement.
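As a concrete illustration of the site-size comparisons described above, the following minimal sketch runs a chi-square test on a 2x2 table built from counts reported in Table 3 (use of an Excel spreadsheet per patient at 1-5 FTE versus 5+ FTE sites). This is not the authors' code: the analyses were performed in STATA and SAS, the handling of missing responses and continuity correction is not specified in the text, and the Python/SciPy implementation here is an assumption for illustration only.

# Minimal sketch only; the authors report using STATA and SAS, not Python.
# Counts are taken from Table 3 ("Excel Spreadsheet for each patient"):
# 90 of 249 respondents at 1-5 FTE sites vs. 57 of 104 at 5+ FTE sites.
from scipy.stats import chi2_contingency

small_sites = {"uses_excel_per_patient": 90, "total": 249}  # 1-5 FTE sites
large_sites = {"uses_excel_per_patient": 57, "total": 104}  # 5+ FTE sites

# 2x2 contingency table: rows = tracking method used (yes/no),
# columns = site size group (1-5 FTE, 5+ FTE).
table = [
    [small_sites["uses_excel_per_patient"], large_sites["uses_excel_per_patient"]],
    [small_sites["total"] - small_sites["uses_excel_per_patient"],
     large_sites["total"] - large_sites["uses_excel_per_patient"]],
]

chi2, p, dof, expected = chi2_contingency(table)  # Yates correction applied by default for 2x2 tables
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# Prints a p-value of roughly 0.002, in line with Table 3, although the
# authors' exact statistical options are not described in the text.

Under these assumptions the sketch reproduces a p-value close to the 0.002 reported in Table 3; the same construction applies to the other row-by-site-size comparisons in Tables 2 and 3.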

Results

Respondents and Institution Size

The questionnaire was distributed to the COG rosters, which include 1315 CRAs and 2703 nurses at 227 distinct institutions. Responses were received from 509 respondents (12.7% overall; 27.1% of CRAs surveyed, 7.8% of nurses surveyed) representing 173 (76.2%) COG member institutions. Of those who responded, 369 (72.8%) confirmed that they perform the task of reporting AEs (Figure 1). The 369 respondents who perform AE reporting included 274 CRAs, 33 nurses, 61 who reported being both a CRA and a nurse, and 1 whose discipline was unknown. Of the CRAs who responded, 92.6% report AEs, while only 33 (21.8%) of the nurses who responded report AEs (Figure 1). Overall, there were no differences in responses between CRAs and research nurses. These respondents represented 163 (71.8%) of the 227 institutions surveyed.

Figure 1.

Respondents divided by CRA or nursing affiliation

*61 respondents list affiliation as both CRA and nursing

AEs = Adverse Events

Respondents had a range of years of experience as a CRA or research nurse, with approximately one third reporting >10 years of experience (Table 1). The trial types most frequently covered by respondents were Phase III (88.9%), Phase II (82.9%) and Biology studies (81.8%); most respondents worked on more than one type of study (Table 1).

Table 1.

Characteristics of questionnaire respondents

Columns: n, % (N = 369)
COG Affiliated Discipline (Multi-select)
 CRA Only 274 74.3
 Nurse Only 33 8.9
 CRA and Nurse 61 16.5
 No Response 1 0.3
Years of Experience (Single select)
 <1 30 8.1
 1-2 63 17.1
 3-5 65 17.6
 5-10 71 19.2
 >10 135 36.6
 No Response 5 1.4
Institution Staff Size (Single select)
 1-5 FTEs 249 67.5
 5+ FTEs 104 28.2
 No Response 16 4.3
Types of Studies (Multi-select)
 Biology Studies 302 81.8
 Phase 1 Trials 109 29.5
 Phase 2 Trials 306 82.9
 Phase 3 Trials 328 88.9
 Survivorship/Long-term Follow-Up 210 56.9
 COG Pharmaceutical Trials 196 53.1
 Other 9 2.4

COG = Children's Oncology Group; CRA = Clinical Research Associate; FTE = Full-time Equivalent

There was a range in CRA and research nurse FTE by institution (1 full FTE: 18.4%, 2-5: 49.1%, 5-10: 20.3%, 10-20: 5.2%, >20: 2.7%, Unknown: 3.5%, Missing response: 0.8%). Prior to additional analyses, institution size was dichotomized by total FTE. Larger sites were those reporting 5+ FTEs (28.2%) and smaller sites were those reporting 1-5 FTEs (67.5%). Among individuals from the same center, 22.2% had discordant responses to the FTE question, raising concern for misclassification. In a sensitivity analysis these discordant responses were removed, but results remained similar.

CRA and Research Nurse AE Reporting Training

Table 2 describes the range of resources or learning formats used for training new CRAs and research nurses. The most common method was the use of COG training modules (79.7%). Smaller sites were more likely to have new CRAs or research nurses attend the COG annual meeting for training (1-5 FTE: 50.2%, 5+ FTE: 32.7%, p=0.004). Larger sites were more likely to use locally developed templates for tracking AEs (1-5 FTE: 41.4%, 5+ FTE: 52.9%, p=0.034) and institution-developed training modules (1-5 FTE: 23.7%, 5+ FTE: 39.4%, p=0.002). In response to the free-text question about gaps in training, common themes were a lack of training on how to determine whether an AE met criteria for a serious AE (SAE) and a lack of protocol-specific training.

Table 2.

Resources used for training on adverse event capture

What resources or learning formats are used for training new CRAs and research nurses on AE reporting? (Multi-select)
Columns: Overall (N=369) n, %; 1 to 5 FTE* (n=249) n, %; 5+ FTE (n=104) n, %; P-value
COG training modules 294 79.7 203 81.5 82 78.8 0.749
Shadowing another CRA or research nurse 260 70.5 174 69.9 80 76.9 0.115
Having another CRA or research nurse double check my reporting until mastery 181 49.1 123 49.4 52 50.0 0.814
Provided with documents to use for tracking AEs (i.e. Spreadsheet or workbook) 163 44.2 103 41.4 55 52.9 0.034
New CRAs and research nurses attend COG fall meeting for education 163 44.2 125 50.2 34 32.7 0.004
Provided with direct teaching on where to look in the medical record 161 43.6 105 42.2 51 49.0 0.190
Institution-specific training modules 104 28.2 59 23.7 41 39.4 0.002
Sample cases 70 19.0 48 19.3 20 19.2 0.956
Computer-based training 44 11.9 24 9.6 15 14.4 0.174
Other methods1 17 4.6 13 5.2 3 2.9 0.415
Classroom-based training 15 4.1 8 3.2 5 4.8 0.451
AE reporting quizzes 13 3.5 10 4.0 2 1.9 0.520
No Response 5 1.4 1 0.4 2 1.9 -

FTE = Full-time Equivalent; COG = Children's Oncology Group; CRA = Clinical Research Associate; AE = Adverse Event

* 16 respondents did not report the number of FTE at their institution

1 Included 29% (5/17) indicating theme of minimal or no formal training, 24% (4/17) indicating theme of reviewing the protocol, 12% (2/17) indicating theme of reviewing other CRAs’ reports, 6% (1/17) indicating theme of checking with colleagues, and 6% (1/17) indicating theme of using Cancer Therapy Evaluation Program training modules

Study Allocation and AE Tracking Approaches by Site

There was variability in approaches used by institutions for allocating studies. The most common approach was distribution by disease type (35.5%); this was more common at larger sites (1-5 FTE: 20.1%, 5+ FTE: 73.1%, p<0.001) (Table 3). Respondents reported learning about AEs in numerous ways, including reviewing the medical record directly to identify AEs as a first step (77.0%) or being made aware of an AE by a treating clinician (59.9%) (Table 3). Only 189 (51.2%) respondents stated that all CRAs or research nurses at their institutions capture AEs using the same process. This was more likely to be standardized at smaller sites (1-5 FTE: 74.2%, 5+ FTE: 47.3%, p<0.001). Beyond using guidelines in COG protocols, 268 (72.6%) respondents reported that there is no institutional rule regarding timeframe for reporting AEs once a reporting period ends.

Table 3.

Institutional practices for adverse event capture

Columns: Overall (N=369) n, %; 1 to 5 FTE* (n=249) n, %; 5+ FTE (n=104) n, %; P-value
How are studies divided between the CRA/research nurse team? (Single select)
By disease type (i.e. leukemia trials, solid tumor trials, neuro-oncology trials) 131 35.5 50 20.1 76 73.1 <0.0001
They are not divided up in a clear pattern 97 26.3 84 33.7 11 10.6 <0.0001
I am the only CRA or research nurse 54 14.6 54 21.7 0 0.0 <0.0001
By study 37 10.0 29 11.6 3 2.9 0.008
Other1 21 5.7 14 5.6 6 5.8 0.988
By Phase (i.e. Phase I, II, III) 12 3.3 5 2.0 7 6.7 0.028
By the order in which the studies are opened at my institution 6 1.6 6 2.4 0 0.0 0.185
No Response 11 3.0 7 2.8 1 1.0 -
How do you typically first become aware of adverse events? (Multi-select)
I review the medical record and identify the AEs as a first step 284 77.0 193 77.5 80 76.9 0.974
I review the medical record and identify AEs. Then I have the treating clinician and/or study PI review the list of AEs 279 75.6 183 73.5 86 82.7 0.044
A treating clinician or study PI will reach out and notify me of AEs for study patients 221 59.9 161 64.7 54 51.9 0.032
I inquire with the treating clinicians and/or study PI to see if any AEs have occurred with study patients and then I review the medical record 111 30.1 90 36.1 18 17.3 0.0005
I inquire with the treating clinicians and/or study PI as a first step to see if any AEs have occurred with study patients 97 26.3 72 28.9 23 22.1 0.205
Clinicians specifically document AEs (toxicity and grade) in the medical record and I report only those AEs identified in their notes 50 13.6 32 12.9 16 15.4 0.505
Research nurses review clinical documentation in the medical record and communicate specific AEs to me for reporting 45 12.2 20 8.0 22 21.2 0.0004
Other2 32 8.7 23 9.2 9 8.7 0.882
No Response 1 0.3 0 0.0 1 1.0 -
What kind of system do you use to track AEs? (Multi-select)
Excel Spreadsheet for each patient 152 41.2 90 36.1 57 54.8 0.002
Excel Spreadsheet for each study 25 6.8 16 6.4 8 7.7 0.688
Paper/notebook for each patient 154 41.7 114 45.8 35 33.7 0.028
One paper/notebook for all patients 20 5.4 12 4.8 6 5.8 0.730
Word document for each patient 78 21.1 53 21.3 23 22.1 0.906
Word document for each study 10 2.7 9 3.6 1 1.0 0.166
Other3 31 8.4 21 8.4 10 9.6 0.746
I do not track besides putting AE reports into the electronic data capture system 38 10.3 26 10.4 8 7.7 0.406
No Response 3 0.8 3 1.2 0 0.0 -

FTE = Full-time Equivalent; CRA = Clinical Research Associate; AE = Adverse Event; PI = Principal Investigator

* 16 respondents did not report the number of FTE at their institution

1 Included 43% (9/21) indicating theme of division by combination of disease type and phase, 43% (9/21) indicating theme of division by workload, and 4.8% (1/21) indicating theme of division by patients actively on-therapy

2 Included 44% (14/32) indicating theme of attending rounds/meetings, 28% (9/32) indicating theme of reviewing AEs with patients, 3% (1/32) indicating theme of email alerts, and 3% (1/32) indicating theme of using multiple of the "other" options

3 Included 22% (7/31) indicating theme of using different systems based on study specifics, 16% (5/31) indicating theme of using OnCore™ (Forte Research Systems, Madison, WI), 13% (4/31) indicating theme of making a note in the medical record, 13% (4/31) indicating theme of using an unspecified type of log, and 3% (1/31) indicating theme of using calendar appointments

There was a range of responses regarding the method used for AE tracking, and a majority of respondents used more than one method. The most common methods for tracking AEs prior to reporting into the trial electronic data capture system were an Excel® spreadsheet for each patient (41.2%) or a paper/notebook for each patient (41.7%). An Excel® spreadsheet was more common at larger sites (1-5 FTE: 36.1%, 5+ FTE: 54.8%, p=0.002) and a paper/notebook was more common at smaller sites (1-5 FTE: 45.8%, 5+ FTE: 33.7%, p=0.028) (Table 3). The majority of respondents (54.4%) reported spending an average of 10 or more minutes per patient per day on AE reporting.

Challenges in AE Reporting

There were 357 responses to the question asking respondents to rank their top five greatest AE reporting challenges. Figure 2 displays the frequency with which a respondent chose a listed item as their first, second, or third greatest challenge. The most commonly ranked challenge was differences in AE reporting requirements between protocols (251 respondents, 70%; 60 first-place ranks). In addition, 192 (53.7%) respondents were challenged by trying to match CTCAE terminology with clinical documentation, and 190 (53.2%) respondents reported difficulty in getting clinicians to help with challenging AEs.

Figure 2.

Percentage of respondents ranking each AE reporting component as one of the greatest challenges

PIs = Principal Investigators; AE = adverse event; CTCAE = Common Terminology Criteria for Adverse Events; CAEPR = Comprehensive Adverse Events and Potential Risks List; SPEER = Specific Protocol Exceptions to Expedited Reporting; CTEP-AERS = Cancer Therapy Evaluation Program Adverse Event Reporting System; Rave = Medidata Rave, the clinical data management system currently used in Children’s Oncology Group clinical trials

The percentage after each bar represents the proportion of respondents who ranked that challenge as one of the top five greatest AE reporting challenges, out of 357 total responses to this question.

1 Included 34% (12/35) indicating theme of unclear requirements, 9% (3/35) indicating theme of time burden, 3% (1/35) indicating theme of unclear guidance, and 3% (1/35) indicating theme of electronic medical record system challenges

Potential Tools for Improvement of AE Reporting

Respondents commented on the potential utility of 13 proposed central tools to help with AE reporting; 6 were endorsed by at least 50% of respondents (Table 4). The following proposed tools were identified as potentially helpful by more than two-thirds of respondents: online AE reporting modules (75.3%), tip sheets for interpreting CTCAE definitions (67.5%), and standardized AE tracking forms (66.9%). Endorsement of potentially beneficial tools did not vary by FTE size.

Table 4.

Acceptability of proposed central tools for improving adverse event reporting processes

If we were to create central tools to help with AE reporting, which of the following would be helpful? (Multi-select)
Columns: Overall (N=369) n, %
Online modules on how to identify what needs to be reported on a given study 278 75.3
A tip sheet on interpreting some CTCAE definitions 249 67.5
A standardized form for tracking AEs 247 66.9
Webinars about AE reporting 203 55.0
A practice quiz to learn if I am reporting correctly 192 52.0
Training for treating clinicians about AE reporting 184 49.9
Online modules on finding key AE data in the medical record 172 46.6
An online forum for asking AE reporting questions 172 46.6
Computer-based sample cases 167 45.3
Guidance on how to interpret the CTCAE 144 39.0
Downloadable sample cases 123 33.3
Encouragement of the study CRA on each protocol to have "office hours" phone lines arranged for assistance in answering AE questions 108 29.3
COG CRA Mentor of the Month 42 11.4
Other ideas?1 14 3.8
No Response 1 0.3

AE = Adverse Event; CTCAE = Common Terminology Criteria for Adverse Events; CRA = Clinical Research Associate; COG = Children's Oncology Group

1 Included 29% (4/14) indicating theme of improved and standardized protocols, 21% (3/14) indicating theme of more COG meeting trainings for CRAs and PIs, 21% (3/14) indicating theme of better explanation of types of AEs, and 7% (1/14) indicating theme of training on how to read the protocol AE sections

Discussion

This survey of CRAs and research nurses in the COG consortium identifies wide variability in the training received for AE identification and in AE reporting practices. Factors specifically identified by CRAs and research nurses as contributing to underreporting of AEs include differences in reporting requirements between clinical trials, challenges in harmonizing what is documented in the medical record with CTCAE definitions, and difficulty in finding time for clinicians and study principal investigators to assist with AE reporting. Respondents were eager for additional COG-sponsored resources to support their efforts, including, but not limited to, online modules customized for specific trials, tip sheets for interpreting AE definitions, and a standardized, user-friendly AE tracking sheet. The results of this project can serve as a roadmap for next steps to improve AE reporting.

There was some consistency in training approaches across centers, with most respondents reporting that their center used COG training modules. This highlights the importance of ensuring COG training modules are clear, available, and up to date. The majority (75.3%) of respondents felt that additional modules customized to AE reporting for individual trials would be beneficial. Creating these modules during study development and ensuring that they are accessible throughout the study might increase AE reporting. Variation in training approaches did exist, and this variation seemed to relate to site size. The 5+ FTE sites were more likely to supplement COG training modules with local guides; this may be due to division-wide requirements at those hospitals. Respondents from institutions with 1-5 FTE were more likely to report having CRAs or research nurses attend a COG meeting for formal in-person training. Costs of traveling to meetings may limit how many individuals can attend, especially affecting attendance from larger sites. If possible, building infrastructure for in-person training at individual institutions or regionally, or for live virtual training, may improve access to this potentially important training approach. Training for clinicians on how best to document AEs should also be considered, as this may help address the barrier CRAs and nurses face in engaging clinicians in the AE reporting process.

The variability in AE reporting procedures identified in this project appeared to exist both within and across institutions. Only half of respondents reported that all colleagues at their site follow the same AE reporting practices. Consistency within a site was inversely correlated with the number of CRAs or research nurses at a site; smaller centers were more likely to adopt an institution-wide standard practice. Importantly, there was no dominant AE tracking approach across respondents. The COG does provide templates for tracking AEs, but the varied responses regarding tracking methods suggest that these templates are not universally implemented. More concerning is that sites and individuals are developing their own tracking tools, which may lead to variability in the extent to which AEs are followed, and thus inconsistencies in reporting. An effective strategy might be to convene a group of CRAs and research nurses to develop an effective, standardized template that could be distributed for mandatory use at all centers. Tailoring tracking forms to individual trials might also improve AE capture, especially for AEs of concern for a particular intervention.

Finally, there was variability within and across institutions in how respondents first learn that an AE occurred for an on-study patient. Many respondents reported that they first identify an AE via review of the electronic medical record (EMR). Because this appears to be a primary point of AE detection for many CRAs and research nurses, it is important to educate clinicians to provide EMR documentation that maps to CTCAE definitions. Interestingly, respondents from smaller sites were more likely to confer with clinicians as a first step. This suggests that approaches to optimizing AE identification may need to account for institution size and structure.

Beyond providing insights into current AE training and reporting procedures, this questionnaire elucidated important factors that CRAs and research nurses consider barriers. One of the greatest challenges was the differences in reporting requirements between protocols. Efforts to standardize language between protocols could reduce this challenge. The COG Toxicity Task Force has created a guide for new study chairs writing the protocol AE reporting section that may address this barrier. Further, having a central list of AEs that are always required regardless of the protocol, along with a specific list for each trial, could improve AE capture across similar trials over time. While this may not be possible across disease groups, attempts within disease groups may be feasible.

Many of the respondents endorsed centralized strategies for improving reporting. This concept is supported by the American Society of Clinical Oncology (ASCO) policy statement that promotes management by a central review board as an effective measure for improving AE monitoring.11 The majority of respondents to our questionnaire were interested in having more online modules for training new CRAs and for ongoing education; nearly 80% of respondents use the current modules. These modules should focus on providing guidance on navigating specific protocols, understanding AE definitions, and offering direction on AE ascertainment (e.g. where to find data in the EMR, how often to review charts, and how to ask for assistance). In addition, as some CTCAE definitions are complex and difficult to apply to pediatric patients,9 centrally-developed algorithms aimed at clarifying these challenging AEs could help standardize capture. The COG Toxicity Task Force has begun to develop and deploy such algorithms. AE reporting is only one of many tasks that CRAs and research nurses perform,12 and therefore there may be workflow challenges related to complete AE capture. An increased commitment to centralized guidance could streamline the process leading to more complete capture.

The results of this survey should be interpreted in the context of a number of limitations. First, the overall response rate was low. However, this low response rate is likely the result of having to survey all CRAs and nurses on the COG rosters rather than surveying only those who perform AE reporting. The majority (64.9%) of those who did respond to the survey were CRAs and nurses doing AE reporting. As such, these results should reasonably represent the input of COG CRA and nurse members involved in AE reporting processes. Further, respondents represented 71.8% of COG hospitals, suggesting widespread interest in improving AE reporting. Second, the questionnaire was only available for four weeks, and therefore it is possible that those surveyed did not have enough time to respond. However, responses decreased with each weekly reminder, so this timeframe likely did not reduce the response rate. Third, reported CRA and nurse FTE was used as a proxy for institution size rather than COG trial accrual numbers because respondents were not required to report their institution name during the survey, resulting in incomplete capture of that variable. It is possible that responding CRAs and research nurses did not have complete knowledge of their site’s FTE, which could have resulted in misclassification of institution size. A sensitivity analysis that removed discordant responses from individuals at the same site regarding their site’s FTE did not alter the study results. Further, the questionnaire included categories with overlapping numbers of FTE, which may have led to further misclassification if some sites with 5 FTE chose “1-5” and others chose “5-10.” Fourth, the REDCap™ survey did not have the capability to ensure respondents completed the questionnaire only once. Given the uniqueness and length of the survey, we anticipate that few individuals would have completed the survey in duplicate. Lastly, the respondents were skewed toward those with more experience; 73.4% of respondents had three or more years of experience (Table 1). This could have led to under-representation of concerns from newly hired CRAs or research nurses. However, the fact that the majority of respondents were experienced and still felt that they do not have sufficient training or resources may highlight the urgency of addressing these deficiencies.

Conclusion

AE reporting is a crucial component of clinical trials, but many challenges limit the comprehensive and accurate capture of AEs. The results of this questionnaire indicate that there is wide variation in the training and reporting processes used for AE capture, both between and within institutions. Respondents identified the ways that protocols and CTCAE definitions are written as contributors to AE reporting challenges. Further, engaging clinical teams to help with AE reporting remains a barrier. Efforts to enhance centralized and standardized training, provide access to tools that assist with AE reporting, and engage clinicians in the process were cited as mechanisms to increase the comfort level of CRAs and research nurses in reporting AEs. This input should serve as guidance for implementation strategies to improve AE reporting between and within centers enrolling patients on clinical trials. Future projects should stratify differences in AE reporting training and practices by institution size, as determined by study enrollment numbers, and by the types of trials opened, to further identify areas of improvement.

Supplementary Material

Supplemental Data File

Supplemental Figure 1 REDCap Questionnaire

Conflict of Interest and Source of Funding

The authors have no conflicts of interest to declare. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Data are available on request from the authors. The authors thank Wendy Landier, PhD, RN, CRNP for her approval of the questionnaire and manuscript review. This work was supported by research funding from the Children’s Oncology Group, the National Cancer Institute at the National Institutes of Health [award numbers U10CA095861, U10CA180886, UG1CA189955, K07CA211959].

References

1. Smith MA, Altekruse SF, Adamson PC, Reaman GH, Seibel NL. Declining childhood and adolescent cancer mortality. Cancer. 2014;120(16):2497-2506. doi:10.1002/cncr.28748
2. Siegel RL, Miller KD, Jemal A. Cancer statistics, 2020. CA Cancer J Clin. 2020;70(1):7-30. doi:10.3322/caac.21590
3. NCI. National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE) v5.0. 2017. https://ctep.cancer.gov/protocolDevelopment/electronic_applications/ctc.htm
4. Angiolillo AL, Schore RJ, Kairalla JA, et al. Excellent outcomes with reduced frequency of vincristine and dexamethasone pulses in standard-risk B-lymphoblastic leukemia: results from Children's Oncology Group AALL0932. J Clin Oncol. 2021. doi:10.1200/JCO.20.00494
5. Miller TP, Li Y, Kavcic M, et al. Accuracy of adverse event ascertainment in clinical trials for pediatric acute myeloid leukemia. J Clin Oncol. 2016;34(13):1537-1543. doi:10.1200/JCO.2015.65.5860
6. Miller TP, Li Y, Getz KD, et al. Using electronic medical record data to report laboratory adverse events. Br J Haematol. 2017;177(2):283-286. doi:10.1111/bjh.14538
7. Gwede CK, Johnson DJ, Daniels SS, Trotti A. Assessment of toxicity in cooperative oncology clinical trials: the long and short of it. J Oncol Manag. 2002;11(2):15-21.
8. Good MJ, Hurley P, Woo KM, et al. Assessing clinical trial-associated workload in community-based research programs using the ASCO Clinical Trial Workload Assessment Tool. J Oncol Pract. 2016;12(5):e536-e547. doi:10.1200/JOP.2015.008920
9. Miller TP, Fisher BT, Getz KD, et al. Unintended consequences of evolution of the Common Terminology Criteria for Adverse Events. Pediatr Blood Cancer. 2019;66(7):e27747. doi:10.1002/pbc.27747
10. NCI. National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE), Version 4.0. Updated 2009. Accessed February 17, 2020. http://ctep.cancer.gov/protocolDevelopment/electronic_applications/ctc.htm
11. American Society of Clinical Oncology. American Society of Clinical Oncology policy statement: oversight of clinical research. J Clin Oncol. 2003;21(12):2377-2386. doi:10.1200/JCO.2003.04.026
12. Roche K, Paul N, Smuck B, et al. Factors affecting workload of cancer clinical trials: results of a multicenter study of the National Cancer Institute of Canada Clinical Trials Group. J Clin Oncol. 2002;20:545-556.
