BMC Medical Research Methodology
2025 Feb 6;25:33. doi: 10.1186/s12874-025-02486-5

Discrepancies in safety reporting for chronic back pain clinical trials: an observational study from ClinicalTrials.gov and publications

Nick Boyne 1, Alison Duke 1, Jack Rea 1, Adam Khan 1, Alec Young 1, Jared Van Vleet 1, Matt Vassar 1,2
PMCID: PMC11800428  PMID: 39915715

Abstract

Introduction

Chronic back pain (CBP) is a leading cause of disability worldwide and is commonly managed with pharmacological, non-pharmacological, and procedural interventions. However, adverse event (AE) reporting for these therapies often lacks transparency, raising concerns about the accuracy of safety data. This study aimed to quantify inconsistencies in AE reporting between ClinicalTrials.gov and corresponding randomized controlled trial (RCT) publications, emphasizing the importance of comprehensive safety reporting to improve clinical decision-making and patient care.

Methods

We retrospectively analyzed Phase 2–4 CBP RCTs registered on ClinicalTrials.gov from 2009 to 2023. Extracted data included AE reporting, trial sponsorship, and discrepancies in serious adverse events (SAEs), other adverse events (OAEs), mortality, and treatment-related withdrawals between registry entries and publications. Statistical analyses assessed reporting inconsistencies, following STROBE guidelines.

Results

A total of 114 registered trials were identified, with 40 (35.1%) corresponding publications. Among these, 67.5% were industry-sponsored. Only 4 (10%) publications fully reported adverse events (AEs) without discrepancies, while 36 (90%) contained at least one inconsistency compared to ClinicalTrials.gov. Discontinuation due to AEs was explicitly reported in 24 (60%) of ClinicalTrials.gov entries and in 30 (75%) of publications, with discrepancies in 16 trials (40%). Serious adverse events (SAEs) were reported differently in 15 (37.5%) publications; 80% reported fewer SAEs than ClinicalTrials.gov. Other adverse events (OAEs) showed discrepancies in 37 (92.5%) publications, with 43.2% reporting fewer and 54.1% reporting more OAEs.

Discussion

This study highlights pervasive discrepancies in AE reporting for CBP trials, undermining the reliability of published safety data. Inconsistent reporting poses risks to clinical decision-making and patient safety. Adopting standardized reporting guidelines, such as CONSORT Harms, and ensuring transparent updates in publications could enhance the accuracy and trustworthiness of safety data. Journals and regulatory bodies should enforce compliance and future efforts should develop mechanisms to monitor and correct reporting inconsistencies, enhancing the trustworthiness of safety data in clinical research.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12874-025-02486-5.

Keywords: Chronic back pain, Adverse events, Safety reporting, ClinicalTrials.gov, Randomized controlled trials

Introduction

Back pain is a debilitating condition that affects a substantial percentage of the global population across all demographics [1, 2]. It consistently ranks among the top five most frequent reasons for physician visits, with nonspecific low back pain being the most common form [3, 4]. The persistent and chronic nature of back pain makes it the leading cause of disability worldwide [5]. Given its potential to progress into more serious or complex conditions, numerous treatments—including pharmaceutical, non-pharmaceutical, and procedural interventions—have been developed to address chronic back pain (CBP). However, many of these treatments, though therapeutic, are associated with a diverse range of adverse events (AEs). For example, nonsteroidal anti-inflammatory drugs, the first-line pharmacologic treatment, are associated with an increased risk of gastrointestinal bleeding and acute kidney injury when used chronically [6–8]. Moreover, if a patient’s back pain is sufficiently debilitating and proves resistant to first- and second-line treatments, opioids may be used for chronic management [6]. This drug class carries a significantly higher risk profile than first- and second-line treatments, necessitating enhanced prescription surveillance. Given the prevalence of back pain, its wide range of treatments, and associated adverse effects, clear and consistent reporting of safety data related to these therapies is essential.

The reporting of AEs to ClinicalTrials.gov was mandated by the Food and Drug Administration Amendments Act (FDAAA 801) of 2007 and reinforced by the Final Rule implemented in 2017 [9, 10]. This mandate requires the reporting of serious adverse events (SAEs), other adverse events (OAEs)—defined as non-serious adverse events occurring with a frequency of 5% or more in any clinical trial arm—and all-cause mortality [11, 12]. These reports are expected to reflect findings published in corresponding randomized controlled trial (RCT) publications; however, this often does not occur. Despite FDA regulations requiring AE reporting to ClinicalTrials.gov, discrepancies frequently arise between registry data and published trial results [13, 14]. When they occur, space constraints imposed by journals or study design limitations are often cited to explain these inconsistencies [15–17]. Although these limitations are acknowledged, limited reporting increases safety concerns. Excluding AEs can potentially obscure a treatment’s true risk profile, leading to an incomplete clinical understanding [18]. Consequently, patients bear the risks associated with impaired clinical judgment.

This study aims to quantify inconsistencies in safety reporting between registries and publications of CBP and to advocate for systematic improvements in AE reporting. By highlighting discrepancies in AE reporting between ClinicalTrials.gov and corresponding RCT publications, this study emphasizes the importance of transparent and consistent safety reporting. Enhancing the accuracy of safety data in publications not only improves clinical decision-making but also fosters greater trust in therapeutic outcomes, ultimately benefiting patient care. The overarching goal of this research is to advance reliable safety reporting standards to ensure a comprehensive understanding of treatment risks for CBP.

Methods

Study timeline and data sources

We conducted a retrospective analysis of completed RCTs focusing on CBP, as depicted in Fig. 1. The study period started on September 27, 2009, coinciding with the date when the reporting of AEs became compulsory, and continued through December 31, 2023, providing over 14 years of mandatory safety reporting. Trial data were obtained from ClinicalTrials.gov, where protocols, outcomes, and safety information were available. A systematic search of ClinicalTrials.gov was performed using the following keywords: “chronic back pain,” “lower back pain,” “lumbar pain,” “non-specific chronic low back pain,” and "back pain." For trials that had multiple publications, we prioritized publications that specifically reported on the current trial results.

Fig. 1 Flow Chart for Study Inclusion

Identification of corresponding publications

For each trial, we reviewed the “Publications” section in ClinicalTrials.gov to identify any listed references. When no publication was listed, we used the trial’s National Clinical Trial (NCT) identifier to search in PubMed. Only full, peer-reviewed publications reporting the results of only the current trial were included. If multiple publications reported the same trial results, we selected the publication closest in date to when the results were first posted on ClinicalTrials.gov.
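As a concrete illustration of this tie-break rule, the selection can be sketched in a few lines of Python (a hypothetical sketch of our own; the function name and date handling are not taken from the study's actual workflow):

```python
from datetime import date

def closest_publication(results_posted: date, pub_dates: list[date]) -> date:
    """Pick the publication date nearest to the ClinicalTrials.gov
    results-posting date (the tie-break rule described above)."""
    return min(pub_dates, key=lambda d: abs((d - results_posted).days))

# Illustrative dates only: results posted January 2020, two candidate papers
print(closest_publication(date(2020, 1, 1), [date(2019, 6, 1), date(2020, 3, 1)]))
```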

This expanded two-step approach helped us capture trials that may have been published but not directly linked in ClinicalTrials.gov. Nonetheless, we acknowledge that some completed trials may remain unpublished or published in outlets not indexed in PubMed, which we note as a limitation.

Sample

This analysis included randomized controlled trials (RCTs) investigating pharmacological, non-pharmacological, or procedural interventions for the management of CBP. Eligible trials were required to be Phase 2, 3, or 4 studies registered on ClinicalTrials.gov. The selected timeline aligned with the implementation of the Food and Drug Administration Amendments Act (FDAAA) 801, which mandated the reporting of AEs starting in 2009 to ensure uniform safety reporting. The trials needed to involve child or adult participants (birth to 64 years old) diagnosed with CBP, as outlined in the trial eligibility criteria.

To be included, trials had to report results on ClinicalTrials.gov, specifically safety outcomes such as serious adverse events (SAEs), other adverse events (OAEs), mortality, and treatment-related withdrawals. Corresponding publications were required to be peer-reviewed articles published in English and indexed on PubMed. Trials without publicly available results on ClinicalTrials.gov or those not linked to a corresponding publication were excluded. Additionally, we excluded the following trials for wrong study design: non-randomized studies, observational studies, trials not focusing on CBP, and studies conducted in languages other than English. Phase 1 trials were excluded as they mainly focus on dose-escalation and safety in healthy volunteers, rather than efficacy in patient populations. Trials registered but without posted results were also excluded.

Data extraction and analysis of safety reporting

Data extraction was performed by two independent reviewers in a masked and duplicate manner, using a standardized form to collect information from each trial’s ClinicalTrials.gov entry and its corresponding publication. Key data elements included the National Clinical Trial (NCT) identifier, trial start date, primary completion date, results posting date, sponsor, funding source, and trial phase. Safety data extracted included the number of serious adverse events (SAEs), other adverse events (OAEs), frequency thresholds for OAEs, treatment-related withdrawals, and reported deaths. The reporting of adverse events on ClinicalTrials.gov was compared with corresponding publications to identify any discrepancies in the number or descriptions of adverse events, with discrepancies flagged as instances of inconsistent reporting; we also looked at discrepancies in reporting of frequency thresholds for OAEs.

We also assessed whether publications explicitly mentioned the absence of SAEs or OAEs. Instances where AEs were reported as zero in ClinicalTrials.gov but not mentioned in publications were flagged as incomplete reporting. Additionally, we noted the time interval between the results posting on ClinicalTrials.gov and the publication of corresponding articles to explore whether delays in publication were associated with reporting discrepancies.
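The flagging logic described in this subsection can be sketched as a simple classification over paired counts. This is a minimal illustration under our own naming; the `Trial` structure and category labels are hypothetical, not the study's actual extraction form:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Trial:
    nct_id: str
    registry_saes: int               # SAE count posted on ClinicalTrials.gov
    publication_saes: Optional[int]  # None if the paper never mentions SAEs

def classify_sae_reporting(t: Trial) -> str:
    """Label how a publication's SAE count relates to the registry entry."""
    if t.publication_saes is None:
        # Zero SAEs in the registry with no explicit statement in the paper
        # is flagged as incomplete reporting; a nonzero registry count with
        # no mention in the paper is an outright omission.
        return "incomplete" if t.registry_saes == 0 else "omitted"
    if t.publication_saes == t.registry_saes:
        return "consistent"
    return "fewer" if t.publication_saes < t.registry_saes else "more"

# Illustrative trial: the registry lists 5 SAEs, the paper reports 3
print(classify_sae_reporting(Trial("NCT00000000", 5, 3)))
```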

Inter-rater reliability was assessed to maintain consistency between reviewers, and any disagreements were resolved through discussion with a third reviewer. We adhered to the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines to ensure transparency in the reporting and extraction process.

Statistical analysis and open science

Descriptive statistics were used to summarize the characteristics of the included trials and the patterns of safety reporting. Comparative analyses were performed using the Chi-square test or Mann–Whitney U test to assess differences in the number of SAEs, OAEs, deaths, and participant withdrawals reported between ClinicalTrials.gov and the corresponding publications. Following study completion, we uploaded raw data, statistical analysis scripts, and extraction forms to the Open Science Framework (OSF), a free-to-upload data repository [19]. Our data are accessible on OSF throughout the repository’s lifecycle; alternatively, they can be obtained by request [20].
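To make the comparative analysis concrete, a chi-square test of independence on a 2x2 table (for example, zero-SAE registry entries versus SAE omission in the publication) can be computed as below. This is a minimal sketch with invented counts, not the study's code or data:

```python
def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 contingency table of trial counts."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n  # under independence
            stat += (obs - expected) ** 2 / expected
    return stat

# Invented counts: rows = zero vs. nonzero registry SAEs,
# columns = SAEs omitted vs. reported in the publication
print(round(chi_square_2x2([[8, 2], [5, 25]]), 3))
```

In practice this would be delegated to a statistics package (e.g., `scipy.stats.chi2_contingency`, which also returns the p-value); the hand computation is shown only to make the test explicit.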

Ethical oversight

The Oklahoma State University Center for Health Sciences reviewed the study protocol and determined that the research qualifies as nonhuman subjects research, in accordance with 45 CFR 46.102(d) and (f).

Results

General characteristics

A total of 114 registered trials were identified, with 40 (35.1%) available publications. The majority of the published trials were phase 3 (n = 22, 55.0%), followed by phase 4 (n = 11, 27.5%), phase 2 (n = 6, 15.0%), and phase 2 | phase 3 (n = 1, 2.5%). Additionally, 27 (67.5%) of the published trials were industry-sponsored, 2 (5.0%) were federally sponsored, and 11 (27.5%) were sponsored by either universities, hospitals, or private individuals. In terms of interventions/treatments, 29 (72.5%) of these trials used drugs, 3 (7.5%) used biologicals, 3 (7.5%) used both drugs and biologics, 1 (2.5%) used both drugs and behavioral interventions, 1 (2.5%) used procedures, 1 (2.5%) used both drugs and procedures, 1 (2.5%) used a device, and 1 (2.5%) used both drugs and a device (Table 1, Supplemental Table 2).

Table 1.

Characteristics of Included Trials

Characteristic Overall (N = 40)
Phase
 Phase II 6 (15%)
 Phase II | Phase III 1 (2.5%)
 Phase III 22 (55%)
 Phase IV 11 (27.5%)
Sponsor/Funding
 Industry 27 (67.5%)
 Federal 2 (5%)
 Othera 11 (27.5%)
Intervention / Treatment
 Drug 29 (72.5%)
 Biological 3 (7.5%)
 Drug, Biological 3 (7.5%)
 Drug, Behavioral 1 (2.5%)
 Procedure 1 (2.5%)
 Drug, Procedure 1 (2.5%)
 Device 1 (2.5%)
 Drug, Device 1 (2.5%)
Journal
 Pain Practice 2 (5%)
 PAIN: The Journal of the International Association of Pain 8 (20%)
 Molecular Pain 1 (2.5%)
 Current Medical Research and Opinion 1 (2.5%)
 Journal of Addictive Diseases 1 (2.5%)
 Expert Opinion on Pharmacotherapy 1 (2.5%)
 Osteopathic Medicine and Primary Care 1 (2.5%)
 JAMA Network Open 1 (2.5%)
 JAMA 3 (7.5%)
 Pain Management 1 (2.5%)
 Journal of Pain and Symptom Management 1 (2.5%)
 Clinical Therapeutics 1 (2.5%)
 Annals of Emergency Medicine 2 (5%)
 Archives of Physical Medicine and Rehabilitation 1 (2.5%)
 Regional Anesthesia & Pain Medicine 1 (2.5%)
 Therapeutic Advances in Musculoskeletal Disease 1 (2.5%)
 Frontiers in Pain Research 1 (2.5%)
 Arthritis & Rheumatology 1 (2.5%)
 Pain and Therapy 2 (5%)
 Annals of Internal Medicine 1 (2.5%)
 The Primary Care Companion for CNS Disorders 1 (2.5%)
 Journal of Pain Research 3 (7.5%)
 European Journal of Pain 1 (2.5%)
 Lancet Rheumatology 1 (2.5%)
 Rheumatology and Therapy 1 (2.5%)
 RMD Open 1 (2.5%)
 Spine 1 (2.5%)
 Arthritis Research & Therapy 1 (2.5%)

aSponsored by universities, hospitals, private individuals

Among the publications, 14 (35.0%) were published before the results of their trials were first posted on ClinicalTrials.gov. The median impact factor of the journals in which these trials were published was 7.3 [95% CI: 2.39–12.29] [21]. For AE reporting, only 4 (10%) publications fully reported AEs without any discrepancies. In contrast, 36 (90%) publications exhibited at least one inconsistency in the reporting of AEs when compared to ClinicalTrials.gov.

The discontinuation of participants due to AEs was explicitly reported on ClinicalTrials.gov for 24 (60%) trials, compared with 30 (75%) of publications. Discrepancies in the number of discontinuations due to AEs between ClinicalTrials.gov and corresponding publications were found in 16 trials (40%), and 10 (25%) publications did not report the number of discontinuations due to AEs at all. In most cases, the number of patients reported as having discontinued due to AEs in publications either matched or was lower than what was reported on ClinicalTrials.gov (Table 2).

Table 2.

Trial and Associated Publication Identifiers

ClinicalTrials.gov Identifier (NCT) PubMed ID (PMID)
NCT01855919 27831985
NCT01838616 26095455
NCT00108550 26963844
NCT00125528 27852965
NCT01685684 26262828
NCT02362672 30747908
NCT00549042 20429852
NCT01452529 26111544
NCT00315120 23759340
NCT01898013 32119096
NCT00404079 20606148
NCT00876187 23628600
NCT02725411 34786956
NCT00490919 21945130
NCT01571362 25993547
NCT01112267 24183364
NCT01352741 24738609
NCT01863732 31565244
NCT03068897 30955985
NCT02665286 29089169
NCT01238536 22458343
NCT03372161 38875121
NCT00733096 22508732
NCT00668434 25988461
NCT03136861 34707696
NCT00424593 20461028
NCT01951105 34374961
NCT01559454 31774028
NCT01528332 39109241
NCT02700815 32221866
NCT01777581 25664215
NCT03802565 33262641
NCT01708915 25929250
NCT01587274 26501533
NCT03003000 31576162
NCT02528253 32453139
NCT01358175 27390130
NCT01697358 30720582
NCT02008916 29273067
NCT02159053 30121827

SAEs reporting

In 15 (37.5%) publications, the number of reported SAEs differed from the corresponding number in ClinicalTrials.gov (P = 0.154). Of these 15 publications, 2 (13.3%) reported a higher number of SAEs than ClinicalTrials.gov, 12 (80%) reported a smaller number, and 1 (6.7%) omitted SAEs entirely. Additionally, 17 (42.5%) publications exhibited discrepancies in the description of SAEs. Furthermore, trials for which SAE data were first posted on ClinicalTrials.gov before being published (less than or equal to the median of 24.2 months [95% CI: 9.6–38.7]) were more prone to different descriptions of SAEs in publications (χ2 = 0.201, P = 0.654). Additionally, authors of publications corresponding to trials with zero reported SAEs in ClinicalTrials.gov were more likely to omit explicit reporting of SAEs in publications (χ2 = 4.183, P = 0.041) (Table 3).

Table 3.

Discrepancies in Reporting for SAEs, OAEs, Deaths and Discontinuations

Parameter: SAEs
 Number of publications with discrepancy: 15/40 (37.5%)
 Direction of discrepancy: 2 (13.3%) reported more SAEs than ClinicalTrials.gov; 12 (80%) reported fewer SAEs; 1 (6.7%) omitted SAEs
 Descriptive differences: 17 (42.5%) publications had different descriptions of SAEs
 Additional findings: trials posted ≤ 24.2 months before publication were more prone to different SAE descriptions; trials with zero reported SAEs on ClinicalTrials.gov were more likely to omit SAEs in the publication

Parameter: OAEs
 Number of publications with discrepancy: 37/40 (92.5%)
 Direction of discrepancy: 16 (43.2%) reported fewer OAEs; 20 (54.1%) reported more OAEs; 1 (2.5%) omitted OAEs
 Descriptive differences: 35 (87.5%) publications had different descriptions of OAEs
 Additional findings: median frequency threshold for reporting on ClinicalTrials.gov: 2.5% (range: 0–5%); 23 (57.5%) publications did not report any threshold; 14 (82.4%) of 17 publications used a different threshold than ClinicalTrials.gov

Parameter: Deaths and discontinuations
 Number of publications with discrepancy: deaths, 15/40 (37.5%); discontinuations, 16/40 (40%)
 Direction of discrepancy: 10 (66.7%) of the 15 death discrepancies were unreported on ClinicalTrials.gov but reported in publications; 7 (43.8%) of 16 discontinuation discrepancies were omissions on ClinicalTrials.gov
 Descriptive differences: not specifically measured for descriptions
 Additional findings: deaths were reported in 14 (35%) trials on ClinicalTrials.gov vs. 19 (47.5%) in publications

OAEs reporting

In 37 (92.5%) publications, the number of reported OAEs differed from the corresponding number in ClinicalTrials.gov (P = 1.95 × 10⁻⁸); 16 (43.2%) of these publications reported a smaller number, and 20 (54.1%) reported a higher number. In 35 (87.5%) publications, the description of reported OAEs differed from that in ClinicalTrials.gov. OAEs were omitted entirely in 1 (2.5%) publication despite being reported in ClinicalTrials.gov. The median reported frequency threshold in ClinicalTrials.gov was 2.5%, ranging from 0 to 5%. In 23 (57.5%) publications, the frequency threshold was not reported. Of the 17 frequency thresholds reported in the publications analyzed, 14 (82.4%) differed from the threshold reported on ClinicalTrials.gov (Table 3).

Deaths and participant discontinuation due to AEs reporting

In 14 (35%) trials, deaths were reported in ClinicalTrials.gov, while 19 (47.5%) corresponding publications reported deaths. Discrepancies in death reporting were found in 15 (37.5%) trials. Of these 15 trials, 10 (66.7%) did not report any deaths on ClinicalTrials.gov even though their corresponding publications did (χ2 = 0.309, P = 0.579). Sixteen discrepancies were found for participant discontinuation due to AEs between data reported on ClinicalTrials.gov and corresponding publications. Of these 16 discrepancies, 7 (43.8%) were due to ClinicalTrials.gov omitting participant discontinuation due to AEs when corresponding publications did report this data. Overall, publications were more likely to report participant discontinuation due to AEs than their corresponding trials on ClinicalTrials.gov (Table 3).

Discussion

Our study demonstrated major discrepancies in AE reporting between CBP trials registered on ClinicalTrials.gov and their respective publications. Importantly, nearly all publications had at least one discrepancy in AE reporting. These discrepancies extended to discontinuation of participants due to AEs, with publications reporting a higher rate of discontinuation due to AEs than the trials themselves. Nearly half of the studies had a discrepancy in discontinuation due to AEs between the data reported on ClinicalTrials.gov and the data presented in the corresponding publication. Furthermore, both SAEs and OAEs had significant discrepancies in descriptions of the types of AEs. Notably, inconsistencies in OAE descriptions were found in nearly 90% of registered trials and their respective publications. Perhaps most alarming was the over one-third discrepancy in death reporting. These findings highlight a serious area of concern for the transparency and ultimately the validity of CBP clinical trials.

The lack of consistency in reporting between clinical trials and their respective publications is not a finding novel to our study. For example, Hartung et al. found numerous inconsistencies between ClinicalTrials.gov and publications, including in outcomes, SAEs, and death reporting; most SAE discrepancies involved higher numbers on ClinicalTrials.gov, and nearly a third of studies had discordant death reporting [17]. Our findings align with this pattern: most SAE discrepancies were larger on ClinicalTrials.gov, and over a third of trials showed differences in death reporting. Similarly, Paladin et al. identified multiple AE and death-reporting discrepancies in allergic rhinitis trials, where all trials reported SAEs and OAEs, yet fewer than half of the corresponding publications had complete AE data [22]. In our study, only 10% of publications fully matched registry AE reports. Paladin et al. also noted that about one fifth of SAE counts differed from the trial record, whereas we observed discrepancies in over a third of our publications, with only two reporting higher SAEs than ClinicalTrials.gov. OAE reporting further mirrored Paladin et al.’s findings: most publications differed from the registry, and many reported more OAEs than the trials themselves [22]. Even though these studies focus on distinct conditions, they underscore a broader concern regarding AE reporting and its implications. The lack of consistency between the data reported on ClinicalTrials.gov and the data published in the corresponding journal raises questions about the validity of the work.

Although our analysis quantifies how frequently these discrepancies occur, their direct impact on the strength and validity of conclusions in the corresponding publications remains unclear. However, underreporting or overreporting of adverse events has been shown to misrepresent a treatment’s true risk profile, potentially influencing clinical decision-making and patient outcomes [23, 24]. It is important to note that similar discrepancies in adverse event reporting have been observed beyond ClinicalTrials.gov and journal publications, extending to clinical study reports and regulatory documents. For instance, unpublished study reports often reveal more comprehensive AE data than their corresponding published versions, indicating a broader issue of under-reporting across multiple repositories [25, 26]. These findings underscore the need for robust, transparent reporting systems that integrate data from all available sources to ensure clinicians and researchers have access to accurate safety profiles. Therefore, it is imperative to address and remedy these issues to provide unbiased and well-founded clinical trials.

Clinical trials evaluating AE profiles play a significant role in the safety of treatment modalities for patients. However, when AEs are not disclosed or are inadequately reported, clinicians may be led to the false belief that a modality is safer than it actually is. Research evaluating the validity of AE reporting is therefore crucial to ensure AEs are being reported accurately. Belknap et al. conducted a meta-analysis of AE reporting in clinical trials of finasteride for androgenic alopecia and found that half of the trials had inadequate or no AE reporting [27]. Additionally, Sivendran et al. evaluated AE reporting in cancer clinical trials and found that most studies only reported AEs above a certain threshold [28]. Our findings add to this body of evidence, demonstrating that nearly all CBP trials in our study had discrepancies in AE reporting. These incongruencies can have detrimental ramifications for patients if an accurate AE profile is not developed. For example, Pasi et al. evaluated various AE reporting databases for cyanoacrylate adhesive, a venous disorder treatment believed to be safe, and found 14 deaths along with hundreds of thromboembolic and immune reactions, noting that AEs are underreported in the medical literature [29, 30]. This underscores the need for an accurate AE profile to allow proper benefit-to-risk decision-making for treatments.

It is important to recognize that efforts have been made to combat this issue, most notably the development of the CONSORT extension, CONSORT Harms. First published in 2004, the extension attempted to address these nonreporting issues; however, many studies showed no significant improvement in AE reporting, leading to the updated 2022 version that is now incorporated within the CONSORT checklist itself [31–33]. These advancements should continue to improve AE reporting, but that will require journals’ endorsement. With patients’ quality of life and ultimately lives at stake, AE reporting needs to be standardized and unified among journals publishing CBP trials.

Recommendations

We recommend that journals publishing CBP trials require the use of CONSORT Harms upon submission. This will provide journals with a standardized way to ensure consistency between the clinical trials and subsequent publications, fostering transparency and accuracy so that data can be reliably used in clinical practice to improve patient outcomes. Additionally, we recommend that if any changes occur that cause discordance between trials and publications, addenda should be made to reflect those updates, ensuring the most accurate data are represented in all forms.

Our findings also suggest that ClinicalTrials.gov entries alone are not necessarily more reliable than corresponding publications—particularly for discontinuations and deaths—underscoring the need for more robust and enforced reporting standards on ClinicalTrials.gov. Future efforts could benefit from leveraging clinical regulatory documents and safety databases as supplementary sources of adverse event information to ensure a more comprehensive safety profile [34, 35].

Finally, we suggest that journals require authors to submit links to their ClinicalTrials.gov entries and a document with explanations for any persistent discrepancies. We believe these measures will greatly improve consistency between CBP trials and publication reporting.

Strengths and limitations

Our study has several strengths. First, we examined both ClinicalTrials.gov and the respective publications, allowing an in-depth analysis of the adverse event profile. Second, our protocol was developed a priori. Finally, data were extracted in a masked, duplicate fashion. Despite these strengths, our study is not without limitations. Human error is possible, and some files or supplementary materials in publications may have been missed; we tried to mitigate this risk by reconciling data after masked, duplicate data extraction and by scanning any available supplementary files. Another limitation is the relatively small number of published trials that met our inclusion criteria, which may reduce the generalizability of our findings; drawing definitive conclusions from such a limited dataset may not fully capture the broader spectrum of safety reporting across CBP trials. Moreover, as we only used publications indexed on PubMed and trials registered on ClinicalTrials.gov for data extraction, we may have excluded completed trials that remain unpublished or are not indexed in PubMed, introducing potential selection bias. Future research with a larger sample size would help validate and extend our findings, providing a more comprehensive understanding of discrepancies in safety reporting.

Conclusion

Major inconsistencies exist among CBP trials and publications in their AE and death reporting profiles, raising questions about the reliability of the evidence guiding patient care. While direct evidence that using CONSORT Harms alone reduces discrepancies between trials and publications is not yet available, it provides a structured framework for comprehensive and transparent reporting. We therefore recommend that journals encourage or require authors to use the CONSORT Harms extension, include addenda if updates are needed, and provide clear disclosure if any discrepancies remain. These practices have the potential to improve the transparency and quality of clinical research, thereby promoting better patient outcomes and greater trust in published findings.

Supplementary Information

Acknowledgements

Not applicable.

Author’s contributions

NB: study screening, data extraction and analysis, final approval of manuscript; AD: study screening, data extraction and analysis, final approval of manuscript; JR: manuscript advisement, final approval of manuscript; AK: study design, manuscript preparation and revision, manuscript advisement, final approval of manuscript; AY: manuscript preparation and revision, final approval of manuscript; JV: manuscript preparation and revision, final approval of manuscript; MV: study design, manuscript advisement, final approval of manuscript.

Funding

This study was not funded.

Data availability

No datasets were generated or analysed during the current study.

Declarations

Ethics approval and consent to participate

The Oklahoma State University Center for Health Sciences Institutional Review Board reviewed the study protocol and determined that the research qualifies as nonhuman subjects research, in accordance with 45 CFR 46.102(d) and (f).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
