PLOS Medicine
2023 Mar 21;20(3):e1004175. doi: 10.1371/journal.pmed.1004175

Institutional dashboards on clinical trial transparency for University Medical Centers: A case study

Delwen L Franzen 1,*, Benjamin Gregory Carlisle 1, Maia Salholz-Hillel 1, Nico Riedel 1, Daniel Strech 1
Editor: Florian Naudet2
PMCID: PMC10030018  PMID: 36943836

Abstract

Background

University Medical Centers (UMCs) must do their part for clinical trial transparency by fostering practices such as prospective registration, timely results reporting, and open access. However, research institutions are often unaware of their performance on these practices. Baseline assessments of these practices would highlight where there is room for change and empower UMCs to support improvement. We performed a status quo analysis of established clinical trial registration and reporting practices at German UMCs and developed a dashboard to communicate these baseline assessments with UMC leadership and the wider research community.

Methods and findings

We developed and applied a semiautomated approach to assess adherence to established transparency practices in a cohort of interventional trials and associated results publications. Trials were registered in ClinicalTrials.gov or the German Clinical Trials Register (DRKS), led by a German UMC, and reported as complete between 2009 and 2017. To assess adherence to transparency practices, we identified results publications associated with trials and applied automated methods at the level of registry data (e.g., prospective registration) and publications (e.g., open access). We also obtained summary results reporting rates of due trials registered in the EU Clinical Trials Register (EUCTR) and conducted at German UMCs from the EU Trials Tracker. We developed an interactive dashboard to display these results across all UMCs and at the level of single UMCs. Our study included and assessed 2,895 interventional trials led by 35 German UMCs. Across all UMCs, prospective registration increased from 33% (n = 58/178) to 75% (n = 144/193) for trials registered in ClinicalTrials.gov and from 0% (n = 0/44) to 79% (n = 19/24) for trials registered in DRKS over the period considered. Of trials with a results publication, 38% (n = 714/1,895) reported the trial registration number in the publication abstract. In turn, 58% (n = 861/1,493) of trials registered in ClinicalTrials.gov and 23% (n = 111/474) of trials registered in DRKS linked the publication in the registration. In contrast to recent increases in summary results reporting of drug trials in the EUCTR, 8% (n = 191/2,253) and 3% (n = 20/642) of due trials registered in ClinicalTrials.gov and DRKS, respectively, had summary results in the registry. Across trial completion years, 41% of trials (n = 1,198/2,892) reported results in a timely manner (within 2 years of trial completion) as a manuscript publication or as summary results.
The proportion of openly accessible trial publications steadily increased from 42% (n = 16/38) to 74% (n = 72/97) over the period considered. A limitation of this study is that some of the methods used to assess the transparency practices in this dashboard rely on registry data being accurate and up-to-date.

Conclusions

In this study, we observed that it is feasible to assess and inform individual UMCs on their performance on clinical trial transparency in a reproducible and publicly accessible way. Beyond helping institutions assess how they perform in relation to mandates or their institutional policy, the dashboard may inform interventions to increase the uptake of clinical transparency practices and serve to evaluate the impact of these interventions.

Author summary

Why was this study done?

  • Clinical trials are the foundation of evidence-based medicine and should follow established guidelines for transparency: Their results should be available, findable, and accessible regardless of the outcome.

  • Previous studies have shown that many clinical trials fall short of transparency guidelines, which distorts the medical evidence base, creates research waste, and undermines medical decision-making.

  • University Medical Centers (UMCs) play an important role in increasing clinical trial transparency but are often unaware of their performance on these practices, making it difficult to drive improvement.

What did the researchers do and find?

  • We developed a pipeline to evaluate clinical trials across several established practices for clinical trial transparency and applied it in a cohort of 2,895 clinical trials led by German UMCs.

  • We found that while some practices are gaining adherence (e.g., prospective registration in ClinicalTrials.gov increased from 33% to 75% over the period considered), there is much room for improvement (e.g., 41% of trials reported results within 2 years of trial completion).

  • We developed a dashboard to communicate these transparency assessments to UMCs and support their efforts to improve.

What do these findings mean?

  • Our study demonstrates the feasibility of developing a dashboard to communicate adherence to established practices for clinical trial transparency.

  • By highlighting areas for improvement, the dashboard provides actionable information to UMCs and empowers their efforts to improve.

  • The dashboard may inform interventions to increase clinical trial transparency and be scaled to other countries and stakeholders, such as funders or clinical trial registries.

Introduction

Valid medical decision-making depends on an evidence base composed of clinical trials that were prospectively registered and reported in an unbiased and timely manner. The registration of clinical trials in publicly accessible registries informs clinicians, patients, and other relevant stakeholders about which trials are planned, in progress, or completed, and aggregates key information relating to those trials. Trial registration thus reduces bias in our understanding of the existing medical evidence and disincentivizes outcome-switching and selective reporting [1]. For clinical trials to generate useful and generalizable medical knowledge, trial results should also be reported in a timely manner after trial completion per the World Health Organization (WHO) Joint Statement on Public Disclosure of Results from Clinical Trials [2]. Disclosure is a necessary but not sufficient component of transparency: Trial results should also be openly accessible and findable, in line with established guidelines [2–6]. However, several studies have shown that clinical trials are often not registered and reported according to these standards [7–11].

Audits of research practices can build understanding of the status quo, inform new policies, and evaluate the impact of interventions to support improvement. Examples include the European Commission’s Open Science monitor [12], the German Open Access monitor [13], the French Open Science Monitor in health [14], and institution-specific dashboards of select research practices [15]. Focusing on trial transparency, the EU Trials Tracker and the Food and Drug Administration Amendments Act 2007 (FDAAA) TrialsTracker [16,17] display up-to-date summary results reporting rates of public and private trial sponsors in a transparent and accessible way. The EU Trials Tracker served as a key resource for initiatives aiming to increase reporting rates of drug trials in the EU Clinical Trials Register (EUCTR) [18,19]. Based on the EU Trials Tracker, results reporting in the EUCTR increased from 50% in 2018 to 84% by late 2022.

Research institutions such as University Medical Centers (UMCs) can incentivize practices for research transparency through their reward and promotion systems [20,21] and by providing education, infrastructure, and services [22,23]. However, internal and external assessments of research conducted at UMCs rarely acknowledge these practices [24,25]. Rather, traditional indicators of research performance such as the number of clinical trials, the extent of third-party funding, and the impact factor of published papers emphasize quantity over quality, which can entrench problematic research practices [26]. Initiatives such as the Declaration on Research Assessment (DORA) and the Hong Kong Principles have called for a change in the way researchers are assessed, and for more recognition of behaviors that strengthen research integrity [20,27]. The establishment of the Coalition on Advancing Research Assessment (CoARA) and the 2022 Agreement on Reforming Research Assessment emphasize this shift towards rewarding responsible research practices to maximize research quality and impact [28]. In turn, the UNESCO Recommendation on Open Science adopted in 2021 affirmed the need to establish monitoring and evaluation mechanisms relating to open science [29]. Audits of transparency practices could empower UMCs to support their uptake by highlighting where there is room for improvement and where to allocate resources. Comparative assessments between institutions could also provide examples of successes and stimulate knowledge transfer.

Audits that are based on open and scalable methods facilitate repeated evaluation and uptake at other organizations. Such an evaluation of transparency practices at the level of clinical trials led by UMCs requires reproducible and efficient procedures for (a) sampling all clinical trials and associated results publications affiliated with UMCs and (b) measuring select registration and reporting practices. We previously established procedures for identifying all clinical trials associated with a specific UMC and their earliest results publications [9,11]. In turn, an increasing number of open-source publication and registry screening tools have been developed in the context of meta-research projects aiming to increase research transparency and reproducibility [10,30–32].

The objective of this study was to perform a status quo analysis of a set of established practices for clinical trial transparency at the level of UMCs and present these assessments in the form of an interactive dashboard to support efforts to improve performance. While the general approach of our study is applicable for UMCs worldwide, this study focused on German UMCs.

Methods

Producing a dashboard for clinical trial transparency required the development of a pipeline consisting of 3 main steps: first, the identification of registered clinical trials led by German UMCs; second, the evaluation of select registration and reporting practices, including (a) the partly automated and partly manual identification of earliest results publications of these trials and (b) the application of automated tools at the registry and publication level; third, the presentation of these baseline assessments in the form of an interactive dashboard. An overview of the dependence of these steps on automated versus manual approaches is provided in S1 Supplement. The development of the dashboard was iterative and did not have a prospective protocol. The methods to develop the underlying dataset of clinical trials and associated results publications, however, were preregistered in Open Science Framework (OSF) for trials completed 2009 to 2013 [33] and 2014 to 2017 [34].

Data sources and inclusion and exclusion criteria

The data displayed in the dashboard relate exclusively to registered (either prospectively or retrospectively) clinical trials obtained from 3 data sources with the following inclusion and exclusion criteria:

  1. The IntoValue cohort of registered clinical trials and associated results [35]. This dataset consists of interventional clinical trials registered in ClinicalTrials.gov or DRKS, considered as complete between 2009 and 2017 per the registry, and led by a German UMC (i.e., led either as sponsor, responsible party, or as host of the principal investigator). Trials were searched for 38 German UMCs based on their inclusion as members on the website of the association of medical faculties of German universities [36] at the time of data collection. In line with WHO and International Committee of Medical Journal Editors (ICMJE) definitions [4,37], trials in this cohort include all interventional studies and are not limited to Clinical Trials of an Investigational Medicinal Product (CTIMP) regulated by the EU’s Clinical Trials Regulation or Germany’s drug or medical device laws. The dataset includes data from partly automated and partly manual searches to identify the earliest reported results associated with these trials (as summary results in the registry and as publication). The methods for sampling UMC-specific sets of registered clinical trials and tracking associated results are described in detail elsewhere [9,11]. Briefly, we used automated methods to search registries for clinical trials associated with German UMCs and manually validated the affiliations of all trials. We deduplicated trials in this cohort that were cross-registered in ClinicalTrials.gov and DRKS (see more information in S2 Supplement). Results publications associated with these trials were identified by means of a manual search across several search engines. This was complemented by automated methods to identify linked publications in the registry [10]. 
To reflect the most up-to-date status of trials, we downloaded updated registry data for the trials in this cohort on 1 November 2022 and reapplied the original IntoValue exclusion criteria: study completion date before 2009 or after 2017, not considered as complete based on study status, and not interventional. More detailed information on the inclusion and exclusion criteria can be found in S2 Supplement.

  2. For assessing prospective registration in ClinicalTrials.gov, we used a more recent cohort of interventional trials registered in ClinicalTrials.gov, started between 2006 and 2018, led by a German UMC, and considered as complete per study status in the registry. We downloaded updated registry data for the trials in this cohort on 1 November 2022 and reapplied the same exclusion criteria as above except for completion date (S2 Supplement).

  3. For assessing results reporting in the EUCTR, we retrieved data from the EU Trials Tracker on 4 November 2022 [16]. We found a sponsor name for 34 of the UMCs included in this study as of August 2021 (sponsor names in the EU Trials Tracker are subject to change). If more than one corresponding sponsor name was found for a given UMC (Bochum, Giessen, Heidelberg, Kiel, Marburg, and Tübingen), we selected the sponsor with the most trials. More detailed information can be found in S3 Supplement.

Analysis of registration and reporting practices

The dashboard displays the performance of UMCs on 7 recommended transparency practices for trial registration and reporting. In this study, we focused on adherence to ethical principles and reporting guidelines that apply to all trials. Compliance with a legal regulation was only assessed for summary results reporting in the EUCTR. For an overview of these practices, relevant guidelines and laws, the sample considered, and the measured outcome, see Fig 1 (sources in S4 Supplement) and Table 1. The data for these metrics were obtained through a combination of automated approaches and manual searches, several of which have been described previously [811]. In the following, we outline the methods used to generate the data for each metric. More detailed information can be found in the Methods page of the dashboard and in S5 Supplement.

Fig 1. Overview of the clinical trial transparency practices included in the dashboard.

Fig 1

Relevant guidelines and/or laws are provided for each practice (as of November 2022). A list of references can be found in S4 Supplement. An adaptation of this overview is included in the “Why these practices?” page of the dashboard. *DFG: According to the DFG guidelines at the time of writing, summary results should be posted in the registry at the latest 2 years after trial completion, or earlier if required by applicable legal regulations. BMBF, Bundesministerium für Bildung und Forschung; CIOMS, Council for International Organizations of Medical Sciences; CONSORT, Consolidated Standards of Reporting Trials; CTIMP, Clinical Trial of an Investigational Medicinal Product; DFG, Deutsche Forschungsgemeinschaft; ICMJE, International Committee of Medical Journal Editors; ICTRP, International Clinical Trials Registry Platform; WHO, World Health Organization; WMA, World Medical Association.

Table 1. Sample considered and measured outcome for the trial transparency practices in the dashboard.

Section Practice Sample (denominator) What was measured
Trial registration Prospective registration DRKS (IntoValue): trials in the IntoValue cohort and registered in DRKS (interventional, led by a German UMC, completion date between 2009–2017, and considered as complete based on study status in the registration) with a start date in the registry Was the trial registered in the same month or in a previous month to the trial start date?
ClinicalTrials.gov: recent cohort of trials registered in ClinicalTrials.gov (interventional, led by a German UMC, start date between 2006–2018, and considered as complete based on study status in the registration)
Reporting of the TRN in results publications All trials in the IntoValue cohort (registered in ClinicalTrials.gov or DRKS) with a manuscript publication and a PubMed Identifier (detection of TRN in abstract) or for which the full text could be retrieved (detection of TRN in full text) For trials with a manuscript publication, was the TRN reported (a) in the abstract; (b) in the full text?
Publication link in the trial registry All trials in the IntoValue cohort (registered in ClinicalTrials.gov or DRKS) with a manuscript publication and a DOI or a PubMed Identifier For trials with a manuscript publication, is said publication linked in the registration?
Trial reporting Summary results reporting in the EUCTR Due trials listed on the EU Trials Tracker (and therefore registered in the EUCTR) with a sponsor name corresponding to one of the included UMCs How many due trials registered in the EUCTR have reported summary results in the registry?
Summary results reporting in ClinicalTrials.gov or DRKS All trials in the IntoValue cohort (registered in ClinicalTrials.gov or DRKS) How many due trials registered in ClinicalTrials.gov or DRKS have reported summary results in the registry?
Results reporting within 2 and 5 years of trial completion (summary results or manuscript publication) All trials in the IntoValue cohort (registered in ClinicalTrials.gov or DRKS). Reporting as summary results: only trials with a follow-up time of 2 and 5 years from trial completion to the registry download date were included. Reporting as a manuscript publication: only trials with a follow-up time of 2 and 5 years from trial completion to the manual publication search date were included. Reporting as summary results or manuscript publication: only trials with a follow-up time of 2 and 5 years from (1) trial completion to the registry download date AND (2) trial completion to the manual publication search date were included. How many trials have reported results within 2 and 5 years of trial completion? The following reporting routes were considered: summary results, manuscript publication, summary results or manuscript publication
OA OA status Unique results publications from the IntoValue cohort of trials (registered in ClinicalTrials.gov or DRKS) with a DOI and a publication date in Unpaywall Of all trial publications, how many are openly accessible and via which route (gold OA, hybrid OA, green OA, or bronze OA)?

Overview of the transparency practices assessed and included in the dashboard, along with the sample considered, and the measured outcome. See S5 and S8 Supplements for more detailed information.

DOI, Digital Object Identifier; DRKS, German Clinical Trials Register; EUCTR, EU Clinical Trials Register; OA, Open Access; TRN, Trial Registration Number; UMC, University Medical Center.

Prospective registration

Raw registry data downloaded from ClinicalTrials.gov and DRKS were further processed to determine the registration status of trials. We defined a trial as prospectively registered if it was registered in the same month as, or in any month before, the trial start date.
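The month-level comparison described above can be sketched as follows. This is a minimal illustration, not the study's pipeline code; the assumption (stated in the Methods) is that registration and start dates are compared at month granularity, since some registry records only report month and year.

```python
from datetime import date

def is_prospective(registration_date: date, start_date: date) -> bool:
    """A trial counts as prospectively registered if it was registered
    in the same month as, or any month before, the trial start date.
    Comparison is at month granularity (day of month is ignored)."""
    return (registration_date.year, registration_date.month) <= (start_date.year, start_date.month)

# Registered later in the same month as the start date -> still prospective
print(is_prospective(date(2015, 3, 20), date(2015, 3, 2)))   # True
# Registered the month after the start date -> retrospective
print(is_prospective(date(2015, 4, 1), date(2015, 3, 31)))   # False
```

Comparing `(year, month)` tuples avoids day-level edge cases that month-only registry entries cannot resolve.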

Bidirectional links between registry entries and associated results publications

We extracted links to publications from the registry data and obtained the full text of publications. We then applied regular expressions to detect publication identifiers in registrations, and trial registration numbers (TRNs) in publications. The application of these methods on the IntoValue cohort was reported previously [10].
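As a rough sketch of the TRN detection step, the following uses simple regular expressions for the two registries' identifier formats. These patterns are illustrative assumptions; the expressions used in the published pipeline [10] are more elaborate (handling, e.g., whitespace or punctuation within identifiers).

```python
import re

# Illustrative patterns for the two registries considered in this study.
# ClinicalTrials.gov TRNs are "NCT" + 8 digits; DRKS TRNs are "DRKS" + 8 digits.
TRN_PATTERNS = {
    "ClinicalTrials.gov": re.compile(r"NCT\d{8}"),
    "DRKS": re.compile(r"DRKS\d{8}"),
}

def find_trns(text: str) -> list[str]:
    """Return all trial registration numbers detected in a text
    (e.g., a publication abstract or full text)."""
    hits = []
    for pattern in TRN_PATTERNS.values():
        hits.extend(pattern.findall(text))
    return hits

abstract = "Registered at ClinicalTrials.gov (NCT01234567) and DRKS (DRKS00001234)."
print(find_trns(abstract))  # ['NCT01234567', 'DRKS00001234']
```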

Summary results reporting in the registry

For ClinicalTrials.gov, we extracted the relevant information from the structured summary results field. For DRKS, we detected summary results based on the presence of keywords (e.g., Ergebnisbericht or Abschlussbericht) in the reference title. The summary results date in DRKS was extracted manually from the registry’s change history. We obtained summary results reporting rates in the EUCTR from the EU Trials Tracker. We retrieved historical data (percent reported, total number of due trials, and total number of trials that reported results) from the associated code repository [16].
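The DRKS keyword detection described above might be sketched as below. The keywords are those named in the text; the matching strategy shown (case-insensitive substring search over the reference title) is an assumption, not necessarily the exact rule used in the pipeline.

```python
# Keywords signalling a summary results report in a DRKS reference title,
# per the Methods ("Ergebnisbericht", "Abschlussbericht").
SUMMARY_KEYWORDS = ("ergebnisbericht", "abschlussbericht")

def is_summary_results(reference_title: str) -> bool:
    """Heuristic: does a DRKS reference title indicate summary results?"""
    title = reference_title.lower()
    return any(keyword in title for keyword in SUMMARY_KEYWORDS)

print(is_summary_results("Abschlussbericht zur Studie XY"))  # True
print(is_summary_results("Study protocol (version 2)"))      # False
```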

Reporting as a manuscript publication

The earliest publication found for each trial, and its publication date, were derived from the original IntoValue dataset [35]. Dissertations were excluded from publication-based metrics.

Open Access (OA) status

To determine the OA status of trial results publications, we queried the Unpaywall database via its API on 1 November 2022 using UnpaywallR and assigned one of the following statuses: gold (openly available in an OA journal), hybrid (openly available in a subscription-based journal), green (openly available in a repository), bronze (openly available on the journal page but without a clear open license), or closed. As publications can have several OA versions, we applied a hierarchy such that only one OA status was assigned to each publication, in descending order: gold, hybrid, green, bronze, and closed.
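The descending hierarchy described above can be applied as a simple first-match rule over the OA statuses of all available versions of a publication. This is a sketch of the assignment logic only; retrieving the per-version statuses from the Unpaywall API is not shown.

```python
# Descending OA hierarchy used to assign a single status per publication.
OA_HIERARCHY = ["gold", "hybrid", "green", "bronze", "closed"]

def assign_oa_status(version_statuses: list[str]) -> str:
    """Given the OA statuses of all available versions of a publication,
    return the single status ranked highest in the hierarchy.
    A publication with no OA version is treated as closed."""
    for status in OA_HIERARCHY:
        if status in version_statuses:
            return status
    return "closed"

# A paper available both in a repository (green) and in a subscription
# journal with an open license (hybrid) is counted as hybrid.
print(assign_oa_status(["green", "hybrid"]))  # hybrid
```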

Interactive dashboard

We developed an interactive dashboard to present the outcome of these assessments at the institutional level in an accessible way to the UMC leadership and the wider research community. The dashboard was developed with the Shiny R package (version 1.6.0) [38] based on an initial version developed by NR for the Charité – Universitätsmedizin Berlin [15]. The dashboard was shaped by interviews with UMC leadership, support staff, funders, and experts in responsible research who provided feedback on a prototype version [39]. This feedback led to the inclusion of several features to facilitate the interpretation of the data and contextualize the assessed transparency practices. The code underlying the dashboard developed in this study is openly available in GitHub under an AGPL license (https://github.com/quest-bih/clinical-dashboard) and may be adapted for further use.

Analysis

We generated descriptive statistics on the characteristics of the trials and the transparency practices, all of which are displayed in the dashboard. We report proportions across UMCs (e.g., “Start” page) and per UMC broken down by start year (prospective registration only), completion year, publication year (open access), and registry (publication link in the registry, summary results reporting). We did not test specific hypotheses.
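The Results report 95% confidence intervals for proportions without naming the interval method. As an assumption, the sketch below uses the Wilson score interval, a standard choice for binomial proportions; it reproduces the first interval reported for prospective registration (33%, 95% CI 26% to 40%).

```python
from math import sqrt

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a proportion k/n
    (95% by default via z = 1.96)."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    halfwidth = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - halfwidth, center + halfwidth

# 58 of 178 ClinicalTrials.gov trials started in 2006 were prospectively registered
lo, hi = wilson_ci(58, 178)
print(f"{58/178:.0%} (95% CI {lo:.0%} to {hi:.0%})")  # 33% (95% CI 26% to 40%)
```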

Software, code, and data

Data processing was performed in R (version 4.0.5) [40] and Python 3.9 (Python Software Foundation, Wilmington, Delaware, USA). With the exception of summary results reporting in the EUCTR (data available via the EU Trials Tracker), all the data processing steps involved in generating the dataset displayed in this dashboard are openly available in GitHub: https://github.com/maia-sh/intovalue-data/releases/tag/v1.1. The data displayed in the dashboard are available in OSF [41] and in the dashboard Datasets page. Raw data obtained from trial registries are openly available in Zenodo [42]. This study is reported as per the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guideline for cross-sectional studies (S6 Supplement).

Results

Characteristics of trials

The IntoValue dataset that this study is based on includes interventional trials registered in ClinicalTrials.gov or DRKS, led by a German UMC, and reported as complete between 2009 and 2017 (n = 3,113). Trials were found for 35 out of 38 UMCs searched. After downloading updated registry data for trials in this cohort, we excluded 91 trials based on our exclusion criteria (study completion date before 2009 or after 2017, n = 73; not considered as complete per study status, n = 16; not interventional, n = 2). After removal of duplicates, this led to 2,895 trials that served as the basis for most metrics (Fig 2). For prospective registration in ClinicalTrials.gov, we used a more recent cohort of interventional trials registered in ClinicalTrials.gov, led by a German UMC, started between 2006 and 2018, and considered as complete per study status in the registry (n = 4,058). After applying our inclusion criteria, this sample included 3,618 trials. S7 Supplement provides an overview of the characteristics of included trials stratified by registry. S8 Supplement provides flow diagrams of the trial and publication screening for each metric.

Fig 2. Trial screening.

Fig 2

Flowchart of the trial screening (IntoValue). The box with the thicker contour highlights the starting point of the trial screening for other registry-based metrics (see Flowcharts 1–3 in S8 Supplement). CT.gov, ClinicalTrials.gov; DRKS, German Clinical Trials Register; IV, IntoValue; UMC, University Medical Center.

Evaluation of trial registration and reporting practices

We developed an interactive dashboard (https://quest-cttd.bihealth.org/) to display the results of the evaluation of trial registration and reporting across UMCs. In the following, we highlight some of these results. More extensive evaluations of some of these practices are reported in separate publications, such as results reporting of trials [9,11] and links between trial registration and results publications [10].

Trial registration

Prospective registration: The proportion of trials led by German UMCs that were prospectively registered increased in both ClinicalTrials.gov and DRKS over the period considered. Of 178 trials registered in ClinicalTrials.gov and started in 2006, 58 (33%, 95% confidence interval 26% to 40%) were registered prospectively. A little more than a decade later, 144 of 193 (75%, 95% confidence interval 68% to 80%) trials started in 2018 were registered prospectively. Trials registered in DRKS followed a similar trend: While none of the 44 (0%, 95% confidence interval 0% to 10%) trials started between 2006 and 2008 were prospectively registered, this increased to 19 of 24 (79%, 95% confidence interval 57% to 92%) for trials started in 2017 (S9 Supplement). Among clinical trials registered in ClinicalTrials.gov, the median per-UMC rate of prospective registration ranged from 30% (n = 17/56) to 68% (n = 127/186) with a median of 55% and a standard deviation of 8%. Per-UMC rates of prospective registration in DRKS ranged from 0% (n = 0/1) to 75% (n = 15/20) with a median of 44% and a standard deviation of 15%.

Reporting of a TRN in publications

Of the 1,895 registered trials with a publication indexed in PubMed, 714 (38%, 95% confidence interval 35% to 40%) reported a TRN in the publication abstract. In turn, 1,136 of 1,893 registered trials for which the full text was available reported a TRN in the publication full text (60%, 95% confidence interval 58% to 62%) (S9 Supplement). Only 476 of 1,893 (25%, 95% confidence interval 23% to 27%) of trials reported a TRN in both the abstract and full text of the publication as per the ICMJE and Consolidated Standards of Reporting Trials (CONSORT) guidelines. The per-UMC rate at which clinical trial publications reported a TRN in the abstract ranged from 17% (n = 13/75) to 56% (n = 23/41) with a median of 38% and a standard deviation of 8%. The per-UMC rate at which clinical trial publications reported a TRN in the full text was higher, ranging from 43% (n = 41/95) to 76% (n = 32/42) with a median of 61% and a standard deviation of 7%.

Publication links in the registry

Of 1,493 trials registered in ClinicalTrials.gov with a publication, 861 (58%, 95% confidence interval 55% to 60%) had a link to the publication in the registration. In turn, only 111 of 474 trials registered in DRKS with a publication (23%, 95% confidence interval 20% to 28%) had a link to the publication in the registration. Among trials registered in ClinicalTrials.gov with a publication, the per-UMC rate of publication links in the registration ranged from 32% (n = 12/37) to 88% (n = 28/32) with a median of 56% and a standard deviation of 12%. Among trials registered in DRKS with a publication, the per-UMC rate of publication links in the registration ranged from 0% (n = 0/7) to 45% (n = 5/11) with a median of 23% and a standard deviation of 13%.

Trial reporting

Summary results reporting

We first assessed how many of the trials registered in ClinicalTrials.gov or DRKS had summary results in the registry. The cumulative proportion of trials that reported summary results stagnated at low levels between 2009 and 2017. Only 191 of all 2,253 (8%, 95% confidence interval 7% to 10%) trials registered in ClinicalTrials.gov, and 20 of all 642 (3%, 95% confidence interval 2% to 5%) trials registered in DRKS had summary results in the registry (S9 Supplement). Per-UMC summary results reporting rates for all trials ranged between 0% (n = 0/42) and 32% (n = 8/25) (median of 7% and a standard deviation of 7%) for ClinicalTrials.gov, and between 0% (n = 0/23) and 50% (n = 7/14) (median of 0% and a standard deviation of 9%) for DRKS. In contrast, reporting of summary results in the EUCTR was higher and increased over time: In under 2 years, results reporting for due trials nearly doubled from 41% (n = 223/541, 95% confidence interval 37% to 46%) in December 2020 to 79% (n = 647/813, 95% confidence interval 77% to 82%) in October 2022 (EU Trials Tracker) (S9 Supplement). At the time of data collection (November 2022), per-UMC summary results reporting rates in the EUCTR ranged between 0% (n = 0/1) and 100% (n = 14/14) across all included UMCs with a median of 82% and a standard deviation of 30%.

Timely reporting of results (2- and 5-year reporting rates)

Next, we assessed how many trials registered in ClinicalTrials.gov or DRKS reported results in a timely manner. Reporting guidelines and German research funders have called on clinical trials to report (a) summary results in the registry within 12 and 24 months of trial completion and (b) results in a manuscript publication within 24 months of trial completion [2,4345]. We therefore considered 2 years as timely reporting for both reporting routes. Of 2,892 trials registered in ClinicalTrials.gov or DRKS with a 2-year follow-up period for reporting results as either summary results or a manuscript publication, 1,198 (41%, 95% confidence interval 40% to 43%) had done so within 2 years of trial completion.
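The denominator and numerator logic for the 2-year rate can be sketched as below. The exact definition of the 2-year window (calendar-based vs. a fixed 730 days, used here) is an assumption for illustration; as described in Table 1, trials enter the denominator only if enough follow-up time elapsed before the registry download or publication search date.

```python
from datetime import date, timedelta
from typing import Optional

TWO_YEARS = timedelta(days=2 * 365)  # assumption: fixed-length window

def has_followup(completion: date, search_date: date) -> bool:
    """A trial enters the denominator only if at least 2 years elapsed
    between trial completion and the registry download / search date."""
    return search_date - completion >= TWO_YEARS

def reported_timely(completion: date, report_date: Optional[date]) -> bool:
    """True if results (summary results or publication) were reported
    within 2 years of trial completion; unreported trials count as False."""
    return report_date is not None and report_date - completion <= TWO_YEARS

print(reported_timely(date(2015, 1, 1), date(2016, 6, 1)))  # True
print(reported_timely(date(2015, 1, 1), None))              # False
```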

While the 5-year reporting rate was unsurprisingly higher, 505 of 1,619 trials (31%, 95% confidence interval 29% to 34%) registered in ClinicalTrials.gov or DRKS with 5-year follow-up between trial completion and the manual publication search had not reported results as a journal publication within 5 years of trial completion. Publication in a journal was the dominant route of reporting results, with summary results reporting rates below 10% across all completion years and follow-up periods. Per-UMC reporting rates as a manuscript publication ranged between 15% (n = 7/46) and 58% (n = 19/33) (2-year rate, median 39%, standard deviation 9%) and between 50% (n = 24/48) and 87% (n = 13/15) (5-year rate, median 70%, standard deviation 8%). Per-UMC reporting rates as summary results ranged between 0% (n = 0/76) and 14% (n = 6/43) (2-year rate, median 4%, standard deviation 4%) and between 0% (n = 0/72) and 21% (n = 9/42) (5-year rate, median 5%, standard deviation 5%).

Open Access

OA status

The proportion of trial results publications that were openly accessible (gold, hybrid, green, or bronze) increased from 42% in 2010 (n = 16/38, 95% confidence interval 27% to 59%) to 74% in 2020 (n = 72/97, 95% confidence interval 64% to 82%) (S9 Supplement). Across all publication years, 891 of 1,920 (46%, 95% confidence interval 44% to 49%) trial publications were openly accessible neither via a journal nor via an OA repository, based on Unpaywall data. Per-UMC rates of trial results publications that were OA ranged from 26% (n = 10/38) to 72% (n = 23/32) with a median of 55% and a standard deviation of 10%.
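The gold/hybrid/green/bronze categories above are the `oa_status` values that the Unpaywall API assigns to a DOI (with `closed` for everything else). A minimal sketch of how one might classify a parsed Unpaywall record, illustrative rather than the authors' pipeline:

```python
# A live lookup would use the Unpaywall v2 REST API, e.g.
#   GET https://api.unpaywall.org/v2/{doi}?email=you@example.org
# and return a JSON record containing an "oa_status" field.
OPEN_ROUTES = {"gold", "hybrid", "green", "bronze"}

def is_openly_accessible(record: dict) -> bool:
    """True if Unpaywall assigns any open route; 'closed' or missing counts as not OA."""
    return record.get("oa_status") in OPEN_ROUTES

# Hypothetical records for illustration
print(is_openly_accessible({"doi": "10.1000/example1", "oa_status": "green"}))
print(is_openly_accessible({"doi": "10.1000/example2", "oa_status": "closed"}))
```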

Interactive dashboard

The key outcome of this paper is an interactive and openly accessible dashboard to visualize adherence to the aforementioned best practices for trial registration and reporting across German UMCs: https://quest-cttd.bihealth.org/. The dashboard displays the data in 3 ways: (a) assessment across all UMCs (national dashboard; see a screenshot in Fig 3); (b) comparative assessment between UMCs; and (c) UMC-specific assessment (see a screenshot for one UMC in S10 Supplement).

Fig 3. Screenshot of the home (“Start”) page of the dashboard for clinical trial transparency.

Assessment of 7 registration and reporting practices across all included German UMCs (8 November 2022).

To support interpretation of the data displayed in the dashboard, absolute numbers appear as mouse-overs in all plots. A description of the methods and limitations of each metric is also provided next to each plot, with more detailed information on the Methods page. A FAQ page addresses general considerations raised in interviews with relevant stakeholders [39]. These interviews highlighted the importance of an overall narrative justifying the choice of metrics included. We therefore designed an infographic of relevant laws and guidelines to contextualize the clinical trial transparency metrics included in the dashboard (adapted from Fig 1).

Discussion

Concerns about delayed and incomplete results reporting in clinical research and other sources of research waste have triggered debate on incentivizing individual researchers and UMCs to adopt more responsible research practices [20,22,23]. Here, we introduced the methods and results underlying a dashboard for clinical trial transparency, which provides actionable information on UMCs’ performance in relation to established registration and reporting practices and thereby empowers their efforts to support improvement. This dashboard approach for clinical trial transparency at the level of individual UMCs serves to (a) inform institutions about their performance and set this in relation to national and international transparency guidelines and funder mandates, (b) highlight where there is room for improvement, (c) trigger discussions across relevant stakeholder groups on responsible research practices and their role in assessing research performance, (d) point to success stories and facilitate knowledge sharing between UMCs, and (e) inform the development and evaluation of interventions that aim to increase trial transparency.

Trends in trial transparency

The dashboard displays progress over time and allows the data to be explored in different ways. While the upward trend for several practices (e.g., prospective registration, OA) is encouraging, there is much room for improvement with respect to established guidelines for clinical trial transparency. For example, less than half (45%) of trials registered in ClinicalTrials.gov or DRKS and completed in 2017 reported results in a manuscript publication within 2 years of trial completion as per WHO and funder recommendations [2,43,44]. We observed a striking difference in the cumulative proportion of summary results reporting of drug trials registered in the EUCTR compared with trials registered in ClinicalTrials.gov and DRKS. The uptake of summary results reporting in the EUCTR likely reflects the combined impact of the EU legal requirement for drug trials to report summary results within 12 months [45], the launch of the EU Trials Tracker and subsequent academic initiatives to increase reporting rates [8,18], as well as media attention [46]. This suggests that audits of compliance with respect to established guidelines and further awareness raising may also have the potential to increase results reporting rates of other types of trials.

Actionable areas for stakeholders

Some of the practices included in this dashboard can still be addressed retroactively, such as linking publications in the trial registration (realized for 49% of trials with a publication). These constitute actionable areas for improvement that UMCs can contribute to by providing education, support, and incentives. One important way to incentivize UMCs in this regard is to make responsible research practices part of internal and external quality assessment procedures. Other stakeholders such as funders, journals and publishers, registries, and bibliographic databases should complement these activities by reviewing compliance with their policies as well as applicable guidelines and/or laws. Salholz-Hillel and colleagues, for example, outlined specific recommendations for each stakeholder to improve links between trial registrations and publications [10]. UMCs and their core facilities for clinical research can, for example, use the data linked to the dashboard to inform principal investigators about the transparency of their specific trials. We are currently finalizing such a “report card” approach at the Charité - Universitätsmedizin Berlin [47].

Scalability beyond German UMCs

The datasets and methods used in this study can be scaled: this has been demonstrated in another European country (Poland) [48] and is currently underway in California, USA [49]. While the generation of the underlying dataset of clinical trials and associated results publications involves manual checks (approximately 10 person-hours per 100 trials), the assessment of transparency practices is largely automated. Institutions that already hold an in-house cohort of clinical trial registry numbers and persistent identifiers (e.g., Digital Object Identifiers (DOIs)) for matched journal publications could achieve results more quickly. The code to create the dashboard is openly available and can be adapted to other cohorts.

Stakeholder and community engagement

The uptake of this dashboard approach by UMCs and other stakeholders depends on their respective attitudes and readiness. We previously solicited stakeholders’ views on an institutional dashboard with metrics for responsible research. While interviewees considered the dashboard helpful to see where an institution stands and to initiate change, some pointed to the challenge that making such a dashboard public might risk incorrect interpretation of the metrics and harm UMCs’ reputation [39]. While similar challenges with interpretation and reputation apply to current metrics for research assessment (e.g., impact factors and third-party funding), this stakeholder feedback demonstrates the need for community engagement when introducing novel strategies for research assessment. In this regard, a Delphi study was performed to reach consensus on a core outcome set of open science practices within biomedicine to support audits at the institutional level [50]. A detailed comparative assessment of existing monitoring initiatives and lessons learned could further support these efforts.

Updates and further development of the dashboard

We are planning regular updates of the registry data for trials already in the dashboard, as well as the inclusion of more recent cohorts of trials with at least 2 years of follow-up (e.g., trials completed 2018 to 2021 assessed in 2023). Besides these updates, further transparency practices may be integrated into the dashboard in the future, e.g., dissemination of results as preprints, the use of self-archiving to broaden access to results [51], adherence to reporting guidelines [3], or data sharing [52]. Beyond transparency, other potential metrics could reflect the number of discontinued trials [53] or the proportion of trials that inform clinical practice [54]. The development of such metrics should acknowledge the availability of standards and infrastructure pertaining to the underlying practices [23] and differences between study types and disciplines [27]. Future versions of the dashboard may also display additional subpopulation comparisons, such as different clinical trial registries or UMC particularities [55].

Limitations

A limitation of this study is that inaccurate or outdated registry data (e.g., incorrect completion dates or trial status) may have impacted the assessment of transparency practices described in this study. To mitigate this limitation, we updated the registry data with the most recent data we could obtain. The update-related changes suggest no systematic bias in the comparison across UMCs. Another limitation is that the trial dataset may contain more cross-registrations than we identified. For the aforementioned “report card” project, we manually verified 168 trials and found only 2 missed cross-registrations (1%). We therefore believe that missed cross-registrations represent only a small portion of our sample. Moreover, the assessment of each practice in the dashboard applies to a specific subset of trials or publications and comes with unique limitations, largely resulting from challenges associated with manual or automated methods (outlined in more detail in S5 Supplement). More generally, the dashboard focuses on interventional trials registered in ClinicalTrials.gov or DRKS and does not display how German UMC drug trials only registered in the EUCTR perform on established transparency practices (except for summary results reporting in the registry). We are considering including all drug trials in the EUCTR conducted by German UMCs in future developments of the dashboard.

Conclusions

UMCs play an important role in fostering clinical trial transparency but face challenges doing so in the absence of baseline assessments of current practice. We assessed adherence to established practices for clinical trial registration and reporting at German UMCs and communicated the results in the form of an interactive dashboard. We observed room for improvement across all assessed practices, some of which can still be addressed retroactively. The dashboard provides actionable information to drive improvement, facilitates knowledge sharing between UMCs, and informs the development of interventions to increase research transparency.

Supporting information

S1 Supplement. Use of automated vs. manual approaches across methods.

(PDF)

S2 Supplement. Inclusion and exclusion criteria.

(PDF)

S3 Supplement. Selected sponsor names in the EU Trials Tracker.

(PDF)

S4 Supplement. Sources for Fig 1.

(PDF)

S5 Supplement. Detailed methods and limitations of registration and reporting metrics.

(PDF)

S6 Supplement. STROBE checklist for cross-sectional studies.

(PDF)

S7 Supplement. Characteristics of included trials.

(PDF)

S8 Supplement. Flow diagrams of the trial and publication screening.

(PDF)

S9 Supplement. Screenshots of the “Start” page of the dashboard.

(PDF)

S10 Supplement. Screenshot of the “One UMC” page of the dashboard.

(PDF)

Acknowledgments

We would like to acknowledge Tamarinde Haven and Martin Holst for their valuable input that shaped the dashboard. We acknowledge financial support from the Open Access Publication Fund of Charité – Universitätsmedizin Berlin and the German Research Foundation (DFG).

Abbreviations

CONSORT

Consolidated Standards of Reporting Trials

CTIMP

Clinical Trial of an Investigational Medicinal Product

DOI

Digital Object Identifier

DORA

Declaration on Research Assessment

DRKS

Deutsches Register Klinischer Studien (German Clinical Trials Register)

EUCTR

EU Clinical Trials Register

FDAAA

Food and Drug Administration Amendments Act

ICMJE

International Committee of Medical Journal Editors

OA

Open Access

OSF

Open Science Framework

STROBE

Strengthening the Reporting of Observational Studies in Epidemiology

TRN

trial registration number

UMC

University Medical Center

WHO

World Health Organization

Data Availability

The authors confirm that all data underlying the findings are fully available without restriction. The dashboard is openly available at: https://quest-cttd.bihealth.org/. Code to produce the dashboard is openly available in GitHub at: https://github.com/quest-bih/clinical-dashboard. Code to generate the dataset displayed in the dashboard is openly available in GitHub: https://github.com/maia-sh/intovalue-data/releases/tag/v1.1. Data can be downloaded from the dashboard and are openly available on OSF at: https://osf.io/26dgx/. Raw data obtained from trial registries are openly available on Zenodo at: https://doi.org/10.5281/zenodo.7590083. Data for summary results reporting in the EUCTR are available via the EU Trials Tracker.

Funding Statement

This work was funded by the Federal Ministry of Education and Research of Germany (BMBF 01PW18012, https://www.bmbf.de). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Zarin DA, Tse T, Williams RJ, Rajakannan T. Update on Trial Registration 11 Years after the ICMJE Policy Was Established. N Engl J Med. 2017;376:383–391. doi: 10.1056/NEJMsr1601330
2. World Health Organization. Joint statement on public disclosure of results from clinical trials. World Health Organization [Internet]. 2017 [cited 2021 Jun 15]. Available from: https://www.who.int/news/item/18-05-2017-joint-statement-on-registration.
3. Hopewell S, Clarke M, Moher D, Wager E, Middleton P, Altman DG, et al. CONSORT for Reporting Randomized Controlled Trials in Journal and Conference Abstracts: Explanation and Elaboration. PLoS Med. 2008;5:e20. doi: 10.1371/journal.pmed.0050020
4. International Committee of Medical Journal Editors (ICMJE). Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals. ICMJE [Internet]. 2021 [cited 2021 Dec 21]. Available from: http://www.icmje.org/icmje-recommendations.pdf.
5. Borysowski J, Wnukiewicz-Kozłowska A, Górski A. Legal regulations, ethical guidelines and recent policies to increase transparency of clinical trials. Br J Clin Pharmacol. 2020;86:679–686. doi: 10.1111/bcp.14223
6. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016;3:160018. doi: 10.1038/sdata.2016.18
7. Chen R, Desai NR, Ross JS, Zhang W, Chau KH, Wayda B, et al. Publication and reporting of clinical trial results: cross sectional analysis across academic medical centers. BMJ. 2016;352:i637. doi: 10.1136/bmj.i637
8. Goldacre B, DeVito NJ, Heneghan C, Irving F, Bacon S, Fleminger J, et al. Compliance with requirement to report results on the EU Clinical Trials Register: cohort study and web resource. BMJ. 2018;362:k3218. doi: 10.1136/bmj.k3218
9. Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, Schürmann C, et al. Result dissemination from clinical trials conducted at German university medical centers was delayed and incomplete. J Clin Epidemiol. 2019;115:37–45. doi: 10.1016/j.jclinepi.2019.06.002
10. Salholz-Hillel M, Strech D, Carlisle BG. Results publications are inadequately linked to trial registrations: An automated pipeline and evaluation of German university medical centers. Clin Trials. 2022:1–10. doi: 10.1177/17407745221087456
11. Riedel N, Wieschowski S, Bruckner T, Holst MR, Kahrass H, Nury E, et al. Results dissemination from completed clinical trials conducted at German university medical centers remained delayed and incomplete. The 2014–2017 cohort. J Clin Epidemiol. 2022;144:1–7. doi: 10.1016/j.jclinepi.2021.12.012
12. European Commission. Open science monitor. European Commission | Research and innovation [Internet]. [cited 2021 Nov 12]. Available from: https://ec.europa.eu/info/research-and-innovation/strategy/strategy-2020-2024/our-digital-future/open-science/open-science-monitor_en.
13. Open Access Monitor [Internet]. [cited 2021 Mar 22]. Available from: https://open-access-monitor.de/.
14. French Open Science Monitor in health. French Open Science Monitor [Internet]. [cited 2022 Feb 4]. Available from: https://frenchopensciencemonitor.esr.gouv.fr/health.
15. BIH-QUEST Center for Responsible Research at Charité - Universitätsmedizin Berlin. Charité Dashboard on Responsible Research. Charité Metrics Dashboard [Internet]. 2021. Available from: https://quest-dashboard.charite.de.
16. EBM DataLab. EU Trials Tracker. 2018 [cited 2021 Jul 26]. Available from: https://eu.trialstracker.net.
17. EBM DataLab. FDAAA TrialsTracker. 2018. Available from: https://fdaaa.trialstracker.net/.
18. Brückner T. European health groups demand action over 4,046 missing drug trial results. TranspariMED [Internet]. 2021 May 6 [cited 2021 Nov 2]. Available from: https://www.transparimed.org/single-post/ctimps-results-reporting.
19. Cochrane Sweden. Cochrane Sweden collaborates on trial transparency report. Cochrane [Internet]. 2020 [cited 2021 Nov 2]. Available from: https://www.cochrane.org/news/cochrane-sweden-collaborates-trial-transparency-report.
20. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: Fostering research integrity. PLoS Biol. 2020;18:e3000737. doi: 10.1371/journal.pbio.3000737
21. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLoS Biol. 2018;16:e2004089. doi: 10.1371/journal.pbio.2004089
22. Begley CG, Buchan AM, Dirnagl U. Robust research: Institutions must do their part for reproducibility. Nat News. 2015;525:25. doi: 10.1038/525025a
23. Strech D, Weissgerber T, Dirnagl U, on behalf of the QUEST Group. Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLoS Biol. 2020;18:e3000576. doi: 10.1371/journal.pbio.3000576
24. Holst MR, Faust A, Strech D. Do German university medical centres promote robust and transparent research? A cross-sectional study of institutional policies. Health Res Policy Syst. 2022;20:39. doi: 10.1186/s12961-022-00841-2
25. Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ. 2020;369:m2081. doi: 10.1136/bmj.m2081
26. Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383:166–175. doi: 10.1016/S0140-6736(13)62227-8
27. San Francisco Declaration on Research Assessment. DORA [Internet]. 2012 [cited 2020 Dec 16]. Available from: https://sfdora.org/read/.
28. Coalition for Advancing Research Assessment. Agreement on reforming research assessment. CoARA [Internet]. 2022 Jul 20. Available from: https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf.
29. UNESCO. UNESCO Recommendation on Open Science. UNESCO | Open Science [Internet]. 2021 [cited 2022 Apr 29]. Available from: https://en.unesco.org/science-sustainable-future/open-science/recommendation.
30. Riedel N, Kip M, Bobrov E. ODDPub – a Text-Mining Algorithm to Detect Data Sharing in Biomedical Publications. Data Sci J. 2020;19:42. doi: 10.5334/dsj-2020-042
31. Serghiou S, Contopoulos-Ioannidis DG, Boyack KW, Riedel N, Wallach JD, Ioannidis JPA. Assessment of transparency indicators across the biomedical literature: How open is open? PLoS Biol. 2021;19:e3001107. doi: 10.1371/journal.pbio.3001107
32. Weissgerber T, Riedel N, Kilicoglu H, Labbé C, Eckmann P, ter Riet G, et al. Automated screening of COVID-19 preprints: can we help authors to improve transparency and reproducibility? Nat Med. 2021;27:6–7. doi: 10.1038/s41591-020-01203-7
33. Wieschowski S, Kahrass H, Schürmann C, Strech D, Riedel N, Siegerink B, et al. IntoValue. OSF. 2017 [cited 2022 Nov 9]. doi: 10.17605/OSF.IO/FH426
34. Wieschowski S, Strech D, Riedel N, Kahrass H, Bruckner T, Holst M, et al. In2Value 2. OSF. 2020 Jun. Available from: https://osf.io/98j7u/.
35. Riedel N, Wieschowski S, Bruckner T, Holst MR, Kahrass H, Nury E, et al. Dataset for the IntoValue 1 + 2 studies on results dissemination from clinical trials conducted at German university medical centers completed between 2009 and 2017. Zenodo. 2021. doi: 10.5281/zenodo.5141343
36. Mitglieder | Medizinischer Fakultätentag [Members | German Association of Medical Faculties]. Medizinischer Fakultätentag [Internet]. [cited 2020 Aug 31]. Available from: https://medizinische-fakultaeten.de/verband/mitglieder/.
37. World Health Organization (WHO). Glossary. World Health Organization [Internet]. [cited 2021 Dec 21]. Available from: https://www.who.int/clinical-trials-registry-platform/about/glossary.
38. Chang W, Cheng J, Allaire J, Sievert C, Schloerke B, Xie Y, et al. shiny: Web Application Framework for R. 2021. Available from: https://CRAN.R-project.org/package=shiny.
39. Haven TL, Holst MR, Strech D. Stakeholders' views on an institutional dashboard with metrics for responsible research. PLoS ONE. 2022;17:e0269492. doi: 10.1371/journal.pone.0269492
40. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2021. Available from: https://www.R-project.org/.
41. Franzen D, Carlisle BG, Salholz-Hillel M, Riedel N, Strech D. Dataset for: An institutional dashboard to drive clinical trial transparency. OSF. 2022. doi: 10.17605/OSF.IO/26DGX
42. Salholz-Hillel M, Franzen D, Carlisle BG, Riedel N. Raw Data for IntoValue Dataset. Zenodo. 2023. doi: 10.5281/zenodo.7590083
43. Deutsche Forschungsgemeinschaft. Klinische Studien: Stellungnahme der Arbeitsgruppe „Klinische Studien" der DFG-Senatskommission für Grundsatzfragen in der Klinischen Forschung [Clinical studies: Statement of the "Clinical Studies" working group of the DFG Senate Commission on Key Questions in Clinical Research]. 2018. Available from: https://www.dfg.de/download/pdf/dfg_im_profil/geschaeftsstelle/publikationen/stellungnahmen_papiere/2018/181025_stellungnahme_ag_klinische_studien.pdf.
44. BMBF. Grundsätze und Verantwortlichkeiten bei der Durchführung klinischer Studien [Principles and responsibilities in the conduct of clinical studies]. 2019. Available from: https://projekttraeger.dlr.de/media/gesundheit/GF/Grundsaetze_Verantwortlichkeiten_Klinische_Studien.pdf.
45. European Commission. Commission Guideline: Guidance on posting and publication of result-related information on clinical trials in relation to the implementation of Article 57(2) of Regulation (EC) No 726/2004 and Article 41(2) of Regulation (EC) No 1901/2006. Off J Eur Union. 2012:7–10.
46. Berndt C, Grill M. In Deutschland erforscht, im Nirwana versunken [Researched in Germany, vanished into nirvana]. Süddeutsche.de [Internet]. [cited 2022 Jul 14]. Available from: https://www.sueddeutsche.de/gesundheit/veroeffentlichung-studien-1.4737316.
47. Salholz-Hillel M, Franzen D, Müller-Ohlraun S, Strech D. Protocol: Clinical trial "report cards" to improve transparency at Charité and BIH: Survey and Intervention. OSF. [cited 2022 Jul 14]. Available from: https://osf.io/dk6gm.
48. Strzebonska K, Wasylewski MT, Zaborowska L, Riedel N, Wieschowski S, Strech D, et al. Results dissemination of registered clinical trials across Polish academic institutions: a cross-sectional analysis. BMJ Open. 2020;10:e034666. doi: 10.1136/bmjopen-2019-034666
49. Wieschowski S, Strech D, Franzen D, Salholz-Hillel M, Carlisle BG, Malički M, et al. CONTRAST – CalifOrNia TRiAlS Transparency. OSF. 2022 May. Available from: https://osf.io/u9d5c/.
50. Cobey KD, Haustein S, Brehaut J, Dirnagl U, Franzen DL, Hemkens LG, et al. Community consensus on core open science practices to monitor in biomedicine. PLoS Biol. 2023;21:e3001949. doi: 10.1371/journal.pbio.3001949
51. Franzen DL. Leveraging Open Tools to Realize the Potential of Self-Archiving: A Cohort Study in Clinical Trials. Publications. 2023;11:4. doi: 10.3390/publications11010004
52. Ohmann C, Banzi R, Canham S, Battaglia S, Matei M, Ariyo C, et al. Sharing and reuse of individual participant data from clinical trials: principles and recommendations. BMJ Open. 2017;7:e018647. doi: 10.1136/bmjopen-2017-018647
53. Speich B, Gryaznov D, Busse JW, Gloy VL, Lohner S, Klatte K, et al. Nonregistration, discontinuation, and nonpublication of randomized trials: A repeated metaresearch analysis. PLoS Med. 2022;19:e1003980. doi: 10.1371/journal.pmed.1003980
54. Hutchinson N, Moyer H, Zarin DA, Kimmelman J. The proportion of randomized controlled trials that inform clinical practice. eLife. 2022;11:e79491. doi: 10.7554/eLife.79491
55. Thiele C, Hirschfeld G. Registration quality and availability of publications for clinical trials in Germany and the influence of structural factors. PLoS ONE. 2022;17:e0267883. doi: 10.1371/journal.pone.0267883

Decision Letter 0

Caitlin Moyer

3 May 2022

Dear Dr Franzen,

Thank you for submitting your manuscript entitled "An institutional dashboard to drive clinical research transparency: An open and scalable case study at University Medical Centers" for consideration by PLOS Medicine.

Your manuscript has now been evaluated by the PLOS Medicine editorial staff and I am writing to let you know that we would like to send your submission out for external peer review.

However, before we can send your manuscript to reviewers, we need you to complete your submission by providing the metadata that is required for full assessment. To this end, please login to Editorial Manager where you will find the paper in the 'Submissions Needing Revisions' folder on your homepage. Please click 'Revise Submission' from the Action Links and complete all additional questions in the submission questionnaire.

Please re-submit your manuscript within two working days, i.e. by May 05 2022 11:59PM.

Login to Editorial Manager here: https://www.editorialmanager.com/pmedicine

Once your full submission is complete, your paper will undergo a series of checks in preparation for peer review. Once your manuscript has passed all checks it will be sent out for review.

Feel free to email us at plosmedicine@plos.org if you have any queries relating to your submission.

Kind regards,

Caitlin Moyer, Ph.D.

Associate Editor

PLOS Medicine

Decision Letter 1

Caitlin Moyer

13 Sep 2022

Dear Dr. Franzen,

Thank you very much for submitting your manuscript "An institutional dashboard to drive clinical research transparency: An open and scalable case study at University Medical Centers" (PMEDICINE-D-22-01430R1) for consideration at PLOS Medicine.

Your paper was evaluated by a senior editor and discussed among all the editors here. It was also discussed with an academic editor with relevant expertise, and sent to four independent reviewers, including a statistical reviewer. The reviews are appended at the bottom of this email and any accompanying reviewer attachments can be seen via the link below:

[LINK]

In light of these reviews, I am afraid that we will not be able to accept the manuscript for publication in the journal in its current form, but we would like to consider a revised version that addresses the reviewers' and editors' comments. Obviously we cannot make any decision about publication until we have seen the revised manuscript and your response, and we plan to seek re-review by one or more of the reviewers.

In revising the manuscript for further consideration, your revisions should address the specific points made by each reviewer and the editors. Please also check the guidelines for revised papers at http://journals.plos.org/plosmedicine/s/revising-your-manuscript for any that apply to your paper. In your rebuttal letter you should indicate your response to the reviewers' and editors' comments, the changes you have made in the manuscript, and include either an excerpt of the revised text or the location (eg: page and line number) where each change can be found. Please submit a clean version of the paper as the main article file; a version with changes marked should be uploaded as a marked up manuscript.

In addition, we request that you upload any figures associated with your paper as individual TIF or EPS files with 300dpi resolution at resubmission; please read our figure guidelines for more information on our requirements: http://journals.plos.org/plosmedicine/s/figures. While revising your submission, please upload your figure files to the PACE digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at PLOSMedicine@plos.org.

We expect to receive your revised manuscript by Oct 04 2022 11:59PM. Please email us (plosmedicine@plos.org) if you have any questions or concerns.

***Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.***

We ask every co-author listed on the manuscript to fill in a contributing author statement, making sure to declare all competing interests. If any of the co-authors have not filled in the statement, we will remind them to do so when the paper is revised. If all statements are not completed in a timely fashion this could hold up the re-review process. If new competing interests are declared later in the revision process, this may also hold up the submission. Should there be a problem getting one of your co-authors to fill in a statement we will be in contact. YOU MUST NOT ADD OR REMOVE AUTHORS UNLESS YOU HAVE ALERTED THE EDITOR HANDLING THE MANUSCRIPT TO THE CHANGE AND THEY SPECIFICALLY HAVE AGREED TO IT. You can see our competing interests policy here: http://journals.plos.org/plosmedicine/s/competing-interests.

Please use the following link to submit the revised manuscript:

https://www.editorialmanager.com/pmedicine/

Your article can be found in the "Submissions Needing Revision" folder.

To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

Please ensure that the paper adheres to the PLOS Data Availability Policy (see http://journals.plos.org/plosmedicine/s/data-availability), which requires that all data underlying the study's findings be provided in a repository or as Supporting Information. For data residing with a third party, authors are required to provide instructions with contact information for obtaining the data. PLOS journals do not allow statements supported by "data not shown" or "unpublished results." For such statements, authors must provide supporting data or cite public sources that include it.

We look forward to receiving your revised manuscript.

Sincerely,

Caitlin Moyer, Ph.D.

Associate Editor

PLOS Medicine

plosmedicine.org

-----------------------------------------------------------

Requests from the editors:

From the Academic Editor:

1. The authors should pay attention to avoid citation shortcuts. On some occasions, this makes the manuscript difficult to follow and blurs the line between what is new and what was already published. This being said, the paper is very well written, all previous papers are appropriately cited, and a web appendix and online materials are available.

2. Authors also cite various initiatives (trial tracker, the French Monitor for Open Science, etc). I would be very interested in a detailed comparative assessment of these existing initiatives (in terms of metrics used, timeframe, etc.). Authors cite an initiative about developing a core outcome set. Describing all outcomes used by existing initiatives may inform the development of such a core outcome set.

3. The dashboard presented in Figure 2 displays mean values. Of course, this is important. But as this is a dashboard that aims to explore various institutions, one would be interested in values of all institutions (including the max and the min). I wonder if it would be possible to display this information.

4. One major problem of these monitors is that they are based on data which are likely biased. I'm not sure that trials registries are appropriately updated by all UMCs, e.g. regarding study status. It may introduce a lot of bias in the output, and this bias could be differential when one compares UMCs. This possible issue should be discussed in more detail as it is a major limitation of this comparative approach.

5. Authors may want to discuss their results in line with those of Speich et al., PLOS Medicine, 2022 and Thiele et al., PLOS ONE, 2022;

6. Introduction:

– I would stress that there is a very favorable environment for these initiatives;

– More emphasis on DORA and the Hong Kong principles is needed;

– I would also welcome more emphasis on the European approach for rewarding reproducible research practices and Open Science;

– I would also cite the UNESCO approach to Open Science;

7. Methods:

– I would provide more details on the automatic versus manual parts of the extraction, as this is probably a very big problem for reproducibility regarding data extraction;

– A citation is needed concerning the list of German UMCs that was used;

– p6, line 115: a citation is needed after "has been described previously";

– Again, the interactive dashboard is the most important and original part and it needs more emphasis;

8. Results:

– The various flow charts in the supplements are very informative but difficult to follow. I was wondering if a single flow chart with various branches would be a better figure (I acknowledge that this part is very challenging to represent adequately);

9. Discussion:

– Authors should elaborate a little bit more on the frequency of updates, etc.

– Authors should also discuss the implications of this dashboard: for instance, should the trial registry implement such a dashboard directly? Which international organisation may be interested in using such a dashboard at a large scale? How difficult would it be to transfer such a dashboard from Germany to the rest of the world?

– Authors should stress that implementation of such a dashboard will not guarantee that UMCs will be more transparent. What kind of research could be envisioned to answer this question? I would love to see more elaboration on this point.

Other editorial points:

10. Title: Please revise your title according to PLOS Medicine's style. Your title must be nondeclarative and not a question. It should begin with the main concept if possible. "Effect of" should be used only if causality can be inferred, i.e., for an RCT. Please place the study design ("A randomized controlled trial," "A retrospective study," "A modelling study," etc.) in the subtitle (i.e., after a colon).

11. Data availability statement: Please also include a link to the dashboard, where it will be made publicly available.

12. Abstract: Please combine the Methods and Findings sections into one section, “Methods and findings”.

13. Abstract: Line 11: Please define DRKS at first use.

14. Abstract: Methods and Findings: Please describe how adherence to established transparency practices was assessed.

15. Abstract: Methods and Findings: Please ensure that all numbers presented in the abstract are present and identical to numbers presented in the main manuscript text.

16. Abstract: Methods and Findings: In the last sentence of the Abstract Methods and Findings section, please describe the main limitation(s) of the study's methodology.

17. Abstract: Conclusions: Please address the study implications without overreaching what can be concluded from the data; the phrase "In this study, we observed ..." may be useful.

18. Author summary: At this stage, we ask that you include a short, non-technical Author Summary of your research to make findings accessible to a wide audience that includes both scientists and non-scientists. The Author Summary should immediately follow the Abstract in your revised manuscript. This text is subject to editorial change and should be distinct from the scientific abstract. Please see our author guidelines for more information: https://journals.plos.org/plosmedicine/s/revising-your-manuscript#loc-author-summary

19. Main text: Please place in-text citations for references within square brackets. Where multiple references are indicated, please do not include spaces within brackets.

20. Main text: Please define all acronyms at first use in the text.

21. Methods: For all observational studies, in the manuscript text, please indicate: (1) the specific hypotheses you intended to test, (2) the analytical methods by which you planned to test them, (3) the analyses you actually performed, and (4) when reported analyses differ from those that were planned, transparent explanations for differences that affect the reliability of the study's results. If a reported analysis was performed based on an interesting but unanticipated pattern in the data, please be clear that the analysis was data-driven.

22. Methods: Did your study have a prospective protocol or analysis plan? Please state this (either way) early in the Methods section.

a) If a prospective analysis plan (from your funding proposal, IRB or other ethics committee submission, study protocol, or other planning document written before analyzing the data) was used in designing the study, please include the relevant prospectively written document with your revised manuscript as a Supporting Information file to be published alongside your study, and cite it in the Methods section. A legend for this file should be included at the end of your manuscript.

b) If no such document exists, please make sure that the Methods section transparently describes when analyses were planned, and when/why any data-driven changes to analyses took place.

c) In either case, changes in the analysis-- including those made in response to peer review comments-- should be identified as such in the Methods section of the paper, with rationale.

23. Methods: Please report your study according to the relevant guideline, which can be found here: http://www.equator-network.org/

Please ensure that the study is reported according to the STROBE or other guideline, and include the completed STROBE (or other) checklist as Supporting Information. When completing the checklist, please use section and paragraph numbers, rather than page numbers. Please add the following statement, or similar, to the Methods: "This study is reported as per the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guideline (S1 Checklist)." (Or please adapt to the most appropriate guideline for your study.)

24. Methods Line 93-95: Although described elsewhere, it would be helpful to have the methods for sampling and tracking results described here.

25. Results: Please present numerators and denominators for percentages.

26. Discussion: Please present and organize the Discussion as follows: a short, clear summary of the article's findings; what the study adds to existing research and where and why the results may differ from previous research; strengths and limitations of the study; implications and next steps for research, clinical practice, and/or public policy; one-paragraph conclusion.

27. Lines 336-349: Please remove the Funding, Author contributions, and Competing Interests sections from the main text. Please be sure all information is completely and accurately entered into the manuscript submission system.

28. References: Please use the "Vancouver" style for reference formatting, and see our website for other reference guidelines https://journals.plos.org/plosmedicine/s/submission-guidelines#loc-references

29. Table 1: In the legend, please define all abbreviations used in the table.

30. Supplement S2: In the legend, please define all abbreviations used in the table.

31. Supplement S4: In the legend, please define all abbreviations used in the table.

32. Supplement S5: Please define all abbreviations used in the flowcharts.

Comments from the reviewers:

Reviewer #1: 1. Please report p-values and 95% confidence intervals in the abstract and results section

2. Line 162/165: I suggest adding more details about why the 3790 − 2915 = 875 studies were excluded

3. The analysis mainly focused on the description of the overall "clinical transparency" over time. What are the underlying reasons for the changes over time? Is it possible to examine the heterogeneity in the outcomes by study characteristics (e.g., individual UMC, size of study sample, disease category, results of the trial [effective or not])?

Reviewer #2: I would like to congratulate the authors on a thorough review of trial registration and results posting in the German Academic research environment. Although we could not see it completely, I also applaud the creation of a Dashboard for future data display.

I have a few comments for the authors' consideration.

The authors need to carefully distinguish disclosure (which is what they discuss) from transparency. Disclosure is a necessary but not sufficient step in transparency and open science. As the authors point out, the disclosures need to be findable and accessible to meet the FAIR principles. It would be good for the authors to reference FAIR upfront, as the principles underpin open science in general. You have clearly outlined the principles on page 3, line 38.

There needs to be care in comparing results to ClinicalTrials.gov. Note that the requirement to register and post results does not apply to Phase I trials, and that might or might not affect the denominator. Also, before 2017 the results requirements only applied to trials of approved medicines, so again the denominator for posted results may not be ALL trials. It is true that the ICMJE criteria were broader than that and, while of great influence, were not strictly speaking legal requirements.

P 3 line 40 You refer to "registered" with reference to results- I assume you mean there are results posted for trials that were not registered?

P 5 - It would be useful to define what you mean by "prospective". For CT.Gov the requirement is to register no later than 21 days after the first patient is enrolled. Is that the definition you are using?

P 8 line 163. Under results when you use the term "registered" are you using the "prospective" definition or all? In other places you are specific.

P 9 line 192: I agree that part of being findable means referring to the TRN. It is disappointing that journals are not enforcing that, and perhaps that needs to be called out.

In the discussion you mention for the first time in the paper preclinical research, and while the statement may be true, you have not addressed it anywhere else, so I suggest you delete it here as it is a new concept. You also refer to "clinical research" in reference to your dashboard, but it is really clinical trials, correct? There are many discussions surrounding the registration of observational studies, and I assume your dashboard is restricted to interventional trials as noted in the beginning?

Reviewer #3: PMedicine D-22-01430

1. The authors describe the development of a dashboard to illuminate execution of commitments to transparency in clinical trials. Note that many of the substantive results have been published previously, including trial registration, results reporting of trials, linkage of publications to trial registration and others. Therefore, the description and utility of the dashboard is the substantive new contribution, but it is rather briefly described (e.g., one paragraph in the manuscript). The description, development, user engagement, potential adaptations etc should be more fully described.

2. It would be very important to deduplicate the trials that are reported both in Clinicaltrials.gov and DRKS and/or divide the results by those that are registered and reported in one but not the other or both. The expectations and nature of the trials may differ.

3. Did the authors analyze by type of trial (regulated, drug/biologic/device/behavioral, industry v other, etc.) differences in registration, results reporting, linkage of registration to publications, and the other outcome measures? If not, why not?

4. The authors have used a dataset, published previously, that includes trials from 2009-2017. The standards and cultural expectations of transparency, including registration and results reporting, ORCID ID associations, publication preprint and access, have changed significantly in the last 5 years. It is unclear, if this approach is to be scalable and replicable as claimed, why the data was not updated to current state or at a minimum, a more recent end date (July 2021). Only the metric for publications would need to be adjusted for time, given the 2 year window for publication. All other metrics could be shown. A 5 year delay (or, as apparent annual updates are planned that include data from 3 years prior) is hardly relevant to current activities either by institution or sponsor.

a. It is unclear why results reporting at 5 years is less than at 2 years. If a trial is reported within 2 years, isn't it also included in the 5 year number? It is certainly reported at 5 years if reported at 2.

I suggest the authors update the data files and statistics, and replace the results as discussed below.

5. Similarly, for prospective registration of clinical trials, the authors used "a more recent cohort of interventional trials" (2006-2018). The field has changed dramatically since then, so why not use a truly recent cohort, e.g., one ending in 2021? Pooled statistics over that much time, as described in the manuscript, are not helpful.

6. A major deficiency of the manuscript is the failure to describe the trends in the reported outcomes (trial registration, results reporting, linkage of registration to publications, etc.) as the substance of the results section. There have been significant changes since 2006; pooling all the results necessarily decreases the utility of the findings. Coupled with the lack of recent data, this is highly problematic. The data should be reported by year, and by institution, and interpreted at more granular level of analysis. Graphs are shown in the figures, but the trends are not the focus of the results described.

7. The authors describe variability between and among UMC for trial registration but do not investigate that further. Are the trials that are not 'pre' registered applicable clinical trials? Are the trials not registered on Clinicaltrials.gov nevertheless registered on DRKS? What are the implications of interventional trials that are registered, but not pre-registered—did they need to be preregistered? Is the coding correct?

8. The authors claim that this manuscript describes "stakeholder engagement to provide UMCs with actionable information on trial transparency in an accessible and interactive format, and thereby empower their efforts to support improvement." This is not described in the current manuscript. Stakeholder engagement and how the stakeholders used or did not use the data to understand or change current practices are not described.

9. The authors should appreciate that some of their suggestions are not accomplished with "minimal effort" as claimed. A simple update (retrospectively linking publications in the trial registration) in Clinicaltrials.gov "opens" the trial record, and the entire record must then be updated to current requirements. The number of changes to requirements and the effort to do so are significant and a disincentive to retrospective changes.

10. Note that https://quest-cttd.bihealth.org requires a log in, which, given confidentiality provisions for reviewers, is not possible. The manuscript does alert to the fact that it will be open access upon publication, but it makes it difficult to review the utility, user-experience, and impact in the absence of access. Authors' claims to its novelty and importance (e.g., "provid[ing] actionable information") cannot be evaluated.

Reviewer #4:

This is a useful and interesting article on a novel Dashboard that provides seven parameters on the quality of trial registration and reporting for German University Medical Centres (UMCs). The centrepiece of the article is really the Figure showing the Dashboard for all UMCs. However, the text also provides some interesting and novel data on registration and reporting features, such as the proportion of published trials that report the trial registration number in both abstract and text, and the number of published trials which have provided a summary on the trials registry - which is appallingly low. While the article made interesting reading, I was left wondering what those in the University medical centres might think of this dashboard. Some suggestions for improvement:

1/ It would help readers if the numbers in the text were more clearly linked to the Dashboard in Figure 2. For example,

* line 178 describes the 74% of trials that are prospectively registered, and

* line 185 the 38% that have the trial registration number in the abstract.

These are the first two tiles of the Figure 2 Dashboard, and could be explicitly referred to in the text to help the reader link the text and the dashboard.

2/ I was puzzled that 44% of trials had published within two years but only 7% within five years - which seems contradictory. Can the authors explain this please?

3/ I was expecting and hoping that the Discussion would explore some of the practicalities and barriers for usage in practice by the UMCS. But there seemed to be little on this. Some of the questions I would like to see answered are:

Q1. How much manual work was required to produce the dashboards (that is the number of person-months or costs for doing this) and how much might that be for an annual process?

Q2. Has the dashboard been actively used by any current UMCs (e.g., the authors' institution), and what has been their reaction to it?

Q3. Clinical trials are an important but small proportion of all clinical research, and that proportion will vary widely across UMCs. Might this be included in the Dashboard and the reports to UMCs, and/or are there plans for extending the range of reporting to other types of studies?

Such questions might be usefully addressed in the Discussion.

Any attachments provided with reviews can be seen via the following link:

[LINK]

Decision Letter 2

Callam Davidson

6 Jan 2023

Dear Dr. Franzen,

Thank you very much for re-submitting your manuscript "Institutional dashboards on clinical trial transparency: A case study for University Medical Centers" (PMEDICINE-D-22-01430R2) for review by PLOS Medicine.

I have discussed the paper with my colleagues and the academic editor and it was also seen again by three reviewers. I am pleased to say that provided the remaining editorial and production issues are dealt with we are planning to accept the paper for publication in the journal.

The remaining issues that need to be addressed are listed at the end of this email. Any accompanying reviewer attachments can be seen via the link below. Please take these into account before resubmitting your manuscript:

[LINK]

***Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.***

In revising the manuscript for further consideration here, please ensure you address the specific points made by each reviewer and the editors. In your rebuttal letter you should indicate your response to the reviewers' and editors' comments and the changes you have made in the manuscript. Please submit a clean version of the paper as the main article file. A version with changes marked must also be uploaded as a marked up manuscript file.

Please also check the guidelines for revised papers at http://journals.plos.org/plosmedicine/s/revising-your-manuscript for any that apply to your paper. If you haven't already, we ask that you provide a short, non-technical Author Summary of your research to make findings accessible to a wide audience that includes both scientists and non-scientists. The Author Summary should immediately follow the Abstract in your revised manuscript. This text is subject to editorial change and should be distinct from the scientific abstract.

We hope to receive your revised manuscript within 1 week. Please email us (plosmedicine@plos.org) if you have any questions or concerns.

We ask every co-author listed on the manuscript to fill in a contributing author statement. If any of the co-authors have not filled in the statement, we will remind them to do so when the paper is revised. If all statements are not completed in a timely fashion this could hold up the re-review process. Should there be a problem getting one of your co-authors to fill in a statement we will be in contact. YOU MUST NOT ADD OR REMOVE AUTHORS UNLESS YOU HAVE ALERTED THE EDITOR HANDLING THE MANUSCRIPT TO THE CHANGE AND THEY SPECIFICALLY HAVE AGREED TO IT.

Please ensure that the paper adheres to the PLOS Data Availability Policy (see http://journals.plos.org/plosmedicine/s/data-availability), which requires that all data underlying the study's findings be provided in a repository or as Supporting Information. For data residing with a third party, authors are required to provide instructions with contact information for obtaining the data. PLOS journals do not allow statements supported by "data not shown" or "unpublished results." For such statements, authors must provide supporting data or cite public sources that include it.

To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript.

Please note, when your manuscript is accepted, an uncorrected proof of your manuscript will be published online ahead of the final version, unless you've already opted out via the online submission form. If, for any reason, you do not want an earlier version of your manuscript published online or are unsure if you have already indicated as such, please let the journal staff know immediately at plosmedicine@plos.org.

If you have any questions in the meantime, please contact me or the journal staff on plosmedicine@plos.org.  

We look forward to receiving the revised manuscript by Jan 13 2023 11:59PM.   

Sincerely,

Caitlin Moyer, Ph.D.

Senior Editor 

PLOS Medicine

plosmedicine.org

------------------------------------------------------------

Requests from Editors:

The Academic Editor extends their gratitude for the efforts made in addressing their previous comments.

Please revise your title to “Institutional dashboards on clinical trial transparency for University Medical Centers: A case study”.

Data Availability Statement: Please confirm the login requirements for the dashboard will be removed prior to publication.

Please trim your Author Summary such that it contains 2-3 single sentence bullet points under each of the three questions.

Please include key numbers in your Author Summary such as sample size and headline results.

Figure 3, while suitable for showing the dashboard home page, is not ideal for graphically representing the key findings (the individual panels are too small to appreciate, and axis/data labels are difficult to read). I would suggest providing screenshots of the individual panels in the Supporting Information and citing these in the Results instead. Figure 3 can still be kept in the main body of the manuscript but only cited when referring to the dashboard home page at line 347.

Comments from Reviewers:

Reviewer #1: I thank authors for their diligent work in revising the manuscript and addressing the reviewers' questions. I do not have any additional comments.

Reviewer #2: The authors have responded to all my comments adequately. One minor comment: while their response to comment 2 about CT.gov was fine, I wanted to make sure they take care to consider carefully the denominators when comparing platforms.

Reviewer #3: The authors have spent considerable time and effort responding to the suggestions of the editors and the reviewers. This reviewer believes that the authors should be credited with their attention to detail, their own transparency in the process, and the general management of the submission.

Any attachments provided with reviews can be seen via the following link:

[LINK]

Decision Letter 3

Callam Davidson

18 Jan 2023

Dear Dr Franzen, 

On behalf of my colleagues and the Academic Editor, Dr Florian Naudet, I am pleased to inform you that we have agreed to publish your manuscript "Institutional dashboards on clinical trial transparency for University Medical Centers: A case study" (PMEDICINE-D-22-01430R3) in PLOS Medicine.

Before your manuscript can be formally accepted you will need to complete some formatting changes, which you will receive in a follow up email. Please be aware that it may take several days for you to receive this email; during this time no action is required by you. Once you have received these formatting requests, please note that your manuscript will not be scheduled for publication until you have made the required changes.

In the meantime, please log into Editorial Manager at http://www.editorialmanager.com/pmedicine/, click the "Update My Information" link at the top of the page, and update your user information to ensure an efficient production process. 

PRESS

We frequently collaborate with press offices. If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximise its impact. If the press office is planning to promote your findings, we would be grateful if they could coordinate with medicinepress@plos.org. If you have not yet opted out of the early version process, we ask that you notify us immediately of any press plans so that we may do so on your behalf.

We also ask that you take this opportunity to read our Embargo Policy regarding the discussion, promotion and media coverage of work that is yet to be published by PLOS. As your manuscript is not yet published, it is bound by the conditions of our Embargo Policy. Please be aware that this policy is in place both to ensure that any press coverage of your article is fully substantiated and to provide a direct link between such coverage and the published work. For full details of our Embargo Policy, please visit http://www.plos.org/about/media-inquiries/embargo-policy/.

To enhance the reproducibility of your results, we recommend that you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. Additionally, PLOS ONE offers an option to publish peer-reviewed clinical study protocols. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols

Thank you again for submitting to PLOS Medicine. We look forward to publishing your paper. 

Sincerely, 

Callam Davidson 

Associate Editor 

PLOS Medicine

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Supplement. Use of automated vs. manual approaches across methods.

    (PDF)

    S2 Supplement. Inclusion and exclusion criteria.

    (PDF)

    S3 Supplement. Selected sponsor names in the EU Trials Tracker.

    (PDF)

    S4 Supplement. Sources for Fig 1.

    (PDF)

    S5 Supplement. Detailed methods and limitations of registration and reporting metrics.

    (PDF)

    S6 Supplement. STROBE checklist for cross-sectional studies.

    (PDF)

    S7 Supplement. Characteristics of included trials.

    (PDF)

    S8 Supplement. Flow diagrams of the trial and publication screening.

    (PDF)

    S9 Supplement. Screenshots of the “Start” page of the dashboard.

    (PDF)

    S10 Supplement. Screenshot of the “One UMC” page of the dashboard.

    (PDF)

    Attachment

    Submitted filename: response-to-reviewers.docx

    Attachment

    Submitted filename: response-to-reviewers.docx

    Data Availability Statement

    The authors confirm that all data underlying the findings are fully available without restriction. The dashboard is openly available at: https://quest-cttd.bihealth.org/. Code to produce the dashboard is openly available in GitHub at: https://github.com/quest-bih/clinical-dashboard. Code to generate the dataset displayed in the dashboard is openly available in GitHub: https://github.com/maia-sh/intovalue-data/releases/tag/v1.1. Data can be downloaded from the dashboard and are openly available on OSF at: https://osf.io/26dgx/. Raw data obtained from trial registries are openly available on Zenodo at: https://doi.org/10.5281/zenodo.7590083. Data for summary results reporting in the EUCTR are available via the EU Trials Tracker.


    Articles from PLOS Medicine are provided here courtesy of PLOS
