PLOS ONE. 2021 Aug 18;16(8):e0255704. doi: 10.1371/journal.pone.0255704

“He who pays the piper calls the tune”: Researcher experiences of funder suppression of health behaviour intervention trial findings

Sam McCrabb 1,*, Kaitlin Mooney 1, Luke Wolfenden 1,2, Sharleen Gonzalez 1, Elizabeth Ditton 1, Serene Yoong 1,2,3, Kypros Kypri 1
Editor: Quinn Grundy
PMCID: PMC8372890  PMID: 34407104

Abstract

Background

Governments commonly fund research with specific applications in mind. Such mechanisms may facilitate ‘research translation’ but funders may employ strategies that can also undermine the integrity of both science and government. We estimated the prevalence and investigated correlates of funder efforts to suppress health behaviour intervention trial findings.

Methods

Our sampling frame was lead or corresponding authors of papers (published 2007–2017) included in a Cochrane review, reporting findings from trials of interventions to improve nutrition, physical activity, sexual health, smoking, and substance use. Suppression event items were based on a previous survey of public health academics. Participants answered questions concerning seven suppression events in their efforts to report the trial, e.g., [I was…] “asked to suppress certain findings as they were viewed as being unfavourable.” We also examined the associations of study funder, geographical location, targeted health behaviour, country democracy rating, and age of publication with reported suppression.

Findings

We received responses from 104 authors (50%) of 208 eligible trials, from North America (34%), Europe (33%), Oceania (17%), and other countries (16%). Eighteen percent reported at least one of the seven suppression events relating to the trial in question. The most commonly reported suppression event was funder(s) expressing reluctance to publish because they considered the results ‘unfavourable’ (9% reported). We found no strong associations with the subject of research, funding source, democracy, region, or year of publication.

Conclusions

One in five researchers in this global sample reported being pressured to delay, alter, or not publish the findings of health behaviour intervention trials. Regulation of funder and university practices, establishing study registries, and compulsory disclosure of funding conditions in scientific journals, are needed to protect the integrity of public-good research.

Introduction

Generating scientific knowledge should be, in principle, a key consideration in the design of programmes to improve public health. Governments fund national agencies whose purpose is supporting science (e.g., the National Institutes of Health in the United States of America [USA] and the National Health and Medical Research Council [NHMRC] in Australia), and researcher-initiated projects are routinely funded through such agencies. Research funding is also dedicated to addressing the priorities of funders, with objectives typically relating to informing public policy or commercial imperatives [1]. Such strategic funding aims to address knowledge gaps important to funders, thereby facilitating ‘research translation’ by ensuring relevance to end-users. However, these funding models have been shown to undermine the integrity of science by enabling funders to influence how research is done and reported [2–4].

As providers of publicly funded health and medical research, universities have a vital role in facilitating independent enquiry. The notion of academic freedom is that researchers, bound by the scholarly conventions of peer review and ethical approval, are free to do research without interference or the threat of professional disadvantage [5]. Many see the preservation of such freedom as vital to safeguarding the reflection, critique, and innovation that academia can bring to society [3, 6]. However, academic integrity is increasingly undermined by the influence of vested interests on research [2, 7] and a reproducibility crisis [8], calling into question whether public research institutions actually serve the public interest [9]. Research funders who are also responsible for giving policy advice or implementing intervention programmes have a stake in study findings, which puts pressure on the impartiality of the researchers who depend on the funding. This could include subtle pressure on researchers, unconsciously conveyed hopes for ‘positive’ findings, or total suppression or censorship of reports for political advantage [4]. Various mechanisms exist to regulate researcher behaviour, including codes of conduct and ethical review [10, 11]. In Australia, many government funding agreements require researchers to obtain funder approval to publish reports [12].

The suppression of public-good research by funders or other parties is neither well understood nor coherently regulated [13]. A 2006 survey of Australian public health researchers (response rate 46%) reported that 21% of participants had experienced at least one incident in which a government funder suppressed their research in the preceding 5.5 years [14]. The most common forms of suppression reported were blocking or significantly delaying publication and requests to “sanitise” reports [14]. A survey of Australian ecologists and conservation scientists by Driscoll et al. [15] indicated that government and industry respondents reported higher rates of suppression than university respondents (34%, 30% and 5% respectively), mainly in the form of suppression of internal communications and media engagement. A 2015 Canadian survey of federal government scientists showed that within the last 5 years 24% of scientists had been asked to exclude certain findings from their reports, and 37% reported that they were prevented from responding to media enquiries within their area of expertise [16]. In the United Kingdom (UK), an enquiry into public-good research, commissioned by a science charity, presented nine case studies outlining the impact of significant delays in the publication of findings. In several cases delays appeared to be motivated by political considerations [4]. Knowing how often and in what circumstances the suppression of public health research occurs is important because of the potential impact of withholding, delaying, or misrepresenting findings. This is acutely apparent in the COVID-19 pandemic, where delays in releasing early research findings in China allowed significant outbreaks to occur in other countries [17–19].

The aims of this study were: (1) to ascertain the reported prevalence of efforts to suppress the findings of primary prevention trials that target nutrition, physical activity, sexual health, tobacco use, alcohol or substance use; and (2) to identify associations between trial characteristics and suppression events.

Methods

Design

We invited the lead authors of primary prevention trials included in Cochrane reviews to complete a Computer Assisted Telephone Interview or online survey. This study was part of a larger cross-sectional study that investigated researchers’ experiences in developing, conducting and evaluating public health interventions, the effectiveness of the intervention, any knowledge translation strategies used, and reported impacts on health policy and practice (unpublished). The present study investigates the prevalence of suppression of trial findings, and how these relate to the trial characteristics. The University of Newcastle Human Research Ethics Committee approved the study protocol (H-2014-0070). Completion of the online survey was taken as implied consent.

Sampling

We searched the Cochrane Library for reviews that were: (1) focused on primary prevention or included trials with setting-based primary prevention components; and (2) related to nutrition, physical activity, sexual health, tobacco use, alcohol use, or other psychoactive substance use. These risk behaviour areas were chosen because the larger cross-sectional study, of which this sub-study is a part, focused on these health behaviours.

We classified primary studies from the reviews as eligible if they were randomised controlled trials (RCT) or non-randomised controlled trials investigating the effects of efforts to modify nutrition, physical activity, sexual health, tobacco use, alcohol use, or other substance use. We limited eligibility to English language reports published from 2007–2017.

Recruitment and data collection

Authors of identified articles were invited to participate if they were one of the first two authors, the last author, or the corresponding author. Contact information was sourced from the public domain. We contacted corresponding authors first to complete the survey on behalf of all authors. Corresponding authors could nominate co-authors to complete the survey on their behalf. If after four weeks we had no response from the corresponding author, we invited the first, second, and/or last author of the trial manuscript (if different from the corresponding author) to participate, in that order. Authors with available telephone contact details were invited to complete the survey via a telephone interview. Before telephoning, we emailed an invitation attaching a study information sheet, a summary of the survey topics to be covered, and an opt-out form. Authors without telephone contact details were contacted via email and sent the same information plus a link to complete the same survey online via REDCap, a web survey hosting service [20]. Up to three reminder emails were sent to non-responders at intervals of approximately four weeks.

Measures

Suppression events

We asked respondents seven questions concerning their experiences when disseminating the trial results (see Box 1). The questions, based on those used by Yazahmeidi and Holman (2007) [14], had response options “not at all”, “a little”, or “substantially”.

Box 1. Options respondents were provided regarding funder behaviour

• Reluctance to publish the findings in peer-reviewed journals as they were viewed as being unfavourable.

• Delays in reporting or publishing the findings until a more favourable time (e.g. following elections, after certain policies had been approved).

• Asked to alter your conclusions so that the impact of the intervention was framed in a way that aligned more with their interests.

• Asked to suppress certain findings as they were viewed as being unfavourable.

• Discouraged from presenting your results to certain groups or organisations that may have an interest in the intervention.

• Attempts to discredit members of the research team or other staff involved in the conduct of the study.

• Changes made to study methods or analytical procedures that would have likely resulted in an outcome that aligned more with their interests (e.g. significant finding).

Trial characteristics

Two researchers (SG and KM) independently extracted the following information from published reports of eligible trials: year of publication, the health risk behaviour(s) targeted (physical activity, nutrition, sexual health, substance use), and the country of first author, where the trial was assumed to have occurred. We aggregated author country into the categories: North America, Europe, Oceania, and Other.

A democracy classification was added for each publication based on the country of origin and the year of study publication, using the Economist Intelligence Unit (EIU) Democracy Index reports [21–29]. The Democracy Index measures a country’s democracy using 60 indicators across five categories, scored to give a total out of 10. Based on the score, countries were categorised as a full democracy (score 8.01 to 10) or not a full democracy (0 to 8). (N.B. no reports were published for 2007 and 2009, so these data are missing for studies published in those years.)
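The dichotomisation rule above can be sketched as follows (a minimal illustration, not the authors' code; the function name and the handling of missing report years are our own):

```python
# Minimal sketch of the democracy dichotomisation described above.
# Scores above 8.00 (i.e., 8.01 to 10) count as a full democracy; None marks
# years with no EIU report (2007 and 2009), which are coded as missing.
def classify_democracy(score):
    """Dichotomise an EIU Democracy Index score (0-10), or None if unavailable."""
    if score is None:
        return "missing"
    return "full democracy" if score > 8.00 else "not a full democracy"
```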

While the focus of the study is government funding, we extracted funding data from all eligible reports and classified them in the following mutually exclusive categories: Dedicated Research Agency (government), Other Government Agency, Industry, and Philanthropic (see Box 2 for definitions). If funding information was unavailable, we coded the study as Unknown. Where there was more than one source of funding, we coded the study as Multiple and excluded it from the regression analysis to avoid problems of attribution.

Box 2. Definitions of funding categories

Dedicated research agency: A government-funded agency solely responsible for medical and public health research.

Other government agency: A government agency, dedicated to pursuits other than research, including local councils, public health and safety departments, and ministerial departments.

Industry: Companies and activities involved in the production of goods for sale.

Philanthropic: A non-government, non-profit organisation, with assets provided by donors and managed by its own officials and with income expended for socially useful purposes.

Unknown: No funding source listed.

Multiple: Reports more than one of the previous funding types.

Analysis

For each of the seven questions, we calculated the proportion of respondents who answered “not at all” (coded as “never”) versus “a little” or “substantially” (coded as “at least once”), and then a dichotomous variable indicating the proportion who had experienced any of the seven suppression events (aim 1). We conducted a sensitivity analysis to estimate the extremities of possible non-response bias by assuming (a) that all non-respondents had experienced an act of suppression and, conversely, (b) that none had.

We estimated associations between trial characteristics, including the risk behaviour targeted, funder, geographic location, full democracy (yes vs no), and age of the publication (in years), and instances of suppression (aim 2) using logistic regression, recoding year of publication as the continuous variable ‘age of publication in 2017’ (the last year in the sampling frame). We estimated adjusted odds ratios with 95% confidence intervals, and aggregated trials where groups were small: ‘sexual health/substance use’ and ‘nutrition/physical activity’, based on evidence that these behaviours cluster [30, 31].
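For readers unfamiliar with the method, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table as sketched below (a generic illustration, not the authors' analysis code, which presumably used standard statistical software):

```python
import math

# Generic sketch: unadjusted odds ratio with a Wald 95% CI from a 2x2 table.
#   a: exposed group, event occurred      b: exposed group, no event
#   c: reference group, event occurred    d: reference group, no event
def odds_ratio(a, b, c, d):
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)
```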

Results

From 42 eligible reviews we identified 208 trials and received survey responses from 104 (50%) of their corresponding authors. Papers were published from 2007–2016 and reported trials concerning physical activity and/or nutrition (55%) or substance use and/or sexual health (47%). Two thirds were conducted in North America (34%) or Europe/United Kingdom (33%), with the balance in Oceania (17%) and other countries (16%). The majority of studies were from full-democracy countries (61%), and the most common funding source was an Other Government Agency (39%).

S1 Table shows that the characteristics of trials whose authors did not complete the survey were not markedly different from those who did in terms of study design, full democracy, or publication date. However, the proportion conducted in North America was higher among non-respondents (53%) than respondents (34%). Trials of non-respondents were also more often funded by an Other Government Agency (49% vs 39% of respondents).

Aim 1: Prevalence of suppression events

Eighteen percent (18/98, 6 unknown) of respondents reported at least one instance of suppression. Table 1 shows the number of respondents who reported each type of suppression event having occurred at least once, by funding source. Rates of suppression were highest for studies funded by Other Government Agencies. The most commonly reported suppression event was the funder expressing reluctance for publication due to ‘unfavourable’ results, with six, two, and one suppression events reported from studies funded by Other Government Agencies, Dedicated Research Agencies, and multiple funding sources, respectively. In comparison, researchers receiving industry or philanthropic funding reported no suppression events.

Table 1. Researcher reports of funder efforts to suppress trial findings.

The ‘Never’ column gives the number of respondents reporting the event never occurred; the remaining columns give counts of respondents reporting the event once or more often*, by funder type.

| Suppression event | Never | Industry | Other Government Agency | Philanthropic | Dedicated Research Agency | Multiple | Unknown |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Funder expressed reluctance for publication because they considered the results ‘unfavourable’ | 89 | 0 | 6 | 0 | 2 | 1 | 0 |
| Funder delayed reporting of findings until a more favourable time (e.g., following elections) | 93 | 0 | 2 | 0 | 0 | 2 | 1 |
| Funder asked researcher to alter conclusions to better align with funder interests | 91 | 0 | 3 | 0 | 1 | 2 | 0 |
| Funder asked researcher to not report findings they considered unfavourable | 95 | 0 | 2 | 0 | 1 | 0 | 0 |
| Funder discouraged researcher from presenting results to certain groups or organisations that may have an interest in the intervention | 95 | 0 | 1 | 0 | 1 | 1 | 0 |
| Funder attempted to discredit members of the research team or other staff involved in the conduct of the study | 94 | 0 | 1 | 0 | 2 | 1 | 0 |
| Funder demanded changes to study methods or analysis likely to produce findings that aligned with funder interests (e.g., emphasis on the statistical significance of a result) | 94 | 0 | 1 | 0 | 1 | 0 | 0 |

*Six respondents did not answer any of these questions, while the number of respondents answering each question ranged from 96 to 98.

Sensitivity analysis

Under the extreme assumptions that none or all the non-respondents had experienced a suppression event, the prevalence estimates would be as low as 9% (18/208) or as high as 59% [(18+104)/208], respectively.
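These bounds follow directly from the counts; a worked check of the arithmetic (illustrative only):

```python
# Worked check of the sensitivity bounds quoted above: 18 reported events
# among 104 respondents, out of 208 eligible trials (104 non-respondents).
reported, respondents, eligible = 18, 104, 208
non_respondents = eligible - respondents  # 104

lower = reported / eligible                      # no non-respondent suppressed
upper = (reported + non_respondents) / eligible  # every non-respondent suppressed
print(round(lower * 100), round(upper * 100))    # 9 59
```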

Aim 2: Association between trial characteristics and suppression events

Table 2 summarises associations between trial characteristics and suppression events. Researchers receiving Other Government Agency funding, or who conducted studies in Europe, had higher odds of reporting a suppression event than those with Dedicated Research Agency funding or studies conducted in North America, respectively. Researchers who had conducted sexual health/substance use trials more commonly reported a suppression event than those who had conducted nutrition/physical activity trials. Whether the publication came from a full-democracy country did not appear to change the odds of reporting suppression. As the age of the publication increased, so did the odds of reporting suppression. The confidence intervals for the odds ratios of all comparisons were wide and included 1.

Table 2. Associations between trial characteristics and suppression events.

Reported a suppression event, n (%) Unadjusted Odds Ratio (95% CI) Adjusted* Odds Ratio (95% CI)
Risk behaviour targeted
    Nutrition/physical activity 7 (17%) Ref Ref
    Sexual health/substance use 8 (24%) 1.49 (0.48, 4.65) 2.25 (0.43, 11.68)
Funder
    Dedicated Research Agency 5 (19%) Ref Ref
    Philanthropic 0 (0%) - -
    Other Government Agency 9 (28%) 1.72 (0.50, 5.95) 2.22 (0.41, 12.10)
    Industry 0 (0%) - -
    Unknown 1 (10%) 0.49 (0.05, 5.95) -
Geographic location
    North America 6 (20%) Ref Ref
    Europe 5 (21%) 1.10 (0.29, 4.14) 1.66 (0.28, 9.92)
    Oceania 2 (22%) 1.19 (0.02, 7.25) 0.59 (0.04, 8.36)
    Other 2 (18%) 0.93 (0.16, 5.45) 1.71 (0.04, 75.39)
Full democracy
    No 2 (18%) Ref Ref
    Yes 9 (19%) 1.04 (0.19, 5.65) 0.99 (0.03, 32.94)
Age of publication in 2017 (years), mean (standard deviation) 6.73 (2.34) 1.09 (0.86, 1.37) 1.08 (0.79, 1.49)

*Adjusted for behaviour targeted, funder type, geographic location, and age of publication.

Discussion

In a sample of authors of prevention intervention trial reports published over a decade, of whom 50% responded, we found that one in five reported at least one suppression event. A simple sensitivity analysis suggests that rates of suppression could be as low as 9% and as high as 59%, depending on the proportion of non-respondents who were subjected to suppression events. Our overall estimate of 18% is similar to estimates from previous studies in Australia (21% to 34%) [15, 32] and Canada (24%) [16]. Notably, we asked specifically about what occurred in relation to a single trial, while other studies evaluated suppression events across many projects over the course of five [16] or 5.5 years [14]. As such, rates of suppression across a researcher’s whole body of work may be much higher than our per-trial estimate.

The sampling frame provides greater international representation and broader coverage of health research than previous studies [14, 32]. However, relying on published studies to identify authors means we would not have found authors of studies whose publication was suppressed entirely. It is also likely that some authors would not disclose suppression events, even in the confidential context of a research study, for fear of repercussions from the funder or of being negatively evaluated by the researchers. The implication is that the actual rate of suppression is likely higher. Finally, the small numbers in our study constrain the precision of our estimates of prevalence and association. Accordingly, we suggest that 18% is an under-estimate of the true prevalence of studies subject to some form of suppression by funders.

It is hard to determine why older publications, or those published in certain geographical regions, had greater odds of reported suppression (though the confidence intervals for all comparisons were wide and included 1). A possible explanation is that older studies have simply had more ‘chance’ to experience suppression of their findings. Why studies published outside North America had greater odds of suppression is harder to explain. It may be that researchers in North America only report instances of suppression when they are ‘more severe’ than those reported in other countries, that they are more afraid to report suppression, or that tighter regulations constrain what types of suppression can be enacted on grant holders. More research would be needed to explain these differences.

Our results, along with those of previous investigations, suggest that government funders interfere with public-good research. In addition to curtailing independent scientific enquiry, such practices deny the public access to the findings of research paid for through taxation, which in some cases could have informed policy decisions. On this point, in his ‘Missing Evidence’ report, former High Court Judge Sir Stephen Sedley observed that the UK Parliament made a critical decision concerning the merits of minimum unit pricing of alcohol without the benefit of key findings whose publication had been deliberately delayed by the Department of Health [4]. In addition to the loss borne by taxpayers due to ill-informed policy, there is the damage done to democracy when such perversions come to light.

This research has its limitations. For example, we did not examine the different types of suppression in depth. We found that reluctance to publish research findings due to unfavourable results was the most commonly reported type of suppression. However, we did not seek to determine what reluctance meant in this context, i.e., whether funders prevented publication or merely tried to influence it. Nor did we give participants the opportunity to describe other forms of suppression they may have experienced. Further research should investigate the full range of suppression experienced in order to develop practical ways to overcome it.

Attention is urgently needed to protect the integrity of public health research from the influence of vested interests, whether private or official in origin. Preventive actions are required of all actors involved in the generation of research findings:

  1. Government agencies must ensure that funding agreements formed with research providers contain terms that protect academic freedom, e.g., removal of clauses that require approval of results prior to publication [32];

  2. Research institutions must not accept funding on terms that permit funders to interfere in public-good research;

  3. Government agencies should establish a registry of studies funded by government agencies including the terms of the funding to encourage openness;

  4. Research Ethics Committees must consider the source and terms of research funding to determine if there are any ethical implications of the funding source;

  5. Scientific journals must require authors to declare the terms of research funding, and potential conflicts of interest;

  6. Researchers must be held to Code of Conduct provisions concerning acceptable terms of funding;

  7. Audits of contracting and research practices of tertiary academic institutions should be undertaken by an independent body with appropriate powers (e.g. an independent government department); and

  8. Universities should consider establishing a mechanism for reporting instances of research suppression, and for managing funders and individuals who are known to attempt to suppress research findings.

Suppression still exists among public-good researchers, with this study suggesting that the rate of suppression for a single trial may be as high as one in five. Prevention is key, and these suggestions, similar to those previously described [32], need to be adopted to thwart the suppression of public-good health research. As it is unlikely that suppression will ever be stopped entirely, further research is needed to determine how researchers can handle suppression when it occurs (e.g., what a researcher can do if someone tries to delay their publication), as well as the reporting procedures that should be in place when instances of suppression do occur.

Supporting information

S1 Table. Characteristics of responders and non-responders.

(DOCX)

S1 File

(XLSX)

S2 File

(XLSX)

S3 File

(XLSX)

S4 File

(XLSX)

Data Availability

All relevant data are within the paper and its Supporting Information files.

Funding Statement

Professor Wolfenden is funded by a NHMRC Career Development Fellowship (APP1128348) and a Heart Foundation Future Leader Fellowship (Award Number 101175). Infrastructure funding was provided by Hunter New England Population Health and The University of Newcastle. Professor Kypri was funded by a University of Newcastle Brawn Senior Research Fellowship for his input to this study. A/Prof Serene Yoong is funded by an ARC Discovery Early Career Researcher Award (DE170100382).

References

  • 1.Viergever RF, Hendriks TCC. The 10 largest public and philanthropic funders of health research in the world: what they fund and how they distribute their funds. Health Research Policy and Systems. 2016;14(1):12. doi: 10.1186/s12961-015-0074-z [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Gornall J. Sugar: spinning a web of influence. Bmj. 2015;350. doi: 10.1136/bmj.h231 [DOI] [PubMed] [Google Scholar]
  • 3.Miller A. Academic Freedom: Defending democracy in the corporate university. Social Alternatives. 2019;38(3):14. [Google Scholar]
  • 4.Sedley SS. Missing evidence: An inquiry into the delayed publication of government-commissioned research. In: trust TJC, editor. London: Sense about Science; 2016. [Google Scholar]
  • 5.Hoepner J. Silencing behaviours in contested research and their implications for academic freedom. Australian Universities’ Review, The. 2019;61(1):31. [Google Scholar]
  • 6.Beiter KD. Where have all the scientific and academic freedoms gone? And what is ‘adequate for science’? The right to enjoy the benefits of scientific progress and its applications. Israel Law Review. 2019;52(2):233–91. [Google Scholar]
  • 7.McCarthy J. Newcastle University protests on Tuesday over coal connections. Newcastle Herald. 2017. [Google Scholar]
  • 8.Ioannidis JP. Why most published research findings are false. PLoS medicine. 2005;2(8):e124. doi: 10.1371/journal.pmed.0020124 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Saltelli A, Funtowicz S. What is science’s crisis really about? Futures. 2017;91:5–11. [Google Scholar]
  • 10.National Health and Medical Research Council. Australian Code for the Responsible Conduct of Research 2018. National Health and Medical Research Council, Australian Research Council and Universities Australia. Commonwealth of Australia, Canberra. 2018. [Google Scholar]
  • 11.National Health and Medical Research Council. National Statement on Ethical Conduct in Human Research 2007 (Updated 2018). The National Health and Medical Research Council, the Australian Research Council and Universities Australia. Commonwealth of Australia, Canberra. 2007. [Google Scholar]
  • 12.Power J, Gilmore B, Vallières F, Toomey E, Mannan H, McAuliffe E. Adapting health interventions for local fit when scaling-up: a realist review protocol. BMJ Open. 2019;9(1):e022084. doi: 10.1136/bmjopen-2018-022084 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.National Health and Medical Research Council. Our policy on research integrity. Australian Government, 2019.
  • 14.Yazahmeidi B, Holman CDAJ. A survey of suppression of public health information by Australian governments. Australian and New Zealand journal of public health. 2007;31(6):551–7. doi: 10.1111/j.1753-6405.2007.00142.x [DOI] [PubMed] [Google Scholar]
  • 15.Driscoll DA, Garrard GE, Kusmanoff AM, Dovers S, Maron M, Preece N, et al. Consequences of information suppression in ecological and conservation sciences. Conservation Letters. 2021;14(1):e12757. 10.1111/conl.12757. [DOI] [Google Scholar]
  • 16.The Professional Institute of the Public Service of Canada (PIPSC). The big chill, silencing public interest science: a survey. Ottawa: PIPSC; 2013. [cited 04 September 2019]. Available from: http://www.pipsc.ca/portal/page/portal/website/issues/science/bigchill [Google Scholar]
  • 17.Buckley C, Myers SL. As new coronavirus spread, China’s old habits delayed fight. New York Times. 2020;1. [Google Scholar]
  • 18.Wong A. WHO recordings show frustration with China over coronavirus information delays. ABC News. 2020. [Google Scholar]
  • 19.Jianjun M. China delayed releasing coronavirus info, frustrating WHO. CNBC. 2020.
  • 20.Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)—a metadata-driven methodology and workflow process for providing translational research informatics support. Journal of biomedical informatics. 2009;42(2):377–81. doi: 10.1016/j.jbi.2008.08.010 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.The Economist Intelligence Unit. Democracy index 2006: A pause in democracy’s march. 2007.
  • 22.The Economist Intelligence Unit. Democracy index 2008. 2008.
  • 23.The Economist Intelligence Unit. Democracy index 2010: Democracy in retreat. 2010.
  • 24.The Economist Intelligence Unit. Democracy index 2011: Democracy under stress. 2012.
  • 25.The Economist Intelligence Unit. Democracy index 2012: Democracy is at a standstill. 2013.
  • 26.The Economist Intelligence Unit. Democracy index 2013: Democracy in limbo. 2014.
  • 27.The Economist Intelligence Unit. Democracy Index 2014: Democracy and its discontents. 2015.
  • 28.The Economist Intelligence Unit. Democracy index 2015: Democracy in an age of anxiety. 2016.
  • 29.The Economist Intelligence Unit. Democracy Index 2016: Revenge of the “deplorables”. Retrieved from https://www.eiu.com/public/topical_report.aspx. 2017.
  • 30.Xu F, Cohen SA, Lofgren IE, Greene GW, Delmonico MJ, Greaney ML. Relationship between diet quality, physical activity and health-related quality of life in older adults: Findings from 2007–2014 national health and nutrition examination survey. The Journal of Nutrition, Health & Aging. 2018;22(9):1072–9. doi: 10.1007/s12603-018-1050-4 [DOI] [PubMed] [Google Scholar]
  • 31.Khadr S, Jones K, Mann S, Hale DR, Johnson A, Viner RM, et al. Investigating the relationship between substance use and sexual behaviour in young people in Britain: findings from a national probability survey. BMJ open. 2016;6(6):e011961. doi: 10.1136/bmjopen-2016-011961 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Kypri K. Suppression clauses in university health research: case study of an Australian government contract negotiation. The Medical Journal of Australia. 2015;203(2):72–4. doi: 10.5694/mja14.01497 [DOI] [PubMed] [Google Scholar]

Decision Letter 0

Quinn Grundy

22 Jun 2021

PONE-D-21-13007

"He who pays the piper calls the tune": Researcher experiences of funder suppression of health behaviour intervention trial findings

PLOS ONE

Dear Dr. McCrabb,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Your manuscript was very well received by the reviewers and they largely note minor revisions for clarity. Please also address the outstanding queries regarding recruitment and some of the nuances in terms of interpretation (or limits of interpretation) regarding what suppression actually entails. One of the reviewers suggests a novel analysis by region/country, which I would ask you to consider or respond to. I look forward to receiving your revision.

Please submit your revised manuscript by Jul 17 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Quinn Grundy, PhD, RN

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

3. We note that you have included the phrase “data not shown” in your manuscript. Unfortunately, this does not meet our data sharing requirements. PLOS does not permit references to inaccessible data. We require that authors provide all relevant data within the paper, Supporting Information files, or in an acceptable, public repository. Please add a citation to support this phrase or upload the data that corresponds with these findings to a stable repository (such as Figshare or Dryad) and provide any URLs, DOIs, or accession numbers that may be used to access these data. Or, if the data are not a core part of the research being presented in your study, we ask that you remove the phrase that refers to these data.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This is an interesting and important paper but I have a number of suggestions for improvement. It reports the extent to which funders of studies attempt to suppress publication of unfavourable results, which is a highly important topic.

Major comments:

Recruitment part slightly unclear: “We invited the first two authors, the last author, and the corresponding author, using contact information in the public domain, to complete the survey. Authors with available contact details were invited to complete a telephone interview.” If they were contacted they all had contact details available – what does this last sentence mean? How was it actually decided whether to do interview or survey? Did you only do interview where phone number was available? Why not email to arrange phone interview and ask for number then? Were questions the same?

It would be helpful to know what “reluctance” actually signifies. Funders shouldn’t be able to affect whether findings are published – did responses suggest that funders could actually prevent publication or just tried to influence? A related point is that it would be very interesting to know how researchers dealt with suppression attempts – particularly attacks on the research team.

Generally the discussion (which is rather short) and/or results would benefit from more detail on the details of suppression attempts and how they were handled, if these data are available.

Minor comments:

Abstract should mention other aspects apart from suppression specifically to give readers an idea of other behaviours.

Some minor lack of clarity in places, e.g. “improve smoking/substance abuse”

Why these specific health intervention areas and not others? A rationale should be given.

Were respondents able to suggest other types of suppression activity? There may be other types not mentioned in your survey.

Reviewer #2: This is an excellent and timely paper that is well written and comes to an important and well justified set of recommendations. I have only a few minor comments that will clarify the presentation and improve the general context of the research and interpretation.

I think it would be more usual to put the . after the references, such as [1, 2].

77-83. You could further improve the context by referring to the recent study on science suppression in ecology by Driscoll et al. 2020 in Conservation Letters, relevant because it addresses suppression in public good research in Australia.

98. researchers'

203. insert "the odds ratios of" at the start of this line.

211-12. Is it worth reporting these extreme values, as neither is likely to be true, or even close to true?

224-5. That is a reasonable interpretation.

226. Hold on, you said at 201-2 that older studies had higher rates of suppression. Some extra words in the results to help interpret direction of effect in relation to the odds ratio would help readers and the authors to avoid this mix up.

228. It's a bit confusing to combine older and north American in this sentence, giving the impression it was only older north American studies, but these tests were independent, so it's old studies everywhere, and any aged study in north America.

235-6; or north American researchers are more afraid to report suppression. Or you could look at the European countries that your studies came from to see if they include less democratic countries. In fact, in addition to the regional analysis, it would be interesting to examine the likelihood of suppression in relation to the global democracy index, which extends back to 2006, so could be aligned with the year each paper was published, or the year before.

251; spell out what that would mean, what kind of clauses must be excluded from contracts or included for example.

Also, in this list of actions, be clear about who should do it for each point. It might also refer to other literature where similar lists of actions have been referred to (this might be best done in the list or as an introduction to the list).

It would be nice to wrap up with a general, bigger picture paragraph about interference in public good research.

I couldn't see if the data were available, which Plos seems to require.

Reviewer #3: This was a very clear, concise and well-presented paper. It highlights a key aspect of research suppression, namely pressure from funders to produce and publish findings that align with the priorities of the day.

It was necessarily narrow in scope. However, it would be interesting to know whether respondents experienced suppression events from parties other than the funding body. In my experience, public health fields are particularly prone to silencing from within. It also would have been interesting to include discussion of the broader, more insidious suppression that chronically underfunded public research produces. As noted in the authors' limitations section, it is impossible to ascertain the number of researchers whose work is suppressed before it has even begun. The unspoken or whispered warnings to keep one's head down; stick to something safer; only 'pick winners'. We know in Australia for example, that it is not worth writing a grant funding proposal that suggests the government's alcohol guidelines are not conducive to responsible drinking, or that school-based fitness programs may actually be counter-productive.

The only suggestions I would make are:

-Page 5, line 98 contains a missing apostrophe. It should read "researchers' experiences"

-Page 6, lines 129-130, title of Box 1 is convoluted. Consider renaming to “Options respondents were provided regarding funder behaviour”

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: Yes: Dr Jacqueline Hoepner

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Aug 18;16(8):e0255704. doi: 10.1371/journal.pone.0255704.r002

Author response to Decision Letter 0


19 Jul 2021

16 July 2021

Dear Dr Quinn Grundy

PLOS ONE

Thank you for your email dated 23rd June 2021 and the opportunity to revise and resubmit our manuscript entitled ‘"He who pays the piper calls the tune": Researcher experiences of funder suppression of health behaviour intervention trial findings’ (PONE-D-21-13007). Please find below our response to reviewers.

Reviewer #1: This is an interesting and important paper but I have a number of suggestions for improvement. It reports the extent to which funders of studies attempt to suppress publication of unfavourable results, which is a highly important topic.

Major comments:

1. Recruitment part slightly unclear: “We invited the first two authors, the last author, and the corresponding author, using contact information in the public domain, to complete the survey. Authors with available contact details were invited to complete a telephone interview.” If they were contacted they all had contact details available – what does this last sentence mean? How was it actually decided whether to do interview or survey? Did you only do interview where phone number was available? Why not email to arrange phone interview and ask for number then? Were questions the same?

We have rephrased the text around recruitment. Individuals were surveyed via telephone if they had a contact number available; if not, they were contacted via email and asked to complete the same survey online. We have clarified this in the text.

“Authors from identified articles were invited to participate if they were one of the first two authors, the last author, or the corresponding author. Contact information was sourced from the public domain. We contacted corresponding authors first to complete the survey on behalf of all authors. Corresponding authors could nominate co-authors to complete the survey on their behalf. If after four weeks we had no response from the corresponding author, we invited the first, second and/or last author of the trial manuscript (if different from the corresponding author) to participate. Authors with available telephone contact details were invited to complete the survey via a telephone interview. Prior to contacting authors via telephone, we emailed an invitation attaching a study information sheet, a summary of the survey topics to be covered, and an opt-out form. Authors without telephone contact details were contacted via email and sent the same information and a link to complete the same survey online via REDCap, a web survey hosting service.[19] Up to three reminder emails were sent to non-responders at intervals of approximately four weeks.”

2. It would be helpful to know what “reluctance” actually signifies. Funders shouldn’t be able to affect whether findings are published – did responses suggest that funders could actually prevent publication or just tried to influence? A related point is that it would be very interesting to know how researchers dealt with suppression attempts – particularly attacks on the research team.

3. Generally the discussion (which is rather short) and/or results would benefit from more detail on the details of suppression attempts and how they were handled, if these data are available.

We respond to comments 2 and 3 together. Unfortunately this information is not available. We have added to the discussion that this is a limitation of our research, and that further research would benefit from investigating.

“This research has its limitations. For example, we did not attempt to ‘deep dive’ into the different types of suppression. We found that reluctance to publish research findings due to unfavourable results was the most commonly reported type of suppression experience. However, we did not seek to determine what reluctance meant in this context – whether funders prevented publication or merely tried to influence it. Further research should investigate this in order to determine the most practical ways to overcome these types of suppression.”

Minor comments:

4. Abstract should mention other aspects apart from suppression specifically to give readers an idea of other behaviours.

We have added additional information to the abstract to indicate other parts of the study.

“We also collected information on study funder, geographical location and age of publication and how these were associated to reported suppression.”

5. Some minor lack of clarity in places, e.g. “improve smoking/substance abuse”

We have edited the text thoroughly to improve expression and clarity.

6. Why these specific health intervention areas and not others? A rationale should be given.

As this study is part of a larger study examining public health primary prevention researchers’ experiences in developing, conducting and evaluating public health interventions, the health intervention areas were selected as areas of interest.

We have added to the text: “These risk behaviour areas were chosen because they were the focus of the larger cross-sectional study of which this study is a part.”

7. Were respondents able to suggest other types of suppression activity? There may be other types not mentioned in your survey.

We did not give participants the opportunity to describe other forms of suppression. We have added this to the limitations section of the discussion.

“Additionally, we did not give participants the opportunity to describe other forms of suppression they may have experienced.”

Reviewer #2: This is an excellent and timely paper that is well written and comes to an important and well justified set of recommendations. I have only a few minor comments that will clarify the presentation and improve the general context of the research and interpretation.

8. I think it would be more usual to put the . after the references, such as [1, 2].

Thank you for your suggestion. We follow AMA in-text referencing guidelines which places references outside of periods and commas so have not updated the text for this suggestion.

9. 77-83. You could further improve the context by referring to the recent study on science suppression in ecology by Driscoll et al. 2020 in Conservation Letters, relevant because it addresses suppression in public good research in Australia.

Thank you for your suggestion. We have added this reference to the text.

“A survey of Australian ecologists and conservation scientists by Driscoll et al. indicated that Government and industry respondents reported higher rates of suppression than university respondents (34%, 30% and 5% respectively) and this suppression was mainly in the form of internal communication and media.”

10. 98. researchers' & 203. insert "the odds ratios of" at the start of this line.

We have updated the text for these suggestions.

11. 211-12. Is it worth reporting these extreme values, as neither is likely to be true, or even close to true? 224-5. That is a reasonable interpretation.

Given the large number of non-respondents to the survey (50%), we believe it adds context to the range of potential suppression which may have been reported by respondents. This additional analysis adds support to our discussion, where we highlight that we believe the reported prevalence of 18% is an underestimate, which you note is a reasonable interpretation.

12. 226. Hold on, you said at 201-2 that older studies had higher rates of suppression. Some extra words in the results to help interpret direction of effect in relation to the odds ratio would help readers and the authors to avoid this mix up. 228. It's a bit confusing to combine older and north American in this sentence, giving the impression it was only older north American studies, but these tests were independent, so it's old studies everywhere, and any aged study in north America.

Thank you for highlighting this oversight we have updated the text.

“It is difficult to determine why older publications, or those published outside North America, appeared to have greater odds of suppression than more recent publications and those published in North America (though the confidence intervals for all comparisons were wide and included 1). A possible explanation is that older studies, having existed longer, may have had more opportunity to experience suppression of their findings.”

13. 235-6; or north American researchers are more afraid to report suppression. Or you could look at the European countries that your studies came from to see if they include less democratic countries. In fact, in addition to the regional analysis, it would be interesting to examine the likelihood of suppression in relation to the global democracy index, which extends back to 2006, so could be aligned with the year each paper was published, or the year before.

We have added the variable full democracy (yes/no) to our analysis, based on the Global Democracy Index as suggested (please see Table 2). This analysis did not highlight any differences between full democracies and other countries.
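The alignment described above (matching each trial with the Democracy Index score for its country in the year of publication, then dichotomising into full democracies versus others) could be sketched as follows. This is a hypothetical illustration, not code or data from the study: the helper `full_democracy`, the 8.0 cutoff, and all scores shown are assumptions for demonstration; real values would come from the EIU's annual reports.

```python
from typing import Optional

# Hypothetical sketch: derive a dichotomous "full democracy" variable by
# joining each trial to a Democracy Index score for its (country, year).
# The 8.0 cutoff and the scores below are illustrative assumptions.
FULL_DEMOCRACY_CUTOFF = 8.0

index_scores = {
    ("Australia", 2012): 9.2,      # invented values for illustration
    ("United States", 2012): 8.1,
    ("Hungary", 2012): 6.9,
}

def full_democracy(country: str, year: int) -> Optional[bool]:
    """True/False for the dichotomised variable; None if no score is available."""
    score = index_scores.get((country, year))
    if score is None:
        return None
    return score >= FULL_DEMOCRACY_CUTOFF

# Attach the derived variable to each trial record before analysis.
trials = [
    {"id": 1, "country": "Australia", "year": 2012},
    {"id": 2, "country": "Hungary", "year": 2012},
]
for trial in trials:
    trial["full_democracy"] = full_democracy(trial["country"], trial["year"])
```

Aligning by the year of publication (or the year before, as the reviewer suggests) is then just a matter of which year is passed to the lookup.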

14. 251; spell out what that would mean, what kind of clauses must be excluded from contracts or included for example.

We have added to the text an example of the types of clauses we believe should be omitted.

“Government agencies must ensure appropriate terms in funding agreements formed with research providers that protect academic freedom, e.g. the removal of clauses which require approval of results prior to publication.”

15. Also, in this list of actions, be clear about who should do it for each point. It might also refer to other literature where similar lists of actions have been referred to (this might be best done in the list or as an introduction to the list). It would be nice to wrap up with a general, bigger picture paragraph about interference in public good research.

We have added additional information to the checklist and a summary paragraph to the end of the manuscript.

“Suppression still exists among public-good researchers, with this study suggesting rates of suppression for a single trial may be as high as one in five. Prevention is key, and these suggestions, similar to those previously described,[32] need to be adopted in order to thwart the suppression of public-good health research. As it is unlikely that instances of suppression will ever be entirely eliminated, further research is needed to determine ways of handling suppression at the researcher level when it does occur (e.g. what a researcher can do if someone tries to delay their publication), as well as to establish reporting procedures for such instances.”

16. I couldn't see if the data were available, which Plos seems to require.

We have added this data to the submission.

Reviewer #3: This was a very clear, concise and well-presented paper. It highlights a key aspect of research suppression, namely pressure from funders to produce and publish findings that align with the priorities of the day.

It was necessarily narrow in scope. However, it would be interesting to know whether respondents experienced suppression events from parties other than the funding body. In my experience, public health fields are particularly prone to silencing from within. It also would have been interesting to include discussion of the broader, more insidious suppression that chronically underfunded public research produces. As noted in the authors' limitations section, it is impossible to ascertain the number of researchers whose work is suppressed before it has even begun. The unspoken or whispered warnings to keep one's head down; stick to something safer; only 'pick winners'. We know in Australia for example, that it is not worth writing a grant funding proposal that suggests the government's alcohol guidelines are not conducive to responsible drinking, or that school-based fitness programs may actually be counter-productive.

The only suggestions I would make are:

-Page 5, line 98 contains a missing apostrophe. It should read "researchers' experiences"

Thank you, we have updated this in the text.

-Page 6, lines 129-130, title of Box 1 is convoluted. Consider renaming to “Options respondents were provided regarding funder behaviour”

Thank you, we have updated this in the text.

We hope you find these modifications and explanations satisfactory, and we look forward to our manuscript being considered for publication in PLOS ONE.

Kind regards,

Dr Sam McCrabb

School of Medicine and Public Health

Faculty of Health and Medicine

Attachment

Submitted filename: Response Letter 07-07-2021.docx

Decision Letter 1

Quinn Grundy

22 Jul 2021

"He who pays the piper calls the tune": Researcher experiences of funder suppression of health behaviour intervention trial findings

PONE-D-21-13007R1

Dear Dr. McCrabb,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Quinn Grundy, PhD, RN

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Quinn Grundy

26 Jul 2021

PONE-D-21-13007R1

“He who pays the piper calls the tune”: Researcher experiences of funder suppression of health behaviour intervention trial findings

Dear Dr. McCrabb:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Quinn Grundy

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Table. Characteristics of responders and non-responders.

    (DOCX)

    S1 File

    (XLSX)

    S2 File

    (XLSX)

    S3 File

    (XLSX)

    S4 File

    (XLSX)

    Attachment

    Submitted filename: Response Letter 07-07-2021.docx

    Data Availability Statement

    All relevant data are within the paper and its Supporting Information files.

