Abstract
EQUATOR is an essential web resource for researchers, reviewers, editors, and readers
A young woman, just making ends meet and coping with four children, signed up to a breast cancer study in which she would have to take two big pills every day for two years and show up for frequent tests. Why would she put herself through that, wondered the researcher who went to obtain her consent. “I’m doing it for my daughter,” said the mother, clearly expecting the study to yield usable, meaningful, and accessible evidence that might help prevent breast cancer in young women. Would she have consented so readily had she known that some studies are never published and that many are reported so poorly that they are barely read and never used? This tale was told by that same researcher, Davina Ghersi, coordinator of the World Health Organization international clinical trials registry,1 at a meeting in London last month. Dr Ghersi was there to help launch the EQUATOR (enhancing the quality and transparency of health research) international network, which seeks to improve the quality of scientific publications by promoting transparent and accurate reporting of health research.
Registration, publication, and publicly available reporting of health research are already mandated by several sponsors and funders,2 3 some legislators,4 and many editors,5 particularly for clinical trials. The next big challenge is to decide when and how to disclose the results of a trial at a publicly available research registry, and what should go into a minimum dataset.6 Yet even journals, some of which have been in the business of reporting research for many decades, are still not producing articles that are clear enough to really judge a study’s conduct, quality, and importance—let alone to allow other researchers to reproduce it or build on it. With help from EQUATOR, journals should now be able to do a much better job and give authors the specific guidance they need to write up research properly.
Editors already provide instructions to authors, but this advice tends to be either unhelpfully vague and brief or comprehensively long and daunting—for instance, the BMJ’s advice currently extends to well over 20 000 words (http://resources.bmj.com/bmj/authors). The development of more than 80 guidelines for reporting different study types, many of them labelled by acronyms, adds to the confusion. Do authors know where to find these guidelines, and do editors and reviewers know how to use them? Do you know your MOOSE (meta-analysis of observational studies in epidemiology) from your STROBE (strengthening the reporting of observational studies in epidemiology)?
The EQUATOR website (www.equator-network.org/) comes to the rescue with a digital library of links to reporting guidelines. These guidelines give point-by-point advice that enables researchers to say unambiguously what they actually did and did not do in their study, how they did it, and what they found, thus allowing honest discussion of the study’s meaning, strengths, and weaknesses. As well as explanatory documents, these guidelines usually incorporate one or two tools: a checklist of items that must be reported clearly (with an empty column in which authors add the page number on which each item appears in their manuscript), and a template for a flowchart showing what happened to participants at each stage of the study.
The oldest and best known of the current guidelines is the CONSORT (consolidated standards of reporting trials) statement. This has spawned several extensions for different types and aspects of randomised controlled trials, and a plethora of other guidelines have now been developed by consensus groups of experts. The EQUATOR team has identified these guidelines through systematic literature searches, has pulled them together in one place, and has grouped them simply by type of study—including experimental studies, observational studies, systematic reviews, qualitative research, economic evaluations, quality improvement studies, and industry sponsored studies.
This is a real boon for the researcher, reviewer, or editor with a desire for clarity but a poor memory for acronyms. EQUATOR is a good resource for readers and learners too. Although these reporting guidelines are not explicitly intended to be critical appraisal tools, anyone running a journal club or sitting an exam that might test research skills should also find them useful. And the digital library is just the start of a comprehensive programme of work on knowledge translation. Over the next five years the EQUATOR network plans to develop much fuller online resources, including training materials for guideline developers, authors, reviewers, and editors, as well as published articles about improving the reporting of research. The network also aims to audit, every year for the next five years, the quality of reporting in health research and the performance of journals in implementing these guidelines.
The BMJ is actively supporting the EQUATOR initiative. We ask researchers to prepare each research article in line with the appropriate reporting guideline and to submit each manuscript with the right checklist properly completed and, if necessary, the right flowchart (http://resources.bmj.com/bmj/authors/types-of-article/research) (box). We will not send a research article for external peer review without these, thus giving our policy some teeth and helping reviewers to understand the study’s conduct and quality.
How to write a clear research paper
Go to the free “one stop shop” at EQUATOR (http://www.equator-network.org/) to find and follow the right guideline for reporting the type of study you have done
Next, follow the more general guidance at the International Committee of Medical Journal Editors (http://www.icmje.org/#author) on manuscript preparation and submission
Now you will have a manuscript that should pass muster at any medical journal, and you will not have to spend much longer fitting the paper to a journal’s specific requirements (for example, http://resources.bmj.com/bmj/authors), even if you have to try several journals sequentially
Editors should not, however, use these reporting guidelines to reject studies that fall short of some fixed or arbitrary threshold for quality. In difficult and new areas of research, imperfectly conducted studies often provide good enough evidence to change policy or practice or to inform the next phase of research. Such studies deserve to be published, warts and all, and reporting guidelines point out where the warts are and how big they are. Using another bodily metaphor, Ian Needleman, director of London’s International Centre for Evidence-based Oral Health, said at the EQUATOR launch, “Research reporting is too often like swimwear: what it reveals is suggestive; what it conceals is vital.”
Competing interests: The BMJ Group sponsored the EQUATOR launch meeting and has sponsored consensus meetings to develop the CONSORT statement.
Provenance and peer review: Commissioned; not externally peer reviewed.
Cite this as: BMJ 2008;337:a718
References
- 1.World Health Organization. International Clinical Trials Registry Platform. www.who.int/ictrp/about/details/en/index.html
- 2.Medical Research Council. MRC policy on data sharing and preservation. www.mrc.ac.uk/PolicyGuidance/EthicsAndGovernance/DataSharing/PolicyonDataSharingandPreservation/index.htm
- 3.National Institutes of Health. NIH data sharing policy. http://grants.nih.gov/grants/policy/data_sharing/
- 4.US Food and Drug Administration. Law strengthens FDA. www.fda.gov/oc/initiatives/advance/fdaaa.html
- 5.Laine C, Horton R, DeAngelis CD, Drazen JM, Frizelle FA, Godlee F, et al. Clinical trial registration. BMJ 2007;334:1177-8.
- 6.Krleža-Jerić K. International dialogue on the public reporting of clinical trial outcome and results—PROCTOR meeting. Croat Med J 2008;49:267-8.