BMJ. 2008 Mar 29;336(7646):722. doi: 10.1136/bmj.39525.447361.94

Why promote the findings of single research studies?

Paul Wilson 1, Mark Petticrew 2, Medical Research Council’s Population Health Sciences Research Network knowledge transfer project team

“Since when has a single scientific study constituted ‘the truth’ about anything?”1

Scientists have known about biases in single observations for centuries.2 A wealth of empirical evidence amassed across many disciplines tells us that single studies can be biased, are often seriously flawed methodologically, are highly time and context dependent, and have findings that are likely to be misinterpreted and misrepresented (sometimes by the authors themselves). It is increasingly accepted that decisions should not be based on the findings of single primary studies but should instead be informed by actionable messages derived from evidence synthesised in systematic reviews.3 4 5 Over the past decade there has been substantial public funding of synthesised evidence and of guidance to support healthcare decision making. In the United Kingdom this investment has been described as NHS research and development’s most important contribution to the global science base.6

Despite this investment, the evidence indicates that, although the transfer of research knowledge is possible, its success is variable.7 8 There is now renewed interest in, and emphasis on, the gaps between research and policy and practice, nationally9 10 and internationally.11 This emphasis on bridging the gap may be read as encouragement to strive even harder to promote research to ever wider audiences, to be seen to be providing a return on investment. However, it is worth pausing to consider the benefits and costs of such activity and to ask why the research community continues to place such emphasis on promoting the single primary research study.

The emphasis on the single study drives submissions to the current UK Research Assessment Exercise (RAE). Although the RAE recognises that original research may include systematic reviews, many academic organisations prioritise, or sometimes only include, primary research, on the assumption that systematic reviews are not relevant or are less important.

Most bodies that commission research also expect grant holders, regardless of study design, to make some commitment to disseminating their findings in ways that go beyond the traditional medium of academic publication. The emphasis is on communicating and interacting with wider policy and health service audiences in ways that will facilitate the uptake of research results in practice and policy. The rationale, appropriate conditions, and contexts for such interactions are rarely specified by the funding body, nor is there any acknowledgment that evidence set in context is of most value to decision makers.

Some medical journals encourage researchers to discuss their findings in the context of the existing and relevant evidence base. However, research suggests that researchers and journal editors have made little progress in ensuring that single studies are set in context or that the Consolidated Standards of Reporting Trials (CONSORT) recommendations are adhered to.12 13 14 Furthermore, every week journals issue press releases aimed at generating publicity for the most “newsworthy” studies and for the journal itself. There is some evidence that primary research studies are proportionally more likely than systematic reviews to be included in press releases.15

The emphasis on promoting newsworthy research is not restricted to academic journals. Universities, research centres, funders, and councils all use the media to promote research. Rather than attempting to explain new findings in the context of existing knowledge, media strategies are used to build the corporate image of the host institution or funder, to set agendas, and to attract and secure future funding. Using the media to promote the results of primary research can cause harm and can erode trust in, and understanding of, science more generally.16 17 18 19 20 More often than not the latest breakthroughs, miracle cures, and wonder drugs are based on single studies.21 Notes of caution, suggestions that further research is needed, and admissions that a viable treatment is actually years away are often buried within the press release. These disclaimers are rarely, if ever, reported prominently, and the research community, while quick to raise concerns about media accuracy, rarely questions the appropriateness of media dissemination.

The renewed interest in, and emphasis on, knowledge translation is to be welcomed. But more than ever researchers, especially those conducting primary research, need to consider carefully the costs and benefits of dissemination. Not every study needs to be widely disseminated. Most primary research needs to be set in context, verified, and built on, moving the field forward incrementally, before it can have wider application. Rather than encouraging researchers to find ever more creative ways to get their research noticed, funders should encourage them to demonstrate that they have carefully considered the appropriateness of their plans for dissemination.

Medical journals can do more to ensure that researchers discuss the findings of primary studies in the context of the existing and relevant evidence base. The Academy of Medical Sciences in London has recently argued that researchers, funders, and institutions should take greater responsibility for the accurate communication of non-experimental research.20 In truth, the research community as a whole needs to be more circumspect about the active promotion of primary research. Although all research has an audience, and should be made accessible, not all research can or should have an impact on practice or policy.

Contributors: This paper was produced as part of a research project funded by the Medical Research Council (MRC) Population Health Sciences Research Network. The project team comprises Mark Petticrew (London School of Hygiene and Tropical Medicine), Mike Calnan (University of Kent), Irwin Nazareth (MRC General Practice Research Framework), and Paul Wilson (Centre for Reviews and Dissemination). All team members contributed to the conception, design, and writing of this paper.

References

1. Blackman S, Pile B. Climate science: truth you can wear on your hands. www.battleofideas.org.uk/index.php/site/battles/957/
2. Chalmers I, Hedges L, Cooper H. A brief history of research synthesis. Eval Health Prof 2002;25:12-37.
3. Black N, Donald A. Evidence based policy: proceed with care. BMJ 2001;323:275-9.
4. Lavis JN, Robertson D, Woodside JM, McLeod CB, Abelson J, Knowledge Transfer Study Group. How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q 2003;81:221-48.
5. Sechrest L, Backer T, Rogers E. Effective dissemination of clinical and health information. In: Sechrest L, Backer T, Rogers E, Campbell T, Grady M, eds. Conference summary: effective dissemination of clinical and health information. Rockville, MD: US Department of Health and Human Services, Agency for Health Care Policy and Research, 1994:1-8.
6. Horton R. Health research in the UK: the price of success. Lancet 2006;368:93-7.
7. Grimshaw J, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess 2004;8:iii-iv,1-72.
8. Sheldon TA, Cullum N, Dawson D, Lankshear A, Lowson K, Watt I, et al. What’s the evidence that NICE guidance has been implemented? Results from a national evaluation using time series analysis, audit of patients’ notes, and interviews. BMJ 2004;329:999.
9. Cooksey D. A review of UK health research funding. London: Stationery Office, 2006.
10. Department of Health. Best research for best health: a new national health research strategy. London: DH, 2006.
11. World Health Organization. World report on knowledge for better health: strengthening health systems. Geneva: WHO, 2004.
12. Clarke M, Chalmers I. Discussion sections in reports of controlled trials published in general medical journals: islands in search of continents? JAMA 1998;280:280-2.
13. Clarke M, Alderson P, Chalmers I. Discussion sections in reports of controlled trials published in general medical journals. JAMA 2002;287:2799-801.
14. Clarke M, Hopewell S, Chalmers I. Reports of clinical trials should begin and end with up-to-date systematic reviews of other relevant evidence: a status report. J R Soc Med 2007;100:187-90.
15. Bartlett C, Sterne J, Egger M. What is newsworthy? Longitudinal study of the reporting of medical research in two British newspapers. BMJ 2002;325:81-4.
16. Hargreaves I, Lewis J, Speers T. Towards a better map: science, the public and the media. Swindon: Economic and Social Research Council, 2003.
17. Speers T, Lewis J. Misleading media reporting? The MMR story. Nat Rev Immunol 2003;3:913-8.
18. Petts J, Niemeyer S. Health risk communication and amplification: learning from the MMR vaccination controversy. Health Risk Soc 2004;6:7-23.
19. McCartney M. Mixed messages over breast milk and brainy babies. BMJ 2007;335:1074.
20. Academy of Medical Sciences. Identifying the environmental causes of disease: how should we decide what to believe and when to take action? A report by the Academy of Medical Sciences. London: Academy of Medical Sciences, 2007.
21. Lawson M. Medical researchers need to stop hamming it up. BMJ 2007;335:1046.
