In the recent past, several important events took place in the realm of medical literature. On 24 June 2010, US District Judge Michael Ponsor handed down a 6-month sentence to anesthesiologist Scott Reuben, who had pled guilty earlier in the year to falsifying research on the analgesics celecoxib (Celebrex; Pfizer) and rofecoxib (Vioxx; Merck) for postoperative pain management, including the fabrication of data in a paper published in Anesthesia & Analgesia.[1] Subsequently, 10 of Reuben's articles were retracted from Anesthesia & Analgesia. A Bio-News item published on 27 September 2010 describes the retraction of two more papers by Dr Savio Woo, a gene therapist at the Mount Sinai School of Medicine in New York.[2] Dr Woo retracted six papers that year after two of his post-docs, Li Chen and Zhiyu Li, were accused of scientific misconduct. In December 2010, the International Anesthesia Research Society announced the retraction of the paper "Cardiopulmonary Bypass Priming Using a High Dose of a Balanced Hydroxyethyl Starch Versus an Albumin-Based Priming Strategy" by Prof. J Boldt for falsification of data.[3] The March 2011 issue of Anesthesia & Analgesia states: "Ladies and gentlemen, we have an apparent retraction record holder: Joachim Boldt, at 89 retractions." In all, the March issue of Anesthesia & Analgesia announced the retraction of seven articles and carried a series of editorials and articles on research misconduct and on the editorial steps taken to detect it prior to publication. In his editorial describing the fabricated research, Dr Steven L. Shafer[4] of Columbia University, Editor-in-Chief of Anesthesia & Analgesia, writes, "My commitment to 'unimpeachable integrity' means that credible allegations of misconduct are not ignored or swept under the rug, but are pursued, relentlessly, and sometimes at considerable personal cost." With retraction notices appearing everywhere, Urdaneta et al. voluntarily retracted their paper published in the Annals of Cardiac Anaesthesia[5] after the Annals of Thoracic Surgery retracted their paper published in 2004.[6] Needless to say, these papers have seriously dented the true value of the evidence and have raised questions about the veracity of the scientific literature.
The growth of science depends on research, and there is no denying that the understanding physicians have of various scientific subjects today is the product of research. The fundamental purpose of research is to know the truth (in medical research, to know what is good for our patients) and to benefit society. Scientific journals spread new knowledge, push forward the frontiers of current knowledge, allow the publication of creative ideas and form the basis for ongoing innovation. The development of the heart-lung machine, open-heart surgery, imaging technologies, electrophysiology-based diagnosis and treatment, key-hole surgery, video-assisted surgery, image-guided surgery, in vitro fertilization, and ventilatory and circulatory support devices are among the greatest medical advances of the last century, made possible by the research efforts of pure scientists, technologists, basic laboratory researchers and clinicians. The clinicians are the final link that delivers the fruits of research and development to patients and to society. Why such falsifications? Are these incidents of falsification sporadic or rampant? Why this decay in the system? Are medical scientists so inhuman that they do not appreciate that a falsified publication can harm innocent patients, or are they victims of the existing circumstances? Is there an imbalance in the distribution of the rewards of research?
The reasons that drive research include the desire to know the truth and the quest for knowledge, the desire for recognition among peers and to enhance one's image and prestige, the linking of career advancement in academic institutions to the research output of faculty members, the phenomenon of one-upmanship, industrial incentives, and the belief of mentors and individuals in the philosophy of 'publish or perish'. Recently, Miller observed that anesthesiology may be in danger of becoming a "trade union" of technicians unless there is continual rejuvenation and development of new clinically relevant knowledge.[7] Apparently, there is immense pressure on the scientific community, and on anesthesiologists, to carry out research. Earlier, experts analysed their clinical experience and set the standards of medical practice (expert-based medical practice); now medical researchers, not necessarily physicians, analyse data from large databases, generate evidence-based knowledge, control and set the standards of medical practice, and direct the future growth of research. Over time, we have moved from expert-based to evidence-based medical practice. Evidence-based medicine aims to provide a stronger scientific foundation for clinical work; to achieve consistency, efficiency, effectiveness, quality and safety in medical practice; and to limit idiosyncrasies.[8] Apparently, researchers dictate the growth of knowledge, generate knowledge, arbitrate between good and bad, and are the torchbearers of the future. The extraordinary importance attached to research has stigmatized practicing clinicians as mere workers while casting researchers as the torchbearers of medical science. There is also a growing (ill-founded) belief that medical management outcomes depend solely on the practice of evidence-based medicine, that no importance attaches to individual clinical skills, experience and clinical judgement, and that patients are interchangeable.
As a fall-out of the academic rewards attached to research, there has been an extraordinary explosion in medical research, and voluminous material is submitted to journals to scrutinize and select from. Earlier, publication of a paper (if found acceptable) used to take about one to two years. Presently, a paper can complete the whole journey of submission, review, rejection and acceptance through several journals in about a year. Apparently, the workload of journal editors and peer reviewers has increased manifold, and their role has become extremely important and critical. They are increasingly burdened with the responsibility of ensuring that papers accepted for publication truly reflect the scientific evidence and enhance the existing literature.
The events cited at the beginning of this editorial indicate the presence of loopholes and gaps in the peer-review system as presently followed. Needless to emphasize, there is a need to tighten the peer-review system. However, in no way can a peer reviewer or an editor personally verify the veracity of the submitted material. What can one do to improve the review process? The reviewers are the strongest as well as the weakest link in the system. Perhaps we need to define and set criteria for the selection of reviewers. Who is this 'we'? Who is a good reviewer, and what are the criteria for being one? The author believes that clinicians who are directly involved in patient care, in research and in the analysis of their own clinical experience, and who frequently publish their experience, should make sound reviewers and editors. Researchers who work only in laboratories or libraries and confine themselves to data analysis are far removed from the reality of medical practice and should not be entrusted with the responsibility of deciding the future of medical care. This means that journal editors should periodically collect information on the clinical involvement of, and papers published by, their reviewers, and that only reviewers who are constantly involved in both patient care and research should be entrusted with reviewing clinical scientific papers. In case a doubt is raised by any reviewer about the veracity of the reported data, the editor should insist on submission of the raw data before finally accepting the article for publication. It is important to note here that the paper published by Prof. Joachim Boldt[3] raised several eyebrows. Why? The obvious reason: "the data presented in the paper did not tally with the day-to-day experience of perioperative physicians".
Apparently, we need to redefine the value of correspondence published in response to published papers; such letters represent the true experiences of practicing clinicians and should be taken seriously. Perhaps the author guidelines should make it mandatory for authors to reply to correspondence questioning the veracity of the data presented in an original research article. The chief editor should have the discretion to retract the paper in case the author(s) do not respond or send an unsatisfactory response. The author recently read a correspondence published in the Journal of Cardiothoracic and Vascular Anesthesia almost 5 years after the original research article was published.[9] Perhaps journals should accept correspondence in response to published papers indefinitely; this would place published papers in the right perspective. Retrospective analyses and case reports represent close scrutiny of day-to-day experience and are close to the truth. Randomized controlled trials, though considered level-one evidence, are easily amenable to falsification[1–3] and should be relied on only after due deliberation.
Publication of an original paper indicates the addition of new knowledge to the existing literature or a new insight into an existing difficult problem; however, one should appreciate that the generation of new knowledge is not an everyday affair. Apparently, the majority of published papers are not path-breaking research but a repetition of what is already known, presented in a more emphatic and convincing manner. Perhaps it is time to recognize that clinicians who toil day and night for patient care and do not publish their results are doing work equally important for the benefit of patients and society, and that they are not just workers. Maybe research should be directed at finding out why experts are experts, so that individual expertise can spread to many. Excellence in clinical work should be given the same recognition and importance in career advancement as the publication of a research article. This would reduce unnecessary and undesirable research, pave the way for truly meaningful scientific research, retain high-quality clinicians in clinical practice and prompt a reappraisal of the philosophy of "publish or perish." It should further be realized that data analysis and publication alone do not make a researcher a torchbearer of the future; due appreciation and rewards for clinicians are a must in our move toward excellence in medical practice.
It should be appreciated that this editorial does not aim to condemn research but to ensure that only high-quality research, based on true scientific data and aimed at benefiting patients and society, finds its way into the medical literature. We should all remember that the goal of researchers, clinicians and all those working in medical service is to benefit the patient and society. Medical research should be viewed with this perspective.
References
- 1. Reuben SS, Connelly NR. Postoperative analgesic effects of celecoxib or rofecoxib after spinal fusion surgery. Anesth Analg. 2000;91:1221–5. doi: 10.1097/00000539-200011000-00032. (Retraction announced by the editor.)
- 2. Bio-News, 27 September 2010. Sources and references: (I) Leading gene therapy researcher retracts more papers. Nature, 24 September 2010; (II) Mount Sinai says misconduct by postdocs led to retraction of gene therapy papers. Science, 17 September 2010; (III) Update on gene therapy researcher Savio Woo retractions: Two post-docs dismissed for fraud. Retraction Watch, 17 September 2010; (IV) Woo retractions. The Scientist, 24 September 2010.
- 3. Boldt J, Suttner S, Brosch C, Lehmann A, Röhm K, Mengistu A. Cardiopulmonary bypass priming using a high dose of a balanced hydroxyethyl starch versus an albumin-based priming strategy. Anesth Analg. 2009;109:1752–62. doi: 10.1213/ANE.0b013e3181b5a24b. (Retraction announced by the editor in: Anesth Analg 2010;111:1567.)
- 4. Retraction of falsified studies prompts recommitment to 'unimpeachable integrity' at Anesthesia & Analgesia. Press release, International Anesthesia Research Society (IARS); released 24 February 2011.
- 5. Urdaneta F, Lobato EB, Kirby DS, Sidi A. Treating myocardial stunning randomly, with either propofol or isoflurane following transient coronary occlusion in pigs. Ann Card Anaesth. 2009;12:113–21. doi: 10.4103/0971-9784.51362. (Retraction announced in: Ann Card Anaesth 2011;14:24.)
- 6. Urdaneta F, Willert JL, Beaver T, Naik B, Kirby DS, Lobato EB. Effects of a new phosphodiesterase enzyme type V inhibitor (UK 343-664) versus milrinone in a porcine model of acute pulmonary hypertension. Ann Thorac Surg. 2004;78:1433–7. doi: 10.1016/j.athoracsur.2004.04.039. (Retraction announced in: Ann Thorac Surg 2011;91:338.)
- 7. Miller RD. The place of research and the role of academic anaesthetists in anaesthetic departments. Best Pract Res Clin Anaesthesiol. 2002;16:353–70. doi: 10.1053/bean.2002.0247.
- 8. Timmermans S, Mauck A. The promises and pitfalls of evidence-based medicine. Health Aff (Millwood). 2005;24:18–28. doi: 10.1377/hlthaff.24.1.18.
- 9. Yang P, Ye L, Dai S, Liu B. Intraoperative transesophageal ultrasonography can measure left renal blood flow. J Cardiothorac Vasc Anesth. 2006;20:905–7. doi: 10.1053/j.jvca.2006.04.020.