Researchers at the centre of a row over a trial of a new arthritis drug have defended their actions, following a controversial article in the Washington Post earlier this week (5 August). The data they presented to JAMA, the Journal of the American Medical Association, showed the results of only the first six months of the trial.
The newspaper article accuses researchers of intentionally giving a prominent gastroenterologist incomplete data on the safety of a popular arthritis drug so that he might write a more favourable editorial about their study.
The editorial, published in the 13 September issue of JAMA (2000;284:1297-9), was co-written by Dr Michael Wolfe, a gastroenterologist at Boston University, and concerned data from a then unpublished study involving more than 8000 patients. The drug involved, celecoxib, was associated with lower rates of stomach and intestinal ulcers than two older drugs for arthritis, diclofenac and ibuprofen.
The data made available to Wolfe encompassed only the first six months of the study. JAMA's editors reportedly wanted to rush these findings into print, and Wolfe and a colleague provided a favourable editorial to accompany the paper.
But in February Wolfe was shown the complete data from the same study as a member of the Food and Drug Administration's arthritis advisory committee, and he saw a different picture, said the story in the Washington Post.
The study, already completed at the time Wolfe wrote the editorial, had lasted a year, not six months as he had thought, and almost all of the ulcer complications that had occurred during the second half of the study were in users of celecoxib. When all of the data were considered, some of the drug's apparent safety advantage was diminished.
“For a group of researchers to send incomplete information to a journal for consideration while knowing that a more complete set will be reviewed by an authority figure like the FDA would seem very strange,” said former JAMA editor in chief George Lundberg. “That is, unless the time-sensitive marketing advantage of a drug with blockbuster sales potential was so compelling that the manufacturer was willing to take that chance to gain an early mass sales advantage.”
Jay Goldstein, professor of medicine at the University of Illinois in Chicago and one of the study's authors, said the Washington Post's account was inaccurate. He said the issue largely involved how best to present the results of the trial after there were an unusually large number of dropouts from the diclofenac arm of the study, mostly because of adverse events. “To put it bluntly, if you were looking to see if patients bleed at a different rate then when a lot of patients that leave are on diclofenac, you really can't continue the study.”
Goldstein said the best data on outcomes in all three arms of the study were available by looking at the six month time frame. The 12 month data were widely available, so there was never an effort to deceive the public, he said. “The original intent was to follow patients for at least six months and compare [the three drugs], so for that particular study, the researchers believed that data best reflected the comparisons they were trying to make.”