Two studies have revealed widespread flaws in the reporting of animal experiments, the latest in a series of papers criticizing sloppy biomedical research.
A team led by Ulrich Dirnagl of the Charité Medical School in Berlin, Germany, found that although clinical-trial reports in major medical journals usually state how many patients died or withdrew during the study, animal studies generally do not report these data, or discard animals from the analysis without explaining why. The team recently reported in the journal PLOS Biology that such omissions can introduce large biases into results.
In the second study, published in the same journal, a team led by John Ioannidis, an epidemiologist at Stanford University, criticized the lack of data availability and detailed experimental protocols in biomedical papers. Ioannidis has repeatedly called for more reproducible and transparent research.
Dirnagl's team reviewed 100 reports published between 2000 and 2013, describing 522 trials that used rodents to test cancer and stroke treatments, and compared the numbers of animals reported in the methods and results sections of each paper. About two-thirds of the trials did not state whether the researchers had dropped any animals from the final analysis. Of the trials that did report animal numbers, about 30% (53 trials) had excluded rodents from the analysis, but only 14 explained why.
The researchers used computer simulations to confirm that these practices can seriously distort results. If biomedical scientists exclude animals in a biased way, for example by dropping outliers whose extreme values work against the hypothesis, the chance of obtaining a result that appears statistically significant but is actually due to chance increases fourfold, and the apparent treatment effect is exaggerated by an average of 175%, the team said.
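The paper does not publish its simulation code, but the mechanism it describes can be illustrated with a minimal Monte Carlo sketch: generate two groups of "animals" with no true treatment effect, then drop the single data point whose removal most strengthens the apparent effect, and watch the false-positive rate climb above the nominal level. All names and parameters below are illustrative assumptions, not the authors' actual simulation.

```python
import random
import statistics

def t_stat(a, b):
    # Welch-style t statistic between two samples
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (mb - ma) / ((va / len(a) + vb / len(b)) ** 0.5)

random.seed(0)
n, sims = 10, 2000
crit = 2.1  # rough two-sided 5% cutoff for ~18 degrees of freedom

naive_hits = biased_hits = 0
for _ in range(sims):
    # Both groups drawn from the same distribution: no real effect
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(0, 1) for _ in range(n)]

    if abs(t_stat(control, treated)) > crit:
        naive_hits += 1

    # Biased exclusion: drop the one animal whose removal
    # most inflates the apparent treatment effect
    best = abs(t_stat(control, treated))
    for i in range(n):
        without_c = control[:i] + control[i + 1:]
        without_t = treated[:i] + treated[i + 1:]
        for pair in ((without_c, treated), (control, without_t)):
            best = max(best, abs(t_stat(*pair)))
    if best > crit:
        biased_hits += 1

print("false-positive rate, all animals kept:", naive_hits / sims)
print("false-positive rate, biased exclusion:", biased_hits / sims)
```

Under these assumptions the second rate comes out several times higher than the first, even though only one animal per trial is ever discarded, which is the qualitative pattern the Dirnagl team reports.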
Meanwhile, the Ioannidis team analyzed a random sample of articles retrieved from PubMed that were published between 2000 and 2014. They found that none of the 268 biomedical papers shared their full data, and all but one lacked the details other researchers would need to reproduce the work. More than 90% of the papers analyzed from 2000 lacked conflict-of-interest statements, compared with about one-third of those from 2014.
"I have to say that I am worried but not surprised," said Malcolm Macleod, a stroke researcher and trial-design expert at the University of Edinburgh, UK. "These important findings are further evidence of the challenges we face in improving the quality of biomedical research."