Media Omit Basic Facts in Medical Reports
While much of what you read in the media about medical research is based on studies published in peer-reviewed journals, some of the most exciting work is discussed at medical conferences, where researchers share raw ideas that can range from future Nobel Prize material to total hooey.
Journalists sometimes go to these conferences looking for the interesting nuggets and a chance to report on potential breakthroughs before the competition.
But the media often omit basic facts in stories they report from professional medical conferences, a new study concludes.
"Scientific meetings are an important forum for researchers to exchange ideas and present work in progress. But much of the work presented is not ready for public consumption," said Lisa Schwartz, a Dartmouth Medical School associate professor. "The studies have undergone limited review and findings may change substantially by the time the final report is published in a medical journal." If it is ever published, that is.
In an email interview, Schwartz pointed out what most journalists already know: Studies presented at conferences often are not accompanied by adequate background information—such as a copy of an actual scientific paper—and writers are sometimes under strong pressure to file stories quickly.
Schwartz and colleague Steven Woloshin analyzed U.S. newspaper, TV and radio reports on research from five major scientific meetings. Their findings:
- Only 2 of 175 stories about unpublished studies noted that the study was unpublished.
- One-third of the articles failed to mention how many participants were in a study (studies with only a few test subjects are sometimes later refuted by larger studies).
- 40 percent of the reports did not quantify the main result of the research.
- Just one out of 17 news reports on animal studies noted that results might not apply to humans.
"Unless journalists are careful to provide basic study facts and highlight limitations, the public may be misled about the meaning, importance and validity of the research," Woloshin said.
Of course, studies that have been published in reputable journals sometimes turn out to be wrong, too.
A classic example occurred last year when Korean scientist Hwang Woo-suk claimed to have cloned human embryonic stem cells. The apparent breakthrough was reported in Science, one of the most prestigious journals on the planet. It turned out the scientist had fabricated the results, and even researchers working on the project didn't know the data had been faked.
In general, however, reporters have a better opportunity to properly represent work that has been peer reviewed.
Still, not all research that seems important when it is reported ultimately leads to the sorts of applications a scientist might foresee. Anti-aging research on worms and rats can be promising, for example, but that does not necessarily mean humans will one day live for centuries, as a few scientists claim. A study last year found that seven of 45 highly publicized studies published in major medical journals were later contradicted.
"Readers should approach the news with a healthy skepticism," Schwartz suggested.
Schwartz and Woloshin detailed the results of their study, by the way, in the June 4 issue of the Medical Journal of Australia.