SCI-ART LAB

Science, Art, Litt, Science based Art & Science Communication

We have been proved right over and over again. We keep complaining like wise old grandmas and, like concerned mamas, ask people to be cautious about science reports in the media. But what if you fall for the trap again? And what if the media behaves like a reckless teenager?

Here is my proof...

"Eating that bar of chocolate can HELP you lose weight". I read news stories with this headline in almost all the newspapers here some time back.

"Well, does it?", I thought for a while, laughed at it and threw it into the rubbish bin of my mind.

And forgot all about it.

And today we got this news that says

"Problems behind chocolate study 'fooling millions' run much deeper than just a prank/spoof/sting"

Science journalist John Bohannon has done it again – providing another example of how easy it is to get slipshod studies published and, now, how easy it is to get naive news coverage of a slipshod study. Of course, the study the coverage was based on was pure junk!

http://www.healthnewsreview.org/2015/05/problems-behind-chocolate-s...

But then, from India to Australia and the US to Germany, news organizations shared the findings, published in the International Archives of Medicine in late March. And this story spread like wildfire!

Bohannon, a science journalist who also holds a Ph.D., lays out how he carried out an elaborate hoax to expose just how easily bad nutrition science gets disseminated in the mainstream media. He fooled millions into thinking chocolate helps weight loss.

Read for yourself Bohannon's story of how he and his co-conspirators dreamed up and executed a plot: do an actual, albeit shoddy, study… submit it to a questionable journal… get it published almost overnight… write a news release… and watch the silly news coverage go viral in no time!

Diet science, Bohannon stresses, is still science – and reporters need to know how to cover it. "You have to know how to read a scientific paper — and actually bother to do it," he writes. "For far too long, the people who cover this beat have treated it like gossip, echoing whatever they find in press releases. Hopefully our little experiment will make reporters and readers alike more skeptical."

To be clear, the study involved was real — a randomized controlled trial. Bohannon and his partners, a German television reporter and his collaborator, really did recruit 16 people for a study on dieting. And they found that the ones who followed a low-carb diet and also ate a 1.5-ounce bar of dark chocolate daily lost weight faster than the control group that was dieting alone.

 There are plenty of problems here.

For starters, as Bohannon explains in great detail, the study design itself was flawed — it had too few subjects, and the research measured too many factors, making it likelier that some random factor would appear to have statistical significance. [Bohannon does an excellent job of explaining the specifics himself: Here's a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a "statistically significant" result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.

Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out—the headline could have been that chocolate improves sleep or lowers blood pressure—but we knew our chances of getting at least one “statistically significant” result were pretty good.

Whenever you hear that phrase, it means that some result has a small p value. The letter p seems to have totemic power, but it’s just a way to gauge the signal-to-noise ratio in the data. The conventional cutoff for being “significant” is 0.05, which means that there is just a 5 percent chance that your result is a random fluctuation. The more lottery tickets, the better your chances of getting a false positive. So how many tickets do you need to buy?

P(winning) = 1 − (1 − p)^n

With our 18 measurements, we had a 60% chance of getting some "significant" result with p < 0.05. (The measurements weren't independent, so it could be even higher.) The game was stacked in our favor.

It’s called p-hacking—fiddling with your experimental design and data to push p under 0.05—and it’s a big problem. Most scientists are honest and do it unconsciously. They get negative results, convince themselves they goofed, and repeat the experiment until it “works.” Or they drop “outlier” data points.]
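Bohannon's lottery-ticket arithmetic above can be checked with a short sketch. The 18 measurements and the p < 0.05 cutoff come from his own account; the assumption that the tests are independent is his stated simplification, and the simulation below is just an illustration, not his actual analysis:

```python
import random

p = 0.05             # conventional significance cutoff
n_measurements = 18  # outcomes measured in the chocolate study

# Analytic chance of at least one false positive among n independent tests:
# P(winning) = 1 - (1 - p)^n
p_at_least_one = 1 - (1 - p) ** n_measurements
print(f"Analytic: {p_at_least_one:.0%}")  # prints "Analytic: 60%"

# Monte Carlo check: simulate many studies in which NO real effect exists,
# so each measurement comes up "significant" purely by chance with probability p.
random.seed(42)
trials = 100_000
hits = sum(
    any(random.random() < p for _ in range(n_measurements))
    for _ in range(trials)
)
print(f"Simulated: {hits / trials:.0%}")
```

Both numbers land around 60%, matching Bohannon's figure: buy 18 lottery tickets at a 5% payoff rate each and you will probably win at least once.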

Then there's the journal that published it — a so-called pay-for-play publication, which failed to carry out peer review of the findings. It was accepted within 24 hours, and published two weeks later. As Bohannon himself exposed in another sting for the journal Science a couple of years ago, there are lots of these publications that will publish bad research for a fee.

And finally, none of the reporters who covered it asked an outside expert to weigh in on the research – standard operating procedure in good science journalism. If they had, an astute scientist would have spotted the problems with the study design immediately. The reporters around the world just cut-and-pasted Bohannon’s press release.  None did the due diligence – such as looking at the journal, looking for details about the number of study participants, or even looking for the institute Bohannon claimed to work for (which exists only as a website) – that was necessary to find out if the study was legitimate.

Bohannon's full story is long, but worth the read as an explanation of the pitfalls that plague science communication – especially nutrition information – in today's media climate.

The problem is massive, says Gary Schwitzer, the publisher of Health News Review. For the past nine years, the site has been dedicated to critiquing the media's coverage of health in an effort to improve it.

"He's really only scratching the surface of a much broader, much deeper problem," Schwitzer says. "We have examples of journalists reporting on a study that was never done. We have news releases from medical journals, academic institutions and industry that mislead journalists, who then mislead the public." And the pressure to publish or perish, he says, can lead well-intentioned scientists to frame their work in ways that aren't completely accurate or balanced or supported by the facts.

"We are really mired in a mess, the boundaries of which few people really have a sense for," says Schwitzer.

Some people argue that the "man on the street" will conclude from this prank that neither journalism nor science is to be trusted (1). Some say this is why they don't trust AGW theories.

I think this situation arises because people cannot differentiate between genuine science and pseudoscience. And we here try to rectify that problem (2). We tell people what to trust and how to trust science stories.

And to my shock, a part of the media here that had dedicatedly reported the original news now ran this story only up to the point of the researchers publishing it in a dubious journal and cautioning people about it, completely ignoring the part about how the media had blindly reported it!

And some other newspapers refused to correct themselves, ignoring the second part of the story altogether!

Hmmm....

References:

1. https://www.sciencenews.org/blog/culture-beaker/attempt-shame-journ...

2. http://kkartlab.in/group/some-science/forum/topics/how-to-trust-sci...



Replies to This Discussion


Attempt to shame journalists with chocolate study is shameful

https://www.sciencenews.org/blog/culture-beaker/attempt-shame-journ...


© 2024   Created by Dr. Krishna Kumari Challa.