Monday, June 01, 2015

Sokal Hoax reborn...

...this time with BS health science.

Because the whole thing was a bullshit hoax put on by a journalist to make the point that, at least when it comes to studies around diet and health, the journals and the media that report on their papers are largely full of crap. Go read the entire thing, because it's absolutely fascinating, but I'll happily give you the truncated version. John Bohannon, who has a Ph.D. in the molecular biology of bacteria and is also a journalist, conspired with a German reporter, Peter Onneken, to see how easily they could fool the media into running BS headlines. They did this by turning John Bohannon into Johannes Bohannon (obviously) and creating a website for the Institute of Diet and Health, which isn't actually a thing. Then they conducted a very real study with three groups: one group eating a low-carb diet, one group eating their regular diet, and one group eating a low-carb diet plus a 1.5 oz bar of dark chocolate daily. After running background checks on the subjects, conducting blood tests to screen for disease and eating disorders, and hiring a German doctor and a statistician to perform the study, away they went. The results?
Onneken then turned to his friend Alex Droste-Haars, a financial analyst, to crunch the numbers. One beer-fueled weekend later and... jackpot! Both of the treatment groups lost about 5 pounds over the course of the study, while the control group’s average body weight fluctuated up and down around zero. But the people on the low-carb diet plus chocolate? They lost weight 10 percent faster. Not only was that difference statistically significant, but the chocolate group had better cholesterol readings and higher scores on the well-being survey.

Bam, results! Not just results, but results the media would absolutely love to sink their idiotic teeth into. The problem? Well, the method for running the entire study was bullshit.
Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.

Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out—the headline could have been that chocolate improves sleep or lowers blood pressure—but we knew our chances of getting at least one “statistically significant” result were pretty good.
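To put a rough number on that lottery-ticket analogy, here's a back-of-the-envelope calculation. It's mine, not Bohannon's, and it treats the 18 measurements as independent tests at the usual p < 0.05 cutoff, which real blood panels and surveys aren't quite:

alpha = 0.05      # the usual p < 0.05 significance cutoff
n_tests = 18      # number of measurements taken in the study

# Chance that pure luck produces at least one "significant" result
p_at_least_one = 1 - (1 - alpha) ** n_tests
print(f"{p_at_least_one:.0%}")   # roughly 60%

In other words, before they weighed a single subject, the deck was already stacked toward some kind of publishable headline.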

Bohannon goes into some of the gory math, and it really is fun to read, but the core idea is pretty easy to understand. With a small enough sample size and enough different outcomes being measured, random chance alone will almost always hand you at least one "statistically significant" result, far more often than if you tested fewer factors or more people. It's simple: people are different, and with so few subjects, ordinary person-to-person variation is easy to mistake for a significant effect.
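If you'd rather watch it happen than take the arithmetic on faith, here's a small simulation sketch. It's entirely my own construction, not Bohannon's analysis, and it assumes plain two-sample t-tests on tiny groups where every measurement is nothing but random noise:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

N_TRIALS = 10_000   # simulated "studies"
N_PER_GROUP = 5     # tiny groups, like the hoax's 15 subjects split three ways
N_MEASURES = 18     # weight, cholesterol, sleep quality, and so on
ALPHA = 0.05

hits = 0
for _ in range(N_TRIALS):
    # Both groups are pure noise: there is no real effect of any kind.
    group_a = rng.normal(size=(N_PER_GROUP, N_MEASURES))
    group_b = rng.normal(size=(N_PER_GROUP, N_MEASURES))
    # Independent two-sample t-test on each of the 18 measurements.
    _, p_values = stats.ttest_ind(group_a, group_b, axis=0)
    if (p_values < ALPHA).any():
        hits += 1

print(f"Studies with at least one 'significant' finding: {hits / N_TRIALS:.0%}")

Run that and you'll land somewhere around 60%, even though the "effect" is zero by construction. The only thing driving the hit rate is how many lottery tickets you buy.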

Be skeptical, even about "sciency" things.

