Tuesday, September 2, 2014

15 ways to tell if that science news story is hogwash

Just because a study has been published in a scientific journal doesn't mean it's perfect — there are plenty of flawed studies out there. But how can we spot them?

The excellent chart below offers "A Rough Guide to Spotting Bad Science." It was put together by the blogger behind the chemistry site Compound Interest. This isn't meant to be an exhaustive list — and not all of these flaws are necessarily fatal. But it's an excellent guide to what to look for when reading science news and scientific studies:

[Infographic: "A Rough Guide to Spotting Bad Science," by Compound Interest]

Here's a more detailed breakdown:

1) Sensationalized headlines: Behind sensationalized headlines are often sensationalized stories. Be wary.

2) Misinterpreted results: Sometimes the study is fine but the press has completely messed it up. If it's a big story, MIT's Knight Science Journalism Tracker will often tell you which reporters did a good job and shame those who didn't. Otherwise, it's often a good idea to read the original study itself.

3) Conflicts of interest: Who funded the research in question? If you see a study claiming that drinking grape juice helps your memory and it's funded by the grape industry, then think about that a bit. (This happens all the time: lots of studies on random foods being good for you are funded by the relevant food councils.)

Be careful: some journals require researchers to disclose conflicts of interest and funding sources, but many do not. And not all conflicts of interest involve funding. For example, be a bit suspicious of a researcher testing a medical device who also consults for a company that makes medical devices.

4) Correlation and causation: Just because two things are correlated doesn't mean that one caused the other. If you want to really find out whether something causes something else, you have to set up a controlled experiment. (Compound Interest's infographic brings up the fabulous example of the correlation between fewer pirates over time and increasing global temperature. It's almost certain that fewer pirates did not cause global temperatures to rise, but the two are correlated.)
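
To make the point concrete, here is a minimal Python sketch, using made-up numbers in the spirit of the pirates-vs-temperature joke, of how two unrelated quantities that both change over the same period end up strongly correlated:

import numpy as np

# Hypothetical figures only: pirate counts falling over time,
# temperature anomaly rising over the same period.
pirates = np.array([20000, 15000, 5000, 400, 50, 17])
temp_anomaly = np.array([-0.2, -0.1, 0.0, 0.1, 0.3, 0.5])

r = np.corrcoef(pirates, temp_anomaly)[0, 1]
print(f"r = {r:.2f}")  # strongly negative, yet nobody thinks pirates cool the planet

Both series simply trend over time, which is enough to produce a large correlation with no causal link in either direction.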

5) Speculative language: You can say anything with the word "could" and it could be true. Jelly beans could be the reason that the average global temperature is increasing. Unicorns might cause cancer. And pygmy marmosets may be living in the middle of black holes.

6) Small sample sizes: Did the researchers study a large enough group to know that the results aren't just a fluke? That is, did they treat cancer in two people or in 200? Or 2,000? Was that brain scanning psych study on just seven people?
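
A quick simulation (entirely made-up data, with no real treatment effect at all) shows why tiny samples produce flukes:

import numpy as np

rng = np.random.default_rng(0)

def fraction_of_big_flukes(n, trials=2000):
    """Fraction of simulated studies where two identical groups of size n
    differ by more than half a standard deviation purely by chance."""
    a = rng.normal(size=(trials, n))
    b = rng.normal(size=(trials, n))
    return np.mean(np.abs(a.mean(axis=1) - b.mean(axis=1)) > 0.5)

for n in (2, 200, 2000):
    print(n, fraction_of_big_flukes(n))

With 2 people per group, a "big" difference shows up by luck alone in roughly 60 percent of the simulated studies; with 2,000 per group, it essentially never does.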

7) Unrepresentative samples: If a researcher wants to make claims about how all people think, but she only studies the college students who show up to her university lab, then she can only really draw conclusions about how those college students think. One cultural group can't tell you about all of humanity. This is just one example, but it's a pervasive issue.

8) No control group used: Without a control group that doesn't get the treatment, there's nothing to compare the results against — so why would anyone even waste their time doing a study like this?

9) No blind testing used: The placebo (and nocebo) effects are strong. (Awesome, three-minute video on the crazy effects of placebos here.)

In medical and psychology studies, participants should not be aware of whether they're in the control group. Otherwise their expectations will muddle the outcomes. And, if at all possible, the researchers who interact with the participants should also be unaware of who is in the control group. Studies should be performed under these double-blind conditions unless there is some really good reason that it cannot be done that way.
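
As an illustration only (hypothetical participant IDs), here is one way random, blinded assignment can be set up so that the people running the sessions only ever see opaque group codes:

import random

participants = ["P01", "P02", "P03", "P04", "P05", "P06"]  # hypothetical IDs

random.seed(42)                 # fixed seed just so the sketch is reproducible
shuffled = participants[:]
random.shuffle(shuffled)
half = len(shuffled) // 2

# Researchers who interact with participants only ever see "A" or "B".
assignments = {pid: ("A" if i < half else "B") for i, pid in enumerate(shuffled)}

# The code-to-treatment key is held by a third party until the analysis is locked.
key = {"A": "treatment", "B": "placebo"}
print(assignments)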

10) "Cherry-picked" results: Ignoring the results that don't support your hypothesis will not make them go away. It's possible that the worst cherry-picking happens before a study is published. There's all kinds of data that the scientific community and the public will never see.

11) Unreplicable results: If one lab discovers something once, it's sort of interesting. However, that lab could have some random result or — rarely, but possibly — be filled with liars. If someone else can replicate it, then it becomes far more real.

12) Journals and citations: That something was published in a fancy scientific journal or has been cited many times by others doesn't mean that it's perfect research, or even good research.

A few more tips for evaluating science news

13) Check for peer review: Just because you saw it in a news story doesn't mean that it's been looked over by an independent group of scientists. Maybe the results were presented at a conference that doesn't review presentations. Maybe it went straight from the operating table to the press, like recent uterus transplants.

14) Results not statistically significant: Generally, researchers want to see a statistical analysis showing that results at least as extreme as theirs would have less than a 5 percent chance of turning up if chance alone were at work (a p-value below 0.05). Some fields are even stricter than that. This gives a reasonable degree of confidence that you're looking at a real effect, not just a stroke of good luck.
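
In practice that check often looks like the following minimal sketch, with made-up measurements, using a standard two-sample t-test and the conventional 0.05 cutoff (scipy is assumed to be available):

from scipy import stats

control   = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2, 5.1, 4.9]   # hypothetical scores
treatment = [5.6, 5.4, 5.8, 5.5, 5.3, 5.7, 5.6, 5.9]

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"p = {p_value:.4f}")
if p_value < 0.05:
    print("Statistically significant at the conventional 5 percent level.")
else:
    print("Not significant: the difference could plausibly be chance.")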

15) Confounding variables: Might something else be causing the effect that you see? Did the statistical analysis take that into account?
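
Here is a minimal simulated sketch of the problem: a third variable drives both the "exposure" and the "outcome," so the two correlate strongly until you control for it:

import numpy as np

rng = np.random.default_rng(1)
n = 5000

confounder = rng.normal(size=n)                # e.g. overall health
exposure   = confounder + rng.normal(size=n)   # partly driven by the confounder
outcome    = confounder + rng.normal(size=n)   # also driven by the confounder, not by the exposure

print("Raw correlation:", round(np.corrcoef(exposure, outcome)[0, 1], 2))

def residualize(y, x):
    """Remove the part of y that is linearly explained by x."""
    slope = np.cov(x, y)[0, 1] / np.var(x)
    return y - slope * x

adjusted = np.corrcoef(residualize(exposure, confounder),
                       residualize(outcome, confounder))[0, 1]
print("After controlling for the confounder:", round(adjusted, 2))

The raw correlation comes out around 0.5; once the confounder is accounted for, it drops to roughly zero.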
