
Science journals screw up hundreds of times each year. This guy keeps track of every mistake.


In 2010, on a nerdish whim, Ivan Oransky — a medical editor and physician — co-founded the website Retraction Watch with his journalist friend, Adam Marcus. The plan was simple: they'd start tracking all the retractions that were announced at scientific journals.

At the time, retractions were not exactly well promoted and no one else was methodically shining a light on research that turned out to be bogus. Retraction notices quietly languished in databases that no one read, making fraudulent or erroneous studies difficult to spot.

Almost as soon as Retraction Watch launched, it became something of a Perez Hilton for researchers, filled with the most salacious scientific gossip. On Retraction Watch, scientists could learn about the bad behavior of their peers — who was fabricating data, cheating in peer review, or lying their way into publications.

But the website is much more than a nerd's water-cooler: it has become a rich repository of data about how and where science goes wrong, and a cultural document about research today and the humans who do the work, with all their imperfections and flaws.

On the occasion of winning a MacArthur grant to expand the service, we spoke with Oransky about what the blog has taught him about science, whether retractions are really on the rise, and some of the most offensive frauds he has seen so far in looking behind the curtain.

Julia Belluz: You and your Retraction Watch co-founder are journalists, not researchers or scientists. Why did you become the retraction watchdogs?

Ivan Oransky: I think that's part of the reason we can do Retraction Watch: the people who determine the future of my career and the future of Adam's career are not the people we're writing about. This is part of the reason you see a code of silence in science, the same way you do in the police and other places. There are repercussions for speaking out and whistle-blowing. In some ways it makes perfect sense that we're outsiders.

We had been reporting on retractions for our own publications and always found really interesting stories behind them. Adam had broken a big story about someone who eventually went to jail: an anesthesiologist working on Vioxx [the osteoarthritis drug that was withdrawn over safety concerns]. He made up patients in more than 20 studies. I had also reported on these things and understood them. So we started Retraction Watch in August 2010, and it took off.

We were very lucky at the beginning because two things were going on: a couple of big retraction stories had just happened, and we were in the midst of a retraction wave. In 2010 there were ten times as many retractions as there were in 2001. The scientific community just coalesced around the blog. They helped us, commenting and giving us tips, context, and critiques.

JB: Why obsessively follow retractions? What is the goal here?

IO: There wasn't a comprehensive database of retractions anywhere, no one place where you could find all of them. They were swept under the rug, either intentionally or by benign neglect. Journals and publishers are not good at publicizing retractions; they don't put out a press release when they retract a paper, even if it got a lot of press attention.

Some of it is because it's clunky and hard to do. Sometimes it's because it's not in their interest: lawyers are trying to keep a lid on information that doesn't make people look good. I believe in due process as much as anybody, and I think everybody deserves a vigorous defense. But I think it's a failing when journals, universities, and scientists who know they are in the right, and who try to ferret out this misconduct, basically bow to legal threats.

JB: What are some of the most egregious science frauds you came across?

IO: The top one or two retraction holders are both anesthesiologists. There's one in Japan who has had to retract 183 of his 212 papers. His was one of those cases where, ten years before anything popped up, people wrote letters to journals saying the results looked "incredibly nice." That's academic code for, "I don't believe this." But the journals didn't retract the papers or put corrections on them. And he's smart. When he realized he was not getting published in anesthesiology journals, he started publishing in pediatrics and neurology journals. So he racked up all these papers. They were all too good to be true, so eventually he got caught. This speaks to the fact that the self-correction scientists like to talk about does work, but not as well as people would like to think, or as well as it should.

JB: Are there parts of the world that have a more acute retraction problem?

IO: In absolute numbers, there are more retractions in the US, but there are also more papers published in the US. One thing we tend to see is that, in the West, retractions tend to be for fraud, for making things up. In parts of Southeast Asia, they tend to be for plagiarism and duplication. It's always a soft number, though. We're still only dealing with 500 to 600 retractions a year.

JB: In nearly five years of doing this work, have you noticed any trends in terms of what papers get retracted?

IO: We’ve seen a rash of retractions in stem cell research. There's also an overall growth in retractions. Last year, there were about 500 retractions. In 2001 there were only 40. Between 2001 and 2010, the number of retractions went from 40 to 400. To put that into context: we're not talking about most papers, we're talking about a minority of papers. It's still a rare event.

JB: Why are so many more studies getting pulled today?

IO: There are at least two good explanations. One is that we're better at finding problems: there are simply more eyeballs on papers nowadays. Whether open access or not, more people are looking at papers online, and there is plagiarism-detection software. There's also some evidence that fraud and misconduct may be on the rise, given the increasing pressure on scientists and decreasing funding lines from the National Institutes of Health.

We can comfortably say part of it is that we're better at catching fraud; whether fraud itself is on the rise is still an open question, though directionally it does seem to be the case.

JB: Is it still difficult to get a study retracted or how does this decision happen at journals?

IO: We actually do have some insight into that. People leak us stuff all the time. Often what happens is that a lot of these scientists are in small communities. They don't really want to damage each other's careers or reputations, so they try, and I think this is a good thing, to give people the benefit of the doubt. What often happens is that [the accused] will bury the journal with lots of information. It's like burying your legal adversary in paperwork, creating confusion about what happened. Then journal editors take a long time because they themselves are within these communities. Even if there is a retraction, it takes much longer than it needs to.

JB: You just got a bunch of funding from the MacArthur Foundation. What's the growth plan?

IO: We're making a database where there will be an entry for every retraction we can find anywhere. We'll immediately populate the database with everything we’ve already built or written on Retraction Watch. All our posts are categorized by things like country, subject matter, journal, publisher, author, whether or not it’s behind a paywall, the reason for the retraction. What the database will do is allow people to search by all those things.
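The faceted search Oransky describes could look something like the following minimal Python sketch. The field names here (country, journal, reason, and so on) are assumptions drawn from the categories he lists; the actual schema of the Retraction Watch database is not described in the interview.

```python
# Hypothetical records, using assumed field names based on the
# categories mentioned in the interview (not the real schema).
RETRACTIONS = [
    {"author": "A. Example", "country": "US", "journal": "J. Example",
     "subject": "anesthesiology", "reason": "data fabrication", "paywalled": True},
    {"author": "B. Sample", "country": "JP", "journal": "J. Sample",
     "subject": "stem cells", "reason": "plagiarism", "paywalled": False},
]

def search(records, **filters):
    """Return the records matching every given field=value filter."""
    return [r for r in records
            if all(r.get(field) == value for field, value in filters.items())]

# Example query: all plagiarism retractions.
plagiarism_cases = search(RETRACTIONS, reason="plagiarism")
```

Because every filter is just a field/value pair, queries can freely combine country, journal, reason, and the other facets, which is the kind of multi-axis search the planned database is meant to enable.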

We also want the database to be hooked up to other databases and workflow tools people are using, whether they are submitting grants, writing papers, or doing exploratory searches. So if you come across something interesting, you’re trying to expand on the work someone has done, you’ll be able to query that paper and not only get the fact that there was a retraction, but also the fact that there were retractions in other areas, the reason for the retraction. The whole point is to make research more transparent and efficient.