

Why you can't always believe what you read in scientific journals


Update: One of the founders of PubPeer, neuroscientist Brandon Stell, just revealed his identity in the journal Nature and at Retraction Watch. The Q&A below was penned by Stell and other PubPeer co-founders, who have chosen to remain anonymous.

When people talk about the flaws in the scientific process, they often raise the problem of peer review.

Right now, when a researcher submits an article for publication in a journal, it's sent off to his or her peers for constructive criticism or even rejection. The idea is that science is like a self-cleansing machine: The scrutiny of one's colleagues is meant to ensure quality and keep the junk at bay.

But there are problems with this traditional "pre-publication" peer review model: It relies on the goodwill of scientists, who are increasingly busy and may not spend the time required to properly critique a work; it's subject to the biases of a select few; it's slow; and it sometimes fails. This means that even in the highest-quality journals, mistakes, flaws, and even fraudulent work make it into the published record.

In the past, there was no way to publicly point out those errors, barring the rare event of a retraction. Now the (anonymous) founders of a new website, PubPeer, are trying to change that. Since establishing the site in 2012, they've grown it into a global platform for "post-publication" peer review — a new method some say could replace the current model.

On PubPeer, researchers can now comment on scientific articles, critiquing and discussing works anonymously as soon as they've been published in journals — like a comments section on a news site. The space for criticism is no longer confined to just a few reviewers; anyone can log in and leave a thoughtful remark.

PubPeer has already amassed more than 25,000 comments in its centralized database. It has helped uncover science fraud, and recently became the subject of a court case over the right to anonymous scientific discussion. Along the way, it's building a successful model that could replace the hallowed — and flawed — traditional peer review process. On the condition of anonymity, I interviewed the PubPeer founders — an early-career researcher, a computer scientist, and a graduate student — about the revolution in peer review they're trying to shape, why science commenters tend not to troll, and why they refuse to reveal their identities.

Julia Belluz: As you see it, what's wrong with the scientific peer review process right now?

PubPeer Organizers: While standard "pre-publication" peer review often does improve the quality of published work, it is also clear that it lets through a huge number of mistakes and over-interpretations, and a surprising amount of misconduct. And the system as it stands has great difficulty in correcting work once published.

JB: How does PubPeer address these issues?

PP: PubPeer centralizes all commentary about published work. Instead of two or three pre-publication referees producing a confidential report to a strict deadline, any expert in the world can add to the information about a paper after careful study, and that information is available to all.

A distinguishing feature of PubPeer is that we allow anonymous comments, including an option for strong, user-controlled anonymity. This has some potential disadvantages (which experience suggests are often overdramatized) but has clearly enabled many serious problems to be brought to light.

JB: Can you talk a bit about how you created PubPeer? What sparked the idea?

PP: The idea for PubPeer was born of the frustration of observing so many uncorrected and unacknowledged flaws in published work — a very widespread sentiment among researchers. We know from discussions with colleagues that the idea of a commenting site was not necessarily original. Several precursor sites were launched. But it seems we were the first to find a formula and a platform that prospered.

JB: How many people are using the site? And where are they coming from?

PP: The top countries connecting to the site are the United States, the United Kingdom, Germany, Japan, France, Canada, Israel, Brazil, Switzerland, and Italy. We assume they are all scientists.

JB: In journalism, a fair amount of trolling happens in comments sections. How do you avoid that? What kind of vetting of the comments goes on?

PP: The key point non-scientists struggle to understand is that effective criticism of science must have genuine substance. It's extremely difficult to invent a damaging criticism of a good paper, even if you want to. Just saying something is crap or nitpicking is deeply unconvincing. We normally moderate such comments, but our users are also quite effective at policing these issues. Our commenting and moderation policies reinforce this by requiring comments to be based on publicly verifiable information. So the whole issue of conflicts of interest and trolling is not nearly as serious as elsewhere: A comment has force if it makes a good point that a reader can check, and if it does, it matters little who made it.

JB: Have you noticed any trends in who comments or in which disciplines? What types of scientists or fields have the most active users?

PP: The bulk of our commenting is in the life sciences, but PubPeer is open for discussion of all publications. Nanotechnology, chemistry, climate science, management studies, and quantum mechanics all have active communities on PubPeer.

JB: Why do you think you have so many life sciences commenters?

PP: An interesting question is the degree to which different fields are subject to misconduct. For example, it seems only two things can go wrong with a mathematics paper: it can contain a visible mistake, or it can be plagiarized. By contrast, in much of biology, readers have to trust the authors to describe their experiments accurately, and there are many more opportunities for mistakes and dishonesty. In addition, it seems particularly easy to spot problems in image data, which abound in the life sciences. A recent study reported on PubPeer found that fully 25 percent of a random sample of cancer research publications contained problems in image data. That is a shocking statistic.

JB: What impact have you had so far? Has PubPeer resulted in any major findings of fraud or retractions or the like?

PP: There are now more than 25,000 comments in the PubPeer database, which represent a great deal of collected wisdom and expert information available to other scientists.

In terms of helping uncover fraud, PubPeer channeled many of the questions about the STAP stem cell fiasco last year [when a famous stem cell paper came under fire after anonymous commenters on PubPeer pointed out big flaws in the research]. We are aware of several cases where researchers appear to have lost jobs after incisive commentary appeared on PubPeer, and there are certainly many high-profile researchers whose work has been called into question on the site.

JB: There's been a lot of talk in recent years about how broken science is, particularly the peer review process. What are the bigger systemic changes that need to happen in order to fix it?

PP: The biggest problem is the pressure to chase after "metrics" — indirect measures of scientific success. The most important metric is publication in top journals, which determines jobs, grants, everything. This distorts the scientific process toward mostly illusory "breakthroughs" and "high-impact research" at the expense of careful work. Scientists now find themselves ruled by often incompetent kingmakers — the editors of the top journals — who effectively decide their futures and make scientific fashion.

PubPeer is helping scientists retake control of their lives, work, and careers by providing a collective judgment that is independent of and ultimately more important than acceptance by the top journals. That judgment is the expert opinion of your peers. We are also big fans of open-access publishing and the use of pre-print servers such as arXiv or the newer bioRxiv; we believe these will also loosen the stranglehold of the top journals on research.

JB: Finally, why do you refuse to reveal your identities? You're critiquing science in an open democracy, not offering political dissent in a repressive regime or something.

PP: You need look no further than the Sarkar suit for an answer. [Fazlul Sarkar is a cancer researcher who sued PubPeer, demanding the identities of anonymous commenters who criticized his work; the court sided with PubPeer.] There is a good chance that we as organizers would have been sued by now or suffered some kind of reprisal — rejected papers, unfunded grants, etc. And certainly we would have come under pressure to remove posts about colleagues or reveal the identities of commenters. That said, we do not expect or plan to remain anonymous indefinitely, but anonymity has helped protect us while we build up the platform.
