Should we be drinking coffee? Depending on which day of the week we read the news, we might be told it will either extend our lives or give us cancer.
You’ve surely come across similarly contradictory stories suggesting cell phones may or may not be causing brain tumors, that e-cigarettes are either a boon for smokers or the next big public health threat, and that GMOs might feed humanity or lead to our demise.
Headline whiplash is the status quo when it comes to health news. Look no further than the roundup in a recent episode of John Oliver’s show.
This jumble of information is not just confusing for the public — it’s embarrassing for journalism. It also represents a massive opportunity to improve health reporting for two reasons: We’ve never had more knowledge at our fingertips, and there’s already a longstanding paradigm we journalists can and should draw on. It’s called "evidence-based medicine."
The evolution of evidence-based medicine
This week, I’m in Oxford, England, to speak at Evidence Live, an annual evidence-based medicine conference. If you’re asking, "Isn’t all medicine evidence-based?" the answer is, actually, no.
It wasn’t until the early 1990s that the idea started to gain a foothold. Back then, a group of doctors began to organize themselves to solve a problem: how to use the latest research to make decisions at the bedside of their patients.
Doctors were too often using single or cherry-picked studies, or what they learned in medical school or from their mentors, to inform their decisions about their patients’ best care, rather than looking at the totality of the best-available evidence.
Medical research really started to take off in the 1970s, but many of the studies were contradictory and, too often, too difficult to make sense of.
So these science-minded doctors committed themselves to building up a repository of high-quality "systematic reviews," which bring together and sort through all the best science on specific medical questions. The reviews use statistical methods to weigh the relative contributions of single studies, and their findings, instead of relying on cherry-picked research.
This effort was revolutionary.
Systematic reviews brought empirical heft to medicine. They didn’t turn doctors into cold, evidence-based automatons, but rather helped them more easily access and make sense of a wider selection of data. The reviews often corrected misconceptions about important health issues that had crept into medical practice, like the advice that it was ideal to put newborn babies to sleep on their stomachs — a practice that actually increased babies’ risk of death.
Systematic reviews, and the methods behind them, also began to inform official medical guidance from influential groups like the World Health Organization and the UK’s National Institute for Health and Care Excellence, and tools doctors use today, like UpToDate.
Evidence-based medicine is still a work in progress. Not all systematic reviews are created equal, and researchers are continually finding new ways to make them even more accessible and up to date (including using machine learning and crowdsourcing).
But the primary goal of evidence-based medicine represents something worth pausing over: a commitment to thinking critically about the quality of studies, and making the best research accessible in places where it can save lives. Most simply, though, it’s a commitment to scientific thinking over magical thinking for health.
And its ideals are now disrupting other realms, from policymaking to education and criminal justice programs.
Yet it still hasn’t revolutionized another area that influences human health as much as policy or medicine: health journalism.
There’s a serious dearth of evidence in health journalism
Over here in the media, we are still largely where medicine was in the early 1990s when it comes to using research evidence.
Instead of trying to translate what the best-available research evidence tells us about how to live, we report on the latest studies out of context, with little focus on how they were designed, whether they were unduly conflicted by study funders, and whether they agree or disagree with the rest of the research.
We often ignore systematic reviews (maybe because some of us don’t even know they exist). And we mislead the public by pretending that "the latest research" holds definitive answers, instead of acknowledging the incremental nature of science.
To be clear: Health reporting isn’t uniformly dire. Many journalists respect evidence and deploy it skillfully. And there isn’t a conspiracy among journalists to mislead the public, just as the pre-evidence-based-medicine era wasn’t a conspiracy among doctors to harm patients.
But we need to do better.
There are real hurdles to doing high-quality health journalism
Right now, many reporters lack the tools to make use of evidence and face other challenges. Here are just a few:
1) We don’t have access to research that’s languishing behind paywalls.
2) Hype in science is on the rise, which makes journalists’ jobs extra challenging. We’re often misled by press releases trumpeting the latest findings and don’t know how to check them against the best evidence.
3) Some of us don’t know what systematic reviews are.
4) Research is messy and fraught, and wading through evidence is difficult and takes time — time we don’t always have under daily deadlines, and in media environments that place a lot of value on immediacy.
5) We are also under pressure not only to get stories right and on time, but to produce hits, which are ever more measurable in online newsrooms.
6) We need new things to report on every day. Health science, by contrast, evolves very slowly, and breakthroughs are rare. Talking about how the evidence has accumulated isn’t "news."
So is the way the news environment is set up antithetical to good health journalism?
Not so fast.
The new media landscape represents a great opportunity in this regard. Journalists now have more space online (compared with the limited real estate of print) to explain the science behind the news, and we can update our stories as science evolves. Instead of just publishing what’s new for the next morning’s edition, we can publish evergreen stories that explain the body of research. Online, we can also link back to primary studies and systematic reviews, so anyone who is interested in vetting our work or checking our interpretations can see the research for themselves.
Show Me the Evidence, and other ideas for more evidence-based journalism
Vox, for one, is built around a model that favors context and analysis, and updating older versions of stories as we learn more — a model that fits very well with the incremental nature of science.
We try to take a step back from the news of the day and offer readers something more in depth about the issues that matter to them. And we have been experimenting with new models for evidence-based journalism, including a series called Show Me the Evidence. In these stories, we take the broadest possible view of the research behind big medical questions (highlighting high-quality systematic reviews and discussing the limitations of research) to come to more fully supported conclusions about what science can tell us about how to live.
These reviews take a lot of time to produce, but we’ve also been lucky in that people seem to like reading them, and they’ve translated well into other formats that reach even larger audiences on platforms such as Snapchat and YouTube.
In order to make this kind of journalism more achievable for health reporters everywhere, we need help. And doctors and others in the evidence-based medicine world are in a special position to help bring their revolution closer to the media. I suspect this can happen through a few key channels:
1) Better engagement between the media and research communities: Researchers and the journalists who cover them should open the lines of communication wherever possible to foster a better understanding of the cultures of these two worlds. Researchers should make a particular effort to reach out to reporters at news outlets on the other side of the political divide, who may not share their views.
2) More accountability: Why is it still acceptable for health reporters to say "coffee is good for you" one day and "coffee causes cancer" the next? Researchers should hold media outlets that fail to put new findings in context, or that ignore the available evidence, to account by notifying reporters or editors when they spot shortcomings in their pieces. (Health News Review in the US does a great job on this front.)
3) Better dissemination of research evidence: If we want journalists to use research evidence in their work, they need to be able to access it. This doesn’t simply mean giving journalists passwords for journals or online university libraries. This means we also need to create tools for journalists that help them quickly make sense of science, the way systematic reviews (like those in the Cochrane Library or AHRQ) or summaries of evidence (like those at UpToDate) have for doctors. (NHS Choices in the UK is an example of a great translation tool for patients, which journalists could use.)
4) Better training for journalists in research methods: You don’t need to be a scientist to think critically and ask good questions about research, in the same way you don’t need to be a politician to do smart political reporting, or a teacher to cover education thoughtfully. But science is complicated, and journalists would benefit from an education in the basics of study design, research methods, and, very simply, the scientific method. (The Knight Science Journalism program at MIT and the Association of Health Care Journalists run excellent training programs and fellowships for reporters.)
5) Evidence-based press releases: There will always be times when reporters are rushing and relying on press releases to make sense of science, and we know from research that the language in a press release trickles right into media stories. Why not feature a few sentences about the context around the new study, or even links to the best systematic reviews, right in the press release? (A group of researchers at Dartmouth has been advocating for similar changes.)
In the 1990s, doctors could have thrown up their hands at the information deluge, and just been carried along in the constant rush of research. But instead, the evidence-based medicine community valiantly faced up to the information overload, organized around it, and created systems and tools that helped doctors.
In journalism, it’s time to do the same. We need help making sense of research, and in turn, helping readers (who include patients, policymakers, and doctors) to do the same.
This post has been adapted from a talk Julia will deliver on Wednesday at Evidence Live, an evidence-based medicine conference at Oxford University. If you have ideas about evidence-based journalism or how to improve the state of health journalism, email Julia at email@example.com.