Over my years in health journalism, I’ve debunked many dubious claims. I’ve discussed how to cover quacks like Dr. Oz and the Food Babe, and how to navigate a medical world so filled with hooey it can make your head spin.
But I wasn’t always fluent in the ways of detecting bull. My eyes were opened in my early 20s, when I met a group of researchers at McMaster University in Canada. They taught me about the limitations of different kinds of evidence, why anecdotes are often wildly misleading, and what a well-designed study looks like. This experience changed how I see the world.
I’ve often wondered why these concepts aren’t taught in schools. We are bombarded with health claims — in the news, on TV, in magazines, at the doctor’s office or the pharmacy — and many of us lack the basic skills to navigate them.
That’s why I found this giant new trial, which is just wrapping up now in Uganda, so compelling. Its mission, according to Sir Iain Chalmers, the Cochrane Collaboration co-founder who’s co-leading it, is to teach children to "detect bullshit when bullshit is being presented to them."
The researchers designed teaching materials, lesson plans, and cartoon-filled workbooks for kids about the reliability of medical treatments. And they’ve tested them out on more than 15,000 kids in a randomized controlled trial in Uganda.
We don’t yet know whether their method will work; the researchers are still analyzing the results.
But whether this trial succeeds or fails, it'll bring us closer to answering an important question, maybe the most important question, about health information right now: Instead of just debunking, which often fails, can we prevent dubious claims from catching on in the first place?
This study could be the beginning of a recipe book for how to create little armies of bullshit detectors. These bullshit detectors would, in theory, be able to tell the difference between a helpful kind of doctor and a Dr. Oz, a useful medicine and a harmful one, and whether a study was designed to give reliable answers.
In a world so chock-full of hogwash health advice, anything that moves us in the direction of that goal is worth paying attention to.
The bible of health bullshit detection
One of the indispensable readings for anyone interested in evidence-based medicine is Testing Treatments (downloadable for free). The basic idea behind the book, as Chalmers puts it, is that "you don’t need to be a scientist to think critically and ask good questions." In plain language, he and the book’s co-authors explain concepts people need to understand in order to sort reliable health advice from nonsense.
In that spirit, around the time a new edition of the book was published in 2012, Andy Oxman, a co-investigator on the Uganda trial and a research director at the Norwegian Institute of Public Health, approached Chalmers and said, "Why don’t we fish out the key concepts in Testing Treatments and see whether we can teach them to primary school children in Uganda?"
Chalmers, in his delightful British accent, recalls telling Oxman, "You’re mad."
But Oxman, of course, wasn’t mad. He already had strong ties to Uganda, where he’d been leading a World Health Organization project to bring more research evidence to policymaking.
"Working with policymakers," Oxman told me, "made it clear most adults don’t have time to learn, and they have to unlearn a lot of stuff." So he’d already been dreaming about building a foundation of critical thinking skills about health in the young.
With the book, the researchers also had a template for the kinds of things they could teach. And they knew that this exercise of inculcating skepticism in children, while uncommon in high-income settings, was even rarer in a developing country like Uganda, where pseudoscientific medical advice can spread with abandon, just as it can in the US.
The researchers, along with others from Uganda, Kenya, Rwanda, Norway, and England, worked to identify the most important ideas a person would need to grasp, and in a systematic study they arrived at 32 concepts.
Building on these concepts, they drew up lesson plans and collaborated with teachers in Uganda to make materials that would resonate with local school children.
Allen Nsangi, a Ugandan researcher and co-investigator on the trial, told me that a big part of that process involved mining local medical myths. For example, she said, "Some people have been told to use locally available stuff like cow dung [on burns] — it’s almost the best known treatment."
Other medical myths have come with heavy costs, she added. "Some of the immunization campaigns have been sabotaged because of claims to do with infertility for the future." Worried parents end up skipping the shots for their children. Other rumors have spread that people should replace their antiretroviral therapies for HIV with herbal supplements.
Ultimately, the researchers put together a guide for teachers and cartoon-filled reading and exercise books for students.
"We are trying to teach children that stories are usually an unreliable basis for assessing the effect of treatments," Nsangi explained, adding that stories amount to anecdotal evidence. The kids are also learning to watch out for the perverting effects of conflicts of interest, and to recognize that all treatments carry both harms and benefits and that large, dramatic effects from a treatment are really, really rare.
Why the Uganda trial matters
The researchers didn’t stop there. They wanted to know whether their work would actually improve children's ability to assess health advice, so they designed a high-quality test: a randomized controlled trial.
The trial ran during the second school term — from June to September — on more than 15,000 fifth-graders, mostly ages 10 to 12. Half of the kids got the lessons, and half didn’t. (Separately, the researchers created a podcast that imparts the critical thinking concepts to parents, and tested it in another randomized controlled trial.)
At the end of the trial, students in both groups were tested to see whether their understanding of the reliability of health claims had improved. Chalmers, Oxman, and the other researchers are now evaluating those results. They plan to test the children’s retention again after a year to see whether the concepts stuck.
A wish for the future
There have been other attempts to understand whether school-based interventions can get kids to think critically, but there’s very little research focusing specifically on health or on teaching these skills early in life. "We started out with primary school children because that’s the right age to start; it’s when you need to build a foundation," Oxman says.
The Uganda study, which was mostly supported by the Research Council of Norway, should also be big enough to detect meaningful differences in the critical thinking abilities between the groups of children.
After they evaluate the program’s success, the researchers will keep adapting and experimenting to find methods that work, and then use what they learn to create open source teaching materials for children in other settings, as well as for professionals and policymakers.
So whatever the results of the Uganda study, the trial will get us closer to understanding how to prevent bullshit from taking off and how to arm people with the skills needed to protect themselves in the future. That’s something schools everywhere should pay attention to.
"My hope," Oxman said, "is that these resources get used in curricula in schools around the world, and that we end up with the children ... who become science-literate citizens and who can participate in sensible discussion about policy and our health." He added: "I’m looking to the future. I think it’s too late for my generation."
With Oxman's help, maybe we'll see fewer patients harmed by unhelpful treatments, fewer quacks profiteering off bogus medical advice, and a world less brimming with medical bullshit.