Janie Oyakawa, 43, is a mother of six who lives just outside Dallas. In her home, she keeps a binder full of important documents from her family’s life, including a heavily annotated copy of Robert Sears’s The Vaccine Book: Making the Right Decision for Your Child.
“It’s like, here’s your birth certificate, and here’s when your mom was crazy for a few years,” she says. For a few years, Oyakawa, an occupational therapist, was an anti-vaxxer.
Oyakawa vaccinated her first four children according to the routine schedule. But the birth of her fourth child was traumatic; she felt the doctors and nurses didn’t listen to her. “That kick-started me being very skeptical of the medical establishment,” she says.
With her next child, Oyakawa opted for a home birth, with the help of a midwife. In the process, she began to meet other “crunchy” parents, she says, including those who were opposed to vaccines. For a time, the Oyakawas were uninsured. Slowly, and then all at once, Oyakawa stopped going to the pediatrician and stopped vaccinating her children, including her new baby.
“I had my sixth, and I feel bad looking back: She didn’t see a doctor until she was 1 year old,” she says. But parenting groups, in real life and on Facebook, offered validation. Oyakawa found thousands of people eager to reinforce each other’s anti-vaccine views. And they brought their own experts with them. (Sears, for example, is a controversial leader in the movement.)
“I was intelligent, but I had a really hard time evaluating sources,” Oyakawa says. She says she realizes now that people promoted anecdotes over data and emotion over evidence. But at the time, it felt like they were questioning conventional wisdom — and pursuing truth. For Oyakawa, it felt responsible.
Skepticism is, generally speaking, the doubting of a premise, a questioning stance toward a given topic. It's the engine of scientific revolution — the searching spirit that pushed Nicolaus Copernicus to advance a heliocentric model of the universe and Charles Darwin to propose the theory of natural selection — and it is widely considered a "healthy" perspective on the world. But in the 21st century, a certain kind of skepticism has become a thorn in the side of science itself.
It can be well-intentioned, as people seek to understand complicated topics in real time. But it can also be a front for deniers and conspiracy theorists, who hide the certainty they feel about “alternative facts” behind a well-placed question mark. In our search for certainty, social echo chambers — some intentionally seeded with misinformation by right-wing political actors to sow mistrust — are increasingly capable of transforming doubt into hesitancy, and even denial.
As of late January, 20 percent of Americans told pollsters that they would get the Covid-19 vaccine only if it were required, or not at all, according to a Kaiser Family Foundation survey; among some populations, particularly those who identified as Republicans, the share of those willing to be vaccinated hovered at just 35 percent. This hesitancy has complicated roots, but it's not just vaccines people are turning down. Even as coronavirus cases soared this winter, some Americans have remained skeptical of masks and social distancing.
Add it to the list: Skepticism persists about everything from the threat climate change poses to the nation to the spherical shape of the Earth. On November 9, immediately following President Joe Biden's election victory, a Politico/Morning Consult survey of nearly 2,000 registered voters reported that 70 percent of Republicans didn't believe the election was "free and fair." And the ranks of election skeptics included at least 147 Republican members of Congress, who subsequently, if unsuccessfully, voted to overturn the election results in January.
“Question everything, right?” a woman at a “Stop the Steal” protest in Pennsylvania told CNN in November. “Unfortunately, people fail to think for themselves,” she added — other people.
Ours is a nation of doubters. Since its peak in 1964, public trust in government has been on the decline, exacerbated by social crises like the Vietnam War, Watergate, and the 2007-2008 financial collapse. The National Election Study found in 2019 that just 17 percent of Americans said they always or mostly trusted the government. In 2020, the number of people with “no trust at all” in American mass media — 33 percent — was at an all-time high, according to Gallup. Instead, many have turned to Facebook groups, obscure blogs and message boards, and wildly popular podcasts like the conspiracy-theory-friendly Joe Rogan Experience, whose unofficial tagline could easily be, “I’m just asking questions.”
A skeptical attitude has been a tenet of rational thought since at least ancient Greece. In some sense, the scientific method, a process by which people can develop hypotheses and carry out experiments to see if their predictions are valid, is just skepticism, rigorously applied.
“The ancient skeptics would talk about skepticism as a ‘medicine for the mind,’” says Baron Reed, a philosophy professor at Northwestern University and the co-editor of Skepticism: From Antiquity to the Present. It could offer clarity and, some argue, even happiness.
That notion of a “healthy skepticism” persists. But Americans increasingly display only a “temperamental skepticism,” says Kurt Andersen, author of Fantasyland: How America Went Haywire — A 500-Year History. It’s “skepticism as an instinct or reflex,” he says, instead of empirically based doubt. In this paradigm, asking questions is enough. The hard work of evaluating evidence — and acting when it proves sufficient — is no longer required.
“It sounds so much more fair-minded and scientific [to be a skeptic] than to be a denier,” says Lee McIntyre, a research fellow at Boston University’s Center for Philosophy and History of Science and the author of Post-Truth. “But,” he adds, “the problem is this: They’re actually not skeptics, they’re actually quite gullible.”
For most of her life, Oyakawa was a member of the Mormon Church. But when she was pregnant with her sixth child, she left, losing many friends in the process. The experience pushed Oyakawa to reevaluate all of her deeply held beliefs, including her conviction that vaccines might be harmful for children. But when she began posting friendly questions about the potential benefits of certain vaccines on Facebook, she found “these groups couldn’t handle it.” Oyakawa says she realized that these parents weren’t “questioning everything,” as they liked to claim. They were promoting their own beliefs.
People have been debating the nature of truth for millennia. But it was René Descartes, a 17th-century philosopher, who argued that all received wisdom should be held suspect and formalized a process for evaluating the truth of any claim. Historians consider his system, known as Cartesian doubt, the forerunner of the modern scientific method. It was also the origin of the ideal of every person as an intellectual island, capable of thinking clearly and freely for themselves, without interference or support.
In reality, people aren’t great at dealing with complex systems or uncertainty. We tend to think quickly, often at the expense of accuracy. In the 1970s, psychologists Daniel Kahneman and Amos Tversky developed the concept of “cognitive bias” to explain these systematic errors in thinking. They include things like recency bias (leaning on the things you’ve learned most recently) and confirmation bias (highlighting evidence that proves your point).
These shortcuts lead everyone astray at times. But “cognitive biases are not the main source of error,” Kahneman said. “The main sources of error are social.” Meaning: We may think we’re independent thinkers, but we’re much more likely to be relying on those we trust — from family members to domain experts — to inform and guide us.
There's nothing inherently wrong with that. Some cognitive scientists theorize that human reason developed not so we would be natural statisticians, but so we could cooperate with each other. Human thought blends together like zebra stripes: we can't definitively say where one herd member's ideas start and another's end.
The implications are stark — increasingly so, with social media. Instead of forming relationships geographically, where some diversity of opinion is likely, we spend more and more of our time in digital spaces organized around shared beliefs. In this environment, “we’re able to put together a body of evidence without even realizing,” Reed says. No one is immune.
For decades, politicians and corporations have preyed on these social and psychological vulnerabilities. Since at least the 1950s, for example, researchers have recognized the link between smoking and lung cancer. But because scientists cannot definitively say that an individual case of lung cancer is directly caused by smoking, the tobacco lobby was able to sow doubt among the public, as historians of science Naomi Oreskes and Erik M. Conway showed in their bestselling book Merchants of Doubt.
By emphasizing the unresolved questions and imperfections of science, corporations were able to “provoke in people [a kind of] doubt that tends to crowd out knowledge,” says Reed. “We’ve inflamed in them a desire for knowledge we don’t have now, and because that tends to capture their attention, they stop asking the questions they can answer.”
Politicians and corporations have since performed the same sleight of hand on countless other topics. It’s easier than ever. On social media platforms like Facebook, it’s the people we trust most — our friends and family — who become the superspreaders of this manufactured misinformation.
“Science denial was so bloody successful for decades; that’s what made people in Washington say, wait a minute, if they can doubt climate [change], if they can doubt cigarettes, we can doubt anything,” says McIntyre, the author of Post-Truth.
There is reason to think that many so-called “skeptics” aren’t experiencing doubt at all. Instead, says Robbie Sutton, a social psychologist at the University of Kent who studies belief in conspiracy theories, studies have shown that people who question scientific conclusions are often motivated by a range of religious, economic, political, and personal convictions.
Evolution skepticism, for example, is more common among people who have strong beliefs about the relationship between God and humans. Climate change skepticism, by contrast, may be camouflaging resistance to climate action: In one 2014 study, Republicans expressed less skepticism about climate change when they were presented with free market solutions like technological innovation compared to traditionally liberal solutions like emissions restrictions. This wasn’t climate skepticism, the researchers concluded, but “solution aversion.”
“This is relatively smoking gun evidence that we choose to not believe or to adopt a skeptical stance toward some things because we don’t like what they mean,” Sutton says. And not just what they mean for us as individuals, but for everyone we think is like us.
When Oyakawa began to question different claims in her parenting groups, she felt attacked. But “every time I would start one thread that would get going, my inbox would explode” with people thanking her, privately, for speaking up.
In 2013, Oyakawa, now a vaccine advocate, founded a Facebook group called Crunchy Skeptics for evidence-based parenting. It’s brought almost 3,000 members together to evaluate claims and think critically about what’s best for their families. While it’s provided a science-focused alternative to other groups, Oyakawa says conversations around what “skepticism” really means haven’t gotten any easier.
There's no easy fix for these problems. Removing misinformation from social platforms is crucial, but many scholars argue that to end the tyranny of reflexive "skepticism," we need to replace it with a more robust scientific skepticism.
Psychology research seems to “suggest we’re tribal animals, and we’re not interested in shared truth, that we just want to support our tribe, our side,” Sutton says. But that doesn’t mean we have to give up on our commitment to reason.
Scientists are, in some sense, just another social group, but one that is defined by its commitment to a rigorous search for the truth. Individual “scientists believe false things all the time, but through the culture of science, what you’ve done is create a community that, through the process of openness and sharing experimental data, has eliminated a little bit of confirmation bias,” says Jonathan Haber, an educational researcher and the author of Critical Thinking.
The process can be messy; during the pandemic, everyone has been exposed to conflicting evidence and changing guidance, from the US Centers for Disease Control and Prevention's initial advice that Americans should not wear masks to the debate over whether the coronavirus is airborne (it is).
That’s why scientists remain skeptical of their own conclusions. Over time, as data accumulates and researchers correct their assumptions, they work toward consensus. While no question can be answered with 100 percent certainty, scientists can tell people how sure they should be.
“In science, skepticism doesn’t just mean that you doubt,” McIntyre says. “It means that when there’s sufficient evidence, you believe.”
Oyakawa knows how hard it is to achieve a truly healthy skepticism. On Facebook, she holds people’s hands and walks them through their logical fallacies, skewed personal anecdotes, and biased sources. She can do it because she’s made the same mistakes — and is acutely aware it could happen again. “I know that my experience and how I take in information is going to be affected by the biases I already have,” she says. But now, Oyakawa says, when she questions something, she does it with a method.
Eleanor Cummins is a science writer and frequent contributor to the Highlight. Most recently, she’s written about the Twitter presidency and social distancing scofflaws for Vox.