
Twitter’s CEO doesn’t get how conspiracy theories work

Jack Dorsey’s argument for not banning Alex Jones reveals the limits of his understanding — and a major problem with big tech’s outsize role in our society.

Alex Jones, radio host and conspiracy theorist, addressing the media at the annual Bilderberg conference on June 6, 2013, in Watford, England.
Nick Ansell/PA Images via Getty Images
Zack Beauchamp
Zack Beauchamp is a senior correspondent at Vox, where he covers ideology and challenges to democracy, both at home and abroad. His book on democracy, The Reactionary Spirit, was published on July 16. You can purchase it here.

In the past week, tech giants including Facebook, Apple, YouTube, and Spotify banned notorious conspiracy theorist Alex Jones from their platforms. Jones, perhaps most famous for promoting the idea that the Sandy Hook Elementary School shooting was a hoax, was banned from these platforms for allegedly violating their terms of service in all sorts of ways.

But there was one Silicon Valley corporation that opted to allow Jones to stay: Twitter. You can still go to President Donald Trump’s favorite social media outlet and scan the @RealAlexJones feed, where you will learn that the bans are a plot by “deep state actors” to prevent the American public from learning the real truth about our government.

Tuesday night, Twitter CEO Jack Dorsey wrote a lengthy statement — published as a series of tweets, naturally — defending his company’s decision. My colleague Aja Romano has a sweeping takedown of Dorsey’s full logic; I encourage you to read it.

But I want to focus on one of Dorsey’s specific tweets, one that, to my mind, reveals a deep issue at work here:

The tweet displays a profound misunderstanding of the way conspiracy theories and “fake news” work. The problem isn’t that there aren’t enough journalists correcting misinformation and myths; there’s tons of evidence out there that what Jones says is patently false.

Rather, it’s that conspiracy theories, once they spread, create hermetically sealed communities that are impervious to correction. The only way to stop this process is to stop them from spreading on platforms like social media, which is exactly what Twitter decided not to do.

It’s not surprising that Jack Dorsey doesn’t understand this: He doesn’t really have time to read the latest social science on conspiracy theories. And that’s the real problem: Tech giants are increasingly being asked to handle social problems, ones their leaders don’t seem equipped to address.

What Jack Dorsey gets wrong about conspiracy theories

Jack Dorsey at The New York Times 2017 DealBook Conference.
Michael Cohen/Getty Images/The New York Times

In 2008, Harvard Law professors Cass Sunstein and Adrian Vermeule penned an article on conspiracy theories and how they work. They argued that conspiracy theories — which they define as “an effort to explain some event or practice by reference to the machinations of powerful people, who have also managed to conceal their role” — are, in their own way, quite rational.

“Most people are not able to know, on the basis of personal or direct knowledge, why an airplane crashed, or why a leader was assassinated, or why a terrorist attack succeeded,” they wrote. As a result, they search for information that fits what they already believe about the world and is confirmed by people they trust.

Conspiracy theories, Sunstein and Vermeule argued, spread in a variety of ways. One of these pathways, called an “availability cascade,” happens when a group of people accept a conspiracy theory because their preexisting beliefs about the world make them likely to believe it.

This is what happens with Alex Jones and people on the American right. Theories like “Sandy Hook was faked so Obama could take your guns” and “the ‘deep state’ is conspiring against Trump to destroy democracy” appeal to their basic, gut-level political orientation, which is that Democrats are nefarious and Trump is a hero.

Not all conservatives accepted these ideas when presented with them, of course, but it was appealing enough that Jones managed to build up a significant social media presence and a shockingly large amount of influence. In December 2015, then-candidate Trump went on Jones’s show, telling the host that his “reputation is amazing” and vowing that “I will not let you down.”

Jones has created a thorny problem for society. Once people start believing in his conspiracy theories, and trusting him as a source, it becomes extremely difficult to change their minds.

“Conspiracy theorists are not likely to be persuaded by an attempt to dispel their theories; they may even characterize that very attempt as further proof of the conspiracy,” Sunstein and Vermeule wrote. Because conspiracy theorists “become increasingly distrustful and suspicious of the motives of others or of the larger society,” efforts to debunk their myths often “serve to fortify rather than undermine the original belief.”

This isn’t just Sunstein and Vermeule’s theory: A significant body of empirical research on conspiracy theories finds that it’s extremely hard to change believers’ minds. One 2017 study, by two UK-based psychologists, presented people with anti-vaccine conspiracy theories and evidence debunking them — but randomly switched whether participants saw the anti-vax arguments or the actual facts first. The researchers then asked participants how that affected their opinions on vaccinating a child. The results were sobering.

“Anti-conspiracy arguments increased intentions to vaccinate a fictional child but only when presented prior to conspiracy theories,” the authors explained. “These findings suggest that people can be inoculated against the potentially harmful effects of anti-vaccine conspiracy theories, but that once they are established, the conspiracy theories may be difficult to correct.”

This is the problem with Dorsey’s logic. Now that Jones has an audience on Twitter, journalists’ attempt to “refute” him will fail. His fans will mostly disregard the debunkings, and his audience will continue to grow. This is what was happening on every other platform, prior to the bans. The other companies recognized that Jones was spreading dangerous lies, and that journalists simply couldn’t debunk them. The only way to stop these ideas was to deprive them of oxygen, to prevent people from being exposed to them in the first place.

Twitter’s CEO just doesn’t get that.

The problem with tech making social decisions

As frustrating as Dorsey’s statement is, there’s a part of me that doesn’t blame him. It really is not his fault that he hasn’t read the academic literature on conspiracy theories. His job is running a massive technology company.

While Twitter was alone on the Alex Jones issue, Dorsey is hardly the only tech CEO to make glaringly ignorant comments about social issues that affect their platform. Just last month, for example, Facebook CEO Mark Zuckerberg offered this nugget of anti-wisdom in an interview with Recode’s Kara Swisher:

I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but … it’s hard to impugn intent and to understand the intent.

Zuckerberg’s argument is that Holocaust deniers are merely deluded people (he later clarified that he “didn’t intend to defend the intent of people who deny the Holocaust”). But the purpose of Holocaust denial is not to have a good-faith argument about history — it’s to advance an anti-Semitic political agenda. Letting deniers spread poison on Facebook doesn’t serve the purpose of illuminating debate. Rather, all it does is allow yet another vile conspiracy theory to spread.

How to approach Holocaust denial has been, historically, a hard problem for liberal societies. The United States, with its expansive free speech tradition, permits Holocaust deniers to publish freely on the grounds that it would be dangerous to let the government regulate speech in this fashion. Germany and France have both decided to criminalize denial, on the grounds that it’s a form of incitement to racial hatred rather than legitimate political speech. Both approaches have benefits and flaws; brilliant scholars have written tomes making the case for one or the other.

But today, the spread of Holocaust denial, Sandy Hook trutherism, and other vile conspiracy theories isn’t just a problem for governments. It’s a problem for technology corporations, which regulate the primary means through which information is disseminated today.

Those companies — none of which have the legitimacy or public accountability that government officials do — have no choice but to engage with all sorts of extremely hard social problems surrounding free speech and bigotry. People like Jack Dorsey and Mark Zuckerberg are not the people who ought to be making these decisions for a democratic polity, but they have no choice but to make them. Sometimes they’ll get those decisions right, as most of these companies eventually did with Alex Jones. But often, they’re going to get them wrong — and the public will have no real way to hold them accountable.

This is your politics on big tech.
