Facebook, YouTube, Twitter, and other social media companies are scrambling to take down and fact-check rampant misinformation about topics like Covid-19 and the 2020 election that spread on their platforms.
But complicating these companies’ efforts to moderate content is the fact that a majority of Americans — on both sides of the political aisle — believe that social media companies are censoring political viewpoints, according to a new poll by the Pew Research Center.
About three in four Americans feel it is very likely or somewhat likely that social media sites “intentionally censor political viewpoints that they find objectionable,” according to the survey. It polled around 4,700 Americans across the political spectrum. While people from both parties thought that social media companies were likely censoring content for political reasons, Republicans were much more likely than Democrats — 90 percent of Republicans compared to 59 percent of Democrats — to hold this belief.
For several years, President Trump and leading Republican lawmakers have complained, without evidence, that social media platforms are systematically censoring conservative content. These politicians and their supporters have tried to prop up their allegations by citing anecdotal examples of conservative individuals’ accounts being removed and how many people who work in Silicon Valley tend to lean liberal.
While media and technology experts have written off these complaints as lacking any serious empirical evidence, ultimately, the new survey suggests that, to some extent, evidence doesn’t matter. The belief that social media companies are using their power to further a political agenda is now mainstream. And the public’s negative perception about these companies’ intentions threatens to delegitimize their increasing efforts to fact-check and moderate content, particularly heading into the 2020 presidential election.
How allegations of bias went from fringe to mainstream
Allegations that social media platforms are politically biased in the US go back to a much-discussed 2016 Gizmodo story about the Facebook content moderators who managed its now-defunct Trending news section. The article cited anonymous moderators who said they deprioritized news from right-wing outlets based on individual editorial judgment. Subsequently, Facebook eliminated the human curators who ran the section, leaving it to algorithms. And in 2018, Facebook removed the Trending news section from the platform entirely.
In the following years, accusations of alleged anti-conservative bias focused less on trending news stories and more on individual figures like far-right conspiracy theorist Alex Jones and his site InfoWars. For years, Jones used social media to espouse violent, sometimes racist, and harmful conspiracy theories (including the false claim that the Sandy Hook shooting was a hoax), amassing millions of followers along the way.
When Jones was finally booted off YouTube, Apple, and Facebook in 2018 for consistently violating their rules around hate speech, he and his fans launched a crusade against social media platforms. They railed against these companies and their supposed political biases, and warned of a coming purge of other conservatives. And they weren't alone in their claims. High-profile Republican politicians like Sen. Ted Cruz backed Jones on this, even though Cruz said he opposed some of Jones's more extreme views. And in the ensuing years, Trump and leading Republican Party members have consistently repeated the same talking points, fueled by claims not just from Jones but from several other conservative social media figures like Diamond and Silk.
“We’ve entered a world where politicians are parroting in a lot of ways the headlines of media manipulators and misinformers,” Joan Donovan, director of research at Harvard University’s Shorenstein Center on Media, Politics, and Public Policy, told Recode.
Ironically, many of these extremists like Jones grew their massive follower bases on social media, which is where they also successfully perpetuated the theory that these companies are biased against them.
“This is a house of mirrors,” said Siva Vaidhyanathan, a professor of media studies at the University of Virginia. “A vast majority of Americans are getting their sense of what Facebook and Google are doing from completely unfounded claims that are circulating on Facebook and Google.”
Since major social media platforms like Facebook and YouTube share little data or insight about what kind of content people see on their platforms, or why they do or don't see certain content, it's hard to definitively prove or disprove claims of political censorship on social media.
Responding to Republican lawmakers’ concerns, Facebook commissioned an external audit in 2019 to determine the existence of alleged anti-conservative bias. But the results were “little more than a formalized catalog” of Republicans’ grievances and didn’t “include any real quantitative assessment of bias” on the platform, as social media researcher Renee DiResta wrote in Slate at the time.
The data we do have shows that conservative content actually performs quite well on Facebook. The New York Times's Kevin Roose has routinely gathered data from the Facebook-owned analytics company CrowdTangle showing that the Facebook posts with the highest engagement (the ones people liked, clicked on, and shared the most) come mostly from right-wing and conservative sources like Ben Shapiro and Fox News. Facebook has disputed this reading of its data, saying that other internal metrics provide a fuller picture of which posts are most viewed on the platform, but it hasn't shared more than snippets of that data publicly.
In 2019, the left-leaning media watchdog organization Media Matters studied more than 400 popular political pages on Facebook and found that conservative pages performed about as well as liberal ones.
And while it’s true that a majority of rank-and-file tech employees tend to vote liberal, recent reporting alleges that Facebook leadership relaxed rules on fact-checking in an attempt to avoid accusations that the platform has an anti-conservative political bias.
The wizard behind the curtain
While many people are concerned about what kind of content Facebook, YouTube, and Twitter take down or fact-check, particularly when it comes to posts from high-profile politicians like Trump, the truth is that only a very small percentage of content on Facebook and Twitter is ever fact-checked or taken down.
The vast majority of what we see on these platforms is controlled by a set of algorithms, tailored to show us content that will keep us engaged and interested in sharing that content with other people.
There’s a lot we don’t know about exactly how social media algorithms work, since they are proprietary information. The public can’t see into the algorithmic black box that determines which articles, photos, and videos we most easily find online. So while it’s easy for people to get upset about a right-wing post that gets fact-checked or taken down, it’s harder to understand the kind of invisible algorithmic amplification that may be helping surface those kinds of right-wing posts in the first place.
“The truth of how Facebook works is beyond most people’s comprehension. People imagine that there are wizards behind the curtain working the levers,” Vaidhyanathan told Recode. “And that’s not what Facebook really is — which is this self-propelling, controlling machine that is amplifying all sorts of video, texts, and images, and sorting it according to commercial need.”
One potential solution, according to Donovan: “Platform companies should be more transparent about what kind of news is circulating on their platforms, and what kind of top stories are getting the most clicks, likes, and shares. Then we could start to put some of these allegations of bias to rest.”
While the executives at Facebook and YouTube may not be wizards behind the curtain manipulating people's political views, the companies they run control how hundreds of millions of Americans get their news every day.
And right now, the stakes in the US couldn't be higher. Physicians say that medical misinformation spread on social media is literally killing people who trust what they read online more than the advice of medical experts. In the middle of this, the US is holding a presidential election during a pandemic, when many people are scared or unable to vote in person. Already, mass confusion about mail-in voting, much of it promoted by Trump, has spread on social media. That concerns some civil rights leaders, who are urging Facebook and Twitter to set the record straight by fact-checking false claims, regardless of who posts them.
While the Pew study showed that more than 70 percent of Democrats approved of social media companies labeling politicians’ posts as misleading, only 35 percent of Republicans shared that sentiment.
Across party lines, if Americans can’t trust social media companies to moderate content on these topics, the big question is: Who will they trust instead, and what information will they end up believing?