

YouTube is awash with election misinformation — and it isn’t taking it down

Videos questioning the election results are racking up hundreds of thousands of views.

[Photo: Protesters hold “stop the steal” signs outside a vote-counting center in Phoenix, Arizona. On YouTube, it’s easy to find videos that cast doubt on the integrity of the US election. Gina Ferazzi/Los Angeles Times via Getty Images]
Rebecca Heilweil covered emerging technology, artificial intelligence, and the supply chain.

In the hours and days after polls closed in the United States, misleading and outright false content about the presidential election was easy to find on YouTube. A slew of videos claimed to be proof of voter fraud, while the right-wing channel One America News Network (OANN) racked up hundreds of thousands of views on videos incorrectly declaring that Donald Trump had already won the election. YouTube was even running ads on some videos amplifying claims of voter fraud.

While YouTube has added a lightly worded label on some videos noting that election results were not yet final, most of these misinformation videos were still readily accessible by Friday afternoon. Meanwhile, a slew of content stirring doubt in the electoral process was drawing a growing number of views. Those include videos posted in the official Donald Trump channel, which has more than 2 million subscribers, of Trump’s Thursday speech amplifying false conspiracy theories about the election’s legitimacy.

Moderating YouTube requires analyzing large amounts of video content, which can be harder to study and evaluate than text. Still, some have criticized the platform, saying it takes a soft approach to misinformation and other claims that aim to sow doubt in the election process. While it’s difficult to measure how “well” YouTube is doing at finding and eliminating election misinformation compared to its peers, it’s clear that some misinformation and content questioning the election’s integrity are allowed on the platform. Keep in mind that YouTube has previously been criticized for radicalizing its users through its recommendation algorithm and serving as a platform for conspiracy theorists.

Between November 3 and November 5, videos mentioning keywords related to “election fraud” drew nearly 100 million views on politics-focused channels with at least 10,000 subscribers, according to initial research from Transparency Tube, an independent tool that tracks politics-focused YouTube channels. More than 2.5 million of those views were on channels known for promoting conspiracy content.

“Many of these videos are reporting on claims being made by the president or his supporters, but a significant number, especially from partisan right and conspiracy channels, are endorsing claims of ‘election fraud,’” researcher Sam Clark told Recode in an email.

[Screenshot: A video from Donald Trump’s channel, with nearly half a million views, carries the false headline “Joe Biden says he’s built the most extensive ‘voter fraud’ org in history,” implying that Joe Biden used voter fraud to win the election.]

“YouTube has always been a welcome home for conspiracy theorists and extremists and has largely paid lip service to making meaningful changes to address misinformation rampant on their platform,” Angelo Carusone, the president of Media Matters, told Recode in an email. “We saw this again in their half-hearted attempts to rein in election misinformation when they insisted on including Fox News — a persistent source of misinformation about voting — as an authoritative news partner.”

Recode identified about 20 videos that questioned the election in some capacity, such as promoting that Trump had actually won or pushing claims of voter fraud, to see how YouTube would respond. YouTube removed only one for violating its policies on deceptive practices and spam, which specifically bars discouraging voting and interfering in democratic processes. The company said it also took action on content that violated the company’s monetization policy, which prohibits ads on videos that undermine electoral processes.

YouTube has established a sliding scale for what content is and is not allowed on the platform, and much of its approach to content moderation depends on algorithms. For instance, searches for particular topics have been adjusted to elevate content from news outlets, such as CNN and Fox News. The platform also scales back recommendations for videos that are considered borderline but don’t quite violate YouTube’s rules. Content that violates its community guidelines is supposed to be taken down. That includes videos that allege mail-in ballots have been interfered with to influence an election or that call for violence at polling places.

“Over the last few years, we’ve heavily invested in the work that allows us to remove violative content, raise up authoritative content, and reduce the spread of borderline content,” YouTube spokesperson Ivy Choi told Recode in a statement. “Our policies prohibit misleading viewers about voting or encouraging interference in democratic processes, and we surface an information panel under videos discussing voting by mail. We continue to be vigilant with regards to election-related content in the lead-up and post-election period.”

The fact that it takes some work to find these more extreme videos suggests that YouTube’s algorithmic approach to moderation works to a degree, and that moderation isn’t designed to catch every single instance of violative content.

“It’s hard to do this kind of rigorous quantitative analysis in real time, but my impression of doing some searches — firing up so many anonymous browsers and looking at the recommendations — is that they’re definitely not pushing this, and it actually is somewhat hard to find unless you’re really, really looking for it,” said Kevin Munger, a political scientist at Penn State who has studied YouTube.

Results for terms like “election,” “voting,” and even “Trump won” do indeed usually lead to content from legitimate sources or news outlets, as Recode found when it did its own searches. Even searching for phrases like “voter fraud” generally reveals content refuting the idea that voter fraud was a significant factor in the election. Voter fraud is incredibly rare in the United States.

Videos that give more credence to voter fraud concerns include interviews that originally aired on cable television and talk radio, like Good Morning Britain and Fox News. But despite not showing up in the top search results, several videos that push the idea that the election is somehow being “stolen” from Donald Trump are still attracting a hefty number of views. (At the time of publication, Decision Desk HQ had called the election for Joe Biden, and there was no evidence of a stolen election.)

A video from One America News, claiming that “Trump won” on the night of November 3, had attracted nearly 420,000 views by Friday afternoon. While YouTube removed ads on this video, the company did not remove it because it didn’t “materially discourage voting,” YouTube told CNBC. Another video from OANN, published Thursday morning, claimed that Trump won but that Democrats were trying to steal the election. It now has more than 260,000 views. YouTube attached a label to both videos that reads “Results may not be final” and directs users to Google’s election tracker.

[Screenshot: A University of Texas advertisement running on a video that alleges “Voter FRAUD for Joe Biden” was caught on video.]

Most of the videos getting millions of views seem to be recordings of mainstream sources, like Fox News, rather than YouTube-native content creators, according to Munger from Penn State. YouTube has been successful in cutting down on recommendations to fringe channels, according to a report from the New York Times, though this has also boosted referrals to Fox News’s content on the site.

“I think this is consistent with what we’ve been seeing for how they’ve changed their recommendation system to essentially give precedence to mainstream conservative sources over fringe, right-wing conservative sources,” Munger said.

Still, amid ongoing anxiety about how social platforms deal with content that sows doubt in electoral and civic processes, some feel that YouTube has sidestepped the scrutiny directed at platforms like Twitter and Facebook.

As the election has unfolded, it’s become clear how these three companies have differing approaches to content moderation. Facebook, for instance, took down a group with 350,000 members associated with the #StopTheSteal conspiracy and also blocked the hashtag. As of Friday afternoon, searching the same hashtag produced a slew of results on YouTube, including videos with tens of thousands of views.

“We’re so focused on the other platforms that we don’t demand the same accountability and transparency from (YouTube), and nobody kicks up a fuss,” Evelyn Douek, a lecturer at Harvard Law School, told the Washington Post. “We can’t just let them get away with this.”

It’s unrealistic to expect that YouTube will take down every single problematic piece of content — and many would say that’s not the role of social media companies. But YouTube has also become an easy place for the president and conspiracy theorists to spread doubt about the election’s integrity. And so far, the platform appears willing to host videos that do just that.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.