The three-and-a-half-hour hearing with Google CEO Sundar Pichai and the House Judiciary Committee wasn’t exactly a showcase of deep knowledge of technology. One Republican representative complained that all the Google results for the Obamacare repeal act and the Republican tax bill were negative. Rep. Steve King (R-IA) had to be told that Google does not make the iPhone. Rep. Louie Gohmert (R-TX) demanded that Google be held liable for Wikipedia’s “political bias.”
But one lawmaker, Rep. Jamie Raskin (D-MD), raised an actually important and pressing issue: the way YouTube’s algorithms can be used to push conspiracy theories.
“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” he said. He was alluding to the Pizzagate conspiracy theory that led to an armed gunman showing up at a DC-area pizzeria in 2016 — a conspiracy theory spread, in part, on YouTube.
Raskin asked about another especially strange conspiracy theory that emerged on YouTube — “Frazzledrip,” which has deep ties to the QAnon and Pizzagate conspiracy theories. He asked Pichai, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?” He added, “Are you taking the threats seriously?”
Raskin’s questions were getting at an important issue: YouTube, which Google purchased for $1.65 billion 12 years ago, has a conspiracy theory problem. It’s baked into the way the service works. And it appears that neither Congress nor YouTube is anywhere near solving it.
YouTube and conspiracy theories, explained
One billion hours’ worth of content is viewed on YouTube every single day. About 70 percent of those views come from YouTube’s recommendations, according to AlgoTransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.”
YouTube’s content algorithms are incredibly powerful — they determine which videos show up in your search results, in the suggested videos stream, on the homepage, in the trending stream, and under your subscriptions. If you go to the YouTube homepage, algorithms dictate which videos you see and which ones you don’t. And if you search for something, an algorithm decides which videos you get first.
For example, as I write, I am listening to The Nutcracker Suite on YouTube, so YouTube has recommended a list of classical music videos, along with several others based on my viewing history. But the algorithm knows that I probably don’t want to listen to Nine Inch Nails right now, so it isn’t suggesting, say, Nine Inch Nails’ Broken album.
But YouTube’s algorithms have an extremism problem.
if this is recommended to me, imagine what goes to people who actually watch this sort of content pic.twitter.com/tPcOsdbM0r
— John Ganz (@lionel_trolling) December 11, 2018
As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in the New York Times in March, YouTube’s advertising model is built on getting you to watch as many videos (and the ads that appear before and during them) as possible.
Whether the subject of the original video selected was right-leaning, left-leaning, or even nonpolitical, the algorithm tends to recommend increasingly extreme videos — escalating the viewer, Tufekci wrote, from videos of Trump rallies to videos featuring “white supremacist rants, Holocaust denials, and other disturbing content.”
Watching videos of Hillary Clinton and Bernie Sanders, on the other hand, led to videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11,” Tufekci wrote.
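Tufekci’s argument can be sketched as a toy model in code. This is purely illustrative — the catalog, the “intensity” scores, and the watch-time model below are all invented for the sketch, and nothing here reflects YouTube’s actual systems or signals. The point it demonstrates: if a recommender simply maximizes predicted watch time, and viewers reliably watch content slightly more intense than whatever they saw last, each recommendation resets the baseline and the suggestions ratchet steadily toward the extreme.

```python
# Toy catalog: 100 videos, each with an "intensity" score from 0.0 to 1.0.
# (Assumed, for illustration only: predicted watch time rises with intensity,
# up to a bit beyond what the viewer last tolerated.)
videos = [{"id": i, "intensity": i / 99} for i in range(100)]

def predicted_watch_time(video, viewer_tolerance):
    # Hypothetical engagement model: the viewer will watch anything up to
    # slightly more intense than their current baseline, and the more
    # intense it is (within that band), the longer they watch.
    if video["intensity"] <= viewer_tolerance + 0.1:
        return video["intensity"]
    return 0.0

def recommend(viewer_tolerance):
    # A pure engagement maximizer: pick whichever video the model
    # predicts will be watched longest.
    return max(videos, key=lambda v: predicted_watch_time(v, viewer_tolerance))

tolerance = 0.1  # the viewer starts out watching mild content
history = []
for step in range(8):
    pick = recommend(tolerance)
    history.append(round(pick["intensity"], 2))
    tolerance = pick["intensity"]  # watching recalibrates what feels normal

print(history)
# → [0.19, 0.28, 0.37, 0.46, 0.56, 0.65, 0.74, 0.83]
```

With no change to the viewer’s stated preferences, the recommended intensity climbs every step — the escalation is a property of the objective, not of any one video.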
In a statement from a YouTube spokesperson, YouTube said, “YouTube is a platform for free speech where anyone can choose to post videos, subject to our Community Guidelines, which we enforce rigorously. Over the last year we’ve worked to better surface credible news sources across our site for people searching for news-related topics.” It added, “We’ve changed our search and discovery algorithms to surface credible content, built new features that clearly label and prominently surface news sources on our homepage and search pages, and introduced information panels to help give users more sources where they can fact check information for themselves.”
On AlgoTransparency’s website, which tries to reverse-engineer YouTube’s recommendation algorithm, I entered two search terms to see what the algorithm would recommend to a user with no viewing history. First up was “Trump.” (You can try this yourself.)
The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommendation was a QAnon-themed video — relating to the conspiracy theory alleging that President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)
Next, I tried “Hillary Clinton.” The top three videos recommended by YouTube’s algorithm were all conspiracy-theory-driven, ranging from a video on an anti-Semitic YouTube channel arguing that Freemasons will escape from the United States on private yachts after America’s eventual collapse, to one alleging that Clinton has a seizure disorder (she does not), to one alleging that Clinton has had a number of people murdered (also untrue).
I spend a lot of time consuming content about conspiracy theories — but these results weren’t tailored to me. They were based on a user who had never watched any YouTube videos before. (For the record, a YouTube spokesperson responded, “We’ve generally been unable to reproduce the results AlgoTransparency and Vox have encountered. We’ve designed our systems to help ensure that content from more credible sources is surfaced prominently in search results and watch next and up next recommendations in certain contexts, including when a viewer is watching news related content from a verified news source.”)
This isn’t a flaw in YouTube’s system — this is how YouTube works. Which brings us to Frazzledrip.
How YouTube helped spread the weirdest conspiracy theory of them all
The conspiracy theory behind Frazzledrip is this, as “explained” on the fake news website YourNewsWire.com in April: Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping off a child’s face and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice, and that video was then found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name “Frazzledrip.”
“The #Hillgramage 2018”. I’m shaking tonight with this drop. This would be an appropriate outcome. From #Epstein to #CometPingPong and everything in between, We’ve been waiting for this. We’re coming @HillaryClinton, I’m sharpening my pitchfork right now. #FRAZZLEDRIP #Pizzagate https://t.co/NFbCc4AZTt
— ImMikeRobertson (@ImMikeRobertson) April 15, 2018
For the record: This is not true. There is no such video, and no such thing ever happened. But as Snopes has detailed, multiple conspiracy theories of the Trump era, including QAnon and Pizzagate, overlap, and all of them hold that Hillary Clinton is a secret child pedophile and murderer.
You have probably never heard of Frazzledrip. Most people haven’t heard of Frazzledrip, or QAnon, or perhaps even Pizzagate. But on YouTube, there are hundreds of videos, each with thousands of views, dedicated to a conspiracy theory alleging that a former presidential candidate ripped a child’s face off and wore it as a mask. And there appears to be markedly little that YouTube, or Google, or even Congress can do about it.
“It’s an area we acknowledge there is more work to be done”
Here’s how Pichai answered Raskin’s question: “We are constantly undertaking efforts to deal with misinformation, but we have clearly stated policies, and we have made lots of progress in many of the areas over the past year. ... This is a recent thing but I’m following up on it and making sure we are evaluating these against our policies. It’s an area we acknowledge there is more work to be done.”
While explaining that YouTube takes problematic videos on a case-by-case basis, he added, “It’s our responsibility, I think, to make sure YouTube is a platform for freedom of expression, but it needs to be responsible in our society.” And in comments to me, YouTube stated, “Freedom of speech is at the foundation of YouTube. As such, we have a strong bias toward allowing content on the platform even when people express controversial or offensive beliefs. That said, it’s not anything goes on YouTube.”
But it isn’t easy to balance a platform that claims to be for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, or to think that the 9/11 attacks were an inside job (they weren’t) or that the Sandy Hook shootings never happened (they did) or that Hillary Clinton is a child-eating pedophilic cannibal (this, I suppose it must be said, is untrue). In a statement, YouTube said, “False information is not necessarily violative, unless it crosses the line into hate speech, harassment, inciting violence or scams. We’ve developed robust Community Guidelines, and enforce these policies effectively.”
YouTube could radically change its terms of service — in a way that would dramatically limit the freedom of expression Pichai and his colleagues are attempting to provide. Or it could invest much more heavily in moderation, or change its algorithm.
But all of that would be bad for business. As long as YouTube is so heavily reliant on algorithms to keep viewers watching, on a platform where hundreds of hours of video are uploaded every minute of every day, the conspiracy theories will remain. Even if YouTube occasionally bans conspiracy theorists like Alex Jones, users will continue to upload videos about Frazzledrip, or QAnon, or videos arguing that the Earth is flat — and YouTube’s algorithms, without any change, will keep recommending them, and other users will watch them.
Clarification 12/14: According to YouTube, the recommendation system no longer optimizes for watch time, impacting how the algorithm works overall for both search results and recommendations. A spokesperson for the company told me, “Over the last few years we started to focus more on how satisfied people are with their time spent on YouTube. We use surveys, likes, dislikes, shares and other information to measure and improve satisfaction. ... While we still use watch time as one indicator of satisfaction, it is heavily balanced by several other signals we use to help make sure users are satisfied with the content they are watching.”
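The clarification above describes a ranking objective that blends several signals — surveys, likes, dislikes, shares — rather than maximizing watch time alone. Here is a minimal sketch of that idea, with signal names and weights invented for illustration (YouTube has not published its actual signals or how it weights them):

```python
def satisfaction_score(video_stats, weights=None):
    """Blend several engagement signals into one ranking score.

    All signal names and weights are hypothetical; the only point is
    that watch time becomes one input among several instead of the
    sole objective.
    """
    weights = weights or {
        "watch_minutes": 0.3,   # still a signal, but no longer dominant
        "likes": 0.25,
        "shares": 0.2,
        "survey_rating": 0.35,  # direct "were you satisfied?" feedback
        "dislikes": -0.5,       # explicit negative feedback lowers the score
    }
    return sum(weights[k] * video_stats.get(k, 0.0) for k in weights)

# Two hypothetical videos (all stats normalized to 0-1):
# one that keeps people watching but leaves them dissatisfied,
# and one watched slightly less but rated far more positively.
clickbait = {"watch_minutes": 0.9, "likes": 0.2, "shares": 0.1,
             "survey_rating": 0.1, "dislikes": 0.6}
satisfying = {"watch_minutes": 0.6, "likes": 0.7, "shares": 0.5,
              "survey_rating": 0.8, "dislikes": 0.05}

# Under a pure watch-time objective the first video wins;
# under the blended score, the second one does.
print(satisfaction_score(clickbait), satisfaction_score(satisfying))
```

The design point is that once dislikes and survey ratings carry weight, a video that merely keeps people watching can rank below one that viewers actually report being satisfied with — which is the shift the company describes, though whether it is enough to stop the recommendation of conspiracy content is exactly the open question of this piece.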