

Facebook’s fake news problem is way bigger than fake news

Algorithmic media consumption leaves readers lost at sea.

Photo: Mark Zuckerberg attends Mobile World Congress 2016. (David Ramos/Getty Images)

Pepsi is currently facing a strange business problem, as supporters of Donald Trump mobilize on social media to boycott the company’s products because PepsiCo CEO Indra Nooyi told Trump supporters to “take their business elsewhere.”

What’s strange about it is that Nooyi never said that, which makes it difficult for her to apologize or do any of the other things you would normally expect a CEO in hot water to do. CNN’s Shannon Gupta writes, “Pepsi isn't the first brand to get hit by fake news,” framing the company’s problem as part of the larger conversation about the sharing of fake news on Facebook.

But if you look into the details of the story, this really isn’t a case of fake news. Fake news, strictly speaking, is an alarming new trend in which people launch websites that simply make up stories designed to be highly shareable regardless of their accuracy. For example, someone wrote a story claiming Pope Francis had endorsed Donald Trump, and it went viral even though the pope does not endorse candidates and has been sharply critical of Trump’s views on a range of issues.

The Pepsi story is something older, more banal, more difficult to address, and, in its way, more troubling — it’s a mistake.

Origins of a mythical quote

Here’s what happened:

  • On November 8, Donald Trump won the US presidential election.
  • On November 10, Nooyi spoke at a New York Times DealBook conference and said, “Our employees were all crying. And the question that they're asking, especially those who are not white, 'Are we safe?' Women are asking, 'Are we safe?' LGBT people are asking, 'Are we safe?' I never thought I would have to answer those questions.” Nooyi said she offered her employees reassurance: “What we heard was election talk.” And she said she wasn’t happy about the language Trump had used to address women, but that the problem was larger than Trump.
  • Later that day, Business Insider’s Kate Taylor wrote up the DealBook appearance with the headline “PepsiCo CEO: Employees are scared for their safety after Trump’s election” and offered what I would consider to be a fully accurate paraphrase of Nooyi’s remarks about alarmed employees and her effort at deescalation.
  • Then on November 13, the website The Conservative Treehouse read about Nooyi’s remarks and took exception to them, under the headline “Massive Stewardship Fail — PepsiCo CEO Tells Trump Supporters to Take Their Business Elsewhere.” The story does not say that Nooyi literally told Trump supporters to take their business elsewhere. It says that Nooyi was critical of Trump (which she was) and argues that Trump supporters should therefore take their business elsewhere. But the headline — as is often the case with headlines — lacks any nuance or clarity about what actually happened.
  • Amy Moreno of TruthFeed then wrote a story called “BREAKING: Pepsi STOCK Plummets After CEO Tells Trump Supporters to ‘Take Their Business Elsewhere.’”

Now, at that point, Moreno has unleashed a whirlwind of falseness onto the world. Her headline has Nooyi, in quotation marks, saying something that she rather clearly didn’t say. The story is false.

But it isn’t “fake news” exactly. It’s based on a real news event that has simply been aggregated and reaggregated, framed in different ways for different audiences. At some point in the telephone chain, the story goes from accurate to inaccurate. And the method is the same as the fake news method — maximum outrage, maximum engagement, minimum concern for context and accuracy.

Facebook-as-news-source is inherently broken

Facebook can and should do more to crack down on genuinely fake news stories being shared on its platform. In particular, it can and should clamp down on fake-news content mills to at least prevent them from swamping everything else on the platform.

But the underlying thing that makes Facebook so vulnerable to fake news is really inherent to what Facebook’s newsfeed is. You pop online and it shows you stories that, based on its understanding of how other people have reacted to them and how you have reacted to past stories, it thinks you will enjoy.

Facebook is really good at this. I don’t normally check my Facebook newsfeed, but I just did for the purposes of this story, and, yeah, it was full of headlines that I wanted to click on or even share without clicking.

But logging in to a platform built by clever engineers to select the stories you are most likely to find psychologically pleasing to click on and share is not a good way to obtain information about the world. If you are the kind of person inclined to like Donald Trump but also inclined to like the pope, the stories you need to see are the psychologically difficult ones that pick at the tension between your identity as a Republican partisan and your identity as a Catholic.

A news diet overwhelmingly driven by shareability and algorithmic targeting is going to be profoundly misleading whether or not it contains fake news.
