

This is Facebook's plan to fight fake news

In the wake of the 2016 election, Facebook has faced a wave of criticism over its handling of fake news in its newsfeed, which more than a billion people use every day. Initially, Facebook CEO Mark Zuckerberg dismissed concerns about fake news, arguing that it was a “pretty crazy idea” to blame Facebook for Donald Trump’s upset win.

Zuckerberg’s blasé response triggered a fierce debate inside the social media giant. And now Facebook is changing its tune in a dramatic way. In a Thursday blog post, Adam Mosseri, the Facebook executive responsible for the newsfeed, announced that Facebook will begin working with prominent third-party fact-checking organizations like Snopes, FactCheck.org, and PolitiFact to verify news articles.

Stories flagged by these organizations will be marked as “disputed” in the newsfeed and penalized by the newsfeed’s ranking algorithm, meaning they’ll tend to show up further down users’ feeds but won’t be removed from Facebook altogether.

It’s exactly the kind of ambitious step that Facebook’s critics have been demanding for weeks. It’s also likely to cause a backlash among conservatives, many of whom believe that a company from liberal Silicon Valley will simply use the “fake news” label as a pretext for suppressing conservative viewpoints generally.

But that’s a shortsighted concern. In the long run, it’s not good for conservatives for the right-of-center political conversation to be strongly influenced by totally bogus news stories. Facebook’s new approach will raise the profile of more responsible conservative news organizations like the Washington Times, the Washington Examiner, and National Review. And in the long run, that’s going to be good for both the conservative movement and American democracy more generally.

Facebook has more and more influence over the news we see

A handful of big tech companies — Twitter, Google, and especially Facebook — have gained a huge and growing influence over what news people see. Forty-four percent of US adults told pollsters in 2016 that they get news from Facebook. That’s a vastly larger share than for other news-focused social media sites like Twitter (9 percent) and Reddit (2 percent). And while many people still get their news from television programs or newspapers, those media are divided among many competing news organizations. This means that Facebook has a larger influence over ordinary Americans’ media diets than almost any other news organization.

Normally we think that organizations with a lot of power have an obligation to use that power responsibly. But Facebook’s leaders have traditionally resisted thinking of themselves in those terms. Zuckerberg has repeatedly insisted that “we are a tech company, not a media company” — implying that Facebook should not be in the business of making editorial decisions or held responsible for the fake material cluttering its newsfeed.

But the 2016 election campaign showed the folly of this see-no-evil approach. The basic problem is that Facebook’s massive growth actually created an incentive for people to create websites specifically designed to manufacture, and profit from, fake news.

For example, investigations by BuzzFeed and the Guardian found that a group of cynical Macedonian hucksters had created dozens of right-wing news sites that publish low-quality pro-Trump news stories. Some are plagiarized from other conservative news sites. Others appear to be totally made up, with headlines like “Proof surfaces that Obama was born in Kenya,” “Bill Clinton’s sex tape just leaked,” and “Pope Francis forbids Catholics from voting for Hillary!”

“Yes, the info in the blogs is bad, false, and misleading but the rationale is that ‘if it gets the people to click on it and engage, then use it,’” a Macedonian student told BuzzFeed.

There has always been shoddy journalism and accidental mistakes in news articles, of course. But until Facebook’s newsfeed came along, there was no real incentive for someone to create sites full of false but clickable stories, since these sites would get very little traffic and hence few, if any, advertising dollars.

Facebook’s growing popularity created a market opportunity for fake news sites, while the growth of internet advertising networks made it easy to turn traffic into cash. Both Google and Facebook have already taken steps to ban fake news sites from using their lucrative ad networks — but there are lots of other ad networks these sites can use.

Facebook is relying on third-party organizations to battle fake news

The fake news situation puts Facebook in an awkward position, because it likes to think of itself as simply providing a neutral platform for its users to share information with one another. Making direct editorial judgments about which news stories are fake would seemingly run contrary to that platform-company ethos.

So Facebook is essentially outsourcing these editorial judgments to independent organizations that are already in the fact-checking business. Facebook says it will rely on Poynter's International Fact-Checking Network to identify suitable fact-checking organizations. A Facebook spokesperson tells me that at launch, Facebook will rely on ABC News, the Associated Press, Snopes, FactCheck.org, and PolitiFact to verify news articles.

“We’ll use the reports from our community, along with other signals, to send stories to these organizations,” Mosseri writes. “If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why.”

If users flag an article as inaccurate, Facebook will refer the article to the third-party fact-checkers, who will examine the article and render a verdict. If a story is ruled false by fact-checkers, Facebook will add a “disputed” tag to the article anytime a user shares it.
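To make that workflow concrete, here is a minimal sketch in Python of the flag-review-label loop as Mosseri describes it. Everything in it is hypothetical: the class names, the report threshold, and the fact-checker interface are invented for illustration, not taken from Facebook's actual systems.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical sketch of the flag -> fact-check -> "disputed" loop described
# above. All names and thresholds are invented; Facebook has not published
# its implementation.

# A fact-checker takes a story URL and returns (verdict, explanation_url).
FactChecker = Callable[[str], tuple[str, str]]

@dataclass
class Story:
    url: str
    flag_count: int = 0                 # user reports of inaccuracy
    disputed: bool = False              # set once fact-checkers rule it false
    dispute_link: Optional[str] = None  # link explaining the ruling

REVIEW_THRESHOLD = 100  # assumed: enough reports ("signals") trigger review

def handle_flag(story: Story, checkers: list[FactChecker]) -> None:
    """Record one user report; refer the story out once reports pile up."""
    story.flag_count += 1
    if story.flag_count >= REVIEW_THRESHOLD and not story.disputed:
        refer_to_fact_checkers(story, checkers)

def refer_to_fact_checkers(story: Story, checkers: list[FactChecker]) -> None:
    """Ask the third-party organizations for verdicts. In this toy model, a
    single 'false' ruling marks the story disputed and attaches a link to
    the corresponding explanation, per Mosseri's description."""
    for check in checkers:
        verdict, explanation_url = check(story.url)
        if verdict == "false":
            story.disputed = True
            story.dispute_link = explanation_url
            break
```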

Disputed stories will also be penalized by Facebook’s newsfeed algorithm, which decides which stories to show you first. In the past, Facebook’s newsfeed has relied heavily on “engagement” — that is, how often someone clicks, likes, and shares a link — to decide which stories to show users first.

As one Facebook critic put it: “News Feed optimizes for engagement. As we've learned in this election, bullshit is highly engaging.”

Facebook will still take engagement into account, but now it will also take accuracy into account. Disputed stories can still be shared on the newsfeed, but they’ll tend to appear further down in the newsfeed than stories that have not been marked as inaccurate by fact-checkers.
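As a rough illustration of how an accuracy signal might demote a story without removing it, here is a toy scoring function. The engagement weights and the size of the dispute penalty are assumptions for the sake of the example; Facebook has not disclosed its actual ranking formula.

```python
# Toy newsfeed scoring: engagement still matters, but a "disputed" flag
# multiplies the score down. Weights and penalty are invented.

DISPUTE_PENALTY = 0.5  # assumed: disputed stories keep half their score

def rank_score(clicks: int, likes: int, shares: int, disputed: bool) -> float:
    engagement = clicks + 2 * likes + 3 * shares  # invented weighting
    return engagement * (DISPUTE_PENALTY if disputed else 1.0)

stories = [
    {"title": "accurate story", "clicks": 500, "likes": 80, "shares": 40, "disputed": False},
    {"title": "disputed story", "clicks": 600, "likes": 100, "shares": 60, "disputed": True},
]

# Sort highest score first. The disputed story has more raw engagement
# (980 vs. 780) but sorts below the accurate one (490 vs. 780) --
# demoted, not removed.
stories.sort(key=lambda s: rank_score(s["clicks"], s["likes"], s["shares"], s["disputed"]),
             reverse=True)
print([s["title"] for s in stories])  # ['accurate story', 'disputed story']
```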

And if users try to share a disputed story, Facebook will warn them that the story is disputed and ask if they really want to share it. That might help discourage people from unknowingly sharing bogus information.
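In code, that warning step might look like the small sketch below — again a hypothetical model, with `confirm` standing in for whatever dialog Facebook actually shows.

```python
# Hypothetical share-time warning for disputed stories. `confirm` stands in
# for the UI prompt; its wording and interface are invented.

def try_share(story: dict, confirm) -> bool:
    """Return True if the story was shared, False if the user backed out."""
    if story.get("disputed"):
        if not confirm("Independent fact-checkers dispute this story. "
                       "Do you really want to share it?"):
            return False  # user declined after seeing the warning
    # ...actually post the story to the feed here...
    return True

# Example: a user who heeds the warning never shares the story.
assert try_share({"disputed": True}, confirm=lambda prompt: False) is False
```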
