Facebook thinks it has figured out how to stop the spread of fake news: It’s going to ask journalists to tell it when something’s fake, and then it will ask users not to share it.
The social network laid out its plan in a blog post today, following weeks of criticism for its role in spreading intentionally deceptive stories during the 2016 election. It follows the contour of the map Mark Zuckerberg laid out last month. Here are the big ideas:
- Facebook will ask users to report fake news by clicking on a button at the top right of a story they think is dubious. It will also use its software to look for signs of fake news stories that are getting traction.
- If Facebook thinks its users and/or its software have found a fake news story, it will ask a consortium of journalists to fact-check the story.
- If the journalists think the story is bogus, Facebook will flag the story as “disputed by third-party fact-checkers.”
- That “disputed” banner will be attached to the story within Facebook’s News Feed, and Facebook will tweak its algorithms to make sure “disputed” stories don’t get as much traction in the feed*.
- And users who do want to share a “disputed” story will get a prompt asking them if they’re really sure they want to share the story.
- Facebook also says it will try to make it harder for publishers to profit by publishing fake news, though it is vague about what that will mean.
The post, from Facebook News Feed boss Adam Mosseri, doesn’t really describe its plan to fact-check fake news, but I’ve asked around to get a bit more detail.
Facebook is working with four news organizations/fact-checking groups — ABC News, Politifact, FactCheck and Snopes (Update: The Associated Press has also signed on.) — which have agreed to vet potentially fake stories that Facebook sends them and publish their findings. But Facebook won’t flag the stories as “disputed” unless at least two of the fact-checking groups say there’s a problem.
ABC News President James Goldston says his company will take the team of about a half-dozen journalists who had been fact-checking claims during the 2016 election and assign them to the effort full-time. Facebook isn’t paying ABC for its efforts, Goldston says: “We regard this as being an important part of our mission.”
The Facebook fake news vetting plan is both unobjectionable — who has a problem with fact-checking? — and unintentionally hilarious: Facebook, run and staffed by some of the world’s most clever people, who have created one of the world’s most powerful companies, can’t figure out if Hillary Clinton is running a child sex ring out of a pizza parlor. So it’s going to outsource that question to someone else.
The plan also seems clearly designed to absolve Facebook from any kind of culpability or blame, if and when it receives charges of bias: We’re not saying these things are false — we don’t even use that word! — but people who don’t work here say they’re “disputed.”
Facebook describes these efforts as “tests,” and it will inevitably have to tweak them. You can also imagine some of the problems it’s going to encounter as it rolls this out.
It’s easy, for instance, to determine that the Pope didn’t endorse Donald Trump. But what about the story about the Muslim college student who said three men, inspired by Trump’s victory, attacked her on the New York City subway?
The incident, which she reported to the police, was widely covered. But now police have charged her with filing a false report after determining that she made up the story.
Many stories that reported on her accusations were true at the time, in the sense that they accurately reported what she said. But how should Facebook treat those stories now? And how should Facebook treat stories about bias attacks after Trump’s election that cite her original account?
Or flip it around: What about posts that pass along Trump’s unsubstantiated claim that millions of people voted “illegally” in the election? Stories that say Trump tweeted that are clearly accurate, since that’s what he did. But if they don’t point out that he’s wrong, are they fake as well?
* Remember this when Facebook tells you it doesn’t control the kind of thing you see in your News Feed.
This article originally appeared on Recode.net.