During the 2016 election, Facebook showed fake news stories to millions of voters. Because these stories were disproportionately pro-Trump (inaccurately claiming, for example, that the Pope had endorsed Trump and that an FBI agent involved in investigating Hillary Clinton had been murdered), some people have wondered whether Facebook itself contributed to Trump’s victory.
On Saturday, Zuckerberg addressed his critics with a new post on Facebook. He insisted that “on Facebook, more than 99% of what people see is authentic” — though given that news makes up only a minority of what people see on Facebook, that doesn’t actually mean the fake news problem is trivial. Zuckerberg argued that it’s “extremely unlikely hoaxes changed the outcome of this election.” At the same time, he said Facebook is taking the problem seriously:
We don't want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further.
This is an area where I believe we must proceed very carefully though. Identifying the "truth" is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
He’s obviously right that this is a hard problem, and that caution is warranted. Still, I think it’s a mistake to frame the problem as merely rooting out “hoaxes.”
Facebook’s goal should be to help users find the best content possible. Limiting the distribution of obviously fake news is the bare minimum Facebook can do. But Facebook can also add value at the opposite end of the quality spectrum, by identifying articles that are thoughtful and thoroughly reported (and publications with a track record of producing such articles) and giving those an extra boost in the News Feed algorithm.
An advantage of this approach is that it reduces the need for hard judgment calls about whether a particular article is so fake that it must be blocked outright. Low-quality news (whether outright hoaxes or just sloppy journalism) will naturally get less exposure because people will only see it after scrolling through the higher-quality news their friends have shared.