Facebook CEO Mark Zuckerberg doesn’t think fake news influenced last week’s presidential election. But it turns out he does think fake news is a problem on Facebook, and late Friday night he laid out details on a number of “projects we already have under way” to stop the spread of fake news on the platform in the future.
The general takeaway from the lengthy Facebook post is that the social network plans to be more proactive in identifying and removing fake news articles from users’ feeds moving forward. Until now, it has primarily relied on users to report and flag inaccurate stories.
That will still be possible, of course, but Zuckerberg outlined a number of other updates that are apparently in the works. A few of the potential changes:
- Adding a warning label to stories that users have flagged as inaccurate.
- Working with more third-party fact-checking organizations.
- Improving the accuracy of “related articles” that it suggests for users to read.
- Blocking fake news distributors from paying to promote their content. (Facebook started that process this week.)
- Building better algorithms to automatically detect fake news. “This means better technical systems to detect what people will flag as false before they do it themselves,” Zuckerberg wrote.
Zuckerberg did not say when these updates would be active or available, but he did stress that it won’t be a simple fix. “Some of these ideas will work well, and some will not,” he wrote.
Facebook has been under fire this week after reports suggested that fake news stories may have played a larger role in last week’s election than previously believed. Zuckerberg has denied on multiple occasions that fake news played a meaningful role in determining the election’s outcome. But his post Friday shows just how big a problem he thinks misinformation really is.
Despite the looming changes, Zuckerberg emphasized that Facebook will have to walk a fine line between policing its feed for fake news and infringing on personal opinions and free speech.
“The problems here are complex, both technically and philosophically,” he wrote. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”
Facebook has long argued that it’s not a media company but a technology platform that simply carries information. The truth of the matter, though, is that Facebook and its algorithms determine what news articles hundreds of millions of people around the world see each day. That brings with it some ethical responsibilities as well.
“The bottom line is: we take misinformation seriously. Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information,” Zuckerberg wrote. “We understand how important the issue is for our community and we are committed to getting this right.”
This article originally appeared on Recode.net.