Facebook wants to rid itself of so-called “fake news,” and Infowars, the far-right site known for promoting appalling conspiracy theories, is one of its most egregious purveyors, having insisted, among other things, that the Sandy Hook shooting of schoolchildren was staged.
That’s why a lot of people were confused last week when Facebook said it wouldn’t ban a site like Infowars, even while acknowledging that the site often shares “conspiracy theories or false news.”
What gives? The company’s official stance is that it will use its algorithms to minimize the spread of false news, but it won’t take those posts down. In a tweet, Facebook described it as a “free speech” issue. Now, CEO Mark Zuckerberg has weighed in with his explanation of how the company thinks about its role in policing news online.
Here’s how Zuckerberg described the company’s thinking to Recode Editor at Large Kara Swisher on this week’s Recode Decode podcast:
“Let’s take this a little closer to home. So I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong — I don’t think that they’re intentionally getting it wrong. It’s hard to impugn intent and to understand the intent. I just think as important as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public leaders who we respect do, too. I just don’t think that it is the right thing to say we are going to take someone off the platform if they get things wrong, even multiple times.”
The solution Zuckerberg considers fairer is the one Facebook currently employs: An offensive or deliberately inaccurate post can stay up, but Facebook may downgrade it so that its algorithms show it to fewer people. “You can put up that content on your page even if people might disagree with it or find it offensive, but that doesn’t mean that we have a responsibility to make it widely distributed in News Feed,” Zuckerberg said.
That policy has attracted a lot of criticism. Facebook’s content-filtering practices are so controversial, in fact, that they were the subject of a nearly three-hour-long congressional hearing yesterday. But Zuckerberg continued to insist that he doesn’t want to be the one to decide what’s right or wrong online, even if his desire to curb fake news has put the company in a position where it needs to do just that. And inevitably, he said, mistakes will be made.
“You can either look at this and say we should have predicted all these issues ahead of time, and some people think that,” he said. “I tend to think that it is very difficult to predict every single thing.”
“There are going to be challenges that come up that are things that we did not foresee.”
Facebook’s policy is still evolving, though. Zuckerberg says some kinds of misinformation are worse than others, specifically misinformation that encourages harm, like what we’ve seen in Myanmar. “We are moving towards the policy of misinformation that is aimed at or going to induce violence, we are going to take down,” he said.
When asked how he felt personally about Facebook’s role in spreading dangerous content, Zuckerberg paused. “My emotion,” he said, “is feeling a deep sense of responsibility to try to fix the problem.”
This article originally appeared on Recode.net.