Facebook CEO Mark Zuckerberg identified three main kinds of “fake news” and laid out Facebook’s different strategies for dealing with each one.
Speaking to Vox co-founder Ezra Klein on his podcast The Ezra Klein Show, Zuckerberg identified three broad categories of fake news: spammers, who are usually independent actors; state actors controlled by foreign governments and factions; and “real media outlets [that have] varying levels of accuracy or trustworthiness.”
These are three very different categories with three very different potential solutions, which Zuckerberg laid out in order of escalating complexity. Perhaps surprisingly, the most challenging category is the last one, which involves identifying and boosting real, trustworthy media outlets. Solving this issue — which Klein called a “conceptual” problem — has been an ongoing dilemma for Facebook, according to Zuckerberg. Here’s how he sees these three categories playing out on the platform and how Facebook is approaching them.
1) Spammers
Zuckerberg made it clear that the problem of spammers was the easiest to deal with — all Facebook needed to do to drive them away, he told Klein, was to “make it non-economical” for them to be on the platform. So Facebook barred sketchy publishers from using its ad tools to monetize, and tweaked its algorithms to show their content less:
So the first step, once we realized that this was an issue, was a number of them ran Facebook ads on their webpages. We immediately said, “Okay. Anyone who’s even remotely sketchy, no way are you going to be able to use our tools to monetize.” So the amount of money that they made went down.
Then they’re trying to pump this content into Facebook with the hopes that people will click on it and see ads and make money. As our systems get better at detecting this, we show the content less, which drives the economic value for them down.
And after implementing these steps, “eventually,” Zuckerberg said, “they just get to a point where they go and do something else.”
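That economic framing is easy to make concrete. The toy model below sketches how cutting off ad tools and demoting content in ranking compound to push a spam operation’s revenue toward zero; every number, name, and the 0.3 ad-network penalty is an illustrative assumption, not Facebook data.

```python
# Toy model of the spam economics described above: losing access to ad tools
# and being shown less both shrink revenue, until spamming no longer pays.
# Every figure here is an illustrative assumption, not Facebook data.

def monthly_revenue(impressions: int, click_rate: float, revenue_per_click: float,
                    demotion: float, can_use_ad_tools: bool) -> float:
    """Estimate a spam page's monthly ad revenue.

    demotion: fraction of normal distribution the ranking system still allows
              (1.0 = shown normally, 0.1 = shown a tenth as often).
    """
    if not can_use_ad_tools:
        revenue_per_click *= 0.3  # assumed fallback to lower-paying ad networks
    return impressions * demotion * click_rate * revenue_per_click

before = monthly_revenue(1_000_000, 0.02, 0.05, demotion=1.0, can_use_ad_tools=True)
after = monthly_revenue(1_000_000, 0.02, 0.05, demotion=0.1, can_use_ad_tools=False)
print(f"before crackdown: ${before:,.0f}/month; after: ${after:,.0f}/month")
# before crackdown: $1,000/month; after: $30/month
```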
The second problem is a little trickier — but still manageable, he said.
2) State actors, or Russian bot farms
The second category is what Zuckerberg described as “basically the Russian interference effort” — in essence, Russian bot farms circulating propaganda and manipulating conversations on social media. That’s “a security problem” according to Zuckerberg, and he thinks Facebook is “making progress” in handling it — even though it’s not something that can ever be “fully” solved:
You never fully solve it, but you strengthen your defenses. They’re not doing it for money. But you make it harder and harder. You get rid of the fake accounts and the tools that they have for using this. We can’t do this all by ourselves, so we try to work with local governments everywhere who have more tools to punish them and have more insight into what is going on across their country so they can tell us what to focus on. And that one I feel like we’re making good progress on too.
Zuckerberg elaborated that following the disastrous 2016 elections, “we spent a bunch of time developing new AI tools to find the kind of fake accounts spreading misinformation.” Before last year’s German elections, the company removed about 10,000 fake accounts, though controversy continued about the ads that were allowed to remain.
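Zuckerberg doesn’t describe how those AI tools work, but the general shape of automated fake-account detection can be sketched. The snippet below is a deliberately naive, rule-based stand-in: every signal, weight, and threshold is a hypothetical placeholder, and real systems rely on learned models over far richer data.

```python
from dataclasses import dataclass

@dataclass
class AccountSignals:
    age_days: int                  # how old the account is
    friend_requests_per_day: float
    posts_per_day: float
    duplicate_post_ratio: float    # share of posts copied verbatim from elsewhere

def suspicion_score(a: AccountSignals) -> float:
    """Combine weak signals into a 0-1 suspicion score (illustrative weights)."""
    score = 0.0
    if a.age_days < 30:
        score += 0.25                                         # brand-new accounts are riskier
    score += min(a.friend_requests_per_day / 100, 1.0) * 0.25  # mass friending
    score += min(a.posts_per_day / 50, 1.0) * 0.20             # inhuman posting rates
    score += a.duplicate_post_ratio * 0.30                     # coordinated copy-paste
    return min(score, 1.0)

bot = AccountSignals(age_days=3, friend_requests_per_day=200, posts_per_day=80,
                     duplicate_post_ratio=0.9)
print(suspicion_score(bot))  # 0.97 -- would be queued for review
```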
Zuckerberg stressed that Facebook is invested in working with governments — just as it worked with Germany’s electoral commission — to combat the problem. “If you work with the government in a country, they’ll really actually have a fuller understanding of what is going on and what are all the issues that we would need to focus on.”
3) Fake news from real media
The biggest challenge of all, Zuckerberg told Klein, lies with real media outlets “who are probably saying what they think is true, but just have varying levels of accuracy or trustworthiness.”
“That is actually the most challenging portion of the issue to deal with,” he said. “Because there, I think, there are quite large free speech issues. Folks are saying stuff that may be wrong, but, like, they mean it, they think that they’re speaking their truth, and do you really want to shut them down for doing that?”
It’s worth pointing out that many of Facebook’s vocal critics have argued that, yes, it’s worth shutting down many types of harmful and toxic speech, and that not being more proactive on that front is how Facebook wound up here. Still, Zuckerberg was eager to discuss how the company is changing.
“This year,” he said, “we’ve rolled out a number of changes to News Feed that try to boost in the ranking broadly trusted news sources”:
Take the Wall Street Journal or New York Times. Even if not everyone reads them, the people who don’t read them typically still think they’re good, trustworthy journalism. Whereas if you get down to blogs that may be on more of the fringe, they’ll have their strong supporters, but people who don’t necessarily read them often don’t trust them as much.
By applying that kind of a lens on this — we know that people in our community want broadly trusted content — that is helping to surface more of the things that are building common ground in our society, and maybe pushing out a little of the stuff that is less trustworthy, even though we’re going to continue to be very sensitive to not suppress people’s ability to say what they believe.
Zuckerberg emphasized that the Facebook “community” was at the heart of this change. “We’ve surveyed people across the whole community and asked them whether they trust different news sources,” he said. Overall, Zuckerberg made it clear that he is counting on Facebook users to tell him what they want — and to teach Facebook’s algorithms how to give it to them.
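The survey mechanism Zuckerberg describes lends itself to a simple score: a source counts as “broadly trusted” when people who are familiar with it trust it, whether or not they read it. The sketch below assumes a made-up survey schema; the SurveyResponse fields and the scoring rule are illustrative, not Facebook’s actual methodology.

```python
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    source: str
    familiar: bool   # has the respondent heard of the source?
    trusts: bool     # does the respondent trust it?
    reads: bool      # does the respondent actually read it?

def broad_trust(responses: list[SurveyResponse], source: str) -> float:
    """Share of familiar respondents who trust `source`, readers or not."""
    familiar = [r for r in responses if r.source == source and r.familiar]
    if not familiar:
        return 0.0
    return sum(r.trusts for r in familiar) / len(familiar)

responses = [
    SurveyResponse("Wall Street Journal", familiar=True, trusts=True, reads=True),
    SurveyResponse("Wall Street Journal", familiar=True, trusts=True, reads=False),
    SurveyResponse("Fringe Blog", familiar=True, trusts=True, reads=True),
    SurveyResponse("Fringe Blog", familiar=True, trusts=False, reads=False),
]
print(broad_trust(responses, "Wall Street Journal"))  # 1.0: trusted even by non-readers
print(broad_trust(responses, "Fringe Blog"))          # 0.5: trusted mainly by its readers
```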
Zuckerberg trusts the curatorial power of Facebook’s algorithm to show users “meaningful” content
Zuckerberg stressed that Facebook is outsourcing this algorithmic weeding-out of untrusted news to Facebook users, canvassing them to find out what they want — despite the cries of critics who claim such an approach is untenable at best and potentially disastrous at worst. This is, incidentally, how the company did away with clickbait, according to Zuckerberg: it developed “panels of hundreds or thousands of people” to help train its algorithms not to rank things based solely on likes and clicks.
“We try to design algorithms that just map to what people are actually telling us is meaningful to them,” Zuckerberg said.
He also noted, in response to a question from Klein about whether the changes would obscure new media outlets and smaller publications, that the changes he’s implemented to News Feed aren’t intended to be sweeping — at most, they’ll shift how widely a given piece of content is seen by about “20 percent”:
I think it’s important to keep in mind that of all the strategies that I just laid out, they’re made up of many different actions, which each have relatively subtle effects. So the broadly trusted shift that I just mentioned, it changes how much something might be seen by, I don’t know, just call it in the range of maybe 20 percent.
But it’s not going to make it so that you can’t share what you think, that if someone wants to have access to your content that they’re not going to get at it. What we’re really trying to do is make it so the content people see is actually really meaningful to them.
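One way to read that “maybe 20 percent” figure is as a bounded adjustment: the trust signal is just one factor among many in ranking, capped so it can swing a post’s distribution by roughly a fifth in either direction. The formula and cap below are assumptions for illustration, not Facebook’s ranking code.

```python
def adjusted_score(base_score: float, trust: float, max_shift: float = 0.20) -> float:
    """Scale a ranking score by up to +/- max_shift, given trust in [0, 1].

    trust = 0.5 is neutral; 1.0 boosts by max_shift; 0.0 demotes by max_shift.
    """
    multiplier = 1.0 + max_shift * (2.0 * trust - 1.0)
    return base_score * multiplier

print(adjusted_score(100.0, trust=1.0))  # 120.0 -- broadly trusted source
print(adjusted_score(100.0, trust=0.0))  #  80.0 -- widely distrusted source
```

A capped multiplier like this matches Zuckerberg’s framing that content is shown somewhat more or less, rather than blocked outright: even at trust = 0, the post still circulates, just to a smaller audience.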
Zuckerberg went on to clarify that the type of “meaningful” content the company is banking on boils down to content that fosters relationship building and interactive engagement.
“You can break Facebook and social media use into two categories,” he said. “One is where people are connecting and building relationships, even if it’s subtle. ... The other part of the use is basically content consumption.”
Zuckerberg, apparently referring to research Facebook has conducted, claimed that “the things that are about interacting with people and building relationships end up being correlated with all of the measures of long-term well-being that you’d expect, whereas the things that are primarily just about content consumption, even if they’re informative or entertaining and people say they like them, are not as correlated with long-term measures of well-being.”
“So this is another shift we’ve made in News Feed and our systems this year,” he said. “We’re prioritizing showing more content from your friends and family first, so that way you’ll be more likely to have interactions that are meaningful to you and that more of the time you’re spending is building those relationships.”
Though Facebook’s research suggests that more active engagement is correlated with happiness, numerous academic studies over the years have found that increased Facebook use corresponds to an increase in depression, low self-esteem, and general unhappiness, regardless of whether one’s usage of the site is passive or active. So don’t expect a new and improved News Feed to result in an immediate boost in happiness — even by just 20 percent.
You can listen to Zuckerberg’s full appearance on The Ezra Klein Show here.