Welcome to Mossberg, a weekly commentary and reviews column on The Verge and Recode by veteran tech journalist Walt Mossberg, executive editor at The Verge and editor at large of Recode.
Totally false news isn’t a new thing in the United States. In our fourth presidential election, in 1800, two of our most brilliant founders — John Adams and Thomas Jefferson — faced off in a vicious campaign that involved newspaper editors on the take and numerous false, often personal attacks. Some historians even claim that partisans for Adams spread the rumor that Jefferson was dead. (He won anyway.)
But they didn’t have Facebook to present, amplify and repeat those falsehoods instantly to millions of people. And that’s why the fake news problem is so serious, even outside the context of a presidential election.
Back in May, the Pew Research Center found that roughly 44 percent of the U.S. adult population got at least some of its news from Facebook. And that was before the general election. There’s nothing inherently wrong with this. Many, if not most, news organizations — old and new, big and small (including this one) — post stories and videos on the social network. And readers and viewers are moved to share stories, whether publishers have embraced the platform or not.
But that puts a heavy responsibility on Facebook to make sure it’s not helping to spread outright lies masquerading as news or publishing the output of made-up news organizations. Yet that’s exactly what happened during the 2016 presidential campaign. In the best-known example, BuzzFeed discovered that over 100 mostly pro-Trump fake news sites in a single town in Macedonia were pumping out false “news” on Facebook in an effort to make money from ads.
Since then, Facebook CEO Mark Zuckerberg has posted two long statements on the social network. On Nov. 12, while he said “we don’t want any hoaxes on Facebook,” he also said it was “extremely unlikely hoaxes changed the outcome of this election.” But that was a weaselly excuse. Facebook has done controversial experiments to investigate whether the News Feed can affect emotions — surely fake news can affect beliefs as well.
A week later, in the second post, he got more detailed and outlined a series of steps the company was working on. These included better detection of fake news, an easier way for users to report it, and possibly flagging suspect stories with warning labels.
(Oddly, both posts briefly disappeared Tuesday. Shortly after The Verge reported that they were gone, they returned and the company said it was due to a system error.)
In both posts, Zuckerberg stressed the difficulty of deciding what was true or false, what was legitimate opinion or fact, and the need to balance dealing with fake news with protecting freedom of speech.
I agree that these considerations, and others, make this a delicate problem to solve. I especially agree that free speech and the right to opinions, on politics and everything else, must be protected — whether they are popular or not — as long as they aren’t hate speech.
But I am also convinced that Facebook has the financial, technical and human resources to ferret out and totally block almost all fake news and hate speech, both of which it says it wants gone from its service. It’s a company that earned nearly $3 billion just last quarter, and which is reportedly building a tool capable of preventing controversial content from appearing in its News Feeds in countries like China.
Yet the Zuckerberg posts suggest that, while the company is working to better detect fake news, it’s still hoping to rely on the all-too-common Silicon Valley belief that the wisdom of the crowd, plus third-party input, will save the day.
“We do not want to be arbiters of truth ourselves,” Zuckerberg says, “but instead rely on our community and trusted third parties.” Thus, among the ideas he lists for banishing fake news are those labels, that easier user reporting of fake news, and making fake news economically less enticing for its creators. (The company did bar known fake news sites from its ad networks, as did Google.)
But Facebook isn’t just a technology platform where news happens to be published, along with baby pictures, vacation bragging and amateur sports commentary. It’s clearly a media company. It is now publishing articles and videos directly from a host of news organizations, including The Verge. Including this very column. These are encoded in a special way to work best on Facebook, and there are business terms behind the practice. Increasingly, people read news on Facebook and never even visit the originating site or publication.
Hell, even those Macedonian teens understood that Facebook was a media company. They made up fake media organization names from which to post. (Really, Facebook, you weren’t even a little suspicious about DonaldTrumpNews.co?)
So, yes, in my view, Facebook has a direct responsibility to get rid of fake news, and it cannot simply rely on its audience or others to shoulder the burden. I’m happy to see tools made available to readers that help report such trash, and happy that Facebook is working with third-party fact checkers. But the ultimate responsibility is Facebook’s.
Nobody wants Facebook to tinker with legitimate news and opinion — again, except for hate speech. But getting rid of purely fake news from purely fake sources is an eminently achievable task, especially for a huge, well-funded, tech-savvy media company serving nearly two billion people.
I’m encouraged that the Nov. 19 Zuckerberg post says the company wants to “detect what people will flag as false before they do it themselves.” But again, I think Facebook needs to step up and take direct responsibility for expunging fake news, not just label it or give it less weight in the News Feed.
Facebook might even consider hiring a distinguished, non-partisan editor and a small staff to help in the effort. The company abandoned such human input in its little-known Trending box after conservatives complained that right-leaning stories were being culled. But if weeding out verifiably fake news — conservative or liberal or whatever — angers some users, that’s the price of being a news platform, even if it slightly affects growth. It’s the right trade-off.
All of this would mean Facebook would have to act like the media company it has become and stop pretending.
The time for pretending is over.
This article originally appeared on Recode.net.