Facebook is going to get more politically biased, not less


Gizmodo's report that Facebook's "trending" box was curated by a liberal-leaning staff that suppressed stories from conservative outlets hit like a bomb. The revelation has been greeted with outrage, and rightly so: "Trending" shouldn't be an ideological concept.

The furor grew loud enough that Mark Zuckerberg has now responded. "In the coming weeks," he promised, "I'll also be inviting leading conservatives and people from across the political spectrum to talk with me about this and share their points of view. I want to have a direct conversation."

The CEO's attention to the problem of biased curators should be enough to solve it. But here's the truth: Facebook — at least as people really experience it — is likely to get more biased, not less, as a result of this controversy. The reason is simple: Facebook's paid curators aren't nearly as important as its unpaid curators. And its unpaid curators don't care what Mark Zuckerberg wants.

Online news and echo chambers

To see why, it's worth looking at a recent study of digital news junkies. Seth Flaxman, a statistician at the University of Oxford, picked through the web-browsing histories of 50,000 Americans who regularly read online news. The purpose wasn't mere prurience. Flaxman was testing the prevalence of "echo chambers": the widely bemoaned result of liberals and conservatives only consuming news they find congenial.

Behind the panic over echo chambers are three simple observations. First, people like news that confirms the opinions they already hold. Second, the internet makes it easier for people to find news that confirms the opinions they already hold. Third, the algorithms running in the background of Google's search engine and Facebook's News Feed are constantly trying to serve users content — which includes news — they will like, and that will mean serving them news they already agree with.

This is a theory of how partisans lose hold of a common reality. Before the web, most people got information from newspapers and nightly newscasts that strived for objectivity, and so presented them with a range of opinions. It was possible to cocoon yourself inside an echo chamber, but you really had to work at it.

Then came cable news and the early internet and access to countless news sources, and constructing an echo chamber became easier. But you still had to build it yourself — you had to consciously seek out sources that flattered you while avoiding unwanted opinions.

But now we have personalized search results, handcrafted Twitter feeds, and a Facebook algorithm based on likes. Now you can end up in an echo chamber without even knowing it.

Audiences are often more biased than outlets

Flaxman shows the theory is mostly correct. People tend to read publications they agree with, and that's more true when they find articles on Facebook than when they directly visit news sites.

But when you dig into the guts of Flaxman's study, you find something interesting. There's no consensus measure of outlet ideology, so he uses the ideology of readers — as measured by which party they supported in the last presidential election — as a stand-in. But when you look at the ideology of readers, you find that outlets that try very hard to be unbiased often have extremely biased audiences.
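To make the proxy concrete, here is a minimal sketch of how such a measure could be computed. This is not Flaxman's actual code, and the reader lists are invented for illustration: each reader is scored -1 or +1 by party, and an outlet's estimated ideology is simply its readers' average lean.

```python
# Hypothetical sketch of an audience-based ideology proxy (not Flaxman's code).
# Each reader is -1 (supported the Democrat) or +1 (supported the Republican)
# in the last presidential election. The reader data below is invented.
readers = {
    "economist.com": [-1, -1, -1, +1, -1],
    "dallasnews.com": [+1, +1, -1, +1, +1],
}

def outlet_score(leans):
    # Mean reader lean: -1.0 = uniformly liberal audience,
    # +1.0 = uniformly conservative audience.
    return sum(leans) / len(leans)

for outlet, leans in readers.items():
    print(outlet, outlet_score(leans))
```

Under a measure like this, an outlet's "ideology" is entirely a property of who reads it, not of what it publishes, which is what produces the surprises below.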

The Economist's audience, for instance, codes it as one of the most liberal publications in Flaxman's sample — more liberal even than Daily Kos, which is a site devoted to making the case for American political liberalism. Similarly, under this measure, the Dallas Morning News, a traditional newspaper with traditional newspaper values (top story as of Friday morning: "Frisco hospice targeted in FBI raid 'overmedicated' patients, state records say"), ends up to the right of Breitbart (top story as of Friday morning: "Obama goes trolling: dictates school rules for trans children").

Part of this is mere regionalism. Sites associated with newspapers from the Northeast, like the New York Times and NJ.com, code as extremely liberal, while sites associated with newspapers from crimson areas of the country, like the Salt Lake Tribune and the Kansas City Star, code as extremely conservative.

But the takeaway remains clear: Media outlets are often less ideologically polarized than their readerships. The editors of the New York Times or the Dallas Morning News or the Salt Lake Tribune are creating a publication more neutral than their audience would probably prefer. This has never protected them from (often accurate) charges of political bias, of course. But in the aggregate, they were playing a moderating role, and, until recently, their audience didn't have anywhere else to turn for daily news.

Now, of course, they have many options, but one dominates above all. Now they can turn to Facebook.

Facebook's most biased curator is you

The reaction to Gizmodo's article should ensure neutrality on the part of any future curators Facebook hires to oversee trending news. The same can't be said about your Facebook feed's most important curator: you.

Like the newspapers before it, Facebook does not want to be known as politically biased. "We believe the world is better when people from different backgrounds and with different ideas all have the power to share their thoughts and experiences," Zuckerberg wrote. Facebook, after all, wants to be a utility used by everyone. A perception of bias is bad for business.

But its users very much want to be known as politically biased. People like and share articles that align with their identities. The stories we push to our friends make statements about who we are: They show that we care about LGBTQ rights, or believe in the Second Amendment, or think Donald Trump a fool, or believe Hillary Clinton a liar. Facebook's algorithm uses those shares and likes to build a rough model of each user's identity — political as well as nonpolitical — and surface yet more stories that align with it.
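In ranking terms, that feedback loop is easy to sketch. What follows is a toy illustration, not Facebook's actual system: represent each user and story as a vector of topic affinities (the axes here are invented), rank stories by similarity to the user's like history, and fold each new like back into the profile.

```python
# Toy sketch of engagement-based feed ranking, NOT Facebook's algorithm.
# Users and stories are vectors of affinities along invented axes:
# [gun rights, LGBTQ rights, pro-Trump].

def score(user_profile, story):
    # Dot product: higher when the story aligns with past likes.
    return sum(u * s for u, s in zip(user_profile, story))

def rank_feed(user_profile, stories):
    # Most congenial stories float to the top of the feed.
    return sorted(stories, key=lambda s: score(user_profile, s), reverse=True)

def register_like(user_profile, story, rate=0.1):
    # Liking a story pulls the profile toward it -- the echo-chamber loop.
    return [u + rate * s for u, s in zip(user_profile, story)]

user = [0.8, -0.2, 0.6]
stories = [
    [1.0, 0.0, 0.5],    # Second Amendment story
    [-0.5, 1.0, -0.8],  # LGBTQ rights story
]
feed = rank_feed(user, stories)
user = register_like(user, feed[0])  # each like makes the next feed more congenial
```

Nothing in a loop like this asks whether the top of the feed is balanced; it only asks whether it will be liked.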

The bad press Facebook has received for political bias in recent days is likely to push it away from human curation and toward yet more algorithmic curation. The irony is that this will make Facebook more of an echo chamber, not less of one. Facebook's human curators are under pressure to present both sides, but its algorithmic curators are not.

As Ben Thompson writes, there's a good argument to be made that "an authoritative news module from Facebook would actually be a civic benefit," as it might surface stories that don't align with users' political identities and that they wouldn't otherwise see. Facebook as an institution is almost certainly less biased than its users are.

But it's not at all clear why pushing against user bias would be good for Facebook's business. The reason Facebook created an echo chamber in the first place is that an echo chamber is what people actually want.

