By now you've probably heard about the controversial Facebook study in which the company altered the News Feeds of 689,003 users for a week in 2012 to determine whether seeing more happy or sad posts affected the emotional content those users posted next. There's been an enormous backlash: the study itself seems particularly dumb, there's a chance Facebook acted unethically or illegally by not disclosing the study to users, the researchers involved are apologizing, and in general it seems like Facebook did something really, really bad.
But here's the thing: manipulating the News Feed is Facebook's entire business.
The conventional wisdom about ad-supported free services like Facebook and Google is that if you're not paying for anything, then you are the product being sold, presumably to advertisers who can't wait to put a targeted ad in front of your face. Under the most cynical reading, that means you're a sucker; no one wants to be the product. It's also a useful rallying cry against the perceived mystery of what giant internet corporations are doing with all their Big Data; it casts the unknowable as malicious, the unspoken as evil. If we're the product then we're being robbed; it's easy (and fun!) to argue that no one gets enough value out of Facebook to even out all the dirty profits Mark Zuckerberg is surely making.
But that line of thinking is pretty superficial — it ignores both the value people do get out of Facebook and the actual way Facebook makes money.
And it's the way Facebook makes money that's tremendously important here. Google makes advertisers bid against each other to display ads that appear when you search for certain keywords. Facebook does something a fair bit simpler: it just doesn't show users everything in their News Feeds. Of the 1,500 potential items your friends will share on Facebook in a given day, you'll likely only see 300 of them — and if an advertiser or marketer or news organization wants to get more eyeballs from Facebook, they can pay to make sure their stuff shows up in your News Feed, carefully targeted to keywords and demographics. Compare that to Twitter, which firehoses everything your friends share at you in real time. It's better for news junkies, but Facebook can make promises about how many and what sort of people will see something that Twitter can only dream about.
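The filtering described above can be sketched as a toy ranking function. This is purely illustrative: the field names, the `affinity` scores, and the `paid_boost` weight are assumptions for the sketch, not Facebook's actual (non-public) ranking signals — the point is only that a finite number of slots plus a purchasable score bump is all it takes for paid content to reliably make the cut.

```python
import random

def rank_feed(stories, slots=300):
    """Return the subset of candidate stories that make the feed.

    Each story is a dict with an organic 'affinity' score (a stand-in
    for how much the viewer engages with that friend or page) and an
    optional 'paid_boost' an advertiser has bought. Names and weights
    here are hypothetical, not real ranking signals.
    """
    def score(story):
        return story["affinity"] + story.get("paid_boost", 0.0)

    # Only the top-scoring stories fit in the feed's limited slots.
    return sorted(stories, key=score, reverse=True)[:slots]

# ~1,500 organic candidates with random affinity...
candidates = [{"id": i, "affinity": random.random()} for i in range(1500)]
# ...plus one promoted post with middling affinity but a paid boost
# large enough to outrank any organic story.
candidates.append({"id": "sponsored", "affinity": 0.2, "paid_boost": 1.0})

feed = rank_feed(candidates)
print(len(feed))                                   # 300 — the slot limit
print(any(s["id"] == "sponsored" for s in feed))   # True — the boost worked
```

Four out of five organic stories never get shown, while the boosted post is guaranteed a slot — which is exactly the promise Facebook can sell and a real-time firehose like Twitter's can't.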
Make no mistake, this is a huge business for Facebook that puts it at the forefront of the entire tech industry. Every other web company (including Google) is struggling to figure out how to turn desktop ad revenue into mobile ad revenue, but Facebook's News Feed integration has been an instant hit on smartphones, with 59 percent of Facebook revenue now coming from mobile. There are no banner ads in the Facebook iPhone app, no crappy interstitials or weird redirects to the App Store. There are just a few sponsored entries mixed into the usual endless scroll of stuff your friends have shared.
And that's the key — stuff your friends have shared. There's so much value in showing you things from your friends on Facebook that it's the revenue model for an entire class of viral media startups. When Taco Bell pays BuzzFeed to write "13 Things Haters Are Always Going to Hate," part of the money is explicitly earmarked towards buying Facebook traffic: i.e., guaranteeing that when people share the story it appears on their friends' timelines. Taco Bell gets to send you a message with your friend as the courier: haters gonna hate, and you should live life and love Taco Bell's Loaded Grillers.
Taco Bell pays BuzzFeed to create shareable advertising, and then pays Facebook to tweak the News Feed to make sure that advertising shows up when it's shared. And since the point of advertising is to create an emotional relationship between you and a product, it's not at all unfair to say that Taco Bell's paying Facebook to manipulate your emotions by changing the News Feed.
But no one's mad about Taco Bell buying ads or promoted posts in Facebook's News Feed, even though ads are designed to change our emotions. (Just wait until election season really heats up.) What we're mad about is the idea of Facebook having so much power we don't understand — a power that feels completely unchecked when it's described as "manipulating our emotions." Advertisers paying to change our feelings about products feels clean and familiar; Facebook screwing with your mind just to see what happens feels like playing god. But in the grand scheme of the internet economy, Facebook running a badly-designed test on a tiny fraction of its billion-plus users for one week in 2012 is virtually meaningless when the company is building an entire business around #brand #engagement with #audiences every other day of the year.
It's true that Facebook is getting a little too good at apologizing for creepy behavior, as Mike Isaac just wrote in The New York Times. The company doesn't seem to know where its users will draw the line. But a big reason for that disconnect is everyone's totally confused about how our data actually turns into money. If Facebook wants to stop apologizing, it had better start being a lot clearer about what it's actually selling.