Facebook’s “I Voted” sticker was a secret experiment on its users

If you live in the US and logged into Facebook this morning, you probably saw something like this:

[Image: Facebook's "I Voted" button]

You might think it's just a decorative gimmick — a cute electronic version of the "I Voted" sticker you get upon exiting the polling station. And Facebook says it's just doing its civic duty by encouraging its users to vote.

But there's a lot more to it than that, as Mother Jones reported last Friday. For the last few elections, Facebook has been running a series of quiet but massive experiments on you to see if it can make you more likely to vote. And it looks like its encouragement is working.

The Facebook "I Voted" sticker might be boosting voter turnout

Voter turnout in midterm elections, like this year's, is typically a lot lower than turnout in presidential elections. But the 2010 midterm elections saw many more voters than had shown up for the previous midterms, in 2006. According to a study done by Facebook scientists and published in Nature, that "I Voted" sticker on Facebook could be part of the reason.

Facebook offered the "I Voted" sticker to most of its users in 2010 — but not all of them. A few hundred thousand users just didn't see any sticker at all, and a few hundred thousand more got the sticker but no information about whether their friends had clicked it.

It turns out that people were a little more likely to vote — and definitely more likely to tell Facebook they'd voted — if they saw their friends had voted too. Eighteen percent of people who didn't see a list of friends who'd voted clicked on the "I Voted" sticker; 20 percent of people who did see the list of friends clicked on it. And users who saw both the sticker and the list of friends were slightly more likely (about 0.6 percent) to actually go to the polls than users who didn't see anything.

According to the Facebook scientists, the sticker — and the peer pressure — caused 340,000 more people to vote in the 2010 midterms. Their study claims that "a single message on Facebook" was a major factor in the increase in turnout in 2010.
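As a rough sanity check on that 340,000 figure, the numbers in the study can be multiplied out. This is a back-of-envelope sketch, and the user count is an assumption (the group shown the social message in the Nature study was on the order of tens of millions of people), not a number reported in this article:

```python
# Back-of-envelope check on the claimed turnout effect.
# Assumption (not from the article): roughly 60 million users
# saw the "I Voted" sticker along with the list of friends.
users_shown_social_message = 60_000_000

# The article cites a ~0.6 percentage-point difference in actual
# turnout between users who saw the sticker plus the friends list
# and users who saw nothing.
turnout_lift = 0.006

extra_voters = users_shown_social_message * turnout_lift
print(f"Implied extra voters: {extra_voters:,.0f}")
```

Under those assumptions the estimate lands in the same ballpark as the 340,000 extra voters the researchers report.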

Facebook's not running any voter-button experiments in 2014. But if turnout's down this year, despite the Facebook stickers, it might be harder for Facebook to take credit.

Pumping more politics into the News Feed

In 2012, Facebook tried a slightly different experiment, starting way before Election Day.

A few months before the election, Facebook shuffled the way stories appeared on about 2 million people's feeds. Instead of, say, engagements and baby photos automatically floating to the top of the page, the top slots went to news articles being shared by friends. Because of the timing, a lot of those news articles were about the 2012 campaign.

Facebook's research says that shuffle made a big difference — especially for occasional users who might not scroll down their feeds. After the election, the occasional Facebookers who'd seen more news in their feeds said they paid more attention to government than the ones who'd seen less. And while 64 percent of occasional users with normal feeds said they'd voted in the 2012 election, 67 percent of users with news-boosted feeds said they had.

Is manipulating voters a good thing?

There are serious caveats to the 2012 study, because self-reporting isn't always accurate. But if Facebook really has been able to boost turnout in the last two national elections, it wouldn't be the first time it's succeeded at manipulating users' behavior. Earlier this year, Facebook got into a lot of trouble when it revealed it had experimented with which emotions showed up in people's news feeds, and whether the emotions people saw affected what they posted themselves.

It seems less controversial for Facebook to try to get users to vote than it does for Facebook to try to make users happy or sad. That's especially true because — unlike Facebook's voter engagement data — analysts assume Facebook's work with emotions has the ultimate goal of making users happy or sad about a particular advertiser or product. (Facebook emphasizes that it doesn't use voting data for advertising, or share it with advertisers.)

But as Nilay Patel pointed out for Vox this summer, Facebook's behavior isn't that different from ads telling us what to buy or who to vote for:

Taco Bell pays BuzzFeed to create shareable advertising, and then pays Facebook to tweak the News Feed to make sure that advertising shows up when it's shared. And since the point of advertising is to create an emotional relationship between you and a product, it's not at all unfair to say that Taco Bell's paying Facebook to manipulate your emotions by changing the News Feed.

But no one's mad about Taco Bell buying ads or promoted posts in Facebook's News Feed, even though ads are designed to change our emotions. (Just wait until election season really heats up.) What we're mad about is the idea of Facebook having so much power we don't understand, a power that feels completely unchecked when it's described as "manipulating our emotions." Advertisers paying to change our feelings about products feels clean and familiar; Facebook screwing with your mind just to see what happens feels like playing god. But in the grand scheme of the internet economy, Facebook running a badly designed test on a tiny fraction of its billion and a half users for one week in 2012 is virtually meaningless when the company is building an entire business around #brand #engagement with #audiences every other day of the year.

UPDATE/CORRECTION: Facebook has confirmed that they're not using the voting button in any user experiments this year; the headline of this piece has been corrected to reflect that, and the text of the piece has been clarified. The article has also been updated with more information about the purposes of Facebook's voting data.
