Facebook Will Stop Playing With Your Emotions (Sort Of)

The social network changed its research guidelines and established a review board in the wake of a much-maligned study.

Facebook updated its research guidelines Thursday, a policy change that comes three months after news surfaced that the company had purposely manipulated some users' emotions in a 2012 study. The result was a slew of pissed-off users, and Facebook hopes these changes will prevent similar issues in the future.

Facebook is always testing something in its attempt to improve the service, and Thursday’s update confirms the company will continue doing research. Facebook has lots of user data (lots and lots and lots), and it uses that information to change products, like Messenger or News Feed, and experiences, like which posts/ads you might see.

So what will Facebook do differently moving forward?

The company has essentially added a new layer of checks and balances. Previously, when groups within Facebook conducted research, the plans were approved by the leaders of those same groups. News Feed research, for example, was discussed and approved within the News Feed team, according to a spokesperson.

Now, Facebook has added a more expansive internal review panel that includes senior members from teams across the company. If a group plans to conduct research on a sensitive topic, for example a study of a specific group of users (women of a particular age) or research relating to “content that may be considered deeply personal (such as emotions),” that research must be reviewed by the panel.

Facebook isn’t sharing names of individuals on the panel, but heads of different areas — communications, marketing, policy, etc. — will be on it, according to a spokesperson. No outside academics will sit on the panel, but Facebook consulted a number of professional researchers while assembling the group, the spokesperson added.

The idea is that opening the review and approval process to a wider range of employees will keep internal teams from moving forward with research that may not align with Facebook’s goals (at least the goals it wants users to understand).

Facebook also says that, beginning this week, research practices have been added to its six-week training program for new engineering hires, and that all employees will learn about the company’s research practices during the annual privacy and security training Facebook requires.

In regard to the manipulation study, which came to light in June, Facebook has previously apologized for failing to communicate better with users about the research, and it reiterated that apology in a blog post Thursday. “We should have considered other non-experimental ways to do this research,” wrote Mike Schroepfer, Facebook’s CTO. “The research would also have benefited from more extensive review by a wider and more senior group of people.”

That second part — the more extensive review — is what Facebook’s new policy should fix. The company has also added a new website where all published academic research will live in the future.

This article originally appeared on Recode.net.
