Facebook would like to clear a few things up.
On Sunday, the social network released a new, much longer version of its Community Standards, the guidelines for what is allowed and accepted on the network in regard to issues like nudity, bullying and spam.
Facebook didn’t change its position on any of these topics, Monika Bickert, Facebook’s head of global product policy, told Re/code. Instead, the company wanted to clarify its stance on topics that may have been vague in the past.
“These policies are blunt, sometimes more blunt than we’d like them to be,” said Bickert, who pointed out the challenge of writing one set of rules meant to apply to cultures all over the world. “We have to make our policies very objective and easy to apply so our reviewers around the world will efficiently and consistently reach the same result.”
The new version of the standards, which has been in the works for an entire year, does that for a number of categories.
Here are a few worth noting:
Authentic Identity
Facebook ran into issues last fall, after a group of drag queens protested the company’s removal of accounts that included their preferred stage name as opposed to their legal name. The company ultimately backed down, saying users can sign up with their “authentic identity,” not necessarily their legal name.
Facebook’s new identity section does a better job addressing this.
“There has been a lot of confusion from people who thought we were asking them to use what’s on their driver’s license,” said Bickert. “That’s not an accurate interpretation. We want people communicating using the name they actually use in real life.”
Hate Speech
Facebook has always listed the kinds of categories that qualify as hate speech, such as attacks on someone’s religion, race or ethnicity. Now, the hate speech section includes more detailed explanations of edge cases, such as satire or quoting others’ hate speech to raise awareness of an issue.
Content Removal
There are a number of reasons why Facebook might remove something from the site, and the new standards explain these in more detail. It’s worth noting that an item isn’t more likely to be removed just because lots of people report it. All reported items are reviewed in the same way, regardless of how many people flag them.
Facebook will also remove items in certain circumstances if they violate the laws of a specific country. That is a form of censorship, one that CEO Mark Zuckerberg has discussed and defended in the past.
New Sections
Again, Facebook says it hasn’t changed its stance on anything, but it did expand on certain rules enough to merit a few new sections. The Community Standards now include a section on “Sexual Violence and Exploitation,” as well as a section on what happens to accounts when the account owner dies.
Facebook recently added an option that lets users assign a legacy contact to their account, essentially someone who will manage the account once a user passes away.
This article originally appeared on Recode.net.