In the 48 hours since two violent videos were broadcast on Facebook — one of a black man dying in Minnesota after being shot by a cop and another of a shootout with police during a protest in Dallas — the company has been peppered with questions about how it monitors, distributes and censors live footage of people being killed.
It’s new terrain for Facebook, and much of the internet, to be honest. And Facebook hasn’t provided a lot of answers.
Late Friday afternoon, the company published a document called “Community Standards and Facebook Live,” a post briefly outlining what is allowed and banned from Facebook when it comes to violent videos.
Here’s the key part of the post, which also includes a section on reporting graphic videos (Facebook says it has a team on call 24 hours a day to review such reports):
One of the most sensitive situations involves people sharing violent or graphic images of events taking place in the real world. In those situations, context and degree are everything. For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.
But while the post outlining the guidelines is new, the standards themselves are not. Facebook’s head of policy, Monika Bickert, told us something similar two years ago when we asked about violent videos on the platform.
Facebook says it applies the same standard to graphic content in live video that it applies to all other content. Which is kind of the problem, because live video isn’t like most other content.
Still, the new post doesn’t really answer questions about how Facebook will handle violent videos in the future. Or whether it even has a plan for live video scenarios that have not yet played out (many of which we hope never do).
Facebook has long been criticized for what many see as its lack of transparency. The latest incidents spotlight the fact that the world’s largest social network is grappling with the challenges of governing what is effectively a virtual state. The questions around what it does and does not allow underscore how vague its rules are for many of its members. Israeli officials recently blasted the company, calling Facebook a “monster” and claiming that it has failed to assist Israeli police in efforts to combat posts that the government believes incite violence or terror.
Facebook’s stance on how it handles violent content — and its transparency around why some things stay and others go — will only become more important as incidents like those in Dallas and Minnesota play out in other parts of the world.
This article originally appeared on Recode.net.