A gunman took to Facebook on Sunday to share a video of himself killing a man on the streets of Cleveland.
The video was initially thought to have been filmed using Facebook Live, the social network’s video broadcasting feature, though a company spokesperson later confirmed that the killer recorded the video and uploaded it after the fact. He did use Facebook Live later in the day, though, to talk about the killing.
Live or not, the video was a shocking and troubling reminder that Facebook does not yet have the kind of technology necessary to automatically detect this kind of violence, though it is working on it. Instead, the company relies on users to flag these posts before they’re reviewed, which takes time and subjects its community to disturbing and violent videos.
That gap between posting and detection has, in a small number of cases, made Facebook Live a perverted tool for some to broadcast violence and abuse. In other instances, it has been a way for people to capture horrific scenes by accident, including multiple killings.
Facebook condemned the video in a statement, calling the act a “horrific crime,” and added that it’s working with local law enforcement.
But these horrific crimes get a lot of attention, and it’s time to ask if Facebook Live is worth the pain of exposing users to murder and torture.
It’s tough to imagine how Facebook Live will make any money for the company outside of publisher and celebrity videos. Advertisers won’t be keen to put their video ads before or alongside live videos created by random Facebook users, and most live video content isn’t great, even the stuff that Facebook pays its partners to produce.*
Facebook’s most-watched live video, Chewbacca Mom, was great, but not necessarily because it was live. The majority of the video’s 166 million views came after the live broadcast was over.
If Facebook limited the live broadcasting feature to selected partners, it wouldn’t need to worry about incidents where people accidentally broadcast a live murder, or talk about a murder they just committed. To be sure, in the Cleveland incident, limiting Live wouldn’t have prevented the video of the killing, but emphasizing vetted partners would set a more professional tone for Facebook video altogether.
Is this a drastic proposal? Yes. The vast majority of Facebook livestreams may be dull, but they are free of hatred and violence. No one thinks we should shut down the internet because it has dark corners, and no one is clamoring for Twitter to shut down because people use it to bully each other online.
There are times when Facebook Live and other streaming services offer an incredible glimpse into otherwise closed-off environments, like when Senator Elizabeth Warren was cut off on the Senate floor while trying to criticize Jeff Sessions, now the Attorney General, during the debate before his confirmation. Warren took her message to Facebook instead.
Facebook is building artificial intelligence to detect the content of user videos, but CEO Mark Zuckerberg said in February that the effort was still “very early in development.” Detecting the content of live videos as they stream is likely even further off.
Postponing Facebook Live until the technology is ready to keep up should be an option.
* Vox Media, which owns this site, is a Facebook publishing partner.
This article originally appeared on Recode.net.