[Photo: Mark Zuckerberg talking about his letter to the community at Facebook's internal quarterly all-company meeting]
Facebook is using its image recognition software to keep users from sharing revenge porn on its services, including Facebook, Instagram and Messenger, according to a Wednesday morning post by CEO Mark Zuckerberg.
“It's wrong, it's hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared,” he added.
Revenge porn is an intimate or sexually explicit image or video shared online without the subject's consent, usually by a former spouse or partner, with the intent of harassing and embarrassing that person.
Facebook’s plan here is slightly vague, but it sounds like the company will build a database of reported images whose fingerprints its systems can match against new uploads and block automatically across its different apps. Tech companies do something similar to fight the spread of child pornography. We’ve asked Facebook for clarity and will update once we hear back.*
Revenge porn is an issue in many corners of the internet, but Facebook’s move comes just a few months after reports that hundreds of U.S. Marines were using a Facebook group to share photos of fellow service members. That group was shut down, but the activity has since moved to Snapchat, according to BuzzFeed.
* Update: It turns out Facebook shared more about these efforts in a blog post. According to the post, users can report an inappropriate image, which will be reviewed by a human on Facebook’s community operations team. If the image violates Facebook’s community standards, the company will use “photo-matching technologies” to block people from sharing that same image in the future.
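Facebook hasn’t published the details of its photo-matching technology, but the general idea behind systems like this can be illustrated with a perceptual hash: reduce each reported image to a compact fingerprint, then compare new uploads against a blocklist of fingerprints so that re-encodes and minor edits still match. The sketch below is only an illustration of that concept, not Facebook’s actual method; the average-hash algorithm, file names and matching threshold are all assumptions for the example.

```python
# Minimal sketch of photo-matching via perceptual hashing (average hash).
# This is NOT Facebook's published algorithm; it illustrates the concept.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid, then set one bit per
    pixel depending on whether it is brighter than the grid's mean,
    yielding a 64-bit fingerprint that survives resizing and re-encoding."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")


def is_blocked(path: str, blocked_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an upload whose fingerprint is within `threshold` bits of any
    previously reported fingerprint, catching near-duplicate copies."""
    h = average_hash(path)
    return any(hamming_distance(h, b) <= threshold for b in blocked_hashes)


if __name__ == "__main__":
    # Hypothetical file names, purely for demonstration.
    blocked = {average_hash("reported_image.jpg")}
    print(is_blocked("new_upload.jpg", blocked))
```

In a real deployment, a system like this would run at upload time, after a human reviewer has confirmed the reported image violates policy, so only vetted fingerprints ever enter the blocklist.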
“We're focused on building a community that keeps people safe. That means building technology and AI tools to prevent...”
— Mark Zuckerberg, in a Facebook post on Wednesday, April 5, 2017
This article originally appeared on Recode.net.