How “predictive algorithms” are being abused for nonconsensual porn

Listen: People are manipulating artificial intelligence to put women’s faces into fake pornographic videos without their consent.

A technology called “predictive algorithms” can learn what your face looks like and then predict what it would look like in other situations. Unsurprisingly, some people are using it to map celebrities’ faces onto the bodies of porn stars having sex. Real people, fake videos. And totally nonconsensual.
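For readers curious about the mechanics: the face-swapping tools behind these videos are commonly described as autoencoders that share one encoder (which learns a compact representation of a face) and train a separate decoder for each person. The sketch below is a minimal, illustrative PyTorch version of that idea; the layer sizes, training loop, and dummy data are invented for clarity and are not taken from any real deepfake tool.

```python
import torch
import torch.nn as nn

# Conceptual sketch: a shared encoder learns a compact code for any face,
# and one decoder per person learns to reconstruct that person's face
# from the code. "Swapping" means encoding person A's face and decoding
# it with person B's decoder. All sizes here are made up for illustration.

LATENT = 256
IMG = 64  # 64x64 RGB face crops

def make_encoder():
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(IMG * IMG * 3, 1024), nn.ReLU(),
        nn.Linear(1024, LATENT),
    )

def make_decoder():
    return nn.Sequential(
        nn.Linear(LATENT, 1024), nn.ReLU(),
        nn.Linear(1024, IMG * IMG * 3), nn.Sigmoid(),
        nn.Unflatten(1, (3, IMG, IMG)),
    )

encoder = make_encoder()
decoder_a = make_decoder()  # trained only on person A's face crops
decoder_b = make_decoder()  # trained only on person B's face crops

optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()

def train_step(faces_a, faces_b):
    """One reconstruction step: each decoder learns to rebuild its own person."""
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()
    return loss.item()

def swap_face(face_a):
    """The 'prediction' step: person A's expression rendered as person B's face."""
    with torch.no_grad():
        return decoder_b(encoder(face_a))

# Random tensors standing in for aligned face crops.
faces_a = torch.rand(8, 3, IMG, IMG)
faces_b = torch.rand(8, 3, IMG, IMG)
print(train_step(faces_a, faces_b))
print(swap_face(faces_a).shape)  # torch.Size([8, 3, 64, 64])
```

Because the encoder never sees which person a face belongs to, it learns generic features like pose and expression, which is what lets a decoder trained on one person render those features as someone else entirely.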

In the latest episode of Today, Explained, Vox’s Aja Romano tells host Sean Rameswaram how “deepfakes” are spreading across the internet, what it means for consent, and what platforms like Reddit are doing — or not doing — to stop it.

Later in the episode, Peter Eckersley, chief computer scientist at the Electronic Frontier Foundation, explores how the same technology could tear our society apart in bigger ways — namely by encouraging the spread of fake news.

Listen to the full episode of Today, Explained here:

How do I get even more Today, Explained?

You can get the news we’re reading throughout the day, facts and stats to make you smarter about the world, and behind-the-scenes photos on Twitter @Today_Explained. You can follow Sean @Rameswaram. You can follow Aja Romano @AjaRomano.

How do I report a problem?

For all issues or feedback, please email todayexplained@vox.com.

How do I listen?

If you don’t see the player above, you can listen to, subscribe to, and review Today, Explained on Stitcher, Apple Podcasts, and Google Play Music.

What if I want to listen at home?

If you have Amazon Echo, add Today, Explained to your flash briefing. If you have Google Home, just say, “Hey Google, play the Today, Explained podcast!”