Facebook’s plan to stop revenge porn may be even creepier than revenge porn

To halt revenge porn, potential victims may have to submit their own nudes to Facebook.

Aja Romano writes about pop culture, media, and ethics. Before joining Vox in 2016, they were a staff reporter at the Daily Dot. A 2019 fellow of the National Critics Institute, they’re considered an authority on fandom, the internet, and the culture wars.

Last year, Facebook, facing increased scrutiny over its decision-making, launched a program intended to combat a growing problem on the platform: revenge porn, also known as nonconsensual pornography.

There’s a catch, however — to keep others from using nude photos against you, you have to submit your own.

Facebook’s program launched experimentally in Australia in November 2017, and expanded on May 22, 2018, to the US, the UK, and Canada. It requires those who wish to avoid being victims of revenge porn to act preemptively, before they become targets — by submitting their own nude photographs to Facebook. The idea is that Facebook can then digitally fingerprint the submitted image and block its potential future spread.

But the potential problems with this procedure are numerous. Though members of the public objected to the basic idea of the program back in November, the announcement of its spread has renewed public scrutiny about its harmful potential — and for good reason.

How Facebook’s program works

The technology is called perceptual image hashing, and it’s been around for years. It’s the algorithmic process that lets us do things like reverse image searches on Google and TinEye; it also allows law enforcement to stop the spread of child exploitation through the use of PhotoDNA.
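To make the technique concrete, here is a minimal sketch of one of the simplest perceptual hashing schemes, the “average hash,” written in Python with the Pillow imaging library. It is only an illustration of the general idea; the function names are our own, and production systems like Microsoft’s PhotoDNA rely on far more robust algorithms.

```python
# Minimal illustration of an "average hash" perceptual fingerprint.
# Not Facebook's or PhotoDNA's actual algorithm.
from PIL import Image

def average_hash(path, hash_size=8):
    # Shrink to a tiny grayscale thumbnail so the hash reflects the image's
    # overall structure rather than its exact pixels or resolution.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average,
    # yielding a 64-bit fingerprint for the default 8x8 thumbnail.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    # Visually similar images produce hashes that differ in only a few bits.
    return bin(a ^ b).count("1")
```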

But in this instance, it’s not the technology that’s alarming — it’s the vagueness of Facebook’s proposal, and the fact that anyone submitting their photo to Facebook would still be vulnerable to potentially having that image be exploited.

The program is called the Non-Consensual Intimate Image Pilot, and as described by Facebook, it’s intended to be an “emergency option” for people who are worried their images might be shared in the future.

The process by which users of the pilot programs are being asked to submit their photographs is convoluted and laborious. Users first have to fill out a form through one of Facebook’s partner organizations in each of its four pilot countries. These include the Australian eSafety Commissioner’s office, the Cyber Civil Rights Initiative and the National Network to End Domestic Violence in the US, the UK Revenge Porn Helpline, and YWCA Canada. At that point, the partner agency and Facebook email the user a link that requires them to upload the image to a Facebook database over an encrypted connection.

The partner agency then notifies Facebook, which processes the image — by placing it in front of at least one pair of human eyes, those of “a specially trained representative from our Community Operations team.” Crucially, Facebook doesn’t store the nude images, only their digital image hash, which is basically a unique identifying code for each photo submitted. Facebook can then match future uploaded copies of the photo against that unique ID and block those copies from being distributed.

Once the image hash is obtained from the photo, a member of Facebook’s Community Operations team deletes the image from the Facebook server. A Facebook spokesperson confirmed to Vox that the team deletes the image from the database within a week of its submission; the entire process must be repeated for each and every photo submitted.
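In principle, the matching and blocking step then reduces to a fingerprint lookup. The sketch below reuses the average_hash and hamming_distance helpers from the earlier example; the in-memory set, function names, and distance threshold are assumptions made for illustration, not a description of Facebook’s actual infrastructure.

```python
# Hypothetical hash-based blocking, reusing the helpers sketched above.
# Only fingerprints are retained; the reported images themselves are not kept.
blocked_hashes = set()

def register_reported_image(path):
    # Fingerprint the reported photo; the file itself can then be deleted.
    blocked_hashes.add(average_hash(path))

def should_block_upload(path, max_distance=5):
    # Block a new upload if its fingerprint sits within a few bits of any
    # reported fingerprint, which tolerates recompression and resizing.
    h = average_hash(path)
    return any(hamming_distance(h, b) <= max_distance for b in blocked_hashes)
```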

“We’re constantly working to prevent this kind of abuse and to keep this content out of our community,” the spokesperson said. “We’re using technology to limit the spread of these photos and developing innovative ways to prevent them from being uploaded in the first place.”

There’s a lot that could go wrong with this

The program raised public alarm in November when it was first reported, but it seems many people missed the memo the first time around. A tangentially related article in the New York Times in May 2018, as well as the news that the program is now launching in three additional countries, including the US, has spurred a renewed public outcry.

This concern is well-founded. From a layperson’s perspective, there are multiple aspects of this program that suggest a potential for misuse or further harm to the victim.

Let’s enumerate them, shall we?

1) It’s invasive.

Asking a potential victim to examine her intimate photos and choose which of them has the potential to be used against her is the kind of thing that induces mental strife all on its own. The laborious process by which she is then required to self-report her own potential victimization to authorities, and wait for a response, all while knowing that an unknown “specially trained representative” from Facebook is examining her nudes, would only exacerbate this strife.

In essence, although the process is intended to ward off exploitation, the process itself is a form of exploitation.

2) We all know how great Facebook is at responsibly storing user data.

Facebook’s recent Cambridge Analytica scandal doesn’t inspire confidence that the company can effectively anticipate how a large-scale database of publicly submitted nudes might be used or manipulated.

3) Facebook has a history of failing to protect its users from revenge porn.

Facebook seems to be turning its attention to solving the problem of revenge porn in part because of its past failure to protect its users in this regard. In January 2018, the company reached a settlement agreement in a 2014 case in which a 14-year-old girl from Northern Ireland was blackmailed into providing a compromising photo of herself. The photo was then circulated widely on Facebook despite her family’s efforts to enlist the company’s help to remove copies of it. Police were unable to prosecute the alleged perpetrator of the crime.

To Facebook’s credit, it began implementing a special reporting system for revenge porn in April 2017, while the lawsuit was still ongoing. Still, it took Facebook years to do this: 2017 was well after most US states had implemented laws against revenge porn, and years after what seems like a worst-case scenario had played out on its platform.

4) The system requires you to show your nudes to an unknown human.

Facebook is adamant that it doesn’t store your nude images — only the digital image hash.

However, according to Facebook, every submitted nude gets “reviewed” by one of five “specially trained representative[s] from our Community Operations team.” A Facebook spokesperson confirmed to Vox that the reviewing process does involve a single staff member viewing the photo directly, emphasizing that the team members are “trained on and work on eliminating safety related issues on our platform.”

But many Facebook users might consider this aspect of the process a deal breaker, since no amount of special training can rule out a bad actor misusing that access.

And there’s plenty of precedent for these bad actors existing. In 2010, the US Marshals Service admitted that tens of thousands of nude body scans, originating from a Florida courthouse, had been improperly and illegally stored; 100 of those scans were eventually made public via a Freedom of Information Act request.

There’s even precedent for these bad actors existing at Facebook. Earlier this month, Facebook fired an engineer who allegedly boasted to an alarmed Tinder contact that he used data obtained through his job to act as a “professional stalker.”

It only takes one bad actor on Facebook’s Community Operations team to subvert this entire process and damage the trust Facebook is asking its users to place in the program.

Facebook is aware of the potential risks involved and says it is doing its best to ameliorate them. “This pilot was created with the victim in mind, to give victims options to fight back,” the spokesperson told Vox, “and that is precisely why we’ve partnered with the experts who can provide the types of wrap-around services needed.” One of its advisers, Danielle Keats Citron, told Refinery29 that she believed in the process. “I’m comfortable with what they’re doing because I know how hard they’re working on the security,” she said.

5) The system doesn’t protect you from photos obtained without your consent.

Facebook is obviously trying to provide additional options to users who’ve been threatened in some way with the potential release of photos. That’s an admirable goal, even if the company’s approach to achieving it is less than ideal.

But perhaps the biggest problem with this approach to revenge porn is that it assumes the photos that will be used against you are under your control to begin with. This is often not the case, as revenge porn victims frequently only discover after the fact that photos or videos were taken of them without their consent. A 2014 survey from the anti–online harassment group Without My Consent found that “participants’ experiences of other kinds of harassing conduct were not infrequent; these included the posting of photographs and videos taken when the participant was unaware,” as well as “posting digitally manipulated images involving participants.”

This approach to the problem assumes that the victim can participate in preventing their victimization, but too often, the reality is that the victim isn’t even aware they’ve been compromised until they discover the existence of the photos in question.

6) This system places the onus on the potential victim instead of focusing on identifying perpetrators.

From start to finish, this process focuses on making sure that potential victims — the overwhelming majority of whom are women — identify themselves and act against the risk of their future victimization. This approach arguably empowers the victims of revenge porn, but it also lays the groundwork for a subtle form of victim blaming.

It’s easy to envision a situation where, if you have access to your nude photos but choose not to subject yourself to the invasive process of reporting them, a legal argument could be made that it’s your fault if those photos are ultimately used against you. If the pilot program goes into effect more broadly, it’s not hard to imagine a Black Mirror scenario of subtle blame emerging whereby women, in particular, are socially pressured to submit to the process — in other words, to preemptively give up their privacy and bodily autonomy in an attempt to ward off future exploitation.

Teaching men not to perpetrate sexual assault is often a fraught and uncomfortable process. However, the nature of Facebook’s “emergency” revenge porn option feels uncomfortably like trying to teach women not to be violated.

Update: This article has been edited to provide an additional response from Facebook.