
Call “revenge porn” what it is: sexual abuse

Survivors of “revenge porn” told me how much they hate the term. So I came up with a new one.


It’s the Kardashian family, so the story is a tangled web of intrigue and outrage, with millions worldwide waiting for the next installment of the drama. This time it’s about Rob Kardashian and his ex-partner Blac Chyna. But we are not just watching another celebrity spat following an acrimonious breakup.

Kardashian has shared explicit images of Chyna on social media, apparently without her consent. These were private, sexual images not meant for distribution. But Kardashian seems to have decided otherwise. He’s become the latest in a long line of ex-partners to target a former girlfriend with what is often called “revenge porn.”

But the term “revenge porn” is distracting and inaccurate, and minimizes the harms of this growing phenomenon. “Revenge porn” should be called out for what it is: image-based sexual abuse.

Survivors say the term “revenge porn” trivializes their experience

This new term — image-based sexual abuse — better describes the nature and harms of the deeply injurious actions of men (and it is mostly men) who perpetrate it. As a law professor at Britain’s Durham University, I developed this term together with fellow British law professor Erika Rackley.

We’d listened to survivors who say the term “revenge porn” trivializes their experiences. It makes them feel as if they’ve done something wrong to justify an act of revenge, and the focus on porn encourages victim blaming, as if they were at fault for taking these videos or pictures, or for allowing them to be taken.

I’d also noticed that in my work with politicians to introduce new laws to criminalize these practices, the language of “revenge porn” limited not just our discussions but also the policies they were putting forward. We kept saying this is not just about revenge — what about hackers? It’s not about porn, and so obscenity standards aren’t relevant. It’s not just about sharing images — what about creation, including “upskirt” images?

While the language of “revenge porn” has certainly worked to secure the attention of the media and politicians, it has passed its sell-by date. It’s time for a new approach and new terminology. The term image-based sexual abuse both captures the broad range of behavior to be challenged and conveys the nature and extent of the harms. Survivors tell us that this term resonates with them.

And language matters because it frames our laws and public debates. A major purpose of the law is to express our shared condemnation of specific practices with the hope of changing people’s behavior. We can only achieve these purposes if the label applied to a law is the right one. And “revenge pornography” is the wrong one.

It’s about more than “revenge”

Why? First, “revenge porn” covers just one form of image-based sexual abuse: an ex-partner, like Rob Kardashian, sharing private sexual pictures or videos without consent to exact “revenge.” Even in such cases, images are shared for a variety of reasons other than revenge — to make money, for notoriety, for a “laugh,” for sexual gratification, or for no real reason at all.

The law should also cover other forms of image-based sexual abuse, such as the distribution of hacked images, as when naked pictures of Jennifer Lawrence and other celebrities were stolen and distributed. Nor does “revenge porn” cover images created without consent and then shared, including recordings of a sexual assault or upskirt images.

It’s not “porn”

It’s not just the “revenge” aspect of “revenge porn” that is troubling — it’s also not “porn.” The word porn implies consent and legitimacy, which is not warranted. It leads some politicians down the wrong path: thinking that images must pass a threshold of obscenity before being unlawful, or that the perpetrator must be acting for sexual gratification before the conduct can be criminalized.

It’s a form of abuse

Image-based sexual abuse emphasizes what these practices are — abusive. Creating and/or distributing sexually explicit images without consent is a serious harm, often resulting in considerable mental and physical injuries. It is a form of harassment and often part of a pattern of coercive domestic abuse. It is also a breach of the fundamental rights to privacy, dignity, and sexual autonomy, with women (and victims are mostly women) being forced offline and blamed or targeted for expressing themselves sexually through imagery.

The abuse is sexual — and criminal

We are talking about forms of sexual abuse. The harm comes from the fact that it is sexual images that are shared without consent; the images go viral because they are sexual. Sharing private sexual images without consent exploits an individual’s sexual identity and infringes their sexual autonomy. The online abuse that accompanies distribution of private sexual images includes sexual threats (rape threats), as well as abusive comments about the victim’s appearance, body, sexuality, and sexual agency.

So where people have shared sexual images without consent, as Rob Kardashian has done, survivors experience this as abusive, as sexualized: It’s a sexual offense. Lawrence called the hacking and distribution of her naked images a “sex crime.” When Mischa Barton’s ex-boyfriend tried to distribute a private sexual video without consent, her lawyer Lisa Bloom spoke on Barton’s behalf, describing it as a “form of sexual assault.”

And it’s not just celebrities who are victims. Close to my home in the north of England, the survivor Keeley Richards-Shaw spoke out when her ex-boyfriend took and shared sexual images of her without her agreement. She said: “How anyone can fail to see revenge porn as a sexual crime is beyond me.”

My latest research has shown how all different forms of image-based sexual abuse — including “revenge porn,” “upskirting,” sexual extortion, and sexualized photoshopping — share common characteristics with other forms of sexual crime. Perpetrators act to gain a sense of power and to harm their victim in a way that attacks their identity and self-worth. Labeling and understanding these practices as sexual offenses is vital to ensuring appropriate support and protections for victims.

Change the label, focus on the harms

It’s time to stop using the term “revenge porn.” It is salacious and trivializes the abuse, dismissing it as “just a bit of fun.” It focuses on the motivations of perpetrators, instead of on the harms suffered by victims.

The language of image-based sexual abuse resonates with survivors. It emphasizes the abusive nature of these practices and the links to other forms of sexual offending. It can help our politicians focus on the real harms and the actions needed to challenge this abuse.

Of course, changing the terminology won’t stop the abuse. But replacing the language of “revenge porn” with the more accurate and inclusive image-based sexual abuse would be a small — but important — step in challenging the current cultural climate of blaming victims and minimizing this abuse.

What we also need is comprehensive legal reform. This means all states and countries across the world criminalizing the sharing of private sexual images without consent, whatever the motivation of the perpetrator, as well as threats to share. It means making upskirting a criminal offense, as well as sexual extortion. Progress is being made. The Cyber Civil Rights Initiative has been successful in securing change in many US states. But as Rob Kardashian has shown, much more needs to be done.

Clare McGlynn is a professor of law at Durham University in the UK. She has worked closely with politicians and voluntary organizations to introduce new laws criminalizing image-based sexual abuse across the UK, as well as addressing policymakers in Iceland, Australia, and across Europe on the urgent need for action. She is on Twitter @McGlynnClare, and her website is ClareMcGlynn.com.


First Person is Vox's home for compelling, provocative narrative essays. Do you have a story to share? Read our submission guidelines, and pitch us at firstperson@vox.com.