Cyber sexual abuse happens more often than you think. It may have happened to you or someone you know.
Scenario:
Someone posts an intimate photo of their former partner online. The photo is shared hundreds of times on social media and draws hundreds of comments. It shows up on pornography sites and is downloaded dozens of times. The former partner starts getting phone calls and emails. Their home address is posted online. They feel people staring at them and talking about them. Strangers make comments in the street. The harassment spreads like a virus to their family and friends. Soon their employer finds out and fires them.
The above scenario is the reality for 1 in 10 women under the age of 30. These women have either been threatened with the sharing of their private images or have had those images shared online without their consent. Everyone, regardless of gender, has the right to expect private images to remain private. Victims of cyber sexual abuse can be further victimized through sextortion or AI-generated sexually explicit content. Many feel hopeless, lost, and without a solution. They may also feel that they are somehow at fault.
Cyber sexual abuse, or image-based sexual abuse (IBSA), refers to a form of online abuse that includes sextortion and the nonconsensual dissemination of (genuine or AI-generated) intimate images (NDII), also known as revenge porn or nonconsensual pornography. The technology enabling this abuse is developing faster than lawmakers can legislate against it. Even so, many states have implemented laws to protect victims of cyber sexual abuse. Most of these laws treat NDII as a criminal offense if the distributor acted with a specific intent (e.g., to harass, intimidate, or embarrass) or knew the victim had not consented to the disclosure. The federal government is still working toward passing its own legislation.
Revenge porn is the most commonly used term for NDII, but it is narrower: it implies that the source of the NDII is a former spouse or partner, and it excludes images distributed for other reasons, such as extortion or blackmail. Academics have pushed for the broader terms ‘image-based sexual abuse’ and ‘cyber sexual abuse’, which encompass non-revenge scenarios as well. NDII includes both ‘real’ photos and AI-generated images, also known as ‘deepfakes’. An estimated 98% of all deepfake content online is pornographic in nature, and 99% of that content targets women.
In 2022, Congress authorized a new cause of action allowing victims to file civil claims against NDII distributors. The act could also pave the way for further legislation to criminalize NDII or to expand liability for distributing other types of content online.
Earlier this year, sexually violent and pornographic AI-generated images, or “deepfakes”, of the singer Taylor Swift began circulating on Twitter. Twitter was slow to remove the images, allowing one photo to reach an audience of 45 million views. Swift was not the first victim of this abuse, but because she is a high-profile celebrity, legislators issued an immediate call to action for laws against AI-generated intimate images. Other celebrities and victims of cyber sexual abuse argue, however, that it shouldn’t have taken such a high-profile case to spur anyone to act. AI-generated images can affect anyone, from adults to teenagers to young children.
On June 27, 2024, US Senator Ted Cruz (R-Texas) held a hearing on his TAKE IT DOWN Act, which would require social media sites to remove nonconsensual intimate imagery within 48 hours of receiving notice from the victim and would make publishing such material a federal crime. Victims of AI-generated nonconsensual pornography spoke in support of the bill and shared their stories of how the abuse has destroyed their lives.
When Elliston Berry was 14 years old, a classmate created AI-generated nonconsensual pornography of her using her Instagram photos and distributed it to the rest of the school via Snapchat. Eight and a half months and a phone call from Sen. Ted Cruz later, Snapchat removed the photos. In her statement, she said she lives in constant fear that the images will resurface and that her life will be ruined all over again, a fear that weighs on her mental health daily. Ms. Berry’s school had no student conduct policy covering AI-generated pictures and therefore would not expel the perpetrator; instead, he received an in-school suspension and a misdemeanor charge for distribution.
Another victim at the hearing, Francesca Mani, was 14 years old when boys from her class created and distributed AI-generated sexually graphic images of her. According to the administration, the school could do nothing because it had no student conduct policies covering this type of behavior. The boys continued to attend classes with her, and she had to act as if nothing had happened. With no laws to protect her, she felt vulnerable and unsafe walking the hallways.
Many victims of cyber sexual abuse exhibit symptoms also seen in survivors of sexual assault. They struggle with depression, anxiety, low self-esteem and self-worth, and fear. Without treatment, some turn to drugs and alcohol or to self-harm. They blame themselves for what has happened, telling themselves they were stupid for trusting someone (even a long-term partner with whom they thought they had a future). They consider themselves lesser, or damaged.
Many also exhibit symptoms of PTSD but find that the diagnosis doesn’t quite fit: for many victims, there is no “post” in post-traumatic yet. They wake up every day to the possibility that deepfake images of them are still online, that the images are being distributed to other people, and that today might be the day they lose control of their life again.
Victims of online abuse also find it harder to seek out help, as there is rarely a physical element to this crime. The following resources can help:
AlectoAI – an app created by Breeze Liu, a revenge porn survivor, to help victims of nonconsensual pornography locate images of themselves online and get them removed. The app is currently free on the Apple App Store and Google Play.
Cyber Civil Rights Initiative (CCRI) – The CCRI is on a mission to combat online abuse that threatens civil rights and liberties. It provides expertise in court cases involving image-based sexual abuse, serves thousands of victims around the world, and advocates for technological, social, and legal innovation to fight online abuse.
Federal Trade Commission (FTC) – If you have taken steps to have images removed from a website but the site refuses, you can report the business to the FTC.
Federal Bureau of Investigation (FBI) – The FBI also has a task force to combat cybercrime. Its Internet Crime Complaint Center, or IC3, accepts reports of all cyber-related crimes.
Rape, Abuse & Incest National Network (RAINN) – RAINN is the nation’s largest anti-sexual-violence organization. It works to support each survivor with the understanding that no two stories are the same. Its website features an extensive list of resources not only for victims of cyber sexual abuse, but also for survivors of domestic violence, sexual assault, and incest.