
Image-Based Sexual Abuse Laws: Combat Nonconsensual AI Deepfakes 

Get RAINN’s recommendations for state legislators on creating and passing laws that address nonconsensual, explicit AI-generated and AI-manipulated “deepfakes.”

Nonconsensual Manipulated Intimate Material (NCMIM):

Digitally altered or synthetic images, video, audio, or other content created using artificial intelligence or other technologies to portray an individual in a sexually explicit or intimate manner, without that individual’s knowledge or consent.

Have Your Image-Based Abuse Laws Caught Up with Technology?

No one should have to see nonconsensual sexual content of themselves online or elsewhere. Yet rapidly evolving technology and shortcomings in existing legal frameworks leave everyone exposed to this tech-enabled sexual abuse. The creation and distribution of nonconsensual manipulated intimate material (commonly referred to as “deepfakes”) inflicts lasting harm on survivors. We use “nonconsensual manipulated intimate material” to refer to imagery, video, audio, or other content that is produced or distributed without the consent of the subject and has been altered, including with artificial intelligence, to show intimate or explicit content.

The trauma from the nonconsensual use of a victim’s image or voice in explicit material, or from the distribution of private content, lasts a lifetime. The harm repeats with every reproduction. Distribution invades private spaces, damaging reputations and relationships with family, friends, coworkers, and religious communities. While states rush to catch up with technology, abusers exploit new tools to control and harm victims. States must modernize their laws to address the crime and open pathways to justice for victims.

Confront Abusers & Technology

Given the invasiveness of this crime, many states criminalized the nonconsensual distribution of adult intimate images and the possession or distribution of child sexual abuse material. At the time, however, these laws did not address manipulated imagery or audio involving adults or children, because the technology required to make realistic imagery or audio was inaccessible to most users. Advances in technology now leave most laws addressing nonconsensual intimate material insufficient to address the growing crime of manipulating it.

For example, if a photo depicts a victim nude, with the victim’s real face but computer-generated genitals, many laws would not apply because the intimate part of the image is not actually the victim’s. States must confront creators and disseminators of this invasive material with appropriate pathways to justice for victims and updated statutes that adapt to a rapidly growing technological space. The prevalence and accessibility of new tools make the threat posed by those who create, alter, and distribute this material acutely urgent.

The advancement of digital technology allows for the rapid creation, modification, sharing, and storage of nonconsensual manipulated intimate material. Computer-generated images and modification tools allow for the efficient creation of images virtually indistinguishable from reality. Similar tools facilitate modifying voices to produce realistic-sounding but fabricated audio recordings. Especially with the advent of artificial intelligence, tools that expedite harm proliferate with few avenues for remedy.

Applications (apps) and software can turn an otherwise mundane photo into an explicit nude image, and commonly used social networking platforms allow for quick distribution to many people. Such images frequently place people in sexual positions, in addition to depicting them without clothes. These tools are widely available and supported by online communities in which users discuss and create nonconsensual manipulated intimate material. (1) This material causes lasting harm. (2)

People depicted are forced to view or hear themselves engaging in nonconsensual explicit acts, face unwanted attention on their nude bodies, and endure the stigma associated with explicit content. Victims lose jobs, experience trauma, and may suffer ongoing depression or anxiety. When these tools generate explicit images of a minor, whether or not the minor is identifiable, the creator of the image is creating child sexual abuse material (CSAM). Such nonconsensual manipulated intimate material adds to the abuse of real children and threatens to anonymize and normalize pedophilia, child abuse, and trafficking.

Nonconsensual manipulated intimate material abuse threatens both individuals and public safety. In one example, a man sent explicit images of a woman to predators, insisting she fantasized “about being raped”; threatening strangers eventually arrived at her workplace. (3) Nonconsensual manipulated intimate material also fuels domestic violence, child sexual abuse, and sextortion. Abusers use the material to threaten, blackmail, and control victims. (4) The repercussions of such material reflect the underlying sexual abuse: robbing someone of consent over their own body.

Although the content may be fake, the harm to victims from the distribution of sexually explicit nonconsensual manipulated intimate material is real and long-lasting. Teens have died by suicide after internet users used such images to extort them. (5) Exacerbating the harm, creators and distributors of nonconsensual manipulated intimate material conduct their abuse repeatedly and attack multiple victims. (6)

The serial nature of the crime, combined with the anonymity offered by the internet, makes catching perpetrators a challenging public safety concern. The U.S. Department of Homeland Security has called deepfakes a “clear, present, and evolving threat to the public across national security, law enforcement, financial, and societal domains.” (7) Individual victims, businesses, and public safety professionals need the support of the law to pursue justice for this abuse.

Lawmakers must address the trauma and harm caused by nonconsensual manipulated intimate material. Failure to act allows perpetrators to evade accountability for a crime that ruins lives. State legislators have options to ensure that their statutes addressing intimate image abuse cover the threats posed by evolving technology and nonconsensual manipulated intimate material. Lawmakers must center survivors of the abuse in their solutions. Below, we offer recommendations and sample statutory language to address this emerging and urgent threat.

RAINN’s Recommendations 

RAINN offers the following recommendations to policymakers to ensure state statutes addressing nonconsensual authentic or manipulated material reflect best practices and afford the best chance for justice. This is not an exhaustive list of considerations, but highlights components of comprehensive nonconsensual material laws in the context of manipulated material. Below, we offer general recommendations, followed by specific considerations and sample statutory language.

  • Modify existing statutes addressing disclosure of intimate material to include both authentic and manipulated material, and make disclosure an offense regardless of which form the material takes.
  • Modify existing statutes addressing intimate material to include a victim’s voice as well as their image.
  • Ensure definitions are technology-neutral, to encompass future developments and advances in generative technology.
  • Verify that statutes related to child sexual abuse material use terminology that accurately reflects the abuse that occurs and do not rely upon consent.
  • Amend any intent elements to focus on the harm caused and the intent of the perpetrator(s).
  • Ensure that consent to the creation of the intimate material will not constitute consent to the disclosure of the material.
  • Include a forfeiture provision that turns over the rights to the unlawful material to the victim or the court, thereby preventing the perpetrator from continued possession of the unlawful materials and giving victims support to request the removal of the unlawful material from public spaces.
  • Verify that jurisdiction to bring a criminal or civil action rests in any jurisdiction in which the victim resides, regardless of where the material was created or disclosed.

Other Considerations

  • Provide judicial processes to assist victims in removing unlawful material and provide for their safety, such as take-down orders or protective orders. 
  • Ensure that intimate material involving minors that does not rise to child sexual abuse material is prohibited if it is disclosed with the intent to cause harm to the minor.
  • Protect victim privacy in court filings, by allowing the use of pseudonyms, in camera review, the material to be sealed from public view, etc.
  • Clarify that the statute of limitations does not begin to run until the discovery of the material, and it applies to each piece of material individually.
  • Allow for civil actions against perpetrators that include:
    • Allowing guardians to bring actions on behalf of minors
    • Liquidated damages
    • Equitable relief, such as restraining orders prohibiting the further disclosure of the material or requiring its removal

We recognize that each state’s code is unique. RAINN’s policy experts can work hand-in-hand with lawmakers and their staff to tailor these recommendations to meet each state’s specific needs. 

How To Assess & Improve Your State’s Laws

We encourage policymakers, their staff, and those interested in advocating for reform within their state to consider the following questions: 

1. Do your state’s definitions cover malicious conduct?

RAINN Recommends: Evergreen Technology Definitions 

The manipulation of authentic material has existed for as long as the material itself. Over time, the methods have improved, but the underlying intent to manipulate the material has not changed. Because technology is rapidly changing and advancing, any definition for nonconsensual manipulated intimate material should focus on the manipulation, not on the technology used. This will allow the definition to be evergreen.

Sample Statutory Language 

“Depiction” – The term “depiction” means any voice or visual material created or altered by any means, including through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic depiction.

RAINN Recommends: Prohibit Harm To Identifiable Individuals

Regardless of whether nonconsensual intimate material is authentic or manipulated, a victim can suffer trauma when their body is used without their consent. It is important that any use of a victim’s body without their consent to create this material is prohibited. For example, if an image depicts an individual engaged in sexual conduct, it is harmful whether the victim’s face is superimposed onto another person’s body or another person’s face is superimposed onto the victim’s body. (8) Definitions should be written to prohibit the nonconsensual use of a victim’s likeness or voice in intimate depictions.

Sample Statutory Language 

“Identifiable Individual” – The term “identifiable individual” means an individual who appears in whole or in part in a depiction and who is identifiable through the creation of the authentic depiction or by virtue of the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature, the sound of the voice or simulation of the voice, or from other information displayed in connection with the depiction.

“Intimate Conduct” – The term “intimate conduct” means a depiction, including voice or visual material, of:

  1. The uncovered genitals, pubic area, anus, or breast; (9)
  2. The display or transfer of bodily sexual fluids;
  3. Sexually explicit conduct, including
    1. sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex;
    2. bestiality;
    3. masturbation;
    4. sadistic or masochistic abuse.

RAINN Recommends: Singular Statute for Authentic & Manipulated Material 

With manipulated intimate material becoming nearly indistinguishable from authentic material, statutes should not differentiate between the two, so long as an identifiable individual is depicted. Separating authentic from manipulated material creates difficult evidentiary barriers. The harm is equivalent for identifiable individuals, which should be reflected in a singular statute.

Sample Statutory Language 

“It is unlawful to possess, disclose, or threaten to disclose a depiction of intimate conduct of an identifiable individual…” (See definitions above, which include “authentic” or “manipulated.”)

“It is a crime to possess, disclose, or threaten to disclose an authentic or manipulated intimate depiction…”

2. Has your state updated its laws for child sexual abuse material and prohibited the distribution of manipulated nude images of minors?

RAINN Recommends: Eliminate “Child Pornography”

“Child sexual abuse material” (CSAM) (9) is unlawful in every jurisdiction because it is a depiction of child sexual abuse. Laws that refer to this material as “child pornography” should be updated to use the term “child sexual abuse material.” Also, because this material depicts child sexual abuse, consent should not be an element of the offense, either explicitly or implicitly. (12)

RAINN Recommends: Prohibit Manipulated Intimate Depictions of Minors

Not every nude depiction of a minor rises to the offense of CSAM. There has been an exponential increase in reports of offenders using artificial intelligence applications to turn innocent images of minors into nude images. (10) These images may not fall under the definition of child sexual abuse material because the intimate areas are entirely computer-generated or the depiction is not considered sexually explicit or lascivious. As noted above, definitions of intimate depictions should be updated to include both authentic and manipulated depictions. Further, if a depiction is not child sexual abuse material but is an intimate depiction, its disclosure should be prohibited if the victim is harmed. (See the recommendation for the “harm” definition.)

3. Do your state laws recognize the harmful effects of the distribution of nonconsensual intimate material?

RAINN Recommends: Recognize That Distribution Harms Victims

Whenever intimate material is manipulated or distributed without the victim’s consent, it harms the victim. (11) States should amend current laws or adopt new laws that recognize the different types of harm this malicious act can cause.

Sample Statutory Language 

(See above definitions.)

“It shall be unlawful to possess, disclose, or threaten to disclose a depiction of intimate conduct of an identifiable individual

  1. With the intent to harass, annoy, threaten, alarm, or cause harm, including physical, psychological, emotional, financial, or reputational harm to the depicted individual; or
  2. With actual knowledge that, or reckless disregard for whether, such possession, disclosure, or threatened disclosure will cause harm, including physical, psychological, emotional, financial, or reputational harm to the depicted individual.”

RAINN Recommends: Create Evergreen Disclosure Definitions

Disclosure should not be limited to specific means, such as the internet, because new means of disclosure arise every day. As with the technology used to create nonconsensual manipulated intimate material, the focus needs to be on the intent of the perpetrator. Whether the disclosure occurs by posting the material on a social media platform, email, AirDrop, file sharing, text, posters, or simply showing the image to another person, the disclosure causes harm.

Sample Statutory Language 

“Disclose” – The term “disclose” means to transfer, publish, distribute, or make accessible.

4. Does your state allow a victim to obtain justice for the harm caused by nonconsensual intimate material, whether authentic or manipulated?

RAINN Recommends: Include Forfeiture Provisions

A perpetrator of this type of sexual violence should not be allowed to keep or maintain the unlawful material. Forfeiture is the legal mechanism that allows the court to remove any ownership rights in the material from the offender as a penalty and transfer control of the material to either the state or the victim. Divesting the perpetrator of control over the material will remove financial incentives for perpetrators and give victims more avenues to remove the material from public spaces.

Sample Statutory Language 

“The court, in imposing a sentence on any person convicted of [crime] shall order that such person forfeit to [State, Victim] any material disclosed in violation of this section; such person’s interest in property, real or personal, constituting or derived from any gross proceeds of such violation, or any property traceable to such property, obtained or retained directly or indirectly as a result of such violation; and any personal property the person used, or intended to be used, in any manner or part, to commit or to facilitate the commission of such violation.”

RAINN Recommends: Limit Exceptions 

When it comes to the disclosure of nonconsensual manipulated intimate material, any disclosure causes trauma, even when the intent of the person disclosing the material is not malicious. Because of that, any exceptions should be very narrow, to limit the trauma to victims and prevent perpetrators from exploiting the exceptions. 

Sample Statutory Language 

“This section shall not apply to disclosures made reasonably and in good faith [An identifiable individual may not bring an action for relief under this section for disclosures made reasonably and in good faith]:

  1. To or by a law enforcement officer or agency in the course of reporting or investigating—
    1. unlawful activity; or
    2. unsolicited or unwelcome conduct; or
  2. As part of a legal proceeding; or
  3. Intending to assist the identifiable individual.”

“Disclaimers – It shall not be a defense under this section that there is a disclaimer, through a label or some other form of information, stating that the intimate depiction of the identifiable individual was unauthorized, that the identifiable individual did not participate in the creation or development of the material, or that the depiction is not authentic.”


RAINN Recommends: Enable Jurisdiction in the Victim’s Home State

These offenses are pernicious in part because perpetrators can prey upon victims they do not know and can disclose the material throughout the world within minutes. Because of that, states should provide their citizens with the means to seek justice within their own state, regardless of where the perpetrator resides or where the material was disclosed.


NOTES & CITATIONS

(1) Lucas, K. T. (2022). Deepfakes and Domestic Violence: Perpetrating Intimate Partner Abuse Using Video Technology. Victims & Offenders, 17(5), 647–659. https://doi.org/10.1080/15564886.2022.2036656

(2) See footnote 1; see also Deepfake Technology. (n.d.). Organization for Social Media Safety. Retrieved June 19, 2024, from https://www.socialmediasafety.org/advocacy/deepfake-technology/

(3) Leonard, Collin. “Airline Pilot Extradited to Utah after Posting Illicit Images of Former Flight Attendant, Police Say.” KSL.com, 4 July 2024, www.ksl.com/article/51060973/airline-pilot-extradited-to-utah-after-posting-illicit-images-of-former-flight-attendant-police-say. Accessed 8 July 2024.

(4) See footnote 1.

(5) Sganga, Nicole. “Family of Teen Who Died by Suicide Warns of Dangers of Financial Sextortion.” CBS News, 17 Jan. 2024, www.cbsnews.com/news/fbi-warning-financial-sextortion-minors-growing-threat-suicide/

(6) Flynn, Asher, Anastasia Powell, Adrian J. Scott, and Elena Cama. “Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse.” The British Journal of Criminology, vol. 62, no. 6, Nov. 2022, pp. 1341–1358, https://doi.org/10.1093/bjc/azab111

(7) Department of Homeland Security. Increasing Threat of Deepfake Identities. 2023. https://www.dhs.gov/sites/default/files/publications/increasing_threats_of_deepfake_identities_0.pdf

(8) If no identifiable individual is depicted in any part of the material, a state’s obscenity laws would likely prohibit the conduct, as that is an offense against society, as opposed to these offenses, which harm identifiable victims.

(9) The First Amendment is not implicated in statutes protecting adult victims because of the lack-of-consent and harm requirements. For children, if the material is obscene or sexually explicit, there are no First Amendment concerns, even without a consent or harm requirement. But any strict liability for depictions of minors that are not obscene or sexually explicit should be reviewed carefully to ensure there are no First Amendment implications.

(10) Child sexual abuse material is generally defined as sexually explicit depictions involving a minor (e.g., actual or simulated sexual intercourse) or lascivious exhibition of a minor’s intimate areas. A depiction is lascivious if it is sexually suggestive or designed to elicit a sexual response.

(11) NBC News, https://www.nbcnews.com/news/us-news/little-recourse-teens-girls-victimized-ai-deepfake-nudes-rcna126399

(12) As noted above, if the material is CSAM, it is sexual abuse and never consensual.

Last updated: July 22, 2025