Synthetic child sexual abuse material (CSAM) may not involve a camera or physical contact—but it still depicts a child in a sexual context. And it still causes deep harm.
What Is Synthetic or AI-Generated CSAM?
Synthetic CSAM includes:
- Deepfake videos that digitally place a child’s face on an adult’s body
- AI-generated images of children in explicit scenarios that look real but are fabricated
- Photos of real children digitally altered to make them appear nude or engaged in sex acts
These images often start with innocent photos—from social media, school yearbooks, or even family photos—that are manipulated using AI tools. Predators can create CSAM without ever meeting the child in person.
Even if the child was never physically present, synthetic CSAM still normalizes pedophilia and often targets a real person, retraumatizing survivors and feeding the demand for abuse.
How the Synthetic CSAM Crisis Began
The term “deepfake” originated on Reddit in 2017, where users began swapping faces into explicit videos. What started as a novelty quickly became a tool of abuse. Today, deepfake tech is widely available and being used to target minors—sometimes by other minors who don’t understand the impact of their actions.
RAINN and other experts caution against normalizing the term “deepfakes,” as it obscures the real impact of these materials: nonconsensual, manipulated intimate images that weaponize someone’s likeness against them.
What the Law Gets Wrong About Artificial Intelligence
Right now, many state and federal laws don’t explicitly criminalize synthetic CSAM. If a child’s body wasn’t physically involved—or the child is “only” depicted in a fabricated image—some laws fail to apply.
RAINN’s Recommendations to Lawmakers
Just because something is made with software doesn’t mean it’s not abuse. Our laws need to recognize all CSAM—real and synthetic—for what it is: child sexual exploitation.
RAINN urges legislators to pass laws that:
- Prohibit the creation or distribution of manipulated CSAM—regardless of whether a real child was physically present
- Establish penalties for synthetic abuse equal to those for traditional CSAM
- Protect survivors’ rights to report, remove, and seek civil or criminal justice when their likeness is exploited
- Expand the legal definition of child sexual abuse material to include “realistic, nonconsensual, and manipulated depictions” of children (8)
TAKE IT DOWN Act
On May 19, 2025, President Trump signed the TAKE IT DOWN Act into law, establishing the first-ever federal protections for survivors of tech-enabled sexual abuse.
The law makes it a federal crime to knowingly share or threaten to share nonconsensual intimate images (NCII)—including AI-generated deepfakes—and requires platforms to remove them within 48 hours of a survivor’s verified request.
As AI tools become more advanced and accessible, the need for updated laws is urgent. Without action, offenders will continue to exploit loopholes—while survivors are left to deal with the consequences.
Last updated: July 21, 2025