
What is CSAM?

CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.

Child sexual abuse material (CSAM) is not “child pornography.” It is evidence of child sexual abuse, and it is a crime to create, distribute, or possess it. CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.

A child cannot legally consent to any sexual act, let alone to being recorded in one, so this material can never be “pornography.” That’s why RAINN and other child protection experts use the term “CSAM” instead of “child porn” or “deepfakes.” Calling it what it is, sexual abuse, refuses to minimize the harm and names it as the crime it is.

What CSAM Includes

  • Sexual acts involving a minor
  • Images of a child’s genitals or private areas
  • Live streamed or webcam-based abuse
  • AI-generated content that makes it look like a child is being abused
  • Any content that sexualizes or exploits a child for the viewer’s benefit

It doesn’t matter if the child agreed to it. It doesn’t matter if they sent the image themselves. If a minor is involved, it’s CSAM—and it’s illegal.

CSAM can ruin lives. It can follow survivors for years. And it’s far more common than most people realize.

Last updated: July 21, 2025