Artificial intelligence (AI) is fueling a disturbing rise in synthetic CSAM: child sexual abuse material generated by easy-to-use apps and software.
Predators count on secrecy and shame to keep their young victims silent. The best protection is ensuring kids know they can talk to you, and that they'll be believed.
If you discover CSAM of yourself or your child, document where you found it and report it immediately to CyberTipline.org. If anyone is in immediate danger, call 911.
Under U.S. law, child sexual abuse material (CSAM) is NOT a protected form of expression; it is a serious federal crime carrying harsh penalties.
RAINN is leading the national response to child sexual abuse material through public education, support for survivors, and bipartisan legislative advocacy.
Learn how to find safe, trustworthy childcare or eldercare. This guide covers caregiver vetting tips, signs of abuse, and steps to take if you suspect harm.
RAINN is committed to meeting the highest standards of fiscal management, program effectiveness, and governance. See why we're a top-ranked nonprofit.
Learn more about the TAKE IT DOWN Act, the groundbreaking 2025 legislation that combats the online spread of non-consensual intimate images (NCII).
Learn how a person’s disability status affects their risk of experiencing sexual violence. Together, we can protect and support people with disabilities.