
What About AI-Generated CSAM—Like Deepfakes?

Artificial intelligence (AI) is fueling a disturbing rise in synthetic CSAM—child sexual abuse material generated by easy-to-use apps and software.

How Can Parents Help Protect Kids from Sexual Abuse & CSAM?

Predators count on secrecy and shame to keep their young victims silent. The best protection is ensuring kids know they can talk to you—and that they’ll be believed.

What Should You Do If You or Your Child Has Been Victimized?

If you discover CSAM of yourself or your child, document where you found it and report it immediately to CyberTipline.org. If there is immediate danger, call 911.

Which U.S. Laws Address CSAM?

According to U.S. law, child sexual abuse material (CSAM) is NOT a protected form of expression—it is a serious federal crime with harsh penalties.

What Is RAINN Doing to Fight Back Against CSAM?

RAINN is leading the national response to child sexual abuse material—through public education, support for survivors, and bipartisan legislative advocacy.

How to Find Trustworthy Childcare and Eldercare

Learn how to find safe, trustworthy childcare or eldercare. This guide covers caregiver vetting tips, signs of abuse, and steps to take if you suspect harm.

5 Rules for Getting Consent

Financials

RAINN is committed to meeting the highest standards of fiscal management, program effectiveness, and governance. See why we’re a top-ranked nonprofit.

TAKE IT DOWN Act

Learn more about the TAKE IT DOWN Act, 2025’s groundbreaking legislation that combats the online spread of non-consensual intimate images (NCII).

Get the Facts About Sexual Violence Against People with Disabilities

Learn how a person’s disability status affects their risk of experiencing sexual violence. Together, we can protect and support people with disabilities.
