
Responding to AI-Facilitated Abuse: Tips for Schools

From deepfakes to sextortion, artificial intelligence (AI) is putting kids at risk. Explore how schools can prevent and respond to tech-enabled sexual abuse.

Contributed by Sam Wilmoth, MSW, for RAINN Consulting Group


Content Note: This article discusses the harms and impacts of child sexual abuse, including AI-generated and AI-manipulated image-based abuse.

“I didn’t know this could happen. We’d put parental settings on her phone. We’d made sure the content was appropriate for her age, but I didn’t know just how vulnerable children are online.”

“Mary,” an anonymous mother of a victim of AI-enabled abuse

In my years of facilitating trainings on child abuse for parents and caregivers, I have never once heard a single person express confidence in their ability to keep their children 100% safe from threats on the internet. The reason for this is simple: keeping kids safe is too big a job for parents and caregivers to do alone. We need entire communities, starting with schools and districts, to improve the way they respond to AI-assisted abuse.      

Many school districts were quick to recognize that AI chatbots could be used to cheat on academic assignments, but they are still struggling to respond to threats of child sexual abuse that involve the same technology.

AI Abuse Causes Real Harm To Real People

Children (girls in particular) face risks of abuse driven by the increasing presence of artificial intelligence in their lives. One such threat is “nudify” technology, in which AI tools construct “deepfakes”: fake nude images built from real photos of fully clothed people, most often women and girls. This technology also leaves children vulnerable to “sextortion,” in which perpetrators threaten to publish nude images of a child in order to blackmail them.

AI tools are making child sexual abuse material (CSAM) both easier to create and more explicit, according to two separate reports published by the Internet Watch Foundation in 2023 and 2024. The perpetrators who create these abusive images cause immeasurable harm.

Impacts of Tech-Enabled Sexual Abuse

Even before the widespread availability of AI, studies of image-based sexual abuse documented harms to victims such as posttraumatic stress disorder and suicidal thoughts, with one survivor crushingly describing her experience as “torture for the soul.” Unfortunately, when school districts do not fully understand these issues, they can further harm victims. This often happens when schools fail to report the abuse or compromise victims’ privacy during their investigations.

Improving How Schools Respond To AI-Based Abuse

So, what can school districts do to improve how they respond to these issues? 

1. Education & Training

  • Schools can provide education and training to help their students, staff, and faculty understand more about AI-enabled abuse. 

RAINN has worked with schools across the country to provide victim-centered and trauma-informed training on sexual misconduct, and we have found that direct discussions with students can provide a deeper understanding of the harm caused by AI-enabled abuse. Students need to have a vocabulary for describing why this kind of abuse is so harmful, and the best way to develop that vocabulary is through discussion, not lecture. 

One of the most common training requests we receive from teachers and staff is for information on how to recognize technology-based forms of abuse. It’s important that training for teachers also provides actionable steps they can take to prevent further harm once they become aware of the signs. In addition, detailed training on these issues is particularly important for Title IX investigators and other administrators who may have to respond to reports of AI-enabled abuse.

2. Policies & Procedures

  • Schools can update their policies and procedures to address new forms of AI-enabled abuse. 

For instance, faculty and staff need clear reporting procedures that encompass AI-enabled abuse. School administrators may also need to update their anti-retaliation policies to include behaviors like students sharing deepfakes of a victim. 

Without comprehensive policies against all forms of sexual misconduct, faculty and staff will not have clear guidance about how to respond to a report of AI-enabled abuse, and victims will not have the protection they deserve.

Schools Can Make a Significant Impact

Educators and administrators are in the difficult position of having to respond to new kinds of abuse before legal or technological fixes emerge. But every improvement schools make to increase student safety has a critical impact. Every student deserves a learning environment that is free from violence and discrimination. 

I have spoken to parents and caregivers across the country who lie awake at night, worrying about protecting their children from someone on the other end of a computer screen or smartphone. But they are not alone. Young people have a community of adults around them who can step up to help. Schools are a critical starting point for a world free from AI-enabled abuse.

Resources for Prevention, Healing, & Justice

As advocates and policymakers become more aware of the risks that deepfakes and nudification apps pose to young people, they are developing a growing range of resources for victims.

  • Explore state laws. A growing number of U.S. states have passed laws that address deepfakes, and some legal practitioners are attempting to apply existing laws about privacy or child pornography to this issue as well.
  • Explore federal legislation. Bipartisan efforts in Congress have led to the passage of RAINN-endorsed laws like the TAKE IT DOWN Act and other federal legislation with provisions for offender accountability and child safeguarding. 
  • Build awareness. The U.S. Department of Homeland Security has published an awareness campaign about online child sexual exploitation called Know2Protect.
  • Report child exploitation. To report child exploitation, visit the CyberTipline, operated by the National Center for Missing and Exploited Children.
  • Get support. RAINN’s National Sexual Assault Hotline provides support, information, and resources for survivors of all forms of sexual violence, including AI-facilitated abuse. It’s free, confidential, and available 24/7. Call 800.656.HOPE (4673), chat at hotline.RAINN.org, or text HOPE to 64673.  

Sexual abuse, assault, and misconduct are sensitive, challenging issues, but every organization should be prepared to support survivors.

RAINN Consulting Group Can Help 
Last updated: August 13, 2025