
Nude deepfakes disrupt Beverly Hills school

All is not well in Beverly Hills, and AI is to blame. Police have launched a non-criminal investigation at Beverly Vista Middle School, which teaches 11- to 14-year-olds, after students shared artificially generated nude images of their classmates. The school district's superintendent called the incident part of the “disturbing and unethical use of AI plaguing the nation”. Earlier this month, sexually explicit AI-generated images of Taylor Swift went viral on social media, prompting Congress to propose legislation that would allow victims to sue. In California, victims have in fact been able to sue whoever distributed or shared sexually explicit deepfakes of them since 2020, which could apply in this case if a student decided to take legal action. For now, the students responsible have been removed from the school while the investigation is completed.

This is not an isolated incident: similar reports have recently emerged in New Jersey, Seattle, Winnipeg, Spain and the UK. The big picture: online tools for artificially generating explicit imagery have become so easily available that children can use them.



Copyright © 2026 Tortoise Media

All Rights Reserved