All is not well in Beverly Hills, and AI is to blame. Police have launched a non-criminal investigation at Beverly Vista Middle School (ages 11-14) after students shared artificially generated nudes of fellow classmates. The school district superintendent called the incident part of the “disturbing and unethical use of AI plaguing the nation”.

Earlier this month, sexually explicit AI-generated images of Taylor Swift went viral on social media, prompting Congress to propose legislation that would allow victims to sue. In California, victims have in fact been able to sue whoever distributed or shared sexually explicit deepfakes since 2020, which could apply in this case if a student decided to file a civil suit. For now, the student perpetrators have been removed from the school while the investigation is completed.

This is not an isolated incident. Similar reports have recently emerged in New Jersey, Seattle, Winnipeg, Spain and the UK.

The big picture: online tools for artificially generating explicit imagery have become so easily available that children can use them.