YouTube’s algorithm recommends eating disorder content to teenage girls

A report from the Center for Countering Digital Hate (CCDH) demonstrates how YouTube’s algorithm can recommend harmful eating disorder content.

In its study, after an account posing as a 13-year-old girl interacted with a video about eating disorders, two-thirds of the videos then recommended were about eating disorders or weight loss, and half of those were considered harmful by YouTube’s own standards.

These videos had an average of 344,000 views each, with no way of knowing how many of those viewers were children. YouTube’s policy states that it does not allow the promotion of eating disorders, but it has previously acknowledged that videos about recovery journeys can be helpful to some but triggering to others.

Removing harm without censoring help is difficult. But the report suggests that creators are able to post wildly dangerous claims without consequences, including videos documenting calorie restriction that can lead to malnutrition.

Some videos alluded to crash diets through codewords; others referenced dietary restriction directly in their titles yet still weren’t flagged by YouTube. Another genre was “inspiration” videos, which took three main forms:

  • Thinspo, which shows sometimes emaciated bodies as aspirational;
  • Meanspo, where viewers are bullied and shamed to encourage weight loss; and
  • Fatspo, which shows fat bodies to provoke disgust and mocks the person pictured.

Automated flagging, which may miss subtler allusions to eating disorders, accounts for 97 per cent of videos removed from YouTube. Only 20 per cent of the videos the CCDH reported were removed. The UK’s Online Safety Bill, which comes into effect next year, will put more pressure on companies including YouTube to restrict the promotion of eating disorders, and will create tools giving individual adult users more control over what they see and who they interact with online. But the eating disorder charity Beat has raised concerns that the bill doesn’t address the ability of algorithms to actively promote harmful content.

