A report from the Center for Countering Digital Hate (CCDH) demonstrates how YouTube’s algorithm can recommend harmful eating disorder content.
In its study, once an account posing as a fictional 13-year-old girl interacted with a video about eating disorders, two-thirds of the videos subsequently recommended concerned eating disorders or weight loss, and half of those were harmful by YouTube’s own standards.
These videos averaged 344,000 views each, with no way to know how many viewers were children. YouTube’s policy states it does not allow promotion of eating disorders, but the company has previously acknowledged that videos about recovery journeys can be helpful to some viewers and triggering to others.
Removing harm without censoring help is difficult. But the report suggests that creators can post wildly dangerous claims without consequence, including videos documenting calorie restriction, which can lead to malnutrition.
Some videos alluded to crash diets through codewords, while others referenced dietary restriction directly in the title yet still weren’t flagged by YouTube. Another genre was “inspiration” videos, which took three main forms:
Automated flagging, which may miss subtler allusions to eating disorders, accounts for 97 per cent of videos removed from YouTube. Only 20 per cent of the videos the CCDH reported were removed. The UK’s Online Safety Bill, which comes into effect next year, will put more pressure on companies including YouTube to restrict the promotion of eating disorders, and to create tools that give individual adult users more control over what they see and who they interact with online. But the eating disorder charity Beat has raised concerns that this doesn’t address the ability of algorithms to actively promote harmful content.