Tackling online abuse

Content showing the most extreme online sexual abuse of children has doubled since 2020. What’s being done to stop it?

Most people would rather not see what Amelia has to look at in the darkest corners of the internet. But it’s her job to look.

Amelia, which is not her real name, works for the Internet Watch Foundation, an organisation which takes down images of child sexual abuse online. 

“You are one click away from it. It could just appear on social media, it could be while you are searching for some car parts and all of a sudden something pops up,” she says.

In a new report, the IWF has found that content showing the most extreme abuse, known as Category A, has doubled since 2020.

Amelia explains how she looks for certain details, such as a school uniform or a poster on the wall, to help identify a child. In one instance, the team traced a child who was being coerced into undressing online to a town in Italy, thanks to her school uniform.

Last year the IWF blocked 51,000 websites containing Category A material – more than in any previous year.

Susie Hargreaves, who is head of the IWF, says Covid lockdowns meant children’s lives were increasingly lived online. This exposed them to predators who could trick, coerce and encourage their victims into engaging in sexual activities without the perpetrator being in the same room.

“We used to play what we call whack-a-mole,” she says. “We’d take down one image at a time.” But the technology the IWF uses to detect images of child sexual abuse is improving. Now they can apply a unique digital fingerprint – what they call a ‘hash’ – to any images they find. The IWF holds more than 1.5 million hashes.

Any image uploaded with a ‘hash’ can be quickly recognised and taken down. It also means that technology companies can stop people from uploading, downloading, viewing, sharing or hosting known images and videos showing child sexual abuse.
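The idea can be illustrated with a minimal sketch. The example below uses a simple cryptographic hash to stand in for a fingerprint; the IWF's real system relies on perceptual hashing, which can also match slightly altered copies of an image, and the function names and sample data here are hypothetical.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Illustrative only: a cryptographic hash gives each file a
    # fixed-length fingerprint. Real child-abuse detection systems
    # use perceptual hashes that survive resizing and re-encoding.
    return hashlib.sha256(data).hexdigest()

# Hypothetical block-list of fingerprints of known abuse imagery,
# standing in for the IWF's database of more than 1.5 million hashes.
known_hashes = {fingerprint(b"example-known-image-bytes")}

def should_block(upload: bytes) -> bool:
    # A platform can check each upload against the list and reject
    # a known image before it is ever stored or shared.
    return fingerprint(upload) in known_hashes
```

In practice, a platform would run a check like this at upload time, so a known image is stopped without a human ever having to view it again.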

But the UK is still flooded with millions of pages of Category A abuse. The government’s Online Safety Bill, which promised to bring in stronger protections, has still not been passed after years of delay.