
More children under 10 being targeted by paedophiles online, says watchdog

The Internet Watch Foundation said almost two in five webpages containing abuse imagery featured children under 10.

Margaret Davis
Tuesday 16 January 2024 19:01 EST
Data from the Internet Watch Foundation show more self-generated images of child abuse are appearing online, as are images of children under 10 (Dominic Lipinski/PA)


Children under the age of 10 are increasingly being targeted by paedophiles online, figures from an internet watchdog suggest.

The Internet Watch Foundation (IWF) received 392,660 reports of possible abuse images on websites in 2023, of which 275,655 were found to contain such material.

Of those, 107,615 (39%) were found to contain images of children under 10, up from 64,735 pages out of 255,580 (25%) in 2022.

Each of the pages can contain hundreds or thousands of images of abuse.

A total of 254,070 (92%) of the pages reported in 2023 featured what are classed as self-generated pictures or videos, in which children are tricked or forced into taking obscene images of themselves.

About a fifth of the self-generated material was in the worst category, the IWF said.

In the previous year, 199,360 of the 255,580 pages found to contain child abuse material featured self-generated images (78%), of which 32,871, or 16%, were in the worst category.

Chief executive of the IWF, Susie Hargreaves, said: “The imagery extorted or coerced from primary school-aged children is now finding its way onto the most extreme, dedicated child sexual abuse sites in shocking numbers.

“What starts in a child’s bedroom, over a webcam, is shared, traded, and harvested by committed and determined sexual predators.

“The IWF is seeing the results in unprecedented numbers. These criminals are ruthless.”

She also hit out at plans by Meta to bring in end-to-end encryption on Facebook Messenger.

In 2022, the tech giant passed more than 20 million reports of users potentially sharing images of children being sexually abused to a US helpline.

Ms Hargreaves added: “It is incomprehensible that Meta is deciding to look the other way and offer criminals a free pass to further share and spread abuse imagery in private and undetected.

“Decisions like this, as well as Apple opting to drop plans for client-side scanning to detect the sharing of abuse, are baffling given the context of the spread of this imagery on the wider web.

“Children are falling victim like never before.

“There really has never been a worse time to be a child on the internet, and Meta’s decision to bring in end-to-end encryption appears to be wilfully making this worse, not better, ignoring the evidence, and rewarding criminals with safe spaces at children’s expense.”

Apple has been approached for comment.

Security Minister Tom Tugendhat said: “This alarming report clearly shows that online child sexual abuse is on the rise, and the victims are only getting younger.

“And yet, despite warnings from across government, charities, law enforcement and our international partners, Meta have taken the extraordinary decision to turn their backs on these victims, and provide a safe space for heinous predators.

“The decision to roll out end-to-end encryption on Facebook Messenger without the necessary safety features, will have a catastrophic impact on law enforcement’s ability to bring perpetrators to justice.

“It isn’t too late to work with us to keep children safe online.

“As Meta begins to implement default end-to-end encryption in the UK, they can and must ensure that robust safeguards are implemented at a time when children are at a greater risk online than ever before.”

Sir Peter Wanless, NSPCC chief executive, said: “It is alarming that the amount of child sexual abuse material online has continued to rise and is particularly worrying that increasing numbers of primary school-aged children are being targeted, groomed and coerced into sharing images of their own abuse.

“Amidst such unprecedented levels of child abuse, tech firms should be preparing for the Online Safety Act to come into force by addressing the way their sites put children at risk of sexual abuse.

“Instead, Meta are rushing out end-to-end encryption on its messaging services in the UK which will blindfold themselves and law enforcement to the child sexual abuse taking place on their platforms and the young victims who desperately need help.

“Meta’s executives can still pause the rollout of end-to-end encryption to allow regulators and experts to scrutinise their risk assessments and work with the company to ensure they do not undermine efforts to identify and disrupt child abuse.”

A Meta spokesperson said: “Encryption helps keep people, including children, safe from hackers, scammers and criminals.

“We don’t think people want us reading their private messages so have spent years developing robust safety measures to prevent, detect and combat child abuse while maintaining online security.

“Our recently published report detailed these measures, such as restricting over 19s from messaging teens who don’t follow them and using technology to identify and take action against malicious behaviour.

“We routinely provide more reports to the National Center for Missing & Exploited Children than others, and given our ongoing investments, we expect that to continue.”

Images of child sexual abuse online can be reported anonymously to the IWF at iwf.org.uk/report.
