Create watchdog to protect children online, charity says
The NSPCC says the watchdog should be created to function as an early warning system to alert Ofcom.
Upcoming online safety laws should include provisions to create a new watchdog to advocate for children and protect them online, a charity has said.
The NSPCC has called for the Online Safety Bill to be amended to create a new statutory watchdog that could advocate for children in the same way Citizens Advice does for the rights of consumers.
The children’s charity said the current Bill – which is making its way through Parliament – would still leave children at risk of sexual abuse online.
It says the watchdog should function as an early warning system, alerting Ofcom, the new regulator for the sector, to threats to children as they emerge and technology evolves.
The watchdog could also be used to offer a counterbalance to any lobbying done by tech giants in an attempt to influence the regulator, the NSPCC said.
“The low priority tech firms place on reacting rapidly to protecting children relative to other business imperatives won’t end with regulation, which is why it is so important to have a watchdog to stand up for children at risk of abuse on their platforms,” NSPCC chief executive Sir Peter Wanless said.
“Access to dangers faced by children in real-time will equip industry and the regulator with the information they need to respond quickly.
“Other regulated sectors have bodies that promote the interests of users and ministers have the opportunity to ensure that children are given that voice in the online space.
“The landmark Online Safety Bill will compel companies to finally address the way their sites put children in harm’s way and its effectiveness can be bolstered by a watchdog that ensures children are at the heart of regulation for generations to come.”
The charity’s proposals have been backed by a number of fellow online safety campaigners, including Ian Russell, whose daughter Molly took her own life after viewing harmful content on social media.
Mr Russell, who has since set up an online safety foundation in his daughter’s name, said: “I know how isolating it can feel when speaking up about harmful online content; a small voice crying into a violent storm.
“From my experience, I understand how beneficial it would be to formalise a system to amplify calls for change.
“Co-ordinated user advocacy would ease the burden placed on those directly affected by harmful content and help to bring about the prompt change required to create a safer online world.
“The creation of a statutory watchdog would also help raise an early alarm when future harms are first encountered.
“So, I support the NSPCC’s call for the Online Safety Bill to create a statutory watchdog to advocate for children and help ensure their safety is at the heart of the new regulation.”
The NSPCC said it had also seen public support for the idea of a watchdog, citing new research it has published which found that 88% of people think such a body is necessary.
A Department for Digital, Culture, Media and Sport spokesperson said: “Our comprehensive online safety laws are built to protect young people and force social media platforms to take tough action against harmful content. We’re making sure organisations can raise serious concerns about platforms and giving Ofcom robust powers to hold tech firms to account.”