The Independent view

Ofcom must do more to protect children from online harm – but parents should do their part too

Editorial: While more robust regulation would certainly be welcome in our new tech-oriented world, responsibility for keeping children safe ultimately lies with their guardians and teachers

Wednesday 08 May 2024 15:30 EDT
The time will soon come, if it has not already, when we realise that laws, regulators and codes can only ever do so much (Getty/iStock)

It is now six months since the elephantine gestation of the Online Safety Act was completed and it became law. It is fair to say that the internet hasn’t changed much in that time.

Children are still being exposed to harmful and damaging material. WhatsApp groups are still full of juveniles who shouldn’t be on the platform in the first place, and whose activities are encrypted, beyond the reach of the law and capable of being used for the vilest of purposes. Paedophiles and fraudsters are, so far as can be judged, as active as ever. The X platform, formerly Twitter, carries more misinformation, conspiracy theories, racism and misogyny than ever.

Lives are still being blighted by online exploitation and bullying; people, including children, are still dying by suicide; and the transformation of youth from healthy, family-oriented outdoor adventure to indoor, solitary, sedentary game-playing continues.

We are, it must be remembered, still at the early stages of a world dominated by social media – not to mention the very dawn of artificial intelligence. The Online Safety Bill, as was, can trace its origins back to 2014 and is already out of date, left behind by new technologies. Britain’s legislators have not even attempted to regulate AI, even as a general election approaches and ingenious, authentic-looking and sounding fakery has already further eroded trust in the political process.

So what hope is there that the promised Ofcom “codes of practice” will, as it is sometimes phrased, “tame the algorithms”? It has to be said that Ofcom’s recent track record does not inspire much confidence. The regulator has blatantly failed in its duty to uphold due impartiality in broadcasting by permitting GB News to broadcast propaganda for the Tories, Reform UK and other fringe groups – up to and including serving Conservative politicians “interviewing” each other. If controversial outlets such as GB News are allowed by Ofcom to break the rules and inflict harm on British society, why should this regulator be trusted to try to fight online extremism?

Ofcom, on behalf of parliament, wants the tech giants – some of the most powerful and wealthiest organisations on the planet – to introduce age verification and “reformulate” their algorithms, such that minors will be shielded from “toxic” material.

And if they don’t? Well, unlike broadcasting, where Ofcom has the ultimate sanction of cancelling a licence and putting a channel out of business, the penalties available to the regulator are, in practice, limited.

Unlike in China, it would be unthinkable, and probably impractical, for Ofcom to ban the likes of Google from the UK. It can levy fines, but they would be unlikely to deter the tech giants; and if they were large enough to hurt, the likes of Meta would either exit the British scene or allow users to find ways of circumventing regulation. Extracting 10 per cent of global revenue from Elon Musk or Mark Zuckerberg would perhaps prove beyond the ability of even a regulator more vigorous than Ofcom; the signs are that Ofcom wouldn’t even try.

Under its present leader, Melanie Dawes, there can be little hope of world-leading, effective activism. Dame Melanie says: “We will be publishing league tables so that the public knows which companies are implementing the changes and which ones are not.” Mr Musk, TikTok and the coming masters of the AI universe, we may be assured, won’t be swayed by her stern words.

If all that sounds like a counsel of despair, it is not meant to. Bereaved parents are right to want the authorities to strengthen the legislation and keep up with new developments. But the list of obligations at least theoretically placed on the industry, and for the regulators to oversee, is already awesome.

The law is designed to bring the following areas under control: child sexual abuse; controlling or coercive behaviour; extreme sexual violence; illegal immigration and people smuggling; promoting or facilitating suicide; promoting self-harm; animal cruelty; selling illegal drugs or weapons; terrorism; cyberflashing and sending unsolicited sexual imagery online; and “deepfake” pornography. It is right to outlaw such activities; but also, surely, sensible to think about how the authorities and the companies can avoid being overwhelmed by the scale of the tasks.

In terms of regulation, that requires a skilful mix of engagement with tech and a determination to act swiftly and decisively when required.

The European Commission, albeit a heftier opponent than Ofcom, has shown a greater willingness to stand up to the tech oligopolists and there is no reason why a constructive, innovative and realistic British watchdog with a different leader couldn’t work with the big companies in the interests of all concerned.

Yet – and this is true anywhere in the world – protecting children from harm cannot ever be the sole responsibility of regulators and the platforms. Parents and teachers are the other primary players in this continuing struggle – and, in reality, the only ones with the presence and power to limit harm. The tech companies can, and do, provide tools to help filter online activity, but the decision to give a child a smartphone or tablet, and to control their use of it, still lies in the hands of parents, other guardians and schools.

The time will come, if it has not already, when we realise that laws, regulators and codes can only ever do so much. Controlling harmful content at the “supply” end will always be a cat-and-mouse game, with the young user continually devising new ways to evade age verification and explore beyond “approved” content – it is generally what children do. Encryption allows them to do so undetected and encryption cannot be “uninvented”.

Only the physical denial of access – making sure that online activity is restricted to older children, and even then is as supervised as possible – can minimise juvenile exposure to harmful material and unscrupulous people. There is surely a strong case for schools to be smartphone-free zones, driving out the bullying and the hate as far as possible, and for teaching staff to enjoy the necessary powers to confiscate devices.

But even in those circumstances, and even if there were perfect control of online content, the sheer amount of time the young spend on the internet can be as bad for their physical health as for their mental health. It is up to parents and teachers to be much bolder in constraining the virtual lives of those teenagers and younger children in their care. With the very best will in the world, they cannot expect to subcontract all of that work to ministers, Ofcom and the industry.

Parenthood has never been easy – and it is going to get more difficult.
