NSPCC lays out six tests for Government to create world-leading laws to protect children online


The NSPCC has laid out six tests the Government’s regulation of social media will be judged on if it is to achieve bold and lasting protections for children online.

The charity’s How the Wild West Web should be won report, released today, sets out how the upcoming Online Harms Bill must set the global standard in protecting children on the web.

With crucial decisions just days away, the charity is urging the Government to level the playing field for children and ensure new laws finally force tech firms to tackle the avoidable harm caused by their sites.

The call comes as new analysis of the latest ONS data shows the number of online sex crimes against children recorded by Staffordshire Police reached the equivalent of 2.1 a day between January and March this year, highlighting the sheer scale of web abuse.

Across England and Wales, that figure stood at 101 a day. The NSPCC expects this to have increased during lockdown, with coronavirus driving significant online harms to children because of a historic failure to make platforms safe and put even the most basic child protections in place.

The NSPCC has routinely highlighted the growing levels of abuse and harm caused to children on social media platforms, and believes the problem has been exacerbated by the fallout from coronavirus.

At the Hidden Harms summit earlier this year, the Prime Minister signalled his personal determination to legislate for ambitious regulation that successfully combats child abuse.

But the NSPCC is worried the landmark opportunity to change the landscape for children online could be missed if the Government does not translate this ambition into law.

They have released their six tests ahead of a full consultation response to the White Paper, amid concerns Ministers are wavering in their ambitions for robust regulation.

Regulation must:

1. Create an expansive, principles-based duty of care

2. Comprehensively tackle online sexual abuse

3. Put legal but harmful content on an equal footing with illegal material

4. Have robust transparency and investigatory powers

5. Hold industry to account with criminal and financial sanctions

6. Give civil society a legal voice for children with user advocacy arrangements

The charity believes, if done correctly, regulation could set a British model that leads the world in child protection online.

But in a stark warning, NSPCC CEO Peter Wanless, said that “failing to pass any of the six tests will mean that rather than tech companies paying the cost of their inaction, future generations of children will pay with serious harm and sexual abuse that could have been stopped”.

The pandemic is likely to result in long-term changes to the online child abuse threat, with high-risk livestreaming and video chat becoming more popular. Changes to working patterns, meaning more offenders working at home, could result in a greater demand for sexual abuse images and increased opportunities for grooming.

Mr Wanless added: “Industry inaction is fuelling this staggering number of sex crimes against children and the fallout from coronavirus has heightened the risks of abuse now and in the future.

“The Prime Minister has the chance of a lifetime to change this by coming down on the side of children and families, with urgent regulation that is a bold and ambitious UK plan to truly change the landscape of online child protection.

“The Online Harms Bill must become a Government priority, with unwavering determination to take the opportunity to finally end the avoidable, serious harm children face online because of unaccountable tech firms.”

The six tests are backed by Ian Russell, who has campaigned for regulation since the death of his daughter, Molly, by suicide, after she was targeted with self-harm posts on social media.

Mr Russell, who is due to be made an Honorary Member of Council for the NSPCC this week, said: “Today, I can’t help but wonder why it’s taking so long to introduce effective regulation to prevent the type of harmful social media posts we now know Molly saw, and liked, and saved in the months prior to her death.

“Tech self-regulation has failed and, as I know, it’s failed all too often at great personal cost. Now is the time to establish a regulator to protect those online by introducing proportionate legislation with effective financial and criminal sanctions.

“It is a necessary step forward in trying to reclaim the web for the good it can do and curtail the growing list of harms to be found online.”

The six tests the Government must pass if it is to create game-changing and lasting protections for children online are:

• An expansive, principles-based duty of care; tech firms should have a legal responsibility to identify harms caused by their sites and deal with them, or face tough consequences for breaching regulation.

• Tackling online sexual abuse; platforms must proactively and consistently tackle grooming and abuse images facilitated by dangerous design features. There must be no excuses. In the current state of play, abuse images have been left online with the excuse that a child’s age cannot be proven, and images signposting abuse are not removed.

• Tackling legal but harmful content; current Government proposals will see companies set their own rules on legal but harmful content. This is not good enough. The law must compel firms to respond to the harms caused by algorithms targeting damaging suicide and self-harm posts at children, and avoid a two-tier system that prioritises tackling illegal content. The danger of harmful content should rightly be balanced against freedom of expression, but the focus must remain on the risk to children.

• Transparency and investigation powers; tech firms currently only dish out information they want the public to see. The regulator must have the power to lift up the bonnet to investigate platforms and demand information from companies.

• Criminal and financial sanctions; fines are vital but will be water off a duck’s back to some of the world’s wealthiest companies. Government can’t backslide on a named manager scheme that gives the regulator powers to prosecute rogue tech directors in UK law.

• User advocacy arrangements; to level the playing field there must be a strong civil society voice for children against well-resourced industry pressure. Big tech should be made to clean up the damage it has caused by funding user advocacy arrangements.

The NSPCC has been the leading voice for social media regulation and the charity set out detailed proposals for an Online Harms Bill last year, which informed much of the White Paper.

The Government has said the consultation response will be published in the autumn, with legislation expected to be delivered in the new year.