More powers to tackle child sexual abuse tabled in Online Safety Bill amendment
Social media companies could be compelled to roll out new technologies to help detect child sexual abuse and exploitation content, under proposals put forward by the Government in an amendment to the Online Safety Bill.
The amendment would give the telecoms regulator Ofcom greater powers to demand that tech companies show they are making reasonable efforts to find and tackle harmful content on their platforms, including by developing and deploying new technology that can both find harmful content and stop it spreading.
Ofcom would have the power under the Bill to impose fines of up to £18 million or 10% of a company’s global annual turnover – whichever is higher.
The Government said the amendment aims to incentivise the development of new technologies to find and prevent the spread of child sexual abuse content that could be deployed even on encrypted platforms and still protect user privacy.
Officials said the amendment is not an attempt to stop the rollout of further end-to-end encryption services – technology the Government said it broadly supports, provided it is implemented with assurances that children and others are protected from harmful material.
Meta, which owns Facebook, WhatsApp and Instagram, has previously announced plans to roll out end-to-end encryption across all its messaging platforms at some point in 2023.
The amendment is the latest put forward for the landmark internet safety laws and will be considered later this month as part of the report stage of the Bill’s passage through Parliament.
Home Secretary Priti Patel said: “Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe.
“Privacy and security are not mutually exclusive – we need both, and we can have both and that is what this amendment delivers.”
The Government said its Safety Tech Challenge Fund – which has awarded five firms at least £85,000 to further develop prototype products capable of detecting child abuse material within encrypted settings – showed it was possible to find solutions.
Culture Secretary Nadine Dorries said tech firms “have a responsibility not to provide safe spaces for horrendous images of child abuse to be shared online”.
“Nor should they blind themselves to these awful crimes happening on their sites,” she said.
The amendment has been backed by child safety campaigners, with NSPCC chief executive Sir Peter Wanless saying it “will strengthen protections around private messaging and ensure companies have a responsibility to build products with child safety in mind”.
“This positive step shows there doesn’t have to be a trade-off between privacy and detecting and disrupting child abuse material and grooming,” he said.
But privacy campaigners raised concerns about the potential scope of the amendment.
Mark Johnson, legal and policy officer at privacy rights group Big Brother Watch, said: “Child safety online is absolutely essential, but would be undermined rather than bolstered if the government subverts social networks into privatised spying agencies.
“Major platforms send hundreds of thousands of reports to child safety bodies every year, and law enforcement agencies already have extraordinary powers to intercept and hack into criminal content.
“However, this Bill does nothing to better resource law enforcement to deal with the volume of child exploitation reports they receive, instead giving data-hungry foreign companies a license to snoop on the personal data of millions.
“The Bill creates powers to compel social media companies to scan millions of innocent people’s private messages.
“The Government’s plans to circumvent end-to-end encryption will lead to mass scale suspicionless surveillance of our private chats and cannot be tolerated in a rights-respecting country.”