Update: The vote on the bill is now expected to be delayed until the fall – see end for more details.
A proposed new CSAM law in the UK could force all messaging companies to use the type of client-side scanning approach that Apple planned to launch to detect child sexual abuse material (CSAM) on iPhones.
The Online Safety Bill
The Online Safety Bill (OSB) is something of a hotchpotch of measures intended to tackle “harmful” user-generated content on any service that allows users to post text or upload media. It was, of course, sold as being targeted at terrorist materials and CSAM.
The Bill introduces new rules for firms which host user-generated content, i.e. those which allow users to post their own content online or interact with each other, and for search engines, which will have tailored duties focussed on minimising the presentation of harmful search results to users.
Those platforms which fail to protect people will need to answer to the regulator, and could face fines of up to ten per cent of their revenues or, in the most serious cases, being blocked.
All platforms in scope will need to tackle and remove illegal material online, particularly material relating to terrorism and child sexual exploitation and abuse.
Platforms likely to be accessed by children will also have a duty to protect young people using their services from legal but harmful material such as self-harm or eating disorder content. Additionally, providers who publish or place pornographic content on their services will be required to prevent children from accessing that content.
The largest, highest-risk platforms will have to address named categories of legal but harmful material accessed by adults, likely to include issues such as abuse, harassment, or exposure to content encouraging self-harm or eating disorders. They will need to make clear in their terms and conditions what is and is not acceptable on their site, and enforce this.
A whole raft of changes has since been made, expanding both the scope and the powers of the law. One of the scariest changes is that the government could – after the law has been passed – change the definition of “harmful” content.
CSAM law proposal
The Guardian reports that a new amendment has been put forward, which would create an obligation to detect CSAM even in end-to-end encrypted messages. (Note: In the UK, the term CSAE is used instead of CSAM – child sexual abuse and exploitation content.)
Heavily encrypted messaging services such as WhatsApp could be required to adopt cutting-edge technology to spot child sexual abuse material or face the threat of significant fines, under new changes to UK digital safety legislation.
The amendment to the online safety bill would require tech firms to use their “best endeavours” to deploy new technology that identifies and removes child sexual abuse and exploitation content (CSAE).
It comes as Mark Zuckerberg’s Facebook Messenger and Instagram apps prepare to introduce end-to-end encryption, amid strong opposition from the UK government, which has described the plans as “not acceptable”.
Priti Patel, a longstanding critic of Zuckerberg’s plans, said the change in the law balanced the need to protect children while providing privacy for online users.
Specifically, the change would prevent messaging companies from simply shrugging and saying they have no way to see the content of E2E encrypted messages, and create an obligation for them to develop new ways to do so.
The only* technical way to do this would be to perform client-side scanning, either before encryption on the sender’s device, or after decryption on the recipient’s device. This was, of course, the approach Apple planned to take when it announced plans for CSAM scanning of photos. (*Another approach that has been suggested is the so-called “ghost proposal,” but I would argue that this breaks the definition of E2E encrypted messaging.)
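To make the idea concrete, here is a deliberately simplified sketch of what on-device scanning before encryption might look like. All names here are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashes (such as Apple’s NeuralHash) that real systems use so that near-duplicate images still match; the actual proposals also involve threshold schemes and server-side verification that this sketch omits.

```python
import hashlib

# Hypothetical on-device blocklist of known-image digests. Real systems use
# perceptual hashes, not cryptographic ones; "0" * 64 is just a placeholder.
KNOWN_HASHES = {"0" * 64}

def scan_before_encrypt(image_bytes: bytes) -> bool:
    """Client-side check run on the sender's device, before E2E encryption.

    Returns True if the image matches the blocklist (should be flagged),
    False if it may be encrypted and sent as normal.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

def send_message(image_bytes: bytes, encrypt_and_send) -> str:
    """Flag matching content; otherwise hand off to the normal E2E pipeline."""
    if scan_before_encrypt(image_bytes):
        return "flagged"           # matched a known hash; withhold/report
    encrypt_and_send(image_bytes)  # server never sees plaintext on this path
    return "sent"
```

The key point the sketch illustrates is that the match happens on the user’s device against plaintext, so the E2E encryption itself is untouched, which is exactly why critics worry the same mechanism could be pointed at any government-supplied hash list.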
Apple was forced to suspend its plans after concerns were raised about the potential for governments to abuse the technology. You can read a summary of the controversy here, and a potential solution here.
9to5Mac’s Take on the proposed CSAM law
The current British government – and most notably, its Home Secretary Priti Patel – has form for trying to block the use of end-to-end encryption. Indeed, as The Guardian noted, the whole issue blew up when Meta announced plans to adopt E2E encryption for Facebook Messenger and Instagram (WhatsApp already uses E2E encryption).
Given the technical illiteracy of the government, the smart money is on this amendment being just another attempt to make E2E encrypted messaging illegal, with ministers not even realizing that client-side scanning is another option.
Either way, though, it will bring client-side CSAM scanning back into the spotlight, and apply renewed pressure on Apple to make its own position clear. The iPhone maker has so far said nothing since promising to come up with further privacy improvements, seemingly hoping that it might be able to just keep its head down and wait for the fuss to go away. This amendment, if passed, would make it impossible to maintain its silence.
Update: Bill expected to be delayed until the fall
Politico reports that the vote on the bill – originally scheduled for next week – is expected to be delayed until the fall, as one of the knock-on effects of the forced resignation of Prime Minister Boris Johnson.
Progress on Britain’s proposed new content regulation law is expected to be delayed until the fall, amid the fallout from Boris Johnson’s resignation as Conservative Party leader.
The Online Safety Bill, which ministers had hoped to move through the House of Commons before MPs go on their summer break on July 21, is expected to be dropped from the parliamentary schedule next week, according to a Department for Digital, Culture, Media and Sport official […]
If the bill is dropped it means it will not be allotted parliamentary time before Johnson leaves Downing Street on September 6, the day after the U.K. parliament returns.
We’ll find out for sure later today, when next week’s parliamentary schedule is formally announced.