Signal app warns it will quit UK if law weakens end-to-end encryption

The head of the messaging app Signal has warned that it will quit the UK if the forthcoming online safety bill weakens end-to-end encryption.

Signal’s president said the organisation would “absolutely, 100% walk” if the legislation undermined its encryption service.

Asked by the BBC if the bill could jeopardise Signal’s ability to operate in the UK, Meredith Whittaker said: “It could, and we would absolutely 100% walk rather than ever undermine the trust that people place in us to provide a truly private means of communication. We have never weakened our privacy promises, and we never would.”

The bill has been criticised by privacy campaigners for a provision allowing Ofcom, the communications watchdog, to order a platform to use certain technologies to identify and take down child sexual exploitation and abuse material. It also requires tech firms to use their “best endeavours” to deploy new technology that identifies and removes such content.

Privacy advocates warn the bill could force encrypted messaging services such as Signal, WhatsApp and Apple’s iMessage to monitor users’ messages and create vulnerabilities in their platforms that could be exploited by rogue actors and governments.

Whittaker told the BBC it was “magical thinking” to believe there can be privacy “but only for the good guys”, adding that the bill was an example of this thinking. She said: “Encryption is either protecting everyone or it is broken for everyone.”

Signal, which has been downloaded more than 100m times on Google’s app store, is operated by a US-based nonprofit organisation and is widely used by activists and journalists, as well as some intelligence services. End-to-end encryption ensures that only the sender and recipient of a message can view its content.
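The guarantee described above can be sketched with a toy Diffie-Hellman key exchange: the sender and recipient each derive the same secret key from public values, so a relay server sees only ciphertext. This is a simplified illustration, not Signal's actual protocol (Signal uses X3DH and the Double Ratchet over elliptic curves); the prime, generator and XOR cipher here are deliberately minimal stand-ins.

```python
# Toy sketch of end-to-end encryption (illustrative only -- Signal's real
# protocol uses X3DH and the Double Ratchet, not this simplified scheme).
# The point: the relay server carries ciphertext but never holds the keys.
import hashlib
import secrets

# Toy public parameters (a real deployment would use a standard group,
# e.g. an RFC 3526 prime or an elliptic curve; these are for illustration).
P = 2**127 - 1  # a Mersenne prime, fine for a demo
G = 3

def keystream(key: bytes, length: int) -> bytes:
    """Expand a shared key into a pseudo-random keystream (toy KDF)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

# Each party keeps a private key and publishes only G**priv mod P.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides derive the same shared secret; an eavesdropping server cannot.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)

key = hashlib.sha256(str(alice_shared).encode()).digest()
msg = b"meet at noon"
ciphertext = bytes(m ^ k for m, k in zip(msg, keystream(key, len(msg))))

# Only the holder of the matching private key recovers the plaintext.
bob_key = hashlib.sha256(str(bob_shared).encode()).digest()
plaintext = bytes(c ^ k for c, k in zip(ciphertext, keystream(bob_key, len(ciphertext))))
```

In this sketch, anyone intercepting `ciphertext` along with both public keys still cannot decrypt without one of the private keys, which is the property campaigners say scanning mandates would undermine.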

Whittaker also criticised a system called client-side scanning, where images are scanned before being encrypted. In 2021 Apple was forced to pause its client-side scanning plans, which would have involved the company scanning user photos before they are uploaded to its image-sharing service.
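Client-side scanning as described above can be sketched as a hash check that runs on the device before encryption. This is a deliberately simplified illustration: real proposals (including Apple's paused one) use perceptual hashes such as NeuralHash rather than exact cryptographic digests, and blind the matching database; the `BLOCKED_DIGESTS` set and function name here are hypothetical.

```python
# Toy sketch of client-side scanning (illustrative only -- real systems use
# perceptual hashing and blinded databases, not exact SHA-256 matching).
import hashlib

# Hypothetical blocklist of known-illegal image digests, shipped to the client.
BLOCKED_DIGESTS = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def scan_before_encrypt(image_bytes: bytes) -> bool:
    """Return True if the image may be sent, False if it matches the blocklist.

    The scan runs on the device, before any encryption is applied -- which is
    why critics argue it sits outside, and therefore weakens, the
    end-to-end guarantee.
    """
    return hashlib.sha256(image_bytes).hexdigest() not in BLOCKED_DIGESTS

ok = scan_before_encrypt(b"holiday photo")          # no match: allowed
blocked = scan_before_encrypt(b"known-bad-image-bytes")  # match: refused
```

The design point in dispute is visible in the sketch: whoever controls the blocklist controls what the device reports or refuses to send, before encryption ever happens.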

Whittaker said such a system would turn everyone’s phone into a “mass surveillance device that phones home to tech corporations and governments and private entities”. She added that technological “back doors” into encrypted services could be hijacked by “malignant state actors” and “create a way for criminals to access these systems”.

Will Cathcart, the head of WhatsApp, told the Financial Times last year that any UK move against encryption would have reverberations around the world.

“If the UK decides that it is OK for a government to get rid of encryption, there are governments all around the world that will do exactly the same thing, where liberal democracy is not as strong,” he said.

A Home Office spokesperson said the online safety bill, which is due to become law this year, does not ban encryption.


“The online safety bill does not represent a ban on end-to-end encryption but makes clear that technological changes should not be implemented in a way that diminishes public safety – especially the safety of children online. It is not a choice between privacy or child safety – we can and we must have both.”

The Home Office also flagged a product developed by a UK cybersecurity company that draws upon a database of images compiled by monitoring organisation the Internet Watch Foundation in order to spot, and then block, illegal material before it is sent. Tom Tugendhat, the security minister, said the SafeToWatch product from the company SafeToNet showed there are “ways to protect children online whilst maintaining privacy”.

Dr Monica Horten, a policy manager at the Open Rights Group, which campaigns for online privacy, said the online safety bill’s provisions “threaten a highly intrusive mandate for mass surveillance”. She added: “If encrypted services are required to comply with this mandate, they will have to compromise their systems and undermine the confidentiality of messages.”

However, the child safety charity the NSPCC said tech platforms had a “responsibility” to invest in technology that tackles abuse online.

“Tech companies should be required to disrupt the abuse that is occurring at record levels on their platforms, including in private messaging and end-to-end encrypted environments,” said Anna Edmundson, the head of policy and public affairs at the NSPCC.


Dan Milmo, global technology editor