Meta delays encrypted messages on Facebook and Instagram to 2023


Move comes as child safety campaigners express concern plans could shield abusers from detection

Sun 21 Nov 2021 12.12 GMT

The owner of Facebook and Instagram is delaying plans to encrypt users’ messages until 2023 amid warnings from child safety campaigners that its proposals would shield abusers from detection.

Mark Zuckerberg’s social media empire has been under pressure to abandon its encryption plans, which the UK home secretary, Priti Patel, has described as “simply not acceptable”.

The National Society for the Prevention of Cruelty to Children (NSPCC) has said private messaging is the “frontline of child sexual abuse online” because end-to-end encryption, which ensures that only the sender and recipient can view a message’s content, prevents both law enforcement and tech platforms from seeing what is sent.

The head of safety at Facebook and Instagram’s parent company, Meta, announced that the rollout of encryption would not be completed until 2023. The company had previously said the change would happen in 2022 at the earliest.

“We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023,” Antigone Davis wrote in the Sunday Telegraph. “As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”

Meta already uses end-to-end encryption on its WhatsApp messaging service and had been planning to extend that to its Messenger and Instagram apps in 2022. It has already encrypted voice and video calls on Messenger. Announcing the privacy drive in 2019, Zuckerberg said: “People expect their private communications to be secure and to only be seen by the people they’ve sent them to – not hackers, criminals, over-reaching governments or even the people operating the services they’re using.”

Meta’s apps are used by 2.8 billion people every day. The tech industry made more than 21m referrals of child sexual abuse identified on its platforms globally to the US National Center for Missing and Exploited Children in 2020. More than 20m of those reports were from Facebook.

Davis said Meta would be able to detect abuse under its encryption plans by using non-encrypted data, account information and reports from users. A similar approach has already enabled WhatsApp to make reports to child safety authorities. “Our recent review of some historic cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted,” she said.

Patel has been a vocal opponent of Meta’s plans. “We cannot allow a situation where law enforcement’s ability to tackle abhorrent criminal acts and protect victims is severely hampered,” she said in April.

The issue is also a concern for Ofcom, the communications regulator tasked with enforcing the online safety bill, which is expected to become law around 2023 and imposes a duty of care on tech companies to protect children from harmful content and prevent abuse from occurring on their platforms. Ofcom’s chief executive, Melanie Dawes, told the Times on Saturday that social media companies should ban adults from directly messaging children or face criminal sanctions.

The NSPCC’s head of child safety online policy, Andy Burrows, welcomed Meta’s move. “Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse going undetected on its platforms,” he said.

“But they should only go ahead with these measures when they can demonstrate they have the technology in place that will ensure children will be at no greater risk of abuse.”


Dan Milmo, Global technology editor