Meta postpones encrypted Facebook and Instagram messages to 2023

The move comes as child safety advocates fear the plans will protect abusers from detection.

The owner of Facebook and Instagram is postponing plans to encrypt user messages until 2023, as child safety advocates fear the proposals will shield abusers from detection.

Mark Zuckerberg’s social media empire has come under pressure to abandon its encryption plans, which UK Home Secretary Priti Patel called “simply unacceptable”.

The National Society for the Prevention of Cruelty to Children (NSPCC) has said private messaging is “the front line of online child sexual abuse” because it blocks law enforcement and tech platforms from seeing messages, by ensuring that only the sender and recipient can view their content – a process known as end-to-end encryption.

The safety chief of Meta, the parent company of Facebook and Instagram, announced that the encryption rollout would now take place in 2023. The company had previously said the change would happen in 2022 at the earliest.

“We are taking our time to get it right and we do not plan to complete the global rollout of end-to-end encryption by default on all of our messaging services until sometime in 2023,” Antigone Davis wrote in the Sunday Telegraph. “As a company that connects billions of people around the world and that has built cutting-edge technology, we are committed to protecting people’s private communications and keeping them safe online.”

Meta already uses end-to-end encryption on its WhatsApp messaging service and had planned to extend it to its Messenger and Instagram apps in 2022. It has already encrypted voice and video calls on Messenger. Announcing the privacy drive in 2019, Zuckerberg said: “People expect their private communications to be secure and only seen by the people they sent them to, not by hackers, criminals, over-ambitious governments or even the people who operate the services they use.”

Meta’s apps are used by 2.8 billion people every day. The tech industry made more than 21 million reports of child sexual abuse identified on its platforms worldwide to the US National Center for Missing and Exploited Children in 2020. More than 20 million of those reports were from Facebook.

Ms Davis said Meta would be able to detect abuse under its encryption plans by using unencrypted data, account information and user reports. A similar approach already allows WhatsApp to make reports to child safety authorities. “Our recent review of some historical cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted,” she said.

Ms Patel has strongly opposed Meta’s plans. “We cannot allow a situation in which the ability of law enforcement to tackle heinous criminal acts and protect victims is severely hampered,” she said in April.

The issue is also of concern to Ofcom, the communications regulator responsible for enforcing the online safety bill, which is expected to come into force around 2023 and will require tech companies to protect children from harmful content and prevent abuse on their platforms. Ofcom chief executive Melanie Dawes told The Times on Saturday that social media companies should ban adults from directly messaging children or face criminal penalties.

Andy Burrows, head of online child safety policy at the NSPCC, praised Meta’s decision. “Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse going undetected on its platforms,” he said.

“But they should only move forward with these measures when they can demonstrate that they have the technology in place that will ensure children will be at no greater risk of abuse.”