Meta Delays Plan to Encrypt Facebook and Instagram Messages Until 2023

By Chandraveer Mathur

Published 22 Nov 2021

Meta, the parent company of Instagram and Facebook, has delayed plans to encrypt messages on both platforms until 2023 amid warnings from child safety campaigners, who argue that the move would shield abusers and help them evade detection.

Back in August this year, Facebook announced that it would implement end-to-end encryption for chat messages on Facebook and Instagram. The company is now deferring that rollout until 2023. Justifying the move, Antigone Davis, Meta's Global Head of Safety, wrote in the Sunday Telegraph:

“We’re taking our time to get this right, and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023.”

“As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”

Meanwhile, the National Society for the Prevention of Cruelty to Children (NSPCC) said that private messaging is the “front line of child sexual abuse online.” The charity argues that encryption would only make matters worse because it would prevent law enforcement and the tech platforms themselves from seeing messages and curbing abuse. End-to-end encryption allows only the sender and the intended recipient to view the contents of the messages they exchange.

Meta’s plans to secure conversations between its users have now been pushed back by another year, but the company first outlined its ambitions in this area in 2019. At the time, CEO Mark Zuckerberg reportedly said:

“People expect their private communications to be secure and to only be seen by the people they’ve sent them to – not hackers, criminals, overreaching governments or even the people operating the services they’re using.”

End-to-end encryption has long been a staple feature of WhatsApp, which is also owned by Meta. More recently, the company enabled it for voice and video calls on Messenger.

Arguing in favor of encryption, Davis said Meta would detect abuse using non-encrypted data, account information, and user reports. WhatsApp already uses a similar approach to report child safety incidents to the relevant authorities. She added, “Our recent review of some historic cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted.” However, abuse remains rampant on Meta’s apps, which are used by approximately 2.8 billion people every day. The company identified and reported 21 million instances of child sexual abuse on its platforms globally in 2020, and over 20 million of those reports came from Facebook alone.

Welcoming Meta’s decision to delay its encryption drive, the NSPCC’s head of child safety online policy, Andy Burrows, said, “Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse going undetected on its platforms.” His view echoed UK Home Secretary Priti Patel’s statement from April: “We cannot allow a situation where law enforcement’s ability to tackle abhorrent criminal acts and protect victims is severely hampered.”

Do you think Meta’s encryption plans would make its platforms a safe haven for child sexual abusers? Share your thoughts in the comments below!

[Via The Guardian]