The European Commission today unveiled a plan that would require online platforms to detect and report the sharing of child sexual abuse material. To do so, the proposal would force tech companies to actively scan users’ private messages and content for possible images of child sexual abuse. Privacy experts say the regulation would lead to an invasive surveillance regime in Europe.
“We are failing to protect children today,” European Commissioner Ylva Johansson said at a press conference today. She called the plan groundbreaking and said it will make Europe a world leader in the fight against child sexual abuse online.
Under the new proposal, European countries could require companies such as Meta and Apple to implement comprehensive surveillance systems capable of detecting child abuse. These systems would scan users’ messages, photos and videos for possible child abuse material and, in the event of a “hit”, alert the police.
The new plan also proposes to use artificial intelligence (AI) to detect language patterns associated with grooming in conversations.
“This document is the most terrifying thing I’ve ever seen,” says Matthew Green, a cryptography professor at Johns Hopkins University. “It describes the most advanced mass surveillance ever deployed outside of China and the USSR. That is not an exaggeration.”
Jan Penfrat of the digital interest group European Digital Rights is also concerned. “This looks like a disgraceful general surveillance law, totally unfit for a free democracy.”
End-to-end encryption impossible
Privacy experts say the proposal could also seriously threaten end-to-end encryption. Requiring companies to install whatever scanning software the EU deems necessary to detect child abuse makes genuine end-to-end encryption nearly impossible. And given the EU’s influence on digital policy, the same measures could spread to authoritarian states around the world.
“If implemented, the proposal would be a disaster for user privacy, not just in the EU, but around the world. The child abuse argument is once again being used as a lubricant for pushing through these invasive regulations,” said Joe Mullin, senior policy analyst at the Electronic Frontier Foundation.
Digital privacy is a matter of life or death not only in countries like China and Russia; journalists, activists, whistleblowers and critics elsewhere in the world depend on it as well.
Russia is a good example. Although homosexuality was decriminalized in the country shortly after the fall of the Soviet Union, lesbian, gay, bisexual and transgender people there often face government discrimination or violence. Privacy is therefore extremely important to them.
Unreliable spam filters
In addition, there is criticism of the plan to use artificial intelligence to detect language patterns associated with grooming. Doing so would require algorithmic scanners, which experts say are error-prone and could place innocent people under government surveillance.
“You just have to look at how unreliable spam filters are. They’ve been used in our emails for 20 years, but we still get spam. That really shows the limitations of these technologies,” Ella Jakubowska, policy advisor at European Digital Rights, told The Verge.
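The experts’ worry follows from a simple base-rate effect: when the content being searched for is extremely rare, even a very accurate classifier produces far more false alarms than real hits. A back-of-the-envelope sketch, using purely hypothetical numbers that are not from the proposal:

```python
# Illustrative base-rate calculation. All numbers here are
# hypothetical assumptions, not figures from the EU proposal.

messages_scanned = 1_000_000_000   # messages scanned per day (assumed)
abusive_fraction = 1e-6            # assumed fraction that is actually abusive
accuracy = 0.999                   # assumed true-positive and true-negative rate

abusive = messages_scanned * abusive_fraction      # 1,000 abusive messages
innocent = messages_scanned - abusive              # ~999,999,000 innocent ones

true_positives = abusive * accuracy                # correctly flagged
false_positives = innocent * (1 - accuracy)        # innocent but flagged anyway

precision = true_positives / (true_positives + false_positives)
print(f"Flagged messages that are actually abusive: {precision:.2%}")
# → roughly 0.10%: about 999 out of every 1,000 flags point at innocent users
```

Even with 99.9% accuracy in this hypothetical, nearly a thousand innocent messages would be flagged for every abusive one found.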
Apple faced a similar outcry in the past. When the company introduced plans last summer to detect child abuse, it backed down after heavy criticism. At the time, the tech giant said it would scan the messages of users under the age of 17 and warn them if they were about to send or receive “sexual” material.
The company also proposed scanning iCloud content for known child abuse material and alerting the National Center for Missing and Exploited Children in the event of a “hit”. That plan, too, was shelved after fierce opposition from privacy organizations.
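Detecting “known” material generally means comparing a fingerprint (hash) of each image against a database of fingerprints of previously identified abuse imagery. A minimal sketch of the idea, using an ordinary cryptographic hash for simplicity; real systems such as Microsoft’s PhotoDNA or Apple’s NeuralHash use perceptual hashes so that slightly altered copies still match, and the example database below is a made-up placeholder:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex fingerprint of an image's raw bytes (simplified)."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known illegal images.
# In reality such a list would come from an organisation like NCMEC.
known_hashes = {fingerprint(b"bytes of a known illegal image")}

def scan(data: bytes) -> bool:
    """Report a 'hit' if the image matches the known-material database."""
    return fingerprint(data) in known_hashes

print(scan(b"bytes of a known illegal image"))  # True  -> would be reported
print(scan(b"ordinary holiday photo"))          # False -> no report
```

An exact hash like this misses any image changed by even one pixel, which is why deployed systems use fuzzier perceptual matching instead; that fuzziness, in turn, is a source of the false positives critics warn about.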