The European Union wants to make the online detection of child pornography mandatory

Brussels also proposes the establishment of a European center for the fight against sexual abuse committed against children, an independent agency which will be based in The Hague and will share resources with Europol. In particular, it will be responsible for collecting reports from platforms.

“Our society today is failing to protect children,” said EU Home Affairs Commissioner Ylva Johansson. The Covid-19 pandemic with its lockdowns has compounded the problem, with reports rising 64% between 2020 and 2021, according to Britain’s Internet Watch Foundation (IWF).

She pointed out that in 2021, 85 million sex videos and photos involving minors were reported, according to data from the US National Center for Missing and Exploited Children (NCMEC). “And that’s just the tip of the iceberg,” said the Swedish official.

Up to 95% of reports of this illegal content in 2020 came from Facebook, while “the problem is not limited to a single platform”, notes the Commission.
Until now, internet service and messaging providers have detected this illegal content on a voluntary basis. Brussels now wants to force them to be proactive, under penalty of sanctions.

This legislation, announced as part of a 2020 strategy, falls within the broader framework of the EU’s Digital Services Act (DSA), a regulation aimed at reining in digital platforms that provides for fines of up to 6% of their annual turnover in the event of an infringement.

The new regulation would oblige platforms to assess the risk of their services being used to disseminate child pornography, or to solicit children by paedophiles, and to put preventive measures in place.

Each Member State must designate an independent authority to monitor platforms’ compliance with these obligations; if necessary, that authority may ask a court or an administrative body to issue a detection order, targeted and limited in time.