The EU's initiative for child safety has hit a roadblock as the ePrivacy derogation has lapsed, an age verification application was compromised, and the CSA Regulation is stalled in trilogue discussions.
Summary: Europe’s initiative to safeguard children online has collided with its own privacy framework. The ePrivacy derogation, which permitted voluntary scanning for child sexual abuse material (CSAM), lapsed on April 3 after the Parliament voted 311-228 against extending it. The EU's newly announced age verification application was then compromised within two minutes of its introduction on April 15. Meanwhile, the CSA Regulation (“Chat Control”) remains unresolved in trilogue negotiations, with a deadline set for July. The European Court of Human Rights has ruled that encryption backdoors infringe fundamental rights, and the GDPR, the DSA, and the proposed CSA Regulation all require knowing whether a user is a child, which in turn requires collecting precisely the data that privacy law prohibits collecting about children.
On April 3, the European Parliament voted 311 to 228 to let the temporary ePrivacy derogation expire. The derogation had enabled platforms like Meta, Google, and Microsoft to voluntarily scan private messages for CSAM without breaching EU privacy law. With its expiration, the legal basis for such scans vanished. Just twelve days later, the European Commission unveiled a new privacy-centric age verification app to protect children online, which security researchers hacked almost instantly. The sequence captures Europe's dilemma: it aims to shield children from online exploitation, yet every tool built for that purpose collides with the privacy architecture the EU has constructed over the past decade. The result is a regulatory system at war with itself, in which the mechanisms needed to identify exploited children require collecting data that EU law prohibits.
The scanning void
The ePrivacy derogation was introduced in 2021 as a temporary solution. The European Commission had proposed the CSA Regulation, informally known as Chat Control, which would obligate platforms to detect and report CSAM in private messages, including those that are end-to-end encrypted. That regulation was intended to replace the voluntary framework within three years, but it has not progressed as expected. Trilogue negotiations among the Parliament, Council, and Commission have stalled since 2022, with a meeting scheduled for May 4 and a goal of reaching an agreement by July. In the meantime, the derogation has expired. The National Center for Missing & Exploited Children (NCMEC) in the U.S., which processes a significant portion of global CSAM reports, has warned that the lapse will sharply reduce referrals from European platforms. Meta has confirmed it has halted voluntary scanning in the EU. The Parliament contends that the derogation conflicted with the fundamental right to privacy of communications. Child safety advocates counter that the Parliament has made it lawful for platforms to ignore abuse material circulating on their own systems.
The Commission's proposed CSA Regulation would require platforms to comply with detection orders issued by a new EU Centre to scan messages for known CSAM, new CSAM, and grooming behavior. The Parliament stripped out the most contentious elements: it rejected scanning of end-to-end encrypted messages, restricted detection to known material via hash-matching technology, and exempted real-time communications. The Council, however, has pushed for broader scanning powers, including for unknown material and grooming. The gap between these positions is not a drafting detail but a fundamental dispute over whether private communications can be systematically surveilled to protect children, and the European Court of Human Rights has already signaled its stance.
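To make the Parliament's "known material via hash-matching" position concrete, here is a minimal sketch of the underlying idea: a platform compares a hash of uploaded content against a list of hashes of already-catalogued material. The hash list and sample bytes below are purely illustrative; real deployments use perceptual hashes (such as Microsoft's PhotoDNA) so that resized or re-encoded copies still match, whereas the cryptographic hash shown here matches only byte-identical files.

```python
import hashlib

# Illustrative stand-in for a file already catalogued by a clearinghouse.
SAMPLE_KNOWN_FILE = b"bytes of a previously catalogued file"

# Hypothetical hash list, as distributed to platforms by a body such as
# NCMEC or the proposed EU Centre (value derived from the sample above).
KNOWN_HASHES = {hashlib.sha256(SAMPLE_KNOWN_FILE).hexdigest()}

def matches_known_material(data: bytes) -> bool:
    """Return True only for byte-identical copies of catalogued files.

    This is why hash-matching is limited to *known* material: content
    that has never been catalogued produces a hash that appears in no
    list, so nothing new can be detected this way.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(matches_known_material(SAMPLE_KNOWN_FILE))            # True
print(matches_known_material(SAMPLE_KNOWN_FILE + b"\x00"))  # False: one byte changed
```

The one-byte-change example is the crux of the policy dispute: exact matching is narrow by design, which is precisely why the Council wants detection of unknown material and grooming, and why the Parliament considers that step a leap into general surveillance.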
The encryption barrier
In February, the European Court of Human Rights ruled in Podchasov v. Russia that requiring platforms to weaken or create backdoors in end-to-end encryption breaches Article 8 of the European Convention on Human Rights, which protects the right to respect for private life and correspondence. The ruling concerned a Russian law compelling messaging services to hand decryption keys to authorities, but its logic applies directly to the CSA Regulation’s proposed detection orders. If a platform cannot scan encrypted messages without undermining encryption, and undermining encryption breaches fundamental rights, then the regulation cannot require what it intends. Signal’s president, Meredith Whittaker, has said the organization would exit the EU rather than comply with any law that requires compromising its encryption. Similarly, Apple disabled its Advanced Data Protection feature for UK users after the British government demanded backdoor access to iCloud data. The encryption debate has become concretely practical: companies are already making jurisdictional decisions based on government demands for access to private communications.
Both the European Data Protection Board and the European Data Protection Supervisor have warned that the CSA Regulation, as currently drafted by the Commission, would be disproportionate and incompatible with EU fundamental rights. The EDPS specifically highlighted that client-side scanning—the proposed alternative to breaking encryption by scanning content on devices before encryption—still constitutes mass surveillance because it involves processing every message to identify the illegal ones. Although the distinction between pre-encryption scanning and post-encryption scanning is technically relevant, it bears no legal significance if every private message is analyzed by an automated system. The Parliament's negotiating position reflects this perspective, while the Council's does not.
