The EU's efforts to enhance child safety have hit a roadblock as the ePrivacy derogation comes to an end, an age verification application has been compromised, and the CSA Regulation remains stalled in trilogue discussions.
Summary: Europe’s initiative to safeguard children online is at odds with its own privacy law. The ePrivacy derogation that permitted voluntary scanning for child sexual abuse material (CSAM) expired on April 3, after Parliament voted 311-228 against extension. A new age verification application announced by the EU on April 15 was hacked in under two minutes, and the CSA Regulation (“Chat Control”) remains stalled in trilogue with a July deadline approaching. The European Court of Human Rights (ECHR) has held that encryption backdoors violate fundamental rights, while the GDPR, the DSA, and the proposed CSA Regulation all require platforms to determine whether a user is a child, a determination that means collecting exactly the data privacy law restricts platforms from collecting about children.
On April 3, the European Parliament voted against extending the temporary ePrivacy derogation, which had allowed platforms like Meta, Google, and Microsoft to voluntarily scan private messages for CSAM without breaching EU privacy rules. With the derogation expired, the legal basis for those scans is gone. Twelve days later, the European Commission unveiled a new privacy-focused age verification app intended to protect children online; researchers hacked it in less than two minutes. Between the expired law and the failed app sits the core problem: Europe wants to shield children from online harm, but the tools it builds for that purpose collide with the privacy framework it has spent the past decade constructing. The result is a regulatory paradox in which identifying abused children requires collecting exactly the data EU law prohibits collecting about children.
The scanning gap
The ePrivacy derogation was introduced in 2021 as a temporary measure. The European Commission had proposed the Child Sexual Abuse Regulation—known as the CSA Regulation or Chat Control—which would mandate that platforms detect and report CSAM in private messages, including those that are end-to-end encrypted. The regulation was meant to replace the voluntary framework within three years, but progress has stalled. Negotiations among the Parliament, Council, and Commission have been ongoing since 2022, with the next trilogue meeting scheduled for May 4 and a political agreement targeted for July. In the interim, the derogation expired. The National Center for Missing & Exploited Children (NCMEC) in the U.S., which processes the majority of global CSAM reports, warned that the lapse would produce a measurable drop in referrals from European platforms. Meta confirmed it has suspended voluntary scanning in the EU. The Parliament’s position is that the derogation conflicted with the fundamental right to confidentiality of communications; child safety organizations counter that Parliament has effectively made it legal for platforms to ignore abusive content on their own systems.
The CSA Regulation, as proposed by the Commission, would obligate platforms, upon receiving detection orders from a new EU Centre, to scan communications for known CSAM, new CSAM, and grooming behavior. The Parliament stripped out the most contentious elements: it prohibited scanning of end-to-end encrypted messages, limited detection to known material identified through hash-matching technology, and excluded real-time communications. The Council, pushed by a rotating presidency favoring stronger law enforcement access, wants broader scanning, including unknown material and grooming. The gap between the two positions is a fundamental disagreement over the systematic monitoring of private communications for child protection, and the ECHR has already signaled where it stands.
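The hash-matching approach the Parliament limited detection to can be illustrated with a short sketch. Note this is a simplified illustration, not any platform's actual implementation: real deployments use perceptual hashes such as Microsoft's PhotoDNA, which match near-duplicate images, whereas the cryptographic hash below matches only exact byte-for-byte copies. The hash list and function name are hypothetical.

```python
import hashlib

# Hypothetical list of hashes of known illegal material, as would be
# supplied by a central authority (e.g. the proposed EU Centre).
KNOWN_HASHES = {
    # SHA-256 of the byte string b"foo", standing in for a listed file
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def is_known_material(content: bytes) -> bool:
    """Return True if the content's hash appears on the known-material list."""
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES
```

The sketch also shows the limitation behind the Council's objection: a list of known hashes can only flag exact copies of previously catalogued material, so new or modified content passes through undetected.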
The encryption wall
In February, the ECHR ruled in Podchasov v. Russia that mandating platforms to weaken or provide backdoors to end-to-end encryption breaches Article 8 of the European Convention on Human Rights, which guarantees respect for private life and correspondence. The ruling concerned a Russian law requiring messaging services to hand decryption keys to the FSB, but its reasoning applies directly to the detection orders proposed in the CSA Regulation. If platforms cannot scan encrypted messages without breaking the encryption, and breaking the encryption violates fundamental rights, then the regulation cannot mandate what its drafters intended. Signal’s president, Meredith Whittaker, declared the organization would exit the EU rather than comply with any law that undermines its encryption. Apple disabled its Advanced Data Protection feature for U.K. users after receiving a technical capability notice from the British government demanding backdoor access to iCloud data. The encryption question is no longer theoretical: companies are making jurisdictional choices based on government demands for access to private communications.
Both the European Data Protection Board and the European Data Protection Supervisor have issued opinions finding that the CSA Regulation, as drafted by the Commission, would be disproportionate and incompatible with EU fundamental rights. The EDPS specifically noted that client-side scanning, offered as an alternative to breaking encryption because it inspects content before it is encrypted, still amounts to mass surveillance: it processes every message in order to find the illegal ones. The technical distinction between scanning before and after encryption is real but legally irrelevant if the outcome is automated analysis of all private messages. The Parliament’s negotiating position reflects that understanding; the Council’s does not.
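The EDPS's point about ordering can be made concrete with a minimal sketch of a client-side scanning pipeline. Everything here is hypothetical and simplified: the function names are illustrative, the "encryption" is a toy placeholder, and real proposals use perceptual rather than cryptographic hashes. What the sketch shows is the structure: the encryption itself is never weakened, yet every message's plaintext is analyzed before it.

```python
import hashlib
from typing import Optional

# Hypothetical known-material hash list (see caveats above).
KNOWN_HASHES = {hashlib.sha256(b"known illegal content").hexdigest()}

def scan(plaintext: bytes) -> bool:
    # Runs on the sender's device, on every message, before encryption.
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_HASHES

def encrypt(plaintext: bytes) -> bytes:
    # Toy stand-in for real end-to-end encryption (e.g. the Signal
    # protocol); XOR is NOT secure and is used here only for shape.
    return bytes(b ^ 0x5A for b in plaintext)

def send_message(plaintext: bytes) -> Optional[bytes]:
    # The scan precedes encryption, so the encryption stays intact,
    # but no message escapes analysis -- the EDPS's mass-surveillance point.
    if scan(plaintext):
        return None  # in a real system, a report would be filed here
    return encrypt(plaintext)
```

The design point is that moving the scan from the server to the client changes where the analysis happens, not whether it happens, which is why the EDPS treats the before/after distinction as legally irrelevant.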
The age verification paradox
While the CSA Regulation is delayed, individual member states have moved ahead with age-based restrictions of their own. France prohibits children under 15 from joining social media platforms without parental consent.