The European Commission formally accuses Meta
The European Commission has issued preliminary findings indicating that Meta breached its obligations under the Digital Services Act (DSA) by failing to prevent minors from accessing its platforms. This accusation, previously leveled only at adult content websites, marks the first time it has been applied to a mainstream social media company.
The distinction is significant. In March 2026, the Commission issued similar preliminary findings against four pornographic platforms—Pornhub, Stripchat, XNXX, and XVideos—for allowing minors to access their services simply by clicking a confirmation that they were over 18.
The action against Meta applies the same legal framework to platforms that children not only use but actively register for. Although Meta requires users to be at least 13 years old, its age verification process on Facebook and Instagram relies largely on self-declaration, which independent research has repeatedly shown to be ineffective.
These findings are part of broader formal proceedings initiated by the Commission in May 2024 concerning child protection responsibilities under DSA Articles 28, 34, and 35. The current finding specifically addresses Article 28(1), which mandates platforms to implement appropriate and proportionate measures to ensure high levels of safety, privacy, and security for minors, and to prevent access by children below the applicable national minimum age.
The timing of the announcement is intentional. Just two weeks prior, on April 15, European Commission President Ursula von der Leyen introduced a privacy-preserving EU age verification app based on zero-knowledge proof technology, which enables users to verify their age without revealing personal information to platforms.
Von der Leyen was blunt: “Online platforms can easily utilize our age verification app, so there are no more excuses. We will have zero tolerance for companies that disregard our children’s rights.”
By issuing the Article 28(1) finding against Meta two weeks after the app's launch, the Commission indicates that the argument citing technical infeasibility—namely, that effective age verification cannot be conducted without compromising user privacy—is no longer valid.
The EU provided a solution; Meta did not adopt it; the preliminary finding followed.
However, the app has faced challenges. Security researchers demonstrated it could be circumvented within two minutes of its debut. Still, the Commission's enforcement stance seems unaffected by this setback: the relevant regulatory question is not whether the app is flawless, but whether Meta has implemented any comparably robust alternative. Its current reliance on self-declaration and AI-based age estimation does not seem to meet this standard.
A study by the Interface-EU think tank published in 2025 assessed the sign-up processes of all major platforms utilized by children in the EU, including Instagram. The findings were clear: all examined platforms permitted a simulated 14-year-old to create an account by merely entering a false date of birth.
There was no document verification, no third-party checks, and no barriers beyond a click. Meta's explanation of its strategy, provided to TechCrunch when the formal proceedings commenced in 2024, indicated that it uses self-declared ages combined with AI assessments to identify users who might have lied and allows users to report suspected underage accounts.
The company claimed that internal tests showed it had prevented 96% of teens attempting to change their birthdays from under 18 to over 18 on Instagram. However, the Commission's preliminary findings suggest that these measures fall short of the DSA's standard of “appropriate and proportionate” protection.
The precedent established by the findings against the pornographic platforms is informative. In that instance, the Commission's language was specific: the platforms had allowed minors to access their services “by a simple click confirming they are over 18.” The enforced standard does not require perfection but rather the replacement of easily bypassed self-declaration with a more stringent verification process. By this standard, Facebook and Instagram's current methods appear to fall short.
What are the next steps?
Preliminary findings do not equate to a final non-compliance declaration. Meta now has the opportunity to review the Commission's case file and respond in writing, and it may propose remedies. The European Board for Digital Services will also be consulted concurrently.
If the Commission's views are ultimately upheld and a non-compliance decision is issued, Meta could face a fine of up to 6% of its global annual turnover, which, based on Meta's 2025 revenue, could amount to several billion dollars. The Commission can also impose ongoing periodic penalty payments to compel compliance.
No fixed timeline exists for the conclusion of the proceedings. However, Wednesday's action is part of a larger enforcement push, which also includes preliminary findings regarding addictive design and recommendation systems issued on the same day. The overarching message from Brussels for Meta is clear: the period of negotiated goodwill is coming to an end. The Commission is now presenting formal charges on multiple fronts at once, and the potential fines are no longer merely theoretical.
At the time of publication, Meta had not issued a public response to the preliminary findings.
