After X and Grok, Ofcom launches a child-safety investigation into Telegram.
The UK's online safety regulator has opened a formal investigation into Telegram under the Online Safety Act, examining whether the messaging service has met its duties to protect UK users from child sexual abuse material. It is Ofcom's most significant enforcement action against a major messaging platform to date.
Ofcom has opened a formal inquiry into Telegram under the Online Safety Act 2023 to assess whether the platform has complied with its legal obligations to protect UK users from child sexual abuse material (CSAM).
As reported by Reuters, this investigation signifies a notable escalation in Ofcom's enforcement of the Act against one of the most widely used messaging services globally, which has long faced scrutiny regarding its handling of illegal content.
The investigation follows Ofcom's established enforcement framework under the Online Safety Act, which requires user-to-user and search services to assess and mitigate the risk of UK users encountering illegal content, including CSAM, and to remove such content swiftly once it is identified.
Ofcom holds the authority to impose fines on companies amounting to the greater of £18 million or 10% of their qualifying global revenue for non-compliance. In instances of serious ongoing non-compliance, Ofcom can seek court intervention for business disruption measures, potentially requiring internet service providers to block the platform in the UK.
The initiation of a formal investigation does not imply a determination of wrongdoing. Under the outlined process of the Act, Ofcom initially collects and analyzes evidence to ascertain whether a breach has taken place. If a compliance failure is identified, a provisional decision is issued to the company, which can then respond fully before a final decision is rendered.
This process usually spans several months. A similar framework applies in Ofcom's ongoing investigation into X, opened in January 2026 after reports that its Grok AI chatbot had been used to generate and distribute explicit images of children.
Telegram's engagement with UK regulators has evolved over time. In December 2024, the platform joined the Internet Watch Foundation (IWF), a UK organization focused on identifying and removing CSAM, and pledged to deploy the IWF's detection tools across public areas of the platform, including hash-matching technology to identify known CSAM and tools to block AI-generated abuse imagery.
Ofcom’s annual review in March 2026 acknowledged that Telegram, alongside X, Discord, and Reddit, had implemented age controls in response to the Online Safety Act.
This new investigation therefore marks a shift: despite that progress, Ofcom has concluded there are sufficient grounds to formally examine whether Telegram's compliance with the Act's specific CSAM-related obligations is adequate.
The core tension in the Telegram case reflects an ongoing debate about the platform. Its structure is divided: public channels and groups are more amenable to external detection tools, while its encrypted private messaging—popular among activists, journalists, and dissidents in authoritarian regimes—limits the possibilities for content moderation.
In response to Telegram's partnership with the IWF in December 2024, the NSPCC highlighted this distinction, applauding the move concerning public content but asserting that "there should be no part of the service where perpetrators can act without detection."
The provisions of the Online Safety Act regarding end-to-end encrypted messaging remain the most contentious aspect of the regime, with Signal previously warning that it would withdraw from the UK if required to monitor private messages.
Ofcom has indicated that it currently does not intend to mandate client-side scanning.
This investigation occurs amid ongoing regulatory pressure on messaging and social media platforms in the UK. Since the Online Safety Act became effective in 2025, Ofcom has opened nearly 100 investigations, issued nearly a dozen fines, and in March 2026, contacted six of the largest platforms—Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube—demanding proof of further child safety enhancements by 30 April.
The Telegram investigation adds a significant messaging platform to an enforcement list that has primarily targeted pornography sites and niche image boards to date.
Telegram did not immediately respond to a request for comment following the Reuters report. Ofcom has stated it will provide updates on the investigation as soon as possible.