Following X and Grok, Ofcom has launched an investigation into child safety on Telegram.

The UK's online safety regulator has initiated a formal investigation into Telegram under the Online Safety Act, specifically looking into whether the messaging platform has fulfilled its responsibilities to safeguard UK users from child sexual abuse material (CSAM). This marks Ofcom's most significant enforcement action against a major messaging platform so far.

The investigation, first reported by Reuters, represents a notable escalation in Ofcom's enforcement posture toward one of the world's most widely used messaging services, which has faced longstanding criticism over its handling of illegal content.

The investigation follows Ofcom's established enforcement framework under the Online Safety Act, which requires user-to-user and search services to assess and mitigate the risk of UK users encountering illegal content, including CSAM, and to remove such content swiftly once identified. For non-compliance, Ofcom can impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher; for serious ongoing breaches, it can apply to the courts for business disruption measures, which may include requiring internet service providers to block the platform in the UK.

Opening a formal investigation does not imply a finding of wrongdoing. Under the Act's procedure, Ofcom first gathers and examines evidence to determine whether a breach has occurred. If it finds a compliance failure, it issues a provisional decision to the company, which then has an opportunity to respond before a final decision is reached. The process typically takes several months. The same framework applies in Ofcom's ongoing investigation into X, opened in January 2026 after reports that its Grok AI chatbot had been used to generate and disseminate sexually explicit images of children.

Telegram’s relationship with UK regulators has been evolving. In December 2024, the platform joined the Internet Watch Foundation (IWF), a UK organization dedicated to identifying and removing CSAM, and pledged to implement the IWF’s detection tools across public areas of the platform, including hash-matching technology for known CSAM and tools aimed at blocking AI-generated abuse imagery.

Ofcom's annual review in March 2026 acknowledged that Telegram, along with X, Discord, and Reddit, had established age controls in response to the Online Safety Act. This new investigation therefore signals a shift: despite that prior progress, Ofcom has determined there are sufficient grounds to formally examine whether Telegram has met its specific duties regarding CSAM under the Act.

The core tension in the Telegram case reflects long-standing debates about the platform. Its structure includes public channels and groups that are easier for external detection tools to access, while the platform's encrypted private messaging—key to its popularity among activists, journalists, and dissidents in repressive regimes—limits the extent of possible content moderation.

The NSPCC, responding to Telegram's IWF partnership in December 2024, acknowledged this distinction, welcoming the progress on public content while stressing that “there should be no part of the service where perpetrators can act without detection.” The Online Safety Act's provisions on end-to-end encrypted messaging are among its most contested, with Signal previously warning it might withdraw from the UK if required to scan private messages. Ofcom has indicated it does not currently intend to require client-side scanning.

This investigation comes amid heightened regulatory scrutiny of messaging and social media platforms in the UK. Since the Online Safety Act came into effect in 2025, Ofcom has launched nearly 100 investigations, imposed close to a dozen fines, and, as of March 2026, directly engaged with six major platforms—Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube—requesting evidence of additional child safety enhancements by 30 April. The inquiry into Telegram adds a significant messaging platform to an enforcement list that has so far focused predominantly on pornography sites and specialist image boards.

Telegram did not immediately reply to a request for comment following the Reuters report. Ofcom has stated that it will provide updates on the investigation as soon as possible.
