
Telegram ignored efforts by child protection organizations before CEO's arrest, groups say

Before Telegram's CEO was arrested in France, the app had earned a reputation for ignoring advocacy groups fighting against child exploitation.

Three of those groups — the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation — told NBC News that their outreach to Telegram about child sexual abuse material (often abbreviated to CSAM) on the platform had been largely ignored.

Pavel Durov, Telegram's co-founder and CEO, remains in the custody of French authorities, who arrested him on Saturday. The messaging and news app is widely used in former Soviet states and has gained popularity among the far right in the United States and among groups banned from other platforms.

The Paris prosecutor, who has not yet filed charges, said on Monday that Durov was arrested as part of an investigation into an unnamed person. The suspected offenses include “complicity” in illegal transactions and the possession and distribution of child sexual abuse material, the prosecutor said in a statement.

Telegram wrote in a statement on X that it complies with European Union laws. It said Durov had “nothing to hide” and that it was “absurd to claim that a platform or its owner is responsible for the misuse of that platform.”

Telegram has long been considered relatively unmoderated and unwilling to cooperate with law enforcement. Durov said in April that the app had 900 million regular users.

John Shehan, senior vice president of NCMEC's Division of Exploited Children and International Engagement, said he was encouraged by France's decision to arrest Durov because Telegram has been such a haven for CSAM.

“Telegram is truly in a class of its own when it comes to its lack of content moderation or even interest in preventing child sexual exploitation on its platform,” he said.

“It is encouraging to see that the French government and the French police are taking measures to potentially stop such activities,” Shehan said.

Telegram's website states that the company never responds to reports of illegal activity in private or group chats, “even when reported by a user.” It also says that unlike other major tech platforms that routinely comply with court orders and search warrants, Telegram has “shared 0 bytes of user data with third parties, including governments.”

NBC News asked Telegram for comment on the groups' claims that their efforts to flag CSAM had been ignored. In a statement, Telegram spokesperson Remi Vaughan did not address those claims but said the platform “actively moderates harmful content on its platform, including child abuse material.”

“Moderators use a combination of proactive monitoring of public parts of the platform, AI tools, and user reports to remove content that violates Telegram's terms of service,” Vaughan said. Telegram maintains a channel that posts daily updates on the number of groups and channels banned for child abuse, claiming that thousands of public groups are banned every day.

The Stanford Internet Observatory, in a report last year on how platforms police CSAM, found that while Telegram says sharing CSAM in public channels violates its rules, it is the only major tech platform whose privacy policy does not explicitly prohibit CSAM or the solicitation of children in private chats.

By law, U.S. platforms are required to work with the NCMEC, which runs the world's largest international coordination center for law enforcement, social media platforms and tipsters to flag confirmed abuse material so it can be quickly removed. Telegram is based in Dubai in the United Arab Emirates, which Durov, who was born in the former Soviet Union, has described as a neutral country that does not subject his platform to any government's demands.

But major tech companies outside the U.S., including TikTok, owned by China's ByteDance, U.K.-based Fenix, which owns OnlyFans, and Canadian conglomerate Aylo, which owns Pornhub, all remove CSAM reported by NCMEC, Shehan said.

Telegram offers an option to encrypt private messages end-to-end, meaning only users, not the platform, can read them. However, while other end-to-end encrypted messaging services such as WhatsApp allow users to report illegal content by forwarding it to the platform, Telegram does not offer this option.

In total, the NCMEC has received 570,000 reports of CSAM on Telegram since the app launched in 2013, Shehan said.

“They have made it very clear to our team that they are not interested. We reach out sporadically, but not very often anymore,” he said. “They do not respond at all.”

A spokesperson for the U.K.-based Internet Watch Foundation, an independent nonprofit that works to curb the spread of CSAM, said the group had made repeated attempts to work with Telegram over the past year, but Telegram had refused to “use any of its services to block, prevent and stop the sharing of child sexual abuse images.”

“There is no excuse,” said the group's deputy executive director, Heidi Kempster. “All platforms have the opportunity to take action now to prevent the spread of child sexual abuse images. We have the tools, we have the data, and any failure to stop the spread of this known content is an active and conscious choice.”

Stephen Sauer, head of Canada's national CSAM hotline at the Canadian Centre for Child Protection, said in an emailed statement that Telegram has not only ignored attempts to flag CSAM, but that abusive material has also become more prevalent on the platform.

“Based on our observations, Telegram's platform is increasingly being used to make CSAM available to offenders. In many cases, we see Telegram links or accounts promoted on web forums and even on mainstream U.S. social media platforms, acting as conduits that drive traffic to illegal Telegram-hosted content,” he said.

“Telegram's moderation practices are completely opaque; we really have no idea how they work. Likewise, we receive no confirmation or feedback from the company on the moderation outcome when we report content to them. More importantly, the platform itself does not appear to be taking sufficient proactive steps to curb the spread of CSAM on its service, despite being known to facilitate the sharing of this type of material,” Sauer said.