Telegram has more than one billion users, and has drawn allegations that it has been used by criminals. DADO RUVIC/Reuters

A tip-off from Canada about child sexual abuse material allegedly being shared on the messaging app Telegram has prompted Britain’s online safety watchdog to launch a formal investigation into the platform.

The Manitoba-based Canadian Centre for Child Protection alerted Ofcom, the British online safety regulator, that child-abuse images were allegedly being shared on the app.

Sharing or possessing child sexual abuse material is illegal in Britain, as it is in Canada. But under Britain’s Online Safety Act, providers of “user to user services,” including messaging apps, are required to assess and mitigate the risk of such crimes being perpetrated on their platforms.

In a news release on Tuesday, Ofcom said it had “received evidence from the Centre for Child Protection regarding the alleged presence and sharing of child sexual abuse material on Telegram and carried out our own assessment of the platform.”

“In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content,” the statement said.

Telegram offers a platform for users to exchange messages, share files, hold private or group voice and video calls and organize livestreams with minimal content restrictions. It has more than one billion users, who include dissidents and journalists, but it has drawn allegations that it has been used by criminals.

In a statement Tuesday, Remi Vaughn, a spokesperson for Telegram, said the platform “categorically denies Ofcom’s accusations.”

“Telegram has virtually eliminated the public spread of CSAM [child sexual abuse material] on its platform through world-class detection algorithms and cooperation with NGOs,” the statement said. “We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.”

The Canadian Centre for Child Protection is renowned internationally for its work to stamp out child abuse online. It uses international web crawlers in an initiative called Project Arachnid to identify child-abuse material, including photos, videos and livestreams.

The centre’s analysts review forums and chat groups used by pedophiles.

Lloyd Richardson, director of technology at the centre, said he worries that child exploitation has re-emerged on Telegram despite repeated attempts to warn the company.

“Although not directly related to the information provided to Ofcom, in the last year we have sent thousands of notifications to Telegram related to content and accounts on their service,” he said.

Ofcom also expressed concern Tuesday that two other chat services, which have open chatrooms and private messaging, are being used by predators to groom children.

Ofcom has the power to impose fines of up to £18 million, or 10 per cent of worldwide revenue, on a company found to have breached the law.

Canadian Identity Minister Marc Miller is currently consulting on an online safety act for Canada, and his department has been looking at Britain’s online safety law and how it is being applied.

A previous version of an online harms bill, which did not become law before the last election in Canada, would have forced online platforms to swiftly remove child sexual abuse material, intimate content shared without consent, and posts encouraging a child to self-harm. It also proposed a regulator to enforce the law, as in Britain.

The federal government is expected to include such measures in its forthcoming online harms bill, which could be published as early as June.

Canada’s Heritage Department is also currently consulting with experts on whether action should be taken in the forthcoming bill to regulate use by children of AI chatbots, and to introduce a ban on children under age 16 using social media.
