
A schoolgirl uses her phone as she walks with a group of kids in Sydney. In December, Australia banned social media for children under 16, prompting other countries to consider similar measures. Rick Rycroft/The Associated Press

Federal officials have drawn up plans to include a ban on social media for children under the age of 14 in the government’s coming online harms bill, part of a suite of possible measures aimed at protecting young people in the digital space, three sources told The Globe and Mail.

The proposal follows a social-media ban for young people under 16 that took effect in Australia in December, a move that has prompted other countries, including Canada and Britain, to consider following the Australian example.

In Canada, there is currently a ban on social-media use by children under the age of 13, though many children circumvent it by pretending they are older.

The proposal to raise the cutoff age to 14 would first need cabinet approval. Ministers are expected to consider the measure as early as next month, according to two of the sources.

They said there have also been discussions between civil servants in government departments in recent weeks about whether a new regulator would be required to police the ban.

The Globe is not naming the sources, who were not authorized to speak publicly about the proposals.


Millions of young people have left major social-media platforms – such as X, TikTok, Instagram and Snapchat – in Australia in the weeks since the under-16 ban came into force there.

The new online harms bill, which replaces a previous bill that was tabled in 2024 but died when Parliament was dissolved ahead of the last federal election, is expected to be introduced within months.

The government is also looking at new protections that would shield youth under 18 from targeted marketing, the sources said. Those measures would form part of a separate forthcoming bill updating privacy legislation, to be introduced by AI Minister Evan Solomon.

Child-safety advocates have told the government that they are concerned that a lack of controls online is making children vulnerable to sexual exploitation, targeted grooming and scams.

Data released this week by the Canadian Centre for Child Protection found that online violence primarily targeting girls on social media is trending upward.

From June, 2022, through the end of December, 2025, the centre received 127 reports of extreme violence online, the majority in the past 12 months. These included aggressive coercive tactics, such as threats to distribute intimate images in order to force teenage victims to engage in dangerous behaviours such as self-harm.

Taylor Owen, Beaverbrook Chair in Media, Ethics and Communications at McGill University, said restricting social-media use among teens has widespread public support.

But he said that for a ban to be effective, it would require a regulator that could not only police it and issue penalties for infractions, but also address wider harms on the internet affecting both children and adults.

Prof. Owen warned that without a regulator, when a child hits the age when social media is allowed, they could “jump right into a social-media ecosystem that has no protections in it whatsoever.”


He said there is a need to address problems on platforms, which include certain kinds of content, “the incentives within them, the way the algorithms boost that content, the lack of guardrails, the lack of accountability, lack of safety teams and measures.” He added that a teen social-media ban would not resolve these problems on its own.

Lianna Macdonald, executive director of the Canadian Centre for Child Protection, said her organization had spoken this week to Australian authorities about their under-16 social-media ban.

“The Canadian Centre for Child Protection supports the idea of a social-media delay as an additional layer to prevent serious injury and harms to children and youth,” she said in an e-mail.

“As is the case offline, regulations should clearly define what types of products and services companies make available to children.”

In December, Australia became the first country to ban social media for children under 16, blocking access to platforms including TikTok, Alphabet’s YouTube and Meta’s Instagram and Facebook.

With a report from The Associated Press

The previous online harms bill, known as Bill C-63, would have forced online platforms to swiftly remove child sexual-abuse material, intimate content shared without consent, and posts encouraging children to self-harm. It also proposed a digital-safety commission and ombudsperson to combat online hate.

But officials are looking at introducing a slimmed-down regulatory system in the new version of the bill, according to the three sources. One option is for a single commission with powers to impose major fines on social-media companies that fail to comply with the bill. The watchdog could also be a place Canadians could go if they have been harmed online.

Charlotte Moore Hepburn, a pediatrician and medical director for the Child Health Policy Accelerator at Sick Kids hospital in Toronto, expressed concern that, without a regulator, kids who age out of a ban and then use social media would find themselves in a harmful environment.

“The government needs a comprehensive strategy that is multipronged in its approach, and it cannot achieve that without an independent regulator,” she said, adding that Australia had a regulator in place when it introduced its ban.


Google has expressed reservations about the Australian ban, which includes YouTube. The video-streaming platform has content specifically targeted at children, such as cartoons for preschoolers and animal videos.

Meta, which runs Instagram and Facebook, has been holding meetings with Canada’s federal government to pitch a proposal for age verification at the app-store level in the forthcoming online harms bill.

This proposal would put the onus on the companies operating app stores, rather than on the platforms themselves. The tech giant has suggested that age verification should take place when users set up their phones.

Kareem Ghanem, senior director of government affairs and public policy at Google, accused Meta of wanting to shift “responsibility to others rather than taking responsibility for their own platforms.”

“Their proposal puts the onus of age verification solely on app stores and increases the burden on parents, letting Meta apps like Instagram off the hook while creating serious privacy risks for families and doing little to make kids safer online,” he said.

The forthcoming online harms bill is expected to be steered through Parliament by Canadian Identity Minister Marc Miller.

His spokesperson, Hermine Landry, said: “We all want our children to be safe as they navigate the digital world, and platforms have an important role to play in meeting that challenge. Our government intends to act swiftly to better protect Canadians, especially children, from online harm.”
