In December, Australia banned social media for children under the age of 16. Hollie Adams/Reuters

Helen A. Hayes is the associate director of policy at the Centre for Media, Technology, and Democracy. Taylor Owen is the Beaverbrook Chair in Media, Ethics, and Communication at McGill University and the founding director of the Centre for Media, Technology, and Democracy.

Australia’s bold social media ban for children under 16, and calls for Canada to follow its lead, reflect genuine concern about child safety online. But if policy makers are going to take the dramatic step of restricting access for an entire generation, they should proceed carefully – by framing any restrictions as a temporary moratorium that pressures platforms to make structural changes, not as an end in itself.

A moratorium serves as a stopgap, not a solution. The real solution, the one arrived at by every jurisdiction that has studied this problem, is a regulator with the authority to hold platforms accountable through risk assessments, transparency requirements, and age-appropriate design standards. Australia was able to enact an age restriction because it already had a regulatory system in place and a decade of experience with its eSafety Commissioner. Canada has no such infrastructure. A stand-alone ban here would punish users rather than the products causing harm, and would ensure that the moment a child ages out, they enter a digital environment with no protections whatsoever.

The frustration of parents and teachers is justified. And given how long it takes to create a functioning regulator or commission, there is a case for temporary restrictions to bridge the gap. But those restrictions must be tied to building the accountability regime, not treated as a substitute for it.

Age limits should be seen as less of a solution than a diagnosis. The very fact that governments feel compelled to prohibit young people from using social media is an admission that something is fundamentally broken about these platforms. When a product is so misaligned with public well-being that exclusion feels like the only viable option, the problem is not the user. It’s the system.

This is why the distinction between bans and moratoriums matters. A moratorium creates accountability by signalling that access will be restored once platforms meet specific safety standards. This shifts the burden back where it belongs: onto the companies profiting from systems that currently prioritize engagement over well-being.

Canada’s expected online harms bill presents an opportunity to operationalize this approach. Rather than legislating a ban in isolation, the government could embed an age-restricted moratorium within a broader Duty to Protect Children that mandates concrete platform reforms, including an age-appropriate design code. The moratorium could then become part of the enforcement mechanism: platforms that meet the standards gain access to young users, contingent on continued compliance.

For this to work, legislation would need to specify clear, measurable design standards that platforms must meet before the moratorium is lifted. These could include: eliminating or substantially limiting algorithmic amplification of harmful content; disabling features designed to maximize compulsive use, such as infinite scroll, autoplay, and streak mechanics; providing transparent, user-controlled content moderation settings with safe defaults; and submitting to regular independent audits of recommendation systems. Backed by a moratorium, these requirements would not be aspirational guidelines, but preconditions for market access.

This approach also resolves the tension between protection and participation. Young people are rights-bearing citizens who deserve access to safe digital public spaces for expression, community-building, and civic engagement. A moratorium respects these rights while acknowledging that current conditions make safe participation impossible.

The effectiveness of this framework hinges on properly defining the object of regulation. A narrow definition targeting only platforms such as Instagram and TikTok would be immediately circumvented, as young people migrate to Roblox, Discord, or Snapchat. The definition must focus on features, not platforms, and AI chatbots that replicate the same dangerous patterns – optimizing for engagement and encouraging compulsive use – should fall within scope. And the regulator – not a static list in legislation – must have the authority to assess new services as they emerge.

Of course, this only works if the enforcement body has the independent authority to impose substantial financial penalties, verify compliance through technical audits, and compel tech companies to share data with researchers. This does not require cumbersome regulatory infrastructure; what is needed is a purpose-built, nimble commission with real power, one that can adapt to technological shifts.

If Canada proceeds with age restrictions on social media, we should do so with clear eyes about what we’re implementing and why. A permanent ban is an admission that we’ve surrendered digital public space to unregulated corporate interests and chosen exclusion over reform. A moratorium is a different proposition: a temporary measure that creates pressure for change while preserving the possibility of safe, inclusive digital participation.
