In Depth

Online sabotage

Disinformation from abroad is a real danger in a race where Canada-U.S. mistrust is high. Here’s what the cyber threats might look like

Toronto
The Globe and Mail
Illustration by The Globe and Mail (source: Getty Images)

Canada is at high risk of attack by hostile foreign entities – including possibly from within the United States – attempting to meddle with the federal election by covertly spreading false information online, experts warn.

They say the April 28 election is especially vulnerable to disinformation campaigns because of recent tensions in Canada-U.S. relations, the growth of artificial intelligence and weakened guardrails on social media platforms.

“We should be really concerned,” said Aengus Bridgman, director of the Media Ecosystem Observatory, a research group at McGill University that analyzes digital threats to democracy. “This is an enormous vulnerability right now.”

Aengus Bridgman from McGill University anticipates serious risks of foreign interference in the election to come on April 28. Boris R. Thebia/The Globe and Mail

Canada is especially exposed because of the trade war and President Donald Trump’s musings about annexing this country, said Jean-Pierre Kingsley, former chief electoral officer of Canada.

“We are naive if we think that we are not the object of keen interest,” he said.

Earlier this year, Justice Marie-Josée Hogue, who led an independent inquiry on foreign interference, warned that information manipulation poses the biggest risk to this country’s democracy.

Election campaigns in Canada and around the world have been targeted in recent years by hostile foreign powers attempting to advance their own objectives with online disinformation, which is false or misleading information that is intended to deceive. Such activity is often as powerful as it is difficult to combat.

Tactics range from attacking politicians on social media with smears and doctored images to flooding voters with targeted propaganda and hacking election systems. Entities target election campaigns to push narratives that suit their own interests, prop up favoured candidates and influence government decisions about military, trade or immigration.

On Monday, a senior intelligence official told a news conference that a federal working group expects China, Russia, India and Pakistan to attempt to interfere with this election campaign. Vanessa Lloyd, deputy director of operations at the Canadian Security Intelligence Service and chair of the Security and Intelligence Threats to Elections task force, also said cyber threats pose a growing risk to democratic processes.

The government is working diligently to identify and counter online attacks and is planning to alert voters about threats, officials said. Elections Canada also said the agency has ramped up campaigns to counter false information.

At a Trump rally in Arizona last fall, this AI-generated image harks back to false claims that the then-candidate had recently made about Haitian migrants eating cats. Rebecca Noble/AFP via Getty Images

In an extraordinary move, some observers including Prof. Bridgman and Mr. Kingsley are adding the U.S. to their watch list because of Mr. Trump’s tendency to spread false narratives, such as claiming he won the 2020 presidential election. In addition, one of Mr. Trump’s key advisers, tech mogul Elon Musk, is the owner of X, which has been widely criticized for amplifying misinformation.

The White House dismissed concerns about U.S. government interference in Canada’s election. “This is nonsense,” spokeswoman Anna Kelly said in an e-mail. Mr. Musk could not be reached for comment.

In a report earlier this month, the Communications Security Establishment, Canada’s cybersecurity protection agency, said that China, Russia and Iran will “very likely” use AI tools to try to interfere with the election, including by targeting candidates and political parties in attempted “hack-and-leak” operations.

The goal is “almost certainly” to break alliances and “entrench divisions within and between democratic states while also advancing their geopolitical goals,” the CSE said. However, the agency said it is very unlikely that hostile actors would mount a destructive attack on digital electoral systems.

In 2023 and 2024, the CSE said, there were 102 reported cases of generative AI being used against 41 elections around the world. Generative AI can be used to make deepfake photos, audio and video as well as to write text and produce computer code. Deepfakes are manipulated images or recordings that make it appear that people did or said things they did not. While analysts do not know the source of most of the campaigns, the agency said Russia and China were behind “a high number.”

Canada is also more vulnerable to disinformation during this election campaign because of “severely reduced” collaboration and information-sharing with U.S. intelligence agencies because of the Trump administration’s government cuts, said Stephanie Carvin, a former national-security analyst who is a professor of international relations at Carleton University. “That’s going to be a real setback,” she said.

Elon Musk, now a key figure in Mr. Trump's plans to dismantle federal institutions, loosened rules on misinformation when he took over the social platform Twitter, now X, in 2022. Jose Luis Magana/The Associated Press

Recent moves by tech giants to loosen guardrails on social media services – including scaling back content moderation and fact-checking, measures originally adopted in response to past criticism about misinformation – have made it easier for false information to proliferate, said Prof. Bridgman, who teaches political science at McGill.

“We are in a place where platforms are not taking seriously their stewardship and public square-like responsibilities,” he said. “That opens up all sorts of possibilities for people doing information manipulation.”

After Mr. Musk bought Twitter in 2022, he relaxed content rules and reduced the moderation and election integrity teams.

Additional social media guardrails may fade soon. Although Meta spokeswoman Julia Perreira said changes to Facebook and Instagram have not yet been made in Canada, CEO Mark Zuckerberg announced in January that the platforms would stop using “politically biased” fact-checkers. Instead, they will use a crowdsourced “community notes” system similar to X’s, which critics say allows falsehoods to propagate because of delays in addressing misinformation.

In the run-up to the election, government officials met with representatives from tech companies and social media platforms to discuss “our shared commitment to a free, fair and secure election,” Laurie-Anne Kempton, an assistant secretary to cabinet, said on Monday.

“We will be looking to these companies to uphold their commitments, actively prevent the spread of fake information and correct this information on their platforms,” she said.

Marie-Josée Hogue led an inquiry to find out what Russia, China, India and Iran may have done to sway the 2019 and 2021 elections. Patrick Doyle/Reuters

Justice Hogue found that foreign meddling in the 2021 and 2019 federal elections did not sway the overall results but may have affected a small number of ridings. Foreign interference is a broad category that includes a range of tactics, including illegal campaign donations, blackmail and cyberattacks, as well as disinformation.

In the early days of a new election campaign, Justice Hogue’s conclusion that the primary dangers lie in the information environment serves as a chilling warning.

“In my view it is no exaggeration to say that at this juncture, information manipulation (whether foreign or not) poses the single biggest risk to our democracy,” she wrote in her final report, released in January. “It is an existential threat.”


This ‘Elbows Up’ protest at Toronto City Hall brought out hundreds of people on March 22, the day before the Governor-General kicked off a new election. The trade war and Donald Trump’s talk of making Canada a ‘51st state’ loom large in this campaign, as do fears of malign U.S. influence on this country. Sammy Kogan/The Globe and Mail
When Liberals mobilized to replace Mr. Trudeau, concerns about foreign meddling reshaped the rules of a leadership race that Mark Carney ultimately won. One of his rivals, Chrystia Freeland, faced ‘co-ordinated and malicious’ interference on the social platform WeChat, a monitoring task force said. Spencer Colby/The Globe and Mail; Blair Gable/Reuters

Hostile entities will go to extreme lengths to interfere with democratic processes. While some campaigns are detected and denounced as fake, many fly under the radar, polluting the information ecosystem.

“Actors who are interested in manipulating elections are very much aware that the best way to do that is well before a voter reaches a polling station,” said Holly Ann Garnett, a political science professor at the Royal Military College of Canada who studies electoral integrity. “It’s about influencing beliefs and opinions.”

Former finance minister Chrystia Freeland was recently targeted by “coordinated and malicious activity” while running for the Liberal Party leadership, according to Global Affairs Canada.

The campaign was traced to a WeChat news account linked to the Chinese government, the department said last month, estimating that up to three million users globally saw articles disparaging Ms. Freeland on the social-media app. The Chinese government denied involvement.

A U.S. indictment unsealed last September alleged that a Quebec social-media influencer and her husband accepted nearly US$10-million through their company, Tenet Media, from a Russian government-controlled media outlet. The money was allegedly paid to create and distribute content that would benefit Moscow.

Analysts believe a Russian propaganda unit targeted U.S. vice-presidential nominee Tim Walz before last November’s election with false allegations he had sexually assaulted a student while working as a high school teacher. The campaign created a fake video using the likeness of an actual former student that went viral in October, reaching millions of people on X and other platforms.

Supporters of Romanian presidential candidate Calin Georgescu got a boost from TikTok accounts that intelligence services traced back to Russia. Andreea Campeanu/Reuters

A few weeks later in Romania, a little-known, far-right candidate named Calin Georgescu unexpectedly won the first round of presidential elections. Intelligence reports revealed Russian influence in his campaign, including in 800 TikTok accounts backing him. Mr. Georgescu, who has voiced pro-Russian views, including calling for an end to Ukrainian aid, recently lost a court battle to compete in an election rerun in May.

After a rally by Conservative Party Leader Pierre Poilievre last summer in Kirkland Lake, Ont., a rash of posts appeared on hundreds of X accounts. The posts claimed that users had attended the event and repeated phrases, such as “I’m still buzzing from the energy!”

The campaign was likely created by a single entity or actor using low-quality AI, according to the Canadian Digital Media Research Network, a federally funded coalition anchored by the Media Ecosystem Observatory. The group estimated that several million Canadians heard about the incident through news coverage.

“The Kirkland Lake bot incident should serve as a wake-up call,” the network said in a report. “The event is best thought of as a test-case or capacity-building exercise by some entity interested in developing the ability to mass produce posts on social media platforms.”

Sam Lilly, a spokesman for the Conservatives, said the party “had absolutely nothing to do with those posts.”


Conservative Leader Pierre Poilievre, who rallied supporters in North York on the first day of the current election campaign, held an event last summer in Kirkland Lake, Ont., that got a deluge of similarly worded X posts, likely the result of AI. Carlos Osorio/Reuters; X

While disinformation is insidious, experts believe quick action in calling out suspected plots is the best way to blunt their impact.

Throughout the election campaign, the Canadian Digital Media Research Network plans to monitor social media and other online spaces for signs of foreign and domestic information manipulation and will publicly report concerning activity.

“This is an all-hands-on-deck kind of situation,” Prof. Bridgman said. “We want to make sure our next election is free and fair, and that Canadians can have confidence in that election process and in our information environment.”

For its part, Elections Canada said it will continue to closely track online misinformation and conspiracy theories about voting and counter them with education campaigns. In a shift from previous elections, the agency no longer shies away from mentioning false narratives in its materials.

“We realize now that they’re out there. We may as well recognize them and deal with them upfront,” said Stéphane Perrault, Canada’s chief electoral officer.

Stéphane Perrault, Canada's top election official, says his team will be vigilant and counter misinformation about the democratic process. Ashley Fraser/The Globe and Mail

Senior officials say the federal government is working to improve its monitoring of and response to attempted foreign interference after the Hogue inquiry revealed shortcomings, including poor co-ordination and slow reactions.

During the 2021 election campaign, then-Conservative party leader Erin O’Toole and British Columbia candidate Kenny Chiu were the apparent targets of false information on online outlets tied to the Chinese government.

A government task force was aware of the narratives and notified a panel of five top public servants who administer the Critical Election Incident Public Protocol, according to Justice Hogue’s report. The officials determine whether an incident “threatens Canada’s ability to have a free and fair election” and must inform Canadians when that high threshold is met. However, the panel did not make a public announcement, in part because the activity against the Conservatives could not be attributed to foreign actors.

While Justice Hogue said the panel’s decision was reasonable, she highlighted “a serious gap” in mechanisms to address false information, including a lack of clear guidelines for government action short of a public announcement by the panel.

In response to Justice Hogue’s recommendations, government officials will be more proactive in publicly calling out suspected disinformation and other interference during the election campaign, said Allen Sutherland, an assistant secretary to the cabinet.

“Transparency is the best sunlight,” he said. “This is something that helps Canadians make an informed decision.”

With a report from Laura Stone

