
Election-integrity initiatives outlined by four social-media companies to Elections Canada are too broad, Canadian researchers say, adding it is difficult to know how effective they will be without the platforms being more transparent.
Meta (which owns Facebook, Instagram, WhatsApp and Threads), TikTok, X and Snap Inc. (which owns Snapchat) have so far responded to Chief Electoral Officer Stéphane Perrault’s February inquiry about addressing misinformation and disinformation during the election campaign. Mr. Perrault wrote to seven platforms, including LinkedIn, Google and Reddit.
The letters from the social-media companies discuss the platforms’ content and political advertising policies and, in some cases, defend their approaches. The platforms are also partnering with Elections Canada to provide voting information to Canadians – a move researchers called positive.
If social-media companies wanted to help improve election integrity, they could share what’s happening on their platforms with researchers, said McGill University associate professor Taylor Owen, who is also a Media Ecosystem Observatory principal investigator.
X used to give researchers access to its application programming interface (API), or data, but its most in-depth access level now costs about US$40,000 a month, Prof. Owen said, which effectively shuts out Canadian researchers. Meta also shut down CrowdTangle in August, which had been the main way researchers studied Facebook and Instagram, he said.
“They are making their platforms more opaque right at a time where we need transparency,” Prof. Owen said.
University of Calgary associate professor Emily Laidlaw called the letters “essentially marketing tools” as they lay out plans without much detail. That’s why researcher access is needed to see where the gaps are and if platforms are effectively addressing issues, she said.
Simon Fraser University associate professor Ahmed Al-Rawi agreed about transparency concerns, calling the letters “too idealistic.”
Meta’s Rachel Curran wrote that the company’s approach includes a dedicated team responsible for the company’s election-integrity efforts, as well as policies to prevent election interference and misinformation related to voting, and mandating that AI content is properly labelled.
In 2023, Meta banned news links from being posted on its platforms in response to the Online News Act, making this the first federal election campaign since the ban took effect. Prof. Al-Rawi said the ban means people will likely be getting “second-hand” information that could be manipulated or distorted.
X’s Wifredo Fernández outlined many election initiatives – including activating its civic integrity policy – but his letter also discussed the platform’s community notes feature, which allows users to add context notes to posts.
While it is an “interesting experiment” and has proved valuable in some cases, Prof. Owen said, it does not replace content moderation. It’s also too slow to properly respond during an election campaign, said Prof. Laidlaw.
TikTok Canada’s Steve de Eyre wrote that it takes several election-related actions, including removing misinformation about civic and electoral processes, such as manipulated or AI-generated content that could mislead people.
When asked if those policies have been effective, Prof. Owen said he does not know. In Europe, the Digital Services Act requires platforms to report on the efficacy of election-integrity initiatives, he said, but Canada does not require it.
The platform also held a session with experts, said Mr. de Eyre, where they were briefed on TikTok’s plans and could provide feedback.
Snap Inc.’s Gina Woodworth said misinformation cannot spread as easily on Snapchat, as most interactions on the platform are person-to-person rather than public feeds. Public content is checked before it can be recommended for wide distribution, she said.
Prof. Laidlaw said that while such content is not public, the group chats involved can be relatively large.
Aside from the letters, government officials said Monday they continue to engage social-media platforms around foreign interference.
Larisa Galadza, a Global Affairs Canada director general who oversees Canada’s Rapid Response Mechanism, said this week that when her organization finds content with a foreign connection, it informs the social-media platforms. This includes disinformation campaigns and instances of digital transnational repression.
The platforms then make their own decisions, she said.