My column last month took a high-level look at how newsrooms are beginning to use artificial intelligence and how audiences can tell whether those uses are ethical. But there’s much more to discuss, given the complexity of AI and how quickly it is evolving.
The day before that article was published, The Globe and Mail and four other Canadian media organizations announced that they were taking legal action against OpenAI, the company that developed ChatGPT. At issue: OpenAI’s alleged use of published, copyrighted journalism content to create products, like the well-known chatbot, from which it derives profit.
“News media companies invest hundreds of millions of dollars into reporting Canadians’ critical stories, undertaking investigations and original reporting, and distributing media in both official languages in every province and territory across this country,” said the media organizations’ joint statement. “OpenAI is capitalizing and profiting from the use of this content, without getting permission or compensating content owners.”
OpenAI has also come under fire from news organizations with which it had been negotiating terms of use for their journalistic content – negotiations that in some cases broke down. Last December, for example, The New York Times announced its suit against OpenAI, similarly alleging “that millions of articles published by The Times were used to train automated chatbots that now compete with the news outlet as a source of reliable information.”
Pina D’Agostino, a law professor at York University, told NPR: “If it’s valuable enough to train these large language models, it’s also worthy of some compensation.”
The Associated Press wire service did come to an agreement with OpenAI in 2023. Neither party, however, disclosed what kind of financial compensation AP will receive for allowing OpenAI to scrape its news archive – only that “The arrangement sees OpenAI licensing part of AP’s text archive, while AP will leverage OpenAI’s technology and product expertise.”
The Financial Times and News Corp, owner of The Wall Street Journal, have also struck agreements with OpenAI.
While AI can be a threat, it also holds significant potential to support quality journalism. AI can streamline workflows, identify emerging trends, and help journalists present the news in a variety of ways that fit readers’ individual preferences, said Nafid Ahmed, The Globe’s vice-president of enterprise analytics, AI and consumer insights.
I spoke to Nafid about how he and his team are collaborating with the newsroom to look at the ways AI might assist in the journalistic process and enhance users’ experience in the near and near-ish future.
He emphasized the critical role of audience feedback, noting that no project will move beyond the testing phase (in which a small proportion of subscribers are offered a new feature) unless it resonates with readers. That testing process, he said, involves evaluating both direct feedback from readers and engagement metrics from The Globe’s website and app (for example, how many stories individual readers are viewing and how much time they spend with each of them).
Several of the use cases under consideration aim to offer Globe audiences more ways to consume content. Subscribers can already choose to listen to any article, thanks to a voice generator called Amazon Polly. (Just click on “LISTEN TO THIS ARTICLE,” located in a bar just above the article text.) It gets the job done, but it can sound rather robotic. Nafid’s team is looking at alternatives that deliver more natural-sounding audio. He said advancements in this technology might one day enable features like voice commands, similar to a virtual assistant or “agent.”
He described how such technology might take the Climate Exchange, which I described in last month’s column, to the next level. Right now, there are 75 replies, which the newsroom team has researched and written in response to reader questions submitted earlier. If you were to visit the Climate Exchange page today and submit a question, AI would help match your particular wording to one of those existing answers (since there are many ways to ask the same question).
However, if there isn’t an appropriate answer in the answer bank, you would receive this response: “Thank you. We didn’t find an answer to your question but we’ll send your query to our newsroom for consideration. In the meantime, please feel free to ask another question.”
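The match-or-fallback flow described above can be sketched in a few lines of code. This is purely illustrative: it uses crude word-overlap as a stand-in for the semantic matching a production system would use, and the answer bank, threshold and fallback wording here are invented for the example, not drawn from The Globe’s actual implementation.

```python
import re

def similarity(a, b):
    """Crude lexical similarity: overlap of word sets.
    A real system would use semantic embeddings instead."""
    wa = set(re.findall(r"[a-z0-9]+", a.lower()))
    wb = set(re.findall(r"[a-z0-9]+", b.lower()))
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

# Toy answer bank: banked question -> researched newsroom answer.
ANSWER_BANK = {
    "How do carbon taxes work?": "A carbon tax puts a price on emissions...",
    "What is a heat pump?": "A heat pump moves heat rather than generating it...",
}

FALLBACK = ("Thank you. We didn't find an answer to your question but "
            "we'll send your query to our newsroom for consideration.")

def answer(question, threshold=0.3):
    # Find the banked question closest to the reader's wording.
    best_q = max(ANSWER_BANK, key=lambda q: similarity(q, question))
    if similarity(best_q, question) >= threshold:
        return ANSWER_BANK[best_q]
    return FALLBACK  # no good match: route the query to the newsroom

print(answer("how does a carbon tax work"))  # matches despite different wording
print(answer("Is nuclear power safe?"))      # no match: fallback message
```

The key design point is the threshold: set it too low and readers get wrong answers confidently; set it too high and too many answerable questions get routed to the newsroom.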
It might be that an answer does exist within The Globe’s vast archive of news and feature articles – but the right technology isn’t yet in place to retrieve an archival article that appropriately answers a reader’s novel question. Such technology could renew the relevance of many, many evergreen articles. And what if, rather than typing in your question, you could verbally query the chatbot, in the same way you ask Siri to set a timer for your Sunday roast, or Alexa to play your favourite album?
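The archive-retrieval idea above would, in essence, be a small search engine: score every archived article against the reader's question, and surface the best one only if it clears a relevance bar. A minimal sketch, with invented articles and a deliberately simple term-counting score (real retrieval systems use far more sophisticated ranking):

```python
import re
from collections import Counter

# Toy archive: (headline, article text) pairs.
ARCHIVE = [
    ("How heat pumps perform in Canadian winters",
     "heat pump cold climate performance efficiency"),
    ("What a carbon price means for your grocery bill",
     "carbon price food costs inflation"),
]

def tokens(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def score(query, headline, body):
    # Count occurrences of the query's terms in the article.
    q = set(tokens(query))
    counts = Counter(tokens(headline + " " + body))
    return sum(counts[w] for w in q)

def retrieve(question, min_score=2):
    best = max(ARCHIVE, key=lambda art: score(question, *art))
    if score(question, *best) >= min_score:
        return best[0]  # surface the evergreen article's headline
    return None         # nothing relevant: fall back to the newsroom

print(retrieve("Do heat pumps work in winter?"))
print(retrieve("latest hockey scores"))  # None: no relevant archive match
```

Returning `None` rather than a weak match matters for trust: an old article that merely shares a few words with the question is worse than an honest "we'll ask the newsroom."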
Carrying that further, what if you could ask The Globe app, “What news have I missed since the last time I visited?” “What’s the current stock price for Apple?” “Can you tell me the P/E ratio for Shopify?” “What are the latest movie reviews?”
Nafid emphasized that while The Globe is exploring applications of voice synthesis, any experimentation will be undertaken thoughtfully and responsibly. That’s an important standard to state and adhere to, because the technology has a dark side: voice synthesis can be misused to create harmful or misleading content. In 2023, for example, users of 4chan circulated a synthesized voice resembling that of Harry Potter actor Emma Watson reading excerpts of Adolf Hitler’s manifesto Mein Kampf.
Earlier this year, OpenAI found itself apologizing to Scarlett Johansson, who voiced the part of an operating system that has a love affair with Joaquin Phoenix’s character in the movie Her; the company had launched a chatbot voice that sounded very similar to Ms. Johansson’s. Although OpenAI denied the resemblance was intentional, the controversy underscored the challenges of navigating this space responsibly.