
The view from inside as a Waymo robotaxi makes its way through the streets of Phoenix in December, 2025. Jason Tchir/The Globe and Mail
My first trip in a self-driving Waymo taxi didn’t feel like a bold glimpse into the future. It felt like riding with a student driver.
Nothing it did on the four-kilometre trip through Phoenix in December was exactly dangerous, but it made some odd moves, such as yielding the right of way to a driver who had a stop sign when we didn’t.
It seemed to know the rules, but didn’t quite understand them.
“Probably what you were picking up on is that it wasn’t driving like a person does,” said Phil Koopman, a Pittsburgh-based expert on safety standards for autonomous cars and faculty emeritus at Carnegie Mellon University. “And they make mistakes. It turns out that they make the same mistakes as people sometimes … and they will make mistakes that people are unlikely to make.”
Since launching in Phoenix in 2020, Waymo, which is owned by Google’s parent company Alphabet, has logged more than 204 million kilometres without a human driver in the car in a handful of U.S. cities, including San Francisco and Los Angeles.
But driverless Waymos have struggled with construction zones, stalled in intersections during a power outage and been stuck in drive-thrus. Last November, one drove through an active crime scene while police yelled at it to go the other way.
The U.S. National Highway Traffic Safety Administration is investigating reports of Waymos failing to stop for school buses that were pulled over to drop off students – and an incident in January where a Waymo vehicle struck a child crossing a street near a school in Santa Monica, Calif.
About “90 to 95 per cent” of the situations where autonomous vehicles get into trouble are routine situations for human drivers, said Henry Liu, a professor of engineering and director of the Center for Connected and Automated Transportation at the University of Michigan.
“The [situations] are really just normal for us, but they’re rare to autonomous vehicles because they are computer drivers,” Liu said.
Growing pains?
Part of the problem is that the cars are still learning – and they don’t know what to do when they encounter situations companies didn’t anticipate, Koopman said. “Machine learning has no common sense. [It’s] good at things it has been taught and can be terrible at things it has not been taught.”
For example, a Waymo hit a telephone pole in Phoenix in 2024 because it didn’t know not to, Koopman said. It only knew to never hit a curb – and usually telephone poles are behind a curb, he said.
“It was unusual to see a telephone pole that was not protected by a curb in the places they would drive,” Koopman said, adding that the car had no passengers at the time and nobody was hurt.
Engineers call these edge cases: unusual situations that cars may not have learned to handle, including everyday scenarios humans manage easily and rare situations that also challenge human drivers, Liu said.
Because there are still relatively few cars operating in a limited number of areas (Waymo had approximately 2,500 cars in service by the end of 2025, compared with nearly 300 million vehicles on U.S. roads), there’s a risk of more edge cases emerging as companies expand, Koopman said.
“Can they scale up and not make a mistake like Cruise did?” Koopman said, referring to the General Motors-owned company that shut down its robotaxi service after one of its cars struck and dragged a pedestrian in San Francisco in 2023. “The story of Cruise is that they tried to push too hard.”
Keeping the cars confined to a small area for a few years to detect all the gaps in their learning would make them safer, but companies need to expand to make money, Liu said.
He hopes that advancements in artificial intelligence might help the cars learn faster and drive more naturally.
Until last year, most of Waymo’s driverless trips were confined to carefully mapped neighbourhoods in cities that typically see dry, sunny weather. But the company has started allowing cars on some highways.
It now offers driverless rides to the public in six cities, and plans to expand service to at least 20 more – although not all of them will be open to the public this year, a spokesperson said in an e-mail. That expansion includes cities with winter weather, such as Denver and Detroit.
Waymo has not announced any plans to expand to Canada, although it has been lobbying some municipal and provincial governments.
Rivals, including Tesla and Amazon-owned Zoox, are testing their own robotaxis in select cities.

A Jaguar I-Pace being used as a Waymo robotaxi on the streets of Phoenix in December, 2025. Jason Tchir/The Globe and Mail
Safer than humans?
Another problem is that autonomous cars can’t yet easily understand human drivers’ social cues – including eye contact, waves and nods, Liu said. “[Autonomous] vehicles will need to understand the social norms of driving.”
Even with no driver in the car, most companies still have some degree of remote human backup for when the cars get confused, Koopman said.
In a hearing before a U.S. Senate committee in February, Mauricio Peña, chief safety officer at Waymo, said humans can provide guidance for the company’s vehicles, but “they do not remotely drive the vehicles.”
When we asked Waymo how often its cars needed remote assistance in 2025, a spokesperson directed us to the company’s website, which doesn’t answer the question.
Still, even if the cars sometimes struggle, are they as safe as, or safer than, human drivers?
Statistics generally come from the companies themselves. In the U.S. and Canada, regulators mostly rely on companies to certify their own vehicles, but do investigate when there’s a serious incident, Liu said.
But it’s not always clear whether less serious incidents are publicly reported. Waymo told the U.S. Senate committee that its cars “have been involved in 10 times fewer serious injury or worse crashes” compared with humans driving the same number of miles under similar conditions.
Those statistics might not matter, Koopman said, because Waymo frequently updates its software. “Every time they change the software, the previous miles lose their ability to predict the future. They didn’t used to blow past school buses. Why did that happen all of a sudden?”
While the cars might be able to brake more quickly than a human driver for a child who runs out from behind a parked vehicle – which is reportedly what happened in Santa Monica – they don’t yet anticipate dangerous situations the way human drivers should, Koopman said.
“They say, well, [the car] jammed on the brakes faster than a human could,” he said. “But would a human have been going [the speed limit] in the first place? A reasonable human driver is going to slow down when they see a chaotic school scene [where young kids are being dropped off by parents].”
But Liu said there are many human drivers who would have gone above the limit in that situation or would have been distracted and not seen the child at all.
“Autonomous vehicles are not a silver bullet … some accidents are going to happen,” Liu said. “But we have [more than] 42,000 road fatalities in the U.S. each year and the technology holds a huge potential to greatly cut that.”