
The Autonomoose self-driving car with all of the equipment on its roof. It is a modified Lincoln MKZ hybrid fitted with eight wide-angle cameras and a LiDAR. Kunal D'souza/The Globe and Mail

Autonomous vehicles have made big strides in the last five years thanks to advancements in artificial intelligence. They’re better able to predict and react to various situations on the road and drive more naturally in traffic.

But there’s still a long road ahead before they’re good enough to drive and think like a human in every possible scenario. Autonomous vehicles are being tested as ride-sharing and delivery vehicles around the world but, for the most part, they’re only driving on fully mapped roads where snow isn’t a problem.

“Ninety-nine per cent is good and we’re there, but it’s still insufficient for self-driving,” says Krzysztof Czarnecki, who leads the Waterloo Intelligent Systems Engineering (WISE) Lab at the University of Waterloo. “Ninety-nine is just two nines, and we need to add a whole bunch more nines, and that’s where the challenge comes in.”

When driving, we rely on our experiences to predict how other drivers will behave. A vehicle driving erratically, falling debris or even extreme weather scenarios can happen out of the blue. These are rare occurrences that we might not have encountered before, but our brains can fill in the blanks for what might happen. Computers can’t fill in the blanks and need to learn millions of unique scenarios. Autonomous driving software has to be able to predict and react to the rarest of one-off occurrences.

In software engineering, these rare occurrences are referred to as edge cases. The WISE lab is developing the most extensive public dataset on edge cases so far. Companies like Tesla and Edge Case Research identify and leverage edge cases through their own data-collection methods but lack standard categorization or public benchmarks, according to Czarnecki. When the work is complete, this edge case dataset can be used to train self-driving vehicles to drive with more human-like levels of perception.

“We are building advanced new [AI] models that can understand all these edge cases ... Events that are safety critical,” Czarnecki says. “That’s our main focus right now to address these problems.”


The view from the driver's seat of the Autonomoose with a screen to program the mission and an emergency stop button. Kunal D'souza/The Globe and Mail

Czarnecki and his team have to label as many edge cases as possible and use them to train vision-language models to recognize similar scenarios in the real world.

Imagine an edge case where there’s a truck driving on the highway with a poorly secured ladder that looks like it’s about to fall. Most drivers would be able to recognize this and move out of the truck’s way just in case it falls. An autonomous vehicle without proper training wouldn’t be able to anticipate this and might not move out of the way, potentially putting the occupants in danger.

Czarnecki has a library of more than 1,000 crowd-sourced videos and images from across North America showing some of the wildest things ever seen on roads and highways, from three-seater sofas tied to the roof of a car to people hitching a ride on the back of a tractor-trailer. Each of these videos is an example of an edge case, and they have all been labelled and categorized according to a detailed schema that accounts for road structure, environmental conditions, traffic types and the presence of foreign or unusual objects on the road. The labels are simple English descriptions that the vision-language model can understand.
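To make the idea concrete, one entry in such a dataset might be structured like this. This is a hypothetical sketch loosely following the categories the article names (road structure, environmental conditions, traffic types, unusual objects); all field names and values are illustrative, not the lab's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class EdgeCaseRecord:
    clip_id: str                    # identifier for the source video or image
    description: str                # plain-English label the model can read
    road_structure: str             # e.g. "divided highway"
    environment: str                # e.g. "clear, daytime"
    traffic_types: list = field(default_factory=list)
    unusual_objects: list = field(default_factory=list)

# One record for a clip like those in the crowd-sourced library
record = EdgeCaseRecord(
    clip_id="clip-0042",
    description="Sofa strapped to the roof of a sedan, shifting in the wind",
    road_structure="urban arterial",
    environment="overcast",
    traffic_types=["passenger car"],
    unusual_objects=["unsecured furniture"],
)
print(record.description)
```

The point of keeping the description in plain English is that it is exactly the kind of text a vision-language model can be trained against.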

Czarnecki says the original annotations were done manually, but the team has since moved to a vision-language model, which proposes labels that are then reviewed and corrected by researchers.

“Once this initial batch is completed, our goal is to use the improved AI pipeline to discover additional edge cases online and scale the dataset further with minimal manual effort,” he says.
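The pipeline described above has a simple human-in-the-loop shape: a model proposes a label, a person vets it, and the vetted label enters the dataset. Here is a minimal sketch of that loop, with stub functions standing in for the model and the reviewer; the article does not specify the lab's actual tooling, so every name here is an assumption.

```python
def propose_label(clip_path: str) -> str:
    """Stand-in for a vision-language model call."""
    return f"proposed label for {clip_path}"

def review(proposal: str) -> str:
    """Stand-in for a human reviewer; this stub accepts every proposal,
    but in practice the reviewer would correct it where needed."""
    return proposal

def label_dataset(clips: list) -> dict:
    """Run each clip through the model, then the reviewer."""
    labels = {}
    for clip in clips:
        proposal = propose_label(clip)       # model proposes
        labels[clip] = review(proposal)      # human reviews and corrects
    return labels

print(label_dataset(["clip-001.mp4"]))
```

Scaling the dataset "with minimal manual effort," as Czarnecki puts it, amounts to making the `review` step cheaper as the model's proposals improve.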

This public dataset will also give other researchers a way to benchmark and improve existing AI models to handle edge cases. The work being done at Waterloo is supported by Transport Canada and aims to provide a standardized dataset to inform future performance standards and test protocols, something today’s NCAP programs only address in a limited way for basic ADAS (advanced driver-assistance system) features, says Czarnecki.

It’s essentially giving a self-driving vehicle the ability to generalize just like a human would and apply what it’s learned to react to something it hasn’t encountered before.

Czarnecki, an electrical and computer engineering professor at the University of Waterloo, works with research engineer Michal Antkiewicz at the WISE Lab to develop the software that powers autonomous cars. They have an autonomous test car, a modified Lincoln MKZ hybrid nicknamed “Moose,” short for Autonomoose, created as a joint effort between the WISE Lab and the Toronto Robotics and AI Laboratory (TRAIL).

In 2016, when Ontario began a 10-year pilot program to allow the testing of driverless vehicles on public roads, the Moose became the first autonomous vehicle in Canada licensed to drive on public roads. There have been others since then, most notably the Metrolinx autonomous shuttle projects held in Whitby and Toronto, and a partnership between Loblaw and logistics company Gatik, which runs a driverless delivery van between five retail stores and a micro-fulfillment centre in Toronto’s west end.

The Moose also had a mission much more specific to Canada: to help create the first public dataset for driving in snowy weather. After logging more than 100 kilometres on snow-covered and clear roads and being used as a training platform for dozens of engineers and students, the Moose now serves as a demonstrator of the technology for the students and rarely leaves the campus.

The vehicle is fitted with eight wide-angle cameras and a LiDAR (Light Detection and Ranging) unit on its roof. Inside, screens and computers are scattered about, with a powerful computer in the trunk. It’s all very ad hoc and fascinating.

Antkiewicz sits in the driver’s seat and loads in a “mission,” a three-minute loop around the Autonomous Vehicle Research and Intelligence Lab (AVRIL). A screen shows me what the car sees in real time. Everything above ground looks like a series of purple boxes. The system tracks everything that’s moving and assigns it an ID. It’s not the smoothest drive, and the car is overly cautious, but we arrive back where we started without incident.
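The ID assignment the demo screen shows can be illustrated with a toy tracker: each new detection is matched to the nearest existing track, and unmatched detections get a fresh ID. Real AV stacks use far more sophisticated methods (Kalman filters, learned association, and so on); this nearest-neighbour sketch is purely illustrative and none of it reflects the Moose's actual software.

```python
import math

class SimpleTracker:
    """Toy tracker: assigns persistent IDs to 2-D detections by
    matching each detection to the nearest prior track."""

    def __init__(self, max_dist: float = 2.0):
        self.tracks = {}        # id -> last known (x, y) position
        self.next_id = 0
        self.max_dist = max_dist

    def update(self, detections):
        """Return a mapping of each detection to its assigned track ID."""
        assigned = {}
        for x, y in detections:
            # find the closest unclaimed track within max_dist
            best_id, best_d = None, self.max_dist
            for tid, (tx, ty) in self.tracks.items():
                d = math.hypot(x - tx, y - ty)
                if d < best_d and tid not in assigned.values():
                    best_id, best_d = tid, d
            if best_id is None:          # no match: start a new track
                best_id = self.next_id
                self.next_id += 1
            self.tracks[best_id] = (x, y)
            assigned[(x, y)] = best_id
        return assigned

tracker = SimpleTracker()
print(tracker.update([(0.0, 0.0), (5.0, 5.0)]))   # two new tracks
print(tracker.update([(0.5, 0.1)]))               # re-identified as track 0
```

Keeping the same ID across frames is what lets the system reason about where a moving object is headed, not just where it is.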

“To take it into a completely uncontrolled environment is challenging,” Czarnecki says, adding that it requires a lot of resources.

Czarnecki says Waymo, Alphabet’s U.S.-based autonomous driving company, offers one of the best examples of self-driving technology. The company’s ride-hailing service is driverless, operating a fleet of Jaguar I-Paces modified with an array of sophisticated sensors and cameras.

Waymo spends billions of dollars to thoroughly map and learn each city in which it operates. It’s one of the main reasons you don’t see a faster rollout of the technology. So far, it runs in cities with consistently sunny and warm weather, such as Phoenix and San Francisco, but it offers a rare glimpse into the future.


All of the equipment in the trunk of the Autonomoose. Kunal D'souza/The Globe and Mail

In places like Southern Ontario, with four distinct seasons, weather poses a significant challenge for autonomous vehicles.

“A road looks very different in the summer than it does in the winter or even the fall, whether it’s because of the snow or the different colours,” Antkiewicz says. “There’s so much variety, and it becomes many times harder to teach a system to interpret this.”

Teaching a car to drive in winter is similar to teaching it to handle edge cases. Even when snow isn’t falling, the car has to learn about snow banks and narrowed lanes. And while snow itself isn’t an edge case, it can lead to them: how to react when a car ahead is skidding out of control on an icy road, or what to do when the car itself loses traction.

While winter driving is a hurdle to overcome, Czarnecki says the biggest challenge facing autonomous driving is being able to account for anything that might happen on the road, not just snow. The safety aspect cannot be overstressed.

Cruise LLC, an autonomous vehicle company and subsidiary of General Motors, shut down after an incident involving a pedestrian in 2023 in San Francisco. The pedestrian was initially struck by another car and flung into the path of the Cruise vehicle, which was operating without a driver. The data shows that the vehicle tried to avoid the pedestrian but couldn’t in time. The pedestrian was pinned under the vehicle, with much of her body out of view of the LiDAR object-detection sensor. The car then proceeded to pull over to the side of the road, dragging her almost seven metres.

“From the perspective of the Cruise vehicle, it was like a person falling out of the sky,” says Czarnecki. “No one programming the vehicle would have thought about that [particular scenario].”

A report found that a human driver would have done better, as they would have known about the impact and would have stopped driving immediately.

Incidents like these highlight just how much work there is to be done for autonomous vehicles to be safe enough for mainstream use in all situations.

“We have to create the best driver in the world,” says Czarnecki. “With AI, I think that’s possible.”
