AI is Pushing Autonomous Driving Tech to New Heights
Artificial intelligence (AI) is becoming a bigger part of our daily lives, and it’s happening quickly. Bill Gates has called AI as fundamental as the internet and the mobile phone, likening its rise to that of the microchip for its power to transform the lives of everyone on the planet.
Self-driving cars have been incorporating AI into their software stacks for almost a decade, but there’s been a major boost recently thanks to all the research and investment being poured into developing the latest and greatest deep learning networks.
Where Autonomous Driving is Now
While self-driving vehicle technology has not hit the mainstream yet, Waymo, an autonomous driving technology company owned by Google parent Alphabet, has one of the best real-world examples. Its ride-hailing service, Waymo One, uses driverless Jaguar I-Pace EVs to give customers rides in select U.S. cities.

The service was first launched in Phoenix in 2020 and now runs in San Francisco, Los Angeles, and Austin. The Waymo Driver uses an array of sensors, including radar, cameras, and LiDAR (Light Detection and Ranging), to build an incredibly detailed view of its surroundings. It combines these sensors with pre-loaded maps that the company invests considerable time in creating.
“It knows where everything is like traffic lights, stop signs, and parking spots and with a higher level of resolution than what you’d find on Google Maps,” says Steven Waslander, a professor at the University of Toronto’s Institute for Aerospace Studies. “It can track its own position directly from the LiDAR and camera data so it knows exactly where it is in the lane and what driving options it has, like going left or right.”
How Does AI Help?
Where AI comes in, according to Waslander, is detecting, tracking, and predicting everything that’s moving in the scene. At any given time, countless things are happening on the road that you and I take for granted because of our ability to make reasonable decisions based on knowledge and experience. But software has to learn how to do this for each scenario it encounters, and that’s where deep learning or neural networks are used.
“These are techniques to learn how to look at camera, LiDAR, and radar data, and find the moving objects in them and then track them over time. From there, you make predictions and plan how you’re going to interact with those predictions,” says Waslander. “You can learn to drive like a human driver by taking millions of kilometres of human driving and then mimicking that with a learning network.”
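The "track, then predict" step Waslander describes can be illustrated with a toy sketch. The snippet below is not how a production stack works — real systems use learned models and probabilistic filters over full sensor data — but it shows the simplest possible version: estimate an object's velocity from two consecutive detections, then extrapolate where it will be a couple of seconds from now. All names and numbers are illustrative.

```python
# Toy sketch of the track-then-predict idea (not a real self-driving stack):
# a constant-velocity model built from two timestamped detections.
from dataclasses import dataclass


@dataclass
class Track:
    x: float   # position along the road (m)
    y: float   # lateral position (m)
    vx: float  # estimated velocity (m/s)
    vy: float


def update_track(prev: Track, x: float, y: float, dt: float) -> Track:
    """Re-estimate velocity from the change in position over dt seconds."""
    return Track(x, y, (x - prev.x) / dt, (y - prev.y) / dt)


def predict(track: Track, horizon: float) -> tuple[float, float]:
    """Extrapolate the tracked object 'horizon' seconds into the future."""
    return (track.x + track.vx * horizon, track.y + track.vy * horizon)


car = Track(x=0.0, y=0.0, vx=0.0, vy=0.0)
car = update_track(car, x=7.5, y=0.0, dt=0.5)  # detected 7.5 m ahead, 0.5 s later
print(predict(car, horizon=2.0))               # → (37.5, 0.0)
```

A planner would then reason about that predicted position — for example, whether the ego vehicle's own path crosses it within the same horizon.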
Waslander says that AI itself is a broad term, but essentially, it’s teaching the software to generalize from data.
New learning networks are constantly being developed and integrated into AI software stacks at an unprecedented rate. They can be used for many purposes, from route optimization to conquering winter driving.

Current Challenges
The driverless Waymo One ride-hailing service only operates in cities with predictable and typically sunny weather, like Phoenix and Austin. Launching the service in each new city is a significant undertaking, according to Waslander, requiring years of mapping and learning local rules and driver behaviours. “There’s a huge amount of local specificity to how you drive that doesn’t translate from one city to the next,” says Waslander.
Adding snow, ice, and slippery roads to the mix has proven a challenge, but it’s not an insurmountable task, says Waslander. “It’s just another domain we have to get to,” he says.
Waymo has done testing in Michigan and New York but doesn’t have firm plans to launch the service in more northerly locales any time soon. At the University of Waterloo, however, Waslander leads a research program that produced the Canadian Adverse Driving Conditions (CADC) dataset, the first public dataset to focus on real-world driving in snowy weather conditions.
To collect data, the team built the Autonomoose in 2016: a Lincoln MKZ hybrid equipped with a full suite of radar, cameras, and LiDAR, and the first vehicle of its kind designed to drive autonomously in snowy weather.
Waslander says that LiDAR and camera data can be filtered to remove most of the precipitation, allowing a self-driving car to perform almost as well in snow as it can in regular conditions, but it requires training, and that takes time.
LiDAR works by emitting laser pulses into the environment; when a pulse hits an object, some of the light bounces back, and the object’s distance is calculated from the pulse’s round-trip travel time. If a pulse strikes a snowflake first, a little light reflects back, but most of it passes through and may then hit something solid, such as a wall, which reflects far more light and creates a stronger peak in the returned signal. The software will typically ignore the snowflake and take the wall into account.
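The distance calculation itself is simple: light covers the out-and-back path, so the range is the speed of light times the travel time, divided by two. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope time-of-flight calculation for a LiDAR return:
# the pulse travels out and back, so range = c * t / 2.
C = 299_792_458.0  # speed of light in m/s


def lidar_distance(round_trip_seconds: float) -> float:
    """Range to the object that produced a return arriving after this delay."""
    return C * round_trip_seconds / 2.0


# A return arriving about 200 nanoseconds after firing implies an object ~30 m away.
print(round(lidar_distance(200e-9), 1))  # → 30.0
```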
Waslander says that LiDAR companies are now returning data with the first peak (from the snowflake), the strongest peak (from the wall), and the last peak, adding much more richness to the information and building a more detailed view of the world.
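The multi-return idea Waslander describes can be sketched as a small filter: each pulse yields several candidate peaks, and a faint early peak (a likely snowflake) can be set aside in favour of the strongest one (a likely solid surface). This is a simplified illustration, not a real LiDAR API — field names are invented, and production filters also consider first/last returns and neighbouring points.

```python
# Hedged sketch of multi-return filtering: keep the strongest peak from a
# pulse, discarding weak early returns such as precipitation. Illustrative
# data structures only — not any vendor's actual interface.
from dataclasses import dataclass


@dataclass
class Return:
    distance_m: float  # range computed from this peak's travel time
    intensity: float   # reflected signal strength (arbitrary units)


def strongest_return(returns: list[Return]) -> Return:
    """Pick the peak with the most reflected energy (likely a solid object)."""
    return max(returns, key=lambda r: r.intensity)


pulse = [
    Return(distance_m=3.2, intensity=0.05),   # faint first peak: likely a snowflake
    Return(distance_m=18.7, intensity=0.80),  # strong later peak: likely a wall
]
print(strongest_return(pulse).distance_m)  # → 18.7
```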
“We lose a bit of the detail at long range, so the distance at which we can see objects becomes a little diminished, but that’s pretty similar to what happens with human drivers,” says Waslander. “We drive a bit more slowly and attentively because we can’t see as well.”
Even with this supernatural ability to see through the snow, self-driving software still has to train in real-world snowy conditions, and that will take time. That time is spent ensuring the system is as safe as possible, because even one mistake can cost a life.

Final Thoughts
In the world of self-driving cars, 99 per cent safe isn’t good enough. Self-driving cars will essentially have to be the best drivers in the world to be good enough for mainstream use, but we’re getting there, and thanks to AI, it’s happening at a much quicker pace.
“The dream was that technology never gets tired, or distracted, or drunk and that it would ultimately outperform humans, and they’re pretty much there,” says Waslander.