
Anatomy of autonomy: A look at how self-driving cars think


As self-driving cars edge closer to production, the race to develop autonomous systems that can handle the driving anytime and anywhere is hotting up. Along with the big names, smaller players are pushing to nail the formula with their own kits that would convert existing cars into autonomous ones. One such company is AImotive, and we paid a visit to its Mountain View, California office to chat with senior team member Lorant Pocsveiler about how self-driving cars see, understand and react to the world around them.

Open your eyes

Seeing and understanding the world is crucial to self-driving cars

Just like any human driver, an autonomous car needs to see what's happening around it. The team at AImotive believes that, because roads are designed for human drivers, the first self-driving cars will need to be vision-based.

"We use a camera-first approach here," Lorant Pocsveiler tells New Atlas. He's sitting in the conference room of the company's US offices, a house in Mountain View converted for business use. "We think in an environment such as [our current road setup], which has been designed around humans with visual cues in mind... the camera-first approach has the best chance to identify things. It's a very rich source of information."

Cameras mounted to the roof, nose and tail of the car (with radar as a backup) work to build a picture of what's going on, essentially acting as a set of eyes and ears. Different players in the self-driving game are taking varying approaches, with many utilizing expensive Lidar, but the team at AImotive is designing a system to work with any sensors that meet its basic resolution guidelines, making it easier for manufacturers to match the software with their own hardware.
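
To make the sensor-agnostic idea concrete, here's a minimal Python sketch of how software might accept any camera that clears a resolution bar. The interface and the figures are our own assumptions for illustration, not AImotive's published spec:

from abc import ABC, abstractmethod

# Assumed minimum resolution guideline (illustrative, not AImotive's real spec)
MIN_WIDTH, MIN_HEIGHT = 1280, 720

class Camera(ABC):
    """The minimal interface the driving software would depend on."""
    @abstractmethod
    def resolution(self):
        """Return (width, height) in pixels."""
    @abstractmethod
    def read_frame(self):
        """Return the latest raw image from this camera."""

def meets_guidelines(camera):
    # Any brand can be plugged in, as long as it clears the bar
    width, height = camera.resolution()
    return width >= MIN_WIDTH and height >= MIN_HEIGHT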

Understanding what it's seeing

A look at how the car sees the world

Being able to see what's happening is one thing; being able to interpret it is another. In the AImotive car, the recognition engine breaks down raw information from the sensors and feeds it through dedicated segmentation software, which is tasked with identifying various objects. Although the software can handle up to 100 different object classes, the system currently uses just 25.

For every frame captured on camera, the software is able to create a list of what it's seeing, with details regarding size, distance and angle in relation to the car. This is all displayed on the in-car screen, which gives each different class a unique colour. Information from the cameras and sensors is overlaid with data from the location engine, which uses regular GPS data to build a clearer picture of where the car is.
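
As a rough illustration of what one entry in that per-frame list might look like, here's a short Python sketch. The class, fields and values are hypothetical, since AImotive hasn't published its data formats:

from dataclasses import dataclass

@dataclass
class Detection:
    object_class: str   # one of the ~25 classes in use (e.g. "car", "pedestrian")
    width_m: float      # estimated size
    height_m: float
    distance_m: float   # range from the ego car
    angle_deg: float    # bearing relative to the car's heading

def describe_frame(detections):
    """Print one line per object, roughly as the in-car display lists them."""
    for d in detections:
        print(f"{d.object_class}: {d.distance_m:.1f} m at {d.angle_deg:+.0f} deg, "
              f"{d.width_m:.1f} x {d.height_m:.1f} m")

describe_frame([
    Detection("car", 1.8, 1.5, 23.4, -4.0),
    Detection("pedestrian", 0.5, 1.7, 11.2, 12.0),
])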

"The location engine's main role is to figure out where the car is" Pocveiler says. "We have to know where the car is precisely. GPS is the starting point for that, however GPS might not be precise enough."


"In order to improve the location data, we also use landmarks," Pocveiler continues. "This is all about highlighting the precise position of certain traffic signs and signals. We know where a certain traffic sign should be, what the precise position is, then we figure the position of the car by measuring the distance and direction from that object. This improves the precision of where the car is."

Just as we trust what our eyes perceive over the little map on our dashboard, AImotive is building a vision-first system, meaning navigation data is there to confirm what the cameras are seeing. Obviously, it's also important the car knows what roads to take when users program a destination.

Cutting through the crowd

Given we're in California, it's not surprising to see a Prius being used

Once it knows what everything is, the car needs to plot a course. At AImotive, that task falls to the motion engine. It calculates the historical path of objects (where they've been), their current location and, using this data, where they're likely to head next. As you'd imagine, the system is constantly recalculating what it expects the cars and pedestrians around it to do, just as regular drivers are always watching the traffic around them for cues as to what's coming.
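
At its very simplest, that kind of prediction amounts to extrapolating an object's recent track. Here's a constant-velocity toy model in Python; a production motion engine would use far richer models and re-run them many times a second:

def predict_next(history, dt=0.1):
    """history: (x, y) positions sampled every dt seconds, oldest first."""
    (x0, y0), (x1, y1) = history[-2], history[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # velocity estimate
    return (x1 + vx * dt, y1 + vy * dt)       # position one step ahead

# A pedestrian drifting steadily toward the roadway
print(predict_next([(0.0, 5.0), (0.1, 4.9), (0.2, 4.8)]))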

With all this data in place, the car can actually go about plotting a path through the chaos. Once again, data about where the car wants to go can be displayed live on an in-car screen, complete with a moving arrow.

Putting the pieces together

At the moment, this is still very much a prototype

With the thinking done, the car needs to put its plans into action. Once all this information has been processed, the AImotive system electronically applies inputs to the steering, gas and brakes through its control engine.
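
As a rough sketch of that final step, here's a toy proportional controller in Python. The actuator interface and gains are invented for illustration; real drive-by-wire access is vendor- and vehicle-specific:

class Actuators:
    """Invented stand-in for a drive-by-wire interface."""
    def set_steering(self, v): print(f"steering {v:+.2f}")
    def set_throttle(self, v): print(f"throttle {v:.2f}")
    def set_brake(self, v):    print(f"brake    {v:.2f}")

def control_step(act, target_curvature, target_speed, current_speed):
    """One tick of a simple proportional controller; the real loop never stops."""
    act.set_steering(8.0 * target_curvature)   # steer toward the planned path
    error = target_speed - current_speed       # speed error in m/s
    if error >= 0:                             # too slow: throttle, no brake
        act.set_throttle(min(0.1 * error, 1.0))
        act.set_brake(0.0)
    else:                                      # too fast: brake, no throttle
        act.set_throttle(0.0)
        act.set_brake(min(-0.2 * error, 1.0))

control_step(Actuators(), target_curvature=0.01, target_speed=25.0, current_speed=23.0)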

We weren't able to see the car driving itself on our visit – the company doesn't have a driverless testing license in California yet – but we did get to see some footage of it working in a simulation (more on that later). Perhaps the highest praise we can heap on the display is that, if you hadn't been told there was no driver behind the wheel, you'd have no idea.

Obviously, this is complicated to put into action. Everything we've described needs to happen hundreds (or thousands) of times every second, and it needs to happen perfectly. There is no room for error and no room for teething problems when it's jostling with notoriously pushy Californian drivers. The software relies on perfectly calibrated cameras and sensors, too, something handled by yet another AI element of the AImotive package. In spite of this, AImotive is bullish about the potential of its system.

"I would say by the end of this year, on a technology level, a highway scenario would be in place," Pocveiler tells New Atlas. In other words, the system could be able to comfortably drive on the highway driving within twelve months. "By the end of next year we could have a city scenario covered." Just to be clear, that doesn't mean the system will be installed in a working autonomous car anytime soon, but the software capability is likely to be there.

The AImotive team doesn't have its self-driving license in California yet

That potential, and the rapid rate of development, is the result of constant testing, both in the real world and using an in-house simulator. Rather than focusing on covering millions of on-road testing miles like Waymo or Uber, the team at AImotive has put together an in-house simulation system that is able to put the car through its paces around the clock.

Perhaps more importantly, no humans are put into the line of fire during testing, and almost any scenario can be simulated. Want to know how the car will react if an elephant wanders onto the interstate? You can do that. Curious about what happens when two cars have an accident in front of you? Load it up and see what happens.
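
That flexibility comes from the fact that a simulated scenario is just data, so any situation can be composed and replayed at will. As a purely hypothetical illustration (aiSim's actual scenario format isn't public):

# Invented scenario description; aiSim's real format is not public
scenario = {
    "map": "us101_highway",
    "weather": {"rain": 0.8, "time_of_day": "night"},
    "events": [
        {"t": 5.0, "spawn": "elephant", "lane": 2},
        {"t": 12.0, "trigger": "two_car_collision_ahead"},
    ],
}

def run(sc):
    """Stub runner: a real simulator renders sensor data and scores the drive."""
    print(f"driving {sc['map']} in {sc['weather']}")
    for event in sorted(sc["events"], key=lambda e: e["t"]):
        print(f"t={event['t']}s -> {event}")

run(scenario)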

It's this level of research and development that will be integral to bringing self-driving cars to our roads. If you're keen on driving, that day might be a sad one, but when they do arrive, at least you'll have an idea what's happening under the hood.

Company page: AImotive

2 comments
LarryWolf
With no accidents in over 50 years of driving (knock on wood), I'll trust my own instincts and skills before I'll ever set foot in a robotic car. Those drones crash all the time. Hardly foolproof.
Ralf Biernacki
Visual-based systems work on one assumption: that there is good visibility. Drive in the rain at night, and the camera-based system is likely to perform poorly, unless it is augmented with laser- or radar-based 3D perception. A tree-lined road in the sun, with contrasting shadows across the road: which line is the curb? A wet, reflecting roadway at night: which lights are car lights, which are reflections? These kinds of problems are trivial if you have an auxiliary radar system. The idea that humans drive based on visual cues, so AI should too, overlooks one detail: humans are still miles ahead when it comes to interpreting low-quality visual data.