AImotive develops "worldwide any weather" self-driving software
Autonomous cars are coming. Manufacturers are in a race at the moment, trying to balance progress with public safety and ever-changing regulations. Along with the big players, third parties are working on their own kits, which can be slotted into essentially any car. One such company says it has created the first artificial intelligence ecosystem for autonomous driving regardless of location, local driving styles or conditions.
AImotive, previously known as AdasWorks, says it has developed a full-stack software system for autonomous driving. Unlike some other third parties and manufacturers, the company hasn't tried to develop its own hardware and chips; instead, it has focused its attention on the software needed to make a car drive itself.
AImotive's self-driving suite is named aiDrive, and consists of a recognition engine, a location engine, a motion engine and a control engine. According to Niko Eiden, chief operations officer at AImotive, the recognition engine is at the core of the brand's work on self-driving.
"We believe the central architecture will be the key requirement for autonomous driving cars," he says. "Because the amount of data you need from all of the cameras is huge, and having the camera process the data with an embedded system in between and then forwarding the data to a centralized unit just doesn't work. It has to be real time, it has to be time stamped, we have to be able to be sure the frames we get from each camera are from exactly the same moment... If you can do recognition, you can do the rest."
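The synchronization requirement Eiden describes can be illustrated with a toy check: frames from every camera carry timestamps, and fusion should only proceed when they all fall within a narrow window. This is a sketch under assumed names and tolerances, not AImotive's actual code.

```python
def frames_in_sync(timestamps_ms, tolerance_ms=5):
    """Return True if all camera frame timestamps (in milliseconds)
    fall within the same tolerance window, i.e. were captured at
    effectively the same moment."""
    return max(timestamps_ms) - min(timestamps_ms) <= tolerance_ms

# Three cameras within 4 ms of each other: usable for fusion.
print(frames_in_sync([1000, 1002, 1004]))  # True
# Two cameras 12 ms apart: reject and wait for matching frames.
print(frames_in_sync([1000, 1012]))        # False
```

A real system would do this per frame set at the camera frame rate, but the structure of the check is the same.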
This approach is all about being able to run "worldwide in any weather." In other words, the system needs to work under any driving conditions, surrounded by drivers of all types, anywhere in the world.
To make that happen, the recognition engine takes information from between six and twelve cameras and breaks it down in incredibly fine detail. Although it can be set up to handle 100 different classes of object, the company currently uses just 25, which cover everything from pedestrians to lane markings and footpaths. For every frame captured by the cameras, the engine is able to create a list of what it is seeing – the object's class, its distance and angle in relation to the car, and how big it is.
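The per-frame output described above can be sketched as a simple data structure: one record per detected object, carrying class, distance, angle and size. All names here are illustrative assumptions, not AImotive's API.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    object_class: str   # one of the ~25 classes, e.g. "pedestrian"
    distance_m: float   # distance from the car, in metres
    angle_deg: float    # bearing relative to the car's heading
    size_m: float       # approximate extent of the object

def summarize_frame(detections):
    """Group one frame's detections by object class."""
    summary = {}
    for d in detections:
        summary.setdefault(d.object_class, []).append(d)
    return summary

frame = [
    Detection("pedestrian", 12.5, -8.0, 1.7),
    Detection("lane_marking", 3.0, 0.0, 0.15),
    Detection("pedestrian", 30.0, 15.0, 1.8),
]
print(sorted(summarize_frame(frame)))  # ['lane_marking', 'pedestrian']
```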
The information from the recognition engine works with data from the location engine to place the car on the road. Eiden says the location engine is a simple navigation system, like the kind you'd find in any car sat-nav, providing the car's realtime location on a map to compare with the camera data. All the information about where the car is, both from the recognition and location engines, is then fed to the motion engine.
"It [the motion engine] starts calculating the tracking, the historical path of every object it sees around it. It gives the object an individual ID and then it starts calculating... based on that, we can calculate the future path of every object," explains Eiden. "Once you have everything – you know what's happening around you, where the car is, and where you want to go – the motion engine can calculate the next step of where we want the car to be."
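The idea Eiden describes – assign each object an ID, keep its historical path, and predict where it goes next – can be shown with a minimal tracker using constant-velocity extrapolation. This is a toy sketch of the concept, not AImotive's motion engine.

```python
class Tracker:
    """Toy object tracker: per-object ID, position history,
    and a naive constant-velocity prediction of the future path."""

    def __init__(self):
        self.next_id = 0
        self.tracks = {}  # object ID -> list of (x, y) positions

    def new_track(self, pos):
        tid = self.next_id
        self.next_id += 1
        self.tracks[tid] = [pos]
        return tid

    def update(self, tid, pos):
        self.tracks[tid].append(pos)

    def predict(self, tid, steps=1):
        """Extrapolate from the last two observed positions."""
        hist = self.tracks[tid]
        if len(hist) < 2:
            return hist[-1]
        (x0, y0), (x1, y1) = hist[-2], hist[-1]
        return (x1 + steps * (x1 - x0), y1 + steps * (y1 - y0))

t = Tracker()
tid = t.new_track((0.0, 0.0))
t.update(tid, (1.0, 0.5))   # object moved 1.0 right, 0.5 up
print(t.predict(tid))       # (2.0, 1.0)
```

A production system would use far richer motion models, but the ID-plus-history structure is the core of the approach described.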
From there, it's simply a matter of actually making inputs to steering, throttle and brakes. Sounds simple, doesn't it? Unfortunately, it's not quite that easy. The car has to repeat the process described above hundreds of times every second. With no mistakes, no misreads, no breakdowns in communication. Not a simple task, but one AImotive is confident it can pull off.
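Structurally, the cycle the article describes is a sense-plan-act loop run at a fixed rate. The sketch below shows only that structure, with placeholder functions standing in for the four engines; real systems run this hundreds of times per second under hard real-time guarantees.

```python
def drive_loop(perceive, locate, plan, act, ticks, hz=100):
    """Run the sense-plan-act cycle `ticks` times at `hz` iterations
    per simulated second; returns the simulated elapsed time."""
    dt = 1.0 / hz
    elapsed = 0.0
    for _ in range(ticks):
        scene = perceive()           # recognition engine
        pose = locate()              # location engine
        command = plan(scene, pose)  # motion engine
        act(command)                 # control engine: steer/throttle/brake
        elapsed += dt
    return elapsed

commands = []
secs = drive_loop(
    perceive=lambda: "scene",
    locate=lambda: "pose",
    plan=lambda scene, pose: "steer",
    act=commands.append,
    ticks=300,
)
print(len(commands), round(secs, 2))  # 300 3.0
```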
Rather than just being effective on the roads around Budapest or Mountain View, California, the team at AImotive wants the system to work anywhere in the world. Like an American landing in Europe, the system may not be versed in the local language or customs, but it should still be able to drive without having a panic attack.
Instead of focusing on covering lots of autonomous miles and collecting huge amounts of data, an approach taken by Tesla and Google, AImotive is creating a game engine where a simulated car driven by the company's software system can be put through its paces. It's a tool that allows the system to be tested around the clock, without putting any humans in the line of fire.
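Simulation-based testing of the kind described can be reduced to a toy example: script a virtual scenario with hazards, let a software controller drive through it, and check for collisions – with no human at risk. Everything here (the grid world, the controller, the names) is an illustrative assumption about the approach, not AImotive's simulator.

```python
def run_scenario(controller, hazards, length=20):
    """Drive a car through a two-lane strip one cell per tick.
    `hazards` is a set of (position, lane) cells. The controller
    picks a lane for each upcoming cell. Returns True if the car
    reaches the end without a collision."""
    lane = 0
    for pos in range(1, length + 1):
        lane = controller(pos, lane, hazards)
        if (pos, lane) in hazards:
            return False  # collision: scenario failed
    return True

def cautious_controller(pos, lane, hazards):
    # Change lanes if the next cell in the current lane is blocked.
    return 1 - lane if (pos, lane) in hazards else lane

hazards = {(5, 0), (9, 1), (14, 0)}
print(run_scenario(cautious_controller, hazards))  # True
```

The appeal of the approach is that thousands of such scenarios, including dangerous edge cases, can be run around the clock at no physical risk.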
All of this tech won't be locked up in one manufacturer or tied to one hardware ecosystem; it's being designed to work with cameras and chips from essentially any brand, an effort aided by the recent granting of a Dual Neural Network Standard with Khronos. AImotive wants to make it easier for hardware manufacturers, or car manufacturers in general, to weave its software into their products.
According to the company, it's the first to enable an artificial intelligence ecosystem for autonomous driving regardless of location, local driving styles or conditions. It's designed to be a Level 5 technology, meaning passengers simply need to input their destination before kicking back and relaxing in the cabin.