Automotive

AImotive develops "worldwide any weather" self-driving software

At the moment, autonomous cars look like rolling laboratories. That might not be the case for much longer

Autonomous cars are coming. Manufacturers are racing to balance progress with public safety and ever-changing regulations. Alongside the big players, third parties are working on kits that can be slotted into essentially any car. One such company says it has created the first artificial intelligence ecosystem for autonomous driving regardless of location, local driving styles or conditions.

AImotive, previously known as AdasWorks, says it has developed a full-stack software system for autonomous driving. Unlike some other third parties and manufacturers, the company hasn't tried to develop its own hardware and chips; instead, it has focused its attention on the software needed to make a car drive itself.

The suite of elements making up the AImotive self-drive system is named aiDrive, and it consists of four modules: a recognition engine, a location engine, a motion engine and a control engine. According to Niko Eiden, chief operations officer at AImotive, the recognition engine is at the core of the brand's work on self-driving.

"We believe the central architecture will be the key requirement for autonomous driving cars," he says. "Because the amount of data you need from all of the cameras is huge, and having the camera process the data with an embedded system in between and then forwarding the data to a centralized unit just doesn't work. It has to be real time, it has to be time stamped, we have to be able to be sure the frames we get from each camera are from exactly the same moment... If you can do recognition, you can do the rest."

Cameras are key in the AImotive system

This approach is all about being able to run "worldwide in any weather." In other words, the system needs to work under any driving conditions, surrounded by drivers of all types, anywhere in the world.

To make that happen, the recognition engine takes information from between six and twelve cameras and breaks it down in incredibly fine detail. Although it can be set up to handle 100 different classes of object, the company currently uses just 25, covering everything from pedestrians to line markings and footpaths. For every frame captured by the cameras, the engine creates a list of what it is seeing: each object's class, its distance and angle in relation to the car, and how big it is.
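The article doesn't reveal aiDrive's internal data format, but the per-frame output it describes can be sketched as a simple record per detected object. The class and function names below are illustrative assumptions, not AImotive's API:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One recognized object in a single camera frame (illustrative, not AImotive's format)."""
    object_class: str   # e.g. "pedestrian", "lane_marking" (one of the ~25 classes in use)
    distance_m: float   # distance from the car, in metres
    angle_deg: float    # bearing relative to the car's heading
    size_m: float       # apparent physical size

def describe_frame(detections: list[Detection]) -> list[str]:
    """Summarize what the recognition engine 'sees' in one frame."""
    return [f"{d.object_class} at {d.distance_m:.1f} m, {d.angle_deg:+.0f} deg"
            for d in detections]

frame = [Detection("pedestrian", 12.4, -8.0, 1.7),
         Detection("lane_marking", 3.0, 0.0, 0.15)]
print(describe_frame(frame))
```

One such list is produced for every frame, from every camera, which hints at why the company insists on centralized, time-stamped processing.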

The information from the recognition engine works with data from the location engine to place the car on the road. Eiden says the location engine is a simple navigation system, like the kind you'd find in any car sat-nav, providing the car's real-time location on a map to compare with the camera data. All of this positional information, from both the recognition and location engines, is then fed to the motion engine.

"It [the motion engine] starts calculating the tracking, the historical path of every object it sees around it. It gives the object an individual ID and then it starts calculating... based on that, we can calculate the future path of every object" explains Eiden. "Once you have everything – you know what's happening around you, where the car is, and where you want to go – the motion engine can calculate the next step of where we want the car to be."
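The tracking-and-prediction idea Eiden describes, assigning each object an ID, recording its historical path and extrapolating a future one, can be sketched with a toy constant-velocity model. This model and the class below are assumptions for illustration only, not aiDrive's actual method:

```python
# Toy tracker: each observed object gets an ID, its past positions are stored,
# and future positions are extrapolated from its last two observations.
# Constant-velocity extrapolation is an illustrative assumption.

class MotionTracker:
    def __init__(self):
        self.tracks = {}    # object id -> list of (x, y) positions
        self.next_id = 0

    def new_object(self) -> int:
        oid = self.next_id
        self.next_id += 1
        self.tracks[oid] = []
        return oid

    def observe(self, oid: int, x: float, y: float) -> None:
        """Record one time-stamped position for a tracked object."""
        self.tracks[oid].append((x, y))

    def predict(self, oid: int, steps: int):
        """Extrapolate future positions from the last two observations."""
        hist = self.tracks[oid]
        if len(hist) < 2:
            return [hist[-1]] * steps if hist else []
        (x0, y0), (x1, y1) = hist[-2], hist[-1]
        vx, vy = x1 - x0, y1 - y0
        return [(x1 + vx * k, y1 + vy * k) for k in range(1, steps + 1)]

tracker = MotionTracker()
cyclist = tracker.new_object()
tracker.observe(cyclist, 0.0, 0.0)
tracker.observe(cyclist, 1.0, 0.5)      # moving +1.0 in x, +0.5 in y per frame
print(tracker.predict(cyclist, 3))      # three extrapolated future positions
```

A production system would use far more sophisticated motion models, but the shape of the data flow (ID, history, predicted path) matches the description in the quote.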

From there, it's simply a matter of actually making inputs to steering, throttle and brakes. Sounds simple, doesn't it? Unfortunately, it's not quite that easy: the car has to repeat the process described above hundreds of times every second, with no mistakes, no misreads and no breakdowns in communication. Not a simple task, but one AImotive is confident it can pull off.
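The repeated perceive-plan-act cycle amounts to a fixed-rate control loop. The sketch below shows the structure under stated assumptions; the rate, function names and stage boundaries are illustrative, not AImotive's implementation:

```python
import time

def run_control_loop(sense, plan, act, hz: float, cycles: int) -> None:
    """Run a perceive-plan-act loop at a fixed rate (illustrative sketch)."""
    period = 1.0 / hz
    for _ in range(cycles):
        start = time.monotonic()
        world = sense()          # recognition + location engines: what's around us?
        command = plan(world)    # motion engine: where should the car be next?
        act(command)             # control engine: steering, throttle, brakes
        # sleep out the remainder of the period so the loop rate stays fixed
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)
```

Running this hundreds of times per second means each full cycle has only a few milliseconds for perception, planning and actuation combined, which is why the article stresses real-time, time-stamped camera data.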

Rather than just being effective on the roads around Budapest or Mountain View, California, the team at AImotive wants the system to work anywhere in the world. Like an American driver landing in Europe, it might not be versed in the local language or customs, but it should still be able to drive without having a panic attack.

AImotive is vision based, which means lots of cameras

Instead of focusing on covering lots of autonomous miles and collecting huge amounts of data, an approach taken by Tesla and Google, AImotive is creating a game engine where a simulated car driven by the company's software system can be put through its paces. It's a tool that allows the system to be tested around the clock, without putting any humans in the line of fire.
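The appeal of simulation is that one policy can be exercised against thousands of randomized scenarios around the clock. The harness below is a stand-in for illustration; the scenario model, pass criterion and policy are assumptions, not AImotive's engine:

```python
import random

def simulate(policy, scenario_seed: int, steps: int = 100) -> bool:
    """Run one car-following scenario; True if the policy avoids a collision."""
    rng = random.Random(scenario_seed)
    gap = 50.0                               # metres to the simulated lead car
    for _ in range(steps):
        lead_speed = rng.uniform(0.0, 2.0)   # lead car varies its speed
        our_speed = policy(gap)
        gap += lead_speed - our_speed
        if gap <= 0:
            return False                     # collision: scenario failed
    return True

def cautious_policy(gap: float) -> float:
    """Toy policy: slow down proportionally as the gap shrinks."""
    return min(2.0, gap / 10.0)

# Exercise the same policy against 1,000 seeded scenarios, no humans at risk.
results = [simulate(cautious_policy, seed) for seed in range(1000)]
print(f"{sum(results)} / {len(results)} scenarios passed")
```

Seeding each scenario makes failures reproducible, which is the practical advantage of simulator testing over logging road miles: a crash found overnight can be replayed exactly.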

All of this tech won't be locked up with one manufacturer or tied to one hardware ecosystem; it's being designed to work with cameras and chips from essentially any brand, efforts aided by the recent granting of a Dual Neural Network Standard with Khronos. AImotive wants to make it easier for hardware manufacturers, and car manufacturers in general, to weave its software into their products.

According to the company, it's the first to enable an artificial intelligence ecosystem for autonomous driving regardless of location, local driving styles or conditions. It's designed to be a Level 5 technology, meaning passengers simply need to input their destination before kicking back and relaxing in the cabin.

Source: AImotive

5 comments
Helios
A system of this sort is all well and good in the transition to autonomous vehicles, but the irony is that it may help delay full adoption of autonomous vehicles. Who will want to be an early adopter of autonomous travel while everyone else is still driving as fast as they want, changing lanes as often as they want, tailgating someone or brake-checking them? We could have autonomous vehicles on the road today if we removed the weak link in transportation: the human element. We need to create a deadline for adoption of autonomous vehicles, then eliminate all human driving.
Bob Flint
How many cases or objects are there in the real world? Thousands, tens of thousands, or maybe near infinite, once the objects react and move at different rates and in different directions simultaneously, plus the varying weather conditions acting upon them to constantly varying degrees. Six to 12 cameras alone won't be enough, and by the time reliability is up around 80 percent there will be no room left for a passenger, or even a driver, at a cost that only a few dare afford...
christopher
I saved the life of a huge blue-tongued lizard last week by pausing and waiting patiently for the slow-ass thing to get off the (empty) road.
eric.verhulst@altreonic.com
It's impressive, but claiming Level 5 is over the top. They just develop the software (granted, important), but I have my doubts that with just cameras it can work under all circumstances (cf. the Tesla accident). Level 4 and Level 5 also require that the system is fault tolerant. Where's the redundancy in their system? And how can this be developed without taking the hardware into account? The computer can blue-screen in just a few nanoseconds. And wait for Linux to reboot?
PoppyAnn
Looking at the first video, it looked like the system was watching all the people but not bothering to look at the traffic lights. Is it liable to set off when there are no cars around and not bother waiting for the lights to change?