University of Oxford develops low-cost self-driving car system
Oxford University’s Mobile Robotics Group (MRG) has developed an autonomous navigation system for cars at a build cost of only £5,000 (US$7,700). Installed in a production Nissan LEAF, the robot car uses off-the-shelf components and is designed to take over driving on frequently used routes.
Automated driving technology already exists on several different levels – from the assisted driving systems found in some upmarket cars to full-blown robots that can drive themselves. The latter have become so advanced in recent years that some U.S. states have legalized their use on public roads. These types of vehicles can navigate everything from city streets to speedways, but fully autonomous cars have the drawback of being heavily modified vehicles with hefty price tags.
Led by Prof. Paul Newman and Dr. Ingmar Posner, the 22-member MRG team’s goal is to develop an autonomous driving system that is more affordable and can be used on standard production cars. To achieve this, the system had to be largely self-contained without the need for beacons or other infrastructure. It also needed to use standard components and have a degree of artificial intelligence.
The car chosen for MRG’s tests was a Nissan LEAF, modified to be drive-by-wire so that everything down to the turn indicators could be controlled by the car’s computers.
The technology is based on “autonomous perception.” That is, the car learns about the route and constantly monitors the immediate area in order to make driving decisions. It doesn't use GPS because satellite navigation isn't always available, isn't accurate enough for driving, and doesn't provide any information about what’s going on around the robot car. Instead, a pair of stereo cameras is installed in the car, and two scanning lasers are mounted under the front and rear bumpers.
These sensors feed data to the three computers at the heart of the autonomous driving system. One is an iPad, which acts as the user interface. It offers to drive if the car knows the route, guides the driver in setting up autonomous mode, and warns of obstacles and other situations requiring human intervention. The iPad is monitored by the Low-Level Controller (LLC), while the brunt of the work is done by the Main Vehicle Computer (MVC) installed in the boot. The three computers act in concert: if they disagree on a situation, the car slows and stops.
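The disagree-and-stop behavior described above amounts to a simple fail-safe voting rule. The sketch below is a minimal illustration of that idea, assuming each computer reports an assessment of the situation; the names, types, and structure are invented for illustration and are not MRG's actual software.

```python
# Hypothetical sketch of the three-computer agreement rule: each computer
# reports its assessment, and any disagreement triggers a controlled stop.
from enum import Enum

class Assessment(Enum):
    CLEAR = "clear"        # safe to keep driving
    OBSTACLE = "obstacle"  # something in the way
    FAULT = "fault"        # internal problem detected

def vehicle_action(ipad, llc, mvc):
    """Return the vehicle action given each computer's assessment."""
    votes = {ipad, llc, mvc}
    if len(votes) > 1:              # computers disagree: fail safe
        return "slow_and_stop"
    if Assessment.CLEAR in votes:   # unanimous all-clear
        return "continue"
    return "slow_and_stop"          # unanimous obstacle or fault
```

The fail-safe direction of the rule matters: uncertainty (disagreement) always resolves to stopping, never to continuing.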
Together these sensors and computers are used to build up a three-dimensional map of the route. This is augmented by “semantic information,” such as the location and type of road markings, traffic signs, traffic lights and lane information, as well as aerial images. Since such things can change, the system can also access the internet for updates. Only when the system has enough data and has been trained enough will it offer to drive the car.
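The "only offers to drive once it has enough data and training" gate can be pictured as a simple readiness check. This is a hypothetical sketch with invented thresholds (`min_mapped`, `min_traversals`); the article does not specify how MRG's system measures readiness.

```python
# Illustrative gate: propose autonomous mode only once the route is
# mapped in enough detail and has been driven enough times.
# Both thresholds are invented for illustration.
def offers_to_drive(mapped_fraction, traversals,
                    min_mapped=0.95, min_traversals=3):
    """True when the route map is complete enough and sufficiently rehearsed."""
    return mapped_fraction >= min_mapped and traversals >= min_traversals
```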
The system also uses probability and machine learning to build and calibrate mathematical models, which are used to teach it how to navigate the route. It monitors the road ahead for cars, pedestrians and obstacles by scanning 85 degrees ahead 13 times a second to a distance of 50 meters (164 ft). It identifies what and where objects are and where they are going, slows and stops the car if it encounters an obstacle, and continues when the obstacle moves. If need be, the driver can take back control by tapping the brake. Overall, the team says that the system essentially works like a very sophisticated cruise control.
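The forward-scan behavior has concrete numbers attached: an 85-degree sweep, repeated 13 times a second, out to 50 meters. The sketch below uses those figures to show how a range-and-bearing detection might be filtered and turned into a stop/continue decision; the corridor check and all function names are a simplified assumption, not the actual MRG controller.

```python
# Rough sketch of the forward scan: an 85-degree sector, 13 sweeps per
# second, out to 50 m (figures from the article). The lane-corridor
# decision logic is an invented simplification.
import math

SWEEP_DEG = 85.0      # total scan angle ahead of the car
SCAN_HZ = 13          # sweeps per second
MAX_RANGE_M = 50.0    # sensing horizon

def in_scan(bearing_deg, range_m):
    """Is a detection inside the forward scan sector?"""
    return abs(bearing_deg) <= SWEEP_DEG / 2 and range_m <= MAX_RANGE_M

def action(detections, corridor_halfwidth_m=1.5):
    """Stop if any in-sector detection falls within the car's lane corridor."""
    for bearing_deg, range_m in detections:
        if not in_scan(bearing_deg, range_m):
            continue
        # Lateral offset of the detection from the car's heading.
        lateral = range_m * math.sin(math.radians(bearing_deg))
        if abs(lateral) <= corridor_halfwidth_m:
            return "slow_and_stop"
    return "continue"
```

As in the article, stopping is temporary: once the obstacle detection disappears from the scan, the same logic returns "continue" and the car carries on.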
The MRG team sees an immediate future in production cars modified for autonomous driving only part of the time on frequently driven routes. They estimate that the cost of the system can be brought down from its current £5,000 to only £100 (US$155).
The video below shows the MRG autonomous car system scanning the road ahead.