Stanford's autonomous Audi TTS research vehicle is gaining on the performance of its human-piloted counterparts. In contrast to its slightly pedestrian romp up Pikes Peak back in 2010, the self-driving car known as Shelley has recently hit speeds of 120 mph and posted lap times only just behind those of expert race car drivers at Thunderhill Raceway in California.
Stanford's Center for Automotive Research (CARS) and Volkswagen have been collaborating on autonomous vehicles for quite some time, going back at least to their joint win of the 2005 DARPA Grand Challenge at a resounding 19 mph (31 km/h). They also took second place in the 2007 DARPA Urban Challenge.
CARS' current focus, Shelley, is several steps closer to that goal. The Audi TTS is a rather formidable car, with a turbocharged 265 hp engine and a dual-clutch automatic transmission that together drive it to an electronically limited top speed of 155 mph (250 km/h). The TTS covers 0-62 mph (0-100 km/h) in 4.7 seconds, and brakes to a stop from 62 mph (100 km/h) in 113 ft (35 m).
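For a rough sense of what those figures imply, the quoted braking distance works out to just over 1 g of average deceleration. The back-of-the-envelope calculation below is purely illustrative (constant-acceleration assumptions, not official Audi or Stanford data):

```python
# Rough check of the quoted TTS figures (illustrative only).
# Assumes constant acceleration/deceleration, which real cars only approximate.

MPS_PER_KMH = 1000 / 3600  # km/h -> m/s
G = 9.81                   # standard gravity, m/s^2

v = 100 * MPS_PER_KMH      # 100 km/h (62 mph) is about 27.8 m/s

# Average acceleration implied by 0-100 km/h in 4.7 s
accel = v / 4.7
print(f"0-100 km/h in 4.7 s -> {accel:.1f} m/s^2 ({accel / G:.2f} g average)")

# Average deceleration implied by a 35 m stop from 100 km/h: a = v^2 / (2d)
decel = v**2 / (2 * 35)
print(f"100-0 km/h in 35 m  -> {decel:.1f} m/s^2 ({decel / G:.2f} g average)")
```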
Shelley, of course, differs from a normal Audi TTS in that it has a brain. More precisely, it has a brain, sensors, and drive-by-wire controls that allow it to drive a prescribed course autonomously, although different programs may be required for different driving conditions. Make no mistake, however: Shelley's exact path is not specified in advance, only the rules by which she chooses it. Driving on public roads, for example, requires different skills than hitting high speeds on the Bonneville Salt Flats.
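Stanford has not published Shelley's control software, but the "rules, not route" idea can be illustrated with a toy example: given a corner radius and an assumed tire friction limit, the controller derives its own speed target rather than replaying a recorded one. Everything below (the friction value, the function name) is a hypothetical sketch, not Shelley's actual algorithm:

```python
import math

# Toy illustration of rule-based speed selection (not Shelley's actual code).
# The "rule" is a friction-circle limit: lateral acceleration <= mu * g.

MU = 0.9   # assumed tire-road friction coefficient (hypothetical value)
G = 9.81   # m/s^2

def corner_speed_limit(radius_m: float) -> float:
    """Maximum speed (m/s) through a corner of the given radius
    without exceeding the assumed friction limit: v = sqrt(mu * g * r)."""
    return math.sqrt(MU * G * radius_m)

# Tighter corners force lower speeds; the route itself is never hard-coded.
for radius in (30, 60, 120):
    v = corner_speed_limit(radius)
    print(f"radius {radius:>3} m -> {v:5.1f} m/s ({v * 2.237:5.1f} mph)")
```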
The Thunderhill Raceway just outside of Willows, California, is described as "a fast, fun track." The standard course measures 3.0 miles (4.83 km) per lap and features a fast series of 15 twists and turns with only minor elevation changes. The record lap time is 1:37.614, set by Chris Ferrel in a STOHR WF1 sports racer.
Shelley's best lap time was under two and a half minutes, with speeds of over 120 mph on the straightaways (the exact figures have not been made public). A professional race car driver in the same car beat Shelley's time by a few seconds. Given that the lap record for the slowest car class at Thunderhill (Group N, "Showroom Class") is just over 2:15, Shelley is clearly headed in the right direction, and the research team is looking to close the gap further.
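For context, those lap times translate into roughly the following average speeds over Thunderhill's 3.0-mile lap. This is a simple distance-over-time calculation using the approximate times quoted above, not released telemetry:

```python
# Average lap speeds implied by the approximate times in the article.
LAP_MILES = 3.0

laps = {
    "Track record (1:37.614)": 97.614,
    "Group N record (~2:15)": 135.0,
    "Shelley (just under 2:30)": 150.0,  # approximate; exact time not public
}

for label, seconds in laps.items():
    mph = LAP_MILES / (seconds / 3600)
    print(f"{label}: {mph:.0f} mph average")
```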
The pro driver at Thunderhill was the track's Chief Executive Officer, David Vodden, who is, of course, extremely familiar with the layout. For his test drive he was fitted with electrodes that recorded his vital signs, skin conductance and brain waves, while the car's instruments logged his actions and speed.
In addition, the well-known British Grand Prix driver Brian Redman and American sports car racer John Morton were recently enlisted to wear a suite of biological sensors as they raced a 1966 Ford GT40 at the Laguna Seca raceway on California's Monterey Peninsula. The car (the only American-built automobile to finish first overall at the 24 Hours of Le Mans) has been fitted with car and driver sensors similar to those on Shelley.
These experiments use remote monitoring equipment developed by the REVS program at Stanford, a sister research program to CARS, which focuses on establishing "a new trans-disciplinary field connecting the past, present and future of the automobile."
The objective is to pair this biological and psychological feedback with data from the mechanical sensors showing how the drivers actually control the car.
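As a rough illustration of what pairing the two data streams might look like in practice, the sketch below time-aligns a hypothetical biometric log with a hypothetical vehicle telemetry log on their timestamps. The column names and sampling rates are invented for the example; REVS has not published its data format:

```python
import pandas as pd

# Hypothetical driver biometrics sampled at 10 Hz
bio = pd.DataFrame({
    "t": pd.to_timedelta([0.0, 0.1, 0.2, 0.3], unit="s"),
    "heart_rate_bpm": [88, 90, 95, 101],
    "skin_conductance_uS": [2.1, 2.2, 2.6, 3.0],
})

# Hypothetical vehicle telemetry sampled at a higher rate (shown sparsely here)
car = pd.DataFrame({
    "t": pd.to_timedelta([0.00, 0.04, 0.12, 0.28], unit="s"),
    "speed_mph": [101, 103, 104, 99],
    "steering_deg": [0.0, -2.5, -8.0, -14.0],
})

# Align each biometric sample with the most recent telemetry sample
paired = pd.merge_asof(bio.sort_values("t"), car.sort_values("t"), on="t")
print(paired)
```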
"We need to know what the best drivers do that makes them so successful," says project leader and mechanical engineering Associate Professor Chris Gerdes. "If we can pair that with the vehicle dynamics data, we can better use the car's capabilities."
Check out Shelley in action at Thunderhill in the Stanford video below.
Source: Stanford University