Autonomous helicopters offer a highly maneuverable and versatile platform in scenarios like disaster relief operations, but programming these machines to perform complex aerobatics is a formidable challenge, unless of course they teach themselves. This system, developed by Stanford computer scientists, does just that, learning to fly by watching expert-piloted RC helicopters in the air. Not only does this artificial intelligence system produce a spectacular flying exhibition, it's seen as an important demonstration of robotic learning through observation.

According to the team that developed the system, the helicopters can fly by far the most difficult aerobatic maneuvers performed by any computer-controlled helicopter. The research is directed by professor Andrew Ng, working with graduate students Pieter Abbeel, Adam Coates, Timothy Hunter and Morgan Quigley.

The Stanford aircraft is an off-the-shelf radio control helicopter, with instrumentation added by the researchers. The helicopter carries accelerometers, gyroscopes and magnetometers, the latter of which use the Earth's magnetic field to figure out which way the helicopter is pointed. The exact location of the craft is tracked either by a GPS receiver on the helicopter or by cameras on the ground (with a larger helicopter, the entire navigation package could be airborne).
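The magnetometer's role is easy to illustrate: with the sensor held level, the direction of the Earth's field in the craft's body frame reveals the heading. The function below is a simplified sketch of that idea (the axis conventions and the assumption of level flight are ours, not details from the Stanford system, which fuses all sensors together):

```python
import math

def heading_from_magnetometer(mx, my):
    """Estimate heading in degrees clockwise from magnetic north, assuming
    the sensor is level, with mx out the nose and my out the right side."""
    # atan2 gives the field vector's angle in the body frame; negating my
    # converts it to a clockwise-from-north heading, wrapped to [0, 360).
    return math.degrees(math.atan2(-my, mx)) % 360.0

# Field straight down the nose: the craft faces magnetic north.
print(heading_from_magnetometer(1.0, 0.0))
# Field off the craft's left side: the craft faces east.
print(heading_from_magnetometer(0.0, -1.0))
```

A real system would also tilt-compensate the reading using the accelerometers, since a banked helicopter sees the field vector rotated out of the horizontal plane.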

Writing software for robotic helicopters can be a daunting task, in part because the craft itself, unlike an airplane, is inherently unstable. During a flight with an 'expert' at the radio controls, all the sensor outputs are recorded with a data logger that effectively 'watches' what the expert did during the flight. It might seem that an autonomous helicopter could fly stunts by simply replaying the exact finger movements of an expert pilot using the joysticks on the helicopter's remote controller. That approach, however, is doomed to failure because of uncontrollable variables such as gusting winds.
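A toy simulation makes the point. In the one-dimensional model below (our own illustration, not the team's dynamics model), a craft tries to hold position against a steady crosswind the recorded commands never anticipated: blind replay of the expert's 'hover' input drifts away without bound, while even a simple feedback controller that reacts to the measured error stays put.

```python
def simulate(controller, steps=200, dt=0.05, wind=0.5):
    """1-D toy model: the craft should hold position 0 while an
    unanticipated constant wind pushes it. Returns final position error."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        accel = controller(pos, vel) + wind   # wind is never measured directly
        vel += accel * dt
        pos += vel * dt
    return abs(pos)

# Open loop: blindly replay the expert's hover command (zero correction).
open_loop_error = simulate(lambda pos, vel: 0.0)

# Closed loop: a simple PD controller corrects the observed error.
closed_loop_error = simulate(lambda pos, vel: -8.0 * pos - 4.0 * vel)

print(open_loop_error)     # drifts tens of meters off station
print(closed_loop_error)   # settles within centimeters of the target
```

The feedback gains here are arbitrary; the point is only that any workable autopilot must close the loop on sensed state rather than replay stick positions.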

Early on in their research, Abbeel and Coates attempted to write computer code that would specify the commands for the desired trajectory of a helicopter flying a specific maneuver. While this hand-coded approach succeeded with novice-level flips and rolls, it flopped with the complex stunts. So the researchers had expert pilots fly entire aerobatic routines while every movement of the helicopter was recorded. As the maneuvers were repeated several times, the helicopter's trajectory inevitably varied slightly with each flight. But the learning algorithms created by Ng's team were able to discern the ideal trajectory the pilot was seeking. Thus the autonomous helicopter learned to fly the routine better and more consistently than the experts themselves.
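The core intuition is that each demonstration is a noisy version of one intended trajectory, so combining several flights cancels out the pilot's flight-to-flight wobble. The snippet below is a crude stand-in for the team's learning step, simply averaging the demonstrations pointwise (the real algorithms also time-align the flights and model the helicopter's dynamics):

```python
import math
import random

def ideal_trajectory(demos):
    """Treat each demo as a noisy observation of the same intended
    trajectory and estimate it by pointwise averaging."""
    n = len(demos[0])
    return [sum(d[t] for d in demos) / len(demos) for t in range(n)]

rng = random.Random(1)
# Pretend the pilot intends a smooth sinusoidal path, but each of the
# 10 flights is corrupted by noise (wind, imperfect stick work).
intended = [math.sin(2 * math.pi * t / 50) for t in range(50)]
demos = [[x + rng.gauss(0, 0.1) for x in intended] for _ in range(10)]

learned = ideal_trajectory(demos)
max_err = max(abs(a - b) for a, b in zip(learned, intended))
worst_demo = max(abs(d[t] - intended[t]) for d in demos for t in range(50))
print(max_err < worst_demo)   # the estimate beats every single noisy flight
```

This is why the learned routine can come out smoother than any individual expert flight: the averaging step recovers what the pilot was trying to do, not what any one flight actually did.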

During a robotic flight, some of the necessary instrumentation is mounted on the helicopter, some on the ground. Together, they continuously monitor the position, direction, orientation, velocity, acceleration and spin of the helicopter in several dimensions. A ground-based computer crunches the data, makes quick calculations and beams new flight directions to the helicopter via radio 20 times per second.
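Structurally, that ground station is a fixed-rate sense-compute-transmit loop. The sketch below shows the shape of such a loop paced at 20 updates per second; the helper names (`read_state`, `compute_command`, `transmit`) are hypothetical placeholders, not the Stanford codebase:

```python
import time

def control_loop(read_state, compute_command, transmit, hz=20, cycles=5):
    """Fixed-rate loop: each cycle reads the latest fused sensor state,
    computes a correction, and radios it out, pacing itself to `hz`."""
    period = 1.0 / hz                       # 50 ms per cycle at 20 Hz
    next_tick = time.monotonic()
    for _ in range(cycles):
        state = read_state()                # position, attitude, rates...
        transmit(compute_command(state))    # new flight directions via radio
        next_tick += period
        # Sleep until the next scheduled tick so timing errors don't drift.
        time.sleep(max(0.0, next_tick - time.monotonic()))

sent = []
control_loop(read_state=lambda: {"alt": 10.0},
             compute_command=lambda s: {"collective": 0.5},
             transmit=sent.append)
print(len(sent))   # 5 commands, spaced roughly 50 ms apart
```

Scheduling against an absolute `next_tick` rather than sleeping a fixed 50 ms keeps the average rate at 20 Hz even when one cycle's computation runs long.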

In an "airshow" demonstration (see video below), the artificial-intelligence helicopter performed a smörgåsbord of difficult maneuvers: traveling flips, rolls, loops with pirouettes, stall-turns with pirouettes, a knife-edge, an Immelmann, a slapper, an inverted tail slide and a hurricane, described as a "fast backward funnel."

There is interest in using autonomous helicopters to search for land mines in war-torn areas or to map out the hot spots of California wildfires in real time, allowing firefighters to quickly move toward or away from them. Firefighters now must often act on information that is several hours old. I don't know that there's much call for fire reconnaissance copters that perform aerobatics, though.

Paul Evans
