Insect-imitating vision system designed to improve UAV navigation
Along with the well-known defense applications, unmanned aerial vehicles (UAVs) are also used for crop dusting, bushfire and environmental monitoring, and infrastructure inspection. Such applications can see them flying close to the ground and among obstacles, so it is of the utmost importance for pilotless craft to be able to accurately determine their heading and orientation relative to the ground. By imitating the method insects employ, Australian researchers have designed a vision-based system to provide real-time guidance for these eyes in the skies.
While UAVs employ sensors such as magnetometers, gyroscopes, and accelerometers to help them determine their heading and orientation, Richard Moore, a researcher at The Vision Centre and The Queensland Brain Institute at the University of Queensland, says they can suffer from problems such as noise-induced drift, and can be adversely affected by the motions of the aircraft or materials in the environment.
"This means that UAVs can't perform significant maneuvers without losing their sense of direction for a while," says Moore.
To overcome these problems and provide real-time guidance for UAVs, the researchers designed a vision-based system that imitates the method insects use to maintain a stabilized view of the sky and the horizon.
"If you watch a flying insect, you will see that their heads are upright when they turn their bodies," Moore says. "Keeping their heads still allows them to have a stabilized image of the horizon and the sky, which is crucial in determining their heading."
In the new system, the aircraft uses two back-to-back fish-eye lenses to capture a wide-angle view of the environment, which is then divided into sky and ground regions using cues such as brightness and color. The orientation of the boundary between the sky and ground regions allows the aircraft to determine its roll and pitch angles with respect to the horizon.
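The article doesn't publish the researchers' algorithm, but the idea can be sketched roughly in code. The thresholds, function names, and sign conventions below are all assumptions for illustration: pixels that are bright and blue-dominant are labelled sky, a line is fitted to the sky/ground boundary, and roll and pitch are read off from that line's slope and vertical offset.

```python
import numpy as np

def classify_sky(img):
    """Label pixels as sky (True) or ground (False).

    Hypothetical heuristic: sky pixels are bright and blue-dominant.
    img is an H x W x 3 float array with values in [0, 1].
    """
    brightness = img.mean(axis=2)
    blueness = img[:, :, 2] - img[:, :, 0]  # blue channel minus red
    return (brightness > 0.5) & (blueness > 0.05)

def estimate_roll_pitch(sky_mask, fov_deg=180.0):
    """Estimate roll and pitch (degrees) from a sky/ground mask.

    Assumes the sky fills each column contiguously from the top, so the
    horizon row in a column equals that column's count of sky pixels.
    Roll is the angle of the least-squares horizon line; pitch is the
    horizon's vertical offset from the image centre, scaled by the
    (assumed) vertical field of view per pixel.
    """
    h, w = sky_mask.shape
    cols = np.arange(w)
    horizon_rows = sky_mask.sum(axis=0)  # horizon row per column
    slope, intercept = np.polyfit(cols, horizon_rows, 1)
    roll = np.degrees(np.arctan(slope))
    centre_row = slope * (w - 1) / 2 + intercept
    pitch = (centre_row - h / 2) * (fov_deg / h)
    return roll, pitch
```

On a level, upright frame the horizon sits near mid-image, so both angles come out near zero; a rolled frame tilts the fitted line, and a pitched frame shifts it up or down.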
"Using its estimated orientation, the aircraft can then generate a panoramic image of the horizon, and use it as a reference," Moore explains. "The aircraft can then determine its heading direction continuously throughout the flight by producing an instantaneous horizon panorama and comparing it with the reference image."
Although similar vision-based systems have been proposed in the past, the researchers claim their new system improves on them by enabling the aircraft to learn to distinguish sky from ground by itself.
"This system doesn't need any programming before takeoff, unlike earlier ones that required lots of offline training: researchers had to manually compute the differences between the sky and the ground, then feed it into the system," says Moore. "With the new system, we only have to tell the aircraft that it's in the upright position when it starts flying. It will then use that as a starting point to work out which is sky and which is ground, and train itself to recognize the differences."
Moore says this is important because an aircraft that relies solely on prior training will run into trouble in unfamiliar environments. Conversely, its self-learning ability allows the new system to keep a record of what it "sees," update its reference base continuously, and adapt to changing environments.
In a closed-loop flight test of the new system, in which the aircraft was commanded to perform a series of 90-degree turns over four minutes, Moore says the aircraft estimated its heading much more accurately with the visual compass than by relying on other navigation sensors such as magnetic compasses and gyroscopes.
"The ability to estimate the precise roll and pitch angles and the heading direction instantaneously is crucial for UAVs, as small errors can lead to misalignments and crashes," he added.
Video of the flight tests carried out by the team from the Queensland Brain Institute at the University of Queensland can be seen below, while their paper, "A fast and adaptive method for estimating UAV attitude from the visual horizon," can be found here.