Quicker, lighter, and even smarter – Fast Lightweight Autonomy program completes Phase 2 flight tests
DARPA's Fast Lightweight Autonomy (FLA) program has successfully wound up its Phase 2 flight tests, putting its drones and algorithms to the test in three primary scenarios – all without human assistance or control. The goal of the FLA program is to develop small, affordable, maneuverable assets that can function as team members in rescue, military or hazardous scenarios.
From pre-mission reconnaissance in hostile settings to searching damaged structures for survivors following an earthquake, the technologies developed in the program will help minimize the risk to life while providing mission-critical footage and data to teams.
Building on the success of the 2016 Phase 1 flight tests, FLA researchers refined their code and hardware in the quest for greater performance while using even smaller quadcopters. The Phase 2 flight tests were run in a mock town at the Guardian Centers training facility in Perry, Georgia.
Significant progress was achieved in urban outdoor as well as indoor autonomous flight scenarios, covering three main challenges:
1) Flying at increased speeds – up to 45 mph (72 km/h) – between multi-story buildings and through tight alleyways while identifying objects of interest
2) Flying through a narrow window into a building and down a hallway, searching rooms and creating a 3D map of the interior
3) Identifying and flying down a flight of stairs, and exiting the building through an open doorway
The need for development of autonomous UAVs is clear when one considers the environmental and human challenges faced by military and civil teams. GPS navigation signals aren't always available, and a radio frequency link to the UAV may be impossible given the area of operation. It's also unfeasible to embed a skilled UAV operator within every rescue team or military patrol – especially when time is critical.
It's worth noting that while FLA's algorithms have been trialed on air vehicles alone at this point, they could be used on small, lightweight ground vehicles as well.
"The outstanding university and industry research teams working on FLA honed algorithms that in the not-too-distant future could transform lightweight, commercial-off-the-shelf air or ground unmanned vehicles into capable operational systems requiring no human input once you've provided a general heading, distance to travel, and specific items to search," said J.C. Ledé, Program Manager in DARPA's Tactical Technology Office. "Unmanned systems equipped with FLA algorithms need no remote pilot, no GPS guidance, no communications link, and no pre-programmed map of the area – the onboard software, lightweight processor, and low-cost sensors do all the work autonomously in real-time."
Less is more when it comes to UAVs and difficult environments, and so the number of onboard sensors was reduced to lighten the vehicle for higher speed and more responsive maneuvering.
"This is the lightweight autonomy program, so we're trying to make the sensor payload as light as possible," said Nick Roy, co-leader of the MIT/Draper team that stripped back the sensor suite. "In Phase 1 we had a variety of different sensors on the platform to tell us about the environment. In Phase 2 we really doubled down trying to do as much as possible with a single camera."
A key aspect of the Phase 2 tests was mapping – not just geographically accurate 2D maps (and 3D ones in the cases of tunnels and stairs), but semantic maps as well. "As the vehicle uses its sensors to quickly explore and navigate obstacles in unknown environments, it is continually creating a map as it explores and remembers any place it has already been so it can return to the starting point by itself," said Jon How, the other MIT/Draper team co-leader.
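The return-to-start behavior How describes – remembering every place already explored and routing back through that known space – can be pictured with a toy sketch. This is purely illustrative and not the FLA software: it assumes the explored area is stored as a set of visited grid cells, and finds a route home with a breadth-first search restricted to those cells.

```python
from collections import deque

def path_home(visited, current, start):
    """Breadth-first search over already-visited cells to find
    a route back to the starting point (4-connected grid)."""
    frontier = deque([(current, [current])])
    seen = {current}
    while frontier:
        cell, path = frontier.popleft()
        if cell == start:
            return path
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in visited and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None  # start not reachable through explored space

# A vehicle that has explored an L-shaped corridor:
visited = {(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)}
route = path_home(visited, current=(2, 2), start=(0, 0))
```

Because the search only expands cells the vehicle has already seen, the route home never leaves mapped space – the same guarantee that lets the real vehicle return to its starting point without GPS or a communications link.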
This isn't just a map of routes, but a map of "things" as well. Using neural nets, the onboard computer recognizes buildings, cars and other objects, and labels them as such on the map. Human team members can then download the map and images after the mission is completed.
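A semantic map of this kind attaches object labels to map coordinates, on top of the usual free/occupied geometry. The following is a hypothetical minimal sketch, not the program's actual code: the labels would come from a neural-net detector and the positions from the vehicle's pose estimate, both assumed here.

```python
class SemanticMap:
    """Toy semantic map: object labels pinned to map coordinates."""

    def __init__(self):
        self.objects = []  # list of (label, (x, y), confidence)

    def add_detection(self, label, position, confidence):
        # In a real system, `label` would come from an onboard
        # neural-net detector and `position` from the vehicle's
        # estimated pose at the time of detection.
        self.objects.append((label, position, confidence))

    def query(self, label):
        """Return all mapped positions of a given object class."""
        return [pos for lbl, pos, _ in self.objects if lbl == label]

smap = SemanticMap()
smap.add_detection("car", (12.0, 4.5), 0.91)
smap.add_detection("building", (3.0, 8.0), 0.87)
smap.add_detection("car", (20.5, 6.1), 0.78)
```

After the mission, querying the map for a class (e.g. `smap.query("car")`) yields the positions of every such object encountered – the kind of "map of things" a team could review alongside the imagery.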
The team also incorporated the ability to sync data collected by the UAV with an app called the Android Tactical Assault Kit (ATAK), already deployed by many military forces. Using a Wi-Fi link from the UAV, teams can watch real-time imagery of anything of interest. During the flight tests, the UAV successfully identified cars positioned around the mock town. With "exploration mode" on, the UAV identified the cars and plotted their location with high-resolution images in real time, appearing as an overlay on the ATAK geospatial digital map on the team's control unit.
A key feature of one of the test UAVs is its ability to create a detailed 3D map of unknown indoor spaces, as well as the ability to fly down stairwells.
"That's very important in indoor environments," said Camillo J. Taylor, one of the team leads. "Because you need to actually not just reason about a slice of the world, you need to reason about what's above you, what's below you. You might need to fly around a table or a chair, so we're forced to build a complete three-dimensional representation."
The next step, according to Taylor, is to squeeze even more computing grunt onto smaller platforms, potentially making a smart UAV for troops or first responders small enough to fit in the palm of their hand.
There's more information in the following video.