New tech could allow drone aircraft to recognize deck crews' arm signals

Aircraft carrier deck crews may one day be able to direct autonomous drones, using standard arm signals

We’ve all seen footage of flight crews on the decks of aircraft carriers, directing taxiing planes using arm signals. That’s all very well and good when they’re communicating with human pilots, but what happens as more and more human-piloted military aircraft are replaced with autonomous drones? Well, if researchers at MIT are successful in one of their latest projects, not much should change. They’re currently devising a system that would allow robotic aircraft to understand human arm gestures.

The MIT team divided the project into two parts. The first involved getting the system to identify body poses within “noisy” digital images, while the second was concerned with identifying specific gestures within a series of movements – those deck crews don’t stay still for very long.

A stereoscopic camera was used to record a number of videos for the study, in which several different people demonstrated a total of 24 gestures commonly used on aircraft carrier flight decks. While a device like the Microsoft Kinect could now pick out the body poses in that footage reasonably well, such technology wasn’t around when the study began. Instead, a system was created that picked out the positions of the subjects’ elbows and wrists, noted whether their hands were open or closed, and whether the thumbs of those hands were up or down.
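To make that concrete, here is a minimal sketch of what one frame's worth of those pose features might look like. The field names and layout are illustrative assumptions, not the representation the MIT team actually used:

```python
from dataclasses import dataclass

# Hypothetical per-frame pose record, mirroring the features described
# above: elbow and wrist positions from the stereo camera, plus discrete
# hand-open and thumb-direction states for each hand.
@dataclass
class PoseFrame:
    left_elbow: tuple[float, float, float]    # 3D position (x, y, depth)
    right_elbow: tuple[float, float, float]
    left_wrist: tuple[float, float, float]
    right_wrist: tuple[float, float, float]
    left_hand_open: bool                      # open vs. closed hand
    right_hand_open: bool
    left_thumb_up: bool | None                # None if the thumb isn't visible
    right_thumb_up: bool | None
```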

What the researchers are focusing on now is a way of sifting through all those continuous back-to-back poses, and isolating the different gestures for identification by the drones. It would take too long and require too much processing to retroactively analyze thousands of frames of video, so instead the system breaks the footage up into sequences about three seconds (or about 60 frames) in length. Because one gesture might not be fully contained within any one of those sequences, the sequences overlap one another – frames from the end of one sequence are also included in the beginning of the next.
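In code, that overlapping-window scheme could look something like the sketch below. The 60-frame window length comes from the article; the amount of overlap is an assumption:

```python
def overlapping_windows(frames, window=60, overlap=10):
    """Split a frame stream into fixed-length sequences whose ends overlap,
    so a gesture that straddles one window boundary still appears whole
    in the neighbouring window."""
    step = window - overlap
    start = 0
    while start < len(frames):
        yield frames[start:start + window]  # the final window may be shorter
        start += step
```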

The system starts by analyzing the person’s body pose in each frame. It then cross-references that pose with each of the 24 possible gestures, and uses an algorithm to calculate which gesture is most likely being made. This estimation process is then applied to the string of poses that make up the whole sequence, and then to several successive sequences.
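One simple way to implement that estimation step, assuming some trained per-frame model `frame_likelihood(pose, g)` that scores how well a pose matches gesture `g`, is to accumulate log-likelihoods across the window. This is an illustrative stand-in, not necessarily the algorithm the MIT team used:

```python
import math

NUM_GESTURES = 24  # the gesture vocabulary described in the article

def classify_window(poses, frame_likelihood):
    """Return the index of the gesture most likely to span this window.
    `frame_likelihood(pose, g)` is a hypothetical trained model giving
    P(pose | gesture g); summing its logs over the window treats the
    frames as accumulating evidence for each candidate gesture."""
    scores = [0.0] * NUM_GESTURES
    for pose in poses:
        for g in range(NUM_GESTURES):
            scores[g] += math.log(max(frame_likelihood(pose, g), 1e-12))
    return max(range(NUM_GESTURES), key=scores.__getitem__)
```

Running a classifier like this over successive overlapping windows, and comparing their outputs, is what lets the system settle on a gesture even when no single window contains it cleanly.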

So far, in identifying gestures from the video database, it’s managed an accuracy rate of about 76 percent. However, the researchers are confident that by refining the algorithms, that rate could be vastly improved.

More details are available in the video below.

Source: MIT

Guiding robot planes with hand gestures

7 comments
flink
FYI, the signals used by plane captains and other ground crew are used almost everywhere you find operating aircraft, including civilian airports. Ground crews also use hand signals to communicate between themselves when working around aircraft, where it is frequently too noisy to hear even shouted requests.
Dawar Saify
Why not replace the human crews waving their arms with drones?
Rob Ayotte
So they are putting a limited Kinect on the things... is this a major step?
WebsterG
Anything less than 100% accuracy on a flight deck would spell disaster. Having a drone's computer "best guessing" what the signal is won't do.
warren52nz
Why not use an electronic homing device? They can't sneeze or make mistakes.
Foxy1968
Why isn't the entire process handed over to a computerised auto-pilot system once the aircraft is on the flight deck, removing the need for any signalling and taking the human error factor out of the equation, and out of a very volatile and dangerous location?
Control can be handed back to the pilot or drone for takeoff, and the system could hold until the aircraft safely takes off or lands before processing the next move.
A human is a single point of failure with no redundancy. A computerised system would have multiple redundancies and require multiple points of failure before becoming inoperable, and it never gets tired, never gets divorced and never has a pilot having an affair with a ground crew's wife or husband etc.
Slowburn
re; Foxy1968
A computer cannot problem-solve.