High-tech armband detects user's hand gestures

The device currently recognizes 21 different gestures
Rabaey Lab

There are already computer vision systems and sensor-equipped gloves that can detect a person's hand gestures. Scientists at the University of California, Berkeley have developed an alternative technology, however, that offers some key advantages.

Computer vision systems not only require the user's hand to be clearly visible at all times, but they also raise privacy concerns, particularly if the person's face is captured in the video. Electronic gloves, on the other hand (no pun intended), may be cumbersome, fragile, and impractical in many situations.

With these limitations in mind, a team of UC Berkeley researchers developed a computer chip-equipped thin-film armband that wraps around the user's forearm. The user starts out by performing a number of hand gestures, one at a time. As they do so, electrical sensors in the band detect nerve signals at 64 points within the arm. This data is used to train a custom artificial intelligence-based algorithm, which learns which signal patterns accompany which gestures.
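To make that calibration step concrete, here is a minimal, hypothetical sketch in Python. It assumes each gesture is recorded a few times from the 64 channels, reduces every recording window to a simple per-channel feature, and averages those into one "template" per gesture. The windowing, RMS feature, and template averaging are illustrative assumptions only; the actual on-chip training algorithm is not described here.

```python
# Illustrative sketch of the calibration step (assumptions only, not the
# authors' actual algorithm): per-gesture templates built from 64-channel data.
import numpy as np

N_CHANNELS = 64  # electrodes in the band, as reported

def features(window: np.ndarray) -> np.ndarray:
    """Reduce one (samples x 64) signal window to a 64-value feature vector.
    Root-mean-square amplitude per channel is a common, simple EMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def train_templates(recordings: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Average the feature vectors of each gesture's repetitions into one template."""
    return {
        gesture: np.mean([features(w) for w in windows], axis=0)
        for gesture, windows in recordings.items()
    }

# Toy calibration data: 5 repetitions of each gesture, 200 samples per window.
rng = np.random.default_rng(0)
calibration = {
    g: [rng.normal(loc=i, scale=0.2, size=(200, N_CHANNELS)) for _ in range(5)]
    for i, g in enumerate(["fist", "thumbs_up", "flat_hand"], start=1)
}
templates = train_templates(calibration)
```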

When the user subsequently makes one of those gestures – or even thinks about doing so – the system is able to determine which one it is by matching its distinctive nerve signal pattern with one it has already learned. It's currently able to recognize 21 different gestures, including a fist, a thumbs-up, a flat hand, holding up individual fingers, and counting off numbers.
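Continuing the illustrative sketch above, recognition then reduces to comparing the incoming window's feature vector against every stored template and reporting the closest match. The nearest-template approach and Euclidean distance are assumptions for illustration, not the method described in the paper.

```python
# Illustrative recognition step: nearest stored template wins.
def recognize(window: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    vec = features(window)
    # Pick the gesture whose template is nearest in Euclidean distance.
    return min(templates, key=lambda g: np.linalg.norm(vec - templates[g]))

test_window = rng.normal(loc=2, scale=0.2, size=(200, N_CHANNELS))
print(recognize(test_window, templates))  # expected: "thumbs_up"
```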

Importantly, the algorithm automatically updates what it's been taught, so it can compensate for new variables such as sweat on the arm, or the arm being held in an unusual position. Additionally, all of the processing takes place within the chip, so no user data is transmitted to the cloud.
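One plausible way to picture that ongoing adaptation, again continuing the sketch above, is to nudge the matched template toward each newly classified window so that slow drifts such as sweat or a changed arm position are gradually absorbed. The update rule and learning rate below are assumptions for illustration; the device's actual adaptive algorithm runs entirely on its own chip.

```python
# Illustrative online adaptation: blend each new observation into its template.
def adapt(window: np.ndarray, templates: dict[str, np.ndarray], rate: float = 0.05) -> str:
    gesture = recognize(window, templates)
    # Exponential moving average keeps the template tracking gradual signal drift.
    templates[gesture] = (1 - rate) * templates[gesture] + rate * features(window)
    return gesture
```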

It is hoped that once developed further, the technology could be utilized in applications such as gesture control of electronic devices, virtual reality, or possibly even the operation of prosthetic hands. "We have only tested the device on healthy human subjects with no amputation so far," researcher Ali Moin tells us. "However, it is proven in the literature that EMG [electromyography] pattern recognition is feasible from the signals of the residual limb in amputees, although it would generally be more challenging due to the signals being weaker."

The study, which is being led by Prof. Jan Rabaey, is described in a paper that was recently published in the journal Nature Electronics.

You can see a demonstration of the system in the video below.

Source: University of California, Berkeley
