SideSwipe lets phones read hand gestures using reflected wireless signals
Imagine if your smartphone were ringing away in your bag or pocket, and you could silence it simply by waving your hand in the air – without even taking the phone out. That could soon be a reality, thanks to technology being developed at the University of Washington. Known as SideSwipe, the experimental system allows a phone to recognize gestures via the manner in which the user's hand reflects the phone's own wireless transmissions back to it.
To be clear, some smartphones already offer gesture recognition. They rely on the phone's camera, however, which means the phone must be able to "see" the user – and battery life takes a hit, as camera use consumes a lot of power.
Using SideSwipe, on the other hand, the phone can be located anywhere that it's able to send and receive radio signals. And instead of a battery-draining camera, it utilizes a group of small, much less power-hungry antennas. The software it runs on is additionally said to be quite simple, requiring little processing power.
As the user passes their hand through the air, some of the radio signals that the phone ordinarily transmits to its cellular network are reflected back to it – even through layers of fabric. After a period of "training" in which the phone learns the movement patterns of its individual user, it can analyze those reflected signals to recognize up to 14 different mid-air gestures. These include things like sliding, tapping and hovering, which can in turn be assigned to functions such as silencing a ring, advancing through songs, or muting a speakerphone.
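To illustrate the train-then-recognize idea described above, here is a minimal sketch in Python. It assumes a gesture arrives as a short trace of reflected-signal amplitude samples (the actual SideSwipe features, antenna layout, and classifier are not public, so the feature vector and nearest-neighbor matching here are purely illustrative):

```python
import math

def features(trace):
    # Summarize an amplitude trace as a tiny feature vector:
    # mean level, spread, and net change over the gesture window.
    n = len(trace)
    mean = sum(trace) / n
    var = sum((x - mean) ** 2 for x in trace) / n
    return (mean, math.sqrt(var), trace[-1] - trace[0])

def train(labeled_traces):
    # "Training" phase: store one feature vector per example gesture
    # recorded from the individual user.
    return [(label, features(t)) for label, t in labeled_traces]

def classify(model, trace):
    # Recognize a new trace by nearest-neighbor match against the
    # trained examples (Euclidean distance in feature space).
    f = features(trace)
    return min(model, key=lambda m: math.dist(m[1], f))[0]

# Toy data: a "swipe" shifts amplitude steadily; a "tap" spikes briefly.
training = [
    ("swipe", [1.0, 1.4, 1.8, 2.2, 2.6]),
    ("tap",   [1.0, 2.5, 1.0, 1.0, 1.0]),
]
model = train(training)
print(classify(model, [1.1, 1.5, 1.9, 2.3, 2.7]))  # → swipe
```

A real system would, of course, use richer signal features and many more training examples per gesture, but the two-phase structure – learn the user's patterns, then match incoming reflections against them – is the same one the article describes.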
In tests conducted so far, SideSwipe has reportedly proven about 87 percent accurate at identifying users' gestures. The system is still under development, though, with the ultimate goal of marketing it as a piece of hardware that could be built into third-party phones.
The University of Washington has also recently created a smartphone gesture-recognition system known as AllSee, which analyzes ambient electromagnetic waves from wireless transmissions such as TV broadcasts, and detects the changes in the amplitude of these waves caused by the user's movements. SideSwipe is considerably more self-sufficient, however, as the phone provides its own wireless signal.
Source: University of Washington