While touch screens have enabled smartphone manufacturers to increase screen real estate by ditching physical keyboards and other buttons, they have a downside: fingers often obscure the display. Researchers at ETH Zurich have developed a new app that overcomes this problem by bringing gesture control to mobile devices using their existing built-in camera.
In the case of smartphones and tablets, the use of the built-in camera means the user would generally need to make gestures behind the device, which is slightly counterintuitive and could take some getting used to. The system works by comparing the shape of a hand against a database of stored gestures. When the shape is recognized, the program executes the command associated with the gesture.
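The match-against-a-database approach described above can be sketched as a nearest-template lookup. The descriptors, gesture names, commands, and threshold below are illustrative assumptions for the sketch, not details of the ETH Zurich algorithm:

```python
# Sketch: recognize a hand gesture by comparing its outline descriptor
# against stored templates, then dispatch the associated command.
# All values here are toy assumptions, not the researchers' actual data.
import math

# Each stored gesture is a small normalized shape descriptor
# (here: a made-up 4-number outline signature).
GESTURE_DB = {
    "swipe": (0.9, 0.1, 0.4, 0.2),
    "pinch": (0.2, 0.8, 0.3, 0.7),
    "point": (0.5, 0.5, 0.9, 0.1),
}

COMMANDS = {
    "swipe": "next_page",
    "pinch": "zoom",
    "point": "select",
}

def shape_distance(a, b):
    """Euclidean distance between two shape descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(descriptor, threshold=0.3):
    """Return the command for the closest stored gesture, or None when
    no template is similar enough (unknown or ambiguous shape)."""
    best_name, best_d = None, float("inf")
    for name, template in GESTURE_DB.items():
        d = shape_distance(descriptor, template)
        if d < best_d:
            best_name, best_d = name, d
    if best_d > threshold:
        return None
    return COMMANDS[best_name]

print(recognize((0.88, 0.12, 0.42, 0.18)))  # close to "swipe" -> "next_page"
print(recognize((0.0, 0.0, 0.0, 0.0)))      # matches nothing -> None
```

The rejection threshold matters: it is what lets the program decline to act on a shape that sits between two stored gestures rather than guess.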
Because smartphone cameras are unable to register depth information like the Kinect can, the size of the hand is used to determine its distance, with the program warning the user when the hand is too close or far away.
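The size-as-distance trick follows from the pinhole-camera relation: an object's apparent pixel width shrinks in proportion to its distance. A minimal sketch, assuming an average hand width and a focal length that are placeholders rather than the app's real calibration:

```python
# Sketch: estimate hand distance from its apparent size in the image,
# and warn when it leaves the usable range. Constants are assumptions.

FOCAL_LENGTH_PX = 1500.0  # camera focal length in pixels (assumed)
HAND_WIDTH_CM = 8.5       # average adult hand width (assumed)

def hand_distance_cm(hand_width_px):
    """Pinhole model: distance = focal_length * real_width / pixel_width."""
    return FOCAL_LENGTH_PX * HAND_WIDTH_CM / hand_width_px

def range_check(hand_width_px, near_cm=20.0, far_cm=60.0):
    """Return a warning string when the hand is outside tracking range."""
    d = hand_distance_cm(hand_width_px)
    if d < near_cm:
        return "too close"
    if d > far_cm:
        return "too far"
    return "ok"

print(range_check(800))  # hand fills much of the frame -> "too close"
print(range_check(300))  # mid-range -> "ok"
print(range_check(150))  # hand appears small -> "too far"
```

Because the estimate depends on an assumed hand width, it is only approximate, which is consistent with the system merely warning the user rather than measuring depth precisely.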
Currently, the system can recognize only six different gestures, but the researchers say it is capable of far more: the practical limit is the number of gestures with unambiguous outlines, since shapes that closely resemble one another are likely to confuse the program.
The system is made possible by an algorithm developed by Jie Song, a Master's student in the working group of computer science professor Otmar Hilliges. Hilliges says the algorithm is ideal for smartphones, not only because it requires no additional hardware, but also because it uses far less memory than most other movement-recognition programs. He says this would make it suitable for a range of mobile devices, including smartphones, tablets, smart watches and augmented reality glasses.
Hilliges believes the system is the first of its kind that can be run on a smartphone, but it does sound similar to technology unveiled by Japanese company Omron in 2012. Regardless, he is certain that the technology, or parts of it, will find its way onto the market. However, he believes gesture control will supplement touchscreen control, rather than replace it.
The team's system can be seen in action in the video below.
Source: ETH Zurich