There have been numerous attempts over the years to break the decades-long stranglehold the keyboard and mouse have had on the human-to-computer interface by providing some semblance of Minority Report-like gesture control. Apotact Labs recently joined the fray with a four-finger glove-like design called Gest that allows you to control your computer and your mobile devices with your hands.
Gest is described as a digital toolkit consisting of two components: a gesture controller that slips onto your hand, and an SDK that allows anyone to build new applications for the platform. The controller is designed to fit any hand via an adjustable palm strap and four moldable finger mounts. Each controller packs 15 discrete sensors, with each finger carrying the same standard accelerometer, gyroscope and magnetometer combination found in a smartphone.
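Accelerometer and gyroscope readings like these are typically fused to track orientation, since the gyroscope is responsive but drifts while the accelerometer is noisy but drift-free. As a minimal illustration of the idea (a textbook complementary filter, not Apotact's actual pipeline), one could blend the two signals like this:

```python
import math

def accel_to_pitch(ax, ay, az):
    """Estimate pitch (radians) from gravity as measured by the accelerometer."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the gyroscope's integrated rate (fast but drifting) with the
    accelerometer-derived angle (noisy but drift-free)."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Example: a finger held level reads ~1 g straight down on the z axis.
level_pitch = accel_to_pitch(0.0, 0.0, 1.0)  # ≈ 0 radians
```

Running one filter per finger segment is one plausible way a device like this could turn raw inertial data into the pose estimates its gesture models consume.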
The software that powers Gest lets it sense small movements, and by monitoring and learning how you use your hands it builds a personalized model that is unique to each user and adapts over time. Apotact Labs claims the result is more precise and accurate gesture control.
Some other gesture-controlled devices currently available include the muscle-sensing Myo armband from Thalmic Labs, which uses broad gestures to control a variety of devices, and the smaller Leap Motion controller, which uses cameras and infrared light to build a model of your hand movements. Unlike these devices, Apotact promises a greater degree of accuracy with smaller movements, which is meant to appeal to artists and designers looking for greater precision.
Its first application will be for Adobe Photoshop, and each device will come with a built-in library of five standard gestures so it can be used right out of the box. For example, switching between apps is done with a twitch of a finger, pointing at the screen moves the mouse cursor, and a twist of your palm adjusts Photoshop sliders. 3D objects can also be rotated by “grabbing” them and turning them in your hand.
If you don’t want to use Gest’s custom skeletal models and motion-processed data, you can access the raw sensor data and use the provided Java and Python APIs to create your own models.
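The article doesn't document the SDK's actual calls, but "creating your own models" from raw sensor data usually means extracting features from windows of readings and classifying them. As a hedged sketch, assuming the SDK hands you windows of raw (ax, ay, az) accelerometer tuples, a nearest-centroid gesture classifier could look like this:

```python
def features(samples):
    """Reduce a window of raw (ax, ay, az) readings to per-axis mean and variance."""
    n = len(samples)
    means = [sum(s[i] for s in samples) / n for i in range(3)]
    variances = [sum((s[i] - means[i]) ** 2 for s in samples) / n for i in range(3)]
    return means + variances

class CentroidGestureModel:
    """Nearest-centroid classifier: one averaged feature vector per gesture label."""
    def __init__(self):
        self.centroids = {}

    def train(self, label, windows):
        vecs = [features(w) for w in windows]
        n = len(vecs)
        self.centroids[label] = [sum(v[i] for v in vecs) / n
                                 for i in range(len(vecs[0]))]

    def predict(self, window):
        f = features(window)
        return min(self.centroids,
                   key=lambda lbl: sum((a - b) ** 2
                                       for a, b in zip(f, self.centroids[lbl])))
```

A user would record a few example windows per gesture, call `train` for each label, then feed live windows to `predict`. Real models would use richer features and a stronger classifier, but the train-then-predict shape is the same.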
The device connects to smartphones, tablets and PCs via Bluetooth Low Energy (BLE), with an LED telling you when you’re connected and what mode you’re in. The device can be recharged through its Micro USB port.
Apotact Labs is also working on a typing proof-of-concept that uses a neural net to handle word prediction. The idea is to wear a Gest controller on each hand to turn any surface into a keyboard; however, this is still at the experimental stage.
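Word prediction of this kind maps recently typed words to likely next words. Apotact's neural net isn't described in any detail, so as a stand-in, here is the simplest possible version of the idea: a bigram frequency model that suggests the most common follower of the previous word.

```python
from collections import Counter, defaultdict

class BigramPredictor:
    """Toy bigram model standing in for a neural word predictor:
    given the previous word, suggest its most frequent follower."""
    def __init__(self, corpus):
        self.followers = defaultdict(Counter)
        words = corpus.split()
        for prev, nxt in zip(words, words[1:]):
            self.followers[prev][nxt] += 1

    def predict(self, prev_word):
        counts = self.followers.get(prev_word)
        return counts.most_common(1)[0][0] if counts else None
```

A neural net generalizes this by conditioning on longer context and on noisy finger-motion input rather than clean text, which is what makes surface-agnostic typing plausible.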
The company launched a Kickstarter campaign on October 30 to raise US$100,000 in funding. It has already exceeded that goal and sold out of its early bird single controller, which was priced at $99. However, single controllers are still available at a higher early bird price of $149. Once the retail version is available, Apotact Labs says a single Gest controller will sell for $200, or $400 for a pair. If all goes to plan, the company hopes to ship to backers by November 2016.
The following video illustrates how Gest works with both hand and finger tracking.