Touch Bionics has unveiled the latest enhancements to its i-limb Ultra Revolution at OTWorld 2014 International Congress. Users can now set and assign different grips for different objects and configure the prosthetic hand via Android apps.
Touch Bionics calls the i-limb Ultra Revolution "the most advanced and versatile prosthesis available," and says it "offers more dexterity and moves more like a natural hand than any other powered prosthetic hand."
The i-limb was launched in 2007 and was the world's first fully articulating and commercially available bionic hand. The Ultra Revolution followed last year and is used, amongst others, by social psychologist Dr. Bertolt Meyer, who last year presented a TV program looking at state-of-the-art artificial limbs, organs and blood.
The Ultra Revolution allows users to program a number of primary grip types into it that can be triggered by different muscle movements. These are generally the grips the user needs most often. The recent upgrades focus on increasing grip adaptability.
Touch Bionics has introduced "Grip Chips," Bluetooth-enabled devices that can be stuck to objects and will trigger a pre-programmed grip configuration when detected by the Ultra Revolution. For example, a Grip Chip might be stuck to a keyboard to initiate a grip pattern best suited to typing. They're useful for triggering specific grip patterns that are used regularly, but perhaps not often enough to warrant programming into the Ultra Revolution itself for triggering via muscle movement.
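Touch Bionics hasn't published how Grip Chips signal the hand, but the behavior described above amounts to a proximity-triggered lookup. A minimal sketch, in which the scan results, chip IDs, grip names and signal-strength threshold are all invented stand-ins rather than the actual i-limb protocol:

```python
# Hypothetical sketch of proximity-triggered grip selection.
# nearby_chips stands in for the output of a real Bluetooth scan;
# it is imagined as (chip_id, rssi) pairs for Grip Chips in range.

CHIP_TO_GRIP = {
    "chip-keyboard": "typing_grip",    # Grip Chip stuck to a keyboard
    "chip-mug":      "cylinder_grip",  # Grip Chip stuck to a mug
}

RSSI_THRESHOLD = -60  # assumed signal strength marking "close enough"

def select_grip(nearby_chips, current_grip):
    """Switch to a pre-programmed grip when a known chip is close by."""
    for chip_id, rssi in nearby_chips:
        grip = CHIP_TO_GRIP.get(chip_id)
        if grip and rssi > RSSI_THRESHOLD:
            return grip  # first sufficiently close known chip wins
    return current_grip  # no chip in range: keep the current grip

# Example: the hand is near the keyboard chip, the mug is far away.
print(select_grip([("chip-mug", -80), ("chip-keyboard", -45)], "open"))
# -> typing_grip
```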
In addition to the introduction of Grip Chips, the biosim and my i-limb mobile apps have been updated to provide users with up to 12 additional custom grips, taking the total to 36 different grip options. Like the Grip Chips, the apps allow users to save infrequently used grip options for quick access when they are required.
Touch Bionics has also announced that all i-limb prostheses now have compatible Android apps, instead of just iOS apps, and that the silicone skin fingertips of the Ultra Revolution have been made conductive so that wearers can use touch-screen devices. This will be of particular benefit to bilateral wearers.
Source: Touch Bionics
On to the larger point: I'm much more curious about the next step, which is using the frontal lobe to control the limb instead of peripheral nerves, to truly integrate the artificial limb into the organic structure of the human body. Imagine waking up after a car accident or injury and having to be told that the hand you just used to drink water from a glass isn't your natural hand. We truly do live in a fascinating and inspiring point in human history...
Short of directly reading the nervous system, maybe the best way to control the prosthetic arm is to read micro-movements in the other hand.
For example, take something like the Razer Orbweaver, which is meant for gaming: each of its keys can be macro'd to serve multiple purposes based on combinations (16, alt + 16, shift + 16, alt + space + 16, etc. are all uniquely different). Combine it with a mouse and you could power a prosthetic arm/hand with a lot more accuracy than by learning to use the couple of large muscles remaining in the partial arm. I understand that solves one problem and creates another, but instead of resting your hand on a keypad, build it into a glove like the Nintendo Power Glove that recognizes gestures like Leap Motion.
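To put rough numbers on that chording idea: with 16 keys and three modifiers, every modifier subset crossed with every key is a distinct chord. A quick sketch in Python, where the key names, modifiers and grip commands are all hypothetical:

```python
from itertools import combinations

KEYS = [f"k{i}" for i in range(1, 17)]  # the Orbweaver's 16 thumb-pad keys
MODIFIERS = ["alt", "shift", "space"]   # modifier keys, as in the comment

# Each command is a chord: zero or more modifiers plus one key.
# Three modifiers give 2**3 = 8 modifier subsets, so
# 8 * 16 = 128 distinct single-key chords before adding mouse input.
def all_chords():
    for r in range(len(MODIFIERS) + 1):
        for mods in combinations(MODIFIERS, r):
            for key in KEYS:
                yield frozenset(mods) | {key}

print(len(list(all_chords())))  # 128

# A hypothetical mapping from a few chords to prosthetic commands:
CHORD_TO_COMMAND = {
    frozenset({"k1"}):                 "pinch_grip",
    frozenset({"alt", "k1"}):          "power_grip",
    frozenset({"alt", "space", "k1"}): "rotate_wrist_cw",
}
```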
The difficulty is differentiating between normal movements of the good hand and commands sent to the prosthetic, but the normal hand is capable of hundreds of individual movements, combinations of movements, double-clicks, etc., which even while holding an object offer a granularity of control over a prosthetic that you wouldn't have using only large shoulder/arm muscles.
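One way to handle that differentiation, sketched below, is an explicit arm/disarm chord: the prosthetic ignores all glove input until the user deliberately enters command mode, the way gaming keypads gate macros behind a mode key. The chord names and command strings here are invented for illustration:

```python
ARM_CHORD = frozenset({"alt", "shift", "space"})  # assumed deliberate 3-key chord

class ChordController:
    """Ignore all input until the user deliberately arms command mode."""

    def __init__(self, chord_map):
        self.chord_map = chord_map
        self.armed = False

    def handle(self, pressed_keys):
        chord = frozenset(pressed_keys)
        if chord == ARM_CHORD:
            self.armed = not self.armed   # same chord toggles arm/disarm
            return None
        if not self.armed:
            return None                   # normal hand activity: do nothing
        return self.chord_map.get(chord)  # armed: look up the command

controller = ChordController({frozenset({"k1"}): "pinch_grip"})
controller.handle(["k1"])                    # ignored: not armed yet
controller.handle(["alt", "shift", "space"]) # arms command mode
print(controller.handle(["k1"]))             # -> pinch_grip
```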
Another good reason to use this method: what's left of the affected arm varies greatly from individual to individual and drives up costs, whereas the user interface for a glove-based system on the good arm could be standardized and even manufactured cheaply. A standard UI could even have some use in gaming, graphic design, or industrial robotics, where people with prosthetics who have mastered it could be sought after as robotics/machinery operators instead of just being viewed as more limited than other employees.
I think that's the best next step before direct neural control is possible.