
Dexterous robot hand can take a beating in the name of AI research

Designed to withstand trial-and-error abuse from AI learning experiments, the Shadow Hand has also been fitted with dexterous fingers that are sensitive to touch

[Gallery of 4 images. Additional captions: "All-new Shadow Hand meets a previous dexterous hand creation from Shadow Robot"; "The Shadow Hand has been designed with speed, flexibility and precision in mind"; "Each finger module boasts four joints driven by five motors, and can apply 10 newtons of fingertip force"]

A robotics company likely most famous for a demo of its dexterous robot hand at Amazon re:MARS with Jeff Bezos has now unveiled a new robust model designed for machine learning research, which was developed in collaboration with Google's DeepMind.

London-based Shadow Robot has more than 20 years of robot design form behind it, and has had high-profile research and industry clients like NASA, ESA, OpenAI, Google, MIT and a number of universities on its books over the years.

Where previous iterations of "the world's most dexterous humanoid robot hand" have looked quite familiar from a human perspective, the new Shadow Hand – which the company says was developed with research and insights from the Google DeepMind robotics team – sports three fingers only, in gripper-like formation.

"A key challenge in AI and robotics is to develop hardware that is dexterous enough for complex tasks, but also robust enough for robot learning," said the company in a press statement. "Robots learn through trial and error which requires them to safely test things in the real world, sometimes executing motions at the limit of their abilities. This can cause damage to the hardware, and the resulting repairs can be costly and slow down experiments."

So as well as being designed with speed, flexibility and precision in mind, the new robot hand is also built to endure "a significant amount of misuse, including aggressive force demands, abrasion and impacts."

[Video: DEX-EE - A new robust robot hand by Shadow Robot]

It measures 350 mm in length, and is 165 mm wide and 160 mm high (13.78 x 6.5 x 6.3 in). A single finger weighs in at 1.2 kg (2.6 lb), while the whole hand tips the scales at 4.1 kg (9 lb). And it requires a 48-V/200-W power supply.
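
For anyone double-checking the unit conversions above, a few lines of Python reproduce the imperial figures from the metric ones.

MM_PER_INCH = 25.4
LB_PER_KG = 2.20462

for name, mm in {"length": 350, "width": 165, "height": 160}.items():
    print(f"{name}: {mm} mm = {mm / MM_PER_INCH:.2f} in")
# length: 350 mm = 13.78 in, width: 165 mm = 6.50 in, height: 160 mm = 6.30 in

print(f"finger: 1.2 kg = {1.2 * LB_PER_KG:.1f} lb")   # ~2.6 lb
print(f"hand:   4.1 kg = {4.1 * LB_PER_KG:.1f} lb")   # ~9.0 lb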

The robot hand is reported to benefit from precise torque control, with each of the fingers able to muster up to 10 N of fingertip pinch force. The four joints of each finger are driven by motors housed in the base and connected via "tendons," and the fingers can go from fully open to fully closed in 500 milliseconds.
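
To make the torque-control and speed figures concrete, below is a rough sketch of what commanding a single tendon-driven finger might look like. The finger object and its methods are assumptions for illustration only, not Shadow Robot's actual control API.

import time

FINGERTIP_FORCE_LIMIT_N = 10.0   # quoted maximum fingertip pinch force
CLOSE_TIME_S = 0.5               # quoted fully-open to fully-closed time

def close_finger(finger, target_force_n=5.0, timeout_s=CLOSE_TIME_S):
    """Close a finger under torque control until a fingertip force target
    (clamped to the 10 N limit) is reached or the timeout expires.
    `finger` is a hypothetical object exposing a force read and torque commands."""
    target = min(target_force_n, FINGERTIP_FORCE_LIMIT_N)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        measured = finger.fingertip_force()       # hypothetical sensor read, in newtons
        if measured >= target:
            finger.hold_torque()                  # hold the current tendon tension
            return True
        finger.command_torque(target - measured)  # simple proportional correction
        time.sleep(0.001)                         # ~1 kHz control loop
    return False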

Each finger is a self-contained unit, incorporating a number of 3-DOF tactile sensors at the proximal and middle segments, along with a stereo camera setup pointed at the inside surface of the silicone skin covering the fingertip to provide high-resolution, wide-dynamic-range tactile feedback in real time. Together, these help the robot get to grips with the world around it "through the sense of touch."
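
As a rough illustration of how those two sensing streams might be combined on the host side, the sketch below assumes the 3-DOF sensors report force vectors and the camera-based fingertip sensor yields a skin-deformation image; the data layout and thresholds are assumptions, not Shadow Robot's documented format.

import math

def contact_detected(tactile_vectors, fingertip_depth_image,
                     force_threshold_n=0.2, deformation_threshold=0.01):
    """Return True if any 3-DOF sensor reports force above the threshold,
    or the fingertip skin image shows measurable average deformation."""
    # Proximal/middle segment sensors: each reading is an (fx, fy, fz) vector
    for fx, fy, fz in tactile_vectors:
        if math.sqrt(fx * fx + fy * fy + fz * fz) > force_threshold_n:
            return True
    # Camera-based fingertip sensor: treat the image as rows of displacement values
    displacements = [abs(d) for row in fingertip_depth_image for d in row]
    mean_deformation = sum(displacements) / len(displacements) if displacements else 0.0
    return mean_deformation > deformation_threshold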

[Video: DEX-EE - Tactile sensors on a new robot hand by Shadow Robot]

If one of the finger modules suffers fatal damage during limit-pushing AI experiments, it can be removed from the base module (which connects to a robot arm) and replaced with a fresh one for minimum downtime. The tactile sensors can also be removed/replaced if needed, with the communication network within the finger able to register the presence or absence of a sensor and feed relevant information to a host computer automatically.
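
A host-side status check in that spirit might look something like the sketch below; the dictionary layout and function are hypothetical illustrations of the presence/absence reporting described above, not a documented interface.

def report_hardware_status(status_by_finger):
    """Print which finger modules are attached and which tactile sensors they report.

    `status_by_finger` maps a finger name to the list of sensor IDs that the
    finger's internal communication network has registered, or None if the
    finger module itself is absent.
    """
    for finger, sensors in status_by_finger.items():
        if sensors is None:
            print(f"{finger}: module not detected (replace or reseat)")
        elif not sensors:
            print(f"{finger}: attached, no tactile sensors reporting")
        else:
            print(f"{finger}: attached, sensors online: {', '.join(sensors)}")

# Example: one finger missing a sensor, one finger swapped out entirely
report_hardware_status({
    "finger_1": ["proximal_0", "middle_0"],
    "finger_2": ["middle_0"],
    "finger_3": None,
})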

We don't have pricing for the Shadow Hand, but the company will demonstrate it to the public for the first time at ICRA 2024, which is due to open its doors next week in Yokohama, Japan.

Source: Shadow Robot

3 comments
Karmudjun
Thanks Paul, this actually looks like a cool thing. Maybe some researcher who watches old vaudeville acts like W.C. Fields' black-and-white shorts will recreate his famous "cigar box" juggle with the robot arms and hands. Now that would highlight dexterity and touch even if a human were controlling the sequence of moves!
Treon Verdery
An enhancement to a robot hand that learns through varied motion, pressure and contact angle would be to make an imaging ultrasound transducer part of the robot hand, palm, or fingers. Ultrasound scans the depth deformation, abrasion, or even disintegration of the thing being touched by the robot hand. This makes it possible for the robot hand to combine learning with highly useful delicacy when interacting with a touched, gripped, or moved thing. Notably, ultrasound is only about 1% transmissive through open air, but a 49-watt ultrasound transducer is able to produce slightly less than 1/2 watt of scanning and imaging energy. Also, if the ultrasound emission occurs as a pulse, just once each 90th of a second, at a duration of 1/900th of a second, the peak pulse ultrasound imaging energy could be 400 watts, with 4 imaging watts actually traversing the air, but with an average power draw of only 4 emitted continuous watts. That heightens energy efficiency and coolness.
Another, even cheaper, thing to put on robot hands, robot fingers, and robot body surfaces is a pair of 7¢ laser diodes and a 79¢ 2D/3D image sensor. Interferometry is used to characterize and image the object being touched, in terms of the locations and amounts of bending, stress, dynamic vibration, stress peak areas, and 3D shape change. The robot hand's controlling software can then learn not only effective contact, grip, and motion styles and energies, but also how to touch or grip things in a way that highly minimizes the interferometry-characterized bending or other stress. That would let a robot hand, robot fingers, or other robot part touching a human touch very, very softly, delicately, and enjoyably.
Treon Verdery
On robot hands, fingers, and other robot parts, and in manufacturing technologies, vibration sensors or minute MEMS accelerometers that characterize vibration and other 3D movement could be combined with software directing multi-axis piezoelectric motion/vibration elements to create a nodal concatenation that keeps the vibration from producing antinodal effects. That is, the robot has less stochastic vibration. Less stochastic vibration enables higher volumetric pixel resolution and precision, or heightens volumetric vector/tensor space resolution and precision. This technology idea is placed in the public domain now, and all of the technology ideas I have written or posted anywhere on the internet, including newatlas dot com, since 1999 are all placed in the public domain.