
ShadowSense tech tracks shadows to give robots a sense of touch

The ShadowSense prototype "sees" the shadow of a user's hand (right) as they touch the robot's skin

There are currently a number of groups developing touch-sensitive electronic skin for robots. Scientists at Cornell University are pursuing a simpler approach, however, using shadow-imaging cameras to let robots know when they're being touched.

Known as ShadowSense, the experimental system incorporates an ordinary USB camera, connected to a laptop and located beneath a non-electronic translucent "skin" on a soft-bodied robot.

As a person reaches toward the robot, the ambient lighting casts a shadow of their hand onto the skin. The camera tracks that shadow from the other side of the skin (within the robot), utilizing machine learning-based algorithms to determine when the hand is actually touching the skin, which area of the skin it's touching, and what gesture it's making. In this way, not only can ShadowSense tell when and where the robot is being touched, but it can also assign different commands to different touch gestures.
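To make the general idea concrete, here is a minimal, hypothetical Python/OpenCV sketch of that kind of pipeline: segment the dark shadow blob as seen from behind the skin, summarize it with a few simple features, and hand those to a decision step. The camera index, threshold, feature set, and the placeholder "touching vs. hovering" rule are illustrative assumptions, not details from the Cornell paper, which uses learned classifiers to distinguish the actual gesture set.

# Hypothetical sketch of shadow-based touch sensing (not the authors' code).
# Assumes an ordinary USB camera viewing the translucent skin from inside,
# and OpenCV 4.x for image processing.
import cv2
import numpy as np

def shadow_features(frame):
    """Segment the dark shadow region and summarize it with a few numbers."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (11, 11), 0)
    # The shadow is darker than the back-lit skin, so use an inverted threshold.
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)   # largest dark blob = hand shadow
    area = cv2.contourArea(hand)
    x, y, w, h = cv2.boundingRect(hand)
    m = cv2.moments(hand)
    cx = m["m10"] / (m["m00"] + 1e-6)           # shadow centroid (where on the skin)
    cy = m["m01"] / (m["m00"] + 1e-6)
    return np.array([area, w / (h + 1e-6), cx, cy])

def main():
    cap = cv2.VideoCapture(0)                   # camera inside the robot body
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        feats = shadow_features(frame)
        if feats is None:
            print("no touch")
        else:
            # Placeholder rule: a large, sharp shadow suggests contact. The real
            # system instead feeds the image to a machine-learning classifier
            # that outputs palm, punch, two hands, hug, point, or no touch.
            label = "touching" if feats[0] > 5000 else "hovering"
            print(label, "at centroid", feats[2:])
        cv2.imshow("skin view", frame)
        if cv2.waitKey(30) & 0xFF == 27:        # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()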

The current prototype robot – which is mainly just an inflatable bladder of nylon skin stretched around a cylindrical wheeled skeleton – is capable of differentiating between touching with a palm, punching, touching with two hands, hugging, pointing and not touching at all. It's able to do so with an accuracy of 87.5 to 96 percent, depending on the strength and directionality of the lighting.

The researchers are quick to point out that the applications of the technology aren't limited to robotics, as it could also be used in touchscreen displays or electronic appliances. That said, ShadowSense still has some limitations – not only is a light source required, but the camera also has to have a clear line of sight to the interactive part of the skin. The use of mirrors or additional lenses could help address the latter.

"Touch is such an important mode of communication for most organisms, but it has been virtually absent from human-robot interaction," says the lead scientist, Assoc. Prof. Guy Hoffman. "One of the reasons is that full-body touch used to require a massive number of sensors, and was therefore not practical to implement. This research offers a low-cost alternative."

ShadowSense is demonstrated in the video below, and is described in a paper that was recently published in the journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.

Source: Cornell University

Researchers give soft robots a human touch

1 comment
paul314
If you use the right kind of translucent surface, can you tell where it's being touched by darkening when it's front-lit?