They may not make for the showiest videos, but some of the most interesting problems in robotics have to do with the subtleties of human interaction. Even something as apparently simple as receiving an object poses real difficulty, but it's a problem that will need to be solved before multipurpose robots are ready for the home. By building a database of captured human motion, Disney Research and the Karlsruhe Institute of Technology are making strides toward a robot that can take an object handed to it by a human.
Though one approach to the problem would be to use the same sort of vision techniques employed by robots that can identify and grasp objects, a robot that snatches objects from a human in an unnatural way may be disconcerting, discouraging people from accepting robots into their homes or workplaces. The goal of this research was to develop a robot that accepts objects much as a person would.
To do this, the researchers decided to arm the robot with foreknowledge of how humans give and receive objects. By motion-capturing (you know, leotards and ping pong balls) people passing objects to each other, they were able to build a database the robot can refer to, identifying the sort of hand-off coming its way and matching it with the most suitable response. To be both fast and adaptive, the researchers structured the database as a tree, clustering similar motions together so that the robot can home in on a localized branch of the tree, find the appropriate match quickly, and adapt on the fly.
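The paper's actual database layout, distance metric, and adaptation scheme aren't spelled out here, but the basic idea of clustering similar motions into a tree and descending to the nearest branch can be sketched roughly as below. Everything in the sketch (the `MotionNode` class, `build_tree`, `match`, the k-means split, and the assumption that all captured trajectories are resampled to the same length) is illustrative, not the researchers' implementation.

```python
import numpy as np


class MotionNode:
    """One node in a motion-database tree: a cluster of similar hand-off
    trajectories, summarized by the centroid of their feature vectors."""

    def __init__(self, motions):
        self.motions = motions  # list of (T, D) arrays, all resampled to length T
        self.centroid = np.mean([m.flatten() for m in motions], axis=0)
        self.children = []


def build_tree(motions, branching=2, min_cluster=4):
    """Recursively split the motion set into `branching` clusters (a naive
    k-means here) so that similar hand-offs end up on the same branch."""
    node = MotionNode(motions)
    if len(motions) <= min_cluster:
        return node

    feats = np.stack([m.flatten() for m in motions])
    rng = np.random.default_rng(0)
    centers = feats[rng.choice(len(feats), branching, replace=False)]
    for _ in range(10):  # a few k-means iterations are enough for a sketch
        labels = np.argmin(
            np.linalg.norm(feats[:, None] - centers[None], axis=2), axis=1)
        for k in range(branching):
            if np.any(labels == k):
                centers[k] = feats[labels == k].mean(axis=0)

    for k in range(branching):
        members = [m for m, lab in zip(motions, labels) if lab == k]
        if 0 < len(members) < len(motions):  # avoid degenerate recursion
            node.children.append(build_tree(members, branching, min_cluster))
    return node


def match(node, observed):
    """Descend the tree toward the cluster whose centroid best matches the
    partially observed giver motion, then return the closest stored example.
    `observed` is the first few frames of the giver's trajectory, so only the
    matching prefix of each stored motion is compared."""
    obs = observed.flatten()
    while node.children:
        node = min(node.children,
                   key=lambda c: np.linalg.norm(c.centroid[:obs.size] - obs))
    return min(node.motions,
               key=lambda m: np.linalg.norm(m.flatten()[:obs.size] - obs))
```

Because the lookup only needs a prefix of the giver's motion and touches one branch of the tree rather than the whole database, a query like this can be repeated every few frames, refining the predicted hand-off as more of the motion is observed.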
This tree-like structure allows the robot to begin its movement while the person is still handing off the object, a natural, familiar approach to taking something from someone. The robot is also able to put its hand in roughly the right place to receive the object.
As you can see from the video below, there's still work to do, as the giver still needs to place the object in the robot's static hand. Building in finger motions may help with this final part of the exchange. The researchers are also expanding the database to allow the robot to adapt to a greater variety of hand-offs.
The research was carried out by Katsu Yamane, Marcel Revfi, and Tamim Asfour, and written up in their paper, _Synthesizing Object Receiving Motions of Humanoid Robots with Human Motion Database_, which was nominated for a Best Cognitive Robotics Paper award at the IEEE International Conference on Robotics and Automation in Karlsruhe.
Source: Disney Research