Developing true robot surrogates that allow you to be in two places at once means duplicating all of our movements and senses in machine form. Given you can now make a video call on your phone, it's fair to say we have the sight and sound aspects pretty well covered, but the challenge of adding touch to the equation is formidable. The TELESAR V Robot Avatar shows just how far we've come in turning telepresence into telexistence - it's a humanoid remotely controlled robot that boasts a wide range of movement along with the ability to transmit sight, hearing and touch sensations to its operator via a set of sensors and a 3D head-mounted display.
Telexistence as a concept has been advocated by Japanese professor Susumu Tachi for more than two decades. Its aim is to allow people to interact with a distant, remote environment in real time, in the most realistic way possible.
Professor Tachi led the research team from Keio University in Tokyo that developed TELESAR V. The operator of the robot wears a helmet equipped with a 3D display covering the robot's entire field of view and a pair of headphones, transmitting exactly what the robot can see and hear.
TELESAR V is also equipped with a system for capturing and sending touch sensations. A set of sensors built into the robot's hands records force vectors and temperature data, allowing the operator to feel the temperature and shape of an object. The current system is accurate enough to recognize surface features as small as the bumps on a LEGO brick.
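The article doesn't detail how TELESAR V's data path actually works, but the basic idea of relaying fingertip force vectors and temperature readings to an operator's haptic glove can be sketched roughly as follows. All names, units and thresholds here are illustrative assumptions, not taken from the real system:

```python
from dataclasses import dataclass

@dataclass
class TactileSample:
    """One fingertip reading: a 3-D force vector (N) and surface temperature (C)."""
    force: tuple
    temp_c: float

def to_haptic_command(sample: TactileSample, max_force: float = 10.0) -> dict:
    """Map a raw sensor reading to normalized actuator intensities.

    Vibration intensity scales with the magnitude of the sensed force,
    clamped to [0, 1]; the glove's thermal element is simply driven
    toward the sensed temperature. Both mappings are hypothetical.
    """
    fx, fy, fz = sample.force
    magnitude = (fx**2 + fy**2 + fz**2) ** 0.5
    vibration = min(magnitude / max_force, 1.0)
    return {"vibration": vibration, "thermal_target_c": sample.temp_c}

# A light touch on a warm surface, e.g. brushing a LEGO stud:
cmd = to_haptic_command(TactileSample(force=(0.1, 0.0, 0.8), temp_c=31.5))
print(cmd)
```

In a real telexistence loop this mapping would run continuously, per fingertip, with the commands streamed to the glove at a high enough rate that the operator perceives texture rather than discrete pulses.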
"The robot's hands can't move as freely as a person's, but they do come very close, with 15 degrees of freedom," explains Masahiro Furukawa, Assistant Professor at Tachi Laboratory.
Studying Na'vi and other alien tribes aside, one promising application for robotic avatars is telemedicine, where a doctor uses a remotely controlled device to interact with patients at distant locations. The technology can already be seen in action in sophisticated remote surgical systems like the da Vinci.
Source: DigInfo TV
Take a look at the following video presenting the technology: