New iCub 3 biped robot used as a long-distance avatar for its operator
We first heard about the child-like iCub humanoid robot back in 2011, when it was nominated to take part in the Olympic Torch Relay. The latest and greatest version, the iCub 3, was the star of a recently announced (and pretty impressive) telepresence demonstration.
Developed at the Istituto Italiano di Tecnologia (Italian Institute of Technology, or IIT), the biped iCub 3 features a total of 53 actuated degrees of freedom – seven in each arm, nine in each hand, six in the head, three in the torso/waist and six in each leg. Its head incorporates swivelling stereo cameras which serve as eyes, dual microphones for ears, and animated lines of LEDs which represent its mouth and eyebrows within its face panel.
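The per-limb figures above do add up to the stated total, which a quick tally confirms (counts come straight from the article; paired limbs count twice):

```python
# Tally of the iCub 3's actuated degrees of freedom, per the figures above.
dof = {
    "arms": 7 * 2,        # seven per arm
    "hands": 9 * 2,       # nine per hand
    "head": 6,
    "torso_waist": 3,
    "legs": 6 * 2,        # six per leg
}
total_dof = sum(dof.values())
print(total_dof)  # 53
```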
Tactile sensors within each of its fingertips additionally give the robot a sense of touch. It's 25 cm taller and 19 kg heavier than previous models, topping out at 1.25 m (4 ft, 1 in) and tipping the scales at 52 kg (115 lb). Other improvements include a higher-capacity battery (located in the torso as opposed to on the back); more powerful leg motors for faster walking speeds; better, more human-like balance and locomotion; plus a new depth-sensing camera.
For the demo – which took place on Nov. 8th, 2021 – the robot was located at the 17th International Architecture Exhibition's Italian Pavilion in Venice, while its operator was 300 km (186 mi) away at an IIT lab in the city of Genova. A standard fiber optic connection was used to link the two.
The operator used a suite of wearable devices known as the iFeel system. These include multiple IMUs (inertial measurement units) placed at various locations on a body suit; "gloves" that both track the user's finger movements and relay tactile sensations from the robot's finger pads; and a VR headset, which tracks the user's facial expressions, eyelids and eye movements, picks up their voice, and allows them to see what the robot is seeing and hear what it's hearing.
Additionally, the operator stood in a VR rig that allowed them to walk on the spot.
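The two-way data flow described above — operator motion streamed out to the robot, with tactile, visual and audio feedback streamed back — can be sketched as a pair of message types. Everything here is a hypothetical illustration: all class and field names are assumptions, not the actual iFeel or iCub interfaces, which aren't detailed in this article.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical message types illustrating the bidirectional telepresence
# link described above. Field names are illustrative assumptions only.

@dataclass
class OperatorState:
    """Operator -> avatar: streamed from the iFeel wearables."""
    imu_orientations: List[List[float]]  # one quaternion per body-suit IMU
    finger_joint_angles: List[float]     # from the tracking gloves
    gaze_direction: List[float]          # from the headset's eye tracking
    facial_expression: str               # mapped to the robot's LED face
    walking_velocity: List[float]        # from the walk-on-the-spot VR rig

@dataclass
class RobotFeedback:
    """Avatar -> operator: relayed back to the wearables."""
    fingertip_pressures: List[float]     # tactile sensors -> glove haptics
    torso_contact_force: float           # e.g. the hug, felt via suit haptics
    audio_chunk: bytes                   # microphone audio for the headset

# One round trip: the operator moves, the robot reports a touch.
cmd = OperatorState(
    imu_orientations=[[1.0, 0.0, 0.0, 0.0]],
    finger_joint_angles=[0.2] * 9,       # nine actuated DOF per hand
    gaze_direction=[0.0, 0.0, 1.0],
    facial_expression="smile",
    walking_velocity=[0.3, 0.0],
)
fb = RobotFeedback(
    fingertip_pressures=[0.0] * 5,
    torso_contact_force=12.5,
    audio_chunk=b"",
)
print(len(cmd.finger_joint_angles), fb.torso_contact_force)  # 9 12.5
```

The split into two message types mirrors the article's description: motion and expression data flow one way, while touch and audio flow back so the operator can feel, for example, the hug through the suit's haptic units.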
This combination of technologies let the robot serve as an avatar for the operator, as it walked around the pavilion, had a conversation with a human tour guide, shook their hand, and even hugged them – for the latter, haptic feedback units in the torso of the body suit let the operator feel that hug. Everything occurred in real time, with a lag of only a few milliseconds.
"We believe that this research direction has a tremendous potential in many fields," said the project's coordinator, IIT's Daniele Pucci. "On the one hand, the recent pandemic taught us that advanced telepresence systems might become necessary very quickly across different fields, like healthcare and logistics. On the other hand, avatars may allow people with severe physical disabilities to work and accomplish tasks in the real world via the robotic body. This may be an evolution of rehabilitation and prosthetics technologies."
Highlights from the iCub 3 demonstration can be seen in the following video.