Robotics

EVA robot identifies and copies people's facial expressions

As part of its learning process, EVA runs through a variety of facial movements while filmed by a camera

If you smiled at someone and they didn't smile back, you'd probably find it off-putting. Well, that's what usually happens if you smile at a humanoid robot … but not in the case of the expression-mirroring EVA.

Developed by a team of engineering researchers at New York City's Columbia University, EVA is in fact a humanoid robotic head. It's designed to explore the dynamics of human/robot interactions, and consists of a 3D-printed adult-human-sized synthetic skull with a soft rubber face on the front.

Motors within the skull selectively tug and release cables attached to various locations on the underside of the face, in the same way that muscles beneath the skin of our own faces allow us to switch between different facial expressions. For its part, EVA can express emotions such as anger, disgust, fear, joy, sadness and surprise, plus "an array of more nuanced emotions."

In order to develop its mirroring capabilities, the scientists started by filming EVA as it randomly moved its face. When the computer that controls the robot subsequently analyzed the hours of footage, it utilized an integrated neural network to learn which combinations of "muscle movement" resulted in which facial expressions.
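The "motor babbling" step described above can be sketched in a few lines. The snippet below is a toy illustration, not the researchers' actual system: a made-up linear function stands in for the camera observing EVA's rubber face, and a nearest-neighbour lookup stands in for the neural network that learns which muscle commands produce which expression.

```python
import random
import math

# Hypothetical stand-in for EVA's face: maps motor (cable) commands to
# observed facial-landmark positions. In the real system a camera films
# the face; here a toy linear function plays that role.
def observe_expression(motor_cmd):
    return [0.8 * motor_cmd[0] + 0.2 * motor_cmd[1],
            0.3 * motor_cmd[0] + 0.7 * motor_cmd[1]]

# "Motor babbling": issue random muscle commands and record which
# expression each one produces, as EVA did while being filmed.
random.seed(0)
memory = []
for _ in range(500):
    cmd = [random.random(), random.random()]
    memory.append((observe_expression(cmd), cmd))

# Inverse model: given a target expression, recall the motor command
# whose recorded outcome is closest (a nearest-neighbour lookup stands
# in for the trained neural network).
def command_for(target):
    return min(memory, key=lambda pair: math.dist(pair[0], target))[1]

target = observe_expression([0.6, 0.4])  # an expression we want to reproduce
cmd = command_for(target)                # muscle command that approximates it
```

The key idea this sketch preserves is that the mapping is learned purely from the robot's own random movements, with no hand-labelled expressions.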

When a connected camera subsequently imaged the face of a person interacting with the robot, a second neural network was utilized to identify that individual's present expression, and visually match it to one that the robot was capable of making. The robot then proceeded to take on that expression by moving its artificial facial muscles accordingly.
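The matching step can be illustrated the same way. In this hedged sketch, a small dictionary of made-up landmark vectors stands in for the expressions EVA can make, and a nearest-neighbour comparison stands in for the second neural network that maps a person's observed expression onto the robot's repertoire.

```python
import math

# Hypothetical canonical expressions the robot can make, as toy 2-D
# landmark vectors (real systems would use many facial features).
repertoire = {
    "joy":      [0.9, 0.8],
    "sadness":  [0.2, 0.1],
    "surprise": [0.5, 0.95],
    "anger":    [0.1, 0.6],
}

# Map the person's observed expression to the nearest expression the
# robot is capable of making, then the robot would enact that one.
def match_expression(observed):
    return min(repertoire,
               key=lambda name: math.dist(repertoire[name], observed))

print(match_expression([0.85, 0.75]))  # a near-smile maps to "joy"
```

Because the match is made in expression space rather than pixel space, the robot only ever attempts expressions it has already learned to produce.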

EVA can express six basic human emotions, plus it's capable of subtler expressions

Although the engineers admit that simple mimicry of human expressions may have limited applications, they believe that it could nonetheless help advance the manner in which we interact with assistive technologies.

"There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers," says the project leader, Prof. Hod Lipson. "Our brains seem to respond well to robots that have some kind of recognizable physical presence."

You can see EVA in face-mirroring action in the following video.

Sources: Columbia Engineering, EVA project

Animatronic Robotic Face Driven with Learned Models

2 comments
Ralf Biernacki
Yes! I want my robot to have a face like this. Totally relatable. These guys are on the right track.
Taimaa alawar
I have already tried this out.
It's really fun, and I love these kinds of robots, as they involve a lot of detail and exact science. Even though designing a robot is hard because of the difficulty of sourcing materials, it's a magnificent challenge.
Thank you for making this open source; you've given me a golden opportunity to discover a beautiful world I've wanted to explore for a long time.
Very grateful to you.
I really love robots.