Robotics

Learning robot puts on a happy face

The Einstein robot head performs some random facial movements as part of the learning process

Robots generally aren’t the most expressive of entities, but their faces are becoming increasingly realistic as the number of artificial muscles controlling them rises. Today, a highly trained person must manually set up these kinds of realistic robots so that the servos pull in the right combination to make specific facial expressions, but researchers at the University of California, San Diego (UCSD) are looking to automate the process by giving robots the ability to learn realistic facial expressions.

The UCSD researchers looked to both developmental psychology and machine learning as the basis for their research. Developmental psychologists speculate that infants learn to control their bodies through systematic exploratory movements, including babbling to learn to speak. Initially, these movements appear to be executed in a random manner as infants learn to control their bodies and reach for objects.

“We applied this same idea to the problem of a robot learning to make realistic facial expressions,” said Javier Movellan, the senior author on the team’s research paper.

To begin the learning process, the UCSD researchers directed an Einstein robot head (Hanson Robotics’ Einstein Head) to twist and turn its face in all directions, a process called “body babbling.” During this period the robot could see itself in a mirror and analyze its own expressions using facial expression detection software created at UCSD called CERT (Computer Expression Recognition Toolbox).
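The article doesn’t include the team’s code, but the babbling loop it describes is easy to picture. Below is a minimal Python sketch, assuming hypothetical set_servo_positions, capture_frame, and detect_expression functions as stand-ins for the robot’s motor API and CERT; the servo count and sample budget are purely illustrative.

```python
import numpy as np

NUM_SERVOS = 30     # illustrative; the real head is driven by dozens of servos
NUM_SAMPLES = 5000  # illustrative exploration budget

def babble(set_servo_positions, capture_frame, detect_expression):
    """Drive the face through random movements and log what each one looks like."""
    servo_log, expression_log = [], []
    for _ in range(NUM_SAMPLES):
        # Random exploratory movement, analogous to an infant's motor babbling.
        command = np.random.uniform(0.0, 1.0, size=NUM_SERVOS)
        set_servo_positions(command)
        frame = capture_frame()                # the robot watching itself in the mirror
        expression = detect_expression(frame)  # e.g. per-expression intensities from a CERT-like detector
        servo_log.append(command)
        expression_log.append(expression)
    return np.array(servo_log), np.array(expression_log)
```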

This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors. Once the robot had learned the relationship between facial expressions and the muscle movements required to make them, it could produce facial expressions it had never encountered during babbling.
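As a rough illustration of that mapping step, the sketch below fits a simple linear least-squares model from the logged expression features to the servo commands that produced them, then uses it to go from a desired expression straight to motor commands. The linear model is an assumption made for brevity; the team’s actual learning algorithm may well be more sophisticated.

```python
import numpy as np

def fit_expression_to_servo_map(expressions, servos):
    """Least-squares fit from expression features to the servo commands that made them."""
    # Append a bias column so the model can learn a resting pose.
    X = np.hstack([expressions, np.ones((len(expressions), 1))])
    W, *_ = np.linalg.lstsq(X, servos, rcond=None)
    return W

def servos_for_expression(W, target_expression):
    """Map a target expression (possibly never seen during babbling) to servo positions."""
    x = np.append(np.asarray(target_expression), 1.0)
    return np.clip(x @ W, 0.0, 1.0)  # keep commands inside the servo range
```

Because a model like this covers the whole expression space rather than a fixed set of stored poses, it can propose servo combinations for target expressions that never appeared in the babbling data, which is the generalization the article describes.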

After one of the servos burned out, the team decided to continue the experiment anyway, and the model was even able to learn to compensate for the dead servo by activating a combination of nearby servos.

While the primary goal of the research was to solve the engineering problem of approximating the appearance of human facial muscle movements with motors, the researchers say this kind of work could also lead to insights into how humans learn and develop facial expressions.

The UCSD team’s research is detailed in the paper “Learning to Make Facial Expressions,” which was presented at the 2009 IEEE International Conference on Development and Learning.

You can check out the Einstein robot head in action in the video below, but it looks like it’ll be a while before it can stick its tongue out like Einstein in the famous photo.

2 comments
Slashpot
Apparently we will soon have robots way smarter than us. Yeah! U.S.A! U.S.A! I can't see a single problem inherent with that idea... Can you?
As usual, we see only the limits of what we CAN do, never giving a thought to whether or not we SHOULD.
ralph.dratman
Don't worry, the Terminator will travel back in time and save us from this Einstein bot.