
Researchers create robot that mimics human emotions

ERWIN can display five distinct emotions whilst interacting with humans, via the manipulation of its mouth and eyebrows

Scientists from the School of Computer Science, University of Lincoln, UK are using expressive robot ERWIN (Emotional Robot with Intelligent Network) to study how long-term relationships may form between robots and humans. In its current form, the robot has the ability to display five distinct emotions whilst interacting with humans via the manipulation of its mouth and eyebrows.

A key barrier to more realistic relationships between humans and robots is the lack of personality in the robot design. Once a user has obtained all the information they desire, all conversational avenues are exhausted and the relationship hits a dead end. Beyond the exchange of information, there is no way to communicate with the machine.

This creates an inherent barrier to establishing any kind of lasting relationship, as humans have personality traits known as cognitive biases. These are little flaws in the way that we think, which create the characteristics and differences between individuals. It's our cognitive biases that, in the right circumstances, draw us together and allow us to form a relationship.

It is important to be able to create such a relationship between man and machine, as this type of "user-friendly" technology has a wide variety of applications. "Robots are increasingly being used in different fields, such as rescuing people from debris, in medical surgeries, elderly support and as an aid for people who have autism," explained University of Lincoln PhD student Mriganka Biswas.

For some of these applications, a decent level of empathy is required to make the robot better equipped to assess the mood and requirements of its ward, in addition to being friendly and appealing enough to create a relationship between the machine and the user.

Dr. John Murray is attempting to achieve this with ERWIN.

Its aesthetic, somewhat reminiscent of a Sesame Street character, is entirely intentional and designed to make ERWIN more appealing to children, with a focus on those with autism and other developmental disorders. The robot is currently the subject of a PhD study on how human-like thought biases can affect a relationship between a robot and its user.

The results from ERWIN will be compared with those from another robot, Keepon, which is humanoid in appearance, but lacks the ability to convey emotions. Dr. Murray and his team plan on improving ERWIN with further characteristics and personalities as the study continues.

While it's clear ERWIN won't be leading robot armies into battle, it is nevertheless yet another baffling step towards giving robots emotions. The robot can be seen "expressing" itself in the following video.

Source: University of Lincoln

5 comments
ivan4
Number 5 is alive!
BeWalt
*sigh* another one of these "Arduino for beginners" setups.
If I kill a person, then take the fresh corpse and hook up some facial nerves in such a way that the face of the deceased looks sad or happy, can I then claim (from my jail cell) to have given back emotions to the dead? I don't think so.
Showing emotions would be if the robot 1) came up, on its own, with the correct set of facial flags (which btw have been known by cartoonists for centuries) after, 2) being exposed to a new and un-programmed-for situation, like somebody threatening to cut its cord.
Sketching lines in other ways than cartoonists are doing it with a pen, that just doesn't cut it, sorry. Very nice high-school afternoon project tho.
Paul van Dinther
These days they call every clown that can follow an Instructables project a "researcher." This has no business being on here.
Stephen Connell
@BeWalt and Paul van Dinther - unhappy about the article? Write one yourself and let us see your "brilliant skills" or clever insights! It's a light article, but it just might pique the interest of a future researcher destined to make big breakthroughs.
FrankR
It really fails completely, doesn't it? If the "mouth" was about 33% smaller, and had a fixed screen behind it of a contrasting colour, it might get a little closer to projecting a suggestion of emotions. The "eyes" work pretty well though. Clever idea, but needs heaps more thought and work on the "mouth" in order to be usable.