
Robotic infants reveal posture may play an important role in learning

A robot being taught to distinguish between two objects (Photo: University of Plymouth)
Linda Smith, leader of the study (Photo: Indiana University)

It had to happen sooner or later; robots have replaced infants... at least, as subjects in psychological research being conducted by a team at the Indiana University (IU) Bloomington College of Arts and Sciences. The robots are being used to study how infants learn and have revealed that posture and body position are important factors in early learning.

Humans learn by association. In other words, we learn new things by linking them to things we already know, or to the context in which they are learned. It's one of the reasons we're able to remember so much, recall information so quickly, and recognize things even when they've been altered. It's also the reason why picturing a face in your head is so difficult. Rather than remembering things the way a camera records a photograph, we build up a picture from a web of associations.
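To make the idea concrete, here's a small, purely illustrative Python sketch (our own toy example, not code from the study): recognizing by association amounts to matching overlapping features rather than demanding an exact, photograph-like copy, which is why a familiar object is still recognized after it has been altered.

def overlap(known, observed):
    """Fraction of all features shared by two observations (Jaccard similarity)."""
    return len(known & observed) / len(known | observed)

# Features we've come to associate with a familiar bicycle.
known_bicycle = {"two wheels", "red frame", "drop handlebars", "basket"}

# The same bicycle seen later, altered: the basket is gone and lights were added.
observed = {"two wheels", "red frame", "drop handlebars", "lights"}

# A photograph-style, exact comparison fails outright...
print(observed == known_bicycle)          # False

# ...but matching by shared associations still recognizes it comfortably.
print(overlap(known_bicycle, observed))   # 0.6, easily close enough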

What the IU team led by Linda Smith, a professor in the Department of Psychological and Brain Sciences, discovered is that for infants, this association between words and objects also takes in physical cues, such as an object's position in space and the posture of the body.

A key factor in understanding this connection was a series of experiments conducted using epigenetic robots, that is, humanoid robot "infants" designed to move and act like real babies. These weren't used because there's a shortage of human infants, but rather because the robots act as simplified models in which various factors can be controlled. Their learning behavior can then be compared with that of human infants as a way of developing new insights. Also, robot infants don't get bored after five minutes or need their nappies changed quite so often.

According to Smith, one surprising discovery was that learning wasn't tied just to where an object was, but also to how the body was positioned during learning. The experiments included showing a robot an object on its left, then showing it a different object on its right. After this was repeated several times, an association formed in the robot's memory. The objects were then removed and the robot was directed to move into the same posture it had used during the learning exercise. It was then shown the two objects again in various places while their names were spoken.
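For the technically curious, the cue-binding at work can be sketched in a few lines of Python. This is an assumption-laden toy of our own, not the team's software or the robot's actual controller, and the object, name and posture labels are invented; it simply shows how a shared posture cue can bridge a spoken name to an object that is no longer in view.

from collections import defaultdict

# Simple associative memory: co-occurrence counts between pairs of cues
# (object and posture, or spoken name and posture).
links = defaultdict(lambda: defaultdict(int))

def associate(a, b):
    """Strengthen a symmetric link between two cues that occur together."""
    links[a][b] += 1
    links[b][a] += 1

def spread(cue, steps=2):
    """Let activation spread outward from a cue through the links."""
    scores = defaultdict(float)
    frontier = {cue: 1.0}
    for _ in range(steps):
        nxt = defaultdict(float)
        for c, weight in frontier.items():
            for other, count in links[c].items():
                scores[other] += weight * count
                nxt[other] += weight * count
        frontier = nxt
    return scores

# Training: each object is repeatedly seen while the robot holds one posture.
for _ in range(6):
    associate("object-A", "posture-left")
    associate("object-B", "posture-right")

# The objects are removed and each name is spoken while the robot holds the
# matching posture, so the name is bound to the posture rather than the object.
associate("name-A", "posture-left")
associate("name-B", "posture-right")

# Test: hearing a name activates the posture, which in turn activates the
# object that was always seen from that posture.
scores = spread("name-A")
print(max((c for c in scores if c.startswith("object")), key=scores.get))  # object-A

Without a consistent posture during training there is no shared cue for the name to bridge on, which mirrors the failure case described below.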

According to the team, the robot turned and reached for the objects 20 times, indicating that it had correctly associated their names. However, when the robot was shown the objects in a way that wasn't associated with a specific posture, it did not form an association. When the experiment was repeated with human infants between the ages of 12 and 18 months, the results were very similar.

Smith says that later research will aim to determine whether this association between posture and learning applies only to infants or extends to adults as well, and that the study's results could provide a better understanding of cognitive and developmental disorders.

"This study shows that the body plays a role in early object name learning, and how toddlers use the body's position in space to connect ideas," says Smith. "The creation of a robot model for infant learning has far-reaching implications for how the brains of young people work."

The team's results were published in PLOS ONE.

The video below shows the infant robot learning experiment.

Source: Indiana University Bloomington

Video: Learning the names of objects
