
Robotic leg leans on animal evolution to teach itself to walk

[Image: Researchers have developed a robotic limb actuated by animal-like tendons and controlled by purpose-built AI algorithms]
[Image: The new robotic limb attached to a four-legged machine]

In news that will do little to dampen fears of a robot apocalypse, researchers at the University of Southern California (USC) have built a robo-limb that can teach itself to walk. Inspired by humans, and by animals that have evolved to learn the skill within minutes of birth, it's hoped the research will open up new possibilities in the fields of dynamic prosthetics and robots that learn on the fly in unfamiliar environments.

"Nowadays, it takes the equivalent of months or years of training for a robot to be ready to interact with the world, but we want to achieve the quick learning and adaptations seen in nature," says Francisco J. Valero-Cuevas, a professor of Biomedical Engineering.

In pursuit of this aim, Valero-Cuevas and his colleagues developed a robotic leg actuated by animal-like tendons and controlled by bio-inspired AI algorithms. These enable the robot to develop the skill to walk in a similar way to humans through what is known in robotics circles as "motor babbling," or conducting repeated exploratory movements.

"These random movements of the leg allow the robot to build an internal map of its limb and its interactions with the environment," says USC engineering doctoral student Ali Marjaninejad, author of the study.
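The motor-babbling idea can be sketched in a few lines: issue random motor commands, record what the limb actually does, and treat those (command, outcome) pairs as a crude internal map that can later be searched to reach a goal. The toy tendon-to-foot model and the nearest-neighbor lookup below are illustrative assumptions for this sketch, not the authors' actual algorithm or dynamics.

```python
import random
import math

# Hypothetical toy limb: two "tendon" activations map to a 2D foot position.
# This nonlinear forward model is assumed purely for illustration.
def limb_forward(a1, a2):
    return (math.cos(a1) + 0.5 * math.cos(a1 + a2),
            math.sin(a1) + 0.5 * math.sin(a1 + a2))

def motor_babble(n_samples, seed=0):
    """Issue random motor commands and record the sensed outcomes,
    building the robot's 'internal map' of its own limb."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        a1 = rng.uniform(0, math.pi)
        a2 = rng.uniform(0, math.pi)
        samples.append(((a1, a2), limb_forward(a1, a2)))
    return samples

def inverse_lookup(samples, target):
    """Crude inverse model: pick the babbled command whose recorded
    outcome landed closest to the desired foot position."""
    def dist(outcome):
        return math.hypot(outcome[0] - target[0], outcome[1] - target[1])
    return min(samples, key=lambda s: dist(s[1]))

samples = motor_babble(2000)
command, reached = inverse_lookup(samples, target=(0.3, 1.0))
print("command:", command, "reached:", reached)
```

A few minutes of babbling corresponds here to gathering enough samples that the map is dense; a real controller would generalize between samples rather than simply looking up the nearest one.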


By taking it upon itself to learn about its structure and environment, the robotic limb can develop its own personalized gait and learn a new walking task after just five minutes of motor babbling. It can even recover when tripped, adjusting in time to plant its next step safely on the ground, even though it was never programmed to do so. The researchers believe this is the first robot capable of such a feat, and are excited about the possibilities the advance opens up.

As they explain, robots can be programmed to perform certain tasks in certain scenarios, but you can't prepare them for every possibility. Robots that instead develop their own personalized movements in response to their environment will be able to take on a far wider range of tasks.

"If you let these robots learn from relevant experience, then they will eventually find a solution that, once found, will be put to use and adapted as needed," says Marjaninejad. "The solution may not be perfect, but will be adopted if it is good enough for the situation. Not every one of us needs or wants – or is able to spend the time and effort – to win an Olympic medal."

Responsive prosthetics is one area where this kind of technology could have an impact, assisting people with disabilities by allowing more intuitive, natural and self-improving limbs. Space exploration is another, where robots could be placed on faraway bodies and use their learning capabilities to adjust their gait and navigate unknown terrain.

"The ability for a species to learn and adapt their movements as their bodies and environments change has been a powerful driver of evolution from the start," says Brian Cohn, fellow doctoral student and study author. "Our work constitutes a step towards empowering robots to learn and adapt from each experience, just as animals do."

The research was published in the journal Nature Machine Intelligence.

Source: University of Southern California
