
Robots taught to move safely, but not too safely

Researchers at Georgia Tech have developed algorithms that allow autonomous robots to loosen up on safety a little in order to get the job done

On the road to an increasingly autonomous future, robots and AI systems will need to be programmed to instinctively avoid collisions when they take the wheel. But if bots are designed to be too careful, performance may suffer. A team at Georgia Tech has created new algorithms that aim to strike a balance between the two extremes, allowing robots to move in a swarm safely and efficiently.

Collision avoidance is one of the most important considerations of autonomous vehicles and robots, but some researchers have pondered the ethics of allowing self-driving cars to break minor laws to keep things running smoothly. It follows that autonomous robots may need to relax their own "bubbles" of personal space a little, too.

"When you have too many robots together, they get so focused on not colliding with each other that they eventually just stop moving," says Magnus Egerstedt, a roboticist at Georgia Tech. "Their safety behaviors take over and the robots freeze. It's impossible for them to go anywhere because any movement would cause their bubbles to pop."

Similar to other research into robot swarm behavior, Egerstedt's team developed a set of algorithms that allowed a small group of robots to cross paths and swap spots quickly without crashing into each other. Essentially, each robot navigates using a set of safe states and barrier certificates, but does so with minimal disruption to its primary objective.

"In everyday speak, we've shrunk the size of each robot's bubble to make it as small as possible," says Egerstedt. "Our system allows the robots to make the minimum amount of changes to their original behaviors in order to accomplish the task and not smack into each other."

The video below demonstrates the effect of that system, and watching the four robots moving in sync is almost hypnotic. The researchers show that it works just as well with eight robots, and even if one rogue bot doesn't follow the rules, the others adapt to the wild card, keeping their distance and continuing on toward their goal regardless.

Though there have been a few minor incidents and even a fatality, autonomous driving technology has a fairly clean safety record so far, albeit with a relatively limited sample size. These kinds of safety systems could help keep incidents to a minimum as more self-driving vehicles pull out onto public roads, and could even clear the airspace for autonomous planes.

"We haven't seen thousands of autonomous cars on the road together yet," says Egerstedt. "Robots are very conservative — they want to make sure they're safe. You couldn't pack the interstate with self-driving cars with today's technology."

The team's research paper will be presented at the IEEE Conference on Decision and Control in December.

Source: Georgia Tech

Safe Swarm Robotics

1 comment
Bob Flint
So the smaller yields to the larger, similar to typical traffic patterns. Now explain that... no, program that into the Google car that thought the dumb bus had the right of way.