Robotics

Speak! Robot guide dogs converse with their owners

The robotic dog used in the study was a Unitree Go2 model

Since the early 1900s, dogs have helped people who are blind or have low vision to navigate their world. Now, in a very 21st century twist, seeing-eye dogs have gone robotic and added a skill that not even the most well-trained canine could pull off: conversation.

Seeing-eye dogs are undoubtedly one of the clearest examples of human-canine bonding. Not only do they help keep their owners safe, they also provide comfort and companionship to people who can often feel isolated. Yet these clever canines take a long time to train, and only 50-60% graduate from the programs that qualify them to work with people who are blind or have low vision. That makes them expensive, with costs ranging between US$20,000 and $50,000. As a result, only about 2-5% of the blind community are able to have a seeing-eye dog.

These facts led Shiqi Zhang, an associate professor at Binghamton University, to investigate an alternative. In 2022, he and his students went trick-or-treating with a quadruped robotic dog. In 2023, he gave that dog a more important role, training it to respond to leash tugs so it could work more like a guide dog. Now, Zhang and his team have gone a step further, equipping a Unitree Go2 robotic dog with the large language model GPT-4 so it can ask questions and respond to cues from the user and the environment.

"For this work, we’re demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs," said Zhang. "Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities."
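The idea Zhang describes, pairing voice commands with a large language model so the robot isn't limited to a fixed command set, can be sketched roughly as follows. This is a minimal illustration of the concept, not the team's actual code: the `interpret_command` function and the `ACTIONS` vocabulary are hypothetical stand-ins for a real GPT-4 call that would map free-form speech onto a constrained set of robot actions.

```python
# Hedged sketch: map free-form spoken requests to a small action vocabulary.
# A real system would send the utterance (plus context) to an LLM such as
# GPT-4 and constrain the reply to this vocabulary; here a simple keyword
# stub stands in for that call. All names are illustrative assumptions.

ACTIONS = {"go", "stop", "describe"}

def interpret_command(utterance: str) -> str:
    """Stand-in for an LLM call: map speech to one known action."""
    text = utterance.lower()
    if "stop" in text or "wait" in text:
        return "stop"
    if "where" in text or "what" in text:
        return "describe"
    # Default: treat the utterance as a navigation request
    return "go"

print(interpret_command("Take me to the main lobby"))    # go
print(interpret_command("Please stop for a moment"))     # stop
print(interpret_command("What is around me right now?")) # describe
```

The point of the design is that the hard language understanding lives in the model, so the robot-side dispatcher only ever sees a handful of well-defined actions.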

To test the robo dogs, Zhang's team recruited seven legally blind participants, who were asked to navigate a large, multi-room indoor environment. The bot first asked each participant where they wanted to go, then, as it guided them there, provided cues about the environment, such as "this is a long corridor" or "you're passing by the main lobby, which is an open area with seating and information desks." You can see one of the tests in progress in the following video.
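The interaction described above, asking for a destination and then narrating landmarks along the route, amounts to a simple guidance loop. The sketch below shows that loop in miniature; the route, waypoint names, and descriptions are invented for illustration and are not from the study.

```python
# Hedged sketch of the guidance loop described in the article: the robot
# walks a waypoint route and speaks a cue at each landmark. The route data
# here is illustrative only.

def guide(route):
    """Yield one spoken cue per (landmark, description) waypoint."""
    for name, description in route:
        yield f"You're passing {name}, {description}."

route = [
    ("a long corridor", "which runs straight ahead"),
    ("the main lobby", "an open area with seating and information desks"),
]

for cue in guide(route):
    print(cue)
```

In a fuller system, the descriptions would come from the language model's view of the environment rather than a hard-coded list, but the loop structure is the same.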

🤖These AI-Powered Guide Dogs Don’t Just Lead — They Talk!

Based on questionnaire data collected at the end of each test, the participants indicated that they preferred the combination of verbal and physical guidance through the environment over simply being pulled along. However, the participants did give the guide dog slightly lower marks for perceived safety, which the researchers say is likely due to the unfamiliarity of walking alongside a robot. That didn't dampen their enthusiasm for the bots, though, says Zhang.

"They were super excited about the technology, about the robots," he said. "They asked many questions. They really see the potential for the technology and hope to see this working."

In additional testing, the team used GPT-4 to run the dog through 77 different navigation scenarios via natural language commands, all of which it completed successfully.

Now the researchers plan to carry out more studies in which the bots will navigate longer distances both indoors and out. They will also be working on amping up the autonomy of the system.

The paper describing the research was presented in January at the 40th Annual AAAI Conference on Artificial Intelligence in Singapore.

Source: Binghamton University
