Robotics

Speak! Robot guide dogs converse with their owners

The robotic dog used in the study was a Unitree Go2 model

Since the early 1900s, dogs have helped people who are blind or have low vision to navigate their world. Now, in a very 21st century twist, seeing-eye dogs have gone robotic and added a skill that not even the most well-trained canine could pull off: conversation.

Seeing-eye dogs are undoubtedly one of the clearest examples of human-canine bonding. Not only do they help keep their owners safe, but they also provide comfort and companionship to people who can often feel isolated. Yet these clever canines take a long time to train, and only 50-60% graduate from the programs that qualify them to work with people who are blind or have low vision. That makes them expensive, with costs ranging from US$20,000 to $50,000. As a result, only about 2-5% of the blind community are able to have a seeing-eye dog.

These facts led Shiqi Zhang, an associate professor at Binghamton University, to investigate an alternative. In 2022, he and his students went trick-or-treating with a quadruped robotic dog. In 2023, he decided to give that dog a more important role and trained it to respond to leash tugs so it could work more like a guide dog. Now, Zhang and his team have gone one step further, equipping a Unitree Go2 robotic dog with the GPT-4 large language model so that it can question and respond to cues from the user and the environment.

"For this work, we’re demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs," said Zhang. "Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities."

To test the robo dogs, Zhang's team recruited seven legally blind participants who were asked to navigate a large multi-room indoor environment. The bot first asked each participant where they wanted to go, and then, as it was guiding them there, provided cues about the environment such as: "this is a long corridor" or "you're passing by the main lobby, which is an open area with seating and information desks." You can see one of the tests in progress in the following video.

Video: These AI-Powered Guide Dogs Don't Just Lead, They Talk!

Based on questionnaire data collected at the end of each test, the participants indicated that they preferred the combination of verbal and physical guidance through the environment rather than just being pulled along. However, the participants gave the guide dog slightly lower marks for perceived safety, which the researchers say is likely due to the unfamiliarity of walking alongside a robot. That didn't dampen their enthusiasm for the bots, though, says Zhang.

"They were super excited about the technology, about the robots," he said. "They asked many questions. They really see the potential for the technology and hope to see this working."

In additional testing, the team used GPT-4 to translate natural-language commands into actions, running the dog through 77 different navigation scenarios, all of which it completed successfully.

Now the researchers plan to carry out more studies in which the bots will navigate longer distances both indoors and out. They will also be working on amping up the autonomy of the system.

The paper describing the research was presented in January at the 40th Annual AAAI Conference on Artificial Intelligence in Singapore.

Source: Binghamton University

6 comments
Trylon
Can't say that I see the utility of a dog-shaped guide walking alongside. Why not something like a small shopping cart or rollator? It would have a smaller footprint and could also carry a small amount of cargo – like groceries – as well. Wheels or even tracks would be much more energy-efficient than legs, allowing longer operation. Walking behind it, you would know that if the robot can negotiate something first, like a curb, so can you.
Global
Seems this concept could do so much more than walking beside you as a dog, but does it sense heat or water? Tricky situations like inclines or even snow/ice could prove challenging. How do the predicted/current costs of this prototype compare with the typical seeing-eye dog?
I think a tracked seated device would have a lot more to offer than simply mimicking a trusted dog.
Spud Murphy
Commenters missing the point here. They used a readily available, off-the-shelf robot rather than some specially designed, super expensive bespoke unit. The Go2 starts at just US$1600, vastly cheaper than anything that could be special-built for this task, and it has the processing capabilities to do the job, whereas most wheeled robots are not much more than RC machines.
Techutante
Tracks and wheels don't mix with stairs or lips over a couple of inches. As to whether it's as good as a real dog, I would say no, by a long shot. But inevitably someone will say it's inhumane to train and enslave a dog in service to people, and they will be replaced. Benefits include: no giant piles of poop you can't see, it can carry hundreds of pounds (potentially even letting smaller people ride it), you can charge your phone off it? Maybe you can send it to the store by itself to get things for you.
Trylon
The rebuttals are hilarious. This LLM isn't processed in the robot, so it's not just an off-the-shelf Go2. In the experiment, the robot is controlled by a human assistant, the guy walking behind the robot carrying the controller.

Tracks and wheels work fine with stairs, even full flights of stairs far more than "a couple of inches" high. Tracked stairclimber robots have existed for years. Don't think wheels can handle stairs? Read the article here only days ago about the RAI Roadrunner wheeled robot. Or look up Tri-star wheels. And don't forget, there are things called "elevators."

Again, wheels are far more energy efficient. There's a reason our cars, bikes, carts, scooters, etc. have wheels and not legs. Legs are great if you're only going to use it for an hour or two a day, but people have waking hours longer than that. You can't expect a blind person to have to stop and search for a charging station every couple of hours, then wait a few more hours as it recharges.

They're visually impaired, not motion impaired, so it would be better to allow them to provide the motive force. All they need is the audio prompts and directions to navigate the environment, much like GPS navigation systems. Even silent tactile cues in the grips, like a buzz in the left grip to tell you it's time to turn left. Add a sensor suite to detect and warn of obstacles, hazards, etc., and they could go outside into an uncontrolled environment that's not completely mapped out, where this experiment obviously can't go. Basically, it would act as a wheeled or tracked white cane. Believe it or not, there have been quite a few examples of research into smart walkers for the blind. It wouldn't be difficult to mount a smartphone or tablet to the handlebar to provide the LLM, which seems to be the crux of this project.
fred
I am more interested in the AI stack than the dog. A cane, a video cam in glasses, and something that could run on an M-series Apple laptop in a bag would perhaps be more useful. The "describe surroundings" function is the same, and voice plus cane might be easier than the dog format, certainly easier in an office, car, or restaurant, and the glasses can transcribe or describe from the user's POV while seated. Great work, whatever format gets adopted.