Robot barman knows when you want a drink
Meet James. He’s a barman with a cheery disposition, is quick with your order, and doesn't tolerate queue jumping. He’s also a one-armed robot with a tablet for a head. But the really curious thing about James is that he can read your body language to find out whether or not you want to order a drink.
The Joint Action in Multimodal Embodied Systems (James) robot is an EU-funded project that started in 2011. As part of the project, Professor Dr. Jan de Ruiter of the Psycholinguistics Research Group at Germany's Bielefeld University along with partners Foundation for Research and Technology-Hellas in Crete, Fortiss in Munich, and the University of Edinburgh set out to solve the problem of how to employ robots as bartenders in a manner that humans would readily accept.
There have been any number of robot bartenders built in recent years. Many have some cool moves, but ordering drinks from one often involves a bit of a learning curve as the patron figures out how to place an order using a touchscreen or smartphone. Unfortunately, pub goers tend to be a bit single-minded about getting their hands on a pint and don't like complications.
The problem with robot bartenders is simple: robots don't like the real world. They like things to be tidy, orderly, and predictable – preferably with optical codes printed on everything. However, a pub is about as real as the world gets. It's crowded, noisy, and dimly lit, with music and conversation everywhere.
It's relatively easy to make a robot that can mix drinks. It's another matter to tell the robot what you want to drink. And it's another order of magnitude for the robot to figure out whether or not you want a drink in the first place, and another again to get it to do so in a pub.
Patrons don't like dealing with touchscreens or other interfaces. What they want is a robot that really can replace a bartender, so that the drink-ordering process doesn't change as they swap over. The trouble is, bar staff are very good at cutting through all the confusion and finding out who wants to order a drink and who doesn't. What is more remarkable is that they do so using cues that neither they nor the patrons are consciously aware of.
Bielefeld University’s contribution to the James project was to study how people order drinks and program that knowledge into the robot. "Currently, we are working on the robot’s ability to recognize when a customer is bidding for its attention," says de Ruiter. "Thus, we have studied the process of ordering a drink in real life."
For James to be successful, he has to be able to serve people who have never met him and know nothing about how he works. That puts all the pressure on James to get things right. "In order to respond appropriately to its customers the robot must be able to recognize human social behavior," says de Ruiter.
It turns out that it's more important for the robot to understand body language than just what's spoken to it. This was discovered when the team took video cameras to pubs and clubs in Bielefeld and Herford in Germany, and Edinburgh in Scotland, and recorded people ordering drinks at the bar. Later, the videos and snapshots from them were shown to experiment participants, who had to sort them according to whether or not they showed someone ordering a drink.
The results were rather surprising. When questioned, people said that when they wanted to order a drink they looked at their wallet, held up bank notes, or waved. It turned out that most people actually did none of these things or very rarely. For example, only 1 in 25 waved. Instead, 90 percent stood quietly perpendicular to the bar and looked at the bartender. If they didn’t want to order a drink, they adopted a different stance, such as turning slightly away from the bar or chatting with the person next to them.
"Effectively, the customers identify themselves as ordering and non-ordering people through their behavior," says psychologist Dr. Sebastian Loth. When asked in a BBC interview how people learned this ordering behavior, de Ruiter said that it was entirely natural and "like learning to breathe."
The team established that James can determine a patron’s posture, movements, and actions almost in real time. The next step was to reprogram James to take the new data into account. He had to be programmed not to offend patrons by either mistakenly asking them if they wanted a drink or ignoring someone who wanted to order. The latter, the team says, is worse. This meant giving James clear definitions of when someone is or isn't ordering, and the ability to apply those definitions based on the social context.
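In spirit, the rule the study uncovered is simple enough to sketch in a few lines: a patron who stands square-on to the bar and looks at the bartender is ordering; one turned away or chatting with a neighbor is not. The sketch below is purely illustrative – the feature names, angle threshold, and structure are assumptions, not the project's actual code.

```python
# Illustrative sketch of the ordering-intent cue described in the study:
# square-on stance toward the bar plus gaze at the bartender = ordering.
# All names and the 20-degree threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class PatronState:
    facing_bar_deg: float       # torso angle to the bar; 0 = square-on
    looking_at_bartender: bool  # gaze directed at the bartender
    talking_to_neighbor: bool   # turned toward the person next to them

def wants_to_order(p: PatronState, max_angle_deg: float = 20.0) -> bool:
    """Return True if the patron's stance matches the 'ordering' cue."""
    if p.talking_to_neighbor:
        # Chatting with a neighbor signals "not bidding for attention".
        return False
    return p.facing_bar_deg <= max_angle_deg and p.looking_at_bartender

# A patron standing square to the bar, watching the bartender:
print(wants_to_order(PatronState(5.0, True, False)))   # True
# A patron turned away, mid-conversation:
print(wants_to_order(PatronState(60.0, False, True)))  # False
```

Since the team says wrongly ignoring a customer is worse than wrongly offering a drink, a real system would likely bias such a threshold toward over-detecting ordering intent.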
The James project continues until January. Whether or not the team will be able to program James to discuss the football match last night remains to be seen.
The results of the study were published in Frontiers in Psychology.
The video below shows James going through his paces.