Robotics

Snackbot serves up some human-robot interaction... and snacks

Snackbot on his rounds delivering sustenance to those in need at CMU

If you’re a student at Carnegie Mellon University (CMU) left gasping for breath whenever you have to drag yourself away from your studies to get a snack, rejoice! A CMU team has created a robot designed to deliver snacks to you. But the appropriately named Snackbot is far more than a vending machine on wheels. It is designed to serve as a research platform for the study of long-term human-robot interaction and packs a healthy helping of technological goodies, including a laser navigation system, sonar sensors and a stereo vision camera for eyes.

Snackbot is the culmination of two years of work by an interdisciplinary team of faculty members, undergraduates and doctoral students in the university’s Human-Robot Interaction Group. About the size of a small person, Snackbot rolls around on wheels and is intended for both fully- and semi-autonomous operation. Initially, orders for snacks will be placed through a website or IM service, but other ordering options are expected to evolve as the field trials of Snackbot progress.

Snackbot’s head features a Bumblebee 2 stereo vision camera serving as its eyes and a 3 x 12 LED display for its mouth, programmed with a series of animations that provide verbal and emotional feedback in the form of lip shapes, colors and movement. Although the robot doesn’t have functional ears, the team added ears to the head design to let customers know that Snackbot can hear them.
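The team hasn’t published details of how the mouth display is driven, but as a rough, purely illustrative sketch of how expressions might be stored and animated on a 3 x 12 LED matrix, consider the following Python snippet. The frame patterns, timing and terminal-based render helper are assumptions for illustration only, not Snackbot’s actual code.

```python
# Hypothetical sketch: storing mouth "expressions" as frames for a
# 3 x 12 LED matrix, loosely in the spirit of Snackbot's animated mouth.
# The patterns, timing and terminal rendering are invented for illustration.
import time

ROWS, COLS = 3, 12

# Each frame is ROWS strings of COLS characters; '#' = LED on, '.' = LED off.
NEUTRAL = [
    "............",
    "############",
    "............",
]
SMILE = [
    "#..........#",
    ".#........#.",
    "..########..",
]

def render(frame):
    """Print a frame to the terminal as a stand-in for driving real LEDs."""
    assert len(frame) == ROWS and all(len(row) == COLS for row in frame)
    print("\n".join(frame) + "\n")

def animate(frames, delay=0.3):
    """Step through frames at a fixed interval to create a simple animation."""
    for frame in frames:
        render(frame)
        time.sleep(delay)

if __name__ == "__main__":
    # Flash a smile, e.g. to acknowledge a snack order.
    animate([NEUTRAL, SMILE, NEUTRAL, SMILE])
```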

Aside from providing that 3 o’clock sugar fix, Snackbot is designed to support research into robust autonomous operation in office environments. It will enable the team to study multi-sensor fusion algorithms for perception, reasoning about dynamic spaces, communication with people through verbal and non-verbal mechanisms, and planning with incomplete information.
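The paper linked below covers the design process rather than the algorithms, so as a loose illustration of what multi-sensor fusion can mean in practice, here is a minimal Python sketch that fuses two noisy range readings (say, one from the laser and one from a sonar sensor) by inverse-variance weighting. The noise figures are invented and bear no relation to Snackbot’s actual sensors or software.

```python
# Minimal sketch of multi-sensor fusion: combine two noisy range readings
# (e.g. laser and sonar) by inverse-variance weighting. All numbers below
# are made up for illustration, not taken from Snackbot.

def fuse(z1, var1, z2, var2):
    """Return the fused estimate and its variance for two measurements."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    estimate = (w1 * z1 + w2 * z2) / (w1 + w2)
    variance = 1.0 / (w1 + w2)
    return estimate, variance

if __name__ == "__main__":
    # Laser reports an obstacle at 2.00 m (low noise), sonar at 2.30 m
    # (higher noise); the fused estimate leans toward the laser reading.
    dist, var = fuse(2.00, 0.01, 2.30, 0.09)
    print(f"fused distance: {dist:.2f} m (variance {var:.3f} m^2)")
```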

The research includes enabling the robot to navigate through congested areas in a socially acceptable fashion, detect individual people, recognize when someone it knows approaches and autonomously learn to recognize new objects. It will also support behavioral science research on such topics as personalization and people’s relationships with interactive objects.

Because the robot’s interaction and service design are flexible, future experiments will be conducted by changing its form and behavior. Possible examples include experiments on how the robot might use music and nonverbal behavior instead of words to communicate with customers at their offices, as well as experiments on personalized services.

A paper detailing the design process undertaken by the CMU team in developing Snackbot can be found on Snackbot's website.

Via DVICE.
