You might think that your feeling of satiation when eating is due simply to your stomach filling up. According to the Hirose Tanikawa Group at the University of Tokyo, however, the visual perception of food also plays a role – the more food a person sees themselves eating, the sooner they feel full. With that in mind, the team has created a prototype dieting system that uses augmented reality to trick people into thinking their food items are larger than they actually are.
Users wear a head-mounted, camera-equipped display and handle their food against a chroma-key-blue background – it is hoped that in a commercial version of the technology, any background (such as a table top) will suffice. The headgear could also likely be replaced by something considerably lighter and smaller, such as Google Glass.
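Purely for illustration, here is a minimal sketch of how that chroma-key step might work. The group hasn't published its toolchain, so OpenCV and the specific HSV thresholds below are assumptions rather than details of the actual prototype.

```python
# Hypothetical sketch: isolate the hand-held food from a solid blue backdrop
# so it can be resized later. Library choice and threshold values are assumed.
import cv2
import numpy as np

def segment_foreground(frame_bgr):
    """Return a mask of everything that is NOT the chroma-key-blue backdrop."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough hue range for a blue backdrop (would need tuning for real lighting).
    lower_blue = np.array([100, 80, 80])
    upper_blue = np.array([130, 255, 255])
    background = cv2.inRange(hsv, lower_blue, upper_blue)
    foreground = cv2.bitwise_not(background)
    # Clean up speckle with small morphological open/close operations.
    kernel = np.ones((5, 5), np.uint8)
    foreground = cv2.morphologyEx(foreground, cv2.MORPH_OPEN, kernel)
    foreground = cv2.morphologyEx(foreground, cv2.MORPH_CLOSE, kernel)
    return foreground
```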
The camera’s video signal is processed by software that identifies hand-held food items and enlarges them in the display, relative to the user’s hand. A deformation algorithm likewise makes the hand appear to be opened wider, as if it were naturally holding the larger piece of food.
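As a rough sketch of the enlargement step only: the segmented food region could be scaled about the point where it meets the hand and then composited back over the live frame. The grip-point detection and the hand-deformation warp used in the real system are more involved and are not reproduced here; the function and parameter names below are hypothetical.

```python
# Hypothetical sketch: scale the masked food region about a "grip point"
# (where the hand holds it) and paste the enlarged version over the frame.
import cv2
import numpy as np

def enlarge_food(frame_bgr, food_mask, grip_point, scale=1.5):
    """Scale the food region about grip_point and composite it onto the frame."""
    gx, gy = grip_point
    # Affine transform that scales the image about (gx, gy):
    # x' - gx = scale * (x - gx), and likewise for y.
    M = np.float32([[scale, 0, gx * (1 - scale)],
                    [0, scale, gy * (1 - scale)]])
    h, w = frame_bgr.shape[:2]
    scaled_frame = cv2.warpAffine(frame_bgr, M, (w, h))
    scaled_mask = cv2.warpAffine(food_mask, M, (w, h))
    # Where the scaled mask is set, show the enlarged food; elsewhere keep
    # the original camera image.
    out = frame_bgr.copy()
    out[scaled_mask > 0] = scaled_frame[scaled_mask > 0]
    return out
```

Scaling about the grip point, rather than the image center, keeps the enlarged food anchored in the user's grasp – which is presumably what makes the illusion convincing alongside the hand-deformation warp.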
In tests of the system with 12 subjects, the amount eaten dropped by about 10 percent when the food was made to appear 1.5 times its actual size. Interestingly, the principle also works in reverse – when the food was shown at two-thirds of its actual size, the subjects ate about 15 percent more. That could also come in handy, as it might be used to get people to eat larger amounts of healthy foods.
One question does arise, however ... what would happen when users tried to actually put the food in their mouths, if it looked bigger than it actually was? Perhaps they’d better keep a napkin handy.
The system can be seen in use in the DigInfo video below.
Source: DigInfo via ExtremeTech