PlateMate crowd-sources nutritional analysis of users' meals
While there are a great many people who want to lose weight by dieting, there aren't too many who can afford to have a nutritionist assess the caloric value of all their food choices. Using the PlateMate system, however, members can get an online community of laypeople to do exactly that - and for considerably less money. Although taking such an approach to nutrition might sound kind of iffy, calorie estimates generated by the crowd-sourced system are apparently just as accurate as those provided by trained nutritionists, and more accurate than self-kept logs.
PlateMate was first developed by Jon Noronha and Eric Hysen, when they were undergrads in Harvard University's School of Engineering and Applied Sciences.
Users start by taking a photo of their meal, then submitting it to the crowd. That crowd is coordinated using Amazon Mechanical Turk, a crowdsourcing marketplace that Amazon originally developed to help improve its own product listings. Individual Turkers, as they're called, look over the submitted photos and try to determine which foods are present in each one, and in what approximate quantities. Each Turker's assessment is then automatically converted into a total caloric value, the system averages those totals, and the user is provided with their answer.
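In rough terms, that aggregation step might look something like the following sketch. The food names, calorie values, and function names here are purely illustrative assumptions, not taken from PlateMate itself:

```python
# Hypothetical sketch of the aggregation step described above: each
# Turker's (food, grams) estimates are priced out against a calorie
# table, and the per-Turker totals are averaged into one answer.
# All names and numbers are illustrative, not from PlateMate.

CALORIES_PER_100G = {"rice": 130, "chicken breast": 165, "broccoli": 34}

def turker_total(items):
    """Total calories for one Turker's list of (food, grams) estimates."""
    return sum(CALORIES_PER_100G[food] * grams / 100 for food, grams in items)

def meal_estimate(turker_submissions):
    """Average the per-Turker totals into a single calorie estimate."""
    totals = [turker_total(items) for items in turker_submissions]
    return sum(totals) / len(totals)

# Two Turkers assess the same photo somewhat differently:
submissions = [
    [("rice", 200), ("chicken breast", 120)],
    [("rice", 180), ("chicken breast", 150), ("broccoli", 50)],
]
print(round(meal_estimate(submissions)))  # prints 478
```

Averaging over several independent estimates is what lets a crowd of laypeople approach the accuracy of a single trained nutritionist: individual over- and under-estimates tend to cancel out.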
Each Turker receives a nominal payment for every task they accomplish.
Presumably, PlateMate users don't sit around with the food still on their plate, waiting to see the results before tucking in. The crowd feedback for one meal choice, however, could guide them in choosing types and amounts of foods in subsequent meals.
There were some hiccups (no pun intended) in setting up the system. Some Turkers, for instance, misidentified foods submitted by users from other cultures. Other Turkers took the lazy approach - from an on-screen list of food types, they simply selected the first term with some relevance to a food they had identified, instead of searching the list for a more specific term. These problems were addressed by breaking the process down into clearly defined tasks, posting warnings about common errors, being more selective when choosing Turkers, and applying algorithms that chose the most likely identification out of the several suggested for a single item.
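One simple way to pick the most likely identification from several competing suggestions is a majority vote over the labels the Turkers chose. The article doesn't specify PlateMate's actual selection algorithm, so treat this as an illustrative stand-in:

```python
# Illustrative majority vote over Turker labels for a single plate item.
# This is an assumed technique, not PlateMate's documented algorithm.
from collections import Counter

def pick_label(labels):
    """Return the label most Turkers agreed on (ties broken by first seen)."""
    return Counter(labels).most_common(1)[0][0]

votes = ["fried rice", "white rice", "fried rice", "fried rice", "rice"]
print(pick_label(votes))  # prints "fried rice", which got 3 of 5 votes
```

A vote like this also directly counters the "lazy Turker" problem: one worker grabbing the first vaguely relevant term gets outvoted by the others who searched for a more specific one.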
Down the road, user submissions may also be paired with locational data, so Turkers will know the geographical context of what they're looking at.
"A lot of prior crowdsourcing research has been about making crowds do things that we wish computers could do, like shorten an 800-word essay to 500 words and have it still make sense," said Noronha. "What makes the nutrition application so interesting as a problem in crowdsourcing is that computers are so very far away from doing it on their own - because food is such a human thing."