
Automatic photo tagging with TagSense smartphone app

The TagSense smartphone app will automatically apply a greater variety of tags to photos

The old adage says "a picture is worth a thousand words," but exactly which words is the question. While facial recognition and GPS-enabled cameras have made tagging digital snapshots with names and locations much easier, a team of students from Duke University and the University of South Carolina has developed a smartphone app called TagSense that takes advantage of the multiple sensors on a mobile phone to automatically apply a greater variety of tags to photos.

Looking to make it easier for people to search and retrieve specific photos from their ever-growing digital albums, the team set about creating an app that takes advantage of not just a smartphone's GPS, but also its accelerometer, light sensor and microphone to provide additional details about a given photo. The app doesn't just make use of the sensors on the phone taking the photo, but also collects information from the smartphones of subjects within the photo - with permission, of course.

"Phones have many different kinds of sensors that you can take advantage of," said Chuan Qin, a visiting graduate student from USC. "They collect diverse information like sound, movement, location and light. By putting all that information together, you can sense the setting of a photograph and describe its attributes."

The developers say that through the use of a smartphone's built-in accelerometer, the app can tell if a person is standing still for a posed photograph, bowling or even dancing, while the light sensor in the phone's camera can be used to tell whether the picture is being taken indoors or out, or on a sunny or cloudy day. The app can even look up weather conditions at the time and location of the photograph. Additionally, the phone's microphone can be used to detect whether the subject is laughing or quiet. All of these attributes are then assigned to each photograph.
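To give a rough sense of how such sensor-to-tag mapping might work, here is a minimal sketch in Python. The thresholds, tag names and the `SensorSnapshot`/`infer_tags` structure are illustrative assumptions, not the actual classifiers used by TagSense.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorSnapshot:
    """Hypothetical bundle of readings captured around the moment a photo is taken."""
    accel_variance: float   # variance of accelerometer magnitude (m/s^2)
    light_lux: float        # ambient light level reported by the light sensor
    audio_laughter: bool    # whether a laughter-like pattern was heard by the microphone

def infer_tags(snapshot: SensorSnapshot) -> List[str]:
    """Map raw sensor readings to descriptive tags (placeholder heuristics only)."""
    tags = []

    # Little movement suggests a posed shot; strong, sustained movement suggests dancing.
    if snapshot.accel_variance < 0.5:
        tags.append("posed")
    elif snapshot.accel_variance > 5.0:
        tags.append("dancing")

    # Bright ambient light hints at an outdoor scene, dim light at an indoor one.
    tags.append("outdoor" if snapshot.light_lux > 1000 else "indoor")

    # Microphone analysis distinguishes laughing subjects from quiet ones.
    tags.append("laughing" if snapshot.audio_laughter else "quiet")

    return tags

print(infer_tags(SensorSnapshot(accel_variance=7.2, light_lux=250.0, audio_laughter=True)))
# ['dancing', 'indoor', 'laughing']
```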

"So, for example, if you've taken a bunch of photographs at a party, it would be easy at a later date to search for just photographs of happy people dancing," Qin said. "Or more specifically, what if you just wanted to find photographs only of Mary dancing at the party and didn't want to look through all the photographs of Mary?"

The students envision that those most likely to benefit from TagSense would be groups, such as friends, who would "opt in," allowing the capabilities of the various group members' mobile phones to be harnessed when they are together to provide more detailed tags. To protect users' privacy, the app wouldn't request data from nearby phones that don't belong to the group.
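In code, that opt-in rule amounts to filtering nearby devices against a group list before any sensor data is requested. The sketch below is an assumed simplification of that check, not the app's actual protocol.

```python
# Hypothetical opt-in check: only phones whose owners joined the group are polled.
group_members = {"alice_phone", "bob_phone"}

def phones_to_query(nearby_phones, group):
    """Keep only nearby phones that belong to the opted-in group; ignore other devices."""
    return [phone for phone in nearby_phones if phone in group]

print(phones_to_query(["alice_phone", "stranger_phone", "bob_phone"], group_members))
# ['alice_phone', 'bob_phone']
```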

The team conducted experiments using eight Google Nexus One smartphones, taking more than 200 digital photos at various locations across the Duke campus. Senior researcher Roy Choudhury said the app was compared to Apple's iPhoto and Google's Picasa and was shown to provide greater sophistication in tagging photos. The added details of automatic tagging could also help complement existing tagging applications, he said.

The students from Duke University and the University of South Carolina (USC) unveiled the app at the ninth ACM International Conference on Mobile Systems, Applications and Services (MobiSys), held in Washington, D.C. It is currently still a prototype, but the researchers believe a commercial product could be available in a few years.

1 comment
Mr Stiffy
Hmmmm I am so thrilled - not only to go beyond digital cameras, fillum cameras etc., - I shall ascend to a 2B pencil and paper, I shall draw pictures of everything I think it's worth keeping a picture of - and I shall give the pictures away to everyone who thinks you can take anything with you when you die.