[Image: SurroundSense uses your mobile phone's sensors to figure out where you are and is particularly effective where GPS fails]

Smartphones use GPS positioning for a variety of functions, but mainly on the road, where accuracy of only around 10 m is basically a case of 'near enough is good enough'. Try using one indoors, though, and GPS simply doesn't work; nor can it distinguish between two adjacent environments, however different they may be. And 10 m can make a big difference inside a shopping complex or multi-roomed office block. In research jointly sponsored by Microsoft, Nokia, Verizon and the National Science Foundation, a group of computer engineers from Duke University is working on better indoor localization using a combination of sound, light and accelerometer data picked up by a mobile phone. They hope it will supplement GPS, which, as most users know, has its limitations.

The project is challenging, to say the least, even though the idea behind it is quite simple: taken together, sound, light and accelerometer information (used to register the user's movement patterns), combined with WiFi and GSM location, creates a unique 'digital fingerprint' of the surrounding environment that is often enough to identify your location.
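
As a rough illustration of what such a fingerprint might look like in code, here is a minimal sketch in Python; the field names and feature choices are assumptions for illustration, not SurroundSense's actual data format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AmbienceFingerprint:
    """Hypothetical per-location fingerprint built from phone sensors."""
    gsm_cell_id: str           # coarse position from the cell tower
    wifi_ssids: List[str]      # nearby access points narrow the candidates
    sound_levels: List[float]  # short loudness samples from the microphone
    light_level: float         # average ambient brightness
    accel_variance: float      # how much the user is moving around
```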

For instance, in a bar, people spend little time moving around in a room that is typically dark and noisy; a retail store will be brightly lit, with different stores showing different audio patterns; and so on. The raw information is sent to a server which processes the data and, with the aid of machine learning classification algorithms, tries to figure out the user's current location.
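
The article doesn't say which classifier is used, but a simple server-side nearest-neighbour match against previously recorded fingerprints conveys the idea. This sketch reuses the hypothetical AmbienceFingerprint above; the feature vector and distance metric are illustrative assumptions.

```python
import numpy as np

def to_vector(fp: AmbienceFingerprint) -> np.ndarray:
    # Collapse the raw readings into a fixed-length feature vector.
    return np.array([
        np.mean(fp.sound_levels),   # how loud the place is overall
        np.std(fp.sound_levels),    # how bursty the noise is
        fp.light_level,
        fp.accel_variance,
    ])

def classify(fp: AmbienceFingerprint, known) -> str:
    """Return the label of the closest stored fingerprint.

    `known` is a list of (store_name, AmbienceFingerprint) pairs recorded
    at the candidate stores already shortlisted via GSM/WiFi.
    """
    query = to_vector(fp)
    best_label, best_dist = None, float("inf")
    for label, ref in known:
        dist = np.linalg.norm(query - to_vector(ref))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```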

One question immediately comes to mind: how can a mobile phone pick up light when it's in your pocket? Currently, the phone must be held with the camera facing down so that the system can use floor tile patterns to help identify the location, but the researchers at Duke University are working on strategies to make the system work even with the phone in your pocket or handbag.
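
One plausible way to turn a downward-facing camera frame into something matchable is a coarse colour histogram of the floor. The sketch below uses OpenCV as an assumed tool; the article does not describe the actual image features.

```python
import cv2
import numpy as np

def floor_signature(image_path: str, bins: int = 8) -> np.ndarray:
    """Reduce a downward-facing camera frame to a small colour histogram.

    Floors in different stores tend to have distinctive tile colours
    and patterns, so even this crude signature separates candidates.
    """
    img = cv2.imread(image_path)
    # 3-D histogram over the B, G and R channels, normalised so the
    # overall colour distribution matters more than absolute brightness.
    hist = cv2.calcHist([img], [0, 1, 2], None, [bins] * 3,
                        [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()
```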

To achieve this, they are also relying on the fact that, as more people start using the system, the additional data collected will improve its accuracy, which in the long run could be enough to figure out your location even without visual input.

Large quantities of data are indeed vital to the success of the project, because the ambiance not only differs between locations but also changes over the course of the day: take, for instance, a Starbucks during the morning rush, when many customers mean plenty of noise, compared with the quieter mid-afternoon.

The group tested the system on 51 different stores and obtained an accuracy of around 87 percent when all three sensing capabilities were used, which is already quite an impressive result.

The system's continuous use of resources is, however, one of the team's biggest concerns, and they are now weighing trade-offs to preserve battery life, such as having the application take measurements only at regular intervals or delegating even more of the data processing to the servers.
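
A duty-cycling loop of the kind the team is weighing up might look roughly like this; the one-minute interval and the upload callback are assumptions for illustration, not details from the project.

```python
import time

SAMPLE_INTERVAL_S = 60  # assumed: sample once a minute, not continuously

def sensing_loop(read_sensors, upload):
    """Sample periodically and push raw readings to the server.

    Leaving classification to the server keeps the phone's workload to a
    brief burst of sensing per interval, preserving battery life.
    """
    while True:
        reading = read_sensors()       # microphone, light, accelerometer
        upload(reading)                # server does the heavy processing
        time.sleep(SAMPLE_INTERVAL_S)  # sensors idle between samples
```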
