Ear-IT project: How listening to the sounds of a city could make for smarter living
As the Internet of Things (IoT) continues to gather steam, we're seeing the emergence of gadgets equipped with all kinds of sensors to help improve our daily lives, from energy-saving climate control systems to smart locks for the front door. But have you ever thought about how sound might be used in the IoT? For the last two years, the Ear-IT project has been monitoring acoustics in the Spanish city of Santander, with the aim of improving the lives of its residents in ways ranging from improved traffic flow to energy savings in the home.
As part of FP7, the European Union's Seventh Framework Programme for Research, the Ear-IT project team installed more than 12,000 recording devices around Santander, the majority of them integrated into lamp posts. One focus of the research was a troublesome junction near the city's hospital, which had seen a high number of accidents.
"The complex junction was the scene of quite a few traffic incidents," says Professor Pedro Maló, project coordinator. "Traffic comes in a variety of directions and emergency vehicles are trying to get through. Ear-IT has set up sensors which hear sirens and then trigger other sensors to track the vehicle. This data is then used to change traffic lights in the ambulance’s favor."
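The event chain Maló describes can be sketched as a simple pipeline: a sensor hears a siren, further triggered sensors trace the vehicle's path, and the lights along that path turn green. The sensor IDs, decibel threshold, and `green_wave()` logic below are illustrative assumptions, not details from the Ear-IT system.

```python
# Hypothetical sketch of the siren-to-traffic-light event chain.
SIREN_DB_THRESHOLD = 85.0  # assumed siren-band detection threshold in dB

def detect_siren(sensor_id, level_db):
    """Return True when a sensor's siren-band level crosses the threshold."""
    return level_db >= SIREN_DB_THRESHOLD

def track_vehicle(detections):
    """Order triggered sensors by time to reconstruct the vehicle's path."""
    return [sensor for _, sensor in sorted(detections)]

def green_wave(path, junctions):
    """Switch lights to green at every junction along the tracked path."""
    return {j: ("green" if j in path else junctions[j]) for j in junctions}

# Simulated readings: (timestamp, sensor_id, siren-band level in dB)
readings = [(0, "A", 40.0), (1, "A", 90.0), (2, "B", 92.0), (3, "C", 88.0)]
detections = [(t, s) for t, s, db in readings if detect_siren(s, db)]
path = track_vehicle(detections)
lights = green_wave(path, {"A": "red", "B": "red", "C": "red", "D": "red"})
print(path)    # ['A', 'B', 'C']
print(lights)  # {'A': 'green', 'B': 'green', 'C': 'green', 'D': 'red'}
```

In practice the ordering step would fuse timestamps with sensor locations to estimate the vehicle's direction, but the basic detect-track-prioritize flow is the same.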
In addition to providing better access to the hospital in case of emergency, the sensors were also effective in measuring general traffic density. They were able to determine the number of cars passing by and help identify hot spots, which the researchers say could also help in other applications, such as efforts to reduce air pollution. The team corroborated the data with electromagnetic induction sensors in the roads.
"I was really relieved and delighted when, after a year of work to adapt the technology to a city environment, we found the acoustic and pressure sensors were giving us the same message," says Maló.
Behind the technology is a three-step acoustic detection process. During the pre-processing stage, an audio signal composed of background and target acoustic sounds is detected by the microphones. The background and target sounds are then separated to avoid unnecessary processing and the energy wastage that goes with it.
Sounds are then extracted from the recording and defined by certain characteristics such as time, spectral distribution, energy and modulation, before being categorized into different classes of acoustic events using a combination of computer modeling and manual labeling.
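The three steps above can be sketched in miniature: separate target frames from the background by an energy threshold, extract a small feature vector, then assign the nearest labeled class. The feature set (energy and zero-crossing rate) and the nearest-centroid classifier are illustrative assumptions, not the project's actual models.

```python
# A minimal sketch of the three-step acoustic detection process.
import math

def frame_energy(frame):
    """Mean squared amplitude of one audio frame."""
    return sum(x * x for x in frame) / len(frame)

def separate(frames, background_energy):
    """Step 1: keep only frames noticeably louder than the background."""
    return [f for f in frames if frame_energy(f) > 4 * background_energy]

def features(frame):
    """Step 2: a tiny feature vector - energy plus zero-crossing rate
    (a crude stand-in for spectral and modulation characteristics)."""
    zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / len(frame)
    return (frame_energy(frame), zcr)

def classify(feat, centroids):
    """Step 3: assign the nearest labeled class centroid."""
    return min(centroids, key=lambda c: math.dist(feat, centroids[c]))

# Class centroids would come from manual labeling plus computer modeling.
centroids = {"siren": (0.5, 0.99), "engine": (0.3, 0.05)}
frames = [[0.01, -0.01] * 50, [0.7, -0.7] * 50]  # quiet hum, loud tone
target = separate(frames, background_energy=0.0001)
print([classify(features(f), centroids) for f in target])  # ['siren']
```

Discarding background frames before feature extraction is what saves processing effort and, with it, energy at the sensor.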
According to the researchers, other potential applications for the technology include sending an alert to your smartphone if there is a concert happening nearby, or another event in the street that you might find enjoyable.
Inside the home, the sensors were also able to analyze a room and identify how many people were in it, which could lead to automated functions such as closing the curtains, opening a window, or switching off the lights to save energy.
Additionally, the sensors could prove useful as a safety measure for the elderly. Through the ability to detect the presence of somebody in the room, the system could be used to send a distress signal to a family member or carer if there is a fall or another accident.
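Both in-home uses boil down to mapping an occupancy count and an optional acoustic event onto actions. The rule set and event names below are assumptions for illustration, not part of the Ear-IT system.

```python
# Hypothetical sketch: acoustic occupancy sensing driving energy-saving
# automation and elderly-care alerts.

def room_actions(occupants, event=None):
    """Map an occupancy count and an optional acoustic event to actions."""
    actions = []
    if occupants == 0:
        # Empty room: save energy.
        actions += ["close_curtains", "lights_off"]
    if event == "fall" and occupants >= 1:
        # Possible accident with someone present: notify family or a carer.
        actions.append("alert_carer")
    return actions

print(room_actions(0))          # ['close_curtains', 'lights_off']
print(room_actions(1, "fall"))  # ['alert_carer']
```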
The Ear-IT testing period concluded last month and the team says the project will be fully completed by the end of this year.