Imagine if you were a spy who needed to receive secret messages while you walked around a crowded room, but you didn't want to wear an earpiece. Well, a cheap new device could help. It's billed as being "the first sound projector that can track a moving individual."
Developed by a team at Britain's University of Sussex (and building upon recent research from that same group), the device incorporates an Arduino microcontroller, two 3D-printed acoustic lenses, a speaker, and an off-the-shelf £10 (US$12) webcam. Using that camera along with custom face-tracking software, the projector is able to "autozoom" in and out on a human subject as they walk toward or away from it, automatically adjusting the distance between the lenses in accordance with how far away the person is.
This telescope-like setup allows it to consistently focus a 6-cm-wide (2.4-inch) "sphere of sound" directly in front of the targeted individual. As a result, unless they're standing quite close to someone else, only the target can hear what's in that sphere.
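The distance-tracking step can be sketched in a few lines: the apparent size of the subject's face in the webcam frame gives a rough range estimate via the standard pinhole-camera relation, which is then mapped to a lens spacing. Everything below is a hypothetical illustration with made-up calibration values, not the Sussex team's actual code or numbers.

```python
# Sketch of the "autozoom" idea: estimate the subject's distance from the
# apparent width of their face in the webcam image, then choose a spacing
# for the two acoustic lenses. All constants are illustrative assumptions.

FOCAL_LENGTH_PX = 800.0    # assumed webcam focal length, in pixels
REAL_FACE_WIDTH_M = 0.15   # assumed average face width, in metres

def estimate_distance_m(face_width_px: float) -> float:
    """Pinhole-camera estimate: distance = f * real_width / pixel_width."""
    return FOCAL_LENGTH_PX * REAL_FACE_WIDTH_M / face_width_px

def lens_separation_mm(distance_m: float,
                       base_mm: float = 50.0,
                       gain_mm_per_m: float = 10.0) -> float:
    """Toy linear mapping from subject distance to lens spacing.
    A real device would use a measured calibration curve instead."""
    return base_mm + gain_mm_per_m * distance_m

# Example: a face 120 px wide implies a subject about 1 m away.
d = estimate_distance_m(120.0)   # 800 * 0.15 / 120 = 1.0 m
sep = lens_separation_mm(d)      # 50 + 10 * 1.0 = 60 mm
print(round(d, 2), round(sep, 1))
```

In a real system the face width would come from a detector running on each webcam frame, and the separation would drive a motorized stage; this sketch only shows the geometry of the estimate.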
And in its current form, the projector will only work on people who have already given their consent to be tracked. That means – for now, at least – the technology can't be used to harass people, or make them think they're hearing phantom voices.
It's also presently only able to track in and out (as opposed to side to side), and to project sound within a single octave, although the researchers are working to address both of those limitations. Once it's capable of delivering full human speech, complete pieces of music or other such content, the system could have a number of uses.
"We believe this technology can be harnessed for plenty of positive applications including personalized alarm messages in a crowd, immersive experiences without headphones, the audio equivalent of special effects," says lead scientist Dr. Gianluca Memoli.
The projector was demonstrated this week at the 46th International Conference and Exhibition on Computer Graphics & Interactive Techniques (SIGGRAPH 2019), in Los Angeles.
Source: University of Sussex via EurekAlert