Brain-reading hearing aid tech homes in on speakers' voices

The system analyzes a person's brainwaves to determine the direction in which their hearing is focused

Hearing aids are often stymied by the "cocktail party" effect, wherein they can't amplify one person's voice without also boosting the voices of everyone else in the room. A new AI system, however, could help focus the devices' attention where it's needed.

First of all, there are already artificial intelligence-based systems that can determine which of several voices someone is listening to. In a nutshell, they do so by treating each voice as a unique sound signal; when one of those signals is accompanied by an increase in certain brainwaves, the system knows that's the voice it should be isolating and amplifying.
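
To illustrate the general idea behind those earlier systems, here is a minimal sketch (not any group's actual code) of correlation-based attention decoding: an estimate of the attended speech envelope is reconstructed from the EEG and compared against each speaker's envelope. The decoder weights, window length, and channel counts below are illustrative stand-ins.

```python
import numpy as np

def attended_speaker(eeg, envelopes, decoder):
    """Pick the attended speaker by stimulus reconstruction.

    eeg:       (samples, channels) EEG window
    envelopes: list of (samples,) speech envelopes, one per speaker
    decoder:   (channels,) weights mapping EEG to an envelope estimate
               (in practice trained on labeled listening data)
    """
    reconstruction = eeg @ decoder  # estimated attended envelope
    # The speaker whose envelope best matches the reconstruction wins
    scores = [np.corrcoef(reconstruction, env)[0, 1] for env in envelopes]
    return int(np.argmax(scores))

# Toy example: a 15-second, 64-channel EEG window sampled at 64 Hz,
# compared against two competing speakers (all signals random here)
rng = np.random.default_rng(0)
window = 15 * 64
eeg = rng.standard_normal((window, 64))
envelopes = [rng.standard_normal(window) for _ in range(2)]
decoder = rng.standard_normal(64)
print("attending to speaker", attended_speaker(eeg, envelopes, decoder))
```

The long window is the catch: a correlation computed over only a second or two of noisy EEG is unreliable, which is why these systems need the extended listening spans described below.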

According to scientists at Belgium's KU Leuven university, the problem with such setups is that they can take 10 to 20 seconds to do their job – that's impractically long in fast-moving conversations, particularly those involving more than two people. Instead, the researchers developed a much quicker AI-based brainwave-reading system that doesn't listen to voices at all.

"We trained our system to determine whether someone is listening to a speaker on their left or their right," says Prof. Alexander Bertrand. "Once the system has identified the direction, the acoustic camera [an array of microphones] redirects its aim, and the background noise is suppressed. On average, this can now be done within less than one second. That’s a big leap forward, as one second constitutes a realistic timespan to switch from one speaker to the other."

Currently, the electroencephalogram (EEG) data utilized by the system has to be gathered via an electrode-equipped skull cap. That said, it is hoped that once the setup is developed further, the EEG technology could instead be built into a compact hearing aid with integrated electrodes.

A paper on the research was recently published in the journal IEEE Transactions on Biomedical Engineering.

Source: KU Leuven

2 comments
Gizmowiz
Why not just use optics tied to the hearing aids, so that wherever the eyes are POINTED is where the hearing aids focus? If my eyes are keyed on the lips of a speaker across the table, the aids could filter out all noises not emanating from that point in space.
Kevin Jacobsen
VincentWolf, as someone with ASD I can tell you that if my hearing aids were tied to where I was looking, I'd never hear the person I'm desperately trying to listen to at the moment.

Even the approach of isolating signals based upon what is deemed interesting by my brain at any given moment would turn my hearing aids into a 1960s car radio being controlled by my ADHD child self. No channel left behind.

Perhaps I'm not the market segment that this would benefit. I'd hate to get this and have some clever engineer force me to listen to one message at the cost of hearing something more important... you know, something like that squirrel chittering in the yard.