Scientists recreate Pink Floyd classic using only people's brains

What would Roger Waters think of his band's classic tune being reconstructed this way?

While fans of “Another Brick in the Wall (Part 1)” may be horrified by the results, science is on its way to recreating music from brain activity alone. And the technique has potentially very broad applications.

University of California (UC), Berkeley researchers used 2,668 electrodes to capture the brain activity of 29 people listening to the 1979 Pink Floyd epic, then fed those recordings into computer algorithms to reconstruct a recognizable version of the song.
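At its core, this kind of reconstruction means learning a mapping from electrode activity to a representation of the audio, such as a spectrogram, which can then be turned back into sound. The sketch below is only a hypothetical, simplified illustration of that general idea using ridge regression on synthetic data; the array sizes, features and model choice are assumptions for illustration, not the study's actual pipeline.

```python
# Illustrative sketch only: decode a stand-in "spectrogram" from synthetic
# electrode features with ridge regression. Not the study's code or data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed sizes: time samples, electrodes, spectrogram frequency bins
n_samples, n_electrodes, n_freq_bins = 5000, 347, 32

X = rng.standard_normal((n_samples, n_electrodes))            # neural features per time bin
W_true = rng.standard_normal((n_electrodes, n_freq_bins))
Y = X @ W_true + 0.5 * rng.standard_normal((n_samples, n_freq_bins))  # stand-in spectrogram

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

decoder = Ridge(alpha=1.0)        # simple linear decoding model
decoder.fit(X_tr, Y_tr)
Y_hat = decoder.predict(X_te)

# Score the reconstruction: correlation between decoded and actual values per frequency bin
corrs = [np.corrcoef(Y_hat[:, k], Y_te[:, k])[0, 1] for k in range(n_freq_bins)]
print(f"mean decoding correlation: {np.mean(corrs):.2f}")
```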

This sort of algorithmic translation has been used to recreate speech from brain recordings, but not music. The study also identified an entirely new relationship between rhythm and brain activity, and pinpointed where in the brain it takes place.

The results, which you may feel don't hold up to the original (or, on the flip side, might consider an improvement on the polarising classic rock tune), revealed previously unknown mechanisms in the brain that respond to rhythm when sound waves enter our ears.

It's a particularly exciting discovery for scientists looking for better ways to recreate prosody – the rhythm, stress, accent and intonation that words alone don't communicate – in speech (and music). Doing this successfully could be a breakthrough for people who have suffered a stroke or paralysis, or who have other verbal communication issues.

"It's a wonderful result," said Robert Knight, a neurologist and UC Berkeley professor of psychology in the Helen Wills Neuroscience Institute who conducted the study with postdoctoral fellow Ludovic Bellier. "One of the things for me about music is it has prosody and emotional content. As this whole field of brain machine interfaces progresses, this gives you a way to add musicality to future brain implants for people who need it, someone who's got ALS or some other disabling neurological or developmental disorder compromising speech output. It gives you an ability to decode not only the linguistic content, but some of the prosodic content of speech, some of the affect. I think that's what we've really begun to crack the code on."

The music drove responses in 347 of the electrodes, most of them located in the superior temporal gyrus (STG), the sensorimotor cortex (SMC) and the inferior frontal gyrus (IFG).
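One simple way to picture how "responsive" electrodes might be flagged is to correlate each electrode's activity with a feature of the music and keep those above a threshold. The sketch below does this on synthetic data; the envelope feature, threshold and counts are illustrative assumptions, not the statistics the study itself used.

```python
# Illustrative sketch only: flag "music-responsive" electrodes by correlating
# each electrode's signal with the song's amplitude envelope. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_electrodes = 5000, 2668                       # assumed sizes

# Stand-in for a feature of the music, e.g. its amplitude envelope
envelope = np.abs(np.sin(np.linspace(0, 60 * np.pi, n_samples)))

activity = rng.standard_normal((n_samples, n_electrodes))  # fake electrode signals
truly_responsive = rng.random(n_electrodes) < 0.13         # roughly 347 of 2,668
activity[:, truly_responsive] += 2.0 * envelope[:, None]   # inject envelope tracking

# Correlate each electrode with the music feature and threshold the result
corrs = np.array([np.corrcoef(activity[:, e], envelope)[0, 1]
                  for e in range(n_electrodes)])
responsive = corrs > 0.1
print(f"{responsive.sum()} of {n_electrodes} electrodes flagged as music-responsive")
```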

From this, the scientists noticed a distinct region within the STG responding to certain sounds, which they believe is the area that focuses on rhythm (in this case, the rhythm of the guitars on the track).

This graphic shows which electrodes were responsive to the music, pointing scientists in the direction of the new area of the brain processing rhythm

When they excluded the electrodes in this region from the reconstruction, the decoded sound degraded significantly, suggesting the area plays a key role in conveying prosody.
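That kind of check is essentially an ablation analysis: drop one group of electrodes from the decoding model and see how much reconstruction quality falls. Below is a minimal, hypothetical sketch of the idea on synthetic data; the region mask, model and scoring metric are assumptions for illustration, not the study's code.

```python
# Illustrative ablation sketch: compare decoding quality with and without
# the electrodes assigned to one region. Synthetic data; not the study's code.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_samples, n_electrodes, n_freq_bins = 5000, 347, 32       # assumed sizes

# Hypothetical anatomical mask: which electrodes sit in the region of interest
in_region = rng.random(n_electrodes) < 0.2

X = rng.standard_normal((n_samples, n_electrodes))
W = rng.standard_normal((n_electrodes, n_freq_bins))
W[~in_region] *= 0.3                                       # let the region carry most of the signal
Y = X @ W + rng.standard_normal((n_samples, n_freq_bins))

def decoding_score(X, Y):
    """Mean correlation between decoded and actual values across frequency bins."""
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)
    Y_hat = Ridge(alpha=1.0).fit(X_tr, Y_tr).predict(X_te)
    return float(np.mean([np.corrcoef(Y_hat[:, k], Y_te[:, k])[0, 1]
                          for k in range(Y.shape[1])]))

full_score = decoding_score(X, Y)
ablated_score = decoding_score(X[:, ~in_region], Y)        # drop the region's electrodes
print(f"full model r = {full_score:.2f}, without region r = {ablated_score:.2f}")
```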

The scientists, led by UC Berkeley's Ludovic Bellier, also believe this newly characterized cortical subregion in the temporal lobe will be important for future research and development of brain-machine interfaces, such as prosthetics that better reproduce the rhythm and melody of speech.

"Language is more left brain. Music is more distributed, with a bias toward right," Knight said.

"It wasn't clear it would be the same with musical stimuli," Bellier said. "So here we confirm that that's not just a speech-specific thing, but that's it’s more fundamental to the auditory system and the way it processes both speech and music."

You can listen to the combined results below, reconstructed from the brain activity of the 29 listeners as the song played. For those who aren't fans of the original, the abstract, dream-pop version taken straight from the brain might actually be preferable.

Scientists recreate Pink Floyd classic through listeners' brain activity

The research was published in the journal PLOS Biology.

Source: University of California, Berkeley

1 comment
akarp
people who are not scared of AI and what is coming...this year, next year...are not aware of the capabilities of AI. Mind reading is going to happen in the next year!