Human hearing is pretty dismal compared to that of animals like bats and dolphins, which use sound to navigate, but some blind people have reportedly learned to use echolocation to sense their surroundings. Now, research out of the Ludwig Maximilian University of Munich (LMU) suggests that almost anyone could pick up the skill and use echolocation to accurately estimate how big a room is. Using MRI scanners, the team also studied which parts of the brain activate during the process.

First, the team recorded the acoustic properties of a small, echoey chapel, before digitally manipulating this audio fingerprint to effectively make a virtual room sound larger or smaller. The aim of the project was to determine if people could be taught to use echolocation, clicking their tongues to figure out how big a given virtual space was.

"In effect, we took an acoustic photograph of the chapel, and we were then able to computationally alter the scale of this sound image, which allowed us to compress it or expand the size of the virtual space at will," says Lutz Wiegrebe, lead researcher on the study.
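The paper doesn't detail the signal processing, but the idea Wiegrebe describes maps onto a standard audio technique: capture a room's impulse response, time-stretch it to make echoes arrive later or earlier (as they would in a proportionally larger or smaller space), and convolve a click with the result. The sketch below illustrates this with a synthetic impulse response in place of the chapel recording; all names and parameters are illustrative assumptions, not from the study.

```python
import numpy as np

FS = 44_100  # sample rate in Hz (assumed)

def synthetic_ir(rt60=0.8, fs=FS, seed=0):
    """Stand-in for the measured chapel response: exponentially
    decaying noise, a common toy model of reverberation."""
    rng = np.random.default_rng(seed)
    n = int(rt60 * fs)
    t = np.arange(n) / fs
    # -6.91 = ln(10**-3): amplitude falls by 60 dB over rt60 seconds
    return rng.standard_normal(n) * np.exp(-6.91 * t / rt60)

def scale_room(ir, factor):
    """Time-stretch the impulse response by `factor`: echoes arrive
    later and decay more slowly, mimicking a larger room."""
    n_out = int(len(ir) * factor)
    x_old = np.linspace(0.0, 1.0, len(ir))
    x_new = np.linspace(0.0, 1.0, n_out)
    return np.interp(x_new, x_old, ir)

def auralize(click, ir):
    """Play the click 'inside' the virtual room via convolution."""
    return np.convolve(click, ir)

# Idealized tongue click: a single unit impulse
click = np.zeros(FS // 100)
click[0] = 1.0

small_room = auralize(click, scale_room(synthetic_ir(), 0.5))
large_room = auralize(click, scale_room(synthetic_ir(), 2.0))
```

In a setup like the one described, the convolved echo (minus the direct click) would be what the headphones play back after each live tongue click.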

The study used a fairly small sample of 12 people – 11 sighted and one blind from birth – each of whom was fitted with headphones and a microphone and placed in an MRI scanner.

To determine the size of a virtual room, each participant had to make tongue clicking sounds into the microphone, and the headphones would play back echoes generated from the "acoustic photograph" of the real building at different virtual sizes.

"All participants learned to perceive even small differences in the size of the space," reports Wiegrebe. The researchers found that subjects judged room size more accurately when they made the click sounds themselves in real time than when recorded tongue clicks were played back to them. The team says that one subject eventually managed to estimate the size of a virtual room to within four percent of the correct answer.

While the tests were taking place, the MRI scanner allowed the researchers to peer inside the brains of the participants. As the sound waves from the tongue clicks bounced off the virtual walls and returned to the person's ears, the auditory cortex was activated. This was followed shortly after by activation of the motor cortex, stimulating the tongue and vocal cords to produce more clicking sounds.

Surprisingly, the experiments undertaken with the blind subject revealed that the returning sounds were mostly processed in the visual cortex. "That the primary visual cortex can execute auditory tasks is a remarkable testimony to the plasticity of the human brain," says Wiegrebe. Activation of the visual cortex in sighted participants during echolocation tasks was also recorded, but found to be relatively weak.

While we'll never be as good at echolocation as our bat buddies, the results do seem to suggest that humans could be taught to navigate using sound with some modicum of success. The next step for the researchers is to use their findings to develop an echolocation training program for the visually impaired.

The research was published in the Journal of Neuroscience.

Source: LMU Munich