How sighted people learn echolocation

Estimating room size by tongue clicks works astonishingly well

Sighted people can also learn to orient themselves by sound and even estimate room sizes from echoes. © Jupiter Images / thinkstock

Hearing instead of seeing: Even sighted people can learn to perceive the space around them through sound. In one experiment, subjects achieved astonishing precision after only a short training period: they estimated the size of a room to within four percent. Brain scans revealed that both sensory and motor brain areas are active during echolocation in sighted persons.

Bats, some marine mammals and some birds perceive their environment through ultrasound. The echoes of their calls tell them where obstacles or prey are and allow them to orient themselves in space. We humans, by contrast, usually rely on our eyes or, at most, on technical aids.

However, many blind people have learned to orient themselves by means of echolocation. For example, they tap the floor with a cane or click with their tongue and then listen to the echoes this sound creates off obstacles or the walls of a room. Their brain uses the areas no longer needed for vision to process these signals.
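The physics behind this is simple: an echo that returns after a delay has traveled to the reflecting surface and back, so the distance follows from half the round-trip time. A minimal sketch (not from the study itself, just the underlying acoustics):

```python
# Speed of sound in air at roughly 20 degrees Celsius.
SPEED_OF_SOUND = 343.0  # meters per second

def distance_from_echo(delay_s: float) -> float:
    """Distance to a reflecting surface, given the round-trip echo delay.

    The sound travels to the surface and back, hence the factor of 2.
    """
    return SPEED_OF_SOUND * delay_s / 2.0

# A click whose echo returns after 20 milliseconds implies a wall
# about 3.43 meters away.
print(distance_from_echo(0.020))
```

This also shows why room size is audible at all: larger rooms produce longer echo delays, and the brain can learn to map those delays onto distances.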

Click echoes in the brain scanner

Less well understood so far, however, was how well sighted people can master echolocation and what happens in their brains when they do. Virginia Flanagin of the Ludwig-Maximilians-Universität München (LMU) and her colleagues have now investigated this. To do so, they first developed a virtual room: an acoustic environment that makes it possible to study echolocation inside the narrow tube of a magnetic resonance scanner.

To build it, the researchers recorded the acoustics of a chapel with microphones and reproduced the reverberation of this room over loudspeakers. By varying the echo delays, they could also change the size of the virtual room. In the test, sighted subjects and one blind subject were asked to estimate, with the help of clicks, how big the room was. The sounds were either generated by computer or produced by the participants with their own tongues.
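The idea of varying room size via echo delays can be illustrated with a toy impulse response. The sketch below is a simplified assumption, not the researchers' actual rendering pipeline: it simulates a room twice as large by stretching every reflection's arrival time by the same factor, as if all walls were twice as far away.

```python
import numpy as np

def scale_room_size(impulse_response: np.ndarray, factor: float) -> np.ndarray:
    """Simulate a room `factor` times larger by stretching echo delays.

    Resamples the impulse response so that every reflection arrives
    `factor` times later than in the original recording.
    """
    n_out = int(len(impulse_response) * factor)
    old_t = np.arange(len(impulse_response), dtype=float)
    new_t = np.arange(n_out) / factor
    return np.interp(new_t, old_t, impulse_response)

# Toy impulse response at 8000 Hz: the direct click plus one wall
# reflection arriving after 10 ms (sample index 80).
sr = 8000
ir = np.zeros(sr // 10)
ir[0] = 1.0                # direct sound
ir[int(0.010 * sr)] = 0.5  # echo after 10 ms

bigger = scale_room_size(ir, 2.0)
# In the doubled room, that echo now arrives after 20 ms (sample index 160).
```

Convolving a recorded click with such an impulse response is a standard way to place a sound in a virtual room; here only the delay scaling is shown.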

Amazingly accurate location

It turned out that even sighted people master echolocation surprisingly well after only brief training: "All participants were able to perceive even the smallest differences in room size," reports Lutz Wiegrebe from the LMU. One subject estimated the room sizes so precisely that his answers deviated from the actual size by at most four percent.

One finding stood out, however: when the sighted subjects had to rely on computer-generated clicks, they could hardly manage this at all. But when they were allowed to click with their own tongues, their echolocation improved significantly, the researchers report.

Sensory and motor areas coupled

An explanation emerged from functional magnetic resonance imaging (fMRI): "In echolocation, there is a very close coupling between the sensory and motor cortex," says Flanagin. Both areas were active when the subjects tried to probe their surroundings with tongue clicks, and only this combination of the two brain areas appears to enable good echolocation in sighted persons.

"Our data provide evidence that human echolocation is a pro-active process both in behavior and brain activity," the researchers note.

In the blind subject, by contrast, it was chiefly the visual cortex that responded. "It shows how plastic the human brain is. The primary visual cortex can apparently take on auditory tasks," says Wiegrebe. In the sighted subjects, however, the visual cortex hardly responded to the echolocation; it is evidently focused almost exclusively on visual tasks. (Journal of Neuroscience, 2017; doi: 10.1523/JNEUROSCI.1566-12.2016)

(Ludwig-Maximilians-University Munich, 26.01.2017 - NPO)