Bat-inspired glasses help the blind & vision-impaired ‘see’ using sound
Inspired by bats’ use of echolocation, researchers have developed smart glasses that transform visual information into unique sound representations, enhancing the ability of blind and vision-impaired people to navigate their surroundings. The technology could be life-changing for the visually impaired.
Assistive technology involves designing technologies that enable individuals with sensory disabilities to overcome barriers in their everyday lives. In particular, blindness or low vision (BLV) affects a person’s ability to perform activities of daily living and to engage in social activities and interactions.
A broad area of assistive technology research is concerned with using visual, haptic/tactile and auditory feedback as a means of augmenting the senses. Now, researchers at the University of Technology Sydney (UTS) have developed next-gen smart glasses that translate visual information into distinct sound icons, so-called “acoustic touch”, as a means of helping BLV people to ‘see’.
“Smart glasses typically use computer vision and other sensory information to translate the wearer’s surroundings into computer-synthesized speech,” said Chin-Teng Lin, one of the study’s co-authors. “However, acoustic touch technology sonifies objects, creating unique sound representations as they enter the device’s field of view. For example, the sound of rustling leaves might signify a plant, or a buzzing sound might represent a mobile phone.”
Echolocating bats emit a sound wave that bounces off an object, returning an echo that carries information about the object’s size and distance. Inspired by this, the researchers set about developing their smart glasses, which they’ve called a Foveated Audio Device (FAD).
The FAD comprised a set of augmented reality glasses and an OPPO Find X3 Pro Android phone, with the Unity Game Engine 2022 managing the camera and head-tracking input from the glasses and generating the audio output. In combination, this enabled the FAD to turn objects into distinct sound icons when they entered the device’s field of view.
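The core idea behind acoustic touch can be sketched in a few lines of code: detected objects are each assigned a distinct sound icon, and an icon plays only while its object sits within the wearer’s head-centred field of view. This is a minimal illustrative sketch, not the UTS implementation; the object labels, sound-icon names, field-of-view angle and `Detection` structure are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical mapping from detected object class to a distinct sound icon
# (the rustling-leaves and buzzing examples come from the article; the
# file names are invented for illustration).
SOUND_ICONS = {
    "plant": "rustling_leaves.wav",
    "phone": "buzzing.wav",
    "cup": "clink.wav",
    "book": "page_flip.wav",
}

@dataclass
class Detection:
    label: str      # object class reported by the vision model
    azimuth: float  # horizontal angle from the head's forward axis, in degrees

def sonify(detections, fov_degrees=60.0):
    """Return the sound icons for objects inside the field of view.

    Objects outside the assumed field of view, or with no icon
    assigned, are silently ignored.
    """
    half_fov = fov_degrees / 2.0
    return [
        SOUND_ICONS[d.label]
        for d in detections
        if d.label in SOUND_ICONS and abs(d.azimuth) <= half_fov
    ]

# A plant straight ahead is sonified; a phone well off to the side is not.
icons = sonify([Detection("plant", 5.0), Detection("phone", 80.0)])
```

In a real system the head-tracking data would update the azimuth continuously, so sweeping the head across a cluttered table would play each object’s icon in turn as it crossed the field of view.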
The researchers tested their glasses on 14 adult participants: seven who were BLV and seven blindfolded sighted participants who acted as a control. The study consisted of a training stage, a seated task in which the FAD was used to scan and sonify items on a table, and a standing task that explored the performance of the FAD while participants walked and searched for an item in a cluttered environment. Four items were used in the study: a bowl, a book, a cup and a bottle.
They found that the wearable device significantly enhanced the ability of BLV individuals to recognize and reach for objects without undue mental effort.
“The auditory feedback empowers users to identify and reach for objects with remarkable accuracy,” said Howe Yuan Zhu, lead and corresponding author of the study. “Our findings indicate that acoustic touch has the potential to offer a wearable and effective method of sensory augmentation for the visually impaired community.”
With some refinement, acoustic touch could become an integral part of assistive technologies, allowing BLV people to perceive and interact with their environment more effectively than before.
The study was published in the journal PLOS One.