
EYE 21 system lets the blind 'see' by assigning sounds to shapes

The experimental EYE 21 system assigns sounds to objects, allowing blind people to be aware of their surroundings (Photo: UPV)

Engineers from the Research Center for Graphic Technologies at Spain's Universitat Politècnica de València (UPV) have created an experimental system that allows the blind to be aware of their surroundings through sound. Called EYE 21, it consists of a pair of sunglasses with two built-in micro video cameras, a computer, and a pair of headphones. The approach is similar to that of sonar-based systems, which have been used toward the same goal.

The two cameras analyze the space in front of them, creating a three-dimensional model of it. Sounds are assigned to the various surfaces in that space and are played back through the headphones. By listening to this mosaic of sounds, blind users are reportedly able to "hear space," with their brains turning the sounds into shapes.
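To make the idea concrete, here is a minimal Python sketch of this kind of depth-to-sound mapping. It is not the actual EYE 21 algorithm, which UPV has not detailed here; the choice of pitch, stereo pan, and loudness mappings, and all parameter values, are illustrative assumptions only.

```python
# A minimal sketch of sonifying a depth map (NOT the EYE 21 algorithm):
# horizontal position -> stereo pan, vertical position -> pitch,
# proximity -> loudness. All parameters are illustrative assumptions.

import numpy as np

SAMPLE_RATE = 44100
TONE_SECONDS = 0.04          # short tone per surface patch
MAX_RANGE_M = 10.0           # ignore surfaces beyond this distance

def sonify_depth_map(depth, grid=(8, 8)):
    """Map a 2D depth image (meters) to one stereo audio frame.

    depth: 2D numpy array of distances in meters.
    grid:  how coarsely the scene is sampled into sound sources.
    """
    rows, cols = grid
    h, w = depth.shape
    n = int(SAMPLE_RATE * TONE_SECONDS)
    t = np.arange(n) / SAMPLE_RATE
    stereo = np.zeros((n, 2))

    for r in range(rows):
        for c in range(cols):
            patch = depth[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            d = float(np.nanmean(patch))
            if not np.isfinite(d) or d > MAX_RANGE_M:
                continue                              # nothing audible here
            pitch = 1200 - 900 * (r / (rows - 1))     # higher in view -> higher pitch
            gain = 1.0 - d / MAX_RANGE_M              # closer -> louder
            pan = c / (cols - 1)                      # 0 = far left, 1 = far right
            tone = gain * np.sin(2 * np.pi * pitch * t)
            stereo[:, 0] += (1.0 - pan) * tone
            stereo[:, 1] += pan * tone

    peak = np.abs(stereo).max()
    return stereo / peak if peak > 0 else stereo      # normalize to avoid clipping
```

Playing one such frame per camera frame would yield a continuously updating "mosaic of sounds" in the spirit the article describes, though the real system's encoding may differ substantially.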


A somewhat similar approach is used in Virginia Tech's AirPix system, which creates a tactile representation of a blind driver's surroundings; users feel it by holding a hand over a matrix of small holes blowing pressurized air.

There are presently four EYE 21 prototypes, with ten more set for testing. The UPV technology was developed as part of the Cognitive Aid System for Blind People (CASBLiP) project.

5 comments
TogetherinParis
I proposed the Callish computer-generated language to do much more than this back in the 1980s (trajectories, etc.). It would allow the blind to become air-traffic controllers. It's basically The Force with the blinders down.
Peter Meijer
The EYE 2021 website http://www.eye2021.com/ currently reads "Currently, EYE21 is the first and only system in the world of mobility aids for the blind people which allows its use in any environment". I'm not sure what the authors were thinking here, but The vOICe technology for the blind also sounds live camera views, and is already globally available and in use in over 125 countries. Of course I do welcome the EYE21 system as yet another (future) option for blind people, because different people have different needs and interests.
Best regards, Peter Meijer
Seeing with Sound - The vOICe http://www.seeingwithsound.com
gperis
Sorry Peter, your comments are correct. We will try to correct it as soon as possible.
I would like to explain the difference from my point of view:
As far as I know your system, EYE21 works in a different way. The vOICe is an excellent system for representing images with really good quality from the point of view of a blind person, while EYE2021 works by analyzing the environment and giving back a 3D acoustic representation of the space, at 25 frames/second, of objects closer than 10 meters. So The vOICe makes image representations, and EYE21 makes shape representations, which is really different. EYE2021 is like giving a blind person a sense similar to the one a dolphin has.
EYE2021: the first units were delivered recently, and use of the system by people who have been blind from birth is giving better results than expected.
Please correct me if I am wrong on the description of your device.
Peter Meijer
Thank you for getting back on this, Guillermo. I appreciate it. I agree that the mapping principles of EYE21 are different from those of The vOICe, which is what makes EYE21 an interesting alternative to consider. Images contain shape information too, and hence The vOICe sounds shapes as well, albeit encoded differently. Effects of visual shapes as represented by The vOICe on the brain have been studied at Harvard Medical School and the Montreal Neurological Institute; see for instance http://brain.huji.ac.il/publications/Amedi_at_al_vOICe_Nature_Neuroscience07.pdf and http://ww3.aievolution.com/hbm1001/files/content/abstracts/175651/1581WThAM_Kim.pdf
Although we lack convenient and affordable stereo camera glasses (I love the beautiful design of your stereo camera glasses!), The vOICe software is prepared to process stereo camera input, either to analyze the view and sound a depth map (e.g. up to 10 meters, depending on the stereo camera), or alternatively to sound the left camera view to the left ear and the right camera view to the right ear, as described at http://www.seeingwithsound.com/binocular.htm
However, EYE21's 25 fps give more immediate feedback, whereas The vOICe can sound images at up to, for instance, 8 fps, but only at the expense of a severe degradation in effective resolution. So it will be interesting to learn how EYE21 handles a number of real-life situations. The two systems might even prove complementary. I hope you will put some demo sounds and corresponding 3D images online, so that readers can get an impression of how EYE21 sounds for some basic representative views. Looking forward to learning more about your system.
Best regards,
Peter Meijer
Turgay Caliskan
Hello, no one is talking about how much this costs, or whether insurance covers it. If anyone has some info, please let me know.