AiSee wearable tells blind users what they're holding in their hand

Assoc. Prof. Suranga Nanayakkara (left) and visually impaired student Mark Myres put AiSee to the test

There are many situations in which blind people don't necessarily need to see what they're holding; they just need it described to them. An experimental new wearable device, known as AiSee, is designed to do exactly that.

Developed over the past five years by a team of scientists at the National University of Singapore, AiSee looks like a regular set of bone-conduction earphones joined together by a band that goes around the back of the wearer's neck. The technology is very much intended to keep users from feeling self-conscious, as might be the case if they were wearing something more noticeable such as special "smart glasses."

One of the earphones incorporates a forward-facing 13-megapixel camera which takes in the user's field of view, while the other one has a touchpad interface on its outer surface. A microprocessor and a lithium battery are located in the back of the device, which is wirelessly connected to the internet.

The AiSee headset alongside some of the products that it's currently able to identify

When the user picks up an item – such as when they're grocery shopping – they take a photo of it with the built-in camera. That image is processed in real time via cloud-based AI algorithms, which analyze data such as the shape, size and color of the item, along with any text printed on its labels.
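The article doesn't detail how the cloud-side matching works, but the described flow (extract shape, size, color and label text, then look up a known product) can be sketched roughly as follows. This is a purely illustrative assumption; the function names, feature set, and catalog are hypothetical, not the actual NUS implementation.

```python
# Hypothetical sketch of the recognition step described above.
# All names and the tiny product catalog are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class ImageFeatures:
    """Features the cloud AI might extract from the camera image."""
    shape: str
    color: str
    label_text: str


# Stand-in for a cloud-hosted catalog of known products.
KNOWN_PRODUCTS = {
    ("cylinder", "red", "cola"): "a can of cola",
    ("box", "yellow", "cornflakes"): "a box of cornflakes",
}


def identify(features: ImageFeatures) -> str:
    """Match extracted features against the known-product catalog."""
    key = (features.shape, features.color, features.label_text.lower())
    description = KNOWN_PRODUCTS.get(key)
    if description is None:
        return "Sorry, I don't recognize this item."
    return f"You are holding {description}."


print(identify(ImageFeatures("cylinder", "red", "Cola")))
```

In practice the matching would involve trained vision models and OCR rather than an exact-key lookup, but the overall shape, features in, spoken description out, is the same.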

If a match for a known object is found, the user is told what it is via a synthetic voice in the earphones. Should they require more information, they simply ask aloud, and the AI attempts to provide an answer.

Importantly, AiSee doesn't need to be linked to a smartphone or any other device, keeping everything that much simpler. Additionally, because the bone-conduction earphones don't actually cover the ears, users are still able to hear the world around them.

The scientists are now working on making the technology more affordable and ergonomic, plus they hope to boost the processing speed and improve the object recognition algorithms.

"At present, visually impaired people in Singapore do not have access to assistive AI technology of this level of sophistication," said the lead scientist, Assoc. Prof. Suranga Nanayakkara. "Therefore, we believe that AiSee has the potential to empower visually impaired people to independently accomplish tasks that currently require assistance."

The device is demonstrated in the video below.

AiSee: AI-powered ‘eye’ for visually impaired people to ‘see’ objects

Source: National University of Singapore

1 comment
Hasler
Impatient to try out this device. This concept will greatly help anyone with poor sight as well as those unable to read.