Although many of us may have forgotten about Google Glass, the technology now forms the basis of a set of glasses designed to assist the blind. Known as Envision Glasses, they use AI to verbally tell their wearer what they're looking at.
Announced in their newest form this week at the CSUN Assistive Technology Conference, Envision Glasses are paired with an iOS/Android smartphone app during setup, but function largely independently from that point on. Their software runs on a Qualcomm quad-core processor within Google Glass Enterprise Edition 2 hardware.
The idea is that whenever the user needs to know what's in front of them, they start by finger-swiping the right-hand arm of the glasses to select a mode, guided by synthetic speech feedback from the integrated speaker. Once a mode has been selected, the user double-taps the arm to capture an image, which is processed by the system's AI-based algorithms.
Depending on the mode selected, the glasses can instantly read and speak short pieces of text such as street signs (in over 60 languages); scan longer pieces of text such as book pages, then speak them back at the user's convenience; or place an assistive video call, in which a sighted person verbally guides the user while viewing a real-time feed from the glasses.
Perhaps more impressively, the glasses are reportedly also capable of describing the general scene in front of the wearer; identifying different colors (useful when doing laundry or buying clothes); and searching for a specific person or object, with a beep signaling when that individual or item is in front of the 8-megapixel camera.
Envision Glasses are also IP53 water-resistant (protected from water spray), tip the scales at approximately 46 g, and should run for four to five hours per charge of their lithium battery. They can be purchased now from their Netherlands-based manufacturer, and are priced at €3,269 (about US$3,607).
Source: Envision