OLED data glasses controlled with eye movements
Imagine that you’re a mechanic whose hands are covered in grease, and you’re trying to follow repair instructions. Every time you need to turn the page or advance the screen, you have to put down your tools and wipe your hands. That’s why scientists from the Fraunhofer Center for Organics, Materials and Electronic Devices Dresden (COMEDD) have developed glasses that allow the wearer to flip pages on a digital document using nothing but their eyes.
The glasses are intended not just for mechanics, but also for technicians, surgeons, or anyone else who has a need for hands-free document navigation.
The insides of the lenses incorporate a combination of photodiodes and OLED pixels. The photodiodes act as a camera, registering the direction of the wearer’s eye movements, while the OLED pixels combine to form a display that is overlaid on the view through the glasses.
When working on the task at hand, the user can easily see what they’re doing. If they look up “as if at the horizon,” however, they will see the document displayed as if it’s being projected at a size of about one meter (3.3 feet) in front of them. To turn pages, they just glance at an arrow within that display – by contrast, the much-anticipated Google glasses will apparently require the user to tilt their head to control the display.
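The page-turning interaction described above amounts to detecting a sustained glance at an on-screen arrow. Below is a minimal sketch of that logic, assuming the eye tracker reports normalized (x, y) gaze coordinates at a fixed sample rate; the region coordinates, dwell threshold, and `Pager` class are illustrative inventions, not details of the Fraunhofer system.

```python
# Hypothetical gaze-dwell page turner: a glance must rest on an arrow
# region for DWELL_SAMPLES consecutive samples before the page changes,
# so ordinary reading saccades don't trigger accidental turns.

DWELL_SAMPLES = 15  # e.g. 0.5 s at a 30 Hz gaze sample rate (assumed)

def in_region(gaze, region):
    """Return True if gaze point (x, y) lies inside region (x0, y0, x1, y1)."""
    x, y = gaze
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

class Pager:
    """Advances or rewinds pages when the gaze dwells on an arrow region."""
    NEXT_ARROW = (0.85, 0.40, 1.00, 0.60)  # right edge of the virtual display
    PREV_ARROW = (0.00, 0.40, 0.15, 0.60)  # left edge

    def __init__(self, pages):
        self.pages = pages
        self.page = 0
        self._dwell = 0
        self._target = None

    def feed(self, gaze):
        """Feed one gaze sample; turn the page after a sustained dwell."""
        if in_region(gaze, self.NEXT_ARROW):
            target = 1
        elif in_region(gaze, self.PREV_ARROW):
            target = -1
        else:
            target = None

        if target is not None and target == self._target:
            self._dwell += 1
        else:
            self._dwell = 1 if target is not None else 0
            self._target = target

        if target is not None and self._dwell >= DWELL_SAMPLES:
            self.page = max(0, min(self.pages - 1, self.page + target))
            self._dwell = 0  # require a fresh dwell for the next turn
        return self.page
```

Feeding fifteen consecutive samples inside the next-arrow region advances from page 0 to page 1, while a single stray glance leaves the page unchanged.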
The chips used in the Fraunhofer glasses are capable of simultaneously sending and receiving information – they wirelessly send eye movement data to a linked computer, while also receiving data from that same computer, in the form of the content being displayed. A Linux or Windows machine is required.
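The duplex exchange described above can be pictured as a simple framed protocol: gaze samples go up to the host, display content comes back down. The framing below (message-type byte, little-endian float layout) is entirely an assumption for illustration; the article does not describe the actual wire format.

```python
# Hypothetical packet framing for the uplink side of such a link:
# one message-type byte followed by the normalized gaze coordinates.
import struct

GAZE_FMT = "<Bff"  # type byte + x, y as little-endian 32-bit floats
MSG_GAZE = 0x01

def encode_gaze(x, y):
    """Pack one gaze sample for the uplink to the host computer."""
    return struct.pack(GAZE_FMT, MSG_GAZE, x, y)

def decode_gaze(packet):
    """Unpack a gaze packet on the host (Linux or Windows) side."""
    msg_type, x, y = struct.unpack(GAZE_FMT, packet)
    if msg_type != MSG_GAZE:
        raise ValueError("unexpected message type")
    return x, y
```

A fixed binary layout like this keeps per-sample overhead small, which matters if gaze data is streamed continuously at tens of samples per second.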
COMEDD developed the technology in a partnership with the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation, and near-the-eye tech firm TRIVISIO.
A microphone that recognises Morse-code-style commands, a nose-wrinkle detector, or almost anything else would make more sense than eye movement for turning pages. Eye movement is better suited to 2D navigation.
A mechanic would have a much better chance of getting a cheap computer and a webcam and using gestures to control the display.
That would save the workshop owner from having to buy new copies of the manuals that would constantly be getting ruined.