Smartphone users with limited vision often use the phone's zoom feature, enlarging one section of the display to make it easier to see. The problem is that it can be difficult to keep track of which part of the overall display they're zoomed in on. That's why researchers from Harvard Medical School and Massachusetts Eye and Ear have developed a Google Glass-based alternative.
In the new system, users navigate across the phone's display via head movements: moving the head up takes them to the top of the screen, moving it to the left takes them to the left side, and so on. The glasses correspondingly display an enlarged view of that part of the phone's screen.
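The head-driven panning described above can be sketched as a simple mapping from head angles to a point on the phone's screen. This is only an illustrative sketch; the function name, angle ranges, and screen dimensions are assumptions for the example, not details from the researchers' paper:

```python
# Illustrative sketch of head-to-viewport mapping: turn head angles
# into the screen point the glasses should show a close-up of.
# The comfortable angle ranges (max_yaw, max_pitch) are assumed values.

def viewport_center(yaw_deg, pitch_deg, screen_w, screen_h,
                    max_yaw=30.0, max_pitch=20.0):
    """Map head yaw/pitch (in degrees) to a point on the phone screen.

    Turning the head left/right pans the view horizontally; tilting
    up/down pans vertically. Angles beyond the assumed range are clamped
    so the viewport stops at the screen edge.
    """
    nx = max(-1.0, min(1.0, yaw_deg / max_yaw))      # -1 = far left, 1 = far right
    ny = max(-1.0, min(1.0, pitch_deg / max_pitch))  # -1 = top, 1 = bottom
    cx = (nx + 1.0) / 2.0 * screen_w
    cy = (ny + 1.0) / 2.0 * screen_h
    return cx, cy
```

With a head held level, the mapping returns the center of the screen; turning fully left and tilting fully up pans to the top-left corner.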
The idea is that users can still get a sense of context by looking directly at the phone's screen for a wide view of the webpage or other content they're exploring. To get a closer, clearer look at any part of that content, they simply move their head in the appropriate direction to bring up a close-up on the glasses.
In lab tests, two groups of volunteers had to complete the same smartphone-based tasks – one group used the standard zoom feature to view the phone's screen, while the other group used the Google Glass-based system. Overall, the Glass-using group was 28 percent faster.
The researchers are now looking at incorporating other head gestures into the system, to perform tasks other than just moving within the display.
You can see the system in use in the video below. A paper on the research was recently published in the journal IEEE Transactions on Neural Systems and Rehabilitation Engineering.
Source: Massachusetts Eye and Ear