Virtual Reality

Stanford researchers present adaptive display prototypes for virtual reality

Leading VR headsets like the Oculus Rift and HTC Vive don't have adaptive displays, but researchers at Stanford hope to change that
Will Shanklin/New Atlas

Researchers at Stanford University have built some VR display prototypes that adapt to the eyes of each wearer, which could pave the way for more comfortable virtual experiences for everyone, regardless of age, eyesight or focusing problems.

With current VR headset technology, discomfort, eye strain and even nausea are pervasive problems. That's largely because, unlike the human eye, VR headsets have fixed-focus displays: no matter how near or far a virtual object is meant to appear, the light reaching the eye always comes from a screen at the same optical distance. In normal vision, our eyes refocus on whatever we're looking at, and objects nearer or farther than that point go soft. The discrepancy between the way our eyes work and the static focus of the display – often called the vergence-accommodation conflict – leads to discomfort and a lack of immersion.
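To make that discrepancy concrete, here's a minimal sketch in Python. The numbers are illustrative assumptions, not figures from the Stanford study; they show how the focusing demand on the eye, measured in diopters, differs between a fixed display plane and a nearby virtual object.

def accommodation_demand_diopters(distance_m: float) -> float:
    """Accommodation demand in diopters is the reciprocal of viewing distance in meters."""
    return 1.0 / distance_m

# A typical headset places its virtual image at one fixed optical distance;
# 2 m is an assumed ballpark here, not a measured value from the study.
display_demand = accommodation_demand_diopters(2.0)      # 0.5 D

# A virtual object rendered to appear 0.3 m away asks the eyes to converge as
# if it were near, yet the light still focuses at the fixed display plane.
near_object_demand = accommodation_demand_diopters(0.3)  # ~3.3 D

print(f"Focus mismatch: {near_object_demand - display_demand:.1f} diopters")  # ~2.8 D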

Stanford's Computational Imaging Lab has presented two adaptive display prototypes to tackle this issue. One has liquid-filled lenses that change the display focus when squeezed (similar to concepts we've seen in other emerging optical technology, such as glycerin-filled adaptive glasses). The other takes a more mechanical route, physically moving the display nearer to or farther from the wearer's eyes.
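Here's a rough sketch of how those two routes relate, treating the headset optic as an ideal thin lens – a simplifying assumption, not the lab's actual implementation. Given the depth the wearer is looking at, a tunable lens can change its power, or a fixed-power lens can have its display shifted, to place the virtual image at that depth.

def lens_power_for_depth(display_dist_m: float, gaze_depth_m: float) -> float:
    """Tunable-lens route: lens power (diopters) that places the virtual image
    of a display sitting display_dist_m behind the lens at the gaze depth."""
    return 1.0 / display_dist_m - 1.0 / gaze_depth_m

def display_distance_for_depth(lens_power_d: float, gaze_depth_m: float) -> float:
    """Moving-display route: distance (m) to place the display behind a
    fixed-power lens so its virtual image lands at the gaze depth."""
    return 1.0 / (lens_power_d + 1.0 / gaze_depth_m)

# Assumed example numbers: a 5 cm lens-to-display gap and a gaze depth of 0.5 m.
print(lens_power_for_depth(0.05, 0.5))        # ~18 D of lens power needed
print(display_distance_for_depth(18.0, 0.5))  # ~0.05 m display position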

Both prototypes are supported by unique eye-tracking technology that determines where the wearer is trying to look, and software that responds by adjusting the display to match. It sounds like it's working toward a goal similar to foveated rendering, another eye-tracking-driven technique that Oculus' research head Michael Abrash discussed at length last October.
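As a simplified illustration of the eye-tracking side – an assumption about how gaze depth could be recovered, not a description of Stanford's actual pipeline – the depth of the point the wearer is fixating can be estimated from how much the two eyes rotate inward toward each other:

import math

def vergence_depth_m(ipd_m: float, left_angle_deg: float, right_angle_deg: float) -> float:
    """Estimate fixation depth from each eye's inward gaze rotation.
    Angles are measured from straight ahead; inward rotation is positive."""
    total_vergence = math.radians(left_angle_deg + right_angle_deg)
    if total_vergence <= 0:
        return float("inf")  # gaze rays parallel: the wearer is looking far away
    # Depth of the point where the two gaze rays cross, assuming symmetric fixation.
    return (ipd_m / 2.0) / math.tan(total_vergence / 2.0)

# Assumed example: a 63 mm interpupillary distance with each eye rotated 3 degrees
# inward corresponds to a fixation point roughly 0.6 m in front of the wearer.
print(round(vergence_depth_m(0.063, 3.0, 3.0), 2))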

The prototypes were tested on 173 study participants, aged 21 to 64, and researchers found that they increased comfort across a variety of vision characteristics. According to Stanford, the adaptive software has been able to successfully accommodate near- and far-sighted users, but it has not yet been able to correct for astigmatism.

While prototypes like these hint at a more comfortable, immersive future, we're still going to hang onto other nausea-fighting tech for now.

Source: Stanford
