Virtual Reality

EgoTouch tech lets VR users' palms serve as touchscreen interfaces

The EgoTouch system works with existing VR headsets, without any modifications

In the real world, you wouldn't want to carry a handheld controller all the time, or have menus constantly popping up in front of your face – so why put up with those things in VR worlds? With EgoTouch you don't have to, as it puts an interface on the palm of your hand.

Currently in prototype form, the EgoTouch system is being developed by PhD student Vimal Mollyn and colleagues at Carnegie Mellon University's Human-Computer Interaction Institute. It revisits the concept of placing touchscreen-like tactile interfaces on VR users' virtual bodies.

Previous studies have shown that such interfaces have significant speed, accuracy, and ergonomic benefits over the commonly used "in-air" interfaces that are superimposed over the user's view of the VR world.

One problem with existing on-body interfaces, however, is that they typically require special depth-sensing cameras. Those cameras track the location of the real-world body part on which the interface is displayed in the VR world, along with the location of the real-world finger doing the selecting on that interface.

By contrast, EgoTouch simply utilizes a VR headset's existing RGB optical camera.

As the user's real-world finger presses into their real-world palm, the camera picks up the resulting shadows and skin deformations. By mapping the locations of those visual indicators onto the virtual palm-displayed interface, it's possible to determine which options are being selected in the VR world.
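In code, that mapping step might look something like the following minimal Python sketch. This is not the CMU implementation; the `Touch` detection output, the tracked `palm_bbox`, and the 2x3 button grid are all illustrative assumptions.

```python
# Minimal sketch of the mapping step described above (not the CMU code).
# Assumes a hypothetical detector that returns the touch point in camera
# pixels, inferred from shadow and skin-deformation cues at the fingertip.

from dataclasses import dataclass

@dataclass
class Touch:
    u: float  # horizontal pixel coordinate of the detected press
    v: float  # vertical pixel coordinate of the detected press

def to_palm_uv(touch: Touch, palm_bbox: tuple[float, float, float, float]) -> tuple[float, float]:
    """Normalize a camera-space touch into palm-relative coordinates (0..1)."""
    x0, y0, x1, y1 = palm_bbox  # palm bounding box tracked in the same frame
    return (touch.u - x0) / (x1 - x0), (touch.v - y0) / (y1 - y0)

def pick_option(palm_uv: tuple[float, float], grid: tuple[int, int] = (2, 3)) -> int:
    """Map palm-relative coordinates to a cell of a virtual on-palm button grid."""
    cols, rows = grid
    col = min(int(palm_uv[0] * cols), cols - 1)
    row = min(int(palm_uv[1] * rows), rows - 1)
    return row * cols + col  # index of the selected on-palm button
```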

To train the algorithm, Mollyn's team had a group of volunteers press their index finger into various locations on their palm while wearing a head-mounted RGB camera. A touch sensor, which couldn't be seen by the camera, ran along the underside of their finger.

By matching the camera data with the sensor data, the algorithm learned which visual cues corresponded to which locations, intensities, and durations of touch. What's more, the volunteers had differing skin tones and hair densities, and recorded the data under a variety of lighting conditions.
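The paper's actual pipeline isn't reproduced here, but the pairing of camera frames with sensor ground truth can be sketched roughly as follows; the data structures, timestamps, and 20-millisecond tolerance are illustrative assumptions, not details from the study.

```python
# Hedged sketch of the labeling idea: pair each camera frame with the
# ground truth from the finger-mounted touch sensor recorded at (nearly)
# the same moment. Names and shapes are illustrative.

def build_dataset(camera_frames, sensor_readings, tolerance=0.02):
    """Match frames to sensor samples by timestamp (both sorted by time).

    camera_frames:   list of (timestamp, image) tuples
    sensor_readings: list of (timestamp, is_touching, pressure) tuples
    Returns (image, label) pairs where label = (is_touching, pressure).
    """
    dataset, j = [], 0
    for t_frame, image in camera_frames:
        # Advance to the latest sensor sample not after this frame.
        while j + 1 < len(sensor_readings) and sensor_readings[j + 1][0] <= t_frame:
            j += 1
        t_sensor, is_touching, pressure = sensor_readings[j]
        if abs(t_sensor - t_frame) <= tolerance:  # within 20 ms
            dataset.append((image, (is_touching, pressure)))
    return dataset
```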

When tested, EgoTouch proved to be 96% accurate at detecting touch on the palm, with a false positive rate of about 5%. It was also 98% accurate at determining whether a touch was soft or hard, and could recognize touch actions such as pressing down, lifting up, and dragging.
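For readers who want those figures in concrete terms, they are standard binary-classification measures. A quick sketch of how such accuracy and false-positive numbers would be computed from per-frame predictions (the inputs here are illustrative, not the study's data):

```python
# Accuracy and false-positive rate for binary touch detection.

def touch_metrics(predicted: list[bool], actual: list[bool]) -> tuple[float, float]:
    """Return (accuracy, false_positive_rate) for touch/no-touch labels."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    false_pos = sum(p and not a for p, a in zip(predicted, actual))
    negatives = sum(not a for a in actual)
    accuracy = correct / len(actual)
    fpr = false_pos / negatives if negatives else 0.0
    return accuracy, fpr

# A ~5% false-positive rate means that of the moments with no actual touch,
# about 1 in 20 is still flagged as a touch.
```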

"For the first time, we have a system that just uses a camera that is already in all the headsets. Our models are calibration-free, and they work right out of the box," says Mollyn. "Now we can build off prior work on on-skin interfaces and actually make them real."

A paper on the research was recently published in Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology. A basic version of EgoTouch is demonstrated in the following video.

EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras

Source: Carnegie Mellon University
