
VRoxy system pushes telepresence beyond just looking and talking

Mose Sakashita, a Cornell University doctoral student in the field of information science, with the VRoxy system
Cornell's Prof. François Guimbretière, working on the VRoxy system

When it comes right down to it, most telepresence robots are essentially just remote-control tablets that can be steered around a room. The VRoxy system is different in that its robot replicates the user's movements, plus it auto-pilots itself to different locations within a given space.

The system is being developed by a team of researchers from Cornell and Brown universities.

In its current functional prototype form, the VRoxy robot consists of a tubular plastic truss body with motorized omnidirectional wheels on the bottom and a video screen at the top. Also at the top are a robotic pointer finger along with a Ricoh Theta V 360-degree camera.

The remotely located user simply wears a Quest Pro VR headset in their office, home or pretty much anyplace else. This differentiates VRoxy from many other gesture-replicating telepresence systems, in which relatively large, complex setups are required at both the user's and viewer's locations.

Via the headset, the user can switch between an immersive live view from the robot's 360-degree camera and a pre-scanned 3D map view of the entire space in which the bot is located. Once they've selected a destination on that map, the robot autonomously makes its way over (assuming it's not there already). When it arrives, the headset automatically switches back to the first-person view from the bot's camera.
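The hand-off between the two views can be pictured as a small state machine. The sketch below is purely illustrative and not drawn from the VRoxy paper; the robot and headset interfaces, and every name in it, are hypothetical placeholders for whatever the real system uses.

```python
# Minimal sketch of the view-switching / auto-navigation flow described above.
# All interfaces here (robot, headset, method names) are assumptions, not VRoxy APIs.
from dataclasses import dataclass
from enum import Enum, auto


class ViewMode(Enum):
    LIVE_360 = auto()   # immersive feed from the robot's 360-degree camera
    MAP_3D = auto()     # pre-scanned 3D map of the remote space


@dataclass
class Waypoint:
    x: float
    y: float


class TelepresenceSession:
    def __init__(self, robot, headset):
        self.robot = robot        # hypothetical drive/camera interface
        self.headset = headset    # hypothetical VR display interface
        self.mode = ViewMode.LIVE_360

    def open_map(self):
        """User switches from the live camera feed to the pre-scanned map view."""
        self.mode = ViewMode.MAP_3D
        self.headset.show_map()

    def go_to(self, destination: Waypoint):
        """User picks a destination on the map; the robot drives itself there,
        then the headset snaps back to the first-person camera view."""
        if not self.robot.is_at(destination):
            self.robot.navigate_autonomously(destination)  # path planning stays on the robot
        self.mode = ViewMode.LIVE_360
        self.headset.show_live_feed(self.robot.camera_stream())
```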

Not only does this functionality spare the user the hassle of having to manually "drive" the robot from place to place, it also keeps them from experiencing the vertigo that may come with watching a live video feed from the bot while it's on the move.

Cornell's Prof. François Guimbretière, working on the VRoxy system

The VR headset monitors the user's facial expressions and eye movements, and reproduces them in real time on an avatar of the user, which is displayed on the robot's screen. The headset also registers head movements, which the robot mimics by panning or tilting the screen accordingly via an articulated mount.

And when the user physically points their finger at something within their headset view, the robot's pointer finger moves to point in that same direction in the real world. Down the road, the researchers hope to equip the robot with two user-controlled arms.
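That gesture-mirroring pipeline boils down to a couple of coordinate mappings. The snippet below is a simplified sketch, not taken from the paper: it assumes the headset reports head yaw/pitch in degrees and a pointing direction relative to the camera view, and the mount limits are made-up values rather than VRoxy specifications.

```python
# Rough sketch of mirroring the user's head pose and pointing gesture onto the robot.
# Interfaces, frame conventions and limits are assumptions for illustration only.
import numpy as np

PAN_LIMIT_DEG = 90.0    # assumed mechanical limits of the articulated screen mount
TILT_LIMIT_DEG = 45.0


def mirror_head_pose(headset_yaw_deg: float, headset_pitch_deg: float):
    """Map the user's head orientation to pan/tilt commands for the screen mount."""
    pan = float(np.clip(headset_yaw_deg, -PAN_LIMIT_DEG, PAN_LIMIT_DEG))
    tilt = float(np.clip(headset_pitch_deg, -TILT_LIMIT_DEG, TILT_LIMIT_DEG))
    return pan, tilt


def mirror_pointing(direction_in_headset: np.ndarray, headset_to_robot: np.ndarray) -> np.ndarray:
    """Re-express the user's pointing direction in the robot's frame so the
    robotic finger can aim at the same direction in the real world."""
    d = direction_in_headset / np.linalg.norm(direction_in_headset)
    return headset_to_robot @ d   # 3x3 rotation from headset frame to robot frame


if __name__ == "__main__":
    print(mirror_head_pose(30.0, -10.0))
    print(mirror_pointing(np.array([0.0, 0.0, -1.0]), np.eye(3)))
```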

In a test of the existing prototype, the team had the VRoxy robot travel back and forth along a hallway between a lab and an office, where the remote user collaborated with different people on different tasks.

The study is being led by Cornell University's Mose Sakashita, Hyunju Kim, Ruidong Zhang and François Guimbretière, along with Brown University's Brandon Woodard. It is described in a paper presented at the ACM Symposium on User Interface Software and Technology in San Francisco.

Source: Cornell University

1 comment
Daishi
Anyone who watched the Apple VR demo or the Lex Fridman and Mark Zuckerberg VR interview will have seen that a few companies (Nvidia included) are working on making scans of the user and then tracking their movement inside VR goggles to digitally re-create them on the other end, with accurate facial expressions, as a realistic avatar. It seems like what they are doing here is bringing this to telepresence so the people on the other side don't need to be wearing VR goggles. Autonomous navigation is cool but might push the cost up a bit. For some people/companies, autonomous security/inspection bots could double as telepresence robots when people need to take them over.