MIT system uses VR to get inside a robot's head
The Oculus Rift has certainly excited corners of the gaming community, but the virtual reality system could have very practical applications, too. Scientists at MIT have developed a system that uses the Oculus Rift to pull the strings on a teleoperated robot, one they say could one day allow blue-collar manufacturing jobs to be carried out from hundreds of miles away.
One of the main applications for teleoperated robots, where a robot is controlled wirelessly and remotely by a human in another location, is carrying out activities in space. But they can have an impact here on Earth too, in areas like disaster response and surgery.
The team at MIT's Computer Science and Artificial Intelligence Laboratory developed the new system with manufacturing jobs in mind. These have always required a hands-on approach, with workers needing to be physically present to operate machinery, but the scientists are exploring how they could be carried out remotely through a clever combination of VR and robotics.
Their system is based on something known as the "homunculus model of mind," the idea of a small human sitting inside the brain and controlling the actions of the body. To achieve this, the team created a virtual control room, which the user can leap into by donning the Oculus Rift headset and Touch controllers.
Here they can interact with virtual controls that correspond with the real-life hand grippers of the Baxter industrial robot, using them to pick up, move and retrieve items. The user relies on a set of sensor displays to guide their movement, which offer a live view of the robot's arms in the real-life factory space.
The team says its system solves a couple of drawbacks of current approaches to virtual work environments. It requires less data to be transmitted, because the user shares the robot's view and an entire 3D model copy of the robot therefore doesn't need to be built. And it should avoid some of the nausea and headaches that can accompany delayed signals, because the user constantly receives visual feedback from the virtual world.
"A system like this could eventually help humans supervise robots from a distance," says CSAIL postdoctoral associate Jeffrey Lipton, who was lead author on a paper describing the system. "By teleoperating robots from home, blue-collar workers would be able to tele-commute and benefit from the IT revolution just as white-collar workers do now."
The team carried out tests of the system, having users direct Baxter through simple tasks like picking up screws, wires and blocks. It says the system showed much higher success rates than other approaches, and that users with gaming experience took to the system particularly well.
According to the researchers, these kinds of technologies could even be used to "gamify" manufacturing positions and give gamers a new source of remote work. In one test, the team even used the system to pilot Baxter at MIT from a hotel in Washington, a distance of around 150 mi (240 km). And though the researchers used Baxter and the Oculus Rift to demonstrate the technology, they say it could be adapted to work with other robot platforms and the HTC Vive headset.
You can see the system in action in the video below.
Source: CSAIL via EurekAlert