A team of researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) led by Li-Shiuan Peh has come up with a new infrared depth-sensing system. The new system, which works outdoors as well as in, was built by attaching a US$10 laser to a smartphone, with MIT saying the inexpensive approach could be used to convert conventional personal vehicles, such as wheelchairs and golf carts, into autonomous ones.
Inexpensive rangefinding devices, such as the Microsoft Kinect, have been a great help to robotics engineers. These off-the-shelf products, which rely on infrared light to measure distance, allow for rapid prototyping and make it possible to build robots that can sense and navigate their environments without engineers having to constantly reinvent the underlying technology.
Unfortunately, Kinect and similar infrared-based systems tend to be fussy about ambient light conditions. Sunlight, fire, and other heat sources can throw them off, and even indoors, subdued light is often required for them to work.
Commercial outdoor rangefinders have been common for over 30 years, but they work by firing high-energy infrared bursts, which are extremely short to minimize the danger of eye damage. In addition, such systems are very expensive – often costing tens of thousands of dollars.
The MIT system gets around the need for high-energy bursts by timing its measurements to the emission of low-energy ones. It does this by capturing four frames of video, two that measure reflections of the laser light and two that record only the ambient infrared light, then subtracting the latter from the former to make range measurements.
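The frame-differencing idea can be sketched in a few lines. This is an illustrative toy, not MIT's actual pipeline: the array sizes, intensity values, and the assumption that ambient light is identical across the frame pairs are all simplifications.

```python
import numpy as np

# Toy sketch of the four-frame scheme: two frames with the laser firing,
# two with only ambient infrared. Subtracting the averaged ambient frames
# from the averaged laser frames cancels the background and isolates the
# laser's reflections.

rng = np.random.default_rng(0)
ambient = rng.integers(40, 60, size=(2, 4, 4))   # two ambient-only frames
laser_signal = np.zeros((4, 4), dtype=int)
laser_signal[2] = 100                            # laser line lands on row 2
with_laser = ambient + laser_signal              # laser frames include ambient

isolated = with_laser.mean(axis=0) - ambient.mean(axis=0)
print(isolated)  # only the laser row survives the subtraction
```

In a real capture the ambient light varies slightly between frames, so the subtraction cancels it only approximately, which is one reason short capture intervals help.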
CSAIL researchers are presenting a new infrared depth-sensing system built from off-the-shelf components, that works outdoors as well as in
In its current prototype form, the MIT system uses a smartphone with a 30-frame-per-second camera, so capturing the four frames needed for each measurement takes about an eighth of a second. This delay limits the accuracy of the system, though more advanced 240-frame-per-second cameras, which would cut the delay to about a 60th of a second, are available.
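The delay figures follow directly from the frame rate, assuming the four-frame capture dominates the measurement time (a simplification; readout and processing overhead are ignored here):

```python
# Per-measurement delay, assuming four frames are needed and the
# frame capture dominates the latency.
frames_needed = 4

for fps in (30, 240):
    delay = frames_needed / fps
    print(f"{fps:3d} fps -> {delay:.4f} s per measurement")
    # 30 fps gives ~0.133 s (roughly an eighth of a second);
    # 240 fps gives ~0.017 s (a 60th of a second).
```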
Using what is called "active triangulation," the MIT system's attached laser emits light in a single plane, which is measured by the camera's 2D sensor. MIT says that at ranges of three to four meters (10 to 13 ft) the device is accurate to within millimeters, while at five meters (16 ft) the accuracy drops to six centimeters (2.4 in). Even so, when the team installed the system in the driverless golf cart developed by the Singapore-MIT Alliance for Research and Technology, it produced depth measurements accurate enough for travel at 15 km/h (9 mph).
According to the team, once the technology is mature, it could lead to a plug-in method of creating autonomous golf carts, wheelchairs or other small vehicles, package delivery drones, or expendable robotic vehicles.
"My group has been strongly pushing for a device-centric approach to smarter cities, versus today's largely vehicle-centric or infrastructure-centric approach," says Peh. "This is because phones have a more rapid upgrade-and-replacement cycle than vehicles. Cars are replaced in the timeframe of a decade, while phones are replaced every one or two years. This has led to drivers just using phone GPS today, as it works well, is pervasive, and stays up-to-date. I believe the device industry will increasingly drive the future of transportation."
The MIT team says that as new camera technology becomes available, the accuracy of the system will improve. Current mobile phone cameras use a rolling shutter, which creates an image by scanning across the surface of the sensor over about a 30th of a second. Newer phones will use a global shutter, in which all the photodetectors are read at once, allowing the MIT system to use shorter, higher-energy bursts for longer-range measurements.
The team's findings will be presented at the International Conference on Robotics and Automation 2016 in Stockholm.
The video below outlines the MIT infrared rangefinder.