Although the Microsoft Kinect was designed first and foremost for gaming, the fact that it's a cheap but reliable depth-sensing camera has led to its use in everything from navigation systems for the blind to user-following grocery carts to remote-control cyborg cockroaches. Soon, however, it may be facing some competition. The Northwestern University-designed Motion Contrast 3D Scanning (MC3D) camera should also be economical, while offering higher-quality imaging and the ability to operate in sunlight.
The Kinect works by projecting a grid of infrared dots onto a scene, then observing that pattern with an offset infrared camera. Because the projector and camera are separated, each dot appears shifted by an amount that depends on the distance to the surface it lands on, and the system converts that shift into a depth estimate by triangulation.
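The triangulation principle can be sketched in a few lines. This is a hypothetical illustration of depth-from-disparity, not Microsoft's actual algorithm, and the focal length and baseline figures below are rough, assumed values, not official specifications.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic triangulation used by structured-light and stereo systems:
    depth Z = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Assumed illustrative numbers: 580 px focal length, 7.5 cm
# emitter-to-camera baseline, 50 px observed dot shift.
z = depth_from_disparity(50, 580, 0.075)
print(round(z, 2))  # 0.87 (metres)
```

Nearby objects shift the dots more (larger disparity, smaller depth), which is why the formula divides by the disparity.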
More professional systems acquire higher-resolution images using a single laser that scans back and forth across the scene – although all that scanning means the images aren't gathered very quickly. Because of this, such systems are thwarted by moving objects.
The MC3D, however, takes another approach to laser-scanning. After a scene is initially imaged and the three-dimensional depth of all the objects within it is assessed, the laser only subsequently re-scans areas where and when visual changes are detected.
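The selective re-scanning idea can be illustrated with a toy sketch. This is a hypothetical simplification of the motion-contrast concept, not Northwestern's implementation; the change threshold and frame representation are assumptions made for the example.

```python
# Assumed trigger: re-scan a pixel when its brightness changes by more
# than this many gray levels between frames (illustrative value only).
CHANGE_THRESHOLD = 10

def pixels_to_rescan(prev_frame, curr_frame, threshold=CHANGE_THRESHOLD):
    """Return (row, col) positions where the intensity change exceeds the
    threshold - only these would be revisited by the laser."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(q - p) > threshold:
                changed.append((r, c))
    return changed

prev = [[100, 100], [100, 100]]
curr = [[100, 180], [100, 100]]  # one pixel brightened: something moved
print(pixels_to_rescan(prev, curr))  # [(0, 1)]
```

In a static scene the list stays empty, so the scanner spends no time on unchanged regions – the source of the system's speed advantage over full-frame laser scanning.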
This is similar to the fashion in which the human eye works. "If you send the same signal to your eye over and over, the neurons will actually stop firing," says project leader Prof. Oliver Cossairt. "The neurons only fire if there is a change in your visual stimulus. We realized this principle could be really useful for a 3D scanning system."
Additionally, because its laser is so bright, the MC3D can be used outdoors in bright sunlight. By contrast, the Kinect's infrared dots are drowned out on brightly sunlit surfaces.
The system can be seen in use in the video below.
Source: Northwestern University
Kinect 2 does not work like this. It is a time-of-flight sensor.
@Piotr Ra You are right, but the first Kinect used pattern projection for estimating depth. And the Kinect in that video is a Kinect 1 (look at 0:25).