Physicists at Scotland’s Heriot-Watt University have created a 3D imaging camera system capable of resolving depth on a millimeter scale at distances of up to one kilometer. Working much like a laser version of radar, the “Time-of-Flight” (ToF) measurement system “pings” a low-powered infrared laser beam off distant objects and records a pixel-by-pixel map using a detector that counts and positions individual photons as they arrive back at the source.
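The time-of-flight principle described above reduces to simple arithmetic: a photon's round-trip travel time, multiplied by the speed of light and halved, gives the distance to the target. The sketch below is illustrative only (not the team's actual processing code) and shows why millimeter resolution at a kilometer demands picosecond-scale timing:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def depth_from_round_trip(t_seconds):
    """Distance to target, given the photon's round-trip travel time."""
    return C * t_seconds / 2.0

def timing_for_resolution(depth_m):
    """Round-trip timing precision needed to resolve a given depth step."""
    return 2.0 * depth_m / C

# A photon pinged off a target 1 km away returns after about 6.67 microseconds:
round_trip = 2.0 * 1000.0 / C
print(depth_from_round_trip(round_trip))   # recovers the 1000 m range

# Resolving a 1 mm depth difference requires timing the return
# to within roughly 6.7 picoseconds:
print(timing_for_resolution(0.001))
```

Single-photon detectors paired with fast timing electronics make this feasible: each detected photon is time-stamped, and repeated measurements per pixel build up the depth map.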

Unlike a typical 2D camera taking flat pictures, this type of device gathers digital information and creates images using accurate depth measurements of the distant surfaces it scans. This enables identification of real-world objects and could make distant landscapes fully navigable inside a 3D virtual space, transforming "zoom-and-pan" Google Earth-style navigation into an imaging system that is far more tactile and realistic – and potentially navigable inside a vector-based space.

Time-of-flight laser measurement is already in wide use in machine-vision navigation systems, such as the robot vehicles that move parts and inventory around factories. However, the current crop of ToF systems struggles with anything beyond a few feet of distance, as well as with accurately imaging certain types of objects.

The Heriot-Watt system has shown great promise working with typically "uncooperative" objects that don't easily reflect laser pulses, such as fabric, making the system potentially useful in a wide variety of field situations.

The primary use of the system is likely to be scanning human-made objects, such as vehicles, but it also has potential as a way of determining an object's position, speed and direction with unprecedented accuracy.

According to Dr Aongus McCarthy, Research Fellow at Heriot-Watt University, the approach provides a low-power route to the depth imaging of ordinary, small targets at very long range.

“While it is possible that other depth-ranging techniques will match or out-perform some characteristics of these measurements, this single-photon counting approach gives a unique trade-off between depth resolution, range, data-acquisition time and laser-power levels.”

The scanner is particularly good at identifying objects hidden behind clutter, such as foliage. However, it cannot render human faces, instead drawing them as dark, featureless areas, because at the long wavelength used by the system human skin does not bounce back enough of the transmitted photons to obtain a depth measurement. A mannequin's face, on the other hand, does render depth information, as shown in the image above.

The light the team has chosen has a wavelength of 1,560 nanometres, longer (or "redder") than visible light, and thus it travels more easily through the atmosphere, is not drowned out by sunlight, and is safe for eyes. Many previous ToF systems could not detect the extra-long wavelengths that the team's device is specially designed to sense.

Outside of object identification, photon-counting depth imaging could be used for a number of scientific purposes, including the remote examination of the health and volume of vegetation and the movement of rock faces, to assess potential hazards.

Ultimately, McCarthy says, it has the potential to scan and image objects located as far as 10 kilometers away.

“It is clear that the system would have to be made smaller and made more rugged, but we believe that a lightweight, portable imager is conceivable at the commercial level within five years.”

The research was published this month in the journal Optics Express.
