Electronics

Powerful imaging tech could help warehouse bots spot broken goods before shipping

Spotting damaged items before they ship could go a long way towards making warehouse robots more efficient at handling packages
The mmNorm prototype built with a robotic arm (top), with the crude 3D reconstruction from other approaches shown in blue (middle) and the mmNorm reconstruction in purple (right)
Images courtesy of the researchers

Researchers have developed a way to use high-frequency electromagnetic waves to visualize objects that are hidden from view – such as a tool in a pile of junk or a vase in a cardboard box – with much greater accuracy than before.

This imaging technique could be applied in a number of ways, including giving warehouse robots X-ray-like vision so they can identify damaged products on the conveyor belt before they ship.

Here's a quick breakdown of how this tech works. mmWave signals are millimeter-wave signals: electromagnetic waves with frequencies between 30 and 300 GHz. They belong to the same family of signals used for Wi-Fi, but at much higher frequencies, giving them very short wavelengths that allow precise, high-resolution imaging. Such waves can pass through common obstacles like plastic containers and interior walls, then bounce off the objects behind them to reveal them.
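To get a feel for just how short those wavelengths are, the free-space wavelength follows directly from λ = c/f. A minimal sketch for the band edges quoted above:

```python
# Free-space wavelength: lambda = c / f
C = 299_792_458  # speed of light in m/s

for f_ghz in (30, 300):
    wavelength_mm = C / (f_ghz * 1e9) * 1000  # metres -> millimetres
    print(f"{f_ghz} GHz -> {wavelength_mm:.1f} mm")
# 30 GHz -> 10.0 mm
# 300 GHz -> 1.0 mm
```

Wavelengths of 1–10 mm are why the band is called "millimeter wave", and why it can resolve detail far finer than Wi-Fi-band signals can.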

The trouble is, existing mmWave imaging tech produces crude 3D reconstructions of hidden objects, which aren't useful for identifying small items inside a box or container, like tools or cutlery. The team at Massachusetts Institute of Technology (MIT) believed it could do better with its mmNorm system, which exploits a property called specularity to estimate the surface curvature of a hidden object from its mmWave reflections.


mmWave signals bounce off most objects in a "mirror-like" or "specular" way, meaning the angle at which the signal hits a surface is roughly equal to the angle at which it reflects away. Because of this, a radar will only receive strong reflections from parts of an object's surface where the surface normal (the "outward pointing arrow") points directly back towards the radar. mmNorm leverages this direct relationship between the reflected signal and the surface's orientation at every point in 3D space.

For a specific point in 3D space (a "voxel"), mmNorm considers all the radar positions from which that point could potentially be seen. Each radar location effectively "votes" on what the surface normal should be, based on how strongly it received a reflection from that point. All these weighted votes are summed geometrically to produce a final estimate of the surface normal at that point, and the process is repeated for every point in the space to reconstruct the 3D object more accurately than previous methods.
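The voting step above can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: each radar position votes for the unit vector pointing from the voxel back towards itself, weighted by the reflection strength it measured, and the normal estimate is the normalized weighted sum of those votes.

```python
import numpy as np

def estimate_normal(voxel, radar_positions, reflection_strengths):
    """Toy per-voxel normal vote: each radar location votes for the
    direction from the voxel back to itself, weighted by how strongly
    it saw a reflection from that voxel."""
    votes = np.zeros(3)
    for pos, strength in zip(radar_positions, reflection_strengths):
        direction = pos - voxel
        direction /= np.linalg.norm(direction)  # unit vector toward radar
        votes += strength * direction           # weighted vote
    norm = np.linalg.norm(votes)
    return votes / norm if norm > 0 else votes  # estimated surface normal

# Example: two radars almost directly above the voxel, both seeing
# strong reflections, should yield a normal pointing nearly straight up.
voxel = np.array([0.0, 0.0, 0.0])
radars = [np.array([0.1, 0.0, 1.0]), np.array([-0.1, 0.0, 1.0])]
strengths = [0.9, 0.8]
print(estimate_normal(voxel, radars, strengths))  # dominated by the +z axis
```

Because specular surfaces only reflect strongly back to radars their normal points at, weighting each vote by reflection strength naturally concentrates the estimate on the true surface orientation.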

The researchers created an mmNorm prototype by attaching an mmWave radar to a robotic arm, and tested its 3D reconstruction ability with more than 60 everyday objects, like mugs, kitchen utensils, and power tools. The team's tech generated reconstructions with roughly 40% less error than similar systems, while also estimating the position of a hidden object more accurately, and working with items made from a range of materials. You can see it in action below.

The researchers are excited about what this system could unlock in robotics across industries:

  • Warehouse bots could reject damaged or incorrectly packed items before they're shipped off to customers.
  • Robots in factories could pick specific tools from a jumbled bunch to use for a task or hand off to a human.
  • AR headsets could reveal objects occluded by walls in industrial settings.
  • Security scanners at airports could more accurately identify objects in passengers' luggage.

The team plans to develop the tech further to improve reconstruction resolution and to handle less reflective objects and thicker occlusions. You can find the paper detailing mmNorm here (PDF).

Source: MIT News
