Dual-radar tech could help self-driving cars see through fog
Self-driving cars typically use both LiDAR and radar to detect obstacles on the road ahead, yet neither system is adept at identifying vehicles through fog. Now, though, engineers have discovered that radar is good at the task if it's "doubled up."
LiDAR (Light Detection And Ranging) sensors gauge the shape and distance of an object by sending out pulses of laser light, then measuring how long it takes that light to reflect back off the object. Radar units send out radio waves, which are likewise reflected back by objects sitting in their path.
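Both sensing methods rest on the same round-trip-time principle, which can be sketched in a few lines of Python. This is an illustrative calculation only, not code from the study; the function name and the example echo time are assumptions.

```python
# Illustrative time-of-flight ranging: both LiDAR and radar estimate
# distance from how long a reflected signal takes to return, travelling
# at the speed of light.
C = 299_792_458.0  # speed of light in m/s

def range_from_echo(round_trip_time_s: float) -> float:
    """Distance to the reflector: the signal covers the gap twice."""
    return C * round_trip_time_s / 2.0

# An echo returning after half a microsecond implies a target ~75 m away.
print(round(range_from_echo(0.5e-6), 1))
```

The division by two is the key detail: the measured time covers the trip out to the object and back, so the one-way distance is half the total path.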
Unfortunately, airborne obstructions such as fog, dust, rain or snow absorb the light used by LiDAR systems, making them unreliable in such conditions. And while radar isn't as adversely affected, it can only ever create a partial image of what it detects, because even under ideal conditions only a small percentage of its emitted radio signals get reflected back to its sensor.
Led by Prof. Dinesh Bharadia, a team at the University of California San Diego addressed the latter problem by installing two radar units on the hood of a car, approximately one car-width (1.5 m/4.9 ft) apart from each other. Special algorithms combine the reflected signals that they receive to create one composite image, while also filtering out irrelevant background "noise." The setup has already been successfully tested under simulated foggy conditions.
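The idea of cross-checking two vantage points can be sketched loosely in code. This is a hypothetical illustration, not the UC San Diego team's actual algorithm (which the article doesn't detail): the coordinate convention, matching tolerance and helper names are all assumptions.

```python
# Hypothetical sketch of dual-radar fusion: detections from two radars
# mounted ~1.5 m apart are shifted into one shared frame, and points
# confirmed by both sensors are kept, while unmatched "noise" is dropped.
from math import dist

BASELINE_M = 1.5  # lateral spacing between the two hood-mounted radars

def fuse(left_pts, right_pts, tol=0.5):
    """Keep detections both radars agree on (within tol metres)."""
    # Express each radar's (x, y) detections in the frame midway
    # between the two units: the left radar sits at x = -BASELINE_M/2,
    # the right radar at x = +BASELINE_M/2.
    left_shifted = [(x - BASELINE_M / 2, y) for x, y in left_pts]
    right_shifted = [(x + BASELINE_M / 2, y) for x, y in right_pts]
    fused = []
    for p in left_shifted:
        match = min(right_shifted, key=lambda q: dist(p, q), default=None)
        if match is not None and dist(p, match) <= tol:
            # Average the two independent estimates into one point.
            fused.append(((p[0] + match[0]) / 2, (p[1] + match[1]) / 2))
    return fused
```

A spurious return seen by only one radar finds no nearby partner in the other radar's data, so it never makes it into the composite image; genuine obstacles in the overlapping field of view are reported by both units and survive the filter.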
"By having two radars at different vantage points with an overlapping field of view, we create a region of high-resolution, with a high probability of detecting the objects that are present," says PhD student Kshitiz Bansal.
The researchers are now in talks with Toyota, which may combine the technology with optical cameras on its vehicles. If it works as hoped, the more expensive LiDAR sensors could ultimately prove unnecessary.