Amid the rush to develop self-driving cars, there are a few questions that need answering. We've been worrying about hackers taking control of our autonomous vehicles, but it turns out the cars could be spooked by much simpler means. A team of researchers says strategically placed stickers on street signs could be enough to confuse self-driving cars.
The team, which included researchers from the University of Washington, University of Michigan, Stony Brook University and UC Berkeley, needed only a regular printer and a camera to trick the vision systems in their autonomous test subjects.
One method for bamboozling the self-driving cars involved printing a poster and simply sticking it over the existing sign. The result would look slightly off or faded to human eyes, but it caused the cars to misidentify the stop sign as a speed limit sign from a number of different angles. In the real world, that could obviously have some serious implications.
The other approach taken by the researchers was more akin to an abstract art or guerrilla marketing project. The team stuck a few small stickers in strategic places on the stop sign and found they had the same impact as the full-sign coverup. Stickers reading "love" and "hate" made the cars think the stop signs were actually speed limit signs (or a yield sign, in one case), and smaller stickers placed around the sign had the same effect.
Gray stickers masking a right turn arrow made the test cars think it was a stop sign two-thirds of the time, and an added-lane sign the rest of the time.
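For the technically curious, the core idea can be sketched in a few lines of code. The snippet below is a generic, simplified illustration, not the researchers' actual method: it only adjusts the pixels under two "sticker" patches, nudging a toy, untrained classifier toward a chosen target class. The network, image, mask positions and class indices are all made-up stand-ins, and the real attack also had to survive being printed and photographed from many angles and distances.

```python
# Masked adversarial perturbation sketch: optimize only the pixels
# under the sticker mask, leaving the rest of the sign untouched.
import torch
import torch.nn as nn
import torch.nn.functional as F

classifier = nn.Sequential(                # stand-in for a sign classifier
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(4), nn.Flatten(),
    nn.Linear(8 * 4 * 4, 5),               # 5 hypothetical sign classes
)

sign = torch.rand(1, 3, 32, 32)            # stand-in photo of a stop sign
mask = torch.zeros_like(sign)              # stickers cover two small patches
mask[..., 4:10, 4:10] = 1.0
mask[..., 20:26, 18:30] = 1.0

target = torch.tensor([2])                 # e.g. the "speed limit" class
delta = torch.zeros_like(sign, requires_grad=True)

for _ in range(100):                       # gradient steps on patch pixels only
    loss = F.cross_entropy(classifier(sign + mask * delta), target)
    loss.backward()
    with torch.no_grad():
        delta -= 0.05 * delta.grad.sign()  # step toward the target class
        delta.clamp_(-1.0, 1.0)            # keep changes bounded and printable
        delta.grad.zero_()

print("predicted class:", classifier(sign + mask * delta).argmax().item())
```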
Not just anyone would be able to take advantage of this vulnerability. Anyone keen to meddle with a certain type of autonomous vehicle in their area would need to know the algorithm used by a specific car's vision system, or play a long game of trial-and-error with different sticker layouts. If they could work that out, however, all a hacker would need to fool self-driving cars is a decent color printer and some sticker paper.
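As a rough, hypothetical sketch of what that trial-and-error route could look like, the snippet below treats the classifier as a black box: it randomly tries gray sticker placements and keeps any layout that lowers the reported "stop sign" confidence. The query function here is a toy stand-in; a real attacker would have to photograph the modified sign and query the actual vision system.

```python
# Black-box trial-and-error: random search over sticker placements.
import random

def query_stop_confidence(image):
    # Toy stand-in for the car's classifier: confidence simply drops
    # as more of the sign is covered by gray (0.5) pixels.
    covered = sum(px == 0.5 for row in image for px in row)
    return max(0.0, 1.0 - covered / 200.0)

def apply_sticker(image, x, y, size=6):
    patched = [row[:] for row in image]          # copy the pixel grid
    for i in range(y, min(y + size, len(patched))):
        for j in range(x, min(x + size, len(patched[0]))):
            patched[i][j] = 0.5                  # flat gray patch
    return patched

def random_search(image, trials=500):
    best, best_conf = image, query_stop_confidence(image)
    for _ in range(trials):
        x = random.randrange(len(image[0]))
        y = random.randrange(len(image))
        candidate = apply_sticker(best, x, y)    # stickers accumulate
        conf = query_stop_confidence(candidate)
        if conf < best_conf:                     # this sticker helped, keep it
            best, best_conf = candidate, conf
    return best, best_conf

sign = [[1.0] * 32 for _ in range(32)]           # toy 32x32 "stop sign"
_, conf = random_search(sign)
print(f"stop-sign confidence after stickers: {conf:.2f}")
```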
There are a few different ways to deal with the threat. The researchers say manufacturers could use contextual information to confirm that a sign has been read correctly, rather than simply trusting what the vision system reports. That kind of backup would add a dash of reason to proceedings, asking why, for example, there might be a 65 mph sign on a quiet suburban street.
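A toy version of such a sanity check might look like the snippet below. The road categories, plausible-limit table and 10 mph tolerance are invented for illustration; a real system would fuse map data, GPS and other sensor inputs.

```python
# Toy contextual sanity check for a detected road sign.
PLAUSIBLE_LIMITS = {
    "residential": {25, 30, 35},
    "highway": {55, 65, 70},
}

def sign_is_plausible(sign_type: str, sign_value: int,
                      road_type: str, mapped_limit: int) -> bool:
    """Return True if the detected sign makes sense in context."""
    if sign_type == "speed_limit":
        # A 65 mph sign on a quiet suburban street should raise a flag.
        if sign_value not in PLAUSIBLE_LIMITS.get(road_type, set()):
            return False
        # So should a big disagreement with the mapped speed limit.
        return abs(sign_value - mapped_limit) <= 10
    return True  # other sign types pass through in this toy version

# The poster attack: vision reads "speed limit 65" on a residential road.
print(sign_is_plausible("speed_limit", 65, "residential", 25))  # False
```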
The results of the study were published in the paper "Robust Physical-World Attacks on Machine Learning Models."
Source: University of Washington via Car and Driver