Amid the rush to develop self-driving cars, there are a few questions that need answering. We've been worrying about hackers taking control of our autonomous vehicles, but it turns out they could be spooked by much simpler means. A team of researchers says strategically placed stickers on street signs could be enough to confuse self-driving cars.
The team, which included researchers from the University of Washington, University of Michigan, Stony Brook University and UC Berkeley, needed only a regular printer and a camera to trick the vision systems in their autonomous test subjects.
One method for bamboozling the self-driving cars involved printing a poster and simply sticking it over the existing sign. The result would look slightly off or faded to human eyes, but it caused the cars to misidentify a stop sign as a speed limit sign from a number of different angles. In the real world, that could obviously have some serious implications.
The other approach taken by the researchers was more akin to an abstract art or guerrilla marketing project. The team stuck a few small stickers in strategic places on the stop sign and found they had the same impact as the full-sign coverup: stickers reading "love" and "hate" made the cars think the stop signs were actually speed limit signs (or a yield sign, in one case), and small abstract stickers placed around the sign worked just as well.
Gray stickers masking a right-turn arrow made the test cars think it was a stop sign two-thirds of the time, and an added-lane sign the rest of the time.
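To get a feel for why small, printed changes can flip a classifier's output, here is a minimal sketch of a gradient-based adversarial perturbation in the style of the classic fast gradient sign method. This is not the researchers' actual attack, which additionally optimizes the perturbation to survive different viewing angles and distances; the toy model and inputs below are stand-ins invented for the example.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.03):
    """Fast gradient sign method: nudge every pixel a small step in the
    direction that most increases the classifier's loss, so the image
    looks unchanged to a human but not to the network."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Toy usage: a tiny untrained classifier standing in for a sign recognizer.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
image = torch.rand(1, 3, 32, 32)   # stand-in for a photographed sign
label = torch.tensor([0])          # stand-in for the "stop sign" class
perturbed = fgsm_attack(model, image, label)
print((perturbed - image).abs().max())  # per-pixel change stays within epsilon
```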
Not just anyone would be able to take advantage of this vulnerability. Anyone keen to meddle with a certain type of autonomous vehicle in their area would need to know the algorithm used by a specific car's vision system, or play a long game of trial and error with different sticker layouts. If they could work that out, however, all an attacker would need to fool self-driving cars is a decent color printer and some sticker paper.
There are a few different ways to deal with the threat. The researchers say manufacturers could use contextual information to check that a sign has been read correctly, instead of just trusting whatever the vision system reports. The backup would add a dash of reason to proceedings, asking why, for example, there might be a 65 mph sign on a quiet suburban street.
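To illustrate the kind of contextual backup the researchers describe, here is a hypothetical sketch of a plausibility check; the sign classes, road categories and speed values are invented for the example and don't come from the paper.

```python
from dataclasses import dataclass

@dataclass
class DetectedSign:
    kind: str       # e.g. "speed_limit", "stop", "yield"
    value: int = 0  # speed in mph, where applicable

# Speed limits that would be believable on each road type (illustrative values).
PLAUSIBLE_SPEED_LIMITS = {
    "residential": {20, 25, 30},
    "arterial": {35, 40, 45},
    "highway": {55, 65, 70, 75},
}

def sign_is_plausible(sign: DetectedSign, road_type: str) -> bool:
    """Flag readings that contradict map context, such as a 65 mph
    sign apparently standing on a quiet suburban street."""
    if sign.kind != "speed_limit":
        return True  # this sketch only sanity-checks speed limits
    return sign.value in PLAUSIBLE_SPEED_LIMITS.get(road_type, set())

# A 65 mph reading on a residential street fails the check.
assert not sign_is_plausible(DetectedSign("speed_limit", 65), "residential")
```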
The results of the study were published in the paper "Robust Physical-World Attacks on Machine Learning Models."
Source: University of Washington via Car and Driver
Tesla cars memorise roads and share them via updates to all other Teslas. Therefore one messed-with sign will be noted by the first car that passes it and known by all of the others.
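In the spirit of that comment, here is a hypothetical sketch of a fleet-level cross-check; the function names and reporting mechanism are invented for illustration and are not a description of Tesla's actual software.

```python
def report_anomaly(location, expected, observed):
    """Stand-in for an uplink that warns the rest of the fleet."""
    print(f"Possible tampering at {location}: "
          f"map says {expected!r}, camera saw {observed!r}")

def reconcile_sign(map_sign, observed_sign, location):
    """Prefer the fleet's remembered sign when a fresh camera reading
    disagrees, and report the mismatch so other cars are warned."""
    if observed_sign != map_sign:
        report_anomaly(location, map_sign, observed_sign)
        return map_sign
    return observed_sign

# A stickered stop sign read as "speed_limit_45" gets flagged, not obeyed.
print(reconcile_sign("stop", "speed_limit_45", (47.6062, -122.3321)))
```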
These cars are so much cleverer than we give them credit for.
However, deep learning and contextual positioning should resolve this easily, much as they do for humans.
The problem with human drivers is not identification, but rather our attention span.
Computers are great at attention, but are still working on perfect identification.
Most accidents are not the result of partially covered signs; with people it is all about partially given attention. However, this may shift: as self-driving systems are able to give 100% attention, we may see more accidents due to imperfect perception and identification.
One good thing is that self-driving cars are also equipped with obstacle avoidance systems...
Test sensors in a dark tunnel without light after a storm.
Test sensors with 50% disabled to simulate a defect or failure.
Test sensors covered with ice that hasn't been chipped off.
Test sensors smacked with bird doo, WHILE IN MOTION.
Test sensors covered with engine oil spray.
Test sensors covered with Arizona road dust.
Test sensors smacked with mud, WHILE IN MOTION.
Test sensors covered with snow and sleet while in an actual storm.
Test sensors covered with sticky leaves, WHILE IN MOTION.
Test sensors covered with dead bugs stuck to them.
Test sensors covered with clear wax applied by a vandal.
Test sensors on roads with signs missing or vandalized.
Test sensors disabled by an EMP strike, WHILE IN MOTION.
Test sensors against popular car-top carriers, canoes, mattresses, etc., WHILE IN MOTION.
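As a rough illustration of how one item on that list could be automated, here is a self-contained sketch that simulates partial sensor occlusion on a recorded frame; the occlusion model and the stand-in frame are invented for the example.

```python
import numpy as np

def occlude(frame: np.ndarray, fraction: float, seed: int = 0) -> np.ndarray:
    """Blank out a random fraction of pixels to mimic mud, ice,
    bug splatter or other partial coverage of a camera sensor."""
    rng = np.random.default_rng(seed)
    mask = rng.random(frame.shape[:2]) < fraction
    degraded = frame.copy()
    degraded[mask] = 0.0
    return degraded

# Sweep increasing occlusion levels rather than testing only the clean case,
# to find where a perception stack would actually start to fail.
frame = np.random.default_rng(1).random((64, 64, 3))  # stand-in camera frame
for fraction in (0.1, 0.25, 0.5):
    degraded = occlude(frame, fraction)
    blanked = (degraded == 0.0).all(axis=-1).mean()
    print(f"target {fraction:.0%} occlusion -> {blanked:.0%} of pixels blanked")
```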
CRITICAL TEST: Test the "Self-Driving Car" software's ability to block an NSA/CIA/FBI cyber hack used to stage an undetectable assassination! They can and will spoof their real location from the USA to Russia, Nigeria or your mommy's basement!