Autonomous cars could be bamboozled by stickers on street signs

Self-driving cars could be bamboozled by stickers on stop signs, according to research from a new multi-university team.

Amid the rush to develop self-driving cars, there are a few questions that need answering. We've been worrying about hackers taking control of our autonomous vehicles, but it turns out they could be spooked by much simpler means. A team of researchers says strategically placed stickers on street signs could be enough to confuse self-driving cars.

The team, which included researchers from the University of Washington, University of Michigan, Stony Brook University and UC Berkeley, needed only a regular printer and a camera to trick the vision systems in their autonomous test subjects.

One method for bamboozling the self-driving cars involved printing a poster and simply sticking it over the existing sign. The result would look slightly off or faded to human eyes, but it caused the cars to misidentify the stop sign as a speed sign from a number of different angles. In the real world, that could obviously have some serious implications.
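
To give a rough sense of how this kind of image attack is constructed, the sketch below perturbs a sign photo using the well-known fast gradient sign method against a toy PyTorch classifier. The model, class index and random input are placeholders invented for illustration; the paper's actual attack uses a more elaborate optimization designed to survive printing and changing camera angles.

# Minimal sketch (not the paper's exact method): nudging a sign image so a
# toy classifier no longer recognizes it, via the fast gradient sign method.
import torch
import torch.nn as nn

class SignClassifier(nn.Module):
    """Toy CNN standing in for a real traffic-sign classifier."""
    def __init__(self, num_classes=43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 8 * 8, num_classes)
    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def fgsm_attack(model, image, true_label, epsilon=0.03):
    """Perturb `image` in the direction that increases the classification loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(image), true_label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()  # small, targeted nudge
    return adversarial.clamp(0, 1).detach()            # keep pixels valid

model = SignClassifier().eval()
stop_sign = torch.rand(1, 3, 32, 32)   # placeholder for a real photo of a sign
label = torch.tensor([14])             # hypothetical "stop" class index
adv = fgsm_attack(model, stop_sign, label)
print(model(adv).argmax(dim=1))        # may no longer predict the stop class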

One of the ways researchers bamboozled the self-driving test subjects

The other approach taken by researchers was more akin to an abstract art or guerrilla marketing project. The team stuck a few small stickers in strategic places on the stop sign and found they had the same impact as the full-sign coverup. Stickers reading "love" and "hate" made the cars think the stop signs were actually speed signs (or a yield sign, in one case), while smaller stickers placed around the sign had the same effect.

Gray stickers masking a right turn arrow made the test cars think it was a stop sign two-thirds of the time, and an added-lane sign the rest of the time.
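
The sticker attack can be thought of as the same sort of optimization, but confined to a mask covering only the sticker locations. Below is a hedged sketch of that idea; the stand-in model, mask coordinates, target class and step count are assumptions for illustration, not the researchers' exact setup.

# Sketch of a masked "sticker" perturbation: only pixels inside the mask
# (the sticker regions) are allowed to change.
import torch
import torch.nn as nn

model = nn.Sequential(                 # stand-in traffic-sign classifier
    nn.Flatten(), nn.Linear(3 * 32 * 32, 43)
).eval()

stop_sign = torch.rand(1, 3, 32, 32)   # placeholder photo of a stop sign
target = torch.tensor([2])             # hypothetical speed-limit class index

# Binary mask: two small rectangles where stickers may be placed.
mask = torch.zeros_like(stop_sign)
mask[..., 4:10, 4:12] = 1.0
mask[..., 20:26, 18:28] = 1.0

delta = torch.zeros_like(stop_sign, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=0.05)

for _ in range(200):
    optimizer.zero_grad()
    adversarial = (stop_sign + delta * mask).clamp(0, 1)
    # Push the prediction toward the target class, changing only masked pixels.
    loss = nn.functional.cross_entropy(model(adversarial), target)
    loss.backward()
    optimizer.step()

print(model((stop_sign + delta * mask).clamp(0, 1)).argmax(dim=1))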

Not just anyone would be able to take advantage of this vulnerability. Anyone keen to meddle with a certain type of autonomous vehicle in their area would need to know the algorithm used by a specific car's vision system, or play a long game of trial and error with different sticker layouts. Once they'd worked that out, however, all a hacker would need to bamboozle self-driving cars is a decent color printer and some sticker paper.

One of the ways researchers bamboozled the self-driving test subjects

There are a few different ways to deal with the threat. The researchers say manufacturers could use contextual information to check that a car is reading a sign correctly, rather than just trusting what the vision system sees. The backup would add a dash of reason to proceedings, asking why, for example, there might be a 65 mph sign on a quiet suburban street.
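
As a loose illustration of what such a contextual check might look like, the sketch below compares a detected speed-limit reading against map data for the current road. The data structures, label strings and thresholds are hypothetical, not anything drawn from a real vehicle's software.

# Hedged sketch of a contextual plausibility check for detected signs.
from dataclasses import dataclass

@dataclass
class RoadContext:
    road_type: str         # e.g. "residential", "motorway"
    mapped_speed_mph: int  # speed limit recorded in the map data

PLAUSIBLE_SPEEDS = {
    "residential": range(15, 41),
    "motorway": range(50, 86),
}

def sign_is_plausible(detected: str, ctx: RoadContext) -> bool:
    """Return False when the detected sign conflicts with the map context."""
    if detected.startswith("speed_limit_"):
        mph = int(detected.rsplit("_", 1)[-1])
        allowed = PLAUSIBLE_SPEEDS.get(ctx.road_type, range(0, 200))
        # A 65 mph reading on a quiet suburban street should be rejected.
        return mph in allowed and abs(mph - ctx.mapped_speed_mph) <= 10
    return True  # trust other sign types here; a real system would do more

print(sign_is_plausible("speed_limit_65",
                        RoadContext("residential", mapped_speed_mph=25)))
# -> False: fall back to the mapped limit or flag the sign for review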

The results of the study were published in the paper Robust Physical-World Attacks on Machine Learning Models.

Source: University of Washington via Car and Driver

7 comments
Fairly Reasoner
Self driving cars. Among the worst ideas ever.
SimonClarke
Self driving cars are so much safer than people-driven cars. Before commenting, people need to drive in one. Current UK regulations limit this to level two autonomy, which means that you have to have your hand on the wheel. Therefore the computer and you are driving.
Tesla cars memorise roads and share via updates to all other Teslas. Therefore one messed-with sign will be noted by the first car that passes it and known by all of the others.
These cars are so much cleverer than we give them credit for.
rozbeh
One of the effective ways to rectify the error is to register the signs on the map installed in the driving system and to update them regularly.
Brian M
Clearly it's an issue, even with human drivers!
However, deep learning and contextual positioning easily resolve this, as much as they do for humans.
MK23666
Rozbeh and Brian M. Thank you
Sean Reynolds
There are two key points about driving systems: always-on attention and perfect identification.
The problem with people drivers is not perfect identification, but rather our attention span.
Computers are great at attention, but still working on perfect identification.
Most accidents are not the result of partially covered signs; with people it is all about partially given attention. However, this may shift: as self-driving systems are able to give 100% attention, we may see more accidents due to imperfect perception and identification.
One good thing is that self driving cars are also equipped with obstacle avoidance systems...
CharlieSeattle
Please run these "Self-Driving Car" tests to simulate real world conditions FIRST!!
Test sensors in a dark tunnel without light after a storm. Test sensors with 50% disabled to simulate a defect or failure. Test sensors covered with ice not chipped off. Test sensors smacked with bird doo, WHILE IN MOTION. Test sensors covered with engine oil spray. Test sensors covered with Arizona road dust. Test sensors smacked with mud, WHILE IN MOTION. Test sensors covered with snow and sleet while in an actual storm. Test sensors covered with sticky leaves, WHILE IN MOTION. Test sensors covered with dead bugs sticking to sensors. Test sensors covered with clear wax applied by a vandal. Test sensors on roads with signs missing or vandalized. Test sensors disabled by an EMP strike, WHILE IN MOTION. Test sensors against popular car top carriers, canoes, mattresses etc., WHILE IN MOTION.
CRITICAL TEST: Test "Self-Driving Car" software ability to block an NSA/CIA/FBI cyber hack used to stage an undetectable assassination! They can and will spoof their real location from the USA to Russia, Nigeria or your mommy's basement!