Researchers demonstrate first backdoor "hack" into the human brain

Using an off-the-shelf Emotiv BCI, researchers have shown that it's possible to "hack" a human brain
You can use something like the Emotiv BCI to play videogames. But can someone else use it to play you? (Image: Martinovic et al.)

Once the preserve of science fiction, brain-computer interfaces (BCIs) have advanced to the point where they can even be found in novelty headwear, which only makes the achievement of an international team of scientists all the more frightening. Using an off-the-shelf Emotiv BCI costing only a few hundred dollars, the team has shown that it's possible to "hack" a human brain and pull things like bank details straight out of your skull.

For their experiment, researchers from the Universities of Oxford, Geneva and California (Berkeley) called in a group of computer science students. The students knew they were part of a security-related experiment but did not know its objectives or that they were being "hacked." Each student put on an Emotiv BCI and sat down in front of a computer that displayed a series of images such as maps, banks, card PINs, and so on.
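To make the idea behind the setup concrete, here is a minimal simulation sketch (not the researchers' code, and every name, sampling rate and number in it is an assumption): a stimulus the subject recognises, such as their own bank, tends to evoke a larger P300-like deflection around 300 ms, so flashing each candidate repeatedly and averaging the response per candidate points the attacker at the "meaningful" one.

```python
# Illustrative sketch only -- not the paper's code. It simulates the
# "guilty knowledge" idea behind the attack: the recognised stimulus
# evokes a larger P300-like bump ~300 ms after onset, so per-stimulus
# averaging of repeated trials reveals which candidate is meaningful.
import numpy as np

rng = np.random.default_rng(0)
FS = 128                      # assumed Emotiv-style sampling rate, Hz
EPOCH = FS                    # one second of single-channel EEG per flash
P300_WINDOW = slice(int(0.25 * FS), int(0.45 * FS))  # ~250-450 ms window

def simulate_epoch(is_target: bool) -> np.ndarray:
    """Fake epoch: Gaussian noise, plus a P300-like bump for targets."""
    signal = rng.normal(0.0, 1.0, EPOCH)
    if is_target:
        t = np.arange(EPOCH) / FS
        signal += 1.5 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return signal

candidates = ["Bank A", "Bank B", "Bank C", "Bank D"]   # hypothetical stimuli
secret = "Bank C"             # the detail the attacker is trying to recover

# Flash each candidate many times and average the P300 window per candidate.
scores = {}
for name in candidates:
    epochs = [simulate_epoch(name == secret) for _ in range(30)]
    scores[name] = np.mean([e[P300_WINDOW].mean() for e in epochs])

guess = max(scores, key=scores.get)
print(scores)
print("Attacker's best guess:", guess)   # typically "Bank C"
```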

This graph shows the P300 signal that results from a target stimulus versus the signal from a non-target stimulus (Image: Martinovic et al.)

By tracking the P300 brain signal, given off when your brain registers particular kinds of stimuli as meaningful or useful, the researchers found that they were able to consistently reduce the entropy (or uncertainty) in each variable they tested by about 10 to 40 percent, a marked improvement over random guessing. In other words, the subjects were "leaking" information via the BCI that the researchers could then use to work out, say, the bank they used or where they lived.
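For a sense of what a "10 to 40 percent" entropy reduction means, here is a toy calculation (the distributions below are invented for illustration, not taken from the paper): if the first digit of a PIN could be anything from 0 to 9, an attacker starts with log2(10) ≈ 3.32 bits of uncertainty; if the P300 responses make a few digits much more likely, the entropy of the attacker's belief drops, and the percentage drop is the figure reported.

```python
# Toy illustration of "entropy reduction" -- the numbers are made up.
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before the attack: the first PIN digit is equally likely to be 0-9.
prior = [0.1] * 10
h_prior = shannon_entropy(prior)          # log2(10) ≈ 3.32 bits

# After the attack: P300 responses make a few digits much more likely.
posterior = [0.40, 0.20, 0.10] + [0.30 / 7] * 7
h_post = shannon_entropy(posterior)

reduction = 100 * (h_prior - h_post) / h_prior
print(f"entropy before: {h_prior:.2f} bits, after: {h_post:.2f} bits")
print(f"reduction: {reduction:.0f}%")     # ≈ 19% with these made-up numbers
```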

Given the use of social engineering in many "hacks" and the many attempts to harvest private information from social media sites such as Facebook, this study suggests that these devices could leak even more information about you without your knowledge.

This graph shows the performance of the BCI test using three different data-response classification techniques with the dashed line showing the performance of random guesswork (Image: Martinovic et al.)

"The simplicity of our experiments suggests the possibility of more sophisticated attacks," writes the team in their paper on the experiment. "For example, an uninformed user could be easily engaged into 'mindgames' that camouflage the interrogation of the user and make them more cooperative. Furthermore, with the ever increasing quality of devices, success rates of attacks will likely improve."

They also note a much more basic issue: these BCIs store the data they pull from your brain as part of their normal use. The stream of data from the inbuilt EEGs could potentially be exploited by malware, meaning that as things stand, "the development of new attacks can be achieved with relative ease and is only limited by the attacker's own creativity."

The solution, at least for the moment, is simple. If you use a BCI, be careful what you think.

The team presented its paper at the recent Usenix Security conference, held in Bellevue, Washington.

Source: Usenix, via ExtremeTech

8 comments
Mark A
I guess water-boarding can be abandoned now.
dsiple
ditto Mark. I was just about to order a few hundred for guantanamo
B. Stott
Now, knowing that by the time we, the public, sees or knows about things it has been working for a decade or more. What is this really telling us?
ralph.dratman
This sounds like a modified lie-detector technique to me. That, too, uses an EEG.
MintHenryJ
Taking things out is one thing. The obvious next step is putting things in while rearranging what's already there.
EJD1984
Just watch the 80s movie Brainstorm, and you'll see the dangers of a technology like this.
Mark Keller
Small steps away from recording dreams/thoughts and replaying them on displays and then eventually inputting them in the mind itself.
Andre de Guerin
This is how "The Matrix" started...