Researchers demonstrate first backdoor "hack" into the human brain
Once the preserve of science fiction, brain-computer interfaces (BCIs) have advanced to the point where they can even be found in novelty headwear, which makes the achievement of an international team of scientists all the more frightening. Using an off-the-shelf Emotiv BCI costing only a few hundred dollars, the team has shown that it's possible to "hack" a human brain and pull information like bank details straight out of your skull.
For their experiment, researchers from the Universities of Oxford, Geneva and California (Berkeley) called in a group of Computer Science students. The students knew they were part of a security-related experiment but did not know its objectives, or that they were being "hacked." Each student put on an Emotiv BCI and was seated in front of a computer that displayed a series of images such as maps, banks, card PINs, and so on.
This graph shows the P300 signal that results from a target stimulus versus the signal from a non-target stimulus (Image: Martinovic et al.)
By tracking the P300 signal, which the brain gives off when it registers a particular stimulus as meaningful or useful, the researchers found that they could consistently reduce the entropy (a measure of uncertainty) of each variable they tested by roughly 10 to 40 percent, a marked improvement over random guessing. In other words, the subjects were "leaking" information via the BCI that the researchers could then use to work out, say, the bank they used or where they lived.
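To get a feel for how an involuntary response can shrink an attacker's uncertainty, here is a toy simulation in Python. It is not the researchers' actual pipeline: the signal model, the number of trials, the softmax scoring, and the candidate set (ten possible PIN digits) are all invented for illustration. The only idea borrowed from the paper is that the target stimulus evokes a slightly larger average response, which lets the attacker's probability estimate concentrate on it, lowering entropy below the uniform-guessing baseline.

```python
import math
import random

def simulate_p300_attack(n_candidates=10, target=3, trials=50, seed=0):
    """Toy P300-style side channel (illustrative values, not real EEG data).

    Each candidate stimulus is "flashed" `trials` times; the target evokes a
    slightly larger mean response. The attacker scores candidates by average
    response amplitude and converts scores to a crude probability estimate.
    """
    rng = random.Random(seed)
    scores = []
    for c in range(n_candidates):
        # Baseline noise, plus an extra evoked component for the target only.
        responses = [rng.gauss(1.0 if c == target else 0.0, 1.0)
                     for _ in range(trials)]
        scores.append(sum(responses) / trials)
    # Softmax turns the scores into a probability distribution over candidates.
    exps = [math.exp(4 * s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = simulate_p300_attack()
uniform_entropy = math.log2(10)   # about 3.32 bits before any measurement
attack_entropy = entropy(probs)   # noticeably lower after the "attack"
```

With enough repeated presentations, the candidate with the highest estimated probability is the true target, and the entropy of the attacker's distribution drops well below the 3.32-bit uniform baseline — the same qualitative effect as the 10 to 40 percent reductions reported in the study.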
Given the role of social engineering in many "hacks," and the many attempts to harvest private information from social media sites such as Facebook, the study suggests that these devices could leak even more information about you without your knowledge.
This graph shows the performance of the BCI test using three different data-response classification techniques with the dashed line showing the performance of random guesswork (Image: Martinovic et al.)
"The simplicity of our experiments suggests the possibility of more sophisticated attacks," writes the team in their paper on the experiment. "For example, an uninformed user could be easily engaged into 'mindgames' that camouflage the interrogation of the user and make them more cooperative. Furthermore, with the ever increasing quality of devices, success rates of attacks will likely improve."
They also note a much more basic issue: these BCIs store the data they pull from your brain as part of their normal operation. The stream of data from the built-in EEG sensors could potentially be exploited by malware, meaning that as things stand, "the development of new attacks can be achieved with relative ease and is only limited by the attacker's own creativity."
The solution, at least for the moment, is simple. If you use a BCI, be careful what you think.
The team presented their paper at the recent Usenix Security conference, held in Bellevue, Washington.