
Human biology inspires "unbreakable" encryption

The human body has inspired a new form of digital security (Image: Shutterstock)

Researchers at Lancaster University, UK, have taken a cue from the way the human lungs and heart constantly communicate with each other to devise an innovative, highly flexible encryption algorithm that they claim can't be broken using traditional methods of cyberattack.

Information can be encrypted with an array of different algorithms, but the question of which method is the most secure is far from trivial. Such algorithms need a "key" to encrypt and decrypt information, and they typically generate that key using a well-known set of rules that admits a very large, but nonetheless finite, number of possibilities. This means that in principle, given enough time and computing power, prying eyes can always break the code eventually.
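To make the finite-keyspace point concrete, here is a deliberately tiny, purely illustrative sketch in Python (unrelated to the Lancaster scheme): a single-byte XOR "cipher" has only 256 possible keys, so an attacker who can recognize valid plaintext can always recover the key by exhaustive search.

```python
# Toy illustration (not the Lancaster scheme): with a conventional cipher the
# key comes from a finite space, so an attacker who can recognize a valid
# plaintext can always succeed by trying every key.

def xor_cipher(data: bytes, key: int) -> bytes:
    """Trivial single-byte XOR 'cipher' with only 256 possible keys."""
    return bytes(b ^ key for b in data)

secret_key = 173
ciphertext = xor_cipher(b"attack at dawn", secret_key)

# Exhaustive search over the whole (tiny) key space.
for candidate in range(256):
    plaintext = xor_cipher(ciphertext, candidate)
    if plaintext == b"attack at dawn":   # the attacker recognizes the plaintext
        print(f"key recovered by brute force: {candidate}")
        break
```

Real ciphers use keyspaces of 2^128 or more, which makes this search infeasible in practice, but the space is still finite.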

The researchers, led by Dr. Tomislav Stankovski, created an encryption mechanism that can generate a truly unlimited number of keys, which they say vastly increases the security of the communication. To do so, they took inspiration from the physiology of the human body.

In nature, different systems within a living organism often interact with each other, exchanging matter and energy. The interaction between two such systems (for instance, that of the human lungs and heart) can be described by a so-called "coupling function."

Rather than relying on a single system for the encryption, the researchers decided to use two, and to use the coupling function between them as the encryption key. Although somewhat laborious, this method has the advantage of creating an infinite number of possible keys, meaning that eavesdroppers cannot simply brute-force their way into sensitive information.
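To give a flavor of what a coupling function looks like in practice, here is a minimal Python sketch of two coupled phase oscillators, loosely in the spirit of cardiorespiratory coupling. The functional form, parameters, and frequencies below are illustrative assumptions, not the equations used in the Lancaster paper.

```python
import numpy as np

def coupling(theta_1, theta_2, a=0.3, b=0.1):
    """Illustrative coupling function: how oscillator 2 influences oscillator 1,
    expressed in terms of both phases."""
    return a * np.sin(theta_2 - theta_1) + b * np.sin(2.0 * theta_2)

def simulate(omega_1=1.0, omega_2=1.3, dt=0.01, steps=5000):
    """Euler integration of two mutually coupled phase oscillators."""
    theta_1, theta_2 = 0.0, 0.5
    trace = np.empty((steps, 2))
    for k in range(steps):
        d1 = omega_1 + coupling(theta_1, theta_2)   # oscillator 1 driven by 2
        d2 = omega_2 + coupling(theta_2, theta_1)   # and vice versa
        theta_1 += dt * d1
        theta_2 += dt * d2
        trace[k] = theta_1, theta_2
    return trace

phases = simulate()
print(phases[-1])   # final phases of the two interacting systems
```

The "key" in this picture is not a number but the shape of the coupling function itself, and there are infinitely many possible shapes to choose from.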

The encryption algorithm is modular and can handle several signals at once (Image: Lancaster University)

For the more technically minded, here's how it all works. An information signal enters the transmitter and is used to set a parameter of the coupling function between two self-sustained systems, both generated inside the transmitter. The outputs of those two systems are then sent over the public channel. At the other end, the receiver's own systems synchronize with the incoming signals and, using a private key that contains information on the coupling functions, the algorithm infers the original parameters and decrypts the information.
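As a rough, hypothetical sketch of that flow (not the authors' implementation, which relies on dynamical Bayesian inference over multiple coupling functions), the toy below encodes a single value as the amplitude of an assumed sinusoidal coupling function, transmits the two oscillator signals, and lets the receiver, which knows the coupling form and natural frequencies as its private key, recover the value by least squares.

```python
import numpy as np

# Hypothetical end-to-end toy, only to make the flow above concrete.
# A single coupling amplitude carries the "message"; the receiver recovers it
# using knowledge of the coupling form and natural frequencies (the private key).

OMEGA_1, OMEGA_2 = 1.0, 1.3          # part of the private key in this toy
DT, STEPS = 0.01, 4000

def transmit(message_value):
    """Encode the message as the coupling amplitude and emit both signals."""
    theta_1, theta_2 = 0.0, 0.5
    trace = np.empty((STEPS, 2))
    for k in range(STEPS):
        c = message_value * np.sin(theta_2 - theta_1)   # assumed coupling function
        theta_1 += DT * (OMEGA_1 + c)
        theta_2 += DT * (OMEGA_2 - c)
        trace[k] = theta_1, theta_2
    return trace                                        # sent over the public channel

def receive(trace):
    """Recover the coupling amplitude from the public signals via least squares."""
    theta_1, theta_2 = trace[:, 0], trace[:, 1]
    d_theta_1 = np.diff(theta_1) / DT                   # observed rate of system 1
    basis = np.sin(theta_2[:-1] - theta_1[:-1])         # known coupling form (private key)
    residual = d_theta_1 - OMEGA_1                      # remove the known natural frequency
    return np.sum(residual * basis) / np.sum(basis ** 2)

estimate = receive(transmit(0.42))
print(f"recovered message value: {estimate:.3f}")       # ~0.420
```

In this toy the recovery works because the receiver knows the exact model; the security argument is that an eavesdropper without the coupling-function "key" faces an unbounded space of possible functional forms rather than a finite list of keys.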

The scientists say that their method is highly resistant to noise, that it can easily transmit several signals at once, and that it is highly modular, making it suitable for a wide range of applications.

A paper describing this patent-pending algorithm appears in the journal Physical Review X.

Source: Lancaster University

5 comments
Anne Ominous
The paper makes it clear that the system is resistant to *conventional* cryptographic attacks. This does not mean it is resistant to attack.
Further, it appears that what it boils down to is an incredibly complex method of creating obfuscated signals from a relatively simple shared encryption key.
The paper glosses over the inherent weakness of having to share a private key in the first place. A system that shares this same weakness, but is otherwise cryptographically very strong and vastly simpler, is the one-time pad. Like the system described in this paper, the key for a one-time pad is *effectively* unbounded, but the mechanism is far simpler.
The only benefit this system appears to have over a one-time pad is that the shared key can be simple and small. That is all.
The authors do not say, but appear to imply, that unlike a one-time pad this key can be re-used securely. I am skeptical.
TheLegendOfGeorgeBurns
While interesting, you have to wonder what performance impact this has on applications. The most advanced encryption technique I'm aware of is from http://porticor.com with their two-key encryption technology. While the theory isn't new (look at banks and safety deposit boxes), they have made it work so there isn't a performance impact on the server or applications. Their technology has been out for several years and is actually being used. While science can mimic nature, it takes a lot of work to get it to work as efficiently as nature does. I hope in time their system can work just as well.
piperTom
We already live in a world where keys are numbered in spaces of ten to the power ninety, one hundred, or even more. So claims like "truly unlimited number of keys" and "an infinite number of possible keys" must be extreme, even if we allow for hyperbole. (ummm... "truly"?) If your key can be transmitted or stored, it's not a member of an infinite set.
The one-time pad can get away with such claims, because the length of the key is the length of the message -- a brute-force attack simply produces all possible messages of the same length. Making claims of infinity in practical matters denotes a lack of seriousness...
ඊ▲ Ferasdour ▲ඊ
Looking over the white paper, some of it almost seems like an April Fools' joke. But reading over the whole thing, I gotta ask, and maybe someone on here can help me resolve this:
A. Would this prove functional with hex or base 16 on top? As it says it transmits through binary, it shouldn't have issues with this if it's not picky about it anyway, right?
B. I see they mention several times that it is time-sensitive and must be synchronized, but did I just skip over how it says to properly sync them? I mean, will this rely on synchronized calls the entire way through, and why make it so dependent? It may be risking availability for confidentiality. Not that anyone has a problem with that anymore, just a note. I mean, let's transmit UDP because it's faster: oh wait, this encryption system missed a step, now both systems are off, oh snap, what now?
C. To actively implement this, from a programming viewpoint maybe, how would you discern the correct client for the providing source? Even if an implementation was set up for resending lost data, how would this appropriately stop MITM attacks, specifically if they monitor any agreements, handshakes, or otherwise the encrypted data in this regard?
D. Some of the portions conclude that it's modular, but I'm just not seeing a big push on how that modularity would be advantageous to the encryption. (I refer back to my April Fools' joke reference.)
dave be
I can't do it all, but I can help on a few of those things. A. Doesn't matter; hex on computers is done for human eyes. It's easier for us to group long strings of binary together into hex groups, is all. It all boils down to binary to the computer.
B. Most of our encryption standards in common use require synchronization. Change your computer time and notice that several things break: SSL for websites may not work, file encryption, etc.
All the one-time pad promotion is just silly. You can't implement a one-time pad for mass-produced gadgets. If you did, you'd wind up with an inherent insecurity anyway when people needed to 'refill' their encryption. The one-time pad data needs to be an infinite string, or at least as long as your message, and non-repeating. It also needs to be distributed, so once distributed you have a potential weakness there, since the pad information can be recovered. Supplying unique pads to every gadget becomes impractical to implement and requires a massive workload to index clients from the server.
In short, it's as impractical as it ever was. This creates the need for new methods like the one proposed above.