Quantum Computing

Landmark hot qubit research promises bigger, cheaper quantum computers

Dr Henry Yang and Professor Andrew Dzurak of UNSW, with the kind of large, expensive dilution refrigerator currently used to supercool quantum computing qubits. Their new hot qubit technology could remove the need for such bulky, expensive cooling systems

Traditional computers, which perform their wonderfully quick calculations using billions of simple on/off transistors organized into logic gates, have spent the last half-century getting faster and faster. For decades we could reasonably expect the number of transistors on a chip to double every couple of years, with the cost per transistor halving – the famous "Moore's Law." To keep that pace, transistors have become ever smaller, and thus faster, to the point where human manufacturing ingenuity has run up against a hard physical obstacle.

The latest transistors are so small that they can no longer reliably control the flow of electrons. At distances measured in just a few atomic widths, electrons can "quantum tunnel" – effectively disappearing on one side of a transistor's barrier and reappearing on the other, or hopping to an adjacent path – causing all sorts of computing errors. So transistor-based chips can't shrink much further, and this physical boundary threatens to grind processor development to a halt.

Quantum computing appears to be a promising solution, using the extraordinary weirdness of quantum-scale physics to unlock a new path forward. Instead of a transistor bit, which either lets electrons through or doesn't, quantum "qubits" use nano-scale physical properties to express their states: the up or down spin of an electron, for example, or the horizontal or vertical polarization of a photon – these become your ones and zeroes.

And where a transistor-based bit can only be open or shut, 1 or 0, a qubit takes advantage of superposition – effectively existing in both states at once, with any weighting of probability between the two. Like Schrödinger's cat, a qubit is only forced to collapse into a single 1 or 0 when it's measured. While in superposition, qubits can explore many computational paths simultaneously, firing computing into a probability-based dimension that will be vastly superior for a certain subset of tasks.
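A toy sketch can make superposition and collapse concrete. The snippet below is purely illustrative (not a real quantum simulator): it models one qubit as a pair of amplitudes whose squared magnitudes give the measurement probabilities, and shows that each individual measurement still yields a single 0 or 1.

```python
import random

def measure(amplitude_0: float, amplitude_1: float) -> int:
    """Collapse the superposition: return 0 or 1 with Born-rule probabilities."""
    p0 = amplitude_0 ** 2  # probability of reading 0 (amplitudes assumed real)
    return 0 if random.random() < p0 else 1

# An equal superposition – the "both states at once" the article describes.
a0 = a1 = 2 ** -0.5  # |a0|^2 = |a1|^2 = 0.5

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a0, a1)] += 1

print(counts)  # roughly an even split between 0 and 1
```

Each run of `measure` returns a definite bit; the superposition only shows up in the statistics across many measurements.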

It's brain-bending stuff, but the upshot is that multi-qubit computers become exponentially more powerful than transistor-bit computers as you scale them up: n qubits can occupy a superposition spanning 2^n states. IBM's publicly accessible 5-qubit computer works across 2^5 = 32 states, and its 2017 16-qubit prototype chip spans 2^16 = 65,536. A full-scale quantum computer could almost instantly process certain complex database operations that might take a regular computer weeks, or years.
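The exponential scaling above is simple to check numerically. A quick sketch:

```python
# A register of n qubits can occupy a superposition over 2**n basis states,
# so capacity doubles with every qubit added.
def state_space(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (5, 16, 50):
    print(f"{n:>2} qubits -> {state_space(n):,} basis states")
# 5 qubits give 32 states and 16 give 65,536, matching the IBM figures above.
```

By 50 qubits the state space already exceeds a quadrillion, which is why classical simulation of quantum machines becomes intractable so quickly.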

IBM's 2017-model, 16-qubit quantum processor, seen here encased in a cryogenic chamber

Indeed, they might even prove dangerous genies to let out of the bottle, as their monstrous speed could break public-key encryption, the strongest privacy tool currently in widespread use, without breaking a sweat. But their incredible and unique powers will also enable next-level, monstrously complex simulations to be run with ridiculous numbers of variables and huge swathes of data. There's no doubt they'll be a hugely valuable tool for humanity.

Dealing with heat: The quantum quandary

One of the key issues with current superconductor-based quantum computer designs is that they need to be kept incredibly cold to operate, just 0.1 Kelvin. That's -273.05 °C, or -459.5 °F – temperatures so cold they make no sense in conventional scales. The energy of just a tiny amount of heat is enough to throw these little quantum superpositions out of whack. And computers, as we all know, tend to run hot, particularly the harder you ask them to work.

Figuring out ways to suck all heat and motion out of these tiny qubits has become a massive challenge in its own right. The current-generation quantum computers run by IBM, Google and the like use big, complex and expensive "dilution refrigerators," and while these are manageable solutions for prototypes with few qubits, they scale up very poorly: qubits placed close to one another quickly heat each other up, so they must be spaced far apart, each with its own cable running in and out to carry information.

A commercial-scale quantum computer would need millions of qubits, so with this generation of cooling tech, you'd need an enormous building to house a single computer – much like the early days of valve and transistor based computers – and millions upon millions of dollars worth of cooling gear. Not acceptable.

Other efforts to cool things at this tiny scale have used quantum weirdness to evacuate heat, like encouraging electrons to quantum-tunnel through nano-scale barriers, taking tiny amounts of heat with them as they go, as demonstrated by a Finnish team in 2017.

The Finnish Aalto University team's chip contains two parallel superconducting oscillators, connected to quantum-circuit refrigerators that use tunnelling electrons to reduce energy and cool down the systems

But if you could design a qubit that could run hotter, even just a tiny bit hotter, you could vastly decrease the size and expense of the cooling gear, and group qubits much closer together. You could even place qubits right alongside traditional silicon electronics, opening the path to much smaller, much cheaper quantum computers. And this, folks, is what a Sydney team from the University of New South Wales says it's achieved.

The team has found a way to embed qubits in silicon quantum dots, creating a silicon quantum processor. These qubits are controlled using electrically driven spin resonance, and they can be measured, or "read," using quantum tunneling of electrons between the quantum dots.

At this stage, the team has demonstrated the technology using a simple two-qubit processor, and the results are impressive. At 1.5 kelvin (-271.65 °C/-457 °F), these silicon qubits maintained quantum superposition for two microseconds – about as long as qubits last in a 0.1 K dilution-refrigerated system.

On the human temperature scale, these "hot qubits" at 1.5 K are still horrifically cold, but when you're so close to absolute zero, that's 15 times more acceptable heat energy you can get away with. Cooling something to 1.5 K is much, much easier than cooling it to 0.1 K, and the flow-on effects are enormous. The UNSW hot qubits can be grouped much more tightly using vastly smaller and cheaper cooling systems, and can work in close proximity to traditional electronics controlling read/write activities.
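A quick back-of-the-envelope check of the temperature figures quoted above – the kelvin conversions and the factor-of-15 jump in thermal budget – illustrative arithmetic only:

```python
# Convert the article's operating temperatures from kelvin to conventional scales.
def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9 / 5 - 459.67

for k in (0.1, 1.5):  # old dilution-fridge target vs the new "hot qubit" regime
    print(f"{k} K = {kelvin_to_celsius(k):.2f} \u00b0C = {kelvin_to_fahrenheit(k):.2f} \u00b0F")

# Thermal energy scales linearly with absolute temperature, hence "15 times".
print(f"Thermal energy budget ratio: {1.5 / 0.1:.0f}x")
```

The linear scaling of thermal energy with absolute temperature is why a move from 0.1 K to 1.5 K, tiny on a human scale, is such a big deal for qubit engineering.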

“Our new results open a path from experimental devices to affordable quantum computers for real world business and government applications,” says UNSW Professor Andrew Dzurak, lead researcher on the project. “While difficult to appreciate using our everyday concepts of temperature, this increase is extreme in the quantum world.”

This hot qubit research has been validated by a second team of researchers in the Netherlands, with both papers published back-to-back in the current edition of the journal Nature.

These embryonic quantum computers, the UNSW team says, won't need specialist manufacturing plants once they're scaled up into chips with millions of qubits on them. Production should be possible using the same gear currently used in existing silicon chip factories.

While commercial-scale quantum computers are still a distant hope, this research appears to clear a significant obstacle and bring them much closer to reality.

The UNSW and Delft University of Technology papers are published in the journal Nature.

Professor Dzurak explains the team's hot qubit and its implications in the terrific video below.

Hot Qubits: major quantum computing constraints overcome

Source: UNSW
