Quantum computing breakthrough: Qubits made from standard silicon transistors
In what could prove a major breakthrough for quantum computing, researchers from the University of New South Wales (UNSW) in Australia have for the first time built the fundamental building blocks of a quantum computer in silicon. The device was created using standard manufacturing techniques, by modifying current-generation silicon transistors, and the technology could scale up to include thousands, even millions of entangled quantum bits on a single chip. Gizmag spoke to the lead researchers to find out more.
What are quantum computers for?
Quantum computers are a peculiar beast. While the machines we've built since the 1950s have aimed to be as deterministic and reliable as possible – so a given input always produces the same output – a quantum computer turns this dynamic on its head, sacrificing predictability for (sometimes) incredible speedups.
A quantum bit, or qubit, has two awesome and confusing properties. First, it can set itself to both 0 and 1 at the same time. And second, it can commune (or entangle) with other qubits to compound this ability. This means five entangled qubits can store and process as much information as 32 (two to the power of five) classical bits; 10 qubits can do as much as 1,024 classical bits; and 300 fully entangled qubits can manipulate as many classical bits of information as there are atoms in the Universe.
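This exponential scaling can be sketched in a few lines of Python (the function name is our own, for illustration – the point is simply that the state of n entangled qubits is described by 2 to the power of n values):

```python
# Illustrative sketch (not from the article): the state of n entangled
# qubits is described by 2**n amplitudes, which is why 5 qubits
# correspond to 32 classical values and 10 qubits to 1,024.

def state_vector_size(n_qubits):
    """Number of amplitudes needed to describe n entangled qubits."""
    return 2 ** n_qubits

for n in (5, 10, 300):
    print(n, "qubits ->", state_vector_size(n), "amplitudes")
```

Running this shows why 300 qubits is such a striking figure: the printed number for n = 300 has over 90 digits, more than current estimates of the number of atoms in the observable Universe.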
You might think this would lead to much faster number-crunching over a regular computer – and you'd be right, to a point. A quantum computer can perform any operation a classical computer can, but its exponential speedups only take effect when a quantum algorithm can process data in a massively parallel fashion, such as searching through a very large database, virtually designing a new drug by choosing among quadrillions of possible combinations, or simulating the behavior of every single atom in your right toe. However, if the bulk of operations has to be performed in a sequential order, flowchart-style, then no real quantum speedups are possible.
The downside to these significant speedups is that due to quantum effects, the results returned by a quantum algorithm are not deterministic. That is, even in the best of cases, a quantum computer is never guaranteed to return the correct result.
This usually means that a quantum algorithm must be run several times in succession to confirm that the solution is correct. So, in practice, classical computers will probably be faster and more practical than quantum computers for day-to-day operations, and quantum computers will only come in useful where massive parallelism is involved. When they are let loose, though, their speed will be spectacular.
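The "run it several times" idea above can be illustrated with a toy sketch (entirely our own – the probabilities and the answer 42 are made up): a stand-in for a probabilistic quantum subroutine is wrapped in a classical loop that repeats it and takes the majority vote.

```python
import random
from collections import Counter

def noisy_quantum_search(p_correct=0.8, correct_answer=42):
    """Stand-in for one run of a probabilistic quantum algorithm:
    returns the right answer only with probability p_correct."""
    if random.random() < p_correct:
        return correct_answer
    return random.randrange(100)  # an arbitrary wrong answer

def run_until_confident(trials=25):
    """Classical wrapper: repeat the noisy subroutine and keep the
    answer that appears most often."""
    results = Counter(noisy_quantum_search() for _ in range(trials))
    answer, _ = results.most_common(1)[0]
    return answer

print(run_until_confident())
</```

Because wrong answers are scattered while the correct one recurs, the majority vote converges on the right result with overwhelming probability – which is the practical workaround for the non-determinism described above.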
Most of the prototype quantum computers developed so far feature a limited number of entangled qubits made from exotic and expensive materials such as cesium or diamond, and, to reduce external interference, they need to be cooled to just a few thousandths of a degree above absolute zero.
However, researchers at UNSW are focusing on the potentially revolutionary approach of building quantum computers out of silicon, a material that is cheap, well-known by the industry, and which could ultimately pave the way for quantum computers with not 300 but thousands, even millions of fully entangled qubits.
Last year, UNSW scientists were able to create single "CMOS type" qubits that leveraged current transistor technology and silicon-28, the most common isotope of silicon, to achieve a very high fidelity of 99.6 percent for quantum operations. Now, the researchers have built on this to create what's known as a CNOT quantum logic gate. Together with a single controllable qubit, this is the basic building block of a quantum computer, and paves the way to quantum chips that can perform just about any operation.
The scientists built this logic gate by taking two standard transistors, sitting next to each other, and reconfiguring them so each would hold only a single electron. The spin of the electron encodes a 0 or 1, and an external current and microwave field control the qubits and make them interact as needed.
"A CNOT gate is a [...] two-qubit gate [that] flips the state of the target qubit depending on the state of the control qubit," lead author of the paper Menno Veldhorst told Gizmag. "In our case, the target qubit flips its spin if the control qubit is pointing down. If the control qubit is pointing up, the target qubit will remain in the same state.
"This two-qubit gate is most essential for a quantum computer and together with single qubit operations, which we have already demonstrated with very high fidelity, provides what is called a universal gate set. This means that any gate set can be constructed out of [it]."
Although their quantum computers wouldn't work at room temperature, this approach lets the researchers operate their device at approximately 1 kelvin (-272° C, -458° F). That may not seem like much of an improvement over previous designs, but, the researchers told us, recent advances in cooling technology have resulted in fridges that can easily be operated at these temperatures.
The real power of this breakthrough is not in a slightly higher operational temperature, but in the fact that these basic building blocks of quantum computers were built through simple modifications to current-generation silicon transistors. The researchers say they have worked out a way to extend this technique to a much larger number of qubits, numbering in the thousands or even millions, all fully entangled.
"Our team is looking for industrial partners to construct a chip that would contain between tens and hundreds of qubits, and which uses the silicon-CMOS technology used today for most computer processor chips," lead researcher Andrew Dzurak told us. "This prototype manufacture would be done in a Si-CMOS foundry with wafer-scale manufacture, so that we can demonstrate a manufacturing process that can be scaled up to the thousands or millions of qubits.
"I believe that a Si-CMOS qubit prototype containing between tens and hundreds of qubits could be made within five years, provided we have the right level of investment and the right industry partners. Our main aim is to develop a prototype that can demonstrate that it is possible to go all the way with 'Quantum CMOS' and make a full-scale quantum processor. That final stage is likely to take 10-20 years."
Such a powerful quantum computer would have major implications for the finance, data security, and health industries. But perhaps one of the most interesting applications of all, and the one advanced by Richard Feynman decades ago when he first proposed the idea of a quantum computer, would be to conduct virtual experiments simulating the behavior of atoms and particles in unusual conditions, such as at the very high energies we can only recreate in the Large Hadron Collider, without actually performing the experiment.
The advance was published today in the journal Nature. Dzurak and Veldhorst further comment on the implications of this breakthrough in the video below.
It will happen, but multiply the time estimates by at least 2. (And with all due respect to the author, don't expect to keep track of all the atoms in your toe...) But I agree with Joseph Mertens' comment above (or below?) mine: the future of computing is in using analog (neural network-type) processes to get approximate answers to complex optimization problems, and then using digital (von Neumann-type) computers to finish the job.
Your calculation assumes no QEC, but this wouldn't make sense; quantum error correction is key for any gate-based quantum chip.
Literally thousands of papers have been published on this topic.
For instance, for another architecture, the surface code error correction scheme only requires 99% gate fidelity to allow for scale-out.
As to the temp, the D-Wave machines already operate at this level.
@Jayna_Sheats I think it is worth noting that current mass wafer production lithography is down to about 10 nm, I believe. My brother works on the machines, and I am pretty sure they are getting close to single-digit layers, though this is nearing the limit of current technology. I think 1-3 nm is very possible within the 10 years or so discussed in the article.