Before the dream of quantum computing is realized, a number of inherent problems must first be solved. One of these is the ability to maintain a stable memory system that overcomes the intrinsic instability of the basic unit of information in quantum computing – the quantum bit or "qubit". To address this problem, physicists working at the University of California, Santa Barbara (UC Santa Barbara) claim to have created breakthrough circuitry that continuously self-checks for errors in order to keep the quantum memory in an error-free state.
Vulnerability to environmentally-induced errors – such as cosmic ray events or an unexplained collapse of quantum coherence – means that the information contained in a qubit is easily lost. And because of the quantum entanglement required to encode the qubit in the first place, any attempt to replicate the information will also immediately destabilize it.
"One of the biggest challenges in quantum computing is that qubits are inherently faulty," said Julian Kelly, graduate student researcher at the John Martini physics lab at UC Santa Barbara. "So if you store some information in them, they’ll forget it." Rather than attempt to maintain the integrity of a qubit by, say, trapping it in an isotope of silicon, the UC Santa Barbara team has instead opted for an algorithm-based approach.
Unlike conventional computers, which use binary data storage where a bit can only ever be in one of two states (one or zero), quantum computers use what is known as "superposition": the data contained in a qubit can be 0, 1, or any weighted combination of both states simultaneously.
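To make that idea concrete, here is a minimal plain-Python sketch (an illustration only, not anything used by the researchers) of a qubit state written as two complex amplitudes, along with the probability of reading 0 or 1 when it is measured:

```python
import math

# A qubit state |psi> = a|0> + b|1> can be described by two complex
# amplitudes (a, b). Measuring it yields 0 with probability |a|^2 and
# 1 with probability |b|^2, and the superposition is destroyed.

# Equal superposition of 0 and 1 (illustrative values only)
a = complex(1 / math.sqrt(2), 0)
b = complex(1 / math.sqrt(2), 0)

print(f"P(measure 0) = {abs(a) ** 2:.2f}")  # 0.50
print(f"P(measure 1) = {abs(b) ** 2:.2f}")  # 0.50
```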
However, whilst this property of qubits is distinctly advantageous in terms of computational power, it is also the trait that renders qubits prone to "flipping" (randomly changing state), especially in unstable environments, and thus difficult to work with.
"It’s hard to process information if it disappears," said Kelly.
To help solve this problem, the new error detection and correction process makes several qubits operate together to preserve the information, with the same data stored simultaneously across a number of qubits.
"… the idea is that we build this system of nine qubits, which can then look for errors," said Kelly. "Qubits in the grid are responsible for safeguarding the information contained in their neighbors in a repetitive error detection and correction system that can protect the appropriate information and store it longer than any individual qubit can."
This indirect approach is necessary because qubits exist in a delicate quantum state: much as you can know a particle's position or measure its momentum, but not both, any direct measurement causes the qubit to decohere into a random state.
"You can’t measure a quantum state, and expect it to still be quantum," said UC Santa Barbara postdoctoral researcher Rami Barends. "The very act of measurement locks the qubit into a single state and it then loses its superpositioning power."
To extract error information without destroying the data, UC Santa Barbara staff scientist Austin Fowler applied what is termed a "surface code". By repeatedly measuring each measurement qubit after it has interacted with its nearest-neighbor data qubits on a matrix, changes in the measured value indicate the presence of chains of errors in space and time.
In other words, the code uses parity information to detect any deviation from the original data. If the parity of the state applied to a set number of qubits is "even" and these qubits are then transmitted elsewhere in the system, any change to that parity will be seen by comparing the state of the original and transmitted qubits.
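As a loose illustration of that parity idea (and emphatically not the team's surface-code hardware or software), the short plain-Python sketch below treats the stored values as ordinary classical bits and shows how comparing pair-wise parities before and after an error reveals a flipped bit without ever reading the protected value directly:

```python
import random

# Highly simplified classical sketch of parity-based error detection.
# Data "qubits" are plain bits here; the checks report only the parity
# of neighbouring pairs, never the stored values themselves.

def parities(data):
    """Parity of each neighbouring pair: 0 = even (agree), 1 = odd (disagree)."""
    return [data[i] ^ data[i + 1] for i in range(len(data) - 1)]

# Encode a single logical bit redundantly across several data bits
logical_bit = 1
data = [logical_bit] * 5
reference = parities(data)              # all parities are even to start with

# A stray "bit-flip" error hits one data bit at random
data[random.randrange(len(data))] ^= 1

# Comparing parities against the reference reveals where the error sits,
# without the logical value ever being read directly
syndrome = [p ^ r for p, r in zip(parities(data), reference)]
print("changed parities:", syndrome)    # e.g. [0, 1, 1, 0] brackets the flipped bit
```

In the actual device, the parity checks are themselves quantum measurements performed by dedicated measurement qubits, so the data qubits holding the protected information are never read out directly.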
This parity-based approach is different to the standard way of checking data in a computer, which involves duplicating the original data to look for errors – an impossible task in quantum computing, because the qubits must remain unobserved to maintain their integrity.
"So you pull out just enough information to detect errors, but not enough to peek under the hood and destroy the quantum-ness," said Kelly.
So far, the research has shown that the system is capable of negating a "bit-flip" qubit error, but the team is hoping to next confront other qubit decoherence problems, such as the complementary "phase-flip" error.
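For readers curious about the difference between the two, the small sketch below (again just an illustrative plain-Python analogy, not the group's code) shows how each error type acts on a qubit's two amplitudes: a bit-flip swaps the 0 and 1 amplitudes, while a phase-flip reverses the sign of the 1 component:

```python
import math

# Qubit state as a pair of amplitudes (a, b) for |0> and |1>
state = (1 / math.sqrt(2), 1 / math.sqrt(2))   # equal superposition

def bit_flip(s):
    """Pauli-X error: swaps the |0> and |1> amplitudes."""
    a, b = s
    return (b, a)

def phase_flip(s):
    """Pauli-Z error: flips the sign of the |1> amplitude."""
    a, b = s
    return (a, -b)

print("bit-flip:  ", bit_flip(state))    # amplitudes swapped
print("phase-flip:", phase_flip(state))  # (0.707..., -0.707...)
```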
Senior researchers from the Martinis group have now also partnered with Google to further develop this technology and pursue research into quantum computing applications.
The team's paper appears in the journal Nature.
Update 20 Apr. 2015: This story originally credited UC Berkeley as the source of and institution responsible for the research. This was incorrect, as the research was actually carried out at UC Santa Barbara. We apologise to readers and to those involved in the research for the error, which has now been corrected.
Source: UC Santa Barbara