IQC Researchers One Step Closer to Prototype Quantum Computer

Wednesday, October 3, 2007

Researchers at the Institute for Quantum Computing (University of Waterloo), in collaboration with MIT, have identified an experimental method to facilitate the design of prototype quantum computers and any other technologies requiring many-body quantum coherence.

This research is reported in "Symmetrized Characterization of Noisy Quantum Processes" by Emerson et al., on page 1893 of the September 28 issue of Science; see also the accompanying Perspective, "Does Our Universe Allow for Robust Quantum Computation?", on page 1876 of the same issue. "The quantum process tomography techniques described here represent a first step toward accurately assessing the powers and limits of these new quantum machines. Indeed, thanks to the techniques developed by Emerson et al., we may soon know whether our universe is generous enough to allow for large-scale robust quantum computation," says David Bacon from the University of Washington.

Precise, coherent control over the quantum dynamics of multi-body systems, such as laser-cooled trapped ions, quantum dots, nuclear spin systems, and superconducting circuits, is an active area of research that holds the promise of new quantum technologies, in particular quantum computation and quantum communication. A major obstacle in this direction is the extreme sensitivity of these systems to the noise or 'decoherence' effects of the environment, as well as other control limitations. However, completely characterizing the decoherence affecting a given experimental arrangement is already infeasible for the number of interacting quantum systems that can be controlled in some of today's best labs.

"This technique provides a much-needed solution to the important problem of efficiently characterizing the degree of experimental control over many-body quantum systems. The significance of this problem became particularly apparent over the course of some very labor-intensive experiments and analysis performed a few years ago at MIT to fully characterize the noise affecting a 3-qubit system," explains Joseph Emerson.

The number of experiments required by existing noise characterization methods, known as quantum process tomography, grows exponentially with the number of coherently coupled quantum subsystems (in particular, the quantum bits or 'qubits'). This is already impractical for the few qubits within reach of today's labs, and an infeasible task for the thousands of qubits required for eventual applications of quantum computation. The proposed technique removes this obstacle by requiring a number of experiments that grows slower than linearly in the number of qubits.
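To see the scaling obstacle concretely (a back-of-the-envelope parameter count, not a figure from the paper): a general quantum process on n qubits is described by d**4 - d**2 independent real parameters, where d = 2**n is the dimension of the state space, so full process tomography must pin down exponentially many numbers.

```python
def process_params(n_qubits):
    """Independent real parameters of a general trace-preserving
    quantum process on n qubits: d**4 - d**2 with d = 2**n."""
    d = 2 ** n_qubits
    return d ** 4 - d ** 2

# The count explodes long before "thousands of qubits":
for n in (1, 2, 3, 10):
    print(n, process_params(n))  # 12, 240, 4032, ...
```

Even at 10 qubits the count exceeds a trillion, which is why a method whose experimental cost grows slower than linearly in the number of qubits matters.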

The features of the decoherence that can be measured by this technique are relevant to selecting optimal quantum error correction algorithms and validating some of the theoretical assumptions of fault-tolerant threshold theorems. Quantum error correction techniques and fault-tolerant threshold theorems were discovered in the 1990s by Peter Shor and others. They proved that if the noise affecting the quantum computer is weak enough and satisfies certain other desirable properties, then the resulting errors in the quantum computation can be corrected and, moreover, this error-followed-by-correction process can continue indefinitely, enabling arbitrarily complex quantum computations.

The problem is to determine whether the noise affecting a given prototype quantum processor satisfies the various assumptions of these theorems. That's where the technique reported by Emerson et al. comes into play: it provides a practical method for measuring the properties of the noise needed to determine which of these theorems apply and the appropriate noise threshold for any prototype system.

The researchers also report in the paper an experimental demonstration of the method, characterizing the robustness of a "quantum memory" consisting of nuclear spins in a crystal lattice controlled by nuclear magnetic resonance techniques. The experiments, performed by Raymond Laflamme's research group at IQC, demonstrated the characterization and optimization of an experimental control sequence designed to reduce the impact of unwanted interactions between the nuclear spins.

The theoretical technique is based on symmetrizing the unknown quantum noise by rapidly applying certain random operations and then their inverses after a controlled delay. The operations must be drawn at random from particular sets of operations that isolate the features of the decoherence that are of interest. The parameters of the symmetrized noise are far fewer in number and can be directly estimated by measuring the effect of the symmetrized noise on specific input states. The possibility that sufficiently random quantum operations were experimentally achievable and could be applied to the task of noise characterization was first proposed in an earlier Science report (Dec. 19, 2003) by two of the authors and other collaborators.
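The symmetrization idea can be illustrated with a single-qubit Pauli twirl. This is a minimal sketch, not the paper's full protocol: the noise model below (a small coherent over-rotation plus dephasing) and its parameter values are invented for illustration. Averaging the conjugation of an arbitrary noise channel over the Pauli group yields a Pauli channel, which maps each Pauli operator to a multiple of itself and is therefore described by a handful of parameters instead of a full process matrix.

```python
import numpy as np

# Single-qubit Pauli operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I2, X, Y, Z]

def apply_channel(kraus, rho):
    """Apply a channel given by its Kraus operators to a density matrix."""
    return sum(K @ rho @ K.conj().T for K in kraus)

# Hypothetical noise (values chosen for illustration): a small coherent
# over-rotation about X combined with weak dephasing.
theta, p_deph = 0.1, 0.05
R = np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * X
kraus = [np.sqrt(1 - p_deph) * R, np.sqrt(p_deph) * Z @ R]

def pauli_twirl(kraus, rho):
    """Average P† E(P rho P†) P over the Pauli group.

    The twirled map is a Pauli channel, so it is characterized by a few
    Pauli eigenvalues rather than a full exponential-size description.
    """
    out = np.zeros_like(rho)
    for P in PAULIS:
        out += P.conj().T @ apply_channel(kraus, P @ rho @ P.conj().T) @ P
    return out / len(PAULIS)

# Read off the symmetrized channel's parameters from three expectation
# values: the coefficient of each Pauli after the twirl.
for P, name in [(X, "X"), (Y, "Y"), (Z, "Z")]:
    lam = np.trace(P @ pauli_twirl(kraus, P)).real / 2
    print(name, round(lam, 4))
```

In an experiment the random operations and their inverses are applied to physical qubits rather than simulated matrices, but the payoff is the same: the symmetrized noise is fixed by a small number of directly measurable quantities.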