Quantum: Music at the Frontier of Science


The debate over the wave versus particle nature of light goes back to the days of Huygens and Newton. When used to model the properties of a quantum system, these concepts lose their objective meaning and simply become the two aspects of wave-particle duality. Duality played a central role in the Bohr-Einstein debates and prompted Bohr to formulate the complementarity principle. Complementarity leaves open the possibility that, by adapting to the specific experimental set-up, a quantum system always behaves definitely either as a particle or as a wave.
The focus of this talk will be a general introduction to Nuclear Magnetic Resonance (NMR) detection schemes based on Superconducting Quantum Interference Devices (SQUIDs) as highly sensitive magnetometers. I will begin with an overview of the relevant concepts and principles behind SQUID-detected NMR. In the main part of my talk I will present our experimental results and achievements in ultralow-field SQUID NMR spectroscopy and Magnetic Resonance Imaging (MRI).
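For a sense of the frequency scales behind "ultralow field", a minimal Python sketch comparing the proton Larmor frequency f = (gamma/2*pi)*B at a conventional MRI field and at an ultralow field; the 1.5 T and 100 microtesla values are illustrative choices, not figures from the talk.

    # Proton Larmor frequency at high vs. ultralow magnetic field.
    # GAMMA_PROTON is the standard proton gyromagnetic ratio over 2*pi;
    # the field strengths below are illustrative assumptions.
    GAMMA_PROTON = 42.577e6  # Hz per tesla

    def larmor_frequency_hz(b_field_tesla: float) -> float:
        """Precession frequency f = (gamma / 2*pi) * B for protons."""
        return GAMMA_PROTON * b_field_tesla

    print(larmor_frequency_hz(1.5))     # ~6.39e7 Hz at a conventional 1.5 T magnet
    print(larmor_frequency_hz(100e-6))  # ~4.26e3 Hz at 100 microtesla

The comparison matters because the voltage induced in a conventional pickup coil scales with frequency and so collapses at kilohertz Larmor frequencies, whereas a SQUID measures magnetic flux directly, with a sensitivity that is essentially frequency-independent.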
In a double-slit interference experiment, the wave function at the screen with both slits open is not exactly equal to the sum of the wave functions with each slit open individually. The three scenarios represent three different boundary conditions, and as such the superposition principle should not be applicable. However, most well-known textbooks in quantum mechanics implicitly or explicitly use this assumption, which is only approximately true.
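Written out, the assumption in question is (the notation is mine: psi_1 and psi_2 are the exact wave functions with only slit 1 or slit 2 open, and psi_12 the exact wave function with both open):

\[
  \psi_{12}(x) \;\approx\; \psi_1(x) + \psi_2(x)
\]

with equality holding only approximately, because the three wave functions solve the same wave equation under three different boundary conditions.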
I'll give a broad overview of my research over the last decade aimed at understanding the relationship between computational complexity and physics, and in particular the capabilities and limitations of quantum computers.
Join us for the next Quantum Frontiers Distinguished Lecture Series, in which Dr. Leo Kouwenhoven will talk about particles that are their own antiparticles.
Nearly 80 years after Schroedinger described entanglement as the quintessential nonclassical phenomenon, and 50 years after Bell showed the inconsistency of quantum correlations with local realism, the quantum information revolution seeks to use entanglement's almost magical properties to enable new feats in information processing. As we shall see, entanglement can now be produced at high rates with exquisite precision, enabling unprecedented tests of nonlocality and such feats as quantum cryptography and teleportation.
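As one concrete instance of the nonlocality tests mentioned above, a short Python sketch evaluates the CHSH combination S for the Bell state |Phi+> = (|00> + |11>)/sqrt(2) at the standard textbook measurement angles (these settings are the generic optimal ones, not those of any particular experiment):

    import numpy as np

    # Spin observable in the x-z plane: A(theta) = cos(theta) Z + sin(theta) X.
    def observable(theta):
        return np.array([[np.cos(theta), np.sin(theta)],
                         [np.sin(theta), -np.cos(theta)]])

    phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

    # Correlator E(a, b) = <Phi+| A(a) tensor B(b) |Phi+>; equals cos(a - b) here.
    def correlator(a, b):
        return phi_plus @ np.kron(observable(a), observable(b)) @ phi_plus

    a0, a1 = 0.0, np.pi / 2          # Alice's two settings
    b0, b1 = np.pi / 4, -np.pi / 4   # Bob's two settings

    S = (correlator(a0, b0) + correlator(a0, b1)
         + correlator(a1, b0) - correlator(a1, b1))
    print(S)  # ~2.828, i.e. 2*sqrt(2), beyond the local-realist bound |S| <= 2

Any local-realist model obeys |S| <= 2; the quantum value 2*sqrt(2) is what high-rate, high-precision entanglement sources let experiments approach.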
Quantum key distribution (QKD) can be implemented in both so-called entanglement-based (EB) and prepare-and-measure (PM) configurations. There is a certain degree of equivalence between EB and PM schemes from the point of view of security analysis, and it has been heavily exploited in the literature over the last fifteen years or so: a given PM protocol is reduced to an equivalent EB protocol (following the BBM92 argument), whose security is then proved.
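To make the EB side of this reduction concrete, here is a toy Python sketch of the sifting step in a BBM92-style entanglement-based protocol. The |Phi+> measurement statistics (matching bases give identical random bits, mismatched bases give independent bits) are hard-coded rather than simulated, and noise, eavesdropping, and all post-processing are omitted:

    import numpy as np

    rng = np.random.default_rng(0)

    # Outcomes of local measurements on |Phi+>, with basis 0 = Z and 1 = X.
    # Equal bases yield identical uniformly random bits; unequal bases
    # yield independent uniform bits. (Statistics hard-coded, not simulated.)
    def measure_pair(basis_a, basis_b):
        if basis_a == basis_b:
            bit = int(rng.integers(2))
            return bit, bit
        return int(rng.integers(2)), int(rng.integers(2))

    key_a, key_b = [], []
    for _ in range(1000):
        ba, bb = int(rng.integers(2)), int(rng.integers(2))
        a, b = measure_pair(ba, bb)
        if ba == bb:              # sifting: keep only matched-basis rounds
            key_a.append(a)
            key_b.append(b)

    print(len(key_a), key_a == key_b)  # ~500 sifted bits, identical keys (noiseless)

The BBM92 observation is that a PM protocol in which Alice prepares the states herself produces exactly the same correlations as this EB picture with Alice measuring her half of the pair first, which is what lets a security proof for the EB protocol carry over to the PM one.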
In 1981, Richard Feynman proposed a device called a “quantum computer” that would harness the laws of quantum physics to achieve computational speed-ups over classical methods. Quantum computing promises to revolutionize how we compute.
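Feynman's motivation can be stated in one line of arithmetic: an n-qubit state is a vector of 2^n complex amplitudes, so merely storing it on a classical machine becomes infeasible well before n gets large, while a quantum device carries that state natively. A minimal sketch (it only prints sizes; nothing is allocated):

    # Classical memory needed to store an n-qubit state vector,
    # at 16 bytes per complex128 amplitude.
    for n in (10, 20, 30, 40):
        print(f"{n} qubits: 2**{n} amplitudes, {16 * 2**n:.3e} bytes")
        # 40 qubits already needs ~1.8e13 bytes (roughly 18 TB)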
The topological color code and the toric code are two leading candidates for realizing fault-tolerant quantum computation. In the talk, I will introduce these two models and show their equivalence in d dimensions. I will describe codes with and without boundaries, and explain what insights one gains in the former case by looking at the condensation of anyonic excitations on the boundaries. I will conclude with a recipe for fault-tolerantly implementing a logical non-Pauli gate in the toric code in d dimensions.
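To make the stabilizer structure underlying both codes concrete, here is a small Python sketch for the simplest case, the two-dimensional toric code on an L x L torus (the indexing conventions are my own, and this toy says nothing about boundaries or the d-dimensional equivalence discussed in the talk). It builds the X-type (star) and Z-type (plaquette) parity-check matrices, verifies that all stabilizers commute, and recovers the two logical qubits of the torus:

    import numpy as np

    L = 4
    n = 2 * L * L  # one qubit per edge of the L x L torus

    # Edge indexing: h_edge(i, j) leaves vertex (i, j) to the right,
    # v_edge(i, j) leaves it downward; all coordinates wrap around.
    def h_edge(i, j):
        return (i % L) * L + (j % L)

    def v_edge(i, j):
        return L * L + (i % L) * L + (j % L)

    H_X = np.zeros((L * L, n), dtype=np.uint8)  # star (vertex) checks
    H_Z = np.zeros((L * L, n), dtype=np.uint8)  # plaquette (face) checks
    for i in range(L):
        for j in range(L):
            s = i * L + j
            # Star at vertex (i, j): the four incident edges.
            for q in (h_edge(i, j), h_edge(i, j - 1), v_edge(i, j), v_edge(i - 1, j)):
                H_X[s, q] ^= 1
            # Plaquette with top-left corner (i, j): the four bounding edges.
            for q in (h_edge(i, j), h_edge(i + 1, j), v_edge(i, j), v_edge(i, j + 1)):
                H_Z[s, q] ^= 1

    def gf2_rank(mat):
        """Rank over GF(2) by forward elimination."""
        m = mat.copy()
        rank = 0
        for col in range(m.shape[1]):
            pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
            if pivot is None:
                continue
            m[[rank, pivot]] = m[[pivot, rank]]
            for r in range(rank + 1, m.shape[0]):
                if m[r, col]:
                    m[r] ^= m[rank]
            rank += 1
        return rank

    # CSS condition: every star shares an even number of edges with every
    # plaquette, i.e. all X-type and Z-type stabilizers commute.
    assert not (H_X @ H_Z.T % 2).any()

    k = n - gf2_rank(H_X) - gf2_rank(H_Z)
    print(k)  # 2: the toric code encodes two logical qubits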