Researchers at IQC have made significant contributions to the Post-Quantum Cryptography standardization process run by the National Institute of Standards and Technology (NIST). As the process enters its fourth round, researchers are one step closer to identifying cryptographic algorithms that will be widely accepted as reliable and safe against attacks enabled by emerging quantum computers.
EvolutionQ, a leading quantum-safe cybersecurity company founded and led by Norbert Lütkenhaus, Executive Director of the Institute for Quantum Computing, and IQC faculty member Michele Mosca, recently announced its latest partnership with SandboxAQ, an enterprise SaaS company. The partnership was announced alongside evolutionQ's Series A round of $7 million in funding, which will help organizations like SandboxAQ prepare for quantum computers.
A single-photon detector and counting module (SPODECT) recently built by Waterloo’s Quantum Photonics Lab for the International Space Station (ISS) will be used to verify quantum entanglement and test its survivability in space as part of the Space Entanglement and Annealing QUantum Experiment (SEAQUE) mission, a collaboration with researchers at the University of Illinois Urbana-Champaign, the Jet Propulsion Laboratory, ADVR Inc., and the National University of Singapore.
Jerry Li - Microsoft Research
In this talk, we consider two fundamental tasks in quantum state estimation, namely, quantum tomography and quantum state certification. In the former, we are given n copies of an unknown mixed state rho, and the goal is to learn it to good accuracy in trace norm. In the latter, the goal is to distinguish whether rho is equal to some specified state, or far from it. When we are allowed to perform arbitrary (possibly entangled) measurements on our copies, the exact sample complexity of these problems is well understood. However, arbitrary measurements are expensive, especially in terms of quantum memory, and impossible to perform on near-term devices. In light of this, a recent line of work has focused on understanding the complexity of these problems when the learner is restricted to making incoherent (a.k.a. single-copy) measurements, which can be performed much more efficiently and, crucially, capture the set of measurements that can be performed without quantum memory. However, characterizing the copy complexity of such algorithms has proven to be a challenging task, and closing this gap has been posed as an open question in various previous papers.
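To make the distinction concrete, here is a toy numpy sketch of state certification for a single qubit using only incoherent (single-copy) measurements: each copy is measured independently in a randomly chosen Pauli basis, and the resulting Bloch-vector estimate is compared against the target state. All function names, the sample count, and the acceptance threshold are illustrative assumptions, not the algorithms discussed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pauli matrices
X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]], complex)
Z = np.array([[1, 0], [0, -1]], complex)
paulis = [X, Y, Z]

def incoherent_pauli_estimates(rho, n):
    """Estimate (tr(X rho), tr(Y rho), tr(Z rho)) from n single-copy
    measurements: each copy is measured in one randomly chosen Pauli basis,
    so no quantum memory is needed across copies."""
    sums, counts = np.zeros(3), np.zeros(3)
    for _ in range(n):
        k = rng.integers(3)
        evals, evecs = np.linalg.eigh(paulis[k])
        # Born-rule outcome probabilities for this single copy
        probs = np.real([evecs[:, j].conj() @ rho @ evecs[:, j] for j in range(2)])
        probs = np.clip(probs, 0, None)
        probs /= probs.sum()
        j = rng.choice(2, p=probs)
        sums[k] += evals[j]
        counts[k] += 1
    return sums / np.maximum(counts, 1)

def certify(bloch_estimate, sigma, eps):
    """Accept iff the estimated Bloch vector is within eps of sigma's."""
    target = np.real([np.trace(P @ sigma) for P in paulis])
    return bool(np.linalg.norm(bloch_estimate - target) <= eps)

# rho actually equals the target |0><0|, so certification should accept.
sigma = np.array([[1, 0], [0, 0]], complex)
est = incoherent_pauli_estimates(sigma, 3000)
print(certify(est, sigma, 0.2))
```

The point of the sketch is the measurement model: each copy is consumed by its own measurement, which is exactly the restriction whose copy complexity the talk addresses; entangled measurements over many copies would need fewer samples but require quantum memory.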
Dynamic qubit allocation and routing for constrained topologies by CNOT circuit re-synthesis
Recent strides in quantum computing have made it possible to execute quantum algorithms on real quantum hardware. When mapping a quantum circuit to the physical layer, one has to consider the numerous constraints imposed by the underlying hardware architecture. Many quantum computers have constraints regarding which two-qubit operations are locally allowed. For example, in a superconducting quantum computer, the connectivity of the physical qubits restricts multi-qubit operations to adjacent qubits. These restrictions are known as connectivity constraints and can be represented by a connected graph (a.k.a. topology), where each vertex represents a distinct physical qubit. When two qubits are adjacent, there is an edge between the corresponding vertices.
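The graph model of connectivity constraints can be sketched in a few lines of Python. The linear topology, qubit count, and helper name below are illustrative assumptions, not taken from the talk.

```python
# Toy connectivity-constraint graph: vertices are physical qubits,
# and a two-qubit gate is locally allowed only across an edge.
# Here we assume a linear topology on 5 qubits: 0-1-2-3-4.
edges = {(0, 1), (1, 2), (2, 3), (3, 4)}

adjacency = {q: set() for q in range(5)}
for a, b in edges:
    adjacency[a].add(b)
    adjacency[b].add(a)

def cnot_allowed(control, target):
    """A CNOT is locally executable only on adjacent physical qubits."""
    return target in adjacency[control]

print(cnot_allowed(0, 1))  # True: neighbours on the line
print(cnot_allowed(0, 4))  # False: must be routed, e.g. via SWAPs or re-synthesis
```

A CNOT between non-adjacent qubits must be routed through the graph, which is precisely the overhead that qubit allocation and CNOT-circuit re-synthesis aim to reduce.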
Andrey Boris Khesin - Massachusetts Institute of Technology
Publicly verifiable quantum money is a protocol for the preparation of quantum states that can be efficiently verified by any party for authenticity but is computationally infeasible to counterfeit. We develop a cryptographic scheme for publicly verifiable quantum money based on Gaussian superpositions over random lattices. We introduce a verification-of-authenticity procedure based on the lattice discrete Fourier transform, and subsequently prove the unforgeability of our quantum money under the hardness of the short vector problem from lattice-based cryptography.
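A classical toy can illustrate the central object, a Gaussian superposition over a random lattice: the (unnormalized) amplitude assigned to a lattice point v is proportional to exp(-pi*||v||^2 / s^2). The sketch below builds such an amplitude vector over a random 2D lattice; the basis range, width parameter, and truncation radius are illustrative assumptions, and this is not the paper's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random full-rank integer basis B for a 2D lattice; lattice points are B @ z
# for integer vectors z.
B = rng.integers(-5, 6, size=(2, 2))
while abs(np.linalg.det(B)) < 0.5:
    B = rng.integers(-5, 6, size=(2, 2))

def gaussian_amplitudes(B, s, radius=6):
    """Gaussian weights exp(-pi*||B z||^2 / s^2) over lattice points B z,
    truncated to |z_i| <= radius and normalized like state amplitudes."""
    zs = [(i, j) for i in range(-radius, radius + 1)
                 for j in range(-radius, radius + 1)]
    pts = np.array([B @ z for z in zs])
    w = np.exp(-np.pi * np.sum(pts**2, axis=1) / s**2)
    return pts, w / np.linalg.norm(w)

pts, amps = gaussian_amplitudes(B, s=4.0)
# The amplitude vector is normalized, as a quantum state's would be.
print(round(float(np.sum(amps**2)), 6))  # 1.0
```

In the actual quantum-money construction these amplitudes live in superposition, and verification and unforgeability rest on lattice Fourier structure and the hardness of the short vector problem, none of which this classical sketch captures.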