Near-Term Quantum Algorithms for Simulation and Machine Learning

Monday, January 15, 2018, 1:30 pm EST (GMT -05:00)

PhD Comprehensive Seminar

Candidate: Guillaume Verdon-Akzam

Abstract: Some of the most promising applications of quantum computing in the near-term, pre-fault-tolerance era are the simulation of quantum systems and quantum-accelerated machine learning. I begin this talk by reviewing near-term approaches and algorithms for quantum computation, including quantum-classical hybrid variational algorithms and universal adiabatic quantum computation. A common feature of these approaches is that they are Hamiltonian-based, i.e., the quantum computation is phrased as a ground state problem for a certain Hamiltonian. I provide an overview of a few Hamiltonian-based, adiabatic constructions for universal quantum computation, and point toward possible future work in this area. Following this, I propose a few paradigms I have constructed for Hamiltonian-based quantum-enhanced machine learning, i.e., quantum algorithms for learning patterns in classical data. I also outline a non-Hamiltonian-based approach to quantum feedforward neural networks and quantum backpropagation. Finally, I suggest possible links between universal adiabatic Hamiltonian constructions and quantum-data machine learning.
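
To make the idea of phrasing a computation as a Hamiltonian ground state problem concrete, the following is a minimal, hypothetical sketch of a quantum-classical hybrid variational loop: a classical optimizer tunes the parameter of a trial state so as to minimize the energy of a toy single-qubit Hamiltonian. The Hamiltonian (X + Z), the Ry ansatz, and the finite-difference optimizer are illustrative choices and are not taken from the talk; on real hardware the expectation value would be estimated from measurements rather than simulated exactly.

```python
import numpy as np

# Toy single-qubit Hamiltonian H = X + Z, with ground-state energy -sqrt(2).
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = X + Z

def trial_state(theta: float) -> np.ndarray:
    """Variational ansatz |psi(theta)> = Ry(theta)|0> = (cos(theta/2), sin(theta/2))."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta: float) -> float:
    """Cost function: the expectation value <psi(theta)| H |psi(theta)>."""
    psi = trial_state(theta)
    return float(psi @ H @ psi)

# Classical outer loop: finite-difference gradient descent on theta.
theta, lr, eps = 0.1, 0.2, 1e-4
for _ in range(200):
    grad = (energy(theta + eps) - energy(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(f"variational energy: {energy(theta):.6f}")  # approaches -sqrt(2) ~ -1.414214
```

The same structure underlies larger variational algorithms: only the Hamiltonian, the parameterized circuit, and the classical optimizer change, while the loop of "prepare trial state, estimate energy, update parameters" stays the same.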