Daniel Brod, Perimeter Institute
BosonSampling is a restricted model of quantum computing based on non-adaptive linear optics. It has attracted much attention, from theorists and experimentalists alike, due to its natural implementation as a linear-optical experiment. It is also considered a good candidate for a near- to mid-term demonstration of the computational advantage that quantum computers hold over their classical counterparts.
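The hardness of BosonSampling rests on the fact that the output probabilities of a linear-optical network are given by matrix permanents, which are #P-hard to compute. The sketch below (illustrative only; the function and variable names are my own, not from the talk) computes a permanent via Ryser's O(2^n · n) formula and reproduces the two-photon Hong–Ou–Mandel effect on a 50:50 beamsplitter, where the coincidence probability |Perm(U)|² vanishes:

```python
import math

def permanent(a):
    """Permanent of an n x n matrix via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty S of (-1)^|S| * prod_i sum_{j in S} a[i][j]."""
    n = len(a)
    total = 0
    for bits in range(1, 2 ** n):  # iterate over nonempty column subsets S
        row_sums = [sum(a[i][j] for j in range(n) if bits >> j & 1)
                    for i in range(n)]
        sign = (-1) ** bin(bits).count("1")
        total += sign * math.prod(row_sums)
    return (-1) ** n * total

# 50:50 beamsplitter unitary; two photons in, one per input mode.
r = 1 / math.sqrt(2)
U = [[r, r],
     [r, -r]]

# Collision-free output probability (one photon per output mode)
# is |Perm(U)|^2 -- here it vanishes: the Hong-Ou-Mandel effect.
p_coincidence = abs(permanent(U)) ** 2
print(p_coincidence)
```

Scaling this brute-force evaluation to n photons takes exponential time, which is precisely why sampling from such a device is believed to be classically hard.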
However, due in part to its non-adaptive nature, BosonSampling is not expected to share the fault-tolerance properties of universal quantum computers. Thus, to strengthen the theoretical foundations of BosonSampling, it is now necessary to study the model’s robustness against real-world imperfections. In this talk, I will describe some recent theoretical and experimental advances in dealing with the main obstacles to the scalability of BosonSampling experiments.