Stefanie Beale PhD Thesis Defence
Modeling and managing noise in quantum error correction
Simulating a quantum system to full accuracy is very costly and often impossible, as we do not know the exact dynamics of a given system. In particular, the dynamics of measurement noise are not well understood. For this reason, and especially in the context of quantum error correction, where we study a larger system with branching outcomes due to syndrome measurements, studies often assume a probabilistic Pauli (or Weyl) noise model on the system together with probabilistically misreported measurement outcomes. In this thesis, we explore methods to decrease the computational complexity of simulating encoded memory channels by deriving conditions under which effective channels are equivalent up to logical operations. Leveraging these conditions allows for a significant reduction in computational complexity when simulating quantum error correcting codes.

We then propose methods to enforce a noise model consistent with the typical assumptions of stochastic Pauli (or Weyl) noise with probabilistically misreported measurement outcomes. First, we introduce a new protocol, measurement randomized compiling, which enforces an average noise on measurements wherein measurement outcomes are probabilistically misreported. Second, we introduce another new protocol, logical randomized compiling, which enforces the same model on syndrome measurements and a probabilistic Pauli (or Weyl) noise model on all other operations (including idling). Together, these results enable more efficient simulation of quantum error correction systems by enforcing effective noise of a form which is easier to model and by further reducing the simulation overhead via symmetries. The enforced effective noise model is additionally consistent with standard error correction procedures and enables techniques founded upon the standard assumptions to be applied in any setting where our protocols are applied simultaneously.
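As a concrete illustration of the assumed noise model, the short sketch below is a toy example only: the three-qubit bit-flip repetition code, the probabilities p_x and p_meas, and all function names are illustrative assumptions, not code or parameters from the thesis. It simulates idling rounds in which each data qubit suffers an independent stochastic X error and each syndrome bit is misreported with a fixed probability, i.e., the kind of stochastic Pauli noise with probabilistically misreported measurement outcomes that the protocols described above are designed to enforce.

    # Toy sketch (illustrative assumptions, not the thesis's models or software):
    # one round of a 3-qubit bit-flip repetition code under independent stochastic
    # Pauli X errors on the data qubits and probabilistically misreported syndrome bits.
    import numpy as np

    rng = np.random.default_rng(0)

    p_x = 0.01      # assumed probability of an X (bit-flip) error on each data qubit
    p_meas = 0.02   # assumed probability that each syndrome bit is reported incorrectly

    def one_round(data, rng):
        """Apply stochastic X noise to the data, then measure and possibly misreport the syndrome."""
        # Stochastic Pauli noise: flip each data bit independently with probability p_x.
        data ^= rng.random(3) < p_x
        # Ideal syndrome of the bit-flip code: the parities Z1Z2 and Z2Z3.
        syndrome = np.array([data[0] ^ data[1], data[1] ^ data[2]])
        # Measurement noise model: each syndrome bit is misreported with probability p_meas.
        syndrome ^= rng.random(2) < p_meas
        return data, syndrome

    # Track a logical |0> (the all-zero codeword) through a few rounds of idling memory.
    data = np.zeros(3, dtype=bool)
    for t in range(5):
        data, syndrome = one_round(data, rng)
        print(f"round {t}: reported syndrome = {syndrome.astype(int)}")

Because both error mechanisms are purely classical bit flips under these assumptions, many rounds can be simulated cheaply; the protocols summarized above are what justify applying this kind of model to real, non-Pauli noise on operations and measurements.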