Statistics and Biostatistics seminar series
Charles Margossian
University of British Columbia
Room: M3 3127
Variational Inference in the presence of symmetry
Given an intractable target density p, variational inference (VI) attempts to find the best approximation q from a tractable family Q. This is done by minimizing a divergence, for example the Kullback-Leibler divergence. In practice, Q is not rich enough to contain p, and the approximation is misspecified even when q is a global minimizer of the chosen divergence. In this talk, I examine how this misspecification manifests. First, I present a positive result which shows that VI is robust to many misspecifications if p exhibits certain symmetries which can be matched by optimizing over Q. Specifically, q recovers the mean in the presence of even symmetry and the correlation matrix in the presence of elliptical symmetry. Next, I present a negative result for the case where the VI approximation is factorized (mean-field) but the target is not. In this setting, the approximation is constrained by an impossibility theorem and can at best recover one of three common measures of uncertainty: the variance, the precision, or the generalized variance (which can be linked to entropy). How this impossibility theorem gets resolved depends on which divergence we choose to minimize. This is to be contrasted with the positive result, where recovery of the mean and correlation is ensured for a broad choice of divergences.
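
The trade-off among the three measures of uncertainty can be illustrated in a minimal sketch (not from the talk; a standard textbook example). For a Gaussian target, the mean-field optimum of the reverse KL divergence, KL(q || p), is known in closed form: each factor of q matches the corresponding diagonal entry of the target's precision matrix, so it recovers the precision while underestimating the marginal variances.

```python
import numpy as np

# Illustrative example (assumed setup, not from the abstract):
# a correlated bivariate Gaussian target p with covariance Sigma.
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Lambda = np.linalg.inv(Sigma)  # target precision matrix

# Closed-form mean-field optimum of reverse KL for a Gaussian target:
# Var_q[x_i] = 1 / Lambda_ii, i.e. q matches the precision's diagonal.
q_var = 1.0 / np.diag(Lambda)

print(q_var)           # variances of the factorized approximation
print(np.diag(Sigma))  # true marginal variances of p
```

With correlation 0.8, each factor of q has variance 0.36 while the true marginal variance is 1: recovering the precision forces the variance (and here the generalized variance) to be wrong, consistent with the impossibility result.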