Guojun Zhang, PhD candidate
David R. Cheriton School of Computer Science
The expectation-maximization (EM) algorithm is widely used to minimize the negative log likelihood (also known as the cross entropy) of mixture models. However, little is known about the quality of the fixed points to which it converges.
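For concreteness, the objective in question for a two-component mixture with component densities p_1, p_2 and mixing weight w can be written as follows (the notation here is illustrative, not taken from the paper):

\[
L(\theta) \;=\; -\sum_{i=1}^{n} \log\!\big(\, w\, p_1(x_i \mid \theta_1) + (1 - w)\, p_2(x_i \mid \theta_2) \,\big).
\]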
In this seminar, we study the regions of the parameter space where one component is effectively missing in two-component mixture models, which we call one-cluster regions. We analyze the propensity of such regions to trap EM and gradient descent (GD) for mixtures of two Gaussians and mixtures of two Bernoullis. For Gaussian mixtures, EM escapes one-cluster regions exponentially fast, while GD escapes them only linearly fast. For mixtures of Bernoullis, we find that there exist one-cluster regions that are stable for GD, and therefore trap it, but unstable for EM, which escapes them. These regions are local minima that appear universally in experiments and can be arbitrarily bad. This implies that EM is less likely than GD to converge to certain bad local optima in mixture models.
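To make the escape phenomenon concrete, here is a minimal sketch of EM for a mixture of two one-dimensional Gaussians with unit variance and equal weights, where only the means are learned. It is initialized near a one-cluster configuration (both means nearly coincident) and illustrates EM pulling the two means apart; all names and the specific setup are illustrative, not taken from the paper.

```python
# Minimal EM sketch for a two-component 1-D Gaussian mixture
# (unit variance, equal weights; only the means are updated).
# Illustrative only -- not the paper's exact experimental setup.
import math
import random

def em_two_gaussians(data, mu, iters=100):
    """Run EM updates on the two component means mu = (mu1, mu2)."""
    mu1, mu2 = mu
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point.
        r1 = []
        for x in data:
            a = math.exp(-0.5 * (x - mu1) ** 2)
            b = math.exp(-0.5 * (x - mu2) ** 2)
            r1.append(a / (a + b))
        # M-step: responsibility-weighted means.
        w1 = sum(r1)
        w2 = len(data) - w1
        mu1 = sum(r * x for r, x in zip(r1, data)) / w1
        mu2 = sum((1 - r) * x for r, x in zip(r1, data)) / w2
    return mu1, mu2

random.seed(0)
# Data drawn from two well-separated Gaussians centered at -3 and +3.
data = ([random.gauss(-3, 1) for _ in range(200)]
        + [random.gauss(3, 1) for _ in range(200)])
# Near one-cluster initialization: both means almost coincide at 0.
mu1, mu2 = em_two_gaussians(data, mu=(0.0, 0.01))
```

Starting from nearly coincident means, the tiny asymmetry in the responsibilities is amplified at every iteration, so the two means separate and converge toward the true cluster centers near -3 and +3.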
This is joint work with Prof. Pascal Poupart and George Trimponias. The paper has been accepted at UAI 2019.
200 University Avenue West
Waterloo, ON N2L 3G1