Wednesday, December 12, 2018, 4:00 pm EST (GMT -05:00)
Title: How to Escape Saddle Points Efficiently
Speaker: Matthew Slavin
Affiliation: University of Waterloo
Room: MC 5479
Abstract: We will discuss the 2017 paper named in the title of this talk, by Jin et al. The paper presents a perturbed form of gradient descent that converges to second-order stationary points in a nearly “dimension-free” manner, with rates comparable to those of standard gradient descent. Specifically, if all saddle points of a given problem are non-degenerate, we will show that perturbed gradient descent can escape these saddle points almost for free. We will discuss applications in which these saddle point assumptions are reasonable, and will conclude with a discussion of the novel characterization of saddle point geometry that made this result possible.
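For context, a minimal sketch of the perturbation idea is given below: ordinary gradient descent, plus a small random perturbation whenever the gradient norm is small (suggesting the iterate may be near a saddle point). The toy objective, step size, thresholds, and perturbation radius are illustrative choices for this sketch, not the tuned constants from Jin et al.'s analysis.

```python
import numpy as np

def perturbed_gradient_descent(grad, x0, eta=0.01, g_thresh=1e-3,
                               r=1e-2, t_thresh=50, max_iters=10_000,
                               rng=None):
    """Sketch of perturbed gradient descent (in the spirit of Jin et al., 2017).

    grad:      callable returning the gradient of the objective at x
    eta:       step size
    g_thresh:  gradient-norm threshold below which a saddle point is suspected
    r:         radius of the perturbation ball
    t_thresh:  minimum number of iterations between perturbations
    All constants here are illustrative, not the paper's analyzed values.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    last_perturb = -t_thresh  # allow a perturbation immediately if needed
    for t in range(max_iters):
        g = grad(x)
        # Near a saddle the gradient is small; a small random perturbation
        # lets the iterate pick up the negative-curvature escape direction.
        if np.linalg.norm(g) <= g_thresh and t - last_perturb >= t_thresh:
            xi = rng.normal(size=x.shape)
            # rescale to a point drawn uniformly from the ball of radius r
            xi *= r * rng.random() ** (1 / x.size) / np.linalg.norm(xi)
            x = x + xi
            last_perturb = t
        x = x - eta * grad(x)
    return x

# Toy example: f(x, y) = x^2 + y^4/4 - y^2/2 has a saddle at the origin and
# minima at (0, +/-1); starting exactly at the saddle, the perturbation lets
# gradient descent escape toward one of the minima.
grad_f = lambda v: np.array([2 * v[0], v[1] ** 3 - v[1]])
print(perturbed_gradient_descent(grad_f, x0=np.zeros(2)))
```

Started exactly at the saddle point of the toy objective, plain gradient descent would never move, whereas this sketch converges to one of the two minima at (0, +/-1).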