Contact Info
Combinatorics & Optimization
University of Waterloo
Waterloo, Ontario
Canada N2L 3G1
Phone: 519-888-4567, ext 33038
Title: Stochastic subgradient method converges on tame functions
Speaker: Shenghao Yang
Affiliation: University of Waterloo
Room: MC 5479
Abstract: In this talk, I will present a paper (https://arxiv.org/abs/1804.07795) on the convergence of the stochastic subgradient method in a nonconvex and nonsmooth setting. The stochastic subgradient method underlies several widely used solvers, including Google's TensorFlow and the open-source PyTorch library, yet its convergence guarantees in nonconvex and nonsmooth settings are largely not understood. Very recently, Davis et al. proved that the stochastic subgradient method produces limit points that are all first-order stationary on locally Lipschitz functions whose graphs admit a Whitney stratification, a wide class of functions that subsumes all popular deep learning architectures.
I will go through the ideas behind their proof.
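For context, below is a minimal sketch of the stochastic subgradient iteration x_{k+1} = x_k - alpha_k g_k that the paper analyzes, written with PyTorch's SGD optimizer on a small ReLU network. The network, synthetic data, and step-size schedule are illustrative assumptions, not taken from the paper or the talk.

# Sketch (illustrative assumptions throughout): stochastic subgradient method
# on a nonsmooth, nonconvex loss. ReLU makes the loss nonsmooth; autograd
# returns a valid subgradient almost everywhere.
import torch

torch.manual_seed(0)

# Synthetic regression data (illustrative).
X = torch.randn(256, 10)
y = torch.randn(256, 1)

# A small ReLU network: locally Lipschitz with a Whitney-stratifiable graph,
# so it falls inside the function class covered by the convergence result.
model = torch.nn.Sequential(
    torch.nn.Linear(10, 32),
    torch.nn.ReLU(),
    torch.nn.Linear(32, 1),
)
loss_fn = torch.nn.MSELoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for k in range(1, 1001):
    idx = torch.randint(0, X.shape[0], (32,))   # minibatch gives a stochastic subgradient
    loss = loss_fn(model(X[idx]), y[idx])
    opt.zero_grad()
    loss.backward()                             # subgradient via automatic differentiation
    # Step sizes alpha_k = 0.1 / k^0.75: not summable, but square summable,
    # the standard requirement in this kind of analysis.
    for group in opt.param_groups:
        group["lr"] = 0.1 / k**0.75
    opt.step()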
Combinatorics & Optimization
University of Waterloo
Waterloo, Ontario
Canada N2L 3G1
Phone: 519-888-4567, ext 33038
PDF files require Adobe Acrobat Reader.