Computational Math Colloquium | Jun Liu, Convergence Properties of Stochastic Gradient Methods

Thursday, February 16, 2023 2:00 PM EST

MC 5501 and Zoom (email compmath@uwaterloo.ca for Zoom link)

<--break-><--break->Speaker

Jun Liu | University of Waterloo

Title

Convergence Properties of Stochastic Gradient Methods

Abstract

The optimization of deep learning models relies heavily on stochastic gradient descent (SGD) and its variants. This talk will examine the convergence properties of SGD and of gradient descent methods more generally. We aim to bridge the gap between the convergence analysis found in the literature, which is usually stated in expectation, and practical implementations of SGD, which require individual instantiations (sample paths) of the algorithm to converge.
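To make the expectation-versus-sample-path distinction concrete: a bound on E[f(x_k)] - f* says nothing directly about any single run of the algorithm. The following is a minimal, purely illustrative sketch of plain SGD with the classical diminishing step sizes alpha_k = a/(k+1), under which individual sample paths converge almost surely for suitable objectives; all function names and constants below are assumptions, not material from the talk.

    import numpy as np

    def sgd(grad_estimate, x0, steps=10_000, a=0.5):
        """Plain SGD: x_{k+1} = x_k - alpha_k * g_k, with diminishing step
        sizes alpha_k = a / (k + 1), so that sum(alpha_k) = infinity and
        sum(alpha_k^2) < infinity (the classical Robbins-Monro conditions)."""
        x = np.asarray(x0, dtype=float)
        for k in range(steps):
            alpha = a / (k + 1)            # diminishing step size
            x = x - alpha * grad_estimate(x)
        return x

    # Example: minimize f(x) = 0.5*||x||^2 from noisy gradient samples.
    rng = np.random.default_rng(0)
    noisy_grad = lambda x: x + 0.1 * rng.standard_normal(x.shape)
    print(sgd(noisy_grad, x0=[5.0, -3.0]))  # each run (sample path) ends near 0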

We will demonstrate how the classical supermartingale convergence theorems of Robbins and Monro for stochastic approximation can be adapted to obtain almost sure convergence rates for SGD and its variants, including standard SGD, SGD with momentum (Polyak's heavy-ball method), and Nesterov's accelerated gradient method. We will also show how Lyapunov and probabilistic arguments can be used to guarantee almost sure escape from strict saddle manifolds, bypassing the boundedness assumptions on gradients made in prior work.
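For background, the almost-supermartingale convergence lemma commonly used in such almost sure analyses (usually attributed to Robbins and Siegmund, within the Robbins-Monro stochastic approximation framework) can be stated as follows; this is standard background, not taken from the talk.

    % Robbins-Siegmund almost-supermartingale lemma (standard background).
    % Let $(V_k)$, $(U_k)$, $(a_k)$, $(b_k)$ be nonnegative random sequences
    % adapted to a filtration $(\mathcal{F}_k)$ such that, for all $k$,
    \[
      \mathbb{E}\!\left[ V_{k+1} \mid \mathcal{F}_k \right]
        \le (1 + a_k)\, V_k - U_k + b_k ,
      \qquad
      \sum_{k} a_k < \infty , \quad \sum_{k} b_k < \infty \ \text{a.s.}
    \]
    % Then $V_k$ converges almost surely to a finite random variable, and
    \[
      \sum_{k} U_k < \infty \quad \text{a.s.}
    \]
    % For SGD one typically takes $V_k = f(x_k) - f^\ast$ and
    % $U_k \propto \alpha_k \|\nabla f(x_k)\|^2$, from which almost sure
    % convergence rates can be extracted.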

In addition, we will discuss the benefits of incorporating non-smooth gradient flows into the optimization process, which leads to improved performance on benchmark neural network learning problems compared with current gradient descent methods.
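The abstract does not specify which non-smooth flows are considered. As one purely hypothetical illustration of the idea, the normalized gradient flow dx/dt = -grad f(x)/||grad f(x)|| is non-smooth at stationary points; a forward-Euler discretization of it looks like the sketch below (the choice of flow, names, and step sizes are all assumptions, not the speaker's method).

    import numpy as np

    def normalized_gd(grad, x0, steps=1_000, h=0.01, eps=1e-12):
        """Forward-Euler discretization of the normalized gradient flow
        dx/dt = -grad f(x) / ||grad f(x)||, a simple *non-smooth* flow.
        Illustrative only; not the specific flows from the talk."""
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            g = grad(x)
            n = np.linalg.norm(g)
            if n < eps:             # stop near a stationary point,
                break               # where the flow is not defined
            x = x - h * g / n       # unit-speed descent step
        return x

    # Example on f(x) = 0.5*||x||^2, so grad f(x) = x:
    print(normalized_gd(lambda x: x, x0=[4.0, -2.0]))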
