Continuous Optimization Seminar - Steve Vavasis

Wednesday, October 17, 2018 — 4:00 PM EDT

Title: Tutorial on back-propagation and automatic differentiation

Speaker: Steve Vavasis
Affiliation: University of Waterloo
Room: MC 5479

Abstract: In this presentation, I'll cover the basics of automatic differentiation (AD). Then I'll explain how to apply AD to find the gradient of one term of the loss function when training a neural network, a procedure known as "back-propagation". I'll also explain how some authors use AD to select hyperparameters in optimization algorithms.
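The core idea behind back-propagation can be illustrated with a short sketch of reverse-mode AD. The class names and example function below are illustrative, not from the talk: each intermediate value records its parents and local partial derivatives, and a backward pass applies the chain rule from the output toward the inputs.

```python
# Minimal tape-based reverse-mode automatic differentiation (illustrative sketch).
# Each Var records its parents and the local partial derivative with respect
# to each parent; backward() applies the chain rule in reverse topological order.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_var, local_derivative)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self):
        # Seed the output with gradient 1, then propagate backwards.
        order, seen = [], set()

        def topo(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p, _ in v.parents:
                    topo(p)
                order.append(v)

        topo(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, local in v.parents:
                p.grad += v.grad * local

x = Var(3.0)
y = Var(4.0)
f = x * y + x          # f = xy + x, so df/dx = y + 1, df/dy = x
f.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

Training a neural network applies exactly this pattern, with the loss as the output variable and the weights as the inputs; each gradient is obtained in a single backward sweep at roughly the cost of one forward evaluation.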

Location 
MC - Mathematics & Computer Building
5479
200 University Avenue West

Waterloo, ON N2L 3G1
Canada
