Wednesday, October 17, 2018, 4:00 pm EDT (GMT -04:00)
Title: Tutorial on back-propagation and automatic differentiation
Speaker: Steve Vavasis
Affiliation: University of Waterloo
Room: MC 5479
Abstract: In this presentation, I'll cover the basics of automatic differentiation (AD). Then I'll explain how AD is applied to compute the gradient of a single term of the loss function when training a neural network, the procedure known as "back-propagation". I'll also explain how some authors use AD to select hyperparameters in optimization algorithms.
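
To give a flavor of the topic, below is a minimal sketch of reverse-mode AD applied to one term of a squared-error loss for a single-neuron model. This is illustrative only and not taken from the talk; the `Var` class, the operator set, and the example values are all assumptions chosen for brevity.

```python
import math

class Var:
    """Scalar node in a computation graph for reverse-mode AD (hypothetical helper)."""
    def __init__(self, value, parents=()):
        self.value = value       # forward value
        self.parents = parents   # list of (parent Var, local derivative)
        self.grad = 0.0          # adjoint, accumulated by backward()

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def tanh(self):
        t = math.tanh(self.value)
        return Var(t, [(self, 1.0 - t * t)])

    def backward(self):
        """Propagate adjoints from this node back through the graph."""
        order, seen = [], set()
        def visit(v):                 # build a topological order by DFS
            if v not in seen:
                seen.add(v)
                for p, _ in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):     # reverse sweep: chain rule per node
            for p, local in v.parents:
                p.grad += v.grad * local

# One term of the loss: (tanh(w*x + b) - y)^2, with illustrative values.
w, b = Var(0.5), Var(-0.3)
x, y = 2.0, 1.0
pred = (w * x + b).tanh()
err = pred + Var(-y)
loss = err * err
loss.backward()
print(loss.value, w.grad, b.grad)  # loss and its gradient w.r.t. w and b
```

Run on a network, this reverse sweep over the recorded graph is exactly what the abstract calls back-propagation: one forward pass stores local derivatives, and one backward pass yields the gradient with respect to every parameter at roughly the cost of a single extra evaluation.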