Master's Defence | Tim Dockhorn, Generative Modeling with Neural Ordinary Differential Equations

Friday, December 13, 2019 10:00 am - 10:00 am EST (GMT -05:00)

MC 6460

Candidate

Tim Dockhorn | Applied Mathematics, University of Waterloo

Title

Generative Modeling with Neural Ordinary Differential Equations

Abstract

Neural ordinary differential equations (NODEs) (Chen et al., 2018) are ordinary differential equations (ODEs) whose dynamics are modeled by neural networks. Continuous normalizing flows (CNFs) (Chen et al., 2018; Grathwohl et al., 2018), a class of reversible generative models that builds on NODEs and uses an instantaneous counterpart of the change of variables formula (CVF), have recently achieved state-of-the-art results on density estimation and variational inference tasks. In this thesis, we review key concepts needed to understand NODEs and CNFs, ranging from numerical ODE solvers to generative models. We derive an explicit formulation of the adjoint sensitivity method for both NODEs and CNFs using a constrained optimization framework. Furthermore, we review several classes of NODEs and prove that a particular class of hypernetwork NODEs is a universal function approximator in the discretized state. Our numerical results suggest that the ODEs arising in CNFs do not need to be solved to high precision for training, and we show that training of CNFs can be made more efficient by using a tolerance scheduler that exponentially reduces the ODE solver tolerances. Moreover, we quantify the discrepancy between the CVF and the discretized instantaneous CVF for two ODE solvers. Our hope in writing this thesis is to give a comprehensive and self-contained introduction to generative modeling, with a focus on neural ordinary differential equations, and to stimulate future theoretical and computational work on NODEs and CNFs.
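
For reference, the instantaneous change of variables formula of Chen et al. (2018), on which CNFs are built, relates the log-density of a state z(t) evolving under the ODE dz/dt = f(z(t), t) to the trace of the Jacobian of the dynamics:

    \frac{\mathrm{d} \log p(z(t))}{\mathrm{d} t} = -\operatorname{tr}\left( \frac{\partial f(z(t), t)}{\partial z(t)} \right)

A tolerance scheduler of the kind mentioned in the abstract could take the following shape. This is a minimal sketch assuming a per-epoch exponential decay with a lower floor; the function name, parameter names, and default values are illustrative, not the settings used in the thesis:

    # Minimal sketch: exponentially decay the ODE solver tolerance per epoch.
    # tol_init, tol_final, and decay are illustrative values, not the
    # settings used in the thesis.
    def solver_tolerance(epoch, tol_init=1e-3, tol_final=1e-5, decay=0.9):
        """Return the tolerance for this epoch, floored at tol_final."""
        return max(tol_final, tol_init * decay ** epoch)

    # The result would be passed as rtol/atol to an adaptive ODE solver,
    # e.g. torchdiffeq.odeint(func, z0, t, rtol=tol, atol=tol).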