Please note: This PhD seminar will be given online.
Andreas Stöckel, PhD candidate
David R. Cheriton School of Computer Science
Supervisor: Professor Chris Eliasmith
When processing temporal data, artificial neural networks typically operate in discrete time using relatively coarse time-steps. This is fundamentally different from biological neural networks, which are intrinsically continuous in time. Moreover, biological systems are capable of online learning and can, for example, learn to reproduce or account for observed dynamics within a relatively short amount of time. One famous example of this is eyeblink conditioning, in which an experimental animal learns the dynamics of a stimulus, allowing the subject to close its eye just before an unpleasant puff of air arrives.
In this talk, we discuss methods for approximating dynamical systems in neural networks, taking inspiration from temporal receptive fields in biology. As a primary example of a dynamical system, we review the recently proposed Legendre Delay Network (LDN). We show how this network can be mapped as an “adaptive filter” onto the recurrent granule-Golgi microcircuit in the cerebellum — the part of the brain hypothesized to be responsible for eyeblink conditioning. Building such models is of interest not only to computer scientists working on brain-inspired machine learning algorithms, but also to cognitive scientists and neurobiologists interested in connecting functional descriptions to the underlying biological mechanisms. We discuss some of the technical aspects of “NengoBio”, the neural modelling tool we developed to this end. Using this tool, we can incorporate low-level biological constraints into our cerebellar model while reproducing key characteristics of empirical data on eyeblink conditioning.
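For readers unfamiliar with the Legendre Delay Network mentioned in the abstract: it is a linear time-invariant system, theta * dx/dt = A x + B u, whose state compresses the last theta seconds of its input onto the first d shifted Legendre polynomials. The following is a minimal NumPy sketch of that published state-space formulation, integrated here with simple forward Euler; the function names, the Euler integrator, and the chosen parameters (d = 8, theta = 0.5 s) are illustrative assumptions, not the speaker's implementation (which uses NengoBio, not shown here).

```python
import numpy as np

def ldn_system(d, theta):
    """Return the (A, B) matrices of an LDN of order d with window length theta.

    Follows the published LDN state-space formulation:
        A[i, j] = (2i + 1) * (-1          if i < j
                              else (-1)^(i - j + 1))
        B[i]    = (2i + 1) * (-1)^i
    both scaled by 1 / theta.
    """
    A = np.zeros((d, d))
    B = np.zeros(d)
    for i in range(d):
        B[i] = (2 * i + 1) * (-1.0) ** i
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    return A / theta, B / theta

def ldn_simulate(u, dt, d=8, theta=0.5):
    """Integrate dx/dt = A x + B u with forward Euler over input samples u."""
    A, B = ldn_system(d, theta)
    x = np.zeros(d)
    xs = np.empty((len(u), d))
    for k, u_k in enumerate(u):
        x = x + dt * (A @ x + B * u_k)
        xs[k] = x
    return xs

# Example: feed in a 1 Hz sine; the input delayed by the full window theta can
# be read out as the plain sum of the state dimensions (the shifted Legendre
# polynomials all evaluate to 1 at the end of the window).
dt, theta = 1e-3, 0.5
ts = np.arange(0.0, 3.0, dt)
u = np.sin(2.0 * np.pi * ts)
xs = ldn_simulate(u, dt, d=8, theta=theta)
u_delayed = xs.sum(axis=1)  # approximates u(t - theta) once the window fills
```

After an initial transient of about one window length, `u_delayed` closely tracks the input shifted by half a second, which is the "delay" behaviour the cerebellar adaptive-filter mapping in the talk builds on.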
To join this PhD seminar on Zoom, please go to https://zoom.us/j/7545987009?pwd=YWdUVGQyaGVUQ0xVS0NpMmtYV3B1QT09.
200 University Avenue West
Waterloo, ON N2L 3G1