AI seminar: Self-adaptive hierarchical sentence model
Speaker: Han Zhao, University of Waterloo MSc graduate
Speaker: Hassan Ashtiani, PhD candidate
We address the problem of communicating domain knowledge from a domain expert to the designer of a clustering algorithm.
Andreas Stöckel, PhD candidate
David R. Cheriton School of Computer Science
The artificial neurons typically employed in machine learning and computational neuroscience bear little resemblance to biological neurons. They are often derived from the “leaky integrate and fire” (LIF) model, neglect spatial extent, and assume a linear combination of input variables. It is well known that these simplifications have a profound impact on the family of functions that can be computed in a single-layer neural network.
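To make the "leaky integrate and fire" reference concrete, here is a minimal LIF simulation sketch. It is illustrative only, not the speaker's code; all parameter names and values (time constant, threshold, reset) are assumptions chosen for readability.

```python
import numpy as np

def simulate_lif(current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron driven by
    an input current trace. Returns the membrane-potential trace and
    the indices of the time steps at which spikes occurred."""
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(current):
        # Leaky integration: decay toward rest, driven by the input.
        v += dt / tau * (v_rest - v + i_in)
        if v >= v_thresh:     # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_reset       # reset the membrane potential
        trace.append(v)
    return np.array(trace), spikes

# A constant supra-threshold input produces regular spiking.
trace, spikes = simulate_lif(np.full(200, 2.0))
```

Note that the spike decision depends only on a single scalar membrane potential, which is the kind of simplification (no spatial extent, linear input summation) the abstract points out.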
Nabiha Asghar, PhD candidate
David R. Cheriton School of Computer Science
We address the problem of incremental domain adaptation (IDA). We assume the domains arrive one after another, and that we can access only the data in the current domain. The goal of IDA is to build a unified model that performs well on all the domains encountered so far. We propose to augment a recurrent neural network (RNN) with a directly parameterized memory bank, which is retrieved by an attention mechanism at each RNN transition step. The memory bank provides a natural mechanism for IDA: when adapting our model to a new domain, we progressively add new slots to the memory bank, which increases the number of parameters, and thus the model capacity.
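The mechanism described above can be sketched as follows. This is a toy NumPy illustration under assumed shapes and initializations, not the speaker's implementation: a vanilla RNN whose hidden state attends over a memory bank at each step, with `add_slots` modeling the capacity growth used for adaptation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MemoryAugmentedRNN:
    """Toy RNN with a directly parameterized memory bank that is
    queried by an attention mechanism at every transition step."""

    def __init__(self, input_dim, hidden_dim, n_slots):
        self.W_x = rng.normal(0, 0.1, (hidden_dim, input_dim))
        self.W_h = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
        self.memory = rng.normal(0, 0.1, (n_slots, hidden_dim))

    def add_slots(self, n_new):
        # Incremental domain adaptation: append fresh slots while
        # leaving all existing parameters untouched.
        new = rng.normal(0, 0.1, (n_new, self.memory.shape[1]))
        self.memory = np.vstack([self.memory, new])

    def step(self, x, h):
        # The hidden state acts as the attention query over memory.
        scores = self.memory @ h            # (n_slots,)
        context = softmax(scores) @ self.memory  # weighted slot sum
        return np.tanh(self.W_x @ x + self.W_h @ h + context)

    def run(self, xs):
        h = np.zeros(self.W_h.shape[0])
        for x in xs:
            h = self.step(x, h)
        return h

rnn = MemoryAugmentedRNN(input_dim=4, hidden_dim=8, n_slots=3)
h = rnn.run(rng.normal(size=(5, 4)))
rnn.add_slots(2)   # adapting to a new domain grows the memory bank
```

Because only new slots are added, the parameters learned on earlier domains are preserved, which is what lets the single model keep performing on all domains seen so far.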