Thursday, February 14, 2019 — 11:00 AM EST

Adam Schunk, Master’s candidate
David R. Cheriton School of Computer Science

Over the past few years, online social networks have become a major target for marketing strategies, generating a need for methods to spread information efficiently through these networks. Close-knit communities have developed on these platforms as groups of users connect with like-minded individuals. 

Thursday, February 14, 2019 — 10:30 AM EST

Andrew Delong, Head of Computational Research
Deep Genomics

Genomics focuses on the sequences in our genomes and how they encode function in our cells. Predicting how sequences will be interpreted by the cell is important for identifying disease-causing mutations and for designing therapies. 

Thursday, January 24, 2019 — 4:00 PM EST

Brandon Alcox, Master’s candidate
David R. Cheriton School of Computer Science

This thesis investigates the application of various fields of artificial intelligence to the domain of sports management and analysis. The research in this thesis is primarily focused on the entry draft for the National Hockey League, though many of the models proposed may be applied to other sports and leagues with minimal adjustments. 

Thursday, January 24, 2019 — 10:30 AM EST

Eunsol Choi, Paul G. Allen School of Computer Science
University of Washington

Real-world entities such as people, organizations and countries play a critical role in text. Reading offers rich explicit and implicit information about these entities, such as the categories they belong to, the relationships they have with other entities, and the events they participate in. 

Friday, December 14, 2018 — 3:00 PM EST

Nabiha Asghar, PhD candidate
David R. Cheriton School of Computer Science

We address the problem of incremental domain adaptation (IDA). We assume the domains arrive one after another, and that we can access only the data in the current domain. The goal of IDA is to build a unified model that performs well on all the domains encountered so far. We propose to augment a recurrent neural network (RNN) with a directly parameterized memory bank, which is retrieved by an attention mechanism at each step of the RNN transition. The memory bank provides a natural way of performing IDA: when adapting the model to a new domain, we progressively add new slots to the memory bank, which increases the number of parameters and thus the model capacity. 
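As a rough illustration of the idea (a toy NumPy sketch, not the thesis implementation — class and parameter names are invented here), an RNN cell can read from a directly parameterized memory bank via soft attention, and the bank can be grown with fresh slots when a new domain arrives:

```python
import numpy as np

rng = np.random.default_rng(0)

class MemoryAugmentedRNN:
    """Toy RNN cell whose hidden state is combined with a directly
    parameterized memory bank through dot-product attention."""
    def __init__(self, input_dim, hidden_dim, n_slots):
        self.Wx = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
        self.Wh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
        self.memory = rng.normal(scale=0.1, size=(n_slots, hidden_dim))

    def add_slots(self, k):
        """Incremental adaptation: append k freshly initialized slots,
        increasing the parameter count and thus the model capacity."""
        new = rng.normal(scale=0.1, size=(k, self.memory.shape[1]))
        self.memory = np.vstack([self.memory, new])

    def step(self, x, h):
        h_pre = np.tanh(self.Wx @ x + self.Wh @ h)
        scores = self.memory @ h_pre            # attend over memory slots
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                # softmax attention weights
        context = weights @ self.memory         # attention readout
        return np.tanh(h_pre + context)

cell = MemoryAugmentedRNN(input_dim=4, hidden_dim=8, n_slots=3)
h = np.zeros(8)
h = cell.step(rng.normal(size=4), h)
cell.add_slots(2)   # a new domain arrives: memory grows from 3 to 5 slots
h = cell.step(rng.normal(size=4), h)
```

The key property the sketch shows is that growing the memory changes only the attention readout, so previously learned transition weights are left untouched when capacity is added.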

Thursday, December 13, 2018 — 4:00 PM EST

Andreas Stöckel, PhD candidate
David R. Cheriton School of Computer Science

The artificial neurons typically employed in machine learning and computational neuroscience bear little resemblance to biological neurons. They are often derived from the “leaky integrate-and-fire” (LIF) model, neglect spatial extent, and assume a linear combination of input variables. It is well known that these simplifications have a profound impact on the family of functions that can be computed in a single-layer neural network. 
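For reference, the LIF model mentioned above can be simulated in a few lines. This sketch uses illustrative, dimensionless parameters (`tau`, `v_th`, `v_reset`) chosen here for demonstration, not values from the talk:

```python
import numpy as np

def lif_spikes(current, dt=1e-4, tau=0.02, v_th=1.0, v_reset=0.0):
    """Count spikes from a leaky integrate-and-fire neuron driven by a
    constant input current over one second of simulated time. The
    membrane potential decays toward the input; crossing threshold
    emits a spike and resets the potential."""
    v, spikes = 0.0, 0
    for _ in range(int(1.0 / dt)):
        v += dt / tau * (current - v)   # leaky integration
        if v >= v_th:
            spikes += 1
            v = v_reset
    return spikes

# Sub-threshold input never fires; stronger input fires faster.
rates = [lif_spikes(i) for i in (0.5, 1.5, 3.0)]
```

Note the nonlinearity this introduces: an input below threshold produces no output at all, and above threshold the firing rate grows sublinearly with the input current.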

Tuesday, October 23, 2018 — 9:00 AM EDT

Ben Armstrong, Master’s candidate
David R. Cheriton School of Computer Science

Understanding the factors that cause groups to engage in coordinated behaviour has been an active research area for decades. In this thesis, we study this problem using a novel dataset of crowd behaviour from an online experiment hosted by Reddit.

Monday, October 22, 2018 — 4:00 PM EDT

Carolyn Lamb, PhD candidate
David R. Cheriton School of Computer Science

This thesis is driven by the question of how computers can generate poetry, and how that poetry can be evaluated. We survey existing work on computer-generated poetry and interdisciplinary work on how to evaluate this type of computer-generated creative product. 

Tuesday, September 25, 2018 — 1:00 PM EDT

Daniel Recoskie, PhD candidate
David R. Cheriton School of Computer Science

The wavelet transform is a well-studied and understood analysis technique used in signal processing. In wavelet analysis, signals are represented by a sum of self-similar wavelet and scaling functions. Typically, the wavelet transform makes use of a fixed set of wavelet functions that are analytically derived. We propose a method for learning wavelet functions directly from data. We impose an orthogonality constraint on the functions so that the learned wavelets can be used to perform both analysis and synthesis. We accomplish this by using gradient descent and leveraging existing automatic differentiation frameworks. Our learned wavelets are able to capture the structure of the data by exploiting sparsity. We show that the learned wavelets have similar structure to traditional wavelets.
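A minimal sketch of the approach, under stated assumptions: a length-4 scaling filter is learned by gradient descent on a penalty encoding the standard orthogonality conditions (unit norm, vanishing shift-2 autocorrelation, lowpass normalization). The thesis leverages automatic differentiation frameworks; for self-containment this sketch uses central differences instead, and all names and hyperparameters are illustrative:

```python
import numpy as np

def ortho_loss(h):
    """Penalty for the orthogonality conditions on a length-4 scaling
    filter: unit norm, zero shift-2 autocorrelation, sum(h) = sqrt(2)."""
    return ((h @ h - 1.0) ** 2
            + (h[0] * h[2] + h[1] * h[3]) ** 2
            + (h.sum() - np.sqrt(2.0)) ** 2)

def numeric_grad(f, x, eps=1e-6):
    """Central-difference gradient, standing in for autodiff."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = eps
        g[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return g

def learn_filter(seed, lr=0.05, iters=10000):
    rng = np.random.default_rng(seed)
    h = rng.normal(scale=0.5, size=4)
    for _ in range(iters):
        h -= lr * numeric_grad(ortho_loss, h)
    return h

# A few random restarts; keep the filter with the lowest penalty.
h = min((learn_filter(s) for s in range(5)), key=ortho_loss)
```

Filters satisfying these constraints form a family that includes the Haar and Daubechies D4 wavelets, so which member gradient descent lands on depends on initialization; in the actual work the data term steers this choice.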

Monday, September 24, 2018 — 1:30 PM EDT

Nicole McNabb, Master’s candidate
David R. Cheriton School of Computer Science

Clustering is the task of partitioning data so that “similar” points are grouped together and “dissimilar” ones are separated. In general, this is an ill-defined task. One way to make clustering well-defined is to introduce a clustering objective to optimize. While many common objectives such as k-means are known to be NP-hard, heuristics output “nice” clustering solutions efficiently in practice. This work analyzes two avenues of theoretical research that attempt to explain this discrepancy.
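The discrepancy referred to above can be seen concretely with Lloyd's algorithm, the classic heuristic for the (NP-hard) k-means objective. The sketch below is a plain illustration, not an optimized solver, with illustrative data:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Lloyd's heuristic: alternate between assigning each point to its
    nearest centre and recomputing centres as cluster means."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):                # keep empty clusters in place
                centres[j] = pts.mean(axis=0)
    return labels, centres

# Two well-separated blobs: the heuristic recovers the "nice" clustering
# quickly, despite the worst-case hardness of the objective.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 0.3, (20, 2)),
               rng.normal(5, 0.3, (20, 2))])
labels, centres = kmeans(X, 2)
```

On such "nice" inputs the heuristic converges in a handful of iterations; the theoretical work surveyed in the thesis asks when and why structure like this makes efficient clustering possible.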
