Events

Tuesday, September 25, 2018 — 1:00 PM EDT

Daniel Recoskie, PhD candidate
David R. Cheriton School of Computer Science

The wavelet transform is a well-studied and understood analysis technique used in signal processing. In wavelet analysis, signals are represented by a sum of self-similar wavelet and scaling functions. Typically, the wavelet transform makes use of a fixed set of wavelet functions that are analytically derived. We propose a method for learning wavelet functions directly from data. We impose an orthogonality constraint on the functions so that the learned wavelets can be used to perform both analysis and synthesis. We accomplish this by using gradient descent and leveraging existing automatic differentiation frameworks. Our learned wavelets are able to capture the structure of the data by exploiting sparsity. We show that the learned wavelets have similar structure to traditional wavelets.
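As a rough illustration of the approach described above, the following is a minimal sketch (not the authors' code) of learning a single length-8 wavelet filter by gradient descent with PyTorch: the loss encourages sparse wavelet coefficients on a toy signal while soft penalties push the filter towards orthogonality. The signal, filter length, penalty weights, and optimizer settings are all arbitrary choices made for illustration.

```python
# Minimal sketch (not the authors' code): learn a length-8 wavelet filter by
# gradient descent, trading sparsity of the wavelet coefficients against soft
# orthogonality penalties. Relies on PyTorch for automatic differentiation.
import math
import torch

torch.manual_seed(0)
# Toy signal: one sine wave, shaped (batch, channels, length) for conv1d
signal = torch.sin(torch.linspace(0, 8 * math.pi, 256)).view(1, 1, -1)

h = torch.randn(8, requires_grad=True)        # learned low-pass (scaling) filter
opt = torch.optim.Adam([h], lr=1e-2)
signs = torch.tensor([(-1.0) ** k for k in range(8)])

for step in range(2000):
    # High-pass (wavelet) filter via the quadrature-mirror relation
    g = h.flip(0) * signs
    detail = torch.nn.functional.conv1d(signal, g.view(1, 1, -1), stride=2)

    sparsity = detail.abs().mean()            # favour sparse wavelet coefficients
    unit = (h.pow(2).sum() - 1.0) ** 2        # ||h||^2 = 1
    shifts = sum((h[: 8 - 2 * k] * h[2 * k:]).sum() ** 2 for k in range(1, 4))
    loss = sparsity + 10.0 * (unit + shifts)  # soft orthogonality constraints

    opt.zero_grad()
    loss.backward()
    opt.step()

print(h.detach())
```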

Monday, September 24, 2018 — 1:30 PM EDT

Nicole McNabb, Master’s candidate
David R. Cheriton School of Computer Science

Clustering is the task of partitioning data so that “similar” points are grouped together and “dissimilar” ones are separated. In general, this is an ill-defined task. One way to make clustering well-defined is to introduce a clustering objective to optimize. While many common objectives such as k-means are known to be NP-hard, heuristics output “nice” clustering solutions efficiently in practice. This work analyzes two avenues of theoretical research that attempt to explain this discrepancy.
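For concreteness, the sketch below (not taken from the thesis) runs Lloyd's algorithm, the standard k-means heuristic, on two synthetic Gaussian blobs. Although optimizing the k-means objective exactly is NP-hard, this local-search heuristic typically settles on a sensible partition within a few iterations on well-separated data.

```python
# Toy illustration (not from the thesis): Lloyd's algorithm, the standard
# k-means heuristic. Exact k-means optimization is NP-hard, yet this
# local-search procedure usually converges quickly on well-separated data.
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated Gaussian blobs in the plane
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(6, 1, (50, 2))])

k = 2
centers = X[rng.choice(len(X), size=k, replace=False)]
for _ in range(20):
    # Assignment step: each point joins its nearest center
    labels = ((X[:, None, :] - centers) ** 2).sum(axis=2).argmin(axis=1)
    # Update step: each center moves to the mean of its assigned points
    new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers

print(centers)
```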

Thursday, September 13, 2018 — 1:30 PM EDT

Irish Medina, Master’s candidate
David R. Cheriton School of Computer Science

Smart water meters have been installed across Abbotsford, British Columbia, Canada, to measure the water consumption of households in the area. Using this water consumption data, we develop machine learning and deep learning models to predict daily water consumption for existing multi-family residences. We also present a new methodology for predicting the water consumption of new housing developments. 
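As a purely illustrative sketch (the thesis data and models are not reproduced here), the snippet below fits a gradient-boosted regressor from scikit-learn to synthetic daily-consumption data built from hypothetical features such as day of week, temperature, and household size.

```python
# Illustrative sketch only: predict daily household water use (litres/day)
# from hypothetical features on synthetic data. The feature set, model, and
# target relationship are assumptions, not the thesis methodology.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
day_of_week = rng.integers(0, 7, n)
temperature = rng.normal(20, 8, n)
household_size = rng.integers(1, 6, n)
X = np.column_stack([day_of_week, temperature, household_size])

# Synthetic consumption with household-size, temperature, and weekend effects
y = (150 * household_size + 5 * temperature
     + 30 * (day_of_week >= 5) + rng.normal(0, 20, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("R^2 on held-out days:", round(model.score(X_test, y_test), 3))
```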

Tuesday, September 4, 2018 — 12:30 PM EDT

Ricardo Salmon, PhD candidate
David R. Cheriton School of Computer Science

Stochastic satisfiability (SSAT), quantified Boolean formula satisfiability (QBF), and decision-theoretic planning in infinite-horizon partially observable Markov decision processes (POMDPs) are all PSPACE-complete problems. Since they are all complete for the same complexity class, I show how to convert them into one another in polynomial time and space.

Tuesday, August 14, 2018 — 4:00 PM EDT

Abdullah Rashwan, PhD candidate
David R. Cheriton School of Computer Science

We present a discriminative learning algorithm for Sum-Product Networks (SPNs) based on the Extended Baum-Welch (EBW) algorithm.

Thursday, August 9, 2018 — 10:00 AM EDT

Vineet John, Master’s candidate
David R. Cheriton School of Computer Science

This thesis tackles the problem of disentangling the latent style and content variables in a language modelling context. This involves splitting the latent representations of documents by learning which features of a document are discriminative of its style and content, and encoding these features separately using neural network models.
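The following is an illustrative sketch, not the thesis model: a sequence autoencoder whose latent code is split into separate "style" and "content" vectors, with a shared decoder reconstructing the input. All dimensions and architectural choices here are arbitrary.

```python
# Illustrative sketch (not the thesis model): an autoencoder whose latent
# representation is split into a "style" part and a "content" part. Sizes
# and architecture are arbitrary choices for demonstration.
import torch
import torch.nn as nn

class DisentanglingAutoencoder(nn.Module):
    def __init__(self, vocab_size=5000, emb=128, style_dim=16, content_dim=112):
        super().__init__()
        self.style_dim = style_dim
        self.embed = nn.Embedding(vocab_size, emb)
        self.encoder = nn.GRU(emb, style_dim + content_dim, batch_first=True)
        self.decoder = nn.GRU(emb, style_dim + content_dim, batch_first=True)
        self.out = nn.Linear(style_dim + content_dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)
        _, h = self.encoder(x)                      # h: (1, batch, style+content)
        style = h[0, :, : self.style_dim]           # split the latent code
        content = h[0, :, self.style_dim:]
        latent = torch.cat([style, content], dim=-1).unsqueeze(0)
        dec, _ = self.decoder(x, latent)            # teacher-forced reconstruction
        return self.out(dec), style, content

model = DisentanglingAutoencoder()
tokens = torch.randint(0, 5000, (4, 20))            # batch of 4 toy sentences
logits, style, content = model(tokens)
print(logits.shape, style.shape, content.shape)
```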

Friday, August 3, 2018 — 11:00 AM EDT

Royal Sequiera, Master’s candidate
David R. Cheriton School of Computer Science

With the advent of deep learning methods, researchers are abandoning decades-old work in Natural Language Processing (NLP). The research community has increasingly moved away from once-dominant feature engineering approaches, gravitating instead towards more complicated neural architectures. Highly competitive tools, such as part-of-speech taggers that exhibit human-like accuracy, are traded for complex networks in the hope that the neural network will learn the needed features. In fact, there have been efforts to do NLP "from scratch" with neural networks that altogether eschew feature-engineering-based tools (Collobert et al., 2011).

Tuesday, July 24, 2018 — 2:00 PM EDT

Daniel Recoskie, PhD candidate
David R. Cheriton School of Computer Science

The wavelet transform has seen success when incorporated into neural network architectures, such as in wavelet scattering networks. More recently, it has been shown that the dual-tree complex wavelet transform can provide better representations than the standard transform.

Friday, June 29, 2018 — 10:00 AM EDT

Michael Cormier, PhD candidate
David R. Cheriton School of Computer Science

This thesis focuses on the development of computer vision techniques for parsing web pages using an image of the rendered page as evidence, and on understanding this under-explored class of images from the perspective of computer vision. The project is divided into two complementary tracks, one applied and one theoretical. Our practical motivation is the application of improved web page parsing to assistive technology, such as screen readers for visually impaired users or decluttering the presentation of a web page for users with cognitive deficits. From a more theoretical standpoint, images of rendered web pages have interesting properties from a computer vision perspective; in particular, low-level assumptions can be made in this domain, but the most important cues are often subtle and can be highly non-local. The parsing system developed in this thesis is a principled Bayesian segmentation-classification pipeline that uses innovative techniques to produce valuable results in this challenging domain. The thesis covers both the implementation of this system and its evaluation.
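As a conceptual sketch of the Bayesian flavour of such a pipeline (not the thesis system itself), the snippet below classifies a single hypothetical page segment by combining an assumed prior over region types with assumed feature likelihoods.

```python
# Conceptual sketch (not the thesis system): Bayesian classification of one
# page segment. The region types, prior, and likelihoods are all assumed
# values chosen only to illustrate posterior = prior * likelihood, normalized.
import numpy as np

labels = ["heading", "body text", "navigation"]   # hypothetical region types
prior = np.array([0.2, 0.6, 0.2])                 # assumed class frequencies

# Assumed per-class likelihoods of the observed visual features
# (e.g. large font near the top of the page) for this segment
likelihood = np.array([0.7, 0.1, 0.3])

posterior = prior * likelihood
posterior /= posterior.sum()                      # normalize to a distribution

for label, p in zip(labels, posterior):
    print(f"P({label} | features) = {p:.2f}")
```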

Monday, June 25, 2018 — 4:00 PM EDT

Abdullah Rashwan, PhD candidate
David R. Cheriton School of Computer Science

Sum-product networks have recently emerged as an attractive representation due to their dual view as a special type of deep neural network with clear semantics and as a special type of probabilistic graphical model for which inference is always tractable. These properties follow from two structural conditions, completeness and decomposability, that the network must satisfy.
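To make the two conditions concrete, here is a toy example (not from the talk): a hand-built sum-product network over two binary variables that is complete and decomposable, so any marginal can be computed in a single bottom-up pass by setting the indicators of marginalized variables to 1.

```python
# Toy illustration: a tiny sum-product network over binary variables X1, X2.
# It is complete (children of the sum node share the same scope) and
# decomposable (children of each product node have disjoint scopes), so
# marginals are computed exactly in one bottom-up evaluation.
def spn(x1, x2):
    # Leaf indicators; pass (1.0, 1.0) for a variable to marginalize it out.
    x1_t, x1_f = x1
    x2_t, x2_f = x2
    # Two product nodes, each combining disjoint scopes {X1} and {X2}
    p1 = (0.8 * x1_t + 0.2 * x1_f) * (0.3 * x2_t + 0.7 * x2_f)
    p2 = (0.1 * x1_t + 0.9 * x1_f) * (0.6 * x2_t + 0.4 * x2_f)
    # Root sum node; both children have scope {X1, X2}, so it is complete
    return 0.5 * p1 + 0.5 * p2

print(spn((1, 0), (1, 0)))   # P(X1=1, X2=1) = 0.15
print(spn((1, 0), (1, 1)))   # P(X1=1) = 0.45, marginalizing X2 in the same pass
```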
