Current students

We hope you are enjoying your time in our graduate programs. Check out our course offerings, information about degree completion, the PhD qualifying exams, the PhD lecturing requirement, and instructions on submitting your PhD annual activity report. If you still have some years ahead in your grad studies, you might be interested in applying for scholarships.

If you have any administrative questions, please contact us at cograd@uwaterloo.ca.

Seminars in Combinatorics and Optimization

Friday, April 4, 2025 3:30 pm - 4:30 pm EDT (GMT -04:00)

Tutte colloquium-Aukosh Jagannath

Title: The training dynamics and local geometry of high-dimensional learning

Speaker: Aukosh Jagannath
Affiliation: University of Waterloo
Location: MC 5501

Abstract: Many modern data science tasks can be expressed as optimizing complex, random functions in high dimensions. The go-to methods for such problems are variations of stochastic gradient descent (SGD), which perform remarkably well—cf. the success of modern neural networks. However, the rigorous analysis of SGD on natural, high-dimensional statistical models is in its infancy. In this talk, we study a general model that captures a broad range of learning tasks, from Matrix and Tensor PCA to training two-layer neural networks to classify mixture models. We show that the evolution of natural summary statistics along training converges, in the high-dimensional limit, to a closed, finite-dimensional dynamical system called their effective dynamics. We then turn to understanding the landscape of training from the point of view of the algorithm. We show that in this limit, the spectra of the Hessian and Information matrices admit an effective spectral theory: the limiting empirical spectral measure and outliers have explicit characterizations that depend only on these summary statistics. I will then illustrate how these techniques can be used to give rigorous demonstrations of phenomena observed in the machine learning literature, such as the lottery ticket hypothesis and the "spectral alignment" phenomenon. This talk surveys a series of joint works with G. Ben Arous (NYU), R. Gheissari (Northwestern), and J. Huang (U Penn).
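A minimal toy instance of the setting the abstract describes—not the speaker's actual model—is spiked matrix PCA learned by online SGD (Oja's rule). The overlap m_t between the iterate and the hidden direction v plays the role of the summary statistic whose evolution the effective dynamics would track. All parameter values below (dimension, signal strength, step size) are illustrative choices.

```python
import math
import random

random.seed(0)
d = 50        # ambient dimension
lam = 3.0     # signal strength of the spike
eta = 0.02    # learning rate
steps = 3000

# Hidden signal direction v = e_1.
v = [1.0] + [0.0] * (d - 1)

# Random unit-norm initialization; overlap with v is O(1/sqrt(d)).
x = [random.gauss(0, 1) for _ in range(d)]
norm = math.sqrt(sum(c * c for c in x))
x = [c / norm for c in x]

def overlap(x):
    return sum(a * b for a, b in zip(x, v))

m0 = abs(overlap(x))
for _ in range(steps):
    # Sample y with E[y y^T] = I + lam * v v^T (a rank-one spike).
    g = random.gauss(0, 1)
    y = [math.sqrt(lam) * g * vi + random.gauss(0, 1) for vi in v]
    # Online Oja step: move along (y y^T) x, then renormalize.
    yx = sum(a * b for a, b in zip(y, x))
    x = [xi + eta * yx * yi for xi, yi in zip(x, y)]
    norm = math.sqrt(sum(c * c for c in x))
    x = [c / norm for c in x]

m = abs(overlap(x))  # summary statistic after training
```

Tracking only m_t (rather than the full d-dimensional iterate) is exactly the kind of dimension reduction that the effective dynamics in the talk make rigorous.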


Monday, April 7, 2025 11:30 am - 12:30 pm EDT (GMT -04:00)

Algebraic Graph Theory-Stefano Lia

Title: New Strongly Regular Graphs from Finite Semifields and Finite Geometry

Speaker: Stefano Lia
Affiliation: Umeå University

Location: Please contact Sabrina Lato for Zoom link.

Abstract: Finite geometry often provides natural examples of highly structured combinatorial objects, many of which exhibit strong symmetry properties. 

In particular, many constructions of strongly regular graphs arise from classical geometric configurations. In this talk, we will present two new constructions of quasi-polar spaces that give rise to two families of pairwise non-isomorphic strongly regular graphs, having the same non-trivial automorphism group.

Both constructions are related to a pair of commuting polarities in a projective space. Surprisingly, one of these constructions is connected to the algebraic structure of finite semifields and their tensor representation.
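As a concrete reminder of the definition used above (a standard example, not one of the talk's constructions): a strongly regular graph srg(v, k, λ, μ) is a k-regular graph on v vertices in which adjacent vertices have λ common neighbours and distinct non-adjacent vertices have μ. The sketch below verifies this for the Petersen graph, built as the Kneser graph K(5,2).

```python
from itertools import combinations

# Petersen graph as the Kneser graph K(5,2): vertices are the 2-subsets
# of {0,...,4}, with two vertices adjacent exactly when disjoint.
verts = [frozenset(s) for s in combinations(range(5), 2)]
adj = {v: {w for w in verts if not (v & w)} for v in verts}

# Collect the degree, and the common-neighbour counts over adjacent and
# non-adjacent pairs; each set should be a singleton for an SRG.
k_vals = {len(adj[v]) for v in verts}
lam_vals = {len(adj[v] & adj[w]) for v in verts for w in adj[v]}
mu_vals = {len(adj[v] & adj[w]) for v in verts for w in verts
           if w != v and w not in adj[v]}
# Petersen graph: srg(10, 3, 0, 1).
```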

Thursday, April 10, 2025 2:00 pm - 3:00 pm EDT (GMT -04:00)

Algebraic and enumerative combinatorics seminar-Natasha Ter-Saakov

Title: Log-concavity of random Radon partitions

Speaker: Natasha Ter-Saakov
Affiliation: Rutgers
Location: MC 5479

Abstract: Over one hundred years ago, Radon proved that any set of d+2 points in R^d can be partitioned into two sets whose convex hulls intersect. I will talk about Radon partitions when the points are selected randomly. In particular, if the points are independent normal random vectors, let p_k be the probability that the Radon partition has size (k, d+2-k). Answering a conjecture of Kalai and White, we show that the sequence (p_k) is ultra log-concave and that, in fact, a balanced partition is the most likely. Joint work with Swee Hong Chan, Gil Kalai, Bhargav Narayanan, and Moshe White.
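To make Radon's theorem concrete (a deterministic illustration with hand-picked points, not the talk's random setting): for d = 2, four points admit an affine dependence, and splitting the coefficients by sign yields the Radon partition. The sketch below computes it exactly for the corners of a square, where the two diagonals cross at (1, 1).

```python
from fractions import Fraction

# Four points in the plane (d = 2, so d + 2 = 4).
pts = [(0, 0), (2, 0), (0, 2), (2, 2)]

def solve3(A, b):
    # Gauss-Jordan elimination with partial pivoting over the rationals.
    n = 3
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(b[i])]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Affine dependence: sum(c_i) = 0 and sum(c_i * p_i) = 0, not all zero.
# Fix c_3 = 1 (the points are in general position) and solve for the rest.
A = [[1, 1, 1],
     [pts[0][0], pts[1][0], pts[2][0]],
     [pts[0][1], pts[1][1], pts[2][1]]]
b = [-1, -pts[3][0], -pts[3][1]]
c = solve3(A, b) + [Fraction(1)]

# Radon partition: positive coefficients vs. the rest.
pos = [i for i in range(4) if c[i] > 0]
neg = [i for i in range(4) if c[i] <= 0]

# The Radon point lies in the convex hull of both parts.
s = sum(c[i] for i in pos)
radon_point = tuple(sum(c[i] * Fraction(pts[i][k]) for i in pos) / s
                    for k in range(2))
```

Here the partition is {p_0, p_3} versus {p_1, p_2} (the two diagonals of the square), of balanced size (2, 2)—the size the talk shows is most likely for random normal points.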

There will be a pre-seminar presenting relevant background at the beginning graduate level, starting at 1:00 pm.