Solving fundamental problems to realize the potential of data science
Despite having more data at our disposal than ever before, we are still learning how best to process and understand it, so that we can solve pressing global problems. In this context, new fundamental questions in data science have emerged: What are the trade-offs between statistical and computational power when tackling complex data? How do we understand the remarkable power of stochastic approximation algorithms and neural networks? How hard are the typical high-dimensional optimization and sampling problems that we face?
The Mathematical Foundations of Data Science Lab, led by Dr. Aukosh Jagannath, Canada Research Chair in Mathematical Foundations of Data Science and an Assistant Professor in the Department of Statistics and Actuarial Science, will tackle these and related questions using recent advances in probability theory. These tools have shown remarkable success in applications to data science in recent years, but this line of research is still in its infancy.
The Lab will put Canada at the forefront of this exciting new field while developing the data scientists of tomorrow.

These days, we hear a lot about neural networks, such as those behind ChatGPT, but we do not yet understand why they are able to do what they do. As a result, the development of neural networks takes a trial-and-error approach that is mind-bogglingly expensive. Small companies can’t break through. One of the goals of my lab is to develop a foundational understanding of how neural networks work, so that we can take a principled rather than a heuristic approach to designing them.