Stephen Vavasis
Biography
Vavasis received a Bachelor's degree in Mathematics from Princeton in 1984, a Master's-level Certificate of Advanced Study (Part III of the Mathematical Tripos) from Cambridge in 1985, and a PhD in Computer Science from Stanford in 1989. He was an assistant, then associate, then full professor of computer science at Cornell University from 1989 to 2006. Since 2006 he has been a professor in the Department of Combinatorics and Optimization at the University of Waterloo, where he served as Associate Dean for Computing from 2017 to 2020. He has held summer or sabbatical positions at Argonne, Sandia, Bell Labs, Xerox PARC, NASA Ames, and elsewhere. He is a past winner of the Hertz Graduate Fellowship, the Churchill Scholarship, a Presidential Young Investigator award, and a Guggenheim Fellowship.
Scholarly Research
Vavasis's primary research areas are continuous optimization and the application of continuous optimization to data science. He is also interested in scientific computing, numerical linear algebra, computational mechanics, and computational geometry. His research receives support from NSERC.
Current research interests
Convex optimization for data science: Convex optimization is a powerful tool for solving problems in data science. Efficient algorithms can be brought to bear, and duality opens the door to analysis of the optimizer, establishing that it solves the original application problem. Vavasis and his students have applied convex optimization to the unsupervised machine-learning problems of clustering, nonnegative matrix factorization, and the disjoint-communities problem.
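As a small illustration of this convex-optimization approach (a sketch of the general idea, not code from Vavasis's group; all parameter values are arbitrary choices), sum-of-norms clustering poses clustering as a convex problem, which can then be minimized by plain subgradient descent:

```python
import numpy as np

def sum_of_norms_objective(U, A, lam):
    # 0.5 * sum_i ||u_i - a_i||^2  +  lam * sum_{i<j} ||u_i - u_j||
    fit = 0.5 * np.sum((U - A) ** 2)
    n = U.shape[0]
    reg = sum(np.linalg.norm(U[i] - U[j])
              for i in range(n) for j in range(i + 1, n))
    return fit + lam * reg

def son_clustering(A, lam=0.1, steps=500, lr=0.05):
    """Subgradient descent on the sum-of-norms clustering objective.
    The objective is convex, so there are no spurious local minima;
    points whose representatives u_i merge belong to the same cluster."""
    U = A.copy()
    n = A.shape[0]
    for _ in range(steps):
        g = U - A                       # gradient of the fitting term
        for i in range(n):
            for j in range(n):
                if i != j:
                    d = U[i] - U[j]
                    nd = np.linalg.norm(d)
                    if nd > 1e-12:      # subgradient of ||u_i - u_j||
                        g[i] += lam * d / nd
        U -= lr * g
    return U

# Two well-separated groups of three points each in the plane.
A = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [3.0, 0.0], [3.1, 0.0], [3.0, 0.1]])
U = son_clustering(A, lam=0.1)
```

With this regularization weight, representatives within each tight group are pulled together while the two groups remain far apart, so cluster membership can be read off from the merged representatives.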
First-order methods: First-order methods solve optimization problems using only first-derivative (gradient) information. Several "optimal" methods, such as conjugate gradient, accelerated gradient, and geometric descent, have been proposed. Recent work shows that these methods admit a unifying analysis, which leads to the development of new and more efficient algorithms.
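As a textbook sketch of two such methods (illustrative only, not the unified analysis itself; the quadratic test problem and step sizes are my own choices), compare plain gradient descent with Nesterov's accelerated gradient on a smooth convex quadratic:

```python
import numpy as np

# Minimize f(x) = 0.5 x^T Q x - b^T x, a smooth strongly convex quadratic.
Q = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = np.linalg.solve(Q, b)   # exact minimizer, for reference
L = 100.0                        # Lipschitz constant of the gradient (max eigenvalue of Q)

def grad(x):
    return Q @ x - b

def gradient_descent(steps):
    x = np.zeros(3)
    for _ in range(steps):
        x = x - grad(x) / L      # classical 1/L step size
    return x

def accelerated_gradient(steps):
    """Nesterov's accelerated gradient for smooth convex f:
    a gradient step from the extrapolated point y, then momentum."""
    x = y = np.zeros(3)
    t = 1.0
    for _ in range(steps):
        x_new = y - grad(y) / L
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

err_gd = np.linalg.norm(gradient_descent(100) - x_star)
err_agd = np.linalg.norm(accelerated_gradient(100) - x_star)
```

Gradient descent converges at an O(1/k) rate in objective value, while the accelerated method achieves the optimal O(1/k^2) rate; on this ill-conditioned problem the accelerated iterate is markedly closer to the minimizer after the same number of gradient evaluations.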
Optimization in computational mechanics: With colleagues and students, Vavasis applies modern and powerful optimization techniques to fracture mechanics of solids. The key observation is that fracture can be regarded as an energy minimization process, and hence is amenable to optimization. He is currently investigating new machine-learning methods for this class of problems.
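A minimal toy example of the energy-minimization viewpoint (not taken from Vavasis's papers; the exponential cohesive law and all constants are assumptions for illustration): a 1-D bar stretched to displacement u, with a cohesive crack whose opening delta is found by minimizing total potential energy under the constraint delta >= 0:

```python
import math

def cohesive_energy(delta, u, k=1.0, G=0.1, dc=0.1):
    """Total potential: elastic energy of a bar of stiffness k stretched
    by (u - delta), plus an exponential cohesive law that dissipates
    fracture energy G as the crack opening delta grows (length scale dc)."""
    return 0.5 * k * (u - delta) ** 2 + G * (1.0 - math.exp(-delta / dc))

def crack_opening(u, k=1.0, G=0.1, dc=0.1, lr=0.1, steps=300):
    """Projected gradient descent on the energy; the projection
    enforces delta >= 0 (the crack faces cannot interpenetrate)."""
    delta = 0.0
    for _ in range(steps):
        grad = -k * (u - delta) + (G / dc) * math.exp(-delta / dc)
        delta = max(0.0, delta - lr * grad)
    return delta

delta_open = crack_opening(u=2.0)     # large load: the crack opens fully
delta_closed = crack_opening(u=0.05)  # small load: the crack stays closed
```

The same load-dependent behavior, closed below a critical load and open above it, falls out of the minimization with no case analysis, which is the appeal of treating fracture as an optimization problem.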
Education
1984, A.B. in Mathematics, Princeton University
1985, Cert. Adv. Study Mathematics, University of Cambridge
1989, PhD in Computer Science, Stanford University
Service
2017-2020 Associate Dean of Computing (Faculty of Mathematics)
2021 Interim Director of Graduate Data Science Programs
2026-2027 Director of Graduate Data Science Programs
Teaching*
- CO 255 - Introduction to Optimization (Advanced Level)
- Taught in 2025
- CO 372 - Portfolio Optimization Models
- Taught in 2021, 2023, 2024, 2025, 2026
- CO 463 - Convex Optimization and Analysis
- Taught in 2024
- CO 663 - Convex Optimization and Analysis
- Taught in 2024
- CO 673 - Optimization for Data Science
- Taught in 2023, 2024, 2025
- CO 769 - Topics in Continuous Optimization
- Taught in 2026
- CS 794 - Optimization for Data Science
- Taught in 2023, 2024, 2025
* Only courses taught in the past 5 years are displayed.
Selected/Recent Publications
Iverson, V., & Vavasis, S. (2025). Mean and Variance Estimation Complexity in Arbitrary Distributions via Wasserstein Minimization. Retrieved from https://arxiv.org/abs/2501.10172
Ang, A., De Sterck, H., & Vavasis, S. (Accepted). MGProx: A nonsmooth multigrid proximal gradient method with adaptive restriction for strongly convex optimization. SIAM J. Optim. Retrieved from https://arxiv.org/abs/2302.04077
Karimi, S., & Vavasis, S. A. (Accepted). Nonlinear conjugate gradient for smooth convex functions. Mathematical Programming Computation. Retrieved from https://arxiv.org/pdf/2111.11613.pdf
Hough, M., & Vavasis, S. (2024). A Primal-Dual Frank-Wolfe Algorithm for Linear Programming. Retrieved from https://arxiv.org/abs/2402.18514
Moursi, W., Pavlovic, V., & Vavasis, S. (2023). Accelerated gradient descent: A guaranteed bound for a heuristic restart strategy. Retrieved from https://arxiv.org/abs/2310.07674
Jiang, T., Moursi, W., & Vavasis, S. (2023). Range of the displacement operator of PDHG with applications to quadratic and conic programming. Retrieved from https://arxiv.org/abs/2309.15009
Jiang, T., Tang, S., & Vavasis, S. (2023). Re-embedding data to strengthen recovery guarantees of clustering. Retrieved from https://arxiv.org/abs/2301.10901
Tunçel, L., Vavasis, S. A., & Xu, J. (2023). Computational complexity of decomposing a symmetric matrix as a sum of positive semidefinite and diagonal matrices. Foundations of Computational Mathematics, 2023, 1-47. Retrieved from https://arxiv.org/abs/2209.05678
Doan, X. V., & Vavasis, S. A. (2022). Low-rank matrix recovery with Ky Fan 2-k-norm. Journal of Global Optimization, 82, 727-751. Retrieved from https://link.springer.com/article/10.1007/s10898-021-01031-0
Jiang, T., & Vavasis, S. A. (2021). Certifying clusters from sum-of-norms clustering. Retrieved from https://arxiv.org/abs/2006.11355
Majmudar, J., & Vavasis, S. (2020). Provable overlapping community detection in weighted graphs. In Neural Information Processing Systems (NeurIPS) (Vol. 2020). Retrieved from https://proceedings.neurips.cc/paper/2020
Baghal, S., Paquette, C., & Vavasis, S. (2020). A termination criterion for stochastic gradient descent for binary classification. Retrieved from https://arxiv.org/abs/2003.10312
Vavasis, S., Papoulia, K., & Hirmand, M. R. (2020). Second-order cone interior-point method for quasistatic and moderate dynamic cohesive fracture. Comput. Meth. Appl. Mech. Engr., 358, 112633. Retrieved from https://arxiv.org/abs/1909.10641
Jiang, T., Vavasis, S., & Zhai, C. W. (2020). Recovery of a mixture of Gaussians by sum-of-norms clustering. Journal of Machine Learning Research, 21(225), 1-16. Retrieved from https://jmlr.org/papers/volume21/19-218/19-218.pdf
Doan, X. V., & Vavasis, S. (2019). Low-rank matrix recovery with Ky Fan 2-k-norm. In Optimization of Complex Systems: Theory, Models and Applications (pp. 310-319).
Paquette, C., & Vavasis, S. (2019). Potential-based analyses of first-order methods for constrained and composite optimization.
Gillis, N., & Vavasis, S. A. (2018). On the Complexity of Robust PCA and l1-Norm Low-Rank Matrix Approximation. Mathematics of Operations Research, 43, 1072-1084.
Vavasis, S., Papoulia, K., & Hirmand, M. (2018). Second-order cone interior-point method for quasistatic and moderate dynamic cohesive fracture.
Karimi, S., & Vavasis, S. (2017). IMRO: A Proximal Quasi-Newton Method for Solving $\ell_1$-Regularized Least Squares Problems. SIAM J. Optim., 27, 583-615.
Karimi, S., & Vavasis, S. (2017). A single potential governing convergence of conjugate gradient, accelerated gradient and geometric descent.
Graduate studies
I am currently accepting graduate students. Please submit a graduate studies application and list my name as a potential advisor.