NeurIPS, short for Neural Information Processing Systems, is the foremost international conference for research on AI and Machine Learning. The conference receives thousands of submissions a year, and of the approximately 20% that are accepted, only a handful are recognized with an Outstanding Paper award.
Jagannath’s paper is titled “High-dimensional limit theorems for SGD: effective dynamics and critical scaling.” He co-authored it with Dr. Gerard Ben Arous, professor of mathematics at New York University, and Dr. Reza Gheissari, assistant professor of mathematics at Northwestern University.
The paper studies the scaling limits of SGD with constant step-size in the high-dimensional regime, showing how complex the dynamics of SGD can become when the step size is large. “Characterizing the nature of the SDE and comparing it to the ODE when the step size is small,” the authors write, “gives insights into the nonconvex optimization landscape.”
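In schematic terms (an illustrative rendering of the setup rather than the paper's precise statement; the symbols $L$, $Y_k$, $u$, and $\delta$ here are placeholder notation), SGD with constant step size $\delta$ in dimension $n$ updates

$$x_{k+1} = x_k - \delta\, \nabla_x L(x_k; Y_{k+1}), \qquad x_k \in \mathbb{R}^n,$$

and the paper tracks low-dimensional summary statistics $u(x_k)$ as $n \to \infty$. Roughly, when $\delta$ is small relative to the dimension, these trajectories concentrate on a deterministic "ballistic" ODE limit, while at a critical step-size scaling the noise no longer averages out and the limit is instead a "diffusive" SDE, which is the distinction the quoted passage draws.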
Jagannath, who came to Waterloo in 2019, works at the interface of mathematical physics, mathematical data science, optimization, and high-dimensional statistics. "It's a real honour!" he says of the award. "It's nice to know that the line of work Gerard, Reza, and I have been developing over the past few years is starting to be noticed by the broader AI and Machine Learning community."