CDS Seminar: The Training Dynamics and Local Geometry of High-Dimensional Learning
Speaker: Aukosh Jagannath (University of Waterloo)
Location: 60 Fifth Avenue, 7th Floor Open Space
Date: Thursday, April 16, 2026
Many modern data science tasks involve optimizing a complex, random function in high dimensions. The go-to methods for such problems are variations of stochastic gradient descent (SGD). Though they perform remarkably well, the rigorous analysis of SGD on high-dimensional statistical models is in its infancy. This talk will study the high-dimensional limits of the training dynamics and the local geometry from the point of view of the algorithm. We will discuss how, in this limit, for many statistical tasks, the evolution of certain summary statistics converges to a closed, finite-dimensional dynamical system, called the effective dynamics, and how the spectra of the Hessian and information matrices admit an explicit characterization that depends only on these summary statistics. I will also discuss how these methods can be used to analyze variants of SGD, such as SGD with momentum and certain pre-conditioning schemes. This talk is based on a series of joint works with G. Ben Arous (NYU), R. Gheissari (Northwestern), J. Huang (U Penn), T. Jones-McCormick (Waterloo), and V. Sarangian (Waterloo).
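The "effective dynamics" phenomenon can be illustrated on a toy model (a minimal sketch, not taken from the talk): for online SGD on a noiseless Gaussian linear model with a unit-norm signal, the overlap between the iterate and the signal is a summary statistic whose high-dimensional evolution is tracked by a one-dimensional ODE. All parameter choices below (dimension, step-size scaling, time horizon) are illustrative assumptions.

```python
import numpy as np

# Toy illustration of effective dynamics (assumed setup, not the talk's model):
# data x ~ N(0, I_d), response y = <x, theta_star>, squared loss,
# one fresh sample per SGD step, step size eta = delta/d with delta = 1.
rng = np.random.default_rng(0)
d = 2000                      # ambient dimension
eta = 1.0 / d                 # step size in the assumed 1/d scaling
steps = 3 * d                 # corresponds to rescaled time t = steps * eta = 3

theta_star = np.zeros(d)
theta_star[0] = 1.0           # unit-norm signal
theta = rng.standard_normal(d)
theta /= np.linalg.norm(theta)

m = [theta @ theta_star]      # summary statistic: overlap with the signal
for _ in range(steps):
    x = rng.standard_normal(d)            # fresh sample each step (online SGD)
    y = x @ theta_star                    # noiseless linear response
    theta += eta * (y - x @ theta) * x    # SGD step on the squared loss
    m.append(theta @ theta_star)

# In this toy model the overlap follows the closed ODE m'(t) = 1 - m(t)
# on the rescaled time t = (step index) * eta, so m(t) = 1 - (1 - m(0)) e^{-t}.
t = np.arange(steps + 1) * eta
m_ode = 1.0 - (1.0 - m[0]) * np.exp(-t)
print(m[-1], m_ode[-1])       # simulated overlap vs. ODE prediction
```

As the dimension grows, the per-step fluctuations of the overlap are of order 1/d, so the simulated trajectory concentrates on the ODE curve; the spirit of the results discussed in the talk is a rigorous version of this picture for much richer models.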
Bio: Aukosh Jagannath is the Canada Research Chair in Mathematical Foundations of Data Science and an Associate Professor in the Department of Statistics and Actuarial Science at the University of Waterloo. His recent work has focused on understanding the landscape and dynamics of learning, the hardness of random sampling and optimization problems, and the fundamental limits of learning, all in the high-dimensional regime, as well as related problems in statistical mechanics. He received his PhD from NYU and was a Benjamin Peirce Fellow at Harvard University and an NSF postdoctoral fellow at Harvard and the University of Toronto. He is the recipient of a Canada Research Chair, a Golden Jubilee award, and a NeurIPS outstanding paper award.