CILVR Seminar: Learning Predictive Embeddings through Inter-View Regressor Alignment

Speaker: Michael Arbel

Location: 60 Fifth Avenue, Room 204
Videoconference link: https://nyu.zoom.us/j/94247644707

Date: Tuesday, April 21, 2026

Non-contrastive self-supervised learning has emerged as a powerful framework for predictive representation learning, with strong empirical performance on downstream tasks. Yet its most popular methods, such as SimSiam and BYOL, rely on heuristic mechanisms—notably stop-gradient and exponential moving average updates—that are not tied to an explicit objective and can admit collapse, making their dynamics difficult to understand in general.

In this work, we give a general analysis of idealized SimSiam-like dynamics. We show that these dynamics recover the leading nonlinear canonical correlation subspaces, prove convergence to equilibria, and provide a full stability characterization. Our analysis reveals that collapsed equilibria are not merely pathological possibilities: they can be stable attractors.

Motivated by this, we introduce a non-contrastive SSL method with an explicit predictive objective, defined by the trace of the best regularized linear regressor between the two views. We show that its global maximizers recover the same canonical correlation subspaces, with the regularization automatically selecting the effective dimension. Its dynamics, however, are fundamentally different: all collapsed equilibria are unstable, and the only stable critical points are global maximizers of the objective. Preliminary experiments support the practical effectiveness of the method.
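As a rough illustration of the kind of objective described above (the exact formulation is the speaker's; the normalization and naming below are assumptions), one can read "trace of the best regularized linear regressor between two views" as the trace of the ridge-optimal map predicting one view's embedding from the other's:

```python
import numpy as np

def predictive_objective(Z1, Z2, lam=1e-3):
    """Hedged sketch of a trace-of-ridge-regressor objective.

    Assumption: with centered embeddings Z1, Z2 (shape n x d), the
    ridge-optimal linear map predicting view 2 from view 1 is
    W* = C21 @ inv(C11 + lam * I), and the objective is tr(W*).
    This is an illustrative reading, not the speaker's exact definition.
    """
    n, d = Z1.shape
    Z1c = Z1 - Z1.mean(axis=0)
    Z2c = Z2 - Z2.mean(axis=0)
    C11 = Z1c.T @ Z1c / n                                # view-1 covariance
    C21 = Z2c.T @ Z1c / n                                # cross-view covariance
    W = C21 @ np.linalg.inv(C11 + lam * np.eye(d))       # ridge regressor
    return np.trace(W)

rng = np.random.default_rng(0)
Z = rng.standard_normal((1024, 8))
print(predictive_objective(Z, Z, lam=1e-6))       # near 8: views fully predictive
print(predictive_objective(Z, np.zeros_like(Z)))  # 0.0: a collapsed view scores zero
```

Note how a collapsed (constant) view yields a zero cross-covariance and hence a zero objective, consistent with the claim that collapsed configurations cannot be maximizers.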

Speaker Bio:
Michael N. Arbel is a Research Scientist at Inria Grenoble and a member of the THOTH team. His work sits at the intersection of representation learning, optimization, and learning theory. Before taking up this role, he was a Starting Research Fellow at Inria Grenoble, where he worked with Julien Mairal. He earned his PhD in 2021 from the Gatsby Computational Neuroscience Unit at University College London, under the supervision of Arthur Gretton, after prior training in applied mathematics at École Polytechnique and in the MVA program at ENS Paris-Saclay.

Notes:

  • Refreshments will be served.
  • In-person attendance is only available to those with an active NYU ID card.