Andrew Gordon Wilson


We are a research group at New York University working to understand the foundations of generalization, learning, and decision making, with the aim of building highly practical new methods in machine learning. We often (1) take a probabilistic approach; (2) care about automatically discovering scientifically interpretable structure in data; and (3) are excited about connections with physics, numerical methods, and scientific computing. We are particularly active in building methods for Bayesian and probabilistic deep learning, scalable Gaussian processes, kernel learning, geometric deep learning, physics-inspired deep learning, and training of deep neural networks. We believe in open and reproducible research. If you'd like to try out these methods, check out our code page.

There are 1-2 PhD openings in our group for Fall 2021. If you wish to apply, we strongly recommend reading some of our papers carefully and describing in your application how your interests connect to our work. Also check out the ICML Bayesian Deep Learning Tutorial and the Lecture on Representation Learning with Gaussian Processes. Students with a strong technical foundation, typically an undergraduate degree in physics, math, or computer science (with many math courses), together with strong programming experience, usually have a good background for research in the group.

I advise students in Courant Computer Science (Dec 12 deadline), Mathematics (Dec 18 deadline), and the Center for Data Science (Dec 12 deadline). Admissions are handled by a centralized committee.

Group Leader
Andrew Gordon Wilson

Postdoctoral Fellow
Currently considering applications

PhD Students
Greg Benton
Marc Finzi
William Herlands
Pavel Izmailov
Polina Kirichenko
Wesley Maddox
Samuel Stanton

Masters Students
Ian Delbridge
Ke Alexander Wang

Alumni
Ben Athiwaratkun (PhD; now at Amazon AI Research)
Jacob Gardner (Postdoc; now Assistant Professor at the University of Pennsylvania)
Patrick Nicholson (Masters)
Michael Luo (Masters)