Global AI Frontier Lab Seminar Series: Scientific Inference with Diffusion Models

Speaker: Stephan Mandt

Location: Global AI Frontier Lab, 1 MetroTech Center

Date: Monday, April 27, 2026

Diffusion models have transformed generative modeling in vision and language. But can they also serve as tools for scientific inference? In this talk, we develop a Bayesian perspective on diffusion models for solving inverse problems—recovering latent states or parameters from noisy, partial observations—with applications ranging from climate science to imaging. In scientific settings, the objective is not visual realism but statistical reliability: calibrated uncertainty, reliable marginals and conditionals, computational efficiency, and robustness to extreme or heavy-tailed behavior. We present a unified set of methodological advances that address these requirements, enabling expressive conditioning, fast inference without retraining, principled uncertainty quantification, and extensions beyond Gaussian assumptions. The talk summarizes a series of recent NeurIPS/ICML/ICLR papers that advance these methods across a range of scientific domains.

Bio: Stephan Mandt is an Associate Professor of Computer Science and Statistics at the University of California, Irvine. His research focuses on the foundations and applications of generative AI, including probabilistic generative models, statistical inference, neural data compression, and AI for scientific discovery. He is a Chan Zuckerberg Initiative Investigator and AI Resident, and has received the NSF CAREER Award, the UCI ICS Mid-Career Excellence in Research Award, and a Kavli Fellowship from the US National Academy of Sciences. Prior to joining UCI, he led the machine learning group at Disney Research and held postdoctoral positions at Princeton University and Columbia University. He frequently serves as a Senior Area Chair for NeurIPS, ICML, and ICLR, and recently served as Program Chair and General Chair for AISTATS in 2024 and 2025, respectively.