Robert J. Webber

I am currently a Ph.D. candidate advised by Jonathan Weare at the Courant Institute of Mathematical Sciences.

My goal as a researcher is to uncover the mathematical foundations of Monte Carlo methods in order to optimize their efficiency. In pursuit of this goal, I combine numerical analysis of existing Monte Carlo methods with the development of new, more accurate methods.

To find out more, check out my curriculum vitae or take a look at the projects below. You can also reach me at rw2515 [AT] nyu [DOT] edu.


Projects

Monte Carlo methods are needed to identify the probabilities of rare, impactful events. However, to estimate a rare probability \(p\) with just a single digit of accuracy, direct Monte Carlo sampling requires a very large sample size (\(> 100p^{-1}\)). Since generating such a large sample can be computationally prohibitive, are there more practical rare event sampling methods that calculate rare probabilities with reduced sample size requirements?
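
To see where the \(100 p^{-1}\) threshold comes from, note that direct Monte Carlo averages \(n\) independent indicator samples, so

\[ \hat{p} = \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{X_i \in A\}, \qquad \frac{\sqrt{\operatorname{Var}(\hat{p})}}{p} = \sqrt{\frac{1-p}{n\,p}} \approx \frac{1}{\sqrt{n\,p}}. \]

Keeping the relative error below \(10\%\), the level needed for a single digit of accuracy, therefore requires \(n > 100\,p^{-1}\) samples.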

Collaborators

Dorian Abbot, David Aristoff, Gideon Simpson, Morgan O'Neill, David Plotkin, and Jonathan Weare.

Papers

Figure: Rare event "splitting" method is used to sample rare, high values of the position \(x\). White circles indicate that samples are killed. Black circles indicate that samples are preserved and possibly replicated.
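
As a toy illustration of the splitting idea sketched in the figure, and not the specific algorithms studied in the papers above, the Python fragment below uses fixed-effort multilevel splitting to estimate the small probability that a Gaussian random walk exceeds a high level within a fixed number of steps: walkers that reach each intermediate level are replicated, the rest are discarded, and the probability is estimated as the product of the stage-by-stage crossing fractions.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_crossings(starts, level, n_steps):
    """Continue each walker from (position, time); record its state when it first reaches `level`."""
    hits = []
    for x0, t0 in starts:
        x = x0
        for t in range(t0, n_steps):
            x += rng.normal()
            if x >= level:
                hits.append((x, t + 1))
                break
    return hits

def multilevel_splitting(levels, n_walkers=1000, n_steps=10):
    """Estimate P(the walk exceeds levels[-1] within n_steps) by fixed-effort splitting."""
    starts = [(0.0, 0)] * n_walkers
    p_hat = 1.0
    for level in levels:
        hits = first_crossings(starts, level, n_steps)
        if not hits:
            return 0.0
        p_hat *= len(hits) / n_walkers            # fraction surviving this stage
        idx = rng.integers(len(hits), size=n_walkers)
        starts = [hits[i] for i in idx]           # survivors are replicated ("split")
    return p_hat

print(multilevel_splitting(levels=[4.0, 8.0, 12.0]))
```

Each stage only has to resolve a moderately likely crossing event, which is what keeps the total sample size far below the \(100\,p^{-1}\) cost of direct sampling.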

In quantum mechanics and the analysis of Markov processes, Monte Carlo methods are needed to identify low-lying eigenfunctions of dynamical generators. The standard Monte Carlo approaches for identifying eigenfunctions, however, can be inaccurate or slow to converge. What limits the efficiency of the currently available spectral estimation methods, and what is needed to build more efficient methods for the future?

Collaborators

Timothy Berkelbach, Aaron Dinner, Douglas Dow, Samuel Greene, Chatipat Lorpaiboon, Erik Thiede, and Jonathan Weare.

Papers

Figure: Spectral estimation error approaches zero as the number of basis functions increases and the length of the time series increases.
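
As a rough illustration of the basis-function setting in the figure, a generic Galerkin-style scheme rather than the specific estimators analyzed in the papers above, the sketch below recovers the leading eigenvalues of a Markov transition operator from a single time series: basis functions are evaluated along the trajectory, and a small generalized eigenvalue problem is solved using the resulting overlap and time-lagged correlation matrices.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)

# Sample a long trajectory of a simple ergodic process (an AR(1) / discretized OU chain).
n, lag, rho = 200_000, 10, 0.99
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = rho * x[t - 1] + np.sqrt(1 - rho**2) * rng.normal()

# Basis functions evaluated along the trajectory (monomials 1, x, ..., x^4 as a simple choice).
basis = np.vander(x, N=5, increasing=True)        # shape (n, 5)

phi_0, phi_lag = basis[:-lag], basis[lag:]
C0 = phi_0.T @ phi_0 / len(phi_0)                 # overlap matrix
Ct = phi_0.T @ phi_lag / len(phi_0)               # time-lagged correlation matrix

# Generalized eigenvalue problem: estimated eigenvalues of the lag-`lag` transition operator.
evals = eigh(0.5 * (Ct + Ct.T), C0, eigvals_only=True)
print(np.sort(evals)[::-1][:3])                   # for this AR(1) chain the exact values are rho**(k*lag)
```

The two knobs in the figure appear directly here: a richer basis reduces the approximation error, and a longer time series reduces the sampling error in the correlation matrices.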

In many Bayesian inverse problems, Markov chain Monte Carlo (MCMC) methods are needed to approximate distributions on infinite-dimensional function spaces. Yet designing efficient and broadly applicable MCMC methods for function spaces has proved challenging. The available samplers are either inefficient, or they require gradient or covariance information that can be difficult to obtain. This raises the question: are there functional samplers that perform efficiently without requiring any gradient or covariance information?
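
For context, here is a minimal sketch of the preconditioned Crank-Nicolson (pCN) proposal, a standard gradient-free baseline for function-space MCMC rather than the sampler developed in the paper below. It needs only draws from the Gaussian prior and evaluations of the negative log-likelihood, and its acceptance rule does not degenerate as the discretization dimension grows.

```python
import numpy as np

rng = np.random.default_rng(2)

def pcn_sampler(neg_log_likelihood, prior_sample, n_samples, beta=0.2):
    """Preconditioned Crank-Nicolson MCMC: dimension-robust, no gradients needed.

    `prior_sample()` draws from the Gaussian prior N(0, C); the target is
    proportional to exp(-neg_log_likelihood(u)) with respect to that prior.
    """
    u = prior_sample()
    phi_u = neg_log_likelihood(u)
    samples = []
    for _ in range(n_samples):
        xi = prior_sample()
        v = np.sqrt(1.0 - beta**2) * u + beta * xi     # pCN proposal, reversible w.r.t. the prior
        phi_v = neg_log_likelihood(v)
        if np.log(rng.uniform()) < phi_u - phi_v:      # accept/reject on the likelihood only
            u, phi_u = v, phi_v
        samples.append(u.copy())
    return np.array(samples)

# Toy use: a 100-dimensional "function" with a Gaussian likelihood centered at 1.
d = 100
draws = pcn_sampler(
    neg_log_likelihood=lambda u: 0.5 * np.sum((u - 1.0) ** 2),
    prior_sample=lambda: rng.normal(size=d),
    n_samples=5000,
)
print(draws.mean())    # should approach 0.5, the per-coordinate posterior mean in this toy problem
```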

Collaborator

Jeremie Coullon

Paper


Talks

You can hear me talk (virtually) at:

Lastly, here is a talk I gave at SIAM MPE2020 about rare event sampling: