Robert J. Webber

Google Scholar | CV | Ph.D. Thesis | Email

I am a recent Ph.D. graduate of the Courant Institute of Mathematical Sciences, advised by Jonathan Weare. In August 2021, I will be moving to Caltech's Department of Computing + Mathematical Sciences as a postdoctoral scholar supervised by Joel Tropp.

My goal as a researcher is to uncover the mathematical foundations of Monte Carlo methods in order to optimize their efficiency. In pursuit of this goal, I combine numerical analysis of existing Monte Carlo methods with development of new methods that improve accuracy.


Projects

Rare events can be highly impactful. However, estimating the probability \(p\) of a rare event by direct Monte Carlo sampling requires a very large sample size (more than \(100\,p^{-1}\) samples for roughly \(10\%\) relative accuracy). Since generating such a large sample can be prohibitively expensive, are there more practical rare event sampling methods that calculate small probabilities with reduced sample size requirements?
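
To see where the \(100\,p^{-1}\) figure comes from, here is a minimal sketch (a generic illustration with a made-up toy event, not code from any of my papers): the direct estimator averages indicator variables, and its relative standard error is about \(1/\sqrt{np}\), which reaches \(10\%\) at \(n = 100\,p^{-1}\).

```python
import random

random.seed(0)

# Toy rare event: a uniform draw on [0, 1] falls below p_true = 1e-3.
# The direct Monte Carlo estimator is the fraction of "hits" in n samples.
p_true = 1e-3
n = int(100 / p_true)  # the ~100/p rule of thumb

hits = sum(1 for _ in range(n) if random.random() < p_true)
p_hat = hits / n

# The estimator's relative standard error is sqrt(p(1-p)/n) / p, which is
# about 1/sqrt(n p); at n = 100/p this works out to roughly 10%.
rel_err = (p_true * (1 - p_true) / n) ** 0.5 / p_true
print(f"estimate: {p_hat:.2e}  (predicted relative error ~{rel_err:.0%})")
```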

To address this question, I have introduced or analyzed several Monte Carlo "splitting" methods for rare-event probability estimation, including "Quantile Diffusion Monte Carlo" (QDMC) and "weighted ensemble".
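
The splitting idea can be illustrated with a toy example (a generic multilevel splitting sketch with made-up dynamics, levels, and parameters, not an implementation of QDMC or weighted ensemble): walkers that make progress toward the rare event are cloned at intermediate thresholds, and the rare probability is recovered as a product of stage-by-stage survival fractions, so no single stage is itself rare.

```python
import random

random.seed(1)

DRIFT, FLOOR = -0.5, -10.0            # downward-drifting walk, absorbed at FLOOR
LEVELS = [2.0, 4.0, 6.0, 8.0, 10.0]   # intermediate thresholds up to the rare level
N = 2000                              # walkers per stage

def advance(x, target):
    """Run the walk from state x until it crosses `target` (success)
    or is absorbed below FLOOR (failure)."""
    while FLOOR < x < target:
        x += DRIFT + random.gauss(0.0, 1.0)
    return x >= target, x

states = [0.0] * N                    # all walkers start at the origin
p_hat = 1.0
for level in LEVELS:
    survivors = []
    for x in states:
        ok, y = advance(x, level)
        if ok:
            survivors.append(y)
    p_hat *= len(survivors) / len(states)
    if not survivors:
        break
    # Splitting step: clone survivors back up to N walkers for the next stage.
    states = [random.choice(survivors) for _ in range(N)]

print(f"splitting estimate of the rare-event probability: {p_hat:.2e}")
```

Direct sampling would need millions of walkers to see this event even once; the staged estimate reaches it with a few thousand walkers per level.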

In an exciting new application of QDMC, I worked with Dorian Abbot, Sam Hadden, and Jonathan Weare to calculate the probability that Mercury will become unstable and collide with another celestial body over the next 2 billion years. We estimated the probability to be \(\sim10^{-4}\) and obtained a speed-up of up to \(100\times\) compared to direct sampling.

Figure: We applied QDMC where splitting occurs every 0.2 Gyr, starts at 1.4 Gyr, and ends at a target time of 2.4 Gyr. The vertical axis is the 30-Myr running average of Mercury's eccentricity. The x’s signify a close encounter between Mercury and Venus.

Collaborators

Dorian Abbot, David Aristoff, Gideon Simpson, Morgan O'Neill, David Plotkin, and Jonathan Weare.

Papers

Many physical models have a dazzlingly high number of dimensions. For example, a simulation model for the atoms in a protein can have millions of dimensions, while a simulation model for the Earth’s weather can have billions of dimensions. This raises the question: how can we provide accurate statistical estimates for such high-dimensional models?

An important observation underlying many Monte Carlo algorithms for high-dimensional problems is that statistical estimates often depend most critically on a low-dimensional subspace of coordinates. These coordinates may vary from problem to problem, yet their existence helps to make estimation problems easier and Monte Carlo methods more efficient.
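
A toy example of this observation (a generic sketch; the integrand and dimensions are made up for illustration): when the quantity of interest depends only on a two-dimensional subspace of coordinates, the Monte Carlo error is governed by the sample size, not by the ambient dimension.

```python
import random

random.seed(2)

def estimate(d, n=2000):
    """Monte Carlo estimate of E[f(X)] for X uniform on [0, 1]^d, where
    f(x) = x[0] * x[1] depends only on a two-dimensional subspace."""
    total = 0.0
    for _ in range(n):
        x = [random.random() for _ in range(d)]
        total += x[0] * x[1]
    return total / n

# The exact answer is E[X1] * E[X2] = 0.25 for every d, and the Monte
# Carlo error is governed by n, not by the ambient dimension d.
for d in (2, 100):
    print(f"d = {d:4d}: estimate = {estimate(d):.3f}")
```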

Figure: The "variational approach to conformational dynamics" (VAC) exhibits decreasing error as the data size and the basis size increase. In our analysis, we established how to tune the parameters in VAC to provide accurate dimensionality reduction.

Collaborators

Jeremie Coullon, Aaron Dinner, Douglas Dow, Chatipat Lorpaiboon, Erik Thiede, and Jonathan Weare.

Papers

The ground state and the first few excited states determine the fundamental properties of quantum systems at low temperatures. However, as the system size increases, it becomes exponentially more difficult to compute these states using traditional numerical methods. To address this curse of dimensionality, I have developed several modern methods that harness the power of Monte Carlo sampling.
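
To make the curse of dimensionality concrete, here is a deterministic baseline (a pure-Python sketch for a made-up 3-spin transverse-field Ising model with illustrative couplings, not one of the methods from my papers): the Hamiltonian is a \( 2^L \times 2^L \) matrix, so dense iteration like this is only feasible for very small \(L\), which is exactly the regime that Monte Carlo methods extend.

```python
import math

# Transverse-field Ising model on L spins (open boundary):
#   H = -J * sum_i sz_i sz_{i+1}  -  h * sum_i sx_i
# The matrix dimension is 2^L, which is why deterministic methods
# become infeasible as L grows.
L, J, h = 3, 1.0, 0.5
dim = 2 ** L

H = [[0.0] * dim for _ in range(dim)]
for s in range(dim):
    z = [1 - 2 * ((s >> i) & 1) for i in range(L)]   # sz eigenvalues, +1 or -1
    H[s][s] = -J * sum(z[i] * z[i + 1] for i in range(L - 1))
    for i in range(L):
        H[s][s ^ (1 << i)] += -h                     # sx flips spin i

# Power iteration on c*I - H converges to the ground state of H, since the
# shift turns the lowest-energy eigenvector into the dominant one.
c = 10.0
v = [1.0] * dim
for _ in range(500):
    w = [c * v[r] - sum(H[r][k] * v[k] for k in range(dim)) for r in range(dim)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

# Rayleigh quotient: estimate of the ground-state energy.
e0 = sum(v[r] * H[r][k] * v[k] for r in range(dim) for k in range(dim))
print(f"ground-state energy for L = {L}: {e0:.4f}")
```

Storing this matrix already costs \(4^L\) entries, so even modest spin counts are far out of reach for dense methods.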

My collaborators and I have applied the Rayleigh-Gauss-Newton (RGN) optimization method to spin systems with up to 400 spins (hence \( 2^{400} \) possible spin configurations) and have applied fast randomized iteration (FRI) to molecules as large as nitrogen (which has 14 interacting electrons).

Figure: Our new RGN optimization method for ground state wavefunctions leads to faster convergence and lower energy errors compared to the conventional method of natural gradient descent.

Collaborators

Timothy Berkelbach, Samuel Greene, Michael Lindsey, and Jonathan Weare.

Papers


Talks

You can hear me talk at:

Lastly, here is a talk I gave at SIAM MPE2020 about rare event sampling: