Benjamin Peherstorfer   |   Courant Institute of Mathematical Sciences, New York University

Greedy construction of quadratic manifolds for nonlinear dimensionality reduction and nonlinear model reduction

Dimensionality reduction on quadratic manifolds augments linear approximations with quadratic correction terms. Previous works rely on linear approximations given by projections onto the first few leading principal components of the training data; however, linear approximations in subspaces spanned by the leading principal components alone can miss information that is necessary for the quadratic correction terms to be effective. In this work, we propose a greedy method that constructs subspaces from leading as well as later principal components so that the corresponding linear approximations can be corrected most effectively with quadratic terms.
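
A minimal sketch of the inner fitting step, with hypothetical names: for a fixed basis V with orthonormal columns and a centered snapshot matrix X of shape (n, k), the quadratic correction W is obtained by least squares. The greedy column selection of the paper sits on top of this step.

```python
# Minimal sketch (hypothetical names): fit the quadratic correction W for
# a fixed basis V with orthonormal columns; X is a centered (n, k)
# snapshot matrix.
import numpy as np

def quad_features(Z):
    """Non-redundant quadratic monomials z_i * z_j, i <= j, columnwise."""
    r = Z.shape[0]
    return np.stack([Z[i] * Z[j] for i in range(r) for j in range(i, r)])

def fit_quadratic_correction(X, V):
    """Least-squares fit of W in X ~ V Z + W q(Z) with Z = V^T X."""
    Z = V.T @ X                    # linear reduced coordinates
    R = X - V @ Z                  # residual the quadratic term must explain
    W = R @ np.linalg.pinv(quad_features(Z))  # regularize in practice
    return W

def reconstruction_error(X, V, W):
    Z = V.T @ X
    return np.linalg.norm(X - V @ Z - W @ quad_features(Z))
```

The greedy method of the paper then selects which principal components form the columns of V by repeatedly adding the candidate that most reduces this reconstruction error; the sketch only shows the fit for a fixed V.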

Download Python code from GitHub https://github.com/Algopaul/greedy_quadratic_manifolds.

References:

[1] Schwerdtner, P. & Peherstorfer, B. Greedy construction of quadratic manifolds for nonlinear dimensionality reduction and nonlinear model reduction.
arXiv, 2403.06732, 2024.

Nonlinear embeddings for conserving Hamiltonians and other quantities with Neural Galerkin schemes

We propose Neural Galerkin schemes that compute at each time step an explicit embedding onto the manifold of nonlinearly parametrized solution fields, which guarantees conservation of quantities such as Hamiltonians. The embeddings can be combined with standard explicit and implicit time integration schemes.
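
To illustrate the embedding idea, here is a minimal sketch under simplifying assumptions (scalar conserved quantity H evaluated on the parameters, projection along grad H, hypothetical names); the paper's embeddings handle more general quantities and integrators.

```python
# Hedged sketch of an embedded time step: take an unconstrained step on
# the parameters theta, then project back onto the level set {H = H0}
# of a scalar conserved quantity along grad H.
import numpy as np

def project_onto_level_set(theta, H, gradH, H0, iters=10, tol=1e-12):
    """Scalar Newton iteration for lam such that H(theta - lam * g) = H0."""
    g = gradH(theta)
    lam = 0.0
    for _ in range(iters):
        phi = H(theta - lam * g) - H0
        if abs(phi) < tol:
            break
        dphi = -gradH(theta - lam * g) @ g   # derivative of phi w.r.t. lam
        lam -= phi / dphi
    return theta - lam * g

def embedded_euler_step(theta, rhs, dt, H, gradH, H0):
    """Explicit Euler step on the parameters followed by the embedding."""
    return project_onto_level_set(theta + dt * rhs(theta), H, gradH, H0)
```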

Download Python code from GitHub https://github.com/Algopaul/ng_embeddings/blob/main/embedded_ng.ipynb.

References:

[1] Schwerdtner, P., Schulze, P., Berman, J. & Peherstorfer, B. Nonlinear embeddings for conserving Hamiltonians and other quantities with Neural Galerkin schemes.
arXiv, 2310.07485, 2023.

Neural Galerkin schemes with active learning for solving high-dimensional evolution equations

Machine learning methods have been shown to give accurate predictions in high dimensions provided that sufficient training data are available. Yet, many interesting questions in science and engineering involve situations where initially no data are available and the principal aim is to gather insights from a known model. Here we consider this problem in the context of systems whose evolution can be described by partial differential equations. We use deep learning to solve these equations by generating data on-the-fly when and where they are needed, without prior information about the solution.
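
A minimal sketch of the Neural Galerkin time step, assuming a scalar ansatz u(x, theta) with a flat parameter vector theta and a right-hand-side function f; all names are hypothetical, and the repository's implementation, including the active-learning sampling, is more elaborate.

```python
# Minimal Neural Galerkin sketch: the parameter velocity dtheta solves
# M(theta) dtheta = F(theta), the normal equations of
# min_dtheta sum_x |grad_theta u(x, theta) . dtheta - f(x, theta)|^2.
import jax
import jax.numpy as jnp

def neural_galerkin_rhs(u, f, theta, x_batch, reg=1e-8):
    """u(x, theta): scalar ansatz; f(x, theta): PDE right-hand side at x
    for the current solution; x_batch: batch of sample points."""
    grads = jax.vmap(lambda x: jax.grad(u, argnums=1)(x, theta))(x_batch)
    rhs = jax.vmap(lambda x: f(x, theta))(x_batch)
    M = grads.T @ grads / x_batch.shape[0]   # Gram matrix of the gradients
    F = grads.T @ rhs / x_batch.shape[0]
    return jnp.linalg.solve(M + reg * jnp.eye(M.shape[0]), F)
```

The returned parameter velocity can be fed to any time integrator, e.g. theta = theta + dt * neural_galerkin_rhs(u, f, theta, x_batch); the active-learning component adapts x_batch over time so that samples concentrate where they are needed.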

Download Python code from GitHub https://github.com/pehersto/ng.

References:

[1] Bruna, J., Peherstorfer, B. & Vanden-Eijnden, E. Neural Galerkin schemes with active learning for high-dimensional evolution equations.
Journal of Computational Physics, 2023.

Multilevel Stein Variational Gradient Descent (MLSVGD) for approximate Bayesian inference

MLSVGD is a multilevel Stein variational gradient descent method for efficient approximate sampling from target distributions in, e.g., Bayesian inference. The method moves most iterations to lower, cheaper levels of approximate target distributions with the aim of requiring only a few iterations on the higher, more expensive levels.
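
A minimal sketch of the two ingredients, assuming a Gaussian kernel with fixed bandwidth and user-supplied score functions for each level of the surrogate target; the names are hypothetical, and the paper's stopping criteria and parameter choices are more refined.

```python
# Hedged sketch: SVGD with a Gaussian kernel of fixed bandwidth h,
# wrapped in a multilevel schedule over surrogate targets.
import numpy as np

def svgd_step(X, grad_log_p, eps=1e-2, h=1.0):
    """One Stein variational gradient descent step for particles X (n, d)."""
    n = X.shape[0]
    diffs = X[:, None, :] - X[None, :, :]             # pairwise x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h))  # Gaussian kernel matrix
    scores = grad_log_p(X)                            # (n, d) target scores
    # Driving term: kernel-weighted scores plus repulsion from kernel gradients
    phi = (K @ scores + np.sum(K[:, :, None] * diffs, axis=1) / h) / n
    return X + eps * phi

def mlsvgd(X, grad_log_p_levels, tols, eps=1e-2):
    """Iterate on cheap levels first; move to the next, more expensive level
    once the mean particle displacement falls below the level tolerance."""
    for grad_log_p, tol in zip(grad_log_p_levels, tols):
        while True:
            X_new = svgd_step(X, grad_log_p, eps)
            moved = np.mean(np.linalg.norm(X_new - X, axis=1))
            X = X_new
            if moved < tol:
                break
    return X
```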

Download Matlab code from GitHub https://github.com/terrencealsup/MLSVGD.

References:

[1] Alsup, T., Venturi, L. & Peherstorfer, B. Multilevel Stein variational gradient descent with applications to Bayesian inverse problems.
In Mathematical and Scientific Machine Learning (MSML), 2021.

Sampling strategy for empirical interpolation and gappy proper orthogonal decomposition

Implements a strategy for selecting sampling points for (discrete) empirical interpolation and gappy proper orthogonal decomposition.
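
The deterministic selection can be sketched in the spirit of pivoted-QR point selection (QDEIM): the sampling points are the column pivots of a QR factorization of U^T. This is only a sketch with hypothetical names; the repository additionally covers the oversampled and randomized strategies from the paper.

```python
# Sketch of QDEIM-style point selection and gappy reconstruction.
import numpy as np
from scipy.linalg import qr

def qdeim_points(U):
    """Select m = U.shape[1] interpolation points for a basis U (n, m)."""
    _, _, piv = qr(U.T, pivoting=True, mode='economic')
    return piv[:U.shape[1]]

def gappy_reconstruct(U, points, f_at_points):
    """Gappy/DEIM reconstruction of f from its entries at the points."""
    c, *_ = np.linalg.lstsq(U[points, :], f_at_points, rcond=None)
    return U @ c
```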

Download Matlab code from GitHub https://github.com/pehersto/odeim.

References:

[1] Peherstorfer, B., Drmac, Z. & Gugercin, S. Stability of discrete empirical interpolation and gappy proper orthogonal decomposition with randomized and deterministic sampling points.
SIAM Journal on Scientific Computing, 42:A2837-A2864, 2020.

Operator Inference (OpInf)

Implements a data-driven model reduction method that builds on operator inference. Code by Elizabeth Qian (MIT).
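
The core regression can be sketched as follows, with hypothetical names and under simplifying assumptions (reduced trajectories Z and their time derivatives Zdot given, no input term, no regularization); Zdot is typically approximated from the snapshots, e.g., by finite differences.

```python
# Minimal operator inference sketch: infer A and H from reduced
# trajectories Z (r, k) and their time derivatives Zdot (r, k).
import numpy as np

def operator_inference(Z, Zdot):
    """Solve min ||A Z + H q(Z) - Zdot||_F for A (r, r) and H."""
    r = Z.shape[0]
    # Non-redundant quadratic terms z_i * z_j with i <= j
    Z2 = np.vstack([Z[i] * Z[j] for i in range(r) for j in range(i, r)])
    D = np.vstack([Z, Z2])                    # regression data matrix
    O, *_ = np.linalg.lstsq(D.T, Zdot.T, rcond=None)
    A, H = O.T[:, :r], O.T[:, r:]
    return A, H
```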

Download from Elizabeth's GitHub page https://github.com/elizqian/operator-inference.

References:

[1] Peherstorfer, B. & Willcox, K. Data-driven operator inference for nonintrusive projection-based model reduction.
Computer Methods in Applied Mechanics and Engineering, 306:196-215, 2016.
[2] Qian, E., Kramer, B., Marques, A.N. & Willcox, K.E. Transform & Learn: A data-driven approach to nonlinear model reduction.
In AIAA Aviation 2019 Forum, AIAA, 2019.

Multifidelity Monte Carlo

Implements the multifidelity Monte Carlo (MFMC) method for uncertainty propagation. The implementation demonstrates MFMC on a toy example with five models.
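
A minimal sketch of the estimator, with hypothetical names: the sample arrays of all models are assumed to be drawn from a shared stream of random inputs, and the control-variate coefficients (alpha_k = rho_k * sigma_1 / sigma_k in the paper) and sample counts are assumed to come from the paper's optimal model management.

```python
# Hedged MFMC sketch: Y_list[0] holds m_1 samples of the high-fidelity
# model; Y_list[k] holds m_k >= m_{k-1} samples of the k-th surrogate,
# all evaluated at shared random inputs so that Y_list[k][:len(Y_list[k-1])]
# reuses the previous level's inputs. alphas[0] is unused.
import numpy as np

def mfmc_estimate(Y_list, alphas):
    """MFMC estimator: high-fidelity mean plus control-variate corrections."""
    est = np.mean(Y_list[0])
    for k in range(1, len(Y_list)):
        m_prev = len(Y_list[k - 1])
        est += alphas[k] * (np.mean(Y_list[k]) - np.mean(Y_list[k][:m_prev]))
    return est
```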

Download from GitHub https://github.com/pehersto/mfmc.

References:

[1] Peherstorfer, B., Willcox, K. & Gunzburger, M. Optimal model management for multifidelity Monte Carlo estimation.
SIAM Journal on Scientific Computing, 38(5):A3163-A3194, 2016.

Online adaptive discrete empirical interpolation method

The online adaptive discrete empirical interpolation method (ADEIM) constructs reduced models that are adapted online. Updates to the DEIM basis are computed from sparse samples of the full model residual. This code demonstrates ADEIM on a time-dependent problem.
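
A heavily simplified sketch of the adaptation step, with hypothetical names: ADEIM in the paper derives the optimal low-rank update, whereas here the update is simply restricted to the sampled rows and computed by least squares.

```python
# Simplified online basis adaptation from sparse residual samples.
import numpy as np

def adapt_basis(U, P, F_sampled):
    """Update the rows P of the basis U (n, r) from full-model samples
    F_sampled (len(P), k) taken at the row indices P."""
    C, *_ = np.linalg.lstsq(U[P, :], F_sampled, rcond=None)  # coefficients
    R = F_sampled - U[P, :] @ C            # residual at the sampled rows
    # Delta minimizing ||(U[P] + Delta) C - F_sampled||_F
    Delta, *_ = np.linalg.lstsq(C.T, R.T, rcond=None)
    U_new = U.copy()
    U_new[P, :] += Delta.T
    U_new, _ = np.linalg.qr(U_new)         # re-orthonormalize the basis
    return U_new
```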

Download from GitHub https://github.com/pehersto/adeim.

References:

[1] Peherstorfer, B. & Willcox, K. Online Adaptive Model Reduction for Nonlinear Systems via Low-Rank Updates.
SIAM Journal on Scientific Computing, 37(4):A2123-A2150, 2015.