Dimension-independent likelihood-informed MCMC

by Tiangang Cui, Kody J. H. Law, Youssef M. Marzouk
Year: 2014

Bibliography

Tiangang Cui, Kody J. H. Law, Youssef M. Marzouk, Dimension-independent likelihood-informed MCMC, arXiv:1411.3688v1, Nov. 2014.

Abstract

Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent, likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.

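To make the function-space proposal idea concrete, the following is a minimal sketch (not the paper's DILI sampler) of the preconditioned Crank-Nicolson (pCN) proposal, the prior-reversible special case that the operator-weighted family generalizes. The standard Gaussian prior, toy forward map, noise level, and step size beta below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200                                   # discretization dimension of the unknown function
sigma = 0.1                               # observation noise std (assumption)
forward = lambda u: np.sin(u)             # toy nonlinear forward map (assumption)
u_true = rng.standard_normal(n)           # synthetic "truth" drawn from the N(0, I) prior
data = forward(u_true) + sigma * rng.standard_normal(n)

def misfit(u):
    # Negative log-likelihood Phi(u) = ||F(u) - y||^2 / (2 sigma^2)
    return np.sum((forward(u) - data) ** 2) / (2.0 * sigma ** 2)

def pcn(n_steps=5000, beta=0.2):
    u = rng.standard_normal(n)            # start from a prior draw
    phi_u = misfit(u)
    chain, accepted = [], 0
    for _ in range(n_steps):
        xi = rng.standard_normal(n)       # prior-distributed increment
        v = np.sqrt(1.0 - beta ** 2) * u + beta * xi   # pCN proposal
        phi_v = misfit(v)
        # The acceptance ratio involves only the data misfit; the Gaussian prior
        # is handled exactly by the proposal, which is what keeps the acceptance
        # rate stable as the discretization dimension n grows.
        if np.log(rng.uniform()) < phi_u - phi_v:
            u, phi_u = v, phi_v
            accepted += 1
        chain.append(u.copy())
    return np.asarray(chain), accepted / n_steps

samples, acc_rate = pcn()
print(f"acceptance rate: {acc_rate:.2f}")
```

The DILI samplers described in the abstract go beyond this by replacing the scalar step size with operators built from Hessian information of the data misfit: a Langevin-type, likelihood-informed move is applied in the low-dimensional subspace where the posterior departs most from the prior, and a pCN-like move in its complement.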

Keywords

Dimension-independent likelihood-informed MCMC