MAP Estimators for Bayesian Nonparametrics

by M. Dashti, K. J. H. Law, A. M. Stuart and J. Voss
Refereed Journals
Year: 2013

Bibliography

M. Dashti, K. J. H. Law, A. M. Stuart and J. Voss. MAP Estimators for Bayesian Nonparametrics. Inverse Problems, 29, 095017 (2013).

Abstract

We consider the inverse problem of estimating an unknown function $u$ from noisy measurements $y$ of a known, possibly nonlinear, map $\mathcal{G}$ applied to $u$. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field $\mu_0$. We work under a natural set of conditions on the likelihood which imply the existence of a well-posed posterior measure, $\mu^y$. Under these conditions we show that the {\em maximum a posteriori} (MAP) estimator is well-defined as the minimiser of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency. We also prove a similar result for the case where the observation of $\mathcal{G}(u)$ can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics.
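For orientation, here is a schematic statement of the variational characterisation described above, in standard notation that does not appear verbatim in the abstract (assumed here for illustration): write the posterior as $\frac{d\mu^y}{d\mu_0}(u) \propto \exp(-\Phi(u;y))$, where $\Phi$ denotes the negative log-likelihood, and let $E$ denote the Cameron-Martin space of the Gaussian prior $\mu_0$. The Onsager-Machlup functional whose minimisers characterise the MAP estimator then takes the form
$$ I(u) = \Phi(u;y) + \tfrac{1}{2}\|u\|_E^2, $$
so that computing the MAP estimator amounts to minimising $I$ over $E$, which is the link between the probabilistic formulation and the calculus of variations referred to above.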

ISSN: 0266-5611

doi:10.1088/0266-5611/29/9/095017

Keywords

Probability; Bayesian non-parametrics; inverse problems