Publications

Alexandros Beskos, Ajay Jasra, Kody Law, Raul Tempone and Yan Zhou, **Multilevel Sequential Monte Carlo Samplers**, submitted 2015

Multilevel Monte Carlo, Sequential Monte Carlo, Bayesian Inverse Problems

In this article we consider the approximation of expectations w.r.t. probability distributions associated with the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods, leading to a discretisation bias, with step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated with a Monte Carlo approximation of a sequence of probability distributions with discretisation levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve i.i.d. sampling from the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that, under appropriate assumptions, the attractive property of a reduction of the computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. The approach is numerically illustrated on a Bayesian inverse problem.
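The telescoping identity at the heart of MLMC can be illustrated on a toy problem. The sketch below (a minimal illustration, not the paper's SMC sampler) estimates E[f_L] as E[f_0] plus a sum of level corrections E[f_l − f_{l−1}], where f_l is a quadrature approximation at step size h_l = 2^{−l} of a random integral; the function `f_level` and the toy integrand are hypothetical choices made for the example. Crucially, each correction couples f_l and f_{l−1} on the *same* random draw, which is what makes the correction terms low-variance.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_level(u, l):
    """Midpoint-rule approximation of ∫_0^1 sin(pi*u*t) dt at step size h_l = 2**-l.

    Finer levels (larger l) have smaller discretisation bias but cost more.
    """
    n = 2 ** l
    t = (np.arange(n) + 0.5) / n
    return np.sin(np.pi * u * t).mean()

def mlmc_estimate(L, N):
    """Multilevel estimator of E_U[f_L(U)] via the telescoping identity
    E[f_L] = E[f_0] + sum_{l=1}^{L} E[f_l - f_{l-1}],
    with independent Monte Carlo samples at each level (N per level here;
    in practice N would shrink with l to balance cost and variance)."""
    # Coarsest level: plain Monte Carlo estimate of E[f_0].
    est = np.mean([f_level(u, 0) for u in rng.uniform(size=N)])
    # Level corrections: f_l and f_{l-1} are coupled through the same draw u.
    for l in range(1, L + 1):
        us = rng.uniform(size=N)
        est += np.mean([f_level(u, l) - f_level(u, l - 1) for u in us])
    return est

if __name__ == "__main__":
    ml = mlmc_estimate(L=4, N=5000)
    # Reference: single-level Monte Carlo directly at the finest level.
    direct = np.mean([f_level(u, 4) for u in rng.uniform(size=5000)])
    print(ml, direct)
```

Because both estimators target E[f_4], the two printed values agree up to Monte Carlo error; the multilevel version spends most of its samples where they are cheap (coarse levels), which is the source of the cost reduction the abstract refers to.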