Computation of the Response Surface in the Tensor Train data format

by Sergey Dolgov, Boris N. Khoromskij, Alexander Litvinenko, Hermann G. Matthies
Manuscript year: 2014

Bibliography

Sergey Dolgov, Boris N. Khoromskij, Alexander Litvinenko and Hermann G. Matthies, "Computation of the Response Surface in the Tensor Train data format", submitted on 11 Jun 2014, under revision.

Abstract

We apply the Tensor Train (TT) approximation to construct the Polynomial Chaos Expansion (PCE) of a random field, and solve the stochastic elliptic diffusion PDE with the stochastic Galerkin discretization. We compare two strategies for the polynomial chaos expansion: sparse and full polynomial (multi-index) sets. In the full set, the polynomial orders are chosen independently in each variable, which provides higher flexibility and accuracy. However, the total number of degrees of freedom grows exponentially with the number of stochastic coordinates. To cope with this curse of dimensionality, the data is kept compressed in the TT decomposition, a recurrent low-rank factorization. PCE computations on sparse sets have been studied extensively, whereas the TT representation for the PCE is a novel approach investigated in this paper. We outline how to deduce the PCE from the covariance matrix, assemble the Galerkin operator, and evaluate some post-processing quantities (mean, variance, and Sobol indices), staying within the low-rank framework. Two stages are the most demanding. First, we interpolate the PCE coefficients in the TT format from a small number of samples, which is performed via the block cross approximation method. Second, we solve the discretized equation (a large linear system) via the alternating minimal energy (AMEn) algorithm. In the numerical experiments we demonstrate that the full expansion set encapsulated in the TT format is indeed preferable in cases where high accuracy and high polynomial orders are required.
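The key point of the abstract is that the high-dimensional PCE coefficient tensor is never stored entry by entry but kept as a chain of small three-dimensional TT cores. The sketch below is a minimal plain-NumPy illustration of that compression idea via sequential truncated SVDs; it is not the authors' code, which relies on the block cross approximation and AMEn precisely to avoid ever forming the full tensor. The function name `tt_svd`, the tolerance splitting, and the toy example are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, eps=1e-8):
    """Compress a full tensor (n1 x ... x nd) into a list of TT cores
    of shape (r_{k-1}, n_k, r_k) via sequential truncated SVDs."""
    dims = tensor.shape
    d = len(dims)
    # Distribute the relative accuracy eps over the d-1 truncation steps.
    delta = eps / np.sqrt(d - 1) * np.linalg.norm(tensor) if d > 1 else 0.0
    cores = []
    rank = 1
    unfolding = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        # Smallest rank whose discarded tail stays below the tolerance.
        tails = np.cumsum(s[::-1] ** 2)[::-1]
        new_rank = max(1, int(np.sum(tails > delta ** 2)))
        U, s, Vt = U[:, :new_rank], s[:new_rank], Vt[:new_rank, :]
        cores.append(U.reshape(rank, dims[k], new_rank))
        # Carry the remainder to the next mode and re-unfold it.
        unfolding = (np.diag(s) @ Vt).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(unfolding.reshape(rank, dims[-1], 1))
    return cores

# Toy example: a rank-1 four-dimensional tensor with 10^4 entries is
# represented by four small cores instead of the full array.
x = np.arange(1, 11, dtype=float)
full = np.einsum('i,j,k,l->ijkl', x, x, x, np.sin(x))
cores = tt_svd(full, eps=1e-10)
print([c.shape for c in cores])  # e.g. (1, 10, 1) for each core
```

In the setting of the paper the storage cost of such a representation scales linearly in the number of stochastic coordinates (for bounded TT ranks), which is what makes the full multi-index PCE set tractable.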

arXiv:1406.2816
