
General rights Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights.

• Users may download and print one copy of any publication from the public portal for the purpose of private study or research.
• You may not further distribute the material or use it for any profit-making activity or commercial gain.
• You may freely distribute the URL identifying the publication in the public portal.

If you believe that this document breaches copyright please contact us providing details, and we will remove access to the work immediately and investigate your claim.

Downloaded from orbit.dtu.dk on: Dec 18, 2017

Spectral Tensor-Train Decomposition for low-rank surrogate models

Bigoni, Daniele; Engsig-Karup, Allan Peter; Marzouk, Youssef M.

Publication date: 2014

Document Version: Publisher's PDF, also known as Version of record

Link back to DTU Orbit

Citation (APA): Bigoni, D., Engsig-Karup, A. P., & Marzouk, Y. M. (2014). Spectral Tensor-Train Decomposition for low-rank surrogate models. Poster session presented at Spatial Statistics and Uncertainty Quantification on Supercomputers, Bath, United Kingdom.


Spectral tensor-train decomposition for low-rank surrogate models
Daniele Bigoni*¹, Allan P. Engsig-Karup¹, Youssef M. Marzouk²
¹ Department of Applied Mathematics and Computer Science, Technical University of Denmark
² Department of Aeronautics and Astronautics, Massachusetts Institute of Technology
* Corresponding author: [email protected]

Introduction
The construction of surrogate models is very important as a means of acceleration in computational methods for uncertainty quantification (UQ). When the forward model is particularly expensive compared to the accuracy loss due to the use of a surrogate, as for example in computational fluid dynamics (CFD), the latter can be used for the forward propagation of uncertainty [7] and the solution of inference problems [4].

Figure 1: TT-cross

Software: http://www.compute.dtu.dk/~dabi/ — Python PyPI: TensorToolbox

Problem setting
We consider $f \in L^2([a,b]^d)$, where $d \geq 1$, and assume $f$ is a computationally expensive function. Let $\xi \in [a,b]^d$ be random variables entering the formulation of a parametric problem. In the context of UQ, we might want to:

• Compute relevant statistics
• Investigate the sensitivity of $f$ to $\xi$
• Infer the distribution of $\xi$

In most real problems, these goals require a high number of evaluations of $f$. Often the construction of the surrogate, and its evaluation in place of the original $f$, provides a good payoff.
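As a minimal sketch (ours, not the poster's), the first goal above can be illustrated with plain Monte Carlo on a cheap surrogate: `surrogate` is a hypothetical stand-in for an STT approximation of an expensive forward model.

```python
import numpy as np

# Sketch: estimate statistics of f(xi), xi ~ U([a, b]^d), by Monte Carlo
# on a cheap surrogate instead of the expensive forward model itself.
# `surrogate` is a hypothetical stand-in for an STT approximation.
rng = np.random.default_rng(0)
a, b, d = 0.0, 1.0, 3

def surrogate(x):
    # stand-in for the surrogate of an expensive forward model
    return np.sin(x.sum(axis=-1))

xi = rng.uniform(a, b, size=(100_000, d))   # samples of xi in [a, b]^d
mean_est = surrogate(xi).mean()             # estimate of E[f(xi)]
var_est = surrogate(xi).var()               # estimate of Var[f(xi)]
```

Each statistic costs only surrogate evaluations, which is the payoff the poster refers to.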

Tensor-train decomposition
Let $f$ be evaluated at all points of a tensor grid $\mathcal{X} = \bigtimes_{j=1}^{d} \mathbf{x}_j$, where $\mathbf{x}_j = (x_{i_j})_{i_j=1}^{n_j}$ for $j \in [1,d]$. Let $\mathcal{A} = f(\mathcal{X})$.
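The grid evaluation $\mathcal{A} = f(\mathcal{X})$ can be sketched with NumPy as follows (function and variable names are ours, for illustration only):

```python
import numpy as np

# Sketch: evaluate f on the full tensor grid X = x_1 x ... x_d and store
# the result as the tensor A with A[i1, ..., id] = f(x_{i1}, ..., x_{id}).
def tensor_grid_eval(f, axes):
    """axes: list of d 1-D arrays x_j; returns A of shape (n_1, ..., n_d)."""
    mesh = np.meshgrid(*axes, indexing="ij")          # d arrays, each (n1, ..., nd)
    pts = np.stack([m.ravel() for m in mesh], axis=-1)  # all grid points, row-wise
    return f(pts).reshape([len(x) for x in axes])

# toy example in d = 2
x = np.linspace(0.0, 1.0, 5)
A = tensor_grid_eval(lambda p: p.sum(axis=-1), [x, x])
```

The full grid has $\prod_j n_j$ points, which is exactly the exponential cost the TT decomposition below avoids.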

Discrete tensor-train approximation [5]
For $\mathbf{r} = (1, r_1, \ldots, r_{d-1}, 1)$, let $\mathcal{A}_{TT}$ be s.t.

$$\mathcal{A}(i_1, \ldots, i_d) = \mathcal{A}_{TT}(i_1, \ldots, i_d) + \mathcal{E}_{TT}(i_1, \ldots, i_d)$$

$$\mathcal{A}_{TT} = \sum_{\alpha_0, \ldots, \alpha_d = 1}^{\mathbf{r}} G_1(\alpha_0, i_1, \alpha_1) \cdots G_d(\alpha_{d-1}, i_d, \alpha_d)$$
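The sum above is a product of matrices, one slice per core. A minimal sketch of evaluating one entry of a TT-format tensor (core layout and names are ours):

```python
import numpy as np

# Sketch: evaluate one entry of a tensor stored in TT format.
# Cores G_k are stored with shape (r_{k-1}, n_k, r_k), r_0 = r_d = 1, so
# A_TT(i_1, ..., i_d) is the product of the matrices G_k[:, i_k, :].
def tt_entry(cores, idx):
    v = np.ones((1, 1))
    for G, i in zip(cores, idx):
        v = v @ G[:, i, :]          # contract over the rank index alpha_k
    return v[0, 0]

# rank-1 toy example: A[i, j] = u[i] * w[j]
u, w = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0])
cores = [u.reshape(1, 3, 1), w.reshape(1, 2, 1)]
```

Storing the cores costs $O(d \, n \, r^2)$ numbers, which is the linear-in-$d$ memory complexity claimed below.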

The approximation is built by evaluating $f$ on the most important fibers (Fig. 1), detected using the TT-cross algorithm [6]. For example, let $f(x, y) = \frac{1}{x+y+1} \sin(4\pi(x+y))$.

Figure 2: TT-cross: selection of fibers.

• Existence of a low-rank best approximation
• Memory complexity: linear in $d$
• Computational complexity: linear in $d$

It tackles the curse of dimensionality.
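TT-cross itself is beyond a short sketch, but the low-rank structure it exploits is easy to see on the trigonometric factor of the 2-D example above: $\sin(4\pi(x+y)) = \sin(4\pi x)\cos(4\pi y) + \cos(4\pi x)\sin(4\pi y)$, so its sample matrix has exact rank 2 (this check is our illustration, not the poster's algorithm):

```python
import numpy as np

# The trigonometric factor of the 2-D example is exactly rank 2, since
# sin(4*pi*(x+y)) = sin(4*pi*x)*cos(4*pi*y) + cos(4*pi*x)*sin(4*pi*y).
x = np.linspace(0.0, 1.0, 64)
F = np.sin(4 * np.pi * (x[:, None] + x[None, :]))   # 64 x 64 sample matrix
s = np.linalg.svd(F, compute_uv=False)              # singular values
numerical_rank = int(np.sum(s > 1e-10 * s[0]))      # -> 2
```

A low-rank method therefore needs only a few rows and columns (fibers) of `F` rather than the full grid of samples.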

References
[1] Bigoni, D., Marzouk, Y. M., and Engsig-Karup, A. P. Spectral tensor-train decomposition.
[2] Engsig-Karup, A., Madsen, M. G., and Glimberg, S. L. A massively parallel GPU-accelerated model for analysis of fully nonlinear free surface waves. International Journal for Numerical Methods in Fluids 70, 1 (2011), 20–36.
[3] Engsig-Karup, A. P. Analysis of efficient preconditioned defect correction methods for nonlinear water waves. International Journal for Numerical Methods in Fluids, January (2014), 749–773.
[4] Marzouk, Y. M., and Najm, H. N. Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems. Journal of Computational Physics 228, 6 (Apr. 2009), 1862–1902.
[5] Oseledets, I. Tensor-train decomposition. SIAM Journal on Scientific Computing 33, 5 (2011), 2295–2317.
[6] Oseledets, I., and Tyrtyshnikov, E. TT-cross approximation for multidimensional arrays. Linear Algebra and its Applications 432, 1 (Jan. 2010), 70–88.
[7] Xiu, D., and Karniadakis, G. E. The Wiener–Askey polynomial chaos for stochastic differential equations. Tech. rep., DTIC Document, 2003.

Functional TT-decomposition
Using the spectral theory of (non-symmetric) Hilbert–Schmidt kernels, we can construct a functional counterpart of the discrete TT-approximation.

Functional tensor-train approximation [1]
For $\mathbf{r} = (1, r_1, \ldots, r_{d-1}, 1)$, let $f_{TT}$ be s.t.

$$f(\mathbf{x}) = f_{TT}(\mathbf{x}) + \mathcal{R}_{TT}(\mathbf{x})$$

$$f_{TT}(\mathbf{x}) = \sum_{\alpha_0, \ldots, \alpha_d = 1}^{\mathbf{r}} \gamma_1(\alpha_0, x_1, \alpha_1) \cdots \gamma_d(\alpha_{d-1}, x_d, \alpha_d)$$

where the $\gamma_i(\alpha_{i-1}, \cdot, \alpha_i)$ are orthogonal (see [1]).

$f_{TT}$ is constructed through the eigenvalue decomposition of Hermitian integral operators defined in terms of $f$. It can be proved that [1]:

• for fixed $\mathbf{r}$, $f_{TT}$ is optimal
• if $\frac{\partial^{|\boldsymbol{\sigma}|} f}{\partial x_1^{\sigma_1} \cdots \partial x_d^{\sigma_d}}$ exists and is continuous, then $\gamma_k(\alpha_{k-1}, \cdot, \alpha_k) \in C^{\sigma_k}(I_k)$ for all $k$, $\alpha_{k-1}$ and $\alpha_k$.

The latter statement can be relaxed:

FTT-decomposition and Sobolev spaces [1]
Let $I \subset \mathbb{R}^d$ be closed and bounded, and let $f \in L^2_\omega(I)$ be a Hölder continuous function with exponent $> 1/2$ such that $f \in H^k_\omega(I)$. Then $f_{TT}$ is such that $\gamma_j(\alpha_{j-1}, \cdot, \alpha_j) \in H^k_{\omega_j}(I_j)$ for all $j$, $\alpha_{j-1}$ and $\alpha_j$.

Spectral TT-decomposition
Let $P_N : L^2_\omega(I) \to \mathrm{span}\left(\{\phi_i\}_{i=0}^N\right)$, where $\{\phi_i\}_{i=0}^N$ are orthogonal polynomials:

STT-Projection

$$P_N f_{TT} = \sum_{\mathbf{i}=0}^{N} c_{\mathbf{i}} \, \phi_{\mathbf{i}}$$

$$c_{\mathbf{i}} = \sum_{\alpha_0, \ldots, \alpha_d = 1}^{\mathbf{r}} \beta_1(\alpha_0, i_1, \alpha_1) \cdots \beta_d(\alpha_{d-1}, i_d, \alpha_d)$$

$$\beta_n(\alpha_{n-1}, i_n, \alpha_n) = \int_{I_n} \gamma_n(\alpha_{n-1}, x_n, \alpha_n) \, \phi_{i_n}(x_n) \, dx_n$$
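The one-dimensional integrals defining the projected coefficients can be computed by Gaussian quadrature. A sketch with Legendre polynomials on $[-1, 1]$ (the core function `gamma` is a stand-in; names are ours):

```python
import numpy as np
from numpy.polynomial import legendre

# Sketch: project a univariate core function gamma_n(alpha, ., alpha') onto
# orthogonal polynomials phi_i (here Legendre, P_i) by Gauss-Legendre
# quadrature, giving one coefficient beta per polynomial degree.
def project_on_legendre(gamma, N, quad_order=64):
    xq, wq = legendre.leggauss(quad_order)          # Gauss-Legendre nodes/weights
    coeffs = []
    for i in range(N + 1):
        phi = legendre.Legendre.basis(i)(xq)        # P_i at the quadrature nodes
        norm = 2.0 / (2 * i + 1)                    # ||P_i||^2 on [-1, 1]
        coeffs.append(np.sum(wq * gamma(xq) * phi) / norm)
    return np.array(coeffs)

# stand-in core: gamma(x) = P_2(x), so only the degree-2 coefficient is 1
c = project_on_legendre(lambda x: 1.5 * x**2 - 0.5, N=4)
```

With a weighted measure $\omega$, the quadrature rule and the normalization would change accordingly.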

Let $\Pi_N : L^2_\omega(I) \to \mathrm{span}\left(\{l_i\}_{i=0}^N\right)$, $\{l_i\}_{i=0}^N$ being the Lagrange polynomials:

STT-Interpolation

$$\Pi_N f_{TT} = \sum_{\alpha_0, \ldots, \alpha_d = 1}^{\mathbf{r}} \hat{\gamma}_1(\alpha_0, x_1, \alpha_1) \cdots \hat{\gamma}_d(\alpha_{d-1}, x_d, \alpha_d)$$

$$\hat{\gamma}_n(\alpha_{n-1}, x_n, \alpha_n) = L^{(n)} \gamma_n(\alpha_{n-1}, x_n, \alpha_n)$$

where $L^{(n)}$ is the Lagrange interpolation matrix.
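A Lagrange interpolation matrix maps values at the nodes to interpolated values at query points. A minimal sketch on Chebyshev–Lobatto nodes (node choice and names are ours, for illustration):

```python
import numpy as np

# Sketch: build a Lagrange interpolation matrix L so that (L @ f_nodes)[m]
# is the value at t[m] of the polynomial interpolating f at the nodes.
def lagrange_matrix(nodes, t):
    n = len(nodes)
    L = np.ones((len(t), n))
    for j in range(n):
        for k in range(n):
            if k != j:
                # build the j-th Lagrange basis polynomial l_j at points t
                L[:, j] *= (t - nodes[k]) / (nodes[j] - nodes[k])
    return L

nodes = np.cos(np.pi * np.arange(5) / 4)    # Chebyshev-Lobatto points on [-1, 1]
t = np.linspace(-1.0, 1.0, 7)
L = lagrange_matrix(nodes, t)
vals = L @ nodes**3                          # 5 nodes interpolate x^3 exactly
```

Applying such a matrix along one coordinate of each core is what the interpolated cores $\hat{\gamma}_n$ above express.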

Conclusions
• Tackles the curse of dimensionality.
• Spectral convergence on smooth functions.

Ongoing works
• Anisotropic heterogeneous adaptivity.
• Ordering problem.
• Applications in the fields of coastal engineering [2, 3] and geoscience.

Numerical Examples

[Figure: $L^2$ error vs. number of function evaluations for the Genz "Oscillatory" function ($d = 10, 50, 100, 200$) and the Genz "Corner Peak" function ($d = 10, 15, 20$).]

Genz functions:

$$f_1(\mathbf{x}) = \cos\left( 2\pi w_1 + \sum_{i=1}^{d} c_i x_i \right)$$

$$f_2(\mathbf{x}) = \left( 1 + \sum_{i=1}^{d} c_i x_i \right)^{-(d+1)}$$

The method shows spectral convergence on both tests, even on $f_2$, for which there is no analytical low-rank representation. For $d = 5$, we compare the non-adaptive STT-Projection with the anisotropically adaptive Smolyak sparse grid.
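The two Genz test functions are straightforward to write down; a sketch with user-chosen weight $w_1$ and coefficient vector $\mathbf{c}$ (these parameters are free in the Genz family):

```python
import numpy as np

# The two Genz test functions used above ("Oscillatory" and "Corner Peak").
# w1 and c are free parameters controlling frequency and anisotropy.
def genz_oscillatory(x, c, w1):
    return np.cos(2 * np.pi * w1 + x @ c)

def genz_corner_peak(x, c):
    d = x.shape[-1]
    return (1.0 + x @ c) ** (-(d + 1))

# toy evaluation in d = 2
x = np.array([0.5, 0.5])
c = np.array([1.0, 1.0])
```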

Ordering problem
TT and STT are negatively affected by a wrong ordering of the dimensions, leading to increased computational cost and severe loss of accuracy. We propose a strategy to find a good ordering.

(a) Vicinity matrix (b) Undirected graph (c) Hierarchical clustering

We construct a vicinity matrix based on the 2nd-order ranks of the tensor. We then need to solve the Traveling Salesman Problem.
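The poster's actual vicinity matrix comes from 2nd-order TT ranks; as a stand-in sketch of the TSP step only, a greedy nearest-neighbour heuristic over a symmetric distance-like matrix (all names, and the random matrix, are our assumptions):

```python
import numpy as np

# Sketch of the ordering step: treat a symmetric "vicinity" matrix V as a
# distance matrix between dimensions and order them with a greedy
# nearest-neighbour TSP heuristic. V here is a random stand-in; the poster
# builds it from the 2nd-order ranks of the tensor.
def greedy_order(V, start=0):
    d = V.shape[0]
    order, visited = [start], {start}
    while len(order) < d:
        last = order[-1]
        # among unvisited dimensions, pick the one "closest" to the last
        nxt = min((j for j in range(d) if j not in visited),
                  key=lambda j: V[last, j])
        order.append(nxt)
        visited.add(nxt)
    return order

rng = np.random.default_rng(1)
V = rng.random((5, 5)); V = (V + V.T) / 2    # symmetric stand-in matrix
order = greedy_order(V)                      # a permutation of the 5 dimensions
```

An exact TSP solver (or the hierarchical clustering shown in panel (c)) would replace the greedy loop in practice.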
