Page 1: Bayesian Modeling and Inference for High-Dimensional ...

Bayesian Modeling and Inference for High-Dimensional Spatiotemporal Datasets

Sudipto Banerjee

University of California, Los Angeles, USA


Page 2: Bayesian Modeling and Inference for High-Dimensional ...

Based upon projects involving:
- Abhirup Datta (Johns Hopkins University)
- Andrew O. Finley (Michigan State University)
- Nicholas A.S. Hamm (University of Twente)
- Martijn Schaap (TNO Built Environment and Geosciences)

Page 3: Bayesian Modeling and Inference for High-Dimensional ...

Example 1: U.S. forest biomass data

Figure: Observed biomass (left) and NDVI (right)

- Forest biomass data collected over 114,371 plots
- Normalized Difference Vegetation Index (NDVI) is a measure of greenness
- Forest Biomass Regression Model:

Biomass(ℓ) = β0(ℓ) + β1(ℓ) NDVI(ℓ) + error


Page 4: Bayesian Modeling and Inference for High-Dimensional ...

Example 2: European Particulate Matter (PM10) data

Figure: PM10 levels at monitoring stations across Europe (Easting vs. Northing, km). (a) PM10 levels in March, 2009. (b) PM10 levels in June, 2009.

- Significant variation across space and time
- Daily observations at 308 stations for 2 years, i.e., n = 308 × 730 = 224,840


Page 5: Bayesian Modeling and Inference for High-Dimensional ...

Example 2: European PM10 data

- Computer models like the Chemistry Transport Model (CTM) consistently underestimate PM10 levels
- CTM outputs used as covariates to improve fits:

log(PM10)(ℓ) = β0(ℓ) + β1(ℓ) CTM(ℓ) + ε(ℓ)


Page 6: Bayesian Modeling and Inference for High-Dimensional ...

Example 3: Tanana Valley (Alaska) forest canopy height analysis

Figure: Tanana Valley, Alaska, study region. (a) G-LiHT flight lines where canopy height was measured at ∼ 6 × 10^6 locations, over the percent forest canopy covariate. (b) Occurrence of forest fire in the past 20 years and areas of interest for prediction illustration.


Page 7: Bayesian Modeling and Inference for High-Dimensional ...

Spatiotemporal regression models

- Y(ℓ) = β0(ℓ) + X(ℓ) β1(ℓ) + e(ℓ)
- Produce maps for intercept and slope: {β0(ℓ) : ℓ ∈ L} and {β1(ℓ) : ℓ ∈ L}
- L is a spatial domain (e.g., D ⊂ ℜ^d) or a spatiotemporal domain (e.g., D ⊂ ℜ^d × ℜ^+)
- Potentially very rich: understand spatially- and/or temporally-varying impact of predictors on outcome.
- Model-based predictions: Y(ℓ0) | {y(ℓ1), y(ℓ2), . . . , y(ℓn)}.


Page 8: Bayesian Modeling and Inference for High-Dimensional ...

Gaussian spatiotemporal process

- {w(ℓ) : ℓ ∈ L} ∼ GP(0, Kθ(·, ·)) implies

w = (w(ℓ1), w(ℓ2), . . . , w(ℓn))⊤ ∼ N(0, Kθ)

for every finite set of points ℓ1, ℓ2, . . . , ℓn.
- Kθ = {Kθ(ℓi, ℓj)} is a spatial variance-covariance matrix
- Stationary: Kθ(ℓ, ℓ′) = Kθ(ℓ − ℓ′). Isotropy: Kθ(ℓ, ℓ′) = Kθ(‖ℓ − ℓ′‖).
- With "nugget" (esp. when modeling data): Kθ = C(σ, φ) + Dτ, where θ = {σ, φ, τ} (see the sketch below)
- No nugget (esp. when modeling random effects): Kθ = C(σ, φ), where θ = {σ, φ}
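To make these ingredients concrete, here is a minimal R sketch, assuming (for illustration only; the talk does not fix a covariance here) an exponential covariance C(σ, φ) with entries σ^2 exp(−φ ‖ℓi − ℓj‖) and a nugget Dτ = τ^2 I:

set.seed(1)
n      <- 200
coords <- cbind(runif(n), runif(n))        # locations in the unit square
sigma2 <- 1.0; phi <- 3.0; tau2 <- 0.1     # theta = {sigma, phi, tau}

dists <- as.matrix(dist(coords))           # n x n inter-location distances
C     <- sigma2 * exp(-phi * dists)        # stationary, isotropic C(sigma, phi)
K     <- C + tau2 * diag(n)                # K_theta = C(sigma, phi) + D_tau ("nugget")

w <- drop(crossprod(chol(K), rnorm(n)))    # one finite-dimensional realization, w ~ N(0, K_theta)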


Page 9: Bayesian Modeling and Inference for High-Dimensional ...

Likelihood from (full rank) GP models

- L = {ℓ1, ℓ2, . . . , ℓn} are the locations where data are observed
- y(ℓi) is the outcome at the ith location, y = (y(ℓ1), y(ℓ2), . . . , y(ℓn))⊤
- Model: y ∼ N(Xβ, Kθ)
- Estimating process parameters from the likelihood:

−(1/2) log det(Kθ) − (1/2) (y − Xβ)⊤ Kθ^{-1} (y − Xβ)

- Bayesian inference: priors on {β, θ}
- Challenges: storage and computing chol(Kθ) = LDL⊤ (see the sketch below).
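A hedged R sketch of this likelihood evaluation, continuing the simulation above (X, beta, and y below are illustrative names, not from the talk); the Cholesky factorization is the O(n^3) step that becomes the bottleneck:

X    <- cbind(1, rnorm(n))                 # illustrative n x 2 design matrix
beta <- c(1, 0.5)
y    <- drop(X %*% beta) + w               # y ~ N(X beta, K_theta)

gp_loglik <- function(y, X, beta, K) {
  L <- t(chol(K))                          # O(n^3): K = L L^T, L lower triangular
  r <- y - drop(X %*% beta)
  z <- forwardsolve(L, r)                  # solves L z = r
  # -(1/2) log det(K) - (1/2) (y - X beta)^T K^{-1} (y - X beta), up to a constant
  -sum(log(diag(L))) - 0.5 * sum(z^2)
}
gp_loglik(y, X, beta, K)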


Page 10: Bayesian Modeling and Inference for High-Dimensional ...

Burgeoning literature on spatial big data

- Low-rank models (Wahba, 1990; Higdon, 2002; Kamman & Wand, 2003; Paciorek, 2007; Rasmussen & Williams, 2006; Stein, 2007, 2008; Cressie & Johannesson, 2008; Banerjee et al., 2008, 2010; Gramacy & Lee, 2008; Sang et al., 2011, 2012; Lemos et al., 2011; Guhaniyogi et al., 2011, 2013; Salazar et al., 2013; Katzfuss, 2016)
- Spectral approximations and composite likelihoods (Fuentes, 2007; Paciorek, 2007; Eidsvik et al., 2016)
- Multi-resolution approaches (Nychka, 2002; Johannesson et al., 2007; Matsuo et al., 2010; Tzeng & Huang, 2015; Katzfuss, 2016)
- Sparsity (solve Ax = b by (i) sparse A, or (ii) sparse A^{-1}):
  1. Covariance tapering (Furrer et al., 2006; Du et al., 2009; Kaufman et al., 2009; Shaby and Ruppert, 2013)
  2. GMRFs to GPs: INLA (Rue et al., 2009; Lindgren et al., 2011)
  3. LAGP (Gramacy et al., 2014; Gramacy and Apley, 2015)
  4. Nearest-neighbor models (Vecchia, 1988; Stein et al., 2004; Stroud et al., 2014; Datta et al., 2016)

Page 11: Bayesian Modeling and Inference for High-Dimensional ...

Reduced (Low) rank models

- Kθ ≈ Bθ K*θ Bθ⊤ + Dθ
- Bθ is an n × r matrix of spatial basis functions, r ≪ n
- K*θ is an r × r spatial covariance matrix
- Dθ is either diagonal or sparse
- Examples: kernel projections, splines, predictive process, FRK, spectral basis ...
- Computations exploit the above structure: roughly O(nr^2) ≪ O(n^3) flops (see the sketch below)
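As an illustration of how this structure is exploited, a sketch (assuming Dθ is diagonal) that solves (B K* B⊤ + D) x = b via the Sherman-Morrison-Woodbury identity in roughly O(nr^2) flops, without ever forming the n × n matrix:

low_rank_solve <- function(B, Kstar, d, b) {
  # Solve (B Kstar t(B) + diag(d)) x = b via Sherman-Morrison-Woodbury.
  Dinv_b <- b / d                               # D^{-1} b
  Dinv_B <- B / d                               # D^{-1} B, still n x r
  M      <- solve(Kstar) + crossprod(B, Dinv_B) # r x r "capacitance" matrix
  drop(Dinv_b - Dinv_B %*% solve(M, crossprod(B, Dinv_b)))
}

# check against the brute-force O(n^3) solve on a small example
nn <- 500; r <- 20
B <- matrix(rnorm(nn * r), nn, r)
Kstar <- crossprod(matrix(rnorm(r * r), r, r)) + diag(r)
d <- runif(nn, 0.5, 1.5); b <- rnorm(nn)
max(abs(low_rank_solve(B, Kstar, d, b) -
        solve(B %*% Kstar %*% t(B) + diag(d), b)))   # should be numerically negligible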


Page 12: Bayesian Modeling and Inference for High-Dimensional ...

Oversmoothing due to reduced-rank models

(a) True w   (b) Full GP   (c) PPGP 64 knots

Figure: Comparing the full GP vs. a low-rank GP with 2500 locations. Panel (c) exhibits oversmoothing by a low-rank process (predictive process with 64 knots).


Page 13: Bayesian Modeling and Inference for High-Dimensional ...

Simple method of introducing sparsity (e.g. graphical models)

Full dependency graph on seven nodes: each node 1, . . . , 7 receives edges from every lower-numbered node.

p(y1) p(y2 | y1) p(y3 | y1, y2) p(y4 | y1, y2, y3) × p(y5 | y1, y2, y3, y4) p(y6 | y1, y2, . . . , y5) p(y7 | y1, y2, . . . , y6).


Page 14: Bayesian Modeling and Inference for High-Dimensional ...

Simple method of introducing sparsity (e.g. graphical models)

3-nearest-neighbor dependency graph on the same seven nodes: each node conditions on at most three of its preceding nodes.

p(y1) p(y2 | y1) p(y3 | y1, y2) p(y4 | y1, y2, y3) p(y5 | y2, y3, y4) p(y6 | y1, y4, y5) p(y7 | y1, y2, y6)


Page 15: Bayesian Modeling and Inference for High-Dimensional ...

Gaussian graphical models: linearity

- Write a joint density p(w) = p(w1, w2, . . . , wn) as:

p(w1) p(w2 | w1) p(w3 | w1, w2) · · · p(wn | w1, w2, . . . , wn−1)

- Example: for the Gaussian distribution N(w | 0, Kθ), we have a linear model

w1 = 0 + η1;
w2 = a21 w1 + η2;
w3 = a31 w1 + a32 w2 + η3;
wi = ai1 w1 + ai2 w2 + · · · + a_{i,i−1} w_{i−1} + ηi,  i = 4, . . . , n.

- More compactly: w = Aw + η, where η ∼ N(0, D).


Page 16: Bayesian Modeling and Inference for High-Dimensional ...

Simple method of introducing sparsity (e.g. graphical models)

- For the Gaussian distribution N(w | 0, Kθ),

Kθ = (I − A)^{-1} D (I − A)^{-⊤},  D = diag(var{wi | w_{j<i}})

- If L is from chol(Kθ) = LDL⊤, then L^{-1} = I − A.
- The aij's are obtained from the n − 1 linear systems implied by

∑_{j<i: j∼i} aij wj = E[wi | w_{j<i}],  i = 2, . . . , n

- Example (in R; the loop fills row i + 1 of A with the coefficients of E[w_{i+1} | w_1, . . . , w_i]):

for (i in 1:(n - 1)) {
  # regression coefficients of w[i+1] on w[1:i]
  a[i + 1, 1:i] <- solve(K[1:i, 1:i], K[1:i, i + 1])
}
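A self-contained sketch of the same construction as a function, together with a numerical check of the identity Kθ = (I − A)^{-1} D (I − A)^{-⊤} stated above (illustrative code, reusing the K simulated after the Gaussian-process slide; any positive definite K works):

make_AD <- function(K) {
  n <- nrow(K)
  A <- matrix(0, n, n)
  D <- numeric(n)
  D[1] <- K[1, 1]
  for (i in 1:(n - 1)) {
    a <- solve(K[1:i, 1:i], K[1:i, i + 1])               # coefficients of E[w_{i+1} | w_{1:i}]
    A[i + 1, 1:i] <- a
    D[i + 1] <- K[i + 1, i + 1] - sum(K[i + 1, 1:i] * a) # var(w_{i+1} | w_{1:i})
  }
  list(A = A, D = D)
}

AD <- make_AD(K)
IA <- diag(nrow(K)) - AD$A
max(abs(solve(IA) %*% diag(AD$D) %*% t(solve(IA)) - K))  # should be ~ 0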


Page 17: Bayesian Modeling and Inference for High-Dimensional ...

- Letting aij = 0 for all but the m nearest neighbors of node i implies solving

∑_{j ∈ N[i]} aij wj = E[wi | w_{j ∈ N[i]}],  i = 2, . . . , n,

where N[i] = {j < i : j ∼ i} are the indices of the neighbors of i from its "past."

- Example (in R, with N a list whose element N[[i]] holds the neighbor indices of location i):

for (i in 1:(n - 1)) {
  # only the (at most m) neighbor coefficients in row i + 1 are nonzero
  a[i + 1, N[[i + 1]]] <- solve(K[N[[i + 1]], N[[i + 1]]], K[N[[i + 1]], i + 1])
}

- We need to solve n − 1 linear systems of size at most m × m (see the sketch below)
- We effectively model a (sparse) Cholesky factor instead of computing it
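A sketch of the nearest-neighbor version (illustrative only; it builds N[i] as the m nearest points among ℓ1, . . . , ℓ_{i−1} by Euclidean distance and reuses K and coords from the earlier sketches); only systems of size at most m × m are solved:

make_AD_nn <- function(K, coords, m) {
  n <- nrow(K)
  A <- matrix(0, n, n)           # at most m nonzeros per row; stored sparsely in practice
  D <- numeric(n)
  D[1] <- K[1, 1]
  dmat <- as.matrix(dist(coords))
  for (i in 2:n) {
    past <- 1:(i - 1)
    N_i  <- past[order(dmat[i, past])][1:min(m, i - 1)]  # m nearest "past" neighbors
    a    <- solve(K[N_i, N_i], K[N_i, i])                # at most m x m system
    A[i, N_i] <- a
    D[i] <- K[i, i] - sum(K[i, N_i] * a)
  }
  list(A = A, D = D)
}

AD_nn <- make_AD_nn(K, coords, m = 10)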


Page 18: Bayesian Modeling and Inference for High-Dimensional ...

Sparse precision matrices

N(wR | 0, Kθ) ≈ N(wR | 0, K̃θ),  where K̃θ^{-1} = (I − A)⊤ D^{-1} (I − A)

(a) I − A   (b) D^{-1}   (c) K̃θ^{-1}   [sparsity patterns]

- det(K̃θ^{-1}) = ∏_{i=1}^n D_ii^{-1}; K̃θ^{-1} is sparse, with O(nm^2) entries (see the sketch below)
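A short sketch (using the Matrix package for sparse storage, purely as an illustration) that assembles K̃θ^{-1} = (I − A)⊤ D^{-1} (I − A) from the nearest-neighbor A and D computed above and checks the determinant identity:

library(Matrix)

A_sp <- Matrix(AD_nn$A, sparse = TRUE)
IA   <- Diagonal(nrow(A_sp)) - A_sp                 # I - A, sparse, unit lower triangular
Kinv <- t(IA) %*% Diagonal(x = 1 / AD_nn$D) %*% IA  # sparse K_theta^{-1}

nnzero(Kinv)                                        # O(n m^2) non-zero entries
determinant(as.matrix(Kinv))$modulus                # log det(K_tilde^{-1}) ...
sum(log(1 / AD_nn$D))                               # ... equals sum_i log(1 / D_ii)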



Page 22: Bayesian Modeling and Inference for High-Dimensional ...

Sparse likelihood approximations (Vecchia, 1988; Stein et al., 2004)

- Let R = {ℓ1, ℓ2, . . . , ℓr}
- With w(ℓ) ∼ GP(0, Kθ(·)), write the joint density p(wR) as:

N(wR | 0, Kθ) = ∏_{i=1}^r p(w(ℓi) | wH(ℓi)) ≈ ∏_{i=1}^r p(w(ℓi) | wN(ℓi)) = N(wR | 0, K̃θ),

where N(ℓi) ⊆ H(ℓi).

- Shrinkage: choose N(ℓi) as the set of "m nearest neighbors" among H(ℓi). Theory: the "screening" effect of kriging.
- K̃θ^{-1} depends on Kθ, but is sparser, with at most nm^2 non-zero entries (evaluation sketched below)
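In practice the right-hand side is evaluated as a sum of univariate conditional normal densities, each requiring only an (at most) m × m solve. A hedged R sketch, with neighbor sets again built by Euclidean distance and reusing w, K, coords from the earlier sketches:

vecchia_loglik <- function(w, K, coords, m) {
  n    <- length(w)
  dmat <- as.matrix(dist(coords))
  ll   <- dnorm(w[1], 0, sqrt(K[1, 1]), log = TRUE)
  for (i in 2:n) {
    past <- 1:(i - 1)
    N_i  <- past[order(dmat[i, past])][1:min(m, i - 1)]
    a    <- solve(K[N_i, N_i], K[N_i, i])        # at most m x m solve
    mu_i <- sum(a * w[N_i])                      # E[w(l_i) | w_N(l_i)]
    v_i  <- K[i, i] - sum(K[i, N_i] * a)         # var[w(l_i) | w_N(l_i)]
    ll   <- ll + dnorm(w[i], mu_i, sqrt(v_i), log = TRUE)
  }
  ll
}

vecchia_loglik(w, K, coords, m = 10)   # compare with the full GP log-density of w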


Page 23: Bayesian Modeling and Inference for High-Dimensional ...

Extension to a GP (Datta et al., JASA, 2016)

- Fix a "reference" set R = {ℓ1, ℓ2, . . . , ℓr} (e.g., the observed points)
- N(ℓi) is the set of at most m nearest neighbors of ℓi among {ℓ1, ℓ2, . . . , ℓ_{i−1}}.
- First piece: model wR ∼ N(0, K̃θ) ("Vecchia prior")
- Second piece: if ℓ ∉ R, then N(ℓ) is the set of m nearest neighbors of ℓ in R
- Third piece: w(ℓ) = ∑_{i=1}^r ai(ℓ) w(ℓi) + η(ℓ), with ai(ℓ) = 0 if ℓi ∉ N(ℓ).
- Nonzero ai(ℓ)'s obtained by solving an m × m system (sketched below):

E[w(ℓ) | wN(ℓ)] = ∑_{i: ℓi ∈ N(ℓ)} ai(ℓ) w(ℓi)
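A sketch of the third piece at a single new location ℓ0 ∉ R, continuing the simulated example (the exponential covariance without nugget stands in for the latent-process Kθ; all names are illustrative):

l0 <- c(0.5, 0.5)                                   # new location
d0 <- sqrt(colSums((t(coords) - l0)^2))
N0 <- order(d0)[1:10]                               # m = 10 nearest neighbors of l0 in R

C_NN <- sigma2 * exp(-phi * as.matrix(dist(coords[N0, ])))            # m x m
c_0N <- sigma2 * exp(-phi * sqrt(colSums((t(coords[N0, ]) - l0)^2)))  # length m

a0 <- solve(C_NN, c_0N)                    # the nonzero a_i(l0), from an m x m system
w0_mean <- sum(a0 * w[N0])                 # E[w(l0) | w_N(l0)]
w0_var  <- sigma2 - sum(c_0N * a0)         # var of eta(l0)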


Page 24: Bayesian Modeling and Inference for High-Dimensional ...

Neighbors in Space and Time

- No universal definition of distance in a space-time domain
- Use Kθ(·, ·) as a proxy for distance (a sketch follows below)
- Datta et al. (2016, AoAS): an efficient algorithm, ∼ O(4nm) flops, to do this
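One simple way to read the second bullet, as a hedged sketch: rank a point's "past" by covariance rather than by any single distance and keep the m points with the largest covariance. The separable space-time covariance below is purely illustrative, not the construction in Datta et al. (2016):

# illustrative separable space-time covariance
st_cov <- function(ds, dt, sigma2 = 1, phi_s = 3, phi_t = 0.5)
  sigma2 * exp(-phi_s * ds - phi_t * dt)

# for i >= 2: the m "neighbors" of point i among its past, ranked by covariance
st_neighbors <- function(i, coords, times, m) {
  past <- seq_len(i - 1)
  ds   <- sqrt(colSums((t(coords[past, , drop = FALSE]) - coords[i, ])^2))
  dt   <- abs(times[past] - times[i])
  past[order(st_cov(ds, dt), decreasing = TRUE)][1:min(m, i - 1)]
}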


Page 25: Bayesian Modeling and Inference for High-Dimensional ...

Example 1: Hierarchical NNGP model

- Start with a desired full GP specification: GP(0, Kθ(·))
- Derive the NNGP: NNGP(0, K̃θ(·))

Y(ℓ) ind∼ Pθ, an exponential family;  g(E[Y(ℓ)]) = β0(ℓ) + X(ℓ) β1(ℓ)
(β0(ℓ), β1(ℓ))⊤ ∼ NNGP(β̃0 + X(ℓ) β̃1, K̃θ(·))
(β̃0, β̃1)⊤ ∼ N(0, Vβ);  θ ∼ p(θ)

- Posterior predictive inference for β0(ℓ0), β1(ℓ0) and Y(ℓ0)


Page 26: Bayesian Modeling and Inference for High-Dimensional ...

Example 2: Hierarchical NNGP model

- Start with a desired full GP specification for Y(ℓ): Y(ℓ) ∼ GP(x⊤(ℓ) β, Kθ(·))
- Derive the NNGP: Y(ℓ) ∼ NNGP(x⊤(ℓ) β, K̃θ(·))

Y ∼ N(Xβ, K̃θ);  β ∼ N(0, Vβ);  θ ∼ p(θ)

- No need for a Cholesky: it is modeled.
- Easy posterior predictive inference for Y(ℓ0) at a new ℓ0.
- But no latent spatial-temporal process


Page 27: Bayesian Modeling and Inference for High-Dimensional ...

(a) True w (b) Full GP (c) PPGP 64 knots

(d) NNGP, m = 10 (e) NNGP, m = 20


Page 28: Bayesian Modeling and Inference for High-Dimensional ...

Figure: Choice of m in NNGP models: out-of-sample Root Mean Squared Prediction Error (RMSPE) and mean width between the upper and lower 95% posterior predictive credible intervals, plotted against m = 1, . . . , 25, for the univariate synthetic data analysis. Curves are shown for the NNGP and, for reference, the full GP.


Page 29: Bayesian Modeling and Inference for High-Dimensional ...

Back to European PM10 data

Figure: PM10 levels at monitoring stations across Europe (Easting vs. Northing, km). (a) PM10 levels in March, 2009. (b) PM10 levels in June, 2009.

- Interest in estimating short- and long-term temporal (and spatial) decay (to improve the CTMs)
- log(PM10)(s, t) = β0 + β1 CTM(s, t) + w(s, t) + ε(s, t)
- w(s, t) ∼ DNNGP(0, K̃θ(·))

Page 30: Bayesian Modeling and Inference for High-Dimensional ...

European PM10 Dataset

- Significantly improved fit:

         OLS    DNNGP
RMSPE    12.8     8.2

- Total time: 24 hrs


Page 31: Bayesian Modeling and Inference for High-Dimensional ...

European PM10 Dataset

Figure: Predicted surfaces over Europe (Easting vs. Northing, km). (a) P̂M10 for 04.03.2009, in µg m^{-3} (with a Missing category). (b) Pr(P̂M10 > 50 µg m^{-3}).


Page 32: Bayesian Modeling and Inference for High-Dimensional ...

European PM10 Dataset

Figure: Predicted surfaces over Europe (Easting vs. Northing, km). (a) P̂M10 for 04.05.2009, in µg m^{-3} (with a Missing category). (b) Pr(P̂M10 > 50 µg m^{-3}).


Page 33: Bayesian Modeling and Inference for High-Dimensional ...

Concluding remarks: Storage and computation

- Algorithms: Gibbs, RWM, HMC, VB, INLA; NNGP/HMC especially promising
- Model-based solution for spatial "BIG DATA"
- Never needs to store the n × n distance matrix; stores n small m × m matrices
- Total flop count per iteration is O(nm^3), i.e., linear in n
- Scalable to massive datasets because m is small; you never need more than a few neighbors.
- Compare with reduced-rank models: O(nm^3) ≪ O(nr^2).
- New R package spNNGP (on CRAN soon)


Page 34: Bayesian Modeling and Inference for High-Dimensional ...

Concluding remarks: Comparisons

- Are low-rank spatial models well and truly beaten?
  - Certainly do not seem to scale as nicely as NNGP
  - Have somewhat greater theoretical tractability (e.g., Bayesian asymptotics)
  - Can be used to flexibly model smoothness
  - Can be constructed for other processes, e.g., the Spatial Dirichlet Predictive Process
  - Compare with scalable multi-resolution frameworks (Katzfuss, 2016)
- Highly scalable meta-kriging frameworks (Guhaniyogi, 2016)
- Future work: high-dimensional multivariate spatial-temporal variable selection


Page 35: Bayesian Modeling and Inference for High-Dimensional ...

Thank You !


