Lecture 6 (Sensor Fusion, 2014)
Source: users.isy.liu.se/en/rt/fredrik/edu/sensorfusion/lecture6.pdf

Sensor Fusion, 2014 Lecture 6: 1

Lecture 6:

Whiteboard:
- Derivation and explanation of the PMF and PF.

Frames:
- Point-mass filter (PMF)
- Particle filter (PF)
- Examples and applications.


Sensor Fusion, 2014 Lecture 6: 2

Lecture 5: Summary

Key tool for a unified derivation of KF, EKF, UKF: the conditional Gaussian distribution

\[
\begin{pmatrix} X \\ Y \end{pmatrix}
\in \mathcal{N}\!\left(
\begin{pmatrix} \mu_x \\ \mu_y \end{pmatrix},
\begin{pmatrix} P_{xx} & P_{xy} \\ P_{yx} & P_{yy} \end{pmatrix}
\right)
\;\Rightarrow\;
(X \mid Y = y) \in \mathcal{N}\!\left( \mu_x + P_{xy} P_{yy}^{-1} (y - \mu_y),\; P_{xx} - P_{xy} P_{yy}^{-1} P_{yx} \right).
\]

The Kalman gain is in this notation given by K_k = P_{xy} P_{yy}^{-1}.

- In KF, P_{xy} and P_{yy} follow from a linear Gaussian model.
- In EKF, P_{xy} and P_{yy} can be computed from a linearized model (requires analytical gradients).
- In EKF and UKF, P_{xy} and P_{yy} are computed by the NLT (non-linear transformation) of (x^T, v^T)^T and (x^T, e^T)^T, respectively. No gradients required, just function evaluations.
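As a quick numerical check of the conditional Gaussian formula above (an illustration, not part of the slides), the conditional mean, the conditional covariance and the Kalman gain can be computed in a few lines of MATLAB; the numbers are assumed scalar blocks chosen only for illustration:

    % Assumed joint Gaussian over (X, Y); scalar blocks for simplicity.
    mu_x = 1;   mu_y = 2;
    Pxx  = 4;   Pxy  = 1.5;   Pyy = 2;
    y    = 2.8;                          % observed value of Y

    K    = Pxy / Pyy;                    % Kalman gain K = Pxy * inv(Pyy)
    mu_c = mu_x + K*(y - mu_y);          % E(X | Y = y)
    P_c  = Pxx - K*Pyy*K';               % Cov(X | Y = y) = Pxx - Pxy*inv(Pyy)*Pyx

The same three update lines work for matrix-valued blocks, with the division interpreted as a matrix right-division.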


Sensor Fusion, 2014 Lecture 6: 3

Chapter 9 overview

Particle filter:
- Algorithms and derivation
- Practical and theoretical issues
- Computational aspects
- Marginalization to beat the curse of dimensionality


Sensor Fusion, 2014 Lecture 6: 4

Outline – Particle Filter

- The Bayesian optimal filter recursions and numerical approximations.
- The particle filter, and how it relates to the point-mass filter.
- Code examples: how simple the PF is to implement.
- Terrain navigation and other applications: how the PF works in practice.
- Design issues: how to tweak the most out of the PF.
- The marginalized particle filter: how to beat the curse of dimensionality.


Sensor Fusion, 2014 Lecture 6: 5

Model and Bayesian recursion

General non-linear state-space model:

x_{k+1} = f(x_k, u_k, v_k),
y_k = h(x_k, u_k, e_k),

or the even more general Markov model:

x_{k+1} ∈ p(x_{k+1} | x_k),
y_k ∈ p(y_k | x_k).
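To make the abstract notation concrete (an illustration, not from the slides), the model can be represented by a pair of MATLAB function handles; the particular choices of f, h and the noise levels below are assumptions:

    % Assumed toy instance of x_{k+1} = f(x_k, u_k, v_k), y_k = h(x_k, u_k, e_k).
    f = @(x,u,v) x + u + v;           % scalar random walk driven by input u and noise v
    h = @(x,u,e) atan(x) + e;         % non-linear measurement with additive noise e

    x = 0;  u = 0.1;
    y      = h(x, u, 0.01*randn);     % measurement of the current state
    x_next = f(x, u, 0.1*randn);      % propagation to the next time step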


Sensor Fusion, 2014 Lecture 6: 6

Bayes optimal filter: measurement update

Bayes' rule

\[
p(A \mid B, C) = \frac{p(B \mid A, C)\, p(A \mid C)}{p(B \mid C)},
\]

using A = x_k, (B, C) = y_{1:k}, B = y_k and C = y_{1:k-1}, yields

\[
p(x_k \mid y_{1:k})
= \frac{p(y_k \mid x_k, y_{1:k-1})\, p(x_k \mid y_{1:k-1})}{p(y_k \mid y_{1:k-1})}
= \frac{p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})}{p(y_k \mid y_{1:k-1})}.
\]

The Markov property p(y_k | x_k, y_{1:k-1}) = p(y_k | x_k) was used in the last equality.
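For reference (not stated on this slide), in the linear Gaussian special case y_k = H x_k + e_k with e_k ∈ N(0, R) and prior p(x_k | y_{1:k-1}) = N(x̂_{k|k-1}, P_{k|k-1}), this update can be evaluated in closed form and reproduces the Kalman filter measurement update, in agreement with the conditional Gaussian formula from the Lecture 5 summary:

\[
p(x_k \mid y_{1:k}) = \mathcal{N}\!\big(\hat{x}_{k|k-1} + K_k (y_k - H \hat{x}_{k|k-1}),\; P_{k|k-1} - K_k H P_{k|k-1}\big),
\qquad
K_k = P_{k|k-1} H^T \big(H P_{k|k-1} H^T + R\big)^{-1}.
\]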


Sensor Fusion, 2014 Lecture 6: 7

Bayes optimal filter: time update

Bayes' rule p(A, B | C) = p(A | B, C) p(B | C) gives

\[
p(x_{k+1}, x_k \mid y_{1:k}) = p(x_{k+1} \mid x_k, y_{1:k})\, p(x_k \mid y_{1:k}) = p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k}).
\]

Again, the Markov property was used in the last equality. Marginalization of x_k, by integrating both sides with respect to x_k, yields

\[
p(x_{k+1} \mid y_{1:k}) = \int_{\mathbb{R}^{n}} p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})\, dx_k.
\]

This is known as the Chapman-Kolmogorov equation.
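Similarly (not stated on this slide), in the linear Gaussian special case x_{k+1} = F x_k + v_k with v_k ∈ N(0, Q) and p(x_k | y_{1:k}) = N(x̂_{k|k}, P_{k|k}), the Chapman-Kolmogorov integral can be evaluated in closed form and gives the Kalman filter time update:

\[
p(x_{k+1} \mid y_{1:k}) = \mathcal{N}\!\big(F \hat{x}_{k|k},\; F P_{k|k} F^T + Q\big).
\]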


Sensor Fusion, 2014 Lecture 6: 8

Bayes optimal filter: summary

General Bayesian recursion (time and measurement updates):

\[
p(x_{k+1} \mid y_{1:k}) = \int_{\mathbb{R}^{n}} p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})\, dx_k,
\]
\[
p(x_k \mid y_{1:k}) = \frac{p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})}{p(y_k \mid y_{1:k-1})}.
\]

- Analytic solution available in a few special cases: the linear Gaussian model (KF) and the finite state-space model (HMM).
- However, for a given trajectory x_{1:k}, the recursion can be computed (neglect the integral); see the sketch below.
- We can numerically evaluate different trajectories by comparing their likelihoods. But there are many possible trajectories, so Monte Carlo sampling of trajectories directly does not work.
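A minimal MATLAB sketch (not from the slides) of the second point, assuming a scalar random-walk model with Gaussian noise: for one fixed candidate trajectory, the recursion collapses to a product of transition and likelihood densities, most conveniently accumulated in log form.

    % Assumed scalar model: x_{k+1} = x_k + v_k, v_k ~ N(0,Q);  y_k = x_k + e_k, e_k ~ N(0,R).
    Q = 0.1;  R = 0.5;
    y = [0.2 0.5 0.9];                % measurements (illustrative values)
    x = [0.0 0.4 0.8];                % one fixed candidate trajectory x_{1:3}

    gauss_logpdf = @(z,m,P) -0.5*log(2*pi*P) - 0.5*(z-m).^2/P;

    logpost = gauss_logpdf(x(1), 0, 1);                        % assumed prior x_1 ~ N(0,1)
    for k = 1:numel(y)
        logpost = logpost + gauss_logpdf(y(k), x(k), R);       % likelihood factor p(y_k | x_k)
        if k < numel(x)
            logpost = logpost + gauss_logpdf(x(k+1), x(k), Q); % transition factor p(x_{k+1} | x_k)
        end
    end
    % Comparing logpost across candidate trajectories ranks them by (unnormalized) posterior probability.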


Sensor Fusion, 2014 Lecture 6: 9

Numerical approximation

Basic idea: postulate a discrete approximation of the posterior. For the predictive density, we have

\[
\hat{p}(x_k \mid y_{1:k-1}) = \sum_{i=1}^{N} w^i_{k|k-1}\, \delta(x_k - x^i_k).
\]

The first moments (mean and covariance) are simple to compute from this approximation:

\[
\hat{x}_{k|k-1} = \mathrm{E}(x_k) = \sum_{i=1}^{N} w^i_{k|k-1}\, x^i_k,
\]
\[
P_{k|k-1} = \mathrm{Cov}(x_k) = \sum_{i=1}^{N} w^i_{k|k-1}\, (x^i_k - \hat{x}_{k|k-1})(x^i_k - \hat{x}_{k|k-1})^T.
\]

Also, the MAP estimate can be useful:

\[
\hat{x}^{\mathrm{MAP}}_{k|k-1} = \arg\max_{x^i_k}\, \hat{p}(x_k \mid y_{1:k-1}).
\]
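A minimal MATLAB sketch (not from the slides) of these three estimates for an assumed scalar particle set with normalized weights:

    N  = 1000;
    xp = randn(N,1);                          % particles x^i (illustrative draw)
    w  = ones(N,1)/N;                         % weights w^i, assumed to sum to one

    xhat = sum(w .* xp);                      % weighted mean
    P    = sum(w .* (xp - xhat).^2);          % weighted covariance (scalar state)
    [~, imax] = max(w);                       % particle with the largest weight
    xmap = xp(imax);                          % MAP estimate over the particle set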


Sensor Fusion, 2014 Lecture 6: 10

Measurement update

The measurement update follows directly, without any extra approximation:

\[
\hat{p}(x_k \mid y_{1:k}) = \sum_{i=1}^{N} \underbrace{\frac{1}{c_k}\, p(y_k \mid x^i_k)\, w^i_{k|k-1}}_{w^i_{k|k}}\; \delta(x_k - x^i_k),
\qquad
c_k = \sum_{i=1}^{N} p(y_k \mid x^i_k)\, w^i_{k|k-1}.
\]

The normalization constant c_k ensures that the weights w^i_{k|k} sum to one.
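A minimal MATLAB sketch (not from the slides) of this weight update, assuming scalar particles and a Gaussian likelihood y_k = x_k + e_k with e_k ∈ N(0, R):

    N  = 1000;   R = 0.5;
    xp = randn(N,1);                                 % particles x^i_k
    w  = ones(N,1)/N;                                % predicted weights w^i_{k|k-1}
    yk = 0.3;                                        % current measurement (illustrative)

    lik = exp(-0.5*(yk - xp).^2/R) / sqrt(2*pi*R);   % p(y_k | x^i_k)
    w   = w .* lik;                                  % unnormalized posterior weights
    ck  = sum(w);                                    % normalization constant c_k
    w   = w / ck;                                    % filtered weights w^i_{k|k}, now summing to one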


Sensor Fusion, 2014 Lecture 6: 11

Time update

Bayesian time update gives a continuous distribution

\[
\hat{p}(x_{k+1} \mid y_{1:k}) = \sum_{i=1}^{N} w^i_{k|k}\, p(x_{k+1} \mid x^i_k).
\]

To keep the approximation form, the distribution is sampled at points x^i_{k+1}, and the weights are updated as

\[
w^i_{k+1|k} = \hat{p}(x^i_{k+1} \mid y_{1:k}) = \sum_{j=1}^{N} w^j_{k|k}\, p(x^i_{k+1} \mid x^j_k), \qquad i = 1, 2, \dots, N.
\]

There are two principles:
- Keep the same grid, so x^i_{k+1} = x^i_k, which yields the point-mass filter (see the sketch below).
- Generate new samples from the posterior distribution, x^i_{k+1} ∼ p̂(x_{k+1} | y_{1:k}), which yields the particle filter.

Both alternatives have quadratic complexity (N weights w^i_{k+1|k}, each one involving a sum with N terms).
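A minimal MATLAB sketch (not from the slides) of the grid-based (PMF) alternative, assuming a scalar random walk x_{k+1} = x_k + v_k with v_k ∈ N(0, Q); the N-by-N transition matrix is what causes the quadratic cost:

    N  = 200;   Q = 0.1;
    xg = linspace(-5, 5, N)';           % fixed grid x^i, kept the same over time
    w  = ones(N,1)/N;                   % filtered weights w^i_{k|k}

    trans = exp(-0.5*(xg - xg.').^2/Q) / sqrt(2*pi*Q);  % trans(i,j) = p(x^i_{k+1} | x^j_k) (implicit expansion)
    wpred = trans * w;                                  % N predicted weights, each a sum of N terms
    wpred = wpred / sum(wpred);                         % renormalize on the grid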


Sensor Fusion, 2014 Lecture 6: 12

PMF

Advantages:
- Simple to implement.
- Works excellently when n_x ≤ 2.
- Gives the complete posterior, not only x̂ and P.
- Global search, no local minima.

Problems:
- The grid is inefficient in higher dimensions, since the probability to be at one grid point depends on the transition probability from all other grid points.
- The grid should be adaptive (rough initially, then finer).
- Sparse matrices can be used for multi-modal distributions.


Sensor Fusion, 2014 Lecture 6: 13

PMF example: 2 DOA sensors, 2 targets

Data from the FOCUS project (map overlay). Two microphone arrays (black x) compute two DOAs. Two road-bound targets (green *). One grid point x^i_k every meter on the road. The stem plot shows p(x^i_k | y_{1:k}) for all i.

http://youtu.be/VcdTebC9uTs


Sensor Fusion, 2014 Lecture 6: 14

The Particle Filter

Trick to avoid the quadratic complexity: sample trajectories, not states. Time update for a trajectory:

\[
p(x^i_{1:k+1} \mid y_{1:k})
= \underbrace{p(x^i_{k+1} \mid x^i_{1:k}, y_{1:k})}_{p(x^i_{k+1} \mid x^i_k)}\;
\underbrace{p(x^i_{1:k} \mid y_{1:k})}_{w^i_{k|k}}
= \underbrace{w^i_{k|k}\, p(x^i_{k+1} \mid x^i_k)}_{w^i_{k+1|k}}.
\]

No sum involved here! The new sample is drawn from the prior in the original PF (SIR, or bootstrap, PF):

\[
x^i_{k+1} \sim p(x_{k+1} \mid x^i_k).
\]


Sensor Fusion, 2014 Lecture 6: 15

Basic SIR PF algorithm

Choose the number of particles N.
Initialization: generate x^i_0 ∼ p_{x_0}, i = 1, ..., N.
Iterate for k = 1, 2, ..., t:
1. Measurement update: for i = 1, 2, ..., N,
   w^i_k = w^i_{k-1} p(y_k | x^i_k).
2. Normalize: w^i_k := w^i_k / Σ_i w^i_k.
3. Estimation: MMSE x̂_k ≈ Σ_{i=1}^N w^i_k x^i_k, or MAP.
4. Resampling (Bayesian bootstrap): take N samples with replacement from the set {x^i_k}_{i=1}^N, where the probability to take sample i is w^i_k. Let w^i_k = 1/N.
5. Prediction: generate random process noise samples v^i_k ∼ p_{v_k} and simulate x^i_{k+1} = f(x^i_k, v^i_k).
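A self-contained plain-MATLAB sketch of this algorithm (an illustration, not the toolbox implementation shown on the next slide; the function names sirpf and resample_systematic and the model interface are assumptions of this sketch):

    function [xhat, xp] = sirpf(f, py, x0samp, pvsamp, y, Np)
    % Bootstrap (SIR) particle filter sketch (plain MATLAB, no toolboxes).
    %   f(x, k)     - propagates a matrix of particles (nx x Np) one step (noise added separately)
    %   py(yk, x)   - likelihood p(y_k | x^i_k) for all particles, returned as 1 x Np
    %   x0samp(Np)  - Np samples from the prior p(x_0), returned as nx x Np
    %   pvsamp(Np)  - Np process noise samples, returned as nx x Np
    %   y           - measurements, one column per time step
    T    = size(y, 2);
    xp   = x0samp(Np);                    % initialization: x^i_0 ~ p_{x_0}
    xhat = zeros(size(xp,1), T);
    for k = 1:T
        w = py(y(:,k), xp);               % 1. measurement update (previous weights are 1/Np)
        w = w / sum(w);                   % 2. normalize
        xhat(:,k) = xp * w(:);            % 3. estimation (MMSE)
        ind = resample_systematic(w(:));  % 4. resampling; weights implicitly reset to 1/Np
        xp  = xp(:, ind);
        xp  = f(xp, k) + pvsamp(Np);      % 5. prediction with fresh process noise
    end
    end

    function ind = resample_systematic(w)
    % Systematic resampling: N indices drawn with probabilities w (column summing to one).
    N   = numel(w);
    u   = ((0:N-1)' + rand) / N;          % one random offset, stratified positions
    c   = cumsum(w);  c(end) = 1;         % guard against round-off
    ind = zeros(N,1);
    j   = 1;
    for i = 1:N
        while u(i) > c(j)
            j = j + 1;
        end
        ind(i) = j;
    end
    end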


Sensor Fusion, 2014 Lecture 6: 16

PF code

Input arguments: NL object m, SIG object z. Output arguments: SIG object zhat.

y=z.y.'; u=z.u.';
xp=ones(Np,1)*m.x0.' + rand(m.px0,Np);        % Initialization
for k=1:N;
    % Time update
    v=rand(m.pv,Np);                          % Random process noise
    xp=m.f(k,xp,u(:,k),m.th).'+v;             % State prediction
    % Measurement update
    yp=m.h(k,xp,u(k,:).',m.th).';             % Measurement prediction
    w=pdf(m.pe,repmat(y(:,k).',Np,1)-yp);     % Likelihood
    xhat(k,:)=mean(repmat(w(:),1,Np).*xp);    % Estimation
    [xp,w]=resample(xp,w);                    % Resampling
    xMC(:,k,:)=xp;                            % MC uncertainty repr.
end
zhat=sig(yp.',z.t,u.',xhat.',[],xMC);


Sensor Fusion, 2014 Lecture 6: 17

Example: 1D terrain navigation

Problem: measured velocity u_k, unknown velocity disturbance v_k, known altitude profile h(x), which is observed via y_k.
http://youtu.be/thNh0E6tmV0

Model:

x_{k+1} = x_k + u_k + v_k,
y_k = h(x_k) + e_k.

[Figure: altitude profile h(x) and measurement y_1; panels show Step 1 (MUP, measurement update) p(x_1 | y_1), Step 4 (resampling) p(x_1 | y_1), and Step 5 (TUP, time update) p(x_2 | y_1).]


Sensor Fusion, 2014 Lecture 6: 18

Example: 1D terrain navigation

Problem: measured velocity u_k, unknown velocity disturbance v_k, known altitude profile h(x), which is observed via y_k.
http://youtu.be/thNh0E6tmV0

Model:

x_{k+1} = x_k + u_k + v_k,
y_k = h(x_k) + e_k.

[Figure: altitude profile h(x) and measurement y_2; panels show Step 1 (MUP) p(x_2 | y_2), Step 4 (resampling) p(x_2 | y_{1:2}), and Step 5 (TUP) p(x_3 | y_{1:2}).]


Sensor Fusion, 2014 Lecture 6: 19

Example: 1D terrain navigation

Problem: measured velocity u_k, unknown velocity disturbance v_k, known altitude profile h(x), which is observed via y_k.
http://youtu.be/thNh0E6tmV0

Model:

x_{k+1} = x_k + u_k + v_k,
y_k = h(x_k) + e_k.

[Figure: altitude profile h(x) and measurement y_N; panels show Step 1 (MUP) p(x_N | y_{1:N}), Step 4 (resampling) p(x_N | y_{1:N}), and Step 5 (TUP) p(x_{N+1} | y_{1:N}).]
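To connect the example back to the algorithm, a sketch (not from the slides) that simulates this 1D model with an assumed synthetic altitude profile and runs it through the sirpf sketch given after the SIR algorithm slide; all numerical values are illustrative:

    % Assumed synthetic terrain and noise levels.
    h = @(x) 50 + 10*sin(0.05*x) + 5*sin(0.17*x);     % terrain height at position x
    Q = 0.5^2;   R = 1^2;   T = 100;   u = 1;         % 1 m nominal motion per step

    % Simulate a true trajectory and altitude measurements.
    xtrue = zeros(1,T);  y = zeros(1,T);  xtrue(1) = 20;
    for k = 1:T-1, xtrue(k+1) = xtrue(k) + u + sqrt(Q)*randn; end
    for k = 1:T,   y(k) = h(xtrue(k)) + sqrt(R)*randn; end

    % Plug the terrain model into the sirpf sketch.
    f      = @(x,k) x + u;                            % deterministic part of the motion model
    py     = @(yk,x) exp(-0.5*(yk - h(x)).^2 / R);    % measurement likelihood p(y_k | x)
    x0samp = @(Np) 200*rand(1,Np);                    % flat prior over a 200 m stretch
    pvsamp = @(Np) sqrt(Q)*randn(1,Np);               % process noise samples
    xhat   = sirpf(f, py, x0samp, pvsamp, y, 1000);   % estimated positions, 1 x T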


Sensor Fusion, 2014 Lecture 6: 20

2D Terrain Navigation

Same assumptions as in 1D: the aircraft measures the ground altitude as measurement y_k and a noisy speed u_k = v_k + w_k; a terrain elevation map (TAM) provides h(x_k).

http://youtu.be/1uT8U22Pmw0


Sensor Fusion, 2014 Lecture 6: 21

2D Street Navigation by Odometry

Wheel speed sensors provide speed and yaw rate; the street map provides constraints on turns.

http://youtu.be/aL0Tbt9Aw6k The video illustrates how a uniform prior over a part of the road network eventually converges to a single particle cluster once sufficient information is obtained (but with many local particle clusters initially).


Sensor Fusion, 2014 Lecture 6: 22

2D Street Navigation by Fingerprinting

A WiMAX network (Brussels here) provides signal strength measurements, an RSS map provides the fingerprint h(x_k), and a street map (optional) provides constraints on turns.

WiMAX video: the first half of the video shows the PF without the street constraint, the second half with the road constraint (with superior performance).


Sensor Fusion, 2014 Lecture 6: 23

2D Underwater Navigation

An underwater vessel measures its own depth and the distance to the bottom, and the sea chart provides the depth h(x_k).

http://youtu.be/JxUjVEn87yE The video shows how a uniform prior quickly converges to a unimodal particle cloud. Note how the cloud changes form when passing the ridge.


Sensor Fusion, 2014 Lecture 6: 24

2D Surface Navigation

On-board radar measures the range to shore and possibly its speed; the sea chart provides the conditional distance to shore h(x_k).

http://youtu.be/zkPiWRgnDg4 http://youtu.be/MGJQotL79Ns The first video shows the animated ship and radar measurements. The second video shows the radar measurements overlaid on the sea chart (given the estimated position), the estimated (PF) position of the own ship, and the estimated (EKF) positions of other ships, together with a 1-minute prediction of their positions (collision warning).

