
Introduction to Kalman Filters

Michael Williams

5 June 2003


Overview

• The Problem – why do we need Kalman filters?
• What is a Kalman filter?
• Conceptual overview
• The theory of the Kalman filter
• Simple example


The Problem

• System state cannot be measured directly
• Need to estimate it "optimally" from measurements

[Diagram: a black-box system, driven by external controls and subject to system error sources, has a system state (desired but not known); measuring devices, subject to measurement error sources, produce observed measurements; an estimator turns these into an optimal estimate of the system state]


What is a Kalman Filter?

• Recursive data-processing algorithm
• Generates the optimal estimate of the desired quantities given the set of measurements
• Optimal?
  – For a linear system with white Gaussian errors, the Kalman filter is the "best" estimate based on all previous measurements
  – For a non-linear system, optimality is 'qualified'
• Recursive?
  – Doesn't need to store all previous measurements and reprocess all data each time step


Conceptual Overview

• Simple example to motivate the workings of the Kalman Filter

• Theoretical Justification to come later – for now just focus on the concept

• Important: Prediction and Correction


Conceptual Overview

• Lost on a 1-dimensional line
• Position – y(t)
• Assume Gaussian distributed measurements

y


Conceptual Overview

[Plot: Gaussian probability density of the t1 measurement over position y (0–100)]

• Sextant measurement at t1: mean = z1, variance = σ²_z1

• Optimal estimate of position: ŷ(t1) = z1

• Variance of error in the estimate: σ²_x(t1) = σ²_z1

• Boat in the same position at time t2 – predicted position is z1


[Plot: Gaussian densities of the prediction ŷ⁻(t2) and the measurement z(t2)]

Conceptual Overview

• So we have the prediction ŷ⁻(t2)

• GPS measurement at t2: mean = z2, variance = σ²_z2

• Need to correct the prediction with the measurement to get ŷ(t2)

• The estimate should sit closer to the more trusted of the two – linear interpolation?


[Plot: Gaussian densities of the prediction ŷ⁻(t2), the measurement z(t2), and the narrower corrected estimate ŷ(t2)]

Conceptual Overview

• The corrected mean is the new optimal estimate of position
• The new variance is smaller than either of the previous two variances


Conceptual Overview

• Lessons so far:

– Make a prediction based on previous data: ŷ⁻, σ⁻

– Take a measurement: z_k, σ_z

– Optimal estimate: ŷ = prediction + K × (measurement − prediction), where K is the Kalman gain

– Variance of estimate = variance of prediction × (1 − K)
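The two rules above can be sketched for the scalar case as a fusion of two Gaussians. This is an illustrative sketch; the function name and the numbers in the example are not from the slides:

```python
def fuse(pred_mean, pred_var, z, z_var):
    """Blend a Gaussian prediction with a Gaussian measurement (scalar case)."""
    K = pred_var / (pred_var + z_var)       # Kalman gain: how much to trust the measurement
    mean = pred_mean + K * (z - pred_mean)  # prediction + gain * residual
    var = pred_var * (1 - K)                # fused variance is smaller than either input
    return mean, var

# Example: prediction at 50 (variance 4) fused with a measurement at 56 (variance 1)
mean, var = fuse(50.0, 4.0, 56.0, 1.0)
# K = 4/5 = 0.8, mean = 50 + 0.8 * 6 = 54.8, var = 4 * 0.2 = 0.8
```

Note that the fused variance (0.8) is below both input variances, which is the point of blending.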


[Plot: density of ŷ(t2) shifted right to give the naïve prediction ŷ⁻(t3)]

Conceptual Overview

• At time t3, boat moves with velocity dy/dt=u

• Naïve approach: shift the probability density to the right to predict
• This would work if we knew the velocity exactly (a perfect model)


[Plot: densities of ŷ(t2), the naïve prediction ŷ⁻(t3), and the broader prediction ŷ⁻(t3) that includes process noise]

Conceptual Overview

• Better to assume an imperfect model by adding Gaussian noise
• dy/dt = u + w
• The predicted distribution both moves and spreads out


[Plot: densities of the prediction ŷ⁻(t3), the measurement z(t3), and the corrected optimal estimate ŷ(t3)]

Conceptual Overview

• Now we take a measurement at t3
• Need to once again correct the prediction
• Same blending as before


Conceptual Overview

• Lessons learnt from the conceptual overview:

– Initial conditions (ŷ_{k−1} and σ_{k−1})

– Prediction (ŷ⁻_k, σ⁻_k): use the initial conditions and a model (e.g. constant velocity) to make a prediction

– Measurement (z_k): take a measurement

– Correction (ŷ_k, σ_k): use the measurement to correct the prediction by 'blending' prediction and residual – always a case of merging only two Gaussians – giving an optimal estimate with smaller variance
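The full cycle can be sketched in the scalar case. This is a sketch with made-up numbers; u (velocity), q (process-noise rate), and r (measurement variance) are illustrative parameters, not values from the slides:

```python
def predict(mean, var, u, dt, q):
    """Time update: move with velocity u; process noise q spreads the density."""
    return mean + u * dt, var + q * dt

def correct(mean, var, z, r):
    """Measurement update: blend the prediction with measurement z (variance r)."""
    K = var / (var + r)
    return mean + K * (z - mean), var * (1 - K)

# One full cycle: initial conditions -> prediction -> measurement -> correction
mean, var = 55.0, 0.8                                  # initial conditions ŷ(t2), σ²(t2)
mean, var = predict(mean, var, u=2.0, dt=1.0, q=0.5)   # prediction ŷ⁻(t3) = 57.0, variance grows to 1.3
mean, var = correct(mean, var, z=58.0, r=1.0)          # corrected estimate ŷ(t3), smaller variance
```

The prediction step always increases the variance (the model is imperfect), and the correction step always shrinks it again.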


Major equation


Theoretical Basis

• Process to be estimated:

y_k = A y_{k−1} + B u_k + w_{k−1}    (process noise w, covariance Q)

z_k = H y_k + v_k    (measurement noise v, covariance R)

• Kalman filter:

Prediction: ŷ⁻_k = A ŷ_{k−1} + B u_k is the estimate based on measurements at previous time steps

P⁻_k = A P_{k−1} Aᵀ + Q

Correction: ŷ_k = ŷ⁻_k + K (z_k − H ŷ⁻_k) adds the new information in the measurement at time k

K = P⁻_k Hᵀ (H P⁻_k Hᵀ + R)⁻¹

P_k = (I − K H) P⁻_k
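The five equations above can be collected into one predict-plus-correct step. A minimal NumPy sketch; the function name and argument order are illustrative choices, not from the slides:

```python
import numpy as np

def kalman_step(y, P, z, u, A, B, H, Q, R):
    """One predict + correct cycle of the linear Kalman filter (matrix form)."""
    # Prediction (time update)
    y_pred = A @ y + B @ u                  # ŷ⁻_k = A ŷ_{k−1} + B u_k
    P_pred = A @ P @ A.T + Q                # P⁻_k = A P_{k−1} Aᵀ + Q
    # Correction (measurement update)
    S = H @ P_pred @ H.T + R                # residual covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    y_new = y_pred + K @ (z - H @ y_pred)   # blend prediction and residual
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
    return y_new, P_new
```

Called once per time step, carrying (y, P) forward, this is the whole recursive filter: no past measurements need to be stored.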


Blending Factor

• If we are sure about the measurements:
  – Measurement error covariance R approaches zero
  – K increases and weights the residual more heavily than the prediction

• If we are sure about the prediction:
  – Prediction error covariance P⁻_k approaches zero
  – K decreases and weights the prediction more heavily than the residual
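In the scalar case with H = 1 the gain reduces to K = P⁻ / (P⁻ + R), so the two limits can be checked numerically (a toy sketch; the function name is illustrative):

```python
def gain(P_pred, R):
    """Scalar Kalman gain with H = 1: K = P⁻ / (P⁻ + R)."""
    return P_pred / (P_pred + R)

# Sure about the measurement: R -> 0 drives K toward 1, so the residual dominates
print(gain(1.0, 1e-6))   # ≈ 1.0
# Sure about the prediction: P⁻ -> 0 drives K toward 0, so the prediction dominates
print(gain(1e-6, 1.0))   # ≈ 0.0
```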


Theoretical Basis

Prediction (Time Update):

(1) Project the state ahead: ŷ⁻_k = A ŷ_{k−1} + B u_k

(2) Project the error covariance ahead: P⁻_k = A P_{k−1} Aᵀ + Q

Correction (Measurement Update):

(1) Compute the Kalman gain: K = P⁻_k Hᵀ (H P⁻_k Hᵀ + R)⁻¹

(2) Update the estimate with measurement z_k: ŷ_k = ŷ⁻_k + K (z_k − H ŷ⁻_k)

(3) Update the error covariance: P_k = (I − K H) P⁻_k


Quick Example – Constant Model

[Diagram: the same system / measuring devices / estimator block diagram as before, now applied to a constant model]

A simple example

Estimate a random constant: a "voltage" reading from a source. The voltage has a constant value of a V (volts), so there is no control input u_k. The standard deviation of the measurement noise is 0.1 V.

This is a one-dimensional signal problem: A and H are both the constant 1. Assume the error covariance P_0 is initially 1 and the initial state x_0 is 0.


A simple example – Part 1

Time:   1     2     3     4     5     6     7     8     9     10
Value:  0.39  0.50  0.48  0.29  0.25  0.32  0.34  0.48  0.41  0.45
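Running the scalar filter over the ten readings above is a few lines. A sketch: A = H = 1 and no control input, as stated on the previous slide; the process noise Q is not given there, so it is assumed zero for this constant model:

```python
readings = [0.39, 0.50, 0.48, 0.29, 0.25, 0.32, 0.34, 0.48, 0.41, 0.45]

y, P = 0.0, 1.0          # initial state x_0 and error covariance P_0 from the slide
R = 0.1 ** 2             # measurement noise variance (standard deviation 0.1 V)
Q = 0.0                  # assumption: no process noise for a constant model

for z in readings:
    P = P + Q            # predict (A = 1, no control input, so the state is unchanged)
    K = P / (P + R)      # Kalman gain
    y = y + K * (z - y)  # correct the estimate with the reading
    P = (1 - K) * P      # error covariance shrinks with every reading

print(round(y, 3))       # prints 0.391
```

The estimate settles near the average of the readings, and P falls from 1 to about 0.001, reflecting growing confidence.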


A simple example – Part 2


A simple example – Part 3


Result of the example


References

1. Kalman, R. E. 1960. "A New Approach to Linear Filtering and Prediction Problems", Transactions of the ASME – Journal of Basic Engineering, pp. 35-45 (March 1960).

2. Maybeck, P. S. 1979. Stochastic Models, Estimation, and Control, Volume 1, Academic Press, Inc.

3. Welch, G. and Bishop, G. 2001. "An Introduction to the Kalman Filter", http://www.cs.unc.edu/~welch/kalman/