Kalman Filters

ELE 774 - Adaptive Signal Processing
Page 1: Kalman Filters

Page 2: Kalman Filters

Introduction

Mathematical formulation is described by state space concepts.

The solution is computed recursively, and applies in both stationary and non-stationary environments.

Each updated estimate of the state is computed from the previous estimate and the new input data (the innovation).

A unifying framework for the family of recursive least-squares (RLS) filters.

Page 3: Kalman Filters

Recursive MMS Estimation for Scalar RVs

Assume a complete set of observed random variables up to time n-1: y(1), y(2), ..., y(n-1).

Let the minimum mean-square estimate of the zero-mean x(n-1) be

  x̂(n-1 | 𝒴_{n-1})

where 𝒴_{n-1} is the space spanned by the observations y(1), ..., y(n-1).

Let there be a new observation y(n). We estimate x(n) using the observations y(1), y(2), ..., y(n-1), y(n). Do this either by

storing y(1), y(2), ..., y(n-1) and redoing the whole problem, or

exploiting the new observation y(n), i.e. using a recursive estimation procedure.

Page 4: Kalman Filters

What is new in the new observation y(n)? Innovations!

Define the forward prediction error

  f_{n-1}(n) = y(n) - ŷ(n | 𝒴_{n-1})

Prediction order (n-1) increases linearly with n.

According to the principle of orthogonality, f_{n-1}(n) is orthogonal to y(1), y(2), ..., y(n-1).

f_{n-1}(n) is a measure of the new information in y(n) ⇒ innovations!

Information provided by y(n) is composed of two parts: one that is not new, contained in the one-step prediction ŷ(n | 𝒴_{n-1}), and one that is new, contained in the prediction error f_{n-1}(n).

Recursive MMS Estimation for Scalar RVs


Page 5: Kalman Filters

Refer to the prediction error as the innovation, and define

  α(n) = f_{n-1}(n) = y(n) - ŷ(n | 𝒴_{n-1}),   n = 1, 2, ...

Properties of the innovation α(n):

Property 1: α(n) is orthogonal to the observations y(1), y(2), ..., y(n-1):

  E[α(n) y*(k)] = 0,   1 ≤ k ≤ n-1

This follows from the principle of orthogonality.

Property 2: α(1), α(2), ..., α(n) are orthogonal to each other:

  E[α(n) α*(k)] = 0,   1 ≤ k ≤ n-1

i.e. the innovations process is white. This follows from Property 1, since each α(k) is a linear combination of y(1), ..., y(k).

Recursive MMS Estimation for Scalar RVs

Page 6: Kalman Filters

Recursive MMS Estimation for Scalar RVs

Property 3: There is a one-to-one correspondence between the observations {y(1), ..., y(n)} and the innovations {α(1), ..., α(n)}.

One sequence may be obtained from the other by means of a causal and causally invertible filter without any loss of information.

To show this, use the Gram-Schmidt orthogonalization procedure:

  α(1) = y(1)
  α(2) = y(2) + a_{2,1} y(1)
  ...
  α(k) = y(k) + a_{k,1} y(1) + ... + a_{k,k-1} y(k-1)

where the coefficients a_{k,i} are chosen so that α(k) is orthogonal to α(1), ..., α(k-1).

Page 7: Kalman Filters

Recursive MMS Estimation for Scalar RVs

Collecting all terms together,

  [α(1)]   [ 1                        0 ] [y(1)]
  [α(2)] = [ a_{2,1}   1                ] [y(2)]
  [  ⋮  ]   [   ⋮          ⋱             ] [  ⋮  ]
  [α(n)]   [ a_{n,1}  a_{n,2}  ...   1 ] [y(n)]

The kth row of the (unit lower triangular) matrix gives the coefficients of the forward prediction-error filter of order k-1.

The innovations can be calculated from the observations, or the observations can be calculated from the innovations (the matrix is invertible).

There is no loss of information in this transformation.
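As a concrete illustration of this causal, causally invertible transformation (a sketch, not from the slides: the observation covariance below is made up), we can build the unit-lower-triangular matrix that maps observations to innovations and verify that the resulting innovations are uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n = 4 correlated scalar observations y(1..4)
# with an assumed (randomly generated) covariance matrix R_y.
n = 4
B = rng.standard_normal((n, n))
R_y = B @ B.T + n * np.eye(n)          # a valid covariance matrix

# Gram-Schmidt in matrix form: R_y = L D L^T with unit-diagonal L.
C = np.linalg.cholesky(R_y)            # R_y = C C^T, C lower triangular
d = np.diag(C)
L = C / d                              # unit lower triangular factor
A = np.linalg.inv(L)                   # alpha = A y, also unit lower triangular

# The innovations alpha = A y are uncorrelated (covariance is diagonal):
R_alpha = A @ R_y @ A.T
assert np.allclose(R_alpha, np.diag(np.diag(R_alpha)))

# The kth row of A plays the role of the order-(k-1) forward
# prediction-error filter, and L = A^{-1} recovers the observations,
# so the transformation loses no information.
```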

Page 8: Kalman Filters

Recursive MMS Estimation for Scalar RVs

The one-to-one correspondence means that the two sequences span the same space:

  span{ y(1), ..., y(n) } = span{ α(1), ..., α(n) }

or, equivalently, the minimum mean-square estimate of a random variable given the observations equals its estimate given the innovations.

Page 9: Kalman Filters

Recursive MMS Estimation for Scalar RVs

Clearly, the estimate can then be expanded in terms of the innovations:

  x̂(n | 𝒴_n) = Σ_{k=1}^{n} b_k α(k)

Recalling that the innovations are orthogonal to each other, and choosing the b_k to minimize the mean-square estimation error E[|x(n) - x̂(n | 𝒴_n)|²], we get

  b_k = E[x(n) α*(k)] / E[|α(k)|²]

Now rewrite the sum by splitting off the last term:

  x̂(n | 𝒴_n) = Σ_{k=1}^{n-1} b_k α(k) + b_n α(n) = x̂(n | 𝒴_{n-1}) + b_n α(n)

Adding a correction term b_n α(n) to the previous estimate x̂(n | 𝒴_{n-1}) gives the updated estimate x̂(n | 𝒴_n), so the estimate can be calculated recursively.

Page 10: Kalman Filters

Recursive MMS Estimation for Scalar RVs

Predictor – Corrector structure:

The use of observations to compute a forward prediction error, the innovation;

The use of the innovation to update (correct) the minimum mean-square estimate of a r.v. related linearly to the observations.
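A minimal numerical sketch of this predictor-corrector recursion (the signal model and variances below are assumptions for illustration, not from the slides): a zero-mean random constant x is estimated from noisy observations y(n) = x + v(n), with each update adding a gain times the innovation to the previous estimate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scalar model: estimate a zero-mean random constant x from
# y(n) = x + v(n) via the predictor-corrector form
#   x_hat(n) = x_hat(n-1) + b_n * alpha(n),  alpha(n) = y(n) - x_hat(n-1)
sigma_x2, sigma_v2 = 4.0, 1.0
x = rng.normal(0.0, np.sqrt(sigma_x2))

x_hat, p = 0.0, sigma_x2            # prior mean and variance of x
for n in range(1, 201):
    y = x + rng.normal(0.0, np.sqrt(sigma_v2))
    alpha = y - x_hat               # innovation: the new information in y(n)
    b = p / (p + sigma_v2)          # gain minimizing E[(x - x_hat)^2]
    x_hat += b * alpha              # corrector: previous estimate + correction
    p *= 1.0 - b                    # posterior variance shrinks

# After 200 observations x_hat is close to x and p is small.
```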

Page 11: Kalman Filters

Discrete-Time Dynamical System

A linear discrete-time dynamical system can be characterized by

  Process equation:      x(n+1) = F(n+1, n) x(n) + ν₁(n)

  Measurement equation:  y(n) = C(n) x(n) + ν₂(n)

Page 12: Kalman Filters

Discrete-Time Dynamical System

The state vector, x(n), is the minimal set of data that is sufficient to uniquely describe the unforced dynamical behaviour of the system: the fewest data on past behaviour needed to predict future behaviour. Dimension M.

The observation vector, y(n), is the set of observed data. Dimension N.

Page 13: Kalman Filters

Discrete-Time Dynamical System

The process equation

  x(n+1) = F(n+1, n) x(n) + ν₁(n)

models an unknown physical stochastic phenomenon, denoted by the state x(n), as the output of a linear dynamical system excited by the white noise ν₁(n). F(n+1, n) is the transition matrix from time n to n+1.

Properties of the transition matrix:

1. Product rule:  F(n, m) F(m, l) = F(n, l)

2. Inverse rule:  F⁻¹(n, m) = F(m, n)

Corollary:  F(n, n) = I (identity matrix)

Page 14: Kalman Filters

Discrete-Time Dynamical System

The measurement equation

  y(n) = C(n) x(n) + ν₂(n)

gives the relation between the state x(n) and the output y(n), with zero-mean white measurement noise (disturbance) ν₂(n). C(n) is the measurement matrix.

The initial value of the state, x(0), is uncorrelated with both ν₁(n) and ν₂(n).

The noise vectors ν₁(n) and ν₂(n) are statistically independent, with correlation matrices

  E[ν₁(n) ν₁ᴴ(k)] = Q₁(n) δ(n-k),   E[ν₂(n) ν₂ᴴ(k)] = Q₂(n) δ(n-k)
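The state-space model above can be simulated directly. The sketch below uses made-up matrices for illustration (a time-invariant constant-velocity system; the slides' F(n+1, n), C(n), Q₁, Q₂ are otherwise arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative linear dynamical model (assumed matrices, M = 2, N = 1):
#   process:     x(n+1) = F x(n) + v1(n),  E[v1 v1^H] = Q1
#   measurement: y(n)   = C x(n) + v2(n),  E[v2 v2^H] = Q2
M, N = 2, 1
F = np.array([[1.0, 1.0], [0.0, 1.0]])    # constant-velocity transition
C = np.array([[1.0, 0.0]])                # observe position only
Q1 = 0.01 * np.eye(M)
Q2 = np.array([[0.25]])

def simulate(steps, x0):
    """Generate states and observations from the linear dynamical model."""
    xs, ys = [], []
    x = x0
    for _ in range(steps):
        y = C @ x + rng.multivariate_normal(np.zeros(N), Q2)
        xs.append(x)
        ys.append(y)
        x = F @ x + rng.multivariate_normal(np.zeros(M), Q1)
    return np.array(xs), np.array(ys)

xs, ys = simulate(50, np.zeros(M))
```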

Page 15: Kalman Filters

Kalman Filtering

We need to solve these equations jointly to find the state x(n):

  Process equation:      x(n+1) = F(n+1, n) x(n) + ν₁(n)

  Measurement equation:  y(n) = C(n) x(n) + ν₂(n)

Use the entire observed data, consisting of the observations y(1), y(2), ..., y(n), to find the minimum mean-square estimate of the state x(i), n ≥ 1:

  i = n: filtering,
  i > n: prediction,
  i < n: smoothing.

Page 16: Kalman Filters

The Innovations Process

Let the MMS estimate of y(n) be ŷ(n | 𝒴_{n-1}), where 𝒴_{n-1} is the space spanned by y(1), y(2), ..., y(n-1).

The innovations process associated with y(n) is

  α(n) = y(n) - ŷ(n | 𝒴_{n-1}),   n = 1, 2, ...

(similar to the scalar case); the N×1 vector α(n) represents the new information in the observed data y(n).

Page 17: Kalman Filters

The Innovations Process

Properties of the innovations process:

Property I: α(n) is orthogonal to all past observations y(1), ..., y(n-1):

  E[α(n) yᴴ(k)] = 0,   1 ≤ k ≤ n-1

Property II: The innovations process consists of a sequence of vector random variables that are orthogonal to each other:

  E[α(n) αᴴ(k)] = 0,   1 ≤ k ≤ n-1

Property III: There is a one-to-one correspondence between the observed data and the innovations process; one sequence may be obtained from the other by means of linear stable operators, without loss of information.

Page 18: Kalman Filters

The Innovations Process

Correlation Matrix of the Innovations Process. Starting from the initial state (n = 0), repeated use of the process equation gives

  x(k) = F(k, 0) x(0) + Σ_{i=1}^{k-1} F(k, i) ν₁(i)

i.e. x(k) is a linear combination of x(0), ν₁(1), ..., ν₁(k-1).

Since the noise processes are white and uncorrelated with x(0), it follows that

1.  E[x(k) ν₁ᴴ(n)] = 0,   0 ≤ k ≤ n

2.  E[x(k) ν₂ᴴ(n)] = 0,   k ≥ 0

3.  E[y(k) ν₁ᴴ(n)] = 0,   0 ≤ k ≤ n,   and   E[y(k) ν₂ᴴ(n)] = 0,   1 ≤ k ≤ n-1

Page 19: Kalman Filters

The Innovations Process

Recall that y(n) = C(n) x(n) + ν₂(n). Then, given the past observations, i.e. 𝒴_{n-1}, the MMS estimate of y(n) is

  ŷ(n | 𝒴_{n-1}) = C(n) x̂(n | 𝒴_{n-1})

Hence the innovation becomes

  α(n) = y(n) - C(n) x̂(n | 𝒴_{n-1}) = C(n) ε(n, n-1) + ν₂(n)

where ε(n, n-1) = x(n) - x̂(n | 𝒴_{n-1}) is the predicted state-error vector at time n, using data up to time n-1.

Page 20: Kalman Filters

The Innovations Process

Autocorrelation of the innovations process α(n):

  R(n) = E[α(n) αᴴ(n)] = C(n) K(n, n-1) Cᴴ(n) + Q₂(n)

where the predicted state-error correlation matrix is

  K(n, n-1) = E[ε(n, n-1) εᴴ(n, n-1)]

a statistical description of the error in the predicted estimate x̂(n | 𝒴_{n-1}).

Page 21: Kalman Filters

Estimation of the State

Minimum mean-square estimation of the state x(i): the estimate may be expressed as a linear combination of the sequence of innovations,

  x̂(i | 𝒴_n) = Σ_{k=1}^{n} B_i(k) α(k)

Using the principle of orthogonality, E[(x(i) - x̂(i | 𝒴_n)) αᴴ(m)] = 0 for m = 1, ..., n, which gives

  B_i(m) = E[x(i) αᴴ(m)] R⁻¹(m)

Page 22: Kalman Filters

Estimation of the State

Minimum mean-square estimate of x(i):

  x̂(i | 𝒴_n) = Σ_{k=1}^{n} E[x(i) αᴴ(k)] R⁻¹(k) α(k)

Let i = n+1:

  x̂(n+1 | 𝒴_n) = Σ_{k=1}^{n-1} E[x(n+1) αᴴ(k)] R⁻¹(k) α(k) + E[x(n+1) αᴴ(n)] R⁻¹(n) α(n)

Page 23: Kalman Filters

Estimation of the State

The summation in the estimate over k = 1, ..., n-1 is

  Σ_{k=1}^{n-1} E[x(n+1) αᴴ(k)] R⁻¹(k) α(k) = F(n+1, n) x̂(n | 𝒴_{n-1})

since x(n+1) = F(n+1, n) x(n) + ν₁(n) and ν₁(n) is orthogonal to α(1), ..., α(n-1).

Kalman Gain: define

  G(n) = E[x(n+1) αᴴ(n)] R⁻¹(n)

Then

  x̂(n+1 | 𝒴_n) = F(n+1, n) x̂(n | 𝒴_{n-1}) + G(n) α(n)

Page 24: Kalman Filters

Estimation of the State

A convenient way to calculate the Kalman gain: it can be shown that

  E[x(n+1) αᴴ(n)] = F(n+1, n) K(n, n-1) Cᴴ(n)

Hence, the Kalman gain becomes

  G(n) = F(n+1, n) K(n, n-1) Cᴴ(n) [C(n) K(n, n-1) Cᴴ(n) + Q₂(n)]⁻¹

Rewriting the estimate,

  x̂(n+1 | 𝒴_n) = F(n+1, n) x̂(n | 𝒴_{n-1}) + G(n) [y(n) - C(n) x̂(n | 𝒴_{n-1})]

K(n, n-1) and G(n) have to be calculated at each iteration! Let's make it recursive.
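A single gain evaluation can be sketched numerically. The matrices below are illustrative assumptions (the same constant-velocity example used earlier, with an assumed K(n, n-1)):

```python
import numpy as np

# Illustrative matrices (assumptions, not from the slides):
#   G(n) = F K(n, n-1) C^H [C K(n, n-1) C^H + Q2]^{-1}
F = np.array([[1.0, 1.0], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Q2 = np.array([[0.25]])
K_pred = np.eye(2)                       # assumed K(n, n-1)

R = C @ K_pred @ C.T + Q2                # innovation correlation matrix
G = F @ K_pred @ C.T @ np.linalg.inv(R)  # Kalman gain, an M x N matrix
```

With K(n, n-1) = I here, R = 1.25 and the gain weights the scalar innovation into both state components.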

Page 25: Kalman Filters

Estimation of the State

Kalman gain computer

Page 26: Kalman Filters

Estimation of the State

Riccati Equation: the predicted state-error vector is

  ε(n+1, n) = x(n+1) - x̂(n+1 | 𝒴_n)

After substitution and manipulations,

  ε(n+1, n) = [F(n+1, n) - G(n) C(n)] ε(n, n-1) + ν₁(n) - G(n) ν₂(n)

Page 27: Kalman Filters

Estimation of the State

We want to find K(n+1, n) = E[ε(n+1, n) εᴴ(n+1, n)]; hence, from the previous slide, using (1) and (2) we get the Riccati difference equation

  K(n+1, n) = F(n+1, n) K(n) Fᴴ(n+1, n) + Q₁(n)

where

  K(n) = K(n, n-1) - F(n, n+1) G(n) C(n) K(n, n-1)
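The Riccati recursion can be iterated numerically. The sketch below (same illustrative matrices as before, assumptions rather than slide content) applies one step repeatedly; the predicted state-error correlation converges to a fixed point, the steady-state solution of the discrete algebraic Riccati equation:

```python
import numpy as np

# Illustrative time-invariant system (assumed matrices):
#   K(n)      = [I - F^{-1} G(n) C] K(n, n-1)   (here F(n, n+1) = F^{-1})
#   K(n+1, n) = F K(n) F^H + Q1
F = np.array([[1.0, 1.0], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Q1 = 0.01 * np.eye(2)
Q2 = np.array([[0.25]])

def riccati_step(K_pred):
    """One Riccati difference-equation update of K(n, n-1)."""
    R = C @ K_pred @ C.T + Q2
    G = F @ K_pred @ C.T @ np.linalg.inv(R)          # Kalman gain
    K_filt = (np.eye(2) - np.linalg.inv(F) @ G @ C) @ K_pred
    return F @ K_filt @ F.T + Q1

K = np.eye(2)
for _ in range(200):
    K = riccati_step(K)
# K is now (numerically) a fixed point of the recursion.
```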

Page 28: Kalman Filters

Estimation of the State

Riccati equation solver

Page 29: Kalman Filters

Estimation of the State

Kalman's one-step prediction algorithm, computed for n = 1, 2, ...:

  G(n) = F(n+1, n) K(n, n-1) Cᴴ(n) [C(n) K(n, n-1) Cᴴ(n) + Q₂(n)]⁻¹
  α(n) = y(n) - C(n) x̂(n | 𝒴_{n-1})
  x̂(n+1 | 𝒴_n) = F(n+1, n) x̂(n | 𝒴_{n-1}) + G(n) α(n)
  K(n) = K(n, n-1) - F(n, n+1) G(n) C(n) K(n, n-1)
  K(n+1, n) = F(n+1, n) K(n) Fᴴ(n+1, n) + Q₁(n)

Page 30: Kalman Filters

Estimation of the State

One-step prediction algorithm
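The full one-step prediction loop can be sketched as follows (a compact illustration with the assumed constant-velocity matrices used throughout these examples, not the slides' own figure):

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed time-invariant model: per step the filter computes
#   G(n), alpha(n), x_pred(n+1) = F x_pred + G alpha, and the Riccati update.
F = np.array([[1.0, 1.0], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Q1 = 0.01 * np.eye(2)
Q2 = np.array([[0.25]])

def kalman_predictor(ys, x0, K0):
    """Run Kalman's one-step prediction algorithm over observations ys."""
    x_pred, K = x0, K0
    preds = []
    for y in ys:
        R = C @ K @ C.T + Q2
        G = F @ K @ C.T @ np.linalg.inv(R)        # Kalman gain
        alpha = y - C @ x_pred                    # innovation
        x_pred = F @ x_pred + G @ alpha           # corrected one-step prediction
        K = F @ (K - K @ C.T @ np.linalg.inv(R) @ C @ K) @ F.T + Q1
        preds.append(x_pred)
    return np.array(preds)

# Track a noisy constant-velocity target.
x = np.array([0.0, 1.0])
ys, xs = [], []
for _ in range(100):
    ys.append(C @ x + rng.multivariate_normal([0.0], Q2))
    xs.append(x)
    x = F @ x + rng.multivariate_normal([0.0, 0.0], Q1)

preds = kalman_predictor(ys, np.zeros(2), 10.0 * np.eye(2))
# preds[i] is x_hat(i+1 | Y_i); compare against the true next states.
err = np.mean(np.abs(preds[:-1, 0] - np.array(xs)[1:, 0]))
```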

Page 31: Kalman Filters

Filtering

Compute the filtered estimate x̂(n | 𝒴_n) by using the one-step prediction algorithm.

The state x(n) and the process noise ν₁(n) are independent of each other.

The MMSE estimate of x(n+1) given the observations up to time n is

  x̂(n+1 | 𝒴_n) = F(n+1, n) x̂(n | 𝒴_n) + ν̂₁(n | 𝒴_n) = F(n+1, n) x̂(n | 𝒴_n)

where the second equality follows from the fact that y(n) and ν₁(n) are independent of each other, so ν̂₁(n | 𝒴_n) = 0. Hence

  x̂(n | 𝒴_n) = F(n, n+1) x̂(n+1 | 𝒴_n)

Page 32: Kalman Filters

Filtering

Filtered Estimation Error and Conversion Factor: define the filtered estimation error vector

  e(n) = y(n) - C(n) x̂(n | 𝒴_n)

We know that

  x̂(n | 𝒴_n) = F(n, n+1) x̂(n+1 | 𝒴_n) = F(n, n+1) [F(n+1, n) x̂(n | 𝒴_{n-1}) + G(n) α(n)]

Then

  e(n) = [I - C(n) F(n, n+1) G(n)] α(n)

where I - C(n) F(n, n+1) G(n) is the conversion factor relating the filtered estimation error to the innovation.

Page 33: Kalman Filters

Filtering

Substituting

  G(n) = F(n+1, n) K(n, n-1) Cᴴ(n) R⁻¹(n)

gives

  e(n) = [I - C(n) K(n, n-1) Cᴴ(n) R⁻¹(n)] α(n) = Q₂(n) R⁻¹(n) α(n)

since R(n) = C(n) K(n, n-1) Cᴴ(n) + Q₂(n).

Filtered State-Error Correlation Matrix: define the filtered state-error vector

  ε(n) = x(n) - x̂(n | 𝒴_n),   K(n) = E[ε(n) εᴴ(n)]
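The conversion-factor identity above is a pure matrix identity, so it can be checked numerically. The sketch below uses random positive-definite matrices (an illustration, not slide content): with R = C K Cᴴ + Q₂, the matrix I - C K Cᴴ R⁻¹ equals Q₂ R⁻¹:

```python
import numpy as np

rng = np.random.default_rng(3)

# Random illustrative matrices: K (predicted-error correlation) and Q2
# (measurement-noise correlation) are made symmetric positive definite.
M, N = 3, 2
A = rng.standard_normal((M, M))
K = A @ A.T
B = rng.standard_normal((N, N))
Q2 = B @ B.T + np.eye(N)
C = rng.standard_normal((N, M))

# Conversion factor identity: I - C K C^H R^{-1} = Q2 R^{-1},
# hence e(n) = Q2(n) R^{-1}(n) alpha(n).
R = C @ K @ C.T + Q2
lhs = np.eye(N) - C @ K @ C.T @ np.linalg.inv(R)
rhs = Q2 @ np.linalg.inv(R)
assert np.allclose(lhs, rhs)
```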

Page 34: Kalman Filters

Filtering

Manipulations give us

  K(n) = [I - F(n, n+1) G(n) C(n)] K(n, n-1)

i.e. the same K(n) that appears in the Riccati difference equation.

Initial Conditions: the initial state of the process equation is not precisely known. In the absence of any observed data at n = 0, let

  x̂(1 | 𝒴₀) = E[x(1)]

and

  K(1, 0) = E[(x(1) - E[x(1)]) (x(1) - E[x(1)])ᴴ] = Π₀ = E[x(1) xᴴ(1)]  if x is zero mean

Page 35: Kalman Filters

Filtering

Kalman filter based on one-step prediction

Page 36: Kalman Filters

Summary

