
State Estimation for Autonomous Vehicles

Stergios I. Roumeliotis
Computer Science & Engineering Department
University of Minnesota
stergios@cs.umn.edu
www.cs.umn.edu/~stergios

Outline

• Sensing & estimation
• Estimator requirements
• Indirect Kalman filter formulation
• SC-KF
• Preliminary results
• Ongoing & related work
• Challenges & unresolved issues
• Extensions & future work

Sensors

Proprioceptive: directly measure the motion of the vehicle
• IMU (accelerometers & gyros)
• Doppler radar
Noise, integration

Exteroceptive: measure “identities” of the environment, or the “relation” of the vehicle with the environment; used to infer absolute and/or relative position and orientation (pose displacement)
• Compass, sun sensor
• GPS
• Cameras (single, stereo, omni, FLIR)
• Laser scanner, MW radar, sonar
• Wheel encoders
• …

Uncertainty & Noise

State Estimation Techniques

• Bayesian estimation
• Kalman filter
• Particle filter
• Unscented filter
• …

Goal: Estimate & Control
• State of vehicle (position, orientation, velocity, direction of motion, …)
• State of environment (detect obstacles, position of objects of interest, area identities, mapping, …)

[Figure: filter block diagram showing the propagation and update stages]

Estimator Requirements

• Portable (independent of vehicle)
• Adaptable (number & type of sensors)
• Modular (robustness against single-point sensor failures)
• Time flexible (able to process synchronous & asynchronous sensor measurements)
• Expandable to multi-robot systems

Sensor & Vehicle Independence

Adaptability & portability: the estimator considers any vehicle as a static network of sensors in a known configuration.
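One way to picture this is a configuration table describing each sensor and its fixed pose on the vehicle body; the following is a minimal, hypothetical sketch (these class and field names are not from the talk):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SensorConfig:
    """One sensor in the vehicle's (static) sensor network."""
    name: str            # e.g. "imu0", "front_laser"
    kind: str            # "proprioceptive" or "exteroceptive"
    p_body: np.ndarray   # sensor position in the vehicle body frame, shape (3,)
    C_body: np.ndarray   # rotation: sensor frame -> body frame, shape (3, 3)

# The estimator only needs this list; it does not need a vehicle-specific
# dynamic model, which is what makes it portable across platforms.
vehicle_sensors = [
    SensorConfig("imu0",   "proprioceptive", np.zeros(3),               np.eye(3)),
    SensorConfig("laser0", "exteroceptive",  np.array([0.3, 0.0, 0.1]), np.eye(3)),
]
```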

Indirect Kalman filter – Sensor Modeling

State propagation: integrate the measurements from those sensors that measure the highest-order derivatives of motion. When an IMU is part of the sensor payload, its measurements are what get integrated (see the sketch after this slide).

Advantages, compared to:
• Dynamic modeling
  - Difficult to derive precise vehicle/environment dynamics
  - Vehicle modifications require a new derivation
  - CPU cost (large state vector needed to capture the dynamics)
• Statistical modeling (commonly used for target tracking)
  - Motion statistics unknown/uncertain

State update: performed asynchronously, whenever new measurements become available.
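The slides do not reproduce the integration equations, so as a rough illustration of what integrating the IMU means here, a first-order strapdown sketch (simple Euler integration; the function and variable names are hypothetical, and the talk's actual filter uses an indirect/error-state formulation) could look like this:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def propagate_imu(p, v, C, a_m, w_m, b_a, b_g, g, dt):
    """One first-order strapdown integration step (illustrative only).

    p, v     : position and velocity in the global frame
    C        : rotation matrix, body frame -> global frame
    a_m, w_m : measured specific force and angular rate (body frame)
    b_a, b_g : current accelerometer / gyro bias estimates
    g        : gravity vector expressed in the global frame
    """
    w = w_m - b_g                          # bias-compensated angular rate
    a = a_m - b_a                          # bias-compensated specific force
    C = C @ (np.eye(3) + skew(w) * dt)     # first-order attitude update
    v = v + (C @ a + g) * dt               # velocity update
    p = p + v * dt                         # position update
    return p, v, C
```

In the indirect formulation, this integrated estimate is what the error-state filter corrects whenever an exteroceptive update becomes available.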

Indirect Kalman filter – Formulation

Quantities of interest:
• State vector
• Estimated state vector
• Error state vector

Propagation:
• Continuous-time error state propagation
• Covariance propagation
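The slide names these quantities without showing the equations; in a generic indirect (error-state) formulation they take roughly the following form (standard notation, not necessarily the exact symbols used in the original slides):

```latex
% Error state: difference between true and estimated state
\tilde{x} = x - \hat{x}

% Continuous-time error-state propagation, with system noise w(t)
\dot{\tilde{x}}(t) = F(t)\,\tilde{x}(t) + G(t)\,w(t)

% Covariance propagation (Q is the spectral density of w)
\dot{P}(t) = F(t)\,P(t) + P(t)\,F^{T}(t) + G(t)\,Q(t)\,G^{T}(t)
```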

Indirect Kalman filter – Update (1 time instant)

Measurement is a function of the state vector at a single time instant:
• Position (GPS, UHF link, DSN)
• Orientation (compass, sun sensor)
• Linear velocity (Doppler radar)

[Figure: block diagram of the Observer (Estimator) and Controller]
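For a measurement that depends on the state at a single time instant, the update follows the standard (extended) Kalman filter equations; as a reminder, in generic notation:

```latex
z_k = h(x_k) + n_k, \qquad
r_k = z_k - h(\hat{x}_{k|k-1}), \qquad
H_k = \left.\frac{\partial h}{\partial x}\right|_{\hat{x}_{k|k-1}}

S_k = H_k P_{k|k-1} H_k^{T} + R_k, \qquad
K_k = P_{k|k-1} H_k^{T} S_k^{-1}

\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k\, r_k, \qquad
P_{k|k} = P_{k|k-1} - K_k S_k K_k^{T}
```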

Indirect Kalman filter – Update (2 time instants)

Measurement is a function of the state vector at more than one time instant: estimated rotational & translational displacement (relative state measurement)
• Visual odometry (mono, stereo)
• Laser scan matching
• Kinematics-based vehicle odometry
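In contrast to the single-instant case, a relative-state measurement is a function of the state at two time instants, so its linearization involves two Jacobians; schematically (generic notation, and the frame/rotation conventions may differ from those in the talk):

```latex
% Measurement relating the poses at times k and k+m
z = h(x_k, x_{k+m}) + n

% Example: relative position of pose k+m expressed in the frame of pose k
{}^{k}p_{k+m} = C^{T}(q_k)\,(p_{k+m} - p_k)

% Linearization requires Jacobians with respect to BOTH states
\tilde{z} \simeq H_{k}\,\tilde{x}_{k} + H_{k+m}\,\tilde{x}_{k+m} + n
```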

Example: Weighted Laser Scan Matching

Relative position and orientation measurement inferred by correlating sensor measurements recorded at 2 separate locations.

State Estimation & Relative Pose Measurements

[Figure: block diagram. Proprioceptive measurements (“continuously”) drive the propagation through one sensor model; exteroceptive measurements (intermittently) drive the updates through additional sensor models.]
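The pattern in the diagram, propagate at the rate of the proprioceptive sensor and update whenever an exteroceptive measurement happens to arrive, can be sketched as a simple loop (hypothetical interface; `propagate`, `propagate_to`, and `update` are placeholder method names, not an API from the talk):

```python
def run_filter(filt, measurement_stream):
    """Propagate continuously, update intermittently (sketch).

    `measurement_stream` is assumed to yield time-ordered (t, sensor, z) tuples.
    """
    for t, sensor, z in measurement_stream:
        if sensor.kind == "proprioceptive":
            # High-rate measurements (e.g. IMU) drive state & covariance propagation.
            filt.propagate(z, t)
        else:
            # Exteroceptive measurements (GPS, camera, laser scans) arrive
            # intermittently and asynchronously; each one triggers an update.
            filt.propagate_to(t)
            filt.update(sensor, z)
    return filt
```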

Previous Approaches I
1. Approximate as higher-order derivatives

Previous Approaches II
2. Approximate as an absolute-state pseudo-measurement [Hoffman, Baumgartner, Huntsberger, Shenker ’99]
3. Estimate relative states instead (2 estimators)


Stochastic Cloning – Kalman Filter (SC-KF)
• Relative state measurement
• Relative state measurement error
• Augmented state vector
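The core idea of stochastic cloning is to augment the state vector with a copy (“clone”) of the state at the instant the first of the two correlated measurements is recorded, so that the correlation between the two poses is carried explicitly in the covariance; schematically, in generic notation:

```latex
% At time k, clone the current state estimate
\check{x}_k = \begin{bmatrix} x_k \\ x_k \end{bmatrix},
\qquad
\check{P}_k = \begin{bmatrix} P_k & P_k \\ P_k & P_k \end{bmatrix}
```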

SC-KF Propagation Equations
• State propagation
• Augmented error state & covariance
• Augmented error state & covariance propagation
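During propagation the cloned state is kept static while the evolving state is propagated normally, which builds up the cross-covariance between the current state and the clone; for one discrete step, in generic notation (Φ_k the state transition matrix, Q_k the discrete-time noise covariance):

```latex
\begin{bmatrix} \tilde{x}_{k+1} \\ \tilde{x}_{k}^{\mathrm{cl}} \end{bmatrix}
=
\begin{bmatrix} \Phi_k & 0 \\ 0 & I \end{bmatrix}
\begin{bmatrix} \tilde{x}_{k} \\ \tilde{x}_{k}^{\mathrm{cl}} \end{bmatrix}
+
\begin{bmatrix} G_k \\ 0 \end{bmatrix} w_k,
\qquad
\check{P}_{k+1}
=
\begin{bmatrix}
\Phi_k P_k \Phi_k^{T} + Q_k & \Phi_k P_{k,\mathrm{cl}} \\
P_{\mathrm{cl},k} \Phi_k^{T} & P_{\mathrm{cl}}
\end{bmatrix}
```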

SC-KF Update Equations

Residual Covariance
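Because the relative-state measurement is linearized with respect to both the current state and the clone, the residual covariance contains cross-correlation terms that a naive (uncorrelated) update would ignore; in generic notation:

```latex
% Stacked measurement Jacobian over the augmented state [x_{k+m}; x_k]
\check{H} = \begin{bmatrix} H_{k+m} & H_{k} \end{bmatrix}

% Residual covariance, including the cross-covariance P_{k+m,k}
S = \check{H}\,\check{P}\,\check{H}^{T} + R
  = H_{k+m} P_{k+m} H_{k+m}^{T}
  + H_{k+m} P_{k+m,k} H_{k}^{T}
  + H_{k} P_{k,k+m} H_{k+m}^{T}
  + H_{k} P_{k} H_{k}^{T} + R
```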

Estimation Block Diagram (Helicopter)

[Figure: block diagram. Sensors (camera, IMU with 3 accelerometers & 3 gyroscopes, laser altimeter) feed the estimators: visual feature tracking (pixel images, distance to features), an inertial sensor integrator, and the Kalman filter (SC-KF), which produce the propagated and updated state estimates.]

Preliminary Results – Experimental Setup

Average absolute errors in p = [x y z]:
• IMU alone: [53.5 464.7 126.1] mm (not shown in the figure due to the magnitude of the errors)
• VISION alone: [17.4 41.4 29.9] mm
• KF, IMU & VISION: [4.5 4.7 4.2] mm

[Photos: simulated planetary surface; helicopter E-Box]

Preliminary Results - W/out sensor fusion

Preliminary Results - Altitude & Bias Estimates

Experiments w/ Mobile Robot I

Wheel Odometry and Weighted Laser Scan Matching

Experiments w/ Mobile Robot I

Total distance: 22.25 m

Average distance errors:
• Odometry: 258 mm
• WLSM: 95 mm
• SC-KF: 77 mm

State Covariance - Simulation

Ongoing & Related Work

• Treat time delays of vision algorithms (e.g. visual odometry): SC²-KF (3 copies of the state)
• Detect kinematics-based odometry errors: slippage estimation
• Smoother – trajectory reconstruction
• Attitude estimation between consecutive stops of the rover

Ongoing Work – Unresolved Issues

Sensor alignment
• Determine the 3D transformation between pairs of sensors
• Must be accurate to correlate sensor measurements without errors
• Tedious & time-consuming process when done manually

Active sensor alignment
• Determine motions that excite all d.o.f. and allow the sensor network on the vehicle body to self-configure

Extensions & Future Work

• Extension to Simultaneous Localization And Mapping (SLAM)
  - Incorporate, update, and enhance previous maps of the area (satellite imagery, EDL)
  - Challenge: computational complexity O(N²)
  - Proposed solution: FWPT compression of the covariance matrix P
• Fault detection and identification: structural damages, sensor failures
• Distributed state estimation: reconfigurable, mobile networks of robots & sensors

Acknowledgements

• DARPA, Tactical Mobile Robot Program (JPL): Cog: Robert Hogg, PI: Larry Matthies
• NASA Ames, IS program (JPL): “Safe & Precise Landing,” Cog: Jim Montgomery, PI: Larry Matthies
• NASA Mars Technology Program (JPL): “Navigation on Slopes,” Cog: Dan Helmick, PI: Larry Matthies; “CLARAty,” PI: Issa Nesnas
• University of Minnesota (UMN), GIA program: PI: Stergios Roumeliotis
• NSF, ITR program (UMN): PI: Nikos Papanikolopoulos
• NSF, Ind./Univ. Cooperative Research Center (UMN): PI: Richard Voyles