
3D Photography: Bundle Adjustment and SLAM

Marc Pollefeys, Torsten Sattler

30 March 2014


Structure-From-Motion

• Two-view initialization:

– 5-Point algorithm (Minimal Solver)

– 8-Point linear algorithm

– 7-Point algorithm

E → (R, t)  (decompose the essential matrix into relative rotation and translation)

Structure-From-Motion

• Triangulation: 3D Points


Incremental Structure-From-Motion

• Subsequent views: Perspective pose estimation


Incremental SfM with Bundle Adjustment

Bundle Adjustment

• Final step in Structure-from-Motion.

• Refine a visual reconstruction to produce jointly optimal 3D structures P and camera poses C.

• Minimize the total re-projection error.

Cost function:

$$\hat{X} = \operatorname*{argmin}_{X} \sum_{i}\sum_{j} z_{ij}^{T} W_{ij}\, z_{ij} = \operatorname*{argmin}_{X} f(X), \qquad z_{ij} = x_{ij} - \hat{x}(C_i, P_j)$$

where $x_{ij}$ is the observed image measurement of point $P_j$ in camera $C_i$, $\hat{x}(C_i, P_j)$ is its predicted projection, $W_{ij}^{-1}$ is the measurement error covariance, and $X = [P, C]$ stacks all point and camera parameters.
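For illustration, a minimal numpy sketch of this cost; the pinhole projection model, the (R, t, f) camera parameterization, and all names here are illustrative assumptions, not the lecture's notation:

```python
import numpy as np

def project(R, t, f, P):
    """Project a 3D point P into a simple pinhole camera (R, t, focal length f)."""
    p_cam = R @ P + t                 # world -> camera coordinates
    return f * p_cam[:2] / p_cam[2]   # perspective division

def reprojection_cost(cameras, points, observations):
    """Weighted BA cost f(X) = sum_ij z_ij^T W_ij z_ij.

    observations: list of (i, j, x_ij, W_ij) with x_ij the measured 2D location
    of 3D point j in camera i and W_ij the inverse measurement covariance.
    """
    cost = 0.0
    for i, j, x_ij, W_ij in observations:
        R, t, f = cameras[i]
        z_ij = x_ij - project(R, t, f, points[j])  # re-projection error
        cost += z_ij @ W_ij @ z_ij
    return cost
```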

Bundle Adjustment

• Minimize the cost function:

1. Gradient Descent

2. Newton Method

3. Gauss-Newton

4. Levenberg-Marquardt

$$\hat{X} = \operatorname*{argmin}_{X} f(X)$$

Bundle Adjustment

1. Gradient Descent

Initialization: $X_k \leftarrow X_0$

Compute the gradient at the current estimate:
$$g = \left.\frac{\partial f(X)}{\partial X}\right|_{X = X_k} = 2\, J^T W z, \qquad J = \frac{\partial z}{\partial X}\;:\;\text{Jacobian}$$

Update with step size $\alpha$:
$$X_{k+1} = X_k - \alpha\, g$$

Iterate until convergence.

Slow convergence near the minimum!
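A minimal sketch of this update rule, assuming user-supplied residual and Jacobian functions (all names are illustrative):

```python
import numpy as np

def gradient_descent(residual_fn, jacobian_fn, W, x0, step=1e-3, iters=100, tol=1e-8):
    """Minimize f(X) = z(X)^T W z(X) by plain gradient descent."""
    x = x0.copy()
    for _ in range(iters):
        z = residual_fn(x)            # stacked residuals z(X)
        J = jacobian_fn(x)            # J = dz/dX
        g = 2.0 * J.T @ W @ z         # gradient of the cost
        x = x - step * g              # descent step with a fixed step size
        if np.linalg.norm(g) < tol:   # stop when the gradient vanishes
            break
    return x
```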

Bundle Adjustment

2. Newton Method

Second-order approximation (quadratic Taylor expansion) around $X_k$:
$$f(X) \approx f(X_k) + g^T \Delta + \tfrac{1}{2}\,\Delta^T H\, \Delta, \qquad \Delta = X - X_k, \qquad H = \left.\frac{\partial^2 f(X)}{\partial X^2}\right|_{X = X_k}\;:\;\text{Hessian matrix}$$

Find the $\Delta$ that minimizes this approximation: differentiating with respect to $\Delta$ and setting the result to zero gives
$$\Delta = -H^{-1} g$$

Update:
$$X_{k+1} = X_k + \Delta$$

Computing $H$ is not trivial, and the method might get stuck at a saddle point!

Bundle Adjustment

3. Gauss-Newton

The Hessian of $f$ can be written as
$$H = 2\left(J^T W J + \sum_{ij} z_{ij}^{T} W_{ij}\, \frac{\partial^2 z_{ij}}{\partial X^2}\right) \;\approx\; 2\, J^T W J$$
dropping the second-order term (small-residual approximation).

The Newton step with this approximation yields the normal equation (the constant factor cancels against the gradient):
$$\left(J^T W J\right) \Delta = -\,J^T W z$$

Update:
$$X_{k+1} = X_k + \Delta$$

Might still get stuck at a saddle point, with slow convergence!
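A minimal Gauss-Newton sketch under these definitions (residual_fn and jacobian_fn are assumed user-supplied, as in the earlier sketch):

```python
import numpy as np

def gauss_newton(residual_fn, jacobian_fn, W, x0, iters=20, tol=1e-10):
    """Gauss-Newton: repeatedly solve (J^T W J) d = -J^T W z and update X."""
    x = x0.copy()
    for _ in range(iters):
        z = residual_fn(x)
        J = jacobian_fn(x)
        A = J.T @ W @ J               # approximate Hessian
        b = -J.T @ W @ z              # negative gradient direction (up to a factor)
        d = np.linalg.solve(A, b)     # normal-equation solve
        x = x + d
        if np.linalg.norm(d) < tol:
            break
    return x
```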

Bundle Adjustment


4. Levenberg-Marquardt

Regularized Gauss-Newton with a damping factor $\lambda$:
$$\underbrace{\left(J^T W J + \lambda I\right)}_{H_{LM}} \Delta = -\,J^T W z$$

• $\lambda \to 0$: Gauss-Newton (when convergence is rapid)
• $\lambda \to \infty$: gradient descent (when convergence is slow)
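A minimal Levenberg-Marquardt loop with the common damping heuristic (increase $\lambda$ on a rejected step, decrease it on success); an illustrative sketch, not the solver used in the lecture:

```python
import numpy as np

def levenberg_marquardt(residual_fn, jacobian_fn, W, x0, lam=1e-3, iters=50):
    """Damped Gauss-Newton: solve (J^T W J + lam*I) d = -J^T W z."""
    x = x0.copy()
    z = residual_fn(x)
    cost = z @ W @ z
    for _ in range(iters):
        J = jacobian_fn(x)
        A = J.T @ W @ J
        b = -J.T @ W @ z
        d = np.linalg.solve(A + lam * np.eye(A.shape[0]), b)
        x_new = x + d
        z_new = residual_fn(x_new)
        cost_new = z_new @ W @ z_new
        if cost_new < cost:           # step accepted: move towards Gauss-Newton
            x, z, cost = x_new, z_new, cost_new
            lam *= 0.1
        else:                         # step rejected: move towards gradient descent
            lam *= 10.0
    return x
```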

Structure of the Jacobian and Hessian Matrices

• The Jacobian and Hessian are sparse matrices, since each 3D structure point is observed in only a few cameras.

Solving the Normal Equation

• Schur Complement

The damped normal equations $H_{LM}\,\Delta = \varepsilon$ (with $\varepsilon = -J^T W z$) have a block structure: the unknowns split into 3D structure parameters and camera parameters.

$$\begin{bmatrix} H_S & H_{SC} \\ H_{SC}^T & H_C \end{bmatrix} \begin{bmatrix} \Delta_S \\ \Delta_C \end{bmatrix} = \begin{bmatrix} \varepsilon_S \\ \varepsilon_C \end{bmatrix}$$

Multiply both sides by
$$\begin{bmatrix} I & 0 \\ -H_{SC}^T H_S^{-1} & I \end{bmatrix}$$
to obtain
$$\begin{bmatrix} H_S & H_{SC} \\ 0 & H_C - H_{SC}^T H_S^{-1} H_{SC} \end{bmatrix} \begin{bmatrix} \Delta_S \\ \Delta_C \end{bmatrix} = \begin{bmatrix} \varepsilon_S \\ \varepsilon_C - H_{SC}^T H_S^{-1} \varepsilon_S \end{bmatrix}$$

First solve for $\Delta_C$ from
$$\left(H_C - H_{SC}^T H_S^{-1} H_{SC}\right)\Delta_C = \varepsilon_C - H_{SC}^T H_S^{-1}\,\varepsilon_S$$

• The left-hand matrix is the Schur complement: sparse and symmetric positive definite.
• $H_S$ is block diagonal, so it is easy to invert.
• Then solve for $\Delta_S$ by back-substitution.
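A dense numpy sketch of this elimination step (in a real BA solver $H_S^{-1}$ is applied block by block and everything stays sparse; this only illustrates the algebra):

```python
import numpy as np

def schur_solve(H_S, H_SC, H_C, eps_S, eps_C):
    """Solve the partitioned normal equations by eliminating the structure block.

    [H_S    H_SC] [d_S]   [eps_S]
    [H_SC^T H_C ] [d_C] = [eps_C]
    """
    H_S_inv = np.linalg.inv(H_S)                 # block diagonal in real BA
    S = H_C - H_SC.T @ H_S_inv @ H_SC            # Schur complement
    rhs = eps_C - H_SC.T @ H_S_inv @ eps_S
    d_C = np.linalg.solve(S, rhs)                # reduced camera system
    d_S = H_S_inv @ (eps_S - H_SC @ d_C)         # back-substitution for the structure
    return d_S, d_C
```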

Solving the Normal Equation

• Sparse matrix factorization:
  1. LU factorization: $A = LU$
  2. QR factorization: $A = QR$
  3. Cholesky factorization: $A = LL^T$
• Iterative methods:
  1. Conjugate gradient
  2. Gauss-Seidel

The reduced camera system
$$\left(H_C - H_{SC}^T H_S^{-1} H_{SC}\right)\Delta_C = \varepsilon_C - H_{SC}^T H_S^{-1}\,\varepsilon_S$$
has the form $Ax = b$. Because $A$ is sparse, it is solved without explicitly inverting $A$: factorize $A$ and obtain $x$ by forward/backward substitution, or use an iterative method.
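As an illustration with scipy, one can factorize once and back-substitute, or run conjugate gradient on the (symmetric positive definite) system; the small matrix below is only a stand-in for the reduced camera system:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Example sparse SPD system A x = b (stand-in for the reduced camera system).
A = sp.csc_matrix(np.array([[4.0, 1.0, 0.0],
                            [1.0, 3.0, 0.0],
                            [0.0, 0.0, 2.0]]))
b = np.array([1.0, 2.0, 3.0])

# Direct: sparse LU factorization, then forward/backward substitution.
lu = spla.splu(A)
x_direct = lu.solve(b)

# Iterative: conjugate gradient (no factorization, only matrix-vector products).
x_cg, info = spla.cg(A, b, atol=1e-10)
```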


Problem of Fill-In

• Reorder the sparse matrix to minimize fill-in.
• Finding the optimal ordering is an NP-complete problem.
• Approximate solutions:
  1. Minimum degree
  2. Column approximate minimum degree permutation
  3. Reverse Cuthill-McKee
  4. …

$$\left(P^T A\, P\right)\left(P^T x\right) = P^T b$$
where $P$ is a permutation matrix used to reorder $A$ (see the sketch below).
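For illustration (not from the lecture), scipy ships one of these heuristics, reverse Cuthill-McKee, which can be used to permute a sparse symmetric matrix before factorization:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

# A small sparse symmetric matrix standing in for the (reduced) Hessian.
A = sp.csr_matrix(np.array([[4.0, 0.0, 1.0, 0.0],
                            [0.0, 3.0, 0.0, 1.0],
                            [1.0, 0.0, 5.0, 0.0],
                            [0.0, 1.0, 0.0, 2.0]]))

perm = reverse_cuthill_mckee(A, symmetric_mode=True)  # ordering that reduces bandwidth/fill-in
A_perm = A[perm, :][:, perm]                          # P^T A P with the permutation applied
```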


Robust Cost Function

• Non-linear least squares:
$$\hat{X} = \operatorname*{argmin}_X \sum_{ij} z_{ij}^T W_{ij}\, z_{ij}$$

• This is the maximum log-likelihood solution (with a uniform prior on $X$):
$$\operatorname*{argmax}_X\, p(X \mid Z) \;=\; \operatorname*{argmin}_X\, -\ln p(X \mid Z) \;=\; \operatorname*{argmin}_X\, -\ln p(Z \mid X) \;=\; \operatorname*{argmin}_X\, -\ln \prod_{ij} c_{ij} \exp\!\left(-z_{ij}^T W_{ij}\, z_{ij}\right) \;=\; \operatorname*{argmin}_X \sum_{ij} z_{ij}^T W_{ij}\, z_{ij}$$

• Assume that:
  1. The measurement errors $z_{ij}$ are Gaussian-distributed random variables.
  2. All observations are independent.

Robust Cost Function

• The Gaussian distribution assumption does not hold in the presence of outliers!
• This causes convergence to wrong solutions.

Robust Cost Function

$$\hat{X} \;=\; \operatorname*{argmin}_X \sum_{ij} \rho\!\left(z_{ij}\right) \;=\; \operatorname*{argmin}_X \sum_{ij} z_{ij}^T S_{ij}\, z_{ij}, \qquad S_{ij} = \epsilon_{ij}\, W_{ij}$$

Robust cost function: the weight $W_{ij}$ is scaled with the attenuating factor $\epsilon_{ij}$.

• Similar to iteratively re-weighted least squares.
• The weight is iteratively rescaled with the attenuating factor $\epsilon_{ij}$.
• The attenuating factor is computed based on the current error (see the sketch below).

Robust Cost Function

(Figure: the cost $\rho(\cdot)$ and influence function $\psi(\cdot)$ for the Gaussian and the Cauchy distribution. The Cauchy cost has reduced influence from high errors; under the Gaussian cost, high errors keep their full influence.)
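A minimal IRLS-style sketch using a Cauchy-type attenuation (the specific weight formula and constant are a common choice, not necessarily the lecture's; unit measurement covariance is assumed for brevity):

```python
import numpy as np

def cauchy_weight(z, c=2.3849):
    """Attenuating factor for a residual vector z under a Cauchy robust cost."""
    r2 = float(z @ z)
    return 1.0 / (1.0 + r2 / c**2)    # large residuals -> small weight

def irls(residual_fn, jacobian_fn, x0, iters=10):
    """Iteratively re-weighted least squares with per-residual Cauchy attenuation."""
    x = x0.copy()
    for _ in range(iters):
        residuals = residual_fn(x)    # list of per-observation residual vectors z_ij
        jacobians = jacobian_fn(x)    # list of matching Jacobian blocks dz_ij/dX
        A = np.zeros((x.size, x.size))
        b = np.zeros(x.size)
        for z, J in zip(residuals, jacobians):
            w = cauchy_weight(z)      # attenuation from the current error
            A += w * J.T @ J          # re-weighted normal equations
            b += -w * J.T @ z
        x = x + np.linalg.solve(A, b)
    return x
```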

Robust Cost Function

With the Cauchy cost, outliers are properly taken into account (down-weighted)!

State-of-the-Art Solvers

• Google Ceres:

– http://ceres-solver.org/

• g2o:

– https://openslam.org/g2o.html

• GTSAM:

– https://collab.cc.gatech.edu/borg/gtsam/


Simultaneous Localization and Mapping (SLAM)

• A robot estimates its own pose while acquiring a map model of its environment.

• Chicken-and-Egg problem:

– Map is needed for localization.

– Pose is needed for mapping.


Full SLAM: Problem Definition

(Figure: graphical model with robot poses, control actions $u_1, u_2, u_3$, observations, and map landmarks.)

$$\operatorname*{argmax}_{X,L}\; p(X, L \mid Z, U) \;=\; \operatorname*{argmax}_{X,L}\; p(x_0) \prod_{i=1}^{M} p(x_i \mid x_{i-1}, u_i) \prod_{k=1}^{K} p(z_k \mid x_{i_k}, l_{j_k})$$

Simultaneous Localization and Mapping (SLAM)

Negative log-likelihood:
$$\operatorname*{argmax}_{X,L}\; p(X, L \mid Z, U) \;=\; \operatorname*{argmin}_{X,L}\; -\sum_{i=1}^{M} \ln p(x_i \mid x_{i-1}, u_i) \;-\; \sum_{k=1}^{K} \ln p(z_k \mid x_{i_k}, l_{j_k})$$

Likelihoods:

Process model:
$$p(x_i \mid x_{i-1}, u_i) \;\propto\; \exp\!\left\{-\left\lVert f(x_{i-1}, u_i) - x_i \right\rVert^2_{\Lambda_i}\right\}$$

Measurement model:
$$p(z_k \mid x_{i_k}, l_{j_k}) \;\propto\; \exp\!\left\{-\left\lVert h(x_{i_k}, l_{j_k}) - z_k \right\rVert^2_{\Gamma_k}\right\}$$

with process noise covariance $\Lambda_i$ and measurement noise covariance $\Gamma_k$.

Simultaneous Localization and Mapping (SLAM)

Putting the likelihoods into the equation:
$$\operatorname*{argmax}_{X,L}\; p(X, L \mid Z, U) \;=\; \operatorname*{argmin}_{X,L}\; \sum_{i=1}^{M} \left\lVert f(x_{i-1}, u_i) - x_i \right\rVert^2_{\Lambda_i} + \sum_{k=1}^{K} \left\lVert h(x_{i_k}, l_{j_k}) - z_k \right\rVert^2_{\Gamma_k}$$

The minimization can be done with Levenberg-Marquardt, just as in bundle adjustment!
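A toy sketch of this formulation for a 1D robot with 1D landmarks (the models f and h, the data, and the use of scipy's generic least-squares solver are illustrative assumptions; whitening by the noise covariances is omitted for brevity):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy 1D SLAM: the state vector packs robot positions x_0..x_M and landmark positions l_0..l_N.
M, N = 3, 2
u = np.array([1.0, 1.0, 1.0])             # odometry (control actions)
obs = [(1, 0, 2.0), (2, 0, 1.0), (2, 1, 3.0), (3, 1, 2.0)]  # (pose idx, landmark idx, range)

def residuals(theta):
    x, l = theta[:M + 1], theta[M + 1:]
    res = [x[0]]                          # prior: anchor the first pose at 0
    for i in range(1, M + 1):             # process model f(x_{i-1}, u_i) = x_{i-1} + u_i
        res.append(x[i - 1] + u[i - 1] - x[i])
    for i, j, z in obs:                   # measurement model h(x_i, l_j) = l_j - x_i
        res.append(l[j] - x[i] - z)
    return np.array(res)

theta0 = np.zeros(M + 1 + N)
sol = least_squares(residuals, theta0)    # LM/Gauss-Newton-style solver on stacked residuals
```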

Simultaneous Localization and Mapping (SLAM)

Normal equations (as in bundle adjustment):
$$\left(J^T W J + \lambda I\right)\Delta = -\,J^T W z$$

• The Jacobian $J$ is made up of the blocks $\partial f/\partial x$, $\partial f/\partial u$, $\partial h/\partial x$, $\partial h/\partial l$.
• The weight matrix $W$ is made up of the corresponding noise terms $\Lambda_i$ and $\Gamma_k$.
• Can be solved with sparse matrix factorization or iterative methods.

Online SLAM: Problem Definition

• Estimate the current pose $x_t$ and the full map $L$; previous poses are marginalized out:
$$p(x_t, L \mid Z, U) = \int\!\!\cdots\!\!\int p(X, L \mid Z, U)\; dx_1\, dx_2 \cdots dx_{t-1}$$
• Inference with:
  1. Kalman Filtering (EKF SLAM)
  2. Particle Filtering (FastSLAM)

EKF SLAM

• Assumes the pose $x_t$ and the map $L$ are random variables that follow a Gaussian distribution.
• Hence,
$$p(x_t, L \mid Z, U) \sim \mathcal{N}(\mu, \Sigma)$$
with mean $\mu$ and error covariance $\Sigma$.

EKF SLAM

Prediction:
$$\bar{\mu}_t = f(u_t, \mu_{t-1}) \qquad \text{(process model)}$$
$$\bar{\Sigma}_t = F_t\, \Sigma_{t-1} F_t^T + R_t \qquad \text{(error propagation with process noise)}$$

Correction:
$$K_t = \bar{\Sigma}_t H_t^T \left(H_t \bar{\Sigma}_t H_t^T + Q_t\right)^{-1} \qquad \text{(Kalman gain)}$$
$$y_t = z_t - h(\bar{\mu}_t) \qquad \text{(measurement residual / innovation)}$$
$$\mu_t = \bar{\mu}_t + K_t\, y_t \qquad \text{(update mean)}$$
$$\Sigma_t = \left(I - K_t H_t\right) \bar{\Sigma}_t \qquad \text{(update covariance)}$$

Jacobians:
$$F_t = \left.\frac{\partial f(u_t, x)}{\partial x}\right|_{\mu_{t-1}} \;\text{(process Jacobian)}, \qquad H_t = \left.\frac{\partial h(x)}{\partial x}\right|_{\bar{\mu}_t} \;\text{(measurement Jacobian)}$$

Structure of Mean and Covariance

$$\mu_t = \begin{bmatrix} x & y & \theta & l_{1x} & l_{1y} & \cdots & l_{Nx} & l_{Ny} \end{bmatrix}^T, \qquad
\Sigma_t = \begin{bmatrix} \Sigma_{xx} & \Sigma_{xL} \\ \Sigma_{Lx} & \Sigma_{LL} \end{bmatrix}$$

where $\Sigma_{xx}$ is the robot-pose covariance, $\Sigma_{LL}$ the covariance of the $N$ landmarks, and $\Sigma_{xL} = \Sigma_{Lx}^T$ the cross-correlations between the robot pose and the landmarks.

The covariance is a dense matrix that grows with the number of map features!

The true robot and map states might not follow a unimodal Gaussian distribution!

Particle Filtering: FastSLAM

• Particles represent samples from the posterior distribution $p(x_t, L \mid Z, U)$.
• $p(x_t, L \mid Z, U)$ can be any distribution (not necessarily Gaussian).

FastSLAM

Each particle $m$ represents the robot state and one mean and covariance per landmark:
$$p_t^m = \left\{\, x_t^m,\; \mu_{1,t}^m, \Sigma_{1,t}^m,\; \mu_{2,t}^m, \Sigma_{2,t}^m,\; \ldots,\; \mu_{N,t}^m, \Sigma_{N,t}^m \,\right\}$$

1. Sample the robot state from the process model: $x_t^m \sim p(x_t \mid x_{t-1}^m, u_t)$
2. Update the landmark estimates $p(l_{n,t}^m \mid x_t^m, z_t)$ with $N$ independent Kalman filter measurement updates.
3. Weight update: $w_t^m \propto p(z_t \mid L_t^m, x_t^m)$
4. Re-sampling.
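A heavily simplified sketch of one FastSLAM step for a 2D point robot that directly observes the offset of a known landmark (known data association and a linear measurement model, so each per-landmark Kalman update is exact); the models, noise values, and data structures are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def fastslam_step(particles, u, obs, R=0.1, Q=0.05):
    """One FastSLAM step. particles: list of dicts with 'x' (2D position),
    'landmarks' (id -> (mean, cov)), 'w' (weight). obs: list of (landmark_id, z)
    where z is the measured offset landmark - robot."""
    for p in particles:
        # 1. Sample the robot state from the process model.
        p['x'] = p['x'] + u + rng.normal(0.0, np.sqrt(R), size=2)
        for lid, z in obs:
            if lid not in p['landmarks']:
                # Initialize a new landmark filter from the first observation.
                p['landmarks'][lid] = (p['x'] + z, Q * np.eye(2))
                continue
            mu, S = p['landmarks'][lid]
            # 2. Per-landmark Kalman update (H = I because z = l - x + noise).
            y = z - (mu - p['x'])                      # innovation
            cov = S + Q * np.eye(2)                    # innovation covariance
            K = S @ np.linalg.inv(cov)                 # Kalman gain
            p['landmarks'][lid] = (mu + K @ y, (np.eye(2) - K) @ S)
            # 3. Weight update with the measurement likelihood (unnormalized).
            p['w'] *= np.exp(-0.5 * y @ np.linalg.solve(cov, y))
    # 4. Re-sampling proportional to the weights.
    w = np.array([p['w'] for p in particles])
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    new = []
    for i in idx:
        src = particles[i]
        new.append({'x': src['x'].copy(),
                    'landmarks': {k: (m.copy(), C.copy()) for k, (m, C) in src['landmarks'].items()},
                    'w': 1.0})
    return new
```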

FastSLAM

• Many particles needed for accurate results.

• Computationally expensive for high state dimensions.


Pose-Graph SLAM

$$\hat{X} = \operatorname*{argmin}_X \sum_{ij} \left\lVert z_{ij} - h(v_i, v_j) \right\rVert^2_{\Sigma_{ij}}$$

• 3D structures are removed (held fixed).
• The constraints $z_{ij}$ are relative pose estimates between nodes $v_i$ and $v_j$, obtained from the 3D structure.
• Minimizes loop-closure errors.
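An illustrative 2D pose-graph residual (poses as (x, y, θ)); here $h(v_i, v_j)$ is taken to be the pose of $v_j$ expressed in the frame of $v_i$, and the sketch evaluates the unweighted error of a single edge:

```python
import numpy as np

def relative_pose(vi, vj):
    """h(v_i, v_j): pose of v_j expressed in the frame of v_i, for 2D poses (x, y, theta)."""
    dx, dy = vj[0] - vi[0], vj[1] - vi[1]
    c, s = np.cos(vi[2]), np.sin(vi[2])
    return np.array([ c * dx + s * dy,
                     -s * dx + c * dy,
                      vj[2] - vi[2]])

def edge_error(z_ij, vi, vj):
    """Residual z_ij - h(v_i, v_j) with the angular component wrapped to (-pi, pi]."""
    e = z_ij - relative_pose(vi, vj)
    e[2] = (e[2] + np.pi) % (2 * np.pi) - np.pi   # wrap the angle difference
    return e
```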

Conclusion

• Bundle Adjustment

– See also Triggs et al., "Bundle Adjustment – A Modern Synthesis", Vision Algorithms: Theory and Practice, LNCS 1883, Springer, 2000

• Simultaneous Localization and Mapping

– Full SLAM

– Online SLAM

• EKF SLAM

• FastSLAM

• Pose-Graph SLAM


Next Lectures

• Next week: Easter

• 13.04.: Project Updates (you!)

• 20.04.: Multi-view Stereo & Volumetric Modeling
