Page 1: Motion Estimation I

Motion Estimation (II)

Ce Liu

[email protected]

Microsoft Research New England

Page 2: Motion Estimation I

Last time

• Motion perception

• Motion representation

• Parametric motion: Lucas-Kanade

• Dense optical flow: Horn-Schunck

Lucas-Kanade:

$$\begin{bmatrix} du \\ dv \end{bmatrix} = -\begin{bmatrix} \mathbf{I}_x^T\mathbf{I}_x & \mathbf{I}_x^T\mathbf{I}_y \\ \mathbf{I}_x^T\mathbf{I}_y & \mathbf{I}_y^T\mathbf{I}_y \end{bmatrix}^{-1} \begin{bmatrix} \mathbf{I}_x^T\mathbf{I}_t \\ \mathbf{I}_y^T\mathbf{I}_t \end{bmatrix}$$

Horn-Schunck:

$$\begin{bmatrix} \mathbf{I}_x^2 + \alpha\mathbf{L} & \mathbf{I}_x\mathbf{I}_y \\ \mathbf{I}_x\mathbf{I}_y & \mathbf{I}_y^2 + \alpha\mathbf{L} \end{bmatrix} \begin{bmatrix} U \\ V \end{bmatrix} = -\begin{bmatrix} \mathbf{I}_x\mathbf{I}_t \\ \mathbf{I}_y\mathbf{I}_t \end{bmatrix}$$

Page 3: Motion Estimation I

Who are they?

Berthold K. P. Horn Takeo Kanade

Page 4: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Other representations

Page 5: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Other representations

Page 6: Motion Estimation I

Spatial regularity

• Horn-Schunck is a Gaussian Markov random field (GMRF)

$$\iint \big(I_x u + I_y v + I_t\big)^2 + \alpha\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

• Spatial over-smoothness is caused by the quadratic smoothness term

• Nevertheless, real optical flow fields are sparse!

[Figure: histograms of $u$, $u_x$, $u_y$ and $v$, $v_x$, $v_y$]

Page 7: Motion Estimation I

Data term

• Horn-Schunck is a Gaussian Markov random field (GMRF)

$$\iint \big(I_x u + I_y v + I_t\big)^2 + \alpha\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

• Quadratic data term implies Gaussian white noise

• In reality, however, the difference between two corresponding pixels is caused by:

– Noise (majority)

– Occlusion

– Compression error

– Lighting change

– …

• The error function needs to account for these factors

Page 8: Motion Estimation I

Noise model

• Explicitly model the noise 𝑛

$$I_2(x + u, y + v) = I_1(x, y) + n$$

• It can be a mixture of two Gaussians, inlier and outlier:

$$n \sim \lambda\, N(0, \sigma_i^2) + (1 - \lambda)\, N(0, \sigma_o^2)$$
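As a rough numerical sanity check, the mixture model above can be sampled directly. The mixture weight and the two standard deviations below are illustrative assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mixture_noise(n, lam=0.9, sigma_in=0.05, sigma_out=1.0):
    """Draw n samples from lam*N(0, sigma_in^2) + (1-lam)*N(0, sigma_out^2)."""
    inlier = rng.random(n) < lam                  # pick a component per sample
    scale = np.where(inlier, sigma_in, sigma_out)
    return rng.normal(0.0, 1.0, n) * scale

noise = sample_mixture_noise(10000)
```

Most draws come from the narrow inlier Gaussian, but the occasional outlier draw makes the tails far heavier than a single Gaussian's.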

[Figure: PDF (top row) and potential, i.e. negative log-likelihood (bottom row), of the inlier distribution, the outlier distribution, and the mixture ($\lambda = 0.9$)]

Page 9: Motion Estimation I

More components in the mixture

• Consider a Gaussian mixture model

$$n \sim \frac{1}{Z} \sum_{k=1}^{K} \xi^k\, N(0, k s^2)$$

• Varying the decaying rate 𝜉, we obtain a variety of potential functions

[Figure: potential functions of the mixture for $\xi = 0.1$ and $\xi = 0.9$]

Page 10: Motion Estimation I

Typical error functions

• L2 norm: $\rho(z) = z^2$

• L1 norm: $\rho(z) = |z|$

• Truncated L1 norm: $\rho(z) = \min(|z|, \eta)$

• Lorentzian: $\rho(z) = \log(1 + \gamma z^2)$

[Figure: plots of the four error functions]

Page 11: Motion Estimation I

Robust statistics

• Traditional L2 norm: only noise, no outlier

• Example: estimate the average of 0.95, 1.04, 0.91, 1.02, 1.10, 20.01

• Estimate with minimum error

$$z^* = \arg\min_z \sum_i \rho(z - z_i)$$

– L2 norm: 𝑧∗ = 4.172

– L1 norm: 𝑧∗ = 1.038

– Truncated L1: 𝑧∗ = 1.0296

– Lorentzian: 𝑧∗ = 1.0147
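This toy experiment is easy to reproduce by brute-force grid search over $z$. The truncation threshold and Lorentzian parameter below are assumed values (the slides do not state them), so those two minimizers come out close to, but not exactly at, the slide's numbers:

```python
import numpy as np

data = np.array([0.95, 1.04, 0.91, 1.02, 1.10, 20.01])
z = np.linspace(0.0, 25.0, 25001)            # candidate estimates, step 1e-3

def estimate(rho):
    # z* = argmin_z sum_i rho(z - z_i), by grid search
    cost = rho(z[:, None] - data[None, :]).sum(axis=1)
    return z[np.argmin(cost)]

z_l2 = estimate(lambda r: r ** 2)
z_l1 = estimate(np.abs)
z_trunc = estimate(lambda r: np.minimum(np.abs(r), 1.0))   # eta = 1.0 (assumed)
z_lor = estimate(lambda r: np.log(1 + 5.0 * r ** 2))       # gamma = 5.0 (assumed)
```

The L2 estimate is dragged toward the outlier 20.01, while the three robust estimates all stay near the inlier cluster around 1.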

[Figure: the four error functions — L2 norm $\rho(z) = z^2$, L1 norm $\rho(z) = |z|$, truncated L1 norm $\rho(z) = \min(|z|, \eta)$, Lorentzian $\rho(z) = \log(1 + \gamma z^2)$]

Page 12: Motion Estimation I

The family of robust power functions

• Can we directly use the L1 norm $\psi(z) = |z|$?
– Its derivative is not continuous at $z = 0$

• Alternative forms

– L1 norm: $\psi(z^2) = \sqrt{z^2 + \varepsilon^2}$

– Sub-L1: $\psi(z^2; \eta) = (z^2 + \varepsilon^2)^{\eta}$, $\eta < 0.5$
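These smooth penalties are trivial to implement. A sketch with an assumed $\varepsilon$, checking that the differentiable L1 stays within $\varepsilon$ of $|z|$:

```python
import numpy as np

eps = 1e-3  # smoothing parameter (assumed value)

def psi_l1(z2):
    # differentiable L1 penalty: psi(z^2) = sqrt(z^2 + eps^2)
    return np.sqrt(z2 + eps ** 2)

def psi_sub(z2, eta=0.4):
    # sub-L1 penalty: psi(z^2; eta) = (z^2 + eps^2)^eta with eta < 0.5
    return (z2 + eps ** 2) ** eta

z = np.linspace(-1.0, 1.0, 2001)
smooth_l1 = psi_l1(z ** 2)   # close to |z|, but with a continuous derivative at 0
```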

[Figure: $|z|$ vs. $\sqrt{z^2 + \varepsilon^2}$ (left); sub-L1 penalties for $\eta = 0.5, 0.4, 0.3, 0.2$ (right)]

Page 13: Motion Estimation I

Modification to Horn-Schunck

• Let $\mathbf{x} = (x, y, t)$, and $\mathbf{w}(\mathbf{x}) = \big(u(\mathbf{x}), v(\mathbf{x}), 1\big)$ be the flow vector

• Horn-Schunck (recall)

$$\iint \big(I_x u + I_y v + I_t\big)^2 + \alpha\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

• Robust estimation

$$\iint \psi\big(|I(\mathbf{x} + \mathbf{w}) - I(\mathbf{x})|^2\big) + \alpha\,\phi\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

• Robust estimation with Lucas-Kanade

$$\iint g * \psi\big(|I(\mathbf{x} + \mathbf{w}) - I(\mathbf{x})|^2\big) + \alpha\,\phi\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

Page 14: Motion Estimation I

A unifying framework

• The robust objective function

$$\iint g * \psi\big(|I(\mathbf{x} + \mathbf{w}) - I(\mathbf{x})|^2\big) + \alpha\,\phi\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

– Lucas-Kanade: $\alpha = 0$, $\psi(z^2) = z^2$

– Robust Lucas-Kanade: $\alpha = 0$, $\psi(z^2) = \sqrt{z^2 + \varepsilon^2}$

– Horn-Schunck: $g = 1$, $\psi(z^2) = z^2$, $\phi(z^2) = z^2$

• One can also learn the filters (other than gradients), and robust function 𝜓 ⋅ , 𝜙(⋅) [Roth & Black 2005]

Page 15: Motion Estimation I

Derivation strategies

• Euler-Lagrange
– Derive in the continuous domain, discretize in the end
– Nonlinear PDEs
– Outer and inner fixed-point iterations
– Limited to derivative filters; cannot generalize to arbitrary filters

• Energy minimization
– Discretize first and derive in matrix form
– Easy to understand and derive
– Iteratively reweighted least squares (IRLS)

• Variational optimization

• Euler-Lagrange = Variational optimization = IRLS

Page 16: Motion Estimation I

Iteratively reweighted least squares (IRLS)

• Let $\phi(z^2) = (z^2 + \varepsilon^2)^{\eta}$ be a robust function

• We want to minimize the objective function

$$\Phi(\mathbf{A}x + b) = \sum_{i=1}^{n} \phi\big((a_i^T x + b_i)^2\big)$$

where $x \in \mathbb{R}^d$, $\mathbf{A} = [a_1\; a_2 \cdots a_n]^T \in \mathbb{R}^{n \times d}$, $b \in \mathbb{R}^n$

• By setting $\frac{\partial \Phi}{\partial x} = 0$, we can derive

$$\frac{\partial \Phi}{\partial x} \propto \sum_{i=1}^{n} \phi'\big((a_i^T x + b_i)^2\big)\,(a_i^T x + b_i)\, a_i = \sum_{i=1}^{n} w_{ii}\, a_i^T x\, a_i + w_{ii}\, b_i\, a_i = \mathbf{A}^T \mathbf{W} \mathbf{A} x + \mathbf{A}^T \mathbf{W} b$$

where $w_{ii} = \phi'\big((a_i^T x + b_i)^2\big)$ and $\mathbf{W} = \mathrm{diag}\big(\Phi'(\mathbf{A}x + b)\big)$

Page 17: Motion Estimation I

Iteratively reweighted least squares (IRLS)

• Derivative: $\frac{\partial \Phi}{\partial x} = \mathbf{A}^T\mathbf{W}\mathbf{A}x + \mathbf{A}^T\mathbf{W}b = 0$

• Iterate between reweighting and least squares

• Convergence to a local minimum is guaranteed

1. Initialize 𝑥 = 𝑥0

2. Compute weight matrix 𝐖 = diag Φ′(𝐀𝑥 + 𝑏)

3. Solve the linear system 𝐀𝑇𝐖𝐀𝑥 = −𝐀𝑇𝐖𝑏

4. If 𝑥 converges, return; otherwise, go to 2
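The four steps above translate almost line for line into code. A minimal sketch, assuming the sub-L1 robust function $\phi(z^2) = (z^2 + \varepsilon^2)^{\eta}$ from earlier, applied here to the robust-average toy problem:

```python
import numpy as np

def irls(A, b, eta=0.4, eps=1e-3, iters=100):
    """Minimize sum_i phi((a_i^T x + b_i)^2) with phi(t) = (t + eps^2)^eta."""
    n, d = A.shape
    x = np.zeros(d)                                   # step 1: initialize
    for _ in range(iters):
        r2 = (A @ x + b) ** 2
        w = eta * (r2 + eps ** 2) ** (eta - 1.0)      # step 2: w_ii = phi'(r_i^2)
        AtW = A.T * w                                 # scale columns of A^T by w_ii
        x_new = np.linalg.solve(AtW @ A, -(AtW @ b))  # step 3: weighted least squares
        if np.linalg.norm(x_new - x) < 1e-10:         # step 4: check convergence
            return x_new
        x = x_new
    return x

# Robust "average" of data with one outlier: residual is x - d_i
data = np.array([0.95, 1.04, 0.91, 1.02, 1.10, 20.01])
x_robust = irls(np.ones((6, 1)), -data)
```

Each reweighting step down-weights samples with large residuals, so the outlier's influence shrinks as the iterations proceed; the result lands near the inlier cluster instead of the contaminated mean.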

Page 18: Motion Estimation I

IRLS for robust optical flow

• Objective function

$$\iint g * \psi\big(|I(\mathbf{x} + \mathbf{w}) - I(\mathbf{x})|^2\big) + \alpha\,\phi\big(\|\nabla u\|^2 + \|\nabla v\|^2\big)\, dx\, dy$$

• Discretize, linearize and increment

$$\sum_{x,y} g * \psi\big((I_t + I_x\, du + I_y\, dv)^2\big) + \alpha\,\phi\big(\|\nabla(u + du)\|^2 + \|\nabla(v + dv)\|^2\big)$$

• IRLS (initialize $du = dv = 0$)

– Reweight:

$$\boldsymbol{\Psi}'_{xx} = \mathrm{diag}\big(g * \psi' \mathbf{I}_x\mathbf{I}_x\big),\quad \boldsymbol{\Psi}'_{xy} = \mathrm{diag}\big(g * \psi' \mathbf{I}_x\mathbf{I}_y\big),\quad \boldsymbol{\Psi}'_{yy} = \mathrm{diag}\big(g * \psi' \mathbf{I}_y\mathbf{I}_y\big),$$
$$\boldsymbol{\Psi}'_{xt} = \mathrm{diag}\big(g * \psi' \mathbf{I}_x\mathbf{I}_t\big),\quad \boldsymbol{\Psi}'_{yt} = \mathrm{diag}\big(g * \psi' \mathbf{I}_y\mathbf{I}_t\big),\quad \mathbf{L} = \mathbf{D}_x^T \boldsymbol{\Phi}' \mathbf{D}_x + \mathbf{D}_y^T \boldsymbol{\Phi}' \mathbf{D}_y$$

– Least squares:

$$\begin{bmatrix} \boldsymbol{\Psi}'_{xx} + \alpha\mathbf{L} & \boldsymbol{\Psi}'_{xy} \\ \boldsymbol{\Psi}'_{xy} & \boldsymbol{\Psi}'_{yy} + \alpha\mathbf{L} \end{bmatrix} \begin{bmatrix} dU \\ dV \end{bmatrix} = -\begin{bmatrix} \boldsymbol{\Psi}'_{xt} + \alpha\mathbf{L}U \\ \boldsymbol{\Psi}'_{yt} + \alpha\mathbf{L}V \end{bmatrix}$$

Page 19: Motion Estimation I

What’s changed?

• Optical flow with robust function

$$\boldsymbol{\Psi}'_{xx} = \mathrm{diag}\big(g * \psi' \mathbf{I}_x\mathbf{I}_x\big),\quad \boldsymbol{\Psi}'_{xy} = \mathrm{diag}\big(g * \psi' \mathbf{I}_x\mathbf{I}_y\big),\quad \boldsymbol{\Psi}'_{yy} = \mathrm{diag}\big(g * \psi' \mathbf{I}_y\mathbf{I}_y\big),$$
$$\boldsymbol{\Psi}'_{xt} = \mathrm{diag}\big(g * \psi' \mathbf{I}_x\mathbf{I}_t\big),\quad \boldsymbol{\Psi}'_{yt} = \mathrm{diag}\big(g * \psi' \mathbf{I}_y\mathbf{I}_t\big),\quad \mathbf{L} = \mathbf{D}_x^T \boldsymbol{\Phi}' \mathbf{D}_x + \mathbf{D}_y^T \boldsymbol{\Phi}' \mathbf{D}_y$$

$$\begin{bmatrix} \boldsymbol{\Psi}'_{xx} + \alpha\mathbf{L} & \boldsymbol{\Psi}'_{xy} \\ \boldsymbol{\Psi}'_{xy} & \boldsymbol{\Psi}'_{yy} + \alpha\mathbf{L} \end{bmatrix} \begin{bmatrix} dU \\ dV \end{bmatrix} = -\begin{bmatrix} \boldsymbol{\Psi}'_{xt} + \alpha\mathbf{L}U \\ \boldsymbol{\Psi}'_{yt} + \alpha\mathbf{L}V \end{bmatrix}$$

• Horn-Schunck

$$\begin{bmatrix} \mathbf{I}_x^2 + \alpha\mathbf{L} & \mathbf{I}_x\mathbf{I}_y \\ \mathbf{I}_x\mathbf{I}_y & \mathbf{I}_y^2 + \alpha\mathbf{L} \end{bmatrix} \begin{bmatrix} dU \\ dV \end{bmatrix} = -\begin{bmatrix} \mathbf{I}_x\mathbf{I}_t + \alpha\mathbf{L}U \\ \mathbf{I}_y\mathbf{I}_t + \alpha\mathbf{L}V \end{bmatrix}$$

Page 20: Motion Estimation I

Example

[Figure: input two frames and flow visualizations — coarse-to-fine LK with median filtering, Horn-Schunck, robust optical flow]

Page 21: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Other representations

Page 22: Motion Estimation I

Video stabilization

Page 23: Motion Estimation I

Video denoising

Page 24: Motion Estimation I

Video super resolution

Page 25: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Contour motion analysis

• Obtaining motion ground truth

Page 26: Motion Estimation I

Block matching

• Both Horn-Schunck and Lucas-Kanade achieve sub-pixel accuracy

• But in practice we may not need sub-pixel accuracy

• MPEG: 16 × 16 block matching using MMSE

• H.264: variable block size and quarter-pixel precision

Page 27: Motion Estimation I

Tracking reliable features

• Idea: no need to work on ambiguous region pixels (flat regions & line structures)

• Instead, we can track features and then propagate the tracking to ambiguous pixels

• Good features to track [Shi & Tomasi '94]

• Block matching + Lucas-Kanade refinement

$$\begin{bmatrix} du \\ dv \end{bmatrix} = -\begin{bmatrix} \mathbf{I}_x^T\mathbf{I}_x & \mathbf{I}_x^T\mathbf{I}_y \\ \mathbf{I}_x^T\mathbf{I}_y & \mathbf{I}_y^T\mathbf{I}_y \end{bmatrix}^{-1} \begin{bmatrix} \mathbf{I}_x^T\mathbf{I}_t \\ \mathbf{I}_y^T\mathbf{I}_t \end{bmatrix}$$

Page 28: Motion Estimation I

Feature detection & tracking

Page 29: Motion Estimation I

From sparse to dense

• Interpolation: given values $\{d_i\}$ at $\{(x_i, y_i)\}$, reconstruct a smooth surface $f(x, y)$

• Membrane model (first-order smoothness)

$$\sum_i w_i \big(f(x_i, y_i) - d_i\big)^2 + \alpha \iint f_x^2 + f_y^2\, dx\, dy$$

• Thin plate model (second-order smoothness)

$$\sum_i w_i \big(f(x_i, y_i) - d_i\big)^2 + \alpha \iint f_{xx}^2 + f_{xy}^2 + f_{yy}^2\, dx\, dy$$
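A minimal 1-D sketch of the membrane model: sparse data points plus a first-order smoothness term reduce to one sparse linear system. Grid size, data weight, and $\alpha$ below are illustrative assumptions:

```python
import numpy as np

def membrane_interp(xs, ds, n=100, w=100.0, alpha=1.0):
    """Fit f on grid 0..n-1 minimizing sum_i w (f[x_i]-d_i)^2 + alpha sum_j (f[j+1]-f[j])^2."""
    S = np.zeros((n, n))
    rhs = np.zeros(n)
    for x, d in zip(xs, ds):
        S[x, x] += w            # data term contributes to the normal equations
        rhs[x] += w * d
    D = np.diff(np.eye(n), axis=0)   # first-difference operator, (n-1) x n
    f = np.linalg.solve(S + alpha * D.T @ D, rhs)
    return f

# three sparse samples; the membrane interpolates piecewise-linearly between them
f = membrane_interp([10, 50, 90], [0.0, 1.0, 0.0])
```

The thin plate model works the same way, with second differences in place of `D`; it yields a smoother surface without kinks at the data points.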

Page 30: Motion Estimation I

Membrane vs. thin plate

Page 31: Motion Estimation I

Dense flow field from sparse tracking

Page 32: Motion Estimation I

Pros and Cons of Feature Matching

• Pros
– Efficient (a few feature points vs. all pixels)
– Reliable (with advanced feature descriptors)

• Cons
– Independent tracking (tracking can be unreliable)
– Not all information is used (may not capture weak features)

• How to improve
– Track every pixel with uncertainty
– Integrate spatial regularity (neighboring pixels go together)

Page 33: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Other representations

Page 34: Motion Estimation I

Discrete optical flow

• The objective function is similar to that of continuous flow

• $\mathbf{x} = (x, y)$ is the pixel coordinate, $\mathbf{w} = (u, v)$ is the flow vector

$$E(\mathbf{w}) = \underbrace{\sum_{\mathbf{x}} \min\big(|I_1(\mathbf{x}) - I_2(\mathbf{x} + \mathbf{w}(\mathbf{x}))|,\; t\big)}_{\text{data term}} + \underbrace{\sum_{\mathbf{x}} \eta\big(|u(\mathbf{x})| + |v(\mathbf{x})|\big)}_{\text{small displacement}} + \underbrace{\sum_{(\mathbf{x}_1, \mathbf{x}_2)\in\varepsilon} \min\big(\alpha|u(\mathbf{x}_1) - u(\mathbf{x}_2)|,\; d\big) + \min\big(\alpha|v(\mathbf{x}_1) - v(\mathbf{x}_2)|,\; d\big)}_{\text{spatial regularity}}$$

• Truncated L1 norms:
– Account for outliers in the data term
– Encourage piecewise smoothness in the smoothness term
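Because every term is a simple truncated L1 penalty on a discrete label field, the energy is straightforward to evaluate for a candidate integer flow. A sketch with assumed values for $t$, $\eta$, $\alpha$, and $d$, using a 4-neighborhood:

```python
import numpy as np

def discrete_flow_energy(I1, I2, u, v, t=1.0, eta=0.01, alpha=1.0, d=0.5):
    """Evaluate E(w): truncated-L1 data term + small-displacement + smoothness terms."""
    H, W = I1.shape
    ys, xs = np.mgrid[0:H, 0:W]
    x2 = np.clip(xs + u, 0, W - 1)             # warp target coordinates (clamped)
    y2 = np.clip(ys + v, 0, H - 1)
    data = np.minimum(np.abs(I1 - I2[y2, x2]), t).sum()
    small = eta * (np.abs(u) + np.abs(v)).sum()
    smooth = (np.minimum(alpha * np.abs(np.diff(u, axis=0)), d).sum()
              + np.minimum(alpha * np.abs(np.diff(u, axis=1)), d).sum()
              + np.minimum(alpha * np.abs(np.diff(v, axis=0)), d).sum()
              + np.minimum(alpha * np.abs(np.diff(v, axis=1)), d).sum())
    return data + small + smooth

# the true translational flow should score lower than zero flow
rng = np.random.default_rng(0)
I1 = rng.random((20, 20))
I2 = np.roll(I1, 1, axis=1)                    # every pixel moves right by one
zero = np.zeros((20, 20), dtype=int)
e_zero = discrete_flow_energy(I1, I2, zero, zero)
e_true = discrete_flow_energy(I1, I2, zero + 1, zero)
```

The optimizers on the following slides (belief propagation, graph cuts, MCMC) search the discrete label space for the flow minimizing exactly this kind of energy.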

Page 35: Motion Estimation I

Decoupled smoothness

[Figure: complexity comparison. Coupled smoothness: data term $O(L^2)$, smoothness term $O(L^4)$. Decoupled smoothness: data terms $O(L^2)$, smoothness terms $O(L^2)$, i.e. $O(L)$ per scalar field]

Page 36: Motion Estimation I

Combinatorial optimization on graph

$$E(\mathbf{w}) = \sum_{\mathbf{x}} \min\big(|I_1(\mathbf{x}) - I_2(\mathbf{x} + \mathbf{w}(\mathbf{x}))|,\; t\big) + \sum_{\mathbf{x}} \eta\big(|u(\mathbf{x})| + |v(\mathbf{x})|\big) + \sum_{(\mathbf{x}_1, \mathbf{x}_2)\in\varepsilon} \min\big(\alpha|u(\mathbf{x}_1) - u(\mathbf{x}_2)|,\; d\big) + \min\big(\alpha|v(\mathbf{x}_1) - v(\mathbf{x}_2)|,\; d\big)$$

• Optimization strategies

– Belief propagation

– Graph cuts

– MCMC (simulated annealing)

Page 37: Motion Estimation I

Dual-layer belief propagation

[Shekhovtsov et al. CVPR 07]

[Figure: two coupled grids, one per flow component — horizontal flow $u$ and vertical flow $v$]

$\mathbf{w} = (u, v)$

Data term: $\min\big(|I_1(\mathbf{x}) - I_2(\mathbf{x} + \mathbf{w})|,\; t\big)$

Smoothness term on $u$: $\min\big(\alpha|u(\mathbf{x}_1) - u(\mathbf{x}_2)|,\; d\big)$

Smoothness term on $v$: $\min\big(\alpha|v(\mathbf{x}_1) - v(\mathbf{x}_2)|,\; d\big)$

Regularization term on $u$: $\eta|u(\mathbf{x})|$

Regularization term on $v$: $\eta|v(\mathbf{x})|$

Page 38: Motion Estimation I

Dual-layer belief propagation

Message $M_{jk}$: given all the information at node $k$, predict the distribution at node $j$

Update within the $u$ plane

Page 39: Motion Estimation I

Dual-layer belief propagation

Update within the $v$ plane

Page 40: Motion Estimation I

Dual-layer belief propagation

Update from the $u$ plane to the $v$ plane

Page 41: Motion Estimation I

Dual-layer belief propagation

Update from the $v$ plane to the $u$ plane

Page 42: Motion Estimation I

Example

[Figure: input two frames and flow visualizations — coarse-to-fine LK with median filtering, robust optical flow, discrete optical flow]

Page 43: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Other representations

Page 44: Motion Estimation I

Layer representation

• Optical flow field is able to model complicated motion

• Different angle: a video sequence can be a composite of several moving layers

• Layers have been widely used

– Adobe Photoshop

– Adobe After Effects

• Compositing is straightforward, but inference is hard

Wang & Adelson, 1994

Page 45: Motion Estimation I

Wang & Adelson, 1994

• Strategy

– Obtain a dense optical flow field

– Divide a frame into non-overlapping regions and fit affine motion for each region

– Cluster affine motions by k-means clustering

– Region assignment by hypothesis testing

– Region splitter: disconnected regions are separated
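The per-region affine fit in the second step is an ordinary least-squares problem. A sketch, assuming the standard six-parameter affine motion model $u = a_1 + a_2 x + a_3 y$, $v = a_4 + a_5 x + a_6 y$:

```python
import numpy as np

def fit_affine_motion(xs, ys, us, vs):
    """Least-squares affine fit: u = a1 + a2*x + a3*y, v = a4 + a5*x + a6*y."""
    A = np.stack([np.ones_like(xs), xs, ys], axis=1)   # design matrix [1, x, y]
    au, *_ = np.linalg.lstsq(A, us, rcond=None)
    av, *_ = np.linalg.lstsq(A, vs, rcond=None)
    return au, av

# recover known affine parameters from a synthetic flow field
rng = np.random.default_rng(1)
xs = rng.random(200) * 100
ys = rng.random(200) * 100
au, av = fit_affine_motion(xs, ys,
                           0.5 + 0.10 * xs - 0.20 * ys,
                           -1.0 + 0.05 * xs + 0.02 * ys)
```

The resulting six-parameter vectors per region are then what the k-means step clusters.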

Page 46: Motion Estimation I

Results

[Figure: optical flow field; clustering into affine regions; clustering with error metric; three layers with affine motion superimposed; reconstructed background layer — "Flower garden" sequence]

Page 47: Motion Estimation I

Weiss & Adelson, 1996

• Chicken-and-egg problem
– Good motion → good segmentation
– Good segmentation → good motion

• We don't have either of them, so iterate!

• Perceptually organized expectation & maximization (POEM)
– E-step: estimate the likelihood that a pixel belongs to each of the layers (segmentation)
– M-step: estimate the motion parameters of each layer

Page 48: Motion Estimation I

Liu et al. 2005

• Reliable layer segmentation for motion magnification

• Layer segmentation pipeline

Pipeline: feature point tracking → trajectory clustering → dense optical flow interpolation → layer segmentation

Page 49: Motion Estimation I

Normalized Complex Correlation

• The similarity metric should be independent of phase and magnitude

• Normalized complex correlation

$$S(C_1, C_2) = \frac{\big|\sum_t C_1(t)\, \overline{C_2}(t)\big|^2}{\sum_t C_1(t)\overline{C_1}(t)\; \sum_t C_2(t)\overline{C_2}(t)}$$
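Written out in code, the metric is only a few lines; by Cauchy-Schwarz it equals 1 exactly when one trajectory is a global complex scaling (a magnitude change plus a phase shift) of the other:

```python
import numpy as np

def normalized_complex_correlation(c1, c2):
    """S(C1, C2) = |sum_t C1 conj(C2)|^2 / (sum_t |C1|^2 * sum_t |C2|^2)."""
    num = np.abs(np.sum(c1 * np.conj(c2))) ** 2
    den = np.sum(np.abs(c1) ** 2) * np.sum(np.abs(c2) ** 2)
    return num / den

# a trajectory and the same trajectory with a global magnitude and phase change
rng = np.random.default_rng(2)
c1 = rng.normal(size=32) + 1j * rng.normal(size=32)
c2 = 3.0 * np.exp(1j * 0.7) * c1
```

Independent trajectories score well below 1, which is what makes the metric usable as an affinity for the clustering step that follows.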


Page 50: Motion Estimation I

Spectral Clustering

[Figure: trajectory-by-trajectory affinity matrix; clustering into two clusters; reordered affinity matrix]


Page 51: Motion Estimation I

Clustering Results


Page 52: Motion Estimation I

From Sparse Feature Points to Dense Optical Flow Field

• Interpolate the dense optical flow field using locally weighted linear regression

[Figure: flow vectors of the clustered sparse feature points; dense optical flow field of cluster 1 (leaves); dense optical flow field of cluster 2 (swing)]
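A sketch of locally weighted linear regression for one flow component; the Gaussian kernel bandwidth and the small regularizer are assumptions:

```python
import numpy as np

def lwlr_flow(px, py, xs, ys, us, sigma=20.0, reg=1e-6):
    """Predict the flow component u at (px, py) from sparse feature-point flows."""
    # Gaussian weights: nearby feature points dominate the local fit
    w = np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2 * sigma ** 2))
    A = np.stack([np.ones_like(xs), xs, ys], axis=1)   # local linear model [1, x, y]
    AtW = A.T * w
    coef = np.linalg.solve(AtW @ A + reg * np.eye(3), AtW @ us)
    return coef @ np.array([1.0, px, py])

# on a globally linear flow field the local fit reproduces the true value
rng = np.random.default_rng(3)
xs = rng.random(100) * 100
ys = rng.random(100) * 100
us = 2.0 + 0.05 * xs - 0.03 * ys
pred = lwlr_flow(40.0, 60.0, xs, ys, us)
```

Repeating the prediction at every pixel (and likewise for $v$) yields the dense per-cluster flow fields shown in the figure.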


Page 53: Motion Estimation I

Motion Layer Assignment

• Assign each pixel to a motion cluster layer, using four cues:

– Motion likelihood: consistency of a pixel's intensity if it moves with the motion of a given layer (dense optical flow field)

– Color likelihood: consistency of the color in a layer

– Spatial connectivity: adjacent pixels are favored to belong to the same group

– Temporal coherence: label assignment stays constant over time

• Energy minimization using graph cuts


Page 55: Motion Estimation I

How good is optical flow?

• The AAE (average angular error) race on the Yosemite sequence ran for over 15 years

[Figure: Yosemite sequence; state-of-the-art optical flow*; improvement#]

* Brox et al. ECCV, 2004. # I. Austvoll. Lecture Notes in Computer Science, 2005.

Page 56: Motion Estimation I

Middlebury flow database

Baker et al. A Database and Evaluation Methodology for Optical Flow. ICCV 2007

Page 57: Motion Estimation I

Middlebury flow database

Page 58: Motion Estimation I

Human-assisted motion annotation

• Ground truth is essential to the progress of the field

• A first attempt in computer vision to obtain ground-truth motion for arbitrary scenes (a different approach from Baker et al. 2007)

• An interactive system that combines human perception and state-of-the-art computer vision algorithms to annotate motion

Liu et al. Human-assisted motion annotation. CVPR 2008

Page 59: Motion Estimation I

Demo: interactive layer segmentation

Page 60: Motion Estimation I

Demo: interactive motion labeling

Page 61: Motion Estimation I

Motion database of natural scenes

Color map

Bruhn et al. Lucas/Kanade meets Horn/Schunck: combining local and global optical flow methods. IJCV, 2005

Page 62: Motion Estimation I

Content

• Robust optical flow estimation

• Applications

• Feature matching

• Discrete optical flow

• Layer motion analysis

• Other representations

Page 63: Motion Estimation I

Particle video

P. Sand and S. Teller. Particle Video: Long-Range Motion Estimation using Point Trajectories. CVPR 2006

Page 64: Motion Estimation I

Particle video

Page 65: Motion Estimation I

Seemingly Simple Examples

Kanizsa square

From real video

Page 66: Motion Estimation I

Output from the State-of-the-Art Optical Flow Algorithm

T. Brox et al. High accuracy optical flow estimation based on a theory for warping. ECCV 2004

[Figure: Kanizsa square and its estimated optical flow field]

Page 67: Motion Estimation I

Output from the State-of-the-Art Optical Flow Algorithm

T. Brox et al. High accuracy optical flow estimation based on a theory for warping. ECCV 2004

Optical flow field

Dancer

Page 68: Motion Estimation I

Optical flow representation: aperture problem

Corners Lines Flat regions

Spurious junctions Boundary ownership Illusory boundaries

Page 69: Motion Estimation I

Optical flow representation: aperture problem

Corners Lines Flat regions

Spurious junctions Boundary ownership Illusory boundaries

We need motion representation beyond pixel level!

Page 70: Motion Estimation I

Challenge: Textureless Objects under Occlusion

• Corners are not always trustworthy (junctions)

• Flat regions do not always move smoothly (discontinuous at illusory boundaries)

• How about boundaries?

– Easy to detect and track for textureless objects

– Able to handle junctions with illusory boundaries

Page 71: Motion Estimation I

Frame 1

Page 72: Motion Estimation I

Frame 2

Page 73: Motion Estimation I

Extracted boundary fragments

Page 74: Motion Estimation I

Optical flow from Lucas-Kanade algorithm

Page 75: Motion Estimation I

Estimated motion by our system, after grouping

Page 76: Motion Estimation I

Boundary grouping and illusory boundaries (frame 1)

Page 77: Motion Estimation I

Boundary grouping and illusory boundaries (frame 2)

Page 78: Motion Estimation I

Rotating Chair

Page 79: Motion Estimation I

Frame 1

Page 80: Motion Estimation I

Frame 2

Page 81: Motion Estimation I

Extracted boundary fragments

Page 82: Motion Estimation I

Estimated flow field from Brox et al.

Page 83: Motion Estimation I

Estimated motion by our system, after grouping

Page 84: Motion Estimation I

Boundary grouping and illusory boundaries (frame 1)

Page 85: Motion Estimation I

Boundary grouping and illusory boundaries (frame 2)

