Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography

Martin A. Fischler, Robert C. Bolles
Artificial Intelligence Center, SRI International, CA

CPSC 643, Presentation 1

Communications of the ACM (Graphics and Image Processing), Volume 24, Number 6, June 1981.
Martin A. Fischler

Research Focus: Artificial Intelligence, Machine Vision, Switching Theory, Computer Organization, Information Theory
• B.E.E. degree – City College of New York, NY
• M.S. and Ph.D. – Stanford University, CA
• Computer Scientist – SRI International, since 1977
• First published the RANSAC paper as an SRI International technical report in 1980
• Published the RANSAC paper in Communications of the ACM (Graphics and Image Processing) in 1981
• Currently working on Visual Odometry and Visual SLAM
Computer Vision in 1981
• Focused on classification and recognition
• Science-based (hadn't gotten to applications yet)
• Initially focused largely on artificial worlds
• Images were hard to come by
• 3-D range sensing was almost viewed as cheating
• Research was driven by sponsors' interests
Back to 1981
IBM's first PC, 1981 – 4.77 MHz
Apple II Plus, 1981 – max of 64K RAM
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
Motivation
Least Squares Algorithm
Optimizes the fit of a functional description to ALL of the presented data.
Adapted from http://en.wikipedia.org/wiki/Linear_least_squares
$\min_{\beta} \sum_{i=1}^{n} \Bigl( \sum_{j=1}^{m} X_{ij}\,\beta_j - y_i \Bigr)^2$
Motivation
Least Squares Algorithm
Least squares is an averaging technique that considers all of the presented data, and is therefore sensitive to outliers.
Adapted from http://www.cs.unc.edu/~lazebnik/spring09/
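A minimal numerical illustration of that sensitivity (ours, not from the slides), fitting a line by ordinary least squares with and without a single gross outlier:

```python
import numpy as np

x = np.arange(10, dtype=float)
y_clean = 2.0 * x + 1.0            # points on the true line y = 2x + 1
y_dirty = y_clean.copy()
y_dirty[9] = 100.0                 # a single gross outlier

A = np.column_stack([x, np.ones_like(x)])        # design matrix [x | 1]
slope_clean, icept_clean = np.linalg.lstsq(A, y_clean, rcond=None)[0]
slope_dirty, icept_dirty = np.linalg.lstsq(A, y_dirty, rcond=None)[0]

print(slope_clean)   # 2.0 – exact data is recovered
print(slope_dirty)   # far from 2 – one outlier drags the averaged fit off
```

Because least squares weights every residual equally, the one corrupted point shifts the estimated slope by several units.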
Motivation
Robust Estimator
• The robust function ρ behaves like the squared distance for small r_i but saturates for large r_i, where r_i is the residual of point i w.r.t. the model parameters θ, and σ is a scale parameter.
• A nonlinear optimization that must be solved iteratively.
• The least squares solution can be used for initialization.
$\min_{\theta} \sum_{i} \rho\bigl(r_i(x_i, \theta);\, \sigma\bigr)$
Adapted from http://www.cs.unc.edu/~lazebnik/spring09/
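A hedged sketch of such an iterative solver (ours, not from the slides): iteratively reweighted least squares (IRLS) with a Huber-type weight that is quadratic for small residuals and saturates for large ones, initialized from the least-squares solution as suggested above. The function name and parameter defaults are illustrative.

```python
import numpy as np

def irls_line(x, y, sigma=1.0, iters=50):
    """Fit y ~ slope*x + intercept with a Huber-type robust loss via IRLS."""
    A = np.column_stack([x, np.ones_like(x)])
    theta = np.linalg.lstsq(A, y, rcond=None)[0]   # least-squares initialization
    for _ in range(iters):
        r = A @ theta - y                          # residuals r_i
        w = np.ones_like(r)                        # quadratic zone: weight 1
        big = np.abs(r) > sigma
        w[big] = sigma / np.abs(r[big])            # saturating zone: down-weight
        W = A * w[:, None]                         # row-weighted design matrix
        theta = np.linalg.solve(W.T @ A, W.T @ y)  # weighted normal equations
    return theta
```

With the same one-outlier data as before, the down-weighting pulls the fit back toward the inlier line even though the least-squares start is badly biased.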
Motivation
Two types of error
• Measurement error – inliers
• Classification error – outliers

Existing Problems
• Least squares and robust estimators (in their initialization) treat inliers and outliers equally, as a whole.
• A robust estimator only rejects the outliers in later iterations, whereas fitting the inliers and rejecting the outliers should happen in the same process.
• Why not randomly choose data subsets to fit – RANSAC.
RANSAC
Notations
• U = {x_i} – set of data points, |U| = N
• p – model parameters
• f – function that computes model parameters p given a sample S from U
• Cost function for a single data point x
• k – number of iterations

Algorithm
• Select a random sample S from U
• Compute parameters p = f(S)
• Compute cost C_k
• Stop if C_k < C* or k > k*
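The steps above can be sketched for line fitting (a sketch under our own assumptions: the sample size 2, the inlier threshold t, a fixed iteration cap in place of the cost test, and counting inliers as the score):

```python
import numpy as np

def ransac_line(x, y, t=1.0, k_max=100, seed=0):
    """Fit a line y = slope*x + icept to data containing gross outliers."""
    rng = np.random.default_rng(seed)
    n = len(x)
    best_p, best_inliers = None, np.zeros(n, dtype=bool)
    for _ in range(k_max):
        i, j = rng.choice(n, size=2, replace=False)   # random sample S
        if x[i] == x[j]:
            continue                                  # degenerate sample
        slope = (y[j] - y[i]) / (x[j] - x[i])         # p = f(S)
        icept = y[i] - slope * x[i]
        r = np.abs(y - (slope * x + icept))           # residual of each point
        inliers = r < t                               # consensus set
        if inliers.sum() > best_inliers.sum():        # keep the best model
            best_p, best_inliers = (slope, icept), inliers
    return best_p, best_inliers
```

Each iteration fits an exact model to a minimal sample and scores it by how many points agree with it, so outliers never contaminate the fit itself.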
RANSAC
Example
• Select data subset
• Calculate model parameters p
• Calculate cost for each data point
• Select the data that fit the current model
• Repeat sampling
• Stop when C_k < C* or k > k*
Adapted from http://cmp.felk.cvut.cz/~matas/papers/presentations/
RANSAC
How many iterations?
• The average number of steps k is a function of the sample size m and the fraction of outliers ε.
• Choose k so that, with probability p, at least one random sample is free from outliers.
$E(k) = \bigl(1-\varepsilon\bigr)^{-m}$

$k = \dfrac{\log(1-p)}{\log\bigl(1-(1-\varepsilon)^m\bigr)}$

$1 - \bigl[1-(1-\varepsilon)^m\bigr]^k \ge p$
Proportion of outliers ε (p = 0.99):

m     5%   10%   20%   25%   30%   40%    50%
2      2    3     5     6     7    11     17
3      3    4     7     9    11    19     35
4      3    5     9    13    17    34     72
5      4    6    12    17    26    57    146
6      4    7    16    24    37    97    293
7      4    8    20    33    54   163    588
8      5    9    26    44    78   272   1177
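The formula for k can be checked numerically (the helper name is ours). Recomputing the entries, rounded up, reproduces the table for p = 0.99; e.g. m = 2 with 50% outliers gives k = 17:

```python
import math

def ransac_iters(m, eps, p=0.99):
    """Iterations k so that, with probability p, at least one sample of
    size m is free from outliers, given outlier proportion eps."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - eps) ** m))

print(ransac_iters(2, 0.50))   # 17, as in the table
print(ransac_iters(8, 0.50))   # 1177 – the cost explodes with sample size
```

The required k grows rapidly with both the sample size m and the outlier proportion, which is why minimal samples are preferred.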
RANSAC
Application: Location Determination Problem
• Existence proofs of multiple solutions for the P3P, P4P, and P5P problems.
• An algorithm for solving the general P3P problem.
• An algorithm for solving the planar P4P problem.
• An automatic gross-error filtering technique (RANSAC).
Adapted from http://cmp.felk.cvut.cz/ransac-cvpr2006/
RANSAC
Results: Location Determination Problem
Final result (deviations):
X: 0.1 ft    Heading: 0.01°
Y: 0.1 ft    Pitch:   0.10°
Z: 0.1 ft    Roll:    0.12°
Adapted from http://www.ai.sri.com/people/fischler/
RANSAC
Other Applications
Adapted from http://graphics.cs.cmu.edu/courses/15-463/2006_fall/www/463.html
RANSAC
Pros
• Simple and general.
• Applicable to many different problems.
• Often works well in practice.

Cons
• Sometimes too many iterations are required.
• Can fail for extremely low inlier ratios.
• Lots of parameters to tune.
• Can't always get a good initialization of the model.
• We can often do better than brute-force sampling.