
Rectification and Depth Computation


    Department of Computer Engineering

    University of California at Santa Cruz


    CMPE 264: Image Analysis and Computer Vision
    Hai Tao


    Image correspondences

    Two different problems

    Compute sparse correspondences with unknown camera motion, used for camera motion estimation and sparse 3D reconstruction

    Given camera intrinsic and extrinsic parameters, compute dense pixel correspondences (one correspondence per pixel), used for recovering dense scene structure: one depth per pixel

    We will focus on the second problem in this lecture.


    Examples of dense depth recovery



    Triangulation

    If we know the camera matrices and the camera motion, for any pixel p_l in the left image, its correspondence must lie on the epipolar line in the right image

    This suggests a method to compute the depth (3D position) of each pixel in the left image

    For each pixel p_l in the left image, search for the best match p_r along its epipolar line in the right image

    The corresponding 3D scene point is the intersection of the rays O_l p_l and O_r p_r. This process is called triangulation (sketched below)
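    A minimal numerical sketch of this triangulation step with NumPy; the projection matrices and pixel values are hypothetical, and the linear least-squares (DLT) formulation is one common way to intersect the two rays:

        import numpy as np

        def triangulate(P_l, P_r, p_l, p_r):
            """Linear (DLT) triangulation: intersect the rays through p_l and p_r.

            P_l, P_r: 3x4 projection matrices of the left and right cameras.
            p_l, p_r: matched pixel coordinates (x, y) in the two images.
            Returns the 3D scene point in world coordinates.
            """
            A = np.stack([
                p_l[0] * P_l[2] - P_l[0],
                p_l[1] * P_l[2] - P_l[1],
                p_r[0] * P_r[2] - P_r[0],
                p_r[1] * P_r[2] - P_r[1],
            ])
            # The point is the right null vector of A (least squares via SVD).
            _, _, Vt = np.linalg.svd(A)
            X = Vt[-1]
            return X[:3] / X[3]

        # Hypothetical example: right camera displaced 0.1 units along the x axis.
        K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
        P_l = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P_r = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.], [0.]])])
        print(triangulate(P_l, P_r, (400., 240.), (320., 240.)))   # ~[0.1, 0.0, 1.0]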


    Rectification

    The search for the best match along the epipolar line can be very efficient for a special configuration in which the epipolar lines become parallel to the horizontal image axis and collinear (the same scan lines in both images)

    For such a configuration, to find the correspondence of pixel (x, y) in the right image, only pixels (*, y) are considered

    This special configuration is called a simple or standard stereo system. In such a system, the 3D transformation between the two cameras is

    $$P_r = P_l - T, \qquad T = [T_x,\ 0,\ 0]^T \qquad (R = I)$$


    Images taken from two cameras with arbitrary relative motion R, T can be rectified: the resultant images are transformed so that they appear as if taken from a standard stereo system, with the two camera centers unchanged


    Image transformation for a rotating camera

    Question: from the original image, can we compute the image taken from a camera at the same position but with a different orientation? If the answer is yes, we can rotate the two cameras so that the resultant images are rectified

    Suppose the camera rotation matrix is R. For an image point $\bar p$ (in homogeneous coordinates), the corresponding 3D scene point is $P = Z_c K^{-1}\bar p$. After the rotation, the coordinates of this point are $P' = Z_c R K^{-1}\bar p$. The new homogeneous image coordinates are $\bar p' \simeq K R K^{-1}\bar p$. This can be rewritten as

    $$\bar p' \simeq H \bar p, \qquad \text{where } H = K R K^{-1} \text{ is a } 3 \times 3 \text{ transformation matrix (homography)}$$

    The image transformation caused by a camera rotation is a 2D homography
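    As an illustration, a small sketch of synthesizing the rotated-camera view by warping with H = K R K^{-1}, using NumPy and OpenCV; the intrinsic matrix, rotation angle, and image filename below are hypothetical:

        import numpy as np
        import cv2

        def rotation_homography(K, R):
            """Homography induced by a pure camera rotation: H = K R K^{-1}."""
            return K @ R @ np.linalg.inv(K)

        # Hypothetical intrinsics and a 5-degree rotation about the vertical (y) axis.
        K = np.array([[800., 0., 320.],
                      [0., 800., 240.],
                      [0., 0., 1.]])
        theta = np.deg2rad(5.0)
        R = np.array([[np.cos(theta), 0., np.sin(theta)],
                      [0., 1., 0.],
                      [-np.sin(theta), 0., np.cos(theta)]])

        img = cv2.imread("left.png")                       # any input image
        H = rotation_homography(K, R)
        rotated = cv2.warpPerspective(img, H, (img.shape[1], img.shape[0]))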


    Rectification algorithm

    Build the rectification matrix R_rect as

    $$R_{rect} = \begin{bmatrix} e_1^T \\ e_2^T \\ e_3^T \end{bmatrix}, \qquad \text{where } e_1 = \frac{T}{\|T\|}, \quad e_2 = \frac{1}{\sqrt{T_x^2 + T_y^2}}\,[-T_y,\ T_x,\ 0]^T, \quad e_3 = e_1 \times e_2$$

    ($e_2$ is orthogonal to both $e_1$ and the optical axis direction $[0, 0, 1]^T$)

    Rotate the left camera by R_rect and the right camera by R_rect R, using the corresponding homographies derived in the previous slide
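    A small NumPy sketch of this construction, with a hypothetical translation vector T between the two camera centers:

        import numpy as np

        def rectification_matrix(T):
            """Build R_rect from the translation T = (T_x, T_y, T_z) between the cameras."""
            T = np.asarray(T, dtype=float)
            e1 = T / np.linalg.norm(T)                                  # new x axis: along the baseline
            e2 = np.array([-T[1], T[0], 0.0]) / np.hypot(T[0], T[1])    # orthogonal to e1 and to [0, 0, 1]
            e3 = np.cross(e1, e2)                                       # completes the right-handed frame
            return np.stack([e1, e2, e3])                               # rows are e1^T, e2^T, e3^T

        # Hypothetical baseline: mostly along x, slightly along y and z.
        R_rect = rectification_matrix([0.10, 0.01, 0.02])
        print(R_rect @ np.array([0.10, 0.01, 0.02]))                    # maps T to [|T|, 0, 0]^T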


    Disparity and depth in a simple stereo system

    In a simple/standard binocular stereo system, the correspondences are along the same scan line in the two images. The following figure shows the relationship between the depth Z and the disparity d = x_r - x_l

    The following relationship can be easily proved:

    $$Z = \frac{f\,T_x}{d}$$

    The depth is inversely proportional to the disparity: the closer the object, the larger the disparity. For a scene point at infinity, the disparity is 0
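    A direct transcription of this formula in NumPy; the focal length and baseline values are hypothetical, and zero disparities are mapped to an infinite depth:

        import numpy as np

        def disparity_to_depth(disparity, f=800.0, Tx=0.12):
            """Depth from disparity: Z = f * T_x / d.

            f: focal length in pixels; Tx: baseline in metres (hypothetical values).
            """
            d = np.asarray(disparity, dtype=float)
            with np.errstate(divide="ignore"):
                return np.where(d != 0.0, f * Tx / d, np.inf)

        print(disparity_to_depth([48.0, 24.0, 0.0]))       # [2.0, 4.0, inf]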


    Finding correspondences: correlation-based method

    Assumptions

    Most scene points are visible from both viewpoints

    Corresponding image regions are similar

    Correlation matching algorithm

    Let p_l and p_r be pixels in the left and right image, 2W+1 the width of the correlation window, and [-L, L] the disparity search range in the right image for p_l

    For each disparity d in the range [-L, L], compute the similarity measure c(d)

    Output the disparity with the maximum similarity measure (a sketch of this search follows below)
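    A minimal sketch of this search for a single left-image pixel, assuming already-rectified grayscale images stored as NumPy arrays and using SSD as the measure (so the best match is the disparity with the smallest c(d)); the window and search sizes are illustrative:

        import numpy as np

        def best_disparity(left, right, x, y, W=5, L=64):
            """Search along scan line y for the disparity of left-image pixel (x, y).

            left, right: rectified grayscale images (2D arrays).
            2W+1 is the correlation window width, [-L, L] the disparity search range.
            """
            patch_l = left[y - W:y + W + 1, x - W:x + W + 1].astype(float)
            best_d, best_c = None, np.inf
            for d in range(-L, L + 1):
                xr = x + d
                if xr - W < 0 or xr + W + 1 > right.shape[1]:
                    continue                               # window would leave the right image
                patch_r = right[y - W:y + W + 1, xr - W:xr + W + 1].astype(float)
                c = np.sum((patch_l - patch_r) ** 2)       # SSD similarity measure c(d)
                if c < best_c:
                    best_d, best_c = d, c
            return best_d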


    Different similarity measures

    Sum of squared differences (SSD)

    $$c(d) = \sum_{i=-W}^{W}\sum_{j=-W}^{W} \big[I_l(x_l+i,\ y_l+j) - I_r(x_l+d+i,\ y_l+j)\big]^2$$

    Sum of absolute differences (SAD)

    $$c(d) = \sum_{i=-W}^{W}\sum_{j=-W}^{W} \big|I_l(x_l+i,\ y_l+j) - I_r(x_l+d+i,\ y_l+j)\big|$$

    Normalized cross-correlation

    $$c(d) = \frac{C_{lr}}{\sqrt{C_{ll}\,C_{rr}}}$$

    where

    $$C_{lr} = \sum_{i=-W}^{W}\sum_{j=-W}^{W} \big[I_l(x_l+i,\ y_l+j) - \bar I_l(x_l,\ y_l)\big]\,\big[I_r(x_l+d+i,\ y_l+j) - \bar I_r(x_l+d,\ y_l)\big]$$

    $$C_{ll} = \sum_{i=-W}^{W}\sum_{j=-W}^{W} \big[I_l(x_l+i,\ y_l+j) - \bar I_l(x_l,\ y_l)\big]^2, \qquad C_{rr} = \sum_{i=-W}^{W}\sum_{j=-W}^{W} \big[I_r(x_l+d+i,\ y_l+j) - \bar I_r(x_l+d,\ y_l)\big]^2$$

    and $\bar I_l$, $\bar I_r$ denote the mean intensities over the corresponding correlation windows
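    A sketch of the normalized cross-correlation measure for one window pair, following the definitions above (NumPy; the function and argument names are illustrative):

        import numpy as np

        def ncc(left, right, x, y, d, W=5):
            """Normalized cross-correlation c(d) between the (2W+1) x (2W+1) windows
            centred at (x, y) in the left image and at (x + d, y) in the right image."""
            patch_l = left[y - W:y + W + 1, x - W:x + W + 1].astype(float)
            patch_r = right[y - W:y + W + 1, x + d - W:x + d + W + 1].astype(float)
            dl = patch_l - patch_l.mean()                  # subtract the window means
            dr = patch_r - patch_r.mean()
            C_lr = np.sum(dl * dr)
            C_ll = np.sum(dl ** 2)
            C_rr = np.sum(dr ** 2)
            return C_lr / np.sqrt(C_ll * C_rr)             # in [-1, 1]; larger is more similar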


    Finding correspondences: feature-based method

    Search is restricted to a set of features in the two images, such as edges, corners, etc.

    A similarity measure is used for matching features

    Constraints such as the uniqueness constraint (each feature can only have one match) can be used

    Feature matching algorithm

    For each feature f_l in the left image, compute the similarity measure between f_l and each feature in the right image

    Select the right-image feature with the largest similarity measure and output the disparity

    Sample similarity measure for line segments

    $$S = \frac{1}{w_0\,(l_l - l_r)^2 + w_1\,(\theta_l - \theta_r)^2 + w_2\,(m_l - m_r)^2 + w_3\,(c_l - c_r)^2}$$

    where $l$ is the length of the line segment, $\theta$ the orientation, $m$ the midpoint, $c$ the average contrast along the edge line, and $w_0, \dots, w_3$ are weights
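    A direct transcription of this measure, assuming each line segment is described by its length, orientation, midpoint, and average contrast; the weights and example values are hypothetical:

        import numpy as np

        def segment_similarity(seg_l, seg_r, w=(1.0, 1.0, 1.0, 1.0)):
            """S = 1 / (w0*dl^2 + w1*dtheta^2 + w2*dm^2 + w3*dc^2)."""
            dl = seg_l["length"] - seg_r["length"]
            dtheta = seg_l["orientation"] - seg_r["orientation"]
            dm = np.linalg.norm(np.subtract(seg_l["midpoint"], seg_r["midpoint"]))
            dc = seg_l["contrast"] - seg_r["contrast"]
            # Identical segments give a zero denominator; in practice add a small epsilon.
            return 1.0 / (w[0] * dl**2 + w[1] * dtheta**2 + w[2] * dm**2 + w[3] * dc**2)

        left_seg = {"length": 42.0, "orientation": 0.30, "midpoint": (120.0, 88.0), "contrast": 35.0}
        right_seg = {"length": 40.0, "orientation": 0.28, "midpoint": (103.0, 88.0), "contrast": 33.0}
        print(segment_similarity(left_seg, right_seg))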

