
Slide 1

Spectral Methods
Tutorial 6

© Maks Ovsjanikov
tosca.cs.technion.ac.il/book

Numerical geometry of non-rigid shapes
Stanford University, Winter 2009

Slide 2

Outline

On a Connection between Kernel PCA and Metric Multidimensional Scaling. Williams C., Advances in Neural Information Processing Systems, 2001.

1. Classic MDS and PCA review.
2. Metric MDS.
3. Kernel PCA, kernel trick, relation to Metric MDS.
4. Summary.

Articulated Shape Matching by Robust Alignment of Embedded Representations. Mateus D. et al., Workshop on 3DRR, 2007.

Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation. Rustamov R., SGP, 2007.

Slide 3

On a Connection between Kernel PCA and Metric Multidimensional Scaling. Williams C., Advances in Neural Information Processing Systems, 2001.

Classic MDS (classical scaling) recap.

1. We are given a dissimilarity matrix D with entries d_ij = ||x_i - x_j|| arising from a normed vector space.

2. We want to find coordinates of points x_1, ..., x_n that would give rise to these dissimilarities.

E.g., given the pairwise distances between cities on a map, find the locations of the cities.

We can only hope to recover the points up to rotation, translation, and reflection.

Slide 4

Classic MDS (classical scaling).

1. Centering matrix: H = I - (1/n) 1 1^T.

2. Define B = -1/2 H Δ H, where Δ_ij = d_ij^2.

Attention: this only works for normed vector spaces!

Slide 5

Classic MDS (classical scaling).

2. Define B = -1/2 H Δ H, where Δ_ij = d_ij^2.

3. Express B in terms of the centered coordinates to obtain B = (HX)(HX)^T.

Note that if B = Y Y^T, then also B = (Y R)(Y R)^T for any orthonormal R, so the points can only be recovered up to an orthonormal transform.

4. Since B is symmetric, we can find its eigendecomposition B = V Λ V^T and recover coordinates as Y = V Λ^(1/2).

Slide 6

Multivariate Analysis. Mardia K.V. et al., Academic Press, 1979.

Classic MDS (classical scaling).

1. Although B is an n x n matrix, it has at most d non-zero eigenvalues if X was sampled from R^d.

2. We can project onto the first k eigenvectors by taking Y = V_k Λ_k^(1/2), where V_k holds the k leading eigenvectors of B and Λ_k the corresponding eigenvalues.

Slide 7

Optimality condition of classic MDS.

Theorem: Let X = {x_1, ..., x_n} be a set of points in R^d with distances d_ij = ||x_i - x_j||. For any k-dimensional orthonormal projection of the points, the distortion sum_ij ( d_ij^2 - d'_ij^2 ), where d'_ij are the distances after projection, is minimized when X is projected onto its first k principal directions. (A numerical sketch of classical scaling follows below.)
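The whole classical scaling recipe fits in a few lines of numpy. Below is a minimal sketch (my own illustration, not part of the tutorial; the function name classical_mds and the toy data are made up):

```python
import numpy as np

def classical_mds(D, k):
    """Classical MDS (classical scaling) from a matrix of pairwise distances D.

    Returns an n x k embedding whose pairwise distances approximate D,
    recovered up to rotation, translation, and reflection.
    """
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix H = I - (1/n) 1 1^T
    B = -0.5 * H @ (D ** 2) @ H              # B = -1/2 H Delta H, Delta_ij = d_ij^2
    evals, evecs = np.linalg.eigh(B)         # B is symmetric: eigendecomposition
    idx = np.argsort(evals)[::-1][:k]        # k largest eigenvalues
    lam = np.clip(evals[idx], 0.0, None)     # guard against tiny negative eigenvalues
    return evecs[:, idx] * np.sqrt(lam)      # Y = V_k Lambda_k^(1/2)

# Toy check: points sampled from R^2; their distances are reproduced.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, k=2)
D_rec = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
print(np.allclose(D, D_rec))                 # expected: True
```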

Slide 8

Classic MDS – Relation to PCA.

1. In standard Principal Component Analysis, one performs the eigendecomposition of the covariance matrix S = (1/n) (HX)^T (HX).

2. The goal is to find a more natural basis in which to express the points.


Slide 11

Classic MDS – Relation to PCA.

1. In standard Principal Component Analysis, one performs the eigendecomposition of the covariance matrix S = (1/n) (HX)^T (HX).

2. Using the centering matrix, we can express n S = (HX)^T (HX).

3. For any eigenvalue λ of B = (HX)(HX)^T with eigenvector v we have B v = λ v, which implies (HX)^T (HX) [ (HX)^T v ] = λ [ (HX)^T v ].

4. The non-zero eigenvalues of B and n S are therefore the same, and the eigenvectors are related by u = (HX)^T v (up to normalization).

Slide 12

Classic MDS – Relation to PCA.

1. The non-zero eigenvalues of B and n S are the same, and the eigenvectors are related by u = (HX)^T v.

2. The covariance matrix has the advantage that its size is d x d rather than n x n, and it is typically positive definite rather than only positive semi-definite, so its eigendecomposition is more stable.

3. If we are only given pairwise distances, we cannot construct the covariance matrix directly: PCA and MDS solve different problems! (A small numerical check of the eigenvalue correspondence follows below.)
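A quick numerical check of this correspondence (my own sketch, not from the slides): the non-zero eigenvalues of the n x n matrix (HX)(HX)^T used by classical MDS coincide with those of the d x d matrix (HX)^T (HX) used by PCA.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))                    # n = 8 points sampled from R^3
n = X.shape[0]
H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
Xc = H @ X                                     # centered data

B = Xc @ Xc.T                                  # n x n matrix used by classical MDS
C = Xc.T @ Xc                                  # d x d (unnormalized) covariance used by PCA

eb = np.sort(np.linalg.eigvalsh(B))[::-1][:3]  # three largest eigenvalues of B
ec = np.sort(np.linalg.eigvalsh(C))[::-1]      # all eigenvalues of C
print(np.allclose(eb, ec))                     # expected: True
```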

Slide 13

Metric MDS.

1. Suppose that instead of minimizing the distortion (stress) directly, we want to minimize a derived stress: given pairwise distances d_ij and a function f, find a set of points y_1, ..., y_n minimizing

   sum_{i<j} ( f(d_ij) - ||y_i - y_j|| )^2.

2. Even if the d_ij come from a Euclidean space, the problem is much more difficult.

3. One option is to resort to numerical optimization: differentiate the stress w.r.t. the coordinates y_i to get the gradient (a gradient-descent sketch follows below).

4. Alternative: perform classical MDS on the derived distances f(d_ij), i.e. solve an eigensystem.

Problem: the matrix B built from the derived distances is no longer guaranteed to be positive semi-definite.

Critchley F., Multidimensional Scaling: a short critique and a new algorithm, COMPSTAT, 1978.
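A plain gradient-descent sketch of the derived-stress minimization (my own illustration, not the algorithm discussed in the paper; the function name metric_mds, the fixed step size, and the default choice f(d) = sqrt(d) are arbitrary assumptions):

```python
import numpy as np

def metric_mds(D, f=np.sqrt, k=2, steps=2000, lr=1e-2, seed=0):
    """Minimize sum_{i<j} (f(D_ij) - ||y_i - y_j||)^2 by plain gradient descent."""
    n = D.shape[0]
    T = f(D)                                      # derived target distances f(d_ij)
    rng = np.random.default_rng(seed)
    Y = rng.normal(scale=0.1, size=(n, k))        # random initial embedding
    for _ in range(steps):
        diff = Y[:, None, :] - Y[None, :, :]      # pairwise coordinate differences y_i - y_j
        dist = np.linalg.norm(diff, axis=-1)      # current embedding distances
        np.fill_diagonal(dist, 1.0)               # avoid division by zero on the diagonal
        coef = (dist - T) / dist                  # per-pair residual divided by the distance
        np.fill_diagonal(coef, 0.0)
        grad = 2.0 * (coef[:, :, None] * diff).sum(axis=1)   # gradient w.r.t. each y_i
        Y -= lr * grad
    return Y
```

Unlike classical scaling, this gives no global optimality guarantee; the result depends on the initialization and the step size.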

Slide 14

Kernel PCA.

1. Basic idea: represent a point x by its image φ(x) in a feature space F.

2. The two domains can be completely different!

3. Kernel trick: in many applications we do not need to know φ explicitly; we only need the inner products φ(x_i) · φ(x_j) = k(x_i, x_j), provided the kernel k can be computed efficiently (the feature space F can even be infinite dimensional).


Slide 16

Kernel PCA.

1. We could do PCA in the feature space: compute the covariance matrix of the feature vectors and perform its eigendecomposition.

2. However, instead of the feature-space covariance matrix, we can use the n x n kernel (Gram) matrix K with K_ij = k(x_i, x_j). If the dimension of the feature vectors is larger than n, this is more efficient!

3. To center the data, so that the feature vectors have zero mean, we can use the centering matrix and find the eigenvalues of H K H (a sketch follows below).

Schölkopf B. et al., Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 1998.
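A minimal kernel PCA sketch along these lines (my own illustration; rbf_kernel and kernel_pca are hypothetical helper names and the Gaussian kernel width is arbitrary). The coordinates of the training points along the leading components are read off the eigenvectors of the centered kernel matrix, scaled by the square roots of the eigenvalues:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (isotropic) kernel: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def kernel_pca(K, k=2):
    """Coordinates of the training points along the k leading kernel principal components."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    Kc = H @ K @ H                             # centered kernel matrix H K H
    evals, evecs = np.linalg.eigh(Kc)
    idx = np.argsort(evals)[::-1][:k]          # k largest eigenvalues
    lam = np.clip(evals[idx], 0.0, None)
    return evecs[:, idx] * np.sqrt(lam)

rng = np.random.default_rng(2)
X = rng.normal(size=(20, 3))
Z = kernel_pca(rbf_kernel(X, sigma=2.0), k=2)  # 20 x 2 embedding in the feature-space basis
```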

Slide 17

Kernel PCA and Metric MDS.

1. Spherical (isotropic) kernel: the kernel depends only on the distance between the points, k(x_i, x_j) = r( ||x_i - x_j||^2 ).

2. If we assume that k is a valid kernel, so that k(x_i, x_j) = φ(x_i) · φ(x_j) for some feature map φ, then the distance in the feature space is a function of the distance in the input space:

   ||φ(x_i) - φ(x_j)||^2 = k(x_i, x_i) + k(x_j, x_j) - 2 k(x_i, x_j) = 2 ( r(0) - r( ||x_i - x_j||^2 ) ).

Slide 18

Kernel PCA and Metric MDS.

1. Suppose we are given a matrix of pairwise distances D with D_ij = d_ij.

2. If we set A_ij = -1/2 ||φ(x_i) - φ(x_j)||^2, then A_ij = k(x_i, x_j) - r(0). In matrix form: K = A + r(0) 1 1^T, and moreover H K H = H A H, since centering annihilates the constant term.

3. Thus, performing classical MDS on K is equivalent to performing it on A, i.e. on the feature-space distances.

4. Classical MDS on A attempts to approximate ||φ(x_i) - φ(x_j)|| = sqrt( 2 ( r(0) - r(d_ij^2) ) ), which is a nonlinear function of the distance d_ij. So classical MDS on A is metric MDS on the original distances d_ij.

Slide 19

Kernel PCA and Metric MDS.

1. Performing classical MDS on K is equivalent to performing it on A, i.e. on the feature-space distances.

2. Classical MDS on A attempts to approximate sqrt( 2 ( r(0) - r(d_ij^2) ) ), a nonlinear function of the distance, so classical MDS on A is metric MDS on the original distances.

3. Since K is a Gram matrix of feature vectors, it is positive semi-definite if the kernel is chosen appropriately. This is not the case for arbitrary metric MDS functions f.

4. An advantage of doing kernel PCA is that a new point can be quickly projected onto a pre-computed basis; this is difficult with numerical optimization. (A small numerical check of the equivalence follows below.)
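A small numerical check of the equivalence (my own sketch; it reuses the hypothetical classical_mds, rbf_kernel and kernel_pca helpers sketched earlier): classical MDS on the feature-space distances induced by a Gaussian kernel reproduces the kernel PCA embedding up to per-axis sign flips.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(15, 3))
K = rbf_kernel(X, sigma=2.0)                   # isotropic kernel, k(x, x) = r(0) = 1

# Feature-space distances: ||phi(x_i) - phi(x_j)||^2 = k_ii + k_jj - 2 k_ij.
sq = np.diag(K)[:, None] + np.diag(K)[None, :] - 2.0 * K
D_feat = np.sqrt(np.clip(sq, 0.0, None))

Y_mds = classical_mds(D_feat, k=2)             # classical MDS on the feature-space distances
Y_kpca = kernel_pca(K, k=2)                    # kernel PCA on the same kernel
print(np.allclose(np.abs(Y_mds), np.abs(Y_kpca)))   # expected: True (up to sign flips)
```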

Slide 20

Summary:

1. If the distance matrix comes from points in a normed vector space, MDS reduces to an eigenvalue problem: classical scaling.

2. Classical MDS is also closely related to PCA, which computes the optimal basis when the point positions are known.

3. Kernel PCA transforms the points to a feature space and uses the kernel trick to compute PCA in this space.

4. Metric MDS approximates derived distances f(d_ij), for some given function f.

5. If the kernel is spherical (isotropic), then kernel PCA is a special case of metric MDS, for the function f(d) = sqrt( 2 ( r(0) - r(d^2) ) ).

Slide 21

Outline

On a Connection between Kernel PCA and Metric Multidimensional Scaling. Williams C., Advances in Neural Information Processing Systems, 2001.

1. Classic MDS and PCA review.
2. Metric MDS.
3. Kernel PCA, kernel trick, relation to Metric MDS.
4. Summary.

Articulated Shape Matching by Robust Alignment of Embedded Representations. Mateus D. et al., Workshop on 3DRR, 2007.

Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation. Rustamov R., SGP, 2007.

Slide 22

Articulated Shape Matching by Robust Alignment of Embedded Representations. Mateus D. et al., Workshop on 3DRR, 2007.

Problem:

1. Given two articulated shapes in different poses, find point correspondences between them.

2. The articulation has many degrees of freedom, so rigid alignment cannot be applied directly.

[Figure: articulated shapes in different poses. Images by Q.-X. Huang et al., 2008.]

Slide 23

Approach:

1. Embed each shape into a feature space defined by the Laplacian.

2. The embedding is isometry invariant: Φ(T(x)) = Φ(x) for any isometric deformation T.

3. The embedding is only defined up to a rigid transform in the feature space.

4. Find the optimal rigid transform in the feature space to find the correspondences.

Slide 24

Approach:

1. The shape is given as a point cloud. Approximate the Laplacian, e.g. by a graph Laplacian L = D - W built on a weighted neighborhood graph (W holds the edge weights, D the vertex degrees).

2. Solve the generalized eigenvalue problem L u = λ D u.

3. Keep the most significant eigenvalues/eigenvectors (for the Laplacian, those with the smallest non-zero eigenvalues).

4. For each data point x_j, let Φ(x_j) = ( u_1(j), ..., u_k(j) ), where u_i is the i-th eigenvector of the Laplacian. (A sketch of this embedding follows below.)
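A minimal sketch of such a Laplacian embedding for a point cloud (my own illustration, not the construction used in the paper: it assumes a Gaussian-weighted k-nearest-neighbor graph that is connected, and uses dense solvers, so it is only suitable for small clouds):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial import cKDTree

def laplacian_embedding(X, k_dims=5, n_neighbors=8):
    """Embed a point cloud via the generalized eigenproblem L u = lambda D u of a graph Laplacian."""
    n = X.shape[0]
    dist, idx = cKDTree(X).query(X, k=n_neighbors + 1)   # nearest neighbors (first is the point itself)
    sigma = np.mean(dist[:, 1:])                         # Gaussian bandwidth estimated from the data
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    cols = idx[:, 1:].ravel()
    W[rows, cols] = np.exp(-dist[:, 1:].ravel() ** 2 / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                               # symmetrize the kNN graph
    D = np.diag(W.sum(axis=1))                           # degree matrix
    L = D - W                                            # unnormalized graph Laplacian
    evals, evecs = eigh(L, D)                            # generalized eigenvectors, ascending eigenvalues
    return evecs[:, 1:k_dims + 1]                        # drop the constant eigenvector (eigenvalue 0)
```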

Slide 25

Approach:

1. For each data point x_j, let Φ(x_j) = ( u_1(j), ..., u_k(j) ), where u_i is the i-th eigenvector of the Laplacian.

2. We would like Φ to agree at corresponding points of the two shapes. However, each eigenvector is only defined up to a sign, which gives a reflection ambiguity.

3. If several eigenvectors correspond to the same (repeated) eigenvalue, then any orthonormal combination of them is also an eigenvector, which gives a rotation ambiguity.

4. Points from the two point sets can therefore be aligned using Φ(y) ≈ Q Φ(x), where Q is an orthogonal matrix.

Slide 26

Approach:

1. Given point correspondences, it is easy to obtain the optimal orthogonal matrix: the SVD approach from optimal rigid alignment.

2. Let M = sum_k Φ(y_k) Φ(x_k)^T, and compute its singular value decomposition M = U S V^T.

3. The optimal solution is given by Q = U V^T.

4. With this step, one can perform ICP in the feature space to find the optimal correspondences. (A sketch of the SVD alignment step follows below.)
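The SVD step above is the standard orthogonal Procrustes solution; a minimal numpy sketch (my own illustration; A and B hold the embedded coordinates of corresponding points, one point per row):

```python
import numpy as np

def optimal_orthogonal(A, B):
    """Orthogonal Q minimizing sum_k ||B_k - Q A_k||^2 (orthogonal Procrustes)."""
    M = B.T @ A                          # M = sum_k B_k A_k^T
    U, S, Vt = np.linalg.svd(M)
    return U @ Vt                        # optimal orthogonal matrix Q = U V^T

# Toy check: recover a random orthogonal transform from exact correspondences.
rng = np.random.default_rng(4)
A = rng.normal(size=(30, 5))
Q_true, _ = np.linalg.qr(rng.normal(size=(5, 5)))
B = A @ Q_true.T                         # B_k = Q_true A_k
print(np.allclose(optimal_orthogonal(A, B), Q_true))   # expected: True
```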

Slide 27

Results:

[Figure: matching results from Mateus D. et al., Workshop on 3DRR, 2007.]

Slide 28

Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation. Rustamov R., SGP, 2007.

Main goal: find a good, isometry-invariant shape descriptor.

Good means: efficient, easily computable, and insensitive to local topology changes (unlike MDS-based descriptors).

Slide 29

Main idea: for every point p on the surface, define a Global Point Signature

   GPS(p) = ( φ_1(p) / sqrt(λ_1), φ_2(p) / sqrt(λ_2), φ_3(p) / sqrt(λ_3), ... ),

where φ_i is the i-th eigenfunction of the Laplace-Beltrami operator with eigenvalue λ_i.

GPS is a mapping of the surface into an infinite-dimensional space: each point gets a signature. (A small computational sketch follows below.)
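A minimal sketch of building truncated GPS signatures (my own illustration; it assumes the Laplace-Beltrami eigenvalues and the eigenfunctions sampled at the vertices are already available, e.g. from a discrete Laplacian, sorted in ascending order):

```python
import numpy as np

def gps_signatures(evals, evecs, k=15):
    """Truncated Global Point Signatures from Laplace-Beltrami eigenpairs.

    evals: eigenvalues in ascending order (evals[0] ~ 0 for the constant eigenfunction).
    evecs: column i is the eigenfunction phi_i sampled at the vertices.
    Returns an (n_vertices x k) array with rows GPS(p) = (phi_1(p)/sqrt(lambda_1), ..., phi_k(p)/sqrt(lambda_k)).
    """
    lam = evals[1:k + 1]                 # skip the zero eigenvalue / constant eigenfunction
    phi = evecs[:, 1:k + 1]
    return phi / np.sqrt(lam)            # scale each eigenfunction by 1/sqrt(lambda_i)
```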

Slide 30

Properties of GPS:

1. If GPS(p) = GPS(q), then p = q: distinct points receive distinct signatures.

2. GPS is isometry invariant (since the Laplace-Beltrami operator is).

3. Given all eigenfunctions and eigenvalues, one can recover the shape up to isometry (this is not true if only the eigenvalues are known).

4. Euclidean distances in the GPS embedding are meaningful: K-means clustering on the embedding provides a segmentation of the shape.


Slide 31

Comparing GPS:

1. Given a shape, determine its GPS embedding.

2. Construct a histogram of pairwise GPS distances (note that although GPS is only defined up to sign flips of the eigenfunctions, pairwise distances are preserved).

3. For any two shapes, compute the norm of the difference between their histograms (a sketch follows below).

4. For refined comparisons, use more than one histogram.
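A minimal sketch of this histogram-based comparison (my own illustration; the bin count, the shared bin range, and the use of the plain L2 norm on histograms are assumptions, not details from the paper):

```python
import numpy as np

def pairwise_gps_distances(gps):
    """All pairwise Euclidean distances between GPS signatures (each pair counted once)."""
    d = np.linalg.norm(gps[:, None, :] - gps[None, :, :], axis=-1)
    return d[np.triu_indices_from(d, k=1)]

def shape_dissimilarity(gps_a, gps_b, bins=64):
    """Norm of the difference between the GPS-distance histograms of two shapes."""
    da, db = pairwise_gps_distances(gps_a), pairwise_gps_distances(gps_b)
    d_max = max(da.max(), db.max())                       # common bin range for both shapes
    ha, _ = np.histogram(da, bins=bins, range=(0.0, d_max), density=True)
    hb, _ = np.histogram(db, bins=bins, range=(0.0, d_max), density=True)
    return np.linalg.norm(ha - hb)                        # L2 norm of the histogram difference
```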

Slide 32

Results:

[Figure: comparison results from Rustamov R., SGP, 2007.]

Slide 33

Conclusions

1. Kernel methods embed the shape into a feature space that can be manipulated more easily.

2. The Laplacian embedding is useful because of its isometry invariance: it can be used for comparing non-rigid shapes under isometric deformations.

3. Sign flipping and repeated eigenvalues can cause difficulties (there is no canonical way to choose the eigenvectors).

Limitations:

1. The embeddings are not necessarily stable or mesh independent.

2. They are difficult to compute for large meshes (millions of points).

3. Neither topological nor geometric stability is well understood.

