Advanced Artificial Intelligence
Lecture 8: Advanced Machine Learning
Date posted: 13-Mar-2016
Transcript
Page 1: Advanced Artificial Intelligence

Advanced Artificial Intelligence

Lecture 8: Advanced Machine Learning

Page 2: Advanced Artificial Intelligence

Outline

Clustering
- K-Means
- EM
- Spectral Clustering

Dimensionality Reduction


Page 3: Advanced Artificial Intelligence

The unsupervised learning problem


Many data points, no labels

Page 4: Advanced Artificial Intelligence

K-Means


Many data points, no labels

Page 5: Advanced Artificial Intelligence

K-Means

Choose a fixed number of clusters.

Choose cluster centers and point-cluster allocations to minimize error. We can't do this by exhaustive search, because there are too many possible allocations.

Algorithm: alternate two steps until convergence:
- Fix cluster centers; allocate each point to the closest cluster.
- Fix the allocation; compute the best (mean) cluster centers.

x could be any set of features for which we can compute a distance (be careful about scaling).

Error: $\sum_{i \in \text{clusters}} \; \sum_{j \in \text{elements of cluster } i} \left\| x_j - \mu_i \right\|^2$, where $\mu_i$ is the center of cluster $i$.
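The two alternating steps above can be sketched as follows (a minimal illustration; the function name, data, and initialization scheme are my own assumptions, not from the lecture):

```python
# Minimal K-means sketch: alternate assignment and center updates
# until the centers stop moving. Data and seed are illustrative.
import numpy as np

def k_means(x, k, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize centers by picking k distinct data points.
    centers = x[rng.choice(len(x), size=k, replace=False)]
    labels = np.zeros(len(x), dtype=int)
    for _ in range(n_iters):
        # Step 1: fix centers; allocate each point to the closest cluster.
        dists = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step 2: fix allocation; recompute each center as the cluster mean
        # (keep the old center if a cluster ends up empty).
        new_centers = np.array([x[labels == i].mean(axis=0)
                                if np.any(labels == i) else centers[i]
                                for i in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

# Two well-separated blobs; K-means should recover them.
rng = np.random.default_rng(1)
x = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
centers, labels = k_means(x, k=2)
```

Because the objective is minimized by exhaustive search over neither centers nor allocations, each step only has to improve the error, which is why the loop terminates at a (possibly local) minimum.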

Page 6: Advanced Artificial Intelligence

K-Means

Page 7: Advanced Artificial Intelligence

K-Means

* From Marc Pollefeys, COMP 256, 2003

Page 8: Advanced Artificial Intelligence

K-Means is an approximation to EM
- Model (hypothesis space): mixture of N Gaussians
- Latent variables: correspondence of data points to Gaussians

We notice:
- Given the mixture model, it's easy to calculate the correspondence.
- Given the correspondence, it's easy to estimate the mixture model.

Page 9: Advanced Artificial Intelligence

Expectation Maximization: Idea
- Data generated from a mixture of Gaussians
- Latent variables: correspondence between data items and Gaussians

Page 10: Advanced Artificial Intelligence

Generalized K-Means (EM)

Page 11: Advanced Artificial Intelligence

Gaussians


Page 12: Advanced Artificial Intelligence

ML Fitting Gaussians


Page 13: Advanced Artificial Intelligence

Learning a Gaussian Mixture (with known covariance $\sigma^2$)

E-Step: compute the expected value of each latent indicator $z_{ij}$ (the probability that data point $x_i$ was generated by the $j$-th Gaussian):

$$E[z_{ij}] = \frac{p(x = x_i \mid \mu = \mu_j)}{\sum_{n=1}^{k} p(x = x_i \mid \mu = \mu_n)} = \frac{e^{-\frac{1}{2\sigma^2}(x_i - \mu_j)^2}}{\sum_{n=1}^{k} e^{-\frac{1}{2\sigma^2}(x_i - \mu_n)^2}}$$

M-Step: re-estimate each mean from the expected assignments:

$$\mu_j \leftarrow \frac{\sum_{i} E[z_{ij}]\, x_i}{\sum_{i} E[z_{ij}]}$$
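The E/M updates for this known-variance setting can be sketched directly (an illustrative implementation; the uniform mixing weights, data, and initial means are my own assumptions):

```python
# EM for a 1-D mixture of k Gaussians with known, shared variance
# sigma^2 and uniform mixing weights. Only the means are estimated.
import numpy as np

def em_known_variance(x, mu, sigma, n_iters=50):
    mu = np.array(mu, dtype=float)
    for _ in range(n_iters):
        # E-step: E[z_ij] proportional to exp(-(x_i - mu_j)^2 / (2 sigma^2)),
        # normalized over j (subtract the row max for numerical stability).
        logp = -((x[:, None] - mu[None, :]) ** 2) / (2 * sigma ** 2)
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        ez = p / p.sum(axis=1, keepdims=True)
        # M-step: mu_j <- sum_i E[z_ij] x_i / sum_i E[z_ij].
        mu = (ez * x[:, None]).sum(axis=0) / ez.sum(axis=0)
    return mu

# Data drawn from two Gaussians with means -2 and 3, unit variance.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])
mu = em_known_variance(x, mu=[-1.0, 1.0], sigma=1.0)
```

Unlike K-means, each point contributes to every mean in proportion to its soft responsibility $E[z_{ij}]$, rather than being assigned to exactly one cluster.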

Page 14: Advanced Artificial Intelligence

Expectation Maximization

Converges! Proof [Neal/Hinton; McLachlan/Krishnan]:
- Neither the E-step nor the M-step decreases the data likelihood.
- Converges to a local maximum or saddle point of the likelihood.

But subject to local maxima.

Page 15: Advanced Artificial Intelligence

Practical EM
- The number of clusters is unknown.
- Suffers (badly) from local maxima.
- Algorithm:
  - Start a new cluster center if many points are "unexplained".
  - Kill a cluster center that doesn't contribute.
  - (Use an AIC/BIC criterion for all this, if you want to be formal.)


Page 16: Advanced Artificial Intelligence

Spectral Clustering


Page 17: Advanced Artificial Intelligence

Spectral Clustering


Page 18: Advanced Artificial Intelligence

Spectral Clustering: Overview

Data → Similarities → Block-Detection

Page 19: Advanced Artificial Intelligence

Eigenvectors and Blocks

Block matrices have block eigenvectors:

$$\begin{pmatrix} 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 1 & 1 \end{pmatrix} \;\xrightarrow{\text{eigensolver}}\; e_1 = \begin{pmatrix} .71 \\ .71 \\ 0 \\ 0 \end{pmatrix},\; e_2 = \begin{pmatrix} 0 \\ 0 \\ .71 \\ .71 \end{pmatrix}; \quad \lambda_1 = 2,\ \lambda_2 = 2,\ \lambda_3 = 0,\ \lambda_4 = 0$$

Near-block matrices have near-block eigenvectors [Ng et al., NIPS 02]:

$$\begin{pmatrix} 1 & 1 & .2 & 0 \\ 1 & 1 & 0 & -.2 \\ .2 & 0 & 1 & 1 \\ 0 & -.2 & 1 & 1 \end{pmatrix} \;\xrightarrow{\text{eigensolver}}\; e_1 = \begin{pmatrix} .71 \\ .69 \\ .14 \\ 0 \end{pmatrix},\; e_2 = \begin{pmatrix} 0 \\ -.14 \\ .69 \\ .71 \end{pmatrix}; \quad \lambda_1 = 2.02,\ \lambda_2 = 2.02,\ \lambda_3 = -0.02,\ \lambda_4 = -0.02$$
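The block-matrix example can be verified numerically (a sketch using NumPy's symmetric eigensolver; since the eigenvalue 2 has multiplicity two, we check the basis-independent projector onto the top eigenspace rather than individual eigenvectors):

```python
# Verify that a block matrix has block eigenvectors: the projector onto
# the top-2 eigenspace reproduces the block-membership pattern exactly,
# regardless of which orthonormal basis the eigensolver returns.
import numpy as np

A = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]], dtype=float)

vals, vecs = np.linalg.eigh(A)          # eigh returns ascending eigenvalues
order = np.argsort(vals)[::-1]          # reorder to descending: 2, 2, 0, 0
vals, vecs = vals[order], vecs[:, order]

# Projector onto the span of the two leading eigenvectors.
proj = vecs[:, :2] @ vecs[:, :2].T
# proj[i, j] is 0.5 when i and j are in the same block, 0 otherwise.
```
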

Page 20: Advanced Artificial Intelligence

Spectral Space

We can put items into blocks by their eigenvector components:

$$\begin{pmatrix} 1 & 1 & .2 & 0 \\ 1 & 1 & 0 & -.2 \\ .2 & 0 & 1 & 1 \\ 0 & -.2 & 1 & 1 \end{pmatrix}, \quad e_1 = \begin{pmatrix} .71 \\ .69 \\ .14 \\ 0 \end{pmatrix}, \quad e_2 = \begin{pmatrix} 0 \\ -.14 \\ .69 \\ .71 \end{pmatrix}$$

The resulting clusters are independent of the row ordering:

$$\begin{pmatrix} 1 & .2 & 1 & 0 \\ .2 & 1 & 0 & 1 \\ 1 & 0 & 1 & -.2 \\ 0 & 1 & -.2 & 1 \end{pmatrix}, \quad e_1 = \begin{pmatrix} .71 \\ .14 \\ .69 \\ 0 \end{pmatrix}, \quad e_2 = \begin{pmatrix} 0 \\ .69 \\ -.14 \\ .71 \end{pmatrix}$$

Page 21: Advanced Artificial Intelligence

The Spectral Advantage

The key advantage of spectral clustering is the spectral-space representation: items that belong to the same block end up with similar coordinates in the space spanned by the leading eigenvectors.
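A sketch of grouping items by their spectral-space representation (illustrative, not the lecture's exact procedure; the 0.3 threshold is an arbitrary choice for this small example):

```python
# Cluster the near-block matrix from the earlier slide by projecting
# onto the two leading eigenvectors and reading membership off the
# (basis-independent) projector p = V2 V2^T: p[i, j] is large when
# items i and j lie in the same near-block.
import numpy as np

A = np.array([[1, 1, .2, 0],
              [1, 1, 0, -.2],
              [.2, 0, 1, 1],
              [0, -.2, 1, 1]])

vals, vecs = np.linalg.eigh(A)                 # symmetric eigensolver
v2 = vecs[:, np.argsort(vals)[::-1][:2]]       # two leading eigenvectors
p = v2 @ v2.T                                  # projector onto spectral space

# Items whose projector entry against item 0 is large share its cluster.
labels = (p[:, 0] > 0.3).astype(int)
```

In practice one would run K-means on the rows of the eigenvector matrix instead of thresholding against a single item; this tiny example just makes the block structure visible.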

Page 22: Advanced Artificial Intelligence

Measuring Affinity

Intensity: $\text{aff}(x, y) = \exp\left(-\frac{1}{2\sigma_I^2}\,\left\| I(x) - I(y) \right\|^2\right)$

Distance: $\text{aff}(x, y) = \exp\left(-\frac{1}{2\sigma_d^2}\,\left\| x - y \right\|^2\right)$

Texture: $\text{aff}(x, y) = \exp\left(-\frac{1}{2\sigma_t^2}\,\left\| c(x) - c(y) \right\|^2\right)$

where $I(x)$ is the image intensity at $x$ and $c(x)$ is a vector of texture (filter) responses at $x$.
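The distance-based affinity, and the effect of the scale parameter, can be sketched as follows (the points and sigma values are illustrative):

```python
# Gaussian (distance-based) affinity: aff(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
# A small sigma makes only very close points look similar; a large sigma
# makes almost everything look similar.
import numpy as np

def affinity(points, sigma):
    # Pairwise squared distances, then the Gaussian kernel.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

pts = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 0.0]])
a_small = affinity(pts, sigma=0.5)   # nearly block-diagonal
a_large = affinity(pts, sigma=5.0)   # nearly all-ones
```
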

Page 23: Advanced Artificial Intelligence

Scale affects affinity

Page 24: Advanced Artificial Intelligence
Page 25: Advanced Artificial Intelligence

Dimensionality Reduction


Page 26: Advanced Artificial Intelligence

Dimensionality Reduction with PCA


Page 27: Advanced Artificial Intelligence

Linear: Principal Components
- Fit a multivariate Gaussian.
- Compute the eigenvectors of its covariance.
- Project onto the eigenvectors with the largest eigenvalues.
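The three steps above can be sketched directly (a minimal PCA illustration; the synthetic data and function name are my own assumptions):

```python
# PCA via the covariance eigendecomposition: center the data, compute
# the covariance of the fitted Gaussian, and project onto the
# eigenvectors with the largest eigenvalues.
import numpy as np

def pca(x, n_components):
    mean = x.mean(axis=0)
    xc = x - mean                        # center the data (fit the mean)
    cov = np.cov(xc, rowvar=False)       # covariance of the fitted Gaussian
    vals, vecs = np.linalg.eigh(cov)     # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:n_components]]
    return xc @ top, top                 # projections and principal directions

# 3-D data that varies mostly along the direction (3, 2, 1).
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1)) @ np.array([[3.0, 2.0, 1.0]]) \
    + 0.1 * rng.normal(size=(200, 3))
proj, comps = pca(x, n_components=1)
```

The recovered first principal direction should line up (up to sign) with the dominant direction of variation in the data.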


