Date posted: 14-Jul-2020
Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering
EE6540 Final Project
Devin Cornell & Sushruth Sastry
Transcript
Page 1

Performance Comparison of K-Means and Expectation Maximization with Gaussian Mixture Models for Clustering

EE6540 Final Project
Devin Cornell & Sushruth Sastry

Page 2

Outline

● problem statement
● background
● experiments
● results
● conclusions

Page 3

Problem Statement

● basic clustering
● classification through distribution modeling

Figures from [1]

Page 4

Background: GMM

● equations from [2]
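The slide defers to [2] for the equations; for reference, the mixture density is p(x) = Σₖ πₖ N(x | μₖ, Σₖ). A minimal NumPy sketch of evaluating that density (our own illustration, not code from the project):

```python
import numpy as np

def gaussian_pdf(x, mu, cov):
    """Multivariate normal density N(x | mu, cov)."""
    d = len(mu)
    diff = x - mu
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff)

def gmm_pdf(x, weights, means, covs):
    """GMM density: p(x) = sum_k pi_k * N(x | mu_k, Sigma_k)."""
    return sum(w * gaussian_pdf(x, m, c)
               for w, m, c in zip(weights, means, covs))

# Example: two equally weighted, well-separated 2-D components
weights = [0.5, 0.5]
means = [np.zeros(2), np.array([10.0, 10.0])]
covs = [np.eye(2), np.eye(2)]
p = gmm_pdf(np.zeros(2), weights, means, covs)
```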

Page 5

Background: EM

● equations, algorithm from [3]

Page 6

Background: EM-GMM

● Algorithm 2 reprinted from [4]

Page 7

Background: k-means

● special case of EM-GMM [2] with
○ no per-component cluster covariances
○ a fixed covariance shared a priori by all K components
○ no membership weights; each point simply belongs to the class with the nearest mean

Page 8

k-means Algorithm

● based on modifications to the algorithm described in [2]
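The modified algorithm itself is only referenced, not shown. A plain Lloyd-style k-means loop, assuming the standard variant (our sketch, not the authors' modified version), alternates hard assignment and mean update:

```python
import numpy as np

def kmeans(X, K, n_iter=100, seed=0):
    """Lloyd's k-means: hard assignments, then center update (sketch)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), K, replace=False)]
    for _ in range(n_iter):
        # Assignment step: each point goes to its nearest center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: each center moves to the mean of its assigned points
        # (empty clusters keep their previous center)
        new_centers = np.array([X[labels == k].mean(axis=0)
                                if np.any(labels == k) else centers[k]
                                for k in range(K)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```

The hard `argmin` assignment is exactly the "no membership weights" limit of the EM-GMM responsibilities described on the previous slide.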

Page 9

Experiment 1: Separate GMM Data
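The slide only names the setup; one plausible way to generate "separate GMM data" of this kind (our assumption about the generator, not the authors' code) is ancestral sampling: draw a component index, then draw from that component's Gaussian:

```python
import numpy as np

def sample_gmm(n, weights, means, covs, seed=0):
    """Draw n points from a Gaussian mixture; returns points and true labels."""
    rng = np.random.default_rng(seed)
    labels = rng.choice(len(weights), size=n, p=weights)
    X = np.array([rng.multivariate_normal(means[k], covs[k]) for k in labels])
    return X, labels

# Well-separated components, in the spirit of Experiment 1
X, y = sample_gmm(300,
                  weights=[0.5, 0.5],
                  means=[np.zeros(2), np.array([8.0, 8.0])],
                  covs=[np.eye(2), np.eye(2)])
```

Keeping the true labels `y` makes it possible to score clustering accuracy afterwards, which the results slides report.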

Page 10

Experiment 2: Intermixed GMM Data

Page 11

Experiment 3: Concentric Gaussian with Large Covariance Differences

Page 12

Experiment 4: Radial Poisson Distributions with Different Means

Page 13

Demonstration

● see the MATLAB demo

Page 14

Experiment 1: Results

Page 15

Experiment 2: Results

Page 16

Experiment 3: Results

Page 17

Experiment 4a: Results

Page 18

Experiment 4b: Results

Page 19

Experiment 4c: Results

Page 20

Experiment 4d: Results

Page 21

Results Summary

Page 22

Conclusions

● EM-GMM is much slower than k-means
● EM-GMM was more accurate in all experiments performed here
● These algorithms can be made more flexible by running them with different values of K
● With a way to map “fitted distributions” to “generating distributions”, a GMM can estimate arbitrary distributions with fewer fitted distributions

Page 23

References

[1] A. W. Moore, “Clustering with Gaussian Mixtures,” School of Computer Science, Carnegie Mellon University, http://cs.cmu.edu/awm

[2] K. P. Murphy, Machine Learning: A Probabilistic Perspective. MIT Press, 2012.

[3] A. P. Dempster, N. M. Laird, and D. B. Rubin, “Maximum likelihood from incomplete data via the EM algorithm,” Journal of the Royal Statistical Society, Series B (Methodological), pp. 1-38, 1977.

[4] D. Barber, Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.

