Page 1: Non-Parametric Learning

Non-Parametric Learning

Prof. A.L. Yuille

Stat 231. Fall 2004.

Chp 4.1 – 4.3.

Page 2: Non-Parametric Learning

Parametric versus Non-Parametric

• Previous lectures on MLE learning assumed a functional form for the probability distribution.

• We now consider an alternative non-parametric method based on window functions.

Page 3: Non-Parametric Learning

Non-Parametric

• It is hard to develop probability models for some data.

• Example: estimating the distribution of annual rainfall in the U.S.A. We want to model p(x,y), the probability that a raindrop hits position (x,y).

• Problems: (i) a multi-modal density is difficult to capture with parametric models, (ii) it is difficult or impossible to collect enough data at each point (x,y).

Page 4: Non-Parametric Learning

Intuition

• Assume that the probability density is locally smooth.

• Goal: estimate the class density model p(x) from data.

• Method 1: Windows based on points x in space.

Page 5: Non-Parametric Learning

Windows

• For each point x, form a window centred at x with volume V. Count the number k of samples that fall in the window.

• The probability density is then estimated as p(x) ≈ k/(nV), where n is the total number of samples.
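As an illustrative sketch of this counting estimate (not course code; the function name and data are invented), a hypercube window of side h in d dimensions has volume V = h^d:

```python
import numpy as np

def window_density(x, samples, h):
    """Estimate p(x) as k / (n * V): count the k samples that fall in a
    hypercube of side h centred at x, where the volume is V = h**d."""
    samples = np.atleast_2d(samples)
    n, d = samples.shape
    k = np.count_nonzero(np.all(np.abs(samples - x) <= h / 2.0, axis=1))
    return k / (n * h**d)

rng = np.random.default_rng(0)
data = rng.standard_normal((10_000, 1))          # samples from N(0, 1)
est = window_density(np.array([0.0]), data, h=0.2)
# the true density of N(0, 1) at 0 is 1/sqrt(2*pi), about 0.399
```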

Page 6: Non-Parametric Learning

Non-Parametric

• Goal: design a sequence of windows, with volumes V_n, so that at each point x the estimate p_n(x) tends to f(x) (f(x) is the true density).

• Conditions for window design:

(i) V_n → 0 (increasing spatial resolution),

(ii) k_n → ∞ (many samples at each point),

(iii) k_n/n → 0 (so the estimate remains a density rather than a spike at each sample).

Page 7: Non-Parametric Learning

Two Design Methods

• Parzen Window: fix the window size as a function of n, e.g. V_n = V_1/√n.

• K-NN: fix the no. of samples in the window, e.g. k_n = √n, and grow the window until it contains them.
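The K-NN variant can be sketched in 1-D as follows (a minimal illustration, not code from the course; `knn_density` and the data are made up):

```python
import numpy as np

def knn_density(x, samples, k):
    """k-NN density estimate in 1-D: fix k, grow the window about x
    until it holds k samples, then p(x) = k / (n * V) with V = 2 * r_k."""
    samples = np.asarray(samples).ravel()
    n = samples.size
    r_k = np.sort(np.abs(samples - x))[k - 1]  # distance to k-th neighbour
    return k / (n * 2 * r_k)

rng = np.random.default_rng(2)
data = rng.standard_normal(5000)
est = knn_density(0.0, data, k=100)
# the true density of N(0, 1) at 0 is 1/sqrt(2*pi), about 0.399
```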

Page 8: Non-Parametric Learning

Parzen Window

• The Parzen window uses a window function φ(u).

• Examples:

• (i) Unit hypercube: φ(u) = 1 if |u_j| ≤ 1/2 for all components j = 1, ..., d,

and 0 otherwise.

• (ii) Gaussian in d-dimensions: φ(u) = (2π)^(-d/2) exp(-|u|²/2).
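These two window functions might be written as follows (an illustrative sketch; the function names are invented for the example):

```python
import numpy as np

def phi_hypercube(u):
    """Unit hypercube window: 1 if every component |u_j| <= 1/2, else 0."""
    u = np.atleast_2d(u)
    return np.all(np.abs(u) <= 0.5, axis=1).astype(float)

def phi_gaussian(u):
    """Unit Gaussian window in d dimensions: (2*pi)^(-d/2) exp(-|u|^2 / 2)."""
    u = np.atleast_2d(u)
    d = u.shape[1]
    return np.exp(-0.5 * np.sum(u**2, axis=1)) / (2 * np.pi) ** (d / 2)

inside = phi_hypercube(np.array([[0.2, -0.3]]))   # point inside the cube
peak = phi_gaussian(np.array([[0.0]]))            # 1/sqrt(2*pi) in 1-D
```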

Page 9: Non-Parametric Learning

Parzen Windows

• No. of samples in the hypercube centred at x with side h_n: k_n = Σ_i φ((x - x_i)/h_n).

• Volume: V_n = h_n^d.

• The estimate of the distribution is: p_n(x) = (1/n) Σ_i (1/V_n) φ((x - x_i)/h_n).

• More generally, the window interpolates the data.
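A minimal sketch of this estimator, assuming a Gaussian window (the names here are invented for the example):

```python
import numpy as np

def phi_gauss(u):
    """Unit Gaussian window in d dimensions."""
    d = u.shape[1]
    return np.exp(-0.5 * np.sum(u**2, axis=1)) / (2 * np.pi) ** (d / 2)

def parzen_estimate(x, samples, h, phi=phi_gauss):
    """Parzen estimate p_n(x) = (1/n) * sum_i (1/h^d) * phi((x - x_i)/h)."""
    samples = np.atleast_2d(samples)
    n, d = samples.shape
    u = (np.atleast_1d(x) - samples) / h
    return float(np.sum(phi(u)) / (n * h**d))

rng = np.random.default_rng(3)
data = rng.standard_normal((5000, 1))             # samples from N(0, 1)
est = parzen_estimate(np.array([0.0]), data, h=0.3)
# the true density of N(0, 1) at 0 is about 0.399; the Gaussian window
# slightly blurs the target, so the estimate sits a little below that
```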

Page 10: Non-Parametric Learning

Parzen Window Example

• Estimate a density with five modes using Gaussian windows at scales h = 1, 0.5, 0.2.
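The effect of the scale h can be reproduced numerically. The sketch below (all names and the synthetic five-mode density are made up for illustration) evaluates a Gaussian-window Parzen estimate at a mode and in a valley between modes:

```python
import numpy as np

def parzen_gauss(x, samples, h):
    """1-D Parzen estimate with a Gaussian window at scale h."""
    u = (x - samples) / h
    return float(np.mean(np.exp(-0.5 * u**2) / (np.sqrt(2 * np.pi) * h)))

rng = np.random.default_rng(1)
# five well-separated modes, 200 samples each
data = np.concatenate([rng.normal(m, 0.5, 200) for m in (-6, -3, 0, 3, 6)])

p_valley = {h: parzen_gauss(1.5, data, h) for h in (1.0, 0.5, 0.2)}
p_mode = parzen_gauss(0.0, data, 0.2)
# large h oversmooths and fills in the valley between modes;
# small h resolves the modes but gives a noisier estimate
```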

Page 11: Non-Parametric Learning

Convergence Proof.

• We will show that the Parzen window estimator converges to the true density at each point x as the number of samples increases.

Page 12: Non-Parametric Learning

Proof Strategy.

• The Parzen distribution is a random variable which depends on the samples used to estimate it.

• We have to take the expectation of the distribution with respect to the samples.

• We show that the expected value of the Parzen distribution tends to the true distribution, and that the variance of the Parzen distribution tends to 0 as the no. of samples gets large.

Page 13: Non-Parametric Learning

Convergence of the Mean

• E[p_n(x)] = ∫ (1/V_n) φ((x - v)/h_n) f(v) dv, i.e. the true density blurred by the window. As V_n → 0 the normalized window tends to a delta function, so E[p_n(x)] → f(x). Result follows.

Page 14: Non-Parametric Learning

Convergence of Variance

• Variance: Var[p_n(x)] ≤ sup(φ) E[p_n(x)] / (n V_n), which tends to 0 provided n V_n → ∞ as n → ∞.

Page 15: Non-Parametric Learning

Example of Parzen Window

• Underlying density is Gaussian. Window volume decreases as V_n = V_1/√n.
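This shrinking-window behaviour can be checked numerically; the sketch below (names invented, 1-D case) uses h_n = h_1/√n, which satisfies both h_n → 0 and n·h_n → ∞:

```python
import numpy as np

def parzen_gauss(x, samples, h):
    """1-D Parzen estimate with a Gaussian window at scale h."""
    u = (x - samples) / h
    return float(np.mean(np.exp(-0.5 * u**2) / (np.sqrt(2 * np.pi) * h)))

rng = np.random.default_rng(4)
true_p0 = 1.0 / np.sqrt(2.0 * np.pi)   # N(0, 1) density at x = 0
h1 = 1.0
errors = []
for n in (100, 10_000):
    data = rng.standard_normal(n)
    h_n = h1 / np.sqrt(n)              # window shrinks as h_1 / sqrt(n)
    errors.append(abs(parzen_gauss(0.0, data, h_n) - true_p0))
# h_n -> 0 while n * h_n -> infinity, the two conditions needed
# for the bias and the variance of the estimate to vanish
```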

Page 16: Non-Parametric Learning

Example of Parzen Window

• Underlying density is bi-modal.

Page 17: Non-Parametric Learning

Parzen Window and Interpolation.

• In practice, we do not have an infinite number of samples.

• The choice of window shape is important: the window effectively interpolates between the data points.

• If the window shape fits the local structure of the density, then Parzen windows are effective.

