Transcript
Page 1

Pattern Recognition:Statistical and Neural

Lonnie C. Ludeman

Lecture 8

Sept 23, 2005

Nanjing University of Science & Technology

Page 2

[Figure slide; the only recoverable text is the caption "May be Optimum".]

Page 3

Review 2: Classifier Performance Measures

1. A Posteriori Probability (Maximize)

2. Probability of Error (Minimize)

3. Bayes Average Cost (Minimize; its definition is sketched after this list)

4. Probability of Detection (Maximize with fixed Probability of False Alarm) (Neyman-Pearson Rule)

5. Losses (Minimize the maximum)
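
For reference, the Bayes average cost in item 3 is a standard expected-cost criterion (the definition below is the textbook form, not spelled out on this slide), with $C_{ij}$ the cost of deciding $C_i$ when $C_j$ is true:

$$R = \sum_{i=1}^{2} \sum_{j=1}^{2} C_{ij}\, P(\mathrm{decide}\ C_i \mid C_j)\, P(C_j)$$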

Page 4

Review 3: MAP, MPE, and Bayes Classification Rules

If

$$l(\mathbf{x}) = \frac{p(\mathbf{x} \mid C_1)}{p(\mathbf{x} \mid C_2)} \underset{C_2}{\overset{C_1}{\gtrless}} N$$

where $l(\mathbf{x})$ is the likelihood ratio and $N$ is the threshold:

$$N_{\mathrm{BAYES}} = \frac{(C_{22} - C_{12})\, P(C_2)}{(C_{11} - C_{21})\, P(C_1)}, \qquad N_{\mathrm{MAP}} = \frac{P(C_2)}{P(C_1)}, \qquad N_{\mathrm{MPE}} = \frac{P(C_2)}{P(C_1)}$$
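
All three rules share the same likelihood-ratio structure and differ only in the threshold. A minimal sketch in Python (the function and argument names are mine, not from the lecture):

```python
def likelihood_ratio_test(x, p1, p2, threshold):
    """Decide class 1 if l(x) = p1(x)/p2(x) exceeds the threshold N, else class 2.

    p1, p2   : class-conditional densities p(x|C1) and p(x|C2), as callables
    threshold: N_MAP = N_MPE = P(C2)/P(C1), or the Bayes threshold
               (C22 - C12) P(C2) / ((C11 - C21) P(C1))
    """
    return 1 if p1(x) / p2(x) > threshold else 2
```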

Page 5

Topics for Lecture 8

1. Two-dimensional problem

2. Solution in likelihood space

3. Solution in pattern space

4. Solution in feature space

5. Calculation of probability of error

6. Transformational Theorem

Page 6

Example: 2 Classes and 2 Observations

Given:

C1: x = [x1, x2]^T ~ p(x1, x2 | C1), with prior P(C1)
C2: x = [x1, x2]^T ~ p(x1, x2 | C2), with prior P(C2)

C1: x ~ N(M1, K1)
C2: x ~ N(M2, K2)

with

$$M_1 = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \quad M_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad K_1 = K_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

so that under C1 each component is N(0,1) and under C2 each component is N(1,1), as used on pages 11 and 12.

Find the optimum decision rule (MPE), with P(C1) = P(C2) = 1/2.

Page 7

Page 8
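
Pages 7 and 8 carried no recoverable text; presumably they formed the likelihood ratio for this example. A sketch of that step, using the densities defined on page 6:

$$p(\mathbf{x} \mid C_1) = \frac{1}{2\pi}\, e^{-\frac{1}{2}(x_1^2 + x_2^2)}, \qquad p(\mathbf{x} \mid C_2) = \frac{1}{2\pi}\, e^{-\frac{1}{2}\left[(x_1 - 1)^2 + (x_2 - 1)^2\right]}$$

$$l(\mathbf{x}) = \frac{p(\mathbf{x} \mid C_1)}{p(\mathbf{x} \mid C_2)} = \exp\!\left\{ \tfrac{1}{2}\left[ (x_1 - 1)^2 + (x_2 - 1)^2 - x_1^2 - x_2^2 \right] \right\} = e^{-(x_1 + x_2 - 1)} \underset{C_2}{\overset{C_1}{\gtrless}} N_{\mathrm{MPE}} = 1$$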

Page 9

Taking the ln of both sides gives an equivalent rule.

Solution in different spaces:

$$\text{If } -(x_1 + x_2 - 1) \underset{C_2}{\overset{C_1}{\gtrless}} 0$$

In observation space, rearranging gives:

$$\text{If } x_1 + x_2 \underset{C_1}{\overset{C_2}{\gtrless}} 1$$

In feature space, with $y = g(x_1, x_2) = x_1 + x_2$:

$$\text{If } y \underset{C_1}{\overset{C_2}{\gtrless}} 1$$
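
A quick numerical sanity check (my addition, not on the slides): the feature-space rule should agree with a direct comparison of the two class-conditional densities. A sketch in Python:

```python
import numpy as np

rng = np.random.default_rng(0)

def classify_feature(x):
    """MPE rule in feature space: compare y = x1 + x2 with the threshold 1."""
    return 2 if x[0] + x[1] > 1 else 1

def classify_likelihood(x):
    """Direct MPE rule: decide C1 when ln l(x) = -(x1 + x2 - 1) > 0 (equal priors)."""
    m2 = np.array([1.0, 1.0])
    log_l = 0.5 * (np.sum((x - m2) ** 2) - np.sum(x ** 2))  # ln l(x)
    return 1 if log_l > 0 else 2

points = rng.normal(size=(1000, 2))
assert all(classify_feature(p) == classify_likelihood(p) for p in points)
```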

Page 10

In Observation Space

[Figure: the (x1, x2) plane with the boundary line x1 + x2 = 1 crossing the axes at x1 = 1 and x2 = 1; decide C1 below the line, decide C2 above it.]

In Feature Space (y = x1 + x2 is a sufficient statistic for this problem)

[Figure: the y axis with the threshold at y = 1; decide C1 for y < 1, decide C2 for y > 1.]

Page 11

Calculation of P(error | C1) for the 2-dimensional example, in y space:

$$P(\mathrm{error} \mid C_1) = P(\mathrm{decide}\ C_2 \mid C_1) = \int_{R_2} p(y \mid C_1)\, dy = \int_{1}^{\infty} \frac{1}{2\sqrt{\pi}}\, e^{-y^2/4}\, dy$$

Under C1, x1 and x2 are independent Gaussian random variables, each N(0,1); thus y = x1 + x2 is distributed N(0,2).
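
Standardizing with $z = y/\sqrt{2}$ turns this into the Gaussian tail function (a step added here for concreteness):

$$P(\mathrm{error} \mid C_1) = \int_{1/\sqrt{2}}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = Q\!\left( \frac{1}{\sqrt{2}} \right) \approx 0.2398$$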

Page 12

Calculation of P(error | C2) for the 2-dimensional example, in y space:

$$P(\mathrm{error} \mid C_2) = P(\mathrm{decide}\ C_1 \mid C_2) = \int_{R_1} p(y \mid C_2)\, dy = \int_{-\infty}^{1} \frac{1}{2\sqrt{\pi}}\, e^{-(y-2)^2/4}\, dy$$

Under C2, x1 and x2 are independent Gaussian random variables, each N(1,1); thus y = x1 + x2 is distributed N(2,2).

Page 13

Probability of error for the example:

$$P(\mathrm{error}) = P(\mathrm{error} \mid C_1)\, P(C_1) + P(\mathrm{error} \mid C_2)\, P(C_2)$$

$$= \left[ \int_{1}^{\infty} \frac{1}{2\sqrt{\pi}}\, e^{-y^2/4}\, dy \right] P(C_1) + \left[ \int_{-\infty}^{1} \frac{1}{2\sqrt{\pi}}\, e^{-(y-2)^2/4}\, dy \right] P(C_2)$$
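
A numeric check of the two integrals, sketched with scipy and the equal priors from page 6 (by symmetry each conditional error equals Q(1/√2)):

```python
from math import sqrt
from scipy.stats import norm

# y = x1 + x2 is N(0, 2) under C1 and N(2, 2) under C2; std dev is sqrt(2)
p_err_c1 = norm.sf(1, loc=0, scale=sqrt(2))   # P(y > 1 | C1), ~0.2398
p_err_c2 = norm.cdf(1, loc=2, scale=sqrt(2))  # P(y < 1 | C2), ~0.2398

p_error = 0.5 * p_err_c1 + 0.5 * p_err_c2     # P(C1) = P(C2) = 1/2
print(p_error)                                # ~0.2398
```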

Page 14

Transformational Theorem

Given: X is a random variable with known probability density function $p_X(x)$, and $y = g(x)$ is a real-valued function with no flat spots.

Define the random variable $Y = g(X)$. Then the probability density function of Y is

$$p_Y(y) = \sum_{\mathrm{all}\ x_i} \frac{p_X(x_i)}{\left| \frac{dg(x)}{dx} \right|_{x = x_i}}$$

where the $x_i$ are all real roots of $y = g(x)$.
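
As a quick illustration (my example, not from the slides): for a linear function $y = g(x) = ax + b$ with $a \neq 0$, every y has the single root $x_1 = (y - b)/a$ and $|dg/dx| = |a|$, so the theorem reduces to

$$p_Y(y) = \frac{1}{|a|}\, p_X\!\left( \frac{y - b}{a} \right)$$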

Page 15

Example: Transformational Theorem

Given: X ~ N(0,1)

Define the function: $y = x^2$

Define the random variable: $Y = X^2$

Find the probability density function $p_Y(y)$.

Page 16

Solution:

[Figure: the parabola y = x^2 in the (x, y) plane, with the two roots x1 and x2 marked for a given y > 0.]

For y > 0 there are two real roots of $y = x^2$, given by $x_1 = -\sqrt{y}$ and $x_2 = \sqrt{y}$.

For y < 0 there are no real roots of $y = x^2$; therefore $p_Y(y) = 0$ for those values of y.

Page 17

Apply the Transformational Theorem:

$$p_Y(y) = \sum_{\mathrm{all}\ x_i} \frac{p_X(x_i)}{\left| \frac{dg(x)}{dx} \right|_{x = x_i}} \quad (= 0 \text{ if there are no real roots})$$

Here $\dfrac{dg(x)}{dx} = 2x$, so for y > 0:

$$p_Y(y) = \frac{p_X(x_1)}{|2x_1|} + \frac{p_X(x_2)}{|2x_2|} = \frac{p_X(-\sqrt{y})}{2\sqrt{y}} + \frac{p_X(\sqrt{y})}{2\sqrt{y}} = \frac{1}{2\sqrt{y}} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-(-\sqrt{y})^2/2} + \frac{1}{2\sqrt{y}} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-(\sqrt{y})^2/2}$$

Page 18

Final answer:

$$p_Y(y) = \frac{e^{-y/2}}{\sqrt{2\pi y}}\, u(y)$$

where u(y) is the unit step function.
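
A quick empirical check of this density (my addition), comparing a histogram of $X^2$ samples against the closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
y_samples = rng.normal(size=1_000_000) ** 2        # Y = X^2 with X ~ N(0, 1)

def p_y(y):
    """Density derived above: exp(-y/2) / sqrt(2*pi*y), valid for y > 0."""
    return np.exp(-y / 2) / np.sqrt(2 * np.pi * y)

counts, edges = np.histogram(y_samples, bins=50, range=(0.1, 5.0))
width = edges[1] - edges[0]
empirical = counts / (y_samples.size * width)      # density estimate per bin
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(empirical - p_y(centers))))    # small, up to binning/sampling error
```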

Page 19

Summary for Lecture 8

1. Two-dimensional problem

2. Solution in likelihood space

3. Solution in pattern space

4. Solution in feature space

5. Calculation of probability of error

6. Transformational Theorem

Page 20

End of Lecture 8

