Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman
Lecture 13
Oct 14, 2005
Nanjing University of Science & Technology
Lecture 13 Topics
1. Multiple-observation, multiple-class example (review): sufficient statistic space and likelihood ratio space
2. Calculation of P(error) for the 2-class case: several special cases
3. P(error) calculation examples for special cases (2-class case)
Example 1: Multiple observation - multiple classes
Given: the pattern vector x is composed of N independent observations of a Gaussian random variable X, with the class conditional densities for each component as follows
A zero-one cost function is given as
Find: (a) the Bayes decision rule in a sufficient statistic space, and (b) the Bayes decision rule in a space of likelihood ratios
Solution:
(a) Since the observations are independent the joint conditional density is a product of the marginal densities and given by
for i = 1, 2, 3, with means mi = i
The Bayes decision rule is determined from a set of yi(x) defined for M = 3 by
Substituting the given properties gives
The region to decide C1 is found by setting the following inequalities
Therefore the region R1 to decide C1 reduces to the x that satisfy

Similarly the regions R2 and R3 become
Substituting the conditional densities, taking the ln of both sides and simplifying the decision rule reduces to regions in a sufficient statistic s space as follows
The resulting decision regions are shown below in the sufficient statistic s space.
An intuitively pleasing result!
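As a concrete sketch of this rule: with unit-variance components and means mi = i as in the example, plus equal priors and zero-one costs (assumptions made here for illustration), the Bayes rule reduces to computing the sample mean s = (1/N) Σ xk and deciding the class whose mean is closest. The function name below is illustrative, not from the slides.

```python
import numpy as np

def classify_sufficient_statistic(x, means=(1.0, 2.0, 3.0)):
    """Decide among three Gaussian classes from N i.i.d. observations.

    Sketch assuming unit-variance components, equal priors, and zero-one
    costs, so the Bayes rule reduces to picking the class whose mean is
    closest to the sample mean s = (1/N) * sum(x)."""
    s = np.mean(x)                             # the sufficient statistic
    distances = [abs(s - m) for m in means]    # distance to each class mean
    return int(np.argmin(distances)) + 1       # class label 1, 2, or 3
```

Under these assumptions the region boundaries fall at s = 1.5 and s = 2.5, the midpoints between adjacent class means.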
where

yi(x) = Σ_{j=1}^{M} Cij p(x | Cj) P(Cj)

If yi(x) < yj(x) for all j ≠ i, then decide x is from Ci
(b) Bayes Decision Rule in Likelihood ratio space: M-Class Case derivation
We know that Bayes Decision Rule for the M-Class Case is
LM(x) = p(x | CM) / p(x | CM) = 1
Dividing through by p(x | CM) gives sufficient statistics vi(x) as follows
Therefore the decision rule becomes
Bayes Decision Rule in the Likelihood Ratio Space
The dimension of the Likelihood Ratio Space is always one less than the number of classes ( M - 1)
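A sketch of this mapping for the three-class example (M = 3, so the likelihood ratio space is two-dimensional): each class likelihood is divided by that of the reference class CM. The function names and the unit-variance assumption below are illustrative, not from the slides.

```python
import numpy as np
from math import exp, pi, sqrt

def gaussian_pdf(x, mean, var=1.0):
    """Univariate Gaussian density (unit variance assumed by default)."""
    return exp(-(x - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)

def likelihood_ratios(x, means=(1.0, 2.0, 3.0)):
    """Map observations x into the (M-1)-dimensional likelihood ratio
    space: v_i(x) = p(x | Ci) / p(x | CM) for i = 1, ..., M-1."""
    likes = [np.prod([gaussian_pdf(xk, m) for xk in x]) for m in means]
    return [L / likes[-1] for L in likes[:-1]]

def decide(x, priors=(1/3, 1/3, 1/3), means=(1.0, 2.0, 3.0)):
    """Bayes decision under zero-one costs, expressed in the ratio
    space, using v_M(x) = 1 for the reference class."""
    v = likelihood_ratios(x, means) + [1.0]
    scores = [vi * p for vi, p in zip(v, priors)]
    return int(np.argmax(scores)) + 1
```

Maximizing vi(x) P(Ci) is equivalent to the MAP rule, since dividing every score by p(x | CM) does not change which class wins.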
Back to Example: Define the likelihood ratios as
Dividing both sides of the inequalities by p(x | C3) gives the following equations in the likelihood ratio space for determining C1
We have already determined the region to decide C1 as
The other regions are determined in the same fashion giving the decision regions in the likelihood ratio space
Calculation of Probability of error for the 2-class Gaussian Cases
We know the optimum Bayes decision rule is given by
Special Case 1:
The sufficient statistic Z conditioned on C1 has the following mean and variance
thus under C1 we have:
a1 =
v1 =
Z ~ N(a1, v1)
The conditional variance becomes
Similarly the conditional mean and variance under class C2 are
The statistic Z under class C2 is Gaussian and given by
thus under C2 we have:
a2 =
v2 =
Z ~ N(a2, v2)
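The specific expressions for ai and vi are not reproduced in this transcript, but when Z is a linear statistic Z = hᵀx of a Gaussian vector x (a common form in the two-class Gaussian case), its conditional moments follow directly. A minimal sketch, with h, the mean, and the covariance as placeholders for the quantities on the slides:

```python
import numpy as np

def conditional_moments(h, mean, cov):
    """Mean a_i and variance v_i of the linear statistic Z = h^T x
    when x ~ N(mean, cov) under class Ci.

    Sketch assuming Z is linear in x; h, mean, and cov stand in for
    whatever the slides' expressions specify."""
    h = np.asarray(h, dtype=float)
    a = float(h @ np.asarray(mean, dtype=float))     # a_i = h^T m_i
    v = float(h @ np.asarray(cov, dtype=float) @ h)  # v_i = h^T Sigma_i h
    return a, v
```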
Determination of the P(error)
The total Probability Theorem states
where
Since the scalar Z is Gaussian, the error conditioned on C1 becomes:
Similarly the error conditioned on C2 becomes
Finally, the total P(error) for Special Case 1 becomes
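The total P(error) above can be evaluated numerically with the Gaussian tail function Q. A minimal sketch, assuming a1 < a2 and a scalar decision threshold t from the Bayes rule (the threshold value itself depends on the costs and priors of the case at hand):

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1)."""
    return 0.5 * erfc(x / sqrt(2.0))

def p_error(t, a1, v1, a2, v2, p1=0.5, p2=0.5):
    """Total error probability via the total probability theorem when
    Z | C1 ~ N(a1, v1), Z | C2 ~ N(a2, v2), and the rule decides C2
    for Z > t (assumes a1 < a2)."""
    err_c1 = Q((t - a1) / sqrt(v1))        # P(decide C2 | C1 true)
    err_c2 = 1.0 - Q((t - a2) / sqrt(v2))  # P(decide C1 | C2 true)
    return p1 * err_c1 + p2 * err_c2
```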
Special case 2: Equal scaled identity Covariance matrices
Using the previous formula the P(error) reduces to
where
(Euclidean distance between the means)
Special Case 3: Zero-one Bayes costs and equal a priori probabilities
Using the previous formula for P(error) gives:
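For this combination of assumptions (zero-one costs, equal priors, covariances σ²I), the standard closed form is P(error) = Q(d / 2σ), with d the Euclidean distance between the class means. A sketch with a Monte Carlo sanity check; the example means are hypothetical, not from the slides:

```python
import numpy as np
from math import erfc, sqrt

def Q(x):
    """Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1)."""
    return 0.5 * erfc(x / sqrt(2.0))

def p_error_equal_priors(m1, m2, sigma=1.0):
    """P(error) = Q(d / (2*sigma)) for two Gaussian classes with
    covariance sigma^2 * I, zero-one costs, and equal priors."""
    d = np.linalg.norm(np.asarray(m1, float) - np.asarray(m2, float))
    return Q(d / (2.0 * sigma))

# Monte Carlo check of the closed form (hypothetical class means)
rng = np.random.default_rng(0)
m1, m2 = np.array([0.0, 0.0]), np.array([2.0, 0.0])
n = 200_000
x1 = rng.normal(m1, 1.0, size=(n, 2))   # samples from class 1
x2 = rng.normal(m2, 1.0, size=(n, 2))   # samples from class 2
w = m2 - m1                             # normal to the decision boundary
t = ((m1 + m2) / 2) @ w                 # midpoint threshold (equal priors)
err = 0.5 * np.mean(x1 @ w > t) + 0.5 * np.mean(x2 @ w < t)
```

With equal priors the Bayes boundary is the perpendicular bisector of the segment joining the means, which is what the midpoint threshold implements.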
Special Case 4:
Then
Example: Calculation of probability of Error
Given:
Find: P(error) for the following assumptions
(a)
Solution:
(b)
Solution:
Substituting the above into the P(error) gives:
(c)
Solution:
Substituting the above into the P(error) gives:
(d)
Solution:
Substituting the above into the P(error) for the case of equal covariance matrices gives:
(d) Solution Continued:
Lecture 13 Summary
1. Multiple-observation, multiple-class example (review): sufficient statistic space and likelihood ratio space
2. Calculation of P(error) for the 2-class case: special cases
3. P(error) calculation examples for special cases (2-class case)
End of Lecture 13