Chapter 8 Fuzzy Associative Memories Li Lin 2004-11-24.



CONTENTS
1. Review
2. Fuzzy systems as between-cube mappings
3. Fuzzy and neural function estimators
4. Fuzzy Hebb FAMs
5. Adaptive FAMs

Review

Chapter 2 introduced the BAM theorem. Chapter 7 discussed fuzzy sets as points in the unit hypercube. What are associative memories?

Fuzzy systems

Kosko: fuzzy systems as between-cube mappings S: I^n → I^p.

[Fig. 1: A fuzzy system, mapping the input universe of discourse (I^n) to the output universe of discourse (I^p).]

A continuous fuzzy system behaves as an associative memory: a fuzzy associative memory (FAM).

Fuzzy and neural function estimators

Fuzzy and neural systems estimate sampled functions and behave as associative memories.

Similarities:
1. Both are model-free estimators
2. Both learn from samples
3. Both are numerical, unlike symbolic AI

Differences: They differ in how they estimate the sampled function:
1. During system construction
2. The kind of samples used
3. Application
4. How they represent and store those samples
5. How they make associative inferences

[Fig. 2: Function f maps domain X to range Y.]

Differences: neural vs. fuzzy representation of structured knowledge

Neural network problems:
1. Computational burden of training
2. System inscrutability: there is no natural inferential audit trail; the network behaves as a computational black box
3. Sample generation

Neural vs. fuzzy representation of structured knowledge

Fuzzy systems:
1. Directly encode the linguistic sample (HEAVY, LONGER) in a matrix
2. Combine the numerical approach with the symbolic one

The fuzzy approach does not abandon neural networks; it limits them to unstructured parameter and state estimation, pattern recognition, and cluster formation.

FAMs as mappings

Fuzzy associative memories are transformations: a FAM maps fuzzy sets to fuzzy sets, unit cube to unit cube. The associative matrices are accessed in parallel and stored separately. Numerical point inputs permit a simplification: binary input-output FAMs, or BIOFAMs.

FAMs as mappings

[Fig. 3: Three possible fuzzy subsets of the traffic-density space X (Light, Medium, Heavy) and of the green-light-duration space Y (Short, Medium, Long).]

Fuzzy vector-matrix multiplication: max-min composition

Max-min composition "∘": B = A ∘ M, where A = (a_1, ..., a_n), B = (b_1, ..., b_p), and M is a fuzzy n-by-p matrix (a point in I^(np)):

b_j = max_{1<=i<=n} min(a_i, m_ij)

Fuzzy vector-matrix multiplication: max-min composition

Example: Suppose A = (.3 .4 .8 1) and

M = [ .2  .8  .7 ]
    [ .7  .6  .6 ]
    [ .8  .1  .5 ]
    [ .0  .2  .3 ]

Then B = A ∘ M = (.8 .4 .5).

Max-product composition replaces min with the ordinary product:

b_j = max_{1<=i<=n} a_i m_ij
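The worked example above can be checked in a few lines of Python. This is a minimal sketch (function and variable names are my own, not from the chapter):

```python
def max_min_compose(A, M):
    """Max-min composition B = A o M: b_j = max_i min(a_i, m_ij)."""
    return [max(min(a, row[j]) for a, row in zip(A, M))
            for j in range(len(M[0]))]

def max_product_compose(A, M):
    """Max-product composition: replaces min with the ordinary product."""
    return [max(a * row[j] for a, row in zip(A, M))
            for j in range(len(M[0]))]

A = [.3, .4, .8, 1]
M = [[.2, .8, .7],
     [.7, .6, .6],
     [.8, .1, .5],
     [.0, .2, .3]]

print(max_min_compose(A, M))  # [0.8, 0.4, 0.5], matching B above
```

Note that max-product generally yields a different (pointwise smaller or equal) result than max-min on the same A and M.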

Fuzzy Hebb FAMs

Classical Hebbian learning law: ṁ_ij = -m_ij + S_i(x_i) S_j(y_j)

Correlation-minimum encoding: M = A^T ∘ B, with m_ij = min(a_i, b_j)

Example: With A = (.3 .4 .8 1) and B = (.8 .4 .5),

M = A^T ∘ B = [ .3  .3  .3 ]
              [ .4  .4  .4 ]
              [ .8  .4  .5 ]
              [ .8  .4  .5 ]
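Correlation-minimum encoding and recall can be sketched as follows (a toy illustration in plain Python; the helper names are my own):

```python
def correlation_min_encode(A, B):
    """Fuzzy Hebb matrix M = A^T o B: m_ij = min(a_i, b_j)."""
    return [[min(a, b) for b in B] for a in A]

def max_min_compose(A, M):
    """Max-min composition: b_j = max_i min(a_i, m_ij)."""
    return [max(min(a, row[j]) for a, row in zip(A, M))
            for j in range(len(M[0]))]

A = [.3, .4, .8, 1]
B = [.8, .4, .5]
M = correlation_min_encode(A, B)

print(M)                      # the four rows shown in the example above
print(max_min_compose(A, M))  # [0.8, 0.4, 0.5]: recalls B exactly
```

Recall is exact here because A is normal (H(A) = 1 >= H(B)), as the bidirectional FAM theorem below requires.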

The bidirectional FAM theorem for correlation-minimum encoding

The height of fuzzy set A: H(A) = max_{1<=i<=n} a_i. Fuzzy set A is normal if H(A) = 1.

Correlation-minimum bidirectional FAM theorem: If M = A^T ∘ B, then:
(i)   A ∘ M = B     iff H(A) >= H(B)
(ii)  B ∘ M^T = A   iff H(B) >= H(A)
(iii) A' ∘ M ⊂ B    for any A'
(iv)  B' ∘ M^T ⊂ A  for any B'
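A quick numerical check of the theorem on the chapter's example pair, where H(A) = 1 but H(B) = .8 (a sketch; the function names are my own):

```python
def encode_min(A, B):
    """Correlation-minimum encoding M = A^T o B."""
    return [[min(a, b) for b in B] for a in A]

def compose(A, M):
    """Max-min composition."""
    return [max(min(a, row[j]) for a, row in zip(A, M))
            for j in range(len(M[0]))]

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [.3, .4, .8, 1]   # H(A) = 1
B = [.8, .4, .5]      # H(B) = .8
M = encode_min(A, B)

print(compose(A, M))             # [0.8, 0.4, 0.5] = B, since H(A) >= H(B)
print(compose(B, transpose(M)))  # [0.3, 0.4, 0.8, 0.8]: a subset of A,
                                 # but not A itself, since H(B) < H(A)
```

The backward pass illustrates (ii) and (iv): B ∘ M^T is contained in A but only equals A when H(B) >= H(A).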

The bidirectional FAM theorem for correlation-minimum encoding

Proof: A ∘ A^T = max_{1<=i<=n} min(a_i, a_i) = max_{1<=i<=n} a_i = H(A)

Then A ∘ M = A ∘ (A^T ∘ B) = (A ∘ A^T) ∘ B = H(A) ∘ B = H(A) ∧ B

So A ∘ M = H(A) ∧ B = B iff H(A) >= H(B).

Correlation-product encoding

Correlation-product encoding provides an alternative fuzzy Hebbian encoding scheme: M = A^T B, with m_ij = a_i b_j.

Example: With A = (.3 .4 .8 1) and B = (.8 .4 .5),

M = A^T B = [ .24  .12  .15 ]
            [ .32  .16  .20 ]
            [ .64  .32  .40 ]
            [ .80  .40  .50 ]

Correlation-product encoding preserves more information than correlation-minimum encoding.
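The correlation-product matrix is just the outer product of the two fit vectors; a minimal sketch (names my own):

```python
def correlation_product_encode(A, B):
    """M = A^T B: m_ij = a_i * b_j (outer product of fit vectors)."""
    return [[a * b for b in B] for a in A]

A = [.3, .4, .8, 1]
B = [.8, .4, .5]
M = correlation_product_encode(A, B)

for row in M:
    print([round(m, 2) for m in row])  # the four rows shown above
```

Unlike correlation-minimum, the rows here scale continuously with a_i rather than clipping at it, which is why this encoding preserves more of B's shape.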

Correlation-product encoding

Correlation-product bidirectional FAM theorem: If M = A^T B and A and B are nonnull fit vectors, then:
(i)   A ∘ M = B     iff H(A) = 1
(ii)  B ∘ M^T = A   iff H(B) = 1
(iii) A' ∘ M ⊂ B    for any A'
(iv)  B' ∘ M^T ⊂ A  for any B'

FAM system architecture

[Figure: A FAM system. A fit-vector input A feeds the m FAM rules (A_1, B_1), (A_2, B_2), ..., (A_m, B_m) in parallel; each rule k emits a recalled vector B_k' with weight w_k; the weighted outputs are summed into B, and a defuzzifier produces the final crisp output y_j.]

Superimposing FAM rules

Suppose there are m FAM rules or associations (A_k, B_k). The natural neural-network approach maximums or adds the m associative matrices M_k into a single matrix M:

M = max_{1<=k<=m} M_k, or M = Σ_k M_k

This superimposition scheme fails for fuzzy Hebbian encoding. The fuzzy approach to the superimposition problem additively superimposes the m recalled vectors B_k' = A ∘ M_k instead of the fuzzy Hebb matrices M_k.
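The failure of matrix superimposition can be shown on a pair of toy rules. The rule values below are hypothetical, chosen only so the two rules overlap:

```python
def encode(A, B):
    """Correlation-minimum encoding."""
    return [[min(a, b) for b in B] for a in A]

def compose(A, M):
    """Max-min composition."""
    return [max(min(a, row[j]) for a, row in zip(A, M))
            for j in range(len(M[0]))]

# Two hypothetical overlapping FAM rules (A_k, B_k)
A1, B1 = [.9, .3], [.9, .2]
A2, B2 = [.2, .8], [.3, .9]
M1, M2 = encode(A1, B1), encode(A2, B2)

# Naive superimposition: pointwise maximum of the Hebb matrices
M = [[max(x, y) for x, y in zip(r1, r2)] for r1, r2 in zip(M1, M2)]

print(compose(A1, M1))  # [0.9, 0.2]: separate storage recalls B1 exactly
print(compose(A1, M))   # [0.9, 0.3]: the single matrix shows crosstalk
```

The second component is corrupted from .2 to .3 by rule 2's entries, which is why the chapter stores the matrices separately and superimposes only the recalled vectors.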

Superimposing FAM rules

Disadvantages: separate storage of FAM associations consumes space.

Advantages:
1. Provides an "audit trail" of the FAM inference procedure
2. Avoids crosstalk
3. Provides knowledge-base modularity
4. A fit-vector input A activates all the FAM rules in parallel, but to different degrees


Recalled outputs and "defuzzification"

The recalled output B equals a weighted sum of the individual recalled vectors:

B = Σ_{k=1}^{m} w_k B_k'

How to defuzzify?
1. Maximum-membership defuzzification: m_B(y_max) = max_{1<=j<=p} m_B(y_j)
Simple, but it has two fundamental problems:
① the mode of the B distribution is not unique
② it ignores the information in the waveform B

Recalled outputs and "defuzzification"

2. Fuzzy centroid defuzzification:

ȳ = ( Σ_{j=1}^{p} y_j m_B(y_j) ) / ( Σ_{j=1}^{p} m_B(y_j) )

The fuzzy centroid is unique and uses all the information in the output distribution B.
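The centroid formula is a one-liner; here is a sketch on a made-up output distribution (the sample values over green-light durations are hypothetical):

```python
def fuzzy_centroid(ys, mB):
    """Centroid ybar = sum(y_j * m_B(y_j)) / sum(m_B(y_j))."""
    num = sum(y * m for y, m in zip(ys, mB))
    den = sum(mB)
    return num / den

# Hypothetical output distribution B over green-light durations (seconds)
ys = [0, 10, 20, 30, 40]
mB = [.1, .4, .8, .4, .1]

print(round(fuzzy_centroid(ys, mB), 6))  # 20.0, the unique centroid
```

Because the distribution is symmetric about 20 s, the centroid lands exactly on its mode; an asymmetric B would pull the centroid toward the heavier side, using all of the waveform's information as the slide claims.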

Thank you!