
Ming-Feng Yeh

CHAPTER 13
Associative Learning

Objectives

Neural networks trained in a supervised manner require a target signal to define correct network behavior. Unsupervised learning rules, in contrast, give networks the ability to learn associations between patterns that occur together frequently. Associative learning allows networks to perform useful tasks such as pattern recognition (instar) and pattern recall (outstar).

What is an Association?

An association is any link between a system's input and output such that when a pattern A is presented to the system, it responds with pattern B. When two patterns are linked by an association, the input pattern is referred to as the stimulus and the output pattern is referred to as the response.

Classic Experiment

Ivan Pavlov trained a dog to salivate at the sound of a bell by ringing the bell whenever food was presented. After the bell is repeatedly paired with the food, the dog becomes conditioned to salivate at the sound of the bell, even when no food is present.

B. F. Skinner trained a rat to press a bar in order to obtain a food pellet.

Associative Learning

Anderson and Kohonen independently developed the linear associator in the late 1960s and early 1970s. Grossberg introduced nonlinear continuous-time associative networks during the same period.

Simple Associative Network

Single-Input Hard-Limit Associator. Restrict the value of $p$ to be either 0 or 1, indicating whether a stimulus is absent or present. The output $a$ indicates the presence or absence of the network's response:

$a = \mathrm{hardlim}(wp + b) = \mathrm{hardlim}(wp - 0.5)$, with $w = 1$ and $b = -0.5$

$p = 1$ (stimulus) or $0$ (no stimulus)
$a = 1$ (response) or $0$ (no response)
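The slides contain no code, but the single-input associator above is small enough to sketch directly in Python (the function names `hardlim` and `associator` are my own):

```python
# Single-input hard-limit associator: a = hardlim(w*p + b),
# with w = 1 and b = -0.5 as on the slide.

def hardlim(n):
    """Hard-limit transfer function: 1 if the net input is >= 0, else 0."""
    return 1 if n >= 0 else 0

def associator(p, w=1.0, b=-0.5):
    """Respond (a = 1) exactly when the stimulus p = 1 is present."""
    return hardlim(w * p + b)

print(associator(1))  # stimulus present -> 1 (response)
print(associator(0))  # stimulus absent  -> 0 (no response)
```

With $w = 1$ the net input is $p - 0.5$, so the hard limit fires only for $p = 1$.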

Two Types of Inputs

Unconditioned Stimulus: analogous to the food presented to the dog in Pavlov's experiment.
Conditioned Stimulus: analogous to the bell in Pavlov's experiment.
The dog salivates only when food is presented. This is an innate reflex that does not have to be learned.

Banana Associator

An unconditioned stimulus (banana shape) and a conditioned stimulus (banana smell). Initially the network responds to the shape of a banana, but not to the smell:

$a = \mathrm{hardlim}(w^0 p^0 + wp + b)$, with $w^0 = 1$, $w = 0$, $b = -0.5$

$p^0 = 1$ (shape detected) or $0$ (shape not detected)
$p = 1$ (smell detected) or $0$ (smell not detected)

Associative Learning

Both animals and humans tend to associate things that occur simultaneously. If a banana smell stimulus occurs simultaneously with a banana concept response (activated by some other stimulus, such as the sight of a banana shape), the network should strengthen the connection between them so that it can later activate its banana concept in response to the banana smell alone.

Unsupervised Hebb Rule

Increase the weight $w_{ij}$ between a neuron's input $p_j$ and output $a_i$ in proportion to their product:

$w_{ij}(q) = w_{ij}(q-1) + \alpha a_i(q) p_j(q)$

The Hebb rule uses only signals available within the layer containing the weight being updated, so it is a local learning rule. Vector form:

$\mathbf{W}(q) = \mathbf{W}(q-1) + \alpha \mathbf{a}(q) \mathbf{p}^T(q)$

Learning is performed in response to the training sequence $\mathbf{p}(1), \mathbf{p}(2), \ldots, \mathbf{p}(Q)$.
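The vector form of the rule is one outer-product update per presentation. A minimal sketch with NumPy (the helper name `hebb_update` is my own):

```python
import numpy as np

def hebb_update(W, a, p, alpha=1.0):
    """Unsupervised Hebb rule: W(q) = W(q-1) + alpha * a(q) p(q)^T."""
    a = np.atleast_1d(np.asarray(a, dtype=float))
    p = np.atleast_1d(np.asarray(p, dtype=float))
    return W + alpha * np.outer(a, p)

# One update: output a = 1 coincides with input p = 1,
# so the single weight grows by alpha.
W = np.zeros((1, 1))
W = hebb_update(W, a=1, p=1)
print(W)  # [[1.]]
```

Note there is no target signal anywhere: only the locally available input and output enter the update.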

Ex: Banana Associator

Initial weights: $w^0 = 1$, $w(0) = 0$.
Training sequence: $\{p^0(1) = 0, p(1) = 1\}, \{p^0(2) = 1, p(2) = 1\}, \ldots$
Learning rule: $w(q) = w(q-1) + a(q) p(q)$, with $\alpha = 1$.

[Figure: the sight (shape) and smell inputs feed the fruit network, which answers "Banana?"]

Ex: Banana Associator

First iteration (sight fails):
$a(1) = \mathrm{hardlim}(w^0 p^0(1) + w(0) p(1) - 0.5) = \mathrm{hardlim}(1 \cdot 0 + 0 \cdot 1 - 0.5) = 0$ (no response)
$w(1) = w(0) + a(1) p(1) = 0 + 0 \cdot 1 = 0$

Second iteration (sight works):
$a(2) = \mathrm{hardlim}(w^0 p^0(2) + w(1) p(2) - 0.5) = \mathrm{hardlim}(1 \cdot 1 + 0 \cdot 1 - 0.5) = 1$ (banana)
$w(2) = w(1) + a(2) p(2) = 0 + 1 \cdot 1 = 1$

Ex: Banana Associator

Third iteration (sight fails):
$a(3) = \mathrm{hardlim}(w^0 p^0(3) + w(2) p(3) - 0.5) = \mathrm{hardlim}(1 \cdot 0 + 1 \cdot 1 - 0.5) = 1$ (banana)
$w(3) = w(2) + a(3) p(3) = 1 + 1 \cdot 1 = 2$

From now on, the network is capable of responding to bananas detected by either sight or smell. Even if both detection systems suffer intermittent faults, the network will be correct most of the time.
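The three iterations above can be reproduced with a short training loop. This is a sketch of the example, not code from the slides; the function name `train_banana` is my own:

```python
def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return 1 if n >= 0 else 0

def train_banana(sequence, w0=1.0, w=0.0, b=-0.5, alpha=1.0):
    """Banana associator trained with the unsupervised Hebb rule.
    Each element of `sequence` is a pair (p0, p):
    p0 = shape detected, p = smell detected."""
    history = []
    for p0, p in sequence:
        a = hardlim(w0 * p0 + w * p + b)   # network response
        w = w + alpha * a * p              # Hebb update on the smell weight
        history.append((a, w))
    return history

# Sight fails, sight works, sight fails again:
print(train_banana([(0, 1), (1, 1), (0, 1)]))
# [(0, 0.0), (1, 1.0), (1, 2.0)]
```

The outputs match the slides: no response, then "banana" from sight, then "banana" from smell alone once the conditioned weight has grown.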

Problems of Hebb Rule

Weights will become arbitrarily large: synapses cannot grow without bound.
There is no mechanism for weights to decrease: if the inputs or outputs of a Hebb network experience any noise, every weight will grow (however slowly) until the network responds to any stimulus.

Hebb Rule with Decay

$\mathbf{W}(q) = \mathbf{W}(q-1) + \alpha \mathbf{a}(q) \mathbf{p}^T(q) - \gamma \mathbf{W}(q-1) = (1-\gamma)\,\mathbf{W}(q-1) + \alpha \mathbf{a}(q) \mathbf{p}^T(q)$

$\gamma$, the decay rate, is a positive constant less than one. This keeps the weight matrix from growing without bound. The maximum weight value, determined by the decay rate, can be found by setting both $a_i$ and $p_j$ to 1:

$w_{ij}^{max} = (1-\gamma)\, w_{ij}^{max} + \alpha \quad \Rightarrow \quad w_{ij}^{max} = \alpha / \gamma$

Ex: Banana Associator

Hebb rule with decay, $\alpha = 1$, $\gamma = 0.1$:
$w(q) = (1-0.1)\, w(q-1) + a(q) p(q)$

First iteration (sight fails): no response
$a(1) = \mathrm{hardlim}(w^0 p^0(1) + w(0) p(1) - 0.5) = 0$
$w(1) = 0.9\, w(0) + a(1) p(1) = 0.9 \cdot 0 + 0 \cdot 1 = 0$

Second iteration (sight works): banana
$a(2) = \mathrm{hardlim}(1 \cdot 1 + 0 \cdot 1 - 0.5) = 1$
$w(2) = 0.9 \cdot 0 + 1 \cdot 1 = 1$

Third iteration (sight fails): banana
$a(3) = \mathrm{hardlim}(1 \cdot 0 + 1 \cdot 1 - 0.5) = 1$
$w(3) = 0.9 \cdot 1 + 1 \cdot 1 = 1.9$
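Changing one line of the training loop turns the plain Hebb rule into the decaying version. A sketch (names are my own, not from the slides):

```python
def hardlim(n):
    return 1 if n >= 0 else 0

def train_banana_decay(sequence, w0=1.0, w=0.0, b=-0.5, alpha=1.0, gamma=0.1):
    """Banana associator with the Hebb-with-decay update
    w(q) = (1 - gamma) * w(q-1) + alpha * a(q) * p(q)."""
    history = []
    for p0, p in sequence:
        a = hardlim(w0 * p0 + w * p + b)
        w = (1 - gamma) * w + alpha * a * p   # decay term bounds the weight
        history.append((a, round(w, 4)))
    return history

print(train_banana_decay([(0, 1), (1, 1), (0, 1)]))
# [(0, 0.0), (1, 1.0), (1, 1.9)]
```

Iterating the active case further would drive the weight toward the bound $\alpha/\gamma = 10$ rather than growing without limit.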

Ex: Banana Associator

$w_{ij}^{max} = \alpha / \gamma = 1 / 0.1 = 10$

[Figure: weight growth over 30 iterations — the plain Hebb rule grows without bound, while the Hebb rule with decay saturates at $w^{max} = 10$.]

Prob. of Hebb Rule with Decay

Associations will decay away if stimuli are not occasionally presented. If $a_i = 0$, then

$w_{ij}(q) = (1-\gamma)\, w_{ij}(q-1)$

If $\gamma = 0.1$, this reduces to $w_{ij}(q) = 0.9\, w_{ij}(q-1)$: the weight decays by 10% at each iteration for which $a_i = 0$ (no stimulus).

[Figure: the weight decays toward zero over 30 stimulus-free iterations.]

Instar (Recognition Network)

A neuron that has a vector input and a scalar output is referred to as an instar. This neuron is capable of pattern recognition. The instar is similar to the perceptron, ADALINE, and linear associator.

[Figure: instar — inputs $p_1, p_2, \ldots, p_R$ with weights $w_{1,1}, \ldots, w_{1,R}$ and bias $b$ feed a single neuron with output $a$.]

Instar Operation

Input-output expression:

$a = \mathrm{hardlim}(\mathbf{Wp} + b) = \mathrm{hardlim}({}_1\mathbf{w}^T \mathbf{p} + b)$

The instar is active when ${}_1\mathbf{w}^T \mathbf{p} \ge -b$, or $\|{}_1\mathbf{w}\| \|\mathbf{p}\| \cos\theta \ge -b$, where $\theta$ is the angle between the two vectors. For vectors of fixed length, the inner product is maximized when the angle between them is 0. Assume that all input vectors have the same length (norm).
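The activation condition is just an inner-product threshold. A sketch of the test (vectors and bias value here are illustrative, chosen to mimic the later orange example):

```python
import numpy as np

def instar_active(w, p, b):
    """Instar fires when 1w^T p + b >= 0, i.e. when
    ||w|| ||p|| cos(theta) reaches -b."""
    return float(np.dot(w, p)) + b >= 0.0

w = np.array([1.0, -1.0, -1.0])        # weight vector (prototype)
p_match = np.array([1.0, -1.0, -1.0])  # angle 0: inner product = 3
p_other = np.array([-1.0, 1.0, -1.0])  # inner product = -1

# With b = -2, only the matching prototype activates the instar.
print(instar_active(w, p_match, -2.0))  # True
print(instar_active(w, p_other, -2.0))  # False
```

Raising $b$ toward 0 widens the set of vectors that fire the instar, which is exactly the discrimination trade-off described on the next slide.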

Vector Recognition

If $b = -\|{}_1\mathbf{w}\| \|\mathbf{p}\|$, then the instar will be active only when $\theta = 0$.
If $b > -\|{}_1\mathbf{w}\| \|\mathbf{p}\|$, then the instar will be active for a range of angles.
The larger the value of $b$, the more patterns there are that can activate the instar, making it less discriminatory.

[Figure: the set of input vectors that activate the instar forms a cone around ${}_1\mathbf{w}$ that widens as $b$ increases.]

Instar Rule

Hebb rule: $w_{ij}(q) = w_{ij}(q-1) + \alpha a_i(q) p_j(q)$
Hebb rule with decay: $w_{ij}(q) = (1-\gamma)\, w_{ij}(q-1) + \alpha a_i(q) p_j(q)$

Instar rule: to solve the forgetting problem, a decay term is added that is proportional to $a_i(q)$:

$w_{ij}(q) = w_{ij}(q-1) + \alpha a_i(q) p_j(q) - \gamma a_i(q) w_{ij}(q-1)$

If $\gamma = \alpha$,

$w_{ij}(q) = w_{ij}(q-1) + \alpha a_i(q) [p_j(q) - w_{ij}(q-1)]$

or, in vector form,

${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1) + \alpha a_i(q) [\mathbf{p}(q) - {}_i\mathbf{w}(q-1)]$
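The vector form of the instar rule is a one-line update. A minimal sketch (the helper name `instar_update` is my own):

```python
import numpy as np

def instar_update(w, a, p, alpha=1.0):
    """Instar rule with gamma = alpha:
    iw(q) = iw(q-1) + alpha * a * (p - iw(q-1)).
    When a = 1 the weight vector moves toward p;
    when a = 0 it is left unchanged (no forgetting)."""
    w = np.asarray(w, dtype=float)
    p = np.asarray(p, dtype=float)
    return w + alpha * a * (p - w)

w = np.zeros(3)
p = np.array([1.0, -1.0, -1.0])
w = instar_update(w, a=1, p=p)   # with alpha = 1, w jumps all the way to p
print(w)                         # [ 1. -1. -1.]
```

Because the decay is gated by $a_i$, an inactive instar keeps its stored prototype indefinitely, unlike the plain Hebb rule with decay.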

Graphical Representation

For the case where the instar is active ($a_i = 1$):

${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1) + \alpha [\mathbf{p}(q) - {}_i\mathbf{w}(q-1)] = (1-\alpha)\, {}_i\mathbf{w}(q-1) + \alpha \mathbf{p}(q)$

For the case where the instar is inactive ($a_i = 0$):

${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1)$

[Figure: when the instar is active, ${}_i\mathbf{w}(q)$ lies on the line segment from ${}_i\mathbf{w}(q-1)$ toward $\mathbf{p}(q)$.]

Ex: Orange Recognizer

The elements of $\mathbf{p}$ are constrained to $\pm 1$ values:

$\mathbf{p} = [\text{shape}; \text{texture}; \text{weight}]$

$p^0 = 1$ (orange detected visually) or $0$ (orange not detected)

$a = \mathrm{hardlim}(w^0 p^0 + \mathbf{Wp} + b)$

[Figure: the sight of an orange ($p^0$, weight $w^0 = 3$) and the measured shape, texture, and weight ($p_1, p_2, p_3$, weights $w_{1,1}, \ldots, w_{1,3}$, bias $b = -2$) feed a single neuron that answers "Orange?"]

Initialization & Training

Initial weights: $w^0 = 3$, ${}_1\mathbf{w}(0) = [0 \; 0 \; 0]^T$
The instar rule ($\alpha = 1$):

${}_1\mathbf{w}(q) = {}_1\mathbf{w}(q-1) + a(q) [\mathbf{p}(q) - {}_1\mathbf{w}(q-1)]$

Training sequence:

$\{p^0(1) = 0, \mathbf{p}(1) = [1; -1; -1]\}, \{p^0(2) = 1, \mathbf{p}(2) = [1; -1; -1]\}, \ldots$

First iteration:
$a(1) = \mathrm{hardlim}(w^0 p^0(1) + {}_1\mathbf{w}(0)^T \mathbf{p}(1) - 2) = \mathrm{hardlim}(3 \cdot 0 + 0 - 2) = 0$ (no response)
${}_1\mathbf{w}(1) = {}_1\mathbf{w}(0) + a(1) [\mathbf{p}(1) - {}_1\mathbf{w}(0)] = [0; 0; 0]$

Second Training Iteration

Second iteration:
$a(2) = \mathrm{hardlim}(w^0 p^0(2) + {}_1\mathbf{w}(1)^T \mathbf{p}(2) - 2) = \mathrm{hardlim}(3 \cdot 1 + 0 - 2) = 1$ (orange)
${}_1\mathbf{w}(2) = {}_1\mathbf{w}(1) + a(2) [\mathbf{p}(2) - {}_1\mathbf{w}(1)] = [0; 0; 0] + ([1; -1; -1] - [0; 0; 0]) = [1; -1; -1]$

The network can now recognize the orange by its measurements.

Third Training Iteration

Third iteration (sight fails):
$a(3) = \mathrm{hardlim}(w^0 p^0(3) + {}_1\mathbf{w}(2)^T \mathbf{p}(3) - 2) = \mathrm{hardlim}(3 \cdot 0 + 3 - 2) = 1$ (orange)
${}_1\mathbf{w}(3) = {}_1\mathbf{w}(2) + a(3) [\mathbf{p}(3) - {}_1\mathbf{w}(2)] = [1; -1; -1] + ([1; -1; -1] - [1; -1; -1]) = [1; -1; -1]$

The orange will now be detected if either set of sensors works.
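The three orange-recognizer iterations can be reproduced with a short loop combining the hard-limit network and the instar rule. A sketch (function name `train_orange` is my own; the prototype $[1, -1, -1]$ follows the sign convention used above):

```python
import numpy as np

def hardlim(n):
    return 1 if n >= 0 else 0

def train_orange(sequence, w0=3.0, b=-2.0, alpha=1.0):
    """Orange recognizer: a = hardlim(w0*p0 + 1w^T p + b),
    trained with the instar rule 1w(q) = 1w(q-1) + a*(p - 1w(q-1))."""
    w = np.zeros(3)
    history = []
    for p0, p in sequence:
        p = np.asarray(p, dtype=float)
        a = hardlim(w0 * p0 + float(np.dot(w, p)) + b)
        w = w + alpha * a * (p - w)    # instar update, gated by the response
        history.append((a, w.copy()))
    return history

orange = [1.0, -1.0, -1.0]   # measured shape, texture, weight
history = train_orange([(0, orange), (1, orange), (0, orange)])
print([a for a, _ in history])   # [0, 1, 1]
```

After the second presentation the weight vector equals the orange prototype, so the measurements alone produce a response on the third iteration.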

Kohonen Rule

Kohonen rule:

${}_i\mathbf{w}(q) = {}_i\mathbf{w}(q-1) + \alpha [\mathbf{p}(q) - {}_i\mathbf{w}(q-1)]$, for $i \in X(q)$

Learning occurs when the neuron's index $i$ is a member of the set $X(q)$. The Kohonen rule can be made equivalent to the instar rule by defining $X(q)$ as the set of all $i$ such that $a_i(q) = 1$. The Kohonen rule allows the weights of a neuron to learn an input vector, and is therefore suitable for recognition applications.
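The rule applies the same move-toward-the-input update, but only to rows selected by the set $X(q)$. A sketch (the helper name `kohonen_update` and the example winner set are my own):

```python
import numpy as np

def kohonen_update(W, p, winners, alpha=0.5):
    """Kohonen rule: for each i in the winning set X(q),
    iw(q) = iw(q-1) + alpha*(p(q) - iw(q-1)); other rows are unchanged."""
    W = np.array(W, dtype=float)
    p = np.asarray(p, dtype=float)
    for i in winners:
        W[i] = W[i] + alpha * (p - W[i])
    return W

W = np.zeros((2, 3))
p = np.array([1.0, -1.0, 1.0])
W = kohonen_update(W, p, winners=[0])   # only neuron 0 learns this input
print(W[0])   # halfway toward p: [ 0.5 -0.5  0.5]
print(W[1])   # untouched:        [ 0.  0.  0.]
```

Choosing $X(q) = \{\, i : a_i(q) = 1 \,\}$ recovers the instar rule exactly; other choices of $X(q)$ (e.g. a winner and its neighbors) lead to competitive and self-organizing networks.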

Outstar (Recall Network)

The outstar network has a scalar input and a vector output. It can perform pattern recall by associating a stimulus with a vector response.

[Figure: outstar — a single input $p$ fans out through weights $w_{1,1}, w_{2,1}, \ldots, w_{S,1}$ to outputs $a_1, a_2, \ldots, a_S$.]

Outstar Operation

Input-output expression: $\mathbf{a} = \mathrm{satlins}(\mathbf{W} p)$

If we would like the outstar network to associate a stimulus (an input of 1) with a particular output vector $\mathbf{a}^*$, set $\mathbf{W} = \mathbf{a}^*$. Then, if $p = 1$:

$\mathbf{a} = \mathrm{satlins}(\mathbf{W} p) = \mathrm{satlins}(\mathbf{a}^* p) = \mathbf{a}^*$

and the pattern is correctly recalled. The columns of the weight matrix represent the patterns to be recalled.

Outstar Rule

In the instar rule, the weight decay term of the Hebb rule is proportional to the output of the network, $a_i$. In the outstar rule, the weight decay term is proportional to the input of the network, $p_j$:

$w_{ij}(q) = w_{ij}(q-1) + \alpha a_i(q) p_j(q) - \gamma p_j(q) w_{ij}(q-1)$

If $\gamma = \alpha$,

$w_{ij}(q) = w_{ij}(q-1) + \alpha [a_i(q) - w_{ij}(q-1)] p_j(q)$

or, in vector form,

$\mathbf{w}_j(q) = \mathbf{w}_j(q-1) + \alpha [\mathbf{a}(q) - \mathbf{w}_j(q-1)] p_j(q)$

Learning occurs whenever $p_j$ is nonzero (instead of $a_i$). When learning occurs, column $\mathbf{w}_j$ moves toward the output vector. (Complementary to the instar rule.)
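The outstar rule mirrors the instar rule with the roles of input and output swapped. A sketch (the names `satlins` and `outstar_update` are my own; `satlins` clips to $[-1, 1]$ as in the textbook transfer function):

```python
import numpy as np

def satlins(n):
    """Symmetric saturating linear transfer function, clipped to [-1, 1]."""
    return np.clip(n, -1.0, 1.0)

def outstar_update(wj, a, pj, alpha=1.0):
    """Outstar rule with gamma = alpha:
    wj(q) = wj(q-1) + alpha * (a(q) - wj(q-1)) * pj(q).
    The column wj moves toward the output vector whenever pj is nonzero."""
    wj = np.asarray(wj, dtype=float)
    a = np.asarray(a, dtype=float)
    return wj + alpha * (a - wj) * pj

wj = outstar_update(np.zeros(3), a=np.array([-1.0, -1.0, 1.0]), pj=1)
print(wj)   # [-1. -1.  1.]  (column has learned the output pattern)
```

With the column stored, presenting $p_j = 1$ alone now recalls the pattern via $\mathbf{a} = \mathrm{satlins}(\mathbf{w}_j p_j)$.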

Ex: Pineapple Recaller

$\mathbf{a} = \mathrm{satlins}(\mathbf{W}^0 \mathbf{p}^0 + \mathbf{W} p)$, with $\mathbf{W}^0 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

$\mathbf{p}^0 = [\text{shape}; \text{texture}; \text{weight}]$ (the measured values)
$p = 1$ if a pineapple can be seen, $0$ otherwise

[Figure: the measured shape, texture, and weight, together with the sight of the pineapple, feed the fruit network, which outputs the recalled shape, texture, and weight ("Measurement?").]

Initialization

The outstar rule ($\alpha = 1$):

$\mathbf{w}_1(q) = \mathbf{w}_1(q-1) + [\mathbf{a}(q) - \mathbf{w}_1(q-1)]\, p(q)$

Training sequence:

$\{\mathbf{p}^0(1) = [0; 0; 0], p(1) = 1\}, \{\mathbf{p}^0(2) = [-1; -1; 1], p(2) = 1\}, \ldots$

Pineapple measurements:

$\mathbf{p}^{pineapple} = [-1; -1; 1]$

First Training Iteration

First iteration:
$\mathbf{a}(1) = \mathrm{satlins}(\mathbf{W}^0 \mathbf{p}^0(1) + \mathbf{w}_1(0)\, p(1)) = \mathrm{satlins}([0; 0; 0] + [0; 0; 0] \cdot 1) = [0; 0; 0]$ (no response)
$\mathbf{w}_1(1) = \mathbf{w}_1(0) + [\mathbf{a}(1) - \mathbf{w}_1(0)]\, p(1) = [0; 0; 0]$

Second Training Iteration

Second iteration:
$\mathbf{a}(2) = \mathrm{satlins}(\mathbf{W}^0 \mathbf{p}^0(2) + \mathbf{w}_1(1)\, p(2)) = \mathrm{satlins}([-1; -1; 1] + [0; 0; 0] \cdot 1) = [-1; -1; 1]$ (measurements given)
$\mathbf{w}_1(2) = \mathbf{w}_1(1) + [\mathbf{a}(2) - \mathbf{w}_1(1)]\, p(2) = [-1; -1; 1]$

The network forms an association between the sight and the measurements.

Third Training Iteration

Third iteration (measurement system fails):
$\mathbf{a}(3) = \mathrm{satlins}(\mathbf{W}^0 \mathbf{p}^0(3) + \mathbf{w}_1(2)\, p(3)) = \mathrm{satlins}([0; 0; 0] + [-1; -1; 1] \cdot 1) = [-1; -1; 1]$ (measurements recalled)
$\mathbf{w}_1(3) = \mathbf{w}_1(2) + [\mathbf{a}(3) - \mathbf{w}_1(2)]\, p(3) = [-1; -1; 1]$

Even if the measurement system fails, the network is now able to recall the measurements of the pineapple when it sees one.
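The three pineapple iterations can be reproduced with a short loop combining the satlins network and the outstar rule. A sketch (the name `train_pineapple` is my own; the measurement vector follows the sign convention used above):

```python
import numpy as np

def satlins(n):
    return np.clip(n, -1.0, 1.0)

def train_pineapple(sequence, alpha=1.0):
    """Pineapple recaller: a = satlins(W0 p0 + w1 p), with W0 = I,
    trained with the outstar rule w1(q) = w1(q-1) + (a - w1(q-1)) * p(q)."""
    W0 = np.eye(3)
    w1 = np.zeros(3)
    history = []
    for p0, p in sequence:
        p0 = np.asarray(p0, dtype=float)
        a = satlins(W0 @ p0 + w1 * p)
        w1 = w1 + alpha * (a - w1) * p    # outstar update, gated by the sight
        history.append((a.copy(), w1.copy()))
    return history

pineapple = [-1.0, -1.0, 1.0]   # measured shape, texture, weight
history = train_pineapple([([0, 0, 0], 1), (pineapple, 1), ([0, 0, 0], 1)])
# a(1) = [0,0,0] (no response); a(2) = pineapple (measurements given);
# a(3) = pineapple (measurements recalled from sight alone)
```

After the second presentation the weight column holds the pineapple measurements, so on the third iteration the sight input alone recalls them.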

