An Illustrative Example
Page 1: An Illustrative Example

An Illustrative Example

Page 2: An Illustrative Example

Apple/Banana Sorter

Page 3: An Illustrative Example

Prototype Vectors

Page 4: An Illustrative Example

McCulloch-Pitts Perceptron

Page 5: An Illustrative Example

Perceptron Training

• How can we train a perceptron for a classification task?

• We try to find suitable values for the weights in such a way that the training examples are correctly classified.

• Geometrically, we try to find a hyper-plane that separates the examples of the two classes.

Page 6: An Illustrative Example

Perceptron Geometric View

The equation below describes a (hyper-)plane in the input space consisting of real-valued m-dimensional vectors. The plane splits the input space into two regions, each of them describing one class.

Σ_{i=1}^{m} w_i p_i + b = 0

[Figure: two-dimensional input space with axes x1 and x2. The decision boundary w1p1 + w2p2 + b = 0 separates the decision region for class C1 (where w1p1 + w2p2 + b >= 0) from the region for class C2.]
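As a minimal Python sketch of this geometric view (the weight and bias values below are made up for illustration, not taken from the slides), we can evaluate w1p1 + w2p2 + b and assign the class from the sign of the result:

import numpy as np

# Illustrative weights and bias defining the boundary w1*p1 + w2*p2 + b = 0.
w = np.array([1.0, -1.0])
b = -0.5

def classify(p):
    """Return C1 if the input lies on the positive side of the
    hyperplane (w.p + b >= 0), otherwise C2."""
    return "C1" if np.dot(w, p) + b >= 0 else "C2"

print(classify(np.array([2.0, 0.5])))   # w.p + b = 1.0  -> C1
print(classify(np.array([0.0, 2.0])))   # w.p + b = -2.5 -> C2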

Page 7: An Illustrative Example

Two-Input Case

Page 8: An Illustrative Example

Apple/Banana Example

Page 9: An Illustrative Example

Testing the Network

Page 10: An Illustrative Example

XOR problem

A typical example of a non-linearly separable function is the XOR. This function takes two input arguments with values in {-1, 1} and returns one output in {-1, 1}, as specified in the following table:

x1   x2   x1 XOR x2
-1   -1      -1
-1    1       1
 1   -1       1
 1    1      -1

If we think of -1 and 1 as encodings of the truth values false and true, respectively, then XOR computes the logical exclusive or, which yields true if and only if the two inputs have different truth values.

Page 11: An Illustrative Example

XOR problem

• In this graph of the XOR, input pairs giving output equal to 1 and -1 are depicted with green and red circles, respectively. These two classes (green and red) cannot be separated using a single line. We have to use two lines, like those depicted in blue. The following NN with two hidden nodes realizes this non-linear separation, where each hidden node describes one of the two blue lines (see the sketch after the figure).

[Figure: the four XOR inputs plotted in the (x1, x2) plane at (±1, ±1), with two blue lines separating the +1 outputs from the -1 outputs.]
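The following Python sketch is one way to realize this separation (it is not the exact network from the slide; the weights are chosen so that each hidden unit implements one of the two parallel separating lines):

def sign(x):
    """Hard-limit activation returning +1 or -1."""
    return 1 if x >= 0 else -1

def xor_net(x1, x2):
    # Hidden layer: each unit implements one of the two separating lines
    # (illustrative weights; the slide's exact values may differ).
    h1 = sign(x1 + x2 + 1.0)   # line x1 + x2 + 1 = 0
    h2 = sign(x1 + x2 - 1.0)   # line x1 + x2 - 1 = 0
    # Output layer: fires +1 only when the input lies between the two lines.
    return sign(h1 - h2 - 0.5)

for x1, x2 in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
    print(x1, x2, "->", xor_net(x1, x2))   # -1, 1, 1, -1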

Page 12: An Illustrative Example

Multilayer Network

Page 13: An Illustrative Example

Abbreviated Notation

Page 14: An Illustrative Example

Recurrent Network

Page 15: An Illustrative Example

Hamming Network

Page 16: An Illustrative Example

Feedforward Layer

Page 17: An Illustrative Example

Recurrent Layer

Page 18: An Illustrative Example

Hamming Operation

Page 19: An Illustrative Example

Hamming Operation

Page 20: An Illustrative Example

Hopfield Network

Page 21: An Illustrative Example

Apple/Banana Problem

Page 22: An Illustrative Example

Summary

• Perceptron
  – Feedforward Network
  – Linear Decision Boundary
  – One Neuron for Each Decision

• Hamming Network (see the sketch after this list)
  – Competitive Network
  – First Layer – Pattern Matching (Inner Product)
  – Second Layer – Competition (Winner-Take-All)
  – # Neurons = # Prototype Patterns

• Hopfield Network
  – Dynamic Associative Memory Network
  – Network Output Converges to a Prototype Pattern
  – # Neurons = # Elements in each Prototype Pattern
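A minimal Python sketch of the Hamming network's two layers (the prototype patterns and the inhibition constant eps are illustrative, not values from the slides):

import numpy as np

# Two illustrative prototype patterns with +/-1 elements.
prototypes = np.array([[ 1, -1,  1],
                       [-1,  1,  1]], dtype=float)

def hamming(p, eps=0.5, max_iter=100):
    """First layer: inner-product match of the input with each prototype
    (plus a bias of R so all scores start non-negative).
    Second layer: winner-take-all competition; each neuron is inhibited
    by the others until only the best match remains positive."""
    S, R = prototypes.shape
    a = prototypes @ p + R                      # feedforward scores
    for _ in range(max_iter):                   # eps must satisfy eps < 1/(S-1)
        a_new = np.maximum(0, a - eps * (a.sum() - a))
        if np.array_equal(a_new, a):
            break
        a = a_new
    return int(np.argmax(a))                    # index of the winning prototype

print(hamming(np.array([ 1.0, -1.0, -1.0])))   # closest to prototype 0
print(hamming(np.array([-1.0,  1.0,  1.0])))   # closest to prototype 1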

Page 23: An Illustrative Example

Learning Rules

• Supervised Learning: the network is provided with a set of examples of proper network behavior (inputs/targets):

  {p1, t1}, {p2, t2}, …, {pQ, tQ}

• Reinforcement Learning: the network is only provided with a grade, or score, which indicates network performance.

• Unsupervised Learning: only network inputs are available to the learning algorithm. The network learns to categorize (cluster) the inputs.

Page 24: An Illustrative Example

Early Learning Rules

• These learning rules are designed for single-layer neural networks.

• They are generally more limited in their applicability.

• Some of the early algorithms are:
  – Perceptron learning
  – LMS learning
  – Grossberg learning

Page 25: An Illustrative Example

Perceptron Architecture AGAIN!!!!

Page 26: An Illustrative Example

Single-Neuron Perceptron

Page 27: An Illustrative Example

Decision Boundary

Page 28: An Illustrative Example

Example - OR

Page 29: An Illustrative Example

OR Solution

Page 30: An Illustrative Example

Multiple-Neuron Perceptron

Each neuron will have its own decision boundary:

wiᵀ p + bi = 0

A single neuron can classify input vectors into two categories.

A multi-neuron perceptron with S neurons can classify input vectors into 2^S categories.
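A minimal Python sketch of this idea (the weight matrix, biases, and inputs below are made-up values; with S = 2 neurons the outputs fall into up to 2^2 = 4 categories):

import numpy as np

# Illustrative 2-neuron perceptron over 3-dimensional inputs.
W = np.array([[1.0, -1.0,  0.5],     # row i holds the weights of neuron i
              [0.5,  1.0, -1.0]])
b = np.array([0.0, -0.5])

def hardlims(n):
    """Symmetric hard-limit activation: +1 where n >= 0, else -1."""
    return np.where(n >= 0, 1, -1)

def perceptron(p):
    # Each output element is decided by one neuron's own boundary wi.p + bi = 0.
    return hardlims(W @ p + b)

print(perceptron(np.array([1.0, 0.0, 0.0])))   # -> [ 1  1]
print(perceptron(np.array([0.0, 1.0, 1.0])))   # -> [-1 -1]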

Page 31: An Illustrative Example

Learning Rule Test Problem

Page 32: An Illustrative Example

Starting Point

Page 33: An Illustrative Example

Tentative Learning Rule

Page 34: An Illustrative Example

Second Input Vector

Page 35: An Illustrative Example

Third Input Vector

Page 36: An Illustrative Example

Unified Learning Rule

Page 37: An Illustrative Example

Multiple-Neuron Perceptrons

Page 38: An Illustrative Example

Apple/Banana Example

Page 39: An Illustrative Example

Second Iteration

Page 40: An Illustrative Example

Check

Page 41: An Illustrative Example

Perceptron Rule Capability

The perceptron rule will always converge to weights which accomplish the desired classification, assuming that such weights exist.

Page 42: An Illustrative Example

Rosenblatt’s single-layer perceptron is trained as follows:

1. Randomly initialize all the network's weights.
2. Apply inputs and find outputs (feedforward).
3. Compute the errors.
4. Update each weight as

   w_ij(k+1) = w_ij(k) + η p_i(k) e_j(k)

5. Repeat steps 2 to 4 until the errors reach a satisfactory level.
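A minimal Python sketch of this training procedure, assuming ±1 targets, a hard-limit output, and a fixed learning rate (the function and variable names are illustrative):

import numpy as np

def train_perceptron(P, T, eta=0.1, epochs=100, seed=0):
    """Single-layer perceptron trained with
    w_ij(k+1) = w_ij(k) + eta * p_i(k) * e_j(k).
    P: (Q, R) array of Q input vectors; T: (Q, S) array of +/-1 targets."""
    rng = np.random.default_rng(seed)
    R, S = P.shape[1], T.shape[1]
    W = rng.normal(scale=0.1, size=(S, R))      # 1. random initialization
    b = np.zeros(S)
    for _ in range(epochs):                     # 5. repeat steps 2-4
        total_error = 0.0
        for p, t in zip(P, T):
            a = np.where(W @ p + b >= 0, 1, -1) # 2. feedforward
            e = t - a                           # 3. error
            W += eta * np.outer(e, p)           # 4. weight update
            b += eta * e
            total_error += np.abs(e).sum()
        if total_error == 0:                    # errors at a satisfactory level
            break
    return W, b

# Example: learn the logical OR on +/-1 inputs.
P = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
T = np.array([[-1], [1], [1], [1]], dtype=float)
W, b = train_perceptron(P, T)
print(np.where(P @ W.T + b >= 0, 1, -1).ravel())   # expected: [-1  1  1  1]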

Page 43: An Illustrative Example

What is this η?

• Name: learning rate.
• Where it lives: usually between 0 and 1.
• It can change its value during learning.
• It can be defined separately for each parameter.
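For instance, a simple decay schedule lets η shrink as training proceeds (a small Python sketch; the starting value and decay constant are made up):

def learning_rate(k, eta0=0.5, decay=0.01):
    """Illustrative decaying learning rate: starts at eta0 and shrinks
    as the iteration count k grows."""
    return eta0 / (1.0 + decay * k)

for k in (0, 10, 100, 1000):
    print(k, round(learning_rate(k), 3))   # 0.5, 0.455, 0.25, 0.045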

Page 44: An Illustrative Example

Perceptron Limitations

Page 45: An Illustrative Example

You Can Find Your First Homework here: http://saba.kntu.ac.ir/eecd/People/aliyari/

NEXT WEEK

