Neural Network Computing Lecture no.1
Page 1:

Neural Network Computing

Lecture no.1

Page 2: McCulloch-Pitts Neuron

The activation of a McCulloch-Pitts neuron is binary.

McCulloch-Pitts neurons are connected by directed, weighted paths.

Each neuron has a fixed threshold.

Page 3: Architecture

Inputs $x_1, x_2, \ldots, x_n$ reach the unit through weights $w_1, w_2, \ldots, w_n$, and the output is

$$f = \begin{cases} 1 & \text{if } \sum_{i=1}^{n} w_i x_i \ge \theta \\ 0 & \text{else} \end{cases}$$

where $\theta$ is the unit's threshold.
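A minimal sketch of this unit in Python (the helper name mp_neuron and the use of NumPy are my own illustrative choices, not the lecture's notation):

```python
import numpy as np

def mp_neuron(x, w, theta):
    """McCulloch-Pitts unit: fire (1) iff the weighted input sum reaches the threshold."""
    return 1 if np.dot(w, x) >= theta else 0

# Example: two inputs, unit weights, threshold 1.5 (the AND unit used later).
print(mp_neuron([1, 1], [1, 1], 1.5))  # -> 1
print(mp_neuron([1, 0], [1, 1], 1.5))  # -> 0
```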

Page 4: Theorem

We can model any function or phenomenon that can be represented as a logic function.

First we show that a single neuron can compute simple logic functions such as AND, OR and NOT.

Then we use these simple neurons as building blocks for arbitrary logic functions.

(Recall that every logic function is representable in DNF form.)

Page 5: AND

With weights $w_1 = w_2 = 1$ and threshold $\theta = 1.5$:

x1  x2  AND    weighted sum       vs. threshold
0   0   0      0*1 + 0*1 = 0      0 < 1.5
0   1   0      0*1 + 1*1 = 1      1 < 1.5
1   0   0      1*1 + 0*1 = 1      1 < 1.5
1   1   1      1*1 + 1*1 = 2      2 > 1.5

Page 6: OR

With weights $w_1 = w_2 = 1$ and threshold $\theta = 0.9$:

x1  x2  OR     weighted sum       vs. threshold
0   0   0      0*1 + 0*1 = 0      0 < 0.9
0   1   1      0*1 + 1*1 = 1      1 > 0.9
1   0   1      1*1 + 0*1 = 1      1 > 0.9
1   1   1      1*1 + 1*1 = 2      2 > 0.9

Page 7: NOT

With weight $w = -1$ and threshold $\theta = -0.5$:

x   NOT x   weighted sum      vs. threshold
0   1       0*(-1) = 0        0 > -0.5
1   0       1*(-1) = -1       -1 < -0.5
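Continuing the sketch from the Architecture slide, the three gates can be checked directly (mp_neuron is the illustrative helper defined there):

```python
# Assumes the mp_neuron(x, w, theta) helper sketched after the Architecture slide.
for x1 in (0, 1):
    for x2 in (0, 1):
        and_out = mp_neuron([x1, x2], [1, 1], 1.5)   # AND: weights (1, 1), threshold 1.5
        or_out = mp_neuron([x1, x2], [1, 1], 0.9)    # OR: weights (1, 1), threshold 0.9
        print(x1, x2, "AND:", and_out, "OR:", or_out)

for x in (0, 1):
    print(x, "NOT:", mp_neuron([x], [-1], -0.5))     # NOT: weight -1, threshold -0.5
```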

Page 8: DNF

DNF form :

$P_1 \ \mathrm{OR}\ P_2 \ \mathrm{OR}\ \cdots \ \mathrm{OR}\ P_n$, where each $P_i$ is an AND of literals (variables or their negations).
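As an illustration of using these gates as building blocks, here is a sketch (my own construction following the DNF idea, not copied from the slides) of a two-layer network: one AND unit per term, all feeding a single OR unit.

```python
# Sketch: evaluate a DNF formula with threshold units.
# Each term is a list of (input index, positive?) literals. An AND unit with
# threshold equal to its fan-in fires only when all of its literals are 1;
# the OR unit on top fires when at least one term fires.
def dnf_network(x, terms):
    term_outputs = []
    for term in terms:
        literals = [x[i] if positive else 1 - x[i] for i, positive in term]
        term_outputs.append(1 if sum(literals) >= len(term) else 0)   # AND unit
    return 1 if sum(term_outputs) >= 1 else 0                         # OR unit

# XOR = (x1 AND NOT x2) OR (NOT x1 AND x2): two terms, needs the hidden layer.
xor_terms = [[(0, True), (1, False)], [(0, False), (1, True)]]
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, dnf_network(x, xor_terms))
```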

Page 9: Biases and Thresholds

We can replace the threshold with a bias.

A bias acts exactly as a weight on a connection from a unit whose activation is always 1.

$$\sum_{i=1}^{n} w_i x_i \ge \theta \;\Longleftrightarrow\; \sum_{i=1}^{n} w_i x_i - \theta \ge 0 \;\Longleftrightarrow\; \sum_{i=0}^{n} w_i x_i \ge 0,$$

where $w_0 = -\theta$ and $x_0 = 1$.
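A quick numerical check of this equivalence on the AND unit (variable names are illustrative):

```python
import numpy as np

w, theta = np.array([1.0, 1.0]), 1.5            # threshold form (the AND unit)
w_bias = np.concatenate(([-theta], w))          # bias form: w0 = -theta, x0 = 1

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    threshold_out = int(np.dot(w, x) >= theta)
    bias_out = int(np.dot(w_bias, [1.0, *x]) >= 0)
    assert threshold_out == bias_out            # the two formulations always agree
    print(x, threshold_out, bias_out)
```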

Page 10: Perceptron

Loop: take an example and apply it to the network. If the answer is correct, return to Loop; if incorrect, go to Fix.

Fix: adjust the network weights using the input example. Go to Loop.

Page 11: Perceptron Algorithm

Let $v$ be arbitrary.

Choose: choose an example $x \in F^+ \cup F^-$.

Test: If $x \in F^+$ and $v \cdot x > 0$, go to Choose.
If $x \in F^+$ and $v \cdot x \le 0$, go to Fix plus.
If $x \in F^-$ and $v \cdot x < 0$, go to Choose.
If $x \in F^-$ and $v \cdot x \ge 0$, go to Fix minus.

Fix plus: $v := v + x$; go to Choose.

Fix minus: $v := v - x$; go to Choose.
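A runnable sketch of this procedure (the random choice of examples, the zero initialization and the explicit stopping test are my additions; the slide's loop has no termination step):

```python
import random
import numpy as np

def perceptron(F_plus, F_minus, max_steps=10000):
    """Perceptron algorithm as on the slide: Fix plus adds x, Fix minus subtracts x."""
    examples = [(np.array(x, dtype=float), +1) for x in F_plus] + \
               [(np.array(x, dtype=float), -1) for x in F_minus]
    v = np.zeros(len(examples[0][0]))                 # "Let v be arbitrary"
    for _ in range(max_steps):
        if all(v @ x > 0 if s > 0 else v @ x < 0 for x, s in examples):
            return v                                  # every example classified correctly
        x, s = random.choice(examples)                # Choose
        if s > 0 and v @ x <= 0:                      # Test -> Fix plus
            v = v + x
        elif s < 0 and v @ x >= 0:                    # Test -> Fix minus
            v = v - x
    return v
```

For instance, perceptron([(1, 1, 1)], [(0, 0, 1), (0, 1, 1), (1, 0, 1)]) learns the AND examples used later, with a bias component appended to each input.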

Page 12: Perceptron Algorithm

Conditions for the algorithm to terminate:

Condition no.1: there exists a vector $v^*$ such that $v^* \cdot x > 0$ for every $x \in F^+$ and $v^* \cdot x < 0$ for every $x \in F^-$ (the examples are linearly separable).

Condition no.2: we choose $F$ to be a group of unit vectors.

Page 13: Geometric viewpoint

[Figure: the example $x$, the solution vector $v^*$, and the weight vectors $w_n$ and $w_{n+1}$ before and after a Fix step.]

Page 14: Perceptron Algorithm

Given these conditions, the number of times we enter the Loop with an incorrect answer (i.e., the number of weight changes) is finite.

Proof:

[Figure: the examples world, split into positive examples and negative examples.]

We seek a weight vector $A$ that classifies correctly: $A \cdot x > 0$ for $x \in F^+$ and $A \cdot y < 0$ for $y \in F^-$, where $F = F^+ \cup F^-$; the solution vector guaranteed by Condition no.1 is denoted $A^*$.

Page 15: Perceptron Algorithm-Proof

We replace the threshold with a bias. We assume F is a group of unit vectors.

$\|x\| = 1$ for every $x \in F$.

Page 16: Perceptron Algorithm-Proof

We reduce what we have to prove by eliminating all the negative examples and placing their negations in the positive examples.

$F' = F^+ \cup \{\,-y : y \in F^-\,\}$

This is enough because $A \cdot y < 0 \iff A \cdot (-y) > 0$, so a vector $A^*$ with $A^* \cdot x > 0$ for every $x \in F'$ solves the original problem, and the unit-vector assumption is preserved since $\|-y\| = \|y\| = 1$.

Page 17: Perceptron Algorithm-Proof

Consider

$$G(A) = \frac{A^* \cdot A}{\|A^*\|\,\|A\|} \le 1,$$

the cosine of the angle between the current weight vector $A$ and the solution $A^*$.

The numerator: after $n$ changes,

$$A^* \cdot A_n = A^* \cdot (A_{n-1} + x) = A^* \cdot A_{n-1} + A^* \cdot x \ge A^* \cdot A_{n-1} + \delta \ge \cdots \ge n\,\delta,$$

where $\delta = \min_{x \in F'} A^* \cdot x > 0$.

Page 18: Perceptron Algorithm-Proof

The denominator: a change is made only when $A_{t-1} \cdot x \le 0$, and $\|x\| = 1$, so

$$\|A_t\|^2 = \|A_{t-1} + x\|^2 = \|A_{t-1}\|^2 + 2\,A_{t-1} \cdot x + \|x\|^2 \le \|A_{t-1}\|^2 + 1.$$

After $n$ changes, $\|A_n\|^2 \le n$.

Page 19: Perceptron Algorithm-Proof

From the numerator: $A^* \cdot A_n \ge n\,\delta$.

From the denominator: $\|A_n\| \le \sqrt{n}$.

Therefore

$$1 \ge G(A_n) = \frac{A^* \cdot A_n}{\|A^*\|\,\|A_n\|} \ge \frac{n\,\delta}{\|A^*\|\,\sqrt{n}} = \frac{\sqrt{n}\,\delta}{\|A^*\|},$$

so $n \le \left(\frac{\|A^*\|}{\delta}\right)^2$ and n is finite.
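A small numerical check of this bound (the data generation, the margin filter and the known separator a_star are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
a_star = np.array([1.0, 2.0])                        # a known separating vector

# Random unit vectors, reduced to positive examples of a_star, with a comfortable margin.
xs = rng.normal(size=(200, 2))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
xs = np.array([x if a_star @ x > 0 else -x for x in xs])
xs = xs[xs @ a_star >= 0.1]                          # keep delta away from zero so the demo is fast
delta = min(a_star @ x for x in xs)

# Count the Fix steps (weight changes) of the perceptron on this data.
v, changes = np.zeros(2), 0
while any(v @ x <= 0 for x in xs):
    for x in xs:
        if v @ x <= 0:
            v, changes = v + x, changes + 1

bound = (np.linalg.norm(a_star) / delta) ** 2
print(changes, "changes; theoretical bound:", bound)  # changes should not exceed the bound
```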

Page 20: Example - AND

Each example is presented together with a bias input of 1, i.e. as $(x_1, x_2, 1)$:

x1  x2  bias  AND
0   0   1     0
0   1   1     0
1   0   1     0
1   1   1     1

The run starts from $w = (0, 0, 0)$. Each example is applied in turn; whenever the output is wrong, the Fix step adds the example to $w$ (for a missed 1) or subtracts it (for a missed 0). The slide traces the resulting sequence of weight vectors and the "wrong"/FIX corrections, example by example, and the run continues (etc…).
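A sketch that reproduces this kind of trace (the fixed presentation order, the strict-inequality output rule and the stopping test are my choices, so the intermediate weights need not match the slide exactly):

```python
import numpy as np

# AND with a bias input: examples are (x1, x2, 1), targets in {0, 1}.
examples = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 0, 1), 0), ((1, 1, 1), 1)]
w = np.zeros(3)                                    # start from w = (0, 0, 0)

for epoch in range(20):
    mistakes = 0
    for x, t in examples:
        x = np.array(x)
        out = 1 if w @ x > 0 else 0
        if out != t:                               # Fix: add for a missed 1, subtract for a missed 0
            w = w + x if t == 1 else w - x
            mistakes += 1
            print("wrong", tuple(x), "->", w)
    if mistakes == 0:
        print("converged:", w)
        break
```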

Page 21: AND – Bi-Polar solution

Here the inputs and targets are bipolar, $x_i, t \in \{-1, +1\}$, with the bias input still fixed at 1. Running the same procedure, only a few examples come out "wrong" and get corrected before all four AND cases are classified correctly: success.
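A sketch of the bipolar run (same caveats as for the binary trace; the update $w := w + t\,x$ adds the example for a missed $+1$ and subtracts it for a missed $-1$):

```python
import numpy as np

# Bipolar AND: inputs and targets in {-1, +1}, bias input fixed at 1.
examples = [((-1, -1, 1), -1), ((-1, 1, 1), -1), ((1, -1, 1), -1), ((1, 1, 1), 1)]
w = np.zeros(3)

for epoch in range(20):
    mistakes = 0
    for x, t in examples:
        x = np.array(x)
        out = 1 if w @ x > 0 else -1
        if out != t:
            w = w + t * x                          # Fix: move the weights toward the target
            mistakes += 1
            print("wrong", tuple(x), "->", w)
    if mistakes == 0:
        print("success:", w)
        break
```

With this presentation order the run ends after only a few corrections, noticeably fewer than the binary 0/1 version above.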

Page 22: Problem

A perceptron built from local sensors sees each test pattern as the sum of a left, a middle and a right contribution, so for three test patterns the classification requires $LHS_i + MHS_i + RHS_i > 0$ for $i = 1, 2, 3$.

The middle part looks the same in all three patterns, $MHS_1 = MHS_2 = MHS_3$, so these constraints reduce to $LHS_i + RHS_i > 0$ for $i = 1, 2, 3$, with $LHS_1 = LHS_2$ and $RHS_2 = RHS_3$.

$LHS_2$ should be small enough, and $RHS_2$ should be small enough, so that $LHS_2 + RHS_2 < 0$; but the constraint for pattern 2 requires $LHS_2 + RHS_2 > 0$. Contradiction!!!

Page 23: Linear Separation

Every perceptron determines a classification of its vector inputs; the classification is determined by a hyperplane.

Two dimensional examples (add algebra)

OR and AND are each separable by a line; for XOR this is not possible.
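A brute-force check of this claim over a small grid of candidate lines (the grid itself is an arbitrary illustration; it demonstrates the point rather than proving it):

```python
import itertools

def find_line(truth_table, candidates):
    """Return a (w1, w2, theta) from the grid realizing the truth table, or None."""
    for w1, w2, theta in candidates:
        if all((w1 * x1 + w2 * x2 >= theta) == bool(t)
               for (x1, x2), t in truth_table.items()):
            return (w1, w2, theta)
    return None

grid = [v / 2 for v in range(-8, 9)]             # weights and thresholds from -4 to 4 in steps of 0.5
candidates = list(itertools.product(grid, repeat=3))

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print("AND:", find_line(AND, candidates))        # prints some separating (w1, w2, theta)
print("XOR:", find_line(XOR, candidates))        # None: no line in the grid works
```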

Page 24: Linear Separation in Higher Dimensions

In higher dimensions the classification is still given by a linear separation (a hyperplane), but it is hard to tell which predicates are linearly separable.

Example: Connected; Convex. Which of these can be handled by a perceptron with local sensors, and which cannot?

Note: Define local sensors.

