Security Analytics Topic 6: Perceptron and Support Vector Machine. Purdue University, Prof. Ninghui Li. Based on slides by Prof. Jennifer Neville and Chris Clifton.
Transcript
Page 1:

Security Analytics

Topic 6: Perceptron and Support Vector Machine

Purdue University Prof. Ninghui Li

Based on slides by Prof. Jennifer Neville and Chris Clifton

Page 2:

Readings

• Principles of Data Mining

– Chapter 10: Predictive Modeling for Classification

• 10.3 Perceptron

Page 3:

PERCEPTRON

Page 4:

• Input signals are sent from other neurons.

• If sufficient signals accumulate, the neuron fires a signal.

• Connection strengths determine how the signals are accumulated.

Page 5:

[Diagram: incoming signals x1, x2, x3, each multiplied by a connection strength w1, w2, w3, are added; the activation level a determines the output signal.]

a = Σ_{i=1}^{M} x_i w_i

output = 1 if a > t, else output = 0

• input signals ‘x’ and coefficients ‘w’ are multiplied

• weights correspond to connection strengths

• signals are added up – if they are enough, FIRE!

Page 6:

Sum notation (just like a loop from 1 to M):

a = Σ_{i=1}^{M} x_i w_i   (activation)

double[] x = { … };
double[] w = { … };

Multiply corresponding elements and add them up.

if (activation > threshold) FIRE!

Calculation…
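A minimal Java sketch of this calculation. The class and method names are illustrative; the array values and the threshold are the ones used in the worked example on page 14:

    public class PerceptronActivation {

        // a = sum over i of x_i * w_i (multiply corresponding elements and add them up)
        static double activation(double[] x, double[] w) {
            double a = 0.0;
            for (int i = 0; i < x.length; i++) {
                a += x[i] * w[i];
            }
            return a;
        }

        public static void main(String[] args) {
            double[] x = {1.0, 0.5, 2.0};   // incoming signals
            double[] w = {0.2, 0.5, 0.5};   // connection strengths
            double t = 1.0;                 // threshold

            double a = activation(x, w);
            int output = (a > t) ? 1 : 0;   // if (activation > threshold) FIRE!
            System.out.println("a = " + a + ", output = " + output);
        }
    }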

Page 7:

The Perceptron Decision Rule

if Σ_{i=1}^{M} x_i w_i > t then output = 1, else output = 0

Page 8:

if Σ_{i=1}^{M} x_i w_i > t then output = 1, else output = 0

[Figure: rugby players (= 1) and ballet dancers (= 0) plotted with a decision boundary; points on one side give output = 1, on the other output = 0.]

Page 9:

Is this a good decision boundary?

if Σ_{i=1}^{M} x_i w_i > t then output = 1, else output = 0

Page 10:

w1 = 1.0

w2 = 0.2

t = 0.05

if Σ_{i=1}^{M} x_i w_i > t then output = 1, else output = 0

Page 11:

w1 = 2.1

w2 = 0.2

t = 0.05

if Σ_{i=1}^{M} x_i w_i > t then output = 1, else output = 0

Page 12:

w1 = 1.9

w2 = 0.02

t = 0.05

if Σ_{i=1}^{M} x_i w_i > t then output = 1, else output = 0

Page 13:

Changing the weights/threshold makes the decision boundary move.

Doing this by hand is impractical – it is only feasible in the simple 2-D case.

We need an algorithm….

w1 = 0.8

w2 = -0.03

t = 0.05

Page 14:

w = [0.2, 0.5, 0.5]
x = [1.0, 0.5, 2.0]
t = 1.0

[Diagram: inputs x1, x2, x3 weighted by w1, w2, w3 feed the sum a = Σ_{i=1}^{M} x_i w_i.]

Q1. What is the activation, a, of the neuron?

a = Σ_{i=1}^{M} x_i w_i = (1.0 × 0.2) + (0.5 × 0.5) + (2.0 × 0.5) = 1.45

Q2. Does the neuron fire?

if (activation > threshold) output=1 else output=0

1.45 > 1.0 …. So yes, it fires.

Page 15:

w = [0.2, 0.5, 0.5]
x = [1.0, 0.5, 2.0]
t = 1.0

Q3. What if we set the threshold at 0.5 and weight #3 to zero?

a = Σ_{i=1}^{M} x_i w_i = (1.0 × 0.2) + (0.5 × 0.5) + (2.0 × 0.0) = 0.45

if (activation > threshold) output=1 else output=0

0.45 < 0.5 …. So no, it does not fire.

Page 16:

The Perceptron

Error function: number of mistakes (a.k.a. classification error)

[Figure: height vs. weight scatter plot of players and dancers with a linear decision boundary.]

Model: if Σ_{i=1}^{d} w_i x_i > t then y = 1 ("player") else y = 0 ("dancer")

Learning algorithm: w = ???, t = ??? …. we need to optimise these values.
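As a sketch of this error function, the following Java snippet counts how many examples the current weights and threshold misclassify; the data, names, and values here are illustrative, not from the slides:

    public class PerceptronError {

        // Error function: number of mistakes (classification error) of (w, t) on the data.
        static int countMistakes(double[][] X, int[] y, double[] w, double t) {
            int mistakes = 0;
            for (int j = 0; j < X.length; j++) {
                double a = 0.0;
                for (int i = 0; i < w.length; i++) {
                    a += w[i] * X[j][i];
                }
                int output = (a > t) ? 1 : 0;   // 1 = "player", 0 = "dancer"
                if (output != y[j]) {
                    mistakes++;
                }
            }
            return mistakes;
        }

        public static void main(String[] args) {
            double[][] X = {{180, 90}, {165, 55}};   // illustrative (height, weight) rows
            int[] y = {1, 0};
            double[] w = {0.01, 0.02};
            double t = 3.0;
            System.out.println("mistakes = " + countMistakes(X, y, w, t));
        }
    }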

Page 17:

Perceptron Learning Rule

new weight = old weight + 0.1 ( trueLabel – output ) input

if… ( target = 0, output = 0 ) …. then update = ?

if… ( target = 0, output = 1 ) …. then update = ?

if… ( target = 1, output = 0 ) …. then update = ?

if… ( target = 1, output = 1 ) …. then update = ?

What weight updates do these cases produce?

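A minimal Java sketch of this learning rule; the 0.1 learning rate and the (trueLabel − output) form come from the slide, while the method name and the example input value are illustrative:

    public class PerceptronUpdate {

        // new weight = old weight + 0.1 * (trueLabel - output) * input
        static double updateWeight(double oldWeight, int trueLabel, int output, double input) {
            return oldWeight + 0.1 * (trueLabel - output) * input;
        }

        public static void main(String[] args) {
            double input = 1.5;                                   // illustrative input value
            System.out.println(updateWeight(0.4, 0, 0, input));   // correct: update is 0
            System.out.println(updateWeight(0.4, 0, 1, input));   // fired when it should not: weight decreases
            System.out.println(updateWeight(0.4, 1, 0, input));   // failed to fire: weight increases
            System.out.println(updateWeight(0.4, 1, 1, input));   // correct: update is 0
        }
    }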

Page 18:

Learning algorithm for the Perceptron:

initialise weights to random numbers in range -1 to +1
for n = 1 to NUM_ITERATIONS
    for each training example (x, y)
        calculate activation
        for each weight
            update weight by learning rule
        end
    end
end
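A runnable Java sketch of this algorithm, assuming the 0.1 learning rate from the learning rule slide and a small illustrative data set; the slides do not show a threshold update, so the threshold is kept fixed here:

    import java.util.Random;

    public class PerceptronTrainer {
        static final int NUM_ITERATIONS = 100;
        static final double LEARNING_RATE = 0.1;

        public static void main(String[] args) {
            // Toy linearly separable data: each row is one example, labels are 1 or 0.
            double[][] X = {{2.0, 1.0}, {1.5, 0.5}, {0.2, 1.8}, {0.1, 1.2}};
            int[] y = {1, 1, 0, 0};

            // initialise weights to random numbers in range -1 to +1
            Random rng = new Random(0);
            double[] w = new double[2];
            for (int i = 0; i < w.length; i++) {
                w[i] = rng.nextDouble() * 2 - 1;
            }
            double t = 0.0;   // threshold, kept fixed in this sketch

            for (int n = 0; n < NUM_ITERATIONS; n++) {
                for (int j = 0; j < X.length; j++) {
                    // calculate activation
                    double a = 0.0;
                    for (int i = 0; i < w.length; i++) {
                        a += X[j][i] * w[i];
                    }
                    int output = (a > t) ? 1 : 0;
                    // update each weight by the learning rule
                    for (int i = 0; i < w.length; i++) {
                        w[i] += LEARNING_RATE * (y[j] - output) * X[j][i];
                    }
                }
            }
            System.out.println("w = [" + w[0] + ", " + w[1] + "], t = " + t);
        }
    }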

Perceptron convergence theorem:

If the data is linearly separable, then application of the Perceptron learning rule will find a separating decision boundary within a finite number of iterations.

Page 19:

Supervised Learning Pipeline for Perceptron

[Diagram: Training data → Learning algorithm (search for good parameters) → Model (if… then…); Testing data (no labels) → Model → Predicted labels.]

Page 20:

New data…. “non-linearly separable”

[Figure: height vs. weight scatter where no straight line separates the two classes.]

Our model does not match the problem! (AGAIN!)

Many mistakes!

if Σ_{i=1}^{d} w_i x_i > t then "player" else "dancer"

Page 21:

One Approach: Multilayer Perceptron

[Diagram: a multilayer perceptron with inputs x1–x5.]

Page 22:

Another Approach: Sigmoid activation – no more thresholds needed

Old rule (hard threshold): if Σ_{i=1}^{d} w_i x_i > t then y = 1 else y = 0

New rule (sigmoid): activation level a = 1 / (1 + exp(−Σ_{i=1}^{d} w_i x_i))
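A minimal Java sketch of the sigmoid activation; the class name and array values are illustrative:

    public class SigmoidUnit {

        // a = 1 / (1 + exp(-sum_i w_i * x_i))
        static double sigmoidActivation(double[] x, double[] w) {
            double s = 0.0;
            for (int i = 0; i < x.length; i++) {
                s += w[i] * x[i];
            }
            return 1.0 / (1.0 + Math.exp(-s));
        }

        public static void main(String[] args) {
            double[] x = {1.0, 0.5, 2.0};   // illustrative inputs
            double[] w = {0.2, 0.5, 0.5};   // illustrative weights
            // Output is a smooth value in (0, 1) instead of a hard 0/1 decision.
            System.out.println("a = " + sigmoidActivation(x, w));
        }
    }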

Page 23:

MLP decision boundary – nonlinear problems, solved!

[Figure: height vs. weight data separated by a nonlinear MLP decision boundary.]

Page 24:

Neural Networks - summary

Perceptrons are a (simple) emulation of a neuron.

Layering perceptrons gives you… a multilayer perceptron.

An MLP is one type of neural network – there are others.

An MLP with sigmoid activation functions can solve highly nonlinear problems.

Downside – we cannot use the simple perceptron learning algorithm.

Instead we have the “backpropagation” algorithm.

We will cover this later.

Page 25:

Perceptron Revisited: Linear Separators

• Binary classification can be viewed as the task of separating classes in feature space:

wTx + b = 0

wTx + b < 0 on one side; wTx + b > 0 on the other

f(x) = sign(wTx + b)
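A minimal Java sketch of this linear separator f(x) = sign(wTx + b); the weight vector, bias, and input values are illustrative:

    public class LinearSeparator {

        // f(x) = sign(w^T x + b): +1 on one side of the hyperplane, -1 on the other.
        static int classify(double[] w, double b, double[] x) {
            double score = b;
            for (int i = 0; i < w.length; i++) {
                score += w[i] * x[i];
            }
            return score > 0 ? 1 : -1;
        }

        public static void main(String[] args) {
            double[] w = {1.0, -1.0};   // illustrative weight vector
            double b = 0.5;             // illustrative bias
            double[] x = {2.0, 1.0};    // illustrative input
            System.out.println("f(x) = " + classify(w, b, x));
        }
    }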

Page 26:

SEE SLIDES FOR SVM

