Post on 22-Jan-2018
transcript

Neural Networks
Chris Sharkey (@shark2900)
Types of Neurons
• Linear — simple neurons; computationally limited
• Binary Threshold — fixed output upon passing a threshold
• Rectifier — variable output upon passing a threshold
• Sigmoid — outputs a smooth, bounded function
• Stochastic Binary — outputs a smooth, bounded probability function
Linear Neuron
• simple and consequently computationally limited

y = b + Σ_i x_i w_i

where y is the output, b is the bias, x_i is the i-th input, and w_i is the weight on the i-th input; the sum runs over all incoming connections, each counted as the activity on the input neuron multiplied by the weight on the line.
Linear Neuron

y = b + Σ_i x_i w_i

• plotting the output y against the bias plus the weighted activity on the input lines produces a straight line through the origin
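The linear neuron above can be sketched in a few lines of NumPy (function and variable names here are illustrative, not from the talk):

```python
import numpy as np

def linear_neuron(x, w, b):
    """Linear neuron: output is the bias plus the weighted sum of inputs,
    y = b + sum_i x_i * w_i."""
    return b + np.dot(x, w)
```

For example, inputs (1.0, 2.0) with weights (0.5, -1.0) and bias 0.25 give y = 0.25 + 0.5 - 2.0 = -1.25.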
Binary Threshold Neurons
• computes a weighted sum of inputs
• sends out a fixed spike of activity if the weighted sum exceeds a threshold

z = Σ_i x_i w_i
y = 1 if z ≥ θ, 0 otherwise

Equivalently, with the threshold folded into a bias:

z = b + Σ_i x_i w_i
y = 1 if z ≥ 0, 0 otherwise
where θ = -b
Binary Threshold Neurons
• binary output: either a spike in activity or no activity
• the spike is like a truth value
[Figure: step function; the output is 0 below the threshold weighted input and 1 above it]
Rectifier Linear Neurons
• output zero until a threshold is passed
• once the threshold is passed, the output y equals z

z = b + Σ_i x_i w_i
y = z if z > 0, 0 otherwise
Rectifier Linear Neurons
• allows for the nice properties of linear systems above zero and allows for decision making at or below zero
[Figure: output y is 0 for z ≤ 0 and grows linearly for z > 0]
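A minimal sketch of the rectified linear neuron described above (names are illustrative):

```python
import numpy as np

def relu_neuron(x, w, b):
    """Rectified linear neuron: linear in z above zero, silent otherwise."""
    z = b + np.dot(x, w)
    return z if z > 0 else 0.0
```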
Sigmoid Neurons
• give a more real-valued output
• output is a smooth and bounded function of the total input

z = b + Σ_i x_i w_i
y = 1 / (1 + e^(-z))
Sigmoid Neurons
• nice derivatives of the curve exist
• nice derivatives make learning algorithms easier
• (more details in the next talk)
[Figure: sigmoid curve of y against z, crossing y = 0.5 at z = 0]
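The sigmoid neuron above, as a short sketch (names are illustrative). Note that at z = 0 the output is exactly 0.5, matching the curve on the slide:

```python
import numpy as np

def sigmoid_neuron(x, w, b):
    """Sigmoid (logistic) neuron: smooth, bounded output in (0, 1)."""
    z = b + np.dot(x, w)
    return 1.0 / (1.0 + np.exp(-z))
```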
Stochastic Binary Neurons
• same equation as sigmoid or logistic neurons
• treat the output of the logistic as the probability of producing a spike in a short window of time

z = b + Σ_i x_i w_i
p(s = 1) = 1 / (1 + e^(-z))
Stochastic Binary Neurons
• use the same logistic as sigmoid units, but the output is interpreted as a probability
[Figure: probability p against z; p = 0.5 at z = 0]
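A sketch of the stochastic binary neuron: the same logistic as the sigmoid unit, but its value is used as the probability of emitting a spike (names and the explicit RNG argument are illustrative):

```python
import numpy as np

def stochastic_binary_neuron(x, w, b, rng):
    """Compute the logistic of the total input, then spike (output 1)
    with that probability; otherwise stay silent (output 0)."""
    z = b + np.dot(x, w)
    p = 1.0 / (1.0 + np.exp(-z))   # p(s = 1)
    return 1 if rng.random() < p else 0
```

With a strongly positive total input the neuron almost always spikes; with a strongly negative one it almost never does.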
Questions?
Perceptrons
• first generation of neural networks
• a good first example of a neural network
• built from binary threshold neurons
• trained binary neurons work as classifiers
• example abilities include pattern recognition
• popularized by Frank Rosenblatt in the 1960s
[Figure: perceptron with inputs x1, x2, bias b, and weights w1, w2]
Perceptrons — learning procedure:
• add an extra component with value 1 to each input vector; this accounts for the bias value
• pick training cases using any policy that ensures every training case keeps getting picked; begin by randomly assigning weights, then adjust them iteratively:
  ▣ if the output unit is correct, do not change the weights
  ▣ if the output unit is incorrect and outputs a 0, add the input vector to the weight vector
  ▣ if the output unit is incorrect and outputs a 1, subtract the input vector from the weight vector
• stop when a set of weights that correctly classifies all training cases is found
• assuming such a set of weights exists
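The learning procedure above can be sketched as follows (function names, the epoch cap, and the cyclic training-case policy are illustrative choices, not from the talk; the slides only require that every case keeps getting picked):

```python
import numpy as np

def train_perceptron(inputs, targets, epochs=100, seed=0):
    """Perceptron learning: append a 1 to each input for the bias,
    start from random weights, and adjust per the three update rules."""
    rng = np.random.default_rng(seed)
    X = np.hstack([inputs, np.ones((len(inputs), 1))])  # extra component = 1
    w = rng.standard_normal(X.shape[1])                 # random initial weights
    for _ in range(epochs):
        errors = 0
        for x, t in zip(X, targets):
            y = 1 if np.dot(x, w) >= 0 else 0           # binary threshold unit
            if y == t:
                continue        # correct: leave the weights unchanged
            elif y == 0:
                w += x          # wrongly output 0: add the input vector
            else:
                w -= x          # wrongly output 1: subtract the input vector
            errors += 1
        if errors == 0:         # all training cases classified correctly
            return w
    return w
```

For a linearly separable problem such as logical OR, this converges to weights that classify every case correctly.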
Perceptrons — weight space
[Figure: weight-space diagrams showing input vectors with correct answer (=1) alongside good and bad weight vectors]
Limitations of Perceptrons: the flaws and advantages of perceptrons
What is next? Types of network architectures
Thank you