Post on 08-Sep-2020
transcript
Neural networks
Nando de Freitas
Outline of the lecture
This lecture introduces you to the fascinating subject of classification and regression with artificial neural networks. In particular, it:

- Introduces multi-layer perceptrons (MLPs).
- Teaches you how to combine probability with neural networks so that the nets can be applied to regression, binary classification and multivariate classification.
- Discusses the modular approach to backpropagation and neural network construction in Torch, which was introduced in the previous lecture.
MLP – 1 neuron

[Slide diagram: a single neuron with inputs xi1 and xi2 and a bias input 1, weighted by θ1, θ2, θ3, producing a pre-activation u. Output: ŷi = P(yi = 1 | xi, θ).]
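The single neuron on the slide can be sketched in a few lines of numpy (an illustrative sketch, not the lecture's Torch code; the function name `neuron` and the choice of a logistic/sigmoid activation for the binary-classification output are assumptions):

```python
import numpy as np

def sigmoid(u):
    """Logistic activation: squashes the pre-activation u into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-u))

def neuron(x, theta):
    """One neuron: theta[0] and theta[1] weight the two inputs,
    theta[2] is the bias weight on the constant input 1.
    Returns y_hat = P(y = 1 | x, theta)."""
    u = theta[0] * x[0] + theta[1] * x[1] + theta[2]  # pre-activation u
    return sigmoid(u)

x = np.array([0.5, -1.0])
theta = np.array([1.0, 2.0, 0.0])
y_hat = neuron(x, theta)  # a probability in (0, 1)
```

Because the output is a probability, it can be plugged directly into a Bernoulli likelihood for training.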
MLP – 3 neurons, 2 layers

[Slide diagram: inputs xi1 and xi2 plus a bias input 1 feed a hidden layer of two neurons (pre-activations u, activations o); the hidden activations plus a bias feed one output neuron. Each connection carries a weight θ. Output: ŷi = P(yi = 1 | xi, θ).]
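The two-layer network above can be sketched as a forward pass in numpy (a minimal sketch, not the lecture's Torch code; sigmoid hidden units and the parameter values are assumptions for illustration):

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def mlp_forward(x, W1, b1, w2, b2):
    """Two-layer MLP for binary classification: a hidden layer of
    sigmoid neurons feeds a single sigmoid output neuron, so the
    result is y_hat = P(y = 1 | x, theta)."""
    o = sigmoid(W1 @ x + b1)     # hidden activations o from pre-activations u
    return sigmoid(w2 @ o + b2)  # output neuron

x = np.array([0.5, -1.0])
W1 = np.array([[1.0, -1.0],      # weights of hidden neuron 1
               [0.5,  2.0]])     # weights of hidden neuron 2
b1 = np.array([0.0, 0.1])        # hidden biases
w2 = np.array([1.5, -0.5])       # output weights
b2 = 0.2                         # output bias
y_hat = mlp_forward(x, W1, b1, w2, b2)
```

The hidden layer lets the network carve out decision boundaries a single neuron cannot represent.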
MLP – Regression

[Slide diagram: the same two-layer network, but with a linear output neuron, so that ŷi = E(yi | xi, θ).]
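For regression, only the output neuron changes: it becomes linear (identity activation), so the prediction is an unbounded estimate of E(y | x, θ). A numpy sketch under the same assumptions as before (not the lecture's Torch code):

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def mlp_regression(x, W1, b1, w2, b2):
    """Same sigmoid hidden layer, but a linear (identity) output,
    so y_hat estimates E(y | x, theta) and is not confined to (0, 1)."""
    o = sigmoid(W1 @ x + b1)
    return w2 @ o + b2   # no squashing on the output

# With zero weights everywhere, the hidden activations are all 0.5
# and the prediction reduces to the output bias.
y_hat = mlp_regression(np.zeros(2), np.zeros((2, 2)), np.zeros(2),
                       np.zeros(2), 3.0)
```

Pairing this output with a Gaussian likelihood recovers the familiar squared-error training loss.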
MLP – Multiclass

[Slide diagram: the same hidden layer feeding three output neurons ŷi1, ŷi2, ŷi3, one per class, each with its own weights θ.]
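With one output neuron per class, a softmax turns the K output pre-activations into class probabilities that sum to one. A numpy sketch (the softmax output and all parameter values are illustrative assumptions, not the lecture's Torch code):

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def softmax(u):
    e = np.exp(u - u.max())   # subtract max for numerical stability
    return e / e.sum()

def mlp_multiclass(x, W1, b1, W2, b2):
    """Sigmoid hidden layer, then a softmax over K output neurons:
    y_hat[k] = P(y = k | x, theta), so the outputs sum to 1."""
    o = sigmoid(W1 @ x + b1)
    return softmax(W2 @ o + b2)

x = np.array([0.5, -1.0])
W1 = np.array([[1.0, -1.0], [0.5, 2.0]])
b1 = np.zeros(2)
W2 = np.array([[1.0, 0.0],   # 3 output neurons, one per class
               [0.0, 1.0],
               [0.5, 0.5]])
b2 = np.zeros(3)
p = mlp_multiclass(x, W1, b1, W2, b2)  # p[0], p[1], p[2] = y_hat_i1..i3
```

Each entry of `p` is one of the slide's outputs ŷi1, ŷi2, ŷi3.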
Deep learning & backprop
Reverse auto-diff: why it works
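Reverse-mode auto-diff works because the chain rule can be accumulated from the output back to the inputs: the forward pass stores intermediate values, and the backward pass multiplies the local derivatives in reverse order. A tiny numeric sketch (the example function f(x) = sin(x²) is my own illustration), checked against a finite difference:

```python
import numpy as np

def f_and_grad(x):
    """f(x) = sin(x^2), with its gradient computed in reverse mode."""
    a = x * x            # forward: a = x^2 (intermediate stored)
    y = np.sin(a)        # forward: y = sin(a)
    dy_da = np.cos(a)    # backward: local derivative of sin at a
    da_dx = 2.0 * x      # backward: local derivative of the square at x
    return y, dy_da * da_dx  # chain rule, accumulated output-to-input

x = 0.7
y, g = f_and_grad(x)

# Sanity check: the reverse-pass gradient should match a central
# finite difference of the forward pass.
eps = 1e-6
g_numeric = (f_and_grad(x + eps)[0] - f_and_grad(x - eps)[0]) / (2 * eps)
```

The same reverse accumulation, applied layer by layer, is exactly what backprop does in an MLP.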
Deep learning: linear layer
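In the modular (Torch-style) view, each layer implements a forward and a backward method. A minimal numpy sketch of a linear layer (the class name and method signatures are my own simplification of the Torch convention):

```python
import numpy as np

class Linear:
    """Linear module: forward computes W x + b; backward receives
    dL/d(output), stores the parameter gradients, and returns
    dL/d(input) so the gradient can keep flowing to earlier layers."""
    def __init__(self, W, b):
        self.W, self.b = W, b
    def forward(self, x):
        self.x = x                  # cache the input for backward
        return self.W @ x + self.b
    def backward(self, grad_out):
        self.grad_W = np.outer(grad_out, self.x)  # dL/dW
        self.grad_b = grad_out                    # dL/db
        return self.W.T @ grad_out                # dL/dx

layer = Linear(np.array([[1.0, 2.0], [3.0, 4.0]]), np.zeros(2))
out = layer.forward(np.array([1.0, 1.0]))   # [3., 7.]
grad_in = layer.backward(np.ones(2))        # W^T @ [1, 1] = [4., 6.]
```

Caching the input in `forward` is what makes the parameter gradients cheap to compute in `backward`.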
Deep learning: ReLU layer
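The ReLU layer fits the same modular pattern and needs no parameters; it only has to remember where its input was positive. A numpy sketch in the same simplified style:

```python
import numpy as np

class ReLU:
    """ReLU module: forward zeroes the negative entries; backward
    passes the incoming gradient through only where the input
    was positive (the derivative of max(0, x))."""
    def forward(self, x):
        self.mask = x > 0     # remember which entries were active
        return x * self.mask
    def backward(self, grad_out):
        return grad_out * self.mask

relu = ReLU()
out = relu.forward(np.array([-1.0, 0.5, 2.0]))  # [0., 0.5, 2.]
grad = relu.backward(np.ones(3))                # [0., 1., 1.]
```

Because every module exposes the same forward/backward interface, linear and ReLU layers can be stacked freely and backprop falls out of composing their backward calls in reverse order.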
Deep learning: extremely flexible!
Next lecture
In the next lecture, we will look at a successful type of neural network that is very popular in speech and object recognition, known as a convolutional neural network.