Post on 03-Apr-2018
transcript
7/28/2019 7 Neural Networks Learning
Neural Networks: Learning
Cost function
Machine Learning
Neural Network (Classification)

Layer 1  Layer 2  Layer 3  Layer 4

L = total no. of layers in network
s_l = no. of units (not counting bias unit) in layer l

Binary classification: 1 output unit
Multi-class classification (K classes): K output units
E.g. pedestrian, car, motorcycle, truck
Neural Networks: Learning
Backpropagation algorithm
Machine Learning
Gradient computation

Need code to compute:
- J(Θ)
- ∂/∂Θ_ij^(l) J(Θ)
Gradient computation

Given one training example (x, y):
Forward propagation:
a^(1) = x
z^(2) = Θ^(1) a^(1)
a^(2) = g(z^(2))  (add a_0^(2))
z^(3) = Θ^(2) a^(2)
a^(3) = g(z^(3))  (add a_0^(3))
z^(4) = Θ^(3) a^(3)
a^(4) = h_Θ(x) = g(z^(4))

Layer 1  Layer 2  Layer 3  Layer 4
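The slides' code is Octave; as a language-neutral sketch of the forward-propagation steps above, here is a stdlib-only Python version. The names `forward_prop` and `sigmoid` are illustrative, and the network is list-based rather than vectorized for clarity:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_prop(x, thetas):
    """Forward propagation: a^(1) = x (with bias unit), then for each layer
    z^(l+1) = Theta^(l) a^(l) and a^(l+1) = g(z^(l+1)).
    thetas is a list of weight matrices (lists of rows, bias column first).
    Returns the list of activations for every layer."""
    a = [1.0] + list(x)                     # add bias unit a_0 = 1
    activations = [a]
    for l, theta in enumerate(thetas):
        z = [sum(w * ai for w, ai in zip(row, a)) for row in theta]
        a = [sigmoid(zj) for zj in z]
        if l < len(thetas) - 1:             # bias unit for every hidden layer
            a = [1.0] + a
        activations.append(a)
    return activations
```

With all-zero weights every unit outputs sigmoid(0) = 0.5, which is a handy sanity check.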
Gradient computation: Backpropagation algorithm

Intuition: δ_j^(l) = "error" of node j in layer l.

Layer 1  Layer 2  Layer 3  Layer 4

For each output unit (layer L = 4):
δ_j^(4) = a_j^(4) - y_j
Backpropagation algorithm

Training set {(x^(1), y^(1)), ..., (x^(m), y^(m))}

Set Δ_ij^(l) = 0 (for all l, i, j).
For i = 1 to m
  Set a^(1) = x^(i)
  Perform forward propagation to compute a^(l) for l = 2, 3, ..., L
  Using y^(i), compute δ^(L) = a^(L) - y^(i)
  Compute δ^(L-1), δ^(L-2), ..., δ^(2)
  Δ_ij^(l) := Δ_ij^(l) + a_j^(l) δ_i^(l+1)
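As a rough illustration of the loop body above, here is a minimal pure-Python sketch for a 3-layer network (one hidden layer), processing a single training example. The function name and the list-of-lists matrix representation are my own, not from the course:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def backprop_one_example(x, y, theta1, theta2):
    """One iteration of the algorithm above for a 3-layer network:
    forward prop, then delta^(3) = a^(3) - y and
    delta^(2) = (Theta2^T delta^(3)) .* a^(2) .* (1 - a^(2)),
    returning the accumulated terms Delta^(l) = delta^(l+1) (a^(l))^T."""
    a1 = [1.0] + list(x)                                   # bias + inputs
    z2 = [sum(w * a for w, a in zip(row, a1)) for row in theta1]
    a2 = [1.0] + [sigmoid(z) for z in z2]                  # bias + hidden units
    z3 = [sum(w * a for w, a in zip(row, a2)) for row in theta2]
    a3 = [sigmoid(z) for z in z3]

    d3 = [a - t for a, t in zip(a3, y)]                    # delta^(3) = a^(3) - y
    # delta^(2): back-propagate through Theta2, times g'(z) = a .* (1 - a);
    # the bias entry (index 0) of a2 gets no delta.
    d2 = [sum(theta2[k][j] * d3[k] for k in range(len(d3))) * a2[j] * (1 - a2[j])
          for j in range(1, len(a2))]

    Delta1 = [[d * a for a in a1] for d in d2]             # delta^(2) (a^(1))^T
    Delta2 = [[d * a for a in a2] for d in d3]             # delta^(3) (a^(2))^T
    return Delta1, Delta2
```

Summing these Delta terms over all m examples (and dividing by m, plus the regularization term for j ≠ 0) gives the partial derivatives D_ij^(l).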
Neural Networks: Learning
Backpropagation intuition
Machine Learning
Forward Propagation
Forward Propagation
What is backpropagation doing?

Focusing on a single example x^(i), y^(i), the case of 1 output unit,
and ignoring regularization (λ = 0),

cost(i) = y^(i) log h_Θ(x^(i)) + (1 - y^(i)) log(1 - h_Θ(x^(i)))

(Think of cost(i) ≈ (h_Θ(x^(i)) - y^(i))^2.)
I.e. how well is the network doing on example i?
Forward Propagation

δ_j^(l) = "error" of cost for a_j^(l) (unit j in layer l).

Formally, δ_j^(l) = ∂/∂z_j^(l) cost(i) (for j ≥ 0), where
cost(i) = y^(i) log h_Θ(x^(i)) + (1 - y^(i)) log(1 - h_Θ(x^(i)))
Neural Networks: Learning
Implementation note: Unrolling parameters
Machine Learning
Advanced optimization

function [jVal, gradient] = costFunction(theta)
optTheta = fminunc(@costFunction, initialTheta, options)

Neural Network (L = 4):
Θ^(1), Θ^(2), Θ^(3) - matrices (Theta1, Theta2, Theta3)
D^(1), D^(2), D^(3) - matrices (D1, D2, D3)

"Unroll" into vectors
Example

s_1 = 10, s_2 = 10, s_3 = 1

thetaVec = [Theta1(:); Theta2(:); Theta3(:)];
DVec = [D1(:); D2(:); D3(:)];

Theta1 = reshape(thetaVec(1:110), 10, 11);
Theta2 = reshape(thetaVec(111:220), 10, 11);
Theta3 = reshape(thetaVec(221:231), 1, 11);
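The same unrolling idea can be sketched in plain Python. The slides use Octave's `(:)` and `reshape`; these stdlib-only helpers mimic the column-major behavior, and their names (`unroll`, `reshape_mat`) are my own:

```python
def unroll(*matrices):
    """Flatten matrices (lists of rows) into one long vector,
    column-major like Octave's Theta1(:)."""
    vec = []
    for m in matrices:
        rows, cols = len(m), len(m[0])
        vec.extend(m[i][j] for j in range(cols) for i in range(rows))
    return vec

def reshape_mat(vec, rows, cols):
    """Inverse of unroll for one matrix: rebuild a rows x cols matrix
    from a column-major vector."""
    return [[vec[j * rows + i] for j in range(cols)] for i in range(rows)]
```

For example, `unroll([[1, 2], [3, 4]])` walks down each column first, matching Octave's column-major convention, and `reshape_mat` reverses it.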
Learning Algorithm

Have initial parameters Θ^(1), Θ^(2), Θ^(3).
Unroll to get initialTheta to pass to
fminunc(@costFunction, initialTheta, options)

function [jval, gradientVec] = costFunction(thetaVec)
  From thetaVec, get Θ^(1), Θ^(2), Θ^(3).
  Use forward prop/backprop to compute D^(1), D^(2), D^(3) and J(Θ).
  Unroll D^(1), D^(2), D^(3) to get gradientVec.
Neural Networks: Learning
Gradient checking
Machine Learning
Numerical estimation of gradients

Implement:
gradApprox = (J(theta + EPSILON) - J(theta - EPSILON)) / (2*EPSILON)
Parameter vector θ

θ ∈ R^n (e.g. θ is the unrolled version of Θ^(1), Θ^(2), Θ^(3))

∂/∂θ_i J(θ) ≈ (J(θ_1, ..., θ_i + ε, ..., θ_n) - J(θ_1, ..., θ_i - ε, ..., θ_n)) / (2ε)
for i = 1:n,
  thetaPlus = theta;
  thetaPlus(i) = thetaPlus(i) + EPSILON;
  thetaMinus = theta;
  thetaMinus(i) = thetaMinus(i) - EPSILON;
  gradApprox(i) = (J(thetaPlus) - J(thetaMinus)) / (2*EPSILON);
end;

Check that gradApprox ≈ DVec
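A stdlib-only Python rendering of the same loop, assuming `J` is a cost function that takes a parameter list; the name `grad_approx` is illustrative:

```python
def grad_approx(J, theta, eps=1e-4):
    """Two-sided numerical gradient of cost J at theta, one component
    at a time: gradApprox(i) = (J(theta + eps*e_i) - J(theta - eps*e_i))
    / (2*eps)."""
    grad = []
    for i in range(len(theta)):
        theta_plus = list(theta)
        theta_plus[i] += eps
        theta_minus = list(theta)
        theta_minus[i] -= eps
        grad.append((J(theta_plus) - J(theta_minus)) / (2 * eps))
    return grad
```

For J(θ) = θ_1² + θ_2² at θ = (1, 2), this returns a close approximation of the true gradient (2, 4), which is exactly the kind of agreement you check against DVec.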
Implementation Note:
- Implement backprop to compute DVec (unrolled D^(1), D^(2), D^(3)).
- Implement numerical gradient check to compute gradApprox.
- Make sure they give similar values.
- Turn off gradient checking. Use backprop code for learning.

Important:
- Be sure to disable your gradient checking code before training
  your classifier. If you run the numerical gradient computation on
  every iteration of gradient descent (or in the inner loop of
  costFunction()) your code will be very slow.
Neural Networks: Learning
Random initialization
Machine Learning
Initial value of Θ

For gradient descent and advanced optimization
methods, need an initial value for Θ.

optTheta = fminunc(@costFunction,
                   initialTheta, options)

Consider gradient descent.
Set initialTheta = zeros(n,1)?
Zero initialization

After each update, parameters corresponding to inputs going into each of
two hidden units are identical.
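A tiny pure-Python demonstration of this symmetry problem, using a 2-hidden-unit network with identical (zero) weights; all names and values here are illustrative, not from the slides:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Tiny network: 2 inputs, 2 hidden units, 1 output, all weights zero.
# Both hidden units compute the same activation and receive the same
# backprop "error" signal, so gradient updates can never make them differ.
x = [1.0, 0.5]
theta1 = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]   # identical rows (zero init)
theta2 = [[0.0, 0.0, 0.0]]

a1 = [1.0] + x
a2 = [sigmoid(sum(w * a for w, a in zip(row, a1))) for row in theta1]
z3 = sum(w * a for w, a in zip(theta2[0], [1.0] + a2))
a3 = sigmoid(z3)
d3 = a3 - 1.0                                  # suppose the label is y = 1
d2 = [theta2[0][j + 1] * d3 * a2[j] * (1 - a2[j]) for j in range(2)]

assert a2[0] == a2[1]                          # identical activations
assert d2[0] == d2[1]                          # identical gradient signals
```

Because the two hidden-unit weight rows start equal and always receive equal updates, the network effectively has one hidden unit, no matter how wide the layer is.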
Random initialization: Symmetry breaking

Initialize each Θ_ij^(l) to a random value in [-ε, ε]
(i.e. -ε ≤ Θ_ij^(l) ≤ ε)

E.g.
Theta1 = rand(10,11)*(2*INIT_EPSILON) - INIT_EPSILON;
Theta2 = rand(1,11)*(2*INIT_EPSILON) - INIT_EPSILON;
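A stdlib-only Python equivalent of the Octave snippet above; the name `rand_init` and the default `init_epsilon = 0.12` are my own choices for illustration:

```python
import random

def rand_init(rows, cols, init_epsilon=0.12):
    """Initialize a rows x cols weight matrix with values uniform in
    [-init_epsilon, init_epsilon], mirroring
    rand(rows, cols) * (2*INIT_EPSILON) - INIT_EPSILON."""
    return [[random.random() * 2 * init_epsilon - init_epsilon
             for _ in range(cols)] for _ in range(rows)]
```

Each entry lands strictly inside the [-ε, ε] interval, so no two weight rows start identical (except with vanishing probability), which breaks the symmetry of zero initialization.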
Neural Networks: Learning
Putting it together
Machine Learning
Training a neural network

Pick a network architecture (connectivity pattern between neurons)
No. of input units: dimension of features x^(i)
No. of output units: number of classes
Reasonable default: 1 hidden layer, or if >1 hidden layer, have same
no. of hidden units in every layer (usually the more the better)
Training a neural network

1. Randomly initialize weights
2. Implement forward propagation to get h_Θ(x^(i)) for any x^(i)
3. Implement code to compute cost function J(Θ)
4. Implement backprop to compute partial derivatives ∂/∂Θ_jk^(l) J(Θ)
   for i = 1:m
     Perform forward propagation and backpropagation using
     example (x^(i), y^(i))
     (Get activations a^(l) and delta terms δ^(l) for l = 2, ..., L)
Training a neural network

5. Use gradient checking to compare ∂/∂Θ_jk^(l) J(Θ) computed using
   backpropagation vs. using numerical estimate of gradient
   of J(Θ). Then disable gradient checking code.
6. Use gradient descent or advanced optimization method with
   backpropagation to try to minimize J(Θ) as a function of
   parameters Θ
Neural Networks: Learning
Backpropagation example: Autonomous driving (optional)
Machine Learning
[Courtesy of Dean Pomerleau]