
A Primer on Back-Propagation of Errors (applied to neural networks)

Auro Tripathy, auro@shatterline.com

Outline

• Summary of Forward-Propagation
• The Calculus of Back-Propagation
• Summary

Feed-forward to calculate the error relative to the desired output

Error Function (aka Loss, Cost, or Objective Function)

• In the feed-forward path, calculate the error relative to the desired output.
• We define an error function E(X3, Y) as the “penalty” of predicting X3 when the true output is Y.
• The objective is to minimize the error across all the training samples.
• The error/loss E(X3, Y) assigns a numerical score (a scalar) to the network’s output X3 given the expected output Y (see the sketch below).
• The loss is zero only for cases where the neural network’s output is correct.
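A minimal sketch of such a loss, assuming squared error (the slides don't name a specific loss function; squared error and the helper name are illustrative):

    import numpy as np

    def squared_error(x3, y):
        """Scalar 'penalty' for predicting x3 when the true output is y.
        Zero only when the network's output is exactly correct."""
        return 0.5 * np.sum((x3 - y) ** 2)

    print(squared_error(np.array([0.9]), np.array([1.0])))  # ≈ 0.005
    print(squared_error(np.array([1.0]), np.array([1.0])))  # 0.0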

Sigmoid Activation Function

The sigmoid activation function

σ(x) = 1 / (1 + e^(−x))

is an S-shaped activation function transforming all values of x into the range (0, 1).

Image: https://en.wikipedia.org/wiki/File:Logistic-curve.svg
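A minimal sketch of the sigmoid in code (NumPy and the function name are assumptions):

    import numpy as np

    def sigmoid(x):
        """S-shaped activation: maps any real x into (0, 1)."""
        return 1.0 / (1.0 + np.exp(-x))

    print(sigmoid(0.0))   # 0.5, the midpoint of the S-curve
    print(sigmoid(6.0))   # ≈ 0.9975, saturating toward 1
    print(sigmoid(-6.0))  # ≈ 0.0025, saturating toward 0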

Gradient Descent

(Figure: gradient descent on an error surface)

Note: in practice, we don't expect a single global minimum like the one pictured here.
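A minimal sketch of one gradient-descent update (the learning rate and the toy loss E(w) = w² are assumptions, not from the slides):

    def gradient_descent_step(w, grad, lr=0.1):
        """One update: step the weight against the gradient of the loss."""
        return w - lr * grad

    # Example on E(w) = w**2, whose gradient is 2*w:
    w = 3.0
    for _ in range(20):
        w = gradient_descent_step(w, 2 * w)
    print(w)  # ≈ 0.035, approaching the minimum at w = 0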

Derivative of the Sigmoid Activation Function

(Figure: a neuron with input P3 and output X3)

For the sigmoid function, the cool thing is that the derivative of the output X3 (with respect to the input P3) is expressed in terms of the output itself:

dX3/dP3 = X3 · (1 − X3)

Image: http://kawahara.ca/wp-content/uploads/derivative_of_sigmoid.jpg
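A sketch verifying this identity numerically (NumPy and the function names are assumptions):

    import numpy as np

    def sigmoid(p):
        return 1.0 / (1.0 + np.exp(-p))

    def sigmoid_derivative_from_output(x3):
        """dX3/dP3 expressed purely in terms of the output: X3 * (1 - X3)."""
        return x3 * (1.0 - x3)

    # Compare against a numerical derivative at P3 = 0.5:
    p3, eps = 0.5, 1e-6
    x3 = sigmoid(p3)
    numeric = (sigmoid(p3 + eps) - sigmoid(p3 - eps)) / (2 * eps)
    print(sigmoid_derivative_from_output(x3), numeric)  # both ≈ 0.2350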

Propagate the errors backward and adjust the weights, w, so the actual output mimics the desired output
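A minimal sketch of this backward pass for a single sigmoid unit with squared-error loss (the network shape, initial weights, learning rate, and sample values are assumptions):

    import numpy as np

    def sigmoid(p):
        return 1.0 / (1.0 + np.exp(-p))

    w, b = 0.5, -0.3   # assumed initial weight and bias
    x, y = 1.0, 0.0    # one training sample: input x, desired output y
    lr = 0.5           # assumed learning rate

    for _ in range(200):
        # Forward pass: P3 = w*x + b, X3 = sigmoid(P3)
        x3 = sigmoid(w * x + b)
        # Backward pass: chain rule through the loss and the sigmoid
        delta = (x3 - y) * x3 * (1.0 - x3)
        # Adjust the weights so the actual output mimics the desired output
        w -= lr * delta * x
        b -= lr * delta
    print(sigmoid(w * x + b))  # ≈ 0.05, moving toward the target y = 0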

Computations are Localized & Partially Pre-computed in the Previous Layer
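A sketch of that locality for two layers: each layer's gradient uses only its own inputs plus the delta already computed for the layer above it (the shapes and names here are assumptions):

    import numpy as np

    def sigmoid(p):
        return 1.0 / (1.0 + np.exp(-p))

    rng = np.random.default_rng(0)
    x1 = rng.normal(size=3)          # input
    W2 = rng.normal(size=(4, 3))     # layer-2 weights
    W3 = rng.normal(size=(2, 4))     # layer-3 weights
    y = np.array([0.0, 1.0])         # desired output

    # Forward pass
    x2 = sigmoid(W2 @ x1)
    x3 = sigmoid(W3 @ x2)

    # Backward pass: delta3 is computed once at the output layer...
    delta3 = (x3 - y) * x3 * (1.0 - x3)
    grad_W3 = np.outer(delta3, x2)

    # ...and the previous layer reuses it instead of recomputing it.
    delta2 = (W3.T @ delta3) * x2 * (1.0 - x2)
    grad_W2 = np.outer(delta2, x1)
    print(grad_W3.shape, grad_W2.shape)  # (2, 4) (4, 3)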

Summary

☑ If there's a representative set of inputs and outputs, then back-propagation can learn the weights.

☑ Back-propagation's running time is linear in the number of layers.

☑ Simple to implement (and test).

Credits

Concepts crystallized from MIT Professor Patrick Winston's lecture, https://www.youtube.com/watch?v=q0pm3BrIUFo

auro@shatterline.com
