Neural Network Algorithms: Formulas


Upload: zarnigar-altaf

Post on 21-Dec-2014



Each algorithm below is listed with its architecture, net input, activation function, weight update, bias update, and stopping condition.

Hebb-Net
  Architecture: single layer, feed-forward
  Net input: -
  Activation function: -
  Weight update: w_ij(new) = w_ij(old) + x_i y
  Bias update: b_j(new) = b_j(old) + y
  Stopping condition: only one iteration (a single pass over the samples)
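The Hebb rule above can be sketched in a few lines. The AND training set with bipolar inputs and targets is an illustrative assumption, not part of the table:

```python
# Hebb rule: one pass, w_i += x_i * y and b += y (AND function, bipolar coding).
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0.0, 0.0]  # weights
b = 0.0         # bias

# Single iteration over all samples, as the stopping condition requires
for (x1, x2), y in samples:
    w[0] += x1 * y
    w[1] += x2 * y
    b += y

print(w, b)  # → [2.0, 2.0] -2.0
```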

Perceptron
  Architecture: two layers (input and output units), feed-forward
  Net input: y_in = b_j + Σ x_i w_ij
  Activation function:
    y = 1  if y_in > θ
    y = 0  if -θ ≤ y_in ≤ θ
    y = -1 if y_in < -θ
  Weight update: w_ij(new) = w_ij(old) + α t x_i   (t = target)
  Bias update: b_j(new) = b_j(old) + α t
  Stopping condition: y = t for all training samples
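A minimal sketch of the perceptron rule with the three-valued activation from the table. The values θ = 0.2, α = 1, and the bipolar AND data are assumptions for illustration:

```python
# Perceptron training: update only on error, stop when y == t for all samples.
theta, alpha = 0.2, 1.0
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w, b = [0.0, 0.0], 0.0

def activate(y_in):
    # y = 1 if y_in > theta; 0 if -theta <= y_in <= theta; -1 if y_in < -theta
    if y_in > theta:
        return 1
    if y_in < -theta:
        return -1
    return 0

for _ in range(100):                      # safety cap on epochs
    changed = False
    for (x1, x2), t in samples:
        y = activate(b + x1 * w[0] + x2 * w[1])
        if y != t:                        # w += alpha * t * x, b += alpha * t
            w[0] += alpha * t * x1
            w[1] += alpha * t * x2
            b += alpha * t
            changed = True
    if not changed:                       # y = t for all samples: stop
        break

print(w, b)
```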

Adaline
  Architecture: single layer, feed-forward
  Net input: y_in = Σ x_i w_i + b
  Activation function:
    y = 1  if y_in ≥ θ
    y = -1 if y_in < θ
  Weight update: w_i(new) = w_i(old) + α (t - y_in) x_i
  Bias update: b(new) = b(old) + α (t - y_in)
  Stopping condition: the greatest weight change in an epoch is smaller than the chosen threshold

Madaline
  Architecture: two layers (hidden Adaline units z_1, z_2 feeding one output unit)
  Net input: z_in_j = b_j + Σ x_i w_ij ;  y_in = b_3 + z_1 v_1 + z_2 v_2
  Activation function: f(x) = 1 if x ≥ 0, -1 if x < 0
  Weight and bias update:
    when t = -1: b_j(new) = b_j(old) + α (-1 - z_in_j),  w_ij(new) = w_ij(old) + α (-1 - z_in_j) x_i
    when t = 1:  b_j(new) = b_j(old) + α (1 - z_in_j),   w_ij(new) = w_ij(old) + α (1 - z_in_j) x_i
  Stopping condition: weight changes have stopped (an epoch completes with no updates)
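Both Madaline cases can be written as one expression, since α(-1 - z_in_j) and α(1 - z_in_j) are both α(t - z_in_j). A single update step for one hidden unit; the numeric values (α = 0.5, x, w, b, t) are assumptions for illustration:

```python
# One Madaline weight/bias step for a hidden unit j:
# delta = alpha * (t - z_in_j) covers both the t = -1 and t = 1 cases of the table.
alpha = 0.5
x = [1, -1]          # inputs to the hidden unit
w = [0.1, 0.2]       # weights w_1j, w_2j
b = 0.3              # bias b_j

z_in = b + sum(xi * wi for xi, wi in zip(x, w))   # net input z_in_j = 0.2

t = 1                                             # target for this step
delta = alpha * (t - z_in)                        # = 0.5 * (1 - 0.2) = 0.4
w = [wi + delta * xi for wi, xi in zip(w, x)]     # w_ij += alpha*(t - z_in_j)*x_i
b = b + delta                                     # b_j  += alpha*(t - z_in_j)

print(w, b)  # → [0.5, -0.2] 0.7
```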

Hetero-associative
  Architecture: single layer
  Net input: y_in_j = Σ x_i w_ij
  Activation function:
    y_j = 1  if y_in_j > θ_j
    y_j unchanged if y_in_j = θ_j
    y_j = -1 if y_in_j < θ_j
  Weight update: w_ij(new) = w_ij(old) + s_i t_j
  Stopping condition: all samples have been processed
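A sketch of hetero-associative storage and recall with w_ij(new) = w_ij(old) + s_i t_j. The two bipolar pattern pairs and θ_j = 0 are illustrative assumptions:

```python
# Hetero-associative memory: Hebbian outer-product training, then recall.
pairs = [((1, -1), (1, -1, 1)),
         ((-1, 1), (-1, 1, -1))]          # (input s, target t) pairs, bipolar

n_in, n_out = 2, 3
W = [[0.0] * n_out for _ in range(n_in)]
for s, t in pairs:                        # w_ij += s_i * t_j for every pair
    for i in range(n_in):
        for j in range(n_out):
            W[i][j] += s[i] * t[j]

def recall(x):
    # y_in_j = sum_i x_i w_ij, thresholded at theta_j = 0
    return [1 if sum(x[i] * W[i][j] for i in range(n_in)) > 0 else -1
            for j in range(n_out)]

print(recall((1, -1)))   # → [1, -1, 1]
print(recall((-1, 1)))   # → [-1, 1, -1]
```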

Auto-associative
  Architecture: single layer
  Net input: y_in_j = Σ x_i w_ij
  Activation function: y_j = 1 if y_in_j > 0, -1 if y_in_j < 0
  Weight update: w_ij(new) = w_ij(old) + x_i y_j
  Stopping condition: all samples have been processed
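The auto-associative case stores a pattern against itself (y = x in the update), which lets the net repair a corrupted cue. The stored pattern, the zeroed diagonal, and the noisy cue are illustrative assumptions:

```python
# Auto-associative memory: w_ij += x_i * y_j with y = x; recall a clean
# pattern from a corrupted version (diagonal zeroed, a common convention).
pattern = [1, 1, -1, -1]
n = len(pattern)
W = [[pattern[i] * pattern[j] if i != j else 0.0 for j in range(n)]
     for i in range(n)]

def recall(x):
    # y_in_j = sum_i x_i w_ij, thresholded at 0
    return [1 if sum(x[i] * W[i][j] for i in range(n)) > 0 else -1
            for j in range(n)]

noisy = [1, -1, -1, -1]  # second component flipped
print(recall(noisy))     # → [1, 1, -1, -1]
```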

Discrete Hopfield
  Learning: unsupervised
  Architecture: recurrent (feedback connections)
  Net input: y_in_i = x_i + Σ y_j w_ji
  Activation function:
    y_i = 1 if y_in_i > θ
    y_i unchanged if y_in_i = θ
    y_i = 0 if y_in_i < θ
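Discrete Hopfield recall can be sketched with asynchronous unit updates. The stored pattern, θ = 0, the bipolar outer-product weights with zero diagonal, and the corrupted cue are illustrative assumptions:

```python
# Discrete Hopfield: y_in_i = x_i + sum_j y_j w_ji; unit i becomes 1 if
# y_in_i > theta, 0 if y_in_i < theta, and stays unchanged at equality.
theta = 0
stored = [1, 1, 1, 0]                     # binary pattern to store
n = len(stored)
bip = [2 * s - 1 for s in stored]         # bipolar form for the weight rule
W = [[bip[i] * bip[j] if i != j else 0 for j in range(n)] for i in range(n)]

x = [0, 1, 1, 0]                          # corrupted cue (first bit flipped)
y = list(x)
for _ in range(3):                        # a few asynchronous sweeps
    for i in range(n):
        y_in = x[i] + sum(y[j] * W[j][i] for j in range(n))
        if y_in > theta:
            y[i] = 1
        elif y_in < theta:
            y[i] = 0
        # y[i] unchanged when y_in == theta

print(y)  # → [1, 1, 1, 0], the stored pattern
```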

Backpropagation
  Architecture: multi-layer, feed-forward; supervised learning
  Net input: y_in_j = Σ w_ij x_i + b_j
  Activation function: y_j = 1 / (1 + e^(-y_in_j))
  Errors:
    output layer:  Err_j = O_j (1 - O_j)(T_j - O_j)
    hidden layers: Err_j = O_j (1 - O_j) Σ_k Err_k w_jk
  Weight update: w_ij(new) = w_ij(old) + α Err_j O_i
  Bias update: b_j(new) = b_j(old) + α Err_j
  Stopping condition: iterate until the error reaches zero (Err = 0), in practice until it falls below a tolerance
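One backpropagation step on a tiny 2-1-1 network, using exactly the error and update formulas above. The weights, inputs, target, and α = 0.5 are illustrative assumptions:

```python
# One backprop step: logistic activation y = 1/(1 + exp(-y_in)),
# output error Err = O(1-O)(T-O), hidden error Err = O(1-O) * Err_out * w.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

alpha = 0.5
x = [1.0, 0.0]
t = 1.0                                   # target T

w_h = [0.2, -0.3]; b_h = 0.1              # input -> hidden (one hidden unit)
w_o = 0.4;         b_o = -0.2             # hidden -> output

# Forward pass
o_h = sigmoid(b_h + x[0] * w_h[0] + x[1] * w_h[1])
o_o = sigmoid(b_o + o_h * w_o)

# Errors, exactly as in the table
err_o = o_o * (1 - o_o) * (t - o_o)       # output layer
err_h = o_h * (1 - o_h) * err_o * w_o     # hidden layer (single downstream unit)

# Updates: w_ij += alpha * Err_j * O_i, b_j += alpha * Err_j
w_o += alpha * err_o * o_h
b_o += alpha * err_o
w_h = [wi + alpha * err_h * xi for wi, xi in zip(w_h, x)]
b_h += alpha * err_h
```

After the step, a second forward pass produces an output strictly closer to the target, which is the property the stopping condition relies on.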

Self-Organizing Map
  Learning: unsupervised
  Architecture: feed-forward
  Competition: D_j = Σ (w_ij - x_i)^2 ; choose the minimum D_j and set j accordingly
  Weight update (winner): w_ij(new) = w_ij(old) + α [x_i - w_ij(old)]
  Learning-rate decay: α(new) = 0.5 α(old)
  Stopping condition: stop when the convergence criterion is met, or when cluster 1 and cluster 2 are the inverse of each other
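One competition-and-update step of the map can be sketched as follows. The input vector, the two cluster weight vectors, and α = 0.6 are illustrative assumptions:

```python
# SOM step: winner j minimizes D_j = sum_i (w_ij - x_i)^2, then
# w_ij(new) = w_ij(old) + alpha * (x_i - w_ij(old)); alpha is halved afterward.
alpha = 0.6
x = [0.5, 0.2]
W = [[0.1, 0.9],
     [0.8, 0.3]]                          # one weight vector per cluster unit

# Competition: squared distance of each unit's weights to the input
D = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in W]
j = D.index(min(D))                       # winning unit (minimum D_j)

# Update the winner's weights toward the input
W[j] = [wi + alpha * (xi - wi) for wi, xi in zip(W[j], x)]

alpha = 0.5 * alpha                       # alpha(new) = 0.5 * alpha(old)

print(j, W[j], alpha)  # → 1 [0.62, 0.24] 0.3
```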