Soft Competitive Learning without Fixed Network Dimensionality
Jacob Chakareski and Sergey Makarov
Rice University, Worcester Polytechnic Institute
Algorithms
Neural Gas
Competitive Hebbian Learning
Neural Gas + Competitive Hebbian Learning
Growing Neural Gas
Neural Gas
Sorts the network units based on their distance from the input signal
Adapts a certain number of units, based on this “rank order”
The number of adapted units and the adaptation strength are decreased according to a fixed schedule
The algorithm
Initialize a set A with N units c_i:
A = {c_1, c_2, ..., c_N}
Sort the network units by increasing distance ||ξ − w_j|| from the input signal ξ
Adapt the network units:
Δw_i = ε(t) · h_λ(k_i(ξ, A)) · (ξ − w_i)
h_λ(k) = exp(−k / λ(t))
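One Neural Gas adaptation step can be sketched in Python (a minimal illustration, not the presentation's own code; the function name and the argument names `eps` and `lam` are assumptions):

```python
import math

def neural_gas_step(units, x, eps, lam):
    """One Neural Gas adaptation step (illustrative sketch).

    units: list of weight vectors, x: input signal,
    eps: learning rate eps(t), lam: neighborhood range lambda(t).
    """
    # Rank units by squared distance to the input signal ("rank order").
    ranked = sorted(range(len(units)),
                    key=lambda i: sum((units[i][d] - x[d]) ** 2
                                      for d in range(len(x))))
    # Adapt every unit, weighted by h_lambda(k) = exp(-k / lambda).
    for k, i in enumerate(ranked):
        h = math.exp(-k / lam)
        units[i] = [w + eps * h * (xd - w) for w, xd in zip(units[i], x)]
    return units
```

Units closer to the input (smaller rank k) receive a stronger update; in the full algorithm both `eps` and `lam` decay over time according to the fixed schedule.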
Simulation Results
Competitive Hebbian Learning
Usually not used on its own, but in conjunction with other methods
It does not change the reference vectors w_j at all; it only generates a set of neighborhood edges between the units of the network
The algorithm
Initialize a set A with N units c_i and the connection set C:
A = {c_1, c_2, ..., c_N},  C ⊂ A × A,  C = ∅
Determine the units s1 and s2 nearest and second-nearest to the input signal ξ:
s1 = arg min_{c ∈ A} ||ξ − w_c||
s2 = arg min_{c ∈ A \ {s1}} ||ξ − w_c||
Create a connection between s1 and s2:
C = C ∪ {(s1, s2)}
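A minimal Python sketch of one CHL step, assuming the units are stored as a dict of weight vectors and the connection set as a set of frozensets (the representation and names are illustrative):

```python
def chl_step(weights, x, edges):
    """One Competitive Hebbian Learning step (illustrative sketch).

    weights: dict mapping unit id -> weight vector, x: input signal,
    edges: set of frozenset({a, b}) connections.
    CHL never moves the reference vectors; it only connects the two
    units closest to the input signal.
    """
    dist = lambda c: sum((wc - xd) ** 2 for wc, xd in zip(weights[c], x))
    ranked = sorted(weights, key=dist)
    s1, s2 = ranked[0], ranked[1]    # winner and second-nearest unit
    edges.add(frozenset((s1, s2)))   # C <- C U {(s1, s2)}
    return s1, s2
```

Using frozensets makes the edge (s1, s2) identical to (s2, s1), which matches the undirected connections the algorithm builds.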
Simulation Results
Neural Gas + CHL
A superposition of NG and CHL
Sometimes denoted as "topology-representing networks"
A local edge-aging mechanism is implemented to remove edges that are no longer valid
The algorithm
Set the age of the connection between s1 and s2 to zero (“refresh” the edge)
Increment the age of all edges emanating from s1
Remove edges with an age larger than the current age T(t)
age(s1, s2) = 0
age(s1, i) = age(s1, i) + 1,  ∀i ∈ N(s1)
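The edge-aging bookkeeping above can be sketched as follows (an illustrative Python sketch; the dict-based age storage and the name `max_age` for the threshold T(t) are assumptions):

```python
def age_edges(edges, ages, s1, s2, max_age):
    """Edge-aging step of NG + CHL (illustrative sketch).

    edges: set of frozenset({a, b}), ages: dict edge -> int,
    s1, s2: winner and second-nearest unit for the current input,
    max_age: current age threshold T(t).
    """
    e = frozenset((s1, s2))
    edges.add(e)
    ages[e] = 0  # "refresh" the winner/runner-up edge
    # Increment the age of all other edges emanating from s1.
    for edge in edges:
        if s1 in edge and edge != e:
            ages[edge] = ages.get(edge, 0) + 1
    # Remove edges with an age larger than the threshold.
    for edge in [edge for edge in edges if ages.get(edge, 0) > max_age]:
        edges.discard(edge)
        ages.pop(edge, None)
    return edges, ages
```

Edges that keep getting refreshed survive; edges whose endpoints are no longer co-winners age out and disappear, which is how stale topology is pruned.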
Simulation Results
Growing Neural Gas
Number of units changes (mostly increases) during the self-organization process
Starting with very few units, new units are added successively
Local error measures are gathered to determine where to insert new units
Each new unit is inserted near the unit with the largest accumulated error
The algorithm
Add the squared distance between the input signal and the winner s1 to a local error variable:
ΔErr_{s1} = ||ξ − w_{s1}||²
Adapt the winner and its neighbors:
Δw_{s1} = ε_b · (ξ − w_{s1})
Δw_i = ε_n · (ξ − w_i),  ∀i ∈ N(s1)
If the number of input signals generated so far is an integer multiple of a parameter λ, insert a new unit:
Determine the unit q with the maximum accumulated error:
q = arg max_{c ∈ A} Err_c
Determine the neighbor f of q with the maximum error:
f = arg max_{c ∈ N(q)} Err_c
Add a new unit r to the network, halfway between q and f:
A = A ∪ {r},  w_r = (w_q + w_f) / 2
Insert edges connecting r with q and f, and remove the original edge between q and f:
C = C ∪ {(r, q), (r, f)},  C = C \ {(q, f)}
Decrease the error variables of q and f:
Err_q ← α · Err_q,  Err_f ← α · Err_f
Interpolate the error variable of r from q and f:
Err_r = (Err_q + Err_f) / 2
Decrease the error variables of all units:
Err_c ← β · Err_c,  ∀c ∈ A
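The unit-insertion step can be sketched in Python (an illustrative sketch, not the presentation's code; the dict-based data structures, the new-unit id scheme, and the default decay factor `alpha = 0.5` are assumptions, and the global error decay of all units is omitted):

```python
def gng_insert(weights, errors, edges, alpha=0.5):
    """GNG unit-insertion step (illustrative sketch).

    weights: dict unit id -> weight vector, errors: dict unit id -> Err,
    edges: set of frozenset({a, b}), alpha: error decay for q and f.
    """
    # q: unit with the largest accumulated error.
    q = max(errors, key=errors.get)
    # f: neighbor of q with the largest error.
    nbrs = [c for e in edges if q in e for c in e if c != q]
    f = max(nbrs, key=errors.get)
    # Insert the new unit r halfway between q and f.
    r = max(weights) + 1
    weights[r] = [(wq + wf) / 2 for wq, wf in zip(weights[q], weights[f])]
    # Edges (r, q) and (r, f) replace the original edge (q, f).
    edges.discard(frozenset((q, f)))
    edges.update({frozenset((r, q)), frozenset((r, f))})
    # Decrease the errors of q and f; interpolate the error of r.
    errors[q] *= alpha
    errors[f] *= alpha
    errors[r] = (errors[q] + errors[f]) / 2
    return r
```

Inserting near the unit with the largest accumulated error places new capacity exactly where the network currently quantizes the input worst.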
Simulation Results
Applications: Web/Database Maps