Neural Networks, Chapter 9
Joost N. Kok
Universiteit Leiden
Unsupervised Competitive Learning
• Competitive learning
• Winner-take-all units
• Cluster/Categorize input data
• Feature mapping
(Figure: competitive network architecture; an n-dimensional input layer feeds the output units, and one output unit becomes the winner.)
Simple Competitive Learning
• Winner: the unit $i^*$ with the largest net input $h_i = \sum_j w_{ij}\,\xi_j$, i.e. $\mathbf{w}_{i^*} \cdot \boldsymbol{\xi} \ge \mathbf{w}_i \cdot \boldsymbol{\xi}$ for all $i$
• The winner can be selected through lateral inhibition
Simple Competitive Learning
• Update weights only for the winning neuron $i^*$:
$\Delta w_{i^*j} = \eta\,(\xi_j - w_{i^*j})$
• This moves the weight vector $\mathbf{w}_{i^*}$ towards the input $\boldsymbol{\xi}$
Simple Competitive Learning
• Update rule for all neurons:
$\Delta w_{ij} = \eta\,O_i\,(\xi_j - w_{ij})$
with $O_{i^*} = 1$ and $O_i = 0$ if $i \ne i^*$
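The winner-take-all update above can be sketched in NumPy. This is an illustrative sketch, not code from the slides; the function name, the distance-based winner selection, and the learning rate are assumptions.

```python
import numpy as np

def competitive_step(W, xi, eta=0.1):
    """One step of simple competitive learning (illustrative sketch).
    Only the winner, the unit whose weight vector is closest to the
    input xi, is moved towards xi; all other units stay put."""
    winner = int(np.argmin(np.linalg.norm(W - xi, axis=1)))
    W[winner] += eta * (xi - W[winner])  # Delta w = eta * (xi - w)
    return winner

rng = np.random.default_rng(0)
W = rng.random((3, 2))          # 3 output units, 2-dimensional inputs
xi = np.array([0.5, 0.5])
before = np.linalg.norm(W - xi, axis=1).min()
winner = competitive_step(W, xi)
after = np.linalg.norm(W[winner] - xi)
```

After the step, the winner's distance to the input has shrunk by the factor $(1 - \eta)$.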
Graph Bipartitioning
• Patterns: edges = dipole stimuli
• Two output units
Simple Competitive Learning
• Dead Unit Problem: possible solutions
– Initialize weights to samples from the input
– Leaky learning: also update the weights of the losers (but with a smaller learning rate)
– Arrange neurons in a geometrical way: update also neighbors
– Turn on input patterns gradually
– Conscience mechanism
– Add noise to input patterns
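Of these remedies, leaky learning is the easiest to write down: the losers learn too, at a much smaller rate, so no unit can stay dead forever. A hypothetical NumPy sketch (names and rates are illustrative):

```python
import numpy as np

def leaky_competitive_step(W, xi, eta_win=0.1, eta_lose=0.01):
    """Leaky learning sketch: the winner learns with rate eta_win,
    while every loser also drifts towards xi with the much smaller
    rate eta_lose, preventing dead units."""
    winner = int(np.argmin(np.linalg.norm(W - xi, axis=1)))
    rates = np.full(len(W), eta_lose)   # losers: small rate
    rates[winner] = eta_win             # winner: full rate
    W += rates[:, None] * (xi - W)
    return winner

W = np.array([[0.0, 0.0], [1.0, 1.0]])
xi = np.array([0.2, 0.0])
w = leaky_competitive_step(W, xi)       # unit 0 is closest and wins
```

Even the losing unit 1 has moved slightly towards the input.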
Vector Quantization
• Classes are represented by prototype vectors
• Voronoi tessellation
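Assigning each input to its nearest prototype is what induces the Voronoi tessellation: each prototype represents the cell of input space closest to it. A minimal sketch (names illustrative):

```python
import numpy as np

def quantize(prototypes, x):
    """Map x to the index of its nearest prototype, i.e. to the
    Voronoi cell that the prototype represents."""
    return int(np.argmin(np.linalg.norm(prototypes - x, axis=1)))

protos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
cell = quantize(protos, np.array([0.9, 0.1]))   # closest to [1, 0]
```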
Learning Vector Quantization
• Labelled sample data
• Update rule depends on current classification
$\Delta w_{i^*j} = +\eta\,(\xi_j - w_{i^*j})$ if the winner's class is correct
$\Delta w_{i^*j} = -\eta\,(\xi_j - w_{i^*j})$ if the winner's class is incorrect
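A minimal LVQ1-style sketch of this rule, assuming labelled prototype vectors and NumPy (function and variable names are illustrative):

```python
import numpy as np

def lvq1_step(W, labels, xi, xi_label, eta=0.05):
    """One LVQ update: attract the winning prototype towards xi if
    its class label matches the sample's label, repel it otherwise."""
    winner = int(np.argmin(np.linalg.norm(W - xi, axis=1)))
    sign = 1.0 if labels[winner] == xi_label else -1.0
    W[winner] += sign * eta * (xi - W[winner])
    return winner

W = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = ["A", "B"]
# correctly classified sample: prototype 0 moves towards xi
w1 = lvq1_step(W, labels, np.array([0.2, 0.0]), "A")
# misclassified sample: the same prototype is pushed away
w2 = lvq1_step(W, labels, np.array([0.2, 0.0]), "B")
```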
Adaptive Resonance Theory
• Stability-Plasticity Dilemma
• Supply of neurons, only use them if needed
• Notion of “sufficiently similar”
Adaptive Resonance Theory
• Start with all weights = 1
• Enable all output units
• Find the winner among the enabled units: the unit $i^*$ that maximizes $\frac{\mathbf{w}_i \cdot \boldsymbol{\xi}}{\varepsilon + \sum_j w_{ij}}$
• Test the match against the vigilance parameter $\rho$: resonance if $\frac{\mathbf{w}_{i^*} \cdot \boldsymbol{\xi}}{\sum_j \xi_j} \ge \rho$; otherwise disable $i^*$ and repeat with the remaining units
• Update the weights of the resonating unit: $\mathbf{w}_{i^*} := \mathbf{w}_{i^*} \wedge \boldsymbol{\xi}$
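This recipe corresponds to the classic ART1 procedure for binary patterns. A hypothetical NumPy sketch (function name, vigilance value, and the tie-breaking by `argmax` are assumptions):

```python
import numpy as np

def art1_step(W, xi, rho=0.7, eps=1e-6):
    """One ART1 presentation (illustrative sketch): find the best
    matching enabled unit, test it against the vigilance threshold
    rho, and either resonate (update its weights) or disable the
    unit and try the next one."""
    enabled = list(range(len(W)))
    while enabled:
        # winner: largest normalised overlap with the input
        scores = [W[i] @ xi / (eps + W[i].sum()) for i in enabled]
        i_star = enabled[int(np.argmax(scores))]
        if W[i_star] @ xi / xi.sum() >= rho:       # match (vigilance) test
            W[i_star] = np.minimum(W[i_star], xi)  # w := w AND xi
            return i_star
        enabled.remove(i_star)                     # disable, try next unit
    return None                                    # no unit is similar enough

W = np.ones((2, 4))                                # all weights start at 1
i1 = art1_step(W, np.array([1.0, 0.0, 1.0, 0.0]))
i2 = art1_step(W, np.array([0.0, 1.0, 0.0, 1.0]))
```

The two dissimilar patterns end up recruiting different units, illustrating how a fresh unit is used only when needed.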
Feature Mapping
• Geometrical arrangement of output units
• Nearby outputs correspond to nearby input patterns
• Feature Map
• Topology preserving map
Self Organizing Map
• Determine the winner (the neuron of which the weight vector has the smallest distance to the input vector)
• Move the weight vector w of the winning neuron towards the input i
(Figure: before learning, the weight vector w points away from the input i; after learning, w has moved towards i.)
Self Organizing Map
• Impose a topological order onto the competitive neurons (e.g., rectangular map)
• Let neighbors of the winner share the “prize” (The “postcode lottery” principle)
• After learning, neurons with similar weights tend to cluster on the map
(Figures: Self Organizing Map examples.)
Self Organizing Map
• Input: uniformly randomly distributed points
• Output: map of 20² neurons
• Training: starting with a large learning rate and neighborhood size, both are gradually decreased to facilitate convergence
(Figures: further Self Organizing Map examples.)
Feature Mapping
• Retinotopic Map
• Somatosensory Map
• Tonotopic Map
(Figures: examples of retinotopic, somatosensory, and tonotopic feature maps.)
Kohonen’s Algorithm
$\Delta w_{ij} = \eta\,\Lambda(i, i^*)\,(\xi_j - w_{ij})$
$\Lambda(i, i^*) = \exp\!\left(-\,|\mathbf{r}_i - \mathbf{r}_{i^*}|^2 / 2\sigma^2\right)$
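A compact NumPy sketch of Kohonen's algorithm on a rectangular grid. The exponential decay schedules for the learning rate and the neighborhood width are an assumption (the slides only say both should gradually decrease), and all names and constants are illustrative:

```python
import numpy as np

def som_train(data, grid=(10, 10), steps=2000, eta0=0.5, sigma0=3.0, seed=0):
    """Train a small rectangular SOM with the Kohonen rule
    Delta w_ij = eta * Lambda(i, i*) * (xi_j - w_ij),
    Lambda(i, i*) = exp(-|r_i - r_i*|^2 / 2 sigma^2)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    W = rng.random((rows * cols, data.shape[1]))
    # fixed grid positions r_i of the neurons
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(steps):
        xi = data[rng.integers(len(data))]
        frac = t / steps
        eta = eta0 * 0.02 ** frac        # decaying learning rate
        sigma = sigma0 * 0.2 ** frac     # shrinking neighborhood radius
        i_star = int(np.argmin(np.linalg.norm(W - xi, axis=1)))
        d2 = np.sum((coords - coords[i_star]) ** 2, axis=1)  # |r_i - r_i*|^2
        lam = np.exp(-d2 / (2 * sigma ** 2))                 # Lambda(i, i*)
        W += eta * lam[:, None] * (xi - W)                   # Kohonen update
    return W

rng = np.random.default_rng(1)
data = rng.random((500, 2))      # uniformly distributed 2-D points
W = som_train(data)
```

Since every update is a convex move towards a point in the unit square, the trained weights stay inside the square, spreading out to cover the input distribution.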
Travelling Salesman Problem
• Units form a ring; the second (elastic) term keeps neighbors on the tour close together
$\Delta \mathbf{w}_i = \eta\,\Phi(\boldsymbol{\xi}, \mathbf{w}_i)\,(\boldsymbol{\xi} - \mathbf{w}_i) + \kappa\,(\mathbf{w}_{i+1} + \mathbf{w}_{i-1} - 2\mathbf{w}_i)$
$\Phi(\boldsymbol{\xi}, \mathbf{w}_i) = \dfrac{\exp(-\,|\boldsymbol{\xi} - \mathbf{w}_i|^2 / 2\sigma^2)}{\sum_j \exp(-\,|\boldsymbol{\xi} - \mathbf{w}_j|^2 / 2\sigma^2)}$
Hybrid Learning Schemes
(Diagram: an unsupervised first layer combined with a supervised second layer.)
Counterpropagation
• First layer uses standard competitive learning
• Second (output) layer is trained with the delta rule
$\Delta w_{ij} = \eta\,(\zeta_i - O_i)\,V_j$
$\Delta w_{ij} = \eta\,(\zeta_i - w_{ij})\,V_j$
($\zeta$: target output, $V$: activity of the first layer)
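The output-layer delta rule can be sketched as follows; the matrix name `A` for the output weights, the winner-take-all vector `V`, and all values are illustrative assumptions:

```python
import numpy as np

def counterprop_output_step(A, V, target, eta=0.1):
    """Delta-rule update of the second (output) layer:
    Delta a_ij = eta * (zeta_i - O_i) * V_j, where V is the
    winner-take-all activity of the competitive first layer
    and zeta (target) is the desired output."""
    O = A @ V                             # current output
    A += eta * np.outer(target - O, V)    # delta rule
    return O

A = np.zeros((2, 3))                 # 2 outputs, 3 competitive units
V = np.array([0.0, 1.0, 0.0])        # unit 1 won the competition
target = np.array([1.0, 2.0])
O = counterprop_output_step(A, V, target)
```

Only the column belonging to the winning competitive unit is changed, since $V_j = 0$ for all losers.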
Radial Basis Functions
• First layer with normalized Gaussian activation functions
$g_j(\boldsymbol{\xi}) = \dfrac{\exp(-\,|\boldsymbol{\xi} - \mathbf{w}_j|^2 / 2\sigma_j^2)}{\sum_k \exp(-\,|\boldsymbol{\xi} - \mathbf{w}_k|^2 / 2\sigma_k^2)}$
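The normalized Gaussian layer can be sketched directly (names illustrative; a shared width $\sigma$ is assumed for simplicity):

```python
import numpy as np

def rbf_activations(centers, xi, sigma=1.0):
    """Normalised Gaussian activations of the first RBF layer:
    g_j = exp(-|xi - w_j|^2 / 2 sigma^2), divided by the sum over
    all units so that the activations sum to 1."""
    d2 = np.sum((centers - xi) ** 2, axis=1)
    e = np.exp(-d2 / (2 * sigma ** 2))
    return e / e.sum()

centers = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
g = rbf_activations(centers, np.array([0.9, 0.1]))
```

The nearest center receives the largest activation, and the normalization makes the layer a soft version of the winner-take-all competition used earlier in the chapter.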