Self-organising incremental neural network
DESCRIPTION
A brief description of SOINN: what SOINN is, why SOINN, the detailed algorithm, and SOINN for machine learning and associative memory.

TRANSCRIPT
Self-organizing incremental neural network and its
application
F. Shen¹, O. Hasegawa²
¹ National Key Laboratory for Novel Software Technology, Nanjing University
² Imaging Science and Engineering Lab, Tokyo Institute of Technology
June 12, 2009
F. Shen, O. Hasegawa Self-organizing incremental neural network and its application
Contents of this tutorial
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
What is SOINN
SOINN: Self-organizing incremental neural network
- Represents the topological structure of the input data
- Realizes online incremental learning
Why SOINN
Background: Networks for topology representation
- SOM (Self-Organizing Map): the structure and size of the network must be predefined
- NG (Neural Gas): the network size must be predefined
- GNG (Growing Neural Gas): the network size must be predefined, and a constant learning rate leads to non-stationary results
Background: Networks for incremental learning
Incremental learning: learning new knowledge without destroying previously learned knowledge (the stability-plasticity dilemma).
- ART (Adaptive Resonance Theory): needs a user-defined threshold
- Multilayer perceptrons: learning new knowledge destroys old knowledge
- Sub-network methods: need a large amount of storage
Characteristics of SOINN
- Neurons self-organize, with no predefined network structure or size
- Adaptively finds a suitable number of neurons for the network
- Realizes online incremental learning without any prior conditions
- Finds typical prototypes for large-scale data sets
- Robust to noise
Detail algorithm of SOINN
- Architecture of SOINN
- Training process of SOINN
- Similarity threshold for judging input data
- Learning rate
- Simple version of SOINN
- Simulation results
Structure: Two-layer competitive network
- First layer: competes for the input data
- Second layer: competes for the output of the first layer
- The output is the topology structure and the weight vectors of the second layer
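The two-layer pipeline can be pictured as a pair of identical competitive layers, where the second layer is trained on the prototype (weight) vectors produced by the first. A minimal illustrative data structure (the class and method names here are assumptions for the sketch, not from the slides):

```python
import numpy as np

class SoinnLayer:
    """One competitive layer: a set of nodes (weight vectors) plus undirected edges."""
    def __init__(self):
        self.weights = []   # list of np.ndarray weight vectors (the nodes)
        self.edges = set()  # undirected edges as frozensets of node indices

    def nearest_two(self, x):
        """Return the indices of the winner and second winner for input x."""
        d = [np.linalg.norm(x - w) for w in self.weights]
        order = np.argsort(d)
        return int(order[0]), int(order[1])

layer = SoinnLayer()
layer.weights = [np.array([0.0, 0.0]), np.array([1.0, 1.0]), np.array([5.0, 5.0])]
winner, second = layer.nearest_two(np.array([0.9, 1.2]))
print(winner, second)  # -> 1 0
```

In this picture, the second layer is simply another `SoinnLayer` whose input data is the first layer's `weights` list.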
Training flowchart of SOINN
- Adaptively updated threshold
- Between-class insertion
- Update node weights
- Within-class insertion
- Remove noise nodes
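The flowchart steps can be sketched as one training loop. This is a heavily simplified, single-layer illustration, not the authors' exact two-layer algorithm: the class name, the period parameter `lam`, the win-count learning-rate schedule, and the reduced insertion/removal rules are all assumptions made for the sketch (within-class insertion is omitted, and noise removal only deletes isolated nodes).

```python
import numpy as np

class TinySoinn:
    """A simplified single-layer sketch of the SOINN training flowchart."""

    def __init__(self, lam=50, age_max=30):
        self.W = []          # node weight vectors
        self.wins = []       # per-node win counts (used as a decaying learning rate)
        self.edges = {}      # frozenset({i, j}) -> edge age
        self.n_signals = 0
        self.lam, self.age_max = lam, age_max

    def _threshold(self, i):
        """Adaptive similarity threshold: max neighbor distance, else min distance."""
        nbrs = {j for e in self.edges for j in e if i in e and j != i}
        if nbrs:
            return max(np.linalg.norm(self.W[i] - self.W[j]) for j in nbrs)
        others = [j for j in range(len(self.W)) if j != i]
        return min(np.linalg.norm(self.W[i] - self.W[j]) for j in others) if others else np.inf

    def step(self, x):
        x = np.asarray(x, float)
        if len(self.W) < 2:                            # bootstrap the first two nodes
            self.W.append(x); self.wins.append(1); return
        d = [np.linalg.norm(x - w) for w in self.W]
        s1, s2 = (int(k) for k in np.argsort(d)[:2])   # winner and second winner
        if d[s1] > self._threshold(s1) or d[s2] > self._threshold(s2):
            self.W.append(x); self.wins.append(1)      # between-class insertion
            return
        e = frozenset({s1, s2})
        self.edges[e] = 0                              # connect the two winners
        for k in [k for k in self.edges if s1 in k and k != e]:
            self.edges[k] += 1                         # age the winner's other edges
            if self.edges[k] > self.age_max:
                del self.edges[k]
        self.wins[s1] += 1                             # move the winner toward x
        self.W[s1] = self.W[s1] + (x - self.W[s1]) / self.wins[s1]
        self.n_signals += 1
        if self.n_signals % self.lam == 0:             # periodically remove noise nodes
            linked = {j for e_ in self.edges for j in e_}
            for j in sorted((j for j in range(len(self.W)) if j not in linked), reverse=True):
                self._remove(j)

    def _remove(self, j):
        del self.W[j]; del self.wins[j]
        shift = lambda k: k - 1 if k > j else k
        self.edges = {frozenset(map(shift, e)): a for e, a in self.edges.items()}
```

Fed two well-separated Gaussian blobs, such a loop ends up with every surviving node sitting near one of the two cluster centers.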
First layer: adaptively updating the threshold Ti

Basic idea: within-class distance ≤ T ≤ between-class distance

1 Initialize: Ti = +∞ when node i is a new node.
2 When i is the winner or second winner, update Ti as follows:

If i has neighbors, Ti is updated as the maximum distance between i and all of its neighbors:

    Ti = max_{c ∈ Ni} ||Wi − Wc||    (1)

If i has no neighbors, Ti is updated as the minimum distance between i and all other nodes in network A:

    Ti = min_{c ∈ A\{i}} ||Wi − Wc||    (2)
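Equations (1) and (2) translate directly into code. A small sketch (the function name and the `neighbors` adjacency representation are assumptions):

```python
import numpy as np

def similarity_threshold(i, weights, neighbors):
    """Adaptive threshold T_i for node i, per Eqs. (1)-(2).
    weights: list of np.ndarray; neighbors: dict mapping node -> set of neighbor ids."""
    if neighbors.get(i):
        # Eq. (1): max distance from i to its topological neighbors
        return max(np.linalg.norm(weights[i] - weights[c]) for c in neighbors[i])
    # Eq. (2): min distance from i to every other node in the network
    others = [c for c in range(len(weights)) if c != i]
    return min(np.linalg.norm(weights[i] - weights[c]) for c in others) if others else np.inf

W = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([3.0, 0.0])]
print(similarity_threshold(0, W, {0: {1, 2}}))  # Eq. (1): max(1, 3) -> 3.0
print(similarity_threshold(2, W, {}))           # Eq. (2): min(3, 2) -> 2.0
```

Note how a lone node gets a permissive threshold (the distance to its nearest node, or +∞ for a brand-new network), while a connected node's threshold is the reach of its own neighborhood.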
Second layer: constant threshold Tc

Basic idea 1: within-class distance ≤ T ≤ between-class distance
Basic idea 2: we already have some knowledge of the input data from the results of the first layer.

Within-class distance of a class C (NC: the number of connections in C):

    dw = (1/NC) Σ_{(i,j) ∈ C} ||Wi − Wj||    (3)

Between-class distance of two classes Ci and Cj:

    db(Ci, Cj) = min_{i ∈ Ci, j ∈ Cj} ||Wi − Wj||    (4)
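Equations (3) and (4) can be sketched as two helpers, treating a class as the list of its node indices and NC as the number of connections (node pairs joined by an edge); the function names are illustrative:

```python
import numpy as np

def within_class_distance(weights, edges):
    """Eq. (3): mean length of the connections (edges) inside one class."""
    return sum(np.linalg.norm(weights[i] - weights[j]) for i, j in edges) / len(edges)

def between_class_distance(weights, class_a, class_b):
    """Eq. (4): minimum distance between any node of class_a and any node of class_b."""
    return min(np.linalg.norm(weights[i] - weights[j]) for i in class_a for j in class_b)

W = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([5.0, 0.0]), np.array([6.0, 0.0])]
print(within_class_distance(W, [(0, 1)]))         # one edge of length 1 -> 1.0
print(between_class_distance(W, [0, 1], [2, 3]))  # closest pair is nodes 1 and 2 -> 4.0
```

With these two quantities computed from the first layer's output, a constant Tc can be placed between the typical within-class distance and the smallest between-class distance.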
![Page 48: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/48.jpg)
Second layer: constant threshold Tc (continued)

1 Set Tc as the minimum between-class distance:

Tc = d_b(C_{i_1}, C_{j_1}) = \min_{k,l = 1,\dots,Q,\; k \neq l} d_b(C_k, C_l)   (5)

2 If Tc is less than the within-class distance d_w, set Tc as the next minimum between-class distance:

Tc = d_b(C_{i_2}, C_{j_2}) = \min_{k,l = 1,\dots,Q,\; k \neq l,\; k \neq i_1,\; l \neq j_1} d_b(C_k, C_l)   (6)

3 Go to step 2 to update Tc until Tc is greater than d_w.
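The threshold-selection loop above can be sketched as follows; `pairwise_db` and `dw` are hypothetical inputs holding the between-class distances of eq. (4) and the within-class distance of eq. (3):

```python
def constant_threshold(pairwise_db, dw):
    """Pick Tc as the smallest between-class distance that exceeds dw.

    pairwise_db: dict {(k, l): d_b} of between-class distances, k != l.
    dw: within-class distance computed from the first-layer results.
    Returns None if every between-class distance is <= dw.
    """
    for tc in sorted(pairwise_db.values()):
        if tc > dw:        # skip candidates that fall below dw
            return tc      # first survivor is the minimum valid Tc
    return None
```

Sorting the candidate distances makes the "take the next minimum" iteration a single linear scan.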
![Page 53: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/53.jpg)
Updating the learning rates ε1(t) and ε2(t)

Update of the weight vectors:

\Delta W_{s_1} = \epsilon_1(t)(\xi - W_{s_1})   (8)

\Delta W_i = \epsilon_2(t)(\xi - W_i) \quad (\forall i \in N_{s_1})   (9)

After the size of the network becomes stable, fine-tune the network.

Stochastic approximation: a number of adaptation steps with a strength ε(t) that decays slowly, but not too slowly, i.e., \sum_{t=1}^{\infty} \epsilon(t) = \infty and \sum_{t=1}^{\infty} \epsilon(t)^2 < \infty.

The harmonic series satisfies these conditions:

\epsilon_1(t) = \frac{1}{t}, \quad \epsilon_2(t) = \frac{1}{100\,t}   (10)
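A minimal sketch of the weight updates in equations (8)-(10); the names `winner` and `neighbors`, and the use of a single counter `t`, are illustrative assumptions (in SOINN, t is typically the accumulated win count of node s1):

```python
import numpy as np

def update_weights(W, winner, neighbors, xi, t):
    """Move the winner and its topological neighbors toward input xi."""
    eps1 = 1.0 / t           # winner rate, eq. (10)
    eps2 = 1.0 / (100 * t)   # neighbor rate, eq. (10)
    W[winner] += eps1 * (xi - W[winner])   # eq. (8)
    for i in neighbors:                    # i ranges over N_s1
        W[i] += eps2 * (xi - W[i])         # eq. (9)
    return W
```

Because eps1 = 1 at t = 1, a freshly inserted winner jumps straight onto the input; later wins move it less and less, which is the stochastic-approximation behaviour described above.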
![Page 57: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/57.jpg)
Single-layer SOINN

For topology representation, the first layer is enough.

Within-class insertion happens only slightly in the first layer.

Use subclass and density information to judge whether a connection is needed.
![Page 63: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/63.jpg)
Artificial data set: topology representation

Stationary and non-stationary environments:

Stationary: all training data obey the same distribution.

Non-stationary: the next training sample may obey a distribution different from that of the previous one.

[Figures: original data; stationary result; non-stationary result]
![Page 70: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/70.jpg)
Artificial data set: topology representation (continued)

[Figures: original data; two-layer SOINN; single-layer SOINN]

Conclusion of the experiments: SOINN is able to

Represent the topology structure of the input data.

Realize incremental learning.

Automatically learn the number of nodes, remove noise, etc.
![Page 78: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/78.jpg)
Subsections: Unsupervised learning; Supervised learning; Semi-supervised learning; Active learning
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
![Page 79: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/79.jpg)
Some objectives of unsupervised learning

Automatically learn the number of classes in the input data.

Clustering with no a priori knowledge.

Topology representation.

Realize real-time incremental learning.

Separate classes whose overlapping areas have low density.
![Page 85: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/85.jpg)
SOINN for unsupervised learning: if two nodes are connected by a path, they belong to the same class
1 Run SOINN on the input data and output the topology representation of the nodes.
2 Initialize all nodes as unclassified.
3 Randomly choose one unclassified node i from node set A. Mark node i as classified and label it as class Ci.
4 Search A to find all unclassified nodes that are connected to node i by a "path." Mark these nodes as classified and label them with the same class as node i.
5 Go to Step 3 and continue the classification process until all nodes are classified.
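The classification steps above amount to connected-component labeling on the SOINN node graph. A minimal sketch in Python (the adjacency structure and names are illustrative, not from the paper):

```python
# A minimal sketch of Steps 2-5 above: labeling SOINN nodes by
# graph connectivity. Node set A is modeled as a dict mapping each
# node id to its set of neighbors (the SOINN edges).
def label_by_connectivity(adjacency):
    """Assign one class id per connected component of the node graph."""
    labels = {}           # node id -> class id (Step 2: all unclassified)
    next_class = 0
    for seed in adjacency:
        if seed in labels:
            continue      # already classified
        # Step 3: start a new class from an unclassified node
        labels[seed] = next_class
        stack = [seed]
        # Step 4: flood-fill every node reachable along a path
        while stack:
            node = stack.pop()
            for neighbor in adjacency[node]:
                if neighbor not in labels:
                    labels[neighbor] = next_class
                    stack.append(neighbor)
        next_class += 1   # Step 5: repeat until all nodes are classified
    return labels

# Two components: {0, 1, 2} and {3, 4} -> two classes
graph = {0: {1}, 1: {0, 2}, 2: {1}, 3: {4}, 4: {3}}
print(label_by_connectivity(graph))
```

The number of classes reported by SOINN is then simply the number of connected components found this way.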
![Page 91: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/91.jpg)
Artificial data set: 5 classes with 10% noise
(Figure: original data and clustering result)
Conclusion of experiments
Automatically reports the number of classes.
Perfectly clusters data of different shapes and distributions.
Finds typical prototypes; incremental learning; de-noising; etc.
![Page 97: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/97.jpg)
Face recognition: AT&T face data set
Experiment results
Automatically reports that there are 10 classes.
Prototypes of every class are reported.
With such prototypes, the recognition ratio (1-NN rule) is 90%.
![Page 102: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/102.jpg)
Prototype-based classifier: based on the 1-NN or k-NN rule
Nearest Neighbor Classifier (NNC): all training data as prototypes
Nearest Mean Classifier (NMC): the mean of each class as its prototype
k-means classifier (KMC), Learning Vector Quantization (LVQ), and others: predefine the number of prototypes for every class
Main difficulties
1 How to find enough prototypes without overfitting
2 How to realize incremental learning
Incremental learning of new data within one class (non-stationary distributions or concept drift); incremental learning of new classes.
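All of these classifiers share the same decision step: assign a query to the label of its nearest prototype. A minimal 1-NN sketch (the prototypes and labels are illustrative):

```python
# Minimal 1-NN prototype classification, the decision rule shared by
# NNC, NMC, KMC, and LVQ; only the choice of prototypes differs.
import math

def nearest_prototype_label(x, prototypes, labels):
    """Return the label of the prototype closest to x (1-NN rule)."""
    best, best_dist = None, math.inf
    for p, lab in zip(prototypes, labels):
        d = math.dist(x, p)            # Euclidean distance
        if d < best_dist:
            best, best_dist = lab, d
    return best

prototypes = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
labels = ["A", "A", "B"]
print(nearest_prototype_label((0.4, 0.2), prototypes, labels))  # "A"
print(nearest_prototype_label((4.0, 4.5), prototypes, labels))  # "B"
```

NNC uses all training samples as `prototypes` (no compression, risk of overfitting); NMC uses one mean per class (heavy compression, poor on non-convex classes); SOINN-based methods aim to learn the prototype set in between, incrementally.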
![Page 112: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/112.jpg)
SOINN for supervised learning: targets
Automatically learn the number of prototypes needed to represent every class
Retain only the prototypes used to determine the decision boundary
Realize both types of incremental learning
Robust to noise
![Page 117: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/117.jpg)
Adjusted SOINN Classifier (ASC)
SOINN learns k for k-means.
Noise-reduction removes noisy prototypes.
Center-cleaning removes prototypes that are not useful for the decision boundary.
![Page 121: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/121.jpg)
ASC: noise-reduction & center-cleaning
Noise-reduction
If the label of a node differs from the majority-vote label of its k neighbors, it is considered an outlier.
Center-cleaning
If a prototype of class i has never been the nearest prototype for other classes, remove the prototype.
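The noise-reduction rule above can be sketched as a k-NN majority vote among the prototypes themselves: a prototype whose own label loses the vote of its k nearest fellow prototypes is dropped as an outlier. The value of k and the sample data below are illustrative assumptions, not from the paper:

```python
# Sketch of ASC noise-reduction: remove prototypes whose label
# disagrees with the majority label of their k nearest neighbors.
import math
from collections import Counter

def noise_reduction(prototypes, labels, k=3):
    keep_p, keep_l = [], []
    for i, (p, lab) in enumerate(zip(prototypes, labels)):
        # indices of the k nearest other prototypes
        neighbors = sorted(
            (j for j in range(len(prototypes)) if j != i),
            key=lambda j: math.dist(p, prototypes[j]),
        )[:k]
        majority = Counter(labels[j] for j in neighbors).most_common(1)[0][0]
        if lab == majority:             # label agrees with neighborhood: keep
            keep_p.append(p)
            keep_l.append(lab)
    return keep_p, keep_l

prototypes = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5), (0.5, 0.5)]
labels = ["A", "A", "A", "B", "B", "B", "B"]  # last prototype is noise inside class A
kept, kept_labels = noise_reduction(prototypes, labels, k=3)
print(len(kept))  # 6: the noisy (0.5, 0.5) prototype is removed
```

Center-cleaning would then make a second pass, discarding the surviving prototypes that never become the nearest prototype for another class, since those lie near class centers rather than near the decision boundary.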
![Page 126: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/126.jpg)
ContentsWhat is SOINN
Why SOINNDetail algorithm of SOINN
SOINN for machine learningSOINN for associative memory
References
Unsupervised learningSupervised learningSemi-supervised learningActive learning
Experiment results: artificial data (I)
[Figure: original data, SOINN results, and ASC results]
Test results of ASC
No. of prototypes = 6; recognition ratio = 100%.
![Page 127: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/127.jpg)
Experiment results: artificial data (II)
[Figure: original data, SOINN results, and ASC results]
Test results of ASC
No. of prototypes = 86; recognition ratio = 98%.
![Page 132: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/132.jpg)
Experiment results: artificial data (III)
[Figure: original data, SOINN results, and ASC results]
Test results of ASC
No. of prototypes = 87; recognition ratio = 97.8%.
![Page 137: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/137.jpg)
Experiment results: optdigits
ASC with different parameter sets (a_d, λ); mean ± standard deviation over 10 training runs:

| Parameter set (a_d, λ) | (50, 50) | (25, 25) | (10, 10) |
|---|---|---|---|
| Recognition ratio (%) | 97.7 ± 0.2 | 97.4 ± 0.2 | 97.0 ± 0.2 |
| No. of prototypes | 377 ± 12 | 258 ± 7 | 112 ± 7 |
| Compression ratio (%) | 9.9 ± 0.3 | 6.8 ± 0.2 | 2.9 ± 0.2 |

Comparison with SVM and 1-NN:
LibSVM: 1197 support vectors; recognition ratio = 96.6%.
1-NN: best classifier (98%), but it keeps all 3823 training samples as prototypes.
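For concreteness, the compression ratio reported above is the fraction of the 3823 optdigits training samples that survive as prototypes. A one-line sketch (the function name is ours):

```python
def compression_ratio(n_prototypes, n_training):
    """Percentage of training samples retained as prototypes."""
    return 100.0 * n_prototypes / n_training

# e.g. 377 prototypes out of 3823 training samples is about 9.9 %
```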
![Page 142: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/142.jpg)
Experiment results: UCI repository data sets
Comparison of ASC and other classifiers: recognition ratio (%)

| Data set | ASC (a_d, λ) | NSC (σ²_max) | KMC (M) | NNC (k) | LVQ (M) |
|---|---|---|---|---|---|
| Iris | 97.4 ± 0.86 | 96.3 ± 0.4 | 96.2 ± 0.8 | 96.7 ± 0.6 | 96.1 ± 0.6 |
| Breast cancer | 97.4 ± 0.38 | 97.2 ± 0.2 | 95.9 ± 0.3 | 97.0 ± 0.2 | 96.3 ± 0.4 |
| Ionosphere | 90.4 ± 0.64 | 91.9 ± 0.8 | 87.4 ± 0.6 | 86.1 ± 0.7 | 86.4 ± 0.8 |
| Glass | 73.5 ± 1.6 | 70.2 ± 1.5 | 68.8 ± 1.1 | 72.3 ± 1.2 | 68.3 ± 2.0 |
| Liver disorders | 62.6 ± 0.83 | 62.9 ± 2.3 | 59.3 ± 2.3 | 67.3 ± 1.6 | 66.3 ± 1.9 |
| Pima Indians | 72.0 ± 0.63 | 68.6 ± 1.6 | 68.7 ± 0.9 | 74.7 ± 0.7 | 73.5 ± 0.9 |
| Wine | 82.6 ± 1.55 | 75.3 ± 1.7 | 71.9 ± 1.9 | 73.9 ± 1.9 | 72.3 ± 1.5 |
| Average | 82.3 ± 0.93 | 80.4 ± 1.2 | 78.3 ± 1.1 | 81.1 ± 0.99 | 79.9 ± 1.2 |

On average, ASC achieves the best recognition performance.
![Page 146: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/146.jpg)
Experiment results: UCI repository data sets (continued)
Comparison of ASC and other classifiers: compression ratio (%), with the best parameter setting in parentheses

| Data set | ASC (a_d*, λ*) | NSC (σ²_max*) | KMC (M*) | NNC (k*) | LVQ (M*) |
|---|---|---|---|---|---|
| Iris | 5.2 (6, 6) | 7.3 (0.25) | 8.0 (4) | 100 (14) | 15 (22) |
| Breast cancer | 1.4 (8, 8) | 1.8 (35.0) | 0.29 (1) | 100 (5) | 5.9 (40) |
| Ionosphere | 3.4 (15, 15) | 31 (1.25) | 4.0 (7) | 100 (2) | 6.8 (24) |
| Glass | 13.7 (15, 15) | 97 (0.005) | 17 (6) | 100 (1) | 45 (97) |
| Liver disorders | 4.6 (6, 6) | 4.9 (600) | 11 (19) | 100 (14) | 8.4 (29) |
| Pima Indians | 0.6 (6, 6) | 1.7 (2600) | 1.0 (4) | 100 (17) | 3.4 (26) |
| Wine | 3.2 (6, 6) | 96 (4.0) | 29 (17) | 100 (1) | 32 (57) |
| Average | 4.6 | 34.2 | 10.0 | 100 | 16.6 |

On average, ASC achieves the best (lowest) compression ratio.
![Page 148: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/148.jpg)
Requirements of semi-supervised learning
Labeled instances are difficult, expensive, or time-consuming to obtain.
How can a system use a large amount of unlabeled data together with limited labeled data to build good classifiers?
New data are continually added to an already huge database.
How can a system learn new knowledge without forgetting previously learned knowledge?
![Page 150: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/150.jpg)
SOINN used for semi-supervised learning
1 SOINN: represent the topology, learn incrementally;
2 Labeled data: label the winner nodes;
3 Division of a cluster.
Condition of division:
R_{c-1} ≤ R_c and R_c > R_{c+1}   (11)
R_c = Σ_{a ∈ N_c} dis(w_a, w_c)   (12)
where c−1 is the former node and c+1 denotes the unlabeled neighbors.
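Equations (11) and (12) can be written out directly. A small sketch, assuming a neighbor-list graph with weight vectors per node (the interface and names are ours):

```python
from math import dist

def cluster_radius(c, neighbors, weights):
    """R_c = sum over a in N_c of dis(w_a, w_c)  (Eq. 12)."""
    return sum(dist(weights[a], weights[c]) for a in neighbors[c])

def division_condition(r_prev, r_c, r_next):
    """Eq. 11: divide when R_{c-1} <= R_c and R_c > R_{c+1}."""
    return r_prev <= r_c and r_c > r_next
```

A node whose summed neighbor distance peaks relative to both adjacent values marks a candidate boundary at which the cluster is divided.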
![Page 155: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/155.jpg)
Experiment: original data
5%, 15%, or 40% class overlap
500 training samples, 5,000 validation samples, and 5,000 test samples
labeled samples: 10% and 20%
light blue: unlabeled data; other colors: labeled data
dashed line (- - - - -): ideal decision boundary
![Page 160: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/160.jpg)
Experiment results
Separates classes with few labeled samples.
On UCI data sets, it works better than other typical methods.
![Page 166: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/166.jpg)
SOINN used for active learning
Target: actively ask for the labels of a few samples so that all classes can be labeled.
Idea:
1 Use SOINN to learn the topology structure of the input data.
2 Actively label the vertex nodes of every class.
3 Use the vertex nodes to label all nodes.
4 Actively label the nodes that lie in the overlapped areas.
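The steps above can be sketched as follows. This is a hypothetical simplification, not the authors' implementation: `nodes` and `edges` stand for the topology SOINN has already learned, `ask_label` is a stand-in oracle for the human annotator, and "vertex nodes of every class" is approximated by one representative per connected component; the final step of re-querying nodes in overlapped areas is omitted.

```python
from collections import defaultdict

def propagate_labels(nodes, edges, ask_label):
    """Label every SOINN node while querying the oracle only a few times."""
    # Build adjacency lists from the learned topology.
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    labels = {}
    for n in nodes:
        if n in labels:
            continue
        # Active query: one representative per connected component.
        lab = ask_label(n)
        # Spread the queried label through the whole component.
        stack = [n]
        while stack:
            cur = stack.pop()
            if cur in labels:
                continue
            labels[cur] = lab
            stack.extend(adj[cur])
    return labels
```

With two components of two nodes each, this labels all four nodes using only two oracle queries.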
![Page 175: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/175.jpg)
Experiment: artificial data set under a stationary environment
Original data: four classes in all, with 10% noise.
Results (stationary environment): 10 teacher vectors are requested.
![Page 178: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/178.jpg)
Experiment: artificial data set under a non-stationary environment
16 teacher vectors are requested.
![Page 180: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/180.jpg)
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
![Page 181: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/181.jpg)
Background: typical associative memory systems
Distributed-learning associative memory:
Hopfield network: the most famous network, for auto-associative memory.
Bidirectional Associative Memory (BAM): for hetero-associative memory.
Competitive-learning associative memory:
KFMAM: Kohonen feature map associative memory.
Difficulties
They forget previously learned knowledge when learning new knowledge incrementally.
Limited storage capacity.
Memorizing real-valued data.
Many-to-many association.
![Page 192: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/192.jpg)
Objectives of SOINN-AM
Incremental learning of memory pairs.
Robustness to noisy data.
Handling real-valued data.
Many-to-many association.
![Page 198: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/198.jpg)
Architecture of SOINN-AM
![Page 199: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/199.jpg)
Algorithms of SOINN-AM
Basic idea of memory phase
1 Combine the key vector and the associate vector into a single input vector.
2 Use SOINN to learn these combined input vectors.
Basic idea of recall phase
1 Using the key part of the stored nodes, find the winner node for the given key vector; let d be the distance between them.
2 If d ≤ ε, output the associative part of the winner as the recall result.
3 If d > ε, report "unknown" for the key vector.
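The two phases above can be sketched as follows. This is a minimal illustration, not the SOINN-AM algorithm itself: for clarity, the SOINN learning step is replaced by simply storing each concatenated vector as a node, and the class name `SimpleAM` and its parameters are hypothetical.

```python
import numpy as np

class SimpleAM:
    def __init__(self, key_dim, epsilon):
        self.key_dim = key_dim    # length of the key part of each vector
        self.epsilon = epsilon    # recall threshold on the key distance
        self.nodes = []           # stored (key | associate) vectors

    def memorize(self, key, assoc):
        # Memory phase: combine key and associate vector into one input.
        self.nodes.append(np.concatenate([key, assoc]))

    def recall(self, key):
        # Recall phase: find the winner using only the key part of each node.
        keys = np.array([n[:self.key_dim] for n in self.nodes])
        dists = np.linalg.norm(keys - key, axis=1)
        winner = int(np.argmin(dists))
        # Answer with the winner's associate part, or report "unknown".
        if dists[winner] <= self.epsilon:
            return self.nodes[winner][self.key_dim:]
        return None  # unknown key
```

A key close to a stored key (within ε) recalls its associate part; a distant key returns `None`.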
![Page 207: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/207.jpg)
Original data
Binary data
Real-valued data
![Page 208: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/208.jpg)
Comparison with typical AM systems
![Page 209: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/209.jpg)
Robustness to noise
![Page 210: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/210.jpg)
Many-to-many association test
SOINN-AM recalls all patterns perfectly.
![Page 211: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/211.jpg)
Architecture and basic idea of GAM
Input layer: key vector and associate vector.
Memory layer: memorizes patterns with their classes.
Associate layer: builds associations between classes.
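The layered structure above can be sketched as a simple data structure. This is a hedged illustration of the idea only (patterns grouped into classes, with associations stored between classes rather than between individual patterns); the class name `SimpleGAM` and its methods are hypothetical, and the real GAM builds its memory layer with SOINN.

```python
from collections import defaultdict

class SimpleGAM:
    def __init__(self):
        self.memory = defaultdict(list)  # memory layer: class -> stored patterns
        self.assoc = defaultdict(set)    # associate layer: class -> associated classes

    def memorize(self, cls, pattern):
        # Memory layer: store a pattern under its class.
        self.memory[cls].append(pattern)

    def associate(self, key_cls, assoc_cls):
        # Associate layer: link two classes, not two individual patterns.
        self.assoc[key_cls].add(assoc_cls)

    def recall(self, key_cls):
        # Return the stored patterns of every class associated with key_cls;
        # linking at the class level naturally gives many-to-many association.
        return {c: self.memory[c] for c in self.assoc[key_cls]}
```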
![Page 215: self organising INCREMENTAL neural network](https://reader033.vdocuments.site/reader033/viewer/2022051615/55305c914a7959a2318b46cd/html5/thumbnails/215.jpg)
ContentsWhat is SOINN
Why SOINNDetail algorithm of SOINN
SOINN for machine learningSOINN for associative memory
References
1 What is SOINN
2 Why SOINN
3 Detail algorithm of SOINN
4 SOINN for machine learning
5 SOINN for associative memory
6 References
References about SOINN
SOINN for unsupervised learning:
Furao Shen and Osamu Hasegawa, "An Incremental Network for On-line Unsupervised Classification and Topology Learning", Neural Networks, Vol. 19, No. 1, pp. 90-106, (2005)
Furao Shen, Tomotaka Ogura and Osamu Hasegawa, "An enhanced self-organizing incremental neural network for online unsupervised learning", Neural Networks, Vol. 20, No. 8, pp. 893-903, (2007)
SOINN for supervised learning:
Furao Shen and Osamu Hasegawa, "A Fast Nearest Neighbor Classifier Based on Self-organizing Incremental Neural Network", Neural Networks, Vol. 21, No. 10, pp. 1537-1547, (2008)
SOINN for semi-supervised and active learning:
Youki Kamiya, Toshiaki Ishii, Furao Shen and Osamu Hasegawa, "An Online Semi-Supervised Clustering Algorithm Based on a Self-organizing Incremental Neural Network", IJCNN 2007, Orlando, FL, USA, August 2007
Furao Shen, Keisuke Sakurai, Youki Kamiya and Osamu Hasegawa, "An Online Semi-supervised Active Learning Algorithm with Self-organizing Incremental Neural Network", IJCNN 2007, Orlando, FL, USA, August 2007
SOINN for associative memory:
Akihito Sudo, Akihiro Sato and Osamu Hasegawa, "Associative Memory for Online Learning in Noisy Environments Using Self-organizing Incremental Neural Network", IEEE Transactions on Neural Networks, (2009) in press
Download papers and programs of SOINN:
http://www.isl.titech.ac.jp/~hasegawalab/soinn.html