Artificial Neural Network
Posted on 28-Jun-2020

TRANSCRIPT
Slide 3: Introduction

History
• 1943: M-P neuron
• 1957: Perceptron
• 1974: Back propagation
• 2006: Deep neural network
Slide 5: Neural Network
Single Layer Perceptron

Inputs $x_1, \dots, x_i, \dots, x_n$ (input layer) feed a single output node $y_1$ (output layer) through weights $w_1, \dots, w_i, \dots, w_n$ and a bias $b_1$:

$s = \sum_{i=1}^{n} x_i w_i + b_1, \qquad y = f(s)$
Slide 6: Neural Network
Single Layer Perceptron
• Example: AND

With inputs $x_1, x_2$, weights $w_{11}, w_{12}$, and bias $b_1$:

$y_1 = f(w_{11} x_1 + w_{12} x_2 + b_1)$

The decision boundary is the line

$x_2 = -\frac{w_{11}}{w_{12}} x_1 - \frac{b_1}{w_{12}} = -x_1 + 1.5$

which separates the AND outputs (e.g. with $w_{11} = w_{12} = 1$, $b_1 = -1.5$).
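As a minimal sketch of the AND perceptron, assuming a unit step activation and the weights $w_{11} = w_{12} = 1$, $b_1 = -1.5$ implied by the boundary $x_2 = -x_1 + 1.5$:

```python
def step(s):
    """Unit step activation f(s)."""
    return 1 if s >= 0 else 0

def perceptron_and(x1, x2, w11=1.0, w12=1.0, b1=-1.5):
    """y1 = f(w11*x1 + w12*x2 + b1)."""
    return step(w11 * x1 + w12 * x2 + b1)

# Truth table: only the input (1, 1) fires.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, perceptron_and(x1, x2))
```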
Slide 7: Neural Network
Single Layer Perceptron
• Example: the XOR problem (Minsky, M.; Papert, S. (1969). Perceptrons: An Introduction to Computational Geometry. MIT Press.)

A single-layer perceptron draws one linear decision boundary, so it cannot represent XOR.
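The non-separability of XOR can be checked by brute force: the sketch below (the weight grid and the $s \ge 0$ threshold convention are illustrative choices) scans a grid of $(w_1, w_2, b)$ values and confirms that no linear threshold unit reproduces the XOR truth table.

```python
import itertools

def separates_xor(w1, w2, b):
    """True if the unit step of w1*x1 + w2*x2 + b matches XOR on all inputs."""
    xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    return all((1 if w1 * x1 + w2 * x2 + b >= 0 else 0) == t
               for (x1, x2), t in xor.items())

# Scan w1, w2, b over [-3, 3] in steps of 0.1.
vals = [i / 10 for i in range(-30, 31)]
found = any(separates_xor(w1, w2, b)
            for w1, w2, b in itertools.product(vals, repeat=3))
print(found)  # no setting in the grid separates XOR
```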
Slide 8: Neural Network
Multi-Layer Perceptron (MLP)

Input layer ($N$ nodes): $x_1, \dots, x_i, \dots, x_N$
Hidden layer ($P$ nodes): weights $w^1_{ip}$, biases $b_p$, outputs $y^1_p$
Output layer ($M$ nodes): weights $w^2_{pm}$, biases $b_m$, outputs $y^2_m$
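A minimal sketch of the forward pass through this $N$-$P$-$M$ network, assuming a sigmoid activation $f$; the variable names (`W1`, `b1`, `W2`, `b2`) and the example values are illustrative:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def forward(x, W1, b1, W2, b2):
    """Forward pass of a two-layer MLP with sigmoid activations."""
    # Hidden layer: y1_p = f(sum_i x_i * w1_ip + b_p)
    hidden = [sigmoid(sum(x[i] * W1[i][p] for i in range(len(x))) + b1[p])
              for p in range(len(b1))]
    # Output layer: y2_m = f(sum_p y1_p * w2_pm + b_m)
    out = [sigmoid(sum(hidden[p] * W2[p][m] for p in range(len(hidden))) + b2[m])
           for m in range(len(b2))]
    return hidden, out

# Example: N=2 inputs, P=2 hidden units, M=1 output.
hidden, out = forward([1.0, 0.0],
                      W1=[[0.5, -0.5], [0.3, 0.8]], b1=[0.1, -0.1],
                      W2=[[1.0], [-1.0]], b2=[0.0])
print(out)
```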
Slide 9: Neural Network
Multi-Layer Perceptron (MLP) (diagram repeated from the previous slide)
Slide 10: Back Propagation
Back Propagation (Werbos, 1974; Parker, 1982)

The same MLP diagram, extended with desired outputs $y_{d,m}$ that are compared with the network outputs $y^2_m$ to give the errors $e_m$.
Slide 11: Back Propagation
Back Propagation (Werbos, 1974; Parker, 1982)
• Concept of the gradient descent algorithm

Update process: $w(t+1) = w(t) + \Delta w(t)$
Measurement process: $e(t) = y_d(t) - y(t)$
Slide 12: Back Propagation
Back Propagation (Werbos, 1974; Parker, 1982)
• Gradient descent algorithm

Weight update: $w(t+1) = w(t) + \Delta w(t)$
Objective function: $E = \frac{1}{2} \sum_{m=1}^{M} e_m^2$
The gradient: $\Delta w(t) = -\eta \frac{\partial E}{\partial w}$

[Figure: error surface $E(w)$ with iterates $w_0, w_1, w_2$ descending toward the minimum $w^*$]
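The gradient descent rule above can be illustrated on a toy objective; here $E(w) = \frac{1}{2}(w-3)^2$ with gradient $\partial E / \partial w = w - 3$ and minimum $w^* = 3$ (the objective and learning rate are illustrative choices, not from the slides):

```python
eta = 0.1   # learning rate (illustrative)
w = 0.0     # starting point w0

for t in range(100):
    grad = w - 3.0      # dE/dw for E(w) = (w - 3)^2 / 2
    dw = -eta * grad    # Δw(t) = -η dE/dw
    w = w + dw          # w(t+1) = w(t) + Δw(t)

print(w)  # converges toward the minimum w* = 3
```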
Slide 13: Back Propagation
Back Propagation (Werbos, 1974; Parker, 1982)
• Example
Slide 14: Back Propagation
Back Propagation (Werbos, 1974; Parker, 1982)

Update process (with momentum coefficient $\alpha$):
$w(t+1) = w(t) + \Delta w(t) + \alpha \Delta w(t-1)$

Measurement process:
$\Delta w^2_{pm}(t) = e_m \eta f'(s^2_m)\, y^1_p$
$\Delta b_m(t) = e_m \eta f'(s^2_m)$
$\Delta w^1_{ip}(t) = \eta f'(s^1_p)\, x_i \sum_{m=1}^{M} e_m f'(s^2_m)\, w^2_{pm}$
$\Delta b_p(t) = \eta f'(s^1_p) \sum_{m=1}^{M} e_m f'(s^2_m)\, w^2_{pm}$
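The measurement-process formulas above can be sketched for one training sample, assuming a sigmoid activation $f$ (so $f'(s) = f(s)(1 - f(s))$) and the $N$-$P$-$M$ layer naming of the MLP slides; the variable names and example values are illustrative:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def backprop_deltas(x, yd, W1, b1, W2, b2, eta):
    """Weight/bias deltas for one sample, per the measurement-process formulas."""
    N, P, M = len(x), len(b1), len(b2)
    # Forward pass, keeping the pre-activations s1, s2.
    s1 = [sum(x[i] * W1[i][p] for i in range(N)) + b1[p] for p in range(P)]
    y1 = [sigmoid(s) for s in s1]
    s2 = [sum(y1[p] * W2[p][m] for p in range(P)) + b2[m] for m in range(M)]
    y2 = [sigmoid(s) for s in s2]
    e = [yd[m] - y2[m] for m in range(M)]        # e_m = yd_m - y2_m
    fp1 = [y * (1 - y) for y in y1]              # f'(s1_p) for sigmoid
    fp2 = [y * (1 - y) for y in y2]              # f'(s2_m) for sigmoid
    # Output layer: Δw2_pm = e_m η f'(s2_m) y1_p,  Δb_m = e_m η f'(s2_m)
    dW2 = [[e[m] * eta * fp2[m] * y1[p] for m in range(M)] for p in range(P)]
    db2 = [e[m] * eta * fp2[m] for m in range(M)]
    # Hidden layer: back-propagated error sum_m e_m f'(s2_m) w2_pm
    back = [sum(e[m] * fp2[m] * W2[p][m] for m in range(M)) for p in range(P)]
    # Δw1_ip = η f'(s1_p) x_i back_p,  Δb_p = η f'(s1_p) back_p
    dW1 = [[eta * fp1[p] * x[i] * back[p] for p in range(P)] for i in range(N)]
    db1 = [eta * fp1[p] * back[p] for p in range(P)]
    return dW1, db1, dW2, db2

# Example call with N=2, P=2, M=1.
dW1, db1, dW2, db2 = backprop_deltas(
    x=[1.0, 0.0], yd=[1.0],
    W1=[[0.5, -0.5], [0.3, 0.8]], b1=[0.1, -0.1],
    W2=[[1.0], [-1.0]], b2=[0.0], eta=0.9)
```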
Slide 16: Implementation
XOR classification with $\eta = 0.9$

[Figures: parameters, outputs, and classification results]
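An illustrative end-to-end run of this XOR experiment: a 2-2-1 sigmoid MLP trained by the back-propagation rules with the slide's learning rate $\eta = 0.9$. The weight initialisation, random seed, and epoch count are assumptions, not taken from the slides.

```python
import math
import random

random.seed(0)

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# Random small initial weights (assumption; the slide does not give them).
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [random.uniform(-1, 1) for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = random.uniform(-1, 1)
eta = 0.9
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def predict(x):
    y1 = [sigmoid(x[0] * W1[0][p] + x[1] * W1[1][p] + b1[p]) for p in range(2)]
    y2 = sigmoid(y1[0] * W2[0] + y1[1] * W2[1] + b2)
    return y1, y2

def total_error():
    """Objective E = (1/2) sum of squared errors over the four patterns."""
    return sum((t - predict(x)[1]) ** 2 for x, t in data) / 2

err_before = total_error()
for epoch in range(10000):
    for x, t in data:
        y1, y2 = predict(x)
        d2 = (t - y2) * y2 * (1 - y2)                 # e f'(s2)
        for p in range(2):
            d1 = d2 * W2[p] * y1[p] * (1 - y1[p])     # back-propagated term
            W2[p] += eta * d2 * y1[p]                 # Δw2_p = η e f'(s2) y1_p
            for i in range(2):
                W1[i][p] += eta * d1 * x[i]           # Δw1_ip
            b1[p] += eta * d1                         # Δb_p
        b2 += eta * d2                                # Δb_m

err_after = total_error()
print(err_before, err_after)  # training drives the error down
```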
Slide 20: Deep Learning
Local minima problem
• Local minima → addressed with unsupervised learning (pre-training)
Slide 21: Deep Learning
Deep Neural Network
Leon A. Gatys, Alexander S. Ecker, Matthias Bethge. "A Neural Algorithm of Artistic Style".