TRANSCRIPT
1
Computation in neural networks
M. Meeter
2
Perceptron learning problem
Input patterns:
[+1, +1, -1, -1]   [-1, -1, +1, +1]   [-1, -1, -1, -1]
[-1, -1, +1, -1]   [-1, +1, +1, -1]   [+1, -1, +1, -1]
Desired outputs:
[+1, -1, +1]   [+1, +1, -1]   [-1, -1, -1]   [-1, +1, +1]
Calculating a function
3
Types of networks & functions
Attractor -> completion, autoassociative memory
Feedforward Hebbian:
•associative (Hebbian) -> association, associative memory
•competitive -> clustering
Feedforward error-correcting:
•perceptron -> categorization, generalization
•backprop -> same, but nonlinear
4
Types of networks
Attractor -> completion, autoassociative memory
Feedforward Hebbian:
•associative (Hebbian) -> association, associative memory
•competitive -> clustering
Feedforward error-correcting:
•perceptron -> categorization, generalization
•backprop -> same, but nonlinear
5
Classification
(Figure: labeled points in the (x1, x2) feature space, with a region marked A.)
6
Generalization
(Figure: generalization example with the values 76 and 128 and an unknown "?" to be predicted.)
7
Univariate Linear Regression
prediction of values
ŷ = ax + b
e = y − ŷ
Minimize the sum of squared errors: min Σe²
(Figure: scatter of (X, Y) points with the fitted regression line.)
Regression = generalization
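The least-squares fit on this slide can be computed in closed form. A minimal sketch with made-up data points; the slope and intercept formulas are the standard ones that minimize Σe²:

```python
# Closed-form least squares for y_hat = a*x + b, minimizing sum of e^2
# where e = y - y_hat. Example data is made up for illustration.

def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # lies exactly on y = 2x + 1
a, b = fit_line(xs, ys)
print(a, b)  # 2.0 1.0
```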
8
Clustering
(Figure: unlabeled points in the (x1, x2) feature space forming clusters.)
9
Types of networks
Attractor -> completion, autoassociative memory
Feedforward Hebbian:
•associative (Hebbian) -> association, associative memory
•competitive -> clustering
Feedforward error-correcting:
•perceptron -> categorization, generalization
•backprop -> same, but nonlinear
10
Perceptron learning problem
Prototypical input patterns:
[+1, +1, -1, -1]   [-1, -1, +1, +1]   [-1, -1, -1, -1]
[-1, -1, +1, -1]   [-1, +1, +1, -1]   [+1, -1, +1, -1]
Desired outputs:
[+1, -1, +1]   [+1, +1, -1]   [-1, -1, -1]   [-1, +1, +1]
Classification - discrete
12
(Figure: perceptron unit with inputs X1 … Xn, weights wji, a threshold activation, and output y.)
vj = Σi wji xi
f(v) = 1 if v > 0; f(v) = 0 if v ≤ 0 (threshold)
Classification in Perceptron
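The threshold unit above can be sketched in a few lines; the weights and patterns below are made-up illustrative values, not ones from the lecture:

```python
# Threshold perceptron unit: v = sum_i x_i * w_i, output 1 if v > 0 else 0.
# Weights and input patterns are illustrative assumptions.

def perceptron_output(x, w):
    v = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if v > 0 else 0

w = [0.5, -0.5, 0.5, -0.5]
print(perceptron_output([+1, -1, +1, -1], w))  # 1 (net input v = 2 > 0)
print(perceptron_output([-1, +1, -1, +1], w))  # 0 (net input v = -2 <= 0)
```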
13
A quick aside…
In the perceptron etc.: if a node's net input is > 0 it becomes active; otherwise its activation is 0.
This is not always what you want: that is why a node in the continuous versions of the perceptron / backprop has a 'bias', an activation that is always added to the input.
Effect: shifting the threshold.
14
Classification in 2 dimensions
(Figure: decision line in the (x1, x2) plane where input = threshold; the '+' region lies on one side and the '−' region on the other. The input is a weighted mixture of x1 and x2.)
15
Discriminant Analysis
(Figure: two groups of points in the (x1, x2) plane with their group means marked.)
Produces exactly the same result:
Find the center of each of the two categories, draw the line between the two centers, then the perpendicular through its midpoint is the discrimination line.
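The centroid construction can be sketched as follows: classifying a point by its nearest group mean is equivalent to checking which side of the perpendicular bisector between the means it falls on. The data points are made up for illustration:

```python
# Nearest-centroid classification: compute each category's mean, then assign
# a point to the category whose mean is closest (squared Euclidean distance).
# Example groups are illustrative assumptions.

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def classify(x, g1, g2):
    d1 = sum((xi - gi) ** 2 for xi, gi in zip(x, g1))
    d2 = sum((xi - gi) ** 2 for xi, gi in zip(x, g2))
    return 1 if d1 < d2 else 2

group1 = [[1.0, 1.0], [2.0, 1.0], [1.0, 2.0]]
group2 = [[5.0, 5.0], [6.0, 5.0], [5.0, 6.0]]
g1, g2 = centroid(group1), centroid(group2)
print(classify([1.5, 1.5], g1, g2))  # 1
print(classify([5.5, 5.5], g1, g2))  # 2
```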
16
Univariate Linear Regression
prediction of values
ŷ = ax + b
e = y − ŷ
Minimize the sum of squared errors: min Σe²
(Figure: scatter of (X, Y) points with the fitted regression line.)
Generalization = Regression
17
(Figure: perceptron unit j with inputs X1 … Xn, weights wji, a linear activation function φ(·), a bias, and output y.)
v = Σi wji xi
φ(v) = av + b   (linear activation; b is the bias)
Change the weights with a learning rule, minimizing e²
Perceptron with linear activation rule
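The weight-change rule that minimizes e² for a linear unit is the delta (LMS) rule. A sketch under assumed values; the learning rate, epoch count, and training data are all illustrative:

```python
# Delta (LMS) rule for a linear unit: y = sum_i x_i * w_i + bias,
# e = t - y, and each weight is nudged by lr * e * x_i to reduce e^2.
# Learning rate, epochs, and data are illustrative assumptions.

def train_linear_unit(patterns, targets, lr=0.05, epochs=500):
    w = [0.0] * len(patterns[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            y = sum(xi * wi for xi, wi in zip(x, w)) + b
            e = t - y
            w = [wi + lr * e * xi for wi, xi in zip(w, x)]
            b += lr * e
    return w, b

# Targets generated by y = 2*x1 - x2 + 1; the unit should recover those values
X = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
T = [1.0, 3.0, 0.0, 2.0]
w, b = train_linear_unit(X, T)
print(w, b)  # close to [2, -1] and 1
```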
18
Multivariate = multiple independent variables X = multiple inputs xi
Multiple = multiple dependent variables Y = multiple outputs yj
(Figure: network with inputs X1 … Xn and two output units producing y1 and y2.)
19
Linear vs. nonlinear regression
(Figure, left: a linear fit of y on x. Right: a nonlinear fit of y on x.)
Here: quadratic. In general: wrinkle-fitting.
20
(Figure: multi-layer perceptron with input vector X = [x1, x2, …, xi, …, xn], a hidden layer, and outputs y1 and y2.)
v = Σi wji xi
φ(v) = 1 / (1 + e^(−av))   (sigmoid activation)
Multi-Layer Perceptron
Fit any function: "universal approximators"
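A forward pass through such a network, using the sigmoid φ(v) = 1/(1 + e^(−av)) from the slide, might look like this; the weights are arbitrary illustrative values, not trained ones:

```python
import math

# Two-layer perceptron forward pass with logistic (sigmoid) activation.
# Weights are arbitrary illustrative values, not the result of training.

def phi(v, a=1.0):
    return 1.0 / (1.0 + math.exp(-a * v))

def mlp_forward(x, W_hidden, W_out):
    hidden = [phi(sum(xi * wi for xi, wi in zip(x, w))) for w in W_hidden]
    return [phi(sum(hi * wi for hi, wi in zip(hidden, w))) for w in W_out]

W_hidden = [[1.0, -1.0], [-1.0, 1.0]]   # two hidden units
W_out = [[1.0, 1.0]]                     # one output unit
print(mlp_forward([0.5, -0.5], W_hidden, W_out))
```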
21
(Figure: a too-simple model fitted to the (x, y) data is bad; a too-complex model is extremely bad.)
Overfitting
22
Clustering
(Figure: clusters of points in the (x1, x2) feature space.)
Competitive learning: next week, ART.
23
Conclusions
Neural networks are similar to statistical analyses:
•Perceptron -> categorization / generalization
•Backprop -> the same, but nonlinear
•Competitive learning -> clustering
But… the statistical methods fit the whole data set at once, whereas the networks process one pattern at a time.
24
Feature reduction with PCA
(Figure: two groups of points in the (x1, x2) plane with their group means; a principal-component axis reduces the two features to one.)
25
Feature extraction with PCA
(Figure: units extracting features y1 … from the inputs; '?' marks the features to be discovered.)
Unsupervised Learning
Hebbian Learning
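One classic bridge between Hebbian learning and PCA is Oja's rule, a stabilized Hebbian update (the rule is not named on the slide, so treating it as the mechanism here is an assumption). For roughly zero-mean data it converges toward the first principal component; the data and learning rate below are made up:

```python
import random

# Oja's rule: y = w.x, then w += lr * y * (x - y * w). The subtracted term keeps
# |w| near 1, and w converges toward the first principal component of roughly
# zero-mean data. Data, learning rate, and epochs are illustrative assumptions.

random.seed(0)
# Points stretched along the diagonal: the first PC is roughly [0.707, 0.707]
data = [[t + random.gauss(0, 0.1), t + random.gauss(0, 0.1)]
        for t in [-1.0, -0.5, 0.0, 0.5, 1.0] * 40]

w = [1.0, 0.0]
lr = 0.05
for _ in range(100):
    for x in data:
        y = sum(xi * wi for xi, wi in zip(x, w))
        w = [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]
print(w)  # close to the unit vector along the diagonal
```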