S. Mandayam/ ANN/ECE Dept./Rowan University
Artificial Neural Networks
0909.560.01/0909.454.01
Fall 2004
Lecture 1: September 13, 2004
Shreekanth Mandayam
ECE Department, Rowan University
http://engineering.rowan.edu/~shreek/spring04/ann/
Plan
• What is artificial intelligence?
• Course introduction
• Historical development – the neuron model
• The artificial neural network paradigm
• What is knowledge? What is learning?
• The Perceptron
• Widrow-Hoff Learning Rule
• The “Future”…?
Artificial Intelligence
Systems that think like humans
• Cognitive modeling
Systems that think rationally
• Logic
Systems that act like humans
• Natural language processing
• Knowledge representation
• Machine learning
Systems that act rationally
• Decision-theoretic agents
Course Introduction
• Why should we take this course?
• Pattern recognition, applications
• What are we studying in this course?
• Course objectives/deliverables
• How are we conducting this course?
• Course logistics
• http://engineering.rowan.edu/~shreek/spring04/ann/
Course Objectives
At the conclusion of this course the student will be able to:
• Identify and describe engineering paradigms for knowledge and learning
• Identify, describe and design artificial neural network architectures for simple cognitive tasks
History/People
1940s   Turing                General problem solver, “Turing test”
1940s   Shannon               Information theory
1943    McCulloch and Pitts   Mathematics of neural processes
1949    Hebb                  Learning model
1958    Rosenblatt            The “Perceptron”
1960    Widrow                LMS training algorithm
1969    Minsky and Papert     Perceptron deficiency
1986    Rumelhart             Feedforward MLP, backpropagation
1988    Broomhead and Lowe    Radial basis function neural nets
1990s                         VLSI implementations
1997                          IEEE 1451
Neural Network Paradigm
Stage 1: Network Training. Present examples to the artificial neural network and indicate the desired outputs; training determines the synaptic weights (the network’s “knowledge”).
Stage 2: Network Testing. Present new data to the trained artificial neural network and obtain its predicted outputs.
ANN Model
An input vector x = [x1, x2, x3]ᵀ is mapped by the artificial neural network (its “knowledge”) to an output vector y = [y1, y2, y3]ᵀ. The network implements a complex nonlinear function:
f(x) = y
Popular I/O Mappings
• Single output:        x → ANN → y
• 1-out-of-c selector:  x → ANN → y1, y2, …, yc
• Coder:                x → ANN → y1, y2, …, yc
• Associator:           x → ANN → y
The Perceptron
Inputs x1, x2, …, xm are multiplied by the synaptic weights wk1, wk2, …, wkm and summed:
uk = Σj wkj xj
Adding the bias bk gives the induced field:
vk = uk + bk
An activation/squashing function φ(·) produces the output:
yk = φ(vk)
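The neuron computation above can be sketched in a few lines of Python (a minimal illustration, not course code; the use of tanh as the default squashing function and the example weights/bias are assumptions):

```python
import math

def perceptron_output(x, w, b, phi=math.tanh):
    """Single-neuron forward pass: u_k = sum_j w_kj * x_j,
    induced field v_k = u_k + b_k, output y_k = phi(v_k).
    phi defaults to tanh as an example squashing function."""
    u = sum(wj * xj for wj, xj in zip(w, x))   # linear combiner u_k
    v = u + b                                  # induced field v_k
    return phi(v)                              # activation output y_k

# Example call with assumed weights and bias:
y = perceptron_output([1.0, -2.0, 0.5], [0.4, 0.1, -0.3], b=0.2)
```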
“Learning”
x → ANN [w] → y
Mathematical model of the learning process:
Initialize, iteration (0):  x → [w]0 → y(0)
Iteration (1):              x → [w]1 → y(1)
…
Iteration (n):              x → [w]n → y(n) = d (the desired output)
Error-Correction Learning
At iteration n, inputs x1(n), x2(n), …, xm(n) are weighted by the synaptic weights wk1(n), wk2(n), …, wkm(n) and summed with the bias bk to form the induced field vk(n); the activation/squashing function then produces the output yk(n). The error signal is the difference between the desired and actual outputs:
ek(n) = dk(n) − yk(n)
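One error-correction iteration can be sketched as follows (hedged: the learning-rate name eta and its value, the identity activation default, and the bias update are assumptions for illustration; the weight change follows the standard delta rule Δwkj = η·ek(n)·xj(n)):

```python
def error_correction_step(x, w, b, d, eta=0.1, phi=lambda v: v):
    """One iteration of error-correction learning for neuron k.
    Computes y_k(n), the error e_k(n) = d_k(n) - y_k(n), then
    updates each weight by delta_w = eta * e * x_j (delta rule)."""
    v = sum(wj * xj for wj, xj in zip(w, x)) + b   # induced field v_k(n)
    y = phi(v)                                     # output y_k(n)
    e = d - y                                      # error signal e_k(n)
    w_new = [wj + eta * e * xj for wj, xj in zip(w, x)]
    b_new = b + eta * e                            # bias treated as a weight on a constant input
    return w_new, b_new, e
```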
Learning Tasks
• Pattern Association
• Pattern Recognition
• Function Approximation
• Filtering

Classification
[Figure: two classes in the (x1, x2) plane separated by a decision boundary (DB)]
Perceptron Training: Widrow-Hoff Rule (LMS Algorithm)
Initialize: w(0) = 0, n = 0
Compute the output: y(n) = sgn[wT(n) x(n)]
Update the weights: w(n+1) = w(n) + [d(n) − y(n)] x(n)
Set n = n + 1 and repeat
Matlab Demo
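The steps above translate directly into a short training loop. The sketch below is not the course’s Matlab demo; the learning rate eta (the slide’s rule has none, i.e. eta = 1/2 here only rescales the ±2 error), the AND-gate training data with ±1 targets, the epoch count, and folding the bias into the weight vector via an appended constant input are all assumptions for the example:

```python
def sgn(v):
    """Signum with the convention sgn(0) = +1 (an assumption)."""
    return 1 if v >= 0 else -1

def train_perceptron(samples, eta=0.5, epochs=20):
    """Widrow-Hoff style training: w(0) = 0; y(n) = sgn(w^T x);
    w(n+1) = w(n) + eta * [d(n) - y(n)] * x(n).
    A constant 1 is appended to each input so the bias is
    learned as an extra weight."""
    m = len(samples[0][0]) + 1
    w = [0.0] * m                                # w(0) = 0
    for _ in range(epochs):
        for x, d in samples:
            xa = list(x) + [1.0]                 # augmented input
            y = sgn(sum(wi * xi for wi, xi in zip(w, xa)))
            w = [wi + eta * (d - y) * xi for wi, xi in zip(w, xa)]
    return w

# Assumed example: learn the linearly separable AND function with ±1 targets.
data = [([-1, -1], -1), ([-1, 1], -1), ([1, -1], -1), ([1, 1], 1)]
w = train_perceptron(data)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the loop stops updating after finitely many corrections.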
The Age of Spiritual Machines: When Computers Exceed Human Intelligence, by Ray Kurzweil (Penguin paperback, ISBN 0-14-028202-5)