Lecture 3. The Brain as a Network of Neurons



Michael Arbib and Laurent Itti: CS564 - Brain Theory and Artificial Intelligence

Lecture 3. The Brain as a Network of Neurons

Reading Assignments:*
TMB2: Section 2.3
HBTNN: Single Cell Models (Softky and Koch); Axonal Modeling (Koch and Bernander); Perspective on Neuron Model Complexity (Rall)

* Unless indicated otherwise, the TMB2 material is the required reading, and the other readings supplementary.


The "basic" biological neuron

The soma and dendrites act as the input surface; the axon carries the outputs. The tips of the branches of the axon form synapses upon other neurons or upon effectors (though synapses may occur along the branches of an axon as well as at the ends). The arrows indicate the direction of "typical" information flow from inputs to outputs.

[Figure: neuron schematic showing dendrites, soma, and axon with branches and synaptic terminals]


From Passive to Active Propagation

For "short" cells passive propagation suffices to signal a potential change from one end to the other;

If the axon is long, this is inadequate since changes at one end would decay away almost completely before reaching the other end.

If the change in potential difference is large enough, then in a cylindrical configuration such as the axon, a pulse can actively propagate at full amplitude. This is described by the Hodgkin-Huxley Equations (1952).
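The Hodgkin-Huxley equations couple the membrane potential to three gating variables (m, h, n) that control the sodium and potassium channels. Below is a minimal numerical sketch of the space-clamped (single membrane patch) equations using the classic squid-axon parameters and forward-Euler integration; the parameter values, stimulus, and code layout are standard textbook assumptions, not taken from these slides.

# Minimal sketch (not from the lecture): space-clamped Hodgkin-Huxley dynamics,
# integrated with forward Euler, using the classic squid-axon parameters.
import math

# Maximal conductances (mS/cm^2), reversal potentials (mV), capacitance (uF/cm^2)
g_Na, g_K, g_L = 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4
C_m = 1.0

# Voltage-dependent rate functions for the gating variables m, h, n
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * math.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * math.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * math.exp(-(V + 65.0) / 80.0)

dt, T = 0.01, 50.0                      # time step and duration (ms)
V, m, h, n = -65.0, 0.05, 0.6, 0.32     # resting state
for step in range(int(T / dt)):
    t = step * dt
    I_ext = 10.0 if 5.0 <= t <= 30.0 else 0.0   # injected current (uA/cm^2)
    # Ionic currents through the Na+, K+, and leak channels
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K  = g_K * n**4 * (V - E_K)
    I_L  = g_L * (V - E_L)
    # Forward-Euler update of membrane potential and gating variables
    V += dt * (I_ext - I_Na - I_K - I_L) / C_m
    m += dt * (alpha_m(V) * (1.0 - m) - beta_m(V) * m)
    h += dt * (alpha_h(V) * (1.0 - h) - beta_h(V) * h)
    n += dt * (alpha_n(V) * (1.0 - n) - beta_n(V) * n)
    if step % 500 == 0:
        print(f"t = {t:5.1f} ms   V = {V:7.2f} mV")

Running the loop shows the brief depolarization, repolarization, and hyperpolarization phases discussed on the following slides.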


Neurons and Synapses


Excitatory and Inhibitory Synapses

Dale's law states that each neuron releases a single transmitter substance.

This does not mean that the synapses made by a single neuron are either all excitatory or all inhibitory.

Modern understanding: Channels which "open" and "close" provide the mechanisms for the Hodgkin-Huxley equations, and this notion of channels extends to synaptic transmission.

The action of a synapse depends on both the transmitter released presynaptically and the specialized receptors in the postsynaptic membrane.

Moreover, neurons may secrete transmitters which act as neuromodulators of the function of a circuit on some quite extended time scale (cf. TMB2 Sections 6.1 and 8.1).


Transmembrane Ionic Transport

Ion channels act as gates that allow or block the flow of specific ions into and out of the cell.


Gated Channels

A given chemical (e.g., a neurotransmitter) acts as a ligand and gates the opening of the channel by binding to a receptor site on the channel.


Action Potential

At rest, the inside of the cell sits at a negative potential (compared to its surroundings).

An action potential consists of a brief "depolarization" (the negative rest potential decreases to zero) followed by "repolarization" (the inside of the membrane goes back to the negative rest potential), with a slight "hyperpolarization" overshoot before reaching rest.


Action Potential and Ion Channels

Initial depolarization is due to the opening of sodium (Na+) channels.
Repolarization is due to the opening of potassium (K+) channels.
Hyperpolarization occurs because K+ channels stay open longer than Na+ channels (and longer than necessary to come back exactly to the resting potential).


Channel activations during action potential


A McCulloch-Pitts neuron operates on a discrete time-scale, t = 0, 1, 2, 3, ..., with the time tick equal to one refractory period.

At each time step, an input or output is on or off: 1 or 0, respectively.

Each connection, or synapse, from the output of one neuron to the input of another has an attached weight.

Warren McCulloch and Walter Pitts (1943)

[Figure: McCulloch-Pitts neuron with inputs x1(t), ..., xn(t), weights w1, ..., wn, threshold θ, and output y(t+1) carried by the axon]


Excitatory and Inhibitory Synapses

We call a synapse excitatory if wi > 0, and inhibitory if wi < 0.

We also associate a threshold θ with each neuron

A neuron fires (i.e., has value 1 on its output line) at time t+1 if the weighted sum of inputs at t reaches or passes θ:

y(t+1) = 1 if and only if Σi wi xi(t) ≥ θ.
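A minimal sketch of this firing rule in Python (the function name and the example weights are illustrative, not from the lecture):

# Sketch of the McCulloch-Pitts firing rule: y(t+1) = 1 iff sum_i w_i * x_i(t) >= theta.
def mcculloch_pitts_step(weights, inputs, theta):
    """Return the neuron's output at t+1 given the inputs x_i(t)."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum >= theta else 0

# Example: two excitatory inputs (w = 1) and one inhibitory input (w = -2), threshold 2.
print(mcculloch_pitts_step([1, 1, -2], [1, 1, 0], 2))   # fires: 1 + 1 >= 2
print(mcculloch_pitts_step([1, 1, -2], [1, 1, 1], 2))   # inhibited: 1 + 1 - 2 < 2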


From Logical Neurons to Finite Automata

Logic gates as McCulloch-Pitts neurons:
AND: weights 1, 1; threshold 1.5
NOT: weight -1; threshold 0
OR: weights 1, 1; threshold 0.5

(Brains, Machines, and Mathematics, 2nd Edition, 1987)

[Figure: a Boolean net with inputs X and outputs Y corresponds to a finite automaton with input X, output Y, and internal state Q]
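These three gates can be checked directly against the firing rule of the previous slide; a small illustrative sketch (not from the lecture):

# Sketch: the AND, OR, and NOT gates of the slide as McCulloch-Pitts neurons.
def mp_fire(weights, inputs, theta):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

AND = lambda a, b: mp_fire([1, 1], [a, b], 1.5)
OR  = lambda a, b: mp_fire([1, 1], [a, b], 0.5)
NOT = lambda a:    mp_fire([-1], [a], 0.0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
print("NOT 0 =", NOT(0), "  NOT 1 =", NOT(1))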


Increasing the Realism of Neuron Models

The McCulloch-Pitts neuron of 1943 is important as a basis for:
the logical analysis of the neurally computable, and
the current design of some neural devices (especially when augmented by learning rules to adjust synaptic weights).

However, it is no longer considered a useful model for making contact with neurophysiological data concerning real neurons.


Leaky Integrator Neuron

The simplest "realistic" neuron model is a continuous-time model based on using the firing rate (e.g., the number of spikes traversing the axon in the most recent 20 msec) as a continuously varying measure of the cell's activity.

The state of the neuron is described by a single variable, the membrane potential.

The firing rate is approximated by a sigmoid function of the membrane potential.
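One common choice is a logistic (sigmoid) curve that saturates at a maximal rate; the particular parameter values below are illustrative assumptions, not the lecture's:

import math

# Sketch: firing rate as a sigmoid function of membrane potential m.
# M_max, m_half, and slope are illustrative parameters, not from the lecture.
def firing_rate(m, M_max=100.0, m_half=10.0, slope=2.0):
    """Saturating rate (spikes/s) that grows smoothly with membrane potential m (mV)."""
    return M_max / (1.0 + math.exp(-(m - m_half) / slope))

for m in (-10.0, 0.0, 10.0, 20.0):
    print(f"m = {m:6.1f} mV  ->  rate = {firing_rate(m):6.1f} spikes/s")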


Leaky Integrator Model

The basic equation

τ dm(t)/dt = - m(t) + h

has solution m(t) = e^(-t/τ) m(0) + (1 - e^(-t/τ)) h → h for time constant τ > 0.

We now add synaptic inputs to get the Leaky Integrator Model:

τ dm(t)/dt = - m(t) + Σi wi Xi(t) + h

where Xi(t) is the firing rate at the ith input.

Excitatory input (wi > 0) will increase m(t); inhibitory input (wi < 0) will have the opposite effect.
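A minimal simulation sketch of this model using forward-Euler integration (function names and parameter values are illustrative assumptions, not from the lecture):

# Sketch of the leaky integrator model: tau * dm/dt = -m(t) + sum_i w_i X_i(t) + h,
# integrated with forward Euler.
def simulate_leaky_integrator(weights, inputs, tau=10.0, h=0.0, dt=0.1, m0=0.0):
    """inputs[k][i] is the firing rate X_i at time step k; returns m at each step."""
    m, trace = m0, []
    for x in inputs:
        drive = sum(w * xi for w, xi in zip(weights, x)) + h
        m += dt / tau * (-m + drive)          # Euler step of tau*dm/dt = -m + drive
        trace.append(m)
    return trace

# One excitatory (w = 1) and one inhibitory (w = -0.5) input, both firing at 20, for 50 steps.
trace = simulate_leaky_integrator([1.0, -0.5], [[20.0, 20.0]] * 50)
print(round(trace[0], 3), round(trace[-1], 3))   # m rises from 0 toward the steady state 20 - 10 = 10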


Rall's Motion Detector Model


Alternative Models

Even at this simple level, there are alternative models.

There are inhibitory synapses that seem better described by shunting inhibition, which, applied at a given point on a dendrite, serves to divide, rather than subtract from, the potential change passively propagating from more distal synapses.

The "lumped frequency" model cannot model the subtle relative timing effects crucial to our motion detector example � these might be approximated by introducing appropriate delay terms

τ m(t) = - m(t) + Σ i wi xi(t - τi) + h.
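In a discrete-time simulation, the delay terms can be approximated by reading each input from a short history buffer; an illustrative sketch (names and values assumed, not from the lecture):

# Sketch: leaky integrator with per-input delays tau_i,
# tau * dm/dt = -m(t) + sum_i w_i x_i(t - tau_i) + h.
def step_with_delays(m, weights, histories, delays, tau=10.0, h=0.0, dt=0.1):
    """histories[i] is the past firing-rate sequence of input i (most recent last);
    delays[i] is that input's delay expressed in time steps."""
    drive = h
    for w, hist, d in zip(weights, histories, delays):
        x_delayed = hist[-1 - d] if len(hist) > d else 0.0   # x_i(t - tau_i)
        drive += w * x_delayed
    return m + dt / tau * (-m + drive)

# Example: the second (inhibitory) input is read with a 5-step delay.
hist1 = [20.0] * 10
hist2 = [20.0] * 10
print(step_with_delays(0.0, [1.0, -0.5], [hist1, hist2], [0, 5]))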


Frog Tectum: Details and Modeling



Many Levels of Detail in the Cerebellum


No modeling approach is automatically appropriate.

Rather, we seek the simplest model adequate to address the complexity of a given range of problems.