
NOISE and DELAYS in NEUROPHYSICS

Andre Longtin

Center for Neural Dynamics and Computation
Department of Physics
Department of Cellular and Molecular Medicine

UNIVERSITY OF OTTAWA, Canada

OUTLINE

Modeling single-neuron noise
  leaky integrate-and-fire
  quadratic integrate-and-fire
  "transfer function" approach
Modeling response to signals
Information theory
Delayed dynamics

MOTIVATION for STUDYING NOISE

“Noise” in the neuroscience literature

•As an input with many frequency components over a particular band, of similar amplitudes, and scattered phases

•As the resulting current from the integration of many independent, excitatory and inhibitory synaptic events at the soma

•As the maintained discharge of some neurons

•As « cross-talk » responses from indirectly stimulated neurons

•As « internal », resulting from the probabilistic gating of voltage-dependent ion channels

•As « synaptic », resulting from the stochastic nature of vesicle release at the synaptic cleft

Segundo et al., Origins and Self Organization, 1994

Leaky Integrate-and-Fire with + and - Feedback

f = firing rate function

Firing Rate Functions

Noise free:

Or stochastic:

Noise-Induced Gain Control and Stochastic Resonance

For Poisson input (Campbell's theorem):

mean conductance ~ mean input rate
standard deviation σ ~ sqrt(mean input rate)
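A quick numerical check of Campbell's theorem, as a sketch only: the input rate, kernel time constant, and quantal amplitude below are illustrative assumptions, not values from the talk. The mean of the shot-noise conductance scales with the Poisson rate, its standard deviation with the square root of the rate.

```python
import numpy as np

def shot_noise_conductance(rate_hz, t_max=20.0, dt=1e-4, tau=0.005, a=1.0, seed=0):
    """Poisson input filtered by an exponential synaptic kernel (shot noise)."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    spikes = rng.random(n) < rate_hz * dt          # Poisson arrivals, one Bernoulli draw per bin
    g = np.zeros(n)
    for i in range(1, n):                          # conductance decays with tau, jumps by a per spike
        g[i] = g[i - 1] * (1 - dt / tau) + a * spikes[i]
    return g[n // 10:]                             # discard the initial transient

for rate in (100.0, 400.0):                        # quadruple the input rate...
    g = shot_noise_conductance(rate)
    print(f"rate = {rate:5.0f} Hz   mean g = {g.mean():.3f}   std g = {g.std():.3f}")
# ...the mean conductance scales by 4 (~ rate), the standard deviation by 2 (~ sqrt(rate)).
```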

NOISE smoothes out f-I curves
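A minimal sketch of this point, using a leaky integrate-and-fire neuron with additive Gaussian white noise; all parameter values are illustrative assumptions rather than the models used in the talk.

```python
import numpy as np

def lif_rate(I, D=0.0, tau=0.02, v_th=1.0, v_reset=0.0, dt=1e-4, t_max=10.0, seed=1):
    """Firing rate of a leaky integrate-and-fire neuron with bias I and white noise of intensity D."""
    rng = np.random.default_rng(seed)
    v, n_spikes = 0.0, 0
    for _ in range(int(t_max / dt)):
        v += dt * (I - v) / tau + np.sqrt(2.0 * D * dt) / tau * rng.standard_normal()
        if v >= v_th:
            v, n_spikes = v_reset, n_spikes + 1
    return n_spikes / t_max

for I in (0.6, 0.9, 1.0, 1.1, 1.4):
    print(f"I = {I:.1f}   noise-free: {lif_rate(I, D=0.0):6.1f} Hz   noisy: {lif_rate(I, D=0.002):6.1f} Hz")
# Without noise the rate is zero below threshold (I < 1) and rises sharply above it;
# with noise the f-I curve is smoothed and subthreshold inputs produce a nonzero rate.
```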

WHAT QUADRATIC INTEGRATE-AND-FIRE MODEL?

Technically more difficult
Which variable to use?
On the real line?
On a circle?
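One standard way to relate the two choices is the change of variables V = tan(θ/2), which maps the quadratic integrate-and-fire model on the real line onto the theta neuron on the circle. A rough sketch, with illustrative parameter values:

```python
import numpy as np

def qif_spikes(I, t_max=2.0, dt=1e-5, v_peak=50.0, v_reset=-50.0):
    """QIF on the real line: dv/dt = v^2 + I, reset from +v_peak to v_reset (proxies for +/- infinity)."""
    v, spikes = v_reset, []
    for i in range(int(t_max / dt)):
        v += dt * (v * v + I)
        if v >= v_peak:
            spikes.append(i * dt)
            v = v_reset
    return spikes

def theta_spikes(I, t_max=2.0, dt=1e-5):
    """Theta neuron on the circle (V = tan(theta/2)): dtheta/dt = (1 - cos theta) + (1 + cos theta) I;
    a spike is registered each time theta crosses pi, with no reset needed."""
    theta, spikes = -np.pi, []
    for i in range(int(t_max / dt)):
        theta += dt * ((1.0 - np.cos(theta)) + (1.0 + np.cos(theta)) * I)
        if theta >= np.pi:
            spikes.append(i * dt)
            theta -= 2.0 * np.pi
    return spikes

print(len(qif_spikes(100.0)), len(theta_spikes(100.0)))   # comparable spike counts for the same I
```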

Information-theoretic approaches

Linear encoding versus nonlinear processing
Rate code: long time constant, integrator
Time code: small time constant, coincidence detector (reliability)
Interspike interval code (ISI reconstruction)
Linear correlation coefficient
Coherence
Coding fraction
Mutual information
Challenge: biophysics of coding
Forget the biophysics? Use better (mesoscopic?) variables?

Neuroscience 101 (Continued):

Spike train:  $s(t) = \sum_i \delta(t - t_i)$

Number of spikes in time interval T:  $N(T) = \int_0^T s(t)\,dt$

Raster plot: spike times across repeated trials (trial number vs. time in msec)

Random variables: interspike intervals (ISI),  $I_i = t_{i+1} - t_i$
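A small sketch of these definitions in code; the homogeneous Poisson spike times below are placeholder data, not recordings or one of the talk's models.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10.0                                   # observation window (s)
rate = 40.0                                # placeholder: homogeneous Poisson spikes at 40 Hz
spike_times = np.cumsum(rng.exponential(1.0 / rate, size=int(3 * rate * T)))
spike_times = spike_times[spike_times < T]

N_T = spike_times.size                     # N(T) = integral of s(t) over [0, T]
isi = np.diff(spike_times)                 # interspike intervals I_i
cv = isi.std() / isi.mean()                # CV = interval std / interval mean (~1 for Poisson)
print(N_T, cv)
```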

Information Theoretic Calculations:

Gaussian noise stimulus S → Neuron (???) → spike train X   (Fourier transforms $\tilde S$, $\tilde X$)

Coherence Function:

$C(f) = \dfrac{|\langle \tilde S^*(f)\,\tilde X(f)\rangle|^2}{\langle|\tilde S(f)|^2\rangle\,\langle|\tilde X(f)|^2\rangle}$

Mutual Information Rate (lower bound):

$MI = -\dfrac{1}{2}\int_{-f_c}^{f_c} df\,\log_2[1 - C(f)]$
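A sketch of how these quantities can be estimated numerically. The "response" here is just the stimulus plus independent noise, a placeholder for a real neuron model, and the spectra are estimated with scipy's coherence routine; all parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

fs, t_max, fc = 1000.0, 100.0, 100.0                  # sampling rate (Hz), duration (s), cutoff (Hz)
rng = np.random.default_rng(0)
n = int(fs * t_max)

# Band-limited Gaussian stimulus S, flat up to fc, built in the Fourier domain
freqs = np.fft.rfftfreq(n, 1.0 / fs)
spec = rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size)
spec[freqs > fc] = 0.0
stimulus = np.fft.irfft(spec, n)
stimulus /= stimulus.std()

# Stand-in "response" X: scaled stimulus plus independent noise (not a neuron model)
response = 0.5 * stimulus + rng.standard_normal(n)

f, C = coherence(stimulus, response, fs=fs, nperseg=4096)
df = f[1] - f[0]
band = f <= fc
mi_rate = -np.sum(np.log2(1.0 - C[band])) * df        # lower bound on the mutual information rate, bits/s
print(f"MI rate lower bound ~ {mi_rate:.1f} bits/s")
```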

Stimulus Protocol:

Gaussian noise stimulus with cutoff frequency $f_c$.

Study effect of σ (stimulus contrast) and $f_c$ (stimulus bandwidth) on coding.

Information Theory:

$I(R,S) = H(R) - H(R|S)$

[Figure: spike responses vs. time (EOD cycles)]

Linear Response Calculation for the Fourier transform of the spike train:

$\tilde X(f) = \tilde X_0(f) + \chi(f)\,\tilde S(f)$

($\tilde X_0$: unperturbed spike train;  $\chi$: susceptibility)

Spike train spectrum = background spectrum + (transfer function × signal spectrum)

[Figure: spike train power spectra, Power (spk²/sec) vs. f (Hz), for the LIFDT and Nelson models]

$P(f \to 0) = \dfrac{CV^2}{\langle I_i \rangle}$

CV = interval standard deviation / interval mean

Wiener-Khintchine

Power spectrum:   $S(f) = \int_{-\infty}^{\infty} C(\tau)\,\exp(-2\pi i f \tau)\,d\tau$

Autocorrelation:  $C(\tau) = \int_{-\infty}^{\infty} S(f)\,\exp(2\pi i f \tau)\,df$

Integral of S over all frequencies = C(0) = signal variance
Integral of C over all time lags = S(0) = signal intensity
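A quick numerical consistency check of the first of these statements, as a sketch; the discretized Ornstein-Uhlenbeck signal below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt, tau = 2**16, 1e-3, 0.02
x = np.zeros(n)
for i in range(1, n):                                 # discretized Ornstein-Uhlenbeck signal
    x[i] = x[i - 1] * (1.0 - dt / tau) + np.sqrt(dt) * rng.standard_normal()

X = np.fft.rfft(x)
S = (np.abs(X) ** 2) * dt / n                         # one-sided periodogram estimate of S(f)
f = np.fft.rfftfreq(n, dt)
df = f[1] - f[0]

# Wiener-Khintchine consistency check: the integral of S over all frequencies equals C(0) = var(x)
# (the factor 2 accounts for summing only positive frequencies)
print(2.0 * np.sum(S) * df, np.var(x))
```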

Signal Detection Theory:

Spike count distributions $P_0(n,T)$ (no stimulus) and $P_1(n,T)$ (with stimulus) over n (spikes).

Discriminability:   $d' = \dfrac{\mu_1 - \mu_0}{\sqrt{(\sigma_0^2 + \sigma_1^2)/2}}$

ROC curve: probability of detection vs. probability of false alarm as the count threshold k is varied,

$P_D = \sum_{n \ge k} P_1(n,T)$,    $P_{FA} = \sum_{n \ge k} P_0(n,T)$

[Figure: ROC curves from chance (diagonal) to perfect detection, with a typical curve in between]
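A sketch of the ROC construction from two spike count distributions; the Poisson counts below are placeholder data, not one of the models in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n0 = rng.poisson(8.0, size=5000)                      # counts without stimulus  -> P0(n, T)
n1 = rng.poisson(12.0, size=5000)                     # counts with stimulus     -> P1(n, T)

# Sweep the count threshold k: PD = P(n1 >= k), PFA = P(n0 >= k)
ks = np.arange(0, max(n0.max(), n1.max()) + 2)
PD = np.array([(n1 >= k).mean() for k in ks])
PFA = np.array([(n0 >= k).mean() for k in ks])

d_prime = (n1.mean() - n0.mean()) / np.sqrt(0.5 * (n0.var() + n1.var()))
area = np.sum((PFA[:-1] - PFA[1:]) * (PD[:-1] + PD[1:]) / 2.0)   # area under the ROC curve
print(f"d' = {d_prime:.2f}   ROC area = {area:.3f}")
```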

Information Theory

[Figure: actual signal vs. reconstructed signal]

The stimulus can be well characterized (electric field). This allows for detailed signal processing analysis.

Gabbiani et al., Nature (1996) 384:564-567.
Bastian et al., J. Neurosci. (2002) 22:4577-4590.
Krahe et al., J. Neurosci. (2002) 22:2374-2382.

Linear Stimulus Reconstruction

Estimate the filter which, when convolved with the spike train, yields an estimated stimulus "closest" to the real stimulus.

($s(t)$: stimulus)

Spike train (zero mean):   $x(t) = \sum_i \delta(t - t_i) - \bar{x}$

Estimated stimulus:   $s_{est}(t) = \int dt'\, h(t - t')\, x(t')$

Mean square error:   $\epsilon^2 = \dfrac{1}{T}\int_0^T dt\,[s(t) - s_{est}(t)]^2$

Optimal Wiener filter:   $H(f) = \dfrac{S_{sx}(f)}{S_{xx}(f)}$
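A rough sketch of the reconstruction recipe under simplifying assumptions: the cross- and auto-spectra are estimated from smoothed full-record periodograms, and a rate-modulated Bernoulli spike train stands in for the real neuron; none of the parameter values come from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 1000.0, 2**15

# Band-limited Gaussian "stimulus" s(t) (cutoff ~20 Hz), placeholder data
Sf0 = np.fft.rfft(rng.standard_normal(n))
f = np.fft.rfftfreq(n, 1.0 / fs)
Sf0[f > 20.0] = 0.0
s = np.fft.irfft(Sf0, n)
s /= s.std()

# Stand-in spike train x(t): rate-modulated Bernoulli spikes, mean removed (not the talk's models)
rate = 40.0 * (1.0 + 0.5 * s)
x = (rng.random(n) < np.clip(rate, 0.0, None) / fs).astype(float)
x -= x.mean()

# Wiener filter H(f) = S_sx(f) / S_xx(f), with cross- and auto-spectra smoothed over nearby bins
Sf, Xf = np.fft.rfft(s), np.fft.rfft(x)
def smooth(a, w=129):
    return np.convolve(a, np.ones(w) / w, mode="same")
H = smooth(np.conj(Xf) * Sf) / smooth(np.abs(Xf) ** 2)

s_est = np.fft.irfft(H * Xf, n)                        # estimated stimulus
rel_err = np.mean((s - s_est) ** 2) / np.var(s)        # normalized mean square error
print(f"relative MSE = {rel_err:.2f}   coding fraction = {1.0 - np.sqrt(rel_err):.2f}")
```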

NOISE smoothes out f-I curves

"stochastic resonance above threshold"

Coding fraction versus noise intensity:

Modeling Electroreceptors: The Nelson Model (1996)

Input → High-Pass Filter → Stochastic Spike Generator

Spike generator assigns 0 or 1 spike per EOD cycle: multimodal histograms

Modeling Electroreceptors: The Extended LIFDT Model

Input → High-Pass Filter → LIFDT → Spike Train

Parameters: without noise, receptor fires periodically

(suprathreshold dynamics – no stochastic resonance)
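A minimal sketch of a leaky integrate-and-fire model with a dynamic threshold (LIFDT): the structure (leaky voltage plus a spike-triggered threshold that relaxes back) follows the generic description above, but the parameter values and the additive noise term are illustrative assumptions, not the extended model of the talk.

```python
import numpy as np

def lifdt(I, D=0.0002, tau_v=0.005, tau_th=0.02, th0=1.0, dth=0.5,
          dt=1e-5, t_max=2.0, seed=0):
    """Leaky integrate-and-fire with dynamic threshold: each spike raises the threshold
    by dth, and the threshold relaxes back to th0 with time constant tau_th."""
    rng = np.random.default_rng(seed)
    v, th, spikes = 0.0, th0, []
    for i in range(int(t_max / dt)):
        v += dt * (I - v) / tau_v + np.sqrt(2.0 * D * dt) / tau_v * rng.standard_normal()
        th += dt * (th0 - th) / tau_th
        if v >= th:
            spikes.append(i * dt)
            v, th = 0.0, th + dth
    return np.array(spikes)

isi = np.diff(lifdt(I=1.5))
print(f"mean ISI = {1e3 * isi.mean():.2f} ms   CV = {isi.std() / isi.mean():.2f}")
# The spike-triggered threshold introduces negative ISI correlations, which regularize the count.
```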

Signal Detection: Count Spikes During Interval T

[Figure: spike count distributions P(n) vs. n (spikes): P0(n,T) (no stimulus) and P1(n,T) (with stimulus)]

$SNR = \dfrac{(\mu_1 - \mu_0)^2}{\sigma_0^2 + \sigma_1^2}$

[Figure: P(n) vs. n for counting time T = 255 msec; baseline and stimulus conditions, LIFDT and Nelson models]

[Figure: Fano factor F(T) vs. counting time T (msec) for the LIFDT, shuffled LIFDT, and Nelson models; long-time limit near CV²]

Fano Factor:   $F(T) = \dfrac{\sigma_N^2(T)}{\langle N(T) \rangle}$

Asymptotic limit (Cox and Lewis, 1966):   $F(T \to \infty) = CV^2 \left(1 + 2\sum_{j \ge 1} \rho_j\right)$

($\rho_j$: correlation coefficient between intervals $I_i$ and $I_{i+j}$)

[Figure: ISI serial correlation coefficients $\rho_j$ vs. lag j]
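A sketch of the Fano factor calculation and the asymptotic check; the gamma-distributed ISIs are placeholder data for a renewal process, for which the $\rho_j$ vanish and the asymptote is just CV².

```python
import numpy as np

rng = np.random.default_rng(0)
isi = rng.gamma(shape=4.0, scale=0.005, size=200_000)   # placeholder renewal ISIs (gamma, CV^2 = 0.25)
spikes = np.cumsum(isi)
cv2 = isi.var() / isi.mean() ** 2

def fano(spikes, T):
    """F(T) = var(N(T)) / mean(N(T)) from non-overlapping counting windows of length T."""
    counts, _ = np.histogram(spikes, bins=np.arange(0.0, spikes[-1], T))
    return counts.var() / counts.mean()

for T in (0.01, 0.1, 1.0, 10.0):
    print(f"T = {T:5.2f} s   F(T) = {fano(spikes, T):.3f}")
print(f"asymptote CV^2 (1 + 2 sum rho_j) = {cv2:.3f}   (rho_j = 0 for a renewal process)")
```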

Regularisation:

Sensory Neurons

Sensory Input → ELL Pyramidal Cell → Higher Brain

Feedback: Open vs. Closed Loop Architecture

[Diagram: open-loop vs. closed-loop connection to the Higher Brain; loop time (delay) d]

Delayed Feedback Neural Networks

Afferent Input

Higher Brain Areas

The ELL; first stage of sensory processing

[Figures: Jelte Bos' data; Andre's data]

Longtin et al., Phys. Rev. A 41, 6992 (1990)

If one defines:

corresponding to the stochastic differential equation:

one gets a Fokker-Planck equation:

One can apply Ito or Stratonovich calculus, as for SDEs.

However, applicability is limited if there are complex eigenvalues or the system is strongly nonlinear.
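Since a stochastic delay-differential equation has no exact Fokker-Planck equivalent, one typically falls back on direct simulation. A sketch of Euler-Maruyama for a delayed bistable system; the quartic drift, feedback strength, and noise intensity below are illustrative assumptions, not the specific system of the talk.

```python
import numpy as np

def delayed_bistable(eps=0.2, tau_d=1.0, D=0.2, dt=1e-3, t_max=500.0, seed=0):
    """Euler-Maruyama for dx/dt = x - x^3 + eps * x(t - tau_d) + sqrt(2 D) xi(t)."""
    rng = np.random.default_rng(seed)
    n, lag = int(t_max / dt), int(tau_d / dt)
    x = np.zeros(n)
    x[:lag + 1] = 1.0                                    # constant history on [-tau_d, 0]
    for i in range(lag, n - 1):
        drift = x[i] - x[i] ** 3 + eps * x[i - lag]      # delayed feedback enters the drift only
        x[i + 1] = x[i] + drift * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
    return x

x = delayed_bistable()
n_switch = int(np.sum(np.diff(np.sign(x)) != 0))         # rough count of well-to-well transitions
print(f"fraction of time in the x > 0 well: {(x > 0).mean():.2f}   sign changes: {n_switch}")
```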

TWO-STATE DESCRIPTION: S = ±1

2 transition probabilities:

For example, using Kramers approach:
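A sketch of the two-state reduction: S = ±1 with switching rates of Kramers form that depend on the state a delay τ_d in the past. The exponential rate expression and all parameter values are illustrative assumptions, not the talk's exact model.

```python
import numpy as np

def two_state_delayed(r0=1.0, beta=2.0, tau_d=1.0, dt=1e-3, t_max=500.0, seed=0):
    """Dichotomous process S(t) = +/-1 with Kramers-like escape rates modulated by the
    delayed state: leaving the current state is slow when S(t - tau_d) agrees with S(t),
    fast when it disagrees."""
    rng = np.random.default_rng(seed)
    n, lag = int(t_max / dt), int(tau_d / dt)
    S = np.ones(n)
    for i in range(lag, n - 1):
        rate = r0 * np.exp(-beta * S[i] * S[i - lag])    # transition rate out of the current state
        S[i + 1] = -S[i] if rng.random() < rate * dt else S[i]
    return S

S = two_state_delayed()
n_switch = int(np.sum(np.diff(S) != 0))
print(f"switches: {n_switch}   mean residence time ~ {S.size * 1e-3 / max(1, n_switch):.2f} s")
```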

DETERMINISTIC DELAYED BISTABILITY

Stochastic approach does not yet get the whole picture!

Conclusions

NOISE: many sources, many approaches; exercise caution (Ito vs. Stratonovich)

INFORMATION THEORY: usually makes assumptions, and even when it doesn't, ask whether the next cell cares.

DELAYS: SDDEs have no Fokker-Planck equivalent

tomorrow: linear response-like theory

OUTLOOK

Second-order field theory for stochastic neural dynamics with delays

Figuring out how intrinsic neuron dynamics (bursting, coincidence detection, etc.) interact with correlated input

Figuring out the interaction of noise and bursting

Forget about steady state!

Whatever you do, think of the neural decoder…