What does the synapse tell the axon?
Idan Segev
Interdisciplinary Center for Neural Computation
Hebrew University
Thanks: Miki London, Galit Fuhrman, Adi Shraibman, Elad Schneidman
Outline
• Introduction
  – Questions in my lab
  – A brief history of the synapse and of “synaptic efficacy” - what does it mean?
• Complications with “synaptic efficacy”
• Information theory and synaptic efficacy
  – Basic definitions (entropy, mutual information)
  – The “Plug & Play” model
• Preliminary results - “synaptic efficacy”
  – In simple neuron models
  – In passive dendritic structures
  – In excitable dendrites
• Conclusions
• Future questions
Research focus in my group
1. Neuronal “noise” and input-output properties of neurons
   – Ion channels, synaptic noise and AP reliability
   – Optimization of information transmission with noise
2. Nonlinear cable theory
   – Threshold conditions for excitation in excitable dendrites
   – Active propagation in excitable trees
3. “Learning rules” for ion channels and synapses
   – How to build an “H&H” axon?
   – How to “read” synaptic plasticity?
4. The synapse: “what does it say?”
   – Could dynamic synapses encode the timing of the pre-synaptic spikes?
“Synaptic efficacy” - what does it mean?
THE “Synapse”
Motivation: Single synapse matters
“Synaptic efficacy”
• Artificial Neural Networks - synaptic efficacy reduced to a single number, Wij (Jij)
• Biophysics - Utilizing the (average) properties of the PSP (peak; rise time; area/charge; …)
• Cross-Correlation - Relating the pre-synaptic input to the post-synaptic output (the firing probability). How do synaptic properties affect the cross-correlation?
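The cross-correlation measure can be sketched as a simple spike-train cross-correlogram: for every presynaptic spike, count postsynaptic spikes at each lag. This is a minimal illustration; the 0/1 array representation and the lag range are my assumptions, not details from the talk.

```python
import numpy as np

def cross_correlogram(pre, post, max_lag=20):
    """Histogram of postsynaptic spikes at lags 0..max_lag after
    each presynaptic spike. pre, post: 0/1 arrays of equal length."""
    pre = np.asarray(pre)
    post = np.asarray(post)
    cc = np.zeros(max_lag + 1, dtype=int)
    for t in np.flatnonzero(pre):          # times of presynaptic spikes
        seg = post[t:t + max_lag + 1]      # postsynaptic window after each
        cc[:len(seg)] += seg
    return cc
```

A peak at a short lag would indicate that the synapse raises the post-synaptic firing probability shortly after each input spike.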
Complications: Who is more “effective” and by how much?
• EPSP peak is equal but the rise time is different?
• EPSP area is equal but the peak is different?
Complications: Background synaptic activity
L.J. Borg-Graham, C. Monier & Y. Frégnac
Spontaneous in vivo voltage fluctuations in a neuron from the cat visual cortex
The “Plug & Play” Model
[Schematic: a model “neuron” receives a known synaptic input together with noise and background activity, and produces an output spike train.]
Mutual Information
[Schematic: the output spike train, written as a binary string, is compressed once on its own (its entropy) and once given the known synaptic input.]
• The Mutual Information is the extra bits saved by knowing the input.
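For intuition, the “extra bits saved” quantity can be computed exactly when the joint input/output distribution is known. This is a toy sketch; the probability-table representation is my assumption, not part of the talk.

```python
import math

def mutual_information(joint):
    """Exact MI in bits from a joint probability table joint[x][y]
    (rows: input symbols, columns: output symbols)."""
    px = [sum(row) for row in joint]           # marginal over inputs
    py = [sum(col) for col in zip(*joint)]     # marginal over outputs
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi
```

Independent variables give 0 bits; perfectly correlated binary variables give 1 bit, the full entropy of the output.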
Compression, Entropy and Mutual Information
[Schematic: the output spike train is compressed on its own, and compressed again given the known synaptic input; the difference between the two compressed lengths is the Mutual Information - the information the input carries about the output.]
• Compression → information estimation
• We use the CTW compression algorithm (the best compressor known today)
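The compression-based estimate can be sketched as follows. CTW is not available in the Python standard library, so zlib stands in here purely for illustration; the function names and the “residual string” framing are mine.

```python
import zlib

def compressed_bits(bits: str) -> int:
    """Bits needed to store the string after zlib compression.
    (The talk uses CTW; zlib is a crude stand-in for illustration.)"""
    return 8 * len(zlib.compress(bits.encode(), 9))

def mi_estimate_bits(output: str, output_given_input: str) -> int:
    """MI ~ extra bits saved by knowing the input: compressed length
    of the output alone, minus the compressed length of the output
    once the input is known (its residual)."""
    return compressed_bits(output) - compressed_bits(output_given_input)
```

A spike train that becomes highly predictable once the input is known compresses to far fewer bits, so the estimated MI is large.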
I&F (effect of potentiation)
[Figure: I&F voltage traces relative to threshold - an isolated synapse on top of background activity vs. a background synapse, with the synapse potentiated x5.]
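The I&F (integrate-and-fire) neuron used as the simplest post-synaptic model can be sketched minimally as below. All parameter values are illustrative assumptions, not those used in the talk.

```python
import numpy as np

def lif_spikes(i_syn, dt=0.1, tau=20.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: integrate input current with
    a membrane leak; emit a spike and reset when threshold is crossed.
    i_syn: input current per time step; returns a 0/1 spike train."""
    v = 0.0
    spikes = np.zeros(len(i_syn), dtype=int)
    for t, i in enumerate(i_syn):
        v += dt * (-v / tau + i)   # leaky integration
        if v >= v_th:
            spikes[t] = 1
            v = v_reset
    return spikes
```

Driving such a model with a known synaptic input plus background activity yields the output spike trains whose compressibility is analyzed above.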
(I&F) - EPSP parameters and the MI
[Figure: MI for EPSPs with fixed peak vs. fixed charge.]
Why does the MI correspond best to the EPSP peak?
[Figure: a sharp EPSP evokes fewer spikes, but more accurately timed; a smeared EPSP evokes more spikes, but less accurately timed.]
Passive Cable with synapses
MI (efficacy) of distal synapses scales with EPSP peak
[Figure: MI for proximal vs. distal synaptic locations along a passive cable.]
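The distance dependence can be illustrated with the textbook steady-state attenuation law of a semi-infinite passive cable, V(x) = V(0)·exp(−x/λ). This is a sketch under that standard assumption, not the cable simulation used in the talk; transient EPSPs attenuate even more steeply than this steady-state bound.

```python
import math

def epsp_peak_at_soma(peak_at_synapse, distance, lam=1.0):
    """Steady-state voltage attenuation in a semi-infinite passive
    cable: V(x) = V(0) * exp(-x / lambda), with distance and the
    space constant lambda in the same units."""
    return peak_at_synapse * math.exp(-distance / lam)
```

A distal synapse thus delivers a smaller somatic EPSP peak than a proximal one of equal local amplitude, consistent with its lower MI.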
MI with Active dendritic currents (Linear synapses)
[Figure: MI for proximal, intermediate and distal synaptic locations.]
Conclusions
• Peak EPSP is the dominant parameter for the Mutual Information of a synaptic input
• Validity & generality of the method
  – Advantage of modeling for such issues
  – Possibility to ask many questions (with control)
  – Applicability to experimental data
Future Questions
• Natural generalizations
  – Dendritic trees
  – MI of inhibitory synapses
  – Depressing and facilitating synapses
  – Other noise sources
• Efficacy of inhibitory synapses
• “Selfish” or cooperative strategies for maximizing information transfer (each synapse may “want” to increase its EPSP peak, but so do all the others)
• Establishing and improving the method (confidence limits, better estimates, …)
Stochastic Model For Dynamic Synapses:
2 types of “randomness”:
1. Is there a vesicle in the release site?
2. Would a vesicle be released in response to a presynaptic AP?
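These two sources of randomness can be sketched for a single release site as below. The refill rule and the probability values are illustrative assumptions, not the model's fitted parameters.

```python
import random

def release(ap_times, p_refill=0.3, p_release=0.5, seed=0):
    """One release site with two coin flips per presynaptic AP:
    (1) is a vesicle docked at the site?  (2) given a vesicle,
    is it released by this AP?  Returns one bool per AP."""
    rng = random.Random(seed)
    occupied = True            # start with a docked vesicle
    out = []
    for _ in ap_times:
        if not occupied:
            occupied = rng.random() < p_refill       # randomness 1
        released = occupied and rng.random() < p_release  # randomness 2
        if released:
            occupied = False   # site is empty until it refills
        out.append(released)
    return out
```

With slow refill, release failures cluster after successes, which is how such a stochastic dynamic synapse could carry information about presynaptic spike timing.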
How to quantify the relation between the input properties and its efficacy?
Which of these two inputs is more efficient? By how much?
• Given a sequence (x_0, x_1, x_2, …, x_n) generated by a source (X_0, X_1, X_2, …, X_n), the Shannon-McMillan-Breiman theorem states that:

  −(1/n) · log p(x_0, x_1, x_2, …, x_n) → H   as n → ∞
Entropy estimation
000000100010101010101010101010101001001011……
• Two problems:
– The sequence is finite
– We don’t know the true probability p of the sequence (we can only estimate it).
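Both problems show up already in the naive “plug-in” estimator, which replaces the true word probabilities with empirical frequencies and therefore systematically underestimates the entropy of a finite sequence. A sketch (the word length and non-overlapping binning are my choices):

```python
from collections import Counter
import math

def plugin_entropy(bits: str, word_len: int = 2) -> float:
    """Plug-in entropy estimate in bits per symbol: cut the string
    into non-overlapping words, estimate word probabilities by
    their frequencies, and apply the entropy formula."""
    words = [bits[i:i + word_len]
             for i in range(0, len(bits) - word_len + 1, word_len)]
    n = len(words)
    h = -sum((c / n) * math.log2(c / n) for c in Counter(words).values())
    return h / word_len
```

Compression-based estimators such as CTW sidestep the explicit probability estimate, which is why they are used here instead.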
Effect of bin size
SharpWide
WideSharp
Control
x3
x5