Computational Cognitive Neuroscience Lab
Today: Model Learning

Page 1: Computational Cognitive Neuroscience Lab

Computational Cognitive Neuroscience Lab

Today: Model Learning

Page 2: Computational Cognitive Neuroscience Lab

Computational Cognitive Neuroscience Lab

» Today:

» Homework is due Friday, Feb 17

» Chapter 4 homework is shorter than the last one!

» Undergrads omit 4.4, 4.5, 4.7c, 4.7d

Page 3: Computational Cognitive Neuroscience Lab

Hebbian Learning

» “Neurons that fire together, wire together”

» Correlation between sending and receiving activity strengthens the connection between them

» “Don’t fire together, unwire”

» Anti-correlation between sending and receiving activity weakens the connection (rule sketched below)
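A minimal NumPy sketch of the basic rule these slogans describe (the learning rate and toy activities are illustrative, not from the slides): the weight change is simply the product of sending and receiving activity.

    import numpy as np

    def hebbian_update(w, x, y, lr=0.1):
        # Basic Hebbian rule: dw[j, i] = lr * y[j] * x[i], so correlated
        # pre/post activity strengthens the weight; with signed activities,
        # anti-correlated activity weakens it.
        return w + lr * np.outer(y, x)

    # Toy usage: one sender and one receiver that are always co-active.
    w = np.zeros((1, 1))
    for _ in range(10):
        w = hebbian_update(w, np.array([1.0]), np.array([1.0]))
    # w grows without bound -- the problem addressed on a later slide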

Page 4: Computational Cognitive Neuroscience Lab

LTP/D via NMDA receptors

» NMDA receptors allow calcium to enter the (postsynaptic) cell

» NMDA receptors are blocked by Mg++ ions, which are expelled when the membrane potential increases

» Glutamate (excitatory) binds to the unblocked NMDA receptor, causing a structural change that allows Ca++ to pass through

Page 5: Computational Cognitive Neuroscience Lab

Calcium and Synapses

» Calcium initiates multiple chemical pathways, depending on the level of calcium

» Low Ca++ --> long term depression (LTD)

» High Ca++ --> long term potentiation (LTP)

» LTP/D effects: new postsynaptic receptors, increased dendritic spine size, or increased presynaptic release processes (via retrograde messenger) (toy sketch below)
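A toy illustration of the calcium-level dependence described above. The thresholds and magnitudes here are hypothetical, chosen only to show the low-Ca++/LTD, high-Ca++/LTP pattern:

    def synaptic_change(ca, theta_ltd=0.2, theta_ltp=0.5):
        # Hypothetical thresholds: very low Ca++ leaves the synapse
        # unchanged, moderate Ca++ yields depression (LTD), and high
        # Ca++ yields potentiation (LTP).
        if ca < theta_ltd:
            return 0.0
        elif ca < theta_ltp:
            return -0.1  # LTD: weaken the synapse
        else:
            return +0.1  # LTP: strengthen the synapse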

Page 6: Computational Cognitive Neuroscience Lab

Fixing Hebbian learning

» Hebbian learning results in infinite weights!

» Oja’s normalization (savg_corr) (sketched below)

» When to learn?

» Conditional PCA--learn only when you see something interesting

» A single unit hogs everything?

» kWTA and contrast enhancement --> specialization
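A sketch of Oja's rule in its classic single-unit form (the simulator's savg_corr renormalization differs in detail): the subtractive term bounds the weights, fixing the runaway growth of plain Hebbian learning.

    import numpy as np

    def oja_update(w, x, lr=0.1):
        # Oja's rule: dw = lr * y * (x - y * w). The -y*w term keeps the
        # weight vector bounded (it converges toward unit length), and at
        # the fixed point w points along the first principal component.
        y = w @ x
        return w + lr * y * (x - y * w)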

Page 7: Computational Cognitive Neuroscience Lab

Principal Components Analysis (PCA)

» Principal, as in primary; not principle, as in a rule or idea

» PCA seeks a linear combination of variables such that maximum variance is extracted from the variables. It then removes this variance and seeks a second linear combination which explains the maximum proportion of the remaining variance, and so on until you run out of variance.

Page 8: Computational Cognitive Neuroscience Lab

PCA continued

» This is like linear regression, except the whole collection of variables (a vector) is correlated with itself to make a matrix--in effect, the variables are regressed on themselves

» The line of best fit through this regression is the first principal component! (NumPy sketch below)
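A minimal NumPy sketch of the procedure these two slides describe, using illustrative random data: center the variables, correlate the collection with itself to make a matrix, then read off the direction of maximum variance.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 3))  # correlated toy data

    Xc = X - X.mean(axis=0)           # center each variable
    C = Xc.T @ Xc / (len(Xc) - 1)     # correlate the vector with itself
    evals, evecs = np.linalg.eigh(C)  # eigenvalues in ascending order
    pc1 = evecs[:, -1]                # first principal component:
    print(pc1)                        # the direction of maximum variance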

Page 9: Computational Cognitive Neuroscience Lab

PCA cartoon


Page 10: Computational Cognitive Neuroscience Lab

Conditional PCA

» “Perform PCA only when a particular input is received”

» Condition: The forces that determine when a receiving unit is active

» Competition means hidden units will specialize for particular inputs

» So hidden units only learn when their favorite input is available (update rule sketched below)
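A sketch of a CPCA-style update in the form given in the CECN textbook (learning rate illustrative): the receiving unit's activity gates learning, so a unit only learns about inputs that activate it.

    import numpy as np

    def cpca_update(w, x, y, lr=0.1):
        # CPCA-style rule: dw[j, i] = lr * y[j] * (x[i] - w[j, i]).
        # An inactive receiver (y[j] = 0) learns nothing; an active one
        # moves its weights toward the current input, so weights come to
        # reflect the probability the input is on given the unit is on.
        return w + lr * y[:, None] * (x[None, :] - w)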

Page 11: Computational Cognitive Neuroscience Lab

Self-organizing learning

» kWTA determines which hidden units are active for a given input

» CPCA ensures those hidden units learn only about a single aspect of that input

» Contrast enhancement -- drive high weights higher, low weights lower

» Contrast enhancement helps units specialize (and share) (sketches below)
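Minimal sketches of the two mechanisms on this slide. The kWTA here is a crude hard version, and the sigmoid is one common form of weight contrast enhancement; the simulator's wt_gain function may differ in detail.

    import numpy as np

    def kwta(net, k):
        # Hard k-winners-take-all: only the k most excited units stay active.
        y = np.zeros_like(net)
        y[np.argsort(net)[-k:]] = 1.0
        return y

    def contrast_enhance(w, gain=4.0):
        # Sigmoidal contrast enhancement on weights in [0, 1]: for gain > 1,
        # weights above 0.5 are driven higher and weights below 0.5 lower.
        return w**gain / (w**gain + (1.0 - w)**gain)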

Page 12: Computational Cognitive Neuroscience Lab

Bias-variance dilemma

» High bias--actual experience does not change the model much, so the biases had better be good!

» Low bias--experience strongly determines learning, and so does random error! The model could come out differently each time: high model variance (decomposition below)
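For squared error with y = f(x) + noise of variance sigma^2, the standard decomposition behind this slide is:

    \mathbb{E}\big[(\hat{f}(x) - y)^2\big]
      = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
      + \underbrace{\mathbb{E}\big[(\hat{f}(x) - \mathbb{E}[\hat{f}(x)])^2\big]}_{\text{variance}}
      + \underbrace{\sigma^2}_{\text{noise}}

A high-bias model keeps the variance term small at the risk of a large bias term; a low-bias model reverses the tradeoff.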

Page 13: Computational Cognitive Neuroscience Lab

Architecture as Bias

» Inhibition drives competition, competition determines which units are active, and unit activity determines learning

» Thus, deciding which units share inhibitory connections (are in the same layer) will affect the learning

» This architecture is the learning bias!

Page 14: Computational Cognitive Neuroscience Lab

Fidelity and Simplicity of representations

» Information must be lost in the world-to-brain transformation (p118)

» There is a tradeoff between the amount of information lost and the complexity of the representation

» The fidelity / simplicity tradeoff is set by:

» Conditional PCA (first principal component only)

» Competition (k value)

» Contrast enhancement (savg_corr, wt_gain)