Math Tools for Neuroscience (NEU 314), Spring 2016
Jonathan Pillow, Princeton Neuroscience Institute & Psychology
Accompanying notes/slides for Thursday, Feb 4
Slides: pillowlab.princeton.edu/teaching/mathtools16/slides/lec02_vectors.pdf


Page 1

Jonathan Pillow, Princeton Neuroscience Institute & Psychology

Math Tools for Neuroscience (NEU 314), Spring 2016

accompanying notes/slides for Thursday, Feb 4

Page 2

discussion items

• course website: http://pillowlab.princeton.edu/teaching/mathtools16/
• sign up for Piazza
• take the math-background poll
• for Matlab newbies:
  • check out MATLAB for Neuroscientists (Lewis Library), chapter 2
  • either of the Matlab tutorials posted on the course website (just the first part, up through vectors & matrices)
• Today: “Intro to linear algebra: vectors and some of the basic stuff you can do with ’em.”
• Next week: start of labs!

Page 3

today’s topics

• vectors (geometric picture)
• vector addition
• scalar multiplication
• vector norm (“L2 norm”)
• unit vectors
• dot product (“inner product”)
• linear projection
• orthogonality

Page 4

vectors

A Geometric Review of Linear Algebra

The following is a compact review of the primary concepts of linear algebra. The order of presentation is unconventional, with emphasis on geometric intuition rather than mathematical formalism. For more thorough coverage, I recommend Linear Algebra and Its Applications by Gilbert Strang, Academic Press, 1980.

Vectors (Finite-Dimensional)

A vector is an ordered collection of N numbers. The numbers are called the components of the vector, and N is the dimensionality of the vector. We typically imagine that these components are arranged vertically (a “column” vector):

$$v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_N \end{bmatrix}$$

Vectors of dimension 2 or 3 can be graphically depicted as arrows. Vectors of higher dimension can be illustrated using a “spike plot”.

[Figure: a 2-D vector and a 3-D vector drawn as arrows, and a spike plot showing the components v_1, v_2, …, v_N as heights.]

The norm (or magnitude) of a vector is defined as $\|v\| = \sqrt{\sum_n v_n^2}$. Geometrically, this corresponds to the length of the vector. A vector containing all zero components has zero norm, and is called the zero vector.

Author: Eero Simoncelli, Center for Neural Science, and Courant Institute of Mathematical Sciences. Created: 20 January 1999. Last revised: 13 Sep 2008. Send corrections or comments to [email protected]


Page 5

column vector


in matlab:

% make a 5-component column vector
v = [1; 7; 3; 0; 1];

% transpose: turns the column vector into a row vector
v'

% create a row vector directly by separating entries
% with , instead of ;
v = [1, 7, 3, 0, 1];

Page 6

addition of vectors

A unit vector is a vector of length one. In two dimensions, a unit vector may be parameterized in terms of its angle relative to the horizontal axis:

$$\hat{u}(\theta) = \begin{bmatrix} \cos(\theta) \\ \sin(\theta) \end{bmatrix}.$$

Angular parameterization of unit vectors generalizes to N dimensions, but gets messier.

[Figure: the unit vector û(θ) on the unit circle, with horizontal component cos θ, vertical component sin θ, and angle θ.]

Multiplying a vector by a scalar simply changes the length of the vector by that factor. That is: $\|av\| = |a|\,\|v\|$. Any vector (with the exception of the zero vector) may be rescaled to have unit length by dividing by its norm: $\hat{v} = v/\|v\|$.

[Figure: a vector v, a rescaled copy of v, and the unit vector v̂ obtained by dividing v by its norm.]

The sum of two vectors of the same dimensionality is a vector whose components are sums of the corresponding components of the summands. Specifically, $y = w + v$ means that:

$$y_n = w_n + v_n, \quad \text{for every } n.$$

Geometrically, this corresponds to stacking the vectors head-to-foot.

[Figure: vector addition, with v translated so its foot sits at the head of w; the sum runs from the foot of w to the head of the translated v.]

The inner product (also called the “dot product”) of two vectors is the sum of the pairwise product of components:

$$v \cdot w \equiv \sum_n v_n w_n.$$
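To make the head-to-foot picture concrete, here is a minimal sketch of vector addition (the example vectors are my own, not from the slides):

in matlab:

% vector addition is elementwise: each component of the sum
% is the sum of the corresponding components
w = [1; 2];
v = [3; 1];
y = w + v      % = [4; 3]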

Page 7

scalar multiplication

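A quick illustration of scalar multiplication (example values are my own; the check uses the fact that $\|av\| = |a|\,\|v\|$):

in matlab:

% multiplying by a scalar rescales the length by |a|
v = [3; 4];                    % norm(v) = 5
a = 2;
norm(a*v)                      % = 10
norm(a*v) == abs(a)*norm(v)    % = 1 (true)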

Page 8

vector norm (“L2 norm”)

• vector length in Euclidean space


In 2-D: $\|v\| = \sqrt{v_1^2 + v_2^2}$

In n-D: $\|v\| = \sqrt{\sum_{i=1}^{n} v_i^2}$

Page 9

vector norm (“L2 norm”)

in matlab:

v = [1; 7; 3; 0; 1];   % make a vector

% many equivalent ways to compute the norm
norm(v)
sqrt(v'*v)
sqrt(sum(v.^2))   % note use of .^ and .*, which operate
sqrt(sum(v.*v))   % 'elementwise' on a matrix or vector

Page 10

unit vector


• vector such that $\|\hat{u}\| = 1$

• in 2 dimensions: $\hat{u}(\theta) = \begin{bmatrix} \cos(\theta) \\ \sin(\theta) \end{bmatrix}$ sits on the unit circle
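A small sketch of the angular parameterization (the value of theta is arbitrary):

in matlab:

% a 2-D unit vector parameterized by its angle
theta = pi/6;
u = [cos(theta); sin(theta)];
norm(u)    % = 1: u sits on the unit circle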

Page 11

unit vector

• in n dimensions

• vector such that $\|\hat{u}\| = 1$

• sits on the surface of an n-dimensional hypersphere

Page 12

unit vector

• make any vector into a unit vector via $\hat{v} = v/\|v\|$

• vector such that $\|\hat{v}\| = 1$
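A minimal sketch of normalization (reusing the example vector from earlier slides):

in matlab:

% rescale a vector to unit length by dividing by its norm
v = [1; 7; 3; 0; 1];
u = v / norm(v);
norm(u)    % = 1 (up to floating-point precision)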

Page 13

inner product (aka “dot product”)

• produces a scalar from two vectors

Note that the result is a scalar.

This operation has an equivalent geometric definition (general proof a bit tricky):

$$v \cdot w \equiv \|v\|\,\|w\| \cos(\phi_{vw}),$$

where $\phi_{vw}$ is the angle between the two vectors. Thus, the inner product of two perpendicular vectors is 0, the inner product of two parallel vectors is the product of their norms, and the inner product of a vector with itself is the square of its norm.

[Figure: vectors v and w separated by the angle φ_vw, with the component of w along v marked as b.]

The inner product is distributive over addition: $v \cdot (w + y) = v \cdot w + v \cdot y$. The symmetry of the definition means that the operation is also commutative (i.e., order doesn’t matter): $v \cdot w = w \cdot v$.

Despite this symmetry, it is often useful to think of one of the vectors in an inner product as an operator, and the other as an input. For example, the inner product of any vector v with the vector

$$w = \begin{pmatrix} \frac{1}{N} & \frac{1}{N} & \cdots & \frac{1}{N} \end{pmatrix}$$

gives the average of the components of v. The inner product of vector v with the vector

$$w = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \end{pmatrix}$$

is the first component, $v_1$.

Furthermore, the inner product of a vector v with a unit vector $\hat{u}$ has a nice geometric interpretation. The cosine equation implies that the inner product is the length of the component of v lying along the line in the direction of $\hat{u}$. This component, which is written as $(v \cdot \hat{u})\,\hat{u}$, is referred to as the projection of the vector onto the line. The difference (or residual) vector, $v - (v \cdot \hat{u})\,\hat{u}$, is the component of v perpendicular to the line. Note that the residual vector is always perpendicular to the projection vector, and that their sum is v [prove].

[Figure: v, the unit vector û, and the projection (v·û)û of v onto the line through û.]
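A sketch of the inner product and the “averaging operator” example from the handout (the example vectors are my own):

in matlab:

% inner (dot) product: sum of pairwise products of components
v = [1; 7; 3; 0; 1];
w = [2; 0; 1; 4; 1];
v'*w          % inner product as a matrix product
dot(v, w)     % same thing, via the built-in
sum(v.*w)     % same thing, written out elementwise

% inner product with (1/N, ..., 1/N) averages the components
N = numel(v);
wavg = ones(N, 1) / N;
wavg'*v       % equals mean(v)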

Page 14

linear projection


• intuitively, dropping a vector down onto a linear surface at a right angle

• if $\hat{u}$ is a unit vector, length of projection is $v \cdot \hat{u}$

• for a non-unit vector w, length of projection = $(v \cdot w)/\|w\|$
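A minimal sketch of projecting onto a unit vector (vectors chosen for easy arithmetic; not from the slides):

in matlab:

% project v onto the direction of a unit vector u
v = [3; 4];
u = [1; 0];            % unit vector along the horizontal axis
len   = v'*u;          % length of the projection (= 3 here)
proj  = (v'*u)*u;      % projection: component of v along u
resid = v - proj;      % residual: component of v perpendicular to u
resid'*proj            % = 0: residual is orthogonal to the projection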

Page 15

linear projection


• intuitively, dropping a vector down onto a linear surface at a right angle

• if $\hat{u}$ is a unit vector, length of projection is $v \cdot \hat{u}$

• for a non-unit vector w, length of projection = $(v \cdot w)/\|w\|$

(the projection $(v \cdot \hat{u})\,\hat{u}$ is the component of v in the direction of $\hat{u}$)

Page 16

orthogonality

• two vectors are orthogonal (or “perpendicular”) if their dot product is zero: $v \cdot w = 0$


(decomposition: $(v \cdot \hat{u})\,\hat{u}$ is the component of v in the direction of $\hat{u}$, and $v - (v \cdot \hat{u})\,\hat{u}$ is the component of v orthogonal to $\hat{u}$)
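A quick check of orthogonality (example vectors are my own):

in matlab:

% two vectors are orthogonal iff their dot product is zero
a = [1; 2];
b = [-2; 1];
a'*b        % = 0: a and b are orthogonal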