
MTH 215: Introduction to Linear Algebra - Chapter 4
Vectors in Rn; Linear Independence, Spanning Sets and Basis; Row Space, Column Space and the Null Space of a Matrix; Orthogonality and the Gram Schmidt Process

Jonathan A. Chávez Casillas
University of Rhode Island, Department of Mathematics
February 25, 2019


Table of Contents
1 Vectors in Rn: Algebra in Rn; Length of a Vector; Unit Vectors; The Dot Product
2 Linear Independence, Spanning Sets and Basis: Definitions and examples; Using matrices to determine linear independence and spanning; Subspaces and Basis
3 Row Space, Column Space and the Null Space of a Matrix: The Row and Column Spaces; The Null Space; The Image of a Matrix; The Rank-Nullity Theorem
4 Orthogonality and the Gram Schmidt Process: Orthogonal Sets and Matrices; Orthogonality and Independence; Gram-Schmidt Process

1 Vectors in Rn


What is Rn?

Notation and Terminology
- R denotes the set of real numbers.
- R2 denotes the set of all column vectors with two entries.
- R3 denotes the set of all column vectors with three entries.
- In general, Rn denotes the set of all column vectors with n entries.

Scalar quantities versus vector quantities
- A scalar quantity has only magnitude; e.g. time, temperature.
- A vector quantity has both magnitude and direction; e.g. displacement, force, wind velocity.

Whereas two scalar quantities are equal if they are represented by the same value, two vector quantities are equal if and only if they have the same magnitude and direction.


R2 and R3

Vectors in R2 and R3 have convenient geometric representations as position vectors of points in the 2-dimensional (Cartesian) plane and in 3-dimensional space, respectively.


Notation
If P is a point in Rn with coordinates (p1, p2, ..., pn), we denote this by P = (p1, p2, ..., pn).
If P = (p1, p2, ..., pn) is a point in Rn, then the column vector
    P = [p1, p2, ..., pn]^T
is often used to denote the position vector of the point.
Instead of using a capital letter to denote the vector (as we generally do with matrices), we emphasize the importance of the geometry and the direction by writing an arrow over the name of the vector.


Notation and Terminology
The arrow notation emphasizes that this vector goes from the origin 0 to the point P. We can also use lower case letters for names of vectors; in this case, we write P = p.
Any vector x = [x1, x2, ..., xn]^T in Rn is associated with the point (x1, x2, ..., xn). Please notice that in some contexts x can be a row vector or a column vector.
Often, there is no distinction made between the vector x and the point (x1, x2, ..., xn), and we say that both (x1, x2, ..., xn) ∈ Rn and x = [x1, x2, ..., xn]^T ∈ Rn.


Algebra in Rn

Addition in Rn

Since vectors in Rn are n × 1 matrices, addition in Rn is precisely matrix addition using column or row matrices, i.e.:
- If u and v are in Rn, then u + v is obtained by adding together corresponding entries of the vectors.
- The zero vector in Rn is the n × 1 zero matrix, and is denoted 0.

Example
Let u = [1, 2, 3]^T and v = [4, 5, 6]^T. Then
    u + v = [1, 2, 3]^T + [4, 5, 6]^T = [5, 7, 9]^T.


Properties of Vector Addition

Let u, v, and w be vectors in Rn. Then the following properties hold.
1. u + v = v + u (vector addition is commutative).
2. (u + v) + w = u + (v + w) (vector addition is associative).
3. u + 0 = u (existence of an additive identity).
4. u + (−u) = 0 (existence of an additive inverse).


Scalar Multiplication

Since vectors in Rn are n × 1 matrices, scalar multiplication in Rn is precisely matrix scalar multiplication using column matrices, i.e.: if u is a vector in Rn and k ∈ R is a scalar, then ku is obtained by multiplying every entry of u by k.

Example
Let u = [1, 2, 3]^T and k = 4. Then
    ku = 4 [1, 2, 3]^T = [4, 8, 12]^T.
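Both operations are entrywise, so they can be checked directly in a numerical package; a minimal sketch, assuming NumPy (not part of the original slides):

```python
import numpy as np

u = np.array([1, 2, 3])   # the vector u from the two examples above
v = np.array([4, 5, 6])

print(u + v)    # [5 7 9]   -- entrywise addition
print(4 * u)    # [ 4  8 12] -- scalar multiplication scales every entry
```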


Properties of Scalar Multiplication

Let u, v ∈ Rn be vectors and k, p ∈ R be scalars. Then the following properties hold.
1. k(u + v) = ku + kv (scalar multiplication distributes over vector addition).
2. (k + p)u = ku + pu (addition distributes over scalar multiplication).
3. k(pu) = (kp)u (scalar multiplication is associative).
4. 1u = u (existence of a multiplicative identity).


Length of a Vector, R2

If x = [x1, x2]^T ∈ R2, then the length of the vector x is the distance from the origin 0 to the point X = (x1, x2), given by d(0, X).

The length of x, denoted ||x||, is given by:
    d(0, X) = ||x|| = sqrt(x^T x) = sqrt(x1^2 + x2^2).


Length of a Vector, Rn

This extends naturally to x = [x1, x2, ..., xn]^T ∈ Rn.

The length of x is the distance from the origin 0 to the point X = (x1, x2, ..., xn), given by d(0, X):
    d(0, X) = ||x|| = sqrt(x^T x) = sqrt(x1^2 + x2^2 + ... + xn^2).

Please notice that if we instead define x = [x1, x2, ..., xn] as a row vector, then
    d(0, X) = ||x|| = sqrt(x x^T) = sqrt(x1^2 + x2^2 + ... + xn^2).
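A quick numerical check of this formula, assuming NumPy:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])

print(np.sqrt(x @ x))      # sqrt(x^T x) = sqrt(14), about 3.7417
print(np.linalg.norm(x))   # the built-in Euclidean norm gives the same value
```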


Unit Vectors

Definition
A unit vector is a vector of length one.

Example
[1, 0, 0]^T, [0, 1, 0]^T, [0, 0, 1]^T, and [sqrt(2)/2, 0, sqrt(2)/2]^T are examples of unit vectors.

Example
If v ≠ 0, then
    (1/||v||) v
is a unit vector in the same direction as v.


Example
v = [−1, 3, 2]^T is not a unit vector, since ||v|| = sqrt(14). However,
    u = (1/sqrt(14)) v = [−1/sqrt(14), 3/sqrt(14), 2/sqrt(14)]^T
is a unit vector in the same direction as v, i.e.,
    ||u|| = (1/sqrt(14)) ||v|| = (1/sqrt(14)) sqrt(14) = 1.

Example
If v and w are nonzero vectors that have
- the same direction, then v = (||v|| / ||w||) w;
- opposite directions, then v = −(||v|| / ||w||) w.
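A sketch of the normalization step for this example, assuming NumPy:

```python
import numpy as np

v = np.array([-1.0, 3.0, 2.0])
u = v / np.linalg.norm(v)     # scale v by 1/||v||

print(np.linalg.norm(v))      # sqrt(14), about 3.7417, so v is not a unit vector
print(u)                      # [-1, 3, 2] / sqrt(14)
print(np.linalg.norm(u))      # 1.0 (up to rounding): a unit vector in the direction of v
```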


Definition
If u = [u1, u2, ..., un]^T and v = [v1, v2, ..., vn]^T are in Rn, then the dot product u · v is defined as the 1 × 1 matrix
    u^T v = [u1, u2, ..., un] [v1, v2, ..., vn]^T = [u1v1 + u2v2 + ... + unvn],
which is treated as the scalar u1v1 + u2v2 + ... + unvn.

Please notice that this definition can be adapted if x and y are regarded as row vectors. The only change is that x · y = x y^T.


The Dot Product

Problem
Find u · v for u = [1, 2, 0, −1]^T, v = [0, 1, 2, 3]^T.

Solution
    u · v = (1)(0) + (2)(1) + (0)(2) + (−1)(3) = 0 + 2 + 0 + (−3) = −1.
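The same computation, as a sketch assuming NumPy:

```python
import numpy as np

u = np.array([1, 2, 0, -1])
v = np.array([0, 1, 2, 3])

print(np.dot(u, v))   # -1, matching (1)(0) + (2)(1) + (0)(2) + (-1)(3)
print(u @ u)          # 6 = ||u||^2, since x . x = ||x||^2
```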


2 Linear Independence, Spanning Sets and Basis


Definition
Let v1, v2, ..., vk be k vectors in Rn. A vector w ∈ Rn is said to be a linear combination of the vectors v1, v2, ..., vk if there exist constants a1, a2, ..., ak (called coefficients) such that
    w = Σ_{i=1}^{k} ai vi = a1 v1 + a2 v2 + ... + ak vk.

Definition
Let S = {v1, v2, ..., vk} be a set of k vectors in Rn. That is, S ⊂ Rn.

The span of S, written span(S), is the set of all linear combinations of the elements of S. That is,
    span(S) = { Σ_{i=1}^{k} ai vi  such that  a1, a2, ..., ak ∈ R }.

Please notice that for creating the span, we consider ALL possible choices of the coefficients a1, a2, ..., ak.


Definition
A set of non-zero vectors {u1, ..., uk} in Rn is said to be linearly independent if whenever
    Σ_{i=1}^{k} ai ui = 0
it follows that each ai = 0. A set that is not linearly independent is called linearly dependent.

We can rewrite the definition of linear independence as follows:
A set of non-zero vectors {u1, ..., uk} in Rn is linearly independent if whenever 0 is written as a linear combination of them, the coefficients of the linear combination are all 0.


A Linearly Dependent Set

Problem
Consider the vectors u = [0, 1, 2]^T, v = [0, 2, 3]^T, w = [0, 4, 1]^T. Is the set {u, v, w} linearly independent?

Solution
Notice that we can write w as a linear combination of u and v as follows:
    [0, 4, 1]^T = (−10) [0, 1, 2]^T + (7) [0, 2, 3]^T.
Hence, w is in span{u, v}. By the definition, this set is not linearly independent (it is linearly dependent).
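The coefficients −10 and 7 can be recovered by solving a small linear system; a sketch, assuming NumPy:

```python
import numpy as np

u = np.array([0, 1, 2])
v = np.array([0, 2, 3])
w = np.array([0, 4, 1])

# Solve [u v] [a, b]^T = w in the least-squares sense; an exact solution
# (zero residual) means w is a linear combination of u and v.
A = np.column_stack([u, v])
coeffs, residual, rank, _ = np.linalg.lstsq(A, w, rcond=None)
print(coeffs)                      # approximately [-10.,  7.]
print(np.allclose(A @ coeffs, w))  # True, so {u, v, w} is linearly dependent
```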


Example
Is S = { [−1, 0, 1]^T, [1, 1, 1]^T, [1, 3, 5]^T } linearly independent?


Solution


Problem
Let {u, v, w} be a linearly independent set of vectors in Rn. Is {u + v, 2u + w, v − 5w} linearly independent?

Solution


Problem
Describe the span of the vectors u = [0, 1, 2]^T and v = [0, 2, 3]^T.


Solution


Problem
Let u = [1, 1, 0]^T and v = [3, 2, 0]^T ∈ R3. Show that w = [4, 5, 0]^T is in span{u, v}.

Solution
For a vector to be in span{u, v}, it must be a linear combination of these vectors. If w ∈ span{u, v}, we must be able to find scalars a, b such that a u + b v = w.
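A sketch of one way to find such scalars, assuming NumPy (the slides solve the system by row reduction):

```python
import numpy as np

u = np.array([1, 1, 0])
v = np.array([3, 2, 0])
w = np.array([4, 5, 0])

A = np.column_stack([u, v])              # unknowns a, b in  a*u + b*v = w
ab, residual, rank, _ = np.linalg.lstsq(A, w, rcond=None)
print(ab)                                # [ 7., -1.]
print(np.allclose(A @ ab, w))            # True, so w is in span{u, v}
```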


Problem
Let u = [1, 1, 1]^T and v = [3, 2, 0]^T ∈ R3. Does w = [4, 5, 0]^T belong to span{u, v}?

This is almost identical to the previous problem, except that u has one entry that is different.

Solution
In this case, the system of linear equations is inconsistent, as you can verify. Therefore w ∉ span{u, v}.


Definition
Let e_j denote the j-th column of I_n, the n × n identity matrix; e_j is called the j-th coordinate vector of Rn.

Claim
Rn = span{e_1, e_2, ..., e_n}.

Proof.
Let x = [x1, x2, ..., xn]^T ∈ Rn. Then x = x1 e_1 + x2 e_2 + ... + xn e_n, where x1, x2, ..., xn ∈ R. Therefore, x ∈ span{e_1, e_2, ..., e_n}, and thus Rn ⊆ span{e_1, e_2, ..., e_n}.

Conversely, since e_i ∈ Rn for each i, 1 ≤ i ≤ n (and Rn is a vector space), it follows that span{e_1, e_2, ..., e_n} ⊆ Rn. The equality now follows. □


Problem
Let u1 = [1, −1, 1, −1]^T, u2 = [−1, 1, 1, 1]^T, u3 = [1, −1, −1, 1]^T, u4 = [1, −1, 1, 1]^T.
Show that span{u1, u2, u3, u4} ≠ R4.

Solution
If you check, you'll find that e_2 cannot be written as a linear combination of u1, u2, u3, and u4.


Example

A = [ 0 1 −1  2 5  1
      0 0  1 −3 0  1
      0 0  0  0 1 −2
      0 0  0  0 0  0 ]

is a matrix in row echelon form (ref).

Treat the nonzero rows of A as transposes of vectors in R6:
    u1 = [0, 1, −1, 2, 5, 1]^T,  u2 = [0, 0, 1, −3, 0, 1]^T,  u3 = [0, 0, 0, 0, 1, −2]^T,
and suppose that a u1 + b u2 + c u3 = 0_6 for some a, b, c ∈ R.


Example (continued)
This results in a system of six equations in three variables, whose augmented matrix is

    [  0  0  0 | 0
       1  0  0 | 0
      −1  1  0 | 0
       2 −3  0 | 0
       5  0  1 | 0
       1  1 −2 | 0 ]

The solution to the system is easily determined to be a = b = c = 0, so the set {u1, u2, u3} is independent. In a slight abuse of terminology, we say that the nonzero rows of A are independent.

In general, the nonzero rows of any matrix in row echelon form (ref) form an independent set of (row) vectors.


Theorem
Suppose A is an m × n matrix with columns a1, a2, ..., an ∈ Rm. Then
1. The columns of A form a linearly independent set if and only if Ax = 0_m implies x = 0_n.
2. The columns of A span Rm if and only if Ax = b has a solution for every b ∈ Rm.

How is this theorem useful?
Let x1, x2, ..., xk ∈ Rn.
1. Are x1, x2, ..., xk linearly independent?
2. Do x1, x2, ..., xk span Rn?

To answer both questions, simply let A be the matrix whose columns are the vectors x1, x2, ..., xk. Next, obtain the matrix R, which is the row echelon form (ref) of A.
- The answer to the first question is "yes" if and only if each column of R has a leading one. Why?
- The answer to the second question is "yes" if and only if each row of R has a leading one. Why?
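A sketch of this procedure, assuming SymPy for exact row reduction (SymPy is not part of the slides): the pivot columns returned by rref() tell us whether every column, and every row, has a leading one.

```python
from sympy import Matrix

# Columns of A are the vectors x1, x2 (a small 3 x 2 illustration).
A = Matrix([[1, 2],
            [0, 1],
            [1, 3]])

R, pivot_cols = A.rref()                 # exact reduced row echelon form

independent = len(pivot_cols) == A.cols  # leading one in every column?
spans_R3    = len(pivot_cols) == A.rows  # leading one in every row?
print(R)             # Matrix([[1, 0], [0, 1], [0, 0]])
print(independent)   # True  -> the two columns are linearly independent
print(spans_R3)      # False -> they do not span R^3
```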


Problem
Let
    u1 = [1, −1, 1, −1],  u2 = [−1, 1, 1, 1],  u3 = [1, −1, −1, 1],  u4 = [1, −1, 1, 1].
Show that span{u1, u2, u3, u4} ≠ R4.

Solution


Theorem
Let A be an invertible n × n matrix. Then the columns of A are independent and span Rn. Similarly, the rows of A are independent and span Rn.

This theorem also allows us to determine if a matrix is invertible. If an n × n matrix A has columns which are independent, or span Rn, then it follows that A is invertible. If it has rows that are independent, or span Rn, then A is invertible.


Problem (Again!)
Let u1 = [1, −1, 1, −1]^T, u2 = [−1, 1, 1, 1]^T, u3 = [1, −1, −1, 1]^T, u4 = [1, −1, 1, 1]^T.
Show that span{u1, u2, u3, u4} ≠ R4.

Solution
Let A = [u1 u2 u3 u4] =

    [  1 −1  1  1
      −1  1 −1 −1
       1  1 −1  1
      −1  1  1  1 ]

The columns of A span R4 if and only if A is invertible. Since det A = 0 (row 2 is (−1) times row 1), A is not invertible, and thus {u1, u2, u3, u4} does not span R4.
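The same conclusion can be verified numerically; a sketch assuming NumPy:

```python
import numpy as np

A = np.array([[ 1, -1,  1,  1],
              [-1,  1, -1, -1],
              [ 1,  1, -1,  1],
              [-1,  1,  1,  1]])

print(np.linalg.det(A))          # 0.0, so A is not invertible
print(np.linalg.matrix_rank(A))  # 3 < 4, so the columns do not span R^4
```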


Linear Independence

We can use the reduced row-echelon form of the matrix to determine if the columns form a linearly independent set of vectors.

Problem
Determine whether the following set of vectors is linearly independent:
    [1, 2, 3]^T,  [2, 1, 0]^T,  [0, 1, 1]^T


Solution


Problem
Determine whether the following vectors are linearly independent. If they are linearly dependent, write one of the vectors as a linear combination of the others.
    [1, 2, 4, 1]^T,  [2, 7, 17, 2]^T,  [0, 1, 3, 0]^T,  [8, 5, 11, 11]^T

Solution
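A sketch of one way to find a dependence relation, assuming SymPy and the four vectors as written in the problem above: any nonzero vector in the null space of the matrix whose columns are the given vectors supplies the coefficients of a dependence relation.

```python
from sympy import Matrix

v1 = Matrix([1, 2, 4, 1])
v2 = Matrix([2, 7, 17, 2])
v3 = Matrix([0, 1, 3, 0])
v4 = Matrix([8, 5, 11, 11])

A = Matrix.hstack(v1, v2, v3, v4)
print(A.nullspace())   # a basis vector proportional to (2, -1, 3, 0)
# Reading off the coefficients: 2*v1 - v2 + 3*v3 = 0, i.e. v2 = 2*v1 + 3*v3,
# so the vectors are linearly dependent.
```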


Linear Dependence in Rn

Theorem
Let {u1, u2, ..., uk} be a set of vectors in Rn. If k > n, then the set is linearly dependent.


Subspaces

Theorem (Subspace Test)
A subset V of Rn is a subspace of Rn if
1. the zero vector of Rn, 0_n, is in V;
2. V is closed under addition, i.e., for all u, w ∈ V, u + w ∈ V;
3. V is closed under scalar multiplication, i.e., for all u ∈ V and k ∈ R, ku ∈ V.

The subset V = {0_n} is a subspace of Rn (verify this), as is the set Rn itself. Any other subspace of Rn is a proper subspace of Rn.

Notation
If V is a subset of Rn, we write V ⊆ Rn. In some texts, to say that V is a subspace of Rn, the notation used is V ≤ Rn.


Problem
Is V = { [a, b, c, d]^T : a, b, c, d ∈ R and 2a − b = c + 2d } a subspace of R4? Justify your answer.

Solution


Subspaces

Definition
Let V be a nonempty collection of vectors in Rn. Then V is a subspace if whenever a and b are scalars and u and v are vectors in V, au + bv is also in V.

Subspaces are closely related to the span of a set of vectors, which we discussed earlier.

Theorem
Let V be a nonempty collection of vectors in Rn. Then V is a subspace of Rn if and only if there exist vectors {u1, ..., uk} in V such that
    V = span{u1, ..., uk}.


Subspaces

Subspaces are also related to the property of linear independence.

Theorem
If V is a subspace of Rn, then there exist linearly independent vectors {u1, ..., uk} of V such that
    V = span{u1, ..., uk}.

In other words, subspaces of Rn consist of spans of finite, linearly independent collections of vectors in Rn.


Problem
Is V = { [a, b, c, d]^T : a, b, c, d ∈ R and 2a − b = c + 2d } a subspace of R4? Justify your answer.

Solution 2


Basis of a Subspace

Definition
Let V be a subspace of Rn. Then {u1, ..., uk} is called a basis for V if the following conditions hold:
- span{u1, ..., uk} = V;
- {u1, ..., uk} is linearly independent.


Example
The subset {e_1, e_2, ..., e_n} is a basis of Rn, called the standard basis of Rn. (We've already seen that {e_1, e_2, ..., e_n} is linearly independent and that Rn = span{e_1, e_2, ..., e_n}.)

Example
In a previous problem, we saw that R4 = span(S), where
    S = { [1, 1, 1, 1]^T, [0, 1, 1, 1]^T, [0, 0, 1, 1]^T, [0, 0, 0, 1]^T }.
S is also linearly independent (prove this). Therefore, S is a basis of R4.


The following theorem claims that any two bases of a subspace must be of the same size.

Theorem
Let V be a subspace of Rn and suppose {u1, ..., uk} and {v1, ..., vm} are two bases for V. Then k = m.

The previous theorem shows that all bases of a subspace have the same size. This size is called the dimension of the subspace.

Definition
Let V be a subspace of Rn. Then the dimension of V is the number of vectors in a basis of V.


Properties of Rn

Note that the dimension of Rn is n.

There are some other important properties of vectors in Rn.

Theorem
- If {u1, ..., un} is a linearly independent set of vectors in Rn, then {u1, ..., un} is a basis for Rn.
- Suppose {u1, ..., um} spans Rn. Then m ≥ n.
- If {u1, ..., un} spans Rn, then {u1, ..., un} is linearly independent.


Problem
Let
    U = { [a, b, c, d]^T ∈ R4 : a − b = d − c }.
Show that U is a subspace of R4, find a basis of U, and find dim(U).


Solution
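A sketch of one computational route, assuming SymPy (the slides work this by hand): the condition a − b = d − c is the single equation a − b + c − d = 0, so U is the null space of the 1 × 4 matrix [1, −1, 1, −1], and a basis of that null space is a basis of U.

```python
from sympy import Matrix

C = Matrix([[1, -1, 1, -1]])   # a - b + c - d = 0  <=>  a - b = d - c
basis = C.nullspace()          # a basis of U, viewed as the kernel of C

for b in basis:
    print(b.T)        # [1, 1, 0, 0], [-1, 0, 1, 0], [1, 0, 0, 1]
print(len(basis))     # 3, so dim(U) = 3
```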


Theorem
The following properties hold in Rn:
- Suppose {u1, ..., un} is linearly independent. Then {u1, ..., un} is a basis for Rn.
- Suppose {u1, ..., um} spans Rn. Then m ≥ n.
- If {u1, ..., un} spans Rn, then {u1, ..., un} is linearly independent.

Question
What is the significance of this result?


Answer
Let V be a subspace of Rn and suppose B ⊆ V.
- If B spans V and |B| = dim(V), then B is also independent, and hence B is a basis of V.
- If B is independent and |B| = dim(V), then B also spans V, and hence B is a basis of V.
Therefore, if |B| = dim(V), it is sufficient to prove that B is either independent or spans V in order to prove it is a basis.


3 Row Space, Column Space and the Null Space of a Matrix


Row and Column Space

Definition
Let A be an m × n matrix. The column space of A is the span of the columns of A. The row space of A is the span of the rows of A.

Problem
Find the rank of the matrix A and describe the column and row spaces efficiently.

    A = [ 1 2 1 3 2
          1 3 6 0 2
          3 7 8 6 6 ]


Solution: Column Space


Solution (continued): Row Space

Notice that the vectors used in the description of the column space are from the original matrix, while those in the row space can be taken from the reduced row-echelon form or ALSO from the original matrix.
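A computational cross-check of this example, as a sketch assuming SymPy:

```python
from sympy import Matrix

A = Matrix([[1, 2, 1, 3, 2],
            [1, 3, 6, 0, 2],
            [3, 7, 8, 6, 6]])

print(A.rank())          # 2
print(A.columnspace())   # basis of col(A): the pivot columns [1,1,3]^T and [2,3,7]^T of A
print(A.rowspace())      # basis of row(A), read off from a row-reduced form
```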


Null Space

Definition
Let A be an m × n matrix. The null space of A, or kernel of A, is defined as:
    ker(A) = {X ∈ Rn : AX = 0}

Theorem
Let A be an m × n matrix. The kernel of A, ker(A), is a subspace of Rn.

Definition
The dimension of the null space of a matrix is called the nullity, denoted null(A).

To find ker(A), we solve the system of equations AX = 0.


Problem
Find ker(A) for the matrix A:

    A = [ 1  2  1
          0 −1  1
          2  3  3 ]

Solution
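A sketch of the computation, assuming SymPy (the slides row-reduce the augmented matrix [A | 0] by hand):

```python
from sympy import Matrix

A = Matrix([[1,  2, 1],
            [0, -1, 1],
            [2,  3, 3]])

print(A.rref())        # (Matrix([[1, 0, 3], [0, 1, -1], [0, 0, 0]]), (0, 1))
print(A.nullspace())   # [Matrix([[-3], [1], [1]])], so ker(A) = span{ [-3, 1, 1]^T }
```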


Image of a Matrix

Definition
The image of an m × n matrix A is defined as:
    im(A) = {Y ∈ Rm : there exists an X ∈ Rn such that AX = Y}

Roughly, the image of A is the set of vectors of Rm which "get hit" by A.

Theorem
Let A be an m × n matrix. The image of A, im(A), is a subspace of Rm.

It can be shown that im(A) = col(A). Thus, to find im(A), we just find the column space of A, that is, col(A).


Problem
Find im(A) for the matrix A:

    A = [ 1  2  1
          0 −1  1
          2  3  3 ]

Solution
As we saw before (when computing ker(A)), the reduced row-echelon form of the augmented matrix [A | 0] is:

    [ 1 0  3 | 0
      0 1 −1 | 0
      0 0  0 | 0 ]


The Rank-Nullity Theorem

One of the most important theorems in Linear Algebra with tremendous consequences is the following:

Theorem
Let A be an m × n matrix. Then
    rank(A) + null(A) = n.

For instance, in the last example, A was a 3 × 3 matrix. The rank was 2 (since the image, or column space, had dimension 2) and the nullity was 1 (since the null space had dimension 1). Then,
    rank(A) + null(A) = 2 + 1 = 3 = n.
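A short check of the theorem on that example, assuming SymPy:

```python
from sympy import Matrix

A = Matrix([[1, 2, 1], [0, -1, 1], [2, 3, 3]])

rank    = A.rank()               # 2
nullity = len(A.nullspace())     # 1
print(rank + nullity == A.cols)  # True: rank(A) + null(A) = n = 3
```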


4 Orthogonality and the Gram Schmidt Process


Definitions (Recall)

Let x = [x1, x2, ..., xn]^T and y = [y1, y2, ..., yn]^T be vectors in Rn.

1. The dot product of x and y is
       x · y = x^T y = x1y1 + x2y2 + ... + xnyn.
   Note: x · y is a 1 × 1 matrix, but we also treat it as a scalar.
2. The length of x, denoted ||x||, is
       ||x|| = sqrt(x^T x) = sqrt(x1^2 + x2^2 + ... + xn^2) = sqrt(x · x).
3. x is called a unit vector if ||x|| = 1.


Orthogonality

Definitions
Let u, v ∈ Rn. We say that u and v are orthogonal if u · v = 0.

Let {u1, u2, ..., um} be a set of vectors in Rn. Then this set is called an orthogonal set if:
1. ui · uj = 0 for all i ≠ j;
2. ui ≠ 0 for all i.

A set of vectors {w1, ..., wm} is said to be an orthonormal set if
    wi · wj = δij = 1 if i = j, and 0 if i ≠ j.


Theorem (Properties of length and the dot product)
Let k and p denote scalars and u, v, w denote vectors. Then the dot product u · v satisfies the following properties.
- u · v = v · u
- u · u ≥ 0, and it equals zero if and only if u = 0
- (ku + pv) · w = k(u · w) + p(v · w)
- u · (kv + pw) = k(u · v) + p(u · w)
- ||u||^2 = u · u


Example
Let {x1, x2, ..., xk} ⊆ Rn and suppose Rn = span{x1, x2, ..., xk}. Furthermore, suppose that there exists a vector u ∈ Rn for which u · xj = 0 for all j, 1 ≤ j ≤ k. What type of vector is u?

Solution
Write u = t1 x1 + t2 x2 + ... + tk xk for some t1, t2, ..., tk ∈ R (this is possible because x1, x2, ..., xk span Rn). Then

    ||u||^2 = u · u
            = u · (t1 x1 + t2 x2 + ... + tk xk)
            = u · (t1 x1) + u · (t2 x2) + ... + u · (tk xk)
            = t1 (u · x1) + t2 (u · x2) + ... + tk (u · xk)
            = t1(0) + t2(0) + ... + tk(0) = 0.

Since ||u||^2 = 0, ||u|| = 0. We know that ||u|| = 0 if and only if u = 0_n. Therefore, u = 0_n.


Examples
1. The standard basis of Rn is an orthonormal set (and hence an orthogonal set).
2. { [1, 1, 1, 1]^T, [1, 1, −1, −1]^T, [1, −1, 1, −1]^T } is an orthogonal (but not orthonormal) subset of R4.
3. { (1/2)[1, 1, 1, 1]^T, (1/2)[1, 1, −1, −1]^T, (1/2)[1, −1, 1, −1]^T } is an orthonormal subset of R4.
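A numerical check of example 3, assuming NumPy: stack the three vectors as the columns of a matrix Q; the set is orthonormal exactly when Q^T Q is the identity.

```python
import numpy as np

Q = 0.5 * np.array([[1,  1,  1],
                    [1,  1, -1],
                    [1, -1,  1],
                    [1, -1, -1]])   # columns: the three vectors of example 3

print(Q.T @ Q)                          # the 3 x 3 identity -> orthonormal set
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```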


Definition
Normalizing an orthogonal set is the process of turning an orthogonal (but not orthonormal) set into an orthonormal set. If {u1, u2, . . . , uk} is an orthogonal subset of Rn, then

    { (1/||u1||) u1, (1/||u2||) u2, . . . , (1/||uk||) uk }

is an orthonormal set.

Problem
Consider the set of vectors given by

    {u1, u2} = { [1, 1]^T , [−1, 1]^T }

Show that it is an orthogonal set of vectors but not an orthonormal one. Find the corresponding orthonormal set.


Solution
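Not a substitute for the written solution, but a NumPy sketch of the computation being asked for:

import numpy as np

u1 = np.array([1., 1.])
u2 = np.array([-1., 1.])

print(u1 @ u2)                                   # 0, so the set is orthogonal
print(np.linalg.norm(u1), np.linalg.norm(u2))    # sqrt(2) each, so the vectors are not unit length

w1 = u1 / np.linalg.norm(u1)                     # normalize each vector
w2 = u2 / np.linalg.norm(u2)
print(w1, w2)                                    # (1/sqrt(2))[1, 1] and (1/sqrt(2))[-1, 1]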


Orthogonal Matrix

Definition
A real n × n matrix U is called an orthogonal matrix if

    U U^T = U^T U = I

Problem
Show the matrix U is orthogonal.

    U = [ 1/√2    1/√2 ]
        [ 1/√2   −1/√2 ]


Solution
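As a numerical sketch (NumPy assumed), the defining identity can be checked directly:

import numpy as np

s = 1 / np.sqrt(2)
U = np.array([[s,  s],
              [s, -s]])

print(U @ U.T)                                # the 2x2 identity (up to rounding)
print(np.allclose(U @ U.T, np.eye(2)),
      np.allclose(U.T @ U, np.eye(2)))        # True True, so U is orthogonal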


Orthogonal Matrix

A matrix is orthogonal if and only if its rows (equivalently, its columns) form an orthonormal set of vectors.

Theorem (Orthonormal Basis)
The rows of an n × n orthogonal matrix form an orthonormal basis of Rn. Further, any orthonormal basis of Rn can be used to construct an n × n orthogonal matrix.

Theorem (Determinant of Orthogonal Matrices)
Suppose U is an orthogonal matrix. Then det(U) = ±1.
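Both theorems can be illustrated in a few lines: stack an orthonormal basis as the rows of a matrix, then check orthogonality and the determinant. (A NumPy sketch; the basis below is an arbitrary choice.)

import numpy as np

w1 = np.array([1., 1.]) / np.sqrt(2)      # an orthonormal basis of R^2
w2 = np.array([-1., 1.]) / np.sqrt(2)
U = np.vstack([w1, w2])                   # rows are the basis vectors

print(np.allclose(U @ U.T, np.eye(2)))    # True: U is orthogonal
print(np.linalg.det(U))                   # 1.0 here; always +1 or -1 for an orthogonal matrix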


Theorem
Let {w1, w2, · · · , wk} be an orthonormal set of vectors in Rn. Then, this set is linearly independent and forms a basis for the subspace W = span{w1, w2, · · · , wk}.

Proof.
To show it is a linearly independent set, suppose a linear combination of these vectors equals 0, such as:

    a1 w1 + a2 w2 + · · · + ak wk = 0,   ai ∈ R

We need to show that all ai = 0. To do so, take the dot product of each side of the above equation with the vector wi and obtain the following.

    wi · (a1 w1 + a2 w2 + · · · + ak wk) = wi · 0
    a1 (wi · w1) + a2 (wi · w2) + · · · + ak (wi · wk) = 0


Continued.
Now since the set is orthogonal, wi · wm = 0 for all m ≠ i, so we have:

    a1 (0) + · · · + ai (wi · wi) + · · · + ak (0) = 0
    ai ||wi||^2 = 0

Since the set is orthogonal, we know that ||wi||^2 ≠ 0. It follows that ai = 0. Since the index i was chosen arbitrarily, the set {w1, w2, · · · , wk} is linearly independent.

Finally, since W = span{w1, w2, · · · , wk}, the set of vectors also spans W and therefore forms a basis of W.
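A quick numerical illustration of the theorem (NumPy assumed; the orthonormal set below is an arbitrary choice): linear independence shows up as the stacked matrix having rank equal to the number of vectors.

import numpy as np

W = np.array([[1., 0., 0.],
              [0., 1/np.sqrt(2),  1/np.sqrt(2)],
              [0., 1/np.sqrt(2), -1/np.sqrt(2)]])   # rows form an orthonormal set in R^3

print(np.allclose(W @ W.T, np.eye(3)))   # True: pairwise dot products are delta_ij
print(np.linalg.matrix_rank(W))          # 3 = number of vectors, so they are linearly independent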


We have already seen that an orthonormal set of vectors in Rn is a linearly independent set. However, we are interested in the opposite problem.

Question: Given a linearly independent set of vectors u1, · · · , uk ∈ Rn, how do we construct a corresponding orthogonal set? A corresponding orthonormal set?

Answer: The Gram-Schmidt Process


Gram-Schmidt Process
Let {u1, ..., uk} be a set of linearly independent vectors in Rn.
I. Construct a new set of vectors {v1, ..., vk} as follows:

    v1 = u1
    v2 = u2 − ( (u2 · v1) / ||v1||^2 ) v1
    v3 = u3 − ( (u3 · v1) / ||v1||^2 ) v1 − ( (u3 · v2) / ||v2||^2 ) v2
    ...
    vk = uk − ( (uk · v1) / ||v1||^2 ) v1 − ( (uk · v2) / ||v2||^2 ) v2 − · · · − ( (uk · vk−1) / ||vk−1||^2 ) vk−1

Then {v1, ..., vk} is an orthogonal set.
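Step I translates almost line for line into code. A minimal NumPy sketch (the function name gram_schmidt is mine, not from the slides):

import numpy as np

def gram_schmidt(us):
    """Step I: turn linearly independent vectors us into an orthogonal list vs."""
    vs = []
    for u in us:
        v = np.array(u, dtype=float)
        for w in vs:
            v -= (np.dot(u, w) / np.dot(w, w)) * w   # subtract (u · v_j / ||v_j||^2) v_j for each earlier v_j
        vs.append(v)
    return vs

vs = gram_schmidt([np.array([1., 1.]), np.array([3., 2.])])
print(vs)                     # [array([1., 1.]), array([ 0.5, -0.5])]
print(np.dot(vs[0], vs[1]))   # 0.0, so the new set is orthogonal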


Gram-Schmidt Process

II. Now, let wi = vi / ||vi|| for i = 1, ..., k. Then {w1, ..., wk} is an orthonormal set.

The result of the Gram-Schmidt Process is three related sets, where:
{u1, ..., uk} is the original set
{v1, ..., vk} is the corresponding orthogonal set
{w1, ..., wk} is the corresponding orthonormal set

Notice that span{u1, ..., uk} = span{v1, ..., vk} = span{w1, ..., wk}.

Therefore, we can use the Gram-Schmidt Process to find orthogonal or orthonormal sets which have the same span as the original linearly independent set.
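Putting steps I and II together, and checking the span statement numerically (a self-contained NumPy sketch; the input vectors are an arbitrary example):

import numpy as np

def gram_schmidt_orthonormal(us):
    """Steps I and II: orthogonalize, then scale each v_i to unit length."""
    vs = []
    for u in us:
        v = np.array(u, dtype=float)
        for w in vs:
            v -= (np.dot(u, w) / np.dot(w, w)) * w
        vs.append(v)
    return [v / np.linalg.norm(v) for v in vs]

us = [np.array([1., 0., 0.]), np.array([1., 1., 0.]), np.array([1., 1., 1.])]
ws = gram_schmidt_orthonormal(us)
print(np.vstack(ws))                                # here the standard basis of R^3
# Same span: appending the w's to the u's adds no new directions (rank stays 3).
print(np.linalg.matrix_rank(np.vstack(us + ws)))    # 3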


Problem
Let {u1, u2} = { [1, 1, 0]^T , [3, 2, 0]^T }. Find an orthonormal set of vectors {w1, w2} having the same span.

Solution
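A numeric sketch of the computation (NumPy assumed); the written solution carries out exactly these two steps by hand:

import numpy as np

u1 = np.array([1., 1., 0.])
u2 = np.array([3., 2., 0.])

v1 = u1
v2 = u2 - (np.dot(u2, v1) / np.dot(v1, v1)) * v1     # step I: v2 = u2 - (u2 · v1 / ||v1||^2) v1
w1 = v1 / np.linalg.norm(v1)                          # step II: normalize each orthogonal vector
w2 = v2 / np.linalg.norm(v2)

print(v2)                  # [ 0.5 -0.5  0. ]
print(w1)                  # [0.70710678 0.70710678 0.        ]   i.e. (1/sqrt(2))[1, 1, 0]
print(w2)                  # [ 0.70710678 -0.70710678  0.       ]  i.e. (1/sqrt(2))[1, -1, 0]
print(np.dot(w1, w2))      # 0.0, and both have length 1, so {w1, w2} is orthonormal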


Orthonormal Set
Solution (continued)
