Vector Calculus & Linear Algebra
EIGEN VALUES AND EIGEN VECTORS
By: Taher K D
Introduction
In linear algebra, an eigenvector or characteristic vector of
a square matrix is a vector that points in a direction which is
invariant under the associated linear transformation. In other
words, if v is a vector which is not zero, then it is an
eigenvector of a square matrix A if Av is a scalar multiple of v.
This condition could be written as the equation
Av = λv
where λ is a number (also called a scalar) known as the
eigenvalue or characteristic value associated with the
eigenvector v.
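The defining equation Av = λv can be checked numerically. The sketch below uses NumPy with a small matrix chosen purely for illustration (it is not prescribed by the text):

```python
import numpy as np

# Illustrative 2x2 matrix, chosen only for this sketch
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v = (1, 1) is an eigenvector of this A: Av = (3, 3) = 3 * v
v = np.array([1.0, 1.0])
lam = 3.0

Av = A @ v
print(Av)                        # [3. 3.]
print(np.allclose(Av, lam * v))  # True
```

Multiplying by A stretches v by the factor λ = 3 without changing its direction, which is exactly the eigenvector condition.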
Geometric interpretation of eigenvalues and eigenvectors
An n×n matrix A multiplied by an n×1 vector x results in another n×1
vector y = Ax. Thus A can be considered a transformation matrix.
In general, a matrix acts on a vector by changing both its
magnitude and its direction. However, a matrix may act on certain
vectors by changing only their magnitude, and leaving their
direction unchanged (or possibly reversing it). These vectors are
the eigenvectors of the matrix.
A matrix acts on an eigenvector by multiplying its magnitude by a
factor, which is positive if its direction is unchanged and negative
if its direction is reversed. This factor is the eigenvalue
associated with that eigenvector.
In this shear mapping the red arrow changes direction but the blue arrow does not. The blue arrow is an eigenvector of this shear mapping because it doesn't change direction, and
since its length is unchanged, its eigenvalue is 1.
Two dimensional example
Consider the transformation matrix A, given by
A =
| 2  1 |
| 1  2 |
The eigenvectors v of this transformation satisfy the equation,
Av = λv
Rearrange this equation to obtain
(A-λI)v=0
which has a nonzero solution v only when the determinant |A − λI| equals zero. Setting the determinant to zero gives the polynomial equation
λ² − 4λ + 3 = (λ − 1)(λ − 3) = 0,
known as the characteristic polynomial of the matrix A. In this case, it has the roots λ = 1 and λ = 3.
For λ = 1, the equation (A − λI)v = 0 becomes
v1 + v2 = 0,
which has the solution
v = (1, −1)ᵀ.
For λ = 3, the equation becomes
−w1 + w2 = 0,
which has the solution
w = (1, 1)ᵀ.
Thus, the vectors v and w are eigenvectors of A associated with the eigenvalues λ = 1 and λ = 3, respectively.
The figure shows the effect of this transformation on point coordinates in the plane.
The transformation matrix A preserves the direction of vectors parallel to v = (1, −1)ᵀ (in purple) and w = (1, 1)ᵀ (in blue). The vectors in
red are not parallel to either eigenvector, so their directions are changed by the transformation.
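The two-dimensional example can be verified numerically. The sketch below assumes A has entries [[2, 1], [1, 2]], which is consistent with the stated eigenvalues λ = 1, 3 and eigenvectors (1, −1)ᵀ, (1, 1)ᵀ:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and unit-norm eigenvector columns
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))  # [1. 3.]

# Each eigenvector column satisfies A v = lambda v
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```

Note that NumPy normalizes the eigenvectors to unit length, so they appear as scalar multiples of (1, −1)ᵀ and (1, 1)ᵀ.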
Three dimensional example
The eigenvectors v of a 3×3 matrix A satisfy the equation
Av = λv.
This equation has nonzero solutions only if the determinant |A − λI| equals zero, which yields the characteristic polynomial, here
(λ − 1)(λ − 2)(λ − 3) = 0,
with the roots λ = 1, λ = 2 and λ = 3.
Associated with the roots λ = 1, λ = 2 and λ = 3 are the respective eigenvectors,
Diagonal matrices
Matrices with nonzero entries only along the main diagonal are called diagonal matrices. It is easy to see that the eigenvalues of a diagonal matrix are the diagonal elements themselves. Consider the matrix A, given by
A =
| 1  0  0 |
| 0  2  0 |
| 0  0  3 |
The characteristic polynomial of A is given by
|A − λI| = (1 − λ)(2 − λ)(3 − λ),
which has the roots λ = 1, λ = 2 and λ = 3.
Associated with these roots are the eigenvectors
u1 = (1, 0, 0)ᵀ, u2 = (0, 1, 0)ᵀ and u3 = (0, 0, 1)ᵀ,
respectively.
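A quick numerical check of this fact, assuming the diagonal matrix diag(1, 2, 3):

```python
import numpy as np

# For a diagonal matrix, the eigenvalues are the diagonal entries
A = np.diag([1.0, 2.0, 3.0])
vals, vecs = np.linalg.eig(A)
print(vals)  # [1. 2. 3.]

# The standard basis vectors are eigenvectors (up to sign)
print(np.allclose(np.abs(vecs), np.eye(3)))  # True
```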
Triangular matrices
The eigenvalues of a triangular matrix are the entries of its main diagonal, just as for diagonal matrices. Consider the lower triangular matrix A,
The characteristic polynomial of A is given by
|A − λI| = (1 − λ)(2 − λ)(3 − λ),
which has the roots λ = 1, λ = 2 and λ = 3. Associated with these roots are the eigenvectors,
respectively.
Properties
Let A be an n×n matrix of complex numbers with eigenvalues λ1, λ2, …, λn. Then:
The trace of A, defined as the sum of its diagonal elements, is also the sum of all eigenvalues:
tr(A) = λ1 + λ2 + … + λn.
The determinant of A is the product of all eigenvalues:
det(A) = λ1 λ2 ⋯ λn.
If A is invertible, then the eigenvalues of A⁻¹ are 1/λ1, 1/λ2, …, 1/λn. Clearly, the geometric multiplicities coincide. Moreover, since the characteristic polynomial of the inverse is the reciprocal polynomial of that of the original, the eigenvalues share the same algebraic multiplicities.
The eigenvalues of the kth power of A, i.e. the eigenvalues of Aᵏ for any positive integer k, are λ1ᵏ, λ2ᵏ, …, λnᵏ.
The matrix A is invertible if and only if all the eigenvalues λi are nonzero.
If A is Hermitian, then every eigenvalue is real. The same is true of any real symmetric matrix. If A is also positive definite, positive semidefinite, negative definite, or negative semidefinite, then every eigenvalue is positive, non-negative, negative, or non-positive, respectively.
If λ is an eigenvalue of A, then kλ is an eigenvalue of kA, where k is an arbitrary scalar.
If λ is an eigenvalue of A, then λ is also an eigenvalue of Aᵀ.
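Several of these properties can be spot-checked numerically. The sketch below uses a random symmetric 4×4 matrix (chosen so that all eigenvalues are real) and verifies the trace, determinant, inverse, and power properties with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T            # symmetric, so all eigenvalues are real
vals = np.linalg.eigvalsh(A)

# trace(A) = sum of eigenvalues, det(A) = product of eigenvalues
assert np.isclose(np.trace(A), vals.sum())
assert np.isclose(np.linalg.det(A), vals.prod())

# Eigenvalues of A^-1 are the reciprocals 1/lambda_i
inv_vals = np.linalg.eigvalsh(np.linalg.inv(A))
assert np.allclose(np.sort(inv_vals), np.sort(1 / vals))

# Eigenvalues of A^3 are lambda_i^3
cube_vals = np.linalg.eigvalsh(np.linalg.matrix_power(A, 3))
assert np.allclose(np.sort(cube_vals), np.sort(vals ** 3))
```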
Applications of Eigenvalues and Eigenvectors
Scaling
Matrix :
| k  0 |
| 0  k |
Characteristic polynomial : (λ − k)²
Eigenvalues : λ1 = λ2 = k
Algebraic multiplicity : 2
Geometric multiplicity : 2
Eigenvectors : All non-zero vectors
Unequal Scaling
Matrix :
| k1  0  |
| 0   k2 |
Characteristic polynomial : (λ − k1)(λ − k2)
Eigenvalues : λ1 = k1, λ2 = k2
Algebraic multiplicity : 1 for each eigenvalue
Geometric multiplicity : 1 for each eigenvalue
Eigenvectors : u1 = (1, 0)ᵀ, u2 = (0, 1)ᵀ
Rotation
Matrix :
| cos θ  −sin θ |
| sin θ   cos θ |
Characteristic polynomial : λ² − 2λ cos θ + 1
Eigenvalues : λ1 = e^(iθ), λ2 = e^(−iθ) (complex unless θ is a multiple of π)
Algebraic multiplicity : 1 for each eigenvalue
Geometric multiplicity : 1 for each eigenvalue
Eigenvectors : u1 = (1, −i)ᵀ, u2 = (1, i)ᵀ
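A plane rotation has no real eigenvectors unless the angle is a multiple of π; numerically the eigenvalues come out as the complex conjugate pair e^(±iθ). A short sketch, assuming the standard rotation matrix by angle θ:

```python
import numpy as np

theta = np.pi / 3  # 60-degree rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals = np.linalg.eigvals(R)

# The eigenvalues are the complex conjugate pair e^{+i theta}, e^{-i theta}
expected = np.array([np.exp(1j * theta), np.exp(-1j * theta)])
assert np.allclose(np.sort_complex(vals), np.sort_complex(expected))

# Both eigenvalues lie on the unit circle: |lambda| = 1
assert np.allclose(np.abs(vals), 1.0)
```

A rotation preserves lengths, which is why every eigenvalue has absolute value 1.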
Hyperbolic Rotation
Matrix :
| cosh φ  sinh φ |
| sinh φ  cosh φ |
Characteristic polynomial : λ² − 2λ cosh φ + 1
Eigenvalues : λ1 = e^φ, λ2 = e^(−φ)
Algebraic multiplicity : 1 for each eigenvalue
Geometric multiplicity : 1 for each eigenvalue
Eigenvectors : u1 = (1, 1)ᵀ, u2 = (1, −1)ᵀ
Horizontal Shear
Matrix :
| 1  k |
| 0  1 |
Characteristic polynomial : (λ − 1)²
Eigenvalues : λ1 = λ2 = 1
Algebraic multiplicity : 2
Geometric multiplicity : 1
Eigenvectors : u1 = (1, 0)ᵀ
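The shear is the interesting case: its only eigenvalue is 1 with algebraic multiplicity 2, but the eigenspace is one-dimensional, so the matrix is not diagonalizable. A sketch assuming shear factor k = 1:

```python
import numpy as np

# Horizontal shear with factor k = 1: eigenvalue 1 with algebraic
# multiplicity 2, but only a one-dimensional eigenspace
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

vals = np.linalg.eigvals(S)
print(vals)  # [1. 1.]

# Geometric multiplicity = dim null(S - I) = 2 - rank(S - I)
geo_mult = 2 - np.linalg.matrix_rank(S - np.eye(2))
print(geo_mult)  # 1
```

Only vectors along the x-axis (multiples of (1, 0)ᵀ) are left unchanged, matching the single eigenvector listed above.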
Thank You