
  • Linear Algebra Review (CS479/679 Pattern Recognition, Dr. George Bebis)

  • n-dimensional Vector: An n-dimensional vector v is denoted as follows:

    v = (x1, x2, . . . , xn)T   (a column vector with components x1, . . . , xn)

    The transpose vT is denoted as follows:

    vT = (x1, x2, . . . , xn)   (the corresponding row vector)

  • Inner (or dot) product: Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn), their dot product is defined as follows:

    v . w = vT w = x1 y1 + x2 y2 + . . . + xn yn   (a scalar)
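    Not part of the original slides: a minimal numpy sketch of the dot product above, using arbitrary illustrative vectors.

        import numpy as np

        v = np.array([1.0, 2.0, 3.0])
        w = np.array([4.0, 5.0, 6.0])

        # inner (dot) product: sum of element-wise products, a scalar
        print(np.dot(v, w))   # 1*4 + 2*5 + 3*6 = 32.0
        print(v @ w)          # same value via the @ operator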

  • Orthogonal / Orthonormal vectors: A set of vectors x1, x2, . . . , xn is orthogonal if

    xiT xj = 0 for all i ≠ j

    A set of vectors x1, x2, . . . , xn is orthonormal if

    xiT xj = 0 for i ≠ j  and  xiT xi = 1 for all i
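    A small numerical check (not from the slides) that the standard unit vectors of R3 are orthonormal: distinct pairs dot to 0 and each vector dots with itself to 1.

        import numpy as np

        i = np.array([1.0, 0.0, 0.0])
        j = np.array([0.0, 1.0, 0.0])
        k = np.array([0.0, 0.0, 1.0])

        # orthogonal: dot products of distinct vectors are zero
        print(i @ j, j @ k, i @ k)   # 0.0 0.0 0.0
        # orthonormal: each vector additionally has unit length
        print(i @ i, j @ j, k @ k)   # 1.0 1.0 1.0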

  • Linear combinations: A vector v is a linear combination of the vectors v1, ..., vk if:

    v = c1 v1 + c2 v2 + . . . + ck vk

    where c1, ..., ck are constants.

    Example: vectors in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1)
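    As an illustrative sketch (not from the slides), the vector (2, -1, 3) written as a linear combination of i, j, and k with constants c1 = 2, c2 = -1, c3 = 3:

        import numpy as np

        i = np.array([1.0, 0.0, 0.0])
        j = np.array([0.0, 1.0, 0.0])
        k = np.array([0.0, 0.0, 1.0])

        # v = c1*i + c2*j + c3*k
        v = 2 * i + (-1) * j + 3 * k
        print(v)   # [ 2. -1.  3.]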

  • Space spanning: A set of vectors S = (v1, v2, . . . , vk) spans some space W if every vector in W can be written as a linear combination of the vectors in S

    - The unit vectors i, j, and k span R3


  • Linear dependence: A set of vectors v1, ..., vk is linearly dependent if at least one of them is a linear combination of the others:

    vj = c1 v1 + . . . + cj-1 vj-1 + cj+1 vj+1 + . . . + ck vk

    (i.e., vj does not appear on the right side)

  • Linear independence: A set of vectors v1, ..., vk is linearly independent if no vector can be represented as a linear combination of the remaining vectors, i.e.:

    c1 v1 + c2 v2 + . . . + ck vk = 0  only if  c1 = c2 = . . . = ck = 0

    Example:
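    The slide's example is not reproduced here; as a substitute sketch, numpy's matrix_rank can test independence numerically: stack the vectors as columns and compare the rank with the number of vectors (the vectors below are my own illustrative choices).

        import numpy as np

        v1 = np.array([1.0, 0.0, 2.0])
        v2 = np.array([0.0, 1.0, 1.0])
        v3 = v1 + v2                      # a linear combination of v1 and v2

        A = np.column_stack([v1, v2, v3])
        print(np.linalg.matrix_rank(A))   # 2 < 3: the set is linearly dependent

        B = np.column_stack([v1, v2, np.array([0.0, 0.0, 1.0])])
        print(np.linalg.matrix_rank(B))   # 3 = 3: the set is linearly independent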

  • Vector basis: A set of vectors (v1, ..., vk) forms a basis in some vector space W if:

    (1) (v1, ..., vk) are linearly independent

    (2) (v1, ..., vk) span W

    Standard bases:

    R2: i = (1, 0), j = (0, 1)

    R3: i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1)

    Rn: e1 = (1, 0, . . . , 0), e2 = (0, 1, . . . , 0), . . . , en = (0, 0, . . . , 1)

  • Matrix Operations:

    Matrix addition/subtraction: matrices must be of the same size.

    Matrix multiplication: an m x n matrix A can multiply a q x p matrix B only if the condition n = q holds; the product AB is then m x p.
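    A quick numpy illustration (not from the slides) of the size condition: a 2 x 3 matrix times a 3 x 2 matrix gives a 2 x 2 result, while mismatched inner dimensions are rejected.

        import numpy as np

        A = np.array([[1.0, 2.0, 3.0],
                      [4.0, 5.0, 6.0]])   # 2 x 3
        B = np.array([[1.0, 0.0],
                      [0.0, 1.0],
                      [1.0, 1.0]])        # 3 x 2

        C = A @ B                         # inner dimensions match (3 = 3)
        print(C.shape)                    # (2, 2)

        # A @ A would raise a ValueError: (2 x 3) times (2 x 3) has n != q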

  • Identity Matrix: the n x n matrix I with 1s on the diagonal and 0s everywhere else; AI = IA = A for any compatible A.

  • Matrix Transpose: the transpose AT of an m x n matrix A is the n x m matrix with entries (AT)ij = Aji; note that (AB)T = BT AT.

  • Symmetric Matrices: a matrix A is symmetric if A = AT, i.e., aij = aji for all i, j.

  • Determinants:

    2 x 2: det(A) = a11 a22 - a12 a21

    3 x 3: det(A) = a11(a22 a33 - a23 a32) - a12(a21 a33 - a23 a31) + a13(a21 a32 - a22 a31)

    n x n: det(A) is computed by cofactor expansion along a row or column, using determinants of (n-1) x (n-1) sub-matrices.

    Properties: det(AB) = det(A) det(B), det(AT) = det(A)
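    Not from the slides: a numpy check of the 2 x 2 formula and of the product property det(AB) = det(A) det(B), using small illustrative matrices.

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])
        B = np.array([[2.0, 0.0],
                      [1.0, 1.0]])

        # 2 x 2 formula: 1*4 - 2*3 = -2
        print(np.linalg.det(A))   # -2.0 (up to floating-point error)

        # det(AB) = det(A) * det(B)
        print(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))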

  • Matrix Inverse: The inverse A-1 of a matrix A has the property: AA-1 = A-1A = I

    A-1 exists only if det(A) ≠ 0

    Terminology:

    Singular matrix: A-1 does not exist

    Ill-conditioned matrix: A is close to being singular

  • Matrix Inverse (contd): Properties of the inverse:

    (A-1)-1 = A

    (AB)-1 = B-1 A-1

    (AT)-1 = (A-1)T
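    A numpy sketch (not from the slides): compute an inverse, verify AA-1 = I, and note that a matrix with zero determinant is singular.

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 1.0]])        # det(A) = 1, so A is invertible
        A_inv = np.linalg.inv(A)
        print(np.allclose(A @ A_inv, np.eye(2)))   # True

        S = np.array([[1.0, 2.0],
                      [2.0, 4.0]])        # det(S) = 0: singular
        print(np.linalg.det(S))           # 0.0
        # np.linalg.inv(S) would raise numpy.linalg.LinAlgError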

  • Matrix trace: tr(A) is the sum of the diagonal elements of A. Properties:

    tr(A + B) = tr(A) + tr(B)

    tr(AB) = tr(BA)

    tr(A) = tr(AT)
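    Not from the slides: a short numpy check of the trace and of tr(AB) = tr(BA).

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])
        B = np.array([[0.0, 1.0],
                      [1.0, 0.0]])

        print(np.trace(A))                        # 1 + 4 = 5.0
        print(np.trace(A @ B), np.trace(B @ A))   # equal, as expected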

  • Rank of matrix: equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant.

    Example: a matrix with a 3 x 3 sub-matrix of non-zero determinant, but no larger such sub-matrix, has rank 3

  • Rank of matrix (contd): Alternative definition: the maximum number of linearly independent columns (or rows) of A.

    Example: if one column of a 4 x 4 matrix is a linear combination of the other columns, at most 3 columns are independent, i.e., the rank is not 4!

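    As a numerical sketch (not from the slides), numpy's matrix_rank counts the linearly independent columns; in the illustrative matrix below the fourth column is the sum of the first two, so the rank is 3 rather than 4.

        import numpy as np

        A = np.array([[1.0, 0.0, 0.0, 1.0],
                      [0.0, 1.0, 0.0, 1.0],
                      [0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 0.0]])

        # column 4 = column 1 + column 2, so only 3 columns are independent
        print(np.linalg.matrix_rank(A))   # 3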

  • Eigenvalues and Eigenvectors: The vector v is an eigenvector of matrix A, and λ is an eigenvalue of A, if:

    Av = λv   (assume non-zero v)

    i.e., the linear transformation implied by A cannot change the direction of an eigenvector v, only its magnitude.

  • Computing λ and v: To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial:

    det(A - λI) = 0

    For each eigenvalue λ, solve (A - λI)v = 0 to find the corresponding eigenvector v.

    Example:
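    The slide's worked example is not reproduced here; instead, a numpy sketch with a 2 x 2 matrix of my own choosing shows the same computation.

        import numpy as np

        A = np.array([[4.0, 1.0],
                      [2.0, 3.0]])

        # roots of det(A - lambda*I) = lambda^2 - 7*lambda + 10 are 5 and 2
        eigvals, eigvecs = np.linalg.eig(A)
        print(eigvals)                    # [5. 2.] (order may vary)

        # check Av = lambda*v for the first eigenvector
        v = eigvecs[:, 0]
        print(np.allclose(A @ v, eigvals[0] * v))   # True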

  • Properties:

    Eigenvalues and eigenvectors are only defined for square matrices (i.e., m = n)

    Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv for any k ≠ 0)

    Suppose λ1, λ2, ..., λn are the eigenvalues of A; then:

    det(A) = λ1 λ2 . . . λn   and   tr(A) = λ1 + λ2 + . . . + λn
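    A quick numpy check (not from the slides) that the determinant equals the product of the eigenvalues and the trace equals their sum, for the same illustrative matrix as above.

        import numpy as np

        A = np.array([[4.0, 1.0],
                      [2.0, 3.0]])
        eigvals = np.linalg.eigvals(A)

        print(np.prod(eigvals), np.linalg.det(A))   # both 10.0
        print(np.sum(eigvals), np.trace(A))         # both 7.0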

  • Matrix diagonalization: Given an n x n matrix A, find P such that:

    P-1AP = Λ, where Λ is diagonal

    Take P = [v1 v2 . . . vn], where v1, v2, . . . , vn are the eigenvectors of A; then Λ = diag(λ1, λ2, . . . , λn), the diagonal matrix of the corresponding eigenvalues.


  • This is possible only if P-1 exists (i.e., A must have n linearly independent eigenvectors, so that rank(P) = n)

    If A is diagonalizable, then the corresponding eigenvectors v1,v2 ,. . . vn form a basis in Rn

    Are all n x n matrices diagonalizable (i.e., can we always find such a P)? No: only those with n linearly independent eigenvectors.
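    A minimal numpy sketch of diagonalization (not from the slides): build P from the eigenvectors of an illustrative matrix and check that P-1AP is diagonal with the eigenvalues on the diagonal.

        import numpy as np

        A = np.array([[4.0, 1.0],
                      [2.0, 3.0]])

        eigvals, P = np.linalg.eig(A)     # columns of P are eigenvectors of A
        Lam = np.linalg.inv(P) @ A @ P    # P^-1 A P

        print(np.allclose(Lam, np.diag(eigvals)))   # True: Lam = diag(5, 2)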

  • Matrix decomposition: Let us assume that A is diagonalizable; then A can be decomposed as follows:

    A = PΛP-1

  • Special case: symmetric matrices

    The eigenvalues of a symmetric matrix are real and its eigenvectors are orthogonal. Choosing orthonormal eigenvectors makes P an orthogonal matrix, so P-1 = PT and

    A = PDPT

    where D is the diagonal matrix of eigenvalues.
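    Not from the slides: numpy's eigh is designed for symmetric matrices and returns real eigenvalues with orthonormal eigenvectors, so P-1 = PT and A = PDPT can be checked directly (the symmetric matrix below is an arbitrary example).

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 2.0]])        # symmetric: A equals A.T

        d, P = np.linalg.eigh(A)          # real eigenvalues, orthonormal eigenvectors
        D = np.diag(d)

        print(np.allclose(P.T, np.linalg.inv(P)))   # True: P^-1 = P^T
        print(np.allclose(A, P @ D @ P.T))          # True: A = P D P^T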
