
  • Linear Algebra & Geometry: why is linear algebra useful in computer vision?

    Some of the slides in this lecture are courtesy of Prof. Octavia I. Camps, Penn State University

    References: any book on linear algebra! [HZ], chapters 2 and 4

    Edited by Yilin Yang and Liang Huang: removed SVD, added eigenvector and covariance matrix links.

  • Vectors (i.e., 2D vectors)

    A 2D vector v = (x1, x2) points from the origin to a point P, at angle θ from the x1-axis.

    Magnitude: ||v|| = √(x1² + x2²)

    Orientation: θ = tan⁻¹(x2 / x1)

    v / ||v|| is a unit vector

    If ||v|| = 1, v is a UNIT vector
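The magnitude, orientation, and normalization formulas above can be checked numerically. A minimal NumPy sketch, with an arbitrary example vector:

```python
import numpy as np

# Example 2D vector v = (x1, x2); the values are arbitrary.
v = np.array([3.0, 4.0])

magnitude = np.linalg.norm(v)         # ||v|| = sqrt(x1^2 + x2^2)
orientation = np.arctan2(v[1], v[0])  # theta; arctan2 handles x1 = 0 safely

unit = v / magnitude                  # v / ||v|| is a unit vector
print(magnitude)                      # 5.0
print(np.isclose(np.linalg.norm(unit), 1.0))  # True
```

Note `arctan2` is preferred over `tan⁻¹(x2/x1)` in code, since it avoids division by zero and picks the correct quadrant.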

  • Vector Addition

    v + w = (x1, x2) + (y1, y2) = (x1 + y1, x2 + y2)

  • Vector Subtraction

    v − w = (x1, x2) − (y1, y2) = (x1 − y1, x2 − y2)

  • Scalar Product

    av = a(x1, x2) = (a·x1, a·x2)

  • Inner (dot) Product

    v · w = (x1, x2) · (y1, y2) = x1·y1 + x2·y2 = ||v|| ||w|| cos α

    where α is the angle between v and w.

    The inner product is a SCALAR!
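Both forms of the inner product, component-wise and via the angle α, can be verified with a quick NumPy sketch (example vectors assumed):

```python
import numpy as np

v = np.array([1.0, 0.0])  # example vectors, chosen so the angle is 45 degrees
w = np.array([1.0, 1.0])

dot = np.dot(v, w)        # x1*y1 + x2*y2 -- a scalar, not a vector

# Recover the angle alpha from v . w = ||v|| ||w|| cos(alpha)
cos_alpha = dot / (np.linalg.norm(v) * np.linalg.norm(w))
alpha = np.arccos(cos_alpha)

print(dot)                          # 1.0
print(round(np.degrees(alpha), 6))  # 45.0
```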

  • Orthonormal Basis

    The standard basis vectors i = (1, 0) and j = (0, 1) are orthonormal: each has unit length and i · j = 0.

    Any 2D vector decomposes as v = (x1, x2) = x1·i + x2·j

  • Matrices

    An n × m matrix is a rectangular array of numbers:

    A_{n×m} = [ a11 a12 ... a1m
                a21 a22 ... a2m
                ...
                an1 an2 ... anm ]

    Example: a grayscale image is a matrix whose entries are the pixels' intensity values.

    Sum: C = A + B with cij = aij + bij

    A and B must have the same dimensions!

  • Matrices

    Product: C_{n×p} = A_{n×m} B_{m×p} with cij = Σ_k aik bkj

    A and B must have compatible dimensions: the number of columns of A must equal the number of rows of B!
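The product rule cij = Σ_k aik bkj and the dimension constraint can be illustrated with small example matrices (values arbitrary):

```python
import numpy as np

# A is 2 x 3, B is 3 x 2 -> compatible: columns of A == rows of B.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])

C = A @ B                 # c_ij = sum over k of a_ik * b_kj
print(C.shape)            # (2, 2) -- an n x p result
print(C)                  # [[ 4.  5.] [10. 11.]]

# B @ A is also defined here (3 x 3), but A @ A would raise an error:
# a 2 x 3 times a 2 x 3 has incompatible inner dimensions.
```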

  • Matrix Inverse

    Does not exist for all matrices; it is necessary (but not sufficient) that the matrix is square.

    A A⁻¹ = A⁻¹ A = I

    For a 2 × 2 matrix:

    A⁻¹ = [a11 a12; a21 a22]⁻¹ = (1 / det A) [a22 −a12; −a21 a11],  det A ≠ 0

    If det A = 0, A does not have an inverse.
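The closed-form 2 × 2 inverse above can be checked against NumPy's general-purpose inverse; a small sketch with an example matrix:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])   # example 2 x 2 matrix; det A = 1, so invertible

det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
assert det != 0, "det A = 0: A has no inverse"

# Closed-form 2 x 2 inverse: (1 / det A) * [a22 -a12; -a21 a11]
A_inv = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                [-A[1, 0],  A[0, 0]]])

print(np.allclose(A @ A_inv, np.eye(2)))      # True: A A^-1 = I
print(np.allclose(A_inv, np.linalg.inv(A)))   # True: matches np.linalg.inv
```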

  • Matrix Determinant

    A useful value computed from the elements of a square matrix A:

    det [a11] = a11

    det [a11 a12; a21 a22] = a11 a22 − a12 a21

    det [a11 a12 a13; a21 a22 a23; a31 a32 a33]
      = a11 a22 a33 + a12 a23 a31 + a13 a21 a32
      − a13 a22 a31 − a23 a32 a11 − a33 a12 a21
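The 3 × 3 expansion above can be transcribed term by term and compared with NumPy's determinant; a sketch with an arbitrary example matrix:

```python
import numpy as np

M = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 10.]])  # example 3 x 3 matrix

a = M  # alias so the indices below mirror the a_ij notation (0-based)
det3 = (a[0,0]*a[1,1]*a[2,2] + a[0,1]*a[1,2]*a[2,0] + a[0,2]*a[1,0]*a[2,1]
      - a[0,2]*a[1,1]*a[2,0] - a[1,2]*a[2,1]*a[0,0] - a[2,2]*a[0,1]*a[1,0])

print(det3)                                 # -3.0
print(np.isclose(det3, np.linalg.det(M)))   # True
```

This explicit expansion (Sarrus' rule) only works for 3 × 3; for larger matrices use `np.linalg.det`.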

  • Matrix Transpose

    Definition: C_{m×n} = Aᵀ_{n×m} with cij = aji

    Identities:

    (A + B)ᵀ = Aᵀ + Bᵀ

    (AB)ᵀ = Bᵀ Aᵀ

    If A = Aᵀ, then A is symmetric
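The transpose identities can be spot-checked on random matrices; a sketch (shapes chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 2))
C = rng.standard_normal((2, 3))

# (A + C)^T = A^T + C^T  (same-shape matrices)
print(np.allclose((A + C).T, A.T + C.T))   # True

# (AB)^T = B^T A^T  -- note the order reverses
print(np.allclose((A @ B).T, B.T @ A.T))   # True

# A A^T is always symmetric: S = S^T
S = A @ A.T
print(np.allclose(S, S.T))                 # True
```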

  • 2D Geometrical Transformations

  • 2D Translation

    Translating point P by a vector t gives the point P’.

  • 2D Translation Equation

    With P = (x, y) and t = (tx, ty):

    P’ = P + t = (x + tx, y + ty)

  • 2D Translation using Matrices

    In homogeneous coordinates, P = (x, y, 1)ᵀ and translation becomes a matrix product:

    P’ = [1 0 tx; 0 1 ty; 0 0 1] · (x, y, 1)ᵀ = (x + tx, y + ty, 1)ᵀ

  • Scaling

    Scaling point P by factors (sx, sy) gives the point P’.

  • Scaling Equation

    With P = (x, y), scaling by sx along x and sy along y:

    P’ = (sx·x, sy·y), i.e. P’ = S·P with S = [sx 0; 0 sy]

  • Scaling & Translating

    First scale, then translate:

    P’ = S·P,  P’’ = T·P’

    P’’ = T·P’ = T·(S·P) = (T·S)·P = A·P

  • Scaling & Translating

    In homogeneous coordinates, the composite is a single matrix: A = T·S = [sx 0 tx; 0 sy ty; 0 0 1]
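The composition P’’ = (T·S)·P can be carried out concretely with homogeneous 3 × 3 matrices; a sketch with example scale and translation values:

```python
import numpy as np

sx, sy = 2.0, 3.0    # example scale factors
tx, ty = 1.0, -1.0   # example translation

S = np.array([[sx, 0., 0.],   # scaling in homogeneous form
              [0., sy, 0.],
              [0., 0., 1.]])
T = np.array([[1., 0., tx],   # translation in homogeneous form
              [0., 1., ty],
              [0., 0., 1.]])

A = T @ S                     # composite: scale first, then translate
P = np.array([1.0, 1.0, 1.0]) # the point (1, 1) in homogeneous coordinates

P2 = A @ P
print(P2)                     # [3. 2. 1.]  i.e. (2*1 + 1, 3*1 - 1)
```

Note that order matters: S @ T would translate first and then scale the translation too, giving a different result.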

  • Rotation

    Rotating point P about the origin gives the point P’.

  • Rotation Equations

    Counter-clockwise rotation by an angle θ:

    x’ = x cos θ − y sin θ
    y’ = x sin θ + y cos θ

    P’ = R·P with R = [cos θ −sin θ; sin θ cos θ]
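The rotation matrix R can be built directly from the equations above; a sketch rotating a point by 90 degrees:

```python
import numpy as np

theta = np.pi / 2   # 90 degrees counter-clockwise

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

P = np.array([1.0, 0.0])
P_rot = R @ P
print(np.round(P_rot, 6))              # [0. 1.] -- the x axis maps to the y axis

# Rotation matrices are orthogonal: R^T R = I, so R^-1 = R^T
print(np.allclose(R.T @ R, np.eye(2))) # True
```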

  • Rotation + Scaling + Translation  P’ = (T R S) P

    If sx = sy, this is a similarity transformation!
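The full chain P’ = (T R S) P can be composed the same way, using homogeneous 3 × 3 forms of all three matrices; a sketch of a similarity transform (uniform scale, example values):

```python
import numpy as np

s = 2.0              # uniform scale: sx = sy = s, so this is a similarity
theta = np.pi / 2    # example rotation
tx, ty = 1.0, 0.0    # example translation

S = np.diag([s, s, 1.0])
R = np.array([[np.cos(theta), -np.sin(theta), 0.],
              [np.sin(theta),  np.cos(theta), 0.],
              [0.,             0.,            1.]])
T = np.array([[1., 0., tx],
              [0., 1., ty],
              [0., 0., 1.]])

M = T @ R @ S        # applied right to left: scale, then rotate, then translate
P = np.array([1.0, 0.0, 1.0])

# (1,0) -> scaled to (2,0) -> rotated to (0,2) -> translated to (1,2)
print(np.round(M @ P, 6))   # [1. 2. 1.]
```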

  • Eigenvalues and Eigenvectors

    An eigenvalue λ and eigenvector u of a square matrix A satisfy

    Au = λu

    Multiplying u by A scales u by λ.

    Please see geometric demos at: http://www.sineofthetimes.org/eigenvectors-of-2-x-2-matrices-a-geometric-exploration/

    See also geometry of covariance matrix: http://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/

  • Eigenvalues and Eigenvectors

    Rearranging the previous equation gives the system

    Au − λu = (A − λI)u = 0

    which has a nonzero solution if and only if det(A − λI) = 0.

    The eigenvalues are the roots of this determinant, which is a polynomial in λ (the characteristic polynomial).

    Substitute each resulting eigenvalue back into Au = λu and solve to obtain the corresponding eigenvector.
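In practice the root-finding and back-substitution are done numerically; a sketch using NumPy's eigendecomposition on an example matrix whose eigenvalues are easy to read off:

```python
import numpy as np

A = np.array([[2., 0.],
              [0., 3.]])   # diagonal example: eigenvalues are 2 and 3

# np.linalg.eig returns the roots of det(A - lambda*I) = 0 and, for each,
# a solution u of (A - lambda*I) u = 0 (as the columns of eigvecs).
eigvals, eigvecs = np.linalg.eig(A)

for lam, u in zip(eigvals, eigvecs.T):   # iterate over (eigenvalue, column)
    print(np.allclose(A @ u, lam * u))   # True: A u = lambda u

print(sorted(eigvals.tolist()))          # [2.0, 3.0]
```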
