Linear Algebra & Geometry - Stanford University


  • Linear Algebra & Geometry: why is linear algebra useful in computer vision?

    Some of the slides in this lecture are courtesy of Prof. Octavia I. Camps, Penn State University.

    References: any book on linear algebra; [HZ] chapters 2 and 4.

  • Why is linear algebra useful in computer vision?

    Representation: 3D points in the scene, 2D points in the image.

    Coordinates will be used to perform geometrical transformations and to associate 3D with 2D points.

    Images are matrices of numbers: find properties of these numbers.

  • Agenda

    1. Basic definitions and properties
    2. Geometrical transformations
    3. SVD and its applications

  • Vectors (i.e., 2D or 3D vectors)

    A point in the 3D world is written as P = [x, y, z]; its projection in the image is p = [x, y].

  • Vectors (i.e., 2D vectors)

    A 2D vector v = (x1, x2) points from the origin to P.

    Magnitude: $\|v\| = \sqrt{x_1^2 + x_2^2}$

    Orientation: $\theta = \tan^{-1}(x_2 / x_1)$

    $\frac{v}{\|v\|}$ is a unit vector.

    If $\|v\| = 1$, v is a UNIT vector.
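    As a quick numerical check (a NumPy sketch, not part of the original slides), here are the magnitude, orientation, and normalization of a 2D vector:

```python
import numpy as np

v = np.array([3.0, 4.0])
magnitude = np.linalg.norm(v)          # sqrt(3^2 + 4^2) = 5.0
orientation = np.arctan2(v[1], v[0])   # angle with the x1-axis, in radians
unit = v / magnitude                   # unit vector in the direction of v
print(magnitude, orientation, np.linalg.norm(unit))  # 5.0 0.927... 1.0
```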

  • Vector Addition

    $v + w = (x_1, x_2) + (y_1, y_2) = (x_1 + y_1,\; x_2 + y_2)$

  • Vector Subtraction

    $v - w = (x_1, x_2) - (y_1, y_2) = (x_1 - y_1,\; x_2 - y_2)$

  • Scalar Product

    $a v = a (x_1, x_2) = (a x_1,\; a x_2)$

  • Inner (dot) Product

    $v \cdot w = (x_1, x_2) \cdot (y_1, y_2) = x_1 y_1 + x_2 y_2 = \|v\|\,\|w\| \cos\theta$

    The inner product is a SCALAR!

    $v \cdot w = 0 \iff v \perp w$
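    A minimal NumPy sketch (not from the slides) of the dot product and the angle it encodes:

```python
import numpy as np

v = np.array([1.0, 0.0])
w = np.array([1.0, 1.0])
dot = v @ w                                   # x1*y1 + x2*y2 -> a scalar
cos_theta = dot / (np.linalg.norm(v) * np.linalg.norm(w))
print(dot, np.degrees(np.arccos(cos_theta)))  # 1.0 45.0
```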

  • Orthonormal Basis

    The unit vectors i = (1, 0) and j = (0, 1) are orthogonal and of unit length: an orthonormal basis.

    Any vector v = (x1, x2) can be written as $v = x_1 i + x_2 j$.

  • Vector (cross) Product

    $u = v \times w$

    The cross product is a VECTOR!

    Magnitude: $\|u\| = \|v \times w\| = \|v\|\,\|w\| \sin\theta$

    Orientation: u is perpendicular to the plane of v and w (right-hand rule).

  • Vector Product Computation

    $i = (1, 0, 0), \quad j = (0, 1, 0), \quad k = (0, 0, 1)$

    $\|i\| = 1, \quad \|j\| = 1, \quad \|k\| = 1$

    $i = j \times k, \quad j = k \times i, \quad k = i \times j$

    $x \times y = (x_2 y_3 - x_3 y_2)\, i + (x_3 y_1 - x_1 y_3)\, j + (x_1 y_2 - x_2 y_1)\, k$
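    The component formula can be checked against NumPy's built-in cross product; the vectors below are arbitrary illustrative values:

```python
import numpy as np

x = np.array([1.0, 0.0, 0.0])   # i
y = np.array([0.0, 1.0, 0.0])   # j
print(np.cross(x, y))           # i x j = k = [0. 0. 1.]

# The component formula from the slide, checked against np.cross:
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
manual = np.array([x[1]*y[2] - x[2]*y[1],
                   x[2]*y[0] - x[0]*y[2],
                   x[0]*y[1] - x[1]*y[0]])
assert np.allclose(manual, np.cross(x, y))
```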

  • Matrices

    $A_{n \times m} = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1m} \\ a_{21} & a_{22} & \dots & a_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots & a_{nm} \end{bmatrix}$

    An image is a matrix of pixel intensity values.

    Sum: $C_{n \times m} = A_{n \times m} + B_{n \times m}$, with $c_{ij} = a_{ij} + b_{ij}$

    A and B must have the same dimensions!

  • Matrices

    Product: $C_{n \times p} = A_{n \times m}\, B_{m \times p}$, with $c_{ij} = \sum_{k=1}^{m} a_{ik} b_{kj}$

    A and B must have compatible dimensions!
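    A short NumPy sketch (illustrative shapes, not from the slides) of dimension compatibility in the matrix product:

```python
import numpy as np

A = np.arange(6).reshape(2, 3)    # a 2x3 matrix
B = np.arange(12).reshape(3, 4)   # a 3x4 matrix
C = A @ B                         # inner dimensions match (3), so C is 2x4
print(C.shape)                    # (2, 4)
# B @ A would raise ValueError: (3x4)(2x3) has incompatible dimensions.
```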

  • Matrix Transpose

    Definition: $C_{m \times n} = A^T_{n \times m}$, with $c_{ij} = a_{ji}$

    Identities:

    $(A + B)^T = A^T + B^T$

    $(AB)^T = B^T A^T$

    If $A = A^T$, then A is symmetric.
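    These identities are easy to verify numerically; the random matrices below are placeholders:

```python
import numpy as np

A = np.random.rand(2, 3)
B = np.random.rand(2, 3)
assert np.allclose((A + B).T, A.T + B.T)   # (A + B)^T = A^T + B^T
C = np.random.rand(3, 4)
assert np.allclose((A @ C).T, C.T @ A.T)   # (AC)^T = C^T A^T
S = A @ A.T                                # A A^T is always symmetric
assert np.allclose(S, S.T)                 # S = S^T
```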

  • Matrix Determinant

    A useful value computed from the elements of a square matrix A.

    $\det [a_{11}] = a_{11}$

    $\det \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} = a_{11} a_{22} - a_{12} a_{21}$

    $\det \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = a_{11} a_{22} a_{33} + a_{12} a_{23} a_{31} + a_{13} a_{21} a_{32} - a_{13} a_{22} a_{31} - a_{23} a_{32} a_{11} - a_{33} a_{12} a_{21}$
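    A quick check of the 2x2 formula against NumPy's determinant (the example values are arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.linalg.det(A))                # -2.0 (up to floating-point error)
print(A[0,0]*A[1,1] - A[0,1]*A[1,0])   # same value by the 2x2 formula
```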

  • Matrix Inverse

    Does not exist for all matrices; it is necessary (but not sufficient) that the matrix be square.

    $A A^{-1} = A^{-1} A = I$

    $A^{-1} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}^{-1} = \frac{1}{\det A} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}, \quad \det A \neq 0$

    If $\det A = 0$, A does not have an inverse.
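    A minimal sketch of the inverse and the $\det A \neq 0$ condition, using NumPy (values are illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_inv = np.linalg.inv(A)                   # exists because det A = -2 != 0
print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A^-1 = I
# A singular matrix (det A = 0) has no inverse; np.linalg.inv raises
# LinAlgError for e.g. [[1, 2], [2, 4]].
```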

  • 2D Geometrical Transformations

  • 2D Translation

    The point P is moved to P' by adding the translation vector t.

  • 2D Translation Equation

    $P = (x, y), \quad t = (t_x, t_y)$

    $P' = P + t = (x + t_x,\; y + t_y)$

  • 2D Translation using Matrices

    $P' = \begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} x + t_x \\ y + t_y \end{bmatrix}$

  • Homogeneous Coordinates

    Multiply the coordinates by a non-zero scalar and add an extra coordinate equal to that scalar. For example, $(x, y) \rightarrow (\lambda x, \lambda y, \lambda)$ with $\lambda \neq 0$; taking $\lambda = 1$ gives $(x, y, 1)$.

  • Back to Cartesian Coordinates:

    Divide by the last coordinate and eliminate it. For example, $(x, y, 1) \rightarrow (x, y)$.

    NOTE: in our example the scalar was 1.

  • 2D Translation using Homogeneous Coordinates

    $P' = \begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} x + t_x \\ y + t_y \\ 1 \end{bmatrix} = T \cdot P$
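    A minimal NumPy sketch of this slide's pipeline: build T, apply it to a homogeneous point, and divide by the last coordinate (the point and offsets are arbitrary):

```python
import numpy as np

tx, ty = 2.0, 3.0
T = np.array([[1.0, 0.0, tx],
              [0.0, 1.0, ty],
              [0.0, 0.0, 1.0]])
P = np.array([5.0, 7.0, 1.0])     # (x, y) = (5, 7) in homogeneous coordinates
P_prime = T @ P
print(P_prime[:2] / P_prime[2])   # back to Cartesian: [ 7. 10.]
```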

  • Scaling

    The point P is moved to P' by scaling its coordinates.

  • Scaling Equation

    $P = (x, y) \rightarrow P' = (s_x x,\; s_y y)$

    $P' = \begin{bmatrix} s_x & 0 \\ 0 & s_y \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = S \cdot P$

  • Scaling & Translating

    $P' = S \cdot P$

    $P'' = T \cdot P' = T \cdot (S \cdot P) = (T \cdot S) \cdot P = A \cdot P$

  • Scaling & Translating

    $A = T \cdot S = \begin{bmatrix} s_x & 0 & t_x \\ 0 & s_y & t_y \\ 0 & 0 & 1 \end{bmatrix}$

  • Translating & Scaling = Scaling & Translating ?
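    In general, no: scaling after translating also scales the translation vector. A quick numerical check (assumed example values):

```python
import numpy as np

S = np.diag([2.0, 3.0, 1.0])       # scaling by (sx, sy) = (2, 3)
T = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])    # translation by (1, 1)
print(T @ S)   # scale then translate: translation part stays (1, 1)
print(S @ T)   # translate then scale: translation part becomes (2, 3)
print(np.allclose(T @ S, S @ T))   # False: order matters
```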

  • Rotation

    The point P is rotated about the origin to P'.

  • Rotation Equations

    Counter-clockwise rotation by an angle $\theta$:

    $\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$

    $P' = R \cdot P$

  • Degrees of Freedom

    R is 2x2, so it has 4 elements, but a rotation has only 1 degree of freedom: the angle $\theta$.

    Note: R belongs to the category of normal matrices and satisfies many interesting properties, e.g. $R R^T = R^T R = I$ and $\det R = 1$.

  • Rotation + Scaling + Translation

    $P' = (T \cdot R \cdot S) \cdot P$

    If $s_x = s_y$, this is a similarity transformation!
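    A minimal sketch (assumed parameter values) composing the three matrices in homogeneous coordinates:

```python
import numpy as np

theta, sx, sy, tx, ty = np.pi / 4, 2.0, 2.0, 1.0, -1.0
S = np.diag([sx, sy, 1.0])
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([[1.0, 0.0, tx],
              [0.0, 1.0, ty],
              [0.0, 0.0, 1.0]])
A = T @ R @ S                  # a similarity transformation since sx == sy
P = np.array([1.0, 0.0, 1.0])
print(A @ P)                   # scale, rotate, then translate the point
```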

  • Transformation in 2D

    - Isometries
    - Similarities
    - Affinities
    - Projective

  • Transformation in 2D

    Isometries [Euclidean]:

    - Preserve distances (and areas)
    - 3 DOF
    - Regulate motion of rigid objects

  • Transformation in 2D

    Similarities:

    - Preserve ratios of lengths and angles
    - 4 DOF


  • Transformation in 2D

    Affinities:

    - Preserve: parallel lines, ratios of areas, ratios of lengths on collinear lines, and others
    - 6 DOF

  • Transformation in 2D

    Projective:

    - 8 DOF
    - Preserve: the cross ratio of 4 collinear points, collinearity, and a few others
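    A minimal sketch of applying a projective transformation: the 3x3 matrix H below is an arbitrary made-up homography (8 DOF, since it is only defined up to scale), and the perspective divide maps the result back to Cartesian coordinates:

```python
import numpy as np

H = np.array([[1.0,   0.2,   3.0],
              [0.1,   1.0,   2.0],
              [0.001, 0.002, 1.0]])
p = np.array([10.0, 20.0, 1.0])   # point in homogeneous coordinates
q = H @ p
print(q[:2] / q[2])               # perspective divide back to Cartesian
```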

  • Eigenvalues and Eigenvectors

    An eigenvalue $\lambda$ and eigenvector u satisfy

    $A u = \lambda u$

    where A is a square matrix.

    Multiplying u by A scales u by $\lambda$.

  • Eigenvalues and Eigenvectors

    Rearranging the previous equation gives the system

    $A u - \lambda u = (A - \lambda I) u = 0$

    which has a non-trivial solution if and only if $\det(A - \lambda I) = 0$.

    The eigenvalues are the roots of this determinant, which is a polynomial in $\lambda$.

    Substitute each resulting eigenvalue back into $A u = \lambda u$ and solve to obtain the corresponding eigenvector.
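    A quick NumPy check of $A u = \lambda u$ (the matrix is an arbitrary diagonal example, so the eigenvalues are obvious):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors are the columns
for lam, u in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ u, lam * u)         # A u = lambda u
print(eigenvalues)                             # [2. 3.]
```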

  • Singular Value Decomposition

    $U \Sigma V^T = A$

    where U and V are orthogonal matrices, and $\Sigma$ is a diagonal matrix.


  • A Numerical Example

    Look at how the multiplication works out, left to right:

    Column 1 of U gets scaled by the first value from $\Sigma$.

    The resulting vector gets scaled by row 1 of $V^T$ to produce a contribution to the columns of A.

  • A Numerical Example

    Each product of (column i of U) (value i from $\Sigma$) (row i of $V^T$) produces a rank-1 component of A; summing these components rebuilds A.
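    A short NumPy sketch verifying that the rank-1 products sum back to A (a random matrix stands in for the slide's numerical example):

```python
import numpy as np

A = np.random.rand(4, 3)                         # stand-in for the slide's A
U, s, Vt = np.linalg.svd(A, full_matrices=False)
# Sum of (column i of U)(sigma_i)(row i of Vt), each a rank-1 matrix:
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(A, A_rebuilt)                 # the rank-1 terms rebuild A
```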

  • A Numerical Example

    We can look at $\Sigma$ to see that the first column has a large effect, while the second column has a much smaller effect in this example.

  • SVD Applications

    For this image, using only the first 10 of 300 singular values produces a recognizable reconstruction.

    So, SVD can be used for image compression.
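    A minimal compression sketch; the random 300x300 array below is only a stand-in for the image on the slide:

```python
import numpy as np

img = np.random.rand(300, 300)   # placeholder for a 300x300 grayscale image
U, s, Vt = np.linalg.svd(img, full_matrices=False)
k = 10                           # keep only the first 10 singular values
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
# Storage drops from 300*300 values to k*(300 + 300 + 1).
print(np.linalg.norm(img - approx) / np.linalg.norm(img))  # relative error
```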

  • Principal Component Analysis

    Remember, columns of U are the Principal Components of the data: the major patterns that can be added to produce the columns of the original matrix.

    One use of this is to construct a matrix where each column is a separate data sample.

    Run SVD on that matrix, and look at the first few columns of U to see patterns that are common among the columns.

    This is called Principal Component Analysis (or PCA) of the data samples.

  • Principal Component Analysis

    Often, raw data samples have a lot of redundancy and patterns.

    PCA can allow you to represent data samples as weights on the principal components, rather than using the original raw form of the data.

    By representing each sample as just those weights, you can represent just the "meat" of what's different between samples.

    This minimal representation makes machine learning and other algorithms much more efficient.
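    A minimal PCA-via-SVD sketch under the slide's convention that each column is one data sample (the data matrix here is a random placeholder; centering is a common preprocessing step, added as an assumption):

```python
import numpy as np

X = np.random.rand(100, 20)                     # 20 samples, dimension 100
X_centered = X - X.mean(axis=1, keepdims=True)  # subtract the mean sample
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)
components = U[:, :3]                           # first 3 principal components
weights = components.T @ X_centered             # each sample -> just 3 weights
print(weights.shape)                            # (3, 20)
```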
