Optical Flow, KLT Feature Tracker - Yonsei University (web.yonsei.ac.kr/hgjung/lectures/aue859/8...)


E-mail: hogijung@hanyang.ac.kr, http://web.yonsei.ac.kr/hgjung

Optical Flow, KLT Feature Tracker


Motion in Computer Vision

Motion

• Structure from motion
• Detection/segmentation with direction

[1]


Motion Field vs. Optical Flow [2], [3]

Motion Field: an ideal representation of 3D motion as it is projected onto a camera image.

Optical Flow: the approximation (or estimate) of the motion field which can be computed from time-varying image sequences, under the simplifying assumptions of 1) a Lambertian surface, 2) a pointwise light source at infinity, and 3) no photometric distortion.


Motion Field [2]

An ideal representation of 3D motion as it is projected onto a camera image: the time derivative of the image position of all image points, given that they correspond to fixed 3D points ("field": position → vector).

The motion field v is defined below, where P is a point in the scene, Z is the distance to that scene point, V is the relative motion between the camera and the scene, T is the translational component of the motion, and ω is the angular velocity of the motion.


3D point P = (X, Y, Z) and 2D point p = (x, y), with focal length f:

    p = f P / Z,   i.e.   x = f X / Z,   y = f Y / Z    (1)

The motion field v can be obtained by taking the time derivative of (1):

    v_x = (f V_X − x V_Z) / Z = f V_X / Z − f X V_Z / Z²
    v_y = (f V_Y − y V_Z) / Z = f V_Y / Z − f Y V_Z / Z²    (2)
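As a quick sanity check of (1)-(2), the sketch below (illustrative point, velocity, and focal-length values invented for this example, not from the slides) projects a moving 3D point at two nearby instants and compares the finite-difference image velocity against the closed-form motion field:

```python
import numpy as np

# Hypothetical point, velocity, and focal length chosen for illustration.
f = 1.0
P = np.array([0.5, -0.3, 4.0])   # 3D point (X, Y, Z)
V = np.array([0.2, 0.1, -0.4])   # its velocity (V_X, V_Y, V_Z)

def project(P):
    """Eq. (1): perspective projection p = f * P / Z."""
    return f * P[:2] / P[2]

# Finite-difference image velocity over a small time step.
dt = 1e-6
v_numeric = (project(P + V * dt) - project(P)) / dt

# Closed-form motion field, Eq. (2).
x, y = project(P)
v_closed = np.array([(f * V[0] - x * V[2]) / P[2],
                     (f * V[1] - y * V[2]) / P[2]])

assert np.allclose(v_numeric, v_closed, atol=1e-5)
```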


The motion V of the 3D point P relative to the camera is defined as

    V = −T − ω × P    (3)

or, componentwise,

    V_X = −T_X − ω_Y Z + ω_Z Y
    V_Y = −T_Y − ω_Z X + ω_X Z
    V_Z = −T_Z − ω_X Y + ω_Y X

By substituting (3) into (2), the basic equations of the motion field are acquired:

    v_x = (T_Z x − T_X f)/Z − ω_Y f + ω_Z y + ω_X x y / f − ω_Y x² / f
    v_y = (T_Z y − T_Y f)/Z + ω_X f − ω_Z x − ω_Y x y / f + ω_X y² / f    (4)


The motion field is the sum of two components, one of which depends on translation only, the other on rotation only.

    v_x = (T_Z x − T_X f)/Z − ω_Y f + ω_Z y + ω_X x y / f − ω_Y x² / f
    v_y = (T_Z y − T_Y f)/Z + ω_X f − ω_Z x − ω_Y x y / f + ω_X y² / f    (4)

Translational components:

    v_x = (T_Z x − T_X f)/Z
    v_y = (T_Z y − T_Y f)/Z    (5)

Rotational components:

    v_x = −ω_Y f + ω_Z y + ω_X x y / f − ω_Y x² / f
    v_y = ω_X f − ω_Z x − ω_Y x y / f + ω_X y² / f    (6)

Motion Field: Pure Translation [2]

If there is no rotational motion, the resulting motion field has a peculiar spatial structure. Regard (5) as a function of the 2D point position:

    v_x = (T_Z x − T_X f)/Z
    v_y = (T_Z y − T_Y f)/Z    (5)

If (x_0, y_0) is defined as

    x_0 = f T_X / T_Z
    y_0 = f T_Y / T_Z    (6)

then (5) becomes

    v_x = (x − x_0) T_Z / Z
    v_y = (y − y_0) T_Z / Z    (7)
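The radial structure in (7) can be checked numerically. In this sketch the translation, focal length, pixel positions, and depths are made-up illustration values; every vector of the pure-translation field (5) should be parallel to the ray from p_0:

```python
import numpy as np

f = 1.0
T = np.array([0.3, -0.2, 0.5])     # hypothetical translation, T_Z != 0
p0 = f * T[:2] / T[2]              # Eq. (6): FOE/FOC position

def v_translation(x, y, Z):
    """Eq. (5): pure-translation motion field at pixel (x, y), depth Z."""
    return np.array([(T[2] * x - T[0] * f) / Z,
                     (T[2] * y - T[1] * f) / Z])

for (x, y, Z) in [(0.1, 0.2, 2.0), (-0.4, 0.3, 5.0), (0.7, -0.6, 1.5)]:
    v = v_translation(x, y, Z)
    r = np.array([x, y]) - p0      # radial direction from p0
    # Eq. (7): v = (T_Z / Z) * (p - p0), so v and r are parallel
    # (their 2D cross product vanishes).
    assert abs(v[0] * r[1] - v[1] * r[0]) < 1e-12
    assert np.allclose(v, (T[2] / Z) * r)
```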


Equation (7) says that the motion field of a pure translation is radial. In particular, if T_Z < 0, the vectors point away from p_0 = (x_0, y_0), which is called the focus of expansion (FOE). If T_Z > 0, the motion field vectors point towards p_0, which is called the focus of contraction (FOC). If T_Z = 0, from (5), all the motion field vectors are parallel:

    v_x = (T_Z x − T_X f)/Z,   v_y = (T_Z y − T_Y f)/Z    (5)

    v_x = −f T_X / Z,   v_y = −f T_Y / Z    (8)

Motion Field: Motion Parallax [6]

Equation (8) says that the vector lengths are inversely proportional to the depth of the corresponding 3D points:

    v_x = −f T_X / Z,   v_y = −f T_Y / Z    (8)

This animation is an example of parallax. As the viewpoint moves side to side, the objects in the distance appear to move more slowly than the objects close to the camera [6].

http://upload.wikimedia.org/wikipedia/commons/a/ab/Parallax.gif

Motion Field: Motion Parallax [2]

If two 3D points project onto the same image point, that is, they are instantaneously coincident, the rotational components of their motion fields are identical. Notice that the motion vector V describes the camera motion.

    v_x = (T_Z x − T_X f)/Z − ω_Y f + ω_Z y + ω_X x y / f − ω_Y x² / f
    v_y = (T_Z y − T_Y f)/Z + ω_X f − ω_Z x − ω_Y x y / f + ω_X y² / f    (4)


The difference of the two points' motion fields depends only on the translational components, and it is radial with respect to the FOE (or FOC):

    Δv_x = v_x¹ − v_x² = (T_Z x − T_X f)(1/Z_1 − 1/Z_2) = (x − x_0) T_Z (1/Z_1 − 1/Z_2)
    Δv_y = v_y¹ − v_y² = (T_Z y − T_Y f)(1/Z_1 − 1/Z_2) = (y − y_0) T_Z (1/Z_1 − 1/Z_2)

Motion Parallax

The relative motion field of two instantaneously coincident points:

1. Does not depend on the rotational component of motion.
2. Points towards (or away from) the point p_0, the vanishing point of the translation direction.
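Both properties can be verified numerically with the full motion field (4). The translation, rotation rates, pixel position, and depths below are made-up illustration values:

```python
import numpy as np

f = 1.0
T = np.array([0.2, 0.1, 0.4])        # hypothetical translation
w = np.array([0.05, -0.03, 0.02])    # hypothetical rotation rates

def motion_field(x, y, Z, w):
    """Eq. (4): full motion field at pixel (x, y), depth Z, rotation w."""
    vx = (T[2]*x - T[0]*f)/Z - w[1]*f + w[2]*y + w[0]*x*y/f - w[1]*x*x/f
    vy = (T[2]*y - T[1]*f)/Z + w[0]*f - w[2]*x - w[1]*x*y/f + w[0]*y*y/f
    return np.array([vx, vy])

# Two instantaneously coincident points at different depths.
x, y, Z1, Z2 = 0.3, -0.2, 2.0, 6.0
dv = motion_field(x, y, Z1, w) - motion_field(x, y, Z2, w)

# 1. The relative field does not depend on the rotational component.
dv_no_rot = motion_field(x, y, Z1, 0 * w) - motion_field(x, y, Z2, 0 * w)
assert np.allclose(dv, dv_no_rot)

# 2. It is radial w.r.t. p0: zero 2D cross product with (p - p0).
p0 = f * T[:2] / T[2]
r = np.array([x, y]) - p0
assert abs(dv[0] * r[1] - dv[1] * r[0]) < 1e-12
```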

Motion Field: Pure Rotation w.r.t. Y-axis [7]

If there is no translational motion and no rotation about the X- and Z-axes, then from (4):

    v_x = −ω_Y (f + x² / f)
    v_y = −ω_Y x y / f


[Figure] Translational vs. rotational motion: in one case the distance to the point, Z, is constant; in the other, Z changes (Z_1 → Z_2) and, according to Z, y changes too.

Estimation of the Optical Flow [4]

The Image Brightness Constancy Equation [2]

    (∇I)ᵀ v = −I_t,   i.e.   I_x v_x + I_y v_y + I_t = 0

Assumption: the image brightness is continuous and differentiable as many times as needed, in both the spatial and the temporal domain. The image brightness can be regarded as a plane within a small area.
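A minimal sketch of the constancy equation in action, using a synthetic translating pattern with made-up flow values (not from the slides); the residual should vanish up to finite-difference error:

```python
import numpy as np

vx, vy = 0.3, -0.2                      # hypothetical true flow
yy, xx = np.mgrid[0:64, 0:64].astype(float)

def frame(t):
    """A Gaussian blob whose center drifts with velocity (vx, vy)."""
    cx, cy = 32 + vx * t, 32 + vy * t
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 50.0)

I0, I1 = frame(0.0), frame(1.0)
Iy, Ix = np.gradient(I0)                # spatial gradients (axis 0 is y)
It = I1 - I0                            # temporal derivative

# (grad I)^T v + I_t should be ~0 up to discretization error.
residual = Ix * vx + Iy * vy + It
assert np.abs(residual).max() < 0.01
```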


Optical Flow: Aperture Problem [2], [4], [9]

The component of the motion field in the direction orthogonal to the spatial image gradient is not constrained by the image brightness constancy equation. Given only local information, one can determine the component of the optical flow vector only in the direction of the brightness gradient.

Optical Flow: Aperture Problem [2], [9]

The aperture problem: the grating appears to be moving down and to the right, perpendicular to the orientation of the bars. But it could be moving in many other directions, such as only down, or only to the right. It is impossible to determine unless the ends of the bars become visible in the aperture.

http://upload.wikimedia.org/wikipedia/commons/f/f0/Aperture_problem_animated.gif

Methods for Determining Optical Flow [4]

Optical Flow: Phase Correlation Method [10]

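The slide content for this method did not survive extraction; as a rough sketch of the standard FFT formulation described in [10] (my illustration, not the slides' material): the normalized cross-power spectrum of two frames has an impulse at their relative circular shift.

```python
import numpy as np

def phase_correlation(a, b):
    """Return the integer circular shift (dy, dx) with b = roll(a, (dy, dx))."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    R = np.conj(Fa) * Fb
    R /= np.abs(R) + 1e-12              # normalized cross-power spectrum
    r = np.fft.ifft2(R).real            # impulse at the shift
    dy, dx = np.unravel_index(np.argmax(r), r.shape)
    h, w = a.shape                      # map peak location to signed shifts
    return (dy - h if dy > h // 2 else dy,
            dx - w if dx > w // 2 else dx)

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (3, -5), axis=(0, 1))    # shift a by (dy=3, dx=-5)
assert phase_correlation(a, b) == (3, -5)
```

Unlike the differential methods below, this recovers large (integer) shifts directly, but only a single global translation per window.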

Optical Flow: Lucas-Kanade Method [8]

Assuming that the optical flow (V_x, V_y) is constant in a small window of size m×m with m > 1, centered at (x, y), and numbering the pixels within it as 1…n, n = m², a set of equations can be found:

    I_x(q_i) V_x + I_y(q_i) V_y = −I_t(q_i),   i = 1, …, n

i.e., (∇I)ᵀ V = −I_t at each pixel q_i.
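A minimal single-window solve of this over-determined system in numpy (the synthetic frames, window position, and flow values are invented for illustration): stack the per-pixel equations and take the least-squares solution.

```python
import numpy as np

def lucas_kanade(I0, I1, cx, cy, m=9):
    """Least-squares flow (vx, vy) for the m x m window centered at (cx, cy)."""
    Iy, Ix = np.gradient(I0)             # spatial gradients (axis 0 is y)
    It = I1 - I0                         # temporal derivative
    r = m // 2
    win = (slice(cy - r, cy + r + 1), slice(cx - r, cx + r + 1))
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
    b = -It[win].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v

# Synthetic pair: a Gaussian blob translated by (0.3, -0.2) pixels.
yy, xx = np.mgrid[0:64, 0:64].astype(float)
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 50.0)
I0, I1 = blob(30.0, 34.0), blob(30.3, 33.8)

vx, vy = lucas_kanade(I0, I1, cx=27, cy=37)
assert abs(vx - 0.3) < 0.05 and abs(vy + 0.2) < 0.05
```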

cf.) Harris corner detector: the least-squares system above is well posed exactly when the 2×2 matrix AᵀA of summed gradient products over the window is well conditioned, i.e., both of its eigenvalues are large, which is the same criterion the Harris detector uses to select corners.

KLT Feature Tracker [11]

Weighted by F′(x).


KLT: An Implementation of the Kanade-Lucas-Tomasi Feature Tracker
http://www.ces.clemson.edu/~stb/klt/

    (∇I)ᵀ V = −I_t
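The implementation linked above is a C library; as a rough sketch of the same idea (my translation-only illustration, not the linked code), the KLT iteration re-solves the Lucas-Kanade system after warping the second frame toward the first, until the update becomes negligible:

```python
import numpy as np

def sample(I, ys, xs):
    """Bilinear sampling of image I at float coordinates (ys, xs)."""
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    wy, wx = ys - y0, xs - x0
    y1 = np.clip(y0 + 1, 0, I.shape[0] - 1)
    x1 = np.clip(x0 + 1, 0, I.shape[1] - 1)
    return ((1 - wy) * (1 - wx) * I[y0, x0] + (1 - wy) * wx * I[y0, x1]
            + wy * (1 - wx) * I[y1, x0] + wy * wx * I[y1, x1])

def klt_translation(I, J, cy, cx, r=5, iters=20):
    """Track the window of radius r at (cy, cx) from frame I to frame J."""
    ys, xs = np.mgrid[cy - r:cy + r + 1, cx - r:cx + r + 1].astype(float)
    T = sample(I, ys, xs)                # template window
    gy, gx = np.gradient(T)              # template gradients
    A = np.stack([gx.ravel(), gy.ravel()], axis=1)
    H = A.T @ A                          # 2x2 normal matrix
    d = np.zeros(2)                      # displacement (dx, dy)
    for _ in range(iters):
        e = (T - sample(J, ys + d[1], xs + d[0])).ravel()
        step = np.linalg.solve(H, A.T @ e)
        d += step
        if np.hypot(*step) < 1e-4:       # converged
            break
    return d

# Synthetic test: a Gaussian blob shifted by (dx, dy) = (0.4, 0.6).
yy, xx = np.mgrid[0:64, 0:64].astype(float)
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 50.0)
I, J = blob(32.0, 32.0), blob(32.4, 32.6)
dx, dy = klt_translation(I, J, cy=30, cx=29)
assert abs(dx - 0.4) < 0.1 and abs(dy - 0.6) < 0.1
```

Iterating turns the one-shot linearization of Lucas-Kanade into a Gauss-Newton scheme, which is what lets KLT recover sub-pixel displacements.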

Lucas-Kanade Method: A Unifying Framework [12]

Since the Lucas-Kanade algorithm was proposed in 1981, image alignment has become one of the most widely used techniques in computer vision. Besides optical flow, some of its other applications include:

- tracking (Black and Jepson, 1998; Hager and Belhumeur, 1998)
- parametric and layered motion estimation (Bergen et al., 1992)
- mosaic construction (Shum and Szeliski, 2000)
- medical image registration (Christensen and Johnson, 2001)
- face coding (Baker and Matthews, 2001; Cootes et al., 1998)


Minimizing the expression in Equation (1) is a non-linear optimization task even if W(x; p) is linear in p, because the pixel values I(x) are, in general, non-linear in x.
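For reference (recalled from [12], since the displayed equation did not survive extraction), the expression being minimized and its first-order expansion about the current parameters p are:

```latex
% Eq. (1) of [12]: SSD between the warped image and the template
\sum_x \bigl[ I(W(x;\,p)) - T(x) \bigr]^2
% Gauss-Newton step: linearize in \Delta p, solve, then update p
\sum_x \Bigl[ I(W(x;\,p)) + \nabla I \,\frac{\partial W}{\partial p}\,\Delta p - T(x) \Bigr]^2,
\qquad p \leftarrow p + \Delta p
```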


Setting this expression equal to zero and solving gives the closed-form solution for the minimum of the expression in Equation (6) as:

    Δp = H⁻¹ Σ_x [∇I ∂W/∂p]ᵀ [T(x) − I(W(x; p))]

where H = Σ_x [∇I ∂W/∂p]ᵀ [∇I ∂W/∂p] is the Gauss-Newton approximation of the Hessian, and the terms ∇I ∂W/∂p are the steepest-descent (SD) images.


References

1. Richard Szeliski, "Dense motion estimation," Computer Vision: Algorithms and Applications, 19 June 2009 (draft), pp. 383-426.
2. Emanuele Trucco, Alessandro Verri, "8. Motion," Introductory Techniques for 3-D Computer Vision, Prentice Hall, New Jersey, 1998, pp. 177-218.
3. Wikipedia, "Motion field," available on www.wikipedia.org.
4. Wikipedia, "Optical flow," available on www.wikipedia.org.
5. Alessandro Verri, Emanuele Trucco, "Finding the Epipole from Uncalibrated Optical Flow," BMVC 1997, available on http://www.bmva.ac.uk/bmvc/1997/papers/052/bmvc.html.
6. Wikipedia, "Parallax," available on www.wikipedia.org.
7. Jae Kyu Suhr, Ho Gi Jung, Kwanghyuk Bae, Jaihie Kim, "Outlier rejection for cameras on intelligent vehicles," Pattern Recognition Letters 29 (2008) 828-840.
8. Wikipedia, "Lucas-Kanade Optical Flow Method," available on www.wikipedia.org.
9. Wikipedia, "Aperture Problem," available on www.wikipedia.org.
10. Wikipedia, "Phase correlation," available on http://en.wikipedia.org/wiki/Phase_correlation.
11. Wikipedia, "Kanade-Lucas-Tomasi feature tracker," available on http://en.wikipedia.org/wiki/Kanade%E2%80%93Lucas%E2%80%93Tomasi_feature_track er
12. Simon Baker, Iain Matthews, "Lucas-Kanade 20 Years On: A Unifying Framework," International Journal of Computer Vision, 56(3), 221-255, 2004.
