
EE392J Final Project, March 20, 2002

Multiple Camera Object Tracking

Helmy Eltoukhy and Khaled Salama


Outline

• Introduction
• Point correspondence between multiple cameras
• Robust object tracking
• Camera communication and decision making
• Results


Object Tracking

The objective is to obtain an accurate estimate of the position (x, y) of the tracked object.

Tracking algorithms can be classified into:
• Single object & single camera
• Multiple objects & single camera
• Multiple objects & multiple cameras
• Single object & multiple cameras


Single Object & Single Camera

• Requires accurate camera calibration and a scene model
• Suffers from occlusions
• Not robust and object dependent


Single Object & Multiple Cameras

• Requires accurate point correspondence between scenes
• Occlusions can be minimized or even avoided
• Redundant information for better estimation
• Multiple-camera communication problem


System Architecture

[Block diagram: each camera runs Object Identification followed by Object Tracking; a Check Position stage tests whether the two position estimates agree (|X1 - X2| and |Y1 - Y2| below a threshold), and a Choose Camera View stage selects which camera's estimate to use.]


Static Point Correspondence

• The output of the tracking stage is the image position (X_i(n), Y_i(n)).
• A simple scene model is used to obtain an estimate (\hat{X}_i(n), \hat{Y}_i(n)) of the real coordinates.
• Both affine and perspective models were used for the scene modeling, and static corresponding points were used for parameter estimation.
• Least mean squares was used to improve the parameter estimation.
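The slides give no code, so as a rough illustration of the affine variant, the following Python/NumPy sketch fits a 2x3 affine scene model to static corresponding points by least squares; the function names and array layout are ours, not the project's.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of a 2x3 affine model mapping src points to dst points.

    src, dst: (N, 2) arrays of static corresponding points, e.g. tracked
    image coordinates and their known scene coordinates.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Design matrix [x y 1]; solve one least-squares problem per output coordinate.
    M = np.hstack([src, np.ones((src.shape[0], 1))])
    params, *_ = np.linalg.lstsq(M, dst, rcond=None)   # shape (3, 2)
    return params.T                                     # shape (2, 3)

def apply_affine(P, pts):
    """Map (N, 2) points through the fitted 2x3 affine model P."""
    pts = np.asarray(pts, dtype=float)
    return pts @ P[:, :2].T + P[:, 2]
```

Fitted once from the static correspondences, the same model can then map every tracked (X_i(n), Y_i(n)) into the common scene coordinates shared with the other camera.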


Dynamic Point Correspondence

[Flow diagram: each camera's tracked point is mapped through the current affine model (parameters A(n), B(n)); a Check Position stage tests |X1 - X2| < T and |Y1 - Y2| < T, and points that agree are added to the correspondence set A used to re-estimate the model parameters.]
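A minimal sketch of the agreement test in the diagram above, assuming a 2x3 affine model that maps camera 2's coordinates into camera 1's frame; the threshold T, the tuple layout, and the bookkeeping of the correspondence set are illustrative only.

```python
def update_correspondences(p1, p2, affine, corr_set, T=5.0):
    """Add (p2, p1) to the correspondence set if the two cameras agree.

    p1, p2   : tracked point (x, y) in camera 1 and camera 2
    affine   : 2x3 affine model (two rows) mapping camera 2 into camera 1
    corr_set : list of (p2, p1) pairs later used to refit the model
    T        : agreement threshold in pixels (illustrative value)
    """
    row_x, row_y = affine
    x2 = row_x[0] * p2[0] + row_x[1] * p2[1] + row_x[2]
    y2 = row_y[0] * p2[0] + row_y[1] * p2[1] + row_y[2]
    if abs(p1[0] - x2) < T and abs(p1[1] - y2) < T:
        corr_set.append((p2, p1))   # agreeing pair becomes a new correspondence
    return corr_set
```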


Block-Based Motion Estimation

• Typically, precise sub-pixel optical flow estimation is not needed for object tracking.
• Furthermore, the motion can be on the order of several pixels, thereby precluding the use of gradient methods.
• We started with a simple sum-of-squared-differences (SSD) error criterion coupled with a full search in a limited region around the tracking window:

SSD_{error}(m, n) = \sum_{(x, y) \in c} \big( s(x + m, y + n, t + \Delta t) - s_c(x, y, t) \big)^2

where the sum runs over the pixels (x, y) of the tracking window c.
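A minimal Python/NumPy sketch of this criterion: an exhaustive search over a limited displacement range that minimizes the SSD between the tracking window in the previous frame and candidate windows in the current frame (window layout and parameter names are ours).

```python
import numpy as np

def ssd_full_search(prev, curr, win, search=8):
    """Full-search block matching with an SSD criterion.

    prev, curr : consecutive grayscale frames as 2-D float arrays
    win        : (top, left, height, width) of the tracking window in prev
    search     : half-width of the search region in pixels
    Returns ((dy, dx), ssd) for the displacement minimizing the SSD.
    """
    top, left, h, w = win
    block = prev[top:top + h, left:left + w]
    best, best_mv = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > curr.shape[0] or x + w > curr.shape[1]:
                continue                      # candidate window falls outside the frame
            cand = curr[y:y + h, x:x + w]
            ssd = np.sum((cand - block) ** 2)
            if ssd < best:
                best, best_mv = ssd, (dy, dx)
    return best_mv, best
```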


Adaptive Window Sizing

• Although simple block-based motion estimation may work reasonably well when the motion is purely translational, it can lose the object if the object's relative size changes.
• If the object shrinks in the camera's field of view, the SSD error is strongly influenced by the background.
• If the object grows in the camera's field of view, the window fails to make use of the entire object's information and can slip away.


Four Corner Method

• This technique divides the rectangular object window into 4 basic regions, one per quadrant.
• Motion vectors are calculated for each subregion, and each controls one of the four corners.
• Translational motion is captured when all four move equally, while the window size is modulated when the motion is differential.
• The resultant tracking window can be non-rectangular, i.e., any quadrilateral approximated by four rectangles with a shared center corner.
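A hedged sketch of this idea, reusing ssd_full_search from the block-matching sketch above: one motion vector is estimated per quadrant and moves that quadrant's outer corner. The actual project code may handle the shared center and the resizing differently.

```python
import numpy as np

def four_corner_update(prev, curr, corners, search=8):
    """One update of a four-corner tracking window.

    corners : dict with keys 'tl', 'tr', 'bl', 'br' mapping to np.array([y, x])
    Returns the updated corner positions.
    """
    tl, br = corners['tl'], corners['br']
    cy, cx = (tl + br) / 2.0                       # shared center splitting the window
    quads = {                                      # (y0, x0, y1, x1) of each quadrant
        'tl': (tl[0], tl[1], cy, cx),
        'tr': (tl[0], cx, cy, br[1]),
        'bl': (cy, tl[1], br[0], cx),
        'br': (cy, cx, br[0], br[1]),
    }
    new = {}
    for name, (y0, x0, y1, x1) in quads.items():
        win = (int(y0), int(x0), int(y1 - y0), int(x1 - x0))
        (dy, dx), _ = ssd_full_search(prev, curr, win, search)
        new[name] = corners[name] + np.array([dy, dx])   # each quadrant moves its own corner
    return new
```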


Example: Four Corner Method

[Figure: synthetically generated test sequences.]


Correlative Method

• The four corner method is strongly subject to error accumulation, which can result in drift of one or more of the tracking window quadrants.
• Once drift occurs, the sizing of the window is highly inaccurate.
• We need a method with some corrective feedback, so that the window can converge to the correct size even after some errors.
• Correlating the current object features against some template view is one solution.


Correlative Method (cont'd)

• The basic form of the technique involves storing an initial view of the object as a reference image.
• Block matching is performed through a combined interframe and correlative MSE:

MSE_{error} = \frac{a}{N} \sum_{(x, y) \in c} \big( s(x + m, y + n, t + \Delta t) - s_c(x, y, t) \big)^2 + \frac{1 - a}{N} \sum_{(x, y) \in c} \big( s(x + m, y + n, t + \Delta t) - s_c'(x_0, y_0, 0) \big)^2

where s_c'(x_0, y_0, 0) is the resized stored template image, N is the number of pixels in the tracking window c, and a weights the interframe term against the correlative term.
• Furthermore, the minimum correlative MSE is used to direct the resizing of the current window.
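A sketch of block matching with this combined criterion, assuming the stored template has already been resized to the current window size; the weight a and the search range are illustrative, not the project's values.

```python
import numpy as np

def combined_mse_search(prev_block, template, curr, win, search=8, a=0.5):
    """Full search minimizing a weighted sum of interframe and correlative MSE.

    prev_block : object window taken from the previous frame
    template   : stored reference view, resized to the window size
    curr       : current frame (2-D float array)
    win        : (top, left, height, width) of the window in the previous frame
    a          : weight between the interframe and correlative terms
    """
    top, left, h, w = win
    n = float(h * w)
    best, best_mv = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > curr.shape[0] or x + w > curr.shape[1]:
                continue
            cand = curr[y:y + h, x:x + w]
            inter = np.sum((cand - prev_block) ** 2) / n    # interframe MSE
            corr = np.sum((cand - template) ** 2) / n       # correlative MSE vs. template
            mse = a * inter + (1 - a) * corr
            if mse < best:
                best, best_mv = mse, (dy, dx)
    return best_mv, best
```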


Example: Correlative Method


Occlusion Detection

• For multi-camera feature tracking to work, each camera must be able to assess the validity of its own tracking (e.g., to detect occlusion).
• Comparing the minimum error at each point to some absolute threshold is problematic, since the error can grow even when tracking is still valid.
• The threshold must be adaptive to current conditions.
• One solution is to use a threshold of k (a constant > 1) times the moving average of the MSE, as sketched below.
• Thus, only precipitous changes in the error trigger an indication of possibly fallacious tracking.
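A small sketch of this adaptive threshold; the values of k and the averaging window length here are illustrative rather than the project's.

```python
class OcclusionDetector:
    """Flags possible occlusion when the matching error jumps well above
    its recent moving average."""

    def __init__(self, k=3.0, window=10):
        self.k = k                  # threshold multiplier (constant > 1)
        self.window = window        # number of recent errors in the moving average
        self.history = []

    def update(self, mse):
        """Return True if the new MSE suggests the tracking may be invalid."""
        if self.history:
            avg = sum(self.history) / len(self.history)
            occluded = mse > self.k * avg
        else:
            occluded = False
        if not occluded:            # only errors from valid tracking feed the average
            self.history.append(mse)
            self.history = self.history[-self.window:]
        return occluded
```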


Redetection Procedure (1 Camera)

[State diagram: two states, Normal Tracking (motion tracking) and Occlusion Detected (window remains stationary). The tracker switches to Occlusion Detected when Error > k * Err_avg, and returns to Normal Tracking once the interframe difference I_t rises above the noise level dNoise and the error falls back below Err_avg.]

• Redetection is difficult at the most general level; in general it amounts to object recognition.
• Proximity and size-constancy constraints can be imposed to simplify redetection (see the sketch below).
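A sketch of the state machine as read off the diagram above; the thresholds k and dNoise are placeholders.

```python
NORMAL, OCCLUDED = "normal_tracking", "occlusion_detected"

def redetection_step(state, error, err_avg, frame_diff, k=3.0, d_noise=1.0):
    """One step of the single-camera redetection logic.

    state      : NORMAL or OCCLUDED
    error      : current matching error
    err_avg    : moving average of the error while tracking was valid
    frame_diff : interframe difference I_t inside the (stationary) window
    """
    if state == NORMAL:
        # A precipitous jump in error suggests the object has been occluded.
        return OCCLUDED if error > k * err_avg else NORMAL
    # Occluded: keep the window stationary until something moves back into it
    # (I_t above the noise floor) and the match error returns to its usual level.
    if frame_diff > d_noise and error < err_avg:
        return NORMAL
    return OCCLUDED
```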


Example: Occlusion


Camera Communication


Result


Conclusion

• Multiple cameras can do more than just 3D imaging.
• Camera calibration only works if you have an accurate scene and camera model.
• Tracking is sensitive to the camera characteristics (noise, blur, frame rate, ...).
• Tracking accuracy can be improved by using multiple cameras.