TRANSCRIPT
Recognition by Probabilistic Hypothesis Construction
P. Moreels, M. Maire, P. Perona
California Institute of Technology
Rich features, probabilistic, fast learning, efficient matching
Background
• Rich features: Fischler & Elschlager ’73; v.d. Malsburg et al. ’93
• Efficient matching: Huttenlocher & Ullman ’90
• Probabilistic constellations, categories: Burl et al. ’96; Weber et al. ’00; Fergus et al. ’03
• Rich features + efficient matching: Lowe ’99, ’04
Outline
Objective: individual object recognition
• D. Lowe’s system and the constellation model
• Hypothesis and score
• Scheduling of matches
• Experiments: comparison with D. Lowe’s system
Pros and Cons

Lowe’s recognition system
+ Learn from 1 image
+ Fast
+ Many parts → redundancy
- Manual tuning of parameters
- Rigid planar objects
- Sensitive to clutter

Constellation model
+ Principled detection/recognition
+ Learn parameters from data
+ Model clutter, occlusion, distortions
- High number of parameters (O(n²)); 5-7 parts per model
- Many training examples needed; learning expensive
Reducing degrees of freedom
1. Common reference frame ([Lowe’99],[Huttenlocher’90])
2. Share parameters ([Schmid’97])
3. Use prior information learned on foreground and background ([Fei-Fei’03])
Parameters and priors
Based on [Fergus’03], [Burl’98]
[Figure: constellation model for model m at a given position. Foreground: Gaussian shape pdf, Gaussian part appearance pdf, Gaussian relative scale pdf (in log(scale)), and a per-part probability of detection (e.g. 0.8, 0.75, 0.9). Clutter: Gaussian background appearance pdf.]

Sharing parameters
[Figure: the same model with parameters shared across parts: a Gaussian conditional shape pdf, shared Gaussian part appearance and relative scale pdfs (in log(scale)), a common probability of detection (e.g. 0.8 for the reference part, 0.2 for the others), and a Gaussian background appearance pdf for clutter.]
Hypotheses – feature assignments
[Figure: features of a new scene (test image) are assigned to features of models from the database, or left unassigned (‘null’); an interpretation is a set of such assignments.]

Hypotheses – model position
[Figure: candidate positions of models from the database in the new scene (test image).]
Θ = affine transformation
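The model position Θ is an affine transformation, which can be estimated by least squares from a few feature correspondences. A minimal sketch (not the authors’ code), assuming matched 2-D feature positions:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares affine transform mapping src points to dst points.

    src, dst: (N, 2) arrays of matched feature positions, N >= 3.
    Returns Theta as a 2x3 matrix [A | t] with dst ≈ src @ A.T + t.
    """
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])   # (N, 3) rows: [x, y, 1]
    # Solve X @ P = dst in the least-squares sense; P is (3, 2).
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P.T                              # (2, 3)

# Recover a known affine map from four exact correspondences.
A_true = np.array([[1.2, -0.3, 5.0],
                   [0.4,  0.9, -2.0]])
src = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 3.]])
dst = src @ A_true[:, :2].T + A_true[:, 2]
Theta = estimate_affine(src, dst)
```

With noisy correspondences the same call returns the least-squares fit rather than an exact recovery.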
Score of a hypothesis
Hypothesis = model + position + assignments
By Bayes’ rule, the probability of a hypothesis given the observed features (geometry + appearance) and the database of models factors, up to a constant, into a consistency term and a hypothesis probability:
- Consistency between observations and hypothesis
- Probability of the number of clutter detections
- Probability of detecting the indicated model features
- Prior on the pose of the given model
Foreground features and ‘null’ assignments each contribute a geometry term and an appearance term.
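The factorization above can be sketched in equations. The notation here is a reconstruction in the style of the constellation-model literature ([Fergus’03], [Burl’98]), not the talk’s exact formulas: h is the assignment vector, Θ the model pose, m the model, O the observed features.

```latex
% Bayes rule: posterior over hypotheses, up to a constant normalizer
P(h, \Theta, m \mid O) \;\propto\;
  \underbrace{P(O \mid h, \Theta, m)}_{\text{consistency}}\;
  \underbrace{P(h, \Theta, m)}_{\text{hypothesis probability}}

% Consistency: foreground features vs. 'null' (clutter) assignments,
% each contributing a geometry and an appearance term
P(O \mid h, \Theta, m) \;=\;
  \prod_{i \,:\, h_i \neq \varnothing} p_{fg}(o_i \mid \Theta, m)
  \prod_{j \,:\, h_j = \varnothing} p_{bg}(o_j)

% Hypothesis probability: number of clutter detections, detection of
% the indicated model features, and a prior on the pose of the model
P(h, \Theta, m) \;=\;
  P(n_{\text{clutter}})\; P(\text{detections} \mid m)\; p(\Theta \mid m)\; P(m)
```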
Scheduling – inspired by A*
[Search tree: the root is the empty hypothesis (scene features, no assignment done); each level down adds one assignment, including the ‘null’ assignment; partial hypotheses P are scored together with their perfect completion (an admissible heuristic, used as a guide for the search), so branches can be compared and the most promising explored first.]
Increasing computational efficiency ([Pearl’84], [Grimson’87]):
- at each node, search only a fixed number of sub-branches
- this forces termination
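The scheduling idea can be sketched as a best-first search with a priority queue. This is a simplified toy version under assumed data structures, not the authors’ implementation: each scene feature gets a dict of candidate assignments (including ‘null’) with log-probability scores, the bound assumes a perfect completion of the remaining features (admissible heuristic), and only a fixed number of children per node are expanded, forcing termination.

```python
import heapq

def schedule(assign_scores, branch_limit=2):
    """A*-style best-first search over feature-to-model assignments.

    assign_scores[i]: dict {candidate: log_score} for scene feature i,
    where a 'null' candidate models clutter. The 'perfect completion'
    bound gives every remaining feature its best possible score, so it
    never underestimates (admissible). Only the branch_limit best
    children of each node are expanded (cf. [Pearl'84], [Grimson'87]).
    Returns (best_total_score, best_assignment_tuple).
    """
    n = len(assign_scores)
    # best_rest[i]: optimistic score obtainable from features i..n-1.
    best_rest = [0.0] * (n + 1)
    for i in range(n - 1, -1, -1):
        best_rest[i] = best_rest[i + 1] + max(assign_scores[i].values())

    # Heap entries: (-bound, score_so_far, partial assignment tuple).
    heap = [(-best_rest[0], 0.0, ())]
    best_score, best_assign = float('-inf'), ()
    while heap:
        neg_bound, score, partial = heapq.heappop(heap)
        if -neg_bound <= best_score:
            continue                      # cannot beat the current best
        i = len(partial)
        if i == n:                        # complete hypothesis
            best_score, best_assign = score, partial
            continue
        # Explore only the most promising branches at this node.
        children = sorted(assign_scores[i].items(),
                          key=lambda kv: kv[1], reverse=True)[:branch_limit]
        for cand, s in children:
            bound = score + s + best_rest[i + 1]
            heapq.heappush(heap, (-bound, score + s, partial + (cand,)))
    return best_score, best_assign

scores = [{'a': -1.0, 'null': -3.0},
          {'b': -0.5, 'c': -2.0, 'null': -4.0}]
total, assignment = schedule(scores)
```

Because the heuristic is admissible, the first complete hypothesis whose bound survives the pruning test is optimal among the explored branches.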
Recognition: the first match
No clue regarding geometry → the first match is based on appearance.
[Figure: features of the new scene are matched against models from the database; the best and second-best appearance matches per feature initialize the hypotheses queue of partial hypotheses P.]
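The appearance-only first match can be sketched as nearest-neighbor matching of descriptors with a best-to-second-best distance-ratio test (the test used in Lowe’s system; the descriptors and the 0.8 threshold here are illustrative assumptions):

```python
import numpy as np

def first_matches(scene_desc, model_desc, ratio=0.8):
    """Appearance-only initial matches (no geometry used yet).

    For each scene descriptor, find the best and second-best model
    descriptors and keep the match only when the best is clearly
    closer than the runner-up (distance-ratio test). Returns
    (scene_idx, model_idx) pairs, which would seed the hypotheses queue.
    """
    matches = []
    for i, d in enumerate(scene_desc):
        dists = np.linalg.norm(model_desc - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

model_desc = np.array([[1.0, 0.0], [0.0, 1.0], [10.0, 10.0]])
scene_desc = np.array([[0.9, 0.1],    # clearly closest to model feature 0
                       [0.5, 0.5]])   # ambiguous: rejected by ratio test
matches = first_matches(scene_desc, model_desc)
```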
Scheduling – promising branches first
[Figure: features of the new scene matched against models from the database; after each match, the hypotheses queue is updated and the most promising partial hypotheses P are extended first.]
Examples
[Figure: test images and identified models, comparing Lowe’s method with our system.]
Lowe’s model implemented using [Lowe’97,’99,’01,’03]
Performance evaluation
Test images hand-labeled before the experiments.
a. Object found, correct pose → Detection
b. Object found, incorrect pose → False alarm
c. Wrong object found → False alarm
d. Object not found → Non-detection
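The four evaluation cases can be made concrete with a small scoring helper. This is a hypothetical function for comparing one recognition result against the hand-labeled ground truth; the argument names are assumptions, not the authors’ code:

```python
def classify_outcome(found_model, true_model, pose_correct):
    """Map one recognition outcome to the categories a-d above.

    Correct object in the correct pose counts as a detection (a);
    the correct object in a wrong pose (b) or a wrong object (c) is a
    false alarm; finding nothing when an object is present is a
    non-detection (d).
    """
    if found_model is None:
        return 'non-detection'                                 # case d
    if found_model == true_model:
        return 'detection' if pose_correct else 'false alarm'  # a / b
    return 'false alarm'                                       # case c
```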
Results – Toys images
[Figure: scenes (test images) and models (database).]
- 153 model images
- 90 test images
- 0-5 models / test image
- 80% recognition with false alarms / test set = 0.2
- Lower false alarm rate than Lowe’s system
Results – Kitchen images
- 100 training images
- 80 test images
- 0-9 models / test image
- 254 objects to be detected
- Achieves a 77% recognition rate with 0 false alarms