
Mean-Field Theory and Its Applications in Computer Vision (Part 4)

1

Motivation

2

Higher order terms help in incorporating region/segment consistency into the model

[Figure: pairwise CRF vs. higher order CRF]

Motivation

3

Higher order terms can help in incorporating detectors into our model

[Figure: image; without detector; with detector]

Marginal update

4

General form of the mean-field update

Expectation of the cost given that variable v_i takes a label

Marginal Update

5

General form of the mean-field update

Expectation of the clique cost given that variable v_i takes a label

• Summation over the possible states of the clique (a standard way to write this update is given below)
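For reference, a standard way to write the mean-field marginal update for a CRF with unary and clique terms is the following; the notation is assumed here rather than taken from the slides (Q_i is the approximate marginal of variable v_i, ψ_u the unary cost, ψ_c the clique cost, Z_i a normalizing constant):

\[ Q_i(x_i = l) \;=\; \frac{1}{Z_i}\,\exp\!\Big(-\psi_u(x_i = l)\;-\;\sum_{c \ni i}\;\mathbb{E}_{Q_{c\setminus i}}\big[\psi_c(\mathbf{x}_c)\,\big|\,x_i = l\big]\Big) \]

The expectation term is exactly the summation over clique states mentioned above:

\[ \mathbb{E}_{Q_{c\setminus i}}\big[\psi_c(\mathbf{x}_c)\,\big|\,x_i = l\big] \;=\; \sum_{\mathbf{x}_{c\setminus i}} \psi_c(x_i = l,\,\mathbf{x}_{c\setminus i}) \prod_{j \in c\setminus i} Q_j(x_j) \]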

Marginal Update in Mean-Field

6

Some possible states

Total number of possible states: 36


Marginal Update in Mean-Field

7

Exponential number of possible states for a clique of size |c| and label set L: |L|^{|c|}

Evaluating the expectation (a summation over all these states) becomes infeasible
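As a rough illustration (the numbers are assumed for this example only): with |L| = 21 labels, as in PASCAL VOC segmentation, a segment clique of just 10 pixels already has 21^10 ≈ 1.7 × 10^13 possible states.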

Marginal Update in Mean-Field

8

• Use a restricted form of the clique cost

• Pattern-based potentials

Marginal Update in Mean-Field

9

Restrict the allowed states to a small set of patterns

Simple patterns

The segment takes a label from a set of 4 patterns, or none of them

Marginal Update in Mean-Field

10

The expectation calculation then becomes quite efficient

Pattern-based cost

11

Segment takes one of the forms

Pattern-based cost

12

Segment does not take any of the pattern forms (a common way to write this cost is sketched below)
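A common way to write such a pattern-based clique potential is the following sketch; the cost symbols γ_p and γ_max are assumptions here, since the slides do not name them:

\[ \psi_c(\mathbf{x}_c) \;=\; \begin{cases} \gamma_p & \text{if } \mathbf{x}_c = p \text{ for some pattern } p \in \mathcal{P}_c,\\ \gamma_{\max} & \text{otherwise,} \end{cases} \qquad \gamma_p \le \gamma_{\max}. \]

With this restricted form, the expectation in the marginal update reduces to a sum over the |P_c| patterns rather than over all |L|^{|c|} clique states:

\[ \mathbb{E}_{Q_{c\setminus i}}\big[\psi_c(\mathbf{x}_c)\,\big|\,x_i = l\big] \;=\; \gamma_{\max} \;+\; \sum_{p \in \mathcal{P}_c:\; p_i = l} (\gamma_p - \gamma_{\max}) \prod_{j \in c\setminus i} Q_j(p_j), \]

which costs O(|P_c| · |c|) per variable-label pair instead of exponential time.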

Pattern-based cost

13

• Simple patterns

• Pattern-based higher order terms

P^N Potts-based patterns

14

• P^N Potts-based patterns

Potts patterns

Potts cost

15

• Potts cost

Potts patterns
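The P^N Potts model is the special case in which every pattern assigns a single label to the entire clique. Written out under the same assumed notation, with a low cost γ_l for a label-consistent clique and a high cost γ_max otherwise:

\[ \psi_c(\mathbf{x}_c) \;=\; \begin{cases} \gamma_l & \text{if } x_j = l \text{ for all } j \in c,\\ \gamma_{\max} & \text{otherwise.} \end{cases} \]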

Marginal Update in Mean-Field

16

General form of the mean-field update

Expectation of the cost given that variable v_i takes a label

Expectation update

17

Probability of segment taking that label

Potts patterns

Expectation update

18

Probability of segment not taking that label

Potts patterns

Expectation update

19

Expectation update

Potts patterns
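For the P^N Potts cost above, this expectation has a simple closed form (a sketch following the general pattern formula given earlier): given x_i = l, the clique is label-consistent only if every other variable in it also takes label l, so

\[ \mathbb{E}_{Q_{c\setminus i}}\big[\psi_c(\mathbf{x}_c)\,\big|\,x_i = l\big] \;=\; \gamma_l \prod_{j \in c\setminus i} Q_j(l) \;+\; \gamma_{\max}\Big(1 - \prod_{j \in c\setminus i} Q_j(l)\Big). \]

The first term weights the low cost by the probability of the segment taking that label, and the second weights the high cost by the probability of it not doing so, matching the two quantities on the previous slides.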

Complexity

20

• Expectation update

• Time complexity: O(NL)

• Preserves the complexity of the original filter-based method (a small sketch of this update follows)
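A minimal NumPy sketch of this expectation update for P^N Potts cliques, assuming the segments partition the image so the total work stays O(N·L); the function and variable names are illustrative, not taken from any particular implementation:

import numpy as np

def pn_potts_expectation(Q, cliques, gamma_l, gamma_max):
    """Accumulate E[psi_c(x_c) | x_i = l] over P^N Potts cliques.

    Q         : (N, L) array of current marginals Q_j(l), strictly positive
    cliques   : list of integer index arrays, one per segment
    gamma_l   : (L,) low cost when the whole clique takes label l
    gamma_max : scalar high cost when the clique is not label-consistent
    """
    N, L = Q.shape
    E = np.zeros((N, L))
    for c in cliques:
        Qc = np.clip(Q[c], 1e-12, None)      # guard against exact zeros
        prod_all = np.prod(Qc, axis=0)       # prod_{j in c} Q_j(l), shape (L,)
        loo = prod_all / Qc                  # leave-one-out products, shape (|c|, L)
        # E[psi_c | x_i = l] = gamma_l * P(rest take l) + gamma_max * (1 - P(rest take l))
        E[c] += gamma_l * loo + gamma_max * (1.0 - loo)
    return E

Each variable belongs to one segment, so the loop touches every (variable, label) pair a constant number of times, giving O(N·L) work overall; for very large cliques the products are better computed in log space to avoid underflow.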

PascalVOC-10 dataset

21

• Inclusion of the P^N Potts term:

Algorithm           Time (s)   Overall   Av. Recall   Av. I/U
AHCRF+Cooc          36         81.43     38.01        30.09
Dense CRF           0.67       71.63     34.53        28.4
Dense + P^N Potts   4.35       79.87     40.71        30.18

• Slight improvement in I/U score compared to the more complex model that includes P^N Potts + co-occurrence terms

• Almost 8-9 times faster than the alpha-expansion-based method
