Conjunctive Formulation of the Random Set Framework for Multiple Instance Learning: Application to Remote Sensing — Jeremy Bolton, Paul Gader, CSI Laboratory, University of Florida


TRANSCRIPT

Page 1:

Conjunctive Formulation of the Random Set Framework for Multiple Instance Learning: Application to Remote Sensing

Jeremy Bolton, Paul Gader

CSI Laboratory, University of Florida

Page 2:

Highlights

• Conjunctive forms of Random Sets for Multiple Instance Learning:

– Random Sets can be used to solve the MIL problem when multiple concepts are present

– Previously developed formulations assume a disjunctive relationship between the learned concepts

– The new formulation provides for a conjunctive relationship between concepts; its utility is exhibited on a Ground Penetrating Radar (GPR) data set

Page 3:

Outline

I. Multiple Instance Learning
   I. MI Problem
   II. RSF-MIL
   III. Multiple Target Concepts

II. Experimental Results
   I. GPR Experiments

III. Future Work

Page 4:

Multiple Instance Learning

Page 5:

Standard Learning vs. Multiple Instance Learning

• Standard supervised learning
– Optimize some model (or learn a target concept) given training samples and corresponding labels

• MIL
– Learn a target concept given multiple sets of samples and corresponding labels for the sets.
– Interpretation: learning with uncertain labels / a noisy teacher

Standard: $X = \{x_1, \ldots, x_n\}$, $Y = \{y_1, \ldots, y_n\}$

MIL: $X_i = \{x_{i1}, \ldots, x_{in_i}\}$, $i = 1, \ldots, I$, with uncertain instance labels $Y_i = \{y_{i1}, \ldots, ?\}$

Page 6:

Multiple Instance Learning (MIL)

• Given:
– A set of I bags $\mathbf{B} = \{B_1, \ldots, B_i, \ldots, B_I\}$
– Each bag labeled + or −
– The ith bag is a set of $J_i$ samples in some feature space: $B_i = \{x_{i1}, \ldots, x_{iJ_i}\}$
– Interpretation of labels:
   $B_i^{+}: \exists j \;\; \mathrm{label}(x_{ij}) = 1$
   $B_i^{-}: \forall j \;\; \mathrm{label}(x_{ij}) = 0$

• Goal: learn the concept
– What characteristic is common to the positive bags that is not observed in the negative bags?
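A minimal sketch of this bag-and-label convention in Python; the feature dimensions and instance labels below are illustrative only (in practice only the bag-level labels are observed):

```python
import numpy as np

def bag_label(instance_labels):
    """A bag is positive if at least one instance is positive (label 1),
    and negative only if every instance has label 0."""
    return int(any(lbl == 1 for lbl in instance_labels))

B_i = np.random.rand(4, 3)        # bag i: J_i = 4 instances in a 3-D feature space
print(bag_label([0, 0, 1, 0]))    # -> 1 (positive bag)
print(bag_label([0, 0, 0, 0]))    # -> 0 (negative bag)
```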

Page 7:

Multiple Instance Learning

Traditional Classification:
x1 label = 1, x2 label = 1, x3 label = 0, x4 label = 0, x5 label = 1

Multiple Instance Learning:
{x1, x2, x3, x4} label = 1
{x1, x2, x3, x4} label = 1
{x1, x2, x3, x4} label = 0

Page 8:

MIL Application: Example GPR

• Collaboration: Frigui, Collins, Torrione

• Construction of bags
– Collect 15 EHD feature vectors from the 15 depth bins: $\{x_1, x_2, x_3, x_4, x_5, \ldots, x_{15}\}$
– Mine images = + bags
– FA images = − bags

EHD: Feature Vector

Page 9:

Standard vs. MI Learning: GPR Example

• Standard Learning
– Each training sample (feature vector) must have a label
   • Arduous task: many feature vectors per image and multiple images
– Difficult to label given GPR echoes, ground-truthing errors, etc.
– The label of each vector may not be known

[Figure: EHD feature vectors x1, x2, ..., xn, each requiring its own label y1, y2, ..., yn]

Page 10:

Standard vs. MI Learning: GPR Example

• Multiple Instance Learning
– Each training bag must have a label
– No need to label all feature vectors; just identify images (bags) where targets are present
– Implicitly accounts for class label uncertainty: $Y = \{x_1, x_2, x_3, x_4, x_5, \ldots, x_{15}\}$

EHD: Feature Vector

Page 11:

Random Set Framework for Multiple Instance Learning

Page 12:

Random Set Brief

• A random set is the set-valued analogue of a random variable: a measurable mapping from a probability space $(\Omega, \sigma(\Omega), P)$ into a family of sets, rather than into $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$.

Page 13:

How can we use Random Sets for MIL?

• Random set for MIL: Bags are sets (multi-sets)

– Idea of finding commonality of positive bags inherent in random set formulation

• Sets have an empty intersection or non-empty intersection relationship

• Find commonality using the intersection operator
• The random set's governing functional is based on the intersection operator
– Capacity functional: T

$T(X) = 1 - \prod_{x \in X} \big(1 - T(\{x\})\big), \qquad X = \{x_1, \ldots, x_n\}$

Interpretation: it is NOT the case that EACH element is NOT the target concept.
A.K.A.: Noisy-OR gate (Pearl, 1988)
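A minimal sketch of this noisy-OR capacity functional, assuming the per-instance probabilities $T(\{x\})$ have already been computed by some model:

```python
import numpy as np

def noisy_or(instance_probs):
    """T(X) = 1 - prod_{x in X} (1 - T({x})): it is NOT the case that
    EACH instance misses the target concept."""
    p = np.asarray(instance_probs, dtype=float)
    return 1.0 - np.prod(1.0 - p)

print(noisy_or([0.1, 0.05, 0.9]))   # dominated by the single strong hit
print(noisy_or([0.1, 0.05, 0.02]))  # all weak -> bag probability stays low
```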

Page 14:

Random Set Functionals

• Capacity functionals for intersection calculation: $T(X) = P(X \cap \Gamma \neq \emptyset)$

• Use a germ and grain model to model the random set
– Multiple (J) concepts: $\Gamma = \bigcup_{j=1}^{J} \Gamma_j$, where grain $\Gamma_j$ is centered at germ $g_j$ with random radius $R_j$
– Calculate the probability of intersection given X and the germ and grain pairs:

$T(X) = 1 - \prod_{j} \prod_{x \in X} \big(1 - T_j(\{x\})\big)$

– Grains are governed by random radii with an assumed cumulative, so that

$T_j(\{x\}) = P(R_j \ge r) = 1 - P(R_j < r) = \exp\!\big(-\lambda_j \, \|x - g_j\|^{2}\big), \qquad r = \|x - g_j\|$

Random Set model parameters: $\{g_j, \lambda_j\}$ (germ, grain)
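A sketch of the germ-and-grain capacity computation under the radius assumption above; the germ locations and scale parameters below are illustrative placeholders, not learned values:

```python
import numpy as np

def grain_hit_prob(x, germ, lam):
    """T_j({x}) = P(R_j >= ||x - g_j||): probability that the grain around
    germ g_j is large enough to cover instance x."""
    r2 = np.sum((np.asarray(x) - np.asarray(germ)) ** 2)
    return np.exp(-lam * r2)

def capacity(X, germs, lams):
    """T(X) = 1 - prod_j prod_{x in X} (1 - T_j({x})) over the J concepts."""
    miss = 1.0
    for germ, lam in zip(germs, lams):
        for x in X:
            miss *= 1.0 - grain_hit_prob(x, germ, lam)
    return 1.0 - miss

X = np.array([[0.1, 0.2], [2.0, 2.1]])               # a bag of two instances
germs, lams = [np.zeros(2), np.ones(2) * 2], [1.0, 1.0]
print(capacity(X, germs, lams))
```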

Page 15:

RSF-MIL: Germ and Grain Model

• Positive Bags = blue

• Negative Bags = orange

• Distinct shapes = distinct bags

[Figure: germ and grain model — instances (x) from positive and negative bags scattered in feature space, with target concept regions marked T]

Page 16:

Multiple Instance Learning with Multiple Concepts

Page 17:

Multiple Concepts: Disjunction or Conjunction?

• Disjunction
– When you have multiple types of concepts
– When each instance can indicate the presence of a target

• Conjunction
– When you have a target type that is composed of multiple (necessary) concepts
– When each instance can indicate a concept, but not necessarily the composite target type

Page 18:

Conjunctive RSF-MIL

• Previously developed disjunctive RSF-MIL (RSF-MIL-d) — a noisy-OR combination across concepts and samples:

$T(X) = 1 - \prod_{j} \prod_{x \in X} \big(1 - T_j(\{x\})\big)$

• Conjunctive RSF-MIL (RSF-MIL-c) — the standard noisy-OR for each concept j, combined with a noisy-AND across concepts:

$T(X) = \prod_{j} \Big[ 1 - \prod_{x \in X} \big(1 - T_j(\{x\})\big) \Big]$
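A sketch contrasting the two combinations, assuming a matrix of per-concept, per-instance hit probabilities $T_j(\{x_k\})$ has already been computed (the values below are made up to show the behavior):

```python
import numpy as np

def rsf_mil_d(hit_probs):
    """Disjunctive: noisy-OR across concepts AND instances.
    T(X) = 1 - prod_j prod_k (1 - T_j({x_k}))."""
    P = np.asarray(hit_probs)
    return 1.0 - np.prod(1.0 - P)

def rsf_mil_c(hit_probs):
    """Conjunctive: noisy-OR within each concept, noisy-AND across concepts.
    T(X) = prod_j [ 1 - prod_k (1 - T_j({x_k})) ]."""
    P = np.asarray(hit_probs)
    per_concept = 1.0 - np.prod(1.0 - P, axis=1)   # noisy-OR over instances
    return np.prod(per_concept)                    # noisy-AND over concepts

# A bag exhibiting concept 0 strongly but concept 1 not at all:
hits = [[0.90, 0.10, 0.20],    # concept 0 hit probabilities per instance
        [0.05, 0.10, 0.02]]    # concept 1 hit probabilities per instance
print(rsf_mil_d(hits))  # high: any single hit suffices
print(rsf_mil_c(hits))  # low: both concepts must be present
```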

Page 19:

Synthetic Data Experiments

• Extreme Conjunct data set requires that a target bag exhibits two distinct concepts rather than one or none

[Table: AUC (AUC when initialized near solution)]

Page 20:

Application to Remote Sensing

Page 21:

Disjunctive Target Concepts

[Diagram: each target concept type (1, 2, ..., n) feeds its own noisy-OR, and the noisy-OR outputs are combined by an OR to decide "Target Concept Present?"]

• Using large overlapping bins (gross extraction), the target concept can be encapsulated within one instance; therefore a disjunctive relationship exists.

Page 22:

What if we want features with finer granularity?

• Fine extraction
– More detail about the image and more shape information, but we may lose the disjunctive nature between (multiple) instances

[Diagram: constituent concept 1 (top of hyperbola) and constituent concept 2 (wings of hyperbola) each feed a noisy-OR, and the outputs are combined by an AND to decide "Target Concept Present?"]

Our features have more granularity; therefore our concepts may be constituents of a target, rather than encapsulating the target concept.

Page 23:

GPR Experiments

• Extensive GPR data set
– ~800 targets
– ~5,000 non-targets

• Experimental design
– Run RSF-MIL-d (disjunctive) and RSF-MIL-c (conjunctive)
– Compare both feature extraction methods
   • Gross extraction: bins large enough to encompass the target concept
   • Fine extraction: non-overlapping bins

• Hypothesis
– RSF-MIL-d will perform well when using gross extraction, whereas RSF-MIL-c will perform well using fine extraction

Page 24:

Experimental Results

• Highlights
– RSF-MIL-d using gross extraction performed best
– RSF-MIL-c performed better than RSF-MIL-d when using fine extraction
– Other influencing factors: the optimization methods for RSF-MIL-d and RSF-MIL-c are not the same

[Figures: results for gross extraction and fine extraction]

Page 25:

Future Work

• Implement a general form that can learn the disjunction or conjunction relationship from the data
• Implement a general form that can learn the number of concepts
• Incorporate spatial information
• Develop an improved optimization scheme for RSF-MIL-c

Page 26:

Backup Slides

Page 27:

MIL Example (AHI Imagery)

• Robust learning tool
– MIL tools can learn a target signature with limited or incomplete ground truth

Which spectral signature(s) should we use to train a target model or classifier?
1. Spectral mixing
2. Background signal
3. Ground truth not exact

Page 28:

MI-RVM

• Addition of set observations and inference using a noisy-OR to an RVM model:

$P(y = 1 \mid X) = 1 - \prod_{j=1}^{K} \big(1 - \sigma(w^{T} x_j)\big), \qquad \sigma(z) = \frac{1}{1 + \exp(-z)}$

• Prior on the weight w:

$p(w) = N(w \mid 0, A^{-1})$
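A minimal sketch of this bag posterior, with an illustrative prior precision A and randomly drawn weights (not a fitted model):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bag_posterior(X, w):
    """P(y=1 | X) = 1 - prod_j (1 - sigmoid(w^T x_j)) over the K instances."""
    s = sigmoid(X @ w)
    return 1.0 - np.prod(1.0 - s)

rng = np.random.default_rng(0)
A = np.eye(3)                                  # assumed prior precision
w = rng.multivariate_normal(np.zeros(3), np.linalg.inv(A))
X = rng.normal(size=(5, 3))                    # one bag of K = 5 instances
print(bag_posterior(X, w))
```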

Page 29:

SVM review

• Classifier structure:

$f(x) = w^{T}\phi(x) + b$

• Optimization:

$\min_{w,b} \; \tfrac{1}{2}\|w\|^{2} + C\sum_i \xi_i \quad \text{s.t.} \quad t_i\big(w^{T}\phi(x_i) + b\big) \ge 1 - \xi_i, \;\; \xi_i \ge 0$

Page 30:

MI-SVM Discussion

• The RVM was altered to fit the MIL problem by changing the form of the target variable's posterior to model a noisy-OR gate.

• The SVM can be altered to fit the MIL problem by changing how the margin is calculated:
– Boost the margin between the bag (rather than the samples) and the decision surface
– Look for the MI separating linear discriminant
   • There is at least one sample from each positive bag in the positive half-space

Page 31:

mi-SVM

• Enforce the MI scenario using extra constraints:

$\min_{\{t_i\}} \; \min_{w,b} \; \tfrac{1}{2}\|w\|^{2} + C\sum_i \xi_i \quad \text{s.t.} \quad t_i\big(w^{T}\phi(x_i) + b\big) \ge 1 - \xi_i, \;\; \xi_i \ge 0, \;\; t_i \in \{-1, 1\}$

$\sum_{i \in I} \frac{t_i + 1}{2} \ge 1 \;\; \text{for every bag } I \text{ with } Y_I = 1, \qquad t_i = -1 \;\; \forall i \in I \text{ for every bag } I \text{ with } Y_I = -1$

Mixed integer program: must find the optimal hyperplane and the optimal labeling set.

At least one sample in each positive bag must have a label of 1. All samples in each negative bag must have a label of -1.
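A rough sketch of how this mixed integer program is often approximated in practice, alternating between guessing instance labels that satisfy the bag constraints and refitting an ordinary SVM; the loop below uses scikit-learn's SVC and is an assumed heuristic, not the authors' exact procedure (bag labels are taken to be +1 / -1):

```python
import numpy as np
from sklearn.svm import SVC

def mi_svm(bags, bag_labels, n_iter=10, C=1.0):
    # Initialize: every instance inherits its bag's label.
    y = [np.full(len(b), lbl) for b, lbl in zip(bags, bag_labels)]
    for _ in range(n_iter):
        clf = SVC(kernel="linear", C=C).fit(np.vstack(bags), np.hstack(y))
        for i, (b, lbl) in enumerate(zip(bags, bag_labels)):
            if lbl == 1:
                f = clf.decision_function(b)
                t = np.where(f > 0, 1, -1)
                if not np.any(t == 1):        # constraint: at least one
                    t[np.argmax(f)] = 1       # positive instance per + bag
                y[i] = t
            else:
                y[i] = np.full(len(b), -1)    # all instances in - bags stay -1
    return clf
```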

Page 32:

Current Applications

I. Multiple Instance Learning
   I. MI Problem
   II. MI Applications

II. Multiple Instance Learning: Kernel Machines
   I. MI-RVM
   II. MI-SVM

III. Current Applications
   I. GPR imagery
   II. HSI imagery

Page 33:

HSI: Target Spectra Learning

• Given labeled areas of interest: learn the target signature
• Given test areas of interest: classify the set of samples

Page 34:

Overview of MI-RVM Optimization

• Two-step optimization
1. Estimate the optimal w, given the posterior of w
   • There is no closed-form solution for the parameters of the posterior, so a gradient update method is used
   • Iterate until convergence, then proceed to step 2.
2. Update the parameter on the prior of w
   • The distribution on the target variable has no specific parameters.
   • Until system convergence, continue at step 1.

Page 35:

1) Optimization of w

• Optimize the posterior (Bayes' rule) of w:

$\hat{w}_{MAP} = \arg\max_{w} \; \log p(X \mid w) + \log p(w)$

• Update the weights using the Newton-Raphson method:

$w^{t+1} = w^{t} - \mathbf{H}^{-1} g$

Page 36:

2) Optimization of Prior

• Optimization of the covariance of the prior:

$\hat{A} = \arg\max_{A} \; p(X \mid A) = \arg\max_{A} \int p(X \mid w)\, p(w \mid A)\, dw$

• Making a large number of assumptions, the diagonal elements of A can be estimated:

$a_i^{new} = \frac{1}{w_i^{2} + [\mathbf{H}^{-1}]_{ii}}$
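A structural sketch of the resulting two-step loop; grad_log_post and hess_log_post are hypothetical placeholders for the gradient and Hessian of the log posterior (the slides do not spell them out), and the prior update uses the diagonal estimate as reconstructed above:

```python
import numpy as np

def fit_mi_rvm(w, A, grad_log_post, hess_log_post, n_outer=20, n_inner=50, tol=1e-6):
    """Two-step loop: (1) Newton-Raphson MAP estimate of w, (2) prior update."""
    for _ in range(n_outer):
        # Step 1: w <- w - H^{-1} g until the MAP estimate converges.
        for _ in range(n_inner):
            g, H = grad_log_post(w, A), hess_log_post(w, A)
            step = np.linalg.solve(H, g)
            w = w - step
            if np.linalg.norm(step) < tol:
                break
        # Step 2: diagonal prior precisions, a_i <- 1 / (w_i^2 + [H^{-1}]_ii),
        # with H the Hessian at the current MAP estimate.
        h_diag = np.diag(np.linalg.inv(hess_log_post(w, A)))
        A = np.diag(1.0 / (w ** 2 + h_diag))
    return w, A
```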

Page 37:

Random Sets: Multiple Instance Learning

• Random set framework for multiple instance learning
– Bags are sets
– The idea of finding the commonality of positive bags is inherent in the random set formulation
   • Find commonality using the intersection operator
   • The random set's governing functional is based on the intersection operator:

$T(K) = P(\Gamma \cap K \neq \emptyset)$

Page 38:

MI Issues

• MIL approaches
– Some approaches are biased to believe only one sample in each bag caused the target concept
– Some approaches can only label bags
– It is not clear whether anything is gained over supervised approaches

Page 39:

RSF-MIL

• MIL-like
• Positive bags = blue
• Negative bags = orange
• Distinct shapes = distinct bags

[Figure: germ and grain model — instances (x) from positive and negative bags scattered in feature space, with target concept regions marked T]

Page 40:

Side Note: Bayesian Networks

• Noisy-OR Assumption

– Bayesian Network representation of Noisy-OR

– Polytree: singly connected DAG

Page 41:

Side Note

• A full Bayesian network may be intractable
– Occurrences of causal factors are rare (sparse co-occurrence)
• So assume a polytree
• So assume the result has a Boolean relationship with the causal factors
– Absorb I, X, and A into one node, governed by the randomness of I
• These assumptions greatly simplify the inference calculation
• Calculate Z based on probabilities rather than constructing a distribution using X:

$P\big(Z = 1 \mid \{X_1, X_2, X_3, X_4\}\big) = 1 - \prod_{j} \big(1 - P(Z = 1 \mid X_j)\big)$

Page 42:

Diverse Density (DD)

• Probabilistic approach
– Goal:
   • Standard statistical approaches identify areas in a feature space with a high density of target samples and a low density of non-target samples
   • DD: identify areas in a feature space with a high "density" of samples from EACH of the positive bags ("diverse"), and a low density of samples from negative bags
– Identify attributes or characteristics similar to the positive bags and dissimilar to the negative bags
– Assume t is a target characterization
– Goal:

$\arg\max_{t} \; P\big(t \mid B_1^{+}, \ldots, B_n^{+}, B_1^{-}, \ldots, B_m^{-}\big)$

– Assuming the bags are conditionally independent:

$\arg\max_{t} \; \prod_{i} P\big(t \mid B_i^{+}\big) \prod_{j} P\big(t \mid B_j^{-}\big)$

Page 43:

Diverse Density

• Calculation (noisy-OR model), for $B_i = \{x_{i1}, \ldots, x_{iJ_i}\}$:

$P\big(t \mid B_i^{+}\big) = 1 - \prod_{j} \big(1 - P(t \mid B_{ij}^{+})\big)$  ("it is NOT the case that EACH element is NOT the target concept")

$P\big(t \mid B_i^{-}\big) = \prod_{j} \big(1 - P(t \mid B_{ij}^{-})\big)$

$P\big(t \mid B_{ij}\big) = \exp\big(-\|x_{ij} - t\|^{2}\big)$

• Optimization:

$\arg\max_{t} \; \prod_{i} P\big(t \mid B_i^{+}\big) \prod_{j} P\big(t \mid B_j^{-}\big)$
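A minimal sketch of this Diverse Density objective; in practice t would be found by maximizing it (e.g., by gradient ascent started from instances of the positive bags), which is not shown here:

```python
import numpy as np

def instance_prob(t, x):
    """Gaussian-like bump P(t | B_ij) = exp(-||x_ij - t||^2)."""
    return np.exp(-np.sum((np.asarray(x) - np.asarray(t)) ** 2))

def diverse_density(t, pos_bags, neg_bags):
    dd = 1.0
    for B in pos_bags:   # P(t|B_i^+) = 1 - prod_j (1 - P(t|B_ij))
        dd *= 1.0 - np.prod([1.0 - instance_prob(t, x) for x in B])
    for B in neg_bags:   # P(t|B_i^-) = prod_j (1 - P(t|B_ij))
        dd *= np.prod([1.0 - instance_prob(t, x) for x in B])
    return dd
```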

Page 44:

Random Set Brief

• A random set is the set-valued analogue of a random variable: a measurable mapping from a probability space $(\Omega, \sigma(\Omega), P)$ into a family of sets, rather than into $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$.

Page 45:

Random Set Functionals

• Capacity and avoidance functionals:

$T(K) = P(\Gamma \cap K \neq \emptyset), \qquad Q(K) = P(\Gamma \cap K = \emptyset), \qquad T(K) = 1 - Q(K)$

– Given a germ and grain model: $\Gamma = \bigcup_{j=1}^{n} \Gamma_j$, with grain $\Gamma_j$ centered at germ $g_j$

– Assumed random radii:

$P\big(\{x\} \cap \Gamma_j \neq \emptyset \mid x\big) = T_j(\{x\}) = P(R_j \ge r) = 1 - P(R_j < r) = \exp\!\big(-\lambda_j \, \|x - g_j\|^{2}\big), \qquad r = \|x - g_j\|$

Page 46:

When disjunction makes sense

• Using large overlapping bins, the target concept can be encapsulated within one instance; therefore a disjunctive relationship exists.

[Diagram: instances feed an OR to decide "Target Concept Present"]

Page 47:

Theoretical and Developmental Progress

• Previous optimization: did not necessarily promote diverse density
• Current optimization: better for context learning and MIL

$\arg\max_{g,\lambda} \; \prod_{i} T_{g,\lambda}\big(B_i^{+}\big) \prod_{j} Q_{g,\lambda}\big(B_j^{-}\big)$

• Previously no feature relevance or selection (hypersphere)
– Improvement: included learned weights on each feature dimension
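A compact sketch of evaluating this objective with the germ-and-grain capacity functional from the earlier slides; the germs and scale parameters below are illustrative placeholders, not fitted values:

```python
import numpy as np

def capacity(bag, germs, lams):
    """T(B) = 1 - prod_j prod_{x in B} (1 - exp(-lam_j ||x - g_j||^2))."""
    miss = 1.0
    for g, lam in zip(germs, lams):
        for x in bag:
            miss *= 1.0 - np.exp(-lam * np.sum((x - g) ** 2))
    return 1.0 - miss

def objective(pos_bags, neg_bags, germs, lams):
    """prod_i T(B_i^+) * prod_j Q(B_j^-), with Q = 1 - T."""
    val = 1.0
    for B in pos_bags:
        val *= capacity(B, germs, lams)
    for B in neg_bags:
        val *= 1.0 - capacity(B, germs, lams)
    return val

pos = [np.array([[0.0, 0.0], [3.0, 3.0]])]
neg = [np.array([[3.0, 3.1]])]
print(objective(pos, neg, germs=[np.zeros(2)], lams=[1.0]))
```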

• Previous TO DO list
• Improve existing code
– Develop joint optimization for context learning and MIL
• Apply MIL approaches (broad scale)
• Learn similarities between feature sets of mines
• Aid in training existing algorithms: find the "best" EHD features for training / testing
• Construct set-based classifiers?

Page 48:

How do we impose the MI scenario? Diverse Density (Maron et al.)

• Calculation (noisy-OR model), inherent in the Random Set formulation, for $B_i = \{x_{i1}, \ldots, x_{iJ_i}\}$:

$P\big(t \mid B_i^{+}\big) = 1 - \prod_{j} \big(1 - P(t \mid B_{ij}^{+})\big)$  ("it is NOT the case that EACH element is NOT the target concept")

$P\big(t \mid B_i^{-}\big) = \prod_{j} \big(1 - P(t \mid B_{ij}^{-})\big)$

$P\big(t \mid B_{ij}\big) = \exp\big(-\|x_{ij} - t\|^{2}\big)$

• Optimization: a combination of exhaustive search and gradient ascent

$\arg\max_{t} \; \prod_{i} P\big(t \mid B_i^{+}\big) \prod_{j} P\big(t \mid B_j^{-}\big)$

Page 49:

How can we use Random Sets for MIL?

• Random set for MIL: bags are sets
– The idea of finding the commonality of positive bags is inherent in the random set formulation
   • Sets have either an empty or a non-empty intersection relationship
   • Find commonality using the intersection operator
   • The random set's governing functional is based on the intersection operator

• Example:

Bags with target: {l,a,e,i,o,p,u,f}, {f,b,a,e,i,z,o,u}, {a,b,c,i,o,u,e,p,f}, {a,f,t,e,i,u,o,d,v}
Bags without target: {s,r,n,m,p,l}, {z,s,w,t,g,n,c}, {f,p,k,r}, {q,x,z,c,v}, {p,l,f}

Intersection of the bags with target: {a,e,i,o,u,f}
Union of the bags without target: {f,s,r,n,m,p,l,z,w,t,g,c,v,q,k,x}
Target concept = intersection \ union = {a,e,i,o,u}
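The same vowel example worked with Python sets:

```python
# Target concept = (intersection of positive bags) \ (union of negative bags)
pos_bags = [set("laeiopuf"), set("fbaeizou"), set("abciouepf"), set("afteiuodv")]
neg_bags = [set("srnmpl"), set("zswtgnc"), set("fpkr"), set("qxzcv"), set("plf")]

common = set.intersection(*pos_bags)    # {a, e, i, o, u, f}
seen_negative = set.union(*neg_bags)
print(sorted(common - seen_negative))   # ['a', 'e', 'i', 'o', 'u']
```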