Visual Impression Localization of Autonomous Robots (CASE 2015)


Visual Impression Localization of Autonomous Robots

Somar Boubou (1,2), A. H. Abdul Hafez (3), Einoshin Suzuki (1)

1. Dept. of Informatics, Kyushu University, Japan.

2. Control Systems Laboratory, Toyota Technological Institute, Japan.

3. Dept. of Computer Engineering, Hasan Kalyoncu University, Turkey.


Previous localization methods are precise: every node in the topological map represents a (relatively) precise position of the robot. [Abdul-Hafez13]

Precision: around 1 m in outdoor applications, on the order of millimeters in indoor applications when geometric features are available. [Badino12][BK Kim15]

We achieved a rough but fast localization with BIRCH.

Background and Objective

Base work: Autonomous Mobile Robot that Models HSV Color Information of the Environment [Suzuki 2012]

Navigating indoors, the robot uses the online clustering algorithm BIRCH [Zhang 97] and detects peculiar colors.

Proposed extension to our localization problem

• The robot in [Suzuki 12] signals an observation that is sufficiently far from similar past observations.
• Our robot inherits most of [Suzuki 12] but solves a localization problem by comparing a pair of CF trees based on All Common Subsequences (ACS) [Wang 97].

[Figure: incremental construction of the model. Observed data are inserted into a CF tree kept on RAM; a leaf compresses similar observations; an outlier is an observation that is very different from the corresponding leaf.]

Localization problem

[Figure: the navigation CF tree (Nav), built on RAM, is compared against reference CF trees Ref1 to Ref4 stored on ROM.]

The robot localizes itself by comparing its tree with several reference trees, each of which represents one area of interest.


BIRCH [Zhang 97]

BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies):

• Groups similar examples by building a data index structure called a CF tree (Clustering Feature tree).
• An efficient and scalable clustering method for huge data sets. [Zhang 97]

Applications:

• Peculiar data discovery [Suzuki 12] and intrusion detection [Horng 11]
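As a side note, a minimal sketch of incremental BIRCH clustering using the off-the-shelf scikit-learn implementation (not the robot's own code); the threshold value and the synthetic data stream are illustrative assumptions:

    # Minimal BIRCH sketch with scikit-learn (illustrative only, not the robot's implementation).
    import numpy as np
    from sklearn.cluster import Birch

    rng = np.random.default_rng(0)
    # Fake stream of 3-D feature vectors arriving in small batches.
    stream = [rng.normal(loc=c, scale=0.1, size=(50, 3)) for c in (0.0, 1.0, 2.0)]

    # threshold plays the role of the absorption radius; n_clusters=None keeps the raw CF leaves.
    model = Birch(threshold=0.5, branching_factor=50, n_clusters=None)
    for batch in stream:
        model.partial_fit(batch)           # incremental insertion into the CF tree

    print(len(model.subcluster_centers_))  # number of leaf subclusters (CF entries)

partial_fit inserts each batch into the CF tree incrementally, which is the property the online setting above relies on.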

CF tree [Zhang 97]

The Clustering Feature CF of a cluster X of N d-dimensional data points (feature vectors) x_1, x_2, ..., x_N is the triple

CF = (N, LS, SS),   where  LS = Σ_{i=1}^{N} x_i   and   SS = Σ_{i=1}^{N} x_i².

๐‘‘ ๐ถ๐น๐‘ฅ , ๐ถ๐น๐‘– < ฯ„

CF vector [Zhang97]

9

๐ถ๐น๐‘–โŠ•๐ถ๐น๐‘ฅ = ๐‘๐‘– + ๐‘๐‘ฅ , ๐ฟ๐‘†๐‘– + ๐ฟ๐‘†๐‘ฅ , ๐‘†๐‘†๐‘– + ๐‘†๐‘†๐‘ฅ

๐ถ๐น๐‘ฅ = ๐‘๐‘ฅ , ๐ฟ๐‘†๐‘ฅ , ๐‘†๐‘†๐‘ฅ ๐ถ๐น๐‘– = ๐‘๐‘– , ๐ฟ๐‘†๐‘– , ๐‘†๐‘†๐‘– insert

Yes

No Try again in new location
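A minimal sketch of the CF triple with the absorption test and the ⊕ merge above; the Euclidean centroid distance and the threshold value are assumptions for illustration:

    # Sketch of a CF entry (N, LS, SS) with threshold-based absorption (illustrative assumptions).
    import numpy as np

    class CFEntry:
        def __init__(self, x):
            x = np.asarray(x, dtype=float)
            self.n, self.ls, self.ss = 1, x.copy(), x * x   # N, linear sum, square sum

        def centroid(self):
            return self.ls / self.n

        def distance(self, other):
            # Assumed metric: Euclidean distance between centroids.
            return float(np.linalg.norm(self.centroid() - other.centroid()))

        def merge(self, other):
            # Additivity: CF_i (+) CF_x = (N_i + N_x, LS_i + LS_x, SS_i + SS_x)
            self.n += other.n
            self.ls += other.ls
            self.ss += other.ss

    def insert(entries, x, tau=0.5):
        """Absorb x into the closest entry if d < tau, otherwise start a new entry."""
        new = CFEntry(x)
        if entries:
            closest = min(entries, key=new.distance)
            if new.distance(closest) < tau:
                closest.merge(new)
                return
        entries.append(new)

    entries = []
    for x in [[0.0, 0.0], [0.1, 0.0], [2.0, 2.0]]:
        insert(entries, x)
    print(len(entries))   # 2 entries: the second point is absorbed, the third starts a new entry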

CF vector for an HSV color histogram [Suzuki 12]

The BIRCH CF vector CF = (N, LS, SS) becomes CF = (h, ω, num, key), with

key = [B; G; W; r0:3; o0:3; y0:3; g0:3; c0:3; b0:3; p0:3]

Our extension: introduction of weights [Lei 99]

Robot Navigation

Flow chart for the CF-tree comparison:

Reference tree S with paths {S_1, S_2, ..., S_P};  navigation tree T with paths {T_1, T_2, ..., T_Q}.

1. Comparison of the paths:  δ_acs(p, q) = acs(S_p, T_q) / 2^{M+N} · ω_acs  for each pair (p, q).

2. Arrangement of the comparison results:  δ(1,1), δ(1,2), ..., δ(P,Q)  are sorted as  δ(1) > δ(2) > ... > δ(P·Q).

3. S(S, T) = γ · Σ_{x=1}^{P·Q} δ(x),   Similarity(S, T) = S(S, T) / S(T, T),   where γ = Q² / (P·Q) is a scaling variable.
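A minimal sketch of this aggregation step, assuming a per-path similarity function delta(s, t) is already available (the ACS-based delta is sketched after the node-weighting slide); the toy Jaccard delta below is a placeholder, not the paper's measure:

    # Sketch of the tree-to-tree similarity aggregation (assumes a per-path delta function).
    def tree_score(ref_paths, nav_paths, delta):
        """S(ref, nav) = gamma * sum of all pairwise path similarities, arranged in descending order."""
        P, Q = len(ref_paths), len(nav_paths)
        deltas = sorted((delta(s, t) for s in ref_paths for t in nav_paths), reverse=True)
        gamma = Q * Q / (P * Q)                     # scaling variable gamma = Q^2 / (P*Q)
        return gamma * sum(deltas)

    def similarity(ref_paths, nav_paths, delta):
        """Similarity(ref, nav) = S(ref, nav) / S(nav, nav)."""
        return tree_score(ref_paths, nav_paths, delta) / tree_score(nav_paths, nav_paths, delta)

    # Toy usage with a placeholder delta (Jaccard overlap of node labels, purely illustrative):
    jaccard = lambda s, t: len(set(s) & set(t)) / len(set(s) | set(t))
    ref = [("a", "b", "c"), ("a", "d")]
    nav = [("a", "b"), ("d", "e")]
    print(round(similarity(ref, nav, jaccard), 3))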

Node weighting

Let us consider two paths s = S_p and t = T_q, of lengths M and N.

[Wang 97]:  δ_acs = acs(s, t) / ( acs(s, s) · acs(t, t) )

Example: s = {a, b, c}, t = {a, b}:  acs(s, t) = |{∅, a, b, ab}| = 4.

Introducing the node weight ω_acs (and using acs(s, s) = 2^M, acs(t, t) = 2^N for paths of M and N distinct nodes):

δ_acs = acs(s, t) / 2^{M+N} · ω_acs

Weights are used:
- to define the comparison type.
- to eliminate noise.

ω(i) = ( α·ω_n + β·ω_pos ) / ( α + β ),   with  ω_n(i) = n_i / n_root  and  ω_pos = v / 3

ω_s = ψ( ω(i), i = 1, ..., M )

ω_acs = 1 − ( max(ω_s, ω_t) − min(ω_s, ω_t) ) / max(ω_s, ω_t)
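A minimal sketch of counting all common subsequences with Wang's recurrence; it reproduces the example above (acs({a, b, c}, {a, b}) = 4, counting the empty subsequence) and the fact that acs(s, s) = 2^M for M distinct elements:

    # Count all common subsequences (ACS) of two sequences, empty subsequence included.
    def acs(s, t):
        m, n = len(s), len(t)
        # N[i][j] = number of common subsequences of s[:i] and t[:j]
        N = [[1] * (n + 1) for _ in range(m + 1)]      # row/column 0: only the empty subsequence
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if s[i - 1] == t[j - 1]:
                    N[i][j] = 2 * N[i - 1][j - 1]
                else:
                    N[i][j] = N[i - 1][j] + N[i][j - 1] - N[i - 1][j - 1]
        return N[m][n]

    print(acs("abc", "ab"))    # 4 : {empty, a, b, ab}
    print(acs("abc", "abc"))   # 8 = 2^3, i.e. acs(s, s) = 2^M for M distinct elements

    s, t = "abc", "ab"
    print(acs(s, t) / 2 ** (len(s) + len(t)))   # unweighted delta_acs = 4 / 2^5 = 0.125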

Tree types of comparison (favor of the root)

key = [B; G; W; r0:3; o0:3; y0:3; g0:3; c0:3; b0:3; p0:3]

Example of the two trees being compared:

Tree 1:  49_key_root = [W: 57%; B: 43%],  with children  28_key_chi1 = [W: 100%]  and  21_key_chi2 = [B: 100%].

Tree 2:  49_key_root = [W: 57%; r0: 43%],  with children  28_key_chi1 = [W: 100%]  and  21_key_chi2 = [r0: 100%].

Tree types of comparison (favor of the leaves)

Example of the two trees being compared (both have the same structure):

49_key_root = [W: 90%],  with children  44_key_chi1 = [W: 100%],  1_key_chi2 = [B: 100%],  1_key_chi3 = [G: 100%],  1_key_chi4 = [y0: 100%],  and two further singleton leaves, each 100% a single hue bin.

In key_root, the hue components (r0; o0; y0; g0; ...) are each at 2%, below 5%, so they do not show up in the root key even though each is fully visible in its own leaf.
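A small sketch of the kind of key summarization implied above: components whose proportion falls below a cutoff (5% in the example) drop out of a node's key; the dictionary representation and the cutoff are assumptions for illustration:

    # Hypothetical pruning of a node key: keep only components at or above a proportion cutoff.
    def summarize_key(counts, cutoff=0.05):
        """counts: dict mapping component name -> count. Returns {name: proportion}."""
        total = sum(counts.values())
        return {name: c / total for name, c in counts.items() if c / total >= cutoff}

    root_counts = {"W": 44, "B": 1, "G": 1, "y0": 1, "r0": 1, "o0": 1}
    print(summarize_key(root_counts))   # only W survives (~90%); the 2% components are dropped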

Experiments (1)

- Six areas.
- One reference tree for each area.
- Five navigation trials in each area.
- Three types of comparison were introduced: favor of the root, favor of the leaves, and neutral.

Results (1)

[Figure: results for the three comparison types: favor of the root, favor of the leaves, neutral.]

Experiments (2): KTH-IDOL2 Dataset [Pronobis06]

- Five rooms and three illumination conditions: cloudy, night, and sunny.
- Four navigation trials under each condition:
  - Three trials were used to create reference CF trees.
  - The fourth trial was used to create navigation trees.

Results (2): KTH-IDOL2

[Figure: bar charts (0–80%) for the test conditions Cloudy, Night, and Sunny, one panel per training condition (Training /Cloudy/, Training /Night/, Training /Sunny/). Legend: CAMML, NBM, Filter.]

Compared methods from [Rubio 14]: CAMML (Bayesian network) and the Naive Bayes method.

Computation time (our platform)

- PC with 32-bit Ubuntu 12.04.
- Intel Core i7 920 CPU, clock speed 2.67 GHz.
- RAM: 11.8 GB.
- Image resolution: 320×240.

Acquisition: t_a = 29 ms per frame; at 15 fps (Q ≈ 60), T_a = t_a × 15 = 435 ms.

Comparison of the reference tree S (paths 1...P) with the navigation tree T (paths 1...Q):  T_c = t_c · P · Q;  with t_c = 0.031 ms and P = Q = 60,  T_c = 111.56 ms.

Total:  T_tot = T_a + T_c = 546.56 ms.
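A quick check of the timing arithmetic above, with the values taken from the slide (T_a assumes 15 frames of acquisition at t_a each):

    # Timing figures from the slide; T_a assumes 15 frames of acquisition at t_a each.
    t_a, t_c = 29.0, 0.031           # ms per frame, ms per path pair
    P = Q = 60
    T_a = t_a * 15                   # 435 ms
    T_c = t_c * P * Q                # 111.6 ms (reported as 111.56 ms)
    print(T_a, T_c, T_a + T_c)       # roughly 546.6 ms total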

Contributions

• Extended the discovery robot [Suzuki 12] to solve our localization problem.
• Introduced a new measure for CF-tree similarity based on ACS.
• Found that color-based features were not stable under different illumination conditions.

Future work

• We are planning to investigate features that are more robust to changes in the environment due to illumination etc. (e.g., SIFT, SURF, WI-SURF, HOG).

Thank you for listening…