Fast exact k nearest neighbors search using an orthogonal search tree

Post on 18-Jan-2016


Intelligent Database Systems Lab

N.Y.U.S.T.

I. M.

Fast exact k nearest neighbors search using an orthogonal search tree

Presenter: Chun-Ping Wu
Authors: Yi-Ching Liaw, Maw-Lin Leou, Chien-Min Wu

PR (Pattern Recognition) 2010

國立雲林科技大學 National Yunlin University of Science and Technology


Outline

Motivation

Objective

Methodology

Experiments

Conclusion

Comments

Motivation

Finding the k nearest neighbors of a query point with the FSA (full search algorithm) is very time consuming.

Many algorithms try to reduce the computational complexity of the kNN finding process, typically by searching a pre-built tree structure.
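The FSA mentioned above is simply a brute-force scan over all points. A minimal Python sketch for reference (function and variable names are illustrative, not from the paper):

```python
import math

def fsa_knn(points, query, k):
    """Full search algorithm (FSA): measure the distance from the query
    to every point, then keep the k smallest -- brute force, one distance
    computation per data point."""
    dists = sorted((math.dist(p, query), i) for i, p in enumerate(points))
    return [i for _, i in dists[:k]]

points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
print(fsa_knn(points, (0.1, 0.1), 2))  # → [0, 1]
```

Every tree method below is an attempt to avoid computing most of these distances while returning exactly this result.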


For a large PAT (principal axis search tree), the computation time spent evaluating boundary points and projection values becomes large.

Objective

To reduce the computation time spent evaluating boundary points and projection values during the kNN search for a query point.

The proposed method requires no boundary points and only a little computation time for evaluating projection values during the kNN finding process.

Methodology

The OST (orthogonal search tree) algorithm:

OST construction process

k nearest neighbors search using the OST


The OST construction process

[Diagram: OST construction. The point set {1, 2, …, 9} at the root is recursively partitioned into child nodes {1,2,3}, {4,5,6}, {7,8,9}, and finally into individual points.]
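The slides do not spell out the splitting rule; the paper derives the partition from projections onto orthogonal axes. As a rough illustration only, here is a kd-tree-style sketch that splits each node three ways along the axis of largest spread, mirroring the diagram's three-way splits (the splitting rule and all names are assumptions, not the paper's exact procedure):

```python
class Node:
    def __init__(self, indices, axis=None, children=None):
        self.indices = indices        # point indices held at this node
        self.axis = axis              # splitting axis (None for a leaf)
        self.children = children or []

def build_ost(points, indices, leaf_size=3):
    """Recursively partition the point set: choose the coordinate axis
    with the largest spread, sort the subset along it, and split it
    into three roughly equal-size children."""
    if len(indices) <= leaf_size:
        return Node(indices)
    dim = len(points[0])
    axis = max(range(dim),
               key=lambda a: max(points[i][a] for i in indices)
                           - min(points[i][a] for i in indices))
    order = sorted(indices, key=lambda i: points[i][axis])
    third = len(order) // 3
    parts = [order[:third], order[third:2 * third], order[2 * third:]]
    return Node(indices, axis,
                [build_ost(points, p, leaf_size) for p in parts])

root = build_ost([(float(i),) for i in range(9)], list(range(9)))
print([c.indices for c in root.children])  # → [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
```

With nine 1-D points this reproduces the diagram's shape: a root holding all nine indices and three leaves of three points each.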


k nearest neighbors search using the orthogonal search tree

[Diagram: kNN search descends the tree from the root {1, 2, …, 9} through the child nodes {1,2,3}, {4,5,6}, {7,8,9} down to individual points, pruning subtrees along the way.]
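Tree-based kNN search is a branch-and-bound traversal: keep the k best distances seen so far and skip any subtree that provably cannot improve them. The sketch below uses per-node bounding boxes for the pruning test (an illustrative stand-in for the paper's projection-value bounds; the node layout and all names are assumptions):

```python
import heapq

def box_dist2(box, query):
    """Squared distance from the query to an axis-aligned bounding box
    (zero when the query lies inside the box)."""
    lo, hi = box
    return sum(max(l - q, 0.0, q - h) ** 2 for l, q, h in zip(lo, query, hi))

def knn_search(node, points, query, k, heap=None):
    """Branch-and-bound kNN: a max-heap (via negated distances) holds the
    k best squared distances; subtrees whose bounding box lies farther
    than the current k-th best distance are pruned."""
    if heap is None:
        heap = []
    if node["children"]:
        # visit the nearer child first so pruning triggers sooner
        for child in sorted(node["children"],
                            key=lambda c: box_dist2(c["box"], query)):
            if len(heap) == k and box_dist2(child["box"], query) >= -heap[0][0]:
                continue  # nothing in this subtree can beat the k-th best
            knn_search(child, points, query, k, heap)
    else:
        for i in node["indices"]:
            d2 = sum((a - b) ** 2 for a, b in zip(points[i], query))
            if len(heap) < k:
                heapq.heappush(heap, (-d2, i))
            elif d2 < -heap[0][0]:
                heapq.heapreplace(heap, (-d2, i))
    return sorted((-d2, i) for d2, i in heap)

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0), (6.0, 5.0), (5.0, 6.0)]
left = {"children": [], "indices": [0, 1, 2], "box": ((0.0, 0.0), (1.0, 1.0))}
right = {"children": [], "indices": [3, 4, 5], "box": ((5.0, 5.0), (6.0, 6.0))}
root = {"children": [left, right], "indices": list(range(6)),
        "box": ((0.0, 0.0), (6.0, 6.0))}
res = knn_search(root, pts, (0.2, 0.2), 2)  # the far cluster is pruned
print(res)  # list of (squared distance, point index) pairs
```

Because the pruning test is a lower bound on the true distance, the search is exact: it returns the same neighbors as the brute-force FSA while skipping whole subtrees, which is the effect the slides attribute to the OST.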

Experiments

Example 1: uniform Markov source data


Example 2: auto-correlated data


Example 3: clustered Gaussian data


Example 4: data sets are codebooks generated from 6 real images.


Example 5: Statlog data set.

[Figure: results on the Statlog data set; the slide highlights the values 34% and 39%.]

Conclusion


Experimental results show that the proposed method consistently spends less computation time finding the kNN of a query point than the competing methods.

The proposed method finds exactly the same results as the FSA (full search algorithm).

Comments


Advantage: reduces the computation time of the kNN finding process.

Drawback: lack of illustrations.

Application: classification.
