Instance-Based Learning Algorithms


Presented by Yan T. Yang

Agenda

• Background: what is instance-based learning?
• Two simple algorithms
• Extensions [Aha, 1994]:
  • Feedback algorithm
  • Noise reduction
  • Irrelevant attribute elimination
  • Novel attribute adoption

Learning Paradigms

• Cognitive psychology: how do people, animals, and machines learn? [Jerome Bruner]
• Two schools of thought [Bruner, Goodnow and Austin, 1967]:
  • Abstraction-based: form a generalized idea from the examples, then use it to classify new objects.
    • Examples: artificial neural networks, support vector machines, rule-based learners/decision trees ("if not animated, then not an animal").
  • Instance-based: store all (suitable) training examples, then compare new objects to the stored examples.

Comparison Between Two Paradigms

• Abstraction-based:
  • Generalization: rules, discriminant planes or functions, trees
  • Workload is during training time
  • Little work during query time
• Instance-based:
  • Stores (suitable) examples as saved instances
  • Workload is during query time
  • Little work during training time

Instance-based Learning

Training set: T = {(x_1, y_1), (x_2, y_2), ..., (x_m, y_m)}, with each y_i ∈ C

Example [Aha, 1994]: attributes are "is enrolled", "has MS degree", and "is married"; C = {PhD student, not PhD student}

(<True, True, True>, PhD student)
(<False, False, True>, not PhD student)
(<True, False, False>, PhD student)

An instance-based learning algorithm maps the training set to:

• Concept description: T* ⊆ T
• Similarity function: sim(x_1, x_2) ∈ [0, 1]
• Classification function: class(x_{m+1}, T*, sim) = y_{m+1} ∈ C
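As a concrete illustration, here is a minimal Python sketch of these three components on the PhD-student example above; the function names and the attribute-overlap similarity are illustrative choices, not prescribed by [Aha, 1994]:

```python
# Minimal sketch of the instance-based learning framework.
# An instance x is a tuple of attribute values; T is a list of (x, y) pairs.

# Training set T from the [Aha, 1994] example above.
T = [
    ((True, True, True), "PhD student"),
    ((False, False, True), "not PhD student"),
    ((True, False, False), "PhD student"),
]

def sim(x1, x2):
    """Similarity function sim(x1, x2) in [0, 1]:
    here, the fraction of matching attribute values."""
    return sum(a == b for a, b in zip(x1, x2)) / len(x1)

def classify(x_new, cd):
    """Classification function: return the class of the most
    similar instance in the concept description cd (= T*)."""
    return max(cd, key=lambda inst: sim(x_new, inst[0]))[1]

print(classify((True, True, False), T))  # -> PhD student
```

The later sketches in this transcript reuse sim, classify, and T from this block.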

Instance-based Learning Algorithm

• Input: training set
• Output: concept description
• Similarity function
• Classification function
• Optional:
  • Keep track of each concept description instance's correct and incorrect classification counts
  • Concept description adder
  • Concept description remover

Instance-based Learning Algorithm

• Advantages and disadvantages [Mitchell, 1997]
• Advantages:
  • Training is very fast
  • Can learn complex class memberships
  • Does not lose information
• Disadvantages:
  • Slow at query time
  • Easily fooled by irrelevant attributes

Instance-based Learning Algorithm

• Example IBL1:
  • Assign the class of the most similar concept description (CD) instance to the new instance
  • Nearest neighbor
  • Save all training instances in the concept description

[Figure: Voronoi tessellation of the training data]
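A minimal Python sketch of IBL1, reusing the sim, classify, and T definitions from the framework sketch above (names are illustrative):

```python
def ibl1_train(training_set):
    """IBL1: the concept description is every training instance,
    saved unconditionally."""
    cd = []
    for x, y in training_set:
        cd.append((x, y))  # save all instances, correct or not
    return cd

# Classification is plain nearest neighbor over the full training set:
cd = ibl1_train(T)
print(classify((False, True, True), cd))
```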

Instance-based Learning Algorithm

• Example IBL2:
  • Similar to IBL1: nearest neighbor
  • Save only the incorrectly classified training instances in the concept description (CD)
  • Intuition: misclassified instances nearly always lie on the boundary between two classes; if these boundary instances are saved, the remaining instances, which lie far from the boundaries, can still be classified correctly by the similarity function [Karadeniz, 1996]
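The corresponding sketch for IBL2; the only change from IBL1 is the update rule, which saves an instance only when the current concept description misclassifies it:

```python
def ibl2_train(training_set):
    """IBL2: keep an instance only if the concept description
    built so far would misclassify it."""
    cd = []
    for x, y in training_set:
        if not cd or classify(x, cd) != y:
            cd.append((x, y))  # misclassified -> likely near a class boundary
    return cd
```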

Criticisms

Mainly because nearest neighbor algorithms are the basis [Breiman, Friedman, Olshen and Stone, 1984]:

1. They are expensive due to their storage requirements
2. They are sensitive to the choice of the similarity function
3. They cannot easily work with missing attribute values
4. They cannot easily work with nominal attributes
5. They do not yield concise summaries of concepts

Responses [Aha, 1992]:
• IBL2 rectifies 1
• The extensions (following slides) rectify 1, 2, and 3
• [Stanfill and Waltz, 1986] rectifies 4
• [Salzberg, 1990] rectifies 5

Extension: Filtering Noisy Training Instances (IBL3)

Modification:

1. Maintain classification records
2. Save only significantly good instances; and
3. Discard noisy saved instances (i.e., those with significantly poor classification performance)

Extension: Filtering Noisy Training Instances (IBL3)

Component                   | IBL2                              | IBL3
Similarity function         | Euclidean distance                | Euclidean distance
Classification function     | Nearest acceptable neighbor       | Nearest acceptable neighbor
Concept description updater | Save only misclassified instances | Save only misclassified instances; use only significantly good saved instances; remove significantly bad saved instances

Extension: Filtering Noisy Training Instances (IBL3)

"Significantly" good or bad is decided with statistical confidence intervals (CI) [Hogg and Tanis, 1983]:

• Construct a CI for the current instance's classification accuracy
• Construct a CI for its class's current observed relative frequency

[Figures: classification accuracy vs. class frequency. An instance whose accuracy CI lies above its class-frequency CI is "significantly" good; one whose accuracy CI lies below it is "significantly" bad]
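A sketch of this accept/drop test. [Aha, 1992] specifies its own interval formula and confidence levels; the normal-approximation interval and the z value below are stand-in assumptions used only to show the logic:

```python
import math

def conf_interval(successes, trials, z=1.645):
    """Normal-approximation CI for a proportion (z=1.645 ~ 90% confidence).
    A stand-in assumption for the exact interval used in [Aha, 1992]."""
    if trials == 0:
        return (0.0, 1.0)
    p = successes / trials
    half = z * math.sqrt(p * (1.0 - p) / trials)
    return (max(0.0, p - half), min(1.0, p + half))

def significantly_good(correct, attempts, class_count, total_seen):
    """Accept: the instance's accuracy CI lies entirely above
    its class's observed-frequency CI."""
    acc_lo, _ = conf_interval(correct, attempts)
    _, freq_hi = conf_interval(class_count, total_seen)
    return acc_lo > freq_hi

def significantly_bad(correct, attempts, class_count, total_seen):
    """Drop as noise: the accuracy CI lies entirely below the frequency CI."""
    _, acc_hi = conf_interval(correct, attempts)
    freq_lo, _ = conf_interval(class_count, total_seen)
    return acc_hi < freq_lo
```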

Extension: Tolerate Irrelevant Attributes (IBL4)

• IBL1-IBL3 assume all attributes have equal relevance;
• Real world: some attributes are more discriminative than others;
• Irrelevant attributes cause poor performance.

• Regular similarity measure: plain Euclidean distance over all attributes
• IBL4's similarity measure: Euclidean distance with learned, concept-dependent attribute weights
  • Concept-dependent: sim(animal, tiger, cat) > sim(pet, tiger, cat)
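A sketch of a concept-dependent, attribute-weighted Euclidean similarity in the spirit of IBL4 (the slide's formula images are lost from the transcript); the weight values and the distance-to-similarity mapping here are illustrative assumptions, not Aha's exact scheme:

```python
import math

def weighted_distance(x1, x2, weights):
    """Weighted Euclidean distance: a weight near 0 makes an
    attribute effectively irrelevant to the comparison."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(weights, x1, x2)))

def weighted_sim(x1, x2, weights):
    """Map distance into (0, 1] so that larger means more similar."""
    return 1.0 / (1.0 + weighted_distance(x1, x2, weights))

# Weights are stored per concept, so the same pair of instances can be
# similar for one concept and dissimilar for another, as in
# sim(animal, tiger, cat) > sim(pet, tiger, cat).
weights_by_concept = {
    "animal": (1.0, 0.2, 0.0),  # illustrative values
    "pet": (0.1, 1.0, 0.9),
}
```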

Extension: Tolerate Novel Attributes (IBL5)

• IBL1-IBL4 assume all attributes are known a priori to the training process;
• Everyday situations: instances may not initially be described by all possible attributes;
• Missing values are a different issue; common treatments: 1) assign "don't know"; 2) assign the most probable value; 3) assign all possible values [Gams and Lavrac, 1987]
• Extension (IBL5): allow novel attributes to be introduced late in the training process (extra: handles missing values in a novel way)
• IBL5's similarity measure extends IBL4's weighted Euclidean distance, restricting the comparison to attributes known in both instances
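A simplified sketch of a similarity that tolerates novel attributes: attribute slots undefined (None) in either instance are skipped. Note that Aha's treatment of missing values is more involved than this skipping rule, which is an assumption made for brevity:

```python
import math

def ibl5_distance(x1, x2, weights):
    """Weighted distance over only the attributes that both instances
    define; novel/unknown attribute slots (None) are skipped."""
    shared = [(w, a, b) for w, a, b in zip(weights, x1, x2)
              if a is not None and b is not None]
    if not shared:
        return float("inf")  # no shared attributes: maximally far apart
    return math.sqrt(sum(w * (a - b) ** 2 for w, a, b in shared))

# An instance described later in training simply gains a new attribute slot;
# older instances hold None there, and the slot is ignored when comparing.
print(ibl5_distance((1.0, None, 0.5), (0.0, 1.0, 0.5), (1.0, 1.0, 1.0)))  # 1.0
```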

Results

[Results figures; IB = instance-based learning (IBL)]

Thanks

• Q and A
