
  • 8. Classification and Pattern Recognition

    2012

  • Introduction

    Classification is arranging things by class or category. Pattern recognition involves the identification of objects; it can also be seen as a classification process.

  • Classification

    In classification (or clustering), the most important issue is deciding what criteria to classify against.

    To classify people, we may look at their height, weight, gender, religion, education, appearance, and so on.

    Height and weight are numerical in nature; the other features are simply linguistic descriptors.

    Classifying people according to a single feature such as gender is simple: female or male.

    But suppose we want to classify people according to whether we would want them as neighbors. Here the number of features to be used in the classification is not at all clear, and we might also have trouble developing a criterion for this classification.

  • Classification (contd.)

    So for classification, expert knowledge is used, and it can be expressed in a very natural way using linguistic variables, which are described by fuzzy sets.

    In fuzzy classification, a sample can have membership in many different classes to different degrees. Typically, the membership values are constrained so that all of the membership values for a particular sample sum to 1.

    The rules can be combined in a table, called the rule base.

    Depending on the system, it may not be necessary to evaluate every possible input combination, since some may rarely or never occur.

  • Classification (contd.)

    Example:

    Salmon is oblong and light in color; sea bass is rectangular and dark.

  • Classification (contd.)

    Example of a fuzzy rule base:

    Shape        Color   Class
    oblong       light   salmon
    rectangular  dark    sea bass

    Note that categories may not refer to final classes but can refer to overlapping ranges of feature values (e.g., Bolti is diamond in shape and medium in color).
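The rule base above can be sketched as a tiny fuzzy classifier. This is a minimal illustration, not code from the lecture: the two feature values (degree of "oblong-ness" and of "lightness") and the fish measurements are assumed, and the class memberships are normalized so they sum to 1, as described earlier.

```python
# Illustrative fuzzy rule-base classifier for the salmon / sea bass example.
# mu_oblong and mu_light are assumed fuzzy feature memberships in [0, 1].

def classify_fish(mu_oblong, mu_light):
    """Fire each rule with min (AND), then normalize so memberships sum to 1."""
    mu_rect, mu_dark = 1.0 - mu_oblong, 1.0 - mu_light
    # Rule 1: IF shape is oblong AND color is light THEN class is salmon
    salmon = min(mu_oblong, mu_light)
    # Rule 2: IF shape is rectangular AND color is dark THEN class is sea bass
    sea_bass = min(mu_rect, mu_dark)
    total = salmon + sea_bass
    if total == 0:
        return {"salmon": 0.5, "sea bass": 0.5}  # no rule fires: undecided
    return {"salmon": salmon / total, "sea bass": sea_bass / total}

# A fish that is fairly oblong (0.8) and fairly light (0.7):
print(classify_fish(0.8, 0.7))
```

A fish can thus belong to both classes to different degrees; a crisp decision, if needed, picks the class with the largest normalized membership.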

  • Classification by Equivalence Relations

    An equivalence relation R can divide the universe X into mutually exclusive equivalence classes.

    Example (crisp relations): Define a universe of integers X = {1, 2, 3, . . . , 10} and define R as the crisp relation "having the identical remainder after dividing each element of the universe by 3." We have the relational matrix shown on the slide.

    Hany Selim

  • Classification by Equivalence Relations (contd.)

    This relation is reflexive, symmetric, and transitive; hence the matrix is an equivalence relation. We can group the elements of the universe into classes as follows:

    [1] = [4] = [7] = [10] = {1, 4, 7, 10}   with remainder 1
    [2] = [5] = [8] = {2, 5, 8}              with remainder 2
    [3] = [6] = [9] = {3, 6, 9}              with remainder 0

    These classes do not overlap, i.e., they are mutually exclusive: [1] ∩ [2] = ∅ and [2] ∩ [3] = ∅.

    The union of all the classes exhausts the universe: [1] ∪ [2] ∪ [3] = X.

    The quotient set is then determined to have three classes: {(1, 4, 7, 10), (2, 5, 8), (3, 6, 9)}
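The remainder-mod-3 example can be checked directly in code: build the crisp relation on X = {1, ..., 10}, verify the three equivalence properties, and read off the classes.

```python
# Crisp equivalence relation "same remainder after division by 3" on X = {1..10}.
X = list(range(1, 11))
R = {(x, y): x % 3 == y % 3 for x in X for y in X}

# Reflexive, symmetric, and transitive -> an equivalence relation.
assert all(R[(x, x)] for x in X)
assert all(R[(x, y)] == R[(y, x)] for x in X for y in X)
assert all(not (R[(x, y)] and R[(y, z)]) or R[(x, z)]
           for x in X for y in X for z in X)

# The equivalence classes form the quotient set X/R.
classes = {}
for x in X:
    classes.setdefault(x % 3, []).append(x)
print(sorted(classes.values()))  # [[1, 4, 7, 10], [2, 5, 8], [3, 6, 9]]
```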

  • Application of the α-Cut in Classification

    The α-cut of a fuzzy set A is the crisp set

    Aα = {x ∈ X | μA(x) ≥ α}

    Any particular fuzzy set A can be transformed into an infinite number of α-cut sets, because there are an infinite number of values α on the interval [0, 1].

    Any element x ∈ Aα belongs to A with a grade of membership that is greater than or equal to the value α.

    α-cuts for fuzzy relations: Each row of the relational matrix R is considered a fuzzy set, i.e., the jth row in R represents a discrete membership function for a fuzzy set. Hence, a fuzzy relation can be converted to a crisp relation using an α-cut. Since R is a two-dimensional array defined on the universes X and Y, any pair (x, y) ∈ Rα belongs to R with a "strength" of relation greater than or equal to α.
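An α-cut of a fuzzy relation is easy to sketch in code: every entry with membership at least α becomes 1, the rest become 0. The 3×3 matrix here is illustrative only, not a relation from the lecture.

```python
# α-cut of a fuzzy relation: threshold every membership value at α.
def alpha_cut(R, alpha):
    return [[1 if r >= alpha else 0 for r in row] for row in R]

# An illustrative 3x3 fuzzy relation (values assumed for the example).
R = [[1.0, 0.8, 0.4],
     [0.8, 1.0, 0.5],
     [0.4, 0.5, 1.0]]

print(alpha_cut(R, 0.8))  # [[1, 1, 0], [1, 1, 0], [0, 0, 1]]
print(alpha_cut(R, 0.4))  # [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
```

Lower values of α admit more pairs into the crisp relation, which is why the resulting classification gets coarser as α decreases.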

  • Classification by Equivalence Relations (contd.)

    Example (fuzzy relations): Consider the tolerance relation of Chapter 3, which was transformed into an equivalence relation by two compositions with itself.

    By taking α-cuts of the fuzzy equivalence relation R at values α = 1, 0.9, 0.8, 0.5, and 0.4, we get the crisp relations shown on the slide.

  • Classification by Equivalence Relations (contd.)

    The clustering of the five data points according to the α-cut level is shown in the table.

    We can express the classification scenario described in this table with a systematic classification diagram, in which the higher the value of α, the finer the classification. That is, as α gets larger, the classification tends to approach the trivial case where each data point is assigned to its own class.
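This "finer classification as α grows" behavior can be demonstrated with a hypothetical 5-point fuzzy equivalence relation (the actual matrix from Chapter 3 is not reproduced here, so the values below are assumed, chosen to be max–min transitive).

```python
# Partition five data points by α-cut of a fuzzy equivalence relation.
def partition(R, alpha):
    """Group points i, j into one class whenever R[i][j] >= alpha."""
    classes, seen = [], set()
    for i in range(len(R)):
        if i not in seen:
            cls = [j for j in range(len(R)) if R[i][j] >= alpha]
            seen.update(cls)
            classes.append(cls)
    return classes

# Hypothetical fuzzy equivalence relation on five points.
R = [[1.0, 0.8, 0.4, 0.4, 0.4],
     [0.8, 1.0, 0.4, 0.4, 0.4],
     [0.4, 0.4, 1.0, 0.9, 0.5],
     [0.4, 0.4, 0.9, 1.0, 0.5],
     [0.4, 0.4, 0.5, 0.5, 1.0]]

for a in (1.0, 0.9, 0.8, 0.5, 0.4):
    print(a, partition(R, a))
```

With this matrix, α = 1.0 puts every point in its own class, while α = 0.4 merges everything into one class, which matches the trend described above.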

  • Pattern Recognition

    Pattern recognition can be defined as a process of identifying structure in data by comparison to known structure; the known structure is developed through methods of classification.

    The purpose of a pattern recognition system is to assign each input to one of c possible pattern classes (or data clusters).

    Basically, classification establishes (or seeks to determine) the structure in data, whereas pattern recognition attempts to take new data and assign them to one of the classes defined in the classification process. Simply stated, classification defines the patterns and pattern recognition assigns data to a class.

    The data used to design a pattern recognition system are usually divided into two categories: design (or training) data and test data.

    Examples of pattern recognition applications: handwritten character and word recognition; automatic classification of X-ray images and electrocardiograms; speech recognition and speaker identification; fingerprint recognition; target identification; and human face recognition.

  • Pattern Recognition Example

    In the example of analyzing images of the Earth's land and ocean sent by the SAR satellite, three features were used, namely the wavelength λ, the entropy H, and the angle α related to the type of scattering. The rule base was given by:

    λ                  H             α            Decision
    very high          medium        —            urban
    high or very high  very low      medium/high  urban
    high               high          —            forest
    medium             high          medium/high  forest
    medium             medium        medium/low   vegetation
    medium             low/very low  low          vegetation
    very low           —             —            runway

  • Classifying Chromosomes

    Lee [1975] described a method of examining the shape of chromosomes in order to classify them into the three categories pictured in Fig. 1. As can be seen from this figure, the classification scheme is based on the ratio of the length of the arms of the chromosome to its total body length. It is difficult to identify sharp boundaries between these three types. Therefore, Lee uses a method of fuzzy pattern recognition, which compares the angles and arm lengths of the chromosome with those labeled in an idealized skeleton. Each pattern u is characterized by 13 features (angles and distances), which are shown in Fig. 2.

    Fig. 1

  • Figure 2: Idealized pattern for chromosome classification

    u = (θ1, θ2, θ3, ..., θ8, d1, d2, d3, ..., d5)

  • Single-Sample Identification

    A typical problem in pattern recognition is to collect data from a physical process and classify them into known patterns. Consider a pattern characterized by one feature (one-dimensional). Suppose we have several typical patterns stored in our knowledge base, expressed as fuzzy sets A1, A2, . . . , Am.

    Now suppose we are given a new data sample, characterized by the crisp singleton x0. Using the simple criterion of maximum membership, the typical pattern that the data sample most closely resembles is found by the expression

    μAi*(x0) = max{ μA1(x0), μA2(x0), ..., μAm(x0) }

    In the figure, the new data sample defined by the singleton x0 most closely resembles the pattern described by fuzzy set A2.
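The maximum-membership criterion can be sketched as follows. The three triangular patterns and the sample value x0 = 4.6 are assumed purely for illustration; they are not the ones drawn on the slide.

```python
# Single-sample identification by maximum membership.
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative stored patterns A1..A3 (one-dimensional fuzzy sets).
patterns = {
    "A1": lambda x: tri(x, 0.0, 2.0, 4.0),
    "A2": lambda x: tri(x, 3.0, 5.0, 7.0),
    "A3": lambda x: tri(x, 6.0, 8.0, 10.0),
}

x0 = 4.6  # new crisp data sample (assumed value)
best = max(patterns, key=lambda name: patterns[name](x0))
print(best)  # A2: it has the largest membership at x0
```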

  • The Approaching Degree [Wang, 1983]

    Let us define a and b as fuzzy vectors of length n. The fuzzy inner product of a and b is

    a • bᵀ = ∨ᵢ (aᵢ ∧ bᵢ),   i = 1, ..., n

    and the fuzzy outer product of a and b is

    a ⊕ bᵀ = ∧ᵢ (aᵢ ∨ bᵢ),   i = 1, ..., n

  • The Approaching Degree [Wang, 1983] (contd.)

    Example: We have two fuzzy vectors of length 4 as defined here, and want to find the inner product and the outer product for these two fuzzy vectors:

    a = (0.3, 0.7, 1, 0.4)
    b = (0.5, 0.9, 0.3, 0.1)

    a • bᵀ = (0.3 ∧ 0.5) ∨ (0.7 ∧ 0.9) ∨ (1 ∧ 0.3) ∨ (0.4 ∧ 0.1)
           = 0.3 ∨ 0.7 ∨ 0.3 ∨ 0.1 = 0.7

    a ⊕ bᵀ = (0.3 ∨ 0.5) ∧ (0.7 ∨ 0.9) ∧ (1 ∨ 0.3) ∧ (0.4 ∨ 0.1)
           = 0.5 ∧ 0.9 ∧ 1 ∧ 0.4 = 0.4
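The worked example above, in code: the fuzzy inner product is the max of the pairwise mins, and the fuzzy outer product is the min of the pairwise maxes.

```python
# Fuzzy inner and outer products of two fuzzy vectors.
def inner(a, b):
    """a • b^T: max over i of min(a_i, b_i)."""
    return max(min(ai, bi) for ai, bi in zip(a, b))

def outer(a, b):
    """a ⊕ b^T: min over i of max(a_i, b_i)."""
    return min(max(ai, bi) for ai, bi in zip(a, b))

a = (0.3, 0.7, 1.0, 0.4)
b = (0.5, 0.9, 0.3, 0.1)

print(inner(a, b))  # 0.7
print(outer(a, b))  # 0.4
```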

  • The Approaching Degree [Wang, 1983] (contd.)

    Now we define two metrics to assess the degree of similarity of the two fuzzy sets A and B:

    (A, B)₁ = (A • B) ∧ (1 − A ⊕ B)
    (A, B)₂ = ½ [(A • B) + (1 − A ⊕ B)]

    It can be shown that the first metric (A, B)₁ always gives a value less than or equal to the value obtained from the second metric (A, B)₂. Both of these metrics represent a concept that has been called the approaching degree. In particular, when either of these metrics approaches 1, the two fuzzy sets A and B are "more closely similar"; on the other hand, when either of them approaches 0, the two fuzzy sets are "more far apart" (dissimilar). The metric (A, B)₁ uses a minimum property to describe similarity, and (A, B)₂ uses an arithmetic mean.
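A short sketch of the two approaching-degree metrics, assuming the min form and the arithmetic-mean form described above. It also checks the crisp-complement case discussed next: a crisp set and its complement have inner product 0 and outer product 1, so both metrics come out 0.

```python
# Approaching-degree metrics built on the fuzzy inner and outer products.
def inner(a, b):
    return max(min(ai, bi) for ai, bi in zip(a, b))

def outer(a, b):
    return min(max(ai, bi) for ai, bi in zip(a, b))

def approaching_1(a, b):
    """Minimum form: (A, B)1 = (A • B) ∧ (1 − A ⊕ B)."""
    return min(inner(a, b), 1.0 - outer(a, b))

def approaching_2(a, b):
    """Arithmetic-mean form: (A, B)2 = ½[(A • B) + (1 − A ⊕ B)]."""
    return 0.5 * (inner(a, b) + (1.0 - outer(a, b)))

# An illustrative crisp set A on five elements and its complement B.
A = (1, 0, 1, 0, 0)
B = tuple(1 - x for x in A)

print(approaching_1(A, B))  # 0.0 -> completely dissimilar
print(approaching_2(A, B))  # 0.0
print(approaching_1(A, A) <= approaching_2(A, A))  # metric 1 <= metric 2
```

Since a minimum of two numbers never exceeds their average, the min form can never exceed the arithmetic-mean form, which is the ordering stated above.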

  • The Approaching Degree [Wang, 1983] (contd.)

    Example: Suppose we have a universe of five discrete elements, X = {x1, x2, x3, x4, x5}, and we define two fuzzy sets, A and B, on this universe. Note that these two fuzzy sets are special: they are actually crisp sets, and each is the complement of the other.

    If we calculate the quantities expressed by the approaching degree, we obtain the value 0 for both metrics.

    The conclusion is that a crisp set and its complement are completely dissimilar.
