Learning Qualitative Models Ivan Bratko, Dorian Suc
Presented by Cem Dilmegani
FEEL FREE TO ASK QUESTIONS DURING PRESENTATION
Summary
● Understand the QUIN algorithm
● Explore the crane example
● Analyze learning models expressed as QDEs:
  - GENMODEL by Coiera
  - QSI by Say and Kuru
  - QOPH by Coghill et al.
  - ILP systems
● Conclusion:
  - Applications
  - Further progress
Modeling
● Modeling is complex
● Modeling requires creativity
● Solution: use machine learning algorithms for modeling
Learning

examples → learning → hypothesis
Decision Tree
Decision Tree Algorithm
QUIN (QUalitative INduction)
● Looks for qualitative patterns in quantitative data
● Uses so-called qualitative trees
Qualitative tree
● The splits define a partition of the attribute space into areas with common qualitative behaviour of the class variable
● Qualitatively constrained functions (QCFs) in the leaves define qualitative constraints on the class variable
Qualitatively constrained functions (QCFs)
The qualitative constraint given by the sign only states that when the i-th attribute increases, the class variable also changes in the direction specified by M, barring other changes.
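As a minimal sketch (not the authors' implementation), a QCF such as Z = M+,-(X, Y) can be encoded as one sign per attribute, and the QCF-prediction of each attribute is just its sign multiplied by that attribute's qualitative change:

```python
# Qualitative changes encoded as +1 (inc), -1 (dec), 0 (no change).
def qcf_predictions(signs, changes):
    """Per-attribute QCF-predictions: the sign constraint of each
    attribute multiplied by its observed qualitative change."""
    return [s * c for s, c in zip(signs, changes)]

# Z = M+,-(X, Y): X increasing and Y decreasing both predict
# that Z increases.
print(qcf_predictions([+1, -1], [+1, -1]))  # [1, 1]
```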
Qualitative Tree Example
Explanation of Algorithm(Leaf Level)
● The minimal-cost QCF is sought
● Cost = complexity of M + (inconsistencies or ambiguities between the dataset and the QCF)
Consistency
● A QCV (Qualitative Change Vector) is consistent with a QCF if either:
  a) the class qualitative change is zero,
  b) all attributes' QCF-predictions are zero, or
  c) there exists an attribute whose QCF-prediction equals the class's qualitative change.

Examples for Z = M+,-(X, Y):
● a) no_change = (inc, dec)
● a) no_change = (inc, inc)
● b) * = (no_change, no_change)
● c) inc = (inc, dec)
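The three cases a)-c) can be written down directly. A minimal sketch with hypothetical helper names, where qualitative changes are +1 (inc), -1 (dec) or 0, and the QCF-predictions are sign × change:

```python
def is_consistent(class_change, signs, changes):
    """QCV consistency with a QCF, mirroring cases a)-c):
    a) the class qualitative change is zero, or
    b) all QCF-predictions are zero, or
    c) some attribute's QCF-prediction equals the class change."""
    preds = [s * c for s, c in zip(signs, changes)]
    return (class_change == 0
            or all(p == 0 for p in preds)
            or class_change in preds)

# Z = M+,-(X, Y); case c): class inc with (X inc, Y dec).
print(is_consistent(+1, [+1, -1], [+1, -1]))  # True
```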
Ambiguity
● A qualitative ambiguity appears:
  a) when there exist both positive and negative QCF-predictions, or
  b) whenever all QCF-predictions are zero.

Examples for Z = M+,-(X, Y):
● a) * = (inc, inc)
● b) * = (no_change, no_change)
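Continuing the same sketch, ambiguity is detected from the signs of the QCF-predictions alone:

```python
def is_ambiguous(signs, changes):
    """Qualitative ambiguity, mirroring cases a)-b):
    a) both positive and negative QCF-predictions exist, or
    b) all QCF-predictions are zero."""
    preds = [s * c for s, c in zip(signs, changes)]
    has_pos = any(p > 0 for p in preds)
    has_neg = any(p < 0 for p in preds)
    return (has_pos and has_neg) or all(p == 0 for p in preds)

# Z = M+,-(X, Y): (X inc, Y inc) yields predictions of both signs.
print(is_ambiguous([+1, -1], [+1, +1]))  # True
```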
Ambiguity-Inconsistency
Explanation of Algorithm
● Leaf level: start with the QCF that minimizes the cost on a single attribute, then use the error-cost to refine the current QCF with further attributes
● Tree level: QUIN chooses the best split by comparing the partitions of the examples it generates. For every possible split it divides the examples into two subsets (according to the split), finds the minimal-cost QCF in each subset, and selects the split that minimizes the tree error-cost. This continues until a specified error bound is reached.
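The tree-level search can be sketched as a greedy loop over candidate splits; `qcf_cost` stands in for QUIN's minimal-cost QCF fitting, and all names here are hypothetical:

```python
def best_split(examples, attributes, qcf_cost):
    """For every candidate (attribute, threshold) split, partition
    the examples into two subsets, fit the minimal-cost QCF on each
    side (abstracted here as qcf_cost), and keep the cheapest split."""
    best = None
    for attr in attributes:
        for thr in sorted({ex[attr] for ex in examples}):
            left = [ex for ex in examples if ex[attr] <= thr]
            right = [ex for ex in examples if ex[attr] > thr]
            if not left or not right:
                continue  # a split must produce two non-empty subsets
            cost = qcf_cost(left) + qcf_cost(right)
            if best is None or cost < best[0]:
                best = (cost, attr, thr)
    return best  # (cost, attribute, threshold), or None
```

QUIN then recurses on the two subsets of the winning split until the specified error bound is reached.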
Qualitative Reverse Engineering
● In industry, there exist libraries of designs and corresponding simulation models that are not well documented
● We may have to reverse-engineer complex simulations to understand how they function
● Similar to QSI
Crane Simulation
QUIN Approach
● Looks counterintuitive?
● Yes, but it outperforms straightforward transformations of quantitative data into a quantitative model, such as regression
Identification of Operator's Skill
● The skill cannot be elicited from the operator verbally (Bratko and Urbancic, 1999)
● The skill is manifested in the operator's actions; QUIN is better at explaining these skills than quantitative models
Comparison of 2 Operators
● S (slow)
● L (adventurous)
Explanation of S's Strategy
● At the beginning, V increases as X increases (load behind the crane)
● Later, V decreases as X increases (load gradually moves ahead of the crane)
● V increases as the angle increases (crane catches up with the load)
GENMODEL by Coiera
● QSI without hidden variables
● Algorithm:
  1. Construct all possible constraints using all observed variables
  2. Evaluate all constraints
  3. Retain the constraints that are satisfied by all states; discard all others
  4. The retained constraints are the model
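The generate-and-filter loop above can be sketched as follows; the constraint checkers are toy placeholders, not Coiera's actual qualitative predicates:

```python
from itertools import permutations

def genmodel(states, variables, constraint_types):
    """GENMODEL sketch: enumerate candidate constraints over all
    ordered pairs of observed variables, keep only those satisfied
    by every observed state, and discard the rest."""
    model = []
    for name, holds in constraint_types.items():
        for x, y in permutations(variables, 2):
            if all(holds(s[x], s[y]) for s in states):
                model.append((name, x, y))
    return model

# Toy run: two states in which a and b always coincide.
states = [{'a': 1, 'b': 1}, {'a': 2, 'b': 2}]
checks = {'equal': lambda u, v: u == v}
print(genmodel(states, ['a', 'b'], checks))
# [('equal', 'a', 'b'), ('equal', 'b', 'a')]
```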
GENMODEL by Coiera
● Limitations:
  - Assumes that all variables are observed
  - Biased towards the most specific models (overfitting)
  - Does not support operating regions
QSI by Say and Kuru
● Explained last week
● Algorithm:
  - Starts like GENMODEL
  - Constructs new variables if needed
● Limitations:
  - Biased towards the most specific model
Negative Examples
● Consider the U-tube example:
  - Conservation of water holds until the second tube bursts or overflows
  - There cannot be a negative amount of water in a container
● Evaporation?
Inductive Logic Programming (ILP)
● ILP is a machine learning approach which uses techniques of logic programming.
● From a database of facts which are divided into positive and negative examples, an ILP system tries to derive a logic program that proves all the positive and none of the negative examples.
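The acceptance criterion above, proving every positive example and no negative one, can be sketched with the hypothesis represented as a plain predicate (a toy stand-in for a logic program):

```python
def acceptable(hypothesis, positives, negatives):
    """An induced program must prove all positive examples and
    none of the negative examples."""
    return (all(hypothesis(e) for e in positives)
            and not any(hypothesis(e) for e in negatives))

# Toy target concept: even numbers.
hyp = lambda n: n % 2 == 0
print(acceptable(hyp, [0, 2, 4], [1, 3]))  # True
```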
Inductive Logic Programming (ILP)
● Advantages:
  - No need to create a new program; uses an established framework
  - Hidden variables are introduced
  - Can learn models with multiple operating regions as well
Applications
● A German car manufacturer simplified their wheel-suspension system with QUIN
● Induction of patient-specific models from patients' measured cardiovascular signals using GENMODEL
● An ILP-based learning system (QuMAS) learnt the electrical system of the heart and can explain many types of cardiac arrhythmias
Suggestions for Further Progress
● Better methods for transforming numerical data into qualitative data
● Deeper study of principles or heuristics associated with the discovery of hidden variables
● More effective use of general ILP techniques
Sources
● Dorian Suc, Ivan Bratko, “Qualitative Induction”
● Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press
● Wikipedia
Any Questions?