TRANSCRIPT
Introduction to Artificial Intelligence, 2nd semester 2016/2017
Chapter 9: Inference in First-Order Logic
Mohamed B. Abubaker
Palestine Technical College – Deir El-Balah
Outline
• Reducing first-order inference to propositional inference
• Unification
• Generalized Modus Ponens
• Forward chaining
• Backward chaining
• Resolution
Inference in First Order Logic
• The idea is that first-order inference can be done by converting the knowledge base to
propositional logic and using propositional inference.
• This technique is known as propositionalization.
• Given: ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
One can infer:
King(John) ∧ Greedy(John) ⇒ Evil(John)
King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
King(Father (John)) ∧ Greedy(Father (John)) ⇒ Evil(Father (John))
Inference in First Order Logic
• Universal Instantiation (from a ∀ sentence, infer any instance obtained by substituting a ground term for the variable)
• Existential Instantiation (from a ∃ sentence, substitute one new constant symbol for the variable, use the resulting sentence, and discard the ∃ sentence)
• A Skolem constant is a new constant symbol that names the object the ∃ sentence asserts to exist.
Universal Instantiation (UI)
• We can infer any sentence obtained by substituting a ground term (a term without variables) for the variable.
• Let SUBST(θ, α) denote the result of applying the substitution θ to the sentence α. Then the rule is written:

  ∀v α
  ─────────────────
  SUBST({v/g}, α)

for any variable v and ground term g.
• For example, the three sentences given earlier are obtained with the substitutions {x/John}, {x/Richard}, and {x/Father(John)}.
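The substitution step can be sketched in code. A minimal illustration (not from the slides), holding sentences as plain strings and θ as a dict; a real implementation would operate on parsed terms rather than text:

```python
# Naive sketch of SUBST for Universal Instantiation: apply a substitution
# theta (variable -> ground term) to a sentence held as a string template.
# Plain string replacement is only safe here because the variable name 'x'
# does not occur inside any predicate or constant name in the example.
def subst(theta, sentence):
    for var, term in theta.items():
        sentence = sentence.replace(var, term)
    return sentence

rule = "King(x) & Greedy(x) => Evil(x)"
for theta in ({"x": "John"}, {"x": "Richard"}, {"x": "Father(John)"}):
    print(subst(theta, rule))
# King(John) & Greedy(John) => Evil(John)
# King(Richard) & Greedy(Richard) => Evil(Richard)
# King(Father(John)) & Greedy(Father(John)) => Evil(Father(John))
```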
Existential instantiation (EI)
• For any sentence α, variable v, and constant symbol k (that does not appear elsewhere in the knowledge base):

  ∃v α
  ─────────────────
  SUBST({v/k}, α)
• For example, from the sentence
∃x Crown(x) ∧ OnHead(x,John)
• we can infer the sentence
Crown(C1) ∧ OnHead(C1,John)
where C1 is a new constant symbol, called a Skolem constant
• Existential and universal instantiation allow us to “propositionalize” any FOL sentence or KB.
Reduction to propositional inference
Suppose KB:
• ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
• King(John)
• Greedy(John)
• Brother(Richard, John)
Apply UI using {x/John} and {x/Richard}
• King(John) ∧ Greedy(John) ⇒ Evil(John)
• King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
Then discard the universally quantified sentence; the KB now consists only of propositions.
Problems with Propositionalization
• Propositionalization generates lots of irrelevant sentences, so inference may be very inefficient.
• For example, given:
∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
King(John)
∀y Greedy(y)
Brother(Richard,John)
• it seems obvious that Evil(John) is entailed, but propositionalization produces lots of facts such as Greedy(Richard) that are irrelevant.
Unification
• Subst(θ, p) = result of substituting θ into sentence p
• Unify algorithm: takes 2 sentences p and q and returns a unifier if one exists
Unify(p,q) = θ where Subst(θ, p) = Subst(θ, q)
• Example:
p = Knows(John,x)
q = Knows(John, Jane)
Unify(p,q) = {x/Jane}
Unification examples
• simple example: query = Knows(John,x), i.e., who does John know?
p q θ
Knows(John,x) Knows(John,Jane) {x/Jane}
Knows(John,x) Knows(y,Bill) {x/Bill,y/John}
Knows(John,x) Knows(y,Mother(y)) {y/John,x/Mother(John)}
Knows(John,x) Knows(x,Elizabeth) {fail}
• The last unification fails only because x cannot take the values John and Elizabeth at the same time.
• But we know that if John knows x, and everyone (x) knows Elizabeth, we should be able to infer that John knows Elizabeth.
• The problem is due to the use of the same variable x in both sentences.
• Simple solution: standardizing apart eliminates the overlap of variables, e.g., Knows(x17, Elizabeth).
Unification examples
p q θ
Knows(John,x) Knows(x17,Elizabeth) {x17/John, x/Elizabeth}
Knows(John,x) Knows(y,z) {y/John, x/z} OR {y/John, x/John, z/John}
For the last pair: the first unifier is more general than the second; it is the most general unifier (MGU).
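The table above can be reproduced with the standard recursive unification algorithm. A minimal sketch under an assumed representation (lowercase strings are variables, capitalized strings are constants, compound terms are tuples); the occurs check is omitted for brevity:

```python
def is_var(t):
    # Assumed convention: lowercase strings are variables (x, y, x17).
    return isinstance(t, str) and t[0].islower()

def unify(x, y, theta=None):
    """Return a most general unifier of x and y (extending theta), or False."""
    if theta is None:
        theta = {}
    if theta is False or x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):          # unify argument lists pairwise
            theta = unify(xi, yi, theta)
            if theta is False:
                return False
        return theta
    return False

def unify_var(var, x, theta):
    if var in theta:
        return unify(theta[var], x, theta)
    if is_var(x) and x in theta:
        return unify(var, theta[x], theta)
    return {**theta, var: x}              # no occurs check in this sketch

print(unify(("Knows", "John", "x"), ("Knows", "John", "Jane")))    # {'x': 'Jane'}
print(unify(("Knows", "John", "x"), ("Knows", "y", "Bill")))       # {'y': 'John', 'x': 'Bill'}
print(unify(("Knows", "John", "x"), ("Knows", "x", "Elizabeth")))  # False (same x in both)
```

For the Knows(y, Mother(y)) row, this sketch returns x bound to Mother(y) together with y/John; composing the bindings gives the table's MGU {y/John, x/Mother(John)}.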
Example
∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
King(John)
∀y Greedy(y)
Brother(Richard,John)
And we would like to infer Evil(John) without propositionalization.
Generalized Modus Ponens (GMP)
• For atomic sentences pi, pi′ and q, where there is a substitution θ such that SUBST(θ, pi′) = SUBST(θ, pi) for all i:

  p1′, p2′, …, pn′,  (p1 ∧ p2 ∧ … ∧ pn ⇒ q)
  ───────────────────────────────────────────
  SUBST(θ, q)
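GMP can be exercised on the King/Greedy example. A simplified sketch of my own (atoms as tuples, lowercase strings as variables, no function terms, and a greedy choice of the first matching fact, so there is no backtracking):

```python
def is_var(t):
    return t[0].islower()   # assumed convention: lowercase = variable

def unify_atoms(p, f, theta):
    """Unify atoms p and f (tuples (Pred, arg, ...)) under theta; None on failure."""
    if p[0] != f[0] or len(p) != len(f):
        return None
    theta = dict(theta)
    for a, b in zip(p[1:], f[1:]):
        a, b = theta.get(a, a), theta.get(b, b)   # one-step dereference
        if a == b:
            continue
        if is_var(a):
            theta[a] = b
        elif is_var(b):
            theta[b] = a
        else:
            return None
    return theta

def gmp(facts, premises, conclusion):
    """Find theta with SUBST(theta, p'_i) = SUBST(theta, p_i) for all i,
    then return SUBST(theta, q). Greedy: takes the first matching fact."""
    theta = {}
    for p in premises:
        for f in facts:
            t = unify_atoms(p, f, theta)
            if t is not None:
                theta = t
                break
        else:
            return None
    return tuple(theta.get(a, a) for a in conclusion)

facts = [("King", "John"), ("Greedy", "y")]   # King(John) and ∀y Greedy(y)
premises = [("King", "x"), ("Greedy", "x")]   # King(x) ∧ Greedy(x)
print(gmp(facts, premises, ("Evil", "x")))    # ('Evil', 'John')
```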
Completeness and Soundness of GMP
• GMP is sound
• Only derives sentences that are logically entailed
• GMP is complete for a KB consisting of definite clauses
• Complete: derives all sentences that are entailed
• OR…answers every query whose answers are entailed by such a KB
Inference approaches in FOL
• Forward chaining
  • Uses GMP to add new atomic sentences
  • Useful for systems that make inferences as information streams in
  • Requires the KB to be in the form of first-order definite clauses
• Backward chaining
  • Works backwards from a query to try to construct a proof
  • Can suffer from repeated states and incompleteness
  • Useful for query-driven inference
• Resolution-based inference (FOL)
  • Refutation-complete for a general KB
  • Can be used to confirm or refute a sentence p (but not to generate all entailed sentences)
  • Requires the FOL KB to be reduced to CNF
  • Uses a generalized version of the propositional inference rule
• Note that all of these methods are generalizations of their propositional equivalents.
Horn Clauses and Definite Clauses
Horn Form
• KB is a conjunction of Horn clauses.
• Horn clause: at most one literal is positive (“not negated”).
• For example: (¬P ∨ ¬Q ∨ V) is a Horn clause.
• So is (¬P ∨ W). But (P ∨ Q ∨ V) is not.
• Definite clause: exactly one literal is positive.
• Horn clauses can be re-written as implications:
  • a proposition symbol (fact), or
  • a conjunction of symbols (body or premise) ⇒ symbol (head)
• Example: (¬C ∨ ¬B ∨ A) becomes (C ∧ B ⇒ A)
• Modus Ponens for Horn KB:

  α1, …, αn,  α1 ∧ … ∧ αn ⇒ β
  ─────────────────────────────
  β
Forward and Backward Chaining
Forward chaining example
(Figures only on these slides: the KB drawn as a graph of “AND” gates and “OR” gates, with successive slides stepping through the facts added at each forward-chaining iteration.)
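The stepping shown in the figures can be reproduced with the propositional forward-chaining algorithm. A sketch assuming the usual textbook KB for this kind of example (P ⇒ Q, L ∧ M ⇒ P, B ∧ L ⇒ M, A ∧ P ⇒ L, A ∧ B ⇒ L, with facts A and B); if the slides used different rules, substitute them accordingly:

```python
from collections import deque

def forward_chain(rules, facts, query):
    """Propositional forward chaining: rules are (premise_set, conclusion) pairs.
    Keeps a count of unsatisfied premises per rule; a rule fires when it hits 0."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:            # each symbol is processed at most once
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:    # all premises satisfied: add the conclusion
                    agenda.append(concl)
    return False

# Assumed KB (standard textbook forward-chaining example, not from the slides):
rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(forward_chain(rules, ["A", "B"], "Q"))  # True
```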
Backward chaining
Idea: work backwards from the query q:
• check if q is known already, or
• prove by BC all premises of some rule concluding q.
• Hence BC maintains a stack of sub-goals that need to be proved to get to q.
Avoid loops: check if the new sub-goal is already on the goal stack.
Avoid repeated work: check if the new sub-goal
1. has already been proved true, or
2. has already failed.
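The goal-stack idea can be sketched directly, assuming a propositional KB of (premise_set, conclusion) rules and the standard textbook example KB (an assumption, not taken from the slides). Loop avoidance is implemented; the “avoid repeated work” memoization is left out for brevity:

```python
def backward_chain(rules, facts, goal, stack=()):
    """Prove goal by backward chaining over propositional definite clauses."""
    if goal in facts:                  # q is known already
        return True
    if goal in stack:                  # sub-goal already on the goal stack: loop
        return False
    for prem, concl in rules:
        # try each rule concluding the goal; prove all of its premises
        if concl == goal and all(
                backward_chain(rules, facts, p, stack + (goal,)) for p in prem):
            return True
    return False

# Assumed KB: with A and B given, proving Q needs P, P needs L and M, and
# the rule A ∧ P ⇒ L would loop (P is back on the stack), so the prover
# falls back to A ∧ B ⇒ L.
rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(backward_chain(rules, {"A", "B"}, "Q"))  # True
```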
Backward chaining example
(Figures only on these slides: the proof tree is expanded goal by goal; at one point we need P to prove L and L to prove P, which illustrates the loop check.)
Forward vs. backward chaining
• FC is data-driven
  • May do lots of work that is irrelevant to the goal
• BC is goal-driven
  • Complexity of BC can be much less than linear in the size of the KB
Knowledge Base in FOL
The law says that it is a crime for an American to sell weapons to hostile nations. The
country Nono, an enemy of America, has some missiles, and all of its missiles were sold to
it by Colonel West, who is American.
Prove that Colonel West is a criminal.
Knowledge Base in FOL
(Figure only: the sentences above written in FOL; they are listed later under “Recall: Example Knowledge Base in FOL”.)
Forward chaining proof
(Figures only: the proof of Criminal(West) is built bottom-up from the facts over successive slides.)
Backward chaining example
(Figures only: the proof of Criminal(West) is constructed top-down from the query over successive slides.)
Resolution in FOL
• Full first-order version:

  l1 ∨ ··· ∨ lk,    m1 ∨ ··· ∨ mn
  ─────────────────────────────────────────────────────────────────────────
  SUBST(θ, l1 ∨ ··· ∨ li−1 ∨ li+1 ∨ ··· ∨ lk ∨ m1 ∨ ··· ∨ mj−1 ∨ mj+1 ∨ ··· ∨ mn)

  where Unify(li, ¬mj) = θ.

The two clauses are assumed to be standardized apart so that they share no variables.
For example,

  ¬Rich(x) ∨ Unhappy(x),    Rich(Ken)
  ───────────────────────────────────
  Unhappy(Ken)

with θ = {x/Ken}
• Apply resolution steps to CNF(KB ∧ ¬α); complete for FOL
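A single binary resolution step with unification can be sketched as follows (an assumed representation of my own: lowercase strings are variables, compound terms are tuples, and each literal carries an explicit sign; occurs check omitted):

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def unify(x, y, theta):
    """MGU of two terms, extending theta; False on failure (no occurs check)."""
    if theta is False or x == y:
        return theta
    if is_var(x):
        return unify(theta[x], y, theta) if x in theta else {**theta, x: y}
    if is_var(y):
        return unify(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            theta = unify(a, b, theta)
        return theta
    return False

def subst(theta, t):
    """Apply theta to a term, chasing chained variable bindings."""
    if is_var(t):
        return subst(theta, theta[t]) if t in theta else t
    if isinstance(t, tuple):
        return tuple(subst(theta, a) for a in t)
    return t

def resolve(c1, c2):
    """One resolution step: clauses are lists of (sign, atom) literals.
    Pick l_i in c1 and m_j in c2 with opposite signs whose atoms unify,
    and return SUBST(theta, the remaining literals); None if no pair unifies."""
    for i, (s1, a1) in enumerate(c1):
        for j, (s2, a2) in enumerate(c2):
            if s1 != s2:
                theta = unify(a1, a2, {})
                if theta is not False:
                    rest = c1[:i] + c1[i+1:] + c2[:j] + c2[j+1:]
                    return [(s, subst(theta, a)) for s, a in rest]
    return None

# ¬Rich(x) ∨ Unhappy(x)  resolved with  Rich(Ken):
c1 = [("-", ("Rich", "x")), ("+", ("Unhappy", "x"))]
c2 = [("+", ("Rich", "Ken"))]
print(resolve(c1, c2))  # [('+', ('Unhappy', 'Ken'))], i.e. Unhappy(Ken) with θ = {x/Ken}
```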
Converting FOL sentences to CNF
Original sentence:
Everyone who loves all animals is loved by someone:
∀x [∀y Animal(y) ⇒ Loves(x,y)] ⇒ [∃y Loves(y,x)]
1. Eliminate implications:
∀x ¬[∀y ¬Animal(y) ∨ Loves(x,y)] ∨ [∃y Loves(y,x)]
2. Move ¬ inwards:
Recall: ¬∀x p ≡ ∃x ¬p,  ¬∃x p ≡ ∀x ¬p
∀x [∃y ¬(¬Animal(y) ∨ Loves(x,y))] ∨ [∃y Loves(y,x)]
∀x [∃y ¬¬Animal(y) ∧ ¬Loves(x,y)] ∨ [∃y Loves(y,x)]
∀x [∃y Animal(y) ∧ ¬Loves(x,y)] ∨ [∃y Loves(y,x)]
Conversion to CNF contd.
3. Standardize variables: each quantifier should use a different one:
∀x [∃y Animal(y) ∧ ¬Loves(x,y)] ∨ [∃z Loves(z,x)]
4. Skolemize: a more general form of existential instantiation. Each existential variable is replaced by a Skolem function of the enclosing universally quantified variables:
∀x [Animal(F(x)) ∧ ¬Loves(x,F(x))] ∨ Loves(G(x),x)
(reason: animal y could be a different animal for each x.)
Conversion to CNF contd.
5. Drop universal quantifiers:
[Animal(F(x)) ∧ ¬Loves(x,F(x))] ∨ Loves(G(x),x)
(all remaining variables are assumed to be universally quantified)
6. Distribute ∨ over ∧:
[Animal(F(x)) ∨ Loves(G(x),x)] ∧ [¬Loves(x,F(x)) ∨ Loves(G(x),x)]
The original sentence is now in CNF; the same steps convert every sentence in the KB.
We also need to include the negated query.
Then use resolution to attempt to derive the empty clause, which shows that the query is entailed by the KB.
Example
KB:
Everyone who loves all animals is loved by someone
Anyone who kills animals is loved by no-one
Jack loves all animals
Either Curiosity or Jack killed the cat, who is named Tuna
Query: Did Curiosity kill the cat?
Inference Procedure:
Express the sentences in FOL
Convert to CNF form and add the negated query
Resolution-based Inference
(Figures only: the KB sentences and the negated query converted to CNF clauses, labeled A1, A2, B, C, D, E, F, G.)
Resolution steps:
H  Animal(Tuna)                             from E, F   {x/Tuna}
I  Kills(Jack,Tuna)                         from D, G
J  ¬Animal(F(Jack)) ∨ Loves(G(Jack),Jack)   from A2, C  {x/Jack, F(x)/x}
K  Loves(G(Jack),Jack)                      from J, A1  {F(x)/F(Jack), x/Jack}
L  ¬Loves(y,x) ∨ ¬Kills(x,Tuna)             from H, B   {z/Tuna}
M  ¬Loves(y,Jack)                           from I, L   {x/Jack}
N  ⊥ (empty clause)                         from M, K   {y/G(Jack)}
Confusing, because the sentences have not been standardized apart…
Recall: Example Knowledge Base in FOL
... it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
Nono … has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
Owns(Nono,M1) and Missile(M1)
… all of its missiles were sold to it by Colonel West:
Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
Missiles are weapons:
Missile(x) ⇒ Weapon(x)
An enemy of America counts as "hostile":
Enemy(x,America) ⇒ Hostile(x)
West, who is American …:
American(West)
The country Nono, an enemy of America …:
Enemy(Nono,America)
Convert to CNF
Q: Criminal(West)?
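The whole example can be run with a naive first-order forward chainer. A sketch under my own representation (atoms as tuples, lowercase strings as variables, no function terms); rules are re-tried against every combination of facts until a fixpoint, which is exponential in general but fine for this tiny KB:

```python
from itertools import product

def is_var(t):
    return t[0].islower()   # assumed convention: lowercase = variable

def match(premise, fact, theta):
    """Match a premise atom against a ground fact under theta; None on failure."""
    if premise[0] != fact[0] or len(premise) != len(fact):
        return None
    theta = dict(theta)
    for a, b in zip(premise[1:], fact[1:]):
        a = theta.get(a, a)
        if a == b:
            continue
        if is_var(a):
            theta[a] = b
        else:
            return None
    return theta

def forward_chain(rules, facts):
    """Apply GMP with every combination of facts until no new fact is added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            for combo in product(facts, repeat=len(premises)):
                theta = {}
                for p, f in zip(premises, combo):
                    theta = match(p, f, theta)
                    if theta is None:
                        break
                if theta is None:
                    continue
                new = tuple(theta.get(a, a) for a in conclusion)
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

rules = [
    ([("American", "x"), ("Weapon", "y"), ("Sells", "x", "y", "z"), ("Hostile", "z")],
     ("Criminal", "x")),
    ([("Missile", "x"), ("Owns", "Nono", "x")], ("Sells", "West", "x", "Nono")),
    ([("Missile", "x")], ("Weapon", "x")),
    ([("Enemy", "x", "America")], ("Hostile", "x")),
]
facts = [("Owns", "Nono", "M1"), ("Missile", "M1"),
         ("American", "West"), ("Enemy", "Nono", "America")]
print(("Criminal", "West") in forward_chain(rules, facts))  # True
```

The first pass derives Sells(West,M1,Nono), Weapon(M1), and Hostile(Nono); the second pass then fires the crime rule, yielding Criminal(West).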
Resolution proof
(Figures only: the resolution refutation deriving the empty clause from the CNF clauses together with ¬Criminal(West).)
Adapted from Lecture Notes on Inference Methods, Dr. Mustafa Jarrar, University of Birzeit
END