19 Uncertain Evidence
TRANSCRIPT
Page 2 ===
Outline
Review of Probability Theory
Probabilistic Inference
Bayes Networks
Patterns of Inference in Bayes Networks
Uncertain Evidence
D-Separation
Probabilistic Inference in Polytrees
Page 3 ===
19.1 Review of Probability Theory
Random variables
Joint probability
(B (BAT_OK), M (MOVES), L (LIFTABLE), G (GAUGE))
Joint Probability
(True, True, True, True) 0.5686
(True, True, True, False) 0.0299
(True, True, False, True) 0.0135
(True, True, False, False) 0.0007
…                         …   (the remaining 12 of the 16 rows are omitted)
Page 4 ===
19.1 Review of Probability Theory
Marginal probability
Conditional probability
–Ex. The probability that the battery is charged given that the arm does not move: p(B|¬M)
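A minimal sketch of marginalization and conditioning over a full joint distribution. The two-variable joint below uses made-up numbers for illustration, not the slides' B/M/L/G table.

```python
# A toy joint distribution p(Battery, Moves); the four values are
# invented for illustration and sum to 1.
joint = {
    (True, True): 0.72,
    (True, False): 0.18,
    (False, True): 0.01,
    (False, False): 0.09,
}

# Marginal probability: sum the joint over the variables we drop.
p_battery = sum(p for (b, m), p in joint.items() if b)

# Conditional probability: p(B | not M) = p(B, not M) / p(not M).
p_not_moves = sum(p for (b, m), p in joint.items() if not m)
p_battery_given_not_moves = joint[(True, False)] / p_not_moves
```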
Page 7 ===
19.2 Probabilistic Inference
The probability that some variable Vi has value vi given the evidence ε = e.
p(P,Q,R)      0.3
p(P,Q,¬R)     0.2
p(P,¬Q,R)     0.2
p(P,¬Q,¬R)    0.1
p(¬P,Q,R)     0.05
p(¬P,Q,¬R)    0.1
p(¬P,¬Q,R)    0.05
p(¬P,¬Q,¬R)   0.0

Ex. p(Q|¬R) = ?
p(Q,¬R) = p(P,Q,¬R) + p(¬P,Q,¬R) = 0.2 + 0.1 = 0.3
p(¬Q,¬R) = p(P,¬Q,¬R) + p(¬P,¬Q,¬R) = 0.1 + 0.0 = 0.1
p(¬R) = p(Q,¬R) + p(¬Q,¬R) = 0.3 + 0.1 = 0.4
p(Q|¬R) = p(Q,¬R) / p(¬R) = 0.3 / 0.4 = 0.75
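The computation of p(Q|¬R) can be checked mechanically by marginalizing the eight-row joint table from this slide:

```python
# The eight joint probabilities p(P,Q,R) from the slide, keyed by the
# truth values of (P, Q, R).
joint = {
    (True, True, True): 0.3,
    (True, True, False): 0.2,
    (True, False, True): 0.2,
    (True, False, False): 0.1,
    (False, True, True): 0.05,
    (False, True, False): 0.1,
    (False, False, True): 0.05,
    (False, False, False): 0.0,
}

# p(Q, not R): marginalize P out of the rows with Q true and R false.
p_q_notr = sum(p for (P, Q, R), p in joint.items() if Q and not R)
# p(not R): marginalize both P and Q out.
p_notr = sum(p for (P, Q, R), p in joint.items() if not R)
# p(Q | not R) = p(Q, not R) / p(not R)
p_q_given_notr = p_q_notr / p_notr
```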
Page 8 ===
Statistical Independence
Conditional independence: p(Vi|Vj,Vk) = p(Vi|Vk) — Vi is independent of Vj given Vk
Mutual conditional independence: p(V1,...,Vk|V) = p(V1|V) ··· p(Vk|V)
Unconditional independence: p(Vi|Vj) = p(Vi)
Page 9 ===
19.3 Bayes Networks
Directed, acyclic graph (DAG) whose nodes are labeled by random variables.
Characteristics of Bayesian networks
–Node Vi is conditionally independent of any subset of nodes that are not descendants of Vi, given its immediate parents.
p(Vi | A(Vi), P(Vi)) = p(Vi | P(Vi))
where
A(Vi) = any set of nodes in the graph that are not descendants of Vi
P(Vi) = the immediate parents of Vi in the graph
Page 10 ===
Bayes Networks
Prior probability
Conditional probability table (CPT)
p(V1, V2, ..., Vk) = Π(i=1..k) p(Vi | P(Vi))
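The factored joint can be sketched on a tiny chain A → B → C; the CPT numbers below are invented for illustration, and the product of the per-node CPTs is verified to be a proper distribution.

```python
from itertools import product

# A sketch of the factored joint p(V1,...,Vk) = prod_i p(Vi | parents(Vi))
# for the chain A -> B -> C. All CPT numbers are made up.
p_a = {True: 0.6, False: 0.4}
p_b_given_a = {True: {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}
p_c_given_b = {True: {True: 0.7, False: 0.3},
               False: {True: 0.5, False: 0.5}}

def joint(a, b, c):
    # Chain rule restricted to each node's parents in the DAG.
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# The factored joint is a proper distribution: it sums to 1.
total = sum(joint(a, b, c) for a, b, c in product([True, False], repeat=3))
```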
Page 12 ===
Inference in Bayes Networks
Causal or top-down inference
–Ex. The probability that the arm moves given that the block is liftable
p(M|L) = p(M,B|L) + p(M,¬B|L)
       = p(M|B,L) p(B|L) + p(M|¬B,L) p(¬B|L)
       = p(M|B,L) p(B) + p(M|¬B,L) p(¬B)     (B and L are independent)
p(M|L) = 0.855
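The 0.855 result can be reproduced by summing B out. The CPT values below (p(B) = 0.95, p(M|B,L) = 0.9, p(M|¬B,L) = 0.0) are assumptions consistent with the slide's result; they are not printed on this page.

```python
# Causal (top-down) inference:
# p(M|L) = p(M|B,L) p(B) + p(M|not B,L) p(not B).
p_b = 0.95                   # assumed prior p(B): battery is charged
p_m_given_bl = {True: 0.9,   # assumed p(M | B, L)
                False: 0.0}  # assumed p(M | not B, L)

p_m_given_l = sum(p_m_given_bl[b] * (p_b if b else 1 - p_b)
                  for b in (True, False))
```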
Page 13 ===
Inference in Bayes Networks
Diagnostic or bottom-up inference
–Using an effect (or symptom) to infer a cause
–Ex. The probability that the block is not liftable given that the arm does not move.
p(¬M|¬L) = 0.9525   (computed by causal reasoning, as above)
p(¬L|¬M) = p(¬M|¬L) p(¬L) / p(¬M) = (0.9525 × 0.3) / p(¬M) = 0.28575 / p(¬M)   (Bayes' rule)
p(L|¬M) = p(¬M|L) p(L) / p(¬M) = 0.03665 / p(¬M)
Since p(¬L|¬M) + p(L|¬M) = 1, p(¬M) = 0.28575 + 0.03665 = 0.3224, so
p(¬L|¬M) = 0.28575 / 0.3224 = 0.88632
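Once the two unnormalized numerators are known, p(¬M) drops out by normalization; the slide's final value follows from arithmetic alone:

```python
# Diagnostic inference by normalization: p(not M) cancels once we use
# p(not L | not M) + p(L | not M) = 1. The two unnormalized numerators
# are the slide's values.
num_notl = 0.9525 * 0.3   # p(not M | not L) p(not L) = 0.28575
num_l = 0.03665           # p(not M | L) p(L), as given on the slide
p_notl_given_notm = num_notl / (num_notl + num_l)
```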
Page 14 ===
Inference in Bayes Networks
Explaining away
–¬B explains ¬M, making ¬L less certain
p(¬L|¬B,¬M) = p(¬M,¬B|¬L) p(¬L) / p(¬B,¬M)              (Bayes' rule)
            = p(¬M|¬B,¬L) p(¬B|¬L) p(¬L) / p(¬B,¬M)     (def. of conditional prob.)
            = p(¬M|¬B,¬L) p(¬B) p(¬L) / p(¬B,¬M)        (structure of the Bayes network)
            = 0.30 < 0.88632 = p(¬L|¬M)
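Explaining away can be reproduced by brute-force enumeration. The CPT below is an assumption (p(B) = 0.95, p(L) = 0.7, p(M|B,L) = 0.9, p(M|B,¬L) = 0.05, and the arm never moves on a dead battery), so the intermediate posterior it yields differs from the slide's 0.88632, but the qualitative effect is the same: learning ¬B drops p(¬L|¬B,¬M) back to the prior 0.3.

```python
from itertools import product

# Assumed CPT for the B, L -> M network; all numbers are illustrative.
p_b, p_l = 0.95, 0.7
p_m = {(True, True): 0.9, (True, False): 0.05,
       (False, True): 0.0, (False, False): 0.0}

def joint(b, l, m):
    pb = p_b if b else 1 - p_b
    pl = p_l if l else 1 - p_l
    pm = p_m[(b, l)] if m else 1 - p_m[(b, l)]
    return pb * pl * pm

def cond(query, given):
    # p(query | given) by enumerating the full joint.
    num = sum(joint(b, l, m) for b, l, m in product([True, False], repeat=3)
              if query(b, l, m) and given(b, l, m))
    den = sum(joint(b, l, m) for b, l, m in product([True, False], repeat=3)
              if given(b, l, m))
    return num / den

p_notl_given_notm = cond(lambda b, l, m: not l, lambda b, l, m: not m)
p_notl_given_notb_notm = cond(lambda b, l, m: not l,
                              lambda b, l, m: not b and not m)
# Adding the evidence not-B "explains" not-M, so not-L becomes
# less certain: it falls back to its prior 0.3.
```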
Page 15 ===
19.5 Uncertain Evidence
Evidence nodes must be certain: we must be certain about the truth or falsity of the propositions they represent.
–Each uncertain evidence node should have a child node,
about which we can be certain.
–Ex. Suppose the robot is not certain that its arm did
not move.
• Introducing M’ : “The arm sensor says that the arm moved”
• We can be certain that this proposition is either true or false.
• p(¬L| ¬B, ¬M’) instead of p(¬L| ¬B, ¬M)
–Ex. Suppose we are uncertain about whether or not the
battery is charged.
• Introducing G : “Battery gauge”
• p(¬L| ¬G, ¬M’) instead of p(¬L| ¬B, ¬M’)
Page 16 ===
19.6 D-Separation
A set of evidence nodes ε d-separates Vi and Vj if, for every undirected path in the Bayes network between Vi and Vj, there is some node Vb on the path having one of the following three properties:
–Vb is in ε, and both arcs on the path lead out of Vb.
–Vb is in ε, and one arc on the path leads into Vb and one arc leads out.
–Neither Vb nor any descendant of Vb is in ε, and both arcs on the path lead into Vb.
If ε d-separates Vi and Vj, then Vi and Vj are conditionally independent given ε.
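The third property (a blocked converging node) can be sketched on the collider B → M ← L: with empty evidence the path is blocked, so B and L are independent; conditioning on M unblocks it. All CPT numbers below are invented for illustration.

```python
from itertools import product

# Invented CPT for the collider B -> M <- L.
p_b, p_l = 0.6, 0.3
p_m = {(True, True): 0.9, (True, False): 0.5,
       (False, True): 0.7, (False, False): 0.1}

def joint(b, l, m):
    pm = p_m[(b, l)] if m else 1 - p_m[(b, l)]
    return (p_b if b else 1 - p_b) * (p_l if l else 1 - p_l) * pm

# Empty evidence: M blocks the path, so p(B, L) = p(B) p(L).
p_bl = sum(joint(True, True, m) for m in (True, False))
independent = abs(p_bl - p_b * p_l) < 1e-9

def cond(query, given):
    num = sum(joint(b, l, m) for b, l, m in product([True, False], repeat=3)
              if query(b, l, m) and given(b, l, m))
    den = sum(joint(b, l, m) for b, l, m in product([True, False], repeat=3)
              if given(b, l, m))
    return num / den

# Conditioning on M unblocks the collider: p(B|M) != p(B|L,M).
p_b_given_m = cond(lambda b, l, m: b, lambda b, l, m: m)
p_b_given_lm = cond(lambda b, l, m: b, lambda b, l, m: l and m)
```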
Page 18 ===
Inference in Polytrees
Polytree
–A DAG for which there is just one path, along arcs in either direction, between any two nodes in the DAG.
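Equivalently, a DAG is a polytree iff its underlying undirected graph is acyclic, which a union-find pass over the arcs can check; the example edge lists below are hypothetical.

```python
# A sketch of the polytree test: a DAG is a polytree iff every pair of
# nodes is joined by at most one undirected path (no undirected cycle).
def is_polytree(nodes, arcs):
    parent = {n: n for n in nodes}          # union-find over nodes

    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]   # path halving
            n = parent[n]
        return n

    for u, v in arcs:                       # ignore arc direction
        ru, rv = find(u), find(v)
        if ru == rv:                        # a second undirected path exists
            return False
        parent[ru] = rv
    return True

# A diamond (two directed paths from A to D) is a DAG but not a polytree.
tree_arcs = [("A", "B"), ("A", "C"), ("C", "D")]
diamond_arcs = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]
```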
Page 19 ===
Inference in Polytrees
A node is above Q
–The node is connected to Q only through Q's parents.
A node is below Q
–The node is connected to Q only through Q's immediate successors.
Three types of evidence:
–All evidence nodes are above Q.
–All evidence nodes are below Q.
–There are evidence nodes both above and below Q.
Page 20 ===
Evidence Above (1)
Bottom-up recursive algorithm
Ex. p(Q|P5,P4)
p(Q|P5,P4) = Σ(P6,P7) p(Q|P6,P7,P5,P4) p(P6,P7|P5,P4)
           = Σ(P6,P7) p(Q|P6,P7) p(P6,P7|P5,P4)            (structure of the Bayes network)
           = Σ(P6,P7) p(Q|P6,P7) p(P6|P5,P4) p(P7|P5,P4)   (d-separation)
           = Σ(P6,P7) p(Q|P6,P7) p(P6|P5) p(P7|P4)         (d-separation)
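The bottom-up formula can be checked against brute-force enumeration on a tiny polytree P5 → P6 → Q ← P7 ← P4; all CPT numbers below are made up for illustration.

```python
from itertools import product

# Hypothetical CPTs for the polytree P5 -> P6 -> Q <- P7 <- P4.
p5, p4 = 0.6, 0.3                            # priors on the roots
p6 = {True: 0.8, False: 0.2}                 # p(P6=true | P5)
p7 = {True: 0.7, False: 0.1}                 # p(P7=true | P4)
q = {(True, True): 0.9, (True, False): 0.5,  # p(Q=true | P6, P7)
     (False, True): 0.4, (False, False): 0.1}

def pr(table, cond, value):
    p = table[cond]
    return p if value else 1 - p

# Bottom-up: p(Q|P5,P4) = sum_{P6,P7} p(Q|P6,P7) p(P6|P5) p(P7|P4),
# here with evidence P5 = true, P4 = true.
formula = sum(q[(a, b)] * pr(p6, True, a) * pr(p7, True, b)
              for a, b in product([True, False], repeat=2))

# Brute force: enumerate the full joint and condition on P5, P4.
def joint(v5, v4, v6, v7, vq):
    return ((p5 if v5 else 1 - p5) * (p4 if v4 else 1 - p4)
            * pr(p6, v5, v6) * pr(p7, v4, v7)
            * (q[(v6, v7)] if vq else 1 - q[(v6, v7)]))

num = sum(joint(True, True, v6, v7, True)
          for v6, v7 in product([True, False], repeat=2))
den = sum(joint(True, True, v6, v7, vq)
          for v6, v7, vq in product([True, False], repeat=3))
brute = num / den
```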
Page 21 ===
Evidence Above (2)
Calculating p(P7|P4) and p(P6|P5):
p(P7|P4) = Σ(P3) p(P7|P3,P4) p(P3|P4) = Σ(P3) p(P7|P3,P4) p(P3)
p(P6|P5) = Σ(P1,P2) p(P6|P1,P2) p(P1|P5) p(P2)
Calculating p(P1|P5)
–Evidence is “below”
–Here, we use Bayes’ rule:
p(P1|P5) = p(P5|P1) p(P1) / p(P5)
Page 22 ===
Evidence Below (1)
Top-down recursive algorithm
p(Q|P12,P13,P14,P11) = p(P12,P13,P14,P11|Q) p(Q) / p(P12,P13,P14,P11)
                     = k p(P12,P13,P14,P11|Q) p(Q)
                     = k p(P12,P13|Q) p(P14,P11|Q) p(Q)     (d-separation)
p(P12,P13|Q) = Σ(P9) p(P12,P13|P9,Q) p(P9|Q) = Σ(P9) p(P12,P13|P9) p(P9|Q)
p(P9|Q) = Σ(P8) p(P9|Q,P8) p(P8)
p(P12,P13|P9) = p(P12|P9) p(P13|P9)
Page 23 ===
Evidence Below (2)
p(P14,P11|Q) = Σ(P10) p(P14,P11|P10) p(P10|Q) = Σ(P10) p(P14|P10) p(P11|P10) p(P10|Q)
p(P11|P10) = Σ(P15) p(P11|P10,P15) p(P15|P10) = Σ(P15) p(P11|P10,P15) p(P15)
(P15 and P10 are d-separated, so p(P15|P10) = p(P15).)
Page 24 ===
Evidence Above and Below
Ex. p(Q | {P5,P4}, {P12,P13,P14,P11}), with evidence above ε+ = {P5,P4} and evidence below ε− = {P12,P13,P14,P11}:
p(Q|ε+,ε−) = p(ε−|Q,ε+) p(Q|ε+) / p(ε−|ε+)
           = k p(ε−|Q,ε+) p(Q|ε+)
           = k p(ε−|Q) p(Q|ε+)     (d-separation)
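The combined rule can be sketched on the smallest possible case, the chain P → Q → R with evidence P above Q and R below Q; the CPT numbers are hypothetical, and k is fixed by normalization.

```python
# Evidence above and below on the chain P -> Q -> R, with P and R
# observed true: p(Q|P,R) = k p(R|Q) p(Q|P). CPT values are invented.
p_q_given_p = 0.7                        # assumed p(Q=true | P=true)
p_r_given_q = {True: 0.9, False: 0.3}    # assumed p(R=true | Q)

unnorm = {qv: p_r_given_q[qv] * (p_q_given_p if qv else 1 - p_q_given_p)
          for qv in (True, False)}
k = 1 / sum(unnorm.values())             # normalization constant
p_q_given_pr = k * unnorm[True]
```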
Page 25 ===
A Numerical Example (1)
p(Q|U) = k p(U|Q) p(Q)
p(P|Q) = Σ(R) p(P|R,Q) p(R|Q) = Σ(R) p(P|R,Q) p(R)
       = 0.95 × 0.01 + 0.8 × 0.99 ≈ 0.80
p(P|¬Q) = Σ(R) p(P|R,¬Q) p(R)
        = 0.90 × 0.01 + 0.01 × 0.99 ≈ 0.019
•Diagnostic reasoning
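The arithmetic of this numerical example can be reproduced as follows; the roles of the printed factors (p(R) = 0.01, and the four p(P|R,Q) entries) are inferred from the slide's values.

```python
# Summing R out of the CPT, using the values read off the slide.
p_r = 0.01                                        # inferred p(R)
p_p_given_rq = {(True, True): 0.95, (False, True): 0.8,
                (True, False): 0.90, (False, False): 0.01}

p_p_given_q = (p_p_given_rq[(True, True)] * p_r
               + p_p_given_rq[(False, True)] * (1 - p_r))
p_p_given_notq = (p_p_given_rq[(True, False)] * p_r
                  + p_p_given_rq[(False, False)] * (1 - p_r))
```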