19 Uncertain Evidence


Posted on 15-Aug-2015




1. Reasoning with Uncertain Information (Chapter 19)

2. Outline
   - Review of Probability Theory
   - Probabilistic Inference
   - Bayes Networks
   - Patterns of Inference in Bayes Networks
   - Uncertain Evidence
   - D-Separation
   - Probabilistic Inference in Polytrees

3. 19.1 Review of Probability Theory
   Random variables; joint probability.
   Ex. (B (BAT_OK), M (MOVES), L (LIFTABLE), G (GAUGE)):
   p(True, True, True, True) = 0.5686
   p(True, True, True, False) = 0.0299
   p(True, True, False, True) = 0.0135
   p(True, True, False, False) = 0.0007
   ...

4. 19.1 Review of Probability Theory
   Marginal probability: p(B) = Σ_{M,L,G} p(B, M, L, G)
   Conditional probability: p(A | B) = p(A, B) / p(B)
   Ex. the probability that the battery is charged given that the arm does not move: p(B | ¬M) = p(B, ¬M) / p(¬M)

5. 19.1 Review of Probability Theory

6. 19.1 Review of Probability Theory
   Chain rule: p(V1, ..., Vk) = Π_{i=1..k} p(Vi | Vi−1, ..., V1)
   Bayes' rule: p(Vi | Vj) = p(Vj | Vi) p(Vi) / p(Vj)
   Abbreviation: p(V) stands for p(V = True), and p(¬V) for p(V = False).

7. 19.2 Probabilistic Inference
   Goal: the probability that some variable Vi has value vi given the evidence ε = e.
   Ex.
   p(P, Q, R) = 0.3        p(P, ¬Q, ¬R) = 0.05
   p(P, Q, ¬R) = 0.2       p(¬P, ¬Q, R) = 0.1
   p(P, ¬Q, R) = 0.2       p(¬P, Q, ¬R) = 0.05
   p(¬P, Q, R) = 0.1       p(¬P, ¬Q, ¬R) = 0.0
   p(Q, R) = p(P, Q, R) + p(¬P, Q, R) = 0.3 + 0.1 = 0.4
   p(P | Q, R) = p(P, Q, R) / p(Q, R) = 0.3 / 0.4 = 0.75
   p(P | Q, R) + p(¬P | Q, R) = 1

8. Statistical Independence
   Conditional independence: p(Vi | Vj, V) = p(Vi | V)
   Mutual conditional independence: each variable in a set is conditionally independent of the others, given V.
   Unconditional independence: p(Vi | Vj) = p(Vi)

9. 19.3 Bayes Networks
   A directed, acyclic graph (DAG) whose nodes are labeled by random variables.
   Characteristic of Bayesian networks: a node Vi is conditionally independent of any subset of nodes that are not descendants of Vi, given its immediate parents:
   p(Vi | A(Vi), P(Vi)) = p(Vi | P(Vi))
   where P(Vi) = the immediate parents of Vi in the graph, and A(Vi) = any set of nodes in the graph that are not descendants of Vi.

10. Bayes Networks
    Prior probabilities and conditional probability tables (CPTs) determine the joint:
    p(V1, V2, ..., Vk) = Π_{i=1..k} p(Vi | P(Vi))

11. 19.3 Bayes Networks

12. Inference in Bayes Networks
    Causal or top-down inference. Ex. the probability that the arm moves given that the block is liftable:
    p(M | L) = p(M, B | L) + p(M, ¬B | L)
             = p(M | B, L) p(B | L) + p(M | ¬B, L) p(¬B | L)
             = p(M | B, L) p(B) + p(M | ¬B, L) p(¬B)
             = 0.855

13. Inference in Bayes Networks
    Diagnostic or bottom-up inference: using an effect (or symptom) to infer a cause. Ex. the probability that the block is not liftable given that the arm does not move.
    p(¬M | ¬L) = 0.9525 (using causal reasoning), so by Bayes' rule:
    p(¬L | ¬M) = p(¬M | ¬L) p(¬L) / p(¬M) = 0.9525 × 0.3 / p(¬M) = 0.28575 / p(¬M)
    p(L | ¬M) = p(¬M | L) p(L) / p(¬M) = 0.03665 / p(¬M)
    Since these two must sum to 1:
    p(¬L | ¬M) = 0.28575 / (0.28575 + 0.03665) = 0.88632

14. Inference in Bayes Networks
    Explaining away: ¬B explains ¬M, making ¬L less certain.
    p(¬L | ¬B, ¬M) = p(¬M, ¬B | ¬L) p(¬L) / p(¬B, ¬M)             (Bayes' rule)
                   = p(¬M | ¬B, ¬L) p(¬B | ¬L) p(¬L) / p(¬B, ¬M)   (def. of conditional prob.)
                   = p(¬M | ¬B, ¬L) p(¬B) p(¬L) / p(¬B, ¬M)        (structure of the Bayes network)
                   = 0.030 < 0.88632

15. 19.5 Uncertain Evidence
    We must be certain about the truth or falsity of the propositions evidence nodes represent, so each uncertain evidence node should be given a child node about which we can be certain.
    Ex. suppose the robot is not certain that its arm did not move. Introduce M′: "the arm sensor says that the arm moved." We can be certain that this proposition is either true or false, and use p(L | B, ¬M′) instead of p(L | B, ¬M).
    Ex. suppose we are uncertain about whether or not the battery is charged. Introduce G: the battery gauge, and use p(L | ¬G, ¬M′) instead of p(L | ¬B, ¬M).

16. 19.6 D-Separation
    An evidence set ε d-separates Vi and Vj if, for every undirected path in the Bayes network between Vi and Vj, there is some node Vb on the path having one of the following three properties:
    - Vb is in ε, and both arcs on the path lead out of Vb.
    - Vb is in ε, and one arc on the path leads in to Vb and one arc leads out.
    - Neither Vb nor any descendant of Vb is in ε, and both arcs on the path lead in to Vb.

17. 19.6 D-Separation
    Ex. in the battery network: I(G, L | B), I(G, L), and I(B, L).

18. Inference in Polytrees
    Polytree: a DAG for which there is just one path, along arcs in either direction, between any two nodes in the DAG.

19. Inference in Polytrees
    A node is above Q if it is connected to Q only through Q's parents; a node is below Q if it is connected to Q only through Q's immediate successors.
    Three types of evidence:
    - All evidence nodes are above Q.
    - All evidence nodes are below Q.
    - There are evidence nodes both above and below Q.

20. Evidence Above (1)
    Bottom-up recursive algorithm. Ex.
    p(Q | P5, P4) = Σ_{P6,P7} p(Q, P6, P7 | P5, P4)
                  = Σ_{P6,P7} p(Q | P6, P7, P5, P4) p(P6, P7 | P5, P4)
                  = Σ_{P6,P7} p(Q | P6, P7) p(P6, P7 | P5, P4)             (d-separation)
                  = Σ_{P6,P7} p(Q | P6, P7) p(P6 | P5, P4) p(P7 | P5, P4)  (d-separation)
                  = Σ_{P6,P7} p(Q | P6, P7) p(P6 | P5) p(P7 | P4)          (structure of the Bayes network)

21. Evidence Above (2)
    Calculating p(P7 | P4) and p(P6 | P5):
    p(P7 | P4) = Σ_{P3} p(P7 | P3, P4) p(P3 | P4) = Σ_{P3} p(P7 | P3, P4) p(P3)
    p(P6 | P5) = Σ_{P1,P2} p(P6 | P1, P2) p(P1 | P5) p(P2)
    Calculating p(P1 | P5): the evidence is below, so here we use Bayes' rule:
    p(P1 | P5) = p(P5 | P1) p(P1) / p(P5)

22. Evidence Below (1)
    Top-down recursive algorithm.
    p(Q | P12, P13, P14, P11) = p(P12, P13, P14, P11 | Q) p(Q) / p(P12, P13, P14, P11)
                              = k p(P12, P13, P14, P11 | Q) p(Q)
                              = k p(P12, P13 | Q) p(P14, P11 | Q) p(Q)
    p(P12, P13 | Q) = Σ_{P9} p(P12, P13 | P9, Q) p(P9 | Q) = Σ_{P9} p(P12, P13 | P9) p(P9 | Q)
    p(P9 | Q) = Σ_{P8} p(P9 | Q, P8) p(P8)
    p(P12, P13 | P9) = p(P12 | P9) p(P13 | P9)

23. Evidence Below (2)
    p(P14, P11 | Q) = Σ_{P10} p(P14, P11 | P10) p(P10 | Q)
                    = Σ_{P10} p(P14 | P10) p(P11 | P10) p(P10 | Q)
    p(P11 | P10) = Σ_{P15} p(P11 | P15, P10) p(P15 | P10) = Σ_{P15} p(P11 | P15, P10) p(P15)

24. Evidence Above and Below
    With evidence ε = ε⁺ ∪ ε⁻ both above and below Q, e.g. p(Q | {P5, P4}, {P12, P13, P14, P11}):
    p(Q | ε⁺, ε⁻) = p(ε⁻ | Q, ε⁺) p(Q | ε⁺) / p(ε⁻ | ε⁺)
                  = k p(ε⁻ | Q) p(Q | ε⁺)                 (d-separation)
                  = k₂ p(Q | ε⁻) p(Q | ε⁺) / p(Q)         (Bayes' rule)

25. A Numerical Example (1)
    Diagnostic reasoning: p(Q | U) = k p(U | Q) p(Q)
    p(P | Q) = p(P | Q, R) p(R) + p(P | Q, ¬R) p(¬R)
    p(P | ¬Q) = p(P | ¬Q, R) p(R) + p(P | ¬Q, ¬R) p(¬R)

26. A Numerical Example (2)
    p(U | Q) = p(U | P) × 0.8 + p(U | ¬P) × 0.2
    p(U | ¬Q) = p(U | P) × 0.019 + p(U | ¬P) × 0.98
    p(Q | U) = k p(U | Q) p(Q);  p(¬Q | U) = k p(U | ¬Q) p(¬Q)
    Other techniques: bucket elimination, Monte Carlo methods, clustering.
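The probabilistic-inference example over P, Q, R reduces to summing entries of the joint table. A minimal Python sketch, assuming the negation pattern that makes the table consistent with p(P | Q, R) = 0.75 (the helper names are illustrative):

```python
# Joint distribution over (P, Q, R): truth-value triple -> probability.
# The True/False assignments are an assumption consistent with p(P|Q,R) = 0.75.
joint = {
    (True,  True,  True):  0.30,
    (True,  True,  False): 0.20,
    (True,  False, True):  0.20,
    (False, True,  True):  0.10,
    (True,  False, False): 0.05,
    (False, False, True):  0.10,
    (False, True,  False): 0.05,
    (False, False, False): 0.00,
}

def marginal(**fixed):
    """Sum the joint over all assignments matching the fixed values."""
    names = ("P", "Q", "R")
    return sum(p for assign, p in joint.items()
               if all(assign[names.index(k)] == v for k, v in fixed.items()))

def conditional(var, value, **given):
    """p(var=value | given) = p(var=value, given) / p(given)."""
    return marginal(**{var: value, **given}) / marginal(**given)

p_Q_R = marginal(Q=True, R=True)                        # 0.4
p_P_given_QR = conditional("P", True, Q=True, R=True)   # 0.75
```

The same two helpers cover every marginal and conditional query the review section mentions, since all of them are ratios of sums over the joint.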
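The causal (top-down) inference for the battery/liftable/moves network can be reproduced by summing out B. The CPT values below (p(B) = 0.95, p(L) = 0.7, p(M | B, L) = 0.9, p(M | B, ¬L) = 0.05, p(M | ¬B, ·) = 0) are assumptions chosen to match the deck's 0.855 and 0.9525; a sketch:

```python
# Assumed CPTs for the network B -> M <- L (B also has child G, not needed here).
p_B = 0.95          # prior: battery is OK
p_L = 0.70          # prior: block is liftable
p_M_given = {       # p(M | B, L); values are assumptions, see lead-in
    (True,  True):  0.90,
    (True,  False): 0.05,
    (False, True):  0.00,
    (False, False): 0.00,
}

def p_M_given_L(l):
    """Causal inference: p(M | L) = sum_B p(M | B, L) p(B)."""
    return sum(p_M_given[(b, l)] * (p_B if b else 1 - p_B)
               for b in (True, False))

p_moves_given_liftable = p_M_given_L(True)              # 0.855
p_notmoves_given_notliftable = 1 - p_M_given_L(False)   # 0.9525
```

The second quantity is exactly the p(¬M | ¬L) = 0.9525 that the diagnostic-inference slide then feeds into Bayes' rule.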
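The three path-blocking properties of d-separation can be coded directly for a single undirected path. A sketch on the battery network (arcs B→G, B→M, L→M; the graph encoding and helper functions are illustrative, not from the deck):

```python
# Directed arcs of the battery network, as (parent, child) pairs.
arcs = {("B", "G"), ("B", "M"), ("L", "M")}

def descendants(node):
    """All nodes reachable from `node` along directed arcs."""
    out = set()
    for a, b in arcs:
        if a == node:
            out |= {b} | descendants(b)
    return out

def blocks(vb, prev, nxt, evidence):
    """Does Vb block the path segment prev - Vb - nxt, given the evidence set?"""
    in_prev = (prev, vb) in arcs   # arc leads in to Vb from the prev side
    in_next = (nxt, vb) in arcs    # arc leads in to Vb from the next side
    if not in_prev and not in_next:        # property 1: both arcs lead out of Vb
        return vb in evidence
    if in_prev != in_next:                 # property 2: one arc in, one arc out
        return vb in evidence
    # property 3: both arcs lead in to Vb (converging node)
    return vb not in evidence and not (descendants(vb) & evidence)

def path_d_separated(path, evidence):
    """True if some intermediate node blocks this undirected path."""
    return any(blocks(path[i], path[i - 1], path[i + 1], evidence)
               for i in range(1, len(path) - 1))

# The only undirected path between G and L is G - B - M - L.
i_GL_given_B = path_d_separated(["G", "B", "M", "L"], {"B"})  # I(G, L | B)
i_GL = path_d_separated(["G", "B", "M", "L"], set())          # I(G, L)
```

Because the network is a polytree there is only one path per node pair, so blocking that single path establishes d-separation; in a general DAG every path must be checked.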
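The evidence-above factorization p(Q | P5, P4) = Σ_{P6,P7} p(Q | P6, P7) p(P6 | P5) p(P7 | P4) can be checked against brute-force enumeration on a toy polytree. The shape P5 → P6 → Q ← P7 ← P4 and all CPT numbers below are made up for illustration:

```python
from itertools import product

def bern(p, b):
    """Probability of boolean outcome b when P(True) = p."""
    return p if b else 1 - p

# Made-up CPTs for the toy polytree P5 -> P6 -> Q <- P7 <- P4.
p_P5, p_P4 = 0.6, 0.3
p_P6 = {True: 0.8, False: 0.1}    # p(P6=True | P5)
p_P7 = {True: 0.7, False: 0.2}    # p(P7=True | P4)
p_Q = {(True, True): 0.9, (True, False): 0.5,
       (False, True): 0.4, (False, False): 0.05}   # p(Q=True | P6, P7)

def joint(p5, p6, q, p7, p4):
    """Joint probability from the network factorization."""
    return (bern(p_P5, p5) * bern(p_P4, p4) * bern(p_P6[p5], p6)
            * bern(p_P7[p4], p7) * bern(p_Q[(p6, p7)], q))

# Brute force: condition on P5 = P4 = True and sum out the hidden P6, P7.
num = sum(joint(True, p6, True, p7, True)
          for p6, p7 in product((True, False), repeat=2))
den = sum(joint(True, p6, q, p7, True)
          for p6, p7, q in product((True, False), repeat=3))
brute = num / den

# Evidence-above form: sum_{P6,P7} p(Q|P6,P7) p(P6|P5) p(P7|P4).
factored = sum(bern(p_Q[(p6, p7)], True) * bern(p_P6[True], p6) * bern(p_P7[True], p7)
               for p6, p7 in product((True, False), repeat=2))
```

The two quantities agree because, in a polytree, the d-separation steps in the derivation let the messages from each parent's side be computed independently.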

