
Chapter 14
Probabilistic Reasoning

Outline
• Syntax of Bayesian networks
• Semantics of Bayesian networks
• Efficient representation of conditional distributions
• Exact inference by enumeration
• Exact inference by variable elimination
• Approximate inference by stochastic simulation*
• Approximate inference by Markov chain Monte Carlo*

Motivations
• The full joint probability distribution can answer any question, but it becomes intractably large as the number of variables increases.
• Specifying probabilities for atomic events can be difficult, e.g., requiring a large set of data, statistical estimates, etc.
• Independence and conditional independence reduce the number of probabilities needed to specify the full joint distribution.

Bayesian networks
• A directed, acyclic graph (DAG).
• A set of nodes, one per variable X_i (discrete or continuous).
• A set of directed links (arrows) connecting pairs of nodes. X is a parent of Y if there is an arrow (direct influence) from node X to node Y.
• Each node X_i has a conditional probability distribution P(X_i | Parents(X_i)) that quantifies the effect of the parents on the node.
• The combination of the topology and the conditional distributions specifies (implicitly) the full joint distribution over all the variables.

Bayesian networks
Example 1: The Teeth Disease Bayesian network
[Figure: example network omitted]

Example: Burglar alarm system
• I have a burglar alarm installed at home.
– It is fairly reliable at detecting a burglary, but it also responds on occasion to minor earthquakes.
• I also have two neighbors, John and Mary.
– They have promised to call me at work when they hear the alarm.
– John always calls when he hears the alarm, but sometimes confuses the telephone ringing with the alarm and calls then, too.
– Mary likes rather loud music and sometimes misses the alarm altogether.
• Bayesian network variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls.

Example: Burglar alarm system
• Network topology reflects "causal" knowledge:
– A burglary can set the alarm off.
– An earthquake can set the alarm off.
– The alarm can cause Mary to call.
– The alarm can cause John to call.
• Conditional probability table (CPT): each row contains the conditional probability of each node value for a conditioning case (a possible combination of values for the parent nodes).
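To make the CPT idea concrete, here is a small sketch (not from the slides) of the burglary network as plain Python dicts. The entries the later slides use (0.001, 0.90, 0.70, ...) appear below; the remaining numbers are the usual textbook values for this example.

```python
# A minimal sketch of the burglary network's topology and CPTs as plain
# Python dicts. Each CPT row is indexed by a tuple of parent values and
# stores P(node = True | parent values).

parents = {
    "Burglary": (), "Earthquake": (),
    "Alarm": ("Burglary", "Earthquake"),
    "JohnCalls": ("Alarm",), "MaryCalls": ("Alarm",),
}

cpt = {
    "Burglary":   {(): 0.001},
    "Earthquake": {(): 0.002},
    "Alarm": {(True, True): 0.95, (True, False): 0.94,
              (False, True): 0.29, (False, False): 0.001},
    "JohnCalls": {(True,): 0.90, (False,): 0.05},
    "MaryCalls": {(True,): 0.70, (False,): 0.01},
}

def p_node(node, value, assignment):
    """P(node = value | values of node's parents in assignment)."""
    row = tuple(assignment[p] for p in parents[node])
    p_true = cpt[node][row]
    return p_true if value else 1.0 - p_true
```

Counting the rows gives exactly the 1 + 1 + 4 + 2 + 2 = 10 numbers noted on the next slide.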

Compactness of Bayesian networks
• A CPT for Boolean X_i with k Boolean parents has 2^k rows, one for each combination of parent values. Each row requires one number p for X_i = true (the number for X_i = false is just 1 - p).
• If each variable has no more than k parents, the complete network requires O(n · 2^k) numbers, i.e., it grows linearly with n, vs. O(2^n) for the full joint distribution.
• For the burglary net: 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 - 1 = 31).

Global semantics of Bayesian networks
• Global semantics defines the full joint distribution as the product of the local conditional distributions:

P(x_1, ..., x_n) = ∏_{i=1..n} P(x_i | parents(X_i))

• e.g., P(j ∧ m ∧ a ∧ ¬b ∧ ¬e)
= P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
= 0.90 × 0.70 × 0.001 × 0.999 × 0.998
≈ 0.00062
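Continuing the sketch above (reusing parents, cpt, and p_node), a few lines of Python reproduce this product:

```python
# Global semantics as code: the probability of a complete assignment is the
# product of each node's local conditional probability given its parents.

def joint(assignment):
    prob = 1.0
    for node in parents:
        prob *= p_node(node, assignment[node], assignment)
    return prob

# The slide's example event: j, m, a, ¬b, ¬e.
event = {"Burglary": False, "Earthquake": False, "Alarm": True,
         "JohnCalls": True, "MaryCalls": True}
print(joint(event))  # ≈ 0.000628, matching the slide's 0.00062
```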

Local semantics of Bayesian networks
• Each node is conditionally independent of its nondescendants given its parents.
• e.g., JohnCalls is independent of Burglary and Earthquake, given the value of Alarm.

Markov blanket
• Each node is conditionally independent of all other nodes given its Markov blanket: its parents, its children, and its children's parents.
• e.g., Burglary is independent of JohnCalls and MaryCalls, given the values of Alarm and Earthquake.

Constructing Bayesian networks
• Need a method such that a series of locally testable assertions of conditional independence guarantees the required global semantics.
• The correct order in which to add nodes is to add the "root causes" first, then the variables they influence, and so on.
• What happens if we choose the wrong order?

Example
[Figure slides: the network built by adding nodes in a non-causal order, one node at a time]
• Deciding conditional independence is hard in noncausal directions.
• Assessing conditional probabilities is hard in noncausal directions.
• The network is less compact: 1 + 2 + 4 + 2 + 4 = 13 numbers needed.

Example: Car diagnosis
• Initial evidence: car won't start.
• Testable variables (green), "broken, so fix it" variables (orange).
• Hidden variables (gray) ensure sparse structure and reduce parameters.

Example: Car insurance

Class Exercise
1. Using the axioms of probability, prove p(P ∧ Q) = p(Q ∧ P).
2. Given the Bayesian network P → Q, where node P carries p(P) and node Q carries p(Q | P) and p(Q | ¬P):
– Calculate p(P ∧ Q).
– In what condition does p(P ∧ Q | P) = 1?

Efficient representation of conditional distributions
• A CPT grows exponentially with the number of parents: O(2^k).
• A CPT becomes infinite with a continuous-valued parent or child.
• Solution: canonical distributions that can be specified by a few parameters.
• Simplest example: the deterministic node, whose value is specified exactly by the values of its parents, with no uncertainty.

Efficient representation of conditional distributions
• "Noisy" logic relationships: uncertain relationships.
• The noisy-OR model allows for uncertainty about the ability of each parent to cause the child to be true; the causal relationship between parent and child may be inhibited.
– E.g., Fever is caused by Cold, Flu, or Malaria, but a patient could have a cold and not exhibit a fever.
• Two assumptions of noisy-OR:
– The parents include all the possible causes (a leak node can be added to cover "miscellaneous causes").
– Inhibition of each parent is independent of inhibition of any other parent; e.g., whatever inhibits Malaria from causing a fever is independent of whatever inhibits Flu from causing a fever:

P(¬fever | cold, ¬flu, ¬malaria) = 0.6
P(¬fever | ¬cold, flu, ¬malaria) = 0.2
P(¬fever | ¬cold, ¬flu, malaria) = 0.1

[Figure: network with parents Cold, Flu, Malaria and child Fever]

Efficient representation of conditional distributions
• The remaining probabilities can be calculated as products of the inhibition probabilities of the parents that are true.
• The number of parameters is linear in the number of parents: O(k) rather than O(2^k).
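A minimal sketch in Python of how the full CPT falls out of the three inhibition probabilities given on the previous slide:

```python
# Noisy-OR: the child is false only if every true parent is inhibited, so
# P(fever | causes) = 1 - product of the inhibition probabilities q of the
# parents that are true. Three parameters generate the whole 2^3-row CPT.
from itertools import product

q = {"Cold": 0.6, "Flu": 0.2, "Malaria": 0.1}

def p_fever(causes_present):
    """P(fever | the listed parents are true, the rest false)."""
    prob_no_fever = 1.0
    for cause in causes_present:
        prob_no_fever *= q[cause]
    return 1.0 - prob_no_fever

# Build the full CPT from just 3 parameters (O(k), not O(2^k)).
for row in product([False, True], repeat=3):
    present = [c for c, v in zip(q, row) if v]
    print(row, round(p_fever(present), 3))
```

For instance, cold alone gives 1 - 0.6 = 0.4, and all three causes give 1 - 0.6 × 0.2 × 0.1 = 0.988.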

Bayesian nets with continuous variables
• Hybrid Bayesian network: discrete variables + continuous variables.
– Discrete (Subsidy? and Buys?); continuous (Harvest and Cost).
• Two options:
– Discretization: possibly large errors, large CPTs.
– Finitely parameterized canonical families (e.g., the Gaussian distribution).
• Two kinds of conditional distributions for a hybrid Bayesian network:
– A continuous variable given discrete or continuous parents (e.g., Cost).
– A discrete variable given continuous parents (e.g., Buys?).

Continuous child variables
• Need one conditional density function for the child variable given continuous parents, for each possible assignment to the discrete parents.
• Most common is the linear Gaussian model, e.g.:

P(Cost = c | Harvest = h, Subsidy? = true) = N(a_t h + b_t, σ_t^2)(c)

• The mean Cost varies linearly with Harvest; the variance is fixed.
• The linear model is reasonable only if the harvest size is limited to a narrow range.
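A small sketch of evaluating a linear Gaussian density in Python; the slope, intercept, and standard deviation below (a, b, sigma) are made-up placeholders, not values from the slides:

```python
# Linear Gaussian conditional density: the child's mean is linear in the
# continuous parent, and the variance is fixed.
import math

def linear_gaussian(child_value, parent_value, a, b, sigma):
    """Density N(a*parent + b, sigma^2) evaluated at child_value."""
    mean = a * parent_value + b
    z = (child_value - mean) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

# e.g., P(Cost = 5 | Harvest = 10, Subsidy? = true) with hypothetical
# parameters a_t = -0.5, b_t = 10, sigma_t = 1:
print(linear_gaussian(5.0, 10.0, a=-0.5, b=10.0, sigma=1.0))  # ≈ 0.399
```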

Continuous child variables
• A discrete + continuous linear Gaussian network is a conditional Gaussian network, i.e., it defines a multivariate Gaussian distribution over all continuous variables for each combination of discrete variable values.
• A multivariate Gaussian distribution is a surface in more than one dimension that has a peak at the mean and drops off on all sides.

Discrete variable with continuous parents
• The probability of Buys? given Cost should be a "soft" threshold.
• The probit distribution uses the integral of the Gaussian:

Φ(x) = ∫_{-∞}^{x} N(0, 1)(t) dt,  so that P(buys | Cost = c) = Φ((-c + µ)/σ)

Discrete variable with continuous parents
• The sigmoid (or logit) distribution is also used in neural networks:

P(buys | Cost = c) = 1 / (1 + exp(-2(-c + µ)/σ))

• The sigmoid has a similar shape to the probit but much longer tails.
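A brief sketch comparing the two soft-threshold models in Python; mu and sigma below are hypothetical threshold parameters, not values from the slides:

```python
# Probit vs. logit soft thresholds for P(buys | Cost = c).
import math

def probit(c, mu=6.0, sigma=1.0):
    """Standard normal CDF of (mu - c)/sigma, via the error function."""
    return 0.5 * (1.0 + math.erf((mu - c) / (sigma * math.sqrt(2))))

def logit(c, mu=6.0, sigma=1.0):
    """Sigmoid with matched location and scale; longer tails than probit."""
    return 1.0 / (1.0 + math.exp(-2.0 * (mu - c) / sigma))

for c in [3, 5, 6, 7, 9]:
    print(c, round(probit(c), 4), round(logit(c), 4))
```

Both cross 0.5 at c = mu; the tail values (c = 3 or c = 9) show the logit decaying much more slowly.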

Exact inference by enumeration
• A query can be answered using a Bayesian network by computing sums of products of conditional probabilities from the network, e.g.:

P(B | j, m) = α Σ_e Σ_a P(B) P(e) P(a | B, e) P(j | a) P(m | a)

• Sum over the hidden variables: Earthquake and Alarm.
• With n Boolean variables (d = 2 values each), enumeration costs O(d^n) time.
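A sketch of enumeration on the burglary network, reusing the parents, cpt, and p_node definitions from the earlier sketch: for each value of the query variable, it sums the joint over all assignments to the hidden variables, then normalizes.

```python
# Inference by enumeration: sum the full joint over the hidden variables.
from itertools import product

ORDER = list(parents)  # insertion order is topological here

def enumeration_ask(query, evidence):
    """Return P(query | evidence) as {True: p, False: p}, normalized."""
    dist = {}
    for qval in [True, False]:
        fixed = dict(evidence, **{query: qval})
        hidden = [v for v in ORDER if v not in fixed]
        total = 0.0
        for values in product([True, False], repeat=len(hidden)):
            full = dict(fixed, **dict(zip(hidden, values)))
            p = 1.0
            for node in ORDER:
                p *= p_node(node, full[node], full)
            total += p
        dist[qval] = total
    norm = sum(dist.values())
    return {v: p / norm for v, p in dist.items()}

# P(Burglary | johncalls, marycalls) ≈ {True: 0.284, False: 0.716}
print(enumeration_ask("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```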

Evaluation tree
[Figure: the evaluation tree for the enumeration above; subexpressions such as P(j | a) P(m | a) are evaluated repeatedly]

Exact inference by variable elimination
• Do the calculation once and save the results for later use: the idea of dynamic programming.
• Variable elimination: carry out summations right-to-left, storing intermediate results (factors) to avoid recomputation.

Variable elimination – basic operations
• Pointwise product of two factors.
• Summing out a variable from a product of factors.
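A minimal sketch of these two operations in Python, representing a factor as a (variables, table) pair over Boolean variables; the toy factors at the end are illustrative only:

```python
# Factor operations for variable elimination over Boolean variables.
from itertools import product

def pointwise_product(f1, f2):
    """Multiply two factors entry-wise over the union of their variables."""
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    table = {}
    for vals in product([True, False], repeat=len(out_vars)):
        a = dict(zip(out_vars, vals))
        table[vals] = (t1[tuple(a[v] for v in vars1)] *
                       t2[tuple(a[v] for v in vars2)])
    return (out_vars, table)

def sum_out(var, factor):
    """Eliminate var by summing the factor's entries over its values."""
    fvars, t = factor
    i = fvars.index(var)
    out_vars = fvars[:i] + fvars[i + 1:]
    table = {}
    for vals, p in t.items():
        key = vals[:i] + vals[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return (out_vars, table)

# e.g., f(A, B) * g(B), then sum out B:
f = (["A", "B"], {(True, True): 0.3, (True, False): 0.7,
                  (False, True): 0.9, (False, False): 0.1})
g = (["B"], {(True,): 0.6, (False,): 0.4})
print(sum_out("B", pointwise_product(f, g)))  # {(True,): 0.46, (False,): 0.58}
```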

Variable elimination – irrelevant variables
• Every variable that is not an ancestor of a query variable or an evidence variable is irrelevant to the query and can be removed.
• The complexity of variable elimination:
– Singly connected networks (polytrees): any two nodes are connected by at most one (undirected) path. The time and space cost of variable elimination is O(d^k n), i.e., linear in the number of variables (nodes) if the number of parents of each node is bounded by a constant (d = 2 for n Boolean variables).
– Multiply connected networks: variable elimination can have exponential time and space complexity even when the number of parents per node is bounded.

Approximate inference by stochastic simulation*
• Direct sampling
– Generate events from an (empty) network that has no associated evidence.
– Rejection sampling: reject samples disagreeing with the evidence.
– Likelihood weighting: use evidence to weight samples.
• Markov chain Monte Carlo (MCMC)*
– Sample from a stochastic process whose stationary distribution is the true posterior.

Example of sampling from an empty network
[Figure slides: a single sample is built step by step, drawing each variable in topological order conditioned on its already-sampled parents]
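A sketch of direct (prior) sampling on the same toy network (reusing parents and cpt from the earlier sketch):

```python
# Direct sampling with no evidence: sample each variable in topological
# order, conditioning on the values already drawn for its parents.
import random

def prior_sample():
    sample = {}
    for node in parents:  # insertion order is topological here
        row = tuple(sample[p] for p in parents[node])
        sample[node] = random.random() < cpt[node][row]
    return sample

# Estimate P(Alarm = true) from 100,000 samples (exact value ≈ 0.0025).
N = 100_000
count = sum(prior_sample()["Alarm"] for _ in range(N))
print(count / N)
```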


Rejection sampling
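Rejection sampling keeps only the prior samples that agree with the evidence. A sketch, reusing prior_sample from above:

```python
# Rejection sampling: draw prior samples, discard those inconsistent with
# the evidence, and estimate the query from the survivors.
def rejection_sampling(query, evidence, n=100_000):
    counts = {True: 0, False: 0}
    for _ in range(n):
        sample = prior_sample()
        if all(sample[var] == val for var, val in evidence.items()):
            counts[sample[query]] += 1
    kept = counts[True] + counts[False]
    return {value: c / kept for value, c in counts.items()} if kept else None

# Consistent samples are rare here (calls are infrequent), so most samples
# are wasted; this is the method's main weakness.
print(rejection_sampling("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```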

Likelihood weighting
[Figure slides: a weighted sample is built step by step; evidence variables are fixed to their observed values and the sample's weight accumulates the likelihood of each piece of evidence]
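A sketch of likelihood weighting on the same toy network (reusing parents, cpt, and p_node): evidence variables are fixed rather than sampled, and each sample is weighted by the likelihood of the evidence given its sampled parents.

```python
# Likelihood weighting: no samples are wasted; evidence contributes a
# multiplicative weight instead of a rejection test.
import random

def weighted_sample(evidence):
    weight, sample = 1.0, {}
    for node in parents:  # topological order
        if node in evidence:
            sample[node] = evidence[node]
            weight *= p_node(node, evidence[node], sample)
        else:
            row = tuple(sample[p] for p in parents[node])
            sample[node] = random.random() < cpt[node][row]
    return sample, weight

def likelihood_weighting(query, evidence, n=100_000):
    totals = {True: 0.0, False: 0.0}
    for _ in range(n):
        sample, weight = weighted_sample(evidence)
        totals[sample[query]] += weight
    norm = totals[True] + totals[False]
    return {value: w / norm for value, w in totals.items()}

print(likelihood_weighting("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```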


Likelihood weighting analysis


Approximate inference using MCMC*


The Markov chain


MCMC example


Markov blanket sampling
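A sketch of MCMC via Markov blanket (Gibbs) sampling on the same toy network (reusing parents, cpt, and p_node): each non-evidence variable is resampled given its Markov blanket, which needs only its own CPT row and its children's. No burn-in is used, for simplicity.

```python
# Gibbs sampling: P(x | Markov blanket of X) is proportional to
# P(x | parents(X)) times the product of P(child | parents(child)).
import random

children = {n: [c for c in parents if n in parents[c]] for n in parents}

def markov_blanket_prob(node, value, state):
    """Unnormalized P(node = value | node's Markov blanket in state)."""
    s = dict(state, **{node: value})
    p = p_node(node, value, s)
    for child in children[node]:
        p *= p_node(child, s[child], s)
    return p

def gibbs_ask(query, evidence, n=100_000):
    nonevidence = [v for v in parents if v not in evidence]
    state = dict(evidence, **{v: random.random() < 0.5 for v in nonevidence})
    counts = {True: 0, False: 0}
    for _ in range(n):
        for var in nonevidence:
            pt = markov_blanket_prob(var, True, state)
            pf = markov_blanket_prob(var, False, state)
            state[var] = random.random() < pt / (pt + pf)
        counts[state[query]] += 1
    return {value: c / n for value, c in counts.items()}

# Approximates the exact posterior P(b | j, m) ≈ 0.284 from the
# enumeration sketch, up to sampling noise.
print(gibbs_ask("Burglary", {"JohnCalls": True, "MaryCalls": True}))
```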
