
Conditional Probability Mass Function


Page 1:

Conditional Probability Mass Function

Page 2:

Introduction

P[A|B] is the probability of an event A, given that we know that some other event B has occurred. Unless A and B are independent, B will affect the probability of A.

Example: We choose one of two coins, a fair coin and a weighted coin, and toss it 4 times. What is the probability of observing 2 or more heads? The probability depends on which coin is selected (the condition):

pX[k | coin 1 chosen] is a binomial PMF that depends on p1
pX[k | coin 2 chosen] is a binomial PMF that depends on p2

Page 3:

Conditional Probability Mass Function

Let X be the discrete RV describing the outcome of the coin choice. Since SX = {1,2}, we assign a PMF to X of

\[
p_X[i] = \begin{cases} \alpha & i = 1 \\ 1 - \alpha & i = 2. \end{cases}
\]

The second part of the experiment consists of tossing the chosen coin 4 times in succession, so SY = {0,1,2,3,4}.

The event A corresponds to 2 or more heads.

Page 4:

Conditional Probability Mass Function

Only the PMF of Y is needed to determine the desired probability. To do so we need pY[j]. By using the definition of conditional probability for events we have

\[
p_Y[j] = P[Y = j] = \sum_{i=1}^{2} p_{X,Y}[i, j] \qquad \text{(definition of joint PMF)}
\]
\[
= \sum_{i=1}^{2} P[Y = j \mid X = i]\, P[X = i] \qquad \text{(definition of cond. prob.)}
\]
\[
= \sum_{i=1}^{2} P[Y = j \mid X = i]\, p_X[i]. \qquad \text{(definition of marginal PMF)}
\]

Page 5:

Conditional Probability Mass Function

can be determined from the experimental description

Is given earlier

Note, that probability depends on the outcome X = i via pi.

For a given value of X = i , the probability has all usual properties of a PMF

Page 6:

Conditional Probability Mass Function

Then pY|X[j|i] = P[Y = j | X = i] is a conditional PMF. Now that we know pY|X[j|i] and pX[i], we have

\[
p_Y[j] = \sum_{i=1}^{2} p_{Y|X}[j \mid i]\, p_X[i].
\]

Page 7:

Conditional Probability Mass Function

Finally, the desired probability of event A is

\[
P[A] = \sum_{j=2}^{4} p_Y[j] = \sum_{j=2}^{4} \sum_{i=1}^{2} p_{Y|X}[j \mid i]\, p_X[i].
\]

The joint PMF is then given by

\[
p_{X,Y}[i, j] = p_{Y|X}[j \mid i]\, p_X[i].
\]

As an example, if p1 = 1/4 and p2 = 3/4, we have for α = 1/2 that P[A] = 0.6055, but if α = 1/8, then P[A] = 0.8633. Why? With α = 1/8 the coin biased toward heads (p2 = 3/4) is chosen with probability 7/8, so two or more heads become much more likely.
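As a quick check, here is a short Python sketch (not part of the original slides; the function names are mine) that reproduces these numbers by conditioning on the coin choice and applying the law of total probability:

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial PMF: probability of k heads in n tosses with heads probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def prob_two_or_more_heads(p1, p2, alpha, n=4):
    """P[A] for A = {2 or more heads}, averaging over the coin choice."""
    total = 0.0
    for i, p in ((1, p1), (2, p2)):
        p_x = alpha if i == 1 else 1 - alpha                # p_X[i]
        p_a_given_x = sum(binom_pmf(j, n, p) for j in range(2, n + 1))
        total += p_a_given_x * p_x                          # law of total probability
    return total

print(prob_two_or_more_heads(1/4, 3/4, alpha=1/2))  # 0.6055
print(prob_two_or_more_heads(1/4, 3/4, alpha=1/8))  # 0.8633
```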

Page 8:

Conditional Probability Mass Function

The conditional PMF can be expressed as

\[
p_{Y|X}[j \mid i] = \frac{p_{X,Y}[i, j]}{p_X[i]}.
\]

To make the connection with conditional probability, let us rename the events: Aj = {s : Y(s) = j} and Bi = {s : X(s) = i}. Then

\[
p_{Y|X}[j \mid i] = \frac{P[A_j \cap B_i]}{P[B_i]} = P[A_j \mid B_i].
\]

Hence, pY|X[j|i] is a conditional probability for the events Aj and Bi.

Page 9:

Joint, Conditional, and Marginal PMFs

The conditional PMF is defined as

\[
p_{Y|X}[y_j \mid x_i] = \frac{p_{X,Y}[x_i, y_j]}{p_X[x_i]}.
\]

Each PMF in the family is a valid PMF when xi is considered to be a constant. In the previous example, {pY|X[j|1], pY|X[j|2]} is a family of valid PMFs (each sums to one over j), but pY|X[j|1] + pY|X[j|2] is not a PMF: summing over the conditioning value does not produce one.

Page 10:

Example: Two Dice Toss

Two dice are tossed; all outcomes are equally likely. The numbers of dots are added together. What is the conditional PMF of the sum if it is known that the sum is even?

Let X = 0 if the sum is even and X = 1 if the sum is odd, and let Y be the sum. We wish to determine pY|X[j|0] and pY|X[j|1] for all j. The sample space for Y is SY = {2,3,…,12}.

Page 11:

Example: Two Dice Toss

The conditional probability of the sum being even and also equaling j is

\[
p_{Y|X}[j \mid 0] = \frac{P[Y = j, X = 0]}{P[X = 0]} = \frac{N_j / 36}{18/36} = \frac{N_j}{18}, \qquad j = 2, 4, \ldots, 12,
\]

or zero for odd j, where Nj is the number of outcomes in SX,Y for which the sum is j. Likewise, pY|X[j|1] = Nj/18 for j = 3, 5, …, 11.

Page 12:

Example: Two Dice Toss

Note that each conditional PMF sums to one:

\[
\sum_{j = 2, 4, \ldots, 12} \frac{N_j}{18} = \frac{1 + 3 + 5 + 5 + 3 + 1}{18} = 1, \qquad
\sum_{j = 3, 5, \ldots, 11} \frac{N_j}{18} = \frac{2 + 4 + 6 + 4 + 2}{18} = 1.
\]
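A minimal enumeration check in Python (my addition; it assumes the X = 0/even coding above):

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of tossing two dice.
outcomes = list(product(range(1, 7), repeat=2))

def cond_pmf_of_sum(parity):
    """p_{Y|X}[j | parity]: PMF of the sum given its parity (0 = even, 1 = odd)."""
    sums = [a + b for a, b in outcomes if (a + b) % 2 == parity]
    return {j: Fraction(sums.count(j), len(sums)) for j in sorted(set(sums))}

print(cond_pmf_of_sum(0))   # N_j/18 for j = 2, 4, ..., 12
print(cond_pmf_of_sum(1))   # N_j/18 for j = 3, 5, ..., 11
assert sum(cond_pmf_of_sum(0).values()) == 1   # each conditional PMF sums to one
```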

Page 13:

Properties of PMF

Property 1. The joint PMF yields the conditional PMFs. If the joint PMF pX,Y[xi, yj] is known, then the conditional PMFs are

\[
p_{Y|X}[y_j \mid x_i] = \frac{p_{X,Y}[x_i, y_j]}{\sum_{j} p_{X,Y}[x_i, y_j]}, \qquad
p_{X|Y}[x_i \mid y_j] = \frac{p_{X,Y}[x_i, y_j]}{\sum_{i} p_{X,Y}[x_i, y_j]}.
\]

Hence, the conditional PMF is the joint PMF with xi fixed and then normalized so that it sums to one.
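As an illustration of Property 1 (a sketch with made-up numbers, not from the slides), a joint PMF stored as a matrix can be normalized along either axis to obtain the conditional PMFs:

```python
import numpy as np

# Hypothetical joint PMF p_{X,Y}[x_i, y_j]: rows index x_i, columns index y_j.
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.05, 0.25, 0.30]])

# Fix x_i (a row) and normalize so each row sums to one: p_{Y|X}[y_j | x_i].
p_y_given_x = p_xy / p_xy.sum(axis=1, keepdims=True)
# Fix y_j (a column) and normalize each column: p_{X|Y}[x_i | y_j].
p_x_given_y = p_xy / p_xy.sum(axis=0, keepdims=True)

print(p_y_given_x.sum(axis=1))   # [1. 1.]    -- valid conditional PMFs
print(p_x_given_y.sum(axis=0))   # [1. 1. 1.]
```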

Page 14:

Properties of PMF

Property 2. The conditional PMFs are related:

\[
p_{X|Y}[x_i \mid y_j] = \frac{p_{Y|X}[y_j \mid x_i]\, p_X[x_i]}{p_Y[y_j]}.
\]

Proof:

\[
p_{X|Y}[x_i \mid y_j] = \frac{p_{X,Y}[x_i, y_j]}{p_Y[y_j]},
\]

but

\[
p_{X,Y}[x_i, y_j] = P[X = x_i, Y = y_j] = P[Y = y_j, X = x_i] = p_{Y,X}[y_j, x_i],
\]

therefore

\[
p_{X|Y}[x_i \mid y_j] = \frac{p_{Y,X}[y_j, x_i]}{p_Y[y_j]}.
\]

Using

\[
p_{Y,X}[y_j, x_i] = p_{Y|X}[y_j \mid x_i]\, p_X[x_i] \tag{*}
\]

yields the desired result.

Page 15:

Properties of PMF

Property 3. The conditional PMF is expressible using Bayes' rule:

\[
p_{Y|X}[y_j \mid x_i] = \frac{p_{X|Y}[x_i \mid y_j]\, p_Y[y_j]}{p_X[x_i]}.
\]

Proof: From Property 1,

\[
p_{Y|X}[y_j \mid x_i] = \frac{p_{X,Y}[x_i, y_j]}{p_X[x_i]}, \tag{**}
\]

and using (*) with X and Y interchanged we have

\[
p_{X,Y}[x_i, y_j] = p_{X|Y}[x_i \mid y_j]\, p_Y[y_j];
\]

substituting this into (**) yields the desired result.

Page 16:

Properties of PMF

Property 4. The conditional PMF and its corresponding marginal PMF yield the joint PMF:

\[
p_{X,Y}[x_i, y_j] = p_{Y|X}[y_j \mid x_i]\, p_X[x_i] = p_{X|Y}[x_i \mid y_j]\, p_Y[y_j].
\]

Property 5. The conditional PMF and its corresponding marginal PMF yield the other marginal PMF:

\[
p_Y[y_j] = \sum_{i} p_{Y|X}[y_j \mid x_i]\, p_X[x_i].
\]

This is the law of total probability.

Page 17:

Conditional PMF Relationships

Properties 1–5 link the joint, conditional, and marginal PMFs. One can also interchange X and Y to obtain similar results.

Page 18:

Simplifying Probability Calculations Using Conditioning

Conditional PMFs can be used to simplify probability calculations. Example: find the PMF of Z = X + Y when X and Y are independent.

If X were known, say X = i, we could find the PMF of Z because then Z = i + Y. This is a transformation of one discrete RV into another discrete RV Z, so

\[
p_{Z|X}[j \mid i] = p_{Y|X}[j - i \mid i]. \tag{*}
\]

To find the unconditional PMF of Z we use Property 5:

\[
p_Z[j] = \sum_{i} p_{Z|X}[j \mid i]\, p_X[i] = \sum_{i} p_{Y|X}[j - i \mid i]\, p_X[i] \qquad \text{(using (*))}.
\]

If X and Y are independent, so that pY|X = pY, then

\[
p_Z[j] = \sum_{i} p_Y[j - i]\, p_X[i],
\]

which is a discrete convolution of the two PMFs.
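For PMFs on {0, 1, …} stored as arrays, this sum is exactly a convolution; a small sketch (the example numbers are mine):

```python
import numpy as np

p_x = np.array([0.5, 0.3, 0.2])     # hypothetical p_X[0..2]
p_y = np.array([0.25, 0.25, 0.5])   # hypothetical p_Y[0..2]

# p_Z[j] = sum_i p_Y[j - i] p_X[i]: the discrete convolution of the PMFs.
p_z = np.convolve(p_x, p_y)         # PMF of Z = X + Y on {0, ..., 4}
print(p_z, p_z.sum())               # the result sums to 1
```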

Page 19:

Mean of the Conditional PMF

We can determine attributes, such as the expected value of a RV Y, when it is known that X = xi:

\[
E_{Y|X}[Y \mid x_i] = \sum_{j} y_j\, p_{Y|X}[y_j \mid x_i].
\]

The mean of the conditional PMF is a constant when xi is fixed; in general, it is a function of xi.

Example: Two dice are tossed and the RV of interest is the sum, given that the sum is even or odd. The means of the conditional PMFs are

\[
E_{Y|X}[Y \mid 0] = \sum_{j \text{ even}} j\, \frac{N_j}{18} = 7, \qquad
E_{Y|X}[Y \mid 1] = \sum_{j \text{ odd}} j\, \frac{N_j}{18} = 7.
\]

Here they happen to coincide; usually the conditional means are not equal.
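A short enumeration (my addition) confirms that both conditional means equal 7:

```python
from itertools import product

sums = [a + b for a, b in product(range(1, 7), repeat=2)]
for parity, label in ((0, "even"), (1, "odd")):
    kept = [s for s in sums if s % 2 == parity]
    print(label, sum(kept) / len(kept))   # both print 7.0
```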

Page 20:

Example: Toss One of Two Dice

Two dice are given: D1 = {1,2,3,4,5,6} and D2 = {2,3,2,3,2,3}. A die is selected at random and tossed. What is the expected number of dots observed for the tossed die?

We can view this as a conditional problem by letting X = 1 if die 1 is chosen and X = 2 if die 2 is chosen, and letting Y be the number of dots observed. Thus, we wish to determine EY|X[Y|1] and EY|X[Y|2].

Page 21:

Example: Toss One of Two Dice

What is the unconditional mean (the mean of Y)? The unconditional mean is the number of dots observed without first conditioning on which die was chosen. Intuitively,

\[
E_Y[Y] = \tfrac{1}{2}(3.5) + \tfrac{1}{2}(2.5) = 3.
\]

[Figure: simulated outcomes when die 1 is chosen (sample mean 3.88, true mean 3.5) and when die 2 is chosen (sample mean 2.58, true mean 2.5).]

Page 22:

Unconditional Mean

Let us determine EY[Y] for the following experiment (a quick simulation check is sketched after the list):

1. Choose die 1 or die 2, each with probability 1/2.
2. Toss the chosen die.
3. Count the number of dots on the face of the tossed die; this is the RV Y.

To determine the theoretical mean of Y we need pY[j]:

\[
p_Y[j] = \sum_{i=1}^{2} p_{Y|X}[j \mid i]\, p_X[i] =
\begin{cases}
\tfrac{1}{2} \cdot \tfrac{1}{6} = \tfrac{1}{12} & j = 1, 4, 5, 6 \\[4pt]
\tfrac{1}{2} \cdot \tfrac{1}{6} + \tfrac{1}{2} \cdot \tfrac{1}{2} = \tfrac{1}{3} & j = 2, 3.
\end{cases}
\]
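Here is a minimal simulation of the three-step experiment (my sketch, not from the slides); its sample mean should settle near 3:

```python
import random

DICE = [[1, 2, 3, 4, 5, 6], [2, 3, 2, 3, 2, 3]]   # die 1 and die 2

def toss_once():
    """One trial: pick a die at random (prob. 1/2 each), toss it, count the dots."""
    return random.choice(random.choice(DICE))

trials = [toss_once() for _ in range(100_000)]
print(sum(trials) / len(trials))   # approximately 3.0 = (3.5 + 2.5) / 2
```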

Page 23:

Unconditional Mean

Thus the unconditional mean becomes

\[
E_Y[Y] = \sum_{j=1}^{6} j\, p_Y[j] = \frac{1 + 4 + 5 + 6}{12} + \frac{2 + 3}{3} = 3.
\]

The other way to find the unconditional mean is

\[
E_Y[Y] = \sum_{i=1}^{2} E_{Y|X}[Y \mid i]\, p_X[i] = \tfrac{1}{2}(3.5) + \tfrac{1}{2}(2.5) = 3,
\]

that is, the average of the conditional means.

Page 24:

Unconditional Mean (Proof)

In general the unconditional mean is found as

\[
E_Y[Y] = \sum_{i} E_{Y|X}[Y \mid x_i]\, p_X[x_i].
\]

Proof:

\[
\sum_{i} E_{Y|X}[Y \mid x_i]\, p_X[x_i]
= \sum_{i} \Big( \sum_{j} y_j\, p_{Y|X}[y_j \mid x_i] \Big)\, p_X[x_i] \qquad \text{(def. of cond. mean)}
\]
\[
= \sum_{j} y_j \sum_{i} p_{Y|X}[y_j \mid x_i]\, p_X[x_i]
= \sum_{j} y_j \sum_{i} p_{X,Y}[x_i, y_j] \qquad \text{(def. of cond. PMF)}
\]
\[
= \sum_{j} y_j\, p_Y[y_j] = E_Y[Y]. \qquad \text{(marginal PMF from joint PMF)}
\]

Page 25:

Modeling Human Learning

A child learns by attempting to pick up a toy, dropping it, and picking it up again after having learned something. Each time the experiment, "attempting to pick up the toy", is repeated, the child learns something, or equivalently narrows down the number of strategies.

Many models of human learning employ a Bayesian framework. By using it we are able to discriminate the right strategy with more accuracy as we repeatedly perform an experiment and observe the output.

Page 26:

Modeling Human Learning: Example

Suppose we wish to "learn" whether a coin is fair (p = 1/2) or weighted (p ≠ 1/2). Our certainty that the coin is fair or not will increase as the number of trials increases.

In the Bayesian model we assume that p is a RV. In reality the coin has a fixed probability, but it is unknown to us. Let the probability of heads be denoted by the RV Y and its values by yj, and assign a PMF pY[yj] over a finite set of candidate values (for example, uniform over yj = 0, 0.1, …, 1).

This is the prior PMF; it summarizes our state of knowledge before the experiment is performed.

Page 27:

Modeling Human Learning: Example

Let N be the number of coin tosses and X denote the number of heads observed in the N tosses. Then X ~ bin(N, p), i.e., X is binomially distributed; however, the probability of heads Y is unknown. We can only specify the PMF of X conditionally on Y = yj; the conditional PMF of the number of heads for X = i is

\[
p_{X|Y}[i \mid y_j] = \binom{N}{i} y_j^{\,i} (1 - y_j)^{N - i}, \qquad i = 0, 1, \ldots, N.
\]

We are interested in the probability of heads, or the PMF of Y, after observing the outcomes of the N coin tosses: pY|X[yj|i]. This is a posterior PMF, since it is determined after the experiment is performed.

Page 28:

Modeling Human Learning: Example

The posterior PMF pY|X[yj|i] contains all the information about the probability of heads that results from our prior knowledge, summarized by pY, and our "data" knowledge, summarized by pX|Y.

The posterior PMF is given by Bayes' rule (Property 3) with xi = i:

\[
p_{Y|X}[y_j \mid i] = \frac{p_{X|Y}[i \mid y_j]\, p_Y[y_j]}{\sum_{j} p_{X|Y}[i \mid y_j]\, p_Y[y_j]}.
\]

Note that pY|X[yj|i] depends on the observed number of heads i.
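A compact numerical sketch of this update (the uniform grid prior is my assumption; the slides' exact prior is not reproduced in this transcript):

```python
import numpy as np
from math import comb

y = np.linspace(0, 1, 11)               # candidate heads probabilities y_j
prior = np.full(len(y), 1 / len(y))     # assumed uniform prior p_Y[y_j]

def posterior(i, N):
    """p_{Y|X}[y_j | i]: posterior PMF after observing i heads in N tosses."""
    likelihood = np.array([comb(N, i) * yj**i * (1 - yj)**(N - i) for yj in y])
    numerator = likelihood * prior       # Bayes' rule numerator p_{X|Y} p_Y
    return numerator / numerator.sum()   # divide by p_X[i] (total probability)

print(posterior(i=4, N=10).round(3))    # mass concentrates near y_j = 0.4
print(posterior(i=40, N=100).round(3))  # narrower still: certainty grows with N
```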

Page 29:

Modeling human learning: Example

Page 30:

Problems

A fair coin is tossed. If it comes up heads, then X = 1, and if it comes up tails, then X = 0. Next, a point is selected at random from the area A if X = 1 and from the area B if X = 0, as shown.

[Figure: a square of area 4 containing regions A and B, each of area 3/2.]

The area of the square is 4 and A and B both have areas of 3/2. If the point selected is in an upper quadrant, we set Y = 1, and if it is in a lower quadrant, we set Y = 0. Find the conditional PMF pY|X[j|i] for all values of i and j. Next, compute P[Y = 0].

Page 31:

Problems

Prove that

Page 32:

Problems

If X and Y are independent RVs, find the PMF of Z = |X − Y|. Assume that SX = {0,1,…} and SY = {0,1,…}.

Hint: the answer is

\[
p_Z[k] = \begin{cases}
\sum_{i=0}^{\infty} p_X[i]\, p_Y[i] & k = 0 \\[4pt]
\sum_{i=0}^{\infty} \big( p_X[i]\, p_Y[i + k] + p_X[i + k]\, p_Y[i] \big) & k = 1, 2, \ldots
\end{cases}
\]

As an intermediate step, show that

\[
p_{Z|X}[k \mid i] = \begin{cases}
p_Y[i] & k = 0 \\
p_Y[i + k] + p_Y[i - k] & k = 1, 2, \ldots,
\end{cases}
\]

with the convention pY[m] = 0 for m < 0.
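A numerical sanity check (my addition) for the derived formula: truncate the sample spaces, pick example PMFs, and compare direct enumeration against the claimed answer.

```python
# Example PMFs: p[k] = (1 - q) q^k on {0, 1, ...} (a geometric-type PMF, my choice).
q = 0.5
p = lambda k: (1 - q) * q**k if k >= 0 else 0.0

N = 60   # truncation point; the tail beyond this is negligible here

# Direct enumeration of P[Z = k] for Z = |X - Y| with X, Y independent.
p_z_enum = [sum(p(i) * p(j) for i in range(N) for j in range(N) if abs(i - j) == k)
            for k in range(5)]

# The claimed answer: p_Z[0] = sum_i p_X[i] p_Y[i], and for k >= 1
# p_Z[k] = sum_i (p_X[i] p_Y[i+k] + p_X[i+k] p_Y[i]).
p_z_formula = [sum(p(i) * p(i) for i in range(N))] + \
              [sum(p(i) * p(i + k) + p(i + k) * p(i) for i in range(N))
               for k in range(1, 5)]

print([round(a, 6) for a in p_z_enum])
print([round(b, 6) for b in p_z_formula])   # the two lists agree
```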