

ELEC 303 – Random Signals

Lecture 7 – Discrete Random Variables: Conditioning and Independence

Farinaz Koushanfar, ECE Dept., Rice University

Sept 15, 2009


Lecture outline

• Reading: Finish Chapter 2
• Review
• Joint PMFs
• Conditioning
• Independence


Random Variables

• A random variable is a real-valued function of the outcome of an experiment

• A function of a random variable defines another random variable

• We associate with each RV some averages of interest, such as mean and variance

• A random variable can be conditioned on an event or another random variable

• There is a notion of independence of a random variable from an event or from another random variable


Discrete random variables

• It is a real-valued function of the outcome of the experiment
– It can take a finite or countably infinite number of values

• A discrete random variable has an associated probability mass function (PMF)
– It gives the probability of each numerical value that the random variable can take
• A function of a discrete random variable defines another discrete random variable (RV)
– Its PMF can be found from the PMF of the original RV


Probability mass function (PMF)

• Notations
– Random variable: X
– Experimental value: x
– P_X(x) = P({X = x})

• It mathematically defines a probability law
• Probability axiom: Σ_x P_X(x) = 1
• Example: Coin toss
– Define X(H) = 1, X(T) = 0 (indicator RV)
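A minimal Python sketch of the coin-toss example (not part of the original slides); the head probability q is assumed here for illustration:

```python
# A minimal sketch (not from the slides): PMF of the coin-toss indicator RV
# X(H) = 1, X(T) = 0, with the head probability q assumed for illustration.
q = 0.5  # assumed P(heads)

p_X = {1: q, 0: 1 - q}  # P_X(x) = P({X = x})

# Probability axiom: the PMF sums to 1 over all x
assert abs(sum(p_X.values()) - 1.0) < 1e-12
print(p_X)
```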


Review: discrete random variable PMF, expectation, variance

• Probability mass function (PMF)
– P_X(x) = P(X = x)
– Σ_x P_X(x) = 1


Expected value for functions of RV

• Let X be a random variable with PMF P_X, and let g(X) be a function of X. Then the expected value of the random variable g(X) is given by
• E[g(X)] = Σ_x g(x) P_X(x)
• Var(X) = E[(X - E[X])^2] = Σ_x (x - E[X])^2 P_X(x)
• Similarly, the nth moment is given by
– E[X^n] = Σ_x x^n P_X(x)
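A minimal Python sketch of these formulas (not part of the original slides); the PMF values are assumed for illustration:

```python
# A minimal sketch (not from the slides): expected value rule, variance, and nth moment
# computed directly from a small example PMF chosen for illustration.
p_X = {1: 0.2, 2: 0.5, 3: 0.3}  # assumed PMF

def expect(g, pmf):
    """E[g(X)] = sum_x g(x) * P_X(x)."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, p_X)               # E[X]
var = expect(lambda x: (x - mean) ** 2, p_X)  # Var(X) = E[(X - E[X])^2]
m2 = expect(lambda x: x ** 2, p_X)            # 2nd moment E[X^2]

# Sanity check: Var(X) = E[X^2] - (E[X])^2
assert abs(var - (m2 - mean ** 2)) < 1e-12
print(mean, var, m2)
```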


Properties of variance
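The equations on this slide did not survive in the transcript. As a hedged placeholder, here is a short Python check of the standard properties Var(aX + b) = a^2 Var(X) and Var(X) = E[X^2] - (E[X])^2, with an example PMF and constants a, b assumed for illustration:

```python
# A minimal sketch (the slide's equations are not in the transcript): numeric check of
# Var(aX + b) = a^2 Var(X) and Var(X) = E[X^2] - (E[X])^2.
p_X = {0: 0.3, 1: 0.4, 2: 0.3}  # assumed PMF
a, b = 3.0, -5.0                # assumed constants

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

p_Y = {a * x + b: p for x, p in p_X.items()}  # PMF of Y = aX + b

assert abs(var(p_Y) - a ** 2 * var(p_X)) < 1e-12
assert abs(var(p_X) - (sum(x * x * p for x, p in p_X.items()) - mean(p_X) ** 2)) < 1e-12
```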


Joint PMFs of multiple random variables

• Joint PMF of two random variables: P_X,Y
• P_X,Y(x,y) = P(X = x, Y = y)
• Calculate the PMFs of X and Y by the formulas
– P_X(x) = Σ_y P_X,Y(x,y)
– P_Y(y) = Σ_x P_X,Y(x,y)
• We refer to P_X and P_Y as the marginal PMFs
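A minimal Python sketch of marginalization (not part of the original slides); the joint PMF values are assumed for illustration:

```python
# A minimal sketch (not from the slides): marginal PMFs obtained from a joint PMF
# by summing out the other variable.
from collections import defaultdict

p_XY = {(1, 1): 0.1, (1, 2): 0.2, (2, 1): 0.3, (2, 2): 0.4}  # assumed P_X,Y(x, y)

p_X, p_Y = defaultdict(float), defaultdict(float)
for (x, y), p in p_XY.items():
    p_X[x] += p  # P_X(x) = sum_y P_X,Y(x, y)
    p_Y[y] += p  # P_Y(y) = sum_x P_X,Y(x, y)

print(dict(p_X), dict(p_Y))
```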


Tabular method

• For computing marginal PMFs

• Assume Z = X + 2Y
• Find E[Z]
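The joint PMF table from this slide is not in the transcript. A minimal Python sketch with an assumed joint PMF shows how E[Z] would be computed from the table:

```python
# A minimal sketch (the slide's table is not in the transcript): E[Z] for Z = X + 2Y via the
# expected value rule E[g(X,Y)] = sum_{x,y} g(x,y) P_X,Y(x,y), with an assumed joint PMF.
p_XY = {(1, 1): 0.1, (1, 2): 0.2, (2, 1): 0.3, (2, 2): 0.4}

E_Z = sum((x + 2 * y) * p for (x, y), p in p_XY.items())
print(E_Z)  # equals E[X] + 2*E[Y] by linearity of expectation
```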


Expectation


Variances


Example: Binomial mean and variance
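The derivation on this slide is not in the transcript. As a hedged stand-in, a short Python check computes the binomial mean and variance from the PMF and compares them with the closed forms np and np(1-p); the parameters are assumed for illustration:

```python
# A minimal sketch (the slide's derivation is not in the transcript): mean and variance of a
# Binomial(n, p) RV computed from its PMF and compared with np and np(1-p).
from math import comb

n, p = 10, 0.3  # assumed parameters
pmf = {k: comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean) ** 2 * q for k, q in pmf.items())

assert abs(mean - n * p) < 1e-9           # E[X] = np
assert abs(var - n * p * (1 - p)) < 1e-9  # Var(X) = np(1-p)
```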


More than two variables

• P_X,Y,Z(x,y,z) = P(X = x, Y = y, Z = z)
• P_X,Y(x,y) = Σ_z P_X,Y,Z(x,y,z)
• P_X(x) = Σ_y Σ_z P_X,Y,Z(x,y,z)
• The expected value rule:
• E[g(X,Y,Z)] = Σ_x Σ_y Σ_z g(x,y,z) P_X,Y,Z(x,y,z)



Conditioning

• Conditional PMF of a RV on an event A
• P_X|A(x) = P(X = x | A) = P({X = x} ∩ A)/P(A)
• P(A) = Σ_x P({X = x} ∩ A)
• Σ_x P_X|A(x) = 1
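A minimal Python sketch of conditioning on an event (not part of the original slides); the die example and the event A are assumed for illustration:

```python
# A minimal sketch (not from the slides): conditional PMF of a fair six-sided die roll X
# given the event A = {X is even}.
p_X = {x: 1 / 6 for x in range(1, 7)}
A = {2, 4, 6}

P_A = sum(p for x, p in p_X.items() if x in A)       # P(A) = sum_x P({X = x} ∩ A)
p_X_given_A = {x: (p / P_A if x in A else 0.0) for x, p in p_X.items()}

assert abs(sum(p_X_given_A.values()) - 1.0) < 1e-12  # sum_x P_X|A(x) = 1
print(p_X_given_A)  # each even value gets probability 1/3
```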


Example

• A student will take a certain test up to a maximum of n times, each time with a probability p of passing, independent of the number of attempts

• Find the PMF of the number of attempts given that the student passes the test

• A = {the event of passing}
• X is a geometric RV with parameter p, and A = {X ≤ n}
• P(A) = Σ_{m=1}^{n} (1-p)^(m-1) p
• The conditional PMF is
– P_X|A(k) = (1-p)^(k-1) p / [Σ_{m=1}^{n} (1-p)^(m-1) p], if k = 1, ..., n
– P_X|A(k) = 0, otherwise
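A minimal Python sketch of this conditional PMF (not part of the original slides); the values of p and n are assumed for illustration:

```python
# A minimal sketch (not from the slides): conditional PMF of a geometric RV X
# given A = {X <= n}.
p, n = 0.3, 5  # assumed parameters

def geom(k):
    return (1 - p) ** (k - 1) * p            # P_X(k), k = 1, 2, ...

P_A = sum(geom(m) for m in range(1, n + 1))  # P(A) = sum_{m=1}^{n} (1-p)^(m-1) p

p_X_given_A = {k: geom(k) / P_A for k in range(1, n + 1)}  # zero for k outside {1, ..., n}

assert abs(sum(p_X_given_A.values()) - 1.0) < 1e-12
print(p_X_given_A)
```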


Conditioning a RV on another

• P_X|Y(x|y) = P(X = x | Y = y)
• P_X|Y(x|y) = P(X = x, Y = y)/P(Y = y) = P_X,Y(x,y)/P_Y(y)
• The conditional PMF is often used to compute the joint PMF, using a sequential approach
• P_X,Y(x,y) = P_Y(y) P_X|Y(x|y)
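A minimal Python sketch of the sequential (multiplication-rule) construction (not part of the original slides); the marginal and conditional PMFs are assumed for illustration:

```python
# A minimal sketch (not from the slides): building a joint PMF sequentially via
# P_X,Y(x, y) = P_Y(y) * P_X|Y(x | y).
p_Y = {0: 0.4, 1: 0.6}                # assumed P_Y
p_X_given_Y = {0: {0: 0.5, 1: 0.5},   # assumed P_X|Y(x | y = 0)
               1: {0: 0.2, 1: 0.8}}   # assumed P_X|Y(x | y = 1)

p_XY = {(x, y): p_Y[y] * p_X_given_Y[y][x]
        for y in p_Y for x in p_X_given_Y[y]}

assert abs(sum(p_XY.values()) - 1.0) < 1e-12
print(p_XY)
```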


Conditional expectation

• Conditional expectation of X given A (P(A) > 0):
– E[X|A] = Σ_x x P_X|A(x)
– E[g(X)|A] = Σ_x g(x) P_X|A(x)
• If A_1, ..., A_n are disjoint events partitioning the sample space, then E[X] = Σ_i P(A_i) E[X|A_i]
• For any event B with P(A_i ∩ B) > 0 for all i,
– E[X|B] = Σ_i P(A_i|B) E[X|A_i ∩ B]
• Total expectation theorem: E[X] = Σ_y P_Y(y) E[X|Y = y]
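A minimal Python sketch verifying the total expectation theorem (not part of the original slides); the joint PMF values are assumed for illustration:

```python
# A minimal sketch (not from the slides): numeric check of the total expectation theorem
# E[X] = sum_y P_Y(y) E[X | Y = y].
p_XY = {(1, 1): 0.1, (1, 2): 0.2, (2, 1): 0.3, (2, 2): 0.4}  # assumed joint PMF

p_Y = {}
for (x, y), p in p_XY.items():
    p_Y[y] = p_Y.get(y, 0.0) + p

def cond_mean_X(y):
    """E[X | Y = y] = sum_x x * P_X,Y(x, y) / P_Y(y)."""
    return sum(x * p for (x, yy), p in p_XY.items() if yy == y) / p_Y[y]

E_X_direct = sum(x * p for (x, y), p in p_XY.items())
E_X_total = sum(p_Y[y] * cond_mean_X(y) for y in p_Y)
assert abs(E_X_direct - E_X_total) < 1e-12
```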


Mean and variance of Geometric

• Assume there is a probability p that your program works correctly, independent of how many times you have written it
• Find the mean and variance of X, the number of tries until it works correctly

P_X(k) = (1-p)^(k-1) p, k = 1, 2, …

E[X] = Σ_k k (1-p)^(k-1) p

Var(X) = Σ_k (k - E[X])^2 (1-p)^(k-1) p


Mean and variance of Geometric

• E[X|X=1] = 1, E[X|X>1] = 1 + E[X]
• Total expectation: E[X] = p·E[X|X=1] + (1-p)·E[X|X>1] = p + (1-p)(1 + E[X]), so E[X] = 1/p
• E[X^2|X=1] = 1, E[X^2|X>1] = E[(1+X)^2] = 1 + 2E[X] + E[X^2]

E[X^2] = (1 + 2(1-p)E[X])/p, so Var(X) = E[X^2] - (E[X])^2 = (1-p)/p^2
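A minimal Python sketch checking these results numerically (not part of the original slides); p is assumed and the infinite sums are truncated at a large K:

```python
# A minimal sketch (not from the slides): numeric check of the geometric mean/variance that
# follow from the conditioning argument above, truncating the infinite sums at K terms.
p, K = 0.3, 10_000  # p assumed for illustration

pmf = [(k, (1 - p) ** (k - 1) * p) for k in range(1, K + 1)]
E_X = sum(k * q for k, q in pmf)
E_X2 = sum(k * k * q for k, q in pmf)

assert abs(E_X - 1 / p) < 1e-6                            # E[X] = 1/p
assert abs(E_X2 - (1 + 2 * (1 - p) * E_X) / p) < 1e-6     # E[X^2] = (1 + 2(1-p)E[X]) / p
assert abs((E_X2 - E_X ** 2) - (1 - p) / p ** 2) < 1e-6   # Var(X) = (1-p)/p^2
```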


Independence

• Independence from an event:
– P(X = x, A) = P(X = x)P(A) = P_X(x)P(A), for all x
– Since P(X = x, A) = P_X|A(x)P(A), this is equivalent to P_X|A(x) = P_X(x), for all x
• Independence of random variables (given an event A):
– P(X = x, Y = y | A) = P(X = x | A)P(Y = y | A), for all x and y
• For two independent RVs: E[XY] = E[X]E[Y]
• Also, E[g(X)h(Y)] = E[g(X)]E[h(Y)]
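A minimal Python sketch of the product rule for expectations under independence (not part of the original slides); the marginal PMFs are assumed for illustration:

```python
# A minimal sketch (not from the slides): for independent X and Y (joint PMF = product of
# the marginals), E[XY] = E[X]E[Y]; checked numerically.
p_X = {0: 0.5, 1: 0.5}  # assumed marginal of X
p_Y = {1: 0.3, 2: 0.7}  # assumed marginal of Y

p_XY = {(x, y): px * py for x, px in p_X.items() for y, py in p_Y.items()}

E_X = sum(x * p for x, p in p_X.items())
E_Y = sum(y * p for y, p in p_Y.items())
E_XY = sum(x * y * p for (x, y), p in p_XY.items())

assert abs(E_XY - E_X * E_Y) < 1e-12
```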


Multiple RVs, sum of RVs

• Three RVs X, Y, and Z are said to be independent if P_X,Y,Z(x,y,z) = P_X(x)P_Y(y)P_Z(z) for all x, y, z