Multiple Random Variables
Joint Cumulative Distribution Function
Let X and Y be two random variables. Their joint cumulative distribution function is

$$F_{XY}(x,y) \equiv P[X \le x \cap Y \le y].$$

$$0 \le F_{XY}(x,y) \le 1 \;,\quad -\infty < x < \infty \;,\; -\infty < y < \infty$$

$$F_{XY}(-\infty,-\infty) = F_{XY}(x,-\infty) = F_{XY}(-\infty,y) = 0$$

$$F_{XY}(\infty,\infty) = 1$$

$F_{XY}(x,y)$ does not decrease if either x or y increases, or both increase.

$$F_{XY}(\infty,y) = F_Y(y) \quad\text{and}\quad F_{XY}(x,\infty) = F_X(x)$$
[Figure: joint cumulative distribution function for tossing two dice]
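As a small numerical illustration (my addition, not part of the original slides), the sketch below tabulates this joint CDF for two independent fair dice, where X and Y are the two face values; it simply counts outcomes with X ≤ x and Y ≤ y. The helper name `F_XY` is my own.

```python
import numpy as np

# Joint CDF of two independent fair dice:
# F_XY(x, y) = P(X <= x and Y <= y) = (count of favorable pairs) / 36
faces = np.arange(1, 7)

def F_XY(x, y):
    # Count outcomes (i, j) with i <= x and j <= y among the 36 equally likely pairs
    return sum(1 for i in faces for j in faces if i <= x and j <= y) / 36.0

# Spot checks against the CDF properties listed above
assert F_XY(0, 6) == 0.0    # below the support in x: F_XY = 0
assert F_XY(6, 6) == 1.0    # above the support in both: F_XY = 1
assert F_XY(6, 3) == 0.5    # x past the support: recovers the marginal F_Y(3) = 1/2
print(F_XY(3, 4))           # (3/6)*(4/6) = 1/3
```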
Joint Probability Mass Function
Let X and Y be two discrete random variables. Their joint probability mass function is

$$P_{XY}(x,y) \equiv P[X = x \cap Y = y].$$

Their joint sample space is

$$S_{XY} = \{(x,y) \mid P_{XY}(x,y) > 0\}.$$

$$\sum_{y \in S_Y}\sum_{x \in S_X} P_{XY}(x,y) = 1 \;,\quad P[A] = \sum_{(x,y) \in A} P_{XY}(x,y)$$

$$P_X(x) = \sum_{y \in S_Y} P_{XY}(x,y) \;,\quad P_Y(y) = \sum_{x \in S_X} P_{XY}(x,y)$$

$$E\big(g(X,Y)\big) = \sum_{y \in S_Y}\sum_{x \in S_X} g(x,y)\,P_{XY}(x,y)$$
Example

Let random variables X and Y have the joint PMF

$$P_{XY}(x,y) = \begin{cases} \dfrac{(0.8)^x\,(0.7)^y}{41.17} & , \ 0 \le x < 5 \;,\; -4 \le y < 2 \quad (x, y \text{ integers}) \\[4pt] 0 & , \ \text{otherwise} \end{cases}$$
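A minimal numerical check (my addition): tabulate this PMF on its sample space and confirm that it sums to approximately one, so 41.17 is indeed the normalizing constant, and read off the marginals as row and column sums.

```python
import numpy as np

# Joint PMF from the example: P_XY(x, y) = (0.8^x)(0.7^y)/41.17
# for integer x in [0, 5) and integer y in [-4, 2)
xs = np.arange(0, 5)
ys = np.arange(-4, 2)
P = np.outer(0.8 ** xs, 0.7 ** ys.astype(float)) / 41.17

print(P.sum())         # ~1.0, confirming the normalizing constant
P_X = P.sum(axis=1)    # marginal PMF of X: sum over y in S_Y
P_Y = P.sum(axis=0)    # marginal PMF of Y: sum over x in S_X
print(P_X)             # proportional to 0.8^x
print(P_Y)             # proportional to 0.7^y
```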
Joint Probability Density Function

$$f_{XY}(x,y) = \frac{\partial^2}{\partial x\,\partial y}\big(F_{XY}(x,y)\big) \;,\quad f_{XY}(x,y) \ge 0 \;,\; -\infty < x < \infty \;,\; -\infty < y < \infty$$

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1 \;,\quad F_{XY}(x,y) = \int_{-\infty}^{y}\int_{-\infty}^{x} f_{XY}(\alpha,\beta)\,d\alpha\,d\beta$$

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy \quad\text{and}\quad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dx$$

$$P[(X,Y) \in R] = \iint_R f_{XY}(x,y)\,dx\,dy$$

$$P[x_1 < X \le x_2 \,,\, y_1 < Y \le y_2] = \int_{y_1}^{y_2}\int_{x_1}^{x_2} f_{XY}(x,y)\,dx\,dy$$

$$E\big(g(X,Y)\big) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\,f_{XY}(x,y)\,dx\,dy$$
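As an illustration (my addition), the expectation formula can be checked numerically for the separable pdf $f_{XY}(x,y) = e^{-x}\operatorname{u}(x)\,e^{-y}\operatorname{u}(y)$, which reappears in a later example; for it, $E(XY) = E(X)E(Y) = 1$. The truncation at 30 and the grid step are arbitrary choices.

```python
import numpy as np

# Numerical check of E[g(X,Y)] = double integral of g(x,y) f_XY(x,y) dx dy
# for f_XY(x,y) = exp(-x)exp(-y), x,y >= 0 (midpoint rule, range truncated at 30)
h = 0.02
x = np.arange(h / 2, 30, h)      # midpoints of the x cells
y = np.arange(h / 2, 30, h)      # midpoints of the y cells
X, Y = np.meshgrid(x, y)
f = np.exp(-X) * np.exp(-Y)      # joint pdf on the grid

print(np.sum(f) * h * h)             # ~1 : total probability
print(np.sum(X * Y * f) * h * h)     # ~1 : E(XY), i.e. g(x,y) = xy
```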
The Unit Rectangle Function
$$\operatorname{rect}(t) = \begin{cases} 1 & , \ |t| < 1/2 \\ 1/2 & , \ |t| = 1/2 \\ 0 & , \ |t| > 1/2 \end{cases} \;=\; \operatorname{u}(t + 1/2) - \operatorname{u}(t - 1/2)$$
The product signal g(t) rect(t) can be thought of as the signal g(t) "turned on" at time t = −1/2 and "turned back off" at time t = +1/2.
Let

$$f_{XY}(x,y) = \frac{1}{w_X w_Y}\operatorname{rect}\!\left(\frac{x - X_0}{w_X}\right)\operatorname{rect}\!\left(\frac{y - Y_0}{w_Y}\right).$$

Then

$$E(X) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,f_{XY}(x,y)\,dx\,dy = X_0$$

$$E(Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\,f_{XY}(x,y)\,dx\,dy = Y_0$$

$$E(XY) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,f_{XY}(x,y)\,dx\,dy = X_0 Y_0 \quad\text{(the correlation of X and Y)}$$

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy = \frac{1}{w_X}\operatorname{rect}\!\left(\frac{x - X_0}{w_X}\right)$$
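This joint pdf is exactly that of two independent uniform random variables on intervals of width $w_X$ and $w_Y$ centered at $X_0$ and $Y_0$, so the moments above are easy to confirm by simulation. A sketch (my addition), with arbitrarily chosen example values $X_0 = 2$, $Y_0 = -1$, $w_X = 4$, $w_Y = 2$:

```python
import numpy as np

rng = np.random.default_rng(0)
X0, Y0, wX, wY = 2.0, -1.0, 4.0, 2.0   # arbitrary example parameters

# The rect-product pdf describes independent uniforms on
# [X0 - wX/2, X0 + wX/2] and [Y0 - wY/2, Y0 + wY/2].
X = rng.uniform(X0 - wX/2, X0 + wX/2, size=1_000_000)
Y = rng.uniform(Y0 - wY/2, Y0 + wY/2, size=1_000_000)

print(X.mean())        # ~ X0 = 2
print(Y.mean())        # ~ Y0 = -1
print((X * Y).mean())  # ~ X0*Y0 = -2  (the correlation E(XY))
```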
The corresponding joint CDF can be found region by region.

For $x < X_0 - w_X/2$ or $y < Y_0 - w_Y/2$:

$$F_{XY}(x,y) = 0$$

For $x > X_0 + w_X/2$ and $y > Y_0 + w_Y/2$:

$$F_{XY}(x,y) = 1$$

For $X_0 - w_X/2 < x < X_0 + w_X/2$ and $y > Y_0 + w_Y/2$:

$$F_{XY}(x,y) = \frac{1}{w_X w_Y}\int_{Y_0 - w_Y/2}^{Y_0 + w_Y/2}\int_{X_0 - w_X/2}^{x} du\,dv = \frac{x - (X_0 - w_X/2)}{w_X}$$

For $x > X_0 + w_X/2$ and $Y_0 - w_Y/2 < y < Y_0 + w_Y/2$:

$$F_{XY}(x,y) = \frac{1}{w_X w_Y}\int_{Y_0 - w_Y/2}^{y}\int_{X_0 - w_X/2}^{X_0 + w_X/2} du\,dv = \frac{y - (Y_0 - w_Y/2)}{w_Y}$$

For $X_0 - w_X/2 < x < X_0 + w_X/2$ and $Y_0 - w_Y/2 < y < Y_0 + w_Y/2$:

$$F_{XY}(x,y) = \frac{1}{w_X w_Y}\int_{Y_0 - w_Y/2}^{y}\int_{X_0 - w_X/2}^{x} du\,dv = \frac{\big(x - (X_0 - w_X/2)\big)\big(y - (Y_0 - w_Y/2)\big)}{w_X w_Y}$$
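A direct encoding of this piecewise CDF (my sketch, same arbitrary example parameters as above; the helper `F_XY` is my own). Clipping each coordinate's "fraction of the rectangle covered" to [0, 1] and multiplying reproduces all five regions at once:

```python
import numpy as np

X0, Y0, wX, wY = 2.0, -1.0, 4.0, 2.0   # same arbitrary example parameters

def F_XY(x, y):
    # Piecewise joint CDF of the rect-product pdf, written compactly:
    # each factor is a linear ramp clipped to [0, 1]
    fx = np.clip((x - (X0 - wX/2)) / wX, 0.0, 1.0)
    fy = np.clip((y - (Y0 - wY/2)) / wY, 0.0, 1.0)
    return fx * fy

print(F_XY(-10, -10))   # 0    : below the support
print(F_XY(10, 10))     # 1    : above the support
print(F_XY(X0, 10))     # 0.5  : mid-range in x, y past the support
print(F_XY(X0, Y0))     # 0.25 : mid-range in both
```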
Combinations of Two Random Variables

Example

If the joint pdf of X and Y is

$$f_{XY}(x,y) = e^{-x}\operatorname{u}(x)\,e^{-y}\operatorname{u}(y),$$

find the pdf of Z = X / Y. Since X and Y are never negative, Z is never negative.
$$F_Z(z) = P[Z \le z] = P[X/Y \le z]$$

$$F_Z(z) = P[X \le zY \cap Y > 0] + P[X \ge zY \cap Y < 0]$$

Since Y is never negative,

$$F_Z(z) = P[X \le zY \cap Y > 0]$$

$$F_Z(z) = \int_{0}^{\infty}\int_{0}^{zy} f_{XY}(x,y)\,dx\,dy = \int_{0}^{\infty}\int_{0}^{zy} e^{-x}e^{-y}\,dx\,dy \;,\quad z \ge 0$$

$$F_Z(z) = \int_{0}^{\infty}\big(1 - e^{-zy}\big)e^{-y}\,dy = \left[\frac{e^{-y(z+1)}}{z+1} - e^{-y}\right]_{0}^{\infty} = \frac{z}{z+1} \;,\quad z \ge 0$$

$$f_Z(z) = \frac{d}{dz}F_Z(z) = \begin{cases} \dfrac{1}{(z+1)^2} & , \ z \ge 0 \\ 0 & , \ z < 0 \end{cases} \implies f_Z(z) = \frac{\operatorname{u}(z)}{(z+1)^2}$$
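A quick Monte Carlo check (my addition): draw X and Y as independent unit exponentials and compare the empirical CDF of Z = X/Y with the derived z/(z+1).

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.exponential(1.0, size=1_000_000)
Y = rng.exponential(1.0, size=1_000_000)
Z = X / Y

for z in [0.5, 1.0, 2.0, 5.0]:
    empirical = np.mean(Z <= z)   # P(Z <= z) from simulation
    exact = z / (z + 1)           # F_Z(z) derived above
    print(z, empirical, exact)
```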
Example

The joint pdf of X and Y is defined as

$$f_{XY}(x,y) = \begin{cases} 6x & , \ x \ge 0 \,,\; y \ge 0 \,,\; x + y \le 1 \\ 0 & , \ \text{otherwise} \end{cases}$$

Define Z = X − Y. Find the pdf of Z.

Given the constraints on X and Y, $-1 \le Z \le 1$. The line $X - Y = Z$ intersects $X + Y = 1$ at $X = \dfrac{1+Z}{2}$, $Y = \dfrac{1-Z}{2}$.
For $0 \le z \le 1$,

$$F_Z(z) = 1 - \int_{0}^{(1-z)/2}\int_{y+z}^{1-y} 6x\,dx\,dy = 1 - \int_{0}^{(1-z)/2}\Big[3x^2\Big]_{y+z}^{1-y}\,dy$$

$$F_Z(z) = 1 - \frac{3}{4}(1-z)\big(1-z^2\big) \implies f_Z(z) = \frac{3}{4}(1-z)(1+3z)$$
For $-1 \le z \le 0$, the region $\{X \le Y + z\}$ inside the triangle splits into two pieces that contribute equally, by the symmetry of the region about $y = (1-z)/2$; hence the factor of 2 below.

$$F_Z(z) = 2\int_{-z}^{(1-z)/2}\int_{0}^{y+z} 6x\,dx\,dy = 2\int_{-z}^{(1-z)/2}\Big[3x^2\Big]_{0}^{y+z}\,dy = 6\int_{-z}^{(1-z)/2}(y+z)^2\,dy$$

$$F_Z(z) = \frac{(1+z)^3}{4} \implies f_Z(z) = \frac{3(1+z)^2}{4}$$
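Both pieces of the result can be checked by simulation (my addition). The sketch below draws from $f_{XY}(x,y) = 6x$ on the triangle by rejection sampling (the pdf is bounded by 6 on the unit square) and compares the empirical CDF of Z = X − Y with the piecewise formula; the helper `F_Z` is my own.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rejection sampling from f_XY(x,y) = 6x on the triangle x>=0, y>=0, x+y<=1:
# propose (x, y, u) uniformly and accept when 6u <= 6x inside the triangle
n = 4_000_000
x = rng.uniform(0, 1, n)
y = rng.uniform(0, 1, n)
u = rng.uniform(0, 1, n)
keep = (x + y <= 1) & (u <= x)
Z = x[keep] - y[keep]

def F_Z(z):
    # Piecewise CDF derived above
    if z <= 0:
        return (1 + z) ** 3 / 4
    return 1 - 0.75 * (1 - z) * (1 - z ** 2)

for z in [-0.5, 0.0, 0.5]:
    print(z, np.mean(Z <= z), F_Z(z))   # empirical vs. exact
```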
Conditional Probability

$$F_{X|A}(x) = \frac{P[\{X \le x\} \cap A]}{P[A]}$$

Let $A = \{Y \le y\}$. Then

$$F_{X|Y \le y}(x) = \frac{P[X \le x \cap Y \le y]}{P[Y \le y]} = \frac{F_{XY}(x,y)}{F_Y(y)}$$

Let $A = \{y_1 < Y \le y_2\}$. Then

$$F_{X|y_1 < Y \le y_2}(x) = \frac{F_{XY}(x,y_2) - F_{XY}(x,y_1)}{F_Y(y_2) - F_Y(y_1)}$$
Let $A = \{Y = y\}$. Then

$$F_{X|Y=y}(x) = \lim_{\Delta y \to 0}\frac{F_{XY}(x, y + \Delta y) - F_{XY}(x,y)}{F_Y(y + \Delta y) - F_Y(y)} = \frac{\frac{\partial}{\partial y}\big(F_{XY}(x,y)\big)}{\frac{d}{dy}\big(F_Y(y)\big)}$$

$$F_{X|Y=y}(x) = \frac{\frac{\partial}{\partial y}\big(F_{XY}(x,y)\big)}{f_Y(y)} \;,\quad f_{X|Y=y}(x) = \frac{\partial}{\partial x}\big(F_{X|Y=y}(x)\big) = \frac{f_{XY}(x,y)}{f_Y(y)}$$

Similarly,

$$f_{Y|X=x}(y) = \frac{f_{XY}(x,y)}{f_X(x)}$$
In a simplified notation,

$$f_{X|Y}(x) = \frac{f_{XY}(x,y)}{f_Y(y)} \quad\text{and}\quad f_{Y|X}(y) = \frac{f_{XY}(x,y)}{f_X(x)}$$

Bayes' Theorem:

$$f_{X|Y}(x)\,f_Y(y) = f_{Y|X}(y)\,f_X(x)$$

Marginal pdf's from joint or conditional pdf's:

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy = \int_{-\infty}^{\infty} f_{X|Y}(x)\,f_Y(y)\,dy$$

$$f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dx = \int_{-\infty}^{\infty} f_{Y|X}(y)\,f_X(x)\,dx$$
It can be shown that, analogous to the pdf case, the conditional PMFs are

$$P_{X|Y}(x \mid y) = \frac{P_{XY}(x,y)}{P_Y(y)} \quad\text{and}\quad P_{Y|X}(y \mid x) = \frac{P_{XY}(x,y)}{P_X(x)}$$

Bayes' Theorem:

$$P_{X|Y}(x \mid y)\,P_Y(y) = P_{Y|X}(y \mid x)\,P_X(x)$$

Marginal PMF's from joint or conditional PMF's:

$$P_X(x) = \sum_{y \in S_Y} P_{XY}(x,y) = \sum_{y \in S_Y} P_{X|Y}(x \mid y)\,P_Y(y)$$

$$P_Y(y) = \sum_{x \in S_X} P_{XY}(x,y) = \sum_{x \in S_X} P_{Y|X}(y \mid x)\,P_X(x)$$
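These relations are easy to verify numerically on the joint PMF from the earlier example (my sketch, reusing that table; the array names are my own):

```python
import numpy as np

# Reuse the example joint PMF: P_XY(x,y) = (0.8^x)(0.7^y)/41.17 on its support
xs = np.arange(0, 5)
ys = np.arange(-4, 2)
P = np.outer(0.8 ** xs, 0.7 ** ys.astype(float)) / 41.17
P /= P.sum()                      # renormalize away the rounding in 41.17

P_X = P.sum(axis=1)               # marginals
P_Y = P.sum(axis=0)
P_X_given_Y = P / P_Y             # P_X|Y(x|y), columns indexed by y
P_Y_given_X = P / P_X[:, None]    # P_Y|X(y|x), rows indexed by x

# Bayes' theorem: P_X|Y(x|y) P_Y(y) = P_Y|X(y|x) P_X(x) = P_XY(x,y)
print(np.allclose(P_X_given_Y * P_Y, P_Y_given_X * P_X[:, None]))  # True

# Marginal from conditional: P_X(x) = sum over y of P_X|Y(x|y) P_Y(y)
print(np.allclose((P_X_given_Y * P_Y).sum(axis=1), P_X))           # True
```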
Independent Random Variables
If two continuous random variables X and Y are independent, then

$$f_{X|Y}(x) = f_X(x) = \frac{f_{XY}(x,y)}{f_Y(y)} \quad\text{and}\quad f_{Y|X}(y) = f_Y(y) = \frac{f_{XY}(x,y)}{f_X(x)}.$$

Therefore $f_{XY}(x,y) = f_X(x)\,f_Y(y)$ and their correlation is the product of their expected values:

$$E(XY) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,f_{XY}(x,y)\,dx\,dy = \int_{-\infty}^{\infty} y\,f_Y(y)\,dy \int_{-\infty}^{\infty} x\,f_X(x)\,dx$$

$$E(XY) = E(X)\,E(Y)$$
If two discrete random variables X and Y are independent, then

$$P_{X|Y}(x \mid y) = P_X(x) = \frac{P_{XY}(x,y)}{P_Y(y)} \quad\text{and}\quad P_{Y|X}(y \mid x) = P_Y(y) = \frac{P_{XY}(x,y)}{P_X(x)}.$$

Therefore $P_{XY}(x,y) = P_X(x)\,P_Y(y)$ and their correlation is the product of their expected values:

$$E(XY) = \sum_{y \in S_Y}\sum_{x \in S_X} xy\,P_{XY}(x,y) = \sum_{y \in S_Y} y\,P_Y(y)\sum_{x \in S_X} x\,P_X(x)$$

$$E(XY) = E(X)\,E(Y)$$
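The example PMF above factors as a function of x times a function of y, so X and Y are independent there and E(XY) should equal E(X)E(Y). A quick check (my addition):

```python
import numpy as np

# The earlier example PMF is separable: proportional to (0.8^x)(0.7^y),
# so X and Y are independent and E(XY) = E(X)E(Y)
xs = np.arange(0, 5)
ys = np.arange(-4, 2)
P = np.outer(0.8 ** xs, 0.7 ** ys.astype(float))
P /= P.sum()

EX = (xs * P.sum(axis=1)).sum()
EY = (ys * P.sum(axis=0)).sum()
EXY = (np.outer(xs, ys) * P).sum()
print(EXY, EX * EY)   # equal, up to floating-point error
```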
Covariance
$$\sigma_{XY} \equiv E\Big[\big(X - E(X)\big)\big(Y - E(Y)\big)^*\Big]$$

$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \big(x - E(X)\big)\big(y^* - E(Y^*)\big)\,f_{XY}(x,y)\,dx\,dy$$

or

$$= \sum_{y \in S_Y}\sum_{x \in S_X} \big(x - E(X)\big)\big(y^* - E(Y^*)\big)\,P_{XY}(x,y)$$

$$\sigma_{XY} = E(XY^*) - E(X)\,E(Y^*)$$

If X and Y are independent,

$$\sigma_{XY} = E(X)\,E(Y^*) - E(X)\,E(Y^*) = 0.$$
Correlation Coefficient
$$\rho_{XY} = E\left[\frac{X - E(X)}{\sigma_X}\cdot\frac{Y^* - E(Y^*)}{\sigma_Y}\right]$$

$$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{x - E(X)}{\sigma_X}\cdot\frac{y^* - E(Y^*)}{\sigma_Y}\,f_{XY}(x,y)\,dx\,dy$$

or

$$= \sum_{y \in S_Y}\sum_{x \in S_X} \frac{x - E(X)}{\sigma_X}\cdot\frac{y^* - E(Y^*)}{\sigma_Y}\,P_{XY}(x,y)$$

$$\rho_{XY} = \frac{E(XY^*) - E(X)\,E(Y^*)}{\sigma_X \sigma_Y} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}$$

If X and Y are independent, $\rho_{XY} = 0$. If they are perfectly positively correlated, $\rho_{XY} = +1$, and if they are perfectly negatively correlated, $\rho_{XY} = -1$.
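A simulation sketch of the three cases for real-valued X and Y (my addition; the helper `rho` is my own):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=1_000_000)

def rho(a, b):
    # Correlation coefficient: covariance normalized by both standard deviations
    return np.mean((a - a.mean()) * (b - b.mean())) / (a.std() * b.std())

print(rho(X, rng.normal(size=X.size)))   # independent: ~0
print(rho(X, 3 * X + 2))                 # perfect positive linear relation: +1
print(rho(X, -3 * X + 2))                # perfect negative linear relation: -1
```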
If two random variables are independent, their covariance is zero. However, if two random variables have a zero covariance, that does not mean they are necessarily independent.

Independence ⇒ Zero Covariance

Zero Covariance ⇏ Independence
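A classic counterexample illustrating the second statement (my addition): X uniform on [−1, 1] and Y = X² are completely dependent, yet their covariance is zero because $E(X^3) = 0$.

```python
import numpy as np

rng = np.random.default_rng(4)

# X symmetric about 0 and Y = X^2: fully dependent, yet cov(X, Y) = E(X^3) = 0
X = rng.uniform(-1, 1, size=1_000_000)
Y = X ** 2

cov = np.mean((X - X.mean()) * (Y - Y.mean()))
print(cov)   # ~0, even though Y is a deterministic function of X
```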
In the traditional jargon of random variable analysis, two "uncorrelated" random variables have a covariance of zero. Unfortunately, this does not also imply that their correlation is zero. If their correlation is zero, they are said to be orthogonal.

X and Y are "uncorrelated" ⇔ $\sigma_{XY} = 0$

X and Y are "orthogonal" ⇔ $E(XY) = 0$
Bivariate Gaussian Random Variables

$$f_{XY}(x,y) = \frac{\exp\left\{-\dfrac{\left(\dfrac{x-\mu_X}{\sigma_X}\right)^2 - \dfrac{2\rho_{XY}(x-\mu_X)(y-\mu_Y)}{\sigma_X \sigma_Y} + \left(\dfrac{y-\mu_Y}{\sigma_Y}\right)^2}{2\big(1-\rho_{XY}^2\big)}\right\}}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho_{XY}^2}}$$
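A numerical sanity check of this pdf (my sketch, with arbitrarily chosen parameters): evaluate it on a grid, confirm it integrates to one, and compare the numerical marginal of X with the Gaussian form given just below.

```python
import numpy as np

muX, muY, sX, sY, rho = 1.0, -2.0, 1.5, 0.8, 0.6   # arbitrary parameters

hx, hy = sX / 100, sY / 100
x = np.arange(muX - 8 * sX + hx / 2, muX + 8 * sX, hx)   # midpoint grids
y = np.arange(muY - 8 * sY + hy / 2, muY + 8 * sY, hy)
X, Y = np.meshgrid(x, y)

# Quadratic form in the exponent of the bivariate Gaussian pdf
q = ((X - muX) / sX) ** 2 \
    - 2 * rho * (X - muX) * (Y - muY) / (sX * sY) \
    + ((Y - muY) / sY) ** 2
f = np.exp(-q / (2 * (1 - rho ** 2))) / (2 * np.pi * sX * sY * np.sqrt(1 - rho ** 2))

print(f.sum() * hx * hy)                   # ~1 : total probability

# Numerical marginal of X versus the closed form below
fX_num = f.sum(axis=0) * hy
fX_form = np.exp(-(x - muX) ** 2 / (2 * sX ** 2)) / (sX * np.sqrt(2 * np.pi))
print(np.max(np.abs(fX_num - fX_form)))    # small
```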
Any cross section of a bivariate Gaussian pdf at any value of x or y is Gaussian. The marginal pdf's of X and Y can be found using

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy$$

which turns out to be

$$f_X(x) = \frac{e^{-(x-\mu_X)^2/2\sigma_X^2}}{\sigma_X\sqrt{2\pi}}$$

Similarly,

$$f_Y(y) = \frac{e^{-(y-\mu_Y)^2/2\sigma_Y^2}}{\sigma_Y\sqrt{2\pi}}$$
The conditional pdf of X given Y is

$$f_{X|Y}(x) = \frac{\exp\left[-\dfrac{\big((x-\mu_X) - \rho_{XY}(\sigma_X/\sigma_Y)(y-\mu_Y)\big)^2}{2\sigma_X^2\big(1-\rho_{XY}^2\big)}\right]}{\sigma_X\sqrt{2\pi}\sqrt{1-\rho_{XY}^2}}$$

The conditional pdf of Y given X is

$$f_{Y|X}(y) = \frac{\exp\left[-\dfrac{\big((y-\mu_Y) - \rho_{XY}(\sigma_Y/\sigma_X)(x-\mu_X)\big)^2}{2\sigma_Y^2\big(1-\rho_{XY}^2\big)}\right]}{\sigma_Y\sqrt{2\pi}\sqrt{1-\rho_{XY}^2}}$$
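These formulas say that X given Y = y is Gaussian with mean $\mu_X + \rho_{XY}(\sigma_X/\sigma_Y)(y-\mu_Y)$ and variance $\sigma_X^2(1-\rho_{XY}^2)$. A simulation sketch (my addition, same arbitrary parameters as before): build correlated Gaussian pairs, condition on a thin slice around a chosen y, and compare the slice's moments with the formulas.

```python
import numpy as np

rng = np.random.default_rng(5)
muX, muY, sX, sY, rho = 1.0, -2.0, 1.5, 0.8, 0.6   # same arbitrary parameters

# Standard construction of a bivariate Gaussian pair: draw Y, then set
# X = conditional mean + independent Gaussian noise with the conditional std
n = 2_000_000
Y = rng.normal(muY, sY, n)
X = muX + rho * (sX / sY) * (Y - muY) \
    + rng.normal(0, sX * np.sqrt(1 - rho ** 2), n)

# Condition on a thin slice around y0 and check the conditional moments
y0 = -1.0
sel = np.abs(Y - y0) < 0.02
print(X[sel].mean(), muX + rho * (sX / sY) * (y0 - muY))   # conditional mean
print(X[sel].std(), sX * np.sqrt(1 - rho ** 2))            # conditional std
```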