EE132B-HW Set #6 UCLA 2014 Fall Prof. Izhak Rubin
Problem 1(a)
We calculate the stationary distribution by using the following equations:
\[
\pi = \pi P; \qquad \sum_{i \in S} \pi_i = 1. \tag{1}
\]
This set of equations yields
\[
\pi = \left[ \tfrac{52}{93} \quad \tfrac{21}{93} \quad \tfrac{20}{93} \right].
\]
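As a sanity check, the stationary distribution of any finite chain can be computed numerically by solving \(\pi = \pi P\) together with \(\sum_i \pi_i = 1\) as one overdetermined linear system. The matrix below is a hypothetical placeholder (the assignment's actual \(P\) is not reproduced in these solutions):

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); stand-in for
# the assignment's P, which is not restated here.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.4, 0.1, 0.5]])

n = P.shape[0]
# Stack the stationarity equations (P^T - I) pi = 0 with the
# normalization row sum(pi) = 1, then solve in the least-squares sense.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(pi @ P, pi)        # stationarity: pi = pi P
assert np.isclose(pi.sum(), 1.0)      # normalization
```

For an irreducible chain the system has a unique exact solution, so the least-squares residual is zero and the same code works for any state-space size.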
(b) Let \(A_0 = \{X_0 = c\}\), \(A_1 = \{X_1 = b\}\), \(A_2 = \{X_2 = c\}\), \(A_3 = \{X_3 = a\}\), \(A_4 = \{X_4 = c\}\), \(A_5 = \{X_5 = a\}\), \(A_6 = \{X_6 = c\}\), and \(A_7 = \{X_7 = b\}\). Then we have:
\[
P\Big(\bigcap_{i=1}^{7} A_i \,\Big|\, A_0\Big)
= \prod_{k=1}^{7} P\Big(A_k \,\Big|\, \bigcap_{i=0}^{k-1} A_i\Big)
= \prod_{k=1}^{7} P(A_k \mid A_{k-1})
= p(c,b)\,p(b,c)\,p(c,a)\,p(a,c)\,p(c,a)\,p(a,c)\,p(c,b) = \frac{3}{2500}. \tag{2}
\]
(c) Due to the time-homogeneity property, we have
\[
P(X_{k+2} = c \mid X_k = b) = p^{(2)}(b,c) = \sum_{m \in S} p(b,m)\,p(m,c) = \frac{1}{6}. \tag{3}
\]
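The two-step probability above is simply the \((b,c)\) entry of the matrix \(P^2 = P \cdot P\). With any concrete transition matrix (the one below is a stand-in, not the assignment's), this is a one-line check:

```python
import numpy as np

# Stand-in 3-state matrix over S = {a, b, c}; not the assignment's P.
P = np.array([[0.5, 0.3, 0.2],   # row a
              [0.2, 0.6, 0.2],   # row b
              [0.4, 0.1, 0.5]])  # row c
a, b, c = 0, 1, 2

P2 = P @ P  # two-step transition matrix P^(2)

# Entry (b, c) equals the sum over intermediate states m of p(b,m)p(m,c).
by_hand = sum(P[b, m] * P[m, c] for m in range(3))
assert np.isclose(P2[b, c], by_hand)
```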
Problem 2(a)
We calculate the stationary distribution by using the following equations:
\[
\pi = \pi P; \qquad \sum_{i \in S} \pi_i = 1. \tag{4}
\]
This set of equations yields
\[
\pi = \left[ \tfrac{1}{4} \quad \tfrac{1}{3} \quad \tfrac{5}{12} \right].
\]
(b) We have
\[
\begin{aligned}
P(X_1 = b, X_3 = a, X_4 = c, X_6 = b \mid X_0 = a)
&= \sum_{n \in S} \sum_{m \in S} P(X_1 = b, X_2 = m, X_3 = a, X_4 = c, X_5 = n, X_6 = b \mid X_0 = a) \\
&= \sum_{n \in S} \sum_{m \in S} P(X_1 = b \mid X_0 = a)\,P(X_2 = m \mid X_1 = b)\,P(X_3 = a \mid X_2 = m) \\
&\qquad \times P(X_4 = c \mid X_3 = a)\,P(X_5 = n \mid X_4 = c)\,P(X_6 = b \mid X_5 = n) \\
&= \frac{11}{80}. \tag{5}
\end{aligned}
\]
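The double sum in the computation above marginalizes the unobserved steps \(X_2\) and \(X_5\); each sum collapses into a two-step entry of \(P^2\). A numerical check, again with a placeholder matrix since the assignment's \(P\) is not restated here:

```python
import numpy as np

# Placeholder matrix over S = {a, b, c}; not the assignment's P.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.4, 0.1, 0.5]])
a, b, c = 0, 1, 2

# P(X1=b, X3=a, X4=c, X6=b | X0=a): sum over the unobserved X2=m, X5=n.
prob = sum(P[a, b] * P[b, m] * P[m, a] * P[a, c] * P[c, n] * P[n, b]
           for m in range(3) for n in range(3))

# Equivalently, each inner sum is a two-step transition probability.
P2 = P @ P
assert np.isclose(prob, P[a, b] * P2[b, a] * P[a, c] * P2[c, b])
```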
(c)
\[
\begin{aligned}
P(X_1 = b, X_2 = b, X_3 = a)
&= \sum_{n \in S} P(X_0 = n, X_1 = b, X_2 = b, X_3 = a) \\
&= \sum_{n \in S} P(X_3 = a \mid X_2 = b)\,P(X_2 = b \mid X_1 = b)\,P(X_1 = b \mid X_0 = n)\,P(X_0 = n) \\
&= \frac{51}{960}. \tag{6}
\end{aligned}
\]
Problem 3(a)
To prove that N is a Markov chain, we need to show that
\[
P(N_{n+1} = i \mid N_n, N_{n-1}, \ldots, N_0) = P(N_{n+1} = i \mid N_n) \tag{7}
\]
for all \(i \in S\). Let \(M_n\) denote the indicator of success on the nth trial, i.e., \(M_n = 1\) if the nth trial is successful, and \(M_n = 0\) otherwise. Then, for \(n = 0, 1, \ldots\), we have
\[
N_{n+1} = N_n + M_{n+1}. \tag{8}
\]
Since \(M_{n+1}\) is independent of \(N_n, N_{n-1}, \ldots, N_0\), we have
\[
\begin{aligned}
P(N_{n+1} = i \mid N_n, N_{n-1}, \ldots, N_0)
&= P(N_n + M_{n+1} = i \mid N_n, N_{n-1}, \ldots, N_0) \\
&= P(N_n + M_{n+1} = i \mid N_n) \\
&= P(N_{n+1} = i \mid N_n). \tag{9}
\end{aligned}
\]
Therefore, N is a Markov chain.
(b) Since \(N_0 = 0\), the initial distribution for N is:
\[
\pi_0(i) =
\begin{cases}
1, & \text{for } i = 0 \\
0, & \text{otherwise.}
\end{cases} \tag{10}
\]
We obtain the transition probabilities as follows:
\[
p(i,j) = P(N_{n+1} = j \mid N_n = i) = P(N_n + M_{n+1} = j \mid N_n = i) = P(M_{n+1} = j - i)
=
\begin{cases}
p, & \text{for } j = i + 1 \\
1 - p, & \text{for } j = i,\; i \ge 0 \\
0, & \text{otherwise.}
\end{cases} \tag{11}
\]
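Because \(N_{n+1} = N_n + M_{n+1}\) with \(M_{n+1} \sim \text{Bernoulli}(p)\) independent of the past, the chain is easy to simulate, and the transition law in (11) can be checked empirically. The value \(p = 0.3\) below is an arbitrary illustrative choice:

```python
import random

random.seed(0)
p = 0.3  # arbitrary success probability, chosen only for illustration

# Simulate N_0, N_1, ..., N_T via N_{n+1} = N_n + M_{n+1}, M ~ Bernoulli(p).
T = 200_000
N = [0]
for _ in range(T):
    N.append(N[-1] + (1 if random.random() < p else 0))

# Empirical frequency of an upward step (j = i + 1) should approach p;
# all other steps leave the state unchanged (j = i).
ups = sum(1 for n in range(T) if N[n + 1] == N[n] + 1)
assert abs(ups / T - p) < 0.01
```

Since every increment is 0 or 1, the final state \(N_T\) equals the total number of successes, which is exactly the counting interpretation of the chain.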
Problem 4(a)
We have
\[
\begin{aligned}
P(X_{n+1} = j \mid X_n, \ldots, X_0)
&= P\Big(\sum_{k=1}^{n+1} Y_k = j \,\Big|\, X_n, \ldots, X_0\Big) \\
&= P\Big(Y_{n+1} + \underbrace{\sum_{k=1}^{n} Y_k}_{=\,X_n} = j \,\Big|\, X_n, \ldots, X_0\Big) \\
&= P(Y_{n+1} + X_n = j \mid X_n) \\
&= P(X_{n+1} = j \mid X_n), \tag{12}
\end{aligned}
\]
where the third equality holds because \(Y_{n+1}\) is independent of \(X_0, \ldots, X_n\). Therefore, X is a Markov chain.
(b) We calculate the transition probabilities as follows:
\[
p(i,j) = P(X_{n+1} = j \mid X_n = i) = P(Y_{n+1} + X_n = j \mid X_n = i) = P(Y_{n+1} = j - i)
=
\begin{cases}
p_{j-i}, & \text{for } j \ge i \ge 0 \\
0, & \text{otherwise.}
\end{cases} \tag{13}
\]
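Equation (13) says the jump distribution depends only on \(j - i\), not on the current state. A quick empirical check, using a hypothetical increment distribution (\(Y_k\) taking values 0, 1, 2 with probabilities \(p_0, p_1, p_2\); these numbers are not from the assignment):

```python
import random
from collections import Counter

random.seed(1)

# Hypothetical increment distribution p_k = P(Y = k); illustrative only.
values, probs = [0, 1, 2], [0.5, 0.3, 0.2]

# X_n = Y_1 + ... + Y_n; record the observed jumps j - i = X_{n+1} - X_n.
T = 100_000
jumps = Counter()
x = 0
for _ in range(T):
    y = random.choices(values, probs)[0]
    jumps[y] += 1
    x += y

# Empirical jump frequencies match p_{j-i} regardless of the current state.
for k, pk in zip(values, probs):
    assert abs(jumps[k] / T - pk) < 0.01
```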