Source: stats.lse.ac.uk/cetin/files/hw3_soln.pdf
SOLUTIONS TO PROBLEM SET 3
The following treatment follows Chapter 23 of “Stochastic Processes” by Richard Bass.
1. Let g_n = n(f − P_{1/n} f). Since f is excessive, i.e. f ≥ P_t f for any t ≥ 0, we have g_n ≥ 0. Then

Ug_n = n ∫_0^∞ P_s f ds − n ∫_0^∞ P_{s+1/n} f ds = n ∫_0^{1/n} P_s f ds.
Since f is excessive, the right-hand side of the previous identity is at most f for each n. Recall that X is Feller, hence lim_{s↓0} P_s f = f. Therefore lim_{n→∞} Ug_n = lim_{n→∞} n ∫_0^{1/n} P_s f ds = f.
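As a quick numerical sanity check of the last limit (an illustration, not part of the solution): take X to be one-dimensional Brownian motion and f(x) = exp(−x²). This f is bounded and continuous but not excessive; the Feller limit n ∫_0^{1/n} P_s f ds → f only uses continuity. A Gaussian integral gives the closed form P_s f(x) = exp(−x²/(1+2s))/√(1+2s) used below.

```python
import math

def f(x):
    return math.exp(-x * x)

def P(s, x):
    # Heat semigroup of Brownian motion applied to f(x) = exp(-x^2):
    # E[exp(-(x + sqrt(s) Z)^2)] for Z ~ N(0,1), in closed form.
    return math.exp(-x * x / (1.0 + 2.0 * s)) / math.sqrt(1.0 + 2.0 * s)

def averaged(n, x, steps=400):
    # n * integral_0^{1/n} P_s f(x) ds, via the midpoint rule.
    h = (1.0 / n) / steps
    return n * h * sum(P((k + 0.5) * h, x) for k in range(steps))

x = 0.5
for n in (1, 10, 100, 1000):
    print(n, averaged(n, x), f(x))  # approaches f(x) as n grows
```

The averages increase toward f(x), matching the monotone limit in the solution.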
2. Let us first understand the set of times T_n = {k2^{−n} : 0 ≤ k ≤ n2^n}. For each n, the time interval [0, n] is divided into n2^n sub-intervals of length 2^{−n}, and T_n consists of the grid points.
Let us now prove that the sequence (g_n) is increasing. Observe that g_n(x) ≥ P_0 g_{n−1}(x) = E^x[g_{n−1}(X_0)] = g_{n−1}(x), since 0 ∈ T_n. Hence (g_n) is increasing.
Now define H(x) := lim_{n→∞} g_n(x). Since the limit of an increasing sequence of continuous functions is lower semi-continuous¹, it suffices to show that each g_n is a continuous function. When n = 1, g_1(x) = max_{t∈{0,1/2,1}} P_t g(x). Since X is Feller, P_t g is continuous for each t, and the maximum of finitely many continuous functions is continuous. Therefore g_1 is continuous. The rest of the proof follows by induction on n.
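For completeness, here is the short argument behind the footnoted fact that the increasing limit of continuous functions is lower semi-continuous (with H = lim_n g_n as above):

```latex
\liminf_{x \to x_0} H(x)
  \;\ge\; \liminf_{x \to x_0} g_n(x)
  \;=\; g_n(x_0) \qquad \text{for each fixed } n,
```

and taking the supremum over n on the right-hand side gives lim inf_{x→x_0} H(x) ≥ H(x_0).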
3. Let us first fix m and take t ∈ T_m. Then for n ≥ m,

H(x) ≥ g_n(x) ≥ P_t g_{n−1}(x) = E^x g_{n−1}(X_t).

Sending n ↑ ∞ and using the monotone convergence theorem, we obtain

H(x) ≥ E^x H(X_t) for t ∈ T_m.
Moreover, the previous inequality also holds for every t ∈ ∪_m T_m. Now for an arbitrary t, there exists a sequence (t_k) in ∪_m T_m with t_k ↓ t, so that X_{t_k} → X_t by the right-continuity of the paths. Then Fatou's lemma yields

H(x) ≥ lim inf_{k→∞} E^x H(X_{t_k}) ≥ E^x[lim inf_{k→∞} H(X_{t_k})] ≥ E^x H(X_t),

where the last inequality uses lim inf_{k→∞} H(X_{t_k}) ≥ H(X_t), since H is lower semi-continuous by Problem 2.
It then remains to show lim_{t↓0} P_t H(x) = H(x). For a ∈ R, let E_a = {y : H(y) > a}, which is an open set since H is lower semi-continuous. If a < H(x), then x ∈ E_a and

P_t H(x) = E^x H(X_t) ≥ a P^x(X_t ∈ E_a) → a as t ↓ 0,

by the right-continuity of the paths. Therefore lim inf_{t↓0} P_t H(x) ≥ a for any a < H(x). This implies that

lim inf_{t↓0} P_t H(x) ≥ H(x).
¹The function f is lower semi-continuous if lim inf_{x→x_0} f(x) ≥ f(x_0) for any x_0 ∈ R.
On the other hand, we have already shown H(x) ≥ E^x H(X_t), hence H(x) ≥ lim sup_{t↓0} P_t H(x). Therefore lim_{t↓0} P_t H(x) = H(x). Thus H is excessive.
We have shown that H is excessive and H dominates g by its construction. Therefore H
is an excessive majorant of g. Let us show H is the least one. Take an arbitrary excessive
function F such that F ≥ g. If F ≥ g_{n−1}, then F(x) ≥ P_t F(x) ≥ P_t g_{n−1}(x) for every t ∈ T_n, hence F(x) ≥ g_n(x). Since F ≥ g, an induction argument gives F(x) ≥ g_n(x) for all n, hence F(x) ≥ H(x). Therefore H is the least excessive majorant of g.
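The construction above has a transparent discrete-time analogue (a sketch under illustrative choices, not part of the solution): for a finite-state Markov chain with transition matrix P, iterating V ← max(g, PV) from V = g increases to the least superharmonic (excessive) majorant of g, mirroring how H = lim g_n is built from iterated maxima. The chain, reward, and tolerances below are arbitrary examples.

```python
import numpy as np

# Simple random walk on {0,...,5}, absorbed at the endpoints.
N = 6
P = np.zeros((N, N))
P[0, 0] = P[N - 1, N - 1] = 1.0
for i in range(1, N - 1):
    P[i, i - 1] = P[i, i + 1] = 0.5

g = np.array([0.0, 2.0, 1.0, 0.0, 3.0, 0.0])  # illustrative reward

# Iterate V <- max(g, PV): an increasing sequence converging to the
# least superharmonic majorant of g (discrete analogue of H = lim g_n).
V = g.copy()
for _ in range(10_000):
    V_prev, V = V, np.maximum(g, P @ V)
    if np.max(np.abs(V - V_prev)) < 1e-12:
        break

print(V)  # dominates g and satisfies V >= PV
```

On this example the iteration converges to V = (0, 2, 7/3, 8/3, 3, 0), which is also the least concave majorant of g on the grid, anticipating Problem 6.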
4. Let G be the least excessive majorant of g. We are going to show that G is the value of the optimal stopping problem, and that one optimal stopping time is the first time the process X hits the set {x : g(x) = G(x)}.
In order to prove this result, let us prepare the following lemma:
Lemma 1. (a) If f is excessive, T is a finite stopping time, and h(x) = E^x f(X_T), then h is excessive.
(b) If f is excessive and T is a finite stopping time, then f(x) ≥ E^x f(X_T).
(c) If f is excessive, then f(X_t) is a supermartingale.
Proof. (a) First suppose f = Ug for some nonnegative g. Then

(1) h(x) = E^x Ug(X_T) = E^x E^{X_T} ∫_0^∞ g(X_s) ds = E^x ∫_0^∞ g(X_{s+T}) ds = E^x ∫_T^∞ g(X_s) ds

by the strong Markov property and a change of variables. The same argument shows that

P_t h(x) = E^x h(X_t) = E^x E^{X_t} ∫_T^∞ g(X_s) ds = E^x ∫_{T+t}^∞ g(X_s) ds.

This is less than E^x ∫_T^∞ g(X_s) ds = h(x), and it increases up to h(x) as t ↓ 0.
Now let f be excessive but not necessarily of the form Ug. In the paragraph above, replace g by the g_n of Problem 1 to conclude

P_t h(x) = lim_{n→∞} E^x ∫_{T+t}^∞ g_n(X_s) ds ≤ lim_{n→∞} E^x ∫_T^∞ g_n(X_s) ds = h(x).

That P_t h increases up to h is proved similarly.
(b) When T is a deterministic time, this has been proved in Proposition 2.1.2 of the lecture notes. Let us also prove it here. As in the proof of (1), it suffices to consider the case f = Ug and then take limits. It follows from (1) that

E^x Ug(X_T) = E^x ∫_T^∞ g(X_s) ds ≤ E^x ∫_0^∞ g(X_s) ds = Ug(x).
(c) This can be proved using Proposition 2.1.3 of the lecture notes. We also give a direct argument here. By the Markov property, for s ≤ t,

E^x[f(X_t) | F_s] = E^{X_s} f(X_{t−s}) = P_{t−s} f(X_s) ≤ f(X_s). □
Let us now come back to our optimal stopping problem. The proof of Problem 4 is split into the following four steps.
Step 1. Let D = {x : g(x) < G(x)}, D_ε = {x : g(x) < G(x) − ε}, and τ_ε = inf{t : X_t ∉ D_ε}. Define

H_ε(x) = E^x[G(X_{τ_ε})].
It then follows from Lemma 1 (a) that H_ε is excessive.
In this step, we show
(2) g(x) ≤ H_ε(x) + ε, x ∈ D.

To prove this, we suppose not; that is, we let

b = sup_{x∈D} (g(x) − H_ε(x))

and suppose b > ε. Choose η < ε, and then choose x_0 such that

g(x_0) − H_ε(x_0) ≥ b − η.
Since H_ε + b is an excessive majorant of g by the definition of b, and G is the least excessive majorant, we have

G(x_0) ≤ H_ε(x_0) + b.
From the previous two inequalities, we conclude

(3) G(x_0) ≤ g(x_0) + η.
By the Blumenthal 0–1 law (HW2, Problem 9), P^{x_0}(τ_ε = 0) is either 0 or 1. In the first case, for any t > 0,

g(x_0) + η ≥ G(x_0) ≥ E^{x_0}[G(X_{t∧τ_ε})] ≥ E^{x_0}[g(X_{t∧τ_ε}) + ε; τ_ε > t],

where the first inequality is (3), the second is due to G being excessive (Lemma 1 (b)), and the third holds because G > g + ε on D_ε, i.e. until time τ_ε. Sending t ↓ 0 and using the continuity of g, we get g(x_0) + η ≥ g(x_0) + ε, which contradicts the choice of η < ε.
In the second case, where τ_ε = 0 with probability 1, we have

H_ε(x_0) = E^{x_0} G(X_{τ_ε}) = G(x_0) ≥ g(x_0) ≥ H_ε(x_0) + b − η,

a contradiction since η < ε < b. In either case, (2) must hold.
Step 2. A conclusion we reach from (2) is that H_ε + ε is an excessive majorant of g. Therefore,

(4) G(x) ≤ H_ε(x) + ε = E^x[G(X_{τ_ε})] + ε ≤ E^x[g(X_{τ_ε}) + ε] + ε ≤ g^*(x) + 2ε.

The first inequality holds since G is the least excessive majorant, the second follows from g(X_{τ_ε}) + ε ≥ G(X_{τ_ε}), which holds by the definition of τ_ε, and the third by the definition of g^*. Since ε is arbitrary, we see that G(x) ≤ g^*(x).
Step 3. For any finite stopping time T, because G is excessive and majorizes g, Lemma 1 (b) gives

G(x) ≥ E^x[G(X_T)] ≥ E^x g(X_T).
Taking the supremum over all stopping times T, we obtain G(x) ≥ g^*(x), and therefore G(x) = g^*(x).
Step 4. Note that τ_ε increases to τ_D as ε ↓ 0. Because τ_D is finite almost surely, the continuity of g tells us that E^x g(X_{τ_ε}) → E^x[g(X_{τ_D})] as ε ↓ 0. This is due to the fact (quasi-left-continuity) that a strong Markov process does not jump at stopping times which can be approximated from below by an increasing sequence of stopping times. By the definition of g^*, we know that E^x g(X_{τ_D}) ≤ g^*(x).
On the other hand, by the definitions of τ_ε and H_ε,

E^x g(X_{τ_ε}) ≥ E^x G(X_{τ_ε}) − ε = H_ε(x) − ε.

By the first inequality in (4), the right-hand side is greater than or equal to G(x) − 2ε = g^*(x) − 2ε. Sending ε ↓ 0, we obtain

E^x g(X_{τ_D}) ≥ g^*(x),

as desired: the hitting time τ_D of the set {x : g(x) = G(x)} is an optimal stopping time.
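As a discrete illustration of Step 4 (all modelling choices here are mine, not from the solution): for a simple random walk on {0, …, 5} absorbed at its endpoints, with reward g = (0, 2, 1, 0, 3, 0), the least superharmonic majorant works out to G = (0, 2, 7/3, 8/3, 3, 0), so the stopping set {x : g(x) = G(x)} is {0, 1, 4, 5}. Stopping on first entry to this set should attain the value; starting from x = 2 the value is 7/3.

```python
import random

random.seed(0)

g = [0.0, 2.0, 1.0, 0.0, 3.0, 0.0]
stop_set = {0, 1, 4, 5}  # {x : g(x) = G(x)} for this example

def run_once(x):
    # Follow the walk until it enters the stopping set, then collect g.
    while x not in stop_set:
        x += random.choice((-1, 1))
    return g[x]

n_paths = 200_000
estimate = sum(run_once(2) for _ in range(n_paths)) / n_paths
print(estimate)  # close to G(2) = 7/3
```

The Monte Carlo average matches G(2), consistent with the hitting time of {g = G} being optimal.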
5. Let G be the least excessive majorant of g. Then h(x) ≥ G(x). However,

h(x) = E^x g(X_{τ_A}) ≤ sup_T E^x[g(X_T)] = g^*(x) = G(x),

by the results in Problem 4. Therefore h = G.
6. Let F be the smallest concave function which dominates g. Since X is a Brownian motion, E^x[X_t] = x, and by Jensen's inequality

P_t F(x) = E^x F(X_t) ≤ F(E^x[X_t]) = F(x).

Because F is concave, it is continuous, and so P_t F(x) = E^x F(X_t) → F(x) as t ↓ 0.
Therefore F is excessive. If F̃ is another excessive function larger than g and a ≤ c < x < d ≤ h, we have F̃(x) ≥ E^x F̃(X_S) by Lemma 1, where S is the first time X leaves [c, d]. Since X is a Brownian motion,

F̃(x) ≥ E^x F̃(X_S) = ((d − x)/(d − c)) F̃(c) + ((x − c)/(d − c)) F̃(d).
Rearranging this inequality shows that F̃ is concave. Recall that the minimum of two concave functions is concave, so F ∧ F̃ is a concave function dominating g. But F is the smallest concave function dominating g, hence F = F ∧ F̃, that is, F ≤ F̃. Therefore F is the least excessive majorant of g. It then follows from the results in Problem 4 that F = g^*.
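Problem 6 reduces the stopping problem for Brownian motion to a least concave majorant. On a finite grid this majorant is the upper hull of the points (x_i, g(x_i)), which the sketch below computes by a slope-monotonicity scan; the sample points are arbitrary illustrations, not from the problem.

```python
import numpy as np

def least_concave_majorant(xs, ys):
    """Least concave function above the points (xs, ys), xs strictly increasing.

    Builds the upper hull: keep vertices whose chord slopes strictly decrease,
    dropping any point lying on or below the chord of its neighbours.
    """
    hull = []
    for p in zip(xs, ys):
        while len(hull) >= 2:
            (x0, y0), (x1, y1) = hull[-2], hull[-1]
            # Pop the middle point if slopes fail to decrease through it.
            if (y1 - y0) * (p[0] - x1) <= (p[1] - y1) * (x1 - x0):
                hull.pop()
            else:
                break
        hull.append(p)
    hx, hy = zip(*hull)
    return np.interp(xs, hx, hy)  # piecewise-linear hull sampled on the grid

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
g = np.array([0.0, 3.0, 1.0, 2.0, 0.0])
F = least_concave_majorant(xs, g)
print(F)  # the majorant coincides with g at the hull vertices
```

By the argument above, on such a grid this F plays the role of g^*: it dominates g and is midpoint-concave.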