
AN ESTIMATE CONCERNING THE KOLMOGOROFF LIMIT DISTRIBUTION(1)

BY

KAI-LAI CHUNG

1. We consider a sequence of independent random variables having the common distribution function F(x), which is assumed to be continuous. Let $nF_n(x)$ denote the number of random variables among the first n of the sequence whose values do not exceed x. Write

(1.1) $d_n = \sup_{-\infty < x < \infty} |n(F_n(x) - F(x))|$.

Kolmogoroff [1](2) proved that the probability

(1.2) $P(d_n \le \lambda n^{1/2})$,

where $\lambda$ is a positive constant, tends as $n \to \infty$ uniformly in $\lambda$ to the limiting distribution

(1.3) $\Phi(\lambda) = \sum_{j=-\infty}^{\infty} (-1)^j e^{-2j^2\lambda^2}$.
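The series (1.3) converges extremely fast, so it is easy to evaluate numerically. The following present-day sketch (not part of the paper; the function names are ours) truncates the series and checks it against a small Monte Carlo estimate of the probability (1.2), using the uniform distribution F(x) = x on [0, 1], which is permissible because $d_n$ is distribution-free for continuous F:

```python
import math
import random

def kolmogorov_phi(lam, terms=100):
    """Truncation of Phi(lambda) = sum_{j=-inf}^{inf} (-1)^j exp(-2 j^2 lambda^2)."""
    if lam <= 0:
        return 0.0
    total = 1.0  # the j = 0 term
    for j in range(1, terms + 1):
        total += 2.0 * ((-1) ** j) * math.exp(-2.0 * j * j * lam * lam)
    return total

def d_n(sample):
    """d_n = sup_x |n(F_n(x) - x)| for a sample from the uniform F(x) = x."""
    n = len(sample)
    xs = sorted(sample)
    sup = 0.0
    for i, x in enumerate(xs):
        # F_n jumps from i/n to (i+1)/n at x; check both sides of the jump.
        sup = max(sup, abs((i + 1) - n * x), abs(i - n * x))
    return sup

if __name__ == "__main__":
    random.seed(1)
    n, reps, lam = 500, 500, 1.0
    hits = sum(d_n([random.random() for _ in range(n)]) <= lam * math.sqrt(n)
               for _ in range(reps))
    print(kolmogorov_phi(lam))   # about 0.7300
    print(hits / reps)           # Monte Carlo estimate, near Phi(1)
```

The Monte Carlo estimate agrees with $\Phi(1) \approx 0.73$ only up to a remainder of the kind Theorem 1 below quantifies.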

Smirnoff [2] extended this result and recently Feller [3] has given new

proofs of these theorems(3).

In this paper we shall obtain an estimate of the difference between (1.2) and (1.3) as a function of n, valid not only for $\lambda$ equal to a constant but also for $\lambda$ equal to a function $\lambda(n)$ of n which does not grow too fast. Since this estimate of the "remainder" will be of the order of magnitude of a negative power of n, it is futile to consider $\lambda(n)$ which is beyond the order of magnitude of $\lg n$. In fact, a glance at (1.3) will show that already for $\lambda(n) = \lg n$ the function $\Phi(\lambda)$ differs from 1 by a term of an order of magnitude smaller than that of any negative power of n, thus smaller than our estimate of the remainder. For a similar reason it is also futile to consider $\lambda(n)$ whose order is less than $(\lg n)^{-1}$. Keeping these facts in mind we state our result as follows:

Theorem 1. If $A_0 > 0$ is an arbitrary constant and

(1.4) $(A_0 \lg n)^{-1} \le \lambda(n) \le A_0 \lg n$,

Presented to the Society, September 10, 1948; received by the editors June 14, 1948.

(1) Research in connection with an ONR project.

(2) Numbers in brackets refer to the references cited at the end of the paper.

(3) More recently, Doob and Kac, independently, have simplified and amplified the matter (oral communication). However, none of these authors considered the error term, or in other words, a fixed large number n without the passage to infinity.


License or copyright restrictions may apply to redistribution; see http://www.ams.org/journal-terms-of-use


we have

(1.5) $\left| P(d_n \le \lambda(n) n^{1/2}) - \Phi(\lambda(n)) \right| \le A n^{-1/10} \{ 1 + (\lg n)^{1/2} (\lambda^{-1}(n) + \lambda^{-2}(n)) \}$

where A is a constant depending only on $A_0$.

Henceforth we shall think of $A_0$ as fixed, for example, 100; then A will simply be a "universal" constant. The form of the estimate can be varied to a certain extent, but it is believed that the present form is about the best obtainable without essential improvement of the method. The rather clumsy situation of having several terms in an estimate is unavoidable if we want to include both ends of the range of $\lambda(n)$. Clearly at a small sacrifice we may replace the right side of (1.5) by $A n^{-1/10} (\lg n)^{5/2}$.

We wish to point out that we shall really prove a more general theorem

about the so-called "lattice distributions" which is embodied in formula (5.9)

and the remark following it.

Our estimate can undoubtedly be improved upon but the limitations of

our method are such that it is improbable that we can obtain the best

possible result. However, Theorem 1 is amply sufficient for proving the

following "strong" theorem.

Theorem 2(4). Let $\lambda(n) \uparrow \infty$. Then ("i.o." standing for "infinitely often")

$P(d_n > \lambda(n) n^{1/2} \text{ i.o.}) = 0$ or $1$

according as the series

$\sum_n \dfrac{\lambda^2(n)}{n}\, e^{-2\lambda^2(n)}$

converges or diverges. In particular, for any integer $p \ge 3$,

$P\bigl(d_n > (2^{-1} n)^{1/2} (\lg_2 n + 2\lg_3 n + \lg_4 n + \cdots + \lg_p n + (1+\delta)\lg_{p+1} n)^{1/2} \text{ i.o.}\bigr) = 0$ or $1$

according as $\delta > 0$ or $\delta \le 0$.

The second part of Theorem 2 follows of course from the first part by

taking the appropriate sequence and using the Abel-Dini theorem.
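The Abel-Dini bookkeeping can be made concrete: for the $\lambda(n)$ of the second part, $e^{-2\lambda^2(n)}$ collapses into the reciprocal of a product of iterated logarithms, which is exactly the borderline of the Abel-Dini scale. The following small numerical check of the $p = 3$ case is ours, not the paper's; we use natural logarithms throughout (the base only shifts constants), and the helper names are our own:

```python
import math

def lam_sq(n, delta):
    """lambda^2(n) = (1/2)(lg2 n + 2 lg3 n + (1+delta) lg4 n): the p = 3 case
    of Theorem 2, with lg_k the k-times iterated (natural) logarithm."""
    lg2 = math.log(math.log(n))
    lg3 = math.log(lg2)
    lg4 = math.log(lg3)
    return 0.5 * (lg2 + 2.0 * lg3 + (1.0 + delta) * lg4)

def term(n, delta):
    """General term lambda^2(n) e^{-2 lambda^2(n)} / n of the series in Theorem 2."""
    ls = lam_sq(n, delta)
    return ls * math.exp(-2.0 * ls) / n

if __name__ == "__main__":
    n = 10 ** 8
    # At delta = 0: e^{-2 lambda^2} = 1 / (lg n (lg2 n)^2 lg3 n), an exact identity.
    lhs = math.exp(-2.0 * lam_sq(n, 0.0))
    rhs = 1.0 / (math.log(n) * math.log(math.log(n)) ** 2
                 * math.log(math.log(math.log(n))))
    print(lhs, rhs)
```

Multiplying by $\lambda^2(n) \sim 2^{-1}\lg_2 n$ then yields a term comparable to $1/(2 n \lg n \lg_2 n (\lg_3 n)^{1+\delta})$, which converges or diverges according as $\delta > 0$ or $\delta \le 0$.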

The method of proving Theorem 2 by means of Theorem 1 follows the

(4) The idea of considering $\lambda(n) n^{1/2}$ as belonging to the upper or lower class is due to P. Lévy [11].



same pattern as developed by Feller in [4]. To avoid too much repetition even of an excellent thing we refrain from entering into a complete proof of Theorem 2. We shall, however, give a short proof for the particular result corresponding to the original form of the law of the iterated logarithm, due to Khintchine and Kolmogoroff (see [5]). Our reason for doing this is mainly didactic, in order to show how easily a result on the asymptotic distribution with a suitable remainder can be used to derive the corresponding strong result.

Theorem 2*. We have

$P\left( \limsup_{n\to\infty} \dfrac{d_n}{(2^{-1} n \lg_2 n)^{1/2}} = 1 \right) = 1.$
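A simulation cannot exhibit the limit in Theorem 2* (the $\lg_2 n$ normalization converges far too slowly), but it can at least show the normalized statistic taking moderate values of the right size. The sketch below is ours, not the paper's; we again use the uniform distribution and natural logarithms, and draw only a handful of samples:

```python
import math
import random

def dn_ratio(n, rng):
    """d_n / (n/2 * lg2 n)^{1/2} for one sample of n uniforms (F(x) = x)."""
    xs = sorted(rng.random() for _ in range(n))
    sup = max(max(abs((i + 1) - n * x), abs(i - n * x))
              for i, x in enumerate(xs))
    return sup / math.sqrt(0.5 * n * math.log(math.log(n)))

if __name__ == "__main__":
    rng = random.Random(7)
    for n in (1000, 10000):
        # Typical values sit below 1; the lim sup over n approaches 1 only very slowly.
        print(n, [round(dn_ratio(n, rng), 3) for _ in range(5)])
```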

2. To prove Theorem 1 we shall follow Kolmogoroff's steps up to the last stage, where he reduced the problem to one of addition of independent random variables of a special kind. More precisely he proved the following statement (with a different normalization).

Let $\{Y_j\}$, $j = 1, \dots, n$, be independent random variables having the common distribution given below:

(2.1) $P(Y_j = i - 1) = \dfrac{e^{-1}}{i!}$, $\quad i = 0, 1, 2, \dots$,

and let

$T_n = \sum_{j=1}^{n} Y_j$, $\qquad T_n^* = \max_{1 \le j \le n} |T_j|$.

Putting

(2.2) $P_n = P(T_{n-1}^* \le \lambda n^{1/2},\ T_n = 0)$

we have the equality

(2.3) $P(d_n \le \lambda n^{1/2}) = \dfrac{n!\, e^n}{n^n}\, P_n$.

Thus the problem of finding $P(d_n \le \lambda n^{1/2})$ is reduced to that of finding $P_n$. The asymptotic value of $P_n$ as $n \to \infty$ was obtained by Kolmogoroff by means of a general limit theorem of his employing partial differential equations. We shall use instead a method based on the combinatorial ideas of Erdös-Kac [6] on the one hand and the analytical tools of Esseen [7] on the other hand. It is similar to the one developed in my paper [8] but is adapted for discrete probabilities. Roughly speaking, we shall first show that the asymptotic value of (2.2) is independent of the nature of the distribution of the Y's so long as it satisfies certain general conditions. To be precise, let $\{X_j\}$,


$j = 1, \dots, n$, be independent random variables having the common distribution function G(x) which has the following properties.

(i) G(x) is a step function having all its (positive) jumps at integer points including 0, and such that the minimum distance between the abscissas of two jumps is 1.

(ii) The first moment of G(x) is 0, the second is 1, and the fourth absolute moment is finite.

The following result is due to Esseen (p. 63 in [7]).

Lemma 1. Let $\{X_j\}$, $j = 1, \dots, n$, be independent random variables having the common distribution G(x) which satisfies the conditions (i) and (ii). Let

$S_n = \sum_{j=1}^{n} X_j$

and let $\xi$ be such that $\xi n^{1/2}$ is a possible value of $S_n$ (that is, the abscissa of a jump of the distribution function of $S_n$). Then

(2.4) $P(S_n = \xi n^{1/2}) = \dfrac{1}{n^{1/2}} \left\{ \phi(\xi) - \dfrac{\mu_3}{6 n^{1/2}}\, \phi^{(3)}(\xi) + \dfrac{1}{n} \left( \dfrac{\mu_4 - 3}{24}\, \phi^{(4)}(\xi) + \dfrac{\mu_3^2}{72}\, \phi^{(6)}(\xi) \right) \right\} + o\!\left(\dfrac{1}{n^{3/2}}\right)$

where $\mu_3$ and $\mu_4$ are the third and fourth moments of G(x),

$\phi(\xi) = (2\pi)^{-1/2} e^{-\xi^2/2}$, $\qquad \phi^{(\nu)}(\xi) = \dfrac{d^\nu}{d\xi^\nu}\, \phi(\xi)$,

and the remainder term $o(1/n^{3/2})$ is uniform with respect to $\xi$.

In particular

$P(S_n = \xi n^{1/2}) = \dfrac{1}{n^{1/2}}\, \phi(\xi) - \dfrac{\mu_3}{6n}\, \phi^{(3)}(\xi) + O\!\left(\dfrac{1}{n^{3/2}}\right) = O\!\left(\dfrac{1}{n^{1/2}}\right)$

where the O-terms are uniform with respect to $\xi$.
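The local expansion of Lemma 1 can be checked numerically for a particular distribution satisfying (i) and (ii). The sketch below is ours: the distribution with jumps at $-1, 0, 1, 2$ and probabilities $3/8, 3/8, 1/8, 1/8$ has mean 0, variance 1, and minimum jump distance 1. We convolve it exactly and compare $P(S_n = 0)$ with the leading term $n^{-1/2}\phi(0) = (2\pi n)^{-1/2}$ of (2.4); since $\phi^{(3)}(0) = 0$, the relative error should be of order $1/n$:

```python
import math

def exact_pmf(step_pmf, n):
    """Distribution of S_n = X_1 + ... + X_n by exact repeated convolution."""
    dist = {0: 1.0}
    for _ in range(n):
        new = {}
        for s, p in dist.items():
            for x, q in step_pmf.items():
                new[s + x] = new.get(s + x, 0.0) + p * q
        dist = new
    return dist

# A lattice distribution satisfying (i) and (ii): jumps at -1, 0, 1, 2,
# mean 0 and variance 1.
G = {-1: 0.375, 0: 0.375, 1: 0.125, 2: 0.125}

if __name__ == "__main__":
    n = 200
    dist = exact_pmf(G, n)
    # Leading term of (2.4) at xi = 0: P(S_n = 0) ~ phi(0)/sqrt(n) = (2 pi n)^{-1/2}.
    print(dist[0] * math.sqrt(2 * math.pi * n))  # close to 1
```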

3. We think of n as a very large number and define

$k = k(n) = [n^{1/5}]$.

Conformably with the statement in Theorem 1, we take $\lambda = \lambda(n)$ to be a function of n not exceeding $\lg n$. We put also

(3.1) $\epsilon = \epsilon(n) = \dfrac{2(\lg n)^{1/2}}{\lambda(n)\, n^{1/10}}$.

Further we put

$n_i = \left[\dfrac{in}{k}\right]$, $\qquad i = 1, \dots, k$.


With the notation

$S_n^* = \max_{1 \le j \le n} |S_j|$

we write

$P(S_{n-1}^* > \lambda n^{1/2},\ S_n = 0)$
$\quad = \sum_{r=1}^{n_{k-1}-1} P(S_{r-1}^* \le \lambda n^{1/2},\ |S_r| > \lambda n^{1/2},\ S_n = 0,\ |S_{n_{i+1}} - S_r| < \epsilon\lambda n^{1/2})$
$\quad + \sum_{r=1}^{n_{k-1}-1} P(S_{r-1}^* \le \lambda n^{1/2},\ |S_r| > \lambda n^{1/2},\ S_n = 0,\ |S_{n_{i+1}} - S_r| \ge \epsilon\lambda n^{1/2})$
$\quad + \sum_{r=n_{k-1}}^{n} P(S_{r-1}^* \le \lambda n^{1/2},\ |S_r| > \lambda n^{1/2},\ S_n = 0)$
$\quad = \Sigma_1 + \Sigma_2 + \Sigma_3$, say,

where the $n_{i+1}$ corresponding to each r is defined by $n_i \le r < n_{i+1}$.

A moment's reflection shows that

(3.2) $\Sigma_1 \le P\left( \max_{1 \le i \le k-1} |S_{n_i}| > (1-\epsilon)\lambda n^{1/2},\ S_n = 0 \right)$.

Next we may write

(3.3) $\Sigma_2 = \sum_{r=1}^{n_{k-1}-1} P(S_{r-1}^* \le \lambda n^{1/2},\ |S_r| > \lambda n^{1/2}) \sum_y P(|S_{n_{i+1}} - S_r| \ge \epsilon\lambda n^{1/2},\ S_{n_{i+1}} = y)\, P(S_n - S_{n_{i+1}} = -y)$

where y runs through certain (integral) values depending on the value of $S_r$.

We shall use Q to denote a changeable positive constant depending only on G(x), and A a changeable positive universal constant. From (2.4) we infer that

$P(S_n - S_{n_{i+1}} = -y) \le Q(n - n_{i+1})^{-1/2} \le Q k^{1/2} n^{-1/2}$.

It follows from (3.3) that

(3.4) $\Sigma_2 \le Q k^{1/2} n^{-1/2} \max_r P(|S_{n_{i+1}} - S_r| \ge \epsilon\lambda n^{1/2})$.

To estimate the last-written probability we use Tchebycheff's inequality for small values of $n_{i+1} - r$ and the central limit theorem with a remainder (due


to A. C. Berry [9] and Esseen [7]) for large values of $n_{i+1} - r$. More precisely, let

$g = (n\epsilon^2\lambda^2)^{2/3}$.

Then if $n_{i+1} - r \le g$, we have

(3.5) $P(|S_{n_{i+1}} - S_r| \ge \epsilon\lambda n^{1/2}) \le \dfrac{n_{i+1} - r}{\epsilon^2\lambda^2 n} \le \left(\dfrac{1}{n\epsilon^2\lambda^2}\right)^{1/3}$.

If $n_{i+1} - r = h > g$, we have

$P(|S_{n_{i+1}} - S_r| \ge \epsilon\lambda n^{1/2}) \le \left(\dfrac{2}{\pi}\right)^{1/2} \int_v^{\infty} e^{-u^2/2}\, du + Q h^{-1/2}$

where

$v = \dfrac{\epsilon\lambda n^{1/2}}{(n_{i+1} - r)^{1/2}} \ge \epsilon\lambda n^{1/2} \left(\dfrac{n}{k} + 1\right)^{-1/2}$.

By the choice of $\epsilon$ and k, we have $v \to \infty$ as $n \to \infty$. It follows from a well known inequality for the normal tail, together with $h^{-1/2} \le g^{-1/2} = (n\epsilon^2\lambda^2)^{-1/3}$, that

(3.6) $P(|S_{n_{i+1}} - S_r| \ge \epsilon\lambda n^{1/2}) \le \dfrac{A}{v} \exp\left(-\dfrac{k\epsilon^2\lambda^2}{4}\right) + Q\left(\dfrac{1}{n\epsilon^2\lambda^2}\right)^{1/3}$.

In view of (3.5) we conclude that (3.6) holds in general. Hence it follows from (3.4) that

(3.7) $\Sigma_2 \le Q k^{1/2} n^{-1/2} \left\{ \exp\left(-\dfrac{k\epsilon^2\lambda^2}{4}\right) + \left(\dfrac{1}{n\epsilon^2\lambda^2}\right)^{1/3} \right\} \le Q n^{-2/3}$

if we substitute the values of k, $\epsilon$, and $\lambda$.

Finally, we have

(3.8) $\Sigma_3 = \sum_{r=n_{k-1}}^{n} \sum_{|y| > \lambda n^{1/2}} P(S_{r-1}^* \le \lambda n^{1/2},\ S_r = y)\, P(S_n - S_r = -y) \le \max_{r,\, y} P(S_n - S_r = -y)$,

since the events appearing in the first factors refer to the first crossing of the level $\lambda n^{1/2}$ and are therefore disjoint, so that their probabilities add up to at most 1.

Similarly to the above, let $h = (n\lambda^2)^{2/5}$. If $n - r \le h$, then

$P(S_n - S_r = -y) \le \dfrac{n-r}{y^2} \le \dfrac{h}{n\lambda^2} = (n\lambda^2)^{-3/5}$.

If $n - r > h$ we apply Lemma 1 and obtain

$P(S_n - S_r = -y) = Q(\xi) + O(h^{-3/2})$

where $Q(\xi)$ stands for the expression involving $\phi(\xi)$ and its derivatives on the


right-hand side of (2.4), with $\xi = y(n-r)^{-1/2}$. Since $|y| > \lambda n^{1/2}$,

$\dfrac{|y|}{(n-r)^{1/2}} \ge \dfrac{\lambda n^{1/2}}{(n/k)^{1/2}} = k^{1/2}\lambda \ge \dfrac{n^{1/10}}{A_0 \lg n}$,

we see that $Q(\xi)$ is negligible in order of magnitude and thus we may write

$P(S_n - S_r = -y) \le Q h^{-3/2} \le Q(n\lambda^2)^{-3/5}$.

(Notice that here Q actually depends on the constant $A_0$ in (1.4), as $Q(\xi)$ does, but we have agreed to fix $A_0$.) Altogether we conclude from (3.8) that

(3.9) $\Sigma_3 \le Q(n\lambda^2)^{-3/5}$.

Combining (3.2), (3.7), and (3.9) we obtain, recalling the order of magnitude of $\lambda$ (by (1.4), $n^{-2/3} \le Q(n\lambda^2)^{-3/5}$),

$P(S_{n-1}^* > \lambda n^{1/2},\ S_n = 0) \le P\left( \max_{1 \le i \le k-1} |S_{n_i}| > (1-\epsilon)\lambda n^{1/2},\ S_n = 0 \right) + Q(n\lambda^2)^{-3/5}$.

Equivalently we may write

(3.10) $P(S_{n-1}^* \le \lambda n^{1/2},\ S_n = 0) \ge P\left( \max_{1 \le i \le k-1} |S_{n_i}| \le (1-\epsilon)\lambda n^{1/2},\ S_n = 0 \right) - Q(n\lambda^2)^{-3/5}$.

4. Our next step is to approximate

$P(|S_{n_1}| \le x_1, \dots, |S_{n_{k-1}}| \le x_{k-1},\ S_n = z)$

by the corresponding probability associated with certain "discrete normal distributions," the meaning of which will be clear in a moment. We state the following lemma.

Lemma 2. Let $S_{n_i}$, $i = 1, \dots, k$, be as in the preceding paragraphs and let $x_1, \dots, x_{k-1}$ be arbitrary positive numbers, z a possible value of $S_{n_k}$. We write

$p_t = \left(\dfrac{k}{n}\right)^{1/2} \phi\!\left( t \left(\dfrac{k}{n}\right)^{1/2} \right)$

where $\phi$ is defined in Lemma 1. Then we have

(4.1) $P(|S_{n_1}| \le x_1, \dots, |S_{n_{k-1}}| \le x_{k-1},\ S_{n_k} = z) = \sum_{|t_1| \le x_1} \cdots \sum_{|t_{k-1}| \le x_{k-1}} p_{t_1} p_{t_2 - t_1} \cdots p_{z - t_{k-1}} + \theta Q\, \dfrac{k^2}{n}$

where $|\theta| \le 1$.


Proof. We shall use $\theta$ as a changeable constant such that $|\theta| \le 1$. From Lemma 1 we have

$P(S_{n_1} = t) = p_t + \theta Q\, \dfrac{k}{n}$

since $n_1 = n/k + \theta$. Assuming that

$P(|S_{n_1}| \le x_1, \dots, |S_{n_{i-1}}| \le x_{i-1},\ S_{n_i} = t) = \sum_{|t_1| \le x_1} \cdots \sum_{|t_{i-1}| \le x_{i-1}} p_{t_1} p_{t_2 - t_1} \cdots p_{t - t_{i-1}} + \theta Q\, \dfrac{ik}{n}$,

we shall show that the same is true when we replace i by i+1. In fact

$P(|S_{n_1}| \le x_1, \dots, |S_{n_i}| \le x_i,\ S_{n_{i+1}} = z) = \sum_{|t_i| \le x_i} P(|S_{n_1}| \le x_1, \dots, |S_{n_{i-1}}| \le x_{i-1},\ S_{n_i} = t_i)\, p_{z - t_i} + \theta Q\, \dfrac{k}{n} = \sum_{|t_i| \le x_i} \cdots \sum_{|t_1| \le x_1} p_{t_1} \cdots p_{z - t_i} + \theta Q\, \dfrac{(i+1)k}{n}$.

Thus Lemma 2 is proved by induction on i.

Taking $x_1 = \cdots = x_{k-1} = (1-\epsilon)\lambda n^{1/2}$, $z = 0$ in (4.1) we have

$P\left( \max_{1 \le i \le k-1} |S_{n_i}| \le (1-\epsilon)\lambda n^{1/2},\ S_n = 0 \right) = \sum_{|t_i| \le (1-\epsilon)\lambda n^{1/2},\ 1 \le i \le k-1} p_{t_1} p_{t_2 - t_1} \cdots p_{-t_{k-1}} + \theta Q\, \dfrac{k^2}{n} = \Psi((1-\epsilon)\lambda) + \theta Q n^{-3/5}$

where $\Psi$ is defined by the multiple sum.

Combining this with (3.10) we obtain

(4.2) $P(S_{n-1}^* \le \lambda n^{1/2},\ S_n = 0) \ge \Psi((1-\epsilon)\lambda) - Q n^{-3/5}(1 + \lambda^{-6/5})$.
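The multiple sum defining $\Psi$ can be evaluated by exactly the convolution scheme used in the proof of Lemma 2: a dynamic program over the "discrete normal" kernel $p_t$ with an absorbing barrier at $\pm\lambda n^{1/2}$. The sketch below is ours, with parameters n and k chosen for illustration only (far smaller than the proof requires); probabilistically the sum is a discretely monitored analogue of the probability that a Brownian bridge stays within $\pm\lambda$, so $(2\pi n)^{1/2}$ times it should land near the limit $\Phi(\lambda)$ of (1.3), up to a discretization bias:

```python
import math

def psi_times_sqrt(n, k, lam):
    """Evaluate the multiple sum of Lemma 2 with x_1 = ... = x_{k-1} = lam*sqrt(n)
    and z = 0 by dynamic programming; return (2 pi n)^{1/2} times its value."""
    sd = math.sqrt(n / k)              # each block S_{n_{i+1}} - S_{n_i} has variance ~ n/k
    bound = int(lam * math.sqrt(n))
    span = int(6 * sd)
    # kernel p_t = (k/n)^{1/2} phi(t (k/n)^{1/2}), truncated at 6 standard deviations
    kernel = {t: math.exp(-0.5 * (t / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
              for t in range(-span, span + 1)}
    state = {0: 1.0}                   # the walk starts at 0
    for _ in range(k - 1):
        new = {}
        for s, p in state.items():
            for t, q in kernel.items():
                u = s + t
                if abs(u) <= bound:    # barrier |S_{n_i}| <= lam sqrt(n)
                    new[u] = new.get(u, 0.0) + p * q
        state = new
    # the last block must bring the walk exactly back to z = 0
    total = sum(p * kernel[-s] for s, p in state.items() if -s in kernel)
    return total * math.sqrt(2 * math.pi * n)

if __name__ == "__main__":
    # near Phi(1) ~ 0.73, biased upward because the maximum is only checked at k points
    print(psi_times_sqrt(2500, 25, 1.0))
```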

On the other hand it is obvious that

$P(S_{n-1}^* \le \lambda n^{1/2},\ S_n = 0) \le P\left( \max_{1 \le i \le k-1} |S_{n_i}| \le \lambda n^{1/2},\ S_n = 0 \right) \le \Psi(\lambda) + Q n^{-3/5}(1 + \lambda^{-6/5})$

by Lemma 2. Together we have

(4.3) $\Psi((1-\epsilon)\lambda) - Q n^{-3/5}(1 + \lambda^{-6/5}) \le P(S_{n-1}^* \le \lambda n^{1/2},\ S_n = 0) \le \Psi(\lambda) + Q n^{-3/5}(1 + \lambda^{-6/5})$.


This being true no matter what the $X_j$'s are, provided that their distribution G(x) satisfies (i) and (ii) of §2, we can estimate the $\Psi$'s in (4.3) by evaluating approximately the probability in the middle of (4.3) for a special case. We shall make use of the classical Bernoullian distribution and a combinatorial formula given by Bachelier.

5. Lemma 3. Let $\{X_j\}$, $j = 1, \dots, n$, be independent random variables having the following distribution:

(5.1) $X_j = +1$ with probability $1/2$, $\quad X_j = -1$ with probability $1/2$.

Suppose that n is even, and

(5.2) $(A_0 \lg n)^{-1} \le \lambda(n) \le A_0 \lg n$.

Then as $n \to \infty$

(5.3) $P(S_{n-1}^* \le \lambda(n) n^{1/2},\ S_n = 0) = \left(\dfrac{2}{\pi n}\right)^{1/2} \Phi(\lambda_n) + O\!\left(\dfrac{(\lg n)^5}{n}\right)$

where the O-term does not depend on $\lambda(n)$ if the constant $A_0$ in (5.2) is fixed.

Proof. If n is even and b is an integer, the formula of Bachelier [10, pp. 252-253] may be written as follows:

$P(S_n^* < b,\ S_n = 0) = \dfrac{1}{2^n} \left\{ \binom{n}{n/2} + 2 \sum_{1 \le j \le n/2b} (-1)^j \binom{n}{n/2 - jb} \right\}$.

Applying Stirling's formula with a remainder term we find after some routine calculations that if $0 \le j \le (\lg n)^2$, $b = \lambda_n n^{1/2}$,

$\dfrac{1}{2^n} \binom{n}{n/2 - jb} = \left(\dfrac{2}{\pi n}\right)^{1/2} e^{-2j^2\lambda_n^2} \left\{ 1 + O\!\left(\dfrac{(\lg n)^4}{n^{1/2}}\right) \right\}$.

On the other hand, by a well known estimate concerning the binomial distribution, we have

$\sum_{(\lg n)^2 < j \le n^{1/2}/2\lambda_n} \dfrac{1}{2^n} \binom{n}{n/2 - jb} \le e^{-A\lambda^2 (\lg n)^4} \le e^{-A(\lg n)^2}$

which is of a smaller order of magnitude than any negative power of n. Thus

(5.4) $P(S_n^* < \lambda_n n^{1/2},\ S_n = 0) = \left(\dfrac{2}{\pi n}\right)^{1/2} \left\{ 1 + 2 \sum_{j=1}^{\infty} (-1)^j e^{-2j^2\lambda_n^2} \right\} + O\!\left(\dfrac{(\lg n)^5}{n}\right)$.
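Bachelier's reflection formula is exact for every even n and integer b, so it can be verified directly against an exhaustive enumeration of paths for small n. The following sketch (ours) implements both sides:

```python
from math import comb
from itertools import product

def bachelier(n, b):
    """P(S_n^* < b, S_n = 0) for the simple +-1 walk by the reflection formula
    (n even, b a positive integer)."""
    total = comb(n, n // 2)
    j = 1
    while j * b <= n // 2:
        total += 2 * (-1) ** j * comb(n, n // 2 - j * b)
        j += 1
    return total / 2 ** n

def brute(n, b):
    """The same probability by enumerating all 2^n paths (small n only)."""
    count = 0
    for steps in product((1, -1), repeat=n):
        s, ok = 0, True
        for x in steps:
            s += x
            if abs(s) >= b:
                ok = False
                break
        if ok and s == 0:
            count += 1
    return count / 2 ** n

if __name__ == "__main__":
    for n, b in ((4, 2), (10, 3), (12, 4), (14, 2)):
        print(n, b, bachelier(n, b), brute(n, b))  # the two columns agree exactly
```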


Notice that here the O-term depends on the $A_0$ in (5.2).

This is the desired result (5.3) except that we have to replace the strict "<" in the probability on the left by "$\le$" and also remove the restriction that $\lambda_n n^{1/2}$ is an integer. Now if we replace $\lambda_n n^{1/2}$ by $\lambda_n n^{1/2} + \theta = \mu_n n^{1/2}$ we have

$\mu_n - \lambda_n = \theta n^{-1/2}$.

Differentiating $\Phi(\lambda)$ we have

$\Phi'(\lambda) = 8\lambda \sum_{j=1}^{\infty} (-1)^{j+1} j^2 e^{-2j^2\lambda^2} = 8\lambda \left( \sum_{2j^2\lambda^2 \le 1} + \sum_{2j^2\lambda^2 > 1} \right) (-1)^{j+1} j^2 e^{-2j^2\lambda^2}$.

The maximum of $x^2 e^{-2\lambda^2 x^2}$ being $2^{-1} e^{-1} \lambda^{-2}$, the first sum is $\le A\lambda^{-3}$. The second sum is dominated by its first term, hence is $\le A\lambda^{-2}$. Thus

(5.5) $|\Phi'(\lambda)| \le A\left( \dfrac{1}{\lambda} + \dfrac{1}{\lambda^2} \right)$.

It follows that

$|\Phi(\mu_n) - \Phi(\lambda_n)| \le A\left( \dfrac{1}{\lambda_n} + \dfrac{1}{\lambda_n^2} \right) \dfrac{1}{n^{1/2}} \le A\, \dfrac{(\lg n)^2}{n^{1/2}}$

by (5.2). Comparing (5.4) with the same formula with $\mu_n$ replacing $\lambda_n$ we see that

$P(S_{n-1}^* \le \lambda_n n^{1/2},\ S_n = 0) - P(S_{n-1}^* < [\lambda_n n^{1/2}],\ S_n = 0) \le A\, \dfrac{(\lg n)^2}{n}$

where $\lambda_n n^{1/2}$ need not be an integer now. This and (5.4) establish (5.3).

The distribution (5.1) does not satisfy (i) of §2. To remedy this we put

(5.6) $Z = \dfrac{X_1 + X_2 + X_3 + X_4}{2}$

where the four X's are independent random variables having the distribution (5.1). Then Z has a distribution satisfying (i) and (ii) of §2. Now let $Z_j$, $j = 1, \dots, n$, be n independent random variables each distributed as Z in (5.6). Form

$W_n = \sum_{j=1}^{n} Z_j$, $\qquad W_n^* = \max_{1 \le k \le n} |W_k|$.

It is easy to see that we have, the S's referring to partial sums of the X's in (5.1),

$P(S_{4n-1}^* \le 2x,\ S_{4n} = 0) \le P(W_{n-1}^* \le x,\ W_n = 0) \le P(S_{4n-1}^* \le 2x + 3,\ S_{4n} = 0)$.


By the same reasoning as in the last paragraph we see that if $x = \lambda_n n^{1/2}$ the difference between the extreme terms in the last inequality is

$\le \dfrac{A}{n}\left( \dfrac{1}{\lambda_n} + \dfrac{1}{\lambda_n^2} \right)$.

Therefore we obtain

(5.7) $P(W_{n-1}^* \le \lambda_n n^{1/2},\ W_n = 0) = P(S_{4n-1}^* \le 2\lambda_n n^{1/2},\ S_{4n} = 0) + O\!\left(\dfrac{(\lg n)^2}{n}\right) = \left(\dfrac{1}{2\pi n}\right)^{1/2} \Phi(\lambda_n) + O\!\left(\dfrac{(\lg n)^5}{n}\right)$

by (5.3). Since the distribution of the Z's satisfies (i) and (ii) of §2, we can substitute (5.7) for the middle term in (4.3). Thus we obtain

$\Psi((1-\epsilon_n)\lambda_n) - Q n^{-3/5}(1 + \lambda_n^{-6/5}) \le \dfrac{1}{(2\pi n)^{1/2}}\, \Phi(\lambda_n) \le \Psi(\lambda_n) + Q n^{-3/5}(1 + \lambda_n^{-6/5})$,

the error term in (5.7) being absorbed into the one in (4.3). Hence

$\Psi(\lambda_n) \le \dfrac{1}{(2\pi n)^{1/2}}\, \Phi\!\left( \dfrac{\lambda_n}{1 - \epsilon_n} \right) + Q n^{-3/5}(1 + \lambda_n^{-6/5})$,

$\Psi((1-\epsilon_n)\lambda_n) \ge \dfrac{1}{(2\pi n)^{1/2}}\, \Phi((1-\epsilon_n)\lambda_n) - Q n^{-3/5}(1 + \lambda_n^{-6/5})$;

substituting these into (4.3) we obtain

(5.8) $\Phi((1-\epsilon_n)\lambda_n) - Q n^{-1/10}(1 + \lambda_n^{-6/5}) \le (2\pi n)^{1/2}\, P(S_{n-1}^* \le \lambda_n n^{1/2},\ S_n = 0) \le \Phi\!\left( \dfrac{\lambda_n}{1 - \epsilon_n} \right) + Q n^{-1/10}(1 + \lambda_n^{-6/5})$.

Now from (5.5) we obtain

$\Phi\!\left( \dfrac{\lambda_n}{1 - \epsilon_n} \right) - \Phi(\lambda_n) = \theta A\, \epsilon_n \lambda_n \left( \dfrac{1}{\lambda_n} + \dfrac{1}{\lambda_n^2} \right) = \theta A\, \dfrac{(\lg n)^{1/2}}{n^{1/10}} \left( \dfrac{1}{\lambda_n} + \dfrac{1}{\lambda_n^2} \right)$

by (3.1). The same estimate holds for $\Phi(\lambda_n) - \Phi((1-\epsilon_n)\lambda_n)$. Hence observing


that $\lambda_n^{-6/5}$ is between $\lambda_n^{-1}$ and $\lambda_n^{-2}$, we can write (5.8) as

(5.9) $(2\pi n)^{1/2}\, P(S_{n-1}^* \le \lambda_n n^{1/2},\ S_n = 0) = \Phi(\lambda_n) + \theta Q n^{-1/10} \{ 1 + (\lg n)^{1/2}(\lambda_n^{-1} + \lambda_n^{-2}) \}$.

This result (5.9) holds in general for any independent random variables $X_j$, $j = 1, \dots, n$, whose common distribution function G(x) satisfies (i) and (ii) of §2. In particular it holds for the $Y_j$'s defined in (2.1). Referring to (2.2) and changing Q into A we have therefore

(5.10) $(2\pi n)^{1/2}\, P_n = \Phi(\lambda_n) + \theta A n^{-1/10} \{ 1 + (\lg n)^{1/2}(\lambda_n^{-1} + \lambda_n^{-2}) \}$.

Using Stirling's formula in (2.3) we obtain

$P(d_n \le \lambda_n n^{1/2}) = (2\pi n)^{1/2}\, e^{\theta/(12n)}\, P_n$,

which reduces to (1.5) by means of (5.10). Theorem 1 is proved.
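The Stirling step is elementary but easy to confirm numerically: $n!\, e^n / n^n = (2\pi n)^{1/2} e^{\theta/(12n)}$ with $0 < \theta < 1$. A quick check (ours, computed in log space to avoid overflow):

```python
import math

def stirling_factor(n):
    """n! e^n / n^n, computed via lgamma to avoid overflow."""
    return math.exp(math.lgamma(n + 1) + n - n * math.log(n))

if __name__ == "__main__":
    for n in (10, 100, 1000):
        ratio = stirling_factor(n) / math.sqrt(2 * math.pi * n)
        print(n, ratio)  # equals e^{theta/(12n)} with 0 < theta < 1, so ~ 1 + 1/(12n)
```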

We state two simple corollaries of Theorem 1 which we shall need in the

next section.

Corollary 1. There exists a universal constant $C > 0$ such that for every n we have

$P(d_n \le n^{1/2}) \ge C$.

This serves the purpose of Tchebycheff's inequality and follows already from Kolmogoroff's theorem without the remainder term. For all finite n the probability is a positive number; for all $n \ge n_0(\epsilon)$ the probability is $\ge \Phi(1) - \epsilon$, where $\epsilon$ may be taken to be any positive number $< \Phi(1)$.

Corollary 2. For the range

$\lambda(n) \le (C_0 \lg_2 n)^{1/2}$

we have

$C_1 e^{-2\lambda^2(n)} \le P(d_n \ge \lambda(n) n^{1/2}) \le C_2 e^{-2\lambda^2(n)}$

where the C's are positive constants, $C_1$ and $C_2$ depending on $C_0$.

In this range $1 - \Phi(\lambda)$ is dominated by the term $2e^{-2\lambda^2}$ and the remainder term is negligible in comparison.

6. If $n < m$ we define $(m-n)F_{n,m}(x)$ to be the number of the $X_k$, $n < k \le m$, whose values do not exceed x. Recalling the definition of $nF_n(x)$ in §1 we have the relation

(6.1) $nF_n(x) + (m-n)F_{n,m}(x) = mF_m(x)$.


Given $\eta > 0$ let $a > 1$ be such that

$\dfrac{1+\eta}{a} > 1 + \dfrac{\eta}{2}$.

Put $n_k = [a^k]$. Suppose that $n_k \le n < n_{k+1}$. Consider the events

$E_n$: $\ \sup_x |n(F_n(x) - F(x))| \ge (1+\eta)(2^{-1} n \lg_2 n)^{1/2}$

and

$E_{n, n_{k+1}}$: $\ \sup_x |(n_{k+1} - n)(F_{n, n_{k+1}}(x) - F(x))| \le (n_{k+1} - n)^{1/2}$.

By (6.1) the two events $E_n$ and $E_{n, n_{k+1}}$ together imply

(6.2) $\sup_x |n_{k+1}(F_{n_{k+1}}(x) - F(x))| \ge (1+\eta)(2^{-1} n \lg_2 n)^{1/2} - (n_{k+1} - n)^{1/2} = (1+\eta)(2^{-1} n_{k+1} \lg_2 n_{k+1})^{1/2} \left\{ \left( \dfrac{n \lg_2 n}{n_{k+1} \lg_2 n_{k+1}} \right)^{1/2} - \dfrac{1}{1+\eta} \left( \dfrac{n_{k+1} - n_k}{2^{-1} n_{k+1} \lg_2 n_{k+1}} \right)^{1/2} \right\}$.

As $k \to \infty$ the expression within the braces tends to $a^{-1/2}$, thus is not less than $a^{-1}$ for sufficiently large k. Then (6.2) implies the following event:

$F_{n_{k+1}}$: $\ \sup_x |n_{k+1}(F_{n_{k+1}}(x) - F(x))| \ge \left( 1 + \dfrac{\eta}{2} \right)(2^{-1} n_{k+1} \lg_2 n_{k+1})^{1/2}$

by the choice of a.

From Corollary 1 to Theorem 1 we have

$P(E_{n, n_{k+1}}) \ge C$.

This together with the fact that $E_{n, n_{k+1}}$ is independent of all $E_j$ for $j \le n$ gives

$P\left( \bigcup_{n=n_k}^{n_{k+1}-1} E_n \right) \le C^{-1} P(F_{n_{k+1}})$

(see Feller [4, p. 394]). According to Corollary 2 to Theorem 1,

$P(F_{n_{k+1}}) \le C_2 (\lg n_{k+1})^{-(1+\eta/2)^2} \le C_3 k^{-(1+\eta/2)}$

where $C_3 > 0$ is a constant depending only on a. Hence

$\sum_k P\left( \bigcup_{n=n_k}^{n_{k+1}-1} E_n \right) \le C^{-1} \sum_k P(F_{n_{k+1}}) < \infty$.


It follows from the Borel-Cantelli lemma that $P(E_n \text{ occurs i.o.}) = 0$. In other words,

(6.3) $P\left( \limsup_{n\to\infty} \dfrac{d_n}{(2^{-1} n \lg_2 n)^{1/2}} \le 1 + \eta \right) = 1$.

Now let $\beta > 1$ be an integer such that

$\left( 1 - \dfrac{\eta}{2} \right) \left( \dfrac{\beta - 1}{\beta} \right)^{1/2} - \dfrac{1 - \eta}{\beta^{1/2}} > 1 - \eta$.

Consider the events

$H_k$: $\ \sup_x |\beta^k (F_{\beta^k}(x) - F(x))| \le (1-\eta)(2^{-1} \beta^k \lg_2 \beta^k)^{1/2}$

and

$H_{k, k+1}$: $\ \sup_x |(\beta^{k+1} - \beta^k)(F_{\beta^k, \beta^{k+1}}(x) - F(x))| \ge \left( 1 - \dfrac{\eta}{2} \right)(2^{-1}(\beta^{k+1} - \beta^k) \lg_2 (\beta^{k+1} - \beta^k))^{1/2}$.

By (6.1) the two events $H_k$ and $H_{k, k+1}$ together imply

(6.4) $\sup_x |\beta^{k+1}(F_{\beta^{k+1}}(x) - F(x))| \ge \left( 1 - \dfrac{\eta}{2} \right)(2^{-1}(\beta^{k+1} - \beta^k) \lg_2 (\beta^{k+1} - \beta^k))^{1/2} - (1-\eta)(2^{-1} \beta^k \lg_2 \beta^k)^{1/2} = (2^{-1} \beta^{k+1} \lg_2 \beta^{k+1})^{1/2} \left\{ \left( 1 - \dfrac{\eta}{2} \right) \left( \dfrac{(\beta^{k+1} - \beta^k) \lg_2 (\beta^{k+1} - \beta^k)}{\beta^{k+1} \lg_2 \beta^{k+1}} \right)^{1/2} - (1-\eta) \left( \dfrac{\beta^k \lg_2 \beta^k}{\beta^{k+1} \lg_2 \beta^{k+1}} \right)^{1/2} \right\}$.

As $k \to \infty$ the expression within the braces tends to

$\left( 1 - \dfrac{\eta}{2} \right) \left( \dfrac{\beta - 1}{\beta} \right)^{1/2} - \dfrac{1 - \eta}{\beta^{1/2}}$.

Thus for sufficiently large k, say for $k \ge k_0$, (6.4) implies

$H_{k+1}^*$: $\ \sup_x |\beta^{k+1}(F_{\beta^{k+1}}(x) - F(x))| \ge (1-\eta)(2^{-1} \beta^{k+1} \lg_2 \beta^{k+1})^{1/2}$

by the choice of $\beta$.

Since $H_{k, k+1}$ is independent of $H_j$ for $j \le k$, it follows that

$P\left( \prod_{j=k_0}^{k} H_j \right) P(H_{k, k+1}) \le P\left( \prod_{j=k_0}^{k} H_j \cdot \overline{H}_{k+1} \right)$

or

$P\left( \prod_{j=k_0}^{k+1} H_j \right) \le P\left( \prod_{j=k_0}^{k} H_j \right)(1 - P(H_{k, k+1}))$.

From Corollary 2 to Theorem 1 we have

$P(H_{k, k+1}) \ge C_1 (\lg (\beta^{k+1} - \beta^k))^{-(1-\eta/2)^2} \ge C_4 k^{-(1-\eta/2)}$

where $C_4 > 0$ is a constant depending only on $\beta$. Thus

$P\left( \prod_{k=k_0}^{\infty} H_k \right) \le \prod_{k=k_0}^{\infty} (1 - C_4 k^{-(1-\eta/2)}) = 0$.

This means that with probability one $H_k$ fails for infinitely many k. In other words,

(6.5) $P\left( \limsup_{n\to\infty} \dfrac{d_n}{(2^{-1} n \lg_2 n)^{1/2}} \ge 1 - \eta \right) = 1$.

Since $\eta$ is arbitrary, (6.3) and (6.5) establish Theorem 2*.

References

1. A. Kolmogoroff, Sulla determinazione empirica di una legge di distribuzione, Giornale dell'Istituto Italiano degli Attuari vol. 4 (1933) pp. 83-91.
2. N. Smirnoff, Sur les écarts de la courbe de distribution empirique, Rec. Math. (Mat. Sbornik) N.S. vol. 6 (1939) pp. 3-26.
3. W. Feller, On the Kolmogoroff-Smirnoff limit theorems, Ann. Math. Statist. vol. 19 (1948) pp. 177-189.
4. ---, The general form of the so-called law of the iterated logarithm, Trans. Amer. Math. Soc. vol. 54 (1943) pp. 373-402.
5. A. Khintchine, Asymptotische Gesetze der Wahrscheinlichkeitsrechnung, Ergebnisse der Mathematik und ihrer Grenzgebiete, Berlin, Springer, 1933.
6. P. Erdös and M. Kac, On certain limit theorems of the theory of probability, Bull. Amer. Math. Soc. vol. 52 (1946) pp. 292-302.
7. C. G. Esseen, Fourier analysis of distribution functions, Acta Math. vol. 77 (1945) pp. 1-125.
8. K. L. Chung, On the maximum partial sums of sequences of independent random variables, Trans. Amer. Math. Soc. vol. 64 (1948) pp. 205-233.
9. A. C. Berry, The accuracy of the Gaussian approximation to the sum of independent variates, Trans. Amer. Math. Soc. vol. 49 (1941) pp. 122-136.
10. L. Bachelier, Calcul des probabilités, vol. 1, Paris, 1912.
11. P. Lévy, Théorie de l'addition des variables aléatoires, Paris, Gauthier-Villars, 1937.

Princeton University,
Princeton, N. J.

Cornell University,
Ithaca, N. Y.
