A Ferguson-Klass-LePage series representation of
multistable multifractional motions and related processes

R. Le Guével
Université Paris VI, Laboratoire de Probabilités et Modèles Aléatoires
4 place Jussieu, 75252 Paris Cedex 05, France
[email protected]

and

J. Lévy Véhel
Regularity Team, INRIA Saclay, Parc Orsay Université
4 rue Jacques Monod - Bat P - 91893 Orsay Cedex, France
[email protected]

    Abstract

The study of non-stationary processes whose local form has controlled properties is a fruitful and important area of research, both in theory and applications. In [8], a particular way of constructing such processes was investigated, leading in particular to multifractional multistable processes, which were built using sums over Poisson processes. We present here a different construction of these processes, based on the Ferguson-Klass-LePage series representation of stable processes. We consider various particular cases of interest, including multistable Lévy motion, multistable reverse Ornstein-Uhlenbeck process, log-fractional multistable motion and linear multistable multifractional motion. We also show that the processes defined here have the same finite dimensional distributions as the corresponding processes built in [8]. Finally, we display numerical experiments showing graphs of synthesized paths of such processes.

Keywords: localisable processes, stable processes, Ferguson-Klass-LePage series representation, multifractional processes.

    1 Introduction

This work deals with a general method for building stochastic processes for which certain aspects of the local form are prescribed. We will mainly be interested here in local Hölder regularity and local intensity of jumps, but our construction allows, in principle, other properties of interest to be controlled. Our approach is in the same spirit as the one proposed in [8], but it uses different methods. In particular, in [8], multistable processes,

that is, localisable processes which are locally $\alpha$-stable but where the index of stability $\alpha$ varies with time, were constructed using sums over Poisson processes. We present here an alternative construction of such processes, based on the Ferguson-Klass-LePage series representation of stable stochastic processes [9, 13, 14]. This representation is a powerful tool for the study of various aspects of stable processes, see for instance [3, 19]. A comprehensive reference for the properties of this representation that will be needed here is [20].

Stochastic processes where the local Hölder regularity varies with a parameter $t$ are interesting both from a theoretical and practical point of view. A well-known example is multifractional Brownian motion (mBm), where the Hurst index $h$ of fractional Brownian motion (fBm) [11, 16] is replaced by a functional parameter $h(t)$, permitting the Hölder exponent to vary in a prescribed manner [1, 2, 10, 17]. This allows, in addition, local regularity and long range dependence to be decoupled, giving sample paths that are both highly irregular and highly correlated, a useful feature for instance in terrain or TCP traffic modeling.

However, local regularity, as measured by the Hölder exponent, is not the only local feature that is useful in theory and applications. Jump characteristics also need to be accounted for, e.g. for studying processes with paths in $D(\mathbb{R})$ (the space of càdlàg functions, i.e. functions which are continuous on the right and have left limits at all $t \in T$). This has applications for instance in the modeling of financial or medical data. Stable non-Gaussian processes yield relevant models in this case, with the stability index $\alpha$ controlling the distribution of jumps.

For the same reason that it is interesting to consider stochastic processes whose local Hölder exponent changes in a controlled manner, tractable models where the jump intensity $\alpha$ is allowed to vary in time are needed, for instance to obtain a more accurate description of some aspects of the local structure of functions in $D(\mathbb{R})$.

The approach described in this work allows in particular the construction of processes where $h$ and $\alpha$ evolve in time in a prescribed way. Having two functional parameters allows one to finely tune the local properties of these processes. This may prove useful to model two distinct aspects of financial risk, textured images where both the Hölder regularity and the distribution of discontinuities vary, or epileptic episodes in EEG where at times there may be only small jumps and at others very large ones. That the processes defined below have a varying jump intensity just means that they are tangent, in a sense to be described shortly, to a stable process with prescribed $\alpha$. The varying local Hölder regularity is studied in [12], where an upper bound is given under general conditions, and an exact value is provided in the case of the multistable Lévy motion.

Let us now recall the definition of a localisable process [5, 6]: $Y = \{Y(t) : t \in \mathbb{R}\}$ is $h$-localisable at $u$ if there exist $h \in \mathbb{R}$ and a non-trivial limiting process $Y'_u$ such that

$$\lim_{r \to 0} \frac{Y(u + rt) - Y(u)}{r^h} = Y'_u(t). \qquad (1.1)$$

(Note that $Y'_u$ will in general vary with $u$.) When the limit exists, $Y'_u = \{Y'_u(t) : t \in \mathbb{R}\}$ is termed the local form or tangent process of $Y$ at $u$ (see [2, 17] for similar notions). The limit (1.1) may be taken in several ways. In this work, we will only deal with the case where convergence occurs in finite dimensional distributions. When convergence takes place in distribution, the process is called strongly localisable (equality in distribution is denoted $\stackrel{d}{=}$).

As mentioned above, a now classical example is mBm $Y$, which looks like index-$h(u)$ fBm close to time $u$, but where $h(u)$ varies, that is,

$$\lim_{r \to 0} \frac{Y(u + rt) - Y(u)}{r^{h(u)}} = B_{h(u)}(t), \qquad (1.2)$$

where $B_h$ is index-$h$ fBm. A generalization of mBm, where the Gaussian measure is replaced by an $\alpha$-stable one, leads to multifractional stable processes [21].
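The limit (1.2) can be illustrated numerically. Below is a minimal sketch (not from the paper) that discretizes the moving-average representation of fBm, letting the Hurst exponent depend on time while keeping a single shared white noise, so that the implicit field $X(t, v) = B_{h(v)}(t)$ is correlated and mBm is read off its diagonal. The regularity function `h`, the grid and the truncation of the integration domain are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def frac_kernel(x, H):
    # (x)_+^{H - 1/2}, evaluated only where x > 0 to avoid 0 ** (negative)
    out = np.zeros_like(x)
    pos = x > 0
    out[pos] = x[pos] ** (H - 0.5)
    return out

# Illustrative Hurst function ranging in (0, 1).
def h(t):
    return 0.3 + 0.4 * t

# One shared discretised white noise: the field X(t, v) = B_{h(v)}(t) is then
# correlated in v, and mBm is its diagonal Y(t) = X(t, t).
ds = 0.002
s = np.arange(-20.0, 1.0, ds) + ds / 2.0   # midpoint integration grid
dW = rng.standard_normal(s.size) * np.sqrt(ds)

def mbm_path(ts):
    out = []
    for t in ts:
        H = h(t)
        kern = frac_kernel(t - s, H) - frac_kernel(-s, H)
        out.append(float(np.sum(kern * dW)))
    return np.array(out)

ts = np.linspace(0.0, 1.0, 100)
path = mbm_path(ts)   # one synthesized approximate mBm path, path[0] = 0
```

With `h` constant, the same routine reduces to a standard approximate fBm synthesis; the truncation at $s = -20$ and the step `ds` control the approximation quality.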

The $h$-local form $Y'_u$ at $u$, if it exists, must be $h$-self-similar, that is, $Y'_u(rt) \stackrel{d}{=} r^h Y'_u(t)$ for $r > 0$. In addition, as shown in [5, 6], under quite general conditions, $Y'_u$ must also have stationary increments at almost all (a.a.) $u$ at which there is strong localisability. Thus, typical local forms are self-similar with stationary increments (sssi), that is, $r^{-h}(Y'_u(u + rt) - Y'_u(u)) \stackrel{d}{=} Y'_u(t)$ for all $u$ and $r > 0$. Conversely, all sssi processes are localisable. Classes of known sssi processes include fBm, linear fractional stable motion and $\alpha$-stable Lévy motion, see [20].
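As a concrete instance of the last remark, $\alpha$-stable Lévy motion $L_\alpha$ is $1/\alpha$-sssi, and the defining limit (1.1) can be checked in one line:

```latex
\frac{L_\alpha(u + rt) - L_\alpha(u)}{r^{1/\alpha}}
  \stackrel{d}{=} \frac{L_\alpha(rt)}{r^{1/\alpha}}  % stationary increments
  \stackrel{d}{=} L_\alpha(t),                       % 1/\alpha-self-similarity
```

so $L_\alpha$ is $1/\alpha$-localisable at every $u$, with local form $L_\alpha$ itself.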

Similarly to [8], our method for constructing localisable processes is to make use of stochastic fields $\{X(t, v), (t, v) \in \mathbb{R}^2\}$, where the process $t \mapsto X(t, v)$ is localisable for all $v$. This field will allow us to control the local form of the diagonal process $Y = \{X(t, t) : t \in \mathbb{R}\}$. For instance, in the case of mBm, $X$ will be a field of fractional Brownian motions, i.e. $X(t, v) = B_{h(v)}(t)$, where $h$ is a smooth function of $v$ ranging in $[a, b] \subset (0, 1)$. This is the approach that was used originally in [1] for studying mBm. From a heuristic point of view, taking the diagonal of such a stochastic field constructs a new process with local form depending on $t$ by piecing together known localisable processes. In other words, we shall use random fields $\{X(t, v) : (t, v) \in \mathbb{R}^2\}$ such that, for each $v$, the local form $X'_v(\cdot, v)$ of $X(\cdot, v)$ at $v$ is the desired local form $Y'_v$ of $Y$ at $v$. An easy situation is when, for each $v$, the process $\{X(t, v) : t \in \mathbb{R}\}$ is sssi. It is clear that, in this approach, the structure of $X(\cdot, v)$ for $v$ in a neighbourhood of $u$ is crucial in determining the local behaviour of $Y$ near $u$. A simple way to control this structure is to define the random field as an integral or sum of functions depending on $t$ and $v$ with respect to a single underlying random measure, so as to provide the necessary correlations. General criteria that guarantee the transference of localisability from the $X(\cdot, v)$ to $Y = \{X(t, t) : t \in \mathbb{R}\}$ were obtained in [8]. We will use the following one:

Theorem 1.1. Let $U$ be an interval with $u$ an interior point. Suppose that for some $0 < h < \eta$ the process $\{X(t, u), t \in U\}$ is $h$-localisable at $u \in U$ with local form $X'_u(\cdot, u)$, and

$$P(|X(v, v) - X(v, u)| \geq |v - u|^{\eta}) \to 0 \qquad (1.3)$$

as $v \to u$. Then $Y = \{X(t, t) : t \in U\}$ is $h$-localisable at $u$, with $Y'_u(\cdot) = X'_u(\cdot, u)$.

In the sequel, we consider certain random fields and use Theorem 1.1 to build localisable processes with interesting properties. As a particular case, we study multifractional multistable processes, where both the local regularity and the intensity of jumps evolve in a controlled way.

The remainder of this article is organized as follows: we first collect some notations in section 2. We then build localisable processes using a series representation that yields the flexibility required for our purpose. We need to distinguish between the situations where the underlying measure space is finite (section 3) or merely sigma-finite (section 4). In each case, we define a random field depending on a kernel $f$, and give conditions on $f$ ensuring localisability of the diagonal process. We then consider in section 5 some examples: multistable Lévy motion, multistable reverse Ornstein-Uhlenbeck process, log-fractional multistable motion and linear multistable multifractional motion. Section 6 is devoted to computing the finite dimensional distributions of our processes, and to proving that they are the same as those of the corresponding processes constructed in [8]. Even though the processes constructed here and the ones from [8] coincide in law, they provide different representations of multistable processes, just as, in the stable case, both the Poisson series representation and the

classical Ferguson-Klass-LePage representation have their own interest. It is also worthwhile to note that the conditions required on the kernel defining the process differ between this work and [8]: we provide a specific example in section 5 where localisability can be proved using the Ferguson-Klass-LePage representation but not using the Poisson one. Finally, section 7 displays graphs of certain localisable processes of interest. Before we proceed, we note that constructing localisable processes using a stochastic field composed of sssi processes is obviously not the only approach that one can think of. See [7] for an example.

    2 Notations

We refer the reader to the first chapters of [20] for basic notions on stable random variables and processes. In particular, recall that a process $\{X(t) : t \in T\}$, where $T$ is generally a subinterval of $\mathbb{R}$, is called $\alpha$-stable ($0 < \alpha \leq 2$) if all its finite-dimensional distributions are $\alpha$-stable. Many stable processes admit a stochastic integral representation, as follows. Write $S_\alpha(\sigma, \beta, \mu)$ for the $\alpha$-stable distribution with scale parameter $\sigma$, skewness $\beta$ and shift parameter $\mu$; we will assume throughout that $\beta = 0$ and $\mu = 0$. Let $(E, \mathcal{E}, m)$ be a sigma-finite measure space with $m \neq 0$. Taking $m$ as the control measure defines an $\alpha$-stable random measure $M$ on $E$ such that, for $A \in \mathcal{E}$, we have $M(A) \sim S_\alpha(m(A)^{1/\alpha}, 0, 0)$. It is termed symmetric $\alpha$-stable, or S$\alpha$S. Let

$$F_\alpha \equiv F_\alpha(E, \mathcal{E}, m) = \{f : f \text{ is measurable and } \|f\|_\alpha < \infty\},$$

where $\|\cdot\|_\alpha$ is the quasinorm (or norm if $1 < \alpha \leq 2$) given by

$$\|f\|_\alpha = \left( \int_E |f(x)|^\alpha \, m(dx) \right)^{1/\alpha}. \qquad (2.4)$$

The stochastic integral of $f \in F_\alpha(E, \mathcal{E}, m)$ with respect to $M$ then exists [20, Chapter 3]:

$$I(f) = \int_E f(x) \, M(dx) \sim S_\alpha(\|f\|_\alpha, 0, 0). \qquad (2.5)$$

In particular,

$$E|I(f)|^p = \begin{cases} c(\alpha, p) \, \|f\|_\alpha^p & (0 < p < \alpha) \\ \infty & (p \geq \alpha), \end{cases} \qquad (2.6)$$

where $c(\alpha, p) < \infty$, see [20, Property 1.2.17]. As said above, we will consider in this work only the case of symmetric stable measures. We believe most results should have a counterpart in the non-symmetric case, although the proofs would probably have to be significantly more involved. We use the following notations throughout the paper:

$(\Gamma_i)_{i \geq 1}$ is a sequence of arrival times of a Poisson process with unit arrival time.

$(V_i)_{i \geq 1}$ is a sequence of i.i.d. random variables with distribution $\hat{m}$ on a measure space $(E, \mathcal{E}, m)$. When $m(E) < \infty$, we always choose $\hat{m} = m/m(E)$. Otherwise, $\hat{m}$ is specified and is equivalent to the measure $m$ in each case.

$(\gamma_i)_{i \geq 1}$ is a sequence of i.i.d. random variables with distribution $P(\gamma_i = 1) = P(\gamma_i = -1) = 1/2$.

The three sequences $(\Gamma_i)_{i \geq 1}$, $(V_i)_{i \geq 1}$, and $(\gamma_i)_{i \geq 1}$ are always assumed to be independent.

3 A Ferguson-Klass-LePage series representation of localisable processes in the finite measure space case

A well-known representation of stable random variables is the Ferguson-Klass-LePage series [3, 9, 13, 14, 19]. It is well adapted to our purpose, since it allows for an easy generalization to the case of varying $\alpha$. In this work, we will use the following version:

Theorem 3.2 ([20, Theorem 3.10.1]). Let $(E, \mathcal{E}, m)$ be a finite measure space with $m \neq 0$, and let $M$ be a symmetric $\alpha$-stable random measure with $\alpha \in (0, 2)$ and finite control measure $m$. Then, for any $f \in F_\alpha(E, \mathcal{E}, m)$,

$$\int_E f(x) \, M(dx) \stackrel{d}{=} (C_\alpha \, m(E))^{1/\alpha} \sum_{i=1}^{\infty} \gamma_i \, \Gamma_i^{-1/\alpha} f(V_i), \qquad (3.7)$$

where $C_\alpha = \left( \int_0^\infty x^{-\alpha} \sin(x) \, dx \right)^{-1}$ (Theorem 3.10.1 in [20] is more general, as it extends to non-symmetric stable processes, which are not considered here). As mentioned above, a relevant feature of this representation for us is that the distributions of all random variables appearing in the sum are independent of $\alpha$. We will use (3.7) to construct processes with varying $\alpha$ as described in the following theorem.
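The representation (3.7) is straightforward to simulate once the series is truncated. The following sketch uses the illustrative choices $E = [0, 1]$, $m$ Lebesgue and $f \equiv 1$, so that the integral is simply $M([0, 1]) \sim S_\alpha(1, 0, 0)$, together with the closed form $C_\alpha = (1 - \alpha)/(\Gamma(2 - \alpha)\cos(\pi\alpha/2))$, valid for $\alpha \neq 1$; the truncation level and sample size are arbitrary choices, not taken from the paper.

```python
import math
import numpy as np

rng = np.random.default_rng(42)

def C(alpha):
    # C_alpha = ( int_0^inf x^{-alpha} sin(x) dx )^{-1}
    #         = (1 - alpha) / (Gamma(2 - alpha) cos(pi alpha / 2)),  alpha != 1
    return (1.0 - alpha) / (math.gamma(2.0 - alpha) * math.cos(math.pi * alpha / 2.0))

def fkl_sample(alpha, f, m_E=1.0, n_terms=10_000):
    # Truncated Ferguson-Klass-LePage series (3.7) for int_E f dM.
    Gam = np.cumsum(rng.exponential(size=n_terms))   # Poisson arrival times Gamma_i
    V = rng.uniform(0.0, m_E, size=n_terms)          # V_i i.i.d. with law m-hat
    eps = rng.choice([-1.0, 1.0], size=n_terms)      # Rademacher signs gamma_i
    return (C(alpha) * m_E) ** (1.0 / alpha) * np.sum(eps * Gam ** (-1.0 / alpha) * f(V))

# f = 1 on E = [0, 1]: each draw approximates M([0, 1]) ~ S_alpha(1, 0, 0).
alpha = 1.5
samples = np.array([fkl_sample(alpha, lambda v: np.ones_like(v)) for _ in range(500)])
# The empirical law should be symmetric and heavy-tailed (no second moment).
```

With a non-constant kernel $f$, the same routine approximates $I(f)$; only the truncation error changes.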

Theorem 3.3. Let $(E, \mathcal{E}, m)$ be a finite measure space with $m \neq 0$. Let $\alpha$ be a $C^1$ function defined on $\mathbb{R}$ and ranging in $(0, 2)$. Let $b$ be a $C^1$ function defined on $\mathbb{R}$. Let $f(t, u, \cdot)$ be a family of functions such that, for all $(t, u) \in \mathbb{R}^2$, $f(t, u, \cdot) \in F_{\alpha(u)}(E, \mathcal{E}, m)$. Consider

$$X(t, u) = b(u) \, (m(E))^{1/\alpha(u)} \, C_{\alpha(u)}^{1/\alpha(u)} \sum_{i=1}^{\infty} \gamma_i \, \Gamma_i^{-1/\alpha(u)} f(t, u, V_i). \qquad (3.8)$$

Assume $X(\cdot, u)$ is localisable at $u$ with exponent $h \in (0, 1)$ and local form $X'_u(\cdot, u)$, and that there exists $\varepsilon > 0$ such that:

(C1) $v \mapsto f(t, v, x)$ is differentiable on $B(u, \varepsilon)$ for any $t \in B(u, \varepsilon)$ and almost all $x \in E$ (the derivatives of $f$ with respect to $v$ are denoted by $f'_v$),

(C2)
$$\sup_{t \in B(u, \varepsilon)} \int_E \sup_{w \in B(u, \varepsilon)} \big| f(t, w, x) \log |f(t, w, x)| \big|^{\alpha(w)} \, m(dx) < \infty, \qquad (3.9)$$

(C3)
$$\sup_{t \in B(u, \varepsilon)} \int_E \sup_{w \in B(u, \varepsilon)} |f'_v(t, w, x)|^{\alpha(w)} \, m(dx) < \infty. \qquad (3.10)$$

Then $Y(t) \equiv X(t, t)$ is localisable at $u$ with exponent $h$ and local form $Y'_u(t) = X'_u(t, u)$.
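To make (3.8) concrete before the proof, here is a hedged numerical sketch of the diagonal process $Y(t) = X(t, t)$ for the illustrative choices $E = [0, 1]$, $m$ Lebesgue, $b \equiv 1$ and $f(t, u, x) = 1_{[0, t]}(x)$, which corresponds to the multistable Lévy motion studied in section 5. The stability function `alpha_fn` and the truncation level are assumptions of this sketch, and the closed form used for $C_\alpha$ again requires $\alpha \neq 1$.

```python
import math
import numpy as np

rng = np.random.default_rng(7)

def C(alpha):
    # C_alpha = (1 - alpha) / (Gamma(2 - alpha) cos(pi alpha / 2)), alpha != 1
    return (1.0 - alpha) / (math.gamma(2.0 - alpha) * math.cos(math.pi * alpha / 2.0))

def alpha_fn(t):
    # illustrative C^1 stability function ranging in (0, 2), bounded away from 1
    return 1.2 + 0.6 * t

# The three independent sequences of section 2, shared by all (t, u).
n_terms = 20_000
Gam = np.cumsum(rng.exponential(size=n_terms))  # Gamma_i
V = rng.uniform(0.0, 1.0, size=n_terms)         # V_i ~ m-hat = Lebesgue on [0, 1]
eps = rng.choice([-1.0, 1.0], size=n_terms)     # gamma_i

def X(t, u):
    # truncated version of the field (3.8) with b = 1, m(E) = 1,
    # and kernel f(t, u, x) = 1_{[0, t]}(x)
    a = alpha_fn(u)
    return C(a) ** (1.0 / a) * np.sum(eps * Gam ** (-1.0 / a) * (V <= t))

ts = np.linspace(0.0, 1.0, 400)
path = np.array([X(t, t) for t in ts])   # the diagonal Y(t) = X(t, t)
```

Near $t = 0$ the jump distribution of the synthesized path resembles that of an $\alpha = 1.2$ stable motion (frequent large jumps), while near $t = 1$ it looks closer to an $\alpha = 1.8$ one.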

Proof. First, note that condition (C2) implies the following condition:

(C4) There exists $\varepsilon > 0$ such that:
$$\sup_{t \in B(u, \varepsilon)} \int_E \sup_{w \in B(u, \varepsilon)} |f(t, w, x)|^{\alpha(w)} \, m(dx) < \infty. \qquad (3.11)$$

Indeed, for all $t \in B(u, \varepsilon)$, $w \in B(u, \varepsilon)$ and $x \in E$,

$$|f(t, w, x)| = |f(t, w, x)| 1_{|f(t, w, x)| < \frac{1}{e}} + |f(t, w, x)| 1_{|f(t, w, x)| > e} + |f(t, w, x)| 1_{|f(t, w, x)| \in [\frac{1}{e}, e]} \leq 2 |f(t, w, x)| \big| \log |f(t, w, x)| \big| + e.$$

Condition (C4) is then a consequence of the inequality $|a + b|^{\lambda} \leq \max(1, 2^{\lambda - 1})(|a|^{\lambda} + |b|^{\lambda})$, valid for all real numbers $a, b$ and all positive $\lambda$.

The function $w \mapsto C_{\alpha(w)}^{1/\alpha(w)}$ is $C^1$ since $\alpha(w)$ ranges in $(0, 2)$. We shall denote $a(w) = b(w) (m(E))^{1/\alpha(w)} C_{\alpha(w)}^{1/\alpha(w)}$. The function $a$ is thus also $C^1$. We want to apply Theorem 1.1. With that in view, we estimate, for $v \in B(u, \varepsilon)$ (the ball centered at $u$ with radius $\varepsilon$),

$$X(v, v) - X(v, u) =: \sum_{i=1}^{\infty} \gamma_i (\varphi_i(v) - \varphi_i(u)) + \sum_{i=1}^{\infty} \gamma_i (\psi_i(v) - \psi_i(u)),$$

where
$$\varphi_i(w) := \varphi_i(v, w) := a(w) \, i^{-1/\alpha(w)} f(v, w, V_i),$$
$$\psi_i(w) := \psi_i(v, w) := a(w) \left( \Gamma_i^{-1/\alpha(w)} - i^{-1/\alpha(w)} \right) f(v, w, V_i).$$

The reason for introducing the $\varphi_i$ and the $\psi_i$ is that the random variables $\Gamma_i$ are not independent, which complicates their study. We shall decompose the sum involving the $\varphi_i$ into series of independent random variables, which will be dealt with using the three-series theorem. The sum involving the $\psi_i$ will be studied by taking advantage of the fact that, for large enough $i$, each $\Gamma_i$ is close to $i$ in some sense.

Let $c = \inf_{v \in B(u, \varepsilon)} \alpha(v)$, $d = \sup_{v \in B(u, \varepsilon)} \alpha(v)$. If $\inf_{v \in B(u, \varepsilon)} \alpha(v) = \sup_{v \in B(u, \varepsilon)} \alpha(v)$, we let instead $c = \inf_{v \in B(u, \varepsilon)} \alpha(v)$ and $d = c + \delta$, for some $\delta > 0$. Note that, in both cases, by decreasing $\varepsilon$ and $\delta$, $d - c$ may be made arbitrarily small.

Thanks to the assumptions on $a$ and $f$, $\varphi_i$ and $\psi_i$ are differentiable almost surely (a.s.):

$$\varphi_i'(w) = a'(w) \, i^{-1/\alpha(w)} f(v, w, V_i) + a(w) \, i^{-1/\alpha(w)} f'_w(v, w, V_i) + a(w) \frac{\alpha'(w)}{\alpha(w)^2} \log(i) \, i^{-1/\alpha(w)} f(v, w, V_i),$$

$$\psi_i'(w) = a'(w) \left( \Gamma_i^{-1/\alpha(w)} - i^{-1/\alpha(w)} \right) f(v, w, V_i) + a(w) \left( \Gamma_i^{-1/\alpha(w)} - i^{-1/\alpha(w)} \right) f'_w(v, w, V_i) + a(w) \frac{\alpha'(w)}{\alpha(w)^2} \left( \log(\Gamma_i) \, \Gamma_i^{-1/\alpha(w)} - \log(i) \, i^{-1/\alpha(w)} \right) f(v, w, V_i).$$

Notice that the functions $\varphi_i$ and $\psi_i$ depend on $v$. Consider now the function $h_i : x \mapsto \varphi_i(x) - \varphi_i(u) - \frac{\varphi_i(v) - \varphi_i(u)}{v - u}(x - u)$, with $v \neq u$. Note that $h_i$ is a random process that is a.s. $C^1$ on $B(u, \varepsilon)$. Set $w_i = \min K_i$, where $K_i = \{x \in [u, v] : h_i'(x) = 0\}$. The mean value theorem yields that $K_i$ is a non-empty closed subset of $\mathbb{R}$.

Considering the function $k_i : x \mapsto \psi_i(x) - \psi_i(u) - \frac{\psi_i(v) - \psi_i(u)}{v - u}(x - u)$, and the set $F_i = \{x \in [u, v] : k_i'(x) = 0\}$, we define also $x_i = \min F_i$.

Then there exists a sequence of independent measurable random numbers $w_i \in [u, v]$ (or $[v, u]$) and a sequence of measurable random numbers $x_i \in [u, v]$ (or $[v, u]$) such that:

$$X(v, u) - X(v, v) = (u - v) \sum_{i=1}^{\infty} (Z_i^1 + Z_i^2 + Z_i^3) + (u - v) \sum_{i=1}^{\infty} (Y_i^1 + Y_i^2 + Y_i^3),$$

where
$$Z_i^1 = \gamma_i \, a'(w_i) \, i^{-1/\alpha(w_i)} f(v, w_i, V_i),$$
$$Z_i^2 = \gamma_i \, a(w_i) \, i^{-1/\alpha(w_i)} f'_u(v, w_i, V_i),$$
$$Z_i^3 = \gamma_i \, a(w_i) \frac{\alpha'(w_i)}{\alpha(w_i)^2} \log(i) \, i^{-1/\alpha(w_i)} f(v, w_i, V_i),$$
$$Y_i^1 = \gamma_i \, a'(x_i) \left( \Gamma_i^{-1/\alpha(x_i)} - i^{-1/\alpha(x_i)} \right) f(v, x_i, V_i),$$
$$Y_i^2 = \gamma_i \, a(x_i) \left( \Gamma_i^{-1/\alpha(x_i)} - i^{-1/\alpha(x_i)} \right) f'_u(v, x_i, V_i),$$
$$Y_i^3 = \gamma_i \, a(x_i) \frac{\alpha'(x_i)}{\alpha(x_i)^2} \left( \log(\Gamma_i) \, \Gamma_i^{-1/\alpha(x_i)} - \log(i) \, i^{-1/\alpha(x_i)} \right) f(v, x_i, V_i).$$

Note that each $w_i$ depends on $a, f, \alpha, u, v, V_i$, but not on $\gamma_i$. This remark will be useful in the sequel. We now establish a lemma in order to control the series $\sum_{i=1}^{\infty} P(|Z_i^1| > \lambda)$.

Lemma 3.4. There exists a positive constant $K$ such that, for all $\lambda > 0$,
$$\sum_{i=1}^{\infty} P(|Z_i^1| > \lambda) \leq \frac{K \, E\left[ \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \right]}{\min(\lambda^c, \lambda^d)}.$$

Proof of Lemma 3.4. Fix $\lambda > 0$. We may assume without loss of generality that $a' \not\equiv 0$, otherwise there is nothing to prove. One has:

$$P(|Z_i^1| > \lambda) \leq P\left( |f(v, w_i, V_i)|^{\alpha(w_i)} > \frac{i \, \min(\lambda^c, \lambda^d)}{\sup_{w \in B(u, \varepsilon)} [|a'(w)|^{\alpha(w)}]} \right).$$

Note that, since $a'$ is bounded on the compact interval $[u, v]$,
$$K := \sup_{w \in B(u, \varepsilon)} \left[ |a'(w)|^{\alpha(w)} \right] < +\infty,$$
so that
$$P(|Z_i^1| > \lambda) \leq P\left( K \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} > i \, \min(\lambda^c, \lambda^d) \right).$$

Thus

$$\sum_{i=1}^{+\infty} P(|Z_i^1| > \lambda) \leq \sum_{i=1}^{+\infty} P\left( K \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} > i \, \min(\lambda^c, \lambda^d) \right) \leq \frac{K}{\min(\lambda^c, \lambda^d)} \, E\left[ \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \right].$$

We now come back to the proof of Theorem 3.3. The remainder of the proof is divided into four steps. The first step will apply the three-series theorem to show that each series $\sum_{i=1}^{\infty} Z_i^j$, $j = 1, 2, 3$, converges a.s. In the second step, we will prove that $\sum_{i=1}^{\infty} Y_i^j$ also converges

a.s. for $j = 1, 2, 3$. In the third step, we will prove that condition (1.3) is verified by $\sum_{i=1}^{\infty} Z_i^j$, $j = 1, 2, 3$. Finally, step four will prove the same thing for $\sum_{i=1}^{\infty} Y_i^j$, $j = 1, 2, 3$.

First step: almost sure convergence of $\sum_{i=1}^{\infty} Z_i^j$, $j = 1, 2, 3$.

Consider $Z^1 = \sum_{i=1}^{\infty} Z_i^1$. Fix $\lambda > 0$. We shall deal successively with the three series involved in the three-series theorem.

First series: $S_1 = \sum_{i=1}^{\infty} P(|Z_i^1| > \lambda)$. From Lemma 3.4 and (C4), one gets $S_1 < +\infty$.

Second series: $S_2^n = \sum_{i=1}^{n} E(Z_i^1 1_{\{|Z_i^1| \leq \lambda\}})$. One computes

$$E(Z_i^1 1_{\{|Z_i^1| \leq \lambda\}}) = E\left( \gamma_i \, a'(w_i) \, i^{-1/\alpha(w_i)} f(v, w_i, V_i) 1_{\{|a'(w_i) \, i^{-1/\alpha(w_i)} f(v, w_i, V_i)| \leq \lambda\}} \right) = E(\gamma_i) \, E\left( a'(w_i) \, i^{-1/\alpha(w_i)} f(v, w_i, V_i) 1_{\{|a'(w_i) \, i^{-1/\alpha(w_i)} f(v, w_i, V_i)| \leq \lambda\}} \right) = 0,$$

where we have used that $\gamma_i$ is independent of $(w_i, V_i)$ and $E(\gamma_i) = 0$. Thus $\lim_{n \to +\infty} S_2^n = 0$.

Third series: the final series we need to consider is $S_3 = \sum_{i=1}^{\infty} E\left[ (Z_i^1 1_{\{|Z_i^1| \leq \lambda\}})^2 \right]$. Choose $\lambda = 1$.¹ Let $\theta$ be such that $d < \theta < 2$. Then

$$(Z_i^1 1_{\{|Z_i^1| \leq 1\}})^2 \leq |Z_i^1|^{\theta} 1_{\{|Z_i^1| \leq 1\}},$$

so that

$$E\left[ (Z_i^1 1_{\{|Z_i^1| \leq 1\}})^2 \right] \leq E\left[ |Z_i^1|^{\theta} 1_{\{|Z_i^1| \leq 1\}} \right] = \int_0^{+\infty} P(|Z_i^1|^{\theta} 1_{\{|Z_i^1| \leq 1\}} > x) \, dx \leq \int_0^1 P(|Z_i^1| > x^{1/\theta}) \, dx.$$

Now, from Lemma 3.4 and (C4), there exists a positive finite constant $K \in (0, \infty)$ such that

$$S_3 \leq K \int_0^1 \frac{dx}{\min(x^{c/\theta}, x^{d/\theta})} = K \int_0^1 \frac{dx}{x^{d/\theta}} < +\infty.$$

The case of $Z^2 = \sum_{i=1}^{\infty} Z_i^2$ is treated similarly, since the conditions required on $(a', f)$ in the proof above are also satisfied by $(a, f'_u)$. Consider finally $Z^3 = \sum_{i=1}^{\infty} Z_i^3$. Since the series $\sum_{i=1}^{\infty} \gamma_i (\varphi_i(v) - \varphi_i(u))$ converges a.s. (see for instance [15, page 132]), the convergence of $Z^1$ and $Z^2$ implies the convergence of $Z^3$. We have thus shown that the series $Z^1$, $Z^2$ and $Z^3$ are a.s. convergent.

Second step: almost sure convergence of $\sum_{i=1}^{\infty} Y_i^j$, $j = 1, 2, 3$.

¹ Recall that, in the three-series theorem, for the series $\sum_{i=1}^{\infty} X_i$ to converge a.s., it is necessary that, for all $\lambda > 0$, the three series $\sum_{i=1}^{\infty} P(|X_i| > \lambda)$, $\sum_{i=1}^{\infty} E(X_i 1_{(|X_i| \leq \lambda)})$ and $\sum_{i=1}^{\infty} \mathrm{Var}(X_i 1_{(|X_i| \leq \lambda)})$ converge, and it is sufficient that they converge for one $\lambda > 0$; see, e.g., [18], Theorem 6.1.

To show that the series $\sum_{i=1}^{\infty} Y_i^j$, $j = 1, 2$, converge a.s., we will first prove that it is enough to show that $\sum_{i=1}^{\infty} Y_i^j 1_{\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^j| \leq 1\}}$ converges a.s. for $j = 1, 2$. Indeed, we prove now that $\sum_{i=1}^{\infty} P\left( \left( \{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^j| \leq 1\} \right)^c \right) < \infty$ for $j = 1, 2$, where $T^c$ denotes the complement of the set $T$, and conclude with the Borel-Cantelli lemma. The case of $\sum_{i=1}^{+\infty} Y_i^3$ is then treated as was the case of $\sum_{i=1}^{+\infty} Z_i^3$: we know that the series $\sum_{i=1}^{\infty} \gamma_i (\psi_i(v) - \psi_i(u))$ converges a.s. (see [15, page 132]); the convergence of $\sum_{i=1}^{+\infty} Y_i^1$ and $\sum_{i=1}^{+\infty} Y_i^2$ will then entail that of $\sum_{i=1}^{+\infty} Y_i^3$. Now:

$$P\left( \left( \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^j| \leq 1\} \right)^c \right) = P\left( \left( \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right)^c \cup \left( \{|Y_i^j| > 1\} \cap \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right) \right) \leq P\left( \tfrac{\Gamma_i}{i} \notin [\tfrac{1}{2}, 2] \right) + P\left( \{|Y_i^j| > 1\} \cap \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right).$$

$\Gamma_i$, as a sum of $i$ independent and identically distributed exponential random variables with mean 1, satisfies a large deviation principle with rate function $I(x) = x - 1 - \log(x)$ for $x > 0$ and $I(x) = \infty$ for $x \leq 0$ (see [4], p. 35), thus

$$\sum_{i \geq 1} P\left( \tfrac{\Gamma_i}{i} \notin [\tfrac{1}{2}, 2] \right) < +\infty.$$

Consider now $\sum_{i \geq 1} P\left( \{|Y_i^j| > 1\} \cap \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right)$, for $j = 1, 2$.

Case $j = 1$: let $B_i = \{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\}$.

$$P\left( \{|Y_i^1| > 1\} \cap B_i \right) = P\left( \left\{ |a'(x_i) \, i^{-1/\alpha(x_i)} f(v, x_i, V_i)| \left| \left(\tfrac{\Gamma_i}{i}\right)^{-1/\alpha(x_i)} - 1 \right| > 1 \right\} \cap B_i \right)$$
$$\leq P\left( \left\{ (2^{1/\alpha(x_i)} - 1) \, |a'(x_i) \, i^{-1/\alpha(x_i)} f(v, x_i, V_i)| > 1 \right\} \cap B_i \right) \leq P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} > K i \right),$$

where $K$ is a positive constant. Thus $\sum_{i \geq 1} P\left( \{|Y_i^1| > 1\} \cap B_i \right) < +\infty$.

Case $j = 2$: since the conditions needed on $(a', f)$ in the proof above are satisfied by $(a, f'_u)$, $\sum_{i \geq 1} P\left( \{|Y_i^2| > 1\} \cap B_i \right) < +\infty$.

It remains to prove the a.s. convergence of $\sum_{i=1}^{\infty} Y_i^j 1_{\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^j| \leq 1\}}$, $j = 1, 2$. In that view, we use the following well-known lemma:

Lemma 3.5. Let $\{X_k, k \geq 1\}$ be a sequence of random variables such that $\sum_{n=1}^{+\infty} E|X_n| < +\infty$. Then $\sum_{n=1}^{+\infty} X_n$ converges a.s.

Let us show that $\sum_{i=1}^{\infty} E\left[ |Y_i^j| 1_{\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^j| \leq 1\}} \right] < +\infty$. One has

$$E\left[ |Y_i^j| 1_{\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^j| \leq 1\}} \right] = \int_0^{\infty} P\left( \{1 \geq |Y_i^j| > x\} \cap \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right) dx \leq \int_0^1 P\left( \{|Y_i^j| > x\} \cap \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right) dx.$$

Let $B_i = \{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\}$.

Case $j = 1$:

Using the finite-increments formula applied to $y \mapsto y^{-1/\alpha(x_i)}$ on $[\frac{1}{2}, 2]$, one shows that

$$P\left( \{|Y_i^1| > x\} \cap B_i \right) \leq P\left( \left\{ |a'(x_i) \, i^{-1/\alpha(x_i)} f(v, x_i, V_i)| \left| \tfrac{\Gamma_i}{i} - 1 \right| > \frac{x c}{2^{1 + 1/c}} \right\} \cap B_i \right) \leq P\left( \left\{ |f(v, x_i, V_i)|^{\alpha(x_i)} \left| \tfrac{\Gamma_i}{i} - 1 \right|^{\alpha(x_i)} > K_c \, i \, x^{\alpha(x_i)} \right\} \cap B_i \right)$$

with $K_c := \inf_{w \in B(u, \varepsilon)} \left( \frac{c}{2^{1 + 1/c} \, |a'(w)|} \right)^{\alpha(w)} > 0$ by the assumptions on $a$ (we may assume again that $a' \not\equiv 0$, otherwise there is nothing to prove). Thus, for $x \in (0, 1)$,

$$P\left( \{|Y_i^1| > x\} \cap B_i \right) \leq P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \left| \tfrac{\Gamma_i}{i} - 1 \right|^{c} > K_c \, i \, x^{d} \right).$$

Case $d \geq 1$: fix $\theta \in (d, 1 + \frac{c}{2})$ (since $\alpha$ is continuous and $d < 2$, by decreasing $\varepsilon$ if necessary, one may ensure that $d < 1 + c/2$). By the Markov and Hölder inequalities, and the independence of $V_1$ and $\Gamma_i$,

$$P\left( \{|Y_i^1| > x\} \cap B_i \right) \leq P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)/\theta} \left| \tfrac{\Gamma_i}{i} - 1 \right|^{c/\theta} > K_c^{1/\theta} \, i^{1/\theta} x^{d/\theta} \right)$$
$$\leq \frac{1}{(K_c \, i \, x^d)^{1/\theta}} \left( E\left| \tfrac{\Gamma_i}{i} - 1 \right|^{2} \right)^{c/(2\theta)} \left( \sup_{v \in B(u, \varepsilon)} E\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \right) \right)^{1/\theta} \leq \frac{K}{x^{d/\theta}} \, \frac{1}{i^{1/\theta + c/(2\theta)}},$$

where we have used that the variance of $\Gamma_i$ is equal to $i$, and $K$ does not depend on $v$ thanks to (C4). Thus

$$E\left[ |Y_i^1| 1_{\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^1| \leq 1\}} \right] \leq \frac{K}{i^{1/\theta + c/(2\theta)}}, \quad \text{where } \tfrac{1}{\theta} + \tfrac{c}{2\theta} > 1.$$

Case $d < 1$:

$$P\left( \{|Y_i^1| > x\} \cap B_i \right) \leq \frac{1}{x^d K_c \, i} \, E\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \right) E\left| \tfrac{\Gamma_i}{i} - 1 \right|^{c} \leq K \frac{1}{x^d} \frac{1}{i} \left( E\left| \tfrac{\Gamma_i}{i} - 1 \right|^{2} \right)^{c/2} \leq K \frac{1}{i^{1 + c/2}} \frac{1}{x^d},$$

thus

$$E\left[ |Y_i^1| 1_{\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\} \cap \{|Y_i^1| \leq 1\}} \right] \leq \frac{K}{i^{1 + c/2}}, \quad \text{with } 1 + \tfrac{c}{2} > 1.$$

The case of $\sum_{i \geq 1} E\left[ |Y_i^2| 1_{B_i \cap \{|Y_i^2| \leq 1\}} \right]$ is treated similarly, since the conditions required on $(a', f)$ in the proof above are satisfied by $(a, f'_u)$. For $j = 3$, we also have $\sum_{i \geq 1} E\left[ |Y_i^3| 1_{B_i \cap \{|Y_i^3| \leq 1\}} \right] < +\infty$, since $P\left( \{|Y_i^3| > x\} \cap B_i \right) \leq K \frac{(\log i)^{d/\theta}}{x^{d/\theta} \, i^{1/\theta + c/(2\theta)}}$ for $d \geq 1$ and $P\left( \{|Y_i^3| > x\} \cap B_i \right) \leq K \frac{(\log i)^{d}}{x^{d} \, i^{1 + c/2}}$ for $d < 1$.

Thus, for $j = 1, 2, 3$, $\sum_{i=1}^{+\infty} Y_i^j$ converges almost surely. We now move to the last two steps of the proof: to verify $h$-localisability, we need to check that, for some $\eta$ such that $h < \eta < 1$, $P(|\sum_{i=1}^{\infty} Z_i^j| \geq |v - u|^{\eta - 1})$ and $P(|\sum_{i=1}^{\infty} Y_i^j| \geq |v - u|^{\eta - 1})$ tend to 0 when $v$ tends to $u$, for $j = 1, 2, 3$.

Third step: verification of (1.3) for $\sum_{i=1}^{\infty} Z_i^j$, $j = 1, 2, 3$.

We need to estimate $P(|\sum_{i=1}^{\infty} Z_i^j| \geq |v - u|^{\eta - 1})$. Let $\eta \in (0, 1)$ and $a \in (0, 1 - \eta)$. Then

$$P\left( \left| \sum_{i=1}^{\infty} Z_i^j \right| > |v - u|^{\eta - 1} \right) \leq P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| \leq |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right) + P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| > |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right).$$

Since $\gamma_i$ is independent from $\gamma_k$ for $i \neq k$, and $|Z_i^j|$ is independent of $\gamma_i$, the Markov inequality yields

$$P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| \leq |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right) \leq 4 \, |v - u|^{2(1 - \eta)} \sum_{i=1}^{\infty} E\left[ |Z_i^j|^2 1_{|Z_i^j| \leq |v - u|^{-a}} \right].$$

Let $\theta \in (d, 2)$. For any $M > 1$, we get:

$$E\left[ |Z_i^j|^2 1_{|Z_i^j| \leq M} \right] = M^2 E\left[ \frac{|Z_i^j|^2}{M^2} 1_{|Z_i^j| \leq M} \right] \leq M^2 E\left[ \left( \frac{|Z_i^j|}{M} \right)^{\theta} 1_{|Z_i^j| \leq M} \right] \leq M^2 \int_0^1 P(|Z_i^j| > M x^{1/\theta}) \, dx.$$

For $j = 1$, we use Lemma 3.4 again: there exists a positive constant $K$ such that, for any $M > 1$,

$$\sum_{i=1}^{\infty} E\left[ |Z_i^1|^2 1_{|Z_i^1| \leq M} \right] \leq M^2 K \int_0^1 \frac{dx}{\min(M^c x^{c/\theta}, M^d x^{d/\theta})} \leq M^{2 - c} K \int_0^1 \frac{dx}{\min(x^{c/\theta}, x^{d/\theta})}.$$

Thus, there exists a positive constant $K$ such that

$$\sum_{i=1}^{\infty} E\left[ |Z_i^1|^2 1_{|Z_i^1| \leq M} \right] \leq K M^{2 - c}.$$

The same conclusion holds for $j = 2$: $\sum_{i=1}^{\infty} E\left[ |Z_i^2|^2 1_{|Z_i^2| \leq M} \right] \leq K M^{2 - c}$.

For $j = 3$, fix $\lambda > 0$. Then

$$P(|Z_i^3| > \lambda) \leq P\left( |f(v, w_i, V_i)|^{\alpha(w_i)} > K \lambda^{\alpha(w_i)} \frac{i}{(\log i)^{\alpha(w_i)}} \right),$$

where $K := \inf_{w \in B(u, \varepsilon)} \left( \frac{\alpha(w)^2}{|a(w) \, \alpha'(w)|} \right)^{\alpha(w)} > 0$ by the assumptions on $a$ and $\alpha$ (we may assume without loss of generality that $a \alpha' \not\equiv 0$, otherwise there is nothing to prove). In the sequel, $K$ will always denote a finite positive constant, which may however change from line to line.

Let $g_i(x) = \frac{x}{(\log x)^{\alpha(w_i)}}$ for $x > 1$ and $i \in \mathbb{N}$. For $x$ large enough and for all $i$, $g_i$ is strictly increasing, and $\lim_{x \to +\infty} g_i(x) = +\infty$. For $z$ large enough (independently of $i$),

$$g_i\left( 2z (\log z)^{\alpha(w_i)} \right) = \frac{2z (\log z)^{\alpha(w_i)}}{\left( \log(2z) + \alpha(w_i) \log \log z \right)^{\alpha(w_i)}} \geq z.$$

Let $A \in \mathbb{R}$, $A > e$, be such that for all $z \geq A$ and all $i \in \mathbb{N}$, $g_i^{-1}(z) \leq 2z (\log z)^{\alpha(w_i)}$. $A$ depends only on $\alpha$. Let $U_i = |f(v, w_i, V_i)|^{\alpha(w_i)}$, and let $i^* \in \mathbb{N}$, $i^* \geq 3$, depending only on $\alpha$, be such that $\frac{i}{(\log i)^{\alpha(w_i)}} \geq A$ for all $i \geq i^*$. Then:

For $i \geq i^*$,

$$P(|Z_i^3| > \lambda) \leq P\left( \frac{U_i}{K \lambda^{\alpha(w_i)}} > \frac{i}{(\log i)^{\alpha(w_i)}} \right) \qquad (3.12)$$
$$\leq P\left( i \leq K \frac{U_i}{\lambda^{\alpha(w_i)}} \left( \log \frac{U_i}{K \lambda^{\alpha(w_i)}} \right)^{\alpha(w_i)} \right) \leq P\left( i \leq K \frac{U_i \, |\log U_i|^{\alpha(w_i)}}{\lambda^{\alpha(w_i)}} + K \frac{U_i}{\lambda^{\alpha(w_i)}} + K \frac{|\log \lambda|^{\alpha(w_i)}}{\lambda^{\alpha(w_i)}} U_i \right)$$
$$\leq P\left( \sup_{w \in B(u, \varepsilon)} \left[ |f(v, w, V_1) \log |f(v, w, V_1)||^{\alpha(w)} \right] \geq K i \min(\lambda^c, \lambda^d) \right) + P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \geq K i \min(\lambda^c, \lambda^d) \right) + P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \geq K i \min\left( \left( \tfrac{\lambda}{|\log \lambda|} \right)^c, \left( \tfrac{\lambda}{|\log \lambda|} \right)^d \right) \right). \qquad (3.13)$$

Finally, with (C4) and (C2), for $M > e$,

$$\sum_{i = i^*}^{\infty} \int_0^1 P(|Z_i^3| > M x^{1/\theta}) \, dx \leq K \int_0^1 \frac{dx}{\min(M^c x^{c/\theta}, M^d x^{d/\theta})} + K \int_0^1 \frac{dx}{\min\left( \left( \frac{M x^{1/\theta}}{|\log M x^{1/\theta}|} \right)^c, \left( \frac{M x^{1/\theta}}{|\log M x^{1/\theta}|} \right)^d \right)} \leq K \frac{|\log M|^d}{M^c}.$$

We then get $\sum_{i = i^*}^{\infty} E\left[ |Z_i^3|^2 1_{|Z_i^3| \leq M} \right] \leq K |\log M|^d M^{2 - c}$.

Let $M = |v - u|^{-a}$. Using the previously obtained inequalities, and since $\sum_{i=1}^{i^* - 1} E\left[ |Z_i^3|^2 1_{|Z_i^3| \leq M} \right] \leq i^* M^2$, we get, for $j = 1, 2, 3$:

$$P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| \leq |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right) \leq K |v - u|^{2(1 - \eta - a)}$$

and

$$\lim_{v \to u} P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| \leq |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right) = 0.$$

We now consider the second term $P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| > |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right)$. Let $i_0 = \inf\{n \geq 1 : \forall i \geq n, \ |Z_i^j| \leq |v - u|^{-a}\}$. Since $\sum_{i \geq 1} P(|Z_i^j| > |v - u|^{-a}) < +\infty$, the Borel-Cantelli lemma yields $P(i_0 = +\infty) = 0$. As a consequence,

$$P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| > |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right) = \sum_{n=1}^{\infty} P\left( \left\{ \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| > |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right\} \cap \{i_0 = n\} \right)$$
$$= \sum_{n=2}^{\infty} P\left( \left\{ \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| > |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right\} \cap \{i_0 = n\} \right) \leq \sum_{n=2}^{\infty} P(i_0 = n).$$

For $n \geq 2$, $P(i_0 = n) \leq P(|Z_{n-1}^j| > |v - u|^{-a})$.

For $j = 1$, $P(i_0 = n) \leq P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} > |v - u|^{-ac} K (n - 1) \right)$, and thus

$$\sum_{n=2}^{\infty} P(i_0 = n) \leq K |v - u|^{ac} \, E\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \right) \leq K |v - u|^{ac} \sup_{t \in B(u, \varepsilon)} E\left( \sup_{w \in B(u, \varepsilon)} |f(t, w, V_1)|^{\alpha(w)} \right).$$

For $j = 2$,

$$\sum_{n=2}^{\infty} P(i_0 = n) \leq K |v - u|^{ac} \sup_{t \in B(u, \varepsilon)} E\left( \sup_{w \in B(u, \varepsilon)} |f'_u(t, w, V_1)|^{\alpha(w)} \right),$$

and for $j = 3$,

$$\sum_{n=2}^{\infty} P(i_0 = n) \leq \sum_{i=1}^{i^* - 1} P(|Z_i^3| > |v - u|^{-a}) + \sum_{i = i^*}^{\infty} P(|Z_i^3| > |v - u|^{-a}).$$

The first term on the right-hand side is a finite sum, and it is bounded from above by $K |v - u|^{ac}$, because there exists a positive constant $K$ such that

$$P(|Z_i^3| > |v - u|^{-a}) \leq P\left( \sup_{w \in B(u, \varepsilon)} |f(v, w, V_1)|^{\alpha(w)} \geq \frac{K i}{(\log i)^d} \, |v - u|^{-ac} \right).$$

The second term, using the previous inequalities estimating $P(|Z_i^3| > \lambda)$ for $i \geq i^*$, is bounded from above by $K |v - u|^{ac} |\log |v - u||^d$. Finally,

$$\lim_{v \to u} P\left( \left| \sum_{i=1}^{\infty} Z_i^j 1_{|Z_i^j| > |v - u|^{-a}} \right| > \frac{|v - u|^{\eta - 1}}{2} \right) = 0.$$

Fourth step: verification of (1.3) for $\sum_{i=1}^{\infty} Y_i^j$, $j = 1, 2, 3$.

We now consider $P(|\sum_{i=1}^{\infty} Y_i^j| \geq |v - u|^{\eta - 1})$. Note that, with $B_i = \{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\}$, we have $\sum_{i \geq 1} P\left( \{|Y_i^3| > 1\} \cap B_i \right) < +\infty$. Indeed, for $j = 3$ and $i > 1$,

$$P\left( \{|Y_i^3| > 1\} \cap B_i \right) = P\left( \left\{ \left| a(x_i) \frac{\alpha'(x_i)}{\alpha(x_i)^2} \log(i) \, i^{-1/\alpha(x_i)} f(v, x_i, V_i) \right| \left| \frac{\log \Gamma_i}{\log i} \left( \frac{\Gamma_i}{i} \right)^{-1/\alpha(x_i)} - 1 \right| > 1 \right\} \cap B_i \right).$$

The sequence $\left( \frac{\log \Gamma_i}{\log i} \left( \frac{\Gamma_i}{i} \right)^{-1/\alpha(x_i)} - 1 \right)_{i > 1}$ is bounded on $\{\frac{1}{2} \leq \frac{\Gamma_i}{i} \leq 2\}$, thus there exists $K > 0$ such that, for $i \geq 3$,

$$P\left( \{|Y_i^3| > 1\} \cap \{\tfrac{1}{2} \leq \tfrac{\Gamma_i}{i} \leq 2\} \right) \leq P\left( |f(v, x_i, V_i)|^{\alpha(x_i)} > \frac{K i}{(\log i)^{\alpha(x_i)}} \right).$$

Following the computations done after equation (3.12), we conclude that

$$\sum_{i \geq 1} P\left( |f(v, x_i, V_i)|^{\alpha(x_i)} > \frac{K i}{(\log i)^{\alpha(x_i)}} \right) < +\infty,$$

and then $\sum_{i \geq 1} P\left( \{|Y_i^3| > 1\} \cap B_i \right) < +\infty$.

    Let i0 = inf{n 1 : i n, |Yji | 1 and

    12

    ii

    2}.

    Sincei1

    P({|Yji | > 1} {i 2i}) < +, the Borel-Cantelli lemma yields

    P(i0 = +) = 0. As a consequence,

    P(|i=1

    Yji | |v u|1) =

    n1

    P

    {|

    i=1

    Yji | |v u|1} {i0 = n}

    .

    Let bn(v) = P

    {|i=1

    Yji | |v u|1} {i0 = n}

    . Our strategy is the following: we show

    that, for each fixed n, bn(v) tends to 0 when v tends to u. Then we prove that there existsa summable sequence (cn)n such that, for all n and all v, bn(v) cn. We conclude using thedominated convergence theorem that

    n1 bn(v) tends to 0 when v tends to u. For all n 1,

    bn(v) P(|n1i=1

    Yji | |v u|1

    2) + P(|

    i=n

    Yji 1{|Yji |1}{ 12ii2}

    | |v u|1

    2).


For $n \ge 2$, consider $P\big(\big|\sum_{i=1}^{n-1} Y_i^j\big| \ge \frac{|v-u|^{1-\delta}}{2}\big)$. We have

$$
P\Big(\Big|\sum_{i=1}^{n-1} Y_i^j\Big| \ge \frac{|v-u|^{1-\delta}}{2}\Big) \le \sum_{i=1}^{n-1} P\Big(|Y_i^j| \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big).
$$

Let $p \in (0, \frac{c}{d})$. With $K$ a positive constant depending on $n$ but not on $v$, we have, for $j = 1$:

$$
P\Big(|Y_i^1| \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big) \le P\Big(\Big(\sup_{w\in B(u,\varepsilon)} |f(v,w,V_i)|^{\alpha(w)}\Big)^p\, \Big|\Big(\frac{\Gamma_i}{i}\Big)^{-1/\alpha(x_i)} - 1\Big|^{p\alpha(x_i)} \ge K|v-u|^{pc(1-\delta)}\Big).
$$

We use the following chain of inequalities:

$$
\Big|\Big(\frac{\Gamma_i}{i}\Big)^{-1/\alpha(x_i)} - 1\Big|^{p\alpha(x_i)} = \Big|\Big(\frac{\Gamma_i}{i}\Big)^{-1/\alpha(x_i)} - 1\Big|^{p\alpha(x_i)}\, \mathbf{1}_{\Gamma_i \le i} + \Big|\Big(\frac{\Gamma_i}{i}\Big)^{-1/\alpha(x_i)} - 1\Big|^{p\alpha(x_i)}\, \mathbf{1}_{\Gamma_i > i}.
$$


We consider now the case $j = 3$. When $i = 1$:

$$
P\Big(|Y_1^3| \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big) = P\Big(\big|\log(\Gamma_1)\, \Gamma_1^{-1/\alpha(x_1)} f(v,x_1,V_1)\big|\, \frac{|a(x_1)\,\alpha'(x_1)|}{\alpha(x_1)^2} \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big) \le K|v-u|^{pc(1-\delta)}\, E\Big(\Big(\frac{|\log(\Gamma_1)|^c + |\log(\Gamma_1)|^d}{\Gamma_1}\Big)^p\Big)
$$

($K$ depends on $n$ but not on $v$). Since $p < 1$, $E\big(\big(\frac{|\log(\Gamma_1)|^c+|\log(\Gamma_1)|^d}{\Gamma_1}\big)^p\big) < +\infty$, and

$$
\lim_{v\to u} P\Big(|Y_1^3| \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big) = 0.
$$

For $i \ge 2$,

$$
P\Big(|Y_i^3| \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big) = P\Big(\Big|\frac{\log(\Gamma_i)}{\log(i)}\Big(\frac{\Gamma_i}{i}\Big)^{-1/\alpha(x_i)} - 1\Big|\, \Big|f(v,x_i,V_i)\, \frac{a(x_i)\,\alpha'(x_i)}{\alpha(x_i)^2}\Big| \ge \frac{i^{1/\alpha(x_i)}\, |v-u|^{1-\delta}}{2(n-1)\log(i)}\Big).
$$

One has:

$$
\Big|\frac{\log(\Gamma_i)}{\log(i)}\Big(\frac{\Gamma_i}{i}\Big)^{-1/\alpha(x_i)} - 1\Big|^{\alpha(x_i)p} \le \Big|\frac{\log(\Gamma_i)}{\log(i)}\Big(\frac{\Gamma_i}{i}\Big)^{-1/c} - 1\Big|^{cp} + \Big|\frac{\log(\Gamma_i)}{\log(i)}\Big(\frac{\Gamma_i}{i}\Big)^{-1/c} - 1\Big|^{dp} + \Big|\frac{\log(\Gamma_i)}{\log(i)}\Big(\frac{\Gamma_i}{i}\Big)^{-1/d} - 1\Big|^{cp} + \Big|\frac{\log(\Gamma_i)}{\log(i)}\Big(\frac{\Gamma_i}{i}\Big)^{-1/d} - 1\Big|^{dp}.
$$

Since $p \in (0, \frac{c}{d})$, the four terms on the right-hand side of the above inequality have finite expectation (use again the density of $\Gamma_i$). Reasoning as in the case of $Y_1^3$, one gets:

$$
\lim_{v\to u} P\Big(|Y_i^3| \ge \frac{|v-u|^{1-\delta}}{2(n-1)}\Big) = 0.
$$

Finally, we have, for $j \in \{1, 2, 3\}$,

$$
\lim_{v\to u} P\Big(\Big|\sum_{i=1}^{n-1} Y_i^j\Big| \ge \frac{|v-u|^{1-\delta}}{2}\Big) = 0.
$$

Let us now consider, for $n \ge 1$, $P\big(\big|\sum_{i=n}^{\infty} Y_i^j\, \mathbf{1}_{\{|Y_i^j|\le1\}\cap\{\frac12\le\frac{\Gamma_i}{i}\le2\}}\big| \ge \frac{|v-u|^{1-\delta}}{2}\big)$:

$$
P\Big(\Big|\sum_{i=n}^{\infty} Y_i^j\, \mathbf{1}_{\{|Y_i^j|\le1\}\cap\{\frac12\le\frac{\Gamma_i}{i}\le2\}}\Big| \ge \frac{|v-u|^{1-\delta}}{2}\Big) \le \frac{2}{|v-u|^{1-\delta}}\, E\Big(\Big|\sum_{i=n}^{\infty} Y_i^j\, \mathbf{1}_{\{|Y_i^j|\le1\}\cap\{\frac12\le\frac{\Gamma_i}{i}\le2\}}\Big|\Big) \le K|v-u|^{\delta}
$$

(recall that the constants $K$ used in bounding the series $\sum_i E\big(|Y_i^j|\, \mathbf{1}_{\{|Y_i^j|\le1\}\cap\{\frac12\le\frac{\Gamma_i}{i}\le2\}}\big)$ do not depend on $v$). Thus $b_n(v) \to 0$ when $v \to u$ for each $n$. In view of using the dominated convergence theorem, we compute (recall that $B_i = \{\tfrac12 \le \tfrac{\Gamma_i}{i} \le 2\}$):

$$
b_n(v) \le P(\{i_0 = n\}) \le P\big(\{|Y_{n-1}^j| > 1\} \cup B_{n-1}^c\big) \le P\big(\{|Y_{n-1}^j| > 1\} \cap B_{n-1}\big) + P\Big(\Big|\frac{\Gamma_{n-1}}{n-1} - 1\Big| \ge \frac12\Big).
$$


For $j = 1$ and $d \ge 1$,

$$
P\big(\{|Y_{n-1}^1| > 1\} \cap B_{n-1}\big) \le \frac{K}{(n-1)^{1/d + c/2}}\, \Big(\sup_{t\in B(u,\varepsilon)} E\Big(\sup_{w\in B(u,\varepsilon)} |f(t,w,V_1)|^{\alpha(w)}\Big)\Big)^{1/d},
$$

and if $d < 1$,

$$
P\big(\{|Y_{n-1}^1| > 1\} \cap B_{n-1}\big) \le \frac{K}{(n-1)^{1 + c/2}}\, \sup_{t\in B(u,\varepsilon)} E\Big(\sup_{w\in B(u,\varepsilon)} |f(t,w,V_1)|^{\alpha(w)}\Big).
$$

The same conclusion holds for $j = 2$. The case $j = 3$ is treated in a similar way (with an additional logarithmic term).

4 A Ferguson - Klass - LePage representation of localisable processes in the σ-finite measure space case

When the space E has infinite measure, one cannot use the representation above. This is a drawback, because many applications deal with processes defined on the real line, i.e. E = R and m the Lebesgue measure. In the σ-finite case, one may perform a change of measure that allows one to reduce to the finite case (see [20], Proposition 3.11.3). Section 5 contains specific examples of changes of measure. In terms of localisability, this translates into adding a natural condition involving the kernel and the change of measure:

Theorem 4.6 Let $(E, \mathcal{E}, m)$ be a $\sigma$-finite measure space. Let $r : E \to \mathbb{R}_+$ be such that $\hat{m}(dx) = \frac{1}{r(x)}\, m(dx)$ is a probability measure. Let $\alpha$ be a $C^1$ function defined on $\mathbb{R}$ and ranging in $(0,2)$. Let $b$ be a $C^1$ function defined on $\mathbb{R}$. Let $f(t,u,\cdot)$ be a family of functions such that, for all $(t,u) \in \mathbb{R}^2$, $f(t,u,\cdot) \in \mathcal{F}_{\alpha(u)}(E, \mathcal{E}, m)$. Consider the following random field:

$$
X(t,u) = b(u)\, C_{\alpha(u)}^{1/\alpha(u)} \sum_{i=1}^{\infty} \gamma_i\, \Gamma_i^{-1/\alpha(u)}\, r(V_i)^{1/\alpha(u)} f(t,u,V_i). \tag{4.14}
$$

Assume $X(\cdot,u)$ is localisable at $u$ with exponent $h \in (0,1)$ and local form $X'_u(\cdot,u)$, and that there exists $\varepsilon > 0$ such that:

(Cs1) $v \mapsto f(t,v,x)$ is differentiable on $B(u,\varepsilon)$ for any $t \in B(u,\varepsilon)$ and almost all $x \in E$ (the derivative of $f$ with respect to $v$ is denoted by $f'_v$),

(Cs2)
$$
\sup_{t\in B(u,\varepsilon)} \int_E \sup_{w\in B(u,\varepsilon)} \big|f(t,w,x)\log|f(t,w,x)|\big|^{\alpha(w)}\, m(dx) < \infty, \tag{4.15}
$$

(Cs3)
$$
\sup_{t\in B(u,\varepsilon)} \int_E \sup_{w\in B(u,\varepsilon)} \big(|f'_v(t,w,x)|^{\alpha(w)}\big)\, m(dx) < \infty, \tag{4.16}
$$

(Cs4)
$$
\sup_{t\in B(u,\varepsilon)} \int_E \sup_{w\in B(u,\varepsilon)} \big|f(t,w,x)\log(r(x))\big|^{\alpha(w)}\, m(dx) < \infty. \tag{4.17}
$$

Then $Y(t) \equiv X(t,t)$ is localisable at $u$ with exponent $h$ and local form $Y'_u(t) = X'_u(t,u)$.

Remark: from (4.14), it may seem as though the process Y depends on the particular change of measure used, i.e. on the choice of r. However, this is not the case: see Proposition 6.13.

Proof We apply Theorem 3.3 with $g(t,w,x) = r(x)^{1/\alpha(w)} f(t,w,x)$ on $(E, \mathcal{E}, \hat{m})$.

By (Cs1), the family of functions $v \mapsto f(t,v,x)$ is differentiable for all $(v,t)$ in a neighbourhood of $u$ and almost all $x$ in $E$; thus $v \mapsto g(t,v,x)$ is differentiable and (C1) holds.

Choose $\varepsilon > 0$ such that (Cs2) and (Cs4) hold. Then

$$
\int_E \sup_{w\in B(u,\varepsilon)} \big|g(t,w,x)\log|g(t,w,x)|\big|^{\alpha(w)}\, \hat{m}(dx) = \int_E \sup_{w\in B(u,\varepsilon)} \big|f(t,w,x)\log|r(x)^{1/\alpha(w)} f(t,w,x)|\big|^{\alpha(w)}\, m(dx).
$$

Expanding the logarithm above and using the inequality $|a+b|^{\lambda} \le \max(1, 2^{\lambda-1})(|a|^{\lambda}+|b|^{\lambda})$, valid for all real numbers $a, b$ and all positive $\lambda$, one sees that (C2) holds.

Choose $\varepsilon > 0$ such that (Cs3) and (Cs4) hold. Then

$$
g'_v(t,w,x) = r(x)^{1/\alpha(w)}\Big(f'_v(t,w,x) - \frac{\alpha'(w)}{\alpha^2(w)}\,\log(r(x))\, f(t,w,x)\Big)
$$

and

$$
\int_E \sup_{w\in B(u,\varepsilon)} \big(|g'_v(t,w,x)|^{\alpha(w)}\big)\, \hat{m}(dx) = \int_E \sup_{w\in B(u,\varepsilon)} \Big|f'_v(t,w,x) - \frac{\alpha'(w)}{\alpha^2(w)}\,\log(r(x))\, f(t,w,x)\Big|^{\alpha(w)}\, m(dx).
$$

The inequality $|a+b|^{\lambda} \le \max(1, 2^{\lambda-1})(|a|^{\lambda}+|b|^{\lambda})$ shows that (C3) holds.

5 Examples of localisable processes

In this section, we apply the results above to obtain some localisable processes of interest, in particular multistable versions of several classical processes. Similar multistable extensions were considered in [8], to which the interested reader may refer for comparison. We first recall some definitions. In the sequel, $M$ will denote a symmetric $\alpha$-stable ($0 < \alpha < 2$) random measure on $\mathbb{R}$ with control measure the Lebesgue measure $\mathcal{L}$. We write

$$
L_{\alpha}(t) := \int_0^t M(dz)
$$

for $\alpha$-stable Lévy motion. The log-fractional stable motion is defined as

$$
\Lambda_{\alpha}(t) = \int_{\mathbb{R}} \big(\log(|t-x|) - \log(|x|)\big)\, M(dx) \qquad (t \in \mathbb{R}).
$$

This process is well-defined only for $\alpha \in (1,2]$ (the integrand does not belong to $\mathcal{F}_{\alpha}$ for $\alpha \le 1$). Both Lévy motion and log-fractional stable motion are $1/\alpha$-self-similar with stationary increments. The following process is called linear fractional $\alpha$-stable motion:

$$
L_{\alpha,H,b^+,b^-}(t) = \int_{\mathbb{R}} f_{\alpha,H}(b^+, b^-, t, x)\, M(dx),
$$

where $t \in \mathbb{R}$, $H \in (0,1)$, $b^+, b^- \in \mathbb{R}$, and

$$
f_{\alpha,H}(b^+, b^-, t, x) = b^+ \big((t-x)_+^{H-1/\alpha} - (-x)_+^{H-1/\alpha}\big) + b^- \big((t-x)_-^{H-1/\alpha} - (-x)_-^{H-1/\alpha}\big).
$$

$L_{\alpha,H,b^+,b^-}$ is an $H$-sssi process. If $b^+ = b^- = 1$, this process is called well-balanced linear fractional $\alpha$-stable motion, denoted $L_{\alpha,H}$. The reverse Ornstein-Uhlenbeck process is:

$$
Y(t) = \int_t^{+\infty} \exp(-\lambda(x-t))\, M(dx) \qquad (t \in \mathbb{R},\ \lambda > 0).
$$

The localisability of Lévy motion, log-fractional stable motion and linear fractional $\alpha$-stable motion stems from the fact that they are sssi. The localisability of the reverse Ornstein-Uhlenbeck process is proved in [7]. We now define multistable versions of these processes.

Theorem 5.7 (Symmetric multistable Lévy motion, compact case) Let $\alpha : [0,T] \to (1,2)$ and $b : [0,T] \to (0,+\infty)$ be continuously differentiable. Let $\hat{m}(dx)$ be the uniform distribution on $[0,T]$. Define

$$
Y(t) = b(t)\, C_{\alpha(t)}^{1/\alpha(t)}\, T^{1/\alpha(t)} \sum_{i=1}^{+\infty} \gamma_i\, \Gamma_i^{-1/\alpha(t)}\, \mathbf{1}_{[0,t]}(V_i) \qquad (t \in [0,T]). \tag{5.18}
$$

Then $Y$ is $1/\alpha(u)$-localisable at any $u \in (0,T)$, with local form $Y'_u = b(u)\, L_{\alpha(u)}$.

    The proof is a simple application of Theorem 3.3, and is omitted.
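As an illustration, the series (5.18) is straightforward to simulate by truncating it at a finite number of terms. The sketch below (in Python; the function names, the truncation level and the use of the closed form $C_\alpha = (1-\alpha)/(\Gamma(2-\alpha)\cos(\pi\alpha/2))$ for the normalizing constant are our choices, not part of the theorem) generates an approximate path of symmetric multistable Lévy motion:

```python
import math
import random

def stable_norm_const(alpha):
    # C_alpha = (integral_0^infty x^(-alpha) sin(x) dx)^(-1)
    #         = (1 - alpha) / (Gamma(2 - alpha) * cos(pi * alpha / 2)),
    # valid for alpha in (0, 2), alpha != 1.
    return (1.0 - alpha) / (math.gamma(2.0 - alpha) * math.cos(math.pi * alpha / 2.0))

def multistable_levy_path(alpha, b, T=1.0, n_terms=2000, n_points=200, rng=random):
    """Naive truncation of the series (5.18):
    Y(t) ~ b(t) C_{alpha(t)}^{1/alpha(t)} T^{1/alpha(t)}
           * sum_{i <= n_terms} gamma_i Gamma_i^{-1/alpha(t)} 1_{[0,t]}(V_i)."""
    arrivals, signs, sites = [], [], []
    s = 0.0
    for _ in range(n_terms):
        s += rng.expovariate(1.0)              # Gamma_i: Poisson arrival times
        arrivals.append(s)
        signs.append(rng.choice((-1.0, 1.0)))  # gamma_i: Rademacher signs
        sites.append(rng.uniform(0.0, T))      # V_i ~ uniform on [0, T]
    ts = [k * T / (n_points - 1) for k in range(n_points)]
    path = []
    for t in ts:
        a = alpha(t)
        partial = sum(g * G ** (-1.0 / a)
                      for g, G, v in zip(signs, arrivals, sites) if v <= t)
        path.append(b(t) * stable_norm_const(a) ** (1.0 / a) * T ** (1.0 / a) * partial)
    return ts, path
```

A few thousand terms already give visually stable paths; the truncation error is not analyzed here.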

Theorem 5.8 (Symmetric multistable Lévy motion, non-compact case) Let $\alpha : \mathbb{R} \to (1,2)$ and $b : \mathbb{R} \to (0,+\infty)$ be continuously differentiable. Let $\hat{m}(dx) = \sum_{j=1}^{+\infty} 2^{-j}\, \mathbf{1}_{[j-1,j[}(x)\, dx$ on $\mathbb{R}$. Define

$$
Y(t) = b(t)\, C_{\alpha(t)}^{1/\alpha(t)} \sum_{i=1}^{+\infty} \sum_{j=1}^{+\infty} \gamma_i\, \Gamma_i^{-1/\alpha(t)}\, 2^{j/\alpha(t)}\, \mathbf{1}_{[0,t]\cap[j-1,j[}(V_i) \qquad (t \in \mathbb{R}_+). \tag{5.19}
$$

Then $Y$ is $1/\alpha(u)$-localisable at any $u \in \mathbb{R}_+$, with local form $Y'_u = b(u)\, L_{\alpha(u)}$.

Again, the proof is a straightforward application of Theorem 4.6, with $m(dx) = dx$, $r(x) = \sum_{j=1}^{+\infty} 2^{j}\, \mathbf{1}_{[j-1,j[}(x)$ and $f(t,u,x) = \mathbf{1}_{[0,t]}(x)$, and is omitted.
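Concretely, the change of measure behind (5.19) is easy to sample: draw the interval index $j$ with geometric weight $2^{-j}$, draw $V_i$ uniformly in $[j-1, j)$, and restore the factor $r(V_i)^{1/\alpha(t)} = 2^{j/\alpha(t)}$ in the summand. A minimal Python sketch (all names are ours):

```python
import math
import random

def sample_site(rng=random):
    """Draw (j, V) with V ~ m_hat(dx) = sum_{j>=1} 2^{-j} 1_{[j-1, j)}(x) dx:
    geometric choice of the unit interval, then a uniform point inside it."""
    j = 1
    while rng.random() >= 0.5:  # P(j = k) = 2^{-k}
        j += 1
    return j, rng.uniform(j - 1.0, j)

def partial_sum_5_19(t, alpha_t, n_terms=1000, rng=random):
    """Truncated inner sum of (5.19) at a fixed time t, without the
    b(t) C_{alpha(t)}^{1/alpha(t)} prefactor:
    sum_i gamma_i Gamma_i^{-1/alpha(t)} 2^{j_i/alpha(t)} 1_{[0,t]}(V_i),
    where 2^{j_i/alpha(t)} = r(V_i)^{1/alpha(t)} compensates the change of measure."""
    s, total = 0.0, 0.0
    for _ in range(n_terms):
        s += rng.expovariate(1.0)       # Gamma_i
        sign = rng.choice((-1.0, 1.0))  # gamma_i
        j, v = sample_site(rng)
        if v <= t:
            total += sign * s ** (-1.0 / alpha_t) * 2.0 ** (j / alpha_t)
    return total
```

The same pattern — a discrete draw of the interval followed by a uniform draw inside it, with the weight $r(V)^{1/\alpha}$ reinstated in the sum — applies verbatim to the measures used in Theorems 5.9-5.11.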

Theorem 5.9 (Log-fractional multistable motion) Let $\alpha : \mathbb{R} \to (1,2)$ and $b : \mathbb{R} \to (0,+\infty)$ be continuously differentiable. Let $\hat{m}(dx) = \frac{3}{\pi^2} \sum_{j=1}^{+\infty} j^{-2}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x)\, dx$ on $\mathbb{R}$. Define

$$
Y(t) = b(t)\, C_{\alpha(t)}^{1/\alpha(t)} \sum_{i=1}^{+\infty} \sum_{j=1}^{+\infty} \gamma_i\, \Gamma_i^{-1/\alpha(t)}\, \big(\log|t-V_i| - \log|V_i|\big)\, \frac{\pi^{2/\alpha(t)}\, j^{2/\alpha(t)}}{3^{1/\alpha(t)}}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(V_i) \qquad (t \in \mathbb{R}). \tag{5.20}
$$

Then $Y$ is $1/\alpha(u)$-localisable at any $u \in \mathbb{R}$, with $Y'_u = b(u)\, \Lambda_{\alpha(u)}$.

Proof We apply Theorem 4.6 with $m(dx) = dx$, $r(x) = \frac{\pi^2}{3} \sum_{j=1}^{+\infty} j^{2}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x)$, $f(t,u,x) = \log(|t-x|) - \log(|x|)$ and the random field

$$
X(t,u) = b(u)\, C_{\alpha(u)}^{1/\alpha(u)} \sum_{i,j=1}^{\infty} \gamma_i\, \Gamma_i^{-1/\alpha(u)}\, \big(\log|t-V_i| - \log|V_i|\big)\, \frac{\pi^{2/\alpha(u)}\, j^{2/\alpha(u)}}{3^{1/\alpha(u)}}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(V_i).
$$

$X(\cdot,u)$ is the symmetric $\alpha(u)$-log-fractional motion. It is $\frac{1}{\alpha(u)}$-localisable with local form $X'_u(\cdot,u) = b(u)\, \Lambda_{\alpha(u)}$ [20]. The remainder of the proof is easy and is left to the reader.


Theorem 5.10 (Linear multistable multifractional motion) Let $b : \mathbb{R} \to (0,+\infty)$, $\alpha : \mathbb{R} \to (0,2)$ and $h : \mathbb{R} \to (0,1)$ be continuously differentiable. Let $\hat{m}(dx) = \frac{3}{\pi^2} \sum_{j=1}^{+\infty} j^{-2}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x)\, dx$ on $\mathbb{R}$. Define, for $t \in \mathbb{R}$,

$$
Y(t) = b(t)\, C_{\alpha(t)}^{1/\alpha(t)} \sum_{i,j=1}^{+\infty} \gamma_i\, \Gamma_i^{-1/\alpha(t)}\, \big(|t-V_i|^{h(t)-1/\alpha(t)} - |V_i|^{h(t)-1/\alpha(t)}\big)\, \Big(\frac{\pi^2 j^2}{3}\Big)^{1/\alpha(t)}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(V_i). \tag{5.21}
$$

The process $Y$ is $h(u)$-localisable at all $u \in \mathbb{R}$, with $Y'_u = b(u)\, L_{\alpha(u),h(u)}$ (the well-balanced linear fractional stable motion).

Proof We apply Theorem 4.6 with $m(dx) = dx$, $r(x) = \frac{\pi^2}{3} \sum_{j=1}^{+\infty} j^{2}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x)$, $f(t,u,x) = |t-x|^{h(u)-1/\alpha(u)} - |x|^{h(u)-1/\alpha(u)}$ and the random field

$$
X(t,u) = b(u)\, C_{\alpha(u)}^{1/\alpha(u)} \sum_{i,j=1}^{\infty} \gamma_i\, \Gamma_i^{-1/\alpha(u)}\, \big(|t-V_i|^{h(u)-1/\alpha(u)} - |V_i|^{h(u)-1/\alpha(u)}\big)\, \Big(\frac{\pi^2 j^2}{3}\Big)^{1/\alpha(u)}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(V_i).
$$

$X(\cdot,u)$ is the $(\alpha(u), h(u))$-well-balanced linear fractional stable motion, and it is $h(u)$-localisable with local form $X'_u(\cdot,u) = b(u)\, L_{\alpha(u),h(u)}$ [8]. (Cs1) is easy.

(Cs3) First, denote by (Cs5) the following condition: there exists $\varepsilon > 0$ such that

$$
\sup_{t\in B(u,\varepsilon)} \int_E \sup_{w\in B(u,\varepsilon)} \big(|f(t,w,x)|^{\alpha(w)}\big)\, m(dx) < \infty. \tag{5.22}
$$

In [8], it is shown that, given $u \in \mathbb{R}$, one may choose $\varepsilon > 0$ small enough and numbers $a, b, h_-, h_+$ with $0 < a < \alpha(w) < b < 2$, $0 < h_- < h(w) < h_+ < 1$ and $\frac{1}{a} - \frac{1}{b} < h_-$, such that

$$
|f(t,w,x)| \le k_1(t,x), \tag{5.23}
$$

where

$$
k_1(t,x) = c_1 \quad \big(|x| \le 1 + 2\max_{t\in U}|t|\big), \qquad k_1(t,x) = c_2\, |x|^{h_+ - 1 - 1/b} \quad \big(|x| > 1 + 2\max_{t\in U}|t|\big) \tag{5.24}
$$

for appropriately chosen constants $c_1$ and $c_2$. The conditions on $a, b, h_-, h_+$ entail that $\sup_{t\in U} \|k_1(t,\cdot)\|_{a,b} < \infty$, where

$$
\|k_1(t,\cdot)\|_{a,b} = \Big(\int_{\mathbb{R}} |k_1(t,x)|^a\, dx\Big)^{1/a} + \Big(\int_{\mathbb{R}} |k_1(t,x)|^b\, dx\Big)^{1/b}, \tag{5.25}
$$

and (Cs5) holds for $k_1$. (Cs3) is obtained with (5.23).

(Cs2) One computes:

$$
|f(t,w,x)\log(|f(t,w,x)|)|^{\alpha(w)} \le |f(t,w,x)|^{\alpha(w)} + |f(t,w,x)\log(|f(t,w,x)|)|^{\alpha(w)}\, \mathbf{1}_{\{|f(t,w,x)|>e\}} + |f(t,w,x)\log(|f(t,w,x)|)|^{\alpha(w)}\, \mathbf{1}_{\{|f(t,w,x)|<\frac{1}{e}\}}.
$$


Since $|f(t,w,x)| \le k_1(t,x)$, one gets

$$
|f(t,w,x)\log(|f(t,w,x)|)|\, \mathbf{1}_{\{|f(t,w,x)|>e\}} \le k_1(t,x)\log(k_1(t,x))\, \mathbf{1}_{\{|f(t,w,x)|>e\}} \le |k_1(t,x)\log(k_1(t,x))|,
$$

hence

$$
|f(t,w,x)\log(|f(t,w,x)|)|^{\alpha(w)}\, \mathbf{1}_{\{|f(t,w,x)|>e\}} \le |k_1(t,x)\log(k_1(t,x))|^{\alpha(w)}.
$$

Fix $\theta$ such that $1 < \theta < a\big(1 + \frac{1}{b} - h_+\big)$ and $\lambda$ such that $\frac{1}{\theta} < \lambda < 1$. Then:

$$
|f(t,w,x)\log(|f(t,w,x)|)|^{\alpha(w)}\, \mathbf{1}_{\{|f(t,w,x)|<\frac{1}{e}\}} \le K|f(t,w,x)|^{\lambda\alpha(w)} \le K|k_1(t,x)|^{\lambda\alpha(w)},
$$

and thus, since $|k_1|^{\lambda}$ satisfies (Cs5) and $k_1$ satisfies (Cs2), (Cs2) holds for $f$.

(Cs4) For $j^*$ large enough, one has

$$
|f(t,w,x)\log(r(x))|^{\alpha(w)} \le K_1 |k_1(t,x)|^{\alpha(w)} + K_2 \sum_{j=j^*}^{+\infty} |f(t,w,x)|^{\alpha(w)}\, (\log(j))^{d}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x).
$$

Also,

$$
|f(t,w,x)|^{\alpha(w)}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x) \le K_3\, \frac{1}{|x|^{a(1+1/b-h_+)}}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x).
$$

Thus

$$
|f(t,w,x)\log(r(x))|^{\alpha(w)} \le K_1 |k_1(t,x)|^{\alpha(w)} + K_4 \sum_{j=j^*}^{+\infty} \frac{(\log(j))^{d}}{|x|^{a(1+1/b-h_+)}}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x).
$$

To conclude, note that

$$
\int_{\mathbb{R}} \frac{(\log(j))^{d}}{|x|^{a(1+1/b-h_+)}}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x)\, dx = 2(\log(j))^{d} \int_{j-1}^{j} \frac{dx}{|x|^{a(1+1/b-h_+)}} \le 2\, \frac{(\log(j))^{d}}{(j-1)^{a(1+1/b-h_+)}},
$$

which is the general term of a convergent series, since $a(1+\frac{1}{b}-h_+) > 1$.

Theorem 5.11 (Multistable reverse Ornstein-Uhlenbeck process) Let $\lambda > 0$, $\alpha : \mathbb{R} \to (1,2)$ and $b : \mathbb{R} \to (0,+\infty)$ be continuously differentiable. Let $\hat{m}(dx) = \sum_{j=1}^{+\infty} 2^{-j-1}\, \mathbf{1}_{[-j,-j+1[\,\cup\,[j-1,j[}(x)\, dx$ on $\mathbb{R}$. Define

$$
Y(t) = b(t)\, C_{\alpha(t)}^{1/\alpha(t)} \sum_{i=1}^{+\infty} \sum_{j=1}^{+\infty} \gamma_i\, \Gamma_i^{-1/\alpha(t)}\, 2^{(j+1)/\alpha(t)}\, e^{-\lambda(V_i - t)}\, \mathbf{1}_{[t,+\infty)\cap([-j,-j+1[\,\cup\,[j-1,j[)}(V_i) \qquad (t \in \mathbb{R}). \tag{5.26}
$$

Then $Y$ is $1/\alpha(u)$-localisable at any $u \in \mathbb{R}$, with local form $Y'_u = b(u)\, L_{\alpha(u)}$.


The proof is similar to the previous ones and is omitted. We now provide an example of a process whose kernel $f$ does not satisfy the criteria of Theorem 5.2 in [8] for localisability; thus it is not possible to prove that the process associated to $f$ through the Poisson sum representation is localisable. Indeed, as is readily verified, $f$ is such that, for every open set $U \subset (0, 1/e)$, all $t \in U$ and every increasing function $\alpha$, $\|f(t,u,\cdot)\|_{c,d} = +\infty$ (recall (5.25)), where $c = \inf_{v\in U} \alpha(v)$ and $d = \sup_{v\in U} \alpha(v)$. However, it is possible to prove that the process defined by $f$ in the Ferguson - Klass - LePage representation is localisable, thanks to Theorem 3.3:

Theorem 5.12 Let $U = (0, \frac{1}{2e})$, $\alpha : U \to \mathbb{R}$, $\alpha(t) = 1 + t$. Let $\hat{m}(dx)$ be the uniform distribution on $U$. Define

$$
Y(t) = C_{\alpha(t)}^{1/\alpha(t)}\, \frac{1}{(2e)^{1/\alpha(t)}} \sum_{i=1}^{+\infty} \gamma_i\, \Gamma_i^{-1/\alpha(t)}\, \frac{1}{V_i^{1/\alpha(t)}\, |\ln V_i|^{4/\alpha(t)}}\, \mathbf{1}_{[0,t]}(V_i) \qquad (t \in U). \tag{5.27}
$$

Then $Y$ is $1/\alpha(u)$-localisable at any $u \in U$, with local form $Y'_u = \frac{1}{u^{1/\alpha(u)}\, |\ln u|^{4/\alpha(u)}}\, L_{\alpha(u)}$.

Proof Apply Theorem 4.6 with $m(dx) = dx$, $f(t,u,x) = \frac{1}{x^{1/\alpha(u)}\, |\ln x|^{4/\alpha(u)}}\, \mathbf{1}_{[0,t]}(x)$ and

$$
X(t,u) = C_{\alpha(u)}^{1/\alpha(u)}\, \frac{1}{(2e)^{1/\alpha(u)}} \sum_{i=1}^{\infty} \gamma_i\, \Gamma_i^{-1/\alpha(u)}\, \frac{1}{V_i^{1/\alpha(u)}\, |\ln V_i|^{4/\alpha(u)}}\, \mathbf{1}_{[0,t]}(V_i).
$$

The computation of the characteristic function of $X(\cdot,u)$ (as in Section 6) allows one to verify that $X(\cdot,u)$ is $\frac{1}{\alpha(u)}$-localisable with local form $X'_u(\cdot,u) = \frac{1}{u^{1/\alpha(u)}\, |\ln u|^{4/\alpha(u)}}\, L_{\alpha(u)}$.

6 Finite dimensional distributions

In this section, we compute the finite dimensional distributions of the family of processes defined in Theorem 4.6 and compare the results with the ones in [8].

Proposition 6.13 With notation as above, let $\{X(t,u),\ t, u \in \mathbb{R}\}$ be as in (4.14) and $Y(t) \equiv X(t,t)$. The finite dimensional distributions of the process $Y$ are given by

$$
E\Big(e^{i\sum_{j=1}^{m}\theta_j Y(t_j)}\Big) = \exp\Big(-2\int_E \int_0^{+\infty} \sin^2\Big(\sum_{j=1}^{m} \frac{\theta_j\, b(t_j)\, C_{\alpha(t_j)}^{1/\alpha(t_j)}}{2}\, y^{-1/\alpha(t_j)}\, f(t_j,t_j,x)\Big)\, dy\, m(dx)\Big)
$$

for $m \in \mathbb{N}$, $\theta = (\theta_1,\ldots,\theta_m) \in \mathbb{R}^m$, $t = (t_1,\ldots,t_m) \in \mathbb{R}^m$.

Proof Let $m \in \mathbb{N}$ and write $\phi_t(\theta) = E\big(e^{i\sum_{j=1}^{m}\theta_j Y(t_j)}\big)$. We proceed as in [20], Theorem 1.4.2. Let $\{U_i\}_{i\in\mathbb{N}}$ be an i.i.d. sequence of uniform random variables on $(0,1)$, independent of the sequences $\{\gamma_i\}$ and $\{V_i\}$, and let $g(t,u,x) = b(u)\, C_{\alpha(u)}^{1/\alpha(u)}\, r(x)^{1/\alpha(u)} f(t,u,x)$. For all $n \in \mathbb{N}$,

$$
\sum_{j=1}^{m} \theta_j\, n^{-1/\alpha(t_j)} \sum_{k=1}^{n} \gamma_k\, U_k^{-1/\alpha(t_j)}\, g(t_j,t_j,V_k) \overset{d}{=} \sum_{j=1}^{m} \theta_j \Big(\frac{\Gamma_{n+1}}{n}\Big)^{-1/\alpha(t_j)} \sum_{k=1}^{n} \gamma_k\, \Gamma_k^{-1/\alpha(t_j)}\, g(t_j,t_j,V_k). \tag{6.28}
$$


The right-hand side of (6.28) converges a.s. to $\sum_{j=1}^{m} \theta_j Y(t_j)$ when $n$ tends to infinity, and thus

$$
\phi_t(\theta) = \lim_{n\to+\infty} E\Big(e^{i\sum_{j=1}^{m}\theta_j n^{-1/\alpha(t_j)} \sum_{k=1}^{n} \gamma_k U_k^{-1/\alpha(t_j)} g(t_j,t_j,V_k)}\Big).
$$

Set $\phi_t^n(\theta) = E\big(e^{i\sum_{j=1}^{m}\theta_j n^{-1/\alpha(t_j)} \sum_{k=1}^{n} \gamma_k U_k^{-1/\alpha(t_j)} g(t_j,t_j,V_k)}\big)$. This function may be written as:

$$
\phi_t^n(\theta) = E\Big(\prod_{k=1}^{n} e^{i\gamma_k \sum_{j=1}^{m}\theta_j n^{-1/\alpha(t_j)} U_k^{-1/\alpha(t_j)} g(t_j,t_j,V_k)}\Big) = \Big(E\Big(e^{i\gamma_1 \sum_{j=1}^{m}\theta_j n^{-1/\alpha(t_j)} U_1^{-1/\alpha(t_j)} g(t_j,t_j,V_1)}\Big)\Big)^n,
$$

since all the sequences $\{\gamma_k\}$, $\{U_k\}$, $\{V_k\}$ are i.i.d. We now compute the expectation using conditioning and the independence of the sequences $\{\gamma_k\}$, $\{U_k\}$ and $\{V_k\}$:

$$
E\Big(e^{i\gamma_1 \sum_{j=1}^{m}\theta_j n^{-1/\alpha(t_j)} U_1^{-1/\alpha(t_j)} g(t_j,t_j,V_1)}\Big) = E\Big(E\Big(e^{i\gamma_1 \sum_{j=1}^{m}\theta_j n^{-1/\alpha(t_j)} U_1^{-1/\alpha(t_j)} g(t_j,t_j,V_1)} \,\Big|\, U_1, V_1\Big)\Big)
$$
$$
= E\Big(\cos\Big(\sum_{j=1}^{m}\theta_j\, n^{-1/\alpha(t_j)}\, U_1^{-1/\alpha(t_j)}\, g(t_j,t_j,V_1)\Big)\Big)
= E\Big(\frac{1}{n}\int_0^n \cos\Big(\sum_{j=1}^{m}\theta_j\, y^{-1/\alpha(t_j)}\, g(t_j,t_j,V_1)\Big)\, dy\Big)
$$
$$
= 1 - \frac{2}{n}\int_0^n E\Big(\sin^2\Big(\sum_{j=1}^{m}\frac{\theta_j}{2}\, y^{-1/\alpha(t_j)}\, g(t_j,t_j,V_1)\Big)\Big)\, dy.
$$

The function $\sin^2$ is positive, and thus, when $n$ tends to $+\infty$,

$$
\int_0^n E\Big(\sin^2\Big(\sum_{j=1}^{m}\frac{\theta_j}{2}\, y^{-1/\alpha(t_j)}\, g(t_j,t_j,V_1)\Big)\Big)\, dy \longrightarrow \int_0^{+\infty} E\Big(\sin^2\Big(\sum_{j=1}^{m}\frac{\theta_j}{2}\, y^{-1/\alpha(t_j)}\, g(t_j,t_j,V_1)\Big)\Big)\, dy.
$$

To conclude, note that

$$
E\Big(\sin^2\Big(\sum_{j=1}^{m}\frac{\theta_j}{2}\, y^{-1/\alpha(t_j)}\, g(t_j,t_j,V_1)\Big)\Big) = \int_E \sin^2\Big(\sum_{j=1}^{m}\frac{\theta_j}{2}\, y^{-1/\alpha(t_j)}\, g(t_j,t_j,x)\Big)\, \hat{m}(dx),
$$

and that the change of variable $y \to r(x)y$ turns $y^{-1/\alpha(t_j)} g(t_j,t_j,x)$ into $b(t_j)\, C_{\alpha(t_j)}^{1/\alpha(t_j)}\, y^{-1/\alpha(t_j)} f(t_j,t_j,x)$ and $dy\, \hat{m}(dx)$ into $dy\, m(dx)$, which yields the announced formula and shows that the finite dimensional distributions do not depend on the choice of $r$.

Comparing with Proposition 8.2 and Theorems 9.3, 9.4, 9.5 and 9.6 in [8], it is easy to prove the following corollary:

Corollary 6.14 The linear multistable multifractional motion, multistable Lévy motion, log-fractional multistable motion and multistable reverse Ornstein-Uhlenbeck process defined in Section 5 have the same finite dimensional distributions as the corresponding processes of [8].

7 Numerical experiments

We display in this section graphs of synthesized paths of some of the processes defined above. The idea is to picture how multistability translates into the behaviour of the trajectories and, in the case of linear multistable multifractional motion, to visualize the effect of both a varying $H$ and a varying $\alpha$, these two parameters corresponding to two different notions of irregularity. The synthesis method is described in [7], pages 17-18. Theoretical results concerning the convergence of this method will be presented elsewhere. The graphs on the first line of Figure 1 ((a) and (b)) display multistable Lévy motions, with $\alpha$ respectively a sine function ranging in $[1.02, 1.98]$ (shown in (c)) and increasing linearly from 1.02 to 1.98 (shown in (d)). The graph (e) displays a multistable Ornstein-Uhlenbeck process with the same sine $\alpha$ function as in (a). A linear multistable multifractional motion with linearly increasing $\alpha$ and $H$ functions is shown in (f): $H$ increases from 0.2 to 0.8 and $\alpha$ from 1.41 to 1.98 (these functions are displayed on the right part of the bottom line). The graph in (g) is a linear multistable multifractional motion with linearly increasing $\alpha$ and linearly decreasing $H$: $H$ decreases from 0.8 to 0.2 and $\alpha$ increases from 1.41 to 1.98 (these functions are displayed on the left part of the bottom line). Finally, a zoom on the second half of the process in (f) is shown, which allows one to appreciate how the graph becomes smoother as $H$ increases. In all the graphs, one sees how the variations of $\alpha$ translate into the intensity of the jumps. Additionally, in the case of linear multistable multifractional motions, the interplay between the smoothness governed by $H$ and the jumps tuned by $\alpha$ indicates that such processes may prove useful in applications.
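To give an idea of how such pictures can be produced, the sketch below naively truncates the series (5.21) defining linear multistable multifractional motion. This is not the synthesis method of [7] used for Figure 1; the sampler for $\hat{m}$, the truncation level, the closed form used for $C_\alpha$ and all names are our own choices:

```python
import math
import random

def stable_norm_const(alpha):
    # C_alpha = (1 - alpha) / (Gamma(2 - alpha) cos(pi alpha / 2)), alpha in (0,2), != 1
    return (1.0 - alpha) / (math.gamma(2.0 - alpha) * math.cos(math.pi * alpha / 2.0))

def sample_site(rng=random):
    """V ~ m_hat = (3/pi^2) sum_j j^{-2} 1_{[-j,-j+1) U [j-1,j)}(x) dx:
    pick j with P(j) = (6/pi^2) j^{-2}, then a side and a uniform point."""
    u, j, cdf = rng.random(), 0, 0.0
    while cdf <= u:
        j += 1
        cdf += 6.0 / (math.pi ** 2 * j ** 2)
    x = rng.uniform(j - 1.0, j)
    return j, x if rng.random() < 0.5 else -x

def lmmm_path(alpha, h, b=lambda t: 1.0, t_max=1.0,
              n_terms=2000, n_points=100, rng=random):
    """Naive truncation of (5.21): for each t,
    Y(t) ~ b(t) C^{1/alpha(t)} sum_i gamma_i Gamma_i^{-1/alpha(t)}
           (|t-V_i|^{h(t)-1/alpha(t)} - |V_i|^{h(t)-1/alpha(t)})
           (pi^2 j_i^2 / 3)^{1/alpha(t)}."""
    events, s = [], 0.0
    for _ in range(n_terms):
        s += rng.expovariate(1.0)       # Gamma_i
        sign = rng.choice((-1.0, 1.0))  # gamma_i
        j, v = sample_site(rng)
        events.append((sign, s, j, v))
    ts = [k * t_max / (n_points - 1) for k in range(n_points)]
    path = []
    for t in ts:
        a, e = alpha(t), h(t) - 1.0 / alpha(t)
        partial = sum(g * G ** (-1.0 / a)
                      * (abs(t - v) ** e - abs(v) ** e)
                      * (math.pi ** 2 * j ** 2 / 3.0) ** (1.0 / a)
                      for g, G, j, v in events)
        path.append(b(t) * stable_norm_const(a) ** (1.0 / a) * partial)
    return ts, path
```

Varying the slopes of the functions passed as `alpha` and `h` reproduces qualitatively the interplay between jump intensity and smoothness visible in panels (f) and (g) of Figure 1.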

    8 Acknowledgements

We thank the anonymous referee for many comments and corrections that allowed us to significantly improve our paper.

    References

[1] Ayache, A. and Lévy Véhel, J. (2000). The generalized multifractional Brownian motion. Stat. Inference Stoch. Process. 3, 7-18.

[2] Benassi, A., Jaffard, S. and Roux, D. (1997). Gaussian processes and pseudodifferential elliptic operators. Rev. Mat. Iberoamericana 13, 19-89.

[3] Bentkus, V., Juozulynas, A. and Paulauskas, V. (2001). Lévy-LePage series representation of stable vectors: convergence in variation. J. Theoret. Probab. 14 (4), 949-978.

[4] Dembo, A. and Zeitouni, O. (1998). Large Deviations Techniques and Applications, Springer-Verlag.

[5] Falconer, K.J. (2002). Tangent fields and the local structure of random fields. J. Theoret. Probab. 15, 731-750.

[6] Falconer, K.J. (2003). The local structure of random processes. J. London Math. Soc. (2) 67, 657-672.

[7] Falconer, K.J., Le Guével, R. and Lévy Véhel, J. (2008). Localisable moving average stable and multistable processes, Stochastic Models, to appear. Available at http://arxiv.org/abs/0807.0764


[Figure 1 appears here: panels (a)-(g) showing synthesized paths of multistable processes, together with the corresponding alpha and H functions; see the caption below.]

Figure 1: Paths of multistable processes. First line: Lévy multistable motions with sine (a) and linear (b) α function. Second line: (c) α function for the process in (a), (d) α function for the process in (b). Third line: (e) multistable Ornstein-Uhlenbeck process with the α function displayed in (c), and (f) linear multistable multifractional motion with increasing α and H functions. Fourth line: (g) linear multistable multifractional motion with increasing α function and decreasing H function, and zoom on the second half of the process in (f). Last line: α and H functions for the process in (g) (left), and in (f) (right).


[8] Falconer, K.J. and Lévy Véhel, J. (2008). Multifractional, multistable, and other processes with prescribed local form, J. Theoret. Probab., DOI 10.1007/s10959-008-0147-9.

[9] Ferguson, T.S. and Klass, M.J. (1972). A representation of independent increment processes without Gaussian components. Ann. Math. Stat. 43, 1634-1643.

[10] Herbin, E. (2006). From N-parameter fractional Brownian motions to N-parameter multifractional Brownian motion. Rocky Mountain J. Math. 36, 1249-1284.

[11] Kolmogorov, A.N. (1940). Wienersche Spiralen und einige andere interessante Kurven im Hilbertschen Raume, Doklady, 26, 115-118.

[12] Le Guével, R. and Lévy Véhel, J. (2008). Incremental moments and Hölder exponents of multifractional multistable processes, preprint. Available at http://arxiv.org/abs/0807.0764

[13] Le Page, R. (1980). Multidimensional infinitely divisible variables and processes. I. Stable case. Tech. Rep. 292, Dept. Statistics, Stanford Univ.

[14] Le Page, R. (1980). Multidimensional infinitely divisible variables and processes. II. In Probability in Banach Spaces III, Lecture Notes in Math. 860, 279-284, Springer, New York.

[15] Ledoux, M. and Talagrand, M. (1996). Probability in Banach Spaces. Springer-Verlag.

[16] Mandelbrot, B.B. and Van Ness, J. (1968). Fractional Brownian motions, fractional noises and applications. SIAM Rev. 10, 422-437.

[17] Peltier, R.F. and Lévy Véhel, J. (1995). Multifractional Brownian motion: definition and preliminary results. Rapport de recherche de l'INRIA, No. 2645. Available at: http://www-rocq1.inria.fr/fractales/index.php?page=publications

[18] Petrov, V. (1995). Limit Theorems of Probability Theory, Oxford Science Publications.

[19] Rosinski, J. (1990). On series representations of infinitely divisible random vectors. Ann. Probab. 18 (1), 405-430.

[20] Samorodnitsky, G. and Taqqu, M.S. (1994). Stable Non-Gaussian Random Processes, Chapman and Hall.

[21] Stoev, S. and Taqqu, M.S. (2004). Stochastic properties of the linear multifractional stable motion. Adv. Appl. Probab. 36, 1085-1115.
