
Bootstrapping Cointegrating Regressions¹

Yoosoon Chang
Department of Economics
Rice University

Joon Y. Park
School of Economics
Seoul National University

Kevin Song
Department of Economics
Yale University

    Abstract

In this paper, we consider bootstrapping cointegrating regressions. It is shown that the method of bootstrap, if properly implemented, generally yields consistent estimators and test statistics for cointegrating regressions. We do not assume any specific data generating process, and employ the sieve bootstrap based on the approximated finite-order vector autoregressions for the regression errors and the first differences of the regressors. In particular, we establish the bootstrap consistency for the OLS method. The bootstrap method can thus be used to correct for the finite sample bias of the OLS estimator and to approximate the asymptotic critical values of the OLS-based test statistics in general cointegrating regressions. The bootstrap OLS procedure, however, is not efficient. For the efficient estimation and hypothesis testing, we consider the procedure proposed by Saikkonen (1991) and Stock and Watson (1993) relying on the regression augmented with the leads and lags of differenced regressors. The bootstrap versions of their procedures are shown to be consistent, and can be used to do inferences that are asymptotically valid. A Monte Carlo study is conducted to investigate the finite sample performances of the proposed bootstrap methods.

This version: February 18, 2002

Key words and phrases: cointegrating regression, sieve bootstrap, efficient estimation and hypothesis testing, AR approximation.

¹ Park thanks the Department of Economics at Rice University, where he is an Adjunct Professor, for its continuing hospitality and secretarial support. Correspondence address to: Yoosoon Chang, Department of Economics, Rice University, 6100 Main Street, Houston, TX 77005-1892, Tel: 713-348-2796, Fax: 713-348-5278, Email: [email protected].


    1. Introduction

The bootstrap has become a standard tool for econometric analysis. Roughly, the purpose of using the bootstrap methodology is twofold: to find the distributions of statistics whose asymptotic distributions are unknown or dependent upon nuisance parameters, and to obtain refinements of the asymptotic distributions that are closer to the finite sample distributions of the statistics. It is well known that the bootstrap statistics have the same asymptotic distributions as the corresponding sample statistics for a very wide, if not all, class of models, and therefore the unknown or nuisance parameter dependent limit distributions can be approximated by the bootstrap simulations. Moreover, if properly implemented for pivotal statistics, the bootstrap simulations indeed provide better approximations to the finite sample distributions of the statistics than their asymptotics. See Horowitz (2002) for an excellent nontechnical survey on the subject.

The purpose of this paper is to develop the bootstrap theory for cointegrating regressions. The bootstrap can be potentially more useful for models with nonstationary time series than for the standard models with stationary time series, since the statistical theories for the former are generally nonstandard and depend, often in a very complicated manner, upon various nuisance parameters. Nevertheless, the bootstrap theories for the former are much less developed compared to those for the latter. Virtually all of the published works on the theoretical aspects of the nonstationary bootstrap consider simple unit root models. See, e.g., Basawa et al. (1991), Chang (2000), Chang and Park (2002b) and Park (2000, 2002). The bootstrap cointegrating regression has been studied only by simulations, as in Li and Maddala (1997). The bootstrap method, however, is used quite frequently and extensively by empirical researchers to approximate the distributions of the statistics in more general models with nonstationary time series.

We consider the sieve bootstrap to resample from the cointegrating regressions. The method does not assume any specific data generating processes, and the data are simply fitted by a VAR of order increasing with the sample size. The bootstrap samples are then constructed using the fitted VAR from the resampled innovations. We show in the paper that under such a scheme the bootstrap becomes consistent for both the usual OLS and the efficient OLS by Saikkonen (1991) and Stock and Watson (1993). The sieve bootstrap can therefore be used to reduce the finite sample bias of the OLS estimator, and also to find the asymptotic critical values of the tests based on the OLS estimator. The bootstrapped OLS estimator, however, is inefficient just as is the sample OLS estimator. The bootstrap does not improve the efficiency. To attain efficiency, we need to bootstrap the efficient OLS estimator. The sieve bootstrap can be naturally implemented to do resampling for the efficient estimator, which itself relies on the idea of sieve estimation of the cointegrating regression. We show in the paper that the sieve bootstrap is generally consistent for the efficient OLS method.

Though we focus on the prototype multivariate cointegration model in the paper for concreteness, the theory we derive here can be used to analyze more general cointegrated models. The immediate extensions are the cointegrating regressions with more flexible deterministic trends including those allowing for structural breaks. Cointegrating regressions with shifts in the coefficients can also be analyzed using the methods developed in the paper.


Moreover, our theory extends further to the cointegrating regressions represented as error correction models, seemingly unrelated cointegrated models and panels with cointegration in individual units. The sieve bootstrap proposed here can be applied to all such models with only some obvious modifications. The estimation of and testing for the cointegration parameters can therefore be performed or refined by the sieve bootstrap. It is well expected that the sieve bootstrap is consistent for the aforementioned models if implemented properly as suggested in the paper, and this can be proved rigorously, if necessary, using the theory established in the paper.

The rest of the paper is organized as follows. Section 2 introduces the model, assumptions and some preliminary results. The multivariate cointegrating regression with a detailed specification for the data generating process is given, and a strong invariance principle, which can be used for both the sample and bootstrap asymptotics, is introduced and discussed. The standard asymptotic results for the cointegrating regressions are then derived. In Section 3, the sieve bootstrap procedure is presented, and the relevant bootstrap asymptotics are developed. The bootstrap consistency is established there. Section 4 summarizes our simulation study, and Section 5 concludes the paper. All the mathematical proofs are given in Section 6.

A word on notation. Following the standard convention, we use the superscript $*$ to signify whatever is related to the bootstrap samples and dependent upon the realization of the samples. The usual notations for the modes of convergence such as $\to_{a.s.}$, $\to_p$ and $\to_d$ are used without additional references, and the notation $=_d$ denotes equality in distribution. The stochastic order symbols $o_p$ and $O_p$ are also used frequently. Moreover, we often use such standard notations with the superscript $*$ to imply their bootstrap counterparts. We use the notation $|\cdot|$ to denote the Euclidean norm for vectors and matrices, i.e., $|x|^2 = x'x$ and $|A|^2 = \mathrm{tr}\,A'A$ for a vector $x$ and a matrix $A$. For matrices, we also use the operator norm $\|\cdot\|$, i.e., $\|A\| = \max_x |Ax|/|x|$ for a vector $x$ and a matrix $A$ which are of conformable dimensions. For a matrix $A$, $\mathrm{vec}(A)$ denotes a column vector which stacks row vectors of $A$.

    2. The Model, Assumptions and Preliminary Results

    2.1 The Model and Assumptions

    We consider the regression model given by

$$y_t = \Pi' x_t + u_t \qquad (1)$$
$$x_t = x_{t-1} + v_t$$

for $t = 1, 2, \ldots$, where $w_t = (u_t', v_t')'$ is an $(\ell + m)$-dimensional stationary vector process. Under this specification, the model introduced in (1) becomes a multivariate cointegrating regression. For the subsequent development of our theory, we let $x_0$ be any random variable which is stochastically bounded,


and let $(w_t)$ be a linear process given by

$$w_t = \Phi(L)\varepsilon_t \qquad (2)$$

where

$$\Phi(z) = \sum_{k=0}^{\infty} \Phi_k z^k$$

We make the following assumptions.

    Assumption 2.1 We assume

    (a) ("t) are iid random variables such that E"t = 0, E "t"0t = > 0 and Ej"tj

    a 0, where K is an absoluteconstant depending only ona and`+m.


Lemma 2.2 follows immediately from the strong approximation result established in Einmahl (1987). For any given $a > 2$, we may choose $(c_n)$ such that $0 < \ldots$


as $n \to \infty$. Though the OLS estimator $\hat\Pi_n$ is super-consistent, it is asymptotically biased and inefficient when $(u_t)$ and $(v_t)$ are correlated. Moreover, the tests based on the OLS estimator $\hat\Pi_n$ are generally invalid. For the test of the hypothesis

$$H_0: g(\Pi) = 0 \qquad (6)$$

where $g: \mathbb{R}^{\ell m} \to \mathbb{R}^r$ is continuously differentiable with first-order derivative $G = \partial g / \partial \pi'$, $\pi = \mathrm{vec}\,\Pi$, we may consider the Wald-type statistic

$$T_n = \hat g_n' \left[ G_n \left( M_n^{-1} \otimes \Omega_{11} \right) G_n' \right]^{-1} \hat g_n \qquad (7)$$

where $\hat g_n = g(\hat\Pi_n)$, $G_n = G(\hat\Pi_n)$ and $M_n = \sum_{t=1}^{n} x_t x_t'$. For the practical implementation, of course, the longrun variance $\Omega_{11}$ of $(u_t)$ must be estimated. The statistic has the limiting distribution

$$T_n \to_d \; \eta' Q^{-1} \eta \qquad (8)$$

where $\eta = G \left( M^{-1} \otimes I_\ell \right) \left( \int_0^1 B_2 \otimes dB_1 + \lambda_{21} \right)$ and $Q = G \left( M^{-1} \otimes \Omega_{11} \right) G'$, using the notations $G = G(\Pi)$, $\lambda_{21} = \mathrm{vec}\,\Lambda_{21}$ and $M = \int_0^1 B_2 B_2'$. The asymptotic distribution of $T_n$ is thus nonstandard and dependent upon various nuisance parameters. See, e.g., Park and Phillips (1988) for the distribution theories for $\hat\Pi_n$ and $T_n$.
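To fix ideas, here is a minimal numerical sketch of the statistic in (7) for the scalar case $\ell = m = 1$ with the simple restriction $g(\Pi) = \Pi$, so that $T_n = \hat\Pi_n^2 M_n / \hat\Omega_{11}$. The Bartlett-kernel long-run variance estimator and its bandwidth are our own illustrative choices, not prescribed by the paper.

```python
import numpy as np

def longrun_variance(u, bandwidth):
    # Bartlett-kernel long-run variance of a scalar series (illustrative choice).
    u = u - u.mean()
    n = len(u)
    omega = u @ u / n
    for k in range(1, bandwidth + 1):
        w = 1.0 - k / (bandwidth + 1.0)      # Bartlett weights
        omega += 2.0 * w * (u[k:] @ u[:-k]) / n
    return omega

def ols_wald(y, x):
    # OLS estimate of Pi in y_t = Pi * x_t + u_t, and the Wald statistic of (7)
    # for H0: Pi = 0 in the scalar case, where M_n = sum_t x_t^2.
    Mn = x @ x
    pi_hat = (x @ y) / Mn
    u_hat = y - pi_hat * x
    omega11 = longrun_variance(u_hat, bandwidth=int(len(y) ** (1 / 3)))
    Tn = pi_hat ** 2 * Mn / omega11
    return pi_hat, Tn
```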

For the efficient estimation of $\Pi$, we consider the procedure suggested by Saikkonen (1991) and Stock and Watson (1993), which is based on the regressions augmented with the leads and lags of the first-differenced regressors $\triangle x_t$. Under Assumption 2.1, we may write

$$u_t = \sum_{k=-\infty}^{\infty} \alpha_k' v_{t-k} + \epsilon_t \qquad (9)$$

where $(\epsilon_t)$ is uncorrelated with $(v_t) = (\triangle x_t)$ at all leads and lags with variance $\Omega_{11\cdot2} = \Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}$, and $\sum_{k=-\infty}^{\infty} |\alpha_k| < \infty$. This motivates the augmented regression

$$y_t = \Pi' x_t + \sum_{|k| \le p} \alpha_k' \triangle x_{t-k} + \epsilon_{pt} \qquad (10)$$

Since $(\alpha_k)$ are absolutely summable, we may well expect that the error $(\epsilon_{pt})$ will become close to $(\epsilon_t)$ if we let the number $p$ of included leads and lags of the differenced regressors increase appropriately as the sample size $n$ grows. Indeed, Saikkonen (1991) shows that if we let $p \to \infty$ such that $p^3/n \to 0$ and $n^{1/2} \sum_{|k|>p} |\alpha_k| \to 0$, then the regression (10) is asymptotically equivalent to

$$y_t = \Pi' x_t + \epsilon_t \qquad (11)$$

In particular, the OLS estimator $\tilde\Pi_n$ of $\Pi$ in regression (10), which we call the efficient OLS estimator, has the asymptotics given by

$$n(\tilde\Pi_n - \Pi) \to_d \left( \int_0^1 B_2 B_2' \right)^{-1} \int_0^1 B_2\, d(B_1 - \Omega_{12}\Omega_{22}^{-1} B_2)' \qquad (12)$$


which is precisely the same as the asymptotics of the OLS estimator of $\Pi$ from regression (11).

Note that $B_1 - \Omega_{12}\Omega_{22}^{-1}B_2$ has reduced variance $\Omega_{11\cdot2} = \Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}$ relative to the variance $\Omega_{11}$ of $B_1$. The asymptotic variance of $\tilde\Pi_n$ is therefore smaller than that of $\hat\Pi_n$. Moreover, $B_1 - \Omega_{12}\Omega_{22}^{-1}B_2$ is independent of $B_2$, and consequently the limiting distribution of $\tilde\Pi_n$ is mixed normal, which is quite a contrast to the nonstandard limit theory of $\hat\Pi_n$. As a result, the usual chi-square test is valid if we use a test statistic based on $\tilde\Pi_n$. Indeed, for testing the hypothesis (6), we may use

$$\tilde T_n = \tilde g_n' \left[ \tilde G_n \left( M_{nn}^{-1} \otimes \Omega_{11\cdot2} \right) \tilde G_n' \right]^{-1} \tilde g_n \qquad (13)$$

where $\tilde g_n = g(\tilde\Pi_n)$ and $\tilde G_n = G(\tilde\Pi_n)$ are defined similarly as $\hat g_n$ and $G_n$ introduced in (7), and $M_{nn}$ is the matrix defining the sample covariance of $\tilde\Pi_n$, which corresponds to $M_n$ for $\hat\Pi_n$ given also in (7). Then it follows that

$$\tilde T_n \to_d \chi^2 \qquad (14)$$
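Before turning to the expansion rate for $p$, here is a compact sketch of regression (10) and the estimator $\tilde\Pi_n$ in the scalar case; the helper name dols and the treatment of the sample ends are our own assumptions.

```python
import numpy as np

def dols(y, x, p):
    # Efficient OLS in the spirit of Saikkonen (1991) / Stock and Watson (1993):
    # regress y_t on x_t and the leads and lags dx_{t-k}, |k| <= p, as in (10).
    n = len(y)
    dx = np.diff(x, prepend=x[0])             # first differences; dx_0 set to 0
    rows = np.arange(p, n - p)                # keep t with all leads/lags available
    Z = np.column_stack([x[rows]] + [dx[rows - k] for k in range(-p, p + 1)])
    coef, *_ = np.linalg.lstsq(Z, y[rows], rcond=None)
    return coef[0]                            # coefficient on x_t: the estimate of Pi
```

With $p$ growing as permitted below, the coefficient on $x_t$ plays the role of the efficient OLS estimator $\tilde\Pi_n$.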

As before, $\Omega_{11\cdot2}$ should be consistently estimated for practical applications. In what follows, we reestablish the asymptotics for the augmented regression (10) using a weaker condition on the expansion rate for $p$, relative to the one required by Saikkonen (1991). We assume

Assumption 2.3 Let $p \to \infty$ and $p = o(n^{1/2})$ as $n \to \infty$.

Our condition on $p$ is substantially weaker than the one used in Saikkonen (1991). Our upper bound for $p$ is extended to $o(n^{1/2})$ compared to his $o(n^{1/3})$. Moreover, we do not impose any restriction on the minimum rate at which $p$ must grow. Thus we may allow for the logarithmic rate that is usually imposed for the common order selection rules such as AIC and BIC. The logarithmic rate, however, is not allowed in Saikkonen (1991) unless $(\alpha_k)$ decreases at a geometric rate. The rate for $p$ here is therefore valid for more general stationary processes $(w_t)$, as can be seen clearly in our proof of Lemma 2.4 below.
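As an illustration of how an order selection rule fits within Assumption 2.3, the sketch below runs a BIC search for $p$ capped strictly below $n^{1/2}$; the criterion and the cap are our own illustrative choices, since the paper only notes that AIC/BIC-type rules with logarithmic rates are admissible.

```python
import numpy as np

def choose_p_bic(y, x):
    # BIC choice of the number p of leads/lags in regression (10), with the
    # search range restricted to 1 <= p < n^(1/2) so that p = o(n^(1/2)) holds.
    n = len(y)
    dx = np.diff(x, prepend=x[0])
    best_p, best_bic = 1, np.inf
    for p in range(1, max(2, int(np.sqrt(n)))):
        rows = np.arange(p, n - p)
        Z = np.column_stack([x[rows]] + [dx[rows - k] for k in range(-p, p + 1)])
        b, *_ = np.linalg.lstsq(Z, y[rows], rcond=None)
        e = y[rows] - Z @ b
        bic = len(rows) * np.log(e @ e / len(rows)) + Z.shape[1] * np.log(len(rows))
        if bic < best_bic:
            best_p, best_bic = p, bic
    return best_p
```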

Lemma 2.4 Under Assumptions 2.1 and 2.3, we have (12) and (14) as $n \to \infty$.

The asymptotics established in the above lemma will be used as a basis for the subsequent development of our bootstrap asymptotics.

    3. Bootstrap Procedures and Their Asymptotics

    3.1 Bootstrap Procedures

In this section, we introduce the bootstrap procedures for our cointegrating regression model (1) and develop their asymptotics. For our bootstrap procedures introduced below, we may …


and construct the bootstrap samples $(w_t^*)$ recursively using

$$w_t^* = \hat\alpha_1 w_{t-1}^* + \cdots + \hat\alpha_q w_{t-q}^* + \varepsilon_t^*$$

given the initial values $w_t^* = w_t$ for $t = 0, \ldots, 1-q$. This step amounts to the usual sieve bootstrap for $(w_t)$.

Step 3: Define $w_t^* = (u_t^{*\prime}, v_t^{*\prime})'$ analogously as $w_t = (u_t', v_t')'$. Obtain the bootstrap samples $(x_t^*)$ by integrating $(v_t^*)$, i.e., $x_t^* = x_0^* + \sum_{k=1}^{t} v_k^*$, with $x_0^* = x_0$, and generate the bootstrap samples $(y_t^*)$ from

$$y_t^* = \Pi_n' x_t^* + u_t^*$$

The estimate $\Pi_n$ of $\Pi$ here need not be the same as the one used in Step 1.

The bootstrap samples $(x_t^*)$ and $(y_t^*)$ can be used to simulate the distributions of various statistics, as explained below.

Discussions on some practical issues arising from the implementation of our bootstrap method are in order. First, the choice of an estimator $\Pi_n$ for $\Pi$ should be made in Steps 1 and 3. The choice in Step 3 is not important, as long as we regard the chosen estimate as the true parameter for the bootstrap samples. The choice of $\Pi_n$ for $\Pi$ we use in Step 1, however, is very important, as it would affect the subsequent bootstrap procedure in Steps 2 and 3. In particular, the order selection rule applied to determine the model in Step 2 can be very sensitive to the choice of $\Pi_n$ made in Step 1. Naturally, we recommend using the best possible estimate, i.e., the most efficient estimate available incorporating all restrictions for the hypothesis testing. This is what is observed by Chang and Park (2002b) for the bootstrap unit root tests. Li and Maddala (1997) also find that using the true value in Step 1 yields the best result for the bootstrap hypothesis tests. Second, the initializations for the generation of $(w_t^*)$ and $(x_t^*)$ are necessary respectively in Steps 2 and 3. The choices $w_t^* = w_t$ for $t = 0, \ldots, 1-q$ and $x_0^* = x_0$ make the results conditional on these initial values of the samples. To reduce or eliminate such dependencies, we may generate a sufficiently large number of $(w_t^*)$ and retain only the last $n$ drawings, or include a constant so that the estimate $\Pi_n$ of $\Pi$ is independent of the initial value of $(x_t^*)$.
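The following sketch implements Steps 1-3 for the scalar model under simplifying assumptions of our own: a fixed VAR order q, centered residual resampling, and a single estimate $\Pi_n$ used in both Steps 1 and 3. All function and variable names are hypothetical.

```python
import numpy as np

def sieve_bootstrap_sample(y, x, pi_n, q, rng):
    # One sieve bootstrap draw (y*, x*) for model (1), following Steps 1-3.
    n = len(y)
    u = y - pi_n * x                      # Step 1: residuals u_t = y_t - Pi_n x_t
    v = np.diff(x, prepend=x[0])          # v_t = dx_t
    w = np.column_stack([u, v])           # w_t = (u_t, v_t)'

    # Step 1 (cont.): fit a VAR(q) to (w_t) by least squares.
    Y = w[q:]
    X = np.column_stack([w[q - k:n - k] for k in range(1, q + 1)])
    A, *_ = np.linalg.lstsq(X, Y, rcond=None)
    eps = Y - X @ A
    eps -= eps.mean(axis=0)               # center the fitted innovations

    # Step 2: resample innovations and build (w_t*) recursively,
    # initialized at the sample values w_t for t = 0, ..., 1-q.
    draws = eps[rng.integers(0, len(eps), size=n)]
    w_star = [w[t] for t in range(q)]
    for t in range(n):
        lags = np.concatenate(w_star[:-q - 1:-1])   # (w*_{t-1}', ..., w*_{t-q}')
        w_star.append(lags @ A + draws[t])
    w_star = np.array(w_star[q:])

    # Step 3: integrate v* into x* with x0* = x0, then y* = Pi_n x* + u*.
    u_star, v_star = w_star[:, 0], w_star[:, 1]
    x_star = x[0] + np.cumsum(v_star)
    y_star = pi_n * x_star + u_star
    return y_star, x_star
```

Repeating the draw many times and recomputing the statistics of interest on each $(y^*, x^*)$ yields the bootstrap distributions used in the remainder of this section.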

We consider the bootstrap version of regression (1) given as

$$y_t^* = \Pi_n' x_t^* + u_t^* \qquad (17)$$

which yields the bootstrap estimator for the usual OLS, viz.,

$$\hat\Pi_n^* = \left( \sum_{t=1}^{n} x_t^* x_t^{*\prime} \right)^{-1} \left( \sum_{t=1}^{n} x_t^* y_t^{*\prime} \right)$$

We also look at the bootstrap version of regression (10), which we write as

$$y_t^* = \Pi_n' x_t^* + \sum_{|k| \le p} \alpha_k^{*\prime} \triangle x_{t-k}^* + \epsilon_{pt}^* \qquad (18)$$


where

$$\epsilon_{pt}^* = \sum_{|k| > p} \alpha_k^{*\prime} v_{t-k}^* + \epsilon_t^*$$

The efficient bootstrap OLS estimator, i.e., the OLS estimator from regression (18), is given by

$$\tilde\Pi_n^* = \left( \sum_{t=1}^{n} x_t^* x_t^{*\prime} - \left( \sum_{t=1}^{n} x_t^* v_{pt}^{*\prime} \right) \left( \sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime} \right)^{-1} \left( \sum_{t=1}^{n} v_{pt}^* x_t^{*\prime} \right) \right)^{-1} \left( \sum_{t=1}^{n} x_t^* y_t^{*\prime} - \left( \sum_{t=1}^{n} x_t^* v_{pt}^{*\prime} \right) \left( \sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime} \right)^{-1} \left( \sum_{t=1}^{n} v_{pt}^* y_t^{*\prime} \right) \right)$$

where

$$v_{pt}^* = (\triangle x_{t+p}^{*\prime}, \ldots, \triangle x_{t-p}^{*\prime})'$$

In both regressions (17) and (18), we denote by $\Pi_n$ the estimate of $\Pi$ used to generate the bootstrap samples $(y_t^*)$ from $(x_t^*)$ and $(u_t^*)$. Note that in regression (18) the coefficients of the leads and lags of the differenced regressors depend on the realized samples, which we signify by attaching the superscript $*$ to the coefficients $\alpha_k$'s. The test statistics

$$T_n^* = \hat g_n^{*\prime} \left[ G_n^* \left( M_n^{*-1} \otimes \Omega_{11}^* \right) G_n^{*\prime} \right]^{-1} \hat g_n^*$$

$$\tilde T_n^* = \tilde g_n^{*\prime} \left[ \tilde G_n^* \left( M_{nn}^{*-1} \otimes \Omega_{11\cdot2}^* \right) \tilde G_n^{*\prime} \right]^{-1} \tilde g_n^*$$

are constructed from the bootstrap OLS and efficient estimators, $\hat\Pi_n^*$ and $\tilde\Pi_n^*$, analogously as their sample counterparts $T_n$ and $\tilde T_n$ defined in (7) and (13). For the definitions of the bootstrap statistics $T_n^*$ and $\tilde T_n^*$, we may use the longrun variance estimates obtained from each bootstrap sample, say $\Omega_{11}^*$ and $\Omega_{11\cdot2}^*$, in the places of the sample estimates $\Omega_{11}$ and $\Omega_{11\cdot2}$. This would not affect any of our subsequent results, since we only consider the first order asymptotics.

    3.2 Bootstrap Asymptotics

The asymptotic theories of the estimators $\hat\Pi_n^*$ and $\tilde\Pi_n^*$ can be developed similarly as those for $\hat\Pi_n$ and $\tilde\Pi_n$. To develop their asymptotics, we first establish the bootstrap invariance principle for $(\varepsilon_t^*)$. We have

Lemma 3.2 Under Assumptions 2.1 and 3.1,

$$E^*|\varepsilon_t^*|^a = O_p(1)$$

as $n \to \infty$.

Roughly, Lemma 3.2 allows us to regard the bootstrap samples $(\varepsilon_t^*)$ as iid random variables with finite $a$-th moment, given a sample realization. Following the usual convention, we


use the notation $P^*$ to denote the bootstrap probability conditional on the samples. The notation $E^*$ signifies the expectation taken with respect to $P^*$, for each realization of the samples.

To develop the bootstrap asymptotics, it is convenient to introduce the bootstrap stochastic order symbols $o_p^*$ and $O_p^*$, which correspond respectively to $o_p$ and $O_p$. Let $\epsilon > 0$ be given arbitrarily, and $M > 0$ be chosen accordingly. For a bootstrap statistic $T_n^*$, we define

$$T_n^* = o_p^*(1) \quad \text{if and only if} \quad P\left\{ P^*\{|T_n^*| > \epsilon\} > \epsilon \right\} < \epsilon$$

for all large $n$, and

$$T_n^* = O_p^*(1) \quad \text{if and only if} \quad P\left\{ P^*\{|T_n^*| > M\} > \epsilon \right\} < \epsilon$$

for all large $n$. They all satisfy the usual rules applicable for $o_p$ and $O_p$. See Chang and Park (2002b) for more details. In particular, whenever $T_n^* \to_d T$ in $P$, we have $T_n^* = O_p^*(1)$ and $T_n^* + o_p^*(1) \to_d T$ in $P$. Note also that $E^*|T_n^*| = o_p(1)$ and $O_p(1)$ imply $T_n^* = o_p^*(1)$ and $O_p^*(1)$, respectively. Define

$$W_n^*(r) = n^{-1/2} \sum_{t=1}^{[nr]} \varepsilon_t^*$$

analogously as $W_n$ introduced earlier in Section 2, and let $W^*$ be a vector Brownian motion with variance $\Sigma$. It follows from the strong approximation result in Lemma 2.2 applied to the bootstrap samples $(\varepsilon_t^*)$ that we may choose $W_n^*$ satisfying

$$P^*\left\{ \sup_{0 \le r \le 1} |W_n^*(r) - W^*(r)| > n^{-1/2} c_n \right\} \le K n\, c_n^{-a}\, E^*|\varepsilon_t^*|^a$$

for $(c_n)$ and $K$ given exactly as in Lemma 2.2. Note in particular that $K$ does not depend on the realization of the samples. However, due to the result in Lemma 3.2, we have

$$\sup_{0 \le r \le 1} |W_n^* - W^*| = o_p^*(1)$$

as long as $a > 2$, and consequently

$$W_n^* \to_d W \quad \text{in } P$$

An `in probability' version of the bootstrap invariance principle is therefore established for the bootstrap samples $(\varepsilon_t^*)$.

The corresponding invariance principle for $(w_t^*)$ can also be derived easily. We let

$$\hat\alpha(1) = I - \sum_{k=1}^{q} \hat\alpha_k \quad \text{and} \quad \hat\varphi(1) = \hat\alpha(1)^{-1}$$


Then we may deduce after some algebra that

$$w_t^* = \hat\varphi(1)\varepsilon_t^* + \hat\varphi(1) \sum_{i=1}^{q} \left( \sum_{j=i}^{q} \hat\alpha_j \right) \left( w_{t-i}^* - w_{t-i+1}^* \right) = \hat\varphi(1)\varepsilon_t^* + \left( \bar w_{t-1}^* - \bar w_t^* \right)$$

where $\bar w_t^* = \hat\varphi(1) \sum_{i=1}^{q} \left( \sum_{j=i}^{q} \hat\alpha_j \right) w_{t-i+1}^*$. We therefore have

$$B_n^*(r) = n^{-1/2} \sum_{t=1}^{[nr]} w_t^* = n^{-1/2} \hat\varphi(1) \sum_{t=1}^{[nr]} \varepsilon_t^* + n^{-1/2} \left( \bar w_0^* - \bar w_{[nr]}^* \right) = \hat\varphi(1) W_n^*(r) + R_n^*(r)$$

It is thus well expected that the invariance principle holds for $(w_t^*)$ if $\hat\varphi(1) \to_p \varphi(1)$ and $\sup_{0 \le r \le 1} |R_n^*(r)| = o_p^*(1)$. Let $B$ be the vector Brownian motion introduced in (5). Then we may indeed show that

Theorem 3.3 Under Assumptions 2.1 and 3.1, we have

$$B_n^* \to_d B \quad \text{in } P$$

as $n \to \infty$.

Let

$$z_t^* = z_0^* + \sum_{i=1}^{t} w_i^*$$

Then we have

Lemma 3.4 Under Assumptions 2.1 and 3.1, we have

$$\frac{1}{n^2} \sum_{t=1}^{n} z_t^* z_t^{*\prime} \to_d \int_0^1 B B' \quad \text{in } P$$

$$\frac{1}{n} \sum_{t=1}^{n} z_{t-1}^* w_t^{*\prime} \to_d \int_0^1 B\, dB' + \Lambda \quad \text{in } P$$

as $n \to \infty$.

The limit distributions of the bootstrap OLS estimator and the associated statistic can now be obtained easily from Lemma 3.4, and are given in the following theorem. Let $\eta$ and $Q$ be defined as in (8).


Theorem 3.5 Under Assumptions 2.1 and 3.1, we have

$$n(\hat\Pi_n^* - \Pi_n) \to_d \left( \int_0^1 B_2 B_2' \right)^{-1} \left( \int_0^1 B_2\, dB_1' + \Lambda_{21} \right) \quad \text{in } P$$

and

$$T_n^* \to_d \; \eta' Q^{-1} \eta \quad \text{in } P$$

as $n \to \infty$.

The bootstrap is therefore consistent for the OLS regression. The bootstrap estimator $\hat\Pi_n^*$ has the same limiting distribution as $\hat\Pi_n$. Therefore, in particular, the bootstrap can be used to remove the asymptotic bias in $\hat\Pi_n$, since the bootstrap distribution has the same asymptotic bias as the sample distribution. The bias corrected OLS estimator defined by

$$\hat\Pi_n^c = \hat\Pi_n - E^*(\hat\Pi_n^* - \Pi_n)$$

does not have asymptotic bias, and is thus expected to have smaller bias in finite samples as well. Hypothesis testing can also be done using the OLS estimator, if the critical values are obtained from the bootstrap distribution. For example, consider the test of the hypothesis (6). If we let $c^*(\alpha)$ be the bootstrap critical value given by

$$P^*\left\{ T_n^* \ge c^*(\alpha) \right\} = \alpha$$

for a prescribed size $\alpha$, then we have

$$P\left\{ T_n \ge c^*(\alpha) \right\} \to \alpha$$

as $n \to \infty$. Therefore, the bootstrap test has the asymptotically correct size.

We now consider the asymptotics for the bootstrap efficient estimator $\tilde\Pi_n^*$ obtained from (18). Denote by $f^*(\lambda)$ and $\Gamma^*(k)$, respectively, the spectral density and autocovariance function of $(w_t^*)$. For all bootstrap samples yielding a spectral density bounded away from zero and an absolutely summable autocovariance function, we would have the representation

$$u_t^* = \sum_{k=-\infty}^{\infty} \alpha_k^{*\prime} v_{t-k}^* + \epsilon_t^* \qquad (19)$$

where $\sum_{k=-\infty}^{\infty} |\alpha_k^*| \ldots$


for some $\delta > 0$ and $P^*\{\ldots$


as $n \to \infty$.

The limit distribution of the bootstrap efficient estimator $\tilde\Pi_n^*$ is equivalent to that of its sample counterpart $\tilde\Pi_n$, just as in the case for the OLS estimator. The associated statistics $\tilde T_n^*$ and $\tilde T_n$ also have identical asymptotic distributions. The bootstrap consistency of the test statistic for cointegrating regression (1) therefore extends to the dynamic cointegrating regression (10) augmented with the leads and lags of the differenced regressors.

We may also consider the bias corrected estimator

$$\tilde\Pi_n^c = \tilde\Pi_n - E^*(\tilde\Pi_n^* - \Pi_n)$$

for the efficient estimator $\tilde\Pi_n$. Though it has no asymptotic bias, the correction may still reduce the bias in finite samples. Moreover, the bootstrap critical value for the test $\tilde T_n$ can be found from

$$P^*\left\{ \tilde T_n^* \ge \tilde c^*(\alpha) \right\} = \alpha$$

for the size $\alpha$ test. The bootstrap test based on $\tilde c^*(\alpha)$ would then have the correct asymptotic size, since

$$P\left\{ \tilde T_n \ge \tilde c^*(\alpha) \right\} \to \alpha$$

as $n \to \infty$. Note that $\tilde T_n$ is asymptotically pivotal, unlike $T_n$ whose limiting distribution depends on various nuisance parameters. Therefore, the bootstrap test based on $\tilde T_n^*$ may provide an asymptotic refinement. In this case, the bootstrap test utilizing the bootstrap critical values $\tilde c^*(\alpha)$ would yield rejection rates closer to the nominal values, compared to the test relying on the asymptotic chi-square critical values.
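Combining the pieces, a sketch of the bootstrap bias correction and of the critical value computation is given below. It assumes the hypothetical helpers sieve_bootstrap_sample and ols_wald from the earlier sketches; as recommended above, pi_n should be the best available estimate and should incorporate the null restriction when critical values are the goal.

```python
import numpy as np

def bootstrap_bias_and_cv(y, x, pi_n, q, B=1000, alpha=0.05, seed=0):
    # Monte Carlo approximation of E*(Pi_n* - Pi_n) and of the bootstrap
    # critical value c*(alpha) for T_n, over B sieve bootstrap replications.
    # (For c*(alpha), generate under the restricted pi_n, e.g. pi_n = 0 for H0: Pi = 0.)
    rng = np.random.default_rng(seed)
    pi_stars, T_stars = [], []
    for _ in range(B):
        y_s, x_s = sieve_bootstrap_sample(y, x, pi_n, q, rng)
        pi_s, T_s = ols_wald(y_s, x_s)    # bootstrap OLS estimate and Wald statistic
        pi_stars.append(pi_s)
        T_stars.append(T_s)
    bias = np.mean(pi_stars) - pi_n       # estimate of E*(Pi_n* - Pi_n)
    pi_hat, _ = ols_wald(y, x)
    pi_c = pi_hat - bias                  # bias corrected OLS estimator
    c_star = np.quantile(T_stars, 1.0 - alpha)
    return pi_c, c_star
```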

4. Simulations

In this section, we conduct a set of simulations to investigate the finite sample performances of the bootstrap procedures in the context of cointegrating regressions. Our simulations are based on the simple bivariate cointegrating regression model

$$y_t = \pi x_t + u_t$$
$$x_t = x_{t-1} + v_t$$

where the regression error $(u_t)$ and the innovation $(v_t)$ for the regressor $(x_t)$ are generated as stationary AR(1) processes given by

$$u_t = \varphi_1 v_{t-1} + \varepsilon_{1t}, \qquad v_t = \varphi_2 v_{t-1} + \varepsilon_{2t}$$

The innovations $(\varepsilon_t)$, $\varepsilon_t = (\varepsilon_{1t}, \varepsilon_{2t})'$, are drawn from a bivariate normal distribution with covariance matrix

$$\Sigma = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}$$


Our simulation setup is entirely analogous to that of Stock and Watson (1993). In our simulations, we set $\varphi_1 = 0.6$, $\varphi_2 = 0.3$, $\rho = 0.5$, and $\pi = 0, 0.1$.

We consider two least squares regressions

$$y_t = \hat\pi_n x_t + \hat u_t \qquad (21)$$
$$y_t = \tilde\pi_n x_t + \tilde\alpha_{1n} \triangle x_t + \tilde\alpha_{2n} \triangle x_{t-1} + \tilde\epsilon_t \qquad (22)$$

and compare the finite sample performances of the estimators $\hat\pi_n$ and $\tilde\pi_n$ and the test statistics $T_n$ and $\tilde T_n$ based respectively on $\hat\pi_n$ and $\tilde\pi_n$. Note that our simulation setup introduces nonzero asymptotic correlation between the regressor and the regression error in regression (21), causing the asymptotic bias of $\hat\pi_n$ and the invalidity of $T_n$. The problems are, however, expected to vanish, at least asymptotically, in regression (22). As explained earlier, $\tilde\pi_n$ has no asymptotic bias, and the test using $\tilde T_n$ is asymptotically valid. For our simulation setup, we have in regression (22)

$$\alpha_1 = \rho, \qquad \alpha_2 = \varphi_1 - \rho\varphi_2$$

and

$$\epsilon_t = \varepsilon_{1t} - \rho\varepsilon_{2t}$$

Notice that $(\epsilon_t)$ defined as above are uncorrelated with $(\varepsilon_{2t})$ at all leads and lags, and therefore are orthogonal also to $(v_t)$ at all leads and lags, which is required for the validity of the model (22) as an efficient cointegrating regression introduced in (10).
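Under the parameter readings above, the simulation design can be sketched as follows; the function name and the zero initial conditions are our own assumptions.

```python
import numpy as np

def simulate_dgp(n, pi=0.0, phi1=0.6, phi2=0.3, rho=0.5, rng=None):
    # Bivariate design of Section 4: v_t = phi2 v_{t-1} + eps_{2t},
    # u_t = phi1 v_{t-1} + eps_{1t}, with corr(eps_{1t}, eps_{2t}) = rho,
    # x_t = x_{t-1} + v_t and y_t = pi x_t + u_t (x_0 = v_0 = 0 assumed).
    rng = rng or np.random.default_rng()
    eps = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    v = np.zeros(n)
    u = np.zeros(n)
    for t in range(1, n):
        v[t] = phi2 * v[t - 1] + eps[t, 1]
        u[t] = phi1 * v[t - 1] + eps[t, 0]
    x = np.cumsum(v)
    y = pi * x + u
    return y, x
```

Setting pi=0.0 generates data under the null and pi=0.1 under the alternative, matching the rejection probability experiments below.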

To examine the effectiveness of the bootstrap bias correction, the finite sample performances of $\hat\pi_n$ and $\tilde\pi_n$ are compared to those of the bias corrected estimators $\hat\pi_n^c$ and $\tilde\pi_n^c$, in terms of bias, variance and mean square error (MSE). For the sake of comparisons, the scaled biases, variances and MSE's for the estimators $\hat\pi_n$, $\tilde\pi_n$, $\hat\pi_n^c$ and $\tilde\pi_n^c$ are computed. For the test statistics, we compute the rejection probabilities under both the null and the alternative hypotheses, for which we set the value of $\pi$ respectively at 0 and 0.1. Rejection probabilities are obtained using the critical values from the $\chi_1^2$ distribution for the sample test statistics, $T_n$ and $\tilde T_n$, and from the bootstrap distribution for the bootstrap tests, $T_n^*$ and $\tilde T_n^*$. The simulation results are presented in Tables 1, 2 and 3, respectively for the models with no deterministic components, with drift and with a linear trend. We use the superscripts $\mu$ and $\tau$ to signify that the associated estimators/tests are computed from the models with drift and with a linear trend. Samples of sizes $n = 25, 50, 100$ and $200$ are considered. For each case, 5000 samples are simulated, and for each of the simulated samples, 1000 bootstrap repetitions are carried out to compute bootstrap estimators and test statistics.

The simulation results are largely consistent with the theory derived in the paper. The bias of $\hat\pi_n$ is quite noticeable and does not vanish as the sample size increases. It turns out, however, that the bootstrap is quite effective in reducing the bias of $\hat\pi_n$. The bootstrap reduces the bias of $\hat\pi_n$ substantially in all cases that we consider in our simulations, and the bias reduction becomes more effective as the sample size increases. In some of the reported cases, the bias corrected estimate has bias as small as approximately 2% of that of $\hat\pi_n$. On the other hand, the bootstrap bias correction has little impact on the sampling variation.


The sample variances of $\hat\pi_n^c$ are roughly comparable to, or even slightly larger than, those of $\hat\pi_n$. The bootstrap also reduces the bias in $\tilde\pi_n$, albeit the magnitudes of improvements are not comparable to those for $\hat\pi_n$. Clearly, $\tilde\pi_n$ is asymptotically unbiased, and thus there is not as much room for improvement in $\tilde\pi_n$ as in $\hat\pi_n$. The bootstrap does not reduce the sampling variations for $\tilde\pi_n$, just as for $\hat\pi_n$, and $\tilde\pi_n^c$ has no smaller sample variances. For both estimators, the bootstrap bias corrections become relatively more important for models with a mean or a linear trend.

It appears that the bootstrap generates quite precise critical values for both tests $T_n$ and $\tilde T_n$, even when the sample sizes are small. In particular, the performance of $T_n^*$ is quite satisfactory. The test $T_n$ is invalid, and naturally yields rejection probabilities under the null that are very distinct from its nominal test sizes. However, its bootstrap version $T_n^*$ has quite accurate null rejection probabilities even for small samples. As expected, the test $\tilde T_n$ based on the efficient estimator $\tilde\pi_n$ performs much better than its OLS counterpart $T_n$ in terms of the rejection probabilities under the null. Its bootstrap version $\tilde T_n^*$ also improves the finite sample performances of $\tilde T_n$. While the null rejection probabilities of both bootstrap tests $T_n^*$ and $\tilde T_n^*$ approach the nominal values as the size of the samples increases, the bootstrap efficient test $\tilde T_n^*$ does better in small samples than its OLS counterpart $T_n^*$. This is more so for models with a mean or a linear trend. Due to the presence of non-uniform size distortions, the finite sample power comparisons between $T_n$ and $\tilde T_n$ and their bootstrap counterparts $T_n^*$ and $\tilde T_n^*$ are not clear. It just seems that they are all roughly comparable.

    5. Conclusion

In this paper we consider the bootstrap for cointegrating regressions. We introduce the sieve bootstrap based on a VAR of order increasing with the sample size, and establish its consistency and asymptotic validity for two procedures: the usual OLS and the efficient OLS relying on the regressions augmented with the leads and lags of the differenced regressors. For the usual OLS, the bootstrap can thus be employed to correct for biases in the estimated parameters, and to compute the critical values of the tests. With the bootstrap bias correction, the OLS estimator becomes asymptotically unbiased. Moreover, the OLS-based tests become asymptotically valid if the bootstrap critical values are used. The bootstrap OLS method, however, is not efficient. For efficient inference, we should base our bootstrap procedure on the efficient OLS method. The sieve bootstrap proposed in the paper appears to improve upon the efficient OLS method in finite samples. It generally reduces the finite sample biases of the estimators and yields sizes that are closer to the nominal sizes of the tests.

The theory and method developed in the paper can be used to analyze more general cointegrated models. The models with deterministic trends and/or structural breaks can be analyzed similarly. The error correction models, seemingly unrelated and panel cointegration models are other examples to which our theory and method are readily applicable. Indeed, the required modifications of the theory and method for such extensions are minimal, and can be applied with only some obvious adjustments. The theory discussed in the paper is concerned only with the consistency of the bootstrap. For the pivotal statistics,


however, it might well be the case that the bootstrap provides the asymptotic refinements as well. Our simulation study in the paper is in fact somewhat indicative of this possibility. Moreover, it appears that the theoretical illustration of the asymptotic refinement is possible using the techniques developed in Park (2000) to establish the bootstrap refinement for the unit root tests.

    6. Mathematical Proofs

    6.1 Useful Lemmas and Their Proofs

We consider the regression

$$w_t = \tilde\alpha_1 w_{t-1} + \cdots + \tilde\alpha_q w_{t-q} + \tilde\varepsilon_{qt} \qquad (23)$$

which is the fitted version of regression (15). Since $\Pi_n$ is consistent, it is well expected that the fitted regression (23) with $(w_t)$ should be asymptotically equivalent to the fitted regression (16) with $(\hat w_t)$ introduced in Step 1 of our bootstrap procedure.

Lemma A1 Under Assumptions 2.1 and 3.1, we have as $n \to \infty$

$$\hat\alpha_k = \tilde\alpha_k + O_p(n^{-1/2})$$

uniformly in $1 \le k \le q$. Moreover,

$$\max_{1 \le t \le n} |\hat\varepsilon_{qt} - \tilde\varepsilon_{qt}| = O_p(n^{-1/2})$$

as $n \to \infty$.

Proof of Lemma A1 It follows immediately from the definition of $(\hat w_t)$ that

$$\max_{0 \le i,j \le q} \left| \frac{1}{n} \sum_{t=1}^{n} \hat w_{t-i} \hat w_{t-j}' - \frac{1}{n} \sum_{t=1}^{n} w_{t-i} w_{t-j}' \right| \le \max_{1 \le t \le n} |\hat w_t - w_t| \left( \frac{1}{n} \sum_{t=1}^{n} |\hat w_t| + \frac{1}{n} \sum_{t=1}^{n} |w_t| \right)$$

However, we have

$$\max_{1 \le t \le n} |\hat w_t - w_t| = O_p(n^{-1/2})$$

because

$$|\hat w_t - w_t| = \left| \begin{pmatrix} y_t - \Pi_n' x_t \\ v_t \end{pmatrix} - w_t \right| = \left| \begin{pmatrix} (\Pi - \Pi_n)' x_t \\ 0 \end{pmatrix} \right| \le |\Pi - \Pi_n|\, |x_t|$$

and

$$|\Pi - \Pi_n| = O_p(n^{-1}), \qquad \max_{1 \le t \le n} |x_t| = O_p(n^{1/2})$$

Since

$$\frac{1}{n} \sum_{t=1}^{n} |\hat w_t|,\; \frac{1}{n} \sum_{t=1}^{n} |w_t| = O_p(1)$$


we have

$$\max_{0 \le i,j \le q} \left| \frac{1}{n} \sum_{t=1}^{n} \hat w_{t-i} \hat w_{t-j}' - \frac{1}{n} \sum_{t=1}^{n} w_{t-i} w_{t-j}' \right| = O_p(n^{-1/2})$$

The rest of the proof is rather straightforward, and we omit the details.

Let $\mathbf{w}_t = (w_{t-1}', \ldots, w_{t-q}')'$ and define $M = E\,\mathbf{w}_t \mathbf{w}_t'$. Moreover, denote by $f$ the spectral density of $(w_t)$. Then we have

Lemma A2 Under Assumption 2.1, we have

$$\|M^{-1}\| \le \frac{1}{2\pi} \left( \inf_\lambda \|f(\lambda)\| \right)^{-1}$$

for all $q \ge 1$.

Proof of Lemma A2 Let $c \in \mathbb{R}^{q(\ell+m)}$ be an eigenvector associated with the smallest eigenvalue $\lambda_{\min}$ of $M$. Define $\varphi(\lambda) = (e^{i\lambda}, \ldots, e^{iq\lambda})'$ and use $*$ to denote its conjugate. It follows that

$$\lambda_{\min} = c' M c = \int c' \left( \varphi(\lambda)\varphi(\lambda)^* \otimes f(\lambda) \right) c\, d\lambda \ge \inf_\lambda \|f(\lambda)\| \int \left\| \varphi(\lambda)\varphi(\lambda)^* \right\| d\lambda = 2\pi \inf_\lambda \|f(\lambda)\|$$

However, we have $\|M^{-1}\| = \lambda_{\min}^{-1}$, from which the stated result can be deduced immediately.

Lemma A3 Under Assumptions 2.1 and 3.1, we have

$$\sup_\lambda |f^*(\lambda) - f(\lambda)| = o_p(1)$$

and

$$\sum_{k=-\infty}^{\infty} \Gamma^*(k) = \sum_{k=-\infty}^{\infty} \Gamma(k) + o_p(1)$$

as $n \to \infty$.


Proof of Lemma A3 Given Lemma A1, the stated results are just straightforward extensions of Lemma A2 in Chang and Park (2002b). Here we only obtain `in probability' versions, instead of `almost sure' versions, since the results in Lemma A1 hold only in probability.

Lemma A4 Under Assumptions 2.1 and 3.1, we have

$$E^* \left| \sum_{t=1}^{n} \left( w_{t-i}^* w_{t-j}^{*\prime} - \Gamma^*(i-j) \right) \right|^2 = O_p(n)$$

uniformly in $i$ and $j$, as $n \to \infty$.

Proof of Lemma A4 Once again, the stated result follows exactly as in Lemma A4 in Chang and Park (2002b), due to Lemma A1, under some obvious modifications to deal with multiple time series.

Lemma A5 Under Assumptions 2.1, 2.3 and 3.1, we have

$$E^* \left\| \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime} \right)^{-1} \right\| = O_p(1)$$

$$E^* \left| \sum_{t=1}^{n} x_t^* v_{pt}^{*\prime} \right| = O_p(n p^{1/2})$$

as $n \to \infty$.

Proof of Lemma A5 The proof is essentially identical to that of Lemma 3.3 in Chang and Park (2002b).

Lemma A6 Under Assumptions 2.1, 2.3 and 3.1, we have

$$\sum_{t=1}^{n} v_{pt}^* \epsilon_{pt}^{*\prime} = O_p^*(n^{1/2} p^{1/2})$$

and

$$\sum_{t=1}^{n} x_t^* (\epsilon_t^* - \epsilon_{pt}^*)' = o_p^*(n)$$

as $n \to \infty$.

Proof of Lemma A6 Define $(\alpha_{pk}^*)$ such that

$$\epsilon_{pt}^* - \epsilon_t^* = \sum_{|k|>p} \alpha_k^{*\prime} v_{t-k}^* = \sum_{|k|>p} \alpha_{pk}^{*\prime} \varepsilon_{t-k}^*$$


Note that

$$\sum_{|k|>p} |\alpha_{pk}^*| \le \left( \sum_{k=-\infty}^{\infty} |\alpha_k^*| \right) \left( \sum_{|k|>p} |\alpha_k^*| \right)$$

as one may easily deduce.

To show the first part, we write for $|i| \le p$

$$\sum_{t=1}^{n} v_{t-i}^* \epsilon_{pt}^{*\prime} = \sum_{t=1}^{n} v_{t-i}^* \epsilon_t^{*\prime} + \sum_{t=1}^{n} v_{t-i}^* (\epsilon_{pt}^* - \epsilon_t^*)'$$

It is easy to see

$$E^* \left| \sum_{t=1}^{n} v_{t-i}^* \epsilon_t^{*\prime} \right|^2 = O_p(n)$$

uniformly in $|i| \le p$. Therefore, it suffices to show that

$$\sum_{t=1}^{n} v_{t-i}^* (\epsilon_{pt}^* - \epsilon_t^*)' = o_p^*(n^{1/2}) \qquad (24)$$

uniformly in $|i| \le p$. However, we have as in the proof of Lemma 3.1 in Chang and Park (2002a)

$$\sum_{t=1}^{n} w_{t-i}^* \left( \sum_{|j|>p} \alpha_{pj}^{*\prime} \varepsilon_{t-j}^* \right)' = \left( \sum_{|k|>p} |\alpha_{pk}^*| \right) O_p^*(n^{1/2}) = \left( \sum_{|k|>p} |\alpha_k^*| \right) O_p^*(n^{1/2})$$

uniformly in $|i| \le p$, and (24) follows immediately.

To prove the second part, we define

$$\xi_t^* = \sum_{i=1}^{t} \varepsilon_i^* \qquad (25)$$

so that

$$z_t^* = \hat\varphi(1)\xi_t^* + (\bar w_0^* - \bar w_t^*)$$

It follows that

$$\sum_{t=1}^{n} z_t^* (\epsilon_{pt}^* - \epsilon_t^*)' = \hat\varphi(1) \sum_{t=1}^{n} \xi_t^* (\epsilon_{pt}^* - \epsilon_t^*)' + \bar w_0^* \sum_{t=1}^{n} (\epsilon_{pt}^* - \epsilon_t^*)' - \sum_{t=1}^{n} \bar w_t^* (\epsilon_{pt}^* - \epsilon_t^*)' \qquad (26)$$


We have

$$\sum_{t=1}^{n} (\epsilon_{pt}^* - \epsilon_t^*) = \sum_{|k|>p} \alpha_{pk}^{*\prime} \sum_{t=1}^{n} \varepsilon_{t-k}^* = \left( \sum_{|k|>p} |\alpha_{pk}^*| \right) O_p^*(n^{1/2}) = \left( \sum_{|k|>p} |\alpha_k^*| \right) O_p^*(n^{1/2}) \qquad (27)$$

We also have, similarly as in the proof of the first part,

$$\sum_{t=1}^{n} \bar w_t^* (\epsilon_{pt}^* - \epsilon_t^*)' = \left( \sum_{|k|>p} |\alpha_k^*| \right) O_p^*(n) \qquad (28)$$

Moreover, we have as in the proof of Lemma 3.1 in Chang and Park (2002a)

$$\sum_{t=1}^{n} \xi_t^* (\epsilon_{pt}^* - \epsilon_t^*)' = \left( \sum_{|k|>p} |\alpha_{pk}^*| \right) O_p^*(n) + \left( \sum_{|k|>p} |\alpha_{pk}^*| \right) O_p^*(n^{1/2}) = \left( \sum_{|k|>p} |\alpha_k^*| \right) O_p^*(n) \qquad (29)$$

The second part can now be easily deduced from (26) and (27)-(29). The proof is therefore complete.

    6.2 Proofs of Lemmas and Theorems

Proof of Lemma 2.2 The stated result follows from Einmahl (1987). In particular, he shows that his Equation 1.3 holds for all … when $2 < s < 4$, and for $K s_n$ with … as long as $n$ is sufficiently large. His result is therefore applicable as we formulate here.

Proof of Lemma 2.4 We write

$$n(\tilde\Pi_n - \Pi) = \left( \frac{1}{n^2} \sum_{t=1}^{n} x_t x_t' - Q_n \right)^{-1} \left( \frac{1}{n} \sum_{t=1}^{n} x_t \epsilon_t' - P_n \right)$$

where

$$P_n = \frac{1}{n} \sum_{t=1}^{n} x_t (\epsilon_t - \epsilon_{pt})' + \left( \frac{1}{n} \sum_{t=1}^{n} x_t v_{pt}' \right) \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} v_{pt}' \right)^{-1} \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} \epsilon_{pt}' \right)$$

$$Q_n = \frac{1}{n} \left( \frac{1}{n} \sum_{t=1}^{n} x_t v_{pt}' \right) \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} v_{pt}' \right)^{-1} \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} x_t' \right)$$


where in turn

$$v_{pt} = (v_{t+p}', \ldots, v_{t-p}')'$$

To get the stated result, it suffices to show that

$$P_n, Q_n = o_p(1)$$

under Assumptions 2.1 and 2.3. In the subsequent proof, we use

$$\frac{1}{n} \sum_{t=1}^{n} x_t v_{pt}' = O_p(p^{1/2}) \qquad (30)$$

$$\left\| \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} v_{pt}' \right)^{-1} \right\| = O_p(1) \qquad (31)$$

which are the multivariate extensions of the results established in Chang and Park (2002a). The required extensions are straightforward and the details are omitted. It follows immediately from (30) and (31) that

$$Q_n = n^{-1} O_p(p^{1/2}) O_p(1) O_p(p^{1/2}) = O_p(n^{-1} p)$$

since

$$\|Q_n\| \le \frac{1}{n} \left| \frac{1}{n} \sum_{t=1}^{n} x_t v_{pt}' \right| \left\| \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} v_{pt}' \right)^{-1} \right\| \left| \frac{1}{n} \sum_{t=1}^{n} v_{pt} x_t' \right|$$

as one may easily see.

We now show that

$$P_n = \left( \sum_{|k|>p} |\alpha_k| \right) O_p(1) + O_p(n^{-1/2} p)$$

We first write

$$\|P_n\| \le \left| \frac{1}{n} \sum_{t=1}^{n} x_t (\epsilon_{pt} - \epsilon_t)' \right| + \left| \frac{1}{n} \sum_{t=1}^{n} x_t v_{pt}' \right| \left\| \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt} v_{pt}' \right)^{-1} \right\| \left| \frac{1}{n} \sum_{t=1}^{n} v_{pt} \epsilon_{pt}' \right| = A_n + B_n$$

We may show as in the proof of Lemma 3.1 of Chang and Park (2002a) that

$$A_n = \left( \sum_{|k|>p} |\alpha_k| \right) O_p(1)$$


Moreover, we have

$$\left| \sum_{t=1}^{n} v_{t-i} \epsilon_{pt}' \right| = \left| \sum_{t=1}^{n} v_{t-i} \left( u_t - \sum_{|j| \le p} \alpha_j' v_{t-j} \right)' \right| \le \left| \sum_{t=1}^{n} v_{t-i} u_t' \right| + \sum_{j=-\infty}^{\infty} |\alpha_j| \left| \sum_{t=1}^{n} v_{t-i} v_{t-j}' \right| = O_p(n^{1/2})$$

uniformly in $i$ for $|i| \le p$. It therefore follows that

$$B_n = O_p(p^{1/2}) O_p(1) O_p(n^{-1/2} p^{1/2}) = O_p(n^{-1/2} p)$$

as was to be shown.

Proof of Lemma 3.2 Given the result in Lemma A1, the proof is the trivial extension of the proof of Lemma 3.2 in Park (2002). The details are, therefore, omitted.

Proof of Theorem 3.3 To derive the bootstrap invariance principle for $(w_t^*)$ from that of $(\varepsilon_t^*)$, we need to show

$$\hat\varphi(1) \to_p \varphi(1) \qquad (32)$$

and

$$P^*\left\{ \max_{1 \le t \le n} n^{-1/2} |\bar w_t^*| > \epsilon \right\} = o_p(1) \qquad (33)$$

for any $\epsilon > 0$.

Let $\tilde\varphi(1)$ be defined exactly as $\hat\varphi(1)$ using the fitted coefficients $(\tilde\alpha_k)$ in regression (23). It follows immediately from Lemma A1 that

$$\hat\varphi(1) = \tilde\varphi(1) + O_p(n^{-1/2} q)$$

Moreover, we may deduce as in the proof of Lemma 3.5 in Chang and Park (2002a) that

$$\tilde\varphi(1) = \varphi(1) + O_p(n^{-1/2} q) + o(q^{-b})$$

using the result in Shibata (1980). We therefore have

$$\hat\varphi(1) = \varphi(1) + o_p(1)$$

and obtain (32). The proof of (33) is essentially identical to the proof of Theorem 3.3 in Park (2002).


Proof of Lemma 3.4 Set $z_0^* = 0$ for simplicity. The required modification to allow for nonzero $z_0^*$ is trivial. The first part follows immediately, since

$$\frac{1}{n^2} \sum_{t=1}^{n} z_t^* z_t^{*\prime} =_d \int_0^1 B_n^* B_n^{*\prime} + \frac{1}{n^2} z_n^* z_n^{*\prime}$$

and $n^{-1/2} z_n^* = O_p^*(1)$

for large $n$.

To prove the second part, we let $(\xi_t^*)$ be defined as in (25) so that we have

$$\sum_{t=1}^{n} z_{t-1}^* w_t^{*\prime} = \hat\varphi(1) \left( \sum_{t=1}^{n} \xi_{t-1}^* \varepsilon_t^{*\prime} \right) \hat\varphi(1)' + \sum_{t=1}^{n} \bar w_t^* w_t^{*\prime} - z_n^* \bar w_n^{*\prime} + \bar w_0^* \sum_{t=1}^{n} \varepsilon_t^{*\prime} \hat\varphi(1)' - \sum_{t=1}^{n} \bar w_{t-1}^* \varepsilon_t^{*\prime} \hat\varphi(1)'$$

It is straightforward to show

$$z_n^* \bar w_n^{*\prime},\; \bar w_0^* \sum_{t=1}^{n} \varepsilon_t^{*\prime} \hat\varphi(1)',\; \sum_{t=1}^{n} \bar w_{t-1}^* \varepsilon_t^{*\prime} \hat\varphi(1)' = O_p^*(n^{1/2})$$

and we therefore have

$$\frac{1}{n} \sum_{t=1}^{n} z_{t-1}^* w_t^{*\prime} = \hat\varphi(1) \left( \frac{1}{n} \sum_{t=1}^{n} \xi_{t-1}^* \varepsilon_t^{*\prime} \right) \hat\varphi(1)' + \frac{1}{n} \sum_{t=1}^{n} \bar w_t^* w_t^{*\prime} + o_p^*(1) \qquad (34)$$

for large $n$. It follows that

    for largen.It follows that

    (1)nX

    t=1

    t1"0t

    (1)0 !dZ 10

    BdB0

    by the bootstrap invariance principle and Kurtz and Protter (1991). Moreover, it can bededuced analogously as in Lemma A4 that

    E

    nXt=1

    (wt w0tE

    wt w0t )

    2 =Op(n2)

    and we have

    1n

    nXt=1

    wt w0t =Ewt w0t +Op(n1=2)

    However,

    Ewt w0t =

    1Xk=0

    (k) =1Xk=0

    (k) +op(1)

    from which, together with (34), the stated result follows immediately.


Proof of Theorem 3.5 The results can easily be derived from Lemma 3.4 using the bootstrap invariance principle and the continuous mapping theorem.

Proof of Lemma 3.6 Notice that

$$\|Q_n^*\| \le \frac{1}{n} \left| \frac{1}{n} \sum_{t=1}^{n} x_t^* v_{pt}^{*\prime} \right| \left\| \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime} \right)^{-1} \right\| \left| \frac{1}{n} \sum_{t=1}^{n} v_{pt}^* x_t^{*\prime} \right|$$

It therefore follows that

$$Q_n^* = n^{-1} O_p^*(p^{1/2}) O_p^*(1) O_p^*(p^{1/2}) = O_p^*(n^{-1} p)$$

from Lemma A5. This shows that $Q_n^* = o_p^*(1)$.

Moreover, we have

$$\|P_n^*\| \le \left| \frac{1}{n} \sum_{t=1}^{n} x_t^* (\epsilon_{pt}^* - \epsilon_t^*)' \right| + \left| \frac{1}{n} \sum_{t=1}^{n} x_t^* v_{pt}^{*\prime} \right| \left\| \left( \frac{1}{n} \sum_{t=1}^{n} v_{pt}^* v_{pt}^{*\prime} \right)^{-1} \right\| \left| \frac{1}{n} \sum_{t=1}^{n} v_{pt}^* \epsilon_{pt}^{*\prime} \right|$$

and it follows from Lemmas A5 and A6 that

$$P_n^* = o_p^*(1) + O_p^*(p^{1/2}) O_p^*(1) O_p^*(n^{-1/2} p^{1/2}) = o_p^*(1)$$

as required to be shown.

Proof of Theorem 3.7 Due to Lemma 3.6, we have

$$n(\tilde\Pi_n^* - \Pi_n) = \left( \frac{1}{n^2} \sum_{t=1}^{n} x_t^* x_t^{*\prime} \right)^{-1} \left( \frac{1}{n} \sum_{t=1}^{n} x_t^* \epsilon_t^{*\prime} \right) + o_p^*(1)$$

The bootstrap asymptotic distribution of $\tilde\Pi_n^*$ can now be easily deduced from the bootstrap invariance principle and Kurtz and Protter (1991). The bootstrap asymptotic distribution of $\tilde T_n^*$ may similarly be obtained.


    References

Basawa, I.V., A.K. Mallik, W.P. McCormick, J.H. Reeves and R.L. Taylor (1991). "Bootstrapping unstable first-order autoregressive processes," Annals of Statistics 19: 1098-1101.

Chang, Y. and J.Y. Park (2002a). "On the asymptotics of ADF tests for unit roots," forthcoming in Econometric Reviews.

Chang, Y. and J.Y. Park (2002b). "A sieve bootstrap for the test of a unit root," forthcoming in Journal of Time Series Analysis.

Einmahl, U. (1987). "A useful estimate in the multidimensional invariance principle," Probability Theory and Related Fields 76: 81-101.

Horowitz, J. (2002). "The bootstrap," forthcoming in Handbook of Econometrics Vol. 5, Elsevier, Amsterdam.

Johansen, S. (1988). "Statistical analysis of cointegration vectors," Journal of Economic Dynamics and Control 12: 231-254.

Johansen, S. (1991). "Estimation and hypothesis testing of cointegration vectors in Gaussian vector autoregressive models," Econometrica 59: 1551-1580.

Kurtz, T.G. and P. Protter (1991). "Weak limit theorems for stochastic integrals and stochastic differential equations," Annals of Probability 19: 1035-1070.

Li, H. and G.S. Maddala (1997). "Bootstrapping cointegrating regressions," Journal of Econometrics 80: 297-348.

Park, J.Y. (2000). "Bootstrap unit root tests," mimeographed, School of Economics, Seoul National University.

Park, J.Y. (2002). "An invariance principle for sieve bootstrap in time series," forthcoming in Econometric Theory.

Park, J.Y. and P.C.B. Phillips (1988). "Statistical inference in regressions with integrated processes: Part 1," Econometric Theory 4: 468-497.

Phillips, P.C.B. and V. Solo (1992). "Asymptotics for linear processes," Annals of Statistics 20: 971-1001.

Saikkonen, P. (1991). "Asymptotically efficient estimation of cointegration regressions," Econometric Theory 7: 1-21.

Shibata, R. (1980). "Asymptotically efficient selection of the order of the model for estimating parameters of a linear process," Annals of Statistics 8: 147-164.

Stock, J.H. and M.W. Watson (1993). "A simple estimator of cointegrating vectors in higher order integrated systems," Econometrica 61: 783-820.


Table 1.1: Finite Sample Performances of the Estimators

  n    estimator           n·bias   n²·var   n²·MSE
  25   $\hat\pi_n$          2.516    4.826   11.157
       $\hat\pi_n^c$        0.576    6.315    6.647
       $\tilde\pi_n$        0.025    2.981    2.982
       $\tilde\pi_n^c$      0.026    2.981    2.981
  50   $\hat\pi_n$          2.705    6.213   13.528
       $\hat\pi_n^c$        0.365    7.305    7.439
       $\tilde\pi_n$        0.015    2.489    2.489
       $\tilde\pi_n^c$      0.016    2.490    2.490
  100  $\hat\pi_n$          2.746    6.573   14.112
       $\hat\pi_n^c$        0.197    7.201    7.240
       $\tilde\pi_n$        0.015    2.224    2.224
       $\tilde\pi_n^c$      0.014    2.229    2.229
  200  $\hat\pi_n$          2.723    6.457   13.871
       $\hat\pi_n^c$        0.060    6.775    6.779
       $\tilde\pi_n$       -0.005    2.165    2.165
       $\tilde\pi_n^c$     -0.004    2.169    2.169

Table 1.2: Finite Sample Performances of the Test Statistics

                              sizes                    powers
  n    test             1%      5%     10%       1%      5%     10%
  25   $T_n$           0.058   0.216   0.345    0.583   0.806   0.885
       $T_n^*$         0.011   0.053   0.100    0.404   0.558   0.660
       $\tilde T_n$    0.046   0.110   0.178    0.485   0.626   0.691
       $\tilde T_n^*$  0.007   0.045   0.090    0.296   0.481   0.592
  50   $T_n$           0.055   0.195   0.324    0.893   0.977   0.990
       $T_n^*$         0.008   0.048   0.097    0.727   0.873   0.934
       $\tilde T_n$    0.026   0.080   0.131    0.811   0.892   0.919
       $\tilde T_n^*$  0.012   0.051   0.097    0.744   0.856   0.902
  100  $T_n$           0.047   0.174   0.296    0.997   1.000   1.000
       $T_n^*$         0.010   0.047   0.094    0.981   0.997   0.999
       $\tilde T_n$    0.018   0.066   0.117    0.982   0.994   0.997
       $\tilde T_n^*$  0.010   0.049   0.096    0.973   0.991   0.996
  200  $T_n$           0.039   0.161   0.270    1.000   1.000   1.000
       $T_n^*$         0.009   0.045   0.094    1.000   1.000   1.000
       $\tilde T_n$    0.012   0.054   0.105    1.000   1.000   1.000
       $\tilde T_n^*$  0.009   0.045   0.096    1.000   1.000   1.000


Table 2.1: Finite Sample Performances of the Estimators

  n    estimator                n·bias   n²·var   n²·MSE
  25   $\hat\pi_n^{\mu}$         3.288   11.692   22.504
       $\hat\pi_n^{\mu c}$       0.912   14.649   15.481
       $\tilde\pi_n^{\mu}$       0.184    7.171    7.205
       $\tilde\pi_n^{\mu c}$     0.074    7.233    7.238
  50   $\hat\pi_n^{\mu}$         3.398   13.270   24.816
       $\hat\pi_n^{\mu c}$       0.463   15.670   15.885
       $\tilde\pi_n^{\mu}$       0.069    5.226    5.231
       $\tilde\pi_n^{\mu c}$     0.018    5.241    5.241
  100  $\hat\pi_n^{\mu}$         3.494   13.384   25.590
       $\hat\pi_n^{\mu c}$       0.277   14.820   14.897
       $\tilde\pi_n^{\mu}$       0.045    4.587    4.589
       $\tilde\pi_n^{\mu c}$     0.019    4.596    4.596
  200  $\hat\pi_n^{\mu}$         3.517   13.540   25.912
       $\hat\pi_n^{\mu c}$       0.146   14.255   14.276
       $\tilde\pi_n^{\mu}$       0.028    4.312    4.312
       $\tilde\pi_n^{\mu c}$     0.017    4.322    4.322

Table 2.2: Finite Sample Performances of the Test Statistics

                                   sizes                    powers
  n    test                  1%      5%     10%       1%      5%     10%
  25   $T_n^{\mu}$          0.020   0.142   0.261    0.265   0.547   0.676
       $T_n^{\mu *}$        0.017   0.059   0.106    0.227   0.384   0.498
       $\tilde T_n^{\mu}$   0.044   0.117   0.180    0.257   0.415   0.505
       $\tilde T_n^{\mu *}$ 0.008   0.043   0.090    0.093   0.248   0.369
  50   $T_n^{\mu}$          0.031   0.140   0.257    0.623   0.841   0.905
       $T_n^{\mu *}$        0.011   0.055   0.106    0.498   0.693   0.795
       $\tilde T_n^{\mu}$   0.026   0.084   0.135    0.591   0.736   0.808
       $\tilde T_n^{\mu *}$ 0.012   0.052   0.102    0.479   0.672   0.767
  100  $T_n^{\mu}$          0.029   0.135   0.231    0.967   0.994   0.997
       $T_n^{\mu *}$        0.010   0.054   0.105    0.924   0.978   0.990
       $\tilde T_n^{\mu}$   0.017   0.067   0.121    0.942   0.974   0.984
       $\tilde T_n^{\mu *}$ 0.011   0.051   0.099    0.921   0.967   0.980
  200  $T_n^{\mu}$          0.026   0.125   0.220    1.000   1.000   1.000
       $T_n^{\mu *}$        0.011   0.052   0.100    1.000   1.000   1.000
       $\tilde T_n^{\mu}$   0.014   0.060   0.115    0.999   1.000   1.000
       $\tilde T_n^{\mu *}$ 0.010   0.050   0.103    0.999   1.000   1.000


Table 3.1: Finite Sample Performances of the Estimators

  n    estimator                n·bias   n²·var   n²·MSE
  25   $\hat\pi_n^{\tau}$        5.925   19.070   54.170
       $\hat\pi_n^{\tau c}$      2.460   27.327   33.379
       $\tilde\pi_n^{\tau}$      1.273   17.658   19.280
       $\tilde\pi_n^{\tau c}$    0.230   19.762   19.815
  50   $\hat\pi_n^{\tau}$        6.408   22.298   63.360
       $\hat\pi_n^{\tau c}$      1.538   29.023   31.389
       $\tilde\pi_n^{\tau}$      0.592   11.893   12.244
       $\tilde\pi_n^{\tau c}$    0.037   12.256   12.257
  100  $\hat\pi_n^{\tau}$        6.609   22.742   66.418
       $\hat\pi_n^{\tau c}$      0.907   26.866   27.688
       $\tilde\pi_n^{\tau}$      0.286    9.585    9.667
       $\tilde\pi_n^{\tau c}$    0.013    9.675    9.675
  200  $\hat\pi_n^{\tau}$        6.660   24.429   68.783
       $\hat\pi_n^{\tau c}$      0.467   26.910   27.128
       $\tilde\pi_n^{\tau}$      0.168    8.592    8.620
       $\tilde\pi_n^{\tau c}$    0.034    8.611    8.612

Table 3.2: Finite Sample Performances of the Test Statistics

                                    sizes                    powers
  n    test                   1%      5%     10%       1%      5%     10%
  25   $T_n^{\tau}$          0.051   0.212   0.351    0.204   0.503   0.661
       $T_n^{\tau *}$        0.029   0.071   0.123    0.120   0.243   0.355
       $\tilde T_n^{\tau}$   0.056   0.135   0.203    0.161   0.298   0.388
       $\tilde T_n^{\tau *}$ 0.010   0.042   0.091    0.035   0.131   0.228
  50   $T_n^{\tau}$          0.060   0.246   0.387    0.522   0.790   0.879
       $T_n^{\tau *}$        0.016   0.057   0.114    0.271   0.490   0.612
       $\tilde T_n^{\tau}$   0.025   0.080   0.134    0.329   0.516   0.612
       $\tilde T_n^{\tau *}$ 0.010   0.050   0.100    0.221   0.430   0.551
  100  $T_n^{\tau}$          0.065   0.235   0.375    0.927   0.987   0.995
       $T_n^{\tau *}$        0.010   0.056   0.108    0.753   0.905   0.951
       $\tilde T_n^{\tau}$   0.015   0.064   0.116    0.815   0.904   0.938
       $\tilde T_n^{\tau *}$ 0.009   0.049   0.099    0.770   0.886   0.927
  200  $T_n^{\tau}$          0.062   0.229   0.362    1.000   1.000   1.000
       $T_n^{\tau *}$        0.012   0.056   0.107    0.998   1.000   1.000
       $\tilde T_n^{\tau}$   0.014   0.057   0.109    0.997   0.999   0.999
       $\tilde T_n^{\tau *}$ 0.011   0.050   0.101    0.996   0.999   0.999