poe 4 formulas

Upload: missinu

Posted on 03-Jun-2018

8/12/2019 Poe 4 Formulas

The Rules of Summation

\sum_{i=1}^{n} x_i = x_1 + x_2 + \cdots + x_n

\sum_{i=1}^{n} a = na

\sum_{i=1}^{n} a x_i = a \sum_{i=1}^{n} x_i

\sum_{i=1}^{n} (x_i + y_i) = \sum_{i=1}^{n} x_i + \sum_{i=1}^{n} y_i

\sum_{i=1}^{n} (a x_i + b y_i) = a \sum_{i=1}^{n} x_i + b \sum_{i=1}^{n} y_i

\sum_{i=1}^{n} (a + b x_i) = na + b \sum_{i=1}^{n} x_i

\bar{x} = \frac{\sum_{i=1}^{n} x_i}{n} = \frac{x_1 + x_2 + \cdots + x_n}{n}

\sum_{i=1}^{n} (x_i - \bar{x}) = 0

\sum_{i=1}^{2} \sum_{j=1}^{3} f(x_i, y_j) = \sum_{i=1}^{2} [f(x_i, y_1) + f(x_i, y_2) + f(x_i, y_3)]
  = f(x_1, y_1) + f(x_1, y_2) + f(x_1, y_3) + f(x_2, y_1) + f(x_2, y_2) + f(x_2, y_3)
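As a quick numeric sanity check (on a made-up data set), the linearity rule \sum (a + b x_i) = na + b \sum x_i and the deviation rule \sum (x_i - \bar{x}) = 0 can be verified directly:

```python
# Quick check of two summation rules on a made-up data set:
# sum(a + b*x_i) = n*a + b*sum(x_i), and sum(x_i - xbar) = 0.
x = [2.0, 4.0, 6.0, 8.0]
n, a, b = len(x), 3.0, 0.5
xbar = sum(x) / n

lhs = sum(a + b * xi for xi in x)
rhs = n * a + b * sum(x)
print(abs(lhs - rhs) < 1e-12)                    # True
print(abs(sum(xi - xbar for xi in x)) < 1e-12)   # True
```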

Expected Values & Variances

E(X) = x_1 f(x_1) + x_2 f(x_2) + \cdots + x_n f(x_n) = \sum_{i=1}^{n} x_i f(x_i) = \sum_{x} x f(x)

E[g(X)] = \sum_{x} g(x) f(x)

E[g_1(X) + g_2(X)] = \sum_{x} [g_1(x) + g_2(x)] f(x)
  = \sum_{x} g_1(x) f(x) + \sum_{x} g_2(x) f(x) = E[g_1(X)] + E[g_2(X)]

E(c) = c
E(cX) = c E(X)
E(a + cX) = a + c E(X)

var(X) = \sigma^2 = E[X - E(X)]^2 = E(X^2) - [E(X)]^2

var(a + cX) = E[(a + cX) - E(a + cX)]^2 = c^2 var(X)
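A minimal sketch of these definitions, with a made-up discrete pmf, checking the shortcut var(X) = E(X^2) - [E(X)]^2 against the definitional form:

```python
# Sketch: E(X) and var(X) for a small made-up discrete pmf, checking
# the shortcut var(X) = E(X^2) - [E(X)]^2 against the definition.
xs = [1, 2, 3, 4]
fx = [0.1, 0.2, 0.3, 0.4]      # probabilities sum to 1

EX  = sum(x * p for x, p in zip(xs, fx))                    # E(X)
EX2 = sum(x * x * p for x, p in zip(xs, fx))                # E(X^2)
var_def   = sum((x - EX) ** 2 * p for x, p in zip(xs, fx))  # E[X - E(X)]^2
var_short = EX2 - EX ** 2

print(abs(EX - 3.0) < 1e-9)               # True
print(abs(var_def - var_short) < 1e-9)    # True
```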

Marginal and Conditional Distributions

f(x) = \sum_{y} f(x, y) for each value X can take

f(y) = \sum_{x} f(x, y) for each value Y can take

f(x \mid y) = P(X = x \mid Y = y) = \frac{f(x, y)}{f(y)}

If X and Y are independent random variables, then f(x, y) = f(x) f(y) for each and every pair of values x and y. The converse is also true.

If X and Y are independent random variables, then the conditional probability density function of X given that Y = y is

f(x \mid y) = \frac{f(x, y)}{f(y)} = \frac{f(x) f(y)}{f(y)} = f(x)

for each and every pair of values x and y. The converse is also true.
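A small sketch of the marginal and conditional formulas, using a made-up joint pmf built as a product f(x, y) = f(x) f(y), so X and Y are independent by construction and f(x | y) should recover f(x) for every y:

```python
# Sketch: marginals and conditionals from a made-up joint pmf that is
# a product f(x, y) = f(x) f(y), i.e. X and Y independent by construction.
fx = {0: 0.4, 1: 0.6}
fy = {0: 0.3, 1: 0.7}
joint = {(x, y): fx[x] * fy[y] for x in fx for y in fy}

# marginal of X: sum f(x, y) over the values Y can take
marg_x = {x: sum(joint[(x, y)] for y in fy) for x in fx}
# conditional: f(x | y) = f(x, y) / f(y)
cond = {(x, y): joint[(x, y)] / fy[y] for (x, y) in joint}

print(abs(marg_x[0] - fx[0]) < 1e-12)                               # True
print(all(abs(cond[(x, y)] - fx[x]) < 1e-12 for (x, y) in joint))   # True
```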

Expectations, Variances & Covariances

cov(X, Y) = E[(X - E(X))(Y - E(Y))] = \sum_{x} \sum_{y} [x - E(X)][y - E(Y)] f(x, y)

\rho = \frac{cov(X, Y)}{\sqrt{var(X)\,var(Y)}}

E(c_1 X + c_2 Y) = c_1 E(X) + c_2 E(Y)
E(X + Y) = E(X) + E(Y)

var(aX + bY + cZ) = a^2 var(X) + b^2 var(Y) + c^2 var(Z) + 2ab\,cov(X, Y) + 2ac\,cov(X, Z) + 2bc\,cov(Y, Z)

If X, Y, and Z are independent, or uncorrelated, random variables, then the covariance terms are zero and:

var(aX + bY + cZ) = a^2 var(X) + b^2 var(Y) + c^2 var(Z)
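A sketch verifying the variance-of-a-linear-combination rule numerically (two variables, made-up joint pmf):

```python
# Sketch: covariance from a small made-up joint pmf, checking
# var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y).
joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

EX = sum(x * p for (x, y), p in joint.items())
EY = sum(y * p for (x, y), p in joint.items())
cov  = sum((x - EX) * (y - EY) * p for (x, y), p in joint.items())
varX = sum((x - EX) ** 2 * p for (x, y), p in joint.items())
varY = sum((y - EY) ** 2 * p for (x, y), p in joint.items())

a, b = 2.0, 3.0
E_lin = a * EX + b * EY
var_lin = sum((a * x + b * y - E_lin) ** 2 * p for (x, y), p in joint.items())

print(abs(var_lin - (a * a * varX + b * b * varY + 2 * a * b * cov)) < 1e-9)  # True
```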

Normal Probabilities

If X ~ N(\mu, \sigma^2), then Z = \frac{X - \mu}{\sigma} ~ N(0, 1)

If X ~ N(\mu, \sigma^2) and a is a constant, then

P(X \ge a) = P\left( Z \ge \frac{a - \mu}{\sigma} \right)

If X ~ N(\mu, \sigma^2) and a and b are constants, then

P(a \le X \le b) = P\left( \frac{a - \mu}{\sigma} \le Z \le \frac{b - \mu}{\sigma} \right)
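The standardization formula can be sketched with the standard library, using \Phi(z) = (1 + \mathrm{erf}(z/\sqrt{2}))/2 for the standard normal CDF (the parameter values below are made up):

```python
import math

# Sketch: P(a <= X <= b) for X ~ N(mu, sigma^2) via standardization
# Z = (X - mu)/sigma; Phi(z) = (1 + erf(z/sqrt(2)))/2 is the N(0,1) CDF.
def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 10.0, 2.0    # made-up mean and standard deviation
a, b = 8.0, 12.0
p = phi((b - mu) / sigma) - phi((a - mu) / sigma)
print(round(p, 4))       # 0.6827, i.e. P(-1 <= Z <= 1)
```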

Assumptions of the Simple Linear Regression Model

SR1 The value of y, for each value of x, is y = \beta_1 + \beta_2 x + e
SR2 The average value of the random error e is E(e) = 0, since we assume that E(y) = \beta_1 + \beta_2 x
SR3 The variance of the random error e is var(e) = \sigma^2 = var(y)
SR4 The covariance between any pair of random errors, e_i and e_j, is cov(e_i, e_j) = cov(y_i, y_j) = 0
SR5 The variable x is not random and must take at least two different values.
SR6 (optional) The values of e are normally distributed about their mean: e ~ N(0, \sigma^2)

Least Squares Estimation

If b_1 and b_2 are the least squares estimates, then

\hat{y}_i = b_1 + b_2 x_i
\hat{e}_i = y_i - \hat{y}_i = y_i - b_1 - b_2 x_i

The Normal Equations

N b_1 + \sum x_i\, b_2 = \sum y_i
\sum x_i\, b_1 + \sum x_i^2\, b_2 = \sum x_i y_i

Least Squares Estimators

b_2 = \frac{\sum (x_i - \bar{x})(y_i - \bar{y})}{\sum (x_i - \bar{x})^2}

b_1 = \bar{y} - b_2 \bar{x}
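The estimator formulas translate directly into code; a minimal sketch on a tiny made-up data set:

```python
# Sketch: b2 = S(xi - xbar)(yi - ybar) / S(xi - xbar)^2 and
# b1 = ybar - b2*xbar, applied to a tiny made-up data set.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.5, 5.5, 8.5, 9.5]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
b2 = sxy / sxx
b1 = ybar - b2 * xbar

print(round(b2, 3), round(b1, 3))   # 1.9 0.3
```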


Elasticity

\eta = \frac{\text{percentage change in } y}{\text{percentage change in } x} = \frac{\Delta y / y}{\Delta x / x} = \frac{\Delta y}{\Delta x} \cdot \frac{x}{y}

\eta = \frac{\Delta E(y)/E(y)}{\Delta x / x} = \frac{\Delta E(y)}{\Delta x} \cdot \frac{x}{E(y)} = \beta_2 \cdot \frac{x}{E(y)}

Least Squares Expressions Useful for Theory

b_2 = \beta_2 + \sum w_i e_i, \quad w_i = \frac{x_i - \bar{x}}{\sum (x_i - \bar{x})^2}

\sum w_i = 0, \quad \sum w_i x_i = 1, \quad \sum w_i^2 = 1 / \sum (x_i - \bar{x})^2

Properties of the Least Squares Estimators

var(b_1) = \sigma^2 \left[ \frac{\sum x_i^2}{N \sum (x_i - \bar{x})^2} \right]

var(b_2) = \frac{\sigma^2}{\sum (x_i - \bar{x})^2}

cov(b_1, b_2) = \sigma^2 \left[ \frac{-\bar{x}}{\sum (x_i - \bar{x})^2} \right]

Gauss-Markov Theorem: Under the assumptions SR1-SR5 of the linear regression model, the estimators b_1 and b_2 have the smallest variance of all linear and unbiased estimators of \beta_1 and \beta_2. They are the Best Linear Unbiased Estimators (BLUE) of \beta_1 and \beta_2.

If we make the normality assumption, assumption SR6, about the error term, then the least squares estimators are normally distributed:

b_1 ~ N\left( \beta_1, \frac{\sigma^2 \sum x_i^2}{N \sum (x_i - \bar{x})^2} \right), \quad b_2 ~ N\left( \beta_2, \frac{\sigma^2}{\sum (x_i - \bar{x})^2} \right)

Estimated Error Variance

\hat{\sigma}^2 = \frac{\sum \hat{e}_i^2}{N - 2}

Estimator Standard Errors

se(b_1) = \sqrt{\widehat{var}(b_1)}, \quad se(b_2) = \sqrt{\widehat{var}(b_2)}
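Continuing the sketch above: the estimated error variance and se(b_2) from the residuals of a fitted simple regression, on the same kind of made-up data:

```python
import math

# Sketch: sigma2hat = S(ehat_i^2)/(N - 2) and
# se(b2) = sqrt(sigma2hat / S(xi - xbar)^2), on made-up data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.5, 5.5, 8.5, 9.5]
N = len(x)
xbar, ybar = sum(x) / N, sum(y) / N
sxx = sum((xi - xbar) ** 2 for xi in x)
b2 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b1 = ybar - b2 * xbar

ehat = [yi - (b1 + b2 * xi) for xi, yi in zip(x, y)]   # residuals
sigma2hat = sum(e * e for e in ehat) / (N - 2)
se_b2 = math.sqrt(sigma2hat / sxx)

print(round(sigma2hat, 4), round(se_b2, 4))   # 0.3 0.1732
```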

t-distribution

If assumptions SR1-SR6 of the simple linear regression model hold, then

t = \frac{b_k - \beta_k}{se(b_k)} ~ t_{(N-2)}, \quad k = 1, 2

Interval Estimates

P[b_2 - t_c se(b_2) \le \beta_2 \le b_2 + t_c se(b_2)] = 1 - \alpha

Hypothesis Testing

    Components of Hypothesis Tests

1. A null hypothesis, H_0
2. An alternative hypothesis, H_1
3. A test statistic

    4. A rejection region

    5. A conclusion

If the null hypothesis H_0: \beta_2 = c is true, then

t = \frac{b_2 - c}{se(b_2)} ~ t_{(N-2)}

    Rejection rule for a two-tail test: If the value of the

    test statistic falls in the rejection region, either tail of

    the t-distribution, then we reject the null hypothesis

    and accept the alternative.

Type I error: The null hypothesis is true and we decide to reject it.

Type II error: The null hypothesis is false and we decide not to reject it.

p-value rejection rule: When the p-value of a hypothesis test is smaller than the chosen value of \alpha, the test procedure leads to rejection of the null hypothesis.
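A minimal sketch of the two-tail decision rule. The estimates are made up, and the critical value t_c = 2.306 (the 97.5th percentile of the t-distribution with 8 degrees of freedom, taken from a t-table) is supplied by hand, since the standard library has no t quantile function:

```python
# Sketch of the two-tail t-test for H0: beta2 = c. Estimates and the
# critical value t_c (97.5th percentile, t with 8 df) are supplied by hand.
b2, c, se_b2 = 1.9, 1.0, 0.35   # made-up estimate, hypothesized value, std. error
t_stat = (b2 - c) / se_b2
t_c = 2.306

reject = abs(t_stat) > t_c      # reject H0 if |t| falls in either tail
print(round(t_stat, 3), reject)   # 2.571 True
```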

Prediction

y_0 = \beta_1 + \beta_2 x_0 + e_0, \quad \hat{y}_0 = b_1 + b_2 x_0, \quad f = \hat{y}_0 - y_0

\widehat{var}(f) = \hat{\sigma}^2 \left[ 1 + \frac{1}{N} + \frac{(x_0 - \bar{x})^2}{\sum (x_i - \bar{x})^2} \right], \quad se(f) = \sqrt{\widehat{var}(f)}

A (1 - \alpha) \times 100% confidence interval, or prediction interval, for y_0:

\hat{y}_0 \pm t_c\, se(f)

Goodness of Fit

\sum (y_i - \bar{y})^2 = \sum (\hat{y}_i - \bar{y})^2 + \sum \hat{e}_i^2

SST = SSR + SSE

R^2 = \frac{SSR}{SST} = 1 - \frac{SSE}{SST} = [corr(y, \hat{y})]^2
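The sum-of-squares decomposition can be checked numerically; a sketch on made-up data (the identity SST = SSR + SSE requires a model with an intercept):

```python
# Sketch: SST = SSR + SSE and R^2 = SSR/SST for a fitted simple
# regression on made-up data.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.5, 5.5, 8.5, 9.5]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b2 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sum((xi - xbar) ** 2 for xi in x)
b1 = ybar - b2 * xbar
yhat = [b1 + b2 * xi for xi in x]

SST = sum((yi - ybar) ** 2 for yi in y)
SSR = sum((yh - ybar) ** 2 for yh in yhat)
SSE = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))

print(abs(SST - (SSR + SSE)) < 1e-9)   # True
print(round(SSR / SST, 4))             # 0.9757
```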

Log-Linear Model

\ln(y) = \beta_1 + \beta_2 x + e, \quad \widehat{\ln(y)} = b_1 + b_2 x

100 \cdot b_2 \approx \% change in y given a one-unit change in x

\hat{y}_n = \exp(b_1 + b_2 x), \quad \hat{y}_c = \exp(b_1 + b_2 x) \exp(\hat{\sigma}^2 / 2)

Prediction interval:

\left[ \exp\left( \widehat{\ln(y)} - t_c\, se(f) \right),\ \exp\left( \widehat{\ln(y)} + t_c\, se(f) \right) \right]

Generalized goodness-of-fit measure: R_g^2 = [corr(y, \hat{y}_n)]^2

Assumptions of the Multiple Regression Model

MR1 y_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i
MR2 E(y_i) = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK}, equivalently E(e_i) = 0
MR3 var(y_i) = var(e_i) = \sigma^2
MR4 cov(y_i, y_j) = cov(e_i, e_j) = 0
MR5 The values of x_{ik} are not random and are not exact linear functions of the other explanatory variables.
MR6 y_i ~ N(\beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK}, \sigma^2), equivalently e_i ~ N(0, \sigma^2)

Least Squares Estimates in MR Model

Least squares estimates b_1, b_2, \ldots, b_K minimize

S(b_1, b_2, \ldots, b_K) = \sum (y_i - b_1 - b_2 x_{i2} - \cdots - b_K x_{iK})^2

Estimated Error Variance and Estimator Standard Errors

\hat{\sigma}^2 = \frac{\sum \hat{e}_i^2}{N - K}, \quad se(b_k) = \sqrt{\widehat{var}(b_k)}


Hypothesis Tests and Interval Estimates for Single Parameters

Use the t-distribution: t = \frac{b_k - \beta_k}{se(b_k)} ~ t_{(N-K)}

t-test for More than One Parameter

H_0: \beta_2 + c \beta_3 = a. When H_0 is true,

t = \frac{b_2 + c b_3 - a}{se(b_2 + c b_3)} ~ t_{(N-K)}

se(b_2 + c b_3) = \sqrt{\widehat{var}(b_2) + c^2\, \widehat{var}(b_3) + 2c\, \widehat{cov}(b_2, b_3)}

Joint F-tests

To test J joint hypotheses,

F = \frac{(SSE_R - SSE_U)/J}{SSE_U/(N - K)}

To test the overall significance of the model, the null and alternative hypotheses and F statistic are

H_0: \beta_2 = 0, \beta_3 = 0, \ldots, \beta_K = 0
H_1: at least one of the \beta_k is nonzero

F = \frac{(SST - SSE)/(K - 1)}{SSE/(N - K)}

RESET: A Specification Test

y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + e_i, \quad \hat{y}_i = b_1 + b_2 x_{i2} + b_3 x_{i3}

y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + \gamma_1 \hat{y}_i^2 + e_i; \quad H_0: \gamma_1 = 0

y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + \gamma_1 \hat{y}_i^2 + \gamma_2 \hat{y}_i^3 + e_i; \quad H_0: \gamma_1 = \gamma_2 = 0

Model Selection

AIC = \ln(SSE/N) + 2K/N
SC = \ln(SSE/N) + K \ln(N)/N

Collinearity and Omitted Variables

y_i = \beta_1 + \beta_2 x_{i2} + \beta_3 x_{i3} + e_i

var(b_2) = \frac{\sigma^2}{(1 - r_{23}^2) \sum (x_{i2} - \bar{x}_2)^2}

When x_3 is omitted: bias(b_2^*) = E(b_2^*) - \beta_2 = \beta_3 \frac{\widehat{cov}(x_2, x_3)}{\widehat{var}(x_2)}

Heteroskedasticity

var(y_i) = var(e_i) = \sigma_i^2

General variance function:

\sigma_i^2 = \exp(\alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS})

Breusch-Pagan and White tests for H_0: \alpha_2 = \alpha_3 = \cdots = \alpha_S = 0. When H_0 is true,

\chi^2 = N \times R^2 ~ \chi^2_{(S-1)}

Goldfeld-Quandt test for H_0: \sigma_M^2 = \sigma_R^2 versus H_1: \sigma_M^2 \ne \sigma_R^2. When H_0 is true,

F = \hat{\sigma}_M^2 / \hat{\sigma}_R^2 ~ F_{(N_M - K_M,\, N_R - K_R)}

Transformed model for var(e_i) = \sigma_i^2 = \sigma^2 x_i:

y_i / \sqrt{x_i} = \beta_1 (1/\sqrt{x_i}) + \beta_2 (x_i / \sqrt{x_i}) + e_i / \sqrt{x_i}

Estimating the variance function:

\ln(\hat{e}_i^2) = \ln(\sigma_i^2) + v_i = \alpha_1 + \alpha_2 z_{i2} + \cdots + \alpha_S z_{iS} + v_i

Grouped data:

var(e_i) = \sigma_i^2 = \begin{cases} \sigma_M^2 & i = 1, 2, \ldots, N_M \\ \sigma_R^2 & i = 1, 2, \ldots, N_R \end{cases}

Transformed model for feasible generalized least squares:

y_i / \hat{\sigma}_i = \beta_1 (1/\hat{\sigma}_i) + \beta_2 (x_i / \hat{\sigma}_i) + e_i / \hat{\sigma}_i

Regression with Stationary Time Series Variables

Finite distributed lag model:

y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_q x_{t-q} + v_t

Correlogram:

r_k = \frac{\sum (y_t - \bar{y})(y_{t-k} - \bar{y})}{\sum (y_t - \bar{y})^2}

For H_0: \rho_k = 0, \quad z = \sqrt{T}\, r_k ~ N(0, 1)

LM test:

y_t = \beta_1 + \beta_2 x_t + \rho \hat{e}_{t-1} + v_t; test H_0: \rho = 0 with a t-test
\hat{e}_t = \gamma_1 + \gamma_2 x_t + \rho \hat{e}_{t-1} + v_t; test using LM = T \times R^2

AR(1) error: y_t = \beta_1 + \beta_2 x_t + e_t, \quad e_t = \rho e_{t-1} + v_t

Nonlinear least squares estimation:

y_t = \beta_1 (1 - \rho) + \beta_2 x_t + \rho y_{t-1} - \beta_2 \rho x_{t-1} + v_t

ARDL(p, q) model:

y_t = \delta + \delta_0 x_t + \delta_1 x_{t-1} + \cdots + \delta_q x_{t-q} + \theta_1 y_{t-1} + \cdots + \theta_p y_{t-p} + v_t

AR(p) forecasting model:

y_t = \delta + \theta_1 y_{t-1} + \theta_2 y_{t-2} + \cdots + \theta_p y_{t-p} + v_t

Exponential smoothing: \hat{y}_t = \alpha y_{t-1} + (1 - \alpha) \hat{y}_{t-1}

Multiplier analysis:

\delta_0 + \delta_1 L + \delta_2 L^2 + \cdots + \delta_q L^q = (1 - \theta_1 L - \theta_2 L^2 - \cdots - \theta_p L^p)(\beta_0 + \beta_1 L + \beta_2 L^2 + \cdots)
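The exponential smoothing recursion is easy to sketch on a made-up series. Initializing the first smoothed value at y_1 is a common convention, not something the formula sheet specifies:

```python
# Sketch of exponential smoothing yhat_t = a*y_{t-1} + (1 - a)*yhat_{t-1}
# on a made-up series; yhat_1 = y_1 is an assumed initialization.
y = [10.0, 12.0, 11.0, 13.0, 12.5]
a = 0.4                      # made-up smoothing weight
yhat = [y[0]]
for t in range(1, len(y)):
    yhat.append(a * y[t - 1] + (1 - a) * yhat[t - 1])

print([round(v, 3) for v in yhat])   # [10.0, 10.0, 10.8, 10.88, 11.728]
```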

    Unit Roots and Cointegration

Unit Root Test for Stationarity: Null hypothesis: H_0: \gamma = 0

Dickey-Fuller Test 1 (no constant and no trend):

\Delta y_t = \gamma y_{t-1} + v_t

Dickey-Fuller Test 2 (with constant but no trend):

\Delta y_t = \alpha + \gamma y_{t-1} + v_t

Dickey-Fuller Test 3 (with constant and with trend):

\Delta y_t = \alpha + \gamma y_{t-1} + \lambda t + v_t

Augmented Dickey-Fuller Tests:

\Delta y_t = \alpha + \gamma y_{t-1} + \sum_{s=1}^{m} a_s \Delta y_{t-s} + v_t

Test for cointegration:

\Delta \hat{e}_t = \gamma \hat{e}_{t-1} + v_t

Random walk: y_t = y_{t-1} + v_t
Random walk with drift: y_t = \alpha + y_{t-1} + v_t
Random walk model with drift and time trend: y_t = \alpha + \delta t + y_{t-1} + v_t

Panel Data

Pooled least squares regression:

y_{it} = \beta_1 + \beta_2 x_{2it} + \beta_3 x_{3it} + e_{it}

Cluster-robust standard errors: cov(e_{it}, e_{is}) = \psi_{ts}

Fixed effects model:

y_{it} = \beta_{1i} + \beta_2 x_{2it} + \beta_3 x_{3it} + e_{it}, \quad \beta_{1i} not random

(y_{it} - \bar{y}_i) = \beta_2 (x_{2it} - \bar{x}_{2i}) + \beta_3 (x_{3it} - \bar{x}_{3i}) + (e_{it} - \bar{e}_i)

Random effects model:

y_{it} = \beta_{1i} + \beta_2 x_{2it} + \beta_3 x_{3it} + e_{it}, \quad \beta_{1i} = \bar{\beta}_1 + u_i random

(y_{it} - \alpha \bar{y}_i) = \bar{\beta}_1 (1 - \alpha) + \beta_2 (x_{2it} - \alpha \bar{x}_{2i}) + \beta_3 (x_{3it} - \alpha \bar{x}_{3i}) + v_{it}^*

\alpha = 1 - \frac{\sigma_e}{\sqrt{T \sigma_u^2 + \sigma_e^2}}

Hausman test:

t = \frac{b_{FE,k} - b_{RE,k}}{\left[ \widehat{var}(b_{FE,k}) - \widehat{var}(b_{RE,k}) \right]^{1/2}}