Linear Stationary Processes. ARMA models
Linear Stationary Processes. ARMA models
• This lecture introduces the basic linear models for stationary processes.
• Considering only stationary processes is very restrictive since most economic variables are non-stationary.
• However, stationary linear models are used as building blocks in more complicated nonlinear and/or non-stationary models.
Roadmap
1. The Wold decomposition
2. From the Wold decomposition to the ARMA representation
3. MA processes and invertibility
4. AR processes, stationarity and causality
5. ARMA, invertibility and causality
The Wold Decomposition
Wold theorem in words:
Any stationary process {Zt} can be expressed as a sum of two components:
- a stochastic component: a linear combination of lags of a white noise process.
- a deterministic component, uncorrelated with the stochastic component.
The Wold Theorem
If $\{Z_t\}$ is a nondeterministic stationary time series, then

$Z_t = \sum_{j=0}^{\infty} \psi_j a_{t-j} + V_t$

where $\psi_0 = 1$, $\sum_{j=0}^{\infty} \psi_j^2 < \infty$, $\{a_t\}$ is white noise, and $\{V_t\}$ is deterministic and uncorrelated with $\{a_t\}$.
Some Remarks on the Wold Decomposition, I
Importance of the Wold decomposition
• Any stationary process can be written as a linear combination of lagged values of a white noise process (MA(∞) representation).
• This implies that if a process is stationary we immediately know how to write a model for it.
• Problem: we might need to estimate a lot of parameters (in most cases, an infinite number of them!)
• ARMA models are an approximation to the Wold representation. This approximation is more parsimonious (= fewer parameters).
Birth of the ARMA(p,q) models
Under general conditions, the infinite lag polynomial of the Wold decomposition can be approximated by the ratio of two finite-lag polynomials:

$\psi(L) \approx \dfrac{\theta_q(L)}{\phi_p(L)}$

Therefore,

$\phi_p(L)\, Z_t = \theta_q(L)\, a_t$

where $\phi_p(L)$ is the AR(p) part and $\theta_q(L)$ is the MA(q) part.
MA processes
MA(1) process (or ARMA(0,1))
Let $\{a_t\}$ be a zero-mean white noise process with variance $\sigma^2$, and define $Z_t = \mu + a_t + \theta a_{t-1}$.

- Expectation: $E[Z_t] = \mu$

- Variance: $\gamma_0 = (1 + \theta^2)\sigma^2$

- Autocovariance: $\gamma_1 = \theta \sigma^2$
MA(1) processes (cont.)

- Higher-order autocovariances: $\gamma_j = 0$ for $j \ge 2$

- Autocorrelation: $\rho_1 = \theta/(1+\theta^2)$ and $\rho_j = 0$ for $j \ge 2$, so the ACF cuts off after lag 1

- Partial autocorrelation: decays gradually instead of cutting off
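These MA(1) moment formulas can be checked by simulation. A minimal sketch in Python (the parameter values and seed are illustrative, not from the slides):

```python
import numpy as np

# Simulate Z_t = mu + a_t + theta * a_{t-1} and compare sample moments
# with the theoretical ones (illustrative parameter values).
rng = np.random.default_rng(0)
mu, theta, sigma, n = 2.0, 0.6, 1.0, 200_000

a = rng.normal(0.0, sigma, n + 1)   # zero-mean white noise
z = mu + a[1:] + theta * a[:-1]     # MA(1)

mean_hat = z.mean()
var_hat = z.var()
acov1_hat = np.mean((z[1:] - mean_hat) * (z[:-1] - mean_hat))

print(mean_hat)    # close to mu
print(var_hat)     # close to (1 + theta**2) * sigma**2
print(acov1_hat)   # close to theta * sigma**2
```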
MA(1) processes (cont.)

Stationarity: an MA(1) process is always covariance-stationary, because its mean and autocovariances are finite and do not depend on $t$ for any value of $\theta$.
MA(q)

Moments: with $\theta_0 = 1$,

$E[Z_t] = \mu$, $\quad \gamma_j = \sigma^2 \sum_{k=0}^{q-j} \theta_k \theta_{k+j}$ for $0 \le j \le q$, $\quad \gamma_j = 0$ for $j > q$.

An MA(q) is covariance-stationary for the same reasons as an MA(1): its moments are finite and time-invariant for any values of the coefficients.
MA(∞)

Is it covariance-stationary? The process $Z_t = \mu + \sum_{j=0}^{\infty} \psi_j a_{t-j}$ is covariance-stationary provided that

$\sum_{j=0}^{\infty} \psi_j^2 < \infty$

(the MA coefficients are square-summable).
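Square-summability is easy to check numerically for a concrete coefficient sequence. A sketch with geometrically decaying coefficients $\psi_j = \phi^j$ (an illustrative choice, for which $\sum_j \psi_j^2 = 1/(1-\phi^2)$):

```python
import numpy as np

# Truncated MA(infinity) coefficients psi_j = phi**j (illustrative choice).
phi = 0.8
psi = phi ** np.arange(200)

sum_sq = np.sum(psi**2)     # partial sum of psi_j^2

# A finite limit => the MA(infinity) process is covariance-stationary.
print(sum_sq)               # close to 1 / (1 - phi**2)
```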
Invertibility

Definition: an MA(q) process is said to be invertible if it admits an autoregressive representation.

Theorem (necessary and sufficient condition for invertibility): let $\{Z_t\}$ be an MA(q), $Z_t = \mu + \theta(L) a_t$. Then $\{Z_t\}$ is invertible if and only if all roots of $\theta(z) = 0$ lie outside the unit circle. The coefficients of the AR representation, $\{\pi_j\}$, are determined by the relation $\pi(L) = \theta(L)^{-1}$, i.e. $\pi(L)\theta(L) = 1$.
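For an invertible MA(1), the relation $\pi(L)\theta(L) = 1$ can be expanded by matching powers of $L$, which gives the recursion $\pi_j = -\theta\,\pi_{j-1}$. A minimal sketch (the value of $\theta$ is illustrative):

```python
import numpy as np

# AR(infinity) coefficients of an invertible MA(1), Z_t = a_t + theta*a_{t-1}.
# Matching powers of L in pi(L) * theta(L) = 1 gives pi_j = -theta * pi_{j-1}.
theta = 0.5          # |theta| < 1, so the process is invertible
n = 10

pi = np.empty(n)
pi[0] = 1.0
for j in range(1, n):
    pi[j] = -theta * pi[j - 1]

print(pi)            # matches the closed form (-theta)**j
```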
Identification of the MA(1)

Consider the autocorrelation functions of two MA(1) processes, one with parameter $\theta$ and one with parameter $1/\theta$. Both have

$\rho_1 = \dfrac{\theta}{1+\theta^2} = \dfrac{1/\theta}{1+(1/\theta)^2}$

so the two processes show an identical correlation pattern: the MA coefficient is not uniquely identified. In other words, any MA(1) process has two representations, one with MA parameter larger than 1 in absolute value and the other with MA parameter smaller than 1.
Identification of the MA(1)
• If we identify the MA(1) through its autocorrelation structure, we need to decide which value of $\theta$ to choose: the one greater than one in absolute value or the one smaller than one. We prefer representations that are invertible, so we choose the value with $|\theta| < 1$.
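The identification problem is visible directly in the formula $\rho_1 = \theta/(1+\theta^2)$: the parameters $\theta$ and $1/\theta$ produce the same value. A quick check (parameter values illustrative):

```python
# Lag-1 autocorrelation of an MA(1): rho_1 = theta / (1 + theta**2).
def rho1(theta: float) -> float:
    return theta / (1 + theta**2)

# theta = 0.5 and theta = 1/0.5 = 2 imply the same autocorrelation,
# so the ACF alone cannot distinguish the two representations.
print(rho1(0.5), rho1(2.0))   # both equal 0.4
```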
AR processes
AR(1) process
Stationarity: substituting recursively in $Z_t = c + \phi Z_{t-1} + a_t$,

$Z_t = \sum_{j=0}^{k-1} \phi^j (c + a_{t-j}) + \phi^k Z_{t-k}$

The terms form a geometric progression in $\phi$. Remember: a geometric series $\sum_j \phi^j$ converges if and only if $|\phi| < 1$.
AR(1) (cont.)
Hence, an AR(1) process is stationary if $|\phi| < 1$.

Mean of a stationary AR(1): $E[Z_t] = \mu = \dfrac{c}{1-\phi}$

Variance of a stationary AR(1): $\gamma_0 = \dfrac{\sigma^2}{1-\phi^2}$
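These formulas can be checked by simulating a stationary AR(1). A sketch assuming $Z_t = c + \phi Z_{t-1} + a_t$, with illustrative parameter values and seed:

```python
import numpy as np

# Simulate Z_t = c + phi*Z_{t-1} + a_t and compare sample moments with
# mu = c/(1-phi) and gamma_0 = sigma^2/(1-phi^2). Values are illustrative.
rng = np.random.default_rng(1)
c, phi, sigma, n = 1.0, 0.7, 1.0, 500_000

a = rng.normal(0.0, sigma, n)
z = np.empty(n)
z[0] = c / (1 - phi)            # start at the stationary mean
for t in range(1, n):
    z[t] = c + phi * z[t - 1] + a[t]

print(z.mean())                 # close to c / (1 - phi)
print(z.var())                  # close to sigma**2 / (1 - phi**2)
```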
Autocovariance of a stationary AR(1)
You need to solve a system of equations:

$\gamma_j = \phi \gamma_{j-1}, \quad j \ge 1$

Autocorrelation of a stationary AR(1) (ACF):

$\rho_j = \phi^j, \quad j \ge 0$

so the ACF decays geometrically.
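The recursion $\gamma_j = \phi\gamma_{j-1}$ immediately gives the geometric ACF $\rho_j = \phi^j$. A small sketch iterating the recursion (parameter values illustrative):

```python
import numpy as np

# Iterate gamma_j = phi * gamma_{j-1} and verify rho_j = phi**j.
phi, sigma = 0.7, 1.0
gamma0 = sigma**2 / (1 - phi**2)

gamma = [gamma0]
for j in range(1, 8):
    gamma.append(phi * gamma[-1])   # autocovariance recursion

rho = np.array(gamma) / gamma0      # rho_j = gamma_j / gamma_0
print(rho)                          # geometric decay phi**j
```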
EXERCISE
Compute the partial autocorrelation function of an AR(1) process. Compare its pattern to that of the MA(1) process.
AR(p)
Stationarity: all $p$ roots of the characteristic equation $\phi(z) = 0$ lie outside the unit circle.

ACF: a system to solve for the first $p$ autocorrelations ($p$ unknowns and $p$ equations). The ACF decays as a mixture of exponentials and/or damped sine waves, depending on whether the roots are real or complex.

PACF: cuts off after lag $p$.
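The root condition for AR(p) stationarity can be checked numerically with `numpy.roots`. A sketch (the helper name and coefficient values are illustrative):

```python
import numpy as np

# Check whether all roots of 1 - phi_1*z - ... - phi_p*z^p = 0
# lie outside the unit circle.
def is_stationary(phis) -> bool:
    # np.roots takes coefficients from the highest degree down:
    # [-phi_p, ..., -phi_1, 1]
    coeffs = np.r_[-np.asarray(phis, dtype=float)[::-1], 1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.5, 0.3]))   # AR(2): True (stationary)
print(is_stationary([1.2]))        # AR(1) with phi > 1: False
```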
Exercise
Compute the mean, the variance and the autocorrelation function of an AR(2) process.
Describe the pattern of the PACF of an AR(2) process.
Causality and Stationarity
Consider the AR(1) process $Z_t = \phi Z_{t-1} + a_t$, where $\{a_t\}$ is white noise.
Causality and Stationarity (II)

When $|\phi| > 1$, solving the recursion forward gives the stationary representation $Z_t = -\sum_{j=1}^{\infty} \phi^{-j} a_{t+j}$. However, this stationary representation depends on future values of $a_t$.

It is customary to restrict attention to AR(1) processes with $|\phi| < 1$. Such processes are called stationary but also CAUSAL, or future-independent AR representations.

Remark: any AR(1) process with $|\phi| > 1$ can be rewritten as an AR(1) process with $|\phi| < 1$ and a new white noise sequence. Thus, we can restrict our analysis (without loss of generality) to processes with $|\phi| < 1$.
Causality (III)
Definition: an AR(p) process defined by the equation $\phi_p(L) Z_t = a_t$ is said to be causal, or a causal function of $\{a_t\}$, if there exists a sequence of constants $\{\psi_j\}$ with $\sum_{j=0}^{\infty} |\psi_j| < \infty$ such that $Z_t = \sum_{j=0}^{\infty} \psi_j a_{t-j}$.

- A necessary and sufficient condition for causality is $\phi(z) \neq 0$ for all $|z| \le 1$ (no roots on or inside the unit circle).
Relationship between AR(p) and MA(q)

A stationary (causal) AR(p) admits an MA(∞) representation; an invertible MA(q) admits an AR(∞) representation.
ARMA(p,q) Processes
ARMA(p,q)

$\phi_p(L)\, Z_t = \theta_q(L)\, a_t$

Pure MA representation: $Z_t = \phi_p(L)^{-1} \theta_q(L)\, a_t = \psi(L)\, a_t$

Pure AR representation: $\theta_q(L)^{-1} \phi_p(L)\, Z_t = \pi(L)\, Z_t = a_t$

Stationarity: roots of $\phi_p(L) = 0$ outside the unit circle.

Invertibility: roots of $\theta_q(L) = 0$ outside the unit circle.
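The pure MA representation $\psi(L) = \phi(L)^{-1}\theta(L)$ can be computed coefficient by coefficient. A sketch for an ARMA(1,1) with $\phi(L) = 1 - \phi L$ and $\theta(L) = 1 + \theta L$, matching powers of $L$ in $\phi(L)\psi(L) = \theta(L)$ (parameter values illustrative):

```python
import numpy as np

# psi coefficients of an ARMA(1,1): matching powers of L in
# (1 - phi*L) * psi(L) = 1 + th*L gives
#   psi_0 = 1, psi_1 = phi + th, psi_j = phi * psi_{j-1} for j >= 2.
phi, th, n = 0.6, 0.4, 12

psi = np.empty(n)
psi[0] = 1.0
psi[1] = phi + th
for j in range(2, n):
    psi[j] = phi * psi[j - 1]

# Closed form: psi_j = (phi + th) * phi**(j-1) for j >= 1
expected = np.r_[1.0, (phi + th) * phi ** np.arange(n - 1)]
print(np.allclose(psi, expected))   # True
```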
ARMA(1,1): $Z_t = \phi Z_{t-1} + a_t + \theta a_{t-1}$
ACF of ARMA(1,1)

Multiplying $Z_t = \phi Z_{t-1} + a_t + \theta a_{t-1}$ by $Z_{t-j}$ and taking expectations, you get this system of equations:

$\gamma_0 = \phi \gamma_1 + \sigma^2 \big(1 + \theta(\phi + \theta)\big)$

$\gamma_1 = \phi \gamma_0 + \theta \sigma^2$

$\gamma_j = \phi \gamma_{j-1}, \quad j \ge 2$
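The first two autocovariance equations, $\gamma_0 = \phi\gamma_1 + \sigma^2(1+\theta(\phi+\theta))$ and $\gamma_1 = \phi\gamma_0 + \theta\sigma^2$, are linear in $(\gamma_0, \gamma_1)$ and can be solved directly. A sketch with illustrative parameter values:

```python
import numpy as np

# Solve the ARMA(1,1) autocovariance system
#   gamma_0 - phi*gamma_1 = sigma^2 * (1 + theta*(phi + theta))
#  -phi*gamma_0 + gamma_1 = theta * sigma^2
phi, theta, sigma2 = 0.6, 0.4, 1.0

A = np.array([[1.0, -phi],
              [-phi, 1.0]])
b = np.array([sigma2 * (1 + theta * (phi + theta)),
              theta * sigma2])
gamma0, gamma1 = np.linalg.solve(A, b)

rho1 = gamma1 / gamma0
print(gamma0, gamma1, rho1)
```

The resulting $\rho_1$ agrees with the textbook closed form $\rho_1 = \dfrac{(1+\phi\theta)(\phi+\theta)}{1+\theta^2+2\phi\theta}$.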
ACF and PACF: for an ARMA(1,1), both the ACF and the PACF decay gradually rather than cutting off, reflecting the AR and MA components respectively.
Summary
• Key concepts
  – Wold decomposition
  – ARMA as an approximation to the Wold decomposition
  – MA processes: moments, invertibility
  – AR processes: moments, stationarity and causality
  – ARMA processes: moments, invertibility, causality and stationarity