1
Linear Stationary Processes. ARMA models
Note: I would like to thank Jesus Gonzalo and
Gloria Gonzalez-Rivera for making their class
material available, from which this lecture has
been prepared.
2
  • In this lecture we are concerned with models for
    stationary linear processes.
  • This framework is clearly restrictive for
    empirical applications since most economic
    variables are non-stationary and/or non-linear.
  • However, stationary linear models are often used
    as building blocks in nonlinear and/or
    non-stationary models.

7
The Wold Decomposition
The Wold theorem basically states that any
zero-mean stationary process Zt can be
expressed as the sum of a stochastic component,
given by a linear combination of lags of a
white noise process, and a deterministic component
that is uncorrelated with the stochastic
component.
8
The Wold Decomposition
If Zt is a nondeterministic stationary time
series, then
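The formula on this slide did not survive the transcript; a standard statement of the decomposition (with ψj for the MA weights, at for the white noise, and Vt for the deterministic part, all assumed notation) is:

```latex
Z_t = \sum_{j=0}^{\infty} \psi_j a_{t-j} + V_t,
\qquad \psi_0 = 1, \qquad \sum_{j=0}^{\infty} \psi_j^2 < \infty ,
```

where at is white noise and Vt is purely deterministic and uncorrelated with every at.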
9
Some Remarks on the Wold Decomposition
10
What the Wold theorem does not say
  • The at need not be normally distributed, and
    hence need not be iid.
  • Though the linear projection P(at | Zt-j) = 0, it
    need not be true that E(at | Zt-j) = 0. (Think of
    the possible consequences.)
  • The shocks at need not be the true shocks to
    the system. (When will this happen?)
  • The uniqueness result only states that the Wold
    representation is the unique linear
    representation where the shocks are linear
    forecast errors. Non-linear representations, or
    representations in terms of non-forecast-error
    shocks, are perfectly possible.

11
Birth of the ARMA(p,q) models
Under general conditions the infinite lag
polynomial of the Wold decomposition can be
approximated by the ratio of two finite lag
polynomials. Therefore:
AR(p)
MA(q)
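In symbols (with ψ, θq, φp as the assumed polynomial names), the approximation reads:

```latex
Z_t = \psi(L)\, a_t \approx \frac{\theta_q(L)}{\phi_p(L)}\, a_t
\quad\Longrightarrow\quad
\phi_p(L)\, Z_t = \theta_q(L)\, a_t ,
```

so the AR(p) part is φp(L) = 1 − φ1L − … − φpL^p and the MA(q) part is θq(L) = 1 + θ1L + … + θqL^q.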
12
MA processes
13
MA(1) process
Let Zt = µ + at + θat−1, where at is
a zero-mean white noise process.
Expectation
Variance
Autocovariance
14
MA(1) processes (cont.)
Autocovariances of higher order
Autocorrelation
Partial autocorrelation
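The stripped formulas, for Zt = µ + at + θat−1 with Var(at) = σ² (notation assumed), are in standard form:

```latex
E[Z_t] = \mu, \qquad
\gamma_0 = (1+\theta^2)\,\sigma^2, \qquad
\gamma_1 = \theta\,\sigma^2, \qquad
\gamma_k = 0 \;\;(k>1), \qquad
\rho_1 = \frac{\theta}{1+\theta^2} .
```

Unlike the ACF, the partial autocorrelation of an MA(1) does not cut off; it decays geometrically.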
16
MA(1) processes (cont.)
The MA(1) process is always covariance-stationary
because its mean and autocovariances are finite
and do not depend on t.
The MA(1) process is ergodic for first and second
moments because its autocovariances are
absolutely summable.
If at were Gaussian, then Zt would be
ergodic for all moments.
17
Plot the function ρ1(θ)
[Figure: ρ1 ranges between −0.5 and 0.5 as θ varies;
θ and 1/θ give the same value]
  • Both processes (θ and 1/θ) share the same
    autocorrelation function.
The MA(1) is not uniquely identifiable, except for
θ = ±1.
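A quick numerical check of this identification problem (a sketch, not from the slides): the lag-1 autocorrelation ρ1 = θ/(1+θ²) takes the same value at θ and 1/θ.

```python
def ma1_rho1(theta):
    """Lag-1 autocorrelation of an MA(1) with coefficient theta."""
    return theta / (1.0 + theta ** 2)

# theta and 1/theta imply the same ACF, so the two representations are
# observationally equivalent at the level of second moments.
theta = 0.5
print(ma1_rho1(theta))          # 0.4
print(ma1_rho1(1.0 / theta))    # 0.4
```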
18
Invertibility
Definition: An MA(q) process defined by the
equation Zt = θ(L)at is said to be invertible if
there exists a sequence of constants πj with
Σ |πj| < ∞ and at = Σ πj Zt−j.
Theorem (necessary and sufficient condition for
invertibility): Let Zt be an MA(q). Then Zt is
invertible if and only if θ(z) ≠ 0 for all |z| ≤ 1.
The coefficients πj are determined by the relation
π(z) = Σ πj z^j = 1/θ(z).
19
Identification of the MA(1)
  • If we identify the MA(1) through the
    autocorrelation structure, we would need to
    decide which value of θ to choose, the one
    greater than one or the one smaller than one.
    Since we want our process to be invertible, we
    will choose the value |θ| < 1. (Why?)
  • Notice that the variance of the innovation is
    bigger for the invertible than for the
    non-invertible representation.

20
MA(q)
Moments
The MA(q) is covariance-stationary and ergodic
for the same reasons as the MA(1).
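For an MA(q) Zt = µ + at + θ1at−1 + … + θqat−q (standard notation, with θ0 = 1 assumed), the moments that were lost from the slide read:

```latex
E[Z_t] = \mu, \qquad
\gamma_k =
\begin{cases}
\sigma^2 \sum_{j=0}^{q-k} \theta_j\,\theta_{j+k}, & 0 \le k \le q,\\[4pt]
0, & k > q,
\end{cases}
```

so the ACF cuts off after lag q, which is the signature of an MA(q).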
21
MA(∞)
Is it covariance-stationary?
The process is covariance-stationary provided that
Σ ψj² < ∞
(the MA coefficients are square-summable).
22
Some interesting results
Proposition 1. Absolute summability implies
square summability:
Σ |ψj| < ∞ (absolutely summable)
⇒ Σ ψj² < ∞ (square-summable).
Proposition 2. Absolute summability of the MA
coefficients implies the process is ergodic for
the mean.
23
Proof 1.
Split the sum: Σj ψj² = Σj&lt;N ψj² + Σj≥N ψj².
(1) The first term is finite because N is finite.
(2) The second term is finite because {ψj} is
absolutely summable: |ψj| → 0, so |ψj| < 1 for all
j beyond some N, and then ψj² ≤ |ψj|.
24
Proof 2.
25
AR processes
26
AR(1) process
Using backward substitution and the formula for a
geometric progression, remember that |φ| < 1
is a sufficient condition for stationarity and
ergodicity.
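Written out for Zt = φZt−1 + at (notation assumed), backward substitution gives:

```latex
Z_t = \phi^k Z_{t-k} + \sum_{j=0}^{k-1} \phi^j a_{t-j}
\;\longrightarrow\;
Z_t = \sum_{j=0}^{\infty} \phi^j a_{t-j}
\qquad (k \to \infty,\; |\phi| < 1),
```

since the geometric progression converges and φ^k Zt−k vanishes in mean square when |φ| < 1.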
27
AR(1) (cont.)
Hence, this AR(1) process is stationary if |φ| < 1.
Alternatively, consider the solution of the
characteristic equation 1 − φz = 0,
i.e. the root z = 1/φ of the characteristic
equation lies outside the unit circle.
Mean of a stationary AR(1)
Variance of a stationary AR(1)
28
Autocovariance of a stationary AR(1)
Rewrite the process as deviations from the mean.
Autocorrelation of a stationary AR(1)
ACF
PACF from the Yule-Walker equations
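The missing formulas, for Zt = c + φZt−1 + at with |φ| < 1 (the intercept c is an assumption):

```latex
\mu = \frac{c}{1-\phi}, \qquad
\gamma_0 = \frac{\sigma^2}{1-\phi^2}, \qquad
\gamma_k = \phi^k \gamma_0, \qquad
\rho_k = \phi^k ,
```

and the PACF equals φ at lag 1 and zero at every higher lag.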
31
Causality and Stationarity
Consider the AR(1) process Zt = φZt−1 + at with |φ| > 1.
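For |φ| > 1 the backward sum diverges, but solving the recursion forward still yields a stationary solution (standard construction, notation assumed):

```latex
Z_t = \phi^{-1} Z_{t+1} - \phi^{-1} a_{t+1}
\quad\Longrightarrow\quad
Z_t = -\sum_{j=1}^{\infty} \phi^{-j}\, a_{t+j} .
```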
32
Causality and Stationarity (II)
However, this stationary representation is
unnatural since it depends on future values of
at!! It is customary to restrict attention to
AR(1) processes with Such processes are called
CAUSAL, or future-indepent representations AR
representations. It should be noted that any
AR(1) processes that does not verify that the
coefficient is less than 1 in absolute value can
be reexpressed as an AR(1) process with
and a new white sequence. Thus, nothing is
lost by eliminating AR(1) processes that do not
verify the latter property.
33
Causality (III)
Definition: An AR(p) process defined by the
equation φ(L)Zt = at is said to be causal, or a
causal function of at, if there exists a sequence
of constants ψj with Σ |ψj| < ∞
and Zt = Σ ψj at−j. Causality is
equivalent to the condition φ(z) ≠ 0 for all |z| ≤ 1.
34
AR(2)
Stationarity
Study of the roots of the characteristic equation
35
The roots can be real or complex.
(1) Real roots
(2) Complex roots
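For an AR(2) Zt = φ1Zt−1 + φ2Zt−2 + at (notation assumed), the characteristic equation and the nature of its roots are:

```latex
1 - \phi_1 z - \phi_2 z^2 = 0, \qquad
\text{real roots} \iff \phi_1^2 + 4\phi_2 \ge 0 ,
```

and stationarity requires φ2 + φ1 < 1, φ2 − φ1 < 1, and |φ2| < 1 (the stationarity triangle).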
36
[Figure: stationarity triangle in the (φ1, φ2)
plane, φ1 from −2 to 2 and φ2 from −1 to 1;
complex roots below the curve φ1² + 4φ2 = 0,
real roots above]
37
Mean of AR(2)
Variance and Autocorrelations of AR(2)
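With an assumed intercept c, the stripped formulas are:

```latex
\mu = \frac{c}{1-\phi_1-\phi_2}, \qquad
\rho_1 = \frac{\phi_1}{1-\phi_2}, \qquad
\rho_2 = \phi_1\rho_1 + \phi_2, \qquad
\rho_k = \phi_1\rho_{k-1} + \phi_2\rho_{k-2} \;\;(k \ge 2),
```

where the last recursion comes from the Yule-Walker equations.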
38
The autocorrelations follow a difference equation
and take different shapes according to whether
the roots are real or complex.
Partial autocorrelations from the Yule-Walker
equations
40
[Figure: ACF of an AR(2) with complex roots
(damped sine wave)]
41
AR(p)
All p roots of the characteristic equation
outside the unit circle ⇒ stationarity
ACF: a system to solve for the first p
autocorrelations: p unknowns and p equations.
The ACF decays as a mixture of exponentials
and/or damped sine waves, depending on whether
the roots are real or complex.
PACF: zero after lag p
42
Relationship between AR(p) and MA(q)
A stationary AR(p) can be written as an MA(∞).
Example
43
Invertible MA(q)
An invertible MA(q) can be written as an AR(∞).
Write an example, e.g. an MA(2), and proceed as
in the previous example.
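As a sketch of that exercise (coefficient values are hypothetical, not from the slides): for an invertible MA(2) Zt = at + θ1at−1 + θ2at−2, the AR(∞) weights πj in at = Σ πj Zt−j satisfy π0 = 1 and πj = −θ1πj−1 − θ2πj−2.

```python
def ma2_pi_weights(theta1, theta2, n):
    """AR(infinity) weights pi_j for an invertible MA(2):
    pi(L) * theta(L) = 1 with theta(L) = 1 + theta1*L + theta2*L^2."""
    pi = [1.0]
    for j in range(1, n):
        p1 = pi[j - 1]
        p2 = pi[j - 2] if j >= 2 else 0.0
        pi.append(-theta1 * p1 - theta2 * p2)
    return pi

# Hypothetical invertible coefficients (roots of theta(z) lie outside
# the unit circle for this choice).
pi = ma2_pi_weights(0.5, 0.06, 8)
# Sanity check: in the product pi(L) * theta(L) every coefficient of
# L^j with j >= 1 must vanish; verify the L^1 and L^2 coefficients.
assert abs(pi[1] + 0.5 * pi[0]) < 1e-12
assert abs(pi[2] + 0.5 * pi[1] + 0.06 * pi[0]) < 1e-12
```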
44
ARMA(p,q) Processes
45
ARMA(p,q)
46
Autocorrelations of ARMA(p,q)
Multiply by Zt−k and take expectations.
PACF
47
ARMA(1,1)
48
ACF of ARMA(1,1)
Multiply by Zt−k and take expectations.
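Multiplying Zt = φZt−1 + at + θat−1 by Zt−k and taking expectations gives the standard result (notation assumed):

```latex
\gamma_0 = \sigma^2\,\frac{1+\theta^2+2\phi\theta}{1-\phi^2}, \qquad
\rho_1 = \frac{(1+\phi\theta)(\phi+\theta)}{1+\theta^2+2\phi\theta}, \qquad
\rho_k = \phi\,\rho_{k-1} \;\;(k \ge 2),
```

so beyond lag 1 the ACF decays like that of an AR(1).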
49
ACF
PACF
52
Appendix: Lag Operator L
Definition
Properties
Examples
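The definition and properties that were lost from the slide are standard:

```latex
L Z_t = Z_{t-1}, \qquad L^k Z_t = Z_{t-k}, \qquad
L c = c \;\;(c \text{ constant}), \qquad L^{-1} Z_t = Z_{t+1},
```

and lag polynomials multiply like ordinary polynomials, e.g. (1 − λ1L)(1 − λ2L) = 1 − (λ1 + λ2)L + λ1λ2L².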
53
Appendix: Inverse Operator
Definition: (1 − φL)⁻¹ = lim n→∞ (1 + φL + … + φⁿLⁿ),
provided |φ| < 1.
Note that if |φ| ≥ 1,
this definition does not hold because the limit
does not exist.
Example
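For |φ| < 1 the inverse is the convergent geometric series; for |φ| > 1 one can instead invert forward in time (both standard, notation assumed):

```latex
(1-\phi L)^{-1} = \sum_{j=0}^{\infty} \phi^j L^j \;\;(|\phi|<1),
\qquad
(1-\phi L)^{-1} = -\sum_{j=1}^{\infty} \phi^{-j} L^{-j} \;\;(|\phi|>1).
```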
54
Appendix: Inverse Operator (cont.)
Suppose you have the ARMA model φ(L)Zt = θ(L)at
and want to find the MA representation
Zt = ψ(L)at. You could try
to crank out θ(L)/φ(L) directly, but
that's not much fun. Instead you could find ψ(L)
from φ(L)ψ(L) = θ(L),
matching terms in Lj to make sure this works.
Example: Suppose φ(L) = 1 − φL and θ(L) = 1 + θL.
Multiplying both polynomials and matching
powers of L gives equations
which you can easily solve recursively for the ψj.
TRY IT!
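The matching-powers recursion can be sketched in code (parameter values hypothetical; for an ARMA(1,1), matching powers of L in (1 − φL)ψ(L) = 1 + θL gives ψ0 = 1, ψ1 = φ + θ, and ψj = φψj−1 for j ≥ 2):

```python
def arma11_psi_weights(phi, theta, n):
    """MA(infinity) weights psi_j for an ARMA(1,1)
    (1 - phi*L) Z_t = (1 + theta*L) a_t, obtained by matching
    powers of L in (1 - phi*L) * psi(L) = 1 + theta*L."""
    psi = [1.0]                       # psi_0 = 1 (coefficient of L^0)
    if n > 1:
        psi.append(phi + theta)       # psi_1 = phi + theta (coefficient of L^1)
    for j in range(2, n):
        psi.append(phi * psi[j - 1])  # psi_j = phi * psi_{j-1} for j >= 2
    return psi

# Hypothetical parameter values, just to try the recursion.
print(arma11_psi_weights(0.5, 0.3, 5))  # [1.0, 0.8, 0.4, 0.2, 0.1]
```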
55
Appendix: Factoring Lag Polynomials
Suppose we need to invert the polynomial
1 − φ1L − φ2L². We can
do that by factoring it:
1 − φ1L − φ2L² = (1 − λ1L)(1 − λ2L).
Now we need to invert each factor and multiply:
(1 − λ1L)⁻¹(1 − λ2L)⁻¹ = Σ λ1^j L^j · Σ λ2^j L^j.
Check the last expression!
56
Appendix: Partial Fraction Tricks
There is a prettier way to express the last
inversion by using the partial fraction trick.
Find the constants a and b such that
1 / ((1 − λ1L)(1 − λ2L)) = a/(1 − λ1L) + b/(1 − λ2L).
The numerator on the right-hand side must be 1, so
a(1 − λ2L) + b(1 − λ1L) = 1.
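Matching the constant and the L terms gives a + b = 1 and aλ2 + bλ1 = 0, so (assuming λ1 ≠ λ2):

```latex
\frac{1}{(1-\lambda_1 L)(1-\lambda_2 L)}
= \frac{a}{1-\lambda_1 L} + \frac{b}{1-\lambda_2 L},
\qquad
a = \frac{\lambda_1}{\lambda_1-\lambda_2}, \qquad
b = \frac{-\lambda_2}{\lambda_1-\lambda_2} ,
```

and each term can now be inverted as a single geometric series.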
57
Appendix: More on Invertibility
Consider an MA(1): Zt = at + θat−1.
Definition:
An MA process is said to be invertible if it can
be written as an AR(∞).
  • For an MA(1) to be invertible we require |θ| < 1.
  • For an MA(q) to be invertible, all roots of the
    characteristic equation θ(z) = 0
    should lie outside the unit circle.
  • MA processes have an invertible and a
    non-invertible representation.
  • Invertible representation: the optimal forecast
    depends on past information.
  • Non-invertible representation: the forecast
    depends on the future!