Transcript and Presenter's Notes

Title: Economic Forecasting Seminar


1
Economic Forecasting Seminar
  • Introduction to the Box-Jenkins Forecasting Methods

2
Box-Jenkins Methodology
  • Univariate forecasts based on a statistical
    analysis of the past data. Differs from
    conventional regression methods in that the
    mutual dependence of the observations is of
    primary interest.
  • Forecasts are linear functions of the sample
    observations.

3
Box-Jenkins Methodology
  • Identification of the type of model to be used is
    a critical first step. Differs from other
    univariate techniques in that a thorough study of
    the properties of a time series is carried out
    before applying a forecasting technique.

4
Principle of Parsimony
  • Find efficiently parameterized models. A model
    with a large number of parameters will achieve a
    good historical fit, but post-sample forecasts
    are likely to be poor.

5
  • By building models based on past realizations of
    a time series we are implicitly assuming that
    there is some regularity in the process
    generating the series.
  • One way to view such regularity is through the
    concept of stationarity.
  • The use of Box-Jenkins modeling techniques
    requires a stationary process.

6
Definition 1 Stochastic Process
  • A stochastic process is a collection {X_t : t = 1, 2, …, T} of random
    variables ordered in time. {X_t : t ∈ T} is a continuous process if
    −∞ < t < ∞. If T is finite, X_t is a discrete process.
  • Example: the error term in a linear regression model is assumed to be
    a stochastic process.

7
  • If X is a random variable, then the variance of X is
  • Var(X) = E[(X − μ)²]
  • which we can estimate by the sample variance
  • s² = Σ (X_i − X̄)² / (n − 1)

8
  • If X and Y are two random variables having a
    joint probability density function, then the
    covariance of X and Y is
  • Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)]
  • which we can estimate by
  • s_XY = Σ (X_i − X̄)(Y_i − Ȳ) / (n − 1)

9
  • Finally, the simple correlation coefficient
    between X and Y is
  • ρ = Cov(X, Y) / (σ_X σ_Y), which we can estimate by
    r = s_XY / (s_X s_Y). A numerical check follows below.
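
As a numerical check of these three estimators, a minimal numpy sketch
(the samples x and y are hypothetical):

    import numpy as np

    x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # hypothetical sample of X
    y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])   # hypothetical sample of Y
    n = len(x)

    var_x = ((x - x.mean()) ** 2).sum() / (n - 1)               # sample variance
    var_y = ((y - y.mean()) ** 2).sum() / (n - 1)
    cov_xy = ((x - x.mean()) * (y - y.mean())).sum() / (n - 1)  # sample covariance
    corr_xy = cov_xy / np.sqrt(var_x * var_y)                   # sample correlation

    # cross-check against numpy's built-in estimators
    assert np.isclose(var_x, np.var(x, ddof=1))
    assert np.isclose(cov_xy, np.cov(x, y)[0, 1])
    assert np.isclose(corr_xy, np.corrcoef(x, y)[0, 1])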

10
Definition 2 Stationarity
  • A stochastic process is covariance stationary if,
    for all values of t,
  • E(X_t) = μ (constant mean)
  • Var(X_t) = σ² (constant variance)
  • Cov(X_t, X_{t-k}) = γ_k (autocovariances depend only on the lag k)

11
Intuition
  • A stochastic process is said to be covariance
    stationary if its statistical properties do not
    change with time.
  • More precisely, a stationary series is one for
    which the mean and variance are constant across
    time and the covariance between current and
    lagged values of the series (autocovariances)
    depends only on the distance between the time
    points.

12
Interpretation
  • Recall that we're trying to explain the future in
    terms of what is already known.
  • For a stationary series, E(X_t) = μ and Var(X_t) = σ² for all t.
  • But these two properties don't help very much in
    this regard.

13
Interpretation
  • The autocovariances, however, do help: if γ_1 > 0,
    for example, a high value of X today will likely be
    followed by a high value tomorrow.
  • By assuming that the γ_k are stable over time, this
    information can be estimated and exploited.

14
Autocorrelations
  • Covariances are often difficult to interpret
    because they depend on the units of measurement
    of the data.
  • Correlations, on the other hand, are scale-free.
    Thus we can obtain the same information about a
    time series by computing its autocorrelations.

15
Autocorrelation Coefficient
  • The autocorrelation coefficient between X_t and
    X_{t-k} is
  • ρ_k = γ_k / γ_0
  • A graph of the autocorrelations is called a
    correlogram (sketched below).
  • Knowledge of the correlogram implies knowledge of
    the process (model) which generated the series,
    and vice versa.
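
A minimal sketch of computing the autocorrelations and drawing the
correlogram, assuming statsmodels (and matplotlib for the plot) is
available; the series x here is just a simulated placeholder:

    import numpy as np
    from statsmodels.tsa.stattools import acf
    from statsmodels.graphics.tsaplots import plot_acf

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)      # placeholder series

    rho = acf(x, nlags=24)        # rho[0] = 1 by construction
    print(rho[:5].round(2))
    plot_acf(x, lags=24)          # the correlogram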

16
Partial Autocorrelations
  • Another important function in the Box-Jenkins
    methodology is the partial autocorrelation
    function.
  • Partial autocorrelations measure the strength of
    the relationship between observations in a series
    controlling for the effect of intervening time
    periods.

17
  • If the observations of X in period t are highly
    related to the observations in, say, period t-12
    then a plot of the partial autocorrelations for
    that series (partial correlogram) should exhibit
    a spike, or relative peak, at lag 12.
  • Monthly time series with a seasonal component
    should exhibit such a pattern in their partial
    correlograms.
  • Monthly time series with seasonal components will
    also exhibit spikes in their correlograms at
    multiples of 12 lags, as the simulation below
    illustrates.
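
To illustrate the seasonal case, a minimal sketch that simulates a purely
seasonal autoregression, X_t = 0.8·X_{t-12} + ε_t (the coefficient and
sample length are illustrative), and inspects both functions:

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(1)
    n = 240                                  # 20 years of monthly data
    x = np.zeros(n)
    eps = rng.normal(size=n)
    for t in range(12, n):
        x[t] = 0.8 * x[t - 12] + eps[t]      # seasonal dependence at lag 12

    print(pacf(x, nlags=24).round(2))        # spike at lag 12, then near zero
    print(acf(x, nlags=24).round(2))         # spikes at lags 12 and 24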

18
(Charts: sample autocorrelations and partial autocorrelations)
19
Nonstationarities I
  • In the level of the mean. Here the mean changes
    over different segments of the data. Time series
    with a strong trend are not mean stationary.
  • Nonlinear trends will cause the covariances to
    change over time.

20
(No Transcript)
21
S&P 500
22
Nonstationarities II
  • Seasonality. Time series with a seasonal
    component exhibit variations in the data which
    are a function of the time of the year.
  • The autocovariances will depend on the location
    of the data points in addition to the distance
    between observations.
  • Such nonstationarities are made worse by seasonal
    components which are not stable over time.

23
Nonstationarities III
  • Shocks. A drastic change in the level of a time
    series will cause it to be nonstationary.

24
Removing Nonstationarities
  • First differencing: ΔX_t = X_t − X_{t-1}
  • Second differencing:
  • Δ²X_t = ΔX_t − ΔX_{t-1} = X_t − 2X_{t-1} + X_{t-2}
  • Period-to-period annualized rate of growth
    (pandas equivalents are sketched below)
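
In pandas, the same transformations might look like the following minimal
sketch; gasprice is a hypothetical monthly series (the same name the
EViews examples use two slides ahead):

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    idx = pd.period_range("2000-01", periods=60, freq="M")
    gasprice = pd.Series(100 + rng.normal(size=60).cumsum(), index=idx)

    d1 = gasprice.diff()                 # first difference: X_t - X_{t-1}
    d2 = gasprice.diff().diff()          # second difference: X_t - 2X_{t-1} + X_{t-2}
    growth = gasprice.pct_change()       # period-to-period rate of growth
    annualized = (1 + growth) ** 12 - 1  # annualized growth for monthly data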

25
(No Transcript)
26
Differencing Using EViews
  • Three equivalent forms of first differencing:
  • d(gasprice,1) = d(gasprice) = gasprice - gasprice(-1)
  • Two equivalent forms of seasonal differencing:
  • d(gasprice,0,12) = gasprice - gasprice(-12)
  • Two equivalent forms of first differencing a
    seasonally differenced series:
  • d(gasprice,1,12) = d(gasprice - gasprice(-12))
  • Three equivalent forms of second differencing:
  • d(gasprice,2) = d(gasprice - gasprice(-1)) = d(gasprice) - d(gasprice(-1))

27
  • Identification

28
Autoregressive Processes
  • X_t depends on past values of X plus some random
    shock, or disturbance:
  • X_t = φ_1 X_{t-1} + φ_2 X_{t-2} + … + φ_p X_{t-p} + ε_t
  • This is an AR(p) process.
  • If p were known, the fact that the error term in
    the equation satisfies the assumptions of the
    classical linear regression model means that we
    could use OLS to estimate the φ_i.

29
Practical Problem What is p?
  • Use OLS and traditional t-tests to determine p.
  • Minimize the Akaike Information Criterion (AIC).
  • Minimize the Schwarz Criterion (BIC), as in the
    sketch below.
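
A minimal sketch of the information-criterion approach using AutoReg from
statsmodels; the series is simulated from a true AR(2), so both criteria
should bottom out near p = 2:

    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(3)
    n = 500
    x = np.zeros(n)
    eps = rng.normal(size=n)
    for t in range(2, n):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + eps[t]   # true AR(2)

    for p in range(1, 7):
        res = AutoReg(x, lags=p).fit()
        print(p, round(res.aic, 1), round(res.bic, 1))    # pick p minimizing AIC/BIC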

30
First-Order Autoregressive Model
  • An autoregressive process of order 1, i.e., an
    AR(1) model, can be written as
  • X_t = φ_1 X_{t-1} + ε_t
  • and simply says that the value of X at time t
    depends on the value of X in the preceding period
    plus some random disturbance.

31
Random Walk Model
  • A widely used AR(1) process is the random walk,
    for which φ_1 = 1.
  • Let X_0 = 0.
  • Then X_1 = ε_1 and X_2 = X_1 + ε_2 = ε_1 + ε_2.
  • And in general X_t = ε_1 + ε_2 + … + ε_t.

32
Properties of the Random Walk
  • Q: Is X_t a stationary process?
  • A: No. Since X_t = ε_1 + … + ε_t, Var(X_t) = t·σ_ε²,
    which grows with t, so the process is not covariance
    stationary (simulated below).
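
A quick simulation makes the "No" concrete; a minimal sketch (with
σ_ε = 1) showing that the variance of X_t grows roughly linearly with t
across replications:

    import numpy as np

    rng = np.random.default_rng(4)
    eps = rng.normal(size=(10_000, 200))   # 10,000 replications of 200 shocks
    X = eps.cumsum(axis=1)                 # X_t = eps_1 + ... + eps_t

    for t in (10, 100, 200):
        print(t, round(X[:, t - 1].var(), 1))   # approximately t * sigma_eps^2
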
33
X_t = φ_1 X_{t-1} + ε_t
  • When φ_1 = 1, the random walk model can be written
    as
  • ΔX_t = X_t − X_{t-1} = ε_t
  • Q: Is ε_t a stationary process?
  • A: Yes.

34
White Noise Process
  • If μ_ε = 0, then ε_t = ΔX_t is said to be a white
    noise series.
  • ε_t is perfectly correlated with itself:
  • ρ_0 = 1
  • The correlogram of ε_t would have a vertical spike
    at k = 0 and then truncate for all values of k > 0,
    since those autocorrelations equal 0 (verified in
    the sketch below).
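
A minimal sketch confirming the white-noise correlogram, using simulated
shocks:

    import numpy as np
    from statsmodels.tsa.stattools import acf

    eps = np.random.default_rng(5).normal(size=1000)   # white noise, mu = 0
    print(acf(eps, nlags=10).round(2))                 # 1.0 at k = 0, near 0 after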

35
Moving Average Processes
  • An MA(q) process can be written as
  • X_t = ε_t + θ_1 ε_{t-1} + … + θ_q ε_{t-q}
  • A moving average forecasting model uses
    contemporaneous and lagged values of random
    shocks to represent movements in the series.
  • The current value of X is expressed as a function
    of current and lagged unobservable shocks.

36
ARMA Models
  • A time series may be represented by a combination
    of autoregressive and moving average processes.
  • In this case the series is an ARMA(p,q) process
    and takes the form
  • X_t = φ_1 X_{t-1} + … + φ_p X_{t-p} + ε_t + θ_1 ε_{t-1} + … + θ_q ε_{t-q}

37
Box-Jenkins Methodology
  • 1. Identification. Determine, given a sample of
    time series observations, what the model of the
    differenced data is.
  • 2. Estimation. Estimate the parameters of the
    chosen model.
  • 3. Forecasting. Use the resulting model to
    forecast the time series. If necessary, undo the
    differencing to restore the original time series,
    as in the sketch below.
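
The three steps map naturally onto the ARIMA class in statsmodels; the
following is a minimal sketch, assuming identification (step 1) pointed
to an ARMA(1,1) on first differences, i.e. an ARIMA(1,1,1), and y is a
hypothetical series:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(6)
    y = 50 + rng.normal(size=300).cumsum()     # hypothetical nonstationary series

    # Step 2: estimate the model identified in step 1
    res = ARIMA(y, order=(1, 1, 1)).fit()      # (p, d, q) = (1, 1, 1)
    print(res.params)

    # Step 3: forecast; with d = 1 the class integrates the forecasts back,
    # undoing the differencing automatically
    print(res.forecast(steps=12))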

38
Identification
  • Determining whether a series follows an AR, MA,
    or a mixture of both processes is a difficult
    task requiring judgment and experience on the
    part of the forecaster.
  • By plotting the correlogram (ACF) and partial
    correlogram (PACF) for a series we can apply
    known characteristics of theoretical moving
    average and autoregressive processes to help
    identify a series as characteristic of one of
    these classes of models.

39
Guidelines for B-J Identification
  MODEL       CORRELOGRAM (ACF)        PARTIAL CORRELOGRAM (PACF)
  AR(p)       Dies off                 Truncates after lag p
  MA(q)       Truncates after lag q    Dies off
  ARMA(p,q)   Dies off                 Dies off
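
These guidelines can be checked by simulation; a minimal sketch comparing
a simulated AR(1) and MA(1), with illustrative coefficients of 0.7:

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(7)
    n = 2000
    eps = rng.normal(size=n)

    ar1 = np.zeros(n)
    for t in range(1, n):
        ar1[t] = 0.7 * ar1[t - 1] + eps[t]    # AR(1)
    ma1 = eps[1:] + 0.7 * eps[:-1]            # MA(1)

    print(acf(ar1, nlags=5).round(2), pacf(ar1, nlags=5).round(2))
    # ACF dies off geometrically; PACF truncates after lag 1
    print(acf(ma1, nlags=5).round(2), pacf(ma1, nlags=5).round(2))
    # ACF truncates after lag 1; PACF dies off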