1
King Abdulaziz University
Faculty of Engineering
Industrial Engineering Dept.
  • IE 436
  • Dynamic Forecasting

2
CHAPTER 3: Exploring Data Patterns and an
Introduction to Forecasting Techniques
  • Cross-sectional data: data collected at a single
    point in time.
  • A time series: data collected and recorded over
    successive increments of time. (Page 62)

3
Exploring Time Series Data Patterns
  • Horizontal (stationary).
  • Trend.
  • Cyclical.
  • Seasonal.

A Stationary Series
Its mean and variance remain constant over time
4
The Trend
  • The long-term component that represents the
    growth or decline in the time series.

The Cyclical component
The wavelike fluctuation around the trend.
Page (63)
FIGURE 3-2 Trend and Cyclical Components of an
Annual Time Series Such as Housing Costs
5
The Seasonal Component
  • A pattern of change that repeats itself year
    after year.

Page (64)
FIGURE 3-3 Electrical Usage for Washington Water
Power Company, 1980-1991
6
Exploring Data Patterns with Autocorrelation
Analysis
  • Autocorrelation
  • The correlation between a variable lagged one
    or more periods and itself.

(Pages 64-65)
7
Autocorrelation Function (Correlogram)
  • A graph of the autocorrelations for various
    lags.

Computation of the lag 1 autocorrelation
coefficient
Table 3-1 (page 65)
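For reference, the sample autocorrelation coefficient at lag k is conventionally computed as (this is the standard form behind Equation 3.1, with \bar{Y} the mean of the series):

r_k = \frac{\sum_{t=k+1}^{n} (Y_t - \bar{Y})(Y_{t-k} - \bar{Y})}{\sum_{t=1}^{n} (Y_t - \bar{Y})^2}, \qquad k = 0, 1, 2, \ldots

Setting k = 1 gives the lag 1 coefficient computed in Table 3-2.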
8
Example 3.1
  • Data are presented in Table 3-1 (page 65).
  • Table 3-2 shows the computations that lead to the
    calculation of the lag 1 autocorrelation
    coefficient.
  • Figure 3-4 contains a scatter diagram of the
    pairs of observations (Yt, Yt-1).
  • Using the totals from Table 3-2 and Equation 3.1
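As a minimal computational sketch of the same idea (Python rather than the Minitab workflow used in the text; the sample values below are illustrative placeholders, not the Table 3-1 data):

```python
import numpy as np

def autocorrelation(y, k):
    """Sample autocorrelation coefficient r_k (Equation 3.1 form)."""
    y = np.asarray(y, dtype=float)
    dev = y - y.mean()
    # Sum of products of deviations k periods apart, over the total sum of squares
    return np.sum(dev[k:] * dev[:-k]) / np.sum(dev ** 2)

# Illustrative placeholder series (e.g., monthly sales figures)
sales = [123, 130, 125, 138, 145, 142, 150, 160, 158, 165, 170, 178]
print(f"lag 1 autocorrelation: {autocorrelation(sales, k=1):.3f}")
```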

9
Autocorrelation Function (Correlogram) (Cont.)
Minitab instructions: Stat > Time Series >
Autocorrelation
FIGURE 3-5 Correlogram or Autocorrelation
Function for the Data Used in Example 3.1
10
Using Autocorrelation Analysis
Questions to be Answered
  • Are the data random?
  • Do the data have a trend?
  • Are the data stationary?
  • Are the data seasonal?

(Page 68)
11
Are the data random?
  • If a series is random
  • The successive values are not related to each
    other.
  • Almost all the autocorrelation coefficients are
    not significantly different from zero.

12
Is an autocorrelation coefficient significantly
different from zero?
- The autocorrelation coefficients of random data
have an approximate normal sampling distribution.
  • At a specified confidence level, a series can be
    considered random if the calculated autocorrelation
    coefficients are within the interval 0 ± t × SE(rk)
  • (z instead of t for large samples).

13
- Standard error of the autocorrelation at lag k:

SE(r_k) = \sqrt{\frac{1 + 2\sum_{i=1}^{k-1} r_i^2}{n}}

where
r_i = the autocorrelation at time lag i,
k = the time lag, and
n = the number of observations in the time series.
14
Example 3.2 (Page 69)
A hypothesis test: is a particular autocorrelation
coefficient significantly different from zero?
The test statistic is t = r_k / SE(r_k).
Note: t is given directly in the Minitab output
under the heading T.
15
Is an autocorrelation coefficient different from
zero? (Cont.)
The modified Box-Pierce Q statistic (developed
by Ljung and Box), LBQ:
A portmanteau test of whether a whole set of
autocorrelation coefficients differs from zero at once.
16
Q = n(n+2) \sum_{k=1}^{m} \frac{r_k^2}{n-k}

where n = the number of observations in the series,
k = the time lag, m = the number of time lags tested,
and r_k = the sample autocorrelation at lag k.
The value of Q can be compared with the chi-square
distribution with m degrees of freedom.
17
Example 3.3 (Page 70)
 t    Yt     t    Yt     t    Yt     t    Yt
 1   343    11   946    21   704    31   555
 2   574    12   142    22   291    32   476
 3   879    13   477    23    43    33   612
 4   728    14   452    24   118    34   574
 5    37    15   727    25   682    35   518
 6   227    16   147    26   577    36   296
 7   613    17   199    27   834    37   970
 8   157    18   744    28   981    38   204
 9   571    19   627    29   263    39   616
10    72    20   122    30   424    40    97
18
FIGURE 3-7 Autocorrelation Function for the Data
Used in Example 3.3
19
  • The Q statistic for m = 10 time lags is calculated
    as 7.75 (using Minitab).
  • The chi-square critical value is 18.307
  • (tested at the 0.05 significance level, degrees of
    freedom df = m = 10).
  • Table B-4 (Page 527)
  • Since Q < 18.307,
  • Conclusion: the series is random.
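The same randomness check can be sketched in Python (a rough, self-contained illustration; y below holds placeholder values and would be replaced by the 40 observations of Example 3.3):

```python
import numpy as np
from scipy.stats import chi2

def ljung_box_q(y, m):
    """Ljung-Box (modified Box-Pierce) Q statistic for the first m lags."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    denom = np.sum(dev ** 2)
    r = np.array([np.sum(dev[k:] * dev[:-k]) / denom for k in range(1, m + 1)])
    return n * (n + 2) * np.sum(r ** 2 / (n - np.arange(1, m + 1)))

y = np.random.default_rng(1).normal(size=40)  # placeholder series
m = 10
q = ljung_box_q(y, m)
critical = chi2.ppf(0.95, df=m)  # 0.05 significance level, df = m
print(f"Q = {q:.2f}, chi-square critical value = {critical:.3f}, random: {q < critical}")
```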

20
Do the Data have a Trend?
  • A significant relationship exists between
    successive time series values.
  • The autocorrelation coefficients are large for
    the first several time lags, and then gradually
    drop toward zero as the number of periods
    increases.
  • The autocorrelation for time lag 1 is close to
    1; the autocorrelation for time lag 2 is large but
    smaller than that for time lag 1.

21
Example 3.4 (Page 72)
Data in Table 3-4 (Page 74)
Year     Yt    Year     Yt    Year     Yt    Year     Yt
1955   3307    1966   6769    1977  17224    1988  50251
1956   3556    1967   7296    1978  17946    1989  53794
1957   3601    1968   8178    1979  17514    1990  55972
1958   3721    1969   8844    1980  25195    1991  57242
1959   4036    1970   9251    1981  27357    1992  52345
1960   4134    1971  10006    1982  30020    1993  50838
1961   4268    1972  10991    1983  35883    1994  54559
1962   4578    1973  12306    1984  38828    1995  34925
1963   5093    1974  13101    1985  40715    1996  38236
1964   5716    1975  13639    1986  44282    1997  41296
1965   6357    1976  14950    1987  48440    1998      .
22
Data Differencing
  • A time series can be differenced to remove the
    trend and to create a stationary series.
  • See FIGURE 3-8 (Page 73) for differencing the
    Data of Example 3.1
  • See FIGURES 3-12, 3-13 (Page 75)
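A minimal sketch of first differencing (the values shown are the first few entries of Table 3-4, used only to illustrate the operation Yt - Yt-1):

```python
import numpy as np

# First few values of the trended series in Table 3-4
y = np.array([3307, 3556, 3601, 3721, 4036, 4134, 4268], dtype=float)

# First differences Yt - Yt-1; one observation is lost
dy = np.diff(y)
print(dy)  # the differenced series, which should be closer to stationary
```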

23
Are The Data Seasonal?
  • For quarterly data a significant autocorrelation
    coefficient will appear at time lag 4.
  • For monthly data a significant autocorrelation
    coefficient will appear at time lag 12.
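A rough sketch of the quarterly check in Python (the quarterly figures are made-up placeholders, and 2/sqrt(n) is used as the usual rough significance bound):

```python
import numpy as np

def acf(y, max_lag):
    """Sample autocorrelations r_1 ... r_max_lag."""
    y = np.asarray(y, dtype=float)
    dev = y - y.mean()
    denom = np.sum(dev ** 2)
    return [np.sum(dev[k:] * dev[:-k]) / denom for k in range(1, max_lag + 1)]

# Placeholder quarterly sales with a repeating within-year pattern
quarterly = [140, 250, 290, 260, 150, 245, 300, 270, 145, 255, 295, 265]
r4 = acf(quarterly, max_lag=4)[3]
bound = 2 / np.sqrt(len(quarterly))  # rough 95% significance limit
print(f"r_4 = {r4:.2f}, bound = {bound:.2f}")  # a large r_4 suggests quarterly seasonality
```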

24
Example 3.5 (Page 76)
See Figures 3-14, 3-15 (Page 77)
Table 3-5
Year    December 31    March 31    June 30    September 30
1994    147.6          251.8       273.1      249.1
1995    139.3          221.2       260.2      259.5
1996    140.5          245.5       298.8      287.0
1997    168.8          322.6       393.5      404.3
1998    259.7          401.1       464.6      497.7
1999    264.4          402.6       411.3      385.9
2000    232.7          309.2       310.7      293.0
2001    205.1          234.4       285.4      258.7
2002    193.2          263.7       292.5      315.2
2003    178.3          274.5       295.4      286.4
2004    190.8          263.5       318.8      305.5
2005    242.6          318.8       329.6      338.2
2006    232.1          285.6       291.0      281.4
25
Time Series Graph
FIGURE 3-14 Time Series Plot of Quarterly Sales
for Coastal Marine for Example 3.5
26
FIGURE 3-15 Autocorrelation Function for Quarterly
Sales for Coastal Marine for Example 3.5
The autocorrelation coefficients at time lags 1 and 4
are significantly different from zero; the sales are
seasonal on a quarterly basis.
27
Choosing a Forecasting Technique
Questions to be Considered
  • Why is a forecast needed?
  • Who will use the forecast?
  • What are the characteristics of the data?
  • What time period is to be forecast?
  • What are the minimum data requirements?
  • How much accuracy is required?
  • What will the forecast cost?

28
Choosing a Forecasting Technique (Cont.)
The Forecaster Should Accomplish the Following
  • Define the nature of the forecasting problem.
  • Explain the nature of the data.
  • Describe the properties of the techniques.
  • Develop criteria for selection.

29
Choosing a Forecasting Technique (Cont.)
Factors Considered
  • Level of detail.
  • Time horizon.
  • Based on judgment or data manipulation.
  • Management acceptance.
  • Cost.

30
General considerations for choosing the
appropriate method
Method: Judgment
  Uses: Can be used in the absence of historical data (e.g., a new product); most helpful in medium- and long-term forecasts.
  Considerations: Subjective estimates are subject to the biases and motives of estimators.
Method: Causal
  Uses: A sophisticated approach; very good for medium- and long-term forecasts.
  Considerations: Must have historical data; relationships can be difficult to specify.
Method: Time series
  Uses: Easy to implement; works well when the series is relatively stable.
  Considerations: Relies exclusively on past data; most useful for short-term estimates.
31
Method | Pattern of Data | Time Horizon | Type of Model | Min. Data (Nonseasonal) | Min. Data (Seasonal)
Naïve | ST, T, S | S | TS | 1 | -
Simple averages | ST | S | TS | 30 | -
Moving averages | ST | S | TS | 4-20 | -
Single exponential smoothing | ST | S | TS | 2 | -
Linear (double) exponential smoothing (Holt's) | T | S | TS | 3 | -
Quadratic exponential smoothing | T | S | TS | 4 | -
Seasonal exponential smoothing (Winters') | S | S | TS | - | 2 x s
Adaptive filtering | S | S | TS | - | 5 x s
Simple regression | T | I | C | 10 | -
Multiple regression | C, S | I | C | 10 x V | -
Classical decomposition | S | S | TS | - | 5 x s
Exponential trend models | T | I, L | TS | 10 | -
S-curve fitting | T | I, L | TS | 10 | -
Gompertz models | T | I, L | TS | 10 | -
Growth curves | T | I, L | TS | 10 | -
Census X-12 | S | S | TS | - | 6 x s
ARIMA (Box-Jenkins) | ST, T, C, S | S | TS | 24 | 3 x s
Leading indicators | C | S | C | 24 | -
Econometric models | C | S | C | 30 | -
Time series multiple regression | T, S | I, L | C | - | 6 x s
Pattern of data: ST, stationary; T, trended; S, seasonal; C, cyclical.
Time horizon: S, short term (less than three months); I, intermediate; L, long term.
Type of model: TS, time series; C, causal.
s, length of seasonality; V, number of variables.
32
Measuring Forecast Error
Basic Forecasting Notation
  • Yt = actual value of the time series in period t
  • Ŷt = forecast value for time period t
  • et = Yt - Ŷt = forecast error (residual) in
    period t

33
Measuring Forecasting Error (Cont.)
The Mean Absolute Deviation (MAD)
The Mean Squared Error (MSE)
Equations (3.7 - 3.11)
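For reference, with et = Yt - Ŷt, the standard definitions of the accuracy measures covered by Equations (3.7)-(3.11) (MAD, MSE, RMSE, MAPE, and MPE, the measures used in Example 3.6) are:

\mathrm{MAD} = \frac{1}{n}\sum_{t=1}^{n} |e_t|
\mathrm{MSE} = \frac{1}{n}\sum_{t=1}^{n} e_t^2
\mathrm{RMSE} = \sqrt{\mathrm{MSE}}
\mathrm{MAPE} = \frac{1}{n}\sum_{t=1}^{n} \frac{|e_t|}{|Y_t|}
\mathrm{MPE} = \frac{1}{n}\sum_{t=1}^{n} \frac{e_t}{Y_t}

(MAPE and MPE are often expressed as percentages by multiplying by 100.)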
34
Example 3.6 (Page 83)
Error measures are used for
  • The measurement of a technique's usefulness or
    reliability.
  • Comparison of the accuracy of two different
    techniques.
  • The search for an optimal technique.
  • Evaluation of the model using MAD, MSE, RMSE,
    MAPE, and MPE (a computational sketch follows
    below).
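A compact sketch of these measures in Python (the actual and forecast values below are placeholders, not the Example 3.6 data):

```python
import numpy as np

def error_measures(actual, forecast):
    """MAD, MSE, RMSE, MAPE (%), and MPE (%) for paired actual/forecast values."""
    y = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    e = y - f  # forecast errors (residuals)
    return {
        "MAD": np.mean(np.abs(e)),
        "MSE": np.mean(e ** 2),
        "RMSE": np.sqrt(np.mean(e ** 2)),
        "MAPE": 100 * np.mean(np.abs(e / y)),
        "MPE": 100 * np.mean(e / y),
    }

# Placeholder values for illustration
actual = [58, 54, 60, 55, 62, 62, 65, 63, 70]
forecast = [60, 58, 54, 60, 55, 62, 62, 65, 63]
print(error_measures(actual, forecast))
```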

35
Empirical Evaluation of Forecasting Methods
Results of the forecast-accuracy comparison for a
sample of 3,003 time series (1997)
  • Complex methods do not necessarily produce more
    accurate forecasts than simpler ones.
  • Various accuracy measures (MAD, MSE, MAPE)
    produce consistent results.
  • The performance of methods depends on the
    forecasting horizon and the kind of data
    analyzed (yearly, quarterly, monthly).

36
Determining the Adequacy of a Forecasting
Technique
  • Do the residuals indicate a random series?
  • (Examine the autocorrelation coefficients of
    the residuals; there should be no significant
    ones. A brief diagnostic sketch follows below.)
  • Are they approximately normally distributed?
  • Is the technique simple and understood by
    decision makers?
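A brief diagnostic sketch in Python (assumes a one-dimensional array of residuals; the 2/sqrt(n) limit is the usual rough significance bound, and the Shapiro-Wilk test stands in for the normality check):

```python
import numpy as np
from scipy.stats import shapiro

def residual_checks(residuals, max_lag=10):
    """Rough adequacy checks: residual autocorrelations and a normality test."""
    e = np.asarray(residuals, dtype=float)
    n = len(e)
    dev = e - e.mean()
    denom = np.sum(dev ** 2)
    r = [np.sum(dev[k:] * dev[:-k]) / denom for k in range(1, max_lag + 1)]
    bound = 2 / np.sqrt(n)  # rough 95% significance limit for r_k
    _, normality_p = shapiro(e)  # Shapiro-Wilk test of normality
    return {
        "significant_lags": [k for k, rk in enumerate(r, start=1) if abs(rk) > bound],
        "normality_p_value": normality_p,
    }

# Placeholder residuals for illustration
rng = np.random.default_rng(0)
print(residual_checks(rng.normal(size=50)))
```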