Title: Ch5 Relaxing the Assumptions of the Classical Model
- 1. Multicollinearity: What Happens if the Regressors Are Correlated?
- 2. Heteroscedasticity: What Happens if the Error Variance Is Nonconstant?
- 3. Autocorrelation: What Happens if the Error Terms Are Correlated?
1. Multicollinearity
- Perfect collinearity: one regressor is an exact linear function of the others.
- Multicollinearity: two or more variables are highly (but not perfectly) correlated with each other.
- The easiest way to test for multicollinearity is to examine the standard errors of the coefficients.
- A reasonable method to relieve multicollinearity is to drop some of the highly correlated variables.
1. Tests of Multicollinearity
- A relatively high R-squared in an equation with few significant t statistics
- Relatively high simple correlations between one or more pairs of explanatory variables
- The above criteria are not very applicable to time series data, and they cannot detect multicollinearity that arises because three or four variables are jointly related to each other.
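The pairwise-correlation criterion can be checked directly from the data. One standard diagnostic that does handle the case of three or four jointly related variables (not discussed above, but a common extension) is the variance inflation factor, VIF_j = 1/(1 - R_j^2), where R_j^2 comes from regressing variable j on the remaining regressors. A minimal sketch with numpy on synthetic data; the data and function name are illustrative:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (no intercept column).

    VIF_j = 1 / (1 - R_j^2), where R_j^2 is the R-squared from regressing
    column j on the remaining columns plus an intercept.
    """
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Synthetic example: x2 is nearly collinear with x1, x3 is unrelated.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.05 * rng.normal(size=200)
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
print(vif(X))  # first two VIFs are large, third is near 1
```

A common rule of thumb treats VIF values above 10 as a sign of serious multicollinearity.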
2. Heteroscedasticity
- Impact of heteroscedasticity on parameter estimators
- Corrections for heteroscedasticity
- Tests for heteroscedasticity
2. Impact of Heteroscedasticity
- The existence of heteroscedasticity makes the OLS parameter estimators inefficient, although these estimators are still unbiased and consistent.
- Heteroscedasticity often occurs when dealing with cross-sectional data.
2. Correction of Heteroscedasticity
- Known Variance
- Unknown Variance (Error Variance Varies Directly with an Independent Variable)
Known Variance

Two-variable regression model: Y_i = β1 + β2*X_i + u_i with Var(u_i) = σ_i^2 known. Dividing every term by σ_i gives

Y_i/σ_i = β1*(1/σ_i) + β2*(X_i/σ_i) + u_i/σ_i

The same transformation carries over to the multiple linear regression model. Let u_i* = u_i/σ_i. Because Var(u_i*) = σ_i^2/σ_i^2 = 1 is constant, the transformed model satisfies the classical assumptions. Therefore WLS (OLS applied to the transformed data) is BLUE.
Unknown Variance

Suppose the error variance varies directly with an independent variable, e.g. Var(u_i) = σ^2*X_i^2. Dividing every term by X_i gives

Y_i/X_i = β1*(1/X_i) + β2 + u_i/X_i

Let u_i* = u_i/X_i. Because Var(u_i*) = σ^2*X_i^2/X_i^2 = σ^2 is constant, the transformed model is homoscedastic. Therefore WLS is BLUE.
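The weighting transformation can be sketched numerically. The sketch below assumes the textbook case where the error standard deviation is proportional to the regressor; the data are synthetic and purely illustrative:

```python
import numpy as np

# WLS sketch: Var(u_i) = sigma^2 * z_i^2 with z_i known (here z_i = x_i,
# i.e. error s.d. proportional to the regressor -- an assumption of this
# illustration). Dividing every term of y = b0 + b1*x + u by z_i makes the
# transformed error homoscedastic, so OLS on the transformed data is BLUE.
rng = np.random.default_rng(1)
n = 4000
x = rng.uniform(1, 10, n)
z = x
u = z * rng.normal(size=n)        # heteroscedastic error
y = 2.0 + 0.5 * x + u             # true b0 = 2.0, b1 = 0.5

# Transformed model: y/z = b0*(1/z) + b1*(x/z) + u/z
Xw = np.column_stack([1 / z, x / z])
yw = y / z
beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
print(beta)  # close to (2.0, 0.5)
```

Plain OLS on the untransformed data would also be close to (2.0, 0.5) on average, but with larger sampling variance; the gain from WLS is efficiency, not unbiasedness.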
2. Tests of Heteroscedasticity
- Informal Test Method
- Observe the residuals to see whether the estimated variances differ from observation to observation.
- Formal Test Methods
- Goldfeld-Quandt Test
- Breusch-Pagan Test and the White Test
Goldfeld-Quandt Test
- Steps
- Order the data by the magnitude of the independent variable X
- Omit the middle d observations
- Fit two separate regressions, the first for the smaller values of X and the second for the larger values of X
- Calculate the residual sum of squares of each regression, RSS1 and RSS2
- Assuming the error process is normally distributed, RSS2/RSS1 ~ F((N-d-2k)/2, (N-d-2k)/2)
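The steps above can be sketched with numpy on synthetic data whose error variance grows with X (the data, seed, and helper name are illustrative):

```python
import numpy as np

def ols_rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

# Synthetic data (illustrative): error s.d. grows with x.
rng = np.random.default_rng(2)
N, d, k = 120, 20, 2                 # sample size, omitted middle block, coefficients
x = np.sort(rng.uniform(1, 10, N))   # step 1: order by the regressor
y = 1.0 + 2.0 * x + x * rng.normal(size=N)

m = (N - d) // 2                     # step 2: omit the middle d observations
X1 = np.column_stack([np.ones(m), x[:m]])    # step 3: two separate regressions
X2 = np.column_stack([np.ones(m), x[-m:]])
rss1 = ols_rss(y[:m], X1)            # step 4: RSS of each regression
rss2 = ols_rss(y[-m:], X2)
F = rss2 / rss1                      # step 5: compare with F((N-d-2k)/2, (N-d-2k)/2)
print(F)  # well above 1 signals variance increasing with x
```

Here (N-d-2k)/2 = 48 degrees of freedom in each tail, matching the m - k = 48 residual degrees of freedom of each fitted regression.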
Breusch-Pagan Test
- Steps
- First calculate the least-squares residuals and use these residuals to form the squared residuals e_i^2
- Run the auxiliary regression of e_i^2 on the explanatory variables
- If the error term is normally distributed and the null hypothesis is valid, then n*R^2 from the auxiliary regression asymptotically follows a chi-squared distribution
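A minimal sketch of the LM (n*R^2) form of the test, on synthetic heteroscedastic data; the data and helper function are illustrative, not from the chapter:

```python
import numpy as np

def r_squared(y, X):
    """R^2 from an OLS fit of y on X (X must include the intercept column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1 - (resid @ resid) / tss

# Synthetic data (illustrative): error s.d. proportional to x.
rng = np.random.default_rng(3)
n = 400
x = rng.uniform(1, 10, n)
y = 1.0 + 2.0 * x + x * rng.normal(size=n)

# Step 1: least-squares residuals, squared.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e2 = (y - X @ beta) ** 2

# Step 2: auxiliary regression of e^2 on the regressors; LM = n * R^2.
lm = n * r_squared(e2, X)
print(lm)  # compare with chi2(1): values above 3.84 reject homoscedasticity at 5%
```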
White Test
- Steps
- First calculate the least-squares residuals and use these residuals to form the squared residuals e_i^2
- Run the auxiliary regression of e_i^2 on the explanatory variables, their squares, and their cross products
- When the null hypothesis is valid, n*R^2 from the auxiliary regression asymptotically follows a chi-squared distribution
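The same n*R^2 machinery applies, with a richer auxiliary design. A sketch on synthetic data; with a single regressor there are no cross products, so the auxiliary design reduces to the regressor and its square (an assumption of this illustration):

```python
import numpy as np

def r_squared(y, X):
    """R^2 from an OLS fit of y on X (X must include the intercept column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    tss = (y - y.mean()) @ (y - y.mean())
    return 1 - (resid @ resid) / tss

# Synthetic data (illustrative): error s.d. proportional to x.
rng = np.random.default_rng(7)
n = 400
x = rng.uniform(1, 10, n)
y = 1.0 + 2.0 * x + x * rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e2 = (y - X @ beta) ** 2

# Auxiliary design: regressor plus its square; two slope coefficients -> chi2(2).
Z = np.column_stack([np.ones(n), x, x ** 2])
lm = n * r_squared(e2, Z)
print(lm)  # compare with chi2(2): values above 5.99 reject at the 5% level
```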
3. Serial Correlation
- Impact of serial correlation on OLS estimators
- Corrections for serial correlation
- Tests for serial correlation.
Serial Correlation
- Serial correlation often occurs in time-series studies
- First-order serial correlation: u_t = ρ*u_{t-1} + ε_t
- Positive serial correlation: ρ > 0
Impact of Serial Correlation
- Serial correlation will not affect the unbiasedness or consistency of the OLS estimators, but it does affect their efficiency.
Correction of Serial Correlation
- The model with serially correlated error terms is usually described as Y_t = β1 + β2*X_t + u_t, with u_t = ρ*u_{t-1} + ε_t
- Formula for the first-order serial-correlation coefficient: ρ = Σ(u_t * u_{t-1}) / Σ(u_{t-1}^2)
Correction of Serial Correlation

When ρ is known: Generalized Differencing. Transform the model as

Y_t - ρ*Y_{t-1} = β1*(1 - ρ) + β2*(X_t - ρ*X_{t-1}) + (u_t - ρ*u_{t-1})

Since u_t - ρ*u_{t-1} = ε_t is serially uncorrelated, OLS on the transformed data is efficient.
Methods for Estimating ρ
- The Cochrane-Orcutt Procedure
- The Hildreth-Lu Procedure
Cochrane-Orcutt Procedure
- Steps
- Using OLS to estimate the original model
- Using the residuals e_t from the above equation to perform the regression e_t = ρ*e_{t-1} + v_t
- Using the estimated value of ρ to perform the generalized differencing transformation and yield new parameter estimates
- Substituting these revised parameters into the original equation and obtaining new estimated residuals
- Using these second-round residuals to run the regression again and obtain a new estimate of ρ
- The above iterative process can be carried on until the new estimate of ρ differs from the old one by less than 0.01 or 0.005.
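The iteration above can be sketched with numpy on a simulated AR(1)-error model (true ρ = 0.7, β1 = 1.0, β2 = 2.0; all data and tolerances are illustrative):

```python
import numpy as np

# Cochrane-Orcutt sketch for y_t = b0 + b1*x_t + u_t, u_t = rho*u_{t-1} + e_t.
rng = np.random.default_rng(4)
T = 500
x = rng.normal(size=T)
u = np.zeros(T)
e = rng.normal(size=T)
for t in range(1, T):
    u[t] = 0.7 * u[t - 1] + e[t]
y = 1.0 + 2.0 * x + u

rho = 0.0                       # rho = 0 makes the first pass plain OLS
for _ in range(50):
    # Generalized differencing with the current rho estimate.
    ys = y[1:] - rho * y[:-1]
    Xs = np.column_stack([np.ones(T - 1), x[1:] - rho * x[:-1]])
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    b0, b1 = beta[0] / (1 - rho), beta[1]   # intercept in transformed model is b0*(1-rho)
    # Residuals of the ORIGINAL equation, then regress e_t on e_{t-1}.
    resid = y - b0 - b1 * x
    rho_new = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])
    if abs(rho_new - rho) < 0.005:          # convergence criterion from the steps above
        rho = rho_new
        break
    rho = rho_new
print(rho, b1)  # rho near 0.7, slope near 2.0
```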
The Hildreth-Lu Procedure
- Steps
- Specifying a set of grid values for ρ
- For each value of ρ, estimating the transformed equation
- Selecting the equation with the lowest sum of squared residuals as the best equation
- The above procedure can be continued with new grid values chosen in the neighborhood of the value first selected, until the desired accuracy is attained.
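The grid search can be sketched as follows, on simulated AR(1)-error data (true ρ = 0.6; the grid spacing and data are illustrative):

```python
import numpy as np

# Hildreth-Lu sketch: grid-search rho in (-1, 1), estimate the transformed
# equation for each value, keep the rho with the smallest RSS, then refine
# the grid around the winner.
rng = np.random.default_rng(5)
T = 500
x = rng.normal(size=T)
u = np.zeros(T)
e = rng.normal(size=T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + e[t]
y = 0.5 + 1.5 * x + u

def rss_at(rho):
    """RSS of the generalized-differenced regression at a given rho."""
    ys = y[1:] - rho * y[:-1]
    Xs = np.column_stack([np.ones(T - 1), x[1:] - rho * x[:-1]])
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    resid = ys - Xs @ beta
    return resid @ resid

coarse = np.arange(-0.9, 0.95, 0.1)              # first-pass grid
best = min(coarse, key=rss_at)
fine = np.arange(best - 0.09, best + 0.1, 0.01)  # refine near the winner
best = min(fine, key=rss_at)
print(round(best, 2))  # near 0.6
```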
Test of Serial Correlation
- Durbin-Watson Test
- Test statistic: DW = Σ(e_t - e_{t-1})^2 / Σ(e_t^2)
- DW lies in [0, 4], with values near 2 indicating no first-order serial correlation. Positive serial correlation is associated with DW values below 2, and negative serial correlation is associated with DW values above 2.
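The statistic is easy to compute directly. A sketch comparing white noise with a positively autocorrelated series (synthetic data, illustrative only):

```python
import numpy as np

def durbin_watson(e):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2); roughly DW ~ 2*(1 - rho_hat)."""
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(6)
white = rng.normal(size=1000)          # no serial correlation
ar = np.zeros(1000)
for t in range(1, 1000):               # AR(1) with rho = 0.8
    ar[t] = 0.8 * ar[t - 1] + white[t]

print(durbin_watson(white))  # near 2: no first-order serial correlation
print(durbin_watson(ar))     # well below 2: positive serial correlation
```

The approximation DW ~ 2*(1 - ρ̂) explains the [0, 4] range: ρ̂ = 1 gives DW near 0, ρ̂ = -1 gives DW near 4.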
Range of the DW Statistic

Value of DW           Result
0 < DW < dl           Reject null hypothesis: positive serial correlation
dl < DW < du          Result indeterminate
du < DW < 2           Accept null hypothesis
2 < DW < 4-du         Accept null hypothesis
4-du < DW < 4-dl      Result indeterminate
4-dl < DW < 4         Reject null hypothesis: negative serial correlation