Transcript and Presenter's Notes

Title: Properties of the least squares estimates...


1
  • Properties of the least squares estimates...
  • They are all unbiased estimators, i.e., their
    expected values are equal to the parameters they
    estimate (see 1-4 in section 2.4.1 on page 32;
    don't worry about the proofs given, just get
    the results). A small simulation sketch follows
    this slide.
  • Section 2.4.2 gives the variances of the
    estimators.
  • Since one of our main goals is to say something
    about the parameters in the regression model,
    we'll do this by testing hypotheses about them;
    this means we have to know the distributions of
    the estimators ...
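
A minimal R sketch (not from the text) of the unbiasedness claim: the "true" values beta0 = 2, beta1 = 0.5 and the x grid below are made up for illustration. Averaging the least squares estimates over many simulated samples should come out close to the true parameters.

## simulate many samples from a known simple linear regression model and
## average the least squares estimates over the simulations
set.seed(1)
beta0 <- 2; beta1 <- 0.5; sigma <- 1          # assumed "true" parameter values
x <- seq(1, 10, length.out = 25)

ests <- replicate(5000, {
  y   <- beta0 + beta1 * x + rnorm(length(x), sd = sigma)
  fit <- lm(y ~ x)
  coef(fit)                                   # c(intercept estimate, slope estimate)
})

rowMeans(ests)   # should be close to c(2, 0.5), consistent with unbiasedness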

2
To get the distributions of the estimators, we
must assume normality of the error term in the
model; so not just that E(e) = 0 and V(e) = sigma^2,
but that e ~ N(0, sigma^2). This assumption implies
the following important distributional result about
the estimator of the slope:

    T = (b1-hat - b1) / s.e.(b1-hat) ~ t(n-2)

Notice that we've basically standardized b1-hat by
taking away its mean and dividing by its so-called
standard error (the standard error is the estimated
standard deviation of the estimator)... The
resulting standardized statistic has a
t-distribution with n-2 degrees of freedom. Note
that the df = n-2 = the number of df associated
with the error sum of squares, the numerator in
our estimator s^2 (a small R sketch follows).
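
A rough R sketch of this standardization, using made-up data: it computes b1-hat, its standard error, and the T statistic directly from the simple linear regression formulas, then gets the two-sided p-value for H0: b1 = 0 from the t(n-2) distribution.

## made-up data for illustration
set.seed(2)
x <- runif(20, 0, 10)
y <- 1 + 0.3 * x + rnorm(20)

n      <- length(x)
b1.hat <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0.hat <- mean(y) - b1.hat * mean(x)
s2     <- sum((y - (b0.hat + b1.hat * x))^2) / (n - 2)   # SSE / (n - 2)
se.b1  <- sqrt(s2 / sum((x - mean(x))^2))                # s.e. of b1-hat

t.stat <- (b1.hat - 0) / se.b1            # standardized statistic under H0: b1 = 0
2 * pt(-abs(t.stat), df = n - 2)          # two-sided p-value from t(n-2)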
3
  • So now, we may use the distribution of the
    estimated slope to either
  • test the hypothesis H0: b1 = 0, or
  • get a confidence interval for b1.
  • Let's review both of these concepts from
    elementary statistics (an R sketch follows this
    slide)...
  • compare the value of the test statistic, T on the
    previous slide, assuming the null hypothesis is
    true, with the percentiles of the t(n-2)
    distribution to decide whether to reject the
    null hypothesis or not. This comparison yields a
    so-called p-value; small p-values yield
    rejection while large p-values yield
    non-rejection of the null hypothesis.
  • construct a 100(1 - a)% confidence interval for
    the true slope b1 by the usual estimate +/-
    (margin of error)
  • estimate +/- (value from t table)(s.e. of
    estimator)
  • ...remember that the s.e. of the estimator =
    standard error of the estimator = estimated
    standard deviation of the estimator
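
A sketch of how both steps are packaged by R's lm(); the variable names hardness and temp below are placeholders, not the actual column names of the Hardness data.

## fit the simple linear regression (placeholder variable names)
fit <- lm(hardness ~ temp)

summary(fit)                 # the slope row gives b1-hat, its s.e., the t statistic, and the p-value
confint(fit, level = 0.95)   # estimate +/- (t-table value)(s.e.), i.e. 95% CIs for b0 and b1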

4
  • Similarly, we may either test hypotheses about or
    estimate m0 = b0 + b1x0, the mean response when
    x = x0, using the unbiased least squares
    estimator m0-hat = b0-hat + (b1-hat)x0.
  • Note that when x0 = 0, we have the special case
    of testing and/or estimating b0.
  • Now go back over the Hardness example and do the
    various tests; don't forget to write out an
    interpretation of what the results mean...!
  • Look at how R implements these tests and
    estimates; try it for the Hardness data (see the
    sketch after this slide)...
  • do a scatterplot and plot in the prediction line
    (the regression line, the mean line); can you
    plot the confidence bands around this line using
    formula 2.23 at the top of page 37?
  • do the hypothesis test for no slope and give
    your results in terms of the p-value. Is Hardness
    linearly related to the temperature of the quench
    bath water?
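
A sketch of the plotting and testing exercise, assuming (as a placeholder) that the Hardness data sit in a data frame called hard with columns Temp and Hardness; the bands come from predict() with interval = "confidence", which gives pointwise confidence intervals for the mean response.

## fit the regression of hardness on quench bath temperature
fit <- lm(Hardness ~ Temp, data = hard)

## scatterplot with the fitted (mean) line
plot(Hardness ~ Temp, data = hard)
abline(fit)

## pointwise confidence bands for the mean response, added as dashed lines
new  <- data.frame(Temp = seq(min(hard$Temp), max(hard$Temp), length.out = 100))
band <- predict(fit, newdata = new, interval = "confidence")
lines(new$Temp, band[, "lwr"], lty = 2)
lines(new$Temp, band[, "upr"], lty = 2)

## test of H0: b1 = 0 -- read the p-value from the Temp row of the summary
summary(fit)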