Title: Multiple Regression Analysis


1
Multiple Regression Analysis
  • y = b0 + b1x1 + b2x2 + ... + bkxk + u
  • 1. Estimation

2
Similar to Simple Regression
  • b0 is still the intercept
  • b1 to bk all called slope parameters
  • u is still the error term (or disturbance)
  • Still assume that E(u|x1, x2, ..., xk) = 0
  • Still minimizing the sum of squared residuals
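
As a rough illustration (not from the original slides), the sketch below simulates data from a model of this form and computes the OLS estimates with NumPy; the variable names and true parameter values are invented for the example.

import numpy as np

# Simulated data for y = b0 + b1x1 + b2x2 + u, with true b0 = 1, b1 = 2, b2 = -3
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))                      # the regressors x1, x2
u = rng.normal(size=n)                           # the error term (disturbance)
y = 1.0 + 2.0 * X[:, 0] - 3.0 * X[:, 1] + u

# OLS: choose the coefficients that minimize the sum of squared residuals
Xmat = np.column_stack([np.ones(n), X])          # prepend a column of ones for the intercept
b_hat, *_ = np.linalg.lstsq(Xmat, y, rcond=None)
print(b_hat)                                     # roughly [1, 2, -3]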

3
Interpreting the coefficient
4
Goodness-of-Fit
5
Goodness-of-Fit (continued)
  • Can compute the fraction of the total sum of
    squares (SST) that is explained by the model.
  • R2 = SSE/SST = 1 - SSR/SST
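
Continuing the simulated sketch from the estimation slide (it reuses y, Xmat, and b_hat defined there), the two expressions for R2 can be checked directly:

y_hat = Xmat @ b_hat                     # fitted values
SST = np.sum((y - y.mean()) ** 2)        # total sum of squares
SSE = np.sum((y_hat - y.mean()) ** 2)    # explained sum of squares
SSR = np.sum((y - y_hat) ** 2)           # residual sum of squares
print(SSE / SST, 1 - SSR / SST)          # the two values agree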

6
Too Many or Too Few Variables
  • What happens if we include variables in our
    specification that don't belong?
  • OLS estimators remain unbiased. Our test will
    still be valid, but less powerful (meaning less
    likely to detect a significant relationship)
  • What if we exclude a variable from our
    specification that does belong?
  • OLS estimators will usually be biased
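
A small simulated sketch of the first case (my own example, not from the slides): including a regressor that does not belong leaves the OLS estimates roughly unbiased, and its estimated coefficient sits near zero. The power loss would show up in larger standard errors, which this toy example does not report.

import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x_extra = rng.normal(size=n)                        # does not belong in the model
y = 1.0 + 2.0 * x1 + rng.normal(size=n)             # true model uses only x1

X_long = np.column_stack([np.ones(n), x1, x_extra])
b_long, *_ = np.linalg.lstsq(X_long, y, rcond=None)
print(b_long)                                       # roughly [1, 2, 0]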

7
Omitted Variable Bias
8
Omitted Variable Bias Summary
  • Two cases where there is no bias using a simple
    regression
  • b2 = 0, that is, x2 doesn't really belong in the model
  • x1 and x2 are uncorrelated in the sample.
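
A simulated sketch of why these are the only no-bias cases (my notation, not the slides'): with a true model y = b0 + b1x1 + b2x2 + u, omitting x2 pushes the simple-regression estimate of b1 toward b1 + b2*d1, where d1 is the slope from regressing x2 on x1. That extra term vanishes only when b2 = 0 or when x1 and x2 are uncorrelated (d1 = 0).

import numpy as np

rng = np.random.default_rng(2)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)               # correlated with x1, so d1 is about 0.5
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

X_omit = np.column_stack([np.ones(n), x1])       # x2 left out of the specification
b_omit, *_ = np.linalg.lstsq(X_omit, y, rcond=None)
print(b_omit[1])                                 # about 2 + 3 * 0.5 = 3.5, not the true b1 = 2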

9
  • The following 3 slides are some technical notes.
  • You can ignore them if you find them boring.

10
Assumptions in OLS
  • Population model is linear in parameters: y =
    b0 + b1x1 + b2x2 + ... + bkxk + u
  • We use a random sample of size n, (xi1, xi2, ...,
    xik, yi), i = 1, 2, ..., n, from the population
  • E(u|x1, x2, ..., xk) = 0
  • None of the x's is constant, and there are no
    exact linear relationships among the x's

11
Further assumption
  • Assume Var(u|x1, x2, ..., xk) = σ2
    (homoskedasticity)
  • These 5 assumptions are known as the
    Gauss-Markov assumptions

12
The Gauss-Markov Theorem
  • Given our 5 Gauss-Markov assumptions, it can be
    shown that OLS is BLUE, the Best Linear Unbiased
    Estimator.
  • Thus, if the assumptions hold, use OLS.