Title: MANAGERIAL ECONOMICS 11th Edition
1. MANAGERIAL ECONOMICS, 11th Edition
2. Demand Estimation
3. Chapter 6 Overview
- Demand Curve Estimation
- Regression Analysis
- Measuring Regression Model Significance
- Measures of Individual Variable Significance
4. Demand Curve Estimation
- Simple Linear Demand Curves
  - The best estimation method balances marginal costs and marginal benefits.
  - Simple linear relations are useful for demand estimation.
- Using Simple Linear Demand Curves
  - Straight-line relations give useful approximations.
5. Example
- A business sells 2,000 units per month at a price of $10 each. It can sell 250 more items per month for each $0.25 reduction in price. What price per unit will maximize the monthly revenue?
- A price of p = $10 corresponds to x = 2,000 and a price of p = $9.75 corresponds to x = 2,250. Using this information you can use the point-slope form to create the price equation.
6. - m = (10 - 9.75) / (2,000 - 2,250) = -0.001 (find the slope)
- p - 10 = -0.001(x - 2,000) (point-slope form)
- p = -0.001x + 12
- Substituting this value into the revenue equation produces
  R = xp = x(-0.001x + 12) = -0.001x² + 12x
- Total revenue at x = 2,000 is $20,000; total revenue at x = 2,250 is $21,937.50.
- To maximize the revenue function, find the critical numbers:
  R' = 12 - 0.002x = 0
7. Continued
- x = 6,000 (critical number)
- The price that corresponds to this production level is
  p = 12 - 0.001x (demand function)
    = 12 - 0.001(6,000) (substitute 6,000 for x)
    = $6 (price per unit)
- [Figure: the revenue curve R = -0.001x² + 12x reaches its maximum at (6,000, 36,000); therefore x = 6,000 maximizes monthly revenue at $36,000.]
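A quick numerical check of this example (a sketch added here, not part of the original slides): the short Python script below evaluates R(x) = -0.001x² + 12x over a grid of output levels and confirms that revenue peaks at x = 6,000, giving a price of $6.

```python
import numpy as np

# Demand (price) function from the example: p = 12 - 0.001x
# Revenue: R(x) = x * p = -0.001x^2 + 12x
x = np.arange(0, 12001)            # candidate monthly quantities
revenue = -0.001 * x**2 + 12 * x

best = x[np.argmax(revenue)]
print(best, revenue.max())          # 6000 36000.0  -> revenue-maximizing output
print(12 - 0.001 * best)            # 6.0           -> revenue-maximizing price
```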
8. Regression Analysis
- Regression analysis is a statistical technique that attempts to explain movements in one variable, the dependent variable, as a function of movements in a set of other variables, called independent (or explanatory) variables, through the quantification of a single equation.
- However, a regression result, no matter how statistically significant, cannot prove causality. All regression analysis can do is test whether a significant quantitative relationship exists.
9. Specifying the Regression Model
- Linear Model: assumes X and Y are linearly related. Each coefficient tells us how much quantity demanded will change in response to a one-unit change in the corresponding X variable.
- Multiplicative Model: a nonlinear relation that involves X-variable interactions. The coefficient estimates are interpreted as estimates of the constant elasticity of Y with respect to X, that is, the percentage change in Y due to a 1 percent change in X (see the sketch below).
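To make the contrast concrete, here is a minimal Python sketch on made-up data (the demand relation, sample size, and elasticity of -1.5 are assumptions for illustration only): fitting in levels gives a units-per-dollar slope, while fitting in logs recovers the constant elasticity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: quantity demanded responds to price with a constant
# elasticity of about -1.5 (multiplicative model Q = a * P^b).
price = rng.uniform(5, 15, size=200)
quantity = 500 * price**-1.5 * np.exp(rng.normal(0, 0.05, size=200))

# Linear model: slope = change in Q per one-unit change in P.
b_lin, a_lin = np.polyfit(price, quantity, 1)

# Multiplicative model estimated in logs: slope = elasticity of Q w.r.t. P.
b_log, a_log = np.polyfit(np.log(price), np.log(quantity), 1)

print(f"linear slope (units per $): {b_lin:.2f}")
print(f"log-log slope (elasticity): {b_log:.2f}")   # close to -1.5
```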
10. - The equation that describes how y is related to x and an error term is called the regression model.
- The simple linear regression model is
  y = β0 + β1x + ε
- where
  - β0 and β1 are called parameters of the model,
  - ε is a random variable called the error term.
11. Assumptions About the Error Term ε
1. The error ε is a random variable with a mean of zero.
2. The variance of ε, denoted by σ², is the same for all values of the independent variable.
3. The values of ε are independent.
4. The error ε is a normally distributed random variable.
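As a rough illustration of these assumptions, the sketch below simulates data from y = β0 + β1x + ε with a zero-mean, constant-variance, independent, normal error term (the parameter values and sample size are arbitrary) and checks that least squares recovers the parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed true parameters for the simulation: beta0 = 2, beta1 = 0.5
beta0, beta1 = 2.0, 0.5
x = rng.uniform(0, 10, size=500)

# Error term: mean zero, constant variance, independent, normal --
# exactly the four assumptions listed above.
e = rng.normal(loc=0.0, scale=1.0, size=500)

y = beta0 + beta1 * x + e

# Least-squares estimates b0, b1 should be close to the true parameters.
b1, b0 = np.polyfit(x, y, 1)
print(f"b0 = {b0:.2f}, b1 = {b1:.2f}")   # roughly 2 and 0.5
```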
12. - The simple linear regression equation is
  E(y) = β0 + β1x
- The graph of the regression equation is a straight line.
- β0 is the y-intercept of the regression line.
- β1 is the slope of the regression line.
- E(y) is the expected value of y for a given x value.
13. Positive Linear Relationship: [Figure: regression line with intercept β0 and positive slope β1]
14. Negative Linear Relationship: [Figure: regression line with intercept β0 and negative slope β1]
15. No Relationship: [Figure: horizontal regression line with intercept β0 and slope β1 = 0]
16. - The estimated simple linear regression equation is
  ŷ = b0 + b1x
- The graph is called the estimated regression line.
- b0 is the y-intercept of the line.
- b1 is the slope of the line.
17. Least Squares Criterion
- min Σ(yi - ŷi)²
- where
  - yi = observed value of the dependent variable for the ith observation
  - ŷi = estimated value of the dependent variable for the ith observation
- This regression technique chooses the estimates b0 and b1 so as to minimize the sum of the squared residuals (see the sketch below).
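A minimal sketch of the criterion in code: the closed-form slope and intercept formulas below are the standard textbook least-squares formulas, and the price/quantity points simply extend the demand example from earlier in this chapter (250 more units for each $0.25 price cut).

```python
import numpy as np

def least_squares_line(x, y):
    """Choose b0, b1 to minimize sum((y_i - yhat_i)**2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    return b0, b1

# Points generated by the earlier demand relation x = 12,000 - 1,000p
price = [10.0, 9.75, 9.5, 9.25, 9.0]
units = [2000, 2250, 2500, 2750, 3000]

b0, b1 = least_squares_line(price, units)
print(b0, b1)   # 12000.0 -1000.0 for this exactly linear data
```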
18. The Multiple Regression Model
- The multiple regression model:
  y = β0 + β1x1 + β2x2 + . . . + βpxp + ε
- The multiple regression equation:
  E(y) = β0 + β1x1 + β2x2 + . . . + βpxp
- The estimated multiple regression equation:
  ŷ = b0 + b1x1 + b2x2 + . . . + bpxp
19. - Least Squares Criterion: min Σ(yi - ŷi)²
- Computation of Coefficient Values
  - The formulas for the regression coefficients b0, b1, b2, . . . , bp involve the use of matrix algebra. We will rely on computer software packages to perform the calculations (a sketch of the matrix-algebra approach follows below).
- A Note on Interpretation of Coefficients
  - bi represents an estimate of the change in y corresponding to a one-unit change in xi when all other independent variables are held constant.
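A sketch of the matrix-algebra computation mentioned above, using simulated data (the coefficient values and sample size are assumptions): solving the normal equations b = (X'X)⁻¹X'y reproduces what packaged software reports.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: y depends on two explanatory variables plus noise.
n = 100
X = rng.normal(size=(n, 2))
y = 1.0 + 2.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

# Design matrix with a column of ones for the intercept b0.
X1 = np.column_stack([np.ones(n), X])

# Least squares via the normal equations b = (X'X)^(-1) X'y
# (np.linalg.lstsq does the same thing more stably).
b = np.linalg.solve(X1.T @ X1, X1.T @ y)
print(b)   # roughly [1.0, 2.0, -3.0]
```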
21. Relationship Among SST, SSR, SSE
- [Figure: scatter plot showing, for one observation, the deviations of the observed value, the estimated value, and the mean of y that make up SSE, SSR, and SST.]
- where
  - SST = total sum of squares
  - SSR = sum of squares due to regression
  - SSE = sum of squares due to error
22. Relationship Among SST, SSR, SSE
- SST = SSR + SSE
- where
  - SST = total sum of squares
  - SSR = sum of squares due to regression
  - SSE = sum of squares due to error
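The identity SST = SSR + SSE can be verified numerically; the sketch below fits a line to simulated data (arbitrary parameter values, chosen only for illustration) and checks that the two sides agree.

```python
import numpy as np

rng = np.random.default_rng(3)

x = rng.uniform(0, 10, size=50)
y = 3 + 2 * x + rng.normal(size=50)

b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)       # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)   # explained by the regression
sse = np.sum((y - y_hat) ** 2)          # left in the residuals

print(f"SST = {sst:.2f}, SSR + SSE = {ssr + sse:.2f}")   # equal
```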
24. Measuring Regression Model Significance
- The standard error of the estimate increases with scatter about the regression line.
25. Coefficient of Determination
- Multiple coefficient of determination: R² = SSR/SST
- Adjusted multiple coefficient of determination: adjusted R² = 1 - (1 - R²)(n - 1)/(n - p - 1)
26. Goodness of Fit, r and R²
- r = ±1 means perfect correlation; r = 0 means no correlation.
- R² = 1 means perfect fit; R² = 0 means no relation.
- Corrected coefficient of determination, adjusted R²: adjusts R² downward for small samples (see the check below).
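A small numeric check of these formulas, using the ANOVA totals reported later in this chapter's store-sales example (SSE = 99.46, SST = 599.79, n = 20, p = 2); the adjusted-R² formula is the usual small-sample correction shown on the previous slide.

```python
# R^2 and adjusted R^2 from the sums of squares of the chapter example.
sse, sst = 99.46, 599.79
n, p = 20, 2

r2 = 1 - sse / sst                                # = SSR / SST
r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)     # corrected for sample size

print(round(r2, 3), round(r2_adj, 3))             # 0.834, 0.815 -> matches the
                                                  # R-sq / R-sq(adj) output below
```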
28. F Statistic
- Tells whether R² is statistically significant.
29. Measures of Individual Variable Significance
- t statistics
  - t statistics compare a sample characteristic to the standard deviation of that characteristic.
  - A calculated t statistic greater than two (in absolute value) suggests a strong effect of X on Y (95% confidence).
  - A calculated t statistic greater than three suggests a very strong effect of X on Y (99% confidence).
- Two-tail t tests: tests of effect.
- One-tail t tests: tests of magnitude or direction (see the sketch below).
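The rules of thumb above correspond to tabled critical values; the sketch below uses scipy to reproduce them for the sample sizes of the chapter example that follows (n = 20 observations and p = 2 explanatory variables, taken from that example).

```python
from scipy import stats

# Sizes from the store-sales example that follows.
n, p = 20, 2
df_error = n - p - 1              # 17

# Two-tail t test at the 95% level: reject H0: beta_i = 0 if |t| > t_crit.
t_crit = stats.t.ppf(0.975, df_error)
print(round(t_crit, 2))           # 2.11 -- close to the "more than two" rule

# F test of overall significance at the 5% level.
f_crit = stats.f.ppf(0.95, p, df_error)
print(round(f_crit, 2))           # 3.59
```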
30. Example
- An economist collected data for a sample of 20 computer stores. A suggestion was made that regression analysis could be used to determine whether sales were related to the years of experience of the manager and the score on the firm's manager aptitude test. The years of experience, the score on the aptitude test, and the corresponding daily sales (in $1,000s) for the sample of 20 stores are shown on the next slide.
32. Sample data (daily sales in $1,000s)

  Exper.  Score  Sales      Exper.  Score  Sales
    4       78    24          9       88    38
    7      100    43          2       73    26.6
    1       86    23.7       10       75    36.2
    5       82    34.3        5       81    31.6
    8       86    35.8        6       74    29
   10       84    38          8       87    34
    0       75    22.2        4       79    30.1
    1       80    23.1        6       94    33.9
    6       83    30          3       70    28.2
    6       91    33          3       89    30
33. Excel Computer Output
- The estimated regression is
  Sales = 3.17 + 1.40 Exper + 0.251 Score

  Predictor   Coef      Stdev     t-ratio    p
  Constant    3.174     6.156      0.52     0.613
  Exper       1.4039    0.1986     7.07     0.000
  Score       0.25089   0.07735    3.24     0.005

  s = 2.419   R-sq = 83.4%   R-sq(adj) = 81.5%
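This output should be reproducible from the slide-32 data; a sketch using statsmodels (one of several packages that could be used; the slides do not specify the software beyond Excel) is shown below. Small rounding differences aside, the coefficients, R-sq values, and t-ratios should match the output above.

```python
import numpy as np
import statsmodels.api as sm

# Data from slide 32 (sales in $1,000s).
exper = [4, 7, 1, 5, 8, 10, 0, 1, 6, 6, 9, 2, 10, 5, 6, 8, 4, 6, 3, 3]
score = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91,
         88, 73, 75, 81, 74, 87, 79, 94, 70, 89]
sales = [24, 43, 23.7, 34.3, 35.8, 38, 22.2, 23.1, 30, 33,
         38, 26.6, 36.2, 31.6, 29, 34, 30.1, 33.9, 28.2, 30]

X = sm.add_constant(np.column_stack([exper, score]))
model = sm.OLS(sales, X).fit()

print(model.params)                          # about [3.17, 1.40, 0.251]
print(model.rsquared, model.rsquared_adj)    # about 0.834, 0.815
print(model.tvalues)                         # about [0.52, 7.07, 3.24]
```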
34. Interpreting the Coefficients
- b1 = 1.404: sales are expected to increase by $1,404 for each additional year of experience (when the variable score on the manager aptitude test is held constant).
35. Interpreting the Coefficients
- b2 = 0.251: sales are expected to increase by $251 for each additional point scored on the manager aptitude test (when the variable years of experience is held constant).
36. Excel Computer Output (continued)
- Analysis of Variance

  SOURCE        DF      SS        MS        F       P
  Regression     2    500.33    250.16    42.76   0.000
  Error         17     99.46      5.85
  Total         19    599.79
37. F Test
- Hypotheses: H0: β1 = β2 = 0
  Ha: one or both of the parameters is not equal to zero.
- Rejection rule: for α = .05 and d.f. = 2, 17, F.05 = 3.59; reject H0 if F > 3.59.
- Test statistic: F = MSR/MSE = 250.16/5.85 = 42.76
- Conclusion: we can reject H0.
38. t Test for Significance of Individual Parameters
- Hypotheses: H0: βi = 0
  Ha: βi ≠ 0
- Rejection rule: d.f. = n - p - 1; for α = .05 and d.f. = 17, t.025 = 2.11; reject H0 if |t| > 2.11.
- Test statistics (coefficient divided by its standard error, from the slide-33 output): t = 1.4039/0.1986 = 7.07 for Exper and t = 0.25089/0.07735 = 3.24 for Score.
- Conclusions: reject H0: β1 = 0 and reject H0: β2 = 0 (a numerical check follows below).
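A numerical check of the F and t tests above, using only figures already reported in the Excel output (MSR, MSE, the coefficients, and their standard errors); scipy supplies the critical values and the p-value.

```python
from scipy import stats

# ANOVA numbers from the output above.
msr, mse = 250.16, 5.85
df_reg, df_err = 2, 17

F = msr / mse                              # 42.76
p_value = stats.f.sf(F, df_reg, df_err)
print(round(F, 2), p_value < 0.05)         # 42.76 True -> reject H0

# Individual t tests: coefficient divided by its standard error.
t_exper = 1.4039 / 0.1986                  # about 7.07
t_score = 0.25089 / 0.07735                # about 3.24
t_crit = stats.t.ppf(0.975, df_err)        # 2.11
print(abs(t_exper) > t_crit, abs(t_score) > t_crit)   # True True -> both significant
```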
44. Self Test Problem 1