Lecture 16: Point Estimation Concepts and Methods
1
Lecture 16: Point Estimation Concepts and Methods
  • Devore, Ch. 6.1-6.2

2
Topics
  • Point Estimation Concepts
  • Biased Vs. Unbiased Estimators
  • Minimum Variance Estimators
  • Standard Error of a Point Estimate
  • Point Estimation Methods
  • Method of Moments (MOM)
  • Method of Least Squares
  • Method of Maximum Likelihood (MLE)

3
I. Point Estimates
  • Objective: obtain an educated guess of a
    population parameter from a sample.
  • Applications
  • Parameter Estimation
  • Hypothesis Testing

4
Applications: Examples
  • Parameter Estimation
  • Suppose the failure times on a 300-hour
    accelerated life test are 10, 50, 110, 150, 200,
    220, 250 (the remaining 3 units had not failed by
    300 hours). Estimate the parameter λ if lifetimes
    follow an exponential distribution.
  • Hypothesis Testing
  • Suppose you have two bottle filling operations
    and wish to test if one machine yields lower
    variance.
  • Methodology: obtain a point estimate of the
    variance for each of machines 1 and 2. Then,
    based on the point estimates, test for a
    statistically significant difference.
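
For the life-test example above, a standard result for
exponential lifetimes with units still running at the end
of the test is that the maximum likelihood estimate of λ
equals the number of failures divided by the total time on
test. A minimal Python sketch, assuming the three unfailed
units were censored at 300 hours:

  # Observed failure times (hours) during the 300-hour test
  failures = [10, 50, 110, 150, 200, 220, 250]
  n_censored = 3                     # units that never failed
  test_length = 300                  # hours

  # Total time on test: failure times plus 300 hours of
  # running time for each censored unit
  total_time = sum(failures) + n_censored * test_length

  # Estimated failure rate (failures per hour)
  lambda_hat = len(failures) / total_time
  print(lambda_hat)                  # about 0.0037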

5
Parameter Estimation
  • Treat the parameter as a fixed, unknown constant;
    the estimator, being a function of the random
    sample, is itself an rv.
  • Let X = the variable under study
  • Let θ = the parameter to be estimated
  • Note: the point estimate of θ is denoted θ-hat
  • Example: predict the mean, μ, as μ-hat = X-bar

6
Hypothesis Testing
  • Given point estimator(s) from samples, we may
    wish to infer whether results are reproducible,
    or whether any statistical differences exist.
  • Example: suppose you measure two samples and
    obtain two different point estimates.
  • Common Question: Is it reasonable to conclude
    that no statistically significant difference
    exists?

7
Parameter Estimation Examples
  • Suppose you wish to estimate the population mean,
    μ. Some possible estimators include:
  • Mean, Median, Trimmed Mean
  • Recall the example from descriptive statistics:
    which of the following is the best estimator?
  • Mean = 965.0, Median = 1009.5, Trimmed
    Mean = 971.4

8
Parameter Estimates - Fit
  • In practice, we would prefer to have one
    estimator (θ-hat) that is always the best.
  • However, θ-hat is a function of the observed Xi's.
  • So, θ-hat = g(X1, X2, .., Xn) is itself a random
    variable, with its own bias and variance.
  • Thus, we may identify the best estimator as the
    one with:
  • Least bias (unbiased)
  • Minimum variance of estimation error (increasing
    the likelihood that the observed parameter
    estimate represents the true parameter)

9
Biased Vs. Unbiased Estimator
  • Bias = difference between the expected value of
    the statistic θ-hat and the parameter θ:
    Bias(θ-hat) = E(θ-hat) - θ
  • Unbiased Estimator: E(θ-hat) = θ
  • Example: X-bar is an unbiased estimator of μ
    (Bias = 0)
  • Suppose X1, X2, .., Xn are iid rvs with E(Xi) = μ
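
The one-line proof of unbiasedness uses linearity of
expectation; written out in LaTeX:

\[
E(\bar{X}) = E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
           = \frac{1}{n}\sum_{i=1}^{n} E(X_i)
           = \frac{1}{n}\, n\mu = \mu
\]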

10
Unbiased Estimator of Variance
  • Which of the following is an unbiased estimator
    of variance: Σ(Xi - X-bar)² divided by n, or
    divided by (n - 1)?
  • The (n - 1) divisor is unbiased (see page 259 for
    the proof)
  • Logic argument: will the Xi's be closer to X-bar
    or to μ?
  • Thus, will dividing by n tend to overestimate or
    underestimate the true variance?
  • What happens to the bias effect as n becomes
    large?
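
The logic argument can be made exact: X-bar is the value of
c that minimizes Σ(Xi - c)², so deviations about X-bar are
never larger in total than deviations about μ. In LaTeX:

\[
\sum_{i=1}^{n}(X_i - \bar{X})^2 \;\le\; \sum_{i=1}^{n}(X_i - \mu)^2,
\qquad
E\left[\frac{1}{n}\sum_{i=1}^{n}(X_i - \mu)^2\right] = \sigma^2
\]

Hence dividing by n tends to underestimate the true
variance; dividing by (n - 1) removes the bias exactly, and
the difference between the two divisors vanishes as n
becomes large.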

11
Is S an unbiased estimator of σ?
  • Does E(S) = σ?
  • Simulation Experiment:
  • Suppose you have a true variance σ² = 1
  • Simulate k replications of size n and compare
    expected values.
  • A sample result from k = 5000, n = 5:
  • The variance using the n divisor also has a
    negative bias (underestimates)
  • Note: S has a negative bias (underestimates σ);
    there are other reasons to use S, as we'll see
    later
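
A minimal Python sketch of the simulation experiment
described above (the seed and resulting averages are
illustrative, not the slide's original figures):

  import random
  import statistics

  random.seed(1)
  k, n = 5000, 5                      # replications, sample size
  sigma = 1.0                         # true standard deviation

  s_values, var_n_values = [], []
  for _ in range(k):
      x = [random.gauss(0, sigma) for _ in range(n)]
      xbar = sum(x) / n
      ss = sum((xi - xbar) ** 2 for xi in x)
      var_n_values.append(ss / n)              # n divisor
      s_values.append((ss / (n - 1)) ** 0.5)   # S

  print(statistics.mean(var_n_values))  # below 1.0 (negative bias)
  print(statistics.mean(s_values))      # also below 1.0: E(S) < sigma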

12
Minimum Variance Estimators
  • Several unbiased estimators may exist for a
    parameter.
  • Example: mean, median, trimmed mean
  • Minimum Variance Unbiased Estimator (MVUE):
  • Among all unbiased estimators, the MVUE is the
    one with minimum variance.

13
MVUEs
  • Given the following pdfs for θ-hat1 and θ-hat2,
    which is the MVUE of θ?

[Figure: sampling-distribution pdfs of θ-hat1 and θ-hat2,
both centered at the true value θ; the estimator with the
narrower pdf is the MVUE]
14
MVUE Example
  • Suppose Xi is N(μ, σ²)
  • Both X-bar and a single Xi are unbiased
    estimators of μ
  • Note: the variance of a single Xi is σ²
  • However, if n ≥ 2, then X-bar is the better
    estimator of μ because it has smaller variance:
    σ²(X-bar) = σ² / n

15
Which is the best estimator?
  • The best estimator often depends on the
    underlying distribution or data pattern.
  • Consider the advantages and disadvantages of
    X-bar, the median, and the trimmed mean:
  • If normal, X-bar is the minimum variance
    estimator.
  • If bell-shaped but with heavy tails (e.g., the
    Cauchy distribution), then the median is better
    because outliers are likely.
  • The trimmed mean (10%) is not the best for either
    case, but is robust to both → a robust estimator

16
Standard Error of a Point Estimate
  • The standard error of an estimator θ-hat is the
  • standard deviation of the estimator, σ(θ-hat)
  • Standard error of X-bar: σ(X-bar) = σ / √n,
    estimated by s / √n when σ is unknown

17
Point Estimate Intervals
  • If the estimator follows a normal distribution
    (very common), then we may be reasonably
    confident that the true parameter falls within
    ± 2 standard errors of the estimator.
  • 94-96% confident
  • Thus, standard errors may be used to form an
    interval estimate within which the true parameter
    value likely falls.

18
Example: Interval Estimate
  • Suppose you have 10 measurements of thermal
    conductivity.
  • 41.60, 41.48, 42.34, 41.95, 41.86
  • 42.18, 41.72, 42.26, 41.81, 42.04
  • X-bar = 41.924, S = 0.286
  • Calculate an interval of ± 2 standard errors for
    the true mean conductivity.
  • How precise is the standard error of the mean?
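
A minimal Python sketch of this calculation using only the
standard library:

  import statistics

  data = [41.60, 41.48, 42.34, 41.95, 41.86,
          42.18, 41.72, 42.26, 41.81, 42.04]

  xbar = statistics.mean(data)        # 41.924
  s = statistics.stdev(data)          # about 0.284 (slide: 0.286)
  se = s / len(data) ** 0.5           # standard error of X-bar

  lo, hi = xbar - 2 * se, xbar + 2 * se
  print(f"({lo:.2f}, {hi:.2f})")      # roughly (41.74, 42.10)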

19
II. Methods of Point Estimation
  • Goal: obtain the best estimator
  • Conditions:
  • Least bias (unbiased)
  • Minimum variance of estimation error
  • Recognize that we may need to trade off these
    conditions!
  • Applications:
  • Estimate coefficients (Y = b0 + b1X)
  • Estimate distribution parameters, e.g., μ, σ²
  • We now discuss three general techniques to
    obtain point estimators:
  • Moments, maximum likelihood, least squares

20
A. Method of Moments (MOM)
  • Find the first k moments of the pdf and equate
    them to the first k sample moments.
  • Solve the system of simultaneous equations.
  • The kth moment of the population distribution
    f(x) is E(X^k)
  • The kth sample moment is (1/n) Σ Xi^k

Let k = 1, 2, 3, ..
For example, if k = 1: E(X) = Σ Xi / n
21
Equations
  • To estimate 1 parameter, use 1 moment
  • To estimate m parameters, you need m moments
  • Suppose you have 2 parameters to estimate:
    f(x; θ1, θ2)
  • E(X^1) = (1/n) Σ Xi
  • E(X^2) = (1/n) Σ Xi^2

22
Example: Find a Point Estimator from Sample Data
  • Consider a random sample, X1 .. Xn. Suppose you
    wish to find an estimator of θ given the pdf
  • f(x; θ) = (θ + 1) x^θ, 0 < x < 1
  • Exercises:
  • Obtain an estimate of θ based on MOM
  • Hint: 1 parameter, so you only need 1 moment
    equation
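
A worked sketch of the MOM solution, under the assumption
(the pdf is garbled in the transcript) that
f(x; θ) = (θ + 1)x^θ on 0 < x < 1, as in Devore Ch. 6:

\[
E(X) = \int_0^1 x\,(\theta+1)\,x^{\theta}\,dx
     = \frac{\theta+1}{\theta+2}
\]

Equating this first population moment to the first sample
moment X-bar and solving for θ:

\[
\frac{\theta+1}{\theta+2} = \bar{X}
\quad\Rightarrow\quad
\hat{\theta} = \frac{2\bar{X}-1}{1-\bar{X}}
\]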

23
Distribution Applications - Parameter Estimates
from Data
  • Bernoulli Distribution (n samples of size 1)
  • Sample Mean: X-bar = Σ Xi / n
  • Population Mean: E(X) = p
  • Parameter Estimate: p-hat = X-bar

24
Moments - Multiple Estimates
  • Exponential Distribution
  • Sample Mean: X-bar
  • Population Mean: E(X) = 1/λ
  • Parameter Estimate: λ-hat = 1 / X-bar
  • Poisson Distribution
  • Sample Mean: X-bar
  • Population Mean: E(X) = λ
  • Parameter Estimate: λ-hat = X-bar

25
Aren't there other estimators?
  • Exponential
  • Sample Variance: S²
  • Population Variance: Var(X) = 1/λ²
  • Parameter Estimate: setting S² = 1/λ² gives
    λ-hat = 1/S
  • So, λ-hat = 1/X-bar OR λ-hat = 1/S
  • Which is preferred?
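
A quick Python sketch (illustrative, not from the slides)
showing that the two moment equations give different
answers on the same sample:

  import random
  import statistics

  random.seed(2)
  true_lambda = 0.5
  x = [random.expovariate(true_lambda) for _ in range(50)]

  # First moment: E(X) = 1/lambda  ->  lambda-hat = 1/X-bar
  lam_from_mean = 1 / statistics.mean(x)

  # Second moment: Var(X) = 1/lambda^2  ->  lambda-hat = 1/S
  lam_from_sd = 1 / statistics.stdev(x)

  print(lam_from_mean, lam_from_sd)   # similar, but not equal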
26
Estimating by Method of Moments (MoM)
  • Advantages:
  • Simple to generate
  • Often approximately unbiased
  • Asymptotically normal (tends to normal when n is
    large)
  • Disadvantages:
  • Inconsistent results (more than one estimator
    equation may apply, as above)
  • May not have desirable statistical properties or
    may produce flawed results.
  • See Example 6.13 in textbook

27
B. Method of Least Squares
  • Based on prediction error; attempts to minimize
    the sum of squared prediction errors.
  • Error: xi = μ + ei, or ei = xi - μ
  • Sum of Squared Errors: SSE = Σ (xi - μ)²
  • Estimate the parameter(s) by minimizing the sum
    of squared errors with respect to the parameter.
  • Note: this is the basis for regression, Y = mX + b
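
For the single-parameter case above, minimizing the SSE
recovers the sample mean; a short derivation not shown on
the slide:

\[
\frac{d}{d\mu}\sum_{i=1}^{n}(x_i-\mu)^2
= -2\sum_{i=1}^{n}(x_i-\mu) = 0
\quad\Rightarrow\quad
\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n}x_i = \bar{x}
\]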

28
C. Maximum Likelihood Estimation (MLE)
  • Developed by Sir R.A. Fisher (1920s)
  • The method preferred by statisticians,
    particularly when n is sufficiently large,
    because the MLE (maximum likelihood estimate)
    approximates the MVUE.
  • Maximum likelihood estimation chooses the
    parameter values that maximize the likelihood of
    the observed sample.

29
Maximum Likelihood Estimation - Single Parameter
  • Given a sample of size n: x1, x2, .., xn from
    f(x; θ)
  • Likelihood Function:
  • Continuous: L(θ) = f(x1; θ) · f(x2; θ) · .. · f(xn; θ)
  • Discrete: L(θ) = p(x1; θ) · p(x2; θ) · .. · p(xn; θ)
  • To obtain the MLE, maximize L or L* = ln(L),
    usually by setting
  • dL / d(parameter of interest) = 0 and solving
    the resulting equations.
  • Note: with more than 1 parameter, you will have a
    system of simultaneous equations from
    f(x1, x2, .., xn; θ1, .., θm)

30
Example: Exponential distribution
  • f(x) = λ e^(-λx), with x > 0 and λ > 0
  • Find the MLE for λ
  • Note: the MLE is the same as the MOM estimate,
    though it is not an unbiased estimator (why? see
    Devore, p. 273, top)
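
A sketch of the derivation the slide asks for:

\[
L(\lambda) = \prod_{i=1}^{n}\lambda e^{-\lambda x_i}
           = \lambda^{n} e^{-\lambda\sum x_i},
\qquad
\ln L = n\ln\lambda - \lambda\sum_{i=1}^{n}x_i
\]

\[
\frac{d\ln L}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n}x_i = 0
\quad\Rightarrow\quad
\hat{\lambda} = \frac{n}{\sum x_i} = \frac{1}{\bar{x}}
\]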

31
Invariance Principle
  • Given that θ-hat1, θ-hat2, .., θ-hatn are the
    MLEs of the parameters θ1, θ2, .., θn:
  • Then the MLE of any function h(θ1, θ2, .., θn) of
    these parameters is the same function,
    h(θ-hat1, θ-hat2, .., θ-hatn), replacing the
    thetas with their estimators.
  • Example: the MLE of σ is the square root of the
    MLE of σ².

32
MLE Vs. MoM
  • MLEs are usually preferred to MoM since they are:
  • Consistent
  • Asymptotically normal
  • Asymptotically efficient
  • Invariant
  • But they may not be unbiased.
  • Disadvantage of MLE: may be complicated to solve.
  • Using derivative calculus to maximize L(·) may
    not yield a usable answer (e.g., when the maximum
    occurs at a boundary of the parameter space).

33
Mean Squared Error (MSE)
  • Sometimes we choose to use a biased estimator.
  • MSE is the expected squared difference between
    the estimator and the true parameter:
    MSE(θ-hat) = E[(θ-hat - θ)²]
               = Var(θ-hat) + [Bias(θ-hat)]²
  • For an unbiased estimator: MSE(θ-hat) = Var(θ-hat)
  • If multiple estimators exist, it may be
    preferable to accept a small amount of bias in
    order to reduce the variance of the estimator.
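
The decomposition quoted above follows by adding and
subtracting E(θ-hat) inside the square; the cross term
vanishes because E[θ-hat - E(θ-hat)] = 0:

\[
E\big[(\hat{\theta}-\theta)^2\big]
= E\big[(\hat{\theta}-E\hat{\theta})^2\big]
+ \big(E\hat{\theta}-\theta\big)^2
= \mathrm{Var}(\hat{\theta}) + \big[\mathrm{Bias}(\hat{\theta})\big]^2
\]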