Sampling - PowerPoint PPT Presentation

Provided by: robert904
Transcript and Presenter's Notes

Title: Sampling


1
Sampling
  • Random Sample: A sample of size n from a
    population with N elements is called a random
    sample if every sample of size n has an equal
    chance of being selected.
  • Sampling with replacement: A sample in which
    members are selected sequentially, with each
    element replaced prior to the next selection.
  • Sampling without replacement: A sample
    consisting of n distinct elements chosen from the
    population.
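The two schemes can be sketched with Python's standard library (the population, seed, and sample size here are illustrative, not from the slides):

```python
import random

random.seed(0)  # fixed seed so the draws are reproducible
population = list(range(1, 11))  # a population of N = 10 elements

# Sampling with replacement: each element is returned to the
# population before the next draw, so repeats are possible.
with_replacement = random.choices(population, k=5)

# Sampling without replacement: the 5 drawn elements are distinct.
without_replacement = random.sample(population, k=5)

print(with_replacement)
print(without_replacement)
```

`random.choices` draws independently each time, while `random.sample` removes each chosen element from consideration, matching the two definitions above.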

2
Chapter 3
  • Discrete Random Variables and Their Distribution

3
Definitions/Notation
  • Discrete Random Variable: A random variable, Y,
    is discrete if the range of Y is finite or
    countably infinite.
  • Notation: Let Y be a random variable defined on
    a sample space S. The set (Y = y) will denote
    {s ∈ S : Y(s) = y}.
  • Probability Function (Probability Mass
    Function): Let Y be a discrete random variable.
    The function defined by p(y) = P(Y = y), for any
    real number y, is called the probability
    function or probability mass function for Y.

4
Properties of the Probability Function
  • For any discrete random variable Y with
    probability function p(y):
  • 0 ≤ p(y) ≤ 1
  • Σ p(y) = 1, where the summation is over all
    elements in the range of Y.
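A quick check of both properties for a concrete pmf (the fair six-sided die here is an example, not from the slides):

```python
# Probability function of a fair six-sided die: p(y) = 1/6 for y = 1..6.
p = {y: 1 / 6 for y in range(1, 7)}

# Property 1: 0 <= p(y) <= 1 for every y in the range of Y.
all_in_unit_interval = all(0 <= prob <= 1 for prob in p.values())

# Property 2: the probabilities sum to 1 over the range of Y.
total = sum(p.values())

print(all_in_unit_interval, total)
```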

5
Probability function for g(Y)
  • Let Y be a discrete random variable with
    probability function pY(y) and let g be a real
    valued function of a real variable. Suppose the
    random variable X is defined by X = g(Y). Then,
    for each x in the range of the random variable X,
    the probability function of X is given by
    pX(x) = Σ pY(yi), where the sum is taken over
    all yi such that g(yi) = x.
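A minimal sketch of this construction, with a made-up pmf for Y and g(y) = y²; note that both y = −1 and y = 1 map to x = 1, so their probabilities are summed:

```python
# Y takes values -1, 0, 1; X = g(Y) = Y**2.
pY = {-1: 0.2, 0: 0.5, 1: 0.3}

def g(y):
    return y * y

# pX(x) = sum of pY(y) over all y with g(y) == x.
pX = {}
for y, prob in pY.items():
    pX[g(y)] = pX.get(g(y), 0.0) + prob

print(pX)
```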

6
Expected Values of Functions of Random Variables
  • If Y is a discrete random variable with
    probability function pY(y) and g is a real
    valued function of a real variable, then
    E(X) = E(g(Y)) = Σ g(y) pY(y), where the sum is
    taken over all y in the range of Y (provided the
    series converges absolutely). If the series does
    not converge absolutely, we say that the expected
    value of X does not exist.
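The formula says we can compute E(g(Y)) directly from the pmf of Y, without first deriving the pmf of X. A sketch with an illustrative pmf:

```python
# E(g(Y)) = sum over y of g(y) * pY(y); here g(y) = y**2.
pY = {0: 0.25, 1: 0.5, 2: 0.25}

expected_Y = sum(y * p for y, p in pY.items())        # E(Y)
expected_Y_sq = sum(y**2 * p for y, p in pY.items())  # E(Y**2)

print(expected_Y, expected_Y_sq)
```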

7
Mean, Standard Deviation and Variance
  • Mean of Y: Let Y be a random variable and
    suppose that E(Y) exists. Then the mean of
    Y, μ, is defined by μ = E(Y).
  • Variance of Y: Let Y be a random variable. If μ
    exists and if E[(Y − μ)²] exists, then the
    variance of Y, V(Y) or σ², is defined by
    σ² = E[(Y − μ)²].
  • Standard Deviation of Y: The standard deviation
    of Y, σ, is defined by σ = √σ².
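The three definitions applied in sequence to an illustrative pmf:

```python
import math

pY = {0: 0.25, 1: 0.5, 2: 0.25}

mu = sum(y * p for y, p in pY.items())               # mean: mu = E(Y)
var = sum((y - mu) ** 2 * p for y, p in pY.items())  # sigma^2 = E[(Y - mu)^2]
sigma = math.sqrt(var)                               # standard deviation

print(mu, var, sigma)
```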

8
Moment Generating Function
  • Definition: The moment generating function (MGF)
    for a random variable Y is defined by
    m(t) = E(e^(tY)), provided that this function
    exists in an interval −b < t < b for some
    positive number b.
  • Theorem: Let Y be a random variable and suppose
    that m(t) exists for |t| < b (b > 0). Then E(Y^m)
    exists for any positive integer m, and
    E(Y^m) = d^m m(t)/dt^m evaluated at t = 0.
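The theorem can be checked numerically with finite-difference derivatives. Here a Bernoulli(p) variable (an example chosen for this sketch) has MGF m(t) = (1 − p) + p·e^t, so m′(0) should equal E(Y) = p and m″(0) should equal E(Y²) = p:

```python
import math

p = 0.3  # illustrative success probability

def m(t):
    # MGF of a Bernoulli(p) variable: E(e^{tY}) = (1-p)*e^0 + p*e^t
    return (1 - p) + p * math.exp(t)

h = 1e-5
# Central finite differences approximating m'(0) and m''(0).
first_deriv_at_0 = (m(h) - m(-h)) / (2 * h)
second_deriv_at_0 = (m(h) - 2 * m(0) + m(-h)) / h**2

print(first_deriv_at_0, second_deriv_at_0)
```

Both numerical derivatives should land close to p = 0.3, matching E(Y) and E(Y²) for this distribution.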

9
Theorems
  • Let Y be a discrete random variable with
    probability function p(y).
  • Let c be a constant; then E(c) = c.
  • Let c be a constant and let g be a real valued
    function of a real variable; then
    E(c·g(Y)) = c·E(g(Y)).
  • Let c1 and c2 be constants and let g1 and g2 be
    real valued functions; then E(c1·g1(Y) + c2·g2(Y))
    = c1·E(g1(Y)) + c2·E(g2(Y)).
  • If V(Y) exists, then V(Y) = E(Y²) − μ².
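The linearity theorem and the variance shortcut, checked against direct summation for an illustrative pmf and made-up constants:

```python
pY = {1: 0.2, 2: 0.3, 3: 0.5}
c1, c2 = 2.0, -1.0

def E(g):
    # E(g(Y)) computed directly from the pmf.
    return sum(g(y) * p for y, p in pY.items())

# Linearity: E(c1*g1(Y) + c2*g2(Y)) = c1*E(g1(Y)) + c2*E(g2(Y)),
# here with g1(y) = y and g2(y) = y**2.
lhs = E(lambda y: c1 * y + c2 * y * y)
rhs = c1 * E(lambda y: y) + c2 * E(lambda y: y * y)

# Variance shortcut: V(Y) = E(Y^2) - mu^2.
mu = E(lambda y: y)
variance = E(lambda y: y * y) - mu ** 2

print(lhs, rhs, variance)
```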

10
The Binomial Experiment
  • Definition: A binomial experiment possesses the
    following properties:
  • The experiment consists of n identical trials.
  • Each trial results in one of two outcomes:
    success (S) or failure (F).
  • For each trial, P(success) = p and
    P(failure) = q = 1 − p, where 0 < p < 1.
  • The n trials are independent.
  • The random variable Y of interest counts the
    number of successes in the n trials.

11
Binomial Random Variable
  • The random variable Y which counts the number of
    successes in a binomial experiment is said to
    have a Binomial Distribution, and the probability
    function for Y is given by
    p(y) = C(n, y) p^y (1 − p)^(n−y)
    for y = 0, 1, 2, …, n (where C(n, y) is the
    number of combinations of n things taken y at a
    time).
  • Furthermore, any random variable with the above
    probability function is said to have a Binomial
    distribution.
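The binomial pmf translates directly into code via `math.comb` (the values n = 10, p = 0.4 are illustrative):

```python
from math import comb

def binom_pmf(y, n, p):
    # p(y) = C(n, y) * p**y * (1 - p)**(n - y)
    return comb(n, y) * p**y * (1 - p) ** (n - y)

n, p = 10, 0.4
pmf = [binom_pmf(y, n, p) for y in range(n + 1)]
total = sum(pmf)  # the probabilities over y = 0..n sum to 1

print(total)
```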

12
MGF, Mean and Variance for a Binomial Random
Variable
  • Theorem: Let Y be a binomial random variable
    based on n trials with success probability p.
    Then:
  • m(t) = (p·e^t + (1 − p))^n for all t
  • E(Y) = np
  • E(Y²) = np + n(n−1)p²
  • V(Y) = np(1 − p)
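The closed-form MGF can be checked against the definition m(t) = E(e^(tY)) computed directly from the pmf (n, p, and t below are illustrative):

```python
from math import comb, exp

n, p = 10, 0.4

def mgf_direct(t):
    # E(e^{tY}) summed term by term over the binomial pmf.
    return sum(comb(n, y) * p**y * (1 - p) ** (n - y) * exp(t * y)
               for y in range(n + 1))

def mgf_closed(t):
    # Closed form from the theorem: (p*e^t + (1 - p))**n.
    return (p * exp(t) + (1 - p)) ** n

t = 0.5
print(mgf_direct(t), mgf_closed(t))
```

The two agree because of the binomial theorem applied to (p·e^t + q)^n.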

13
The Geometric Distribution
  • Consider a sequence of binomial experiments with
    P(success) = p (0 < p < 1) and P(failure) =
    q = 1 − p. Let Y be the number of trials needed
    to get the first success. Then Y is said to have
    a geometric probability distribution.
  • The pmf of Y is given by p(y) = q^(y−1) p for
    y = 1, 2, 3, …
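Since the geometric range is infinite, a long partial sum of the pmf should approach 1, and the partial sum of y·p(y) should approach E(Y) = 1/p (stated on the next slide). A sketch with an illustrative p:

```python
p = 0.25
q = 1 - p

def geom_pmf(y):
    # p(y) = q**(y - 1) * p for y = 1, 2, 3, ...
    return q ** (y - 1) * p

# Truncate the infinite series at y = 199; the remaining tail is tiny.
partial = sum(geom_pmf(y) for y in range(1, 200))
mean = sum(y * geom_pmf(y) for y in range(1, 200))  # approaches 1/p = 4

print(partial, mean)
```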

14
MGF, Mean and Variance for a Geometric Random
Variable
  • Theorem: Let Y be a geometric random variable
    with P(success) = p. Then:
  • m(t) = p·e^t / (1 − q·e^t) for all t < −ln(q)
  • E(Y) = 1/p
  • E(Y²) = (2 − p)/p²
  • V(Y) = (1 − p)/p²

15
Hypergeometric Distribution
  • Consider an experiment consisting of selecting a
    sample of size n from a population of N objects,
    of which r are of type I and N − r are of type
    II. Let Y denote the number of objects selected
    of type I. Then Y is said to have a
    hypergeometric probability distribution.
  • The pmf of Y is given by
    p(y) = C(r, y) C(N − r, n − y) / C(N, n),
    where y is an integer 0, 1, 2, …, n, subject to
    the conditions y ≤ r and n − y ≤ N − r.
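The counting argument behind the pmf (choose y of the r type-I objects, n − y of the N − r type-II objects, out of all C(N, n) samples) in code, with illustrative values of N, r, and n:

```python
from math import comb

def hypergeom_pmf(y, N, r, n):
    # p(y) = C(r, y) * C(N - r, n - y) / C(N, n)
    return comb(r, y) * comb(N - r, n - y) / comb(N, n)

N, r, n = 20, 8, 5  # 20 objects, 8 of type I, sample of size 5
pmf = [hypergeom_pmf(y, N, r, n) for y in range(n + 1)]

total = sum(pmf)                               # should be 1
mean = sum(y * p for y, p in enumerate(pmf))   # should equal n*r/N = 2.0

print(total, mean)
```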

16
Mean and Variance for a Hypergeometric Random
Variable
  • Theorem: Let Y be a hypergeometric random
    variable. Then:
  • E(Y) = nr/N
  • V(Y) = n · (r/N) · ((N − r)/N) · ((N − n)/(N − 1))

17
Poisson Process
  • A Poisson process consists of an experiment where
    a certain event occurs randomly with the
    following properties:
  • The probability of an occurrence in a short
    interval [t, t + Δt] is approximately
    proportional to the length of the interval, Δt.
  • The number of occurrences in the interval
    [t, t + Δt] does not depend on the starting
    time t.
  • The numbers of occurrences in non-overlapping
    intervals are independent.
  • The probability of two or more occurrences in a
    short interval [t, t + Δt] is negligible as
    Δt → 0.

18
Poisson Distribution
  • Let Y(t) denote the number of events occurring in
    a Poisson process in the time interval [0, t],
    and let λ be the average number of occurrences in
    an interval of unit time. Then Y(t) has a Poisson
    distribution with pmf
    p(y) = e^(−λt) (λt)^y / y!, y = 0, 1, 2, …; λ > 0.
  • If λ is the average number of occurrences in a
    fixed interval, then p(y) = e^(−λ) λ^y / y!,
    y = 0, 1, 2, …; λ > 0.
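As with the geometric case, a long partial sum of the Poisson pmf should approach 1, and the partial sum of y·p(y) should approach λ (the mean stated on the next slide). The rate λ = 3 below is illustrative:

```python
from math import exp, factorial

lam = 3.0  # illustrative average number of occurrences per interval

def poisson_pmf(y):
    # p(y) = e**(-lam) * lam**y / y! for y = 0, 1, 2, ...
    return exp(-lam) * lam ** y / factorial(y)

# Truncate the infinite series at y = 59; the remaining tail is tiny.
partial = sum(poisson_pmf(y) for y in range(60))
mean = sum(y * poisson_pmf(y) for y in range(60))

print(partial, mean)
```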

19
MGF, Mean and Variance for a Poisson Random
Variable
  • Let Y be a random variable with Poisson
    distribution p(y) = e^(−λ) λ^y / y! for
    y = 0, 1, 2, …; λ > 0. Then:
  • m(t) = exp(λ(e^t − 1)) for all t
  • E(Y) = λ
  • E(Y²) = λ + λ²
  • V(Y) = λ

20
Moments of a Random Variable
  • Definition: The kth moment of a random variable
    Y taken about the origin is defined to be E(Y^k)
    and denoted by μ′_k, provided it exists.
  • Definition: The kth moment of a random variable
    Y taken about the mean, or the kth central moment
    of Y, is defined to be E((Y − μ)^k) and denoted
    by μ_k, provided it exists.

21
Moment Generating Function
  • Recall: The moment generating function (MGF) for
    a random variable Y is m(t) = E(e^(tY)), provided
    it exists for |t| < b where b > 0.
  • Theorem: If m(t) exists for Y, then E(Y^m) exists
    and E(Y^m) = d^m m(t)/dt^m evaluated at t = 0,
    for m = 1, 2, 3, ...
  • Theorem (Uniqueness): If the MGF exists for
    some probability distribution, then it is unique;
    i.e., if the MGFs for Y and Z exist and are equal
    (for all |t| < b, b positive), then Y and Z have
    the same distribution.

22
Tchebysheff's Theorem (Chebyshev's Inequality)
  • Theorem: Let Y be any random variable with mean
    μ and variance σ². Then, for any positive
    constant k, P(|Y − μ| < kσ) ≥ 1 − (1/k²), or
    equivalently P(|Y − μ| ≥ kσ) ≤ 1/k².
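The inequality holds for any distribution; a sketch verifying it for a Binomial(10, 0.4) variable with k = 2 (both choices illustrative):

```python
from math import comb, sqrt

# Chebyshev: P(|Y - mu| >= k*sigma) <= 1/k**2 for any k > 0.
n, p = 10, 0.4
pmf = {y: comb(n, y) * p**y * (1 - p) ** (n - y) for y in range(n + 1)}

mu = n * p                    # binomial mean
sigma = sqrt(n * p * (1 - p)) # binomial standard deviation

k = 2.0
tail_prob = sum(q for y, q in pmf.items() if abs(y - mu) >= k * sigma)
bound = 1 / k**2

print(tail_prob, bound)
```

For this distribution the actual tail probability is far below the 1/k² = 0.25 bound, which is typical: Chebyshev is a worst-case guarantee.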

23
Homework
  • p. 129: #98, 99, 100, 104, 106, 109, 119, 120,
    121, 123