Some standard univariate probability distributions

Transcript and Presenter's Notes
1
Some standard univariate probability distributions
  • Characteristic function, moment generating
    function, cumulant generating functions
  • Discrete distribution
  • Continuous distributions
  • Some distributions associated with normal
  • References

2
Characteristic function, moment generating
function, cumulant generating functions
  • The characteristic function is defined as the expectation of e^{itX}
  • The moment generating function is defined as the expectation of e^{tX}
  • Moments can be calculated in the following way: obtain the derivative
    of M(t) and take its value at t = 0
  • The cumulant generating function is defined as the logarithm of the
    characteristic function
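The slide formulas themselves are not reproduced in this transcript; written out, the standard definitions are:

    \varphi_X(t) = E[e^{itX}], \qquad M_X(t) = E[e^{tX}],
    E[X^r] = M_X^{(r)}(0), \qquad K_X(t) = \log \varphi_X(t)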

3
Discrete distributions: Binomial
  • Let us assume that we carry out an experiment whose result can be
    success or failure. The probability of success in one experiment is p,
    so the probability of failure is q = 1 - p. We carry out the experiment
    n times. The distribution of the number of successes k is binomial
  • Characteristic function (see below)
  • Moment generating function (see below)
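Since the slide formulas were images, here are the standard binomial expressions:

    P(X = k) = \binom{n}{k} p^k q^{n-k}, \quad k = 0, 1, \dots, n
    \varphi(t) = (q + p e^{it})^n, \qquad M(t) = (q + p e^{t})^n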

4
Example and mean values
  • As the number of trials increases, the distribution becomes more
    symmetric and dense.
  • Calculate the probability of 2 or 3 successes if the probability of
    success is p = 0.2 and the number of trials is n = 3. Compare it with
    the case when p = 0.5 and n = 3 (see the sketch below).
  • The mean value is np. The variance is npq = np(1-p).
  • If the number of trials is 10 and p = 0.2, then the average number of
    successes is 2.
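A minimal sketch in R of the exercise above (variable names are illustrative):

    p <- 0.2; n <- 3
    sum(dbinom(2:3, size = n, prob = p))    # P(2 or 3 successes) with p = 0.2, about 0.104
    sum(dbinom(2:3, size = n, prob = 0.5))  # same event with p = 0.5, exactly 0.5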

(Plots of the binomial distribution for p = 0.2, n = 10; p = 0.2, n = 100; p = 0.5, n = 10; p = 0.5, n = 100.)
5
Discrete distributions: Poisson
  • When the number of trials (n) is large, the probability of success (p)
    is small, and np is finite and tends to λ as n goes to infinity, then
    the binomial distribution converges to the Poisson distribution
  • The Poisson distribution is used to describe the distribution of events
    that occur rarely (rare events) in a short time period. It is used in
    counting statistics to describe the number of registered photons.
  • Characteristic function (see below)
  • What is the moment generating function?
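The standard probability mass function and characteristic function (the slide images are not reproduced here) are:

    P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \dots
    \varphi(t) = \exp\{\lambda(e^{it} - 1)\}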

6
Example
λ = 1, λ = 5 and λ = 10: as λ increases, the distribution becomes more and
more symmetric. The expected value is λ and the variance is λ, so the
variance and the mean are equal to each other.
Exercise: Assume that the distribution of the number of accidents is
Poisson. If the average number of accidents in one day is 3, what is the
probability of three accidents happening in one day? What is the
probability of at least three accidents in one day?
(Plots of the Poisson distribution for λ = 1, λ = 5 and λ = 10.)
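A minimal sketch in R for this exercise:

    lambda <- 3
    dpois(3, lambda)      # P(exactly 3 accidents in one day), about 0.224
    1 - ppois(2, lambda)  # P(at least 3 accidents in one day), about 0.577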
7
Discrete distributions: Negative Binomial
  • Consider an experiment where the probability of success is p and the
    probability of failure is q = 1 - p. We carry out the experiment until
    the k-th success. We want to find the probability of j failures before
    having the k-th success. (This is called sequential sampling: sampling
    is carried out until a stopping rule - k successes - is satisfied.) If
    we have j failures, then the number of trials is k + j and the last
    trial was a success. The probability that we will have j failures is
    given below.
  • It is called negative binomial because the coefficients have the same
    form as those of the terms of the negative binomial series
    p^{-k} = (1 - q)^{-k}
  • Characteristic function (see below)
  • What is the moment generating function?
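The standard expressions (the slide images are not reproduced here) are:

    P(J = j) = \binom{k + j - 1}{j} p^k q^j, \quad j = 0, 1, 2, \dots
    \varphi(t) = p^k (1 - q e^{it})^{-k}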

8
Example, mean and variance
As the number of required successes increases, the distribution becomes
more and more symmetric. The mean value is kq/p and the variance is kq/p².
Exercise: Suppose we have an unfair coin, and the probability of throwing
a head is 0.2. We throw the coin until we have 2 heads. What is the
probability that we will achieve this in 4 trials? What is the average
number of trials before we reach 2 heads?
(Plots of the negative binomial distribution for k = 50, p = 0.2 with the
x axis between 0 and 500; k = 10, p = 0.2; k = 10, p = 0.5; k = 50, p = 0.5.)
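A minimal sketch in R for the coin exercise (dnbinom/pnbinom count failures before the k-th success):

    p <- 0.2; k <- 2
    pnbinom(2, size = k, prob = p)  # P(at most 2 tails before the 2nd head) = P(2 heads within 4 throws), about 0.181
    k * (1 - p) / p                 # mean number of tails = 8, so about 10 throws on average in total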
9
Continuous distributions: uniform
  • The simplest continuous distribution is the uniform distribution, with
    the density given below
  • Cumulative distribution function (see below)
  • Moments and other properties are calculated easily.
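The standard uniform(a, b) expressions are:

    f(x) = \frac{1}{b - a}, \quad a \le x \le b, \qquad F(x) = \frac{x - a}{b - a}
    E[X] = \frac{a + b}{2}, \qquad \mathrm{Var}[X] = \frac{(b - a)^2}{12}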

10
Continuous distributions: exponential
  • The density of a random variable with an exponential distribution has
    the form given below
  • One of the origins of this distribution is Poisson-type random
    processes: if the number of events occurring during the time interval
    (0, t] has a Poisson distribution with mean value λt, then the time
    elapsing until the first event occurs has an exponential distribution.
    Let T_r denote the time elapsed until the r-th event
  • Putting r = 1 we get P(T_1 > t) = e^{-λt}. Taking into account that
    P(T_1 > t) = 1 - F_1(t) and taking the derivative with respect to t, we
    arrive at the exponential distribution
  • Characteristic function (see below)
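Written out, the standard forms of these quantities are:

    f(x) = \lambda e^{-\lambda x}, \quad x \ge 0
    P(T_r > t) = \sum_{j=0}^{r-1} \frac{(\lambda t)^j e^{-\lambda t}}{j!}, \qquad F_1(t) = 1 - e^{-\lambda t}
    \varphi(t) = \frac{\lambda}{\lambda - it}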

11
Example, mean and variance
As λ becomes larger, the fall of the density becomes sharper. The mean
value is 1/λ and the variance is 1/λ².
Exercise: If the average waiting time is 1 minute, what is the probability
that the first event will happen within 1 minute? What is the probability
that the first event will happen after 2 minutes?
(Plot of the exponential density for λ = 1.)
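A minimal sketch in R: a mean waiting time of 1 minute corresponds to rate λ = 1.

    lambda <- 1
    pexp(1, rate = lambda)      # P(first event within 1 minute) = 1 - e^{-1}, about 0.632
    1 - pexp(2, rate = lambda)  # P(first event after 2 minutes) = e^{-2}, about 0.135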
12
Continuous distributions: Gamma
  • The Gamma distribution can be considered a generalisation of the
    exponential distribution. It has the form given below
  • It is the distribution of the time t elapsing before exactly r events
    happen
  • Characteristic function of this distribution (see below)
  • If there are r independent, identically exponentially distributed
    random variables, then the distribution of their sum is Gamma.
  • Sometimes the gamma distribution is written with 1/λ instead of λ. The
    implementation in R uses this form: r is called the shape and 1/λ the
    scale parameter (see the sketch below).
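The standard density and characteristic function are:

    f(x) = \frac{\lambda^r x^{r-1} e^{-\lambda x}}{\Gamma(r)}, \quad x \ge 0, \qquad \varphi(t) = \left(1 - \frac{it}{\lambda}\right)^{-r}

A small R check of the two equivalent parameterisations (values chosen arbitrarily):

    r <- 3; lambda <- 2
    dgamma(1.5, shape = r, rate = lambda)       # rate form (rate = lambda)
    dgamma(1.5, shape = r, scale = 1 / lambda)  # equivalent scale form (scale = 1/lambda)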

13
Gamma distribution
As the shape parameter increases, the centre of the distribution shifts to
the right and the distribution becomes more symmetric. The mean value is
r/λ and the variance is r/λ².
14
Continuous distributions: Normal
  • Perhaps the most popular and widely used continuous distribution is the
    normal distribution. The main reason for this is that an observed
    random variable is usually the sum of many random variables. According
    to the central limit theorem, under some conditions (for example, the
    random variables are independent and their first, second and third
    moments exist and are finite), the distribution of the sum of these
    random variables converges to the normal distribution
  • The density of the normal distribution has the form given below
  • There are many tables for the normal distribution.
  • Its characteristic function is given below
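The standard density and characteristic function of the N(μ, σ²) distribution are:

    f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left\{-\frac{(x - \mu)^2}{2\sigma^2}\right\}, \qquad \varphi(t) = \exp\left\{i\mu t - \frac{\sigma^2 t^2}{2}\right\}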

15
Central limit theorem
  • Let us assume that we have n independent random variables Xi,
    i = 1,...,n. If the first, second and third moments (this condition can
    be relaxed) are finite, then the sum of these random variables will,
    for sufficiently large n, be approximately normally distributed.
  • Because of this theorem, in many cases the assumption that observations
    or errors are normally distributed is sufficiently good, and tests
    based on this assumption give satisfactory results.
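A minimal simulation sketch in R illustrating the theorem (the uniform distribution and the sample sizes here are arbitrary choices):

    set.seed(1)
    sums <- replicate(10000, sum(runif(30)))   # 10000 sums of 30 independent uniforms
    hist(sums, breaks = 50, freq = FALSE)      # histogram of the sums
    curve(dnorm(x, mean = 30 * 0.5, sd = sqrt(30 / 12)), add = TRUE)  # matching normal curve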

16
Exponential family
  • The exponential family of distributions has the form given below
  • Many distributions are special cases of this family.
  • The natural exponential family of distributions is the subclass of this
    family in which A(θ) is the natural parameter.
  • If we use the fact that the distribution should be normalised, then the
    characteristic function of the natural exponential family with natural
    parameter A(θ) = θ can be derived (see below).
  • Try to derive it. Hint: use the normalisation factor. Find D and then
    use the expression of the characteristic function and D.
  • This family is used for fitting generalised linear models.
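The exact form on the original slide is not reproduced; one common parametrisation consistent with the hint (A, C, D as in the slide's notation, B assumed) is:

    f(x; \theta) = \exp\{ A(\theta) B(x) + C(x) + D(\theta) \}

For the natural family, B(x) = x and A(θ) = θ, and normalisation determines D:

    f(x; \theta) = \exp\{ \theta x + C(x) + D(\theta) \}, \qquad e^{-D(\theta)} = \int e^{\theta x + C(x)} \, dx
    \varphi(t) = E[e^{itX}] = \exp\{ D(\theta) - D(\theta + it) \}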

17
Exponential family: Examples
  • Many well known distributions belong to this
    family (All distributions mentioned in this
    lecture are from the exponential family).
  • Binomial
  • Poisson
  • Gamma
  • Normal
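For instance, the Poisson probability mass function can be rearranged into the natural exponential-family form sketched above:

    P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} = \exp\{ k \log\lambda - \lambda - \log k! \},
    so \theta = \log\lambda, \quad C(k) = -\log k!, \quad D(\theta) = -e^{\theta}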

18
Continuous distributions: χ²
  • Random variables with a normal distribution are called standardised if
    their mean is 0 and variance is 1.
  • The sum of squares of n standardised, independent normal random
    variables is χ² with n degrees of freedom.
  • Density function (see below)
  • If there are p linear constraints on the random variables, then the
    number of degrees of freedom becomes n - p.
  • Characteristic function for this distribution (see below)
  • χ² is used widely in statistics, for example in tests of the goodness
    of fit of a model to experiment.
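The standard density and characteristic function (a Gamma distribution with shape n/2 and rate 1/2) are:

    f(x) = \frac{x^{n/2 - 1} e^{-x/2}}{2^{n/2}\,\Gamma(n/2)}, \quad x \ge 0, \qquad \varphi(t) = (1 - 2it)^{-n/2}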

19
Continuous distributions: t and F distributions
  • Two more distributions are closely related to the normal distribution.
    We will describe them when we discuss samples and sampling
    distributions. One of them is Student's t-distribution. It is used to
    test whether the mean value of a sample is significantly different from
    a given value. Another, similar application is testing for a difference
    between the means of two different samples.
  • Fisher's F-distribution is the distribution of the ratio of the
    variances of two different samples. It is used to test whether their
    variances are different. One of its important applications is in ANOVA.

20
References
  • Johnson, N.L. and Kotz, S. (1969, 1970, 1972) Distributions in
    Statistics: I. Discrete Distributions; II, III. Continuous Univariate
    Distributions; IV. Continuous Multivariate Distributions. Houghton
    Mifflin, New York.
  • Mardia, K.V. and Jupp, P.E. (2000) Directional Statistics. John Wiley
    & Sons.
  • Jaynes, E.T. (2003) Probability Theory: The Logic of Science.