Appendix II - PowerPoint PPT Presentation

1
Appendix II Probability Theory Refresher
  • Leonard Kleinrock, Queueing Systems, Vol I
    Theory
  • Nelson Fonseca,
  • State University of Campinas, Brazil

2
Appendix II Probability Theory Refresher
3
  • Random events exhibit statistical regularity
  • Example: If one were to toss a fair coin four
    times, one expects on the average two heads and
    two tails. There is one chance in sixteen that no
    heads will occur. If we tossed the coin a million
    times, the odds are better than 10 to 1 that at
    least 490,000 heads will occur.
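These claims can be checked numerically. The sketch below uses only the Python standard library; the normal approximation for the million-toss tail is an assumption of this sketch, standing in for the exact binomial sum:

```python
from math import comb, erf, sqrt

# Probability of exactly k heads in n tosses of a fair coin (binomial).
def prob_heads(k: int, n: int) -> float:
    return comb(n, k) * 0.5**n

# Four tosses: expected heads = 2; chance of zero heads = 1/16.
print(prob_heads(0, 4))                                   # 0.0625 = 1/16
print(sum(k * prob_heads(k, 4) for k in range(5)))        # 2.0

# A million tosses: P[at least 490,000 heads] via the normal
# approximation (mean n/2, std sqrt(n)/2) -- approximate, not exact.
n = 1_000_000
mu, sigma = n / 2, sqrt(n) / 2
z = (490_000 - mu) / sigma
p_at_least = 0.5 * (1 - erf(z / sqrt(2)))
print(p_at_least > 10 / 11)   # better than 10-to-1 odds, as claimed
```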

4
II.1 Rules of the game
  • Real-world experiments involve:
  • A set of possible experimental outcomes
  • A grouping of these outcomes into classes called
    results
  • The relative frequency of these classes in many
    independent trials of the experiment
  • Frequency: the number of times the experimental
    outcome falls into a class, divided by the number
    of times the experiment is performed

5
  • Mathematical model: three quantities of interest
    that are in one-to-one correspondence with the
    three quantities of the experimental world
  • A sample space is a collection of objects that
    corresponds to the set of mutually exclusive,
    exhaustive outcomes of the model of an
    experiment. Each object ω in the set S is
    referred to as a sample point
  • A family of events, denoted A, B, C, ..., in
    which each event is a set of sample points ω

6
  • A probability measure P, which is an assignment
    (mapping) of the events defined on S into the set
    of real numbers. The notation is P[A], and the
    mapping has these properties:
  • For any event A, 0 ≤ P[A] ≤ 1 (II.1)
  • P[S] = 1 (II.2)
  • If A and B are mutually exclusive events, then
    P[A ∪ B] = P[A] + P[B] (II.3)

7
  • Notation
  • Exhaustive set of events: a set of events whose
    union forms the sample space S
  • A set of mutually exclusive, exhaustive events
    A_1, A_2, ..., which have the properties
    A_i ∩ A_j = ∅ for i ≠ j and ∪_i A_i = S,
    so that Σ_i P[A_i] = 1

8
  • The triplet (S, ℱ, P) (sample space, family of
    events, probability measure), along with Axioms
    (II.2)-(II.3), forms a probability system
  • Conditional probability:
    P[A|B] = P[A ∩ B] / P[B]
  • The event B forces us to restrict attention from
    the original sample space S to a new sample space
    defined by the event B, since B must now have a
    total probability of unity. We magnify the
    probabilities associated with conditional events
    by dividing by the term P[B]
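A concrete check of this rescaling, using a two-dice sample space (the dice example is an illustration of mine, not from the slides):

```python
from fractions import Fraction
from itertools import product

# Sample space for two fair dice: 36 equally likely sample points.
S = list(product(range(1, 7), repeat=2))

def P(event):
    # Probability measure: relative frequency of the event's points in S.
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] + s[1] == 7   # the dice sum to seven
B = lambda s: s[0] == 3          # the first die shows three
AB = lambda s: A(s) and B(s)     # joint event "A and B"

# Conditioning on B rescales by P[B], so B carries total probability one.
print(P(AB) / P(B))              # 1/6
print(P(A) * P(B) == P(AB))      # True: here A and B happen to be independent
```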

9
  • Two events A and B are said to be statistically
    independent if and only if P[AB] = P[A]P[B]
  • If A and B are independent, then P[A|B] = P[A]
  • Theorem of total probability:
    P[B] = Σ_i P[B ∩ A_i]
  • If the event B is to occur it must occur in
    conjunction with exactly one of the mutually
    exclusive, exhaustive events A_i

10
  • The second important form of the theorem of
    total probability: P[B] = Σ_i P[B|A_i] P[A_i]
  • Instead of calculating the probability of a
    complex event B directly, we calculate the
    probability of B conditioned on each of the
    mutually exclusive events A_i

11
  • Bayes' theorem:
    P[A_i|B] = P[B|A_i] P[A_i] / Σ_j P[B|A_j] P[A_j]
  • where the A_i are a set of mutually exclusive
    and exhaustive events
  • Example: You have just entered a casino and
    gamble with one of two twin brothers; one is
    honest and the other is not. You know that you
    lose with probability 1/2 if you play with the
    honest brother, and lose with probability p if
    you play with the cheating brother

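The question posed later (slide 27), the probability of facing the cheating brother given a loss, follows directly. A sketch assuming, as the setup suggests but does not state, an equal prior of 1/2 for each brother:

```python
from fractions import Fraction

# Assumed prior: each twin is equally likely to be your opponent (1/2 each).
def p_cheat_given_loss(p):
    prior = Fraction(1, 2)
    # Theorem of total probability: P[lose] summed over the two brothers.
    p_lose = prior * Fraction(1, 2) + prior * p
    # Bayes' theorem: P[cheat | lose] = P[lose | cheat] P[cheat] / P[lose].
    return (prior * p) / p_lose

print(p_cheat_given_loss(Fraction(3, 4)))  # 3/5: a loss is evidence of cheating
```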
12
II.2 Random variables
  • A random variable is a variable whose value
    depends upon the outcome of a random experiment
  • To each outcome we associate a real number,
    which is in fact the value the random variable
    takes on that outcome
  • A random variable is thus a mapping from the
    points of the sample space into the (real) line

13
  • Example: If we win the game we win 5, if we lose
    we win -5, and if we draw we win 0.
  • (Figure: the sample space S maps Win, with
    probability 3/8, to 5; Draw, with probability
    1/4, to 0; and Lose, with probability 3/8, to -5.)
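As a quick check, the mapping and its expectation, with the probabilities read off the slide's figure:

```python
# The slide-13 game as a mapping from sample points to the real line.
# Probabilities taken from the slide: Win 3/8, Draw 1/4, Lose 3/8.
outcomes = {"W": (5, 3 / 8), "D": (0, 1 / 4), "L": (-5, 3 / 8)}

mean = sum(value * prob for value, prob in outcomes.values())
print(mean)  # 0.0 -- the winnings are zero in expectation
```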
14
  • Probability distribution function (PDF), also
    known as the cumulative distribution function:
    F_X(x) = P[X ≤ x]

15
(No Transcript)
16
  • Probability density function (pdf):
    f_X(x) = dF_X(x)/dx
  • The pdf integrated over an interval gives the
    probability that the random variable X lies in
    that interval: P[a < X ≤ b] = ∫_a^b f_X(x) dx
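A numeric illustration, using an assumed exponential pdf (the rate λ = 2 is an arbitrary choice of this sketch) and comparing the integral with the distribution function:

```python
from math import exp

# Assumed pdf f(x) = lam * exp(-lam * x) with distribution function
# F(x) = 1 - exp(-lam * x); integrating f over [a, b] should give
# F(b) - F(a), i.e. P[a < X <= b].
lam = 2.0
f = lambda x: lam * exp(-lam * x)
F = lambda x: 1 - exp(-lam * x)

def integrate(f, a, b, n=10_000):
    # Trapezoidal rule over [a, b] with n panels.
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

a, b = 0.5, 1.5
print(abs(integrate(f, a, b) - (F(b) - F(a))) < 1e-6)  # True
```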

17
  • Distributed random variable

18
  • Impulse functions handle discontinuities in the
    PDF
  • Functions of more than one variable: the joint
    PDF F_XY(x, y) = P[X ≤ x, Y ≤ y]
  • Marginal density function:
    f_X(x) = ∫ f_XY(x, y) dy
  • Two random variables X and Y are said to be
    independent if and only if
    f_XY(x, y) = f_X(x) f_Y(y)
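A minimal discrete check of the independence criterion, using an assumed uniform joint distribution:

```python
# Discrete sketch with an assumed uniform joint distribution on {0,1}^2:
# compute the marginals, then test f_XY(x, y) == f_X(x) * f_Y(y).
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

f_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
f_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

independent = all(joint[x, y] == f_x[x] * f_y[y] for x, y in joint)
print(independent)  # True: the uniform joint factors into its marginals
```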

19
  • We can define conditional distributions and
    densities in the same way
  • Function of one random variable: given the
    random variable X and its PDF, one should be able
    to calculate the PDF of the variable Y = g(X)
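A simulation sketch for an assumed g(X) = X² with X uniform on [0, 1], a case where the PDF of Y is available in closed form (F_Y(y) = P[X² ≤ y] = √y):

```python
import random

# X uniform on [0, 1], Y = g(X) = X**2, so F_Y(y) = P[X <= sqrt(y)] = sqrt(y).
random.seed(0)
xs = [random.random() for _ in range(100_000)]
ys = [x * x for x in xs]

y = 0.25
empirical = sum(1 for v in ys if v <= y) / len(ys)
print(abs(empirical - y ** 0.5) < 0.01)  # empirical CDF close to sqrt(y) = 0.5
```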

20
(No Transcript)
21
(No Transcript)
22
II.3 Expectation
  • Stieltjes integrals deal with discontinuities and
    impulses
  • Let the expectation be defined by the Stieltjes
    integral E[X] = ∫ x dF_X(x)

23
  • The Stieltjes integral always exists and
    therefore avoids the issue of impulses
  • Without impulses the pdf may not exist at
    discontinuities of the PDF
  • When impulses are permitted we may write
    E[X] = ∫ x f_X(x) dx
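A numeric sketch of such a mixed case, assuming a distribution with an impulse of mass 1/2 at zero plus half an exponential(1) density (the mixture is mine, for illustration):

```python
from math import exp

# E[X] = integral of x dF(x) for a mixed distribution: an assumed impulse
# of mass 1/2 at x = 0 plus half an exponential(1) density.
def density_part(x):
    return 0.5 * exp(-x)  # the continuous half of the mixture

# Discrete part contributes 0 * (1/2); continuous part is integrated
# numerically (left Riemann sum, truncated at upper where the tail is tiny).
n, upper = 100_000, 40.0
h = upper / n
continuous = h * sum((i * h) * density_part(i * h) for i in range(1, n))
mean = 0 * 0.5 + continuous
print(abs(mean - 0.5) < 1e-3)  # E[X] = 0 * 1/2 + 1 * 1/2 = 0.5
```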

24
(No Transcript)
25
  • Expectation of the sum of two random variables:
    E[X + Y] = E[X] + E[Y]

26
  • The expectation of the sum of two random
    variables is always equal to the sum of the
    expectations of each variable
  • This is true even if the variables are dependent
  • The expectation operator is a linear operator
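A quick simulation of this point, with Y deliberately constructed to depend on X:

```python
import random

# Dependent pair: Y is built from X, yet E[X + Y] = E[X] + E[Y] regardless.
random.seed(1)
xs = [random.gauss(0, 1) for _ in range(50_000)]
ys = [x + random.gauss(1, 1) for x in xs]  # Y depends on X by construction

mean = lambda v: sum(v) / len(v)
lhs = mean([x + y for x, y in zip(xs, ys)])
rhs = mean(xs) + mean(ys)
print(abs(lhs - rhs) < 1e-6)  # linearity holds despite the dependence
```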

27
  • Returning to the casino example: what is the
    probability that you are playing with the
    cheating brother, given that you lost?

28
  • The expected value of the product of two
    variables is equal to the product of the expected
    values if the variables are independent:
    E[XY] = E[X] E[Y]
  • Expected value of the product of functions of
    independent random variables:
    E[g(X) h(Y)] = E[g(X)] E[h(Y)]

29
  • nth moment: E[X^n]
  • nth central moment: E[(X - E[X])^n]

30
  • The nth central moment can be expressed as a
    function of the first n moments
  • First central moment = 0

31
  • Second central moment ⇒ variance:
    σ_X² = E[(X - E[X])²]
  • Standard deviation σ_X: the square root of the
    second central moment
  • Coefficient of variation: C_X = σ_X / E[X]

32
  • Covariance of two random variables X1 and X2:
  • Cov(X1, X2) = E[(X1 - E[X1])(X2 - E[X2])]
  • Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2)
  • Corr(X1, X2) = Cov(X1, X2) / (σ1 σ2)
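These identities can be verified on a small assumed data set, treating it as a finite population:

```python
# Population-style moments on a small assumed dataset, checking
# Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2).
x1 = [1.0, 2.0, 3.0, 4.0]
x2 = [2.0, 1.0, 4.0, 3.0]

mean = lambda v: sum(v) / len(v)
var = lambda v: mean([(a - mean(v)) ** 2 for a in v])
cov = lambda u, v: mean([(a - mean(u)) * (b - mean(v)) for a, b in zip(u, v)])

s = [a + b for a, b in zip(x1, x2)]
print(abs(var(s) - (var(x1) + var(x2) + 2 * cov(x1, x2))) < 1e-12)  # True
```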

33
Normal
  • Notation
  • Range
  • Parameters Scale
  • Parameters Shape

34
Normal
  • Probability Density Function
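The density itself was a slide image and is missing from the transcript; the standard form, with location μ and scale σ as in the plots that follow, is:

```latex
f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\,
         \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),
\qquad E[X] = \mu, \qquad \operatorname{Var}[X] = \sigma^2 .
```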

35
Normal
(Density plots for μ = 10, σ = 2 and μ = 10, σ = 1.)
36
Normal
(Density plots for μ = 0, σ = 2 and μ = 0, σ = 1.)
37
Normal
(Density plot for μ = 0, σ = 1.)
38
Normal
  • Expected Value

39
Normal
  • Variance

40
Chebyshev Inequality
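The inequality named on this slide (its statement was a slide image) reads, for any ε > 0:

```latex
P\bigl[\,\lvert X - E[X] \rvert \ge \varepsilon \,\bigr]
\;\le\; \frac{\sigma_X^{2}}{\varepsilon^{2}} .
```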
41
Strong Law of Large Numbers
42
Strong Law of Large Numbers
43
Central Limit Theorem
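The two limit theorems named on these slides can be stated, for i.i.d. X_i with mean μ and finite variance σ², as:

```latex
\frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{\ \text{a.s.}\ } \mu
\quad \text{(strong law of large numbers)},
\qquad
\frac{\sum_{i=1}^{n} X_i - n\mu}{\sigma\sqrt{n}} \xrightarrow{\ d\ } N(0,1)
\quad \text{(central limit theorem)}.
```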
44
Exponential
  • Probability Density Function
  • Distribution Function
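Since the slide's formulas are not in the transcript, here are the standard exponential forms with rate λ:

```latex
f_X(x) = \lambda e^{-\lambda x}, \qquad
F_X(x) = 1 - e^{-\lambda x}, \qquad x \ge 0,
\qquad E[X] = \frac{1}{\lambda}, \qquad
\operatorname{Var}[X] = \frac{1}{\lambda^{2}} .
```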

45
Exponential
  • Inter-arrival times of phone calls
  • Inter-arrival times of web sessions
  • Duration of on and off periods for voice models

46
Heavy-tailed distributions
47
Heavy-tailed distributions
  • Hyperbolic (power-law) decay of the tail
  • Possibly infinite variance
  • Possibly unbounded mean
  • Arise frequently in the network context

48
Pareto
  • Notation
  • Range
  • Parameters Scale
  • Parameters Shape

49
Pareto
  • Distribution Function

50
Pareto
  • Probability Density Function
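The Pareto formulas were slide images; the common form, writing α for the shape and k for the scale (the slides' a and b presumably play these roles, though the transcript does not say which is which), is:

```latex
F_X(x) = 1 - \left(\frac{k}{x}\right)^{\alpha}, \qquad
f_X(x) = \frac{\alpha\, k^{\alpha}}{x^{\alpha+1}}, \qquad x \ge k,
\qquad E[X] = \frac{\alpha k}{\alpha - 1} \quad (\alpha > 1).
```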

51
Pareto
(Plots for a = 1, b = 1 and a = 1, b = 2.)
52
Pareto
(Plots for a = 10, b = 5 and a = 5, b = 10.)
53
Pareto
(Plot for a = 5, b = 10.)
54
Pareto
  • Expected Value

55
Pareto
  • Moments Uncentered

56
Pareto
  • Distribution of file sizes in Unix systems
  • Duration of on and off periods in data models
    (Ethernet, individual users)

57
Weibull
  • Notation
  • Range
  • Parameters Scale
  • Parameters Shape

58
Weibull
  • Probability Density Function

59
Weibull
  • Distribution Function
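The Weibull formulas were slide images; assuming b is the scale and c the shape (matching the plot legends that follow), the standard forms are:

```latex
f_X(x) = \frac{c}{b}\left(\frac{x}{b}\right)^{c-1} e^{-(x/b)^{c}}, \qquad
F_X(x) = 1 - e^{-(x/b)^{c}}, \qquad x \ge 0,
\qquad E[X] = b\,\Gamma\!\left(1 + \tfrac{1}{c}\right).
```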

60
Weibull
(Plots for b = 1, c = 1 and b = 2, c = 1.)
61
Weibull
(Plots for b = 1, c = 2 and b = 2, c = 2.)
62
Weibull
(Plots for b = 10, c = 5 and b = 5, c = 10.)
63
Weibull
(Plot for b = 25, c = 10.)
64
Weibull
  • Moments Uncentered

65
Weibull
  • Expected Value

66
Weibull
  • Variance

67
Lognormal
  • Notation
  • Range
  • Parameters Scale
  • Parameters Shape

68
Lognormal
  • Probability Density Function
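The lognormal formulas were slide images; the standard forms, with μ and σ the mean and standard deviation of ln X, are:

```latex
f_X(x) = \frac{1}{x\,\sigma\sqrt{2\pi}}
         \exp\!\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right), \quad x > 0,
\qquad E[X] = e^{\mu + \sigma^2/2}, \qquad
\operatorname{Var}[X] = \bigl(e^{\sigma^2} - 1\bigr)\, e^{2\mu + \sigma^2}.
```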

69
Lognormal
  • Expected Value

70
Lognormal
  • Variance

71
Lognormal
(Plots for μ = 0, σ = 0.5 and μ = 0, σ = 0.7.)
72
Lognormal
(Plots for μ = 1, σ = 0.5 and μ = 1, σ = 0.7.)
73
Lognormal
(Plots for μ = 0, σ = 0.1 and μ = 1, σ = 0.1.)
74
Lognormal
(Plots for μ = 0, σ = 1 and μ = 1, σ = 1.)
75
Lognormal
(Plot for μ = 0, σ = 1.)
76
Lognormal
  • Multiplicative effect

77
  • II.4 Transforms, generating functions and the
    characteristic function
  • The characteristic function φ_X(u) of a random
    variable X is given by φ_X(u) = E[e^{juX}]
  • u: a real variable
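Written out (the slide's equations were images), the characteristic function and its moment-generating property are:

```latex
\phi_X(u) \triangleq E\!\left[e^{juX}\right]
         = \int_{-\infty}^{\infty} e^{jux} f_X(x)\, dx ,
\qquad
E[X^n] = j^{-n} \left. \frac{d^{n} \phi_X(u)}{du^{n}} \right|_{u=0}.
```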

78
  • Expanding e^{jux} in a power series and
    integrating term by term yields the moments of X

79
  • Notation

80
  • Moment generating function

81
  • Laplace transform of the pdf:
    X*(s) = E[e^{-sX}] = ∫₀^∞ e^{-sx} f_X(x) dx
  • Notation

82
  • Example

83
  • Probability generating function (discrete
    variable): G_X(z) = E[z^X]

84
  • Sum of n independent variables
  • X_i identically distributed
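For i.i.d. summands the lost equation is the standard product rule for transforms of independent sums:

```latex
Y = \sum_{i=1}^{n} X_i
\quad\Rightarrow\quad
\phi_Y(u) = \bigl[\phi_X(u)\bigr]^{n},
\qquad E[Y] = n\,E[X], \qquad \operatorname{Var}[Y] = n\,\operatorname{Var}[X].
```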

85
  • Sum of independent variables

86
  • x1 and x2 independent:
    Var(x1 + x2) = Var(x1) + Var(x2)
  • The variance of the sum of independent random
    variables is equal to the sum of the variances

87
  • Variable Y: a sum of independent variables in
    which the number of summands N is itself a
    random variable
  • where N is a random variable with mean E[N] and
    variance σ_N²
  • the X_i are independent and identically
    distributed
  • N and the X_i are independent
  • F_Y(y): a compound distribution
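The standard moment formulas for such a compound (random) sum, consistent with the definitions above, are:

```latex
Y = \sum_{i=1}^{N} X_i
\quad\Rightarrow\quad
E[Y] = E[N]\, E[X], \qquad
\operatorname{Var}[Y] = E[N]\,\sigma_X^{2} + (E[X])^{2}\,\sigma_N^{2}.
```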

88
  • Xi - identically distributed variables

89
(No Transcript)
90
  • II.6. Stochastic process
  • To each point ω ∈ S of the sample space a time
    function x(t, ω) is associated ⇒ a stochastic
    process is the resulting family of time functions

91
  • Autocorrelation
  • Wide sense stationary process
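The definitions behind these two bullets (lost with the slide images) can be written as:

```latex
R_X(t_1, t_2) \triangleq E\bigl[X(t_1)\,X(t_2)\bigr];
\qquad \text{wide-sense stationary:}\quad
E[X(t)] = \text{constant}, \qquad
R_X(t_1, t_2) = R_X(t_2 - t_1).
```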