Title: Probability Review
1 Probability Review
- Sample Space
- Probability Assignment
- Set theory
- Conditional Probability
- Independence Assumption
- Random Variables
- Exponential Distribution
2 Sample Space
- The set of all possible outcomes of a random experiment
- Examples
- tossing a coin, throwing a die
- searching for an item in a database
- the time to search for an item in a database
- up time during a system operation
- down time of a system operation
3 Probability Assignment: the 3 fundamental axioms of probability
- 1. The probability that an event A occurs is a number between zero and unity: 0 ≤ P(A) ≤ 1
- 2. The probability of the entire sample space is unity: P(S) = 1
- 3. The probability of the union of two disjoint events is the sum of the probabilities: P(A ∪ B) = P(A) + P(B)
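As a quick worked instance of axiom 3 (not on the original slide): for a fair die, the events {1} and {2} are disjoint, so

\[
P(\{1\} \cup \{2\}) = P(\{1\}) + P(\{2\}) = \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}.
\]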
4 Set Theory
- Axiomatic probability is based on set theory
- Terminology
- A set is simply a collection or enumeration of objects
- Each item in the collection is an element of the set
- The universal set contains all possible elements in the problem
- When the concept of the universal set is used in probability theory, the term sample space is generally applied
- In probability, the sets of interest are those that can be viewed as outcomes of an experiment.
- Any subset of the sample space is called an event.
- Example
- Experiment: flipping a coin
- Sample space: {H, T}; events: E1 = {H}, E2 = {T}
5 Union and Intersection
- The union of sets A1 and A2 is a third set B, composed of all elements contained in A1 or A2 (or both).
- Notation: B = A1 ∪ A2, or B = A1 + A2
- Sets are disjoint, or mutually exclusive, when two (or more) sets have no common elements
- The intersection of sets A1 and A2 is defined as a third set D, which is composed of all elements contained in both A1 and A2
- Notation: D = A1 ∩ A2, or D = A1A2, or D = A1 · A2
- If two sets are disjoint, then their intersection is a set with no elements, called the null set, ∅.
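A minimal illustration of these set operations (a sketch, not from the slides), using Python's built-in set type:

A1 = {1, 2, 3}
A2 = {3, 4}

B = A1 | A2                # union: {1, 2, 3, 4}
D = A1 & A2                # intersection: {3}
disjoint = not (A1 & A2)   # True only when the intersection is the null set

print(B, D, disjoint)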
6 Probability of a Union
- For a disjoint union, this is the sum of the probabilities of the individual events
- For a union of 2 non-disjoint events, what is the probability of the union?
7 General Formula for Probability of Union
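The formula itself appears to have been a figure on the original slide; the standard inclusion-exclusion formula it refers to is

\[
P\Big(\bigcup_{i=1}^{n} A_i\Big) = \sum_{i} P(A_i) - \sum_{i<j} P(A_i \cap A_j) + \sum_{i<j<k} P(A_i \cap A_j \cap A_k) - \cdots + (-1)^{n+1} P(A_1 \cap \cdots \cap A_n),
\]

which for two events reduces to P(A1 ∪ A2) = P(A1) + P(A2) − P(A1 ∩ A2).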
8 The problem of computing the probability of a union
- The total number of terms in the expansion is 2^n - 1
- To simplify the computation, people usually do the following (see the sketch after this list)
- P(A1 ∪ A2) = 1 - (1 - P(A1))(1 - P(A2))
- General form: P(A1 ∪ ... ∪ An) = 1 - ∏ i=1..n (1 - P(Ai))
- How much will you save (in terms of computation time)?
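A minimal sketch of the two approaches (not from the slides), assuming the events are independent so that the probability of any intersection is the product of the member probabilities:

from itertools import combinations
from math import prod

p = [0.01, 0.02, 0.03, 0.05]   # hypothetical event probabilities, assumed independent

# Inclusion-exclusion: sums over all non-empty subsets, 2^n - 1 terms
incl_excl = 0.0
for k in range(1, len(p) + 1):
    for combo in combinations(p, k):
        incl_excl += (-1) ** (k + 1) * prod(combo)

# Complement shortcut: only n multiplications
shortcut = 1 - prod(1 - pi for pi in p)

print(incl_excl, shortcut)     # the two values agree (~0.106) when independence holds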
9 A Story about the Prob. of Union
10 Fault Tree Analysis for LOC
[Fault tree diagram. Top event (G001): WAAS user Loss of Continuity (LOC) during ER/NPA. Contributing events (gates G171, G983, G003): WAAS signal is available and the WAAS or WAAS/RAIM PL exceeds the AL; user receiver causes LOC (probability 0); WAAS signal is unavailable and the RAIM PL exceeds the AL. Lower-level events (gates G002, G172): LOC due to the PL exceeding the AL (RAIM only); LOC due to transmission or reception failures (tree continues).]
11 Independence Assumption
- People usually forget about the independence assumption made for the OR gate.
- All fault tree modeling tools calculate the probability of C1 ∪ C2 ∪ ... ∪ Cn as 1 - ∏ i=1..n (1 - P(Ci)).
- This indicates that the calculation also assumes independence among events C1, C2, ..., Cn.
- Events under an OR gate therefore also deserve some attention in an independence investigation.
12 Conditional Probability
- What is the probability of obtaining the 4 of clubs on one draw from a deck of cards?
- What is the probability of drawing the 4 of clubs given that a club is drawn?
- P(A2 | A1): the probability of A2 occurring conditioned on the previous occurrence of A1, or, simply, the probability of A2 given A1.
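A quick numeric check of the card question (a sketch, not from the slides), using the definition P(A2 | A1) = P(A1 ∩ A2) / P(A1):

from fractions import Fraction

# A2: drawing the 4 of clubs; A1: drawing a club
p_four_of_clubs = Fraction(1, 52)        # unconditional probability
p_club = Fraction(13, 52)                # 13 of the 52 cards are clubs
p_intersection = Fraction(1, 52)         # the 4 of clubs is itself a club

p_conditional = p_intersection / p_club  # P(A2 | A1) = 1/13
print(p_four_of_clubs, p_conditional)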
13 Calculating P(A1 ∩ A2) using Conditional Probability
- P(A1 ∩ A2) = P(A1) P(A2 | A1)
- In general, P(A1 ∩ A2 ∩ ... ∩ An) expands by the chain rule (shown below)
- Under what condition will P(A2 | A1) = P(A2)?
- In this situation, we say the probability of occurrence of event A2 is independent of the occurrence of event A1
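The general expansion referenced above, presumably displayed as a formula on the original slide, is the chain rule of conditional probability:

\[
P(A_1 \cap A_2 \cap \cdots \cap A_n) = P(A_1)\,P(A_2 \mid A_1)\,P(A_3 \mid A_1 \cap A_2)\cdots P(A_n \mid A_1 \cap \cdots \cap A_{n-1}).
\]

When P(A2 | A1) = P(A2), the two-event case reduces to P(A1 ∩ A2) = P(A1) P(A2).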
14 Pros and Cons of the Independence Assumption
- When A1 and A2 are independent, P(A1 ∩ A2) = P(A1) P(A2)
- In general, when events A1, A2, ..., An are independent, P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) P(A2) ... P(An)
- The benefit of using the independence assumption
- The cautions
15 Random Variables
- A random variable is a function mapping each element of a sample space to a real number
- Random variables can be discrete or continuous
- If the random variable X assumes a finite number of values, then X is called a discrete random variable.
- The probability function P(X = xi) is a discrete probability density function (PDF), denoted f(x), associated with the discrete random variable X
16 Discrete Random Variables
- Example
- Throw a die; the random variable x is the number of spots face up on any throw.
- The domain of the random variable is x = 1, 2, 3, 4, 5, 6.
- Using the equally-likely-events hypothesis, we have P(x = 1) = P(x = 2) = ... = 1/6
- In this case, the probability density function is constant.
- The distribution function, F(x) = P(X ≤ x), is a cumulative probability and is often called the cumulative distribution function (CDF).
- In the above example, F(x) = x/6 for x = 1, 2, ..., 6 (see the sketch below).
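A minimal tabulation of the die's PDF and CDF (a sketch, not from the slides):

from fractions import Fraction

outcomes = range(1, 7)
pdf = {x: Fraction(1, 6) for x in outcomes}                          # constant density
cdf = {x: sum(pdf[k] for k in outcomes if k <= x) for x in outcomes} # F(x) = x/6

for x in outcomes:
    print(x, pdf[x], cdf[x])   # the CDF climbs from 1/6 to 1 at x = 6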
17 Discrete r.v. Example - Binomial Distribution
- The Bernoulli process
- Applies to situations where an event either occurs or does not occur (success or failure)
- Let the probability of success be p; then the probability of failure is 1 - p.
- Let the total number of independent trials be n, and the number of successes be r.
- The probability of r successes in n trials is given below.
18 Continuous Random Variables
- The values that the random variable can take are continuous
- Examples
- The failure time of a system
- The value of a circuit resistance
- CDF: cumulative distribution function, F(x) = P(X ≤ x)
- The density function f(x) is given by the derivative of the distribution function
- Example: the failure time of a system is exponentially distributed (see below)
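For that example, the CDF and density (presumably shown as formulas on the original slide) are

\[
F(t) = P(T \le t) = 1 - e^{-\lambda t}, \qquad f(t) = \frac{dF(t)}{dt} = \lambda e^{-\lambda t}, \qquad t \ge 0.
\]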
19 Continuous r.v. Example - Exponential Distribution
- The exponential density function is f(x) = λ e^(-λx) if x ≥ 0
- The exponential distribution occurs over and over again in reliability work, typically as the distribution of the time to failure for a great number of electronic-system parts
- The parameter λ is constant and is usually called the failure rate (with units of fractional failures per hour)
- The failure probability: F(t) = 1 - e^(-λt)
- The success probability (probability of no failure): 1 - F(t) = e^(-λt)
- Expected value (Mean Time Between Failures, MTBF): 1/λ
- The most commonly used distribution in reliability and performance modeling
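A small numeric sketch of these quantities (not from the slides), assuming a hypothetical failure rate of 1e-4 failures per hour:

import math

lam = 1e-4    # hypothetical failure rate (failures per hour)
t = 1000.0    # mission time in hours

failure_prob = 1 - math.exp(-lam * t)   # F(t) = 1 - e^(-λt)
success_prob = math.exp(-lam * t)       # probability of no failure, e^(-λt)
mtbf = 1 / lam                          # mean time between failures, 1/λ

print(failure_prob, success_prob, mtbf)   # ~0.0952, ~0.9048, 10000.0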
20 Memoryless Property
A random variable X is said to be without memory, or memoryless, if P{X > s + t | X > t} = P{X > s} for all s, t ≥ 0.
Interpretation:
If an item is alive at time t, then the distribution of the remaining amount of time that it survives is the same as the original lifetime distribution.
The item does not remember that it has already been in use for a time t.
21 More on the Memoryless Property
- P{X > s + t | X > t} = P{X > s} for all s, t ≥ 0
When X is exponentially distributed, this follows because e^(-λ(s+t)) = e^(-λs) e^(-λt).
Hence, exponentially distributed random variables are memoryless.
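Written out in full (a short derivation consistent with the slide's claim):

\[
P\{X > s+t \mid X > t\} = \frac{P\{X > s+t\}}{P\{X > t\}} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda t}} = e^{-\lambda s} = P\{X > s\}.
\]

(The first equality uses the fact that the event {X > s + t} is contained in {X > t}.)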
22 Example 1
- Suppose the amount of time one spends in a bank is exponentially distributed with mean 10 minutes, that is, λ = 1/10. What is the probability that a customer will spend more than fifteen minutes in the bank? What is the probability that a customer will spend more than fifteen minutes in the bank given that s/he is still in the bank after ten minutes?
23 Solution to Example 1
It turns out that the exponential distribution is the unique continuous distribution possessing the memoryless property (proof omitted here).
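The numeric answers, presumably shown as formulas on the original slide; a quick check in code:

import math

lam = 1 / 10   # rate: the mean time in the bank is 10 minutes

p_more_than_15 = math.exp(-lam * 15)          # P(X > 15) = e^(-1.5) ≈ 0.2231

# By the memoryless property, P(X > 15 | X > 10) = P(X > 5)
p_more_than_15_given_10 = math.exp(-lam * 5)  # e^(-0.5) ≈ 0.6065

print(p_more_than_15, p_more_than_15_given_10)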
24 Expectation (Mean) of an Exponential Random Variable
Let X be exponentially distributed with parameter λ. What is the expectation of X, denoted E[X]?
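The standard derivation (presumably what the slide worked through), using integration by parts:

\[
E[X] = \int_0^\infty x\,\lambda e^{-\lambda x}\,dx = \Big[-x e^{-\lambda x}\Big]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}.
\]

This is the MTBF quoted on the exponential-distribution slide.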
25 Homework