Title: EE255/CPS226 Continuous Random Variables
1. EE255/CPS226: Continuous Random Variables
- Dept. of Electrical and Computer Engineering
- Duke University
- Email: bbm_at_ee.duke.edu, kst_at_ee.duke.edu
2. Definitions
- Distribution function: F_X(x) = P(X ≤ x).
- If F_X(x) is a continuous function of x, then X is a continuous random variable.
- F_X(x) a step function of x ⇒ discrete rv.
- F_X(x) piecewise continuous ⇒ mixed rv.
3. Probability Density Function (pdf)
- If X is a continuous rv, its pdf is f_X(x) = dF_X(x)/dx, so that F_X(x) = ∫_{-∞}^{x} f_X(t) dt.
- pdf properties:
- f_X(x) ≥ 0 for all x.
- ∫_{-∞}^{∞} f_X(x) dx = 1.
- P(a < X ≤ b) = ∫_{a}^{b} f_X(x) dx.
4. Exponential Distribution
- Arises commonly in reliability and queuing theory.
- It exhibits the memory-less (Markov) property.
- Related to the Poisson distribution.
- Examples: inter-arrival time between two IP packets (or voice calls), time interval between failures, etc.
- Mathematically (standard form shown below),
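A brief restatement of the standard EXP(λ) formulas, where λ > 0 is the rate parameter:

    f_X(x) = \lambda e^{-\lambda x}, \qquad F_X(x) = 1 - e^{-\lambda x}, \qquad x \ge 0
    E[X] = 1/\lambda, \qquad \mathrm{Var}[X] = 1/\lambda^2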
5. Exp Distribution: Memory-less Property
- A light bulb is replaced only after it has failed.
- Conversely, a critical space shuttle component is replaced after some fixed number of hours of use, thus exhibiting a memory property.
- Wait time in a queue at the check-in counter?
- The EXP(λ) distribution exhibits the useful memory-less property, i.e., the future occurrence of a random event (following an EXP(λ) distribution) is independent of when it occurred last.
6. Memory-less Property (contd.)
- Assume the rv X follows the EXP(λ) distribution.
- Memory-less property: find P(X > u + y | X > u) at a future point.
- X > u means the life time has already exceeded u; Y = X - u is the residual life time (see the derivation below).
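A worked version of the conditional probability referred to above, using the EXP(λ) tail P(X > t) = e^{-λt}:

    P(Y > y \mid X > u) = \frac{P(X > u + y)}{P(X > u)} = \frac{e^{-\lambda (u+y)}}{e^{-\lambda u}} = e^{-\lambda y}

So the residual life Y is again EXP(λ), independent of the elapsed time u.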
7. Memory-less Property (contd.)
- Memory-less property:
- If the component's life time is exponentially distributed, then the remaining life time does not depend on how long it has already been working.
- If inter-arrival times (between calls) are exponentially distributed, then the time we still need to wait for a new arrival is independent of how long we have already waited.
- The memory-less property is a.k.a. the Markov property.
- The converse is also true, i.e., if X satisfies the Markov property, then it must follow an EXP(λ) distribution. (A simulation check of the property is sketched below.)
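A minimal simulation sketch of the memory-less property, assuming NumPy is available; the rate, elapsed time u, residual time y, and sample size are arbitrary illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)
    lam, u, y = 2.0, 0.5, 0.3                  # rate, elapsed time, residual time
    x = rng.exponential(scale=1.0 / lam, size=1_000_000)

    p_uncond = np.mean(x > y)                  # P(X > y)
    survivors = x[x > u]                       # samples that survived past u
    p_cond = np.mean(survivors > u + y)        # P(X > u + y | X > u)

    print(p_uncond, p_cond, np.exp(-lam * y))  # all three should nearly agree

Both empirical estimates should match e^{-λy} up to sampling error, which is exactly the memory-less property.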
8. Reliability and Failure Rate Theory
- Reliability R(t): the probability that the failure occurs after time t. Let X be the life time of a component subject to failures.
- N0: total number of components (fixed); Ns(t): number of survivors at time t, so R(t) ≈ Ns(t)/N0.
- f(t)Δt: unconditional probability of failure in the interval (t, t+Δt].
- h(t)Δt: conditional probability of failure in (t, t+Δt], given survival up to t (relations below).
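The standard relationships among these quantities:

    R(t) = P(X > t) = 1 - F(t)
    f(t)\,\Delta t \approx P(t < X \le t + \Delta t)
    h(t)\,\Delta t \approx P(t < X \le t + \Delta t \mid X > t) = \frac{f(t)\,\Delta t}{R(t)}
    \Rightarrow\ h(t) = \frac{f(t)}{R(t)} = \frac{f(t)}{1 - F(t)}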
9. Reliability and Failure Rate Theory (contd.)
- Instantaneous failure rate (hazard rate) h(t), measured e.g. in failures/10k hrs.
- Let f(t) (the failure density function) be EXP(λ). Then, using simple calculus, h(t) = λ, a constant (worked out below).
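The calculation, with f(t) = λe^{-λt}:

    R(t) = 1 - F(t) = e^{-\lambda t}, \qquad
    h(t) = \frac{f(t)}{R(t)} = \frac{\lambda e^{-\lambda t}}{e^{-\lambda t}} = \lambda

So the exponential distribution is the constant-failure-rate (CFR) case.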
10. Failure Behaviors
- There are other failure density functions that can be used to model DFR, IFR (or mixed) failure behavior.
- [Figure: bathtub curve of failure rate vs. time, with DFR, CFR, and IFR regions.]
- DFR phase: initial design, constant bug fixes.
- CFR phase: normal operational phase.
- IFR phase: aging behavior.
11. HypoExponential
- HypoExp: multiple Exp stages in series.
- The 2-stage HypoExp is denoted HYPO(λ1, λ2). Its density, distribution and hazard rate functions are given below.
- HypoExp results in IFR: the hazard rate increases from 0 toward min(λ1, λ2).
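The standard two-stage forms, assuming λ1 ≠ λ2:

    f(t) = \frac{\lambda_1 \lambda_2}{\lambda_2 - \lambda_1}\left(e^{-\lambda_1 t} - e^{-\lambda_2 t}\right), \qquad t > 0
    F(t) = 1 - \frac{\lambda_2 e^{-\lambda_1 t} - \lambda_1 e^{-\lambda_2 t}}{\lambda_2 - \lambda_1}
    h(t) = \frac{f(t)}{1 - F(t)} = \frac{\lambda_1 \lambda_2 \left(e^{-\lambda_1 t} - e^{-\lambda_2 t}\right)}{\lambda_2 e^{-\lambda_1 t} - \lambda_1 e^{-\lambda_2 t}}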
12. Erlang Distribution
- Special case of HypoExp: all r stages are identical, each EXP(λ).
- {X > t} = {Nt < r}, where Nt is the number of stresses applied in (0, t] and Nt is Poisson with parameter λt. This interpretation gives the formulas below.
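The resulting r-stage Erlang survival function and density:

    R(t) = P(X > t) = \sum_{k=0}^{r-1} \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \qquad F(t) = 1 - R(t)
    f(t) = \frac{\lambda^{r} t^{r-1}}{(r-1)!}\, e^{-\lambda t}, \qquad t > 0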
13. Gamma Distribution
- The Gamma density function is given below.
- The Gamma distribution can capture all three failure models, viz. DFR, CFR and IFR:
- α = 1: CFR
- α < 1: DFR
- α > 1: IFR
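The density, written with shape α and rate λ (the same rate convention used for EXP(λ) above):

    f(t) = \frac{\lambda^{\alpha} t^{\alpha - 1} e^{-\lambda t}}{\Gamma(\alpha)}, \qquad t > 0,\ \alpha > 0,\ \lambda > 0

With α = 1 it reduces to EXP(λ), and with integer α = r it is the r-stage Erlang.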
14. HyperExponential Distribution
- Hypo or Erlang ⇒ sequential Exp stages.
- Parallel Exp stages ⇒ HyperExponential.
- A probabilistic mixture of k Exp distributions gives a k-stage HyperExp (density below).
- CPU service time may be modeled as HyperExp.
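A k-stage HyperExp, with branch probabilities p_i (p_1 + ... + p_k = 1):

    f(t) = \sum_{i=1}^{k} p_i\, \lambda_i e^{-\lambda_i t}, \qquad
    F(t) = \sum_{i=1}^{k} p_i \left(1 - e^{-\lambda_i t}\right)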
15. Weibull Distribution
- Frequently used to model fatigue failures, ball bearing failures, etc. (very long tails).
- The Weibull distribution is also capable of modeling DFR (α < 1), CFR (α = 1) and IFR (α > 1).
- α is called the shape parameter (one common parameterization is given below).
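One common rate-based parameterization (other texts use an equivalent scale-based form):

    f(t) = \lambda \alpha t^{\alpha - 1} e^{-\lambda t^{\alpha}}, \qquad
    F(t) = 1 - e^{-\lambda t^{\alpha}}, \qquad
    h(t) = \lambda \alpha t^{\alpha - 1}, \qquad t > 0

The hazard h(t) is decreasing for α < 1, constant for α = 1 (i.e., EXP(λ)), and increasing for α > 1.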
16. Log-logistic Distribution
- The log-logistic distribution can model DFR, CFR and IFR failure behavior within a single distribution, unlike the previous ones.
- For shape parameter values greater than 1, the failure rate first increases with t (IFR); after momentarily leveling off (CFR), it decreases (DFR) with time, all within the same distribution.
17. Gaussian (Normal) Distribution
- Bell shaped; intuitively pleasing!
- Central Limit Theorem: the mean of a large number n of mutually independent rvs (having arbitrary distributions) starts following the Normal distribution as n → ∞.
- μ: mean, σ: std. deviation, σ²: variance; written N(μ, σ²) (density below).
- μ and σ completely describe the statistics. This is significant in statistical estimation, signal processing, communication theory, etc.
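The N(μ, σ²) density for reference:

    f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \qquad -\infty < x < \infty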
18. Normal Distribution (contd.)
- N(0,1) is called the standard (normalized) Gaussian.
- N(0,1) is symmetric, i.e.:
- f(x) = f(-x)
- F(-z) = 1 - F(z)
- The failure rate h(t) of a Normal life time follows IFR behavior.
- Hence, the Normal distribution is suitable for modeling long-term wear or aging related failure phenomena.
19. Uniform Distribution
- U(a,b) ⇒ the pdf is constant over the interval (a,b): f(x) = 1/(b-a) for a < x < b, and 0 otherwise.
20. Defective Distributions
21. Functions of Random Variables
- Often, rvs need to be transformed/operated upon.
- Y = Φ(X); so, what is the distribution of Y?
- Example: Y = X².
- If f_X(x) is N(0,1), then f_Y(y) is the χ² distribution (with 1 degree of freedom), as derived below.
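The derivation behind the example, for Y = X²:

    F_Y(y) = P(X^2 \le y) = F_X(\sqrt{y}) - F_X(-\sqrt{y}), \qquad y > 0
    f_Y(y) = \frac{1}{2\sqrt{y}}\left[f_X(\sqrt{y}) + f_X(-\sqrt{y})\right]
           = \frac{1}{\sqrt{2\pi y}}\, e^{-y/2} \quad \text{for } X \sim N(0,1)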
22. Functions of RVs (contd.)
- If X is uniformly distributed on (0,1), then Y = -λ⁻¹ ln(1-X) follows the EXP(λ) distribution (verified below).
- Such transformations may be used to synthetically generate random numbers with desired distributions.
- Computer random number generators may employ this method.
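A one-line check, using P(X ≤ x) = x for X ~ U(0,1):

    F_Y(y) = P\!\left(-\tfrac{1}{\lambda}\ln(1 - X) \le y\right)
           = P\!\left(X \le 1 - e^{-\lambda y}\right)
           = 1 - e^{-\lambda y}, \qquad y \ge 0

which is exactly the EXP(λ) CDF.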
23. Functions of RVs (contd.)
- Given a rv X with CDF F_X and a monotone differentiable increasing function Φ, the CDF of Y = Φ(X) is F_Y(y) = F_X(Φ⁻¹(y)).
- The above method suggests a way to get a desired CDF, given some other simple type of CDF. This allows generation of random variables with the desired distribution.
- Choose Φ to be F, the desired CDF.
- Since Y = F(X), F_Y(y) = y and Y is U(0,1).
- To generate a random variable X having the desired distribution, generate a U(0,1) random variable Y, then transform y to x = F⁻¹(y) (a code sketch follows).
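A minimal sketch of this inverse-transform method, assuming NumPy; the helper name inverse_transform_sample and all parameter values are illustrative, and the EXP(λ) inverse CDF from the previous slide is used as the target:

    import numpy as np

    def inverse_transform_sample(inv_cdf, n, rng=None):
        """Generate n samples with CDF F by applying x = F^{-1}(y) to y ~ U(0,1)."""
        rng = rng or np.random.default_rng()
        y = rng.uniform(0.0, 1.0, size=n)
        return inv_cdf(y)

    lam = 1.5                                          # arbitrary rate
    exp_inv_cdf = lambda y: -np.log(1.0 - y) / lam     # F^{-1}(y) for EXP(lam)

    samples = inverse_transform_sample(exp_inv_cdf, n=100_000)
    print(samples.mean())                              # should be close to 1/lam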
24. Jointly Distributed RVs
- Joint distribution function: F_{X,Y}(x,y) = P(X ≤ x, Y ≤ y).
- X and Y are independent rvs iff the following holds for all x, y: F_{X,Y}(x,y) = F_X(x) F_Y(y).
25. Joint Distribution Properties
26. Joint Distribution Properties (contd.)
27. Order Statistics (min, max functions)
- Define Yk (known as the k-th order statistic):
- Y1 = min{X1, X2, ..., Xn}
- Yn = max{X1, X2, ..., Xn}
- Permute the Xi so that the Yi are sorted in ascending order.
- Y1: life time of a system with a series of components.
- Yn: life time of a system with a parallel (redundant) set of components.
- Distribution of Yk?
- The probability that exactly j of the Xi values are in (-∞, y] and the remaining (n-j) values are in (y, ∞) is given below.
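With the Xi iid with common CDF F(y), the binomial counting argument gives:

    P(\text{exactly } j \text{ of the } X_i \le y) = \binom{n}{j}\,[F(y)]^{j}\,[1 - F(y)]^{n-j}
    F_{Y_k}(y) = P(Y_k \le y) = \sum_{j=k}^{n} \binom{n}{j}\,[F(y)]^{j}\,[1 - F(y)]^{n-j}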
28. Sorted Random Sequence Yk
Observe that {Yk ≤ y} requires at least k of the Xi's to be ≤ y; some of the remaining Xi's may also be ≤ y.
29. Sorted RVs (contd.)
- F_{Yn}(y) = [F(y)]^n (max; parallel configuration).
- F_{Y1}(y) = 1 - [1 - F(y)]^n (min; series configuration).
- Using F_Y(y), reliability may be computed as sketched below.
- In general, F_{Yk}(y) is given by the binomial sum on slide 27.
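Reliability of the two basic configurations, with component reliability R(t) = 1 - F(t):

    R_{\text{series}}(t) = P(Y_1 > t) = [1 - F(t)]^{n} = [R(t)]^{n}
    R_{\text{parallel}}(t) = P(Y_n > t) = 1 - [F(t)]^{n} = 1 - [1 - R(t)]^{n}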
30. Sorted RVs: min Case (contd.)
- If the i-th component's life time is EXP(λi), then R_{Y1}(t) = Π_i e^{-λi t} = e^{-(λ1 + ... + λn) t}.
- Hence, the life time of such a (series) system also has an EXP distribution, with rate λ = λ1 + λ2 + ... + λn.
- For the parallel case, the resulting distribution is not EXP.
31. Sum of Random Variables
- Z = Φ(X, Y); (X, Y) may not be independent.
- For the special case Z = X + Y,
- the resulting pdf is a convolution integral (written out below).
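The density of Z = X + Y:

    f_Z(z) = \int_{-\infty}^{\infty} f_{X,Y}(x, z - x)\, dx

and, when X and Y are independent, the convolution of the marginals:

    f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx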
32. Sum of Random Variables (contd.)
- If X1, X2, ..., Xk are iid rvs with Xi ~ EXP(λ), then the rv (X1 + X2 + ... + Xk) is k-stage Erlang with parameter λ (a simulation check is sketched below).
- If Xi ~ EXP(λi) with distinct rates, then the rv (X1 + X2 + ... + Xk) has a k-stage HypoExp distribution. Specifically, for Z = X + Y the 2-stage HYPO(λ1, λ2) density of slide 11 results; in general, the k-stage density follows by repeated convolution.
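A small NumPy check (rate, number of stages, and sample size are arbitrary) that the sum of k iid EXP(λ) rvs behaves like the k-stage Erlang, compared here through its mean and variance:

    import numpy as np

    rng = np.random.default_rng(1)
    lam, k, n = 2.0, 4, 200_000            # rate, number of stages, sample size

    z = rng.exponential(scale=1.0 / lam, size=(n, k)).sum(axis=1)

    # k-stage Erlang(lam) has mean k/lam and variance k/lam^2
    print(z.mean(), k / lam)
    print(z.var(), k / lam**2)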
33. Sum of Normal Random Variables
- If X1, X2, ..., Xk are independent normal rvs, then the rv Z = (X1 + X2 + ... + Xk) is also normal, with the means and variances adding (see below).
- If X1, X2, ..., Xk are normal, then the sum of their squared standardized values follows the Gamma, i.e., the χ² (with k degrees of freedom), distribution.
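Stated precisely, for independent Xi ~ N(μi, σi²):

    Z = \sum_{i=1}^{k} X_i \sim N\!\left(\sum_{i=1}^{k}\mu_i,\ \sum_{i=1}^{k}\sigma_i^2\right)
    \sum_{i=1}^{k}\left(\frac{X_i - \mu_i}{\sigma_i}\right)^{2} \sim \chi^{2}_{k},
    \quad \text{i.e., Gamma with shape } \alpha = k/2 \text{ and rate } \lambda = 1/2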
34. Sum of RVs: Standby Redundancy
- Two independent components, X and Y.
- Series system: Z = min(X, Y).
- Parallel system: Z = max(X, Y).
- Cold standby: the life time is Z = X + Y.
- If X and Y are EXP(λ), then f_Z(z) = λ² z e^{-λz} for z > 0,
- i.e., Z is Gamma (2-stage Erlang) distributed, with E[Z] = 2/λ.
- May be extended to 1 active + 2 cold standbys (compare with TMR).
35. k-out-of-n Order Statistics
- For Xi ~ EXP(λ), the order statistic Y(n-k+1) of (X1, X2, ..., Xn) satisfies:
- Y(n-k+1) ~ HYPO(nλ, (n-1)λ, ..., kλ)
- Proof by induction.
- n = 2 case: k = 2 ⇒ Y1 (series, 2-out-of-2); k = 1 ⇒ Y2 (parallel, 1-out-of-2).
- F_{Y2}(y) = [F(y)]², or in general F_{Yn}(y) = [F(y)]^n.
- Y1 distribution? After the first failure, what matters is the residual life time of each surviving component.
- If all Xi's are EXP(λ) ⇒ memory-less property, i.e., the residual life time is independent of how long the component has already survived.
- Hence, each residual life time is also EXP(λ), and Y1 = min{X1, ..., Xn} ~ EXP(nλ).
36. k-out-of-n Order Statistics (contd.)
- Assume n parallel components. Then Y1 = time of the 1st component failure = min{X1, X2, ..., Xn} ~ EXP(nλ).
- The 2nd failure occurs after a further time Y2 - Y1 = min{X'2, X'3, ..., X'n}, where the X'i are the residual life times of the surviving components. Due to the memory-less property, the X'i are independent of past failure behavior, so min{X'2, X'3, ..., X'n} ~ EXP((n-1)λ). In general, for k-out-of-n (k must be working):
- Y(n-k+1) ~ HYPO(nλ, (n-1)λ, ..., kλ) (a simulation check of the first step appears after the figure).
- [Figure: stage diagram of successive inter-failure times; the stage ending at Y1 is EXP(nλ), the next EXP((n-1)λ), ..., the stage ending at Y(n-k+1) is EXP(kλ), ..., and the final stage ending at Yn is EXP(λ).]
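A small NumPy sanity check (per-component rate, n, and sample size are arbitrary) of the first step above, i.e., that Y1 = min{X1, ..., Xn} of n iid EXP(λ) life times is EXP(nλ):

    import numpy as np

    rng = np.random.default_rng(2)
    lam, n, trials = 0.5, 5, 200_000        # per-component rate, components, samples

    x = rng.exponential(scale=1.0 / lam, size=(trials, n))
    y1 = x.min(axis=1)                      # time of the first failure in each trial

    # EXP(n * lam) has mean 1/(n * lam); compare empirical and theoretical means
    print(y1.mean(), 1.0 / (n * lam))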