Title: PART 3 Random Processes
1 PART 3: Random Processes
Huseyin Bilgekul, EENG 571 Probability and Stochastic Processes, Department of Electrical and Electronic Engineering, Eastern Mediterranean University
2 Random Processes
3 Kinds of Random Processes
4 Random Processes
- A RANDOM VARIABLE X is a rule for assigning to every outcome ω of an experiment a number X(ω).
- Note that X denotes the random variable and X(ω) denotes a particular value.
- A RANDOM PROCESS X(t) is a rule for assigning to every ω a function X(t, ω).
- Note that for notational simplicity we often omit the dependence on ω.
5 Conceptual Representation of RP
6 Ensemble of Sample Functions
The set of all possible functions is called the
ENSEMBLE.
7 Random Processes
- A general random or stochastic process can be described as:
- A collection of time functions (signals) corresponding to various outcomes of random experiments.
- A collection of random variables observed at different times.
- Examples of random processes in communications:
- Channel noise,
- Information generated by a source,
- Interference.
8 Random Processes
Let ω denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ω) is assigned. The collection of such waveforms forms a stochastic process. The set of outcomes {ω} and the time index t can be continuous or discrete (countably infinite or finite) as well. For fixed ω ∈ S (the set of all experimental outcomes), X(t, ω) is a specific time function. For fixed t, X(t, ω) is a random variable. The ensemble of all such realizations over time represents the stochastic process X(t).
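The two views above can be sketched numerically: fixing ω selects one time function from the ensemble, while fixing t collects one random variable across all outcomes. A minimal sketch, using a simple ±1 random-walk process chosen purely for illustration:

```python
import random

random.seed(0)

T = 20    # number of time points per sample function
N = 500   # number of outcomes omega in the ensemble

# ensemble[omega] is one sample function X(t, omega): a realization of a
# simple +/-1 random-walk process, used here purely for illustration.
ensemble = []
for omega in range(N):
    x, path = 0, []
    for t in range(T):
        x += random.choice((-1, 1))
        path.append(x)
    ensemble.append(path)

# Fixed omega: a specific time function (one waveform of the ensemble).
waveform = ensemble[0]

# Fixed t: a random variable -- one value per outcome omega.
t1 = 10
values_at_t1 = [ensemble[omega][t1] for omega in range(N)]

print(len(waveform), len(values_at_t1))   # 20 500
```

Each row of `ensemble` is a realization; each column is a random variable, which is exactly the two-index structure X(t, ω) describes.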
9 Random Process for a Continuous Sample Space
10 Random Processes
11 Wiener Process Sample Function
13 Sample Sequence for Random Walk
14 Sample Function of the Poisson Process
15 Random Binary Waveform
16 Autocorrelation Function of the Random Binary Signal
17 Example
19 Random Processes Introduction (1)
20 Introduction
- A random process is a process (i.e., a variation in time or one-dimensional space) whose behavior is not completely predictable and can be characterized by statistical laws.
- Examples of random processes:
- Daily stream flow
- Hourly rainfall of storm events
- Stock index
21 Random Variable
- A random variable is a mapping that assigns outcomes of a random experiment to real numbers. The occurrence of each outcome follows a certain probability distribution. Therefore, a random variable is completely characterized by its probability density function (PDF).
22 STOCHASTIC PROCESS
- The term stochastic process appears mostly in statistics textbooks, whereas the term random process is frequently used in books on engineering applications.
26 STOCHASTIC PROCESS
27 DENSITY OF STOCHASTIC PROCESSES
- First-order densities of a random process:
- A stochastic process is defined to be completely or totally characterized if the joint densities for the random variables X(t1), X(t2), …, X(tn) are known for all times t1, t2, …, tn and all n.
- In general, a complete characterization is practically impossible, except in rare cases. As a result, it is desirable to define and work with various partial characterizations. Depending on the objectives of the application, a partial characterization often suffices to ensure the desired outputs.
28 DENSITY OF STOCHASTIC PROCESSES
- For a specific t, X(t) is a random variable with distribution F(x, t) = P[X(t) ≤ x].
- The function F(x, t) is defined as the first-order distribution of the random variable X(t). Its derivative with respect to x,
f(x, t) = ∂F(x, t)/∂x,
is the first-order density of X(t).
29 DENSITY OF STOCHASTIC PROCESSES
- If the first-order densities defined for all time t, i.e. f(x, t), are all the same, then f(x, t) does not depend on t and we call the resulting density the first-order density of the random process; otherwise, we have a family of first-order densities.
- The first-order densities (or distributions) are only a partial characterization of the random process, as they do not contain information that specifies the joint densities of the random variables defined at two or more different times.
30 MEAN AND VARIANCE OF RP
- Mean and variance of a random process:
- The first-order density of a random process, f(x, t), gives the probability density of the random variables X(t) defined for all time t. The mean of a random process, mX(t), is thus a function of time specified by
mX(t) = E[X(t)] = ∫ x f(x, t) dx.
- For the case where the mean of X(t) does not depend on t, we have mX(t) = E[X(t)] = mX (a constant).
- The variance of a random process, also a function of time, is defined by
σX²(t) = E[(X(t) − mX(t))²].
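Both quantities can be estimated from an ensemble of sample functions by averaging across outcomes at each fixed t. A minimal sketch, assuming the illustrative process X(t) = t + W(t) with W(t) iid N(0, 1), so that mX(t) = t and the variance is 1 for every t:

```python
import random

random.seed(1)

T, N = 8, 4000
# Ensemble of realizations of the assumed example process
# X(t) = t + W(t), with W(t) iid N(0, 1): m_X(t) = t, variance 1.
ensemble = [[t + random.gauss(0.0, 1.0) for t in range(T)] for _ in range(N)]

def ensemble_mean(ens, t):
    """Estimate m_X(t) = E[X(t)] by averaging over outcomes."""
    return sum(path[t] for path in ens) / len(ens)

def ensemble_var(ens, t):
    """Estimate the variance E[(X(t) - m_X(t))^2] over outcomes."""
    m = ensemble_mean(ens, t)
    return sum((path[t] - m) ** 2 for path in ens) / len(ens)

for t in (0, 3, 7):
    print(t, round(ensemble_mean(ensemble, t), 2),
          round(ensemble_var(ensemble, t), 2))
```

Here the estimated mean tracks t while the estimated variance stays near 1, showing that both are in general functions of time even though the variance happens to be constant for this example.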
31 HIGHER ORDER DENSITY OF RP
- Second-order densities of a random process:
- For any pair of random variables X(t1) and X(t2), we define the second-order density of a random process as f(x1, x2; t1, t2) or f(x1, x2).
- Nth-order densities of a random process:
- The nth-order density functions for X(t) at times t1, t2, …, tn are given by f(x1, x2, …, xn; t1, t2, …, tn) or f(x1, x2, …, xn).
32 Autocorrelation function of RP
- Given two random variables X(t1) and X(t2), a measure of the linear relationship between them is specified by E[X(t1)X(t2)]. For a random process, t1 and t2 go through all possible values, and therefore E[X(t1)X(t2)] can change and is a function of t1 and t2. The autocorrelation function of a random process is thus defined by
RXX(t1, t2) = E[X(t1)X(t2)].
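The expectation E[X(t1)X(t2)] can be estimated by averaging the product over an ensemble of realizations. A sketch using a ±1 random walk as the assumed example process; writing X(t) for the position after t + 1 steps, the iid unit-variance steps give RXX(t1, t2) = min(t1, t2) + 1 (only the shared steps contribute):

```python
import random

random.seed(2)

T, N = 16, 5000
# Realizations of a +/-1 random walk. With X(t) the position after t+1
# iid unit-variance steps, R_XX(t1, t2) = min(t1, t2) + 1.
ensemble = []
for _ in range(N):
    x, path = 0, []
    for _t in range(T):
        x += random.choice((-1, 1))
        path.append(x)
    ensemble.append(path)

def autocorrelation(ens, t1, t2):
    """Estimate R_XX(t1, t2) = E[X(t1) X(t2)] by ensemble averaging."""
    return sum(p[t1] * p[t2] for p in ens) / len(ens)

print(autocorrelation(ensemble, 4, 9))   # close to min(4, 9) + 1 = 5
```

Note that the result genuinely depends on both t1 and t2, as the definition says: the walk is not stationary.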
33 Autocovariance Functions of RP
34 Stationarity of Random Processes
- Strict-sense stationarity seldom holds for random
processes, except for some Gaussian processes.
Therefore, weaker forms of stationarity are
needed.
35 Stationarity of Random Processes
36 Wide Sense Stationarity (WSS) of Random Processes
37 Equality and Continuity of RP
- Equality
- Note that x(t, ωi) = y(t, ωi) for every ωi is not the same as x(t, ωi) = y(t, ωi) with probability 1.
38 Equality and Continuity of RP
39 Mean Square Equality of RP
40 Equality and Continuity of RP
42 Random Processes Introduction (2)
43 Stochastic Continuity
49 Stochastic Convergence
- A random sequence or a discrete-time random process is a sequence of random variables X1(ω), X2(ω), …, Xn(ω), …, ω ∈ Ω.
- For a specific ω, {Xn(ω)} is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations.
50 Sure Convergence (Convergence Everywhere)
- The sequence of random variables {Xn(ω)} converges surely to the random variable X(ω) if the sequence of functions Xn(ω) converges to X(ω) as n → ∞ for all ω ∈ Ω, i.e.,
- Xn(ω) → X(ω) as n → ∞ for all ω ∈ Ω.
51 Stochastic Convergence
53 Almost-sure Convergence (Convergence with Probability 1)
55 Mean-square Convergence
56 Convergence in Probability
57 Convergence in Distribution
58 Remarks
- Convergence with probability one applies to the individual realizations of the random process; convergence in probability does not.
- The weak law of large numbers is an example of convergence in probability.
- The strong law of large numbers is an example of convergence with probability 1.
- The central limit theorem is an example of convergence in distribution.
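The WLLN case can be illustrated empirically: for the sample mean of n iid Uniform(0, 1) variables, the fraction of realizations that land far from the true mean 0.5 shrinks as n grows, which is convergence in probability in action. A minimal sketch (the tolerance 0.05 and trial counts are arbitrary choices):

```python
import random

# Convergence in probability (WLLN sketch): for the sample mean of n iid
# Uniform(0, 1) variables, P(|mean - 0.5| > 0.05) shrinks as n grows.
rng = random.Random(3)

def sample_mean(n):
    return sum(rng.random() for _ in range(n)) / n

frac_far = {}
for n in (10, 100, 10000):
    deviations = [abs(sample_mean(n) - 0.5) for _ in range(200)]
    frac_far[n] = sum(d > 0.05 for d in deviations) / 200

print(frac_far)   # the fractions decrease toward 0 as n increases
```

Nothing here says any individual realization converges; only the probability of a large deviation goes to 0, which is the distinction drawn in the first remark.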
59 Weak Law of Large Numbers (WLLN)
60 Strong Law of Large Numbers (SLLN)
61 The Central Limit Theorem
62 Venn Diagram of Relation of Types of Convergence
Note that even sure convergence may not imply
mean square convergence.
63 Example
64 Example
65 Example
66 Example
67 Ergodic Theorem
68 Ergodic Theorem
69 The Mean-Square Ergodic Theorem
70 The Mean-Square Ergodic Theorem
- The above theorem shows that one can expect a sample average to converge to a constant in the mean-square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.
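The "memory dies out" condition can be seen numerically with an assumed example process: X(t) = 0.9 X(t−1) + W(t) with W(t) iid N(0, 1) has mean 0 and a covariance that decays geometrically with the lag, so the time average of a single long realization settles near the ensemble mean. A sketch:

```python
import random

random.seed(4)

# Assumed example: X(t) = 0.9 X(t-1) + W(t), W(t) iid N(0, 1).
# Its mean is 0 and its covariance decays geometrically with the lag
# (the memory dies out), so a single realization's time average
# converges to the ensemble mean in the mean-square sense.
N = 200000
x, total = 0.0, 0.0
for _ in range(N):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    total += x

time_average = total / N
print(time_average)   # close to the ensemble mean 0
```

If the covariance did not decay (e.g., X(t) equal to one random constant for all t), the time average would converge to that random value instead of the ensemble mean, and the process would not be mean-ergodic.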
71 Mean-Ergodic Process
72 Strong or Individual Ergodic Theorem
73 Strong or Individual Ergodic Theorem
74 Strong or Individual Ergodic Theorem
75 Examples of Stochastic Processes
- iid random process:
- A discrete-time random process X(t), t = 1, 2, …, is said to be independent and identically distributed (iid) if any finite number, say k, of the random variables X(t1), X(t2), …, X(tk) are mutually independent and have a common cumulative distribution function FX(·).
76 iid Random Stochastic Processes
- The joint cdf for X(t1), X(t2), …, X(tk) is given by
FX(x1, x2, …, xk) = FX(x1) FX(x2) ⋯ FX(xk).
- For discrete-valued processes it also yields
p(x1, x2, …, xk) = p(x1) p(x2) ⋯ p(xk),
where p(x) represents the common probability mass function.
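The product form of the joint pmf is easy to check directly. A sketch assuming a Bernoulli(p) iid process with the arbitrary illustrative value p = 0.3: the factored joint pmf must sum to 1 over all binary k-tuples.

```python
from itertools import product

# For an iid process the joint pmf factors into a product of the common
# pmf: p(x1, ..., xk) = p(x1) p(x2) ... p(xk). Checked here for an
# assumed Bernoulli(p) process, p = 0.3 (an arbitrary choice).
p = 0.3

def pmf(x):
    """Common pmf of each X(t): P(X = 1) = p, P(X = 0) = 1 - p."""
    return p if x == 1 else 1.0 - p

def joint_pmf(xs):
    """Joint pmf of (X(t1), ..., X(tk)) under independence."""
    result = 1.0
    for x in xs:
        result *= pmf(x)
    return result

# The factored joint pmf sums to 1 over all 2^k binary k-tuples.
k = 4
total = sum(joint_pmf(xs) for xs in product((0, 1), repeat=k))
print(total)
```

For example, joint_pmf((1, 1, 0)) returns p · p · (1 − p), exactly the product the slide states.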
77 Bernoulli Random Process
78 Random walk process
79 Random walk process
- Let π0 denote the probability mass function of X0. The joint probability of X0, X1, …, Xn is
P(X0 = x0, X1 = x1, …, Xn = xn) = π0(x0) p(x1 − x0) ⋯ p(xn − xn−1),
where p(·) is the common pmf of the iid steps.
80 Random walk process
81 Random walk process
- The property
P(Xn+1 = xn+1 | Xn = xn, Xn−1 = xn−1, …, X0 = x0) = P(Xn+1 = xn+1 | Xn = xn)
is known as the Markov property.
- A special case of the random walk is the Brownian motion.
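A short sketch of the ±1 random walk: the walk is built step by step, and the Markov property is visible in the construction, since the next position depends only on the current one (it is Xn ± 1 regardless of how the walk got there).

```python
import random

def random_walk(n, rng):
    """Random walk X_k = X_{k-1} + Y_k, X_0 = 0, with iid steps
    Y_k = +/-1, each with probability 1/2."""
    x, path = 0, [0]
    for _ in range(n):
        x += rng.choice((-1, 1))
        path.append(x)
    return path

walk = random_walk(10, random.Random(5))
print(walk)

# Markov property in the construction: given the whole past, the next
# position is determined by the current position alone (X_n + 1 or
# X_n - 1, each with probability 1/2).
for k in range(1, len(walk)):
    assert abs(walk[k] - walk[k - 1]) == 1
```

The joint probability formula above is just this construction written multiplicatively: one factor π0 for the start and one step-pmf factor per increment.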
82 Gaussian process
- A random process X(t) is said to be a Gaussian random process if all finite collections of the random process, X1 = X(t1), X2 = X(t2), …, Xk = X(tk), are jointly Gaussian random variables for all k and all choices of t1, t2, …, tk.
- Joint pdf of jointly Gaussian random variables X1, X2, …, Xk:
f(x) = (2π)^(−k/2) |Σ|^(−1/2) exp(−(1/2)(x − m)ᵀ Σ⁻¹ (x − m)),
where m is the mean vector and Σ the covariance matrix.
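A jointly Gaussian collection can be sketched by taking linear combinations of independent standard normals (linear combinations of Gaussians remain Gaussian). A minimal two-point sketch for (X(t1), X(t2)) with zero means, unit variances, and an arbitrary illustrative correlation ρ = 0.8:

```python
import math
import random

random.seed(6)

# Sketch: jointly Gaussian pair (X(t1), X(t2)) with zero means, unit
# variances, and correlation rho, built from independent standard
# normals. (rho = 0.8 is an arbitrary illustrative value.)
rho = 0.8
N = 100000
total = 0.0
for _ in range(N):
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    x1 = z1
    x2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
    total += x1 * x2

corr = total / N   # estimates E[X1 X2] = rho (both have variance 1)
print(round(corr, 2))
```

This is the k = 2 case of the joint density above, with m = 0 and Σ having ones on the diagonal and ρ off the diagonal.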
83 Gaussian process
84 Time series AR random process
85 The Brownian motion (one-dimensional, also known as random walk)
- Consider a particle that moves randomly on the real line.
- Suppose that at small time intervals Δ the particle jumps a small distance δ, randomly and equally likely to the left or to the right.
- Let XΔ(t) be the position of the particle on the real line at time t.
86 The Brownian motion
- Assume the initial position of the particle is at the origin, i.e. XΔ(0) = 0.
- The position of the particle at time t can be expressed as
XΔ(t) = δ (Y1 + Y2 + ⋯ + Y⌊t/Δ⌋),
where Y1, Y2, … are independent random variables, each having probability 1/2 of equaling 1 and −1.
- (⌊t/Δ⌋ represents the largest integer not exceeding t/Δ.)
87 Distribution of XΔ(t)
- Let the step length equal δ = √Δ; then
XΔ(t) = √Δ (Y1 + Y2 + ⋯ + Y⌊t/Δ⌋).
- For fixed t, if Δ is small then the distribution of XΔ(t) is approximately normal with mean 0 and variance t, i.e., XΔ(t) ~ N(0, t).
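This scaling can be checked by simulation: with δ = √Δ, the sample mean and variance of XΔ(t) over many realizations should approach 0 and t. A sketch (Δ = 0.01, t = 2, and the sample sizes are arbitrary choices):

```python
import math
import random

def brownian_position(t, delta, rng):
    """X_Delta(t) = sqrt(Delta) * (Y1 + ... + Y_floor(t/Delta)) with iid
    steps Y_i = +/-1, each with probability 1/2."""
    n = int(t / delta)   # largest integer not exceeding t / delta
    return math.sqrt(delta) * sum(rng.choice((-1, 1)) for _ in range(n))

rng = random.Random(7)
t, delta, N = 2.0, 0.01, 20000
samples = [brownian_position(t, delta, rng) for _ in range(N)]

mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / N
# For small delta, X_Delta(t) is approximately N(0, t): here t = 2.
print(round(mean, 2), round(var, 2))
```

The variance works out because each scaled step contributes variance Δ and there are about t/Δ of them, so the total is approximately t regardless of how small Δ is.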
88 Graphical illustration of the distribution of XΔ(t)
89
- If t and h are fixed and Δ is sufficiently small, then
XΔ(t + h) − XΔ(t) = √Δ (Y⌊t/Δ⌋+1 + ⋯ + Y⌊(t+h)/Δ⌋).
90 Graphical distribution of the displacement XΔ(t + h) − XΔ(t)
- The random variable XΔ(t + h) − XΔ(t) is normally distributed with mean 0 and variance h, i.e.,
XΔ(t + h) − XΔ(t) ~ N(0, h).
91
- The variance of XΔ(t) depends on t, while the variance of XΔ(t + h) − XΔ(t) does not.
- If t1 ≤ t2 ≤ t3 ≤ t4, then XΔ(t2) − XΔ(t1) and XΔ(t4) − XΔ(t3) are independent random variables, since increments over disjoint intervals involve disjoint sets of steps.
93 Covariance and Correlation functions of XΔ(t)