Title: Lecture 11 Stochastic Processes
1. Lecture 11: Stochastic Processes
- Topics
- Definitions
- Review of probability
- Realization of a stochastic process
- Continuous vs. discrete systems
- Examples
2. Basic Definitions
- Stochastic process: A system that changes over time in an uncertain manner.
- Examples:
- Automated teller machine (ATM)
- Printed circuit board assembly operation
- Runway activity at airport
- State: Snapshot of the system at some fixed point in time.
- Transition: Movement from one state to another.
3. Elements of Probability Theory
Experiment: Any situation where the outcome is uncertain.
Sample Space, S: All possible outcomes of an experiment (we will call them the state space).
Event: Any collection of outcomes (points) in the sample space. A collection of events E1, E2, . . . , En is said to be mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j, i, j = 1, . . . , n.
Random Variable (RV): Function or procedure that assigns a real number to each outcome in the sample space.
Cumulative Distribution Function (CDF), F(·): Probability distribution function for the random variable X such that F(a) ≡ Pr{X ≤ a}.
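As a concrete illustration of the CDF definition, F(a) = Pr{X ≤ a} can be computed by counting outcomes. The random variable here (sum of two fair dice) is a hypothetical example chosen for this sketch, not one from the lecture:

```python
from fractions import Fraction

# Exact CDF of X = sum of two fair dice (illustrative example):
# F(a) = Pr{X <= a} = (number of favorable outcomes) / 36.
def dice_sum_cdf(a):
    outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
    favorable = sum(1 for d1, d2 in outcomes if d1 + d2 <= a)
    return Fraction(favorable, len(outcomes))

print(dice_sum_cdf(1))   # 0 -- no sum is <= 1
print(dice_sum_cdf(7))   # 7/12
print(dice_sum_cdf(12))  # 1 -- the CDF reaches 1 at the largest outcome
```

Note that F is nondecreasing and runs from 0 to 1, exactly as the definition requires.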
4. Components of a Stochastic Model
State: Describes the attributes of a system at some point in time; s = (s1, s2, . . . , sv). For the ATM example, s = (n), where n is the number of customers at the machine.
It is convenient to assign a unique nonnegative integer index to each possible value of the state vector. We call this X and require a distinct value of X for each state s. For the ATM example, X = n.
In general, Xt is a random variable.
5. Model Components (continued)
Activity: Takes some amount of time (its duration) and culminates in an event. For the ATM example, the activity is a service completion.
(Diagram notation: s = state, a = arrival, d = departure.)
Stochastic Process: A collection of random variables {Xt}, where t ∈ T = {0, 1, 2, . . .}.
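A minimal sketch of such a collection {Xt}: the ATM queue length evolving one step at a time. The arrival/departure dynamics below are an illustrative assumption for this sketch, not the lecture's specific model:

```python
import random

# One realization (sample path) of a simple discrete-time stochastic
# process: at each step the queue length n gains a customer with
# probability p, or loses one if the queue is nonempty (assumed dynamics).
def realize(p=0.5, steps=20, seed=42):
    random.seed(seed)
    n = 0
    path = [n]
    for _ in range(steps):
        if random.random() < p:
            n += 1            # arrival
        elif n > 0:
            n -= 1            # departure (service completion)
        path.append(n)
    return path

path = realize()
print(path)  # a single sample path X0, X1, ..., X20
```

Rerunning with a different seed gives a different realization of the same process, which is exactly the distinction the next two slides illustrate.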
6. Realization of the Process
Deterministic Process
7. Realization of the Process (continued)
Stochastic Process
8. Markovian Property
Given that the present state is known, the
conditional probability of the next state is
independent of the states prior to the present
state.
Present state at time t is i: Xt = i. Next state at time t + 1 is j: Xt+1 = j.
Conditional probability statement of the Markovian property:
Pr{Xt+1 = j | X0 = k0, X1 = k1, . . . , Xt = i} = Pr{Xt+1 = j | Xt = i}
for t = 0, 1, . . . and all possible sequences i, j, k0, k1, . . . , kt-1.
Interpretation: Given the present, the past is irrelevant in determining the future.
9. Transitions for Markov Processes
State space: S = {1, 2, . . . , m}
Probability of going from state i to state j in one move: pij
State-transition matrix: P = (pij)
Theoretical requirements: 0 ≤ pij ≤ 1 and Σj pij = 1 for i = 1, . . . , m
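These two requirements can be checked mechanically. A small sketch (the matrices below are made-up examples):

```python
def is_stochastic(P, tol=1e-9):
    """Check the requirements on a state-transition matrix:
    0 <= p_ij <= 1 for every entry, and every row sums to 1."""
    return all(
        all(0.0 <= p <= 1.0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

P = [[0.7, 0.3],
     [0.4, 0.6]]
print(is_stochastic(P))             # True
print(is_stochastic([[0.5, 0.6]]))  # False: row sums to 1.1
```

A tolerance is used for the row-sum test because floating-point entries rarely sum to exactly 1.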
10. Discrete-Time Markov Chain
- A discrete state space
- Markovian property for transitions
- One-step transition probabilities, pij, remain
constant over time (stationary)
Simple example: state-transition matrix and state-transition diagram.
11. Game of Craps
- Roll 2 dice.
- Outcomes:
  - Win: 7 or 11
  - Lose: 2, 3, 12
  - Point: 4, 5, 6, 8, 9, 10
- If a point is rolled, then roll again:
  - Win if the point comes up.
  - Lose if a 7 comes up.
  - Otherwise roll again, and so on.
- (There are other possible bets not included here.)
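The rules above can be simulated directly. A Monte Carlo sketch of the pass-line win probability (the exact value, 244/495 ≈ 0.4929, is a standard result for craps):

```python
import random

# Play one game of craps by the rules on this slide; return True on a win.
def play_craps(rng):
    roll = rng.randint(1, 6) + rng.randint(1, 6)
    if roll in (7, 11):
        return True          # win on the come-out roll
    if roll in (2, 3, 12):
        return False         # lose on the come-out roll
    point = roll
    while True:
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        if roll == point:
            return True      # made the point
        if roll == 7:
            return False     # seven out

rng = random.Random(1)
trials = 100_000
wins = sum(play_craps(rng) for _ in range(trials))
print(wins / trials)  # close to 0.4929
```

With 100,000 trials the estimate is typically within about ±0.005 of the exact value.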
12. State-Transition Network for Craps
13. Transition Matrix for the Game of Craps
Probability of a win = Pr{7 or 11} = 0.167 + 0.056 = 0.223
Probability of a loss = Pr{2, 3, or 12} = 0.028 + 0.056 + 0.028 = 0.112
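These figures can be verified by enumerating the 36 equally likely outcomes of two fair dice (the slide's 0.223 and 0.112 come from rounding each term; the exact totals are 2/9 ≈ 0.222 and 1/9 ≈ 0.111):

```python
from fractions import Fraction
from collections import Counter

# Exact come-out-roll probabilities from the 36 equally likely outcomes.
sums = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
total = 36
p_win = Fraction(sums[7] + sums[11], total)             # Pr{7 or 11}
p_loss = Fraction(sums[2] + sums[3] + sums[12], total)  # Pr{2, 3, 12}
print(p_win, float(p_win))    # 2/9
print(p_loss, float(p_loss))  # 1/9
```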
14. Examples of Stochastic Processes
Single-stage assembly process with a single worker and no queue.
State = 0: worker is idle; State = 1: worker is busy.
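For this two-state chain, the long-run fractions of time idle and busy follow from the balance equations. The transition probabilities below are illustrative assumptions, not values from the lecture:

```python
# Steady-state probabilities for the two-state (idle/busy) worker chain.
# p01 = Pr{idle -> busy} (a job arrives during the period),
# p10 = Pr{busy -> idle} (the job in service finishes) -- assumed values.
def two_state_steady(p01, p10):
    # Balance equation pi0 * p01 = pi1 * p10, with pi0 + pi1 = 1.
    pi0 = p10 / (p01 + p10)
    return pi0, 1.0 - pi0

pi_idle, pi_busy = two_state_steady(p01=0.3, p10=0.2)
print(pi_idle, pi_busy)  # 0.4 0.6
```

The busy fraction p01/(p01 + p10) is the worker's long-run utilization, a quantity the queuing examples on the next slides generalize.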
15. Examples (continued)
Multistage assembly process with a single worker and a queue (assume 3 stages only, i.e., 3 operations).
s = (s1, s2), where
Operations: k = 1, 2, 3
16. Queuing Model with Two Servers
s = (s1, s2, s3), where
i = 1, 2
17. Series System with No Queues
18. What You Should Know About Stochastic Processes
- What a state is.
- What a realization is (stationary vs. transient).
- What the difference is between a continuous and a discrete-time system.
- What the common applications are.
- What a state-transition matrix is.