Department of Electrical and Computer Engineering
1. Multilevel Solver for Continuous Markov Chains
- Daniel Chen
- Xiaolan Zhang
2. Outline
- Motivation
- Introduction to multilevel solver
- Gauss-Seidel algorithm (GS)
- Horton-Leutenegger Multilevel algorithm (ML)
- Our automatic aggregation algorithm (2AGG)
- Implementation
- Results
- Lecture notes example
- M/M/1/1000 queue example
- Molloy's closed queuing network example
- Conclusion
3. Motivation
- Markov systems generated by modeling tools generally have a large number of states.
- The steady-state solution must be computed numerically: Power, Gauss-Seidel, and SOR.
- Many iterations are required to reach convergence:
- when modeling a complex system
- when high precision is required
- when the transition rates have large variance
4. The idea of iteration
(Equation slide: an iteration matrix is defined, and the steady-state vector emerges as the fixed point of the resulting update.)
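Since the slide's equations themselves are not recoverable, here is a minimal sketch of the fixed-point idea, assuming (my choice for illustration) a discrete-time transition matrix P: the stationary vector satisfies pi = pi P, so repeatedly applying P converges to it.

```python
import numpy as np

def power_iteration(P, tol=1e-12, max_iter=100_000):
    """Iterate pi <- pi P until the update stops changing."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)          # start from the uniform distribution
    for _ in range(max_iter):
        pi_next = pi @ P
        if np.abs(pi_next - pi).max() < tol:
            return pi_next
        pi = pi_next
    return pi

# Two-state example; its stationary distribution is (5/6, 1/6).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = power_iteration(P)
```

The iteration count here is exactly the quantity the Motivation slide worries about: it grows when precision demands are high or the chain mixes slowly.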
5. Gauss-Seidel algorithm
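The deck does not carry the algorithm listing, so this is a minimal sketch assuming the standard formulation pi Q = 0 for a CTMC generator Q, with in-place (Gauss-Seidel) updates and per-sweep normalization:

```python
import numpy as np

def gauss_seidel(Q, sweeps=100):
    """Gauss-Seidel sweeps for pi Q = 0; updated entries are used immediately."""
    n = Q.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(sweeps):
        for j in range(n):
            # balance equation for state j: sum_i pi_i q_ij = 0
            inflow = pi @ Q[:, j] - pi[j] * Q[j, j]
            pi[j] = -inflow / Q[j, j]
        pi /= pi.sum()                 # renormalize each sweep
    return pi

# Two-state CTMC with rates 2 (0->1) and 1 (1->0); stationary pi = (1/3, 2/3).
Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])
pi = gauss_seidel(Q)
```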
6. GS Example
7. Multi-level algorithm
8. Multi-level Algorithm
- Solve directly (at the coarsest level)
- Smoothing by running GS
- Construct the upper-level steady-state distribution q
- Construct the upper-level rate transition matrix P
- Call MSS with the upper-level q and P
- Correct the result in the lower level
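The steps above can be sketched as a recursion. Everything below is a simplified reconstruction, not the deck's implementation: consecutive states are aggregated in pairs, nu GS sweeps do the smoothing, and the chain is solved directly once it has at most two states.

```python
import numpy as np

def gs_sweeps(Q, pi, nu):
    """nu Gauss-Seidel sweeps for pi Q = 0 (the smoother)."""
    pi = pi.copy()
    for _ in range(nu):
        for j in range(len(pi)):
            inflow = pi @ Q[:, j] - pi[j] * Q[j, j]
            pi[j] = -inflow / Q[j, j]
    return pi / pi.sum()

def mss(Q, pi, nu=2):
    """One multilevel iteration: smooth, aggregate, recurse, correct, smooth."""
    n = len(pi)
    if n <= 2:                                    # solve directly at the top
        if n == 1:
            return np.array([1.0])
        p = np.array([Q[1, 0], Q[0, 1]])
        return p / p.sum()
    pi = gs_sweeps(Q, pi, nu)                     # pre-smoothing
    groups = [list(range(k, min(k + 2, n))) for k in range(0, n, 2)]
    Pi = np.array([pi[g].sum() for g in groups])  # upper-level distribution q
    Qc = np.zeros((len(groups), len(groups)))     # upper-level rate matrix P
    for K, gK in enumerate(groups):
        w = pi[gK] / Pi[K]                        # conditional weights in group K
        for L, gL in enumerate(groups):
            Qc[K, L] = w @ Q[np.ix_(gK, gL)].sum(axis=1)
    Pi_new = mss(Qc, Pi, nu)                      # call MSS on the upper level
    for K, gK in enumerate(groups):               # correct in the lower level
        pi[gK] *= Pi_new[K] / Pi[K]
    return gs_sweeps(Q, pi, nu)                   # post-smoothing

# Birth-death chain (M/M/1/7): stationary pi_i proportional to (lam/mu)^i.
lam, mu, n = 1.0, 2.0, 8
Q = np.zeros((n, n))
for i in range(n - 1):
    Q[i, i + 1], Q[i + 1, i] = lam, mu
np.fill_diagonal(Q, -Q.sum(axis=1))

pi = np.full(n, 1.0 / n)
for _ in range(50):                               # outer MSS iterations
    pi = mss(Q, pi)
```

With exact conditional weights the coarse generator's row sums are zero and the exact solution is a fixed point of the iteration, which is what makes the correction step consistent.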
9. ML Example
Create the upper-level matrix
10. States aggregation
- Aggregation of states is essential to the success of the ML algorithm.
- Optimal aggregation makes the most of GS smoothing at each level and reduces the number of levels (recursions in MSS).
- Searching for the optimal strategy for grouping an arbitrary number of states can be difficult.
- We provide a heuristic aggregation criterion for grouping two states.
11. Two-state aggregation (2AGG)
- An automatic aggregation tool that analyzes the rate transition matrix.
- Groups states in units of two.
- Starts with the states of maximum out-degree.
- For each state, greedily search for a mate among all available states. The mate state should exhibit the following three properties relative to the given state:
- strong connection
- similar magnitude of transition rates
- preferably high transition rates
- Determine units one by one until the state space is exhausted.
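The deck names the three mate-selection criteria but not the scoring function, so the sketch below uses one plausible score of my own devising: candidates must communicate with the given state, similar forward/backward rate magnitudes score higher, and larger rates score higher.

```python
import numpy as np

def pair_states(R):
    """Greedy two-state aggregation over an off-diagonal rate matrix R."""
    n = R.shape[0]
    out_degree = (R > 0).sum(axis=1)
    order = sorted(range(n), key=lambda i: -out_degree[i])  # max out-degree first
    unpaired, pairs = set(range(n)), []
    for i in order:
        if i not in unpaired:
            continue
        unpaired.discard(i)
        best, best_score = None, 0.0
        for j in unpaired:
            f, b = R[i, j], R[j, i]
            if f == 0 and b == 0:
                continue                        # criterion 1: must communicate
            similarity = min(f, b) / max(f, b)  # criterion 2: similar magnitude
            score = (f + b) * (1 + similarity)  # criterion 3: prefer high rates
            if score > best_score:
                best, best_score = j, score
        if best is not None:
            unpaired.discard(best)
            pairs.append((i, best))
        else:
            pairs.append((i,))                  # leftover singleton
    return pairs

# Two strongly coupled pairs (0,1) and (2,3) with weak cross links.
R = np.array([[ 0.0, 10.0, 0.1, 0.0],
              [10.0,  0.0, 0.0, 0.1],
              [ 0.1,  0.0, 0.0, 5.0],
              [ 0.0,  0.1, 5.0, 0.0]])
pairs = pair_states(R)
```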
12. Mutual communication
- Similar magnitude
- Maximum rate
13. Implementation
- Linked adjacency list to save space; however, it penalizes ML speed. The matrix search function has been accelerated for ML.
- GS algorithm.
- ML algorithm with neighboring-state aggregation at all levels.
- 2AGG algorithm.
- Pre-processing function that permutes each state next to its mate state, providing better aggregation.
- Preprocessing is applied only at the first level; further improvement is expected if it is applied at all levels.
14. Comparison metrics and results
- A fair comparison between GS and ML should reflect the implementation potential of ML.
- Three metrics:
- Number of total GS iterations
  - For ML: number of MSS iterations x number of GS calls per iteration x GS iterations per call (v).
  - Disadvantages ML, because the matrix being smoothed is smaller at the upper levels.
- Number of floating-point operations (nflop)
  - Counts multiplications, divisions, and beyond.
  - Rough, but reflects the computational complexity of ML without memory overheads.
- CPU processing time
  - The fairest comparison for a particular implementation.
  - Offers little insight into possible overhead improvements.
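As a worked instance of the first metric, the product below uses hypothetical counts; the two GS calls per level assume one pre- and one post-smoothing pass, which the deck does not state explicitly.

```python
def ml_equivalent_gs_iterations(mss_iters, levels, v, calls_per_level=2):
    """MSS iterations x GS calls per iteration x GS sweeps per call (v)."""
    return mss_iters * (calls_per_level * levels) * v

# Hypothetical run: 10 MSS iterations over 5 levels, v = 2 sweeps per call.
total = ml_equivalent_gs_iterations(10, 5, 2)
```

This overstates ML's cost exactly as the slide notes, since upper-level sweeps run on much smaller matrices.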
15. Lecture notes example
- Shows the advantage of ML on a Markov chain with an obvious bottleneck in its transition rates.
- Shows the importance of aggregation choices.
16. Lecture notes example
17. Lecture notes example
18. Lecture notes example
19. M/M/1/1000 queue
- A birth-death chain with asymmetric rate flows.
- The 2AGG tool is not applicable.
- Shows how the death rate mu affects performance.
- Helps in understanding the different performance metrics.
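To make the model concrete, here is a sketch that builds the M/M/1/K generator and checks the closed-form steady state pi_i proportional to (lam/mu)^i; K = 10 is used instead of 1000 purely for illustration.

```python
import numpy as np

def mm1k_generator(lam, mu, K):
    """Generator of the birth-death chain for an M/M/1/K queue."""
    n = K + 1                        # states = 0..K customers in the system
    Q = np.zeros((n, n))
    for i in range(K):
        Q[i, i + 1] = lam            # birth (arrival)
        Q[i + 1, i] = mu             # death (service completion)
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

lam, mu, K = 1.0, 2.0, 10
Q = mm1k_generator(lam, mu, K)
pi = (lam / mu) ** np.arange(K + 1)  # geometric closed form
pi /= pi.sum()
```

The asymmetry the slide refers to is the lam versus mu imbalance: the further mu departs from lam, the more skewed the stationary distribution becomes.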
20. M/M/1/1000 queue
21. M/M/1/1000 queue
22. M/M/1/1000 queue
23. Molloy's closed queuing network
- Shows how growth of the state space affects performance.
- The number of states is changed via the number of tokens initially put at place P.
- ML-p is the ML version integrated with 2AGG at the first level.
24. Molloy's closed queuing network
25. Molloy's closed queuing network
26. Conclusion and future work
- The aggregation strategy is very important.
- The results are mostly case-by-case.
- The overhead cost of ML impairs its overall performance.
- The challenge lies in the demand for intelligent tools that improve ML while controlling overhead cost.
- Future work:
- Determine the optimal number of GS smoothing sweeps at each level.
- Extend the aggregation tool to arbitrary state grouping at all levels.
- Improve the Q-matrix data structure to reduce memory-access overheads.
27. References
- G. Horton and S. Leutenegger, "A Multi-Level Solution Algorithm for Steady-State Markov Chains."
- W. H. Sanders, Lecture Notes for ECE/CS 541 and CSE 524, Computer System Analysis.
- PERFORM group, Mobius User Manual.