Title: Compact State Machines for High Performance Pattern Matching
1. Compact State Machines for High Performance Pattern Matching
Electrical and Computer Engineering Department
Design Automation Conference, San Diego, CA, June 2007
2. NIDS and Pattern Matching
- Network Intrusion Detection System (NIDS)
- Inspects packet headers and payloads
- Detects computer viruses, worms, spam, etc.
- Network intrusion detection applications: Bro, Snort, etc.
- Pattern matching system requirements
- Match multiple predefined patterns (keywords, or strings) at the same time
- Patterns can be of any size
- Patterns can appear anywhere in the payload of a packet
- Matching at line speed
- Flexibility to accommodate new sets of patterns
3. Our Contributions
- We propose novel schemes to reduce the memory required to store bit-level state machines for pattern matching
- Memory Partitioning
- State Re-labeling
- We present architectures supporting the schemes
- We evaluate our proposed schemes using a realistic pattern set (Snort)
4. Aho-Corasick and State Machine
- A set of patterns: he, her, him, his
Edges back to state 0 are not shown. Edges back to state 1 are shown as dashed lines. (A sketch of the construction follows below.)
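The following is a minimal sketch (illustrative code, not the authors' implementation; the sample input string is made up) of how the Aho-Corasick machine for this pattern set is built and used: insert the patterns into a keyword trie, compute the back (failure) edges breadth-first, then scan the payload byte by byte.

    #include <cstdio>
    #include <map>
    #include <queue>
    #include <string>
    #include <vector>

    struct State {
        std::map<char, int> next;   // forward (goto) edges of the keyword trie
        int fail = 0;               // back edge taken when no forward edge fits
        std::vector<int> out;       // patterns that end in this state
    };

    int main() {
        const std::vector<std::string> patterns = {"he", "her", "him", "his"};
        std::vector<State> st(1);                 // state 0 is the root

        // 1. Insert every pattern into the trie.
        for (int id = 0; id < (int)patterns.size(); ++id) {
            int s = 0;
            for (char c : patterns[id]) {
                if (!st[s].next.count(c)) {
                    st[s].next[c] = (int)st.size();
                    st.emplace_back();
                }
                s = st[s].next[c];
            }
            st[s].out.push_back(id);
        }

        // 2. Compute the back (failure) edges breadth-first.
        std::queue<int> q;
        for (auto& [c, s] : st[0].next) q.push(s);   // root children fall back to 0
        while (!q.empty()) {
            int u = q.front(); q.pop();
            for (auto& [c, v] : st[u].next) {
                int f = st[u].fail;
                while (f && !st[f].next.count(c)) f = st[f].fail;
                st[v].fail = (st[f].next.count(c) && st[f].next[c] != v)
                                 ? st[f].next[c] : 0;
                // inherit matches reachable through the back edge
                for (int id : st[st[v].fail].out) st[v].out.push_back(id);
                q.push(v);
            }
        }

        // 3. Scan an input: follow forward edges, fall back on back edges.
        const std::string text = "this is her hat";
        int s = 0;
        for (int i = 0; i < (int)text.size(); ++i) {
            while (s && !st[s].next.count(text[i])) s = st[s].fail;
            if (st[s].next.count(text[i])) s = st[s].next[text[i]];
            for (int id : st[s].out)
                std::printf("\"%s\" ends at position %d\n", patterns[id].c_str(), i);
        }
    }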
5. Memory Model of the State Machine
- Snort (April 2004): 1,976 string patterns
- > 18,000 states
- 256 next-state pointers per state, each 15 bits wide
- 8.5 MB in total: too expensive for fast SRAM (see the estimate below)
- Space is wasted storing every possible next-state pointer
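As a rough back-of-the-envelope check of the 8.5 MB figure, using the numbers on this slide (the exact state count is slightly above 18,000):

    18,000 states x 256 next-state pointers/state x 15 bits/pointer
      = 69,120,000 bits ~ 8.6 MB

which is far too large to hold in fast on-chip SRAM.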
6. Bit-level State Machine
- 8 bit-level state machines are required, one per bit of the input byte
- Proposed by Tan and Sherwood (a sketch of the idea follows below)
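The following is a sketch of the bit-split scheme proposed by Tan and Sherwood; the code, structure names, and sample input are my own, for illustration only. The byte-level Aho-Corasick machine is projected onto each of the 8 bit positions, giving 8 binary state machines with only 2 next-state pointers per state. A pattern is reported only when all 8 machines agree, i.e. when the bitwise AND of their partial match vectors (PMVs) is non-zero.

    #include <array>
    #include <cstdint>
    #include <cstdio>
    #include <map>
    #include <queue>
    #include <set>
    #include <string>
    #include <vector>

    // Dense byte-level Aho-Corasick DFA: delta[s][c] = next state, match[s] =
    // bitmask of the patterns that end in state s.
    static void build_ac(const std::vector<std::string>& pats,
                         std::vector<std::array<int, 256>>& delta,
                         std::vector<uint32_t>& match) {
        std::vector<std::map<unsigned char, int>> trie(1);
        std::vector<int> fail(1, 0);
        match.assign(1, 0);
        for (size_t id = 0; id < pats.size(); ++id) {
            int s = 0;
            for (unsigned char c : pats[id]) {
                if (!trie[s].count(c)) {
                    trie[s][c] = (int)trie.size();
                    trie.emplace_back(); fail.push_back(0); match.push_back(0);
                }
                s = trie[s][c];
            }
            match[s] |= 1u << id;
        }
        delta.assign(trie.size(), std::array<int, 256>{});
        std::queue<int> q;
        for (auto& [c, s] : trie[0]) { delta[0][c] = s; q.push(s); }
        while (!q.empty()) {
            int u = q.front(); q.pop();
            match[u] |= match[fail[u]];        // inherit matches via back edge
            delta[u] = delta[fail[u]];         // default: follow the back edge
            for (auto& [c, v] : trie[u]) {
                fail[v] = delta[fail[u]][c];
                delta[u][c] = v;
                q.push(v);
            }
        }
    }

    // One bit-level machine, built by projecting the byte DFA onto one bit
    // position (a subset construction over the two input values 0 and 1).
    struct BitMachine {
        std::vector<std::array<int, 2>> next;  // only 2 next-state pointers/state
        std::vector<uint32_t> pmv;             // partial match vector per state
    };

    static BitMachine split_bit(const std::vector<std::array<int, 256>>& delta,
                                const std::vector<uint32_t>& match, int bit) {
        BitMachine m;
        std::vector<std::set<int>> sets = {{0}};  // each state = a set of AC states
        std::map<std::set<int>, int> id = {{sets[0], 0}};
        m.next.push_back({{0, 0}});
        m.pmv.push_back(match[0]);
        for (size_t i = 0; i < sets.size(); ++i) {
            for (int v = 0; v < 2; ++v) {
                std::set<int> nxt;
                uint32_t pmv = 0;
                for (int s : sets[i])
                    for (int c = 0; c < 256; ++c)
                        if (((c >> bit) & 1) == v) {
                            nxt.insert(delta[s][c]);
                            pmv |= match[delta[s][c]];
                        }
                if (!id.count(nxt)) {
                    id[nxt] = (int)sets.size();
                    sets.push_back(nxt);
                    m.next.push_back({{0, 0}});
                    m.pmv.push_back(pmv);
                }
                m.next[i][v] = id[nxt];
            }
        }
        return m;
    }

    int main() {
        const std::vector<std::string> pats = {"he", "her", "him", "his"};
        std::vector<std::array<int, 256>> delta;
        std::vector<uint32_t> match;
        build_ac(pats, delta, match);
        std::array<BitMachine, 8> bm;
        for (int b = 0; b < 8; ++b) bm[b] = split_bit(delta, match, b);

        std::array<int, 8> state{};              // all 8 machines start in state 0
        const std::string text = "this is her hat";
        for (size_t i = 0; i < text.size(); ++i) {
            uint32_t agree = ~0u;
            for (int b = 0; b < 8; ++b) {        // each machine consumes one bit
                state[b] = bm[b].next[state[b]][((unsigned char)text[i] >> b) & 1];
                agree &= bm[b].pmv[state[b]];    // intersect partial match vectors
            }
            for (size_t id = 0; id < pats.size(); ++id)
                if (agree & (1u << id))
                    std::printf("\"%s\" ends at byte %zu\n", pats[id].c_str(), i);
        }
    }

Each bit-level machine alone may be in a state compatible with several patterns; it is the intersection of the 8 partial match vectors that pins down the actual matches.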
7. Memory Model of the Bit-level State Machine
- Snort (April 2004): 1,976 string patterns
- 2 next-state pointers per state, each 9 bits wide
- The pattern set is divided into many subsets (> 200 subsets)
- 512 states per subset
- Keyword-ID width: 16 bits per subset
- 1.6 MB in total (a sketch of the per-state layout follows below)
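For comparison with the byte-level model above, a state entry under these parameters can be packed roughly as follows. The field layout below is my own illustration; the slide only specifies the widths.

    #include <cstdint>
    #include <cstdio>

    // One state of one bit-level machine: two 9-bit next-state pointers
    // (enough to address 512 states per subset) plus a 16-bit keyword-ID /
    // partial match vector, i.e. 34 bits instead of 256 x 15 bits per state.
    struct BitStateEntry {
        uint64_t next0 : 9;   // next state if the consumed input bit is 0
        uint64_t next1 : 9;   // next state if the consumed input bit is 1
        uint64_t pmv   : 16;  // one bit per keyword handled by this subset
    };

    int main() {
        std::printf("payload per state: %d bits (padded to %zu bytes in C++)\n",
                    9 + 9 + 16, sizeof(BitStateEntry));
    }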
8. Architecture of the Bit-level State Machine
Snort (April 2004): K = 1976 patterns
k = 8 (up to 256), d = 11, h = 3 (up to 8), N = 247 (up to 8)
9. SPE: State Processing Element
Nst = number of states in each subset; c = lg(Nst)
Nst = 256 (up to 4096), c = 8 (up to 12)
10-11. Memory Matrix
Back edges are not shown.
12-14. How does it work?
[Figures: a step-by-step example of state transitions through the memory matrix.]
15. Motivation
16. Memory Partitioning and State Re-labeling
Back edges are not shown.
17. SPE: State Processing Element (original)
Nst = number of states in each subset; c = lg(Nst)
Nst = 256 (up to 4096), c = 8 (up to 12)
18. Architecture: k-square
c = 8 (up to 12), a = 3 (up to 8)
19. Memory Reduction
20. Conclusion
- We propose two new schemes to reduce the memory consumption of state machines: Memory Partitioning and State Re-labeling.
- We present an architectural design using the proposed schemes.
- The experimental results show a significant reduction (up to 80%) in memory consumption.
21. Future Work
- Implement the architectural design in full-custom VLSI and on FPGA.
- Estimate the area, energy consumption, and throughput of the design.
- Develop new schemes to further reduce memory consumption.
- Apply the design to regular expressions.
22. Thank you
Questions?
- Piti_Piyachon_at_student.uml.edu
- http://cans.uml.edu