Compact State Machines for High Performance Pattern Matching

1
Compact State Machines for High Performance
Pattern Matching
  • Piti Piyachon
  • Yan Luo

Electrical and Computer Engineering Department
Design Automation Conference San Diego, CA, June
2007
2
NIDS and Pattern Matching
  • Network Intrusion Detection System (NIDS)
  • Inspects packet headers and payloads
  • Detects computer viruses, worms, spam, etc.
  • Network intrusion detection applications: Bro,
    Snort, etc.
  • Pattern matching system requirements:
  • Match multiple predefined patterns (keywords,
    or strings) at the same time
  • Patterns can be any size
  • Patterns can appear anywhere in the payload of
    a packet
  • Matching at line speed
  • Flexibility to accommodate new sets of patterns

3
Our Contributions
  • Propose novel schemes to reduce the memory
    required to store bit-level state machines
    for pattern matching
  • Memory Partitions
  • State Re-labeling
  • Present architectures supporting the schemes
  • Evaluate our proposed schemes using a realistic
    pattern set (Snort)

4
Aho-Corasick and State Machine
  • A set of patterns
  • he, her, him, his

Edges back to state 0 are not shown. Edges back
to state 1 are shown as dashed lines.
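The automaton sketched on this slide can be written out in code. Below is a minimal Aho-Corasick construction for the slide's pattern set {he, her, him, his}; the function and variable names are ours, not the authors'. Each state carries a goto table, a failure link (the slide's back edges), and an output set.

```python
from collections import deque

def build_aho_corasick(patterns):
    """Build goto/failure/output tables: a trie plus failure links,
    as in the slide's diagram."""
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:                     # 1. build the trie
        s = 0
        for ch in pat:
            if ch not in goto[s]:
                goto[s][ch] = len(goto)
                goto.append({}); fail.append(0); out.append(set())
            s = goto[s][ch]
        out[s].add(pat)
    queue = deque(goto[0].values())          # 2. BFS to set failure links
    while queue:
        s = queue.popleft()
        for ch, t in goto[s].items():
            queue.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]           # inherit matches from the suffix

    return goto, fail, out

def search(text, goto, fail, out):
    """Scan text once, reporting (start_index, pattern) for every match."""
    s, hits = 0, []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]                      # follow a back edge
        s = goto[s].get(ch, 0)
        hits.extend((i - len(p) + 1, p) for p in out[s])
    return hits
```

For example, `search("ushers", *build_aho_corasick(["he", "her", "him", "his"]))` reports both "he" and "her" ending at the same position, which is exactly what the shared trie prefix and output inheritance buy over matching each pattern separately.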
5
Memory Model of State Machine
  • Snort (April 2004): 1976 string patterns
  • > 18,000 states
  • 256 next-state pointers per state, each 15 bits
    wide
  • 8.5 MB: too expensive for fast SRAM
  • Space is wasted storing every possible
    next-state pointer
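The 8.5 MB figure follows directly from the numbers on this slide; a quick back-of-the-envelope check (the state count is only given as "> 18,000", so the result is approximate):

```python
states = 18_000           # "> 18,000 states" for the Snort (April 2004) set
fanout = 256              # one next-state pointer per possible input byte
pointer_bits = 15         # 15-bit pointers (2**15 = 32768 >= 18,000 states)

total_bytes = states * fanout * pointer_bits // 8
print(total_bytes / 1e6)  # about 8.6 MB, in line with the slide's ~8.5 MB
```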

6
Bit-level State Machine
8 bit-level state machines are required!
Proposed by Tan and Sherwood
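The slide names the bit-split technique of Tan and Sherwood: the byte-based automaton is split into 8 binary machines, machine b consuming only bit b of each input byte, and a pattern counts as matched only where all 8 machines agree. The sketch below is our reconstruction of that idea via subset construction, not the authors' code; the tiny byte-level DFA builder exists only to make the example self-contained.

```python
def byte_dfa(patterns):
    """Tiny byte-level matcher DFA: states are pattern prefixes; each state
    stores one next-state pointer per input byte (the costly layout)."""
    prefs = sorted({p[:i] for p in patterns for i in range(len(p) + 1)}, key=len)
    idx = {p: i for i, p in enumerate(prefs)}     # "" is state 0
    delta = [[0] * 256 for _ in prefs]
    out = [frozenset(p for p in patterns if pre.endswith(p)) for pre in prefs]
    for pre in prefs:
        for c in range(256):
            s = pre + chr(c)
            while s not in idx:   # longest suffix that is still a prefix
                s = s[1:]
            delta[idx[pre]][c] = idx[s]
    return delta, out

def bitsplit(delta, out, bit):
    """One binary machine that sees only the given bit of each input byte.
    A bit-machine state is the set of byte-DFA states consistent so far."""
    start = frozenset([0])
    idx, trans, pmv, work = {start: 0}, [[0, 0]], [frozenset()], [start]
    while work:
        S = work.pop()
        i = idx[S]
        pmv[i] = frozenset().union(*(out[q] for q in S))
        for b in (0, 1):
            T = frozenset(delta[q][c] for q in S for c in range(256)
                          if (c >> bit) & 1 == b)
            if T not in idx:
                idx[T] = len(trans)
                trans.append([0, 0]); pmv.append(frozenset()); work.append(T)
            trans[i][b] = idx[T]
    return trans, pmv  # 2 next-state pointers per state + partial-match vector

def match(text, machines):
    """Report a pattern only where all machines' match vectors intersect."""
    states, hits = [0] * len(machines), []
    for i, byte in enumerate(text.encode()):
        agree = None
        for b, (trans, pmv) in enumerate(machines):
            states[b] = trans[states[b]][(byte >> b) & 1]
            agree = pmv[states[b]] if agree is None else agree & pmv[states[b]]
        hits += [(i - len(p) + 1, p) for p in agree]
    return hits

delta, out = byte_dfa(["he", "his"])
machines = [bitsplit(delta, out, b) for b in range(8)]
```

Note that each bit-level state stores only 2 next-state pointers instead of 256, which is the entire source of the memory saving.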
7
Memory Model: Bit-level State Machine
  • Snort (April 2004): 1976 string patterns
  • 2 next-state pointers per state, each 9 bits
    wide
  • Divide the pattern set into many subsets (> 200
    subsets)
  • 512 states per subset
  • Keyword-ID width: 16 bits per subset
  • 1.6 MB
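Using this slide's numbers, the transition-pointer storage needed to consume one input byte shrinks sharply. A rough comparison (this deliberately ignores the keyword-ID vectors and the actual per-subset state counts, which is why the overall totals are 8.5 MB vs 1.6 MB rather than this raw ratio):

```python
# Byte-level machine: 256 next-state pointers of 15 bits per state.
byte_level_bits = 256 * 15
# Bit-level: 8 binary machines, each state holding 2 next-state pointers of
# 9 bits (2**9 = 512 states per subset); 8 machine-states consume one byte.
bit_level_bits = 8 * (2 * 9)
print(byte_level_bits, bit_level_bits, byte_level_bits // bit_level_bits)
# 3840 vs 144 bits of pointers per input byte: a ~26x reduction in pointers
```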

8
Architecture: Bit-level State Machine
Snort (April 2004): K = 1976 patterns
k = 8 (to 256), d = 11, h = 3 (to 8), N = 247 (to 8)
9
SPE: State Processing Element
Nst = number of states in each subset; c = lg(Nst)
Nst = 256 (to 4096), c = 8 (to 12)
10
Memory Matrix
Back edges are not shown.
11
Memory Matrix
Back edges are not shown.
12
How does it work?
13
How does it work?
14
How does it work?
15
Motivation
16
Memory Partition and State Re-labeling
Back edges are not shown.
17
SPE: State Processing Element
(original)
Nst = number of states in each subset; c = lg(Nst)
Nst = 256 (to 4096), c = 8 (to 12)
18
Architecture: k-square
c = 8 (to 12), a = 3 (to 8)
19
Memory Reduction
20
Conclusion
  • We propose two new schemes to reduce the memory
    consumption of state machines:
  • Memory Partition
  • State Re-labeling
  • We present an architectural design using the
    proposed schemes.
  • Experimental results show a significant
    reduction (up to 80%) in memory consumption.

21
Future Work
  • Implement the architectural design in
    full-custom VLSI and on FPGA.
  • Estimate the area, energy consumption, and
    throughput of the design.
  • Devise new schemes to further reduce memory
    consumption.
  • Apply the design to regular expression matching.

22
Thank you
Question?
  • Piti_Piyachon@student.uml.edu
  • http://cans.uml.edu