Title: How to Evaluate Network Intrusion Detection Systems
1. How to Evaluate Network Intrusion Detection Systems?
2. Outline
- Published IDS evaluations
- IDS Comparisons
- NSS IDS Group Test
- Carnegie Mellon Software Engineering Institute
- Massachusetts Institute of Technology
- IDS Evaluation Methodologies
- NFR Security
- University of California, Davis
- Criteria for Evaluating Network Intrusion Detection Systems
3. Published IDS Evaluations
- Evaluation
- Determination of the level to which a particular IDS meets specified performance targets.
- Comparison
- A process of 'comparing' two or more systems in order to differentiate between them.
- The majority of published documents claiming to evaluate IDSs are conducted as comparisons rather than evaluations.
4. IDS Comparisons (1/3)
- The NSS Group. (2001). Intrusion Detection Systems Group Test (Edition 2).
- This is a comprehensive report on 15 commercial and open-source Intrusion Detection Systems.
- Background traffic
- Generated with the Smartbits SMB6000 and Adtech AX/4000 monitor systems
- Advantage of this approach to background traffic
- The generators are capable of producing sufficient traffic to saturate the network (see the sketch below).
- Two disadvantages of this approach to background traffic
- Fewer false positives are generated than realistic traffic would provoke.
- The actual attacks differ significantly from the background traffic.
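The Smartbits/Adtech approach amounts to blasting synthetic packets at line rate. A minimal sketch of that idea, using only Python's standard library, with the subnet, port range and payload size as illustrative assumptions:

```python
# Minimal sketch of synthetic background traffic in the spirit of the
# Smartbits/Adtech generators: a stream of small UDP datagrams with valid
# addresses but random, meaningless payloads. The subnet, port range and
# payload size are illustrative assumptions, not the NSS configuration.
import os
import random
import socket

TARGET_NET = "192.168.1."      # hypothetical protected subnet
PAYLOAD = os.urandom(64)       # 64 bytes of random, attack-free content

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def blast(count: int) -> None:
    """Send `count` datagrams to random hosts/ports on the target subnet."""
    for _ in range(count):
        dst = TARGET_NET + str(random.randint(1, 254))
        sock.sendto(PAYLOAD, (dst, random.randint(1024, 65535)))

blast(100_000)
```

Because the payload carries no protocol content, such a stream rarely provokes false positives and looks nothing like the attack traffic layered on top of it, which is exactly the pair of disadvantages listed above.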
5. NSS Performance Testing (Edition 3)
- Network IDS Testing Procedure
- Test 1: Attack Recognition
- Test 2: Performance Under Load
- Test 3: IDS Evasion Techniques
- Test 4: Stateful Operation Test
6. NSS Test Environment
[Diagram: switched test network with Smartbits and WebAvalanche traffic generators, the device under test (DUT IDS) on a private subnet, an attacker host, target hosts running FreeBSD 4.5, Solaris 8, RedHat 7.2 and Win2000 SP2, an IDS console, and Symantec Ghost.]
7. Network Taps
8. WebAvalanche and WebReflector
9. NIDS Test 1: Attack Recognition
- Attack suite contains over 100 attacks covering the following areas:
- Application bugs
- Back Doors/Trojan/DDOS
- Denial of Service
- Finger
- FTP
- HTTP
- ICMP
- Mail (SMTP/POP3)
- Malicious Data Input
- Reconnaissance
- SNMP
- SANS Top 20
10. NIDS Test 1: Results
11. NIDS Test 2: Performance Under Load
- Use boping / bosting tools.
- Small (64 byte UDP) packets with valid source/destination IP addresses and ports (see the rate calculation sketched below)
- 25 per cent network utilisation (37,000 pps)
- 50 per cent network utilisation (74,000 pps)
- 75 per cent network utilisation (111,000 pps)
- 100 per cent network utilisation (148,000 pps)
- Real-world packet mix
- 25 per cent network utilisation (10,000 pps - 60 conns/sec)
- 50 per cent network utilisation (20,000 pps - 120 conns/sec)
- 75 per cent network utilisation (30,000 pps - 180 conns/sec)
- 100 per cent network utilisation (40,000 pps - 240 conns/sec)
- Large (1514 byte) packets containing valid payload and address data
- 25 per cent network utilisation (2,044 pps)
- 50 per cent network utilisation (4,088 pps)
- 75 per cent network utilisation (6,132 pps)
- 100 per cent network utilisation (8,176 pps)
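The utilisation figures map directly onto frame rates. A rough check, assuming the test network is 100 Mbit/s Ethernet and counting roughly 20 bytes of framing overhead per frame, reproduces the 64-byte figures almost exactly and lands within a few per cent of the 1514-byte figures:

```python
# Rough reproduction of the packets-per-second figures above, assuming a
# 100 Mbit/s link and about 20 bytes of Ethernet framing overhead per frame
# (preamble + inter-frame gap). Results are approximations only.
LINK_BPS = 100_000_000          # assumed 100 Mbit/s test network
FRAME_OVERHEAD_BYTES = 20       # preamble (8) + inter-frame gap (12)

def pps(frame_bytes: int, utilisation: float) -> int:
    """Frames per second at a given fraction of link utilisation."""
    bits_per_frame = (frame_bytes + FRAME_OVERHEAD_BYTES) * 8
    return int(LINK_BPS * utilisation / bits_per_frame)

for util in (0.25, 0.50, 0.75, 1.00):
    print(f"{util:4.0%}  64-byte: ~{pps(64, util):7,} pps   "
          f"1514-byte: ~{pps(1514, util):6,} pps")
```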
12. NIDS Test 2: Report
[Results shown for Internet Security Systems RealSecure 7.0, Cisco Secure IDS 4230 and Snort 1.8.6.]
13. NIDS Test 3: IDS Evasion Techniques (1/2)
- Using fragroute
- Ordered IP fragments of various sizes
- Out-of-order IP fragments of various sizes
- Duplicate fragments
- TCP segmentation overlap
- TCP and IP chaffing
- Whisker and Stealth Web Scanner
- URL encoding
- /./ directory insertion
- Premature URL ending
- Long URL
- Fake parameter
- TAB separation
- Case sensitivity
- Windows \ delimiter
- Session splicing
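A minimal sketch of why these Whisker-style tricks work against naive string matching; the signature and requests are illustrative, and only two of the listed obfuscations (URL encoding and /./ insertion) are modelled:

```python
# Naive substring matching versus a detector that normalises the URL first.
# The signature string and requests below are illustrative only.
from urllib.parse import unquote

SIGNATURE = "/cgi-bin/phf"          # hypothetical vulnerable CGI path

def naive_match(request_uri: str) -> bool:
    """Match the signature byte-for-byte against the raw request URI."""
    return SIGNATURE in request_uri

def normalised_match(request_uri: str) -> bool:
    """Decode %-escapes and collapse '/./' self-references before matching."""
    uri = unquote(request_uri)
    while "/./" in uri:
        uri = uri.replace("/./", "/")
    return SIGNATURE in uri

for uri in ("/cgi-bin/%70%68%66",   # URL encoding of 'phf'
            "/cgi-bin/./phf"):      # /./ directory insertion
    print(uri, naive_match(uri), normalised_match(uri))
```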
14. NIDS Test 3: IDS Evasion Techniques (2/2)
- Finally, we run several common exploits in their normal form, followed by the same exploits altered by various evasion techniques, including:
- RPC record fragging
- Inserting spaces in command lines
- Inserting non-text Telnet opcodes in the data stream (sketched below)
- Fragmentation
- Polymorphic mutation (ADMmutate)
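As a concrete illustration of the Telnet-opcode trick, the sketch below interleaves IAC NOP sequences (0xFF 0xF1) into a command string; the command and signature are made up for illustration, but the opcode values are standard Telnet:

```python
# Telnet IAC NOP sequences inserted between command characters are discarded
# by the Telnet server but defeat a byte-for-byte signature match on the wire.
# The command and signature are illustrative placeholders.
IAC_NOP = b"\xff\xf1"          # Telnet IAC (255) followed by NOP (241)
SIGNATURE = b"/bin/sh"

def evade(command: bytes) -> bytes:
    """Interleave IAC NOP between every byte of the command."""
    return IAC_NOP.join(bytes([b]) for b in command)

def telnet_strip(data: bytes) -> bytes:
    """What the Telnet server effectively sees after discarding IAC NOP."""
    return data.replace(IAC_NOP, b"")

wire = evade(b"exec /bin/sh")
print(SIGNATURE in wire)                # False: signature misses on the wire
print(SIGNATURE in telnet_strip(wire))  # True: the host still gets the command
```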
15. NIDS Test 3: Report
[Results shown for Internet Security Systems RealSecure 7.0, Cisco Secure IDS 4230 and Snort 1.8.6.]
16. NIDS Test 4: Stateful Operation Test
- Test 1
- Use Stick and Snot to generate large numbers of false alerts on the protected subnet.
- During the attack, we also launch a subset of our basic common exploits to determine whether the IDS sensor continues to detect and alert.
- The effect on the overall sensor performance and logging capability is noted.
- Test 2 (see the sketch below)
- Create an FTP session.
- Use the CAW WebAvalanche to open various numbers of TCP sessions, from 10,000 to 1,000,000.
- An exploit string is then sent over the original FTP session.
- If the IDS is still maintaining state on the first session established, the exploit will be recorded.
- If the state tables have been exhausted, the exploit string will be seen as a non-stateful attack and will thus be ignored.
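A minimal sketch of the Test 2 logic, assuming a hypothetical lab FTP server and a placeholder trigger string in place of the real exploit; WebAvalanche's role of opening the filler sessions is approximated with plain sockets:

```python
# Sketch of the state-table exhaustion test. TARGET and the trigger string
# are placeholders; descriptor limits must be raised for large session counts.
import socket

TARGET = ("192.0.2.10", 21)        # hypothetical FTP server in the test bed
SESSION_COUNT = 100_000            # NSS scaled this from 10,000 to 1,000,000

first = socket.create_connection(TARGET)    # the FTP session opened first
first.recv(1024)                            # consume the FTP banner

filler = [socket.create_connection(TARGET)  # flood of additional TCP sessions
          for _ in range(SESSION_COUNT)]    # meant to exhaust IDS state tables

# A string that is only suspicious in the context of the first session is now
# sent. A sensor still tracking that session logs it as a stateful attack; a
# sensor whose tables were exhausted treats it as stateless and ignores it.
first.sendall(b"RETR exploit-trigger\r\n")  # placeholder mid-session payload
```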
17. NIDS Test 4: Report
[Results shown for Internet Security Systems RealSecure 7.0, Cisco Secure IDS 4230 and Snort 1.8.6.]
18. IDS Comparisons (2/3)
- Allen, J., Christie, A., Fithen, W., McHugh, J., Pickel, J., Stoner, E. (2000). State of the Practice of Intrusion Detection Technologies. Carnegie Mellon Software Engineering Institute.
- This publication covers a wide range of issues facing intrusion detection technologies:
- Functionality
- Performance
- Implementation
- The document provides useful insights into important weaknesses of IDSs and a plethora of links to further information.
- It also includes a list of recommended IDS selection criteria as an appendix.
19. IDS Comparisons (3/3)
- Lippmann, R. P., Cunningham, R. K., Fried, D. J., Graf, I., Kendall, K. R., Webster, S. E., Zissman, M. A. (1999). Results of the DARPA 1998 Offline Intrusion Detection Evaluation. Slides presented at the RAID 1999 Conference, September 7-9, 1999, West Lafayette, Indiana.
- Haines, J. W., Lippmann, R. P., Fried, D. J., Korba, J., Das, K. (1999). The 1999 DARPA Off-Line Intrusion Detection Evaluation.
- Haines, J. W., Lippmann, R. P., Fried, D. J., Zissman, M. A., Tran, E., Boswell, S. B. (1999). DARPA Intrusion Detection Evaluation: Design and Procedures. Lincoln Laboratory, Massachusetts Institute of Technology.
- This series of publications is a combined research effort from Lincoln Laboratory, DARPA and the US Air Force.
- These combined publications refer to two comprehensive evaluations of IDSs and IDS technologies.
- These evaluations attempted to quantify specific performance measures of IDSs and test them against a background of realistic network traffic.
20. MIT Test Methodology
- Ability to detect new and stealthy attacks
- Ratio of attack detections to false positives (sketched below)
- Ability of anomaly detection techniques to detect new attacks
- Comparison of host-based vs. network-based systems in detecting different types of attacks
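A sketch of the two headline metrics behind such an evaluation, detection rate and false-alarm rate, computed from labelled alerts; the alert and ground-truth sets are illustrative:

```python
# One point on a detection-rate / false-alarm-rate curve, in the spirit of the
# Lincoln Laboratory evaluations. The event identifiers are illustrative.
def roc_point(alerts: set[str], attacks: set[str], total_benign_events: int):
    """Return (detection rate, false-alarm rate) for one sensor."""
    true_positives = len(alerts & attacks)
    false_positives = len(alerts - attacks)
    detection_rate = true_positives / len(attacks)
    false_alarm_rate = false_positives / total_benign_events
    return detection_rate, false_alarm_rate

alerts = {"a1", "a2", "b7"}            # events the IDS flagged
attacks = {"a1", "a2", "a3", "a4"}     # ground-truth attack events
print(roc_point(alerts, attacks, total_benign_events=10_000))
```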
21. IDS Evaluation Methodologies (1/2)
- Ranum, M. J. (2001). Experiences Benchmarking Intrusion Detection Systems. NFR Security.
- This article discusses a number of issues relating to techniques used to benchmark IDSs.
- Highly critical of many published IDS comparisons for their lack of understanding of IDS techniques, and thus of their ability to design appropriate testing methodologies.
- Discusses the various measures that can be, and have been, used to measure the performance of IDSs.
- Stresses the importance of using real-life traffic and attacks in the evaluation process, rather than simulated traffic and attacks.
22IDS Evaluation Methodologies (2/2)
- Puketza, N. Chung, M. Olsson, R, A. Mukherjee,
B. (1996). Simulating Concurrent Intrusions for
Testing Intrusion Detection Systems
Parallelizing Intrusions. University of
California, Davis. - Puketza, N. Zhang, K. Chung, M. Olsson, R, A.
Mukherjee, B. (1996). A Methodology for Testing
Intrusion Detection Systems. University of
California, Davis. - Puketza, N. Chung, M. Olsson, R, A. Mukherjee,
B. (1997). A Software Platform for testing
Intrusion Detection Systems. University of
California, Davis. - Puketza have developed a application to simulate
specific attacks against a target system. These
attacks can be scripted to run concurrently or in
a specific sequence. - The advantage of this methodology is that each
test can easily be repeated for each device under
test. - One disadvantage of this application is that it
does target older vulnerabilities in UNIX
systems, which should not apply to a current
operating system.
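A minimal sketch of the scripted, repeatable attack idea, with placeholder script names; the point is that the same suite can be replayed sequentially or concurrently against each device under test:

```python
# Sketch of scripted, repeatable attack runs in the spirit of the UC Davis
# platform. Script paths are placeholders; each returns its exit code.
import subprocess
from concurrent.futures import ThreadPoolExecutor

ATTACK_SCRIPTS = ["attacks/ftp_bounce.sh", "attacks/port_scan.sh"]  # placeholders

def run(script: str) -> int:
    """Launch one attack script against the device under test."""
    return subprocess.run(["sh", script], check=False).returncode

def run_in_sequence() -> list[int]:
    """Replay the suite in a fixed order."""
    return [run(s) for s in ATTACK_SCRIPTS]

def run_concurrently() -> list[int]:
    """Replay the suite with the attacks running in parallel."""
    with ThreadPoolExecutor(max_workers=len(ATTACK_SCRIPTS)) as pool:
        return list(pool.map(run, ATTACK_SCRIPTS))
```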
23. Criteria for Evaluating Network Intrusion Detection Systems
- Ability to identify attacks
- Known vulnerabilities and attacks
- Unknown attacks
- Relevance of attacks
- Stability, Reliability and Security
- Information provided to analyst
- Severity, potential damage
- Outcome of attack
- Legal validity of data collected
- Manageability
- Ease or complexity of configuration
- Possible configuration options
- Scalability and interoperability
- Vendor support
- Signature updates
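One simple way to turn criteria like these into a comparable result is a weighted scoring sheet; the weights, criterion groupings and scores below are made-up placeholders, not recommendations:

```python
# Illustrative weighted scoring sheet for the evaluation criteria above.
# Weights and candidate scores are placeholder values for demonstration.
CRITERIA_WEIGHTS = {
    "attack identification": 0.35,
    "stability/reliability/security": 0.20,
    "analyst information": 0.20,
    "manageability": 0.15,
    "vendor support": 0.10,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (0-10) into a single weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

candidate = {  # hypothetical scores for one IDS under evaluation
    "attack identification": 8,
    "stability/reliability/security": 7,
    "analyst information": 6,
    "manageability": 9,
    "vendor support": 5,
}
print(weighted_score(candidate))
```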
24. References
- http://www.sans.org/resources/idfaq/eval_ids.php
- http://www.nss.co.uk/
- http://www.sei.cmu.edu/publications/documents/99.reports/99tr028/99tr028abstract.html
- http://www.ll.mit.edu/IST/ideval/index.html
- http://www.raid-symposium.org/raid2001/program.html
- http://www.nfr.com