Title: Dependability benchmarking for transactional and web systems
Slide 1: Dependability benchmarking for transactional and web systems
- Henrique Madeira
- University of Coimbra, DEI-CISUC
- Coimbra, Portugal
Slide 2: Ingredients of a recipe to bake a dependability benchmark
- Measures
- Workload
- Faultload
- Procedure and rules (how to cook the thing)
- Dependability benchmark specification
  - Document-based only, or
  - Documents plus programs and tools (a toy sketch of the ingredients follows)
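As a toy illustration only (the names below are ours, not from any benchmark specification), the four ingredients can be pictured as one bundle:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class DependabilityBenchmark:
    """Toy model of the ingredients; field names are illustrative only."""
    measures: list[str]              # what is reported (e.g., "tpmC", "AvtC")
    workload: Callable[[], None]     # drives the system under benchmarking
    faultload: list[str]             # faults to inject during the run
    procedure: list[str] = field(default_factory=list)  # phases, slots, reporting rules
```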
Slide 3: Benchmark properties
- Representativeness
- Portability
- Repeatability
- Scalability
- Non-intrusiveness
- Easy to use
- Easy to understand
Slide 4: Benchmark properties, in practice
- A benchmark is always an abstraction of the real world!
- It is an imperfect and incomplete view of the world.
- Usefulness → improve things
- Agreement
Slide 5: The very nature of a benchmark
- Compare components, systems, architectures, configurations, etc.
- Highly specific: applicable/valid for a very well defined domain.
- Contributes to improving computer systems because you can compare alternative solutions.
- A real benchmark represents an agreement.
Slide 6: Three examples of dependability benchmarks for transactional systems
- DBench-OLTP (DSN 2003, VLDB 2003)
  - Dependability benchmark for OLTP systems (database-centric)
  - Provided as a document structured in clauses (like the TPC benchmarks)
- WEB-DB (SAFECOMP 2004)
  - Dependability benchmark for web servers
  - Provided as a set of ready-to-run programs and document-based rules
- Security benchmark, first step (DSN 2005)
  - Security benchmark for database management systems
  - Set of tests for database security mechanisms
Slide 7: The DBench-OLTP Dependability Benchmark
[Architecture diagram: the Benchmark Management System (BMS) applies the workload (through the RTE) and the faultload (through the FLE) to the System Under Benchmarking (SUB), i.e., the DBMS running on the OS; the BM component exchanges control data and collects results.]
Workload and setup adopted from the TPC-C performance benchmark.
Slide 8: Benchmarking Procedure
- Phase 1: baseline performance measures (TPC-C measures)
- Phase 2: performance measures in the presence of the faultload, plus dependability measures
Slide 9: Measures
- Baseline performance measures
  - tpmC: number of transactions executed per minute
  - $/tpmC: price per transaction
- Performance measures in the presence of the faultload
  - Tf: number of transactions executed per minute (with faults)
  - $/Tf: price per transaction (with faults)
- Dependability measures
  - AvtS: availability from the server point of view
  - AvtC: availability from the clients' point of view
  - Ne: number of data integrity errors
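As an illustration only (not part of the DBench-OLTP specification), a minimal sketch of how these measures could be derived from raw run counters; the RunCounters fields and the dict keys are assumptions:

```python
from dataclasses import dataclass

@dataclass
class RunCounters:
    """Raw counters from one benchmark run (hypothetical format)."""
    committed_txns: int           # transactions that committed successfully
    run_minutes: float            # duration of the measurement interval
    server_up_minutes: float      # time the server was able to provide service
    client_served_minutes: float  # time clients actually received service
    integrity_errors: int         # data integrity violations detected

def dbench_oltp_measures(baseline: RunCounters, faulty: RunCounters,
                         system_price: float) -> dict:
    """Derive DBench-OLTP style measures from a baseline run and a run
    executed in the presence of the faultload."""
    tpmC = baseline.committed_txns / baseline.run_minutes
    Tf = faulty.committed_txns / faulty.run_minutes
    return {
        "tpmC": tpmC,
        "$/tpmC": system_price / tpmC,
        "Tf": Tf,
        "$/Tf": system_price / Tf,
        "AvtS": 100.0 * faulty.server_up_minutes / faulty.run_minutes,
        "AvtC": 100.0 * faulty.client_served_minutes / faulty.run_minutes,
        "Ne": faulty.integrity_errors,
    }
```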
Slide 10: Faultload
- Operator faults
  - Emulate database administrator mistakes
- Software faults
  - Emulate software bugs in the operating system
- High-level hardware failures
  - Emulate hardware component failures
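Purely to make the idea concrete, a faultload can be written down as data; the fault names below are hypothetical, not the actual DBench-OLTP faults:

```python
# Hypothetical faultload entries: (fault class, fault name, slot length in minutes)
FAULTLOAD = [
    ("operator", "drop_wrong_table", 20),       # emulated DBA mistake
    ("operator", "kill_database_process", 20),
    ("software", "os_bug_emulation", 20),       # emulated OS software bug
    ("hardware", "disk_failure", 20),           # high-level hardware failure
]
```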
Slide 11: Examples of systems benchmarked
Slide 12: DBench-OLTP benchmarking results
Slide 13: DBench-OLTP benchmarking results
Slide 14: DBench-OLTP benchmarking results
Slide 15: Using DBench-OLTP to obtain more specific results
[Chart: availability variation during the benchmark run.]
Corresponds to about 32 hours of operation, during which the system was subjected to 97 faults. Each fault is injected in its own 20-minute injection slot (97 × 20 min ≈ 32 h), and the system is rebooted between slots.
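A minimal sketch of the injection-slot loop this procedure implies; inject, reboot_system, and observe are hypothetical stand-ins for the benchmark tooling:

```python
import time

SLOT_MINUTES = 20  # one fault per 20-minute injection slot

def run_injection_slots(faultload, inject, reboot_system, observe):
    """Inject one fault per slot and reboot between slots, as in the
    DBench-OLTP run described above. The three callables are stand-ins
    for the real benchmark tools."""
    for fault in faultload:
        start = time.monotonic()
        inject(fault)                                  # activate the fault
        while time.monotonic() - start < SLOT_MINUTES * 60:
            observe()                                  # sample availability
            time.sleep(1)
        reboot_system()                                # clean state for next slot
```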
Slide 16: DBench-OLTP benchmarking effort
Slide 17: The WEB-DB Dependability Benchmark
[Architecture diagram: the Benchmark Management System (BMS), built around a benchmark coordinator, drives the SPECWeb client workload, the fault injector, and the availability tester against the System Under Benchmarking (SUB), i.e., the web server running on the OS; control data and results flow back to the coordinator.]
Workload and setup adopted from the SPECWeb99 performance benchmark.
Slide 18: WEB-DB measures
- Performance degradation measures
  - SPECf: main SPEC measure in the presence of the faultload
  - THRf: throughput in the presence of the faultload (ops/s)
  - RTMf: response time in the presence of the faultload (ms)
- Dependability-related measures
  - Availability: percentage of time the server provides the expected service
  - Autonomy: percentage of times the server recovered without human intervention (an estimator of the server's self-healing abilities)
  - Accuracy: percentage of correct results yielded by the server
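A minimal sketch of how the three dependability measures reduce to percentages; the input shapes (lists of booleans) are assumptions for the example, not WEB-DB's actual data format:

```python
def webdb_dependability_measures(service_probes, fault_recoveries, results):
    """service_probes: one bool per availability probe (True = expected service)
    fault_recoveries: one bool per injected fault (True = recovered unaided)
    results: one bool per request (True = correct result returned)"""
    availability = 100.0 * sum(service_probes) / len(service_probes)
    autonomy = 100.0 * sum(fault_recoveries) / len(fault_recoveries)
    accuracy = 100.0 * sum(results) / len(results)
    return availability, autonomy, accuracy
```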
Slide 19: WEB-DB faultloads
- Network hardware faults
  - Connection loss (server sockets are closed)
  - Network interface failures (disable/enable the interface)
- Operator faults
  - Unscheduled system reboot
  - Abrupt server termination
- Software faults
  - Emulation of common programming errors
  - Injected in the operating system (not in the web server)
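For concreteness, a sketch of how the network-interface fault could be injected on a Windows SUB using netsh; the interface name is a placeholder, and this is our illustration, not the WEB-DB fault injector itself:

```python
import subprocess
import time

INTERFACE = "Local Area Connection"  # placeholder interface name

def network_interface_failure(seconds_down: float) -> None:
    """Emulate a transient network interface failure by disabling and then
    re-enabling the interface (requires administrative privileges)."""
    subprocess.run(["netsh", "interface", "set", "interface",
                    f"name={INTERFACE}", "admin=disabled"], check=True)
    time.sleep(seconds_down)  # keep the fault active for its slot
    subprocess.run(["netsh", "interface", "set", "interface",
                    f"name={INTERFACE}", "admin=enabled"], check=True)
```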
Slide 20: WEB-DB procedure
Benchmark procedure: two steps
- Step 1
  - Determine baseline performance (SUB benchmark tools running the workload without faults)
  - Tune the workload for a SPEC conformance of 100%
- Step 2
  - Three runs; each run comprises all faults specified in the faultload
  - Benchmark results are the average of the three runs (see the sketch below)
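The averaging in step 2 is straightforward; a sketch, where run_once is a hypothetical stand-in that executes one full faultload run and returns the measured values:

```python
from statistics import mean

def webdb_step2(run_once, n_runs: int = 3) -> dict:
    """Run the full faultload n_runs times and average each measure.
    run_once must return a dict such as {"SPECf": ..., "THRf": ..., "RTMf": ...}."""
    runs = [run_once() for _ in range(n_runs)]
    return {key: mean(run[key] for run in runs) for key in runs[0]}
```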
Slide 21: Examples of systems benchmarked
- Benchmark and compare the dependability of two common web servers
  - Apache web server
  - Abyss web server
- When running on
  - Windows 2000
  - Windows XP
  - Windows 2003
Slide 22: Availability, Accuracy, and Autonomy
[Charts: availability, accuracy, and autonomy results for Apache and Abyss on each Windows version.]
Slide 23: Performance in the presence of faults
[Charts: SPECf, THRf, and RTMf for Apache and Abyss on each Windows version.]
- Baseline performance: Apache 31, 26, 30; Abyss 28, 25, 24 (one value per Windows version)
- Performance degradation (%): Apache 55.4, 30.7, 62.3; Abyss 63.2, 45.2, 46.3
Slide 24: Security benchmark for database management systems
[Diagram: typical layered environment, with client applications and web browsers reaching the DBMS through networks, web servers, and application servers; the DBMS is the key layer.]
Slide 25: Security Attacks vs System Vulnerabilities
- Security attacks
  - Intentional attempts to access or destroy data
- System vulnerabilities
  - Hidden flaws in the system implementation
  - Features of the security mechanisms available
  - Configuration of the security mechanisms
Slide 26: Approach for the evaluation of security in DBMS
- Characterization of DBMS security mechanisms
- Our approach:
  1) Identification of data criticality levels
  2) Definition of database security classes
  3) Identification of security requirements for each class
  4) Definition of security tests for two scenarios:
     - Compare different DBMS
     - Help DBAs assess security in real installations
Slide 27: Database Security Classes
Slide 28: Requirements for DBMS Security Mechanisms
- Internal user authentication (username/password)
Slide 29: Measures and Scenarios
- Measures provided
  - Security Class (SCL)
  - Security Requirements Fulfillment (SRF)
- Potential scenarios
  - Compare different DBMS products
  - Help DBAs assess security in real installations
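A sketch of how SCL and SRF could be computed from test outcomes, assuming (our assumption; the benchmark defines the measures precisely) that SRF is the percentage of requirements fulfilled and SCL the highest class whose requirements are all met:

```python
def scl_and_srf(requirements_by_class: dict[int, list[bool]]) -> tuple[int, float]:
    """requirements_by_class maps a security class number (1 = lowest) to the
    outcomes of its requirement tests (True = fulfilled)."""
    scl = 0
    for cls in sorted(requirements_by_class):
        if all(requirements_by_class[cls]):
            scl = cls        # every requirement of this class is met
        else:
            break            # classes are assumed cumulative; stop at first gap
    fulfilled = sum(sum(tests) for tests in requirements_by_class.values())
    total = sum(len(tests) for tests in requirements_by_class.values())
    return scl, 100.0 * fulfilled / total
```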
Slide 30: Comparing DBMS Security
- Set of tests to verify whether the DBMS fulfills the security requirements
- Development of a database scenario
  - Database model (tables)
  - Data criticality levels for each table
  - Database accounts corresponding to the several DB user profiles
  - System and object privileges for each account
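One such test might check that a low-privilege account is denied access to a high-criticality table. A sketch against PostgreSQL (one of the DBMS benchmarked); the DSN and table name are placeholders from a hypothetical database scenario:

```python
import psycopg2  # PostgreSQL driver

def check_access_denied(dsn: str, table: str) -> bool:
    """Connect as a low-privilege account and verify that reading a
    higher-criticality table is denied. Returns True if the requirement
    (access denied) is fulfilled."""
    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            # table comes from the trusted test scenario, not user input
            cur.execute("SELECT * FROM {} LIMIT 1".format(table))
        return False  # query succeeded: requirement NOT fulfilled
    except psycopg2.errors.InsufficientPrivilege:
        return True   # access correctly denied
    finally:
        conn.close()
```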
Slide 31: Database scenario
[Diagram: example database schema with tables assigned data criticality levels 1 to 5.]
Slide 32: Example: Comparative Analysis of Two DBMS
- Oracle 9i vs PostgreSQL 7.3
Slide 33: Results Summary
- Oracle 9i does not fulfill all encryption requirements
  - 400% < performance degradation < 2700%
- PostgreSQL 7.3
  - Some manual configuration is required to achieve Class 1
  - High SRF for a Class 1 DBMS
  - Fulfills some Class 2 requirements
Slide 34: Conclusions
- Comparing components, systems, architectures, and configurations is essential to improve computer systems → benchmarks needed!
- Comparisons can be misleading → benchmarks must be carefully validated!
- Two ways of having real benchmarks:
  - Industry agreement
  - User community (tacit agreement)