Title: DB-22: Zero to 30,154 in Twenty Days
1 DB-22: Zero to 30,154 in Twenty Days
Tom Harris
Director, OpenEdge RDBMS Products
Kent Lipschultz
Technical Alliance Manager, HP
2 Agenda
- Planning
- Preparation
- Lab Work
- Analysis
- Lessons Learned
3 Planning: What's a Vendor Lab All About?
You Get What You Pay For ... or Borrow ...
4 HP Labs: HP's Enterprise Solution Alliances
5 HP Labs: HP's Enterprise Solution Alliances
HP ISV Technical Services
Kent Lipschultz (HP, St. Paul, Minnesota), Technical Alliance Manager, 651-982-9794, kent.lipschultz_at_hp.com
DAL Lab
Direct Access to all other HP Labs
Equipment Resources
Benchmarking Centers
6 HP Labs: HP's Enterprise Solution Alliances
Developers Alliance Lab (DAL), HP Labs, Cupertino, California
- Outbound / Inbound
- Benchmarking with OpenEdge (performance sizing)
- Performance optimization of OpenEdge on HP (all platforms)
- Technology integration (i.e. MC/SG, OV, VSE, SOA)
- Progress R&D support when all other channels fail
- Progress client support when all other channels fail
Partner Technology Access Centers (PTAC): 3 in the U.S. (NJ, MA, TX), 1 in the UK, 1 in India
- Technology-specific expertise (porting, on-site working sessions)
Global Solution Centers (GSC): nine centers throughout the world
- Client-specific POC testing
7 HP Labs: HP's Enterprise Solution Alliances
- Position HP as the preferred technology provider for end-to-end solutions for Progress customers.
- Ensure tight integration of solutions with relevant HP technologies (servers, OS, management, high availability, storage, etc.).
- Ensure that solutions are optimized and benchmarked on HP hardware.
- Drive emerging technologies into the Progress market.
8 HP Labs: Optimized and Benchmarked
Discussion topics: Real-world needs are a mix of real workloads. Is there such a thing as a common and standard OpenEdge implementation? What do Progress ISV partners already know about their application characteristics?
- Query-driven OpenEdge (i.e. batch)
- Transaction-driven OpenEdge (i.e. OLTP)
How important is real implementation performance?
9 HP Labs: Reserving Equipment and Set Up
- What happens on the vendor side of this?
- CC: scenarios, goals, hardware and software requirements, loading, data collection
- Timeline (target start date, run time)
- Documentation and collateral
- How long does that usually take?
- Is this part of the lab time interval? (yes)
- When do we get at the system? What do we do?
10 Plan: YOUR Business/Technical Goals
- Vendor lab visits
- Progress is the partner
- It's the partner that sponsors the visit
- A clear goal and a good plan really help!
- Expect an initial conference about a visit
- Equipment specs freeze a month before a visit
- There's a LOT of work to do before then!
11 Plan: YOUR Business/Technical Goals
- Why should we do this?
- Can we support more business?
- How long would a hardware upgrade last us?
- Can we really support a lot more users?
- Where are our bottlenecks?
- How do we go about testing on a big system?
- What should the deliverables be?
12 Plan: Configurations Considered
- Main test system
  - Model, CPU count
  - Memory
  - Storage and setup needs
  - Network bandwidth
- Driver system(s)
  - CPU, memory, local storage, network
  - Common storage with the main system?
- Special test system requirements?
13 Plan: What Do I Vary?
- Change only one thing at a time
- Ramp up client count or DB size? Or?
- We chose clients, holding memory constant
- Then we did another run, with bigger memory steps
- What is the key criterion? tps, commits, time/work?
- What's a good test run?
  - Clear the file system cache
  - 10 runs per data point, discarding the high and low
  - Immediately save logfiles and promon data!
  - Minimal time to reset for the next test run ...
Script everything you can!
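The "10 runs per data point, discard high and low" rule is easy to script. A minimal sketch: `run_one_test` is a hypothetical stand-in for the real test driver, and the canned tps values only exist to make the sketch self-contained.

```shell
#!/bin/sh
# Sketch of one data point: N runs, drop the single highest and lowest,
# average the rest. run_one_test is a stand-in for the real driver;
# in a real run you would clear the file system cache before each call.
run_one_test() {
    echo "$1"
}
RUNS_FILE=tps.dat
: > "$RUNS_FILE"
for tps in 101 98 104 99 100 97 102 100 103 96; do
    run_one_test "$tps" >> "$RUNS_FILE"
done
# Drop high and low (first and last after a numeric sort), average the rest.
sort -n "$RUNS_FILE" | sed '1d;$d' | \
    awk '{ s += $1 } END { printf "avg tps: %.1f\n", s / NR }'
rm -f "$RUNS_FILE"
```

With the canned values this prints `avg tps: 100.0`; the same `sort | sed | awk` pipeline works unchanged on a real results file.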
14 Example Test Scenarios
15 Plan: Remote Access
- What can't you do remotely?
- Will you physically visit the lab?
- How will you get programs and data there?
- How long do you need to set up?
- Will you need sources there? Dev environment?
- How will you drive your application? Mercury?
- Does your test database need secure deletion?
- How will you get your test results home?
Pre-package everything you can, and test it!
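Pre-packaging can include a checksum so a bad transfer is caught before any lab time is spent. A minimal sketch, with illustrative file names:

```shell
#!/bin/sh
# Sketch: package the test kit and carry a checksum so the transfer can
# be verified at the lab before unpacking. Names are illustrative.
KIT=testkit
mkdir -p "$KIT"
echo "demo payload" > "$KIT/app.r"      # stand-in for r-code, scripts, db dump
tar cf - "$KIT" | gzip > testkit.tar.gz
cksum testkit.tar.gz > testkit.cksum    # ship the .cksum file alongside
# At the lab: recompute and compare before unpacking.
cksum testkit.tar.gz | diff - testkit.cksum >/dev/null && echo "archive OK"
rm -rf "$KIT" testkit.tar.gz testkit.cksum
```

POSIX `cksum` is used here because it exists on essentially every Unix; any stronger digest tool available on both ends works the same way.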
16 Plan: Initial Game Plan
- Goal is clear, key criteria are clear
- Equipment list agreed upon
- Storage and network setup defined
- OS release, patch level, and special tools defined
- OpenEdge release and SP defined
- Test system topology confirmed by the vendor
- Method for driving the test system looks OK
- Number of test data points and of runs agreed
- Plan in place to get software and db lab-ready
- Plan in place to get results from the lab
17 Prep: Documentation
- Test system topology (HW and SW)
- Test data points per run, and number of complete runs
  - Client range: 1K, 5K, 10K, 15K, 20K, 25K, 30K
  - Memory (shmsegsize): 1 GB, 16 GB
- Key measurement: tps? commits? orders processed?
- Naming conventions for results and result storage
- Initial setup scripts (sw and db at the lab)
- Test run setup scripts
- Test run execution and save-off scripts
- Final data collection scripts
18 Prep: Scripting
- Lab setup: backup tape vs. gzip to DVD(s) vs. VPN
- Lab setup: OpenEdge install, property files, source/r-code
- Lab setup: gather scripts, etc.
- What starts a test run? Do all clients start, or ramp up?
- Several OpenEdge installs? Which one do you run first?
- How do you parameterize a test run? (and record it?)
- Should the vendor gather OS-level data too?
- When is a test run done? Transaction count, time, or logout?
- It is very common to mix up test result files.
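One answer to "how do you parameterize a test run, and record it?" is to write the parameters into the run's own result directory before the run starts, so results can never be separated from their settings. A sketch; the directory layout and parameter names are illustrative:

```shell
#!/bin/sh
# Sketch: every run gets its own stamped directory, with the parameters
# recorded up front so result files cannot be mixed up later.
CLIENTS=${1:-1000}
BUFFERS=${2:-3145728}     # mirrors the -B value used in this deck
STAMP=`date +%Y%m%d_%H%M%S`
RUNDIR="results/run_${CLIENTS}c_${STAMP}"
mkdir -p "$RUNDIR"
cat > "$RUNDIR/params.txt" <<EOF
clients=$CLIENTS
B=$BUFFERS
started=$STAMP
EOF
echo "run directory: $RUNDIR"
# ... launch the run here, writing logs and promon output into $RUNDIR ...
rm -rf results
```

The same directory then receives the logfiles and promon save-offs, so the "which run was this?" question answers itself.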
19 Prep: Test Runs (at home)
- Start with a machine that only has the OS
- Try your scripts in order (you will find bugs)
- Setup OK?
- OpenEdge setup OK?
- Do 5 test runs; how much do the results vary?
- Result archiving OK?
- Could someone else repeat this test?
20 Prep: Analysis Check
- Criteria are known
- Results have been obtained
- Now what? Try a brief analysis ...
- Start with goals, config, test description
- Reduce the data: drop high/low, average the rest?
- Plot the points
- Do these make sense? What do they say?
- Do you also need OS info (I/O, pfault, etc.)?
- Adjust the plan, if you need more info
21 Prep: Revised Game Plan
- Goal is clear, key criteria are clear
- Equipment list, storage and network setup OK
- OS release, patch level, and special tools defined
- OS monitoring requirements documented for lab use
- OpenEdge release and SP defined
- Method for driving the test system looks OK
- Additional promon data collection scripted
- Number of test data points and of runs agreed
- Plan in place to get software and db lab-ready
- Plan in place to get results from the lab
22 Lab: Lead Time
Lab network layout (from the slide diagram):
- VPN access to the external network (Internet)
- Private network switch attached to all servers
- RP3440-46WR driver1: external network 156.153.117.161, private network 10.0.0.1
- RP3440-4872 driver2: external network 156.153.117.162, private network 10.0.0.2
- RP3440-46X4 driver3: external network 156.153.117.163, private network 10.0.0.3
- RP3440-46X2 driver4: external network 156.153.117.164, private network 10.0.0.4
- RX8640-31JD dbsrv: external network 156.153.117.160, private network 10.0.0.10
- (2) 2 Gb fibre links from dbsrv to the EVA8000-2V1Y
- EVA management PC on the HP LAN
23 Lab: Initial Shakedown Run
- OpenEdge is installed
- The application and database are all set
- Here goes the first test run ...
- What ???? This makes no sense ...
- Check the OS file system cache: too big?
- Check semaphore sets: enough?
- Check swapping/paging/IO throughput
- Check OE benchmarking tips (clients/server, etc.)
- The lab experts may have ideas
24 Lab: Promon vs. Vendor Tools
- Promon or OpenEdge Management
  - Tells what the RDBMS sees
  - Great for RDBMS info
  - Useful to see app-level issues
- Vendor/OS tools
  - Tell what the OS sees
  - Great for low-level issues like memory or filesystem
  - A good complement to what promon sees
25 Lab: Getting Useful Data
- Is the data repeatable within 5-8%?
- Are ongoing results reasonable?
- If not, talk to the lab team right away
  - There is still time to drill deeper
  - The whole team wants you to succeed
- Is there an app bottleneck?
- Is there a DB bottleneck?
- Save all data: "unreasonable" data may make sense later on
- Careful data labeling really helps sort this out
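The repeatability question is easy to automate: compute the spread between the best and worst runs of one data point and compare it to the 5-8% threshold. A sketch with canned numbers (feed it your own run file):

```shell
#!/bin/sh
# Sketch: how far apart are the best and worst runs for one data point?
# A spread beyond ~5-8% is the cue to talk to the lab team.
printf '98\n101\n99\n100\n102\n' | sort -n | awk '
    NR == 1 { min = $1 }
    { max = $1 }
    END { printf "min=%d max=%d spread=%.1f%%\n", min, max, 100 * (max - min) / min }'
```

For the canned values this prints `min=98 max=102 spread=4.1%`, comfortably inside the threshold.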
26 Lab: Winding Down
Get the data back home, and confirm it right away.
THEN make sure data is deleted, media returned, etc.
27 Tests Done: Organize/Protect Data
- Lab work is done
- Data is here
- Burn it on a DVD
- Make sure any data reduction has the source file id
- Do the reduction, set up some tables
- Burn them on a DVD, also with the source file id
- Why? Because here is where the lab time can be lost with no hope of recovery
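Carrying the source file id through the reduction can be as simple as tagging every reduced row with the raw file it came from; the file name below is illustrative.

```shell
#!/bin/sh
# Sketch: each reduced row keeps the name of the raw file it came from,
# so any odd number in a summary table can be traced back to its source.
printf '100\n102\n98\n' > run_5000c_demo.tps
for f in run_*c_demo.tps; do
    awk -v src="$f" '{ s += $1 } END { printf "%s %.1f\n", src, s / NR }' "$f"
done
rm -f run_5000c_demo.tps
```

Each output line pairs the average with its raw file name, which is exactly the traceability the DVD archive needs.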
28 Analysis: Getting Started
- Start with the trial report you drafted in Planning
- Build the graphs if you are visual
- Check the tables if you aren't
- Is there a drop-off? Maybe a bottleneck or limit?
- Subsequent data should agree
- What is your hypothesis?
29 Analysis: Huh? What Happened?
- Uh oh, we have some inconsistent data in a run
  - Performance dropped and then went up again
  - Performance went up-down-up and stayed up
- Either way, something is strange
- Can you simulate this on a local system?
- Check the data: did it get mixed up?
- Error in data collection? Test script? App? OE?
- (Usually it's a data mixup)
30 Analysis: The Report
- Table of Contents
- Goal
- Key Criteria
- Test Systems
- Result Summary and Analysis
- Lessons Learned
31 Here's What We Saw . . .
Test system: RX8640-31JD dbsrv (external network 156.153.117.160, private network 10.0.0.10)
Server startup parameters: -B 3145728 -L 500000 -Mm 8192 -semsets 20 -minport 11000 -maxport 13000 -spin 50000 -bibufs 500 -Ma 100 -Mi 5

clients   -Mn
-------   ---
  1000     15
  5000     57
 10000    108
 15000    162
 20000    222
 25000    273

29-May-2007
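The client / -Mn pairs above settle near 90 clients per server as the run scales, consistent with the -Ma 100 per-server cap. A quick check over the slide's own numbers:

```shell
#!/bin/sh
# Clients per server (-Mn) computed from the measured table above.
printf '1000 15\n5000 57\n10000 108\n15000 162\n20000 222\n25000 273\n' | \
    awk '{ printf "%5d clients / %3d servers = %.1f per server\n", $1, $2, $1 / $2 }'
```

The first data point (66.7 clients per server at 1000 clients) is the outlier; every later point lands between roughly 87 and 93.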
32 What's With the 30,154?
33 Analysis: Lessons Learned
- What changed from the Initial Game Plan to the Revised Plan?
- What difficulties were there in lab setup?
- What difficulties were detected at the lab vs. the home office?
- Were there data consistency issues? Why?
- Were there difficulties in the analysis? What could help?
- What was the primary value of the benchmark session?
- Are the scripts, database, apps, and config files archived?
- Should the next test set be run remotely or at the vendor lab?
34 Questions?
35 Thank you for your time