Transcript and Presenter's Notes

Title: No Respect


1
No Respect
2
Dart2 Quality Framework
  • Software Quality Past, Present, Future

Dan Blezek Jim Miller Bill Lorensen
3
A brief history of quality
  • Very First VTK Dashboard
    • Update information
    • Builds: Irix, Solaris, WinNT
    • Regression tests
    • WinNT catastrophic failure

4
  • Last VTK Dashboard
    • 8 Platforms
    • 650 Nightly Tests
    • 70% coverage (still)
    • Nightly Purify

5
A Bad Day
6
The Continuous Build
7
My first VTK checkin
8
The Big Bat of Quality
9
Bill "Yogi" Lorensen
10
Lessons learned
  • Test on different platforms
  • Test nightly
  • Make it easy to add a test
  • Track changes daily
  • Keep historical information

11
"Two roads diverged in a wood, and I --
I took the one less traveled by,
And that has made all the difference."
-- Robert Frost
Closed Source
12
Dart (v1)
  • Tests
  • Reports
  • Dashboards
  • Captures state of the system
  • Distills data into information (see the sketch after this list)
    • Convert build log → errors/warnings
    • Summarize test execution
    • Roll up coverage statistics
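
A minimal sketch of the "distill data into information" idea: scan a build
log and count errors and warnings. The patterns and class name are
illustrative assumptions, not Dart's actual rules, which were per-compiler.

  import java.io.BufferedReader;
  import java.io.FileReader;
  import java.io.IOException;
  import java.util.regex.Pattern;

  // Hypothetical sketch, not Dart source code.
  public class BuildLogScanner {
    private static final Pattern ERROR =
        Pattern.compile(".*\\berror\\b.*", Pattern.CASE_INSENSITIVE);
    private static final Pattern WARNING =
        Pattern.compile(".*\\bwarning\\b.*", Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws IOException {
      int errors = 0, warnings = 0;
      BufferedReader in = new BufferedReader(new FileReader(args[0]));
      for (String line = in.readLine(); line != null; line = in.readLine()) {
        if (ERROR.matcher(line).matches()) errors++;          // count error lines
        else if (WARNING.matcher(line).matches()) warnings++; // count warning lines
      }
      in.close();
      System.out.println(errors + " errors, " + warnings + " warnings");
    }
  }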

13
(No Transcript)
14
(No Transcript)
15
Someone broke the build!
16
Someone broke the build!
17
(No Transcript)
18
Dart's Power
  • Distributed testing
    • If I don't have a platform, you do
  • Distill data from many tools
  • Distributed Extreme Programming
    • Know the state of the system
    • Instant feedback on changes

19
Quality Statistics
  • Original VTK dashboard
    • 8 platforms / 650 tests
    • 13.6 GB over 4 years
  • Current VTK dashboard
    • 29 Nightly platforms / 500 tests
    • 1-2 GB / week
  • Insight dashboard
    • 60 Nightly builds / 970 tests
    • 1-2 GB / week

20
Dart2 Design Goals
  • One Server, multiple Projects
  • Simple, flexible setup and management
  • Configurable presentation
  • Persist data on dashboard over time
  • Aggregate Dashboards
  • Authenticated submission, if desired
  • Resource management tools

21
Implementation
  • Java
    • Many, many available packages
    • Cross-platform
  • Everything in one package
    • No extra OS packages required
    • Distribute as Jar and/or platform exe
  • Should be easily extensible
    • Even to non-Java programmers

22
Packages
23
Components/Concepts
  • Client, Submission
  • Test Hierarchy
  • Results
  • RDBMS
  • XML-RPC Server
  • Task Manager
  • Scheduler
  • Tracks
  • HTTP / Template Engine

24
Client, Submission
  • Client: a unique platform
    • Need to define criteria
    • Currently Site / BuildName
  • Submission
    • One time-stamped set of Test data
    • Particular to a sub-Project
      • slicer.itk
      • slicer.vtk

25
Test Hierarchy
  • Test is a logical group of Results
  • Has a Pass/Fail/NotRun status
  • May contain other Tests
  • Has a hierarchical naming convention
    • itk.common.PrintSelf
  • SubTest information rolled up (see the sketch after this list)
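
A minimal sketch of the Test hierarchy and status rollup, assuming
illustrative names (Dart2's actual classes may differ):

  import java.util.ArrayList;
  import java.util.List;

  // Hypothetical sketch of a Test node with Pass/Fail/NotRun rollup.
  public class Test {
    public static final int PASSED = 0, FAILED = 1, NOTRUN = 2;

    private final String name;     // hierarchical, e.g. "itk.common.PrintSelf"
    private int status = NOTRUN;
    private final List<Test> subTests = new ArrayList<Test>();

    public Test(String name) { this.name = name; }
    public void addSubTest(Test t) { subTests.add(t); }
    public void setStatus(int s) { status = s; }

    // Roll up: any failed child fails the parent; NotRun dominates Passed.
    public int rollupStatus() {
      if (subTests.isEmpty()) return status;
      int rolled = PASSED;
      for (Test t : subTests) {
        int child = t.rollupStatus();
        if (child == FAILED) return FAILED;
        if (child == NOTRUN) rolled = NOTRUN;
      }
      return rolled;
    }
  }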

26
Results
  • Data produced by a Test
  • Examples
    • Log of standard out
    • Image
    • ExecutionTime
  • Typed
    • text/string, text/url, text/xml, text/text
    • numeric/integer, numeric/double
    • image/png, image/jpeg

27
RDBMS
  • Core of Dart2
  • Bundled with the Derby embedded RDBMS
    • Any JDBC-compliant DB works
  • Stores small data
    • Images and large blocks of text live in files
  • Jaxor
    • Object-relational bridge package
    • No fancy SQL required
    • Creates objects from rows in the DB (see the JDBC sketch after this list)
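
A minimal sketch of talking to the bundled embedded Derby through plain
JDBC (Jaxor would hide this behind generated objects); the table and
column names are illustrative, not Dart2's actual schema:

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.Statement;

  public class DerbyExample {
    public static void main(String[] args) throws Exception {
      // Load the embedded driver; no separate server process is needed.
      Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
      Connection c = DriverManager.getConnection("jdbc:derby:dartdb;create=true");
      Statement s = c.createStatement();
      s.executeUpdate("CREATE TABLE submission (id INT PRIMARY KEY, site VARCHAR(64))");
      s.executeUpdate("INSERT INTO submission VALUES (1, 'public.kitware.com')");
      ResultSet r = s.executeQuery("SELECT site FROM submission");
      while (r.next()) System.out.println(r.getString("site"));
      c.close();
    }
  }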

28
XML-RPC Server
  • Accepts Submissions
  • Administrative functions
  • HTTP transport
    • Easy submission through firewalls
  • Digester used to process XML
    • Executes code when tags are found (see the sketch after this list)
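
A minimal sketch of the Commons Digester pattern: rules fire as the
parser encounters matching tags. The element names and bean classes are
illustrative assumptions, not Dart2's actual submission schema:

  import java.io.StringReader;
  import java.util.ArrayList;
  import java.util.List;
  import org.apache.commons.digester.Digester;

  public class SubmissionParser {
    public static class Submission {
      private String site;
      private final List tests = new ArrayList();
      public void setSite(String s) { site = s; }
      public String getSite() { return site; }
      public void addTest(Test t) { tests.add(t); }
    }
    public static class Test {
      private String name;
      public void setName(String n) { name = n; }
    }

    public static void main(String[] args) throws Exception {
      String xml = "<Submission site='MySite'>"
                 + "<Test name='itk.common.PrintSelf'/></Submission>";
      Digester d = new Digester();
      d.addObjectCreate("Submission", Submission.class); // create bean on open tag
      d.addSetProperties("Submission");                  // map attributes to properties
      d.addObjectCreate("Submission/Test", Test.class);
      d.addSetProperties("Submission/Test");
      d.addSetNext("Submission/Test", "addTest");        // wire child into parent
      Submission s = (Submission) d.parse(new StringReader(xml));
      System.out.println("Parsed submission from " + s.getSite());
    }
  }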

29
Task Manager
  • Tasks are units of work for the server
  • Project and Server Tasks
  • Scheduled or event-driven
    • When a Submission arrives, a Task is queued
    • QueueManager executes Tasks
  • Plug-ins allow Project-specific Tasks
    • Simply implement the Task interface (sketched after this list)
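
A minimal sketch of the plug-in idea, assuming an illustrative interface
shape (Dart2's actual Task interface may differ):

  // Hypothetical Task interface: one unit of work for the server.
  public interface Task {
    void execute() throws Exception;
  }

  // A Project-specific plug-in: just implement the interface and
  // register it with the QueueManager.
  class NotifyOnBrokenBuild implements Task {
    public void execute() {
      // Inspect the latest Submission and e-mail the responsible developer.
      System.out.println("Checking latest submission for build errors...");
    }
  }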

30
Scheduler
  • Quartz Enterprise Scheduler
    • Executes Tasks
    • Uses enhanced cron syntax (see the sketch after this list)
  • Uses
    • Regular DB maintenance
    • Purge unnecessary data
    • Archive aging data
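
A minimal sketch of scheduling a nightly maintenance job with the
Quartz 1.x API of the time; PurgeJob and the trigger names are
illustrative assumptions:

  import org.quartz.CronTrigger;
  import org.quartz.JobDetail;
  import org.quartz.Scheduler;
  import org.quartz.impl.StdSchedulerFactory;

  public class ScheduleExample {
    public static void main(String[] args) throws Exception {
      Scheduler sched = new StdSchedulerFactory().getScheduler();
      JobDetail job = new JobDetail("purge", Scheduler.DEFAULT_GROUP, PurgeJob.class);
      // Enhanced cron: seconds minutes hours day-of-month month day-of-week.
      // "0 0 3 * * ?" fires every day at 3:00 am.
      CronTrigger trigger =
          new CronTrigger("purgeTrigger", Scheduler.DEFAULT_GROUP, "0 0 3 * * ?");
      sched.scheduleJob(job, trigger);
      sched.start();
    }
  }

  // Hypothetical job that purges unnecessary data.
  class PurgeJob implements org.quartz.Job {
    public void execute(org.quartz.JobExecutionContext ctx) {
      System.out.println("Purging unnecessary data...");
    }
  }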

31
Tracks
  • Groups of Submissions
  • A Dashboard consists of intersecting Tracks
  • Temporal Tracks
    • Time based, e.g. 12am start, 24hr duration
  • Most Recent Track
    • Last 5 Continuous builds
  • Project specific

32
HTTP / Template Engine
  • Jetty is the HTTP/Servlet server
  • FreeMarker (flow sketched after this list)
    • Data prepared in the Servlet
    • Template processed
    • Returned to client via HTTP
  • Flexible
    • Easy to add new pages
    • No XSLT!
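
A minimal sketch of the prepare-then-process flow, assuming an
illustrative template name and data-model keys (writing to stdout here
rather than an HTTP response):

  import java.io.File;
  import java.io.OutputStreamWriter;
  import java.util.ArrayList;
  import java.util.HashMap;
  import java.util.Map;
  import freemarker.template.Configuration;
  import freemarker.template.Template;

  public class RenderDashboard {
    public static void main(String[] args) throws Exception {
      Configuration cfg = new Configuration();
      cfg.setDirectoryForTemplateLoading(new File("templates"));

      Map model = new HashMap();          // data prepared in the Servlet
      model.put("project", "Insight");
      model.put("failedTests", new ArrayList());

      // dashboard.ftl might contain: <h1>${project} Dashboard</h1>
      Template t = cfg.getTemplate("dashboard.ftl");
      t.process(model, new OutputStreamWriter(System.out));
    }
  }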

33
Dart2 Current Status
  • Alpha version ready
  • Test server
    • http://www.na-mic.org:8081/Insight/Dashboard/
    • Populated with Build/Test data from public.kitware.com
  • Subversion code repository
    • svn co http://svn.na-mic.org:8000/svn/Dart
    • Web SVN: http://www.na-mic.org:8000/websvn/

34
Acknowledgements
  • Andy Cedilnik
  • Bill Hoffman
  • Will Schroeder
  • Ken Martin
  • Amitha Perera
  • Fred Wheeler

35
Dart2 Quality Framework
  • Software Quality Past, Present, Future

Dan Blezek Jim Miller Bill Lorensen
36
Desirable Qualities
  • Frequent testing
    • Identify defects as soon as they are introduced
    • Hard to find the cause if testing is not frequent
    • Minimally invasive to daily activities
  • Automated testing
  • Automated report generation/summaries
    • Must be concise yet informative
  • Track results over time

37
NAMIC Software Process
Dan Blezek Jim Miller Bill Lorensen
38
Motivation
  • Many algorithms, many platforms
    • VTK, ITK, Slicer, LONI
    • Linux, Windows, Mac OSX, Solaris, SGI
  • Many users
  • Many datasets
  • Many sources of problems!

39
Motivation
  • Negative example
    • MIT codes an ITK algorithm for the LONI pipeline
    • A UCLA developer changes LONI
    • GE changes ITK
    • Come release time, everything's broken

40
Motivation
  • Ensuring high quality software
    • The system's state must be known
    • If UCLA had known about the MIT code, they would have been
      more careful with changes
  • All the code works, all the time
    • As often as is feasible, compile and test the code

41
Extreme Programming
42
NAMIC Process
  • Lightweight
  • Based on Extreme Programming
  • High intensity cycle
    • Design
    • Test
    • Implement
  • Supported with web-enabled tools
  • Automated testing integrated with the software development

43
Software Process
  • Design Process
  • Coding Standards
  • Testing
  • Bug Tracker
  • Communication
    • Mailing lists, Discussion forum, Wiki
    • Tcons
  • Documentation
  • Releases

44
Design Process
  • Take the time to design a good API
    • Plan for future use
    • Plan for future extension
  • Two routes
    • Code something, check it in
      • Others will tear it down and make it better
    • Put together a strawman
      • Solicit ideas, then implement

45
Coding Standards
  • Follow the package's rules
  • ITK has certain coding standards
    • Style guidelines
    • Naming conventions
    • Macros

46
Testing
  • If it isn't tested, it's broken.
  • Tests
    • Ensure your code works
    • Document expected results
    • Leave others free to change the code

47
Bug Tracker
  • Bugs assigned / taken by developers
  • Tracks progress to releases
  • Captures feature requests
  • Communication mechanism

48
Documentation
  • Doxygen
    • Automatic API documentation
    • Algorithm references
    • Implementation details
  • Books / Manuals
    • Insight Book

49
Communication
  • Email lists
  • Discussion forum
  • Wiki
  • Tcon

50
Extreme Programming
  • Compression of the standard analyze, design, implement, test
    cycle into a continuous process

51
Daily Testing Is The Key
  • Testing anchors the development process (Dart)
  • Developers monitor the testing dashboard constantly
  • Problems are identified and fixed immediately
  • Developers receive e-mail if they "Break the Build"

52
Daily rhythm
  • Design, implement algorithm
  • Write regression test (a minimal example follows this list)
  • Check it in
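
A minimal sketch of the regression-test step using JUnit (the era's
common Java testing tool); MeanFilter is a hypothetical class standing
in for the new algorithm:

  import junit.framework.TestCase;

  // Hypothetical algorithm under test.
  class MeanFilter {
    static double mean(double[] v) {
      double sum = 0.0;
      for (int i = 0; i < v.length; i++) sum += v[i];
      return sum / v.length;
    }
  }

  // The regression test documents the expected result, so others are
  // free to change the implementation without breaking behavior.
  public class MeanFilterTest extends TestCase {
    public void testMeanOfKnownInput() {
      double[] input = { 1.0, 2.0, 3.0, 4.0 };
      assertEquals(2.5, MeanFilter.mean(input), 1e-9);
    }
  }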

53
Dart
  • Testing
  • Reports
  • Dashboards
    • Central site for the state of the system
    • Updates
    • Builds
    • Tests
    • Coverage

54
(No Transcript)
55
(No Transcript)
56
Someone broke the build!
57
(No Transcript)
58
Conclusion
  • Have fun
  • Process extends your impact
    • Many can use your code
    • Many can improve your code
  • Communicate, Communicate, Communicate

59
NAMIC Software Process
Dan Blezek Jim Miller Bill Lorensen