As Simple As Possible, But No Simpler - PowerPoint PPT Presentation

1
As Simple As Possible, But No Simpler
  • Sam Guckenheimer
  • http://lab.msdn.microsoft.com/vs2005/teamsystem/
  • samgu@microsoft.com

2
Simple Project Management
  • The Iron Triangle
  • (er, tetrahedron)

3
21st Century Mantra
  • Do more with less!
  • But if your only variables are
  • Functionality
  • Quality
  • Resources
  • Time
  • then how are you going to do that?

4
An Older Truth
Все счастливые семьи похожи друг на друга, каждая несчастливая семья несчастлива по-своему.
  • Happy families are all alike; every unhappy
    family is unhappy in its own way.
  • Tolstoy, Anna Karenina

5
13 Symptoms of Unhappiness
  • It's the code, stupid!
  • Actually it's the requirements!
  • No, the problem is that you neglected the
    architecture!
  • Architecture, schmarchitecture. I just want a
    working build.
  • What good is that the way we mix up versions?!
  • Not code versions, but the environments, don't
    you get it?
  • Ever heard of security?!
  • Yeah, but you ignored performance, duh!
  • So what if it worked in the lab -- it's still
    unmanageable!
  • Oh, and did we mention testing?
  • Since you're not measuring it, you can't manage
    it anyway!
  • With a process like that, what do you expect?
  • It's our culture; you'll never change that.

6
13 Symptoms of Unhappiness

7
Code
  • Some why-nots
  • Use managed code
  • Use modern frameworks
  • Use service-oriented architecture
  • Use available tools
  • Transparency
  • Responsible costing
  • Visible results
  • Available tools
  • Unit tests
  • Code coverage
  • Static analysis
  • Profiling performance
  • Source control
  • Work item tracking
  • Build automation
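The tool list above starts with unit tests and code coverage. As a language-neutral sketch (Python here, though the deck itself targets Visual Studio Team System), a small suite whose third test deliberately exercises the error branch; without it, that branch is exactly the "code under test not covered during the test run" that a coverage report flags. The function and test names are hypothetical:

```python
import unittest

def price_with_discount(price: float, percent: float) -> float:
    """Hypothetical code under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class PriceTests(unittest.TestCase):
    def test_basic_discount(self):
        self.assertAlmostEqual(price_with_discount(100.0, 20), 80.0)

    def test_zero_discount(self):
        self.assertAlmostEqual(price_with_discount(50.0, 0), 50.0)

    def test_invalid_percent_rejected(self):
        # Covers the raise branch -- otherwise coverage reports it as missed.
        with self.assertRaises(ValueError):
            price_with_discount(100.0, 150)
```

Run with `python -m unittest` under a coverage tool to see which branches the suite still misses.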

8
Unit Tests and Code Coverage
Code Under Test not covered during the test run
Unit Test Results
9
Code Analysis
Direct jump to code from the warning
Code Analysis recommendations as build warnings
http://blogs.msdn.com/jason_anderson/archive/2004/09/05/225798.aspx
10
13 Symptoms of Unhappiness

11
Product Definition
  • Personas and Scenarios
  • Qualities of Service
  • Capture implicit requirements
  • Kano analysis
  • Stack ranking

Continually challenge your assumptions!
12
Personas and Scenarios
  • CEO Signs Contract
  • PROJECT MANAGEMENT: PM Starts New Portfolio Project, PM Enumerates Requirements in Excel, PM Schedules Work in MS Project, PM Monitors Project Status, PM Reviews Project Status, PM Promotes for Deployment
  • ARCHITECT: Architect Updates Design, Architect Adds Tasks and Checks In
  • DEVELOPER: Dev Writes Code, Dev Writes and Runs Unit Tests, Dev Reviews Work, Dev Runs Code Analysis, Dev Writes Load Tests, Dev Checks In Work, Dev Diagnoses and Fixes, Dev Checks In Work
  • TEST: Tester Checks Build Status, Tester Runs Load Test, Tester Reports Bug
13
Qualities of Service
  • Performance
  • Responsiveness
  • Concurrency
  • Efficiency
  • Fault tolerance
  • Scalability
  • Trustworthiness
  • Security
  • Privacy
  • Conformance to standards
  • Interoperability
  • Usability
  • Accessibility
  • Attractiveness
  • Compatibility
  • Discoverability
  • Ease of use
  • Localizability
  • Manageability
  • Availability
  • Reliability
  • Installability and uninstallability
  • Maintainability
  • Monitorability
  • Recoverability
  • Testability
  • Supportability

14
Kano Analysis
Hinshitsu (Quality), The Journal of the Japanese Society for Quality Control, XIV:2, pp. 39-48, April 1984
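Kano analysis classifies each feature from a pair of survey answers: a functional question ("how would you feel if this were present?") and a dysfunctional one ("if absent?"). A rough sketch of that classification; the mapping is a simplification of the standard evaluation table, not the exact table from the 1984 paper:

```python
def kano_category(functional: str, dysfunctional: str) -> str:
    """Classify one feature from its functional/dysfunctional answer pair.
    Answers are 'like', 'expect', 'neutral', 'tolerate', or 'dislike'."""
    if functional == dysfunctional and functional in ("like", "dislike"):
        return "questionable"     # contradictory answers
    if functional == "like" and dysfunctional == "dislike":
        return "one-dimensional"  # more is better
    if functional == "like":
        return "attractive"       # delighter: absence is tolerated
    if dysfunctional == "dislike":
        return "must-be"          # implicit requirement: presence goes unnoticed
    if functional == "dislike":
        return "reverse"          # customers prefer its absence
    return "indifferent"
```

Must-be answers are how the "capture implicit requirements" bullet shows up in practice: customers never praise them, but their absence makes the product unacceptable.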
15
Challenging Assumptions
Customers desktop
Customer in usability lab
16
13 Symptoms of Unhappiness

17
Architecture
  • Service-Oriented Architecture
  • Infrastructure Architecture
  • Legacy

18
Service Orientation
  • Build systems using autonomous services that
    adhere to the four tenets of Service Orientation
  • Boundaries are explicit
  • Services are autonomous
  • Services share schema and contract, not class
  • Service compatibility is determined based on
    policy

http://msdn.microsoft.com/msdnmag/issues/04/01/Indigo/default.aspx
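The "share schema and contract, not class" tenet can be sketched as a service boundary that validates incoming messages against a published schema, rather than accepting a shared in-memory type. The service, schema, and field names below are hypothetical:

```python
# Published wire contract: field name -> expected type. Only this
# schema crosses the service boundary, never a shared class.
ORDER_SCHEMA = {"order_id": int, "quantity": int, "sku": str}

def validate(message: dict, schema: dict) -> dict:
    """Reject any message that does not match the published schema."""
    missing = [k for k in schema if k not in message]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    for key, expected in schema.items():
        if not isinstance(message[key], expected):
            raise TypeError(f"{key} must be {expected.__name__}")
    return message

def place_order(message: dict) -> str:
    """Autonomous service: the boundary is explicit, enforced by validation."""
    order = validate(message, ORDER_SCHEMA)
    return f"accepted order {order['order_id']}"
```

Because callers depend only on the schema, the service can change its internal classes freely; compatibility is negotiated at the boundary, per the fourth tenet.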
19
Application Designer
Service-Oriented Architecture model
Port Details editor
20
Infrastructure Architecture
  • Points of Failure
  • Points of Observation
  • Points of Attack
  • Manageability

21
Logical Infrastructure Designer
Services assigned to logical infrastructure
Architecture validated against operational settings and constraints
22
13 Symptoms of Unhappiness

23
Build Automation
  • Nightly build
  • Project heartbeat
  • Pre-check-in tests
  • Validation of code against the current base
    prior to check-in
  • Variant is continuous integration
  • Build verification tests
  • Functional tests (from unit tests)
  • Component integration tests
  • Build reporting
  • Against backlog, by check-in/changeset
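The pre-check-in idea above can be sketched as a gate that runs each validation step against the current base and rejects the check-in at the first failure. The step names are illustrative, not any particular build system's API:

```python
def pre_checkin_gate(run_build, run_unit_tests, run_static_analysis) -> str:
    """Run each validation step in order; stop at the first failure.
    Each argument is a callable returning True on success."""
    steps = [
        ("build", run_build),
        ("unit tests", run_unit_tests),
        ("static analysis", run_static_analysis),
    ]
    for name, step in steps:
        if not step():
            return f"check-in rejected: {name} failed"
    return "check-in accepted"
```

Continuous integration is the same gate run automatically on every check-in instead of before it; the nightly build then serves as the project heartbeat rather than the first line of defense.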

24
Build Reporting
25
13 Symptoms of Unhappiness

26
Versions
  • Track versions for each of
  • Source
  • Tests
  • Executables and other runtimes you create
  • XML, HTML, images, docs, databases
  • Environmental/deployment components
  • Bugs
  • Report them together and relate them

27
13 Symptoms of Unhappiness

28
Environment
  • Production environment
  • Test environment
  • Capturing environment
  • Tools
  • Microsoft Virtual PC
  • Microsoft Virtual Server
  • Maintain lab images

29
13 Symptoms of Unhappiness

30
Security
  • The core problem
  • Threat modeling
  • Code analysis
  • Security testing

Michael Howard, Writing Secure Code, 2003
J.D. Meier et al., Improving Web Application Security, 2003
31
Security Core Problem
  • Odds of securing a single level are 1/8
  • Bad guy has to find only one vulnerability
  • Infinite time
  • Microsoft as example
  • 100s of different IT environments
  • 2,500 unique attacks per day
  • 125,000 incoming virus-infected e-mails per month
  • Need to secure at every level
  • Design
  • Default
  • Deployment
  • Multiple layers of defense needed
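Why multiple layers help is simple arithmetic. Under the simplifying assumption (mine, not the slide's) that each layer independently stops an attack with some probability, the chance of a full breach shrinks geometrically with the number of layers:

```python
def breach_probability(p_stop: float, layers: int) -> float:
    """Probability an attack penetrates every layer, assuming each layer
    independently stops it with probability p_stop."""
    return (1 - p_stop) ** layers

# Even a weak layer compounds: at a 50% stop rate per layer,
# one layer lets through half of attacks, four layers only 1 in 16.
```

Real layers are rarely independent (a shared misconfiguration can defeat several at once), which is why the slide also insists on securing design, defaults, and deployment separately.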

32
Threat Modeling
  • Analyze the design for vulnerability
  • Model data flows
  • S - Spoofing Identity
  • T - Tampering with Data
  • R - Repudiation
  • I - Information Disclosure
  • D - Denial of Service
  • E - Elevation of Privilege
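In practice a threat-model review walks every modeled data flow past every STRIDE category. A minimal sketch of that enumeration; the flow names are hypothetical:

```python
# The six STRIDE categories from the slide.
STRIDE = {
    "S": "Spoofing Identity",
    "T": "Tampering with Data",
    "R": "Repudiation",
    "I": "Information Disclosure",
    "D": "Denial of Service",
    "E": "Elevation of Privilege",
}

def enumerate_threats(data_flows):
    """Cross each modeled data flow with each STRIDE category, producing
    the checklist a threat-model review walks through."""
    return [(flow, code, name)
            for flow in data_flows
            for code, name in STRIDE.items()]

threats = enumerate_threats(["browser -> web tier", "web tier -> database"])
# 2 flows x 6 categories = 12 review items
```

Each item then gets a judgment: mitigated by design, mitigated by a countermeasure, or an accepted risk that must be recorded.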

33
13 Symptoms of Unhappiness

34
Performance
  • Deployment configuration
  • Model performance as part of product definition
  • Replicate environment in lab
  • Test it as part of development
  • Fix it where it hurts
  • Three-tiered problem
  • System
  • Components
  • Code

35
System and Component
Alerts and warnings on Systems Under Test
Performance measures of test and Systems Under
Test
36
Code Performance
Timeline of memory consumption
Suspect functions, drillable to code
37
13 Symptoms of Unhappiness

38
Manageability
  • Operations documented and current for every
    service or application
  • Service level agreement in place
  • Security scanning in place
  • Proactively monitor and fix
  • Reactive and proactive problem management

39
13 Symptoms of Unhappiness

40
Testing Mission Approach
  • Marick's Framework
  • Different missions and approaches apply for each
    quadrant

http://www.testing.com/cgi-bin/blog/2003/08/21agile-testing-project-1
41
Let the punishment fit the crime!
Gilbert & Sullivan, The Mikado
  • A good test approach is
  • Diversified
  • Risk-focused
  • Product-specific
  • Practical
  • Defensible
  • Fit the technique and its data to its purpose in
    the quadrant

Kaner, Bach & Pettichord, Lessons Learned in Software Testing, 2002
42
Testing Mission Approach
  • Representative techniques

43
Test Coverage
  • Identify the Scenario, QoS, or Code that the test
    tests
  • If they're newly discovered, capture them
  • If you can't name them, question the value of the
    test
  • Measure coverage against these dimensions

44
Test Automation and Its Discontents
ROI = (Σt Value of Information - Σt Cost to Maintain) / Σt Cost to Implement
(adjusted for net present value and risk)
45
Test Automation and Its Discontents
ROI = (Σt Value of Information - Σt Cost to Maintain) / Σt Cost to Implement
  • Value depends on context
  • Automation is a programming exercise
  • Opportunity cost high due to resource constraints
  • Options theory problem
  • Very sensitive to volatility
  • Often incalculable

46
Testing Web Applications
Data substitution
Performance breakdown
Content validation
HTTP request and response
View of content as rendered
47
13 Symptoms of Unhappiness

48
Metrics
  • Consider many dimensions at once
  • Single metrics easily mislead
  • Test results
  • Bug rates
  • Code churn
  • Code coverage
  • Requirements coverage
  • Never use metrics for reward or punishment
  • Flow of value, not completion of tasks
  • Planned and unplanned work

Robert Austin, Measuring and Managing Performance in Organizations, 1996
49
Which Component is Healthiest?
  • Contrast two views of project data

Fewest bugs
Highest test pass rate
50
Which Component is Healthiest?
Lowest code coverage
  • Conclusions
  • Tests are stale
  • Highest risk here

Highest code churn
51
Focus on Flow of Value
Control height of work in progress
Value measured on completion
David J. Anderson, Managing with Cumulative Flow, 2004
www.agilemanagement.net/Articles/Papers/BorConManagingwithCumulat.html
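The cumulative-flow idea reduces to simple arithmetic: the height of the work-in-progress band is cumulative items started minus cumulative items finished, day by day. A sketch:

```python
from itertools import accumulate

def work_in_progress(started_per_day, finished_per_day):
    """Height of the cumulative-flow WIP band: items started minus
    items finished, accumulated day by day."""
    started = list(accumulate(started_per_day))
    finished = list(accumulate(finished_per_day))
    return [s - f for s, f in zip(started, finished)]

# A WIP band that keeps climbing signals queued, unfinished work;
# controlling its height is the slide's point.
```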
52
13 Symptoms of Unhappiness

53
Processes Differ for Good Reasons
  • Economics
  • Regulation
  • Liability
  • Plan-Driven vs. Adaptive
  • Iteration length
  • Documentation required
  • Sign-off gates
  • Time tracking requirements

54
…and for Bad Reasons
55
Solution is Transparency
56
Transparency
  • Single product backlog
  • Task-aware versioning
  • Project portals
  • Process handbook

57
Single Product Backlog
Single backlog of all Work Items (Reqts, Tasks,
Bugs, etc.)
Queries to filter, view, report
Complete change history
Details for each entry
58
Task-aware Versioning
Source files to check in
with Work Items done
and Check-in Notes and Policy Status
59
Project Portal
60
Process Handbook
http://workspaces.gotdotnet.com/msfv4
61
13 Symptoms of Unhappiness

62
Culture
  • Productivity and predictability
  • Responsibility over assignment
  • Team and individual
  • Product mentality

63
13 Symptoms of Unhappiness

64
  • Sam Guckenheimer
  • http://lab.msdn.microsoft.com/vs2005/teamsystem/
  • samgu@microsoft.com