Science and Substance: A Challenge to Software Engineers

1
Science and Substance: A Challenge to Software Engineers
  • By
  • Norman Fenton
  • Shari Pfleeger
  • Robert L. Glass
  • Presented by
  • Alex Baker

2
The Problem
  • Despite all of our advances, SE is still hard
  • Over 250 Software Engineering Standards
  • What we know is based on
  • Anecdotes, gut feelings, expert opinions, and
    flawed research

3
Convincing Practitioners
  • Managers are willing to try new things
  • If evidence says that they will see benefits
  • Evidence is rare
  • Unsubstantiated Claims

4
Convincing Practitioners
  • Managers are willing to try new things
  • If evidence says that they will see benefits
  • Hard evidence is rare
  • Unsubstantiated Claims

Solution: Researchers should adopt a more
scientific approach
5
5 Questions that should be raised
  • Is it based on empirical evaluation and data?
  • Was the experiment designed correctly?
  • Is it based on a toy or real situation?
  • Were the measurements appropriate?
  • Was the experiment run for a long enough time?

6
1) Empiricism vs. Intuition
  • Much research is analytical advocacy research
  • Its claims are unsupportable
  • Not based on rigorous, quantitative experimentation
  • Some methods find their way into standards
    despite no evidence to support them

7
1) Empiricism vs. Intuition (cont'd)
  • Formal Methods provide an example
  • IBM/PRG study advocates them, sans evidence?
  • Study casts formal methods' worth into doubt
  • Accepted, but no evidence exists?
  • Counterexample: Code Inspections

8
2) Experimental Design
  • Poorly designed experiments can discredit
    research's results
  • Flowcharts
  • Structured Programming (maximizing attributes)
  • Define success carefully (see the sketch below)
  • Experimental design is not part of curricula
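A hedged sketch of what defining success carefully can look like in practice; the outcome measure, project data, and threshold below are hypothetical, added only for illustration:

    # Hypothetical two-group comparison with the success criterion fixed in advance.
    # Outcome measure: defect density (defects per KLOC).
    # Success criterion: mean density for the new method is at least 20% lower
    # than for the control group.
    control   = [4.1, 3.8, 5.0, 4.4]   # made-up densities for control projects
    treatment = [3.0, 3.4, 2.8, 3.6]   # made-up densities for projects using the new method

    def mean(xs):
        return sum(xs) / len(xs)

    improvement = 1 - mean(treatment) / mean(control)
    verdict = "success criterion met" if improvement >= 0.20 else "not demonstrated"
    print(f"improvement: {improvement:.0%} -> {verdict}")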

9
3) Toy vs. Real
  • Experiments often run on small, toy projects
  • Useful for initial forays
  • Object orientation, looping constructs
  • Do toy studies scale up?
  • Little research on the matter
  • Requires support from research and development
    organizations

10
4) Appropriate Measures
  • Are you measuring the correct attribute?
  • Success, reliability
  • Faults vs. failures
  • What scale?
  • Nominal, ordinal, interval, ratio, and absolute
  • Too often, inappropriate scales are used (see the
    sketch below)
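A minimal sketch, added for illustration and not from the slides, of why the measurement scale matters: ordinal data such as severity codes support a median but not a meaningful arithmetic mean, while ratio data such as failure rates support both. The values are made up.

    # Which statistics a scale supports (illustrative only).
    from statistics import mean, median

    # Ordinal scale: defect severity codes (1 = cosmetic ... 5 = critical).
    # The ordering is meaningful, but the distances between codes are not,
    # so the median is defensible while the arithmetic mean is misleading.
    severities = [1, 2, 2, 3, 5, 5, 5]           # hypothetical ratings
    print("median severity:", median(severities))
    print("mean severity:", mean(severities))    # treats codes as if they were ratio data

    # Ratio scale: failures per 1,000 hours of operation.
    # A true zero and equal intervals make means and ratios meaningful.
    failure_rates = [0.8, 1.2, 0.5, 2.0]         # hypothetical measurements
    print("mean failure rate:", mean(failure_rates))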

11
5) Long-Term View
  • Short-term effects ≠ long-term results
  • SEL and Ada's long-term learning curve
  • CASE decreases productivity in the first year
  • Eventually productivity increases, but
  • Explained by other factors?
  • May not be cost effective?

12
Recent Examples
  • Too few good examples, but here are a few
  • Cleanroom
  • Error detection and testing methodology
  • Used students and practitioners
  • Empirical, well-designed, real, appropriately
    measured, long-term

13
Recent Examples (cont'd)
  • Object-Oriented Design
  • 8 projects over several years
  • Good on most counts
  • Ada indicates bad design, which complicates the
    results
  • 4th Generation Languages
  • Many studies, varied results
  • Non-toy, but small and short-term
  • Would need to see the papers themselves for more
    detail

14
Conclusions
  • Little empirical evidence exists to confirm that
    the fixes we have discovered really improve
    software engineering.
  • We need a widespread demand for improvement.

15
Steps Toward Improvement
  • Managers
  • Insist on well-designed experiments
  • Participate in studies
  • Developers
  • Participate in studies
  • Be objective in such studies
  • Researchers
  • Evaluative research should support new ideas
  • Quantify
  • Identify degree of control over variables

16
Discussion
  • Which of these criteria are easy? Hard?
    Impossible?
  • Experimental Design
  • To what extent can we isolate causes of observed
    changes?
  • Have flowcharts and structured programming been
    proven effective?
  • 9 years later, how far have we come?
  • Do we support our claims empirically, or use
    Analytical Advocacy Research?
  • Are our advances being used?
  • Will adopting these methods improve SE research?
  • Will they improve software engineering itself?
  • How do these standards for rigorous research
    compare with those in other sciences?