1
Object-Oriented Metrics
  • Renaat Verbruggen
  • School of Computing,
  • Dublin City University

2
Introduction and Welcome
  • My Background
  • Esprit Object-oriented Project 1986
  • Lecturer in Software Engineering
  • Research in formal OO
  • Consultant to Industry on OO

3
Agenda
  • What is different about Object-oriented projects?
  • What is measurement in software development?
  • What are typical OO metrics?
  • What other guidelines exist?

4
Differences in OO projects 1
  • The code produced.
  • Encapsulation
  • Data Abstraction (Classes)
  • Inheritance
  • Polymorphism
  • Not just Lines of Code.

5
Differences in OO projects 2
  • Reuse a priority
  • Use (reuse) of libraries or Frameworks
  • Reuse through Inheritance
  • Reuse above code level
  • Patterns
  • Business objects

6
Differences in OO projects 3
  • Reuse changes process
  • Build reusable components
  • Frameworks and libraries
  • Abstraction, generalisation
  • Cost → Investment
  • Find and reuse components
  • Saving → return on investment

7
Differences in OO projects 4
  • Development process iterative
  • Often the major difference !
  • Growth of software over iterations
  • Reuse-based
  • Change considered explicitly
  • Support for risk management
  • → Need for early and updated metrics

8
Reasoning 1
  • Tom DeMarco
  • "You cannot control what you cannot measure."
  • James Clerk Maxwell
  • "To measure is to know."

9
Reasoning 2
  • Lord Kelvin
  • "The degree to which you can express something
    in numbers is the degree to which you really
    understand it."
  • Louis Pasteur
  • "A science is as mature as its measurement
    tools."

10
Experience 1
  • Lowell Arthur
  • "Better a little caution than a great regret."
  • Victor Basili
  • "Measurement is an excellent abstraction
    mechanism for learning what works and what
    doesn't."

11
Experience 2
  • Frederick Brooks
  • "Adding manpower to a late software project
    makes it later."
  • Tom Gilb
  • "Projects without clear goals will not achieve
    their goals clearly."

12
Experience 3
  • Parkinson's Law
  • "Work expands to fill the available time."

13
Measurement
  • What is measurement?
  • "Measurement is the process by which numbers or
    symbols are assigned to attributes of entities in
    the real world in such a way as to describe them
    according to clearly defined rules." (Norman
    Fenton)

14
Measurement
  • Units of Measurement
  • Measure a person's temperature
  • Celsius
  • Fahrenheit
  • Feverish - too hot (analogy)
  • Accuracy
  • Replicability
  • How do we measure software?

15
Measurable Project Goals
  • Are the following measurable goals?
  • The software will not fail
  • The software will be easy to use
  • The project will be completed by June 30
  • The product will meet all requirements
  • What makes a goal measurable?

16
Setting Measurable Goals
  • Metric Definition
  • Clarity
  • Unambiguous
  • Common Understanding
  • Replicability
  • Accuracy?
  • Examples

17
Setting up Measures 1
  • Establish why you are measuring
  • Goal-Question-Metric (GQM), sketched below
  • 1. Define the goal or purpose
  • 2. Break it down into questions
  • 3. Pick suitable metrics
  • Create a Metrics programme within company
  • Choose metrics already developed
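
A minimal sketch of a GQM breakdown, assuming a hypothetical maintainability goal; the questions and the metric abbreviations (which anticipate later slides) are illustrative, not prescribed by GQM itself:

    # Hypothetical goal broken down into questions, each answered by metrics.
    gqm = {
        "goal": "Improve maintainability of the class library",
        "questions": {
            "Are individual classes doing too much?": ["WMC", "NMC"],
            "Are classes too entangled with each other?": ["CBO", "RFC"],
            "Is the inheritance hierarchy too deep?": ["DIT"],
        },
    }

    for question, metrics in gqm["questions"].items():
        print(f"{question} -> {', '.join(metrics)}")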

18
Setting up Measures 2
  • Create your own metrics?
  • Define the metric as completely as possible
  • Define the properties of the attribute to be
    measured
  • Define the object to be measured, the metric's
    domain
  • Define the metric.
  • Formality is essential

19
Setting up Measures 3
  • Validate the metric theoretically
  • Prove that the properties are met
  • Prove that the dimensions of the metric are sound
  • Determine the scale for the metric

20
Setting up Measures 4
  • Validate the metric practically
  • devise best means to measure
  • level of automation
  • minimum disruption for developers
  • Use metric in several practical places.
  • Promote metric

21
Setting up Measures 5
  • Example
  • Attribute to be measured: product size
  • Essential property: positive, additive
  • Metric domain: set of lines ending with \n
  • Metric name: LOC
  • Theory
  • LOC is an absolute scale type
  • Fulfils the essential property of product size
    (a literal implementation is sketched below)
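
A minimal sketch of the LOC metric exactly as defined above, counting lines that end with a newline; the three-line sample program is arbitrary:

    def loc(source: str) -> int:
        # The metric domain above: the set of lines ending with '\n'.
        return source.count("\n")

    print(loc("int main() {\n    return 0;\n}\n"))  # 3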

22
Setting up Measures 6
  • Yet LOC has problems - why?
  • Because it is modelled simplistically
  • Newline characters (\n) are just one element of
    product size.
  • It is used to try to capture too much about the
    software.

23
Software Metrics Validation
  • "Validation of a software measure is the process
    of ensuring that the measure is a proper
    numerical characterisation of the claimed
    attribute this means showing that the
    representation condition is satisfied.
  • Norman Fenton, City University London

24
Shyam R. Chidamber and Chris F. Kemerer
  • Suggested metrics for OO

25
Weighted Methods Per Class (as sum of the McCabe
numbers)
  • The number of methods and the complexity of
    methods involved is an indicator of how much time
    and effort is required to develop and maintain a
    class.
  • The larger the number of methods in a class the
    greater the potential impact on children, since
    children will inherit all the methods defined in
    the class.
  • Classes with large numbers of methods are likely
    to be more application specific, limiting the
    possibility of reuse.
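
A minimal WMC sketch under the definition in the slide title: WMC is simply the sum of per-method McCabe complexities. The class and its complexity values are hypothetical:

    # Hypothetical per-method McCabe complexities for one class.
    method_complexity = {"open": 1, "parse": 7, "validate": 4, "close": 1}

    wmc = sum(method_complexity.values())  # WMC = sum over all methods
    print(wmc)  # 13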

26
McCabe's Cyclomatic Complexity
  • Based on the control graph of the program
  • Can be used to decide on basis path testing etc.
  • Number of linearly independent paths
  • V(G) = E - N + 2P (no. of edges - no. of nodes +
    2 x no. of connected modules); approximated in
    the sketch below
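
A small sketch that approximates a function's cyclomatic complexity from Python source by counting decision points and adding 1; this is the usual shortcut rather than building the full control graph, and the node list is a simplification (boolean operators, for instance, are not broken out):

    import ast

    # Node types treated as decision points in the control-flow graph.
    _DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)

    def cyclomatic_complexity(func_source: str) -> int:
        # V(G) approximated as 1 + number of decision points.
        tree = ast.parse(func_source)
        return 1 + sum(isinstance(n, _DECISIONS) for n in ast.walk(tree))

    print(cyclomatic_complexity(
        "def absval(x):\n"
        "    if x < 0:\n"
        "        return -x\n"
        "    return x\n"))  # 2: one branch, two independent paths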

27
Depth of Inheritance Tree
  • The deeper a class is in the hierarchy, the
    greater the number of methods it is likely to
    inherit, making it more complex to predict its
    behaviour.
  • Deeper trees constitute greater design
    complexity, since more methods and classes are
    involved.
  • The deeper a particular class is in the
    hierarchy, the greater the potential reuse of
    inherited methods.
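
DIT can be computed by walking a class's ancestry; a minimal Python sketch, where the root of every hierarchy is object, shown on a hypothetical three-level hierarchy:

    def dit(cls: type) -> int:
        # Longest path from cls up to the root class (object in Python).
        if cls is object:
            return 0
        return 1 + max(dit(base) for base in cls.__bases__)

    class Vehicle: pass
    class Car(Vehicle): pass
    class Taxi(Car): pass

    print(dit(Taxi))  # 3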

28
Number of Children (as number of immediate
sub-classes)
  • The greater the number of children, the greater
    the reuse, since inheritance promotes reuse.
  • The greater the number of children, the greater
    the likelihood of improper abstraction of the
    parent class. If a class has a large number of
    children, it may be a case of misuse of
    sub-classing.
  • The number of children gives an idea of the
    potential influence a class has on the design. If
    a class has a large number of children, it may
    require more testing of the methods in that
    class.
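
In Python the immediate sub-classes of a class can be queried directly, so a minimal NOC sketch (with a hypothetical Shape hierarchy) is one line:

    def noc(cls: type) -> int:
        # Immediate sub-classes only, matching the definition above.
        return len(cls.__subclasses__())

    class Shape: pass
    class Circle(Shape): pass
    class Square(Shape): pass

    print(noc(Shape))   # 2
    print(noc(Circle))  # 0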

29
Response For a Class (as number of used methods)
  • If a large number of methods can be invoked in
    response to a message, the testing and debugging
    of the class becomes more complicated, since a
    greater level of understanding is required on the
    part of the tester.
  • The larger the number of methods that can be
    invoked from a class, the greater the complexity
    of the class.
  • A worst case value for possible responses will
    assist in appropriate allocation of testing time.
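
A rough RFC sketch, assuming RFC is approximated as the class's own methods plus the distinct method names it invokes; the ast-based counting and the Account class are illustrative simplifications:

    import ast
    import textwrap

    def rfc(class_source: str) -> int:
        # Own methods plus distinct method names called from the class.
        cls = ast.parse(textwrap.dedent(class_source)).body[0]
        own = {n.name for n in cls.body if isinstance(n, ast.FunctionDef)}
        called = {n.func.attr for n in ast.walk(cls)
                  if isinstance(n, ast.Call)
                  and isinstance(n.func, ast.Attribute)}
        return len(own | called)

    src = '''
    class Account:
        def deposit(self, amount):
            self.balance += amount
            self.audit.record(amount)
        def withdraw(self, amount):
            self.balance -= amount
    '''
    print(rfc(src))  # 3: deposit, withdraw, plus the call to record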

30
Coupling Between Objects 1
  • Excessive coupling between objects is detrimental
    to modular design and prevents reuse. The more
    independent an object is, the easier it is to
    reuse it in another application.
  • In order to improve modularity and promote
    encapsulation, inter-object couples should be
    kept to a minimum. The larger the number of
    couples, the higher the sensitivity to changes in
    other parts of the design and therefore
    maintenance is more difficult.

31
Coupling Between Objects 2
  • A measure of coupling is useful to determine how
    complex the testing of various parts of a design
    are likely to be. The higher the inter-object
    coupling, the more rigorous the testing needs to
    be.
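
A rough CBO sketch, assuming coupling is approximated by the distinct other classes a class mentions by name; the ast-based scan, the Order class and the set of known class names are all illustrative:

    import ast
    import textwrap

    def cbo(class_source: str, known_classes: set[str]) -> int:
        # Distinct other classes whose names appear inside this class.
        cls = ast.parse(textwrap.dedent(class_source)).body[0]
        used = {n.id for n in ast.walk(cls)
                if isinstance(n, ast.Name) and n.id in known_classes}
        return len(used - {cls.name})

    src = '''
    class Order:
        def total(self):
            return Invoice(self).amount + Tax.rate_for(self)
    '''
    print(cbo(src, {"Order", "Invoice", "Tax", "Customer"}))  # 2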

32
Lack of Cohesion in Methods (disjoint
instance-variable sets)
  • Cohesiveness of methods within a class is
    desirable, since it promotes encapsulation.
  • Lack of cohesion implies classes should probably
    be split into two or more sub-classes.
  • Any measure of disparateness of methods helps
    identify flaws in the design of classes.
  • Low cohesion increases complexity, thereby
    increasing the likelihood of errors during the
    development process.
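
A minimal LCOM sketch in the Chidamber and Kemerer style, assuming each method is summarised by the set of instance variables it uses; P counts method pairs with disjoint sets, Q pairs with overlapping sets, and the example class is hypothetical:

    from itertools import combinations

    def lcom(method_vars: dict[str, set[str]]) -> int:
        # P: method pairs with disjoint instance-variable sets;
        # Q: pairs sharing at least one variable. LCOM = max(P - Q, 0).
        p = q = 0
        for a, b in combinations(method_vars.values(), 2):
            if a & b:
                q += 1
            else:
                p += 1
        return max(p - q, 0)

    print(lcom({"deposit":  {"balance"},
                "withdraw": {"balance"},
                "report":   {"history"}}))  # P=2, Q=1 -> LCOM=1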

33
Design Metrics And Experience 1
  • From Mark Lorenz
  • 1. The average method size should be less than 8
    LOC for Smalltalk and 24 LOC for C++. Bigger
    averages indicate O-O design problems (i.e.
    function-oriented coding).
  • 2. The average number of methods per class should
    be less than 20. Bigger averages indicate too
    much responsibility in too few classes.

34
Design Metrics And Experience 2
  • 3. The average number of instance variables per
    class should be less than 6. Similar in reasoning
    as the previous point - more instance variables
    indicate that one class is doing more than it
    should.
  • 4. The class hierarchy nesting level should be
    less than 6. Start counting at the level of any
    framework classes that you use or the root class
    if you don't.
  • 5. The number of subsystem-to-subsystem
    relationships should be less than the average
    number of class-to-class relationships within a
    subsystem.
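
A small sketch that checks per-class statistics against the thresholds in points 1 to 4 above (using the Smalltalk figure for method size); the class data is invented:

    # Hypothetical per-class statistics.
    classes = [
        {"name": "Account",       "avg_method_loc": 6,  "methods": 18,
         "instance_vars": 4, "nesting": 2},
        {"name": "ReportBuilder", "avg_method_loc": 12, "methods": 31,
         "instance_vars": 9, "nesting": 3},
    ]

    for c in classes:                       # thresholds from points 1-4
        if c["avg_method_loc"] >= 8:
            print(c["name"], "average method size too large")
        if c["methods"] >= 20:
            print(c["name"], "too many methods")
        if c["instance_vars"] >= 6:
            print(c["name"], "too many instance variables")
        if c["nesting"] >= 6:
            print(c["name"], "hierarchy nested too deeply")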

35
Design Metrics And Experience 3
  • 6. The number of class-to-class relationships
    within a subsystem should be relatively high.
  • 7. The instance variable usage by the methods
    within a class can be used to look for possible
    design problems.
  • 8. The average number of comment lines per method
    should be greater than 1. Smaller averages
    indicate too little documentation of the (small)
    methods.
  • 9. The number of problem reports per class should
    be low.

36
Design Metrics And Experience 4
  • 10. The number of times a class is reused across
    the original application and in other
    applications should be tracked; low reuse might
    indicate a need to redesign it.
  • 11. The number of classes and methods thrown away
    should occur at a steady rate throughout most of
    the development process.

37
Other Experience 1
  • 1. A prototype class has 10 to 15 methods, each
    with 5 to 10 lines of code, and takes 1
    person-week to develop.
  • 2. A production class has 20 to 30 methods, each
    with 10 to 20 lines of code, and takes 8
    person-weeks to develop. In both these cases,
    development includes documentation and testing.
  • 3. C++ will have 2 to 3 times the lines of code
    of Smalltalk.
  • 4. Code volume will expand in the first half of
    the project and decline in the second half, as
    reviews clean up the system.

38
Other Experience 2
  • 5. Deeply nested classes are more complex, due to
    inheritance.
  • 6. A class or group of classes (e.g., a
    framework) with a low amount of coupling to other
    classes will be more reusable.
  • 7. A class has higher cohesion if its methods
    utilise similar sets of instance variables.

39
Project Completion Metrics And Experience 1
  • 1.The average number of support classes per key
    class ... will help you to estimate the total
    number of classes in the final system.
  • 2. The average man-days per application class ...
    to estimate the amount of human resources you
    need to complete the project.
  • 3. The average number of classes per developer
    ... will help you decide what staffing level is
    needed to develop the application.
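
A worked example of points 1 to 3 with invented planning figures, to show how the averages combine into an estimate:

    # Hypothetical planning figures.
    key_classes = 40
    support_per_key = 2.5        # average support classes per key class
    man_days_per_class = 10      # average man-days per application class
    classes_per_developer = 20   # classes one developer can own

    total_classes = key_classes * (1 + support_per_key)  # 140.0 classes
    effort = total_classes * man_days_per_class          # 1400.0 man-days
    staff = total_classes / classes_per_developer        # 7.0 developers
    print(total_classes, effort, staff)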

40
Project Completion Metrics And Experience 2
  • 4. The number of major iterations ... will help
    you schedule times when early-release drivers can
    be given to customers and human factors staff to
    verify requirements and usability.
  • 5. The number of subsystems should relate to the
    major functionally related portions of the total
    business system.

41
Establishing A Metrication Programme 1
  • Barbara Kitchenham
  • 1. Define your goals (which are likely to include
    requirements for measurements to support basic
    management activities).
  • 2. Identify measures that will support the
    monitoring and achievement of those goals.
  • 3. Define and plan the metrication programme.

42
Establishing A Metrication Programme 2
  • 4. Establish a data collection system to support
    the metrication programme.
  • 5. Analyse your collected data to support your
    management activities and monitor achievement of
    your goals.
  • 6. Review and update your goals in the light of
    your achievements.

43
Establishing A Metrication Programme 3
  • 7. Update your data collection in the light of
    your changing goals and management requirements."

44
Software Metrics Application 1
  • "What do the successful companies do
  • They have 'decriminalized' errors. People talk
    openly about what they have done wrong as a means
    of self-improvement. There is no need to hide
    failure management is not allowed to, or simply
    does not, use it against you.

45
Software Metrics Application 2
  • Measurement is part of 'how we do business.' That
    is, there is no management mandate or policy that
    causes measurement to happen, but rather a common
    understanding that it is the only reasonable way
    to build product."

46
Software Metrics Application 3
  • They tend to have achieved an SEI process level
    of 4 or 5 (very good) without ever having passed
    through level 3. That is, they measure and use
    feedback to improve their software process
    without ever having invoked a defined process!
    (That is, of course, the epitome of
    technologist/experimentation vs.
    management/control.)

47
Object Oriented Metrics
  • Process of development tends to be different.
  • Projects should not be penalised for this
  • Or allowed too much free rein!
  • Metrics are a very useful addition to an
    object-oriented project.

48
New Guidelines
  • Warning Signs (checked in the sketch below)
  • RFC > 100
  • RFC > 5 x NMC
  • CBO > 5
  • WMC > 100
  • NMC > 40
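
A tiny sketch that applies these warning signs to metric values for a single class; the values themselves are hypothetical:

    # Hypothetical metric values for one class.
    m = {"RFC": 120, "NMC": 22, "CBO": 4, "WMC": 95}

    checks = [
        ("RFC > 100",     m["RFC"] > 100),
        ("RFC > 5 x NMC", m["RFC"] > 5 * m["NMC"]),
        ("CBO > 5",       m["CBO"] > 5),
        ("WMC > 100",     m["WMC"] > 100),
        ("NMC > 40",      m["NMC"] > 40),
    ]
    print([name for name, fired in checks if fired])
    # ['RFC > 100', 'RFC > 5 x NMC']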

49
Overall
  • It is far more important to validate current
    metrics empirically than to propose new ones
  • Aim to link metrics to productivity, quality and
    project management