CBR in Software Engineering
1
CBR in Software Engineering
  • Brought to you by Chris Creswell (and
    Klaus-Dieter Althoff)

2
Outline
  • Chapter 9 of Case-Based Reasoning Technology
  • CBR for Experimental Software Engineering
  • By Klaus-Dieter Althoff, Andreas Birk,
    Christiane Gresse von Wangenheim, and Carsten
    Tautz
  • Another paper on the subject
  • Effective Experience Repositories for Software
    Engineering
  • By Kurt Schneider and Jan-Peter von Hunnius

3
CBR for SWE: Goals
  • Present infrastructure for learning in the
    software domain using CBR
  • Use CBR to build an experience base
  • Outline additional uses of CBR for ESE (that's
    experimental software engineering)

4
Definitions
  • Software Engineering -- concerned with the
    definition, refinement, and evaluation of
    principles, methods, techniques, and tools to
    support
  • Individual aspects of software development and
    maintenance
  • Planning of a software development project
  • Performing development, project management, and
    quality assurance
  • Assessing the performance of these methods,
    techniques, and tools

5
Motivation
  • Software development needs to improve
  • Software quality
  • Development time
  • Development cost
  • Software projects often fail to meet the
    projected goals for each of these
  • Why?
  • Software complexity has increased dramatically
  • Processes haven't changed much since the old days

6
How to improve
  • Better software comes from a better development
    process
  • Two ways to improve the process
  • Top-down Methods
  • Evaluate software development organizations and
    their processes
  • Capability Maturity Model (CMM)
  • Five levels of maturity
  • Evaluate by comparing to a reference
    best-practice model
  • Bottom-up Methods
  • Use continuous learning and reuse of experiences
  • Quality Improvement Paradigm/Experience Factory
    (QIP/EF)
  • We'll study an application of CBR to implement
    this

7
ESE: What is it?
  • Experimental Software Engineering
  • A branch of SWE that does research to improve
    software development by experiment
  • Built on the QIP/EF approach
  • The QIP is a 6-step procedure with 3 phases
  • Phase 1: Planning (where the process could
    benefit from CBR)
  • QIP1) Characterize the initial situation
  • QIP2) Identify goals
  • QIP3) Develop a plan
  • Phase 2: Execution
  • QIP4) Execute the plan
  • Phase 3: Evaluation
  • QIP5) Analyze the performed actions
  • QIP6) Package the lessons learned into reusable
    artifacts

8
The Experience Factory: What is that?
  • The Experience Factory (EF) is a logical and/or
    physical organization that supports project
    development by analyzing and synthesizing all
    kinds of experience, acting as a repository for
    such experience, and supplying that experience to
    various projects on demand.
  • Conducts steps 5 and 6 (phase 3) of the QIP

9
Current uses of CBR in SWE
  • Current ways CBR is used in SWE
  • to introduce more flexibility in software reuse,
    in combination with object-oriented techniques
  • capturing and formalizing best practices
  • effort prediction
  • requirements acquisition
  • It could do much more for ESE

10
Extending CBR for use in an EF for ESE
  • CBR works well for fine grained decisions
  • It has many goals in common with the EF
  • Research overlaps in the subjects of reuse and
    similarity based retrieval
  • To be applied to large scale organizational
    aspects, CBR must be extended

11
A few more terms
  • Technology application domain: describes the
    situations in which a technology can be applied
    successfully
  • Here, technology means SWE technologies like
    lifecycle models, design methods, configuration
    management systems
  • Technology application: a combination of an
    overall task, the specific goal of the
    application, what technology was used, and the
    context/domain of the application

12
Case representation for ESE
  • We'll represent cases as technology domain models
    (TDMs)
  • TDMs consist of intuitive descriptions of context
    characteristics, each pairing a context factor
    with a value
  • e.g., characteristic: experience of developers,
    factor value: high

13
Example of a TDM
  • Characteristic: Value
  • Technology: Reading by stepwise abstraction
  • Task: Code verification
  • Goal: High reliability of software product
  • Programming language: C
  • Amount of reuse: less than 40%
  • Experience of developers: average
  • Size of project: small (on-site)
  • What problems might there be with this
    representation? (An encoding sketch follows.)
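As a minimal sketch (not from the paper), the TDM above could be
encoded as a flat attribute-value mapping; all field names are
illustrative:

    # A TDM as a plain mapping from context characteristics to factor
    # values; keys and values are illustrative, not the authors' schema.
    tdm_reading = {
        "technology": "reading by stepwise abstraction",
        "task": "code verification",
        "goal": "high reliability of software product",
        "programming_language": "C",
        "amount_of_reuse": "less than 40%",
        "experience_of_developers": "average",
        "size_of_project": "small (on-site)",
    }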

14
TDMs continued
  • So as we have seen, TDMs are a qualitative
    experience representation
  • Get them by talking to software professionals
    using knowledge acquisition techniques
  • Assure the validity of the knowledge by tracing
    it back to concrete experiences

15
The Decision Process
  • Use a case base if there is one available
  • It should contain domain models for multiple
    technologies that share the same task and goal
  • This allows decision makers to pick the
    technology whose past applications are most
    similar to their problem
  • Decision process steps
  • 1) Determine the task and goal for the project
  • 2) Characterize the project in terms of domain
    factors
  • 3) Perform similarity-based retrieval to find
    technologies whose TDMs are sufficiently similar
    to the present project (a sketch follows)
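A hedged sketch of the three steps, assuming the flat TDM encoding
shown earlier; the similarity used here is a deliberately naive
exact-match fraction, standing in for the global measure discussed on
a later slide:

    # Naive stand-in similarity: the fraction of the project's specified
    # features that the stored TDM matches exactly.
    def similarity(project, tdm):
        shared = [f for f in project if f in tdm]
        if not shared:
            return 0.0
        return sum(project[f] == tdm[f] for f in shared) / len(shared)

    # The three decision steps over a case base of TDM dicts.
    def decide(case_base, task, goal, project):
        # 1) the task and goal determined for the project
        candidates = [t for t in case_base
                      if t.get("task") == task and t.get("goal") == goal]
        # 2) 'project' is its characterization in terms of domain factors
        # 3) similarity-based retrieval: most similar TDMs first
        return sorted(candidates,
                      key=lambda t: similarity(project, t), reverse=True)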

16
So what do we need?
  • A tool that performs similarity based retrieval
    on TDMs
  • The tool should allow decision makers to
  • Explore a filtered, prioritized set of cases to
    allow them to make informed decisions during
    planning
  • Since TDMs are so qualitative, the probability of
    exact matches is very low
  • So the tool uses an interactive decision method

17
Maintaining case bases
  • This maintenance can be described using the CBR
    task decomposition by Aamodt and Plaza
  • That's the big tree that starts with retrieve,
    reuse, revise, and retain
  • We tailor it to the principles of ESE
  • The QIP described previously

18
The Result: MIRACLE
  • MIRACLE Model Integrating Reuse And Case-based
    reasoning for Lots of software engineering
    Experiences
  • It's a combination of the CBR task decomposition
    and the QIP model
  • Sounds like a big deal, but there's only one real
    difference between this and the CBR decomposition

19
MIRACLE
(Figure: the MIRACLE process model)
20
MIRACLE elaboration
  • Specify: specify the project
  • Search: search for possibly relevant planning
    information
  • Choose: choose planning info from projects that
    are similar to the forthcoming one
  • Select: select suitable goals and models for
    them
  • Copy: copy the most suitable models or create
    new ones

21
MIRACLE elaboration
  • Adapt: modify models
  • Apply: perform project and collect data
  • Assess: evaluate success of applying models
    using the collected data
  • Improve: detect weaknesses and find out how to
    improve the models
  • Extract: identify information to store (lessons
    learned)
  • Index: set up suitable characterization schemes
    for and relationships between the items to be
    stored
  • Integrate: update the experience base (all twelve
    tasks are grouped in the sketch below)
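One way to keep the twelve tasks straight is to group them under the
Aamodt and Plaza tasks they refine; the retrieve group is stated
explicitly on a later slide, and the remaining groups follow the
standard task decomposition (a sketch, not the authors' diagram):

    # The twelve MIRACLE tasks grouped under the four CBR tasks.
    MIRACLE_TASKS = {
        "retrieve": ["specify", "search", "choose", "select"],
        "reuse":    ["copy", "adapt"],
        "revise":   ["apply", "assess", "improve"],  # apply is MIRACLE's addition
        "retain":   ["extract", "index", "integrate"],
    }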

22
MIRACLE analysis
  • The only real difference from the CBR
    decomposition is the inclusion of the apply task,
    and a few names have been changed
  • Let's look at how MIRACLE maps to the CBR task
    decomposition and the QIP

23
Reminder: here's what the CBR cycle looks like
(Figure: the Aamodt and Plaza CBR cycle: retrieve, reuse, revise,
retain)
24
Mapping MIRACLE to CBR
(Figure: MIRACLE tasks grouped under the four CBR tasks: retrieve,
reuse, revise, retain)
25
Mapping MIRACLE to QIP
(Figure: MIRACLE tasks grouped under the six QIP steps: 1 characterize,
2 set goals, 3 choose models, 4 perform, 5 analyze, 6 package)
26
Walking through the MIRACLE process
  • Retrieve objects with characterization similar to
    the one specified
  • Must be able to cope with incomplete information
  • CBR can do this as long as we
  • Specify the needed objects in as much detail as
    possible
  • Retrieval corresponds to the specify, search,
    choose, and select steps of MIRACLE

27
Similarity
  • The set of candidate cases must be reduced after
    searching; this is the choose step
  • This is done using a global similarity measure
    and taking into account all specified features
    (whereas search only uses one feature)
  • A global similarity metric is something like the
    Hamming Distance we saw in class last week (a
    sketch follows)
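As a rough sketch: search filters on a single feature, while choose
ranks the survivors with a global measure over all specified features,
here 1 minus a weighted, normalized Hamming distance. The weights and
the skipping of unspecified features are assumptions, not the authors'
definitions:

    # Global similarity: weighted agreement over all features the query
    # specifies (equivalently, 1 minus a weighted, normalized Hamming
    # distance). Features absent from the query or case are skipped,
    # which is one way to cope with incomplete information.
    def global_similarity(query, case, weights):
        feats = [f for f in query if f in case]
        total = sum(weights.get(f, 1.0) for f in feats)
        if total == 0:
            return 0.0
        agree = sum(weights.get(f, 1.0) for f in feats if query[f] == case[f])
        return agree / total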

28
Copying and Adapting
  • Selection should calculate costs for
  • Modifying an object
  • Creating a new one
  • Using these estimates, we decide if it is
    worthwhile to reuse an object
  • If a new object is to be created, a template
    should be used and the result modified
  • So either way, something is copied and adapted
    (see the sketch below)
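A sketch of that selection logic; the two cost estimators are
hypothetical parameters, not functions from the paper:

    # Reuse the retrieved object if modifying it is estimated to be
    # cheaper than instantiating a fresh template; either way the result
    # is a copy that still gets adapted afterwards.
    def select_starting_point(retrieved, template, modify_cost, create_cost):
        if retrieved is not None:
            if modify_cost(retrieved) <= create_cost(template):
                return dict(retrieved)   # copy the existing object
        return dict(template)            # or copy a template for a new one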

29
Copying and adapting continued
  • Two ways to adapt
  • Transformational reuse: modify the existing
    object itself to suit the new problem
  • Derivational reuse: don't reuse the existing
    object itself; instead reuse the process that
    created it
  • Let the process be a guide to creating the new
    object, and modify it as necessary along the way
    (the two styles are contrasted in the sketch below)
  • We will study case adaptation in more detail
    later in the course
  • After adaptation, the object must be validated
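A toy contrast of the two reuse styles, assuming a recorded process can
be replayed as a list of callable steps; everything here is
illustrative:

    # Transformational reuse: tweak the retrieved object directly.
    def transform(old_object, changes):
        new_object = dict(old_object)   # copy, then modify the object itself
        new_object.update(changes)
        return new_object

    # Derivational reuse: replay the recorded derivation steps against
    # the new problem, adjusting as we go.
    def derive(steps, new_problem):
        obj = {}
        for step in steps:              # each step builds part of the object
            obj = step(obj, new_problem)
        return obj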

30
Validating the object
  • This can be done in several ways
  • Apply the object and assess its performance after
    the fact
  • Review by experts
  • Test by model: use a formal model to determine
    the object's suitability

31
Improving the revised object
  • If it worked perfectly, then just characterize
    and store the new case
  • If the object failed, we can learn from it
  • Reasons for failure include
  • Wrong specification: the specification was
    incorrect, the other steps were fine
  • Wrong application: the reused object was not
    applied as planned
  • Wrong adaptation: all the other steps were
    correct but the adaptation was faulty
  • After we know what went wrong (if anything),
    improve the object accordingly

32
Retaining the object
  • Store the new insights for reuse
  • Experience base should be updated regardless of
    whether the reused object worked successfully
  • When storing info
  • The object can be stored as a new object or a
    modification of an existing one
  • Objects other than the one reused can be stored
  • Information related to the reused object can be
    stored (e.g., its characterization, assessment
    results, etc.)
  • This is called the experience package (a sketch
    follows)
  • That's it for MIRACLE
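One guess at what an experience package might look like as a record;
the fields mirror the bullets above, and the names are illustrative:

    from dataclasses import dataclass, field

    # An experience package: the reused object together with the related
    # information listed above. Field names are illustrative.
    @dataclass
    class ExperiencePackage:
        obj: dict                 # the new object, or a modified one
        characterization: dict    # its TDM-style characterization
        assessment_results: dict = field(default_factory=dict)
        related_objects: list = field(default_factory=list)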

33
Current status
  • The authors are constructing an experience base
    focusing on case representation for reusable
    objects, definition of similarity measures, and
    the representation of general knowledge
  • Translation: it doesn't work yet; nobody has
    implemented MIRACLE completely
  • But that's not all

34
Current status
  • The authors are also focusing on creating a
    metaCBR tool
  • They have selected 50 features to characterize
    CBR systems and allow
  • CBR practitioners to describe their system
    quickly
  • Developers to choose a system, then give
    feedback, thus creating more experience
  • The cases were developed with a commercial tool
    (CBR-Works from TecInno)
  • They are available online
    http://www.iese.fhg.de/Competences/QPE/QE/metaCBR.html

35
Conclusion
  • We've seen MIRACLE, a model that merges the CBR
    cycle and the principles of the QIP
  • Case representation with TDMs
  • We believe that the experience factory concept
    offers a kind of infrastructure that is helpful
    for CBR applications not only in ESE, but in any
    kind of applications fielded in industrial or
    other business environments. Thus, an experience
    factory operationalizes a CBR system in
    industrial/business environments, while CBR
    offers computer based support on a very broad
    level.
  • Questions about this chapter before we move on?

36
Effective Experience Repositories for Software
Engineering, by Kurt Schneider and Jan-Peter von
Hunnius
  • The product of several attempted experience
    repositories at Daimler-Chrysler
  • They have developed a set of success factors for
    experience repositories
  • Motivation: the difficulty of producing an
    effective tool is often underestimated; some
    guidelines are needed to develop a successful
    repository in one try (these authors failed
    several times)

37
Key success factors for experience repositories
  • 1) User guidance
  • 2) Usability
  • 3) Process conformance
  • 4) Feedback mechanism
  • 5) Maintainability

38
Key success factors for experience repositories:
elaboration
  • User guidance: the system should provide direct
    support for the user's work
  • Usability: the user should not be required to
    learn much to use the system
  • A whole lot of work has been done on these first
    two
  • Process conformance: the system should conform
    to the task that it is designed to support
  • By making an improved process the centerpiece
    and backbone of an experience repository,
    navigation, orientation, and search are greatly
    improved

39
Key success factors for experience repositories:
elaboration
  • Feedback mechanism: the repository should enable
    and encourage feedback through several channels
  • Experience calls for continuous update and rework
  • Maintainability: new feedback needs to be
    analyzed and integrated into the existing contents

40
Lessons that support these key qualities
  • Be specific: don't be vague in the structure of
    the repository
  • One process at a time: a repository should
    concentrate on a single process at a time
  • Usability counts: if users have a hard time with
    the system, they won't bother with it
  • One-way communication fades away: the system
    cannot be hardcoded; it has to be able to improve

41
Fast repository assessment
  • The authors developed a checklist to evaluate
    repositories quickly
  • A few example questions from the list
  • User Guidance
  • Is a new user properly introduced into the area
    of interest?
  • Can the community of practice discuss using the
    repository?
  • Process Conformance
  • Is the defined process the Best Known Practice?
  • Does feedback have an impact on the process?

42
Final point
  • Repositories are important, but they are not
    sufficient to make a software organization
    effectively learn from experiences. Without a
    learning attitude and some appreciation for
    continuous process improvement, even the best
    repository will not make the experiences fly.
  • Questions about this paper?