Johns Hopkins University Software Engineering Fall 2002 - Presentation Transcript
1
Johns Hopkins University Software Engineering
Fall 2002
  • 22 October 2002
  • Bill Akin

2
Tonight's Agenda
  • A review of concepts covered so far with some
    additional material within the topics
  • A discussion of the take-home portion of the
    mid-term exam
  • Discussion on the in-class portion of the
    mid-term exam next week

3
Schedule
  • Class Date Chapters Events and Deliveries
  • 1 10-Sep Overview
  • 2 17-Sep 1, 2, 3, 4 Project and team
    organization
  • 3 24-Sep 5, 6 Present team project proposals
  • 4 1-Oct 10, 11, 12, 13
  • 5 8-Oct 14, 15, 16
  • 6 15-Oct 20, 21 Deliver proposal document
  • 7 22-Oct Present requirements analysis /
    models
  • 8 29-Oct 22 Mid-Term Exam
  • 9 5-Nov Deliver requirements document
  • 10 12-Nov 7, 8, 9, 19, 24 High Level Design
    Presentation
  • 11 19-Nov 17, 18, 23 Deliver high level design
    document
  • 12 26-Nov Part Five Topics
  • 13 3-Dec Project Demonstrations - Deliver
    project
  • 14 10-Dec Final Exam

4
Engineering
  • Software Engineering is the establishment and
    use of sound engineering principles in order to
    obtain economically software that is reliable and
    works efficiently on real machines.

Circa 1969
(Diagram: sound engineering principles -> economical acquisition, reliable software, efficient software)
5
Engineering
  • Software Engineering
  • (1) The application of a systematic, disciplined,
    quantifiable approach to the development,
    operation, and maintenance of software; that is,
    the application of engineering to software.
  • (2) The study of approaches as in (1).

IEEE
6
Engineering
Engineering is the analysis, design,
construction, verification, and management of
technical (or social) entities. Regardless of
the entities, the following questions must be
asked and answered.
  • What is the problem to be solved?
  • What characteristics of the entity are used to
    solve the problem?
  • How will the entity (and the solution) be
    realized?
  • How will the entity be constructed?
  • What approach will be used to uncover errors that
    were made in the design and construction of the
    entity?
  • How will the entity be supported over the long
    term, when corrections, adaptations, and
    enhancements are requested by users of the entity?

7
Measurement Metrics
  • Measure (n) the dimensions, capacity, or amount
    of something ascertained by measuring.
  • Metric (n) A standard of measurement, e.g., No
    metric exists that can be applied directly to
    happiness.
  • In general, a measure is a value you compare
    against a metric.

8
Measurement Metrics
  • Measure: How many errors per hour are found
    during code reviews? The scribe in a code review
    polls all reviewers and asks how many errors were
    found by each and how long each reviewer spent in
    the review. The numbers are recorded.
  • Metrics are established using many reviews and
    compared to the measures for each review.
  • This metric has at least one obvious flaw.

9
Measurement Metrics
  • Measure: How many comments per 100 lines of code
    are in this module? Someone counts the lines of
    code and the number of comments and derives the
    measure of 27.4 comments per 100 lines.
  • Metric: The number of comments per 100 lines of
    code should be between 4 and 10.
  • Our measure is out of range (a sketch of this
    measure follows below).
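To make this measure concrete, here is a minimal sketch of how it could be computed. The convention that a comment is a line starting with "#", the decision to ignore blank lines, and the helper name are assumptions for illustration, not part of the course material.

def comments_per_100_loc(path):
    """Measure: comments per 100 lines of code in one module."""
    comment_lines = 0
    code_lines = 0
    with open(path) as source:
        for line in source:
            stripped = line.strip()
            if not stripped:
                continue                  # ignore blank lines
            if stripped.startswith("#"):
                comment_lines += 1        # assumed comment convention
            else:
                code_lines += 1
    return 100.0 * comment_lines / code_lines if code_lines else 0.0

measure = comments_per_100_loc(__file__)  # measure this very file as a demo
print(f"Measure: {measure:.1f} comments per 100 lines")
# Metric from the slide: the value should fall between 4 and 10.
print("In range" if 4 <= measure <= 10 else "Out of range")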

10
Measurement Metrics
  • Cost, resources, schedule, and other quantities
    must be predicted for most projects before
    approval is given to begin.
  • Estimation variables are measures fed to models
    to scope projects. They include lines of code
    (LOC), function points (FP), and object points
    (OP).
  • Derived metric values drive the model (see the
    sketch below).
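As one hedged illustration of such a model, the basic COCOMO equations estimate effort from size as E = a * (KLOC)^b and schedule as T = c * E^d. The coefficients below are the published basic-COCOMO values for "organic" projects; assuming they fit any particular project, and the 32 KLOC size, are illustrative assumptions, not the model prescribed by the course.

# Basic COCOMO, "organic" project class.
A, B = 2.4, 1.05   # effort: E = A * KLOC**B, in person-months
C, D = 2.5, 0.38   # schedule: T = C * E**D, in calendar months

def estimate(kloc):
    effort = A * kloc ** B
    schedule = C * effort ** D
    staff = effort / schedule     # average head count implied by the model
    return effort, schedule, staff

effort, schedule, staff = estimate(32)    # hypothetical 32 KLOC project
print(f"{effort:.1f} person-months over {schedule:.1f} months, "
      f"about {staff:.1f} people on average")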

11
Measurement Metrics
  • Take measurements over many similar projects.
  • Correlate the measures to some quality goal or
    objective.
  • Determine a value or set of values in the
    measures that tend to indicate good quality goals
    or objectives.
  • From these values establish a metric (see the
    sketch below).
  • Beware of false conclusions.
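As a hedged illustration of this procedure, the sketch below turns measures gathered across many reviews into a metric; the sample values and the choice of mean plus or minus one standard deviation as the acceptable band are assumptions for the example, not a recommended policy.

from statistics import mean, stdev

# Hypothetical measures from past reviews: errors found per reviewer-hour.
historical_measures = [2.1, 1.8, 2.4, 2.0, 1.9, 2.6, 2.2]

# Establish a metric as a band around historical experience.
center = mean(historical_measures)
spread = stdev(historical_measures)
metric_low, metric_high = center - spread, center + spread
print(f"Metric: {metric_low:.2f} to {metric_high:.2f} errors per reviewer-hour")

# Compare a new review's measure against the metric.
new_measure = 3.4
if not (metric_low <= new_measure <= metric_high):
    print("Out of range: investigate before drawing conclusions.")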

12
Measurement Metrics
  • The number of metrics should be limited.
  • The value of each metric must be maximized.
  • The selection of metrics must be relevant.
  • Be sure the metrics continue to be good
    indicators. Some metrics will change depending
    on people, approaches, etc.
  • Do not be bogged down or misled by your own
    measures.

13
Measurement Metrics
  • Each kind of metric can be applied toward a given
    set of goals or objectives. Examples
  • Comments per 100 lines of code may indicate a
    level of maintainability.
  • The McCabe cyclomatic complexity is considered to
    be a good indicator of module maintainability,
    especially if maintenance is to be performed by
    someone besides the developer (see the sketch
    below).
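For background, McCabe's cyclomatic complexity can be computed from a module's control-flow graph as V(G) = E - N + 2 (edges minus nodes plus two for a single connected component), or equivalently as the number of binary decision points plus one. The sketch below uses the second form; counting Python's if/for/while constructs and boolean operators as decision points is a simplifying assumption for illustration.

import ast

def cyclomatic_complexity(source_code):
    """Approximate V(G) as (number of decision points) + 1."""
    decisions = 0
    for node in ast.walk(ast.parse(source_code)):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.IfExp)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1   # each extra and/or adds a path
    return decisions + 1

example = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(example))   # if + elif -> 2 decisions -> V(G) = 3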

14
Risk Management
  • Risk is the chance that something may happen.
  • What if it does? What kind of damage can occur?
    What will that lead to?
  • How likely is it? Can we state its probability?
  • Who needs to know?
  • What should be done? Can we prevent it? How do
    we manage it?

15
Risk Management
  • Schedule slippage
  • Project scope
  • Product misses the mark
  • Product malfunctions after delivery (defect)
  • When a malfunction occurs there are results and
    consequences.
  • The impact of these varies by type of product.

16
Risk Management
  • Real-time software product provides unmanned
    control of customer equipment.
  • Defect: Program error sends wrong value to
    equipment.
  • Result: Equipment turns the wrong way. Package
    goes to wrong city.
  • Consequence:
  • FedEx router: article is returned to sender and
    sender is unhappy, or
  • ICBM system: article is returned to sender and
    sender no longer has the facility to be unhappy.

17
Risk Management
  • Risk prevention is not possible.
  • Risk can be understood and managed.
  • Management implies:
  • Some reduction in the probability of risk,
  • Some reduction or control of results,
  • Some level of contingency plan to reduce the
    consequence of the results, and
  • Some ability to understand the level and
    likelihood of the risk, the potential result, and
    the predicted consequence.

18
Project Management
  • Support Processes
  • Documentation
  • Configuration Management
  • Quality Assurance
  • Verification
  • Validation
  • Joint Review
  • Audit
  • Problem Resolution
  • Development Activities
  • Process Implementation
  • System Requirements Analysis
  • System Architectural Design
  • Software Requirements Analysis
  • Software Architectural Design
  • Software Detailed Design
  • Software Coding and Testing
  • Software Integration
  • Software Qualification Testing
  • System Integration
  • System Qualification Testing
  • Software Installation
  • Software Acceptance Support
  • Organizational Processes
  • Management
  • Infrastructure
  • Improvement
  • Training

19
Project Management
  • Define and scope the problem
  • Provide initial tasking and estimates
  • Manage customer and staff expectations
  • Continuously improve estimates
  • Continuously update project plan
  • Continuously evaluate and reduce uncertainty

20
Project Management
  • Bound the project: get your arms around it
  • Determine what function and performance are
    allocated to software through system engineering
    (Chapter 10)
  • First cut should address data, control, function,
    performance, constraints, interfaces, and
    reliability
  • Initial project analysis

21
Project Management
  • Identify stakeholders
  • Identify users
  • Establish benefits
  • Examine alternative solutions
  • Identify solution environment
  • Establish and maintain continuous and iterative
    discussion with appropriate stakeholder
    representatives

22
Project Management
  • Determine feasibility
  • Can we build this?
  • Is the project feasible?
  • Four feasibility dimensions
  • Technology
  • Finance
  • Time
  • Resources

23
Project Management
  • Do we have the tools?
  • Facilities and hardware
  • Software
  • Environment
  • Do we have components?
  • Reusable components and subsystems including
    COTS
  • Class and subroutine libraries
  • Do we have the people?
  • Skill
  • Availability

24
Project Management
  • Yesterday: Huge, expensive computers with a few
    programs
  • Today: Variety of reasonably priced hardware
    components and complex software elements
  • Emerging: Variety of reasonably priced hardware
    components and vast repositories of reusable
    software elements

25
Project Management
  • Divide and conquer technique
  • Large problems are too hard to solve
  • Break them into a collection of smaller problems;
    repeat until solvable
  • How can we tell how much work, time, people, etc.
    will be needed to solve the little problems? We
    have solved little problems before, and we took
    measurements

26
Project Management
  • Software scope
  • Context
  • Information objectives
  • Function and performance
  • Problem decomposition
  • Systems engineering
  • Problem partitioning
  • Requirements allocation
  • Divide and conquer strategy

27
Project Management
  • Start on the right foot
  • Maintain momentum
  • Track progress
  • Make smart decisions
  • Conduct a postmortem analysis

28
Project Management
  • Boehm's W5HH Principle
  • Why is the system being developed? Establish the
    need, the concept, and justification with a
    cost-benefit analysis.
  • What will be done, by when? Develop a work
    breakdown structure and a schedule.
  • Who is responsible for a function? Establish
    staff structure and assign tasks.

29
Project Management
  • Boehm's W5HH Principle
  • Where are they organizationally located? Identify
    authority, role, and organization for all
    responsible players.
  • How will the job be done technically and
    managerially? Develop a project plan and a
    project management plan.
  • How much of each resource is needed? Conduct a
    cost estimation analysis.

30
Project Management
  • The author provides a list of critical practices
  • Formal risk management
  • Empirical cost and schedule estimation
  • Metric-based project management
  • Earned value tracking
  • Defect tracking against quality targets
  • People-aware program management

31
Systems Engineering
  • System -- A set or arrangement of elements that
    are organized to accomplish some predefined goal
    by processing information
  • Computer-based systems comprise
  • Software
  • Hardware
  • People
  • Data stores
  • Documentation
  • Procedures

32
Systems Engineering
  • The role of a systems engineer is to define the
    elements for a specific computer-based system in
    the context of the overall hierarchy of systems
    (macro elements).
  • Systems engineers define the elements of a system
    to be developed from a specific set of
    requirements and within the confines of a
    particular architecture.
  • They define the interfaces between the system
    elements and define the processes they perform.

33
Systems Engineering
  • Systems engineers must provide a mapping from a
    specific problem space to a selected solution
    space.
  • The problem space is the collection of
    requirements that defines the desired system.
  • The solution space is the collection of elements,
    interfaces, and processes that define a system to
    be built that completely and correctly satisfies
    all the requirements in the problem space.

34
Systems Engineering
  • Defining a system in the solution space is called
    system modeling.
  • The engineer creates models that
  • Define the processes that serve the needs of the
    view under construction
  • Represent the behavior of the processes and the
    assumptions on which the behavior is based,
  • Explicitly define both exogenous and endogenous
    input to the model, and
  • Represent all linkages (including output) that
    will enable the engineer to better understand the
    view.

35
Systems Engineering
  • "The hardest single part of building a software
    system is deciding what to build. No other part
    of the work so cripples the resulting system if
    done wrong. No other part is more difficult to
    rectify later."
  • This understated quote from Fred Brooks is the
    reason systems engineering is needed.

Fred Brooks
36
Systems Engineering
  • DoD architecture approach
  • Capture the operational view
  • Identify the technical environment
  • Map the operational onto the technical to form
    the system view
  • Business process engineering -> product
    engineering
  • Implement within environmental constraints

37
Requirements Engineering
  • The problem space provides a living definition of
    the problem a system is required to solve.
  • The process of requirements engineering defines
    and maintains the problem space and provides
    consistent views of it to a well defined
    community of stakeholders with diverse needs.

38
Requirements Engineering
  • Stakeholders who must view and understand the
    problem definition (i.e., the system
    requirements) include those who must
  • Acquire the system,
  • Build the system,
  • Maintain the system,
  • Test the system, and
  • Many others we could list

39
Requirements Engineering
  • The requirements engineering process must gather,
    specify, validate, and maintain requirements.
  • The entire collection of requirements must remain
    consistent, correct, and valid throughout the
    life of the defined problem space.
  • There may be multiple versions of the problem
    space at any point in time.

40
Requirements Engineering
  • Requirements engineering can be described in
    distinct steps
  • Requirements elicitation,
  • Requirements analysis and negotiation,
  • Requirements specification,
  • System modeling (in the problem space),
  • Requirements validation, and
  • Requirements management.

41
Requirements Elicitation
  • Learn from the stakeholders about the problem the
    system is to solve.
  • Listen carefully and capture every idea from each
    person who will share his or her vision.
  • Record everything even though some inputs may
    conflict with others.
  • Get only enough to help you understand what is
    required. You'll be back for details and
    clarification later.

42
Requirements Elicitation
  • Statement of need and feasibility
  • Bounded statement of system scope
  • List of customers, users, and other stakeholders
  • Description of the system's technical environment
  • List of requirements and domain constraints
  • Set of usage scenarios
  • Any prototypes developed

43
Requirements Analysis
  • Systems engineers take possession of emerging and
    evolving artifacts.
  • The systems engineering process becomes the
    customer of the requirements engineering process.
  • Systems analysts still perform refinement of the
    products they have delivered.
  • Systems analysts deliver a validated, living
    requirements specification.

44
Requirements Analysis
  • Software requirements analysis bridges the gap
    between system requirements engineering and
    software design.
  • Software requirements analysis is divided into
  • Problem recognition,
  • Evaluation and synthesis,
  • Modeling,
  • Specification, and
  • Review

45
Requirements Analysis
  • Are requirements consistent with objectives?
  • Are specifications at the appropriate level?
  • Is each requirement necessary, or is it an
    enhancement?
  • Is each requirement bounded and unambiguous?
  • Does each requirement have attribution?
  • Are there conflicts between any requirements?
  • Is each requirement achievable in the target
    environment?
  • Is each requirement testable, once implemented?

46
Requirements Specification
  • In addition to the ultimate system user, others
    have a stake in the requirements specification.
  • All stakeholders should validate the requirements
    specification to be sure the requirements are
    unambiguous, consistent, complete, correct, and
    compliant.
  • Important stakeholders include testers,
    developers, managers, etc.

47
Requirements Specification
  • Every version of the requirements specification
    must be submitted to software configuration
    management (SCM) -- often just called CM.
  • Additions, changes, and deletions to requirements
    must be done through a controlled process and
    submitted to CM.
  • Requirements baselines must be identified.

48
Requirements Specification
  • Pressman: The System Specification is the final
    work product of the system and requirements
    engineer.
  • In almost all descriptions of the systems
    engineering process, systems engineers define the
    high level design of the system and allocate
    requirements to the various subsystems for
    further development.

49
Requirements Specification
  • A requirements specification defines a view of
    the problem space. It does not always define a
    particular system to be built.
  • A given system, version of a system, or release
    of a version of a system should satisfy a subset
    of the problem space defined by the requirements
    specification.
  • The subset that specifies a certain system is
    called a requirements baseline.

50
System Modeling
  • Create a model of system capabilities (We're
    still in the problem space.)
  • Model should represent
  • Input,
  • Processing,
  • Output,
  • User interface, and
  • Maintenance.

51
System Modeling
  • Modeling should be recursive and hierarchical.
  • For most models the top level view is called the
    context view.
  • The context view depicts the system in the
    context of its environment, including external
    entities that directly interface and interact
    with the system.

52
System Modeling
  • Suppose E-Bay had pockets that were not as deep
    and couldn't afford such great computer systems.
    Build a system to archive less active buyers and
    sellers.
  • Dream up some archiving functions, depict the
    context-level diagram, and draw at least one
    further decomposed diagram.

53
System Modeling Context DFD (Level 0)
(Diagram: context-level DFD showing the E-Bay Archive System surrounded by the external entities Auction System, Seller System, Buyer System, and Rating System.)
54
System Modeling System (Level 1)
(Diagram: Level 1 DFD with numbered processes, including Sales Archiving, Customer Archiving, and Ratings Archiving, connected to an Archives data store.)
55
System Modeling
  • The models capture the view of the functions to
    be performed by the system.
  • It is more likely that solution space groupings
    lie along common computer functions than along
    common user functions.
  • For example, user interface activities or data
    management functions may be grouped.

56
Process Maturity
  • The SEI defines five levels of capability
    maturity for software development organizations
    and provides models for each. The levels are
    called
  • Initial (often called Chaos)
  • Repeatable
  • Defined
  • Managed
  • Optimizing

SEI CMM Levels
57
Process Maturity
  • The SEI model defines a set of key process areas
    (KPAs) for each level, e.g., Project Planning.
  • Each level includes all the KPAs from previous
    levels plus new ones added to achieve the current
    level.
  • For each KPA there is a defined set of
    characteristics.

58
Process Maturity
  • Level 2 KPAs -- Repeatable
  • Software configuration management
  • Software quality assurance
  • Software subcontract management
  • Software project tracking and oversight
  • Software project planning
  • Requirements management

59
Process Maturity
  • Level 3 KPAs -- Defined
  • Peer reviews
  • Intergroup coordination
  • Software product engineering
  • Integrated software management
  • Training program
  • Organization process definition
  • Organization process focus

60
Process Maturity
  • Level 4 KPAs -- Managed
  • Software quality management
  • Quantitative process management

61
Process Maturity
  • Level 5 KPAs -- Optimizing
  • Process change management
  • Technology change management
  • Defect prevention

62
Process Maturity
  • KPA Characteristics
  • Goals
  • Commitments
  • Abilities
  • Activities
  • Methods for monitoring implementation
  • Methods for verifying implementation

63
Software Lifecycle
  • A software lifecycle has three basic phases
  • Definition phase focuses on What
  • Development phase focuses on How
  • Support phase focuses on Change
  • The change phase addresses
  • Correction Corrective maintenance
  • Adaptation Adaptive maintenance
  • Enhancement Perfective maintenance
  • Prevention Preventative maintenance

64
Software Lifecycle
  • All software lifecycle models must address, at a
    minimum, the following elements
  • Software requirements analysis
  • Design
  • Code generation
  • Testing
  • Support
  • Each model provides an approach to organizing
    and executing the elements.

65
Software Lifecycle Waterfall Model
(Diagram: Analysis -> Design -> Code -> Test, each stage flowing into the next.)
66
Software Lifecycle Rapid Application Development
RAD and CBD are both discussed in Chapter 1.
Take these discussions with a grain of salt.
(Diagram: Business Modeling -> Data Modeling -> Process Modeling -> Application Generation -> Testing and Turnover. This cycle is repeated for each component of a system to be developed.)
67
Software Lifecycle The Incremental Model
(Diagram: several increments developed in parallel, each passing through Analysis -> Design -> Code -> Test.)
Be careful with this one, too. It is a very
important modern approach requiring good systems
engineering.
68
Software Lifecycle Spiral Model
  • Each iteration of the spiral provides a more
    complete version of the system.
  • Each delivered version is a replacement for the
    previous version, built on that previous version
    as a baseline.
  • By comparison, the incremental model usually
    delivers additive elements to the previous
    version.
  • These are often built in parallel.

69
Software Lifecycle Code and Fix
  • Basic model used in the earliest days of software
    development
  • Two steps: write the code, then fix problems in
    the code
  • Problems:
  • Repeatedly fixing the same code leads to such
    poor structure that the code becomes
    unmaintainable
  • Code typically does not meet user needs
  • Code is hard to test because there is no
    preparation for testing

70
Software Lifecycle Stagewise
  • Introduced in the mid-1950s
  • Software developed in successive stages
  • operational plan,
  • operational specifications,
  • coding specifications,
  • coding,
  • parameter testing,
  • assembly testing,
  • shakedown,
  • system evaluation

71
Software Lifecycle Waterfall Model (1)
  • Introduced as a refinement to the stagewise model
    in 1970
  • Added feedback loops between steps (confined to
    successive steps)
  • Added prototyping to support requirements
    analysis and design
  • The standard approach to SW development for 20
    years (has been the basis for most SW acquisition
    standards used in government and industry)

72
Software Lifecycle Waterfall Model (2)
  • Criteria for selection
  • System can be developed in less than 12 months
  • Cannot practically break into builds (releases)
  • The added cost of developing support SW
    (emulation, simulation, test) is more than 20% of
    total development cost
  • The need for timely delivery of the entire system
    is the driver
  • Problems:
  • Emphasis on elaborate documents as completion
    criteria does not work well for certain classes
    of SW (interactive, end-user applications)
  • Doesn't work well for COTS SW integration

73
Software Lifecycle Evolutionary (Incremental
Build)
  • Subset of functionality developed and delivered
    (Build 0)
  • Further functionality is added as subsequent
    builds
  • Each build goes through a complete cycle of the
    tailored SW activities
  • Development builds transition to maintenance
    builds
  • Problems: can create "spaghetti code" if
    components of early builds are continuously
    modified for later builds; assumes that the
    user's operational system will be flexible enough
    to accommodate unplanned evolution paths (not
    always a valid assumption)

74
Software Lifecycle Spiral Model (1)
  • Introduced in the mid-1980s, based on experience
    with refinements to the Waterfall model
  • Each cycle involves a progression that addresses
    the same sequence of steps for each portion of
    the product and each level of elaboration
  • Its range of options accommodates the good
    features of existing SW process models, while its
    risk-driven approach avoids many of their
    difficulties
  • Focuses early attention on options involving SW
    reuse

75
Software Lifecycle Spiral Model (2)
  • Focuses on eliminating errors and unattractive
    alternatives early
  • Problems: not applied extensively to contract
    software acquisition; relies heavily on
    risk-assessment expertise; not yet as fully
    elaborated as the more established models

76
Software Lifecycle Prototyping (1)
  • General characteristics
  • Prototyping almost always part of a software
    development project
  • Important for risk aversion and requirements
    analysis because it
  • 1) helps customers identify what they want (they
    see the system)
  • 2) helps make sure that potential problem areas
    have solutions
  • 3) can provide the customer a quick, usable
    capability
  • 4) provides checkpoints and demonstrates
    observable progress

77
Software Lifecycle Prototyping (2)
  • Throwaway Prototype
  • Used during early software engineering to
    pinpoint requirements
  • Reduce risk by experimenting with different
    implementations
  • Developed without respect to standards or future
    maintenance
  • Discarded after preliminary design; requirements
    and algorithms survive

78
Software Lifecycle Prototyping (3)
  • Evolutionary Prototype
  • Becomes part of the delivered system
  • Tailored development practices do apply
  • Working Model Prototype
  • Normally does not lead to a larger SW development
    effort
  • Used to examine performance characteristics, HMI
    response, ...

79
Software Lifecycle Software Process Models
  • Software process models define the sequence of
    activities and the criteria for transitioning
    from one activity to the next
  • A process model (evolutionary, prototype, spiral)
    differs from a methodology (structured analysis,
    object-oriented design) in that methodologies
    define how to proceed through each step and how
    to represent the products of each step
  • Five process models are identified (as well as
    three models no longer used in the software
    industry)

80
Software Lifecycle COTS Integration
  • Two decades ago...
  • The only commercial software included in the
    typical software system was the operating system
    and even the OS was often modified to meet the
    needs of a specific project .
  • Today...
  • Customers expect the use of COTS software
    products whenever possible. Operating Systems,
    DBMS products, communications packages, graphical
    user interface products are typically an integral
    part of developed software systems. Integrators
    build systems with as much as 80% COTS. This
    reduces the cost of software development but
    increases the amount of integration required.
    Traditional software size and cost estimation
    techniques do not adequately address this issue.

81
Object-Oriented Concepts
  • Pressman: For many years, the term object
    oriented (OO) was used to denote a software
    development approach that used one of a number of
    object-oriented programming languages (e.g.,
    Ada95, C++, Eiffel, Smalltalk).
  • If we want to do object-oriented development,
    what the heck is an object?

82
Object-Oriented Concepts
  • Definition: An object is an entity which has a
    state and a defined set of operations to access
    and modify that state. [Som89]
  • Constructor operations modify an object's state.
  • Access operations access the object's state.
  • We can also think of an object as a closed
    system, subsystem, or component that maintains
    its own state, accepts a specified set of
    messages, provides responses, and/or performs
    pre-defined functions.

83
Object-Oriented Concepts
  • Examples of objects include a juke box, an ATM,
    a robot, a software module, etc.
  • Classes: objects are generally instances of
    classes. Object-A is a Class-G. Roger is a
    robot. Billing-Address is an Address-Class.
  • Inheritance: Class-Red-Dogs is a Class-Dog that
    also has an attribute that it is Red. This is
    not a new concept. We have other
    classifications. Monkeys are Primates with long
    tails. Primates are man-like Mammals.

84
Object-Oriented Concepts
  • Constructor: changes state
  • Joes_Car.Set_Color (Red)
  • My_Checking.Withdraw (75.68)
  • Access: retrieves state information
  • Print Joes_Car.Make
  • Print My_Checking.Balance
  • Print Point_A.Distance (Point_B)
  • Note that Balance may be calculated from state
    data of the object My_Checking and Distance is
    computed relative to an input parameter (see the
    sketch below).
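As a hedged illustration of constructor and access operations, the sketch below models a checking account in Python; the class name, the no-overdraft rule, and the opening balance are assumptions made for the example, not details from the slides.

class CheckingAccount:
    """An object: encapsulated state plus constructor and access operations."""

    def __init__(self, opening_balance=0.0):
        self._balance = opening_balance   # encapsulated state

    # Constructor operations: modify the object's state.
    def deposit(self, amount):
        self._balance += amount

    def withdraw(self, amount):
        if amount > self._balance:        # assumed rule: no overdrafts
            raise ValueError("insufficient funds")
        self._balance -= amount

    # Access operation: report state without changing it.
    @property
    def balance(self):
        return self._balance

my_checking = CheckingAccount(100.00)
my_checking.withdraw(75.68)
print(f"{my_checking.balance:.2f}")       # 24.32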

85
Object-Oriented Concepts
  • An entity, in an information system, is a
    representation of a thing from the real world.
  • An example is a person in a Microsoft Outlook
    contact list. The entry for Bill Akin is an
    entity.
  • An object is more than an entity. It is the
    information and the encapsulated operations on
    the entity.
  • An entity has attributes. An object has
    attributes and methods (see the contrast sketched
    below).
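To make the contrast concrete, a minimal sketch: the entity is represented here as a plain record carrying attributes only, while the object pairs the same attributes with a method; the field names, class names, and the sample email address are assumptions for the example.

from dataclasses import dataclass

@dataclass
class ContactEntity:
    """An entity: attributes only, no behavior of its own."""
    name: str
    email: str

class Contact:
    """An object: the same information plus an encapsulated operation."""
    def __init__(self, name, email):
        self.name = name
        self.email = email

    def mailto_link(self):                # a method operating on the state
        return f"mailto:{self.email}"

entry = ContactEntity("Bill Akin", "instructor@example.edu")  # placeholder address
contact = Contact(entry.name, entry.email)
print(contact.mailto_link())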

86
Object-Oriented Concepts
  • We are considering four object oriented
    approaches in software engineering
  • OO Architecture
  • OO Analysis
  • OO Design
  • OO Programming
  • There are many other OO approaches.

87
Object-Oriented Concepts
  • Classes: objects are generally instances of
    classes.
  • Encapsulation: data and procedures are
    encapsulated in the class definition.
  • Inheritance: an object inherits the attributes,
    operations, and methods of its class. New
    classes can inherit all features of their parent
    classes (see the sketch below).
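A minimal sketch of encapsulation and inheritance, continuing the slides' Dog example; the specific attribute and method names are assumptions for illustration.

class Dog:
    """Encapsulation: state and the operations on it live together."""

    def __init__(self, name):
        self._name = name                 # encapsulated attribute

    def speak(self):
        return f"{self._name} says woof"

class RedDog(Dog):
    """Inheritance: a RedDog is a Dog with one extra attribute."""

    def __init__(self, name):
        super().__init__(name)            # inherit the parent's attributes
        self.color = "red"                # the added attribute

rex = RedDog("Rex")
print(rex.speak())                        # inherited operation: Rex says woof
print(rex.color)                          # red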

88
Architecture
  • Software architecture
  • the structure of the components of a
    program/system, their interrelationships, and
    principles and guidelines governing their design
    and evolution over time.
  • Definition from the SEI

89
Architecture
  • Definition The structure of components, their
    interrelationships, and the principles governing
    their design and evolution.
  • Architecture types
  • Operational
  • Technical
  • System

90
Architecture
  • Architecture
  • A minimal set of rules governing the arrangement,
    interaction, and interdependence of the parts or
    elements of a system, whose purpose is to ensure
    that a conformant system satisfies a specified
    set of requirements.
  • Identifies the services, interfaces, standards,
    and their relationships.
  • Provides the technical guidelines for
    implementation of systems upon which engineering
    specifications are based, common building blocks
    are built, and product lines are developed.
  • Ingredients
  • Rules: design rules, standards, conventions
  • Reference model(s)