Title: End of Semester Presentation
1. End of Semester Presentation
- May 06, 2005
- TTA Rolling Team
- ChangSup Keum
- JungHo Kim
- SeonAh Lee
- ShinYoung Ahn
2. Contents
- Introduction
- Activities (Spring 2005)
- Lessons Learned
- Future Work
3. I. Introduction
- Rolling Team: Goal & Role
- Customer Overview
- Business Context
4. Rolling Team: Goal & Role
- Team Goal
  - Learning Team: challenge, practice, and master new techniques
  - Smart Team: work effectively and efficiently
  - Satisfying Stakeholders: hit all major deadlines
- Team Role (Spring 2005)
  - Mentors: David Root, Jim Tomayko
  - Seonah Lee: Team Leader, Requirements
  - Shinyoung Ahn: Planning Manager, Risk
  - Changsup Keum: Process Manager, Development
  - Jungho Kim: Support Manager, Configuration
5. Customer Overview
- TTA (Telecommunication Technology Association)
  - Establishes IT standards
  - Tests and certifies IT products
- TTA tests many kinds of IT / telecom products
  - Wireless LANs, broadband technologies, CORBA-based platforms
  - Specifies test cases in Tree and Tabular Combined Notation (TTCN), an international standard for test specification
- Why did TTA initiate this studio project?
  - TTA is using a commercial tool to translate TTCN into C code
  - However, the C code generated by the tool is unreadable: macros, insufficient comments, incomprehensible identifiers
6. Business Context Diagram
- Test Engineers need to review and modify the Abstract Test Suite (ATS) before making an Executable Test Suite (ETS)
[Diagram: a standard supplies Test Cases in TTCN; TestGen produces the Test Suite in C; a manual conversion step yields the Executable Test Suite, which runs on the Test Equipment (PC / Windows). Legend: Data In/Out, Manual Conversion, Hardware Environment, External Entity, System, Standard, Data File.]
7. II. Activities (Spring 2005)
- Project Management
- Risk Management
- Architecture Design
- Implementation
- Measurement
8. Project Management: Spring Semester Tasks
[Gantt chart, Spring 2005 (Jan through May), from Start through MOSP to EOSP:
- Management: Team Building, TSP Study, Management Document Update, Test Plan, Implementation Plan for Summer
- Design: QAW, SRE Seminar, ATAM Seminar, Architectural Design, Architecture Inspection, Design Review, Detailed Design Document
- Implementation (1st Iteration): JFlex / JavaCC Seminar, Lexical Analyzer, Syntax Analyzer, Symbol Table, Semantic Analyzer, Code Generator, Testing
Legend: Planned Task, Modified Task, Delayed Task, Unplanned Task, Deliverables.
Management = Project, Risk, Configuration, Quality; QAW = Quality Attribute Workshop]
9. Risk Management: SRE
- Risk Elicitation
  - Mini SRE with Gil Taran
  - 14 risks were elicited
  - Multi-voting for prioritization
- Top Three Risks
  - Highest: making the translation program that converts the TTCN BNF into source code for JavaCC (see the sketch below)
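To make that risk concrete, here is a minimal sketch of the translation involved, using a hypothetical TTCN BNF fragment and invented production names; the team's real grammar is far larger (TtcnParser.jjt is about 4,400 LOC). A BNF rule such as

    TestCase ::= TestCaseId BehaviourDescription

must be rewritten as a JavaCC/JJTree production along these lines:

    // Sketch only: invented rule names. #TestCase builds an AST node
    // explicitly, because the grammar sets NODE_DEFAULT_VOID = true
    // (see the options on slide 18).
    void TestCase() #TestCase :
    {}
    {
      TestCaseId() BehaviourDescription()
    }

Mechanically simple for one rule; doing it for the entire TTCN BNF, with lookahead conflicts to resolve along the way, is why this item topped the risk list.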
10. Architecture Design: Drivers
- Functional Requirements
  - TestGen shall accept TTCN-MP test cases and generate a test suite written in ANSI C
- Quality Attributes
  - Usability: TestGen shall produce ANSI C code that can be understood and modified by a test engineer
  - Modifiability: TestGen shall be extensible to satisfy test cases in a variety of protocols in the future
- Constraints
  - TestGen shall be executable on a personal computer
  - TestGen shall be able to run on MS Windows
11. Architecture Design: Utility Tree Scenarios
- Usability, Modifiability, Performance
[Utility tree diagram; I = Importance, D = Difficulty]
12. Architecture Design: C&C View (Data-Shared Style)
[Component-and-connector diagram of the TestGen system]
13. Architecture Design: Sequence Diagram
[Sequence diagram of the TestGen system. Participants: Test Engineer, Main Controller, Parser, Semantic, Code_Gen; data: TTCN-MP file, Tree, Symbol, C_code. Messages, from START to END:
1. Parsing; 2. Read test case file; 3. File data; 4. Generate symbols; 5. Generate abstract syntax tree; 6. Return control; 7. Check semantics; 8. Check symbols; 9. Symbols; 10. Generate decorated tree; 11. Return control; 12. Generate C code; 13. Get tree; 14. Tree data; 15. Generate C code; 16. Return control.
The same flow is summarized in code below.]
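The control flow above, as a minimal code sketch with assumed class and method names (the real classes appear in the module view on the next slide). The point of the data-shared style is visible here: Parser, Semantic, and Code_Gen never call one another; the main controller sequences them, and they communicate only through the shared tree and symbol table.

    // Sketch only: stub types stand in for the generated classes.
    class SimpleNode {}                           // abstract syntax tree node
    class SymTab {}                               // symbol table

    class Parser {                                // messages 1-6
        SimpleNode parse(java.io.Reader in) { return new SimpleNode(); }
        SymTab symbols() { return new SymTab(); }
    }
    class Semantic {                              // messages 7-11
        void check(SimpleNode tree, SymTab symbols) { /* decorate the tree */ }
    }
    class CodeGen {                               // messages 12-16
        String generate(SimpleNode tree) { return "/* ANSI C test suite */"; }
    }

    public class MainController {
        public static void main(String[] args) throws Exception {
            Parser parser = new Parser();
            SimpleNode tree = parser.parse(new java.io.FileReader(args[0]));
            SymTab symbols = parser.symbols();
            new Semantic().check(tree, symbols);  // shared data, no direct calls
            System.out.print(new CodeGen().generate(tree));
        }
    }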
14. Architecture Design: Module View (Decomposition Style)
[Package diagram of the TestGen system:
- Parser package: TtcnParser, TtcnTokenManager, Token, SimpleCharStream, ParseException, TokenMgrError, <<interface>> TtcnConstants, <<interface>> TtcnTreeConstants
- SymbolTable package: SymTab, SymTabEntry, TypeEntry, VariableEntry, TypeTable, VariableTable
- Semantic package: SemanticMain, SemanticError, DefinitionNodeAnalyzer, ExpressionNodeAnalyzer, StatementNodeAnalyzer, <<interface>> SemanticVisitor
- CodeGenerator package: CodeMain, CodeFormatter, OverviewGenerator, DeclarationGenerator, ConstraintGenerator, DynamicGenerator, LogGenerator, TransRules
- Tree package: SimpleNode, ASTActualPar, ASTAttach, ASTValue, ASTTtcnSpec, <<interface>> Node, <<interface>> TtcnParserVisitor
A sketch of how the Tree package's visitor machinery is used follows.]
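The Tree package is JJTree's visitor-pattern machinery (VISITOR = true in the grammar options on slide 18). Below is a minimal sketch of how a generator class plugs into it; the visit signatures and childrenAccept are what JJTree generates, while the method bodies and the getCode helper are assumptions, and only two of the interface's visit methods are shown.

    // Sketch: a code-generating visitor over the JJTree AST.
    public class DynamicGen implements TtcnParserVisitor {
        private final StringBuilder out = new StringBuilder();

        // Default case: recurse into the children of an undecorated node.
        public Object visit(SimpleNode node, Object data) {
            return node.childrenAccept(this, data);
        }

        // One method per declared node type, e.g. the specification root:
        public Object visit(ASTTtcnSpec node, Object data) {
            out.append("/* generated from TTCN-MP */\n");
            return node.childrenAccept(this, data);  // emit children in order
        }

        public String getCode() { return out.toString(); }   // assumed helper
    }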
15. Architecture Design: Code_Gen (Alternatives Analysis)
[Component diagram of Code_Gen: Code_Main drives the Overview, Declaration, Constraint, and Dynamic generator parts, plus two analyzed alternatives:
- Rule Manager (Rule_Mgr, Rule_Data): manages the mapping rules between the TTCN specification and C code. Drivers: Modifiability, Usability.
- Code Formatter (Code_Format): beautifies the C code so that a test engineer can understand it easily. Drivers: Modifiability, Usability.
The Rule Manager idea is sketched below.]
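A minimal sketch of the Rule Manager alternative, with an invented data layout (the slide fixes only the responsibility, not the design): keeping the TTCN-to-C mapping rules as data rather than hard-coding them in the generators is what buys the modifiability the drivers ask for.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Sketch only, assumed names: mapping rules kept as data (Rule_Data)
    // and applied by a manager (Rule_Mgr), so supporting a new protocol
    // means adding rules, not rewriting generator code.
    class RuleData {
        // TTCN construct -> C code template; %s marks the identifier slot.
        final Map<String, String> templates = new LinkedHashMap<>();
        RuleData() {
            templates.put("TestCase", "int tc_%s(void) { /* ... */ }");
            templates.put("Verdict",  "return verdict_%s;");
        }
    }

    class RuleMgr {
        private final RuleData rules = new RuleData();

        String apply(String construct, String identifier) {
            String template = rules.templates.get(construct);
            if (template == null)
                throw new IllegalArgumentException("no rule for " + construct);
            return String.format(template, identifier);
        }
    }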
16. Implementation: Creating a Skeletal System
[Diagram: the skeletal system, with its parser portion generated by JavaCC]
17. Implementation: Status of Skeletal System
- A skeletal system was created based on the architecture
- The TestGen system will be developed incrementally on top of the skeletal system
- Parser
  - Developed with JavaCC
  - Created the JavaCC input file, which specifies the tokens and grammar
  - TtcnParser.jjt: 4,400 LOC
  - Code generated by JavaCC: 30,045 LOC
  - Coding completed, but not fully tested
- Semantic Analyzer (status: class design completed)
  - 5 classes designed
- Code Generator (status: dynamic part implemented)
  - The most important component in our system
  - 3,098 LOC created
18. Analysis: Reverse Engineering

options {
  SANITY_CHECK = true;
  DEBUG_PARSER = false;
  DEBUG_TOKEN_MANAGER = false;
  LOOKAHEAD = 2;
  MULTI = true;
  VISITOR = true;
  NODE_DEFAULT_VOID = true;
  NODE_PACKAGE = "parser";
}

PARSER_BEGIN(TtcnParser)
package parser;

import java.util.*;

public class TtcnParser {
  public static void main(String[] args) throws ParseException {
    TtcnParser parser = new TtcnParser(System.in);
    try {
      ASTTestCase n = parser.TestCase();
      TtcnParserVisitor v = new DynamicGen();
      n.jjtAccept(v, null);
      System.err.println("TTCN-MP file processed successfully.");
    } catch (ParseException e) {
      System.out.println(e.toString());
    }
  }
}
PARSER_END(TtcnParser)
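For reference, the standard JJTree/JavaCC build flow for this grammar (given as a sketch of how the skeletal system is regenerated; testcase.mp is a hypothetical input file name): jjtree TtcnParser.jjt expands the tree annotations into TtcnParser.jj plus the node class skeletons; javacc TtcnParser.jj generates the parser, token manager, and visitor interface into the parser package; javac parser/*.java compiles them; and java parser.TtcnParser < testcase.mp runs the parser on a TTCN-MP file read from standard input.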
19. Metrics
- Product: defect density (defects per KLOD, thousand lines of document)
  - Software Architecture Document: 270 lines reviewed, 8 major defects discovered
  - 8 / 0.270 = 29.6 defects / KLOD
- Process: process compliance (given grades / total grades)
  - Project Management 4/6, Risk Management 3/3, Development Process 2/3, Quality Assurance 6/6, Configuration Management 7/9
  - 22 / 27 = 81%
- Project: schedule deviation (actual time / planned time)
  - Actual: SeonAh Lee 249.9 + ChangSup Keum 236.7 + JungHo Kim 163.3 + ShinYoung Ahn 191.0 = 840.9 hours
  - Planned: 16 hours x 12 weeks x 4 people = 768 hours
  - 840.9 / 768 = 110%
20. Effort Measurement
- Time spent this semester: 841 hours
21. III. Lessons Learned
- What is Good So Far
- What Could Go Better
22. What is Good So Far
- QAW
  - We wrote and prioritized the quality attribute scenarios
  - We refined the scenarios after the QAW session
  - We should have finished the QAW before starting architecture design
- ATAM
  - We finalized the architectural design and elicited sensitivities, tradeoffs, and risks
  - We refined the architecture and gathered more risks in the ATAM
  - The ATAM was useful for clarifying the customer's expectations regarding the output of our system
23. What is Good So Far
- SRE
  - We identified and evaluated the risks more systematically
- Detailed Task Plan
  - The planning manager kept team members notified of detailed progress
- Skeletal System
  - It let us check the architectural structure
  - It provided the baseline information for the implementation plan
- Reverse Engineering of the Skeletal System
  - It was good for knowledge transfer
24. What Could Go Better
- Process
  - We used well-known processes without selection criteria
  - We simply pursued famous processes when selecting a process
  - The selection criteria were not based on team goals
- The process was changed this semester
  - ACDM was chosen because it fits our team goals well
  - ACDM is quite similar to our as-is process
  - We will keep ACDM next semester
25. What Could Go Better
- Analysis before Decision
  - We did not capture characteristics, advantages, and disadvantages
  - We did not find alternatives or compare candidates
  - We did not evaluate usefulness against our goals
- Active Discussion and Communication
  - We believed that frequent meetings were unproductive
  - Experts' knowledge did not propagate well
  - We sometimes missed the best approach that could have resulted from discussion
26. IV. Future Work
- Implementation & Test Plan
- Plan Summary
27. Implementation & Test Plan
[Gantt chart over twelve weeks of the summer semester, from Kick-Off through MOSP to EOSP. CG = Code Generator.
- Implement: CG Dynamic Part V1, CG Dynamic Part V2, CG Declaration Part, CG Constraint Part, CG Overview Part, Semantic Analyzer, Lexical/Syntactic, Adaptive Library
- Test: Define Test Criteria, Create Test Suite, Unit Test V1, Unit Test V2, Integration Test, Acceptance Test
- Milestones: Kick-Off, Finalize Test Criteria, MOSP, Complete Integration Test, EOSP]
28. Plan Summary
- Implementation
  - Based on the skeletal system
  - Two iterations
    - Iteration I: basic functions
    - Iteration II: advanced functions
- Test (a unit-test sketch follows)
  - Acceptance test criteria should be defined with the client
  - Unit test and integration test
  - Acceptance test
    - 1st acceptance test (7/25)
    - 2nd acceptance test (8/1)
    - 3rd acceptance test (8/3)
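As a sketch of what one of the code generator unit tests might look like (assumed names throughout, no test framework, and the real criteria are still to be defined with the client): parse a tiny TTCN-MP fragment, run the generator visitor, and check the properties the usability driver cares about.

    import java.io.StringReader;

    // Sketch only: reuses the hypothetical getCode() helper from the
    // DynamicGen sketch on slide 14. Run with assertions on (java -ea).
    public class CodeGenSmokeTest {
        public static void main(String[] args) throws Exception {
            String ttcnMp = "...";   // placeholder: a tiny TTCN-MP test case
            TtcnParser parser = new TtcnParser(new StringReader(ttcnMp));
            ASTTestCase tree = parser.TestCase();
            DynamicGen gen = new DynamicGen();
            tree.jjtAccept(gen, null);
            String c = gen.getCode();

            // Checks tied to the usability driver: commented, non-empty C.
            assert c.length() > 0 : "generator produced no output";
            assert c.contains("/*") : "generated C should carry comments";
            System.out.println("code generator smoke test passed");
        }
    }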
29. Questions?
30. Thank You!
31. V. Backup Slides
32. Risk Management: New Risks
- 5 risks were elicited from the ATAM
- Not yet prioritized or mitigated
33. Architecture Design: Module View (Decomposition Style)
[Same package diagram as slide 14, with a legend:
- Parser package: TtcnParser, TtcnTokenManager, Token, SimpleCharStream, ParseException, TokenMgrError, <<interface>> TtcnConstants, <<interface>> TtcnTreeConstants
- SymbolTable package: SymTab, SymTabEntry, TypeEntry, VariableEntry, TypeTable, VariableTable
- Semantic package: SemanticMain, SemanticError, DefinitionNodeAnalyzer, ExpressionNodeAnalyzer, StatementNodeAnalyzer, <<interface>> SemanticVisitor
- CodeGenerator package: CodeMain, CodeFormatter, OverviewGenerator, DeclarationGenerator, ConstraintGenerator, DynamicGenerator, LogGenerator, TransRules
- Tree package: SimpleNode, ASTActualPar, ASTAttach, ASTValue, ASTTtcnSpec, <<interface>> Node, <<interface>> TtcnParserVisitor
Legend: package; Version 1 class; Version 2 class; <<interface>>; package (A) contains class (B) and sub-package (C); class (D) depends on all classes of package (E); realization relationship; dependency relationship; same-style class.]
34. Process: ACDM
1. Discover Architectural Drivers: functional requirements, constraints, quality attributes
2. Establish Project Scope: SOW, preliminary project plan
3. Create Notional Architecture: architecture views
4. Architectural Review: risks, tradeoffs
5. Production Go / No-Go Decision
   - If No: 6. Plan Experiments (experiment plan, refined project plan); 7. Execute Experiments and Refine Architecture (refined architecture, updated project plan); then return to the review
   - If Yes: 6. Production Planning (production and test plan); 7. Production (detailed designs, product)
Iterate as necessary for production, maintenance, or enhancement. The design and experiment steps cover the spring semester; production covers the summer semester.
35. Reflection on Process
- We selected our process without enough knowledge
  - We had not realized why process is so important
  - We tried to find an ideal process without selection criteria
- We have realized the necessity of a baseline process
  - Mixing best practices from TSP, RUP, ACDM, and XP confused team members and mentors
- Our criteria should be based on team goals
  - Goal of our team process: to guide us in producing artifacts within the limited time
36. Why ACDM?
- Small change overhead: ACDM is quite similar to our as-is process
- Role conflicts: mapping to existing team roles
  - Chief Architect -> Team Leader; Chief Scientist -> Development Manager; Managing Engineer -> Development Manager
- Learning curve: not steep
- Following ACDM in the design phase is a good way to produce a good architecture document and to execute experiments
  - We can apply architectural technology to our studio project
  - Carrying out technical experiments will be a good mitigation strategy for the technical risks related to the parsing tool (JavaCC)