1. AIT Team
- Zhaohui Cheng - QA Lead
- Shanna Sampson - Framework Lead
- Shuang Qiu - Cross Team Science
- Qingzhao Guo - Data Format
- Peter Keehn - Algorithm Integration
- Larisa Koval - Documentation
- Yunhui Zhao - CM
- Zhaohui Zhang - Algorithm Integration
- Tim Wait - Test Runs
- Xingpin Liu - Monitoring Tools
- Haibing Sun - Physical Collocations
2. AWG Integration Team - The Process
- Presented by Walter Wolf
- AWG Integration Team Lead
- NOAA/NESDIS/STAR
3. In the Beginning
- One day Mitch Goldberg described the AWG to me and asked me: Is this too much? Can we do this, clean up and deliver all the algorithms?
- Sure, no problem, it is just software.
4. GOES-R Project Office
- Then I went to my first meeting at the GOES-R Project Office
- Doubt. Uneasiness.
- Can AWG execute the technical work scope, given
the available schedule and budget resources?
5. Why Doubt?
- Developing Algorithms is easy
- World Class Scientists
- Preparing software for operations is easy
- Done it for years working with OSDPD
- Why does the doubt exist?
6. Speaking Different Languages
- AWG
- Algorithm Development
- Algorithm Delivery
- GOES-R Project Office
- BOE
- IBR
- EVM
7. Learn a New Language
- BOE: Basis of Estimate
- IBR: Integrated Baseline Review
- EVM: Earned Value Management
8. Basis of Estimate
- Background on project
- Ground rules and assumptions
- Methodology summary
- Estimate description based on WBS elements and attributes of products
- Cost Estimate Summary
- Cost Traceability (for subsequent estimates)
- BOE Algorithm Development: May 19, 2008
- BOE R3/Cal/Tailoring: May 28, 2008
9. As Described in the BOE, the AIT Has Implemented Development Phases
- 80% delivery to the GSP
- Algorithm Development and Testing
- Draft ATBD Delivery
- Algorithm Demonstration
- Algorithm Documentation
- Collaboration with AIT
- 80% ATBD Algorithm Package Delivery
- Baseline Products: September 2009
- Option 2 Products: September 2010
- 100% delivery to the GSP
- Algorithm Development and Testing
- Algorithm Demonstration
- Algorithm Documentation
- Collaboration with AIT
- 100% ATBD Algorithm Package Delivery
- Baseline Products: September 2010
- Option 2 Products: September 2011
10. IBR Goals
- Ensure the technical scope of work is consistent with the requirements (FPS)
- Ensure the scheduled effort is consistent with the work plan
- Assess the validity of budgets, both in terms of total resources and time-phasing
- Assess performance measurement methods as objective and meaningful
- Establish a process that ensures EVMS baseline integrity is maintained
11. IBR - Summary
- The AWG IBR was conducted successfully on July 30, 2008
- All short-term concern items from the IBR were addressed by August 31
- Issued completion report to IBR Review Team
- Long-term IBR concern items were input to the AWG Risk Folder to be tracked
- AWG schedules were baselined by December 31, 2008
12. Earned Value Management
- AWG implemented Earned Value Management (a worked example of the standard EVM metrics follows below)
- Schedule development -> Rolling Wave Planning -> progressive elaboration planning, where the work to be accomplished in the near term is planned at a detailed level, while work far in the future is planned at a relatively high level.
- By the 15th of every month, AWG team chairs/deputies submit reports.
- From the 16th to the last Tuesday of the month, AWG reports are compiled and sent to GSP EVM analysts for response/feedback/issues.
- On the last Monday of the month, AWG team chairs/deputies meet to review the EVM reports and address any issues.
- On the last Thursday of the month, the AWG EVM report is presented to GSP for formal review.
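The monthly reporting above rests on a handful of standard earned-value formulas. Here is a minimal sketch of those metrics in Python; the formulas are the generic EVM definitions, and the example figures are hypothetical, not AWG data:

```python
# Standard Earned Value Management metrics (generic EVM definitions).
# The example numbers below are hypothetical, not actual AWG figures.

def evm_metrics(pv, ev, ac):
    """pv: Planned Value, ev: Earned Value, ac: Actual Cost."""
    return {
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "spi": ev / pv,                # Schedule Performance Index
        "cpi": ev / ac,                # Cost Performance Index
    }

# Hypothetical monthly figures (in $K):
print(evm_metrics(pv=120.0, ev=110.0, ac=115.0))
```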
13. Doubt Still Existed
- EVM, BOE, and IBR explain what the AWG was spending, when we were doing it, and, at a high level, how.
- But the details of the algorithms, how they would play together, and how they would meet the aggressive latencies still needed to be described.
14. CMMI Level 3
- At STAR, we used CMMI Level 3 on other projects (Enterprise Process Lifecycle, EPL)
- IASI
- CrIS/ATMS
- AWG Integration Team implemented it within AWG
15. How Does the CMMI Level 3 Process Erase the Doubt?
- Now we are back to speaking the same language
- A way for the Ground Segment Project to monitor the algorithm development
- A way for the AWG to show that we know what we are doing and understand the requirements
16. Requirements Flowdown
- Level 1 Requirements Document
- Mission Requirements Document
- GOES-R Ground Segment Project Plan
- Functional and Performance Specification
- Ground Algorithm Development Management Plan
- Requirements used by AWG for Algorithm Development
17. Requirements
- As described in the Algorithm Development Management Plan for Ground Segment Product Generation:
- AWG is to develop algorithms to meet FPS requirements for 57 products.
- AWG is to deliver ATBDs and Algorithm Packages to the GOES-R GSP.
18. CMMI Process
- Enables the AWG to show that we understand the requirements and that we can develop the algorithms to meet those requirements
- Five major reviews:
- Algorithm Design Review (ADR)
- Critical Design Review (CDR)
- Test Readiness Review (TRR)
- Code Unit Test Review (CUTR)
- Algorithm (System) Readiness Review (ARR, was SRR)
19. Algorithm Design Review
- Purpose of the Algorithm Design Review:
- Choose the best algorithm that will meet the requirements
- Candidate algorithms
- Advantages
- Disadvantages
- Heritage
- Quality assurance, Schedule, Risks
20. Critical Design Review
- The purpose of the CDR is to describe the chosen algorithm that will meet the requirements.
- Algorithm Theoretical Basis
- The physical and mathematical description of the algorithm that was chosen to meet the requirements
- Other topics that are reviewed:
- ADR Report and Actions
- Requirements
- Implementation Concept
- Software Architecture and Interfaces
- Design Overview and System Description
- Algorithm Package
- Quality Assurance
- Requirements Allocation
21. Test Readiness Review
- Show that the algorithm within the Framework (that meets coding standards) gives the same answers as the offline research code (a sketch of such a check follows this list)
- Show that the AWG is prepared for an end-to-end test of the algorithm with full product precedence, where the inputs and outputs are in NetCDF format
- Establish the extended datasets that will be used to show that the algorithm will meet the 80% requirements. This validation may be tied to the offline algorithm validation.
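As referenced above, a framework-versus-offline regression check might look like the following sketch, built on the netCDF4 and NumPy Python libraries. The file names, the variable name "sst", and the tolerances are hypothetical placeholders, not actual AWG artifacts:

```python
# Sketch of a TRR-style check: the algorithm running inside the Framework
# must reproduce the offline research code. All names are hypothetical.
import numpy as np
from netCDF4 import Dataset

def load_product(path, varname):
    """Read one product variable from a NetCDF file as a float array."""
    with Dataset(path) as ds:
        return np.ma.filled(ds.variables[varname][:].astype(float), np.nan)

framework = load_product("framework_output.nc", "sst")
offline = load_product("offline_output.nc", "sst")

# Require the same missing pixels and the same answers within tolerance.
same_missing = np.array_equal(np.isnan(framework), np.isnan(offline))
close = np.allclose(framework, offline, rtol=1e-6, atol=1e-6, equal_nan=True)
print("PASS" if (same_missing and close) else "FAIL")
```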
22. Test Readiness Review
- The TRR describes the following
- Requirements Allocation
- Quality Assurance
- Framework Architecture
- Product Precedence (see the ordering sketch after this list)
- Interfaces
- Test Readiness
- Software Test Readiness
- Offline Algorithm Validation
- Framework Algorithm Validation
- Algorithm Package
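Since product precedence (one product's output feeding another product's input) appears throughout these checklists, here is a hedged sketch of dependency ordering using Python's standard graphlib module; the dependency graph below is illustrative only, not the actual AWG precedence chain:

```python
# Illustrative product-precedence ordering: products that consume other
# products must run after them. This graph is a made-up example.
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Mapping: product -> the products it depends on (its predecessors).
precedence = {
    "Cloud Mask": [],
    "Cloud Height": ["Cloud Mask"],
    "SST": ["Cloud Mask"],
    "Winds": ["Cloud Mask", "Cloud Height"],
}

run_order = list(TopologicalSorter(precedence).static_order())
print(run_order)  # Cloud Mask first, its dependents afterward
```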
23. Code Unit Test Review
- The CUTR addresses the software readiness
- Software meets coding standards
- Error messaging has been standardized
- Interfaces have been finalized
- Common libraries used
- Common missing values used (see the sketch below)
- Current status of CUTR
- CUTRs will begin in 2010
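To make the "common missing values" and standardized error messaging items above concrete, here is a minimal sketch; the sentinel values and message format are hypothetical illustrations, not the AWG coding standard itself:

```python
# Hypothetical shared conventions of the kind a CUTR checks for: one set
# of missing-value sentinels and one error-message format used by every
# algorithm. These specific values and formats are made up.
import sys

MISSING_FLOAT = -999.0  # shared sentinel for missing floating-point data
MISSING_INT = -999      # shared sentinel for missing integer data

def report_error(module, code, message):
    """Emit an error in one standardized format for all algorithms."""
    sys.stderr.write(f"ERROR [{module}:{code}] {message}\n")

# Every product team calls the same helper instead of inventing its own:
report_error("SST", 42, "brightness temperature out of valid range")
```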
24. Algorithm (System) Readiness Review
- The ARR will describe the tests that will be conducted to show that the algorithm will be ready for delivery
- All scientific dependencies have been addressed
- Product precedence has been fully tested
- CRTM has been implemented and tested
- Graceful degradation has been identified and tested (see the sketch after this list)
- The algorithm has been tested on full seasonal data and will meet the requirements
- The ARR describes the following:
- Requirements Allocation
- Quality Assurance
- Framework Architecture
- Algorithm/Software Readiness
- Algorithm Package
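As flagged in the list above, graceful degradation can be sketched as follows; the ancillary inputs and the fallback rule are hypothetical examples, not the tested AWG behavior:

```python
# Hypothetical graceful degradation: if the preferred ancillary input is
# unavailable, fall back to a coarser source and flag the output quality
# instead of failing the whole product run.
def surface_temperature(nwp_forecast, climatology):
    """Prefer the NWP forecast field; degrade to climatology if absent."""
    if nwp_forecast is not None:
        return nwp_forecast, "normal"
    return climatology, "degraded"  # still produced, quality is flagged

value, quality = surface_temperature(None, 288.15)
print(value, quality)  # 288.15 degraded
```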
25. Status of Algorithm Reviews
- Algorithm Design Review (ADR)
- All Baseline products have completed their ADR.
- One Option 2 product has yet to complete its ADR: Absorbed Shortwave Radiation (Surface)
- Critical Design Review (CDR)
- All Baseline products have completed their CDR
- Eight CDRs left for Option 2 products
- 5000 slides
26. Status of Algorithm Reviews
- Test Readiness Review (TRR)
- 23 Baseline products have completed their TRR
- Baseline TRRs left: Snow Cover and Lightning Detection
- TRRs for 10 Option 2 products completed; 22 left to complete
- 3000 slides
- Code Unit Test Review (CUTR)
- Not started yet; for the 100% delivery
- Algorithm (System) Readiness Review (ARR)
- Not started yet; for the 100% delivery
27. Artifacts
- Documents
- Review Item Disposition
- Presentation
- Requirements Allocation Document
- Flowchart Packages
- Test Plans
- Validation Plans
- Software
- Algorithms
28. AWG Integration Team
- Lead the following technical reviews
- Critical Design Review
- Test Readiness Review
- Code Unit Test Review
- Algorithm (System) Readiness Review
- Provide guidance for the Algorithm Design Reviews (ADR)
- Provide documentation support
- Maintain configuration management for all the algorithms
- Provide support for:
- Integrated Baseline Reviews (IBR)
- Bases of Estimate (BOE)
- Integrated Master Schedules (IMS)
29. AIT Role
- The AIT works with AWG Product Teams on preparing the deliverables to the Ground Segment Project (GSP)
- Interface with the GSP
- Weekly and Monthly Meetings
- All deliveries to the GSP are made by the AIT, not the individual product teams
- AIT coordinates the communication between the product teams and the GSP
30. AWG Multi-Lingual
- AWG has always been confident.
- Now the AWG can speak the same language as the GSP.
- As the Technical Reviews continue and the deliveries continue to be made on time, the AWG will not only speak the same language, but we will be proficient.
31. AIT Product Team Meetings: Tuesday
- 8:30 am - Ocean Dynamic Team
- 9:15 am - Lightning Team
- 10:30 am - Radiation Budget Team
- 11:15 am - Land Team
- Lunch
- 1:00 pm - Winds Team
- 1:45 pm - Proxy Team
- 3:00 pm - SST Team
- 3:45 pm - Cryosphere Team
- 4:30 pm - AAA Team
32. AIT Product Team Meetings: Wednesday
- 8:30 am - Sounding Team
- 9:15 am - Imagery Team
- 10:15 am - Cloud Team
- 11:00 am - Aviation Team
- Lunch
- 1:00 pm - GRAFIIR Team
- 1:45 pm - Hydrology Team
- 3:15 pm - Surface Reflectance
- 4:00 pm - ClearCase Training Review and Question Session