Title: Learning-Based Project Reviews (LBPRs): Research Findings (DRAFT)
1. Learning-Based Project Reviews (LBPRs): Research Findings (DRAFT)
- Tim Kotnour, PhD
- Catherine Vergopia
- University of Central Florida
- January 19, 2005
2. We'll Share a Conversation about LBPRs.
- What was our intent?
- Research scope
- How did we execute to meet our intent?
- Methodology
- What did we learn?
- Research results
- Where do we go next?
- Focus on specific processes and tools within a
program
3. Why is this Research Important?
Standish Group's Chaos Studies (Johnson, 2001)
- The questions become
- Why are we so challenged?
- How can we become better?
4. This Research Directly Addresses Two of the CPMR Priority Challenges.
- Development of a highly effective knowledge-sharing system, e.g., a linked Lessons Learned/Best Practices (LL/BP) process, within NASA to ensure good Agency-wide communication and, of equal or greater importance, the system's adoption, implementation, and usage by successive generations of P/PMs.
- Improving P/PM review, interaction, team-building, leadership, decision-making, human capital planning, and communication processes, particularly within the NASA environment.
- This research aims to identify, evaluate, create, and share best practices for LBPRs that:
  - Are a sustainable, consistent approach (with motivations) for collecting/organizing lessons learned (LL), both historical and ongoing (Best Practices Lessons Learned).
  - Define and use program metrics, with traceable results, addressing stakeholder and customer interests (Decision-Making Tools, Methods, and Metrics).
  - Identify and validate the meaningful metrics on which to judge performance (Decision-Making Tools, Methods, and Metrics).
- See later slides for an overview of learning-based project reviews.
5. The Research Scope is Consistent with the CPMR Challenges and Research Objectives.
- Research Question
  - How can organizational learning be integrated into the project review process to learn from project experiences and improve project knowledge and performance throughout the life-cycle of a program or project?
- Sub-Questions
  - What is the tacit and explicit knowledge that needs to be shared in the NASA project environment?
  - What are the different types of reviews NASA completes throughout the project life-cycle? (Project team, PM, Design, PMO)
  - What processes, tools, and skills facilitate the creation, capture, sharing, and application of tacit and explicit knowledge in the project review process?
  - What are the barriers/enablers to productive learning (i.e., learning that improves both the current project's performance and the organization's capabilities)?
  - What skills does the project manager need to be a reflective practitioner within the LBPR framework?
6. We're Understanding How to Make Learning within Reviews Routine.
Learning, Project, and Organizational Performance
7. Consistent Learning Approach through Reviews, Tools, and Metrics.
(Diagram: Lessons-Learned Sharing Process. Each project, e.g., Project 2, runs a Plan-Do-Study-Act cycle that draws on, and contributes to, a shared Set of Lessons Learned.)
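The sharing process on this slide can be sketched as a simple loop: each project's Plan-Do-Study-Act cycle reads from and writes to a shared set of lessons learned. This is a hypothetical illustration only, not a NASA/KSC implementation; all names and lesson strings are invented.

```python
# Illustrative sketch only (not a NASA/KSC implementation): a shared set of
# lessons learned that each project's Plan-Do-Study-Act cycle draws on and adds to.
# All project names and lesson strings below are invented for illustration.

shared_lessons = set()  # the cross-project "Set of Lessons Learned"

def run_project(name, new_lessons):
    """Run one Plan-Do-Study-Act cycle, exchanging lessons with the shared set."""
    # Plan: reuse whatever lessons earlier projects have contributed
    plan = {"project": name, "applied_lessons": sorted(shared_lessons)}
    # Do: execute the project (omitted in this sketch)
    # Study: reflect on the experience and extract new lessons
    studied = set(new_lessons)
    # Act: feed the new lessons back into the shared set for the next project
    shared_lessons.update(studied)
    return plan

p1 = run_project("Project 1", {"freeze requirements before CDR"})
p2 = run_project("Project 2", {"hold quarterly lessons reviews"})
print(p2["applied_lessons"])  # ['freeze requirements before CDR']
```

The point of the sketch is the order of the exchange: a project plans with the lessons already in the shared set, and only after the Study step does it add its own, which become available to the next project's Plan.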
8. We'll Share a Conversation about LBPRs.
- What was our intent?
- Research scope
- How did we execute to meet our intent?
- Methodology
- What did we learn?
- Research results
- Where do we go next?
- Focus on specific processes and tools within a
program
9. We Learned from the NASA/KSC Community.
- NASA Launch Services Program
- Across-Organizations Learning Community
- KSC Orbital Space Plane Office
- Other Research
10. We'll Share a Conversation about LBPRs.
- What was our intent?
- Research scope
- How did we execute to meet our intent?
- Methodology
- What did we learn?
- Research results
- Where do we go next?
- Focus on specific processes and tools within a
program
11. We Are Developing the Pieces of the LBPR Story.
(Diagram: the LBPR framework, recommendations, specific tools, and conclusions are built from project types, review types, review energies, enablers and barriers, processes and tools, the NASA environment, and theory.)
- NASA Documents
- 7120
- CAIB
- System Engineering
- OSP Lessons Learned
- GAO report on NASA's LL Process
- KSC Interviews
- Shuttle Processing
- Spaceport Engineering Technology
- Launch Services Program
- SMO/PM Tools
- ASRC
- KSC Documents
- PH-YA Customer Agreement
- YA Procedures for Systems Life-cycle
- YA Portfolio Report
- 2 minute chart
- LSE Quarterly report
- Managing projects in SET PMO
- USTDC Exploration Status
- Monthly Performance Reports
- LSP Monthly reviews
- LSP mission flows
- Literature Background
- Organizational Learning
- Project Reviews
- Project Learning
12. The Findings
- Recommendations
  - Projects/programs need to continue to use a combination of learning tools and sessions.
  - Align to the aim, focus, and interconnectedness of the reviews for learning.
  - Develop the unique processes and tools for learning at multiple levels (individual, team, program).
  - Need to develop an integrated approach; one review approach will NOT solve the problem.
- Conclusions
  - NASA and KSC execute different types of projects and reviews.
  - The reviews play different roles in the project and for learning.
  - Don't make the mistake of forcing a learning level or approach onto a given review.
  - All reviews play a role in the learning process.
  - Specific tools and lexicons drive learning.
  - Enablers and barriers must be accounted for.
13. Project Reviews Are Aligned to Project Types.
(Need to align to 7120.5c when policy released)
14. Reviews Focus on Different Items.
(Need to align to 7120.5c and systems engineering
when policy released)
15. Learning-Based Project Reviews.
16. We Need to Align the Intent of the Review with the Expected Learning that Can be Achieved.
Trigger Event
Level III and IV learning is usually triggered from a Level I or II review. A challenge is determining when it is appropriate to move to Levels III and IV.
17. We Identified Enablers and Barriers to Learning.
18. We Identified Enablers and Barriers to Learning.
19. We Identified Existing Tools to Support Learning.
20. We Need to Understand How to Combine the Variables.
- Team
- Other projects
- Program
- Grey beards
- Experts
- Appointed participants
- Within project team
- Project
- Program
- (In) Formal
- Structured vs. unstructured
- Open vs. lexicon
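One way to see why combining these variables is hard is to enumerate the design space. The sketch below models the variables from this slide as a record; the field names and the grouping into participants/scope/formality/structure/vocabulary are our interpretation for illustration, not part of the research.

```python
# Hypothetical sketch: the review-design variables from this slide modeled as a
# record so the combination space can be enumerated. Field names and the grouping
# are our interpretation, not the study's.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class ReviewConfig:
    participants: str  # team, other projects, program, grey beards, experts, appointed
    scope: str         # within project team, project, program
    formality: str     # formal vs. informal
    structure: str     # structured vs. unstructured
    vocabulary: str    # open vs. shared lexicon

PARTICIPANTS = ["team", "other projects", "program",
                "grey beards", "experts", "appointed participants"]
SCOPES = ["within project team", "project", "program"]
FORMALITY = ["formal", "informal"]
STRUCTURE = ["structured", "unstructured"]
VOCABULARY = ["open", "lexicon"]

# Every combination of the variables: 6 * 3 * 2 * 2 * 2 = 144 candidate review designs
configs = [ReviewConfig(*combo)
           for combo in product(PARTICIPANTS, SCOPES, FORMALITY, STRUCTURE, VOCABULARY)]
print(len(configs))  # 144
```

Even with this coarse model there are 144 candidate review designs, which is the motivation for the next slides: mapping which combinations support which learning levels rather than searching the space blindly.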
21. We Need to Understand How to Better Connect Reviews and Learning.
Variables vs. Learning Levels I, II, III, IV
22. We Need to Understand How to Integrate the Review Process.
- NASA Program Management
  - Accomplishments, where we are putting energy, what the risks are, customer satisfaction, budget, and human capital

Notional Example:
- Program Management Division Chief Team Review (Quarterly) (Owner: Program Management)
  - Budget
  - Infrastructure
  - Fleet Issues
  - Significant Items
- Division Chief Team Review (Monthly) (Owner: Associate Director)
  - Status of All Missions
  - Next 12 Months' Missions
  - Red Missions
  - Budget
  - Any Significant Items
- Internal Functional Review (Monthly)
  - Mission
  - PIO
  - Launch Site Ops
  - SMA
  - Engineering
23. We Need to Understand Data for Learning Across a Program.
Notional Example: trends across the program (mission success, mission measures, program measures).
24. We Identified New Questions.
- How can we ensure we trigger the right Level III and IV learning in Level II reviews?
- What do we need to provide each organizational level (individual, project, program) to ensure learning?
- What's the role of Level I and II reviews with Level III and IV?
- When and how should Level III and IV reviews be scheduled?
- Why isn't the learning paradigm/innovation accepted?
- ?
25. We'll Share a Conversation about LBPRs.
- What was our intent?
- Research scope
- How did we execute to meet our intent?
- Methodology
- What did we learn?
- Research results
- Where do we go next?
- Focus on specific processes and tools within a
program
26. Education and Outreach Activities
27. Connecting the Phases
Phase 1: What is the learning context within NASA/KSC?
Phase 2: What specific interventions help improve learning within project reviews within NASA/KSC?
Individual Project Learning
Project Team Learning
Organizational Wide Learning
28. Proposed Research
(Diagram: Phase I, Baseline Understanding: what is the situation, what are the tools. Base surveys of on-going reviews feed tool-specific studies, leading to enhanced reviews and performance and an enhanced knowledge base.)
- Base survey topics:
  - Nature of reviews
  - Outputs/outcomes
  - Usefulness of tools
  - Use of tools
  - Impacts
29. Summary of What We Accomplished
- Framework shared and accepted
- KSC context defined
- Enablers/barriers identified
- Process and tool examples gleaned
- Variables identified
- Further questions raised
- Move-forward approach
- Started to think about the review process from a
learning perspective