Value Measuring Methodology - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Value Measuring Methodology


1
Value Measuring Methodology
Council for Excellence in Government Benefits
Assessment Workshop
May 2003
2
Why isn't traditional business case analysis
providing the information OMB is looking for?
  • Primarily focused on financial benefits (e.g.,
    ROI) that impact the government only
  • Analysis is viewed as a means to get funding,
    not a tool for on-going management and evaluation
3
How can traditional analysis be supplemented to
better address the challenges of the e-Government
environment?
In July 2001, the Social Security Administration
(SSA), in cooperation with the General Services
Administration (GSA), took on the task of
developing an effective methodology to assess the
value of electronic services that would be:
1. Compliant with current federal regulations and OMB guidance
2. Applicable across the federal government
3. Do-able
A team of Booz Allen analysts and thought-leaders
affiliated with Harvard University's Kennedy School
of Government was contracted to support this effort.
4
The approach used to develop VMM was built upon
the foundation of a public/private partnership:
research and analysis, development, presentation,
and discussion.
Critical Inputs
  • Research: traditional and emerging approaches;
    legislation and OMB Circulars; government-developed
    training documents; academic institutions; think tanks
  • Participants: Federal Agency Staff; State Government
    Staff; Private Sector; OMB; GAO
5
The output of this effort was the Value Measuring
Methodology (VMM)
  • First articulated in Building a Methodology for
    Measuring the Value of e-Services (1/02)
  • Refined and tested through application to two
    cross-agency e-Government initiatives (e-Travel
    and e-Authentication)
  • Release of the VMM How-To-Guide and VMM
    Highlights document by the Best Practices
    Committee of the CIO Council (10/02)
  • VMM Roll-Out, held by the Council for Excellence
    in Government in cooperation with the CIO
    Council's Best Practices Committee, OMB, and GSA
    (4/03)

6
VMM Overview
7
It is important to understand what VMM IS and
ISN'T
COMPLIANT WITH GPRA, CCA, AND OMB A-11; CONSISTENT WITH
THE PHILOSOPHY OF THE PMA
  • VMM IS
  • A scalable and flexible approach for quantifying
    and analyzing value, risk, and cost and
    evaluating the relationships among them
  • Helps to create a roadmap for on-going management
    and evaluation
  • Supports the development of critical management
    plans
  • VMM IS NOT
  • One Size Fits All
  • A Way to Avoid Analysis
  • Only Useful for e-Government Initiatives

8
The Essential Factors
9
A Decision Framework
Define User Needs & Priorities; Quantifiable
Measures of Performance (Metrics, Targets);
Foundation for Analysis & On-going Performance
Measurement; Early Consideration of Risk
[Framework diagram: VALUE (prioritized value factors and
value measures, each with a metric, target, and scale);
RISK (risk inventory and risk tolerance boundary); COST
(customized cost element structure)]
10
Communicating Value to Customers and Stakeholders
What will make an Appropriations Committee staff
member or OMB care about an investment in digital
Land Mobile Radio (LMR) equipment for public
safety agencies across government?
The technically superior digital technology
offers more bandwidth than analog technology
because the signal is...
OR
Using digital LMR will prevent the catastrophic
communications malfunctions and inefficiencies
that cost lives in the aftermath of 9/11 in NYC.
Digital LMR will accomplish this by...
11
VMM Effective in Building WINNING OMB Exhibit
300s
  • PMA Imperatives
  • Captures All Value Factors/Benefits
  • Analytic Rigor
  • Clarity
  • Completeness
  • Focus On Results

12
Value
13
The Value Factors
DIRECT USER (CUSTOMER) VALUE: Benefits directly
realized by users or multiple user groups. Users
or customers will vary based on the type of
initiative being assessed; they may include, but
are not limited to, government employees, other
government organizations, and citizens.
SOCIAL (NON-DIRECT USER/PUBLIC) VALUE: Benefits
not related to direct users (e.g., society as a
whole).
GOVERNMENT OPERATIONAL/FOUNDATIONAL VALUE:
Order-of-magnitude improvements realized in
current government operations and processes and
in laying the groundwork for future initiatives.
GOVERNMENT FINANCIAL VALUE: Financial benefits
(e.g., cost savings, cost avoidance) realized by
the government, including financial benefits
received by the managing or sponsoring agency as
well as other federal agencies.
STRATEGIC/POLITICAL VALUE: Benefits that move an
organization closer to achieving its strategic
goals, the priorities established by the Executive
Office of the President, and congressional mandates.
14
Identifying and Defining Value
[Diagram: Layer 1 lists the five value factors
(Direct User (Customer); Social (Non-Direct User);
Government Financial; Government Operational/
Foundational; Strategic/Political). Layer 2 maps
each factor to project-specific value definitions
(measures).]
15
Structured Approach to Identifying and Defining
Value Measures
The way measures are articulated can directly
impact the way they are perceived and understood.
The definition must consist of four parts:
1. Concise, Illustrative Name: "Robust and Reliable Service"
2. Brief Description: service with minimal or no
   disruptions; consistent service regardless of normal
   fluctuations in demand; high fault tolerance with
   built-in redundancy; adequate speed to meet business needs
3. Performance Metrics and Targets:
   • Frequency of service disruptions (Target: none)
   • Length of service disruptions (Target: 10 minutes)
   • Is an executable Continuity of Operations plan in place
     (with a backup NOC) sufficient to pass annual
     certification? (Target: yes)
   • Latency (Target: 75 milliseconds)
4. Normalized Scale (0-100), e.g.: 0 disruptions = 100,
   1 disruption = 95, 4 disruptions = 60; 10 min. = 90,
   11-14 min. = 60; No = 0 / Yes = 100
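The scale column above can be read as anchor points on a 0-100 normalization. Here is a minimal sketch of that mapping; the linear interpolation between anchors and the `scale_score` helper name are illustrative assumptions, not part of the VMM guide:

```python
def scale_score(value, anchors):
    """Map a raw metric value onto a 0-100 normalized scale by
    linear interpolation between (metric_value, score) anchors."""
    pts = sorted(anchors)
    if value <= pts[0][0]:
        return pts[0][1]
    if value >= pts[-1][0]:
        return pts[-1][1]
    for (x0, s0), (x1, s1) in zip(pts, pts[1:]):
        if x0 <= value <= x1:
            return s0 + (s1 - s0) * (value - x0) / (x1 - x0)

# Anchor points from the "frequency of service disruptions"
# measure on slide 15: 0 disruptions -> 100, 1 -> 95, 4 -> 60.
disruption_scale = [(0, 100), (1, 95), (4, 60)]
print(scale_score(0, disruption_scale))   # 100
print(scale_score(2, disruption_scale))   # between the 1 and 4 anchors
```

An alternative with no value judgment between anchors would be a step function; interpolation simply keeps scores monotone between the stated points.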
16
Building A Direct User Measure: What Do Users
Want?
E-Travel Initiative
  • Travelers/Managers: anytime access to data; real-time
    data availability; simplified/automated trip planning;
    speed of reimbursements
  • Budget & Accounting Staff: standardized and electronic
    auditing function; simplified application and monitoring
    of travel entitlements; access to reporting information
  • Businesses (Travel Management Companies): timely and
    accurate receipt of bill payments
17
Prioritizing Value Factors - The Analytic
Hierarchy Process
  • Analytic Hierarchy Process (AHP) tools are
    designed to help groups enhance the quality of
    their decisions. These tools:
  • Bring structure to the decision-making process
  • Elicit ideas, feelings, and the judgments of
    stakeholders
  • Represent those judgments as meaningful numbers
  • Synthesize the results; and
  • Analyze the sensitivity of those judgments to
    changes.

Through the use of pair-wise comparisons, the
relative importance of each of the criteria is
calculated. Attention is focused on areas of
disagreement.
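The pair-wise comparison step can be sketched as follows, assuming Saaty's classic principal-eigenvector method (computed here by power iteration); the comparison matrix and function name are illustrative, not taken from the workshop:

```python
def ahp_weights(matrix, iterations=100):
    """Derive AHP priority weights from a pairwise comparison
    matrix via power iteration toward the principal eigenvector."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        # Multiply the matrix by the current weight vector,
        # then renormalize so the weights sum to 1.
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# Hypothetical judgments on Saaty's 1-9 scale: criterion A is 3x
# as important as B and 5x as important as C; B is 2x C.
# Below-diagonal entries are the reciprocals.
m = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
weights = ahp_weights(m)
print([round(w, 3) for w in weights])  # A dominates, then B, then C
```

In practice AHP tools also report a consistency ratio on the judgments; that check is omitted here for brevity.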
18
Summary of VMM Weighting and Scoring for Title
XVI "Check Your Benefits"
Applying VMM to Title XVI "Check Your Benefits,"
we determined the following scores for each of
the Value Factors and their respective value
measures:
Direct User (weight 25; Scoring Specialists: OPB, OAS, OQA, OCOMM); factor score 21.0
  Expanded Access (50%): max 12.5, scored 10 of 10 = 12.5
  User Time Saved (30%): max 7.5, scored 6 of 10 = 4.5
  Increased Satisfaction (20%): max 5.0, scored 8 of 10 = 4.0
Social (weight 15; Scoring Specialists: OCOMM, OQA, OSM); factor score 13.0
  Increase Public Confidence (33%): max 5.0, scored 10 of 10 = 5.0
  Access for Hard to Reach (33%): max 5.0, scored 8 of 10 = 4.0
  Equity and Fairness (33%): max 5.0, scored 8 of 10 = 4.0
Government Financial (weight 10; Scoring Specialists: OB, DCS, OQA); factor score 9.0
  Effectiveness and Efficiency (50%): max 5.0, scored 8 of 10 = 4.0
  Return on Investment (50%): max 5.0, scored 10 of 10 = 5.0
Operational/Foundational (weight 30; Scoring Specialists: DCS, OES, OPB, OAS); factor score 25.0
  Supports Future eService Transactions (50%): max 15.0, scored 10 of 10 = 15.0
  Supports Transformation (33%): max 10.0, scored 6 of 10 = 6.0
  Supports Organizational Learning (17%): max 5.0, scored 8 of 10 = 4.0
Strategic/Political (weight 20; Scoring Specialists: OSM, OES); factor score 18.0
  Satisfies External Mandates/Requirements (50%): max 10.0, scored 10 of 10 = 10.0
  Supports ASP (50%): max 10.0, scored 8 of 10 = 8.0
TOTAL: weight 100, maximum 100, value score 86
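The arithmetic behind the slide can be reproduced directly: each measure contributes its maximum point value (factor weight times measure share) multiplied by its normalized score out of 10. A sketch using the slide's figures:

```python
# Measures from the Title XVI "Check Your Benefits" scoring:
# name: (maximum points, raw score out of 10)
measures = {
    "Expanded Access":                       (12.5, 10),
    "User Time Saved":                       (7.5,  6),
    "Increased Satisfaction":                (5.0,  8),
    "Increase Public Confidence":            (5.0, 10),
    "Access for Hard to Reach":              (5.0,  8),
    "Equity and Fairness":                   (5.0,  8),
    "Effectiveness and Efficiency":          (5.0,  8),
    "Return on Investment":                  (5.0, 10),
    "Supports Future eService Transactions": (15.0, 10),
    "Supports Transformation":               (10.0, 6),
    "Supports Organizational Learning":      (5.0,  8),
    "Satisfies External Mandates":           (10.0, 10),
    "Supports ASP":                          (10.0, 8),
}

# Each contribution is max points scaled by the normalized score.
total = sum(maxpts * raw / 10 for maxpts, raw in measures.values())
print(total)  # 86.0
```

Summing the five factor subtotals (21.0 + 13.0 + 9.0 + 25.0 + 18.0) gives the same 86 of a possible 100.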
19
Risk
20
Identifying and Defining Risk
Risk that is not identified cannot be
mitigated. Risks that are not mitigated can
cause a project to fail either in the pursuit of
funding or, more dramatically, while the project
is being implemented.
  • IDENTIFYING RISKS
  • Consider standard IT project risks
  • Identify project-specific risks via input from
    technical and policy staff and representatives
    of partner agencies, collected from:
  • Working Sessions
  • Survey Efforts
  • EXAMPLE OMB RISK CATEGORIES
  • Project Resources / Financial
  • Technical / Technology
  • Business / Operational
  • Organizational Change Management
  • Data / Information
  • Security
  • Strategic
  • Privacy

21
Defining Risk Tolerance
  • What is the decision process behind the
    following?
  • Buying a $1 lottery ticket for the chance to win
    $1 million. Odds are 1 in 1,000.
  • Buying a $100 lottery ticket for the chance to
    win $1 million. Odds are 1 in 1,000.
  • Buying a $100 lottery ticket for the chance to
    win $10 million. Odds are 1 in 1,000.
  • Organizational Tolerance for Cost Risk (increased
    cost)
  • Organizational Tolerance for Value Risk (slippage
    in performance)

22
As the estimated most likely value score
increases, risk tolerance is likely to increase.
As the estimated most likely cost increases, risk
tolerance is likely to decrease.
Value and Cost Risk Tolerance Boundaries
communicate the upper limit of the range of risk
an organization will accept in both areas.
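A tolerance boundary check might be sketched as below. The linear boundary functions are purely hypothetical, since VMM leaves the boundary's shape to each organization; only the directions (tolerance rising with value, falling with cost) come from the slide:

```python
def value_risk_limit(expected_value_score):
    """Max acceptable value risk rises with the expected value
    score (hypothetical linear boundary)."""
    return 10 + 0.3 * expected_value_score

def cost_risk_limit(expected_cost_musd):
    """Max acceptable cost risk falls as the expected cost grows
    (hypothetical linear boundary with a floor)."""
    return max(5.0, 40 - 1.0 * expected_cost_musd)

def within_tolerance(value_score, value_risk, cost, cost_risk):
    """True when both risk scores fall inside their boundaries."""
    return (value_risk <= value_risk_limit(value_score)
            and cost_risk <= cost_risk_limit(cost))

# An alternative scoring 80 with moderate risks stays inside
# both boundaries under these assumed lines.
print(within_tolerance(value_score=80, value_risk=25,
                       cost=20, cost_risk=15))
```

The point of the sketch is only that acceptance is a joint test: an alternative must clear both the value risk and the cost risk boundary.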
23
Cost
24
Identifying & Defining Costs
Consider Value and Risk:
  • Ensure a complete, comprehensive cost estimate
  • Alleviate the risk of missing costs or
    double-counting by developing a Cost Element
    Structure
  • Investments based on incomplete or inaccurate
    estimates are likely to run out of funding and,
    therefore, to require justification for additional
    funding or a reduction of initiative scope
Examples of costs tied to value factors:
  • Direct User Value: training; marketing; access
    (e.g., kiosks); incentives
  • Social Value: communications; public awareness;
    advertising; public relations
  • Government Operational/Foundational Value:
    maintaining legacy systems and processes during
    transitions; on-going maintenance of paper processes
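One way to organize a Cost Element Structure so nothing is double-counted is to keep estimates only at leaf elements and make parent elements pure roll-ups. The element names and amounts below are invented for illustration; only the numbered-structure idea comes from the slides:

```python
# A cost element structure as a nested dict: leaves carry
# estimates, parents are pure roll-ups, so no figure is
# counted twice. Names and amounts are illustrative.
ces = {
    "1.0 System Planning & Development": {
        "1.1 Hardware": 250_000,
        "1.2 Software": 400_000,
    },
    "2.0 System Acquisition & Implementation": {
        "2.1 Training": 120_000,
        "2.2 Marketing/Outreach": 80_000,
    },
    "3.0 System Maintenance & Operations": {
        "3.1 Legacy system upkeep during transition": 150_000,
    },
}

def rollup(node):
    """Sum leaf estimates; parent nodes never hold their own
    numbers, which prevents double-counting."""
    if isinstance(node, dict):
        return sum(rollup(child) for child in node.values())
    return node

print(rollup(ces))  # 1000000
```

A missing cost shows up as an empty or absent leaf, which is easy to spot in a structure like this; a duplicated one would require the same leaf to appear twice, which the tree shape discourages.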
25
Estimating and Comparing Value, Cost,
Risk
26
Identifying and Defining Viable Alternatives
Identify viable alternatives that have the
potential to deliver an optimum mix of both value
and cost efficiency.
Alternatives Must Address People, Process, and
Technology!
  • PEOPLE: training; outreach; management; staffing;
    communications; recruitment; socialization; user
    support; 508 requirements; language requirements;
    EA/FEA
  • TECHNOLOGY: hardware; software; interface; data
    requirements; EA/FEA
  • PROCESS: BPR; acquisition; outsourcing/in-sourcing;
    concept of operations; risk; security; program
    management; funding; collaboration; communications;
    evaluation; legislative requirements; policy
    requirements; EA/FEA
27
The Base Case
Projects the results of maintaining current
systems and processes (the status quo) while
attempting to keep pace with changes over time.
[Chart: the base case plotted over time against
rising demand, workforce attrition, and customer
satisfaction]
28
Collecting Data
Stage of Development: Data Sources
  • Strategic Planning: strategic and performance plans;
    subject matter expert input; new and existing user
    surveys; private/public sector best practices, lessons
    learned, and benchmarks; enterprise architecture;
    modeling & simulation; vendor/market surveys
  • Business Modeling & Pilots: subject matter expert
    input; data from analogous government initiatives; new
    and existing user surveys for each business line;
    private/public sector best practices, lessons learned,
    and benchmarks; refinement of modeling & simulation
  • Implementation & Evaluation: actual data from phased
    implementation; actual spending/cost data; user group
    and stakeholder focus groups and surveys; other
    performance measurement
Avoid "Analysis Paralysis": match information to
the phase of development. Data sources and detail
depend upon the initiative's stage of development.
Use the best information available rather than
looking for information that doesn't exist, and
update it as better information becomes available.
ALWAYS DOCUMENT DATA SOURCES & ASSUMPTIONS.
29
Using Ranges
USE RANGES TO INCREASE CONFIDENCE IN COST
ESTIMATES!
EXAMPLE: Projected Range of Training Costs
Inputs                               Low     Med     High
# of Employees to be Trained/year    1000    1200    1500
Annual Cost per Employee Trained     $100    $150    $200
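The example above works out as follows. Pairing low inputs with low inputs (and high with high) to bound the range is an analyst's judgment for the widest spread, not a VMM mandate:

```python
# Inputs from the slide 29 training-cost example.
employees = {"low": 1000, "expected": 1200, "high": 1500}
cost_per_employee = {"low": 100, "expected": 150, "high": 200}

# Multiply matching cases to bound the projected cost range.
projected = {case: employees[case] * cost_per_employee[case]
             for case in ("low", "expected", "high")}
print(projected)  # {'low': 100000, 'expected': 180000, 'high': 300000}
```

The projected training cost therefore ranges from $100,000 to $300,000, with $180,000 as the middle estimate, rather than a single point figure.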
30
Uncertainty and Sensitivity Analysis
Conduct Uncertainty and Sensitivity Analyses on
Both Cost & Value Estimates
  • Uncertainty Analysis
  • Based on considerations of requirement,
    cost-estimating, and technical uncertainty
  • Increases confidence in the estimate; doesn't
    increase the precision of the estimate
  • Tool: Monte Carlo Simulation
  • Output: most likely or expected cost and value
  • Sensitivity Analysis
  • Based on the output of the Monte Carlo Simulation
  • Sensitive variables have a significant impact on
    the overall estimate
  • Output: identification of which variables have a
    significant impact on the overall estimate; can
    be used to determine which variables merit
    additional research
31
Analyzing Cost Risk and Value Risk
ALTERNATIVE 1 - COST RISK ANALYSIS
Risk                     Probability   Cost Impacted                         Impact
Cost Overruns            Med           1.0 System Planning & Development     Low
Cost Overruns            Med           2.0 System Acquisition & Imp.         High
Cost Overruns            Med           3.0 System Maintenance & Operations   Med
Cost of Lost Info/Data   High          1.0 System Planning & Development     Med
Cost of Lost Info/Data   High          2.0 System Acquisition & Imp.         Med
Cost of Lost Info/Data   High          3.0 System Maintenance & Operations   Low
The impact of a single risk factor may differ in
magnitude at each point where it interacts with
cost and value.
The probability of a specific risk occurring
remains constant throughout the analysis of a
specific alternative, regardless of where it
impacts the value or cost of that alternative.
ALTERNATIVE 1 - VALUE RISK ANALYSIS
Risk                          Probability   Value Impacted                                      Impact
Cost Overruns                 Med           Total Cost Savings to Investment                    Low
Cost Overruns                 Med           Total Cost Avoidance to Investment                  Low
Cost of Lost Info/Data        High          Total Cost Savings to Investment                    Low
Cost of Lost Info/Data        High          Total Cost Avoidance to Investment                  Low
HW/SW Failure & Replacement   Med           Accessibility of e-Gov Services to Users            High
HW/SW Failure & Replacement   Med           User Trust in Internet Transactions                 High
HW/SW Failure & Replacement   Med           Application Owner Confidence in Identity of Users   High
HW/SW Failure & Replacement   Med           Reduction of Identity Fraud                         High
HW/SW Failure & Replacement   Med           Regulatory Compliance                               High
HW/SW Failure & Replacement   Med           Total Cost Savings to Investment                    High
HW/SW Failure & Replacement   Med           Total Cost Avoidance to Investment                  High
32
Pulling Together the Information
  • You should be able to answer the following
    questions
  • What is the estimated cost of each alternative?
  • What is the financial return on investment
    associated with the alternatives?
  • What is the value score associated with the
    alternatives?
  • What are the cost and value risks associated with
    this alternative? What effect do they have?
    (value and cost risk scores)
  • How do the value, risk and cost of the
    alternatives compare?
  • Do the cost risk and value risk associated with
    the alternatives fall within the range
    represented by the relevant risk tolerance
    boundaries?

33
Comparing Value to Cost
Investment Cost to Value (Expected & Risk-Adjusted)
[Chart: Value (0-100) plotted against Cost ($M, 0-40)
for Alternatives 1-3, with an expected and a
risk-adjusted point shown for each alternative]
Based on this information, which alternative
would you choose?
34
Comparing Value to Value Risk, and Cost to Cost
Risk
The risk associated with all of the value scores
falls within the acceptable area; Alt. 2 bears
the lowest value risk.
The only alternative that falls squarely within
the Cost Risk Boundary is Alt. 2.
35
The VMM Guide
36
The VMM How-To-Guide provides best-practice
analysis techniques, real examples, and required
resources.
37
VMM Step 1: Develop a Decision Framework
Define User Needs & Priorities; Quantifiable
Measures of Performance (Metrics, Targets);
Foundation for Analysis & On-going Performance
Measurement; Early Consideration of Risk
Task 1: Identify & Define the Value Structure
(prioritized value factors and value measures,
each with a metric, target, and scale)
Task 2: Identify & Define the Risk Structure
(risk inventory; risk tolerance boundary)
Task 3: Identify & Define the Cost Structure
(customized cost element structure)
Task 4: Begin Documentation
38
VMM Step 2: Alternatives Analysis
(estimate value, cost, risk)
Task 1: Identify & Define Alternatives
(viable alternatives; the base case: what will
happen if nothing changes?; match levels of
information to the phases of development)
Task 2: Estimate Value & Cost
(low/expected/high estimates against the Step 1
value and cost structures, refined through
uncertainty and sensitivity analysis)
Task 3: Conduct Risk Analysis
(risk inventory; risk tolerance boundary)
Task 4: On-going Documentation
39
VMM Step 3: Pull Together the Information
Task 1: Aggregate the Cost Estimate
(expected cost)
Task 2: Calculate the Return-on-Investment
(expected ROI)
Task 3: Calculate the Value Score
(expected value score)
Task 4: Calculate the Risk Scores
(risk-adjusted expected value and cost; risk scores)
Task 5: Compare Value, Risk, & Cost
(risk-adjusted expected ROI; government cost
savings/avoidance)
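A hedged sketch of how the Step 3 quantities might relate. Discounting benefits by a value risk score and inflating cost by a cost risk score is an illustrative simplification, not the guide's exact formula, and all figures are invented:

```python
# Invented inputs for illustration only.
expected_cost = 10_000_000      # from the aggregated cost estimate
expected_savings = 14_000_000   # government cost savings/avoidance
value_risk = 0.15               # assumed 15% slippage in benefits
cost_risk = 0.20                # assumed 20% cost growth

# Expected ROI on the unadjusted figures.
expected_roi = (expected_savings - expected_cost) / expected_cost

# Risk adjustment (simplified): shrink benefits, grow cost.
adj_savings = expected_savings * (1 - value_risk)
adj_cost = expected_cost * (1 + cost_risk)
risk_adjusted_roi = (adj_savings - adj_cost) / adj_cost

print(f"expected ROI:      {expected_roi:.1%}")
print(f"risk-adjusted ROI: {risk_adjusted_roi:.1%}")
```

The instructive point is that a comfortably positive expected ROI (40% here) can turn negative once value and cost risk are applied, which is why Step 3 compares the risk-adjusted figures, not just the expected ones.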
40
VMM Step 4: Communicate and Document
Task 1: Communicate Value to Customers and
Stakeholders
Task 2: Prepare Budget Justification Documents
Task 3: Satisfy Ad Hoc Reporting Requirements
Task 4: Use Lessons Learned to Improve Processes
(reporting; consensus building; investment
planning; management planning)
41
Q & A
42
VMM establishes an even scale for quantifying and
analyzing value, risk, and cost
  • Measures tangible and intangible benefits
  • Accounts for risk in cost and value calculations
  • Increases reliability of ROI through simulation
  • Tested and proven in multiple E-Gov projects
  • Flexible and adaptable
  • Results and outcome driven
  • Allows examination of the relationships among
    Value, Cost and Risk
  • Feasible for portfolio management

43
  • Building a Methodology for Measuring the Value of
    e-Services
    http://www.estrategy.gov/documents/measuring_finalreport.pdf
  • VMM How-To-Guide and VMM Highlights
    http://www.cio.gov/ (best practices page)
    http://www.cio.gov/documents/ValueMeasuring_Methodology_HowToGuide_Oct_2002.pdf
    http://www.cio.gov/documents/ValueMeasuring_Highlights_Oct_2002.pdf