Title: Advancing VMM Project Level Expert Choice Session
Status Briefing
Business Case for the XML.gov Registry
Washington, DC, August 1, 2002
This document is confidential and is intended
solely for the use and information of the client
to whom it is addressed.
Agenda
- XML.gov Value Proposition/Profile
- Performance Measures, Metrics, and Scoring for Benefits in the VMM Analysis
- Alternatives Definition and Analysis
- Cost Models/Cost Element Structures for All Alternatives
- Next Steps
VMM Decision Framework Development - Overview
[Framework diagram: value per dollar invested and return on investment assessed against a risk tolerance boundary, with value decomposed into project-specific sub-criteria]
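The briefing does not spell out how "Value per Dollar Invested" is computed. As a rough sketch of the usual VMM-style roll-up (the weights and the set of project-specific sub-criteria are decided during the analysis and are not defined here):

\[
\text{Value per Dollar Invested} \approx \frac{\sum_i w_i \, s_i}{\text{total lifecycle cost of the alternative}}
\]

where the \(s_i\) are normalized scores on the project-specific sub-criteria and the \(w_i\) are their weights; risk enters separately through the risk tolerance boundary shown in the diagram.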
Introduction to Performance Measures, Metrics, and Scoring for Registry/Repository Benefits
Metrics and Scoring of Benefits
- A performance measure, such as "total number of users," is associated with each benefit. The measure identifies how the initiative owners would demonstrate that they have delivered the benefit in question. If the benefit were increased size and awareness of the user community, it might be measured by the total number of users.
- The term "metric" simply refers to the actual numbers associated with a given measure. If the ideal total number of users is 100,000, then 100,000 is the metric.
- For each measure, we must determine threshold (minimum), targeted, and ideal levels of performance (an illustrative scoring sketch follows this list).
- Threshold (T): the minimum level of performance that must be achieved in order to get any value out of the initiative.
- Target: the level of performance the initiative owners are shooting for.
- Ideal (I): the level of performance in the best of all possible worlds, if everything works out perfectly.
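The deck defines the three performance levels but not how an observed metric is converted into a score. The sketch below shows one plausible convention only: piecewise-linear interpolation onto a 0-100 scale, with the anchor scores (threshold = 0, target = 75, ideal = 100) chosen purely for illustration; the function name and signature are assumptions, not part of the briefing.

    def score_measure(observed: float, threshold: float, target: float, ideal: float) -> float:
        """Illustrative piecewise-linear scoring of an observed metric against
        threshold/target/ideal levels. Assumes the three levels are distinct.
        Handles both higher-is-better measures (ideal > threshold, e.g. uptime)
        and lower-is-better measures (ideal < threshold, e.g. clicks per search)."""
        # Orient the scale so that "better" is always numerically larger.
        sign = 1.0 if ideal >= threshold else -1.0
        obs, thr, tgt, idl = (sign * v for v in (observed, threshold, target, ideal))

        if obs <= thr:
            return 0.0    # at or below threshold: no credit
        if obs >= idl:
            return 100.0  # at or above ideal: full credit
        if obs <= tgt:    # between threshold and target
            return 75.0 * (obs - thr) / (tgt - thr)
        return 75.0 + 25.0 * (obs - tgt) / (idl - tgt)  # between target and ideal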
Performance Measures, Metrics, and Scoring
Direct User or Customer Value/Benefits
Metrics and Scoring of Benefits
- Ease of Use of XML Resources and Information
  - Average annual increase in the number of artifacts in government-sponsored registry/repositories - T 15% / Target 60% / I 100%
- Single Authoritative Source for Agreed-Upon Schemas for XML Artifacts for Inherently Governmental Data/Government-Unique Requirements
  - Survey of direct user/customer population segments within a chosen sample of communities of interest (e.g., human resources, environmental)
  - % of respondents indicating that they use schemas in government-sponsored registry/repositories as the primary way of conducting electronic business transactions with the Government - T 25% / Target 65% / I 80%
- Improved Search, Discovery, Access, and Analysis Capabilities for Functional Users/Communities of Interest
  - Average number of clicks per search/query to achieve desired results - T 5 / Target 4 / I 3
Performance Measures, Metrics, and Scoring
Direct User or Customer Value/Benefits
Metrics and Scoring of Benefits
- Ease of Submission for Posting Schemas and Artifacts
  - Cycle time in business days from logon to notification of acceptance/rejection of submission - T 5 / Target 3 / I 2
- Broad Knowledge-Sharing Capabilities
  - Annual increase in the number of users of government-sponsored reg/reps - T 15% / Target 60% / I 100%
- Improved Availability of XML Data and Information (such as Schemas and Artifacts) for Communities of Interest
  - XML.gov Registry/Repository site/system uptime in % - T 98% / Target 99.5% / I 99.9% (worked scoring example after this list)
- Time Savings Due to Efficient and Effective Communications among Communities of Interest
  - % of users indicating that the use of XML artifacts reduced the time to conduct business - T 50% / Target 75% / I 90%
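To show the mechanics, the illustrative score_measure sketch introduced after the scoring overview can be applied to the uptime measure above and the clicks-per-search measure on the previous slide; the observed values are hypothetical and are not data from the briefing.

    # Hypothetical observation: 99.0% uptime vs. T 98% / Target 99.5% / I 99.9%
    score_measure(observed=99.0, threshold=98.0, target=99.5, ideal=99.9)  # -> 50.0

    # Hypothetical observation: 4 clicks per search vs. T 5 / Target 4 / I 3 (lower is better)
    score_measure(observed=4, threshold=5, target=4, ideal=3)              # -> 75.0 (target met)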
Metrics and Scoring of Benefits
Performance Measures, Metrics, and Scoring
Social Value/Benefits
- Coordination and Streamlining of Intergovernmental Data Collection and Sharing
  - Number of major intergovernmental projects that involve electronic communications/transactions using artifacts from govt-sponsored reg/reps - T 5 / Target 35 / I 50
- More Efficient Use of Taxpayer Dollars
  - Average annual increase in the number of downloads of artifacts from govt-sponsored reg/reps - T 15% / Target 30% / I 50%
Metrics and Scoring of Benefits
Performance Measures, Metrics, and Scoring
Operational/Foundational Value/Benefits
- Improved Interagency Collaboration (Current and Foundation for Future Collaboration)
  - Number of major federal cross-agency projects that involve electronic communications/transactions using artifacts from govt-sponsored reg/reps - T 5 / Target 34 / I 50
- Increase in Productivity and Efficiency in Government Operations
  - Increases in productivity are a function of positive movement, meeting minimum levels of performance on an annual basis, for 5 key indicators. All of the following must occur in order to achieve productivity gains (a sketch of this composite check follows this list): average annual increase in the number of artifacts in govt-sponsored reg/reps ≥ 15%; % of users indicating that using artifacts from govt reg/reps saved time ≥ 50%; number of major intergovernmental projects that involve electronic communications/transactions using artifacts from govt reg/reps ≥ 5; number of major federal cross-agency projects that involve electronic communications/transactions using artifacts from reg/reps ≥ 5; average annual increase in the number of downloads of artifacts from govt reg/reps ≥ 15%.
  - Have govt-sponsored XML reg/reps yielded productivity gains in system development? - Binary measure (Y/N) - T / Target / I
Metrics and Scoring of Benefits
Performance Measures, Metrics, and Scoring
Operational/Foundational Value/Benefits
- Efficient Reuse and Adaptation of Existing XML Efforts and Consolidation of Currently Fragmented Federal XML Efforts
  - Ratio of submissions to retrievals of schemas on govt-sponsored reg/reps - T 1:3 / Target 1:40 / I 1:100
- Minimization of Administrative Burdens Associated with Posting/Creating XML Artifacts
  - % of agency managers indicating that the administrative burden of posting/creating XML artifacts is minimal - T 50% / Target 75% / I 90%
- Facilitation of Data and Information-Sharing Among Disparate Systems and Entities/Interoperability
  - Number of major intergovernmental (state/local with federal) projects that involve electronic communications/transactions using artifacts from govt-sponsored reg/reps - T 5 / Target 35 / I 50
Which Alternatives Will Be Compared, and How Have They Been Defined?
Alternatives Analysis
- Status Quo/Base Case: Undertaking no coordination activities to standardize data and ensure the interoperability of government-sponsored registry/repositories, allowing any and all agencies to build, operate, and maintain as many reg/reps, with as many different underlying technologies and specifications, as they choose.
- Single Unified Registry/Repository: Building a single federal reg/rep from scratch that will require every federal agency wishing to publish schemas or artifacts to go through, and provide submissions to, the central reg/rep for review and approval. This alternative requires the termination of all current XML activities in agencies (EPA, DoD, etc.) and would require existing activities to be subsumed by the new single reg/rep.
- Federated/Distributed Model: Each agency or entity may stand up its own reg/rep; however, it must do so according to specifications that ensure interoperability with the central government-wide (XML.gov) portal/reg-rep. Agencies electing not to build their own reg/reps may publish information on the central reg/rep.
Standard Cost Element Structure
Cost Element Structure
- System Development and Planning: includes personnel costs associated with the studies, planning, documentation, and analysis required for the project, along with any personnel, hardware, and software necessary for a testing environment for new systems or applications. It also captures program management and oversight activities.
- System Acquisition and Implementation: includes the acquisition of the actual project hardware and software, as well as the personnel required to accomplish implementation.
- System Maintenance and Operation: includes upgrades, maintenance, and recurring training (an illustrative, populated sketch of the structure follows this list).
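As a rough illustration of how this cost element structure might be populated for a single alternative: the three category names follow the slide, but the nested line items and the zero dollar figures are placeholders, not estimates from the briefing.

    # Placeholder cost element structure for one alternative; all figures are illustrative.
    cost_element_structure = {
        "System Development and Planning": {
            "studies, planning, documentation, and analysis": 0.0,
            "test environment (personnel, hardware, software)": 0.0,
            "program management and oversight": 0.0,
        },
        "System Acquisition and Implementation": {
            "project hardware and software": 0.0,
            "implementation personnel": 0.0,
        },
        "System Maintenance and Operation": {
            "upgrades and maintenance": 0.0,
            "recurring training": 0.0,
        },
    }

    def lifecycle_cost(ces: dict) -> float:
        """Sum every line item to get the total lifecycle cost of the alternative."""
        return sum(sum(items.values()) for items in ces.values())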
Cost Element Structure
Standard Cost Element Structure - System Planning and Development
Cost Element Structure
Standard Cost Element Structure - System Acquisition and Implementation
Cost Element Structure
Standard Cost Element Structure - System Maintenance and Operation
Next Steps
- Complete Cost Estimates for 3 Alternatives
- Populate VMM Framework, Run Model, and Analyze VMM Output
- Compare Value of Alternatives and Select Alternative/Develop Recommendation
- Complete First Draft