Title: Bidder Brief - VACE Phase III BAA Discussion
1 Bidder Brief: VACE Phase III BAA Discussion
- Dennis Moellman, VACE Program Manager
- 20 January 2006
2 Administration
- Web conference (2 hours max.)
- Simultaneous oral broadcast and visual presentation
- On the website there is a chat capability
- Presentation is being recorded
- For playback access, see the BAA website
- Participants
- Myself, Dennis Moellman
- John Garofolo, NIST -- Evaluation
3 Brief Outline
- Introduction
- Context
- Background
- Technology
- Programmatics
- Schedule
- Evaluation Overview
- Formal QA
- Ad Hoc QA
4 Context
What's ARDA/VACE/DTO?
- ARDA - Advanced Research and Development Activity
- A high-risk/high-payoff R&D effort sponsored by the US DoD/IC
- VACE - Video Analysis and Content Extraction
- ARDA is taking on a new identity
- Now under the DNI
- Reports to ADNI(S&T)
- Renamed the Disruptive Technology Office
- "Exploit path-breaking scientific and research advances that will enable us to maintain and extend intelligence advantages" (DNI National Intelligence Strategy)
- New location: Adelphi, MD, middle of 2006
- Winding down Phase II
- Entering into Phase III
- Schedule on the succeeding VU-graph and in the BAA
5 Background
- What is the problem?
- Analysts need robust, automated video tools
- What are the barriers to solving this problem?
- Effectively ingesting, indexing, managing, accessing, and understanding a large video corpus from multiple heterogeneous sources
- How will you overcome those barriers?
- By automating manual video activities
- What is the capability you are developing?
- Video analysis systems
- Video analysis system components
- Basic technology breakthroughs
- Formal evaluation procedures, metrics, and data sets
6 VACE Roadmap
Object Detection & Tracking
Object/Scene Classification
Object Recognition
Object Modeling
Simple Event Detection
Event Recognition
Complex Event Detection
Scene Modeling
Event Understanding
Mensuration
Indexing
Video Browsing
Summarization
Advanced query/retrieval using QA technologies
Content-based Routing
Video Mining
Change Detection
Video Monitoring
Image Enhancement/Stabilization
Camera Parameter Estimation
Multi-modal fusion
Enabling Technologies
Integrity Analysis
Motion Analysis
Event Ontology
Event Expression Language
Automated Annotation Language
Evaluation
7 Development
- R&D Process
- Understand a requirement
- Do the science to develop a capability
- Test and validate (evaluation)
- Make operationally viable

Metric    Current        Objective
Accuracy  << Human       > Human
Speed     >> Real time   < Real time
8 Changing Paradigm
- Will be advertised as a DTO initiative
- Contracting Agency: DOI, Ft. Huachuca, AZ
- A central focal point
- Enables the Phase III schedule to stay synchronized
- Less difficulty with foreign researchers
- Have a DOI central COR
- Hope to retain DoD/IC agency COTRs
- Emphasis on both technology and system approach
- Move up the technology path where applicable
- Stress ubiquity
9 New Paradigm
Source Video
Enhancement Filters
Understanding Engine
Recognition Engine
Visualization
Extraction Engine
Intelligent Content Services
Concept Applications
10 Programmatics
- Open BAA
- Released early CY2006
- Spans 3 FYs and 4 CYs
- BAA remains open through 6/30/08
- Funding objective: $30M over program life
- Anticipated to grow in FY07 and beyond
- Addresses the same data source domains as Phase II
- Will conduct formal evaluations
- Will conduct maturity evaluations and tech transfer
11 Programmatics (cont.)
- Divided into two tiers
- Tier 1: One-year base with an option year
- Technology focus
- Open to all US and international bidders
- More awards for lesser funding
- Tier 2: Two-year base with option year(s)
- Comprehensive component/system-level initiative
- Must have a US prime
- Fewer awards for greater funding
12 Tentative Schedule
- 12/01/05: Notice of intent in FedBizOpps; register interest, build teaming base
- 12/15/05: Draft BAA posted; register for Bidders Brief
- 1/13/06: Comments/questions cutoff
- 1/20/06: Bidders Brief (planned virtual meeting)
- 2/1/06: Final BAA announcement
- 3/03/06: Proposals due to Government
- 4/15/06 to 4/30/06: Evaluation recommendations completed (depending on response)
- 6/30/06: Contract awards completed
- 6/30/08: BAA extension period
- 6/30/09: BAA completion
13 VACE Phase-III Evaluation
14 Goals
- Objective assessment of key technologies
- Summary of the state-of-the-art for DTO management, for planning
- Summative and developmental feedback for researchers
- Evaluation includes
- Necessary source data and gold-standard annotations
- Metrics/evaluation software tools
- Protocols/plans for formal evaluation
- Informative primary and contrastive test conditions
- Technical workshops
- Evaluation is a critical part of the research
15 VACE Approach
- Government coordinates certain evaluations to
- Address problems that are important to the Government and assess the state-of-the-art
- Develop critical mass in important research areas and focus the research community
- Maximally leverage data resources
- Provide common ground for exchange of knowledge
16 Government-coordinated Evaluation Selection Process
- Assess DTO needs
- Strategic interest and emphases
- Via VACE PM and Government Advisory Panel
- Assess researcher needs and desires
- Interest and probable participation levels
- Required effort
- Via Evaluation Forum and cross-program committees
- Assess impact
- Effect on key technologies
- Relationship to other programs
- Assess feasibility
- Required/available data resources
- Required annotation and evaluation tool development effort
- Estimate of the current state-of-the-art
- Prioritize, down-select, and refine
- Co-optimize the above
- Refine tasks and metrics through an arbitration process
17 VACE-II Evaluations
- References are included to give an example of the breadth and process in Phase-II; they are not intended to be prescriptive of evaluation in Phase-III.
- Tasks: person/face/hand/text/vehicle detection and tracking, text recognition, shot boundary detection, excerpt search, feature extraction
- Domains: broadcast news, meetings, UAV, surveillance (security cam)
- Venues: VACE Core Eval, TRECVID, CLEAR, RT
- New Government-supported evaluation tasks are likely to be developed after Phase-III selection, via the selection process described above.
18 Proposal Requirements
- Clearly specify key critical research elements and how progress should be evaluated
- Existing VACE-sanctioned evaluations and/or metrics
- Outside existing evaluations
- Must describe or provide sufficiently informative references that address the questions in Appendix B
- New or non-sanctioned evaluation tasks/protocols must be detailed in an evaluation plan addressing the questions in Appendix B
- Must provide a proposal for evaluation, proposed metrics, goals and milestones, schedule, and necessary data infrastructure (including whether resources need to be created or acquired)
- Elements must be mapped to VACE technical objectives
- Must demonstrate a clear understanding of how evaluation will be integrated into the research process
19 Proposal Requirements (cont'd)
- Must address both performance and efficiency (processing speed)
- Generally accepted performance metrics should be used if possible
- Otherwise, new metrics should be suggested; they should be the minimum necessary to summarize performance
- Objective metrics and repeatability should be emphasized
- Processing speed determination should be on a single CPU if possible
- Should be reported as times-realtime for automated systems
- Near-realtime goals will be favored
- User-interaction-based technologies must report standard usability metrics
- Proposals with baselines will be favored
- Estimates must be provided for proposals with no baseline; a baseline must be established within 6 months of award
- Must include objective performance and efficiency goals and a timeline for achieving these goals
- Awarded contracts will be measured against these goals
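The times-realtime efficiency reporting described above amounts to a single ratio. A minimal sketch follows; the function name and the idea of reporting a bare xRT number are illustrative assumptions, not language from the BAA:

```python
# Hedged sketch: computing a "times-realtime" (xRT) processing-speed factor
# for an automated video-analysis run on a single CPU.

def times_realtime(processing_seconds: float, media_seconds: float) -> float:
    """xRT = wall-clock processing time / duration of the media processed.

    xRT < 1.0 means faster than realtime; xRT > 1.0 means slower.
    """
    if media_seconds <= 0:
        raise ValueError("media duration must be positive")
    return processing_seconds / media_seconds

# Example: 90 s of single-CPU processing for a 60 s clip is slower than realtime.
print(times_realtime(90.0, 60.0))  # 1.5
```

Under this convention, the "near-realtime goals" favored above correspond to xRT values at or just above 1.0.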
20 Evaluation Budgeting
- Minimum of 10% for evaluation
- Planning and arbitration process described above; must devote at least one representative to this process
- Adaptation of research algorithms for evaluation processing
- Dry-run evaluations
- Developmental intrinsic evaluations
- Formal extrinsic evaluations
- Participation in evaluation workshops
- Should NOT include algorithm development or hardware/software for such development
21 Data Resources
- Effort will be made to provide pertinent source data resources
- However, the Government cannot guarantee delivery of such data
- Acquisition is very difficult and time-consuming
- Collection efforts in the meeting, surveillance, and UAV domains are ongoing
- International collaborations are helping to accelerate the immediate availability of data
- Consideration will be given to research that can work with existing or near-ready data resources
22 Current VACE Source Data Resources
- TRECVID broadcast news and BBC rushes collections
- NIST Meeting Corpora
- Multi-site meeting corpora and NIST corpora
- CHIL/AMI meeting corpora
- I-LIDS surveillance data
- VISA surveillance data (coming soon)
- DARPA VIVID UAV data
- VACE UAV data (coming during Phase-III)
- LDC Broadcast News data
23 VACE Evaluation Programs
- VACE Core Evaluation (ongoing)
- VACE-supported evaluations implemented internally to the VACE program (will be reduced in Phase-III)
- CLEAR - Classification of Events, Activities, and Relationships (Spring)
- Collaborative cross-program/international evaluation program focused on multi-modal spatial analyses
- TRECVID - TREC Video (Summer)
- Collaborative international evaluation program focused on multi-modal search
- RT - Rich Transcription (Spring)
- Collaborative cross-program/international evaluation program focused on multi-modal extraction of content from language
24 Evaluation Workshops
- Separate annual workshops will be held for CLEAR, TRECVID, and RT
- RT and CLEAR will rotate between the US and Europe
- Plan to send key technical personnel (primes and subs) to each of these workshops as appropriate
- Must attend workshops for which you participated in the associated evaluations
- Expect to prepare technical presentations and papers for each workshop you participate in
- Evaluation results will be made public via the proceedings of these workshops
25 Scope
26 Scope
27 Scope
28 New Paradigm
Source Video
Enhancement Filters
Understanding Engine
Recognition Engine
Visualization
Extraction Engine
Intelligent Content Services
Concept Applications
29 Scope (final)
30 Tier Structure
31 Tier Structure
32 Evaluations
33 Evaluations
34 Evaluations
35 Data Management
36 Tech Assessment
37 Tech Assessment
38 Ad Hoc QA