1

VACE Executive Brief for MLMI
  • Dennis Moellman, VACE Program Manager

2
Briefing Outline
  • Introduction
  • Phase II
  • Evaluation
  • Technology Transfer
  • Phase III
  • Conclusion

3
Introduction
What is ARDA/DTO/VACE?
  • ARDA: Advanced Research and Development Activity
  • A high-risk/high-payoff R&D effort sponsored by
    the US DoD/IC
  • ARDA is taking on a new identity
  • In FY2007 it falls under the DNI
  • Reports to the ADNI(S&T)
  • Renamed the Disruptive Technology Office (DTO)
  • VACE: Video Analysis and Content Extraction
  • A three-phase initiative begun in 2000 and ending
    in 2009
  • Winding down Phase II
  • Entering Phase III

4
Context
Video Exploitation Barriers
  • Problem Creation
  • Video is an ever-expanding source of imagery and
    open-source intelligence, and it now commands a
    place in all-source analysis
  • Research Problem
  • Lack of robust software automation tools to
    assist human analysts
  • Human operators are required to manually monitor
    video signals
  • Human intervention is required to annotate video
    for indexing purposes
  • Content-based routing driven by automated
    processing is lacking
  • Flexible ad hoc search and browsing tools do not
    exist
  • Video Extent
  • Broadcast news, surveillance, UAV, meetings, and
    ground reconnaissance

5
Research Approach
Video Exploitation
  • Research Objectives
  • Basic technology breakthroughs
  • Video analysis system components
  • Video analysis systems
  • Formal evaluation procedures, metrics, and data
    sets
  • Evaluate Success
  • Quantitative Testing
    Metric      Current        Need
    Accuracy    < Human        >> Human
    Speed       > Real time    << Real time
  • Technology Transition
  • Over 70 technologies identified as deliverables
  • 50 have been delivered to the government
  • Over 20 undergoing government evaluation

6
Management Approach
Geared for Success
  • Management Philosophy: NABC
  • N: Need
  • A: Approach
  • B: Benefit
  • C: Competition

7
Interests
System View
8
VACE Interests
Technology Roadmap
9
Funding
Commitment to Success
[FY06 and FY07 funding allocation charts]
10
Phase II
Programmatics
  • Researcher Involvement
  • Fourteen contracts
  • Researchers represent a cross-section of industry
    and academia across the U.S., partnering to
    reach a common goal
  • Government Involvement
  • Taps technical experts, analysts, and COTRs from
    DoD/IC agencies
  • Each agency is represented on the VACE Advisory
    Committee, an advisory group to the ARDA/DTO
    Program Manager

11
Phase II
Demographics
12
Phase II
Projects
13
Phase II
Projects
14
Phase II
Projects
15
Evaluation
Goals
  • Programmatic
  • Inform ARDA/DTO management of progress/challenges
  • Developmental
  • Speed progress via iterative self-testing
  • Enable research and evaluation via essential data
    and tools; build lasting resources
  • The key is selecting the right tasks and metrics
  • Gear evaluation tasks to the research suite
  • Collect data to support all research

16
Evaluation
The Team
NIST
USF
Video Mining
17
Evaluation
NIST Process
18
Evaluation
NIST Mechanics
19
Evaluation
2005-2006 Evaluations
P = Person, F = Face, V = Vehicle, T = Text
20
Evaluation
Quantitative Metrics
  • Evaluation Metrics (illustrated in the sketch
    after this list)
  • Detection: SFDA (Sequence Frame Detection
    Accuracy)
  • Metric for determining the accuracy of a
    detection algorithm with respect to space, time,
    and the number of objects
  • Tracking: STDA (Sequence Track Detection
    Accuracy)
  • Metric for determining detection accuracy along
    with the ability of a system to assign and track
    the ID of an object across frames
  • Text Recognition: WER (Word Error Rate) and CER
    (Character Error Rate)
  • In-scene and overlay text in video
  • Focused Diagnostic Metrics (11)
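A minimal Python sketch of two of the metric families above, not the official VACE/NIST scoring code: it assumes a simple greedy box matching in place of the metric's optimal one-to-one object mapping, shows only the per-frame part of SFDA (the sequence score averages it over a clip), and uses a standard Levenshtein word error rate; function names are illustrative. CER is the same dynamic program applied to characters instead of words.

```python
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0


def frame_detection_accuracy(gt: List[Box], det: List[Box]) -> float:
    """SFDA-style per-frame score: summed overlap of matched
    ground-truth/detection pairs, normalized by the average object
    count.  (The published metric uses an optimal one-to-one mapping;
    greedy matching keeps this sketch short.)"""
    if not gt and not det:
        return 1.0
    if not gt or not det:
        return 0.0
    remaining = list(det)
    overlap = 0.0
    for g in gt:
        best = max(range(len(remaining)), key=lambda i: iou(g, remaining[i]))
        overlap += iou(g, remaining.pop(best))
        if not remaining:
            break
    return overlap / ((len(gt) + len(det)) / 2.0)


def word_error_rate(ref: str, hyp: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference length,
    via the standard Levenshtein dynamic program over words."""
    r, h = ref.split(), hyp.split()
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(r)][len(h)] / max(len(r), 1)


# Example: one frame with a single slightly offset detection, and one
# hypothesized line of overlay text.
print(frame_detection_accuracy([(0, 0, 10, 10)], [(1, 1, 10, 10)]))  # ~0.81
print(word_error_rate("breaking news from baghdad",
                      "breaking news in baghdad"))                   # 0.25
```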

21
Evaluation
Phase II Best Results
22
Evaluation
Face Detection BNews (Score Distribution)
23
Evaluation
Text Detection BNews (SFDA Score distribution)
24
Evaluation
Open Evaluations and Workshops -- International
  • Benefit of open evaluations
  • Knowledge about others' capabilities and
    community feedback
  • Increased competition -> progress
  • Benefit of evaluation workshops
  • Encourage peer review and information exchange,
    minimize wheel reinvention, focus research on
    common problems, provide a venue for publication
  • Current VACE-related open evaluations
  • VACE Core Evaluations
  • CLEAR: Classification of Events, Activities, and
    Relationships
  • RT: Rich Transcription
  • TRECVID: Text Retrieval Conference Video Track
  • ETISEO: Évaluation du Traitement et de
    l'Interprétation de Séquences Vidéo

25
Evaluation
Expanded
26
Evaluation
Schedule
27
TECH TRANSFER
DTO Test and Assessment Activities
  • Purpose: Move technology from the lab to operations
  • Technology Readiness Activity
  • An independent repository for test and assessment
  • Migrate technology out of the lab environment
  • Assess technology maturity
  • Provide recommendations to DTO and researchers

28
TECH TRANSFER
DoD Technology Readiness Levels (TRL)
29
Technology Transfer
Applying TRL
DoD Technology Risk Scale
[Diagram: risk scale spanning TRL bands 1-3, 4-5, 6-7, and 8-9
(Production), with risk running from HIGH at the low TRLs to LOW at the
high TRLs; test venues progress from contractor, Info-X, and IC/DoD test
facilities (unclassified and classified) toward production, and DTO
influence becomes DTO control at the upper levels]
Use in assessing projects
  • Technology maturity
  • Risk level
  • Commercialization potential
30
Technology Transfer
TRA Maturity Assessments
31
Phase III BAA
Programmatics
  • Contracting Agency: DOI, Ft. Huachuca, AZ
  • DOI provides the COR
  • ARDA/DTO retains DoD/IC agency COTRs and adds more
  • Currently in the proposal review process
  • Spans 3 FYs and 4 CYs
  • Remains open through 6/30/08
  • Funding objective: $30M over the program's life
  • Anticipated to grow in FY07 and beyond
  • Addresses the same data source domains as Phase II
  • Will conduct formal evaluations
  • Will conduct maturity evaluations and technology
    transfer

32
Phase III BAA
Programmatics
  • Emphasis on technology and a system approach
  • Move up the technology path where applicable
  • Stress ubiquity
  • Divided into two tiers
  • Tier 1: One-year base with an option year
  • Technology focus
  • Open to all US and international performers
  • More awards at lower funding levels
  • Tier 2: Two-year base with option year(s)
  • Comprehensive component/system-level initiative
  • Must have a US prime contractor
  • Fewer awards at higher funding levels

33
Phase III BAA
Schedule
34
Summary
Take-Aways
  • VACE is interested in
  • Solving real problems with risky, radical
    approaches
  • Processing multiple data domains and multimodal
    data domains
  • Developing technology point solutions as well as
    component/system solutions
  • Evaluating technology progress
  • Transferring technology into the users' space

35
Conclusion
Potential DTO Collaboration
  • Invitations
  • You are welcome to participate in VACE Phase III
  • You are welcome to participate in the VACE Phase
    III evaluations

36
Contacts
  • Dennis Moellman, Program Manager
  • Phones: 202-231-4453 (Dennis Moellman)
    443-479-4365 (Paul Matthews)
    301-688-7092 (DTO Office)
    800-276-3747 (DTO Office)
  • FAX: 202-231-4242 (Dennis Moellman)
    301-688-7410 (DTO Office)
  • E-Mail: dennis.moellman@dia.mil (Internet Mail)
    pmmatth@nsa.gov
  • Location: Room 12A69, NBP 1, Suite 6644
    9800 Savage Road
    Fort Meade, MD 20755-6644