Transcript and Presenter's Notes

Title: Unmanned and Robotics Systems Autonomy and Interoperability


1
  • Unmanned and Robotics Systems Autonomy and
    Interoperability
  • Challenges and Solutions
  • Conference on Innovation Support for National
    Security
  • June 9, 2009
  • Jeffrey Wallace, CTO
  • Carpe Occasio Technology Systems

Carpe Occasio Technology Systems
Seize the Moment!
2
Overview
  • Definitions: What do we mean?
  • Challenges
  • High Performance Interoperability Infrastructure
  • Example: Fire Control (Mechanical Synchronization
    of a Gun)
  • Autonomy Support
  • Bridging the Gap Between Humans and Machines
  • Example: Tasking an Intelligence, Surveillance, and
    Reconnaissance (ISR) Asset
  • Summary

3
Measuring Architectures and Ideas: Marginal Complexity
  • How many steps/operations are required to add one
    new feature/capability?
  • A measure of marginal complexity (see the
    illustrative calculation below)
  • What does it cost to do one thing of interest at
    all?
  • A firm understanding of fixed costs
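As an illustrative calculation (an assumption, not taken from the slides):
with point-to-point integration of n components, adding one more component
requires n new interfaces, whereas a shared interoperability infrastructure
keeps the marginal cost at one adapter per new component. A minimal C++
sketch of that comparison:

    #include <iostream>

    // Illustrative marginal-complexity comparison (an assumption, not from
    // this deck): pairwise point-to-point integration vs. a shared
    // interoperability infrastructure.
    int pointToPointInterfaces(int n) { return n * (n - 1) / 2; } // pairwise links
    int sharedInfraInterfaces(int n)  { return n; }               // one adapter each

    int main()
    {
        for (int n = 2; n <= 6; ++n) {
            std::cout << "components=" << n
                      << "  point-to-point=" << pointToPointInterfaces(n)
                      << " (marginal +" << (n - 1) << ")"
                      << "  shared infrastructure=" << sharedInfraInterfaces(n)
                      << " (marginal +1)\n";
        }
        return 0;
    }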

4
New Approach to Interoperability
  • How to make different parts of a system work
    together at different levels of resolution
  • The general interoperability problem involves
    information transfer and synchronization of
    activity between parts of a system
  • There are at least four distinct levels or
    types of interoperability problems to consider

Support Autonomy from an Interoperability
Perspective
5
Unmanned and Robotics Systems Interoperability
  • Intra-system, component-level interoperability
    issues (to enable interchangeable parts within an
    unmanned system)
  • Inter-system interoperability between different
    unmanned or robotic systems, which is required
    for such systems to cooperate and perform complex
    missions in the battlespace or disaster space
  • Interaction with manned systems: unmanned and
    robotics systems must interoperate with a wide
    variety of manned platforms, weapons systems, and
    C4ISR systems
  • Interaction with personnel on scene: whether
    first responders in a civil emergency, troops
    operating in theater or in combat, etc., and
    in a wide variety of situations such as
    logistical support, casualty evacuation, and
    manned/unmanned combined arms operations

6
Nightingale II: Concept of Operations
(Operational-view diagram; the recoverable callouts are summarized below.)
  • Call for MedEvac received at Nightingale Control
  • Best UAV is chosen automatically (a simple selection
    sketch follows this list)
  • Route is autonomously planned and uploaded, avoiding
    no-fly zones
  • UAV is launched automatically
  • Autonomous transit from starting point, to pick-up
    point, to medical unit
  • Autonomous collision/obstacle avoidance and
    autonomous clear-zone landing
  • UGV/BEAR deploys; BEAR recovers the casualty; medic
    treats
  • UAS/UGV/BEAR system rejoins and goes to destination
  • Similar process for Logistics, Combat Rescue, and
    Special Ops
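As a hedged sketch of the automatic UAV selection step (the descriptor
fields and the shortest-ETA rule below are illustrative assumptions, not
the Nightingale II logic):

    #include <limits>
    #include <string>
    #include <vector>

    // Hypothetical UAV descriptor; fields are assumptions for illustration.
    struct Uav {
        std::string id;
        bool available;
        double distanceKm;   // distance to the pick-up point
        double speedKmh;     // cruise speed
    };

    // Pick the available UAV with the shortest estimated transit time.
    const Uav* chooseBestUav(const std::vector<Uav>& fleet)
    {
        const Uav* best = nullptr;
        double bestEta = std::numeric_limits<double>::max();
        for (const Uav& u : fleet) {
            if (!u.available || u.speedKmh <= 0.0) continue;
            const double eta = u.distanceKm / u.speedKmh;
            if (eta < bestEta) { bestEta = eta; best = &u; }
        }
        return best;  // nullptr if no suitable UAV is available
    }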
7
Today's Challenges
  • Brittle: easily fail with slight perturbations
    to systems and sub-systems
  • Dynamic environments and unanticipated
    information/data/knowledge are problematic
  • Robustness (always doing something sensible, even
    in degraded or failure modes) is difficult to
    guarantee
  • Difficult to maintain as systems are upgraded
  • Complicated to scale when additional information
    and constituent systems are required
  • Cannot provide clear, concise implementations
    among diverse sub-system sets

8
Control, Autonomy and Intelligence
  • The difference between robots and mechanisms:
    robots can adapt to changes in tasks, in the
    subjects of operation, or in operating environments
  • Unmanned and robotic systems should be able to
  • Interpret the directives that describe tasks
  • Understand the operating environment from data
    provided by perception sensors
  • Reason about their own state and the state of other
    robots/humans ("agents") present in the same
    dynamic environment
  • Perform motion planning and activity planning
    based on the task description, the environment,
    and their own and other agents' states
  • Control the execution of actions while
    allocating attention to task-related events
  • Anticipate the outcome of actions
  • When all of this is performed without human
    guidance, the unmanned or robotic system can be
    called Autonomous (a schematic control loop is
    sketched below)
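A schematic sense-plan-act loop corresponding to the capabilities above;
every type and function in this sketch is an assumption for illustration,
not the authors' implementation:

    #include <string>
    #include <vector>

    // Illustrative stubs standing in for real perception, planning, and
    // control components; all names here are assumptions.
    struct Percept {};                                     // raw sensor data
    struct WorldModel {};                                  // environment and agent states
    using Plan = std::vector<std::string>;                 // ordered activity steps

    Percept    readSensors()                                      { return {}; }
    WorldModel interpret(const Percept&, const WorldModel& w)     { return w; }
    Plan       makePlan(const WorldModel&, const std::string& d)  { return { d }; }
    bool       executeStep(const std::string&)                    { return true; }
    bool       outcomeAsExpected(const WorldModel&)               { return true; }

    // Interpret a directive, perceive, plan, execute, and check outcomes.
    void autonomyLoop(const std::string& directive)
    {
        WorldModel world;
        for (;;) {
            world = interpret(readSensors(), world);       // understand the environment
            const Plan plan = makePlan(world, directive);  // motion/activity planning
            for (const std::string& step : plan) {
                if (!executeStep(step)) break;             // control execution of actions
                world = interpret(readSensors(), world);
                if (!outcomeAsExpected(world)) break;      // check anticipated outcomes
            }
        }
    }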

9
General Capabilities Required
  • Creation of complex, realistic, and scalable
    networks of component inter-relationships
  • Distribution of autonomous controls and monitors
  • Implementation of complex webs of cause and
    effect
  • Dynamic alteration of the component execution
    structure
  • Adaptation and evolution of the system

10
Unmanned and Robotics System Interoperability
Example: UV SENTRY Operational Concept
(Diagram of the UV SENTRY operational concept; binary data-stream graphic
omitted.)
11
High Performance Interoperability Infrastructure: Next-Gen SOA
  • Extremely high performance
  • Ideal for situations requiring minimal latency
    and synchronization of activity/function
  • Automate development and deployment for
    operational environments
  • Allows components and systems to be brought
    online in days
  • Automate discovery and utilization
  • Quickly integrate systems of different types
  • Both legacy and new systems
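As a toy sketch of the "automate discovery and utilization" idea above (the
Component interface and Registry below are assumptions for illustration,
not this infrastructure's API):

    #include <functional>
    #include <map>
    #include <memory>
    #include <string>

    // Toy component registry illustrating automated discovery and use.
    struct Component {
        virtual ~Component() = default;
        virtual void start() = 0;
    };

    class Registry {
    public:
        using Factory = std::function<std::unique_ptr<Component>()>;

        // A component (legacy or new) advertises how to construct itself.
        void advertise(const std::string& name, Factory f) {
            factories_[name] = std::move(f);
        }

        // A consumer discovers and instantiates a component by name.
        std::unique_ptr<Component> discover(const std::string& name) const {
            auto it = factories_.find(name);
            if (it == factories_.end()) return nullptr;  // not advertised
            return it->second();
        }

    private:
        std::map<std::string, Factory> factories_;
    };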

12
Service Decomposition
(Layered architecture diagram, reconstructed as a list.)
  • CASE Tool Environment
  • User Defined IT System Interface / User Defined
    Hardware Interface
  • Web Services API (JNI, SOAP, OWL, etc.)
  • Composability Automation; Component Repository
  • Intelligent Application Services: Knowledge
    Representation, Integration, Meta-Data, Data
    Translation
  • System Execution Services: Distributed Object Mgmt,
    Std App Dev Interface, Synchronization Management,
    Event Management Services
  • Common Application Services: Security, State Saving,
    Core Programming, Compression, Encryption
  • Communication Services (Unicast, Multicast,
    Broadcast): Shared Memory, IP, JTRS, Reflective
    Memory, BLOS, Link-16, Others
13
Gun System Synchronization Example: Turret/Fire Control
  • Process firing commands (and queue them)
  • Slew
  • Elevate
  • Fire when slew and elevate are complete
14
Gun System Synchronization Example: Turret Fire Command

    void Turretfire()
        P_VAR
        P_BEGIN(3)
            // Wait until the turret movement is completed
            WAIT_FOR(1, slewComplete, -1)
            WAIT_FOR(2, elevateComplete, -1)
            WAIT_FOR(3, PermissionToFire, -1)
            // Fire the weapon; this would activate the real gun
            FireAt(PointingVector, TargetDesignator);
            RB_cout << "Flash, Boom, Bang, Echo" << endl;
            fireComplete = 1;
        P_END
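P_VAR, P_BEGIN, WAIT_FOR, RB_cout, and P_END appear to be macros from the
authors' interoperability runtime. As a hedged aid for readers without that
runtime (not the authors' implementation), the same wait-then-fire logic
can be approximated with standard C++11 threading primitives:

    #include <condition_variable>
    #include <iostream>
    #include <mutex>

    std::mutex m;
    std::condition_variable cv;
    bool slewComplete = false, elevateComplete = false, permissionToFire = false;
    bool fireComplete = false;

    void TurretFire()
    {
        std::unique_lock<std::mutex> lock(m);
        // Wait until turret movement is complete and firing is authorized.
        cv.wait(lock, [] {
            return slewComplete && elevateComplete && permissionToFire;
        });
        // FireAt(PointingVector, TargetDesignator) would be commanded here.
        std::cout << "Flash, Boom, Bang, Echo" << std::endl;
        fireComplete = true;
    }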
15
Autonomy Support
  • Represent Tasks/Mission Threads and Resources
    (Platforms, ISR, Munitions, Comms, etc.) using
    open standards
  • Semantic Web and other ISO draft/preliminary
    standards
  • Automatically generate resource selections to
    accomplish tasks/missions
  • Provide knowledge representation sharable by
    humans and machines
  • Conceptual Graphs (CGs)
  • CG tools and execution engine

16
Adaptive, Dynamic, Knowledge-based Execution and Control
Conceptual Graphs: A Computational Ontology Framework
(Diagram: a conceptual graph is built from concepts, relationships, and
actors. The example sentence "A cat sits on a mat", shown on the slide in
both English and French, is represented by the concepts CAT, SIT, and MAT
linked by the relations STAT and LOC.)
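A minimal sketch of how such a graph might be held in memory; the types
below are illustrative assumptions, not the CG tools or execution engine
mentioned in this deck:

    #include <string>
    #include <vector>

    // Minimal in-memory conceptual graph: concept nodes joined by relation
    // nodes. Illustrative only.
    struct Concept  { std::string type; };                   // e.g., "CAT", "SIT", "MAT"
    struct Relation { std::string type; int from; int to; }; // e.g., "STAT"; indices into concepts

    struct ConceptualGraph {
        std::vector<Concept>  concepts;
        std::vector<Relation> relations;
    };

    // "A cat sits on a mat": CAT -(STAT)- SIT -(LOC)- MAT
    ConceptualGraph catOnMat()
    {
        ConceptualGraph g;
        g.concepts  = { {"CAT"}, {"SIT"}, {"MAT"} };
        g.relations = { {"STAT", 0, 1},     // CAT -(STAT)- SIT
                        {"LOC",  1, 2} };   // SIT -(LOC)-  MAT
        return g;
    }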
17
(Diagram: conceptual graph for tasking a Global Hawk ISR asset. Concepts
such as Global Hawk, Sensor: MTI, AOI, Position: Global Hawk, Boundary,
List: Detection, List: Target, Accuracy: Target Coordinates, and
Coordinates: TO are linked by relations and actors such as Carries, Scan,
Generates, Attribute, Filters, Causes, Recipients, and Creates Report.)
18
Autonomy Levels
(Chart: autonomy level versus goal attainment, ordered from simple/low to
high.)
  • Perception-based, single-subsystem, static environment (simple, low)
  • Perception-based, multi-subsystem, static environment
  • Perception-based, single-subsystem, dynamic environment
  • Perception-based, multi-subsystem, dynamic environment (high)
19
SoS Automation Levels
(Chart: autonomy level, from low to high, at the system-of-systems, system,
and subsystem automation levels, plotted against mission integration across
mission subdomains and types, e.g., UGV mobility and RSTA, UAV scan and
react-to-threats, and UUV maritime reconnaissance with submarine track and
trail.)
20
System Scope, Mission Scope, and Mission Complexity
  • System of Systems: multiple-vehicle cooperation, high level of task
    knowledge, high uncertainty, large spatial and temporal scope
    (e.g., "Make sure that the carrier group is secure")
  • Single Vehicle: single vehicle with multiple subsystems, medium level
    of task knowledge, high uncertainty, medium spatial and temporal scope
    (e.g., "Patrol sector A16 and keep a lookout for intruders coming
    through the perimeter")
  • Subsystem: single subsystem, low level of task knowledge, medium-low
    uncertainty, medium-to-small spatial and temporal scope
    (e.g., "Go to waypoints (UTM1, UTM2, ..., UTMn, UTMa.scan_direction,
    UTMb.dwell)")
  • Actuator: single subsystem, no task knowledge, very low uncertainty,
    small spatial and temporal scope (e.g., remote control)
21
Experiments and Pilot Projects
  • UAV-Ground Robot Collaboration
  • UAV performs communications, ISR, and targeting
  • UGV utilizes UAV ISR data and autonomously
    navigates to engage the target with a laser-guided
    munition
  • UAV-USV Collaboration
  • UAV and USV exchange ISR information to rapidly
    provide high-accuracy, low-latency intercept and
    targeting information

22
Summary
  • Our technology is all about
  • Reducing robotics and unmanned system
    development costs
  • Improving the capability of robotic and unmanned
    systems
  • Bringing together both people and technology is
    our goal

23
Backup
24
How We Fit
  • Our software can be used to
  • Build standardized interfaces or support new
    interfaces and standards
  • Allow information to be exchanged easily and
    properly synchronized in time
  • Talking and timing are our forte
  • This will permit lower overall system development
    costs and improve the functional capabilities of
    the systems

25
What Can We Do?
  • Work with the other Protégé programs to help them
    build their technology so that it fits together
  • Demonstrate what a next generation system
    integrator will do
  • This implies the ability to actually do system
    of systems development and testing
  • This requires the ability to bring both groups of
    people together, in addition to technology

26
Issues in Autonomy
  • Examples of possible degrees of robot autonomy
    for a mobile robot
  • The robot could understand directives in the form
    of
  • Low-level program sequences (e.g., drive to x,y;
    move robot arm to x,y,z)
  • Natural language (e.g., "analyze any strange stone
    close by")
  • Understanding of the environment could be
  • Limited to determining the configuration of a
    standard environment (e.g., the positions of
    doors in a corridor)
  • Reconstruction of a 3D model of an outdoor
    environment and association of entities with it
    (e.g., boulder, tree, pond)
  • Reasoning on own/co-agent states could be
  • Simple tracking of own resources (e.g., level of
    energy in batteries)
  • Determining how tasks could be performed in
    cooperation with other agents

27
Other Autonomy Considerations
  • Planning could be
  • Geometrical/temporal planning of motion (e.g.,
    interpolating a trajectory)
  • Breaking a high-level task down into elementary
    actions, including resources and contingency
    actions (e.g., survey area; stop on a
    strange-looking stone; grasp the stone; deposit it
    in the analysis instrument; run the analysis
    procedure)
  • Control could be implemented as
  • Feedback execution of a command (e.g., a
    proportional-integral-derivative (PID) controller
    following a trajectory in space; a minimal sketch
    appears after this list)
  • A set of behaviors triggered by events (e.g., when
    bumping into an obstacle, backtrack)
  • The implementation of even the simplest autonomy
    requires a computer with suitable interfaces to
    the robot's sensors and actuators
  • Such a computer is called a Robot Controller
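As a hedged illustration of the feedback-control option above (a sketch
under assumed gains, limits, and interface, not anything taken from the
presentation), a PID update step might look like this:

    #include <algorithm>

    // Minimal PID controller sketch for feedback execution of a command
    // (e.g., following a trajectory). All values are illustrative.
    class PidController {
    public:
        PidController(double kp, double ki, double kd,
                      double outMin, double outMax)
            : kp_(kp), ki_(ki), kd_(kd), outMin_(outMin), outMax_(outMax) {}

        // setpoint: desired value, measured: sensor reading, dt: period (s)
        double update(double setpoint, double measured, double dt)
        {
            const double error = setpoint - measured;
            integral_ += error * dt;
            const double derivative = (error - prevError_) / dt;
            prevError_ = error;
            const double output = kp_ * error + ki_ * integral_ + kd_ * derivative;
            return std::clamp(output, outMin_, outMax_);  // respect actuator limits
        }

    private:
        double kp_, ki_, kd_;
        double outMin_, outMax_;
        double integral_ = 0.0;
        double prevError_ = 0.0;
    };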