Transcript and Presenter's Notes

Title: Computing in Complex Systems


1
Computing in Complex Systems
Research Alliance for Minorities Fall Workshop
ORNL Research Office Building
December 2, 2003
  • J. Barhen
  • Computing and Computational Sciences
    Directorate

2
Advanced Computing Activities at CESAR
In 1983 DOE established CESAR at ORNL. Its
purpose was to conduct fundamental theoretical,
experimental, and computational research in
intelligent systems. Over the past decade, the
Center has experienced tremendous growth. Today,
its primary activities are in support of DOD and
the Intelligence Community. Typical examples
include
  • missile defense: BMC3, war games, HALO-2 project, multi-sensor fusion
  • sensitivity and uncertainty analysis of large
    computational models
  • laser array synchronization (directed energy
    weapons)
  • complex systems: neural networks, global optimization, chaos
  • quantum optics applied to cryptography
  • mobile cooperating robots, multi-sensor and
    computer networks
  • nanoscale science (friction at the nanoscale,
    interferometric nanolithography)

Within the CCS Directorate, revolutionary computing technologies (optical, quantum, nanoscale, neuromorphic) are an essential focus of CESAR's research portfolio.
CESAR sponsors include MDA, DARPA, Army,
OSD/JTO, NRO, ONR, NASA, NSA, ARDA, DOE/SC, NSF,
DOE/FE, and private industry.
3
The Global Optimization Problem: Illustrative Example of Computing in Complex Systems
  • Nonlinear optimization problems arise in every field of scientific, technological, economic, or social interest. Typically:
  • The objective function (the function to be optimized) is multimodal, i.e., it possesses many local minima in the parameter region of interest
  • In most cases it is desired to find the local minimum at which the function takes its lowest value, i.e., the global minimum
  • The design of algorithms that can reach and distinguish between local and global minima is known as the global optimization problem.
  • Examples abound:
  • Computer science: design of VLSI circuits, load balancing, …
  • Biology: protein folding
  • Geophysics: determination of unknown geologic parameters from surface measurements
  • Physics: elasticity, hydrodynamics, …
  • Industrial technology: optimal control, design, production flow, …
  • Economics: transportation, cartels, …

4
Problem Formulation
  • Definitions
  • x is a vector of state variables or parameters
  • f is referred to as the objective function
  • Goal
  • Find the values f_G and x_G such that f_G = f(x_G) ≤ f(x) for all x in Ω (see the formulation below)
  • Ω is the domain of interest over which one seeks the global minimum. It is assumed to be compact and connected.
  • Without loss of generality, we will take Ω to be a hyper-parallelepiped (see below).
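The equation the slide refers to was shown as an image; a reconstruction consistent with the definitions above (the bound symbols a_i and b_i are mine, not the slide's) is, in LaTeX:

    f_G \;=\; f(x_G) \;=\; \min_{x \in \Omega} f(x), \qquad
    \Omega \;=\; \{\, x \in \mathbb{R}^{n} \;:\; a_i \le x_i \le b_i,\; i = 1,\dots,n \,\}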

5
Local vs Global Minima
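The figure for this slide is not reproduced in the transcript. As a minimal one-dimensional stand-in (my example, not the slide's figure), consider

    f(x) = (x^{2} - 1)^{2} + 0.3\,x

It has two stationary minima, near x ≈ +0.96 and x ≈ -1.03. Both have zero gradient, but only the one near -1.03 attains the lowest value (f ≈ -0.31), so it is the global minimum and the other is merely local; a gradient-based search started in the right-hand basin converges to the local minimum and never reaches the global one.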
6
Why is Global Optimization so Difficult?
Illustration of Practical Challenges
  • Complex Landscapes
  • we need to find the global minimum of functions of many variables
  • typical problem size is 10² – 10⁵ variables
  • Difficulty
  • the number of local minima grows exponentially with the number of variables
  • local and global minima have the same signature, namely zero gradient

Shubert function: this function arises in signal processing applications and is used as one of the SIAM benchmarks for global optimization. Even its two-dimensional instantiation exhibits a complex landscape (a minimal evaluation sketch follows).
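The slide itself does not give the formula. The sketch below evaluates what is commonly published as the two-dimensional Shubert benchmark; treat the exact definition and the quoted global-minimum value as assumptions carried over from the standard benchmark literature rather than from this presentation.

    import numpy as np

    def shubert_2d(x, y):
        """Standard 2-D Shubert benchmark:
        f(x, y) = [sum_{i=1..5} i cos((i+1)x + i)] * [sum_{i=1..5} i cos((i+1)y + i)].
        Highly multimodal; the commonly quoted global minimum on [-10, 10]^2
        is approximately -186.73.
        """
        i = np.arange(1, 6)
        return np.sum(i * np.cos((i + 1) * x + i)) * np.sum(i * np.cos((i + 1) * y + i))

    # Coarse grid scan just to expose the rugged landscape (not an optimizer).
    xs = np.linspace(-10.0, 10.0, 200)
    values = np.array([[shubert_2d(x, y) for x in xs] for y in xs])
    print("best value found on the grid:", values.min())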
7
Leading Edge Global Optimization Methods
  • The Center for Engineering Science Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) has been developing, demonstrating, and documenting in the open literature leading-edge global optimization (GO) algorithms.
  • What is the Approach?
  • three complementary methods address the GO challenge
  • they exploit different aspects of the problem but can be used in a synergistic fashion
  • What are the Options?
  • TRUST: the fastest published algorithm for searching complex landscapes via tunneling
  • NOGA: performs nonlinear optimization while incorporating uncertainties from the model and from external information (sensors, …)
  • EO: exploits information that is typically available to the user but never exploited by conventional optimization tools

Goal: Further develop, adapt, and demonstrate these methods on relevant DOE, DOD, and NASA applications where major impact is expected.
8
Leading Edge Global Optimization Methods: TRUST
  • What is TRUST?
  • a new, extremely powerful global optimization paradigm developed at CESAR / ORNL
  • How does it work? Three innovative concepts (an illustrative sketch follows below):
  • subenergy tunneling: a nonlinear transformation that creates a virtual landscape where all function values greater than the last found minimum are suppressed
  • non-Lipschitzian terminal repellers: enable escape from local minima by pushing the solution flow under the virtual landscape
  • stochastic Pijavskyi cones: eliminate unproductive regions by using information on the Lipschitz constant of the objective function acquired during the optimization process
  • iterative decomposition / recombination of large-scale problems
  • How does it perform?
  • unprecedented speed and accuracy: overall efficiency up to 3 orders of magnitude higher than the best publicly available competitors on the SIAM benchmarks
  • successfully tested on a large-scale seismic imaging problem
  • outstanding performance led to an article in Science (1997), an R&D 100 award (1998), and a patent in 2001.
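Referenced above as an illustrative sketch: the toy code below builds a one-dimensional "virtual landscape" from a log-sigmoid subenergy term plus a non-Lipschitzian repeller at the last found minimum, and shows plain gradient descent on it escaping a local basin. The functional forms, constants, and the helper name virtual_landscape are my illustrative assumptions chosen to mimic the behavior described on this slide; they are not the exact published or patented TRUST expressions.

    import numpy as np

    def virtual_landscape(f, x, x_star, rho=1.0, a=2.0):
        """Toy TRUST-style virtual objective E(x, x*) = E_sub + E_rep (illustrative only).

        E_sub: log-sigmoid of f(x) - f(x*); nearly flat wherever f(x) > f(x*),
               so regions above the last found minimum are suppressed.
        E_rep: non-Lipschitzian repeller (power 4/3) centred at x*, active only
               where f(x) >= f(x*), pushing the descent flow out of the basin.
        """
        df = f(x) - f(x_star)
        e_sub = -np.log(1.0 + np.exp(-(df + a)))
        e_rep = -rho * np.abs(x - x_star) ** (4.0 / 3.0) * (df >= 0.0)
        return e_sub + e_rep

    # Demo: escape the local basin of f(x) = (x^2 - 1)^2 + 0.3x (local minimum near
    # x = +0.96, global minimum near x = -1.03) with naive finite-difference descent on E.
    f = lambda x: (x**2 - 1.0) ** 2 + 0.3 * x
    x_star = 0.96                      # last found (local) minimum
    x = x_star - 1e-3                  # small perturbation toward the unexplored region
    h, lr = 1e-4, 0.01
    for _ in range(2000):
        g = (virtual_landscape(f, x + h, x_star) - virtual_landscape(f, x - h, x_star)) / (2 * h)
        x -= lr * g
    print("descent on E ends near x =", round(x, 3), "with f(x) =", round(f(x), 3))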

9
TRUST: Terminal Repeller Unconstrained Subenergy Tunneling
10
TRUST: Computational Approach
11
Uniqueness of TRUST
  • The virtual objective function E(x, x*) is a superposition of two contributing terms:
  • E_sub(x, x*): subenergy tunneling
  • E_rep(x, x*): repelling from the latest found local minimum x*
  • Its effect is to transform the current local minimum of f(x) into a global maximum, while preserving any lower-lying local minima (see the reconstruction below)

Key Advantage of TRUST
  • Gradient descent applied to f(x) and initialized within the basin of attraction of the current local minimum x* cannot escape from that basin.
  • Gradient descent applied to E(x, x*) and initialized at the same point always escapes it.
  • TRUST has a global descent property.
[Figure: behavior of the E_rep and E_sub components of the virtual objective]
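Written out compactly (a hedged reconstruction; the slide's own equations were images and are not reproduced in this transcript), the superposition reads, in LaTeX:

    E(x, x^{*}) \;=\; E_{\mathrm{sub}}(x, x^{*}) \;+\; E_{\mathrm{rep}}(x, x^{*})

with E_sub a monotone transformation of f(x) - f(x*) that is essentially flat wherever f(x) > f(x*), and E_rep a non-Lipschitzian repeller centred at x* that vanishes wherever f(x) < f(x*). Hence x* becomes a global maximum of E while every lower-lying minimum of f is preserved, which is the global descent property stated above.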
12
Leading Edge Global Optimization Methods
  • Comparison of TRUST performance to leading publicly available competitors on the SIAM benchmarks
  • the data correspond to the number of function evaluations needed to reach the global minimum
  • the symbol ? indicates that no solution was found for the method under consideration
  • benchmark functions: BR (Branin), CA (camelback), GP (Goldstein-Price), RA (Rastrigin), SH (Shubert), H3 (Hartman)
  • methods: SDE (stochastic differential equations), GA/SA (genetic algorithms and simulated annealing), IA (interval arithmetic), Levy TUN (conventional Levy tunneling), Tabu (Tabu search)

13
Leading Edge Global Optimization Methods
  • NOGA
  • The explicit incorporation of uncertainties into
    the optimization process is essential for the
    design of robust mission architectures and
    systems
  • NOGA: method for Nonlinear Optimization and Generalized Adjustments
  • explicitly computes the uncertainties in model-predicted results in terms of uncertainties in intrinsic model parameters and inputs
  • determines best estimates of model parameters and reduces uncertainties by consistently incorporating external information
  • the NOGA methodology is based on the concepts and tools of sensitivity and uncertainty analysis. It performs a nonlinear optimization of a constrained Lagrange function that uses the inverse of a generalized total covariance matrix as its natural metric (a sketch follows after this list)
  • EO
  • EO: Ensemble Optimization
  • builds on a systematic study of the role that additional information may play in significantly reducing the complexity of the GOP
  • while in most practical problems additional information is readily available, either at no cost at all or at rather low cost, present optimization algorithms cannot take advantage of it to increase the efficiency of the search
  • to overcome this shortcoming, we have developed EO, a radically new class of optimization algorithms that can readily fold in additional information and, as a result, dramatically increase their efficiency
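As referenced in the NOGA bullet above, a hedged sketch of this kind of formulation (the symbols are illustrative; the exact published form is given in the references): collect the model parameters α and responses r into z = (α, r) with prior best estimate z⁰ and generalized covariance matrix C, and adjust z by minimizing the inverse-covariance quadratic form subject to the model constraint, in LaTeX:

    \min_{z}\; Q(z) \;=\; \bigl(z - z^{0}\bigr)^{\mathsf T}\, C^{-1}\, \bigl(z - z^{0}\bigr)
    \quad \text{subject to} \quad r = R(\alpha),

handled through the constrained Lagrange function L(z, \lambda) = Q(z) + 2\,\lambda^{\mathsf T}\bigl(r - R(\alpha)\bigr); linearizing R about the prior gives the best-estimate parameters and the reduced (posterior) covariance.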

14
Leading Edge Global Optimization Methods: Selected References
  • TRUST
  • Barhen, J., V. Protopopescu, and D. Reister, "TRUST: A Deterministic Algorithm for Global Optimization," Science, 276, 1094-1097 (1997).
  • Reister, D., E. Oblow, J. Barhen, and J. DuBose, "Global Optimization to Maximize Stack Energy," Geophysics, 66(1), 320-326 (2001).
  • NOGA
  • Barhen, J. and D. Reister, "Uncertainty Analysis Based on Sensitivities Generated Using Automated Differentiation," Lecture Notes in Computer Science, 2668, 70-77, Springer (2003).
  • Barhen, J., V. Protopopescu, and D. Reister, "Consistent Uncertainty Reduction in Modeling Nonlinear Systems," SIAM Journal on Scientific Computing (in press, 2003).
  • EO
  • Protopopescu, V. and J. Barhen, "Solving a Class of Continuous Global Optimization Problems Using Quantum Algorithms," Physics Letters A, 296, 9-14 (2002).
  • Protopopescu, V., C. d'Helon, and J. Barhen, "Constant-time Solution to the Global Optimization Problem Using Brüschweiler's Ensemble Search Algorithm," J. Phys. A, 36(24), L399-L407 (2003).

15
Frontiers in Computing
  • Three decades ago, fast computational units were only present in vector supercomputers.
  • Twenty years ago, the first message-passing machines (Ncube, Intel) were introduced.
  • Today, the availability of fast, low-cost chips has revolutionized the way calculations are performed in various fields, from personal workstations to tera-scale machines.
  • An innovative approach to high-performance, massively parallel computing remains a key factor for progress in science and national defense applications.
  • In contrast to conventional approaches, one must develop computational paradigms that exploit, from the outset, (1) the concept of massive parallelism and (2) the physics of the implementation device.
  • Ten to twenty years from now, asynchronous, optical, nanoelectronic, biologically inspired, and quantum technologies have the potential to further revolutionize computational science and engineering by
  • offering unprecedented computational power for a
    wide class of demanding applications
  • enabling the implementation of novel paradigms