Transcript and Presenter's Notes

Title: Learning Chaotic Dynamics from Time Series Data


1
Learning Chaotic Dynamics from Time Series Data
  • A Recurrent Support Vector Machine Approach
  • Vinay Varadan

2
Primary Motivation
  • Understand the biological cell as a complex
    dynamical system
  • Recent developments allow in vivo measurement of
    post-translational protein modifications along
    with gene expression levels
  • These measurements are still very expensive,
    forcing relatively sparse sampling of the modified
    protein concentrations in time
  • We invariably measure only a small number of the
    system's variables, in most cases just one or two
  • Goal: develop modeling techniques that learn the
    underlying dynamics from short time series without
    knowing the exact structure of the nonlinear
    differential equation
  • Even in the absence of noise, trajectory learning
    is still a difficult problem

3
Problem Statement
  • Given the time series of one variable in a
    multidimensional nonlinear differential equation
    (NDE)
  • Learn the number of dimensions, viz. the number
    of interacting variables in the underlying NDE
  • Given a few samples, be able to generate all
    future samples exactly matching the trajectory of
    the variable
  • Do this for all possible NDEs, including systems
    at the edge of chaos as well as chaotic systems
  • This project concentrates on chaotic systems,
    since non-chaotic systems of the same
    dimensionality are easier to learn

4
Previous Attempts At Chaotic Time Series Prediction
  • Takens' delay embedding theorem (1981) shows that
    the geometry of the state space can be
    reconstructed using just delayed samples of a
    single observable
  • Thus, for the time series measurement y(t),
  • y(t) = f(y(t-1), y(t-2), ..., y(t-m))
  • Nonlinear functions with universal approximation
    capability have been employed for f, such as RBF
    networks, polynomials, rational functions, and
    local methods
  • One-step predictors - these methods learn to
    predict one time step ahead when given past
    samples of the observable
  • Not good enough: they are not trained to follow
    trajectories of the dynamical system, and thus do
    not learn the geometry of the state space well
  • We need to learn Recurrent models
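
To make the contrast concrete, here is a minimal sketch (not from the slides) that builds a Takens delay embedding of a chaotic series, fits a simple one-step RBF predictor by least squares, and compares one-step prediction against free-running (iterated) prediction. The logistic map, the RBF fit, the embedding order m = 3, and all parameter values are illustrative assumptions.

    import numpy as np

    # Illustration only: a chaotic series from the logistic map stands in
    # for the single observed variable y(t).
    def logistic_series(n, x0=0.2, r=4.0):
        y = np.empty(n)
        y[0] = x0
        for t in range(1, n):
            y[t] = r * y[t - 1] * (1.0 - y[t - 1])
        return y

    def delay_embed(y, m):
        # Rows are [y(t-1), ..., y(t-m)]; targets are y(t).
        X = np.column_stack([y[m - k - 1:len(y) - k - 1] for k in range(m)])
        return X, y[m:]

    def rbf_design(X, centers, width):
        # Gaussian RBF design matrix, one column per center.
        return np.exp(-((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) / width ** 2)

    m, width = 3, 0.5
    y = logistic_series(600)
    X, targets = delay_embed(y[:500], m)
    centers = X[::20]                      # crude center selection for the sketch
    Phi = rbf_design(X, centers, width)
    w = np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(len(centers)), Phi.T @ targets)

    # One-step prediction: true past samples are fed in at every step.
    Xtest, ytest = delay_embed(y[500 - m:], m)
    one_step = rbf_design(Xtest, centers, width) @ w

    # Iterated (free-running) prediction: the model's own outputs are fed
    # back, which is what a recurrent model must get right to follow the
    # trajectory of the system.
    hist = list(y[500 - m:500])
    iterated = []
    for _ in range(len(ytest)):
        x = np.array(hist[-1:-m - 1:-1])[None, :]   # [y(t-1), ..., y(t-m)]
        yhat = (rbf_design(x, centers, width) @ w)[0]
        iterated.append(yhat)
        hist.append(yhat)

    print("one-step RMSE:", np.sqrt(np.mean((one_step - ytest) ** 2)))
    print("iterated RMSE:", np.sqrt(np.mean((np.array(iterated) - ytest) ** 2)))

On a chaotic series the iterated error typically grows far faster than the one-step error; that gap is what the recurrent formulation on the following slides is meant to close.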

5
Recurrent Models - SVM
  • Consider learning models of the form sketched below
  • In order to estimate the function f, we use
    Recurrent Least Squares Support Vector Machines
  • We can rewrite the model in terms of the given
    data and the error variables (second equation in
    the sketch below)
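
A minimal sketch of such a model, following the standard recurrent least squares SVM formulation of Suykens and Vandewalle; the feature map \varphi, weight vector w, bias b, embedding order m, and error variables e(t) are assumed notation rather than taken from the slides:

    \[
    \hat{y}(t) = w^{T}\,\varphi\big(\hat{y}(t-1), \ldots, \hat{y}(t-m)\big) + b
    \]

With e(t) = y(t) - \hat{y}(t), the model rewritten in terms of the given data and the error variables reads

    \[
    y(t) - e(t) = w^{T}\,\varphi\big(y(t-1) - e(t-1), \ldots, y(t-m) - e(t-m)\big) + b .
    \]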

6
Recurrent Training using SVM
  • The training of the network is formulated as a
    constrained optimization problem (sketched below)
  • The final term of the equation to be minimized
    refers to the Least Squares formulation
  • We can now define the Lagrangian and derive the
    optimality conditions appropriately
  • Further, we can eliminate the explicit calculation
    of w and use just the kernel formulation
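
A sketch of the corresponding training problem, again under the standard recurrent LS-SVM assumptions; the regularization constant \gamma and the training length N are assumed notation:

    \[
    \min_{w,\,b,\,e} \;\; \frac{1}{2}\, w^{T} w \;+\; \frac{\gamma}{2} \sum_{t=m+1}^{N} e(t)^{2}
    \]
    subject to
    \[
    y(t) - e(t) = w^{T}\,\varphi\big(y(t-1) - e(t-1), \ldots, y(t-m) - e(t-m)\big) + b,
    \qquad t = m+1, \ldots, N .
    \]

The second term is the least-squares part referred to above; forming the Lagrangian of this constrained problem gives the optimality conditions, and inner products of \varphi can then be replaced by a kernel K.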

7
Recurrent Training using SVM
  • The resulting recurrent simulation model is given
    in kernel form (sketched below)
  • For the Recurrent SVM case, the parameter
    estimation problem becomes nonconvex
  • We thus have to use sequential quadratic
    programming
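
A sketch of what the kernel-form simulation model could look like under the same assumptions; the support values \alpha_i, kernel K, and training delay vectors x_i are assumed notation:

    \[
    \hat{y}(t) = \sum_{i} \alpha_{i}\, K\big(\hat{x}(t),\, x_{i}\big) + b,
    \qquad \hat{x}(t) = \big(\hat{y}(t-1), \ldots, \hat{y}(t-m)\big) .
    \]

Because \hat{x}(t) contains the model's own past outputs, the training constraints are no longer linear in the unknowns; this is the source of the nonconvexity that the sequential quadratic programming step addresses.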

8
Recurrent Model Performance
  • Performance of different prediction algorithms on
    a chaotic Predator-Prey model
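
The specific predator-prey benchmark is not described here, so as a stand-in the sketch below integrates the Hastings-Powell three-species food-chain model, a well-known chaotic predator-prey system, and records sparse samples of a single observable in the spirit of the problem statement; the model choice, parameter values, initial condition, and RK4 step size are all assumptions.

    import numpy as np

    # Hastings-Powell food chain: prey x, predator y, top predator z.
    # The parameter set below is a commonly used chaotic regime; it is an
    # assumption, not taken from the slides.
    a1, b1, a2, b2, d1, d2 = 5.0, 3.0, 0.1, 2.0, 0.4, 0.01

    def food_chain(s):
        x, y, z = s
        dx = x * (1.0 - x) - a1 * x * y / (1.0 + b1 * x)
        dy = a1 * x * y / (1.0 + b1 * x) - a2 * y * z / (1.0 + b2 * y) - d1 * y
        dz = a2 * y * z / (1.0 + b2 * y) - d2 * z
        return np.array([dx, dy, dz])

    def rk4_step(s, h):
        # One classical fourth-order Runge-Kutta step.
        k1 = food_chain(s)
        k2 = food_chain(s + 0.5 * h * k1)
        k3 = food_chain(s + 0.5 * h * k2)
        k4 = food_chain(s + h * k3)
        return s + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Integrate, discard a transient, and keep sparse samples of one
    # variable only, mirroring the single-observable setting.
    h, n_steps, sample_every = 0.01, 200_000, 500
    s = np.array([0.8, 0.2, 8.0])
    series = []
    for i in range(n_steps):
        s = rk4_step(s, h)
        if i > 50_000 and i % sample_every == 0:
            series.append(s[0])          # observe the prey variable only
    series = np.array(series)
    print(len(series), "samples of the single observable")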

9
Conclusion and Pending Work
  • Recurrent SVM models capture the underlying
    dynamics much better than the other models
  • In the past, we have developed an Improved Least
    Squares (ILS) formulation for use in modeling
    chaotic systems
  • Need to explore how that can be integrated with
    SVMs