Title: Critical Interpretations of the Surrogate Results
If the discriminating statistic for the set of surrogates differs significantly from that of the original data, the chosen null hypothesis is rejected. This is often regarded as proof of dynamical nonlinearity.
However, other alternatives could also lead to a rejection:
- Chance: the rejection could be a result of chance, since a 100% confidence level can never be obtained.
- Non-dynamic nonlinearity: the measurement function can be (i) non-invertible, (ii) time-varying, (iii) non-monotonic, (iv) a process with memory, etc.
- Nonstationarity: the data can be strongly non-stationary, whereas the surrogates, by construction, are always stationary.
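As a concrete illustration, a common way to realize the linear-stochastic null hypothesis is the phase-randomized (FT) surrogate: keep the amplitude spectrum of the data and randomize the Fourier phases. A minimal sketch (the function name and seed are illustrative, not from the slides):

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """FT surrogate: keep the amplitude spectrum, randomize the phases.

    This preserves the linear (autocorrelation) structure of x while
    destroying any nonlinear structure, i.e. it realizes the null
    hypothesis of a stationary linear stochastic process.
    """
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0              # keep the DC (mean) component real
    if n % 2 == 0:
        phases[-1] = 0.0         # the Nyquist bin must also stay real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n)

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0.0, 20.0 * np.pi, 1024)) + 0.1 * rng.standard_normal(1024)
s = phase_randomized_surrogate(x, rng)
```

Because only the phases are touched, the surrogate has exactly the same power spectrum (and hence the same autocorrelation) as the data.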
Recurrence Plot
Eckmann et al. (1987) Europhys. Lett.
A visualization technique to detect recurrences or correlations in the data. In practice, one chooses r_i such that the ball of radius r_i centered at x_i in R^d contains a reasonable number of other points x_j of the trajectory. One places a dot at each point (i, j) for which x_j lies in the ball of radius r_i centered at x_i. The resulting plot is called a recurrence plot (RP).
The recurrence plot is usually symmetric because the distance measure is symmetric, but the symmetry is not exact because in general r_i ≠ r_j. The RP exhibits characteristic large-scale and small-scale patterns.
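A minimal sketch of a recurrence plot for a scalar series (using a single fixed radius rather than the per-point r_i of the original construction, which makes the matrix exactly symmetric):

```python
import numpy as np

def recurrence_plot(x, radius):
    """Boolean recurrence matrix R[i, j] = (|x_i - x_j| <= radius).

    The points here are scalars; for a trajectory embedded in R^d,
    replace the absolute difference with a vector norm.
    """
    dist = np.abs(x[:, None] - x[None, :])
    return dist <= radius

x = np.sin(np.linspace(0.0, 6.0 * np.pi, 200))   # a periodic signal
R = recurrence_plot(x, 0.05)
```

For a periodic signal such as this one, the dots organize into the diagonal line structures characteristic of deterministic recurrence.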
Fractals, Scaling and Long-Range Correlated Processes
Standard time-domain models such as AR, MA and ARMA can approximate the standard statistical properties (like the autocorrelation) quite well. Now consider a time series whose correlation decreases rapidly at short time lags but slowly at long time lags. ARMA-based models have difficulty approximating such a correlation profile; they tend to underestimate R(t) at large lag t.

A process that shows significant correlation at long time scales is called a long-range correlated process.
Now, the power spectrum is the Fourier transform of the autocorrelation function. Thus, a good indicator of long-term correlation is the presence of broad-band, low-frequency power in the spectrum. More precisely, if the low-frequency spectrum grows in amplitude as the length of the data window increases, then the autocorrelation function decays very slowly towards zero relative to the window length.
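This spectral signature can be checked numerically: fitting a line to the log-log periodogram estimates the exponent α of a 1/f^α spectrum. A sketch with illustrative names (an uncorrelated series gives α ≈ 0, while its running sum, a Brownian path, gives α ≈ 2):

```python
import numpy as np

def spectral_slope(x):
    """Slope of log power vs. log frequency in the periodogram.

    For a 1/f^alpha process this estimates -alpha.
    """
    spectrum = np.fft.rfft(x - x.mean())
    power = np.abs(spectrum[1:]) ** 2          # drop the DC bin
    freq = np.fft.rfftfreq(len(x))[1:]
    return np.polyfit(np.log(freq), np.log(power), 1)[0]

rng = np.random.default_rng(1)
white = rng.standard_normal(8192)   # uncorrelated: flat spectrum, alpha ~ 0
walk = np.cumsum(white)             # Brownian path: alpha ~ 2
```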
HRV Time Series
Observations:
- Low-frequency power increases as the data length increases.
- This indicates a slowly decaying autocorrelation function.
- The spectrum is well approximated by a straight line over a wide range (roughly -1.2 < log(ω) < 4).
- Such a spectrum is called a 1/f^α spectrum because of the reciprocal power-law dependence of power on frequency.
Such a time series is often called fractal: it is self-similar, looking similar at many levels of magnification.
Synthetic Fractal Signal

[Figure: power spectra for data lengths of 5000, 2048, 512 and 256 points]
Observations:
- Low-frequency power increases with increasing data length.
- The overall power spectrum is approximately linear (especially for longer data lengths).
Geometrical Scaling and Self-Similarity

[Figure: graph of a logarithmic spiral]

Consider r(θ) = a·e^(qθ). This generates a logarithmic spiral for q < 0. Rotate the graph by φ: r₁(θ) = a·e^(q(θ+φ)) = e^(qφ)·r(θ). Rotation is equivalent to multiplication! The graphs of r(θ) and r₁(θ) are geometrically similar.
An object is geometrically similar to another object if one object can be derived by applying linear scalings, rotations, or translations to the other object.

Similarity transform → a transform which produces geometrical similarity: linear scalings, rotations, translations.
Example

- Consider a point (x, y) in a 2-D object.
- Apply a similarity transform to this object: x' = s(x·cos θ − y·sin θ) + dx, y' = s(x·sin θ + y·cos θ) + dy, where s is the scaling factor, θ is the angle of rotation, and dx and dy are translations in the x and y directions.
- Let s = 0.5, θ = π/6, dx = 1, dy = 0.
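A similarity transform with these parameter values (s = 0.5, θ = π/6, dx = 1, dy = 0) can be applied directly to a set of points; the triangle's vertices below are illustrative:

```python
import numpy as np

def similarity_transform(points, s, theta, dx, dy):
    """Scale by s, rotate by theta, then translate by (dx, dy)."""
    c, si = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -si], [si, c]])
    return s * points @ rot.T + np.array([dx, dy])

tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # illustrative triangle
out = similarity_transform(tri, 0.5, np.pi / 6.0, 1.0, 0.0)
```

Because rotation and translation preserve distances, every pairwise distance in the output is exactly s = 0.5 times the original: the transformed object is geometrically similar to the original.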
If one represents the original triangle as an object P₀, then the object after the first iteration is P₁ = T[P₀], where T represents the equations of transformation. After every step, the resultant object is geometrically similar to P₀; since s < 1, the asymptotic object shrinks towards a single point.

Now, consider a situation where the resulting object is the union of the objects created by applying several similarity transforms to P₀.

[Figure: initial object P₀]
Let T[·] be described by three different similarity transformations, with the output object being the union of the three transformed objects. Let the three similarity transformations be T₁[·], T₂[·] and T₃[·].

[Figure: P₀, P₁, P₂, ..., Pₙ converging to the Sierpinski gasket]
Geometrical Self-Similarity

An object is geometrically self-similar if one can recreate the object through the union of similarity transformations applied to the object. For any given object P, if there exists a set of similarity transformations T₁[·], T₂[·], ..., Tₙ[·] such that

P = T₁[P] ∪ T₂[P] ∪ ... ∪ Tₙ[P],

then P is geometrically self-similar. Additionally, if one defines an operator H[·] = T₁[·] ∪ T₂[·] ∪ ... ∪ Tₙ[·], then H[P] = P and H is called an invariant operator. Example: if P is the Sierpinski gasket, then H may be defined as the union of the three similarity transforms.
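The invariant set of the three Sierpinski maps can be sampled with the "chaos game": repeatedly apply a randomly chosen transform T_i(p) = (p + v_i)/2, which halves the distance to vertex v_i. A sketch (vertex coordinates and seed are illustrative, not from the slides):

```python
import numpy as np

# Vertices of the bounding triangle; each Sierpinski map T_i halves the
# distance to one vertex: T_i(p) = (p + v_i) / 2.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3.0) / 2.0]])

rng = np.random.default_rng(2)
p = np.array([0.25, 0.25])          # any starting point inside the triangle
points = np.empty((5000, 2))
for k in range(5000):
    v = vertices[rng.integers(3)]   # pick one of the three maps at random
    p = (p + v) / 2.0
    points[k] = p
```

Plotting `points` reveals the gasket: every iterate stays inside the bounding triangle, and the cloud of points converges onto the invariant set H[P] = P.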
Cantor Set

Iteration procedure:
1. Draw a line segment over the chosen interval [0, 1].
2. Remove the middle third of the line segment but retain the other two segments.
3. Remove the middle third of each of the remaining line segments.
4. Repeat step 3 ad infinitum.

The final object will be a set of points on the original line segment.
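The iteration can be carried out exactly with rational arithmetic; after k steps there are 2^k intervals, each of length (1/3)^k, with total remaining length (2/3)^k. A sketch with illustrative names:

```python
from fractions import Fraction

def cantor_step(intervals):
    """Remove the open middle third of each closed interval."""
    out = []
    for a, b in intervals:
        third = (b - a) / 3
        out.append((a, a + third))      # keep the left third
        out.append((b - third, b))      # keep the right third
    return out

intervals = [(Fraction(0), Fraction(1))]
for _ in range(4):                      # k = 4 iteration steps
    intervals = cantor_step(intervals)
```

The total length (2/3)^k tends to zero, which is why the limiting object is a set of points rather than a collection of segments.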
If the minimum resolvable interval as set by the measurement process is s and the measured length is l, then l ∝ (1/s)^d. For the Cantor set d = −0.369 (the measured length shrinks towards zero as the resolution is refined), while for the Sierpinski gasket d = 0.585 (the measured length grows without bound).
Similarity Dimension

It is closely related to measuring the size of the object at various precisions. It expresses the relationship between the number of geometrically similar pieces contained in an object and the measurement precision, as the minimum resolvable interval approaches zero.

At the k-th iteration step, the total measure of the object (length, area, etc.) = (number of similar pieces N) × (resolution s)^D, where D (or D_s) is the similarity dimension of the object.

Example (Cantor set): the measurement length at each iteration is s = (1/3)^k and the number of similar pieces is N = 2^k. Thus, the similarity dimension is D_s = ln N / ln(1/s) = ln 2 / ln 3 ≈ 0.631.
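The formula D_s = ln N / ln(1/s) reproduces the dimensions quoted for both examples (a plain arithmetic check; function name is illustrative):

```python
import math

def similarity_dimension(n_pieces, scale):
    """D_s = ln N / ln(1/s) for N self-similar pieces at linear scale s."""
    return math.log(n_pieces) / math.log(1.0 / scale)

d_cantor = similarity_dimension(2, 1.0 / 3.0)   # two copies at scale 1/3
d_gasket = similarity_dimension(3, 1.0 / 2.0)   # three copies at scale 1/2
```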
Thus, the Cantor set has a dimension greater than that of a point but less than that of a line. D_s for the Sierpinski gasket is 1.585 → greater than that of a line but less than that of a plane.

Self-similar objects having a non-integer similarity dimension are called fractals.

Remarks:
- Strictly speaking, for fractals, the Hausdorff dimension > the topological dimension.
- Any object is practically considered fractal if any measure of dimension is non-integer (remember that the various definitions of dimension may not converge).
- In practice, we only look for self-similarity extending over a finite number of scales.
Temporal Self-Similarity

If a graph can be regarded as a geometrical object, one can in principle apply the concept of self-similarity to the graph of a time function. But there is a problem! The abscissa and ordinate of the graph of a time function have different units, so rotations are not allowed! One could still consider a type of similarity transformation in which the two axes are scaled by different factors. Objects which satisfy the conditions for self-similarity but with different scalings in different directions are called self-affine.
Statistical Similarity for Time Functions

A random process X(t) is statistically self-similar if the processes X(t) and b^(−a)·X(bt) are statistically indistinguishable, where a is a constant and b is an arbitrary scaling factor. This definition implies

Prob[X(t) ≤ x] = Prob[b^(−a)·X(bt) ≤ x].

If X(t) is zero-mean, E[(b^(−a)·X(bt))²] = E[X²(t)]. Then b^(−2a)·E[X²(bt)] = E[X²(t)], and so Var[X(bt)] = b^(2a)·Var[X(t)].
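For Brownian motion a = 1/2, so quadrupling the time scale (b = 4) should quadruple the variance, Var[X(4t)] = 4·Var[X(t)]. A Monte-Carlo sketch over many independent unit-increment random walks (all sizes and the seed are illustrative):

```python
import numpy as np

# Var[X(bt)] = b^(2a) * Var[X(t)]; for a random walk a = 1/2, so the
# variance at time 4t should be 4 times the variance at time t.
rng = np.random.default_rng(3)
walks = np.cumsum(rng.standard_normal((20000, 400)), axis=1)
var_t = walks[:, 99].var()      # ensemble variance after 100 steps (time t)
var_4t = walks[:, 399].var()    # ensemble variance after 400 steps (time 4t)
```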
Brownian Motion

The commonest example of a one-dimensional self-similar process. Here, a particle is subjected to frequent, microscopic forces that cause microscopic changes in position along a line. General assumption: the increments of displacement follow a Gaussian distribution with zero mean, and they are statistically independent.

Now, we cannot observe every microscopic displacement. Let B(t) be a measurement of the position of the particle at the smallest possible observation interval t; it is then the result of a summation of displacements over this time interval. Let the change in position over one time interval be ΔB(t).

Now, a summation of zero-mean Gaussian variables yields another Gaussian random variable having zero mean and a variance that is the sum of the individual variances.
Assume the increments to have the same density function and the same variance. Thus, the variance of the summed variables at the end of the interval t is proportional to t: Var[ΔB(t)] = 2Dt, where D is a constant termed the average diffusion coefficient.

Now, ΔB(t) is Gaussian, so its pdf is p(b) = (1/√(4πDt))·exp(−b²/(4Dt)), where b is any particular value of ΔB(t).

[Figure: sample function of discrete increments of Brownian motion, ΔB(i·t), with zero mean and unit variance, and the corresponding sample function obtained by summing the increments]
What happens when the measurement interval is doubled? Suppose the smallest observation interval is now 2t. Then the pdf becomes p(b) = (1/√(8πDt))·exp(−b²/(8Dt)): the variance is doubled. Thus, the variance of the increments of Brownian motion depends on the frequency at which the increments are measured, increasing with the measurement interval.
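This doubling can be checked numerically: increments sampled at interval 2t are sums of two consecutive t-increments, so their variance is twice as large. A sketch on a simulated walk with unit-variance steps (sizes and seed illustrative):

```python
import numpy as np

# Increments observed at interval 2t are sums of two consecutive
# t-increments, so their variance doubles.
rng = np.random.default_rng(4)
b = np.cumsum(rng.standard_normal(2_000_000))  # walk, unit-variance steps
inc_t = np.diff(b)                             # increments at interval t
inc_2t = b[2::2] - b[:-2:2]                    # increments at interval 2t
```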
Brownian motion is thus scale-invariant. Let t₁ = b·t and rescale the amplitude by b^(0.5); then the new pdf has the same form as the original: this is the self-affinity of Brownian motion. Thus, any transformation that changes the time scale by b and the length scale by b^(0.5) produces a process that is equal in distribution to the original process.

Remarks: Brownian motion is not stationary, since its variance is a function of time. The variance of the increments is constant, so the process of Brownian increments is stationary.
Fractional Brownian Motion

It was introduced by Mandelbrot & Van Ness (1968). FBM, B_H(t), is derived from a process of weighting past values of white noise by (t−s)^(H−1/2) followed by integration; usually 0 < H < 1. The increments of FBM are stationary and self-similar with parameter H; thus, FBM has no intrinsic time scale. The variance of FBM grows with time as Var[B_H(t)] ∝ t^(2H).
The character of FBM depends on the value of H. For 0 < H < 0.5, FBM is irregular; for H = 0.5, FBM is ordinary BM; for H > 0.5, FBM becomes smoother.

The correlation between past and future increments of FBM is C(H) = 2^(2H−1) − 1. For H = 0.5 the correlation is zero. For H < 0.5 the correlation is negative → an increasing trend is likely to be followed by a decreasing trend (antipersistence). For H > 0.5 the correlation is positive → an increasing trend is likely to be followed by an increasing trend (persistence). It can be proved that the fractal dimension of the graph of FBM is D = 2 − H.
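The increment-correlation formula C(H) = 2^(2H−1) − 1 and the dimension relation D = 2 − H can be checked with plain arithmetic (function names are illustrative):

```python
def increment_correlation(h):
    """Correlation between past and future FBM increments: 2^(2H-1) - 1."""
    return 2.0 ** (2.0 * h - 1.0) - 1.0

def graph_dimension(h):
    """Fractal dimension of the FBM graph: D = 2 - H."""
    return 2.0 - h

# H = 0.5: ordinary BM, increments uncorrelated, D = 1.5
# H < 0.5: negative correlation (antipersistence), rougher graph (D > 1.5)
# H > 0.5: positive correlation (persistence), smoother graph (D < 1.5)
```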
Fractal Time Series: an Intuitive View

Any normal time series has a typical scale → the mean value of the measurement. As we collect more data, the sample mean converges to the population mean. For a fractal time series, there does not exist any single average value that can characterize the process. A fractal object or time series has a distribution of values consisting of a few large values, many medium values and a huge number of small values. Thus, the data have a power-law distribution (i.e. the log-log plot of relative frequency vs. value is a straight line).

What does this practically indicate?
It indicates that, as we collect more data, the sample mean continues to increase or decrease: the sample mean does not converge to the population mean! This holds both for a fractal process and for a process with long-range correlation (where the probability density for a value at a given time depends on values of the long past).
A meaningful way to characterize the data is to determine how the sample means depend on the time resolution of the measurement: plot log(average fluctuation of the sample mean) vs. log(time resolution).

If the plot is linear, the data are monofractal; if the plot is nonlinear, the data are multifractal. The fractal dimension is related to the slope of the line, and the relationship depends on (a) the type of measurement and (b) the definition of the fluctuations of the sample means. If the values are power-law distributed, the fractal dimension characterizes the relative frequency of the small values compared to the large values. If the data are long-range correlated, the fractal dimension characterizes the scaling behavior of the mean.
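For uncorrelated data this plot is a straight line of slope −1/2, because the fluctuation of the sample mean shrinks as m^(−1/2) with window length m. A sketch of the procedure with illustrative names (one definition of "fluctuation": the standard deviation of the block means):

```python
import numpy as np

def mean_fluctuation(x, m):
    """Std of the sample means over non-overlapping windows of length m."""
    n = len(x) // m
    return x[:n * m].reshape(n, m).mean(axis=1).std()

rng = np.random.default_rng(5)
x = rng.standard_normal(2 ** 16)            # uncorrelated reference data
sizes = np.array([16, 64, 256, 1024])       # time resolutions (window lengths)
flucs = np.array([mean_fluctuation(x, m) for m in sizes])
slope = np.polyfit(np.log(sizes), np.log(flucs), 1)[0]
```

A long-range correlated series would show a different (shallower) slope, and a multifractal series would bend away from a single straight line.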