Title: Ratto Lecture 2
1. Monte Carlo filtering and regional sensitivity analysis (RSA)
M. Ratto, IPSC-JRC, European Commission
2. Background
- Environmental sciences, early 80s.
- Complex numerical and analytical models, based on first principles, conservation laws.
- Ill-defined parameters, competing model structures (different constitutive equations, different types of process considered, spatial/temporal resolution, ...).
- Need to establish the magnitude and sources of prediction uncertainty.
- Monte Carlo simulation analyses:
  - to achieve a better understanding of the simulated system
  - to increase the reliability of model predictions
  - to define realistic values to be used in subsequent risk assessment
3. MC filtering and RSA
One of the earliest landmark applications of MC simulation: eutrophication in the Peel Inlet, SW Australia, where Regional Sensitivity Analysis was developed and employed.
Hornberger, G.M. and Spear, R.C., An approach to the preliminary analysis of environmental systems, J. Environ. Management, 12, 7-18, 1981.
Young, P.C., Parkinson, S.D. and Lees, M., Simplicity out of complexity: Occam's razor revisited, Journal of Applied Statistics, 23, 165-210, 1996.
Young, P.C., Data-based mechanistic modelling, generalised sensitivity and dominant mode analysis, Computer Physics Communications, 117, 113-129, 1999.
4. MC filtering and RSA
Two tasks for RSA:
- Qualitative definition of the system behaviour: a set of constraints (thresholds, ceilings, time bounds) based on the available information on the system.
- Binary classification of the model outputs based on the specified behaviour definition: a simulation qualifies as behaviour (B) if the model output lies within the constraints, non-behaviour (B̄) otherwise.
5. MC filtering and RSA
Define a range for each of the k input factors xi, 1 ≤ i ≤ k, reflecting the uncertainty in the parameters and in the model constituent hypotheses. Each Monte Carlo simulation is associated with a vector of values of the input factors. Classifying the simulations as either B or B̄, a set of binary elements is defined, allowing us to distinguish two sub-sets for each xi: (xi | B), containing m elements, and (xi | B̄), containing n elements.
6. MC filtering and RSA
The Kolmogorov-Smirnov two-sample test (two-sided version) is performed for each factor independently.
Test statistic:
d_{m,n}(xi) = sup_x | Fm(xi | B) - Fn(xi | B̄) |
where F are the marginal cumulative probability functions and f the corresponding probability density functions of the two sub-sets.
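As an illustration, a minimal Python sketch of this filtering-plus-Smirnov step; the model function and the behaviour threshold are hypothetical placeholders, while scipy's ks_2samp provides the two-sample statistic and its significance level:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
k, n_sim = 4, 10000                        # number of input factors, size of the MC sample

# Step 1: sample each factor over its range (unit ranges here, purely illustrative)
X = rng.uniform(0.0, 1.0, size=(n_sim, k))

# Step 2: run the model for each sampled vector (placeholder model, not from the lecture)
def run_model(x):
    return x[0] * x[1] + 0.1 * x[2]

Y = np.array([run_model(x) for x in X])

# Step 3: binary classification into behaviour (B) and non-behaviour (B-bar)
B = Y > 0.25                               # illustrative behaviour definition

# Step 4: Smirnov two-sample test on each factor, comparing (x_i | B) with (x_i | B-bar)
p_values = []
for i in range(k):
    res = ks_2samp(X[B, i], X[~B, i])
    p_values.append(res.pvalue)
    print(f"x{i+1}: d_mn = {res.statistic:.3f}, significance level = {res.pvalue:.3g}")
```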
7. MC filtering and RSA
- At what significance level α does the computed value of d_{m,n} determine the rejection of H0?
- α is the probability of rejecting H0 when it is true (i.e. of recognising a factor as important when it is not).
- Derive the critical level Dα at which the computed value of d_{m,n} determines the rejection of H0 (the smaller α, the higher Dα).
- If d_{m,n} > Dα, then H0 is rejected at significance level α.
8. MC filtering and RSA
9. MC filtering and RSA
The importance of the uncertainty of each parameter is inversely related to this significance level. Input factors are grouped into three sensitivity classes, based on the significance level for rejecting H0:
1. critical (α < 1%)
2. important (1% < α < 10%)
3. insignificant (α > 10%)
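Continuing the sketch above, the per-factor significance levels (the p_values list) can be mapped onto these three classes; the helper below is my own naming, with the thresholds taken from the slide:

```python
def sensitivity_class(alpha):
    """Classify a factor by the significance level alpha at which H0 is rejected."""
    if alpha < 0.01:
        return "critical"          # alpha < 1%
    elif alpha < 0.10:
        return "important"         # 1% < alpha < 10%
    else:
        return "insignificant"     # alpha > 10%

# e.g. with the p_values computed in the previous sketch:
# for i, p in enumerate(p_values):
#     print(f"x{i+1}: {sensitivity_class(p)}")
```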
10. RSA comment
RSA has many global properties, similar to variance-based methods: the whole range of values of the input factors is considered, and all factors are varied at the same time. The RSA classification is related to the main effects of variance-based methods (it analyses univariate marginal distributions). No higher-order analysis, searching for an interaction structure, is performed with the RSA approach.
11. RSA Financial application
The problem of hedging a financial portfolio.
A bank issues a caplet (a financial contract), a particular type of European option whose underlying is the curve of interest rates. The way interest rates evolve through time is unknown, and therefore by selling the caplet the bank faces a risk associated with interest rate movements. The bank wants to offset this risk. The goal is not to make a profit but to avoid the risk exposure of having issued the option. In finance this is called the problem of hedging.
12. RSA Financial application
The bank buys a certain amount of FRAs (forward rate agreements), contracts that, by showing an opposite behaviour to that of the caplet with respect to changes in interest rates, are capable of offsetting the caplet's risk exposure. The amount of FRAs purchased is such that the overall bank portfolio, made up of the caplet and the FRAs, is insensitive (or almost insensitive) to interest rate movements. The portfolio is said to be delta neutral, delta indicating the type of risk being hedged (offset).
13. RSA Financial application
As time passes the portfolio tends to lose risk neutrality and to become sensitive to interest rate changes again. Maintaining risk neutrality would require the portfolio to be revised continuously. In practice only a limited number of portfolio revisions is feasible, also because each revision implies a cost. Hence a hedging error is generated, and at maturity the bank may incur a loss. The goal of the bank is to quantify the potential loss.
14. RSA Financial application
The hedging error is defined as the difference between the value of the portfolio at maturity and what would have been gained by investing the initial value of the portfolio at the interest rate prevailing on the market (the market free rate). Note that when the error is positive it means that, although failing to maintain risk neutrality, the bank is making a profit. In contrast, when the error is negative, the bank is losing money.
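In symbols (my notation, not from the slides), if Π(t0) is the initial portfolio value, Π(T) its value at maturity and r the market free rate with continuous compounding, the hedging error is

HE = Π(T) - Π(t0) · e^{r (T - t0)}

so HE > 0 means the bank ends up with a profit despite the imperfect hedge, and HE < 0 means a loss.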
15. RSA Financial application
To compute the hedging error at maturity, we need to evaluate the portfolio at any time. We use the Hull and White one-factor model, which assumes that the interest rate evolution is driven by one factor, rt, evolving as
dr_t = (θ(t) - a·r_t) dt + σ·dW_t
where θ(t) - a·r_t is the drift and σ the standard deviation (volatility); θ(t) is estimated (e.g. as a polynomial) from the information available at time t0.
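A minimal Euler-discretisation sketch of these dynamics in Python; the parameter values and the polynomial theta(t) are illustrative placeholders, not those used in the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)

a, sigma = 0.1, 0.01              # placeholder Hull-White parameters
r0, T, n_steps = 0.03, 1.0, 250   # initial short rate, horizon (years), time steps
dt = T / n_steps

def theta(t):
    # placeholder polynomial fitted to the initial yield curve at t0
    return 0.005 + 0.002 * t

r = np.empty(n_steps + 1)
r[0] = r0
for i in range(n_steps):
    t = i * dt
    dW = rng.normal(0.0, np.sqrt(dt))
    # Euler step for dr = (theta(t) - a r) dt + sigma dW
    r[i + 1] = r[i] + (theta(t) - a * r[i]) * dt + sigma * dW

print(f"simulated short rate at T: {r[-1]:.4f}")
```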
16. RSA Financial application
- The hedging error depends upon a number of factors:
  - the number of portfolio revisions performed,
  - other parameters related to the assumptions made on the way interest rates evolve with time.
- The number of revisions is a factor that is unknown and controllable, i.e. the bank can decide how many revisions to perform (the cost incurred to revise the portfolio also affects this choice).
- The parameters related to the interest rate evolution are not only unknown, but also uncontrollable.
17. RSA Financial application
- Our analysis considers four input factors for UA/SA:
  - a factor representing the variability of the dynamics of the evolution of the interest rate through time (ε),
  - the number of portfolio revisions to be performed (N. rev.),
  - the parameters a and σ of the Hull and White model of the spot rate.
- Different scenarios have also been considered, each corresponding to different values of the transaction costs incurred to revise the portfolio composition.
18. RSA Financial application
Why treat transaction costs as different scenarios, rather than as an uncertain input factor? They are affected by natural spatial variability, varying for instance from one financial market to another. Their value is therefore unknown a priori, but becomes known once the market of action has been chosen, i.e. when the model is to be applied. The analysis is thus repeated for different scenarios to represent what happens in different markets.
19. RSA Financial application
20. RSA Financial application (memo)
dr_t = (θ(t) - a·r_t) dt + σ·dW_t
where θ(t) - a·r_t is the drift and σ the standard deviation (volatility); θ(t) is estimated (e.g. as a polynomial) from the information available at time t0.
ε selects among 10 possible paths of the initial yield curve, from which the polynomial θ(t) is estimated.
21. RSA Financial application
A Monte Carlo filtering / Smirnov analysis is the most appropriate here. We look for the factors mostly responsible for driving the model output into a region of interest, defined as Y < 0 (negative hedging error), where the bank faces a loss.
22. RSA Financial application
Splitting the total output uncertainty into a part associated with the "uncontrollable" factors (a, σ, ε) and a part that can be reduced by optimising the input values (N. rev.) is a precious piece of information. It helps to assess the percentage of risk associated with the portfolio that is unavoidable.
23. RSA Financial application
A Monte Carlo filtering analysis was performed by setting as "acceptable" output values those that are positive. The MC sample size was set to 16384. The analysis was repeated for 5 possible scenarios resulting from 5 different assumptions for the values of the transaction costs: (1) no costs; (2-5) costs equal to a fixed proportion (2%, 5%, 10% or 20%) of the amount of contracts exchanged.
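Putting the pieces together, a sketch of the filtering workflow across the cost scenarios; the hedging_error function and the factor ranges are hypothetical stand-ins for the real portfolio simulation, so only the structure of the loop is meaningful:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
n_sim = 16384

# Placeholder input-factor sample (ranges are illustrative, not those of the lecture)
eps   = rng.integers(0, 10, n_sim)        # selects one of 10 initial yield-curve paths
n_rev = rng.integers(1, 33, n_sim)        # number of portfolio revisions
a     = rng.uniform(0.01, 0.2, n_sim)     # Hull-White mean-reversion parameter
sigma = rng.uniform(0.005, 0.02, n_sim)   # Hull-White volatility

def hedging_error(eps_i, n_rev_i, a_i, sigma_i, cost):
    # hypothetical stand-in: the real model prices the caplet and the FRAs
    # and revises the hedge n_rev_i times, paying the transaction cost each time
    return rng.normal(0.0, 1.0) - cost * n_rev_i * 0.05

for cost in [0.0, 0.02, 0.05, 0.10, 0.20]:            # the 5 transaction-cost scenarios
    Y = np.array([hedging_error(*f, cost) for f in zip(eps, n_rev, a, sigma)])
    acceptable = Y > 0                                 # "acceptable" = positive hedging error
    print(f"cost = {cost:.2f}: {100 * acceptable.mean():.1f}% acceptable")
    for name, x in [("eps", eps), ("N. rev.", n_rev), ("a", a), ("sigma", sigma)]:
        res = ks_2samp(x[acceptable], x[~acceptable])
        print(f"  {name}: d_mn = {res.statistic:.3f}, alpha = {res.pvalue:.2g}")
```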
24. RSA Financial application
Uncertainty analysis
25. RSA Financial application
The percentage of acceptable Ys decreases as the transaction costs increase.
26. RSA Financial application
Smirnov test
27. RSA Financial application
The results of the Smirnov test do not allow the conclusion that the model parameter a is irrelevant and can be fixed to its nominal base value. Correlation coefficients among the factors did not help to clarify this, as no significant correlation values were detected. We therefore performed a global sensitivity analysis on the (unfiltered!) output Y, the hedging error, in order to assess the overall importance of each factor by computing its total index.
28. RSA Financial application
29. RSA Financial application
The total sensitivity indices indicate that, although less important than the other parameters, the model parameter a has a non-negligible total effect (0.4). Therefore its value cannot be fixed and its uncertainty should be taken into account in further studies.
30. RSA Financial application
The sensitivity analysis provides an encouraging insight: the uncertainty in the optimal number of revisions is the main contributor to the uncertainty in the output. As this is a "controllable" factor, we are encouraged to carry out further analysis searching for the optimal value of this factor, thus reducing the uncertainty in the analysis outcome. To improve our understanding of the relationship linking the number of portfolio revisions and the hedging error, we plotted histograms of the number of acceptable outputs vs. N. rev.
31. RSA Financial application
32. RSA Financial application
Uncertainty and sensitivity analyses are valuable tools in financial risk assessment. Uncertainty analysis quantifies the potential loss incurred by the bank and, in particular, the maximum potential loss, a variable that is often of interest in this context. Sensitivity analysis identifies the relative importance of the sources of the incurred risk. It can split the risk into the amount that is not reducible and the amount that in principle may be reduced by making proper choices for the "controllable" input factor values.
33. RSA Financial application
This is an example where the Monte Carlo filtering/Smirnov approach represents an attractive methodology. The definition of an "acceptable" model behaviour is particularly indicated when addressing risk problems where the output is required to stay below a given threshold. Nevertheless, we recommend using this approach in conjunction with variance-based techniques (or the TSDE techniques presented in the next lecture), as these may overcome the limits faced by the MC/Smirnov analysis.
34. RSA Level E application
We compare the cumulative distributions for the two sub-sets corresponding to the 0-95% and 95-100% percentiles of the model output at 2·10^5 yr.
35. RSA Level E application
[Figure: cumulative distributions for the two sub-sets; solid line = highest 5%]
36. RSA Level E application
The variables v(1) and W contribute significantly to producing high output values at 2·10^5 yr (α ≈ 0). Low values of v(1) and W are mainly responsible for producing high doses. Rc(1) and Rc(2) also have a significant effect (α < 1%).
37. RSA Level E application
Correlation of the filtered sample (upper 5%)
38. RSA Level E application
All significant correlation terms involve v(1); the highest correlation terms are with the other three important variables: W (positive correlation), followed by Rc(1) and Rc(2) (negative correlation). There are also significant terms involving l(1), v(2) and l(2). This behaviour is confirmed by the variance-based analysis, in which v(1) has the dominant total-order effect (0.91) at 2·10^5 yr, while all the other factors with a significant total effect are the same ones highlighted by the correlation analysis.
39. RSA Level E application
Correlation coefficients are useful because they also suggest a qualitative way in which factors interact: if the coefficient is positive, the pair of factors acts in the model as a quotient/difference; if it is negative, they act as a product/sum.
40RSA problems
- Spear et al. (1994) reviewed their experience
with RSA, highlighting two key drawbacks - (1) The success rate
- the fraction of B is ?5 over the total
simulations for large models (kgt20) - lack in statistical power
-
41. RSA problems (ctd.)
- (2) The correlation and interaction structures of the B subset (see also Beck's review, 1987):
  - the Smirnov test is a sufficient test only if H0 is rejected (i.e. the factor is IMPORTANT)
  - any covariance structure induced by the classification is not detected by the univariate d_{m,n} statistic
  - e.g. factors combined as products or quotients may compensate each other
  - bivariate correlation analysis is not revealing, either
42. RSA problems with examples (1)
Example 1. Y = X1·X2, with X1, X2 ~ U[-0.5, 0.5]. Behavioural runs: Y > 0.
Scatter plot of the B subset in the (X1, X2) plane.
The Smirnov test fails; a correlation analysis would help (e.g. PCA).
43. RSA example (1)
Example 1. Cumulative distributions of X1 (behavioural runs: Y > 0).
44. RSA example (1)
In this case a correlation analysis would be helpful: the correlation coefficient of (X1, X2) for the B set is ≈ 0.75. This suggests performing a Principal Component Analysis on the B set, obtaining the two components (the eigenvectors of the correlation matrix):
45RSA example (1)
PC1 (0.7079, 0.7063) accounting for the 87.5
of the variation of the B set. PC2 (-0.7063,
0.7079) accounting for the 12.5 of the
variation of the B set. The direction of the
principal component PC1 (associated with the
highest eigenvalue) indicates the privileged
orientation for acceptable runs.
46. RSA example (1)
47. RSA example (1)
48. RSA example (1)
Now the significance level for rejecting the null hypothesis when it is true is very small (α < 0.1%), implying a very strong relevance of the linear combinations of the two input factors defined by the principal component analysis.
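A self-contained numerical sketch of Example 1 (sample size and seed are arbitrary): the marginals of X1 and X2 are identical in the B and B̄ sub-sets, so the univariate Smirnov statistics reflect only sampling noise, whereas a Smirnov test on the first principal component of the B set strongly rejects H0:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)
n = 10000
X = rng.uniform(-0.5, 0.5, size=(n, 2))    # X1, X2 ~ U[-0.5, 0.5]
Y = X[:, 0] * X[:, 1]                      # Y = X1 * X2
B = Y > 0                                  # behavioural runs

# Univariate Smirnov tests: the two marginals coincide, so d_mn is only sampling noise
for i in (0, 1):
    res = ks_2samp(X[B, i], X[~B, i])
    print(f"X{i+1}: d_mn = {res.statistic:.3f}, alpha = {res.pvalue:.2f}")

# Correlation of the B subset reveals the interaction (rho ~ 0.75)
rho = np.corrcoef(X[B, 0], X[B, 1])[0, 1]
print(f"correlation in B set: {rho:.2f}")

# PCA on the B set: eigenvectors of its correlation matrix
eigval, eigvec = np.linalg.eigh(np.corrcoef(X[B].T))
pc1 = eigvec[:, np.argmax(eigval)]         # dominant direction, ~ (0.707, 0.707) up to sign

# Project ALL runs on PC1 and repeat the Smirnov test: now H0 is strongly rejected
scores = X @ pc1
res = ks_2samp(scores[B], scores[~B])
print(f"PC1 scores: d_mn = {res.statistic:.3f}, alpha = {res.pvalue:.2g}")
```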
49. RSA problems with examples (2)
Example 2. Y = X1² + X2², with X1, X2 ~ U[-0.5, 0.5]. Behavioural runs: 0.2 < Y < 0.25.
Scatter plot of the B subset in the (X1, X2) plane.
The Smirnov test fails; correlation fails as well (sample ρ ≈ -0.04).
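A quick numerical check of the correlation claim (sample size arbitrary): the behavioural subset forms a ring in the (X1, X2) plane, yet its sample correlation is essentially zero, so bivariate correlation analysis (and hence PCA) cannot expose the structure:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10000
X = rng.uniform(-0.5, 0.5, size=(n, 2))
Y = X[:, 0] ** 2 + X[:, 1] ** 2            # Y = X1^2 + X2^2
B = (Y > 0.2) & (Y < 0.25)                 # behavioural runs: 0.2 < Y < 0.25

# The B subset is a ring in the (X1, X2) plane, yet by symmetry its sample
# correlation is close to zero, so correlation analysis reveals nothing.
print(f"fraction of behavioural runs: {B.mean():.2f}")
print(f"correlation in B set: {np.corrcoef(X[B].T)[0, 1]:.2f}")
```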
50. RSA problems (ctd.)
To address the RSA limitations and to better understand the impact of uncertainty and interaction in the high-dimensional parameter spaces of models, Spear et al. (1994) developed the computer-intensive tree-structured density estimation (TSDE) technique. Interesting applications of TSDE in environmental sciences can be found in Spear (1997), Grieb et al. (1999) and Osidele and Beck (2001). (see next lecture)