Title: Value at Risk
By A V Vedpuriswar
February 8, 2009
- VAR summarizes the worst loss over a target horizon that will not be exceeded at a given level of confidence.
- For example, under normal market conditions, the most the portfolio can lose over a month is about 3.6 billion at the 99% confidence level.
- The main idea behind VAR is to consider the total portfolio risk at the highest level of the institution.
- Initially applied to market risk, it is now used to measure credit risk, operational risk and enterprise-wide risk.
- Many banks can now use their own VAR models as the basis for their required capital for market risk.
- VAR can be calculated using two broad approaches:
- Non-parametric method: the most general approach, it makes no assumption about the shape of the distribution of returns.
- Parametric method: VAR computation becomes much easier if a distribution, such as the normal, is assumed.
Illustration
- Average revenue: 5.1 million per day
- Total no. of observations: 254
- Std dev: 9.2 million
- Confidence level: 95%
- No. of observations below -10 million: 11
- No. of observations below -9 million: 15
- Find the point such that the no. of observations to the left = (254)(0.05) = 12.7
- Interpolating: (12.7 - 11)/(15 - 11) = 1.7/4 ≈ 0.4
- So the required point = -(10 - 0.4) = -9.6 million
- VAR = E(W) - (-9.6) = 5.1 - (-9.6) = 14.7 million
- If we assume a normal distribution instead:
- z at the 95% confidence level, one-tailed = 1.645
- VAR = (1.645)(9.2) = 15.2 million
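A minimal Python sketch (scipy assumed available) reproduces both calculations from the figures above; it carries the interpolation exactly (0.425) rather than rounding to 0.4, so the parametric figure comes to 15.1 against the slide's rounded 15.2:

```python
from scipy.stats import norm

# Figures from the illustration above (daily revenues, in millions)
mean_rev = 5.1      # average daily revenue
sigma = 9.2         # standard deviation of daily revenue
n_obs = 254         # number of observations
conf = 0.95         # confidence level

# Non-parametric: interpolate the 5% cutoff between the two observed
# points (11 observations below -10, 15 below -9)
target = n_obs * (1 - conf)                  # 12.7 observations in the tail
cutoff = -10 + (target - 11) / (15 - 11)     # linear interpolation: about -9.6
var_np = mean_rev - cutoff                   # VAR relative to the mean: about 14.7
print(f"Non-parametric VAR: {var_np:.1f} million")

# Parametric: assume a normal distribution
z = norm.ppf(conf)                           # 1.645, one-tailed
var_normal = z * sigma                       # about 15.1
print(f"Parametric VAR:     {var_normal:.1f} million")
```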
VAR as a benchmark measure
- VAR can be used as a company-wide yardstick to compare risks across different markets.
- VAR can also be used to check whether risk has increased over time.
- VAR can be used to drill down into risk reports to understand whether higher risk is due to increased volatility or bigger bets.
VAR as a potential loss measure
- VAR can also give a broad idea of the worst loss an institution can incur.
- The choice of time horizon must correspond to the time required for corrective action as losses start to develop.
- Corrective action may include reducing the risk profile of the institution or raising new capital.
- Banks may use daily VAR because of the liquidity and rapid turnover of their portfolios.
- In contrast, pension funds generally invest in less liquid portfolios and adjust their risk exposures only slowly.
- So for them a one-month horizon makes more sense.
VAR as equity capital
- The VAR measure should adequately capture all the risks facing the institution.
- So the risk measure must encompass market risk, credit risk, operational risk and other risks.
- The higher the degree of risk aversion of the company, the higher the confidence level chosen.
- If the bank determines its risk profile by targeting a particular credit rating, the expected default rate can be converted directly into a confidence level.
- Higher credit ratings should lead to a higher confidence level.
VAR Methods
- Mapping: if the portfolio consists of a large number of instruments, it would be too complex to model each instrument separately. The first step is therefore mapping: instruments are replaced by positions on a limited number of risk factors. If we have N risk factors, the positions are aggregated across instruments.
- Local valuation methods make use of the valuation of the instruments at the current point, along with the first and, perhaps, second partial derivatives. The portfolio is valued only once.
- Full valuation methods, in contrast, reprice the instruments over a broad range of values for the risk factors.
- Linear models are based on the covariance matrix approach.
- The matrix can be simplified using factor models.
- Non-linear models take into account the first and second partial derivatives (gamma/convexity).
Delta normal approach
- The delta normal method assumes that the portfolio measures are linear and that the risk factors are jointly normally distributed.
- The delta normal method involves a simple matrix multiplication.
- It is computationally fast even with a large no. of assets, because it replaces each position by its linear exposure.
- Its disadvantages are the existence of fat tails in many distributions and its inability to handle non-linear instruments.
- First, the asset is valued at the initial point: V0 = V(S0).
- dV = (dV/dS) dS = Δ0 dS = (Δ0 S)(dS/S), where S is the risk factor.
- Portfolio VAR = Δ0 × VAR_S = Δ0 × (α σ S0)
- σ is the std deviation of rates of change in the price; α is the standard normal deviate corresponding to the specified confidence level.
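As a concrete sketch of the matrix multiplication involved, the following assumes three mapped risk factors; the exposures and the covariance matrix are made-up illustrative numbers, not figures from the text:

```python
import numpy as np
from scipy.stats import norm

# Delta-normal sketch: dollar exposures to each risk factor after mapping
x = np.array([1_000_000.0, -500_000.0, 250_000.0])

# Daily covariance matrix of risk-factor returns (illustrative)
cov = np.array([[1.0e-4, 2.0e-5, 1.0e-5],
                [2.0e-5, 4.0e-5, 5.0e-6],
                [1.0e-5, 5.0e-6, 9.0e-5]])

alpha = norm.ppf(0.99)            # standard normal deviate for 99%
sigma_p = np.sqrt(x @ cov @ x)    # portfolio std dev in dollars
var = alpha * sigma_p             # VAR = alpha x portfolio sigma
print(f"1-day 99% delta-normal VAR: {var:,.0f}")
```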
- For more complex payoffs, local valuation is not enough.
- Take the case of a long straddle, i.e., the purchase of a call and a put.
- The worst payoff (the loss of the two premiums) is realized if the spot rate does not move at all.
- In general, it is not sufficient to evaluate the portfolio at the two extremes.
- All intermediate values must be checked.
Delta Gamma Method
- In linear models, daily VAR is adjusted to other periods by scaling by a square root of time factor.
- This adjustment assumes that the position is fixed and that daily returns are independent and identically distributed.
- This adjustment is not appropriate for options, because an option's delta changes dynamically over time.
- The delta gamma method provides an analytical second-order correction to the delta normal VAR.
- Gamma gives the rate of change of delta with respect to the spot price.
- Long positions in options, which have positive gamma, have less risk than a linear model implies.
- Conversely, short positions in options have greater risk than implied by a linear model.
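A hedged one-position sketch of the second-order correction (the spot, delta, gamma and volatility below are assumed numbers, not from the text):

```python
from scipy.stats import norm

# Delta-gamma sketch for a single long option position
S0 = 100.0       # spot price of the underlying
delta = 0.6      # option delta
gamma = 0.02     # option gamma
sigma = 0.015    # daily volatility of the underlying's return
alpha = norm.ppf(0.99)

dS = alpha * sigma * S0       # adverse move in the underlying
var_delta = delta * dS        # first-order (delta normal) VAR

# Second-order Taylor correction: positive gamma cushions a long
# option position, so its VAR is below the linear estimate.
var_dg = delta * dS - 0.5 * gamma * dS**2
print(f"Delta VAR:       {var_delta:.3f}")   # about 2.09
print(f"Delta-gamma VAR: {var_dg:.3f}")      # about 1.97, less, as stated above
```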
Historical simulation method
- The historical simulation method consists of going back in time and applying current weights to a time series of historical asset returns.
- This method makes no specific assumption about the return distribution, other than relying on historical data.
- This is an improvement over the normal distribution because historical data typically contain fat tails.
- The main drawback of this method is its reliance on a short historical moving window to infer movements in market prices.
- The sampling variation of historical simulation VAR is greater than for a parametric method.
- Longer sample paths are required to obtain meaningful quantiles.
- The dilemma is that this may involve observations that are no longer relevant.
- Banks use periods between 250 and 750 days.
- This is taken as a reasonable trade-off between precision and non-stationarity.
- Many institutions are now using historical simulation over a window of 1-4 years, duly supplemented by stress tests.
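A minimal sketch of the mechanics, applying current weights to a return history; the return series here is simulated fat-tailed stand-in data, and the positions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
# 500 days of returns for 3 assets; t(5) draws mimic fat tails
hist_returns = rng.standard_t(df=5, size=(500, 3)) * 0.01

# Current dollar positions, held fixed across the window
positions = np.array([4_000_000.0, 3_000_000.0, 3_000_000.0])

# Apply today's positions to each historical day to get hypothetical P&L
pnl = hist_returns @ positions

# The 99% VAR is the loss at the 1st percentile of the P&L distribution
var_99 = -np.percentile(pnl, 1)
print(f"1-day 99% historical-simulation VAR: {var_99:,.0f}")
```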
Monte Carlo Simulation Method
- The Monte Carlo simulation method is similar to historical simulation, except that movements in risk factors are generated by draws from some pre-specified distribution.
- The risk manager samples pseudo-random numbers from this distribution and then generates pseudo dollar returns as before.
- Finally, the returns are sorted to produce the desired VAR.
- This method uses computer simulations to generate random price paths.
- Monte Carlo methods are by far the most powerful approach to VAR.
- They can account for a wide range of risks, including price risk, volatility risk, fat tails, extreme scenarios and complex interactions.
- Non-linear exposures and complex pricing patterns can also be handled.
- Monte Carlo analysis can deal with the time decay of options, daily settlements and associated cash flows, and the effect of pre-specified trading or hedging strategies.
- The Monte Carlo approach requires users to make assumptions about the stochastic process and to understand the sensitivity of the results to these assumptions.
- Different random numbers will lead to different results.
- A large number of iterations may be needed to converge to a stable VAR measure.
- When all the risk factors have a normal distribution and exposures are linear, the method should converge to the delta-normal VAR.
- The Monte Carlo approach is computationally quite demanding.
- It requires marking to market the whole portfolio over a large number of realisations of the underlying random variables.
- To speed up the process, methods have been devised to break the link between the number of Monte Carlo draws and the number of times the portfolio is repriced.
- In the grid Monte Carlo approach, the portfolio is exactly valued over a limited number of grid points.
- For each simulation, the portfolio is valued using a linear interpolation from the exact values at adjoining grid points.
- The first and most crucial step consists of choosing a particular stochastic model for the behaviour of prices.
- A commonly used model in Monte Carlo simulation is the geometric Brownian motion model, which assumes that movements in the market price are uncorrelated over time and that small movements in prices can be described by
- dSt = µt St dt + σt St dz
- dz is a random variable distributed normally with mean zero and variance dt.
- This rules out processes with sudden jumps, for instance.
- The process is also geometric because all the parameters are scaled by the current price, St.
- µt and σt represent the instantaneous drift and volatility, which can evolve over time.
- Integrating dS/S over a finite interval, we have approximately
- ΔSt = St-1 (µ Δt + σ z √Δt)
- z is a standard normal random variable with mean zero and unit variance.
- St+1 = St + St (µ Δt + σ z1 √Δt)
- St+2 = St+1 + St+1 (µ Δt + σ z2 √Δt)
- Monte Carlo simulations are based on random draws z from a variable with the desired probability distribution.
- The first building block is a uniform distribution over the interval (0,1) that produces a random variable x.
- Good random number generators must create series that pass all conventional tests of independence.
- Otherwise, the characteristics of the simulated price process will not obey the underlying model.
- The next step is to transform the uniform random number x into the desired distribution through the inverse cumulative probability distribution.
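Putting the last three slides together, here is a hedged sketch of a GBM Monte Carlo VAR, using inverse-transform sampling of normal draws; the drift, volatility and horizon are assumed numbers:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
S0, mu, sigma = 100.0, 0.0, 0.01      # initial price, daily drift, daily vol
n_steps, n_paths = 10, 100_000        # 10-day horizon

# Inverse-transform sampling: uniform draws x mapped through the
# inverse normal CDF, as described above
x = rng.uniform(size=(n_paths, n_steps))
z = norm.ppf(x)

# Discretised GBM with dt = 1 day:
# S(t+1) = S(t) + S(t) * (mu*dt + sigma*z*sqrt(dt))
S = np.full(n_paths, S0)
for t in range(n_steps):
    S = S + S * (mu + sigma * z[:, t])

# Sort the simulated P&L and read off the 1% quantile
pnl = S - S0
var_99 = -np.percentile(pnl, 1)
print(f"10-day 99% Monte Carlo VAR (S0 = 100): {var_99:.2f}")
```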
Selective Sampling
- Sample along the paths that are most important to the problem at hand.
- If the goal is to measure a tail quantile accurately, there is no point in running simulations that will generate observations in the centre of the distribution.
- To increase the accuracy of the VAR estimator, we can partition the simulation region into two or more zones, as in the sketch below.
- An appropriate number of observations is then drawn from each zone.
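A toy illustration of the two-zone partition for a single normal risk factor; the portfolio P&L model and all parameters are assumptions, and real implementations stratify the actual risk-factor space:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
sigma = 1.0       # P&L std dev of a toy linear portfolio
n = 10_000

# Zone 1: the left 10% of the distribution (loss region); zone 2: the
# remaining 90%. Draw half the sample from each zone, oversampling the tail.
p_tail = 0.10
u_tail = rng.uniform(0.0, p_tail, n // 2)
u_body = rng.uniform(p_tail, 1.0, n // 2)
pnl = sigma * norm.ppf(np.concatenate([u_tail, u_body]))

# Reweight: the tail zone's 0.10 probability is spread over n/2 draws,
# the body zone's 0.90 over the other n/2.
w = np.concatenate([np.full(n // 2, p_tail / 0.5),
                    np.full(n // 2, (1 - p_tail) / 0.5)]) / n

# Weighted 1% quantile gives the 99% VAR
order = np.argsort(pnl)
cum = np.cumsum(w[order])
var_99 = -pnl[order][np.searchsorted(cum, 0.01)]
print(f"Stratified 99% VAR: {var_99:.3f} (exact: {norm.ppf(0.99):.3f})")
```

Because half the sample lands in the loss tail, the quantile estimate there is far less noisy than under plain Monte Carlo with the same budget.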
- Using more information about the portfolio distribution results in more efficient simulations.
- The simulation can proceed in two phases.
- The first pass runs a traditional Monte Carlo.
- The risk manager then examines the region of the risk factors that causes losses around VAR.
- A second pass is then performed with many more samples from that region.
Backtesting
- Backtesting is done to check the accuracy of the model.
- It should be done in such a way that the likelihood of catching biases in VAR forecasts is maximized.
- A longer horizon reduces the number of independent observations and thus the power of the tests.
- Too high a confidence level reduces the expected number of observations in the tail and thus the power of the tests.
- For the internal models approach, the Basle Committee recommends a 99% confidence level over a 10-business-day horizon.
- The resulting VAR is multiplied by a safety factor of 3 to arrive at the minimum regulatory capital.
- As the confidence level increases, the number of occurrences below VAR shrinks, leading to poor measures of high quantiles.
- There is no simple way to estimate a 99.99% VAR from the sample because it has too few observations.
- Shorter time intervals create more data points and facilitate more effective backtesting.
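One common way to formalise such a test is to compare the exception count with its binomial distribution under a correct model; a minimal sketch, where the observed count is an assumed number:

```python
from scipy.stats import binom

conf = 0.99
n_days = 250
n_exceptions = 8                  # days on which losses exceeded VAR (assumed)

expected = n_days * (1 - conf)    # 2.5 exceptions expected at 99%
# Probability of at least this many exceptions if the model is correct
p_value = 1 - binom.cdf(n_exceptions - 1, n_days, 1 - conf)
print(f"Expected {expected:.1f} exceptions, observed {n_exceptions}")
print(f"P(N >= {n_exceptions} | correct model) = {p_value:.4f}")
```

A tiny p-value, as here, is evidence that the model understates risk; clustering of the exceptions in time would be a separate warning sign.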
Choosing the method
- Simulation methods are quite flexible.
- They can either postulate a stochastic process or resample from historical data.
- They allow full valuation on the target date.
- But they are prone to model risk and sampling variation.
- Greater precision can be achieved by increasing the number of replications, but this may slow the process down.
- For large portfolios where optionality is not a dominant factor, the delta normal method provides a fast and efficient way of measuring VAR.
- For fast approximations of option values, delta gamma is efficient.
- For portfolios with substantial option components, or longer horizons, a full valuation method may be required.
- If the stochastic process chosen for the price is unrealistic, so will be the estimate of VAR.
- For example, the geometric Brownian motion model adequately describes the behaviour of stock prices and exchange rates, but not that of fixed income securities.
- In Brownian motion models, price shocks are never reversed and prices move as a random walk.
- This cannot be the price process for default-free bond prices, which must converge to their face value at expiration.
VAR Applications
- VAR methods represent the culmination of a trend towards centralized risk management.
- Many institutions have started to measure market risk on a global basis because the sources of risk have multiplied and volatility has increased.
- A portfolio approach gives a better picture of risk than looking at different instruments in isolation.
- Centralization makes sense for credit risk management too.
- A financial institution may have myriad transactions with the same counterparty, coming from various desks such as currencies, fixed income, commodities and so on.
- Even though each desk may have a reasonable exposure when considered on an individual basis, these exposures may add up to an unacceptable risk.
- Also, with netting agreements, the total exposure depends on the net current value of contracts covered by the agreements.
- None of this is possible in the absence of a global measurement system.
- The institutions that will benefit most from a global risk management system are those exposed to:
  - diverse risks
  - active position taking / proprietary trading
  - complex instruments
- VAR is a useful information reporting tool.
- Banks can disclose their aggregated risk without revealing their individual positions.
- Ideally, institutions should provide summary VAR figures on a daily, weekly or monthly basis.
- Disclosure of information is an effective means of market discipline.
- VAR is also a useful risk control tool.
- Position limits alone do not give a complete picture.
- The same position limit on a 30-year Treasury is riskier than on a 5-year Treasury.
- VAR limits can supplement position limits.
- In volatile environments, VAR can be used as the basis for scaling down positions.
- VAR acts as a common denominator for comparing various risky activities.
- VAR can be viewed as a measure of the risk capital, or economic capital, required to support a financial activity.
- The economic capital is the aggregate capital required as a cushion against unexpected losses.
- VAR helps in measuring risk-adjusted return.
- Without controlling for risk, traders may become reckless.
- If a trader makes a large profit, he receives a large bonus.
- If he makes a loss, the worst that can happen is that he gets fired.
- The application of VAR in performance measurement depends on its intended purpose.
- Internal performance measurement aims at rewarding people for actions they have full control over.
- For this, the individual (undiversified) VAR seems the appropriate choice.
- External performance measurement aims at allocating existing or new capital to existing or new business units.
- Such decisions should be based on marginal and diversified VAR measures.
- VAR can also be used at the strategic level to identify where shareholder value is being added throughout the corporation.
- VAR can help management take decisions about which business lines to expand, maintain or reduce.
- It can also inform the appropriate level of capital to hold.
- A strong capital allocation process produces substantial benefits.
- The process almost always leads to improvements.
- Finance executives are forced to examine prospects for revenues, costs and risks in all their business activities.
- Managers start to learn things about their business they did not know.
Extreme Value Theory (EVT)
- EVT extends the central limit theorem, which deals with the distribution of the average of independently and identically distributed variables drawn from an unknown distribution, to the distribution of their tails.
- The EVT approach is useful for estimating the tail probabilities of extreme events.
- For very high confidence levels (>99%), the normal distribution generally underestimates potential losses.
- Empirical distributions suffer from a lack of data in the tails.
- This makes it difficult to estimate VAR reliably.
- EVT helps us draw smooth curves through the extreme tails of the distribution based on powerful statistical theory.
- In many cases, the t distribution with 4-6 degrees of freedom is adequate to describe the tails of financial data.
- EVT applies to the tails.
- It is not appropriate for the centre of the distribution.
- It is also called a semi-parametric approach.
- The underlying EVT theorem was proved by Gnedenko in 1943.
- EVT helps us draw smooth curves through the tails of the distribution.
EVT Theorem
- F(y) = 1 - (1 + ξy)^(-1/ξ) for ξ ≠ 0
- F(y) = 1 - e^(-y) for ξ = 0
- where y = (x - µ)/β, β > 0
- The normal distribution corresponds to ξ = 0: the tails disappear at exponential speed.
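A sketch of how a tail of this form is used in practice, via the standard peaks-over-threshold approach: fit a generalized Pareto distribution to losses beyond a threshold, then invert the fitted tail for a high quantile. The loss data here are simulated t(4) stand-ins, echoing the 4-6 degrees of freedom mentioned above:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
losses = rng.standard_t(df=4, size=2000)       # fat-tailed stand-in data

u = np.percentile(losses, 95)                  # threshold: worst 5% of losses
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0.0)   # shape xi, scale beta

# Standard GPD quantile inversion:
# VAR_p = u + (beta/xi) * (((n/N_u) * (1-p))**(-xi) - 1)
n, n_u, p = len(losses), len(excesses), 0.999
var_evt = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)
print(f"Fitted xi = {xi:.2f}; 99.9% EVT VAR = {var_evt:.2f}")
print(f"Raw empirical 99.9% quantile = {np.percentile(losses, 99.9):.2f}")
```

The fitted curve extrapolates smoothly beyond the two observations in the raw 99.9% tail, which is exactly the gap EVT is meant to fill.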
EVT Estimators
[Chart comparing quantile estimators under the normal distribution (ξ = 0) and EVT; the figure itself is not recoverable.]
- Fitting EVT functions to recent historical data is fraught with the same pitfalls as VAR.
- Once-in-a-lifetime events cannot be taken into account even by powerful statistical tools.
- So they need to be complemented by stress testing.
- The goal of stress testing is to identify unusual scenarios that would not occur under standard VAR models.
- Stress tests can simulate shocks that have never occurred or have been considered highly unlikely.
- Stress tests can also simulate shocks that reflect permanent structural breaks or temporarily changed statistical patterns.
- Stress testing should be enforced, but the problem is that the stresses need to be pertinent to the type of risk the institution carries.
- It would be difficult to enforce a limited number of relevant stress tests.
- The complex portfolio models banks generally employ give the illusion of accurate simulation at the expense of substance.
How effective are VAR models? VAR and sub-prime
- The tendency of risk managers and other executives to describe events in terms of sigma tells us a lot.
- Whenever there is talk of sigma, a normal distribution is implied.
- Real-life distributions have fat tails.
- Goldman Sachs chief financial officer David Viniar once described the credit crunch as a 25-sigma event.
- The credit crisis of late 2007 was largely a failure of risk management.
- The risk models of many banks were unable to predict the likelihood, speed or severity of the crisis.
- Attention turned particularly to the use of value-at-risk as a measure of the risk in a portfolio.
- A few VAR exceptions are expected: at the 99% level, a properly working model would still produce two to three exceptions a year. But the existence of clusters of exceptions indicates that something is wrong.
- In the third quarter of 2007, Credit Suisse reported 11 exceptions at the 99% confidence level, Lehman Brothers three at 95%, Goldman Sachs five at 95%, Morgan Stanley six at 95%, Bear Stearns 10 at 99% and UBS 16 at 99%.
- Clearly, VAR is a tool for normal markets; it is not designed for stress situations.
What window?
- It would have been difficult for VAR models to have captured all the recent market events, especially as the environment was emerging from a period of relatively benign volatility.
- A two-year window won't capture the extremes, so the VAR it produces will be too low.
- A longer window is a partial solution at best.
- It will improve matters a little, but it also swamps recent events.
Is a shorter window better?
- A longer observation period may pick up a wider variety of market conditions, but it would not necessarily allow VAR models to react quickly to an extreme event.
- If the problem is that models are not reacting fast enough, some believe the answer would in fact be to use shorter windows.
- These models would be surprised by the first outbreak of volatility, but would rapidly adapt.
What models work best?
- The best VAR models are those that are quicker to react to a step-change in volatility.
- With the benefit of hindsight, the type of VAR model that would have worked best in the second half of 2007 would most likely have been one driven by a frequently updated short data history.
- Alternatively, one that weights more recent observations more heavily than more distant observations.
- In an environment like the third quarter of 2007, a long data series will include an extensive period of low volatility, which will mute the model's reaction to a sudden increase in volatility.
- Although such a series will include episodes of volatility from several years ago, these will be outweighed by the intervening period of calm.
The importance of updating
- In the wake of the recent credit crisis, one unarguable improvement is to increase the frequency with which the data series is updated.
- Monthly or even quarterly updating of the data series is currently the norm.
- Shifting to weekly or even daily updating would improve the responsiveness of the model to a sudden change in conditions.
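A minimal sketch of one such recency-weighted scheme, an exponentially weighted moving-average volatility; λ = 0.94 follows the common RiskMetrics convention, and the return series is simulated to show the step-change behaviour discussed above:

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 0.94

# 400 calm days followed by 100 turbulent days
returns = np.concatenate([rng.normal(0, 0.005, 400),
                          rng.normal(0, 0.02, 100)])

# EWMA update applied once per day: old data decay geometrically
ewma_var = returns[0] ** 2
for r in returns[1:]:
    ewma_var = lam * ewma_var + (1 - lam) * r ** 2

print(f"EWMA vol today:     {np.sqrt(ewma_var):.4f}")   # close to 0.02
print(f"Equal-weighted vol: {returns.std():.4f}")       # muted by the calm period
```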