Chapter #7: Value at Risk (VaR)
The chairman of JPMorgan was dissatisfied with the long risk reports he received every day. These contained a huge amount of
detail on the Greek letters for different exposures, but very little that was really useful to top
management. He asked for something simple that focused on the bank’s total exposure over the next
24 hours measured across the bank’s entire trading portfolio. At first his subordinates said this was
impossible, but eventually they adapted Markowitz's portfolio theory to develop a VAR report.
Producing the report entailed a huge amount of work involving the collection of data daily on the
positions held by the bank around the world, the handling of different time zones, the estimation of
correlations and volatilities, and the development of computer systems. The work was completed
in about 1990.
Historical Perspectives
The main benefit of the new system was that senior management had a better understanding of the
risks being taken by the bank and were better able to allocate capital within the bank. Other banks
had been working on similar approaches for aggregating risks, and by 1993 VAR was established as an important risk measure.
Banks usually keep the details of the models they develop internally a secret. However, in
1994 JPMorgan made a simplified version of its own system, which it called RiskMetrics, freely available.
RiskMetrics included variances and covariances for a very large number of different market
variables. This attracted a lot of attention and led to debates about the pros and cons of different
VaR models. Software firms started offering their own VaR models, some of which used
the RiskMetrics database. After that, VaR was rapidly adopted as a standard by financial institutions and some
nonfinancial corporations. The BIS Amendment, which was based on VaR, was announced in 1996
and implemented in 1998. Later, the RiskMetrics group within JPMorgan was spun off as a
separate company. This company developed CreditMetrics for handling credit risks in 1997 and
CorporateMetrics for handling the risks faced by nonfinancial corporations in 1999.
Value at risk
DEFINITION OF VAR
When using the value at risk measure, we are interested in making a statement
of the following form:
“We are X percent certain that we will not lose more than V dollars in time T.”
• The variable V is the VAR of the portfolio. It is a function of two parameters: the time horizon, T, and the confidence level, X percent.
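As a minimal numeric sketch of this statement (all figures invented for illustration), suppose the change in portfolio value over the horizon T is normal with mean zero and a standard deviation of $200,000; then V at X = 99 is simply the 1st-percentile loss:

```python
from statistics import NormalDist

# Toy assumption, not from the text: the change in portfolio value over
# the horizon T is normal with mean 0 and std dev $200,000.
sd = 200_000

# V is the loss we are 99% certain not to exceed: the 1st percentile of
# the value change, with the sign flipped to express it as a loss.
V = -NormalDist(mu=0, sigma=sd).inv_cdf(0.01)
print(f"X = 99 percent, V ≈ ${V:,.0f}")
```

Reading the same number off a loss distribution, rather than a distribution of gains, is purely a sign convention; both appear in practice.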
Suppose we are trying to design a risk measure that will equal the capital a
financial institution is required to keep. Is VAR (with an appropriate time
horizon and an appropriate confidence level) the best measure?
Artzner et al. have examined this question. They first proposed a number of
properties that such a risk measure should have.
VAR and Capital
These are:
• 1. Monotonicity: If a portfolio produces a worse result than another portfolio for every state of the world, its risk measure should be greater.
• 2. Translation invariance: If an amount of cash K is added to a portfolio, its risk measure should go down by K.
• 3. Homogeneity: Changing the size of a portfolio by a factor λ, while keeping the relative amounts of different items in the portfolio the same, should result in the risk measure being multiplied by λ.
• 4. Subadditivity: The risk measure for two portfolios after they have been
merged should be no greater than the sum of their risk measures before they
were merged.
• The first condition is straightforward. If one portfolio always performs worse than another portfolio, it clearly should have a higher risk measure.
• The second condition is also reasonable. If we add an amount of cash K to a portfolio, it provides a buffer against losses and should reduce the capital
requirement by K.
• The third condition also makes sense: if we change the size of a portfolio by a factor λ, its risk measure should change by the same factor.
• The fourth condition states that diversification helps reduce risks. When we aggregate
two portfolios, the total risk measure should either decrease or stay the same. VAR
satisfies the first three conditions. However, it does not always satisfy the fourth one.
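The failure of subadditivity can be seen in a small sketch (the two-loan numbers below are invented for illustration): each loan alone has a 95% VaR that is negative, because its 4% default probability lies inside the 5% tail, yet merging the two loans pushes the chance of at least one default above 5%, so the merged VaR jumps.

```python
import numpy as np

def var(losses, probs, level=0.95):
    """Smallest loss l with P(L <= l) >= level -- the definition of VaR."""
    order = np.argsort(losses)
    losses, probs = np.asarray(losses)[order], np.asarray(probs)[order]
    cum = np.cumsum(probs)
    return losses[np.searchsorted(cum, level)]

# Hypothetical loan: gains 2 with probability 0.96, loses 100 otherwise.
# The 4% default chance sits inside the 5% tail, so the 95% VaR is -2.
single = var([-2, 100], [0.96, 0.04])

# Two such independent loans merged: none / one / both default.
losses = [-4, 98, 200]
probs = [0.96**2, 2 * 0.96 * 0.04, 0.04**2]
merged = var(losses, probs)

# merged (98) is far greater than the sum of the parts (-2 + -2 = -4),
# violating subadditivity.
print(single, merged)
```

This is exactly the kind of tail-risk concentration that motivated alternatives such as expected shortfall, which is subadditive.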
Advantages of VAR
• It is easy to understand
• The trading desks of banks calculate profit and loss daily. When their
positions are fairly liquid and actively managed, it therefore makes sense to
calculate a VAR over a time horizon of one trading day. If the VAR turns out
to be unacceptably high, the positions can be adjusted quickly.
• For other portfolios, a longer time horizon such as one month is often chosen. This is because the portfolio is traded less actively
and some of the instruments in the portfolio are less liquid. Also, the
performance of such portfolios is often assessed over longer periods.
Historical Simulation
• A method of calculating value at risk (VaR) that uses historical data to assess the
likely distribution of changes in the value of the current portfolio. This distribution can then be used to calculate the maximum loss with a
given confidence level.
• Because historical simulation uses real data, it can capture unexpected events and
correlations that a model-based approach might miss.
• The simple assumption here is that past returns give some
indication of what the future will be like and therefore an idea of possible future losses.
• The first simulation trial assumes that the percentage changes in all market
variables are the same as those on the first day covered by the data; the second simulation trial assumes that the percentage changes in all
market variables are the same as those on the second day; and so on.
• An alternative is to assume a probability model for the joint behavior of the
market variables and to use historical data to estimate the model parameters.
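The trial-by-trial procedure above can be sketched directly: take each historical day's return as one scenario for tomorrow, apply it to today's portfolio, and read the VaR off the resulting loss distribution. The returns below are randomly generated stand-ins for real market history.

```python
import numpy as np

# Stand-in for 500 days of observed daily portfolio returns
# (synthetic; a real calculation would use actual market data).
rng = np.random.default_rng(42)
returns = rng.normal(0.0005, 0.01, size=500)

portfolio_value = 1_000_000

# Each historical day is one simulation trial: the dollar loss today's
# portfolio would suffer if that day's return repeated itself.
losses = -returns * portfolio_value

# One-day 99% VaR: the loss exceeded on only 1% of the trials.
var_99 = np.percentile(losses, 99)
print(f"1-day 99% VaR ≈ ${var_99:,.0f}")
```

No distributional assumption is made here; the answer is only as good as the historical window, which is both the approach's strength and its weakness.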
The Model-Building Approach
• The mean and standard deviation of the value of a portfolio can be calculated
from the mean and standard deviation of the returns on the underlying
products and the correlations between those returns. If daily returns on the
investments are assumed to be multivariate normal, the probability
distribution for the change in the value of the portfolio over one day is also
normal. This makes it very easy to calculate value at risk.
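A minimal sketch of that calculation for a hypothetical two-asset portfolio (the positions, daily volatilities, and correlation are all invented for illustration):

```python
import numpy as np
from statistics import NormalDist

# Hypothetical two-asset portfolio (all figures invented).
positions = np.array([300_000, 500_000])   # dollar value held in each asset
daily_vol = np.array([0.018, 0.012])       # std devs of daily returns
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])              # correlation of daily returns

# Covariance matrix of returns, then the std dev of daily portfolio P&L.
cov = corr * np.outer(daily_vol, daily_vol)
portfolio_sd = np.sqrt(positions @ cov @ positions)

# Under multivariate normality the change in portfolio value is normal,
# so the 99% one-day VaR is just the 99th-percentile z-value times the
# portfolio standard deviation.
z = NormalDist().inv_cdf(0.99)
var_1day = z * portfolio_sd
var_10day = var_1day * np.sqrt(10)   # sqrt-of-time scaling to a 10-day horizon

print(f"1-day 99% VaR ≈ ${var_1day:,.0f}")
```

The sqrt-of-time scaling in the last step assumes independent, identically distributed daily changes; it is a common convention rather than a law.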
Credit scoring models
• Credit-scoring models use data on observed borrower characteristics either to
calculate the probability of default or to sort borrowers into different default risk classes. The classification, in
turn, depends on the values of various financial ratios of the borrower and the
weights those ratios receive in a discriminant analysis model.
Altman’s Z-Score.
• Altman’s credit-scoring model takes the following form:
Z = 1.2X1 + 1.4X2 + 3.3X3 + 0.6X4 + 1.0X5
Where
• X1 = Working capital/Total assets
• X2 = Retained earnings/Total assets
• X3 = EBIT/Total assets
• X4 = Market value of equity/Book value of total liabilities
• X5 = Sales/Total assets
The higher the value of Z, the lower the borrower’s default risk classification.
Thus, low or negative Z values may be evidence that the borrower is a member
of a relatively high default risk class.
Example #2
• Let’s assume Bill’s Boats’ financial statements had the following figures:
• Sales: $1,000,000
• EBIT: $500,000
• Total assets: $2,000,000
• Retained earnings: $1,000,000
• Market value of equity: $2,000,000
• Book value of total liabilities: $1,000,000
• Working capital: $500,000
The five ratios are then:
• A = $500,000 / $2,000,000 = 0.25
• B = $1,000,000 / $2,000,000 = 0.50
• C = $500,000 / $2,000,000 = 0.25
• D = $2,000,000 / $1,000,000 = 2.00
• E = $1,000,000 / $2,000,000 = 0.50
giving Z = 1.2(0.25) + 1.4(0.50) + 3.3(0.25) + 0.6(2.00) + 1.0(0.50) = 3.53. Since this is above the commonly used cutoff of about 3.0, Bill’s Boats would be classified as low default risk.
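The arithmetic above can be checked with a short function. The working-capital figure of $500,000 is a hypothetical value for the example; the weights are Altman’s original 1968 coefficients.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_equity, sales, total_assets, total_liabilities):
    """Altman's original (1968) Z-score for a public manufacturer."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Bill's Boats figures; the $500,000 working capital is an assumed value.
z = altman_z(working_capital=500_000, retained_earnings=1_000_000,
             ebit=500_000, market_equity=2_000_000, sales=1_000_000,
             total_assets=2_000_000, total_liabilities=1_000_000)
print(round(z, 3))  # → 3.525
```

A higher Z means a lower default-risk classification, so a value above roughly 3.0 puts the borrower in the low-risk zone.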