VaR: The state of play


Review of Financial Economics 11 (2002) 175-189

Neil D. Pearson (Department of Finance, College of Commerce and Business Administration, University of Illinois at Urbana-Champaign, 1206 South 4th Street, Champaign, IL 61820-6980, USA) and Charles Smithson (Rutter Associates, USA)

Received 12 January 2002; received in revised form 10 February 2002; accepted 13 April 2002

Abstract

Since Value at Risk (VaR) received its first wide introduction in the July 1993 Group of Thirty report, the number of users of and uses for VaR has increased dramatically. However, VaR itself has been evolving. In this article, we first review some of the important refinements in VaR that have appeared: improved speed of computation, improved accuracy, and improved stress testing. We then look at the next steps (which we refer to as "Beyond VaR"), in which we review extensions to standard VaR, the emergence of risk contribution measures, and alternatives to standard VaR (including Extreme Value Theory [EVT] and Coherent Risk Measures). © 2002 Elsevier Science Inc. All rights reserved.

Keywords: Value-at-risk; Risk measurement; Stress testing; Backtesting

1. Refinements in VaR

It is important to recognize that the Value at Risk (VaR) technique has gone through significant refinement since it originally appeared a decade ago.

2. Improvement in the speed of computation

At the outset, three approaches to computing VaR were proposed: Historical Simulation, [Full] Monte Carlo Simulation,[1] and the Analytic Variance-Covariance (Delta Normal) approach. Monte Carlo simulation was the preferred approach because of its ability (in contrast to delta normal) to measure accurately the risk of portfolios with significant options content, and because it does not need the large historical samples (and attendant assumption of stable volatilities) required by the Historical Simulation method. In addition, stress testing and sensitivity analysis fit naturally within the framework of the Monte Carlo simulation method. However, a significant drawback is that, for real-world portfolios, implementation of the Monte Carlo simulation method can be complex and often requires significant computer resources.[2] Consequently, researchers have focused attention on approximation methods to compute the VaR of options portfolios without simulation and on ways to increase the speed of Monte Carlo simulations.

1. To produce an accurate VaR for options portfolios without resorting to simulation, the Delta-Gamma-Theta method involves computing the first four moments of a delta-gamma-theta approximation[3] of the portfolio value, selecting a flexible distribution that permits skewness and fat tails, and then choosing the parameters of that flexible distribution so that it matches the moments of the delta-gamma-theta approximation. (If the time derivative is not used, the method is referred to as the Delta-Gamma method.)

2. The Delta-Gamma-Theta Monte Carlo method combines simulation with a delta-gamma-theta approximation (see the sketch below). At each draw of the market factors in the simulation, an approximate portfolio value is computed using the delta-gamma-theta approximation. Using the approximation avoids valuing each of the instruments for each of the thousands of draws in the simulation, i.e., it breaks the link between the number of draws in the Monte Carlo and the number of times the portfolio is repriced. (If the time derivative is not used, the method is called Delta-Gamma Monte Carlo.)

[1] The Full Monte Carlo requires the repricing of the portfolio for each factor realization.

[2] There are two main reasons for the complexity and computational burden: (1) Even relatively simple portfolios are typically exposed to large numbers of underlying market factors, each of which must be included in the simulation; e.g., in a fixed income portfolio, the yield curve for each currency is described by up to 20 different yields, each of which constitutes a separate market factor. (2) The valuation of the entire portfolio or trading book for each simulated scenario of future market rates and prices can be burdensome if the portfolio includes American options or instruments such as callable bonds with embedded options; e.g., if the simulation involves 10,000 scenarios and the portfolio includes 1,000 different options or callable bonds, the Monte Carlo simulation requires valuing 10,000 × 1,000 = 10 million American options, each of which involves traversing a binomial tree or similar grid of stock prices or interest rates.
[3] Deltas are the first derivatives with respect to each of the market factors, gammas are the second derivatives, and theta is the time derivative. Thus, the approximation is first order in time and second order in the market factors.
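To make the mechanics concrete, the following is a minimal sketch of a Delta-Gamma Monte Carlo VaR calculation (the theta term is omitted for brevity); the factor covariance matrix, delta vector, and gamma matrix are illustrative placeholders, not figures from this article.

```python
import numpy as np

# Minimal Delta-Gamma Monte Carlo sketch (illustrative parameters).
# The P/L for a draw dF of factor changes is approximated by
#   dV ~ delta' dF + 0.5 * dF' Gamma dF,
# so the portfolio is never fully repriced inside the simulation loop.
rng = np.random.default_rng(0)

cov = np.array([[0.0004, 0.0001, 0.0000],
                [0.0001, 0.0009, 0.0002],
                [0.0000, 0.0002, 0.0016]])   # daily factor covariance (assumed)
delta = np.array([1.0e6, -5.0e5, 2.0e5])     # first derivatives w.r.t. the factors (assumed)
gamma = np.array([[ 2.0e5, 0.0,    0.0],
                  [ 0.0,  -1.0e5,  0.0],
                  [ 0.0,   0.0,    5.0e4]])  # second derivatives (assumed)

n_draws = 100_000
dF = rng.multivariate_normal(np.zeros(3), cov, size=n_draws)

# Quadratic (delta-gamma) approximation of the P/L for every draw.
pnl = dF @ delta + 0.5 * np.einsum("ij,jk,ik->i", dF, gamma, dF)

var_99 = -np.percentile(pnl, 1)   # 99% VaR = 1st percentile of the P/L distribution
print(f"Approximate 99% 1-day VaR: {var_99:,.0f}")
```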

A limitation of the preceding methods is their dependence on local approximations (i.e., Taylor series), the errors of which typically increase with the size of the risk factor realizations.[4] One improvement is the Grid Monte Carlo method. Market factor changes are generated as in the Full Monte Carlo method, but exact repricing of the portfolio occurs only on the nodes of a grid of factor value realizations, and linear interpolation is used to reprice the portfolio for factor realizations that fall between the nodes. However, a naive grid approach that reflected changes in all of the market factors would be of very high dimension; e.g., a grid constructed for nine possible values of 20 market factors results in 9^20 ≈ 1.2 × 10^19 nodes. While the link between the number of draws in the Monte Carlo and the number of repricings of the portfolio is broken, it goes the wrong way: the number of portfolio repricings would exceed the number of draws. Since a naive grid would be infeasible, researchers looked for alternatives.

3. The Modified Grid Monte Carlo limits the factors to be modeled on the grid to those involving the most severe nonlinearities; the other factors are modeled using a first- (or higher-) order Taylor series. The gains from this are large only if nonlinearities are important for only a small number of factors.

4. The Principal Components Grid Monte Carlo (Frye, 1996) uses the method of principal components to reduce the number of factors (see the sketch below); e.g., for term structures, the first three principal components, often interpreted as level, slope, and curvature, explain most of the risk of changes in interest rates. By using only the first few principal components as factors, the number of nodes on the grid is reduced; e.g., using nine possible changes in the term structure level, five possible changes in slope, and three possible changes in curvature,[5] there would be only 9 × 5 × 3 = 135 nodes. Clearly, Grid Monte Carlo methods are satisfactory if the approximate portfolio values computed by linear interpolation from the grid are adequate. However, they are less useful for portfolios of instruments such as options on individual common stocks, for which the residual risk not explained by factor models is important.

5. Scenario Simulation (Jamshidian & Zhu, 1997) breaks the link between the number of Monte Carlo draws and the number of portfolio repricings by approximating the distributions of changes in the factors rather than by approximating the portfolio value. Principal components analysis is used to reduce the number of factors. Each risk factor is then assumed to take only a small number of distinct values, leading to a small (or, at least, manageable) number of possible scenarios, each corresponding to a portfolio value that needs to be computed only once. Monte Carlo simulation is then done by sampling among these scenarios, leading to a great reduction in the number of portfolio revaluations required.

[4] Pritsker (1997) included the phrase "delta gamma" in the names of two other methods. Delta-Gamma-Delta computes the first two moments of a delta-gamma approximation to the portfolio value and then fits a normal distribution to those first two moments. Delta-Gamma-Minimization expresses the portfolio value in terms of i.i.d. standard normal random variables, identifies the set of realizations that includes, say, 95% of the probability, and estimates the value-at-risk as the greatest loss within or on the sphere.
Since both of these approaches are dominated (in the sense of accuracy and computational time requirements) by other approaches discussed in this article, no further mention will be made of them.

[5] Frye (1996) suggests using fewer grid points for the second and third factors because the principal components analysis has identified them as making a smaller contribution to the risk than the first factor, so that the error from using a crude approximation is smaller.
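As an illustration of the dimension reduction that underlies the Principal Components Grid Monte Carlo and Scenario Simulation methods, the sketch below extracts level/slope/curvature-type components from yield-curve changes; the simulated data and the choice of three components are assumptions for illustration, not taken from Frye (1996).

```python
import numpy as np

# Principal components of daily yield-curve changes (illustrative, simulated data).
rng = np.random.default_rng(1)
n_days, n_tenors = 1000, 10

# Fabricate correlated yield changes: nearby tenors move together (assumption).
tenors = np.arange(1, n_tenors + 1)
corr = np.exp(-np.abs(tenors[:, None] - tenors[None, :]) / 4.0)
vols = 0.0008 + 0.0002 * np.linspace(1.0, 0.5, n_tenors)   # short end more volatile
cov_true = corr * np.outer(vols, vols)
dY = rng.multivariate_normal(np.zeros(n_tenors), cov_true, size=n_days)

# PCA via eigendecomposition of the sample covariance matrix.
cov_hat = np.cov(dY, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov_hat)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("Variance explained by first three components:", explained[:3].round(3))

# Reduced factor set: project the 10 yield changes onto the first 3 components.
scores = dY @ eigvecs[:, :3]
print("Reduced factor matrix shape:", scores.shape)   # (1000, 3)
```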

Using portfolios of European-style foreign currency options, Pritsker (1997) examined the tradeoff between accuracy and computational time for the standard Delta Normal method,[6] Full Monte Carlo, Delta-Gamma Monte Carlo, and Modified Grid Monte Carlo. Abken (2000) examined the performance of Scenario Simulation on a portfolio involving interest rate derivatives. Gibson and Pritsker (2000) compared the performance of Scenario Simulation and Principal Components Grid Monte Carlo using portfolios of interest rate derivatives. The evaluation of the various methods can be summarized as follows.[7] Not surprisingly, Delta Normal is fastest but least accurate, and the approximate Monte Carlo methods are slower but more accurate than the Delta Normal approach, but faster and less accurate than the Full Monte Carlo. More surprising is the fact that Scenario Simulation, Modified Grid Monte Carlo, and Principal Components Grid Monte Carlo are dominated by the Delta-Gamma-Theta Monte Carlo method.

[6] This method assumes that the risk factors are normally distributed and approximates changes in portfolio value as a linear function of the risk factors.

[7] Since none of the researchers cited compared Principal Components and Modified Grid Monte Carlo, we assume that the accuracy of the two methods is similar, a reasonable assumption because the grid in Modified Grid Monte Carlo could be chosen to have the same structure as that used in Principal Components Grid Monte Carlo.

This is presumably because the delta-gamma-theta approximation provided a good description of the changes in the values of the portfolios used in the comparisons.

3. Improvements in the accuracy of VaR

3.1. Improved methodologies for Historical Simulation

While Historical Simulation is well suited to portfolios that include options and does not require the specification of a particular distribution, the weakness of the standard Historical Simulation approach is the assumption that the distribution of changes in the market factors is at least approximately stable for relatively long periods. Moreover, even with this assumption, standard Historical Simulation is subject to purely statistical errors or sampling variation, because the location of the, say, 95% VaR is determined by where the worst 5% of the outcomes fall, which depends on the random past realizations of the market factors. Researchers have attempted to overcome these drawbacks.

Boudoukh, Richardson, and Whitelaw (1998) proposed Historical Simulation with exponentially weighted past returns: the portfolio return n periods in the past is weighted by aλ^(n-1), where N is the total number of past observations, λ < 1, and the coefficient a = (1 - λ)/(1 - λ^N) is chosen to make the sum of the weights equal to 1. The weights a, aλ, ..., aλ^(n-1), ..., aλ^(N-1) are used to construct the empirical distribution (histogram) of the returns by acting as if a proportion aλ^(n-1) of the observations had the return R_{t-n}, and the VaR estimate is then read off the empirical distribution. While this allows the VaR estimate to reflect recent volatility, it can exacerbate the estimation error, because deweighting the past observations is similar to using a smaller sample.

Hull and White (1998) proposed a different method of adjusting past returns: daily volatility estimates are computed for every market factor for the current date and for every day during the historical data period, the historical returns are scaled by the ratio of the current to past volatilities, and the scaled returns are used in place of the ordinary returns. The adjusted market factors are used to reprice the portfolio and then to form the empirical distribution/histogram. This approach has the advantage of reflecting current market volatility through the rescaled returns, while still using a long historical sample to provide information about the fatness of the tails.
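The sketch below illustrates both adjustments on a single return series; the simulated returns, the EWMA volatility estimator, and the parameter choices (λ = 0.97, 99% confidence) are assumptions made for illustration rather than the specifications used by the cited authors.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative daily return series with a volatility regime shift (assumed data).
returns = np.concatenate([rng.normal(0, 0.01, 400), rng.normal(0, 0.02, 100)])
N = len(returns)

# --- Exponentially weighted historical simulation (Boudoukh-Richardson-Whitelaw style) ---
lam = 0.97                                   # decay parameter (assumption)
ages = np.arange(N)[::-1]                    # most recent observation has age 0
weights = (1 - lam) / (1 - lam**N) * lam**ages
order = np.argsort(returns)                  # sort outcomes from worst to best
cum_w = np.cumsum(weights[order])
var_brw = -returns[order][np.searchsorted(cum_w, 0.01)]   # 99% VaR from weighted histogram

# --- Volatility-rescaled historical simulation (Hull-White style) ---
# EWMA variance as a simple stand-in for any daily volatility estimator.
var_ewma = np.empty(N)
var_ewma[0] = returns[:30].var()
for t in range(1, N):
    var_ewma[t] = 0.94 * var_ewma[t - 1] + 0.06 * returns[t - 1] ** 2
sigma = np.sqrt(var_ewma)
scaled = returns * sigma[-1] / sigma          # rescale past returns to current volatility
var_hw = -np.percentile(scaled, 1)            # 99% VaR from rescaled returns

print(f"Exponentially weighted 99% VaR: {var_brw:.4f}")
print(f"Volatility-rescaled 99% VaR:    {var_hw:.4f}")
```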

4. Backtesting

Backtesting is crucial to verify accuracy and identify areas in which improvement is needed, and banking regulators need to be sure that VaR models are not systematically biased. Underlying the simplest backtesting framework is the idea that, with a VaR model with a 99% confidence level, one expects to observe exceptions on 1% of the days; backtesting the model using the last 250 daily P/Ls, the expected number of exceptions is 2.5. Of course, the actual number of exceptions depends on the random outcomes of the underlying market factors in addition to the quality of the VaR model; even if the VaR model is correct, the actual number will typically differ from the expected number. This leads to a rule based on a range: if the number of exceptions falls outside the range, reject the model; if it falls within the range, accept the model as correct. While this simple approach is enshrined in the Basle framework, it is not very powerful: a range wide enough that a correct model is rejected with only low probability is so wide that many incorrect models are also rejected with only low probability, while a range narrow enough that most incorrect models are rejected with high probability also leads to the rejection of most correct models.[8]

One source of the problem is that, with a confidence level of 99%, there inherently are very few exceptions; so, a large sample is required. Because large samples of weekly or monthly data are either not available or include past periods no longer relevant, this leads to backtesting VaR models using daily data even if VaR is intended to be used with a longer horizon. The problem also can be ameliorated by using a lower confidence level, because there will be more exceptions to provide information about the performance of the model. While use of a lower confidence level is useful if the purpose of the VaR model is to serve as a broad gauge of risk, it does not help when VaR is used to set capital requirements, i.e., when the user is concerned primarily with the lower tail of the distribution.

Even ignoring the power of the tests, to pass backtests of the sort used in the Basle rules, a VaR model need only be correct on average.[9] VaR models that are grossly deficient can pass a Basle-rule backtest. (Consider a model that generates a VaR of $100 billion on every day but the 100th day, when it generates a VaR of $0. Since this model would generate an exception on every 100th day, and on only 2 or 3 days out of every 250, it would pass, although it provides no useful risk information.) A more realistic example is a VaR model that does not fully respond to changes in market volatility, thereby producing downward-biased estimates during high volatility periods. Since the exceptions from such a model will tend to follow one another, the problem could be uncovered by looking at the conditional probabilities of exceptions: if VaR is computed using a confidence level of 99%, then it should be the case that Prob{exception at time t+1 | exception at time t} = Prob{exception at time t+1 | no exception at time t} = 1% (see the sketch below). A limitation of both the Basle and conditional exception approaches is that they focus on only one quantile of the distribution, the VaR. As a result, they are inherently not very powerful.

[8] This is illustrated by the Basle rules: a 99% confidence level VaR model is placed in either the yellow zone or red zone if five or more exceptions are observed in a 250-day period.
The probability that this happens to a correct VaR model is 10.8%, while an incorrect VaR model that understates VaR by 20% will have four or fewer exceptions, and thus be accepted, with a probability of almost 13%.

[9] In statistical jargon, the Basle approach considers only the marginal distribution of the exceptions.
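As a concrete illustration of exception counting and of the conditional-exception check described above, the sketch below compares daily P/L to a 99% VaR series; the simulated P/L and the constant VaR level are placeholders, not figures from the Basle rules or from any cited study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative backtest: 250 daily P/L observations against a 99% VaR estimate (assumed data).
pnl = rng.normal(0, 1.0, 250)          # daily P/L
var_99 = np.full(250, 2.33)            # reported 99% VaR (a constant here, for simplicity)

exceptions = pnl < -var_99             # a loss larger than VaR is an exception
n_exc = int(exceptions.sum())
print(f"Exceptions observed: {n_exc} (expected 2.5 for a correct model)")

# Basle-style zone rule on 250 days: five or more exceptions -> yellow/red zone.
print("Zone rule outcome:", "yellow/red" if n_exc >= 5 else "green")

# Conditional-exception check: for a correct model,
# P(exception at t+1 | exception at t) should equal
# P(exception at t+1 | no exception at t) = 1%.
after_exc = exceptions[1:][exceptions[:-1]]
after_no_exc = exceptions[1:][~exceptions[:-1]]
p_after_exc = after_exc.mean() if after_exc.size else float("nan")
p_after_no_exc = after_no_exc.mean() if after_no_exc.size else float("nan")
print(f"P(exception | exception yesterday)    = {p_after_exc:.3f}")
print(f"P(exception | no exception yesterday) = {p_after_no_exc:.3f}")
```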

Alternatively, backtests could exploit the fact that an estimate of the entire probability distribution underlies the VaR estimate. Let x_t denote the realized P/L at time t and F the estimate of the distribution function of x_t. If the estimate F is actually the distribution function of the P/L x_t, then: (i) the transformed random variable u_t = F(x_t) should be distributed uniformly on the interval [0,1]; and (ii) the u's from different dates should be independent, i.e., u_{t+j} should be independent of u_t for j > 0. Tests based on these ideas have been proposed by Berkowitz (1999) and Crnkovic and Drachman (1996). However, tests based on the entire distribution are not necessarily the answer, precisely because they are based on the entire distribution. Errors in modeling the center of the distribution can lead to the rejection of the VaR model, although such errors are unimportant for risk management.

5. Improvements in stress testing

U.S. Federal Reserve Chairman Alan Greenspan has worried that the incipient art of stress testing has yet to find formalization and uniformity across banks and securities dealers; at present, most banks pick a small number of ad hoc scenarios as their stress tests.[10] Chairman Greenspan's concerns are valid with respect to the selection of scenarios for the core risk factors. However, once the stress scenarios for these have been specified, statistical tools can be used to determine the scenarios for the other factors. For example, suppose the stress scenario is a U.S. stock market crash accompanied by a flight to quality, defined as a 1-day 20% decline in the S&P 500 index and a 5% increase in the price of the 10-year U.S. Treasury note. Rather than an ad hoc specification of the other market factors (e.g., swap or bond spreads), the covariance matrix (and expected changes, if these are nonzero) of the market factors can be used to compute conditional expectations, and the other market factors could be set equal to their conditional expectations (see the sketch below). Or, the covariance matrix could be estimated using only data from periods of past market crises, resulting in a stress scenario that is both internally consistent and consistent with the data from past periods of market stress.

[10] See "Greenspan's Plea for Stress-Testing," Greenspan (2000).
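To show how the conditional-expectation idea can be applied, the sketch below completes a stress scenario for peripheral factors given shocks to two core factors, using the standard conditional-mean formula for jointly normal changes; the covariance matrix and the factor labels are invented for illustration.

```python
import numpy as np

# Complete a stress scenario by conditioning on shocked "core" factors.
# Factor order: [S&P 500 return, 10y Treasury price return, swap spread chg, credit spread chg]
# Covariance matrix of daily factor changes (illustrative numbers, not estimated from data).
cov = np.array([
    [ 4.0e-4, -0.8e-4, -0.5e-5, -1.0e-5],
    [-0.8e-4,  1.0e-4,  0.3e-5,  0.5e-5],
    [-0.5e-5,  0.3e-5,  0.4e-6,  0.2e-6],
    [-1.0e-5,  0.5e-5,  0.2e-6,  0.9e-6],
])

core = [0, 1]                      # indices of the factors we shock directly
other = [2, 3]                     # factors to be filled in by conditional expectation
shock = np.array([-0.20, 0.05])    # -20% equity, +5% 10y Treasury price (the stated scenario)

# E[other | core = shock] = C_oc C_cc^{-1} shock, taking expected changes to be zero here.
C_cc = cov[np.ix_(core, core)]
C_oc = cov[np.ix_(other, core)]
cond_exp = C_oc @ np.linalg.solve(C_cc, shock)

for name, value in zip(["swap spread change", "credit spread change"], cond_exp):
    print(f"Conditional stress move for {name}: {value:+.5f}")
```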

6. Beyond VaR

Above, we described refinements to VaR, which increase computational speed, improve accuracy, and facilitate stress testing. Next, we describe extensions to standard VaR and the techniques that have been proposed as alternatives to VaR, as well as the emergence of risk contribution measures.

7. Extensions to standard VaR

7.1. Liquidity-adjusted VaR (LaVaR)

Standard VaR measures the riskiness of a portfolio over a fixed, usually short, holding period. Inherent is the implicit assumption that the risk can be eliminated by the end of the holding period, by liquidating or hedging the portfolio. In periods of market illiquidity, the implicit assumption may not be valid. Moreover, even in more normal periods, it is unlikely to be valid for all instruments. LaVaR addresses this issue by recognizing that there are limits to the rate at which a portfolio can be liquidated.

To illustrate a simple LaVaR implementation, consider a situation where only b units of an instrument can be sold each day, so a 100-unit portfolio would require n = 100/b days to liquidate. If the liquidation of the position became necessary, a possible strategy would be to liquidate b units each day and invest the proceeds at the risk-free rate r_f. (Any positions not yet liquidated would remain exposed to market risk.) Under that trading strategy, b units would be exposed to market risk for 1 day and then invested at the risk-free rate r_f for the remaining n-1 days; another b units would be exposed to market risk for 2 days and then invested at r_f for n-2 days; another b units would be exposed to market risk for 3 days and then invested at r_f for n-3 days; and so on. If the initial price per unit is P_0 and the i-th day return is r_i, the trading strategy of liquidating b units on each of n days results in a liquidated value (at the end of n days) of

bP_0 \left[ (1+r_1)(1+r_f)^{n-1} + (1+r_1)(1+r_2)(1+r_f)^{n-2} + (1+r_1)(1+r_2)(1+r_3)(1+r_f)^{n-3} + \cdots \right].

LaVaR is then obtained by estimating the distribution of this liquidated value, similar to the way standard VaR is obtained by estimating the distribution of the mark-to-market portfolio value.[11] Since LaVaR measures portfolio risk over an n-day horizon, it follows that LaVaR will exceed a standard VaR over a 1-day horizon whenever n > 1, but will be less than a standard VaR over an n-day horizon. LaVaR can be computed using a simulation of the evolution of returns over n days, and can be approximated with an analytic variance-covariance (delta normal) approach[12] (a simulation sketch follows below).

[11] More sophisticated implementations of LaVaR could incorporate the fact that illiquidity is correlated with extreme market movements by making the liquidation rate b dependent on the values of other market factors.

[12] The approximation makes use of the fact that (1+r_1)(1+r_2)(1+r_3) ≈ 1 + r_1 + r_2 + r_3, and similarly for other returns.
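A minimal simulation of the liquidation strategy just described might look like the following; the position size, liquidation limit, return volatility, risk-free rate, and the choice of baseline against which losses are measured are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Liquidity-adjusted VaR sketch: sell b units per day, invest proceeds at the risk-free rate.
units, b = 100, 20                    # 100-unit position, 20 units saleable per day (assumed)
n = units // b                        # liquidation horizon in days
P0, sigma, rf = 100.0, 0.02, 0.0001   # initial price, daily vol, daily risk-free rate (assumed)
n_sims = 50_000

returns = rng.normal(0.0, sigma, size=(n_sims, n))       # daily returns of the instrument
growth = np.cumprod(1.0 + returns, axis=1)                # (1+r_1)...(1+r_i) for i = 1..n

# Tranche sold at the end of day i grows with the market for i days,
# then earns the risk-free rate for the remaining n - i days.
days = np.arange(1, n + 1)
liquidated = b * P0 * (growth * (1.0 + rf) ** (n - days)).sum(axis=1)

# Baseline chosen here: current value rolled up at the risk-free rate (an assumption).
baseline = units * P0 * (1.0 + rf) ** n
lavar_99 = -np.percentile(liquidated - baseline, 1)
print(f"99% liquidity-adjusted VaR over {n} days: {lavar_99:,.0f}")
```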

7.2. Mark to future (MtF)[13]

Standard VaR can be viewed as the second step in the evolution of risk measurement methodology; Algorithmics' MtF represents a third step. MtF is a scenario-based framework, encompassing both nonprobabilistic scenario analysis (by looking at a small set of prespecified scenarios) and Monte Carlo simulation (by assigning probabilities to the scenarios).[14] MtF captures the passage of time by simulating the evolution of market factors and portfolio values.[15] This permits the computation of VaR and other risk measures at various horizons. It also makes it possible to incorporate path dependency of individual instruments and portfolio values, as well as maturation of instruments, prepayments, reinvestment of cash flows, and dynamic portfolio strategies.

The ability to incorporate changes in portfolios is particularly important when considering longer time horizons. Standard VaR computations assume that the portfolio is constant over time. This perspective remains relevant for derivatives dealers: a 1-day VaR provides a useful summary measure of risk, assuming that the dealer does not liquidate any positions or change any hedges. However, over the longer horizons relevant for investment and portfolio management, the assumption of no changes in the portfolio makes a standard VaR less useful. Dynamic features of actual portfolios can only be captured by Monte Carlo simulation over multiple time points that allows the user to specify rules for changing the positions as functions of factor realizations, instrument or portfolio values, or other features such as instrument deltas.

This third step permits incorporation of credit risk, by including credit spreads or indexes of credit quality as factors. Moreover, at the cost of introducing potentially large numbers of factors, credit ratings of individual obligors could be included. The default and transition probabilities can depend on other market factors, allowing the approach to capture correlations between credit migrations (including default) and other market factors, e.g., wrong-way exposures. Thus, in addition to generating VaR estimates that combine market and credit risk, the MtF approach can produce estimates of potential credit exposure that reflect correlations.

[13] Adapted from Dembo, Aziz, Rosen, and Zerbs (2000). We focus on a few of the specific enhancements in risk measurement provided by the framework. Another dimension of the MtF framework is that it provides a consistent risk architecture that facilitates the evolution of risk measurement methodologies.

[14] Note that this approach is not unique to Algorithmics; see, e.g., Askari's RiskBook and RiskWorld framework.

[15] For computational efficiency, market factor scenarios are generated once and stored in a data structure termed the MtF cube, where they are available for use in valuing multiple portfolios. This separates the simulation of market factors from the portfolio valuation process.

Since this third step permits portfolio composition to depend on market factors, MtF can also be used to examine liquidity risk. Funding liquidity risk can be measured by tracking a cash account. In the case of asset liquidity risk, including market trading volumes as market factors and specifying portfolio holdings as functions of the factors makes it possible to simulate illiquid scenarios, i.e., scenarios where portfolio liquidation requires n days. The additional risk is revealed by a VaR calculation over an n-day horizon. Further, allowing trading volumes to depend on the other factors makes it possible to capture the relation between liquidity risk and extreme market movements.

8. Emergence of risk contribution measures

As the use of VaR has expanded from a simple communication device to play a role in managing the portfolios of banks and other financial institutions, interest has naturally focused on the decomposition of risk into its sources and on measures of the risk contributions of instruments, asset classes, and market factors. There are two main approaches to measuring risk contributions, which we call incremental and marginal.

8.1. Incremental decomposition

Incremental decomposition is similar to regression analysis. Express the return r on a portfolio in terms of the changes in (or returns to) K factors, i.e.,

r = \beta_0 + \beta_1 (\Delta F_1) + \beta_2 (\Delta F_2) + \cdots + \beta_K (\Delta F_K) + \epsilon,

where \Delta F_k is the change in the k-th factor, \beta_k measures the sensitivity of the portfolio return to the k-th factor, and \epsilon is a residual (which may be zero). For any factor model, it is possible to compute the proportion of the variance of r explained by the K factors. Analogous to the R^2 of a multiple regression, we denote this measure R^2_K. To compute the risk contribution of the K-th factor, we would consider the (K-1)-factor model

r = \beta_0 + \beta_1 (\Delta F_1) + \beta_2 (\Delta F_2) + \cdots + \beta_{K-1} (\Delta F_{K-1}) + \epsilon,

and compute the proportion of the variance of r explained by the K-1 factors, R^2_{K-1}. The risk contribution of the K-th factor is then R^2_K - R^2_{K-1} (see the sketch below). A limitation of incremental decomposition is that it depends on the order in which the factors are considered. While some situations might have a natural ordering of factors,[16] in most cases the order is less apparent. In such cases, Golub and Tilman (2000) suggested that at each step one should search over all of the remaining factors (or groups of factors) to find the one with the largest risk contribution.

[16] For example, in the case of fixed income portfolios, the first three factors might correspond to changes in the level, slope, and curvature of the yield curve.
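The following sketch computes an incremental decomposition for a simulated two-factor example by comparing the explained variance of nested factor models; the data-generating parameters are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Incremental risk decomposition via nested factor regressions (illustrative data).
T = 2000
dF = rng.normal(size=(T, 2)) * np.array([0.01, 0.005])   # changes in two factors (assumed vols)
beta = np.array([1.5, -0.8])                              # assumed portfolio sensitivities
r = dF @ beta + rng.normal(0, 0.004, T)                   # portfolio return with residual noise

def explained_variance(y, X):
    """Proportion of the variance of y explained by a least-squares fit on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()

r2_1 = explained_variance(r, dF[:, :1])   # model with factor 1 only
r2_2 = explained_variance(r, dF)          # model with factors 1 and 2

print(f"R^2 with factor 1 only:               {r2_1:.3f}")
print(f"R^2 with factors 1 and 2:             {r2_2:.3f}")
print(f"Incremental contribution of factor 2: {r2_2 - r2_1:.3f}")
```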

8.2. Marginal decomposition

An important property of both the standard deviation and VaR measures of a portfolio is that scaling all positions by a common factor, k, has the effect of scaling the standard deviation and VaR by the same factor, k. An implication of this scaling property of the risk measures is that the portfolio VaR can be decomposed into the risk contributions of the various positions. In particular, letting w = (w_1, w_2, ..., w_N) denote the portfolio weights on the N assets or instruments in the portfolio, the portfolio VaR can be expressed as[17]

VaR(w) = \frac{\partial VaR(w)}{\partial w_1} w_1 + \frac{\partial VaR(w)}{\partial w_2} w_2 + \cdots + \frac{\partial VaR(w)}{\partial w_N} w_N.

The partial derivative \partial VaR(w)/\partial w_i can be interpreted as the effect on risk of increasing w_i by one unit; so, the term [\partial VaR(w)/\partial w_i] w_i is the risk contribution of the i-th position. This framework underlies the "Hot Spots and hedges" approach of Litterman (1996). Not surprisingly, the risk contribution of a position depends crucially on the covariance of the return on that position with the return on the existing portfolio. In fact, in the analytic variance-covariance (delta normal) approach the term \partial VaR(w)/\partial w_i is precisely the covariance. This is zero when the position is uncorrelated with the existing portfolio, in which case the risk contribution is zero. When the correlation is positive, the risk contribution is positive; when it is negative, the position serves as a hedge and the risk contribution is negative.

In interpreting the risk decomposition it is crucial to remember that it is a marginal analysis (a numerical sketch follows below). The marginal effects cannot be extrapolated to large changes, because the partial derivatives change as the position sizes change. In terms of correlations, changes in the size of a position change the correlation between the portfolio and that position. For example, if the i-th position is uncorrelated with the current portfolio, the risk contribution of a small increase in the i-th position is zero. However, if the size of the i-th position increases, that position comprises a larger fraction of the portfolio, requiring that the correlation increase; the risk contribution of the i-th position increases with the position size.

[17] This is Euler's law.
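Under the delta normal assumptions, the marginal decomposition can be computed directly from the weight vector and the covariance matrix of returns, as in the sketch below; the covariance matrix and weights are invented for illustration.

```python
import numpy as np

# Marginal (Euler) decomposition of delta-normal VaR (illustrative inputs).
w = np.array([0.5, 0.3, 0.2])                      # portfolio weights (assumed)
cov = np.array([[0.0400, 0.0060, 0.0000],
                [0.0060, 0.0225, 0.0045],
                [0.0000, 0.0045, 0.0100]])         # return covariance matrix (assumed)

z = 2.326                                          # ~ 99% standard normal quantile
sigma_p = np.sqrt(w @ cov @ w)
var_p = z * sigma_p                                # portfolio VaR (as a fraction of value)

marginal = z * (cov @ w) / sigma_p                 # dVaR/dw_i for each position
contrib = marginal * w                             # Euler risk contributions

print(f"Portfolio VaR: {var_p:.4f}")
print("Risk contributions by position:", np.round(contrib, 4))
print("Sum of contributions (equals the VaR):", round(contrib.sum(), 4))
```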

9. Alternatives to standard VaR

VaR is being called on to perform functions not imagined when it was first developed as a means of communication between trading desks and senior management. Using VaR to determine economic or regulatory capital requires the use of very high confidence levels, thereby focusing attention on the tails of the distributions of changes in market rates and prices. However, the techniques behind the standard VaR measure perform best where there is a lot of data (i.e., near the centers of distributions). Not surprisingly, alternatives to VaR have been proposed that perform better in the tails.

9.1. Extreme value theory (EVT)

It is well known that the actual distributions of changes in market rates and prices have fat tails relative to the normal distribution, implying that an appropriately fat-tailed distribution would provide better VaR estimates for high confidence levels. However, since the data contain relatively few extreme observations, we have little information about the tails. So, selecting the right fat-tailed parametric distribution and estimating its parameters are inherently difficult tasks. EVT offers a potential solution. Loosely, EVT tells us that the behavior of certain extreme values is the same (i.e., described by a particular parametric family of distributions), regardless of the distribution that generates the data.[18]

Two principal distributions appear in EVT. The Generalized Extreme Value distribution describes the limiting behavior of the maximum of a sequence of random variables. The Generalized Pareto Distribution (GPD) describes the tail of a distribution above some large value and thus can be used to compute the probabilities of extreme realizations, exactly what is required for VaR estimates at high confidence levels. To indicate the usefulness of the GPD, suppose we were considering a random variable X (e.g., a mark-to-market loss) with distribution function F. Focusing on the upper tail, define a threshold u. The conditional distribution function F(x | X > u) gives the conditional probability that the excess loss (i.e., X - u) is less than x, given that the loss exceeds u. EVT holds that, as the threshold u gets large, the conditional distribution function approaches the GPD given by

G(x) = 1 - \left( 1 + \frac{\xi x}{\beta} \right)^{-1/\xi},

where \xi and \beta are parameters that must be estimated. The preceding equation implies that the conditional probability of an excess loss greater than x is approximated by

P(X - u > x \mid X > u) \approx 1 - G(x) = \left( 1 + \frac{\xi x}{\beta} \right)^{-1/\xi}.

From this, the unconditional probability of a loss can be obtained as

P(X - u > x) = P(X - u > x \mid X > u) \, P(X > u).

Evidence suggests that the GPD provides a good fit to the tails of the distributions of changes in individual market rates and prices (see Neftci, 2000), and EVT appears to be useful in measuring credit risk when there is a single important factor (Parisi, 2000). However, the available empirical evidence does not bear directly on the question of whether EVT is useful for measuring the VaR of portfolios that depend (perhaps nonlinearly) on multiple sources of risk. Classical EVT is univariate, i.e., it does not characterize the joint distribution of multiple risk factors.

[18] The result requires certain restrictions on the distributions that generate the data, including a smoothness restriction on the tail probabilities.
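As a simple numerical illustration of the peaks-over-threshold formulas above, the sketch below fits a GPD to losses exceeding a threshold and then inverts the unconditional tail probability to obtain a high-confidence VaR; the simulated loss data, the threshold choice, and the use of scipy's genpareto fit are assumptions for illustration, not the procedures of the cited studies.

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

rng = np.random.default_rng(6)

# Peaks-over-threshold sketch: fit a GPD to losses above a threshold u (illustrative data).
losses = student_t.rvs(df=4, size=10_000, random_state=rng) * 0.01   # fat-tailed daily losses
u = np.quantile(losses, 0.95)                                        # threshold choice (assumed)
excesses = losses[losses > u] - u

xi, _, beta = genpareto.fit(excesses, floc=0)          # ML fit of shape xi and scale beta
p_exceed = (losses > u).mean()                         # empirical P(X > u)

# Invert P(X - u > x | X > u) * P(X > u) = p to get the VaR at confidence 1 - p.
p = 0.001                                              # 99.9% confidence level
var_evt = u + (beta / xi) * ((p / p_exceed) ** (-xi) - 1.0)
print(f"GPD parameters: xi = {xi:.3f}, beta = {beta:.5f}")
print(f"EVT-based 99.9% VaR estimate: {var_evt:.4f}")
print(f"Empirical 99.9% quantile:     {np.quantile(losses, 0.999):.4f}")
```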

To apply EVT to portfolios that depend on multiple sources of risk, one must estimate the distribution of P/L by Historical Simulation and then fit the GPD to the tail of the distribution of P/L. We are not aware of any results on the performance of this approach.

9.2. Coherent measures of risk

VaR was criticized from the outset because it says nothing about the magnitude of losses greater than the VaR. A more subtle criticism of VaR is that it does not correctly capture the effect of diversification (although capturing the benefits of diversification is one of the commonly cited advantages of VaR). To see this, suppose the portfolio contains short digital puts and calls on the same underlying. Each option has a notional amount of $10 million, a 4% probability of being exercised, time to expiration equal to the VaR time horizon, and a premium of $400,000. The 95% confidence VaR measures of the two positions considered separately indicate no risk, because each suffers a loss with a probability of only 4%. However, the 95% VaR of the aggregate portfolio is $10 million - 2 × $400,000 = $9.2 million, and the VaR of the diversified portfolio composed of one-half of each position is (1/2)($10 million - 2 × $400,000) = $4.6 million. (A numerical sketch of this example appears after the list of properties below.)

Artzner, Delbaen, Eber, and Heath (hereafter, ADEH) (1997, 1999) argue that risk measures should be coherent, i.e., they should satisfy the following four properties (where the vectors X and Y denote the possible state-contingent payoffs of two different portfolios and \rho(X) and \rho(Y) their risk measures):

1. Subadditivity:

\rho(X + Y) \le \rho(X) + \rho(Y).

A risk measure should reflect the impact of hedges or offsets; so, the risk measure of an aggregate portfolio must be less than or equal to the sum of the risk measures of the smaller portfolios that comprise it.

2. Homogeneity:

\rho(aX) = a\rho(X), where a is an arbitrary nonnegative constant.

The risk measure is proportional to the scale of the portfolio, e.g., halving the portfolio halves the risk measure.

3. Monotonicity:

\rho(X) \ge \rho(Y) \text{ if } X \le Y.

If portfolio Y dominates X, in that each payoff of Y is at least as large as the corresponding payoff of X (i.e., X \le Y), then Y must be of lesser or equal risk.

4. Risk-free condition:

\rho(X + b(1+r)) = \rho(X) - b, where r is the risk-free interest rate and b is an arbitrary constant.

Adding a risk-free instrument to a portfolio decreases the risk by the size of the investment in the risk-free instrument. This property ensures that coherent risk measures can be interpreted as the amount of capital needed to support a position or portfolio.

Note that VaR is not a coherent risk measure, because the aggregate portfolio of the digital put and call discussed above fails to satisfy property (1), while the diversified portfolio fails to satisfy the combination of (1) and (2) with a = 1/2.[19]

ADEH show that all coherent risk measures can be represented in terms of generalized scenarios. In particular, first construct a list of K scenarios of future market factors and portfolio values, as might be done in Monte Carlo simulation or deterministic scenario analysis. Second, construct a set of M probability measures on the K scenarios. These probability measures will determine how the different scenarios are weighted in the risk measure, and need not reflect the likelihood of the scenarios.[20] For example, one measure might say that the K scenarios are equally likely, while another might say that the k-th scenario occurs with probability one while the other scenarios have probability zero. Third, for each of the M probability measures, calculate the expected loss. Finally, the risk measure is the largest of the M expected losses.

This seemingly abstract procedure corresponds to some widely used risk measures. For example, ADEH show that the expected shortfall measure defined by E[loss | loss ≥ cutoff] is a coherent risk measure. And the Chicago Mercantile Exchange's Standard Portfolio Analysis methodology can be shown to be a coherent risk measure. But coherent risk measures do not appear to be making inroads on VaR among banks and their regulators. A drawback of explicitly scenario-based approaches is that it is unclear how reasonably to select scenarios and probability measures on scenarios in situations in which portfolio values depend on dozens or even hundreds of risk factors. This requires significant thought, and probably knowledge of the portfolio.[21] In situations with many market factors, scenario-based approaches lose intuitive appeal and can be difficult to explain to senior management, boards of directors, regulators, and other constituencies.

[19] VaR satisfies property (1) if the price changes of all instruments follow a multivariate normal distribution.

[20] In this context, a probability measure is just an assignment of probabilities to the K scenarios, where probabilities are numbers between 0 and 1 whose sum (over the K scenarios) is 1. At most one of the probability measures will correspond to the likelihood of events, so unless M = 1 (and sometimes even in that case), the probability measures will not be based on the risk manager's assessment of the likelihood of the scenarios.

[21] However, ADEH (1997) argue that an approach that requires thinking before computation... can only improve risk management.
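The digital put/call example above can be checked numerically; the sketch below computes 95% VaR for the stand-alone and aggregate positions by enumerating the possible outcomes, and is purely an illustration of the non-subadditivity argument, not a reproduction of ADEH's calculations.

```python
import numpy as np

# Digital put/call example: check 95% VaR subadditivity by enumerating outcomes.
# Each short digital option: $10 million notional, 4% exercise probability, $400,000 premium.
notional, premium, p_exercise = 10_000_000, 400_000, 0.04

def var_95(outcomes, probs):
    """Smallest loss level L such that P(loss <= L) >= 95%."""
    losses = -np.asarray(outcomes, dtype=float)
    order = np.argsort(losses)
    cum = np.cumsum(np.asarray(probs, dtype=float)[order])
    return losses[order][np.searchsorted(cum, 0.95)]

# Stand-alone short digital option: keep the premium (96%) or pay the notional (4%).
single_pl   = [premium, premium - notional]
single_prob = [1 - p_exercise, p_exercise]

# Aggregate of the short put and short call: at most one can finish in the money,
# so the exercise events are disjoint (8% chance of paying out on one of them).
agg_pl   = [2 * premium, 2 * premium - notional]
agg_prob = [1 - 2 * p_exercise, 2 * p_exercise]

half_pl = [0.5 * x for x in agg_pl]   # "diversified" portfolio: one-half of each position

print(f"95% VaR, single short digital:  {var_95(single_pl, single_prob):>12,.0f}")
print(f"95% VaR, aggregate portfolio:   {var_95(agg_pl, agg_prob):>12,.0f}")
print(f"95% VaR, half of each position: {var_95(half_pl, agg_prob):>12,.0f}")
print(f"Sum of stand-alone VaRs:        {2 * var_95(single_pl, single_prob):>12,.0f}")
```

The aggregate VaR ($9.2 million) exceeds the sum of the stand-alone VaRs, which is the subadditivity violation described in the text.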

References

Abken, P. A. (2000, Summer). An empirical evaluation of value at risk by scenario simulation. Journal of Derivatives, 12-30.
Artzner, P., Delbaen, F., Eber, J.-M., & Heath, D. (1997, November). Thinking coherently. Risk, 10(11), 68-71.
Artzner, P., Delbaen, F., Eber, J.-M., & Heath, D. (1999, July). Coherent measures of risk. Mathematical Finance, 9(3), 203-228.
Berkowitz, J. (1999, March). Evaluating the forecasts of risk models. Federal Reserve Board Working Paper.
Boudoukh, J., Richardson, M., & Whitelaw, R. (1998, May). The best of both worlds. Risk, 11(5), 64-66.
Crnkovic, C., & Drachman, J. (1996, September). Quality control. Risk, 9(9), 138-143.
Dembo, R. S., Aziz, A. R., Rosen, D., & Zerbs, M. (2000). Mark to future: a framework for measuring risk and reward. Toronto: Algorithmics Publications.
Frye, J. (1998). Monte Carlo by day. Risk, 11(11), 66-71.
Gibson, M. S., & Pritsker, M. (2000). Improving grid-based methods for estimating value at risk of fixed-income portfolios. Federal Reserve Board Finance and Economics Discussion Series 2000-25.
Golub, B. W., & Tilman, L. M. (2000). Risk management: approaches for fixed income markets. New York: Wiley.
Greenspan, A. (2000). Speech at the 36th Annual Conference on Bank Structure and Competition of the Federal Reserve Bank of Chicago. Reprinted as "Greenspan's Plea for Stress Testing." Risk, 13, 53-55.
Hull, J. C., & White, A. W. (1998, Fall). Incorporating volatility updating into the Historical Simulation method for value at risk. Journal of Risk, 1(1), 5-19.
Jamshidian, F., & Zhu, Y. (1997, January). Scenario simulation: theory and methodology. Finance and Stochastics, 1(1), 43-67.
Litterman, R. (1996). Hot Spots and hedges. Journal of Portfolio Management, 23, 52-75 (December special issue).
Neftci, S. N. (2000, Spring). Value at risk calculations, extreme events, and tail estimation. Journal of Derivatives, 7(3), 23-37.
Parisi, F. (2000). Extreme value theory and Standard & Poor's ratings. ABS Research Special Report. New York: Standard & Poor's.
Pritsker, M. (1997, October/December). Evaluating value at risk methodologies: accuracy versus computational time. Journal of Financial Services Research, 12(2/3), 201-242.