Risk Measurement in Credit Portfolio Models
9th DGVFM Scientific Day, 30 April 2010
Quantitative Risk Management

Profit-and-loss (P&L) distributions are very complex. Appropriate summary statistics are needed:
- Standardization facilitates communication
- Simple tools for sensitivity analysis
- Basis for capital regulation of financial firms

"Capital regulation is the cornerstone of bank regulators' efforts to maintain a safe and sound banking system, a critical element of overall financial stability." (Ben S. Bernanke, 2006)

This requires appropriate ways to measure the downside risk of financial positions.
Bailed out by the Taxpayer
Outline
(i) Review of risk measures
- The industry standard: Value at Risk
- Convex risk measures
- An example: utility-based shortfall risk
(ii) Implementation for credit portfolios
- Monte Carlo methods
- Exponential twisting
- Stochastic approximation
Value at Risk
Value at Risk in the Media

"David Einhorn, who founded Greenlight Capital, a prominent hedge fund, wrote not long ago that VaR was 'relatively useless as a risk-management tool and potentially catastrophic when its use creates a false sense of security among senior managers and watchdogs. This is like an air bag that works all the time, except when you have a car accident.' Nassim Nicholas Taleb, the best-selling author of The Black Swan, has crusaded against VaR for more than a decade. He calls it, flatly, 'a fraud.'" ("Risk Mismanagement", New York Times, January 2, 2009)
The Industry Standard: Value at Risk

Value at Risk at level λ:
VaR_λ(X) = inf{ m ∈ R : P[m + X < 0] ≤ λ }
This is the smallest monetary amount that has to be added to a financial position such that the probability of a loss becomes at most λ.

Drawbacks of Value at Risk:
- does not account for the size of extremely large losses
- does not encourage diversification
This motivates an axiomatic analysis of risk measures.
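The definition above translates directly into an empirical estimator: given a sample of P&L scenarios, VaR_λ is the smallest m such that at most a fraction λ of the scenarios still shows a loss after adding m. A minimal sketch (the two-point P&L distribution below is purely illustrative):

```python
import numpy as np

def var_empirical(pnl, lam):
    """Empirical Value at Risk at level lam.

    VaR_lam(X) = inf{ m : P[m + X < 0] <= lam }.
    For a sample x_1..x_n this is the smallest m such that the fraction
    of scenarios with m + x_k < 0 is at most lam.
    """
    losses = np.sort(-np.asarray(pnl))[::-1]  # losses, largest first
    n = len(losses)
    k = int(np.floor(lam * n))                # number of loss scenarios we may ignore
    return losses[k]                          # the (k+1)-th largest loss

# toy P&L sample: position loses 1 with prob 5%, gains 1 otherwise
rng = np.random.default_rng(0)
x = np.where(rng.random(100_000) < 0.05, -1.0, 1.0)
print(var_empirical(x, 0.01))  # capital is required at the 1% level
print(var_empirical(x, 0.10))  # at the 10% level the 5% loss event is ignored
```

At level 1% the rare loss still binds and the estimator returns 1; at level 10% the entire 5% loss event may be ignored and the estimator returns -1, illustrating the first drawback listed above.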
Value at Risk: Diversification

X_i = +1 with probability 50%, −1 with probability 50% (i = 1, 2).
The Value at Risk of each X_i at level 50% is −1.
If X_1 and X_2 are stochastically independent, then
X_1 + X_2 = +2 with probability 25%, 0 with probability 50%, −2 with probability 25%.
The VaR at level 50% of the diversified position is 0, i.e. larger than the sum of the individual VaRs: VaR penalizes diversification here.
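For a finite distribution the infimum in the VaR definition is attained at one of the points −x with x an atom, so the coin-flip example can be checked exactly by enumeration:

```python
from itertools import product

def var_discrete(dist, lam):
    """VaR_lam for a finite distribution dist = {outcome: probability}:
    the smallest candidate m = -x with P[m + X < 0] <= lam."""
    candidates = sorted(-x for x in dist)  # P[X < -m] only jumps at these points
    for m in candidates:
        if sum(p for x, p in dist.items() if m + x < 0) <= lam:
            return m
    return candidates[-1]

x_single = {-1.0: 0.5, 1.0: 0.5}
print(var_discrete(x_single, 0.5))  # VaR of one coin flip at level 50%

# X1 + X2 for independent copies: -2 (25%), 0 (50%), +2 (25%)
x_sum = {}
for a, b in product([-1.0, 1.0], repeat=2):
    x_sum[a + b] = x_sum.get(a + b, 0.0) + 0.25
print(var_discrete(x_sum, 0.5))     # VaR of the diversified position
```

The two printed values reproduce the slide: −1 for each single coin flip, 0 for the diversified sum.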
Value at Risk: Large Losses

X_1 = +1 with probability 99%, −1 with probability 1%.
X_2 = +1 with probability 99%, −10^10 with probability 1%.
The VaR of both positions at level 1% is −1: VaR is blind to the size of the loss inside the 1% event.
Axiomatic Theory
Static Risk Measures

Risk measures ρ : X → R
- Monotonicity: if X ≤ Y, then ρ(X) ≥ ρ(Y).
- Cash invariance: if m ∈ R, then ρ(X + m) = ρ(X) − m.

Capital requirement
A position X ∈ X is acceptable if ρ(X) ≤ 0. The collection A of all acceptable positions is the acceptance set. ρ is a capital requirement, i.e.
ρ(X) = inf{ m ∈ R : X + m ∈ A }.
Diversification

Quasiconvexity: ρ(αX + (1−α)Y) ≤ max(ρ(X), ρ(Y)) for α ∈ [0, 1].
For cash-invariant risk measures this is equivalent to
Convexity (Föllmer & Schied, 2002): ρ(αX + (1−α)Y) ≤ αρ(X) + (1−α)ρ(Y) for α ∈ [0, 1].

Geometric property of the acceptance set: ρ convex ⇔ A convex.
A Better Risk Measure: Utility-Based Shortfall Risk (UBSR)

Let l : R → R be a convex loss function and z an interior point of the range of l. The acceptance set is defined as
A = { X ∈ L∞ : E_P[l(−X)] ≤ z }.
A induces a convex risk measure ρ:
ρ(X) = inf{ m ∈ R : X + m ∈ A }.

Simple formula: the shortfall risk ρ(X) is given by the unique root s* of the function
f(s) := E[l(−X − s)] − z.
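Since f is nonincreasing in s for a nondecreasing loss function, the root can be found by bisection on a Monte Carlo estimate of f. A minimal sketch; the exponential loss l(y) = exp(βy) used below is an assumption made for illustration (it admits a closed-form root to check against), not the loss function used later in the talk:

```python
import numpy as np

def ubsr(x, loss, z, lo=-100.0, hi=100.0, tol=1e-8):
    """Utility-based shortfall risk of a P&L sample x:
    the root s* of f(s) = E[loss(-X - s)] - z, located by bisection."""
    f = lambda s: np.mean(loss(-x - s)) - z
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)  # f is nonincreasing
    return 0.5 * (lo + hi)

# illustration with the exponential loss l(y) = exp(beta*y), for which
# the root is available in closed form: s* = log(E[exp(-beta*X)] / z) / beta
beta, z = 1.0, 0.5
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200_000)
s_num = ubsr(x, lambda y: np.exp(beta * y), z)
s_exact = np.log(np.mean(np.exp(-beta * x)) / z) / beta
print(s_num, s_exact)  # agree up to the bisection tolerance
```

The bisection and the closed form operate on the same sample, so they agree to numerical tolerance regardless of Monte Carlo noise.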
Example: UBSR

[Figure: VaR_0.05, AVaR_0.05, and utility-based shortfall risk with loss functions x, x^{3/2}, x^2 (p ∈ {1, 3/2, 2}) and z = 0.3, plotted as functions of µ for a mixture of a Student t distribution (weight 0.96) and a Gaussian with mean µ (weight 0.04).]
Characterization Theorem

Shortfall risk has convex acceptance and rejection sets on the level of distributions; on L∞ it is the only risk measure with this property.

Distribution-based risk measures
Let M_1,c(R) be the space of probability measures on R. Distribution-based risk measures can be interpreted as functionals on M_1,c(R).

Acceptance and rejection sets
An acceptance set on the level of probability distributions can be defined by
N_ρ = { µ ∈ M_1,c(R) : ρ(µ) ≤ 0 }.
Characterization Theorem (cont.)

Theorem 1 (W., 2006). Let ρ be a distribution-based risk measure. Assume there exists x ∈ R with δ_x ∈ N such that for all y ∈ R with δ_y ∈ N^c,
(1 − α)δ_x + αδ_y ∈ N
for sufficiently small α > 0. Then the following statements are equivalent:
(i) N is φ-weakly closed for some gauge function φ : R → [1, ∞), and N and N^c are both convex.
(ii) For some left-continuous loss function l : R → R and a scalar z ∈ R in the interior of the convex hull of the range of l:
N = { µ ∈ M_1,c(R) : ∫ l(−x) µ(dx) ≤ z }.
Implementation in Credit Risk Models
Better Manage Your Risks!
Credit Portfolios

One-period model with time points t = 0, 1. Financial positions at t = 1 are modeled as random variables.

Credit portfolio losses
- Portfolio with m positions (obligors)
- The random loss at time 1 due to a default of obligor i = 1, 2, ..., m is denoted by l_i
- The total loss is L = Σ_{i=1}^m l_i
- Typical decomposition: l_i = v_i · D_i with exposure v_i and default indicator D_i ∈ {0, 1}
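The decomposition L = Σ v_i D_i can be sampled in a few lines. A minimal sketch with independent defaults; the exposures and default probabilities below are made-up illustration values:

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical portfolio: m obligors with exposures v_i and default probs p_i
v = np.array([10.0, 5.0, 8.0, 2.0, 20.0])
p = np.array([0.01, 0.05, 0.02, 0.10, 0.005])

def sample_losses(n):
    """Draw n scenarios of L = sum_i v_i * D_i with independent D_i ~ Bernoulli(p_i)."""
    D = rng.random((n, len(v))) < p      # default indicators, one row per scenario
    return D.astype(float) @ v           # total loss per scenario

L = sample_losses(1_000_000)
print(L.mean(), (v * p).sum())           # MC mean vs exact E[L] = sum_i v_i p_i
```

The sanity check compares the Monte Carlo mean against the exact expectation Σ v_i p_i, which holds regardless of the dependence structure among the D_i.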
Credit Portfolios (2)

The framework above is completely general (as long as we focus on one-period credit loss models). Risk assessment in practice requires specific models that need to be estimated/calibrated and evaluated.

Examples
- CreditMetrics (JP Morgan): based on the normal copula
- CreditRisk+ (Credit Suisse): Poisson mixture model
- Copula models such as the t-copula model: a general family; Gaussian mixtures like the t-copula are particularly tractable
CreditMetrics

JP Morgan's CreditMetrics is a simplistic toy model: its dependence structure is based on a Gaussian copula, chosen ad hoc, and the model exhibits no tail dependence. In this sense CreditMetrics plays a role for credit risk comparable to Black-Scholes for option pricing.

CreditMetrics is also called the normal copula model (NCM). The NCM can serve as a basis for Gaussian mixture models such as the t-copula model, and risk estimation techniques that work in the NCM can often be extended to Gaussian mixture models and other models.
Monte Carlo Simulation

The shortfall risk ρ(X) of a position X with loss L = −X is given by the unique root s* of the function
f(s) := E[l(L − s)] − z.

Efficient computation
- Variance reduction techniques increase the accuracy/rate of convergence, e.g. importance sampling (Dunkel & W., 2007)
- Stochastic approximation (Dunkel & W., 2009)
Normal Copula Model

Model of the overall losses of a credit portfolio over a fixed time horizon. The losses L = −X ≥ 0 are given by
L = Σ_{i=1}^m v_i D_i.
- Default indicators: D_i = 1_{Y_i > y_i}
- Marginal default probabilities: p_i = P[D_i = 1]
- m-dimensional normal factor with standardized marginals: Y = (Y_1, Y_2, ..., Y_m)
- Threshold levels: y_i = Φ^{-1}(1 − p_i)
Normal Copula Model (continued)

In industry applications the covariance matrix of the Gaussian vector Y is often specified through a factor model:
Y_i = A_{i0} ε_i + Σ_{j=1}^d A_{ij} Z_j,   i = 1, ..., m,  d < m,
where
- A_{i0}^2 + A_{i1}^2 + ... + A_{id}^2 = 1, A_{i0} > 0, A_{ij} ≥ 0,
- Z_1, ..., Z_d are d independent standard normal random variables (systematic risks), and
- ε_1, ..., ε_m are m independent standard normal random variables, independent of Z_1, ..., Z_d (idiosyncratic risks).
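The NCM with a single systematic factor (d = 1) can be sketched as follows; the homogeneous portfolio parameters (p_i = 2%, unit exposures, loadings A_{i1} = 0.5) are illustrative assumptions, not values from the talk:

```python
import numpy as np
from statistics import NormalDist  # stdlib inverse Gaussian CDF

rng = np.random.default_rng(3)
m, n = 100, 50_000
p = np.full(m, 0.02)                 # marginal default probabilities
v = np.ones(m)                       # unit exposures
A1 = np.full(m, 0.5)                 # systematic loadings A_i1
A0 = np.sqrt(1.0 - A1**2)            # idiosyncratic weights, so Var(Y_i) = 1
y = NormalDist().inv_cdf(1.0 - p[0]) # threshold y_i = Phi^{-1}(1 - p_i)

Z = rng.standard_normal(n)           # systematic factor (d = 1)
eps = rng.standard_normal((n, m))    # idiosyncratic risks
Y = A0 * eps + np.outer(Z, A1)       # Y_i = A_i0 eps_i + A_i1 Z
L = (Y > y).astype(float) @ v        # defaults D_i = 1{Y_i > y_i}, loss per scenario

print(L.mean(), (v * p).sum())       # E[L] is unaffected by the copula
print(np.mean(L >= 10))              # tail probability P[L >= 10]
```

Note that the copula changes the tail of L but not its mean, which is why the comparison against Σ v_i p_i remains a valid sanity check.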
Importance Sampling

First task: estimate E_P[l(L − s)] = E_P[h(L)] with h(L) := l(L − s).

Two-step variance reduction:
(i) importance sampling for L conditional on the factor Z;
(ii) variance reduction for the factor Z.
Special Case: Independent Default Events

In the factor model, independence corresponds to A_{i0} = 1 and A_{ij} = 0 for i = 1, ..., m, j = 1, ..., d.

Importance sampling: if Q is an equivalent probability measure with a density of the form
dQ/dP = g(L),
then
E_P[h(L)] = E_Q[h(L)/g(L)].
Sampling L_k independently from the distribution of L under Q, we obtain an unbiased, consistent estimator of E_P[h(L)]:
J_n^g = (1/n) Σ_{k=1}^n h(L_k)/g(L_k).
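A minimal sketch of the identity E_P[h(L)] = E_Q[h(L)/g(L)]. For a homogeneous portfolio (all p_i = p, v_i = 1) and a measure Q that raises every default probability to q, the likelihood ratio depends on the sample only through L, as required. All parameters (m = 20, p = 1%, q = 20%, the rare-event payoff h) are illustrative assumptions:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(4)
m, p, q, n = 20, 0.01, 0.20, 100_000   # sample defaults under Q with prob q > p

def h(L):
    return (L >= 5).astype(float)       # rare-event payoff, here 1{L >= 5}

K = (rng.random((n, m)) < q).sum(axis=1)                  # L = K (unit exposures)
g = (q / p) ** K * ((1 - q) / (1 - p)) ** (m - K)         # dQ/dP as a function of L
est = np.mean(h(K) / g)                                   # J_n^g

exact = sum(comb(m, k) * p**k * (1 - p)**(m - k) for k in range(5, m + 1))
print(est, exact)                       # IS estimate vs exact P[L >= 5]
```

Under P the event {L >= 5} has probability of order 10^-6, so naive sampling with n = 100,000 would typically see it zero times; under Q it occurs in roughly a third of the scenarios and the weighted estimator is accurate to well under a percent.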
Exponential Twisting

SR loss function: suppose that l(x) = γ^{-1} x^γ 1_{[0,∞)}(x) is polynomial.

Measure change: consider the class of probability measures Q_θ, θ ≥ 0, with
dQ_θ/dP = exp(θL − ψ(θ)),
where ψ(θ) = log E[exp(θL)] = Σ_{i=1}^m log[1 + p_i(e^{θ v_i} − 1)].

Optimal measure change: minimize an upper bound on the L²-error of the estimator of E_P[l(L − s)]. This suggests an optimal twisting parameter θ_s.
Exponential Twisting (continued)

(i) Calculate
q_i(θ_s) := p_i e^{v_i θ_s} / (1 + p_i(e^{v_i θ_s} − 1)).
(ii) Generate m Bernoulli random numbers D_i ∈ {0, 1} such that D_i = 1 with probability q_i(θ_s).
(iii) Calculate
ψ(θ_s) = Σ_{i=1}^m log[1 + p_i(e^{θ_s v_i} − 1)]
and L = Σ_{i=1}^m v_i D_i, and return the estimator
l(L − s) exp[−θ_s L + ψ(θ_s)].
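Steps (i)-(iii) can be sketched directly for a small portfolio. The 5-obligor parameters and the twisting parameter θ_s below are hypothetical (the optimization of θ_s is omitted); with only 2^5 default patterns, the twisted estimator can be validated against exact enumeration:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)
v = np.array([1.0, 2.0, 1.5, 0.5, 3.0])            # exposures (illustrative)
p = np.array([0.01, 0.02, 0.015, 0.05, 0.005])     # default probabilities
gamma, s, theta = 2.0, 2.0, 1.5                    # loss exponent, level s, theta_s

def loss(x):                                       # l(x) = x^gamma / gamma on [0, inf)
    return np.where(x > 0, x, 0.0) ** gamma / gamma

q = p * np.exp(v * theta) / (1 + p * (np.exp(v * theta) - 1))  # step (i)
psi = np.log(1 + p * (np.exp(v * theta) - 1)).sum()            # psi(theta_s)

D = rng.random((100_000, len(v))) < q              # step (ii): defaults under Q_theta
L = D.astype(float) @ v
est = np.mean(loss(L - s) * np.exp(-theta * L + psi))          # step (iii)

# exact value of E_P[l(L - s)] by enumerating all 2^5 default patterns
exact = sum(loss(np.dot(d, v) - s) * np.prod(np.where(d, p, 1 - p))
            for d in product([0, 1], repeat=5))
print(est, exact)
```

Twisting makes the defaults that drive l(L − s) frequent under Q_θ and re-weights each scenario by exp(−θ_s L + ψ(θ_s)), so the estimator stays unbiased while its variance drops sharply relative to naive Monte Carlo.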
[Figure 1: MC estimates of E[(L − c)^γ 1_{L ≥ c}]/γ, i.e. SR with a piecewise polynomial loss function in the NCM, for thresholds c = 0.2, 0.3, 0.4, 0.5 as functions of the sample size n: (a) naive MC method, (b) one-step method with exponential twisting. The length of the error bars is the sample standard deviation of the estimator.]
Stochastic Approximation

Stochastic approximation methods provide more efficient root-finding techniques for shortfall risk (Dunkel & W., 2009).

Robbins-Monro algorithm
Let Ŷ_s : [0, 1] → R be such that E[Ŷ_s(U)] = f(s) for U ~ unif[0, 1]. Choose constants γ ∈ (1/2, 1], c > 0, and a starting value s_1 ∈ [a, b] ∋ s*. For n ∈ N define recursively
s_{n+1} = Π_[a,b][ s_n + (c/n^γ) Y_n ]   (1)
with
Y_n = Ŷ_{s_n}(U_n)   (2)
for a sequence (U_n) of independent, unif[0, 1]-distributed random variables, where Π_[a,b] denotes the projection onto [a, b].
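The projected recursion (1)-(2) can be sketched in a few lines. As before, the exponential loss l(y) = exp(βy) and the standard normal position X are illustrative assumptions chosen so that the limit s* is available in closed form; one fresh scenario is drawn per step:

```python
import numpy as np

rng = np.random.default_rng(6)
beta, z = 1.0, 0.5
a, b = -10.0, 10.0                   # projection interval containing s*
c, gamma = 1.0, 0.7                  # step-size constants, gamma in (1/2, 1]

s = 0.0                              # starting value s_1 in [a, b]
for n in range(1, 100_001):
    X = rng.standard_normal()        # one fresh scenario per step
    Y = np.exp(beta * (-X - s)) - z  # Y_n with E[Y_n | s_n] = f(s_n)
    s = np.clip(s + c / n**gamma * Y, a, b)  # projected update (1)-(2)

# closed-form root for the exponential loss and X ~ N(0,1):
# s* = log(E[exp(-beta*X)] / z) / beta, with E[exp(-beta*X)] = exp(beta^2/2)
s_star = (beta**2 / 2 - np.log(z)) / beta
print(s, s_star)
```

Since f is nonincreasing, a positive Y_n (s too small) pushes the iterate up and vice versa; the projection keeps occasional large noise realizations from throwing the iterate out of [a, b].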
Stochastic Approximation (2)

Averaging procedure
Theorem 2. Suppose that γ ∈ (1/2, 1). For arbitrary ρ ∈ (0, 1) define
s̄_n = (1/(ρn)) Σ_{i=(1−ρ)n}^{n} s_i.
Then s̄_n → s* P-almost surely. For every ε > 0 there exists another process ŝ such that P(s̄_n = ŝ_n for all n) ≥ 1 − ε and
√(ρn) (ŝ_n − s*) ⇒ N(0, σ²(s*)/f′(s*)²).

The optimal rate and asymptotic variance are guaranteed; finite-sample properties are usually good.
Stochastic Approximation (3)

[Figure: empirical PDF of √(ρn)(s̄_n − s*) for the Polyak-Ruppert averaging algorithm with N = 10000 runs, c = 100, γ = 0.7, ρ = 0.1, at n = 60, 3·10², 10⁴: √(ρn)(s̄_n − s*) is asymptotically normal (simulation of UBSR with polynomial loss function in the NCM with IS).]
Conclusion
Conclusion

(i) Axiomatic theory of risk measures
- VaR is not a good risk measure
- Better risk measures have been designed, e.g. utility-based shortfall risk
(ii) Implementation in credit portfolio models
- Importance sampling
- Stochastic approximation
Further Research
- Comparison of sample average approximation with stochastic approximation
- Extension of the proposed techniques to a larger class of risk measures, e.g. optimized certainty equivalents
- Adjustments for liquidity risk
- Dynamic risk measurement procedures and their numerical implementation
Thank you for your attention!