Clear Sight

Introduction to Loss Distribution Approach

Abstract

This paper introduces a modern operational risk management technique under the Advanced Measurement Approach. The advantages of the Loss Distribution Approach set it apart from the alternative approaches. The paper identifies the preferred statistical tools and the scenarios in which each is needed. The two key components of LDA, frequency and severity, are fitted to discrete and continuous distributions respectively, both to represent irregularly scattered data and to cover the gaps left by incomplete historical loss data. An Aggregated Loss Distribution is then generated from the historical frequency and severity distributions to calculate the Operational Value at Risk (OpVaR).

Inside this Issue
1. Introduction to Loss Distribution Approach
2. Frequency of Loss
3. Severity of Loss
4. Aggregation of Frequency and Severity Distribution
5. References

Prepared by Risk Advisory Team
BenchMatrix Private Limited
October 2010
Introduction to Loss Distribution Approach

The nature of operational risk is highly bank-specific and therefore demands the development of more complex quantitative and qualitative solutions. The Loss Distribution Approach (LDA) is the most sophisticated approach under the Advanced Measurement Approach for calculating an optimal capital charge against an organization's operational risk. LDA uses statistical tools to generate probability distributions for frequency and severity, from which the Operational Value at Risk (OpVaR) is calculated at a given confidence level.

Generally, a data set of loss events consists of a limited number of incidents that occurred over a short period. This results in a scattered pattern, with gaps where possible but yet-to-emerge loss events would fall. To account for such unobserved but possible losses, the loss data is fitted to a probability distribution. Many probability distribution functions could represent the data; the one that fits the data best is selected. The loss data is analyzed with respect to the frequency and the severity of events. Frequency is the number of events of a particular type that occur during a period, whereas severity is the amount of loss incurred due to a particular event.

Advantages

1. Focused Attention
Since loss data shows an organization exactly where its main problem areas lie, it becomes easy to address those particular operational issues directly.

2. Determining Capital Charge
Operational Value at Risk (OpVaR) is an important benchmark for measuring the operational risk capital charge under the advanced measurement approaches. It is a monetary metric giving the maximum loss expected to occur over a pre-specified time horizon at a pre-specified confidence level.

3. Risk Type Comparisons
Comparison of various types of risk, such as external fraud risk with natural disaster risk, becomes possible because it can be conducted via a uniform metric, OpVaR.

4. Capital Incentive
Unlike the Basic Indicator and Standardized approaches, LDA provides an individualized operational risk capital charge that depends on the bank's risk profile. This rewards low-risk banks with a lower operational risk capital charge.

2010-11 BenchMatrix Pvt. Ltd, All rights reserved - Risk Advisory Team
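The "best fit" selection described above can be sketched as a maximum-likelihood comparison between candidate frequency distributions. The annual loss counts below are hypothetical, for illustration only; the distributions compared (Poisson and geometric) are two of the candidates named later in this paper.

```python
import math

counts = [0, 2, 1, 3, 1, 0, 2, 1, 1, 2]  # hypothetical annual loss counts

def loglik_poisson(data, lam):
    """Log-likelihood of the data under a Poisson(lam) model."""
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in data)

def loglik_geometric(data, p):
    """Log-likelihood under a geometric model on {0, 1, 2, ...}: P(N=k) = (1-p)^k * p."""
    return sum(k * math.log(1 - p) + math.log(p) for k in data)

mean = sum(counts) / len(counts)
ll_pois = loglik_poisson(counts, mean)              # Poisson MLE: lambda = sample mean
ll_geom = loglik_geometric(counts, 1 / (1 + mean))  # geometric MLE: p = 1 / (1 + mean)
best = "Poisson" if ll_pois > ll_geom else "geometric"
print(f"Poisson LL = {ll_pois:.2f}, geometric LL = {ll_geom:.2f} -> choose {best}")
```

In practice, formal goodness-of-fit tests (such as chi-square or Kolmogorov-Smirnov) are used alongside likelihood comparisons before a distribution is selected.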
Frequency of Loss

The frequency of events in a period is a discrete variable and is fitted to a discrete probability distribution function. The most common discrete distributions used for frequency data are the Poisson, negative binomial, geometric, and binomial distributions; of these, the Poisson distribution is used most often.

The Poisson distribution gives the probability that a certain number of events arrive within a fixed time interval. A Poisson process assumes a constant mean arrival rate and is therefore often called a homogeneous Poisson process. To fit the Poisson distribution to data, one only needs to estimate the mean number of events in a pre-specified time interval. Figure 1 shows a Poisson frequency distribution with a mean of 1 event per year, which has the following probability mix: 0 events: 36.8%; 1 event: 36.8%; 2 events: 18.4%; 3 events: 6.1%; 4 events: 1.5%; 5 or more events: 0.4%. More advanced models admit a random or time-dependent intensity rate; these form the basis of Cox processes, or non-homogeneous Poisson processes.

Severity of Loss

Severity is the amount of loss incurred due to a loss event. Since the loss amount can take any value, it is modeled with a continuous distribution. The impact of losses ranges from very small to catastrophic, and small losses generally occur far more frequently than the rare, high-impact ones. Because of this specific nature of operational losses, distribution functions that are defined only on positive values of the underlying random variable and are right-skewed are best suited to represent severity. The exponential, lognormal, Weibull, gamma, beta, Pareto, and Burr distributions, as well as combinations of these, have the required properties. Figure 2 illustrates a right-skewed lognormal severity distribution, with the probability of a loss amount on the y-axis and the loss amount ranges on the x-axis.
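The probability mix quoted for Figure 1 follows directly from the Poisson probability mass function, P(N = k) = e^(-lambda) lambda^k / k!. A minimal sketch reproducing those percentages for a mean of 1 event per year:

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    """P(N = k) for a Poisson-distributed event count with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 1.0  # mean of 1 event per year, as in Figure 1
for k in range(5):
    print(f"{k} events: {poisson_pmf(k, lam):.1%}")
tail = 1.0 - sum(poisson_pmf(k, lam) for k in range(5))
print(f"5+ events: {tail:.1%}")
# prints 36.8%, 36.8%, 18.4%, 6.1%, 1.5%, and 0.4% for the tail
```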
High-impact but rare events are given special consideration because of their significant effect on the calculation of the capital charge against operational risk, as such losses usually occupy the right tail of the distribution. Most studies suggest that heavy-tailed loss distributions (such as the lognormal or Pareto) best describe operational loss magnitudes.
Aggregation of Frequency and Severity Distribution

The Loss Distribution Approach emphasizes the significance of both the frequency and the severity of operational losses in the operational risk modeling process. When the frequency process for a particular timeframe is combined with the severity process, one obtains the compound process for that timeframe. This compound loss, also known as the aggregated loss, is used to calculate OpVaR. Figure 3 illustrates the frequency and severity aggregation process.

Among the direct computation approach, the Monte Carlo approach, Panjer's recursive method, and the inversion method, the Monte Carlo approach is the most common and sophisticated way to aggregate the loss distributions. It generates a large number of simulations, drawing a loss count from the frequency distribution and the corresponding loss amounts from the severity distribution, so as to cover the most probable as well as the high-impact future events. OpVaR is then read off the simulated aggregate loss distribution at a pre-specified confidence level; a sufficiently high confidence level is necessary to ensure a safe capital charge.

In 2001, the BIS suggested that banks should make provisions against expected losses (EL) and should hold a capital charge against operational risk to cover unexpected losses (UL).[1] The expected loss is deducted from the reported income of that particular year because some banking activities have likely and expected loss events (such as credit card frauds).

[1] See the discussion in BIS (2001a).
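The Monte Carlo aggregation described above can be sketched as follows. All parameter values here (lambda = 5 losses per year, lognormal log-mean 10 and log-standard-deviation 2, a 99.9% confidence level) are hypothetical assumptions for illustration, not values from this paper.

```python
import math
import random

def draw_poisson(rng: random.Random, lam: float) -> int:
    """Poisson sample via Knuth's method (adequate for the small means used here)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_opvar(lam: float, mu: float, sigma: float,
                   n_sims: int = 100_000, confidence: float = 0.999,
                   seed: int = 1) -> dict:
    """Monte Carlo aggregation: Poisson frequency compounded with lognormal severity."""
    rng = random.Random(seed)
    annual = []
    for _ in range(n_sims):
        n_events = draw_poisson(rng, lam)                 # frequency draw for one year
        annual.append(sum(rng.lognormvariate(mu, sigma)   # one severity draw per event
                          for _ in range(n_events)))
    annual.sort()
    opvar = annual[int(confidence * n_sims)]  # aggregate-loss quantile at the confidence level
    el = sum(annual) / n_sims                 # expected loss, covered by provisions
    return {"OpVaR": opvar, "EL": el, "UL": opvar - el}  # UL is covered by capital

result = simulate_opvar(lam=5.0, mu=10.0, sigma=2.0)
print(result)
```

Note the split in the return value: the expected loss (EL) is the mean of the simulated aggregate losses, while the unexpected loss (UL) is the distance from that mean to the OpVaR quantile, matching the BIS distinction between provisions and capital charge.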
References

Akkizidis, I. S., & Bouchereau, V. (2005). Guide to Optimal Operational Risk & Basel II. Auerbach Publications.

Chernobai, A. S., Rachev, S. T., & Fabozzi, F. J. (2007). Operational Risk: A Guide to Basel II Capital Requirements, Models, and Analysis. John Wiley & Sons.

Dahen, H., & Dionne, G. (2008, October). Determination of the Operational Value at Risk of a Canadian Bank. Retrieved September 2010, from http://efmaefm.org/0efmsymposium/nantes%202009/paper/dionne.pdf

Frequency Distribution Chart. (n.d.). Retrieved September 2010, from Tutor Vista: http://www.tutorvista.com/math/frequency-distribution-chart

Investopedia. (n.d.). Kurtosis. Retrieved October 2010, from Investopedia: http://www.investopedia.com/terms/k/kurtosis.asp

Navarrete, E. (n.d.). Practical Calculation of Expected and Unexpected Losses in Operational Risk by Simulation Methods. Retrieved August 2010, from http://www.palisade.com/downloads/pdf/calculationofexpectedandunexpectedlossesinoperationalrisk.pdf

Society of Actuaries. (2010, July). Retrieved October 2010, from http://www.soa.org/files/pdf/researchnew-approach.pdf
About this publication

This publication has been produced by BenchMatrix Private Limited with the intention of creating interest in, improving understanding of, and spreading knowledge about the advanced measurement approach among operational risk managers. We strongly believe in sharing knowledge to raise the standard of the industry, and we would appreciate any feedback that helps develop a better understanding of this subject.

For feedback or inquiries, you can contact:

Waqas Zafar - waqas.zafar@benchmatrix.com
Hunain Ahmed - hunain.ahmed@benchmatrix.com

BenchMatrix Private Limited
B-904, 9th Floor, Lakson Square, Building No. 3,
Sarwar Shaheed Road, Karachi, Pakistan
T: +92 21 35620948
E: explore@benchmatrix.com
W: www.benchmatrix.com