A practitioner's guide to the Advanced Measurement Approach to operational risk under Basel II


A practitioner's guide to the Advanced Measurement Approach to operational risk

Prepared by Tim Jenkins, Jason Slade and Arthur Street

Presented to the Institute of Actuaries of Australia 2005 Biennial Convention, 8 May to 11 May 2005

This paper has been prepared for the Institute of Actuaries of Australia's (Institute) 2005 Biennial Convention. The Institute Council wishes it to be understood that opinions put forward herein are not necessarily those of the Institute and the Council is not responsible for those opinions.

Copyright of this paper is owned by PricewaterhouseCoopers. The Institute will ensure that all reproductions of the paper acknowledge the author/s as the author/s, and include the above copyright statement.

The Institute of Actuaries of Australia
Level 7 Challis House, 4 Martin Place
Sydney NSW 2000 Australia
insact@actuaries.asn.au

A practitioner's guide to the Advanced Measurement Approach to operational risk

Tim Jenkins, Jason Slade and Arthur Street

Abstract

The paper introduces actuaries intending to practice in this field to some of the issues and methods involved in implementing the Basel II Advanced Measurement Approach to operational risk in a bank. Although it is unavoidable that judgment-based assessments supported by only limited data must be used, Basel II nevertheless requires a bank to follow the discipline of applying formal actuarial and statistical methods. This is made all the more difficult because the bank must arrive at its operational risk capital requirement (i.e. a measure of the adverse tail of the aggregate loss distribution) with a very high degree of confidence. Recognising that the modelling involved carries an unavoidable element of uncertainty, especially in the adverse tails of loss distributions, the paper discusses how to approach the task of maintaining the necessary balance between theory and practice, a requirement that well suits the practical training and experience of the actuary.

tim.jenkins@au.pwc.com, jason.slade@au.pwc.com, arthur.street@au.pwc.com

1 Introduction

In June 2004, after extensive consultation, the Basel Committee on Banking Supervision of the Bank for International Settlements ("the Committee") released its report titled International Convergence of Capital Measurement and Capital Standards, also known as Basel II ("the Revised Framework"). This Revised Framework builds upon and retains key elements of the banking capital adequacy framework of the 1988 Accord; the basic structure of the 1996 Market Risk Amendment regarding the treatment of market risk; and the definition of eligible capital. Basel II looks to an improved banking capital adequacy framework that rests on three pillars:

1. specific risk-based minimum capital requirements
2. supervisory practice over a bank's total risk, including business and strategic risk, and
3. disclosure of risk measures, methods and management.

It places emphasis on fostering continuous improvement in a bank's risk management capabilities; enhanced supervision; and greater market discipline. The intention is to raise risk consciousness and to focus attention on the links between risk, capital required and management behaviour.

The existing Accord is based on the concept of a capital ratio where the numerator represents the amount of capital a bank has available and the denominator is a measure (referred to as risk-weighted assets) of the risks faced by the bank. The resulting capital ratio must be no less than 8%. Under the Revised Framework the definition of the numerator and the minimum ratio of 8% are unchanged, but the measurement of the risks facing the bank that is reflected in the definition of risk-weighted assets will be substantially different. The 1988 Accord and 1996 Amendment cover two types of risk explicitly in the definition of risk-weighted assets, namely credit risk and market risk, where the latter includes interest rate risk, equity position risk and foreign exchange risk.
There is no change to the treatment of market risk under the Revised Framework. On the other hand there is substantial change to the treatment of credit risk and, for the first time, there is explicit treatment of operational risk that will result in a measure of operational risk being included in the denominator of a bank's capital ratio 3. While banks may use basic or standardised approaches, a feature of the Revised Framework is potentially greater use of risk and capital assessments based on a bank's own systems 2.

2 The supervisor in Australia is APRA.
3 Total risk-weighted assets are determined by multiplying the capital requirements for market risk and operational risk by 12.5 (i.e. the reciprocal of the minimum capital ratio of 8%) and adding the result to the risk-weighted assets determined under the rules for credit risk.

Under the basic approach to operational risk, a bank must hold capital at a uniform 15% of annual gross income (averaged over the previous 3 years), while under the standardised approach the aggregate of business line specific calculations is used, where the percentage varies by business line.

Table 1: Basel standardised approach

Business line | Capital as % of annual gross income
Corporate finance | 18%
Trading & sales | 18%
Retail banking | 12%
Commercial banking | 15%
Payment & settlement | 18%
Agency services | 15%
Asset management | 12%
Retail brokerage | 12%

By contrast, under the Advanced Measurement Approach (AMA) for operational risk a bank may use its own method of assessing its exposure to operational risk, so long as it is sufficiently comprehensive and systematic, and has demonstrable integrity. In particular, a bank pursuing this approach must have a measurement system that attaches a verifiable relative importance to each of internal loss data, external loss data, scenario analysis, and its business environment and control systems. The operational risk and capital measurement system must be closely integrated into the day-to-day risk management process of the bank and be capable of supporting an allocation of economic capital for operational risk in a manner that creates incentives to improve operational risk management in the business lines.

The benefit of the advanced rather than the basic or standardised approaches is that capital will better reflect a bank's own risk profile, leading to a sharper and more relevant connection between risk, capital and management behaviour. Implementation of the advanced approaches such as the AMA for operational risk is planned for year-end 2007, with an extended period of testing and parallel running ahead of that. The use of an advanced approach must be approved by the supervisor.

The aim of this paper is to introduce actuaries intending to practice in this field to some of the methods involved in implementing the Advanced Measurement Approach.
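For orientation, the basic and standardised charges are simple functions of gross income. A minimal Python sketch (the function and variable names are our own; the percentages are the 15% basic charge and the Table 1 betas):

```python
# Business-line percentages from Table 1 (the Basel II betas)
BETA = {
    "corporate finance": 0.18, "trading & sales": 0.18,
    "retail banking": 0.12, "commercial banking": 0.15,
    "payment & settlement": 0.18, "agency services": 0.15,
    "asset management": 0.12, "retail brokerage": 0.12,
}

def basic_capital(annual_gross_income_3yr):
    """Basic approach: 15% of annual gross income averaged over 3 years."""
    return 0.15 * sum(annual_gross_income_3yr) / len(annual_gross_income_3yr)

def standardised_capital(gross_income_by_line):
    """Standardised approach: sum of beta times gross income over business lines."""
    return sum(BETA[line] * gi for line, gi in gross_income_by_line.items())
```

A bank earning most of its income in, say, retail banking (beta 12%) attracts a lower standardised charge than the uniform 15% basic charge, part of the intended incentive to move towards more risk-sensitive approaches.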
Although the Revised Framework makes it clear that the Committee is not specifying the approach or distributional assumptions to be used, it is apparent from the history of the Revised Framework's development that what is contemplated involves an application of actuarial science to the derivation of an aggregate loss distribution from the respective distributions of the frequency and severity of operational loss. Good quality past loss data is scarce and often irrelevant to the prevailing position of a business line, leading to a situation in which quantitative data must be combined with judgment-based assessments of relevant loss distribution parameters made by people who know the business and the prevailing outlook for it.

Notwithstanding that the resulting data includes a substantial element of judgment, the Revised Framework nevertheless requires us to follow the discipline of applying formal actuarial and statistical methods to it. This is made all the more difficult because we must arrive at a capital requirement (i.e. a measure of the worst case outcome) with a very high degree of confidence. It is as well, therefore, to be clear from the outset about the purpose of statistical modelling in operational risk. Its purpose is twofold: to provide a conceptual and philosophical framework that draws upon what we would do if our data were strictly quantitative; and to provide a systematic calculation and information management framework to keep our approach organised and to improve the quality and relevance of data over time. However, such modelling has an unavoidable element of uncertainty and we cannot get carried away with its apparent precision, especially at the worst case outcome end of the distributions. Such models will not give a result with a known range of statistical error. Accordingly, a balance between theory and practice must be maintained in this application of actuarial science, a situation that should suit the practical training and experience of the actuary.

2 Requirements for the Advanced Measurement Approach

The Basel II AMA requirements are outlined in the Revised Framework and include the principles described below. Operational risk is defined as the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. The definition includes legal risk, but excludes strategic and reputation risk. The loss event types within this scope are defined in the Revised Framework (Annex 7).
Regulatory capital must allow for a minimum 3-year (in due course, 5-year) observation period of reliable internal loss data, as well as allow for relevant external loss data, especially with regard to infrequent yet severe losses. The bank must also use scenario analysis based on expert opinion and informed by external loss data. The bank's risk assessment must capture a forward-looking view of factors in its business environment and control systems that can change its operational risk profile.

The bank must reasonably estimate expected loss (EL) and unexpected loss (UL) based on attaching a verifiable relative importance and relevance to each of the sources of insight (i.e. internal and external loss data, scenario analysis and factors reflecting the bank-specific business environment and internal control systems). Regulatory capital will be based on the sum of EL and UL unless the bank can demonstrate that it has measured and otherwise already accounted for its EL, in which case capital can be based on UL alone. The measurement system must be sufficiently granular to capture the major drivers of operational risk affecting the shape of the tail of the loss estimates (i.e. the worst case

outcomes), and the bank must be able to demonstrate that its approach captures these worst case loss events at a confidence level of 99.9% over a one-year time horizon. A bank will be allowed to recognise the risk mitigating effect of insurance, but only up to a limit of 20% of its total operational risk capital and even then subject to a number of important conditions. Operational risk estimates for different types of risk event and business line must be added for the purpose of calculating overall regulatory capital, unless risk correlation assumptions take uncertainty (particularly in times of stress) into account and can be validated.

The measurement system must also be closely integrated into the day-to-day processes of the bank and be capable of supporting an allocation of economic capital for operational risk across business lines in a manner that creates incentives to improve business line operational risk management. The system's specifications, parameters, processes and data must be credible, transparent and verifiable; and the system must be internally consistent (and avoid double counting of risk mitigants recognised elsewhere). The bank must maintain rigorous procedures for model development and validation. While the Basel Committee recognises that analytical approaches for operational risk are still evolving, it does not appear to have recognised that the precision inherent in its requirements is incapable of being achieved because of the difficulties involved in working with what is necessarily fuzzy data.

3 Issues in the loss distribution approach to operational risk

The Revised Framework implies the use of the loss distribution approach to measuring operational risk. Frequency and severity of loss are modelled separately and then combined to arrive at the distribution of aggregate loss in a year.
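Given a set of simulated draws from such an aggregate annual loss distribution, the split into EL and UL described in the previous section is straightforward. A minimal sketch (the lognormal loss sample here is purely illustrative, and the names are our own):

```python
import numpy as np

def el_and_ul(annual_losses, confidence=0.999):
    """Expected loss is the mean of the simulated annual aggregate losses;
    unexpected loss is the gap between the 99.9% quantile and that mean."""
    el = annual_losses.mean()
    var_999 = np.quantile(annual_losses, confidence)
    return el, var_999 - el

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=3.0, sigma=1.0, size=100_000)  # illustrative only

el, ul = el_and_ul(losses)
capital = el + ul        # regulatory capital is EL + UL ...
capital_ul_only = ul     # ... unless EL is demonstrably already accounted for
```

Note that EL + UL is simply the 99.9% quantile itself; the decomposition matters only because a bank that can show its EL is already provided for may hold capital against UL alone.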
This is done by type of risk event and by business line, and then further combined across risk event types within business lines and across business lines to arrive at business line and enterprise-wide loss distributions respectively. There are two main complications inherent in this.

The first relates to the number of risk event types. Unlike more established analytical risk categories, operational risk springs from many sources 4. Because of differences in exposure, the experience arising from each event type differs from line to line, so that each business line needs to have its own loss data collected, recorded and maintained by risk type. Moreover, each of these risk event/business line cells needs assumptions about the underlying frequency and severity distributions, leading to its own assumed aggregate loss distribution.

4 Such as internal and external fraud, flaws in employment practices and workplace safety, breakdowns in the execution of transactions and processes, business disruption, defective or wrongly sold products, bad documentation, poor business practices and so on.

The product of the number of risk event types and the number of business units gives rise to a large number of such cells and hence to difficulties in consistently and reliably classifying and interpreting loss data, to potential complexity in risk and capital management, and to exposure to surprises as new and different types of failure occur. The number of cells can also make it difficult to uncover interdependencies. Another challenge is that of remapping loss data, and maintaining its integrity and applicability, each time a business line is part of an organisational restructure.

The second complication relates to lack of objective and relevant data. Even with 3 or 5 years of loss history, loss incident databases, especially internal ones, may be too recent or too inconsistent in their classification of risk events to provide a basis for model assumptions by themselves. Also, they are often insufficiently relevant to current circumstances or future outlook to be the only source of insight about future loss experience. For these reasons, substantial reliance is also placed on incidents observed in external databases and on scenarios based on the expert judgment of business managers.

This substantial reliance on subjective judgment gives rise to difficulties of consistency between business lines and over time for the same business line, of blind spots caused by blinkered vision, and of conscious or subconscious bias (e.g. having a business line viewed in a favourable light). As discussed earlier, subjectivity also means that the data on which the mathematical foundations of a model depend includes a substantial qualitative element, and its results are therefore subject to a degree of fuzziness. This fuzziness makes it difficult to know how reliable the results are, particularly when trying to assess the worst case outcomes at the high confidence levels required.
In practice these issues are addressed by keeping a focus on materiality, by using a balanced mix of data and by placing an emphasis on validating it. These are discussed in turn below.

A focus on materiality is essential in order to avoid being overwhelmed by data and to keep attention on the main drivers of risk. This means that we usually need to concentrate on some fraction of the risk cells by performing high level risk profiling to decide which event types are material for a business line. Materiality might be decided using a set of rules based on information from a number of sources: such as a survey of management on exposures, a review of internal loss experience, and a consideration of external loss data.

A balanced mix of internal loss data, external loss data and scenario analysis would then be used to estimate the operational loss distributions in material risk cells and to determine the opening operational risk capital for each business line and for the bank as a whole. The weight given to internal data, external data and scenario analysis needs to reflect the assessed credibility of each. Immaterial cells would be watched but probably not measured.

The type of scenario analysis often used as part of the data is a method that relies upon expert business judgment to arrive at assessments of the parameters (e.g. the

mean and a percentile loss frequency and severity that is relatively high but still capable of being readily contemplated, such as the 90th percentile) that define the distributions in each cell. This type of scenario analysis might be called assessment-based quantification. It allows a mathematical framework to be used, but with the caveat that the data is not strictly quantitative. There are, however, two underlying assumptions with this type of scenario analysis: firstly, that business lines are clear about their obligations to internal and external customers and hence about the resulting risks that they are responsible for controlling; and secondly, that the shape of the loss distributions is well enough known to be able to extrapolate from an assessment at a relatively high confidence level (say the 90th percentile) to a very high confidence level (i.e. the 99.9th percentile).

Scenario analysis can also describe a somewhat different technique in which systems thinking is applied to the business value chain so that scenarios can be developed and stress tests applied to consider the extent to which rare but extreme events could happen. This method, which might be called scenario-based stress testing, does not easily lend itself to being incorporated into the same framework as loss data, but it is very useful (along with external loss data) when data is scarce and when losses are infrequent but potentially severe.

Operational risk capital would be broadly updated between periodic reviews by using information from risk and control indicators to adjust the assumed loss distribution parameters. Capital and the materiality of risk cells would be reassessed from first principles at occasional periodic reviews, or in circumstances such as a major organisational restructure or if the emerging loss experience suggested it to be necessary.
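The extrapolation assumption above can be made concrete with a small example: an assessed median and 90th percentile severity fully determine a lognormal, from which the 99.9th percentile follows. A sketch (the $1m and $3m assessments are hypothetical, and the function names are our own):

```python
import math
from statistics import NormalDist

def lognormal_from_assessments(median, p90):
    """Back out lognormal (mu, sigma) from an assessed median and 90th percentile."""
    mu = math.log(median)
    sigma = (math.log(p90) - mu) / NormalDist().inv_cdf(0.90)
    return mu, sigma

def lognormal_quantile(mu, sigma, p):
    """Quantile of the fitted lognormal at probability p."""
    return math.exp(mu + sigma * NormalDist().inv_cdf(p))

mu, sigma = lognormal_from_assessments(median=1.0, p90=3.0)  # $m, hypothetical
extreme = lognormal_quantile(mu, sigma, 0.999)               # extrapolated 99.9th percentile
```

The fragility of the extrapolation is easy to demonstrate: shifting the assessed 90th percentile from $3m to $4m moves the implied 99.9th percentile from roughly $14m to roughly $28m, which is precisely why the shape assumption needs validating.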
Given the fuzzy nature of the data, it is important to recognise the need to validate (to the extent that this is possible) the judgments that are a necessary part of measuring operational risk. Each step that involves judgment should have someone suitable responsible for it, have a validation process and have an independent party responsible for validation. Validation should include checks and balances that aim for reasonable consistency and the dampening of bias. All steps in validating data should be documented, and the measurement process should be periodically reviewed.

Operational risk and capital measurement should be part of the governance of operational risk management. Governance needs to provide a sound framework that encourages a culture that properly considers risk and reward, facilitates the communication of risk to top management and the Board, and provides an independent risk management function responsible for the development and monitoring of risk policy.

4 The mathematical foundation

A mathematical model of operational risk will have significant model and parameter uncertainty. Assuming that this uncertainty and the other issues discussed in the previous section can be addressed, we move on to consider the mathematical foundations. The Consultative Document that preceded the Revised Framework described the approach as one in which the bank estimates the probability distribution functions (for each risk/business cell) for the impact of risk and the frequency of its occurrence separately. From these impact and frequency distributions the bank then computes the combined distribution for operational losses.

The general approach to establishing the initial operational risk capital is to use Monte Carlo simulation to model loss frequency and severity in each risk cell, and then to use a copula to combine the different loss types. The techniques involved in Monte Carlo simulation and in simulating combined risk with a copula are summarised in Appendices A and B respectively. In Appendix C, the simulation method in the case of a t-copula is described and the concept of tail dependence and how it is measured is introduced. In this paper, tail dependence is modelled using the t-copula with 3 degrees of freedom.

To gain an understanding of the important elements involved in applying the simulation we will illustrate the techniques used. It will be clear from the discussion earlier that we must work with data that is scant, poor or based on a subjective assessment. Recognising the limitations that this imposes, we take a rudimentary analytical approach and use simple tools with few parameters. The mathematical approach we take for each risk type begins with the compound Poisson process, namely the stochastic process

X(t) = Σ_{k=1}^{N(t)} Y_k,  t ≥ 0

where {Y_k} is a sequence of independent identically distributed random variables, and N(t) is a Poisson process with parameter λ independent of {Y_k}.
In simple terms, for each risk type, loss events occur in accordance with a Poisson process (which can be thought of as a process like buses turning up at a bus stop) and each individual loss event has some randomly determined severity associated with it. Some important properties of the compound Poisson process are summarised in Appendix D. As described in the Appendix, it is sometimes convenient to assume that the sequence {Y_k} (the severity of the compound Poisson process) follows an exponential distribution, which is characterised by a single parameter. However, this assumption is only appropriate when the standard deviation of the loss severity is of the same order as the mean severity and, in order to overcome this limitation, for the purpose of this paper we will use a lognormal distribution to represent severity. While other distributions could be used, and indeed may be necessary, in order to represent an actual operational activity, the lognormal is sufficiently versatile to illustrate the subject. We will therefore proceed by assuming that the logarithm of the severity of an individual loss is normally distributed with mean µ and variance σ².
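A minimal Monte Carlo sketch of this compound Poisson model with lognormal severities, parameterising the lognormal from a target severity mean and standard deviation (the function names and the illustrative parameters are our own):

```python
import numpy as np

def lognormal_mu_sigma(mean, sd):
    """Convert a severity mean and standard deviation into lognormal mu, sigma."""
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    return np.log(mean) - 0.5 * sigma2, np.sqrt(sigma2)

def simulate_annual_losses(lam, sev_mean, sev_sd, n_years, rng):
    """Simulate n_years draws of X = sum_{k=1}^{N} Y_k with N ~ Poisson(lam)."""
    mu, sigma = lognormal_mu_sigma(sev_mean, sev_sd)
    counts = rng.poisson(lam, size=n_years)                   # N for each year
    severities = rng.lognormal(mu, sigma, size=counts.sum())  # pooled Y_k draws
    # Sum each year's own severities by splitting at the cumulative counts
    return np.array([y.sum() for y in np.split(severities, counts.cumsum()[:-1])])

# An illustrative risk: on average 5 losses a year, each averaging $2m (sd $2m)
rng = np.random.default_rng(0)
losses = simulate_annual_losses(5.0, 2.0, 2.0, 50_000, rng)
```

The sample mean of the simulated annual losses converges on λ times the mean severity ($10m here), while empirical quantiles of the sample give the worst case outcomes discussed below.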

In our illustration we will consider a bank that faces only two types of risk 5. These are:

FM risks (frequent losses with moderate impact). These are assumed to happen around 5 times per year and each loss amount is generally around $2m, but can be higher. Occurrence is modelled using a Poisson distribution, recognising that there is a range of possible occurrences. Impact is modelled assuming a lognormal distribution (with a mean of $2m, and a standard deviation taken to be $2m to recognise the lightly skewed nature of the risk).

RS risks (rare losses but with severe impacts). These are assumed to happen around once in every 10 years and each loss amount is generally around $100m, but can be much higher. Occurrence is again modelled by a Poisson distribution and impact assuming a lognormal distribution (with a mean of $100m, and a standard deviation in this case of $200m, recognising the more heavily skewed nature of the risk).

Figure 1a below shows the cumulative distribution function of the aggregate annual loss from a single FM risk. Figure 1b shows just the worst case outcomes end of the distribution.

Figure 1a CDF of single FM risk

5 These represent major loss events; the continual noise generated from minor loss events (e.g. those with expected losses less than say $1m) is ignored in our illustration.

Figure 1b CDF of single FM risk at high cumulative probabilities

Table 2 shows the annual aggregate loss corresponding to a range of selected cumulative probabilities. 50% of the time the annual loss will be less than $8.9m and 90% of the time it will be less than $18.3m. (Note these figures are aggregate losses, so that the $18.3m of aggregate loss could be made up of 6 losses of about $3m, or 9 losses of about $2m, or any other combination for that matter.) The worst case aggregate losses at the 99% and 99.9% confidence levels are $29.9m and $41.8m respectively.

Table 2: Single FM risk (cumulative probability vs aggregate loss, $m)

Figure 2a below shows the cumulative distribution function of the aggregate annual loss from a single RS risk. Figure 2b shows just the worst case outcomes end of the distribution.

Figure 2a CDF of single RS risk

Figure 2b CDF of single RS risk at high cumulative probabilities

Table 3 shows the annual aggregate loss amount from the RS risk exposure corresponding to selected cumulative probabilities. It can be seen that for 90% of the time the aggregate annual loss is zero (i.e. the loss event does not

occur). However, the worst case aggregate losses at the 99% and 99.9% confidence levels are $232m and $856m respectively. Another way of looking at this is to say that once every 10 times this type of loss event occurs its severity is greater than $232m, and once every 100 times it occurs its severity is greater than $856m. Modelling the severity of loss events from the RS risk type using a lognormal distribution (with a mean of $100m and a standard deviation of $200m) in this way leads to the capital requirement being very sensitive to the degree of confidence targeted. In moving from a confidence level of 99% to 99.9%, the associated capital requirement more than trebles.

Table 3: Single RS risk (cumulative probability vs aggregate loss, $m)

It can be seen that the two types of risk have very different distributions, not just in size, but also in their shape. Both, however, broadly exemplify types of operational risk faced by banks and so are important for setting operational risk capital.

Combining risk types

We will consider how the capital requirements for a bank vary with the bank's exposure to different combinations of these risk types, working up to a combination of 10 FM risks and 2 RS risks.

Combining two FM types

We will begin by considering two FM risks together and then will move on to consider two RS risks. For the purposes of establishing capital requirements, we are primarily interested in the worst case outcomes (or unexpected losses). The graphical technique we use to depict the effect of the two risks in combination is a scatter diagram of the joint simulation accompanied by a density table to quantify that scatter (with relatively greater rounding of the dominant cells). Figures 3a and 3b show results for two FM risks where there is no correlation between them and where there is no tail dependence (i.e. the two FM risks are completely independent).

Figure 3a Two uncorrelated FM risks without tail dependence: scatter from simulation

Figure 3b Two uncorrelated FM risks without tail dependence: density of scatter

Figures 3c and 3d below show results where there is no correlation between the risks but there is tail dependence (meaning the risk types tend to become more interrelated in extreme circumstances).

Figure 3c Two uncorrelated FM risks with tail dependence: scatter from simulation

Figure 3d Two uncorrelated FM risks with tail dependence: density of scatter

It can be noted from these figures that even where two FM risks are uncorrelated, tail dependence between them induces more frequent occurrence of high aggregated losses from both risks at the same time.
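This effect is easy to reproduce directly. A sketch of t-copula sampling (with 3 degrees of freedom, as used in this paper; the function names are our own) showing that, even at zero correlation, joint tail exceedances occur far more often than the rate independence would imply:

```python
import numpy as np
from scipy import stats

def t_copula_sample(n, rho, df, rng):
    """Draw n pairs of uniforms linked by a bivariate t-copula."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    w = rng.chisquare(df, size=n) / df
    t = z / np.sqrt(w)[:, None]        # bivariate Student-t variates
    return stats.t.cdf(t, df)          # probability transform to uniforms

rng = np.random.default_rng(1)
u = t_copula_sample(500_000, rho=0.0, df=3, rng=rng)

# Probability that both risks land beyond their own 99th percentiles:
joint = ((u[:, 0] > 0.99) & (u[:, 1] > 0.99)).mean()
independent = 0.01 * 0.01   # the rate if the two risks were truly independent
```

Mapping each uniform margin through the inverse of the relevant aggregate loss distribution then yields jointly simulated losses for the two risks.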

Figures 4a-d develop this illustration further by examining the effect of tail dependence between two risks that have a correlation coefficient of 0.5.

Figure 4a Two correlated FM risks (ρ = 0.5) without tail dependence: scatter from simulation

Figure 4b Two correlated FM risks (ρ = 0.5) without tail dependence: density of scatter

Figure 4c Two correlated FM risks (ρ = 0.5) with tail dependence: scatter from simulation

Figure 4d Two correlated FM risks (ρ = 0.5) with tail dependence: density of scatter

As before, tail dependence has a material effect on unexpected loss and this effect is amplified by the risks being correlated.

Combining two RS risks

Figures 5a-d show the results of combining two uncorrelated RS risks both without and with tail dependence.

Figure 5a Two uncorrelated RS risks without tail dependence: scatter from simulation

Figure 5b Two uncorrelated RS risks without tail dependence: density of scatter

Figure 5c Two uncorrelated RS risks with tail dependence: scatter from simulation

Figure 5d Two uncorrelated RS risks with tail dependence: density of scatter

The frequency of joint unexpected losses in a year from RS events is remote; however, when they do happen jointly, their impact is very severe. Tail dependence increases

the frequency of this remote event happening, making it necessary to at least consider the impact of tail dependence when setting risk capital. Figures 6a-d below illustrate the impacts with and without tail dependence where the two risks are 50% correlated.

Figure 6a Two correlated RS risks (ρ = 0.5) without tail dependence: scatter from simulation

Figure 6b Two correlated RS risks (ρ = 0.5) without tail dependence: density of scatter

Figure 6c Two correlated RS risks (ρ = 0.5) with tail dependence: scatter from simulation

Figure 6d Two correlated RS risks (ρ = 0.5) with tail dependence: density of scatter

As would be expected, the joint effect of correlation and tail dependence between the two RS risks increases the incidence of extreme joint occurrences in a year, and these interdependencies need to be considered when setting risk capital.

Combining ten FM risks and two RS risks

Finally, as foreshadowed earlier, we now consider a more realistic situation, involving a bank that faces multiple risks, by combining 10 different FM risks and 2 different RS risks together and examining the effects of correlation and tail dependence in a portfolio of risks. To allow for the effects of correlations between the risks we have assumed serial pairwise correlation (i.e. risk A is related to risk B, B is related to C, etc). This technique allows for a reducing degree of correlation as risks in the series get further apart. We have considered six different scenarios for levels of interrelatedness between the risks:

1, 2: no pairwise correlation (without and with tail dependence respectively)
3, 4: 50% pairwise correlation (without and with tail dependence respectively)
5, 6: 90% pairwise correlation (without and with tail dependence respectively)

Figure 7 below shows the cumulative distribution function of the combined aggregate losses at the worst case end of the distributions for each of these scenarios. (Scenarios 1 to 6 are depicted left to right.)

Figure 7 CDF of portfolio of 10 FM & 2 RS risks at high cumulative probabilities

Table 4: Capital requirement for the portfolio of risks

Scenario | Coefficient of serial pairwise correlation | Tail dependence | Capital requirement at 99% confidence level $b | Capital requirement at 99.9% confidence level $b
1 | 0.0 | No | |
2 | 0.0 | Yes | |
3 | 0.5 | No | |
4 | 0.5 | Yes | |
5 | 0.9 | No | |
6 | 0.9 | Yes | |

This table demonstrates how both the shape of the risks in the extremes and the interaction between risks have a significant impact on the resulting capital requirement. In our example, the capital requirement increases 3-fold in moving from 99% to 99.9% confidence. In addition there is a 50% difference in the capital requirements between independent risks and highly correlated risks with tail dependence. This difference would of course be even greater if we had used 100% correlation, which is the default position under the Revised Framework if the risk correlation assumptions cannot be validated. It is also interesting to note that in this illustration the effect on the capital requirement of tail dependence is similar to that of partial correlation (with a coefficient of 0.5).

To provide additional context, Figure 8 below shows the corresponding combined aggregate loss distribution when the portfolio consists of only the two RS risks. Comparing this with Figure 7 gives a sense of what proportion of the capital requirement comes from these two rare and severe risk types alone. For example, in scenario 4, the two RS risks account for about $1.3b of the total capital requirement (at the 99.9% confidence level) of $1.6b.
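One plausible reading of serial pairwise correlation is an AR(1)-style structure in which risks i and j have correlation ρ^|i-j|; this explicit form is our assumption, since the paper states only that the correlation reduces as risks in the series get further apart. A sketch:

```python
import numpy as np

def serial_correlation_matrix(n_risks, rho):
    """Correlation rho between adjacent risks, decaying as rho**|i - j|
    for risks further apart in the series (an assumed AR(1)-style form)."""
    idx = np.arange(n_risks)
    return rho ** np.abs(idx[:, None] - idx[None, :])

R = serial_correlation_matrix(12, 0.5)   # 10 FM risks plus 2 RS risks
# The matrix must remain positive definite to be usable inside a t-copula
assert np.all(np.linalg.eigvalsh(R) > 0)
```

With ρ = 0.9 the first off-diagonal entries are 0.9, but risks ten apart are correlated at only 0.9**10, about 0.35, which captures the described reducing degree of correlation along the series.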

Figure 8: CDF of portfolio of the two RS risks at high cumulative probabilities

To complete the picture, Figure 9 shows the combined aggregate loss distribution from combining just the 10 FM risks. Again, under scenario 4, the 10 FM risks contribute only around $300m of the $2.6 billion capital requirement.

Figure 9: CDF of portfolio of the 10 FM risks at high cumulative probabilities

Figures 7, 8 and 9 show the importance of properly identifying the bank's exposure to RS type risks when assessing the bank's overall operational risk capital requirement under the AMA. In this example, properly representing tail dependence and correlation between the two RS risks is as important as including the 10 FM risks. Armed with the insights from this investigation, we now consider their application to the practical task of complying with the AMA under the Revised Framework.

5 Practical application

As set out in Section 3, the Revised Framework implies the use of the loss distribution approach to measuring operational risk. Under this approach, for each type of risk event in each business line, the frequency and severity of loss are modelled separately and then combined to arrive at separate aggregate loss distributions. These aggregate loss distributions are then combined across risk event types and business lines to arrive at business line and enterprise-wide loss distributions respectively. The number of different risk event/business line combinations (risk cells) is large (e.g. for an 8 business line bank, with 20 different subcategories of risk events, there would be 160 different risk cells). Having to compute an aggregate loss distribution for each risk cell creates a major problem, for the reasons described earlier in Section 3. In particular, it is difficult to measure the individual risk cells consistently and reliably (due to the lack of appropriate data and the high reliance on judgement) and to uncover all of the interdependencies between risk cells.

The mathematical examples in the previous section provided some important insights into what drives the AMA operational risk capital requirement under the Revised Framework in circumstances where a few RS risks are present along with a larger number of FM risks.
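The loss distribution approach just described can be sketched in a few lines of Python. All parameters here (a Poisson frequency of 10 events a year, a standard lognormal severity, the seed and the number of simulated years) are hypothetical illustrations, not the paper's calibration:

```python
import math
import random

def poisson(lam, rng):
    """Poisson variate via Knuth's multiplication-of-uniforms method."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def annual_aggregate_loss(freq, sev_mu, sev_sigma, rng):
    """One simulated year for a risk cell: frequency and severity are modelled
    separately (Poisson count, lognormal severities) and then combined."""
    n = poisson(freq, rng)
    return sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(n))

rng = random.Random(2005)
losses = sorted(annual_aggregate_loss(10.0, 0.0, 1.0, rng) for _ in range(20000))

# Capital measures are read off as high quantiles of the simulated distribution.
var_99 = losses[int(0.99 * len(losses))]
var_999 = losses[int(0.999 * len(losses))]
```

Repeating this per risk cell and combining the simulated years across cells (with a copula to impose dependence, as in Appendices B and C) gives the portfolio distribution; note how much further out the 99.9% point sits than the 99% point.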
One of the major points to come out is the degree to which the capital requirement is extremely sensitive to the shape of the tails, especially of the RS risks, the parameters for which can only be estimated in a largely subjective fashion. In the example, two-thirds of the capital requirement relates to increasing the confidence level targeted (i.e. moving up the tail) from 99% to 99.9%. This demonstrates the importance of any extrapolation that is based on subjective assessments made at lower confidence levels, such as at the 90% level.

1. Care should be taken to ensure that the process of ascribing aggregate loss distributions to the risk cells is not overly mechanistic in nature and that all assumptions made are transparent (so that it is easy to understand the level of subjectivity and its impact on the resultant capital requirement).

Another related point is that the capital requirement is primarily driven by the RS type risks. In our examples (irrespective of whether correlations and/or tail dependence are

allowed for) the two RS risks account for around 80% of the capital required (at the 99.9% level).

2. For the purposes of measuring capital requirements, the major effort should be concentrated on assessing the aggregate loss distributions for the RS event types. By definition, these types of events are rare and there would be very little (if any) appropriate data. Accordingly, there is instead heavy reliance on judgement. There may also be a natural tendency among practitioners to focus most of their efforts on measuring the FM (frequent but moderate) type risks, where the level of available data is greater and the aggregate loss distribution therefore likely to be more reliable, but this could be largely a misdirection of effort. It is important to note, however, that the suggestion to concentrate on the RS more than the FM risk cells is purely for the purposes of measuring capital under the AMA. In the ordinary course of events, it is the FM risks that should require management's day to day attention, and that provide the greater opportunity to improve an organisation's operational effectiveness and to reduce more foreseeable operational losses.

3. The judgement used to assess the RS type aggregate loss distributions should be applied as thoroughly as possible. Bearing in mind that the 90th percentile worst case for a 1 in 10 year event is only likely to happen once in 100 years, it is also necessary to recognise that subjective assessments are likely to be extremely prone to both uncertainty and unintended bias. The implied aggregate loss distributions for the RS risk cells should be played back to the business line/risk event experts to ensure that the judgement they applied does not appear unreasonable (looking at different aspects of the aggregate loss distribution in detail can help tune the judgement applied).
Finally, the judgements made and the resulting aggregate loss distributions that are adopted should be validated (to the extent possible) by appropriate experts (independent of the process).

4. Subject of course to the particular mix of risks in each case, the corollary of point 2) is that the FM risks do not drive the capital requirement to anywhere near the same extent, and comparatively less effort should be put into assessing the tails of these risk cells.

The final major point coming out is that tail dependence and correlation have a significant impact. In the example provided, the capital requirement increases from a base amount of $1.8 billion (no correlations, no tail dependence) to $2.93 billion (90% correlated with tail dependence). Importantly, the impact of moving from no correlations to a 50% coefficient of pairwise correlation adds about 25% of the base amount, and including allowance for

tail dependence adds a further 20%, taking the capital required from $1.8 billion to $2.63 billion. To elaborate on this point further, in this example the issue of understanding correlations and tail dependence among the RS risk cells is more important than whether or not allowance is made for the 10 FM risks at all.

5. When assessing the RS risk cells, significant attention should be given to the degree of inter-relatedness between the risks, particularly in the extreme scenarios. As a practical example, RS risk cells with a high degree of inter-relatedness might be considered as a whole, whereas others that appear to be largely unrelated might be considered separately.

6. As an extension to 5), and to help ensure that all of the major RS events have been covered and that appropriate allowance has been made for their inter-relatedness, a whole of bank top down view should be considered - perhaps supported by scenario based stress testing.

7. Most important of all, it should always be remembered that the level of capital required to address operational risks is the last line of defence against such risks. Risk measurement is not the same thing as risk management. The situations that could lead to the occurrence of large losses from RS type events should be explored as best they can be, and appropriate business monitoring and risk mitigation strategies employed.

Pulling all of this together, it is our view that the majority of a bank's capital requirement will generally be driven by its exposure to a handful of key RS risks. It is therefore important that a bank should identify each of these key risks, rank them so that the attention given to quantifying them is commensurate with their importance, understand its exposure to them (particularly in the extremes) and what drives this exposure, and explore how the key risks are related to each other.
In doing this, it is important that there is full transparency of the judgements made concerning their potential occurrence and impact. Equally, it is much less important for the purposes of establishing a bank's capital requirement to focus on the tails of the FM type risks, although for day to day risk management purposes understanding and addressing these risks is of course of critical importance.

Appendix A Monte-Carlo simulation

Monte-Carlo simulation is based on the following theorem (and its converse).

Theorem A: Let Y have a uniform distribution U(0, 1). Let F(x) be a continuous distribution function such that F(a) = 0 and F(b) = 1. Then the random variable X = F^(−1)(Y) is a continuous random variable with distribution function F(x).

Proof: The distribution function of X is P(X ≤ x) = P[F^(−1)(Y) ≤ x]. However, F^(−1)(Y) ≤ x is equivalent to Y ≤ F(x), so P(X ≤ x) = P[Y ≤ F(x)]. However, Y is U(0, 1), so P(Y ≤ y) = y, 0 < y < 1. Hence P(X ≤ x) = P[Y ≤ F(x)] = F(x), 0 < F(x) < 1, so that the distribution function of X is F(x).

Theorem A (converse): Let X have the continuous distribution function F(x). Then the random variable Y = F(X) has a distribution that is U(0, 1).

Proof: The distribution function of Y is P(Y ≤ y) = P[F(X) ≤ y], 0 < y < 1. However, F(X) ≤ y is equivalent to X ≤ F^(−1)(y), so P(Y ≤ y) = P[X ≤ F^(−1)(y)]. Since P(X ≤ x) = F(x), we see that P(Y ≤ y) = P[X ≤ F^(−1)(y)] = F[F^(−1)(y)] = y, 0 < y < 1. In other words, Y is distributed U(0, 1).

For example, to simulate m observations from the Rayleigh distribution F(x) = 1 − exp(−x²/2), we proceed as follows:

1. Solve u = F(x) for x, giving x = sqrt(−2 ln(1 − u)), where 1 − u is a random variable with uniform distribution.
2. Generate a sequence of random numbers u_1, u_2, ..., u_m from U(0, 1).
3. The vector x = sqrt(−2 ln u) then has a Rayleigh distribution.

Suppose now that we wish to simulate m observations from a discrete random variable x that takes the values k with probability p_k = P[x = k], k = 1, ..., m. In this case F(x) is a function that has steps at each k, and its inverse has steps at F(k) = p_1 + ... + p_k, so that the procedure for simulating the random sequence x_i becomes:

Set x_i = k if p_1 + p_2 + ... + p_(k−1) ≤ u_i < p_1 + ... + p_k.

For example, setting p_k = e^(−λ) λ^k / k!, k = 0, 1, ..., we obtain a random sequence with a Poisson distribution.

Some cumulative distribution functions are not readily invertible, notably the Normal distribution, and the approach requires modification. One such modified method for the Normal distribution is as follows[6]. Consider the random variable x cos ωt + y sin ωt = r cos(ωt − φ), −π ≤ φ < π, where the random variables x and y are N(0, 1) and independent. Transforming f(x, y) = (1/2π) exp(−(x² + y²)/2) to polar coordinates using the appropriate Jacobian, and noting that x = r cos φ, y = r sin φ, we get

f(r, φ) = (r/2π) exp(−r²/2), r > 0, −π ≤ φ < π.

This is the product of separable expressions involving the random variables r and φ, which are therefore independent, with f(r) = r exp(−r²/2) and f(φ) = 1/2π.

From this it follows that if the random variables r and φ are independent, with r having a Rayleigh distribution and φ uniform in the interval (−π, π), then the random variables x = r cos φ and y = r sin φ are N(0, 1) and independent. Clearly φ = 2π(u − 1/2), and inverting f(r) as before we have r = sqrt(−2 ln v), where v is a random variable (independent of u) with uniform distribution in the interval (0, 1). It follows that

x = r cos φ = sqrt(−2 ln v) cos 2π(u − 1/2) and y = r sin φ = sqrt(−2 ln v) sin 2π(u − 1/2)

are N(0, 1) and independent, and either may be used to simulate the standard Normal distribution. There are, of course, readily accessible means of simulating the standard Normal distribution in practice[7].

[6] See (6.64), (6.7), example 6- and (8.58) in Probability, Random Variables and Stochastic Processes by A. Papoulis, 1991, McGraw-Hill.
[7] For example, by using NORMSINV(RAND()) in Microsoft Excel.
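The recipes in this appendix translate directly into code. A self-contained sketch (the seed is arbitrary, and the guard 1 − u is used so that the logarithm never sees zero):

```python
import math
import random

rng = random.Random(1)

def rayleigh(rng):
    """Inverse transform: F(x) = 1 - exp(-x^2/2) inverts to x = sqrt(-2 ln u)."""
    return math.sqrt(-2.0 * math.log(1.0 - rng.random()))  # 1-u lies in (0, 1]

def poisson(lam, rng):
    """Discrete inversion: return k when p_1+...+p_{k-1} <= u < p_1+...+p_k,
    with p_k = exp(-lam) lam^k / k! accumulated on the fly."""
    u, k, p = rng.random(), 0, math.exp(-lam)
    cum = p
    while u >= cum:
        k += 1
        p *= lam / k
        cum += p
    return k

def std_normal(rng):
    """Box-Muller: x = sqrt(-2 ln v) cos(2 pi (u - 1/2))."""
    u, v = rng.random(), 1.0 - rng.random()
    return math.sqrt(-2.0 * math.log(v)) * math.cos(2.0 * math.pi * (u - 0.5))

ray = [rayleigh(rng) for _ in range(30000)]
poi = [poisson(3.0, rng) for _ in range(30000)]
nor = [std_normal(rng) for _ in range(30000)]
```

The sample means reproduce the known moments: the Rayleigh mean sqrt(π/2), the Poisson mean λ, and the standard Normal mean 0 and variance 1.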

Appendix B Simulating combined risk with a copula

If we know the joint distribution function of a vector X of risks we can easily derive the unique marginal distribution of each X_j, j = 1, 2, ..., n. If, on the other hand, we know the marginal distributions and the dependency structure between them and wish to determine the joint distribution, we are limited by the fact that there is no unique derivation. Furthermore, when allowing for dependency between the risks when setting capital requirements, we require a method that takes into account the possibility that dependency may become more marked at extreme outcomes. The copula provides a flexible way of obtaining a joint distribution that reflects both the limitation and the requirement.

A copula C is a multivariate distribution function whose marginal distributions are distributed U(0, 1). There exists a copula C that defines the dependence structure between X_1, ..., X_n by the relationship F(x_1, ..., x_n) = C(F_1(x_1), ..., F_n(x_n)), where F_j is the marginal distribution function of X_j and F is a joint distribution function of X_1, ..., X_n. That is, the univariate marginal distribution functions and the multivariate dependence structure can be separated, with the latter represented by a copula, and the joint distribution of X_1, ..., X_n can be defined using it. The converse is also true, namely that the univariate marginal distribution functions and the multivariate dependence structure can be combined, resulting in a joint distribution of X_1, ..., X_n.

Suppose now that an overall risk X whose distribution is not known to us has n risk components X_j, and that we have sufficient information to model each of them separately. Suppose also that we then need to combine these component risks in order to obtain a model for X = X_1 + ... + X_n.
If we can generate a series of independent random vectors (u_1, ..., u_n) from a suitable copula C, then (bearing in mind Theorem A) each F_1^(−1)(u_1) + ... + F_n^(−1)(u_n) generated from this series of vectors is an independent random sample of X, and we can therefore simulate the distribution of X. Finding a model for the overall total risk X therefore comes down to selecting a suitable copula and generating random samples of the overall risk in this way. The method is illustrated in Appendix C using the Student t-distribution.
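As a minimal illustration of the recipe F_1^(−1)(u_1) + ... + F_n^(−1)(u_n), the sketch below uses the Normal (Gaussian) copula, which is simpler to simulate than the t-copula of Appendix C but, as noted there, carries no tail dependence. The two exponential marginals (means 1 and 2) and the 50% correlation are hypothetical illustrative choices:

```python
import math
import random

def norm_cdf(x):
    """Standard Normal distribution function via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def combined_risk(rho, inv_f1, inv_f2, rng):
    """One sample of X = F1^{-1}(u1) + F2^{-1}(u2), with (u1, u2) drawn from
    a bivariate Gaussian copula with correlation rho."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)  # 2x2 Cholesky
    return inv_f1(norm_cdf(z1)) + inv_f2(norm_cdf(z2))

# Hypothetical marginals: exponential severities with means 1 and 2.
inv_f1 = lambda u: -1.0 * math.log(1.0 - u)
inv_f2 = lambda u: -2.0 * math.log(1.0 - u)

rng = random.Random(3)
total = [combined_risk(0.5, inv_f1, inv_f2, rng) for _ in range(20000)]
```

Swapping the correlated Normal pair (z1, z2) for the multivariate-t draw of Appendix C is all that is needed to reinstate tail dependence.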

Appendix C Simulating combined risk with a t-copula

The function of a copula is to combine known marginal distributions and the dependency structure between them into a joint distribution (or, in the case at hand, into the distribution of the sum of component risks). In this Appendix we illustrate the simulation of overall risk by a copula. We have chosen the Student t-distribution copula for this illustration because of the need to represent tail dependence in setting a capital requirement. The Normal copula, for example, has no tail dependence, whereas the t-copula does (the Normal copula being a special case of it as the degrees of freedom approach infinity).

If a random vector X is represented by X = μ + sqrt(v/s) Z, where s ~ χ²_v and Z ~ N(0, Σ) are independent, then X has an n-variate t distribution (for v > 2) with mean μ and covariance matrix (v/(v − 2)) Σ.

The copula of X is then C^t_(v,R)(u) = t_(v,R)(t_v^(−1)(u_1), ..., t_v^(−1)(u_n)), where R_ij = Σ_ij / sqrt(Σ_ii Σ_jj), and t_(v,R) denotes the distribution function of sqrt(v/s) Z, where s ~ χ²_v and Z ~ N(0, R) are independent.

This leads to the following algorithm[8] for random vector generation from the t-copula, and hence to independent random samples from the overall sum of the risks:

1. Find the Cholesky decomposition A of R[9].
2. Simulate n independent random variates z_1, ..., z_n ~ N(0, 1).
3. Simulate a random variate s ~ χ²_v independent of z_1, ..., z_n.
4. Set x = sqrt(v/s) A z.
5. Set u_i = t_v(x_i), giving a random vector u from C^t_(v,R).
6. Calculate F_1^(−1)(u_1) + ... + F_n^(−1)(u_n) to get an independent random sample of the sum of the n different risks.

To illustrate how tail dependence is measured, the coefficient of upper tail dependence λ_U between two marginal distribution functions F_i(x_i) and F_j(x_j) is given by λ_U = lim_(u→1) P{X_i > F_i^(−1)(u) | X_j > F_j^(−1)(u)}. In the case of the standard bivariate t-distribution with linear correlation ρ the measure of tail dependence is given by

[8] This is based on algorithm 5.2
in Modelling Dependence with Copulas and Applications to Risk Management by P. Embrechts, F. Lindskog & A. McNeil, 2001.
[9] The Cholesky decomposition of R is the unique lower-triangular matrix A such that AA^T = R. Given this, if z_1, ..., z_n ~ N(0, 1) are independent, then μ + Az ~ N_n(μ, R).

λ_U = 2 t_(v+1)( −sqrt( (v + 1)(1 − ρ) / (1 + ρ) ) ).

The coefficient of upper tail dependence for the bivariate t-distribution is illustrated in the table:

[Table: coefficient of upper tail dependence λ_U for combinations of degrees of freedom v and coefficient of correlation ρ]

The case with infinite degrees of freedom corresponds to the Normal distribution. There are, of course, many other copulas that could be used, including many capable of representing greater tail dependence than the t-copula.
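Both the simulation algorithm and the tail-dependence formula above can be checked numerically. The sketch below is pure Python: the t distribution function is obtained by Simpson's rule on the t density (so no statistics library is assumed), the chi-square draw uses a sum of v squared Normals (which restricts v to integers here), and the 2x2 matrix and v = 4 are illustrative values:

```python
import math
import random

def t_pdf(x, v):
    """Density of the Student t distribution with v degrees of freedom."""
    c = math.gamma((v + 1) / 2.0) / (math.sqrt(v * math.pi) * math.gamma(v / 2.0))
    return c * (1.0 + x * x / v) ** (-(v + 1) / 2.0)

def t_cdf(x, v, steps=2000):
    """t_v(x) = 0.5 + integral of the density from 0 to x (composite Simpson)."""
    if x == 0.0:
        return 0.5
    h = x / steps
    s = t_pdf(0.0, v) + t_pdf(x, v)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * t_pdf(i * h, v)
    return 0.5 + s * h / 3.0

def lambda_upper(rho, v):
    """Coefficient of upper tail dependence of the bivariate t distribution:
    lambda_U = 2 t_{v+1}( -sqrt((v+1)(1-rho)/(1+rho)) )."""
    return 2.0 * t_cdf(-math.sqrt((v + 1) * (1.0 - rho) / (1.0 + rho)), v + 1)

def cholesky(R):
    """Step 1: the unique lower-triangular A with A A^T = R."""
    n = len(R)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(A[i][k] * A[j][k] for k in range(j))
            A[i][j] = math.sqrt(R[i][i] - s) if i == j else (R[i][j] - s) / A[j][j]
    return A

def t_copula_sample(R, v, rng):
    """Steps 2-5: one random vector u from the copula C^t_{v,R} (integer v)."""
    n = len(R)
    A = cholesky(R)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]            # step 2
    s = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(v))    # step 3: chi-square_v
    x = [math.sqrt(v / s) * sum(A[i][k] * z[k] for k in range(n)) for i in range(n)]  # step 4
    return [t_cdf(xi, v) for xi in x]                      # step 5

u = t_copula_sample([[1.0, 0.5], [0.5, 1.0]], 4, random.Random(7))
```

Step 6 then maps u through the inverse marginal distribution functions and sums, exactly as in Appendix B. Note that lambda_upper is increasing in ρ, decreasing in v, and equals 1 at ρ = 1, matching the behaviour the table is meant to illustrate.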

Appendix D The Compound Poisson Process

The mathematical approach taken in the paper uses the compound Poisson process, namely the stochastic process X(t) = Σ_(k=1)^(N(t)) Y_k, t ≥ 0, where {Y_k} is a sequence of independent identically distributed random variables having the distribution function F and characteristic function φ; and N(t) is a Poisson process with parameter λ > 0 and independent of {Y_k}. That is, events occur in accordance with a Poisson process and each event has some randomly determined severity associated with it.

Some well-known properties of the compound Poisson process are summarised here for ease of reference:

1. The mean and variance are given by E[X(t)] = μ_F λt and var[X(t)] = (σ_F² + μ_F²) λt, where μ_F and σ_F² are the mean and variance, respectively, of Y_k.
2. The characteristic function is given by φ_(X(t))(u) = exp(λt(φ(u) − 1)), −∞ < u < ∞.
3. The distribution function of X(t) is given by Pr{X(t) ≤ x} = Σ_(n=0)^∞ ((λt)^n e^(−λt) / n!) F^((n))(x), where F^((n))(x) = Pr{Y_1 + ... + Y_n ≤ x} and F^((0))(x) = 1, x ≥ 0; = 0, x < 0.
4. The sum of two independent compound Poisson processes is itself a compound Poisson process. The Poisson event rate parameter of X(t) = X_1(t) + X_2(t) is λ_1 + λ_2, and the characteristic function of the associated severity is given by φ(u) = (λ_1/(λ_1 + λ_2)) φ_1(u) + (λ_2/(λ_1 + λ_2)) φ_2(u), which corresponds to that of a random variable Y which assumes a value from Y^(1) (with characteristic function φ_1(u)) with probability λ_1/(λ_1 + λ_2) and a value from Y^(2) (with characteristic function φ_2(u)) with probability λ_2/(λ_1 + λ_2).

Because it leads to the only analytically tractable form of the compound Poisson distribution, it is sometimes assumed that the sequence {Y_k} follows an exponential distribution, namely Pr{Y_k ≤ x} = 1 − e^(−x/υ), x > 0. In turn Σ_(k=1)^n Y_k then follows a gamma distribution.
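Property 1 above can be verified by simulation. The sketch below simulates X(t) with exponential severities of mean υ; the values λt = 5 and υ = 2 (and the seed) are arbitrary illustrations, for which property 1 gives E[X(t)] = 10 and var[X(t)] = 2υ²λt = 40:

```python
import math
import random

def poisson_count(lam_t, rng):
    """N(t) ~ Poisson(lam * t), via Knuth's multiplication-of-uniforms method."""
    limit, k, p = math.exp(-lam_t), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def compound_poisson(lam_t, upsilon, rng):
    """One draw of X(t): a Poisson number of exponential severities, mean upsilon."""
    return sum(-upsilon * math.log(1.0 - rng.random())
               for _ in range(poisson_count(lam_t, rng)))

rng = random.Random(11)
lam_t, upsilon = 5.0, 2.0
xs = [compound_poisson(lam_t, upsilon, rng) for _ in range(40000)]

mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# Property 1: E[X(t)] = mu_F * lam * t and var[X(t)] = (sigma_F^2 + mu_F^2) * lam * t,
# which for exponential severities (mu_F = upsilon, sigma_F^2 = upsilon^2)
# gives 10.0 and 40.0 here.
```

The sample mean and variance land close to these theoretical values, which is a quick sanity check on any compound Poisson implementation before it is used in the aggregate loss modelling of Section 5.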


More information

PRE CONFERENCE WORKSHOP 3

PRE CONFERENCE WORKSHOP 3 PRE CONFERENCE WORKSHOP 3 Stress testing operational risk for capital planning and capital adequacy PART 2: Monday, March 18th, 2013, New York Presenter: Alexander Cavallo, NORTHERN TRUST 1 Disclaimer

More information

Prudential Standard APS 117 Capital Adequacy: Interest Rate Risk in the Banking Book (Advanced ADIs)

Prudential Standard APS 117 Capital Adequacy: Interest Rate Risk in the Banking Book (Advanced ADIs) Prudential Standard APS 117 Capital Adequacy: Interest Rate Risk in the Banking Book (Advanced ADIs) Objective and key requirements of this Prudential Standard This Prudential Standard sets out the requirements

More information

INTERNATIONAL ASSOCIATION OF INSURANCE SUPERVISORS

INTERNATIONAL ASSOCIATION OF INSURANCE SUPERVISORS Guidance Paper No. 2.2.6 INTERNATIONAL ASSOCIATION OF INSURANCE SUPERVISORS GUIDANCE PAPER ON ENTERPRISE RISK MANAGEMENT FOR CAPITAL ADEQUACY AND SOLVENCY PURPOSES OCTOBER 2007 This document was prepared

More information

Subject ST9 Enterprise Risk Management Syllabus

Subject ST9 Enterprise Risk Management Syllabus Subject ST9 Enterprise Risk Management Syllabus for the 2018 exams 1 June 2017 Aim The aim of the Enterprise Risk Management (ERM) Specialist Technical subject is to instil in successful candidates the

More information

Pricing & Risk Management of Synthetic CDOs

Pricing & Risk Management of Synthetic CDOs Pricing & Risk Management of Synthetic CDOs Jaffar Hussain* j.hussain@alahli.com September 2006 Abstract The purpose of this paper is to analyze the risks of synthetic CDO structures and their sensitivity

More information

RESERVE BANK OF MALAWI

RESERVE BANK OF MALAWI RESERVE BANK OF MALAWI GUIDELINES ON INTERNAL CAPITAL ADEQUACY ASSESSMENT PROCESS (ICAAP) Bank Supervision Department March 2013 Table of Contents 1.0 INTRODUCTION... 2 2.0 MANDATE... 2 3.0 RATIONALE...

More information

GUIDELINES FOR THE INTERNAL CAPITAL ADEQUACY ASSESSMENT PROCESS FOR LICENSEES

GUIDELINES FOR THE INTERNAL CAPITAL ADEQUACY ASSESSMENT PROCESS FOR LICENSEES SUPERVISORY AND REGULATORY GUIDELINES: 2016 Issued: 2 August 2016 GUIDELINES FOR THE INTERNAL CAPITAL ADEQUACY ASSESSMENT PROCESS FOR LICENSEES 1. INTRODUCTION 1.1 The Central Bank of The Bahamas ( the

More information

Market Risk: FROM VALUE AT RISK TO STRESS TESTING. Agenda. Agenda (Cont.) Traditional Measures of Market Risk

Market Risk: FROM VALUE AT RISK TO STRESS TESTING. Agenda. Agenda (Cont.) Traditional Measures of Market Risk Market Risk: FROM VALUE AT RISK TO STRESS TESTING Agenda The Notional Amount Approach Price Sensitivity Measure for Derivatives Weakness of the Greek Measure Define Value at Risk 1 Day to VaR to 10 Day

More information

MEASURING TRADED MARKET RISK: VALUE-AT-RISK AND BACKTESTING TECHNIQUES

MEASURING TRADED MARKET RISK: VALUE-AT-RISK AND BACKTESTING TECHNIQUES MEASURING TRADED MARKET RISK: VALUE-AT-RISK AND BACKTESTING TECHNIQUES Colleen Cassidy and Marianne Gizycki Research Discussion Paper 9708 November 1997 Bank Supervision Department Reserve Bank of Australia

More information

Embedded Derivatives and Derivatives under International Financial Reporting Standards

Embedded Derivatives and Derivatives under International Financial Reporting Standards Draft of Research Paper Embedded Derivatives and Derivatives under International Financial Reporting Standards Practice Council June 2009 Document 209063 Ce document est disponible en français 2009 Canadian

More information

Calculating VaR. There are several approaches for calculating the Value at Risk figure. The most popular are the

Calculating VaR. There are several approaches for calculating the Value at Risk figure. The most popular are the VaR Pro and Contra Pro: Easy to calculate and to understand. It is a common language of communication within the organizations as well as outside (e.g. regulators, auditors, shareholders). It is not really

More information

Some Simple Stochastic Models for Analyzing Investment Guarantees p. 1/36

Some Simple Stochastic Models for Analyzing Investment Guarantees p. 1/36 Some Simple Stochastic Models for Analyzing Investment Guarantees Wai-Sum Chan Department of Statistics & Actuarial Science The University of Hong Kong Some Simple Stochastic Models for Analyzing Investment

More information

On the Use of Stock Index Returns from Economic Scenario Generators in ERM Modeling

On the Use of Stock Index Returns from Economic Scenario Generators in ERM Modeling On the Use of Stock Index Returns from Economic Scenario Generators in ERM Modeling Michael G. Wacek, FCAS, CERA, MAAA Abstract The modeling of insurance company enterprise risks requires correlated forecasts

More information

Appendix CA-15. Central Bank of Bahrain Rulebook. Volume 1: Conventional Banks

Appendix CA-15. Central Bank of Bahrain Rulebook. Volume 1: Conventional Banks Appendix CA-15 Supervisory Framework for the Use of Backtesting in Conjunction with the Internal Models Approach to Market Risk Capital Requirements I. Introduction 1. This Appendix presents the framework

More information

Subject SP9 Enterprise Risk Management Specialist Principles Syllabus

Subject SP9 Enterprise Risk Management Specialist Principles Syllabus Subject SP9 Enterprise Risk Management Specialist Principles Syllabus for the 2019 exams 1 June 2018 Enterprise Risk Management Specialist Principles Aim The aim of the Enterprise Risk Management (ERM)

More information

CEIOPS-DOC January 2010

CEIOPS-DOC January 2010 CEIOPS-DOC-72-10 29 January 2010 CEIOPS Advice for Level 2 Implementing Measures on Solvency II: Technical Provisions Article 86 h Simplified methods and techniques to calculate technical provisions (former

More information

Lecture notes on risk management, public policy, and the financial system. Credit portfolios. Allan M. Malz. Columbia University

Lecture notes on risk management, public policy, and the financial system. Credit portfolios. Allan M. Malz. Columbia University Lecture notes on risk management, public policy, and the financial system Allan M. Malz Columbia University 2018 Allan M. Malz Last updated: June 8, 2018 2 / 23 Outline Overview of credit portfolio risk

More information

SUPERVISORY FRAMEWORK FOR THE USE OF BACKTESTING IN CONJUNCTION WITH THE INTERNAL MODELS APPROACH TO MARKET RISK CAPITAL REQUIREMENTS

SUPERVISORY FRAMEWORK FOR THE USE OF BACKTESTING IN CONJUNCTION WITH THE INTERNAL MODELS APPROACH TO MARKET RISK CAPITAL REQUIREMENTS SUPERVISORY FRAMEWORK FOR THE USE OF BACKTESTING IN CONJUNCTION WITH THE INTERNAL MODELS APPROACH TO MARKET RISK CAPITAL REQUIREMENTS (January 1996) I. Introduction This document presents the framework

More information

Chapter 5 Univariate time-series analysis. () Chapter 5 Univariate time-series analysis 1 / 29

Chapter 5 Univariate time-series analysis. () Chapter 5 Univariate time-series analysis 1 / 29 Chapter 5 Univariate time-series analysis () Chapter 5 Univariate time-series analysis 1 / 29 Time-Series Time-series is a sequence fx 1, x 2,..., x T g or fx t g, t = 1,..., T, where t is an index denoting

More information

STRESS TESTING GUIDELINE

STRESS TESTING GUIDELINE c DRAFT STRESS TESTING GUIDELINE November 2011 TABLE OF CONTENTS Preamble... 2 Introduction... 3 Coming into effect and updating... 6 1. Stress testing... 7 A. Concept... 7 B. Approaches underlying stress

More information

4.0 The authority may allow credit institutions to use a combination of approaches in accordance with Section I.5 of this Appendix.

4.0 The authority may allow credit institutions to use a combination of approaches in accordance with Section I.5 of this Appendix. SECTION I.1 - OPERATIONAL RISK Minimum Own Funds Requirements for Operational Risk 1.0 Credit institutions shall hold own funds against operational risk in accordance with the methodologies set out in

More information

AMA Implementation: Where We Are and Outstanding Questions

AMA Implementation: Where We Are and Outstanding Questions Federal Reserve Bank of Boston Implementing AMA for Operational Risk May 20, 2005 AMA Implementation: Where We Are and Outstanding Questions David Wildermuth, Managing Director Goldman, Sachs & Co Agenda

More information

LIFE INSURANCE & WEALTH MANAGEMENT PRACTICE COMMITTEE

LIFE INSURANCE & WEALTH MANAGEMENT PRACTICE COMMITTEE Contents 1. Purpose 2. Background 3. Nature of Asymmetric Risks 4. Existing Guidance & Legislation 5. Valuation Methodologies 6. Best Estimate Valuations 7. Capital & Tail Distribution Valuations 8. Management

More information

Lloyd s Minimum Standards MS13 Modelling, Design and Implementation

Lloyd s Minimum Standards MS13 Modelling, Design and Implementation Lloyd s Minimum Standards MS13 Modelling, Design and Implementation January 2019 2 Contents MS13 Modelling, Design and Implementation 3 Minimum Standards and Requirements 3 Guidance 3 Definitions 3 Section

More information

Paper Series of Risk Management in Financial Institutions

Paper Series of Risk Management in Financial Institutions - December, 007 Paper Series of Risk Management in Financial Institutions The Effect of the Choice of the Loss Severity Distribution and the Parameter Estimation Method on Operational Risk Measurement*

More information

IEOR E4602: Quantitative Risk Management

IEOR E4602: Quantitative Risk Management IEOR E4602: Quantitative Risk Management Risk Measures Martin Haugh Department of Industrial Engineering and Operations Research Columbia University Email: martin.b.haugh@gmail.com Reference: Chapter 8

More information

The mean-variance portfolio choice framework and its generalizations

The mean-variance portfolio choice framework and its generalizations The mean-variance portfolio choice framework and its generalizations Prof. Massimo Guidolin 20135 Theory of Finance, Part I (Sept. October) Fall 2014 Outline and objectives The backward, three-step solution

More information

the display, exploration and transformation of the data are demonstrated and biases typically encountered are highlighted.

the display, exploration and transformation of the data are demonstrated and biases typically encountered are highlighted. 1 Insurance data Generalized linear modeling is a methodology for modeling relationships between variables. It generalizes the classical normal linear model, by relaxing some of its restrictive assumptions,

More information

3.4 Copula approach for modeling default dependency. Two aspects of modeling the default times of several obligors

3.4 Copula approach for modeling default dependency. Two aspects of modeling the default times of several obligors 3.4 Copula approach for modeling default dependency Two aspects of modeling the default times of several obligors 1. Default dynamics of a single obligor. 2. Model the dependence structure of defaults

More information

Financial Engineering. Craig Pirrong Spring, 2006

Financial Engineering. Craig Pirrong Spring, 2006 Financial Engineering Craig Pirrong Spring, 2006 March 8, 2006 1 Levy Processes Geometric Brownian Motion is very tractible, and captures some salient features of speculative price dynamics, but it is

More information

AN INTERNAL MODEL-BASED APPROACH

AN INTERNAL MODEL-BASED APPROACH AN INTERNAL MODEL-BASED APPROACH TO MARKET RISK CAPITAL REQUIREMENTS 1 (April 1995) OVERVIEW 1. In April 1993 the Basle Committee on Banking Supervision issued for comment by banks and financial market

More information

CHOICE THEORY, UTILITY FUNCTIONS AND RISK AVERSION

CHOICE THEORY, UTILITY FUNCTIONS AND RISK AVERSION CHOICE THEORY, UTILITY FUNCTIONS AND RISK AVERSION Szabolcs Sebestyén szabolcs.sebestyen@iscte.pt Master in Finance INVESTMENTS Sebestyén (ISCTE-IUL) Choice Theory Investments 1 / 65 Outline 1 An Introduction

More information

Operational Risk Aggregation

Operational Risk Aggregation Operational Risk Aggregation Professor Carol Alexander Chair of Risk Management and Director of Research, ISMA Centre, University of Reading, UK. Loss model approaches are currently a focus of operational

More information

Week 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals

Week 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals Week 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg :

More information

Basel Committee Norms

Basel Committee Norms Basel Committee Norms Basel Framework Basel Committee set up in 1974 Objectives Supervision must be adequate No foreign bank should escape supervision BASEL I Risk management Capital adequacy, sound supervision

More information

Exam P Flashcards exams. Key concepts. Important formulas. Efficient methods. Advice on exam technique

Exam P Flashcards exams. Key concepts. Important formulas. Efficient methods. Advice on exam technique Exam P Flashcards 01 exams Key concepts Important formulas Efficient methods Advice on exam technique All study material produced by BPP Professional Education is copyright and is sold for the exclusive

More information

A Cash Flow-Based Approach to Estimate Default Probabilities

A Cash Flow-Based Approach to Estimate Default Probabilities A Cash Flow-Based Approach to Estimate Default Probabilities Francisco Hawas Faculty of Physical Sciences and Mathematics Mathematical Modeling Center University of Chile Santiago, CHILE fhawas@dim.uchile.cl

More information

COPYRIGHTED MATERIAL. Bank executives are in a difficult position. On the one hand their shareholders require an attractive

COPYRIGHTED MATERIAL.   Bank executives are in a difficult position. On the one hand their shareholders require an attractive chapter 1 Bank executives are in a difficult position. On the one hand their shareholders require an attractive return on their investment. On the other hand, banking supervisors require these entities

More information

Lecture 5: Fundamentals of Statistical Analysis and Distributions Derived from Normal Distributions

Lecture 5: Fundamentals of Statistical Analysis and Distributions Derived from Normal Distributions Lecture 5: Fundamentals of Statistical Analysis and Distributions Derived from Normal Distributions ELE 525: Random Processes in Information Systems Hisashi Kobayashi Department of Electrical Engineering

More information

Introduction to Algorithmic Trading Strategies Lecture 8

Introduction to Algorithmic Trading Strategies Lecture 8 Introduction to Algorithmic Trading Strategies Lecture 8 Risk Management Haksun Li haksun.li@numericalmethod.com www.numericalmethod.com Outline Value at Risk (VaR) Extreme Value Theory (EVT) References

More information

1. You are given the following information about a stationary AR(2) model:

1. You are given the following information about a stationary AR(2) model: Fall 2003 Society of Actuaries **BEGINNING OF EXAMINATION** 1. You are given the following information about a stationary AR(2) model: (i) ρ 1 = 05. (ii) ρ 2 = 01. Determine φ 2. (A) 0.2 (B) 0.1 (C) 0.4

More information

Measuring and managing market risk June 2003

Measuring and managing market risk June 2003 Page 1 of 8 Measuring and managing market risk June 2003 Investment management is largely concerned with risk management. In the management of the Petroleum Fund, considerable emphasis is therefore placed

More information

ME3620. Theory of Engineering Experimentation. Spring Chapter III. Random Variables and Probability Distributions.

ME3620. Theory of Engineering Experimentation. Spring Chapter III. Random Variables and Probability Distributions. ME3620 Theory of Engineering Experimentation Chapter III. Random Variables and Probability Distributions Chapter III 1 3.2 Random Variables In an experiment, a measurement is usually denoted by a variable

More information

Value at Risk Ch.12. PAK Study Manual

Value at Risk Ch.12. PAK Study Manual Value at Risk Ch.12 Related Learning Objectives 3a) Apply and construct risk metrics to quantify major types of risk exposure such as market risk, credit risk, liquidity risk, regulatory risk etc., and

More information

Advisory Guidelines of the Financial Supervision Authority. Requirements to the internal capital adequacy assessment process

Advisory Guidelines of the Financial Supervision Authority. Requirements to the internal capital adequacy assessment process Advisory Guidelines of the Financial Supervision Authority Requirements to the internal capital adequacy assessment process These Advisory Guidelines were established by Resolution No 66 of the Management

More information

Some Characteristics of Data

Some Characteristics of Data Some Characteristics of Data Not all data is the same, and depending on some characteristics of a particular dataset, there are some limitations as to what can and cannot be done with that data. Some key

More information

Basel II Briefing: Pillar 2 Preparations. Considerations on Pillar 2 for Subsidiary Banks

Basel II Briefing: Pillar 2 Preparations. Considerations on Pillar 2 for Subsidiary Banks Basel II Briefing: Pillar 2 Preparations Considerations on Pillar 2 for Subsidiary Banks November 2006 Preamble Those studying this document should be aware that because of the nature of the technical

More information

regulation and smart regulation which are deployed in characterising the nature of frame of this new regulatory regime category.

regulation and smart regulation which are deployed in characterising the nature of frame of this new regulatory regime category. vi Preface The Australian Prudential Regulation Authority (APRA) as the Australian financial regulator began continuous consultations on the proposed policies for the formal implementation of the newer

More information

Operational Risk Quantification and Insurance

Operational Risk Quantification and Insurance Operational Risk Quantification and Insurance Capital Allocation for Operational Risk 14 th -16 th November 2001 Bahram Mirzai, Swiss Re Swiss Re FSBG Outline Capital Calculation along the Loss Curve Hierarchy

More information

Midas Margin Model SIX x-clear Ltd

Midas Margin Model SIX x-clear Ltd xcl-n-904 March 016 Table of contents 1.0 Summary 3.0 Introduction 3 3.0 Overview of methodology 3 3.1 Assumptions 3 4.0 Methodology 3 4.1 Stoc model 4 4. Margin volatility 4 4.3 Beta and sigma values

More information

Martingale Pricing Theory in Discrete-Time and Discrete-Space Models

Martingale Pricing Theory in Discrete-Time and Discrete-Space Models IEOR E4707: Foundations of Financial Engineering c 206 by Martin Haugh Martingale Pricing Theory in Discrete-Time and Discrete-Space Models These notes develop the theory of martingale pricing in a discrete-time,

More information

Bonus-malus systems 6.1 INTRODUCTION

Bonus-malus systems 6.1 INTRODUCTION 6 Bonus-malus systems 6.1 INTRODUCTION This chapter deals with the theory behind bonus-malus methods for automobile insurance. This is an important branch of non-life insurance, in many countries even

More information

Use of Internal Models for Determining Required Capital for Segregated Fund Risks (LICAT)

Use of Internal Models for Determining Required Capital for Segregated Fund Risks (LICAT) Canada Bureau du surintendant des institutions financières Canada 255 Albert Street 255, rue Albert Ottawa, Canada Ottawa, Canada K1A 0H2 K1A 0H2 Instruction Guide Subject: Capital for Segregated Fund

More information

ECONOMIC CAPITAL MODELING CARe Seminar JUNE 2016

ECONOMIC CAPITAL MODELING CARe Seminar JUNE 2016 ECONOMIC CAPITAL MODELING CARe Seminar JUNE 2016 Boston Catherine Eska The Hanover Insurance Group Paul Silberbush Guy Carpenter & Co. Ronald Wilkins - PartnerRe Economic Capital Modeling Safe Harbor Notice

More information

John Hull, Risk Management and Financial Institutions, 4th Edition

John Hull, Risk Management and Financial Institutions, 4th Edition P1.T2. Quantitative Analysis John Hull, Risk Management and Financial Institutions, 4th Edition Bionic Turtle FRM Video Tutorials By David Harper, CFA FRM 1 Chapter 10: Volatility (Learning objectives)

More information

Practical example of an Economic Scenario Generator

Practical example of an Economic Scenario Generator Practical example of an Economic Scenario Generator Martin Schenk Actuarial & Insurance Solutions SAV 7 March 2014 Agenda Introduction Deterministic vs. stochastic approach Mathematical model Application

More information

Life 2008 Spring Meeting June 16-18, Session 67, IFRS 4 Phase II Valuation of Insurance Obligations Risk Margins

Life 2008 Spring Meeting June 16-18, Session 67, IFRS 4 Phase II Valuation of Insurance Obligations Risk Margins Life 2008 Spring Meeting June 16-18, 2008 Session 67, IFRS 4 Phase II Valuation of Insurance Obligations Risk Margins Moderator Francis A. M. Ruijgt, AAG Authors Francis A. M. Ruijgt, AAG Stefan Engelander

More information