Three Components of a Premium
- Frederica Tucker
1 Three Components of a Premium

The simple pricing approach outlined in this module is the Return-on-Risk methodology. The sections in the first part of the module describe the three components of a premium for this methodology:

Expected Loss: The Expected Loss is defined as the arithmetic average payout of the contract from the historical cleaned dataset for the station. This component requires an understanding of statistical concepts such as the uncertainty in the mean, the standard error in the mean, and the Central Limit Theorem. The section describes both the process used to calculate the Expected Loss and the process used to adjust for uncertainty to arrive at an Adjusted Expected Loss. It also discusses the impact that the quality of the underlying data used to calculate expected loss has on pricing.

Probable Maximum Loss (PML): This is the maximum payout that is likely to occur. This component explains three approaches to estimating the PML, based on the Historical Burn Analysis, the Historical Distribution Analysis, and Monte Carlo Simulation. The concept of PML and the challenges in defining a PML estimate are discussed.

Administration and Business Expenses: The third component required for the calculation of the premium of an insurance contract is the administrative and business expenses incurred by the insurer in providing the weather insurance contracts. The calculation of these expenses is provided in this section.

The next section outlines the Return-on-Risk pricing approach itself, which depends on these three components. The methodology outlined in the first part of this module is for stand-alone (not portfolio) contract pricing. It is recommended that insurers use a stand-alone approach until their portfolio stabilizes and they develop a greater understanding and intuition about their overall risk and business. The second part of the module outlines a portfolio pricing approach.
178 Module 7A Designing Index Based Weather Risk Management Programs
2 The Return-on-Risk Approach to Pricing

Pricing must account both for events that occur on average and for events that have more severe impacts. The potential losses from average events are represented in pricing by the expected loss or adjusted expected loss (EL and AEL). The Return-on-Risk (RoR) pricing approach accommodates those risks whose occurrence is not captured simply by looking at the AEL. Risk, in this case, is defined in terms of payouts in excess of the adjusted expected loss. The probable maximum loss (PML; 1-in-100) is often used in pricing to represent this risk. PML values must be established from the historical data estimates. The premium calculations for index-based weather insurance should always be performed on cleaned and, where appropriate, detrended data (as discussed in previous modules).

This RoR approach to pricing is proposed in several publications, such as World Bank (2005), ISMEA (2006), and Henderson et al. (2002). It is recommended over other methodologies since it considers both the expected loss and the tail risk (the potential extreme negative deviations in indemnities), as well as the associated capital charge. Other methodologies include those that use the standard deviation or multiples of the expected loss for the risk loading.
The simple recommended premium calculation for a retailed farmer weather insurance contract is defined as follows:

Premium = AEL + α * (PML(1-in-100) − AEL) + Administrative & Business Expenses, where:

AEL is the Adjusted Expected Loss: the expected loss adjusted by a data uncertainty factor
PML(1-in-100) is the 1-in-100 year Probable Maximum Loss of the contract (i.e., the maximum payout that is likely to occur once in 100 years)*
α is the target Return-on-Risk (RoR), or Return-on-PML, assuming the insurer is required to reserve capital against its portfolio at the PML(1-in-100) level

The target return-on-risk α is chosen by the risk-taker given its business imperatives and ambitions; as a result, α can range from 5 percent to 20 percent. These values often also depend on the payout size and frequency of a given transaction and how that interacts with the risk-taker's portfolio and risk appetite. Other risk metrics, such as the PML(1-in-250), could also be used, and the methodology can easily be adapted if such a benchmark is chosen.

* This is the same as the contract Value-at-Risk (VaR) at the 99 percent confidence level (i.e., the loss that will be exceeded with a one percent probability or less).
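The premium formula above can be sketched directly in code. The following is a minimal illustration only; the function name and all figures are hypothetical, not from the module:

```python
def ror_premium(ael, pml_1_in_100, alpha, expenses):
    """Return-on-Risk premium: AEL plus a risk loading of alpha times the
    capital at risk (PML minus AEL), plus administrative/business expenses."""
    risk_loading = alpha * (pml_1_in_100 - ael)
    return ael + risk_loading + expenses

# Hypothetical figures per 100 units of sum insured:
# AEL of 8, PML(1-in-100) of 60, 15% target RoR, expenses of 2
premium = ror_premium(ael=8.0, pml_1_in_100=60.0, alpha=0.15, expenses=2.0)
print(premium)  # 8 + 0.15 * 52 + 2 = 17.8
```

The risk loading α * (PML − AEL) can be read as the target return on the capital the insurer must reserve above the expected loss.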
3 Component A: Expected Loss

Expected loss is defined as the arithmetic average payout of the contract from the historical cleaned dataset for the station.

Uncertainty in the Mean

Normally, at least 20 years (preferably 30 years or more) of continuous daily data, with less than 5 percent of the data missing, is the accepted minimum in the international weather market. Data not satisfying these criteria will be subject to higher premium rates or, in some cases, will not be accepted by the market. Reinsurers will take short dataset lengths and missing data into consideration when pricing reinsurance treaties, adjusting prices upwards according to their risk appetite as well as their business and underwriting practices.

To reflect the uncertainty associated with having only a limited number of historical years, or with gaps in the data from which one calculates the expected loss, the expected loss can, and in certain cases should, be adjusted by a data uncertainty factor. As with the expected loss component, this uncertainty adjustment should also be calculated from the historical data. Sampling error introduces some uncertainty into the estimate of the expected loss and, hence, into the pricing calculation. Note that there is no standard way of pricing the uncertainty associated with the quality and length of the underlying weather data.
4 Uncertainty in simulated data is discussed later in the section. This uncertainty will depend on the uncertainty associated with calibrating a simulation model to a limited sample of historical data, as well as on potential model error.

Standard Error in the Mean

There are countless ways one could quantify and incorporate data uncertainty into a ratemaking methodology. In order to develop an Excel-based pricing tool, the following simple spreadsheet approach for capturing data uncertainty can be used. Insurers should consider efforts to incorporate data uncertainty due to historical weather data length and quality into their ratemaking methodology as they start and develop their business. The recommended approach is taken and adapted from Jewson et al. (2005). This process requires the insurer to differentiate between stations with good and poor quality data, especially if they have not considered this issue before.

In the case where no detrending has been applied to the underlying index, a sample-based estimate of the expected loss approximately follows a normal distribution irrespective of the underlying distribution of payouts. This normal distribution has:

A mean equal to the actual but unknown population mean
A standard deviation of s/sqrt(N), also known as the standard error, where s is the population standard deviation and N is the sample size; for example, in a historical record of 30 years with 30 payouts, N = 30

Applying this equation tells us that using 25 years of historical data gives a standard error on the expectation of a fifth of the standard deviation of the index, and so on. To evaluate s/sqrt(N), we use our estimate of the standard deviation from the data. Strictly, this formula no longer applies where detrending has been used, because the number of degrees of freedom has changed. However, for simplicity, we will assume that the uncertainty in the detrended case can also be estimated approximately by the s/sqrt(N) rule.
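As a rough illustration of the s/sqrt(N) rule, the standard error of the mean payout can be computed directly from a historical payout series. The payout figures below are hypothetical:

```python
import math

def standard_error(payouts):
    """Standard error of the mean payout: s / sqrt(N), where s is the
    sample standard deviation and N is the number of historical years."""
    n = len(payouts)
    m = sum(payouts) / n
    s = math.sqrt(sum((x - m) ** 2 for x in payouts) / (n - 1))
    return s / math.sqrt(n)

# Hypothetical 5-year payout history (per 100 units of sum insured):
# s is ~15.81, so the standard error is ~15.81 / sqrt(5) = ~7.07
print(standard_error([0.0, 10.0, 20.0, 30.0, 40.0]))
```

With 25 years of data the divisor would be sqrt(25) = 5, which is where the "a fifth of the standard deviation" statement above comes from.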
Note: Assuming detrended data where there is none will tend to underestimate the uncertainty a little, but the differences are not large and the method is simpler than proper alternatives for quantifying uncertainty, such as the Monte Carlo method.

The Central Limit Theorem

The Central Limit Theorem states that the sample mean is (approximately) normally distributed, with a standard deviation equal to the standard error of the mean. This leads to an expression of confidence intervals for the mean, as follows:

90 percent confidence level = Mean ± 1.64 * s/sqrt(N)
95 percent confidence level = Mean ± 1.96 * s/sqrt(N)
99 percent confidence level = Mean ± 2.58 * s/sqrt(N)
5 In the above equations, the values 1.64, 1.96, and 2.58 are taken from the inverse of the standard normal cumulative distribution for each confidence level. [1]

In order to calculate the expected loss for pricing a weather insurance contract, we are only interested in the upper-bound uncertainty level: the bound that places X% confidence that the expected loss is less than or equal to some specific number. For example, at the 90 percent confidence level we can say that the expected loss is less than or equal to [2]:

Expected Loss + 1.28 * s/sqrt(N)

Therefore, a possible Data Uncertainty Factor is defined as follows:

Data Uncertainty Factor = F(β) * s/sqrt(N) (Eq 1)

where F(β) is the inverse of the standard normal cumulative distribution [3] for a given probability β and, therefore:

Adjusted Expected Loss = Expected Loss + F(β) * s/sqrt(N), where:

β, the required confidence level, is chosen by the insurer at their discretion. Note that F(0.5) = 0, so β = 50% leaves the expected loss unadjusted; β should therefore be set to at least 50%. The insurer must adjust this level up or down to reflect the risk-taker's risk preferences.

[1] In Excel, NORMSINV(0.95) = 1.64, NORMSINV(0.975) = 1.96, NORMSINV(0.995) = 2.58.
[2] The equivalent of which is (Expected Loss + NORMSINV(0.9) * s/sqrt(N)) in Excel.
[3] F(β) = NORMSINV(β) in Excel.
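Putting Eq 1 together, a minimal sketch of the Adjusted Expected Loss calculation can be written using Python's standard-library inverse normal CDF in place of Excel's NORMSINV. The payout figures are hypothetical:

```python
from statistics import NormalDist, mean, stdev

def adjusted_expected_loss(payouts, beta=0.9):
    """AEL = Expected Loss + F(beta) * s / sqrt(N), where F is the inverse
    standard normal CDF (NORMSINV in Excel) and beta is the insurer-chosen
    confidence level; beta = 0.5 leaves the expected loss unadjusted."""
    n = len(payouts)
    f_beta = NormalDist().inv_cdf(beta)   # ~1.28 for beta = 0.90
    return mean(payouts) + f_beta * stdev(payouts) / n ** 0.5

payouts = [0.0, 10.0, 20.0, 30.0, 40.0]   # hypothetical 5-year history
print(adjusted_expected_loss(payouts, beta=0.9))  # EL of 20 loaded to ~29.06
```

The short history (N = 5) makes the loading large; with 30 years of data and the same s, the same β would add far less to the expected loss.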
6 Component A: Expected Loss

Data to Calculate Expected Loss

It is recommended that, initially, the expected loss calculation and the data uncertainty adjustment be derived from cleaned data rather than simulated data, unless the insurer is certain of the quality of the simulated data and its ability to capture the statistical and temporal properties of the meteorological variables (see later in this section). It is important to remember that simulated data, or parameters derived from a Historical Distribution Analysis in which a probability function is fitted to the historical index values, will be calibrated to the cleaned data. Therefore, the uncertainties in the simulated data are related to the uncertainties associated with the number of historical years and the quality of the underlying data, as well as to potential simulation model error.

Cleaned data is the only touchstone that all stakeholders, including farmers, insurers, and reinsurers, have. It should, therefore, be the basis for expected loss calculations. The simulation models or distributions calibrated to this data should then agree with its characteristics, particularly in the mean.

When there is a trend in the historical payouts, or in the underlying index on which the payouts are based, the underlying index must be detrended before performing the expected loss calculation. The analyst should always look to detect trends, even if the underlying daily weather data has already been detrended.

Quality of Underlying Data

The Adjusted Expected Loss calculation above considers only the length of the historical record; it does not take the quality of the underlying data into account. However, there are often missing data in weather stations' records, and this needs to be incorporated into the pricing of the weather insurance contracts.
As there is increased uncertainty in data received from a station with missing records, the resulting insurance contract will be priced higher to reflect the higher level of uncertainty regarding the underlying risk. Thus, a contract based on data from a weather station with more missing values will be more expensive than a contract based on data from a weather station with fewer missing values.

A simple adjustment is to multiply the sample size N by (1 − j), where j is the percentage of missing raw data in the underlying data used to calculate the N payout values. In this equation, N decreases as the percentage of missing values in the raw data increases. While not statistically rigorous, this is a simple way of incorporating data quality risk into the existing equation, and it is a little less ad hoc than other methods. Intuitively, the less data that is available, the smaller the sample size from which one can estimate the expected loss (e.g., if a whole year of data is missing, N simply reduces by 1).
8 Component A: Expected Loss

Adjusted Expected Loss

A suggested calculation for the Adjusted Expected Loss is:

AEL = Expected Loss + F(β) * s/sqrt(N * (1 − j)) (Eq 2), where:

β, the required confidence level, is chosen by the insurer, and
j is the percentage of missing data in the raw historical dataset

If the missing data has been filled in, and the cleaning procedure has been verified and is robust, the percentage of missing data in the underlying historical cleaned dataset can be used instead. This will also depend on the insurer's risk preferences. This data quality adjustment factor is proposed so that risk-takers are aware of the data quality issues that will be considered by reinsurers.

It is recommended that an insurer experiment with several methods before settling on one they are comfortable with. For example, the insurer could apply the proposed Adjusted Expected Loss methodology only to stations that do not reach a pre-defined good-quality data benchmark, such as at least 30 years of historical data with less than 5 percent missing. At a minimum, the insurer should have experimented with other methodologies to differentiate, in terms of pricing, between good data (a long historical record and few missing points) and poor data (a short historical record and many missing points).
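Eq 2 can be sketched in the same spreadsheet-like way, with the effective sample size shrunk by the missing-data fraction j. All inputs below are hypothetical:

```python
from statistics import NormalDist, mean, stdev

def adjusted_expected_loss_eq2(payouts, beta, j):
    """Eq 2: AEL = EL + F(beta) * s / sqrt(N * (1 - j)).
    j is the fraction of missing raw data behind the N payout values;
    a larger j shrinks the effective sample size and raises the loading."""
    n_eff = len(payouts) * (1.0 - j)
    return mean(payouts) + NormalDist().inv_cdf(beta) * stdev(payouts) / n_eff ** 0.5

payouts = [0.0, 10.0, 20.0, 30.0, 40.0]       # hypothetical history
clean = adjusted_expected_loss_eq2(payouts, beta=0.9, j=0.0)
gappy = adjusted_expected_loss_eq2(payouts, beta=0.9, j=0.2)
print(clean < gappy)   # True: missing data makes the contract dearer
```

This reproduces the intuition in the text: a station with 20 percent of its raw data missing is priced as if it had only 80 percent of its nominal history.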
9 Component B: Probable Maximum Loss (PML)

Regardless of the pricing methodology adopted, both the expected loss and the risk of the most extreme payout must be factored into pricing. In some years, payouts in excess of the expected loss can occur, and the risk-taker must be compensated for this uncertainty. Internal provisions must therefore be made to honor these potentially large payouts. For example, regulators and rating agencies use such tail-risk measures to determine the capital that a bank, an insurer, a reinsurer, or a corporation is required to hold in order to reflect the risks that it is bearing. Similarly, an insurer must reserve capital against its portfolio. It is assumed in this section that the benchmark reserve level is the PML(1-in-100); this can easily be adjusted if necessary.

The key advantage of the RoR approach is that it refers directly to the loss side of the payout distribution, which is the potential financial loss to the insurer. It therefore corresponds directly to the capital charge required to underwrite the risk at a target level for the business. A PML calculation aims to determine the loss that will not be exceeded at a specified return frequency (often set at 1-in-100) over a given time horizon. In the case of weather insurance, this time horizon is the life of the contract, and the PML(1-in-100) is the maximum payout that is expected to occur once in 100 contract lifetimes.

Advantage of PML(1-in-100)

A PML set at the 1-in-100 return frequency is referred to as PML(1-in-100). The advantage of using the PML(1-in-100) is that it is computed from the loss side of the payout distribution, with the loss defined with respect to the expected payout. The PML therefore captures the potential financial loss to the seller. The Return-on-PML method, from here on referred to as the Return-on-Risk (RoR) method, is particularly appropriate for pricing structures that protect against low-frequency but high-severity risk.
These kinds of risks have highly asymmetric payout distributions, as is the case for weather insurance for farmers.

Disadvantage of PML(1-in-100)

The disadvantage of setting a PML at the 1-in-100 return frequency is that it is a difficult parameter to estimate, particularly at high strike levels set far from the mean. The PML(1-in-100) is usually established through a Historical Distribution Analysis or a Monte Carlo simulation. Nevertheless, the worst case recorded historically can often be used as a cross-check for the PML. Note that the PML(1-in-100) is not straightforward to implement in Excel alone; specific software, such as @RISK, can be used for the PML(1-in-100) analysis. Knowledge of a programming language, such as VBA, R, or C, in order to write routines to fit distributions or simulate data, is extremely helpful.
10 The concepts of Probable Maximum Loss (PML) and the similar concept of Value-at-Risk (VaR) have become widely used by insurers, corporate treasurers, and financial institutions to summarize the total risk of portfolios. Central bank regulators, for example, use VaR in determining the capital that a bank is required to hold against the market risks it is bearing. If an insurer is keen to build a weather business, it is recommended that it invest in the appropriate tools and software to estimate variables such as the PML in a more robust manner. This allows a more detailed analysis than simply looking at the worst case recorded historically from a Historical Burn Analysis (HBA).
11 Component B: Probable Maximum Loss (PML)

Estimating the PML

Historical Burn Analysis

From previous modules, we know that HBA is the simplest method of weather contract pricing. HBA involves taking historical values of the index, from cleaned and possibly detrended data, and applying the contract in question to them. While HBA is simple to perform, it gives a limited view of possible index outcomes: it may not capture the possible extremes, while also being overly influenced by individual years in the historical dataset. Estimating parameters such as the PML can, therefore, become very difficult. The largest historical value is always a good reality check when considering the possible variability of payouts. Additionally, the confidence that can be attached to averages and standard deviations calculated from historical data is limited by the number of years of data available; as noted above, this can be incorporated into the adjusted expected loss calculation. Therefore, if an insurer is keen to build a weather business, it is recommended that it invest in the appropriate tools and software to estimate variables such as the PML in a more robust manner, going beyond the worst case recorded historically in an HBA. Some of these methods are outlined below.

Historical Distribution Analysis

Two ways to estimate the PML from a limited number of years of data are to:

Fit a parametric or non-parametric probability distribution to the historical index
Fit a parametric or non-parametric probability distribution to the contract payout values

The contract payout statistics can then be calculated from the properties of the fitted distribution.
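As a toy illustration of the parametric route, one can fit a distribution to the historical payouts by moments and read the 1-in-100 quantile off the fitted curve. A normal fit is used below purely for simplicity; real payout distributions are skewed, so a better-suited family (e.g., gamma) and a cap at the contract limit would be used in practice. All figures are hypothetical:

```python
from statistics import NormalDist, mean, stdev

def pml_from_fitted_distribution(payouts, return_period=100):
    """Fit a distribution to historical payouts (here a normal fitted by
    moments, purely for illustration) and return its 1-in-`return_period`
    quantile as a PML estimate."""
    fitted = NormalDist(mu=mean(payouts), sigma=stdev(payouts))
    return fitted.inv_cdf(1.0 - 1.0 / return_period)

# Hypothetical 10-year payout history with many zero-payout years
payouts = [0.0, 0.0, 5.0, 0.0, 30.0, 0.0, 12.0, 0.0, 0.0, 55.0]
print(pml_from_fitted_distribution(payouts))  # ~53, below the historical max of 55
```

Note that here the fitted estimate comes out below the worst historical payout, which is exactly the situation the cross-check against the historical maximum (discussed later in this section) is designed to catch.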
When using the Historical Distribution Analysis approach, care should be taken with the assumptions about the distribution of the payouts or of the underlying index. In particular, information about payouts that do not happen often, but that cover the more extreme risk, needs to be handled with care.

Monte Carlo Simulation

An alternative for estimating the true distribution of potential contract payouts is through simulation. Running simulations reveals more information about the potential true distribution of payouts than simply considering the historical payout values, because a limited payout history can mask greater underlying variability of a contract. The simplest way to perform a simulation for three-phase contracts, for example, is through a dekadal [1] rainfall Monte Carlo simulation. This approach fits a distribution to the historical cumulative rainfall of each dekad within the contract, and a correlation matrix is established between the cumulative rainfall totals recorded in each dekad. Using this correlation matrix, a Monte Carlo simulation can be performed that preserves both this correlation structure and the individual dekadal distributions. The contract design webtool at the end of the course has a rainfall simulator that allows a simulation of dekadal rainfall in this way.

Each simulation produces one sample year of possible cumulative rainfall totals for the dekads within the contract, from which the contract payout for that simulation year can be calculated. Running many of these simulations generates a distribution of possible contract payouts from which the pertinent contract statistics can be estimated. This approach can also be used for contracts with a dynamic start date; however, more dekads within the rainfall season must then be simulated to capture the moving start date accurately. During these simulations, the mean and standard deviation of the simulated rainfall need to be checked for consistency with the historical data, so that the simulated data can be used with confidence.

For an even more robust analysis, it is possible to run a Monte Carlo simulation at the daily level. However, running a Monte Carlo simulation at the meteorological variable level is the most complicated approach, because it requires simulating thousands of years of daily rainfall or temperature data at each station while respecting the daily correlations and seasonal cycles. All the underwritten index values can then be calculated from these data, and the weather contracts applied to each simulated index value to create thousands of simulated payouts, from which the expected and variable payout statistics of the contracts can be calculated. Building daily simulation models that correctly capture the statistics of the underlying data is very challenging. It is recommended that the approaches outlined above be used to estimate the PML, and that careful thought be given before embarking on a daily simulation and modeling project.

[1] A dekad is a 10-day period.
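The dekadal Monte Carlo idea can be sketched as follows for just two dekads. The distributions, parameters, and correlation are hypothetical; a real implementation would fit per-dekad rainfall distributions (often gamma) and a full correlation matrix across all dekads:

```python
import random

def simulate_two_dekads(mu1, s1, mu2, s2, rho, n_sims=20000, seed=1):
    """Monte Carlo sketch: draw correlated standard normals via a 2x2
    Cholesky step, map them to dekadal rainfall totals, and truncate at
    zero (rainfall cannot be negative)."""
    rng = random.Random(seed)
    years = []
    for _ in range(n_sims):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + (1.0 - rho ** 2) ** 0.5 * rng.gauss(0.0, 1.0)
        years.append((max(0.0, mu1 + s1 * z1), max(0.0, mu2 + s2 * z2)))
    return years

# Hypothetical dekad means/stdevs in mm and correlation of 0.6
years = simulate_two_dekads(mu1=50, s1=15, mu2=60, s2=20, rho=0.6)
# Apply the contract payout function to each simulated year, then estimate
# the PML as, e.g., the 99th percentile of the resulting payout distribution.
print(sum(r1 for r1, _ in years) / len(years))  # simulated dekad-1 mean, ~50
```

The final consistency check the text recommends corresponds to comparing the simulated means and standard deviations (as printed above) against the historical values before trusting the simulated payouts.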
13 Component B: Probable Maximum Loss (PML)

Pricing Using the PML

The PML(1-in-100) should be estimated using one of the methods outlined previously. If the maximum payout has been reached in the historical cleaned data, the discussion of the various PML estimation methodologies is moot, and the limit of the contract should be used as the PML estimate.

After using several approaches, the question remains as to what value should be used for the PML(1-in-100) estimate. There is no single correct answer, but an estimate can be determined using intuition. Putting aside uncertainty issues regarding the quality and length of the underlying data, and if one is confident in the approaches used, a PML estimate can be determined by taking the largest of the estimates. This number should be at least equal to the maximum payout in the historical record as seen through the HBA. The insurer will have to determine an estimate depending on its risk preferences and overall portfolio. Indeed, discussions of catastrophic risk loading can be made simpler if, for each contract, the insurer chooses to consider only the sum insured as the maximum loss to include in the risk margin formula. Although this method would be very simple to implement, it could make some contracts more expensive for farmers.

If the estimated PML is less than the historical maximum payout, after detrending, then it is recommended that the historical maximum payout be used instead:

PML(1-in-100) = max(Estimated PML(1-in-100), Maximum Historical Payout) (Eq 3)

This cross-check against the maximum historical payout is recommended even though, statistically, it could overestimate the PML, particularly if a simulation methodology is used. Although simulated data can capture the average well, in some cases it tends to underestimate the variability.
Underestimating the variability also underestimates the risk relative to the historical data record, as the simulated expected loss and payout frequency come out lower than the historical ones. This means that prices derived from the simulated data will be lower than prices derived from the historical cleaned data. Contract designs that require daily-level simulations are particularly prone to this problem, as simulating daily meteorological data correctly, particularly rainfall, is challenging. Hence, it should not be surprising that there may be some discrepancies between the simulated and raw data. As reinsurers may run their own simulations, which may be very different from the insurer's, it is recommended that the historical cleaned data be used for the expected loss calculation, with an uncertainty adjustment as described in the previous section. Unless the insurer is very sure of its simulation or historical data analysis methodology, using the historical cleaned data is the preferred approach.

However, better estimating the tails of the payout distribution, and the PML for a given return period, is strongly recommended. This cannot be done accurately by running only a Historical Burn Analysis on a limited number of years (unless the maximum payout has been reached historically). Therefore, as simulations or an HDA provide good value for estimating the tails of the payout distribution, they should be used for the PML(1-in-100) calculation. This, of course, still necessitates a check against the maximum payout in the historical cleaned data.

Note: As with the expected loss, there are uncertainties associated with the PML estimate when estimating the tails of the payout distribution, irrespective of the method used to determine its value. Instead of adding an uncertainty adjustment for this, a 1-in-250 return frequency PML (i.e., VaR(99.6)) could be considered instead of the PML(1-in-100), for example. This reduces vulnerability to model and assumption risk when estimating the tails of the payout distribution.
15 Component C: Administration & Business Expenses

The Technical Premium (TP) is defined as follows:

TP = AEL + α * (PML(1-in-100) − AEL) (Eq 4), where:

AEL is the Adjusted Expected Loss and PML(1-in-100) is the maximum likely payout in 100 contract lifetimes.

The administrative and business expenses must be included to arrive at the final gross premium for a contract. These expenses are often expressed as percentages of the technical premium. Administrative and business expenses are determined by the insurer and reinsurer; they are not fixed or pre-determined, but are set based on the costs each incurs in doing the business. To arrive at the final premium, the technical premium is grossed up by the factor (1 + TE), where TE is the total administrative and business expense loading reflecting the insurer's fixed costs. The final premium is defined as:

Premium = TP * (1 + TE) (Eq 5)

The complete calculation for the final gross premium per contract, P, is:

P = (1 + TE) * (AEL + α * (PML(1-in-100) − AEL)) (Eq 6), where:

AEL = Expected Loss + F(β) * s/sqrt(N * (1 − j))

The Expected Loss is calculated using a Historical Burn Analysis on cleaned and, where appropriate, detrended data for all the historical years available.

PML(1-in-100) = max(Estimated PML(1-in-100), Maximum Historical Payout)

The Maximum Historical Payout is determined by a Historical Burn Analysis.

If the insurer wants to take the timing of cash flows into account, the premium can be discounted with respect to the time when the premium is collected:

Discounted Premium = exp[r(t − T)] * TP * (1 + TE), where:

r is the interest rate, t is the time when the premium is collected, and T is the contract maturity date and date of a potential payout.
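The complete chain from Eq 2 through Eq 6, plus the optional discounting, can be sketched end to end. Every numeric input below is hypothetical:

```python
import math
from statistics import NormalDist, mean, stdev

def gross_premium(payouts, est_pml, alpha, te, beta, j, r=0.0, t=0.0, T=0.0):
    """Eq 6 with the Eq 3 cross-check and optional discounting:
    P = exp[r(t - T)] * (1 + TE) * (AEL + alpha * (PML - AEL))."""
    n_eff = len(payouts) * (1.0 - j)
    ael = mean(payouts) + NormalDist().inv_cdf(beta) * stdev(payouts) / n_eff ** 0.5
    pml = max(est_pml, max(payouts))      # Eq 3: cross-check vs. history
    tp = ael + alpha * (pml - ael)        # Eq 4: technical premium
    return math.exp(r * (t - T)) * tp * (1.0 + te)   # Eq 5 gross-up

# beta = 0.5 (no uncertainty loading) and no discounting, for a clean check:
p = gross_premium([0.0, 10.0, 20.0, 30.0, 40.0], est_pml=50.0,
                  alpha=0.1, te=0.2, beta=0.5, j=0.0)
print(p)  # AEL = 20, PML = 50, TP = 20 + 0.1 * 30 = 23, P = 23 * 1.2 = 27.6
```

With t < T the discount factor exp[r(t − T)] is less than one, reflecting that the premium is collected before the potential payout date.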
Areas for Improvement in Premium Calculation There are a number of improvements that should be considered by insurers interested in developing the weather index insurance business: Improving the PML and expected loss estimates needed for the premium calculation. While robust PML estimates are often limited by the amount and quality of the underlying data, the more accurate the PML and expected loss estimates, the more appropriate the premium will be for the product. Reflecting data uncertainty risk in the pricing and implementing a data uncertainty adjustment. At the very least, insurers should be aware of these issues and of the limitations and potential pitfalls of using a limited data history for ratemaking. The adjustment proposed in this module is simple to implement in spreadsheets; however, insurers should experiment with several methods to find a data uncertainty adjustment that they are comfortable with. Alternative and additional methods that are strongly recommended and used in the market (available in weather derivative pricing software, such as Climetrix and Speedwell) include observing the sensitivity of the contract statistics of expected loss and PML to:
Contract Dates : changing the start date by a few days, e.g., +/− 1 day, +/− 2 days, ... +/− 10 days, on either side of the fixed start date, to see how this changes the pricing parameters
Triggers and Other Contract Parameters : adjusting the triggers up and down by small increments to see whether new payouts occur with small trigger changes, which can change the pricing parameters
Trend Sensitivity : looking at how different detrending methodologies impact the historical payouts and, therefore, the pricing parameters
Missing-Data In-filling Methodologies : looking at how different cleaning or in-filling methodologies impact the price, as above
These steps can help to extract more information from the historical dataset, reduce the potential sample error in the expected loss and other payout statistics, and minimize the risk of missing critical information about the payout potential of a contract by looking at, and stressing, the historical payout series in more than one way.
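The trigger-sensitivity check above is straightforward to automate. The sketch below assumes a simple deficit-rainfall contract (payout per millimetre below a trigger, capped at a limit); the contract structure and all parameter names are illustrative, not a prescribed design.

```python
import statistics

def payout(rainfall_mm, trigger_mm, tick_per_mm, limit):
    """Deficit-rainfall payout: pays tick_per_mm for each mm of rainfall
    below the trigger, capped at the contract limit (illustrative only)."""
    return min(max(trigger_mm - rainfall_mm, 0.0) * tick_per_mm, limit)

def trigger_sensitivity(history_mm, base_trigger, tick, limit,
                        steps=(-10, -5, 0, 5, 10)):
    """Expected loss (mean historical payout) for small shifts of the
    trigger around its fixed value, to expose payouts that appear or
    vanish with small trigger changes."""
    return {
        base_trigger + d: statistics.mean(
            payout(r, base_trigger + d, tick, limit) for r in history_mm
        )
        for d in steps
    }
```

The same loop structure applies to the other sensitivity tests: shifting contract start dates, swapping detrending methods, or swapping in-filling methods, then recomputing the expected loss and PML each time.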
Note on Accuracy of Methods When employing any data analysis technique, it should always be remembered that the results are only as good as the model and data used. Model outputs are subject to model error as well as to the quality of the data available to calibrate and run the models. At the end of the day there is a limit to the information that one can confidently extract from poor or short historical weather data records. Consider the example of fitting a distribution to historical payouts using a Historical Distribution Analysis. The uncertainty in the results will be driven by the underlying uncertainty of using only a limited number of values on which to fit a distribution; the uncertainty level in the estimates, such as the standard error in the mean and variance, is not reduced. Fitting a daily simulation model to meteorological data uses much more information to calibrate the model, and an argument can be made that this can better represent the index distribution and its extremes: for a monthly contract, the model is calibrated on roughly 30 daily data points per year rather than one index value per year. However, the required models are much more complex and there is a greater potential risk of model error. The simulation and Historical Distribution Analysis approaches can reduce the uncertainties of relying on a Historical Burn Analysis alone only to a limited extent. The uncertainty analysis defined in the pricing methodology presented in this module can be applied even if methods other than the Historical Burn Analysis are used. This uncertainty is a fundamental characteristic of weather, and of weather data, and should be borne in mind throughout the pricing process. Although the Historical Burn Analysis is simple, its advantage lies in making the fewest assumptions. Hence, it should always be the starting point and touchstone for all pricing analysis. Jewson et al. (2005)* have tried to address the issue of the potential accuracy of daily modeling over a Historical Burn Analysis or a Historical Distribution Analysis. However, their results depend on the underlying model accuracy and the quality of the underlying historical data. Jewson recommends that, unless a daily model works very well and all the relevant statistics have been thoroughly checked, a sensible approach is to use a combination of methods to estimate parameters such as the expected loss and the PML, as recommended above.
* Jewson, S., A. Brix, and C. Ziehmann. Weather Derivative Valuation: The Meteorological, Statistical, Financial and Mathematical Foundations. Cambridge: Cambridge University Press, 2005.
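The recommended combination of methods can be illustrated with a minimal comparison of a Historical Burn estimate against a fitted-distribution estimate of the 1-in-100 payout. The normal fit below is purely a placeholder for whatever distribution the insurer judges appropriate (payout distributions are typically heavy-tailed and zero-inflated, so a normal fit is generally too thin-tailed in practice); the function name and structure are assumptions for this sketch.

```python
import statistics
from statistics import NormalDist

def burn_and_fitted_pml(historical_payouts, return_period=100):
    """Compare a Historical Burn estimate with a simple fitted-distribution
    estimate of the 1-in-N payout; return the larger of the fitted quantile
    and the worst observed payout, per the pricing rule in this module."""
    burn_max = max(historical_payouts)          # worst payout in the burn series
    mu = statistics.mean(historical_payouts)
    sigma = statistics.stdev(historical_payouts)
    # 1-in-N quantile from the fitted distribution (normal fit, illustrative)
    fitted = NormalDist(mu, sigma).inv_cdf(1 - 1 / return_period)
    return max(fitted, burn_max)
```

Running both estimators on the same cleaned payout series, and stressing each as described in the previous section, is one way to act on Jewson's recommendation without committing to a single method.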