From Double Chain Ladder To Double GLM


University of Amsterdam
MSc Stochastics and Financial Mathematics

Master Thesis

From Double Chain Ladder To Double GLM

Author: Robert T. Steur
Examiner: dr. A.J. Bert van Es
Supervisors: drs. N.R. Valkenburg AAG, drs. L.N. Aid Usman

August 21, 2015

Abstract. A popular technique to estimate future claim payments and reserves in non-life insurance is the Chain Ladder Method, which assumes a trend in past claim data to estimate future claim figures. This method has been refined into a new framework called the Double Chain Ladder Method, which separates the (outstanding) claim estimates into an IBNR and an RBNS part by applying the Chain Ladder Method twice, separately on incurred claim count and paid claim amount triangles. In addition to the separate reserve figures, this structure provides clearer insight into the underlying assumptions than the Chain Ladder Method, as well as an integrated method to estimate tail reserves. The incurred claim counts are assumed to be Poisson distributed, while the claim payments are assumed to be Overdispersed Poisson distributed, in order to estimate parameters with a Chain Ladder Method. The Double Chain Ladder Method still has practical issues. In this thesis we will upgrade the Double Chain Ladder Method to cope with some important practical issues. We will adapt the Double Chain Ladder framework to work with types of Generalized Linear Models underlying the incurred claim count and paid claim amount data. This allows for additional trends such as calendar year inflation, or a shift in payments caused by fewer or more payments in a specific calendar year, and it avoids overparametrization. Parameters will no longer be estimated using the Chain Ladder Method, which is exactly equivalent to estimating parameters with a maximum likelihood procedure under a Poisson distribution. The new framework allows for a better fit to more types of data while maintaining the benefits of the separate IBNR and RBNS structure.

Contents

1. Introduction
   Non-Life insurance introduction
   Double Chain Ladder introduction
   Thesis goals
   Thesis chapters overview
2. Double Chain Ladder Framework
   Current DCLM
      Step 1. Origin and development trends
      Step 2. Settlement delay pattern
      Step 3. Individual claim sizes
      Step 4. Extrapolating RBNS claims
      Step 5. Extrapolating IBNR claims
   Discussing DCLM
      Settlement delay pattern discussion 1
      Settlement delay pattern discussion 2
      Trend parameters discussion
      Distribution paid amounts discussion
   DCLM issue summary
3. Double GLM framework
   Formulating DGLM
      Step 1. Origin, development and calendar trends
      Step 2. Future calendar year effects
      Step 3. Settlement delay pattern
      Step 4. Individual claim sizes
      Step 5. Extrapolating RBNS claims
      Step 6. Extrapolating IBNR claims
   Discussing DGLM
4. Comparing DGLM and DCLM
   DCLM and DGLM estimates
   Comparing DCLM and DGLM estimates
5. Simulation study DGLM
   Distribution of individual claims
   Bootstrap Monte Carlo methodology
   Bootstrap Monte Carlo results
6. Conclusions
   DGLM future research
Popular Summary
Appendix A
   A.1. Chain Ladder Method
   A.2. Generalized Linear Models
   A.3. Linking CLM to a Poisson GLM
Appendix B
   B.1. Background definitions
References

1. Introduction

The main goal of this thesis is to improve a popular existing model for estimating future claim payments at an insurance company. As a necessary introduction to the research we consider the payment and data structure at a non-life insurance company. Additional information regarding terms commonly used by an insurance company is stated in appendix B. Appendix B should be read after the introduction if one is not familiar with Non-Life insurance.

1.1. Non-Life insurance introduction. A policy holder experiences certain damages which he wants to claim from his insurance company. As such the company has datasets of reported and paid claim counts and amounts, which consist of an aggregation of individual claims from all policy holders per year of origin i and per year of development j. For example, we have damages that have occurred in 2011 (year of origin 1). A part of these will be reported and/or paid in 2011 (development year 0), a part will be reported and/or paid in 2012 (development year 1), etc. So we get an upper-left triangular data matrix, because we will only have data up to the current calendar year m, i.e. for year of origin i and year of development j such that i + j ≤ m. The matrix is usually referred to as a run-off triangle, see Figure 1 for an example with incurred claim counts. The value 17 at index (2013, 2), for example, means that of all accidents happening in 2013, 17 accidents were reported 2 years later. This also means they were reported in calendar year 2015. If it had been a paid amount data triangle, it would mean that a money amount of 17 was paid out by the insurer after 2 years for accidents occurring in 2013. Data on a diagonal corresponds to the same calendar year. The data displayed here is part of a motor insurance run-off triangle with incremental data, which we will use throughout the thesis for illustrative purposes.
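The aggregation into an incremental run-off triangle can be sketched as follows. This is a minimal sketch: the claim records below are hypothetical, since the data of Figure 1 is not reproduced here.

```python
from collections import defaultdict

def build_triangle(claims, current_year):
    """Aggregate (origin year, reporting year) claim records into an
    incremental run-off triangle, indexed as triangle[i][j] with
    development year j = reporting year - origin year."""
    triangle = defaultdict(lambda: defaultdict(int))
    for origin, report in claims:
        if report <= current_year:   # only the upper-left triangle is observed
            triangle[origin][report - origin] += 1
    return {i: dict(row) for i, row in triangle.items()}

# Hypothetical reported-claim records (origin year, reporting year):
records = [(2011, 2011), (2011, 2012), (2012, 2012),
           (2013, 2013), (2013, 2015), (2013, 2015)]
tri = build_triangle(records, current_year=2015)
# tri[2013][2] == 2: two claims from origin year 2013 reported 2 years later
```

The same aggregation with payment amounts instead of counts of records yields a paid amount triangle.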
Insurers often use cumulative data in their run-off triangles; the results in this thesis can be extended to such a format if necessary.

Figure 1. Run-off triangle of incurred claim counts.

By regulation, the company has to estimate future claim development. It needs to set aside capital, usually referred to as reserves, in order to fulfill future payments arising from past years of origin. These reserves can be split into two required parts: reserves for damages that have Incurred, But Not yet been Reported to the insurance company (IBNR reserve), and reserves for damages that have been Reported to the company But Not yet have

been Settled or fully paid (RBNS reserve). So the timeline for a claim is:

accident happens → (reporting delay) → accident reported → (settlement delay) → final payment made.

First the damage occurs (referred to as incurred), then the policyholder reports a claim, and finally the claim is settled and paid for by the insurance company. So first there is a reporting delay after the occurrence, and secondly a settlement delay after the reporting. RBNS can have different meanings depending on what is defined as Settled, but in our case we simply assume that settling and paying for the total claim occur at the same time, so settling means the end of the claim timeline. Following this timeline, claims in an incurred triangle will ultimately be present in a paid triangle as well, since they will be paid eventually. These two types of triangles are thus linked. The Chain Ladder Method (CLM) is a popular method to analyze and extrapolate a trend for year of origin and a trend for development year in the data, see appendix A.1. For example, we have a triangular dataset consisting of total claim payments. The CLM is only able to extrapolate estimates into the remainder of the data matrix, so the lower-right triangular matrix. It cannot be used to estimate payments beyond the final data point in the development year. CLM is also unable to separate claim estimates into an IBNR part and an RBNS part; it only estimates the combined payments. CLM does have a statistical justification: it can be shown that estimating with a CLM is equivalent to a maximum likelihood procedure when a Poisson distribution is assumed for the claim data, see Appendix A.3.

1.2. Double Chain Ladder introduction. The CLM has been refined to work in a new framework called the Double Chain Ladder Method (DCLM), which separates the future claim estimates into an IBNR and an RBNS part by applying the Chain Ladder Method twice, separately on incurred claim count and paid claim amount triangles.
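As a sketch of the CLM step itself (details in appendix A.1), the following is a minimal volume-weighted chain ladder applied to a small incremental triangle; the numbers are illustrative only and do not come from the thesis data.

```python
import numpy as np

def chain_ladder(incr):
    """Complete an incremental run-off triangle (np.nan below the
    anti-diagonal) with volume-weighted chain-ladder development factors.
    Returns the completed cumulative triangle and the factors."""
    m = incr.shape[0]
    cum = incr.copy()
    for j in range(1, m):
        cum[:, j] = cum[:, j - 1] + incr[:, j]   # cumulate observed cells
    f = []
    for j in range(m - 1):
        rows = m - 1 - j                         # origins observed in column j+1
        f.append(cum[:rows, j + 1].sum() / cum[:rows, j].sum())
    for i in range(1, m):                        # fill the lower-right part
        for j in range(m - i, m):
            cum[i, j] = cum[i, j - 1] * f[j - 1]
    return cum, f

# Hypothetical incremental paid triangle with 3 origin years:
incr = np.array([[10.0, 5.0, 2.0],
                 [12.0, 6.0, np.nan],
                 [14.0, np.nan, np.nan]])
cum, f = chain_ladder(incr)
# f == [1.5, 17/15]; the last origin year develops to 14 -> 21 -> 23.8
```

Each factor f_j is the ratio of cumulative claims in successive development years, summed over all origin years for which both columns are observed.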
The difference between incurred claim count and paid claim amount figures is explained in Appendix B. In addition to the separate reserve figures, this structure provides insight into the underlying assumptions as well as an integrated method to estimate tail reserves, i.e. for claims beyond the final data point for development years. The incurred claim counts are assumed to be Poisson distributed, while the claim payments are assumed to be Overdispersed Poisson distributed, in order to estimate parameters reliably with a Chain Ladder Method. These distributions are required to arrive at an equivalence to maximum likelihood estimation as mentioned earlier. It would still be possible to use CLM for other distributions, but this would not be statistically justified. Under a specific assumption, the combined future claim estimate produced by DCLM, excluding the tail estimate, is exactly equal to the future estimate produced by a single CLM procedure on a paid claim amount triangle. This result will be proven in a later section, and it will be useful for validating CLM estimates, since the DCLM model provides richer insight into the underlying risks by adding more parameters with a realistic interpretation.

1.3. Thesis goals. In this thesis we will adapt the DCLM framework to work with types of Generalized Linear Models (GLM) underlying the incurred claim count and paid claim amount data. The main goal and benefit of this adaptation will be:

The inclusion of a calendar year effect in the paid amount triangle. (1.1)

This will constitute an approach with better predictions, as we believe that the most important effects in the data are best explained with a calendar year parameter. It will allow us to separate a trend in paid amount data into a trend for individual claim sizes and a trend for paid numbers. We will call this Gamma Factoring. We will demonstrate that the inclusion of this parameter still allows the core DCLM framework to retain all of its properties and parameter interpretations. The trend in year of origin should then only give a measure of exposure, reflecting the number of policy holders and a base level of paid claim amounts. Typically, the trend in origin years should be constant, as we would not expect much change in exposure. If there are sudden changes in exposure, the results can be rescaled for exposure, but this will not be included in the research. The details of the new approach will be explained in chapter 3. Parameters will then no longer be estimated using the popular Chain Ladder Method, which was appropriate for a Poisson distribution with only two explanatory variables; the new framework allows for a better fit to more types of data while maintaining the benefits of the IBNR and RBNS structure and the method for a tail reserve. The second goal of using a more general GLM is:

Preventing overparametrization for development and origin trends. (1.2)

For some years in the data triangle there is only a limited amount of data, so it is not appropriate to separately estimate a parameter value for them, which CLM does. A GLM allows for a trend across all years, so estimates for separate years include information from the complete dataset.
Since the new model no longer uses Chain Ladder Methods, we will call it a Double Generalized Linear Model (DGLM). The current DCLM is actually a DGLM as well, as a Poisson model is a specific case of a Generalized Linear Model, see Appendix A.2. Finally, we will:

Discuss shortcomings of the settlement delay pattern in the DCLM. (1.3)

These shortcomings will become apparent when we discuss the current DCLM.

1.4. Thesis chapters overview. In section 2.1 we will review the current DCLM framework, formulated by Martínez-Miranda et al. (2012). All sections following section 2.1 consist of new material. We will discuss the shortcomings of DCLM in more detail in section 2.2 and propose solutions. In chapter 3 we will then formulate the DGLM framework in which our thesis goals will be accomplished. In chapter 4 we will compare reserve estimates between DCLM and DGLM for a real dataset. In chapter 5 we will perform a simulation study for the DGLM reserve estimates, using bootstrap techniques to illustrate the variance of the estimates and to compare the reserve distributions between DCLM and DGLM. Finally, we will give conclusions in chapter 6, following the discussion sections in chapters 3, 4 and 5.

2. Double Chain Ladder Framework

First we will review the current Double Chain Ladder Method (DCLM) and its underlying assumptions. Challenging the assumptions and logic in the current model will help to get a better understanding of the overall structure. After this we will discuss some properties of DCLM in more detail, to see if the current framework can be refined before we move on to the adaptation of the framework to GLMs and a calendar year effect in the next chapter.

2.1. Current DCLM. Following Martínez-Miranda et al. (2012), we will define the assumptions and structure of the current DCLM in this section. The overall structure of the DCLM can be divided into five parts which sequentially lead to an estimate of the RBNS claim amounts and IBNR claim amounts, and thus to the total combined future payments estimate. The five steps are:

(1) Estimate factors for the origin and development trends using CLM separately on incurred claim counts and paid claim amounts, and extrapolate future incurred claim counts;
(2) Estimate the settlement delay pattern using both the incurred and paid development trends;
(3) Estimate average individual claim sizes using both the incurred and paid origin trends;
(4) Extrapolate RBNS claim amounts by applying the settlement delay pattern and average claim size to the incurred claim counts data; and
(5) Extrapolate IBNR claim amounts by applying the settlement delay pattern and average claim size to the estimated future incurred claim counts.

The first three steps determine the pattern in which incurred claims will be paid in the future, after which the last two steps apply these patterns to the incurred counts data and the estimates of future incurred count data to get RBNS and IBNR figures. We will need definitions for the data, and assumptions on the data distributions, settlement delay pattern, and independence structure, which are all stated below. We assume that two data run-off triangles are available: paid amounts and incurred counts, defined as follows. Definition.
Incurred counts: ℵ_m = {N_{ij} : (i, j) ∈ I}, with N_{ij} the total number of claims incurred in year i which have been reported in development year j, and I = {(i, j) : i = 1, …, m; j = 0, …, m−1; i + j ≤ m}. N_{ij} takes values in ℕ.

Definition. Paid amounts: {X_{ij} : (i, j) ∈ I}, with X_{ij} the total payment from claims incurred in year i and paid in development year j, with I as before.

Definition. Paid counts: ℵ^{paid}_m = {N^{paid}_{ij} : (i, j) ∈ I}, with N^{paid}_{ij} the total number of claims incurred in year i and paid in development year j. Defining d as the maximum number of years of delay until payment is made after the claim is reported, d ≤ m − 1, we can also write

N^{paid}_{ij} = Σ_{l=0}^{min(j,d)} N^{paid}_{i,j−l,l},

where N^{paid}_{i,j,l} is the number of future payments originating from the N_{ij} reported claims which are paid with l periods of delay.

With these definitions, the DCLM which we use in this thesis is formulated under the distributional assumptions A given below.

A 2.1. Settlement delay pattern. Given N_{ij}, the numbers of paid claims follow a multinomial distribution, so the random vector (N^{paid}_{i,j,0}, …, N^{paid}_{i,j,d}) ∼ Mult(N_{ij}; p_0, …, p_d), for each (i, j) ∈ I. The probabilities p_0, …, p_d denote the delay probabilities, such that Σ_{l=0}^{d} p_l = 1 and 0 < p_l < 1 for all l.

A 2.2. Individual claim size. The individual claim sizes Y^{(k)}_{ij} per incurred claim are mutually independent with distributions f_i, mean µ_i and variance σ_i². Assume that µ_i = µ γ_i, with µ a mean factor and γ_i the inflation in the accident years. The variances are σ_i² = σ² γ_i², with σ² a variance factor. It is also assumed that each claim is settled with a single payment or as a zero claim. The paid amounts can then be written as

X_{ij} = Σ_{k=1}^{N^{paid}_{ij}} Y^{(k)}_{ij},  (i, j) ∈ I.

A 2.3. Claim counts. The counts N_{ij} are independent random variables from a Poisson distribution with multiplicative parametrization E[N_{ij}] = α_i β_j and identification Σ_{j=0}^{m−1} β_j = 1. Under this identification, β_j is interpreted as the proportion of total claims reported in development year j, and α_i as the expected total number of claims originating from origin year i.

A 2.4. Independence. We assume that the variables Y^{(k)}_{ij} are independent of the counts N_{ij} and of the settlement delay pattern.

We will now explain the five steps, following Martínez-Miranda et al., which lead to an estimate of the RBNS claim amounts and IBNR claim amounts, noting the appropriate assumptions used in the individual steps.

Step 1. Origin and development trends. Estimate factors for the origin and development trends using CLM separately on incurred claim counts and paid claim amounts, and extrapolate future incurred claim counts. Following assumption A 2.3, we apply the CLM to the triangle of incurred counts, which follow a Poisson distribution.
This leads to estimates α̂_i and β̂_j for the trend parameters α_i and β_j such that

E[N_{ij}] = α_i β_j. (2.1)

For more details see appendix A.1. The estimates, denoted with a circumflex, can be used to estimate future incurred claim numbers, which will be used to extrapolate the IBNR figures: for i + j > m we set N̂_{ij} := α̂_i β̂_j. The paid amounts are not assumed to follow a Poisson distribution, but the CLM is applied here as well. So for the X_{ij} we also get estimates α̂^p_i and β̂^p_j for parameters α^p_i and β^p_j with assumption

E[X_{ij}] = α^p_i β^p_j, (2.2)

using the same identification Σ_{j=0}^{m−1} β^p_j = 1 as before. The p in the exponent denotes that we are dealing with the paid triangle.
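To make the multiplicative parametrization concrete, the sketch below recovers α̂_i and β̂_j from chain-ladder output and extrapolates the fitted counts α̂_i β̂_j. The completed cumulative triangle and development factors are hypothetical inputs, e.g. produced by a CLM routine as in appendix A.1.

```python
import numpy as np

def clm_alpha_beta(cum, f):
    """cum: completed (m x m) cumulative triangle; f: development factors.
    Recovers E[N_ij] = alpha_i * beta_j with the identification
    sum_j beta_j = 1, and returns the fitted square alpha_i * beta_j."""
    m = cum.shape[0]
    q = np.ones(m)                  # q_j: proportion reported by dev year j
    for j in range(m - 2, -1, -1):
        q[j] = q[j + 1] / f[j]
    beta = np.diff(np.concatenate(([0.0], q)))  # incremental proportions
    alpha = cum[:, -1]              # estimated ultimate counts per origin
    return alpha, beta, np.outer(alpha, beta)

# Hypothetical completed cumulative triangle with its factors:
cum = np.array([[10.0, 15.0, 17.0],
                [12.0, 18.0, 20.4],
                [14.0, 21.0, 23.8]])
alpha, beta, fitted = clm_alpha_beta(cum, [1.5, 17 / 15])
# beta sums to 1; fitted[2, 1] == 7.0 reproduces the CLM increment 21 - 14
```

The future incurred counts N̂_{ij} for i + j > m are simply read off from the fitted square.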

Remark 2.1. CLM is applied for ease of use and is not completely statistically justified. The underlying justification comes from the fact that the variance of the paid amounts can be shown to be roughly proportional to the expectation under certain assumptions, so an Overdispersed Poisson model can be fitted to the data to get maximum likelihood estimates for the parameters, see Verrall et al. (2010). An Overdispersed Poisson model assumes a multiple of a Poisson distributed random variable, which has a variance higher than its mean. Applying CLM is not a very odd choice then, but we will describe an issue with this method at the end of the chapter. Also, as mentioned earlier, CLM causes overparametrization of the model by estimating all trend parameters separately. This will be remedied by moving to a more general GLM.

Step 2. Settlement delay pattern. Estimate settlement delay proportions using both the incurred and paid development trends. Following assumption A 2.1, we will estimate a delay pattern π_0, …, π_{m−1}, from which very similar delay probabilities p_0, …, p_d can be derived. The only difference is that the probabilities are defined to sum to 1, with 0 < p_l < 1 for all l ≤ d as a practical assumption, whereas the π_l can be chosen freely. The estimated delay pattern is the more important one for the DCLM, and it can be derived by solving the following linear system once we have the estimates β̂_j for the incurred development trend and β̂^p_j for the paid development trend as introduced in step 1:

β̂^p_j = Σ_{l=0}^{j} β̂_{j−l} π_l,  j = 0, …, m−1, (2.3)

i.e. a lower-triangular linear system B π = β̂^p with matrix entries B_{jl} = β̂_{j−l} for l ≤ j and 0 otherwise. This results in a delay pattern, since the equations are formulated to express a single payment in period j as a combination of past incurreds which are individually delayed to payment in period j. Denoting the solution by π̂, we note that the values π̂_l could be negative and/or sum to more than 1. Ideally, we would want to solve the linear equations given by (2.3), but directly for the probability vector p.
So solve β̂^p = Bp, where B is the matrix with values of β̂ as in (2.3). This could be done using a constrained Least Squares minimization. But because of the format of the matrix B, p_0 will have the biggest influence in the optimization, and the following individual values of p up to p_{m−1} will have a sequentially smaller influence. This means that the resulting values will be very similar to the direct linear solution π̂. Therefore we will just follow Martínez-Miranda et al. and estimate the maximum delay period d by counting the number of successive π̂_l ≥ 0 obtained by solving (2.3), such that Σ_{l=0}^{d−1} π̂_l < 1 ≤ Σ_{l=0}^{d} π̂_l, and then define the estimated delay pattern parameters as:

p̂_l = π̂_l, l = 0, …, d−1, and p̂_d = 1 − Σ_{l=0}^{d−1} p̂_l. (2.4)
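The computations (2.3) and (2.4) can be sketched as below. The β̂ and β̂^p vectors are illustrative, and the code uses a simplified truncation rule for d (the first index where the cumulative pattern reaches 1):

```python
import numpy as np

def delay_pattern(beta, beta_p):
    """Solve beta_p[j] = sum_l beta[j-l] * pi[l] by forward substitution,
    then truncate pi to a probability vector p as in (2.4)."""
    m = len(beta)
    pi = np.zeros(m)
    for j in range(m):
        pi[j] = (beta_p[j] - sum(beta[j - l] * pi[l] for l in range(j))) / beta[0]
    csum = np.cumsum(pi)
    # simplified rule: d is the first index where the cumulative sum passes 1
    d = int(np.argmax(csum >= 1.0)) if (csum >= 1.0).any() else m - 1
    p = np.append(pi[:d], 1.0 - pi[:d].sum())
    return pi, p, d

# Illustrative incurred and paid development effects:
pi, p, d = delay_pattern(np.array([0.5, 0.3, 0.2]), np.array([0.2, 0.35, 0.3]))
# pi == [0.4, 0.46, 0.164], d == 2, p == [0.4, 0.46, 0.14]
```

Forward substitution works because B is lower-triangular with constant diagonal β̂_0.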

Remark 2.2. On closer inspection, there are two components in this estimation that challenge the assumptions. They are introduced here and discussed in more detail in section 2.2. In step 5, every delay proportion π_l will be applied to every development period, so we assume homogeneity in A 2.1; but the first delay effect π_0 is determined only by data in year j = 0. Data in this first year might be a poor representation for consecutive years. A similar observation applies to the remaining delay effects. Furthermore, when a run-off triangle is updated with a new diagonal, the new value for a given origin year i might have a paid/incurred ratio in the data that is very different from the estimated zero delay effect π_0, which corresponds to this ratio. This is most important for the origin year equal to the current calendar year, where only one data point is available. Applying π for this new year might then produce a bad estimate for future payments originating from this data point.

Step 3. Individual claim sizes. Estimate average individual claim sizes using both the incurred and paid origin trends. Following assumption A 2.2, we will determine the mean of the distribution of individual claim sizes, including the parameters γ_i which measure the inflation in accident years. We can set γ_1 = 1 for identifiability, so we can estimate µ by µ̂ = α̂^p_1 / α̂_1. As discussed in A 2.3, the interpretation of the numerator here is the total paid amount originating from year i = 1, and the interpretation of the denominator is the total number of incurred claims originating from year i = 1, so µ̂ accordingly represents the average claim payment per incurred claim. We can then estimate γ_i by:

γ̂_i = α̂^p_i / (α̂_i µ̂), i = 2, …, m. (2.5)

Step 4. Extrapolating RBNS claims. Extrapolate RBNS claim amounts by applying the settlement delay pattern and average claim size to the incurred claim counts data.
We can now estimate future RBNS claim payments by extracting payment numbers from the reported incurred claim numbers, and then multiplying these figures with an average claim size to get total claim amounts. Looking at Figure 2, we see the incurred data as well as the future incurred count estimates. CLM only allows for completing the lower triangle. Looking at Figure 3, we see the claim payments data and the location of the RBNS estimates. The estimates only run until calendar year 5 + d, with d = 4 the maximum settlement delay. The farthest point in the triangle is reached by taking the farthest incurred claim count at the current calendar year m = 5 and delaying the payment for as long as possible.

Figure 2. Incurred claim counts and CLM estimates.

Figure 3. Paid claim amounts and DCLM RBNS estimates.

Take X̂^{rbns}_{35} as an example. Intuitively, the estimated payment at this timepoint will be the average claim size in year i = 3, multiplied by the sum of the claims at (3, 1) times the 4-period delay proportion π_4 and the claims at (3, 2) times the 3-period delay proportion π_3. These are the only two incurred datapoints that can reach (3, 5), since one would need a delay of 5 periods to delay payment from (3, 0) to (3, 5), which exceeds the maximum delay assumption d = 4 in this example. Using assumptions A 2.1–A 2.4, we finally estimate X^{rbns}_{ij} by its expectation:

E[X_{ij} | ℵ_m] = E[E[X_{ij} | N^{paid}_{ij}] | ℵ_m]
= E[E[Σ_{k=1}^{N^{paid}_{ij}} Y^{(k)}_{ij} | N^{paid}_{ij}] | ℵ_m]
= E[N^{paid}_{ij} E[Y^{(k)}_{ij}] | ℵ_m]
= µ γ_i E[N^{paid}_{ij} | ℵ_m]
= µ γ_i E[Σ_{l=0}^{min{j,d}} N^{paid}_{i,j−l,l} | ℵ_m]
= µ γ_i Σ_{l=0}^{min{j,d}} E[N^{paid}_{i,j−l,l} | ℵ_m]
= µ γ_i Σ_{l=0}^{min{j,d}} N_{i,j−l} p_l.

Use of the tower property of conditional expectation is justified since paid claims are a function of incurred claims through assumption A 2.1, and the independence from A 2.4 is used in the third step. We will use the π pattern instead of the p_l for the settlement delay, in order to show the equivalence of DCLM with a single CLM procedure on the paid claim amount triangle in the next section. This matches the formula with our intuition, and as such we arrive at the following expression for an RBNS estimate:

X̂^{rbns}_{ij} = µ̂ γ̂_i Σ_{l=i−m+j}^{j} N_{i,j−l} π̂_l, (2.6)

where the summing indices are chosen to sum over the appropriate datapoints.

Step 5. Extrapolating IBNR claims. Extrapolate IBNR claim amounts by applying the settlement delay pattern and average claim size to the estimated future incurred claim counts. The future IBNR claim payments can be estimated exactly like the RBNS payments, only now we use the future incurred counts from Figure 2, since IBNR corresponds to claims that are not yet reported, so not in the dataset. The IBNR estimates also reach farther in the triangle, since the estimates N̂_{ij} lie in later calendar years; applying the maximum delay period to these claims, we get estimates as far as seen in Figure 4.

Figure 4. Paid claim amounts and DCLM IBNR estimates.

Notice that we do not get any estimates for year i = 1, since there are no incurred estimates N̂_{1j} to extract payments from. Taking again X̂^{ibnr}_{35} as an example, we now delay the future incurred claims in (3, 3) by 2 periods and those in (3, 4) by 1 period to estimate the payments in (3, 5). Note that for timepoints such as X̂^{ibnr}_{34} in the lower triangle, we use future incurreds with a zero delay as well, since they can also be paid in the same year. The final expression for an IBNR estimate is then:

X̂^{ibnr}_{ij} = µ̂ γ̂_i Σ_{l=0}^{i−m+j−1} N̂_{i,j−l} π̂_l, (2.7)

where the summing indices are chosen to sum over the appropriate future claim counts. Finally, summing the RBNS and IBNR components gives us the total future payment estimates X̂^{DCLM}_{ij} = X̂^{rbns}_{ij} + X̂^{ibnr}_{ij}. Comparing our derived DCLM estimates with CLM estimates, it can be shown that X̂^{DCLM}_{ij}, using (2.6) with an adjustment (denoted X̂^{rbns(2)}_{ij}) together with (2.7), gives the same estimate as CLM applied solely to the paid claim data as seen in Figure 3. For the RBNS estimate in (2.6) we then need to use fitted values for the claim counts instead of the given data N_{ij}.
So we have used our CLM estimates of α̂ and β̂ to estimate future counts in step 1, but they can also be used to calculate fitted values in place of the datapoints, although the differences will be small. The CLM only produces estimates for development years stretching as far as the data, so no further than year j = 4 in the example. The DCLM estimates

for development years j > 4 are a useful extension of the CLM model, eliminating the need for a tail factor to model these remaining payments. The equivalence for years j ≤ 4 can be shown as follows:

X̂^{rbns(2)}_{ij} + X̂^{ibnr}_{ij}
= (µ̂ γ̂_i Σ_{l=i−m+j}^{j} N̂_{i,j−l} π̂_l) + (µ̂ γ̂_i Σ_{l=0}^{i−m+j−1} N̂_{i,j−l} π̂_l)
= µ̂ γ̂_i Σ_{l=0}^{j} N̂_{i,j−l} π̂_l
= µ̂ γ̂_i Σ_{l=0}^{j} α̂_i β̂_{j−l} π̂_l
= (α̂_i µ̂ γ̂_i) Σ_{l=0}^{j} β̂_{j−l} π̂_l
= α̂^p_i Σ_{l=0}^{j} β̂_{j−l} π̂_l
= α̂^p_i β̂^p_j = X̂^{CLM}_{ij},

where in the fifth and sixth step we used the definition of γ̂_i and the definition of π̂ as the solution of the linear system (2.3). The preferred DCLM thus deviates from CLM by taking the real incurred count data instead of fitted values.
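Steps 3–5 can be sketched end-to-end as below. All inputs (counts, trends, delay pattern) are small hypothetical fixtures, and the indexing is 0-based, so a cell (i, j) is observed when i + j ≤ m − 1:

```python
import numpy as np

def dclm_reserves(N, N_hat, alpha, alpha_p, pi):
    """Steps 3-5: claim-size factors (2.5), then RBNS (2.6) and IBNR (2.7).
    N: observed incurred counts (np.nan below the anti-diagonal);
    N_hat: fitted/extrapolated counts for the full square;
    alpha, alpha_p: incurred and paid origin trends; pi: delay pattern."""
    m = N.shape[0]
    mu = alpha_p[0] / alpha[0]            # average claim size, gamma_1 = 1
    gamma = alpha_p / (alpha * mu)        # accident-year inflation (2.5)
    d = len(pi) - 1
    rbns = np.zeros((m, m + d))
    ibnr = np.zeros((m, m + d))
    for i in range(m):
        for j in range(m + d):
            if i + j <= m - 1:            # (i, j) already observed: no estimate
                continue
            for l in range(min(j, d) + 1):
                if j - l >= m:            # no counts beyond development m-1
                    continue
                if i + (j - l) <= m - 1:  # delayed from an observed count: RBNS
                    rbns[i, j] += mu * gamma[i] * N[i, j - l] * pi[l]
                else:                     # delayed from a future count: IBNR
                    ibnr[i, j] += mu * gamma[i] * N_hat[i, j - l] * pi[l]
    return rbns, ibnr

# Hypothetical 2x2 fixture with maximum delay d = 1:
N = np.array([[10.0, 5.0], [8.0, np.nan]])
N_hat = np.array([[10.0, 5.0], [8.0, 4.0]])
rbns, ibnr = dclm_reserves(N, N_hat,
                           alpha=np.array([15.0, 12.0]),
                           alpha_p=np.array([30.0, 30.0]),
                           pi=np.array([0.6, 0.4]))
# e.g. rbns[1, 1]: the observed count at (1, 0) delayed one period
```

The RBNS and IBNR squares extend to m + d development columns, matching the tail extension described above.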

2.2. Discussing DCLM. As mentioned before, the DCLM has many observable variables with a practical interpretation, which allows for transparent application of expert judgement or adjustments. There are still some issues for a number of variables, however, which we will discuss here.

Settlement delay pattern discussion 1. The only variable that might not accurately represent a realistic effect is the settlement delay π. By solving for π in a linear system we get a delay effect that is intuitively correct, but analyzing its application in depth reveals two inconsistencies, as discussed in remark 2.2. For the first one, we consider the way in which the linear system is solved. We start with π_0 = β^p_0 / β_0, the ratio of paid to incurred. This means that the zero delay effect is based solely on data in the first year. Next we get the 1-year delay π_1 = (β^p_1 − π_0 β_1) / β_0. This formula can be interpreted as follows:

(1) You take the amount paid in the second year, β^p_1;
(2) You subtract the amount paid that was reported in the second year, π_0 β_1, so that only payments originating from the first year remain; and
(3) You divide the payments originating from the first year by the incurred in the first year, β_0, to get the needed proportion.

There are two illogicalities that arise from this method. You assume that the zero delay portion paid in the second year is exactly the same as the zero delay portion in the first year, since π_0, which is based solely on data from the first year, is applied here as well. But in reality this portion can be very different in the second year. This means that the estimate of the remaining payments originating from the first year in the second step is too large or too small, so π_1 does not represent a proper 1-year delay. It is easy to construct examples where this issue leads to negative values in π, which is usually not realistic, since it would mean that the insurer receives payments at some point.
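The appearance of negative entries can be checked numerically. The sketch below solves the lower-triangular system for the illustrative effects β = (0.6, 0.35, 0.05) and β^p = (0.3, 0.55, 0.15) used in this discussion:

```python
import numpy as np

beta = np.array([0.60, 0.35, 0.05])
beta_p = np.array([0.30, 0.55, 0.15])

# Lower-triangular system B pi = beta_p with B[j, l] = beta[j - l]
B = np.array([[beta[j - l] if l <= j else 0.0 for l in range(3)]
              for j in range(3)])
pi = np.linalg.solve(B, beta_p)
# pi == [0.5, 0.625, -0.15625]: assuming the first-year zero delay for the
# second year overstates first-year payments and forces a negative 2-year delay
```

Any β, β^p pair where the paid pattern lags the incurred pattern unevenly can produce such negative entries.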
And again, the 1-year delay is based only on data from the first and second year, so it might not be a good representation for later years. We give a more detailed example with some imaginary values for β and β^p that illustrate the issue. Suppose we have run-off triangles with 3 years of data, and we get the following incurred and paid effects: β = (0.6, 0.35, 0.05), β^p = (0.3, 0.55, 0.15). So for this product most claims get reported in the first year, but most payments occur in the second year. This might happen because damages occur on average mid-year, so they are reported towards the end of the year, which means that they are probably settled at the beginning of the next year. We would get π_0 = 0.3/0.6 = 0.5 in the first year. But it might be entirely possible that most claims that get reported in the second year are in fact reported at the beginning of the year, so that they are also settled in that year. The zero delay in this year is then very high, say 0.8 for example. An estimate of the 1-year delay should then be π_1 = (0.55 − 0.8 · 0.35)/0.6 = 0.45. But assuming the same zero delay of 0.5 as in the first year while applying the DCLM approach, we actually get π = (0.5, 0.625, −0.15625). This happens because we overestimate the

payments originating from the first year. Evidently, this is not a good representation of the delay effect. A more appropriate approach would be to use actual data about which proportion of the payments in the second year originates from the first year, but most companies are reluctant or unable to store this level of detailed information. Using π̂ will, however, still result in sensible estimates. This is because an estimate such as (2.7) usually contains more than one entry of the vector π̂. The incorrect shift included in the calculation of one π̂_i is countered by a similar shift in the calculation of π̂_{i+1}, so the mistakes cancel each other out. It is thus not a very big issue to use π̂, but the single entries π̂_i themselves do not have a realistic interpretation, so one should be careful when adjusting these separate values according to expert judgement.

Settlement delay pattern discussion 2. For the second inconsistency: we have fitted delay proportions π_0, …, π_{m−1} to the data, but for RBNS estimates we already have data available on the past delay effects. Take for example datapoint (5, 0) in Figure 3. We may have fitted a π_0 to all the data, but we should use the actual data about the ratio of paid to incurred in (5, 0) to see which portion of the incurred claims still remains to be paid in the future. We could have, for example, that π_0 = 0.6, so the total future proportion will be 0.4. In the current DCLM, this total effect of 0.4 is used regardless of the data. If the actual payment for (5, 0) indicates a zero delay effect of 0.75, then only 0.25 remains to be paid in the future, so there is a payment shift relative to the average π effect. We would therefore advise to rescale the π_l for l > 0 to arrive at a total effect of 0.25; otherwise the total payment proportion would be 0.75 + 0.4 = 1.15 instead of 1. Rescaling would still retain the same proportions between the estimated π_l.
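The proposed rescaling can be sketched as follows, using the hypothetical values above (fitted π_0 = 0.6 and an observed zero delay of 0.75); the remaining entries of π are illustrative only:

```python
import numpy as np

def rescale_delay(pi, observed_zero_delay):
    """Replace the fitted zero-delay effect by the observed paid/incurred
    ratio of the newest origin year, and rescale pi_1..pi_d so the total
    stays 1, preserving the proportions between the remaining effects."""
    pi = np.asarray(pi, dtype=float)
    remaining = 1.0 - observed_zero_delay
    scaled = pi[1:] * remaining / pi[1:].sum()
    return np.concatenate(([observed_zero_delay], scaled))

pi_new = rescale_delay([0.6, 0.25, 0.15], observed_zero_delay=0.75)
# pi_new == [0.75, 0.15625, 0.09375]: totals 1, ratio 0.25 : 0.15 preserved
```

The rescaled vector would be applied only to the origin year for which the observed ratio is available; older origin years keep the fitted pattern.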
There are no issues for IBNR estimates, since these are based on future incurred counts, so there is no data with which to update π. In most situations, however, updating π with actual data ratios of paid counts/incurred counts might not be feasible. An insurer should be able to provide a run-off triangle with paid counts, which can be used in combination with incurred counts to derive information about past delays. But assumption A 2.4 in the DCLM, that every incurred claim will be settled with one claim payment, will usually not hold in practice. There might be many paid counts arising from a single incurred count, for example in disability insurance. A ratio of paid counts / incurred counts in the first development year then does not represent a portion of incurred claims settled in that year, so it does not represent a zero delay portion. We propose to establish a relation between total incurred counts and total paid counts. One could show a significant trend implying fixed proportions of incurred counts and paid counts, for example 3 payments on average for every incurred count. Then it is possible to derive valid information from a ratio of paid counts / incurred counts. We can analyze statistically whether including this data would result in better estimates. A procedure which derives information from paid counts versus incurred counts can be linked to the Munich Chain Ladder method. Munich Chain Ladder combines incurred amounts and paid amounts to reduce the estimated parameter variance, see Quarg and Mack (2004).
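A minimal sketch of the "fixed proportion" check suggested above, under assumptions of our own (the thesis does not prescribe this procedure, and the totals below are made up): compute the paid/incurred ratio per origin year and use its coefficient of variation as a crude stability measure; a formal significance test could replace it.

```python
# Sketch (assumption, not the thesis's procedure): are paid counts a
# roughly fixed multiple of incurred counts across origin years, so that
# paid/incurred ratios still carry valid delay information when one
# incurred claim generates several payments? Totals below are made up.
incurred_totals = [100, 110, 95, 120, 105]   # per origin year
paid_totals = [295, 332, 288, 362, 310]      # roughly 3 payments per claim

ratios = [p / n for p, n in zip(paid_totals, incurred_totals)]
mean_ratio = sum(ratios) / len(ratios)
# Coefficient of variation: small values support a fixed proportion.
sd = (sum((r - mean_ratio) ** 2 for r in ratios) / (len(ratios) - 1)) ** 0.5
cv = sd / mean_ratio
print(round(mean_ratio, 3), round(cv, 3))
```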

Trend parameters discussion. A CLM does not produce a trend for a calendar year effect, i.e. a parameter that depends on i+j. There are a number of effects on payment patterns that are better reflected by a calendar year effect, so it would be useful to include it in the model. We will include this in chapter 3, when we introduce a more general GLM structure to estimate parameters. Aside from inflation, which impacts individual claim sizes, we will use calendar year effects to identify payment shifts in paid counts in the data. The exact notion of a payment shift will become apparent in section 3, when we formulate the DGLM and Gamma Factoring.

Distribution paid amounts discussion. As mentioned briefly in remark 2.1, an Overdispersed Poisson model can be fitted to the paid amounts X. But this is different from the compound distribution assumption X = Σ_{k=1}^{N^paid} Y^(k) which is used to derive the DCLM reserve estimates. So when the variance of random variables is derived by means of overdispersed X, there is a theoretical mismatch in the model. The mismatch will be small if both distributions are similar, but it is worth noting.

2.3. DCLM issue summary. As discussed in the sections above, there are a number of shortcomings in the DCLM that could be resolved: (1) there is overparametrization for estimated trends in the CLM; (2) there is no interpretation for different calendar year effects; (3) the CLM is only statistically justified for Poisson distributed data; (4) the settlement delay pattern π does not reflect past delay data per year of origin for RBNS estimates; (5) the settlement delay pattern π does not have a realistic interpretation on an individual parameter level; and (6) there is a mismatch between different assumptions for X. We will now formulate the DGLM in the next section, where we will address a number of these issues.

3. Double GLM framework

We will state the Double Generalized Linear Model (DGLM) here in the same way we stated the DCLM in chapter 2. First we will state the different steps of the new framework, followed by a more detailed explanation. This will allow us to compare the DGLM with the DCLM more easily. Assumptions A.1 and A.4 about the settlement delay pattern and independence will remain the same, while we provide an update for A.2 and A.3 to reflect the use of GLMs. We will also provide a separate summary of parameter interpretations.

3.1. Formulating DGLM. The overall structure of the DGLM can be divided into six parts which sequentially lead to an estimate of the RBNS claim amounts and IBNR claim amounts, and thus to the total combined future payments estimate. The six steps are:
(1) Estimate factors for origin and development trend using a GLM separately for incurred claim counts and paid claim amounts, and extrapolate future incurred claim counts. Include a calendar year trend in the GLM for paid claim amounts as well;
(2) Estimate future calendar year effects based on past calendar year effects given by the paid amount GLM, and split calendar year effects into inflation and payment shifts;
(3) Estimate the settlement delay pattern using both incurred and paid development trends, and calendar year shift effects;
(4) Estimate average individual claim sizes using both incurred and paid origin trends, and calendar year inflation effects;
(5) Extrapolate RBNS claim amounts by applying the settlement delay pattern and average claim size to the incurred claim counts data; and
(6) Extrapolate IBNR claim amounts by applying the settlement delay pattern and average claim size to the estimated future incurred claim counts.
The first four steps determine the pattern in which incurred claims will be paid in the future, after which the last two steps apply these patterns to the incurred counts data and estimates of future incurred count data to get RBNS and IBNR figures.
Especially steps 1, 2, 3 and 4 are different from the DCLM. Assumptions for the independence structure, and definitions for paid amounts and incurred counts, are defined as in the DCLM. We will provide our own assumptions for the data distributions, which will be slightly different than before.

A 3.1. Settlement delay pattern. Given N_{ij}, the paid claim numbers follow a multinomial distribution, so (N^paid_{i,j,0}, ..., N^paid_{i,j,d}) ~ Mult(N_{ij}; p_{i+j,0}, ..., p_{i+j,d}) for each (i, j) ∈ I. The probabilities p_{i+j,0}, ..., p_{i+j,d} denote the delay probabilities, such that Σ_{l=0}^{d} p_{i+j,l} = 1 and 0 < p_{i+j,l} < 1 for all l. The p_{i+j,l} will be based on the delay probabilities π_l as in assumption A 2.1, as well as on γ^shift_{i+j}, to be explained in step 2.
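Assumption A 3.1 is easy to simulate. The sketch below is purely illustrative: the delay probabilities are made-up values, whereas in the DGLM they derive from π and the calendar year shift effect.

```python
# Sketch (illustrative, not thesis code): given an incurred count N_ij,
# draw the paid claim numbers per settlement delay from the multinomial
# distribution of assumption A 3.1.
import numpy as np

rng = np.random.default_rng(42)

n_incurred = 200                      # N_ij: incurred count for one cell
p_delay = np.array([0.5, 0.3, 0.2])   # p_{i+j,0..d}: hypothetical, sums to 1

paid_by_delay = rng.multinomial(n_incurred, p_delay)
print(paid_by_delay, paid_by_delay.sum())  # counts per delay; total is 200
```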

A 3.2. Individual claim size. The individual claim sizes Y^(k) per incurred claim are mutually independent with distributions f, with mean µ_{ij} and variance σ²_{ij}. Assume that µ_{ij} = µ · w_i · γ^infl_{i+j}, with µ a mean factor, w_i a factor in accident years and γ^infl_{i+j} a calendar year inflation effect to be explained in step 2. Also, the variances are σ²_{ij} = σ² · (w_i γ^infl_{i+j})², with σ² a variance factor. Moreover, it is assumed that the claims are settled with a single payment or as a zero claim.

A 3.3. Claim counts and amounts. Incurred counts N_{ij} are independent random variables from a distribution in the exponential family with multiplicative parametrization E[N_{ij}] = α_i β_j and identification Σ_{j=0}^{m−1} β_j = 1. The paid amounts X_{ij} are independent random variables from a distribution in the exponential family with multiplicative parametrization E[X_{ij}] = α^p_i β^p_j γ_{i+j} and identification Σ_{j=0}^{m−1} β^p_j = 1. The p in the exponent denotes that we are dealing with the paid triangle, and X_{ij} = Σ_{k=1}^{N^paid_{ij}} Y^(k) for (i, j) ∈ I.

Using the parametrizations from the assumptions, we want to arrive at the following interpretations for all parameters to be estimated:
α_i: expected total number of incurred claims for origin year i;
β_j: expected proportion of claims reported in development year j;
α^p_i: expected total paid amount for origin year i at base level, so without inflation effects, thus measuring mostly an exposure level;
β^p_j: expected proportion of claim amounts paid in development year j;
γ_{i+j}: calendar year effect including both economic inflation and payment shifts, such as an increase in settlements by the insurer;
p_d: expected proportion of claim amounts arising from incurred claims N delayed for d periods till payment;
µ_i: expected average paid amount per incurred claim for origin year i at base level, so without inflation effects, thus measuring exposure.
We will show that these interpretations are correct in the sections below.

Step 1.
Origin, development and calendar trends. Estimate factors for origin and development trend using a GLM separately for incurred claim counts and paid claim amounts, and extrapolate future incurred claim counts. Include a calendar year trend in the GLM for paid claim amounts as well. We will formulate a GLM for the incurred counts that complies with assumption A 3.3. In the following we will refer to this model as the Incurred GLM:
Stochastic component: Observations N_{ij} have a density in the exponential

family with a mean µ_{ij}.
Systematic component: We have a linear predictor η_{ij} = α_i + β_j. We will use a ResQ representation as explained in remark A.1 in the appendix, because we will use Towers Watson projection software ResQ to calibrate our GLMs. Therefore we get η_{ij} = Σ_{n=1}^{i} a_n + Σ_{m=0}^{j} b_m.
Link function: We define link function g(µ_{ij}) = log(µ_{ij}) = η_{ij}. So we get µ_{ij} = exp(Σ_{n=1}^{i} a_n + Σ_{m=0}^{j} b_m) = exp(Σ_{n=1}^{i} a_n) · exp(Σ_{m=0}^{j} b_m) = α_i β_j.

Remark 3.1. Before, in this multiplicative representation, α_i and β_j would just be arbitrary numbers derived by applying an algorithm such as the CLM. This means that parameters used to fit claims in the lower part of a run-off triangle, such as cell (5,0) in figure 2, are based on only very few data points. The resulting estimates can be very unstable. In order to avoid this type of overparametrization, we will define an actual trend underlying the number sequences α_i and β_j for the origin and development trends. We will do this by using the ResQ representation. One might want α_i = α^i for some real number α, so an exponential trend instead of a number sequence. This amounts to keeping a_n = a constant in the linear predictor, because then we get:

α_i = exp(Σ_{n=1}^{i} a_n) = exp(Σ_{n=1}^{i} a) = exp(i · a) = exp(a)^i = α^i, with α := exp(a).

We will continue using the ResQ representation, which is very convenient for constructing different types of trends. One might choose to create a fit with a_n arbitrary for the first value, constant for a number of consecutive values, and zero for the remaining values. This also makes it easy to extrapolate values for the trends beyond the data periods, which will be especially useful for extrapolating future calendar year effects, since normal estimation only results in values for past and current calendar years. Which type of trend fits the data best, such as a and b constant or arbitrary, will not be discussed in this thesis, but there are plenty of methods to analyze and compare different choices.
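The ResQ representation in remark 3.1 can be sketched numerically. This is an illustrative sketch, not ResQ itself: each origin factor is the exponential of a cumulative sum of increments a_n, so constant increments reproduce the exponential trend α_i = exp(a)^i, and mixing free, constant and zero increments gives other trend shapes. All numbers are made up.

```python
# Sketch (illustrative, not ResQ): origin factors as exponentials of
# cumulative increments, alpha_i = exp(a_1 + ... + a_i).
import math

def origin_factors(increments):
    """alpha_i = exp(a_1 + ... + a_i) for i = 1..len(increments)."""
    factors, total = [], 0.0
    for a in increments:
        total += a
        factors.append(math.exp(total))
    return factors

a = 0.1
constant = origin_factors([a] * 5)
# Constant increments reproduce alpha_i = exp(a)**i:
assert all(abs(f - math.exp(a) ** (i + 1)) < 1e-12
           for i, f in enumerate(constant))

# Free first increment, constant middle, zero tail (a flattening trend):
mixed = origin_factors([0.3, 0.1, 0.1, 0.0, 0.0])
print([round(f, 4) for f in mixed])  # -> [1.3499, 1.4918, 1.6487, 1.6487, 1.6487]
```

The zero-increment tail also shows how extrapolation beyond the data periods works: simply continue the chosen increment pattern.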
Furthermore, a GLM is only a specification of a model and does not prescribe a standard method for estimating a and b. We will use the Weighted Least Squares method to fit a and b to the data in the run-off triangle, with identification Σ_{j=0}^{m−1} β_j = 1. We will now formulate a GLM for the paid amounts that complies with assumption A 3.3. For the paid amounts we will also include a calendar year effect γ. In the following we will refer to this model as the Paid GLM:
Stochastic component: Observations X_{ij} have a density in the exponential family with a mean µ_{ij}. A distribution will be chosen that is very similar to that of X_{ij} = Σ_{k=1}^{N^paid_{ij}} Y^(k), which unfortunately is not in the exponential family.
Systematic component: We have a linear predictor η_{ij} = α_i + β_j + γ_{i+j}. We will use a ResQ representation as explained in remark A.1 in the appendix. So we get η_{ij} = Σ_{n=1}^{i} a^p_n + Σ_{m=0}^{j} b^p_m + Σ_{l=1}^{i+j} c_l.

Link function: We define link function g(µ_{ij}) = log(µ_{ij}). This means we get µ_{ij} = E[X_{ij}] = exp(Σ_{n=1}^{i} a^p_n + Σ_{m=0}^{j} b^p_m + Σ_{l=1}^{i+j} c_l) = exp(Σ_{n=1}^{i} a^p_n) · exp(Σ_{m=0}^{j} b^p_m) · exp(Σ_{l=1}^{i+j} c_l) = α^p_i β^p_j γ_{i+j}.

Again we define an actual trend for a ResQ representation a^p, b^p and c. Weighted Least Squares will be used to fit a^p, b^p and c to the data in the run-off triangle, with identifications Σ_{j=0}^{m−1} β^p_j = 1 and γ_1 = 1. It is now very important that the parameters produced by the GLM still have a realistic interpretation, because we use these interpretations to formulate estimates for settlement delay pattern probabilities and average claim sizes, as seen in the DCLM. It might happen that parameter optimization in the GLM results in development and origin effects being absorbed into the calendar year parameters. This may happen when a poor GLM is chosen, so when wrong trends are modelled for a^p, b^p and c. We believe that α^p, or a^p, should be a measure of exposure, and should thus be fairly or entirely constant over different years of origin. Choosing a sensible trend c for calendar year effects will then really assure us of a realistic interpretation for γ and β^p. This can be seen by considering a small triangle with fitted values where the α^p_i are constant. For explaining the interpretations we can just leave α^p out of the triangle, since it will not alter any proportions between β^p and γ.

i\j   0            1            2
1     β^p_0 γ_1    β^p_1 γ_2    β^p_2 γ_3
2     β^p_0 γ_2    β^p_1 γ_3
3     β^p_0 γ_3

Figure 5. Fitted paid amounts for constant exposure α^p

We can also portray this triangle with an index k for calendar year effects.

k\j   0            1            2
1     β^p_0 γ_1
2     β^p_0 γ_2    β^p_1 γ_2
3     β^p_0 γ_3    β^p_1 γ_3    β^p_2 γ_3

Figure 6. Fitted paid amounts for constant exposure α^p, indexed by calendar year k

We can see that we have the same interpretation for β^p as before, namely the expected portions of amounts paid in all development years. We can see this since for every calendar year k, we have a total amount γ_k α^p.
This amount is divided in portions β^p_j over claims originating in that same year, claims originating a year earlier, and so on. These are the same kind of portions as seen before. Even if α^p is not completely constant but has a slight upward or downward trend, the skewness caused by the α^p_i in the illustration should not have a large impact on the interpretation that we explained. In a later chapter we will show for an example data

triangle that the difference between β^p in the DCLM and the DGLM is minimal. In line with the above illustration, γ is then really a calendar year effect and does not include any effects from origin or development trends. So we start with a total base level exposure α^p, which is then divided over development years with β^p and then corrected for positive or negative calendar year effects with γ.

Step 2. Future calendar year effects. Estimate future calendar year effects based on past calendar year effects given by the paid amount GLM, and split calendar year effects into inflation and payment shifts. We estimate calendar year effects γ_{i+j} with a trend in the Paid GLM. We only have data on past calendar years in a run-off triangle, so we need to extrapolate future values of γ_{i+j} for i + j > m. Before we do this, we will split the estimated calendar year trend into two effects. Following assumption A 3.3, we have X_{ij} = Σ_{k=1}^{N^paid_{ij}} Y^(k). Total amounts are thus dependent on individual claim sizes and on numbers of paid claims. The total amount might decrease or increase, which is caused either by a change in claim sizes or a change in paid claim numbers. Depending on which of these components has decreased or increased, the extrapolation procedure for future values is very different. Identifying and splitting these different trends is what we will call Gamma Factoring. We will then have the following trends:

Economic inflation γ^infl: If there is a decreasing or increasing inflation trend for claim sizes, such as higher medical costs, it can be expected that this trend will continue to some degree. We can extrapolate an increasing trend for this parameter. Formally, we can define γ^infl_{i+j} := exp(Σ_{l=1}^{i+j} c^infl_l) for i + j > m, with c^infl some real vector and m the current calendar year. These values will be included in the average claim sizes in step 4.
Payment shifts γ^shift: If the paid numbers have strongly increased, it might mean that more claims than expected have been settled from the total incurred claims. This can happen when insurers decide to catch up on old outstanding claims, so paid amounts will be higher in a specific calendar year s. The Paid GLM will reflect this with a very high value for γ_s. For future years there are then fewer claims remaining to be paid, so we would need to extrapolate a decreasing trend. Extrapolation for payment shifts will be included in the settlement delay pattern in step 3.

Both effects have to be identified in one paid amount data triangle. Smooth trends in γ should be recognized as economic inflation, and strong outlying values in γ should be recognized as payment shifts. One could use an additional paid counts triangle to analyze paid numbers in order to separate the calendar year effects. For now we will use a simple Gamma Factoring in chapter 4. First define γ^infl by taking the entries in γ that constitute an identifiable smooth trend, and extrapolate that trend to replace the entries that show strong outlying values. Secondly, one can then define γ^shift := γ / γ^infl.
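The simple Gamma Factoring just described can be sketched as follows. This is an illustrative sketch under our own assumptions, not the chapter 4 procedure itself: the calendar year effects, the choice of outlier index, and the log-linear trend fit are all made up for the example.

```python
# Sketch of simple Gamma Factoring (illustrative; numbers and the outlier
# rule are made-up assumptions): fit a log-linear inflation trend to the
# smooth entries of gamma, then define gamma_shift = gamma / gamma_infl.
import math

def gamma_factoring(gamma, outliers):
    """Split gamma into a smooth inflation part and a residual shift part.

    outliers: indices treated as payment shifts; their entries are
    replaced by the interpolated/extrapolated log-linear trend."""
    smooth = [(k, g) for k, g in enumerate(gamma) if k not in outliers]
    # Least-squares fit of log(gamma_k) = c0 + c1 * k on the smooth entries.
    n = len(smooth)
    sx = sum(k for k, _ in smooth)
    sy = sum(math.log(g) for _, g in smooth)
    sxx = sum(k * k for k, _ in smooth)
    sxy = sum(k * math.log(g) for k, g in smooth)
    c1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c0 = (sy - c1 * sx) / n
    gamma_infl = [math.exp(c0 + c1 * k) for k in range(len(gamma))]
    gamma_shift = [g / gi for g, gi in zip(gamma, gamma_infl)]
    return gamma_infl, gamma_shift

# Hypothetical calendar year effects: roughly 2% inflation per year, with
# a payment shift (catch-up on old claims) in the year with index 3.
gamma = [1.0, 1.02, 1.0404, 1.25, 1.0824]
infl, shift = gamma_factoring(gamma, outliers={3})
print([round(g, 3) for g in infl])   # smooth inflation trend
print([round(g, 3) for g in shift])  # ~1 everywhere except the shift year
```

The inflation part can then be continued into future calendar years via the fitted c^infl, while the shift part feeds into the settlement delay pattern of step 3.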


More information

Two hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER

Two hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER Two hours MATH20802 To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER STATISTICAL METHODS Answer any FOUR of the SIX questions.

More information

Black-Litterman Model

Black-Litterman Model Institute of Financial and Actuarial Mathematics at Vienna University of Technology Seminar paper Black-Litterman Model by: Tetyana Polovenko Supervisor: Associate Prof. Dipl.-Ing. Dr.techn. Stefan Gerhold

More information

Simulating Continuous Time Rating Transitions

Simulating Continuous Time Rating Transitions Bus 864 1 Simulating Continuous Time Rating Transitions Robert A. Jones 17 March 2003 This note describes how to simulate state changes in continuous time Markov chains. An important application to credit

More information

Rules and Models 1 investigates the internal measurement approach for operational risk capital

Rules and Models 1 investigates the internal measurement approach for operational risk capital Carol Alexander 2 Rules and Models Rules and Models 1 investigates the internal measurement approach for operational risk capital 1 There is a view that the new Basel Accord is being defined by a committee

More information

GI ADV Model Solutions Fall 2016

GI ADV Model Solutions Fall 2016 GI ADV Model Solutions Fall 016 1. Learning Objectives: 4. The candidate will understand how to apply the fundamental techniques of reinsurance pricing. (4c) Calculate the price for a casualty per occurrence

More information

Prediction Uncertainty in the Chain-Ladder Reserving Method

Prediction Uncertainty in the Chain-Ladder Reserving Method Prediction Uncertainty in the Chain-Ladder Reserving Method Mario V. Wüthrich RiskLab, ETH Zurich joint work with Michael Merz (University of Hamburg) Insights, May 8, 2015 Institute of Actuaries of Australia

More information

Modelling economic scenarios for IFRS 9 impairment calculations. Keith Church 4most (Europe) Ltd AUGUST 2017

Modelling economic scenarios for IFRS 9 impairment calculations. Keith Church 4most (Europe) Ltd AUGUST 2017 Modelling economic scenarios for IFRS 9 impairment calculations Keith Church 4most (Europe) Ltd AUGUST 2017 Contents Introduction The economic model Building a scenario Results Conclusions Introduction

More information

Logit Models for Binary Data

Logit Models for Binary Data Chapter 3 Logit Models for Binary Data We now turn our attention to regression models for dichotomous data, including logistic regression and probit analysis These models are appropriate when the response

More information

Jacob: The illustrative worksheet shows the values of the simulation parameters in the upper left section (Cells D5:F10). Is this for documentation?

Jacob: The illustrative worksheet shows the values of the simulation parameters in the upper left section (Cells D5:F10). Is this for documentation? PROJECT TEMPLATE: DISCRETE CHANGE IN THE INFLATION RATE (The attached PDF file has better formatting.) {This posting explains how to simulate a discrete change in a parameter and how to use dummy variables

More information

Study Guide on LDF Curve-Fitting and Stochastic Reserving for SOA Exam GIADV G. Stolyarov II

Study Guide on LDF Curve-Fitting and Stochastic Reserving for SOA Exam GIADV G. Stolyarov II Study Guide on LDF Curve-Fitting and Stochastic Reserving for the Society of Actuaries (SOA) Exam GIADV: Advanced Topics in General Insurance (Based on David R. Clark s Paper "LDF Curve-Fitting and Stochastic

More information

APPROACHES TO VALIDATING METHODOLOGIES AND MODELS WITH INSURANCE APPLICATIONS

APPROACHES TO VALIDATING METHODOLOGIES AND MODELS WITH INSURANCE APPLICATIONS APPROACHES TO VALIDATING METHODOLOGIES AND MODELS WITH INSURANCE APPLICATIONS LIN A XU, VICTOR DE LA PAN A, SHAUN WANG 2017 Advances in Predictive Analytics December 1 2, 2017 AGENDA QCRM to Certify VaR

More information

GN47: Stochastic Modelling of Economic Risks in Life Insurance

GN47: Stochastic Modelling of Economic Risks in Life Insurance GN47: Stochastic Modelling of Economic Risks in Life Insurance Classification Recommended Practice MEMBERS ARE REMINDED THAT THEY MUST ALWAYS COMPLY WITH THE PROFESSIONAL CONDUCT STANDARDS (PCS) AND THAT

More information

Keywords Akiake Information criterion, Automobile, Bonus-Malus, Exponential family, Linear regression, Residuals, Scaled deviance. I.

Keywords Akiake Information criterion, Automobile, Bonus-Malus, Exponential family, Linear regression, Residuals, Scaled deviance. I. Application of the Generalized Linear Models in Actuarial Framework BY MURWAN H. M. A. SIDDIG School of Mathematics, Faculty of Engineering Physical Science, The University of Manchester, Oxford Road,

More information

a 13 Notes on Hidden Markov Models Michael I. Jordan University of California at Berkeley Hidden Markov Models The model

a 13 Notes on Hidden Markov Models Michael I. Jordan University of California at Berkeley Hidden Markov Models The model Notes on Hidden Markov Models Michael I. Jordan University of California at Berkeley Hidden Markov Models This is a lightly edited version of a chapter in a book being written by Jordan. Since this is

More information

Operational Risk Aggregation

Operational Risk Aggregation Operational Risk Aggregation Professor Carol Alexander Chair of Risk Management and Director of Research, ISMA Centre, University of Reading, UK. Loss model approaches are currently a focus of operational

More information

Characterization of the Optimum

Characterization of the Optimum ECO 317 Economics of Uncertainty Fall Term 2009 Notes for lectures 5. Portfolio Allocation with One Riskless, One Risky Asset Characterization of the Optimum Consider a risk-averse, expected-utility-maximizing

More information

Tests for Two ROC Curves

Tests for Two ROC Curves Chapter 65 Tests for Two ROC Curves Introduction Receiver operating characteristic (ROC) curves are used to summarize the accuracy of diagnostic tests. The technique is used when a criterion variable is

More information

Actuarial Society of India EXAMINATIONS

Actuarial Society of India EXAMINATIONS Actuarial Society of India EXAMINATIONS 7 th June 005 Subject CT6 Statistical Models Time allowed: Three Hours (0.30 am 3.30 pm) INSTRUCTIONS TO THE CANDIDATES. Do not write your name anywhere on the answer

More information

Alternative VaR Models

Alternative VaR Models Alternative VaR Models Neil Roeth, Senior Risk Developer, TFG Financial Systems. 15 th July 2015 Abstract We describe a variety of VaR models in terms of their key attributes and differences, e.g., parametric

More information

Case Study: Heavy-Tailed Distribution and Reinsurance Rate-making

Case Study: Heavy-Tailed Distribution and Reinsurance Rate-making Case Study: Heavy-Tailed Distribution and Reinsurance Rate-making May 30, 2016 The purpose of this case study is to give a brief introduction to a heavy-tailed distribution and its distinct behaviors in

More information

On modelling of electricity spot price

On modelling of electricity spot price , Rüdiger Kiesel and Fred Espen Benth Institute of Energy Trading and Financial Services University of Duisburg-Essen Centre of Mathematics for Applications, University of Oslo 25. August 2010 Introduction

More information

Stochastic Loss Reserving with Bayesian MCMC Models Revised March 31

Stochastic Loss Reserving with Bayesian MCMC Models Revised March 31 w w w. I C A 2 0 1 4. o r g Stochastic Loss Reserving with Bayesian MCMC Models Revised March 31 Glenn Meyers FCAS, MAAA, CERA, Ph.D. April 2, 2014 The CAS Loss Reserve Database Created by Meyers and Shi

More information

Three Components of a Premium

Three Components of a Premium Three Components of a Premium The simple pricing approach outlined in this module is the Return-on-Risk methodology. The sections in the first part of the module describe the three components of a premium

More information

Institute of Actuaries of India Subject CT6 Statistical Methods

Institute of Actuaries of India Subject CT6 Statistical Methods Institute of Actuaries of India Subject CT6 Statistical Methods For 2014 Examinations Aim The aim of the Statistical Methods subject is to provide a further grounding in mathematical and statistical techniques

More information

Contents Utility theory and insurance The individual risk model Collective risk models

Contents Utility theory and insurance The individual risk model Collective risk models Contents There are 10 11 stars in the galaxy. That used to be a huge number. But it s only a hundred billion. It s less than the national deficit! We used to call them astronomical numbers. Now we should

More information

CHAPTER II LITERATURE STUDY

CHAPTER II LITERATURE STUDY CHAPTER II LITERATURE STUDY 2.1. Risk Management Monetary crisis that strike Indonesia during 1998 and 1999 has caused bad impact to numerous government s and commercial s bank. Most of those banks eventually

More information

Session 5. Predictive Modeling in Life Insurance

Session 5. Predictive Modeling in Life Insurance SOA Predictive Analytics Seminar Hong Kong 29 Aug. 2018 Hong Kong Session 5 Predictive Modeling in Life Insurance Jingyi Zhang, Ph.D Predictive Modeling in Life Insurance JINGYI ZHANG PhD Scientist Global

More information

Estimation Parameters and Modelling Zero Inflated Negative Binomial

Estimation Parameters and Modelling Zero Inflated Negative Binomial CAUCHY JURNAL MATEMATIKA MURNI DAN APLIKASI Volume 4(3) (2016), Pages 115-119 Estimation Parameters and Modelling Zero Inflated Negative Binomial Cindy Cahyaning Astuti 1, Angga Dwi Mulyanto 2 1 Muhammadiyah

More information

Log-Robust Portfolio Management

Log-Robust Portfolio Management Log-Robust Portfolio Management Dr. Aurélie Thiele Lehigh University Joint work with Elcin Cetinkaya and Ban Kawas Research partially supported by the National Science Foundation Grant CMMI-0757983 Dr.

More information

Budget Setting Strategies for the Company s Divisions

Budget Setting Strategies for the Company s Divisions Budget Setting Strategies for the Company s Divisions Menachem Berg Ruud Brekelmans Anja De Waegenaere November 14, 1997 Abstract The paper deals with the issue of budget setting to the divisions of a

More information

Modelling the Claims Development Result for Solvency Purposes

Modelling the Claims Development Result for Solvency Purposes Modelling the Claims Development Result for Solvency Purposes Mario V Wüthrich ETH Zurich Financial and Actuarial Mathematics Vienna University of Technology October 6, 2009 wwwmathethzch/ wueth c 2009

More information

Log-linear Modeling Under Generalized Inverse Sampling Scheme

Log-linear Modeling Under Generalized Inverse Sampling Scheme Log-linear Modeling Under Generalized Inverse Sampling Scheme Soumi Lahiri (1) and Sunil Dhar (2) (1) Department of Mathematical Sciences New Jersey Institute of Technology University Heights, Newark,

More information

The Two-Sample Independent Sample t Test

The Two-Sample Independent Sample t Test Department of Psychology and Human Development Vanderbilt University 1 Introduction 2 3 The General Formula The Equal-n Formula 4 5 6 Independence Normality Homogeneity of Variances 7 Non-Normality Unequal

More information

Calculating VaR. There are several approaches for calculating the Value at Risk figure. The most popular are the

Calculating VaR. There are several approaches for calculating the Value at Risk figure. The most popular are the VaR Pro and Contra Pro: Easy to calculate and to understand. It is a common language of communication within the organizations as well as outside (e.g. regulators, auditors, shareholders). It is not really

More information

4 Reinforcement Learning Basic Algorithms

4 Reinforcement Learning Basic Algorithms Learning in Complex Systems Spring 2011 Lecture Notes Nahum Shimkin 4 Reinforcement Learning Basic Algorithms 4.1 Introduction RL methods essentially deal with the solution of (optimal) control problems

More information

Assessment on Credit Risk of Real Estate Based on Logistic Regression Model

Assessment on Credit Risk of Real Estate Based on Logistic Regression Model Assessment on Credit Risk of Real Estate Based on Logistic Regression Model Li Hongli 1, a, Song Liwei 2,b 1 Chongqing Engineering Polytechnic College, Chongqing400037, China 2 Division of Planning and

More information

Chapter 2 Uncertainty Analysis and Sampling Techniques

Chapter 2 Uncertainty Analysis and Sampling Techniques Chapter 2 Uncertainty Analysis and Sampling Techniques The probabilistic or stochastic modeling (Fig. 2.) iterative loop in the stochastic optimization procedure (Fig..4 in Chap. ) involves:. Specifying

More information

CHAPTER III CONSTRUCTION AND SELECTION OF SINGLE, DOUBLE AND MULTIPLE SAMPLING PLANS

CHAPTER III CONSTRUCTION AND SELECTION OF SINGLE, DOUBLE AND MULTIPLE SAMPLING PLANS CHAPTER III CONSTRUCTION AND SELECTION OF SINGLE, DOUBLE AND MULTIPLE SAMPLING PLANS 3.0 INTRODUCTION When a lot is received by the customer (consumer), he has to decide whether to accept or reject the

More information

Choice Probabilities. Logit Choice Probabilities Derivation. Choice Probabilities. Basic Econometrics in Transportation.

Choice Probabilities. Logit Choice Probabilities Derivation. Choice Probabilities. Basic Econometrics in Transportation. 1/31 Choice Probabilities Basic Econometrics in Transportation Logit Models Amir Samimi Civil Engineering Department Sharif University of Technology Primary Source: Discrete Choice Methods with Simulation

More information

Lecture 3: Factor models in modern portfolio choice

Lecture 3: Factor models in modern portfolio choice Lecture 3: Factor models in modern portfolio choice Prof. Massimo Guidolin Portfolio Management Spring 2016 Overview The inputs of portfolio problems Using the single index model Multi-index models Portfolio

More information

MEASURING PORTFOLIO RISKS USING CONDITIONAL COPULA-AR-GARCH MODEL

MEASURING PORTFOLIO RISKS USING CONDITIONAL COPULA-AR-GARCH MODEL MEASURING PORTFOLIO RISKS USING CONDITIONAL COPULA-AR-GARCH MODEL Isariya Suttakulpiboon MSc in Risk Management and Insurance Georgia State University, 30303 Atlanta, Georgia Email: suttakul.i@gmail.com,

More information

Introduction to Sequential Monte Carlo Methods

Introduction to Sequential Monte Carlo Methods Introduction to Sequential Monte Carlo Methods Arnaud Doucet NCSU, October 2008 Arnaud Doucet () Introduction to SMC NCSU, October 2008 1 / 36 Preliminary Remarks Sequential Monte Carlo (SMC) are a set

More information

Martingales, Part II, with Exercise Due 9/21

Martingales, Part II, with Exercise Due 9/21 Econ. 487a Fall 1998 C.Sims Martingales, Part II, with Exercise Due 9/21 1. Brownian Motion A process {X t } is a Brownian Motion if and only if i. it is a martingale, ii. t is a continuous time parameter

More information

Presented at the 2012 SCEA/ISPA Joint Annual Conference and Training Workshop -

Presented at the 2012 SCEA/ISPA Joint Annual Conference and Training Workshop - Applying the Pareto Principle to Distribution Assignment in Cost Risk and Uncertainty Analysis James Glenn, Computer Sciences Corporation Christian Smart, Missile Defense Agency Hetal Patel, Missile Defense

More information

A Review of Berquist and Sherman Paper: Reserving in a Changing Environment

A Review of Berquist and Sherman Paper: Reserving in a Changing Environment A Review of Berquist and Sherman Paper: Reserving in a Changing Environment Abstract In the Property & Casualty development triangle are commonly used as tool in the reserving process. In the case of a

More information

A Comprehensive, Non-Aggregated, Stochastic Approach to Loss Development

A Comprehensive, Non-Aggregated, Stochastic Approach to Loss Development A Comprehensive, Non-Aggregated, Stochastic Approach to Loss Development by Uri Korn ABSTRACT In this paper, we present a stochastic loss development approach that models all the core components of the

More information

Subject CS2A Risk Modelling and Survival Analysis Core Principles

Subject CS2A Risk Modelling and Survival Analysis Core Principles ` Subject CS2A Risk Modelling and Survival Analysis Core Principles Syllabus for the 2019 exams 1 June 2018 Copyright in this Core Reading is the property of the Institute and Faculty of Actuaries who

More information

Statistics 431 Spring 2007 P. Shaman. Preliminaries

Statistics 431 Spring 2007 P. Shaman. Preliminaries Statistics 4 Spring 007 P. Shaman The Binomial Distribution Preliminaries A binomial experiment is defined by the following conditions: A sequence of n trials is conducted, with each trial having two possible

More information

Section J DEALING WITH INFLATION

Section J DEALING WITH INFLATION Faculty and Institute of Actuaries Claims Reserving Manual v.1 (09/1997) Section J Section J DEALING WITH INFLATION Preamble How to deal with inflation is a key question in General Insurance claims reserving.

More information

UPDATED IAA EDUCATION SYLLABUS

UPDATED IAA EDUCATION SYLLABUS II. UPDATED IAA EDUCATION SYLLABUS A. Supporting Learning Areas 1. STATISTICS Aim: To enable students to apply core statistical techniques to actuarial applications in insurance, pensions and emerging

More information

Summary Sampling Techniques

Summary Sampling Techniques Summary Sampling Techniques MS&E 348 Prof. Gerd Infanger 2005/2006 Using Monte Carlo sampling for solving the problem Monte Carlo sampling works very well for estimating multiple integrals or multiple

More information

Hedging Under Jump Diffusions with Transaction Costs. Peter Forsyth, Shannon Kennedy, Ken Vetzal University of Waterloo

Hedging Under Jump Diffusions with Transaction Costs. Peter Forsyth, Shannon Kennedy, Ken Vetzal University of Waterloo Hedging Under Jump Diffusions with Transaction Costs Peter Forsyth, Shannon Kennedy, Ken Vetzal University of Waterloo Computational Finance Workshop, Shanghai, July 4, 2008 Overview Overview Single factor

More information