THE NATURE OF MODELLING INSURANCE LOSSES


by Gordon E. Willmot
Munich Re Chair in Insurance
Department of Statistics and Actuarial Science
University of Waterloo

Presented at the Munich Re Inaugural Lecture, December 5, 2001, 4:00 pm, at Munich Reinsurance in Toronto.

I would like to thank the two presidents, David Johnston of the University of Waterloo, and Jim Brierley of Munich Re, for their warm comments and introduction. I wish to express my sincere gratitude to everyone for the support that has been given to me. I would also like to extend special thanks to everyone here today for demonstrating support for the Actuarial Program at the University of Waterloo. This support is a critical ingredient in the continuing development of technical actuarial knowledge. It is greatly appreciated. My talk today will endeavour to illustrate some of the ways in which modelling has helped the insurance industry, and hopefully will encourage the development of active dialogue between practitioners and researchers. The focus will be on modelling of insurance losses arising in connection with health coverages involving drug, hospital, and disability benefits. The models are also applicable in various types of property and casualty insurance situations involving homeowner and automobile coverages. There are three main objectives in the talk. The first is an attempt to refamiliarize everyone with the current approaches available for fitting and using models for the types of insurance coverages mentioned. The second is to illustrate the impact of theoretical research on practical actuarial modelling. The third is to point out that theoretical research often borrows ideas from, and lends ideas to, other disciplines whose connection may not be readily apparent but which share similar underlying technical components. This interaction normally benefits all involved disciplines. To begin with, it seems appropriate to clarify exactly what we mean by a quantitative model, and to explain why such a model is of significance. To deal with these issues, let us examine an analogy involving a model in a physical sense, such as a model airplane.
A model airplane, although small in scale, is normally designed to capture the essential features of a real airplane including its basic shape, design, etc., and may actually even have the capacity to fly. It differs from a real airplane in that it cannot carry people or cargo, cannot fly a significant distance, and is technically considerably simpler than a real airplane. As such, it may be viewed as a highly simplified representation of a real airplane incorporating characteristics thought to be relevant. The quantitative model that I am presently suggesting shares many of the same features as the model airplane. Of course, we are dealing symbolically, not with anything in a real physical sense, but a quantitative model may justifiably be considered a model nonetheless. To continue the analogy, the 'real airplane' is replaced by a real portfolio of insurance business. The relevant aspect of this portfolio is that it is extremely complicated in terms of the nature of its past and future risk-based behaviour. There are many deterministic and stochastic phenomena which will ultimately influence the unfolding of the future claims experience. For example, it is unlikely that the horrific events of September 11 and their resultant effect on insurance costs could have been predicted. Similarly, other economic factors too numerous to predict can affect even a small portfolio of insurance. The precise prediction of the future claims experience

requires that all such influences and their effects be identified. Of course, all is not lost in terms of predicting the future development of the claims experience in the absence of this ability. This is where the quantitative model comes into focus. Even though we cannot hope to identify all influential factors relevant to the future claims experience, we can identify some which we feel may be relevant. Moreover, technical knowledge developed over time can help us to quantify and incorporate these factors in a cohesive fashion. Of course, there are limits to our technical knowledge at any juncture, and this results in a tradeoff in any given modelling situation between realism on the one hand and mathematical simplicity on the other. For the types of models in the present context, one quantity normally of interest is the incidence with which claims may be expected to occur over time, and there may well be related information that is available to us. Similarly, we may have some knowledge as to the scope or severity of the individual claims when they do occur, a second quantity normally of interest. These characteristics are analogous in the present model to the essential features of the real airplane such as the basic shape and design which are incorporated into the model airplane. The analogy does not end here, however. Just as the model airplane may be able to approximate the behaviour of the real airplane by flying a short distance, the quantitative model may be able to approximately predict the claims experience of the portfolio of insurance that it is intended to represent. This predictive ability of the quantitative model is one important reason why it is of interest, and we now discuss a second.
William Jewell (1980), in a famous address at the International Congress of Actuaries meeting in Zurich, Switzerland, defined a model in the following somewhat practical manner: 'A model is a set of verifiable mathematical relationships or logical procedures which is used to represent observed, measurable real-world phenomena, to communicate alternative hypotheses about the causes of the phenomena, and to predict future behaviour of the phenomena for the purposes of decision-making.' The above definition identifies another use of the model in addition to its predictive ability discussed above. That is, the model may also provide insight into the nature of the behaviour of the claims experience itself. A well formulated model may often help to better explain the observed claims patterns by identifying characteristics or phenomena influencing claim levels and their interactive effects which are not readily apparent in the absence of such a model. The use of generalized linear models in automobile insurance, for example, provides a good illustration. A regression approach to the analysis of claim count data may directly incorporate relevant covariate information about the insured and the vehicle into the model itself. Such analysis may identify interactive effects between two or more of these explanatory variables which may not have been readily obvious. Insight such as this may well be extremely important in connection with sound insurance risk management. In particular, this may prove to be invaluable in analysis of new but related insurance coverages where little observed experience is available. I believe that this second important explanatory use of quantitative models is often overlooked in practice.

The famous quote 'All models are wrong, but some are useful', normally attributed to the statistician George Box (1976), succinctly captures the imperfect nature of the modelling process, but also clarifies its relationship to the real life phenomena that it is attempting to mimic. As a simple example of a quantitative model, we could assume that the total claims arising from a block of business in the next year follows a normal distribution. This is clearly a simplifying assumption, since the actual claims could hardly be expected to follow such a simple mathematical law as a normal distribution. Thus, the model is in fact 'wrong', but does approximate the real life situation involving the actual claims which are likely to occur in the next year. How good the model is depends on how closely the future claims experience can be approximated by the given normal distribution. There are three important points to be made in this context. The first is that it is usually a good idea if the model provides a reasonable fit to past data, which often is used to calibrate the model in the first place. This often provides reassurance that the model is appropriate, unless conditions are expected to be different in the future from what they were in the past. Such changes would clearly need to be accounted for in the modelling process. However, what is more important is how well the model can be expected to predict future behaviour. It is usually easy to construct a detailed model which provides a superb fit to the past, but a detailed model is usually accompanied by poorer predictive ability. One main reason for this phenomenon is the fact that the past data itself is subject to random variability, so that the model can be overfit if it attempts to reproduce these random errors which would obviously not be expected to be repeated in an identical manner in the future. This is where the principle of parsimony is relevant.
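As a concrete illustration of the kind of question such a normal model can answer, the following short Python sketch evaluates a right-tail probability; the portfolio mean and standard deviation are invented for illustration:

```python
import math

def normal_tail(x, mu, sigma):
    """P(S > x) when aggregate claims S ~ Normal(mu, sigma^2)."""
    z = (x - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * math.erfc(z)

# Hypothetical portfolio: expected claims of 10.0 (millions),
# standard deviation of 1.5 (millions).
mu, sigma = 10.0, 1.5

# Probability that next year's claims exceed 13.0 under the normal model.
print(round(normal_tail(13.0, mu, sigma), 4))  # 0.0228, a two-sigma exceedance
```

The model is 'wrong' in Box's sense, but a tail estimate of this kind is exactly the sort of output that a purely deterministic analysis of expected claims cannot provide.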
A simple, yet still adequate model in terms of fit is therefore better suited for prediction in general than a detailed model selected due to its superior fit to past data. The second point is that the choice of a model should have been motivated by theoretical considerations, at least in part. At the very least, the model should not be inconsistent with such principles. A model is clearly suspect if it is not chosen in a theoretically justifiable manner, since there would then be no reason to trust any resultant conclusions. In this situation, if the block of business is sufficiently large, then central limit theoretic arguments would support the use of a normal model. The third point reflects the stochastic nature of the model itself. We can never verify absolutely that the model provides a good approximation to the real life situation, since we will only observe the claims experience itself. That is, the underlying mechanism generating the claims is unobservable. This further reflects the need for a sound theoretical justification for the model, because we need to trust that the model is in fact appropriate in the given situation if we are going to base quantitative risk management decisions on it. To put the role of a quantitative model in perspective, its main purpose is to provide information and insight which is unavailable in its absence. It is not intended to replace sound actuarial judgement, an invaluable commodity for which there is no substitute. A

well formulated model is consistent with and adds to intuition, but cannot and should not replace experience and insight. The availability of good modelling techniques should be viewed as providing additional tools to be included in the toolkit of the actuary. To summarise, a quantitative model provides a window with which one may view the real world. Such a window is normally not available without such a model, but the view through the window is necessarily somewhat clouded, which limits clarity. Now that we (hopefully) all accept that quantitative models do have their use in connection with insurance coverages, let us examine in greater detail the modelling process in connection with the health and casualty coverages mentioned above. Consider again the normal distribution as a model for the claims in the following year in connection with a portfolio of insurance. The normal distribution does have the desirable property of reflecting the stochastic nature of the problem at hand, as opposed to deterministic models, which by definition do not involve anything of a random or uncertain nature. Deterministic models have historically been used in situations such as this, whereby a great deal of effort is spent on analysis of expected claims. This is certainly a useful exercise to carry out, but is insufficient in terms of allowing for financial management of the insurance risk. Quite simply, the problem is stochastic in nature due to the random numbers and amounts of claims, and deterministic models are unable to capture the inherent variability. This has been reflected by the use of premium loadings and reserves for 'contingencies' and 'adverse deviations'. These quantities, when used in connection with a deterministic model, are a step in the right direction, but cannot realistically be expected to satisfactorily replace a stochastic model.
The technology is available for straightforward implementation of stochastic modelling techniques, and it is important for us to use them so that we as a profession may provide our best possible services to society. If the portfolio of insurance is extremely large, the normal distribution may provide a satisfactory model for the aggregate claims experience, as was mentioned previously. For coverages such as long-term disability, however, the normal assumption is suspect even for very large portfolios. Similarly, in group insurance, the normal assumption for life and health coverages is untenable in all but the largest groups. For many of these types of coverages, the normal model tends to understate the right tail of the distribution. Since this is the portion of the distribution involving the most significant financial implications for insurance, this is particularly troublesome for insurers. Serious consequences could result from an insurer basing financial risk management decisions on a model which understates the probability and scope of large losses. In the modelling context discussed previously, the normal model often fails to capture important and relevant components of the situation that it is attempting to describe. Early methods for dealing with the difficulties alluded to in connection with the normal distribution involved 'correcting' the normal model by employing different yet qualitatively similar models for the aggregate claims. Traditional approaches including normal power and translated gamma approximations, Esscher transforms, and

Edgeworth-type expansions were used. The difficulty with the use of these models is that, like the normal approximation, they fail to capture essential components of the real situation. Their use was understandable historically due to the lack of adequate computational resources which precluded the use of more realistic and consequently more complex models. The present situation is quite different. The stochastic nature of both the incidence and the severity of claims is a fundamental component of a realistic model, and needs to be addressed wherever possible. That is, the modelling process in the present context necessarily involves incorporation of claim counts and claim amounts into the model. The usual modelling approaches which are currently utilized normally involve what is referred to as 'the collective risk model', and the resultant random variable is a 'random sum' which has a 'compound distribution'. These more complex models explicitly involve claim incidence and claim severity components, and tend to have thicker right tails than the normal model. Consequently, they tend to describe real insurance portfolios in a much more representative manner. The use of these random sum models received a large boost with technological advances in terms of numerical evaluation of compound distributions which were noticed by the actuarial profession in the early 1980s. These advances actually occurred earlier in disciplines such as operations research, and have only been utilized to any great extent in actuarial contexts much more recently. The recursive numerical approach to the evaluation of compound distributions has rendered such techniques as Monte Carlo simulation unnecessary in these situations, since any desired degree of accuracy may be achieved using these algorithms. An alternative approach to the use of recursions is the Fast Fourier Transform, which is also quite efficient.
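The recursive approach referred to here is commonly known as the Panjer recursion. Below is a minimal Python sketch for the compound Poisson case, with the claim severity already discretized on the non-negative integers; the Poisson rate and severity distribution are invented for illustration:

```python
import math

def compound_poisson_pmf(lam, f, smax):
    """Panjer recursion for the compound Poisson aggregate claims pmf.

    lam  : Poisson rate for the number of claims
    f    : pmf of the discretized claim severity on 0, 1, 2, ...
    smax : evaluate P(S = s) for s = 0, ..., smax
    """
    f = f + [0.0] * max(0, smax + 1 - len(f))   # pad the severity pmf
    g = [0.0] * (smax + 1)
    g[0] = math.exp(-lam * (1.0 - f[0]))        # P(S = 0)
    for s in range(1, smax + 1):
        g[s] = (lam / s) * sum(j * f[j] * g[s - j] for j in range(1, s + 1))
    return g

# Illustrative: on average 2 claims per year, severity uniform on {1, 2, 3}.
g = compound_poisson_pmf(2.0, [0.0, 1/3, 1/3, 1/3], 20)
print(round(g[0], 4))      # P(S = 0) = exp(-2), about 0.1353
print(round(sum(g), 4))    # probability mass captured up to s = 20
```

Each probability is exact up to floating-point error, which is the sense in which recursions of this kind render simulation unnecessary for standard compound distributions.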
As a result of these techniques, evaluation of compound distributions is essentially no longer an issue associated with the use of these models. The most common and best known of the compound distributions is the compound Poisson distribution, where the claim count component of the model is assumed to follow the Poisson law. This model is easily the most tractable analytically of all the compound models, and it is particularly amenable to a wide variety of situations in insurance. It is also consistent with various theoretical considerations including infinite divisibility, a notion which has practical implications in connection with subdivision of insurance portfolios and business growth. While the compound Poisson model is normally appropriate in connection with life insurance modelling, it often suffers from the disadvantage of providing an inadequate fit to insurance data in other coverages. In particular, it tends to understate the true variability inherent in these situations. Credibility theory provides an explanation for this phenomenon as well as a practical solution. The Poisson model implicitly assumes that the individual risks within the portfolio are homogeneous from the standpoint of risk characteristics, presumably as a result of some sort of underwriting mechanism. Unfortunately, it appears that the risks are normally heterogeneous from this vantage point, and this additional source of variability is not captured by the Poisson model, resulting in the potentially dangerous situation whereby the probabilities associated with large numbers of losses are understated. The thesis of credibility theory is that this heterogeneity may be systematically modelled by a probability distribution, and standard Bayesian statistical analysis leads to what is commonly referred to as a mixed Poisson model. The most important of the mixed Poisson models from both a theoretical and a practical

standpoint is the negative binomial, which provides a significantly improved fit over that of the Poisson with respect to claim count data in many situations. The resultant compound negative binomial distribution is normally quite tractable both analytically and numerically. Other useful claim count models in addition to the mixed Poisson family have also been suggested and used in recent years. There are two approaches which have been employed in identifying these models. The first approach is to consider models which are well suited mathematically for use in a compound distributional setting. Fortunately, these same models have also proved to be useful in terms of fitting to claim count data. The second approach involves attempts to construct models to physically describe situations of interest. For example, compound models themselves have been employed to model claim counts in situations where 'accidents' or claim causing events occur at random, and these events each give rise to a random number of claims. Models for major catastrophic events such as plane crashes or epidemics would fall into this category. These models have an interesting theoretical connection to mixed Poisson models which is not immediately obvious. It is perhaps for this reason that compound claim count models have been found to provide a good fit to claim data in many situations, including those where there is no physical justification involving accidents and claims per accident. Furthermore, they are well suited mathematically for use in the compound context described earlier. Up until this point in the talk I have been deliberately vague about what I have been referring to as a 'claim'. To be more precise, in most situations a financial loss occurs to the insured, and a portion of this loss is reimbursed by the insurer. That is, policy modifications to the underlying losses such as deductibles and maximums are often applied before payment is made.
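The heterogeneity argument sketched above can be demonstrated numerically. The short simulation below draws each risk's Poisson rate from a gamma mixing distribution, so that the resulting claim count is negative binomial; the variance then exceeds the mean, which is precisely the overdispersion that a plain Poisson model cannot capture. The mixing parameters are invented for illustration:

```python
import math
import random

random.seed(1)

# Gamma(shape=r, scale=theta) mixing: the resulting counts are negative
# binomial with mean r*theta and variance r*theta*(1 + theta).
r, theta = 2.0, 1.5
n = 100_000

counts = []
for _ in range(n):
    lam = random.gammavariate(r, theta)     # risk-specific Poisson rate
    # Sample a Poisson(lam) count by inversion (fine for modest rates).
    k, p = 0, math.exp(-lam)
    cum, target = p, random.random()
    while cum < target and p > 0.0:
        k += 1
        p *= lam / k
        cum += p
    counts.append(k)

mean = sum(counts) / n
var = sum((c - mean) ** 2 for c in counts) / n
print(round(mean, 2))   # close to r*theta = 3.0
print(round(var, 2))    # close to 7.5, well above the mean: overdispersion
```

A plain Poisson fit to such data would match the mean but understate the variance, and hence the probabilities associated with large numbers of losses.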
Consequently, we need to distinguish between the total individual loss amount or simply the loss, and the payment amount on a claim. Thus, when we refer to the severity we must be precise as to whether it is the payment or the loss to which we are referring. In the presence of a deductible, nothing is payable on losses below the deductible amount. Therefore, in this situation it is not enough to simply refer to the number of claims, and we need to clarify whether we are referring to the number of losses or the number of payments. If we are dealing with the original loss amounts then we say that the analysis is on a loss basis. On the other hand, if the situation involves payment amounts as a result of the imposition of a particular deductible and maximum combination, then a payment basis results. There are relatively simple mathematical relationships between the number of losses, the number of payments, the size of individual losses, and the size of individual payments. Thus it is possible to convert from a loss to a payment basis and vice versa. These relationships are important in practice, since it is likely to be the case that the data which are being used to calibrate the models are payment data based on a particular set or sets of deductible and maximum combinations. We may wish to use the model for the same or possibly different sets of deductibles and maximums. Therefore, it may be important to have the ability to change payment bases, and this is most easily visualized by imagining first a conversion from the payment to the loss basis, followed

by a conversion to the new payment basis. Of course, this may actually be done simultaneously. In any event, from a data analytic standpoint it is important to properly identify the loss and payment components in order not to introduce systematic error into the analysis. That is, losses below the deductible will likely not be reported, and thus not be included in the data which are available. In statistical terms, the payment data reflect sampling bias since small losses are excluded. Similarly, the actual loss amount in excess of the maximum will not be known in general, and instead the only information normally available is the fact that a payment for the maximum amount was made. The presence of these deductibles and maximums, which admittedly does create technical complications, does not by any means result in intractability, however, and the resultant analysis is an important component of the current Society of Actuaries actuarial examinations in both Parts 3 and 4. Moreover, these particular situations arise in other contexts as well, and techniques for analysis may be both borrowed from and lent to these other related disciplines. In statistical terms, the deductible results in what is referred to as 'left truncation', and the maximum results in 'right censoring'. Considerable statistical research into the analysis of such data has been performed in recent years, resulting in sound theoretical and practical tools for their calibration and management. Interestingly, the amount of the loss in excess of the deductible is mathematically identical to the well known future lifetime of an individual in a life contingencies setting, resulting in similarities here as well. The ability to convert from a payment to a loss basis and vice versa is important even if deductibles and maximums are to remain unchanged.
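The loss-to-payment relationship described here can be sketched in a few lines of Python. This assumes the convention in which the maximum caps the covered loss itself (conventions vary in practice, so treat the details as illustrative):

```python
def payment_from_loss(loss, deductible, maximum):
    """Convert a ground-up loss into the insurer's payment.

    Losses at or below the deductible produce no payment, and are typically
    never reported (left truncation).  Payments are capped at
    maximum - deductible; when the cap binds, the true loss amount is
    unknown (right censoring).
    """
    if loss <= deductible:
        return None                       # no claim payment arises
    return min(loss, maximum) - deductible

# Illustrative losses against a deductible of 100 and a maximum of 1000.
losses = [40, 150, 600, 2500]
payments = [payment_from_loss(x, 100, 1000) for x in losses]
print(payments)   # [None, 50, 500, 900]
```

Going the other way, from payments back to losses, is the statistical problem: the None entries are never observed at all, and the capped payments only bound the underlying loss from below.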
This is due to the fact that in general it is a good idea to formulate a model for the underlying total individual loss, even though the available data is normally on a payment basis. This situation therefore involves fitting a loss model on the basis of payment data, a fact which was implicitly assumed in the previous data analytic discussion. Of course, one could simply ignore the deductibles and maximums and directly model the payments rather than the losses. This is not desirable for various reasons. It is clearly not a good idea if deductibles and maximums are to be changed, and is of no use if the deductible is to be lowered. Theoretical mathematical laws should be applied to the losses themselves, which reflect a more random nature than when deterministic or nonrandom influences such as deductibles and maximums are superimposed. The impact of policyholder behaviour on payment amounts may be dependent on the cost sharing provisions implied by the deductible and maximum combinations involved. Also, systematic differences in risk characteristics are known to occur on the basis of policy provisions. For example, the choice of a lower deductible may reflect an antiselective tendency on the part of the potential insured. The impact of this risk variability on the payments themselves is more difficult to ascertain and quantify than the effect on the underlying losses. Consequently, analysis of this heterogeneity in terms of risk characteristics normally requires that a loss rather than a payment basis be used. Finally, the payment amount may always be recovered in any event by reintroduction of the relevant deductible and maximum combinations. It is worth noting that one of the drawbacks to the historical aggregate modelling approaches utilizing such techniques as normal power and translated gamma approximations is the inability to work at the individual loss level.
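One concrete consequence of the points above is that uniform inflation in the underlying losses does not translate into uniform inflation in the payments when the deductible and maximum are fixed in nominal terms; the deductible leverages the increase upward while the maximum dampens it. A small sketch with invented figures:

```python
def payment(loss, deductible, maximum):
    """Insurer's payment under a fixed deductible and maximum (covered-loss cap)."""
    return max(0.0, min(loss, maximum) - deductible)

losses = [80, 150, 400, 900, 3000]     # illustrative ground-up losses
d, u = 100, 1000

base = sum(payment(x, d, u) for x in losses)
infl = sum(payment(1.10 * x, d, u) for x in losses)   # 10% loss inflation

# Payments do not grow by 10%: the fixed deductible and maximum distort
# the effect of inflation, which is one reason to model losses, not payments.
print(round(infl / base - 1.0, 3))   # 0.071 for these figures
```

A model fitted on the loss basis can simply be inflated and re-run through the policy provisions; a model fitted directly to payments cannot separate these effects.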
Conversely, one of the advantages of the compound modelling approach over these traditional techniques is

the ability to ascertain the impact on costs of variations in policy provisions including deductibles and maximums. As is the case with respect to models for the incidence of claims, or more precisely, losses, there are many desirable features associated with the use of parametric models for the individual losses. By parametric models I mean such distributions as the Pareto or the lognormal, where knowledge of a small number of parameters completely characterizes the distribution. This is in contrast to the empirical use of the original data. If the data can in fact be satisfactorily represented by a member of the parametric family, then quantities of interest such as the mean may be estimated more accurately than is possible using the raw data. Parametric models allow one to extrapolate to portions of the distribution where no data are available. In particular, the portion of the distribution below the deductible and above the maximum may be 'filled in' through the use of a parametric model. Similarly, if the data are grouped into crude intervals, then parametric models both smooth and fill in relevant portions of the distribution. A systematic approach such as this would certainly be of use if a deductible of interest fell inside one of the intervals rather than at an endpoint. Parametric models are easily updated for trends and inflation. Furthermore, statistical hypothesis tests are available to provide guidance on a variety of questions of interest. For example, one could statistically ascertain whether or not underlying loss models vary by policy type, i.e. deductible and maximum combinations. There has been a great deal of research in recent years within the statistical community into methods of fitting models to data in a wide variety of situations, including in particular the left-truncated and right-censored cases of interest in connection with policy provisions involving deductibles and maximums.
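For one concrete instance of fitting in this left-truncated, right-censored setting, the exponential severity model admits a closed-form maximum likelihood estimate: by the memoryless property, the excess of the loss over the deductible is again exponential with the same rate, so the fitted rate is simply the number of uncensored payments divided by the total of all payments. A Python sketch with invented data:

```python
def exp_mle_rate(payments, censored):
    """MLE of the exponential rate from payment data observed through a
    deductible (left truncation) and maximum (right censoring).

    payments : payment amounts (loss excess over the deductible, capped)
    censored : parallel flags, True where the payment hit the maximum
    """
    uncensored = sum(1 for c in censored if not c)
    return uncensored / sum(payments)

pays = [50.0, 120.0, 900.0, 30.0, 900.0]   # illustrative payment data
cens = [False, False, True, False, True]   # two payments hit the maximum
rate = exp_mle_rate(pays, cens)
print(round(1.0 / rate, 1))   # fitted mean excess loss: 666.7
```

Richer models such as the Pareto or lognormal have no closed form, but the same likelihood, with density terms for uncensored payments and survival terms for censored ones, is easily maximized numerically.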
Theoretically sound statistical procedures such as maximum likelihood estimation are well suited for use in these situations, and are now quite easy to implement numerically using any number of optimization routines. The Solver function in Excel is sufficient for many models, for example, and numerical implementation is now of such a straightforward nature that it should no longer be a consideration in terms of the choice of estimation procedure. That is, the use of ad hoc techniques with poor statistical properties such as moment estimation, whose appeal is primarily its inherent simplicity, is now quite difficult to justify. Interestingly, even these ad hoc procedures are not completely trivial to implement in the presence of deductibles and maximums, so that even the advantage of simplicity is lost. On the other hand, maximum likelihood estimation is no more difficult to implement, even in the presence of fairly complex censoring and truncation schemes. Quite simply, there is no longer any justification for the use of inefficient estimation techniques. In a given modelling situation, it is often desirable to use more than one model for loss frequency and/or severity. One would hope that similar qualitative answers to questions of interest would result from the use of different models. Stated another way, widely different answers to these questions by model may well raise questions about the use of one of the models, or perhaps even the whole approach being employed. Furthermore, there are often reasons justifying the choice of a model which are other than statistical or data-related in nature. In particular, claim count models vary greatly in terms of their

ability to accommodate deductibles, combine statistically independent coverages, reflect changes in levels of business due to growth, or even in terms of how easily the associated compound distribution may be evaluated. For example, a negative binomial distribution for the number of losses may be a more desirable model than, say, a 'zero-modified logarithmic series' distribution, even though the latter may provide a superior fit to a particular data set. This is because the negative binomial is in general more tractable mathematically and thus may provide answers to questions in an easier fashion. Additionally, it is better known and thus may be more easily accepted by a non-specialist interested in qualitative answers. It turns out that for technical reasons analytic properties of the loss severity models are less important than for the loss frequency models. However, it is desirable that these severity models do possess a form of 'scale invariance' so that inflation and trend effects may be accommodated, or even changes in monetary units. There is a wide variety of parametric models available for both claim frequency and claim severity. This implies that in a given modelling situation, one can be reasonably confident that at least one model will provide an adequate fit. Although great advances have been made in recent years in terms of calibration and use of compound models for aggregate claims in various insurance contexts, there is still room for improvement and refinement. In particular, there remain inadequacies as a result of simplifying assumptions which are made in order to ensure mathematical tractability. The statistical independence assumptions between numbers of claims and claim sizes inherent in compound models may imply systematic error in real situations where some sort of dependency exists.
Similarly, claim inflation effects and cyclical claim patterns may violate the identically distributed assumptions which are also employed. Cost sharing arrangements with respect to employees and dependents in group insurance which involve complex deductibles and maximums may also create conditions which are more difficult to manage using the compound models discussed here. The same type of difficulties may arise as a consequence of complicated reinsurance schemes. These remarks are not meant as criticisms of the standard compound models, which do have wide applicability, but simply serve as a reminder that no models are appropriate in a universal sense. All models are wrong, and the use of more complicated models to accommodate particular phenomena is required if the effects are felt to be sufficiently important. This philosophy is clearly, although somewhat redundantly, summarized in another famous quote which is normally attributed to Einstein, namely: 'A model should be as simple as possible, but no simpler', which is consistent with the principle of parsimony. The price to be paid for employment of a more complex model normally is the loss of mathematical tractability, and this is where theoretical research may be of some assistance. For example, it is possible to accommodate an inflation factor in conjunction with the use of Poisson and mixed Poisson claim count distributions, while still retaining many of the simplifying features associated with the use of compound models. There has also been some progress in terms of the application of time dependent claim count processes in the present context. Furthermore, the use of simulation techniques is of assistance in many situations where the mathematical difficulties are overwhelming. As with any technical tool, however,

11 simulation should not be applied blindly without regard to potential use of relevant analytic information. For example, it is normally not necessary or advisable to use Monte Carlo simulation in order to evaluate standard compound distributions of the type discussed earlier. An overview of the whole modelling approach in the present situation fits naturally into the general structure of what has been referred to by the Australian actuarial community as the 'actuarial control cycle'. It seems to me that this systematic view of the procedure is more scientific than purely actuarial, but is relevant in any event. At a particular juncture, one imagines that data is collected on payments as described earlier. This payment data is then used to fit models for the number of losses and severity of losses by 'backing off' or removing the effects of policy modifications such as deductibles and maximums from the payment data. These loss models are then updated for inflation trends, business growth, etc., so that they are now assumed to be applicable on a prospective basis for future insurance contracts. The new policy provisions are then applied to convert the loss models to a payment basis.premium rates, contingency loadings, reserves, etc. are then set using the models, and policies are then sold. These policies then generate data as before, and the cycle starts over again. This overview helps put the modelling process in a long term context which is appropriate for insurance risk management. One of the main uses of these models has briefly been mentioned above. The setting of a net single premium on a block of business is clearly one of the most important output components of the modelling process. Clearly, in the above distributional context this is the expected claims, which is necessarily the only output obtainable from use of a deterministic model. 
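Where the analytics do break down, simulation fills the gap. A minimal Monte Carlo sketch for a compound Poisson total follows; it is my own illustration, and for the standard compound distributions discussed above the recursive and transform methods remain preferable.

```python
import random

def simulate_compound_poisson(lam, draw_severity, n_sims, seed=1):
    """Monte Carlo draws of S = X_1 + ... + X_N with N ~ Poisson(lam).
    draw_severity(rng) returns a single claim amount."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        # Poisson count: number of unit-rate exponential arrivals before time lam
        n, t = 0, rng.expovariate(1.0)
        while t < lam:
            n += 1
            t += rng.expovariate(1.0)
        totals.append(sum(draw_severity(rng) for _ in range(n)))
    return totals
```

The simulated mean should settle near E[N]·E[X]; the attraction of the method is that complex dependencies or policy terms can be added to the loop where no analytic result exists.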
The presence of deductibles and maximums necessitates, for technical reasons, much more than a simple expectation analysis; a full distributional approach is needed. Similar comments apply to the use of an aggregate deductible or stop-loss agreement in a reinsurance context. Of course, sound financial risk management would imply that a simple expectation analysis is insufficient in any event, and the lack of normality of the distributions under study further implies that the standard deviation or variance is also insufficient. Thus, the use of a contingency loading based solely on the first two moments is inappropriate in this situation. All quantities associated with the distributions, such as moments or percentiles, are easily obtained, even in the presence of deductibles, maximums, stop-loss arrangements, etc. Adverse financial consequences may well result from failure to analyse all aspects of the distribution, and appropriate contingency protection is possible only with careful analysis of the relevant components of well formulated models.

The quantification of contingency loadings normally involves various aspects of the distribution of the total claims. Historically, this topic has been considered in the context of premium calculation principles. In recent years its scope has widened, so that it is now seen more generally within the context of 'risk measures', where the goal is to quantify in a relatively simple manner the risk associated with a random future contingent outcome. Recent research in this area has utilized concepts from financial economics, and the ideas have been applied in areas involving financial risk

management other than in such traditional actuarial pricing contexts as the present one involving loss modelling.

The particular models described above are well suited for use in an insurance context in the presence of reinsurance, or for reinsurance purposes themselves. This is possible with a careful identification of the loss and payment components of the model. For example, a maximum from the vantage point of a direct insurer is viewed as a deductible by a reinsurer. These ideas also extend to reinsurance layers in more complex reinsurance agreements. A theoretical approach to the pricing of stop-loss insurance is an important application of the methodology associated with these models. Similarly, prospective experience rating in a group insurance context is well suited for use in conjunction with credibility modelling. Heterogeneity across groups may be modelled in a cohesive fashion using approaches such as the Bühlmann-Straub model, in which group size is itself an explicit parameter influencing the 'credibility' assigned to the past experience of the particular group in question.

The issue of the solvency associated with a particular block or line of business, or even a whole company, is a very important yet technically complex one. An analytical tool which is of some assistance in this context is supplied by 'ruin theory'. From a technical point of view, ruin theory happens to involve another application of these same compound distributional ideas. This relationship with compound distributions is not immediately obvious from physical considerations, and I will not examine this technical issue here. Ruin theory has as its goal the analysis of the development of the surplus associated with a block of business (or even a whole line of business or a whole company) under the impact of random influences such as the payment of claims.
In some situations of practical interest, the mathematical complexity inherent in ruin theoretic analysis unfortunately implies that the model may be too simple to be viewed as realistic. That is, the 'ruin probabilities' which result cannot realistically be viewed as truly representing the possibility of insolvency; a realistic treatment of this issue is much more complex. This does not, however, render the use of ruin theory in these situations useless. Qualitative insight into the nature of the development of claims over time does result from its use. The 'danger' associated with a block of business may be quantified through the use of ruin theory, which provides a different yet quite valid measure of the inherent volatility. Another qualitative conclusion which may be drawn from the use of ruin theory involves the optimality of particular types of reinsurance agreements involving deductibles. Ruin theory is well suited for use in conjunction with other risk assessment tools. It is also worth mentioning that recent analytical and computational advances have allowed for the use of more complex and realistic modelling approaches to ruin theory. In particular, incorporation of the time value of money, as well as of more technically intricate assumptions about the premium and claims processes, is now possible. Many of these advances have benefited greatly from exploitation of the similar underlying mathematical structure shared by ruin theory and an important branch of operations research known as queueing theory. Both ruin and queueing theory have undoubtedly benefited from this interdisciplinary relationship.
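One of the few cases in which the classical compound Poisson surplus process admits a closed-form ruin probability is that of exponential claim sizes, and it makes the qualitative role of the premium loading visible. The sketch below implements that standard textbook formula; it is my own illustration rather than a result presented in the talk.

```python
import math

def ruin_prob_exponential(u, mu, theta):
    """Classical Cramer-Lundberg ruin probability for exponential claims
    with mean mu and relative premium loading theta:
        psi(u) = exp(-R * u) / (1 + theta),
    where R = theta / ((1 + theta) * mu) is the adjustment coefficient
    and u is the initial surplus."""
    R = theta / ((1.0 + theta) * mu)
    return math.exp(-R * u) / (1.0 + theta)
```

With no initial surplus the ruin probability is 1/(1 + θ), and it decays exponentially as initial capital grows, which is exactly the capital-versus-insolvency trade-off discussed below.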

Ideas from ruin theory are also of interest in the context of the risk measures mentioned earlier. Ruin theory may be viewed as a systematic study of the relationship between the initial capital required to support a block of insurance and the probability of insolvency, or equivalently the probability that the associated surplus falls below a predetermined level. This is closely connected to and consistent with the well-known concept of 'Value at Risk', or simply 'VaR', whose origins are in the area of financial risk management. The choice of the initial amount of capital in this context to fix the probability of insolvency is essentially equivalent to the determination of a ruin probability. Recent research in ruin theory is also closely related to the concept of 'Conditional Tail Expectation', or 'CTE' for short, also known as 'Tail VaR', which is viewed as an improvement over 'VaR' as a risk measure for various theoretical reasons. The percentile associated with 'VaR' is supplemented under the 'CTE' measure by the 'mean excess loss' (also referred to as the complete expectation of life in a life contingencies setting, and the mean residual lifetime in a reliability context). The analysis of the deficit at ruin, if ruin does in fact occur (also referred to as the severity of ruin), has been the subject of much recent research, and its expectation is the quantity of interest in connection with the 'CTE' interpretation of the ruin model. Interestingly, research into reliability properties of the mean excess loss in these other disciplines, carried out independently of the ruin theoretic setting, is of interest both in the general loss model setting described in this talk and in the 'CTE' risk measure context.

Compound models of the type considered here have been used in other situations as well.
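Once a loss distribution is in hand, the 'VaR' and 'CTE' quantities, and the mean excess loss linking them, are straightforward to compute. A sketch for a discrete aggregate pmf follows; note that for discrete distributions the conditioning convention (S > VaR versus S ≥ VaR) varies between authors, and the strict version is my choice here.

```python
def var_cte(g, q):
    """VaR_q (the q-quantile) and CTE_q = E[S | S > VaR_q] for a pmf g on
    {0, 1, 2, ...}. CTE supplements the percentile with the mean excess loss."""
    cum, var = 0.0, len(g) - 1
    for s, p in enumerate(g):
        cum += p
        if cum >= q:
            var = s
            break
    tail_prob = sum(p for s, p in enumerate(g) if s > var)
    if tail_prob == 0.0:
        return var, float(var)  # no probability mass beyond the quantile
    mean_excess = sum((s - var) * p for s, p in enumerate(g) if s > var) / tail_prob
    return var, var + mean_excess  # CTE = VaR + mean excess beyond VaR
```

For example, for a pmf (0.5, 0.3, 0.2) on losses 0, 1, 2, the 75th-percentile VaR is 1 and the corresponding CTE is 2, since all of the conditional tail mass sits at the loss of 2.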
For example, analysis of incurred but not reported (IBNR) claim liabilities is possible using a combination of queueing and risk theoretic techniques to address their inherent stochastic nature. Another example involves the use of these compound models to quantify the credit risk associated with the possibility of payment default on financial instruments such as bonds. This particular application provides a good example of the employment of actuarial approaches by the non-actuarial financial community. Further examples include determination of the quantity of oil deposits in a particular geological or geographic region, as well as the analysis of various other quantities of interest in such diverse disciplines as demography and biostatistics. The common underlying theme in all of these applications is the identification of a random number of events, each of which gives rise to a random quantity of interest.

The development of the use of these models in the insurance context described above reflects the impact of substantial computational and analytic advances over the last twenty years. In particular, it is no longer necessary to entertain only the simplest of models from a mathematical point of view, and, equally importantly, it is not necessary to restrict oneself to simplified approximations to quantities of interest within these models. This type of approach was typical and understandable historically, but more realistic nontraditional models which lend themselves to numerical implementation are, quite appropriately, becoming more commonly used. In light of the resources at hand, it behooves us to broaden our thought processes and examine different approaches to modelling which are consistent with, and capitalize upon, advances in technology.

It is important to recognize the valuable contributions that analytic research has made to the use of all models in this area, including the simpler ones. Recursive numerical evaluation of compound distributions is a direct result of the use of a theoretical analytic tool known as a generating function. Similar comments apply to the use of the Fast Fourier Transform, and to continuous analogues such as the Heckman-Meyers algorithm commonly used by casualty actuaries. This is a good example of the valuable interaction between analytic and computational advances. It is apparent that access to powerful and sophisticated computational resources has not lessened the importance of analytic research; rather, the opposite is true. The availability of such resources not only allows for the use of more complex models, but also allows one to extract more information from even the simplest models. A good example in the present context is the manner in which deductibles and maximums may now be treated. The ad hoc approaches which were employed twenty years ago may now be replaced by the systematic and theoretically appropriate methodology currently available. The availability of these tools is a direct result of theoretical analytic research.

The benefits to the actuarial community of interaction with other disciplines are clear in general, but particularly so in the present loss modelling context. Obvious benefits from biostatistics, reliability theory, queueing theory, mathematical economics, and probability theory immediately come to mind. Special mention should also be made of the interaction with the area of finance. It is now apparent that financial risk management encompasses many varied components, including those which have traditionally been viewed as primarily actuarial in nature.
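The generating-function connection just mentioned is what makes the transform methods work: for a compound Poisson total, the probability generating function identity P_S(z) = exp(λ(P_X(z) − 1)) can be evaluated at the roots of unity and inverted with an FFT. A sketch follows; the use of NumPy and the truncation length n (which must be large enough that wrap-around aliasing is negligible) are my own choices.

```python
import numpy as np

def compound_poisson_fft(lam, sev, n):
    """Aggregate pmf of a compound Poisson distribution via its probability
    generating function, P_S(z) = exp(lam * (P_X(z) - 1)), evaluated on the
    unit circle with an FFT of length n and inverted. sev is the claim-size
    pmf on {0, 1, 2, ...}, zero-padded to length n."""
    f = np.zeros(n)
    f[:len(sev)] = sev
    phi = np.fft.fft(f)                       # P_X at the n-th roots of unity
    g = np.fft.ifft(np.exp(lam * (phi - 1.0)))  # invert back to the pmf
    return np.real(g)
```

As a sanity check, if every claim has size 1 the aggregate distribution must collapse to a Poisson distribution for the claim count itself.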
It is therefore reassuring to note that recent research in financial analysis and in traditional actuarial loss modelling has involved the same type of interaction. The 'VaR' and 'CTE' risk measure interpretation of the ruin problem described earlier is one example of this phenomenon. Other examples include the very close connection between premium calculation principles and the risk measures discussed previously, the similarity between the stop-loss premium in insurance loss modelling and the classical Black-Scholes formula in option pricing, and the use of the Esscher transform as an analytic entity in aggregate loss modelling as well as in a financial context. Along these lines, it is important to note that many technical tools are currently used in both the financial and actuarial contexts. Laplace transforms, martingales, and ideas from extreme value theory are among those which are immediately apparent. The common use of such tools ensures that a close interaction will be maintained in the future between the actuarial and financial risk management research communities, both of which can and will benefit from the mutual exchange of ideas.

Future technological advancements require that quantitative research be seen in a long term context. The view that the output of such research is not of use because immediate applications are not apparent reflects shortsightedness. Furthermore, the categorization of research as theoretical does not imply as a consequence that it is not practical. In short, to quote my colleague Harry Panjer, 'there is nothing more practical than a good theory'.
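In that spirit, the Esscher transform mentioned above has a very simple concrete form for a discrete loss distribution: an exponential tilting that shifts probability toward large outcomes. The discrete setting and normalization below are my own illustration.

```python
import math

def esscher_transform(f, h):
    """Esscher transform of a pmf f on {0, 1, 2, ...} with parameter h:
    f_h(x) = exp(h * x) * f(x) / sum_y exp(h * y) * f(y).
    For h > 0 the transform tilts probability toward large losses."""
    w = [math.exp(h * x) * p for x, p in enumerate(f)]
    total = sum(w)
    return [v / total for v in w]
```

Setting h = 0 returns the original distribution, while increasing h loads the tail, which is the mechanism exploited in both Esscher premium principles and risk-neutral pricing arguments.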

In conclusion, we have tried to provide a 'big picture' view of insurance loss modelling as it pertains to the health and casualty coverages considered here. I hope that you have an appreciation for the vital role that research has played in arriving at the state of technical knowledge currently at our disposal, as well as for the important role it can be expected to play in future advancements. Moreover, the exchange of experience and ideas with other technically related disciplines has been, and will likely continue to be, instrumental in fostering these future improvements. I would like to thank you all for your attention to these comments today. If there are any questions, I would be happy to take them now. I would now like to invite everyone to the reception.

References

1. Box, G. (1976). "Science and Statistics". Journal of the American Statistical Association, 71.
2. Jewell, W. (1980). "Generalized Models of the Insurance Business". Transactions of the 21st International Congress of Actuaries, Zurich, S.


More information

by Maria Heiden, Berenberg Bank

by Maria Heiden, Berenberg Bank Dynamic hedging of equity price risk with an equity protect overlay: reduce losses and exploit opportunities by Maria Heiden, Berenberg Bank As part of the distortions on the international stock markets

More information

ACCOUNTING STANDARDS BOARD DECEMBER 2004 FRS 27 27LIFE ASSURANCE STANDARD FINANCIAL REPORTING ACCOUNTING STANDARDS BOARD

ACCOUNTING STANDARDS BOARD DECEMBER 2004 FRS 27 27LIFE ASSURANCE STANDARD FINANCIAL REPORTING ACCOUNTING STANDARDS BOARD ACCOUNTING STANDARDS BOARD DECEMBER 2004 FRS 27 27LIFE ASSURANCE FINANCIAL REPORTING STANDARD ACCOUNTING STANDARDS BOARD Financial Reporting Standard 27 'Life Assurance' is issued by the Accounting Standards

More information

Final. Actuarial Standards Board. July 2011. Document 211070. Ce document est disponible en français 2011 Canadian Institute of Actuaries

Final. Actuarial Standards Board. July 2011. Document 211070. Ce document est disponible en français 2011 Canadian Institute of Actuaries Final Final Standards Standards of Practice for the Valuation of Insurance Contract Liabilities: Life and Health (Accident and Sickness) Insurance (Subsection 2350) Relating to Mortality Improvement (clean

More information

2014 Statutory Combined Annual Statement Schedule P Disclosure

2014 Statutory Combined Annual Statement Schedule P Disclosure 2014 Statutory Combined Annual Statement Schedule P Disclosure This disclosure provides supplemental facts and methodologies intended to enhance understanding of Schedule P reserve data. It provides additional

More information

Business Valuation under Uncertainty

Business Valuation under Uncertainty Business Valuation under Uncertainty ONDŘEJ NOWAK, JIŘÍ HNILICA Department of Business Economics University of Economics Prague W. Churchill Sq. 4, 130 67 Prague 3 CZECH REPUBLIC ondrej.nowak@vse.cz http://kpe.fph.vse.cz

More information

Approximation of Aggregate Losses Using Simulation

Approximation of Aggregate Losses Using Simulation Journal of Mathematics and Statistics 6 (3): 233-239, 2010 ISSN 1549-3644 2010 Science Publications Approimation of Aggregate Losses Using Simulation Mohamed Amraja Mohamed, Ahmad Mahir Razali and Noriszura

More information

MATHEMATICS OF FINANCE AND INVESTMENT

MATHEMATICS OF FINANCE AND INVESTMENT MATHEMATICS OF FINANCE AND INVESTMENT G. I. FALIN Department of Probability Theory Faculty of Mechanics & Mathematics Moscow State Lomonosov University Moscow 119992 g.falin@mail.ru 2 G.I.Falin. Mathematics

More information

WHEN YOU CONSULT A STATISTICIAN... WHAT TO EXPECT

WHEN YOU CONSULT A STATISTICIAN... WHAT TO EXPECT WHEN YOU CONSULT A STATISTICIAN... WHAT TO EXPECT SECTION ON STATISTICAL CONSULTING AMERICAN STATISTICAL ASSOCIATION 2003 When you consult a statistician, you enlist the help of a professional who is particularly

More information

THE MEASUREMENT OF NON-LIFE INSURANCE OUTPUT IN THE AUSTRALIAN NATIONAL ACCOUNTS

THE MEASUREMENT OF NON-LIFE INSURANCE OUTPUT IN THE AUSTRALIAN NATIONAL ACCOUNTS For Official Use STD/NA(99)20 Organisation de Coopération et de Développement Economiques OLIS : 19-Aug-1999 Organisation for Economic Co-operation and Development Dist. : 20-Aug-1999 Or. Eng. STATISTICS

More information

Statement #4/Managerial Cost Accounting Concepts and Standards for the Federal Government

Statement #4/Managerial Cost Accounting Concepts and Standards for the Federal Government Statement #4/Managerial Cost Accounting Concepts and Standards for the Federal Government Executive Office of the President Office of Management and Budget "Managerial Cost Accounting Concepts and Standards

More information

INTERNATIONAL COMPARISON OF INTEREST RATE GUARANTEES IN LIFE INSURANCE

INTERNATIONAL COMPARISON OF INTEREST RATE GUARANTEES IN LIFE INSURANCE INTERNATIONAL COMPARISON OF INTEREST RATE GUARANTEES IN LIFE INSURANCE J. DAVID CUMMINS, KRISTIAN R. MILTERSEN, AND SVEIN-ARNE PERSSON Abstract. Interest rate guarantees seem to be included in life insurance

More information

Article from: Risk Management. June 2009 Issue 16

Article from: Risk Management. June 2009 Issue 16 Article from: Risk Management June 2009 Issue 16 CHAIRSPERSON S Risk quantification CORNER Structural Credit Risk Modeling: Merton and Beyond By Yu Wang The past two years have seen global financial markets

More information

Capital Requirements for Variable Annuities - Discussion Paper. August 2009. Insurance Supervision Department

Capital Requirements for Variable Annuities - Discussion Paper. August 2009. Insurance Supervision Department Capital Requirements for Variable Annuities - Discussion Paper August 2009 Insurance Supervision Department Contents Introduction... 2 Suggested Methodology... 3 Basic Principle... 3 Basic Guarantee Liabilities...

More information

Cover. Optimal Retentions with Ruin Probability Target in The case of Fire. Insurance in Iran

Cover. Optimal Retentions with Ruin Probability Target in The case of Fire. Insurance in Iran Cover Optimal Retentions with Ruin Probability Target in The case of Fire Insurance in Iran Ghadir Mahdavi Ass. Prof., ECO College of Insurance, Allameh Tabataba i University, Iran Omid Ghavibazoo MS in

More information

http://www.jstor.org This content downloaded on Tue, 19 Feb 2013 17:28:43 PM All use subject to JSTOR Terms and Conditions

http://www.jstor.org This content downloaded on Tue, 19 Feb 2013 17:28:43 PM All use subject to JSTOR Terms and Conditions A Significance Test for Time Series Analysis Author(s): W. Allen Wallis and Geoffrey H. Moore Reviewed work(s): Source: Journal of the American Statistical Association, Vol. 36, No. 215 (Sep., 1941), pp.

More information

Matching Investment Strategies in General Insurance Is it Worth It? Aim of Presentation. Background 34TH ANNUAL GIRO CONVENTION

Matching Investment Strategies in General Insurance Is it Worth It? Aim of Presentation. Background 34TH ANNUAL GIRO CONVENTION Matching Investment Strategies in General Insurance Is it Worth It? 34TH ANNUAL GIRO CONVENTION CELTIC MANOR RESORT, NEWPORT, WALES Aim of Presentation To answer a key question: What are the benefit of

More information

Credibility: Theory Meets Regulatory Practice Matt Hassett, Arizona State University Brian Januzik, Advance PCS

Credibility: Theory Meets Regulatory Practice Matt Hassett, Arizona State University Brian Januzik, Advance PCS Credibility: Theory Meets Regulatory Practice Matt Hassett, Arizona State University Brian Januzik, Advance PCS 1. Introduction. The examination process requires actuaries to master the theoretical foundations

More information

7 Conclusions and suggestions for further research

7 Conclusions and suggestions for further research 7 Conclusions and suggestions for further research This research has devised an approach to analyzing system-level coordination from the point of view of product architecture. The analysis was conducted

More information

Return to Risk Limited website: www.risklimited.com. Overview of Options An Introduction

Return to Risk Limited website: www.risklimited.com. Overview of Options An Introduction Return to Risk Limited website: www.risklimited.com Overview of Options An Introduction Options Definition The right, but not the obligation, to enter into a transaction [buy or sell] at a pre-agreed price,

More information

Quantitative Methods for Finance

Quantitative Methods for Finance Quantitative Methods for Finance Module 1: The Time Value of Money 1 Learning how to interpret interest rates as required rates of return, discount rates, or opportunity costs. 2 Learning how to explain

More information

DETERMINING ULTIMATE CLAIM LIABILITIES FOR HEALTH INSURANCE COVERAGES

DETERMINING ULTIMATE CLAIM LIABILITIES FOR HEALTH INSURANCE COVERAGES 200 DETERMINING ULTIMATE CLAIM LIABILITIES FOR HEALTH INSURANCE COVERAGES EMIL J. STRUG I. INTRODUCTION The purpose of this paper is to add another chapter to the fund of knowledge being accumulated on

More information

Insurance as Operational Risk Management Tool

Insurance as Operational Risk Management Tool DOI: 10.7763/IPEDR. 2012. V54. 7 Insurance as Operational Risk Management Tool Milan Rippel 1, Lucie Suchankova 2 1 Charles University in Prague, Czech Republic 2 Charles University in Prague, Czech Republic

More information

Usage of Modeling Efficiency Techniques in the US Life Insurance Industry

Usage of Modeling Efficiency Techniques in the US Life Insurance Industry Usage of Modeling Efficiency Techniques in the US Life Insurance Industry April 2013 Results of a survey analyzed by the American Academy of Actuaries Modeling Efficiency Work Group The American Academy

More information

Valuation of Your Early Drug Candidate. By Linda Pullan, Ph.D. www.sharevault.com. Toll-free USA 800-380-7652 Worldwide 1-408-717-4955

Valuation of Your Early Drug Candidate. By Linda Pullan, Ph.D. www.sharevault.com. Toll-free USA 800-380-7652 Worldwide 1-408-717-4955 Valuation of Your Early Drug Candidate By Linda Pullan, Ph.D. www.sharevault.com Toll-free USA 800-380-7652 Worldwide 1-408-717-4955 ShareVault is a registered trademark of Pandesa Corporation dba ShareVault

More information

Fair Value Measurement

Fair Value Measurement Indian Accounting Standard (Ind AS) 113 Fair Value Measurement (This Indian Accounting Standard includes paragraphs set in bold type and plain type, which have equal authority. Paragraphs in bold type

More information

Mixing internal and external data for managing operational risk

Mixing internal and external data for managing operational risk Mixing internal and external data for managing operational risk Antoine Frachot and Thierry Roncalli Groupe de Recherche Opérationnelle, Crédit Lyonnais, France This version: January 29, 2002 Introduction

More information

GUIDANCE NOTE 253 - DETERMINATION OF LIFE INSURANCE POLICY LIABILITIES

GUIDANCE NOTE 253 - DETERMINATION OF LIFE INSURANCE POLICY LIABILITIES THE INSTITUTE OF ACTUARIES OF AUSTRALIA A.C.N. 000 423 656 GUIDANCE NOTE 253 - DETERMINATION OF LIFE INSURANCE POLICY LIABILITIES APPLICATION Appointed Actuaries of Life Insurance Companies. LEGISLATION

More information

Scenario Analysis in the Measurement of Operational Risk Capital: A Change of Measure Approach 1

Scenario Analysis in the Measurement of Operational Risk Capital: A Change of Measure Approach 1 Scenario Analysis in the Measurement of Operational Risk Capital: A Change of Measure Approach 1 Kabir K. Dutta 2 David F. Babbel 3 First Version: March 25, 2010; This Version: September 24, 2010 Abstract

More information

Derivative Users Traders of derivatives can be categorized as hedgers, speculators, or arbitrageurs.

Derivative Users Traders of derivatives can be categorized as hedgers, speculators, or arbitrageurs. OPTIONS THEORY Introduction The Financial Manager must be knowledgeable about derivatives in order to manage the price risk inherent in financial transactions. Price risk refers to the possibility of loss

More information

INFORMATION FOR OBSERVERS. IASB Meeting: Insurance Working Group, April 2008 Paper: Non-life insurance contracts (Agenda paper 6)

INFORMATION FOR OBSERVERS. IASB Meeting: Insurance Working Group, April 2008 Paper: Non-life insurance contracts (Agenda paper 6) 30 Cannon Street, London EC4M 6XH, England International Phone: +44 (0)20 7246 6410, Fax: +44 (0)20 7246 6411 Accounting Standards Email: iasb@iasb.org.uk Website: http://www.iasb.org Board This document

More information

INTERNATIONAL ASSOCIATION OF INSURANCE SUPERVISORS

INTERNATIONAL ASSOCIATION OF INSURANCE SUPERVISORS Standard No. 13 INTERNATIONAL ASSOCIATION OF INSURANCE SUPERVISORS STANDARD ON ASSET-LIABILITY MANAGEMENT OCTOBER 2006 This document was prepared by the Solvency and Actuarial Issues Subcommittee in consultation

More information

Risk-Based Regulatory Capital for Insurers: A Case Study 1

Risk-Based Regulatory Capital for Insurers: A Case Study 1 Risk-Based Regulatory Capital for Insurers: A Case Study 1 Christian Sutherland-Wong Bain International Level 35, Chifley Tower 2 Chifley Square Sydney, AUSTRALIA Tel: + 61 2 9229 1615 Fax: +61 2 9223

More information

The Distribution of Aggregate Life Insurance Claims

The Distribution of Aggregate Life Insurance Claims The Distribution of ggregate Life Insurance laims Tom Edwalds, FS Munich merican Reassurance o. BSTRT This paper demonstrates the calculation of the moments of the distribution of aggregate life insurance

More information

IASB Educational Session Non-Life Claims Liability

IASB Educational Session Non-Life Claims Liability IASB Board Meeting Observer Note- Agenda Paper 10 January 2005 IASB Educational Session Non-Life Claims Liability Presented by the International Actuarial Association January 19, 2005 Sam Gutterman and

More information

SOA Health 2007 Spring Meeting Valuation and Reserving Techniques

SOA Health 2007 Spring Meeting Valuation and Reserving Techniques SOA Health 2007 Spring Meeting Valuation and Reserving Techniques Property & Casualty Reserving Techniques Theresa Bourdon, FCAS, MAAA CAS Statement of Principles Regarding P&C Loss and Loss Adjustment

More information

Summary of the Paper Awarded the SCOR Prize in Actuarial Science 2012 in Germany (2 nd Prize)

Summary of the Paper Awarded the SCOR Prize in Actuarial Science 2012 in Germany (2 nd Prize) Summary of the Paper Awarded the SCOR Prize in Actuarial Science 2012 in Germany (2 nd Prize) Title: Market-Consistent Valuation of Long-Term Insurance Contracts Valuation Framework and Application to

More information

Institute of Actuaries of India

Institute of Actuaries of India Institute of Actuaries of India GUIDANCE NOTE (GN) 6: Management of participating life insurance business with reference to distribution of surplus Classification: Recommended Practice Compliance: Members

More information

Guidance on Best Estimate and Margin for Uncertainty

Guidance on Best Estimate and Margin for Uncertainty 2014 Guidance on Best Estimate and Margin for Uncertainty Guidance on Best Estimate and Margin for Uncertainty Contents Introduction... 3 Best Estimate of Claims Liabilities... 3 Margin for Uncertainty...

More information

The Best of Both Worlds:

The Best of Both Worlds: The Best of Both Worlds: A Hybrid Approach to Calculating Value at Risk Jacob Boudoukh 1, Matthew Richardson and Robert F. Whitelaw Stern School of Business, NYU The hybrid approach combines the two most

More information

GLOSSARY OF ACTUARIAL AND RATEMAKING TERMINOLOGY

GLOSSARY OF ACTUARIAL AND RATEMAKING TERMINOLOGY GLOSSARY OF ACTUARIAL AND RATEMAKING TERMINOLOGY Term Accident Accident Date Accident Period Accident Year Case- Incurred Losses Accident Year Experience Acquisition Cost Actuary Adverse Selection (Anti-Selection,

More information

Guideline. Source of Earnings Disclosure (Life Insurance Companies) No: D-9 Date: December 2004 Revised: July 2010

Guideline. Source of Earnings Disclosure (Life Insurance Companies) No: D-9 Date: December 2004 Revised: July 2010 Guideline Subject: Category: (Life Insurance Companies) Accounting No: D-9 Date: December 2004 Revised: July 2010 This Guideline, which applies to life insurance companies and life insurance holding companies

More information

Credibility Procedures Applicable to Accident and Health, Group Term Life, and Property/Casualty Coverages

Credibility Procedures Applicable to Accident and Health, Group Term Life, and Property/Casualty Coverages Actuarial Standard of Practice No. 25 Credibility Procedures Applicable to Accident and Health, Group Term Life, and Property/Casualty Coverages Developed by the Casualty and Health Committees of the Actuarial

More information

RESP Investment Strategies

RESP Investment Strategies RESP Investment Strategies Registered Education Savings Plans (RESP): Must Try Harder Graham Westmacott CFA Portfolio Manager PWL CAPITAL INC. Waterloo, Ontario August 2014 This report was written by Graham

More information

Admission Criteria Minimum GPA of 3.0 in a Bachelor s degree (or equivalent from an overseas institution) in a quantitative discipline.

Admission Criteria Minimum GPA of 3.0 in a Bachelor s degree (or equivalent from an overseas institution) in a quantitative discipline. Overview Offered by the Mona School of Business in conjunction with the Department of Mathematics, Faculty of Science & Technology, The University of the West Indies. The MSc. ERM degree programme is designed

More information

The primary goal of this thesis was to understand how the spatial dependence of

The primary goal of this thesis was to understand how the spatial dependence of 5 General discussion 5.1 Introduction The primary goal of this thesis was to understand how the spatial dependence of consumer attitudes can be modeled, what additional benefits the recovering of spatial

More information

Risk Management & Solvency Assessment of Life Insurance Companies By Sanket Kawatkar, BCom, FIA Heerak Basu, MA, FFA, FASI, MBA

Risk Management & Solvency Assessment of Life Insurance Companies By Sanket Kawatkar, BCom, FIA Heerak Basu, MA, FFA, FASI, MBA Risk Management & Solvency Assessment of Life Insurance Companies By Sanket Kawatkar, BCom, FIA Heerak Basu, MA, FFA, FASI, MBA Abstract This paper summarises the work done in this area by various parties,

More information

Penalized regression: Introduction

Penalized regression: Introduction Penalized regression: Introduction Patrick Breheny August 30 Patrick Breheny BST 764: Applied Statistical Modeling 1/19 Maximum likelihood Much of 20th-century statistics dealt with maximum likelihood

More information