Modelling operational risk in the insurance industry




April 2011 N° 14

Modelling operational risk in the insurance industry

By Julie Gamonet, Centre d'études actuarielles, winner of the French "Young Actuaries Prize" 2010

Texts appearing in SCOR Papers are the responsibility of their authors alone. In publishing such articles, SCOR takes no position on the opinions expressed by the authors in their texts and disclaims all responsibility for any opinions, incorrect information or legal errors found therein.

Abstract

One of the major new features of Solvency II is that insurance companies must now devote a portion of their equity to covering their exposure to operational risks. The regulator has proposed two approaches to calculating this capital requirement: a standard approach and a more advanced one. The standard approach is a simplified calculation based on a percentage of premiums or reserves. The advanced approach uses an internal risk model that corresponds to the company's real situation. EIOPA has encouraged insurance companies to adopt the internal model by structuring the standard approach so that it consumes much more equity, as was seen in Quantitative Impact Study 5 (QIS5).

This paper proposes an approach that distinguishes frequency risks from severity risks. Frequency risks, defined as the risk of suffering small losses frequently, are modelled with the Loss Distribution Approach. Severity risks, the risk of large but rare losses, are modelled with Bayesian networks. Note: because frequency and severity risks are modelled separately, the frequency risk model is much less sensitive to technical choices (fitted distributions, aggregation).

This SCOR Paper is a summary of a dissertation awarded "Best research dissertation 2010" by the French Actuarial Prize, awarded by SCOR in partnership with the Institut de l'actuariat.

1. Using the Loss Distribution Approach to Model Frequency Risk

The Loss Distribution Approach consists of fitting statistical distributions to loss data. More specifically, this means modelling i) the frequency of loss events and ii) their severity, then combining the two to obtain the distribution of total losses. This approach is often used to model total insurance losses as part of the rate-setting or provisioning process. To calibrate the model, historical loss data must be available.

Step 1: choose and calibrate the frequency and severity distributions

For each risk k, the total loss is modelled as

$$S_k = \sum_{j=1}^{N_k} X_{j,k}$$

where N_k is the random variable representing the number of losses for risk k, X_{j,k} is the random variable representing the amount of loss j for risk k, and S_k is the sum of all losses for risk k.

The Poisson, binomial and negative binomial distributions are the ones generally used for frequency. There is a virtual consensus in the industry in favour of the maximum likelihood method for estimating the distribution parameters. The choice of model is then validated by statistical testing.

Step 2: build the total loss distribution

Monte Carlo simulation gives a good approximation of the distribution function of the total loss S_k.

Practical problem: the data collection threshold

In general, operations staff cannot declare all losses; rather, a declaration threshold must be set. The observable data then consist only of the losses that exceed this threshold. When fitting the distributions, we must take into account the fact that the data are truncated, and adjust the parameter estimates and goodness-of-fit tests accordingly. Otherwise, we will underestimate the frequency and fit the distribution of individual costs incorrectly.

Individual cost distribution: the data below the threshold are not reported but must be taken into account when estimating the distribution parameters. Specifically, the parameters have to be estimated allowing for the missing data to the left of the threshold, as illustrated below.

[Figure: cost of individual losses, with the theoretical distribution readjusted to account for the data truncated to the left of the collection threshold.]
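To make Steps 1 and 2 concrete, here is a minimal Monte Carlo sketch of the compound model S_k, assuming a Poisson frequency and a lognormal severity; the parameter values (lam, mu, sigma) are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch of the Loss Distribution Approach: Monte Carlo simulation
# of the total loss S_k for one risk k, under an assumed Poisson frequency
# and lognormal severity. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(seed=42)

lam = 30.0            # assumed expected number of losses per year (Poisson)
mu, sigma = 2.0, 1.0  # assumed lognormal severity parameters

n_sims = 100_000
totals = np.empty(n_sims)
for i in range(n_sims):
    n_losses = rng.poisson(lam)                      # draw N_k
    severities = rng.lognormal(mu, sigma, n_losses)  # draw X_{1,k}, ..., X_{N_k,k}
    totals[i] = severities.sum()                     # S_k = sum of the N_k losses

# Empirical distribution of S_k: mean and a high quantile
print("mean total loss:", totals.mean())
print("99.5% quantile :", np.quantile(totals, 0.995))
```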

Let U be the loss data collection threshold. Let X be the random variable representing the cost of non-truncated losses, i.e. losses including amounts that may be less than U, and let F be its distribution function. Only the conditional distribution of X given X > U generates observable data: it is the distribution of X truncated to the left of U. Since the observable data lie beyond the threshold U, the truncated distribution function can be written, for x ≥ U:

$$F_U(x) = P(X \le x \mid X > U) = \frac{F(x) - F(U)}{1 - F(U)}$$

The graph below shows how a lognormal distribution LN(2; 1) is deformed depending on where the data collection threshold is positioned.

[Figure: densities of a lognormal LN(2; 1) distribution truncated at different collection thresholds.]

Note: certain distributions, e.g. the exponential and the Pareto, remain stable when truncated to the left.

The maximum likelihood method enables us to calibrate our distributions, but it does not yield an analytical solution here, so the maximisation is carried out numerically. Let (x_1, x_2, ..., x_n) be a sample of losses whose amounts exceed the threshold U. The likelihood conditional on the threshold U can be written:

$$L(\theta; x_1, \ldots, x_n) = \prod_{i=1}^{n} \frac{f_\theta(x_i)}{1 - F_\theta(U)}$$

The log-likelihood can then be written:

$$\ell(\theta) = \sum_{i=1}^{n} \ln f_\theta(x_i) - n \ln\bigl(1 - F_\theta(U)\bigr)$$

The parameters are estimated in the traditional manner by maximising the log-likelihood.

Frequency distribution: when there is a data collection threshold, the observed loss frequency is underestimated and must be adjusted to take undeclared losses into account. For each observation period i, the number of observed losses, denoted n_i, is increased by the estimated number of losses under the threshold, denoted m_i. The ratio of the number of losses under the threshold to the number of observable losses is the same as the ratio of the area under the individual-loss density to the left of the threshold to the area to the right of it:

$$\frac{m_i}{n_i} = \frac{F_{\hat{\theta}}(U)}{1 - F_{\hat{\theta}}(U)}$$

where F_θ̂ is the distribution function of X whose parameters have been estimated by the preceding method. From this we derive:

$$m_i = n_i \cdot \frac{F_{\hat{\theta}}(U)}{1 - F_{\hat{\theta}}(U)}$$
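As an illustration of the truncated fit, the sketch below maximises the truncated log-likelihood for a lognormal severity and then applies the frequency correction m_i = n_i F(U) / (1 - F(U)). The simulated data, the threshold U and the optimiser starting point are all assumptions made for the example.

```python
# Sketch: maximum-likelihood fit of a lognormal severity to losses observed
# above a collection threshold U, using the truncated log-likelihood
#   l(theta) = sum_i ln f_theta(x_i) - n * ln(1 - F_theta(U)),
# followed by the frequency correction m_i = n_i * F(U) / (1 - F(U)).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
U = 5.0                                                  # collection threshold
full = rng.lognormal(mean=2.0, sigma=1.0, size=2_000)    # all losses (unseen)
observed = full[full > U]                                # only declared losses

def neg_log_likelihood(theta, x, U):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    log_f = stats.lognorm.logpdf(x, s=sigma, scale=np.exp(mu))
    log_tail = stats.lognorm.logsf(U, s=sigma, scale=np.exp(mu))  # ln(1 - F(U))
    return -(log_f.sum() - len(x) * log_tail)

res = optimize.minimize(neg_log_likelihood, x0=[1.0, 0.5],
                        args=(observed, U), method="Nelder-Mead")
mu_hat, sigma_hat = res.x

# Frequency correction: estimated number of undeclared losses below U
F_U = stats.lognorm.cdf(U, s=sigma_hat, scale=np.exp(mu_hat))
n_i = len(observed)
m_i = n_i * F_U / (1.0 - F_U)
print(f"fitted mu={mu_hat:.2f}, sigma={sigma_hat:.2f}, "
      f"observed={n_i}, estimated undeclared={m_i:.0f}")
```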

2. Modelling severity risk using the Bayesian method

The Bayesian approach consists of carrying out a qualitative risk analysis based on expert knowledge and transforming it into a quantitative analysis. A Bayesian network is a probabilistic causal graph representing the existing knowledge in a given domain. It is made up of discrete random variables, called nodes, linked by directed arcs that represent causal relationships. Specifically, we have studied the exposure / occurrence / severity model. The task consists of defining and modelling these three characteristic measures of risk, which are in turn influenced by variables called key risk indicators (KRIs). We present below the methods for evaluating the various elements of the Bayesian network.

a) Evaluating exposure
Exposure represents all of the elements of the company that are exposed to the risk. It must be defined in such a way that the risk can materialise into an occurrence only once per year.

b) Evaluating occurrence
As the exposed elements are chosen such that each can suffer at most one loss, occurrence follows a binomial distribution B(n, p), where n is the number of exposed elements and p the probability to be estimated.

c) Evaluating severity
We assume that there has been an occurrence, and we identify the quantifiable variables (KRIs) involved in the calculation of severity.

The structure of the Bayesian network is defined by experts via scenarios. The parameters of the Bayesian network can be determined empirically or based on expert knowledge. Once the Bayesian network is constructed, the calculation algorithm needs to be defined.

Let (X_1, X_2, ..., X_n) be the elements exposed to the operational risk in question.

Let P_i = P(Exposure = X_i) be the probability that the exposure indeed consists of the elements X_i.

Let PO_i = P(Occurrence = "Yes" | Exposure = X_i) be the probability that the risk materialises into an occurrence, given that the exposure is X_i. These two probabilities are known (they were estimated as described above).

Let PS_i = P(Severity | Occurrence = "Yes", Exposure = X_i) be the severity distribution, given that a loss occurred on the elements X_i.

The algorithm consists of carrying out the following steps in succession:

1) position Exposure at X_i and Occurrence at "Yes" in the Bayesian network and read the severity distribution PS_i = P(Severity | Occurrence = "Yes", Exposure = X_i);
2) sample the number of losses F_i according to the binomial distribution B(nb(X_i); PO_i);
3) for each loss from 1 to F_i, sample the severity according to the distribution PS_i;
4) sum the F_i severities.

Repeat these four steps a large number of times, recording the sum of the severities each time. In this way, we obtain the distribution of total losses.
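The four steps can be sketched in a few lines of Python. The exposure states, the probabilities P_i and PO_i and the severity samplers PS_i below are hypothetical placeholders, and P_i is read here as a mixture weight over the possible exposure states; this is one possible implementation, not the paper's own.

```python
# Sketch of the four-step simulation algorithm for an exposure / occurrence /
# severity Bayesian network, with the network reduced to explicit conditional
# distributions. All figures are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical exposure states X_i: (nb(X_i), P_i, PO_i, sampler for PS_i).
exposures = [
    (10_000, 0.7, 0.0005, lambda n: rng.lognormal(1.0, 0.8, n)),
    (2_000,  0.3, 0.0020, lambda n: rng.lognormal(2.0, 1.0, n)),
]
weights = [e[1] for e in exposures]

def simulate_total_loss():
    # Step 1: position Exposure at X_i (drawn with probability P_i) and read
    # the conditional severity distribution PS_i.
    nb, _, po_i, ps_i = exposures[rng.choice(len(exposures), p=weights)]
    # Step 2: sample the number of losses F_i ~ B(nb(X_i), PO_i).
    f_i = rng.binomial(nb, po_i)
    # Steps 3-4: sample F_i severities from PS_i and sum them.
    return ps_i(f_i).sum() if f_i > 0 else 0.0

# Repeating many times yields the distribution of total losses.
losses = np.array([simulate_total_loss() for _ in range(10_000)])
print("mean:", losses.mean(), " 99.5% quantile:", np.quantile(losses, 0.995))
```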

3. Sample application of the Bayesian method

We will now apply the Bayesian model to the risk of error in order execution in the financial markets. The risk is treated in a very simple fashion, because the objective is to show how the Bayesian approach works.

The first step in the Bayesian approach is to create a network, represented by a graph, and to define the distributions corresponding to each factor. These parameters can be reviewed during a back-testing phase. Based on an analysis of the risk, we constructed the network graph [figure not reproduced].

In the Bayesian approach, three elements are defined to model the operational risk: exposure, occurrence and severity.

Exposure: this must correspond to elements exposed to the risk that can be transformed into a loss only once during the period. We have chosen orders as these elements; indeed, an order can be erroneous only once. We do not anticipate an increase in the number of orders in the coming year, and the number of observed orders is on average 25,000 per year.

Occurrence: this is defined as an error in an order. On average, we have 18.2 erroneous orders per year, meaning that the probability that a given order will be erroneous is 18.2/25,000, or 0.0728%.

Severity: this is defined as the loss due to the market shift plus transaction costs. We make the following assumptions about the factors entering into the calculation of severity:

- the amount of the error (expressed in millions of euros): distribution defined by experts [table not reproduced];
- the correction period: the time it takes to correct the error, i.e. from the moment it occurs until a corrected order is placed [table not reproduced];
- transaction costs: we assume that transaction costs are equal to 0.25% of the amount of the transaction. The erroneous amount is executed a first time, then a second time to correct it, so the costs are calculated here by multiplying the erroneous amount by 0.5%;
- market shift: the change in rates observed during the time it takes to correct the order. Losses due to the market shift are calculated by multiplying the erroneous amount by the size of the rate shift.

Let us now run the algorithm described in the theoretical part. The elements exposed to the risk are orders of the same type: X_i = X, which implies that P_i = P(Exposure = X_i) = 1. We deduce from this the probability of an occurrence, given that the exposure is X_i:

PO_i = P(Occurrence = "Yes" | Exposure = X_i) = P(Occurrence = "Yes") = 0.0728%

The algorithm consists of carrying out the following steps in succession:

1) calculate the severity distribution PS_i = P(Severity | Occurrence = "Yes", Exposure = X_i);
2) sample the number of losses F_i according to the binomial distribution B(nb(X_i); PO_i) = B(25,000; 0.0728%); orders are independent and each order generates a Bernoulli trial;
3) for each occurrence from 1 to F_i, sample the severity according to the distribution PS_i;
4) sum the F_i severities.

Repeat these four steps 10,000 times, recording the sum of the severities each time. In this way, we obtain a distribution of total losses, from which we can derive the 99.5% VaR.

We have carried out sensitivity tests on the various parameters of the Bayesian network. In each set of tests, we vary the parameters of one factor while holding the parameters of all other factors constant.
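Under the stated assumptions (25,000 independent orders, error probability 0.0728%, round-trip transaction costs of 0.5% of the erroneous amount), the simulation can be sketched as follows. Since the paper's distribution tables for the erroneous amount and the market shift are not reproduced here, lognormal and half-normal placeholders stand in for them.

```python
# Sketch of the worked example: order execution errors.
# Severity = erroneous amount x (0.5% round-trip costs + market shift).
# The amount and market-shift distributions are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(2)

n_orders = 25_000
p_error = 18.2 / 25_000          # = 0.0728%

def severity(n):
    amount = rng.lognormal(mean=0.0, sigma=1.0, size=n)  # assumed, EUR millions
    shift = np.abs(rng.normal(0.0, 0.01, size=n))        # assumed rate shift
    return amount * (0.005 + shift)

n_sims = 10_000
totals = np.empty(n_sims)
for k in range(n_sims):
    f = rng.binomial(n_orders, p_error)   # number of erroneous orders F_i
    totals[k] = severity(f).sum() if f > 0 else 0.0

print("expected annual loss:", totals.mean())
print("99.5% VaR           :", np.quantile(totals, 0.995))
```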

The Bayesian model presents numerous advantages:
- it takes into account not only quantitative but also qualitative factors, which is not the case for most models;
- it makes the causal links between the variables explicit; risks are aggregated through the very construction of the network, obviating the need to estimate correlations;
- it uses inference to detect risk reduction factors;
- Risk Management can use it to implement action plans and observe their effectiveness.

The major disadvantage of Bayesian networks is that they take a long time to set up, because they require a detailed analysis of each risk.
