Model Risk Within Operational Risk


1 Model Risk Within Operational Risk
Santiago Carrillo, QRR; Ali Samad-Khan, Stamford Risk Analytics
June 4, 2015. Reinventing Risk Management

2 Challenges in Operational Risk Modelling
International Model Risk Management Conference, Toronto, June 4, 2015
Santiago Carrillo Menéndez and Alberto Suárez González, RiskLab-Madrid / Quantitative Risk Research
Outline: A Non Well Defined Framework; The Loss Distribution Approach Framework; The Percentile 99.9%; Threshold Effect; Conclusions
S. Carrillo, A. Suárez (UAM-QRR), Challenges in Operational Risk Modelling, June 4, / 26

3 A Non Well Defined Framework
The definition of the framework for the AMA approach in Basel II is quite bare.
The four components: "Any risk measurement system must have certain key features to meet the supervisory soundness standard set out in this section. These elements must include the use of internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems (BEICFs)."
External data: "A bank's operational risk measurement system must use relevant external data..., especially when there is reason to believe that the bank is exposed to infrequent, yet potentially severe, losses.... A bank must have a systematic process for determining the situations for which external data must be used and the methodologies used to incorporate the data..."
Fat tails and the 99.9th percentile: "A bank must be able to demonstrate that its approach captures potentially severe tail loss events. Whatever approach is used, a bank must demonstrate that its operational risk measure meets a soundness standard comparable to that of the internal ratings-based approach for credit risk (i.e. comparable to a one-year holding period and a 99.9 percent confidence interval)."

4 The Loss Distribution Approach I
With time, it became clear that "advanced models" means, essentially, LDA. That means:
define a severity distribution and an (independent) frequency distribution;
combine them into the (yearly) aggregate loss distribution;
take the 99.9th percentile of this aggregate loss distribution as the risk measure.
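The three steps above can be sketched in a few lines. A minimal Monte Carlo sketch, assuming (purely for illustration) a Poisson frequency and a lognormal severity; all parameter values are hypothetical:

```python
import numpy as np

def lda_opvar(lam, mu, sigma, alpha=0.999, n_years=20_000, seed=0):
    """LDA by Monte Carlo: simulate many years, sum the losses of each
    year, and read off the alpha-quantile of the aggregate distribution."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_years)            # yearly loss counts
    annual = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])
    return np.quantile(annual, alpha)

# illustrative parameters only: 200 losses/year, lognormal severity
opvar = lda_opvar(lam=200, mu=10.0, sigma=2.5)
```

Note that with 20,000 simulated years only about 20 observations fall beyond the 99.9% quantile, which is itself a reminder of how fragile this estimate is.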

5 The Loss Distribution Approach II
In this context, model risk is the risk of having a model which provides a wrong estimate of this percentile. Nevertheless, several issues have appeared:
scarcity of data;
loss events are not a time series;
how to weight the different sources of data;
how to use external data: scaled or not scaled? If not scaled, the choice of weights becomes more complex;
no specification of how to use scenarios.
In addition, the use of thresholds may induce serious biases in capital estimation. Each of the steps above is a potential source of model risk. It is necessary to remove these shortcomings in order to have sound AMA approaches, with all the benefits they bring to risk management. It is feasible: many financial institutions have done a lot of work in this direction.

6 The Percentile 99.9%
One of the biggest issues in the LDA approach is the accurate calculation of the 99.9th percentile of the aggregate loss distribution. This requires a precise fit of the tail of the severity distribution. In practice:
data are sparse in the tails;
different distributions may offer similar goodness of fit to the data with very different results in terms of capital: lognormals with high sigma, g-and-h distributions, the Pareto distribution.
The goal is to extrapolate the shape of the severity distribution far into the tail, based on knowledge of part of the body and tail. It may be very difficult (for lack of sufficient data far in the tail) to distinguish between the different candidates. Model error is a real threat, even more so when high thresholds are used.

7 Severity Uncertainty I
Because of the sensitivity of capital to the tail of the severity distribution, we need the best possible choice for the tail. There may not be enough empirical evidence to select among model distributions with very different asymptotic behavior. In particular, we may have distributions with a nice fit to the empirical data which assign non-negligible probabilities to losses bigger than, for example, ten times the total assets of the bank (or more). Does it make sense to even consider such distributions?
Let us consider, for example:
a lognormal (μ = 10, σ = 2.5) (the histogram of the following figure);
a piecewise-defined distribution with a lognormal body and a g-and-h (a = 0.5, b = …, g = 2.25 and h = 0.25) tail (15% of data, u_0 = …);
a piecewise-defined distribution with a lognormal body and a generalized Pareto (u = u_0, β = …, ξ = 1) tail (15% of data, u_0 = …).
We are considering really heavy-tailed distributions.
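A sketch of how such a piecewise severity can be built and compared with the pure lognormal. The splice point u_0 and the GPD scale β were not legible in the source, so the values below (u_0 at the 85th lognormal percentile, β = u_0) are assumptions for illustration:

```python
import numpy as np
from scipy import stats

MU, SIGMA, P_TAIL = 10.0, 2.5, 0.15
LN = stats.lognorm(SIGMA, scale=np.exp(MU))   # lognormal body
U0 = LN.ppf(1 - P_TAIL)                       # splice point (assumed)
XI, BETA = 1.0, U0                            # GPD tail; BETA is assumed

def spliced_ppf(q):
    """Quantile function: lognormal below U0 (85% of mass),
    generalized Pareto excess above U0 (15% of mass)."""
    q = np.asarray(q, dtype=float)
    tail = q >= 1 - P_TAIL
    t = np.clip((q - (1 - P_TAIL)) / P_TAIL, 0.0, 1.0)
    gpd = U0 + stats.genpareto.ppf(t, XI, scale=BETA)
    return np.where(tail, gpd, LN.ppf(q))

# the two severities coincide in the body but diverge far in the tail
for q in (0.5, 0.99, 0.9999):
    print(q, LN.ppf(q), float(spliced_ppf(q)))
```

With ξ = 1 the spliced distribution overtakes the lognormal far out in the tail, even though the two are indistinguishable in the body.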

8 Severity Uncertainty II
In the following figures we compare the lognormal distribution (histogram) with the lognormal + g-and-h (left) and lognormal + Pareto (right). These distributions are very similar except very far in the tails, where their asymptotic behaviors are very different. The resulting OpVaR figures associated with these distributions (λ = 200) are … (lognormal), … (g-and-h) and … (generalized Pareto).

9 Reasons for the Use of Truncated Distributions
How to solve this issue? Economic and regulatory capital should make economic sense. It seems reasonable that the sequence Basic Approach, Standard Approach, Advanced Models Approach (or whatever comes next) leads to a reduction in capital requirements, justified by the huge amount of work done for better management.
The use of heavy-tailed distributions in an LDA framework may lead to results with no economic interpretation: infinite expected losses, very unstable estimates of CaR values, diverging estimates of conditional CaR values. The losses of a bank cannot be arbitrarily large. The use of truncated distributions makes it possible to overcome these problems. It is worth pointing out that the right truncation level may be very high. The determination of this level is an open question; we propose to treat it as an economic parameter for the fine tuning of models.

10 The Framework
We reconsider the previous model. The frequency of losses is Poisson with an average of λ = 200 losses per year. For the severity, we use the three distributions previously indicated (same parameters):
a lognormal;
a piecewise-defined distribution with a lognormal body and a g-and-h tail;
a piecewise-defined distribution with a lognormal body and a generalized Pareto tail.
Several levels of right truncation are considered, from 2 million to 2 billion. This approach has several benefits:
faster convergence to the true parameters (see the appendix figures);
a more robust estimation of capital;
smaller differences between the different models, that is, smaller differences induced by unobserved virtual losses.
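A minimal sketch of how right-truncated severities can be sampled, here by inverse-CDF sampling of a lognormal capped at one of the truncation levels mentioned above:

```python
import numpy as np
from scipy import stats

def sample_truncated_lognormal(mu, sigma, cap, size, seed=0):
    """Draw from a lognormal right-truncated at `cap`: map uniforms
    on [0, F(cap)] through the untruncated quantile function."""
    rng = np.random.default_rng(seed)
    ln = stats.lognorm(sigma, scale=np.exp(mu))
    u = rng.uniform(0.0, ln.cdf(cap), size)
    return ln.ppf(u)

# right truncation at 2 billion; parameters as in the previous slides
x = sample_truncated_lognormal(10.0, 2.5, cap=2e9, size=10_000)
```

Because the untruncated lognormal puts little mass above the cap, truncation barely changes the body of the distribution but removes the virtual losses that would otherwise dominate the 99.9% quantile.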

11 Summary
These results are summarized in the table, which reports capital by truncation threshold for the three severities: LN, LN + GH and LN + GP.

12 Recapitulation
The use of right-truncated distributions allows more stable estimates of economic/regulatory capital, and the differences between the various models are less pronounced. For very heavy-tailed distributions, the capital estimates are less sensitive to variations in the truncation level than to the sampling fluctuations present when there is no truncation.
The fact that they imply less economic capital suggests that a careful analysis is needed in order to determine a reasonable truncation level. One possibility is a multiplying factor associated with this level of capital; this level could be tied, in particular, to the quality of management (controls) and of results (real empirical data). In the case of business lines/risk types, a possibility is to take the capital required for the business line under the Standard Approach as a cap: a bank having a greater loss in its database would not be allowed a smaller amount of capital. However, this procedure by itself does not guarantee that the capital is smaller than under the Standard Approach, which is not unusual.

13 Critical Issues when Using the Pareto Distribution (I): ξ = 0.6
When fitting a Pareto distribution to actual loss data, it is usual to get high values of ξ (even greater than 1). The parameter estimates in the Pareto fit are unstable, and the (absolute) fluctuations of economic capital are very large. To illustrate this, we generate 30 events greater than €10,000 quarterly, Pareto-distributed (ξ = 0.6), and fit the accumulated data to a Pareto distribution at the end of each period. It does not seem to be an acceptable solution.
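A sketch of this experiment: quarterly batches of 30 excesses over the €10,000 threshold are drawn from a generalized Pareto with ξ = 0.6, and ξ is refit by maximum likelihood on the accumulated data after each quarter. The GPD scale used for simulation is an assumed value; the successive estimates fluctuate, most visibly in the early periods:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
THRESHOLD, XI_TRUE, SCALE = 10_000.0, 0.6, 10_000.0  # SCALE is assumed

xi_hats, excesses = [], []
for quarter in range(20):                      # five years of quarters
    batch = stats.genpareto.rvs(XI_TRUE, scale=SCALE, size=30,
                                random_state=rng)
    excesses.extend(batch)                     # excesses over the threshold
    # refit the shape parameter on all data accumulated so far
    xi_hat, _, _ = stats.genpareto.fit(np.array(excesses), floc=0.0)
    xi_hats.append(xi_hat)

print([round(x, 3) for x in xi_hats])
```

Since capital grows roughly like a power of the threshold with exponent 1/ξ, even modest swings in the fitted ξ translate into very large swings in capital.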

14 Critical Issues when Using the Pareto Distribution (II): ξ = 1.1
For ξ > 1, the situation is even more dramatic. First, the expected value of the losses is infinite, so one should expect consistency problems in the calculation of economic capital. With real data it is very easy to get extremely unrealistic amounts of capital. In general, Pareto fits tend to overestimate the value of capital-at-risk.

15 Threshold Effect
To illustrate the effect of varying the left threshold on the error in the model, we generate 10,000 lognormal random numbers for different values of σ (μ = 10), assuming λ = …. Two threshold levels are chosen (3,000 and 10,000). We present the results of the best fit (worst p-value of the KS and AD tests).

σ      u = 3,000     u = 10,000
1.00   LN-gamma      Gamma mixture
1.25   g-and-h       g-and-h
1.50   g-and-h       Burr
1.75   LN-gamma      LN-gamma
2.00   LN            Lognormal
2.25   LN            LN mixture
2.50   LN mixture    g-and-h
2.75   Burr          Burr
3.00   Burr          Burr

What could happen with less well-defined data?

16 Impact on Capital
The impact on capital depends on the frequency of events. The frequency distribution must be corrected to take into account the probability mass of the losses under the left censoring threshold. For high frequencies, the impact on capital may be very large. The table reports the CaR and its relative error for each σ at u = 3,000 and u = 10,000.
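The frequency correction itself is one line: under an assumed lognormal severity, the Poisson rate observed above the collection threshold u is scaled up by the probability mass above u. The numbers below are hypothetical:

```python
import numpy as np
from scipy import stats

def corrected_lambda(lam_observed, mu, sigma, u):
    """True Poisson rate = observed rate above u divided by P(X > u),
    restoring the probability mass of losses below the threshold."""
    p_above = stats.lognorm.sf(u, sigma, scale=np.exp(mu))
    return lam_observed / p_above

# hypothetical numbers: 100 losses/year observed above u = 10,000
lam_full = corrected_lambda(100, mu=10.0, sigma=2.0, u=10_000)
print(lam_full)
```

Here about 65% of the mass lies above u, so the corrected rate is roughly 153 events per year; for higher thresholds or lighter-bodied severities the correction, and hence the impact on capital, grows quickly.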

17 The Body Effect I
Should single loss events determine the economic capital? In a subexponential framework, high percentiles of the loss distribution are explained by a single extreme loss or a small number of large losses. The value 99.9% is a very high percentile; is it high enough to make the body (the part under the threshold) of the distribution irrelevant for the CaR calculation?
To answer this question, we perform the following simulation. Random values of the loss severity are generated from a lognormal distribution (μ = 5, σ = 2). Different thresholds u, determined by the probability of the tail p, are chosen. Three cases are considered:
case 0: the empirical data;
case 1: the losses under the threshold u are all set equal to 0;
case 2: the losses under the threshold u are all set equal to u.
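The three cases can be sketched directly. Using a common random seed couples the simulated years, so the ordering VaR_1 ≤ VaR_0 ≤ VaR_2 holds by construction; λ and p below are illustrative values, not the ones used in the slide:

```python
import numpy as np

def car(severities, lam, alpha=0.999, n_years=20_000, seed=0):
    """Capital-at-Risk: alpha-quantile of simulated annual aggregate
    losses, resampling severities from the given empirical sample."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, n_years)
    draws = severities[rng.integers(0, len(severities), counts.sum())]
    annual = [a.sum() for a in np.split(draws, np.cumsum(counts)[:-1])]
    return np.quantile(annual, alpha)

rng = np.random.default_rng(1)
x0 = rng.lognormal(5.0, 2.0, 100_000)     # case 0: empirical severities
u = np.quantile(x0, 1 - 0.1)              # threshold with tail prob p = 0.1
x1 = np.where(x0 < u, 0.0, x0)            # case 1: body losses set to 0
x2 = np.where(x0 < u, u, x0)              # case 2: body losses set to u
var0, var1, var2 = (car(x, lam=50) for x in (x0, x1, x2))
```

The gap between var1 and var2 measures how much the body of the distribution can contribute to the CaR at the chosen frequency.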

18 The Body Effect II
The results for the CaR (in thousands) are displayed in the table; in the case of the conditional CaR the tendencies are similar. For each frequency λ, three thresholds u (determined by the tail probability p) are considered. The relative spread (VaR_2 − VaR_1)/VaR_0 grows with the frequency and as the threshold rises: 1.14%, 10.09% and 28.05% for the lowest frequency; 3.10%, 25.98% and 71.36% for the intermediate one; 5.32%, 44.83% and over 100% for the highest.

19 The Body Effect III
Conclusions: for low frequencies, the contribution of the body of the distribution is not decisive; nevertheless, the greater the frequency, the larger the contribution of the body. The asymptotic approximation works well for small frequencies. Note, however, that in these experiments the probability mass of the losses under the threshold is the same in all cases; if the probability mass of the body were extrapolated from the tail fit, these figures would show a much larger variation.
From a practical point of view, this suggests collecting data with a low truncation threshold, or at least having a realistic estimate of the probability mass under the threshold. This may have important returns in the management of the expected loss and in the negotiation of insurance premiums, that is, in cost reduction.

20 Conclusions
We have indicated some of the possible causes of model risk in operational risk and proposed techniques which make it possible to avoid this type of risk. They are probably not the only possibilities, but they are solutions that make AMA more robust and accurate.
Thank you

21 Part I: Appendix. Stability for the Fitting of Truncated Distributions

22 The Lognormal Distribution, Truncated Case (figures for two truncation levels)

23 The Lognormal Distribution, Non Truncated Case

24 The Lognormal + g-and-h Distribution, Truncated Case (figures for two truncation levels)

25 The Lognormal + g-and-h Distribution, Non Truncated Case

26 The Lognormal + GP Distribution, Truncated Case (figures for two truncation levels)

27 The Lognormal + GP Distribution, Non Truncated Case

28 Model Risk Within Operational Risk: Issues and Challenges
International Model Risk Management Conference, June 2-4, 2015, Toronto, Canada
Ali Samad-Khan. Reinventing Risk Management

29 OpRisk Modeling Challenges: Key Issues
Can operational risk be measured accurately at the 99.9% confidence level, with a one-year time horizon?
Should internal loss data be the primary data source for modeling operational risk?
Is it feasible to use loss scenarios and their corresponding probabilities to estimate operational risk under the AMA Scenario Based Approach?
Is it practical, or even feasible, to weight the four AMA data elements (internal loss data, external loss data, scenario analysis and BEICFs) in arriving at an operational risk capital estimate?

30 Background: understanding the difference between aggregate and single loss exceedence curves.
(Slide graphic: individual loss events are organized into a risk matrix of loss data, with one cell per business line (Corporate Finance; Trading and Sales; Retail Banking; Commercial Banking; Payment and Settlements; Agency Services; Asset Management; Retail Brokerage; Insurance) and event type (Internal Fraud; External Fraud; Employment Practices and Workplace Safety; Clients, Products and Business Practices; Damage to Physical Assets; Execution, Delivery and Process Management; Business Disruption and System Failures). Each cell reports the number of events, their mean and their standard deviation. From these, intermediate and final loss distributions are built: an annual event frequency distribution and a single event severity distribution (LEC).)

31 One can use a pair of frequency and severity distributions to estimate an annual aggregate loss distribution, via Monte Carlo simulation.

32 One can also use a pair of frequency and severity distributions to estimate an Annualized (Single) Loss Exceedence Curve, or ALEC. A base ALEC can be estimated through the formula ALEC(x) = λ × LEC(x), where LEC(x) is the single event exceedence probability at loss threshold x.
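Under an assumed Poisson frequency and lognormal severity, the formula reads ALEC(x) = λ · (1 − F(x)), the expected number of exceedences of x per year; parameter values below are illustrative:

```python
import numpy as np
from scipy import stats

def alec(x, lam, mu, sigma):
    """Annualized loss exceedence curve: expected number of events per
    year exceeding x = Poisson rate times severity exceedence probability."""
    return lam * stats.lognorm.sf(x, sigma, scale=np.exp(mu))

# illustrative parameters; exceedence frequency at three thresholds
freq = alec(np.array([1e5, 1e6, 1e7]), lam=200, mu=10.0, sigma=2.5)
print(freq)
```

Reading the curve at the level 0.001 (one exceedence per thousand years) gives the usual single-loss approximation of the 1-in-1,000-year loss level.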

33 What is model risk in an operational risk context?
Model risk is the risk that the perceived level of risk may be different from the actual level of risk*. In order to estimate the degree of model risk, one must know the true (actual) level of risk.
(Figure: an annualized loss exceedence curve, probability of an exceedence against loss amount in $ billions on a log scale, showing the perceived risk at 99.9%, the actual risk at 99.9%, and the gap between them as the degree of model risk.)
*Actual risk is a nebulous concept. For the purposes of this discussion we define actual risk as the most precise estimate of risk that can be made based on the best theoretical framework/model and data available at that time.

34 What are the sources of model risk within operational risk models?
(Figure: impact of model error on model risk, plotted by magnitude and type of model error. Conceptual flaws: fundamental misconception of the business problem, inappropriate theoretical framework, inappropriate severity distributions used. Precision errors: insufficient simulation iterations, imprecise distribution fitting, data quality issues.)

35 Accuracy and Precision: Conceptual Flaws vs. Precision Errors
(Figure: conceptual flaws affect accuracy; precision errors affect precision.)

36 Banks are required to estimate operational risk capital at the 99.9% confidence level, with a one-year time horizon.
"...a bank using the AMA must demonstrate to the satisfaction of its primary Federal supervisor that its systems for managing and measuring operational risk meet established standards, including producing an estimate of operational risk exposure that meets a one-year, 99.9th percentile soundness standard." Federal Register / Vol. 72, No. 235 / Friday, December 7, 2007 / Rules and Regulations

37 OpRisk Modeling Challenges: Key Questions
Why the 99.9% confidence level? Because the safety and soundness of the banking sector is critically important to the smooth operation of the global economy, banks must be held to a higher safety standard. Under this standard, assuming there are 1,000 banks in the industry and each bank is like every other bank, then, on average, only 1 in 1,000 banks will exceed its capital reserve in any given year.
What does it mean to model risk at the 99.9% confidence level, with a one-year time horizon: a 0.1% probability of exceedence in the typical year, or a 0.1% probability of exceedence in any given year (a 1 in 1,000 year event)?
How should one determine data sufficiency where the goal is to model risk at the 99.9% confidence level, with a one-year time horizon?
What is the most important source of loss data for modeling risk at the 99.9% confidence level, with a one-year time horizon: internal data or external data?

38 Suppose you want to build a house at the beach. How far up the beach should you build this house if you are willing to tolerate minimal risk of loss at the 99.9% level, with a one-year time horizon? Question: what is the height of the 1 in 1,000 year wave?
(Figure: ten years of empirical data (1,000,000 incidents), with VaR 99.9%, TVaR 99.9% and Stress VaR 99.9% marked.)

39 Wind-driven waves are irrelevant to this analysis; the only relevant data are tsunami waves. But suppose a tsunami wave takes place, on average, once every ten years. Then in an average ten-year sample you have only one relevant data point. It is difficult to estimate the height of the 1 in 1,000 year wave with only ten years of data, because these data will contain only wind-driven waves and perhaps one or two tsunamis, which will be viewed as black swan events or outliers.
Where the data are heterogeneous, what is a better measure of data sufficiency: the number of data points, or the number of years in the data collection period?
(Figure: wind-driven waves vs. tsunami waves; the 99.9% level VaR of the typical year vs. the actual 99.9% level VaR of any given year.)

40 The trade-off between accuracy and precision.
(Figure: the 99.9% VaR with no tsunamis illustrates precision; the 99.9% VaR including tsunamis illustrates accuracy.)

41 If you want to base your analysis on ten years of data, you must believe that fraud data are i.i.d., in other words that junior staff frauds have the same properties as all other types of frauds, including senior manager frauds.
Where the data are heterogeneous, modeling risk at the 99.9% confidence level, with a one-year time horizon, essentially means modeling the large, infrequent events. If you had 1,000 years of data, you would not need a model to estimate risk at the 99.9% confidence level with a one-year time horizon: the largest loss would be a reasonable approximation for the 1 in 1,000 year event.
(Figure: junior staff frauds vs. senior manager frauds; the 99.9% confidence level describes a rough location on the x-axis, with a confidence interval around it.)

42 If you want to base your analysis on ten years of data, you must believe that fraud data are i.i.d., in other words that junior staff frauds have the same properties as all other types of frauds, including senior manager frauds.
The reason one needs external data is to determine more precisely how often a $10M, $50M and $100M loss exceedence takes place. Using data collected over 100 years, one can observe how often large loss exceedences actually take place:
$10,000,000: 20 events, i.e. 1 in 5 years
$50,000,000: 10 events, i.e. 1 in 10 years
$100,000,000: 5 events, i.e. 1 in 20 years
One can then fit these data to a curve and extrapolate the 1 in 1,000 year event.

43 If you want to base your analysis on ten years of data, you must believe that fraud data are i.i.d., in other words that junior staff frauds have the same properties as all other types of frauds, including senior manager frauds.
To model operational risk at the 99.9% level, one must follow a two-step process. Since one needs about 100 years of loss data, one must pool one's internal loss data with peer (external) loss data to create an industry loss data set (10 firms x 10 years = 100 firm-years). Using such data one can estimate the risk profile of the average peer firm. After that, one can use objective metrics, such as the ratio of the mean internal loss to the mean peer loss (per matrix cell), to estimate a firm-specific VaR.
Relevant external loss data does not mean relevant individual data points. Relevant data are all data (for each matrix cell) from all firms that have a similar risk profile (peer firms), without any modification whatsoever.

44 Modeling operational risk at the 99.9% confidence level, with a one-year time horizon, with internal data only produces results with a very large confidence interval.
(Figure: annualized loss exceedence curves, probability of an exceedence against loss amount in $ billions on a log scale; internal data with no outliers and internal data with some outliers bracket the actual risk, with a wide confidence interval.)

45 Modeling operational risk at the 99.9% confidence level, with a one-year time horizon, with industry data (internal + external data) produces results with a smaller confidence interval.
(Figure: annualized loss exceedence curves; industry data with few outliers and industry data with many outliers bracket the actual risk, with a narrower confidence interval.)

46 How can one model operational risk at the 99.9% confidence level, with a one-year time horizon, where the data are truncated, heterogeneous, subject to clustering and contain outliers?

47 The Traditional Property & Casualty Loss Modeling Approach
For severity fitting, the best-practices approach is to fit one's loss data to a range of theoretical distributions using maximum likelihood estimation (MLE). Starting from the raw loss event data (100 years), one independently fits the raw loss data to a frequency distribution (number of events) and a severity distribution (single event severity), and then calculates VaR by Monte Carlo simulation.

48 Where the data are truncated, a modified form of MLE can be used to estimate parameters for distributions specified from zero.
Maximum likelihood estimation (MLE) is a process used to fit empirical data to a theoretical distribution. Given a pre-specified theoretical distribution, MLE is used to find the set of parameters that have the maximum likelihood (probability) of describing the empirical data set. Generally, the likelihood function is the density function, but where loss data are censored or truncated (not collected above or below a certain threshold), an adjustment is required. In order to accommodate truncated data, the likelihood function must be modified to describe the conditional likelihood. For left-truncated data, we do this by taking the original density function, in this case the lognormal, and dividing it by the probability of the data being drawn from above the threshold:
f(x; μ, σ) / (1 − F(u; μ, σ)), for x ≥ u.
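A sketch of the conditional-likelihood fit for a left-truncated lognormal; the threshold and parameter values are illustrative:

```python
import numpy as np
from scipy import stats, optimize

def fit_truncated_lognormal(x, u):
    """Maximize the conditional likelihood f(x)/(1 - F(u)) for x >= u."""
    def nll(params):
        mu, sigma = params
        if sigma <= 0:
            return np.inf
        ln = stats.lognorm(sigma, scale=np.exp(mu))
        # conditional log-likelihood: log f(x) - log(1 - F(u)) per point
        return -(ln.logpdf(x).sum() - len(x) * ln.logsf(u))
    start = [np.log(x).mean(), np.log(x).std()]
    res = optimize.minimize(nll, start, method="Nelder-Mead",
                            options={"maxiter": 2000})
    return res.x

rng = np.random.default_rng(0)
full = rng.lognormal(10.0, 2.0, 50_000)      # complete loss experience
u = 20_000.0
observed = full[full > u]                    # only losses above u are kept
mu_hat, sigma_hat = fit_truncated_lognormal(observed, u)
print(mu_hat, sigma_hat)
```

Fitting the plain (unconditional) lognormal to the same truncated sample would overstate μ and understate σ, which is exactly the bias the conditional likelihood removes.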

49 However, where the data are heterogeneous, subject to clustering and contain outliers, severity fitting at high loss thresholds using the modified form of MLE can be problematic.

50 The ALEC Method is a newly invented computational technology that allows one to fit frequency and severity distributions simultaneously. Starting from the raw loss event data (100 years), one expresses the loss event data as annualized loss exceedence curve values at multiple thresholds, simultaneously derives the combined best-fit frequency and severity distribution parameters, and then calculates VaR by Monte Carlo simulation. Patent pending.

51 Where the data are truncated, heterogeneous, subject to clustering and contain outliers, the ALEC method, when implemented correctly, appears to be a more reliable means of fitting loss data to theoretical distributions.
Frequency and severity distributions are interdependent. In other words, if the severity fit is not precise, then λ_0 (the implied frequency at the zero threshold) will also not be precise. Therefore, fitting frequency and severity simultaneously (both together) produces more reliable results than fitting frequency and severity independently or sequentially (one at a time). This is particularly relevant where the data are truncated, heterogeneous, subject to clustering and contain outliers. Patent pending.

52 The Scenario Based Approach to Modeling Operational Risk
"Scenario analysis under the advanced approaches rule is a systematic process of obtaining expert opinions from business managers and risk-management experts to derive reasoned assessments of the likelihood and loss impact of plausible, high-severity operational losses. Scenario analysis may include the well-reasoned evaluation and use of external operational loss event data adjusted, as appropriate, to ensure relevance to a bank's operational-risk profile and control structure. Scenario analysis provides a forward-looking view of operational risk that complements historical internal and external data. The scenario analysis process and its output are key risk-management tools that are especially relevant for assessing potential risks to which the bank may be exposed. Scenarios are typically developed through workshops that produce multiple scenarios at both the line of business and enterprise levels. Scenario development exercises allow subject matter experts to identify potential operational events and their impacts. Such exercises allow those experts to better prepare to identify and manage the risk exposures through business decisions, risk mitigation efforts, and capital planning. Inclusion of scenario data with other data elements in internal risk-management reporting can support development of a comprehensive operational risk profile of the bank." Interagency Guidance on the Advanced Measurement Approaches for Operational Risk, June 3, …

53 The Scenario Based Approach is based on the assumption that a risk curve, such as a LEC, consists of points that comprise individual loss scenarios and corresponding probabilities, which is false. Scenarios describe individual loss incidents (e.g., a senior manager embezzles $10,000,000). However, risk curve fitting involves estimating the probability of exceedence at specified loss thresholds (e.g., $10,000,000) for a type of event (e.g., fraud, which includes all fraud incidents irrespective of scenario). Each scenario (a specific point on a curve) has a zero probability of occurrence: one out of infinity. Many scenario workshop procedures call for business experts to identify plausible large loss scenarios and then guess their corresponding probabilities. Not only are these the wrong questions (they rest on several false premises); the answers are unknowable.

54 To estimate a risk curve one must estimate the probability of loss exceedence at multiple loss thresholds for a type of event (risk class). These loss thresholds have nothing to do with scenarios. Curve Fitting (Fraud Risk). Relevant question: what is the probability of an event exceeding the $1,000, $10,000 and $25,000 loss thresholds from any type of fraud loss? For example, 20%, 10% and 5%, respectively. (But how does one estimate these probabilities?) [Chart: probability of exceedence versus loss amount ($ thousands), with individual scenarios marked as points on the curve]

55 Based on industry data one can estimate the number of event exceedences at multiple loss thresholds, then calculate the corresponding 1-in-N year exceedences and, from those, probabilities.

Loss Threshold (T) | Expected Event Exceedences per 100 Years | 1-in-N Year Frequency (N) | Probability of One or More Losses Exceeding T in Any Given Year
$1,000 | 20 | 5 | 18.1%
$10,000 | 10 | 10 | 9.5%
$25,000 | 5 | 20 | 4.9%

56 To convert frequency into probability, one must assume a frequency distribution and then calculate the corresponding probability.

Years (N) | Mean Frequency if on Average 1 Event Occurs Every N Years | Likelihood of Exactly 1 Event in a Single Year | Likelihood of 1 or More Events in a Single Year
N | 1/N | Prob.(1 event) | 1 − Prob.(0 events)
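Under a Poisson assumption (the slide only says "assume a frequency distribution"; Poisson is the common choice and the one used later in the deck), the conversion can be computed directly:

```python
import math

# One row per 1-in-N year frequency: the annual mean is lam = 1/N,
# and the two probability columns follow from the Poisson distribution.
for n_years in (5, 10, 20, 50, 100):
    lam = 1 / n_years
    p_exactly_one = lam * math.exp(-lam)   # Poisson P(k = 1)
    p_one_or_more = 1 - math.exp(-lam)     # 1 - Poisson P(k = 0)
    print(f"N={n_years:>3}  1/N={lam:.3f}  "
          f"P(exactly 1)={p_exactly_one:.4f}  P(>=1)={p_one_or_more:.4f}")
```

For a 1-in-5 year event (λ = 0.2), P(exactly 1) ≈ 0.1637 and P(one or more) ≈ 0.1813, which is why the "probability" of a 1-in-N year event is somewhat less than 1/N.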

57 But what type of curve should we be trying to fit? LEC, which is a severity distribution; it describes, if a loss were to take place, the probability that that loss will exceed $X. A severity distribution has no time element. ALEC, which is a combined frequency-severity distribution; it describes the probability of a loss (one or more events) exceeding $X in any given year. No one has any intuition for LEC probabilities. However, with industry data one can estimate annual event exceedences, and then calculate 1-in-N year exceedences and subsequently probabilities.

58 It is theoretically possible to estimate an entire ALEC curve using as few as three loss exceedence inputs (points on the curve). (However, additional inputs will likely produce more precise results.)

Annualized Loss Exceedence Curve: probability of one or more exceedences versus loss amount ($ thousands), anchored by three inputs:
20 events > $1,000 every 100 years: a 1-in-5 year event
10 events > $10,000 every 100 years: a 1-in-10 year event
5 events > $25,000 every 100 years: a 1-in-20 year event

Patent Pending
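As a sketch of the idea, assuming Poisson frequency and lognormal severity (the pairing the deck's later slides use), one can solve for the three parameters that reproduce three exceedence inputs. The solver choice and starting values here are illustrative, not the patented method:

```python
import numpy as np
from scipy import optimize, stats

# Three exceedence inputs: thresholds and their annual exceedence rates
# (1-in-5, 1-in-10 and 1-in-20 year events).
thresholds = np.array([1_000.0, 10_000.0, 25_000.0])
annual_rates = np.array([1 / 5, 1 / 10, 1 / 20])

def residuals(params):
    lam, mu, sigma = params
    # Annual rate of losses exceeding T: lam * P(severity > T)
    model = lam * stats.lognorm.sf(thresholds, sigma, scale=np.exp(mu))
    return model - annual_rates

# Three equations, three unknowns: lam, mu, sigma.
sol = optimize.least_squares(residuals, x0=[0.5, 9.0, 1.5],
                             bounds=([1e-6, 0.0, 1e-6],
                                     [np.inf, np.inf, np.inf]))
lam_hat, mu_hat, sigma_hat = sol.x
```

With three inputs the system is exactly determined, which is why three points can pin down an entire ALEC; additional points would turn this into an over-determined fit and reduce sensitivity to any single input.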

59 After deriving the combined best-fit frequency and severity parameters, one can extrapolate and/or simulate the relevant risk metrics at the desired confidence level.
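The simulation step can be sketched as follows, with placeholder Poisson/lognormal parameters rather than the deck's fitted values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder fitted parameters (illustrative, not the deck's values).
lam, mu, sigma = 0.21, 9.1, 1.4
n_years = 100_000

# For each simulated year: draw an event count, then sum that many severities.
counts = rng.poisson(lam, size=n_years)
totals = np.fromiter((rng.lognormal(mu, sigma, size=k).sum() for k in counts),
                     dtype=float, count=n_years)

# The 99.9th percentile of annual aggregate loss: the 1-in-1,000 year loss.
var_999 = float(np.quantile(totals, 0.999))
```

This is the standard loss-distribution Monte Carlo; extrapolation from the fitted closed-form distributions is the analytic alternative mentioned on the slide.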

60 In order to derive the actual risk parameters by fitting the LEC directly, one would have to correctly estimate the (highly unintuitive) severity probabilities.

ALEC: Poisson λ = ; Lognormal (μ, σ) = ( , ). LEC: Lognormal (μ, σ) = ( , ).

Loss Threshold (T) | The Probability of a Loss Exceeding T in Any Given Year | If a Loss Were to Take Place, the Probability of That Loss Exceeding T
$1,000 | % | %
$10,000 | % | %
$25,000 | % | %
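The relationship between the two columns can be made concrete. With illustrative parameters (the slide's fitted values are not reproduced here), the annual exceedence probability is 1 − exp(−λ·S(T)), where S(T) is the severity exceedence probability:

```python
import math
from scipy import stats

# Illustrative Poisson/lognormal parameters, chosen only to show how
# the ALEC column relates to the (unintuitive) LEC column.
lam, mu, sigma = 0.21, 9.1, 1.4

for T in (1_000, 10_000, 25_000):
    lec = stats.lognorm.sf(T, sigma, scale=math.exp(mu))  # P(X > T | a loss occurs)
    alec = 1 - math.exp(-lam * lec)                       # P(>=1 loss > T in a year)
    print(f"T=${T:>6,}: ALEC={alec:.2%}  LEC={lec:.2%}")
```

Because λ < 1 here, each ALEC probability is well below the corresponding LEC probability, which is why eliciting LEC probabilities directly from experts is so error-prone.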

61 It is feasible to model operational risk at the 99.9% level, with a one-year time horizon, using industry data. If done correctly, the results should be accurate, even though they may not be precise. However, it is not practical to manage risk exclusively at the 1/1000 year event level. A bank using the AMA must demonstrate to the satisfaction of its primary Federal supervisor that its systems for managing and measuring operational risk meet established standards, including producing an estimate of operational risk exposure that meets a one-year, 99.9th percentile soundness standard. Federal Register / Vol. 72, No. 235 / Friday, December 7, 2007 / Rules and Regulations. The bank's internal operational risk measurement system must be closely integrated into the day-to-day risk management processes of the bank. Its output must be an integral part of the process of monitoring and controlling the bank's operational risk profile. For instance, this information must play a prominent role in risk reporting, management reporting, internal capital allocation, and risk analysis. The bank must have techniques for allocating operational risk capital to major business lines and for creating incentives to improve the management of operational risk throughout the firm. Basel Committee on Banking Supervision, International Convergence of Capital Measurement and Capital Standards, June

62 Internal data is a key element for modeling operational risk at the 99.9% confidence level, with a one-year time horizon, but not the most important element. While the advanced approaches rule provides flexibility in a bank's use and weighting of the four data elements, internal data is a key element in operational risk management and quantification. INTERAGENCY GUIDANCE ON THE ADVANCED MEASUREMENT APPROACHES FOR OPERATIONAL RISK, June 3, 2011. One common misconception in ORM is that internal modeling means modeling primarily with internal loss data. External data are almost always necessary for modeling... Society of Actuaries, A New Approach for Managing Operational Risk: Addressing the Issues Underlying the 2008 Global Financial Crisis. Updated July. A ten-year internal data set cannot be used to reliably extrapolate the 1/1000 year event where the data are heterogeneous, truncated, subject to clustering and contain outliers.

63 The Scenario Based Approach has nothing to do with loss scenarios and their corresponding probabilities. Scenario analysis under the advanced approaches rule is a systematic process of obtaining expert opinions from business managers and risk-management experts to derive reasoned assessments of the likelihood and loss impact of plausible, high-severity operational losses. Scenario analysis may include the well-reasoned evaluation and use of external operational loss event data adjusted, as appropriate, to ensure relevance to a bank's operational-risk profile and control structure. Scenario analysis provides a forward-looking view of operational risk that complements historical internal and external data. The scenario analysis process and its output are key risk-management tools that are especially relevant for assessing potential risks to which the bank may be exposed. INTERAGENCY GUIDANCE ON THE ADVANCED MEASUREMENT APPROACHES FOR OPERATIONAL RISK, June 3, 2011. To estimate risk curves the starting point should be good-quality industry data. Modeling experts must first transform industry data (internal + external data) into event exceedences at specified loss thresholds (irrespective of scenarios), and then calculate the corresponding 1-in-N year exceedences and subsequently the corresponding probabilities. After that, business managers can be brought in to determine whether the risk curve should be objectively and systematically modified based on observed changes in Business and Internal Control Environment Factors (BEICFs).

64 It is not practical or even feasible to literally weight the four required AMA data elements. The advanced approaches rule requires that a bank's operational risk data and assessment systems capture operational risks to which the firm is exposed. These systems must include credible, transparent, systematic, and verifiable processes on an ongoing basis that incorporate the four required AMA data elements of (1) internal operational loss event data, (2) external operational loss event data, (3) scenario analysis, and (4) BEICFs. The advanced approaches rule also requires that a bank's operational risk quantification systems generate estimates of the bank's operational risk exposure using the bank's operational risk data and assessment systems, and include a credible, transparent, systematic, and verifiable approach for weighting each of the four elements. INTERAGENCY GUIDANCE ON THE ADVANCED MEASUREMENT APPROACHES FOR OPERATIONAL RISK, June 3, 2011. Weighting model elements is almost invariably an arbitrary process. An operational risk model should follow a two-step process. The initial results should be based on industry loss data (which contain more precise information on tail frequency). Here, the inputs can be based on raw external data or expert opinion ("scenarios") informed by such data. Objective metrics (e.g., the ratio of mean internal to external loss) should then be used to bring these results, which would represent the historical risk profile of the average peer institution, into line with the current risk profile of the bank as part of the BEICF process.

65 Operational risk modeling practices have been converging, but not necessarily in the right direction. As new methods and tools are developed, the agencies anticipate that the operational risk discipline will continue to mature and converge toward a narrower range of effective risk management and measurement practices. INTERAGENCY GUIDANCE ON THE ADVANCED MEASUREMENT APPROACHES FOR OPERATIONAL RISK, June 3, 2011. The source of innovation has not been the major banks. Many large banks have developed operational risk models that are conceptually flawed and add no tangible value. Nevertheless, because the large banks are generally deemed to be the experts in operational risk modeling, many have mistakenly assumed that these highly questionable models represent the standard for industry best practices. The time has come for the regulators to comprehensively examine what works and what doesn't work, and subsequently to recraft several vague and potentially misleading regulations so as to encourage the use of sound operational risk measurement and management practices and, more importantly, to discourage the use of unsound ones.

66 What is the moral of this story? Before you can find the solution, you must first understand the problem!

67 Biographical Information: Ali Samad-Khan. Ali Samad-Khan is Founder and President of Stamford Risk Analytics. He has over 15 years of experience in risk management and more than 25 years of experience in financial services and consulting. Ali has advised more than 100 of the world's leading banks; insurance, energy and transportation companies; and national and international regulators on a full range of risk management issues. Key elements of his Modern ERM/ORM framework and methodology have been adopted by leading institutions around the world. Recognized globally as a thought leader in the risk management space, Ali's provocative articles and white papers have served as a catalyst for change in the way organizations manage risk. For his pioneering work in this field, Ali was named one of the 100 most influential people in finance by Treasury & Risk Management magazine. Ali is also a charter member of Who's Who in Risk Management. Prior to founding Stamford Risk Analytics, Ali served as a Principal in the ERM Practice at Towers Perrin (now Towers Watson), where he was also Global Head of Operational Risk Management Consulting. Previously, Ali was Founder and President of OpRisk Analytics LLC, a software provider, which was acquired by SAS. Before that Ali worked at PricewaterhouseCoopers in New York, where he headed the Operational Risk Group within the Financial Risk Management Practice. Prior to that, he led the Strategic Risk Initiatives Group in the Operational Risk Management Department at Bankers Trust. He has also worked as a policy analyst at the Federal Reserve Bank of New York and at the World Bank. Ali holds a B.A. in Quantitative Economics from Stanford University and an M.B.A. in Finance from Yale University.


More information

12.5: CHI-SQUARE GOODNESS OF FIT TESTS

12.5: CHI-SQUARE GOODNESS OF FIT TESTS 125: Chi-Square Goodness of Fit Tests CD12-1 125: CHI-SQUARE GOODNESS OF FIT TESTS In this section, the χ 2 distribution is used for testing the goodness of fit of a set of data to a specific probability

More information

MODERN OPERATIONAL RISK MANAGEMENT

MODERN OPERATIONAL RISK MANAGEMENT MODERN OERATIONAL RISK MANAGEMENT A comprehensive two-day masterclass led by Ali Samad-Khan and Tigran Kalberer 14-15 October 2008 Widder Hotel, Zürich 2 I Towers errin Masterclass Objectives Immense competitive,

More information

PROJECT RISK MANAGEMENT - ADVANTAGES AND PITFALLS

PROJECT RISK MANAGEMENT - ADVANTAGES AND PITFALLS 1 PROJECT RISK MANAGEMENT - ADVANTAGES AND PITFALLS Kenneth K. Humphreys 1, PE CCE DIF 1 Past Secretary-Treasurer, ICEC, Granite Falls, NC, United States Abstract Proper project decision-making requires

More information

Tutorial 5: Hypothesis Testing

Tutorial 5: Hypothesis Testing Tutorial 5: Hypothesis Testing Rob Nicholls nicholls@mrc-lmb.cam.ac.uk MRC LMB Statistics Course 2014 Contents 1 Introduction................................ 1 2 Testing distributional assumptions....................

More information

Statistics for Retail Finance. Chapter 8: Regulation and Capital Requirements

Statistics for Retail Finance. Chapter 8: Regulation and Capital Requirements Statistics for Retail Finance 1 Overview > We now consider regulatory requirements for managing risk on a portfolio of consumer loans. Regulators have two key duties: 1. Protect consumers in the financial

More information

ModelRisk for Insurance and Finance. Quick Start Guide

ModelRisk for Insurance and Finance. Quick Start Guide ModelRisk for Insurance and Finance Quick Start Guide Copyright 2008 by Vose Software BVBA, All rights reserved. Vose Software Iepenstraat 98 9000 Gent, Belgium www.vosesoftware.com ModelRisk is owned

More information

Fairfield Public Schools

Fairfield Public Schools Mathematics Fairfield Public Schools AP Statistics AP Statistics BOE Approved 04/08/2014 1 AP STATISTICS Critical Areas of Focus AP Statistics is a rigorous course that offers advanced students an opportunity

More information

Content Sheet 7-1: Overview of Quality Control for Quantitative Tests

Content Sheet 7-1: Overview of Quality Control for Quantitative Tests Content Sheet 7-1: Overview of Quality Control for Quantitative Tests Role in quality management system Quality Control (QC) is a component of process control, and is a major element of the quality management

More information

How to Model Operational Risk, if You Must

How to Model Operational Risk, if You Must How to Model Operational Risk, if You Must Paul Embrechts ETH Zürich (www.math.ethz.ch/ embrechts) Based on joint work with V. Chavez-Demoulin, H. Furrer, R. Kaufmann, J. Nešlehová and G. Samorodnitsky

More information

An operational risk management framework for managing agencies

An operational risk management framework for managing agencies An operational risk management framework for managing agencies John Thirlwell Director, Operational Risk Research Forum Lloyd s Risk Forum, 28 May 2004 Operational risk and its evolution Regulators and

More information

The Study of Chinese P&C Insurance Risk for the Purpose of. Solvency Capital Requirement

The Study of Chinese P&C Insurance Risk for the Purpose of. Solvency Capital Requirement The Study of Chinese P&C Insurance Risk for the Purpose of Solvency Capital Requirement Xie Zhigang, Wang Shangwen, Zhou Jinhan School of Finance, Shanghai University of Finance & Economics 777 Guoding

More information

Standard Deviation Estimator

Standard Deviation Estimator CSS.com Chapter 905 Standard Deviation Estimator Introduction Even though it is not of primary interest, an estimate of the standard deviation (SD) is needed when calculating the power or sample size of

More information

6.4 Normal Distribution

6.4 Normal Distribution Contents 6.4 Normal Distribution....................... 381 6.4.1 Characteristics of the Normal Distribution....... 381 6.4.2 The Standardized Normal Distribution......... 385 6.4.3 Meaning of Areas under

More information

GLM, insurance pricing & big data: paying attention to convergence issues.

GLM, insurance pricing & big data: paying attention to convergence issues. GLM, insurance pricing & big data: paying attention to convergence issues. Michaël NOACK - michael.noack@addactis.com Senior consultant & Manager of ADDACTIS Pricing Copyright 2014 ADDACTIS Worldwide.

More information

Estimating Volatility

Estimating Volatility Estimating Volatility Daniel Abrams Managing Partner FAS123 Solutions, LLC Copyright 2005 FAS123 Solutions, LLC Definition of Volatility Historical Volatility as a Forecast of the Future Definition of

More information

ACTUARIAL CONSIDERATIONS IN THE DEVELOPMENT OF AGENT CONTINGENT COMPENSATION PROGRAMS

ACTUARIAL CONSIDERATIONS IN THE DEVELOPMENT OF AGENT CONTINGENT COMPENSATION PROGRAMS ACTUARIAL CONSIDERATIONS IN THE DEVELOPMENT OF AGENT CONTINGENT COMPENSATION PROGRAMS Contingent compensation plans are developed by insurers as a tool to provide incentives to agents to obtain certain

More information

Dongfeng Li. Autumn 2010

Dongfeng Li. Autumn 2010 Autumn 2010 Chapter Contents Some statistics background; ; Comparing means and proportions; variance. Students should master the basic concepts, descriptive statistics measures and graphs, basic hypothesis

More information

A New Approach for Managing Operational Risk

A New Approach for Managing Operational Risk A New Approach for Managing Operational Risk Addressing the Issues Underlying the 2008 Global Financial Crisis Sponsored by: Joint Risk Management Section Society of Actuaries Canadian Institute of Actuaries

More information

Monte Carlo analysis used for Contingency estimating.

Monte Carlo analysis used for Contingency estimating. Monte Carlo analysis used for Contingency estimating. Author s identification number: Date of authorship: July 24, 2007 Page: 1 of 15 TABLE OF CONTENTS: LIST OF TABLES:...3 LIST OF FIGURES:...3 ABSTRACT:...4

More information

Sample Size and Power in Clinical Trials

Sample Size and Power in Clinical Trials Sample Size and Power in Clinical Trials Version 1.0 May 011 1. Power of a Test. Factors affecting Power 3. Required Sample Size RELATED ISSUES 1. Effect Size. Test Statistics 3. Variation 4. Significance

More information

Basel Committee on Banking Supervision. Working Paper No. 17

Basel Committee on Banking Supervision. Working Paper No. 17 Basel Committee on Banking Supervision Working Paper No. 17 Vendor models for credit risk measurement and management Observations from a review of selected models February 2010 The Working Papers of the

More information

Auxiliary Variables in Mixture Modeling: 3-Step Approaches Using Mplus

Auxiliary Variables in Mixture Modeling: 3-Step Approaches Using Mplus Auxiliary Variables in Mixture Modeling: 3-Step Approaches Using Mplus Tihomir Asparouhov and Bengt Muthén Mplus Web Notes: No. 15 Version 8, August 5, 2014 1 Abstract This paper discusses alternatives

More information

Measuring Exchange Rate Fluctuations Risk Using the Value-at-Risk

Measuring Exchange Rate Fluctuations Risk Using the Value-at-Risk Journal of Applied Finance & Banking, vol.2, no.3, 2012, 65-79 ISSN: 1792-6580 (print version), 1792-6599 (online) International Scientific Press, 2012 Measuring Exchange Rate Fluctuations Risk Using the

More information

RISK MANAGEMENT CASE STUDY OIL AND GAS INDUSTRY

RISK MANAGEMENT CASE STUDY OIL AND GAS INDUSTRY RISK MANAGEMENT CASE STUDY OIL AND GAS INDUSTRY Risk Management Case Study Oil and Gas Industry Page 1 of 18 Table of Contents Executive Summary... 3 Major risks to the company... 3 Modelling Approach...

More information

ERM-2: Introduction to Economic Capital Modeling

ERM-2: Introduction to Economic Capital Modeling ERM-2: Introduction to Economic Capital Modeling 2011 Casualty Loss Reserve Seminar, Las Vegas, NV A presentation by François Morin September 15, 2011 2011 Towers Watson. All rights reserved. INTRODUCTION

More information

DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN. April 2009 SLAC I 050 07010 002

DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN. April 2009 SLAC I 050 07010 002 DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN April 2009 SLAC I 050 07010 002 Risk Management Plan Contents 1.0 INTRODUCTION... 1 1.1 Scope... 1 2.0 MANAGEMENT

More information

GN47: Stochastic Modelling of Economic Risks in Life Insurance

GN47: Stochastic Modelling of Economic Risks in Life Insurance GN47: Stochastic Modelling of Economic Risks in Life Insurance Classification Recommended Practice MEMBERS ARE REMINDED THAT THEY MUST ALWAYS COMPLY WITH THE PROFESSIONAL CONDUCT STANDARDS (PCS) AND THAT

More information

An Introduction to. Metrics. used during. Software Development

An Introduction to. Metrics. used during. Software Development An Introduction to Metrics used during Software Development Life Cycle www.softwaretestinggenius.com Page 1 of 10 Define the Metric Objectives You can t control what you can t measure. This is a quote

More information

Measuring operational risk in financial institutions: Contribution of credit risk modelling

Measuring operational risk in financial institutions: Contribution of credit risk modelling CAHIER DE RECHERCHE / WORKING PAPER Measuring operational risk in financial institutions: Contribution of credit risk modelling Georges Hübner, Jean-Philippe Peters and Séverine Plunus August 06 / N 200608/02

More information

Desafios na Implantação dos Modelos Avançados para. 3º. Seminário Internacional Febraban Modelos Avançados para Risco Operacional

Desafios na Implantação dos Modelos Avançados para. 3º. Seminário Internacional Febraban Modelos Avançados para Risco Operacional Desafios na Implantação dos Modelos Avançados para Risco Operacional no Itaú Unibanco 3º. Seminário Internacional Febraban Modelos Avançados para Risco Operacional Itaú Unibanco Holding S.A. The Merger

More information

PRINCIPLES FOR THE MANAGEMENT OF CONCENTRATION RISK

PRINCIPLES FOR THE MANAGEMENT OF CONCENTRATION RISK ANNEX 2G PRINCIPLES FOR THE MANAGEMENT OF CONCENTRATION RISK Concentration risk can be defined as any single (direct and/or indirect) exposure or group of exposures with the potential to produce losses

More information

Much attention has been focused recently on enterprise risk management (ERM),

Much attention has been focused recently on enterprise risk management (ERM), By S. Michael McLaughlin and Karen DeToro Much attention has been focused recently on enterprise risk management (ERM), not just in the insurance industry but in other industries as well. Across all industries,

More information

Modelling operational risk in the insurance industry

Modelling operational risk in the insurance industry April 2011 N 14 Modelling operational risk in the insurance industry Par Julie Gamonet Centre d études actuarielles Winner of the French «Young Actuaries Prize» 2010 Texts appearing in SCOR Papers are

More information

Operational Risk - Scenario Analysis

Operational Risk - Scenario Analysis Institute of Economic Studies, Faculty of Social Sciences Charles University in Prague Operational Risk - Scenario Analysis Milan Rippel Petr Teplý IES Working Paper: 15/2008 Institute of Economic Studies,

More information

Credibility and Pooling Applications to Group Life and Group Disability Insurance

Credibility and Pooling Applications to Group Life and Group Disability Insurance Credibility and Pooling Applications to Group Life and Group Disability Insurance Presented by Paul L. Correia Consulting Actuary paul.correia@milliman.com (207) 771-1204 May 20, 2014 What I plan to cover

More information

Risk-Based Capital. Overview

Risk-Based Capital. Overview Risk-Based Capital Definition: Risk-based capital (RBC) represents an amount of capital based on an assessment of risks that a company should hold to protect customers against adverse developments. Overview

More information

Monte Carlo Simulation

Monte Carlo Simulation 1 Monte Carlo Simulation Stefan Weber Leibniz Universität Hannover email: sweber@stochastik.uni-hannover.de web: www.stochastik.uni-hannover.de/ sweber Monte Carlo Simulation 2 Quantifying and Hedging

More information

Introduction to Quantitative Methods

Introduction to Quantitative Methods Introduction to Quantitative Methods October 15, 2009 Contents 1 Definition of Key Terms 2 2 Descriptive Statistics 3 2.1 Frequency Tables......................... 4 2.2 Measures of Central Tendencies.................

More information

Organizing Your Approach to a Data Analysis

Organizing Your Approach to a Data Analysis Biost/Stat 578 B: Data Analysis Emerson, September 29, 2003 Handout #1 Organizing Your Approach to a Data Analysis The general theme should be to maximize thinking about the data analysis and to minimize

More information