INSURANCE RISK THEORY (Problems)



1 Counting random variables

1. (Lack of memory property) Let $X$ be a geometrically distributed random variable with parameter $p \in (0, 1)$ ($X \sim Ge(p)$). Show that for all $n, m = 0, 1, \ldots$
$$P(X \geq n + m \mid X \geq m) = P(X \geq n).$$

2. Let $X$ and $Y$ be independent identically $Ge(p)$ distributed random variables and $Z = X + Y$.
a) Prove that the distribution of $Z$ is given by
$$P(Z = k) = (k + 1)(1 - p)^2 p^k, \quad k = 0, 1, 2, \ldots;$$
b) Find the conditional probability $P(X = m \mid X + Y = k)$, $m = 0, 1, \ldots, k$.

3. Let $X_k \sim Ge(p_k)$, $k = 1, 2, \ldots, n$ be independent random variables and $m_n = \min_{1 \leq k \leq n} X_k$. Show that $m_n \sim Ge(p)$, where $p = \prod_{k=1}^n p_k$.

4. Show that the tail of the binomial distribution is given by
$$\sum_{k=m}^n \binom{n}{k} p^k (1 - p)^{n-k} = m \binom{n}{m} \int_0^p x^{m-1} (1 - x)^{n-m}\, dx.$$

5. Let $X$ and $Y$ be independent negative binomially distributed random variables, $X \sim NB(\alpha_1, p)$ and $Y \sim NB(\alpha_2, p)$. Verify that $Z = X + Y \sim NB(\alpha_1 + \alpha_2, p)$.

6. Let $X \sim NB(n, p)$. Show that the tail of the distribution is given by
$$P(X \geq m) = \sum_{k=m}^\infty \binom{n + k - 1}{k} (1 - p)^n p^k = m \binom{m + n - 1}{m} \int_0^p x^{m-1} (1 - x)^{n-1}\, dx.$$
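Identities such as Problem 2a are easy to check numerically. A minimal sketch in Python (the value $p = 0.3$ and the truncation point are arbitrary test choices), comparing the discrete convolution of the $Ge(p)$ pmf with itself against the closed form:

```python
import numpy as np

# Numerical check of Problem 2a (p = 0.3 is an arbitrary test value):
# if X, Y are iid Ge(p) with P(X = k) = (1 - p) p^k, k = 0, 1, ...,
# then P(X + Y = k) = (k + 1) (1 - p)^2 p^k.
p = 0.3
K = 50                                        # compare the first K probabilities
pmf = (1 - p) * p ** np.arange(K)             # pmf of Ge(p) on 0..K-1
conv = np.convolve(pmf, pmf)[:K]              # pmf of X + Y by discrete convolution
closed = (np.arange(K) + 1) * (1 - p) ** 2 * p ** np.arange(K)
print(np.allclose(conv, closed))              # True
```

The same convolution pattern checks Problem 5 as well, by convolving two negative binomial pmfs.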

7. Let $X$ be a Poisson distributed random variable with parameter $\lambda$ ($X \sim Po(\lambda)$). Verify that for any $m = 0, 1, \ldots$, the tail is given by
$$P(X \geq m) = \sum_{k=m}^\infty \frac{\lambda^k}{k!} e^{-\lambda} = \int_0^\lambda \frac{x^{m-1}}{(m-1)!} e^{-x}\, dx.$$

8. Let $X \sim Po(\lambda)$ and $Y \sim Po(\mu)$ be independent.
a) Verify that $Z = X + Y \sim Po(\lambda + \mu)$;
b) Given $Z = X + Y$, find the distribution of $X$, i.e. $P(X = k \mid X + Y = n)$, $k = 0, 1, \ldots, n$.

9. Let $X$ be a nonnegative integer valued random variable with probabilities $\{p_k\}$, such that $\sum_{k=0}^\infty p_k t^k < \infty$ for any $t \in [0, t_0]$ and $t_0 \geq 1$. Show that
$$(1 - t) \sum_{k=0}^\infty r_{k+1} t^k = 1 - P_X(t), \quad t < t_0,$$
where $r_k = p_k + p_{k+1} + \ldots$ is the tail of the distribution.

10. The random variable $X$ is Poisson distributed with probability mass function
$$P(X = k \mid \lambda) = \frac{\lambda^k}{k!} e^{-\lambda}, \quad k = 0, 1, \ldots.$$
The parameter $\lambda$ is a realization of a Gamma distributed random variable with density function
$$f(\lambda) = \frac{\beta^r}{\Gamma(r)} \lambda^{r-1} e^{-\beta\lambda}, \quad \beta > 0, \ \lambda > 0,$$
where $\Gamma$ is the Gamma function, $r$ is the shape parameter and $\beta$ the scale parameter. Prove that
$$P(X = k) = \left(\frac{\beta}{1 + \beta}\right)^r \binom{r + k - 1}{k} \left(\frac{1}{1 + \beta}\right)^k, \quad k = 0, 1, \ldots.$$

11. Let $S_i \sim CPo(\lambda_i, F_i(x))$, $i = 1, \ldots, n$ be independent compound Poisson random variables with parameters $\lambda_i$ and $Z \sim F_i(x)$, $x > 0$. Show that $S = S_1 + \ldots + S_n$ is also compound Poisson with parameters $\lambda = \sum_{i=1}^n \lambda_i$ and $F(x) = \sum_{i=1}^n \frac{\lambda_i}{\lambda} F_i(x)$.

12. Let $X_1, X_2, \ldots$ be independent identically $Ge_1(1 - \rho)$ distributed random variables with parameter $\rho \in [0, 1)$ and probability mass function $P(X_1 = i) = \rho^{i-1}(1 - \rho)$, $i = 1, 2, \ldots$. The random variable $Y \sim Po(\lambda)$ is independent of $X_i$, $i = 1, 2, \ldots$. Consider the random variable $S_Y = X_1 + X_2 + \ldots + X_Y$. Show that the probability mass function of $S_Y$ is given by
$$P(S_Y = k) = \begin{cases} e^{-\lambda}, & k = 0, \\[4pt] e^{-\lambda} \displaystyle\sum_{i=1}^k \binom{k-1}{i-1} \frac{[\lambda(1-\rho)]^i}{i!} \rho^{k-i}, & k = 1, 2, \ldots \end{cases}$$
Hint: The probability generating function of $S_Y$ is given by $P_{S_Y}(s) = e^{-\lambda(1 - P_{X_1}(s))}$, where $P_{X_1}(s) = \frac{(1-\rho)s}{1-\rho s}$ is the probability generating function of $X_1$. The random variable $S_Y \sim PA(\lambda, \rho)$ (Pólya-Aeppli distribution).
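The Pólya-Aeppli pmf of Problem 12 can be cross-checked against the defining compound sum: weight the $i$-fold convolutions of the $Ge_1(1-\rho)$ pmf with Poisson probabilities and compare with the closed form. A sketch in Python ($\lambda = 1.5$, $\rho = 0.4$ and the truncation levels are arbitrary test choices):

```python
import numpy as np
from math import comb, exp, factorial

# Cross-check of the Polya-Aeppli pmf of Problem 12 (lam = 1.5, rho = 0.4
# are arbitrary test values).  Direct route: weight the i-fold convolutions
# of the Ge_1(1 - rho) pmf with Poisson(lam) probabilities.
lam, rho, K = 1.5, 0.4, 30

geo = np.zeros(K)                              # pmf of X_1: P(X_1 = i) = rho^(i-1)(1-rho)
geo[1:] = rho ** np.arange(K - 1) * (1 - rho)

direct = np.zeros(K)
conv = np.zeros(K)
conv[0] = 1.0                                  # 0-fold convolution: point mass at 0
for i in range(60):                            # truncate Y at 60; the Poisson tail is negligible
    direct += exp(-lam) * lam ** i / factorial(i) * conv
    conv = np.convolve(conv, geo)[:K]          # pmf of X_1 + ... + X_(i+1)

# closed form from Problem 12
pa = np.zeros(K)
pa[0] = exp(-lam)
for k in range(1, K):
    pa[k] = exp(-lam) * sum(comb(k - 1, i - 1) * (lam * (1 - rho)) ** i
                            / factorial(i) * rho ** (k - i) for i in range(1, k + 1))

print(np.allclose(direct, pa))                 # True
```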

2 Continuous random variables

13. Let $X$ and $Y$ be independent identically exponentially distributed random variables with parameter 1 ($\exp(1)$).
a) Find the distribution of $X/Y$;
b) Show that $Z = \dfrac{X}{X + Y} \sim U(0, 1)$;
c) Show that the random variable $W = X - Y$ has a standard Laplace distribution:
$$f_W(x) = \frac{1}{2} e^{-|x|}, \quad -\infty < x < \infty.$$

14. Let $Y \sim \exp(\lambda)$. It is known that the Laplace transform of $Y$ is given by $LT_Y(t) = \frac{\lambda}{\lambda + t}$. Find the Laplace transform of the r.v. $V = a + Y \sim \exp(a, \lambda)$.

15. Let $X$ and $Y$ be independent exponentially distributed random variables with respective parameters $\lambda$ and $\mu$. Show that
$$P(X \leq Y \mid \min(X, Y) > z) = \frac{\lambda}{\lambda + \mu},$$
i.e. the probability that the smallest value will be the value of $X$ is proportional to the parameter of $X$ and is independent of $\min(X, Y)$.

16. Let $Y_k \sim \exp(\lambda_k)$, $k = 1, 2, \ldots, n$ be independent random variables and $m_n = \min\{Y_1, \ldots, Y_n\}$. Prove that for any $n = 1, 2, \ldots$, $m_n \sim \exp(\lambda)$, where $\lambda = \lambda_1 + \ldots + \lambda_n$.

17. Let $X \sim \exp(\lambda)$. Prove that for any $u > 0$,
$$m_1(u) = E[X \mid X > u] = u + \frac{1}{\lambda} \quad \text{and} \quad Var[X \mid X > u] = \frac{1}{\lambda^2}.$$
Hint: Given a threshold $u$, the exceedances above $u$ are calculated conditional on $X > u$. By Bayes' rule $f(x \mid X > u) = \frac{f(x)}{1 - F(u)}$, $x > u$.

18. Let $X_1, X_2, \ldots, X_n$ be independent Gamma distributed random variables ($X_i \sim \Gamma(\alpha_i, \beta)$). Show that $X = X_1 + \ldots + X_n \sim \Gamma(\alpha_1 + \ldots + \alpha_n, \beta)$.

19. Let $X \sim \Gamma(\alpha, 1)$ and $Y \sim \Gamma(\beta, 1)$ be independent random variables. Show that the random variables $U = \dfrac{X}{X + Y}$ and $V = X + Y$ are independent, $U$ is Beta distributed with parameters $(\alpha, \beta)$ ($U \sim B(\alpha, \beta)$), and $V \sim \Gamma(\alpha + \beta, 1)$.

20. Let $X \sim B(\alpha, \beta)$. Prove that
a) $1 - X \sim B(\beta, \alpha)$;
b) the distribution of $Y = \dfrac{1}{X}$ is given by the density function
$$f_Y(x) = \frac{1}{B(\alpha, \beta)} \frac{(x - 1)^{\beta-1}}{x^{\alpha+\beta}}, \quad x > 1;$$
c) the distribution of $Z = Y - 1 = \dfrac{1 - X}{X}$ is given by
$$f_Z(x) = \frac{1}{B(\alpha, \beta)} \frac{x^{\beta-1}}{(1 + x)^{\alpha+\beta}}, \quad x > 0$$
(Beta distribution of the second kind).
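The conditional-moment formulas of Problem 17 lend themselves to a quick numerical check by integrating against the conditional density from the hint. A sketch in Python ($\lambda = 2$ and $u = 1.3$ are arbitrary test values; the trapezoidal rule on a fine grid stands in for the exact integrals):

```python
import numpy as np

# Numerical check of Problem 17 (lam = 2, u = 1.3 chosen arbitrarily):
# for X ~ exp(lam), E[X | X > u] = u + 1/lam and Var[X | X > u] = 1/lam^2.
lam, u = 2.0, 1.3
x = np.linspace(u, u + 40.0 / lam, 400_001)    # fine grid; the far tail is negligible
dens = lam * np.exp(-lam * (x - u))            # f(x | X > u) = f(x)/(1 - F(u))

def trapz(y):                                  # trapezoidal rule on the uniform grid x
    return float(np.sum((y[1:] + y[:-1]) / 2) * (x[1] - x[0]))

m1 = trapz(x * dens)
var = trapz(x ** 2 * dens) - m1 ** 2
print(abs(m1 - (u + 1 / lam)) < 1e-5)          # True: conditional mean is u + 1/lam
print(abs(var - 1 / lam ** 2) < 1e-5)          # True: conditional variance is 1/lam^2
```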

21. Let $V \sim Bi(n, p)$ and $X \sim B(m, n - m + 1)$, where $m \leq n$ are integers and $0 < p < 1$. Show that $P(V \geq m) = P(X < p)$.

22. Let $U \sim U(0, 1)$. Show that the random variable $X = U^{-1/\alpha}$, $\alpha > 0$, is Pareto distributed with density function given by $f_X(x) = \alpha x^{-(\alpha+1)}$, $x \geq 1$.

23. Let $F(x)$ be the distribution function of a nonnegative random variable $X$ and $F(0) = 0$. Show that for any $s \in \mathbb{R}$, the moment generating function $M_X(s)$ satisfies
$$M_X(s) = \int_0^\infty e^{sx}\, dF(x) = 1 + s \int_0^\infty e^{sx} \bar{F}(x)\, dx,$$
if it exists, where $\bar{F}(x) = 1 - F(x)$.

24. Show that $F$ with finite first moment is heavy tailed if and only if the corresponding integrated tail distribution $F_I$ is heavy tailed.

25. Find the hazard rate function $\lambda(t)$ of the Pareto distribution $Par(\alpha, \lambda)$.

26. Show that the following distributions are of Pareto type:
a) Pareto distribution $Par(\alpha, \lambda)$;
b) Loggamma distribution with density function
$$f(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)} (\log x)^{\alpha-1} x^{-(\lambda+1)}, \quad x > 1;$$
c) Burr distribution with survival function
$$\bar{F}(x) = \left(\frac{\lambda}{\lambda + x^\tau}\right)^\alpha, \quad x \geq 0.$$

27. Let $X_1, \ldots, X_n$ be independent identically Inverse Gaussian distributed random variables with density function
$$f_X(x) = \mu (2\pi\sigma x^3)^{-1/2} e^{-\frac{(x - \mu)^2}{2\sigma x}}, \quad x > 0, \ \mu, \sigma > 0.$$
Notation: $X_i \sim IG(\mu, \sigma)$. Show that the sum $S = X_1 + \ldots + X_n$ is $IG(n\mu, \sigma)$ distributed.
Hint: Show that $M_X(s) = e^{\frac{\mu}{\sigma}\left[1 - (1 - 2\sigma s)^{1/2}\right]}$.

3 Cramér - Lundberg model

28. Let $\{U_t\}$ be a Cramér-Lundberg model with initial capital $u$, premium rate $c$, claim intensity $\lambda$ and $\Gamma(2, \beta)$ distributed claims with density function $f(x) = \beta^2 x e^{-\beta x}$, $x > 0$.
a) Explain the NPC (net profit condition);
b) Show that the nonruin probability $\Phi(u)$ satisfies the equation
$$c\Phi'''(u) + (2\beta c - \lambda)\Phi''(u) + \beta(\beta c - 2\lambda)\Phi'(u) = 0.$$
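Problem 22 is the inverse-transform method for sampling Pareto variates, which is easy to test by simulation: transform uniforms and compare the empirical cdf with $1 - x^{-\alpha}$ at a few points. A sketch in Python ($\alpha = 2.5$, the seed, the sample size and the evaluation points are all arbitrary test choices):

```python
import numpy as np

# Simulation check of Problem 22: if U ~ U(0, 1), then X = U**(-1/alpha)
# has the Pareto cdf F(x) = 1 - x**(-alpha), x >= 1 (inverse transform).
alpha = 2.5
rng = np.random.default_rng(0)                 # fixed seed for reproducibility
x = rng.uniform(size=200_000) ** (-1 / alpha)
for t in (1.5, 2.0, 4.0):
    emp = float(np.mean(x <= t))               # empirical cdf at t
    print(round(emp, 3), round(1 - t ** (-alpha), 3))
```

With 200,000 draws the empirical and theoretical cdf values agree to about two decimal places at each test point.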

c) Find the ruin probability.

29. Let $\{U_t\}$ be a Cramér-Lundberg model with initial capital $u$, premium rate $c$, claim intensity $\lambda$ and constant claims $Z_i = \mu$.
a) Show that the nonruin probability $\Phi(u)$ is differentiable everywhere except at the point $u = \mu$ and satisfies the equation
$$c\Phi'(u) = \lambda[\Phi(u) - \Phi(u - \mu)];$$
b) Show that for $u \leq \mu$
$$\Phi(u) = \frac{c - \lambda\mu}{c}\, e^{\frac{\lambda}{c} u}.$$

30. Let $\{U_t\}$ be a renewal risk model with $Erlang(2, \beta)$ distributed inter-arrival times. The mean value of the claims is $m_1 = EZ$. Verify that the Laplace transform of the nonruin probability is given by
$$LT_\Phi(s) = \frac{c^2 s \Phi(0) + \beta^2 m_1 - 2\beta c}{c^2 s^2 - 2\beta c s + \beta^2 (1 - LT_F(s))},$$
where $LT_F(s)$ is the Laplace transform of the claim size distribution $F$ and $\Phi(0)$ is the nonruin probability with no initial capital.

31. Let the claim sizes $Z_i \sim Par(\alpha, \lambda)$ in the classical risk model.
a) Find the NPC condition;
b) Find the integrated tail distribution $F_I$;
c) Verify the following approximation of the ruin probability
$$\Psi(u) = \frac{\theta}{\alpha(1 - \theta)} \left(\frac{u}{\lambda}\right)^{-(\alpha-1)}, \quad u > \lambda.$$

32. Consider the classical risk model with $\lambda = 1$, safety loading $\theta$ and $\exp(1)$ distributed claims. The reinsurer's safety loading factor is $\eta$. Calculate the adjustment coefficient in the case of
a) proportional reinsurance $h(x) = bx$;
b) excess of loss reinsurance $h(x) = (x - b)_+$.

Leda D. Minkova
August 2010
Faculty of Mathematics and Informatics
Sofia University
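As a warm-up for Problem 32, the no-reinsurance baseline can be solved numerically: with claim intensity 1, premium rate $c = 1 + \theta$ and $\exp(1)$ claims, the adjustment coefficient $R$ is the positive root of Lundberg's equation $1 + cR = M_Z(R) = 1/(1 - R)$, with closed form $R = \theta/(1 + \theta)$. A sketch in Python ($\theta = 0.2$ is an arbitrary test value; the reinsurance cases a) and b) modify the equation analogously):

```python
# Baseline for Problem 32 (no reinsurance; theta = 0.2 is an arbitrary test
# value): with lambda = 1, premium rate c = 1 + theta and exp(1) claims,
# the adjustment coefficient R solves 1 + c*R = M_Z(R) = 1/(1 - R),
# whose closed form is R = theta/(1 + theta).  Found here by bisection.
theta = 0.2
c = 1 + theta

def g(r):                                      # g(R) = 0 at the adjustment coefficient
    return 1 + c * r - 1 / (1 - r)

lo, hi = 1e-6, 0.99                            # g(lo) > 0 > g(hi) brackets the root
for _ in range(100):
    mid = (lo + hi) / 2
    if g(mid) > 0:
        lo = mid
    else:
        hi = mid
R = (lo + hi) / 2
print(round(R, 6))                             # 0.166667, i.e. theta/(1 + theta)
```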