INSURANCE RISK THEORY (Problems)

1 Counting random variables

1. (Lack of memory property) Let X be a geometrically distributed random variable with parameter p ∈ (0, 1), (X ~ Ge(p)). Show that for all n, m = 0, 1, ...

      P(X ≥ n + m | X ≥ m) = P(X ≥ n).

2. Let X and Y be independent identically Ge(p) distributed random variables and Z = X + Y.
   a) Prove that the distribution of Z is given by P(Z = k) = (k + 1)(1 − p)^2 p^k, k = 0, 1, 2, ... ;
   b) Find the conditional probability P(X = m | X + Y = k), m = 0, 1, ..., k.

3. Let X_k ~ Ge(p_k), k = 1, 2, ..., n be independent random variables and m_n = min_{1≤k≤n} X_k. Show that m_n ~ Ge(p), where p = ∏_{k=1}^n p_k.

4. Show that the tail of the binomial distribution is given by

      ∑_{k=m}^n C(n, k) p^k (1 − p)^{n−k} = m C(n, m) ∫_0^p x^{m−1} (1 − x)^{n−m} dx.

5. Let X and Y be independent negative binomial distributed random variables, X ~ NB(α_1, p) and Y ~ NB(α_2, p). Verify that Z = X + Y ~ NB(α_1 + α_2, p).

6. Let X ~ NB(n, p). Show that the tail of the distribution is given by

      P(X ≥ m) = ∑_{k=m}^∞ C(n + k − 1, k) (1 − p)^n p^k = m C(m + n − 1, m) ∫_0^p x^{m−1} (1 − x)^{n−1} dx.
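The beta-integral identity of Problem 4 is easy to check numerically. The following Python sketch is an addition to the sheet, not part of the original problems; the parameters n = 10, m = 4, p = 0.3 are arbitrary illustrative choices. It compares the binomial tail with a midpoint-rule evaluation of the integral on the right-hand side:

```python
from math import comb

def binom_tail(n: int, m: int, p: float) -> float:
    # left-hand side of Problem 4: P(V >= m) for V ~ Bi(n, p)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

def beta_integral(n: int, m: int, p: float, steps: int = 100_000) -> float:
    # right-hand side: m * C(n, m) * integral_0^p x^(m-1) (1-x)^(n-m) dx,
    # evaluated with the midpoint rule
    h = p / steps
    s = sum(((i + 0.5) * h) ** (m - 1) * (1 - (i + 0.5) * h) ** (n - m)
            for i in range(steps))
    return m * comb(n, m) * h * s

# the two sides of the identity agree to high accuracy; prints True
print(abs(binom_tail(10, 4, 0.3) - beta_integral(10, 4, 0.3)) < 1e-8)
```

The same check, with X ~ B(m, n − m + 1), is the content of Problem 21 below.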
7. Let X be a Poisson distributed random variable with parameter λ (X ~ Po(λ)). Verify that for any m = 1, 2, ..., the tail is given by

      P(X ≥ m) = ∑_{k=m}^∞ (λ^k / k!) e^{−λ} = ∫_0^λ (x^{m−1} / (m − 1)!) e^{−x} dx.

8. Let X ~ Po(λ) and Y ~ Po(µ) be independent.
   a) Verify that Z = X + Y ~ Po(λ + µ);
   b) Given Z = X + Y, find the distribution of X, i.e. P(X = k | X + Y = n), k = 0, 1, ..., n.

9. Let X be a nonnegative integer valued random variable with probabilities {p_k}, such that ∑_{k=0}^∞ p_k t^k < ∞ for any t ∈ [0, t_0] and t_0 > 1. Show that

      (1 − t) ∑_{k=0}^∞ r_{k+1} t^k = 1 − P_X(t), t < t_0,

   where r_k = p_k + p_{k+1} + ... is the tail of the distribution.

10. The random variable X is Poisson distributed with probability mass function P(X = k | λ) = (λ^k / k!) e^{−λ}, k = 0, 1, .... The parameter λ is a realization of a Gamma distributed random variable with density function

      f(λ) = (β^r / Γ(r)) λ^{r−1} e^{−βλ}, β > 0, λ > 0,

   where Γ is the Gamma function, r is the shape parameter and β the scale parameter. Prove that

      P(X = k) = (β / (1 + β))^r C(r + k − 1, k) (1 / (1 + β))^k, k = 0, 1, ....

11. Let S_i ~ CPo(λ_i, F_i(x)), i = 1, ..., n be independent Compound Poisson random variables with parameters λ_i and Z ~ F_i(x), x > 0. Show that S = S_1 + ... + S_n is also Compound Poisson with parameters λ = ∑_{i=1}^n λ_i and F(x) = ∑_{i=1}^n (λ_i / λ) F_i(x).

12. Let X_1, X_2, ... be independent identically Ge_1(1 − ρ) distributed random variables with parameter ρ ∈ [0, 1) and probability mass function P(X_1 = i) = ρ^{i−1}(1 − ρ), i = 1, 2, .... The random variable Y ~ Po(λ) is independent of X_i, i = 1, 2, .... Consider the random variable S_Y = X_1 + X_2 + ... + X_Y. Show that the probability mass function of S_Y is given by

      P(S_Y = 0) = e^{−λ},
      P(S_Y = k) = e^{−λ} ∑_{i=1}^k C(k − 1, i − 1) ([λ(1 − ρ)]^i / i!) ρ^{k−i}, k = 1, 2, ....

   Hint: The probability generating function of S_Y is given by P_{S_Y}(s) = e^{−λ(1 − P_{X_1}(s))}, where P_{X_1}(s) = (1 − ρ)s / (1 − ρs) is the probability generating function of X_1. The random variable S_Y ~ PA(λ, ρ) (Pólya–Aeppli distribution).
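The Pólya–Aeppli mass function of Problem 12 can be sanity-checked numerically: it must sum to 1, and by Wald's identity its mean must equal E[Y] E[X_1] = λ/(1 − ρ). The Python sketch below is an addition to the sheet; the values λ = 2, ρ = 0.3 are arbitrary.

```python
from math import comb, exp, factorial

def polya_aeppli_pmf(k: int, lam: float, rho: float) -> float:
    # probability mass function of S_Y from Problem 12
    if k == 0:
        return exp(-lam)
    return exp(-lam) * sum(
        comb(k - 1, i - 1) * (lam * (1 - rho)) ** i / factorial(i) * rho ** (k - i)
        for i in range(1, k + 1)
    )

lam, rho = 2.0, 0.3
probs = [polya_aeppli_pmf(k, lam, rho) for k in range(120)]  # tail beyond 120 is negligible

# the probabilities sum to 1 and the mean is lam / (1 - rho); both lines print True
print(abs(sum(probs) - 1.0) < 1e-10)
print(abs(sum(k * q for k, q in enumerate(probs)) - lam / (1 - rho)) < 1e-8)
```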
2 Continuous random variables

13. Let X and Y be independent identically exponentially distributed random variables with parameter 1 (exp(1)).
   a) Find the distribution of X/Y;
   b) Show that Z = X/(X + Y) ~ U(0, 1);
   c) Show that the random variable W = X − Y has a standard Laplace distribution:

      f_W(x) = (1/2) e^{−|x|}, −∞ < x < ∞.

14. Let Y ~ exp(λ). It is known that the Laplace transform of Y is given by LT_Y(t) = λ/(λ + t). Find the Laplace transform of the r.v. V = a + Y ~ exp(a, λ).

15. Let X and Y be independent exponentially distributed random variables with respective parameters λ and µ. Show that

      P(X < Y | min(X, Y) > z) = λ/(λ + µ),

   i.e. the probability that the smallest value will be the value of X is proportional to the parameter of X and is independent of min(X, Y).

16. Let Y_k ~ exp(λ_k), k = 1, 2, ..., n be independent random variables and m_n = min{Y_1, ..., Y_n}. Prove that for any n = 1, 2, ..., m_n ~ exp(λ), where λ = λ_1 + ... + λ_n.

17. Let X ~ exp(λ). Prove that for any u > 0,

      m_1(u) = E[X | X > u] = u + 1/λ and Var[X | X > u] = 1/λ^2.

   Hint: Given a threshold u, the exceedances above u are calculated conditional on X > u. By Bayes' rule f(x | X > u) = f(x)/(1 − F(u)), x > u.

18. Let X_1, X_2, ..., X_n be independent Gamma distributed random variables (X_i ~ Γ(α_i, β)). Show that X = X_1 + ... + X_n ~ Γ(α_1 + ... + α_n, β).

19. Let X ~ Γ(α, 1) and Y ~ Γ(β, 1) be independent random variables. Show that the random variables U = X/(X + Y) and V = X + Y are independent, U is Beta distributed with parameters (α, β), (U ~ B(α, β)), and V ~ Γ(α + β, 1).

20. Let X ~ B(α, β). Prove that
   a) 1 − X ~ B(β, α);
   b) the distribution of Y = 1/X is given by the density function

      f_Y(x) = (1/B(α, β)) (x − 1)^{β−1} x^{−(α+β)}, x > 1;

   c) the distribution of Z = Y − 1 = (1 − X)/X is given by

      f_Z(x) = (1/B(α, β)) x^{β−1} (1 + x)^{−(α+β)}, x > 0

   (Beta distribution of the second kind).
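The memoryless conditional mean of Problem 17 can be confirmed by direct numerical integration: E[X | X > u] = (1/P(X > u)) ∫_u^∞ x λ e^{−λx} dx should equal u + 1/λ. The Python sketch below is an addition to the sheet; λ = 1.5 and u = 2 are arbitrary, and the improper integral is truncated at an upper limit where the integrand is negligible.

```python
from math import exp

# numerical check of Problem 17: for X ~ exp(lam), E[X | X > u] = u + 1/lam
lam, u = 1.5, 2.0
steps, upper = 400_000, 60.0          # truncate the tail integral at `upper`
h = (upper - u) / steps
# midpoint rule for the numerator: integral_u^inf x * lam * e^(-lam x) dx
num = h * sum((u + (i + 0.5) * h) * lam * exp(-lam * (u + (i + 0.5) * h))
              for i in range(steps))
cond_mean = num / exp(-lam * u)       # divide by P(X > u) = e^(-lam u)
print(abs(cond_mean - (u + 1 / lam)) < 1e-5)   # prints True
```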
21. Let V ~ Bi(n, p) and X ~ B(m, n − m + 1), where m ≤ n are integer numbers and 0 < p < 1. Show that P(V ≥ m) = P(X < p).

22. Let U ~ U(0, 1). Show that the random variable X = U^{−1/α}, α > 0, is Pareto distributed with density function given by f_X(x) = α x^{−(α+1)}, x ≥ 1.

23. Let F(x) be the distribution function of a nonnegative random variable X with F(0) = 0, and let F̄(x) = 1 − F(x) be its tail. Show that for any s ∈ R, the moment generating function M_X(s) satisfies

      M_X(s) = ∫_0^∞ e^{sx} dF(x) = 1 + s ∫_0^∞ e^{sx} F̄(x) dx,

   if it exists.

24. Show that F with the first finite moment is heavy tailed if and only if the corresponding integrated tail distribution F_I is heavy tailed.

25. Find the hazard rate function λ(t) of the Pareto distribution Par(α, λ).

26. Show that the following distribution functions are Pareto-type:
   a) Pareto distribution Par(α, λ);
   b) Loggamma distribution with a density function f(x) = (λ^α / Γ(α)) (log x)^{α−1} x^{−(λ+1)}, x > 1;
   c) Burr distribution with survival function

      F̄(x) = (λ / (λ + x^τ))^α, x ≥ 0, τ > 0.

27. Let X_1, ..., X_n be independent identically Inverse Gaussian distributed random variables with density function

      f_X(x) = µ (2πσx^3)^{−1/2} e^{−(x−µ)^2 / (2σx)}, x > 0, µ, σ > 0.

   Notation: X_i ~ IG(µ, σ). Show that the sum S = X_1 + ... + X_n is IG(nµ, σ) distributed.
   Hint: Show that M_X(s) = e^{(µ/σ)[1 − (1 − 2σs)^{1/2}]}.

3 Cramér-Lundberg model

28. Let {U_t} be a Cramér-Lundberg model with initial capital u, premium rate c, claim intensity λ and Γ(2, β) distributed claims with density function f(x) = β^2 x e^{−βx}, x > 0.
   a) Explain the NPC (net profit condition);
   b) Show that the nonruin probability Φ(u) satisfies the equation

      c Φ′′′(u) + (2βc − λ) Φ′′(u) + β(βc − 2λ) Φ′(u) = 0.
   c) Find the ruin probability.

29. Let {U_t} be a Cramér-Lundberg model with initial capital u, premium rate c, claim intensity λ and constant claims Z_i = µ.
   a) Show that the nonruin probability Φ(u) is differentiable everywhere except at the point u = µ and satisfies the equation

      c Φ′(u) = λ[Φ(u) − Φ(u − µ)];

   b) Show that for u ≤ µ

      Φ(u) = ((c − λµ)/c) e^{(λ/c) u}.

30. Let {U_t} be a renewal risk model with Erlang(2, β) distributed inter-arrival times. The mean value of the claims is m_1 = EZ. Verify that the Laplace transform of the nonruin probability is given by

      LT_Φ(s) = (c^2 s Φ(0) + β^2 m_1 − 2βc) / (c^2 s^2 − 2βcs + β^2 (1 − LT_F(s))),

   where Φ(0) is the nonruin probability with no initial capital.

31. Let the claim sizes Z_i ~ Par(α, λ) in the classical risk model.
   a) Find a condition for the NPC;
   b) Find the integrated tail distribution F_I;
   c) Verify the following approximation of the ruin probability

      Ψ(u) ≈ (θ / (α(1 − θ))) (u/λ)^{−(α−1)}, u > λ.

32. Consider the classical risk model with λ = 1, safety loading θ and exp(1) distributed claims. The reinsurer's safety loading factor is η. Calculate the adjustment coefficient in the case of
   a) proportional reinsurance h(x) = bx;
   b) excess of loss reinsurance h(x) = (x − b)_+.

Leda D. Minkova
August 2010
Faculty of Mathematics and Informatics
Sofia University
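As a worked baseline for Problem 32 (without reinsurance): the adjustment coefficient R > 0 solves the Lundberg equation λ(M_Z(r) − 1) = c r with premium rate c = (1 + θ)λ m_1. For λ = 1 and exp(1) claims (m_1 = 1, M_Z(r) = 1/(1 − r)), the equation reduces to 1 = (1 + θ)(1 − R), i.e. R = θ/(1 + θ). The Python sketch below is an addition to the sheet (θ = 0.2 is an arbitrary choice) and recovers this root by bisection, which is also how the reinsurance cases a) and b) can be solved when no closed form is available.

```python
# Problem 32 baseline, no reinsurance: lambda = 1, m_1 = 1, premium c = 1 + theta.
# The adjustment coefficient R solves lambda * (M_Z(r) - 1) = c * r.

def lundberg_gap(r: float, theta: float) -> float:
    # M_Z(r) = 1 / (1 - r) for Z ~ exp(1), valid for r < 1
    return (1.0 / (1.0 - r) - 1.0) - (1.0 + theta) * r

def adjustment_coefficient(theta: float) -> float:
    # the gap is negative just above 0 and positive near 1, with a single crossing
    lo, hi = 1e-9, 1.0 - 1e-9
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if lundberg_gap(mid, theta) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta = 0.2
# matches the closed form R = theta / (1 + theta); prints True
print(abs(adjustment_coefficient(theta) - theta / (1.0 + theta)) < 1e-9)
```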