1.7.1 Moments and Moment Generating Functions

CHAPTER 1. ELEMENTS OF PROBABILITY DISTRIBUTION THEORY

Definition 1.12. The $n$th moment ($n \in \mathbb{N}$) of a random variable $X$ is defined as
$$\mu_n' = E(X^n).$$
The $n$th central moment of $X$ is defined as
$$\mu_n = E(X - \mu)^n,$$
where $\mu = \mu_1' = E(X)$.

Note that the second central moment is the variance of a random variable $X$, usually denoted by $\sigma^2$.

Moments give an indication of the shape of the distribution of a random variable. Skewness and kurtosis are measured by the following functions of the third and fourth central moments, respectively:

the coefficient of skewness is given by
$$\gamma_1 = \frac{E(X - \mu)^3}{\sigma^3} = \frac{\mu_3}{\mu_2^{3/2}};$$

the coefficient of kurtosis is given by
$$\gamma_2 = \frac{E(X - \mu)^4}{\sigma^4} - 3 = \frac{\mu_4}{\mu_2^2} - 3.$$

Moments can be calculated from the definition or by using the so-called moment generating function.

Definition 1.13. The moment generating function (mgf) of a random variable $X$ is a function $M_X : \mathbb{R} \to [0, \infty)$ given by
$$M_X(t) = E\, e^{tX},$$
provided that the expectation exists for $t$ in some neighborhood of zero. More explicitly, the mgf of $X$ can be written as
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx, \quad \text{if } X \text{ is continuous},$$
$$M_X(t) = \sum_{x \in \mathcal{X}} e^{tx} P(X = x), \quad \text{if } X \text{ is discrete}.$$

The method to generate moments is given in the following theorem.
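As a quick numerical illustration of these definitions (a sketch added here, not part of the original notes), the central moments, and hence $\gamma_1$ and $\gamma_2$, can be computed from the raw moments via the binomial expansion $E(X-\mu)^n = \sum_k \binom{n}{k} E(X^k)(-\mu)^{n-k}$. For $X \sim \operatorname{Exp}(1)$ the raw moments are $E(X^n) = n!$, which gives skewness $2$ and kurtosis $6$:

```python
from math import comb, factorial

# Raw moments of Exp(1): E[X^n] = n!
raw = [factorial(n) for n in range(5)]  # raw[n] = E[X^n]

def central_moment(raw, n):
    """n-th central moment via the binomial expansion of E[(X - mu)^n]."""
    mu = raw[1]
    return sum(comb(n, k) * raw[k] * (-mu) ** (n - k) for k in range(n + 1))

mu2 = central_moment(raw, 2)   # variance
mu3 = central_moment(raw, 3)
mu4 = central_moment(raw, 4)

gamma1 = mu3 / mu2 ** 1.5      # coefficient of skewness
gamma2 = mu4 / mu2 ** 2 - 3    # coefficient of kurtosis

print(gamma1, gamma2)          # for Exp(1): 2.0 and 6.0
```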

1.7. EXPECTED VALUES

Theorem 1.7. If $X$ has mgf $M_X(t)$, then
$$E(X^n) = M_X^{(n)}(0),$$
where
$$M_X^{(n)}(0) = \frac{d^n}{dt^n} M_X(t)\Big|_{t=0}.$$
That is, the $n$th moment is equal to the $n$th derivative of the mgf evaluated at $t = 0$.

Proof. Assuming that we can differentiate under the integral sign, we may write
$$\frac{d}{dt} M_X(t) = \frac{d}{dt} \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx = \int_{-\infty}^{\infty} \left(\frac{d}{dt} e^{tx}\right) f_X(x)\,dx = \int_{-\infty}^{\infty} x e^{tx} f_X(x)\,dx = E(X e^{tX}).$$
Hence, evaluating the last expression at zero we obtain
$$\frac{d}{dt} M_X(t)\Big|_{t=0} = E(X e^{tX})\Big|_{t=0} = E(X).$$
For $n = 2$ we get
$$\frac{d^2}{dt^2} M_X(t)\Big|_{t=0} = E(X^2 e^{tX})\Big|_{t=0} = E(X^2).$$
Analogously, it can be shown that for any $n \in \mathbb{N}$ we can write
$$\frac{d^n}{dt^n} M_X(t)\Big|_{t=0} = E(X^n e^{tX})\Big|_{t=0} = E(X^n).$$

Example 1.14. Find the mgf of $X \sim \operatorname{Exp}(\lambda)$ and use the results of Theorem 1.7 to obtain the mean and variance of $X$. By definition the mgf can be written as
$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx.$$
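Theorem 1.7 can be illustrated numerically (a sketch, not part of the original notes): approximate the derivatives of an mgf at $t = 0$ by central finite differences. Here we use the Poisson mgf $M(t) = e^{\lambda(e^t - 1)}$ quoted later in Example 1.15; the parameter $\lambda = 2$ and the step size $h$ are arbitrary choices. The estimates should be close to $E(X) = \lambda$ and $E(X^2) = \lambda + \lambda^2$.

```python
from math import exp

# mgf of Poisson(lam): M(t) = exp(lam * (e^t - 1))
lam = 2.0
M = lambda t: exp(lam * (exp(t) - 1.0))

h = 1e-4
# central finite differences for the first and second derivatives at t = 0
m1 = (M(h) - M(-h)) / (2 * h)            # ~ M'(0)  = E[X]   = lam
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # ~ M''(0) = E[X^2] = lam + lam^2

print(m1, m2)  # approximately 2.0 and 6.0
```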

For the exponential distribution we have $f_X(x) = \lambda e^{-\lambda x} I_{(0,\infty)}(x)$, where $\lambda \in \mathbb{R}_+$. Here we used the notation of the indicator function $I_{\mathcal{X}}(x)$, whose meaning is as follows:
$$I_{\mathcal{X}}(x) = \begin{cases} 1, & \text{if } x \in \mathcal{X}; \\ 0, & \text{otherwise}. \end{cases}$$
That is,
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x \in (0, \infty); \\ 0, & \text{otherwise}. \end{cases}$$
Hence, integrating by the method of substitution, we get
$$M_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{(t - \lambda)x}\,dx = \frac{\lambda}{\lambda - t},$$
provided that $t < \lambda$.

Now, using Theorem 1.7 we obtain the first and the second moments, respectively:
$$E(X) = M_X'(0) = \frac{\lambda}{(\lambda - t)^2}\Big|_{t=0} = \frac{1}{\lambda},$$
$$E(X^2) = M_X^{(2)}(0) = \frac{2\lambda}{(\lambda - t)^3}\Big|_{t=0} = \frac{2}{\lambda^2}.$$
Hence, the variance of $X$ is
$$\operatorname{var}(X) = E(X^2) - [E(X)]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.$$

Exercise 1.10. Calculate the mgf for the Binomial and Poisson distributions.

Moment generating functions provide methods for comparing distributions or finding their limiting forms. The following two theorems give us the tools.

Theorem 1.8. Let $F_X(x)$ and $F_Y(y)$ be two cdfs all of whose moments exist. Then
1. If $F_X$ and $F_Y$ have bounded support, then $F_X(u) = F_Y(u)$ for all $u$ iff $E(X^n) = E(Y^n)$ for all $n = 0, 1, 2, \ldots$.
2. If the mgfs of $X$ and $Y$ exist and are equal, i.e., $M_X(t) = M_Y(t)$ for all $t$ in some neighborhood of zero, then $F_X(u) = F_Y(u)$ for all $u$.
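The moments $E(X) = 1/\lambda$ and $\operatorname{var}(X) = 1/\lambda^2$ just derived can also be checked by simulation. A minimal Monte Carlo sketch using only Python's standard library (the choice $\lambda = 3$, the sample size, and the seed are arbitrary):

```python
import random
from statistics import fmean, pvariance

random.seed(0)
lam = 3.0
# random.expovariate(lam) draws from Exp(lam)
sample = [random.expovariate(lam) for _ in range(200_000)]

mean_hat = fmean(sample)     # should be close to 1/lam  = 0.333...
var_hat = pvariance(sample)  # should be close to 1/lam^2 = 0.111...
print(mean_hat, var_hat)
```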

Theorem 1.9. Suppose that $\{X_1, X_2, \ldots\}$ is a sequence of random variables, each with mgf $M_{X_i}(t)$. Furthermore, suppose that
$$\lim_{i \to \infty} M_{X_i}(t) = M_X(t), \quad \text{for all } t \text{ in a neighborhood of zero},$$
and $M_X(t)$ is an mgf. Then there is a unique cdf $F_X$ whose moments are determined by $M_X(t)$ and, for all $x$ where $F_X(x)$ is continuous, we have
$$\lim_{i \to \infty} F_{X_i}(x) = F_X(x).$$
This theorem means that the convergence of mgfs implies convergence of cdfs.

Example 1.15. We know that the Binomial distribution can be approximated by a Poisson distribution when $p$ is small and $n$ is large. Using the above theorem we can confirm this fact. The mgfs of $X_n \sim \operatorname{Bin}(n, p)$ and of $Y \sim \operatorname{Poisson}(\lambda)$ are, respectively:
$$M_{X_n}(t) = [p e^t + (1 - p)]^n, \qquad M_Y(t) = e^{\lambda(e^t - 1)}.$$
We will show that the mgf of $X_n$ tends to the mgf of $Y$, where $\lambda = np$. We will need the following useful result, given in the lemma:

Lemma 1.1. Let $a_1, a_2, \ldots$ be a sequence of numbers converging to $a$, that is, $\lim_{n \to \infty} a_n = a$. Then
$$\lim_{n \to \infty} \left(1 + \frac{a_n}{n}\right)^n = e^a.$$

Now, we can write
$$M_{X_n}(t) = \left(p e^t + (1 - p)\right)^n = \left(1 + \frac{1}{n}\, np(e^t - 1)\right)^n = \left(1 + \frac{\lambda(e^t - 1)}{n}\right)^n \longrightarrow e^{\lambda(e^t - 1)} = M_Y(t).$$
Hence, by Theorem 1.9, the Binomial distribution converges to the Poisson distribution.
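The convergence in Example 1.15 is easy to observe numerically (a sketch, not part of the notes): fix $\lambda = np$ and a point $t$, and watch the Binomial mgf approach the Poisson mgf as $n$ grows. The values $\lambda = 2$ and $t = 0.5$ are arbitrary choices.

```python
from math import exp

lam, t = 2.0, 0.5
poisson_mgf = exp(lam * (exp(t) - 1.0))  # M_Y(t) = e^{lam(e^t - 1)}

for n in (10, 100, 10_000):
    p = lam / n                              # keep lam = n*p fixed
    binom_mgf = (p * exp(t) + (1 - p)) ** n  # M_{X_n}(t)
    print(n, binom_mgf)                      # approaches poisson_mgf as n grows

print(poisson_mgf)
```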

1.8 Functions of Random Variables

If $X$ is a random variable with cdf $F_X(x)$, then any function of $X$, say $g(X) = Y$, is also a random variable. The question then is: what is the distribution of $Y$?

The function $y = g(x)$ is a mapping from the induced sample space of the random variable $X$, $\mathcal{X}$, to a new sample space, $\mathcal{Y}$, of the random variable $Y$, that is,
$$g(x) : \mathcal{X} \to \mathcal{Y}.$$
The inverse mapping $g^{-1}$ acts from $\mathcal{Y}$ to $\mathcal{X}$ and we can write
$$g^{-1}(A) = \{x \in \mathcal{X} : g(x) \in A\}, \quad \text{where } A \subseteq \mathcal{Y}.$$
Then, we have
$$P(Y \in A) = P(g(X) \in A) = P(\{x \in \mathcal{X} : g(x) \in A\}) = P(X \in g^{-1}(A)).$$

The following theorem relates the cumulative distribution functions of $X$ and $Y = g(X)$.

Theorem 1.10. Let $X$ have cdf $F_X(x)$, let $Y = g(X)$, and let the domain and codomain of $g(x)$ be, respectively,
$$\mathcal{X} = \{x : f_X(x) > 0\} \quad \text{and} \quad \mathcal{Y} = \{y : y = g(x) \text{ for some } x \in \mathcal{X}\}.$$
(a) If $g$ is an increasing function on $\mathcal{X}$, then $F_Y(y) = F_X(g^{-1}(y))$ for $y \in \mathcal{Y}$.
(b) If $g$ is a decreasing function on $\mathcal{X}$, then $F_Y(y) = 1 - F_X(g^{-1}(y))$ for $y \in \mathcal{Y}$.

Proof. The cdf of $Y = g(X)$ can be written as
$$F_Y(y) = P(Y \leq y) = P(g(X) \leq y) = P(\{x \in \mathcal{X} : g(x) \leq y\}) = \int_{\{x \in \mathcal{X} : g(x) \leq y\}} f_X(x)\,dx.$$
(a) If $g$ is increasing, then
$$\{x \in \mathcal{X} : g(x) \leq y\} = \{x \in \mathcal{X} : g^{-1}(g(x)) \leq g^{-1}(y)\} = \{x \in \mathcal{X} : x \leq g^{-1}(y)\}.$$
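Theorem 1.10(a) can be sanity-checked by simulation (a sketch, not part of the notes): take $X \sim U[0, 1]$ and the increasing map $g(x) = x^2$ on $[0, 1]$, for which the theorem gives $F_Y(y) = F_X(\sqrt{y}) = \sqrt{y}$. The sample size, seed, and evaluation point $y_0 = 0.25$ are arbitrary choices.

```python
import random

random.seed(1)
# X ~ U[0,1]; g(x) = x^2 is increasing on [0,1], with g^{-1}(y) = sqrt(y),
# so Theorem 1.10(a) predicts F_Y(y) = sqrt(y) for y in [0,1].
ys = [random.random() ** 2 for _ in range(100_000)]

y0 = 0.25
empirical = sum(y <= y0 for y in ys) / len(ys)
print(empirical)  # close to sqrt(0.25) = 0.5
```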

So, we can write
$$F_Y(y) = \int_{\{x \in \mathcal{X} : g(x) \leq y\}} f_X(x)\,dx = \int_{\{x \in \mathcal{X} : x \leq g^{-1}(y)\}} f_X(x)\,dx = \int_{-\infty}^{g^{-1}(y)} f_X(x)\,dx = F_X(g^{-1}(y)).$$
(b) Now, if $g$ is decreasing, then
$$\{x \in \mathcal{X} : g(x) \leq y\} = \{x \in \mathcal{X} : g^{-1}(g(x)) \geq g^{-1}(y)\} = \{x \in \mathcal{X} : x \geq g^{-1}(y)\}.$$
So, we can write
$$F_Y(y) = \int_{\{x \in \mathcal{X} : g(x) \leq y\}} f_X(x)\,dx = \int_{\{x \in \mathcal{X} : x \geq g^{-1}(y)\}} f_X(x)\,dx = \int_{g^{-1}(y)}^{\infty} f_X(x)\,dx = 1 - F_X(g^{-1}(y)).$$

Example 1.16. Find the distribution of $Y = g(X) = -\log X$, where $X \sim U[0, 1]$.

The cdf of $X$ is
$$F_X(x) = \begin{cases} 0, & \text{for } x < 0; \\ x, & \text{for } 0 \leq x \leq 1; \\ 1, & \text{for } x > 1. \end{cases}$$
For $x \in [0, 1]$ the function $g(x) = -\log x$ takes values in $\mathcal{Y} = (0, \infty)$ and it is decreasing. For $y > 0$, $y = -\log x$ implies that $x = e^{-y}$, i.e., $g^{-1}(y) = e^{-y}$, and
$$F_Y(y) = 1 - F_X(g^{-1}(y)) = 1 - F_X(e^{-y}) = 1 - e^{-y}.$$
Hence we may write
$$F_Y(y) = (1 - e^{-y})\, I_{(0,\infty)}(y).$$
This is the exponential distribution function with $\lambda = 1$.
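A quick simulation of Example 1.16 (a sketch, not part of the notes): draw uniforms, apply $-\log$, and compare the empirical cdf with $1 - e^{-y}$. The sample size, seed, and evaluation point are arbitrary choices; $1 - U$ is used inside the log to avoid $\log 0$, which does not change the distribution.

```python
import random
from math import log, exp

random.seed(2)
# X ~ U(0,1); Y = -log X should be Exp(1), i.e. F_Y(y) = 1 - e^{-y}
ys = [-log(1.0 - random.random()) for _ in range(100_000)]

y0 = 1.0
emp_cdf = sum(y <= y0 for y in ys) / len(ys)
print(emp_cdf, 1 - exp(-y0))  # both close to 0.632
```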

For continuous rvs we have the following result.

Theorem 1.11. Let $X$ have pdf $f_X(x)$ and let $Y = g(X)$, where $g$ is a monotone function. Suppose that $f_X(x)$ is continuous on its support $\mathcal{X} = \{x : f_X(x) > 0\}$ and that $g^{-1}(y)$ has a continuous derivative on the support $\mathcal{Y} = \{y : y = g(x) \text{ for some } x \in \mathcal{X}\}$. Then the pdf of $Y$ is given by
$$f_Y(y) = f_X(g^{-1}(y)) \left| \frac{d}{dy} g^{-1}(y) \right| I_{\mathcal{Y}}(y).$$

Proof.
$$f_Y(y) = \frac{d}{dy} F_Y(y) = \begin{cases} \frac{d}{dy}\left\{F_X(g^{-1}(y))\right\}, & \text{if } g \text{ is increasing}; \\ \frac{d}{dy}\left\{1 - F_X(g^{-1}(y))\right\}, & \text{if } g \text{ is decreasing} \end{cases} = \begin{cases} f_X(g^{-1}(y)) \frac{d}{dy} g^{-1}(y), & \text{if } g \text{ is increasing}; \\ -f_X(g^{-1}(y)) \frac{d}{dy} g^{-1}(y), & \text{if } g \text{ is decreasing}, \end{cases}$$
which gives the thesis of the theorem.

Example 1.17. Suppose that $Z \sim N(0, 1)$. What is the distribution of $Y = Z^2$?

For $y > 0$, the cdf of $Y = Z^2$ is
$$F_Y(y) = P(Y \leq y) = P(Z^2 \leq y) = P(-\sqrt{y} \leq Z \leq \sqrt{y}) = F_Z(\sqrt{y}) - F_Z(-\sqrt{y}).$$
The pdf can now be obtained by differentiation:
$$f_Y(y) = \frac{d}{dy} F_Y(y) = \frac{d}{dy}\left(F_Z(\sqrt{y}) - F_Z(-\sqrt{y})\right) = \frac{1}{2\sqrt{y}} f_Z(\sqrt{y}) + \frac{1}{2\sqrt{y}} f_Z(-\sqrt{y}).$$
Now, for the standard normal distribution we have
$$f_Z(z) = \frac{1}{\sqrt{2\pi}} e^{-z^2/2}, \quad -\infty < z < \infty.$$

This gives
$$f_Y(y) = \frac{1}{2\sqrt{y}} \left[\frac{1}{\sqrt{2\pi}} e^{-(\sqrt{y})^2/2} + \frac{1}{\sqrt{2\pi}} e^{-(-\sqrt{y})^2/2}\right] = \frac{1}{\sqrt{y}} \frac{1}{\sqrt{2\pi}} e^{-y/2}, \quad 0 < y < \infty.$$
This is a well-known pdf, which we will use in statistical inference. It is the pdf of a chi-squared random variable with one degree of freedom, denoted by $\chi^2_1$. Note that $g(Z) = Z^2$ is not a monotone function, but the range of $Z$, $(-\infty, \infty)$, can be partitioned into subsets on which it is monotone.

Exercise 1.11. The pdf obtained in Example 1.17 is also the pdf of a Gamma rv for some specific values of its parameters. What are these values?

Exercise 1.12. Suppose that $Z \sim N(0, 1)$. Find the distribution of $Y = \mu + \sigma Z$ for constants $\mu$ and $\sigma$.

Exercise 1.13. Let $X$ be a random variable with moment generating function $M_X$.
(i) Show that the moment generating function of $Y = a + bX$, where $a$ and $b$ are constants, is given by $M_Y(t) = e^{ta} M_X(tb)$.
(ii) Derive the moment generating function of $Y \sim N(\mu, \sigma^2)$. Hint: First find $M_Z(t)$ for a standard normal rv $Z$.
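Example 1.17 can be checked by simulation (a sketch, not part of the notes): squaring standard normal draws should reproduce the known $\chi^2_1$ moments $E(Y) = 1$ and $\operatorname{var}(Y) = 2$. The sample size and seed are arbitrary choices.

```python
import random
from statistics import fmean, pvariance

random.seed(3)
# Z ~ N(0,1); Y = Z^2 should be chi-squared with one degree of freedom
ys = [random.gauss(0.0, 1.0) ** 2 for _ in range(200_000)]

m = fmean(ys)      # E[Y]   = 1 for chi^2_1
v = pvariance(ys)  # var(Y) = 2 for chi^2_1
print(m, v)
```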