Exponential Distribution



The exponential distribution models waiting times, e.g. the arrival of a customer at a service station or the breakdown of a component in some system.


Exponential Distribution

Definition: The exponential distribution with parameter $\lambda$ has density

$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}$$

The cdf:

$$F(x) = \int_{-\infty}^{x} f(y)\,dy = \begin{cases} 1 - e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}$$

Mean: $E(X) = 1/\lambda$.

Moment generating function:

$$\phi(t) = E[e^{tX}] = \frac{\lambda}{\lambda - t}, \qquad t < \lambda.$$

$$E(X^2) = \frac{d^2}{dt^2}\phi(t)\Big|_{t=0} = 2/\lambda^2, \qquad \mathrm{Var}(X) = E(X^2) - (E(X))^2 = 1/\lambda^2.$$
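As a quick sanity check of these formulas, the following Monte Carlo sketch (not part of the notes; $\lambda = 2$ is an arbitrary choice) draws exponential samples by the inverse-transform method $X = -\ln(U)/\lambda$ and compares the sample mean and variance with $1/\lambda$ and $1/\lambda^2$:

```python
import math
import random

# Monte Carlo sketch: sample X ~ Exp(lam) via inverse transform and compare
# the sample mean and variance with E(X) = 1/lam and Var(X) = 1/lam^2.
def sample_exponential(lam, n, seed=0):
    rng = random.Random(seed)
    # 1 - U is in (0, 1], so the log is always defined
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

lam = 2.0
xs = sample_exponential(lam, 200_000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(mean, var)  # close to 1/lam = 0.5 and 1/lam**2 = 0.25
```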

Properties

1. Memoryless: $P(X > s + t \mid X > t) = P(X > s)$.

$$P(X > s+t \mid X > t) = \frac{P(X > s+t,\, X > t)}{P(X > t)} = \frac{P(X > s+t)}{P(X > t)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda t}} = e^{-\lambda s} = P(X > s).$$

Example: Suppose that the amount of time one spends in a bank is exponentially distributed with mean 10 minutes, so $\lambda = 1/10$. What is the probability that a customer will spend more than 15 minutes in the bank? What is the probability that a customer will spend more than 15 minutes in the bank given that he is still in the bank after 10 minutes?

Solution: $P(X > 15) = e^{-15\lambda} = e^{-3/2} \approx 0.223$.

By memorylessness, $P(X > 15 \mid X > 10) = P(X > 5) = e^{-1/2} \approx 0.607$.
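The bank example can be checked numerically; the sketch below evaluates the survival function directly and confirms that conditioning on $X > 10$ reduces to $P(X > 5)$:

```python
import math

# Numeric check of the bank example (lam = 1/10): memorylessness says
# P(X > 15 | X > 10) = P(X > 5).
lam = 1 / 10

def tail(x, lam):
    """Survival function P(X > x) of an Exp(lam) random variable."""
    return math.exp(-lam * x)

p_15 = tail(15, lam)                  # e^{-3/2}, about 0.223
p_15_given_10 = p_15 / tail(10, lam)  # P(X > 15) / P(X > 10)
print(p_15, p_15_given_10, tail(5, lam))  # last two values coincide
```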

Failure rate (hazard rate) function $r(t)$:

$$r(t) = \frac{f(t)}{1 - F(t)}, \qquad P(X \in (t, t+dt) \mid X > t) \approx r(t)\,dt.$$

For the exponential distribution, $r(t) = \lambda$ for all $t > 0$.

The failure rate function uniquely determines $F(t)$:

$$F(t) = 1 - e^{-\int_0^t r(s)\,ds}.$$
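A small sketch of the constant-hazard property ($\lambda = 0.7$ is an arbitrary illustration): computing $f(t)/(1 - F(t))$ at different times always returns $\lambda$.

```python
import math

# For an Exp(lam) variable, r(t) = f(t) / (1 - F(t)) is constant in t.
lam = 0.7

def hazard(t, lam):
    density = lam * math.exp(-lam * t)  # f(t)
    survival = math.exp(-lam * t)       # 1 - F(t)
    return density / survival

print(hazard(0.5, lam), hazard(3.0, lam))  # both equal lam = 0.7
```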

2. If $X_i$, $i = 1, 2, \ldots, n$, are iid exponential RVs with mean $1/\lambda$, the pdf of $\sum_{i=1}^n X_i$ is

$$f_{X_1 + X_2 + \cdots + X_n}(t) = \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!},$$

the gamma distribution with parameters $n$ and $\lambda$.

3. If $X_1$ and $X_2$ are independent exponential RVs with means $1/\lambda_1$, $1/\lambda_2$, then

$$P(X_1 < X_2) = \frac{\lambda_1}{\lambda_1 + \lambda_2}.$$

4. If $X_i$, $i = 1, 2, \ldots, n$, are independent exponential RVs with rates $\mu_i$, let $Z = \min(X_1, \ldots, X_n)$ and $Y = \max(X_1, \ldots, X_n)$. Find the distributions of $Z$ and $Y$.

$Z$ is an exponential RV with rate $\sum_{i=1}^n \mu_i$:

$$P(Z > x) = P(\min(X_1,\ldots,X_n) > x) = P(X_1 > x, X_2 > x, \ldots, X_n > x) = P(X_1 > x)P(X_2 > x)\cdots P(X_n > x) = \prod_{i=1}^n e^{-\mu_i x} = e^{-\left(\sum_{i=1}^n \mu_i\right) x}.$$

For the maximum,

$$F_Y(x) = P(Y \le x) = \prod_{i=1}^n \left(1 - e^{-\mu_i x}\right).$$
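Property 4 is easy to see in simulation. The sketch below (the rates $\mu_i$ are my own illustrative choice) checks that the minimum of independent exponentials has mean $1/\sum_i \mu_i$:

```python
import random

# The minimum of independent exponentials with rates mu_1, ..., mu_n is
# exponential with rate sum(mu), hence has mean 1 / sum(mu).
rng = random.Random(1)
mus = [0.5, 1.0, 2.5]
n_trials = 100_000

mins = [min(rng.expovariate(mu) for mu in mus) for _ in range(n_trials)]
est_mean = sum(mins) / n_trials
print(est_mean, 1 / sum(mus))  # both close to 0.25
```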

Poisson Process

Counting process: A stochastic process $\{N(t), t \ge 0\}$ is a counting process if $N(t)$ represents the total number of events that have occurred up to time $t$. In particular, $N(t) \ge 0$, $N(t)$ is integer-valued, and $N(t)$ is nondecreasing in $t$.

Independent increments: the numbers of events occurring in disjoint time intervals are independent.

Stationary increments: the distribution of the number of events occurring in a time interval depends only on the length of the interval, not on its position.

A counting process $\{N(t), t \ge 0\}$ is a Poisson process with rate $\lambda > 0$ if

1. $N(0) = 0$.
2. The process has independent increments.
3. The process has stationary increments, and $N(t+s) - N(s)$ follows a Poisson distribution with parameter $\lambda t$:

$$P(N(t+s) - N(s) = n) = e^{-\lambda t} \frac{(\lambda t)^n}{n!}, \qquad n = 0, 1, \ldots$$

Note: $E[N(t+s) - N(s)] = \lambda t$, so $E[N(t)] = E[N(t+0) - N(0)] = \lambda t$.
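As a simulation sketch (anticipating Proposition 5.1, which says the interarrival gaps are iid exponential), we can build $N(t)$ from exponential gaps and check $E[N(t)] = \lambda t$ empirically; the values $\lambda = 3$, $t = 2$ are arbitrary:

```python
import random

# Build N(t) from iid Exp(lam) interarrival gaps and check E[N(t)] = lam * t.
rng = random.Random(2)
lam, t = 3.0, 2.0

def count_events(lam, t, rng):
    """Number of arrivals in [0, t] when interarrival gaps are Exp(lam)."""
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

counts = [count_events(lam, t, rng) for _ in range(50_000)]
avg_count = sum(counts) / len(counts)
print(avg_count, lam * t)  # both close to 6
```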

Interarrival and Waiting Times

Define $T_n$ as the elapsed time between the $(n-1)$st and the $n$th event; $\{T_n, n = 1, 2, \ldots\}$ is the sequence of interarrival times.

Proposition 5.1: $T_n$, $n = 1, 2, \ldots$ are independent, identically distributed exponential random variables with mean $1/\lambda$.

Define $S_n$ as the waiting time for the $n$th event, i.e., the arrival time of the $n$th event:

$$S_n = \sum_{i=1}^n T_i.$$

Distribution of $S_n$:

$$f_{S_n}(t) = \lambda e^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!},$$

the gamma distribution with parameters $n$ and $\lambda$. Also $E(S_n) = \sum_{i=1}^n E(T_i) = n/\lambda$.

Example: Suppose that people immigrate into a territory at a Poisson rate $\lambda = 1$ per day. (a) What is the expected time until the tenth immigrant arrives? (b) What is the probability that the elapsed time between the tenth and the eleventh arrivals exceeds 2 days?

Solution: The time until the 10th immigrant arrives is $S_{10}$, so $E(S_{10}) = 10/\lambda = 10$ days.

$$P(T_{11} > 2) = e^{-2\lambda} = e^{-2} \approx 0.135.$$
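The two answers amount to one division and one exponential, which the following sketch evaluates:

```python
import math

# Numeric check of the immigration example, with lam = 1 per day.
lam = 1.0
expected_wait = 10 / lam            # E(S_10) = n / lam = 10 days
p_gap_over_2 = math.exp(-2 * lam)   # P(T_11 > 2) = e^{-2 lam}
print(expected_wait, p_gap_over_2)  # 10.0 and about 0.135
```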

Further Properties

Consider a Poisson process $\{N(t), t \ge 0\}$ with rate $\lambda$. Suppose each event belongs to one of two types, I and II, that the type of an event is independent of everything else, and that the probability of being type I is $p$. Examples: female vs. male customers, good emails vs. spam.

Let $N_1(t)$ be the number of type I events up to time $t$, and let $N_2(t)$ be the number of type II events up to time $t$, so that $N(t) = N_1(t) + N_2(t)$.

Proposition 5.2: $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are both Poisson processes, with respective rates $\lambda p$ and $\lambda(1-p)$. Furthermore, the two processes are independent.

Example: If immigrants to area A arrive at a Poisson rate of 10 per week, and if each immigrant is of English descent with probability 1/12, then what is the probability that no people of English descent will immigrate to area A during the month of February?

Solution: The number of immigrants of English descent arriving up to time $t$ is $N_1(t)$, a Poisson process with rate $\lambda p = 10/12$ per week. Taking February to be 4 weeks,

$$P(N_1(4) = 0) = e^{-(10/12)\cdot 4} = e^{-10/3} \approx 0.036.$$
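Thinning can also be checked by simulation: classify each arrival of a rate-$\lambda$ process as type I with an independent coin flip of probability $p$, and the type-I count over $[0, t]$ should be Poisson with mean $\lambda p t$. The numbers below follow the English-descent example ($\lambda = 10$ per week, $p = 1/12$, $t = 4$ weeks):

```python
import math
import random

# Thinning sketch: the type-I count over [0, t] should be Poisson(lam * p * t),
# so the fraction of runs with zero type-I events should approach e^{-lam*p*t}.
rng = random.Random(3)
lam, p, t = 10.0, 1 / 12, 4.0

def type1_count(lam, p, t, rng):
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        if rng.random() < p:  # independent coin flip for each event
            n += 1
        s += rng.expovariate(lam)
    return n

trials = 50_000
frac_zero = sum(type1_count(lam, p, t, rng) == 0 for _ in range(trials)) / trials
print(frac_zero, math.exp(-lam * p * t))  # both close to e^{-10/3} ~ 0.036
```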

Conversely, suppose $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are independent Poisson processes with respective rates $\lambda_1$ and $\lambda_2$. Then $N(t) = N_1(t) + N_2(t)$ is a Poisson process with rate $\lambda = \lambda_1 + \lambda_2$. For any event of unknown type, independently of everything else, the probability of being type I is $p = \frac{\lambda_1}{\lambda_1 + \lambda_2}$, and of being type II is $1 - p$.

Example: On a road, cars pass according to a Poisson process with rate 5 per minute, and trucks pass according to a Poisson process with rate 1 per minute. The two processes are independent. If 10 vehicles pass by in 3 minutes, what is the probability that 2 of them are trucks?

Solution: Each vehicle is independently a car with probability $\frac{5}{5+1} = \frac{5}{6}$ and a truck with probability $\frac{1}{6}$. The probability that 2 out of 10 vehicles are trucks is given by the binomial distribution:

$$\binom{10}{2}\left(\frac{1}{6}\right)^2\left(\frac{5}{6}\right)^8.$$
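The truck example reduces to a single binomial evaluation:

```python
from math import comb

# Given 10 vehicles, each is independently a truck with probability 1/6;
# the number of trucks is Binomial(10, 1/6).
p_truck = 1 / 6
prob = comb(10, 2) * p_truck**2 * (1 - p_truck)**8
print(prob)  # about 0.29
```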

Conditional Distribution of Arrival Times

Consider a Poisson process $\{N(t), t \ge 0\}$ with rate $\lambda$. Given that exactly one event occurred up to time $t$, what is the conditional distribution of $T_1$? Under this condition, $T_1$ is uniformly distributed on $[0, t]$.

Proof:

$$P(T_1 < s \mid N(t) = 1) = \frac{P(T_1 < s,\, N(t) = 1)}{P(N(t) = 1)} = \frac{P(N(s) = 1,\, N(t) - N(s) = 0)}{P(N(t) = 1)} = \frac{P(N(s) = 1)\,P(N(t) - N(s) = 0)}{P(N(t) = 1)} = \frac{\lambda s e^{-\lambda s} \cdot e^{-\lambda(t-s)}}{\lambda t e^{-\lambda t}} = \frac{s}{t}.$$

Note: $s/t$ is the cdf of a uniform distribution on $[0, t]$.

If $N(t) = n$, what is the joint conditional distribution of the arrival times $S_1, S_2, \ldots, S_n$? It is the distribution of the order statistics of $n$ independent random variables uniformly distributed on $[0, t]$.

Let $Y_1, Y_2, \ldots, Y_n$ be $n$ RVs. Then $Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}$ are the order statistics of $Y_1, \ldots, Y_n$ if $Y_{(k)}$ is the $k$th smallest value among them. If the $Y_i$, $i = 1, \ldots, n$, are iid continuous RVs with pdf $f$, then the joint density of the order statistics is

$$f_{Y_{(1)}, Y_{(2)}, \ldots, Y_{(n)}}(y_1, y_2, \ldots, y_n) = \begin{cases} n! \prod_{i=1}^n f(y_i), & y_1 < y_2 < \cdots < y_n \\ 0, & \text{otherwise.} \end{cases}$$

We can show that

$$f(s_1, s_2, \ldots, s_n \mid N(t) = n) = \frac{n!}{t^n}, \qquad 0 < s_1 < s_2 < \cdots < s_n < t.$$

Proof:

$$f(s_1, s_2, \ldots, s_n \mid N(t) = n) = \frac{f(s_1, s_2, \ldots, s_n, n)}{P(N(t) = n)} = \frac{\lambda e^{-\lambda s_1}\,\lambda e^{-\lambda(s_2 - s_1)} \cdots \lambda e^{-\lambda(s_n - s_{n-1})}\, e^{-\lambda(t - s_n)}}{e^{-\lambda t}(\lambda t)^n / n!} = \frac{n!}{t^n}, \qquad 0 < s_1 < \cdots < s_n < t.$$

This matches the joint density of $n$ independent uniformly distributed RVs on $[0, t]$ in sorted order: for such $Y_1, \ldots, Y_n$, $f(y_1, y_2, \ldots, y_n) = 1/t^n$.

Proposition 5.4: Given $S_n = t$, the arrival times $S_1, S_2, \ldots, S_{n-1}$ have the distribution of the order statistics of a set of $n-1$ independent uniform $(0, t)$ random variables.
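A simulation sketch of this order-statistics characterization (parameters $\lambda = 2$, $t = 1$, $n = 3$ are my own choice): conditional on $N(t) = 3$, the first arrival time should behave like the minimum of 3 iid Uniform$(0, t)$ draws, whose mean is $t/(n+1) = 1/4$.

```python
import random

# Conditional on N(t) = n, arrival times look like sorted iid Uniform(0, t)
# draws; compare E(S_1 | N(t) = 3) from the process with t / (n + 1).
rng = random.Random(4)
lam, t, n = 2.0, 1.0, 3

def arrivals(lam, t, rng):
    times, s = [], rng.expovariate(lam)
    while s <= t:
        times.append(s)
        s += rng.expovariate(lam)
    return times

first = []
while len(first) < 20_000:
    a = arrivals(lam, t, rng)
    if len(a) == n:          # condition on exactly n events in [0, t]
        first.append(a[0])

avg_first = sum(first) / len(first)
print(avg_first, t / (n + 1))  # both close to 0.25
```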

Generalization of the Poisson Process

Nonhomogeneous Poisson process: The counting process $\{N(t), t \ge 0\}$ is said to be a nonhomogeneous Poisson process with intensity function $\lambda(t)$, $t \ge 0$, if

1. $N(0) = 0$.
2. The process has independent increments.
3. The distribution of $N(t+s) - N(t)$ is Poisson with mean $m(t+s) - m(t)$, where

$$m(t) = \int_0^t \lambda(\tau)\,d\tau.$$

We call $m(t)$ the mean value function. The Poisson process is the special case in which $\lambda(t) = \lambda$, a constant.
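The mean value function is just an integral of the intensity; the sketch below (the intensity $\lambda(t) = 2t$ is my own example, for which $m(t) = t^2$) approximates it numerically and evaluates the mean of an increment:

```python
# Mean value function sketch: with intensity lam(t) = 2t, m(t) = t^2, so
# N(3) - N(1) is Poisson with mean m(3) - m(1) = 9 - 1 = 8.
def mean_value(intensity, t, steps=100_000):
    """Trapezoid-rule approximation of m(t) = integral of lam over [0, t]."""
    h = t / steps
    total = 0.5 * (intensity(0.0) + intensity(t))
    total += sum(intensity(i * h) for i in range(1, steps))
    return total * h

intensity = lambda u: 2.0 * u
increment_mean = mean_value(intensity, 3.0) - mean_value(intensity, 1.0)
print(increment_mean)  # close to 8.0 (trapezoid rule is exact for linear lam)
```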

Compound Poisson process: A stochastic process $\{X(t), t \ge 0\}$ is said to be a compound Poisson process if it can be represented as

$$X(t) = \sum_{i=1}^{N(t)} Y_i, \qquad t \ge 0,$$

where $\{N(t), t \ge 0\}$ is a Poisson process and $\{Y_i, i \ge 1\}$ is a family of independent and identically distributed random variables that are also independent of $\{N(t), t \ge 0\}$. The random variable $X(t)$ is said to be a compound Poisson random variable.

Example: Suppose customers leave a supermarket in accordance with a Poisson process. If $Y_i$, the amount spent by the $i$th customer, $i = 1, 2, \ldots$, are independent and identically distributed, then $X(t) = \sum_{i=1}^{N(t)} Y_i$, the total amount of money spent by customers by time $t$, is a compound Poisson process.

Find $E[X(t)]$ and $\mathrm{Var}[X(t)]$.

$$E[X(t)] = \lambda t\, E(Y_1), \qquad \mathrm{Var}[X(t)] = \lambda t \left(\mathrm{Var}(Y_1) + E^2(Y_1)\right).$$

Proof:

$$E(X(t) \mid N(t) = n) = E\Big(\sum_{i=1}^{N(t)} Y_i \,\Big|\, N(t) = n\Big) = E\Big(\sum_{i=1}^{n} Y_i \,\Big|\, N(t) = n\Big) = E\Big(\sum_{i=1}^{n} Y_i\Big) = n E(Y_1).$$

Then

$$E(X(t)) = E_{N(t)}\big[E(X(t) \mid N(t))\big] = \sum_{n=1}^{\infty} P(N(t) = n)\, E(X(t) \mid N(t) = n) = \sum_{n=1}^{\infty} P(N(t) = n)\, n E(Y_1) = E(Y_1) \sum_{n=1}^{\infty} n P(N(t) = n) = E(Y_1) E(N(t)) = \lambda t\, E(Y_1).$$

N(t) V ar(x(t) N(t) = n) = V ar( = V ar( = V ar( i=1 Y i N(t) = n) n Y i N(t) = n) i=1 n Y i ) i=1 = nv ar(y 1 ) V ar(x(t) N(t) = n) = E(X 2 (t) N(t) = n) (E(X(t) N(t) = n)) 2 E(X 2 (t) N(t) = n) = V ar(x(t) N(t) = n) + (E(X(t) N(t) = n)) 2 = nv ar(y 1 ) + n 2 E 2 (Y 1 ) 18

V ar(x(t)) = E(X 2 (t)) (E(X(t))) 2 = P(N(t) = n)e(x 2 (t) N(t) = n) (E(X(t))) 2 = n=1 P(N(t) = n)(nv ar(y 1 ) + n 2 E 2 (Y 1 )) (λte(y 1 )) 2 n=1 = V ar(y 1 )E(N(t)) + E 2 (Y 1 )E(N 2 (t)) (λte(y 1 )) 2 = λtv ar(y 1 ) + λte 2 (Y 1 ) = λt(v ar(y 1 ) + E 2 (Y 1 )) = λte(y 2 1 ) 19