Some notes on the Poisson distribution


Ernie Croot
October 2, 2008

1 Introduction

The Poisson distribution is one of the most important that we will encounter in this course; it is right up there with the normal distribution. Yet, because of time limitations, and because its true applications are quite technical for a course such as ours, we will not have the time to discuss them. The purpose of this note is not to address those applications, but rather to provide source material for some things I presented in my lectures that are not covered in your book. At the end of each section in what follows, I will let you know what material you should know for your next exams. You are, of course, also required to know all the material appearing in your text related to Poisson random variables.

2 Basic properties

Recall that $X$ is a Poisson random variable with parameter $\lambda$ if it takes on the values $0, 1, 2, \ldots$ according to the probability distribution
$$p(x) = P(X = x) = \frac{e^{-\lambda} \lambda^x}{x!}.$$
By convention, $0! = 1$.

Let us verify that this is indeed a legal probability density function (or mass function, as your book likes to say) by showing that the sum of $p(n)$ over all $n \geq 0$ is 1. We have
$$\sum_{n \geq 0} p(n) = \sum_{n \geq 0} \frac{e^{-\lambda} \lambda^n}{n!}.$$

This last sum is clearly the power series for $e^{\lambda}$, so
$$\sum_{n \geq 0} p(n) = e^{-\lambda} e^{\lambda} = 1,$$
as claimed.

A basic fact about the Poisson random variable $X$ (actually, two facts in one) is as follows:

Fact. $E(X) = V(X) = \lambda$.

Let us just prove here that $E(X) = \lambda$. As with showing that $p(x)$ is a legal pdf, this is a simple exercise in series manipulation. We begin by noting that
$$E(X) = \sum_{n \geq 0} n\,p(n) = \sum_{n \geq 1} n \cdot \frac{e^{-\lambda} \lambda^n}{n!}.$$
Noting that $n! = n \cdot (n-1) \cdot (n-2) \cdots 3 \cdot 2 \cdot 1$, we see that we may cancel the $n$ with the $n!$ to leave an $(n-1)!$ in the denominator. So, pulling out the $e^{-\lambda}$ and cancelling the $n$, we deduce that
$$E(X) = e^{-\lambda} \sum_{n \geq 1} \frac{\lambda^n}{(n-1)!} = \lambda e^{-\lambda} \sum_{n \geq 1} \frac{\lambda^{n-1}}{(n-1)!}.$$
(Note here that the term $n = 1$ has a $0!$ in the denominator.) Upon setting $m = n - 1$, we find that
$$E(X) = \lambda e^{-\lambda} \sum_{m \geq 0} \frac{\lambda^m}{m!} = \lambda e^{-\lambda} e^{\lambda} = \lambda,$$
and we are done.

Take home message. You should know everything from this section.
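These two series identities are easy to check numerically. The sketch below (not part of the original notes) truncates the infinite sums at $n = 100$, where the tail is negligible for small $\lambda$, and confirms that the pmf sums to 1 and that both the mean and the variance come out to $\lambda$:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**x / factorial(x)

lam = 2.0
N = 100  # truncation point; the neglected tail is far below machine precision

total = sum(poisson_pmf(n, lam) for n in range(N))            # should be ~1
mean = sum(n * poisson_pmf(n, lam) for n in range(N))         # should be ~lam
var = sum((n - mean)**2 * poisson_pmf(n, lam) for n in range(N))  # also ~lam

print(total)  # ≈ 1.0
print(mean)   # ≈ 2.0
print(var)    # ≈ 2.0
```

Changing `lam` to any other positive value gives the same agreement, which is a useful way to catch algebra mistakes in derivations like the one above.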

3 A sum property of Poisson random variables

Here we will show that if $Y$ and $Z$ are independent Poisson random variables with parameters $\lambda_1$ and $\lambda_2$, respectively, then $Y + Z$ has a Poisson distribution with parameter $\lambda_1 + \lambda_2$. Before we even begin showing this, let us recall what it means for two discrete random variables to be independent: we say that $Y$ and $Z$ are independent if
$$P(Y = y,\, Z = z) = P(Y = y \text{ and } Z = z) = P(Y = y)\,P(Z = z).$$
This is not really the official definition of independent random variables, but it is good enough for us.

Letting now $X = Y + Z$, we have that
$$P(X = x) = P(Y + Z = x) = \sum_{y + z = x} P(Y = y,\, Z = z) = \sum_{y + z = x} P(Y = y)\,P(Z = z)$$
$$= \sum_{y + z = x} \frac{e^{-\lambda_1} \lambda_1^y}{y!} \cdot \frac{e^{-\lambda_2} \lambda_2^z}{z!} = e^{-(\lambda_1 + \lambda_2)} \sum_{y + z = x} \frac{\lambda_1^y \lambda_2^z}{y!\,z!}.$$

Now we require a little trick to finish our proof. We begin by recognizing that the $y!\,z!$ is the denominator of
$$\frac{x!}{y!\,z!} = \binom{x}{y} = \binom{x}{z}.$$
So, if we insert a factor $x!$ into the sum, then we will have a binomial coefficient. Indeed, we have
$$P(X = x) = \frac{e^{-(\lambda_1 + \lambda_2)}}{x!} \sum_{y + z = x} \binom{x}{y} \lambda_1^y \lambda_2^z.$$

Notice that from the binomial theorem we deduce that this last sum is just $(\lambda_1 + \lambda_2)^x$. So,
$$P(X = x) = \frac{e^{-(\lambda_1 + \lambda_2)} (\lambda_1 + \lambda_2)^x}{x!},$$
and therefore $X$ has a Poisson distribution with parameter $\lambda_1 + \lambda_2$, as claimed.

Take home message. I expect you to know this fact about sums of Poisson random variables. BUT, I do not expect you to know the proof.

4 An application

Now we consider the following basic problem, which appeared on an exam I gave in years past. It assumes you know what a Poisson process is, which, although mentioned in your book, is not discussed in great detail. I will expect you to know enough to at least be able to solve this problem if it were put on an exam.

Problem. The number of people arriving at a fast food drive-thru (or drive-through) in any given 2-minute interval obeys a Poisson process with mean 1. Suppose that the waiters can only process 3 orders in any given 4-minute interval (also, assume the waiters can process an order instantaneously, but are limited in how many they can process in given time intervals, as indicated). What is the expected number of people that leave the drive-thru with their orders filled in any given 4-minute interval?

Solution. Let $X$ be the number of people that arrive in the 4-minute interval. We may write $X = Y + Z$, where $Y$ is the number arriving in the first two minutes of the interval, and $Z$ is the number arriving in the last two minutes. $Y$ and $Z$ are assumed to be independent, each Poisson with parameter 1, and thus $X$ is Poisson with parameter $1 + 1 = 2$. So,
$$P(X = x) = \frac{e^{-2} 2^x}{x!}.$$
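The sum property invoked here can be checked numerically before we go on. This short sketch (an aside, not part of the original notes) convolves two independent Poisson(1) pmfs directly, exactly as in the proof of Section 3, and compares the result against the Poisson(2) pmf:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**x / factorial(x)

# P(Y + Z = x) computed by the convolution sum over y + z = x,
# versus the Poisson(1 + 1) pmf claimed by the sum property.
for x in range(10):
    conv = sum(poisson_pmf(y, 1.0) * poisson_pmf(x - y, 1.0)
               for y in range(x + 1))
    direct = poisson_pmf(x, 2.0)
    assert abs(conv - direct) < 1e-12

print("convolution matches the Poisson(2) pmf")
```

The same check passes for any pair of rates, since nothing in the proof used $\lambda_1 = \lambda_2 = 1$.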

Now let $W$ denote the number of orders that are processed. Clearly,
$$X = 0 \implies W = 0, \qquad X = 1 \implies W = 1, \qquad X = 2 \implies W = 2, \qquad X \geq 3 \implies W = 3.$$
So, the probability distribution for $W$ is given by
$$P(W = w) = \begin{cases} e^{-2}, & \text{if } w = 0; \\ 2e^{-2}, & \text{if } w = 1; \\ 2e^{-2}, & \text{if } w = 2; \\ 1 - 5e^{-2}, & \text{if } w = 3. \end{cases}$$
(Here $P(W = 2) = P(X = 2) = e^{-2} 2^2/2! = 2e^{-2}$, and $P(W = 3) = P(X \geq 3) = 1 - e^{-2} - 2e^{-2} - 2e^{-2} = 1 - 5e^{-2}$.) We have now that
$$E(W) = 0 \cdot P(W = 0) + 1 \cdot P(W = 1) + 2 \cdot P(W = 2) + 3 \cdot P(W = 3) = 2e^{-2} + 4e^{-2} + 3 - 15e^{-2} = 3 - 9e^{-2}.$$

Take home message. I expect you to be able to solve a problem at this level on an exam.
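As a final sanity check (my addition, not part of the original notes), the expectation $E(W) = 3 - 9e^{-2}$ can be evaluated directly from the distribution of $W$:

```python
from math import exp

# P(W = 0), P(W = 1), P(W = 2) from the Poisson(2) pmf; P(W = 3) is the rest.
p = [exp(-2), 2 * exp(-2), 2 * exp(-2)]
p.append(1 - sum(p))  # P(W = 3) = 1 - 5 e^{-2}

ew = sum(w * pw for w, pw in enumerate(p))
print(ew)  # ≈ 1.782, i.e. 3 - 9 e^{-2}
```

So on average just under 1.8 of the 3 available order slots are filled in a 4-minute interval.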