Lecture 2 Binomial and Poisson Probability Distributions




Binomial Probability Distribution

- Consider a situation where there are only two possible outcomes (a Bernoulli trial).
  - Example: flipping a coin → head or tail.
  - Example: rolling a die → 6 or not 6 (i.e. 1, 2, 3, 4, 5).
- Label the possible outcomes by the variable k and find the probability P(k) for event k to occur.
- Since k can take on only 2 values, we define those values as k = 0 or k = 1.
  - Let P(k = 0) = q (remember 0 ≤ q ≤ 1).
  - Something must happen, so P(k = 0) + P(k = 1) = 1 → P(k = 1) = p = 1 − q.
  - Write the probability distribution P(k) as:
        P(k) = p^k q^(1−k)    (Bernoulli distribution)
  - Coin toss: define the probability of a head as P(1) → P(1) = 0.5.
  - Die rolling: define the probability of rolling a six as P(1) → P(1) = 1/6 and P(0) = 5/6 (not a six).

K.K. Gan, L2: Binomial and Poisson
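The Bernoulli distribution above can be sketched in a few lines of Python (the helper name `bernoulli_pmf` is illustrative, not from the lecture):

```python
def bernoulli_pmf(k, p):
    """P(k) = p^k * q^(1-k) for k in {0, 1}, with q = 1 - p."""
    q = 1.0 - p
    return p**k * q**(1 - k)

# Coin toss: P(1) = 0.5; die roll: P(1) = 1/6, P(0) = 5/6.
print(bernoulli_pmf(1, 0.5))   # probability of a head
print(bernoulli_pmf(1, 1/6))   # probability of a six
print(bernoulli_pmf(0, 1/6))   # probability of not-a-six
```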

- What is the mean (μ) of P(k)?
      μ = [Σ_{k=0}^{1} k P(k)] / [Σ_{k=0}^{1} P(k)] = (0·q + 1·p)/(q + p) = p
- What is the variance (σ²) of P(k)?
      σ² = Σ_{k=0}^{1} k² P(k) − μ² = 0²·P(0) + 1²·P(1) − μ² = p − p² = p(1 − p) = pq
- Suppose we have N trials (e.g. we flip a coin N times) → what is the probability to get m successes (= heads)?
- Consider tossing a coin twice. The possible outcomes are:
  - no heads:  P(m = 0) = q²
  - one head:  P(m = 1) = qp + pq = 2pq
    (toss 1 is a tail and toss 2 is a head, or toss 1 is a head and toss 2 is a tail: two terms because we don't care which of the tosses is a head)
  - two heads: P(m = 2) = p²
  - P(0) + P(1) + P(2) = q² + 2pq + p² = (q + p)² = 1
- We want the probability distribution function P(m, N, p) where:
      m = number of successes (e.g. number of heads in a coin toss)
      N = number of trials (e.g. number of coin tosses)
      p = probability of a success (e.g. 0.5 for a head)
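The two-toss probabilities above can be verified by brute-force enumeration of all outcomes (a small sketch; p = 1/3 is an arbitrary illustrative choice):

```python
from itertools import product

p = 1/3
q = 1 - p

# Enumerate all 2-toss outcomes (1 = head, 0 = tail) and
# accumulate the probability of each head count m.
probs = {0: 0.0, 1: 0.0, 2: 0.0}
for outcome in product([0, 1], repeat=2):
    prob = 1.0
    for k in outcome:
        prob *= p if k == 1 else q
    probs[sum(outcome)] += prob

print(probs)  # should match {0: q^2, 1: 2pq, 2: p^2}
```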

- If we look at the three choices for the coin-flip example, each term is of the form:
      C(N, m) p^m q^(N−m),   m = 0, 1, 2;   N = 2 for our example;   q = 1 − p always!
  - The coefficient C(N, m) takes into account the number of ways an outcome can occur, regardless of order.
  - For m = 0 or 2 there is only one way for the outcome (both tosses give heads or both give tails): C(2, 0) = C(2, 2) = 1.
  - For m = 1 (one head, two tosses) there are two ways this can occur: C(2, 1) = 2.
- Binomial coefficients: the number of ways of taking N things m at a time:
      C(N, m) = (N choose m) = N! / [m!(N − m)!]
  - 0! = 1! = 1,  2! = 1·2 = 2,  3! = 1·2·3 = 6,  m! = 1·2·3···m
  - The order of things is not important.
    - e.g. 2 tosses, one-head case (m = 1): we don't care whether toss 1 or toss 2 produced the head.
  - Unordered groups such as our example are called combinations.
  - Ordered arrangements are called permutations.
  - For N distinguishable objects, if we want to group them m at a time, the number of permutations is:
        P(N, m) = N! / (N − m)!
    - Example: if we tossed a coin twice (N = 2), there are two ways of getting one head (m = 1).
    - Example: suppose we have 3 balls, one white, one red, and one blue. The number of possible pairs, keeping track of order, is 6 (rw, wr, rb, br, wb, bw):
          P(3, 2) = 3! / (3 − 2)! = 6
      If order is not important (rw = wr), then the binomial formula gives the number of two-color combinations:
          C(3, 2) = 3! / [2!(3 − 2)!] = 3
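Python's standard library provides these counts directly (`math.comb` and `math.perm`, available since Python 3.8), so the ball example can be checked in a couple of lines:

```python
from math import comb, perm

# 3 balls (white, red, blue) taken 2 at a time:
print(perm(3, 2))  # ordered pairs: rw, wr, rb, br, wb, bw
print(comb(3, 2))  # unordered two-color combinations

# Binomial coefficients for N = 2 coin tosses:
print([comb(2, m) for m in (0, 1, 2)])
```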

- Binomial distribution: the probability of m successes out of N trials:
      P(m, N, p) = C(N, m) p^m q^(N−m) = [N! / (m!(N − m)!)] p^m q^(N−m)
  - p is the probability of a success and q = 1 − p is the probability of a failure.
- Consider a game where a player with a 0.33 batting average bats 4 times:
  - probability of 0/4 = (0.67)^4 = 20%
  - probability of 1/4 = [4!/(3!1!)] (0.33)^1 (0.67)^3 = 40%
  - probability of 2/4 = [4!/(2!2!)] (0.33)^2 (0.67)^2 = 29%
  - probability of 3/4 = [4!/(1!3!)] (0.33)^3 (0.67)^1 = 10%
  - probability of 4/4 = [4!/(0!4!)] (0.33)^4 (0.67)^0 = 1%
  - probability of getting at least one hit = 1 − P(0) = 0.8

[Figure: binomial distributions P(k, 7, 1/3), with expectation value μ = Np = 7 × 1/3 ≈ 2.33, and P(k, 50, 1/3), with expectation value μ = Np = 50 × 1/3 ≈ 16.67.]
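The batting example can be reproduced with a short binomial-pmf sketch (`binomial_pmf` is an illustrative name, not from the lecture):

```python
from math import comb

def binomial_pmf(m, N, p):
    """P(m, N, p) = C(N, m) * p^m * (1 - p)^(N - m)."""
    return comb(N, m) * p**m * (1 - p)**(N - m)

N, p = 4, 0.33
for m in range(N + 1):
    print(f"{m}/4: {binomial_pmf(m, N, p):.2f}")
print("at least one hit:", 1 - binomial_pmf(0, N, p))
```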

- To show that the binomial distribution is properly normalized, use the binomial theorem:
      (a + b)^N = Σ_{l=0}^{N} C(N, l) a^(N−l) b^l
      Σ_{m=0}^{N} P(m, N, p) = Σ_{m=0}^{N} C(N, m) p^m q^(N−m) = (p + q)^N = 1
  → the binomial distribution is properly normalized.
- Mean of the binomial distribution:
      μ = [Σ_{m=0}^{N} m P(m, N, p)] / [Σ_{m=0}^{N} P(m, N, p)] = Σ_{m=0}^{N} m C(N, m) p^m q^(N−m)
  - A cute way of evaluating the above sum is to take a derivative with respect to p (holding q fixed), using m p^m = p ∂(p^m)/∂p:
        μ = p ∂/∂p [ Σ_{m=0}^{N} C(N, m) p^m q^(N−m) ] = p ∂/∂p (p + q)^N = Np(p + q)^(N−1) = Np

- Variance of the binomial distribution (obtained using a similar trick):
      σ² = Σ_{m=0}^{N} (m − μ)² P(m, N, p) = Npq
- Example: suppose you observed m special events (successes) in a sample of N events.
  - measured probability ("efficiency") for a special event to occur: ε = m/N
  - error on the probability ("error on the efficiency"):
        σ_ε = σ_m / N = √(Npq) / N = √(Nε(1 − ε)) / N = √(ε(1 − ε)/N)
  → the sample size N should be as large as possible to reduce the uncertainty in the probability measurement.
- Example: suppose a baseball player's batting average is 0.33 (1 for 3 on average).
  - Consider the case where the player either gets a hit or makes an out (forget about walks here!):
        probability of a hit:  p = 0.33
        probability of no hit: q = 1 − p = 0.67
  - On average, how many hits does the player get in 100 at bats?
        μ = Np = 100 × 0.33 = 33 hits
  - What's the standard deviation for the number of hits in 100 at bats?
        σ = (Npq)^(1/2) = (100 × 0.33 × 0.67)^(1/2) ≈ 4.7 hits
  → we expect 33 ± 5 hits per 100 at bats.
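The error-on-efficiency formula and the baseball numbers can be checked directly (a sketch under the slide's assumptions; `efficiency_error` is an illustrative name):

```python
from math import sqrt

def efficiency_error(eps, N):
    """sigma_eps = sqrt(eps * (1 - eps) / N): binomial error on a measured probability."""
    return sqrt(eps * (1 - eps) / N)

# Baseball: p = 0.33, N = 100 at bats.
N, p = 100, 0.33
mu = N * p                     # expected number of hits
sigma = sqrt(N * p * (1 - p))  # spread in the number of hits
print(mu, sigma)
print(efficiency_error(0.33, 100))  # error on the measured batting average
```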

Poisson Probability Distribution

- A widely used discrete probability distribution.
- Consider the following conditions:
  - p is very small and approaches 0
    - example: a 100-sided die instead of a 6-sided die, p = 1/100 instead of 1/6
    - example: a 1000-sided die, p = 1/1000
  - N is very large and approaches ∞
    - example: throwing 100 or 1000 dice instead of 2 dice
  - the product Np is finite
- Example: radioactive decay.
  - Suppose we have 25 mg of an element → a very large number of atoms: N ≈ 10^20.
  - Suppose the lifetime of this element is τ = 10^12 years ≈ 5×10^19 seconds
    → the probability of a given nucleus decaying in one second is very small: p = 1/τ = 2×10^−20 /sec
    → Np = 2/sec is finite!
    → the number of counts in a time interval is a Poisson process.
- The Poisson distribution can be derived by taking the appropriate limits of the binomial distribution:
      P(m, N, p) = [N! / (m!(N − m)!)] p^m q^(N−m)
      N! / (N − m)! = N(N − 1)···(N − m + 1) ≈ N^m    (for N ≫ m)
      q^(N−m) = (1 − p)^(N−m) = 1 − p(N − m) + p²(N − m)(N − m − 1)/2! − …
              ≈ 1 − pN + (pN)²/2! − … ≈ e^(−pN)
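The limiting argument can be checked numerically: a binomial with large N, small p, and Np fixed is already very close to e^(−Np)(Np)^m/m! (a sketch; N = 1000 and the tolerance are illustrative choices):

```python
from math import comb, exp, factorial

def binomial_pmf(m, N, p):
    return comb(N, m) * p**m * (1 - p)**(N - m)

def poisson_pmf(m, mu):
    return exp(-mu) * mu**m / factorial(m)

# N large, p small, with Np = 2 held fixed:
N, p = 1000, 0.002
for m in range(5):
    print(m, binomial_pmf(m, N, p), poisson_pmf(m, N * p))
```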

→ P(m, N, p) ≈ (N^m / m!) p^m e^(−pN) = [(Np)^m / m!] e^(−Np)

- Let μ = Np:
      P(m, μ) = e^(−μ) μ^m / m!
  - m is always an integer ≥ 0; μ does not have to be an integer.
  - The Poisson distribution is normalized:
        Σ_{m=0}^{∞} e^(−μ) μ^m / m! = e^(−μ) Σ_{m=0}^{∞} μ^m / m! = e^(−μ) e^μ = 1
  - It is easy to show that:
        μ = Np = mean of a Poisson distribution
        σ² = Np = μ = variance of a Poisson distribution
    → the mean and variance are the same number!
- Radioactivity example with an average of 2 decays/sec:
  - What's the probability of zero decays in one second?
        P(0, 2) = e^(−2) 2^0 / 0! = e^(−2) = 0.135 → 13.5%
  - What's the probability of more than one decay in one second?
        P(>1, 2) = 1 − P(0, 2) − P(1, 2) = 1 − e^(−2) 2^0/0! − e^(−2) 2^1/1! = 1 − 3e^(−2) = 0.594 → 59.4%
  - Estimate the most probable number of decays/sec:
        ∂P(m, μ)/∂m = 0
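The radioactivity numbers follow directly from the Poisson pmf (the helper name `poisson_pmf` is illustrative):

```python
from math import exp, factorial

def poisson_pmf(m, mu):
    """P(m, mu) = e^(-mu) * mu^m / m!"""
    return exp(-mu) * mu**m / factorial(m)

mu = 2.0  # average of 2 decays/sec
p0 = poisson_pmf(0, mu)                                      # zero decays
p_more_than_1 = 1 - poisson_pmf(0, mu) - poisson_pmf(1, mu)  # more than one decay
print(p0, p_more_than_1)
```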

- To solve this problem it is convenient to maximize ln P(m, μ) instead of P(m, μ):
      ln P(m, μ) = ln(e^(−μ) μ^m / m!) = −μ + m ln μ − ln m!
- To handle the factorial when taking the derivative, we use Stirling's approximation:
      ln m! ≈ m ln m − m
      ∂/∂m ln P(m, μ) = ∂/∂m (−μ + m ln μ − ln m!)
                      ≈ ∂/∂m (−μ + m ln μ − m ln m + m)
                      = ln μ − ln m − m(1/m) + 1
                      = ln μ − ln m = 0
      → m* = μ
  → the most probable value of m is just the average of the distribution.
  → if you observed m events in an experiment, the error on m is σ = √μ ≈ √m.
  - This is only approximate since Stirling's approximation is only valid for large m.
  - Strictly speaking, m can only take on integer values while μ is not restricted to be an integer.
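The result m* = μ can be sanity-checked numerically by scanning the Poisson pmf for its peak (a sketch; μ = 5.5 is an arbitrary non-integer choice, for which the most probable integer m is ⌊μ⌋ = 5, i.e. m* = μ up to integer rounding):

```python
from math import exp, factorial

def poisson_pmf(m, mu):
    return exp(-mu) * mu**m / factorial(m)

mu = 5.5
pmf = [poisson_pmf(m, mu) for m in range(20)]
m_star = pmf.index(max(pmf))  # most probable integer m
print(m_star)
```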

Comparison of Binomial and Poisson distributions with mean μ = 1

[Figure: Poisson (μ = 1) compared with binomial (N = 3, p = 1/3), and Poisson (μ = 1) compared with binomial (N = 10, p = 0.1). Not much difference between them!]

For large N: the binomial distribution looks like a Poisson distribution of the same mean.

K.K. Gan, L2: Binomial and Poisson
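The comparison shown in the figure can be reproduced numerically by tabulating the three distributions side by side (a sketch using the binomial and Poisson pmfs defined earlier in the lecture):

```python
from math import comb, exp, factorial

def binomial_pmf(m, N, p):
    # math.comb(N, m) returns 0 for m > N, so the pmf vanishes there as it should
    return comb(N, m) * p**m * (1 - p)**(N - m) if m <= N else 0.0

def poisson_pmf(m, mu):
    return exp(-mu) * mu**m / factorial(m)

print(" m  binom(N=3, p=1/3)  binom(N=10, p=0.1)  poisson(mu=1)")
for m in range(6):
    print(f"{m:2d}  {binomial_pmf(m, 3, 1/3):.4f}"
          f"  {binomial_pmf(m, 10, 0.1):.4f}"
          f"  {poisson_pmf(m, 1.0):.4f}")
```

The N = 10 column already tracks the Poisson column much more closely than the N = 3 column does, illustrating the large-N statement above.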