WORKED EXAMPLES 1 TOTAL PROBABILITY AND BAYES THEOREM



EXAMPLE 1. A biased coin (with probability of obtaining a Head equal to p > 0) is tossed repeatedly and independently until the first head is observed. Compute the probability that the first head appears at an even numbered toss.

SOLUTION: Define: sample space Ω to consist of all possible infinite binary sequences of coin tosses; event H_1 - head on first toss; event E - first head on even numbered toss. We want P(E). Using the Theorem of Total Probability, and the partition of Ω given by {H_1, H_1'},

    P(E) = P(E | H_1) P(H_1) + P(E | H_1') P(H_1').

Now clearly P(E | H_1) = 0 (given H_1, that a head appears on the first toss, E cannot occur), and also P(E | H_1') can be seen to be given by

    P(E | H_1') = P(E') = 1 - P(E)

(given that a head does not appear on the first toss, the required conditional probability is merely the probability that the sequence concludes after a further odd number of tosses, that is, the probability of E'). Hence P(E) satisfies

    P(E) = 0 × p + (1 - P(E)) × (1 - p) = (1 - p)(1 - P(E)),

so that

    P(E) = (1 - p) / (2 - p).

Alternatively, consider the partition of E into E_1, E_2, ..., where E_k is the event that the first head occurs on the 2k-th toss. Then E = ∪_{k=1}^∞ E_k, and

    P(E) = P(∪_{k=1}^∞ E_k) = Σ_{k=1}^∞ P(E_k).

Now P(E_k) = (1 - p)^(2k-1) p (that is, 2k - 1 tails, then a head), so

    P(E) = Σ_{k=1}^∞ (1 - p)^(2k-1) p = [p / (1 - p)] Σ_{k=1}^∞ (1 - p)^(2k)
         = [p / (1 - p)] × (1 - p)^2 / (1 - (1 - p)^2) = (1 - p) / (2 - p).
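As a quick numerical check (not part of the original notes), the closed form P(E) = (1 - p)/(2 - p) can be compared against a Monte Carlo estimate. The Python sketch below assumes an arbitrary illustrative value p = 0.3.

    import random

    def first_head_even(p, trials=200_000, seed=1):
        """Estimate P(first head appears on an even-numbered toss) for a coin with P(Head) = p."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            toss = 1
            while rng.random() >= p:   # tails (probability 1 - p), keep tossing
                toss += 1
            if toss % 2 == 0:          # first head landed on an even-numbered toss
                hits += 1
        return hits / trials

    p = 0.3
    print(first_head_even(p))          # Monte Carlo estimate
    print((1 - p) / (2 - p))           # closed form from the example, approx. 0.4118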

EXAMPLE 2. Two players A and B are competing at a trivia quiz game involving a series of questions. On any individual question, the probabilities that A and B give the correct answer are α and β respectively, for all questions, with outcomes for different questions being independent. The players answer alternately, and the game finishes when a player wins by answering a question correctly. Compute the probability that A wins if (a) A answers the first question, (b) B answers the first question.

SOLUTION: Define: sample space Ω to consist of all possible infinite sequences of answers; event A - A answers the first question; event F - game ends after the first question; event W - A wins. We want P(W | A) and P(W | A'). Using the Theorem of Total Probability, and the partition given by {F, F'},

    P(W | A) = P(W | A ∩ F) P(F | A) + P(W | A ∩ F') P(F' | A).

Now, clearly

    P(F | A) = P[A answers first question correctly] = α,    P(F' | A) = 1 - α,

and P(W | A ∩ F) = 1, but P(W | A ∩ F') = P(W | A'), so that

    P(W | A) = (1 × α) + (P(W | A') × (1 - α)) = α + P(W | A') (1 - α).    (1)

Similarly,

    P(W | A') = P(W | A' ∩ F) P(F | A') + P(W | A' ∩ F') P(F' | A').

We have

    P(F | A') = P[B answers first question correctly] = β,    P(F' | A') = 1 - β,

but P(W | A' ∩ F) = 0. Finally, P(W | A' ∩ F') = P(W | A), so that

    P(W | A') = (0 × β) + (P(W | A) × (1 - β)) = P(W | A) (1 - β).    (2)

Solving (1) and (2) simultaneously gives, for (a) and (b),

    P(W | A) = α / (1 - (1 - α)(1 - β)),    P(W | A') = (1 - β) α / (1 - (1 - α)(1 - β)).

Note: recall that, for any events E_1 and E_2, we have P(E_1' | E_2) = 1 - P(E_1 | E_2), but not necessarily P(E_1 | E_2') = 1 - P(E_1 | E_2).
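A minimal simulation sketch (Python, with illustrative values α = 0.6 and β = 0.5 that are not taken from the notes) can be used to check both answers, assuming the players answer alternately as in the solution above.

    import random

    def prob_A_wins(alpha, beta, a_first, trials=200_000, seed=2):
        """Simulate the quiz: players answer alternately until someone answers correctly."""
        rng = random.Random(seed)
        wins = 0
        for _ in range(trials):
            a_turn = a_first
            while True:
                if a_turn and rng.random() < alpha:
                    wins += 1              # A answered correctly and wins
                    break
                if (not a_turn) and rng.random() < beta:
                    break                  # B answered correctly, A loses
                a_turn = not a_turn
        return wins / trials

    alpha, beta = 0.6, 0.5
    denom = 1 - (1 - alpha) * (1 - beta)
    print(prob_A_wins(alpha, beta, True), alpha / denom)                # case (a)
    print(prob_A_wins(alpha, beta, False), (1 - beta) * alpha / denom)  # case (b)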

EXAMPLE 3. Patients are recruited onto the two arms (0 - Control, 1 - Treatment) of a clinical trial. The probability that an adverse outcome occurs on the control arm is p_0, and is p_1 for the treatment arm. Patients are allocated alternately onto the two arms, and their outcomes are independent. What is the probability that the first adverse event occurs on the control arm?

SOLUTION: Define: sample space Ω to consist of all possible infinite sequences of patient outcomes; event E_1 - first patient (allocated onto the control arm) suffers an adverse outcome; event E_2 - first patient (allocated onto the control arm) does not suffer an adverse outcome, but the second patient (allocated onto the treatment arm) does suffer an adverse outcome; event E_0 - neither of the first two patients suffers an adverse outcome; event F - first adverse event occurs on the control arm. We want P(F). Now the events E_1, E_2 and E_0 partition Ω, so, by the Theorem of Total Probability,

    P(F) = P(F | E_1) P(E_1) + P(F | E_2) P(E_2) + P(F | E_0) P(E_0).

Now,

    P(E_1) = p_0,    P(E_2) = (1 - p_0) p_1,    P(E_0) = (1 - p_0)(1 - p_1),

and P(F | E_1) = 1, P(F | E_2) = 0. Finally, as after two non-adverse outcomes the allocation process effectively re-starts, we have P(F | E_0) = P(F). Hence

    P(F) = (1 × p_0) + (0 × (1 - p_0) p_1) + (P(F) × (1 - p_0)(1 - p_1))
         = p_0 + (1 - p_0)(1 - p_1) P(F),

which can be re-arranged to give

    P(F) = p_0 / (p_0 + p_1 - p_0 p_1).
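The closed form P(F) = p_0/(p_0 + p_1 - p_0 p_1) can be checked by simulation; the Python sketch below assumes illustrative values p_0 = 0.2 and p_1 = 0.3, which are not taken from the notes.

    import random

    def first_adverse_on_control(p0, p1, trials=200_000, seed=3):
        """Simulate alternating allocation (control, treatment, control, ...) and return
        the fraction of runs whose first adverse event occurs on the control arm."""
        rng = random.Random(seed)
        count = 0
        for _ in range(trials):
            while True:
                if rng.random() < p0:      # control-arm patient has an adverse outcome
                    count += 1
                    break
                if rng.random() < p1:      # treatment-arm patient has an adverse outcome
                    break
        return count / trials

    p0, p1 = 0.2, 0.3
    print(first_adverse_on_control(p0, p1))   # Monte Carlo estimate
    print(p0 / (p0 + p1 - p0 * p1))           # closed form from the example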

EXAMPLE 4. In a tennis match, with the score at deuce, the game is won by the first player who gets a clear lead of two points. If the probability that a given player wins a particular point is θ, and all points are played independently, what is the probability that this player eventually wins the game?

SOLUTION: Define: sample space Ω to consist of all possible infinite sequences of points; event W_i - nominated player wins the i-th point; event V_i - nominated player wins the game on the i-th point; event V - nominated player wins the game. We want P(V). The events {W_1, W_1'} partition Ω, and thus, by the Theorem of Total Probability,

    P(V) = P(V | W_1) P(W_1) + P(V | W_1') P(W_1').    (3)

Now P(W_1) = θ and P(W_1') = 1 - θ. To get P(V | W_1) and P(V | W_1'), we need to further condition on the result of the second point, and again use the Theorem:

    P(V | W_1) = P(V | W_1 ∩ W_2) P(W_2 | W_1) + P(V | W_1 ∩ W_2') P(W_2' | W_1),    (4)
    P(V | W_1') = P(V | W_1' ∩ W_2) P(W_2 | W_1') + P(V | W_1' ∩ W_2') P(W_2' | W_1'),

where

    P(V | W_1 ∩ W_2) = 1,       P(W_2 | W_1) = P(W_2) = θ          (given W_1 ∩ W_2, the game is over and the nominated player has won),
    P(V | W_1 ∩ W_2') = P(V),   P(W_2' | W_1) = P(W_2') = 1 - θ    (given W_1 ∩ W_2', the game is back at deuce),
    P(V | W_1' ∩ W_2) = P(V),   P(W_2 | W_1') = P(W_2) = θ         (given W_1' ∩ W_2, the game is back at deuce),
    P(V | W_1' ∩ W_2') = 0,     P(W_2' | W_1') = P(W_2') = 1 - θ   (given W_1' ∩ W_2', the game is over and the player has lost),

as the results of successive points are independent. Thus

    P(V | W_1) = (1 × θ) + (P(V) × (1 - θ)) = θ + (1 - θ) P(V),
    P(V | W_1') = (P(V) × θ) + (0 × (1 - θ)) = θ P(V).

Hence, combining (3) and (4), we have

    P(V) = (θ + (1 - θ) P(V)) θ + θ P(V) (1 - θ) = θ^2 + 2θ(1 - θ) P(V)
    ⟹ P(V) = θ^2 / (1 - 2θ(1 - θ)).

Alternatively, {V_i, i = 1, 2, ...} partition V, hence

    P(V) = Σ_{i=1}^∞ P(V_i).

Now P(V_i) = 0 if i is odd, as the game can never be completed after an odd number of points. For i = 2, P(V_2) = θ^2, and for i = 2k + 2 (k = 1, 2, 3, ...) we have

    P(V_i) = P(V_{2k+2}) = 2^k θ^k (1 - θ)^k θ^2

- the score must stand at deuce after 2k points and the game must not have been completed prior to this, indicating that there must have been k successive drawn pairs of points, each of which could be arranged win/lose or lose/win for the nominated player. Then that player must win the final two points. Hence

    P(V) = Σ_{k=0}^∞ P(V_{2k+2}) = θ^2 Σ_{k=0}^∞ {2θ(1 - θ)}^k = θ^2 / (1 - 2θ(1 - θ)),

as the term in the geometric series satisfies 2θ(1 - θ) < 1.
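Again, a short Monte Carlo sketch (Python, assuming the illustrative value θ = 0.6, not taken from the notes) confirms P(V) = θ^2 / (1 - 2θ(1 - θ)).

    import random

    def win_from_deuce(theta, trials=200_000, seed=4):
        """Simulate the game from deuce until one player leads by two points."""
        rng = random.Random(seed)
        wins = 0
        for _ in range(trials):
            lead = 0                                   # nominated player's lead in points
            while abs(lead) < 2:
                lead += 1 if rng.random() < theta else -1
            if lead == 2:
                wins += 1
        return wins / trials

    theta = 0.6
    print(win_from_deuce(theta))                       # Monte Carlo estimate
    print(theta ** 2 / (1 - 2 * theta * (1 - theta)))  # closed form from the example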

EXAMPLE 5. A coin for which P(Heads) = p is tossed until two successive Tails are obtained. Find the probability that the experiment is completed on the nth toss.

SOLUTION: Define: sample space Ω to consist of all possible infinite sequences of tosses; event E_1 - first toss is H; event E_2 - first two tosses are TH; event E_3 - first two tosses are TT; event F_n - experiment completed on the nth toss. We want P(F_n) for n = 2, 3, .... The events {E_1, E_2, E_3} partition Ω, and thus, by the Theorem of Total Probability,

    P(F_n) = P(F_n | E_1) P(E_1) + P(F_n | E_2) P(E_2) + P(F_n | E_3) P(E_3).    (5)

Now, for n = 2,

    P(F_2) = P(E_3) = (1 - p)^2,

and for n > 2,

    P(F_n | E_1) = P(F_{n-1}),    P(F_n | E_2) = P(F_{n-2}),    P(F_n | E_3) = 0,

as, given E_1, we need n - 1 further tosses that finish TT for F_n to occur, and given E_2, we need n - 2 further tosses that finish TT for F_n to occur, with all tosses independent. Hence, if p_n = P(F_n), then p_2 = (1 - p)^2; otherwise, from (5), p_n satisfies

    p_n = (p_{n-1} × p) + (p_{n-2} × (1 - p) p) = p p_{n-1} + p(1 - p) p_{n-2}.

To find p_n explicitly, try a solution of the form p_n = A λ_1^n + B λ_2^n, which gives

    A λ_1^n + B λ_2^n = p (A λ_1^(n-1) + B λ_2^(n-1)) + p(1 - p) (A λ_1^(n-2) + B λ_2^(n-2)).

First, collecting terms in λ_1 gives

    λ_1^n = p λ_1^(n-1) + p(1 - p) λ_1^(n-2)   ⟹   λ_1^2 - p λ_1 - p(1 - p) = 0,

indicating that λ_1 and λ_2 are given as the roots of this quadratic, that is,

    λ_1 = (p - √(p^2 + 4p(1 - p))) / 2,    λ_2 = (p + √(p^2 + 4p(1 - p))) / 2.

Furthermore, using the boundary conditions (the experiment cannot be completed on the first toss),

    n = 1 :  p_1 = 0 = A λ_1 + B λ_2,
    n = 2 :  p_2 = (1 - p)^2 = A λ_1^2 + B λ_2^2,

which give

    A = (1 - p)^2 / (λ_1 (λ_1 - λ_2)),    B = - (λ_1 / λ_2) A = - (1 - p)^2 / (λ_2 (λ_1 - λ_2)),

and hence

    p_n = (1 - p)^2 λ_1^n / (λ_1 (λ_1 - λ_2)) - (1 - p)^2 λ_2^n / (λ_2 (λ_1 - λ_2))
        = [(1 - p)^2 / √(p(4 - 3p))] (λ_2^(n-1) - λ_1^(n-1)),    n ≥ 2,

since λ_2 - λ_1 = √(p^2 + 4p(1 - p)) = √(p(4 - 3p)).
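The recursion and its closed-form solution can be checked numerically; the Python sketch below (assuming the illustrative value p = 0.4, not taken from the notes) compares the closed form for p_n against a simulation of the tossing experiment for small n.

    import math
    import random

    def pn_closed(p, n):
        """Closed-form P(F_n): two successive tails are first completed exactly on toss n."""
        d = math.sqrt(p * (4 - 3 * p))
        lam1, lam2 = (p - d) / 2, (p + d) / 2
        return (1 - p) ** 2 * (lam2 ** (n - 1) - lam1 ** (n - 1)) / d

    def pn_simulated(p, n, trials=200_000, seed=5):
        """Monte Carlo estimate of P(F_n) by tossing until TT occurs."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            toss, prev_tail = 0, False
            while True:
                toss += 1
                tail = rng.random() >= p      # tail with probability 1 - p
                if tail and prev_tail:
                    if toss == n:
                        hits += 1
                    break
                prev_tail = tail
        return hits / trials

    p = 0.4
    for n in range(2, 6):
        print(n, pn_closed(p, n), pn_simulated(p, n))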

EXAMPLE 6. Information is transmitted digitally as a binary sequence known as bits. However, noise on the channel corrupts the signal, in that a digit transmitted as 0 is received as 1 with probability 1 - α, with a similar random corruption when the digit 1 is transmitted. It has been observed that, across a large number of transmitted signals, the 0s and 1s are transmitted in the ratio 3 : 4. Given that the sequence 101 is received, what is the probability distribution over transmitted signals? Assume that the transmission and reception processes are independent.

SOLUTION: Define: sample space Ω to consist of all possible binary sequences of length three, that is,

    {000, 001, 010, 011, 100, 101, 110, 111},

a corresponding set of signal events {S_0, S_1, S_2, S_3, S_4, S_5, S_6, S_7}, and reception events {R_0, R_1, R_2, R_3, R_4, R_5, R_6, R_7}. We have observed event R_5, that 101 was received; we wish to compute P(S_i | R_5), for i = 0, 1, ..., 7. However, on the information given, we only have (or can compute) P(R_5 | S_i). First, using the Theorem of Total Probability, we compute P(R_5):

    P(R_5) = Σ_{i=0}^{7} P(R_5 | S_i) P(S_i).    (6)

Consider P(R_5 | S_0); if 000 is transmitted, the probability that 101 is received is

    (1 - α) × α × (1 - α) = α (1 - α)^2    (corruption, no corruption, corruption).

By complete evaluation we have

    P(R_5 | S_0) = α (1 - α)^2,    P(R_5 | S_1) = α^2 (1 - α),
    P(R_5 | S_2) = (1 - α)^3,      P(R_5 | S_3) = α (1 - α)^2,
    P(R_5 | S_4) = α^2 (1 - α),    P(R_5 | S_5) = α^3,
    P(R_5 | S_6) = α (1 - α)^2,    P(R_5 | S_7) = α^2 (1 - α).

Now, the prior information about digits transmitted is that the probability of transmitting a 1 is 4/7, so

    P(S_0) = (3/7)^3,          P(S_1) = (4/7)(3/7)^2,
    P(S_2) = (4/7)(3/7)^2,     P(S_3) = (4/7)^2 (3/7),
    P(S_4) = (4/7)(3/7)^2,     P(S_5) = (4/7)^2 (3/7),
    P(S_6) = (4/7)^2 (3/7),    P(S_7) = (4/7)^3,

and hence (6) can be computed as

    P(R_5) = [48 α^3 + 136 α^2 (1 - α) + 123 α (1 - α)^2 + 36 (1 - α)^3] / 343.

Finally, using Bayes Theorem, we have the probability distribution over {S_0, S_1, S_2, S_3, S_4, S_5, S_6, S_7} given by

    P(S_i | R_5) = P(R_5 | S_i) P(S_i) / P(R_5).

For example, the probability of correct reception is

    P(S_5 | R_5) = P(R_5 | S_5) P(S_5) / P(R_5)
                 = 48 α^3 / [48 α^3 + 136 α^2 (1 - α) + 123 α (1 - α)^2 + 36 (1 - α)^3].

Posterior probability as a function of α: [figure omitted]
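For a concrete value of α, the posterior distribution can be computed directly from the likelihoods and priors above; the Python sketch below assumes the illustrative value α = 0.9 (not taken from the notes) and enumerates all eight candidate signals.

    from itertools import product

    alpha = 0.9                        # assumed value: probability a digit is received uncorrupted
    p_one = 4 / 7                      # prior probability that a transmitted digit is 1
    received = (1, 0, 1)               # the observed sequence, event R_5

    joint = {}
    for s in product((0, 1), repeat=3):                      # candidate signals S_0, ..., S_7
        prior = 1.0
        likelihood = 1.0
        for bit, r in zip(s, received):
            prior *= p_one if bit == 1 else 1 - p_one
            likelihood *= alpha if bit == r else 1 - alpha   # no corruption vs corruption
        joint[s] = likelihood * prior                        # P(R_5 | S_i) P(S_i)

    p_r5 = sum(joint.values())                               # Theorem of Total Probability: P(R_5)
    for s in sorted(joint):
        print(s, joint[s] / p_r5)                            # Bayes Theorem: P(S_i | R_5)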

EXAMPLE 7. While watching a game of Champions League football in a cafe, you observe someone who is clearly supporting Manchester United in the game. What is the probability that they were actually born within 25 miles of Manchester? Assume that: the probability that a randomly selected person in a typical local bar environment is born within 25 miles of Manchester is 1/20; the chance that a person born within 25 miles of Manchester actually supports United is 7/10; and the probability that a person not born within 25 miles of Manchester supports United is 1/10.

SOLUTION: Define: event B - the person is born within 25 miles of Manchester; event U - the person supports United. We want P(B | U). By Bayes Theorem,

    P(B | U) = P(U | B) P(B) / P(U)
             = P(U | B) P(B) / [P(U | B) P(B) + P(U | B') P(B')]
             = (7/10 × 1/20) / (7/10 × 1/20 + 1/10 × 19/20)
             = 7/26 ≈ 0.269.
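The arithmetic can be reproduced exactly with rational arithmetic; the short Python sketch below mirrors the Bayes Theorem calculation above.

    from fractions import Fraction

    p_b = Fraction(1, 20)            # P(B): born within 25 miles of Manchester
    p_u_given_b = Fraction(7, 10)    # P(U | B)
    p_u_given_bc = Fraction(1, 10)   # P(U | B')

    p_u = p_u_given_b * p_b + p_u_given_bc * (1 - p_b)   # Theorem of Total Probability
    p_b_given_u = p_u_given_b * p_b / p_u                # Bayes Theorem
    print(p_b_given_u, float(p_b_given_u))               # 7/26, approximately 0.269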