Math 431 An Introduction to Probability. Final Exam Solutions




1. A continuous random variable $X$ has cdf
$$F(x) = \begin{cases} a & \text{for } x \le 0, \\ x^2 & \text{for } 0 < x < 1, \\ b & \text{for } x \ge 1. \end{cases}$$
(a) Determine the constants $a$ and $b$.
(b) Find the pdf of $X$. Be sure to give a formula for $f_X(x)$ that is valid for all $x$.
(c) Calculate the expected value of $X$.
(d) Calculate the standard deviation of $X$.

(a) We must have $a = \lim_{x \to -\infty} F(x) = 0$ and $b = \lim_{x \to +\infty} F(x) = 1$, since $F$ is a cdf.

(b) For all $x \ne 0, 1$, $F$ is differentiable at $x$, so
$$f(x) = F'(x) = \begin{cases} 2x & \text{if } 0 < x < 1, \\ 0 & \text{otherwise.} \end{cases}$$
(One could also use any $f$ that agrees with this definition for all $x \ne 0, 1$.)

(c) $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_0^1 2x^2\,dx = \frac{2}{3}$.

(d) $E(X^2) = \int_0^1 2x^3\,dx = \frac{1}{2}$, so $\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{18} \approx 0.0556$, and $\sigma(X) = \sqrt{\mathrm{Var}(X)} = \frac{1}{\sqrt{18}} \approx 0.2357$.
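
As a quick sanity check, the mean and standard deviation can be verified by simulation: since $F(x) = x^2$ on $(0,1)$, the inverse-transform method gives $X = \sqrt{U}$ for $U$ uniform on $(0,1)$. A minimal sketch in Python (assuming NumPy is available):

    import numpy as np

    rng = np.random.default_rng(0)
    u = rng.random(1_000_000)   # U ~ Uniform(0, 1)
    x = np.sqrt(u)              # inverse transform: F(x) = x^2  =>  X = sqrt(U)
    print(x.mean())             # close to 2/3 ~ 0.6667
    print(x.std())              # close to 1/sqrt(18) ~ 0.2357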

2. Suppose the number of children born in Ruritania each day is a binomial random variable with mean 1000 and variance 100. Assume that the number of children born on any particular day is independent of the numbers of children born on all other days. What is the probability that on at least one day this year, fewer than 975 children will be born in Ruritania?

We approximate the number of children born each day by a normal random variable. Letting $X$ denote the number of children born on some specified day, and $Z$ denote a standard normal, we have
$$P(X \ge 975) = P(X \ge 974.5) = P\left(\frac{X - 1000}{10} \ge \frac{974.5 - 1000}{10} = -2.55\right) = P(Z \le +2.55) \approx .9946.$$
Since each such random variable $X$ (one for each day) is assumed independent of the others, the probability that 975 or more children will be born on every day of this year is $.9946^{365} \approx .1386$, and the probability that, on at least one day this year, fewer than 975 children will be born is close to $1 - .1386 \approx 86\%$.
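
The per-day figure and the 365-day aggregation are easy to reproduce numerically; here is a minimal sketch (assuming SciPy is available), with the standard deviation $\sqrt{100} = 10$ already baked into the z-score 2.55 used above:

    from scipy.stats import norm

    p_day = norm.cdf(2.55)      # P(Z <= 2.55): chance of >= 975 births on one day
    p_every_day = p_day ** 365  # independence across the 365 days
    print(p_day)                # ~0.9946
    print(1 - p_every_day)      # ~0.86: some day has fewer than 975 births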

3. Suppose that the time until the next telemarketer calls my home is distributed as an exponential random variable. If the chance of my getting such a call during the next hour is .5, what is the chance that I'll get such a call during the next two hours?

First solution: Letting $\lambda$ denote the rate of this exponential random variable $X$, we have $.5 = F_X(1) = 1 - e^{-\lambda}$, so $\lambda = \ln 2$, and $F_X(2) = 1 - e^{-2\lambda} = 1 - (e^{-\lambda})^2 = 1 - (.5)^2 = .75$.

Second solution: We have $P(X \le 2) = P(X \le 1) + P(1 < X \le 2)$. The first term is .5, and the second can be written as $P(X > 1 \text{ and } X \le 2) = P(X > 1)\,P(X \le 2 \mid X > 1)$. The first of these factors equals $1 - P(X \le 1) = 1 - .5 = .5$, and the second (by virtue of the memorylessness of the exponential random variable) equals $P(X \le 1) = .5$. So $P(X \le 2) = .5 + (.5)(.5) = .75$.
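
Both solutions are easy to confirm by simulation. A minimal sketch (assuming NumPy): draw exponentials with rate $\lambda = \ln 2$ and count how many fall below 1 and below 2.

    import numpy as np

    rng = np.random.default_rng(1)
    lam = np.log(2.0)           # rate chosen so that P(X <= 1) = 0.5
    x = rng.exponential(scale=1.0 / lam, size=1_000_000)
    print((x <= 1).mean())      # ~0.50
    print((x <= 2).mean())      # ~0.75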

4. Suppose $X$ is uniform on the interval from 1 to 2. Compute the pdf and expected value of the random variable $Y = 1/X$.

We have
$$f_X(x) = \begin{cases} 1 & \text{if } 1 < x < 2, \\ 0 & \text{otherwise.} \end{cases}$$
Putting $g(t) = 1/t$ we have $Y = g(X)$; since $g$ is monotone on the range of $X$, with inverse function $g^{-1}(y) = 1/y$, Theorem 7.1 tells us that
$$f_Y(y) = \left|\frac{d}{dy}\,\frac{1}{y}\right| \cdot 1 = \begin{cases} 1/y^2 & \text{if } 1/2 < y < 1, \\ 0 & \text{otherwise.} \end{cases}$$
(Check: $\int f_Y(y)\,dy = \int_{1/2}^{1} 1/y^2\,dy = 1$.) We have
$$E(Y) = \int y\,f_Y(y)\,dy = \int_{1/2}^{1} 1/y\,dy = \ln 2.$$
(Check: $E(1/X) = \int \frac{1}{x}\,f_X(x)\,dx = \int_1^2 \frac{1}{x}\,dx = \ln 2$.)
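
The density $1/y^2$ and the mean $\ln 2 \approx 0.693$ can likewise be spot-checked by sampling (a sketch, assuming NumPy; the checkpoint $F_Y(0.75) = 2 - 1/0.75 = 2/3$ follows by integrating the pdf found above):

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(1.0, 2.0, size=1_000_000)  # X ~ Uniform(1, 2)
    y = 1.0 / x
    print(y.mean())             # ~ln 2 ~ 0.6931
    print((y <= 0.75).mean())   # ~2/3, matching F_Y(0.75) = 2 - 1/0.75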

5. I toss 3 fair coins, and then re-toss all the ones that come up tails. Let $X$ denote the number of coins that come up heads on the first toss, and let $Y$ denote the number of re-tossed coins that come up heads on the second toss. (Hence $0 \le X \le 3$ and $0 \le Y \le 3 - X$.)
(a) Determine the joint pmf of $X$ and $Y$, and use it to calculate $E(X + Y)$.
(b) Derive a formula for $E(Y \mid X)$ and use it to compute $E(X + Y)$ as $E(E(X + Y \mid X))$.

(a) $P(X = j, Y = k)$ equals
$$P(X = j)\,P(Y = k \mid X = j) = \binom{3}{j}\Big(\tfrac12\Big)^{j}\Big(\tfrac12\Big)^{3-j}\binom{3-j}{k}\Big(\tfrac12\Big)^{k}\Big(\tfrac12\Big)^{3-j-k} = \binom{3}{j}\binom{3-j}{k}\Big(\tfrac12\Big)^{6-j}$$
whenever $0 \le j \le 3$ and $0 \le k \le 3 - j$ (and equals zero otherwise), so the joint pmf $f = f_{X,Y}$ has the following values: $f(0,0) = \frac{1}{64}$, $f(0,1) = \frac{3}{64}$, $f(0,2) = \frac{3}{64}$, $f(0,3) = \frac{1}{64}$, $f(1,0) = \frac{3}{32}$, $f(1,1) = \frac{6}{32}$, $f(1,2) = \frac{3}{32}$, $f(2,0) = \frac{3}{16}$, $f(2,1) = \frac{3}{16}$, $f(3,0) = \frac{1}{8}$. Hence
$$E(X+Y) = 0 \cdot \tfrac{1}{64} + 1\Big(\tfrac{3}{64} + \tfrac{3}{32}\Big) + 2\Big(\tfrac{3}{64} + \tfrac{6}{32} + \tfrac{3}{16}\Big) + 3\Big(\tfrac{1}{64} + \tfrac{3}{32} + \tfrac{3}{16} + \tfrac{1}{8}\Big) = \tfrac{9}{4}.$$
(Alternatively: $X + Y$ is the total number of coins that come up heads on the first toss or, failing that, heads on the re-toss. Each of the three coins has a $\frac34$ chance of contributing to this total, so by linearity of expectation, the expected value of the total is $\frac34 + \frac34 + \frac34 = \frac94$.)

(b) For each fixed $x$ ($0 \le x \le 3$), when we condition on the event $X = x$, $Y$ is just a binomial random variable with $p = \frac12$ and $n = 3 - x$, and therefore with expected value $pn = \frac12(3 - x)$. Hence $E(Y \mid X) = \frac12(3 - X)$ and $E(X+Y) = E(E(X+Y \mid X)) = E(E(X \mid X) + E(Y \mid X)) = E\big(X + \tfrac12(3 - X)\big) = E\big(\tfrac12 X + \tfrac32\big) = \tfrac12 E(X) + \tfrac32 = \tfrac34 + \tfrac32 = \tfrac94$.
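
The joint pmf and the value $E(X+Y) = 9/4$ can be confirmed exactly by enumerating the ten cells (a small standard-library sketch):

    from math import comb

    total = 0.0
    mass = 0.0
    for j in range(4):              # heads on the first toss
        for k in range(4 - j):      # heads among the 3 - j re-tossed coins
            p = comb(3, j) * comb(3 - j, k) * 0.5 ** (6 - j)
            mass += p
            total += (j + k) * p
    print(mass)     # 1.0: the pmf sums to one
    print(total)    # 2.25 = 9/4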

6. Let the continuous random variables $X, Y$ have joint distribution
$$f_{X,Y}(x,y) = \begin{cases} 1/x & \text{if } 0 < y < x < 1, \\ 0 & \text{otherwise.} \end{cases}$$
(a) Compute $E(X)$ and $E(Y)$.
(b) Compute the conditional pdf of $Y$ given $X = x$, for all $0 < x < 1$.
(c) Compute $E(Y \mid X = x)$ for all $0 < x < 1$, and use this to check your answers to part (a).
(d) Compute $\mathrm{Cov}(X, Y)$.

(a) $E(X) = \int\!\!\int x\,f_{X,Y}(x,y)\,dy\,dx = \int_0^1\!\int_0^x x \cdot \frac1x\,dy\,dx = \int_0^1 x\,dx = \frac12$.
$E(Y) = \int\!\!\int y\,f_{X,Y}(x,y)\,dy\,dx = \int_0^1\!\int_0^x \frac{y}{x}\,dy\,dx = \int_0^1 \frac{x}{2}\,dx = \frac14$.

(b) The marginal pdf for $X$ is $f_X(x) = \int f_{X,Y}(x,y)\,dy$, which equals $\int_0^x \frac1x\,dy = 1$ for $0 < x < 1$ (and equals zero otherwise). That is, $X$ is uniform on the interval from 0 to 1. Hence for each $0 < x < 1$, the conditional pdf for $Y$ given $X = x$ is $f_{Y \mid X}(y \mid x) = f_{X,Y}(x,y)/f_X(x)$, which is $\frac1x$ for $0 < y < x$ and zero otherwise.

(c) $E(Y \mid X = x) = \int y\,f_{Y \mid X}(y \mid x)\,dy = \int_0^x \frac{y}{x}\,dy = \frac{x}{2}$. (We can also derive this answer from the fact that the conditional distribution of $Y$ given $X = x$ was shown in (b) to be uniform on the interval $(0, x)$, and from the fact that the expected value of a random variable that is uniform on an interval is just the midpoint of the interval.) To check the formula for $E(Y \mid X)$, we re-calculate $E(Y) = E(E(Y \mid X)) = E(\tfrac12 X) = \tfrac12 E(X)$, which agrees with $E(X) = \frac12$, $E(Y) = \frac14$.

(d) $E(XY) = \int_0^1\!\int_0^x xy \cdot \frac1x\,dy\,dx = \int_0^1 \frac{x^2}{2}\,dx = \frac16$, so $\mathrm{Cov}(X,Y) = E(XY) - E(X)E(Y) = \frac16 - \big(\tfrac12\big)\big(\tfrac14\big) = \frac1{24}$.
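
Since (b) shows $X$ is uniform on $(0,1)$ and $Y$ given $X = x$ is uniform on $(0, x)$, sampling the pair is immediate, and the moments above can be spot-checked (a sketch, assuming NumPy):

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.random(1_000_000)        # X ~ Uniform(0, 1), the marginal from (b)
    y = x * rng.random(1_000_000)    # given X = x, Y ~ Uniform(0, x)
    print(x.mean(), y.mean())        # ~0.5 and ~0.25
    print(np.cov(x, y)[0, 1])        # ~1/24 ~ 0.0417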

7. I repeatedly roll a fair die. If it comes up 6, I instantly win (and stop playing); if it comes up $k$, for any $k$ between 1 and 5, I wait $k$ minutes and then roll again. What is the expected elapsed time from when I start rolling until I win? (Note: If I win on my first roll, the elapsed time is zero.)

Let $T$ denote the (random) duration of the game, and let $X$ be the result of the first roll. Then
$$E(T) = E(E(T \mid X)) = \tfrac16\big(E(T \mid X=1) + E(T \mid X=2) + \cdots + E(T \mid X=5) + E(T \mid X=6)\big) = \tfrac16\big((E(T)+1) + (E(T)+2) + \cdots + (E(T)+5) + 0\big) = \tfrac16\big(5E(T) + 15\big),$$
so $6E(T) = 5E(T) + 15$ and $E(T) = 15$ (minutes).
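
The self-referential expectation argument is easy to corroborate by playing the game many times (a simulation sketch, assuming NumPy):

    import numpy as np

    rng = np.random.default_rng(4)
    times = []
    for _ in range(200_000):
        t = 0
        while True:
            roll = rng.integers(1, 7)   # a fair die roll, 1 through 6
            if roll == 6:
                break                   # instant win; no further waiting
            t += roll                   # wait `roll` minutes, then roll again
        times.append(t)
    print(np.mean(times))               # ~15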

8. Suppose that the number of students who enroll in Math 431 each fall is known (or believed) to be a random variable with expected value 90. It does not appear to be normal, so we cannot use the Central Limit Theorem.
(a) If we insist on being 90% certain that there will be no more than 35 students in each section, should UW continue to offer just three sections of Math 431 each fall, or would our level of aversion to the risk of overcrowding dictate that we create a fourth section?
(b) Repeat part (a) under the additional assumption that the variance in the enrollment level is known to be 20 (with no other additional assumptions).

(a) Three sections hold at most $3 \times 35 = 105$ students, so overcrowding means $X \ge 106$. Since we do not know the variance, the best we can do is use Markov's inequality: $P(X \ge 106) \le \frac{90}{106} \approx .85$; this is much bigger than .10, so to be on the safe side we should create a fourth section.

(b) Here we know the variance, but since normality is not assumed, we cannot use the Central Limit Theorem; we should use a two-sided or (better still) a one-sided Chebyshev inequality. The two-sided inequality gives us $P(X \ge 106) \le P(|X - 90| \ge 16) \le \frac{\sigma^2}{16^2} = \frac{20}{256} \approx 0.078 < .10$, so we're on the safe side with just three classes. (Or we could use the one-sided Chebyshev inequality: $P(X \ge 106) = P(X - 90 \ge 16) \le \frac{\sigma^2}{\sigma^2 + 16^2} = \frac{20}{20 + 256} \approx 0.072$.)
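
For reference, the three bounds used above take only a few lines to tabulate (an illustrative sketch; the variable names are ad hoc):

    mean, var = 90, 20
    dev = 106 - mean                            # deviation needed to overcrowd: 16
    markov = mean / 106                         # Markov: ~0.849
    chebyshev_two_sided = var / dev**2          # ~0.078
    chebyshev_one_sided = var / (var + dev**2)  # one-sided (Cantelli): ~0.072
    print(markov, chebyshev_two_sided, chebyshev_one_sided)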

9. (a) A coin is tossed 50 times. Use the Central Limit Theorem (applied to a binomial random variable) to estimate the probability that fewer than 20 of those tosses come up heads.
(b) A coin is tossed until it comes up heads for the 20th time. Use the Central Limit Theorem (applied to a negative binomial random variable) to estimate the probability that more than 50 tosses are needed.
(c) Compare your answers from parts (a) and (b). Why are they close but not exactly equal?

(a) The number of tosses that come up heads is a binomial random variable, which can be written as a sum of 50 independent indicator random variables. Since 50 is a reasonably large number, it makes sense to use the Central Limit Theorem, and to approximate $X$ (the number of heads in 50 tosses) by a Gaussian with mean $np = 50 \cdot \frac12 = 25$ and variance $np(1-p) = 50 \cdot \frac12 \cdot \frac12 = 12.5$. So
$$P(X < 20) = P(X \le 19.5) = P\left(\frac{X - 25}{\sqrt{12.5}} \le \frac{19.5 - 25}{\sqrt{12.5}} = -\frac{5.5}{\sqrt{12.5}}\right) = P\left(Z \le -\frac{5.5}{\sqrt{12.5}}\right),$$
where $Z$ is a standard Gaussian; using $\frac{5.5}{\sqrt{12.5}} \approx 1.56$, we have $P(X < 20) \approx \Phi(-1.56) \approx 6\%$.

(b) The waiting time until the 20th heads-toss is a negative binomial random variable, which can be written as a sum of 20 independent geometric random variables. 20 is a decent-sized number, so, as in part (a), we may apply the Central Limit Theorem and approximate $W$ (the number of tosses required to get heads 20 times) by a Gaussian with mean $\frac{r}{p} = \frac{20}{1/2} = 40$ and variance $\frac{r(1-p)}{p^2} = \frac{20(1/2)}{(1/2)^2} = 40$. So
$$P(W > 50) = P(W \ge 50.5) = P\left(\frac{W - 40}{\sqrt{40}} \ge \frac{50.5 - 40}{\sqrt{40}} = \frac{10.5}{\sqrt{40}}\right) \approx 1 - \Phi(1.66) \approx 5\%.$$

(c) Suppose the coin is tossed until it has been tossed at least 50 times and heads has come up at least 20 times. Then the outcomes for which $X < 20$ are precisely those for which $W > 50$, so the two events have equal probability. The reason we did not get the exact same answers in parts (a) and (b) is that the Central Limit Theorem is only an approximation, and when specific numbers are used there is likely to be some error.
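
The equality of the two events in (c), and the size of the two CLT errors, can be checked exactly (a sketch assuming SciPy, whose nbinom counts failures before the $r$th success, so $W > 50$ means more than 30 tails before the 20th head):

    from scipy.stats import binom, nbinom, norm

    p_a = binom.cdf(19, 50, 0.5)        # P(fewer than 20 heads in 50 tosses)
    p_b = 1 - nbinom.cdf(30, 20, 0.5)   # P(20th head arrives after toss 50)
    print(p_a, p_b)                     # identical, ~0.06
    print(norm.cdf(-1.56))              # CLT estimate from (a), ~0.059
    print(1 - norm.cdf(1.66))           # CLT estimate from (b), ~0.048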