Math 151. Rumbos, Fall 2013. Solutions to Review Problems for Exam 3


1. Suppose that a book with n pages contains on average λ misprints per page. What is the probability that there will be at least m pages which contain more than k misprints?

Solution: Let Y denote the number of misprints on a single page. Then, we may assume that Y follows a Poisson(λ) distribution:

Pr[Y = r] = \frac{λ^r}{r!} e^{-λ}, for r = 0, 1, 2, ...

Thus, the probability that a given page contains more than k misprints is

p = \sum_{r=k+1}^{∞} Pr[Y = r] = \sum_{r=k+1}^{∞} \frac{λ^r}{r!} e^{-λ}. (1)

Next, let X denote the number of pages, out of the n, that contain more than k misprints. Then, X ~ Binomial(n, p), where p is as given in (1). Hence, the probability that there will be at least m pages which contain more than k misprints is

Pr[X ≥ m] = \sum_{l=m}^{n} \binom{n}{l} p^l (1 - p)^{n-l},

with p given by (1).

2. Suppose that the total number of items produced by a certain machine has a Poisson distribution with mean λ, all items are produced independently of one another, and the probability that any given item produced by the machine will be defective is p. Let X denote the number of defective items produced by the machine.

(a) Determine the marginal distribution of the number of defective items, X.
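Before turning to the solution of Problem 2, the two-stage formula of Problem 1 — a Poisson tail per page feeding a binomial tail over pages — can be sanity-checked numerically. A minimal sketch using only Python's standard library; the particular values of λ, k, n and m are arbitrary choices for illustration, not part of the problem:

```python
import math
import random

lam, k, n, m = 2.0, 3, 50, 5  # illustrative values only

# p = Pr[more than k misprints on one page], the Poisson tail in (1)
p = 1.0 - sum(lam**r / math.factorial(r) * math.exp(-lam) for r in range(k + 1))

# Pr[X >= m] for X ~ Binomial(n, p): at least m pages with more than k misprints
prob = sum(math.comb(n, l) * p**l * (1 - p)**(n - l) for l in range(m, n + 1))

# Monte Carlo check: simulate the misprint count page by page
random.seed(1)

def poisson(lam):
    # sample Poisson(lam) by inversion with sequential search; fine for small lam
    u, r, acc, term = random.random(), 0, math.exp(-lam), math.exp(-lam)
    while u > acc:
        r += 1
        term *= lam / r
        acc += term
    return r

trials = 20_000
hits = sum(
    sum(1 for _ in range(n) if poisson(lam) > k) >= m
    for _ in range(trials)
)
mc = hits / trials
print(prob, mc)  # the two estimates should agree to about two decimals
```

The analytic value and the simulation should match to within simulation error, confirming that the binomial model over pages is set up correctly.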

Solution: Let N denote the number of items produced by the machine. Then,

N ~ Poisson(λ); that is, Pr[N = n] = \frac{λ^n}{n!} e^{-λ}, for n = 0, 1, 2, ... (2)

Now, since all items are produced independently of one another, and the probability that any given item produced by the machine will be defective is p, X has a conditional distribution (conditioned on N = n) that is Binomial(n, p); thus,

Pr[X = k | N = n] = \binom{n}{k} p^k (1 - p)^{n-k}, for k = 0, 1, 2, ..., n, and 0 elsewhere. (3)

Then,

Pr[X = k] = \sum_{n=0}^{∞} Pr[X = k, N = n] = \sum_{n=0}^{∞} Pr[N = n] Pr[X = k | N = n],

where Pr[X = k | N = n] = 0 for n < k. Thus, using (2) and (3),

Pr[X = k] = \sum_{n=k}^{∞} \frac{λ^n}{n!} e^{-λ} \binom{n}{k} p^k (1 - p)^{n-k} = \frac{e^{-λ}}{k!} p^k \sum_{n=k}^{∞} \frac{λ^n}{(n-k)!} (1 - p)^{n-k}. (4)

Next, make the change of variables l = n - k in the last summation in (4) to get

Pr[X = k] = \frac{e^{-λ}}{k!} p^k \sum_{l=0}^{∞} \frac{λ^{l+k}}{l!} (1 - p)^l,

so that

Pr[X = k] = \frac{(λp)^k}{k!} e^{-λ} \sum_{l=0}^{∞} \frac{[λ(1-p)]^l}{l!} = \frac{(λp)^k}{k!} e^{-λ} e^{λ(1-p)} = \frac{(λp)^k}{k!} e^{-λp},

which shows that

X ~ Poisson(λp). (5)

(b) Let Y denote the number of nondefective items produced by the machine. Show that X and Y are independent random variables.

Solution: Calculations similar to those leading to (5) show that

Y ~ Poisson(λ(1 - p)), (6)

since the probability of an item coming out nondefective is 1 - p. Next, observe that Y = N - X and compute the joint probability

Pr[X = k, Y = l] = Pr[X = k, N = k + l] = Pr[N = k + l] Pr[X = k | N = k + l] = \frac{λ^{k+l}}{(k+l)!} e^{-λ} \binom{k+l}{k} p^k (1 - p)^l,

by virtue of (2) and (3). Thus,

Pr[X = k, Y = l] = \frac{λ^{k+l}}{k! \, l!} e^{-λ} p^k (1 - p)^l,

where

e^{-λ} = e^{-[p + (1-p)]λ} = e^{-pλ} e^{-(1-p)λ}.

Thus,

Pr[X = k, Y = l] = \frac{(pλ)^k}{k!} e^{-pλ} \cdot \frac{[(1-p)λ]^l}{l!} e^{-(1-p)λ} = p_X(k) \, p_Y(l),

in view of (5) and (6). Hence, X and Y are independent.

3. Suppose that the proportion of color-blind people in a certain population is 0.005. Estimate the probability that there will be more than one color-blind person in a random sample of 600 people from that population.

Solution: Set p = 0.005 and n = 600, and denote by Y the number of color-blind people in the sample. Then, we may assume that Y ~ Binomial(n, p). Since p is small and n is large, we may use the Poisson approximation to the binomial distribution,

Pr[Y = k] ≈ \frac{λ^k}{k!} e^{-λ}, where λ = np = 3.

Then,

Pr[Y > 1] = 1 - Pr[Y ≤ 1] ≈ 1 - e^{-3} - 3e^{-3} ≈ 0.800852.

Thus, the probability that there will be more than one color-blind person in a random sample of 600 people from that population is about 80%.

4. An airline sells 200 tickets for a certain flight on an airplane that has 198 seats because, on average, 1% of purchasers of airline tickets do not appear for the departure of their flight. Estimate the probability that everyone who appears for the departure of this flight will have a seat.

Solution: Set p = 0.01 and n = 200, and let Y denote the number of ticket purchasers that do not appear for departure. Then, we may assume that Y ~ Binomial(n, p). Everyone who appears will have a seat precisely when Y ≥ 2, so we want to estimate Pr[Y ≥ 2]. Using the Poisson(λ) approximation to the distribution of Y, with λ = np = 2, we get

Pr[Y ≥ 2] = 1 - Pr[Y ≤ 1] ≈ 1 - e^{-2} - 2e^{-2} ≈ 0.594.

Thus, the probability that everyone who appears for the departure of this flight will have a seat is about 59.4%.
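The quality of the Poisson approximation used in Problems 3 and 4 is easy to check against the exact binomial probabilities; a quick sketch:

```python
import math

def binom_cdf(n, p, k):
    """Exact Pr[Y <= k] for Y ~ Binomial(n, p)."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

def poisson_cdf(lam, k):
    """Pr[Y <= k] under the Poisson(lam) approximation."""
    return sum(lam**j / math.factorial(j) * math.exp(-lam) for j in range(k + 1))

# Problem 3: Pr[Y > 1] with n = 600, p = 0.005, lam = np = 3
exact3 = 1 - binom_cdf(600, 0.005, 1)
approx3 = 1 - poisson_cdf(3.0, 1)

# Problem 4: Pr[Y >= 2] with n = 200, p = 0.01, lam = np = 2
exact4 = 1 - binom_cdf(200, 0.01, 1)
approx4 = 1 - poisson_cdf(2.0, 1)

print(approx3, exact3)  # approximately 0.8009 vs 0.8016
print(approx4, exact4)  # approximately 0.5940 vs 0.5954
```

In both problems the approximation agrees with the exact binomial answer to about three decimal places, which is why it is safe to report "about 80%" and "about 59.4%".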

5. Let X denote a positive random variable such that ln(X) has a Normal(0, 1) distribution.

(a) Give the pdf of X and compute its expectation.

Solution: Set Z = ln(X), so that Z ~ Normal(0, 1); thus,

f_Z(z) = \frac{1}{\sqrt{2π}} e^{-z²/2}, for z ∈ ℝ. (7)

Next, compute the cdf of X,

F_X(x) = Pr(X ≤ x), for x > 0,

to get

F_X(x) = Pr[ln(X) ≤ ln(x)] = Pr[Z ≤ ln(x)] = F_Z(ln(x)),

so that

F_X(x) = F_Z(ln(x)), for x > 0, and F_X(x) = 0 for x ≤ 0. (8)

Differentiating (8) with respect to x, for x > 0, we obtain by the Chain Rule that

f_X(x) = F_Z'(ln(x)) \cdot \frac{1}{x} = f_Z(ln(x)) \cdot \frac{1}{x}. (9)

Combining (7) and (9) yields

f_X(x) = \frac{1}{\sqrt{2π} \, x} e^{-(\ln x)²/2}, for x > 0, and f_X(x) = 0 for x ≤ 0. (10)

In order to compute the expected value of X, use the pdf in (10) to get

E(X) = \int_0^{∞} x f_X(x) \, dx = \int_0^{∞} \frac{1}{\sqrt{2π}} e^{-(\ln x)²/2} \, dx. (11)

Make the change of variables u = ln x in the last integral in (11), so that du = \frac{1}{x} dx and dx = e^u du, to get

E(X) = \frac{1}{\sqrt{2π}} \int_{-∞}^{∞} e^{u - u²/2} \, du. (12)

Complete the square in the exponent of the integrand in (12) to obtain

E(X) = e^{1/2} \, \frac{1}{\sqrt{2π}} \int_{-∞}^{∞} e^{-(u-1)²/2} \, du. (13)

Next, make the change of variables w = u - 1 in the integral in (13) to get

E(X) = e^{1/2} \, \frac{1}{\sqrt{2π}} \int_{-∞}^{∞} e^{-w²/2} \, dw = \sqrt{e}.

(b) Estimate Pr(X ≤ 6.5).

Solution: Use the result in (8) to compute

Pr(X ≤ 6.5) = F_Z(ln(6.5)),

where Z ~ Normal(0, 1). Thus,

Pr(X ≤ 6.5) = F_Z(1.8718) ≈ 0.969383,

or about 97%.

6. Forty-seven digits are chosen at random and with replacement from {0, 1, 2, ..., 9}. Estimate the probability that their average lies between 4 and 6.

Solution: Let X_1, X_2, ..., X_n, where n = 47, denote the 47 digits. Since the sampling is done with replacement, the random variables X_1, X_2, ..., X_n are identically distributed, each uniform over the digits {0, 1, 2, ..., 9}; in other words, X_1, X_2, ..., X_n is a random sample from the discrete distribution with probability mass function

p_X(k) = \frac{1}{10}, for k = 0, 1, 2, ..., 9, and p_X(k) = 0 elsewhere.
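Both parts of Problem 5 can be verified numerically before continuing: E(X) = √e by averaging simulated values of X = e^Z, and Pr(X ≤ 6.5) = F_Z(ln 6.5) via the error function, since F_Z(z) = ½(1 + erf(z/√2)). A small sketch:

```python
import math
import random

def std_normal_cdf(z):
    # F_Z expressed through the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Part (b): Pr(X <= 6.5) = F_Z(ln 6.5)
prob = std_normal_cdf(math.log(6.5))

# Part (a): E(X) = sqrt(e), checked by averaging X = e^Z over simulated Z
random.seed(0)
n = 200_000
mean_x = sum(math.exp(random.gauss(0.0, 1.0)) for _ in range(n)) / n

print(round(prob, 6))              # approximately 0.969383
print(mean_x, math.sqrt(math.e))   # the two should agree to about two decimals
```

The simulated mean converges to √e ≈ 1.6487 rather slowly, since e^Z has a heavy right tail, but with 200,000 draws the agreement is already clear.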

Consequently, the mean of the distribution is

µ = E(X) = \sum_{k=0}^{9} k \, p_X(k) = \frac{1}{10} \sum_{k=1}^{9} k = 4.5. (14)

In order to compute the variance of X, we first compute the second moment

E(X²) = \sum_{k=0}^{9} k² \, p_X(k) = \frac{1}{10} \sum_{k=1}^{9} k² = \frac{1}{10} \cdot \frac{9(9+1)(2(9)+1)}{6} = \frac{57}{2}.

It then follows that the variance of the distribution is

σ² = E(X²) - [E(X)]² = \frac{57}{2} - \frac{81}{4} = \frac{33}{4} = 8.25,

and

σ ≈ 2.87. (15)

We would like to estimate

Pr(4 ≤ \bar{X}_n ≤ 6),

or

Pr(4 - µ ≤ \bar{X}_n - µ ≤ 6 - µ),

where µ is given in (14); that is,

Pr(4 ≤ \bar{X}_n ≤ 6) = Pr(-0.5 ≤ \bar{X}_n - µ ≤ 1.5). (16)

Next, divide the last inequality in (16) through by σ/\sqrt{n}, where σ is as given in (15), to get

Pr(4 ≤ \bar{X}_n ≤ 6) = Pr\left(-1.19 ≤ \frac{\bar{X}_n - µ}{σ/\sqrt{n}} ≤ 3.58\right). (17)

For n large (say, n = 47), we can apply the Central Limit Theorem to obtain from (17) that

Pr(4 ≤ \bar{X}_n ≤ 6) ≈ Pr(-1.19 ≤ Z ≤ 3.58), where Z ~ Normal(0, 1). (18)

It follows from (18) and the definition of the cdf that

Pr(4 ≤ \bar{X}_n ≤ 6) ≈ F_Z(3.58) - F_Z(-1.19), (19)

where F_Z is the cdf of Z ~ Normal(0, 1). Using the symmetry of the pdf of Z ~ Normal(0, 1), we can rewrite (19) as

Pr(4 ≤ \bar{X}_n ≤ 6) ≈ F_Z(1.19) + F_Z(3.58) - 1. (20)

Finally, using a table of standard normal probabilities, we obtain from (20) that

Pr(4 ≤ \bar{X}_n ≤ 6) ≈ 0.8830 + 0.9998 - 1 = 0.8828.

Thus, the probability that the average of the 47 digits is between 4 and 6 is about 88.3%.

7. Let X_1, X_2, ..., X_30 be independent random variables, each having the discrete distribution with pmf

p(x) = 1/4, if x = 0 or x = 2;
p(x) = 1/2, if x = 1;
p(x) = 0, otherwise.

Estimate the probability that X_1 + X_2 + ··· + X_30 is at most 33.

Solution: First, compute the mean, µ = E(X), and variance, σ² = Var(X), of the distribution:

µ = 0 \cdot \frac{1}{4} + 1 \cdot \frac{1}{2} + 2 \cdot \frac{1}{4} = 1, (21)

and

σ² = E(X²) - [E(X)]², (22)

where

E(X²) = 0² \cdot \frac{1}{4} + 1² \cdot \frac{1}{2} + 2² \cdot \frac{1}{4} = 1.5. (23)

Combining (21), (22) and (23),

σ² = 1.5 - 1 = 0.5. (24)

Next, let Y = \sum_{k=1}^{n} X_k, where n = 30. We would like to estimate

Pr[Y ≤ 33]. (25)
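The Central Limit Theorem estimate in Problem 6 can be compared against a direct simulation of the 47 digits; a minimal sketch:

```python
import random

random.seed(42)

n, trials = 47, 100_000
hits = 0
for _ in range(trials):
    # draw 47 digits uniformly from {0, 1, ..., 9} and test the average
    total = sum(random.randrange(10) for _ in range(n))
    if 4 <= total / n <= 6:
        hits += 1

estimate = hits / trials
print(estimate)  # should be near the CLT value 0.8828
```

The simulated frequency lands slightly above 0.8828 (closer to 0.89), because the CLT estimate above does not apply a continuity correction to the discrete sum; the agreement is nonetheless good to about two decimals.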

By the Central Limit Theorem,

Pr\left(\frac{Y - nµ}{\sqrt{n} \, σ} ≤ z\right) ≈ Pr(Z ≤ z), for z ∈ ℝ, (26)

where Z ~ Normal(0, 1), µ = 1, σ² = 0.5 and n = 30. Since (33 - nµ)/(\sqrt{n} σ) = 3/\sqrt{15} ≈ 0.77, it follows from (26) that we can estimate the probability in (25) by

Pr[Y ≤ 33] ≈ Pr(Z ≤ 0.77) = 0.7794. (27)

Thus, according to (27), the probability that X_1 + X_2 + ··· + X_30 is at most 33 is about 78%.

8. Roll a balanced die 36 times. Let Y denote the sum of the outcomes of the 36 rolls. Estimate the probability that 108 ≤ Y ≤ 144.

Suggestion: Since the event of interest is (Y ∈ {108, 109, ..., 144}), rewrite Pr(108 ≤ Y ≤ 144) as Pr(107.5 < Y ≤ 144.5).

Solution: Let X_1, X_2, ..., X_n, where n = 36, denote the outcomes of the 36 rolls. Since we are assuming that the die is balanced, the random variables X_1, X_2, ..., X_n are identically distributed, each uniform over the digits {1, 2, ..., 6}; in other words, X_1, X_2, ..., X_n is a random sample from the discrete Uniform(6) distribution. Consequently, the mean of the distribution is

µ = \frac{6 + 1}{2} = 3.5, (28)

and the variance is

σ² = \frac{(6 + 1)(6 - 1)}{12} = \frac{35}{12}. (29)

We also have that Y = \sum_{k=1}^{n} X_k, where n = 36.

By the Central Limit Theorem,

Pr(107.5 < Y ≤ 144.5) ≈ Pr\left(\frac{107.5 - nµ}{\sqrt{n} \, σ} < Z ≤ \frac{144.5 - nµ}{\sqrt{n} \, σ}\right), (30)

where Z ~ Normal(0, 1), n = 36, and µ and σ are given in (28) and (29), respectively. Since nµ = 126 and \sqrt{n} σ = \sqrt{105} ≈ 10.25, we then have from (30) that

Pr(107.5 < Y ≤ 144.5) ≈ Pr(-1.81 < Z ≤ 1.81) = F_Z(1.81) - F_Z(-1.81) = 2F_Z(1.81) - 1 ≈ 2(0.9649) - 1 = 0.9298;

the probability that 108 ≤ Y ≤ 144 is therefore about 93%.

9. Let Y ~ Binomial(100, 1/2). Use the Central Limit Theorem to estimate the value of Pr(Y = 50).

Solution: We use the so-called continuity correction and estimate

Pr(49.5 < Y ≤ 50.5).

By the Central Limit Theorem,

Pr(49.5 < Y ≤ 50.5) ≈ Pr\left(\frac{49.5 - nµ}{\sqrt{n} \, σ} < Z ≤ \frac{50.5 - nµ}{\sqrt{n} \, σ}\right), (31)

where Z ~ Normal(0, 1), n = 100, nµ = 50 and

σ = \sqrt{\frac{1}{2}\left(1 - \frac{1}{2}\right)} = \frac{1}{2}.

We then obtain from (31) that

Pr(49.5 < Y ≤ 50.5) ≈ Pr(-0.1 < Z ≤ 0.1) = F_Z(0.1) - F_Z(-0.1) = 2F_Z(0.1) - 1 ≈ 2(0.5398) - 1 = 0.0796.

Thus,

Pr(Y = 50) ≈ 0.08,

or about 8%.

10. Let Y ~ Binomial(n, 0.55). Find the smallest value of n such that, approximately,

Pr(Y/n > 1/2) ≥ 0.95. (32)

Solution: By the Central Limit Theorem,

\frac{Y/n - 0.55}{\sqrt{(0.55)(1 - 0.55)/n}} → Z ~ Normal(0, 1) in distribution as n → ∞. (33)

Thus, according to (32) and (33), we need to find the smallest value of n such that

Pr\left(Z > \frac{0.5 - 0.55}{(0.4975)/\sqrt{n}}\right) ≥ 0.95,

or, since 0.05/0.4975 ≈ 1/10,

Pr\left(Z > -\frac{\sqrt{n}}{10}\right) ≥ 0.95. (34)

The expression in (34) is equivalent to

1 - Pr\left(Z ≤ -\frac{\sqrt{n}}{10}\right) ≥ 0.95,

which can be rewritten as

1 - F_Z\left(-\frac{\sqrt{n}}{10}\right) ≥ 0.95, (35)

where F_Z is the cdf of Z ~ Normal(0, 1). By the symmetry of the pdf of Z ~ Normal(0, 1), (35) is equivalent to

F_Z\left(\frac{\sqrt{n}}{10}\right) ≥ 0.95. (36)

The smallest value of n for which (36) holds true occurs when

\frac{\sqrt{n}}{10} ≥ z^*, (37)

where z^* is a positive real number with the property

F_Z(z^*) = 0.95. (38)

The equality in (38) occurs approximately when

z^* = 1.645. (39)

It follows from (37) and (39) that (32) holds approximately when

\frac{\sqrt{n}}{10} ≥ 1.645,

or n ≥ 270.6025. Thus, n = 271 is the smallest value of n such that, approximately, Pr(Y/n > 1/2) ≥ 0.95.

11. Let X_1, X_2, ..., X_n be a random sample from a Poisson distribution with mean λ. Thus, Y = \sum_{i=1}^{n} X_i has a Poisson distribution with mean nλ. Moreover, by the Central Limit Theorem, \bar{X} = Y/n has, approximately, a Normal(λ, λ/n) distribution for large n. Show that u(Y/n) = \sqrt{Y/n} is a function of Y/n whose variance is essentially free of λ.

Solution: We will show that, for large values of n, the distribution of

2\sqrt{n} \left(\sqrt{Y/n} - \sqrt{λ}\right) (40)

is independent of λ. In fact, we will show that, for large values of n, the distribution of the random variables in (40) can be approximated by a Normal(0, 1) distribution.

First, note that, by the weak Law of Large Numbers,

Y/n → λ in probability as n → ∞.

Thus, for large values of n, we can approximate u(Y/n) by its linear approximation around λ,

u(Y/n) ≈ u(λ) + u'(λ) \left(\frac{Y}{n} - λ\right), (41)

where

u'(λ) = \frac{1}{2\sqrt{λ}}. (42)

Combining (41) and (42) we see that, for large values of n,

\sqrt{Y/n} - \sqrt{λ} ≈ \frac{1}{2\sqrt{λ}} \left(\frac{Y}{n} - λ\right); that is, 2\sqrt{n} \left(\sqrt{Y/n} - \sqrt{λ}\right) ≈ \frac{Y/n - λ}{\sqrt{λ/n}}. (43)

Since, by the Central Limit Theorem,

\frac{Y/n - λ}{\sqrt{λ/n}} → Z ~ Normal(0, 1) in distribution as n → ∞, (44)

it follows from (43) and (44) that

2\sqrt{n} \left(\sqrt{Y/n} - \sqrt{λ}\right) → Z ~ Normal(0, 1) in distribution as n → ∞,

which was to be shown.

12. Let X denote a random variable with mean µ and variance σ². Use Chebyshev's inequality to show that

Pr(|X - µ| ≥ kσ) ≤ \frac{1}{k²}, for all k > 0.

Solution: Apply Chebyshev's inequality,

Pr(|X - µ| ≥ ε) ≤ \frac{Var(X)}{ε²}, for every ε > 0,

to the case ε = kσ:

Pr(|X - µ| ≥ kσ) ≤ \frac{σ²}{(kσ)²} = \frac{1}{k²},

which was to be shown.
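The variance-stabilization claim of Problem 11 — that 2√n(√(Y/n) − √λ) is approximately standard normal regardless of λ, so that Var(√(Y/n)) ≈ 1/(4n) — can be checked by simulation for two quite different values of λ. A sketch, using only the standard library:

```python
import math
import random

random.seed(7)

def poisson(lam):
    # sample Poisson(lam) by inversion with sequential search; adequate for moderate lam
    u, r, acc, term = random.random(), 0, math.exp(-lam), math.exp(-lam)
    while u > acc:
        r += 1
        term *= lam / r
        acc += term
    return r

def var_of_sqrt_mean(lam, n, reps):
    # sample variance of sqrt(Y/n) over reps samples, with Y a sum of n Poisson(lam) draws
    vals = []
    for _ in range(reps):
        y = sum(poisson(lam) for _ in range(n))
        vals.append(math.sqrt(y / n))
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / (len(vals) - 1)

n, reps = 100, 2000
results = {lam: var_of_sqrt_mean(lam, n, reps) * 4 * n for lam in (1.0, 9.0)}
print(results)  # both scaled variances should be close to 1
```

Scaling the sample variance by 4n should give a value near 1 for both λ = 1 and λ = 9, even though the variances of Y/n itself (λ/n) differ by a factor of nine.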

13. Suppose that a factory produces a number X of items in a week, where X can be modeled by a random variable with mean 50. Suppose also that the variance for a week's production is known to be 25. What can be said about the probability that this week's production will be between 40 and 60?

Solution: Write

Pr(40 < X < 60) = Pr(-10 < X - µ < 10), where µ = 50,

so that

Pr(40 < X < 60) = Pr(|X - µ| < 2σ), where σ = 5.

We can then write

Pr(40 < X < 60) = 1 - Pr(|X - µ| ≥ 2σ),

where, in view of the result in Problem 12,

Pr(|X - µ| ≥ 2σ) ≤ \frac{1}{4}.

Consequently,

Pr(40 < X < 60) ≥ 1 - \frac{1}{4} = \frac{3}{4};

the probability that this week's production will be between 40 and 60 is at least 75%.

14. Let (X_n) denote a sequence of nonnegative random variables with means µ_n = E(X_n), for each n = 1, 2, 3, .... Assume that \lim_{n→∞} µ_n = 0. Show that X_n converges in probability to 0 as n → ∞.

Solution: We need to show that, for any ε > 0,

\lim_{n→∞} Pr(|X_n| < ε) = 1. (45)

We will establish (45) by first applying the following inequality: given a random variable X and any ε > 0,

Pr(|X| ≥ ε) ≤ \frac{E(|X|)}{ε}. (46)

We prove the inequality in (46) for the case in which X is a continuous random variable with pdf f_X. Denote the event (|X| ≥ ε) by A_ε and compute

E(|X|) = \int_{-∞}^{∞} |x| f_X(x) \, dx ≥ \int_{A_ε} |x| f_X(x) \, dx,

so that

E(|X|) ≥ ε \int_{A_ε} f_X(x) \, dx = ε \, Pr(A_ε),

which yields (46).

Next, apply the inequality in (46) to X = X_n, for each n, and observe that E(|X_n|) = E(X_n) = µ_n, since we are assuming that X_n is nonnegative. It then follows from (46) that

Pr(|X_n| ≥ ε) ≤ \frac{µ_n}{ε}, for n = 1, 2, 3, ... (47)

It follows from (47) that

Pr(|X_n| < ε) ≥ 1 - \frac{µ_n}{ε}, for n = 1, 2, 3, ...;

that is,

1 - \frac{µ_n}{ε} ≤ Pr(|X_n| < ε) ≤ 1, for n = 1, 2, 3, ... (48)

Next, use the assumption that \lim_{n→∞} µ_n = 0 and the Squeeze Lemma to obtain from (48) that

\lim_{n→∞} Pr(|X_n| < ε) = 1,

which was to be shown. Hence, X_n converges in probability to 0 as n → ∞.
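As an illustration of Problem 14, take the hypothetical choice X_n ~ Exponential with mean µ_n = 1/n, so that µ_n → 0; the empirical Pr(|X_n| < ε) should climb toward 1 as n grows, staying above the bound 1 − µ_n/ε from (48). A brief sketch:

```python
import random

random.seed(3)

def empirical_prob_below(n, eps, trials=20_000):
    # X_n ~ Exponential(rate n), so E(X_n) = 1/n -> 0 as n grows
    hits = sum(1 for _ in range(trials) if random.expovariate(n) < eps)
    return hits / trials

eps = 0.1
probs = [empirical_prob_below(n, eps) for n in (1, 10, 100)]
print(probs)  # increasing toward 1, each bounded below by 1 - (1/n)/eps
```

For n = 100 the bound 1 − (1/n)/ε = 0.9 already forces the probability close to 1, and the simulated value is far closer still, since the bound in (47) is quite conservative.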