6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS




6.041/6.431 Spring 2008 Quiz 2
Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS

Name: Recitation Instructor: TA:

Question 1 (all parts): 36 points
Question 2: a) 4, b) 5, c) 5, d) 8, e) 5, f) 6 points
Question 3: a) 4, b) 6, c) 6, d) 6, e) 6 points

Write your solutions in this quiz packet; only solutions in the quiz packet will be graded. Question one, the multiple choice section, will receive no partial credit. Partial credit for questions two and three will be awarded. You are allowed 2 two-sided 8.5 x 11 formula sheets plus a calculator. You have 120 minutes to complete the quiz. Be neat! You will not get credit if we can't read it. We will send out an email with more information on how to obtain your quiz before drop date. Good Luck!

Question 1: Multiple choice questions. CLEARLY circle the best answer for each question below. Each question is worth 4 points, with no partial credit given.

a. Let X1, X2, and X3 be independent random variables with the continuous uniform distribution over [0, 1]. Then P(X1 < X2 < X3) =

(i) 1/6   (ii) 1/3   (iii) 1/2   (iv) 1/4

Solution: (i). To understand the principle, first consider a simpler problem with X1 and X2 as given above. Note that P(X1 < X2) + P(X2 < X1) + P(X1 = X2) = 1, since the corresponding events are disjoint and exhaust all the possibilities. But P(X1 < X2) = P(X2 < X1) by symmetry, and P(X1 = X2) = 0 since the random variables are continuous. Therefore, P(X1 < X2) = 1/2. Analogously, omitting the events with zero probability but making sure to exhaust all other possibilities, we have

P(X1 < X2 < X3) + P(X1 < X3 < X2) + P(X2 < X1 < X3) + P(X2 < X3 < X1) + P(X3 < X1 < X2) + P(X3 < X2 < X1) = 1,

and, by symmetry, all six probabilities are equal. Thus, P(X1 < X2 < X3) = 1/6.

b. Let X and Y be two continuous random variables. Then

(i) E[XY] = E[X]E[Y]   (ii) E[X^2 + Y^2] = E[X^2] + E[Y^2]   (iii) f_(X+Y)(x + y) = f_X(x) f_Y(y)   (iv) var(X + Y) = var(X) + var(Y)

Solution: (ii). Since X^2 and Y^2 are themselves random variables, the result follows by the linearity of expectation.

c. Suppose X is uniformly distributed over [0, 4] and Y is uniformly distributed over [0, 1]. Assume X and Y are independent. Let Z = X + Y. Then

(i) f_Z(4.5) = 0   (ii) f_Z(4.5) = 1/8   (iii) f_Z(4.5) = 1/4   (iv) f_Z(4.5) = 1/2

Solution: (ii). Since X and Y are independent, the result follows by convolution:

f_Z(4.5) = ∫ f_X(α) f_Y(4.5 − α) dα = ∫_3.5^4 (1/4)(1) dα = 1/8.
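The two numerical answers above are easy to sanity-check by simulation. The following short Python sketch (a quick check, not part of the original quiz solutions) estimates P(X1 < X2 < X3) by counting orderings, and estimates the density f_Z(4.5) by counting samples in a small window around 4.5; the window half-width h = 0.05 is an arbitrary choice.

import random

random.seed(0)
N = 200_000

# (a) P(X1 < X2 < X3) for three iid U[0, 1] draws; the solution gives 1/6.
count = 0
for _ in range(N):
    x1, x2, x3 = random.random(), random.random(), random.random()
    if x1 < x2 < x3:
        count += 1
print("P(X1 < X2 < X3) ~", count / N)  # expect ~0.1667

# (c) f_Z(4.5) for Z = X + Y, X ~ U[0, 4], Y ~ U[0, 1]; the solution gives 1/8.
h = 0.05  # half-width of a small window around 4.5
hits = sum(1 for _ in range(N)
           if abs(random.uniform(0, 4) + random.random() - 4.5) < h)
print("f_Z(4.5) ~", hits / (N * 2 * h))  # expect ~0.125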

d. For the random variables defined in part (c), P(max(X, Y) > 3) is equal to

(i) 0   (ii) 9/4   (iii) 3/4   (iv) 1/4

Solution: (iv). Note that P(max(X, Y) > 3) = 1 − P(max(X, Y) ≤ 3) = 1 − P({X ≤ 3} ∩ {Y ≤ 3}). But X and Y are independent, so P({X ≤ 3} ∩ {Y ≤ 3}) = P(X ≤ 3)P(Y ≤ 3). Computing the probabilities, we have P(X ≤ 3) = 3/4 and P(Y ≤ 3) = 1. Thus, P(max(X, Y) > 3) = 1 − 3/4 = 1/4.

e. Recall the hat problem from lecture: N people put their hats in a closet at the start of a party, where each hat is uniquely identified. At the end of the party each person randomly selects a hat from the closet. Suppose N is a Poisson random variable with parameter λ. If X is the number of people who pick their own hats, then E[X] is equal to

(i) λ   (ii) 1/λ^2   (iii) 1/λ   (iv) 1

Solution: (iv). Let X = X1 + ... + XN, where Xi is the indicator such that Xi = 1 if the i-th person picks their own hat and Xi = 0 otherwise. By the linearity of expectation, E[X | N = n] = E[X1 | N = n] + ... + E[Xn | N = n]. But E[Xi | N = n] = 1/n for all i = 1, ..., n. Thus, E[X | N = n] = n E[Xi | N = n] = 1. Finally, E[X] = E[E[X | N]] = 1.

f. Suppose X and Y are Poisson random variables with parameters λ1 and λ2 respectively, where X and Y are independent. Define W = X + Y. Then

(i) W is Poisson with parameter min(λ1, λ2)   (ii) W is Poisson with parameter λ1 + λ2   (iii) W may not be Poisson but has mean equal to min(λ1, λ2)   (iv) W may not be Poisson but has mean equal to λ1 + λ2

Solution: (ii). The quickest way to obtain the answer is through transforms: M_X(s) = e^(λ1(e^s − 1)) and M_Y(s) = e^(λ2(e^s − 1)). Since X and Y are independent, we have M_W(s) = M_X(s) M_Y(s) = e^((λ1 + λ2)(e^s − 1)), which equals the transform of a Poisson random variable with mean λ1 + λ2.
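Both (d) and (e) lend themselves to a quick Monte Carlo check. The sketch below (not part of the original solutions) verifies P(max(X, Y) > 3) = 1/4 directly, and checks that the expected number of matched hats stays near 1 for a Poisson-sized party; the test rate lam = 3.0 and Knuth's product method for Poisson sampling are arbitrary illustrative choices.

import math, random

random.seed(1)
N = 200_000

# (d) P(max(X, Y) > 3) with X ~ U[0, 4], Y ~ U[0, 1]; the solution gives 1/4.
hits = sum(1 for _ in range(N)
           if max(random.uniform(0, 4), random.random()) > 3)
print("P(max > 3) ~", hits / N)  # expect ~0.25

# (e) hat problem: E[# own hats] should be 1 for any lam.
def poisson(lam):
    # Knuth's method: count uniform factors until the product drops below e^-lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

trials, total = 50_000, 0
for _ in range(trials):
    n = poisson(3.0)
    hats = list(range(n))
    random.shuffle(hats)
    total += sum(1 for i in range(n) if hats[i] == i)  # fixed points = own hats
print("E[X] ~", total / trials)  # expect ~1.0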

g. Let X be a random variable whose transform is given by M_X(s) = (0.4 + 0.6e^s)^5. Then

(i) P(X = 0) = P(X = 5)   (ii) P(X > 5) > 0   (iii) P(X = 0) = (0.4)^5   (iv) P(X = 5) = 0.6

Solution: (iii). Note that M_X(s) is the transform of a binomial random variable X with n = 5 trials and probability of success p = 0.6. Thus, P(X = 0) = (0.4)^5.

h. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = x/8 for 0 ≤ x ≤ 4. Let S = X1 + X2 + ... + X100. Then P(S > 300) is approximately equal to

(i) Φ(5)   (ii) 1 − Φ(5)   (iii) 1 − Φ(5/√2)   (iv) Φ(5/√2)

Solution: (iii). Since the Xi are iid, by the Central Limit Theorem the distribution of S is approximately normal with mean E[S] and variance var(S). Now,

E[Xi] = ∫_0^4 x (x/8) dx = [x^3/24]_0^4 = 8/3,

E[Xi^2] = ∫_0^4 x^2 (x/8) dx = [x^4/32]_0^4 = 8,

var(Xi) = E[Xi^2] − (E[Xi])^2 = 8 − 64/9 = 8/9,

so E[S] = 100 E[Xi] = 800/3 and var(S) = 100 var(Xi) = 800/9. Therefore,

P(S > 300) = 1 − P(S ≤ 300) ≈ 1 − Φ((300 − 800/3) / √(800/9)) = 1 − Φ(5/√2).

i. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = 1, 0 ≤ x ≤ 1. Define Yn = X1 X2 X3 ... Xn, for some integer n. Then var(Yn) is equal to

(i) n/2   (ii) 1/3^n − 1/4^n   (iii) 1/2^n   (iv) 1/2

Solution: (ii). Since X1, ..., Xn are independent, we have that E[Yn] = E[X1] ... E[Xn]. Similarly, E[Yn^2] = E[X1^2] ... E[Xn^2]. Since E[Xi] = 1/2 and E[Xi^2] = 1/3 for i = 1, ..., n, it follows that

var(Yn) = E[Yn^2] − (E[Yn])^2 = 1/3^n − 1/4^n.
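As a numerical companion to parts (g)-(i), the sketch below (not part of the original solutions; assumes Python 3.8+ for math.prod) evaluates the binomial answer, computes 1 − Φ(5/√2) via the error function, simulates S by inverse-CDF sampling (F(x) = x^2/16 gives X = 4√U), and checks the product-variance formula for n = 5.

import math, random

random.seed(2)

# (g) binomial(n = 5, p = 0.6): P(X = 0) = 0.4^5
print("P(X = 0) =", 0.4 ** 5)  # 0.01024

# (h) the CLT answer, with Phi built from the error function
def Phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))
print("1 - Phi(5/sqrt(2)) =", 1 - Phi(5 / math.sqrt(2)))  # ~2.0e-4

# (h) check by simulation: sample Xi via inverse CDF, X = 4*sqrt(U)
M = 100_000
tail = sum(1 for _ in range(M)
           if sum(4 * math.sqrt(random.random()) for _ in range(100)) > 300)
print("P(S > 300) ~", tail / M)

# (i) variance of a product of n iid U[0,1]'s vs the formula (1/3)^n - (1/4)^n
n, N = 5, 200_000
samples = [math.prod(random.random() for _ in range(n)) for _ in range(N)]
mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / N
print("var(Y_5) ~", var, "vs", (1/3)**n - (1/4)**n)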

6.4/6.43: Probabilistic Systems Analysis (Spring 28) i. Let X i, i, 2,... be independent random variables all distributed according to the PDF f X (x), x. Define Y n X X 2 X 3... X n, for some integer n. Then var(y n ) is equal to n (i) 2 (ii) 3 n 4 n (iii) 2 n (iv) 2 : Since X,..., X n are independent, we have that E[Y n ] E[X ]... E[X n ]. Similarly, E[Y n 2 ] E[X 2 ]... E[Xn 2 ]. Since E[X i ] /2 and E[Xi 2 ] /3 for i,..., n, it follows that var(s n ) E[Y 2 n ] (E[Y n ]) 2 4 n. 3 n Question 2 Each Mac book has a lifetime that is exponentially distributed with parameter λ. The lifetime of Mac books are independent of each other. Suppose you have two Mac books, which you begin using at the same time. Define T as the time of the first laptop failure and T 2 as the time of the second laptop failure. a. Compute f T (t ). Let M be the life time of mac book and M 2 the lifetime of mac book 2, where M and M 2 are iid exponential random variables with CDF F M (m) e λm. T, the time of the first mac book failure, is the minimum of M and M 2. To derive the distribution of T, we first find the CDF F T (t), and then differentiate to find the PDF f T (t). Differentiating F T (t) with respect to t yields: F T (t) P (min(m, M 2 ) < t) P (min(m, M 2 ) t) P (M t)p (M 2 t) ( F M (t)) 2 e 2λt t f T (t) 2λe 2λt t Page 6 of

6.4/6.43: Probabilistic Systems Analysis (Spring 28) b. Let X T 2 T. Compute f X T (x t ). Conditioned on the time of the first mac book failure, the time until the other mac book fails is an exponential random variable by the memoryless property. The memoryless property tells us that regardless of the elapsed life time of the mac book, the time until failure has the same exponential CDF. Consequently, f X T (x) λe λx x. c. Is X independent of T? Give a mathematical justification for your answer. Since we have shown in 2(c) that f X T (x t) does not depend on t, X and T are independent. d. Compute f T2 (t 2 ) and E[T 2 ]. The time of the second laptop failure T 2 is equal to T + X. Since X and T were shown to be independent in 2(b), we convolve the densities found in 2(a) and 2(b) to determine f T2 (t). f T2 (t) f T (τ)f X (t τ)dτ t 2(λ) 2 e 2λτ e λ(t τ) dτ t 2λe λt λe λτ dτ 2λe λt ( e λt ) t Also, by the linearity of expectation, we have that E[T 2 ] E[T ] + E[X] 2 λ + 3 λ 2 λ. An equivalent method for solving this problem is to note that T 2 is the maximum of M and M 2, and deriving the distribution of T 2 in our standard CDF to PDF method: Differentiating F T2 (t) with respect to t yields: F T2 (t) P(max(M, M 2 ) < t) P(M 2 t)p (M 2 t) F M (t) 2 2e λt + e 2λt t f T2 (t) 2λe λt 2λe 2λt t which is equivalent to our solution by convolution above. 2 Finally, from the above density we obtain that E[T 2 ] λ 3 2λ 2λ, which matches our earlier solution. Page 7 of

e. Now suppose you have 100 Mac books, and let Y be the time of the first laptop failure. Find the best answer for P(Y < 0.01).

Y is equal to the minimum of 100 independent exponential random variables. Following the derivation in (a), we determine by analogy

f_Y(y) = 100λe^(−100λy), y ≥ 0.

Integrating over y from 0 to 0.01, we find P(Y < 0.01) = 1 − e^(−λ).

Your friend, Charlie, loves Mac books so much that he buys Si new Mac books on day i! On any given day, Si is equally likely to be 4 or 8, and all days are independent of each other. Let S be the number of Mac books Charlie buys over the next 100 days.

f. (6 pts) Find the best approximation for P(S ≤ 680). Express your final answer in terms of Φ(·), the CDF of the standard normal.

Here E[S] = 100 · 6 = 600 and var(S) = 100 · 4 = 400, so the standard deviation of S is 20. Using the De Moivre-Laplace approximation to the binomial, and noting that the step size between values that S can take on is 4 (so the half-step correction is 2),

P(S ≤ 680) ≈ Φ((680 + 2 − 600)/20) = Φ(82/20) = Φ(4.1).
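Both results in parts (e) and (f) can be spot-checked numerically. In this sketch (not part of the original solutions), λ = 1.5 is an arbitrary test rate for part (e); for part (f), daily purchases are drawn as 4 or 8 with equal probability and the empirical P(S ≤ 680) is compared with Φ(4.1) evaluated via the error function.

import math, random

random.seed(5)
lam, N = 1.5, 100_000  # arbitrary test rate and sample count

# (e) P(Y < 0.01) for the minimum of 100 exponential(lam) lifetimes
hits = sum(1 for _ in range(N)
           if min(random.expovariate(lam) for _ in range(100)) < 0.01)
print(hits / N, "vs 1 - e^-lam =", 1 - math.exp(-lam))

# (f) S = Mac books bought over 100 days, each day 4 or 8 with probability 1/2
def Phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

hits = sum(1 for _ in range(N)
           if sum(random.choice((4, 8)) for _ in range(100)) <= 680)
print(hits / N, "vs Phi(4.1) =", Phi(4.1))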

Question 3: Saif is a well-intentioned though slightly indecisive fellow. Every morning he flips a coin to decide where to go. If the coin comes up heads he drives to the mall; if it comes up tails he volunteers at the local shelter. Saif's coin is not necessarily fair; rather, it possesses a probability of heads equal to q. We do not know q, but we do know it is well-modeled by a random variable Q with density

f_Q(q) = 2q for 0 ≤ q ≤ 1, and 0 otherwise.

Assume that, conditioned on Q, the coin flips are independent. Note: parts a, b, c, and {d, e} may be answered independently of each other.

a. (4 pts) What's the probability that Saif goes to the local shelter if he flips the coin once?

Let X1 be the outcome of the coin toss, with X1 = 1 if the coin lands heads and X1 = 0 if it lands tails. By the total probability theorem,

P(X1 = 0) = ∫_0^1 P(X1 = 0 | Q = q) f_Q(q) dq = ∫_0^1 (1 − q) 2q dq = 1/3.

In an attempt to promote virtuous behavior, Saif's father offers to pay him $40 every day he volunteers at the local shelter. Define X as Saif's payout if he flips the coin every morning for the next 30 days.

b. Find var(X).

Let Yi be a Bernoulli random variable describing the outcome of the coin toss on morning i: Yi = 1 corresponds to the event that on morning i Saif goes to the local shelter, and Yi = 0 to the event that he goes to the mall. Conditioned on Q = q, we have P(Yi = 1) = 1 − q and P(Yi = 0) = q, for i = 1, ..., 30.

Saif's payout for the next 30 days is described by the random variable X = 40(Y1 + Y2 + ... + Y30). By the law of total variance,

var(X) = 1600 var(Y1 + Y2 + ... + Y30) = 1600 (var(E[Y1 + ... + Y30 | Q]) + E[var(Y1 + ... + Y30 | Q)]).

Now note that, conditioned on Q = q, Y1, ..., Y30 are independent. Thus var(Y1 + ... + Y30 | Q) = var(Y1 | Q) + ... + var(Y30 | Q), where var(Yi | Q) = Q(1 − Q). Also, E[Y1 + ... + Y30 | Q] = 30(1 − Q), whose variance is 30^2 var(Q). So

var(X) = 1600 · 30^2 var(Q) + 1600 · 30 E[Q(1 − Q)]
= 1600 · 30^2 (E[Q^2] − (E[Q])^2) + 1600 · 30 (E[Q] − E[Q^2])
= 1600 · 30^2 (1/2 − 4/9) + 1600 · 30 (2/3 − 1/2)
= 80,000 + 8,000 = 88,000,

since E[Q] = ∫_0^1 2q^2 dq = 2/3 and E[Q^2] = ∫_0^1 2q^3 dq = 1/2.

c. Let event B be that Saif goes to the local shelter at least once in k days. Find the conditional density of Q given B, f_(Q|B)(q).

Given Q = q, the probability of at least one tails in k flips is 1 − q^k. By Bayes' rule,

f_(Q|B)(q) = P(B | Q = q) f_Q(q) / ∫_0^1 P(B | Q = q) f_Q(q) dq = (1 − q^k) 2q / ∫_0^1 (1 − q^k) 2q dq = 2q(1 − q^k) / (1 − 2/(k + 2)), 0 ≤ q ≤ 1,

and f_(Q|B)(q) = 0 otherwise.
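The prior f_Q(q) = 2q can be sampled by the inverse-CDF method (F_Q(q) = q^2, so Q = √U), which makes parts (a) and (b) easy to verify by simulation. The sketch below is a sanity check, not part of the original solutions; it estimates P(tails) and var(X) and compares them with 1/3 and 88,000.

import math, random

random.seed(6)
N = 200_000

# f_Q(q) = 2q on [0, 1] has CDF q^2, so Q = sqrt(U) by inverse-CDF sampling
def draw_q():
    return math.sqrt(random.random())

# (a) P(shelter on one flip) = E[1 - Q]; the solution gives 1/3
print("P(tails) ~", sum(1 - draw_q() for _ in range(N)) / N)

# (b) X = 40 * (# tails in 30 flips with heads probability Q); var should be 88,000
pay = []
for _ in range(N):
    q = draw_q()
    tails = sum(1 for _ in range(30) if random.random() >= q)
    pay.append(40 * tails)
m = sum(pay) / N
print("var(X) ~", sum((p - m) ** 2 for p in pay) / N)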

6.4/6.43: Probabilistic Systems Analysis (Spring 28) Let event B be that Saif goes to the local shelter at least once in k days. c. Find the conditional density of Q given B, f Q B (q) By Bayes Rule: P (B Q q)f Q (q) f Q B (q) P (B Q q)fq (q)dq ( q k )2q ( q k )2q dq 2q( q k ) 2/(k + 2) q While shopping at the mall, Saif gets a call from his sister Mais. They agree to meet at the Coco Cabana Court yard at exactly :3PM. Unfortunately Mais arrives Z minutes late, where Z is a continuous uniform random variable from zero to minutes. Saif is furious that Mais has kept him waiting, and demands Mais pay him R dollars, where R exp(z + 2). e. Find Saif s expected payout, E[R]. f. Find the density of Saif s payout, f R (r). E[R] e z+2 f(z)dz e 2 e z dz e2 e2 F R (r) P(e Z+2 r) P(Z + 2 ln(r)) ln(r) 2 dz ln(r) 2 2 2 e r e f R (r) r e 2 r e 2 Page of

MIT OpenCourseWare
http://ocw.mit.edu

6.041 / 6.431 Probabilistic Systems Analysis and Applied Probability
Fall 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.