6.041/6.431 Spring 2008 Quiz 2
Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS

Name:
Recitation Instructor:
TA:

Question 1 (all parts): 36 points
Question 2: (a) 4, (b) 5, (c) 5, (d) 8, (e) 5, (f) 6 points
Question 3: (a) 4, (b) 6, (c) 6, (d) 6, (e) 6 points

Write your solutions in this quiz packet; only solutions in the quiz packet will be graded. Question one, the multiple choice section, will receive no partial credit. Partial credit for questions two and three will be awarded. You are allowed 2 two-sided 8.5 x 11 formula sheets plus a calculator. You have 120 minutes to complete the quiz. Be neat! You will not get credit if we can't read it. We will send out an email with more information on how to obtain your quiz before the drop date. Good Luck!
6.041/6.431: Probabilistic Systems Analysis (Spring 2008)

Question 1. Multiple choice questions. CLEARLY circle the best answer for each question below. Each question is worth 4 points, with no partial credit given.

a. Let X1, X2, and X3 be independent random variables with the continuous uniform distribution over [0, 1]. Then P(X1 < X2 < X3) =

(i) 1/6   (ii) 1/3   (iii) 1/2   (iv) 1/4

Solution: (i). To understand the principle, first consider the simpler problem with only X1 and X2 as given above. Note that P(X1 < X2) + P(X2 < X1) + P(X1 = X2) = 1, since the corresponding events are disjoint and exhaust all the possibilities. But P(X1 < X2) = P(X2 < X1) by symmetry, and P(X1 = X2) = 0 since the random variables are continuous. Therefore P(X1 < X2) = 1/2. Analogously, omitting the events with zero probability but making sure to exhaust all other possibilities, we have

P(X1 < X2 < X3) + P(X1 < X3 < X2) + P(X2 < X1 < X3) + P(X2 < X3 < X1) + P(X3 < X1 < X2) + P(X3 < X2 < X1) = 1,

and, by symmetry, all six probabilities are equal. Thus P(X1 < X2 < X3) = 1/6.

b. Let X and Y be two continuous random variables. Then

(i) E[XY] = E[X]E[Y]
(ii) E[X^2 + Y^2] = E[X^2] + E[Y^2]
(iii) f_{X+Y}(x + y) = f_X(x) f_Y(y)
(iv) var(X + Y) = var(X) + var(Y)

Solution: (ii). Since X^2 and Y^2 are themselves random variables, the identity follows from the linearity of expectation; the other three statements require independence (or more) and fail in general.

c. Suppose X is uniformly distributed over [0, 4] and Y is uniformly distributed over [0, 1]. Assume X and Y are independent. Let Z = X + Y. Then

(i) f_Z(4.5) = 0   (ii) f_Z(4.5) = 1/8   (iii) f_Z(4.5) = 1/4   (iv) f_Z(4.5) = 1/2

Solution: (ii). Since X and Y are independent, the result follows by convolution:

f_Z(4.5) = ∫ f_X(α) f_Y(4.5 − α) dα = ∫_{3.5}^{4} (1/4)(1) dα = 1/8.
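As a quick numerical sanity check (a sketch, not part of the original solutions; the sample size and seed are arbitrary), the answers to parts (a) and (c) can be verified by Monte Carlo, estimating the density in (c) with a finite-difference estimate P(4.45 < Z < 4.55)/0.1:

```python
import random

random.seed(0)
N = 200_000

ordered = 0  # part (a): count samples with X1 < X2 < X3 (expect ~1/6)
near = 0     # part (c): count samples with Z = X + Y near 4.5 (expect ~0.0125)
for _ in range(N):
    x1, x2, x3 = random.random(), random.random(), random.random()
    if x1 < x2 < x3:
        ordered += 1
    z = random.uniform(0, 4) + random.random()  # X ~ U[0,4], Y ~ U[0,1]
    if 4.45 < z < 4.55:
        near += 1

p_order = ordered / N          # should be close to 1/6
f_z_est = near / N / 0.1       # density estimate at 4.5; should be close to 1/8
print(p_order, f_z_est)
```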
d. For the random variables defined in part (c), P(max(X, Y) > 3) is equal to

(i) 1   (ii) 9/16   (iii) 3/4   (iv) 1/4

Solution: (iv). Note that P(max(X, Y) > 3) = 1 − P(max(X, Y) ≤ 3) = 1 − P({X ≤ 3} ∩ {Y ≤ 3}). But X and Y are independent, so P({X ≤ 3} ∩ {Y ≤ 3}) = P(X ≤ 3) P(Y ≤ 3). Computing the probabilities, P(X ≤ 3) = 3/4 and P(Y ≤ 3) = 1. Thus P(max(X, Y) > 3) = 1 − 3/4 = 1/4.

e. Recall the hat problem from lecture: N people put their hats in a closet at the start of a party, where each hat is uniquely identified. At the end of the party each person randomly selects a hat from the closet. Suppose N is a Poisson random variable with parameter λ. If X is the number of people who pick their own hats, then E[X] is equal to

(i) λ   (ii) 1/λ^2   (iii) 1/λ   (iv) 1

Solution: (iv). Let X = X1 + ... + XN, where Xi is the indicator random variable such that Xi = 1 if the i-th person picks their own hat and Xi = 0 otherwise. By the linearity of expectation, E[X | N = n] = E[X1 | N = n] + ... + E[Xn | N = n]. But E[Xi | N = n] = 1/n for all i = 1, ..., n. Thus E[X | N = n] = n(1/n) = 1, and by the law of iterated expectations E[X] = E[E[X | N]] = 1.

f. Suppose X and Y are Poisson random variables with parameters λ1 and λ2 respectively, where X and Y are independent. Define W = X + Y. Then

(i) W is Poisson with parameter min(λ1, λ2)
(ii) W is Poisson with parameter λ1 + λ2
(iii) W may not be Poisson but has mean equal to min(λ1, λ2)
(iv) W may not be Poisson but has mean equal to λ1 + λ2

Solution: (ii). The quickest way to obtain the answer is through transforms: M_X(s) = e^{λ1(e^s − 1)} and M_Y(s) = e^{λ2(e^s − 1)}. Since X and Y are independent, we have

M_W(s) = M_X(s) M_Y(s) = e^{λ1(e^s − 1)} e^{λ2(e^s − 1)} = e^{(λ1 + λ2)(e^s − 1)},

which equals the transform of a Poisson random variable with mean λ1 + λ2.
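The key step in part (e) is that E[X | N = n] = 1 for every party size n, so the answer does not depend on λ. This can be checked with a small simulation (a sketch, not part of the original solutions; trial counts and seed are arbitrary) by modeling the random hat assignment as a uniform random permutation and averaging the number of fixed points:

```python
import random

random.seed(1)

def avg_fixed_points(n, trials=20_000):
    """Average number of people who get their own hat back at a party of n."""
    total = 0
    for _ in range(trials):
        hats = list(range(n))
        random.shuffle(hats)  # uniform random assignment of hats to people
        total += sum(i == h for i, h in enumerate(hats))
    return total / trials

# Expect each estimate to be close to 1, regardless of n.
estimates = [avg_fixed_points(n) for n in (2, 5, 10)]
print(estimates)
```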
g. Let X be a random variable whose transform is given by M_X(s) = (0.4 + 0.6 e^s)^5. Then

(i) P(X = 0) = P(X = 5)   (ii) P(X > 5) > 0   (iii) P(X = 0) = (0.4)^5   (iv) P(X = 5) = 0.6

Solution: (iii). Note that M_X(s) is the transform of a binomial random variable X with n = 5 trials and probability of success p = 0.6. Thus P(X = 0) = (0.4)^5.

h. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = x/8 for 0 ≤ x ≤ 4. Let S = X1 + X2 + ... + X100. Then P(S > 300) is approximately equal to

(i) 1 − Φ(5)   (ii) 1 − Φ(√5)   (iii) 1 − Φ(5/√2)   (iv) 1 − Φ(2)

Solution: (iii). Since the Xi are iid, by the central limit theorem the distribution of S is approximately normal with mean E[S] and variance var(S), so that

P(S > 300) = 1 − P(S ≤ 300) ≈ 1 − Φ((300 − E[S]) / sqrt(var(S))).

Computing the moments,

E[Xi] = ∫_0^4 x (x/8) dx = (1/8)(4^3/3) = 8/3,
E[Xi^2] = ∫_0^4 x^2 (x/8) dx = (1/8)(4^4/4) = 8,
var(Xi) = E[Xi^2] − (E[Xi])^2 = 8 − 64/9 = 8/9.

Therefore E[S] = 100 (8/3) = 800/3 and var(S) = 100 (8/9) = 800/9, and

P(S > 300) ≈ 1 − Φ((300 − 800/3) / sqrt(800/9)) = 1 − Φ(5/√2).

i. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = 1 for 0 ≤ x ≤ 1. Define Yn = X1 X2 X3 ... Xn for some integer n. Then var(Yn) is equal to

(i) 1/12   (ii) 1/3^n − 1/4^n   (iii) 1/12^n   (iv) 1/2^n

Solution: (ii). Since X1, ..., Xn are independent, we have E[Yn] = E[X1] ... E[Xn]. Similarly, E[Yn^2] = E[X1^2] ... E[Xn^2]. Since E[Xi] = 1/2 and E[Xi^2] = 1/3 for i = 1, ..., n, it follows that

var(Yn) = E[Yn^2] − (E[Yn])^2 = 1/3^n − 1/4^n.
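The moment calculations in part (h) can be checked numerically (a sketch, not part of the original solutions; the seed and sample size are arbitrary). Since the CDF of Xi is F(x) = x^2/16, inverse-transform sampling gives Xi = 4*sqrt(U); Φ is evaluated via math.erf:

```python
import math
import random

random.seed(2)

# Sample Xi with PDF x/8 on [0, 4]: CDF is x^2/16, so X = 4*sqrt(U).
samples = [4 * math.sqrt(random.random()) for _ in range(200_000)]
sample_mean = sum(samples) / len(samples)                      # expect ~8/3
sample_var = sum((x - sample_mean) ** 2 for x in samples) / len(samples)  # ~8/9

# z-score from the solution: (300 - 100*8/3) / sqrt(100*8/9) = 5/sqrt(2)
z = (300 - 100 * 8 / 3) / math.sqrt(100 * 8 / 9)

def phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

approx = 1 - phi(z)  # the CLT approximation of P(S > 300); a small number
print(sample_mean, sample_var, z, approx)
```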
Question 2. Each MacBook has a lifetime that is exponentially distributed with parameter λ, and the lifetimes of different MacBooks are independent. Suppose you have two MacBooks, which you begin using at the same time. Define T1 as the time of the first laptop failure and T2 as the time of the second laptop failure.

a. Compute f_{T1}(t1).

Let M1 be the lifetime of MacBook 1 and M2 the lifetime of MacBook 2, where M1 and M2 are iid exponential random variables with CDF F_M(m) = 1 − e^{−λm}. T1, the time of the first MacBook failure, is the minimum of M1 and M2. To derive the distribution of T1, we first find the CDF F_{T1}(t), and then differentiate to find the PDF f_{T1}(t):

F_{T1}(t) = P(min(M1, M2) ≤ t) = 1 − P(min(M1, M2) > t) = 1 − P(M1 > t) P(M2 > t) = 1 − (1 − F_M(t))^2 = 1 − e^{−2λt}, t ≥ 0.

Differentiating F_{T1}(t) with respect to t yields

f_{T1}(t) = 2λ e^{−2λt}, t ≥ 0.
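The claim that min(M1, M2) is exponential with rate 2λ can be sanity-checked by simulation (a sketch, not part of the original solutions; the rate λ = 0.5, seed, and sample size are arbitrary assumptions):

```python
import math
import random

random.seed(3)
lam = 0.5       # assumed failure rate for this check
N = 100_000

# T1 = min(M1, M2) for iid exponential(lam) lifetimes.
t1 = [min(random.expovariate(lam), random.expovariate(lam)) for _ in range(N)]

mean_t1 = sum(t1) / N                            # expect 1/(2*lam)
frac_le_1 = sum(1 for t in t1 if t <= 1.0) / N   # expect 1 - e^{-2*lam*1}
print(mean_t1, frac_le_1)
```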
b. Let X = T2 − T1. Compute f_{X|T1}(x | t1).

Conditioned on the time of the first MacBook failure, the time until the other MacBook fails is an exponential random variable by the memoryless property. The memoryless property tells us that regardless of the elapsed lifetime of the MacBook, the time until failure has the same exponential CDF. Consequently,

f_{X|T1}(x | t1) = λ e^{−λx}, x ≥ 0.

c. Is X independent of T1? Give a mathematical justification for your answer.

Since we have shown in 2(b) that f_{X|T1}(x | t1) does not depend on t1, X and T1 are independent.

d. Compute f_{T2}(t2) and E[T2].

The time of the second laptop failure is T2 = T1 + X. Since X and T1 were shown to be independent in 2(c), we convolve the densities found in 2(a) and 2(b) to determine f_{T2}(t):

f_{T2}(t) = ∫_0^t f_{T1}(τ) f_X(t − τ) dτ = ∫_0^t 2λ e^{−2λτ} λ e^{−λ(t−τ)} dτ = 2λ e^{−λt} ∫_0^t λ e^{−λτ} dτ = 2λ e^{−λt} (1 − e^{−λt}), t ≥ 0.

Also, by the linearity of expectation, we have

E[T2] = E[T1] + E[X] = 1/(2λ) + 1/λ = 3/(2λ).

An equivalent method is to note that T2 is the maximum of M1 and M2, and to derive the distribution of T2 by our standard CDF-to-PDF method:

F_{T2}(t) = P(max(M1, M2) ≤ t) = P(M1 ≤ t) P(M2 ≤ t) = F_M(t)^2 = 1 − 2e^{−λt} + e^{−2λt}, t ≥ 0.

Differentiating F_{T2}(t) with respect to t yields

f_{T2}(t) = 2λ e^{−λt} − 2λ e^{−2λt}, t ≥ 0,

which is equivalent to our solution by convolution above. Finally, from this density we obtain E[T2] = 2/λ − 1/(2λ) = 3/(2λ), which matches our earlier solution.
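Parts (c) and (d) can both be sanity-checked at once (a sketch, not part of the original solutions; λ = 2.0, the seed, and the sample size are arbitrary assumptions): simulate two lifetimes, record T1 = min and the gap X = T2 − T1, and verify that E[T2] ≈ 3/(2λ) and that the sample covariance of T1 and X is near zero, consistent with independence:

```python
import random

random.seed(4)
lam = 2.0      # assumed failure rate for this check
N = 100_000

t1s, xs = [], []
for _ in range(N):
    m1, m2 = random.expovariate(lam), random.expovariate(lam)
    t1s.append(min(m1, m2))
    xs.append(max(m1, m2) - min(m1, m2))   # gap X = T2 - T1

mean_t1 = sum(t1s) / N
mean_x = sum(xs) / N                        # expect 1/lam (memorylessness)
mean_t2 = mean_t1 + mean_x                  # expect 3/(2*lam)
cov = sum((a - mean_t1) * (b - mean_x) for a, b in zip(t1s, xs)) / N
print(mean_t2, mean_x, cov)
```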
e. Now suppose you have 100 MacBooks, and let Y be the time of the first laptop failure. Find the best answer for P(Y < 0.01).

Y is equal to the minimum of 100 independent exponential random variables. Following the derivation in (a), we determine by analogy:

f_Y(y) = 100λ e^{−100λy}, y ≥ 0.

Integrating over y from 0 to 0.01, we find P(Y < 0.01) = 1 − e^{−λ}.

Your friend, Charlie, loves MacBooks so much he buys Si new MacBooks every day i! On any given day Si is equally likely to be 4 or 8, and all days are independent of each other. Let S be the number of MacBooks Charlie buys over the next 10 days.

f. (6 pts) Find the best approximation for P(S ≤ 68). Express your final answer in terms of Φ( ), the CDF of the standard normal.

Each day, E[Si] = 6 and var(Si) = 4, so E[S] = 60 and var(S) = 40. Using the De Moivre-Laplace approximation to the binomial, and noting that the step size between the values that S can take on is 4 (so half a step is 2),

P(S ≤ 68) ≈ Φ((68 + 2 − 60) / sqrt(40)) = Φ(10/√40) ≈ Φ(1.58).
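Both answers on this page can be cross-checked (a sketch, not part of the original solutions; the assumed rate λ = 0.5, seed, and trial count are arbitrary). For (e), the minimum of 100 iid exponential(λ) lifetimes is exponential(100λ), so P(Y < 0.01) = 1 − e^{−λ}. For (f), S = 40 + 4B with B ~ Binomial(10, 1/2), so the exact P(S ≤ 68) = P(B ≤ 7) can be compared with the half-step normal approximation:

```python
import math
import random
from math import comb, erf, sqrt

random.seed(5)
lam = 0.5        # assumed failure rate for part (e)
trials = 20_000

# Part (e): empirical P(min of 100 exponential lifetimes < 0.01).
hits = sum(1 for _ in range(trials)
           if min(random.expovariate(lam) for _ in range(100)) < 0.01)
p_min = hits / trials            # expect 1 - e^{-lam}

# Part (f): exact tail of S = 40 + 4*B, B ~ Binomial(10, 1/2).
exact = sum(comb(10, k) for k in range(8)) / 2 ** 10   # P(B <= 7)

def phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(t / sqrt(2)))

approx = phi((68 + 2 - 60) / sqrt(40))   # De Moivre-Laplace with half-step 2
print(p_min, exact, approx)
```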
Question 3. Saif is a well-intentioned though slightly indecisive fellow. Every morning he flips a coin to decide where to go. If the coin is heads he drives to the mall; if it comes up tails he volunteers at the local shelter. Saif's coin is not necessarily fair; rather, it possesses a probability of heads equal to q. We do not know q, but we do know it is well modeled by a random variable Q with density

f_Q(q) = 2q for 0 ≤ q ≤ 1, and 0 otherwise.

Assume that conditioned on Q, the coin flips are independent. Note parts a, b, c, and {d, e} may be answered independently of each other.

a. (4 pts) What is the probability that Saif goes to the local shelter if he flips the coin once?

Let Xi be the outcome of a coin toss on the i-th trial, where Xi = 1 if the coin lands heads and Xi = 0 if it lands tails. By the total probability theorem,

P(X1 = 0) = ∫_0^1 P(X1 = 0 | Q = q) f_Q(q) dq = ∫_0^1 (1 − q) 2q dq = 1/3.

In an attempt to promote virtuous behavior, Saif's father offers to pay him $4 every day he volunteers at the local shelter. Define X as Saif's payout if he flips the coin every morning for the next 30 days.

b. Find var(X).

Let Yi be a Bernoulli random variable describing the outcome of the coin toss on morning i: Yi = 1 corresponds to the event that Saif goes to the local shelter on morning i, and Yi = 0 corresponds to the event that he goes to the mall. Given that the coin lands heads with probability q, i.e. that Q = q, we have P(Yi = 1 | Q = q) = 1 − q and P(Yi = 0 | Q = q) = q for i = 1, ..., 30. Saif's payout over the next 30 days is described by the random variable X = 4(Y1 + Y2 + ... + Y30). By the law of total variance,

var(X) = 16 var(Y1 + Y2 + ... + Y30) = 16 [ var(E[Y1 + ... + Y30 | Q]) + E[var(Y1 + ... + Y30 | Q)] ].

Now note that, conditioned on Q = q, the variables Y1, ..., Y30 are independent, so var(Y1 + ... + Y30 | Q) = var(Y1 | Q) + ... + var(Y30 | Q) = 30 Q(1 − Q). Also E[Y1 + ... + Y30 | Q] = 30(1 − Q), whose variance is 30^2 var(Q). Since

E[Q] = ∫_0^1 2q^2 dq = 2/3 and E[Q^2] = ∫_0^1 2q^3 dq = 1/2,

we have var(Q) = 1/2 − 4/9 = 1/18 and E[Q(1 − Q)] = 2/3 − 1/2 = 1/6. Therefore

var(X) = 16 [30^2 (1/18)] + 16 [30 (1/6)] = 800 + 80 = 880.
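The answers 1/3 and 880 can be sanity-checked by simulation (a sketch, not part of the original solutions; the seed and sample size are arbitrary). Since Q has CDF q^2 on [0, 1], inverse-transform sampling gives Q = sqrt(U):

```python
import math
import random

random.seed(7)
N = 100_000

shelter_once = 0
payouts = []
for _ in range(N):
    q = math.sqrt(random.random())   # Q has density 2q, CDF q^2
    # Part (a): one flip; tails (probability 1 - q) means shelter.
    if random.random() >= q:
        shelter_once += 1
    # Part (b): 30 mornings, $4 per shelter day.
    days = sum(1 for _ in range(30) if random.random() >= q)
    payouts.append(4 * days)

p_shelter = shelter_once / N                     # expect ~1/3
mean_x = sum(payouts) / N                        # expect 4*30*(1/3) = 40
var_x = sum((x - mean_x) ** 2 for x in payouts) / N   # expect ~880
print(p_shelter, mean_x, var_x)
```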
c. Let event B be that Saif goes to the local shelter at least once in k days. Find the conditional density of Q given B, f_{Q|B}(q).

By Bayes' rule,

f_{Q|B}(q) = P(B | Q = q) f_Q(q) / ∫_0^1 P(B | Q = q) f_Q(q) dq.

Given Q = q, the only way Saif avoids the shelter for k straight days is to flip k heads, so P(B | Q = q) = 1 − q^k. Hence

f_{Q|B}(q) = (1 − q^k) 2q / ∫_0^1 (1 − q^k) 2q dq = 2q(1 − q^k) / (1 − 2/(k + 2)) for 0 ≤ q ≤ 1,

and f_{Q|B}(q) = 0 otherwise.

While shopping at the mall, Saif gets a call from his sister Mais. They agree to meet at the Coco Cabana Courtyard at exactly 1:30 PM. Unfortunately Mais arrives Z minutes late, where Z is a continuous uniform random variable on [0, 10] minutes. Saif is furious that Mais has kept him waiting, and demands Mais pay him R dollars, where R = e^{Z+2}.

d. Find Saif's expected payout, E[R].

E[R] = ∫_0^{10} e^{z+2} f_Z(z) dz = (1/10) ∫_0^{10} e^{z+2} dz = (e^{12} − e^2) / 10.

e. Find the density of Saif's payout, f_R(r).

For e^2 ≤ r ≤ e^{12},

F_R(r) = P(e^{Z+2} ≤ r) = P(Z ≤ ln(r) − 2) = ∫_0^{ln(r)−2} (1/10) dz = (ln(r) − 2) / 10.

Differentiating with respect to r yields

f_R(r) = 1/(10r) for e^2 ≤ r ≤ e^{12}, and f_R(r) = 0 otherwise.
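Parts (d) and (e) can be sanity-checked by simulation (a sketch, not part of the original solutions; the seed and sample size are arbitrary): the sample mean of R should approach (e^12 − e^2)/10 ≈ 16275, and since F_R(r) = (ln r − 2)/10, the transformed variable ln R should be uniform on [2, 12], with mean 7:

```python
import math
import random

random.seed(6)
N = 200_000

# Z ~ uniform(0, 10) minutes late; payout R = e^{Z+2} dollars.
rs = [math.exp(random.uniform(0, 10) + 2) for _ in range(N)]

mean_r = sum(rs) / N                           # expect (e^12 - e^2)/10
mean_log = sum(math.log(r) for r in rs) / N    # ln R ~ U[2, 12], so expect 7
print(mean_r, mean_log)
```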
MIT OpenCourseWare
http://ocw.mit.edu

6.041 / 6.431 Probabilistic Systems Analysis and Applied Probability
Fall 2010

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.