Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80)



CS 30, Winter 2016, by Prasad Jayanti

1. (10 points) Here is the famous Monty Hall Puzzle. Suppose you are on a game show, and you are given the choice of three doors. Behind one door is a car; behind the others, goats. You pick a door, say number 1, and the host, who knows what's behind the doors, opens another door, say number 3, which has a goat. He says to you, "Do you want to pick door number 2?" Is it to your advantage to switch your choice of doors? Back up your answer with rigorous reasoning, which includes a clear statement of the sample space, events of interest, and computation of probabilities. Make the following two reasonable assumptions: (i) the car is equally likely to be hidden behind each of the three doors, and (ii) after the player picks a door, the host must open a different door with a goat behind it and offer the player the choice of staying with the original door or switching.

Solution: Without loss of generality, let A denote the door that the player picks, and let B and C be the other two doors. We can represent the sample space as S = {(Ȧ, B, C), (A, Ḃ, C), (A, B, Ċ)}, where the player's door A appears first in each outcome and the dot on a door indicates that the car is behind that door. From the statement of the problem, the probability of each of the three outcomes in S is 1/3. Let E denote the event of winning by switching doors and F denote the event of winning by not switching. Then, from the rules of the game, we have:

E = {(A, Ḃ, C), (A, B, Ċ)}, and P(E) = 2/3
F = {(Ȧ, B, C)}, and P(F) = 1/3

So, it is advantageous to switch!

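The 2/3 answer can also be checked empirically. The following short Python simulation is my own illustrative sketch, not part of the original solution (the door labels and trial count are arbitrary choices); it plays the game under assumptions (i) and (ii) and estimates the winning probability of each strategy.

    import random

    def play(switch: bool) -> bool:
        """Play one round of Monty Hall; return True if the player wins the car."""
        doors = [0, 1, 2]
        car = random.choice(doors)      # assumption (i): car placed uniformly
        pick = random.choice(doors)     # player's initial pick
        # Assumption (ii): host opens a goat door different from the player's pick.
        host = random.choice([d for d in doors if d != pick and d != car])
        if switch:
            pick = next(d for d in doors if d != pick and d != host)
        return pick == car

    trials = 100_000
    for switch in (False, True):
        wins = sum(play(switch) for _ in range(trials))
        print(f"switch={switch}: win rate ~ {wins / trials:.3f}")
    # Prints roughly 0.333 for staying and 0.667 for switching.
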
2. (6 points) In Roulette, you can bet $1 and win $36 with probability 1/38, as there are 38 numbers on the wheel. Suppose you have $105 and have enough time to make 105 bets on the wheel. What is the probability that you come out ahead, that is, with more than $105? After you write down the expression, compute the final numerical value, which I am sure you'll find interesting.

Solution: The experiment consists of 105 mutually independent Bernoulli trials with success probability 1/38. The sample space S is the set of all possible outcomes of the 105 trials, which can be thought of as the set of all bit strings of length 105, where bit 1 or 0 at position i denotes success or failure, respectively, in the ith trial. Since each win returns $36, two wins return only $72 < $105, while three wins return $108 > $105; so we do not come out ahead if and only if we win zero, one, or two of the 105 bets. Let A be the event that we win zero, one, or two bets. Then we want P(S − A) = 1 − P(A). Now,

P(A) = C(105, 0)(37/38)^105 + C(105, 1)(1/38)(37/38)^104 + C(105, 2)(1/38)^2(37/38)^103 ≈ 0.4758;

hence, P(S − A) = 1 − P(A) ≈ 0.5242.

Note that the probability that we end up ahead is more than half; but each $1 bet returns only 36/38 dollars in expectation (a loss of $1/19 per bet), so in expectation we will lose irrespective of the number of rounds played.

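As a numeric cross-check (a sketch using only the Python standard library; the variable names are mine), the three binomial terms can be summed exactly with rational arithmetic:

    from math import comb
    from fractions import Fraction

    p, q, n = Fraction(1, 38), Fraction(37, 38), 105
    # P(at most 2 wins) = sum of C(n, i) p^i q^(n-i) for i = 0, 1, 2
    p_not_ahead = sum(comb(n, i) * p**i * q**(n - i) for i in range(3))
    print(float(p_not_ahead))      # ~0.4758
    print(1 - float(p_not_ahead))  # ~0.5242, probability of coming out ahead
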
3. (10 points) Ramesh can get to work three different ways: by bicycle, by bus, or by car. Because of commuter traffic, there is a 50% chance that he will be late when he drives his car. When he takes the bus, which uses a special lane, there is a 20% chance that he will be late. The probability that he will be late when he bikes to work is only 5%. Ramesh arrives late one day. His boss wants to estimate the probability that he drove his car to work that day.

(a) Suppose the boss assumes that there is a 1/3 chance that Ramesh takes each of the three ways he can get to work. Under this assumption, what estimate for the probability that Ramesh drove his car does the boss obtain from Bayes' Theorem?

(b) Suppose the boss knows that Ramesh drives to work 30% of the time, takes the bus only 10% of the time, and takes his bicycle 60% of the time. With this information, what estimate for the probability that Ramesh drove his car does the boss obtain from Bayes' Theorem?

When solving the above problem, clearly define the events and state the probabilities. You need to use the Generalized Bayes' Theorem.

Solution: The sample space S is {late-car, late-bus, late-bike, not-late-car, not-late-bus, not-late-bike}. Define the following events:

C, Ramesh travels by car, is {late-car, not-late-car}
B, Ramesh travels by bus, is {late-bus, not-late-bus}
K, Ramesh travels by bike, is {late-bike, not-late-bike}
L, Ramesh is late, is {late-car, late-bus, late-bike}

We are given: P(L | C) = 1/2, P(L | B) = 1/5, and P(L | K) = 1/20. The problem asks for P(C | L).

(a) We are given P(C) = P(B) = P(K) = 1/3. Therefore, using the generalized Bayes' Theorem:

P(C | L) = P(L | C)P(C) / (P(L | C)P(C) + P(L | B)P(B) + P(L | K)P(K)) = (1/2)/(1/2 + 1/5 + 1/20) = 2/3

(b) We are given P(C) = 3/10, P(B) = 1/10, P(K) = 3/5. Therefore, using the generalized Bayes' Theorem:

P(C | L) = P(L | C)P(C) / (P(L | C)P(C) + P(L | B)P(B) + P(L | K)P(K)) = (3/20)/(3/20 + 1/50 + 3/100) = 3/4

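A small helper (hypothetical; the function name and dictionary layout are my own, not from the assignment) recomputes both posteriors from the formula above:

    from fractions import Fraction as F

    def posterior(priors, likelihoods, target):
        """Generalized Bayes: P(target | evidence) from priors and P(evidence | mode)."""
        total = sum(priors[m] * likelihoods[m] for m in priors)
        return priors[target] * likelihoods[target] / total

    late = {"car": F(1, 2), "bus": F(1, 5), "bike": F(1, 20)}
    print(posterior({"car": F(1, 3), "bus": F(1, 3), "bike": F(1, 3)}, late, "car"))    # 2/3
    print(posterior({"car": F(3, 10), "bus": F(1, 10), "bike": F(3, 5)}, late, "car"))  # 3/4
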
4. (10 points) I made three vegetable burgers. The first came out really well, the second one had one side burned, and the last was hopeless: both sides were burned. Disgusted with my performance, I picked a burger (uniformly) at random and dropped it on the ground. It rolled and settled. When I went to it, the side facing up was burned. Given this knowledge, what is the probability that the burger I dropped is the hopeless, burned-on-both-sides burger? Solve this problem in two different ways: first without using Bayes' Theorem and then using Bayes' Theorem.

Solution:

(a) Without using Bayes' Theorem: Let us denote the well-made, one-side-burned, and both-sides-burned burgers by W, O, and B, respectively, and denote their sides by subscripts 1 and 2. Then the sample space of the problem is S = {W_1, W_2, O_1, O_2, B_1, B_2}, where each outcome (the side facing up) is equally likely. Let B be the event that the dropped burger is the hopeless one; then B = {B_1, B_2}. Let U_B be the event that the side facing up is burned; then U_B = {O_2, B_1, B_2}, where we assume that O_2 is the burned side of burger O. Thus, the probability is

P(B | U_B) = P(B ∩ U_B)/P(U_B) = (2/6)/(3/6) = 2/3.

(b) Using Bayes' Theorem: Let W, O, B denote the events that the burger I picked is the well-made, one-side-burned, and both-sides-burned one, respectively. Since the dropped burger was picked uniformly at random, we have P(W) = P(O) = P(B) = 1/3. Let U_B be the event that the side facing up is burned. Then P(U_B | B) is the probability that the side facing up is burned given that the both-sides-burned burger was picked, which is clearly 1. By similar arguments, P(U_B | W) = 0 and P(U_B | O) = 1/2. We want to compute P(B | U_B). Using the generalized Bayes' Theorem:

P(B | U_B) = P(U_B | B)P(B) / (P(U_B | W)P(W) + P(U_B | O)P(O) + P(U_B | B)P(B))
           = (1 · (1/3)) / (0 · (1/3) + (1/2) · (1/3) + 1 · (1/3))
           = (1/3)/(1/2) = 2/3.

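For a quick empirical sanity check (again an illustrative sketch, not part of the graded solution), one can simulate the drop and condition on seeing a burned side:

    import random

    # Burgers as (side1_burned, side2_burned); a drop shows a uniformly random side.
    burgers = [(False, False), (False, True), (True, True)]

    shown_burned = both_burned = 0
    for _ in range(100_000):
        burger = random.choice(burgers)   # pick a burger uniformly
        up = random.choice(burger)        # the side that lands facing up
        if up:                            # condition on seeing a burned side
            shown_burned += 1
            both_burned += (burger == (True, True))

    print(both_burned / shown_burned)     # ~0.667, matching P(B | U_B) = 2/3
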
5. (2+6+4 = 12 points) A McDonald's Happy Meal contains one of the four Teenage Mutant Ninja Turtles.

(a) Assuming each meal gets a toy uniformly at random, how many times do you expect to have to eat a Happy Meal to get your favorite character, Leonardo?

(b) How many times do you expect to eat a Happy Meal to get all four toys, in whatever order you get them?

(c) Generalize the solution to n toys.

Solution:

(a) We can think of the picking of a toy at random as a Bernoulli trial, with picking Leonardo representing success and picking any of the other three toys representing failure. Thus, the success probability is p = 1/4. Let X denote the number of trials needed to get Leonardo. We are asked for E[X]. Since the random variable X has a geometric distribution (with parameter p), we know from class that E[X] = 1/p = 4.

(b) Let the random variable X denote the number of trials needed to get all four types of toys. The problem asks for E[X]. X can be expressed as X_1 + X_2 + X_3 + X_4, where X_1 is the number of trials needed to get the first toy t_1; X_2 is the number of trials needed to get a type t_2 different from t_1; X_3 is the number of trials needed to get a type t_3 different from t_1, t_2; and X_4 is the number of trials needed to get a type t_4 different from t_1, t_2, t_3.

Since the probability p_1 of getting some toy in a trial is 1, E[X_1] = 1/p_1 = 1. Once a toy is obtained, the probability p_2 of getting a different type of toy in a trial is 3/4; therefore, E[X_2] = 1/p_2 = 4/3. Once two types of toys are obtained, the probability p_3 of getting a new type in a trial is 2/4 = 1/2; therefore, E[X_3] = 1/p_3 = 2. Once three types of toys are obtained, the probability p_4 of getting a new type in a trial is 1/4; therefore, E[X_4] = 1/p_4 = 4. Thus,

E[X] = E[X_1 + X_2 + X_3 + X_4]
     = E[X_1] + E[X_2] + E[X_3] + E[X_4]   (by Linearity of Expectation)
     = 1 + 4/3 + 2 + 4 = 25/3

(c) Generalizing the above for n toys, we have X = Σ_{i=1}^{n} X_i, where X_i has a geometric distribution with parameter p_i = (n − i + 1)/n, so E[X_i] = 1/p_i = n/(n − i + 1). Then:

E[X] = E[Σ_{i=1}^{n} X_i]
     = Σ_{i=1}^{n} E[X_i]             (Linearity of Expectation)
     = Σ_{i=1}^{n} n/(n − i + 1)
     = n Σ_{i=1}^{n} 1/(n − i + 1)    (pulling out n)
     = n Σ_{j=1}^{n} 1/j              (change of variable: j = n − i + 1)

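The closed form from part (c) is easy to evaluate exactly (a sketch; expected_meals is a hypothetical helper name, not from the assignment):

    from fractions import Fraction

    def expected_meals(n: int) -> Fraction:
        """E[X] = n * H_n, the sum derived in part (c)."""
        return n * sum(Fraction(1, j) for j in range(1, n + 1))

    print(expected_meals(1), expected_meals(4), expected_meals(10))
    # 1, 25/3, 7381/252  -- part (b)'s 25/3 is the n = 4 case
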
6. (4 points) Let X and Y be the random variables that count the number of heads and the number of tails, respectively, that come up when two coins are flipped. Show that X and Y are not independent.

Solution: The sample space is {HH, HT, TH, TT}, where each outcome has probability 1/4. We have P(X = 2) = P(HH) = 1/4 and P(Y = 2) = P(TT) = 1/4. Also, P(X = 2 ∩ Y = 2) = P(∅) = 0. Since P(X = 2 ∩ Y = 2) ≠ P(X = 2) · P(Y = 2), the random variables X and Y are not independent.

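The failed product rule can be verified by brute-force enumeration of the four equally likely outcomes (an illustrative sketch; names are mine):

    from itertools import product

    outcomes = list(product("HT", repeat=2))   # HH, HT, TH, TT, each with prob 1/4
    p = lambda event: sum(1 for o in outcomes if event(o)) / len(outcomes)

    p_x2 = p(lambda o: o.count("H") == 2)      # P(X = 2)
    p_y2 = p(lambda o: o.count("T") == 2)      # P(Y = 2)
    p_both = p(lambda o: o.count("H") == 2 and o.count("T") == 2)  # P(X = 2 and Y = 2)
    print(p_both, p_x2 * p_y2)                 # 0.0 vs 0.0625: not independent
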
7. (8 points) Let X be the sum of the numbers when a pair of dodecahedral (twelve-sided) dice is rolled. (a) What is E(X)? (b) What is V(X)?

Solution: Let A = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}. The sample space is S = A × A, and |S| = 144. Let X_1 and X_2 be the random variables whose values are the number on the first die and the number on the second die, respectively. Then X = X_1 + X_2. Let us first compute the expected value and the variance of just X_1:

E[X_1] = Σ_{x=1}^{12} x · P(X_1 = x) = (1/12) Σ_{x=1}^{12} x = 78/12 = 6.5

E[X_1^2] = Σ_{x=1}^{12} x^2 · P(X_1 = x) = (1/12) · (12 · 13 · 25)/6 = 325/6

V[X_1] = E[X_1^2] − E[X_1]^2 = 325/6 − (13/2)^2 ≈ 11.92

Now, we can compute E[X] and V[X] as follows:

(a) E[X] = E[X_1 + X_2]
         = E[X_1] + E[X_2]   (by Linearity of Expectation)
         = E[X_1] + E[X_1]   (since X_1 and X_2 have the same distribution)
         = 13

(b) V[X] = V[X_1 + X_2]
         = V[X_1] + V[X_2]   (since X_1 and X_2 are independent random variables)
         = V[X_1] + V[X_1]   (since X_1 and X_2 have the same distribution)
         ≈ 23.84

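The moments above can be reproduced exactly with rational arithmetic (a sketch; variable names are mine):

    from fractions import Fraction

    sides = range(1, 13)                              # one dodecahedral die
    e_x1 = Fraction(sum(sides), 12)                   # E[X_1] = 13/2
    e_x1_sq = Fraction(sum(x * x for x in sides), 12) # E[X_1^2] = 325/6
    v_x1 = e_x1_sq - e_x1**2                          # V[X_1] = 143/12

    print(2 * e_x1)                   # E[X] = 13
    print(2 * v_x1, float(2 * v_x1))  # V[X] = 143/6 ~ 23.83
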
8. (5 points) Suppose we have a Bernoulli trial with success probability p and failure probability q = 1 − p. Let X be an indicator random variable that is 1 for success and 0 for failure. (a) Determine V[X]. (b) Show that V[X] ≤ 1/4 for any Bernoulli trial.

Solution:

(a) The sample space is S = {succ, fail} and the probability distribution is P(succ) = p, P(fail) = 1 − p. The random variable X is defined by X(succ) = 1 and X(fail) = 0. Then:

E[X] = 1 · P(succ) + 0 · P(fail) = p
E[X^2] = 1^2 · P(succ) + 0^2 · P(fail) = p
V[X] = E[X^2] − E[X]^2 = p − p^2 = p(1 − p) = pq

(b) From Part (a), V[X] = p(1 − p) = p − p^2 = 0.25 − (p − 0.5)^2 ≤ 0.25, since a square is non-negative.

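A quick numeric sweep (merely a sanity check on the algebra, not part of the graded solution) confirms that p(1 − p) peaks at 1/4, attained at p = 0.5:

    # Evaluate V[X] = p(1 - p) on a fine grid of p values in [0, 1].
    ps = [i / 1000 for i in range(1001)]
    print(max(p * (1 - p) for p in ps))  # 0.25, at p = 0.5
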
9. (15 points) What is the expected number of items needed to fill all slots of a hash table of size k? Explain your reasoning clearly. It is acceptable to leave your answer as a summation (using the Σ notation), but you should leave the answer in as elegant a form as possible. You should assume that, for all slots i ≠ j and for each item x, the probability that x hashes into slot i is the same as the probability that x hashes into slot j, regardless of what slots other items hash into.

Solution: This problem is analogous to Problem 5(c): an item hashing to a slot different from the already occupied slots corresponds to a toy from a meal turning out to be different from the already acquired toys. Thus, the expected number of items needed to fill all k slots is the same as the expected number of meals needed to acquire all of k toys. So, as in Problem 5(c), the answer is k Σ_{j=1}^{k} 1/j.

Alternate solution: Let X be the random variable that represents the number of items needed to fill all slots of a hash table of size k, and let X_i be the random variable that represents the number of items needed before one of them hashes to an empty slot, given that i of the k slots are already occupied (and the remaining k − i slots are empty). We note that

X = X_0 + X_1 + ... + X_{k−1} = Σ_{i=0}^{k−1} X_i.

Since the probability p_i that an item hashes to an empty slot, given that i of the k slots are occupied, is (k − i)/k, and X_i has a geometric distribution with parameter p_i, we have E[X_i] = 1/p_i = k/(k − i). Then:

E[X] = E[Σ_{i=0}^{k−1} X_i]
     = Σ_{i=0}^{k−1} E[X_i]     (by Linearity of Expectation)
     = Σ_{i=0}^{k−1} k/(k − i)
     = k Σ_{j=1}^{k} 1/j        (change of variables: j = k − i)

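Under the stated uniform-hashing assumption, a simulation (an illustrative sketch; items_to_fill is a hypothetical helper name) matches k · H_k closely:

    import random

    def items_to_fill(k: int) -> int:
        """Insert uniformly random items until every one of the k slots is occupied."""
        occupied = set()
        items = 0
        while len(occupied) < k:
            occupied.add(random.randrange(k))  # each item hashes to a uniform slot
            items += 1
        return items

    k = 10
    exact = k * sum(1 / j for j in range(1, k + 1))  # k * H_k ~ 29.29 for k = 10
    estimate = sum(items_to_fill(k) for _ in range(50_000)) / 50_000
    print(exact, estimate)
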
Extra Credit

1. (5 + 5 + 15 = 25 points) Consider a set S of n people such that, for all distinct x and y, it is the case that either x and y like each other or x and y hate each other. Let us call a subset T ⊆ S a friend group if, for all distinct x and y in T, x and y like each other, and let us call a subset T ⊆ S a hate group if, for all distinct x and y in T, x and y hate each other. Let us define R(k) as the smallest natural number such that every set S of R(k) people contains either a friend group or a hate group of size k. For example, R(2) = 2.

(a) Prove that R(3) > 5.

Proof: Let S = {p_0, p_1, p_2, p_3, p_4}, and let p_i and p_j be friends if and only if (i + 1) mod 5 = j or (i − 1) mod 5 = j; that is, the friendships form a 5-cycle. Then it is easy to check that there is neither a friend group nor a hate group of size 3. Hence, R(3) > 5.

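The claim that this 5-cycle has no friend group or hate group of size 3 can be checked exhaustively over all C(5, 3) = 10 triples (an illustrative sketch; names are mine):

    from itertools import combinations

    people = range(5)
    friends = lambda i, j: (i + 1) % 5 == j or (i - 1) % 5 == j   # the 5-cycle

    def has_mono_triple() -> bool:
        """Is there a friend group or a hate group of size 3?"""
        for trio in combinations(people, 3):
            pairs = list(combinations(trio, 2))
            if all(friends(i, j) for i, j in pairs) or not any(friends(i, j) for i, j in pairs):
                return True
        return False

    print(has_mono_triple())  # False: the 5-cycle witnesses R(3) > 5
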
(b) Prove that R(3) ≤ 6. (Together with the previous part, this gives R(3) = 6.)

Proof: To prove the claim, we need to show that every set of six people contains a friend group or a hate group of size 3. Let S be a set of six people and let p be an element of S. Since p either likes a majority of S − {p} or hates a majority of S − {p}, it follows that either p is friends with at least three people in S − {p} or p hates at least three people in S − {p}. Without loss of generality, suppose that p is friends with three people a, b, c in S − {p}. If any two among a, b, c are friends, then those two and p constitute a friend group of size 3. On the other hand, if no two of a, b, c are friends, the three of them constitute a hate group of size 3.

(c) Prove by induction that R(k) ≤ 2^(k(k−1)/2) for all k ≥ 2.

Proof: Let P(n) be the statement that every set of at least 2^(n(n−1)/2) people contains a friend group or a hate group of size n. The statement that we wish to prove is implied by the following claim.

Claim: ∀n ≥ 2, P(n).

Proof: We prove the claim by induction.

Base Case: To show P(2). P(2) states that every set of at least 2^(2(2−1)/2) = 2 people contains a friend group or a hate group of size 2, which is clearly true, since any two people either like or hate each other. Hence, we have the base case.

Inductive Hypothesis: Assume k ≥ 2 and P(k): every set of at least 2^(k(k−1)/2) people contains a friend group or a hate group of size k.

Inductive Step: To show P(k + 1), i.e., to show that every set of at least 2^(k(k+1)/2) people contains a friend group or a hate group of size k + 1. Let S_0 be a set of at least 2^(k(k+1)/2) people. We note that 2^(k(k+1)/2) = 2^k · 2^(k(k−1)/2). Consider the following procedure, which defines a sequence of successively smaller sets S_0, S_1, ..., S_k, a sequence of people p_0, p_1, ..., p_k, and a sequence of bits f_0, f_1, ..., f_k:

for i = 0 to k
    // Loop invariant: ∀j ∈ {0, 1, ..., i−1}: p_j ∈ S_j, S_{j+1} ⊆ S_j − {p_j}, and |S_j| ≥ 2^(k−j) · 2^(k(k−1)/2)
    let p_i be any element of S_i
    if p_i is friends with a majority of the people in S_i − {p_i}
        let S_{i+1} = {x ∈ S_i − {p_i} | p_i is a friend of x} and let f_i = 1
    else
        let S_{i+1} = {x ∈ S_i − {p_i} | p_i hates x} and let f_i = 0
    // Since |S_i| ≥ 2^(k−i) · 2^(k(k−1)/2), it follows that |S_{i+1}| ≥ 2^(k−i−1) · 2^(k(k−1)/2), which establishes the invariant

There are three cases: (1) f_0 = f_1 = ... = f_{k−1} = 1, or (2) f_0 = f_1 = ... = f_{k−1} = 0, or (3) there exist i, j ∈ {0, 1, ..., k−1} such that f_i = 1 and f_j = 0.

In Case (1), p_0, p_1, ..., p_k constitute a friend group of size k + 1. In Case (2), p_0, p_1, ..., p_k constitute a hate group of size k + 1. In Case (3), it follows from the invariant that |S_k| ≥ 2^(k(k−1)/2). Then, by the Induction Hypothesis, S_k has a friend group or a hate group of size k. If S_k has a friend group F of size k, then F ∪ {p_i} constitutes a friend group of size k + 1, since f_i = 1 means p_i is friends with everyone in S_{i+1} ⊇ S_k. If S_k has a hate group H of size k, then H ∪ {p_j} constitutes a hate group of size k + 1. Thus, in all of the cases, S_0 contains a friend group or a hate group of size k + 1. Hence, we have P(k + 1), which completes the inductive step.

Then, by the Principle of Weak Mathematical Induction, we have the claim: ∀n ≥ 2, P(n).

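To see the halving idea executably, here is a sketch of the standard pivot-halving argument, a simpler cousin of the induction above: tag each pivot with the relation it has to the majority of the remaining pool and keep that side; among 2k − 1 pivots, some k carry the same tag and form a monochromatic group. All names are mine, and the 2^(2k) population size used in the demo is the looser bound this greedy version needs, not the bound proved above.

    import random
    from itertools import combinations

    def mono_group(likes, people, k):
        """Greedy pivot-halving: returns (group, bit) where group is a friend
        group (bit 1) or hate group (bit 0) of size k, given enough people."""
        pivots = []
        pool = list(people)
        while len(pivots) < 2 * k - 1 and pool:
            p = pool.pop()
            side = [x for x in pool if likes[p][x]]       # p's friends in the pool
            other = [x for x in pool if not likes[p][x]]  # people p hates
            if len(side) >= len(other):
                pivots.append((p, 1))
                pool = side
            else:
                pivots.append((p, 0))
                pool = other
        for bit in (1, 0):
            group = [p for p, b in pivots if b == bit]
            if len(group) >= k:
                return group[:k], bit
        return None  # only possible if the pool ran dry too early

    random.seed(1)
    n, k = 64, 3   # 64 = 2^(2k) people is comfortably enough for k = 3
    likes = [[False] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        likes[i][j] = likes[j][i] = random.random() < 0.5
    group, bit = mono_group(likes, range(n), k)
    # Sanity check: every pair in the returned group has the same relation.
    assert all(likes[a][b] == bool(bit) for a, b in combinations(group, 2))
    print(group, "friend group" if bit else "hate group")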