Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80)


CS 30, Winter 2016, by Prasad Jayanti

1. (10 points) Here is the famous Monty Hall Puzzle. Suppose you are on a game show, and you are given the choice of three doors. Behind one door is a car; behind the others, goats. You pick a door, say number 1, and the host, who knows what's behind the doors, opens another door, say number 3, which has a goat. He says to you, "Do you want to pick door number 2?" Is it to your advantage to switch your choice of doors? Back up your answer with rigorous reasoning, which includes a clear statement of the sample space, events of interest, and computation of probabilities. Make the following two reasonable assumptions: (i) the car is equally likely to be hidden behind each of the three doors, and (ii) after the player picks a door, the host must open a different door with a goat behind it and offer the player the choice of staying with the original door or switching.

Solution: Without loss of generality, let A denote the door that the player picks, and B and C be the other doors. We can represent the sample space as S = {(Ȧ, B, C), (A, Ḃ, C), (A, B, Ċ)}, where the dot on a door indicates that the car is behind that door (A is the picked door in every outcome). From the statement of the problem, the probability of each of the three outcomes in S is 1/3. Let E denote the event of winning by switching the door and F denote the event of winning by not switching the door. Then, from the rules of the game, we have:

E = {(A, Ḃ, C), (A, B, Ċ)}, and P(E) = 2/3
F = {(Ȧ, B, C)}, and P(F) = 1/3

So, it is advantageous to switch!

2. (6 points) In Roulette, you can bet $1 and win $36 with probability 1/38, as there are 38 numbers on the wheel. Suppose you have $105 and have enough time to make 105 bets on the wheel. What is the probability that you come out ahead, that is, with more than $105?
After you write down the expression, compute the final numerical value, which I am sure you'll find interesting.

Solution: The experiment is 105 mutually independent Bernoulli trials with probability of success 1/38. The sample space S is the set of all possible outcomes of 105 mutually independent Bernoulli trials, which can be thought of as the set of all bit strings of length 105, where bit 1 or 0 at position i denotes success or failure, respectively, in the ith Bernoulli trial. We observe that we don't come out ahead if and only if we win zero, one, or two bets out of 105 bets (three or more wins return at least 3 · $36 > $105). Let A be the event that we win zero, one, or two bets. Then we want to find P(S − A) = 1 − P(A). Now,

P(A) = C(105, 0)(37/38)^105 + C(105, 1)(1/38)(37/38)^104 + C(105, 2)(1/38)^2(37/38)^103 ≈ 0.476;

hence, P(S − A) = 1 − P(A) ≈ 0.524.

Note that the probability that we end up ahead is more than half, but in expectation we will lose irrespective of the number of rounds played: each $1 bet returns 36 · (1/38) ≈ $0.947 in expectation, a loss of about 5.3 cents per bet.

3. (10 points) Ramesh can get to work three different ways: by bicycle, by bus, or by car. Because of commuter traffic, there is a 50% chance that he will be late when he drives his car. When he takes
the bus, which uses a special lane, there is a 20% chance that he will be late. The probability that he will be late when he bikes to work is only 5%. Ramesh arrives late one day. His boss wants to estimate the probability that he drove his car to work that day. (a) Suppose the boss assumes that there is a 1/3 chance that Ramesh takes each of the three ways he can get to work. Under this assumption, what estimate for the probability that Ramesh drove his car does the boss obtain from Bayes' Theorem? (b) Suppose the boss knows that Ramesh drives to work 30% of the time, takes the bus only 10% of the time, and takes his bicycle 60% of the time. With this information, what estimate for the probability that Ramesh drove his car does the boss obtain from Bayes' Theorem? When solving the above problem, clearly define the events and state the probabilities. You need to use the Generalized Bayes' Theorem.

Solution: The sample space S is {late-car, late-bus, late-bike, notlate-car, notlate-bus, notlate-bike}. Define the following events:

C, Ramesh travels by car, is {late-car, notlate-car}
B, Ramesh travels by bus, is {late-bus, notlate-bus}
K, Ramesh travels by bike, is {late-bike, notlate-bike}
L, Ramesh is late, is {late-car, late-bus, late-bike}

We are given: P(L | C) = 1/2, P(L | B) = 1/5, and P(L | K) = 1/20. The problem asks for P(C | L).

(a) We are given P(C) = P(B) = P(K) = 1/3. Therefore, using the generalized Bayes' Theorem:

P(C | L) = P(L | C)P(C) / (P(L | C)P(C) + P(L | B)P(B) + P(L | K)P(K)) = (1/2) / (1/2 + 1/5 + 1/20) = 2/3

(b) We are given P(C) = 3/10, P(B) = 1/10, P(K) = 3/5. Therefore, using the generalized Bayes' Theorem:

P(C | L) = P(L | C)P(C) / (P(L | C)P(C) + P(L | B)P(B) + P(L | K)P(K)) = (3/20) / (3/20 + 1/50 + 3/100) = 3/4

4. (10 points) I made three vegetable burgers. The first came out really well, the second one had one side burned, and the last was hopeless: both sides were burned. Disgusted with my performance, I picked a burger (uniformly) at random and dropped it on the ground. It rolled and settled.
When I went to it, the side facing up was burned. Given this knowledge, what is the probability that the burger I dropped is the hopeless, burned-on-both-sides burger? Solve this problem in two different ways: first without using Bayes' theorem and then using Bayes' theorem.

Solution:
(a) Without using the Bayes' Theorem: Let us denote the well, one-side-burned, and both-sides-burned burgers by W, O, B, respectively, and denote sides by subscripts 1 and 2. Then the sample space of the problem is S = {W₁, W₂, O₁, O₂, B₁, B₂}, and each outcome is equally likely. Let B be the event that the dropped burger is the hopeless one; then B = {B₁, B₂}. Let U_B be the event that the side facing up is burned; then U_B = {O₂, B₁, B₂}, where we assume that O₂ is the burned side of burger O. Thus, the probability is

P(B | U_B) = P(B ∩ U_B) / P(U_B) = (2/6) / (3/6) = 2/3.

(b) Using the Bayes' Theorem: Let W, O, B denote the events that the burger I picked is the well, one-side-burned, and both-sides-burned one, respectively. Since the thrown burger was picked uniformly at random, we have P(W) = P(O) = P(B) = 1/3. Let U_B be the event that the side facing up is burned; then P(U_B | B) is the probability that the side facing up is burned given that the both-sides-burned burger was picked, which is clearly 1. By similar arguments, P(U_B | W) = 0 and P(U_B | O) = 1/2. We want to compute P(B | U_B). Using the generalized Bayes' Theorem:

P(B | U_B) = P(U_B | B)P(B) / P(U_B)
           = P(U_B | B)P(B) / (P(U_B | W)P(W) + P(U_B | O)P(O) + P(U_B | B)P(B))
           = (1 · (1/3)) / (0 · (1/3) + (1/2) · (1/3) + 1 · (1/3))
           = (1/3) / (1/2) = 2/3.

5. (2+6+4 = 12 points) A McDonald's Happy Meal contains one of the four Teenage Mutant Ninja Turtles. (a) Assuming each meal gets a toy uniformly at random, how many times do you expect to have to eat a Happy Meal to get your favorite character, Leonardo? (b) How many times do you expect to eat a Happy Meal to get all four toys, in whatever order that you get them? (c) Generalize the solution for n toys.

Solution: (a) We can think of the picking of a toy at random as a Bernoulli trial, with picking Leonardo representing success and picking any of the other three toys representing failure. Thus, the success probability p = 1/4.
Let X denote the number of trials needed to get Leonardo. We are asked for E[X]. Since the random variable X has a geometric distribution (with parameter p), we know from class that E[X] = 1/p = 4.
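The answer E[X] = 1/p = 4 can be checked with a quick simulation. This sketch is not part of the original solution; the function name and seed are illustrative:

```python
import random

def meals_until_leonardo(rng):
    """Count Happy Meals bought until Leonardo (toy 0 of 4) appears."""
    count = 0
    while True:
        count += 1
        if rng.randrange(4) == 0:  # each of the 4 toys is equally likely
            return count

rng = random.Random(0)  # fixed seed for reproducibility
trials = 100_000
avg = sum(meals_until_leonardo(rng) for _ in range(trials)) / trials
print(avg)  # should be close to 1/p = 4
```

With 100,000 trials the sample mean lands well within 0.1 of the exact value 4, since the geometric distribution with p = 1/4 has standard deviation √12 ≈ 3.46.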
(b) Let the random variable X denote the number of trials needed to get all four types of toys. The problem asks for E[X]. X can be expressed as X₁ + X₂ + X₃ + X₄, where X₁ is the number of trials needed to get the first toy t₁; X₂ is the number of trials needed to get a different type t₂ than t₁; X₃ is the number of trials needed to get a different type t₃ than t₁, t₂; and X₄ is the number of trials needed to get a different type t₄ than t₁, t₂, t₃. Since the probability p₁ of getting any toy in a trial is 1, E[X₁] = 1/p₁ = 1. Once a toy is obtained, the probability p₂ of getting a different type of toy in a trial is 3/4; therefore, E[X₂] = 1/p₂ = 4/3. Once two types of toys are obtained, the probability p₃ of getting a different type of toy in a trial is 2/4 = 1/2; therefore, E[X₃] = 1/p₃ = 2. Once three types of toys are obtained, the probability p₄ of getting a different type of toy in a trial is 1/4; therefore, E[X₄] = 1/p₄ = 4. So

E[X] = E[X₁ + X₂ + X₃ + X₄]
     = E[X₁] + E[X₂] + E[X₃] + E[X₄] (by Linearity of Expectation)
     = 1 + 4/3 + 2 + 4 = 25/3.

(c) Generalizing the above for n toys, we have X = Σ_{i=1}^{n} X_i, where p_i = (n − i + 1)/n and E[X_i] = 1/p_i = n/(n − i + 1). Hence

E[X] = E[Σ_{i=1}^{n} X_i]
     = Σ_{i=1}^{n} E[X_i] (Linearity of Expectation)
     = Σ_{i=1}^{n} n/(n − i + 1)
     = n Σ_{i=1}^{n} 1/(n − i + 1) (pulling out n)
     = n Σ_{j=1}^{n} 1/j (change of variable: letting j = n − i + 1).

6. (4 points) Let X and Y be the random variables that count the number of heads and the number of tails that come up when two coins are flipped. Show that X and Y are not independent.

Solution: The sample space is {HH, HT, TH, TT}, where each outcome has probability 1/4. We have P(X = 2) = P(HH) = 1/4 and P(Y = 2) = P(TT) = 1/4. Also, P(X = 2 and Y = 2) = P(∅) = 0. Since P(X = 2 and Y = 2) ≠ P(X = 2) · P(Y = 2), the random variables X and Y are not independent.
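Returning to Problem 5, the closed forms there (25/3 meals for four toys, n · Σ 1/j in general) can be checked numerically. This sketch is not part of the original solutions; names and the seed are illustrative:

```python
import random
from fractions import Fraction

def expected_meals(n):
    """Exact expectation n * H_n from Problem 5(c), as a rational number."""
    return n * sum(Fraction(1, j) for j in range(1, n + 1))

def simulate_all_toys(n, rng):
    """Meals bought until all n toy types have been seen at least once."""
    seen, count = set(), 0
    while len(seen) < n:
        count += 1
        seen.add(rng.randrange(n))  # each toy type equally likely
    return count

# The general formula specializes to Problem 5(b)'s answer for n = 4.
assert expected_meals(4) == Fraction(25, 3)

rng = random.Random(1)  # fixed seed for reproducibility
trials = 50_000
avg = sum(simulate_all_toys(4, rng) for _ in range(trials)) / trials
print(avg)  # should be close to 25/3 ≈ 8.33
```

Using `Fraction` keeps the harmonic sum exact, so the comparison with 25/3 is not subject to floating-point error.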
7. (8 points) Let X be the sum of the numbers when a pair of dodecahedral dice is rolled. (a) What is E(X)? (b) What is V(X)?

Solution: Let A = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}. The sample space S = A × A, and |S| = 144. Let X₁ and X₂ be the random variables whose value is the number on the first die and the number on the second die, respectively. Then X = X₁ + X₂. Let us first compute the expected value and the variance of just X₁:

E[X₁] = Σ_{x=1}^{12} x · P(X₁ = x) = (1/12) Σ_{x=1}^{12} x = 78/12 = 13/2.
E[X₁²] = Σ_{x=1}^{12} x² · P(X₁ = x) = (1/12) Σ_{x=1}^{12} x² = 650/12 = 325/6.
V[X₁] = E[X₁²] − E²[X₁] = 325/6 − (13/2)² = 143/12.

Now we can compute E[X] and V[X] as follows:

(a) E[X] = E[X₁ + X₂] = E[X₁] + E[X₂] (by Linearity of Expectation) = E[X₁] + E[X₁] (since X₁ and X₂ have the same distribution) = 13.

(b) V[X] = V[X₁ + X₂] = V[X₁] + V[X₂] (since X₁ and X₂ are independent random variables) = V[X₁] + V[X₁] (since X₁ and X₂ have the same distribution) = 143/6.

8. (5 points) Suppose we have a Bernoulli trial with success probability p and failure probability q = 1 − p. Let X be an indicator random variable that is 1 for success and 0 for failure. (a) Determine V[X]. (b) Show that V[X] ≤ 1/4 for any Bernoulli trial.

Solution: (a) The sample space S = {succ, fail} and the probability distribution is P(succ) = p, P(fail) = 1 − p. The random variable X is defined by X(succ) = 1 and X(fail) = 0. Then:

E[X] = 1 · P(succ) + 0 · P(fail) = p.
E[X²] = 1² · P(succ) + 0² · P(fail) = p.
V[X] = E[X²] − E²[X] = p − p² = p(1 − p) = pq.

(b) V[X] = p(1 − p) = p − p² = 0.25 − (p − 0.5)² ≤ 0.25, from Part (a) and since a square is nonnegative.
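The exact values in Problems 7 and 8 are easy to confirm with rational arithmetic. This is a supplementary check, not part of the original solutions:

```python
from fractions import Fraction

# Problem 7: one dodecahedral die, faces 1..12, each with probability 1/12.
faces = range(1, 13)
p = Fraction(1, 12)

E1 = sum(x * p for x in faces)           # E[X1]
E1_sq = sum(x * x * p for x in faces)    # E[X1^2]
V1 = E1_sq - E1 ** 2                     # V[X1] = E[X1^2] - E[X1]^2

assert E1 == Fraction(13, 2)
assert E1_sq == Fraction(325, 6)
assert V1 == Fraction(143, 12)

# Sum of two independent, identically distributed dice:
E = 2 * E1   # E[X] = 13
V = 2 * V1   # V[X] = 143/6
print(E, V)

# Problem 8(b): p(1-p) over a grid of p values never exceeds 1/4,
# and attains 1/4 at p = 1/2.
bernoulli_var = [Fraction(k, 100) * (1 - Fraction(k, 100)) for k in range(101)]
assert max(bernoulli_var) == Fraction(1, 4)
```

The grid check is of course weaker than the algebraic identity p(1 − p) = 1/4 − (p − 1/2)², which settles the bound for all p.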
9. (15 points) What is the expected number of items needed to fill all slots of a hash table of size k? Explain your reasoning clearly. It is acceptable to leave your answer as a summation (using the Σ notation), but you should leave the answer in as elegant a form as possible. You should assume that, for all slots i ≠ j and for each item x, the probability that x hashes into slot i is the same as the probability that x hashes into slot j, regardless of what slots other items hash into.

Solution: This problem is analogous to Problem 5(c): an item hashing to a slot different from the already occupied slots corresponds to a toy from a meal turning out to be different from the already acquired toys. Thus, the expected number of meals for acquiring each of n toys is the same as the expected number of items to insert to fill all n slots. So, as in Problem 5(c), with n = k the answer is k Σ_{j=1}^{k} 1/j.

Alternate solution: Let X be the random variable that represents the number of items needed to fill all slots of a hash table of size k, and let X_i be the random variable that represents the number of items needed before one of them hashes to an empty slot, given that i of the k slots are previously occupied (and the remaining k − i slots are empty). We note that X = X₀ + X₁ + ... + X_{k−1} = Σ_{i=0}^{k−1} X_i. Since the probability p_i that an item hashes to an empty slot, given that i of the k slots are occupied, is (k − i)/k, and X_i has a geometric distribution with parameter p_i, we have E[X_i] = 1/p_i = k/(k − i). Then:

E[X] = E[Σ_{i=0}^{k−1} X_i]
     = Σ_{i=0}^{k−1} E[X_i] (by Linearity of Expectation)
     = Σ_{i=0}^{k−1} k/(k − i)
     = k Σ_{j=1}^{k} 1/j (change of variables: j = k − i).

Extra Credit

1. (25 points) Consider a set S of n people such that, for all distinct x and y, it is the case that either x and y like each other or x and y hate each other.
Let us call a set S′ ⊆ S a friend group if, for all distinct x and y in S′, x and y like each other, and let us call a set S′ ⊆ S a hate group if, for all distinct x and y in S′, x and y hate each other. Let us define R(k) as the smallest natural number such that every set S of R(k) people contains either a friend group or a hate group of size k. For example, R(2) = 2.

(a) Prove that R(3) > 5.
Proof: Let S = {p₀, p₁, p₂, p₃, p₄}, and let p_i and p_j be friends if and only if (i + 1) mod 5 = j or (i − 1) mod 5 = j. Then it is easy to see that there is no friend group or hate group of size 3. Hence, R(3) > 5.

(b) Prove that R(3) ≤ 6. (Together with the previous part, we have R(3) = 6.)

Proof: To prove the claim, we need to show that every set of six people contains a friend group or a hate group of size 3. Let S be a set of six people and p be an element of S. Since either p likes a majority in S − {p} or hates a majority in S − {p}, it follows that either p is friends with at least three people in S − {p} or p hates at least three people in S − {p}. Without loss of generality, suppose that p is friends with three people a, b, c in S − {p}. If any two among a, b, c are friends, then those two and p constitute a friend group of size 3. On the other hand, if no two of a, b, c are friends, the three of them constitute a hate group of size 3.

(c) Prove by induction that R(k) ≤ 2^(k(k−1)/2) for all k ≥ 2.

Proof: Let P(n) be the statement that every set of at least 2^(n(n−1)/2) people contains a friend group or a hate group of size n. The statement that we wish to prove is implied by the following claim.

Claim: ∀n ≥ 2, P(n).

Proof: We prove the claim by induction.

Base Case: To show P(2). P(2) states that every set of at least 2^(2(2−1)/2) = 2 people contains a friend group or a hate group of size 2, which is clearly true. Hence, we have the base case.

Inductive Hypothesis: Assume k ≥ 2 and P(k): every set of at least 2^(k(k−1)/2) people contains a friend group or a hate group of size k.

Inductive Step: To show P(k + 1), i.e., to show that every set of at least 2^(k(k+1)/2) people contains a friend group or a hate group of size k + 1. Let S₀ be a set of at least 2^(k(k+1)/2) people. We note that 2^(k(k+1)/2) = 2^k · 2^(k(k−1)/2).
Consider the following procedure, which defines a sequence of successively smaller sets S₀, S₁, ..., S_k, a sequence of people p₀, p₁, ..., p_k, and a sequence of bits f₀, f₁, ..., f_k:

for i = 0 to k
    // Loop Invariant: ∀j ∈ {0, 1, ..., i−1}: p_j ∈ S_j, S_{j+1} ⊆ S_j − {p_j}, and |S_j| ≥ 2^(k−j) · 2^(k(k−1)/2)
    let p_i be any element of S_i
    if p_i is friends with a majority of people in S_i − {p_i}
        let S_{i+1} = {x ∈ S_i − {p_i} | p_i is a friend of x}, and let f_i = 1
    else
        let S_{i+1} = {x ∈ S_i − {p_i} | p_i hates x}, and let f_i = 0
    // Since |S_i| ≥ 2^(k−i) · 2^(k(k−1)/2), it follows that |S_{i+1}| ≥ 2^(k−i−1) · 2^(k(k−1)/2), which establishes the invariant

There are three cases: (1) f₀ = f₁ = ... = f_{k−1} = 1, or (2) f₀ = f₁ = ... = f_{k−1} = 0, or (3) there exist i, j ∈ {0, 1, ..., k−1} such that f_i = 1 and f_j = 0. In Case (1), p₀, p₁, ..., p_k constitute a friend group of size k + 1. In Case (2), p₀, p₁, ..., p_k constitute
a hate group of size k + 1. In Case (3), it follows from the invariant that |S_k| ≥ 2^(k(k−1)/2). Then, by the Inductive Hypothesis, S_k has a friend group or a hate group of size k. If S_k has a friend group F of size k, then F ∪ {p_i} constitutes a friend group of size k + 1. If S_k has a hate group H of size k, then H ∪ {p_j} constitutes a hate group of size k + 1. Thus, in all of the cases, S₀ contains a friend group or a hate group of size k + 1. Hence, we have P(k + 1), which completes the inductive step. Then, by the Principle of Weak Mathematical Induction, we have the claim: ∀n ≥ 2, P(n).
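The small cases in parts (a) and (b) can also be verified exhaustively by machine: the pentagon colouring on 5 people has no monochromatic triangle, while all 2^15 like/hate assignments on 6 people do. This brute-force sketch is supplementary to the proofs above:

```python
from itertools import combinations

def has_mono_triangle(n, likes):
    """True if some 3 of the n people pairwise like or pairwise hate each other.

    `likes` maps each pair (i, j) with i < j to 1 (like) or 0 (hate).
    """
    for a, b, c in combinations(range(n), 3):
        e = [likes[(a, b)], likes[(a, c)], likes[(b, c)]]
        if all(e) or not any(e):
            return True
    return False

# R(3) > 5: in the part (a) colouring, p_i and p_j are friends iff they are
# adjacent on the 5-cycle, i.e. (j - i) mod 5 is 1 or 4.
pent = {(i, j): (j - i) % 5 in (1, 4) for i, j in combinations(range(5), 2)}
assert not has_mono_triangle(5, pent)

# R(3) <= 6: every like/hate assignment on the 15 pairs of 6 people
# contains a friend group or hate group of size 3.
pairs = list(combinations(range(6), 2))  # 15 pairs -> 2^15 assignments
assert all(
    has_mono_triangle(6, dict(zip(pairs, ((mask >> t) & 1 for t in range(15)))))
    for mask in range(1 << 15)
)
print("R(3) = 6 verified")
```

Exhaustive checking only scales to tiny cases (already 2^15 assignments for 6 people), which is why the induction in part (c) is needed for general k.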
Chapter 7 Probability 7.1 Experiments, Sample Spaces, and Events A (random) experiment is an activity with observable results. The sample space S of an experiment is the set of all outcomes. Each outcome
More informationPractical Probability:
Practical Probability: Casino Odds and Sucker Bets Tom Davis tomrdavis@earthlink.net April 2, 2011 Abstract Gambling casinos are there to make money, so in almost every instance, the games you can bet
More information1/3 1/3 1/3 0.4 0.4 0.4 0.4 0.4 0.4 0.4 0 1 2 3 4 5 6 7 8 0.6 0.6 0.6 0.6 0.6 0.6 0.6
HOMEWORK 4: SOLUTIONS. 2. A Markov chain with state space {, 2, 3} has transition probability matrix /3 /3 /3 P = 0 /2 /2 0 0 Show that state 3 is absorbing and, starting from state, find the expected
More informationE3: PROBABILITY AND STATISTICS lecture notes
E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................
More information7 Posterior Probability and Bayes
7 Posterior Probability and Bayes Examples: 1. In a computer installation, 60% of programs are written in C ++ and 40% in Java. 60% of the programs written in C ++ compile on the first run and 80% of the
More informationQuestion 1 Formatted: Formatted: Formatted: Formatted:
In many situations in life, we are presented with opportunities to evaluate probabilities of events occurring and make judgments and decisions from this information. In this paper, we will explore four
More information1. De Morgan s Law. Let U be a set and consider the following logical statement depending on an integer n. We will call this statement P (n):
Math 309 Fall 0 Homework 5 Drew Armstrong. De Morgan s Law. Let U be a set and consider the following logical statement depending on an integer n. We will call this statement P (n): For any n sets A, A,...,
More informationChapter 3: The basic concepts of probability
Chapter 3: The basic concepts of probability Experiment: a measurement process that produces quantifiable results (e.g. throwing two dice, dealing cards, at poker, measuring heights of people, recording
More informationProbability Models.S1 Introduction to Probability
Probability Models.S1 Introduction to Probability Operations Research Models and Methods Paul A. Jensen and Jonathan F. Bard The stochastic chapters of this book involve random variability. Decisions are
More informationMath 312 Lecture Notes Markov Chains
Math 312 Lecture Notes Markov Chains Warren Weckesser Department of Mathematics Colgate University Updated, 30 April 2005 Markov Chains A (finite) Markov chain is a process with a finite number of states
More informationProbabilities and Random Variables
Probabilities and Random Variables This is an elementary overview of the basic concepts of probability theory. 1 The Probability Space The purpose of probability theory is to model random experiments so
More informationIntegers: applications, base conversions.
CS 441 Discrete Mathematics for CS Lecture 14 Integers: applications, base conversions. Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Modular arithmetic in CS Modular arithmetic and congruencies
More informationProbability Using Dice
Using Dice One Page Overview By Robert B. Brown, The Ohio State University Topics: Levels:, Statistics Grades 5 8 Problem: What are the probabilities of rolling various sums with two dice? How can you
More informationMathematical Expectation
Mathematical Expectation Properties of Mathematical Expectation I The concept of mathematical expectation arose in connection with games of chance. In its simplest form, mathematical expectation is the
More information1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let
Copyright c 2009 by Karl Sigman 1 Stopping Times 1.1 Stopping Times: Definition Given a stochastic process X = {X n : n 0}, a random time τ is a discrete random variable on the same probability space as
More informationPROBABILITY. Chapter Overview Conditional Probability
PROBABILITY Chapter. Overview.. Conditional Probability If E and F are two events associated with the same sample space of a random experiment, then the conditional probability of the event E under the
More informationHow to build a probabilityfree casino
How to build a probabilityfree casino Adam Chalcraft CCR La Jolla dachalc@ccrwest.org Chris Freiling Cal State San Bernardino cfreilin@csusb.edu Randall Dougherty CCR La Jolla rdough@ccrwest.org Jason
More informationJan 31 Homework Solutions Math 151, Winter 2012. Chapter 3 Problems (pages 102110)
Jan 31 Homework Solutions Math 151, Winter 01 Chapter 3 Problems (pages 10110) Problem 61 Genes relating to albinism are denoted by A and a. Only those people who receive the a gene from both parents
More informationChapter 5 Section 2 day 1 2014f.notebook. November 17, 2014. Honors Statistics
Chapter 5 Section 2 day 1 2014f.notebook November 17, 2014 Honors Statistics Monday November 17, 2014 1 1. Welcome to class Daily Agenda 2. Please find folder and take your seat. 3. Review Homework C5#3
More informationMathematical Induction
Chapter 2 Mathematical Induction 2.1 First Examples Suppose we want to find a simple formula for the sum of the first n odd numbers: 1 + 3 + 5 +... + (2n 1) = n (2k 1). How might we proceed? The most natural
More informationChapter 6. 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? Ans: 4/52.
Chapter 6 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? 4/52. 2. What is the probability that a randomly selected integer chosen from the first 100 positive
More informationIntroductory Problems
Introductory Problems Today we will solve problems that involve counting and probability. Below are problems which introduce some of the concepts we will discuss.. At one of George Washington s parties,
More informationThe sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1].
Probability Theory Probability Spaces and Events Consider a random experiment with several possible outcomes. For example, we might roll a pair of dice, flip a coin three times, or choose a random real
More informationLET S MAKE A DEAL! ACTIVITY
LET S MAKE A DEAL! ACTIVITY NAME: DATE: SCENARIO: Suppose you are on the game show Let s Make A Deal where Monty Hall (the host) gives you a choice of three doors. Behind one door is a valuable prize.
More informationDiscrete Random Variables; Expectation Spring 2014 Jeremy Orloff and Jonathan Bloom
Discrete Random Variables; Expectation 18.05 Spring 2014 Jeremy Orloff and Jonathan Bloom This image is in the public domain. http://www.mathsisfun.com/data/quincunx.html http://www.youtube.com/watch?v=9xubhhm4vbm
More informationHomework 4  KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here.
Homework 4  KEY Jeff Brenion June 16, 2004 Note: Many problems can be solved in more than one way; we present only a single solution here. 1 Problem 21 Since there can be anywhere from 0 to 4 aces, the
More informationMath 431 An Introduction to Probability. Final Exam Solutions
Math 43 An Introduction to Probability Final Eam Solutions. A continuous random variable X has cdf a for 0, F () = for 0 <
More informationStatistics and Random Variables. Math 425 Introduction to Probability Lecture 14. Finite valued Random Variables. Expectation defined
Expectation Statistics and Random Variables Math 425 Introduction to Probability Lecture 4 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan February 9, 2009 When a large
More information4.1 4.2 Probability Distribution for Discrete Random Variables
4.1 4.2 Probability Distribution for Discrete Random Variables Key concepts: discrete random variable, probability distribution, expected value, variance, and standard deviation of a discrete random variable.
More informationMathematical induction. Niloufar Shafiei
Mathematical induction Niloufar Shafiei Mathematical induction Mathematical induction is an extremely important proof technique. Mathematical induction can be used to prove results about complexity of
More informationExample. A casino offers the following bets (the fairest bets in the casino!) 1 You get $0 (i.e., you can walk away)
: Three bets Math 45 Introduction to Probability Lecture 5 Kenneth Harris aharri@umich.edu Department of Mathematics University of Michigan February, 009. A casino offers the following bets (the fairest
More informationLECTURE 4. Conditional Probability and Bayes Theorem
LECTURE 4 Conditional Probability and Bayes Theorem 1 The conditional sample space Physical motivation for the formal mathematical theory 1. Roll a fair die once so the sample space S is given by S {1,
More informationLecture 11: Probability models
Lecture 11: Probability models Probability is the mathematical toolbox to describe phenomena or experiments where randomness occur. To have a probability model we need the following ingredients A sample
More informationTOPIC P2: SAMPLE SPACE AND ASSIGNING PROBABILITIES SPOTLIGHT: THE CASINO GAME OF ROULETTE. Topic P2: Sample Space and Assigning Probabilities
TOPIC P2: SAMPLE SPACE AND ASSIGNING PROBABILITIES SPOTLIGHT: THE CASINO GAME OF ROULETTE Roulette is one of the most popular casino games. The name roulette is derived from the French word meaning small
More information6.3 Probabilities with Large Numbers
6.3 Probabilities with Large Numbers In general, we can t perfectly predict any single outcome when there are numerous things that could happen. But, when we repeatedly observe many observations, we expect
More information13.0 Central Limit Theorem
13.0 Central Limit Theorem Discuss Midterm/Answer Questions Box Models Expected Value and Standard Error Central Limit Theorem 1 13.1 Box Models A Box Model describes a process in terms of making repeated
More informationSample Induction Proofs
Math 3 Worksheet: Induction Proofs III, Sample Proofs A.J. Hildebrand Sample Induction Proofs Below are model solutions to some of the practice problems on the induction worksheets. The solutions given
More informationChapter 13 & 14  Probability PART
Chapter 13 & 14  Probability PART IV : PROBABILITY Dr. Joseph Brennan Math 148, BU Dr. Joseph Brennan (Math 148, BU) Chapter 13 & 14  Probability 1 / 91 Why Should We Learn Probability Theory? Dr. Joseph
More information
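For Problem 2, note that each $1 bet either wins (returning $36) or loses the dollar, so after 105 bets with k wins the player's wealth is 105 - 105 + 36k = 36k dollars; the player comes out ahead exactly when 36k > 105, i.e., k >= 3. Assuming the bets are independent, k is Binomial(105, 1/38), and the probability can be computed directly. The following is a sketch of that computation, not the graded solution:

```python
from math import comb

# Model (assumed): 105 independent bets, each winning with probability 1/38.
# With k winning bets, final wealth = 105 - 105 + 36*k = 36k dollars,
# so the player is ahead (more than $105) exactly when k >= 3.
n, p = 105, 1 / 38

# P(ahead) = P(k >= 3) = 1 - P(k = 0) - P(k = 1) - P(k = 2)
p_ahead = 1 - sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(3))
print(f"P(ahead) = {p_ahead:.4f}")
```

Perhaps surprisingly, this comes out slightly above 1/2: even though each individual bet has negative expected value, "ahead" here only requires 3 wins in 105 tries, which is more likely than not.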