Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80)


CS 30, Winter 2016, by Prasad Jayanti

1. (10 points) Here is the famous Monty Hall puzzle. Suppose you are on a game show, and you are given the choice of three doors. Behind one door is a car; behind the others, goats. You pick a door, say number 1, and the host, who knows what's behind the doors, opens another door, say number 3, which has a goat. He says to you, "Do you want to pick door number 2?" Is it to your advantage to switch your choice of doors? Back up your answer with rigorous reasoning, which includes a clear statement of the sample space, the events of interest, and the computation of probabilities. Make the following two reasonable assumptions: (i) the car is equally likely to be hidden behind each of the three doors, and (ii) after the player picks a door, the host must open a different door with a goat behind it and offer the player the choice of staying with the original door or switching.

Solution: Without loss of generality, let A denote the door that the player picks, and let B and C be the other doors. We can represent the sample space as S = {(Ȧ, B, C), (A, Ḃ, C), (A, B, Ċ)}, where by convention the player has picked door A, and the dot on a door indicates that the car is behind that door. From the statement of the problem, the probability of each of the three outcomes in S is 1/3. Let E denote the event of winning by switching doors and F the event of winning by not switching. Then, from the rules of the game, we have:

E = {(A, Ḃ, C), (A, B, Ċ)}, so P(E) = 2/3
F = {(Ȧ, B, C)}, so P(F) = 1/3

So, it is advantageous to switch!

2. (6 points) In roulette, you can bet $1 and win $36 with probability 1/38, as there are 38 numbers on the wheel. Suppose you have $105 and enough time to make 105 bets on the wheel. What is the probability that you come out ahead, that is, with more than $105?
After you write down the expression, compute the final numerical value, which I am sure you'll find interesting.

Solution: The experiment is 105 mutually independent Bernoulli trials with success probability 1/38. The sample space S is the set of all possible outcomes of the 105 trials, which can be thought of as the set of all bit strings of length 105, where a 1 or 0 at position i denotes success or failure, respectively, in the ith trial. We observe that we do not come out ahead if and only if we win zero, one, or two of the 105 bets: winning two bets returns only $72, less than the $105 wagered, while winning three returns $108. Let A be the event that we win zero, one, or two bets. Then we want P(S − A) = 1 − P(A). Now,

P(A) = C(105,0)(37/38)^105 + C(105,1)(1/38)(37/38)^104 + C(105,2)(1/38)^2(37/38)^103,

i.e., P(A) ≈ 0.476; hence, P(S − A) = 1 − P(A) ≈ 0.524.

Note that the probability that we end up ahead is more than half; but in expectation we lose money, irrespective of the number of rounds played, since each $1 bet returns only 36 × (1/38) = 18/19 < 1 dollars on average.

3. (10 points) Ramesh can get to work three different ways: by bicycle, by bus, or by car. Because of commuter traffic, there is a 50% chance that he will be late when he drives his car. When he takes

the bus, which uses a special lane, there is a 20% chance that he will be late. The probability that he will be late when he bikes to work is only 5%. Ramesh arrives late one day. His boss wants to estimate the probability that he drove his car to work that day.

(a) Suppose the boss assumes that there is a 1/3 chance that Ramesh takes each of the three ways he can get to work. Under this assumption, what estimate for the probability that Ramesh drove his car does the boss obtain from Bayes' Theorem?

(b) Suppose the boss knows that Ramesh drives to work 30% of the time, takes the bus only 10% of the time, and takes his bicycle 60% of the time. With this information, what estimate for the probability that Ramesh drove his car does the boss obtain from Bayes' Theorem?

When solving the above problem, clearly define the events and state the probabilities. You need to use the generalized Bayes' Theorem.

Solution: The sample space S is {late-car, late-bus, late-bike, not-late-car, not-late-bus, not-late-bike}. Define the following events:

C, Ramesh travels by car, is {late-car, not-late-car}
B, Ramesh travels by bus, is {late-bus, not-late-bus}
K, Ramesh travels by bike, is {late-bike, not-late-bike}
L, Ramesh is late, is {late-car, late-bus, late-bike}

We are given: P(L|C) = 1/2, P(L|B) = 1/5, and P(L|K) = 1/20. The problem asks for P(C|L).

(a) We are given P(C) = P(B) = P(K) = 1/3. Therefore, using the generalized Bayes' Theorem:

P(C|L) = P(L|C)P(C) / [P(L|C)P(C) + P(L|B)P(B) + P(L|K)P(K)] = (1/2) / (1/2 + 1/5 + 1/20) = 2/3

(b) We are given P(C) = 3/10, P(B) = 1/10, P(K) = 3/5. Therefore, using the generalized Bayes' Theorem:

P(C|L) = P(L|C)P(C) / [P(L|C)P(C) + P(L|B)P(B) + P(L|K)P(K)] = (3/20) / (3/20 + 1/50 + 3/100) = 3/4

4. (10 points) I made three vegetable burgers. The first came out really well, the second one had one side burned, and the last was hopeless: both sides were burned. Disgusted with my performance, I picked a burger (uniformly) at random and dropped it on the ground. It rolled and settled.
When I went to it, the side facing up was burned. Given this knowledge, what is the probability that the burger I dropped is the hopeless, burned-on-both-sides burger? Solve this problem in two different ways: first without using Bayes' Theorem, and then using it.

Solution:

(a) Without using Bayes' Theorem: Let us denote the well-cooked, one-side-burned, and both-sides-burned burgers by W, O, and B, respectively, and denote their sides by subscripts 1 and 2. Then the sample space of the problem is S = {W_1, W_2, O_1, O_2, B_1, B_2}, and each outcome is equally likely. Let B also denote the event that the dropped burger is the hopeless one; then B = {B_1, B_2}. Let U_B be the event that the side facing up is burned; then U_B = {O_2, B_1, B_2}, where we assume that O_2 is the burned side of burger O. Thus, the probability is

P(B|U_B) = P(B ∩ U_B) / P(U_B) = (2/6) / (3/6) = 2/3

(b) Using Bayes' Theorem: Let W, O, B denote the events that the burger I picked is well-cooked, one-side-burned, and both-sides-burned, respectively. Since the dropped burger was picked uniformly at random, we have P(W) = P(O) = P(B) = 1/3. Let U_B be the event that the side facing up is burned; then P(U_B|B) is the probability that the side facing up is burned given that the both-sides-burned burger was picked, which is clearly 1. By similar arguments, P(U_B|W) = 0 and P(U_B|O) = 1/2. We want to compute P(B|U_B). Using the generalized Bayes' Theorem:

P(B|U_B) = P(U_B|B)P(B) / P(U_B)
         = P(U_B|B)P(B) / [P(U_B|W)P(W) + P(U_B|O)P(O) + P(U_B|B)P(B)]
         = (1 × 1/3) / (0 × 1/3 + (1/2) × 1/3 + 1 × 1/3)
         = (1/3) / (1/2) = 2/3

5. (2+6+4 = 12 points) A McDonald's Happy Meal contains one of the four Teenage Mutant Ninja Turtles.

(a) Assuming each meal gets a toy uniformly at random, how many times do you expect to have to eat a Happy Meal to get your favorite character, Leonardo?

(b) How many times do you expect to eat a Happy Meal to get all four toys, in whatever order you get them?

(c) Generalize the solution to n toys.

Solution: (a) We can think of picking a toy at random as a Bernoulli trial, with picking Leonardo representing success and picking any of the other three toys representing failure. Thus, the success probability is p = 1/4.
Let X denote the number of trials needed to get Leonardo. We are asked for E[X]. Since the random variable X has a geometric distribution (with parameter p), we know from class that E[X] = 1/p = 4.
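As a sanity check (not part of the graded solution), the geometric expectation E[X] = 1/p can be approximated by simulation; the helper name trials_until_success below is ours, and this is only a minimal sketch:

```python
import random

def trials_until_success(p, rng):
    """Count Bernoulli(p) trials up to and including the first success."""
    count = 1
    while rng.random() >= p:
        count += 1
    return count

# Average over many simulated Happy Meal runs with p = 1/4.
rng = random.Random(42)
samples = [trials_until_success(0.25, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 1/p = 4
```

With 100,000 runs the sample mean lands within a few hundredths of the exact value 4.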

(b) Let the random variable X denote the number of trials needed to get all four types of toys. The problem asks for E[X]. X can be expressed as X_1 + X_2 + X_3 + X_4, where X_1 is the number of trials needed to get the first toy t_1; X_2 is the number of trials needed to get a type t_2 different from t_1; X_3 is the number of trials needed to get a type t_3 different from t_1 and t_2; and X_4 is the number of trials needed to get a type t_4 different from t_1, t_2, and t_3.

Since the probability p_1 of getting some toy in a trial is 1, E[X_1] = 1/p_1 = 1. Once one type is obtained, the probability p_2 of getting a different type in a trial is 3/4, so E[X_2] = 1/p_2 = 4/3. Once two types are obtained, the probability p_3 of getting a new type in a trial is 2/4 = 1/2, so E[X_3] = 1/p_3 = 2. Once three types are obtained, the probability p_4 of getting the last type in a trial is 1/4, so E[X_4] = 1/p_4 = 4. Therefore:

E[X] = E[X_1 + X_2 + X_3 + X_4]
     = E[X_1] + E[X_2] + E[X_3] + E[X_4]   by Linearity of Expectation
     = 1 + 4/3 + 2 + 4 = 25/3

(c) Generalizing the above to n toys, we have:

X = Σ_{i=1}^{n} X_i
p_i = (n − i + 1)/n
E[X_i] = 1/p_i = n/(n − i + 1)

E[X] = E[Σ_{i=1}^{n} X_i]
     = Σ_{i=1}^{n} E[X_i]              by Linearity of Expectation
     = Σ_{i=1}^{n} n/(n − i + 1)
     = n Σ_{i=1}^{n} 1/(n − i + 1)     pulling out n
     = n Σ_{j=1}^{n} 1/j               change of variable: j = n − i + 1

6. (4 points) Let X and Y be the random variables that count the number of heads and the number of tails, respectively, when two coins are flipped. Show that X and Y are not independent.

Solution: The sample space is {HH, HT, TH, TT}, where each outcome has probability 1/4. We have P(X = 2) = P(HH) = 1/4 and P(Y = 2) = P(TT) = 1/4. Also, P(X = 2 ∧ Y = 2) = P(∅) = 0. Since P(X = 2 ∧ Y = 2) ≠ P(X = 2) · P(Y = 2), the random variables X and Y are not independent.
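The closed form n Σ_{j=1}^{n} 1/j from Problem 5(c) is easy to evaluate with exact rationals; for n = 4 it reproduces the 25/3 obtained in part (b). A minimal Python sketch (the function name expected_draws is ours):

```python
from fractions import Fraction

def expected_draws(n):
    """Expected number of uniform draws to collect all n toy types: n * H_n."""
    return n * sum(Fraction(1, j) for j in range(1, n + 1))

print(expected_draws(4))   # 25/3, matching Problem 5(b)
print(expected_draws(1))   # 1: a single toy type arrives on the first draw
```

Using Fraction avoids floating-point error, so the result is the exact harmonic-number formula rather than an approximation.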

7. (8 points) Let X be the sum of the numbers when a pair of dodecahedral dice is rolled. (a) What is E[X]? (b) What is V[X]?

Solution: Let A = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}. The sample space is S = A × A, and |S| = 144. Let X_1 and X_2 be the random variables whose values are the numbers on the first die and the second die, respectively. Then X = X_1 + X_2. Let us first compute the expected value and the variance of just X_1.

E[X_1] = Σ_{x=1}^{12} x · P(X_1 = x) = (1/12) Σ_{x=1}^{12} x = 78/12 = 13/2
E[X_1^2] = Σ_{x=1}^{12} x^2 · P(X_1 = x) = (1/12) Σ_{x=1}^{12} x^2 = 650/12 = 325/6
V[X_1] = E[X_1^2] − E^2[X_1] = 325/6 − (13/2)^2 = 143/12

Now we can compute E[X] and V[X] as follows:

(a) E[X] = E[X_1 + X_2]
         = E[X_1] + E[X_2]   by Linearity of Expectation
         = E[X_1] + E[X_1]   since X_1 and X_2 have the same distribution
         = 13

(b) V[X] = V[X_1 + X_2]
         = V[X_1] + V[X_2]   since X_1 and X_2 are independent random variables
         = V[X_1] + V[X_1]   since X_1 and X_2 have the same distribution
         = 143/6

8. (5 points) Suppose we have a Bernoulli trial with success probability p and failure probability q = 1 − p. Let X be the indicator random variable that is 1 for success and 0 for failure. (a) Determine V[X]. (b) Show that V[X] ≤ 1/4 for any Bernoulli trial.

Solution: (a) The sample space is S = {succ, fail}, with probability distribution P(succ) = p, P(fail) = 1 − p. The random variable X is defined by X(succ) = 1 and X(fail) = 0. Then:

E[X] = 1 · P(succ) + 0 · P(fail) = p
E[X^2] = 1^2 · P(succ) + 0^2 · P(fail) = p
V[X] = E[X^2] − E^2[X] = p − p^2 = p(1 − p) = pq

(b) From part (a), V[X] = p(1 − p) = p − p^2 = 1/4 − (p − 1/2)^2 ≤ 1/4, since a square is non-negative.
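Since the sample space in Problem 7 has only 144 outcomes, E[X] and V[X] can also be checked by brute-force enumeration with exact rationals. A minimal sketch, not part of the graded solution:

```python
from fractions import Fraction
from itertools import product

faces = range(1, 13)                                # one dodecahedral die
sums = [a + b for a, b in product(faces, faces)]    # all 144 equally likely outcomes
p = Fraction(1, len(sums))

mean = sum(x * p for x in sums)
second_moment = sum(x * x * p for x in sums)
variance = second_moment - mean * mean

print(mean, variance)  # 13 143/6
```

The enumeration agrees with the formula-based answers E[X] = 13 and V[X] = 143/6.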

9. (15 points) What is the expected number of items needed to fill all slots of a hash table of size k? Explain your reasoning clearly. It is acceptable to leave your answer as a summation (using the Σ notation), but you should leave the answer in as elegant a form as possible. You should assume that, for all slots i ≠ j and for each item x, the probability that x hashes into slot i is the same as the probability that x hashes into slot j, regardless of what slots other items hash into.

Solution: This problem is analogous to Problem 5(c): an item hashing to a slot different from the already occupied slots corresponds to a toy from a meal turning out to be different from the already acquired toys. Thus, the expected number of meals needed to acquire all k toys is the same as the expected number of items needed to fill all k slots. So, as in Problem 5(c), the answer is k Σ_{j=1}^{k} 1/j.

Alternate solution: Let X be the random variable that represents the number of items needed to fill all slots of a hash table of size k, and let X_i be the random variable that represents the number of items needed until one of them hashes to an empty slot, given that i of the k slots are already occupied (and the remaining k − i slots are empty). We note that

X = X_0 + X_1 + ... + X_{k−1} = Σ_{i=0}^{k−1} X_i.

Since the probability p_i that an item hashes to an empty slot, given that i of the k slots are occupied, is (k − i)/k, and X_i has a geometric distribution with parameter p_i, we have E[X_i] = 1/p_i = k/(k − i). Then:

E[X] = E[Σ_{i=0}^{k−1} X_i]
     = Σ_{i=0}^{k−1} E[X_i]     by Linearity of Expectation
     = Σ_{i=0}^{k−1} k/(k − i)
     = k Σ_{j=1}^{k} 1/j        change of variables: j = k − i

Extra Credit

1. (25 points) Consider a set S of n people such that, for all distinct x and y, it is the case that either x and y like each other or x and y hate each other.
Let us call a subset S′ ⊆ S a friend group if, for all distinct x and y in S′, x and y like each other, and let us call a subset S′ ⊆ S a hate group if, for all distinct x and y in S′, x and y hate each other. Define R(k) as the smallest natural number such that every set S of R(k) people contains either a friend group or a hate group of size k. For example, R(2) = 2.

(a) Prove that R(3) > 5.

Proof: Let S = {p_0, p_1, p_2, p_3, p_4}, and let p_i and p_j be friends if and only if (i + 1) mod 5 = j or (i − 1) mod 5 = j. Then it is easy to check that there is no friend group and no hate group of size 3: the friendship relation is a 5-cycle, which contains no triangle and no independent set of size 3. Hence, R(3) > 5.

(b) Prove that R(3) ≤ 6. (Together with the previous part, this gives R(3) = 6.)

Proof: To prove the claim, we need to show that every set of six people contains a friend group or a hate group of size 3. Let S be a set of six people and p an element of S. Since p either likes a majority of S − {p} or hates a majority of S − {p}, it follows that p is friends with at least three people in S − {p} or p hates at least three people in S − {p}. Without loss of generality, suppose that p is friends with three people a, b, c in S − {p}. If any two among a, b, c are friends, then those two and p constitute a friend group of size 3. On the other hand, if no two of a, b, c are friends, the three of them constitute a hate group of size 3.

(c) Prove by induction that R(k) ≤ 2^(k(k−1)/2) for all k ≥ 2.

Proof: Let P(n) be the statement that every set of at least 2^(n(n−1)/2) people contains a friend group or a hate group of size n. The statement we wish to prove is implied by the following claim.

Claim: ∀n ≥ 2, P(n).

Proof: We prove the claim by induction.

Base case: To show P(2). P(2) states that every set of at least 2^(2(2−1)/2) = 2 people contains a friend group or a hate group of size 2, which is clearly true, since any two people either like each other or hate each other. Hence we have the base case.

Inductive hypothesis: Assume k ≥ 2 and P(k): every set of at least 2^(k(k−1)/2) people contains a friend group or a hate group of size k.

Inductive step: To show P(k + 1), i.e., to show that every set of at least 2^(k(k+1)/2) people contains a friend group or a hate group of size k + 1. Let S_0 be a set of at least 2^(k(k+1)/2) people. We note that 2^(k(k+1)/2) = 2^k · 2^(k(k−1)/2).
Consider the following procedure, which defines a sequence of successively smaller sets S_0, S_1, ..., S_k, a sequence of people p_0, p_1, ..., p_k, and a sequence of bits f_0, f_1, ..., f_k.

for i = 0 to k
    // Loop invariant: for all j in {0, 1, ..., i−1}: p_j ∈ S_j, S_{j+1} ⊆ S_j − {p_j},
    //                 and |S_j| ≥ 2^(k−j) · 2^(k(k−1)/2)
    let p_i be any element of S_i
    if p_i is friends with a majority of the people in S_i − {p_i}
        let S_{i+1} = {x ∈ S_i − {p_i} : p_i is a friend of x} and let f_i = 1
    else
        let S_{i+1} = {x ∈ S_i − {p_i} : p_i hates x} and let f_i = 0
    // Since |S_i| ≥ 2^(k−i) · 2^(k(k−1)/2), it follows that |S_{i+1}| ≥ 2^(k−i−1) · 2^(k(k−1)/2),
    // which re-establishes the invariant

There are three cases: (1) f_0 = f_1 = ... = f_{k−1} = 1, or (2) f_0 = f_1 = ... = f_{k−1} = 0, or (3) there exist i, j ∈ {0, 1, ..., k−1} such that f_i = 1 and f_j = 0. In Case (1), p_0, p_1, ..., p_k constitute a friend group of size k + 1. In Case (2), p_0, p_1, ..., p_k constitute

a hate group of size k + 1. In Case (3), it follows from the invariant that |S_k| ≥ 2^(k(k−1)/2). Then, by the inductive hypothesis, S_k has a friend group or a hate group of size k. If S_k has a friend group F of size k, then F ∪ {p_i} constitutes a friend group of size k + 1, since f_i = 1 means p_i is friends with everyone in S_{i+1} ⊇ S_k. If S_k has a hate group H of size k, then H ∪ {p_j} constitutes a hate group of size k + 1. Thus, in all cases, S_0 contains a friend group or a hate group of size k + 1. Hence we have P(k + 1), which completes the inductive step. Then, by the Principle of Weak Mathematical Induction, we have the claim: ∀n ≥ 2, P(n).
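The 5-cycle construction in part (a) of the extra credit can also be verified mechanically: among the 10 triples drawn from 5 people, there is no all-friends triple and no all-haters triple. A minimal Python sketch of that check (helper names are ours):

```python
from itertools import combinations

n = 5
# p_i and p_j are friends iff they are adjacent on a 5-cycle, as in part (a).
friends = {frozenset({i, (i + 1) % n}) for i in range(n)}

def is_friend_pair(x, y):
    return frozenset({x, y}) in friends

# Check every triple: never three mutual friends, never three mutual haters.
for trio in combinations(range(n), 3):
    pair_flags = [is_friend_pair(x, y) for x, y in combinations(trio, 2)]
    assert not all(pair_flags), f"{trio} is a friend group"
    assert any(pair_flags), f"{trio} is a hate group"

print("R(3) > 5: no friend or hate group of size 3 among these 5 people")
```

This exhaustive check is feasible because a 5-person coloring has only C(5,3) = 10 triples; it confirms that the cycle coloring witnesses R(3) > 5.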


More information

Math 55: Discrete Mathematics

Math 55: Discrete Mathematics Math 55: Discrete Mathematics UC Berkeley, Spring 2012 Homework # 9, due Wednesday, April 11 8.1.5 How many ways are there to pay a bill of 17 pesos using a currency with coins of values of 1 peso, 2 pesos,

More information

Statistics and Random Variables. Math 425 Introduction to Probability Lecture 14. Finite valued Random Variables. Expectation defined

Statistics and Random Variables. Math 425 Introduction to Probability Lecture 14. Finite valued Random Variables. Expectation defined Expectation Statistics and Random Variables Math 425 Introduction to Probability Lecture 4 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan February 9, 2009 When a large

More information

Book Review of Rosenhouse, The Monty Hall Problem. Leslie Burkholder 1

Book Review of Rosenhouse, The Monty Hall Problem. Leslie Burkholder 1 Book Review of Rosenhouse, The Monty Hall Problem Leslie Burkholder 1 The Monty Hall Problem, Jason Rosenhouse, New York, Oxford University Press, 2009, xii, 195 pp, US $24.95, ISBN 978-0-19-5#6789-8 (Source

More information

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 3-5, 3-6 Special discrete random variable distributions we will cover

More information

Sample Induction Proofs

Sample Induction Proofs Math 3 Worksheet: Induction Proofs III, Sample Proofs A.J. Hildebrand Sample Induction Proofs Below are model solutions to some of the practice problems on the induction worksheets. The solutions given

More information

Example. A casino offers the following bets (the fairest bets in the casino!) 1 You get $0 (i.e., you can walk away)

Example. A casino offers the following bets (the fairest bets in the casino!) 1 You get $0 (i.e., you can walk away) : Three bets Math 45 Introduction to Probability Lecture 5 Kenneth Harris aharri@umich.edu Department of Mathematics University of Michigan February, 009. A casino offers the following bets (the fairest

More information

WHERE DOES THE 10% CONDITION COME FROM?

WHERE DOES THE 10% CONDITION COME FROM? 1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay

More information

Expected Value and the Game of Craps

Expected Value and the Game of Craps Expected Value and the Game of Craps Blake Thornton Craps is a gambling game found in most casinos based on rolling two six sided dice. Most players who walk into a casino and try to play craps for the

More information

Reading 13 : Finite State Automata and Regular Expressions

Reading 13 : Finite State Automata and Regular Expressions CS/Math 24: Introduction to Discrete Mathematics Fall 25 Reading 3 : Finite State Automata and Regular Expressions Instructors: Beck Hasti, Gautam Prakriya In this reading we study a mathematical model

More information

Chicago Booth BUSINESS STATISTICS 41000 Final Exam Fall 2011

Chicago Booth BUSINESS STATISTICS 41000 Final Exam Fall 2011 Chicago Booth BUSINESS STATISTICS 41000 Final Exam Fall 2011 Name: Section: I pledge my honor that I have not violated the Honor Code Signature: This exam has 34 pages. You have 3 hours to complete this

More information

Mathematical Expectation

Mathematical Expectation Mathematical Expectation Properties of Mathematical Expectation I The concept of mathematical expectation arose in connection with games of chance. In its simplest form, mathematical expectation is the

More information

Recursive Estimation

Recursive Estimation Recursive Estimation Raffaello D Andrea Spring 04 Problem Set : Bayes Theorem and Bayesian Tracking Last updated: March 8, 05 Notes: Notation: Unlessotherwisenoted,x, y,andz denoterandomvariables, f x

More information

TOPIC P2: SAMPLE SPACE AND ASSIGNING PROBABILITIES SPOTLIGHT: THE CASINO GAME OF ROULETTE. Topic P2: Sample Space and Assigning Probabilities

TOPIC P2: SAMPLE SPACE AND ASSIGNING PROBABILITIES SPOTLIGHT: THE CASINO GAME OF ROULETTE. Topic P2: Sample Space and Assigning Probabilities TOPIC P2: SAMPLE SPACE AND ASSIGNING PROBABILITIES SPOTLIGHT: THE CASINO GAME OF ROULETTE Roulette is one of the most popular casino games. The name roulette is derived from the French word meaning small

More information

Chapter 5 Section 2 day 1 2014f.notebook. November 17, 2014. Honors Statistics

Chapter 5 Section 2 day 1 2014f.notebook. November 17, 2014. Honors Statistics Chapter 5 Section 2 day 1 2014f.notebook November 17, 2014 Honors Statistics Monday November 17, 2014 1 1. Welcome to class Daily Agenda 2. Please find folder and take your seat. 3. Review Homework C5#3

More information

Feb 7 Homework Solutions Math 151, Winter 2012. Chapter 4 Problems (pages 172-179)

Feb 7 Homework Solutions Math 151, Winter 2012. Chapter 4 Problems (pages 172-179) Feb 7 Homework Solutions Math 151, Winter 2012 Chapter Problems (pages 172-179) Problem 3 Three dice are rolled. By assuming that each of the 6 3 216 possible outcomes is equally likely, find the probabilities

More information

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8.

Random variables P(X = 3) = P(X = 3) = 1 8, P(X = 1) = P(X = 1) = 3 8. Random variables Remark on Notations 1. When X is a number chosen uniformly from a data set, What I call P(X = k) is called Freq[k, X] in the courseware. 2. When X is a random variable, what I call F ()

More information

1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let

1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let Copyright c 2009 by Karl Sigman 1 Stopping Times 1.1 Stopping Times: Definition Given a stochastic process X = {X n : n 0}, a random time τ is a discrete random variable on the same probability space as

More information

The Binomial Probability Distribution

The Binomial Probability Distribution The Binomial Probability Distribution MATH 130, Elements of Statistics I J. Robert Buchanan Department of Mathematics Fall 2015 Objectives After this lesson we will be able to: determine whether a probability

More information

Question 1 Formatted: Formatted: Formatted: Formatted:

Question 1 Formatted: Formatted: Formatted: Formatted: In many situations in life, we are presented with opportunities to evaluate probabilities of events occurring and make judgments and decisions from this information. In this paper, we will explore four

More information

Lecture 8. Confidence intervals and the central limit theorem

Lecture 8. Confidence intervals and the central limit theorem Lecture 8. Confidence intervals and the central limit theorem Mathematical Statistics and Discrete Mathematics November 25th, 2015 1 / 15 Central limit theorem Let X 1, X 2,... X n be a random sample of

More information

Probability. a number between 0 and 1 that indicates how likely it is that a specific event or set of events will occur.

Probability. a number between 0 and 1 that indicates how likely it is that a specific event or set of events will occur. Probability Probability Simple experiment Sample space Sample point, or elementary event Event, or event class Mutually exclusive outcomes Independent events a number between 0 and 1 that indicates how

More information

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question.

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Practice Test Chapter 9 Name MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Find the odds. ) Two dice are rolled. What are the odds against a sum

More information

Mathematical Induction. Lecture 10-11

Mathematical Induction. Lecture 10-11 Mathematical Induction Lecture 10-11 Menu Mathematical Induction Strong Induction Recursive Definitions Structural Induction Climbing an Infinite Ladder Suppose we have an infinite ladder: 1. We can reach

More information

Conditional Probability

Conditional Probability Chapter 4 Conditional Probability 4. Discrete Conditional Probability Conditional Probability In this section we ask and answer the following question. Suppose we assign a distribution function to a sample

More information

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k.

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k. REPEATED TRIALS Suppose you toss a fair coin one time. Let E be the event that the coin lands heads. We know from basic counting that p(e) = 1 since n(e) = 1 and 2 n(s) = 2. Now suppose we play a game

More information

1/3 1/3 1/3 0.4 0.4 0.4 0.4 0.4 0.4 0.4 0 1 2 3 4 5 6 7 8 0.6 0.6 0.6 0.6 0.6 0.6 0.6

1/3 1/3 1/3 0.4 0.4 0.4 0.4 0.4 0.4 0.4 0 1 2 3 4 5 6 7 8 0.6 0.6 0.6 0.6 0.6 0.6 0.6 HOMEWORK 4: SOLUTIONS. 2. A Markov chain with state space {, 2, 3} has transition probability matrix /3 /3 /3 P = 0 /2 /2 0 0 Show that state 3 is absorbing and, starting from state, find the expected

More information

Unit 19: Probability Models

Unit 19: Probability Models Unit 19: Probability Models Summary of Video Probability is the language of uncertainty. Using statistics, we can better predict the outcomes of random phenomena over the long term from the very complex,

More information

Section 7C: The Law of Large Numbers

Section 7C: The Law of Large Numbers Section 7C: The Law of Large Numbers Example. You flip a coin 00 times. Suppose the coin is fair. How many times would you expect to get heads? tails? One would expect a fair coin to come up heads half

More information

Worksheet for Teaching Module Probability (Lesson 1)

Worksheet for Teaching Module Probability (Lesson 1) Worksheet for Teaching Module Probability (Lesson 1) Topic: Basic Concepts and Definitions Equipment needed for each student 1 computer with internet connection Introduction In the regular lectures in

More information

Roulette. Math 5 Crew. Department of Mathematics Dartmouth College. Roulette p.1/14

Roulette. Math 5 Crew. Department of Mathematics Dartmouth College. Roulette p.1/14 Roulette p.1/14 Roulette Math 5 Crew Department of Mathematics Dartmouth College Roulette p.2/14 Roulette: A Game of Chance To analyze Roulette, we make two hypotheses about Roulette s behavior. When we

More information

Chapter 4. Probability Distributions

Chapter 4. Probability Distributions Chapter 4 Probability Distributions Lesson 4-1/4-2 Random Variable Probability Distributions This chapter will deal the construction of probability distribution. By combining the methods of descriptive

More information

Random Fibonacci-type Sequences in Online Gambling

Random Fibonacci-type Sequences in Online Gambling Random Fibonacci-type Sequences in Online Gambling Adam Biello, CJ Cacciatore, Logan Thomas Department of Mathematics CSUMS Advisor: Alfa Heryudono Department of Mathematics University of Massachusetts

More information

3. Mathematical Induction

3. Mathematical Induction 3. MATHEMATICAL INDUCTION 83 3. Mathematical Induction 3.1. First Principle of Mathematical Induction. Let P (n) be a predicate with domain of discourse (over) the natural numbers N = {0, 1,,...}. If (1)

More information

ECE302 Spring 2006 HW1 Solutions January 16, 2006 1

ECE302 Spring 2006 HW1 Solutions January 16, 2006 1 ECE302 Spring 2006 HW1 Solutions January 16, 2006 1 Solutions to HW1 Note: These solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics

More information

Statistics 100A Homework 8 Solutions

Statistics 100A Homework 8 Solutions Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the one-half

More information

Law of Large Numbers. Alexandra Barbato and Craig O Connell. Honors 391A Mathematical Gems Jenia Tevelev

Law of Large Numbers. Alexandra Barbato and Craig O Connell. Honors 391A Mathematical Gems Jenia Tevelev Law of Large Numbers Alexandra Barbato and Craig O Connell Honors 391A Mathematical Gems Jenia Tevelev Jacob Bernoulli Life of Jacob Bernoulli Born into a family of important citizens in Basel, Switzerland

More information

Colored Hats and Logic Puzzles

Colored Hats and Logic Puzzles Colored Hats and Logic Puzzles Alex Zorn January 21, 2013 1 Introduction In this talk we ll discuss a collection of logic puzzles/games in which a number of people are given colored hats, and they try

More information

ECE302 Spring 2006 HW3 Solutions February 2, 2006 1

ECE302 Spring 2006 HW3 Solutions February 2, 2006 1 ECE302 Spring 2006 HW3 Solutions February 2, 2006 1 Solutions to HW3 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in

More information

The Binomial Distribution

The Binomial Distribution The Binomial Distribution James H. Steiger November 10, 00 1 Topics for this Module 1. The Binomial Process. The Binomial Random Variable. The Binomial Distribution (a) Computing the Binomial pdf (b) Computing

More information

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T..

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Probability Theory A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,

More information

A Few Basics of Probability

A Few Basics of Probability A Few Basics of Probability Philosophy 57 Spring, 2004 1 Introduction This handout distinguishes between inductive and deductive logic, and then introduces probability, a concept essential to the study

More information

Probability Generating Functions

Probability Generating Functions page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence

More information

Solution. Solution. (a) Sum of probabilities = 1 (Verify) (b) (see graph) Chapter 4 (Sections 4.3-4.4) Homework Solutions. Section 4.

Solution. Solution. (a) Sum of probabilities = 1 (Verify) (b) (see graph) Chapter 4 (Sections 4.3-4.4) Homework Solutions. Section 4. Math 115 N. Psomas Chapter 4 (Sections 4.3-4.4) Homework s Section 4.3 4.53 Discrete or continuous. In each of the following situations decide if the random variable is discrete or continuous and give

More information

2. Discrete random variables

2. Discrete random variables 2. Discrete random variables Statistics and probability: 2-1 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be

More information

Decision Making under Uncertainty

Decision Making under Uncertainty 6.825 Techniques in Artificial Intelligence Decision Making under Uncertainty How to make one decision in the face of uncertainty Lecture 19 1 In the next two lectures, we ll look at the question of how

More information

Chapter 5. Discrete Probability Distributions

Chapter 5. Discrete Probability Distributions Chapter 5. Discrete Probability Distributions Chapter Problem: Did Mendel s result from plant hybridization experiments contradicts his theory? 1. Mendel s theory says that when there are two inheritable

More information

MONT 107N Understanding Randomness Solutions For Final Examination May 11, 2010

MONT 107N Understanding Randomness Solutions For Final Examination May 11, 2010 MONT 07N Understanding Randomness Solutions For Final Examination May, 00 Short Answer (a) (0) How are the EV and SE for the sum of n draws with replacement from a box computed? Solution: The EV is n times

More information

That s Not Fair! ASSESSMENT #HSMA20. Benchmark Grades: 9-12

That s Not Fair! ASSESSMENT #HSMA20. Benchmark Grades: 9-12 That s Not Fair! ASSESSMENT # Benchmark Grades: 9-12 Summary: Students consider the difference between fair and unfair games, using probability to analyze games. The probability will be used to find ways

More information

Lectures on Stochastic Processes. William G. Faris

Lectures on Stochastic Processes. William G. Faris Lectures on Stochastic Processes William G. Faris November 8, 2001 2 Contents 1 Random walk 7 1.1 Symmetric simple random walk................... 7 1.2 Simple random walk......................... 9 1.3

More information

Elementary Statistics and Inference. Elementary Statistics and Inference. 17 Expected Value and Standard Error. 22S:025 or 7P:025.

Elementary Statistics and Inference. Elementary Statistics and Inference. 17 Expected Value and Standard Error. 22S:025 or 7P:025. Elementary Statistics and Inference S:05 or 7P:05 Lecture Elementary Statistics and Inference S:05 or 7P:05 Chapter 7 A. The Expected Value In a chance process (probability experiment) the outcomes of

More information
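The roulette question (Problem 2) can be answered with the binomial distribution. Each $1 bet returns $36 with probability 1/38 and nothing otherwise, so after spending $105 on 105 bets the player's final wealth is 36k, where k is the number of winning bets. The player is ahead (more than $105) exactly when 36k > 105, i.e. k >= 3. A minimal computational sketch (variable names are ours):

```python
from math import comb

n, p = 105, 1 / 38  # 105 bets, each winning with probability 1/38

# P(k <= 2): the player ends with at most $72 and is not ahead.
p_behind = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(3))
p_ahead = 1 - p_behind
print(f"P(ahead) = {p_ahead:.4f}")
```

The exact binomial sum shows the player comes out ahead slightly more than half the time, even though each individual bet has negative expected value.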