Chapter 4 Lecture Notes

Random Variables. October 27.

Section 4.1 Random Variables

A random variable is typically a real-valued function defined on the sample space of some experiment. For instance, we may be concerned with the sum obtained when rolling two dice rather than the actual set of outcomes {(1, 6), (2, 5), ...}. Or we may be interested in the total number of heads in n tosses of a coin and not care about the exact head-tail sequence that occurred. This real number associated to the outcomes of an experiment is a random variable.

Example (1a) Suppose we toss 3 coins and let Y denote the number of heads that appear. Y is a random variable taking on one of the values 0, 1, 2, or 3 with respective probabilities

P{Y = 0} = 1/8,  P{Y = 1} = 3/8,  P{Y = 2} = 3/8,  P{Y = 3} = 1/8.

We see that Σ_{i=0}^{3} P{Y = i} = 1, in accordance with the probability rules.

Exercise (4.1) Two balls are chosen randomly from an urn containing 8 white, 4 black, and 2 orange balls. Suppose that we win $2 for each black ball and lose $1 for each white ball selected. Let X denote our winnings. What are the possible values of X, and what are the probabilities associated with each value?

We first note that there are 6 possible (unordered) outcomes in this scenario. For each one, we gain or lose money; we denote this gain/loss by the random variable X, and let x denote any one value that X may take on. Let O, B, and W denote the event of drawing an orange, black, and white ball, respectively. Then we can come up with the following:

Outcome:   WW   WO   OO   BW   BO   BB
X:         -2   -1    0    1    2    4

To find the probability associated with each value of X, we compute the probability of each outcome. Consider selecting two white balls from the urn. The probability of this event is

P(WW) = C(8,2) / C(14,2) = 28/91.

Therefore, we can write P{X = -2} = p(-2) = 28/91 = 4/13.
Similarly, to find the probability of drawing a white ball and an orange ball, we find

P(WO) = [C(8,1) C(2,1)] / C(14,2) = 16/91.

We can do this for each outcome to create the following probability mass function:

Outcome:    WW     WO     OO     BW     BO     BB
X:          -2     -1      0      1      2      4
P{X = x}:  28/91  16/91   1/91  32/91   8/91   6/91

It is easy to verify that the sum of these probabilities is 1.

Exercise (4.2) Two fair dice are rolled. Let X equal the product of the 2 dice. Compute P{X = i} for i = 1, 2, ..., 36.

Consider the product of the faces after two dice are rolled. Just as with the sum of two dice, there are 36 possible outcomes. Scanning the set of possible outcomes, we see that the random variable X takes on the values 1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, 16, 18, 20, 24, 25, 30, and 36, with the following probabilities:

P{X = i} = 1/36,  for i = 1, 9, 16, 25, 36,
P{X = j} = 2/36,  for j = 2, 3, 5, 8, 10, 15, 18, 20, 24, 30,
P{X = 4} = 3/36,
P{X = k} = 4/36,  for k = 6, 12.

Exercise (1d) Independent trials consisting of flipping a coin having probability p of coming up heads are continually performed until either a head occurs or a total of n flips is made. If we let X denote the number of times the coin is flipped, then X is a random variable taking on what values? What are the respective probabilities?

If X is the number of times the coin is flipped, then X takes on the values 1, 2, ..., n.

Since each flip is independent of the last, we can come up with the following outcomes and probabilities:

Outcome                    Probability
H                          P{X = 1} = p
TH                         P{X = 2} = (1 - p)p
TTH                        P{X = 3} = (1 - p)^2 p
TTTH                       P{X = 4} = (1 - p)^3 p
...                        ...
T...TH  (n - 2 tails)      P{X = n - 1} = (1 - p)^(n-2) p
T...T   (n - 1 tails)      P{X = n} = (1 - p)^(n-1)

We note here that for the number of flips to be X = n, we need only flip n - 1 tails in a row. The last flip doesn't matter, because either way we will have flipped the coin n times.
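The table above can be checked numerically. A minimal sketch in Python (the choices p = 1/3 and n = 5 are arbitrary, just for the check):

```python
from fractions import Fraction

def flips_pmf(p, n):
    """P{X = k} for the number of flips until the first head,
    capped at n flips total."""
    pmf = {k: (1 - p) ** (k - 1) * p for k in range(1, n)}
    # X = n whenever the first n-1 flips are all tails,
    # regardless of how the last flip lands.
    pmf[n] = (1 - p) ** (n - 1)
    return pmf

pmf = flips_pmf(Fraction(1, 3), 5)
assert sum(pmf.values()) == 1  # the probabilities sum to 1 exactly
```

Using exact fractions rather than floats makes the "sums to 1" check an identity instead of an approximation.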

Section 4.2 Discrete Random Variables

A random variable that takes on at most a countable number of possible values is said to be discrete. A countable set is a set with the same number of elements as some subset of the natural numbers, N = {0, 1, 2, ...}. A countable set can be a finite set of numbers or a countably infinite set. A countably infinite set is an infinite set whose members can be labeled in such a way that the labeling puts them in one-to-one correspondence with the natural numbers. In other words, there is a prescription that allows us to identify every member of a countably infinite set by relating it to an element of the natural numbers. This one-to-one correspondence is called a bijection. Recall that a function is one-to-one, or injective, if every element of the codomain is the image of at most one element of the domain. A function is onto, or surjective, if every element of the codomain is the image of at least one element of the domain. Using these definitions, we can decide whether a set is countably infinite.

Consider the set of integers, Z. Is it countably infinite? Yes, because the following function maps every natural number to an integer:

f(x) = x/2         if x is even,
f(x) = -(x + 1)/2  if x is odd.

It is easy to show that this function is indeed a bijection. [Draw Picture] Therefore, the integers are countable and have exactly the same number of elements as the natural numbers. In a similar way, we can show that other sets, like the rationals, are countable. The real numbers are uncountable.

For any discrete random variable X, we define the probability mass function p(x) of X by p(x) = P{X = x}. Here, x can be thought of as a continuous variable, but we suppose that X takes on the countable values x_1, x_2, ..., where each value x_i is labeled by a natural number i = 1, 2, .... This is customary notation that suggests each x_i originates from a countable set.
Then p(x_i) is non-negative at each of these values and zero for all other values:

p(x_i) ≥ 0  for i = 1, 2, ...,
p(x) = 0    for all other values of x.

The variable X must take on one of the values x_i, which means Σ_i p(x_i) = 1. The probability mass function p(x) is presented graphically by plotting p(x_i) on the y-axis against x_i on the x-axis.

Example Suppose X is the random variable representing the sum when two dice are rolled. Then we know X takes on integer values between 2 and 12, with probabilities ranging from 1/36 to 6/36. We represent the probability mass function with the following graph, where each bar is centered over the value of X and the height is the probability:

[Figure: probability mass function p(x) for the sum of two dice.]

Exercise A discrete probability mass function of a random variable X has the form p(i) = c λ^i / i! for i = 0, 1, 2, ..., where λ is some positive value. Find (a) P{X = 0} and (b) P{X > 2}.

a.) Using the formula for the probability mass function, we obtain

P{X = 0} = p(0) = c λ^0 / 0! = c.

To find the constant c, we note that p(i) is a probability mass function, which means

Σ_{i=0}^{∞} p(i) = 1.

Substituting in the expression for p yields

1 = Σ_{i=0}^{∞} c λ^i / i! = c Σ_{i=0}^{∞} λ^i / i! = c e^λ,

since the infinite series is the series representation of e^λ. Hence, P{X = 0} = c = e^(-λ).

b.) It follows that P{X > 2} = 1 - P{X ≤ 2}, which gives

P{X > 2} = 1 - P{X = 0} - P{X = 1} - P{X = 2} = 1 - e^(-λ) - λ e^(-λ) - (λ^2 / 2) e^(-λ).

For a random variable X, the function F defined by

F(x) = P{X ≤ x},  -∞ < x < ∞,

is called the cumulative distribution function, or more simply the distribution function, of X. This function specifies the probability that the random variable is less than or equal to x. Suppose that a ≤ b. Then the event {X ≤ a} is contained in the event {X ≤ b}, and so F(a) ≤ F(b), which means F(x) is a nondecreasing function of x.

We can define the cumulative distribution function in terms of p(x). In this case, we can write

F(b) = Σ_{all x ≤ b} p(x).

From this we see that if X is a discrete random variable with possible values x_1, x_2, ..., where x_1 < x_2 < ..., then the distribution function F of X is a step function. Indeed, consider the following example.

Example Suppose X is a random variable with mass function given by the four values

p(1) = 1/4,  p(2) = 1/2,  p(3) = 1/8,  p(4) = 1/8.

Then the cumulative distribution function F(b) is given by

F(b) = 0    if b < 1,
       1/4  if 1 ≤ b < 2,
       3/4  if 2 ≤ b < 3,
       7/8  if 3 ≤ b < 4,
       1    if 4 ≤ b.

[Figure: the probability mass function p(x) and the cumulative distribution function F(b).]
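The relation F(b) = Σ_{x ≤ b} p(x) translates directly into code. A quick sketch using the four-point mass function from the example above:

```python
from fractions import Fraction as F

pmf = {1: F(1, 4), 2: F(1, 2), 3: F(1, 8), 4: F(1, 8)}

def cdf(b):
    """F(b) = sum of p(x) over all x <= b."""
    return sum(p for x, p in pmf.items() if x <= b)

# F steps up by p(x) at each mass point and is flat in between.
assert cdf(0.5) == 0
assert cdf(1) == F(1, 4)
assert cdf(2.7) == F(3, 4)   # flat between the jumps at 2 and 3
assert cdf(3) == F(7, 8)
assert cdf(10) == 1
```

The assertions reproduce the step-function values listed in the example.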

We note that the size of the jump at each value x = 1, 2, 3, and 4 is exactly p(x).

Exercise (4.17) Suppose that a cumulative distribution function is given by

F(b) = 0                 if b < 0,
       b/4               if 0 ≤ b < 1,
       1/2 + (b - 1)/4   if 1 ≤ b < 2,
       11/12             if 2 ≤ b < 3,
       1                 if 3 ≤ b.

a.) Draw a rough sketch of the cumulative distribution function.
b.) Find P{X = i} for i = 1, 2, 3.
c.) Find P{1/2 < X < 3/2}.

a.) A sketch of the cumulative distribution function is below:

[Figure: sketch of the cumulative distribution function F(b).]

b.) The value of the probability mass function at each point is equal to the height of the jump there. Therefore, P{X = 1} = 1/2 - 1/4 = 1/4, P{X = 2} = 11/12 - 3/4 = 1/6, and P{X = 3} = 1 - 11/12 = 1/12.

c.) By the definition of the cumulative distribution function, and since F is continuous at both 1/2 and 3/2, it follows that P{1/2 < X < 3/2} = P{X < 3/2} - P{X ≤ 1/2}, which means

P{1/2 < X < 3/2} = F(3/2) - F(1/2) = 5/8 - 1/8 = 1/2.
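The jump computation in part (b) can be mirrored in code. Below, cdf is my transcription of the piecewise F from Exercise (4.17), and each atom is approximated as F(i) - F(i - ε) for a tiny ε; the resulting jump sizes are my own evaluation of F:

```python
from fractions import Fraction as F

def cdf(b):
    """Piecewise CDF from Exercise (4.17), in exact arithmetic."""
    b = F(b)
    if b < 0:
        return F(0)
    if b < 1:
        return b / 4
    if b < 2:
        return F(1, 2) + (b - 1) / 4
    if b < 3:
        return F(11, 12)
    return F(1)

def jump(i, eps=F(1, 10**9)):
    """P{X = i} = F(i) - F(i-), approximated with a small left offset."""
    return cdf(i) - cdf(i - eps)

tol = F(1, 1000)
assert abs(jump(1) - F(1, 4)) < tol
assert abs(jump(2) - F(1, 6)) < tol
assert abs(jump(3) - F(1, 12)) < tol
# part (c): no atoms at 1/2 or 3/2, so the interval probability is a difference of F values
assert cdf(F(3, 2)) - cdf(F(1, 2)) == F(1, 2)
```

Note that this F has a continuous piece on [0, 2), so X is not purely discrete; the jumps capture only the atoms.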

Section 4.3 Expected Value

If X is a discrete random variable having a probability mass function p(x), then the expectation, or expected value, of X, denoted by E[X], is defined as

E[X] = Σ_{x: p(x)>0} x p(x).

The expected value is the weighted average of the possible values that X can take on, where each value is weighted by its corresponding probability.

Example Suppose a simple mass function for the random variable X is p(0) = 1/2, p(1) = 1/2. Then

E[X] = (0)p(0) + (1)p(1) = (0)(1/2) + (1)(1/2) = 1/2.

In this case, the expected value is the ordinary average of the two possible values.

Example Suppose a simple mass function for the random variable X is p(0) = 1/3, p(1) = 2/3. Then

E[X] = (0)p(0) + (1)p(1) = (0)(1/3) + (1)(2/3) = 2/3.

In this case, the value at X = 1 is given twice as much weight as the value at X = 0.

Suppose we are playing a game where X is our winnings, which could take on the values x_1, x_2, ..., x_n, with corresponding probabilities p(x_1), p(x_2), ..., p(x_n). If each p(x_i) is thought of as a relative frequency (in the sense that we play the game for a very long time), then our average winnings per game would be

Σ_i x_i p(x_i) = E[X].

Exercise (3a) Suppose X is the outcome of rolling a fair die. Find E[X].

The random variable X takes on the values i = 1, 2, 3, 4, 5, 6, with each value equally probable: p(i) = 1/6. Therefore,

E[X] = Σ_{i=1}^{6} i p(i) = (1/6) Σ_{i=1}^{6} i = (1/6) · (6)(7)/2 = 7/2.

Exercise (4.25) Two coins are to be flipped. The first coin will land on heads with probability 0.6, the second with probability 0.7. Assume that the results of the flips are independent, and let X equal the total number of heads that result. Find a.) P{X = 1} and b.) E[X].

The sample space for this experiment has four outcomes:

S = {(H, H), (H, T), (T, H), (T, T)},

where H denotes flipping a heads and T denotes flipping a tails. If X is the random variable that counts the total number of heads flipped, then X takes on the values i = 0, 1, 2. The probability mass function is given by the following three probabilities:

p(0) = P(TT) = (0.4)(0.3) = 0.12,
p(1) = P(HT) + P(TH) = (0.6)(0.3) + (0.4)(0.7) = 0.46,
p(2) = P(HH) = (0.6)(0.7) = 0.42.

The expected value is then

E[X] = Σ_{i=0}^{2} i p(i) = (0)(0.12) + (1)(0.46) + (2)(0.42) = 1.3.

Exercise (4.30) A person tosses a fair coin until a tail appears for the first time. If the tail appears on the nth flip, the person wins 2^n dollars. Let X denote the player's winnings. Find E[X].

a.) Would you be willing to pay $1 million to play this game once?
b.) Would you be willing to pay $1 million for each game if you could play for as long as you liked and only had to settle up when you stopped playing?

Let X be the player's winnings in dollars; then X takes on the values x = 2, 4, 8, ..., 2^n, ... for n = 1, 2, .... Since the game consists of flipping a fair coin, each flip lands tails or heads with probability 1/2, and each flip is independent of the last. Hence, the probability mass function is

p(x) = p(2^n) = (1/2)^n.

Hence, the expected value is

E[X] = Σ_{x: p(x)>0} x p(x) = Σ_{n=1}^{∞} 2^n (1/2)^n = Σ_{n=1}^{∞} 1 = ∞.

This means that if we continue to play this game for a long period of time (forever, actually), we can expect to win an infinite amount of money.

a.) Probably not, because to win our money back in one game we would need to flip 19 heads in a row before the first tail: the winnings first exceed $1 million at 2^20 = 1,048,576. This is too unlikely in only one try.

b.) Yes, because with unlimited plays we can definitely expect to win our money back at some point.

Exercise (4.22) Suppose that two teams play a series of games that ends when one of them has won i games. Suppose that each game played is, independently, won by team A with probability p. Find the expected number of games that are played when a.) i = 2 and b.) i = 3. Also, show in both cases that this number is maximized when p = 1/2.
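Exercise (4.22) is left as practice, but a numeric sketch is handy for checking whatever closed form you derive. The enumeration below is my own approach (not from the notes): the series ends after n games exactly when one team takes i - 1 of the first n - 1 games and then wins game n.

```python
from math import comb

def expected_games(i, p):
    """E[number of games] when the series ends at i wins;
    each game is won by team A with probability p."""
    q = 1 - p
    total = 0.0
    for n in range(i, 2 * i):  # the series lasts between i and 2i - 1 games
        # A wins in exactly n games: A takes i-1 of the first n-1, then game n.
        p_a = comb(n - 1, i - 1) * p ** i * q ** (n - i)
        # symmetrically for team B
        p_b = comb(n - 1, i - 1) * q ** i * p ** (n - i)
        total += n * (p_a + p_b)
    return total

# For i = 2 this matches E[N] = 2 + 2p(1 - p), which peaks at p = 1/2.
assert abs(expected_games(2, 0.5) - 2.5) < 1e-12
assert all(expected_games(2, 0.5) >= expected_games(2, p) for p in (0.1, 0.3, 0.9))
```

Plotting expected_games(3, p) over a grid of p values likewise shows a maximum at p = 1/2.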

Section 4.4 Expectation of a Function of a Random Variable

In order to talk about variance, we need to look at the expected value of a function of X, say g(X). For example, given a random variable X, suppose we want the expected value of the square of its values. We can think of g(X) as a discrete random variable in its own right; it has a probability mass function, which can be determined from p(x). Consider the following example.

Example Let X denote a random variable that takes on the values -1, 0, and 1 with probabilities

p(-1) = 0.2,  p(0) = 0.5,  p(1) = 0.3.

We can compute E[X^2] by letting Y = X^2. Then the probability mass function of Y, denoted by p*(y), takes the following values:

p*(1) = p(-1) + p(1) = 0.2 + 0.3 = 0.5,
p*(0) = p(0) = 0.5.

Hence,

E[X^2] = E[Y] = (0)p*(0) + (1)p*(1) = (0)(0.5) + (1)(0.5) = 0.5.

We note that E[X^2] ≠ (E[X])^2.

We get the general idea that we will always be able to calculate the expected value of g(X) if we know the original probability mass function p(x). This, indeed, is always the case. We prove it below.

Proposition If X is a discrete random variable that takes on values x_i for i = 1, 2, ... with probabilities p(x_i), then

E[g(X)] = Σ_i g(x_i) p(x_i).

Proof. We prove this proposition for a finite random variable. Let the random variable X take on the values x_i for i = 1, 2, ..., n. Because the function g may not be one-to-one, suppose g(X) (another random variable) takes on the values g_1, g_2, ..., g_m (where m ≤ n). Then g(X) is a random variable such that for j = 1, 2, ..., m, its probability mass function is

P{g(X) = g_j} = p*(g_j) = Σ_{i: g(x_i)=g_j} p(x_i).

Using this and the definition of expected value gives the following:

E[g(X)] = Σ_{j=1}^{m} g_j p*(g_j)
        = Σ_{j=1}^{m} g_j Σ_{i: g(x_i)=g_j} p(x_i)
        = Σ_{j=1}^{m} Σ_{i: g(x_i)=g_j} g(x_i) p(x_i)
        = Σ_{i=1}^{n} g(x_i) p(x_i).

Proposition If a and b are constants, then E[aX + b] = aE[X] + b.

Proof. We have

E[aX + b] = Σ_{x: p(x)>0} (ax + b) p(x)
          = a Σ_{x: p(x)>0} x p(x) + b Σ_{x: p(x)>0} p(x)
          = aE[X] + b.

The expected value of a random variable X, E[X], is also referred to as the mean or first moment of X. The quantity E[X^n] is referred to as the nth moment of X and is given by

E[X^n] = Σ_{x: p(x)>0} x^n p(x).
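Both propositions can be checked on the small example from the start of this section. A minimal sketch:

```python
pmf = {-1: 0.2, 0: 0.5, 1: 0.3}

def expect(g, pmf):
    """E[g(X)] = sum of g(x) * p(x) over the support of X."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, pmf)                # E[X] = 0.1
second_moment = expect(lambda x: x * x, pmf)   # E[X^2] = 0.5
assert abs(second_moment - 0.5) < 1e-12
assert second_moment != mean ** 2              # E[X^2] != (E[X])^2 in general
# Linearity: E[aX + b] = a E[X] + b, here with a = 3, b = 2
assert abs(expect(lambda x: 3 * x + 2, pmf) - (3 * mean + 2)) < 1e-12
```

The point of the first proposition is exactly what expect does: we never need the mass function of g(X) itself, only p(x).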

Section 4.5 Variance

The expected value of a random variable X is an important measure of the center of X, but it tells us nothing about the variation, or spread, of its values. Consider the following random variables:

W = 0 with probability 1,
Y = -1 with p(-1) = 1/2,   Y = 1 with p(1) = 1/2,
Z = -100 with p(-100) = 1/2,   Z = 100 with p(100) = 1/2.

All of these variables have the same expectation. To look at variation, we look at how far each value is from its mean, on average. One way to do this is to look at the quantity E[|X - µ|], where µ = E[X]. The absolute value |X - µ| is difficult to deal with, so we look at the square instead.

Definition Let X be a random variable with mean µ. The variance of X, denoted by Var[X], is defined as

Var[X] = E[(X - µ)^2].

Exercise Show that E[(X - µ)^2] = E[X^2] - (E[X])^2. This is an alternate form for the variance, and typically an easier way to compute it. Note that it involves the second moment of X.

Exercise A useful identity is that for any constants a and b,

Var[aX + b] = a^2 Var[X].

Show that this is true.

Exercise (5a) Calculate the variance, Var[X], if X represents the outcome when a fair die is rolled.
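Exercise (5a) can be checked numerically with the alternate form Var[X] = E[X^2] - (E[X])^2. A sketch in exact arithmetic (the value 35/12 is my computation, not stated in the notes):

```python
from fractions import Fraction as F

pmf = {i: F(1, 6) for i in range(1, 7)}  # fair die

mean = sum(x * p for x, p in pmf.items())        # E[X] = 7/2
second = sum(x * x * p for x, p in pmf.items())  # E[X^2] = 91/6
var = second - mean ** 2                         # E[X^2] - (E[X])^2

assert mean == F(7, 2)
assert var == F(35, 12)
```

Compare with W, Y, Z above: all three have mean 0, but their variances are 0, 1, and 10000, which is exactly the spread the mean fails to capture.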

Exercise (4.36) Suppose that two teams play a series of games that ends when one of them has won i games. Suppose that each game played is, independently, won by team A with probability p. Find the variance in the number of games that are played when i = 2. Show that this value is maximized when p = 1/2.

The mean is analogous to the center of gravity of a distribution of mass. The variance, in the language of mechanics, represents the moment of inertia. The square root of the variance is called the standard deviation of X; we denote it by SD[X], given by

SD[X] = √Var[X].

Section 4.6 Bernoulli and Binomial Random Variables

Suppose an experiment takes place whose outcome can be only one of two things: a success or a failure. We can let X = 1 for a success and X = 0 for a failure. Then our probability mass function is

p(0) = P{X = 0} = 1 - p,
p(1) = P{X = 1} = p,

where p, 0 ≤ p ≤ 1, is the probability of a success. A random variable that satisfies these conditions is called a Bernoulli random variable.

Suppose we perform n trials of the experiment, each of which results in a success with probability p. If X represents the number of successes in the n trials, then X is a Binomial random variable. The two values n and p are called the parameters of the Binomial model, and we write X ~ Bin(n, p).

Exercise What is the probability mass function for the Binomial random variable X with parameters n and p? Once obtained, check that Σ_{i=0}^{n} p(i) = 1.

Exercise (6a) Five fair coins are flipped. If the outcomes are assumed independent, find the probability mass function of the number of heads obtained.

Exercise (6b) It is known that screws produced by a certain company will be defective with probability 0.01, independently of one another. The company sells the screws in packages of 10 and offers a money-back guarantee that at most 1 of the 10 screws is defective. What proportion of packages sold must the company replace?

Exercise (4.32) To determine whether they have a certain disease, 100 people are to have their blood tested. However, rather than testing each individual separately, it has been decided first to place the people into groups of 10. The blood samples of the 10 people in each group will be

pooled and analyzed together. If the test is negative, one test will suffice for the 10 people, whereas if the test is positive, each of the 10 will also be individually tested and, in all, 11 tests will be made on this group. Assume that the probability that a person has the disease is 0.1 for all people, independently of one another, and compute the expected number of tests necessary for each group.

Now that we have defined the Binomial random variable, we seek formulas for its expected value and variance. The expected value can be computed as follows:

E[X] = Σ_{i=0}^{n} i P{X = i}
     = Σ_{i=0}^{n} i C(n, i) p^i (1 - p)^(n-i)
     = Σ_{i=1}^{n} n C(n-1, i-1) p^i (1 - p)^(n-i)        [using i C(n, i) = n C(n-1, i-1)]
     = np Σ_{j=0}^{n-1} C(n-1, j) p^j (1 - p)^(n-1-j)     [substituting j = i - 1]
     = np,

since the final sum is a Binomial pmf summed over its whole support.
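The Binomial exercises above can be spot-checked with a small helper; the derivation just given (E[X] = np) is mirrored numerically here, using the numbers from Exercise (6b):

```python
from math import comb

def binom_pmf(n, p):
    """P{X = i} = C(n, i) p^i (1-p)^(n-i) for i = 0, ..., n."""
    return [comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(n + 1)]

pmf = binom_pmf(10, 0.01)                 # defective screws, Exercise (6b)
assert abs(sum(pmf) - 1) < 1e-12          # the pmf sums to 1

mean = sum(i * q for i, q in enumerate(pmf))
assert abs(mean - 10 * 0.01) < 1e-12      # E[X] = np

# proportion of packages with more than 1 defective screw (about 0.0043)
replace = 1 - pmf[0] - pmf[1]
```

The same helper answers Exercise (6a) with binom_pmf(5, 0.5).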

Section 4.7 Poisson Random Variable

Exercise (7a) Suppose that the number of typographical errors on a single page of this book has a Poisson distribution with parameter λ = 1/2. Calculate the probability that there is at least one error on this page.

Exercise (7b) Suppose that the probability that an item produced by a certain machine will be defective is 0.1. Find the probability that a sample of 10 items will contain at most 1 defective item.

Exercise (4.61) The probability that you will be dealt a full house in a hand of poker is approximately 0.0014. Find an approximation for the probability that in 1000 hands of poker, you will be dealt at least 2 full houses.
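A sketch for these exercises using the Poisson mass function p(i) = e^(-λ) λ^i / i!; the numeric values in the comments are my own evaluations, not from the notes:

```python
from math import exp, factorial

def poisson_pmf(lam, i):
    """P{X = i} = e^(-lam) * lam^i / i! for a Poisson(lam) random variable."""
    return exp(-lam) * lam ** i / factorial(i)

# Exercise (7a): at least one typo on a page, lam = 1/2
p_at_least_one = 1 - poisson_pmf(0.5, 0)   # 1 - e^(-1/2), about 0.393

# Exercise (7b): exact Binomial (n = 10, p = 0.1) versus Poisson with lam = np = 1
p_binomial = 0.9 ** 10 + 10 * 0.1 * 0.9 ** 9
p_poisson = poisson_pmf(1, 0) + poisson_pmf(1, 1)
# both are roughly 0.736, so the Poisson approximation is quite close
```

The same pattern handles Exercise (4.61): approximate the number of full houses as Poisson with λ = np and compute 1 - p(0) - p(1).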

Section 4.8 Other Discrete Random Variables

Geometric Random Variable

Exercise (4.71) Consider a roulette wheel consisting of 38 numbers: 1 through 36, 0, and double 0. If Basi always bets that the outcome will be one of the numbers 1 through 12, what is the probability that his first win will occur on his fourth bet? How many bets does he expect to place before he wins one time?

Negative Binomial Random Variable

Exercise (4.72) Two teams play a series of games; the first team to win 4 games is declared the winner. Suppose that one team is stronger than the other and wins each game with probability 0.6, independently of the outcomes of the other games. Find the probability, for i = 4, 5, 6, 7, that the stronger team wins the series in i games. Compare the probability that the stronger team wins this series with the probability that it would win a 2-out-of-3 series.

Hypergeometric Random Variable

Exercise (4.79) Suppose that a batch of 100 items contains 6 that are defective and 94 that are not defective. If X is the number of defective items in a randomly drawn sample of 10 items from the batch, find a.) P{X = 0} and b.) P{X > 2}.
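For Exercise (4.71), the number of bets until the first win is geometric: P{N = n} = (1 - p)^(n-1) p with p = 12/38. A sketch in exact fractions (the count includes the winning bet, which is my reading of the question):

```python
from fractions import Fraction as F

p = F(12, 38)   # chance a single bet on 1-12 wins, 38 slots in all
q = 1 - p       # 26/38

p_first_win_on_4th = q ** 3 * p   # three losses, then a win
expected_bets = 1 / p             # geometric mean: 1/p = 19/6, about 3.17

assert p_first_win_on_4th == F(26, 38) ** 3 * F(12, 38)
assert expected_bets == F(19, 6)
```

The negative binomial and hypergeometric exercises follow the same pattern, swapping in their own mass functions.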

Section 4.9 Expected Value of Sums of Random Variables

The expected value of a sum of random variables is equal to the sum of the expected values. We prove this assuming that the sample space S is either finite or countably infinite. Let X be a random variable and let X(s) be the value of X when s ∈ S is the outcome. If X and Y are random variables, then so is the sum Z = X + Y, with Z(s) = X(s) + Y(s).

Example (9a) Suppose an experiment consists of flipping a coin 5 times, with the outcome being the resulting sequence of heads and tails. Suppose X is the number of heads in the first 3 flips and Y is the number of heads in the last 2 flips. Let Z = X + Y. Then, for instance, for the outcome s = HTHTH,

X(s) = 2,  Y(s) = 1,  Z(s) = 3.

Let p(s) be the probability that s is the outcome of the experiment. Any event A in the sample space can be written as the union of the mutually exclusive outcomes s ∈ A. It follows that

P(A) = Σ_{s∈A} p(s).

In particular, taking A = S gives 1 = Σ_{s∈S} p(s). Let X be a random variable such that X(s) is the value of X when s occurs.

Proposition E[X] = Σ_{s∈S} X(s) p(s).

Proof. Suppose X takes on the values x_i, for i = 1, 2, .... For each i, let S_i be the event that X is

equal to x_i; in other words, S_i = {s : X(s) = x_i}. Then

E[X] = Σ_i x_i P{X = x_i}
     = Σ_i x_i P(S_i)
     = Σ_i x_i Σ_{s∈S_i} p(s)
     = Σ_i Σ_{s∈S_i} x_i p(s)
     = Σ_i Σ_{s∈S_i} X(s) p(s)
     = Σ_{s∈S} X(s) p(s),

where the S_i for i = 1, 2, ... are mutually exclusive events that make up S.

Example Suppose two independent flips of a coin each come up heads with probability p. Let X be the number of heads. Then the probability mass function can be written as

P{X = 0} = p(TT) = (1 - p)^2,
P{X = 1} = p(HT) + p(TH) = 2p(1 - p),
P{X = 2} = p(HH) = p^2.

This gives

E[X] = (0)(1 - p)^2 + (1)(2p(1 - p)) + (2)p^2 = 2p.

We can also compute the expected value by considering every mutually exclusive outcome that makes up the sample space:

E[X] = X(TT)p(TT) + X(HT)p(HT) + X(TH)p(TH) + X(HH)p(HH)
     = (0)(1 - p)^2 + (1)(p(1 - p)) + (1)((1 - p)p) + (2)p^2
     = 2p.

We now prove the important result of this section.

Proposition For random variables X_1, X_2, ..., X_n,

E[ Σ_{i=1}^{n} X_i ] = Σ_{i=1}^{n} E[X_i].

Proof. Let Z = Σ_{i=1}^{n} X_i. Then it follows from the previous proposition that

E[Z] = Σ_{s∈S} Z(s) p(s)
     = Σ_{s∈S} ( Σ_{i=1}^{n} X_i(s) ) p(s)
     = Σ_{i=1}^{n} Σ_{s∈S} X_i(s) p(s)
     = Σ_{i=1}^{n} E[X_i].

Example Suppose we want to find the expected value of the sum obtained when n fair dice are rolled. Let X be the sum. Then

E[X] = Σ_{i=1}^{n} E[X_i],

where X_i is the upturned value on die i. Because X_i is equally likely to be any of the values from 1 to 6, we know

E[X_i] = Σ_{k=1}^{6} k (1/6) = 21/6 = 7/2.

Hence, E[X] = n · 7/2 = 7n/2.
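The dice example can be verified for small n by brute-force enumeration over all 6^n outcomes, confirming E[X] = 7n/2 without appealing to linearity:

```python
from itertools import product
from fractions import Fraction as F

def expected_sum(n):
    """E[sum of n fair dice] by enumerating all 6^n equally likely outcomes."""
    outcomes = product(range(1, 7), repeat=n)
    # each outcome has probability 1/6^n; sum X(s) p(s) over the sample space
    return sum(F(sum(roll), 6 ** n) for roll in outcomes)

# matches E[X] = (7/2) n from linearity of expectation
assert expected_sum(1) == F(7, 2)
assert expected_sum(3) == F(21, 2)
```

This is exactly the proposition E[X] = Σ_{s∈S} X(s) p(s) in code: the enumeration works over outcomes s, not over the values of X.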


MAS108 Probability I

1 QUEEN MARY UNIVERSITY OF LONDON 2:30 pm, Thursday 3 May, 2007 Duration: 2 hours MAS108 Probability I Do not start reading the question paper until you are instructed to by the invigilators. The paper

The overall size of these chance errors is measured by their RMS HALF THE NUMBER OF TOSSES NUMBER OF HEADS MINUS 0 400 800 1200 1600 NUMBER OF TOSSES

INTRODUCTION TO CHANCE VARIABILITY WHAT DOES THE LAW OF AVERAGES SAY? 4 coins were tossed 1600 times each, and the chance error number of heads half the number of tosses was plotted against the number

Probability and Expected Value

Probability and Expected Value This handout provides an introduction to probability and expected value. Some of you may already be familiar with some of these topics. Probability and expected value are

Toss a coin twice. Let Y denote the number of heads.

! Let S be a discrete sample space with the set of elementary events denoted by E = {e i, i = 1, 2, 3 }. A random variable is a function Y(e i ) that assigns a real value to each elementary event, e i.

Random Variables. Chapter 2. Random Variables 1

Random Variables Chapter 2 Random Variables 1 Roulette and Random Variables A Roulette wheel has 38 pockets. 18 of them are red and 18 are black; these are numbered from 1 to 36. The two remaining pockets

Probability distributions

Probability distributions (Notes are heavily adapted from Harnett, Ch. 3; Hayes, sections 2.14-2.19; see also Hayes, Appendix B.) I. Random variables (in general) A. So far we have focused on single events,

EE 322: Probabilistic Methods for Electrical Engineers. Zhengdao Wang Department of Electrical and Computer Engineering Iowa State University

EE 322: Probabilistic Methods for Electrical Engineers Zhengdao Wang Department of Electrical and Computer Engineering Iowa State University Discrete Random Variables 1 Introduction to Random Variables

Chapter 5. Discrete Probability Distributions

Chapter 5. Discrete Probability Distributions Chapter Problem: Did Mendel s result from plant hybridization experiments contradicts his theory? 1. Mendel s theory says that when there are two inheritable

PROBABILITY. Chapter Overview Conditional Probability

PROBABILITY Chapter. Overview.. Conditional Probability If E and F are two events associated with the same sample space of a random experiment, then the conditional probability of the event E under the

For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

WHERE DOES THE 10% CONDITION COME FROM?

1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay

Joint Exam 1/P Sample Exam 1

Joint Exam 1/P Sample Exam 1 Take this practice exam under strict exam conditions: Set a timer for 3 hours; Do not stop the timer for restroom breaks; Do not look at your notes. If you believe a question

Statistics 100A Homework 8 Solutions

Part : Chapter 7 Statistics A Homework 8 Solutions Ryan Rosario. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice, and if tails, the one-half

Section 7C: The Law of Large Numbers

Section 7C: The Law of Large Numbers Example. You flip a coin 00 times. Suppose the coin is fair. How many times would you expect to get heads? tails? One would expect a fair coin to come up heads half

Notes on Continuous Random Variables

Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes

Random Variable: A function that assigns numerical values to all the outcomes in the sample space.

STAT 509 Section 3.2: Discrete Random Variables Random Variable: A function that assigns numerical values to all the outcomes in the sample space. Notation: Capital letters (like Y) denote a random variable.

STT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables

Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Discrete vs. continuous random variables Examples of continuous distributions o Uniform o Exponential o Normal Recall: A random

Chapter 5. Random variables

Random variables random variable numerical variable whose value is the outcome of some probabilistic experiment; we use uppercase letters, like X, to denote such a variable and lowercase letters, like

Probability and Statistics

CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b - 0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute - Systems and Modeling GIGA - Bioinformatics ULg kristel.vansteen@ulg.ac.be

P(X = x k ) = 1 = k=1

74 CHAPTER 6. IMPORTANT DISTRIBUTIONS AND DENSITIES 6.2 Problems 5.1.1 Which are modeled with a unifm distribution? (a Yes, P(X k 1/6 f k 1,...,6. (b No, this has a binomial distribution. (c Yes, P(X k

Topic 2: Scalar random variables. Definition of random variables

Topic 2: Scalar random variables Discrete and continuous random variables Probability distribution and densities (cdf, pmf, pdf) Important random variables Expectation, mean, variance, moments Markov and

Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80)

Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80) CS 30, Winter 2016 by Prasad Jayanti 1. (10 points) Here is the famous Monty Hall Puzzle. Suppose you are on a game show, and you

Ch5: Discrete Probability Distributions Section 5-1: Probability Distribution

Recall: Ch5: Discrete Probability Distributions Section 5-1: Probability Distribution A variable is a characteristic or attribute that can assume different values. o Various letters of the alphabet (e.g.

Introduction to Probability

Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

PROBABILITIES AND PROBABILITY DISTRIBUTIONS

Published in "Random Walks in Biology", 1983, Princeton University Press PROBABILITIES AND PROBABILITY DISTRIBUTIONS Howard C. Berg Table of Contents PROBABILITIES PROBABILITY DISTRIBUTIONS THE BINOMIAL

MATH 10: Elementary Statistics and Probability Chapter 4: Discrete Random Variables

MATH 10: Elementary Statistics and Probability Chapter 4: Discrete Random Variables Tony Pourmohamad Department of Mathematics De Anza College Spring 2015 Objectives By the end of this set of slides, you

MATH 3070 Introduction to Probability and Statistics Lecture notes Probability

Objectives: MATH 3070 Introduction to Probability and Statistics Lecture notes Probability 1. Learn the basic concepts of probability 2. Learn the basic vocabulary for probability 3. Identify the sample

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k.

REPEATED TRIALS Suppose you toss a fair coin one time. Let E be the event that the coin lands heads. We know from basic counting that p(e) = 1 since n(e) = 1 and 2 n(s) = 2. Now suppose we play a game

Topic 8: The Expected Value

Topic 8: September 27 and 29, 2 Among the simplest summary of quantitative data is the sample mean. Given a random variable, the corresponding concept is given a variety of names, the distributional mean,

MT426 Notebook 3 Fall 2012 prepared by Professor Jenny Baglivo. 3 MT426 Notebook 3 3. 3.1 Definitions... 3. 3.2 Joint Discrete Distributions...

MT426 Notebook 3 Fall 2012 prepared by Professor Jenny Baglivo c Copyright 2004-2012 by Jenny A. Baglivo. All Rights Reserved. Contents 3 MT426 Notebook 3 3 3.1 Definitions............................................

Probability Models.S1 Introduction to Probability

Probability Models.S1 Introduction to Probability Operations Research Models and Methods Paul A. Jensen and Jonathan F. Bard The stochastic chapters of this book involve random variability. Decisions are

Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

Chapters 5. Multivariate Probability Distributions

Chapters 5. Multivariate Probability Distributions Random vectors are collection of random variables defined on the same sample space. Whenever a collection of random variables are mentioned, they are

MATHEMATICS FOR ENGINEERS STATISTICS TUTORIAL 4 PROBABILITY DISTRIBUTIONS

MATHEMATICS FOR ENGINEERS STATISTICS TUTORIAL 4 PROBABILITY DISTRIBUTIONS CONTENTS Sample Space Accumulative Probability Probability Distributions Binomial Distribution Normal Distribution Poisson Distribution

Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay

Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay Lecture - 17 Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding

CARDINALITY, COUNTABLE AND UNCOUNTABLE SETS PART ONE

CARDINALITY, COUNTABLE AND UNCOUNTABLE SETS PART ONE With the notion of bijection at hand, it is easy to formalize the idea that two finite sets have the same number of elements: we just need to verify

Sums of Independent Random Variables

Chapter 7 Sums of Independent Random Variables 7.1 Sums of Discrete Random Variables In this chapter we turn to the important question of determining the distribution of a sum of independent random variables

An Introduction to Basic Statistics and Probability

An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random

Worked examples Random Processes

Worked examples Random Processes Example 1 Consider patients coming to a doctor s office at random points in time. Let X n denote the time (in hrs) that the n th patient has to wait before being admitted

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 Solutions to HW4 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in

Expected Value and the Game of Craps

Expected Value and the Game of Craps Blake Thornton Craps is a gambling game found in most casinos based on rolling two six sided dice. Most players who walk into a casino and try to play craps for the

Betting systems: how not to lose your money gambling

Betting systems: how not to lose your money gambling G. Berkolaiko Department of Mathematics Texas A&M University 28 April 2007 / Mini Fair, Math Awareness Month 2007 Gambling and Games of Chance Simple

Unit 19: Probability Models

Unit 19: Probability Models Summary of Video Probability is the language of uncertainty. Using statistics, we can better predict the outcomes of random phenomena over the long term from the very complex,

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here.

Homework 4 - KEY Jeff Brenion June 16, 2004 Note: Many problems can be solved in more than one way; we present only a single solution here. 1 Problem 2-1 Since there can be anywhere from 0 to 4 aces, the

Expectations Expectations. (See also Hays, Appendix B; Harnett, ch. 3). A. The expected value of a random variable is the arithmetic mean of that variable, i.e. E() = µ. As Hays notes, the idea of the

E3: PROBABILITY AND STATISTICS lecture notes

E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................

You flip a fair coin four times, what is the probability that you obtain three heads.

Handout 4: Binomial Distribution Reading Assignment: Chapter 5 In the previous handout, we looked at continuous random variables and calculating probabilities and percentiles for those type of variables.

3.2 Roulette and Markov Chains

238 CHAPTER 3. DISCRETE DYNAMICAL SYSTEMS WITH MANY VARIABLES 3.2 Roulette and Markov Chains In this section we will be discussing an application of systems of recursion equations called Markov Chains.

2. Discrete random variables

2. Discrete random variables Statistics and probability: 2-1 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be

2WB05 Simulation Lecture 8: Generating random variables

2WB05 Simulation Lecture 8: Generating random variables Marko Boon http://www.win.tue.nl/courses/2wb05 January 7, 2013 Outline 2/36 1. How do we generate random variables? 2. Fitting distributions Generating

Final Mathematics 5010, Section 1, Fall 2004 Instructor: D.A. Levin

Final Mathematics 51, Section 1, Fall 24 Instructor: D.A. Levin Name YOU MUST SHOW YOUR WORK TO RECEIVE CREDIT. A CORRECT ANSWER WITHOUT SHOWING YOUR REASONING WILL NOT RECEIVE CREDIT. Problem Points Possible

arxiv:1112.0829v1 [math.pr] 5 Dec 2011

How Not to Win a Million Dollars: A Counterexample to a Conjecture of L. Breiman Thomas P. Hayes arxiv:1112.0829v1 [math.pr] 5 Dec 2011 Abstract Consider a gambling game in which we are allowed to repeatedly

Basic concepts in probability. Sue Gordon

Mathematics Learning Centre Basic concepts in probability Sue Gordon c 2005 University of Sydney Mathematics Learning Centre, University of Sydney 1 1 Set Notation You may omit this section if you are

Expected Value. Let X be a discrete random variable which takes values in S X = {x 1, x 2,..., x n }

Expected Value Let X be a discrete random variable which takes values in S X = {x 1, x 2,..., x n } Expected Value or Mean of X: E(X) = n x i p(x i ) i=1 Example: Roll one die Let X be outcome of rolling

Statistics 100A Homework 4 Solutions

Problem 1 For a discrete random variable X, Statistics 100A Homework 4 Solutions Ryan Rosario Note that all of the problems below as you to prove the statement. We are proving the properties of epectation

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008

Math 425 (Fall 8) Solutions Midterm 2 November 6, 28 (5 pts) Compute E[X] and Var[X] for i) X a random variable that takes the values, 2, 3 with probabilities.2,.5,.3; ii) X a random variable with the

Estimating the Frequency Distribution of the. Numbers Bet on the California Lottery

Estimating the Frequency Distribution of the Numbers Bet on the California Lottery Mark Finkelstein November 15, 1993 Department of Mathematics, University of California, Irvine, CA 92717. Running head:

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete

Topic 8 The Expected Value

Topic 8 The Expected Value Functions of Random Variables 1 / 12 Outline Names for Eg(X ) Variance and Standard Deviation Independence Covariance and Correlation 2 / 12 Names for Eg(X ) If g(x) = x, then

Odds: Odds compares the number of favorable outcomes to the number of unfavorable outcomes.

MATH 11008: Odds and Expected Value Odds: Odds compares the number of favorable outcomes to the number of unfavorable outcomes. Suppose all outcomes in a sample space are equally likely where a of them