Chapter 4 Lecture Notes


Random Variables (October 27)

Section 4.1 Random Variables

A random variable is typically a real-valued function defined on the sample space of some experiment. For instance, we may be concerned with the sum obtained when rolling two dice rather than the actual pair of outcomes {(1, 6), (2, 5), ...}. Or we may be interested in the total number of heads in n tosses of a coin and not care about the particular head-tail sequence that occurred. Such a real number associated with the outcomes of an experiment is a random variable.

Example (1a) Suppose we toss 3 coins and let Y denote the number of heads that appear. Y is a random variable taking on one of the values 0, 1, 2, or 3 with respective probabilities

P{Y = 0} = 1/8, P{Y = 1} = 3/8, P{Y = 2} = 3/8, P{Y = 3} = 1/8.

We see that Σ_{i=0}^3 P{Y = i} = 1, in accordance with the probability rules.

Exercise (4.1) Two balls are chosen randomly from an urn containing 8 white, 4 black, and 2 orange balls. Suppose that we win $2 for each black ball and lose $1 for each white ball selected. Let X denote our winnings. What are the possible values of X, and what are the probabilities associated with each value?

We first note that there are 6 possible outcomes: WW, WO, OO, BW, BO, and BB, where O, B, and W denote drawing an orange, black, and white ball, respectively. For each one we gain or lose money; we denote this gain/loss by the random variable X, and let x denote any one value that X may take on:

Outcome:  WW   WO   OO   BW   BO   BB
X:        -2   -1    0    1    2    4

To find the probability associated with each value of X, we compute the probability of each outcome. Consider selecting two white balls from the urn. The probability of this event is

P(WW) = C(8, 2) / C(14, 2) = 28/91.

Therefore, we can write P{X = -2} = p(-2) = 28/91 = 4/13.
Similarly, the probability of drawing one white ball and one orange ball is

P(WO) = C(8, 1) C(2, 1) / C(14, 2) = 16/91.

We can do this for each outcome to create the following probability mass function:

Outcome:    WW     WO     OO     BW     BO     BB
X:          -2     -1      0      1      2      4
P{X = x}: 28/91  16/91   1/91  32/91   8/91   6/91

It is easy to verify that these probabilities sum to 1.

Exercise (4.2) Two fair dice are rolled. Let X equal the product of the 2 dice. Compute P{X = i} for i = 1, 2, ..., 36.

Just as with the sum of two dice, there are 36 equally likely outcomes. Scanning the possible products, we see that the random variable X takes on the values 1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, 16, 18, 20, 24, 25, 30, and 36, with the following probabilities:

P{X = i} = 1/36 for i = 1, 9, 16, 25, 36,
P{X = j} = 2/36 for j = 2, 3, 5, 8, 10, 15, 18, 20, 24, 30,
P{X = 4} = 3/36,
P{X = k} = 4/36 for k = 6, 12.

Exercise (1d) Independent trials consisting of flipping a coin having probability p of coming up heads are continually performed until either a head occurs or a total of n flips is made. If we let X denote the number of times the coin is flipped, then X is a random variable taking on what values? What are the respective probabilities?

If X is the number of times the coin is flipped, then X takes on the values 1, 2, ..., n.

Since each flip is independent of the last, we can come up with the following outcomes and probabilities:

Outcome                       Probability
H                             P{X = 1} = p
TH                            P{X = 2} = (1 - p) p
TTH                           P{X = 3} = (1 - p)^2 p
TTTH                          P{X = 4} = (1 - p)^3 p
...
T...TH  (n - 2 tails)         P{X = n - 1} = (1 - p)^{n-2} p
T...T   (n - 1 tails)         P{X = n} = (1 - p)^{n-1}

We note here that for the number of flips to be X = n, we need only flip n - 1 tails in a row. The last flip doesn't matter because either way we will have flipped the coin n times.
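The table above can be checked quickly in code. This is a minimal sketch (the helper name flip_until_head_pmf is mine, not from the text); using exact fractions avoids rounding when verifying that the probabilities sum to 1.

```python
from fractions import Fraction

def flip_until_head_pmf(n, p):
    """pmf of X = number of flips, stopping at the first head
    or after n flips, whichever comes first."""
    pmf = {k: (1 - p) ** (k - 1) * p for k in range(1, n)}
    # X = n whenever the first n - 1 flips are all tails,
    # regardless of how the final flip lands.
    pmf[n] = (1 - p) ** (n - 1)
    return pmf

pmf = flip_until_head_pmf(5, Fraction(1, 2))
# for a fair coin and n = 5: 1/2, 1/4, 1/8, 1/16, 1/16, summing to 1
print(pmf, sum(pmf.values()))
```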

Section 4.2 Discrete Random Variables

A random variable that takes on at most a countable number of possible values is said to be discrete. A countable set is a set with the same number of elements as some subset of the natural numbers, N = {0, 1, 2, ...}. A countable set can be a finite set of numbers or a countably infinite set. A countably infinite set is an infinite set whose members can be labeled in such a way that the labels are in one-to-one correspondence with the natural numbers. In other words, there is a prescription that allows us to identify every member of a countably infinite set by relating it to an element of the natural numbers. This one-to-one correspondence is called a bijection. Recall that a function is one-to-one, or injective, if every element of the range is the image of at most one element of the domain. A function is called onto, or surjective, if every element of the codomain is the image of at least one element of the domain.

Using these definitions, consider the set of integers, Z. Is it countably infinite? Yes, because the following function maps every natural number to an integer:

f(x) = x/2 if x is even,
f(x) = -(x + 1)/2 if x is odd.

It is easy to show that this function is indeed a bijection. [Draw Picture] Therefore, the integers are countable and have exactly the same number of elements as the natural numbers. In a similar way we can show that other sets, like the rationals, are countable. The real numbers are uncountable.

For any discrete random variable X, we define the probability mass function p(x) of X by

p(x) = P{X = x}.

Here x can be thought of as a continuous variable, but we suppose that X takes on the countable values x_1, x_2, ..., where each value x_i is labeled by a natural number i = 1, 2, .... This is customary notation that suggests each x_i originates from a countable set.
Then p(x_i) is non-negative at each of these values and zero everywhere else:

p(x_i) >= 0 for i = 1, 2, ...,
p(x) = 0 for all other values of x.

Since X must take on one of the values x_i, we have

Σ_i p(x_i) = 1.

The probability mass function p(x) is presented graphically by plotting p(x_i) on the y-axis against x_i on the x-axis.

Example Suppose X is the random variable representing the sum when two dice are rolled. Then we know X takes on integer values between 2 and 12 with probabilities ranging from 1/36 to 6/36. We represent the probability mass function with a graph, where each bar is centered over a value of X and its height is the probability.

[Figure: probability mass function p(x) for the sum of two dice.]

Exercise A discrete random variable X has a probability mass function of the form p(i) = c λ^i / i! for i = 0, 1, 2, ..., where λ is some positive value. Find (a) P{X = 0} and (b) P{X > 2}.

a.) Using the formula for the probability mass function, we obtain

P{X = 0} = p(0) = c λ^0 / 0! = c.

To find the constant c, we note that p(i) is a probability mass function, which means

Σ_{i=0}^∞ p(i) = 1.

Substituting in the expression for p yields

Σ_{i=0}^∞ c λ^i / i! = c Σ_{i=0}^∞ λ^i / i! = c e^λ = 1,

since the infinite series is actually a series representation for e^λ. Hence c = e^{-λ}, and

P{X = 0} = c = e^{-λ}.

b.) It follows that P{X > 2} = 1 - P{X <= 2}, which gives

P{X > 2} = 1 - P{X = 0} - P{X = 1} - P{X = 2} = 1 - e^{-λ} - λ e^{-λ} - (λ^2 / 2) e^{-λ}.

For a random variable X, the function F defined by

F(x) = P{X <= x}, -∞ < x < ∞,

is called the cumulative distribution function, or more simply the distribution function, of X. This function specifies the probability that the random variable is less than or equal to x. Suppose that a <= b. Then the event {X <= a} is contained in the event {X <= b}, and so F(a) <= F(b), which means F(x) is a nondecreasing function of x.

We can define the cumulative distribution function in terms of p(x). In this case, we can write

F(b) = Σ_{all x <= b} p(x).

From this we see that if X is a discrete random variable with possible values x_1 < x_2 < ..., then the distribution function F of X is a step function. Indeed, consider the following example.

Example Suppose X is a random variable with mass function given by the four values

p(1) = 1/4, p(2) = 1/2, p(3) = 1/8, p(4) = 1/8.

Then the cumulative distribution function F is given by

F(b) = 0     if b < 1
       1/4   if 1 <= b < 2
       3/4   if 2 <= b < 3
       7/8   if 3 <= b < 4
       1     if 4 <= b.

[Figure: the probability mass function and cumulative distribution function for this example.]

We note that the size of the jump in F at each value x = 1, 2, 3, and 4 is exactly p(x).

Exercise (4.17) Suppose that a cumulative distribution function is given by

F(b) = 0                 if b < 0
       b/4               if 0 <= b < 1
       1/2 + (b - 1)/4   if 1 <= b < 2
       11/12             if 2 <= b < 3
       1                 if 3 <= b.

a.) Draw a rough sketch of the cumulative distribution function.
b.) Find P{X = i} for i = 1, 2, 3.
c.) Find P{1/2 <= X <= 3/2}.

a.) [Figure: sketch of the cumulative distribution function F(b).]

b.) The value of the probability mass function at each point is equal to the height of the jump in F there. Therefore, P{X = 1} = 1/2 - 1/4 = 1/4, P{X = 2} = 11/12 - 3/4 = 1/6, and P{X = 3} = 1 - 11/12 = 1/12.

c.) Since F is continuous at both 1/2 and 3/2, it follows that

P{1/2 <= X <= 3/2} = F(3/2) - F(1/2) = 5/8 - 1/8 = 1/2.
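Parts b.) and c.) can be verified numerically. The sketch below encodes this particular F together with its left-hand limits (the helper names F and F_left are mine, not from the text); the pmf at each integer is then the jump height F(i) - F(i-).

```python
from fractions import Fraction

def F(b):
    """The cumulative distribution function from Exercise (4.17)."""
    b = Fraction(b)
    if b < 0:
        return Fraction(0)
    if b < 1:
        return b / 4
    if b < 2:
        return Fraction(1, 2) + (b - 1) / 4
    if b < 3:
        return Fraction(11, 12)
    return Fraction(1)

def F_left(b):
    """Left-hand limit F(b-) for this particular F."""
    b = Fraction(b)
    if b <= 0:
        return Fraction(0)
    if b <= 1:
        return b / 4
    if b <= 2:
        return Fraction(1, 2) + (b - 1) / 4
    if b <= 3:
        return Fraction(11, 12)
    return Fraction(1)

# pmf at each integer = height of the jump there: 1/4, 1/6, 1/12
jumps = {i: F(i) - F_left(i) for i in (1, 2, 3)}
# P{1/2 <= X <= 3/2} = F(3/2) - F(1/2-) = 5/8 - 1/8 = 1/2
interval = F(Fraction(3, 2)) - F_left(Fraction(1, 2))
```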

Section 4.3 Expected Value

If X is a discrete random variable having a probability mass function p(x), then the expectation, or expected value, of X, denoted by E[X], is defined as

E[X] = Σ_{x: p(x)>0} x p(x).

The expected value is the weighted average of the possible values that X can take on, each value weighted by its corresponding probability.

Example Suppose a simple mass function for the random variable X is p(0) = 1/2, p(1) = 1/2. Then

E[X] = (0)p(0) + (1)p(1) = (0)(1/2) + (1)(1/2) = 1/2.

In this case, the expected value is the ordinary average of the two possible values.

Example Suppose a simple mass function for the random variable X is p(0) = 1/3, p(1) = 2/3. Then

E[X] = (0)p(0) + (1)p(1) = (0)(1/3) + (1)(2/3) = 2/3.

In this case, the value at X = 1 is given twice as much weight as the value at X = 0.

Suppose we are playing a game where X is our winnings, which could take on the values x_1, x_2, ..., x_n. Each winning value has a corresponding probability p(x_1), p(x_2), ..., p(x_n). If each p(x_i) is thought of as a relative frequency (in the sense that we play the game for a very long time), then our average winnings per game would be

Σ_i x_i p(x_i) = E[X].

Exercise (3a) Suppose X is the outcome of rolling a fair die. Find E[X].

The random variable X takes on the values i = 1, 2, 3, 4, 5, 6 with each value being equally probable, p(i) = 1/6. Therefore,

E[X] = Σ_{i=1}^6 i p(i) = (1/6) Σ_{i=1}^6 i = (1/6) · (6)(7)/2 = 7/2.

Exercise (4.25) Two coins are to be flipped. The first coin will land on heads with probability 0.6, the second with probability 0.7. Assume that the results of the flips are independent, and let X equal the total number of heads that result. Find a.) P{X = 1} and b.) E[X].

The sample space for this experiment has four outcomes, S = {(H, H), (H, T), (T, H), (T, T)}, where H denotes heads and T denotes tails. If X is the random variable that counts the total number of heads flipped, then X takes on the values i = 0, 1, 2. The probability mass function is given by the following three probabilities:

p(0) = P(TT) = (0.4)(0.3) = 0.12
p(1) = P(HT) + P(TH) = (0.6)(0.3) + (0.4)(0.7) = 0.46
p(2) = P(HH) = (0.6)(0.7) = 0.42.

The expected value is then given by

E[X] = Σ_{i=0}^2 i p(i) = (0)(0.12) + (1)(0.46) + (2)(0.42) = 1.3.

Exercise (4.30) A person tosses a fair coin until a tail appears for the first time. If the tail appears on the n-th flip, the person wins 2^n dollars. Let X denote the player's winnings. Find E[X]. a.) Would you be willing to pay $1 million to play this game once? b.) Would you be willing to pay $1 million for each game if you could play for as long as you liked and only had to settle up when you stopped playing?

Let X be the player's winnings in dollars. Then X takes on the values 2, 4, 8, ..., 2^n, ... for n = 1, 2, .... Since the game consists of flipping a fair coin, each flip lands tails or heads with probability 1/2, independently of the others. Hence, the probability mass function is

p(2^n) = (1/2)^n.

The expected value is therefore

E[X] = Σ_{x: p(x)>0} x p(x) = Σ_{n=1}^∞ 2^n (1/2)^n = Σ_{n=1}^∞ 1 = ∞.

This means that if we continue to play this game for a long period of time (forever, actually), we can expect to win an infinite amount of money.

a.) Probably not, because to win our money back in a single game the first 19 flips would all have to be heads; the winnings first exceed $1 million when the tail arrives on flip 20, since 2^20 = 1,048,576. This is too unlikely to count on in only one try.

b.) Yes, because over unlimited plays we can definitely expect to win our money back at some point.

Exercise (4.22) Suppose that two teams play a series of games that ends when one of them has won i games. Suppose that each game played is, independently, won by team A with probability p. Find the expected number of games that are played when a.) i = 2 and b.) i = 3. Also, show in both cases that this number is maximized when p = 1/2.
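Exercise (4.22) lends itself to a brute-force numerical check. The sketch below (the helper name is mine) enumerates every win/loss script of the maximal length 2i - 1 and weights each stopping time by the script's probability; for i = 2 the closed form works out to E[N] = 2 + 2p(1 - p), which is largest at p = 1/2.

```python
from itertools import product

def expected_series_length(i, p):
    """Expected number of games in a race-to-i series, where team A
    wins each game independently with probability p."""
    total = 0.0
    for seq in product("AB", repeat=2 * i - 1):   # all full-length scripts
        a = b = 0
        for n, game in enumerate(seq, start=1):
            a += game == "A"
            b += game == "B"
            if a == i or b == i:                  # series ends at game n
                break
        # weight by the probability of the full script; the unplayed
        # games' probabilities sum out to 1, so this is exact
        prob = p ** seq.count("A") * (1 - p) ** seq.count("B")
        total += n * prob
    return total

e2_fair = expected_series_length(2, 0.5)   # 2 + 2(1/2)(1/2) = 2.5
e3_fair = expected_series_length(3, 0.5)   # 33/8 = 4.125
```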

Section 4.4 Expectation of a Function of a Random Variable

In order to talk about variance, we need to look at the expected value of some function of X, say g(X). For example, given a random variable X, we might want the expected value of the square of its values. Since g(X) is itself a discrete random variable, it has a probability mass function, which can be determined from p(x). Consider the following example:

Example Let X denote a random variable that takes on the values -1, 0, and 1 with probabilities

p(-1) = 0.2, p(0) = 0.5, p(1) = 0.3.

We can compute the value E[X^2] by letting Y = X^2. The probability mass function of Y, denoted by p*(y), takes the following values:

p*(1) = p(-1) + p(1) = 0.2 + 0.3 = 0.5
p*(0) = p(0) = 0.5.

Hence,

E[X^2] = E[Y] = (0)p*(0) + (1)p*(1) = (0)(0.5) + (1)(0.5) = 0.5.

We note that E[X^2] ≠ (E[X])^2, since E[X] = 0.1 here.

We get the general idea that we will always be able to calculate the expected value of g(X) if we know the original probability mass function p(x). This, indeed, is always the case. We prove it below.

Proposition If X is a discrete random variable that takes on values x_i for i = 1, 2, ... with probabilities p(x_i), then

E[g(X)] = Σ_i g(x_i) p(x_i).

Proof. We prove this proposition for a finite random variable. Let the random variable X take on the values x_i for i = 1, 2, ..., n. Because the function g may not be one-to-one, suppose g(X) (another random variable) takes on the distinct values g_1, g_2, ..., g_m (where m <= n). It follows that g(X) is a random variable such that, for j = 1, 2, ..., m, its probability mass function is

P{g(X) = g_j} = p*(g_j) = Σ_{i: g(x_i) = g_j} p(x_i).

Using this and the definition of expected value gives the following:

E[g(X)] = Σ_{j=1}^m g_j p*(g_j)
        = Σ_{j=1}^m g_j Σ_{i: g(x_i) = g_j} p(x_i)
        = Σ_{j=1}^m Σ_{i: g(x_i) = g_j} g(x_i) p(x_i)
        = Σ_{i=1}^n g(x_i) p(x_i).

Proposition If a and b are constants, then E[aX + b] = a E[X] + b.

Proof. We have

E[aX + b] = Σ_{x: p(x)>0} (ax + b) p(x) = a Σ_{x: p(x)>0} x p(x) + b Σ_{x: p(x)>0} p(x) = a E[X] + b.

The expected value of a random variable X, E[X], is also referred to as the mean or first moment of X. The quantity E[X^n] is referred to as the n-th moment of X and is given by

E[X^n] = Σ_{x: p(x)>0} x^n p(x).
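The proposition translates directly into code: to compute E[g(X)] we never need the pmf of g(X) itself, only p(x). A minimal sketch (the helper name expect is mine), using the example from this section:

```python
def expect(g, pmf):
    """E[g(X)] = sum of g(x) * p(x) over the support of X."""
    return sum(g(x) * p for x, p in pmf.items())

pmf = {-1: 0.2, 0: 0.5, 1: 0.3}              # the example above
mean = expect(lambda x: x, pmf)              # E[X], about 0.1
second = expect(lambda x: x ** 2, pmf)       # E[X^2], about 0.5
# the linearity proposition: E[3X + 2] = 3 E[X] + 2
shifted = expect(lambda x: 3 * x + 2, pmf)
```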

Section 4.5 Variance

The expected value of a random variable X is an important measure of the center of X but tells us nothing about the variation, or spread, of its values. Consider the following random variables:

W = 0 with probability 1
Y = -1 with p(-1) = 1/2, and 1 with p(1) = 1/2
Z = -100 with p(-100) = 1/2, and 100 with p(100) = 1/2

All of these variables have the same expectation. To look at variation, we look at how far each value is from its mean, on the average. One way to do this is to look at the quantity E[|X - µ|], where µ = E[X]. The absolute value |X - µ| is difficult to work with, so we look at the squares instead.

Definition Let X be a random variable with mean µ. Then the variance of X, denoted by Var[X], is defined as

Var[X] = E[(X - µ)^2].

Exercise Show that E[(X - µ)^2] = E[X^2] - (E[X])^2. This is an alternate form for the variance and is typically an easier way to compute it. Note that it involves the second moment of X.

Exercise A useful identity is that for any constants a and b,

Var[aX + b] = a^2 Var[X].

Show that this is true.

Exercise (5a) Calculate the variance, Var[X], if X represents the outcome when a fair die is rolled.
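As a numerical companion to Exercise (5a), the sketch below computes the variance of a fair die both from the definition and from the alternate form E[X^2] - (E[X])^2; exact fractions make the agreement exact.

```python
from fractions import Fraction

die = {i: Fraction(1, 6) for i in range(1, 7)}   # pmf of a fair die

mean = sum(x * p for x, p in die.items())                   # E[X] = 7/2
var = sum((x - mean) ** 2 * p for x, p in die.items())      # definition
# alternate form: Var[X] = E[X^2] - (E[X])^2
var_alt = sum(x ** 2 * p for x, p in die.items()) - mean ** 2
```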

Exercise (4.36) Suppose that two teams play a series of games that ends when one of them has won i games. Suppose that each game played is, independently, won by team A with probability p. Find the variance of the number of games that are played when i = 2. Show that this value is maximized when p = 1/2.

The mean is analogous to the center of gravity of a distribution of mass. The variance, in the language of mechanics, represents the moment of inertia. The square root of the variance is called the standard deviation of X, denoted by SD[X]:

SD[X] = sqrt(Var[X]).

Section 4.6 Bernoulli and Binomial Random Variables

Suppose an experiment takes place whose outcome can be only one of two things: a success or a failure. We can let X = 1 for a success and X = 0 for a failure. Then our probability mass function is

p(0) = P{X = 0} = 1 - p
p(1) = P{X = 1} = p,

where p, 0 <= p <= 1, is the probability of a success. A random variable that satisfies these conditions is called a Bernoulli random variable.

Suppose we perform n independent trials of the experiment, each of which results in a success with probability p. If X represents the number of successes in the n trials, then X is a Binomial random variable. The two values n and p are called the parameters of the Binomial model, and we write X ~ Bin(n, p).

Exercise What is the probability mass function of the Binomial random variable X with parameters n and p? Once obtained, check that Σ_{i=0}^n p(i) = 1.

Exercise (6a) Five fair coins are flipped. If the outcomes are assumed independent, find the probability mass function of the number of heads obtained.

Exercise (6b) It is known that screws produced by a certain company will be defective with probability 0.01, independently of one another. The company sells the screws in packages of 10 and offers a money-back guarantee that at most 1 of the 10 screws is defective. What proportion of packages sold must the company replace?

Exercise (4.32) To determine whether they have a certain disease, 100 people are to have their blood tested. However, rather than testing each individual separately, it has been decided first to place the people into groups of 10. The blood samples of the 10 people in each group will be

pooled and analyzed together. If the test is negative, one test will suffice for the 10 people, whereas if the test is positive, each of the 10 will also be individually tested and, in all, 11 tests will be made on this group. Assume that the probability that a person has the disease is 0.1 for all people, independently of one another, and compute the expected number of tests necessary for each group.

Now that we have defined the Binomial random variable, we seek formulas for its expected value and variance. The expected value can be computed as follows:

E[X] = Σ_{i=0}^n i C(n, i) p^i (1 - p)^{n-i}
     = Σ_{i=1}^n i C(n, i) p^i (1 - p)^{n-i}.

Using the identity i C(n, i) = n C(n - 1, i - 1), this becomes

E[X] = np Σ_{i=1}^n C(n - 1, i - 1) p^{i-1} (1 - p)^{n-i} = np,

since the remaining sum is the total probability of a Bin(n - 1, p) random variable.
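The formulas above can be sanity-checked in code. This sketch (the helper name is mine) builds the Bin(n, p) pmf, confirms it sums to 1 with mean np, and, for the setup of Exercise (6b), computes the proportion of packages the company must replace, P{X >= 2}.

```python
from math import comb

def binomial_pmf(n, p):
    """pmf of a Bin(n, p) random variable."""
    return {i: comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(n + 1)}

pmf = binomial_pmf(10, 0.01)               # 10 screws, each defective w.p. 0.01
total = sum(pmf.values())                  # should be 1
mean = sum(i * q for i, q in pmf.items())  # should be np = 0.1
# guarantee fails when more than 1 screw is defective
replaced = 1 - pmf[0] - pmf[1]             # roughly 0.004
```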

Section 4.7 Poisson Random Variable

Exercise (7a) Suppose that the number of typographical errors on a single page of this book has a Poisson distribution with parameter λ = 1/2. Calculate the probability that there is at least one error on this page.

Exercise (7b) Suppose that the probability that an item produced by a certain machine will be defective is 0.1. Find the probability that a sample of 10 items will contain at most 1 defective item.

Exercise (4.61) The probability that you will be dealt a full house in a hand of poker is approximately 0.0014. Find an approximation for the probability that in 1000 hands of poker, you will be dealt at least 2 full houses.
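A quick sketch for the two exercises above (the helper name is mine): with λ = 1/2, P{at least one error} = 1 - e^{-1/2}, and the Bin(10, 0.1) probability in (7b) is closely approximated by a Poisson with λ = np = 1.

```python
from math import comb, exp, factorial

def poisson_pmf(lam, i):
    """P{X = i} for a Poisson random variable with parameter lam."""
    return exp(-lam) * lam ** i / factorial(i)

# Exercise (7a): at least one typo on a page, lambda = 1/2
at_least_one = 1 - poisson_pmf(0.5, 0)     # 1 - e^(-1/2), about 0.393

# Exercise (7b): exact Bin(10, 0.1) vs the Poisson approximation, lambda = 1
exact = sum(comb(10, i) * 0.1 ** i * 0.9 ** (10 - i) for i in (0, 1))
approx = poisson_pmf(1, 0) + poisson_pmf(1, 1)   # 2/e, about 0.736
```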

Section 4.8 Other Discrete Random Variables

Geometric Random Variable

Exercise (4.71) Consider a roulette wheel consisting of 38 numbers: 1 through 36, 0, and double 0. If Basi always bets that the outcome will be one of the numbers 1 through 12, what is the probability that his first win will occur on his fourth bet? How many bets does he expect to place before he wins one time?

Negative Binomial Random Variable

Exercise (4.72) Two teams play a series of games; the first team to win 4 games is declared the winner. Suppose that one team is stronger than the other and wins each game with probability 0.6, independently of the outcomes of the other games. Find the probability, for i = 4, 5, 6, 7, that the stronger team wins the series in i games. Compare the probability that the stronger team wins this series with the probability that it would win a 2-out-of-3 series.

Hypergeometric Random Variable

Exercise (4.79) Suppose that a batch of 100 items contains 6 that are defective and 94 that are not defective. If X is the number of defective items in a randomly drawn sample of 10 items from the batch, find a.) P{X = 0} and b.) P{X > 2}.
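For the geometric case, Exercise (4.71) can be checked directly: with success probability p = 12/38 per spin, the first win occurs on the fourth bet with probability (1 - p)^3 p, and the expected number of bets until the first win is 1/p. A sketch, using exact fractions:

```python
from fractions import Fraction

p = Fraction(12, 38)        # P{ball lands on one of 1 through 12}
q = 1 - p                   # P{lose a single bet}

# geometric pmf: first win on the fourth bet means three losses, then a win
first_win_fourth = q ** 3 * p      # about 0.101

# expected number of bets until the first win is 1/p
expected_bets = 1 / p              # 19/6, about 3.17
```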

Section 4.9 Expected Value of Sums of Random Variables

The expected value of a sum of random variables is equal to the sum of the expected values. We prove this assuming that the sample space S is either finite or countably infinite. Let X be a random variable and let X(s) be the value of X when s ∈ S is the outcome. If X and Y are random variables, then so is the sum Z = X + Y, with Z(s) = X(s) + Y(s).

Example (9a) Suppose an experiment consists of flipping a coin 5 times, with the outcome being the resulting sequence of heads and tails. Suppose X is the number of heads in the first 3 flips and Y is the number of heads in the last 2 flips. Let Z = X + Y. Then, for instance, for the outcome s = (H, T, H, T, H),

X(s) = 2, Y(s) = 1, Z(s) = 3.

Let p(s) be the probability that s is the outcome of the experiment. Any event A in the sample space can be written as the union of the mutually exclusive outcomes s ∈ A. It follows that

P(A) = Σ_{s ∈ A} p(s).

Taking A = S gives 1 = Σ_{s ∈ S} p(s).

Let X be a random variable such that X(s) is the value of X when s occurs.

Proposition E[X] = Σ_{s ∈ S} X(s) p(s).

Proof. Suppose X takes on the values x_i, for i = 1, 2, .... For each i, let S_i be the event that X is

equal to x_i; in other words, S_i = {s : X(s) = x_i}. Then

E[X] = Σ_i x_i P{X = x_i}
     = Σ_i x_i P(S_i)
     = Σ_i x_i Σ_{s ∈ S_i} p(s)
     = Σ_i Σ_{s ∈ S_i} x_i p(s)
     = Σ_i Σ_{s ∈ S_i} X(s) p(s)
     = Σ_{s ∈ S} X(s) p(s),

where the S_i, for i = 1, 2, ..., are mutually exclusive events that make up S.

Example Suppose two independent flips of a coin each come up heads with probability p. Let X be the number of heads. Then the probability mass function can be written as

P{X = 0} = p(TT) = (1 - p)^2
P{X = 1} = p(HT) + p(TH) = 2p(1 - p)
P{X = 2} = p(HH) = p^2.

This gives

E[X] = (0)(1 - p)^2 + (1)·2p(1 - p) + (2)p^2 = 2p.

We can also compute the expected value by summing over every outcome that makes up the sample space:

E[X] = X(TT)p(TT) + X(HT)p(HT) + X(TH)p(TH) + X(HH)p(HH)
     = (0)(1 - p)^2 + (1)p(1 - p) + (1)(1 - p)p + (2)p^2 = 2p.

We now prove the important result of this section.

Proposition For random variables X_1, X_2, ..., X_n,

E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i].

Proof. Let Z = Σ_{i=1}^n X_i. Then it follows from the previous proposition that

E[Z] = Σ_{s ∈ S} Z(s) p(s)
     = Σ_{s ∈ S} (Σ_{i=1}^n X_i(s)) p(s)
     = Σ_{i=1}^n Σ_{s ∈ S} X_i(s) p(s)
     = Σ_{i=1}^n E[X_i].

Example Suppose we wanted to find the expected value of the sum obtained when n fair dice are rolled. Let X be the sum. Then

E[X] = Σ_{i=1}^n E[X_i],

where X_i is the upturned value on die i. Because X_i is equally likely to be any of the values from 1 to 6, we know

E[X_i] = Σ_{j=1}^6 j (1/6) = 7/2.

Hence, E[X] = n · (7/2).
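The linearity result can be confirmed by brute force for small n: enumerating all 6^n outcomes and averaging the sum reproduces E[X] = 7n/2 exactly. A sketch (the helper name is mine):

```python
from fractions import Fraction
from itertools import product

def expected_sum(n):
    """E of the sum of n fair dice, by brute-force enumeration."""
    outcomes = list(product(range(1, 7), repeat=n))
    # average of the sums over all equally likely outcomes
    return Fraction(sum(map(sum, outcomes)), len(outcomes))

# linearity of expectation predicts E[X] = (7/2) n
checks = [expected_sum(n) == Fraction(7, 2) * n for n in (1, 2, 3)]
```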


Section 5.1 Continuous Random Variables: Introduction Section 5. Continuous Random Variables: Introduction Not all random variables are discrete. For example:. Waiting times for anything (train, arrival of customer, production of mrna molecule from gene,

More information

WHERE DOES THE 10% CONDITION COME FROM?

WHERE DOES THE 10% CONDITION COME FROM? 1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay

More information

Joint Exam 1/P Sample Exam 1

Joint Exam 1/P Sample Exam 1 Joint Exam 1/P Sample Exam 1 Take this practice exam under strict exam conditions: Set a timer for 3 hours; Do not stop the timer for restroom breaks; Do not look at your notes. If you believe a question

More information

Section 7C: The Law of Large Numbers

Section 7C: The Law of Large Numbers Section 7C: The Law of Large Numbers Example. You flip a coin 00 times. Suppose the coin is fair. How many times would you expect to get heads? tails? One would expect a fair coin to come up heads half

More information

STT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables

STT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Discrete vs. continuous random variables Examples of continuous distributions o Uniform o Exponential o Normal Recall: A random

More information

Notes on Continuous Random Variables

Notes on Continuous Random Variables Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes

More information

Probability and Expected Value

Probability and Expected Value Probability and Expected Value This handout provides an introduction to probability and expected value. Some of you may already be familiar with some of these topics. Probability and expected value are

More information

Chapter 5. Random variables

Chapter 5. Random variables Random variables random variable numerical variable whose value is the outcome of some probabilistic experiment; we use uppercase letters, like X, to denote such a variable and lowercase letters, like

More information

MAS108 Probability I

MAS108 Probability I 1 QUEEN MARY UNIVERSITY OF LONDON 2:30 pm, Thursday 3 May, 2007 Duration: 2 hours MAS108 Probability I Do not start reading the question paper until you are instructed to by the invigilators. The paper

More information

Ch5: Discrete Probability Distributions Section 5-1: Probability Distribution

Ch5: Discrete Probability Distributions Section 5-1: Probability Distribution Recall: Ch5: Discrete Probability Distributions Section 5-1: Probability Distribution A variable is a characteristic or attribute that can assume different values. o Various letters of the alphabet (e.g.

More information

The overall size of these chance errors is measured by their RMS HALF THE NUMBER OF TOSSES NUMBER OF HEADS MINUS 0 400 800 1200 1600 NUMBER OF TOSSES

The overall size of these chance errors is measured by their RMS HALF THE NUMBER OF TOSSES NUMBER OF HEADS MINUS 0 400 800 1200 1600 NUMBER OF TOSSES INTRODUCTION TO CHANCE VARIABILITY WHAT DOES THE LAW OF AVERAGES SAY? 4 coins were tossed 1600 times each, and the chance error number of heads half the number of tosses was plotted against the number

More information

Introduction to Probability

Introduction to Probability Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

More information

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k.

REPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k. REPEATED TRIALS Suppose you toss a fair coin one time. Let E be the event that the coin lands heads. We know from basic counting that p(e) = 1 since n(e) = 1 and 2 n(s) = 2. Now suppose we play a game

More information

Final Mathematics 5010, Section 1, Fall 2004 Instructor: D.A. Levin

Final Mathematics 5010, Section 1, Fall 2004 Instructor: D.A. Levin Final Mathematics 51, Section 1, Fall 24 Instructor: D.A. Levin Name YOU MUST SHOW YOUR WORK TO RECEIVE CREDIT. A CORRECT ANSWER WITHOUT SHOWING YOUR REASONING WILL NOT RECEIVE CREDIT. Problem Points Possible

More information

An Introduction to Basic Statistics and Probability

An Introduction to Basic Statistics and Probability An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random

More information

E3: PROBABILITY AND STATISTICS lecture notes

E3: PROBABILITY AND STATISTICS lecture notes E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................

More information

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008

Math 425 (Fall 08) Solutions Midterm 2 November 6, 2008 Math 425 (Fall 8) Solutions Midterm 2 November 6, 28 (5 pts) Compute E[X] and Var[X] for i) X a random variable that takes the values, 2, 3 with probabilities.2,.5,.3; ii) X a random variable with the

More information

Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80)

Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80) Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80) CS 30, Winter 2016 by Prasad Jayanti 1. (10 points) Here is the famous Monty Hall Puzzle. Suppose you are on a game show, and you

More information

Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

More information

Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay

Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay Lecture - 17 Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding

More information

Sums of Independent Random Variables

Sums of Independent Random Variables Chapter 7 Sums of Independent Random Variables 7.1 Sums of Discrete Random Variables In this chapter we turn to the important question of determining the distribution of a sum of independent random variables

More information

Chapter 4. Probability Distributions

Chapter 4. Probability Distributions Chapter 4 Probability Distributions Lesson 4-1/4-2 Random Variable Probability Distributions This chapter will deal the construction of probability distribution. By combining the methods of descriptive

More information

Binomial random variables

Binomial random variables Binomial and Poisson Random Variables Solutions STAT-UB.0103 Statistics for Business Control and Regression Models Binomial random variables 1. A certain coin has a 5% of landing heads, and a 75% chance

More information

2. Discrete random variables

2. Discrete random variables 2. Discrete random variables Statistics and probability: 2-1 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be

More information

2WB05 Simulation Lecture 8: Generating random variables

2WB05 Simulation Lecture 8: Generating random variables 2WB05 Simulation Lecture 8: Generating random variables Marko Boon http://www.win.tue.nl/courses/2wb05 January 7, 2013 Outline 2/36 1. How do we generate random variables? 2. Fitting distributions Generating

More information

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1

ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 Solutions to HW4 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in

More information

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here.

Homework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here. Homework 4 - KEY Jeff Brenion June 16, 2004 Note: Many problems can be solved in more than one way; we present only a single solution here. 1 Problem 2-1 Since there can be anywhere from 0 to 4 aces, the

More information

Estimating the Frequency Distribution of the. Numbers Bet on the California Lottery

Estimating the Frequency Distribution of the. Numbers Bet on the California Lottery Estimating the Frequency Distribution of the Numbers Bet on the California Lottery Mark Finkelstein November 15, 1993 Department of Mathematics, University of California, Irvine, CA 92717. Running head:

More information

You flip a fair coin four times, what is the probability that you obtain three heads.

You flip a fair coin four times, what is the probability that you obtain three heads. Handout 4: Binomial Distribution Reading Assignment: Chapter 5 In the previous handout, we looked at continuous random variables and calculating probabilities and percentiles for those type of variables.

More information

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special Distributions-VI Today, I am going to introduce

More information

3.2 Roulette and Markov Chains

3.2 Roulette and Markov Chains 238 CHAPTER 3. DISCRETE DYNAMICAL SYSTEMS WITH MANY VARIABLES 3.2 Roulette and Markov Chains In this section we will be discussing an application of systems of recursion equations called Markov Chains.

More information

Statistics 100A Homework 4 Solutions

Statistics 100A Homework 4 Solutions Problem 1 For a discrete random variable X, Statistics 100A Homework 4 Solutions Ryan Rosario Note that all of the problems below as you to prove the statement. We are proving the properties of epectation

More information

Lecture 13. Understanding Probability and Long-Term Expectations

Lecture 13. Understanding Probability and Long-Term Expectations Lecture 13 Understanding Probability and Long-Term Expectations Thinking Challenge What s the probability of getting a head on the toss of a single fair coin? Use a scale from 0 (no way) to 1 (sure thing).

More information

Section 5 Part 2. Probability Distributions for Discrete Random Variables

Section 5 Part 2. Probability Distributions for Discrete Random Variables Section 5 Part 2 Probability Distributions for Discrete Random Variables Review and Overview So far we ve covered the following probability and probability distribution topics Probability rules Probability

More information

Probability Models.S1 Introduction to Probability

Probability Models.S1 Introduction to Probability Probability Models.S1 Introduction to Probability Operations Research Models and Methods Paul A. Jensen and Jonathan F. Bard The stochastic chapters of this book involve random variability. Decisions are

More information

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T..

Probability Theory. Florian Herzog. A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Probability Theory A random variable is neither random nor variable. Gian-Carlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete

More information

Chapter 6. 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? Ans: 4/52.

Chapter 6. 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? Ans: 4/52. Chapter 6 1. What is the probability that a card chosen from an ordinary deck of 52 cards is an ace? 4/52. 2. What is the probability that a randomly selected integer chosen from the first 100 positive

More information

Unit 19: Probability Models

Unit 19: Probability Models Unit 19: Probability Models Summary of Video Probability is the language of uncertainty. Using statistics, we can better predict the outcomes of random phenomena over the long term from the very complex,

More information

Expected Value and the Game of Craps

Expected Value and the Game of Craps Expected Value and the Game of Craps Blake Thornton Craps is a gambling game found in most casinos based on rolling two six sided dice. Most players who walk into a casino and try to play craps for the

More information

STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE

STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE TROY BUTLER 1. Random variables and distributions We are often presented with descriptions of problems involving some level of uncertainty about

More information

Probability and Statistics Vocabulary List (Definitions for Middle School Teachers)

Probability and Statistics Vocabulary List (Definitions for Middle School Teachers) Probability and Statistics Vocabulary List (Definitions for Middle School Teachers) B Bar graph a diagram representing the frequency distribution for nominal or discrete data. It consists of a sequence

More information

Unit 4 The Bernoulli and Binomial Distributions

Unit 4 The Bernoulli and Binomial Distributions PubHlth 540 4. Bernoulli and Binomial Page 1 of 19 Unit 4 The Bernoulli and Binomial Distributions Topic 1. Review What is a Discrete Probability Distribution... 2. Statistical Expectation.. 3. The Population

More information

We rst consider the game from the player's point of view: Suppose you have picked a number and placed your bet. The probability of winning is

We rst consider the game from the player's point of view: Suppose you have picked a number and placed your bet. The probability of winning is Roulette: On an American roulette wheel here are 38 compartments where the ball can land. They are numbered 1-36, and there are two compartments labeled 0 and 00. Half of the compartments numbered 1-36

More information

Math/Stats 342: Solutions to Homework

Math/Stats 342: Solutions to Homework Math/Stats 342: Solutions to Homework Steven Miller (sjm1@williams.edu) November 17, 2011 Abstract Below are solutions / sketches of solutions to the homework problems from Math/Stats 342: Probability

More information

Math Review. for the Quantitative Reasoning Measure of the GRE revised General Test

Math Review. for the Quantitative Reasoning Measure of the GRE revised General Test Math Review for the Quantitative Reasoning Measure of the GRE revised General Test www.ets.org Overview This Math Review will familiarize you with the mathematical skills and concepts that are important

More information

arxiv:1112.0829v1 [math.pr] 5 Dec 2011

arxiv:1112.0829v1 [math.pr] 5 Dec 2011 How Not to Win a Million Dollars: A Counterexample to a Conjecture of L. Breiman Thomas P. Hayes arxiv:1112.0829v1 [math.pr] 5 Dec 2011 Abstract Consider a gambling game in which we are allowed to repeatedly

More information

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL STATIsTICs 4 IV. RANDOm VECTORs 1. JOINTLY DIsTRIBUTED RANDOm VARIABLEs If are two rom variables defined on the same sample space we define the joint

More information

1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let

1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let Copyright c 2009 by Karl Sigman 1 Stopping Times 1.1 Stopping Times: Definition Given a stochastic process X = {X n : n 0}, a random time τ is a discrete random variable on the same probability space as

More information

Math 461 Fall 2006 Test 2 Solutions

Math 461 Fall 2006 Test 2 Solutions Math 461 Fall 2006 Test 2 Solutions Total points: 100. Do all questions. Explain all answers. No notes, books, or electronic devices. 1. [105+5 points] Assume X Exponential(λ). Justify the following two

More information

Introductory Probability. MATH 107: Finite Mathematics University of Louisville. March 5, 2014

Introductory Probability. MATH 107: Finite Mathematics University of Louisville. March 5, 2014 Introductory Probability MATH 07: Finite Mathematics University of Louisville March 5, 204 What is probability? Counting and probability 2 / 3 Probability in our daily lives We see chances, odds, and probabilities

More information

Elementary Statistics and Inference. Elementary Statistics and Inference. 16 The Law of Averages (cont.) 22S:025 or 7P:025.

Elementary Statistics and Inference. Elementary Statistics and Inference. 16 The Law of Averages (cont.) 22S:025 or 7P:025. Elementary Statistics and Inference 22S:025 or 7P:025 Lecture 20 1 Elementary Statistics and Inference 22S:025 or 7P:025 Chapter 16 (cont.) 2 D. Making a Box Model Key Questions regarding box What numbers

More information

Aggregate Loss Models

Aggregate Loss Models Aggregate Loss Models Chapter 9 Stat 477 - Loss Models Chapter 9 (Stat 477) Aggregate Loss Models Brian Hartman - BYU 1 / 22 Objectives Objectives Individual risk model Collective risk model Computing

More information

Lecture 7: Continuous Random Variables

Lecture 7: Continuous Random Variables Lecture 7: Continuous Random Variables 21 September 2005 1 Our First Continuous Random Variable The back of the lecture hall is roughly 10 meters across. Suppose it were exactly 10 meters, and consider

More information

MBA 611 STATISTICS AND QUANTITATIVE METHODS

MBA 611 STATISTICS AND QUANTITATIVE METHODS MBA 611 STATISTICS AND QUANTITATIVE METHODS Part I. Review of Basic Statistics (Chapters 1-11) A. Introduction (Chapter 1) Uncertainty: Decisions are often based on incomplete information from uncertain

More information

Contemporary Mathematics Online Math 1030 Sample Exam I Chapters 12-14 No Time Limit No Scratch Paper Calculator Allowed: Scientific

Contemporary Mathematics Online Math 1030 Sample Exam I Chapters 12-14 No Time Limit No Scratch Paper Calculator Allowed: Scientific Contemporary Mathematics Online Math 1030 Sample Exam I Chapters 12-14 No Time Limit No Scratch Paper Calculator Allowed: Scientific Name: The point value of each problem is in the left-hand margin. You

More information

Betting systems: how not to lose your money gambling

Betting systems: how not to lose your money gambling Betting systems: how not to lose your money gambling G. Berkolaiko Department of Mathematics Texas A&M University 28 April 2007 / Mini Fair, Math Awareness Month 2007 Gambling and Games of Chance Simple

More information

Week 5: Expected value and Betting systems

Week 5: Expected value and Betting systems Week 5: Expected value and Betting systems Random variable A random variable represents a measurement in a random experiment. We usually denote random variable with capital letter X, Y,. If S is the sample

More information

Normal distribution. ) 2 /2σ. 2π σ

Normal distribution. ) 2 /2σ. 2π σ Normal distribution The normal distribution is the most widely known and used of all distributions. Because the normal distribution approximates many natural phenomena so well, it has developed into a

More information

Some special discrete probability distributions

Some special discrete probability distributions University of California, Los Angeles Department of Statistics Statistics 100A Instructor: Nicolas Christou Some special discrete probability distributions Bernoulli random variable: It is a variable that

More information

Solutions: Problems for Chapter 3. Solutions: Problems for Chapter 3

Solutions: Problems for Chapter 3. Solutions: Problems for Chapter 3 Problem A: You are dealt five cards from a standard deck. Are you more likely to be dealt two pairs or three of a kind? experiment: choose 5 cards at random from a standard deck Ω = {5-combinations of

More information

Solution. Solution. (a) Sum of probabilities = 1 (Verify) (b) (see graph) Chapter 4 (Sections 4.3-4.4) Homework Solutions. Section 4.

Solution. Solution. (a) Sum of probabilities = 1 (Verify) (b) (see graph) Chapter 4 (Sections 4.3-4.4) Homework Solutions. Section 4. Math 115 N. Psomas Chapter 4 (Sections 4.3-4.4) Homework s Section 4.3 4.53 Discrete or continuous. In each of the following situations decide if the random variable is discrete or continuous and give

More information

Section 6.1 Discrete Random variables Probability Distribution

Section 6.1 Discrete Random variables Probability Distribution Section 6.1 Discrete Random variables Probability Distribution Definitions a) Random variable is a variable whose values are determined by chance. b) Discrete Probability distribution consists of the values

More information

Binomial random variables (Review)

Binomial random variables (Review) Poisson / Empirical Rule Approximations / Hypergeometric Solutions STAT-UB.3 Statistics for Business Control and Regression Models Binomial random variables (Review. Suppose that you are rolling a die

More information

Answer Key for California State Standards: Algebra I

Answer Key for California State Standards: Algebra I Algebra I: Symbolic reasoning and calculations with symbols are central in algebra. Through the study of algebra, a student develops an understanding of the symbolic language of mathematics and the sciences.

More information

Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom

Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom 1 Learning Goals 1. Know the definitions of sample space, event and probability function. 2. Be able to

More information

Mathematical Expectation

Mathematical Expectation Mathematical Expectation Properties of Mathematical Expectation I The concept of mathematical expectation arose in connection with games of chance. In its simplest form, mathematical expectation is the

More information

X: 0 1 2 3 4 5 6 7 8 9 Probability: 0.061 0.154 0.228 0.229 0.173 0.094 0.041 0.015 0.004 0.001

X: 0 1 2 3 4 5 6 7 8 9 Probability: 0.061 0.154 0.228 0.229 0.173 0.094 0.041 0.015 0.004 0.001 Tuesday, January 17: 6.1 Discrete Random Variables Read 341 344 What is a random variable? Give some examples. What is a probability distribution? What is a discrete random variable? Give some examples.

More information

ACMS 10140 Section 02 Elements of Statistics October 28, 2010 Midterm Examination II Answers

ACMS 10140 Section 02 Elements of Statistics October 28, 2010 Midterm Examination II Answers ACMS 10140 Section 02 Elements of Statistics October 28, 2010 Midterm Examination II Answers Name DO NOT remove this answer page. DO turn in the entire exam. Make sure that you have all ten (10) pages

More information

6.042/18.062J Mathematics for Computer Science. Expected Value I

6.042/18.062J Mathematics for Computer Science. Expected Value I 6.42/8.62J Mathematics for Computer Science Srini Devadas and Eric Lehman May 3, 25 Lecture otes Expected Value I The expectation or expected value of a random variable is a single number that tells you

More information

In the situations that we will encounter, we may generally calculate the probability of an event

In the situations that we will encounter, we may generally calculate the probability of an event What does it mean for something to be random? An event is called random if the process which produces the outcome is sufficiently complicated that we are unable to predict the precise result and are instead

More information

To define function and introduce operations on the set of functions. To investigate which of the field properties hold in the set of functions

To define function and introduce operations on the set of functions. To investigate which of the field properties hold in the set of functions Chapter 7 Functions This unit defines and investigates functions as algebraic objects. First, we define functions and discuss various means of representing them. Then we introduce operations on functions

More information