2. Discrete Random Variables


2.1 Definition of a Random Variable

A random variable is a numerical description of the outcome of an experiment (or observation), e.g.
1. The result of a die roll.
2. The height of a person.
3. The number of heads when a coin is tossed k times.

A random variable

A random variable is denoted by a capital letter and can be defined by its distribution (see Section 2.2). It should be interpreted as the (as yet) unobserved result of a random experiment. A realisation of a random variable is a particular observation taken from this distribution (e.g. the result of a die roll, the height of a particular person). Realisations are denoted by small letters.

An example of a random variable

Consider a die roll. Let X be the result of the roll. The distribution of X is given by P(X = x) = 1/6, x ∈ {1, 2, 3, 4, 5, 6}. The support S_X of the random variable X is the set of possible realisations (results). In the case of a die roll, S_X = {1, 2, 3, 4, 5, 6}.
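The die-roll distribution above can be checked numerically. A minimal sketch in Python (the variable names are illustrative, not from the notes):

```python
from fractions import Fraction

# P(X = x) = 1/6 for every x in the support S_X = {1, ..., 6}.
support = list(range(1, 7))
pmf = {x: Fraction(1, 6) for x in support}

# A valid discrete distribution assigns non-negative mass summing to 1.
assert all(prob >= 0 for prob in pmf.values())
total_mass = sum(pmf.values())
```

Exact rational arithmetic (`Fraction`) avoids the rounding noise that floating point would introduce when summing 1/6 six times.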

Relation between random variables and elementary events

Note that each possible realisation of X here is an elementary event ω. In general, a random variable can be defined as a mapping from the set of elementary events to the support of the random variable. If the elementary event observed is ω, then the realisation of the random variable is X(ω). In this case the definition of the random variable in terms of the elementary events is trivial, i.e. X(ω) = ω. Normally the relation between a random variable and the set of elementary events is ignored in the notation and a random variable is denoted simply by a capital letter.

Relation between random variables and elementary events

For example, we might define Y to be 1 if the result of the die roll is divisible by 3, otherwise Y = 0. In this case Y(ω) = 1 for ω ∈ {3, 6} and Y(ω) = 0 for ω ∈ {1, 2, 4, 5}. Returning to Example 1.9, we might define U to be the number of coin tosses before tails appears. Using the notation there, U(ω_k) = k.

Relation between random variables and elementary events

It follows from the definition of a random variable that we can treat X = k as an event, since the event X = k corresponds to the set of elementary events ω for which X(ω) = k. Note that several elementary events can be mapped to one realisation of a random variable. For example, let W be the number of heads in 2 coin tosses. The event W = 1 corresponds to the elementary events HT and TH. Hence, one may think of the set of elementary events as a full description of all the possible results of an experiment and a random variable as a numerical (possibly simplified) description of the result of an experiment.

Relation between random variables and elementary events

Any 2 possible realisations of a random variable correspond to 2 mutually exclusive sets of elementary events. For example, W = 0 corresponds to {TT}, W = 1 corresponds to {HT, TH} and W = 2 corresponds to {HH}. Hence, the events X = k and X = j (j ≠ k) must be mutually exclusive, i.e. P(X = k ∪ X = j) = P(X = k) + P(X = j), for j ≠ k.

2.2 Definition of a discrete distribution

The support of a discrete random variable is a set that can be listed. Commonly, discrete random variables take integer values, e.g.
1. The number of defective components.
2. The number of heads when a coin is tossed k times.

Definition of a discrete distribution

Suppose the support of the random variable X is {x_1, x_2, ..., x_k}. The distribution of a discrete random variable X satisfies the following two conditions:
1. P(X = x) ≥ 0, for any x.
2. ∑_{i=1}^{k} P(X = x_i) = 1.
Note that P(X = x) > 0 if and only if x ∈ S_X. Otherwise, P(X = x) = 0.

Example 2.1

Anne and Bob play the following game. Anne tosses a coin and Bob rolls a die. If Anne tosses heads, then she wins 3 Euro from Bob. If Anne tosses tails, then she loses X Euro to Bob, where X is the result of the die roll. Let Y be the winnings of Anne (if Anne loses c Euro, then her winnings are −c). Define the distribution of Y.

Example 2.1

Note that an elementary event of such an experiment (a simultaneous coin toss and die roll) describes the results of both the toss and the roll. There are 12 elementary events: (1, H), (2, H), ..., (6, H), (1, T), (2, T), ..., (6, T). Each of these elementary events is associated with a payoff to Anne. In order to calculate the distribution of Anne's winnings, we can tabulate the elementary events, their probabilities and the corresponding payoff Anne gets. The probability that Anne wins y units is simply the sum of the probabilities of the elementary events that lead to Anne winning y.

Example 2.1

Let P(i, H) denote the probability of obtaining i on the die roll and heads. Let P(i, T) denote the probability of obtaining i on the die roll and tails. Since the die roll is independent of the coin toss, we have P(i, T) = P(i, H) = (1/6) × (1/2) = 1/12, for i ∈ {1, 2, 3, 4, 5, 6}.

Example 2.1

The following table considers all the possible results (elementary events) of the game (in terms of the result of the coin toss and the die roll). Anne's winnings are given in each cell, together with the probability of each result (in brackets).

         1          2          3          4          5          6
H     3 (1/12)   3 (1/12)   3 (1/12)   3 (1/12)   3 (1/12)   3 (1/12)
T    −1 (1/12)  −2 (1/12)  −3 (1/12)  −4 (1/12)  −5 (1/12)  −6 (1/12)

Example 2.1

In order to calculate, for example, the probability that Y = 3 (Anne wins 3 Euro), we sum the probabilities of the results that lead to Anne winning 3 Euro. Hence, P(Y = 3) = 6 × 1/12 = 1/2. This is intuitively clear, since Anne wins 3 Euro if she tosses heads (i.e. with probability 1/2). Similarly, P(Y = y) = 1/12, y ∈ {−1, −2, −3, −4, −5, −6}.
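The tabulation above can be reproduced by brute-force enumeration of the 12 elementary events. A sketch in Python (names are illustrative):

```python
from fractions import Fraction
from collections import defaultdict

# Each elementary event (die result, coin result) has probability 1/12.
# Anne wins 3 on heads and loses the die result on tails.
dist_Y = defaultdict(Fraction)
for die in range(1, 7):
    for coin in ("H", "T"):
        payoff = 3 if coin == "H" else -die
        dist_Y[payoff] += Fraction(1, 12)
```

Summing probabilities per payoff value is exactly the "group elementary events by realisation" step described in the text.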

2.3 The expected value and variance of a random variable

Suppose a die is thrown 600 times. We expect on average 100 observations of each possible result. Hence, we expect the average result to be (100 × 1 + 100 × 2 + ... + 100 × 6)/600 = 3.5. This is ∑_{x_i ∈ S_X} x_i P(X = x_i) = 3.5.

The expected value and variance of a random variable

The expected value of a discrete random variable X, denoted E(X) or µ_X, is given by
µ_X = E(X) = ∑_{x_i ∈ S_X} x_i P(X = x_i).
The expected value of a function f of a random variable X, E[f(X)], is given by
E[f(X)] = ∑_{x_i ∈ S_X} f(x_i) P(X = x_i).
The k-th moment of a random variable is given by E(X^k), i.e.
E(X^k) = ∑_{x_i ∈ S_X} x_i^k P(X = x_i).

The expected value and variance of a random variable

The variance of a random variable, Var(X) = σ_X², is given by
σ_X² = E[(X − µ_X)²] = ∑_{x_i ∈ S_X} (x_i − µ_X)² P(X = x_i)
     = ∑_{x_i ∈ S_X} [x_i² P(X = x_i) − 2µ_X x_i P(X = x_i) + µ_X² P(X = x_i)]
     = ∑_{x_i ∈ S_X} x_i² P(X = x_i) − 2µ_X ∑_{x_i ∈ S_X} x_i P(X = x_i) + µ_X² ∑_{x_i ∈ S_X} P(X = x_i)
     = E(X²) − 2µ_X µ_X + µ_X² = E(X²) − µ_X² = E(X²) − E(X)².
The standard deviation of a random variable is given by σ_X = √Var(X).

The expected value and variance of a random variable

µ_X is a measure of centrality of a random variable (it is the centre of mass of a distribution). If a distribution is symmetric about x = x_0, then E(X) = x_0. σ_X is a measure of the dispersion of the distribution. Note that by definition Var(X) ≥ 0. Var(X) = 0 if and only if P(X = c) = 1, for some value c.

The expected value and variance of a random variable

Suppose we observe a sample of realisations from a distribution (recall the Introduction to Data Analysis module). The sample mean can be used to estimate the expected value, and the sample standard deviation can be used to estimate the standard deviation of the random variable. Naturally, when we observe samples there will be random fluctuations around the expected value (µ_X) and standard deviation (σ_X).

Example 2.2

Calculate the expected value and standard deviation of Anne's payoff (see Example 2.1). We have P(Y = 3) = 1/2 and P(Y = y) = 1/12, y ∈ {−1, −2, −3, −4, −5, −6}. Thus S_Y = {−6, −5, −4, −3, −2, −1, 3} and
E(Y) = ∑_{y ∈ S_Y} y P(Y = y) = 3 × 1/2 + (−1) × 1/12 + (−2) × 1/12 + ... + (−6) × 1/12 = 3/2 − 21/12 = −1/4.
Note that this means that on average Anne loses 25 cents a game to Bob. The game is not fair.

Example 2.2

In order to calculate the standard deviation, we first calculate the variance. It is normally easier to use the formula Var(Y) = E(Y²) − E(Y)². We have
E(Y²) = ∑_{y ∈ S_Y} y² P(Y = y) = 3² × 1/2 + (−1)² × 1/12 + (−2)² × 1/12 + ... + (−6)² × 1/12 = 9/2 + 91/12 = 145/12.

Example 2.2

Hence, Var(Y) = 145/12 − (−1/4)² = 145/12 − 1/16 = 577/48 ≈ 12.02. Thus, σ_Y = √Var(Y) ≈ 3.47.
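These calculations can be checked with exact rational arithmetic. A sketch, assuming the distribution derived in Example 2.1:

```python
from fractions import Fraction

# Distribution of Anne's winnings Y (Example 2.1).
dist_Y = {3: Fraction(1, 2)}
dist_Y.update({-i: Fraction(1, 12) for i in range(1, 7)})

E_Y = sum(y * p for y, p in dist_Y.items())        # expected value, -1/4
E_Y2 = sum(y * y * p for y, p in dist_Y.items())   # second moment, 145/12
var_Y = E_Y2 - E_Y**2                              # variance, 577/48
```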

2.3 Standard Discrete Random Variables

2.3.1 The 0-1 Distribution with Parameter p

X has a 0-1 distribution with parameter p [we write X ~ 0-1(p)] if
P(X = 0) = 1 − p,  P(X = 1) = p.
For example, if I toss a coin once and X is the number of heads, then X ~ 0-1(1/2). If I roll a die once and Y is the number of sixes, then Y ~ 0-1(1/6).

2.3.2 The binomial probability distribution with parameters n and p

Suppose I carry out n experiments and the probability of success in each experiment is p. The results of the experiments are independent of each other. Such a set of trials is called a series of independent Bernoulli trials. Let X be the number of successes. X has a binomial distribution with parameters n (the number of experiments) and p (the probability of success in each experiment).

The binomial distribution

We write X ~ Bin(n, p). If Y is the number of failures, then Y ~ Bin(n, 1 − p). Suppose I toss a coin 10 times. Let X be the number of heads. X ~ Bin(10, 0.5). Suppose I roll a die 100 times. Let Y be the number of sixes. Y ~ Bin(100, 1/6).

The binomial distribution

Consider Y ~ Bin(100, 1/6). Note that there are a large number of sequences that correspond to, e.g., there being 15 sixes, i.e. Y = 15. Let U denote the event that the result of a roll is 6. Consider the probability of getting 15 sixes in the first 15 rolls and then not getting a 6 in the remaining 85 rolls. From the independence of the rolls, the probability of this sequence is
P(U^15 (U^c)^85) = P(U)^15 P(U^c)^85 = (1/6)^15 (5/6)^85,
i.e. the probability of such a sequence is the probability of success to the power of the number of successes times the probability of failure to the power of the number of failures. The probability of any sequence with 15 sixes (and thus 85 failures) must be the same (the same probabilities appear in a different order).

The binomial distribution

Hence, the probability of obtaining 15 sixes in 100 rolls of a die is equal to the probability of any such sequence times the number of sequences that contain 15 sixes out of 100 results. We now consider:
1. The number of orderings of k objects (positions) chosen from n objects (positions). This is n(n − 1)(n − 2) ... (n − k + 2)(n − k + 1) = n!/(n − k)!.
2. The number of hands (unordered choices) of k objects (positions) chosen from n objects (positions). This is n!/(k!(n − k)!).
Note that n! = n(n − 1)(n − 2) ... 2 × 1. By definition, 0! = 1.

The binomial distribution

To prove the first result, notice that there are n possible choices for the first object, n − 1 for the second (given the choice of the first), n − 2 for the third and so on. Since after the last of the k objects is chosen, n − k remain, the final object is chosen from the n − k + 1 objects remaining before that choice. To get the total number of choices, we multiply together the number of choices at each stage (given the previous choices), i.e. n(n − 1)(n − 2) ... (n − k + 1) = n!/(n − k)!.

The binomial distribution

To prove the second result, suppose we chose objects 1, 2, ..., k in that order. Any permutation of that ordering gives us the same hand, i.e. there are k! orderings corresponding to each hand. Thus the number of ways k positions can be chosen from n (ignoring the order of choice) is given by n!/(k!(n − k)!).

The binomial distribution

We have
P(X = x) = C(n, x) p^x (1 − p)^(n − x), for x ∈ {0, 1, 2, ..., n},
where C(n, x) = n!/(x!(n − x)!) is the number of ways of choosing x objects from n (the order in which the choices are made is not important). The notation nCx is also used (in particular, on scientific calculators).
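The pmf can be written directly with `math.comb`. A minimal sketch (the function name is mine, not from the notes):

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Bin(n, p): C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# e.g. the probability of 15 sixes in 100 die rolls from the running example
p_15_sixes = binom_pmf(15, 100, 1/6)
```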

The binomial distribution

Note that C(n, x) = C(n, n − x). Choosing the x positions for the successes is equivalent to choosing the n − x positions for the failures, so there must be the same number of ways of choosing in both cases. By definition, 0! = 1. Hence, C(n, 0) = C(n, n) = 1 and C(n, 1) = C(n, n − 1) = n.

The relation between the binomial distribution and the 0-1 distribution

Suppose X ~ Bin(n, p). Let X_i be the number of successes in the i-th experiment. It follows that
P(X_i = 0) = 1 − p,  P(X_i = 1) = p,
i.e. X_i ~ 0-1(p).

The relation between the binomial distribution and the 0-1 distribution

Since the results of the experiments are independent, these X_i are independent. Summing these X_i, we obtain the total number of successes, i.e. X = X_1 + X_2 + ... + X_n. This result will later be used to derive the expected value and variance of a random variable with a binomial distribution.

Example 2.3

A signal is transmitted by a channel as a sequence of bits. The probability of any bit being sent correctly is 0.9 (independently of the other bits). If a signal contains 10 bits, calculate the probability that
1. the entire signal is transmitted correctly.
2. two bits of the signal are transmitted incorrectly.

Example 2.3

a) Let X be the number of bits sent correctly. X ~ Bin(10, 0.9).
P(X = x) = C(n, x) p^x (1 − p)^(n − x)
P(X = 10) = C(10, 10) × 0.9^10 × 0.1^0 = 0.9^10 ≈ 0.349.

Example 2.3

b) Two bits sent incorrectly means 8 bits are sent correctly.
P(X = 8) = C(10, 8) × 0.9^8 × 0.1^2 = (10!/(8!2!)) × 0.9^8 × 0.01 = 45 × 0.9^8 × 0.01 ≈ 0.194.

2.3.3 The Poisson distribution

Consider cars passing a point on a rarely used country road. Suppose
1. Arrivals occur at an average rate of λ per unit time.
2. The probability of an arrival in an interval of a given length is constant (it does not depend on where the interval starts).
3. The numbers of arrivals in two non-overlapping intervals of time are independent.
Then the number of arrivals, X, in an interval of length t has a Poisson distribution with parameter µ = λt. Note: µ is the expected number of arrivals in time t.

The Poisson distribution

We write X ~ Poisson(µ), where
P(X = x) = e^(−µ) µ^x / x!.
The Poisson distribution can also be used to model the distribution of the number of individuals in a given area when the population does not form clusters. e.g. Suppose the density of male Siberian tigers is one per 100 km². In a 1000 km² area we expect on average 10 male tigers. Let X be the number of male tigers in this area; X ~ Poisson(10).

Example 2.4

Calls arrive at a call centre at a rate of 3 per minute. Calculate
i) the probability that in one minute there is at least one call.
ii) the probability that in 3 minutes there are exactly 10 calls.

Example 2.4

i) Let X be the number of calls in 1 minute. We expect on average 3 calls, so X ~ Poisson(3). The complement of the event that there is at least one call is the event that there are no calls.
P(X = x) = e^(−µ) µ^x / x! = e^(−3) 3^x / x!
P(X ≥ 1) = 1 − P(X = 0) = 1 − e^(−3) ≈ 0.950.

Example 2.4

ii) Let Y be the number of calls in a 3 minute period. We expect on average 9 calls. Hence, Y ~ Poisson(9).
P(Y = y) = e^(−µ) µ^y / y! = e^(−9) 9^y / y!
P(Y = 10) = e^(−9) 9^10 / 10! ≈ 0.119.
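Both parts of Example 2.4 can be checked numerically. A sketch (the helper name is mine):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    """P(X = x) for X ~ Poisson(mu): e^(-mu) mu^x / x!."""
    return exp(-mu) * mu**x / factorial(x)

# (i) at least one call in one minute, X ~ Poisson(3)
p_at_least_one = 1 - poisson_pmf(0, 3)
# (ii) exactly 10 calls in three minutes, Y ~ Poisson(9)
p_ten_calls = poisson_pmf(10, 9)
```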

The Poisson Approximation to the Binomial

If X ~ Bin(n, p), where n is large and p is small, then X is approximately Poisson distributed with parameter µ = np. Note: µ is the expected number of successes. This approximation is reasonable when n ≥ 20 and p ≤ 0.05 (or n ≥ 50 and p ≤ 0.1). If p ≥ 0.9, then this approximation can be used for the distribution of the number of failures, Y ~ Bin(n, 1 − p): Y has an approximate Poisson(n(1 − p)) distribution.

The Poisson distribution

Note that from the Taylor expansion of e^x around x = 0,
e^µ = 1 + µ + µ²/2 + µ³/3! + ... = ∑_{i=0}^{∞} µ^i / i!.
It follows that
∑_{i=0}^{∞} e^(−µ) µ^i / i! = e^(−µ) ∑_{i=0}^{∞} µ^i / i! = e^(−µ) e^µ = 1.

Example 2.5

Suppose the probability that a bit is correctly transmitted by a channel is 0.999 and 2 000 bits are transmitted. Using the appropriate approximation, estimate the probability that
i) exactly 5 bits are incorrectly transmitted.
ii) more than 2 bits are incorrectly transmitted.

Example 2.5

i) The probability of an error is 0.001. Let X be the number of errors. X ~ Bin(2 000, 0.001). Hence, X is approximately Poisson(2).
P(X = x) = e^(−µ) µ^x / x! = e^(−2) 2^x / x!
P(X = 5) = e^(−2) 2^5 / 5! ≈ 0.036.

Example 2.5

ii) The complement of the event "more than 2 errors" is "at most 2 errors".
P(X > 2) = 1 − P(X ≤ 2) = 1 − P(X = 0) − P(X = 1) − P(X = 2)
         = 1 − e^(−2) 2^0/0! − e^(−2) 2^1/1! − e^(−2) 2^2/2! = 1 − 5e^(−2) ≈ 0.323.
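The quality of the approximation in Example 2.5 can be seen by computing the exact binomial probability alongside the Poisson one. A sketch:

```python
from math import comb, exp, factorial

n, p = 2000, 0.001   # 2000 bits, error probability 0.001
mu = n * p           # Poisson parameter, 2

exact_5 = comb(n, 5) * p**5 * (1 - p)**(n - 5)   # exact Bin(2000, 0.001) value
approx_5 = exp(-mu) * mu**5 / factorial(5)       # Poisson(2) approximation
```

The two values agree to roughly three decimal places, as the rule of thumb above suggests.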

2.3.4 The Multinomial Distribution

This is a generalisation of the binomial distribution. Suppose there are k possible results of an experiment. In the case of the binomial distribution k = 2 (the results may be labelled success and failure). Suppose n independent experiments are carried out. In each experiment the probability of the i-th result is p_i, where p_1 + p_2 + ... + p_k = 1. Let X_i be the number of occurrences of the i-th result. We write X = (X_1, X_2, ..., X_k) ~ Mult(n, p_1, p_2, ..., p_k). For x_1 + x_2 + ... + x_k = n, we have
P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) = [n! / (x_1! x_2! ... x_k!)] p_1^(x_1) p_2^(x_2) ... p_k^(x_k).
Note P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) is the standard notation for P(X_1 = x_1 ∩ X_2 = x_2 ∩ ... ∩ X_k = x_k).

Example 2.6

Suppose a computer chooses 10 times from the set of integers {1, 2, 3, 4}, with each integer being chosen with probability 0.25 each time. Calculate the probability that
1. 2 is chosen 4 times.
2. 1, 2 and 3 are each chosen twice.

Example 2.6

1. We only need to consider 2 types of result: success (the computer chooses 2) and failure (the computer chooses something else). Let X be the number of times 2 is chosen. X ~ Bin(10, 0.25).
P(X = 4) = C(10, 4) × 0.25^4 × 0.75^6 ≈ 0.146.

Example 2.6

2. If 1, 2 and 3 are each chosen twice, 4 must be chosen 4 times (in total 10 choices). Let Y_i be the number of times i is chosen. (Y_1, Y_2, Y_3, Y_4) ~ Mult(10, 0.25, 0.25, 0.25, 0.25).
P(Y_1 = 2, Y_2 = 2, Y_3 = 2, Y_4 = 4) = [10!/(2!2!2!4!)] × 0.25^10 ≈ 0.018.
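The multinomial pmf is straightforward to code. A sketch (the function name is mine, not from the notes):

```python
from math import factorial, prod

def multinomial_pmf(counts, probs):
    """P(X_1 = x_1, ..., X_k = x_k) for Mult(n, p_1, ..., p_k)."""
    coef = factorial(sum(counts))          # n!
    for x in counts:
        coef //= factorial(x)              # divide by x_1! x_2! ... x_k!
    return coef * prod(p**x for p, x in zip(probs, counts))

# Example 2.6, part 2: 1, 2, 3 chosen twice each, 4 chosen four times.
p_example = multinomial_pmf([2, 2, 2, 4], [0.25, 0.25, 0.25, 0.25])
```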

The multinomial distribution

It should be noted that, unlike the other distributions considered so far in this chapter, the multinomial distribution does not describe a single random variable, but is a joint distribution of a set of random variables. Such distributions will be considered in more detail in Chapter 4. Given that there are k possible results, we only really need to consider a vector of k − 1 random variables X_1, X_2, ..., X_{k−1}. Given the realisations of these variables we know the realisation of X_k, since X_1 + X_2 + ... + X_k = n, thus X_k = n − X_1 − X_2 − ... − X_{k−1}.

2.3.5 The Geometric Distribution

Suppose that a set of independent Bernoulli trials is carried out and the probability of a success is p. Let X be the number of trials until the first success occurs (including the success). X has a geometric distribution with parameter p; we write X ~ Geom(p).
P(X = x) = (1 − p)^(x − 1) p
Note: if the first success occurs at the x-th trial, then the sequence of results must be FF...FS, where F occurs x − 1 times (see also Example 1.9). Since the trials are independent, the probability of such a sequence is simply the product of the probabilities of the individual results, i.e. (1 − p)^(x − 1) p.

Example 2.7

Suppose X ~ Geom(p). Let k > i. Calculate the probability that X = k given that X > i.

Example 2.7

We must calculate P(X = k | X > i). From the definition of conditional probability,
P(X = k | X > i) = P(X = k ∩ X > i) / P(X > i).
Note that for k > i, if X = k, then X > i is automatically satisfied. Hence, P(X = k ∩ X > i) = P(X = k) = (1 − p)^(k − 1) p. Also, if more than i trials are required before the first success, the first i trials must have been failures (see also Example 1.9). Hence, P(X > i) = (1 − p)^i.

Example 2.7

Thus
P(X = k | X > i) = P(X = k ∩ X > i) / P(X > i) = (1 − p)^(k − 1) p / (1 − p)^i = (1 − p)^((k − i) − 1) p.
It can be seen that this is simply the probability that the first success comes on the (k − i)-th trial. Given I have carried out an experiment i times without success, the number of additional trials I carry out before the first success (k − i in the language of the example above) has the same distribution as the number of trials at the start of the process.

The Memoryless Property

Definition: Suppose that for all c > 0, P(X = k + c | X > k) = P(X = c); then the random variable X has the memoryless property. Interpretation: Suppose I roll a die until I obtain a 6. At the beginning I expect to throw the die 6 times on average. If I throw k non-sixes in a row, then I still expect to throw the die an additional 6 times on average before obtaining a six. Hence, after a succession of k failures, the future looks just the same as it did at the beginning of the experiment.
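The memoryless property is easy to verify numerically for the geometric distribution. A sketch, using the die example (p = 1/6) with illustrative values of i and k:

```python
def geom_pmf(x, p):
    """P(X = x) for X ~ Geom(p): (1 - p)^(x - 1) p."""
    return (1 - p)**(x - 1) * p

p, i, k = 1/6, 4, 10
# P(X = k | X > i) = P(X = k) / P(X > i), with P(X > i) = (1 - p)^i.
conditional = geom_pmf(k, p) / (1 - p)**i
# The memoryless property says this equals P(X = k - i).
shifted = geom_pmf(k - i, p)
```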

2.3.6 The Hypergeometric Distribution

Suppose we choose n objects from N objects without replacement (e.g. choosing a hand of cards, inspecting a batch of goods). Suppose there are N_1 objects of type 1 and the remaining N_2 = N − N_1 objects are of type 2. Let X be the number of objects of type 1 chosen. What is P(X = x)? P(X = x) is given by the number of ways of choosing x objects of type 1 divided by the total number of ways of choosing the n objects (the possible choices are all equally likely).

The hypergeometric distribution

There are C(N, n) ways of choosing the n objects. In order to choose x objects of type 1:
a. We choose x objects from the N_1 of type 1: C(N_1, x) ways.
b. Since in total we choose n objects, the remaining n − x objects are chosen from the N_2 objects of type 2: C(N_2, n − x) ways.

The hypergeometric distribution

Since for each way of choosing the objects of type 1, there are the same number of ways of choosing the objects of type 2, we have
P(X = x) = C(N_1, x) C(N_2, n − x) / C(N, n),
where
N — total number of objects.
n — number of objects chosen.
N_i — total number of objects of type i.
x — number of objects of type 1 chosen.
n − x — number of objects of type 2 chosen.

The hypergeometric distribution

Note that when the total number of objects, N, is very large in comparison to the number of objects chosen, n, the hypergeometric distribution can be approximated by the binomial distribution with parameters n and p = N_1/N.
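This approximation can be illustrated by comparing the two pmfs for a large population; a sketch (the function name and the numbers N = 10 000, N_1 = 2 500 are mine, chosen for illustration):

```python
from math import comb

def hypergeom_pmf(x, N, N1, n):
    """P(X = x): x type-1 objects among n drawn from N without replacement."""
    return comb(N1, x) * comb(N - N1, n - x) / comb(N, n)

# With N much larger than n, Bin(n, N1/N) is a close approximation.
N, N1, n = 10_000, 2_500, 10
h = hypergeom_pmf(3, N, N1, n)
b = comb(n, 3) * 0.25**3 * 0.75**7   # Bin(10, 0.25) at x = 3
```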

Extension to a larger number of classes of object

Suppose there are k classes of object. Let there be N objects in total and N_i objects of class i. Suppose n objects are chosen without replacement. Let X_i be the number of objects of class i chosen. For x_1 + x_2 + ... + x_k = n, we have
P(X_1 = x_1, X_2 = x_2, ..., X_k = x_k) = [∏_{i=1}^{k} C(N_i, x_i)] / C(N, n).

Example 2.8

Calculate the probability that a hand of 13 cards from a standard pack of 52 contains
1. Exactly 5 clubs (event A).
2. At least 1 ace (event B).
3. 5 clubs or 5 spades (or both) (event C).

Example 2.8

1. We consider 2 classes of objects: clubs and non-clubs. We must obtain 5 clubs from the total of 13 clubs. The other 8 cards in our hand are chosen from the 39 non-clubs. Thus there are C(13, 5) ways of choosing the clubs and C(39, 8) ways of choosing the non-clubs. Multiplying these together, we obtain the number of ways of obtaining such a hand.

Example 2.8

In total, we choose 13 cards from 52. Hence, the total number of possible choices is C(52, 13). Thus,
P(A) = C(13, 5) C(39, 8) / C(52, 13) ≈ 0.125.

Example 2.8

2. It is simpler to first calculate P(B^c), where B^c is the event that no aces are chosen. As before, we consider 2 classes of object. We choose none of the 4 aces: C(4, 0) = 1 way of doing this. The remaining 13 cards in the hand come from the 48 non-aces: C(48, 13) ways of choosing these cards.

Example 2.8

Hence, arguing as in part 1,
P(B^c) = C(4, 0) C(48, 13) / C(52, 13) ≈ 0.304.
Thus
P(B) = 1 − C(48, 13) / C(52, 13) ≈ 0.696.

Example 2.8

3. Note that we can write C as A ∪ D, where D is the event "5 spades are chosen" and P(D) = P(A). Since A and D are not mutually exclusive (it is possible in a hand of 13 cards to have 5 clubs and 5 spades), we must use the general formula
P(A ∪ D) = P(A) + P(D) − P(A ∩ D).
A ∩ D is the event "5 clubs and 5 spades are chosen". In order to calculate P(A ∩ D), we must consider 3 classes of object: clubs, spades and red cards (i.e. those that are neither clubs nor spades).

Example 2.8

From class 1 (clubs) we must choose 5 of the total of 13 clubs: C(13, 5) ways. From class 2 (spades), the number of choices is as above. The remaining 3 cards chosen come from the 26 red cards: C(26, 3) ways. Multiplying these three terms together, we obtain the number of ways of choosing both 5 clubs and 5 spades.

Example 2.8

As before, the number of ways of choosing a hand of 13 cards from 52 is C(52, 13). It follows that
P(A ∩ D) = C(13, 5) C(13, 5) C(26, 3) / C(52, 13).

Example 2.8

It follows that
P(A ∪ D) = P(A) + P(D) − P(A ∩ D) = 2P(A) − P(A ∩ D)
         = [2 C(13, 5) C(39, 8) − C(13, 5)² C(26, 3)] / C(52, 13) ≈ 0.243.
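All three answers in Example 2.8 can be evaluated with `math.comb`. A sketch:

```python
from math import comb

hands = comb(52, 13)                              # all 13-card hands
p_A = comb(13, 5) * comb(39, 8) / hands           # exactly 5 clubs
p_B = 1 - comb(48, 13) / hands                    # at least 1 ace
p_AD = comb(13, 5)**2 * comb(26, 3) / hands       # 5 clubs and 5 spades
p_C = 2 * p_A - p_AD                              # 5 clubs or 5 spades (or both)
```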

The use of the binomial (multinomial) and hypergeometric distributions

You should be careful to distinguish between scenarios in which the binomial distribution should be used and scenarios in which the hypergeometric distribution should be used. The binomial distribution should be used when the probability of success is p at each stage, regardless of the results of previous experiments. For example, if we choose cards from a pack and after each choice we replace the chosen card in the pack, the number of e.g. clubs chosen will have a binomial distribution, since the probability of choosing a club at each stage is 0.25, regardless of previous choices.

The use of the binomial (multinomial) and hypergeometric distributions

The hypergeometric distribution should be used when we choose from a set of objects and the chosen object is not put back into the pool of objects before the next choice is made. For example, suppose that 5% of the objects from a production line are faulty. Let X be the number of faulty objects in n objects. We assume that each object is faulty with probability 0.05. Hence, X ~ Bin(n, 0.05). Now suppose there are 5 faulty products in a batch of 100 and a quality controller checks 10 of these objects (obviously different objects). If the first object he chooses is faulty, then of the remaining 99 objects he could check, only 4 are faulty. In this case the number of faulty objects found has a hypergeometric distribution.

2.4 Results for expected value and variance

In order to calculate the expected value and variance for the binomial distribution, we use the following theorem regarding the expected value and variance of a sum of random variables.
THEOREM: Suppose X = X_1 + X_2 + ... + X_n.
i) E(X) = E(X_1) + E(X_2) + ... + E(X_n).
ii) If the X_i are independent, then Var(X) = Var(X_1) + Var(X_2) + ... + Var(X_n).
Note that it follows from i) that
iii) E[f_1(X) + f_2(X) + ... + f_n(X)] = E[f_1(X)] + E[f_2(X)] + ... + E[f_n(X)].

Results for expected value and variance

Proof of iii):
E[f_1(X) + ... + f_n(X)] = ∑_{x ∈ S_X} [f_1(x) + ... + f_n(x)] P(X = x)
= ∑_{x ∈ S_X} [f_1(x) P(X = x) + ... + f_n(x) P(X = x)]
= ∑_{x ∈ S_X} f_1(x) P(X = x) + ... + ∑_{x ∈ S_X} f_n(x) P(X = x)
= E[f_1(X)] + ... + E[f_n(X)].

Results for expected value and variance

Suppose Y = aX + b, where a, b are constants. It follows that
i) E(Y) = aE(X) + b
ii) Var(Y) = a² Var(X)
The proofs of these statements are left for the tutorials.

Example 2.9

Using the theorem on the expected value and variance of a sum of independent random variables, calculate E(X) and Var(X) when X ~ Bin(n, p).

Example 2.9

We use the fact that X = X_1 + X_2 + ... + X_n, where X_i ~ 0-1(p) and the X_i are independent. Note: X_i = 1 if the i-th trial results in a success and X_i = 0 if the i-th trial results in a failure, i.e. P(X_i = 1) = p; P(X_i = 0) = 1 − p.

Example 2.9

We have
E(X_i) = ∑_{x = 0, 1} x P(X_i = x) = 1 × p + 0 × (1 − p) = p.
Thus,
E(X) = E(X_1) + E(X_2) + ... + E(X_n) = p + p + ... + p = np.

Example 2.9

We have Var(X_i) = E(X_i²) − E(X_i)², where
E(X_i²) = ∑_{x = 0, 1} x² P(X_i = x) = 1² × p + 0² × (1 − p) = p.
Hence, Var(X_i) = p − p² = p(1 − p).

Example 2.9

Thus,
Var(X) = Var(X_1) + Var(X_2) + ... + Var(X_n) = p(1 − p) + p(1 − p) + ... + p(1 − p) = np(1 − p).
This expected value and variance, together with the interpretation of a binomial random variable as the sum of independent random variables, will be important when we consider the Central Limit Theorem.

2.5 Extension of the law of total probability to expected values

The conditional distribution of the random variable X given the event A is given by
{P(X = x | A)}_{x ∈ S_X} = {P(X = x ∩ A) / P(A)}_{x ∈ S_X}.
The expected value of the random variable X conditional on the event A is defined by
E(X | A) = ∑_{x ∈ S_X} x P(X = x | A) = ∑_{x ∈ S_X} x P(X = x ∩ A) / P(A).
Let A_1, A_2, ..., A_n form a partition for an experiment. We have
E(X) = E(X | A_1)P(A_1) + E(X | A_2)P(A_2) + ... + E(X | A_n)P(A_n).

Example 2.10

Let X be the result of a die roll. We have P(X = x) = 1/6, x ∈ S_X = {1, 2, 3, 4, 5, 6}. It follows that
E(X) = ∑_{x=1}^{6} x P(X = x) = ∑_{x=1}^{6} x/6 = 21/6 = 7/2.
Note that this also follows from the symmetry of the distribution about x = 3.5.

Example 2.10

Let A be the event that the result of the die roll is less than or equal to 3. Just for illustration, we will use the law of total probability to calculate E(X). From the law of total probability, we obtain
E(X) = E(X | A)P(A) + E(X | A^c)P(A^c).
We have P(A) = P(A^c) = 1/2. Consider the conditional distribution of X given A. Intuitively, this conditional distribution is uniform on the set of integers {1, 2, 3}, i.e. P(X = x | A) = 1/3, x ∈ {1, 2, 3}. Similarly, the conditional distribution of X given A^c is uniform on the set of integers {4, 5, 6}, i.e. P(X = x | A^c) = 1/3, x ∈ {4, 5, 6}.

Example 2.10

Mathematically,
P(X = x | A) = P(X = x | X ≤ 3) = P(X = x ∩ X ≤ 3) / P(X ≤ 3).
For x ≤ 3, the numerator simplifies to P(X = x) = 1/6. For x > 3, the value of the numerator is 0. The value of the denominator is 0.5. It follows that P(X = x | A) = 1/3, x ∈ {1, 2, 3}. The derivation of the conditional distribution of X given A^c is analogous.

Example 2.10

We have
E(X | A) = ∑_{x=1}^{3} x P(X = x | A) = (1 + 2 + 3)/3 = 2.
Again, this follows from the symmetry of the conditional distribution about x = 2. Analogously, E(X | A^c) = 5. It follows that
E(X) = E(X | A)P(A) + E(X | A^c)P(A^c) = 2 × 1/2 + 5 × 1/2 = 7/2.
Note that the law of total probability is useful in calculating the expected value of a random variable with a geometric distribution. This law does not extend directly to the variance of a random variable (see tutorial sheet).
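The decomposition in Example 2.10 can be verified with exact fractions. A sketch:

```python
from fractions import Fraction

# Die roll X with A = {X <= 3}; check E(X) = E(X|A)P(A) + E(X|A^c)P(A^c).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
E_X = sum(x * p for x, p in pmf.items())

P_A = sum(p for x, p in pmf.items() if x <= 3)
E_X_given_A = sum(x * p for x, p in pmf.items() if x <= 3) / P_A
E_X_given_Ac = sum(x * p for x, p in pmf.items() if x > 3) / (1 - P_A)
decomposed = E_X_given_A * P_A + E_X_given_Ac * (1 - P_A)
```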

The expected value and variance of standard discrete random variables

The following table gives the expected value and variance for the standard discrete distributions. In the case of the hypergeometric distribution, p = N_1/N denotes the proportion of type 1 objects.

Distribution      E(X)    Var(X)
0-1(p)            p       p(1 − p)
Bin(n, p)         np      np(1 − p)
Poisson(λ)        λ       λ
Geom(p)           1/p     (1 − p)/p²
Hypergeometric    np      np(1 − p)(N − n)/(N − 1)

2.6 The Cumulative Distribution Function

The cumulative distribution function of the random variable X is denoted by F_X. If it is clear which random variable is being referred to, the index is often left out. This function is often simply referred to as the distribution function.
F_X(x) = P(X ≤ x)
Note that P(X ≤ x) = P(X < x) + P(X = x).

Properties of the Cumulative Distribution Function

1. F_X is a non-decreasing function.
2. lim_{x → −∞} F_X(x) = 0 and lim_{x → ∞} F_X(x) = 1.
3. F_X is continuous from the right hand side.

Properties of the Cumulative Distribution Function

Just like the standard definition of a discrete random variable, given by {P(X = x)}_{x ∈ S_X}, the cumulative distribution function uniquely defines a random variable. Let x_1 and x_k be the smallest and largest values in the support of X. For x < x_1, F_X(x) = 0. For x ≥ x_k, F_X(x) = 1. If x_{i−1} and x_i are neighbouring values in the support of X, then F_X is constant on the interval [x_{i−1}, x_i). At x_i, the distribution function jumps up by P(X = x_i). F_X(x_i) takes the value at the top of this step.

Example 2.11

Suppose a die is rolled 3 times. Let X be the number of sixes. Draw a graph of the cumulative distribution function of this random variable.

Example 2.11

We have X ~ Bin(3, 1/6). We first define the distribution of X in the standard way. S_X = {0, 1, 2, 3}.
P(X = 0) = C(3, 0) (1/6)^0 (5/6)^3 = 125/216
P(X = 1) = C(3, 1) (1/6)^1 (5/6)^2 = 75/216
P(X = 2) = C(3, 2) (1/6)^2 (5/6)^1 = 15/216
P(X = 3) = C(3, 3) (1/6)^3 (5/6)^0 = 1/216
It is easiest to derive F_X iteratively, starting with x < 0 (0 is the smallest value in S_X).

92 Example 2.11

The lowest value that X can take is 0. Hence, F_X(x) = 0 for x < 0. At 0, F_X jumps up by P(X = 0) = 125/216. The next value that X can take is 1. Hence, for x ∈ [0, 1), F_X(x) = 125/216. At 1, F_X jumps up by P(X = 1) = 75/216. The next value that X can take is 2. Hence, for x ∈ [1, 2), F_X(x) = 200/216. At 2, F_X jumps up by P(X = 2) = 15/216. The next value that X can take is 3. Hence, for x ∈ [2, 3), F_X(x) = 215/216. Since 3 is the largest value that X can take, for x ≥ 3, F_X(x) = 1. 92 / 97

93 Example 2.11

Hence,

F_X(x) = 0,        x < 0
         125/216,  0 ≤ x < 1
         200/216,  1 ≤ x < 2
         215/216,  2 ≤ x < 3
         1,        x ≥ 3

93 / 97
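The step heights of this CDF can be reproduced by cumulating the binomial probabilities. A small sketch (not from the notes), using exact fractions to avoid rounding:

```python
from fractions import Fraction
from math import comb

n, p = 3, Fraction(1, 6)
# P(X = k) for X ~ Bin(3, 1/6)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# F_X at each support point is the running sum of the probabilities
F, total = {}, Fraction(0)
for k in range(n + 1):
    total += pmf[k]
    F[k] = total

print(F[0], F[1], F[2], F[3])  # 125/216, 200/216 (= 25/27), 215/216, 1
```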

94 Example 2.11

[Graph of the cumulative distribution function F_X: a step function with jumps at 0, 1, 2 and 3.]

94 / 97

95 2.7 The Median of a Discrete Distribution

The median of a discrete distribution, q_0.5, satisfies

P(X ≤ q_0.5) ≥ 0.5 and P(X ≥ q_0.5) ≥ 0.5.

The median, like E(X), is a measure of centrality. Approximately 50% of observations will be greater than the median and 50% smaller than the median. The relation between the median and the expected value will be considered in the following chapter. 95 / 97

96 2.7 The Median of a Discrete Distribution

1. If the cumulative distribution function never takes the value 0.5, then the median is the unique value of x at which F_X jumps from being below 0.5 to being above 0.5.
2. If the cumulative distribution function F_X takes the value 0.5 on some interval [x_1, x_2), then any value in the interval [x_1, x_2] is a median of X.

96 / 97

97 Example 2.12

Calculate the median for the random variable defined in Example 2.11. Since F_X jumps from 0 to 125/216 > 0.5 when x = 0, q_0.5 = 0. 97 / 97
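Rule 1 above translates directly into code: scan the support in increasing order and return the first point where the cumulative probability reaches 0.5. A minimal sketch (not from the notes; exact fractions are used to avoid floating-point edge cases):

```python
from fractions import Fraction
from math import comb

def median(pmf):
    # Smallest x with F_X(x) >= 1/2; this is the unique median
    # when F_X never takes the value 1/2 exactly (rule 1 above).
    total = Fraction(0)
    for x in sorted(pmf):
        total += pmf[x]
        if total >= Fraction(1, 2):
            return x

# Example 2.11: X ~ Bin(3, 1/6), the number of sixes in 3 rolls
p = Fraction(1, 6)
pmf = {k: comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)}
print(median(pmf))  # 0, since F_X(0) = 125/216 > 1/2
```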


More information

Sums of Independent Random Variables

Sums of Independent Random Variables Chapter 7 Sums of Independent Random Variables 7.1 Sums of Discrete Random Variables In this chapter we turn to the important question of determining the distribution of a sum of independent random variables

More information

Contemporary Mathematics- MAT 130. Probability. a) What is the probability of obtaining a number less than 4?

Contemporary Mathematics- MAT 130. Probability. a) What is the probability of obtaining a number less than 4? Contemporary Mathematics- MAT 30 Solve the following problems:. A fair die is tossed. What is the probability of obtaining a number less than 4? What is the probability of obtaining a number less than

More information

Without data, all you are is just another person with an opinion.

Without data, all you are is just another person with an opinion. OCR Statistics Module Revision Sheet The S exam is hour 30 minutes long. You are allowed a graphics calculator. Before you go into the exam make sureyou are fully aware of the contents of theformula booklet

More information

Statistics 100A Homework 4 Solutions

Statistics 100A Homework 4 Solutions Chapter 4 Statistics 00A Homework 4 Solutions Ryan Rosario 39. A ball is drawn from an urn containing 3 white and 3 black balls. After the ball is drawn, it is then replaced and another ball is drawn.

More information

Sample Questions for Mastery #5

Sample Questions for Mastery #5 Name: Class: Date: Sample Questions for Mastery #5 Multiple Choice Identify the choice that best completes the statement or answers the question.. For which of the following binomial experiments could

More information

MATH 10034 Fundamental Mathematics IV

MATH 10034 Fundamental Mathematics IV MATH 0034 Fundamental Mathematics IV http://www.math.kent.edu/ebooks/0034/funmath4.pdf Department of Mathematical Sciences Kent State University January 2, 2009 ii Contents To the Instructor v Polynomials.

More information