Chapter 8: Discrete probability and the laws of chance


8.1 Introduction

In this chapter we lay the groundwork for calculations and rules governing simple discrete probabilities. These steps will be essential for developing the skills to analyze and understand problems of genetic diseases, genetic codes, and a vast array of other phenomena where the laws of chance intersect the processes of biological systems. To gain experience with probability, it is important to see simple examples. We start this introduction with experiments that can be easily reproduced and tested by the reader.

8.2 Simple experiments

Consider the following experiment: we flip a coin and observe one of two possible results, heads (H) or tails (T). A fair coin is one for which these results are equally likely. Similarly, consider the experiment of rolling a die: a six-sided die can land on any of its six faces, so a single experiment has six possible outcomes. We anticipate getting each of the results with equal probability, i.e. if we were to repeat the same experiment many times, we would expect that, on average, the six possible events would occur with similar frequencies. We say that the events are random and unbiased for a fair die.

How likely are we to roll a 5 and a 6 in successive experiments? A five or a six? If we toss a coin ten times, how probable is it that we get 8 heads out of the ten tosses? For a given experiment such as the one described here, we are interested in quantifying how likely it is that a certain event is obtained. Our goal in this chapter is to make our notion of probability more precise, and to examine ways of quantifying and computing probabilities for experiments such as these. To motivate this investigation, we first look at the results of a real experiment performed in class by students.

v January 5,
8.3 Empirical probability

Each student in a class of N = 121 individuals was asked to toss a penny 10 times. The students were then asked to record their results and to indicate how many heads they had obtained in this sequence of tosses. (Note that the order of the heads was not taken into account, only how many were obtained out of the 10 tosses.) Table 8.1 specifies the number of heads, k (column 1), and the number, x_k, of students who responded that they had obtained that many heads (column 2). In column 3 we display the cumulative number of students who got any number up to and including k heads. In column 4 we compute the fraction of the class, p(x_k) = x_k/N, who got exactly k heads, and in column 5 the cumulative fraction of the class who got any number up to and including k heads. We will henceforth associate the fraction p(x_k) with the empirical probability of k heads. The last column is thus the cumulative probability, i.e. the sum of the probabilities of getting any number up to k heads.

k (number of heads) | x_k (frequency) | \sum_{i \le k} x_i (cumulative) | p(x_k) = x_k/N (probability) | \sum_{i \le k} p(x_i) (cumulative probability)

Table 8.1: Results of a real experiment carried out by 121 students in this mathematics course. Each student tossed a coin 10 times. We recorded the number of students who got 0, 1, 2, etc. heads. The fraction of the class that got each outcome is identified with the (empirical) probability of that outcome. See Figure 8.1 for the same data presented graphically.

In Figure 8.1 we show what this distribution looks like on a bar graph. We observe that this empirical distribution is not very symmetric, because it is based on a total of only 121 trials (i.e. 121 repetitions of the experiment of 10 tosses). However, it is clear from this distribution that certain results occurred more often (and hence are associated with a greater probability) than others.
To the right, we also show the cumulative distribution function, superimposed as an xy-plot on the same graph. Observe that this function starts with the value 0 and climbs up to the value 1, since the probabilities of all of the events (0, 1, 2, etc. heads) must add up to 1.
Figure 8.1: The data from Table 8.1 plotted graphically. A total of N = 121 people were asked to toss a coin n = 10 times. In the bar graph (left), the horizontal axis shows k, the number of heads (H) that came up during those 10 coin tosses, and the vertical axis shows the fraction p(x_k) of the class that achieved that particular number of heads. The same bar graph is shown on the right, together with the cumulative distribution function that sums up the values from left to right, climbing from 0 to 1.
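A class experiment like this is easy to mimic in a few lines of code. The sketch below (the function name and seed are my own choices, and the simulated counts will not reproduce the actual class data) tosses a fair coin 10 times for each of 121 simulated students and tabulates the empirical probabilities:

```python
import random
from collections import Counter

def empirical_distribution(n_students=121, n_tosses=10, seed=1):
    """Simulate each student tossing a fair coin n_tosses times,
    then tabulate the fraction of the class that got k heads."""
    rng = random.Random(seed)
    heads_per_student = (
        sum(rng.randint(0, 1) for _ in range(n_tosses))
        for _ in range(n_students)
    )
    counts = Counter(heads_per_student)
    # p(x_k) = x_k / N, for k = 0, ..., n_tosses
    return {k: counts.get(k, 0) / n_students for k in range(n_tosses + 1)}

dist = empirical_distribution()
print(round(sum(dist.values()), 10))  # the empirical probabilities sum to 1
```

Rerunning with different seeds gives slightly different (and similarly asymmetric) distributions, which is the point of the remark above: 121 repetitions are not enough for the histogram to settle onto its theoretical shape.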
8.4 Mean and variance of a probability distribution

In a previous chapter, we considered distributions of grades and computed the mean (also called the average) of that distribution. The identical concept applies to the distributions discussed in the context of probability, but here we use the terminology mean, average value, and expected value interchangeably. Suppose we toss a coin n times and let x_i stand for the number of heads obtained in those n tosses. Then x_i can take on the values x_i = 0, 1, 2, 3, ..., n. Let p(x_i) be the probability of obtaining exactly x_i heads. By analogy with ideas in a previous chapter, we would define the mean (or average or expected value), \bar{x}, of the probability distribution by the ratio

\bar{x} = \frac{\sum_{i=0}^{n} x_i p(x_i)}{\sum_{i=0}^{n} p(x_i)}.

However, this expression can be simplified by observing that, according to property (2) of discrete probability, the denominator is just

\sum_{i=0}^{n} p(x_i) = 1.

This explains the following definition.

Definition: The expected value \bar{x} of a probability distribution (also called the mean or average value) is

\bar{x} = \sum_{i=0}^{n} x_i p(x_i).

It is important to keep in mind that the expected value or mean is a kind of average x coordinate, where values of x are weighted by their frequency of occurrence. This is similar to the idea of a center of mass (x positions weighted by the masses associated with those positions). The mean is a point on the x axis, representing the average outcome of an experiment. (Recall that in the distributions we are describing, the possible outcomes of some observation or measurement process are depicted on the x axis of the graph.) The mean is not the same as the average value of a function, discussed in an earlier chapter. (In that case, the average is an average y coordinate.) We also define a numerical quantity that represents the width of the distribution: the variance, V, and the standard deviation, σ, as follows.
The variance, V, of a distribution is

V = \sum_{i=0}^{n} (x_i - \bar{x})^2 p(x_i),

where \bar{x} is the mean. The standard deviation, σ, is

σ = \sqrt{V}.

In the problem sets, we show that the variance can also be expressed in the form V = M_2 - \bar{x}^2, where M_2 is the second moment of the distribution. Moments of a distribution are defined as the numbers obtained by summing up products of the probability weighted by powers of x. The j-th moment, M_j, of a distribution is

M_j = \sum_{i=0}^{n} (x_i)^j p(x_i).

Example

For the empirical probability distribution shown in Figure 8.1, the mean (expected value) is calculated by performing the following sum, based on the table of events shown above:

\bar{x} = \sum_{k=0}^{10} x_k p(x_k) = 0(0) + 1(0.0083) + 2(0.0165) + ... + 8(0.0579) + 9(0) + 10(0) ≈ 5.2.

The mean number of heads in this set of experiments is about 5.2. Intuitively, we would expect that for a fair coin, half the tosses should produce heads, i.e. on average 5 heads would be obtained out of 10 tosses. We see from the fact that the empirical distribution is slightly biased that the mean is close to, but not equal to, this intuitive theoretical result. To compute the variance we form the sum

V = \sum_{k=0}^{10} (x_k - \bar{x})^2 p(x_k) = \sum_{k=0}^{10} (k - 5.2)^2 p(k).

Here we have used the mean calculated above and the fact that x_k = k. We obtain

V = (0 - 5.2)^2 (0) + (1 - 5.2)^2 (0.0083) + ... + (9 - 5.2)^2 (0) + (10 - 5.2)^2 (0).

The standard deviation is then σ = \sqrt{V}.
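The definitions of the mean and variance translate directly into code. Below is a minimal sketch (the helper names `mean` and `variance` are my own), checked on a distribution not taken from the class data: a single fair die, whose mean is 7/2 and whose variance is 35/12.

```python
def mean(xs, ps):
    """Expected value: the sum of x_i * p(x_i)."""
    return sum(x * p for x, p in zip(xs, ps))

def variance(xs, ps):
    """V = the sum of (x_i - mean)^2 * p(x_i)."""
    m = mean(xs, ps)
    return sum((x - m) ** 2 * p for x, p in zip(xs, ps))

# A fair six-sided die: outcomes 1..6, each with probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
print(round(mean(faces, probs), 6))      # 3.5
print(round(variance(faces, probs), 4))  # 2.9167  (= 35/12)
```

Note that `variance` makes two passes over the data; the identity V = M_2 - \bar{x}^2 from the text would let us compute both moments in a single pass.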
8.5 Theoretical probability

Our motivation in what follows is to put the results of an experiment into some rational context. We would like to be able to predict the distribution of outcomes based on underlying laws of chance. Here we will formalize the basic rules of probability, and learn how to assign probabilities to events that consist of repetitions of some basic, simple experiment like the coin toss. Intuitively, we expect that in tossing a fair coin, half the time we should get H and half the time T. But as seen in our experimental results, there can be long runs that produce very few H or very many H, far from the mean or expected value. How do we assign a theoretical probability to the event that only 1 head is obtained in 10 tosses of a coin? This motivates our more detailed study of the laws of chance and theoretical probability. As we have seen in our previous example, the probability p assigns a number to the likelihood of an outcome of an experiment. In the experiment discussed above, that number was the fraction of the students who got a certain number of heads in a repeated coin toss.

Basic definitions of probability

Suppose we label the possible results of the experiment by symbols e_1, e_2, e_3, ..., e_k, ..., e_m, where m is the number of possible events (e.g. m = 2 for a coin flip, m = 6 for a die roll). We will refer to these as events, and our purpose here will be to assign numbers, called probabilities, p, to these events that indicate how likely it is that they occur. The following two conditions are required for p to be a probability:

1. The inequality 0 ≤ p(e_k) ≤ 1 must be satisfied for all events e_k. Here p(e_k) = 0 is interpreted to mean that this event never happens, and p(e_k) = 1 means that this event always happens.
That is, the probability of each (discrete) event is a number between 0 and 1.

2. If {e_1, e_2, e_3, ..., e_k, ..., e_m} is a list of all the possible events, then

p(e_1) + p(e_2) + p(e_3) + ... + p(e_k) + ... + p(e_m) = 1,

or simply

\sum_{k=1}^{m} p(e_k) = 1.

That is, the probabilities of all of the events together sum up to one, since one or another of the events must always occur.
Definition: The list of all possible events {e_1, e_2, e_3, ..., e_k, ..., e_m} is called the sample space.

8.6 Multiple events and combined probabilities

Here we consider an experiment that consists of more than one repetition. For example, each student tossed a coin 10 times to generate the data used earlier. We aim to have a way of describing the number of possible events as well as the likelihood of getting any one or another of these events.

First multiplication principle: If there are N_1 possible events in experiment 1 and N_2 possible events in experiment 2, then there are N_1 × N_2 possible events in the combined set of experiments.

In the above multiplication principle we assume that the order of the events is important.

Example 1: Each flip of a coin has two events. Flipping two coins can give rise to 2 × 2 = 4 possible events (where we distinguish between the results TH and HT, i.e. the order of occurrence is important).

Example 2: Rolling a die twice gives rise to 6 × 6 = 36 possible events.

Example 3: A sequence of three letters is to be chosen from a four-letter alphabet consisting of the letters T, A, G, C. (For example, TTT, TAG, GCT, and GGA are all examples of this type.) The number of ways of choosing such 3-letter words is 4 × 4 × 4 = 64, since we can pick any of the four letters in any of the three positions of the sequence.

8.7 Calculating the theoretical probability

How do we assign a probability to a given set of events? In the first example in this chapter, we used data to do this, i.e. we repeated an experiment many times, and observed the fraction of times each event occurred. The resulting distribution of outcomes was used to determine the empirical probability. Here we take the alternate approach: we make some simplifying assumptions about each elementary event and use the rules of probability to compute a theoretical probability.
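Counting results such as the one in Example 3 can be confirmed by brute-force enumeration; a short sketch using the standard library:

```python
from itertools import product

# All three-letter sequences over the alphabet {T, A, G, C}:
# the multiplication principle predicts 4 * 4 * 4 = 64 of them.
words = [''.join(w) for w in product("TAGC", repeat=3)]
print(len(words))   # 64
print(words[:4])    # ['TTT', 'TTA', 'TTG', 'TTC']
```

Enumeration scales badly (4^n sequences of length n), which is precisely why the multiplication principle, and later permutations and combinations, are worth having.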
Equally likely assumption

One of the most common assumptions is that each event occurs with equal likelihood. Suppose that there are m possible events and that each is equally likely. Then the probability of each event is 1/m, i.e.

P(e_i) = 1/m for i = 1, ..., m.

Example 1: For a fair coin tossed one time, we expect that the probabilities of getting H or T are equal. In that case,

P(H) + P(T) = 1 and P(H) = P(T).

Together these imply that P(H) = P(T) = 1/2.

Example 2: For a fair six-sided die, the same assumption leads to the conclusion that the probability of getting any one of the six faces as a result of a roll is P(e_k) = 1/6 for k = 1, ..., 6.

Independent events

In order to combine results of several experiments, we need to discuss the notion of independence of events. Essentially, independent events are those that are not correlated or linked with one another. For example, we assume in general that the result of one toss of a coin does not influence the result of a second toss. All theoretical probabilities calculated in this chapter will be based on this important assumption.

Second multiplication principle: Suppose events e_1 and e_2 are independent. Then, if the probability of event e_1 is P(e_1) = p_1 and the probability of event e_2 is P(e_2) = p_2, the probability of event e_1 and event e_2 both occurring is

P(e_1 and e_2) = p_1 p_2.
We also say that this is the probability of event e_1 AND event e_2. This is sometimes written as P(e_1 ∩ e_2). If the two events e_1 and e_2 are not independent, the probability of both occurring is

P(e_1 ∩ e_2) = P(e_1) · P(e_2 assuming that e_1 happened) = P(e_2) · P(e_1 assuming that e_2 happened).

Example 3: The probability of tossing a coin to get H and rolling a die to get a 6 is the product of the individual probabilities of each of these events, i.e.

P(H and 6) = (1/2)(1/6) = 1/12.

Example 4: The probability of rolling a die twice and getting a 3 followed by a 4 is

P(3 and then 4) = (1/6)(1/6) = 1/36.

Addition principle: Suppose events e_1 and e_2 are mutually exclusive. Then the probability of getting event e_1 OR event e_2 is given by

P(e_1 ∪ e_2) = p_1 + p_2.

In general, i.e. when the events e_1 and e_2 are not necessarily mutually exclusive, the following equation holds:

P(e_1 ∪ e_2) = P(e_1) + P(e_2) - P(e_1 ∩ e_2).

Example 5: When we roll a die once, assuming that each face has equal probability of occurring, the chance of getting either a 1 or a 2 (i.e. either of these two possibilities out of a total of six) is

P({1} ∪ {2}) = 1/6 + 1/6 = 1/3.
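The multiplication and addition principles can be checked with exact rational arithmetic; a small sketch reproducing Examples 3 and 5:

```python
from fractions import Fraction

half = Fraction(1, 2)    # P(H) for a fair coin
sixth = Fraction(1, 6)   # P of any one face of a fair die

# Second multiplication principle (independent events):
print(half * sixth)      # 1/12 -- P(H and rolling a 6)

# Addition principle (mutually exclusive events):
print(sixth + sixth)     # 1/3  -- P(rolling a 1 or a 2)
```

Using `Fraction` rather than floating point keeps the answers in the same exact form (1/12, 1/3) as the text.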
Example 6: When we flip a coin, the probability of getting either heads or tails is

P({H} ∪ {T}) = 1/2 + 1/2 = 1.

This makes sense since there are only 2 possible events (m = 2), and we said earlier that \sum_{k=1}^{m} p(e_k) = 1, i.e. one of the 2 events must always occur.

Subtraction principle: If the probability of event e_k is P(e_k), then the probability of NOT getting event e_k is

P(not e_k) = 1 - P(e_k).

Example 7: When we roll a die once, the probability of NOT getting the value 2 is

P(not 2) = 1 - 1/6 = 5/6.

Alternatively, we can add up the probabilities of getting all the results other than a 2, i.e. 1, 3, 4, 5, or 6, and arrive at the same answer:

P({1} ∪ {3} ∪ {4} ∪ {5} ∪ {6}) = 1/6 + 1/6 + 1/6 + 1/6 + 1/6 = 5/6.

Example 8: A box of jelly beans contains a mixture of 10 red, 18 blue, and 12 green jelly beans. Suppose that these are well mixed and that the probability of pulling out any one jelly bean is the same. (a) What is the probability of randomly selecting two blue jelly beans from the box? (b) What is the probability of randomly selecting two beans that have the same color? (c) What is the probability of randomly selecting two beans that have different colors?

Solution: There are a total of 40 jelly beans in the box. In a random selection, we are assuming that each jelly bean has equal likelihood of being selected. (a) Suppose we take out one jelly bean and then a second. Once we take out the first, there will be 39 left. If the first one was blue (with probability 18/40), then there will be 17 blue ones left in the box. Thus, the probability of selecting a blue bean AND another blue bean is:

P(2 blue) = (18/40)(17/39) = 306/1560 ≈ 0.196.
The same answer is obtained by considering pairs of jelly beans. There are a total of (40 · 39)/2 pairs, and out of these, only (18 · 17)/2 pairs are pure blue. Thus the probability of getting a blue pair is

[(18 · 17)/2] / [(40 · 39)/2] = 306/1560 ≈ 0.196.

(The result is the same whether we select both simultaneously or one at a time.)

(b) Two beans will have the same color if they are both blue OR both red OR both green. We have to add the corresponding probabilities, that is

P(same color) = (10/40)(9/39) + (18/40)(17/39) + (12/40)(11/39) = 528/1560 ≈ 0.338.

(c) Two beans will have different colors if we do NOT get the case that the two beans have the same color. Thus

P(not same color) = 1 - P(same color) = 1 - 528/1560 = 1032/1560 ≈ 0.662.

Example 9: (a) How many different ways are there of rolling a pair of dice to get the total score of 7? (By total score we mean the sum of both faces.) (b) What is the probability of rolling the total score 7 with a pair of fair dice? (c) What is the probability of rolling the total score 8 with a pair of dice? (d) What is the probability of getting a total of 13 by rolling three fair dice?

Solution: (a) We can think of the result as (first die) + (second die) = 7. For the first die we could have any value, j = 1, 2, ..., 6 (i.e. a total of 6 possibilities), but then the second die must be 7 - j, which means that there is no choice for the second die. Thus there are 6 ways of obtaining a total of 7. (We do not need to list those ways here, since the argument establishes an unambiguous answer, but here is that list anyway, showing the face value of each pair of events that totals 7: (1, 6); (2, 5); (3, 4); (4, 3); (5, 2); (6, 1).)

(b) There are a total of 6 × 6 = 36 possibilities for the outcomes of rolling two dice, and we saw above that 6 of these will add up to 7. Assuming all possibilities are equally likely (for fair dice), this means that the probability of a total of 7 for the pair is 6/36 = 1/6.

(c) Here we must be more careful, since the previous argument will not quite work: we need (first die) + (second die) = 8.
For example, if the first die comes up 1, then there is no way to get a total of 8 for the pair. The smallest value that would work on the first die is 2, so we have only 5 possible choices for the first die, and then the second has to make up the difference. There are only 5 such possibilities. These are: (2, 6); (3, 5); (4, 4); (5, 3); (6, 2). Therefore the probability of such an event is 5/36.
(d) To get 13 by rolling three fair dice, we need the three faces to total 13. We consider the possibilities: if the first die comes up a 6, then we need the other pair to total 13 - 6 = 7. We already know this can be done in six ways. If the first die comes up a 5, then we need the other pair to total 13 - 5 = 8. There are 5 ways to get this. Let us organize our counting of the possibilities in the following table, to be systematic and to see a pattern:

face value of first die | required total of remaining pair | number of ways to get this
6 | 7 | 6
5 | 8 | 5
4 | 9 | 4
3 | 10 | 3
2 | 11 | 2
1 | 12 | 1

We can easily persuade ourselves that there is a pattern being followed in building up this table. We see that the total number of ways of getting the desired result is just the sum of the numbers in the third column, i.e. 6 + 5 + 4 + 3 + 2 + 1 = 21. But the total number of possibilities for the three dice is 6^3 = 216. Thus the probability of a total score of 13 is 21/216 ≈ 0.097. In this example, we had to list some different possibilities in order to achieve the desired result.

The examples in this section illustrate the simplest probability assumptions, rules, and calculations. Many of the questions asked in these examples were answered by careful counting of possibilities and computing the fraction of cases in which some desired result is obtained. In the following sections, we will discuss ways of representing outcomes of measurement (by distributions, and by numerical descriptors such as the mean and variance). We will also study techniques for counting the number of possibilities and for computing probabilities associated with repeated trials of one type of experiment.
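The counts in Example 9 are small enough to verify by listing every outcome; a brute-force sketch:

```python
from itertools import product

# Brute-force check of Example 9: count dice outcomes by total score.
pairs = list(product(range(1, 7), repeat=2))
print(sum(1 for a, b in pairs if a + b == 7), len(pairs))  # 6 36
print(sum(1 for a, b in pairs if a + b == 8))              # 5

triples = list(product(range(1, 7), repeat=3))
ways_13 = sum(1 for t in triples if sum(t) == 13)
print(ways_13, len(triples))  # 21 216
```

This confirms the table: 6 ways out of 36 for a pair total of 7, 5 for a total of 8, and 21 ways out of 216 for three dice to total 13.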
8.8 Theoretical probability of coin tossing

Earlier in this chapter, we studied the results of a coin-tossing experiment. Now we turn to a theoretical investigation of the same type of experiment to understand the predictions of the basic rules of probability. We would like to quantify the probability of getting some number, k, of heads when the coin is tossed n times. We start with an elementary example in which the coin is tossed only three times (n = 3), to build up intuition. Let us use the notation p = P(H) and q = P(T) to represent the probabilities of each outcome. For our theoretical probability investigation, we will make the simplest assumption about each elementary event, i.e. that it is equally likely to obtain a head (H) or a tail (T) in one toss. Then the probabilities of event H and event T in a single toss are the same, i.e. 1/2:

P(H) = P(T) = 1/2, i.e. p = q = 1/2.

A new feature of this section is that we will summarize the probability using a frequency distribution. In this important type of plot, the horizontal axis represents some observed or measured value in an experiment (for example, the number of heads in a coin toss experiment). The vertical axis represents how often that outcome is obtained (i.e. the frequency, or probability, of the event).

Three coin tosses

Suppose we are not interested in the precise order of the results, but rather in the total number of heads (or tails). For example, we may win $1 for every H and lose $1 for every T that occurs.
When a fair coin is tossed 3 times, the possible events are as follows. Grouping events together by the number of heads obtained, we summarize the information in the following frequency table:

event | number of heads | probability
1: TTT | 0 | 1/8
2: TTH | 1 | 1/8
3: THT | 1 | 1/8
4: HTT | 1 | 1/8
5: THH | 2 | 1/8
6: HTH | 2 | 1/8
7: HHT | 2 | 1/8
8: HHH | 3 | 1/8

Table 8.2: A list of all possible results of a 3-coin-toss experiment, showing the number of heads in each case and the theoretical probability of each of the results.

Each result shown above has the same probability, p = (1/2)^3 = 1/8. Grouping together results from Table 8.2 and forming Table 8.3, we find similarly that the probability of getting no heads is 1/8, of getting one head is 3/8 (the sum of the probabilities of three equally likely events), of getting two heads is 3/8, and of getting three heads is 1/8. This distribution is shown in Figure 8.2(b). We can use these results to calculate the theoretical mean number of heads (expected value) in this experiment.
number of heads (x_k = k) | result | probability
0 | P(TTT) | 1/8
1 | P(TTH) + P(THT) + P(HTT) | 3/8
2 | P(THH) + P(HTH) + P(HHT) | 3/8
3 | P(HHH) | 1/8

Table 8.3: Theoretical probability of getting 0, 1, 2, or 3 H's in a 3-coin-toss experiment.

Example

In the case of three tosses of a coin described above and shown in Figure 8.2(b), the expected value is:

\bar{x} = \sum_{k=0}^{3} k · P(k heads) = 0 p(0) + 1 p(1) + 2 p(2) + 3 p(3) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 12/8 = 1.5.

Thus, in three coin tosses, we expect that on average we would obtain 1.5 heads.

Figure 8.2: The theoretical probability distribution for the number of heads obtained in three tosses of a fair coin.

The variance of this distribution can be calculated as follows:

V = \sum_{i=0}^{3} (x_i - \bar{x})^2 p(x_i) = (0 - 1.5)^2 (1/8) + (1 - 1.5)^2 (3/8) + (2 - 1.5)^2 (3/8) + (3 - 1.5)^2 (1/8).
V = 2(1.5)^2 (1/8) + 2(0.5)^2 (3/8) = 0.5625 + 0.1875 = 0.75.

The standard deviation is σ = \sqrt{V} = \sqrt{0.75} ≈ 0.866.

The bar graph shown in Figure 8.2(a) and (b) will be referred to as the probability distribution for the number of heads in three tosses of a coin. We make the following observations about this distribution:

- The values of the probabilities are all positive and satisfy 0 ≤ p ≤ 1.

- In both graphs, the sum of the areas of all the bars in the bar graph is 1, i.e. \sum_{i=0}^{n} p(x_i) = 1.

- Some events (for example obtaining 1 or 2 heads) appear more often than others, and are thus associated with larger values of the probability. (Even though each underlying event is equally likely, there are several combinations that contribute to the result of one head.)

There is a pattern in the probability we computed theoretically for k, the number of heads obtained in n tosses of a fair coin. The pattern so far is:

Probability of k heads in n tosses of a fair coin = [number of possible ways to obtain k heads] × (1/2)^n.

So far, to determine the number of possible ways to obtain k heads, i.e. the factor in the square brackets, we have listed all the possibilities and counted those that have exactly that many heads. This would become very tedious, especially for a large number of tosses, n. For this reason, part of this chapter will be a diversion into the investigation of permutations and combinations. These results will help us understand what factor goes into the square brackets in the above expression. In the case of the fair coin examined here, p = q = 1/2; this accounts for the factor (1/2)^n. We will see later that this result is modified somewhat when the coin is biased, i.e. not fair, so that the probability of H is not the same as the probability of T, i.e. p ≠ q. In that case the factor (1/2)^n will be modified (to p^k q^{n-k}, as discussed later).
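The whole three-toss analysis (the distribution in Table 8.3, the mean 1.5, and the variance 0.75) can be reproduced by enumerating all 2^3 equally likely outcomes; a sketch:

```python
from itertools import product

# Enumerate all 2^3 outcomes of three fair-coin tosses, group by heads.
outcomes = list(product("HT", repeat=3))
dist = {k: sum(1 for o in outcomes if o.count("H") == k) / len(outcomes)
        for k in range(4)}
print(dist)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}

mean = sum(k * p for k, p in dist.items())
var = sum((k - mean) ** 2 * p for k, p in dist.items())
print(mean, var)  # 1.5 0.75
```

The dictionary reproduces Table 8.3 (1/8, 3/8, 3/8, 1/8), and the last line confirms the values \bar{x} = 1.5 and V = 0.75 computed above.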
Here we have looked at examples where the probability of each event was (assumed to be) the same. We will generalize this to unequal probabilities further on in this chapter. The above motivation leads us to consider the subject of permutations and combinations. v January 5,
8.9 How many possible ways are there of getting k heads in n tosses? Permutations and combinations

In computing theoretical probability, we often have to count the number of possible ways there are of obtaining a given type of outcome. So far, it has been relatively easy to simply display all the possibilities and group them into classes (0, 1, 2, etc. heads out of n tosses). This is not always the case. When the number of repetitions of an experiment grows, it may be very difficult and tedious to list all possibilities. We develop some shortcuts to figure out, in general, how many ways there are of getting each type of outcome. This will make the job of computing theoretical probability easier. In this section we introduce some notation and then summarize general features of combinations and permutations to help in counting the possibilities.

Factorial notation

Let n be an integer, n ≥ 0. Then n!, called "n factorial", is defined as the following product of integers:

n! = n(n-1)(n-2)...(2)(1).

Example:

1! = 1
2! = 2 · 1 = 2
3! = 3 · 2 · 1 = 6
4! = 4 · 3 · 2 · 1 = 24
5! = 5 · 4 · 3 · 2 · 1 = 120

We also define 0! = 1.

Permutations

A permutation is a way of arranging objects, where the order of appearance of the objects is important.
Figure 8.3: This diagram illustrates the meanings of permutations and combinations. (a) The number of permutations (ways of arranging) n distinct objects into n slots: there are n choices for the first slot, and for each of these, there are n - 1 choices for the second slot, etc. In total there are n! ways of arranging these objects. (Note that the order of the objects is important here.) (b) The number of permutations of n objects into k slots, P(n, k), is the product n(n-1)(n-2)...(n-k+1), which can also be written as the ratio of factorials n!/(n-k)!. (c) The number of combinations of n objects taken in groups of k is called C(n, k); here the order is not important. The step shown in (b) is equivalent to the two steps shown in (c). This means that there is a relationship between P(n, k) and C(n, k), namely P(n, k) = k! C(n, k).
Example 6: Given the three cards Jack, Queen, King, we could permute them to form the sequences

JQK, JKQ, QKJ, QJK, KQJ, KJQ.

We observe that there are six possible arrangements (permutations). Other than explicitly listing all the arrangements, as done here (possible only for small sets of objects), we could arrive at this fact by reasoning as follows. Consider the possible slots that can be filled by the three cards. We have three choices of what to put in the first slot (J or K or Q). This uses up one card, so for each of the above choices, we then have only two choices for what to put in the second slot. The third slot leaves no choice: we must put our remaining card in it. Thus the total number of possibilities, i.e. the total number of permutations of the three cards, is 3 · 2 · 1 = 6.

A feature of this argument is that it can easily be generalized to any number of objects. For example, given N = 10 different cards, we would reason similarly that as we fill in ten slots, we can choose any one of 10 cards for the first slot, any of the remaining 9 for the next slot, etc., so that the number of permutations is 10 · 9 · 8 · ... · 1 = 10! = 3,628,800. We can summarize our observation in the following statement:

The number of permutations (arrangements) of n objects is n!. (See Figure 8.3(a).)

Recall that the factorial notation n! was defined earlier in this section.

Example 7: How many different ways are there to display five distinct playing cards?

Solution: The answer is 5! = 120. Here the order in which the cards are shown is important.

Suppose we have n objects and we randomly choose k of these to put into k boxes (one per box). Assume k < n.
For example, we may have a collection of distinct objects and must choose some of them (in order) so as to fill up 4 slots. We ask: how many ways are there of arranging n objects taken k at a time? As in our previous argument, the first slot to fill comes with a choice of n objects (for n possibilities). This uses up one object, leaving (n-1) to choose from at the next stage. (For each of the n first choices there are (n-1) choices for slot 2, forming the product n(n-1).) In the third slot, we have to choose among (n-2) remaining objects, etc. By the time we arrive at the k-th slot, we have (n-k+1) choices. Thus, in total, the number of ways that we can form such arrangements of n objects into k slots, represented by the notation P(n, k), is

P(n, k) = n(n-1)(n-2)...(n-k+1).

We can also express this in factorial notation, by multiplying and dividing by (n-k)!:

P(n, k) = \frac{n(n-1)(n-2)...(n-k+1) · (n-k)!}{(n-k)!} = \frac{n!}{(n-k)!}.

These remarks motivate the following observation:

The number of permutations of n objects taken k at a time is P(n, k) = n!/(n-k)!. (See Figure 8.3(b).)

Combinations and binomial coefficients

How many ways are there to choose k objects out of a set of n objects if the order of the selection does not matter? For example, if we have a class of 10 students, how many possible pairs of students can be formed for a team project? In the case that the order of the objects is not important, we refer to the number of possible combinations of n objects taken k at a time by the notation C(n, k) or, more commonly, by

C(n, k) = \binom{n}{k} = \frac{n!}{(n-k)! k!}.

Note that two notations are commonly used to refer to the same concept. We will henceforth use mainly the notation C(n, k). We can read this notation as "n choose k". The values C(n, k) are also called the binomial coefficients, for reasons that will shortly become apparent.
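The permutation and combination counts above are available directly in the standard library; a sketch reproducing Example 6 and the "pairs from a class of 10" question (the numeric inputs here are my own choices):

```python
from itertools import permutations
from math import comb, factorial, perm

# All orderings of the three cards J, Q, K: 3! = 6 of them.
print(sorted(''.join(p) for p in permutations("JQK")))

# Permutations of n objects taken k at a time: P(n, k) = n!/(n-k)!
print(perm(10, 3), factorial(10) // factorial(7))  # 720 720

# Combinations ignore order: C(10, 2) pairs from a class of 10 students.
print(comb(10, 2))  # 45
```

`perm(10, 3)` and the explicit factorial ratio agree, which is exactly the identity P(n, k) = n!/(n-k)! derived above.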
As shown in Figure 8.3(b, c), combinations are related to permutations in the following way. To find the number of permutations of n objects taken k at a time, P(n, k), we would
first choose k out of n objects. The number of ways of doing this is

C(n, k) = \binom{n}{k}.

Then we find all the permutations of the k chosen objects. We have discussed that there are k! ways of arranging (i.e. permuting) k objects. Thus

P(n, k) = \binom{n}{k} k!.

The above remarks lead to the following conclusion:

The number of combinations of n objects taken k at a time, read "n choose k" (also called the binomial coefficient C(n, k)), is:

C(n, k) = \binom{n}{k} = \frac{P(n, k)}{k!} = \frac{n!}{k!(n-k)!}.

We can observe an interesting symmetry property, namely that

\binom{n}{k} = \binom{n}{n-k}, or C(n, k) = C(n, n-k).

It is worth noting that the binomial coefficients are the entries that occur in Pascal's triangle:

1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1

Each term in Pascal's triangle is obtained by adding the two terms diagonally above it. The top of the triangle represents C(0, 0) and is associated with n = 0. The next row represents C(1, 0) and C(1, 1). For row number n, the terms along the row are the binomial coefficients C(n, k), starting with k = 0 at the beginning of the row and going to k = n at the end of the row. For example, we see above that C(5, 2) = C(5, 3) = 10. The triangle can be continued by including subsequent rows; this is left as an exercise for the reader.

Example: How many ways are there of getting k heads in n tosses of a fair coin?
Solution

This problem motivated our discussion of permutations and combinations, and we can now answer it. The number of possible ways of obtaining k heads in n tosses is C(n, k) = n!/(k!(n - k)!). Thus the probability of getting any outcome consisting of k heads when a fair coin is tossed n times is:

For a fair coin, P(k heads in n tosses) = n!/(k!(n - k)!) * (1/2)^n.

The term containing the power (1/2)^n is the probability of any one specific sequence of H's and T's. The multiplier in front, which as we have seen is the binomial coefficient C(n, k), counts how many such sequences have exactly k H's and all the rest (i.e. n - k) T's. (The greater the number of such sequences, the more likely it is that one of them occurs.)

Example

How many combinations can be made out of 5 objects if we take 1, or 2, or 3, etc. objects at a time?

Solution

Here the order of the objects is not important. The number of ways of taking 5 objects k at a time (where 0 <= k <= 5) is C(5, k). For example, the number of combinations of 5 objects taken 3 at a time is

C(5, 3) = 5!/(3!(5 - 3)!) = (5 * 4 * 3 * 2 * 1)/((3 * 2 * 1)(2 * 1)) = (5 * 4)/2 = 10.

The list of all the coefficients C(5, k) appears as the last row displayed above in Pascal's triangle.

Example

How many different 5-card hands can be formed from a deck of 52 ordinary playing cards?
Solution

We are not concerned here with the order of appearance of the cards that are dealt, only with the hand (i.e. the composition of the final collection of 5 cards). Thus we are asking how many combinations of 5 cards can be drawn from a set of 52 cards. The answer is

C(52, 5) = 52!/(5!(52 - 5)!) = 2,598,960.

The binomial theorem

An interesting application of combinations is the formula for the expansion of products of the form (a + b)^n, known as the binomial expansion. Consider the simple example

(a + b)^2 = (a + b)(a + b).

We expand this by multiplying each of the terms in the first factor by each of the terms in the second factor:

(a + b)^2 = a^2 + ab + ba + b^2.

However, the order of the factors in ab and ba does not matter, so we combine these two identical terms and express the result as

(a + b)^2 = a^2 + 2ab + b^2.

Similarly, consider the product (a + b)^3 = (a + b)(a + b)(a + b). Now, to form the expansion, each term of the result is a product of three letters, one chosen from each of the three factors. This leads to an expansion of the form

(a + b)^3 = a^3 + 3a^2 b + 3ab^2 + b^3.

More generally, consider a product of the form

(a + b)^n = (a + b)(a + b)...(a + b)    (n factors).

By analogy, we expect to see terms of the form shown below in the expansion of this binomial, i.e.

(a + b)^n = a^n + [] a^(n-1) b + [] a^(n-2) b^2 + ... + [] a^(n-k) b^k + ... + [] ab^(n-1) + b^n.

The first and last terms are accompanied by the coefficient 1, since such terms can occur in only one way each. However, we must still fill in the boxes with coefficients that reflect the number of times that a term of the given form a^(n-k) b^k occurs. Such a term is made by choosing b's from k of the n factors (and picking a's from all the rest). We already know how many ways there are of selecting k items out of a collection of n, namely, the binomial coefficients.
Thus

(a + b)^n = a^n + C(n, 1) a^(n-1) b + C(n, 2) a^(n-2) b^2 + ... + C(n, k) a^(n-k) b^k + ... + C(n, 2) a^2 b^(n-2) + C(n, 1) a b^(n-1) + b^n,

where the binomial coefficients are as defined earlier in this section. We have used the symmetry property C(n, k) = C(n, n - k) in the coefficients of this expansion.
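One way to see the binomial theorem at work is to generate the rows of Pascal's triangle by the addition rule and check that row n reproduces the coefficients C(n, k), and that the expansion formula agrees numerically with (a + b)^n. A small sketch (Python, with the standard-library `math.comb` standing in for C(n, k)):

```python
from math import comb

def pascal_row(n):
    """Build row n of Pascal's triangle by repeatedly adding adjacent pairs."""
    row = [1]
    for _ in range(n):
        # each interior entry is the sum of the two entries diagonally above it
        row = [1] + [a + b for a, b in zip(row, row[1:])] + [1]
    return row

print(pascal_row(5))                       # [1, 5, 10, 10, 5, 1]

# Numerical check of the binomial theorem for a = 2, b = 3, n = 5:
a, b, n = 2, 3, 5
expansion = sum(comb(n, k) * a**(n - k) * b**k for k in range(n + 1))
print(expansion, (a + b)**n)               # both equal 3125
```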
8.9.8 Example

Find the expansion of the expression (a + b)^5.

Solution

The coefficients needed in this expansion are formed from C(5, k). We have already calculated these binomial coefficients in an earlier example, namely, 1, 5, 10, 10, 5, 1. Thus the desired expansion is

(a + b)^5 = a^5 + 5a^4 b + 10a^3 b^2 + 10a^2 b^3 + 5ab^4 + b^5.

A coin toss experiment and the binomial distribution

A Bernoulli trial is an experiment that has only two possible results. A typical example of this type is the coin toss. We have already studied results of Bernoulli trials in this chapter. Here we expand our investigation to consider more general cases and their distributions.

We have already examined in detail an example of a Bernoulli trial in which each outcome is equally likely, i.e. a coin toss with P(H) = P(T). In this section we will drop the assumption that each event is equally likely and examine a more general case. If we do not know that the coin is fair, we might assume that the probability that it lands on H is p and on T is q. That is,

P(H) = p,   P(T) = q.

In general, p and q may not be exactly equal. By the properties of probabilities,

p + q = 1.

Consider the following specific outcome of an experiment in which an (unfair) coin is tossed 10 times:

T T H T H H T T T H

Assuming that each toss is independent of the other tosses, we find that the probability of this event is

q * q * p * q * p * p * q * q * q * p = p^4 q^6.

The probability of this specific event is the same as the probability of the specific event HHHHTTTTTT, since each event has the same number of H's and T's. The probability of each such event is a product of factors of p and q (one for each H or T that appears). Further, the number of ways of obtaining an outcome with a specific number of H's (for example, four H's as in this illustration) is the same whether the coin is fair or not. (That number is a combination, i.e. the binomial coefficient C(n, k), as before.)
The probability of getting any outcome with k heads when the (possibly unfair) coin is tossed n times is thus a simple generalization of the probability for a fair coin:
The binomial distribution

Given a (possibly unfair) coin with P(H) = p and P(T) = q, where p + q = 1, if the coin is tossed n times, the probability of getting exactly k heads is given by

P(k heads out of n tosses) = C(n, k) p^k q^(n-k) = n!/(k!(n - k)!) * p^k q^(n-k).

We refer to this distribution as the binomial distribution. In the case of a fair coin, p = q = 1/2 and the factor p^k q^(n-k) reduces to (1/2)^n.

Having obtained the form of the binomial distribution, we wish to show that the probability of obtaining some one of the possible outcomes is 1, i.e. that

P(0 or 1 or 2 or ... or n heads out of n tosses) = 1.

We can show this with the following calculation. For each toss,

(p + q) = 1.

Raising each side to the power n,

(p + q)^n = 1^n = 1,

but by the binomial theorem, the expression on the left can be expanded to form

(p + q)^n = p^n + C(n, 1) p^(n-1) q + C(n, 2) p^(n-2) q^2 + ... + C(n, k) p^k q^(n-k) + ... + C(n, 2) p^2 q^(n-2) + C(n, 1) p q^(n-1) + q^n,

that is,

(p + q)^n = Σ_{k=0}^{n} C(n, k) p^k q^(n-k).

Therefore, since (p + q)^n = 1, it follows that

Σ_{k=0}^{n} C(n, k) p^k q^(n-k) = 1.

Thus Σ_{k=0}^{n} P(k heads out of n tosses) = 1, verifying the desired relationship.

Remark: Each term in the above expansion can be interpreted as the probability of a certain type of event. The first term is the probability of tossing exactly n heads: there is only one way this can happen, accounting for the coefficient 1. The last term is the probability of tossing exactly n tails. The product p^k q^(n-k) is the probability of one particular sequence containing k heads and n - k tails; but there are many ways of generating that type of sequence, and C(n, k) is the number of distinct sequences that all count as "k heads". Thus the combined probability of getting any of the events in which there are k heads is C(n, k) p^k q^(n-k).

Example

Suppose P(H) = p = 0.1. What is the probability of getting 3 heads if this unfair coin is tossed 5 times?
Solution

From the above results,

P(3 heads out of 5 tosses) = C(5, 3) p^3 q^2.

But p = 0.1 and q = 1 - p = 0.9, so

P(3 heads out of 5 tosses) = 10 * (0.1)^3 * (0.9)^2 = 10 * (0.001)(0.81) = 0.0081.

Mean of a binomial distribution

A binomial distribution has a particularly simple mean. The important result established by the calculation in this section is as follows:

Consider a Bernoulli trial in which the probability of event e1 is p. If this trial is repeated n times, the mean of the resulting binomial distribution, i.e. the expected number of times that event e1 occurs, is

x̄ = np.

Thus the mean of a binomial distribution is the number of repetitions multiplied by the probability of the event in a single trial.

We here verify this simple formula for the mean of a binomial distribution. The calculation uses several properties of series that were established in Chapter 1, and is presented for completeness rather than importance; but the result (in the box above) is very useful. By definition of the mean,

x̄ = Σ_{k=0}^{n} x_k p(x_k).

Here x_k = k is the number of heads obtained, and p(x_k) = P(k heads in n tosses) = C(n, k) p^k q^(n-k) is the distribution of k heads in n tosses computed in this chapter. Then

x̄ = Σ_{k=0}^{n} k C(n, k) p^k q^(n-k) = Σ_{k=1}^{n} k * n!/(k!(n - k)!) * p^k q^(n-k),

where in the last sum we have dropped the k = 0 term, since it makes no contribution to the total. The numerators in the sum contain factors of the form k * n(n - 1)...(n - k + 1) and the denominators contain k! = k(k - 1)...(2)(1). We can cancel one factor of k from top and bottom, and we can also take one common factor of n out of the sum:

x̄ = Σ_{k=1}^{n} n(n - 1)...(n - k + 1)/(k - 1)! * p^k q^(n-k) = n Σ_{k=1}^{n} (n - 1)...(n - k + 1)/(k - 1)! * p^k q^(n-k).

We now shift the sum by defining the replacement index l = k - 1, so that k = l + 1; when k = 1, l = 0, and when k = n, l = n - 1. We replace the indices and take one common factor of p out of the sum:

x̄ = n Σ_{l=0}^{n-1} (n - 1)...(n - l)/l! * p^(l+1) q^(n-l-1) = np Σ_{l=0}^{n-1} (n - 1)!/(l!(n - l - 1)!) * p^l q^(n-l-1).
Let m = n - 1. Then

x̄ = np Σ_{l=0}^{m} m!/(l!(m - l)!) * p^l q^(m-l) = np,

where in the last step we have used the fact that Σ_{l=0}^{m} C(m, l) p^l q^(m-l) = (p + q)^m = 1, i.e. that the binomial probabilities sum to 1. This verifies the result.

A continuous distribution

[Figure 8.4: The Normal (or Gaussian) distribution, given by equation (8.1).]

If we were to repeat the experiment of N coin tosses a large number of times, we would see a certain trend: there would be a peak in the distribution at the outcome of getting heads 50% of the time, i.e. at N/2 heads. A fact which we state but do not prove here is that the probability of exactly N/2 heads behaves like

p(N/2) ≈ sqrt(2/(πN)).

This can also be written in the form

p(N/2) * sqrt(N)/2 ≈ 1/sqrt(2π) = const.
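The stated approximation for the central probability is easy to test numerically against the exact value C(N, N/2)/2^N. A quick check (this sketch is ours, not part of the original text):

```python
from math import comb, pi, sqrt

def central_probability(N):
    """Exact probability of exactly N/2 heads in N tosses of a fair coin."""
    return comb(N, N // 2) / 2**N

for N in (10, 100, 1000):
    exact = central_probability(N)
    approx = sqrt(2 / (pi * N))
    print(N, exact, approx)   # the two columns agree better as N grows
```

For N = 100 the exact and approximate values already agree to within about a quarter of a percent.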
One finds that the shapes of the various distributions are similar, but that a scale factor of sqrt(N)/2 is applied to stretch the graph horizontally while compressing it vertically, preserving its total area. The graph is also shifted so that its peak occurs at N/2.

As the number of Bernoulli trials grows, i.e. as we toss our imaginary coin in longer and longer sets (N → ∞), a remarkable thing happens to the binomial distribution: it becomes smoother and smoother, until it grows to resemble a continuous distribution that looks like a bell curve. That curve is known as the Gaussian or normal distribution. If we scale this curve vertically and horizontally (stretch vertically and compress horizontally by the factor sqrt(N)/2) and shift its peak to x = 0, then we find a distribution that describes the deviation from the expected value of 50% heads. The resulting function is of the form

p(x) = (1/sqrt(2π)) e^(-x²/2).    (8.1)

We will study properties of this (and other) continuous distributions in a later section. We show a typical example of the normal distribution in Figure 8.4. Its cumulative distribution is then shown (with the original distribution superimposed) in Figure 8.5.

[Figure 8.5: The normal probability density with its corresponding cumulative distribution.]

Summary

In this chapter, we introduced the notion of probability of elementary events. We learned that a probability is always a number between 0 and 1, and that the sum of the (discrete) probabilities of all possible (discrete) outcomes is 1. We then described how to combine probabilities of elementary events to calculate probabilities of compound independent events in a variety of simple experiments. We defined the notion of a Bernoulli trial, such as the tossing of a coin, and studied this in detail.
We investigated a number of ways of describing results of experiments, whether in tabular or graphical form, and we used the distribution of results to define simple numerical descriptors. The
mean is a number that, roughly speaking, describes the location of the center of the distribution (analogous to a center of mass). It is defined as follows:

The mean (expected value) x̄ of a probability distribution is

x̄ = Σ_{i=0}^{n} x_i p(x_i).

The standard deviation is, roughly speaking, the width of the distribution. The standard deviation σ is

σ = sqrt(V),

where V is the variance,

V = Σ_{i=0}^{n} (x_i - x̄)² p(x_i).

While the chapter was motivated by the results of a real experiment, we then investigated theoretical distributions, including the binomial. We found that the distribution of events in n repetitions of a Bernoulli trial (e.g. a coin tossed n times) is the binomial distribution, and we computed the mean of that distribution. Suppose that the probability of one of the events, say event e1, in a single Bernoulli trial is p (and hence the probability of the other event e2 is q = 1 - p). Then

P(k occurrences of the given event out of n trials) = n!/(k!(n - k)!) * p^k q^(n-k).

This is called the binomial distribution. The mean of the binomial distribution, i.e. the mean number of events e1 in n repeated Bernoulli trials, is

x̄ = np.
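The summary formulas can be checked against each other: compute the mean and standard deviation of a binomial distribution directly from the definitions x̄ = Σ x_i p(x_i) and σ = sqrt(Σ (x_i - x̄)² p(x_i)), and confirm that the mean equals np. A sketch in Python (the names are illustrative):

```python
from math import comb, sqrt

def binomial_pmf(n, p):
    """Return the list [P(0 heads), ..., P(n heads)] for n Bernoulli trials."""
    q = 1 - p
    return [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

def mean(dist):
    """Mean: x-bar = sum of x_i * p(x_i), with x_i = i."""
    return sum(i * prob for i, prob in enumerate(dist))

def std_dev(dist):
    """Standard deviation: sqrt of sum of (x_i - x-bar)^2 * p(x_i)."""
    m = mean(dist)
    return sqrt(sum((i - m)**2 * prob for i, prob in enumerate(dist)))

dist = binomial_pmf(10, 0.3)
print(mean(dist))      # np = 10 * 0.3 = 3, up to rounding
print(std_dev(dist))   # matches sqrt(n*p*q) = sqrt(10 * 0.3 * 0.7)
```

The second printed value, sqrt(npq), is the standard deviation of the binomial distribution, a companion fact to the mean np stated above.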
More informationCHAPTER 2 Estimating Probabilities
CHAPTER 2 Estimating Probabilities Machine Learning Copyright c 2016. Tom M. Mitchell. All rights reserved. *DRAFT OF January 24, 2016* *PLEASE DO NOT DISTRIBUTE WITHOUT AUTHOR S PERMISSION* This is a
More informationA Few Basics of Probability
A Few Basics of Probability Philosophy 57 Spring, 2004 1 Introduction This handout distinguishes between inductive and deductive logic, and then introduces probability, a concept essential to the study
More informationChapter 4: Probability and Counting Rules
Chapter 4: Probability and Counting Rules Learning Objectives Upon successful completion of Chapter 4, you will be able to: Determine sample spaces and find the probability of an event using classical
More information7.S.8 Interpret data to provide the basis for predictions and to establish
7 th Grade Probability Unit 7.S.8 Interpret data to provide the basis for predictions and to establish experimental probabilities. 7.S.10 Predict the outcome of experiment 7.S.11 Design and conduct an
More informationMath Review Large Print (18 point) Edition Chapter 4: Data Analysis
GRADUATE RECORD EXAMINATIONS Math Review Large Print (18 point) Edition Chapter 4: Data Analysis Copyright 2010 by Educational Testing Service. All rights reserved. ETS, the ETS logo, GRADUATE RECORD EXAMINATIONS,
More informationConditional Probability, Independence and Bayes Theorem Class 3, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom
Conditional Probability, Independence and Bayes Theorem Class 3, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom 1 Learning Goals 1. Know the definitions of conditional probability and independence
More information4. Continuous Random Variables, the Pareto and Normal Distributions
4. Continuous Random Variables, the Pareto and Normal Distributions A continuous random variable X can take any value in a given range (e.g. height, weight, age). The distribution of a continuous random
More informationELEMENTARY PROBABILITY
ELEMENTARY PROBABILITY Events and event sets. Consider tossing a die. There are six possible outcomes, which we shall denote by elements of the set {A i ; i =1, 2,...,6}. A numerical value is assigned
More informationProbability and Statistics
CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b  0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute  Systems and Modeling GIGA  Bioinformatics ULg kristel.vansteen@ulg.ac.be
More informationWHERE DOES THE 10% CONDITION COME FROM?
1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay
More information4. Joint Distributions
Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 4. Joint Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space. Suppose
More informationMath 141. Lecture 3: The Binomial Distribution. Albyn Jones 1. 1 Library 304. jones/courses/141
Math 141 Lecture 3: The Binomial Distribution Albyn Jones 1 1 Library 304 jones@reed.edu www.people.reed.edu/ jones/courses/141 Outline Coin Tossing Coin Tosses Independent Coin Tosses Crucial Features
More informationMathematical goals. Starting points. Materials required. Time needed
Level S2 of challenge: B/C S2 Mathematical goals Starting points Materials required Time needed Evaluating probability statements To help learners to: discuss and clarify some common misconceptions about
More informationPythagorean Triples. Chapter 2. a 2 + b 2 = c 2
Chapter Pythagorean Triples The Pythagorean Theorem, that beloved formula of all high school geometry students, says that the sum of the squares of the sides of a right triangle equals the square of the
More informationLab 11. Simulations. The Concept
Lab 11 Simulations In this lab you ll learn how to create simulations to provide approximate answers to probability questions. We ll make use of a particular kind of structure, called a box model, that
More informationChapter 4. Probability and Probability Distributions
Chapter 4. robability and robability Distributions Importance of Knowing robability To know whether a sample is not identical to the population from which it was selected, it is necessary to assess the
More informationE3: PROBABILITY AND STATISTICS lecture notes
E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................
More informationReady, Set, Go! Math Games for Serious Minds
Math Games with Cards and Dice presented at NAGC November, 2013 Ready, Set, Go! Math Games for Serious Minds Rande McCreight Lincoln Public Schools Lincoln, Nebraska Math Games with Cards Close to 20 
More informationMath/Stats 425 Introduction to Probability. 1. Uncertainty and the axioms of probability
Math/Stats 425 Introduction to Probability 1. Uncertainty and the axioms of probability Processes in the real world are random if outcomes cannot be predicted with certainty. Example: coin tossing, stock
More informationLecture 2 Binomial and Poisson Probability Distributions
Lecture 2 Binomial and Poisson Probability Distributions Binomial Probability Distribution l Consider a situation where there are only two possible outcomes (a Bernoulli trial) H Example: u flipping a
More informationInduction. Margaret M. Fleck. 10 October These notes cover mathematical induction and recursive definition
Induction Margaret M. Fleck 10 October 011 These notes cover mathematical induction and recursive definition 1 Introduction to induction At the start of the term, we saw the following formula for computing
More informationHoover High School Math League. Counting and Probability
Hoover High School Math League Counting and Probability Problems. At a sandwich shop there are 2 kinds of bread, 5 kinds of cold cuts, 3 kinds of cheese, and 2 kinds of dressing. How many different sandwiches
More informationDiscrete Math in Computer Science Homework 7 Solutions (Max Points: 80)
Discrete Math in Computer Science Homework 7 Solutions (Max Points: 80) CS 30, Winter 2016 by Prasad Jayanti 1. (10 points) Here is the famous Monty Hall Puzzle. Suppose you are on a game show, and you
More informationJan 17 Homework Solutions Math 151, Winter 2012. Chapter 2 Problems (pages 5054)
Jan 17 Homework Solutions Math 11, Winter 01 Chapter Problems (pages 0 Problem In an experiment, a die is rolled continually until a 6 appears, at which point the experiment stops. What is the sample
More informationMathematics Vocabulary List For PreAlgebra
Mathematics Vocabulary List For PreAlgebra 1. Absolute Value  the distance from a number to zero on a number line. It is always positive. l6 l = 6 l 25 l = 25 l 6 l = 6 l 25 l = 25 2. Associative Property
More informationPattern matching probabilities and paradoxes A new variation on Penney s coin game
Osaka Keidai Ronshu, Vol. 63 No. 4 November 2012 Pattern matching probabilities and paradoxes A new variation on Penney s coin game Yutaka Nishiyama Abstract This paper gives an outline of an interesting
More information