Summary of some Rules of Probability with Examples
CEE 201L. Uncertainty, Design, and Optimization
Department of Civil and Environmental Engineering, Duke University
Henri P. Gavin, Spring 2016

Introduction

Engineering analysis involves operations on input data (e.g., elastic modulus, wind speed, static loads, temperature, mean rainfall rate) to compute output data (e.g., truss bar forces, evaporation rates, stream flows, reservoir levels) in order to assess the safety, serviceability, efficiency, and suitability of the system being analyzed. There can be considerable uncertainty regarding the input data, but this uncertainty can often be characterized by a range of values (minimum and maximum expected values), an average expected value with some level of variability, or a distribution of values. For example, Figure 1 shows probability distributions of daily precipitation and daily maximum and minimum temperatures for Durham NC (27705) from June 1990 to June 2013. Clearly, any calculated prediction of the level of Lake Michie involves temperature and rainfall rate. Long-term predictions of lake levels should be reported in terms of expected values and uncertainty or variability, reflecting uncertainties in the input data. These calculations involve the propagation of uncertainty. In this course you will propagate uncertainty to determine the probability of failure of the systems you design. This kind of failure analysis requires:
1. the identification of all possible modes of failure;
2. the evaluation of failure probabilities for each failure mode; and
3. the combination of these failure probabilities to determine an overall failure probability.
This, in turn, requires methods based on the theory of sets (e.g., the union and intersection of sets and their complements) and the theory of probability (e.g., the probability that an event belongs to a particular set among all possible sets). In the following sections probability concepts are illustrated using Venn diagrams.
In these diagrams the black rectangle represents the space of all possible events with an associated probability of 1. The smaller rectangles represent subsets and their intersections. It is helpful to associate the areas of these rectangles with the probability of occurrence of the associated events. So think of the black rectangle as having an area of 1 and the smaller rectangles as having areas less than 1.
Figure 1. Probability Density Functions and Cumulative Distribution Functions of precipitation, minimum temperature, and maximum temperature in Durham NC from June 1990 to June 2013.
Rules of Probability

Complementary Events

If the probability of event A occurring is P[A], then the probability of event A not occurring, P[A′], is given by

    P[A′] = 1 - P[A].    (1)

Example: This and the following examples pertain to traffic and accidents on a certain stretch of highway from 8am to 9am on work-days. If a year has 251 work-days and 226 work-days with no accident (on the stretch of highway between 8am and 9am), the probability of a work-day with no accident is 226/251 = 0.90. The probability of a work-day with an accident (one or more) is 1 - 0.90 = 0.10.

Mutually Exclusive (Non-Intersecting) Events (ME)

Mutually exclusive events cannot occur together; one precludes the other. If A and B are mutually exclusive events, then

    P[A and B] = 0.    (2)

All complementary events are mutually exclusive, but not vice versa.

Example: If X represents the random flow of traffic in cars/minute, the following sets of events are mutually exclusive: 0 < X ≤ 5, 5 < X ≤ 10, and 20 < X ≤ 50. The pair of events X = 0 and 20 < X ≤ 50 are mutually exclusive but not complementary.

Collectively Exhaustive Events (CE)

If A and B are collectively exhaustive events, then the set [A or B] represents all possible events, and

    P[A or B] = 1.    (3)

A pair of complementary events is both ME and CE. The sum of the probabilities of a set of ME and CE events must be 1.

Example: Let Y represent the random number of accidents from 8am to 9am on a work-day. If P[Y = 0] = 0.90 (no accident) and P[Y = 1] = 0.04, then P[Y > 1] = 1 - 0.90 - 0.04 = 0.06.

Example: The following sets are mutually exclusive and collectively exhaustive: X = 0, 0 < X ≤ 5, 5 < X ≤ 10, 10 < X ≤ 20, 20 < X ≤ 50, and 50 < X. If the < were replaced by ≤, the sets would be CE but not ME.
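The complement rule can be checked numerically. A minimal sketch using the accident counts from the example above (226 of 251 work-days with no accident):

```python
# Complement rule P[A'] = 1 - P[A] (equation 1), using the example's counts:
# 226 of 251 work-days had no accident on the stretch of highway.
work_days = 251
no_accident_days = 226

p_no_accident = no_accident_days / work_days  # P[A'], about 0.90
p_accident = 1.0 - p_no_accident              # P[A],  about 0.10
```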
Conditionally Dependent Events

If the occurrence (or non-occurrence) of an event B affects the probability of event A, then A and B are conditionally dependent;

    P[A|B] = P[A and B] / P[B]    (4)

In this equation the set B can be thought of as a more restrictive set of all possibilities. The set B is therefore the new set of all possibilities, and the probability of A given B is represented as the area associated with the intersection of A and B (the green rectangle) divided by the area associated with the set B (the red rectangle).

Example: The term P[Y > 1 | 10 < X ≤ 20] denotes the probability of more than one accident from 8am to 9am given that the traffic flow is between 11 and 20 cars per minute (inclusive).

If A and B are mutually exclusive, the occurrence of one implies the non-occurrence of the other, and therefore a dependence between A and B. So P[A|B] = P[B|A] = 0. If A and B are mutually exclusive, A is a subset of B′, and P[A′|B] = 1. So all mutually exclusive events are conditionally dependent, but not vice versa.

Example: The events [Y > 1] and [10 < X ≤ 20] are dependent but not mutually exclusive.

Statistically Independent Events

If the occurrence (or non-occurrence) of event B has no bearing on the probability of A occurring, then A and B are statistically independent;

    P[A|B] = P[A] and P[B|A] = P[B].    (5)

Example: If Z represents the random number of migrating geese above the same stretch of highway from 9am to 10am on a work-day, P[Y > 1 | Z > 20] = P[Y > 1].

Intersecting sets of events can be dependent or independent.

Example: The events [Y > 1] and [Z > 20] can occur on the same day (are not mutually exclusive; are intersecting) but are independent.

A pair of events cannot be both mutually exclusive and independent.

Example: The events [Z ≤ 10] and [Z > 20] are mutually exclusive. But the occurrence of one implies the non-occurrence of the other, and therefore a dependence.
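Equation (4) can be illustrated with event counts. The tallies below are hypothetical, chosen so the resulting probabilities match the traffic example (P[10 < X ≤ 20] = 0.25 and P[Y > 1 | 10 < X ≤ 20] = 0.05):

```python
# Conditional probability P[A|B] = P[A and B] / P[B] (equation 4), sketched
# with hypothetical tallies over 10,000 work-days (not data from the notes).
total_days = 10_000
days_B = 2_500        # days with 10 < X <= 20       (hypothetical count)
days_A_and_B = 125    # days with both 10 < X <= 20 and Y > 1 (hypothetical)

p_B = days_B / total_days
p_A_and_B = days_A_and_B / total_days
p_A_given_B = p_A_and_B / p_B   # restrict the sample space to B
```

Restricting the denominator to the days on which B occurred is exactly the "new set of all possibilities" described above.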
Intersection of Events

If A and B are conditionally dependent events, then (from equation (4)) the intersection is

    P[A and B] = P[A ∩ B] = P[AB] = P[A|B] P[B]    (6)

Example: If P[Y > 1 | 10 < X ≤ 20] = 0.05 and P[10 < X ≤ 20] = 0.25, then P[Y > 1 ∩ 10 < X ≤ 20] = (0.05)(0.25) = 0.0125.

If A and B are independent events, P[A|B] = P[A], and

    P[A and B] = P[A ∩ B] = P[AB] = P[A] P[B]    (7)

Example: P[Y > 1 | Z ≤ 20] = P[Y > 1]. If P[Z > 20] = 0.04, then P[Y > 1 ∩ Z ≤ 20] = (0.06)(0.96) = 0.0576.

If A, B, and C are dependent events:

    P[A ∩ B ∩ C] = P[ABC] = P[A|BC] P[BC] = P[A|BC] P[B|C] P[C].    (8)

If A, B, and C are independent events, P[A|BC] = P[A], P[B|C] = P[B], and

    P[A ∩ B ∩ C] = P[ABC] = P[A] P[B] P[C]    (9)

Union of Events

For any two events A and B,

    P[A or B] = P[A ∪ B] = P[A] + P[B] - P[A and B].    (10)

Example: P[Y > 1 ∪ 10 < X ≤ 20] = 0.06 + 0.25 - 0.0125 = 0.2975.

If A and B are mutually exclusive, P[A and B] = 0 and

    P[A or B] = P[A ∪ B] = P[A] + P[B].    (11)

Example: P[Y = 0 ∪ Y > 1] = 0.90 + 0.06 = 0.96, the same as 1 - P[Y = 1] = 1 - 0.04.

For any n events E1, E2, ..., En,

    P[E1 ∪ E2 ∪ ... ∪ En] = 1 - P[E1′ ∩ E2′ ∩ ... ∩ En′]    (12)

If E1, E2, ..., En are n mutually exclusive events,

    P[E1 ∪ E2 ∪ ... ∪ En] = P[E1] + P[E2] + ... + P[En]    (13)
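The intersection and union rules above can be verified with the traffic numbers from the examples:

```python
# Intersection (eq. 6) and union (eq. 10) with A = [Y > 1], B = [10 < X <= 20].
p_A = 0.06
p_B = 0.25
p_A_given_B = 0.05

p_A_and_B = p_A_given_B * p_B       # eq (6): dependent intersection, 0.0125
p_A_or_B = p_A + p_B - p_A_and_B    # eq (10): inclusion-exclusion, 0.2975

# Independent events multiply directly (eq. 7): [Y > 1] and [Z <= 20].
p_Y_and_Z = 0.06 * 0.96             # 0.0576
```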
Theorem of Total Probability

If E1, E2, ..., En are n mutually exclusive (ME) and collectively exhaustive (CE) events, and if A is an event that shares the same space as the events Ei (P[A|Ei] > 0 for at least some events Ei), then via the intersection of dependent events and the union of mutually exclusive events:

    P[A] = P[A ∩ E1] + P[A ∩ E2] + ... + P[A ∩ En]
    P[A] = P[A|E1] P[E1] + P[A|E2] P[E2] + ... + P[A|En] P[En]    (14)

Example: The table below shows the probabilities of a number of events.

    event          probability    conditional probability
    X = 0          P = 0.00       P[Y > 1 | X = 0]        = 0.000
    0 < X ≤ 10     P = 0.20       P[Y > 1 | 0 < X ≤ 10]   = 0.000
    10 < X ≤ 20    P = 0.25       P[Y > 1 | 10 < X ≤ 20]  = 0.050
    20 < X ≤ 50    P = 0.35       P[Y > 1 | 20 < X ≤ 50]  = 0.070
    50 < X         P = 0.20       P[Y > 1 | 50 < X]       = 0.115

So...

    P[Y > 1] = (0)(0) + (0.0)(0.20) + (0.05)(0.25) + (0.07)(0.35) + (0.115)(0.20) = 0.06

Bayes' Theorem

For two dependent events A and B, P[A|B] is the fraction of the area of B that is in A, and P[A|B] P[B] is the area of B in A. Likewise, P[B|A] is the fraction of the area of A that is in B, and P[B|A] P[A] is the area of A in B. Clearly, the area of A in B equals the area of B in A:

    P[A and B] = P[A|B] P[B] = P[B|A] P[A].

So,

    P[A|B] = P[B|A] P[A] / P[B].    (15)

If event A depends on n ME and CE events E1, ..., En,

    P[Ei|A] = P[A|Ei] P[Ei] / ( P[A|E1] P[E1] + P[A|E2] P[E2] + ... + P[A|En] P[En] ).    (16)

Example: Using the probabilities from the table above, given an observation of more than one accident, find the probability that the traffic was very heavy (50 < X).

    P[50 < X | Y > 1] = P[Y > 1 | 50 < X] P[50 < X] / P[Y > 1] = (0.115)(0.20)/(0.06) = 0.38

Note that if A is a subset of B, then P[B|A] = 1 and P[A|B] = P[A]/P[B]. Note also that P[A|B] ≥ P[B|A] if and only if P[A] ≥ P[B].
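The total-probability sum (eq. 14) and the Bayes posterior (eq. 16) can be computed directly from the table's values:

```python
# Theorem of Total Probability (eq. 14) and Bayes' theorem (eq. 16)
# using the traffic-bin probabilities from the table.
p_E = [0.00, 0.20, 0.25, 0.35, 0.20]           # P[E_i], ME & CE traffic bins
p_Y_given_E = [0.0, 0.0, 0.05, 0.07, 0.115]    # P[Y > 1 | E_i]

# Total probability of more than one accident:
p_Y = sum(pc * pe for pc, pe in zip(p_Y_given_E, p_E))   # 0.06

# Posterior probability that traffic was very heavy (50 < X), given Y > 1:
p_heavy_given_Y = p_Y_given_E[-1] * p_E[-1] / p_Y        # about 0.38
```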
Bernoulli Sequence of Independent Events with Probability p

A Bernoulli sequence is a sequence of trials in each of which an event may or may not occur. In a Bernoulli sequence the occurrence of an event in one trial is independent of an occurrence in any other trial. If we have a sequence of trials, we may wish to know the probability of multiple occurrences in a given number of trials. This will depend upon the probability of the event in any single trial and the number of ways multiple events can occur in the given number of trials.

Example: Given that the probability of an accident on any given work-day is 0.10, and assuming the probability of an accident today in no way depends on the occurrence of an accident on any other day, what is the probability of two days with an accident (and three days with zero accidents) in one work-week?

Given: P[one or more accidents on any work-day] = p = 0.10.
From complementary events: P[no accident on any work-day] = 1 - p = 1 - 0.10 = 0.90.
From the intersection of independent events: P[an accident on any two work-days] = (p)(p) = (0.10)^2 = 0.01.
From the intersection of independent events: P[no accident on any three work-days] = (1-p)(1-p)(1-p) = (1 - 0.10)^3 = 0.729.
From the intersection of independent events: P[an accident on two particular work-days and no accident on the other three] = (p)^2 (1-p)^(5-2) = (0.01)(0.729) = 0.00729.
From the union of mutually exclusive events: number of ways to pick two days out of five = 5!/(2! 3!) = 120/((2)(6)) = 10, so
P[two days out of five with an accident] = (10)(0.01)(0.729) = 0.0729.

This is called the binomial distribution:

    P[n events out of m attempts] = m! / (n! (m-n)!) p^n (1-p)^(m-n)    (17)

As a special case, P[0 events out of m attempts] = (1-p)^m. The expected number of events out of m attempts is pm.

The more data we have in determining the probability of an event, the more accurate our calculations will be. If n events are observed in m attempts, p ≈ n/m, and this estimate of p becomes more precise as m (and n) get bigger.
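The binomial distribution (eq. 17) can be written as a short function and checked against the worked example:

```python
from math import comb

# Binomial distribution (eq. 17): P[n events in m independent trials],
# each with event probability p.
def binomial_pmf(n: int, m: int, p: float) -> float:
    return comb(m, n) * p**n * (1 - p)**(m - n)

# Two work-days with an accident in a five-day week, with p = 0.10:
p_two_of_five = binomial_pmf(2, 5, 0.10)   # 10 * 0.01 * 0.729 = 0.0729
```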
Poisson Process with Mean Occurrence Rate ν (Mean Return Period T = 1/ν)

For a Bernoulli sequence with a small event probability p and a large number of trials, the binomial distribution approaches the Poisson distribution, in which the event probability p is replaced by a mean occurrence rate, ν, or a return period, T = 1/ν.

    P[n events during time t] = ((t/T)^n / n!) exp(-t/T)    (18)

Special cases:

    P[time between two events > t] = P[0 events during time t] = e^(-t/T)
    P[time between two events ≤ t] = 1 - P[0 events during time t] = 1 - e^(-t/T)

Example: Given that the probability of an accident on any work-day is 0.10, the return period for work-days with accidents is 1/0.10 work-days, or 10 work-days. On average, drivers would expect a day with one or more accidents every other work-week. Assuming such accidents are a Poisson process,

    P[two work-days with accidents in five days] = ((5/10)^2 / 2!) exp(-5/10) = 0.0758,

only slightly more than what we got assuming a Bernoulli sequence.

    P[time between work-days with accidents > T work-days] = P[0 accidents in T work-days] = e^(-T/T) = e^(-1) = 0.368
    P[one or more accidents in T work-days] = 1 - e^(-T/T) = 1 - e^(-1) = 0.632

The Normal Distribution

The figures below plot P[n days with accidents in m work-days] vs. n, the number of days with accidents, for m = 5, 20, and 50 work-days. As m and t become large, the binomial distribution and the Poisson distribution approach the Normal distribution with a mean value of pm or t/T and a standard deviation of √(pm) or √(t/T).

[Figure: binomial, Poisson, and Normal probabilities of the number of days with multi-car accidents in one work-week, four work-weeks, and ten work-weeks.]
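The Poisson distribution (eq. 18) can also be written as a short function, reproducing the accident example for comparison with the binomial result:

```python
from math import exp, factorial

# Poisson distribution (eq. 18): P[n events during time t] for a process
# with mean return period T.
def poisson_pmf(n: int, t: float, T: float) -> float:
    return (t / T)**n / factorial(n) * exp(-t / T)

# Two work-days with accidents in five days, T = 10 work-days:
p_two_in_five = poisson_pmf(2, 5, 10)   # about 0.0758 (vs. 0.0729 binomial)
p_none_in_T = poisson_pmf(0, 10, 10)    # e^(-1), about 0.368
```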
The Exponential Distribution

Consider the random time T1 between two events in a Poisson process. The complementary event probabilities are

    P[T1 > t] = P[no events in time t] = exp(-t/T)
    P[T1 ≤ t] = 1 - P[no events in time t] = 1 - exp(-t/T)

This is the distribution function for an exponential distribution, and describes the inter-event time, or the time between successive events.

Examples

1. What is the probability of flipping a head in three coin tosses?

(a) P[a H on any toss] = 0.5, so P[a H in 3 tosses] = 0.5 + 0.5 + 0.5 = 1.5 > 1??? Clearly wrong. Probabilities must be between 0 and 1! A H on the first toss is not mutually exclusive of a H on any other toss.

(b) Instead, using the union of multiple mutually exclusive events: there are seven mutually exclusive ways to get a H in three tosses: (H1 T2 T3); (T1 H2 T3); (T1 T2 H3); (H1 H2 T3); (H1 T2 H3); (T1 H2 H3); (H1 H2 H3). The probability of each is the same, (0.5)^3, so P[a H in 3 tosses] = 7 (0.5)^3 = 0.875.

(c) In fact, there is only one way not to get a head in three tosses: (T1 T2 T3). Making use of complementary probabilities and the intersection of independent events (tossing three T's): P[a H in 3 tosses] = 1 - P[0 H in 3 tosses] = 1 - (0.5)^3 = 0.875.

(d) Invoking the binomial distribution, we can say there are three mutually exclusive counts of H in three tosses: P[a H in 3 tosses] = P[1 H in 3 tosses] + P[2 H in 3 tosses] + P[3 H in 3 tosses] = 0.375 + 0.375 + 0.125 = 0.875.

Make a Venn diagram of three intersecting circles to help illustrate these four solutions.

2. Consider a typical year having 36 days with rain. The probability of rain on any given day is approximately 0.10. The return period for days with rain is 10 days. Use the binomial and Poisson distributions to find:

                                   Binomial                                    Poisson
    P[one rainy day in 31 days]    (31!/(1! 30!)) (0.1) (1 - 0.1)^30 = 0.131   ((31/10)^1 / 1!) exp(-31/10) = 0.140
    P[one rainy day in 60 days]    (60!/(1! 59!)) (0.1) (1 - 0.1)^59 = 0.012   ((60/10)^1 / 1!) exp(-60/10) = 0.015

3. What is the probability of exactly one event in one day for a Poisson process with a return period of T days?

    P[one occurrence in one day] = ((1/T)^1 / 1!) exp(-1/T) = (1/T) exp(-1/T)

For T ≫ 1 day, P[one occurrence in one day] ≈ 1/T.

4. What is the probability of exactly one event in T days for a Poisson process with a return period of T days?

    P[one occurrence in T days] = ((T/T)^1 / 1!) exp(-T/T) = exp(-1) = 0.368
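The three valid routes to the coin-toss answer in Example 1 can be checked numerically; all agree at 0.875:

```python
from math import comb

# P[at least one H in three fair tosses], computed three equivalent ways.
p = 0.5
p_union = 7 * p**3                       # (b) union of 7 ME outcomes
p_complement = 1 - (1 - p)**3            # (c) complement of "no heads"
p_binomial = sum(comb(3, n) * p**n * (1 - p)**(3 - n) for n in (1, 2, 3))
```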
5. What is the probability of one or more events in T days for a Poisson process with a return period of T days?

    P[one or more occurrences in T days] = 1 - P[no occurrences in T days] = 1 - ((T/T)^0 / 0!) exp(-T/T) = 1 - exp(-1) = 0.632

6. What is the return period T of a Poisson process with a 2% probability of exceedance in fifty years?

    0.02 = P[n > 0] = 1 - P[n = 0] = 1 - ((50/T)^0 / 0!) exp(-50/T)
    -ln(1 - 0.02) = 50/T
    T = 2475 years

7. In the past 50 years there have been two large earthquakes (M > 6) in the LA area. What is the probability of a large earthquake in the next 15 years, assuming the occurrence of earthquakes is a Poisson process? The mean return period is T = 50/2 = 25 years.

    P[one or more EQ in 15 years] = 1 - P[no EQ in 15 years] = 1 - ((15/25)^0 / 0!) exp(-15/25) = 0.45

(In actuality, the occurrence of an earthquake increases the probability of earthquakes, for a year or two.)

8. Consider the following events (E: electrical power failure; F: flood) with probabilities: P[E] = 0.2; P[F] = 0.1; P[E|F] = 0.1. Note that P[E] > P[E|F]. So,

    P[E ∩ F] = P[E|F] P[F] = (0.1)(0.1) = 0.01

If E and F were assumed to be independent, then the probability of E and F would be calculated as

    P[E ∩ F] = P[E] P[F] = (0.2)(0.1) = 0.02

Neglecting the conditional dependence of random events can lead to large errors.
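Examples 6 and 7 invert and apply the Poisson exceedance formula; a minimal sketch:

```python
from math import exp, log

# Example 6: return period T for a 2% probability of exceedance in 50 years,
# from 0.02 = 1 - exp(-50/T).
T = -50 / log(1 - 0.02)     # about 2475 years

# Example 7: P[one or more M > 6 earthquakes in 15 years] with T = 25 years.
p_eq = 1 - exp(-15 / 25)    # about 0.45
```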
9. Assume that during rush-hour one out of ten-thousand drivers is impaired by alcohol. Also assume that breathalyzers have a 2% false-positive rate and a 0% false-negative rate. If a police officer stops a car at random and finds that the driver tests positive for alcohol, what is the likelihood that the officer has correctly identified an impaired driver?

Assign random variables to events: D = drunk, D′ = sober, B = breathalyzer positive, B′ = breathalyzer negative.

Given: P[D] = 0.0001, P[B|D′] = 0.02, P[B′|D] = 0. Find: P[D|B].

D and D′ are complementary: P[D′] = 1 - P[D] = 0.9999.
(B|D) and (B′|D) are complementary: P[B|D] = 1 - P[B′|D] = 1.

Bayes' Theorem with the Theorem of Total Probability:

    P[D|B] = P[B|D] P[D] / P[B]
           = P[B|D] P[D] / ( P[B|D] P[D] + P[B|D′] P[D′] )
           = (1)(0.0001) / ( (1)(0.0001) + (0.02)(0.9999) )
           = 0.005, or about one-half of one percent

This surprisingly low probability is an example of the base rate fallacy, or the false positive paradox, which arises when the event being tested ("the driver is drunk" in this example) occurs with very low probability. Seen another way, the test changes the probability of finding a drunk driver from one in ten-thousand to one in two-hundred.

10. Ang & Tang, v. 1, example 2.27, p. 57. Consider emission standards for automobiles and industry in terms of three events: I: industry meets standards; A: automobiles meet standards; and R: there has been an acceptable reduction in pollution.

Given: P[I] = 0.75; P[A] = 0.60; A and I are independent events; P[R|AI] = 1; P[R|AI′] = 0.80; P[R|A′I] = 0.80; and P[R|A′I′] = 0. Find: P[R].

The following set of events is CE and ME: AI, AI′, A′I, A′I′. So, from the Theorem of Total Probability,

    P[R] = P[R|AI] P[AI] + P[R|AI′] P[AI′] + P[R|A′I] P[A′I] + P[R|A′I′] P[A′I′]
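The breathalyzer posterior can be computed directly from the given rates:

```python
# Bayes' theorem with the Theorem of Total Probability for the
# breathalyzer example.
p_D = 0.0001            # P[driver impaired]
p_B_given_D = 1.0       # zero false-negative rate
p_B_given_Dc = 0.02     # 2% false-positive rate

p_B = p_B_given_D * p_D + p_B_given_Dc * (1 - p_D)   # total probability of a positive test
p_D_given_B = p_B_given_D * p_D / p_B                # about 0.005, i.e. 1 in 200
```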
Since A and I are independent, P[AI] = (0.60)(0.75) = 0.45; P[AI′] = (0.60)(0.25) = 0.15; P[A′I] = (0.40)(0.75) = 0.30; P[A′I′] = (0.40)(0.25) = 0.10. So,

    P[R] = (1.0)(0.45) + (0.80)(0.15) + (0.80)(0.30) + (0)(0.10) = 0.81

If it turns out that pollution is not reduced (R′), what is the probability that it is entirely due to a failure to control automobile exhaust (A′I)?

    P[A′I|R′] = P[R′|A′I] P[A′I] / P[R′] = (1 - 0.8)(0.3) / (1 - 0.81) = 0.32

If it turns out that pollution is not reduced (R′), what is the probability that automobile exhaust was not controlled (A′)? The events I and I′ are complementary and the events A′I and A′I′ are mutually exclusive.

    P[A′|R′] = P[A′I|R′] + P[A′I′|R′]
             = P[R′|A′I] P[A′I] / P[R′] + P[R′|A′I′] P[A′I′] / P[R′]
             = (1 - 0.8)(0.3)/(1 - 0.81) + (1 - 0)(0.10)/(1 - 0.81) = 0.84

If it turns out that pollution is not reduced (R′), what is the probability that industry emissions were not controlled (I′)? The events A and A′ are complementary and the events AI′ and A′I′ are mutually exclusive.

    P[I′|R′] = P[AI′|R′] + P[A′I′|R′]
             = P[R′|AI′] P[AI′] / P[R′] + P[R′|A′I′] P[A′I′] / P[R′]
             = (1 - 0.8)(0.15)/(1 - 0.81) + (1 - 0)(0.10)/(1 - 0.81) = 0.68

So, if pollution is not reduced it is more likely that it has to do with cars not meeting standards than industry.
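The total-probability sum for P[R] and the posterior given no reduction can be sketched as:

```python
# Total probability for P[R] in the emissions example (Ang & Tang ex. 2.27).
# A and I are independent, so the ME & CE partition probabilities multiply.
p_A, p_I = 0.60, 0.75
prior = {
    "AI":   p_A * p_I,               # 0.45
    "AI'":  p_A * (1 - p_I),         # 0.15
    "A'I":  (1 - p_A) * p_I,         # 0.30
    "A'I'": (1 - p_A) * (1 - p_I),   # 0.10
}
cond = {"AI": 1.00, "AI'": 0.80, "A'I": 0.80, "A'I'": 0.00}   # P[R | partition]

p_R = sum(cond[k] * prior[k] for k in prior)                  # 0.81
# Probability that only autos failed, given no reduction in pollution:
p_ApI_given_Rc = (1 - cond["A'I"]) * prior["A'I"] / (1 - p_R) # about 0.32
```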
11. Ang & Tang, v. 1, example 2.25, p. 55. Consider damaging storms causing flooding and storm damage in a county and within a city. Suppose that the probability of a storm in the county is 20% in any given year. During such a storm: the probability of a flood in the county is 25%; the probability of a flood in the city during a flood in the county is 5%; and if there is no flood in the county, the probability of storm damage in the city is 10%. Assume floods only happen during storms, and that if a flood happens in the city, property will be damaged with absolute certainty. If there is a flood in the county without a flood in the city, the probability of storm damage in the city is 15%. There can be no flood in the city without a flood in the county. What is the annual probability of storm damage in the city?

In terms of events:
S: storm in the county
F: flood in the county (only happens if there is a storm in the county)
W: flood in the city (only happens if there is a flood in the county)
D: storm damage in the city

the probabilities above (and their complements) can be summarized in the table below.

    storm           flood in county     flood in city       damage in city
    P[S] = 0.20     P[F|S] = 0.25       P[W|F] = 0.05       P[D|W] = 1
                                        P[W′|F] = 0.95      P[D|SFW′] = 0.15
                    P[F′|S] = 0.75      P[W|F′] = 0         P[D|SF′] = 0.10
                                        P[W′|F′] = 1
    P[S′] = 0.80    P[F|S′] = 0         P[W|F′] = 0         P[D|S′] = 0
                    P[F′|S′] = 1

The events (SF), (SF′), and (S′) are CE and ME. So, from the Theorem of Total Probability,

    P[D] = P[D|SF] P[SF] + P[D|SF′] P[SF′] + P[D|S′] P[S′]

The conditional probabilities needed to solve this problem can be found from further application of the Theorem of Total Probability and the intersection of dependent events.
If there is a storm, a flood in the county, and a flood in the city, damage in the city will certainly occur (P[D|SFW] = 1), and damage might occur even if there is not a flood in the city (P[D|SFW′] = 0.15).

    P[D|SF] = P[D|SFW] P[W|SF] + P[D|SFW′] P[W′|SF] = (1.0)(0.05) + (0.15)(0.95) = 0.1925
    P[SF] = P[F|S] P[S] = (0.25)(0.20) = 0.05    (intersection of dependent events)
    P[D|SF′] = 0.10    (given)
    P[SF′] = P[F′|S] P[S] = (0.75)(0.20) = 0.15    (intersection of dependent events)
    P[D|S′] = 0    (given)

So,

    P[D] = (0.1925)(0.05) + (0.10)(0.15) + (0)(0.8) = 0.0246

Note that while there is about twice the likelihood of damage when there is a flood in the county as compared to no flood in the county (P[D|SF] ≈ 0.19 vs. P[D|SF′] = 0.10), the probability of damage and a flood in the county is less than the probability of damage and no flood in the county:

    P[D ∩ SF] = P[D|SF] P[SF] = (0.1925)(0.05) = 0.0096
    P[D ∩ SF′] = P[D|SF′] P[SF′] = (0.10)(0.15) = 0.015

[Event tree: S branches to SF (P[SF] = 0.05) and SF′ (P[SF′] = 0.15), with P[S′] = 0.8; the SF branch then branches on W and D.]
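The event-tree calculation above can be reproduced in a few lines:

```python
# Event-tree computation of the annual probability of storm damage, P[D].
p_S = 0.20
p_F_given_S = 0.25
p_W_given_F = 0.05
p_D_given_SFWc = 0.15   # damage given flood in county but not in city
p_D_given_SFc = 0.10    # damage given storm but no flood in county

p_SF = p_F_given_S * p_S                     # 0.05
p_SFc = (1 - p_F_given_S) * p_S              # 0.15
# Total probability over the city-flood branch:
p_D_given_SF = 1.0 * p_W_given_F + p_D_given_SFWc * (1 - p_W_given_F)  # 0.1925
# Total probability over the ME & CE partition (SF), (SF'), (S'):
p_D = p_D_given_SF * p_SF + p_D_given_SFc * p_SFc                      # about 0.0246
```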
RECAP Basic Probability heory II Dr. om Ilvento FREC 408 We said the approach to establishing probabilities for events is to Define the experiment List the sample points Assign probabilities to the sample
More informationMidterm Exam #1 Instructions:
Public Affairs 818 Professor: Geoffrey L. Wallace October 9 th, 008 Midterm Exam #1 Instructions: You have 10 minutes to complete the examination and there are 6 questions worth a total of 10 points. The
More informationCorrelation key concepts:
CORRELATION Correlation key concepts: Types of correlation Methods of studying correlation a) Scatter diagram b) Karl pearson s coefficient of correlation c) Spearman s Rank correlation coefficient d)
More informationThe normal approximation to the binomial
The normal approximation to the binomial The binomial probability function is not useful for calculating probabilities when the number of trials n is large, as it involves multiplying a potentially very
More informationIntroduction to Probability
Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence
More informationSection 6.1 Joint Distribution Functions
Section 6.1 Joint Distribution Functions We often care about more than one random variable at a time. DEFINITION: For any two random variables X and Y the joint cumulative probability distribution function
More informationStatistics in Geophysics: Introduction and Probability Theory
Statistics in Geophysics: Introduction and Steffen Unkel Department of Statistics Ludwig-Maximilians-University Munich, Germany Winter Term 2013/14 1/32 What is Statistics? Introduction Statistics is the
More informationSTAT 35A HW2 Solutions
STAT 35A HW2 Solutions http://www.stat.ucla.edu/~dinov/courses_students.dir/09/spring/stat35.dir 1. A computer consulting firm presently has bids out on three projects. Let A i = { awarded project i },
More information36 Odds, Expected Value, and Conditional Probability
36 Odds, Expected Value, and Conditional Probability What s the difference between probabilities and odds? To answer this question, let s consider a game that involves rolling a die. If one gets the face
More informationCHAPTER 2 Estimating Probabilities
CHAPTER 2 Estimating Probabilities Machine Learning Copyright c 2016. Tom M. Mitchell. All rights reserved. *DRAFT OF January 24, 2016* *PLEASE DO NOT DISTRIBUTE WITHOUT AUTHOR S PERMISSION* This is a
More informationST 371 (IV): Discrete Random Variables
ST 371 (IV): Discrete Random Variables 1 Random Variables A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical variable to each possible
More informationSection 6.2 Definition of Probability
Section 6.2 Definition of Probability Probability is a measure of the likelihood that an event occurs. For example, if there is a 20% chance of rain tomorrow, that means that the probability that it will
More informationPractice Problems for Homework #8. Markov Chains. Read Sections 7.1-7.3. Solve the practice problems below.
Practice Problems for Homework #8. Markov Chains. Read Sections 7.1-7.3 Solve the practice problems below. Open Homework Assignment #8 and solve the problems. 1. (10 marks) A computer system can operate
More informationProbability Calculator
Chapter 95 Introduction Most statisticians have a set of probability tables that they refer to in doing their statistical wor. This procedure provides you with a set of electronic statistical tables that
More informationReliability. 26.1 Reliability Models. Chapter 26 Page 1
Chapter 26 Page 1 Reliability Although the technological achievements of the last 50 years can hardly be disputed, there is one weakness in all mankind's devices. That is the possibility of failure. What
More informationAMS 5 CHANCE VARIABILITY
AMS 5 CHANCE VARIABILITY The Law of Averages When tossing a fair coin the chances of tails and heads are the same: 50% and 50%. So if the coin is tossed a large number of times, the number of heads and
More informationSOLUTIONS: 4.1 Probability Distributions and 4.2 Binomial Distributions
SOLUTIONS: 4.1 Probability Distributions and 4.2 Binomial Distributions 1. The following table contains a probability distribution for a random variable X. a. Find the expected value (mean) of X. x 1 2
More informationCA200 Quantitative Analysis for Business Decisions. File name: CA200_Section_04A_StatisticsIntroduction
CA200 Quantitative Analysis for Business Decisions File name: CA200_Section_04A_StatisticsIntroduction Table of Contents 4. Introduction to Statistics... 1 4.1 Overview... 3 4.2 Discrete or continuous
More informationLesson 1. Basics of Probability. Principles of Mathematics 12: Explained! www.math12.com 314
Lesson 1 Basics of Probability www.math12.com 314 Sample Spaces: Probability Lesson 1 Part I: Basic Elements of Probability Consider the following situation: A six sided die is rolled The sample space
More information99.37, 99.38, 99.38, 99.39, 99.39, 99.39, 99.39, 99.40, 99.41, 99.42 cm
Error Analysis and the Gaussian Distribution In experimental science theory lives or dies based on the results of experimental evidence and thus the analysis of this evidence is a critical part of the
More informationChapter 4 Lecture Notes
Chapter 4 Lecture Notes Random Variables October 27, 2015 1 Section 4.1 Random Variables A random variable is typically a real-valued function defined on the sample space of some experiment. For instance,
More information2) The three categories of forecasting models are time series, quantitative, and qualitative. 2)
Exam Name TRUE/FALSE. Write 'T' if the statement is true and 'F' if the statement is false. 1) Regression is always a superior forecasting method to exponential smoothing, so regression should be used
More informationThe Binomial Probability Distribution
The Binomial Probability Distribution MATH 130, Elements of Statistics I J. Robert Buchanan Department of Mathematics Fall 2015 Objectives After this lesson we will be able to: determine whether a probability
More informationSample Questions for Mastery #5
Name: Class: Date: Sample Questions for Mastery #5 Multiple Choice Identify the choice that best completes the statement or answers the question.. For which of the following binomial experiments could
More information1 Math 1313 Final Review Final Review for Finite. 1. Find the equation of the line containing the points 1, 2)
Math 33 Final Review Final Review for Finite. Find the equation of the line containing the points, 2) ( and (,3) 2. 2. The Ace Company installed a new machine in one of its factories at a cost of $2,.
More informationECE302 Spring 2006 HW4 Solutions February 6, 2006 1
ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 Solutions to HW4 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in
More informationPrinciple of Data Reduction
Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then
More informationMath 370, Spring 2008 Prof. A.J. Hildebrand. Practice Test 1 Solutions
Math 70, Spring 008 Prof. A.J. Hildebrand Practice Test Solutions About this test. This is a practice test made up of a random collection of 5 problems from past Course /P actuarial exams. Most of the
More informationMath 370, Actuarial Problemsolving Spring 2008 A.J. Hildebrand. Problem Set 1 (with solutions)
Math 370, Actuarial Problemsolving Spring 2008 A.J. Hildebrand Problem Set 1 (with solutions) About this problem set: These are problems from Course 1/P actuarial exams that I have collected over the years,
More informationNormal Probability Distribution
Normal Probability Distribution The Normal Distribution functions: #1: normalpdf pdf = Probability Density Function This function returns the probability of a single value of the random variable x. Use
More informationStatistics 100A Homework 2 Solutions
Statistics Homework Solutions Ryan Rosario Chapter 9. retail establishment accepts either the merican Express or the VIS credit card. total of percent of its customers carry an merican Express card, 6
More informationMath 370, Spring 2008 Prof. A.J. Hildebrand. Practice Test 2 Solutions
Math 370, Spring 008 Prof. A.J. Hildebrand Practice Test Solutions About this test. This is a practice test made up of a random collection of 5 problems from past Course /P actuarial exams. Most of the
More informationChapter 5 A Survey of Probability Concepts
Chapter 5 A Survey of Probability Concepts True/False 1. Based on a classical approach, the probability of an event is defined as the number of favorable outcomes divided by the total number of possible
More informationYou flip a fair coin four times, what is the probability that you obtain three heads.
Handout 4: Binomial Distribution Reading Assignment: Chapter 5 In the previous handout, we looked at continuous random variables and calculating probabilities and percentiles for those type of variables.
More informationMath 370, Actuarial Problemsolving Spring 2008 A.J. Hildebrand. Practice Test, 1/28/2008 (with solutions)
Math 370, Actuarial Problemsolving Spring 008 A.J. Hildebrand Practice Test, 1/8/008 (with solutions) About this test. This is a practice test made up of a random collection of 0 problems from past Course
More informationWHERE DOES THE 10% CONDITION COME FROM?
1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay
More informationSAT Math Facts & Formulas Review Quiz
Test your knowledge of SAT math facts, formulas, and vocabulary with the following quiz. Some questions are more challenging, just like a few of the questions that you ll encounter on the SAT; these questions
More informationSTA 256: Statistics and Probability I
Al Nosedal. University of Toronto. Fall 2014 1 2 3 4 5 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. Experiment, outcome, sample space, and
More informationPROBABILITY SECOND EDITION
PROBABILITY SECOND EDITION Table of Contents How to Use This Series........................................... v Foreword..................................................... vi Basics 1. Probability All
More informationUnit 19: Probability Models
Unit 19: Probability Models Summary of Video Probability is the language of uncertainty. Using statistics, we can better predict the outcomes of random phenomena over the long term from the very complex,
More informationDECISION MAKING UNDER UNCERTAINTY:
DECISION MAKING UNDER UNCERTAINTY: Models and Choices Charles A. Holloway Stanford University TECHNISCHE HOCHSCHULE DARMSTADT Fachbereich 1 Gesamtbibliothek Betrtebswirtscrtaftslehre tnventar-nr. :...2>2&,...S'.?S7.
More informationRandom variables, probability distributions, binomial random variable
Week 4 lecture notes. WEEK 4 page 1 Random variables, probability distributions, binomial random variable Eample 1 : Consider the eperiment of flipping a fair coin three times. The number of tails that
More informationChapter 13 & 14 - Probability PART
Chapter 13 & 14 - Probability PART IV : PROBABILITY Dr. Joseph Brennan Math 148, BU Dr. Joseph Brennan (Math 148, BU) Chapter 13 & 14 - Probability 1 / 91 Why Should We Learn Probability Theory? Dr. Joseph
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete
More informationMATH4427 Notebook 2 Spring 2016. 2 MATH4427 Notebook 2 3. 2.1 Definitions and Examples... 3. 2.2 Performance Measures for Estimators...
MATH4427 Notebook 2 Spring 2016 prepared by Professor Jenny Baglivo c Copyright 2009-2016 by Jenny A. Baglivo. All Rights Reserved. Contents 2 MATH4427 Notebook 2 3 2.1 Definitions and Examples...................................
More informationSet operations and Venn Diagrams. COPYRIGHT 2006 by LAVON B. PAGE
Set operations and Venn Diagrams Set operations and Venn diagrams! = { x x " and x " } This is the intersection of and. # = { x x " or x " } This is the union of and. n element of! belongs to both and,
More informationProbability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur
Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special Distributions-VI Today, I am going to introduce
More informationLecture 2 Binomial and Poisson Probability Distributions
Lecture 2 Binomial and Poisson Probability Distributions Binomial Probability Distribution l Consider a situation where there are only two possible outcomes (a Bernoulli trial) H Example: u flipping a
More information26 Integers: Multiplication, Division, and Order
26 Integers: Multiplication, Division, and Order Integer multiplication and division are extensions of whole number multiplication and division. In multiplying and dividing integers, the one new issue
More informationUNIVERSITY OF OSLO. The Poisson model is a common model for claim frequency.
UNIVERSITY OF OSLO Faculty of mathematics and natural sciences Candidate no Exam in: STK 4540 Non-Life Insurance Mathematics Day of examination: December, 9th, 2015 Examination hours: 09:00 13:00 This
More informationAnswer Key for California State Standards: Algebra I
Algebra I: Symbolic reasoning and calculations with symbols are central in algebra. Through the study of algebra, a student develops an understanding of the symbolic language of mathematics and the sciences.
More informationMath 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 2 Solutions
Math 70/408, Spring 2008 Prof. A.J. Hildebrand Actuarial Exam Practice Problem Set 2 Solutions About this problem set: These are problems from Course /P actuarial exams that I have collected over the years,
More informationDiscrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10
CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 10 Introduction to Discrete Probability Probability theory has its origins in gambling analyzing card games, dice,
More informationMATH 140 Lab 4: Probability and the Standard Normal Distribution
MATH 140 Lab 4: Probability and the Standard Normal Distribution Problem 1. Flipping a Coin Problem In this problem, we want to simualte the process of flipping a fair coin 1000 times. Note that the outcomes
More informationHomework 3 (due Tuesday, October 13)
Homework (due Tuesday, October 1 Problem 1. Consider an experiment that consists of determining the type of job either blue-collar or white-collar and the political affiliation Republican, Democratic,
More informationEXAM #1 (Example) Instructor: Ela Jackiewicz. Relax and good luck!
STP 231 EXAM #1 (Example) Instructor: Ela Jackiewicz Honor Statement: I have neither given nor received information regarding this exam, and I will not do so until all exams have been graded and returned.
More informationComplement. If A is an event, then the complement of A, written A c, means all the possible outcomes that are not in A.
Complement If A is an event, then the complement of A, written A c, means all the possible outcomes that are not in A. For example, if A is the event UNC wins at least 5 football games, then A c is the
More informationIAM 530 ELEMENTS OF PROBABILITY AND STATISTICS INTRODUCTION
IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS INTRODUCTION 1 WHAT IS STATISTICS? Statistics is a science of collecting data, organizing and describing it and drawing conclusions from it. That is, statistics
More informationImportant Probability Distributions OPRE 6301
Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.
More informationProbability Distributions
Learning Objectives Probability Distributions Section 1: How Can We Summarize Possible Outcomes and Their Probabilities? 1. Random variable 2. Probability distributions for discrete random variables 3.
More informationALGEBRA. sequence, term, nth term, consecutive, rule, relationship, generate, predict, continue increase, decrease finite, infinite
ALGEBRA Pupils should be taught to: Generate and describe sequences As outcomes, Year 7 pupils should, for example: Use, read and write, spelling correctly: sequence, term, nth term, consecutive, rule,
More informationLesson 20. Probability and Cumulative Distribution Functions
Lesson 20 Probability and Cumulative Distribution Functions Recall If p(x) is a density function for some characteristic of a population, then Recall If p(x) is a density function for some characteristic
More informationThe Calculus of Probability
The Calculus of Probability Let A and B be events in a sample space S. Partition rule: P(A) = P(A B) + P(A B ) Example: Roll a pair of fair dice P(Total of 10) = P(Total of 10 and double) + P(Total of
More information