# MT426 Notebook 3, Fall 2012, prepared by Professor Jenny Baglivo



© Copyright by Jenny A. Baglivo. All Rights Reserved.

## Contents

- 3 MT426 Notebook 3
  - 3.1 Definitions
  - 3.2 Joint Discrete Distributions
    - 3.2.1 Bivariate Distribution, PDF, CDF, Marginal Distributions
    - 3.2.2 Conditional Distributions, Regression
    - 3.2.3 Independent Random Variables
    - 3.2.4 Bivariate Hypergeometric Distribution
    - 3.2.5 Trinomial Distribution
    - 3.2.6 Simple Random Sample, Trinomial Approximation, Survey Analysis
  - 3.3 Joint Continuous Distributions
    - 3.3.1 Bivariate Distribution, PDF, CDF, Marginal Distributions
    - 3.3.2 Conditional Distributions, Regression
    - 3.3.3 Independent Random Variables
    - 3.3.4 Bivariate Uniform Distribution
    - 3.3.5 Bivariate Normal Distribution
    - 3.3.6 Transforming Random Variables, CDF Method
  - 3.4 Multivariate Distributions, Random Samples
    - 3.4.1 Multivariate Hypergeometric and Multinomial Distributions
    - 3.4.2 Mutual Independence, Repeated Trials, Random Samples


## 3 MT426 Notebook 3

This notebook is concerned with joint discrete and joint continuous distributions. The notes correspond to material in Chapter 3 of the Rice textbook.

### 3.1 Definitions

A probability distribution describing the joint variability of two or more random variables is called a **joint distribution**. A **bivariate distribution** is the joint distribution of a pair of random variables. Here are examples of situations we might want to study:

1. An urn contains 10 red chips, 8 blue chips, and 15 green chips. Let X be the number of red chips and Y be the number of blue chips in a subset of size 5 chosen from the urn. We wish to describe the joint distribution of the random pair (X, Y).

2. Let X be the height (in feet), Y be the weight (in pounds), and Z be the serum cholesterol level (in mg/dl) of a person chosen from a given population. We wish to describe the joint distribution of the random triple (X, Y, Z).

3. A manufacturing process produces rods whose lengths (in inches) vary in the interval [0.98, 1.02]. We repeat the experiment "choose a rod from those produced that day and measure its length" on 8 separate occasions. We wish to describe the joint distribution of the random eight-tuple (X_1, X_2, X_3, X_4, X_5, X_6, X_7, X_8).

### 3.2 Joint Discrete Distributions

A joint discrete distribution describes the joint variability of two or more discrete random variables.

#### 3.2.1 Bivariate Distribution, PDF, CDF, Marginal Distributions

Assume that X and Y are discrete random variables.

- The **joint probability density function** (joint PDF, or joint frequency function) of the random pair (X, Y) is defined as
  $$p(x, y) = P(X = x, Y = y) \quad \text{for all real pairs } (x, y),$$
  where the comma is understood to mean the intersection of events.
- The **joint cumulative distribution function** (joint CDF) of the random pair (X, Y) is defined as
  $$F(x, y) = P(X \le x, Y \le y) \quad \text{for all real pairs } (x, y),$$
  where the comma is understood to mean the intersection of events.

If X and Y are discrete random variables with joint PDF p(x, y), then:

- The **marginal probability density function** (marginal PDF, or marginal frequency function) of X is
  $$p_X(x) = \sum_{y \in R_Y} p(x, y) \quad \text{for all real numbers } x,$$
  where $R_Y$ is the range of Y.
- The marginal PDF of Y is
  $$p_Y(y) = \sum_{x \in R_X} p(x, y) \quad \text{for all real numbers } y,$$
  where $R_X$ is the range of X.

**Exercise 1** (Olkin et al., Macmillan 1994, page 545). In an experiment to study the relationship between spatial perception and the ability to use a graphical method to solve algebra problems, subjects were asked to solve 3 puzzles and 3 algebra problems. Let X be the number of puzzles correctly solved, and let Y be the number of algebra problems correctly solved. The following table summarizes the results, using proportions of individuals in each cross-classification:

|       | y = 0 | y = 1 | y = 2 | y = 3 | sum |
|-------|-------|-------|-------|-------|-----|
| x = 0 |       |       |       |       |     |
| x = 1 |       |       |       |       |     |
| x = 2 |       |       |       |       |     |
| x = 3 |       |       |       |       |     |
| sum   |       |       |       |       |     |

Consider the experiment "Choose a name from the population of subjects and record (X, Y) = (# puzzles solved, # algebra problems solved)." If each choice is equally likely, then the body of the table gives the joint (X, Y) distribution, the rightmost column of the table gives the marginal X distribution, the bottom row of the table gives the marginal Y distribution, and the plot on the right gives the joint probability histogram of the random pair.

The joint probability histogram is constructed using a box whose base is a square of area 1 centered at (x, y) and whose height is p(x, y), for each pair with nonzero probability. The sum of the volumes of the boxes is 1.
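As a concrete sketch of these definitions, the marginal PMFs can be computed from a joint PMF table by summing across rows and columns. The table values below are hypothetical placeholders, used only to illustrate the computation (the exercise's actual proportions appear in the printed table):

```python
# Sketch: marginal PMFs from a joint PMF table.
# NOTE: hypothetical placeholder values, not the notebook's table.
joint = {
    (0, 0): 0.10, (0, 1): 0.05, (0, 2): 0.03, (0, 3): 0.02,
    (1, 0): 0.06, (1, 1): 0.12, (1, 2): 0.07, (1, 3): 0.05,
    (2, 0): 0.04, (2, 1): 0.08, (2, 2): 0.12, (2, 3): 0.06,
    (3, 0): 0.02, (3, 1): 0.03, (3, 2): 0.05, (3, 3): 0.10,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9   # a valid joint PMF sums to 1

# p_X(x) = sum over y of p(x, y); p_Y(y) = sum over x of p(x, y)
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in range(4)}
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(4)}
```

Each marginal is itself a valid PMF: its values are nonnegative and sum to 1.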

Assume that the table represents the joint (X, Y) distribution.

(a) Find the probability that the subject

  (1) solved the same number of puzzles and algebra problems, P(X = Y).

  (2) solved fewer puzzles than algebra problems, P(X < Y).

  (3) solved more puzzles than algebra problems, P(X > Y).

(b) Let W = X + Y be the total number of problems solved by the subject. Completely specify the PDF of W.

**Exercise 2.** An urn contains 4 red chips, 3 white chips, and 1 blue chip. Consider the following 2-step experiment:

- Step 1: Thoroughly mix the contents of the urn. Choose a chip and record its color. Return the chip plus two more chips of the same color to the urn.
- Step 2: Thoroughly mix the contents of the urn. Choose a chip and record its color.

Let X be the number of red chips, and Y be the number of white chips, chosen. Construct a table showing the joint (X, Y) distribution, and the marginal X and Y distributions.

#### 3.2.2 Conditional Distributions, Regression

Let X and Y be discrete random variables.

- If $P(Y = y) \neq 0$, then the **conditional probability density function** (conditional PDF, or conditional frequency function) of X given Y = y is defined as
  $$p_{X \mid Y=y}(x \mid y) = P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)} \quad \text{for all real numbers } x.$$
- If $P(X = x) \neq 0$, then the conditional PDF of Y given X = x is defined as
  $$p_{Y \mid X=x}(y \mid x) = P(Y = y \mid X = x) = \frac{P(X = x, Y = y)}{P(X = x)} \quad \text{for all real numbers } y.$$

Note that in the first case the conditional sample space is the collection of outcomes with Y = y; in the second case, it is the collection of outcomes with X = x.

**Exercise 1, continued.** Consider again the joint puzzle-algebra problem distribution.

(c) Use the information in the joint (X, Y) distribution table to fill in the following table with the conditional distribution of Y given X = 1:

| y                              | 0 | 1 | 2 | 3 |
|--------------------------------|---|---|---|---|
| $p_{Y \mid X=1}(y \mid 1)$     |   |   |   |   |

(d) Use the information in the joint (X, Y) distribution table to fill in the following table with the conditional distribution of Y given X = 3:

| y                              | 0 | 1 | 2 | 3 |
|--------------------------------|---|---|---|---|
| $p_{Y \mid X=3}(y \mid 3)$     |   |   |   |   |

Here are probability histograms for the conditional distributions of Y given X = x, for each x = 0, 1, 2, 3. As x increases, the conditional distribution of Y given X = x changes, with more weight being given to larger values of Y.

**Regression.** Conditional distributions are often used as weights in weighted averages. For example, suppose that we would like to find the average number of algebra problems solved given that a subject solved exactly x puzzles correctly, for each possible x. Then we would calculate a quantity known as the **conditional mean** (or conditional expectation) of Y given X = x:
$$E(Y \mid X = x) = \sum_{y=0}^{3} y\, p_{Y \mid X=x}(y \mid x), \quad x = 0, 1, 2, 3.$$
The pairs (x, E(Y | X = x)) are shown in the plot on the right. Note that as x increases, so does the average value of Y given X = x. The collection of conditional means E(Y | X = x) is often called the **regression of Y on X**.
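The regression computation above can be sketched directly from a joint PMF table. The values here are hypothetical placeholders, chosen only so that the computation runs; the notebook's printed proportions should be substituted:

```python
# Sketch: conditional PMFs and the regression E(Y | X = x) from a joint
# PMF table. NOTE: hypothetical placeholder values, not the notebook's table.
joint = {
    (0, 0): 0.10, (0, 1): 0.05, (0, 2): 0.03, (0, 3): 0.02,
    (1, 0): 0.06, (1, 1): 0.12, (1, 2): 0.07, (1, 3): 0.05,
    (2, 0): 0.04, (2, 1): 0.08, (2, 2): 0.12, (2, 3): 0.06,
    (3, 0): 0.02, (3, 1): 0.03, (3, 2): 0.05, (3, 3): 0.10,
}
p_X = {x: sum(joint[(x, y)] for y in range(4)) for x in range(4)}

# conditional PMF of Y given X = x:  p(x, y) / p_X(x)
cond = {x: {y: joint[(x, y)] / p_X[x] for y in range(4)} for x in range(4)}

# regression of Y on X:  E(Y | X = x) = sum_y  y * p_{Y|X=x}(y | x)
regression = {x: sum(y * cond[x][y] for y in range(4)) for x in range(4)}
```

With these placeholder values the regression function is increasing in x, mirroring the pattern described in the text.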

#### 3.2.3 Independent Random Variables

The discrete random variables X and Y are said to be **independent** if
$$p(x, y) = p_X(x)\, p_Y(y) \quad \text{for all real pairs } (x, y),$$
where p(x, y) is the joint PDF, and p_X(x) and p_Y(y) are the marginal PDFs; they are said to be **dependent** if
$$p(x, y) \neq p_X(x)\, p_Y(y) \quad \text{for at least one real pair } (x, y).$$

Notes:

1. The definition says that the discrete random variables X and Y are independent if the probability of the intersection of the events X = x and Y = y is equal to the product of the probabilities of the events for all possible x and y:
   $$P(X = x, Y = y) = P(X = x)\, P(Y = y) \quad \text{for all real pairs } (x, y).$$

2. An equivalent definition of independence uses the cumulative distribution functions rather than the frequency functions. That is, X and Y are independent if
   $$F(x, y) = F_X(x)\, F_Y(y) \quad \text{for all real pairs } (x, y),$$
   where F(x, y) is the joint CDF, and F_X(x) and F_Y(y) are the marginal CDFs; they are dependent if $F(x, y) \neq F_X(x)\, F_Y(y)$ for at least one real pair (x, y).

3. Sometimes the context of a problem implies independence. For example, suppose you roll two fair six-sided dice (one red and one blue) and let X be the number on the top face of the red die and Y be the number on the top face of the blue die. Then the context of the problem tells us that X and Y are independent random variables.

4. If X and Y are independent, then conditional and marginal distributions are equal. To see this (please complete):
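The dice example in Note 3 can be checked directly against the definition, verifying p(x, y) = p_X(x) p_Y(y) over every pair; exact arithmetic with `Fraction` avoids rounding issues:

```python
from fractions import Fraction

# Sketch: verifying independence for two fair dice by checking
# p(x, y) == p_X(x) * p_Y(y) for every pair (x, y).
joint = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in range(1, 7)}
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(1, 7)}
independent = all(joint[(x, y)] == p_X[x] * p_Y[y]
                  for x in range(1, 7) for y in range(1, 7))
```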

#### 3.2.4 Bivariate Hypergeometric Distribution

Let n, M_1, M_2, and M_3 be positive integers with n < M_1 + M_2 + M_3. Then (X, Y) is said to have a **bivariate hypergeometric distribution** with parameters n and (M_1, M_2, M_3) if its joint PDF has the form
$$p(x, y) = P(X = x, Y = y) = \frac{\binom{M_1}{x}\binom{M_2}{y}\binom{M_3}{n - x - y}}{\binom{M_1 + M_2 + M_3}{n}}$$
when x and y are non-negative integers satisfying
$$x \le \min(n, M_1), \quad y \le \min(n, M_2), \quad \text{and} \quad \max(0,\, n - M_3) \le x + y \le n,$$
and is equal to zero otherwise.

**Model for urn experiments.** Bivariate hypergeometric distributions are used to model urn experiments, where the urn contains N = M_1 + M_2 + M_3 objects, with M_i of type i for i = 1, 2, 3. Let X equal the number of objects of type 1 and Y equal the number of objects of type 2 in a subset of size n chosen from the urn. If each choice of subset is equally likely, then (X, Y) has a bivariate hypergeometric distribution with parameters n and (M_1, M_2, M_3).

**Marginal and conditional distributions.** If (X, Y) has a bivariate hypergeometric distribution, then X has a hypergeometric distribution with parameters n, M_1, and N, and Y has a hypergeometric distribution with parameters n, M_2, and N. In addition, each conditional distribution is hypergeometric.

**Exercise.** An urn contains 5 red, 3 white, and 4 blue balls. Let X be the number of red balls and Y be the number of white balls in a subset of size 4. Assume each choice of subset is equally likely.

(a) Write the formula for the joint PDF of (X, Y), with appropriate ranges.
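A sketch of the bivariate hypergeometric PDF for this exercise's urn (M_1 = 5 red, M_2 = 3 white, M_3 = 4 blue, subsets of size n = 4), using `math.comb`:

```python
from math import comb

# Sketch: bivariate hypergeometric PDF for the exercise's urn.
M1, M2, M3, n = 5, 3, 4, 4
N = M1 + M2 + M3

def p(x, y):
    """p(x, y) = C(M1,x) C(M2,y) C(M3,n-x-y) / C(N,n); 0 outside the range."""
    if x < 0 or y < 0 or x + y > n or x > M1 or y > M2 or n - x - y > M3:
        return 0.0
    return comb(M1, x) * comb(M2, y) * comb(M3, n - x - y) / comb(N, n)

# sanity check: the probabilities over the whole range sum to 1
total = sum(p(x, y) for x in range(n + 1) for y in range(n + 1))
```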

(b) The following table gives the numerators in the formula for the joint probabilities. Use this table to find the probabilities below:

|       | y = 0 | y = 1 | y = 2 | y = 3 | sum |
|-------|-------|-------|-------|-------|-----|
| x = 0 |       |       |       |       |     |
| x = 1 |       |       |       |       |     |
| x = 2 |       |       |       |       |     |
| x = 3 |       |       |       |       |     |
| x = 4 |       |       |       |       |     |
| sum   |       |       |       |       |     |

- P(X < 2, Y < 2)
- P(X < 2, Y ≥ 2)
- P(X ≥ 2, Y < 2)
- P(X ≥ 2, Y ≥ 2)

(c) X has a hypergeometric distribution based on choosing a subset of size ____ from an urn containing ____ special objects, with a total of ____ objects.

(d) Y given X = 1 has a hypergeometric distribution based on choosing a subset of size ____ from an urn containing ____ special objects, with a total of ____ objects.

(e) Are X and Y independent? Why?

#### 3.2.5 Trinomial Distribution

Let n be a positive integer, and let p_1, p_2, and p_3 be positive proportions with sum 1. Then (X, Y) is said to have a **trinomial distribution** with parameters n and (p_1, p_2, p_3) when its joint PDF has the form
$$p(x, y) = P(X = x, Y = y) = \binom{n}{x,\ y,\ n - x - y}\, p_1^{x}\, p_2^{y}\, p_3^{\,n - x - y}$$
when x = 0, 1, ..., n; y = 0, 1, ..., n; and x + y ≤ n, and is equal to zero otherwise.

**Model for counts of independent trials.** Trinomial distributions can be used to model counts related to independent trials of an experiment with exactly three outcomes. Specifically, suppose that an experiment has three outcomes which occur with probabilities p_1, p_2, and p_3, respectively. Let X be the number of occurrences of outcome 1 and Y be the number of occurrences of outcome 2 in n independent trials of the experiment. Then the random pair (X, Y) has a trinomial distribution with parameters n and (p_1, p_2, p_3).

**Marginal and conditional distributions.** If (X, Y) has a trinomial distribution with parameters n and (p_1, p_2, p_3), then X has a binomial distribution with parameters n and p_1, and Y has a binomial distribution with parameters n and p_2. In addition, each conditional distribution is binomial.

**Exercise.** An urn contains 5 red, 3 white, and 4 blue balls. You repeat the following experiment 4 times: thoroughly mix the contents of the urn, choose one ball, note its color (R, W, B), and return the ball to the urn. Let X be the number of R's and let Y be the number of W's in the list of four colors.

(a) Write the formula for the joint PDF of (X, Y), with appropriate ranges.
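A sketch of the trinomial PDF for this exercise (n = 4 draws with replacement; p_1 = 5/12, p_2 = 3/12, p_3 = 4/12), building the multinomial coefficient from two binomial coefficients:

```python
from math import comb

# Sketch: trinomial PDF for the with-replacement urn exercise.
n, p1, p2, p3 = 4, 5/12, 3/12, 4/12

def p(x, y):
    """n!/(x! y! (n-x-y)!) * p1^x * p2^y * p3^(n-x-y); 0 outside the range."""
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = comb(n, x) * comb(n - x, y)   # equals n!/(x! y! (n-x-y)!)
    return coef * p1**x * p2**y * p3**(n - x - y)

# sanity check: the probabilities over the whole range sum to 1
total = sum(p(x, y) for x in range(n + 1) for y in range(n + 1))
```

Summing over y recovers the binomial marginal of X with parameters n = 4 and p_1 = 5/12, as stated above.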

(b) The following table gives the joint and marginal distributions, rounded to three decimal places of accuracy. Use this table to find the probabilities below:

|       | y = 0 | y = 1 | y = 2 | y = 3 | y = 4 | sum |
|-------|-------|-------|-------|-------|-------|-----|
| x = 0 |       |       |       |       |       |     |
| x = 1 |       |       |       |       |       |     |
| x = 2 |       |       |       |       |       |     |
| x = 3 |       |       |       |       |       |     |
| x = 4 |       |       |       |       |       |     |
| sum   |       |       |       |       |       |     |

- P(X < 2, Y < 2)
- P(X < 2, Y ≥ 2)
- P(X ≥ 2, Y < 2)
- P(X ≥ 2, Y ≥ 2)

(c) X has a binomial distribution based on ____ independent trials of an experiment with success probability ____.

(d) Y given X = 1 has a binomial distribution based on ____ independent trials of an experiment with success probability ____.

(e) Are X and Y independent? Why?

#### 3.2.6 Simple Random Sample, Trinomial Approximation, Survey Analysis

Suppose that an urn contains N objects. A **simple random sample** of size n is a sequence of n objects chosen without replacement from the urn, where the choice of each sequence is equally likely.

Let M_i be the number of objects of type i in the urn, for i = 1, 2, 3, and let X be the number of objects of type 1 and Y be the number of objects of type 2 in a simple random sample of size n. Then the random pair (X, Y) has a bivariate hypergeometric distribution with parameters n and (M_1, M_2, M_3). Further, if N is large, then trinomial probabilities can be used to approximate bivariate hypergeometric probabilities:

**Theorem (Trinomial Approximation).** If N is large and p_i = M_i/N is not too extreme, for i = 1, 2, 3, then the trinomial distribution with parameters n and (p_1, p_2, p_3) can be used to approximate the bivariate hypergeometric distribution with parameters n and (M_1, M_2, M_3). Specifically,
$$P(x \text{ of type 1 and } y \text{ of type 2}) = \frac{\binom{M_1}{x}\binom{M_2}{y}\binom{M_3}{n - x - y}}{\binom{N}{n}} \approx \binom{n}{x,\ y,\ n - x - y}\, p_1^{x}\, p_2^{y}\, p_3^{\,n - x - y}$$
for each (x, y) within the appropriate range.

The theorem tells us that sampling with replacement can be used to approximate sampling without replacement. The approximation is good when n < .01N and each $p_i = M_i/N$ lies in (.05, .95).

**Survey analysis.** Simple random samples are used in surveys. If the survey population is small, then bivariate hypergeometric distributions are used to analyze the results. If the survey population is large, then trinomial distributions are used to analyze the results, even though each person's opinion is solicited at most once. For example, suppose that a surveyor is interested in determining the level of support for a proposal to change the local tax structure, and decides to choose a simple random sample of size 10 from the registered voter list.
If there are a total of 120 registered voters, where one-third support the proposal, one-half oppose the proposal, and one-sixth have no opinion, then the probability that exactly 3 support, 5 oppose, and 2 have no opinion is
$$P(X = 3, Y = 5) = \frac{\binom{40}{3}\binom{60}{5}\binom{20}{2}}{\binom{120}{10}}.$$
If there are thousands of registered voters, then the probability is
$$P(X = 3, Y = 5) \approx \binom{10}{3,\ 5,\ 2}\left(\frac{1}{3}\right)^{3}\left(\frac{1}{2}\right)^{5}\left(\frac{1}{6}\right)^{2}.$$
Note that in the approximation you do not need to know the exact number of registered voters.
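A sketch comparing the exact bivariate hypergeometric probability for the 120-voter population with its trinomial approximation; note that here n = 10 is not below .01N, so the two values differ noticeably:

```python
from math import comb

# Sketch: survey example. Exact probability for 120 voters
# (40 support, 60 oppose, 20 no opinion) vs. the trinomial approximation
# with p = (1/3, 1/2, 1/6).
M1, M2, M3, n = 40, 60, 20, 10
N = M1 + M2 + M3

exact = comb(M1, 3) * comb(M2, 5) * comb(M3, 2) / comb(N, n)

coef = comb(10, 3) * comb(7, 5)            # 10!/(3! 5! 2!) = 2520
approx = coef * (1/3)**3 * (1/2)**5 * (1/6)**2
```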

### 3.3 Joint Continuous Distributions

A joint continuous distribution describes the joint variability of two or more continuous random variables.

#### 3.3.1 Bivariate Distribution, PDF, CDF, Marginal Distributions

Assume that X and Y are continuous random variables.

- The **joint cumulative distribution function** (joint CDF) of the random pair (X, Y) is defined as
  $$F(x, y) = P(X \le x, Y \le y) \quad \text{for all real pairs } (x, y),$$
  where the comma is understood to mean the intersection of events.
- If F(x, y) has continuous second partial derivatives, then the **joint probability density function** (joint PDF) of the random pair (X, Y) is defined as
  $$f(x, y) = \frac{\partial^2}{\partial x\, \partial y} F(x, y) = \frac{\partial^2}{\partial y\, \partial x} F(x, y)$$
  for all possible pairs (x, y).

**Computing probabilities using the joint PDF.** If R is the joint range of (X, Y) and $D \subseteq \mathbb{R}^2$, then the probability of the event "the value of (X, Y) is in the domain D" is obtained by finding the volume under the density surface z = f(x, y) for (x, y) in D ∩ R:
$$P((X, Y) \in D) = \iint_{D \cap R} f \, dA.$$

**Marginal distributions.** If X and Y are continuous random variables with joint PDF f(x, y), then:

- The **marginal probability density function** (marginal PDF) of X is
  $$f_X(x) = \int_{R_Y} f(x, y)\, dy \quad \text{for all real numbers } x,$$
  where $R_Y$ is the range of Y.
- The marginal PDF of Y is
  $$f_Y(y) = \int_{R_X} f(x, y)\, dx \quad \text{for all real numbers } y,$$
  where $R_X$ is the range of X.

**Exercise 1.** Let (X, Y) be the random pair with joint density function
$$f(x, y) = \tfrac{1}{8}(x^2 + y^2), \quad \text{for } (x, y) \in [-1, 2] \times [-1, 1],$$
and 0 otherwise.

(a) Find P(X > Y).
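A numerical sanity check for this density, sketched with a midpoint-rule grid: the same double sum verifies that f integrates to 1 over the rectangle, and, restricted to the region x > y, gives an estimate of the probability asked for in part (a):

```python
# Sketch: midpoint-rule check that f(x, y) = (x^2 + y^2)/8 integrates to 1
# over [-1, 2] x [-1, 1], plus a numerical estimate of P(X > Y).
def f(x, y):
    return (x**2 + y**2) / 8

m = 600                           # grid cells per axis
dx, dy = 3 / m, 2 / m             # x spans [-1, 2], y spans [-1, 1]
total = prob = 0.0
for i in range(m):
    x = -1 + (i + 0.5) * dx       # cell-center x
    for j in range(m):
        y = -1 + (j + 0.5) * dy   # cell-center y
        v = f(x, y) * dx * dy     # volume contribution of this cell
        total += v
        if x > y:
            prob += v
```

This confirms the density is valid and gives a value to check the hand-computed double integral against.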

(b) Completely specify the marginal PDF of X.

(c) Completely specify the marginal PDF of Y.

**Exercise 2.** Let (X, Y) be the random pair with joint density function
$$f(x, y) = \tfrac{1}{4} e^{-y/2}, \quad \text{for } 0 \le x \le y,$$
and 0 otherwise.

(a) Find P(X + Y < 5).

(b) Completely specify the marginal PDF of X.

(c) Completely specify the marginal PDF of Y.

#### 3.3.2 Conditional Distributions, Regression

Let X and Y be continuous random variables with joint PDF f(x, y) and marginal PDFs f_X(x) and f_Y(y), respectively.

- If $f_Y(y) \neq 0$, then the **conditional probability density function** (conditional PDF) of X given Y = y is defined as
  $$f_{X \mid Y=y}(x \mid y) = \frac{f(x, y)}{f_Y(y)} \quad \text{for all real numbers } x.$$
- If $f_X(x) \neq 0$, then the conditional PDF of Y given X = x is defined as
  $$f_{Y \mid X=x}(y \mid x) = \frac{f(x, y)}{f_X(x)} \quad \text{for all real numbers } y.$$

In each case, the marginal PDF value is used to re-scale a one-variable function so that it becomes a valid density function.

**Exercise 1, continued.** Let (X, Y) be the continuous random pair whose joint PDF is $f(x, y) = \tfrac{1}{8}(x^2 + y^2)$ for $(x, y) \in [-1, 2] \times [-1, 1]$, and 0 otherwise. Completely specify:

(1) The conditional PDF of X given Y = y.

(2) The conditional PDF of Y given X = x.

**Exercise 2, continued.** Let (X, Y) be the continuous random pair whose joint PDF is $f(x, y) = \tfrac{1}{4} e^{-y/2}$ for $0 \le x \le y$, and 0 otherwise. Completely specify:

(1) The conditional PDF of X given Y = y.

(2) The conditional PDF of Y given X = x.

**Regression.** Conditional distributions are often used as weights in weighted averages. For example, suppose that we would like to find the average value of Y for each possible x in the problem above. Then we would calculate a quantity known as the **conditional mean** (or conditional expectation) of Y given X = x:
$$E(Y \mid X = x) = \int_{x}^{\infty} y\, f_{Y \mid X=x}(y \mid x)\, dy.$$
The pairs (x, E(Y | X = x)) are shown in the plot on the right. Note that as x increases, so does the average value of Y given X = x. The collection of conditional means E(Y | X = x) is often called the **regression of Y on X**.

#### 3.3.3 Independent Random Variables

The continuous random variables X and Y are said to be **independent** if
$$F(x, y) = F_X(x)\, F_Y(y) \quad \text{for all real pairs } (x, y),$$
where F(x, y) is the joint CDF, and F_X(x) and F_Y(y) are the marginal CDFs; they are said to be **dependent** if $F(x, y) \neq F_X(x)\, F_Y(y)$ for at least one real pair (x, y).

Notes:

1. If F(x, y) has continuous second partial derivatives, then we can write an equivalent definition in terms of the density functions. That is, the continuous random variables X and Y are independent if
   $$f(x, y) = f_X(x)\, f_Y(y) \quad \text{for all real pairs } (x, y),$$
   where f(x, y) is the joint PDF, and f_X(x) and f_Y(y) are the marginal PDFs; they are dependent if $f(x, y) \neq f_X(x)\, f_Y(y)$ for at least one real pair (x, y).

2. Sometimes the context of a problem implies independence. For example, let X be the height of a randomly chosen woman living in the United States and Y be the height of a randomly chosen man living in the United States. Then the context of the problem tells us that X and Y are independent random variables.

3. If X and Y are independent, then conditional and marginal distributions are equal. To see this (please complete):

**Exercise.** Let X and Y be independent continuous random variables. Let

1. $D_1 \subseteq R_X$ be a subset of the range of nonzero density of X, and
2. $D_2 \subseteq R_Y$ be a subset of the range of nonzero density of Y.

Demonstrate that $P((X, Y) \in D_1 \times D_2) = P(X \in D_1)\, P(Y \in D_2)$.

#### 3.3.4 Bivariate Uniform Distribution

Let $R \subseteq \mathbb{R}^2$ be a region of finite positive area. The random pair (X, Y) is said to have a **bivariate uniform distribution** on R when its joint PDF has the form
$$f(x, y) = \frac{1}{\text{Area}(R)}, \quad \text{for } (x, y) \in R,$$
and 0 otherwise.

**Exercise.** Let (X, Y) have a bivariate uniform distribution on the region R. Demonstrate that the probability that (X, Y) lies in a subregion of R depends only on the area of the subregion.

**Exercise.** Let (X, Y) be the random pair with joint density function
$$f(x, y) = \frac{1}{\pi}, \quad \text{when } x^2 + y^2 \le 1,$$
and 0 otherwise.

(a) Use geometry to find P(X > Y).
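One way to set up part (b) of the disk exercise is as a circular-segment area divided by the disk area π; the sketch below computes that and cross-checks it with rejection-sampled Monte Carlo points (the central angle 2π/3 comes from cos(θ/2) = 1/2):

```python
import math
import random

# Sketch: bivariate uniform on the unit disk. P(X > 1/2) equals the area of
# the circular segment cut off by the chord x = 1/2, divided by pi.
theta = 2 * math.pi / 3                       # central angle of the chord x = 1/2
segment = (theta - math.sin(theta)) / 2       # segment area for radius r = 1
p_geometry = segment / math.pi

# Monte Carlo cross-check: rejection-sample uniform points in the disk.
rng = random.Random(0)
hits = inside = 0
while inside < 200_000:
    x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
    if x * x + y * y <= 1:
        inside += 1
        if x > 0.5:
            hits += 1
p_mc = hits / inside
```

By the same uniform-distribution reasoning, part (a) needs no integration at all: the line y = x splits the disk into two pieces of equal area.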

(b) Use geometry to find P(X > 1/2).

(c) Completely specify the marginal PDF of X.

**Exercise.** Let (X, Y) have a bivariate uniform distribution on R = [5, 10] × [0, 10].

(a) Find P(X < 2Y).

(b) Completely specify the marginal PDFs of X and Y.

(c) Are X and Y independent? Why?

#### 3.3.5 Bivariate Normal Distribution

Let μ_x and μ_y be real numbers, σ_x and σ_y be positive real numbers, and ρ be a number in the interval −1 < ρ < 1. The random pair (X, Y) is said to have a **bivariate normal distribution** with parameters μ_x, μ_y, σ_x, σ_y, and ρ when its joint PDF is
$$f(x, y) = \frac{1}{2\pi \sigma_x \sigma_y \sqrt{1 - \rho^2}} \exp\left( \frac{-\,\text{term}}{2(1 - \rho^2)} \right) \quad \text{for all real pairs } (x, y),$$
where
$$\text{term} = \left(\frac{x - \mu_x}{\sigma_x}\right)^2 - 2\rho \left(\frac{x - \mu_x}{\sigma_x}\right)\left(\frac{y - \mu_y}{\sigma_y}\right) + \left(\frac{y - \mu_y}{\sigma_y}\right)^2$$
and exp() is the exponential function.

Note that X is a normal random variable with mean μ_x and standard deviation σ_x, and Y is a normal random variable with mean μ_y and standard deviation σ_y.

**Correlation coefficient.** The parameter ρ is called the correlation coefficient. The correlation coefficient is a measure of the strength of the association between X and Y.

**Standard bivariate normal distribution.** If μ_x = μ_y = 0 and σ_x = σ_y = 1, then the random pair (X, Y) is said to have a standard bivariate normal distribution with parameter ρ. The joint density is
$$f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\left( \frac{-(x^2 - 2\rho x y + y^2)}{2(1 - \rho^2)} \right) \quad \text{for all real pairs } (x, y).$$
In this case, both X and Y are standard normal random variables.

The plots show the shape of the density surface for three different values of the correlation coefficient, including ρ = 7/10 and ρ = 0.

#### 3.3.6 Transforming Random Variables, CDF Method

Let (X, Y) be a continuous random pair and let W = g(X, Y), where g is a continuous function. Our goal is to determine the W distribution. The first steps will be to compute
$$P(W \le w) = P(g(X, Y) \le w)$$
as an expression in w, and to find $\frac{d}{dw} P(W \le w)$, for each w in the range of W.

**Exercise.** Let (X, Y) have a bivariate uniform distribution on the disk $R = \{(x, y) : x^2 + y^2 \le 1\}$, and let $W = \sqrt{X^2 + Y^2}$ be the distance of a random point from the origin. (Curves with distance w = 0.25, 0.50, 0.75, 1.00 are shown in the plot.) Completely specify the PDF of W.
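A sketch of the first step for this exercise: by the uniform-disk geometry, P(W ≤ w) is the area of the disk of radius w divided by π, so F_W(w) = w² for 0 ≤ w ≤ 1. A Monte Carlo check of that CDF:

```python
import random

# Sketch of the CDF method: for 0 <= w <= 1,
#   P(W <= w) = (area of disk of radius w) / pi = w^2,
# so differentiating gives the PDF. Monte Carlo check of the CDF:
rng = random.Random(1)
samples = []
while len(samples) < 200_000:
    x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
    if x * x + y * y <= 1:                       # keep points inside the disk
        samples.append((x * x + y * y) ** 0.5)   # W = sqrt(X^2 + Y^2)

def F_empirical(w):
    return sum(s <= w for s in samples) / len(samples)
```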

**Exercise.** Let X and Y be the lengths of adjacent sides of a rectangle, and let W = XY be the area of the rectangle. Assume that X has a uniform distribution on the interval (0, 2), Y has a uniform distribution on the interval (0, 1), and that X and Y are independent. (Curves for areas w = 0.2, 0.6, 1.0, 1.4 are shown in the plot.) Completely specify the PDF of W.

**Exercise.** Assume that (X, Y) has the joint PDF $f(x, y) = \tfrac{1}{4} e^{-y/2}$ when 0 < x < y, and 0 otherwise, and let W be the ratio W = Y/X. (Curves with ratios w = 2, 3, 4, 5 are shown in the plot.) Completely specify the PDF of W.

**Exercise.** Assume that (X, Y) has the joint PDF $f(x, y) = \frac{1}{x^2 y^2}$ when x > 1, y > 1, and 0 otherwise, and let W be the product W = XY. (Curves with products w = 2, 3, ..., 9 are shown in the plot.) Completely specify the PDF of W.

### 3.4 Multivariate Distributions, Random Samples

A multivariate distribution is the joint distribution of k random variables, X_1, X_2, ..., X_k. Ideas studied in the bivariate case (k = 2) can be generalized to the case where k > 2.

**Joint CDF:** The joint CDF of the random k-tuple (X_1, X_2, ..., X_k) is defined as
$$F(x_1, x_2, \ldots, x_k) = P(X_1 \le x_1, X_2 \le x_2, \ldots, X_k \le x_k)$$
for all $(x_1, x_2, \ldots, x_k) \in \mathbb{R}^k$, where commas are understood to mean the intersection of events.

**Joint PDF:** The definition of the joint PDF depends on the types of random variables:

1. Discrete case: If the X_i's are discrete, their joint PDF is defined as
   $$p(x_1, x_2, \ldots, x_k) = P(X_1 = x_1, X_2 = x_2, \ldots, X_k = x_k)$$
   for all $(x_1, x_2, \ldots, x_k) \in \mathbb{R}^k$, where commas are understood to mean intersection.

2. Continuous case: If the X_i's are continuous and $F(x_1, x_2, \ldots, x_k)$ has continuous k-th order partial derivatives, then the joint PDF is defined as
   $$f(x_1, x_2, \ldots, x_k) = \frac{\partial^k}{\partial x_1 \cdots \partial x_k} F(x_1, x_2, \ldots, x_k)$$
   for all possible $(x_1, x_2, \ldots, x_k)$.

#### 3.4.1 Multivariate Hypergeometric and Multinomial Distributions

The following two multivariate families generalize the bivariate families we studied earlier:

**Multivariate Hypergeometric Distribution:** Let n and M_i, for i = 1, ..., k, be positive integers with n < M_1 + M_2 + ... + M_k. The random k-tuple (X_1, X_2, ..., X_k) is said to have a multivariate hypergeometric distribution with parameters n and (M_1, M_2, ..., M_k) when its joint PDF has the form
$$p(x_1, x_2, \ldots, x_k) = \frac{\binom{M_1}{x_1}\binom{M_2}{x_2} \cdots \binom{M_k}{x_k}}{\binom{N}{n}},$$
where N = M_1 + M_2 + ... + M_k, each x_i is an integer satisfying 0 ≤ x_i ≤ min(n, M_i), and the sum of the x_i's is exactly n; otherwise, the joint PDF equals 0.

|        | Urn | Sample |
|--------|-----|--------|
| Type 1 | M_1 | X_1    |
| Type 2 | M_2 | X_2    |
| ...    | ... | ...    |
| Type k | M_k | X_k    |
| total  | N   | n      |

**Multinomial Distribution:** Let n be a positive integer, and let p_i, for i = 1, 2, ..., k, be positive proportions whose sum is exactly 1. The random k-tuple (X_1, X_2, ..., X_k) is said to have a multinomial distribution with parameters n and (p_1, p_2, ..., p_k) when its joint PDF has the form
$$p(x_1, x_2, \ldots, x_k) = \binom{n}{x_1,\ x_2,\ \ldots,\ x_k}\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$$
when each x_i is an integer satisfying 0 ≤ x_i ≤ n and the sum of the x_i's is exactly n; otherwise, the joint PDF equals 0.

|        | Probs. | Sample |
|--------|--------|--------|
| Type 1 | p_1    | X_1    |
| Type 2 | p_2    | X_2    |
| ...    | ...    | ...    |
| Type k | p_k    | X_k    |
| total  | 1      | n      |

Notes:

1. The multivariate hypergeometric distribution is used, for example, to model experiments where sampling is done without replacement from an urn containing exactly k types of objects, and the multinomial distribution is used to model experiments where sampling is done with replacement.

2. If N is large enough and each p_i = M_i/N is not too extreme, then the multivariate hypergeometric distribution is well-approximated by a multinomial distribution with $p_i = M_i/N$ for each i. (Thus, sampling with replacement can be used to approximate sampling without replacement.)

#### 3.4.2 Mutual Independence, Repeated Trials, Random Samples

X_1, X_2, ..., X_k are said to be **mutually independent** (or independent, when the context is clear) if
$$F(x_1, x_2, \ldots, x_k) = F_1(x_1) F_2(x_2) \cdots F_k(x_k)$$
for all $(x_1, x_2, \ldots, x_k) \in \mathbb{R}^k$, where $F_i(x_i) = P(X_i \le x_i)$ for i = 1, 2, ..., k.

1. Discrete case: Equivalently, the discrete X_i's are mutually independent if
   $$p(x_1, x_2, \ldots, x_k) = p_1(x_1) p_2(x_2) \cdots p_k(x_k)$$
   for all $(x_1, x_2, \ldots, x_k) \in \mathbb{R}^k$, where $p_i(x_i) = P(X_i = x_i)$ for i = 1, 2, ..., k.

2. Continuous case: If the joint PDF exists, then the continuous X_i's are mutually independent if
   $$f(x_1, x_2, \ldots, x_k) = f_1(x_1) f_2(x_2) \cdots f_k(x_k)$$
   for all possible $(x_1, x_2, \ldots, x_k)$, where $f_i(x_i)$ is the density function of X_i, for i = 1, 2, ..., k.
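The multinomial joint PDF above can be sketched by building the coefficient n!/(x_1! ⋯ x_k!) from binomial coefficients; the parameters below (k = 3, n = 5, p = (0.2, 0.3, 0.5)) are an arbitrary small example:

```python
from math import comb, prod

# Sketch: multinomial joint PDF p(x1,...,xk) = n!/(x1!...xk!) * prod p_i^x_i.
def multinomial_pmf(counts, probs):
    n = sum(counts)
    coef, remaining = 1, n
    for c in counts:               # n!/(x1! x2! ... xk!) built from binomials
        coef *= comb(remaining, c)
        remaining -= c
    return coef * prod(p**c for p, c in zip(probs, counts))

# sanity check on a small example: probabilities over all count vectors sum to 1
probs, n = (0.2, 0.3, 0.5), 5
total = sum(multinomial_pmf((a, b, n - a - b), probs)
            for a in range(n + 1) for b in range(n + 1 - a))
```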

**Exercise.** Suppose that an urn contains 4 slips of paper with the numbers 1, 2, 3, and 4 written on them. The urn is thoroughly mixed, one slip of paper is removed, and the number is recorded.

- Let X = 1 if slip 1 or 2 is drawn, and let X = 0 otherwise.
- Let Y = 1 if slip 1 or 3 is drawn, and let Y = 0 otherwise.
- Let Z = 1 if slip 1 or 4 is drawn, and let Z = 0 otherwise.

(a) Are X and Y independent? Are X and Z independent? Are Y and Z independent?

(b) Are X, Y, and Z mutually independent?
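The exercise can be checked numerically against the definitions; this sketch uses exact arithmetic with `Fraction` and compares every pairwise factorization with the full triple factorization (it confirms, rather than replaces, the hand calculation):

```python
from fractions import Fraction

# Sketch: pairwise vs. mutual independence for the slips exercise.
# Slips 1-4 are equally likely; (X, Y, Z) indicate slips {1,2}, {1,3}, {1,4}.
outcomes = {1: (1, 1, 1), 2: (1, 0, 0), 3: (0, 1, 0), 4: (0, 0, 1)}
p = Fraction(1, 4)

def prob(event):
    """Probability that the (x, y, z) triple satisfies the predicate."""
    return sum(p for t in outcomes.values() if event(t))

pairwise = all(
    prob(lambda t: t[i] == a and t[j] == b)
    == prob(lambda t: t[i] == a) * prob(lambda t: t[j] == b)
    for i, j in ((0, 1), (0, 2), (1, 2)) for a in (0, 1) for b in (0, 1)
)
mutual = all(
    prob(lambda t: t == (a, b, c))
    == prob(lambda t: t[0] == a) * prob(lambda t: t[1] == b) * prob(lambda t: t[2] == c)
    for a in (0, 1) for b in (0, 1) for c in (0, 1)
)
```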

**Mutual independence and probabilities:** If X_1, X_2, ..., X_k are mutually independent random variables, then
$$P((X_1, \ldots, X_k) \in D_1 \times D_2 \times \cdots \times D_k) = P(X_1 \in D_1)\, P(X_2 \in D_2) \cdots P(X_k \in D_k).$$

**Random samples:** If X_1, X_2, ..., X_k are mutually independent and have a common distribution (each marginal distribution is the same), then X_1, X_2, ..., X_k are said to be a **random sample** from that distribution.

**Repeated trials and random samples:** Consider k repetitions of an experiment, with the outcomes of the trials having no influence on one another, and let X_i be the result of the i-th repetition, for i = 1, 2, ..., k. Then the X_i's are mutually independent, and form a random sample from the common distribution.

**Exercise.** Let X be the length in inches of a rod produced by a manufacturing process. Assume that X has a uniform distribution on the interval [0.98, 1.02]. On five separate days, a quality control engineer chooses one rod randomly from those produced that day and measures its length. Find the probability that all 5 lengths are greater than 0.99 inches.
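A sketch of how the rod exercise uses these ideas: the five measurements form a random sample from Uniform[0.98, 1.02], so by mutual independence the desired probability is P(X > 0.99) raised to the fifth power:

```python
# Sketch: the five measured lengths are a random sample from
# Uniform[0.98, 1.02]; by mutual independence,
#   P(all five > 0.99) = P(X > 0.99)^5.
p_single = (1.02 - 0.99) / (1.02 - 0.98)   # P(X > 0.99) for one rod
p_all = p_single ** 5
```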

**Exercise.** Let X_1, X_2, X_3 be a random sample from the distribution with PDF:

| x    | 1 | 2 | 3 | 4 |
|------|---|---|---|---|
| p(x) |   |   |   |   |

Let Y = max(X_1, X_2, X_3) be the maximum of the three random numbers.

(a) Find P(Y ≤ y), for y = 1, 2, 3, 4.

(b) Use your answer to part (a) to find P(Y = y) for y = 1, 2, 3, 4.
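A sketch of the method for this exercise: since the X_i are independent with common CDF F, P(Y ≤ y) = F(y)³, and P(Y = y) = F(y)³ − F(y−1)³. The PMF below is a hypothetical placeholder (uniform on {1, 2, 3, 4}); the printed table's values should be substituted:

```python
from fractions import Fraction
from itertools import product

# Sketch: distribution of the maximum of a random sample of size 3.
# NOTE: hypothetical placeholder PMF, not the notebook's table.
pmf = {1: Fraction(1, 4), 2: Fraction(1, 4), 3: Fraction(1, 4), 4: Fraction(1, 4)}
F = {y: sum(p for x, p in pmf.items() if x <= y) for y in pmf}   # common CDF

cdf_max = {y: F[y] ** 3 for y in pmf}                            # part (a)
pmf_max = {y: cdf_max[y] - cdf_max.get(y - 1, 0) for y in pmf}   # part (b)

# brute-force check over all 4^3 equally likely triples
brute = {y: Fraction(sum(max(t) == y for t in product(pmf, repeat=3)), 4 ** 3)
         for y in pmf}
```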


Chapter 6 Statistics A Homework 7 Solutions Ryan Rosario. A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 5 percent will purchase

### Chapters 5. Multivariate Probability Distributions

Chapters 5. Multivariate Probability Distributions Random vectors are collection of random variables defined on the same sample space. Whenever a collection of random variables are mentioned, they are

### 6. Jointly Distributed Random Variables

6. Jointly Distributed Random Variables We are often interested in the relationship between two or more random variables. Example: A randomly chosen person may be a smoker and/or may get cancer. Definition.

### 4. Joint Distributions

Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 4. Joint Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space. Suppose

### Chapter 3 RANDOM VARIATE GENERATION

Chapter 3 RANDOM VARIATE GENERATION In order to do a Monte Carlo simulation either by hand or by computer, techniques must be developed for generating values of random variables having known distributions.

### Probability and Statistics Vocabulary List (Definitions for Middle School Teachers)

Probability and Statistics Vocabulary List (Definitions for Middle School Teachers) B Bar graph a diagram representing the frequency distribution for nominal or discrete data. It consists of a sequence

### 4. Joint Distributions of Two Random Variables

4. Joint Distributions of Two Random Variables 4.1 Joint Distributions of Two Discrete Random Variables Suppose the discrete random variables X and Y have supports S X and S Y, respectively. The joint

### Chapter 4 Lecture Notes

Chapter 4 Lecture Notes Random Variables October 27, 2015 1 Section 4.1 Random Variables A random variable is typically a real-valued function defined on the sample space of some experiment. For instance,

### Lecture 6: Discrete & Continuous Probability and Random Variables

Lecture 6: Discrete & Continuous Probability and Random Variables D. Alex Hughes Math Camp September 17, 2015 D. Alex Hughes (Math Camp) Lecture 6: Discrete & Continuous Probability and Random September

### WHERE DOES THE 10% CONDITION COME FROM?

1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay

### Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment 2-3, Probability and Statistics, March 2015. Due:-March 25, 2015.

Department of Mathematics, Indian Institute of Technology, Kharagpur Assignment -3, Probability and Statistics, March 05. Due:-March 5, 05.. Show that the function 0 for x < x+ F (x) = 4 for x < for x

### STT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables

Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Discrete vs. continuous random variables Examples of continuous distributions o Uniform o Exponential o Normal Recall: A random

### Examination 110 Probability and Statistics Examination

Examination 0 Probability and Statistics Examination Sample Examination Questions The Probability and Statistics Examination consists of 5 multiple-choice test questions. The test is a three-hour examination

### ST 371 (IV): Discrete Random Variables

ST 371 (IV): Discrete Random Variables 1 Random Variables A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical variable to each possible

### MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

### P (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i )

Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =

### Some probability and statistics

Appendix A Some probability and statistics A Probabilities, random variables and their distribution We summarize a few of the basic concepts of random variables, usually denoted by capital letters, X,Y,

### Stats on the TI 83 and TI 84 Calculator

Stats on the TI 83 and TI 84 Calculator Entering the sample values STAT button Left bracket { Right bracket } Store (STO) List L1 Comma Enter Example: Sample data are {5, 10, 15, 20} 1. Press 2 ND and

### Chapter 4. Probability and Probability Distributions

Chapter 4. robability and robability Distributions Importance of Knowing robability To know whether a sample is not identical to the population from which it was selected, it is necessary to assess the

### Lecture Notes 1. Brief Review of Basic Probability

Probability Review Lecture Notes Brief Review of Basic Probability I assume you know basic probability. Chapters -3 are a review. I will assume you have read and understood Chapters -3. Here is a very

### 6.4 Normal Distribution

Contents 6.4 Normal Distribution....................... 381 6.4.1 Characteristics of the Normal Distribution....... 381 6.4.2 The Standardized Normal Distribution......... 385 6.4.3 Meaning of Areas under

### Joint distributions Math 217 Probability and Statistics Prof. D. Joyce, Fall 2014

Joint distributions Math 17 Probability and Statistics Prof. D. Joyce, Fall 14 Today we ll look at joint random variables and joint distributions in detail. Product distributions. If Ω 1 and Ω are sample

### Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0

Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0 This primer provides an overview of basic concepts and definitions in probability and statistics. We shall

### MTH135/STA104: Probability

MTH135/STA14: Probability Homework # 8 Due: Tuesday, Nov 8, 5 Prof Robert Wolpert 1 Define a function f(x, y) on the plane R by { 1/x < y < x < 1 f(x, y) = other x, y a) Show that f(x, y) is a joint probability

### Algebra 2 Chapter 1 Vocabulary. identity - A statement that equates two equivalent expressions.

Chapter 1 Vocabulary identity - A statement that equates two equivalent expressions. verbal model- A word equation that represents a real-life problem. algebraic expression - An expression with variables.

### Senior Secondary Australian Curriculum

Senior Secondary Australian Curriculum Mathematical Methods Glossary Unit 1 Functions and graphs Asymptote A line is an asymptote to a curve if the distance between the line and the curve approaches zero

### 3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

### E3: PROBABILITY AND STATISTICS lecture notes

E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................

### You flip a fair coin four times, what is the probability that you obtain three heads.

Handout 4: Binomial Distribution Reading Assignment: Chapter 5 In the previous handout, we looked at continuous random variables and calculating probabilities and percentiles for those type of variables.

### MATH 10: Elementary Statistics and Probability Chapter 5: Continuous Random Variables

MATH 10: Elementary Statistics and Probability Chapter 5: Continuous Random Variables Tony Pourmohamad Department of Mathematics De Anza College Spring 2015 Objectives By the end of this set of slides,

### Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special Distributions-VI Today, I am going to introduce

### Joint Distributions. Tieming Ji. Fall 2012

Joint Distributions Tieming Ji Fall 2012 1 / 33 X : univariate random variable. (X, Y ): bivariate random variable. In this chapter, we are going to study the distributions of bivariate random variables

### 4. Continuous Random Variables, the Pareto and Normal Distributions

4. Continuous Random Variables, the Pareto and Normal Distributions A continuous random variable X can take any value in a given range (e.g. height, weight, age). The distribution of a continuous random

### Continuous Random Variables

Chapter 5 Continuous Random Variables 5.1 Continuous Random Variables 1 5.1.1 Student Learning Objectives By the end of this chapter, the student should be able to: Recognize and understand continuous

### Expectation Discrete RV - weighted average Continuous RV - use integral to take the weighted average

PHP 2510 Expectation, variance, covariance, correlation Expectation Discrete RV - weighted average Continuous RV - use integral to take the weighted average Variance Variance is the average of (X µ) 2

### 6 3 The Standard Normal Distribution

290 Chapter 6 The Normal Distribution Figure 6 5 Areas Under a Normal Distribution Curve 34.13% 34.13% 2.28% 13.59% 13.59% 2.28% 3 2 1 + 1 + 2 + 3 About 68% About 95% About 99.7% 6 3 The Distribution Since

### ( ) = P Z > = P( Z > 1) = 1 Φ(1) = 1 0.8413 = 0.1587 P X > 17

4.6 I company that manufactures and bottles of apple juice uses a machine that automatically fills 6 ounce bottles. There is some variation, however, in the amounts of liquid dispensed into the bottles

### Probability distributions

Probability distributions (Notes are heavily adapted from Harnett, Ch. 3; Hayes, sections 2.14-2.19; see also Hayes, Appendix B.) I. Random variables (in general) A. So far we have focused on single events,

### Chi-Square Test. Contingency Tables. Contingency Tables. Chi-Square Test for Independence. Chi-Square Tests for Goodnessof-Fit

Chi-Square Tests 15 Chapter Chi-Square Test for Independence Chi-Square Tests for Goodness Uniform Goodness- Poisson Goodness- Goodness Test ECDF Tests (Optional) McGraw-Hill/Irwin Copyright 2009 by The

### MATH 201. Final ANSWERS August 12, 2016

MATH 01 Final ANSWERS August 1, 016 Part A 1. 17 points) A bag contains three different types of dice: four 6-sided dice, five 8-sided dice, and six 0-sided dice. A die is drawn from the bag and then rolled.

### For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )

Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll

### L10: Probability, statistics, and estimation theory

L10: Probability, statistics, and estimation theory Review of probability theory Bayes theorem Statistics and the Normal distribution Least Squares Error estimation Maximum Likelihood estimation Bayesian

### Infinite Algebra 1 supports the teaching of the Common Core State Standards listed below.

Infinite Algebra 1 Kuta Software LLC Common Core Alignment Software version 2.05 Last revised July 2015 Infinite Algebra 1 supports the teaching of the Common Core State Standards listed below. High School

### Exploratory Data Analysis

Exploratory Data Analysis Johannes Schauer johannes.schauer@tugraz.at Institute of Statistics Graz University of Technology Steyrergasse 17/IV, 8010 Graz www.statistics.tugraz.at February 12, 2008 Introduction

### 3. Continuous Random Variables

3. Continuous Random Variables A continuous random variable is one which can take any value in an interval (or union of intervals) The values that can be taken by such a variable cannot be listed. Such

### Section 6.1 Joint Distribution Functions

Section 6.1 Joint Distribution Functions We often care about more than one random variable at a time. DEFINITION: For any two random variables X and Y the joint cumulative probability distribution function

### Algebra I Vocabulary Cards

Algebra I Vocabulary Cards Table of Contents Expressions and Operations Natural Numbers Whole Numbers Integers Rational Numbers Irrational Numbers Real Numbers Absolute Value Order of Operations Expression

### Lecture 7: Continuous Random Variables

Lecture 7: Continuous Random Variables 21 September 2005 1 Our First Continuous Random Variable The back of the lecture hall is roughly 10 meters across. Suppose it were exactly 10 meters, and consider

### MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. A) 0.4987 B) 0.9987 C) 0.0010 D) 0.

Ch. 5 Normal Probability Distributions 5.1 Introduction to Normal Distributions and the Standard Normal Distribution 1 Find Areas Under the Standard Normal Curve 1) Find the area under the standard normal

### Principle of Data Reduction

Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then

### Probabilities and Random Variables

Probabilities and Random Variables This is an elementary overview of the basic concepts of probability theory. 1 The Probability Space The purpose of probability theory is to model random experiments so

### Lesson 20. Probability and Cumulative Distribution Functions

Lesson 20 Probability and Cumulative Distribution Functions Recall If p(x) is a density function for some characteristic of a population, then Recall If p(x) is a density function for some characteristic

### Covariance and Correlation

Covariance and Correlation ( c Robert J. Serfling Not for reproduction or distribution) We have seen how to summarize a data-based relative frequency distribution by measures of location and spread, such

### FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL STATIsTICs 4 IV. RANDOm VECTORs 1. JOINTLY DIsTRIBUTED RANDOm VARIABLEs If are two rom variables defined on the same sample space we define the joint

### Institute of Actuaries of India Subject CT3 Probability and Mathematical Statistics

Institute of Actuaries of India Subject CT3 Probability and Mathematical Statistics For 2015 Examinations Aim The aim of the Probability and Mathematical Statistics subject is to provide a grounding in

### MATH2210 Notebook 1 Fall Semester 2016/2017. 1 MATH2210 Notebook 1 3. 1.1 Solving Systems of Linear Equations... 3

MATH0 Notebook Fall Semester 06/07 prepared by Professor Jenny Baglivo c Copyright 009 07 by Jenny A. Baglivo. All Rights Reserved. Contents MATH0 Notebook 3. Solving Systems of Linear Equations........................

### MATHEMATICS FOR ENGINEERS STATISTICS TUTORIAL 4 PROBABILITY DISTRIBUTIONS

MATHEMATICS FOR ENGINEERS STATISTICS TUTORIAL 4 PROBABILITY DISTRIBUTIONS CONTENTS Sample Space Accumulative Probability Probability Distributions Binomial Distribution Normal Distribution Poisson Distribution

### Technology Step-by-Step Using StatCrunch

Technology Step-by-Step Using StatCrunch Section 1.3 Simple Random Sampling 1. Select Data, highlight Simulate Data, then highlight Discrete Uniform. 2. Fill in the following window with the appropriate

### Recitation 4. 24xy for 0 < x < 1, 0 < y < 1, x + y < 1 0 elsewhere

Recitation. Exercise 3.5: If the joint probability density of X and Y is given by xy for < x

### Probability. Distribution. Outline

7 The Normal Probability Distribution Outline 7.1 Properties of the Normal Distribution 7.2 The Standard Normal Distribution 7.3 Applications of the Normal Distribution 7.4 Assessing Normality 7.5 The

### Lecture 4: Probability Distributions and Probability Densities - 2

Lecture 4: Probability Distributions and Probability Densities - 2 Assist. Prof. Dr. Emel YAVUZ DUMAN MCB1007 Introduction to Probability and Statistics İstanbul Kültür University Outline 1 Multivariate

### Chapter 4 - Lecture 1 Probability Density Functions and Cumul. Distribution Functions

Chapter 4 - Lecture 1 Probability Density Functions and Cumulative Distribution Functions October 21st, 2009 Review Probability distribution function Useful results Relationship between the pdf and the

### WEEK #22: PDFs and CDFs, Measures of Center and Spread

WEEK #22: PDFs and CDFs, Measures of Center and Spread Goals: Explore the effect of independent events in probability calculations. Present a number of ways to represent probability distributions. Textbook

### MAT 1000. Mathematics in Today's World

MAT 1000 Mathematics in Today's World We talked about Cryptography Last Time We will talk about probability. Today There are four rules that govern probabilities. One good way to analyze simple probabilities

### Math 151. Rumbos Spring 2014 1. Solutions to Assignment #22

Math 151. Rumbos Spring 2014 1 Solutions to Assignment #22 1. An experiment consists of rolling a die 81 times and computing the average of the numbers on the top face of the die. Estimate the probability

### The Binomial Probability Distribution

The Binomial Probability Distribution MATH 130, Elements of Statistics I J. Robert Buchanan Department of Mathematics Fall 2015 Objectives After this lesson we will be able to: determine whether a probability

### The Big 50 Revision Guidelines for S1

The Big 50 Revision Guidelines for S1 If you can understand all of these you ll do very well 1. Know what is meant by a statistical model and the Modelling cycle of continuous refinement 2. Understand

### PROBABILITIES AND PROBABILITY DISTRIBUTIONS

Published in "Random Walks in Biology", 1983, Princeton University Press PROBABILITIES AND PROBABILITY DISTRIBUTIONS Howard C. Berg Table of Contents PROBABILITIES PROBABILITY DISTRIBUTIONS THE BINOMIAL

### Worked examples Basic Concepts of Probability Theory

Worked examples Basic Concepts of Probability Theory Example 1 A regular tetrahedron is a body that has four faces and, if is tossed, the probability that it lands on any face is 1/4. Suppose that one

### High School Algebra 1 Common Core Standards & Learning Targets

High School Algebra 1 Common Core Standards & Learning Targets Unit 1: Relationships between Quantities and Reasoning with Equations CCS Standards: Quantities N-Q.1. Use units as a way to understand problems

### Density Curve. A density curve is the graph of a continuous probability distribution. It must satisfy the following properties:

Density Curve A density curve is the graph of a continuous probability distribution. It must satisfy the following properties: 1. The total area under the curve must equal 1. 2. Every point on the curve

### CALCULATIONS & STATISTICS

CALCULATIONS & STATISTICS CALCULATION OF SCORES Conversion of 1-5 scale to 0-100 scores When you look at your report, you will notice that the scores are reported on a 0-100 scale, even though respondents

### 1) What is the probability that the random variable has a value greater than 2? A) 0.750 B) 0.625 C) 0.875 D) 0.700

Practice for Chapter 6 & 7 Math 227 This is merely an aid to help you study. The actual exam is not multiple choice nor is it limited to these types of questions. Using the following uniform density curve,

### Math 150 Sample Exam #2

Problem 1. (16 points) TRUE or FALSE. a. 3 die are rolled, there are 1 possible outcomes. b. If two events are complementary, then they are mutually exclusive events. c. If A and B are two independent

### Covariance and Correlation. Consider the joint probability distribution f XY (x, y).

Chapter 5: JOINT PROBABILITY DISTRIBUTIONS Part 2: Section 5-2 Covariance and Correlation Consider the joint probability distribution f XY (x, y). Is there a relationship between X and Y? If so, what kind?

### STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE

STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE TROY BUTLER 1. Random variables and distributions We are often presented with descriptions of problems involving some level of uncertainty about

### Montana Common Core Standard

Algebra I Grade Level: 9, 10, 11, 12 Length: 1 Year Period(s) Per Day: 1 Credit: 1 Credit Requirement Fulfilled: A must pass course Course Description This course covers the real number system, solving

### The random variable X - the no. of defective items when three electronic components are tested would be

RANDOM VARIABLES and PROBABILITY DISTRIBUTIONS Example: Give the sample space giving a detailed description of each possible outcome when three electronic components are tested, where N - denotes nondefective

### Joint Probability Distributions and Random Samples. Week 5, 2011 Stat 4570/5570 Material from Devore s book (Ed 8), and Cengage

5 Joint Probability Distributions and Random Samples Week 5, 2011 Stat 4570/5570 Material from Devore s book (Ed 8), and Cengage Two Discrete Random Variables The probability mass function (pmf) of a single

### ECE302 Spring 2006 HW5 Solutions February 21, 2006 1

ECE3 Spring 6 HW5 Solutions February 1, 6 1 Solutions to HW5 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics

### Chapter 2, part 2. Petter Mostad

Chapter 2, part 2 Petter Mostad mostad@chalmers.se Parametrical families of probability distributions How can we solve the problem of learning about the population distribution from the sample? Usual procedure:

### A Primer on Mathematical Statistics and Univariate Distributions; The Normal Distribution; The GLM with the Normal Distribution

A Primer on Mathematical Statistics and Univariate Distributions; The Normal Distribution; The GLM with the Normal Distribution PSYC 943 (930): Fundamentals of Multivariate Modeling Lecture 4: September

### Probability Distributions

CHAPTER 6 Probability Distributions Calculator Note 6A: Computing Expected Value, Variance, and Standard Deviation from a Probability Distribution Table Using Lists to Compute Expected Value, Variance,

### Normal distribution. ) 2 /2σ. 2π σ

Normal distribution The normal distribution is the most widely known and used of all distributions. Because the normal distribution approximates many natural phenomena so well, it has developed into a

### SYSM 6304: Risk and Decision Analysis Lecture 3 Monte Carlo Simulation

SYSM 6304: Risk and Decision Analysis Lecture 3 Monte Carlo Simulation M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas Email: M.Vidyasagar@utdallas.edu September 19, 2015 Outline