MAS 108 Probability I

Notes 11    Autumn 2005

Two discrete random variables

If X and Y are discrete random variables defined on the same sample space, then events such as "X = x and Y = y" are well defined. The joint distribution of X and Y is a list of the probabilities

    p_{X,Y}(x,y) = P(X = x and Y = y)

for all values of x and y that occur. The list is usually shown as a two-way table. This table is called the joint probability mass function of X and Y. We abbreviate p_{X,Y}(x,y) to p(x,y) if X and Y can be understood from the context.

Example (Muffins: part 1) I have a bag containing 5 chocolate-chip muffins, 3 blueberry muffins and 2 lemon muffins. I choose three muffins from this bag at random. Let X be the number of lemon muffins chosen and Y be the number of blueberry muffins chosen. We calculate the values of the joint probability mass function. First note that |S| = 10C3 = 120. Then

    P(X = 0 and Y = 0) = P(3 chocolate-chip are chosen) = 5C3/120 = 10/120,
    P(X = 0 and Y = 1) = P(1 blueberry and 2 chocolate-chip are chosen) = (3C1 × 5C2)/120 = 30/120.
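The whole joint p.m.f. can be checked mechanically from the same counting argument. The following short Python sketch is an editorial aid, not part of the original notes; the names total and p are our own.

    from math import comb

    # Bag: 5 chocolate-chip, 3 blueberry, 2 lemon; choose 3 of the 10 muffins.
    # X = number of lemon muffins chosen, Y = number of blueberry muffins chosen.
    total = comb(10, 3)  # |S| = 120 equally likely selections

    def p(x, y):
        # p_{X,Y}(x,y) by counting: x lemon from 2, y blueberry from 3,
        # and the remaining 3 - x - y chocolate-chip from 5.
        if x < 0 or y < 0 or x + y > 3:
            return 0
        return comb(2, x) * comb(3, y) * comb(5, 3 - x - y) / total

    print(p(0, 0))  # 10/120
    print(p(0, 1))  # 30/120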

The other values are found by similar calculations, giving the following table.

                          Values of Y
                        0       1       2       3
                  0   10/120  30/120  15/120   1/120
    Values of X   1   20/120  30/120   6/120     0
                  2    5/120   3/120     0       0

Of course, we check that the probabilities in the table add up to 1.

To obtain the distributions of X and Y individually, find the row sums and the column sums: these give their probability distributions. The marginal probability mass function p_X of X is the list of probabilities

    p_X(x) = P(X = x) = Σ_y p_{X,Y}(x,y)

for all values of x which occur. Similarly, the marginal probability mass function p_Y of Y is given by

    p_Y(y) = P(Y = y) = Σ_x p_{X,Y}(x,y).

The distributions of X and Y are said to be marginal to the joint distribution. They are just ordinary distributions.

Example (Muffins: part 2) Here the marginal p.m.f. of X is

    x         0       1       2
    p_X(x)  56/120  56/120   8/120

and the marginal p.m.f. of Y is

    y         0       1       2       3
    p_Y(y)  35/120  63/120  21/120   1/120

Therefore E(X) = 3/5 and E(Y) = 9/10.

We can define events in terms of X and Y, such as "X < Y" or "X + Y = 3". To find the probability of such an event, find the probability of the set of all pairs (x,y) for which the statement is true. We can also define new random variables as functions of X and Y.
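The row and column sums, and the two expectations, can be verified with a few lines continuing the sketch above (again an editorial addition, reusing the function p defined there).

    # Marginals as row and column sums of the joint p.m.f.
    p_X = {x: sum(p(x, y) for y in range(4)) for x in range(3)}
    p_Y = {y: sum(p(x, y) for x in range(3)) for y in range(4)}

    assert abs(sum(p_X.values()) - 1) < 1e-12  # the table adds up to 1

    E_X = sum(x * px for x, px in p_X.items())
    E_Y = sum(y * py for y, py in p_Y.items())
    print(E_X, E_Y)  # 0.6 = 3/5 and 0.9 = 9/10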

Proposition 10 If g is a real function of two variables then

    E(g(X,Y)) = Σ_x Σ_y g(x,y) p_{X,Y}(x,y).

The proof is just like the proof of Proposition 8.

Example (Muffins: part 3) Put U = X + Y and V = XY. Then

    E(U) = 3/2 = E(X) + E(Y).

However,

    E(V) = 2/5 ≠ 27/50 = E(X)E(Y).

Theorem 17

(a) E(X + Y) = E(X) + E(Y) always.

(b) E(XY) is not necessarily equal to E(X)E(Y).

Proof (a)

    E(X + Y) = Σ_x Σ_y (x + y) p_{X,Y}(x,y)                       by Proposition 10
             = Σ_x Σ_y x p_{X,Y}(x,y) + Σ_x Σ_y y p_{X,Y}(x,y)
             = Σ_x x p_X(x) + Σ_y y p_Y(y)
             = E(X) + E(Y).

(b) A single counter-example is all that we need. In the muffins example we saw that E(XY) ≠ E(X)E(Y).

Independence

Random variables X and Y defined on the same sample space are defined to be independent of each other if the events "X = x" and "Y = y" are independent for all values of x and y; that is,

    p_{X,Y}(x,y) = p_X(x) p_Y(y) for all x and y.
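Proposition 10 translates directly into code and confirms both muffin calculations; this sketch reuses p(x, y) from the earlier editorial example.

    # E(g(X,Y)) = sum over x and y of g(x,y) p(x,y), as in Proposition 10.
    def E(g):
        return sum(g(x, y) * p(x, y) for x in range(3) for y in range(4))

    print(E(lambda x, y: x + y))  # 1.5 = 3/2 = E(X) + E(Y)
    print(E(lambda x, y: x * y))  # 0.4 = 2/5, whereas E(X)E(Y) = 0.54 = 27/50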

Example (Muffins: part 4) Here P(X = 0) = 7/15 and P(Y = 0) = 35/120, but

    P(X = 0 and Y = 0) = 10/120 ≠ (7/15) × (35/120) = P(X = 0)P(Y = 0),

so X and Y are not independent of each other.

Note that a single pair of values x and y where the probabilities do not multiply is enough to show that X and Y are not independent. On the other hand, if I roll a die twice, and X and Y are the numbers that come up on the first and second throws, then X and Y will be independent, even if the die is not fair (so that the outcomes are not all equally likely).

If we have more than two random variables (for example X, Y, Z), we say that they are mutually independent if the events that the random variables take specific values (for example, "X = a", "Y = b", "Z = c") are mutually independent.

Theorem 18 If X and Y are independent of each other then E(XY) = E(X)E(Y).

Proof

    E(XY) = Σ_x Σ_y xy p_{X,Y}(x,y)
          = Σ_x Σ_y xy p_X(x) p_Y(y)           if X and Y are independent
          = [Σ_x x p_X(x)] [Σ_y y p_Y(y)]
          = E(X)E(Y).

Note that the converse is not true, as the following example shows.

Example Suppose that X and Y have the joint p.m.f. in this table.

                        Values of Y
                      -1      0      1
    Values of X  0     0     1/2     0
                 1    1/4     0     1/4

The marginal distributions are

    x         0     1
    p_X(x)   1/2   1/2
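For a finite joint p.m.f., independence can be tested exhaustively: compare p(x,y) with p_X(x)p_Y(y) for every pair, and one failure settles the matter. A sketch for the muffin distribution, reusing the earlier editorial definitions:

    # X and Y are independent iff p(x,y) = p_X(x) p_Y(y) for every pair (x,y).
    independent = all(
        abs(p(x, y) - p_X[x] * p_Y[y]) < 1e-12
        for x in range(3) for y in range(4)
    )
    print(independent)  # False: p(0,0) = 10/120 but p_X[0] * p_Y[0] = 49/360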

and

    y        -1     0     1
    p_Y(y)   1/4   1/2   1/4

Therefore P(X = 0 and Y = 1) = 0 but P(X = 0)P(Y = 1) = (1/2) × (1/4) ≠ 0, so X and Y are not independent of each other. However, E(X) = 1/2 and E(Y) = 0, so E(X)E(Y) = 0, while

    E(XY) = (1)(-1)(1/4) + (1)(1)(1/4) = 0,

so E(XY) = E(X)E(Y).

Covariance and correlation

A measure of the joint spread of X and Y is the covariance of X and Y, which is defined by

    Cov(X,Y) = E((X - µ_X)(Y - µ_Y)),

where µ_X = E(X) and µ_Y = E(Y).

Theorem 19 (Properties of covariance)

(a) Cov(X,X) = Var(X).

(b) Cov(X,Y) = E(XY) - E(X)E(Y).

(c) If a is a constant then Cov(aX,Y) = Cov(X,aY) = a Cov(X,Y).

(d) If b is a constant then Cov(X,Y + b) = Cov(X + b,Y) = Cov(X,Y).

(e) If X and Y are independent, then Cov(X,Y) = 0.

Proof (a) This follows directly from the definition.

(b)

    Cov(X,Y) = E((X - µ_X)(Y - µ_Y))
             = E(XY - µ_X Y - µ_Y X + µ_X µ_Y)
             = E(XY) + E(-µ_X Y) + E(-µ_Y X) + E(µ_X µ_Y)    by Theorem 17(a)
             = E(XY) - µ_X E(Y) - µ_Y E(X) + µ_X µ_Y         by Theorem 4
             = E(XY) - E(X)E(Y) - E(Y)E(X) + E(X)E(Y)
             = E(XY) - E(X)E(Y).
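Part (b) is the formula normally used in calculations. Here is a quick check that the definition and part (b) agree on the muffin distribution, reusing E from the earlier editorial sketch:

    mu_X, mu_Y = E(lambda x, y: x), E(lambda x, y: y)

    # Definition: Cov(X,Y) = E((X - mu_X)(Y - mu_Y))
    cov_def = E(lambda x, y: (x - mu_X) * (y - mu_Y))
    # Theorem 19(b): Cov(X,Y) = E(XY) - E(X)E(Y)
    cov_alt = E(lambda x, y: x * y) - mu_X * mu_Y

    print(cov_def, cov_alt)  # both -0.14 = 2/5 - 27/50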

(c) From part (b), Cov(aX,Y) = E(aXY) - E(aX)E(Y). From Theorem 4, E(aXY) = aE(XY) and E(aX) = aE(X). Therefore

    Cov(aX,Y) = aE(XY) - aE(X)E(Y) = a(E(XY) - E(X)E(Y)) = a Cov(X,Y).

(d) Theorem 4 shows that E(Y + b) = E(Y) + b, so we have (Y + b) - E(Y + b) = Y - E(Y). Therefore

    Cov(X,Y + b) = E[(X - E(X))(Y + b - E(Y + b))] = E[(X - E(X))(Y - E(Y))] = Cov(X,Y).

(e) From part (b), Cov(X,Y) = E(XY) - E(X)E(Y). Theorem 18 shows that E(XY) - E(X)E(Y) = 0 if X and Y are independent of each other.

The covariance of X and Y is the expected value of the product of the distance of X from its mean and the distance of Y from its mean. Therefore, if X - µ_X and Y - µ_Y tend to be either both positive or both negative then Cov(X,Y) is positive. On the other hand, if X - µ_X and Y - µ_Y tend to have opposite signs then Cov(X,Y) is negative.

If Var(X) or Var(Y) is large then Cov(X,Y) may also be large; in fact, multiplying X by a constant a multiplies Var(X) by a^2 and Cov(X,Y) by a. To avoid this dependence on scale, we define the correlation coefficient corr(X,Y), which is just a normalised version of the covariance. It is defined as follows:

    corr(X,Y) = Cov(X,Y) / √(Var(X) Var(Y)).

The point of this is the first and last parts of the following theorem.

Theorem (Properties of correlation) Let X and Y be random variables. Then

(a) -1 ≤ corr(X,Y) ≤ 1;

(b) if X and Y are independent, then corr(X,Y) = 0;

(c) if Y = mX + c for some constants m ≠ 0 and c, then corr(X,Y) = 1 if m > 0, and corr(X,Y) = -1 if m < 0; this is the only way in which corr(X,Y) can be equal to ±1;

(d) corr(X,Y) is independent of the units of measurement, in the sense that if X or Y is multiplied by a constant, or has a constant added to it, then corr(X,Y) is unchanged.
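For the muffin example the correlation coefficient can now be computed directly (editorial sketch, continuing from the covariance code above):

    from math import sqrt

    var_X = E(lambda x, y: (x - mu_X) ** 2)
    var_Y = E(lambda x, y: (y - mu_Y) ** 2)

    corr = cov_def / sqrt(var_X * var_Y)
    print(corr)  # about -0.33: choosing more lemon muffins leaves room for fewer blueberry ones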

The proof of the first part won't be given here. But note that this is another check on our calculations: if you calculate a correlation coefficient which is bigger than 1 or smaller than -1, then you have made a mistake.

Part (b) follows immediately from part (e) of the preceding theorem.

For part (c), suppose that Y = mX + c. Let Var(X) = α. Now we just calculate everything in sight.

    Var(Y) = Var(mX + c) = Var(mX) = m^2 Var(X) = m^2 α
    Cov(X,Y) = Cov(X,mX + c) = Cov(X,mX) = m Cov(X,X) = mα
    corr(X,Y) = Cov(X,Y) / √(Var(X) Var(Y)) = mα / √(m^2 α^2)
              = +1 if m > 0,  -1 if m < 0.

The proof of the converse will not be given here. Part (d) follows from Theorems 4, 5 and 19.

Thus the correlation coefficient is a measure of the extent to which the two variables are linearly related. It is +1 if Y increases linearly with X; 0 if there is no linear relation between them; and -1 if Y decreases linearly as X increases. More generally, a positive correlation indicates a tendency for larger X values to be associated with larger Y values; a negative value, for smaller X values to be associated with larger Y values.

We call two random variables X and Y uncorrelated if Cov(X,Y) = 0 (in other words, if corr(X,Y) = 0). The preceding theorem and example show that we can say:

    Independent random variables are uncorrelated, but uncorrelated random variables need not be independent.

Example In some years there are two tests in the Probability class. We can take as the sample space the set of all students who take both tests, and choose a student at random. Put X(s) = the mark of student s on Test 1 and Y(s) = the mark of student s on Test 2. We would expect a student who scores better than average on Test 1 to do so again on Test 2, and one who scores worse than average on Test 1 to do so again on Test 2, so there should be a positive correlation between X and Y. However, we would not expect the Test 2 scores to be perfectly predictable as a linear function of the Test 1 scores. The marks for one particular year are shown in Figure 1. The correlation is 0.69 to 2 decimal places.

[Figure 1: Marks on two Probability tests. Scatter plot of each student's Test 2 mark against their Test 1 mark.]

Theorem 20 If X and Y are random variables and a and b are constants then

    Var(aX + bY) = a^2 Var(X) + 2ab Cov(X,Y) + b^2 Var(Y).

Proof First,

    E(aX + bY) = E(aX) + E(bY)    by Theorem 17
               = aE(X) + bE(Y)    by Theorem 4
               = aµ_X + bµ_Y.

Then

    Var(aX + bY)
      = E[(aX + bY - E(aX + bY))^2]                                            by definition of variance
      = E[(aX + bY - aµ_X - bµ_Y)^2]
      = E[(a(X - µ_X) + b(Y - µ_Y))^2]
      = E[a^2 (X - µ_X)^2 + 2ab(X - µ_X)(Y - µ_Y) + b^2 (Y - µ_Y)^2]
      = E[a^2 (X - µ_X)^2] + E[2ab(X - µ_X)(Y - µ_Y)] + E[b^2 (Y - µ_Y)^2]     by Theorem 17
      = a^2 E[(X - µ_X)^2] + 2ab E[(X - µ_X)(Y - µ_Y)] + b^2 E[(Y - µ_Y)^2]    by Theorem 4
      = a^2 Var(X) + 2ab Cov(X,Y) + b^2 Var(Y).

Corollary If X and Y are independent, then

(a) Var(X + Y) = Var(X) + Var(Y);

(b) Var(X - Y) = Var(X) + Var(Y).
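The theorem is easy to sanity-check numerically on the muffin distribution (editorial sketch reusing E, cov_def, var_X and var_Y from above; a and b are arbitrary test constants):

    a, b = 2.0, -3.0

    # Left side: Var(aX + bY) computed directly from the joint p.m.f.
    mu_W = E(lambda x, y: a * x + b * y)
    lhs = E(lambda x, y: (a * x + b * y - mu_W) ** 2)

    # Right side: a^2 Var(X) + 2ab Cov(X,Y) + b^2 Var(Y)
    rhs = a ** 2 * var_X + 2 * a * b * cov_def + b ** 2 * var_Y

    print(lhs, rhs)  # the two sides agree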

Mean and variance of binomial

Here is the third way of finding the mean and variance of a binomial distribution, which is more sophisticated than the methods in Notes 8 yet easier than both. If X ~ Bin(n, p) then X can be thought of as the number of successes in n mutually independent Bernoulli trials, each of which has probability p of success. For i = 1, ..., n, let X_i be the number of successes at the i-th trial. Then X_i ~ Bernoulli(p), so E(X_i) = p and Var(X_i) = pq, where q = 1 - p. Moreover, the random variables X_1, ..., X_n are mutually independent.

By Theorem 17,

    E(X) = E(X_1) + E(X_2) + ... + E(X_n) = p + p + ... + p (n times) = np.

By the Corollary to Theorem 20,

    Var(X) = Var(X_1) + Var(X_2) + ... + Var(X_n) = pq + pq + ... + pq (n times) = npq.

Note that if also Y ~ Bin(m, p), and X and Y are independent of each other, then Y = X_{n+1} + ... + X_{n+m}, where X_{n+1}, ..., X_{n+m} are mutually independent Bernoulli(p) random variables which are all independent of X_1, ..., X_n, so

    X + Y = X_1 + ... + X_n + X_{n+1} + ... + X_{n+m},

which is the sum of n + m mutually independent Bernoulli(p) random variables, so X + Y ~ Bin(n + m, p).
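A short simulation makes the conclusion concrete (an editorial sketch, not from the notes; n, p_succ and trials are arbitrary choices):

    import random

    n, p_succ, trials = 20, 0.3, 100_000
    random.seed(1)

    # Each sample of X is a sum of n independent Bernoulli(p_succ) indicators.
    samples = [sum(random.random() < p_succ for _ in range(n))
               for _ in range(trials)]

    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    print(mean, n * p_succ)                # close to np = 6.0
    print(var, n * p_succ * (1 - p_succ))  # close to npq = 4.2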
