Joint Distributions. Tieming Ji. Fall 2012

1 Joint Distributions. Tieming Ji. Fall 2012.

2 X: univariate random variable. (X, Y): bivariate random variable. In this chapter, we are going to study the distributions of bivariate random variables, called joint distributions, in both the discrete and continuous cases. Motivation: Many problems require the study of two random variables at the same time. For example, studying the effect of pesticide application time on soybean yield; the relationship between population size and crime rate; the phenotype and genotype in agronomy studies; the yield of a chemical reaction in conjunction with the temperature at which the reaction is run, etc.

3 Discrete Cases
Definition: Let X and Y be discrete random variables. The ordered pair (X, Y) is called a two-dimensional discrete random variable. The joint (probability) density function f for (X, Y) is defined as f(x, y) = P(X = x, Y = y).
Definition: Function f is a valid discrete joint density function if and only if f(x, y) ≥ 0 for all x, y ∈ R, and Σ_{x∈R} Σ_{y∈R} f(x, y) = 1.
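
The two conditions are easy to check mechanically for any finite table of probabilities. A minimal Python sketch (the pmf values below are made up for illustration, not taken from the slides):

```python
# Check the two validity conditions for a candidate discrete joint density.
# The table of probabilities here is hypothetical.
joint = {
    (0, 0): 0.5, (0, 1): 0.2,
    (1, 0): 0.2, (1, 1): 0.1,
}

nonnegative = all(p >= 0 for p in joint.values())        # f(x, y) >= 0 everywhere
sums_to_one = abs(sum(joint.values()) - 1.0) < 1e-12     # sum over all (x, y) equals 1
print(nonnegative and sums_to_one)                       # True -> valid joint pmf
```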

4 Example (book page 157): In an automobile plant two tasks are performed by robots. The first entails welding two joints; the second, tightening three bolts. Let X denote the number of defective welds and Y the number of improperly tightened bolts produced per car. Past data indicate that the joint density for (X, Y) is as shown in the table below.
[Table: joint density f(x, y) for x = 0, 1, 2 and y = 0, 1, 2, 3.]
1. Is f(x, y) a proper joint density function?
f(x, y) is a valid joint discrete density for (X, Y) because (1) for any x and y, f(x, y) ≥ 0; and (2) Σ_{x∈R} Σ_{y∈R} f(x, y) = Σ_{x=0}^{2} Σ_{y=0}^{3} f(x, y) = 1.

5 2. What is P(X = 1, Y = 0)?
P(X = 1, Y = 0) = f(1, 0), read directly from the table.
3. What is the probability that there are no improperly tightened bolts?
Solution: When there are no improperly tightened bolts, Y = 0. So, we want to compute P(Y = 0). We have
P(Y = 0) = P(X = 0, Y = 0) + P(X = 1, Y = 0) + P(X = 2, Y = 0),
the sum of the first column of the table.

6 Definition: The marginal probability density function f_X(x) for X, given the joint probability density function f_{X,Y}(x, y), is given by
f_X(x) = Σ_{y∈R} f_{X,Y}(x, y).
Similarly, the marginal probability density function f_Y(y) for Y is given by
f_Y(y) = Σ_{x∈R} f_{X,Y}(x, y).
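
A marginal pmf is just a row or column sum of the joint table. A short sketch, reusing the same kind of hypothetical table as above:

```python
from collections import defaultdict

# Hypothetical joint pmf f(x, y).
joint = {
    (0, 0): 0.5, (0, 1): 0.2,
    (1, 0): 0.2, (1, 1): 0.1,
}

f_X = defaultdict(float)   # f_X(x) = sum over y of f(x, y)
f_Y = defaultdict(float)   # f_Y(y) = sum over x of f(x, y)
for (x, y), p in joint.items():
    f_X[x] += p
    f_Y[y] += p

print({x: round(p, 3) for x, p in f_X.items()})   # {0: 0.7, 1: 0.3}
print({y: round(p, 3) for y, p in f_Y.items()})   # {0: 0.7, 1: 0.3}
```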

7 Example: Compute the marginal distributions of X and Y given the table in the last example.
[Table: the joint density from slide 4 with an added column f_X(x) of row sums and an added row f_Y(y) of column sums.]

8 Continuous Cases
Definition: Let X and Y be continuous random variables. The ordered pair (X, Y) is called a two-dimensional continuous random variable. The probability that X ∈ [a, b] and Y ∈ [c, d], given the joint continuous probability density f_{X,Y}(x, y), is computed by
P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f_{X,Y}(x, y) dy dx.
Definition: Function f is a proper continuous joint density function if and only if f(x, y) ≥ 0 for all x, y ∈ R, and ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dy dx = 1.
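
Numerically, rectangle probabilities and the normalization condition are double integrals, which can be checked with scipy.integrate.dblquad. The sketch below uses the density e^{−(x+y)} on x, y ≥ 0 (the density that appears on slide 15) purely as an illustration:

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density f(x, y) = exp(-(x + y)) for x, y >= 0.
f = lambda y, x: np.exp(-(x + y))   # dblquad passes the inner variable (y) first

# Normalization: the density must integrate to 1 over its support.
total, _ = dblquad(f, 0, np.inf, 0, np.inf)
print(round(total, 6))              # 1.0

# P(0 <= X <= 1, 0 <= Y <= 2) is the integral over the rectangle [0,1] x [0,2].
p, _ = dblquad(f, 0, 1, 0, 2)
print(round(p, 4))                  # 0.5466 = (1 - e^-1)(1 - e^-2)
```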

9 Example: In a healthy individual age 20 to 29 years, the calcium level in the blood, X, is usually between 8.5 and 10.5 milligrams per deciliter (mg/dl) and the cholesterol level, Y, is usually between 120 and 240 mg/dl. Assume that for a healthy individual in this age group, X is uniformly distributed on (8.5, 10.5), and Y is uniformly distributed on (120, 240). Thus, the density function is
f_{X,Y}(x, y) = 1/240, for 8.5 ≤ x ≤ 10.5 and 120 ≤ y ≤ 240; 0, o.w.
1. Verify that f_{X,Y} is a proper probability density function.
Solution: (1) For any x ∈ R and y ∈ R, f(x, y) is non-negative. (2) ∫∫ f(x, y) dy dx = ∫_{8.5}^{10.5} ∫_{120}^{240} (1/240) dy dx = 1.

10 2. Compute the probability that a healthy person in this age group will have a calcium level between 9 and 10 mg/dl and a cholesterol level between 125 and 140 mg/dl.
Solution: This is to compute P(9 ≤ X ≤ 10, 125 ≤ Y ≤ 140).
P(9 ≤ X ≤ 10, 125 ≤ Y ≤ 140) = ∫_9^{10} ∫_{125}^{140} (1/240) dy dx = ∫_9^{10} (15/240) dx = 15/240 = 0.0625.
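
Because the density is constant, the answer is just the area of the rectangle of interest times 1/240, and a numerical double integral gives the same number. A quick check (hypothetical code, not from the slides):

```python
from scipy.integrate import dblquad

# Uniform joint density from the example: f(x, y) = 1/240 on its support.
f = lambda y, x: 1 / 240

# Integrate y from 125 to 140 and x from 9 to 10.
p, _ = dblquad(f, 9, 10, 125, 140)
print(p)   # 0.0625 = (10 - 9) * (140 - 125) / 240
```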

11 Definition: Let (X, Y) be a two-dimensional continuous random variable with joint density f_{X,Y}. The marginal density for X, denoted by f_X, is given by
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy.
The marginal density for Y, denoted by f_Y, is given by
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.

12 Example: Continuing the last example. 1. Compute the marginal distributions for X and Y.
Solution:
f_X(x) = ∫_{120}^{240} (1/240) dy = 1/2, for 8.5 ≤ x ≤ 10.5. When x < 8.5 or x > 10.5, f_X(x) = 0.
f_Y(y) = ∫_{8.5}^{10.5} (1/240) dx = 1/120, for 120 ≤ y ≤ 240. When y < 120 or y > 240, f_Y(y) = 0.

13 2. Compute the probability that a healthy individual has a cholesterol level between 150 and 200.
Solution: This is to compute P(150 ≤ Y ≤ 200).
P(150 ≤ Y ≤ 200) = ∫_{8.5}^{10.5} ∫_{150}^{200} (1/240) dy dx = 50/120 = 5/12 ≈ 0.417.
We can also use the marginal density to compute this as follows.
P(150 ≤ Y ≤ 200) = ∫_{150}^{200} (1/120) dy = 50/120 = 5/12 ≈ 0.417.

14 Example: Let
f(x, y) = 1, for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1; 0, o.w.
Compute the probability that 1/2 ≤ X + Y.

15 Example: Let
f(x, y) = e^{−(x+y)}, for x, y ≥ 0; 0, o.w.
Compute the probability that Y ≤ X.
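
The region {y ≤ x} is not a rectangle, so the inner limit of the double integral depends on x; dblquad accepts callable limits for exactly this situation. A minimal sketch (treating the event as Y ≤ X):

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density f(x, y) = exp(-(x + y)) for x, y >= 0.
f = lambda y, x: np.exp(-(x + y))

# P(Y <= X): for each x, y runs from 0 to x (callable inner limit).
p, _ = dblquad(f, 0, np.inf, 0, lambda x: x)
print(round(p, 4))   # 0.5, as the symmetry of f in x and y suggests
```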

16 Example: In studying the behavior of air support roofs, the random variables X, the inside barometric pressure (in inches of mercury), and Y, the outside pressure, are considered. Assume that the joint density for (X, Y) is given by
f_{X,Y}(x, y) = c/x, when 27 ≤ y ≤ x ≤ 33, where 1/c = ln(…),
with c the constant that makes the density integrate to 1.
1. Compute the marginal density functions of X and Y.

17 2. Compute the probability that X ≤ 30 and Y ≤ ….

18 Independence
Definition: Let X and Y be random variables with joint density f_{XY} and marginal densities f_X and f_Y, respectively. X and Y are independent if and only if
f_{XY}(x, y) = f_X(x) f_Y(y)
for all x and y.
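
The factorization can be checked cell by cell once the marginals have been computed. A sketch with a hypothetical joint pmf that is independent by construction:

```python
from collections import defaultdict
from math import isclose

# Hypothetical joint pmf built as a product, so X and Y are independent.
joint = {(x, y): px * py
         for x, px in [(0, 0.3), (1, 0.7)]
         for y, py in [(0, 0.4), (1, 0.6)]}

f_X, f_Y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    f_X[x] += p
    f_Y[y] += p

independent = all(isclose(joint[(x, y)], f_X[x] * f_Y[y]) for (x, y) in joint)
print(independent)   # True: f(x, y) = f_X(x) f_Y(y) for every (x, y)
```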

19 Example: Continue with the example on slide no. 4.
[Table: joint density from slide 4 with marginals f_X(x) and f_Y(y).]
X and Y are not independent, because f_{XY}(x = 0, y = 0) = 0.840, while f_X(x = 0) f_Y(y = 0) ≠ 0.840 = f_{XY}(x = 0, y = 0).

20 Example: Continue with the examples on slides no. 9 and no. 12. We have already derived the joint pdf and marginal pdfs as follows.
f_{X,Y}(x, y) = 1/240, for 8.5 ≤ x ≤ 10.5 and 120 ≤ y ≤ 240; 0, o.w.
f_X(x) = 1/2, for 8.5 ≤ x ≤ 10.5; 0, o.w.
f_Y(y) = 1/120, for 120 ≤ y ≤ 240; 0, o.w.
Thus, for any pair (x, y) ∈ R², f_{X,Y}(x, y) = f_X(x) f_Y(y). X and Y are independent.

21 Expectation
For (X, Y) discrete,
E(X) = Σ_{x∈R} x f_X(x) = Σ_{x∈R} Σ_{y∈R} x f_{XY}(x, y), and
E(Y) = Σ_{y∈R} y f_Y(y) = Σ_{x∈R} Σ_{y∈R} y f_{XY}(x, y).
For (X, Y) continuous,
E(X) = ∫_{−∞}^{∞} x f_X(x) dx = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f_{XY}(x, y) dx dy, and
E(Y) = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y f_{XY}(x, y) dx dy.

22 Extension: Let (X, Y) be a two-dimensional random variable with joint density f_{XY}. Let h(X, Y) be a random variable. The expected value of h(X, Y), denoted by E(h(X, Y)), is given by
E(h(X, Y)) = Σ_{x∈R} Σ_{y∈R} h(x, y) f_{XY}(x, y),
when (X, Y) is discrete and E(h(X, Y)) exists; or
E(h(X, Y)) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x, y) f_{XY}(x, y) dx dy,
when (X, Y) is continuous and E(h(X, Y)) exists.
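
In the discrete case every one of these expectations is a finite weighted sum over the joint table, so they can all be computed with the same loop. A sketch with hypothetical values:

```python
# Hypothetical joint pmf; compute E(X), E(Y), and E(h(X, Y)) for h(x, y) = x + y.
joint = {
    (0, 0): 0.5, (0, 1): 0.2,
    (1, 0): 0.2, (1, 1): 0.1,
}

E_X   = sum(x * p for (x, y), p in joint.items())
E_Y   = sum(y * p for (x, y), p in joint.items())
E_XpY = sum((x + y) * p for (x, y), p in joint.items())

print(round(E_X, 3), round(E_Y, 3), round(E_XpY, 3))   # 0.3 0.3 0.6, so E(X + Y) = E(X) + E(Y)
```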

23 Example: The joint pdf of X and Y is given in the following table. Compute E(X) and E(X + Y).
[Table: joint pmf of (X, Y) with marginals f_X(x) and f_Y(y).]

24 Example: The joint density for the random variable (X, Y), where X denotes the calcium level and Y denotes the cholesterol level in the blood of a healthy individual, is given by
f_{X,Y}(x, y) = 1/240, for 8.5 ≤ x ≤ 10.5 and 120 ≤ y ≤ 240; 0, o.w.
Compute E(X) and E(XY).

25 Covariance
Definition: Let X and Y be random variables with means µ_X and µ_Y, respectively. The covariance between X and Y, denoted by Cov(X, Y) or σ_XY, is given by
Cov(X, Y) = E((X − µ_X)(Y − µ_Y)).
Using the definition to compute covariance is often inconvenient. Instead we use the following formula.
Cov(X, Y) = E((X − µ_X)(Y − µ_Y)) = E(XY) − E(X)E(Y).
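
The shortcut formula translates directly into code: compute E(X), E(Y), and E(XY) from the joint table and combine them. A sketch with a hypothetical joint pmf:

```python
# Covariance from a joint pmf via Cov(X, Y) = E(XY) - E(X) E(Y).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

E_X  = sum(x * p for (x, y), p in joint.items())
E_Y  = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

cov = E_XY - E_X * E_Y
print(round(cov, 3))   # 0.4 - 0.5 * 0.5 = 0.15: large X tends to occur with large Y
```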

26 Theorem: Let (X, Y) be a two-dimensional random variable with joint density f_{XY}. If X and Y are independent, then E(XY) = E(X)E(Y). Thus, if X and Y are independent, then Cov(X, Y) = E(XY) − E(X)E(Y) = 0. That is, independence implies that the covariance is 0. However, when the covariance of two random variables is 0, it is not always true that they are independent.

27 Example (book page 169): The joint density for X and Y is in the following table.

x\y       ·      ·      ·      ·    f_X(x)
1         0    1/4    1/4     0     1/2
4       1/4      0      0    1/4    1/2
f_Y(y)  1/4    1/4    1/4    1/4      1

From the table, we have E(X) = 5/2, E(Y) = 0, and E(XY) = 0, yielding a covariance of 0. However, X and Y are not independent: for instance, some cells of the table equal 0 while the product of the corresponding marginals is (1/2)(1/4) = 1/8.
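
The same loop as before verifies the claim. The sketch below uses a joint pmf with the structure of the table above, taking the four y values to be −2, −1, 1, 2 purely for illustration:

```python
# Joint pmf with the 0 / 1/4 pattern of the table above; the y labels
# -2, -1, 1, 2 are an assumption made for illustration.
joint = {(1, -1): 0.25, (1, 1): 0.25, (4, -2): 0.25, (4, 2): 0.25}

E_X  = sum(x * p for (x, y), p in joint.items())        # 2.5
E_Y  = sum(y * p for (x, y), p in joint.items())        # 0.0
E_XY = sum(x * y * p for (x, y), p in joint.items())    # 0.0
print(E_XY - E_X * E_Y)                                 # 0.0: the covariance is zero

# Dependence: the cell (1, -2) has probability 0, but f_X(1) * f_Y(-2) = 0.5 * 0.25.
print(joint.get((1, -2), 0.0), 0.5 * 0.25)              # 0.0 0.125
```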

28 Correlation
Definition: Let X and Y be random variables with means µ_X and µ_Y and variances σ_X² and σ_Y², respectively. The correlation, ρ_XY, between X and Y is given by
ρ_XY = Cov(X, Y) / √(σ_X² σ_Y²) = Cov(X, Y) / (σ_X σ_Y).
This is also called the Pearson coefficient of correlation.
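
Extending the covariance computation by the two variances gives ρ_XY. A sketch, reusing the hypothetical table from the covariance example:

```python
from math import sqrt

# Hypothetical joint pmf; compute the Pearson correlation of X and Y.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

E_X  = sum(x * p for (x, y), p in joint.items())
E_Y  = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())
var_X = sum((x - E_X) ** 2 * p for (x, y), p in joint.items())
var_Y = sum((y - E_Y) ** 2 * p for (x, y), p in joint.items())

rho = (E_XY - E_X * E_Y) / sqrt(var_X * var_Y)
print(round(rho, 3))   # 0.15 / (0.5 * 0.5) = 0.6
```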

29 Property: The Pearson correlation coefficient ρ_XY ranges from −1 to 1, and it measures the strength of the linear relationship between X and Y. Specifically,
When ρ_XY = 1, Y can be written as Y = β_0 + β_1 X with β_1 positive.
When ρ_XY = −1, Y can be written as Y = β_0 + β_1 X with β_1 negative.
When ρ_XY = 0, there is no linear relationship between X and Y, but they could be related in other ways.

30 Example: The joint pdf of X and Y is given in the following table. Compute ρ_XY.
[Table: joint pmf of (X, Y) with marginals f_X(x) and f_Y(y).]

31 Conditional Density
Definition: Let (X, Y) be a two-dimensional random variable with joint density f_{XY} and marginal densities f_X and f_Y. Then
The conditional density for X given Y = y is given by f_{X|Y=y}(x) = f_{XY}(x, y) / f_Y(y), when f_Y(y) > 0.
The conditional density for Y given X = x is given by f_{Y|X=x}(y) = f_{XY}(x, y) / f_X(x), when f_X(x) > 0.
Note: When two r.v.'s are independent, the conditional density is the same as the marginal density, i.e., f_{X|Y=y}(x) = f_X(x), and similarly for Y.
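
For a table, conditioning just rescales one column (or row) of the joint pmf by the corresponding marginal. A sketch with hypothetical values:

```python
# Conditional pmf of X given Y = 0, from a hypothetical joint pmf.
joint = {
    (0, 0): 0.5, (0, 1): 0.2,
    (1, 0): 0.2, (1, 1): 0.1,
}

y0 = 0
f_Y_at_y0 = sum(p for (x, y), p in joint.items() if y == y0)    # f_Y(0) = 0.7

f_X_given_y0 = {x: p / f_Y_at_y0
                for (x, y), p in joint.items() if y == y0}      # rescale by f_Y(0)
print({x: round(p, 3) for x, p in f_X_given_y0.items()})        # {0: 0.714, 1: 0.286}
print(round(sum(f_X_given_y0.values()), 3))                     # 1.0: it is itself a pmf
```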

32 Example: The joint pdf of X and Y is given in the following table. Compute P(X ≤ 1 | Y = 0).
[Table: joint pmf of (X, Y) with marginals f_X(x) and f_Y(y).]

33 Chapter Summary
For both the discrete and continuous cases, know the definition of the joint pdf, and use the joint pdf to compute the marginal pdfs.
Given the joint pdf, be able to compute the probability that X and Y satisfy specific requirements, such as P(X + Y ≤ 1), P(X < Y < 0.5), etc.
Expectations can be computed from either the marginal density or the joint density.
Covariance and independence.
Correlation.
The relationship among the marginal density, the conditional density, and the joint density.
