# ST 371 (VIII): Theory of Joint Distributions


So far we have focused on probability distributions for single random variables. However, we are often interested in probability statements concerning two or more random variables. The following examples are illustrative:

- In ecological studies, counts, modeled as random variables, of several species are often made. One species is often the prey of another; clearly, the number of predators will be related to the number of prey.
- The joint probability distribution of the x, y and z components of wind velocity can be experimentally measured in studies of atmospheric turbulence.
- The joint distribution of the values of various physiological variables in a population of patients is often of interest in medical studies.
- A model for the joint distribution of age and length in a population of fish can be used to estimate the age distribution from the length distribution. The age distribution is relevant to the setting of reasonable harvesting policies.

## 1 Joint Distribution

The joint behavior of two random variables X and Y is determined by the joint cumulative distribution function (cdf)

$$F_{XY}(x, y) = P(X \le x,\; Y \le y), \tag{1.1}$$

whether X and Y are continuous or discrete. For example, the probability that (X, Y) belongs to a given rectangle is

$$P(x_1 \le X \le x_2,\; y_1 \le Y \le y_2) = F(x_2, y_2) - F(x_2, y_1) - F(x_1, y_2) + F(x_1, y_1).$$

In general, if $X_1, \dots, X_n$ are jointly distributed random variables, the joint cdf is

$$F_{X_1,\dots,X_n}(x_1, \dots, x_n) = P(X_1 \le x_1,\; X_2 \le x_2, \dots, X_n \le x_n).$$

Two- and higher-dimensional versions of probability density functions and probability mass functions exist. We start with a detailed description of joint probability mass functions.

### 1.1 Jointly Discrete Random Variables

**Joint probability mass functions:** Let X and Y be discrete random variables defined on the sample space that take on values $\{x_1, x_2, \dots\}$ and $\{y_1, y_2, \dots\}$, respectively. The joint probability mass function of (X, Y) is

$$p(x_i, y_j) = P(X = x_i,\; Y = y_j). \tag{1.2}$$

**Example 1** A fair coin is tossed three times independently; let X denote the number of heads on the first toss and Y denote the total number of heads. Find the joint probability mass function of X and Y.
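Example 1 can be solved by brute-force enumeration of the eight equally likely outcomes; a minimal Python sketch (variable names are my own):

```python
from fractions import Fraction
from itertools import product

# Enumerate the 8 equally likely outcomes of three fair coin tosses.
# X = number of heads on the first toss, Y = total number of heads.
pmf = {}
for tosses in product([0, 1], repeat=3):  # 1 = heads, 0 = tails
    x, y = tosses[0], sum(tosses)
    pmf[(x, y)] = pmf.get((x, y), Fraction(0)) + Fraction(1, 8)

print(pmf[(0, 1)])  # P(X = 0, Y = 1)
print(pmf[(1, 3)])  # P(X = 1, Y = 3)
```

Exact rational arithmetic (`Fraction`) keeps the probabilities in the same 1/8-grid form as the hand computation.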

Solution: the joint distribution of (X, Y) can be summarized in the following table (entries are $p(x, y)$):

| $x \backslash y$ | 0 | 1 | 2 | 3 |
|---|---|---|---|---|
| 0 | 1/8 | 2/8 | 1/8 | 0 |
| 1 | 0 | 1/8 | 2/8 | 1/8 |

**Marginal probability mass functions:** Suppose that we wish to find the pmf of Y from the joint pmf of X and Y in the previous example:

$$p_Y(0) = P(Y = 0) = P(Y = 0, X = 0) + P(Y = 0, X = 1) = 1/8 + 0 = 1/8,$$

$$p_Y(1) = P(Y = 1) = P(Y = 1, X = 0) + P(Y = 1, X = 1) = 2/8 + 1/8 = 3/8.$$

In general, to find the frequency function of Y, we simply sum down the appropriate column of the table giving the joint pmf of X and Y. For this reason, $p_Y$ is called the marginal probability mass function of Y. Similarly, summing across the rows gives

$$p_X(x) = \sum_i p(x, y_i),$$

which is the marginal pmf of X. For the above example, we have $p_X(0) = p_X(1) = 1/2$.

### 1.2 Jointly Continuous Random Variables

**Joint pdf and joint cdf:** Suppose that X and Y are continuous random variables. The joint probability density function (pdf) of X and Y is the function f(x, y) such that for every set C of pairs of real numbers

$$P((X, Y) \in C) = \iint_{(x,y)\in C} f(x, y)\,dx\,dy. \tag{1.3}$$
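The marginal sums above can be carried out mechanically in code; a sketch using the joint table from Example 1:

```python
from fractions import Fraction

# Joint pmf of (X, Y) from Example 1, keyed by (x, y).
F = Fraction
p = {(0, 0): F(1, 8), (0, 1): F(2, 8), (0, 2): F(1, 8), (0, 3): F(0),
     (1, 0): F(0),    (1, 1): F(1, 8), (1, 2): F(2, 8), (1, 3): F(1, 8)}

# Marginal of Y: sum down each column; marginal of X: sum across each row.
p_Y = {y: sum(p[(x, y)] for x in (0, 1)) for y in range(4)}
p_X = {x: sum(p[(x, y)] for y in range(4)) for x in (0, 1)}

print(p_Y[1], p_X[0])
```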

Another interpretation of the joint pdf is obtained as follows:

$$P\{a < X < a + da,\; b < Y < b + db\} = \int_b^{b+db}\!\int_a^{a+da} f(x, y)\,dx\,dy \approx f(a, b)\,da\,db,$$

when da and db are small and f(x, y) is continuous at (a, b). Hence f(a, b) is a measure of how likely it is that the random vector (X, Y) will be near (a, b). This is similar to the interpretation of the pdf f(x) for a single random variable X being a measure of how likely it is to be near x.

The joint cdf of (X, Y) can be obtained as follows:

$$F(a, b) = P\{X \in (-\infty, a],\; Y \in (-\infty, b]\} = \int_{-\infty}^{b}\int_{-\infty}^{a} f(x, y)\,dx\,dy.$$

It follows upon differentiation that

$$f(a, b) = \frac{\partial^2}{\partial a\,\partial b} F(a, b),$$

wherever the partial derivatives are defined.

**Marginal pdf:** The cdf and pdf of X can be obtained from the pdf of (X, Y):

$$P(X \le x) = P\{X \le x,\; Y \in (-\infty, \infty)\} = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f(u, y)\,dy\,du.$$

Then we have

$$f_X(x) = \frac{d}{dx} P(X \le x) = \int_{-\infty}^{\infty} f(x, y)\,dy.$$

Similarly, the pdf of Y is given by

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx.$$

**Example 2** Consider the pdf for X and Y

$$f(x, y) = \frac{12}{7}(x^2 + xy), \quad 0 \le x \le 1,\; 0 \le y \le 1.$$

Find (a) $P(X > Y)$ and (b) the marginal density $f_X(x)$.

Solution: (a) Note that

$$P(X > Y) = \iint_{x > y} f(x, y)\,dx\,dy = \int_0^1\!\int_y^1 \frac{12}{7}(x^2 + xy)\,dx\,dy = \int_0^1 \frac{12}{7}\left(\frac{x^3}{3} + \frac{x^2 y}{2}\right)\bigg|_{x=y}^{x=1} dy = \int_0^1 \frac{12}{7}\left(\frac{1}{3} + \frac{y}{2} - \frac{5y^3}{6}\right) dy = \frac{9}{14}.$$

(b) For $0 \le x \le 1$,

$$f_X(x) = \int_0^1 \frac{12}{7}(x^2 + xy)\,dy = \frac{12}{7}x^2 + \frac{6}{7}x.$$

For $x < 0$ or $x > 1$, we have $f_X(x) = 0$.
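The value 9/14 in part (a) can be sanity-checked numerically; a midpoint-rule sketch (the grid size n is an arbitrary choice):

```python
# Midpoint-rule check of Example 2(a) on an n-by-n grid over the unit square.
n = 400
h = 1.0 / n

def f(x, y):
    return 12.0 / 7.0 * (x * x + x * y)

p_x_gt_y = 0.0
for i in range(n):
    for j in range(n):
        x, y = (i + 0.5) * h, (j + 0.5) * h
        if x > y:
            p_x_gt_y += f(x, y) * h * h

print(p_x_gt_y)  # should be close to 9/14
```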

**Example 3** Suppose the set of possible values for (X, Y) is the rectangle $D = \{(x, y) : 0 \le x \le 1,\; 0 \le y \le 1\}$. Let the joint pdf of (X, Y) be

$$f(x, y) = \frac{6}{5}(x + y^2), \quad \text{for } (x, y) \in D.$$

(a) Verify that f(x, y) is a valid pdf.

(b) Find $P(0 \le X \le 1/4,\; 0 \le Y \le 1/4)$.

(c) Find the marginal pdfs of X and Y.

(d) Find $P(\tfrac{1}{4} \le Y \le \tfrac{3}{4})$.
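Parts (a) and (b) can be sanity-checked numerically; by direct integration the total mass is 1 and the part (b) probability is 7/640, and a midpoint-rule sketch agrees:

```python
# Midpoint-rule check for Example 3 on the unit square.
n = 400
h = 1.0 / n

def f(x, y):
    return 6.0 / 5.0 * (x + y * y)

total = 0.0   # should come out near 1 if f is a valid pdf
corner = 0.0  # P(0 <= X <= 1/4, 0 <= Y <= 1/4)
for i in range(n):
    for j in range(n):
        x, y = (i + 0.5) * h, (j + 0.5) * h
        total += f(x, y) * h * h
        if x <= 0.25 and y <= 0.25:
            corner += f(x, y) * h * h

print(total, corner)
```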

## 2 Independent Random Variables

The random variables X and Y are said to be independent if for any two sets of real numbers A and B,

$$P(X \in A,\; Y \in B) = P(X \in A)\,P(Y \in B). \tag{2.4}$$

Loosely speaking, X and Y are independent if knowing the value of one of the random variables does not change the distribution of the other random variable. Random variables that are not independent are said to be dependent.

For discrete random variables, the condition of independence is equivalent to

$$P(X = x,\; Y = y) = P(X = x)\,P(Y = y) \quad \text{for all } x, y.$$

For continuous random variables, the condition of independence is equivalent to

$$f(x, y) = f_X(x)\,f_Y(y) \quad \text{for all } x, y.$$

**Example 4** Consider (X, Y) with joint pmf

$$p(10, 1) = p(20, 1) = p(20, 2) = \tfrac{1}{10}, \qquad p(10, 2) = p(10, 3) = \tfrac{1}{5}, \qquad p(20, 3) = \tfrac{3}{10}.$$

Are X and Y independent?
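The independence check is mechanical: compute both marginals and compare p(x, y) with p_X(x) p_Y(y) cell by cell. A sketch using the pmf as read above:

```python
from fractions import Fraction

F = Fraction
# Joint pmf of Example 4, keyed by (x, y).
p = {(10, 1): F(1, 10), (20, 1): F(1, 10), (20, 2): F(1, 10),
     (10, 2): F(1, 5),  (10, 3): F(1, 5),  (20, 3): F(3, 10)}

xs, ys = {10, 20}, {1, 2, 3}
p_X = {x: sum(p[(x, y)] for y in ys) for x in xs}
p_Y = {y: sum(p[(x, y)] for x in xs) for y in ys}

# X and Y are independent iff p(x, y) = p_X(x) * p_Y(y) for every cell.
independent = all(p[(x, y)] == p_X[x] * p_Y[y] for x in xs for y in ys)
print(independent)
```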

**Example 5** A man and a woman decide to meet at a certain location, and each person independently arrives at a time uniformly distributed on [0, T]. Find the joint pdf of the arrival times X and Y, and find the probability that the first to arrive has to wait longer than a period $\tau$ ($\tau < T$).
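A standard geometric argument gives $P(|X - Y| > \tau) = (1 - \tau/T)^2$, and a seeded Monte Carlo sketch with the assumed values T = 1 and $\tau$ = 1/4 agrees:

```python
import random

# Monte Carlo sketch of Example 5 with assumed values T = 1, tau = 0.25.
# The first to arrive waits |X - Y|; geometry gives P(|X - Y| > tau) = (1 - tau/T)^2.
random.seed(0)
T, tau, n = 1.0, 0.25, 200_000
hits = sum(abs(random.uniform(0, T) - random.uniform(0, T)) > tau
           for _ in range(n))
estimate = hits / n
exact = (1 - tau / T) ** 2  # 0.5625 for these values
print(estimate, exact)
```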

### 2.1 More than two random variables

If $X_1, \dots, X_n$ are all discrete random variables, the joint pmf of the variables is the function

$$p(x_1, \dots, x_n) = P(X_1 = x_1, \dots, X_n = x_n). \tag{2.5}$$

If the variables are continuous, the joint pdf of the variables is the function $f(x_1, \dots, x_n)$ such that

$$P(a_1 \le X_1 \le b_1, \dots, a_n \le X_n \le b_n) = \int_{a_1}^{b_1} \cdots \int_{a_n}^{b_n} f(x_1, \dots, x_n)\,dx_n \cdots dx_1.$$

One of the most important joint distributions is the multinomial distribution, which arises when a sequence of n independent and identical experiments is performed, where each experiment can result in any one of r possible outcomes, with respective probabilities $p_1, \dots, p_r$. If we let $X_i$ denote the number of the n experiments that result in outcome i, then

$$P(X_1 = n_1, \dots, X_r = n_r) = \frac{n!}{n_1! \cdots n_r!}\, p_1^{n_1} \cdots p_r^{n_r}, \tag{2.6}$$

whenever $\sum_{i=1}^r n_i = n$.

**Example 6** Suppose the genotype of each of ten independently obtained pea sections is determined, with $p_1 = P(AA)$, $p_2 = P(Aa)$, $p_3 = P(aa)$. Let $X_1$ = number of AA's, $X_2$ = number of Aa's and $X_3$ = number of aa's. What is the probability that we have 2 AA's, 5 Aa's and 3 aa's?
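Formula (2.6) is easy to code. The sketch below evaluates Example 6 under assumed values $p_1 = 1/4$, $p_2 = 1/2$, $p_3 = 1/4$ (hypothetical, since the notes leave the probabilities symbolic):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """P(X_1 = n_1, ..., X_r = n_r) for the multinomial distribution, eq. (2.6)."""
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    p = float(coef)
    for c, q in zip(counts, probs):
        p *= q ** c
    return p

# Example 6 with assumed probabilities: 2 AA's, 5 Aa's, 3 aa's out of n = 10.
prob = multinomial_pmf([2, 5, 3], [0.25, 0.5, 0.25])
print(prob)
```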

**Example 7** When a certain method is used to collect a fixed volume of rock samples in a region, there are four rock types. Let $X_1$, $X_2$ and $X_3$ denote the proportions by volume of rock types 1, 2 and 3 in a randomly selected sample (the proportion of rock type 4 is redundant because $X_4 = 1 - X_1 - X_2 - X_3$). Suppose the joint pdf of $(X_1, X_2, X_3)$ is

$$f(x_1, x_2, x_3) = k\,x_1 x_2 (1 - x_3)$$

for $(x_1, x_2, x_3) \in [0, 1]^3$ with $x_1 + x_2 + x_3 \le 1$, and $f(x_1, x_2, x_3) = 0$ otherwise. What is k? What is the probability that types 1 and 2 together account for at most 50%?

The random variables $X_1, \dots, X_n$ are independent if for every subset $X_{i_1}, \dots, X_{i_k}$ of the variables, the joint pmf (pdf) is equal to the product of the marginal pmfs (pdfs).

**Example 8** Suppose $X_1, \dots, X_n$ represent the lifetimes of n independent components, and each lifetime is exponentially distributed with parameter $\lambda$. The system fails if any component fails. Find the expected lifetime of the system.
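The system lifetime is $\min(X_1, \dots, X_n)$, which for independent Exp($\lambda$) lifetimes is again exponential with rate $n\lambda$, so the expected lifetime is $1/(n\lambda)$. A seeded Monte Carlo sketch with assumed values n = 5 and $\lambda$ = 2:

```python
import random

# Example 8 sketch: the system lifetime is min(X_1, ..., X_n); for independent
# Exp(lam) lifetimes the minimum is Exp(n * lam), so E[lifetime] = 1/(n * lam).
random.seed(1)
lam, n_comp, trials = 2.0, 5, 200_000
mean_life = sum(min(random.expovariate(lam) for _ in range(n_comp))
                for _ in range(trials)) / trials
print(mean_life, 1 / (n_comp * lam))  # estimate vs theory
```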

### 2.2 Sum of Independent Random Variables

It is often important to be able to calculate the distribution of X + Y from the distributions of X and Y when X and Y are independent.

**Sum of discrete independent rvs:** Suppose that X and Y are discrete independent rvs. Let Z = X + Y. The probability mass function of Z is

$$P(Z = z) = \sum_x P(X = x,\; Y = z - x) \tag{2.7}$$

$$\phantom{P(Z = z)} = \sum_x P(X = x)\,P(Y = z - x), \tag{2.8}$$

where the second equality uses the independence of X and Y.

**Sum of continuous independent rvs:** Suppose that X and Y are continuous independent rvs. Let Z = X + Y. The cdf of Z is

$$F_Z(a) = P(X + Y \le a) = \iint_{x+y \le a} f_{XY}(x, y)\,dx\,dy = \iint_{x+y \le a} f_X(x) f_Y(y)\,dx\,dy = \int_{-\infty}^{\infty}\!\left(\int_{-\infty}^{a-y} f_X(x)\,dx\right) f_Y(y)\,dy = \int_{-\infty}^{\infty} F_X(a - y)\,f_Y(y)\,dy.$$

By differentiating the cdf, we obtain the pdf of Z = X + Y:

$$f_Z(a) = \frac{d}{da}\int_{-\infty}^{\infty} F_X(a - y)\,f_Y(y)\,dy = \int_{-\infty}^{\infty} \frac{d}{da} F_X(a - y)\,f_Y(y)\,dy = \int_{-\infty}^{\infty} f_X(a - y)\,f_Y(y)\,dy.$$
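The discrete formula is a convolution of the two pmfs; a sketch that convolves two fair-die pmfs:

```python
from fractions import Fraction

def convolve(p_x, p_y):
    """pmf of Z = X + Y for independent discrete X, Y given as {value: prob} dicts."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, Fraction(0)) + px * py
    return p_z

die = {k: Fraction(1, 6) for k in range(1, 7)}
total = convolve(die, die)  # pmf of the sum of two fair dice
print(total[7])  # the most likely total
```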

**Group project (optional):**

- *Sum of independent Poisson random variables.* Suppose X and Y are independent Poisson random variables with respective means $\lambda_1$ and $\lambda_2$. Show that Z = X + Y has a Poisson distribution with mean $\lambda_1 + \lambda_2$.
- *Sum of independent uniform random variables.* If X and Y are two independent random variables, both uniformly distributed on (0, 1), calculate the pdf of Z = X + Y.

**Example 9** Let X and Y denote the lifetimes (in years) of two bulbs. Suppose that X and Y are independent and that each has an exponential distribution with $\lambda = 1$. What is the probability that each bulb lasts at most 1 year?
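The first group-project claim can be checked numerically: convolving two Poisson pmfs should reproduce the Poisson($\lambda_1 + \lambda_2$) pmf. A sketch with assumed means $\lambda_1 = 1.5$ and $\lambda_2 = 2.5$:

```python
from math import exp, factorial

def pois(lam, k):
    """Poisson pmf P(X = k) for mean lam."""
    return exp(-lam) * lam ** k / factorial(k)

lam1, lam2 = 1.5, 2.5
# Convolve the two Poisson pmfs (eq. 2.7-2.8) and compare with Poisson(lam1 + lam2).
max_err = 0.0
for k in range(15):
    conv = sum(pois(lam1, j) * pois(lam2, k - j) for j in range(k + 1))
    max_err = max(max_err, abs(conv - pois(lam1 + lam2, k)))
print(max_err)  # should be at floating-point roundoff level
```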

What is the probability that the total lifetime is between 1 and 2 years?

### 2.3 Conditional Distribution

The use of conditional distributions allows us to define conditional probabilities of events associated with one random variable when we are given the value of a second random variable.

**(1) The discrete case:** Suppose X and Y are discrete random variables. The conditional probability mass function of X given $Y = y_j$ is

$$p_{X|Y}(x_i \mid y_j) = P(X = x_i \mid Y = y_j) = \frac{P(X = x_i,\; Y = y_j)}{P(Y = y_j)} = \frac{p_{X,Y}(x_i, y_j)}{p_Y(y_j)}.$$

This is just the conditional probability of the event $\{X = x_i\}$ given that $\{Y = y_j\}$ occurs.

**Example 10** Consider again the situation in which a fair coin is tossed three times independently. Let X denote the number of heads on the first toss and Y denote the total number of heads. What is the conditional probability mass function of X given Y? Are X and Y independent?
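For Example 10, the conditional pmf can be computed mechanically from the joint table; a sketch:

```python
from fractions import Fraction
from itertools import product

# Rebuild the joint pmf of (X, Y) for three fair tosses, then condition on Y.
p = {}
for t in product([0, 1], repeat=3):  # 1 = heads
    key = (t[0], sum(t))
    p[key] = p.get(key, Fraction(0)) + Fraction(1, 8)

p_Y = {y: sum(q for (x, yy), q in p.items() if yy == y) for y in range(4)}
cond = {(x, y): q / p_Y[y] for (x, y), q in p.items()}  # p_{X|Y}(x | y)
print(cond[(1, 2)])  # P(X = 1 | Y = 2)
```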

If X and Y are independent random variables, then the conditional probability mass function is the same as the unconditional one. This follows because if X is independent of Y, then

$$p_{X|Y}(x \mid y) = P(X = x \mid Y = y) = \frac{P(X = x,\; Y = y)}{P(Y = y)} = \frac{P(X = x)\,P(Y = y)}{P(Y = y)} = P(X = x).$$

**(2) The continuous case:** If X and Y have a joint probability density function f(x, y), then the conditional pdf of X, given that Y = y, is defined for all values of y such that $f_Y(y) > 0$ by

$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}. \tag{2.9}$$

To motivate this definition, multiply the left-hand side by dx and the right-hand side by (dx dy)/dy to obtain

$$f_{X|Y}(x \mid y)\,dx = \frac{f_{X,Y}(x, y)\,dx\,dy}{f_Y(y)\,dy} \approx \frac{P\{x \le X \le x + dx,\; y \le Y \le y + dy\}}{P\{y \le Y \le y + dy\}} = P\{x \le X \le x + dx \mid y \le Y \le y + dy\}.$$

In other words, for small values of dx and dy, $f_{X|Y}(x \mid y)\,dx$ represents the conditional probability that X is between x and x + dx, given that Y is between y and y + dy. That is, if X and Y are jointly continuous, then for any set A,

$$P\{X \in A \mid Y = y\} = \int_A f_{X|Y}(x \mid y)\,dx.$$

In particular, by letting $A = (-\infty, a]$, we can define the conditional cdf of

X given that Y = y:

$$F_{X|Y}(a \mid y) = P(X \le a \mid Y = y) = \int_{-\infty}^{a} f_{X|Y}(x \mid y)\,dx.$$

**Example 11** Let X and Y be random variables with joint density $f(x, y) = \frac{6}{5}(x + y^2)$ for $0 \le x \le 1$ and $0 \le y \le 1$, and $f(x, y) = 0$ otherwise. What is the conditional pdf of Y given that X = 0.8? What is the conditional expectation of Y given that X = 0.8?

**Group project (optional):** Suppose X and Y are two independent random variables, both uniformly distributed on (0, 1). Let $T_1 = \min(X, Y)$ and $T_2 = \max(X, Y)$. What is the joint distribution of $T_1$ and $T_2$? What is the conditional distribution of $T_2$ given that $T_1 = t$? Are $T_1$ and $T_2$ independent?

## 3 Expected Values

Let X and Y be jointly distributed rvs with pmf p(x, y) or pdf f(x, y) according to whether the variables are discrete or continuous. Then the expected value of a function h(X, Y), denoted by E[h(X, Y)], is given by

$$E[h(X, Y)] = \begin{cases} \displaystyle\sum_x \sum_y h(x, y)\,p(x, y) & \text{discrete} \\[2ex] \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(x, y)\,f(x, y)\,dx\,dy & \text{continuous.} \end{cases} \tag{3.10}$$

**Example 12** The joint pdf of X and Y is

$$f(x, y) = \begin{cases} 24xy & 0 < x < 1,\; 0 < y < 1,\; x + y \le 1 \\ 0 & \text{otherwise.} \end{cases} \tag{3.11}$$

Let h(X, Y) = X + Y. Find E[h(X, Y)].
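By direct integration one finds E[X + Y] = 4/5 for Example 12; a midpoint-rule sketch over the triangle confirms it (and that f integrates to 1):

```python
# Midpoint-rule sketch for Example 12: E[X + Y] with f(x, y) = 24xy on x + y <= 1.
n = 500
h = 1.0 / n
total, exy = 0.0, 0.0
for i in range(n):
    for j in range(n):
        x, y = (i + 0.5) * h, (j + 0.5) * h
        if x + y <= 1.0:
            w = 24.0 * x * y * h * h
            total += w          # total mass, should be close to 1
            exy += (x + y) * w  # E[X + Y]
print(total, exy)
```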

**Rule of expected values:** If X and Y are independent random variables, then E(XY) = E(X)E(Y). This is in general not true for correlated random variables.

**Example 13** Five friends purchased tickets to a certain concert. The tickets are for seats 1-5 in a particular row, and the ticket numbers are randomly assigned (without replacement). What is the expected number of seats separating any particular two of the five friends?
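Assuming "seats separating" means |i - j| - 1 for seat numbers i and j, the answer can be found by enumerating all ordered pairs of distinct seats:

```python
from fractions import Fraction
from itertools import permutations

# Example 13: the seats of two particular friends form a uniformly random
# ordered pair of distinct values from {1, ..., 5}; separation is |i - j| - 1
# (assumed reading of the problem).
pairs = list(permutations(range(1, 6), 2))
expected_sep = sum(Fraction(abs(i - j) - 1) for i, j in pairs) / len(pairs)
print(expected_sep)
```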

## 4 Covariance and Correlation

Covariance and correlation are related parameters that indicate the extent to which two random variables co-vary. Suppose there are two technology stocks. If they are affected by the same industry trends, their prices will tend to rise or fall together: they co-vary. Covariance and correlation measure such a tendency. We will begin with the problem of calculating the expected value of a function of two random variables.

*Figure 1: Examples of positive, negative and zero correlation.*

Examples of positive and negative correlations:

- price and size of a house
- height and weight of a person
- income and highest degree received
- lung capacity and the number of cigarettes smoked every day
- GPA and the hours of TV watched every week
- expected length of lifetime and body mass index

### 4.1 Covariance

When two random variables are not independent, we can measure how strongly they are related to each other. The covariance between two rvs X and Y is

$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = \begin{cases} \displaystyle\sum_x \sum_y (x - \mu_X)(y - \mu_Y)\,p(x, y) & X, Y \text{ discrete} \\[2ex] \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y)\,f(x, y)\,dx\,dy & X, Y \text{ continuous.} \end{cases}$$

The variance of a random variable can be viewed as a special case of the above definition: Var(X) = Cov(X, X).

Properties of covariance:

1. A shortcut formula: $\mathrm{Cov}(X, Y) = E(XY) - \mu_X \mu_Y$.
2. If X and Y are independent, then $\mathrm{Cov}(X, Y) = 0$.
3. $\mathrm{Cov}(aX + b,\; cY + d) = ac\,\mathrm{Cov}(X, Y)$.

**Variance of linear combinations:** Let X and Y be two random variables, and a and b be two constants; then

$$\mathrm{Var}(aX + bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y) + 2ab\,\mathrm{Cov}(X, Y).$$

Special case: when X and Y are independent,

$$\mathrm{Var}(aX + bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y).$$

**Example 14** What is Cov(X, Y) for (X, Y) with joint pmf given by the following table?

| p(x, y) | y = 0 | y = 100 | y = 200 |
|---|---|---|---|
| x = | | | |
| x = | | | |

**Example 15** The joint pdf of X and Y is f(x, y) = 24xy when $0 \le x \le 1$, $0 \le y \le 1$ and $x + y \le 1$, and f(x, y) = 0 otherwise. Find Cov(X, Y).
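By direct integration, E(X) = E(Y) = 2/5 and E(XY) = 2/15, so the shortcut formula gives Cov(X, Y) = 2/15 - 4/25 = -2/75. A midpoint-rule sketch agrees:

```python
# Midpoint-rule sketch for Example 15: Cov(X, Y) with f = 24xy on x + y <= 1.
n = 500
h = 1.0 / n
ex = ey = exy = 0.0
for i in range(n):
    for j in range(n):
        x, y = (i + 0.5) * h, (j + 0.5) * h
        if x + y <= 1.0:
            w = 24.0 * x * y * h * h
            ex += x * w
            ey += y * w
            exy += x * y * w
cov = exy - ex * ey  # shortcut formula Cov = E(XY) - E(X)E(Y)
print(cov)  # should be close to -2/75
```

Note the covariance is negative: the constraint x + y <= 1 forces large X to go with small Y.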

### 4.2 Correlation Coefficient

The defect of covariance is that its computed value depends critically on the units of measurement (e.g., kilograms versus pounds, meters versus feet). Ideally, the choice of units should have no effect on a measure of the strength of a relationship. This can be achieved by scaling the covariance by the standard deviations of X and Y. The correlation coefficient of X and Y, denoted by $\rho_{X,Y}$, is defined by

$$\rho_{X,Y} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}. \tag{4.1}$$

**Example 16** Calculate the correlation of X and Y in Examples 14 and 15.

The correlation coefficient is not affected by a linear change in the units of measurement. Specifically, we can show that:

- If a and c have the same sign, then $\mathrm{Corr}(aX + b,\; cY + d) = \mathrm{Corr}(X, Y)$.
- If a and c have opposite signs, then $\mathrm{Corr}(aX + b,\; cY + d) = -\mathrm{Corr}(X, Y)$.

Additional properties of correlation:

1. For any two rvs X and Y, $-1 \le \mathrm{Corr}(X, Y) \le 1$.
2. If X and Y are independent, then $\rho = 0$.

**Correlation and dependence:** A zero correlation coefficient does not imply that X and Y are independent, only that there is a complete absence of a linear relationship. When $\rho = 0$, X and Y are said to be uncorrelated. Two random variables can be uncorrelated yet highly dependent.

**Correlation and causation:** A large correlation does not imply that increasing values of X cause Y to increase, only that large X values are associated with large Y values. Examples: vocabulary and cavities of children; lung cancer and yellow fingers.
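The "uncorrelated yet dependent" point can be made concrete: take X uniform on {-1, 0, 1} and Y = X², a deterministic function of X. A short sketch using exact arithmetic:

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2: Y is completely determined by X,
# yet Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0 by symmetry.
p_x = {x: Fraction(1, 3) for x in (-1, 0, 1)}
ex = sum(x * q for x, q in p_x.items())            # E[X] = 0
ey = sum(x * x * q for x, q in p_x.items())        # E[Y] = 2/3
exy = sum(x * x * x * q for x, q in p_x.items())   # E[XY] = E[X^3] = 0
cov = exy - ex * ey
print(cov)  # zero covariance despite total dependence
```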


### Solutions for the exam for Matematisk statistik och diskret matematik (MVE050/MSG810). Statistik för fysiker (MSG820). December 15, 2012.

Solutions for the exam for Matematisk statistik och diskret matematik (MVE050/MSG810). Statistik för fysiker (MSG8). December 15, 12. 1. (3p) The joint distribution of the discrete random variables X and

### The Bivariate Normal Distribution

The Bivariate Normal Distribution This is Section 4.7 of the st edition (2002) of the book Introduction to Probability, by D. P. Bertsekas and J. N. Tsitsiklis. The material in this section was not included

### Stats on the TI 83 and TI 84 Calculator

Stats on the TI 83 and TI 84 Calculator Entering the sample values STAT button Left bracket { Right bracket } Store (STO) List L1 Comma Enter Example: Sample data are {5, 10, 15, 20} 1. Press 2 ND and

### Chap 3 : Two Random Variables

Chap 3 : Two Random Variables Chap 3.1: Distribution Functions of Two RVs In many experiments, the observations are expressible not as a single quantity, but as a family of quantities. For example to record

### Definition The covariance of X and Y, denoted by cov(x, Y ) is defined by. cov(x, Y ) = E(X µ 1 )(Y µ 2 ).

Correlation Regression Bivariate Normal Suppose that X and Y are r.v. s with joint density f(x y) and suppose that the means of X and Y are respectively µ 1 µ 2 and the variances are 1 2. Definition The

### ( ) = P Z > = P( Z > 1) = 1 Φ(1) = 1 0.8413 = 0.1587 P X > 17

4.6 I company that manufactures and bottles of apple juice uses a machine that automatically fills 6 ounce bottles. There is some variation, however, in the amounts of liquid dispensed into the bottles

### E3: PROBABILITY AND STATISTICS lecture notes

E3: PROBABILITY AND STATISTICS lecture notes 2 Contents 1 PROBABILITY THEORY 7 1.1 Experiments and random events............................ 7 1.2 Certain event. Impossible event............................

### Contents. TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics. Yuming Jiang. Basic Concepts Random Variables

TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics Yuming Jiang 1 Some figures taken from the web. Contents Basic Concepts Random Variables Discrete Random Variables Continuous Random

### Lecture 7: Continuous Random Variables

Lecture 7: Continuous Random Variables 21 September 2005 1 Our First Continuous Random Variable The back of the lecture hall is roughly 10 meters across. Suppose it were exactly 10 meters, and consider

### MAS108 Probability I

1 QUEEN MARY UNIVERSITY OF LONDON 2:30 pm, Thursday 3 May, 2007 Duration: 2 hours MAS108 Probability I Do not start reading the question paper until you are instructed to by the invigilators. The paper

### 15.062 Data Mining: Algorithms and Applications Matrix Math Review

.6 Data Mining: Algorithms and Applications Matrix Math Review The purpose of this document is to give a brief review of selected linear algebra concepts that will be useful for the course and to develop

### Chapter 5: Joint Probability Distributions. Chapter Learning Objectives. The Joint Probability Distribution for a Pair of Discrete Random

Chapter 5: Joint Probability Distributions 5-1 Two or More Random Variables 5-1.1 Joint Probability Distributions 5-1.2 Marginal Probability Distributions 5-1.3 Conditional Probability Distributions 5-1.4

### Notes 11 Autumn 2005

MAS 08 Probabilit I Notes Autumn 005 Two discrete random variables If X and Y are discrete random variables defined on the same sample space, then events such as X = and Y = are well defined. The joint

### Notes for STA 437/1005 Methods for Multivariate Data

Notes for STA 437/1005 Methods for Multivariate Data Radford M. Neal, 26 November 2010 Random Vectors Notation: Let X be a random vector with p elements, so that X = [X 1,..., X p ], where denotes transpose.

### Section 5.1 Continuous Random Variables: Introduction

Section 5. Continuous Random Variables: Introduction Not all random variables are discrete. For example:. Waiting times for anything (train, arrival of customer, production of mrna molecule from gene,

### Worked examples Random Processes

Worked examples Random Processes Example 1 Consider patients coming to a doctor s office at random points in time. Let X n denote the time (in hrs) that the n th patient has to wait before being admitted

### STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE

STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE TROY BUTLER 1. Random variables and distributions We are often presented with descriptions of problems involving some level of uncertainty about

### 3. Continuous Random Variables

3. Continuous Random Variables A continuous random variable is one which can take any value in an interval (or union of intervals) The values that can be taken by such a variable cannot be listed. Such

### M2S1 Lecture Notes. G. A. Young http://www2.imperial.ac.uk/ ayoung

M2S1 Lecture Notes G. A. Young http://www2.imperial.ac.uk/ ayoung September 2011 ii Contents 1 DEFINITIONS, TERMINOLOGY, NOTATION 1 1.1 EVENTS AND THE SAMPLE SPACE......................... 1 1.1.1 OPERATIONS

### Chapter 4 - Lecture 1 Probability Density Functions and Cumul. Distribution Functions

Chapter 4 - Lecture 1 Probability Density Functions and Cumulative Distribution Functions October 21st, 2009 Review Probability distribution function Useful results Relationship between the pdf and the

### What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

### MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete

### Probability Calculator

Chapter 95 Introduction Most statisticians have a set of probability tables that they refer to in doing their statistical wor. This procedure provides you with a set of electronic statistical tables that

### Chapter 3 Joint Distributions

Chapter 3 Joint Distributions 3.6 Functions of Jointly Distributed Random Variables Discrete Random Variables: Let f(x, y) denote the joint pdf of random variables X and Y with A denoting the two-dimensional

### Calculate the holding period return for this investment. It is approximately

1. An investor purchases 100 shares of XYZ at the beginning of the year for \$35. The stock pays a cash dividend of \$3 per share. The price of the stock at the time of the dividend is \$30. The dividend

### Discrete and Continuous Random Variables. Summer 2003

Discrete and Continuous Random Variables Summer 003 Random Variables A random variable is a rule that assigns a numerical value to each possible outcome of a probabilistic experiment. We denote a random

### An-Najah National University Faculty of Engineering Industrial Engineering Department. Course : Quantitative Methods (65211)

An-Najah National University Faculty of Engineering Industrial Engineering Department Course : Quantitative Methods (65211) Instructor: Eng. Tamer Haddad 2 nd Semester 2009/2010 Chapter 5 Example: Joint

### The Scalar Algebra of Means, Covariances, and Correlations

3 The Scalar Algebra of Means, Covariances, and Correlations In this chapter, we review the definitions of some key statistical concepts: means, covariances, and correlations. We show how the means, variances,

### Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

### Probability and Statistics

CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b - 0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute - Systems and Modeling GIGA - Bioinformatics ULg kristel.vansteen@ulg.ac.be

### Examples: Joint Densities and Joint Mass Functions Example 1: X and Y are jointly continuous with joint pdf

AMS 3 Joe Mitchell Eamples: Joint Densities and Joint Mass Functions Eample : X and Y are jointl continuous with joint pdf f(,) { c 2 + 3 if, 2, otherwise. (a). Find c. (b). Find P(X + Y ). (c). Find marginal

### Lecture 8. Confidence intervals and the central limit theorem

Lecture 8. Confidence intervals and the central limit theorem Mathematical Statistics and Discrete Mathematics November 25th, 2015 1 / 15 Central limit theorem Let X 1, X 2,... X n be a random sample of

### Aggregate Loss Models

Aggregate Loss Models Chapter 9 Stat 477 - Loss Models Chapter 9 (Stat 477) Aggregate Loss Models Brian Hartman - BYU 1 / 22 Objectives Objectives Individual risk model Collective risk model Computing